Search Results

Search found 83852 results on 3355 pages for 'net framework version'.

Page 235/3355 | < Previous Page | 231 232 233 234 235 236 237 238 239 240 241 242  | Next Page >

  • What is the best certification for the self-taught ASP.Net programmer? [closed]

    - by Wahid Bitar
    I learned C#.NET from books and many other resources, then did some good projects with ASP.NET (Web Forms and MVC). Now I want a good certificate to help me get better work outside my country. I have two choices: apply to a college or institute and study academic courses to become more professional, graduating in maybe two years with a college certificate; or apply for company certifications such as MCTS from Microsoft, study their more focused courses, and in maybe three or four months get that (unofficial) certificate. Is the second type of certification good for learning and for work, or is the longer academic route the better one? Please recommend specific certifications. Update: This is not related to a particular country or region. I'm asking about good certifications and courses for an ASP.NET and C# programmer in general.

    Read the article

  • Handling (many) multiple projects in Git in an enterprise environment

    - by Michael K
    One of the advantages of older version control systems such as CVS and SVN in enterprise development is that anyone can connect to source control and see all the projects that the company has. This can make it easier to get a high-level view of what kind of development is happening outside your sprint, and it also keeps everything in one place and easy to find. However, distributed version control systems (Git, specifically) use the repository as their base unit. They work best with one project (or several closely related projects) per repository. This makes repository management more difficult in most enterprise environments, where it is not unusual to have more than 25-50 projects to support. As far as I have been able to determine, you have to keep a list somewhere else of all the repos you have. There is software available, like GitHub, that helps, but that is still an extra step beyond a single connection string and listing the contents of the repository. What is the best way to deal with the complexity of multiple repositories?

    Read the article

  • Best ASP.Net Host for Developers

    - by Tyler
    I need it to allow me to host subdomains, and multiple-domain hosting is a huge plus.
    Required: ASP.NET 2.0/3.0/3.5, subdomain hosting, MS-SQL and MySQL databases.
    Wanted: multiple domain hosting, ASP.NET 4.0, and the ability to connect directly to MS SQL using SQL SMS.
    So what do you have for me, SF?

    Read the article

  • Windows Phone 7 Isolated Storage Explorer

    - by help.net
    WP7 Isolated Storage Explorer is a tool designed to help developers and testers interact with the isolated storage file for Silverlight Windows Phone 7 applications. The explorer can work both as a desktop application for testers and integrated into Visual Studio for developers. Whenever a WP7 application/project stores data locally on the device, it goes into the isolated storage file. A common difficulty is accessing that data for testing or rapidly restoring the application's data/state...(read more)
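
    For context, this is roughly the kind of code whose output the explorer lets you browse - a minimal sketch of writing and reading a file in isolated storage from a Silverlight/WP7 project. The LocalStore class and the file name are invented for illustration.

        using System.IO;
        using System.IO.IsolatedStorage;

        public static class LocalStore
        {
            // Write a small text file into the application's isolated storage area.
            public static void Save(string fileName, string contents)
            {
                using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
                using (var stream = new IsolatedStorageFileStream(fileName, FileMode.Create, store))
                using (var writer = new StreamWriter(stream))
                {
                    writer.Write(contents);
                }
            }

            // Read it back, returning null when the file has not been created yet.
            public static string Load(string fileName)
            {
                using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
                {
                    if (!store.FileExists(fileName))
                        return null;

                    using (var stream = new IsolatedStorageFileStream(fileName, FileMode.Open, store))
                    using (var reader = new StreamReader(stream))
                    {
                        return reader.ReadToEnd();
                    }
                }
            }
        }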

    Read the article

  • Visual Studio 2010 Web Development Improvements

    - by Aamir Hasan
    VS2010 emulates what is available in previous framework versions through reference assemblies. These assemblies contain metadata that describes the functionality available in previous versions. VS2010 itself uses the .NET 4 framework, so when adding multi-targeting support the team decided against running a previous framework version inside the same process. Instead, to guarantee 100 percent application compatibility, the previous compiler versions are used when your application is compiled. Web development in Visual Studio 2010 has also been enhanced in three areas: improved CSS compatibility, increased productivity through HTML, ASP.NET markup and JavaScript snippets, and new dynamic IntelliSense for JavaScript.

    Read the article

  • How to maintain a demo version of an application?

    - by O.O
    I need to be able to demo our production application to prospective clients. The way I have it set up today is simple: the demo application is an exact duplicate of the production system, except that the data in the database is obfuscated to protect our current clients' data. This works great because it doesn't require any application changes. My boss dropped a potential BOMBSHELL today and said that the demo system needs to contain a special link that ONLY shows up on demo. He went on to explain that in the future there may be much bigger differences between the demo and production apps (e.g. an entire area of functionality). What do I do now? Some things I have thought about doing: maintain a different branch in Subversion specific to the demo system; create an installation package that has the changes for demo, then revert and build a production installation package; modularize the application (no idea how); say "Screw you! I will not do it!" (LOL); or use some sort of conditional logic in the app to determine whether it is a demo or a production app, e.g. if the URL contains 'demo' then show, else hide. If you haven't guessed by now, this is a web application. Anyway, I have no experience with this scenario, so I don't know which of these is better or whether any of them are any good. Anyone have an answer, strategy, something!?
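
    One way to implement the conditional-logic option (the last idea in the question) is a single configuration switch read by a common base page, sketched below. The Environment key, the SitePage class and the DemoOnlyLink control ID are hypothetical names used only for illustration; the point is that one config value decides whether demo-only UI is rendered.

        using System;
        using System.Configuration;
        using System.Web.UI;

        // Hypothetical base page: pages check IsDemo to decide whether demo-only
        // controls (such as the special link) are rendered.
        public class SitePage : Page
        {
            // Reads a single appSettings switch, e.g. <add key="Environment" value="Demo" />,
            // so the same build can run as demo or production with only a config change.
            protected static bool IsDemo
            {
                get
                {
                    string environment = ConfigurationManager.AppSettings["Environment"];
                    return string.Equals(environment, "Demo", StringComparison.OrdinalIgnoreCase);
                }
            }

            protected override void OnPreRender(EventArgs e)
            {
                base.OnPreRender(e);

                // Any control marked as demo-only is hidden outside the demo system.
                Control demoLink = FindControl("DemoOnlyLink");
                if (demoLink != null)
                    demoLink.Visible = IsDemo;
            }
        }

    The same flag can later guard larger demo-only feature areas, which keeps a single code base and avoids maintaining a separate demo branch.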

    Read the article

  • Flexible design - customizable entity model, UI and workflow

    - by Ngm
    Hi all, I want to achieve the following in the software I am building: (1) a customizable entity model, (2) a customizable UI, and (3) a customizable workflow. I have thought of an approach to achieve this; please review it and make suggestions: Entity objects should be plain objects and will hold just data. Separate the entity model and DB schema by using a framework (like NHibernate?); this will allow easy modification of entity objects. Business logic to fetch/modify entities has to be granular enough that it can be invoked as part of the workflow. Business objects should not hold any state, and hence will contain only static methods. The workflow will decide, depending upon the "state" of an entity or entities, which methods on which business objects to invoke. The workflow should obtain the results of the processing and then pass the business objects on to the appropriate UI screen. The UI screen has to contain instructions about how to display a given entity or entities; possibly the UI has to be generated dynamically based on a set of UI instructions (like XUL). What do you think about this approach? Suggest which existing frameworks (like NHibernate, Windows Workflow) fit into this model, so that I will not spend time coding these frameworks myself. Also, is there any ASP.NET framework that can generate dynamic ASP.NET AJAX pages based on a set of UI instructions (like Mozilla XUL)? I have recently been exploring Apache OFBiz and was impressed by its ability to customize most areas of the application: UI, workflow, entities. Is there any similar application (not necessarily an ERP system) developed in C#/.NET that offers a similar level of customization? I am looking for examples of applications developed in C# that are highly customizable in terms of UI, workflow and entity model.
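
    To make the proposed separation concrete, here is a small hypothetical sketch (all names invented) of a plain data-only entity, a stateless business class with static methods, and a workflow step that picks the method to invoke based on the entity's state, along the lines described above.

        // Plain entity: data only, no behaviour.
        public class Order
        {
            public int Id { get; set; }
            public string State { get; set; }   // e.g. "New", "Approved", "Rejected"
            public decimal Total { get; set; }
        }

        // Stateless business logic: only static methods, no fields.
        public static class OrderLogic
        {
            public static void Approve(Order order) { order.State = "Approved"; }
            public static void Reject(Order order)  { order.State = "Rejected"; }
        }

        // Workflow step: inspects state, routes to the right business method,
        // and tells the UI layer which screen/instructions to use for display.
        public static class OrderWorkflow
        {
            public static string Process(Order order)
            {
                if (order.State == "New" && order.Total <= 1000m)
                {
                    OrderLogic.Approve(order);
                    return "ApprovedView";
                }

                OrderLogic.Reject(order);
                return "RejectedView";
            }
        }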

    Read the article

  • using Autofac in a multi-layered architecture

    - by Kamyar
    I'm fairly new to the DI/IoC concept and would like to use Autofac in a 3-layered ASP.NET Web Forms application. UI layer: an ASP.NET Web Forms website. BLL: the business logic layer, which calls the repositories in the DAL. DAL: an .EDMX file (entity model) and ObjectContext with repository classes that abstract the CRUD operations for each entity. Entities: the POCO entities, persistence ignorant, generated by Microsoft's ADO.NET POCO Entity Generator. I have asked a more general question here. Basically, I'd like to create an ObjectContext per HttpContext in my DAL, but I don't want to add a reference to the DAL in the UI or access HttpContext in the DAL directly. I guess this is where IoC tools come into play. The answer to my previous question is a very good example of using Castle Windsor. I'd like to use Autofac as my IoC tool and don't know how to achieve this (how to access the DAL in Application_Start to register the component when I don't want to reference it in my UI, what the proper references are to be able to use the DAL component in the BLL with Autofac, and whether I should register the BLL as a component with Autofac too). Sorry folks for not providing an explicit question and for requesting a kind of working example, but I'm very unfamiliar with the whole IoC concept and I don't think I can pick it up well enough to use in my current time-limited project.
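
    For what a registration along those lines can look like, here is a hedged sketch using Autofac's ContainerBuilder. The Bootstrapper class, MyEntitiesContext and the repository/service types are placeholders invented for illustration; the idea is that only a small composition-root project references the DAL, while the UI and BLL depend on interfaces.

        using Autofac;

        // Placeholder types standing in for the asker's DAL and BLL pieces
        // (in a real solution these live in their own projects).
        public class MyEntitiesContext { }
        public interface IProductRepository { }
        public class ProductRepository : IProductRepository
        {
            public ProductRepository(MyEntitiesContext context) { }
        }
        public interface IProductService { }
        public class ProductService : IProductService
        {
            public ProductService(IProductRepository repository) { }
        }

        // Hypothetical composition root. Placing it in a small "Bootstrapper" project
        // that references the DAL keeps the UI project free of a direct DAL reference;
        // the UI only calls Bootstrapper.BuildContainer() from Application_Start.
        public static class Bootstrapper
        {
            public static IContainer BuildContainer()
            {
                var builder = new ContainerBuilder();

                // One ObjectContext per lifetime scope; with Autofac's web integration a
                // lifetime scope is created per HTTP request, which gives the
                // "one context per HttpContext" behaviour without touching HttpContext in the DAL.
                builder.RegisterType<MyEntitiesContext>()
                       .AsSelf()
                       .InstancePerLifetimeScope();

                // Repositories and business services are resolved through interfaces,
                // so the BLL and UI depend only on the contracts.
                builder.RegisterType<ProductRepository>()
                       .As<IProductRepository>()
                       .InstancePerLifetimeScope();

                builder.RegisterType<ProductService>()
                       .As<IProductService>()
                       .InstancePerLifetimeScope();

                return builder.Build();
            }
        }

    Application_Start in the web project would then call Bootstrapper.BuildContainer() and hand the container to Autofac's ASP.NET integration, which opens and disposes one lifetime scope per request.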

    Read the article

  • Advice on Minimizing Stored Procedure Parameters

    - by RPM1984
    Hi guys, I have an ASP.NET MVC web application that interacts with a SQL Server 2008 database via Entity Framework 4.0. On a particular page, I call a stored procedure in order to pull back some results based on selections in the UI. Now, the UI has around 20 different input selections, ranging from a textbox to dropdown lists and checkboxes. Each of those inputs is "grouped" into a logical section. Example: Search box: "Foo"; Checkbox A1: ticked, Checkbox A2: unticked; Dropdown A: option 3 selected; Checkbox B1: ticked, Checkbox B2: ticked, Checkbox B3: unticked. So I need to call the sproc like this: exec SearchPage_FindResults @SearchQuery = 'Foo', @IncludeA1 = 1, @IncludeA2 = 0, @DropDownSelection = 3, @IncludeB1 = 1, @IncludeB2 = 1, @IncludeB3 = 0. The UI is not too important to this question - I just wanted to give some perspective. Essentially, I'm pulling back results for a search query and filtering those results based on a bunch of (optional) selections a user can filter on. Now, my questions: What's the best way to pass these parameters to the stored procedure? Are there any tricks or new ways (e.g. SQL Server 2008) to do this? Special "table" parameters/arrays - can we pass user-defined types through? Keep in mind I'm using Entity Framework 4.0, but I could always use classic ADO.NET for this if required. What about XML - what are the serialization/deserialization costs there, and is it worth it? How about a parameter for each logical section, comma-separated perhaps? Just thinking out loud. This page is particularly important from a user point of view and needs to perform really well. The stored procedure is already heavy in logic, so I want to minimize the performance implications - keep that in mind. With that said, what is the best approach here?
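
    One of the options raised above - SQL Server 2008 table-valued parameters passed through classic ADO.NET - might look roughly like the sketch below. The dbo.IdList type, the @FilterIds parameter and the reworked procedure signature are hypothetical, not part of the original SearchPage_FindResults.

        using System.Data;
        using System.Data.SqlClient;

        public static class SearchGateway
        {
            // Assumes a user-defined table type created once in the database, e.g.:
            //   CREATE TYPE dbo.IdList AS TABLE (Id INT NOT NULL);
            // and a procedure parameter declared as: @FilterIds dbo.IdList READONLY
            public static void FindResults(string connectionString, string query, int[] filterIds)
            {
                var ids = new DataTable();
                ids.Columns.Add("Id", typeof(int));
                foreach (int id in filterIds)
                    ids.Rows.Add(id);

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand("dbo.SearchPage_FindResults", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.AddWithValue("@SearchQuery", query);

                    // A structured parameter carries a whole table in one round trip,
                    // instead of one scalar parameter per checkbox.
                    SqlParameter tvp = command.Parameters.AddWithValue("@FilterIds", ids);
                    tvp.SqlDbType = SqlDbType.Structured;
                    tvp.TypeName = "dbo.IdList";

                    connection.Open();
                    using (SqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // map the result rows here
                        }
                    }
                }
            }
        }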

    Read the article

  • IEnumerator seems to be affecting all objects, and not one at a time

    - by PFranchise
    Hey, I am trying to alter an attribute of an object. I am setting it to the value of that same attribute stored on another table. There is a one-to-many relationship between the two: the product end is the one and the versions are the many. Right now, both of the methods I have tried set all the returned products equal to the final version object, so in this case they are all the same. I am not sure where the issue lies. Here are my two code snippets; both yield the same result.

        int x = 1;
        IEnumerator<Product> ie = productQuery.GetEnumerator();
        while (ie.MoveNext())
        {
            ie.Current.RSTATE = ie.Current.Versions.First(o => o.VersionNumber == x).RSTATE;
            x++;
        }

    and

        foreach (var product in productQuery)
        {
            product.RSTATE = product.Versions.Single(o => o.VersionNumber == x).RSTATE;
            x++;
        }

    The versions table holds information for previous products, each distinguished by the version number. I know that it will start at 1 and go until it reaches the current version, based on my query returning the proper number of products. Thanks for any advice.

    Read the article

  • ASP.NET MVC 3 Hosting :: Rolling with Razor in MVC v3 Preview

    - by mbridge
    Razor is an alternate view engine for ASP.NET MVC. It was introduced in the "WebMatrix" tool and has now been released as part of the ASP.NET MVC 3 Preview 1. Basically, Razor allows us to replace the clunky <% %> syntax with a much cleaner coding model, which integrates very nicely with HTML. Additionally, it provides some really nice features for master page type scenarios, and you don't lose access to any of the features you are currently familiar with, such as HTML helper methods. First, download and install the ASP.NET MVC Preview 1. You can find this at http://www.microsoft.com/downloads/details.aspx?FamilyID=cb42f741-8fb1-4f43-a5fa-812096f8d1e8&displaylang=en. Now, follow these steps to create your first ASP.NET MVC project using Razor:

    1. Open Visual Studio 2010.
    2. Create a new project. Select File->New->Project (Shift Control N).
    3. You will see the list of project types, which should look similar to what's shown.
    4. Select "ASP.NET MVC 3 Web Application (Razor)." Set the application name to RazorTest and the path to c:\projects\RazorTest for this tutorial. If you accidentally select ASPX, you will end up with the standard ASP.NET view engine and template, which isn't what you want.
    5. For this tutorial, and ONLY for this tutorial, select "No, do not create a unit test project." In general, you should create and use a unit test project. Code without unit tests is kind of like diet ice cream. It just isn't very good.

    Now, once we have this done, our brand new project will be created. In all likelihood, Visual Studio will leave you looking at the "HomeController.cs" class. Immediately, you should notice one difference. The Index action used to look like:

        public ActionResult Index()
        {
            ViewData["Message"] = "Welcome to ASP.Net MVC!";
            return View();
        }

    While this will still compile and run just fine, ASP.NET MVC 3 has a much nicer way of doing this:

        public ActionResult Index()
        {
            ViewModel.Message = "Welcome to ASP.Net MVC!";
            return View();
        }

    Instead of using ViewData we are using the new ViewModel object, which uses the new dynamic typing of .NET 4.0 to allow us to express ourselves much more cleanly. This isn't a tutorial on ALL of MVC 3, but the ViewModel concept is one we will need as we dig into Razor.

    What comes in the box? When we create a project using the ASP.NET MVC 3 template with Razor, we get a standard project setup, just like we did in ASP.NET MVC 2.0 but with some differences. Instead of seeing ".aspx" view files and ".ascx" files, we see files with the ".cshtml" extension, which is the default Razor extension. Before we discuss the details of a Razor file, one thing to keep in mind is that since this is an extremely early preview, IntelliSense is not currently enabled for the Razor view engine. This is promised as an update before the final release. Just like with the ASPX view engine, the convention of the folder name for a set of views matching the controller name without the word "Controller" still stands. Similarly, each action in the controller will usually have a corresponding view file in the appropriate view directory. Remember, in ASP.NET MVC, convention over configuration is key to successful development! The initial template organizes views in the following folders, located in the project under Views:

    - Account – The default account management views used by the Account controller. Each file represents a distinct view.
    - Home – Views corresponding to the appropriate actions within the home controller.
    - Shared – This contains common view objects used by multiple views. Master pages are stored here, as well as partial page views (user controls). By convention, these partial views are named "_XXXPartial.cshtml", where XXX is the appropriate name, such as _LogonPartial.cshtml. Additionally, display templates are stored under here.

    With this in mind, let us take a look at the index.cshtml file under the Home view directory. When you open up index.cshtml you should see:

         1:  @inherits System.Web.Mvc.WebViewPage
         2:  @{
         3:      View.Title = "Home Page";
         4:      LayoutPage = "~/Views/Shared/_Layout.cshtml";
         5:  }
         6:  <h2>@View.Message</h2>
         7:  <p>
         8:      To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC
         9:      Website">http://asp.net/mvc</a>.
        10:  </p>

    So looking through this, we observe the following facts:

    Line 1 imports the base page that all views (using Razor) are based on, which is System.Web.Mvc.WebViewPage. Note that this is different from System.Web.Mvc.ViewPage, which is used by ASP.NET MVC 2.0. Also note that instead of the <% %> syntax, we use the very simple '@' sign. The view engine contains enough context-sensitive logic that it can even distinguish between @ in code and @ in an email address. It's very clean markup.

    Line 2 introduces the idea of a code block in Razor. A code block is a scoping mechanism just like it is in a normal C# class. It is designated by @{ ... } and any C# code can be placed in between. Note that this is all server-side code, just like it is when using the ASPX engine and <% %>.

    Line 3 allows us to set the page title in the client page's file. This is a new feature which I'll talk more about when we get to master pages, but it is another of the nice things Razor brings to ASP.NET MVC development.

    Line 4 is where we specify our "master" page, but as you can see, you can place it almost anywhere you want, because you tell it where it is located. A layout page is similar to a master page, but it gains a bit when it comes to flexibility. Again, we'll come back to this in a later installment.

    Line 6 and beyond is where we display the contents of our view. No more using <%: %> intermixed with code. Instead, we get to use very clean syntax such as @View.Message. This is a lot easier to read than <%: View.Message %>, especially when intermixed with HTML. For example:

        <p> My name is @View.Name and I live at @View.Address </p>

    Compare this to the equivalent using the ASPX view engine:

        <p> My name is <%: View.Name %> and I live at <%: View.Address %> </p>

    While not an earth-shaking simplification, it is easier on the eyes. As we explore other features, this clean markup will become more and more valuable.

    Read the article

  • DNS query.log - Multiple queries for ripe.net

    - by Christopher Wilson
    Currently I run a DNS server (BIND 9) that handles queries from clients over the internet. Lately I have noticed hundreds of queries from all different addresses that look like this (server IP removed):

        client 216.59.33.210#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 216.59.33.204#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 208.64.127.5#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 184.107.255.202#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 208.64.127.5#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 208.64.127.5#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 205.204.65.83#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 69.162.110.106#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 216.59.33.210#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 69.162.110.106#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 216.59.33.204#53: query: ripe.net IN ANY +ED (0.0.0.0)
        client 208.64.127.5#53: query: ripe.net IN ANY +ED (0.0.0.0)

    Can someone please explain why there are so many clients querying for ripe.net?

    Read the article

  • IIS mystery: "Deadlock detected" periodically makes site unavailable

    - by jskunkle
    A few times a day, our VB.NET (IIS 6.0) website randomly throws the following error and becomes completely unavailable for 5-15 minutes at a time while the application is recycled: ISAPI 'c:\windows\microsoft.net\framework\v2.0.50727\aspnet_isapi.dll' reported itself as unhealthy for the following reason: 'Deadlock detected'. The website ran for months on the exact same server in beta without problems, but this started over the weekend when we made the site live. The live site is under some load, but less than many of our other production websites. How should I attack this problem? I've looked into orphaning the worker process and creating a dump file, but I'm not sure how to analyze that. Any advice or information is appreciated. Thanks, Shane

    Read the article

  • gcServer config not taking effect

    - by G33kKahuna
    I'm supporting an ASP.NET v2.0 app installed on Windows 2003 SP3 Enterprise, on a quad-core 8 GB machine running .NET 2.0 SP1.

    1. Before enabling the config, ran "tasklist /m mscorwks.dll": Image Name: w3wp.exe, PID: 7888, Modules: mscorwks.dll
    2. Added <gcServer enabled="true"/> under the <runtime> section in web.config
    3. Ran IISRESET, rebooted the server too
    4. Ran "tasklist /m mscorsvr.dll": INFO: No tasks are running which match the specified criteria.
    5. Ran "tasklist /m mscorwks.dll": Image Name: w3wp.exe, PID: 6251, Modules: mscorwks.dll

    It seems like gcServer is not taking effect. Are there any additional settings/configurations necessary to get it working?

    Read the article

  • Silverlight Version 4 latest build for Win7 64bit and WinXP 32bit

    - by Paul
    I have a requirement where a few people need the latest version of Silverlight 4 installed. I know the latest version is 5.x, but apparently with some new software we're having installed we have to use version 4. After a bit of googling I can see that the latest version 4 build is 4.1.10329.0, released May 8, 2012. We have a mix of Win7 64-bit machines and WinXP 32-bit machines. Q: Is there a different version for each OS, or does the same one fit all? (This seems strangely hard to decipher by googling.) Q: Does anyone know where I can download the latest version 4? Microsoft does not seem to offer it anymore, unless I'm just not finding it. Q: Is there a separate browser version of it, or will installing it also handle any browser needs (our new software will be browser based)? Any pointers much appreciated. Paul

    Read the article

  • Update server version for postgres 9.1.2

    - by Nai
    I'm trying to run a PostGIS SQL script and I'm running into the following error. Am I correct to say that updating my server version will fix it? If so, how can I go about updating it? I'm on Mac OS X Lion and installed Postgres via brew. Apparently I have an older version installed, which is 9.1.2, but installing PostGIS installed Postgres 9.2.1 onto my system. How can I point my Postgres server to the new one?

        nai@nyc /usr/local/share/postgis (git::master) $ psql -d template_postgis -f postgis.sql
        SET
        BEGIN
        psql:postgis.sql:49: ERROR: incompatible library "/usr/local/Cellar/postgresql/9.2.1/lib/postgis-2.0.so": version mismatch
        DETAIL: Server is version 9.1, library is version 9.2.

        nai@nyc /usr/local/share/postgis (git::master) $ psql
        psql (9.2.1, server 9.1.2)
        WARNING: psql version 9.2, server version 9.1. Some psql features might not work.

    Read the article

  • Help with my application please! Can't open image(s) with error: External component has thrown an exception

    - by Brandon
    I have an application, written in C# I believe, that adds images to a SQL Server 2005 database. It requires .NET 3.5 to be installed on my computer. I installed .NET 3.5 and set up a database. It runs fine, but once it gets to image 100 when running on one computer, it stops and gives me this error: "Can't open image(s) with error: External component has thrown an exception....". When I run the program on my own computer I am able to reach 300 images, but then it stops after 300 images and gives me the same error once again. Please help!

    Read the article

  • Does the .NET Framework need to be reoptimized after upgrading to a new CPU microarchitecture?

    - by Louis
    I believe that the .NET Framework will optimize certain binaries, targeting features specific to the machine it's installed on. After changing the CPU from an Intel Nehalem to a Haswell chip, should the optimization be run again manually? If so, what is the process for that? Between generations, here are some notable additions:

    - Westmere: AES instruction set
    - Sandy Bridge: Advanced Vector Extensions
    - Ivy Bridge: RdRand (hardware random number generator), F16C (16-bit floating-point conversion instructions)
    - Haswell: Haswell New Instructions (includes Advanced Vector Extensions 2 (AVX2), gather, BMI1, BMI2, ABM and FMA3 support)

    So my, albeit naive, thought process was that the optimizations could take advantage of these in general cases. For example, perhaps calls to the Random library could utilize the hardware RNG on Ivy Bridge and later models.

    Read the article

  • Database version control resources

    - by Wes McClure
    In the process of creating my own DB VCS tool, tsqlmigrations.codeplex.com, I ran into several good resources that helped guide me along the way, both in reviewing existing offerings and in the concepts that are needed in a good DB VCS. This is my list of helpful links that others can use to understand some of the concepts and some of the tools in existence. In the next few posts I will try to explain how I used these to create TSqlMigrations.

    Blog entries:
    - Three rules for database work - K. Scott Allen: http://odetocode.com/blogs/scott/archive/2008/01/30/three-rules-for-database-work.aspx
    - Versioning databases - the baseline: http://odetocode.com/blogs/scott/archive/2008/01/31/versioning-databases-the-baseline.aspx
    - Versioning databases - change scripts: http://odetocode.com/blogs/scott/archive/2008/02/02/versioning-databases-change-scripts.aspx
    - Versioning databases - views, stored procedures and the like: http://odetocode.com/blogs/scott/archive/2008/02/02/versioning-databases-views-stored-procedures-and-the-like.aspx
    - Versioning databases - branching and merging: http://odetocode.com/blogs/scott/archive/2008/02/03/versioning-databases-branching-and-merging.aspx
    - Evolutionary Database Design - Martin Fowler: http://martinfowler.com/articles/evodb.html
    - Are database migration frameworks worth the effort? - Good challenges: http://www.ridgway.co.za/archive/2009/01/03/are-database-migration-frameworks-worth-the-effort.aspx
    - Continuous Integration (in general): http://martinfowler.com/articles/continuousIntegration.html and http://martinfowler.com/articles/originalContinuousIntegration.html
    - Is Your Database Under Version Control?: http://www.codinghorror.com/blog/archives/000743.html
    - 11 Tools for Database Versioning: http://secretgeek.net/dbcontrol.asp
    - How to do database source control and builds: http://mikehadlow.blogspot.com/2006/09/how-to-do-database-source-control-and.html
    - .Net Database Migration Tool Roundup: http://flux88.com/blog/net-database-migration-tool-roundup/

    Books:
    - Refactoring Databases: Evolutionary Database Design - Martin Fowler signature series on refactoring databases. Book site: http://databaserefactoring.com/
    - Recipes for Continuous Database Integration: Evolutionary Database Development (Digital Short Cut) - A good question/answer layout of common problems and solutions with database version control. http://www.informit.com/store/product.aspx?isbn=032150206X

    Read the article

  • Data Web Controls Enhancements in ASP.NET 4.0

    Traditionally, developers using Web controls enjoyed increased productivity, but at the cost of control over the rendered markup. For instance, many ASP.NET controls automatically wrap their content in a <table> for layout or styling purposes. This behavior runs counter to the web standards that have evolved over the past several years, which favor cleaner, terser HTML; sparing use of tables; and Cascading Style Sheets (CSS) for layout and styling. Furthermore, the <table> elements and other automatically-added content make it harder both to style the Web controls using CSS and to work with the controls from client-side script. One of the aims of ASP.NET version 4.0 is to give Web Forms developers greater control over the markup rendered by Web controls. Last week's article, Take Control Of Web Control ClientID Values in ASP.NET 4.0, highlighted how new properties in ASP.NET 4.0 give the developer more say over how a Web control's ID property is translated into a client-side id attribute. In addition to these ClientID-related properties, many Web controls in ASP.NET 4.0 include properties that allow the page developer to instruct the control not to emit extraneous markup, or to use an HTML element other than <table>. This article explores a number of enhancements made to the data Web controls in ASP.NET 4.0. As you'll see, most of these enhancements give the developer greater control over the rendered markup. Read on to learn more!
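
    As a rough illustration of the kind of control the article is describing, the ASP.NET 4.0 properties below can also be set declaratively in the .aspx markup; the page and control names here are hypothetical, and the control fields would normally be declared in the page markup rather than in code.

        using System;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        public partial class ProductsPage : Page
        {
            // In a real page these controls would be declared in the .aspx markup;
            // they are shown as fields here only so the snippet stands alone.
            protected FormView ProductFormView;
            protected CheckBoxList FilterOptions;

            protected void Page_Load(object sender, EventArgs e)
            {
                // ASP.NET 4.0: stop the FormView wrapping its templates in an outer <table>.
                ProductFormView.RenderOuterTable = false;

                // Render the filter options as a CSS-friendly <ul> instead of a <table>.
                FilterOptions.RepeatLayout = RepeatLayout.UnorderedList;

                // Emit the control's ID unchanged so client-side script can rely on it.
                ProductFormView.ClientIDMode = ClientIDMode.Static;
            }
        }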

    Read the article

  • Incrementing Assembly Version in TFS Builds and its Effect on Other Build Definitions

    - by ssmantha
    A very common scenario in TFS builds is incrementing the version number of the assemblies. There are quite a few approaches, of which I would like to share two links: Ewald Hofman's approach: http://www.ewaldhofman.nl/post/2010/05/13/Customize-Team-Build-2010-e28093-Part-5-Increase-AssemblyVersion.aspx#id_02e7b082-ce95-49a9-92e9-7dc88887b377 and Richard Banks' approach: http://www.richard-banks.org/2010/07/how-to-versioning-builds-with-tfs-2010.html. Both of these approaches work well; however, there are scenarios where editing and checking in the assembly version information can create problems for build definitions meant for continuous integration or gated check-ins. You can suppress the continuous integration builds while checking in the AssemblyInfo file by putting the comment "***NO_CI***" in the check-in comment, as specified by Ewald in his blog. However, if you have gated check-in in place, this can be difficult to suppress; I tried to suppress the build trigger during the check-in process myself, but it didn't turn out well. That's where Richard's solution comes in handy. Both solutions have their own pros and cons, which I believe can only be discovered over a period of time. In the case of Richard's solution, I believe we don't keep any history of the assembly version info file, and when you get the latest version of the solution that information is lost. Note that suppressing continuous integration (the NO_CI approach in check-in comments) is a workaround provided by Microsoft; however, I haven't found anything to suppress a gated check-in so far. Suggestions or findings are most welcome.

    Read the article

  • Extending ASP.NET Output Caching

    One of the most sure-fire ways to improve a web application's performance is to employ caching. Caching takes some expensive operation and stores its results in a quickly accessible location. Since its inception, ASP.NET has offered two flavors of caching:

    - Output Caching - caches the entire rendered markup of an ASP.NET page or User Control for a specified duration.
    - Data Caching - an API for caching objects. Using the data cache you can write code to add, remove, and retrieve items from the cache.

    Until recently, the underlying functionality of these two caching mechanisms was fixed - both cached data in the web server's memory. This has its drawbacks. In some cases, developers may want to save output cache content to disk. When using the data cache you may want to cache items to the cloud or to a distributed caching architecture like memcached. The good news is that with ASP.NET 4 and the .NET Framework 4, the output caching and data caching options are now much more extensible. Both caching features are now based upon the provider model, meaning that you can create your own output cache and data cache providers (or download and use a third-party or open source provider) and plug them into a new or existing ASP.NET 4 application. This article focuses on extending the output caching feature. We'll walk through how to create a custom output cache provider that caches a page or User Control's rendered output to disk (as opposed to memory) and then see how to plug the provider into an ASP.NET application. A complete working example, available in both VB and C#, is available for download at the end of this article. Read on to learn more!
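
    To show the shape of the provider model the article builds on, here is a minimal sketch of a custom output cache provider. Unlike the article's disk-based sample, this one just keeps entries in an in-memory dictionary; it only illustrates the four members every provider must override.

        using System;
        using System.Collections.Concurrent;
        using System.Web.Caching;

        public class SimpleOutputCacheProvider : OutputCacheProvider
        {
            private class Entry
            {
                public object Value;
                public DateTime UtcExpiry;
            }

            private static readonly ConcurrentDictionary<string, Entry> _cache =
                new ConcurrentDictionary<string, Entry>();

            public override object Get(string key)
            {
                Entry entry;
                if (_cache.TryGetValue(key, out entry) && entry.UtcExpiry > DateTime.UtcNow)
                    return entry.Value;

                _cache.TryRemove(key, out entry);
                return null;
            }

            public override object Add(string key, object entry, DateTime utcExpiry)
            {
                // Add must not overwrite an existing, unexpired item; return whatever is cached.
                object existing = Get(key);
                if (existing != null)
                    return existing;

                Set(key, entry, utcExpiry);
                return entry;
            }

            public override void Set(string key, object entry, DateTime utcExpiry)
            {
                _cache[key] = new Entry { Value = entry, UtcExpiry = utcExpiry };
            }

            public override void Remove(string key)
            {
                Entry removed;
                _cache.TryRemove(key, out removed);
            }
        }

    A provider like this is then registered in web.config under the outputCache element's providers collection and selected via its defaultProvider attribute, which is the plumbing the article walks through in detail.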

    Read the article

  • Optimize Images Using the ASP.NET Sprite and Image Optimization Framework

    The HTML markup of a web page includes the page's textual content, semantic and styling information, and, typically, several references to external resources. External resources are content that is part of a web page but separate from the web page's markup - things like images, style sheets, script files, Flash videos, and so on. When a browser requests a web page it starts by downloading its HTML. Next, it scans the downloaded HTML for external resources and starts downloading those. A page with many external resources usually takes longer to load completely than a page with fewer external resources, because there is an overhead associated with downloading each external resource. For starters, each external resource requires the browser to make an HTTP request to retrieve the resource. What's more, browsers have a limit on how many HTTP requests they will make in parallel. For these reasons, a common technique for improving a page's load time is to consolidate external resources in a way that reduces the number of HTTP requests that must be made by the browser to load the page in its entirety. This article examines the free and open-source ASP.NET Sprite and Image Optimization Framework, a project developed by Microsoft for improving a web page's load time by consolidating images into a sprite or by using inline, base-64 encoded images. In a nutshell, this framework makes it easy to implement practices that will improve the load time for a web page that displays several images. Read on to learn more!
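
    As a small illustration of the inline, base-64 encoded image technique mentioned above (the framework automates this decision; the helper name and the simplified MIME handling here are assumptions for the sketch):

        using System;
        using System.IO;

        public static class InlineImageHelper
        {
            // Builds a data: URI for a small image so it can be emitted directly in an
            // <img src="..."> attribute, avoiding a separate HTTP request for the file.
            public static string ToDataUri(string imagePath)
            {
                byte[] bytes = File.ReadAllBytes(imagePath);
                string mimeType = Path.GetExtension(imagePath).ToLowerInvariant() == ".gif"
                    ? "image/gif"
                    : "image/png";

                return "data:" + mimeType + ";base64," + Convert.ToBase64String(bytes);
            }
        }

    An Image control could then be fed the result, e.g. logo.ImageUrl = InlineImageHelper.ToDataUri(Server.MapPath("~/images/logo.png")); the trade-off is a larger HTML payload in exchange for one fewer request.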

    Read the article

  • Storing Entity Framework Entities in a Separate Assembly

    - by Anthony Trudeau
    The Entity Framework has been valuable to me since it came out, because it provided a convenient and powerful way to model against my data source in a consistent way. The first versions had some deficiencies that, for me, mostly fell into the category of tight coupling between the model and its resulting object classes (entities). Version 4 of the Entity Framework pretty much solves this with support for T4 templates that allow you to implement your entities as self-tracking entities, plain old CLR objects (POCO), et al. Doing this involves either specifying a new code generation template or implementing them yourself. Visual Studio 2010 ships with a self-tracking entities template, and a POCO template is available from the Extension Manager. (Extension Manager is very nice, but it's very easy to waste a bunch of time exploring add-ins. You've been warned.) In a current project I wanted to use POCO; however, I didn't want my entities in the same assembly as the context classes. It would be nice if this were automatic, but since it isn't, here are the simple steps to move them. These steps detail moving the entity classes and not the context. The context can be moved in the same way, but I don't see a compelling reason to physically separate the context from my model.

    1. Turn off code generation for the template. To do this, set the Custom Tool property for the entity template file to an empty string (the entity template file will be named something like MyModel.tt).
    2. Expand the tree for the entity template file and delete all of its items. These are the items that were automatically generated when you added the template.
    3. Create a project for your entities (if you haven't already).
    4. Add an existing item and browse to your entity template file, but add it as a link (do not add it directly). Adding it as a link will allow the model and the template to stay in sync, but the code generation will occur in the new assembly.

    Read the article

  • Entity Framework &amp; Transactions

    - by Sudheer Kumar
    There are many instances where we might have to use transactions to maintain data consistency. With Entity Framework, it is a little different conceptually.

    Case 1 - Transaction between multiple SaveChanges() calls: here, if you just use a transaction scope, then Entity Framework (EF) will use distributed transactions instead of local transactions. The reason is that EF closes and opens the connection only when required, which means it uses two different connections for the different SaveChanges() calls. To resolve this, use the following method, where we open the connection explicitly so the work does not span multiple connections.

        try
        {
            using (TransactionScope ts = new TransactionScope())
            {
                context.Connection.Open();
                // Operation 1: context.SaveChanges();
                // Operation 2: context.SaveChanges();
                // At the end, complete the transaction
                ts.Complete();
            }
        }
        catch (Exception ex)
        {
            // Handle exception
        }
        finally
        {
            if (context.Connection.State == ConnectionState.Open)
            {
                context.Connection.Close();
            }
        }

    Case 2 - Transaction between DB and non-DB operations: for example, assume that you have a table that keeps track of emails to be sent. Here you want to update certain details, like DateSent, only if the mail was successfully sent by the e-mail client.

        Email eml = GetEmailToSend();
        eml.DateSent = DateTime.Now;

        using (TransactionScope ts = new TransactionScope())
        {
            // Update the DB
            context.SaveChanges();

            // If the update is successful, send the email using the SMTP client
            smtpClient.Send();

            // If the send was successful, then commit
            ts.Complete();
        }

    Here, since you are dealing with a single context.SaveChanges(), you just need to wrap the TransactionScope around that one save. Hope this was helpful!

    Read the article
