Search Results


  • Is IE9 a modern browser?

    - by TATWORTH
    At http://people.mozilla.com/~prouget/ie9/ there is a very provocative article entitled "Is IE9 a modern browser?". There is a rebuttal by Tim Sneath at http://blogs.msdn.com/b/tims/archive/2011/02/15/a-modern-browser.aspx that is well worth a look. Certainly IE9 is already superior to its predecessors. My comment on the matter is that those who consider IE9 to be non-standards compliant should submit tests to the W3C to demonstrate the non-compliance. Upon acceptance by the W3C, all the competing browsers can then be re-tested. I prefer objective tests to subjective opinion. I have used IE9, and on some sites such as Hotmail it is noticeably faster. I have so far been unable to apply the promised IE9 lockout of spyware cookies. With Firefox, I just install NoScript and never enable spyware sites.

    Read the article

  • Using Query Classes With NHibernate

    - by Liam McLennan
    Even when using an ORM, such as NHibernate, the developer still has to decide how to perform queries. The simplest strategy is to get access to an ISession and directly perform a query whenever you need data. The problem is that doing so spreads query logic throughout the entire application – a clear violation of the Single Responsibility Principle. A more advanced strategy is to use Eric Evans' Repository pattern, thus isolating all query logic within the repository classes. I prefer to use Query Classes. Every query needed by the application is represented by a query class, aka a specification. To perform a query I:

    1. Instantiate a new instance of the required query class, providing any data that it needs
    2. Pass the instantiated query class to an extension method on NHibernate's ISession type

    Querying my database for all people over the age of sixteen looks like this:

        [Test]
        public void QueryBySpecification()
        {
            var canDriveSpecification = new PeopleOverAgeSpecification(16);
            var allPeopleOfDrivingAge = session.QueryBySpecification(canDriveSpecification);
        }

    To be able to query for people over a certain age I had to create a suitable query class:

        public class PeopleOverAgeSpecification : Specification<Person>
        {
            private readonly int age;

            public PeopleOverAgeSpecification(int age)
            {
                this.age = age;
            }

            public override IQueryable<Person> Reduce(IQueryable<Person> collection)
            {
                return collection.Where(person => person.Age > age);
            }

            public override IQueryable<Person> Sort(IQueryable<Person> collection)
            {
                return collection.OrderBy(person => person.Name);
            }
        }

    Finally, the extension method to add QueryBySpecification to ISession:

        public static class SessionExtensions
        {
            public static IEnumerable<T> QueryBySpecification<T>(this ISession session, Specification<T> specification)
            {
                return specification.Fetch(
                    specification.Sort(
                        specification.Reduce(session.Query<T>())
                    )
                );
            }
        }

    The inspiration for this style of data access came from Ayende's post Do You Need a Framework?. I am sick of working through multiple layers of abstraction that don't do anything. Have you ever seen code that required a service layer to call a method on a repository, which delegated to a common repository base class that wrapped an ORM's unit of work? I can achieve the same thing with NHibernate's ISession and a single extension method. If you're interested you can get the full Query Classes example source from GitHub.
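    The Specification<T> base class itself is not shown in the post; a minimal sketch consistent with the Reduce/Sort/Fetch calls used by the extension method above might look like this (the virtual Fetch default is my assumption):

        public abstract class Specification<T>
        {
            // Narrow the query down to the entities this specification describes
            public abstract IQueryable<T> Reduce(IQueryable<T> collection);

            // Order the filtered results
            public abstract IQueryable<T> Sort(IQueryable<T> collection);

            // Execute the query; enumerating is a reasonable default, and a
            // subclass could override this to page or project the results
            public virtual IEnumerable<T> Fetch(IQueryable<T> collection)
            {
                return collection.ToList();
            }
        }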

    Read the article

  • BizTalk 2009 - Messages: Last 100 Sent

    - by StuartBrierley
    Having previously talked about the lack of the traditional HAT in BizTalk 2009, the question then becomes: how do you replicate some of the functionality that was previously relied on? I have already covered the Last 100 Messages Received query, so what about sent messages? In BizTalk 2004 we had a query in HAT to return the messages sent in the last day. While not a direct replacement, the following query replicates some of the usefulness of that query in a BizTalk 2009 HAT-less environment. Basically we are creating a query to search for the last one hundred tracked messages that were sent by BizTalk; a sketch of the settings follows the list below.

    Coming up:
    - Messages - last 50 suspended
    - Service instances - last 100
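    As promised above, here is a rough sketch of the query settings. The screenshots from the original post are not reproduced here; this is from memory of the BizTalk 2009 Administration Console Group Hub page, and the exact field names may vary:

        Search For:       Tracked Message Events
        Event Type:       Send
        Maximum Matches:  100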

    Read the article

  • What are they buying – work or value?

    - by Jamie Kurtz
    When was the last time you ordered a pizza like this: "I want the high school kid in the back to do the following… make a big circle with some dough, curl up the edges, then put some sauce on it using a small ladle, then I want him to take a handful of shredded cheese from the metal container and spread it over the circle and sauce, then finally I want the kid to place 36 pieces of pepperoni over the top of the cheese"? Probably never. My typical pizza order usually goes more like this: "I want a large pepperoni pizza". In the world of software development, we try so hard to be all things agile. We:

    - Write lots of unit tests
    - Refactor our code, then refactor it some more
    - Avoid writing lengthy requirements documents
    - Try to keep processes to a minimum, and give developers freedom
    - Take pride in our constantly shifting focus (i.e. we're "responding to change")

    Yet, after all this, we fail to really lean on and capitalize on one of agile's main differentiators (from the twelve principles behind the Agile Manifesto): "Working software is the primary measure of progress." That is, we foolishly commit to delivering tasks instead of features and bug fixes. Like my pizza example above, we fall into the trap of signing contracts that bind us to doing tasks – rather than delivering working software. And the biggest problem here… by far the most troubling outcome… is that we don't let working software be a major force in all the work we do. When teams manage to ruthlessly focus on the end product, it puts them on the path of true agile. It doesn't let them accidentally write too much documentation, or spend lots of time and money on processes and fancy tools. It forces early testing that reveals problems in the feature or bug fix. And it forces lots and lots of customer interaction. Without that focus on the end product as your deliverable… by committing to a list of tasks instead of a list of features and bug fixes… you are doomed to NOT be agile. You will end up just doing stuff, spending time on the keyboard, burning time on timesheets. Doing tasks doesn't force you to minimize documentation. It makes it much harder to respond to change. And it will eventually force you and the client into contract haggling. Because the customer isn't really paying you to do stuff. He's ultimately paying for features and bug fixes. And when the customer doesn't get what he wants, responding with "well, look at the contract - we did all the tasks we committed to" doesn't typically generate referrals or callbacks. In short, if you're trying to deliver real value to the customer by going agile, you will most certainly fail if all you commit to is a list of things you're going to do. Give agile what it needs by committing to features and bug fixes – not a list of ToDo items. So the next time you are writing up a contract, remember that the customer is buying the finished pizza – not the list of instructions for the kid in the back.

    Read the article

  • Extending Expression Blend 4 & Blend for Visual Studio 2012

    - by Chris Skardon
    Just getting this off the bat, I presume this will also work for Blend 5, but I can't confirm it… Anyhews, I imagine you're here because you want to know how to create an addin for Blend, so let's jump right in there! First and foremost, we're going to need to ensure our development environment has the right setup, so the checklist:

    - Visual Studio 2012
    - Blend for Visual Studio 2012

    OK, let's create a new project (class library, .NET 4.5): Hello.Extension. The '.Extension' bit is very very important. The addin will not work unless it is named in this way. You can put whatever you want at the front, but it has to have the extension bit. OK, so now we have a solution with one project. To this project we need to add references to the following things:

    - Microsoft.Expression.Extensibility (from c:\program files\Microsoft Visual Studio 11.0\Blend\ – the x86 folder if you are on an x64 Windows install)
    - Microsoft.Expression.Framework (same location as above)
    - PresentationCore
    - PresentationFramework
    - WindowsBase
    - System.ComponentModel.Composition

    Got them? ACE. Let's now add a project to contain our control, so create a new WPF Application project, cunningly named something like 'Hello.Control'… (I'm creating a WPF application here because I'm too lazy to dig up the correct references, and this will add all the ones I need.) Once that is created, delete the App.xaml and MainWindow.xaml files, we won't be needing them. You will also need to change the properties of the project itself so it is only a class library. Once that is done, let's add a new UserControl, which will be this:

        <UserControl x:Class="Hello.Control.HelloControl"
                     xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                     xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                     xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
                     xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
                     mc:Ignorable="d"
                     d:DesignHeight="300" d:DesignWidth="300">
            <Grid>
                <TextBlock Text="HELLO!!!"/>
            </Grid>
        </UserControl>

    Impressive eh? Now, let's reference the WPF project from the Extension library. All that's left now is to code up our extension… So, add a class to the Extension project (name wise doesn't matter), and make it implement the IPackage interface from the Microsoft.Expression.Extensibility library:

        public class HelloExtension : IPackage
        {
            /**/
        }

    We'll implement the two methods we need to:

        public class HelloExtension : IPackage
        {
            public void Load(IServices services) { }
            public void Unload() { }
        }

    We're only really concerned about the Load method in this case, as let's face it, the extension we have doesn't need to do a lot to bog off. The interesting thing about the Load method is that it receives an IServices instance. This allows us to get access to all the services that Expression provides; in this case we're interested in one in particular, the 'IWindowService'. So, let's get that bad boy…

        private IWindowService _windowService;

        public void Load(IServices services)
        {
            _windowService = services.GetService<IWindowService>();
        }

    Nailed it… But why? The WindowService allows us to register our UserControl with Blend, which in turn allows people to activate and see it, which is a big plus point.
    So, let's do that… We'll create an 'Initialize' method to create our new control and add it to the WindowService:

        private HelloControl _helloControl;

        public void Initialize()
        {
            _helloControl = new HelloControl();
            if (_windowService.PaletteRegistry["HelloPanel"] == null)
                _windowService.RegisterPalette("HelloPanel", _helloControl, "Hello Window");
        }

    First we check that we're not already registered, and if we're not, we register: the first argument is the identifier used by the service to, well, identify your extension. The second argument is the actual control, and the third argument is the name that people will see in the 'Window' menu of Blend itself (so important note here – don't put anything embarrassing or (need I say it?) sweary…). There are only two things to do now: call 'Initialize()' from our Load method, and export the class. This is easy money – add [Export(typeof(IPackage))] to the top of our class… The full code will (should) look like this:

        [Export(typeof (IPackage))]
        public class HelloExtension : IPackage
        {
            private HelloControl _helloControl;
            private IWindowService _windowService;

            public void Load(IServices services)
            {
                _windowService = services.GetService<IWindowService>();
                Initialize();
            }

            public void Unload() { }

            public void Initialize()
            {
                _helloControl = new HelloControl();
                if (_windowService.PaletteRegistry["HelloControl"] == null)
                    _windowService.RegisterPalette("HelloControl", _helloControl, "Hello Window");
            }
        }

    If you build this and copy it to your 'Extensions' folder in Blend (c:\program files\microsoft visual studio 11.0\blend\) and start Blend, you should see 'Hello Window' listed in the Window menu. That, as they say, is it!

    Read the article

  • Faster, Simpler access to Azure Tables with Enzo Azure API

    - by Herve Roggero
    After developing the latest version of Enzo Cloud Backup I took the time to create an API that would simplify access to Azure Tables (the Enzo Azure API). At first, my goal was to make the code simpler compared to the Microsoft Azure SDK. But as it turns out it is also a little faster; and when using the specialized methods (the fetch strategies) it is much faster out of the box than the Microsoft SDK, unless you start creating complex parallel and resilient routines yourself. Last but not least, I decided to add a few extension methods that I think you will find attractive, such as the ability to transform a list of entities into a DataTable. So let's review each area in more detail.

    Simpler Code

    My first objective was to make the API much easier to use than the Azure SDK. I wanted to reduce the amount of code necessary to fetch entities, remove the code needed to add automatic retries and handle transient conditions, and give additional control, such as a way to cancel operations, obtain basic statistics on the calls, and control the maximum number of REST calls the API generates in an attempt to avoid throttling conditions in the first place (something you cannot do with the Azure SDK at this time).

    Strongly Typed

    Before diving into the code: the following examples rely on a strongly typed class called MyData. The way MyData is defined for the Azure SDK is similar to the Enzo Azure API, with the exception that they inherit from different classes. With the Azure SDK, classes that represent entities must inherit from TableServiceEntity, while classes with the Enzo Azure API must inherit from BaseAzureTable or implement a specific interface.

        // With the SDK
        public class MyData1 : TableServiceEntity
        {
            public string Message { get; set; }
            public string Level { get; set; }
            public string Severity { get; set; }
        }

        // With the Enzo Azure API
        public class MyData2 : BaseAzureTable
        {
            public string Message { get; set; }
            public string Level { get; set; }
            public string Severity { get; set; }
        }

    Now that the classes representing an Azure Table entity are defined, let's review what the code looks like with the Azure SDK when fetching all the entities from an Azure Table (note the use of a few variables: the _tableName variable stores the name of the Azure Table, and the ConnectionString property returns the connection string for the Storage Account containing the table):

        // With the Azure SDK
        public List<MyData1> FetchAllEntities()
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
            CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
            TableServiceContext serviceContext = tableClient.GetDataServiceContext();

            CloudTableQuery<MyData1> partitionQuery =
                (from e in serviceContext.CreateQuery<MyData1>(_tableName)
                 select new MyData1()
                 {
                     PartitionKey = e.PartitionKey,
                     RowKey = e.RowKey,
                     Timestamp = e.Timestamp,
                     Message = e.Message,
                     Level = e.Level,
                     Severity = e.Severity
                 }).AsTableServiceQuery<MyData1>();

            return partitionQuery.ToList();
        }

    This code gives you automatic retries, because AsTableServiceQuery does that for you. Also, note that this method is strongly typed because it is using LINQ. Although this doesn't look like too much code at first glance, you are actually mapping the strongly-typed object manually. So for larger entities, with dozens of properties, your code will grow.
    And from a maintenance standpoint, when a new property is added, you may need to change the mapping code. You will also note that the mapping being performed is optional; it is desired when you want to retrieve specific properties of the entities (not all) to reduce the network traffic. If you do not specify the properties you want, all the properties will be returned; in this example we are returning the Message, Level and Severity properties (in addition to the required PartitionKey, RowKey and Timestamp). The Enzo Azure API does the mapping automatically and also handles automatic retries when fetching entities. The equivalent code to fetch all the entities (with the same three properties) from the same Azure Table looks like this:

        // With the Enzo Azure API
        public List<MyData2> FetchAllEntities()
        {
            AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);
            List<MyData2> res = at.Fetch<MyData2>("", "Message,Level,Severity");
            return res;
        }

    As you can see, the Enzo Azure API returns the entities already strongly typed, so there is no need to map the output. Also, the Enzo Azure API makes it easy to specify the list of properties to return, and to specify a filter as well (no filter was provided in this example; the filter is passed as the first parameter).

    Fetch Strategies

    Both approaches discussed above fetch the data sequentially. In addition to the linear/sequential fetch methods, the Enzo Azure API provides specific fetch strategies. Fetch strategies are designed to prepare a set of REST calls, executed in parallel, in a way that performs faster than if you were to fetch the data sequentially. For example, if the PartitionKey is a GUID string, you could prepare multiple calls, providing appropriate filters ([‘a’, ‘b’[, [‘b’, ‘c’[, [‘c’, ‘d’[, …), and send those calls in parallel. As you can imagine, the code necessary to create these requests would be fairly large. With the Enzo Azure API, two strategies are provided out of the box: the GUID and List strategies. If you are interested in how these strategies work, see the Enzo Azure API Online Help. Here is an example that performs parallel requests using the GUID strategy (which executes 2 to 3 times faster than the sequential methods discussed previously):

        public List<MyData2> FetchAllEntitiesGUID()
        {
            AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);
            List<MyData2> res = at.FetchWithGuid<MyData2>("", "Message,Level,Severity");
            return res;
        }

    Faster Results

    With Sequential Fetch Methods

    Developing a faster API wasn't a primary objective, but it appears that the performance tests performed with the Enzo Azure API deliver the data a little faster out of the box (5%-10% on average, and sometimes up to 50% faster) with the sequential fetch methods. Although the amount of data is the same regardless of the approach (and the REST calls are almost exactly identical), the object mapping approach is different. So it is likely that the slight performance increase is due to a lighter API. Using LINQ offers many advantages and tremendous flexibility; nevertheless when fetching data it seems that the Enzo Azure API delivers faster. For example, the same code previously discussed delivered the following results when fetching 3,000 entities (about 1KB each). The average elapsed time shows that the Azure SDK returned the 3,000 entities in about 5.9 seconds on average, while the Enzo Azure API took 4.2 seconds on average (39% improvement).
    With Fetch Strategies

    When using the fetch strategies we are no longer comparing apples to apples; the Azure SDK is not designed to implement fetch strategies out of the box, so you would need to code the strategies yourself. Nevertheless I wanted to provide out-of-the-box capabilities, and as a result you see a test that returned about 10,000 entities (1KB each entity), with an average execution time over 5 runs. The Azure SDK implemented a sequential fetch while the Enzo Azure API implemented the List fetch strategy. The fetch strategy was 2.3 times faster. Note that the following test quickly hit a limit on my network bandwidth (3.56Mbps), so the results of the fetch strategy are significantly below what they could be with higher bandwidth.

    Additional Methods

    The API wouldn't be complete without support for a few important methods other than the fetch methods discussed previously. The Enzo Azure API offers these additional capabilities:

    - Support for batch updates, deletes and inserts
    - Conversion of entities to DataRow, and List<> to a DataTable
    - Extension methods for Delete, Merge, Update, Insert
    - Support for asynchronous calls and cancellation
    - Support for fetch statistics (total bytes, total REST calls, retries…)

    For more information, visit http://www.bluesyntax.net or go directly to the Enzo Azure API page (http://www.bluesyntax.net/EnzoAzureAPI.aspx).

    About Herve Roggero

    Herve Roggero, Windows Azure MVP, is the founder of Blue Syntax Consulting, a company specialized in cloud computing products and services. Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" from Apress and runs the Azure Florida Association (on LinkedIn: http://www.linkedin.com/groups?gid=4177626). For more information on Blue Syntax Consulting, visit www.bluesyntax.net.

    Read the article

  • First blog post from Surface RT using Microsoft Word 2013

    - by Enrique Lima
    One of the concerns I had in using a Surface RT was my need to be able to post. Recently, and not so recently, I have stopped posting. Between getting busy and carrying different devices, it has been hard to do. I tried doing it with an iPad, and I can't say it didn't work, it just didn't work for me. Again, back to the concern with the Surface RT. Looking at the App Store I started getting that same frustration I had with other platforms, one that left me with a feeling of "I have to compromise because I am on a SubText platform". So, I stuck to posting from Windows Live Writer (great tool!). This whole situation made me think and rethink my strategy, and then … a big DUH! What about using Microsoft Word 2013 for that? Would it work? So, here is the test!

    Read the article

  • Building Enterprise Smartphone App – Part 2: Platforms and Features

    - by Tim Murphy
    This is part 2 in a series of posts based on a talk I gave recently at the Chicago Information Technology Architects Group. Feel free to leave feedback. In the previous post I discussed what reasons a company might have for creating a smartphone application. In this installment I will cover some of the history and state of the different platforms, as well as features that can be leveraged for building enterprise smartphone applications.

    Platforms

    Before you start choosing a platform to develop your solutions on, it is good to understand how we got here and what features you can leverage.

    History

    To my memory we owe all of this to a product called the Apple Newton, which came out in 1993. It was the first PDA, and back then I was much more of an Apple fan. I was very impressed with this device even though it never really went anywhere. The Palm Pilot by US Robotics was the next major advancement in PDAs. It had a simple shorthand window that allowed for quick stylus entry. Later, Windows CE came out and started the broadening of the PDA market. After that it was the Palm and CE operating systems that started showing up on cell phones, and for some time these were the two dominant operating systems distributed with devices from multiple hardware vendors.

    Current

    The iPhone was the first smartphone to take away the stylus and give us a multi-touch interface. It was a revolution in usability and really changed the attractiveness of smartphones for the general public. This brought us to the beginning of the current state of the market, with the concept of an online store that makes it easy for customers to get new features and functionality on demand.

    With Android, Google made this more than a one-horse race. Not only did they come to compete, their low cost actually made them the leading OS. Of course what made Android so attractive is also its major fault. It is so open that it has been a target for malware, which leaves consumers exposed. Fortunately for Google though, most consumers aren't aware of the threat that they are under.

    Although Microsoft had put out one of the first smartphone operating systems with CE, it had to play catch up and finally came out with the Windows Phone. They have gone for a market approach between those of iOS and Android. They support multiple hardware vendors like Google, but they kept a certification process for applications that is similar to Apple's. They also created a user interface that was different enough to give it a clear separation from the other two platforms.

    The result of all this is hundreds of millions of smartphones being sold monthly across all three platforms, giving us a wide range of choices and challenges when it comes to developing solutions.

    Features

    So what are the features that make these devices flexible enough to be considered for use in the enterprise? The biggest advantage of today's devices is network connectivity. The ability to access information from multiple sources at a moment's notice is critical for businesses. Add to that the ability to communicate over a variety of text, voice and video modes and we have a powerful starting point.

    Every smartphone has a camera, and they are not just useful for posting to Instagram. We are seeing more applications, such as Bing Vision, that allow us to scan just about any printed code or text to find information.
    These capabilities have been made available to developers in the form of standard libraries for reading barcodes of just about any flavor and for optical character recognition (OCR).

    Bluetooth gives us the ability to communicate with multiple devices. Whether these are headsets, keyboards or printers, the wireless communication capabilities are just starting to evolve. The more these wireless communication protocols grow, the more opportunities we will see to transfer data between users and a variety of devices.

    Local storage of information that can be called up even when the device cannot reach the network is the other big capability. This gives users the ability to work offline as well as transmit information when connections are restored.

    These are the tools that we have to work with to build applications that can be leveraged to gain a competitive advantage for companies that implement them.

    Coming Up

    In the third installment I will cover key concerns that you face when building enterprise smartphone apps.

    del.icio.us Tags: smartphones, enterprise smartphone apps, architecture, iOS, Android, Windows Phone

    Read the article

  • MVC Razor Engine For Beginners Part 1

    - by Humprey Cogay, C|EH, E|CSA
    I. What is MVC?
    a. http://www.asp.net/mvc/tutorials/older-versions/overview/asp-net-mvc-overview

    II. Software requirements for this tutorial
    a. Visual Studio 2010/2012. You can get your free copy here: Microsoft Visual Studio 2012
    b. MVC Framework
       Option 1 - Install using a standalone installer: http://www.microsoft.com/en-us/download/details.aspx?id=30683
       Option 2 - Install using the Web Platform Installer: http://www.microsoft.com/web/handlers/webpi.ashx?command=getinstallerredirect&appid=MVC4VS2010_Loc

    III. Creating your first MVC4 application
    a. In Visual Studio, click the File > New Solution link.
    b. Click Other Project Types > Visual Studio Solutions, select Blank Solution on the templates window, and let us name our solution MVCPrimer.
    c. Now click File > New and select Project.
    d. Select Visual C# > Web, select ASP.NET MVC 4 Web Application, and enter MyWebSite as the name.
    e. Select Empty, Razor as the view engine, and uncheck Create a unit test project.
    f. You can now view a basic MVC 4 application structure in your Solution Explorer.
    g. Now we will add our first controller by right-clicking on the Controllers folder in your Solution Explorer and selecting Add > Controller.
    h. Change the name of the controller to HomeController and under the scaffolding options select Empty MVC Controller.
    i. You will now see a basic controller with an Index method that returns an ActionResult.
    j. We will now add a new view folder for our Home controller. Right-click on the Views folder in your Solution Explorer, select Add > New Folder, and name this folder Home.
    k. Add a new view by right-clicking on the Views > Home folder and selecting Add > View.
    l. Name the view Index, select Razor (CSHTML) as the view engine, leave all checkboxes unchecked for now, and click Add.
    m. Note the relationship between our HomeController and the Home views subfolder.
    n. Add new HTML content to our newly created Index view.
    o. Press F5 to run our MVC application.
    p. We will now create our model. Right-click on the Models folder of our Solution Explorer and select Add > Class.
    q. Let us name our class Customer.
    r. Edit the Customer class with the following code:
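    (The Customer class appeared as a screenshot in the original post; this is a minimal reconstruction based on the properties used by the controller in step s.)

        public class Customer
        {
            public int Id { get; set; }
            public string CompanyName { get; set; }
            public string ContactNo { get; set; }
            public string ContactPerson { get; set; }
            public string Description { get; set; }
        }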
    s. Open the HomeController by double-clicking HomeController in our Controllers folder and edit it as follows:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.Mvc;

        namespace MyWebSite.Controllers
        {
            public class HomeController : Controller
            {
                //
                // GET: /Home/
                public ActionResult Index()
                {
                    return View();
                }

                public ActionResult ListCustomers()
                {
                    List<Models.Customer> customers = new List<Models.Customer>();

                    //Add first customer to our collection
                    customers.Add(new Models.Customer()
                    {
                        Id = 1,
                        CompanyName = "Volvo",
                        ContactNo = "123-0123-0001",
                        ContactPerson = "Gustav Larson",
                        Description = "Volvo Car Corporation, or Volvo Personvagnar AB, is a Scandinavian automobile manufacturer founded in 1927"
                    });

                    //Add second customer to our collection
                    customers.Add(new Models.Customer()
                    {
                        Id = 2,
                        CompanyName = "BMW",
                        ContactNo = "999-9876-9898",
                        ContactPerson = "Franz Josef Popp",
                        Description = "Bayerische Motoren Werke AG (BMW; English: Bavarian Motor Works) is a " +
                                      "German automobile, motorcycle and engine manufacturing company founded in 1917."
                    });

                    //Add third customer to our collection
                    customers.Add(new Models.Customer()
                    {
                        Id = 3,
                        CompanyName = "Audi",
                        ContactNo = "983-2222-1212",
                        ContactPerson = "Karl Benz",
                        Description = " is a multinational division of the German manufacturer Daimler AG,"
                    });

                    return View(customers);
                }
            }
        }

    t. Let us now create a view for this class. But before continuing, press Ctrl + Shift + B to rebuild the solution; this makes the previously created model appear in the model class drop-down of the Add View menu. Right-click on Views > Home and select Add > View.
    u. Let us name our view ListCustomers, select Razor (CSHTML) as the view engine, put a check mark on Create a strongly-typed view, and select Customer (MyWebSite.Models) as the model class. Select List as the scaffold template and click OK.
    v. Run the MVC application by pressing F5, and on the address bar append Home/ListCustomers. We should now see a web page similar to the one below.
    x. You can edit ListCustomers.cshtml to remove and add HTML code:

        @model IEnumerable<MyWebSite.Models.Customer>

        @{
            Layout = null;
        }

        <!DOCTYPE html>
        <html>
        <head>
            <meta name="viewport" content="width=device-width" />
            <title>ListCustomers</title>
        </head>
        <body>
            <h2>List of Customers</h2>
            <table border="1">
                <tr>
                    <th>
                        @Html.DisplayNameFor(model => model.CompanyName)
                    </th>
                    <th>
                        @Html.DisplayNameFor(model => model.Description)
                    </th>
                    <th>
                        @Html.DisplayNameFor(model => model.ContactPerson)
                    </th>
                    <th>
                        @Html.DisplayNameFor(model => model.ContactNo)
                    </th>
                </tr>
                @foreach (var item in Model) {
                <tr>
                    <td>
                        @Html.DisplayFor(modelItem => item.CompanyName)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.Description)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.ContactPerson)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.ContactNo)
                    </td>
                </tr>
                }
            </table>
        </body>
        </html>

    y. Press F5 to run the MVC application.
    z. You will notice some @Html.DisplayFor calls. These are called HTML helpers; you can read more about HTML helpers on this site: http://www.w3schools.com/aspnet/mvc_htmlhelpers.asp

    That's all. You now have your first MVC4 Razor engine web application.

    Read the article

  • Windows Azure Platform eBook Update #2 – 100 pages of goodness

    - by Eric Nelson
    I previously mentioned I was working on a community authored eBook for the Windows Azure Platform. Well, today I assembled the 20 articles that made it through to the end of the review process into a single eBook – and it looks (and reads) great. Still a lot more to do (and stuff in the way of me doing it) but as a teaser, here is the (very draft) table of contents:

    Read the article

  • Attribute Key Not found Error while processing the cube in SSAS

    - by sathya
    The Attribute Key Not Found error occurs in SSAS for the following reasons:

    - Dimensions processed after measure groups
    - Null values / references in keys

    Please check for null values in interrelated dimensions. (For example: you might have a dimension table Employees, Employees relates to table SubDepartment, and SubDepartment relates to table Department. If by chance there exists a null value in the mapping between SubDepartment and Department, you will get this error.)

    Read the article

  • Gems In The Visual Studio 2010 Training Kit – Introduction to MEF: Learning Labs

    - by Jim Duffy
    No, this post doesn’t have anything to do with cooking up illegal drugs in some rundown shack outside of town. That, my friends, would be a meth lab and fortunately that is waaaaay outside my area of expertise. Now I can talk Kentucky bourbon, or as Homer Simpson would say “mmmmmmmmmmm bourbon”, with you but please refrain from asking me meth questions. :-) Anyway, what I’m talking about are the MEF (Managed Extensibility Framework) Learning Labs contained in the Visual Studio 2010 and .NET Framework 4 Training Kit. Not sure what MEF is and need an overview? Then start here or here. Ok, so you’ve read a bit about MEF or heard about MEF and you’re thinking it might be something you and your development team might want to take a hands-on look at. I have good news then because contained in the Visual Studio 2010 and .NET Framework 4 Training Kit is a series of hands-on learning labs for MEF. I’ve added working my way through them to my “things I want to take a closer look at” list. Have a day. :-|
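    If you'd like a concrete taste of MEF before working through the labs, here is a minimal sketch of my own (an illustration, not code from the training kit): a part exported via [Export] is discovered in the executing assembly and composed into a host object.

        using System;
        using System.ComponentModel.Composition;
        using System.ComponentModel.Composition.Hosting;
        using System.Reflection;

        public interface IGreeter
        {
            string Greet();
        }

        [Export(typeof(IGreeter))]
        public class HelloGreeter : IGreeter
        {
            public string Greet() { return "Hello from a MEF part!"; }
        }

        public class Host
        {
            // MEF satisfies this import during composition
            [Import]
            public IGreeter Greeter { get; set; }

            public static void Main()
            {
                // Discover exports in this assembly and wire up any imports on the host
                var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
                using (var container = new CompositionContainer(catalog))
                {
                    var host = new Host();
                    container.ComposeParts(host);
                    Console.WriteLine(host.Greeter.Greet());
                }
            }
        }

    Reference System.ComponentModel.Composition (it ships with .NET 4) and the container does the wiring at runtime; swapping in a different exported IGreeter requires no changes to Host.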

    Read the article

  • Inexpensive Business Checks

    - by Randy Walker
    One of the most annoying things when setting up a business is paying the outrageous fees for business checks. When starting out, rather than pay the $150 for a handful of computer-printable checks, I bought software that would create the checks for me. But if you didn't know, those little digits at the bottom of a check are magnetically encoded and require special ink. Fortunately, my current bank has one of the best bill pay websites, so I have used it exclusively. But since I recently had to open a new bank account, I went off in search of a cheap alternative for business checks. A bit wary of some of the printers, I opted for TechChecks and was extremely surprised a few days later when my checks arrived in perfect condition. (I recommend the diamond prismatic red-blue-green checks. Beautiful and very professional looking.) It was perfect timing as well, since I now have to reorder some checks for another account.

    Read the article

  • #altnetseattle - Kanban

    - by GeekAgilistMercenary
    The two main concepts of Kanban are to keep the queues minimal and to maintain visibility. Management/leadership needs to make sure the Kanban queue doesn't get starved. This is key and also very challenging, since the queue needs to be minimal but also can't get too small during the course of work. This is to maintain maximum velocity. The phases of the Kanban need to be kept flowing too; bottlenecks need to be removed ASAP when they are brought up. Victory Wall – I dig that idea. Somewhere to look to see the success of the team. The POs work in Rally or other tools for some client management, but that causes issues with the lack of "visibility" – a key fundamental ideal and part of Kanban. One of the big issues is fitting things into a sprint when Kanban is used with Scrum, but longer sprints are wasteful. Kanban work items are of a set size. At this point I got a bit sidetracked by the actual conversation and missed out on note taking. Overall, I would say people doing Kanban and Lean-style software development are some of the happiest coders around. The clean focus, good velocity, sizing, and other approaches that are inferred by Kanban help developers be the rock stars and succeed. This is definitely a topic I will be commenting on a lot more in the near future.

    Read the article

  • Profiling Startup Of VS2012 – Ants Profiler

    - by Alois Kraus
    I just downloaded ANTS Profiler 7.4 to check how fast it is and how deeply I can analyze the startup of Visual Studio 2012. The Pro version, which is the useful one, costs 445€, which is OK. To measure a complex system I decided to simply profile VS2012 (Update 1) on my older Intel 6600 2.4GHz with 3 GB RAM and a 32-bit Windows 7. ANTS Profiler is really easy to use, so let's try it out. ANTS Profiler wants to start the profiled application by itself, which seems to be rather common. I chose method-level timing of all managed methods. In the configuration menu I asked for all call stacks to get full details. Once this is configured, you are ready to go. After that you can select the Method Grid to view wall clock time in ms. I hate percentages (which are on by default) because I want to see where absolute time is spent, not something else. From the Method Grid I can drill down to see where time is spent, and I can look at the decompiled methods where the time is spent. This does really look nice. But did you see the size of the scroll bar in the Method Grid? Although I wanted all call stacks, I get only about 4 pages of methods to drill down. From the scroll bar count I would guess that the profiler shows me about 150 methods for the complete VS startup. This is nonsense. I will never find a bottleneck in VS when I am presented only a fraction of the methods that were actually executed. I also tried, in the configuration window, to profile the extremely trivial functions as well, but there was no noticeable difference. It seems that ANTS Profiler filters away far too many details to be useful for bigger systems. If you want to optimize a CPU-bound operation inside NUnit, then ANTS Profiler, with its line-level timings, is a very nice tool to work with. But for bigger stuff it is certainly not usable. I also do not like that I must start the profiled application from the profiler UI. This makes it hard to profile processes which are started by some other process. Next: JetBrains dotTrace

    Read the article

  • Hyper-V Manager version 6.2, an experience in virtual switch setup

    - by Kevin Shyr
    The version number of Hyper-V Manager is 6.2.9200.16384. This is what came with my Windows 8 work laptop (by enabling Windows features). The blogs I read indicated that I need an external virtual switch for my guest OSes to access the internet, and an internal one for them to share folders with my host OS. I proceeded to create an external virtual switch, and here is the screenshot. After setting up the network adapters on the guest OS, I peeked into host OS networking and saw that a network bridge had already been created. GREAT! So I fired up my guest OS and, darn, no internet. Then I noticed that my host internet was gone, too. I looked further and found that even though I had a network bridge, no connection had the status "Bridged". Once I removed the bridge (by removing the individual connections from it – I know, weird, since none of them said "Bridged" in status), I re-selected the connections that I wanted and added them to a new network bridge. Once my wireless connection status showed "Bridged", I was able to get to the internet from my guest OS. Two things I noticed after I got internet for everyone (my host and guest OS):

    - My network adapters in the host OS no longer show "Bridged", but everyone can still get to the internet.
    - The virtual switch that I set up as "External" now shows as "Internal", yet I was able to create a shared folder between the host and guest OS. This means I didn't have to create a separate "Internal" virtual switch.

    Read the article

  • TransformXml Task locks config file identified in Source attribute

    - by alexhildyard
    As background: the TransformXml MSBuild task is typically invoked in a custom build step to mark up a web.config file with per-environment configuration; its flexible directives offer highly granular control over the insertion, removal, substitution and transformation of existing configuration hierarchies. For those using the TransformXml task (typically in a Web Deployment Project), I raised an issue against Visual Studio 2010 in which the file handle on the input file was not released, meaning that following transformation the source file remained locked. As a result, the best way to transform a file was first to rename it, transform it, and then copy it back, leaving the "locked" file to be freed up later. I just heard today that this has now been resolved in Visual Studio 2012 RTM. That's good news, because web.config transformations offer a lot. An intelligent, automated build process will swap in the relevant transform(s), making it much easier to keep the developer and build-server builds in step. This makes for a simpler and more exemplary build process, and with the tighter coupling comes a correspondingly quicker response to developmental change. Oh, and don't forget - it isn't just web.configs you can transform. You can transform app.configs, or indeed any XML file that honours the task's schema and hierarchical rules.

    Read the article

  • Windows Metro Requests

    - by Scott Dorman
    Windows 8 and Windows Metro style apps have a lot of potential, but only if application vendors realize there is a demand to see their apps as Metro style apps and not just as desktop apps (or worse, only as Android or iOS apps). As consumers, the only thing we can do is be vocal about our desire to see these apps on Windows 8 as Metro style apps. In an effort to raise awareness, I just launched WinMetro Requests. This is our opportunity to request Windows Metro style apps and show those companies just how much interest there is in seeing their apps as Metro style apps. The site is running on UserVoice, so it allows you to easily submit application requests, add comments, and, more importantly, vote for your favorite applications to come to Windows as Metro style apps! As I find out the status of requested applications, I will update the status of each request. If you know of and have official communication from one of the companies indicating they will be or are working on a Windows Metro style app, please let me know and I'll update the status of the request after verifying (or at least trying to verify) the information.

    Read the article

  • Time Capsule With Raspberry Pi

    - by Richard Jones
    So I have a Raspberry Pi with a 1TB external USB HD plugged into it. I have Debian Wheezy installed, to which I added the Netatalk package. I followed this guide, which is written for Ubuntu but was easy enough to adapt: http://kremalicious.com/ubuntu-as-mac-file-server-and-time-machine-volume/ I was even able to change the icon to look like an Xserve :-) Next step is to add a second HD that will back up my Windows 8 laptop as well.

    Read the article

  • New Beta of GhostDoc v4

    - by TATWORTH
    A new beta of GhostDoc v4 is available at http://submain.com/download/ghostdoc/beta/
    The updated license key is at http://submain.com/blog/GhostDocV4Beta2IsAvailable.aspx
    Here are some of the excellent features of GhostDoc v4:

    "Version 4 is a major milestone for us with great new features and rewrites that we have done over the last year. Here are the most significant additions to the GhostDoc feature set:

    - Visual Studio 2012 support (Pro)
    - Source code Spell Checker
    - C/C++ language support
    - XML Comment Preview
    - StyleCop Compliance – comments generated by GhostDoc now pass StyleCop validation
    - Exception Documentation – exceptions raised within a method are documented in the XML comment (Pro)
    - File Header menu and template (Pro)
    - Visual Studio toolbar with commands for documenting, comment preview and spell-checking (Pro)
    - Options -> Global Properties – allows referencing custom configured user properties within T4 templates (CodeIt.Right users will find this very familiar) (Pro)
    - IntelliSense in the T4 template editor
    - Version update notification – you won't miss a new version release ever again!"

    Read the article

  • Using HTML5 Today part 4 – What happened to XHTML?

    - by Steve Albers
    This is the fourth entry in a series of descriptions & demos from the "Using HTML5 Today" user group presentation. For practical purposes, the original XHTML standard is a historical footnote, although XHTML Transitional will probably live on forever in the default web page templates of old web page editors. The original XHTML spec was released in 2000, on the heels of the HTML 4.01 spec. The plan was to move web development away from HTML to the more formal, rigorous approach that XHTML offered, but it was built on a principle that conflicts with the history and culture of the Internet: XHTML introduced the idea of Draconian Error Handling, which essentially means that invalid XML markup on a page will cause the page to stop rendering. There is a transitional mode offered in the original XHTML spec, but the goal was to move to D.E.H. You can see the result by serving a document with the MIME type "application/xhtml+xml" – for my class example we change this setting in the web.config file:

        <staticContent>
            <remove fileExtension=".html" />
            <mimeMap fileExtension=".html" mimeType="application/xhtml+xml" />
        </staticContent>

    With the new strict syntax a simple error, in this case a duplicate </td> tag, can cause a critical page error. While XHTML became very popular in the ensuing decade, the Strict form of XHTML never achieved widespread use. Draconian Error Handling was one of the factors that led in time to the creation of the WHATWG, or Web Hypertext Application Technology Working Group. The WHATWG contributed to the eventual disbanding of the XHTML 2.0 working group and the W3C's move to embrace the HTML5 standard. For developers who long for XML markup, the W3C HTML5 standard includes an XHTML5 syntax. For a longer, more definitive look at what happened to XHTML and how HTML5 came to be, check out the Dive Into HTML5 mirror site or Bruce Lawson's "HTML5: Who, What, When, Why" talk.

    Read the article

  • How to Extract the Images from a Doc File without Having Microsoft Office?

    - by Anirudha
    Originally posted on: http://geekswithblogs.net/anirugu/archive/2013/06/29/how-to-extract-the-images-from-the-doc-file-without.aspx

    Many times we get a doc file that contains some images, and we want to extract them without having Microsoft Office installed. The article at http://www.techrepublic.com/blog/itdojo/save-images-in-microsoft-word-documents-as-separate-files/135 is only useful when you have Microsoft Word installed. So what if you don't have Microsoft Office? No problem, here is a trick: open the doc file in WordPad, the word processor that comes with Windows 7 (not Microsoft Office Word). Select the image, right-click on it and choose Cut. Open Microsoft Paint and paste it there. Without clicking anywhere else, click the Crop icon on the toolbar. You now have your image at the same size it was in the doc file. Don't worry about the image format; Microsoft Paint can save it in PNG format.

    Read the article

  • PEX - are you licenced for it?

    - by TATWORTH
    There is an interesting article about PEX at http://blogs.msdn.com/b/ukmsdn/archive/2011/02/22/featured-article-pex-and-visual-studio.aspx and PEX can be downloaded from http://research.microsoft.com/en-us/projects/pex/downloads.aspx. The licence conditions are: "Pex and Moles are Power Tools available for commercial use for MSDN subscribers. Moles is also available separately for commercial use without requiring an MSDN subscription. Pex and Moles are also available for academic and non-commercial use." I note with interest that it is now available to MSDN subscribers. If I recall correctly, it used to be only available to VS Team versions.

    Read the article
