Search Results

Search found 684 results on 28 pages for 'pipeline'.

Page 19/28

  • UI controls layer on top of the operating system

    - by Mason Blier
    I'm curious about where a UI platform at the level of Win32 or the X Window System falls in the grand scheme of an operating system. What layers below do they primarily make use of? Are they based heavily on direct communication with the graphics card driver (I can't imagine going through a rendering pipeline like OpenGL for this), or is there a graphical platform within the operating system that abstracts this away a little more? I'm also interested in the creation of shells and the like, and I'm particularly curious how people go about creating alternative shells for Windows: what do people look for when figuring out what methods to call or what to hook into? I'm fairly lost on these concepts and finding it difficult to find documentation on them. I was initially excited to have taken Operating Systems in college, but it was all low-level resource management. Thanks all, Mason

  • How to define the order with ImportMany attribute?

    - by JD
    Hi all, I am just getting into MEF and was wondering how you can define the order of a collection exported with [ImportMany]. Suppose I have two classes (Class1, Class2) that implement the interface IService, each in a different library (although they could be in the same one). I want the Class2 instance to be created before the Class1 instance in the IEnumerable collection populated by the ImportMany attribute, so that it acts like a pipeline of functionality where Class2 calls are made before Class1 calls. Also, I have another class (Class3, which also implements IService) in another library that I want introduced later on (e.g. some logging utility); how do I make this the third instance in the ImportMany collection? JD
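
    MEF itself makes no promise about [ImportMany] ordering, so one common workaround (my suggestion, not something from the question) is to attach export metadata and sort on it. A minimal sketch, where the "Order" key and the IOrderMetadata view are illustrative names rather than anything MEF ships:

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;
    using System.Linq;

    public interface IService { void Execute(); }

    // Metadata view: property names must match the ExportMetadata keys below.
    public interface IOrderMetadata { int Order { get; } }

    [Export(typeof(IService))]
    [ExportMetadata("Order", 2)]
    public class Class1 : IService { public void Execute() { } }

    [Export(typeof(IService))]
    [ExportMetadata("Order", 1)]
    public class Class2 : IService { public void Execute() { } }

    public class Pipeline
    {
        // Lazy imports allow sorting on metadata before any instance is created.
        [ImportMany]
        public IEnumerable<Lazy<IService, IOrderMetadata>> Services { get; set; }

        public void Run()
        {
            foreach (var s in Services.OrderBy(x => x.Metadata.Order))
                s.Value.Execute();   // Class2 (Order 1) runs before Class1 (Order 2)
        }
    }

    class Program
    {
        static void Main()
        {
            var container = new CompositionContainer(
                new AssemblyCatalog(typeof(Program).Assembly));
            var pipeline = new Pipeline();
            container.ComposeParts(pipeline);   // fills Services from the catalog
            pipeline.Run();
        }
    }

    A Class3 exported from another library with ExportMetadata("Order", 3) would then slot in third without any central registration.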

  • ASP.NET MVC Session usage

    - by Ben
    Currently I am using ViewData or TempData for object persistence in my ASP.NET MVC application. However, in a few cases where I am storing objects into ViewData through my base controller class, I am hitting the database on every request (when ViewData["whatever"] == null). It would be good to persist these in something with a longer lifespan, namely session. Similarly, in an order processing pipeline, I don't want things like Order to be saved to the database on creation. I would rather populate the object in memory and then, when the order reaches a certain state, save it. So it would seem that session is the best place for this? Or would you recommend, in the case of the order, retrieving it from the database on each request rather than using session? Thoughts and suggestions appreciated. Thanks, Ben
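
    For what it's worth, a minimal sketch of the session-backed variant being weighed above; the Order type, session key and completion flag are all hypothetical:

    using System.Web.Mvc;

    // Hypothetical order type, built up in memory across requests.
    public class Order
    {
        public bool IsComplete { get; set; }
        // items, totals, etc.
    }

    public class OrdersController : Controller
    {
        private const string OrderKey = "CurrentOrder";   // illustrative session key

        // Lazily create the in-memory order and keep it in session until it
        // reaches the state at which it should be persisted.
        private Order CurrentOrder
        {
            get
            {
                var order = Session[OrderKey] as Order;
                if (order == null)
                {
                    order = new Order();
                    Session[OrderKey] = order;
                }
                return order;
            }
        }

        public ActionResult Complete()
        {
            CurrentOrder.IsComplete = true;
            // ... persist the order here, then drop it from session ...
            Session.Remove(OrderKey);
            return RedirectToAction("Index");
        }
    }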

  • Will there be IQueryable-like additions to IObservable? (.NET Rx)

    - by Jason
    The new IObservable/IObserver frameworks in the System.Reactive library coming in .NET 4.0 are very exciting (see this and this link). It may be too early to speculate, but will there also be a (for lack of a better term) IQueryable-like framework built for these new interfaces as well? One particular use case would be to assist in pre-processing events at the source, rather than in the chain of receiving calls. For example, if you have a very 'chatty' event interface, using Subscribe() with Where(...) still receives all events through the pipeline, and the client does the filtering. What I am wondering is whether there will be something akin to an IQueryableObservable, whereby these LINQ methods would be 'compiled' into some 'smart' Subscribe implementation in a source. I can imagine certain network server architectures that could use such a framework. Or how about an add-on to SQL Server (or any RDBMS for that matter) that would allow .NET code to receive new data notifications (triggers in code) and would need those notifications filtered server-side.
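
    To make the "client does the filtering" point concrete, here is a small sketch; the chatty source is simulated with Observable.Interval, and note that operator namespaces moved around between early Rx drops:

    using System;
    using System.Reactive.Linq;   // operator namespace in later Rx releases

    class Demo
    {
        static void Main()
        {
            // A chatty source: one event every 10 ms.
            IObservable<long> ticks = Observable.Interval(TimeSpan.FromMilliseconds(10));

            // Where() runs in the receiving process: every event still travels
            // through the pipeline before being discarded here, which is exactly
            // the work an IQueryable-style "smart Subscribe" could push to the source.
            using (ticks.Where(t => t % 100 == 0)
                        .Subscribe(t => Console.WriteLine("every ~1s: {0}", t)))
            {
                Console.ReadLine();   // keep the subscription alive
            }
        }
    }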

  • Accessing HttpRequest from Global.asax via a page

    - by Polymorphix
    I'm trying to get a property (ImpersonatePersonId) from a page in Global.asax, but I get an HttpException saying 'Request is not available in this context'. I've been searching for documentation on where in the pipeline the request is accessible, but as far as I can see, all Microsoft produces are one-liners like "PostRequestHandlerExecute: Occurs when the ASP.NET event handler finishes execution," which really doesn't give me much. I've tried placing the call in both PreRequestHandlerExecute and PostRequestHandlerExecute, but with the same result. I wonder if anyone with experience in this would be so kind as to tell me where the Request object is available. My code from Global.asax is below; what I want to do is rewrite some HTML based on some request parameters.

    ICanImpersonate page = HttpContext.Current.Handler as ICanImpersonate;
    ImpersonatedUser impersonatePerson = page != null ? page.ImpersonatePersonId : null;
    Response.Filter = new TagRewriter(Response, new TagProcessor(Context, impersonatePerson).Process);
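
    For what it's worth, a hedged sketch of one way to sidestep the static Request/Response properties: the HttpApplication passed as sender carries a context that is populated by the time PostRequestHandlerExecute fires. ICanImpersonate, ImpersonatedUser, TagRewriter and TagProcessor are the question's own types:

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_PostRequestHandlerExecute(object sender, EventArgs e)
        {
            var app = (HttpApplication)sender;
            HttpContext context = app.Context;   // request, response and handler are live here

            ICanImpersonate page = context.Handler as ICanImpersonate;
            ImpersonatedUser impersonatePerson =
                page != null ? page.ImpersonatePersonId : null;

            context.Response.Filter = new TagRewriter(
                context.Response,
                new TagProcessor(context, impersonatePerson).Process);
        }
    }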

  • Profile: Object reference not set to an instance of an object

    - by sallalman83
    Hi, I just launched my web site. I used the ASP.NET routing technology on it, and it works fine on my localhost, but after I moved the project to the hosting server (GoDaddy.com) it only works when there is no virtual sub-directory, like havebreak.com. When you click a link such as havebreak.com/Registration/, or any other link that contains a virtual sub-directory, it gives an 'Object reference not set to an instance of an object' error on the Profile object:

    if (!Profile.IsAnonymous)
    Line 18:     mlvRegistratioin.ActiveViewIndex = 1;

    I checked my IIS settings and found it is using the integrated pipeline, as recommended (at least to my knowledge). I checked the httpModules and httpHandlers tags, both under system.webServer (since my hosting plan uses IIS 7) and under the normal tags, and everything is fine. I then used URL rewriting instead of URL routing, and the same problem persisted. I also noticed that the Session object does not work in the virtual sub-directories either. By the way, ASP.NET routing works fine on my site; it is just the Profile and Session objects that are not working. Any help will be appreciated.

  • Fastest way to compress a database or .bak file and transfer it

    - by Nai
    As per the question title, I wonder if there are special programmes or commands that make zipping up a .bak file and transferring it super quick. I read about xp_cmdshell here, but I'm not sure about the speed. My .bak file is about 12 GB at the moment. Related to this is the possibility of using Red Gate's SQL Data Compare to transfer just the differential data across the network pipeline, but I have never used SQL Data Compare before and I'm not sure how it goes about doing INSERTs on tables with primary keys and such. Also, I'm not sure about the speed. Does anyone have any experience with this programme or similar programmes? Cheers!

  • System.Web.HttpException in ASP.NET MVC 2 on images and JavaScript files

    - by Rippo
    Hi, I am getting the following error reported by ELMAH on my ASP.NET MVC 2 site for JavaScript files, images, etc.:

    System.Web.HttpException: The remote host closed the connection

    I have done some research, and it appears that the user/bot is clicking a link on the site before the page has fully loaded. Now, this error never occurs on a controller action, but always on a file that is on disk, e.g.:

    /Content/CmsImages/logo.png
    /Content/CmsImages/MemberImages/Photo-001605.jpg
    /Content/jquery.tickertype.js

    So this means that all static files are being routed through the MVC pipeline. What options do I have?
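
    One option, sketched under the assumption that the default MVC 2 route registration is in use: tell routing to leave on-disk content alone so IIS serves it directly. Whether this also silences the ELMAH entries depends on why those requests reach managed code in the first place:

    using System.Web.Mvc;
    using System.Web.Routing;

    // In Global.asax.cs:
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Keep requests for files under /Content out of routing entirely.
        routes.IgnoreRoute("Content/{*pathInfo}");

        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }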

  • sbt: "test" works, "test:run" does not

    - by Martin
    I am trying to establish a build pipeline on Jenkins for a Play (2.0.2) project. As I want to build the sources only once and use the classes for downstream builds, I have created a "compile" job that runs "sbt test:compile". That works so far. The next job should then just run the already compiled tests. If I use "sbt test" it works as expected, but it compiles the sources again. If I try to run "sbt test:run" instead, it says:

    [info] Loading project definition from ~/myproject/project
    [info] Set current project to myproject (in build file: ~/myproject/)
    java.lang.RuntimeException: No main class detected.
    at scala.sys.package$.error(package.scala:27)
    [error] {file:~/myproject/test:run: No main class detected.

    The same happens locally: I can run "sbt test" but not "sbt test:run", with the same error. Can someone point me in the right direction?

  • Sitecore - version created in another language when renaming

    - by misteraidan
    So I've got this Sitecore content item, right, and it's got one version in one language, "en-AU". I have 3 potential languages in the system: "en", "en-AU" and "en-NZ". I rename the item, and Sitecore creates a new version in "en". I delete the "en" version and rename again; same result, a new version is created. And again... and again... see where I'm going with this? And again! Why would it do that? I thought it was a problem with my pipeline processor, but I turned it off and it still happens! Any ideas would be welcome.

  • How to build a data flow?

    - by salvationishere
    I am running Visual Studio 2008 and the SSIS tutorial described at http://msdn.microsoft.com/en-us/library/ms167106.aspx. I finished all of the tasks but am getting the following errors:

    Error 1: Validation error. Extract Sample Currency Data: Extract Sample Currency Data: input column "CurrencyAlternateKey" (123) has lineage ID 55 that was not previously used in the Data Flow task. Lesson 1.dtsx 0 0
    Error 2: Validation error. Extract Sample Currency Data SSIS.Pipeline: input column "CurrencyAlternateKey" (123) has lineage ID 55 that was not previously used in the Data Flow task. Lesson 1.dtsx 0 0

    Can you tell me what I need to do to make this build without errors?

  • How to implement a syndication receiver? (multi-client / single server)

    - by LeonixSolutions
    I have to come up with a system architecture. A few hundred remote devices will be communicating over the internet with a central server, which will receive data and store it in a database. I could write my own TCP/IP-based protocol, use SOAP, use AJAX, use RSS, or something else entirely. This is currently seen as one-way (telemetry, as opposed to SCADA). Would it make a difference if we made it bi-directional? There are no plans to do so, but Murphy's law makes me wary of a uni-directional solution (on the data plane; I imagine that the control plane is bi-directional in all solutions?). I hope that this is not too subjective. I would like a solution which is quick and easy to implement and for others to support, and where the general "communications pipeline" from remote devices to the database server can be re-used as the core of future projects. I have a strong background in telecoms protocols, in C/C++ and PHP.

  • How to configure IIS to serve my 404 response with my custom content?

    - by Marek
    This question is related to this one, hopefully better phrased. I would like to serve a custom 404 page from ASP.NET MVC. I have the route handler and all the infrastructure set up to ensure that nonexistent routes are handled by a single action:

    public ActionResult Handle404()
    {
        Response.StatusCode = 404;
        return View("NotFound");
    }

    The problem: IIS serves back its own content (some predefined message) when I set Response.StatusCode to 404 before returning the content. On the VS development web server this works as intended: the status code of the HTTP response is 404 while my content (the NotFound view) is served. I believe that when the IIS processing pipeline sees that the application returns 404, it simply replaces the whole response with its own. What setting in IIS affects this behavior? I do not have access to the IIS installation, so I cannot investigate this myself; however, I can ask the hosting provider to tweak the configuration for me if I know exactly what needs to be changed.
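
    A hedged sketch of the application-side knob: HttpResponse.TrySkipIisCustomErrors asks IIS 7 not to substitute its own error body, while the server-side equivalent the host could change instead is the httpErrors section's existingResponse behavior:

    public ActionResult Handle404()
    {
        Response.StatusCode = 404;
        // Ask IIS to keep this response body rather than its own 404 page.
        Response.TrySkipIisCustomErrors = true;
        return View("NotFound");
    }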

  • Design issue with ATG CommercePipelineManager

    - by user1339772
    The definition of the runProcess() method in PipelineManager is:

    public PipelineResult runProcess(String pChainId, Object pParam) throws RunProcessException

    This gives me the impression that ANY object can be passed as the second parameter. However, the ATG OOTB PipelineManager component refers to the CommercePipelineManager class, which overrides the runProcess() method, downcasts pParam to a Map, and adds siteId to it. Basically, this forces client code to send only a Map; thus, anyone who needs to create a new pipeline chain has to use a Map as the data structure for passing data along. Of course, one can always get around this by creating a new PipelineManager component, but I was just wondering about the thinking behind explicitly using a Map in CommercePipelineManager.

  • Capturing WPF Vector Information BEFORE it Renders to Screen

    - by user273722
    I'm trying to "capture" or record the vector display information of a WPF (maybe Silverlight) application and play it back. However, instead of capturing bitmaps of what is rendered, I would like to capture the vector information BEFORE it gets rendered, so that I can play it back at different resolutions without loss of quality. Ideally, I'd like to do this without having to add assemblies to my app (but I'm willing to do so if necessary). I've looked into the WPF rendering pipeline and cannot find an appropriate starting point (or, stated differently, I couldn't figure one out). Maybe the VisualTreeHelper class?

  • What is the role of asserts in C++ programs that have unit tests?

    - by lhumongous
    Greetings, I've been adding unit tests to some legacy C++ code, and I've run into many scenarios where an assert inside a function gets tripped during a unit test run. A common idiom I've run across is functions that take pointer arguments and immediately assert if the argument is NULL. I could easily get around this by disabling asserts when I'm unit testing. But I'm starting to wonder if unit tests are supposed to alleviate the need for runtime asserts. Is this a correct assessment? Are unit tests supposed to replace runtime asserts by catching errors sooner in the pipeline (i.e. the error is caught in a failing test instead of when the program is running)? On the other hand, I don't like adding soft failures to code (e.g. if (param == NULL) return false;). A runtime assert at least makes it easier to debug a problem in case a unit test missed a bug. Thanks!

  • Retrieve web user's Identity outside of request scope

    - by Kendrick
    I have an ASP.NET app that logs audit reports using nHibernate's IPreUpdateListener. In order to set the current user in the listener events, I was using System.Security.Principal.WindowsIdentity.GetCurrent(). This works fine when debugging on my machine, but when I move it to the staging server, I get the ASP.NET process credentials, not the requesting user. In the ASP.NET page I can use Request.LogonUserIdentity (which works fine since I'm using integrated authentication), but how do I reference this user directly without having to pass it to my event? I don't want to pass this info through the pipeline because it really doesn't belong in the intermediate events/calls.
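
    A hedged sketch of the ambient-context approach: instead of WindowsIdentity.GetCurrent() (which returns the worker process identity on the server), the request context can be read from anywhere on the request thread. Only the HttpContext and LogonUserIdentity calls are standard; the helper shell is illustrative:

    using System.Web;

    public class AuditHelper
    {
        // Callable from an nHibernate listener without passing anything through.
        public static string CurrentUserName()
        {
            var context = HttpContext.Current;   // null off the request thread
            if (context != null && context.Request.LogonUserIdentity != null)
                return context.Request.LogonUserIdentity.Name;
            return "SYSTEM";                     // fallback for background work
        }
    }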

  • OpenGL Mapping Textures to a Grid Stored Inside Vertex Array

    - by Matthew Hoggan
    I am writing a test to verify something; this is not production code, just verification code, so I would appreciate it if the specific question were answered. I have code that uses indices and vertices to draw a set of triangles in the shape of a grid. All the vertices are drawn using glDrawElements(). Now, for each vertex I will set its corresponding texture coordinates to 0 or 1 for each set of triangles that forms a square in the grid. Basically, I want to draw a collage of random textures in each one of the "squares" (each consisting of two triangles). I can do this with glBegin()/glEnd() calls inside a for loop using the fixed-function pipeline, but I would like to know how to do it using vertex arrays.

  • AppDomain assemblies not being loaded correctly.

    - by SharePoint Newbie
    Hi, we are doing the following in Application_Start (Global.asax.cs) for a WCF service hosted by IIS 7.0 (integrated pipeline):

    var mapperConfigurations = AppDomain.CurrentDomain.GetAssemblies()
        .SelectMany(a => a.GetExportedTypes()
            .Where(t => typeof(IMapperConfiguration).IsAssignableFrom(t) && t.IsClass))
        .ToList();

    The web service has 8-10 assemblies in its bin folder, and each of them has multiple implementations of IMapperConfiguration. After an IIS reset, no mapper configurations are found (I found this using Debug.Write). However, this behaviour is inconsistent: at other times all implementations of IMapperConfiguration are found. When exactly does IIS load assemblies, and what is wrong with this code? Thanks
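
    A plausible explanation, offered as an assumption rather than something from the question: the CLR loads bin assemblies on first use, so at Application_Start AppDomain.CurrentDomain.GetAssemblies() only sees whatever happens to be loaded already, which varies run to run. BuildManager can force the web application's referenced assemblies in first; a minimal sketch (IMapperConfiguration is the question's own type):

    using System;
    using System.Linq;
    using System.Reflection;
    using System.Web.Compilation;

    // In Global.asax.cs:
    protected void Application_Start(object sender, EventArgs e)
    {
        var mapperConfigurations = BuildManager.GetReferencedAssemblies()
            .Cast<Assembly>()   // forces the referenced assemblies to load
            .SelectMany(a => a.GetExportedTypes()
                .Where(t => typeof(IMapperConfiguration).IsAssignableFrom(t)
                            && t.IsClass))
            .ToList();
    }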

  • Problem with httpContext.RewritePath on IIS 7

    - by PNR
    I am using HttpContext.RewritePath in Global.asax for some URL rewriting, and it works very well in my development environment on the Cassini server. But when I copy it to the production server, which is IIS 7, it doesn't work. I have also tried using Context.Server.TransferRequest, but then I get the error "This operation requires IIS integrated pipeline mode" on both Cassini and IIS 7, even though on IIS 7 the website is running in "Integrated" mode in the AppPool. I rewrite all URLs on the website to the form /[main menu name]/[pagename].aspx, e.g. from /web/thesite.aspx?mainmenu=manager to /manager/thesite.aspx, or /web/theOtherSite.aspx?mainmenu=about to /about/theOtherSite.aspx, and so on. Thanks very much in advance!
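
    For reference, a hedged sketch of the usual Global.asax rewrite hook on IIS 7 integrated mode, using the URL scheme from the question; the single mapping shown is illustrative only, and real code would also skip static files:

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Url.AbsolutePath;   // e.g. /manager/thesite.aspx

            // Illustrative mapping: /manager/thesite.aspx -> /web/thesite.aspx?mainmenu=manager
            if (path.Equals("/manager/thesite.aspx", StringComparison.OrdinalIgnoreCase))
                Context.RewritePath("/web/thesite.aspx", null, "mainmenu=manager");
        }
    }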

  • When is a Transient-scope object Deactivated in Ninject?

    - by nwahmaet
    When an object in Ninject is bound with InTransientScope(), the object isn't placed into the cache, since it's, er, transient and not scoped to anything. When I'm done with an object, I can call kernel.Release(obj); this passes through to the cache, where it retrieves the cached entry and calls Pipeline.Deactivate on it. But since transient objects aren't cached, this doesn't happen. I haven't been able to figure out where (or by whom) the deactivation for transient objects is performed. Or is the assumption that transient objects are only ever activated, and that if I want a deactivatable object, I need to use some other scope?

  • SQL Server Editions and Integration Services

    The SQL Server 2005 and SQL Server 2008 product family has quite a few editions now, so what does this mean for SQL Server Integration Services? Starting from the bottom, we have the free edition known as Express, the entry-level Workgroup edition, and the new Web edition. None of these three include the full SSIS product, but they do all include the SQL Server Import and Export Wizard, with access to basic data sources but nothing more, so for simple loading and extraction of data this should suffice. You will not be able to build packages, though; this is just a one-shot deal aimed at using the wizard on an ad-hoc basis.

    To get the full power of Integration Services you need to start with Standard edition. This includes the BI Development Studio for building your own packages, a fully functional IDE integrated into Visual Studio. (You get the full VS 2005/2008 IDE with the product.) All core functions will be available, but with a restricted set of transformations and tasks. The SQL Server 2005 Features Comparison and Features Supported by the Editions of SQL Server 2008 describe Standard edition as having basic transforms, compared to Enterprise, which includes the advanced transforms. I think "basic" is a little harsh considering the power you get with Standard, but "advanced" covers the truly ground-breaking capabilities of data mining, text mining and cleansing or fuzzy transforms. The power of performing these operations within your ETL pipeline should not be underestimated, but not all processes will require these capabilities, so it seems like a reasonable delineation. Thankfully there are no feature limitations or artificial governors within Standard compared to Enterprise. The same control flow and data flow engines underpin both editions, with the same configuration and deployment options allowing you to work seamlessly between environments and editions if using the common components. In fact there are no governors at all in SSIS, so whilst the SQL database engine is limited to 4 CPUs in Standard edition, SSIS is limited only by the base operating system.

    The advanced transforms only available with Enterprise edition:
    - Data Mining Training Destination
    - Data Mining Query Component
    - Fuzzy Grouping
    - Fuzzy Lookup
    - Term Extraction
    - Term Lookup
    - Dimension Processing Destination
    - Partition Processing Destination

    The advanced tasks only available with Enterprise edition:
    - Data Mining Query Task

    So in summary, if you want SQL Server Integration Services you need SQL Server Standard edition, and for the more advanced tasks and transforms you need SQL Server Enterprise edition. To recap, the answer to the often-asked question is no, SQL Server Integration Services is not available in the SQL Server Express or Workgroup editions.

  • Top things web developers should know about the Visual Studio 2013 release

    - by Jon Galloway
    ASP.NET and Web Tools for Visual Studio 2013 Release Notes

    Summary for lazy readers:

    - Visual Studio 2013 is now available for download (on the Visual Studio site and on MSDN subscriber downloads)
    - Visual Studio 2013 installs side by side with Visual Studio 2012 and supports round-tripping between Visual Studio versions, so you can try it out without committing to a switch
    - Visual Studio 2013 ships with the new version of ASP.NET, which includes ASP.NET MVC 5, ASP.NET Web API 2, Razor 3, Entity Framework 6 and SignalR 2.0
    - The new ASP.NET release focuses on One ASP.NET, so core features and web tools work the same across the platform (e.g. adding ASP.NET MVC controllers to a Web Forms application)
    - New core features include new templates based on Bootstrap, a new scaffolding system, and a new identity system
    - Visual Studio 2013 is an incredible editor for web files, including HTML, CSS, JavaScript, Markdown, LESS, CoffeeScript, Handlebars, Angular, Ember, Knockout, etc.

    Top links:

    - Visual Studio 2013 content on the ASP.NET site is in the standard new releases area: http://www.asp.net/vnext
    - ASP.NET and Web Tools for Visual Studio 2013 Release Notes
    - Short intro videos on the new Visual Studio web editor features from Scott Hanselman and Mads Kristensen
    - Announcing release of ASP.NET and Web Tools for Visual Studio 2013 post on the official .NET Web Development and Tools Blog
    - Scott Guthrie's post: Announcing the Release of Visual Studio 2013 and Great Improvements to ASP.NET and Entity Framework

    Okay, for those of you who are still with me, let's dig in a bit.

    Quick web dev notes on downloading and installing Visual Studio 2013

    I found Visual Studio 2013 to be a pretty fast install. According to Brian Harry's release post, installing over pre-release versions of Visual Studio is supported. I've installed the release version over pre-release versions, and it worked fine. If you're only going to be doing web development, you can speed up the install by selecting just the Web Developer Tools. Of course, as a good Microsoft employee, I'll mention that you might also want to install some of the other features, like the Store apps for Windows 8 and the Windows Phone 8.0 SDK, but they do download and install a lot of other stuff (e.g. the Windows Phone SDK sets up Hyper-V and downloads several GBs of VMs). So if you're planning just to do web development for now, you can pick just the Web Developer Tools and install the other stuff later. If you've got a fast internet connection, I recommend using the web installer instead of downloading the ISO. The ISO includes all the features, whereas the web installer just downloads what you're installing.

    Visual Studio 2013 development settings and color theme

    When you start up Visual Studio, it'll prompt you to pick some defaults. These are totally up to you - whatever suits your development style - and you can change them later. I recommend either the Web Development or the Web Development (Code Only) settings. The only real difference is that Code Only hides the toolbars, and you can switch between them using Tools / Import and Export Settings / Reset. Usually I've gone with Web Development (Code Only) in the past because I just want to focus on the code, although the Standard toolbar does make it easier to switch default web browsers. More on that later.

    Color theme

    Sigh.
    Okay, everyone's got their favorite colors. I alternate between Light and Dark depending on my mood, and I personally like how the low contrast on the window chrome in those themes puts the emphasis on my code rather than on the tabs and toolbars. I know some people got pretty worked up over that, though, and wanted the Blue theme back. I personally don't like it - it reminds me of ancient versions of Visual Studio that I don't want to think about anymore. So here's the thing: if you install Visual Studio Ultimate, it defaults to Blue. The other versions default to Light. If you use Blue, I won't criticize you - out loud, that is. You can change themes really easily - either via Tools / Options / Environment / General, or the smart way: Ctrl+Q for Quick Launch, then type Theme and hit Enter.

    Signing in

    During the first run, you'll be prompted to sign in. You don't have to - you can click the "Not now, maybe later" link at the bottom of that dialog. I recommend signing in, though. It's not hooked in with licensing or tracking the kind of code you write to sell you components. It does good things, like syncing your Visual Studio settings between computers. More about that here. So, you don't have to, but I sure do.

    Overview of shiny new things in ASP.NET land

    There are a lot of good new things in ASP.NET. I'll list some of my favorites here, but you can read more on the ASP.NET site.

    One ASP.NET

    You've heard us talk about this for a while. The idea is that options are good, but choice can be a burden. When you start a new ASP.NET project, why should you have to make a tough decision - with long-term consequences - about how your application will work? If you want to use ASP.NET Web Forms, but keep the option of adding ASP.NET MVC later, why should that be hard? It's all ASP.NET, right? Ideally, you'd just decide that you want to use ASP.NET to build sites and services, and you could use the appropriate tools as you needed them. So, here it is. When you create a new ASP.NET application, you just create an ASP.NET application. Next, you can pick from some templates to get you started... but these are different. They're not "painful decision" templates, they're just some starting pieces. And, most importantly, you can mix and match. I can pick a "mostly" Web Forms template, but include MVC and Web API folders and core references. If you've tried to mix and match in the past, you're probably aware that it was possible, but not pleasant. ASP.NET MVC project files contained special project type GUIDs, so you'd only get controller scaffolding support in a Web Forms project if you manually edited the csproj file. Features in one stack didn't work in others. Project templates were painful choices. That's no longer the case. Hooray! I just did a demo in a presentation last week where I created a new Web Forms + MVC + Web API site, built a model, scaffolded MVC and Web API controllers with EF Code First, added data in the MVC view, viewed it in Web API, then added a GridView to the Web Forms Default.aspx page and bound it to the model. In about 5 minutes. Sure, it's a simple example, but it's great to be able to share code and features across the whole ASP.NET family.

    Authentication

    In the past, authentication was built into the templates. So, for instance, there was an ASP.NET MVC 4 Intranet Project template which created a new ASP.NET MVC 4 application preconfigured for Windows Authentication.
    All of that authentication stuff was built into each template, so it varied between the stacks and you couldn't reuse it, and you didn't see a lot of changes to the authentication options, since they required big changes to a bunch of project templates. Now the new project dialog includes a common authentication experience. When you hit the Change Authentication button, you get some common options that work the same way regardless of the template or reference settings you've made. These options work on all ASP.NET frameworks and in all hosting environments (IIS, IIS Express, or OWIN for self-host). The default is Individual User Accounts: the standard "create a local account, using username / password or OAuth" thing; however, it's all built on the new Identity system. More on that in a second. The one setting that has some configuration to it is Organizational Accounts, which lets you configure authentication using Active Directory, Windows Azure Active Directory, or Office 365.

    Identity

    There's a new identity system. We've taken the best parts of the previous ASP.NET Membership and Simple Identity systems, rolled in a lot of feedback and made big enhancements to support important developer concerns like unit testing and extensibility. I've written long posts about ASP.NET Identity, and I'll do it again. Soon. This is not that post. The short version is that I think we've finally got just the right identity system. Some of my favorite features:

    - There are simple, sensible defaults that work well: you can File / New / Run / Register / Login, and everything works.
    - It supports standard username / password as well as external authentication (OAuth, etc.).
    - It's easy to customize without having to re-implement an entire provider.
    - It's built from pluggable pieces, rather than as one large monolithic system.
    - It's built using interfaces like IUser and IRole that allow for unit testing, dependency injection, etc.
    - You can easily add user profile data (e.g. URL, twitter handle, birthday): you just add properties to your ApplicationUser model and they'll automatically be persisted.
    - You get complete control over how the identity data is persisted. By default, everything works with Entity Framework Code First, but it's built to support changes from small (modify the schema) to big (use another ORM, or store your data in a document database, in the cloud, in XML, or in the EXIF data of your desktop background, or whatever).
    - It's configured via OWIN. More on OWIN and Katana later, but the fact that it's built on OWIN means it's portable.

    You can find out more in the Authentication and Identity section of the ASP.NET site (and lots more content will be going up there soon).

    New Bootstrap based project templates

    The new project templates are built using Bootstrap 3. Bootstrap (formerly Twitter Bootstrap) is a front-end framework that brings a lot of nice benefits:

    - It's responsive, so your projects will automatically scale to device width using CSS media queries. For example, menus are full size on a desktop browser, but on narrower screens you automatically get a mobile-friendly menu.
    - The built-in Bootstrap styles make your standard page elements (headers, footers, buttons, form inputs, tables, etc.) look nice and modern.
    - Bootstrap is themeable, so you can reskin your whole site by dropping in a new Bootstrap theme. Since Bootstrap is pretty popular across the web development community, this gives you a large and rapidly growing variety of templates (free and paid) to choose from.
    Bootstrap also includes a lot of very useful things: components (like progress bars and badges), useful glyphicons, and some jQuery plugins for tooltips, dropdowns, carousels, etc. Here's a look at how the responsive part works: when the page is full screen, the menu and header are optimized for a wide display; when I shrink the page down (this is all based on page width, not user-agent sniffing), the menu turns into a nice mobile-friendly dropdown. For a quick example, I grabbed a new free theme off bootswatch.com. For simple themes, you just need to download the bootstrap.css file and replace the /Content/bootstrap.css file in your project. Now when I refresh the page, I've got a new theme.

    Scaffolding

    The big change in scaffolding is that it's one system that works across ASP.NET. You can create a new Empty Web project or Web Forms project and you'll get the Scaffold context menus. For release, we've got MVC 5 and Web API 2 controllers. We had a preview of Web Forms scaffolding in the preview releases, but it wasn't fully baked for RTM; look for it in a future update, expected pretty soon. This scaffolding system wasn't just changed to work across the ASP.NET frameworks, it's also built to enable future extensibility. That's not in this release, but should also hopefully be out soon.

    Project Readme page

    This is a small thing, but I really like it. When you create a new project, you get a Project_Readme.html page that's added to the root of your project and opens in the Visual Studio built-in browser. I love it. A long time ago, when you created a new project we just dumped it on you and left you scratching your head about what to do next. Not ideal. Then we started adding a bunch of Getting Started information to the new project templates. That told you what to do next, but you had to delete all of that stuff out of your website; it doesn't belong there. Not ideal. This is a simple HTML file that's not integrated into your project code at all. You can delete it if you want. But it shows a lot of helpful links that are current for the project you just created. In the future, if we add new wacky project types, they can create readme docs with specific information on how to do appropriately wacky things. Side note: I really like that they used the internal browser in Visual Studio to show this content rather than popping open an HTML page in the default browser. I hate that. It's annoying. If you're doing that, I hope you'll stop. What if some unnamed person has 40 or 90 tabs saved in their browser session? When you pop open your "Thanks for installing my Visual Studio extension!" page, all eleventy billion tabs start up and I wish I'd never installed your thing. Be like these guys and put Visual Studio specific HTML docs in the Visual Studio browser.

    ASP.NET MVC 5

    The biggest change with ASP.NET MVC 5 is that it's no longer a separate project type. It integrates well with the rest of ASP.NET. In addition to that and the other common features we've already looked at (Bootstrap templates, Identity, authentication), here's what's new for ASP.NET MVC.

    Attribute routing

    ASP.NET MVC now supports attribute routing, thanks to a contribution by Tim McCall, the author of http://attributerouting.net. With attribute routing you can specify your routes by annotating your actions and controllers. This supports some pretty complex, customized routing scenarios, and it allows you to keep your route information right with your controller actions if you'd like.
    Here's a controller that includes an action whose method name is Hiding, but which I've configured via attribute routing to respond at /spaghetti/with-nesting/where-is-waldo:

    public class SampleController : Controller
    {
        [Route("spaghetti/with-nesting/where-is-waldo")]
        public string Hiding()
        {
            return "You found me!";
        }
    }

    I enable that in my RouteConfig.cs, and I can use it in conjunction with my other MVC routes like this:

    public class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

            routes.MapMvcAttributeRoutes();

            routes.MapRoute(
                name: "Default",
                url: "{controller}/{action}/{id}",
                defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
            );
        }
    }

    You can read more about attribute routing in ASP.NET MVC 5 here.

    Filter enhancements

    There are two new additions to filters: authentication filters and filter overrides. Authentication filters are a new kind of filter in ASP.NET MVC that run prior to authorization filters in the ASP.NET MVC pipeline and allow you to specify authentication logic per action, per controller, or globally for all controllers. Authentication filters process credentials in the request and provide a corresponding principal. They can also add authentication challenges in response to unauthorized requests. Override filters let you change which filters apply to a given action method or controller: they specify a set of filter types that should not be run for a given scope (action or controller). This allows you to configure filters that apply globally but then exclude certain global filters from specific actions or controllers.

    ASP.NET Web API 2

    ASP.NET Web API 2 includes a lot of new features.

    Attribute routing

    ASP.NET Web API supports the same attribute routing system that's in ASP.NET MVC 5. You can read more about the attribute routing features in Web API in this article.

    OAuth 2.0

    ASP.NET Web API picks up OAuth 2.0 support, using security middleware running on OWIN (discussed below). This is great for features like authenticated Single Page Applications.

    OData improvements

    ASP.NET Web API now has full OData support. That required adding some of the most powerful operators: $select, $expand, $batch and $value. You can read more about OData operator support in this article by Mike Wasson.

    Lots more

    There's a huge list of other features, including CORS (cross-origin request sharing), IHttpActionResult, IHttpRequestContext, and more. I think the best overview is in the release notes.

    OWIN and Katana

    I've written about OWIN and Katana recently. I'm a big fan. OWIN is the Open Web Interface for .NET. It's a spec, like HTML or HTTP, so you can't install OWIN. The benefit of OWIN is that it's a community specification, so anyone who implements it can plug into the ASP.NET stack, either as middleware or as a host. Katana is the Microsoft implementation of OWIN. It leverages OWIN to wire up things like authentication, handlers, modules, IIS hosting, etc., so ASP.NET can host OWIN components and Katana components can run in someone else's OWIN implementation. Howard Dierking just wrote a cool article in MSDN Magazine describing Katana in depth: Getting Started with the Katana Project. He had an interesting example showing an OWIN based pipeline which leveraged SignalR, ASP.NET Web API and NancyFx components in the same stack. If this kind of thing makes sense to you, that's great. If it doesn't, don't worry, but keep an eye on it.
    You're going to see some cool things happen as a result of ASP.NET becoming more and more pluggable.

    Visual Studio web tools

    Okay, this stuff's just crazy. Visual Studio has been adding some nice web dev features over the past few years, but they've really cranked it up for this release. Visual Studio is by far my favorite code editor for all web files: CSS, HTML, JavaScript, and lots of popular libraries. Stop thinking of Visual Studio as a big editor that you only use to write back-end code. Stop editing HTML and CSS in Notepad (or Sublime, Notepad++, etc.). Visual Studio starts up in under 2 seconds on a modern computer with an SSD. Misspelling HTML attributes or your CSS classes or jQuery or Angular syntax doesn't make you a better developer, it makes you a silly person who wastes time.

    Browser Link

    Browser Link is a real-time, two-way connection between Visual Studio and all connected browsers. It's only attached when you're running locally, in debug, but it applies to any and all connected browsers, including emulators. You may have seen demos that showed the browsers refreshing based on changes in the editor, and I'll agree that's pretty cool. But it's really just the start. It's a two-way connection, and it's built for extensibility. That means you can write extensions that push information from your running application (in IE, Chrome, a mobile emulator, etc.) back to Visual Studio. Mads and team have shown off some demonstrations where they enabled an edit mode in the browser which updated the source HTML back in Visual Studio. It's also possible to look at how the rendered HTML performs, check for compatibility issues, watch for unused CSS classes - the sky's the limit.

    New HTML editor

    The previous HTML editor had a lot of old code that didn't allow for improvements. The team rewrote the HTML editor to take advantage of the new(ish) extensibility features in Visual Studio, which then allowed them to add all kinds of features - things like CSS class and ID IntelliSense (so you type style="" and get a list of classes and IDs for your project), smart indent based on how your document is formatted, JavaScript reference auto-sync, etc. Here's a 3 minute tour from Mads Kristensen.

    Lots more Visual Studio web dev features

    That's just a sampling - there's a ton of great features for JavaScript editing, CSS editing, publishing, and Page Inspector (which shows real-time rendering of your page inside Visual Studio). Here are some more short videos showing those features.

    Lots, lots more

    Okay, that's just a summary, and it's still quite a bit. Head on over to http://asp.net/vnext for more information, and download Visual Studio 2013 now to get started!

  • Catching people up

    - by Randy Walker
    It’s been a while since I’ve blogged. I suppose when one’s personal life gets busy, some things fall by the wayside. So what all has happened since I last blogged?

    Business has been good, with lots of lessons learned. I had hoped I would have had an important announcement several months ago concerning the business I own, but that simply hasn’t materialized yet. I will keep everyone posted. Ensuring your business has a good sales pipeline and stays ahead of the technology curve is extremely important.

    I eventually resigned my INETA Board of Directors position. Never one to mince words: frankly, I had several issues with how things are run at INETA, mostly centered around some ethical issues, compounded by higher expectations and what I felt was a lack of support. I had put my hat into the ring in order to help change things, but eventually I didn’t really see change as a possibility, and so all things must come to an end.

    I have started writing up a business plan for a new startup, details to be forthcoming. Its name will be Linker CRM. I have some aggressive, game-changing plans for it. Ping me if you’re interested in finding out more and don’t mind signing a non-compete and confidentiality agreement. ;)

    My personal life has been hectic. A 4-year-old will do that to you. As will being divorced and the headaches associated with that. If you’ve been divorced, I feel your pain; if you haven’t been, I would never wish the emotional roller coaster on anyone. Dating has been interesting. It’s a lot different at age 35 than in your early 20s, and relationships are far more complicated.

    Ethan is an absolutely fantastic, adorable charmer of a kid. He’s definitely going to be a heartbreaker. His personality is really shining through, and he’s taken on my appreciation of music (and yes, I’ll admit, dance too). We watched America’s Best Dance Crew (ABDC) together for the first time; he really loved it, and I think he’ll probably start his own break dancing crew eventually. I’ve posted a few videos on Facebook for those interested. I’m extremely proud of him, but please say a little prayer for us as we try to curb some behavior issues, and as his mother and I try to settle some differences.

    This year’s travel plans have already included Dallas, Seattle, and a trip to Vancouver for the 2010 Olympics (a huge thanks to the Washington State Police for the nice souvenir they gave me). Future travel plans include a trip to Korea in the second half of May, Nashville again in the summer, and hopefully New Orleans for the Microsoft TechEd 2010 conference. Look for some new blog posts soon…

  • Oracle announces Q3 FY10 results

    - by Paulo Folgado
    Oracle Reports GAAP EPS of $0.23, Non-GAAP EPS of $0.38. New Software Licenses Up 13%, Applications New Licenses Up 21%.

    Oracle Corporation today announced fiscal 2010 Q3 GAAP total revenues were up 17% to $6.4 billion, while non-GAAP total revenues were up 18% to $6.5 billion. Excluding the impact of Sun Microsystems, Inc., which Oracle acquired on January 26, 2010, GAAP total revenue grew 7%. GAAP new software license revenues were up 13% to $1.7 billion, and up 10% to $1.7 billion excluding Sun. GAAP software license updates and product support revenues were up 13% to $3.3 billion, while non-GAAP software license updates and product support revenues were up 12% to $3.3 billion. GAAP operating income was down 5% to $1.8 billion, and GAAP operating margin was 29%. Non-GAAP operating income was up 13% to $2.9 billion, and non-GAAP operating margin was 45%. GAAP net income was down 10% to $1.2 billion, while non-GAAP net income was up 9% to $1.9 billion. GAAP earnings per share were $0.23, down 11% compared to last year, while non-GAAP earnings per share were up 9% to $0.38. GAAP operating cash flow on a trailing twelve-month basis was $8.2 billion.

    "Our solid top line growth, coupled with disciplined expense management, was key in generating $8.0 billion of free cash flow over the last twelve months," said Oracle CFO Jeff Epstein.

    "The Sun integration is going even better than we expected," said Oracle President Safra Catz. "We believe that Sun will make a significant contribution to our fourth quarter earnings per share as well as meet the profitability goals we set for next year."

    "Exadata is the fastest growing product in Oracle's history," said Oracle President Charles Phillips. "Introduced a little over a year ago, the Exadata pipeline is now approaching $400 million, with Q4 bookings forecast at nearly $100 million. This strengthens both sales growth and profitability in our Sun server and storage businesses."

    "Every quarter we grab huge chunks of market share from SAP," said Oracle CEO Larry Ellison. "SAP's most recent quarter was the best quarter of their year, only down 15%, while Oracle's application sales were up 21%. But SAP is well ahead of us in the number of CEOs for this year, announcing their third and fourth, while we only had one."

    In addition, Oracle's Board of Directors declared a cash dividend of $0.05 per share of outstanding common stock to be paid to stockholders of record as of the close of business on April 14, 2010, with a payment date of May 5, 2010. Future declarations of quarterly dividends and the establishment of future record and payment dates are subject to the final determination of Oracle's Board of Directors.

    Q3 Earnings Conference Call and Webcast

    Oracle will hold a conference call and web broadcast today to discuss these results at 2:00 p.m. Pacific. You may listen to the call by dialing (800) 214-0694 or (719) 955-1425, Passcode: 567035. To access the live web broadcast of this event, please visit the Oracle Investor Relations web site at http://www.oracle.com/investor.
