Search Results

Search found 22456 results on 899 pages for 'computer behavior'.


  • Hosting the Razor Engine for Templating in Non-Web Applications

    - by Rick Strahl
    Microsoft’s new Razor HTML rendering engine, currently shipping with the ASP.NET MVC previews, can be used outside of ASP.NET. Razor is an alternative view engine that can be used instead of the ASP.NET Page engine that currently powers ASP.NET WebForms and MVC. It provides a simpler and more readable markup syntax and is much more lightweight in terms of functionality than the full-blown WebForms Page engine, focusing only on the features of a pure view engine (or classic ASP!), with an emphasis on expression and code rendering rather than a complex control/object model. Like the Page engine though, the parser understands .NET code syntax, which can be embedded into templates, and behind the scenes the engine compiles markup and script code into an executing piece of .NET code in an assembly. Although it ships as part of ASP.NET MVC and WebMatrix, the Razor engine itself is not directly dependent on ASP.NET, IIS or HTTP in any way. And although there are some markup and rendering features that are optimized for HTML-based output generation, Razor is essentially a freestanding template engine. What’s really nice is that, unlike the ASP.NET runtime, Razor is fairly easy to host inside of your own non-web applications to provide templating functionality.

    Templating in non-Web Applications? Yes please!

    So why might you host a template engine in your non-web application? Template rendering is useful in many places and I have a number of applications that make heavy use of it. One of my applications – West Wind Html Help Builder – exclusively uses template-based rendering to merge user-supplied help text content into customizable and executable HTML markup templates that produce HTML output for CHM-style HTML Help. This is an older product that isn’t actually using .NET at the moment – which is one reason I’m looking at Razor for script hosting. For a few .NET applications I’ve used ASP.NET runtime hosting to provide templating and mail-merge style functionality, and while that works reasonably well it’s a very heavy-handed approach. It’s resource intensive and has potential versioning issues across different versions of .NET. The generic implementation I created in the article above requires a lot of fix-up to mimic an HTTP request in a non-HTTP environment, and a lot of little things have to happen to ensure that the ASP.NET runtime works properly, most of which has nothing to do with templating and everything to do with satisfying ASP.NET’s requirements. The Razor engine, on the other hand, is fairly lightweight and completely decoupled from the ASP.NET runtime and HTTP processing. It is a pure template engine whose sole purpose is to render text templates. Hosting this engine in your own applications can be accomplished with a reasonable amount of code (actually just a few lines with the tools I’m about to describe) and without having to fake HTTP requests. It’s also much lighter on resource usage, and you can easily attach custom properties to your base template implementation to pass context from the parent application into templates, all of which was rather complicated with ASP.NET runtime hosting.

    Installing the Razor Template Engine

    You can get Razor as part of MVC 3 (RC and later) or WebMatrix. Both are available as downloadable components from the Web Platform Installer version 3.0 (!important – V2 doesn’t show these components).
If you already have that version of the WPI installed just fire it up. You can get the latest version of the Web Platform Installer from here: http://www.microsoft.com/web/gallery/install.aspx Once the platform Installer 3.0 is installed install either MVC 3 or ASP.NET Web Pages. Once installed you’ll find a System.Web.Razor assembly in C:\Program Files\Microsoft ASP.NET\ASP.NET Web Pages\v1.0\Assemblies\System.Web.Razor.dll which you can add as a reference to your project. Creating a Wrapper The basic Razor Hosting API is pretty simple and you can host Razor with a (large-ish) handful of lines of code. I’ll show the basics of it later in this article. However, if you want to customize the rendering and handle assembly and namespace includes for the markup as well as deal with text and file inputs as well as forcing Razor to run in a separate AppDomain so you can unload the code-generated assemblies and deal with assembly caching for re-used templates little more work is required to create something that is more easily reusable. For this reason I created a Razor Hosting wrapper project that combines a bunch of this functionality into an easy to use hosting class, a hosting factory that can load the engine in a separate AppDomain and a couple of hosting containers that provided folder based and string based caching for templates for an easily embeddable and reusable engine with easy to use syntax. If you just want the code and play with the samples and source go grab the latest code from the Subversion Repository at: http://www.west-wind.com:8080/svn/articles/trunk/RazorHosting/ or a snapshot from: http://www.west-wind.com/files/tools/RazorHosting.zip Getting Started Before I get into how hosting with Razor works, let’s take a look at how you can get up and running quickly with the wrapper classes provided. It only takes a few lines of code. The easiest way to use these Razor Hosting Wrappers is to use one of the two HostContainers provided. One is for hosting Razor scripts in a directory and rendering them as relative paths from these script files on disk. The other HostContainer serves razor scripts from string templates… Let’s start with a very simple template that displays some simple expressions, some code blocks and demonstrates rendering some data from contextual data that you pass to the template in the form of a ‘context’. Here’s a simple Razor template: @using System.Reflection Hello @Context.FirstName! Your entry was entered on: @Context.Entered @{ // Code block: Update the host Windows Form passed in through the context Context.WinForm.Text = "Hello World from Razor at " + DateTime.Now.ToString(); } AppDomain Id: @AppDomain.CurrentDomain.FriendlyName Assembly: @Assembly.GetExecutingAssembly().FullName Code based output: @{ // Write output with Response object from code string output = string.Empty; for (int i = 0; i < 10; i++) { output += i.ToString() + " "; } Response.Write(output); } Pretty easy to see what’s going on here. The only unusual thing in this code is the Context object which is an arbitrary object I’m passing from the host to the template by way of the template base class. I’m also displaying the current AppDomain and the executing Assembly name so you can see how compiling and running a template actually loads up new assemblies. Also note that as part of my context I’m passing a reference to the current Windows Form down to the template and changing the title from within the script. 
    It’s a silly example, but it demonstrates two-way communication between host and template, which can be very powerful. The easiest way to quickly render this template is to use the RazorEngine<TTemplateBase> class. The generic parameter specifies the template base class type that Razor uses for the class it generates from a template. The default implementation provided in my RazorHosting wrapper is RazorTemplateBase. Here’s a simple example that renders from a string and outputs a string:

    var engine = new RazorEngine<RazorTemplateBase>();

    // We can pass any object as context - here create a custom context
    var context = new CustomContext()
    {
        WinForm = this,
        FirstName = "Rick",
        Entered = DateTime.Now.AddDays(-10)
    };

    string output = engine.RenderTemplate(this.txtSource.Text,
                                          new string[] { "System.Windows.Forms.dll" },
                                          context);

    if (output == null)
        this.txtResult.Text = "*** ERROR:\r\n" + engine.ErrorMessage;
    else
        this.txtResult.Text = output;

    Simple enough. This code renders a template from a string input and returns a result back as a string. It creates a custom context and passes that to the template, which can then access the Context’s properties. Note that anything passed as ‘context’ must be serializable (or MarshalByRefObject) – otherwise you get an exception when passing the reference over AppDomain boundaries (discussed later). Passing a context is optional, but it is a key feature for sharing data between the host application and the template. Note that we use the Context object to access FirstName, Entered and even the host Windows Form object, which is used in the template to change the window caption from within the script! In the code above all the work happens in the RenderTemplate method, which provides a variety of overloads to read and write to and from strings, files and TextReaders/Writers. Here’s another example that renders from a file input using a TextReader:

    using (reader = new StreamReader("templates\\simple.csHtml", true))
    {
        result = host.RenderTemplate(reader,
                                     new string[] { "System.Windows.Forms.dll" },
                                     this.CustomContext);
    }

    RenderTemplate() is fairly high level and handles loading the runtime, compiling into an assembly and rendering the template. If you want more control you can use the lower-level methods to control each step of the way, which is important for the HostContainers I’ll discuss later. Basically, for those scenarios you want to separate out loading the engine, compiling into an assembly and then rendering the template from the assembly. Why? So we can keep assemblies cached. In the code above a new assembly is created for each template rendered, which is inefficient and uses up resources. Depending on the size of your templates and how often you fire them you can chew through memory very quickly.
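    To make that concrete, here’s a minimal, hypothetical sketch (my own illustration, not part of the article’s samples) that renders the same string template in a loop using the RenderTemplate() overload shown above and prints how many assemblies are loaded in the current AppDomain; the count climbs by one per call because every render compiles a fresh template assembly, and assemblies never unload:

    var engine = new RazorEngine<RazorTemplateBase>();
    var context = new CustomContext() { FirstName = "Rick", Entered = DateTime.Now };

    for (int i = 0; i < 25; i++)
    {
        // Each iteration parses, compiles and loads a brand new template assembly
        engine.RenderTemplate("Hello @Context.FirstName!",
                              new string[] { "System.Windows.Forms.dll" },
                              context);

        // The loaded-assembly count grows every time and never shrinks
        Console.WriteLine(AppDomain.CurrentDomain.GetAssemblies().Length);
    }

    Holding on to the compiled assembly and re-executing it, as the lower-level approach below allows, avoids this.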
    The lower-level approach is only a couple of extra steps:

    // We can pass any object as context - here create a custom context
    var context = new CustomContext()
    {
        WinForm = this,
        FirstName = "Rick",
        Entered = DateTime.Now.AddDays(-10)
    };

    var engine = new RazorEngine<RazorTemplateBase>();

    string assId = null;
    using (StringReader reader = new StringReader(this.txtSource.Text))
    {
        assId = engine.ParseAndCompileTemplate(new string[] { "System.Windows.Forms.dll" }, reader);
    }

    string output = engine.RenderTemplateFromAssembly(assId, context);

    if (output == null)
        this.txtResult.Text = "*** ERROR:\r\n" + engine.ErrorMessage;
    else
        this.txtResult.Text = output;

    The difference here is that you can capture the assembly – or rather an Id to it – and potentially hold on to it to render again later, assuming the template hasn’t changed. The HostContainers take advantage of this feature to cache assemblies based on criteria such as a filename plus file time stamp, or a string hash; if those haven’t changed, the existing assembly is reused. Note that ParseAndCompileTemplate returns an assembly Id rather than the assembly itself. This is done so that the assembly always stays in the host’s AppDomain and is not passed across AppDomain boundaries, which would cause load failures. We’ll talk more about this in a minute, but for now just realize that assembly references are stored in a list and are accessible by this Id, which allows locating and re-executing the assembly later. Reuse of the assembly avoids recompilation overhead and the creation of yet another assembly that loads into the current AppDomain. You can play around with several different versions of the above code in the main sample form.

    Using Hosting Containers for more Control and Caching

    The above examples simply render templates into assemblies each and every time they are executed. While this works and is even reasonably fast, it’s not terribly efficient. If you render templates more than once it would be nice if you could cache the generated assemblies, for example, to avoid recompiling and creating a new assembly each time. Additionally it would be nice to optionally load template assemblies into a separate AppDomain, both to be able to unload assemblies and to protect your host application from scripting attacks with malicious template code. Hosting containers provide a wrapper around the RazorEngine<T> instance, a factory (which allows creation in separate AppDomains) and an easy way to start and stop the container ‘runtime’. The Razor Hosting samples provide two hosting containers: RazorFolderHostContainer and RazorStringHostContainer. The folder host provides a simple runtime environment for a folder structure, similar to the way the ASP.NET runtime treats a virtual directory as its ‘application’ root. Templates are loaded from disk via relative paths and the resulting assemblies are cached unless the template on disk is changed. The string host caches templates based on string hashes – if the same string is passed a second time a cached version of the assembly is used. Here’s how HostContainers work. I’ll use the RazorFolderHostContainer because it’s likely the most common way you’d use templates – disk-based templates that can be easily edited and maintained.
    The first step is to create an instance of it and keep it around somewhere (in the example it’s attached as a property to the Form):

    RazorFolderHostContainer Host = new RazorFolderHostContainer();

    public RazorFolderHostForm()
    {
        InitializeComponent();

        // The base path for templates - templates are rendered with relative paths
        // based on this path.
        Host.TemplatePath = Path.Combine(Environment.CurrentDirectory, TemplateBaseFolder);

        // Add any assemblies you want referenced in your templates
        Host.ReferencedAssemblies.Add("System.Windows.Forms.dll");

        // Start up the host container
        Host.Start();
    }

    Next, any time you want to render a template you can use simple code like this:

    private void RenderTemplate(string fileName)
    {
        // Pass the template path via the Context
        var relativePath = Utilities.GetRelativePath(fileName, Host.TemplatePath);

        if (!Host.RenderTemplate(relativePath, this.Context, Host.RenderingOutputFile))
        {
            MessageBox.Show("Error: " + Host.ErrorMessage);
            return;
        }

        this.webBrowser1.Navigate("file://" + Host.RenderingOutputFile);
    }

    You can also render the output to a string instead of to a file:

    string result = Host.RenderTemplateToString(relativePath, context);

    Finally, if you want to release the engine and shut down the hosting AppDomain you can simply do:

    Host.Stop();

    Stopping the AppDomain and restarting it (i.e. calling Stop() followed by Start()) is also a nice way to release all resources in the AppDomain. The folder-based host also supports partial rendering based on template-root-relative paths, with the same caching characteristics as the main templates. From within a template you can call out to a partial like this:

    @RenderPartial(@"partials\PartialRendering.cshtml", Context)

    where partials\PartialRendering.cshtml is relative to the template root folder. The folder host example lets you load up templates from disk and display the result in a Web Browser control, which demonstrates using Razor HTML output from templates that contain HTML syntax – which happens to be my target scenario for Html Help Builder.

    The Razor Engine Wrapper Project

    The project I created to wrap Razor hosting has a fair bit of code and a number of classes associated with it. Most of the components are used internally, and as you can see, using the final RazorEngine<T> and HostContainer classes is pretty easy. The classes are extensible and I suspect developers will want to build more customized host containers for their applications. Host containers are the key to wrapping up all the functionality – engine, base template, AppDomain hosting, caching etc. – in a logical piece that is ready to be plugged into an application. When looking at the code there are a couple of core features provided:

    Core Razor Engine Hosting
    This is the core Razor hosting piece which provides the basics of loading a template, compiling it into an assembly and executing it. This is fairly straightforward, but without a host container that can cache assemblies based on some criteria, templates are recompiled and re-created each time, which is inefficient (although pretty fast). The base engine wrapper implementation also supports hosting the Razor runtime in a separate AppDomain for security and the ability to unload it on demand.

    Host Containers
    The engine hosting itself doesn’t provide any sort of ‘runtime’ service like picking up files from disk, caching assemblies and so forth. So my implementation provides two HostContainers: RazorFolderHostContainer and RazorStringHostContainer.
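    Whichever container you pick, its lifetime is worth tying to the window (or service) that owns it, since it may hold an entire AppDomain plus cached template assemblies. Here’s a small hypothetical bit of wiring (my own sketch, reusing only the Host field and the Start()/Stop() calls from the form example above) that shuts the container down when the form closes:

    protected override void OnFormClosed(FormClosedEventArgs e)
    {
        base.OnFormClosed(e);

        // Unload the hosting AppDomain and release all cached template assemblies
        if (Host != null)
        {
            Host.Stop();
            Host = null;
        }
    }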
    The FolderHost works off a base directory and loads templates based on relative paths (sort of like the ASP.NET runtime does off a virtual directory). The HostContainers also deal with caching of template assemblies – for the folder host the file date is tracked and checked for updates, and unless the template has changed a cached assembly is reused. The RazorStringHostContainer similarly checks string hashes to figure out whether a particular string template was previously compiled and executed. The HostContainers also act as a simple startup environment and a single reference that is easy to store and reuse in an application.

    TemplateBase Classes
    The template base classes are the classes from which the Razor engine generates .NET code. A template is parsed into a class with an Execute() method, and that class derives from the template type you specify. RazorEngine<TBaseTemplate> can receive this type, and the HostContainers default to specific templates in their base implementations. Template classes are customizable to allow you to create templates that provide application-specific features and interaction from the template back to your host application.

    How does the RazorEngine wrapper work?
    You can browse the source code in the links above or in the repository, or download the source, but I’ll highlight some key features here. Here’s part of the RazorEngine implementation that hosts the runtime and demonstrates the key code required to do so. The RazorEngine class is implemented as a generic class to reflect the template base class type:

    public class RazorEngine<TBaseTemplateType> : MarshalByRefObject
        where TBaseTemplateType : RazorTemplateBase

    The generic type is used internally to provide easier access to the template type and assignments on it as part of the template processing. The class also inherits MarshalByRefObject to allow execution over AppDomain boundaries – something that all the classes discussed here need to do since there is much interaction between the host and the template. The first two key methods deal with creating a template assembly:

    /// <summary>
    /// Creates an instance of the RazorHost with various options applied.
    /// Applies basic namespace imports and the name of the class to generate
    /// </summary>
    /// <param name="generatedNamespace"></param>
    /// <param name="generatedClass"></param>
    /// <returns></returns>
    protected RazorTemplateEngine CreateHost(string generatedNamespace, string generatedClass)
    {
        Type baseClassType = typeof(TBaseTemplateType);

        RazorEngineHost host = new RazorEngineHost(new CSharpRazorCodeLanguage());
        host.DefaultBaseClass = baseClassType.FullName;
        host.DefaultClassName = generatedClass;
        host.DefaultNamespace = generatedNamespace;

        host.NamespaceImports.Add("System");
        host.NamespaceImports.Add("System.Text");
        host.NamespaceImports.Add("System.Collections.Generic");
        host.NamespaceImports.Add("System.Linq");
        host.NamespaceImports.Add("System.IO");

        return new RazorTemplateEngine(host);
    }

    /// <summary>
    /// Parses and compiles a markup template into an assembly and returns
    /// an assembly name. The name is an ID that can be passed to
    /// ExecuteTemplateByAssembly which picks up a cached instance of the
    /// loaded assembly.
/// /// </summary> /// <param name="namespaceOfGeneratedClass">The namespace of the class to generate from the template</param> /// <param name="generatedClassName">The name of the class to generate from the template</param> /// <param name="ReferencedAssemblies">Any referenced assemblies by dll name only. Assemblies must be in execution path of host or in GAC.</param> /// <param name="templateSourceReader">Textreader that loads the template</param> /// <remarks> /// The actual assembly isn't returned here to allow for cross-AppDomain /// operation. If the assembly was returned it would fail for cross-AppDomain /// calls. /// </remarks> /// <returns>An assembly Id. The Assembly is cached in memory and can be used with RenderFromAssembly.</returns> public string ParseAndCompileTemplate( string namespaceOfGeneratedClass, string generatedClassName, string[] ReferencedAssemblies, TextReader templateSourceReader) { RazorTemplateEngine engine = CreateHost(namespaceOfGeneratedClass, generatedClassName); // Generate the template class as CodeDom GeneratorResults razorResults = engine.GenerateCode(templateSourceReader); // Create code from the codeDom and compile CSharpCodeProvider codeProvider = new CSharpCodeProvider(); CodeGeneratorOptions options = new CodeGeneratorOptions(); // Capture Code Generated as a string for error info // and debugging LastGeneratedCode = null; using (StringWriter writer = new StringWriter()) { codeProvider.GenerateCodeFromCompileUnit(razorResults.GeneratedCode, writer, options); LastGeneratedCode = writer.ToString(); } CompilerParameters compilerParameters = new CompilerParameters(ReferencedAssemblies); // Standard Assembly References compilerParameters.ReferencedAssemblies.Add("System.dll"); compilerParameters.ReferencedAssemblies.Add("System.Core.dll"); compilerParameters.ReferencedAssemblies.Add("Microsoft.CSharp.dll"); // dynamic support! // Also add the current assembly so RazorTemplateBase is available compilerParameters.ReferencedAssemblies.Add(Assembly.GetExecutingAssembly().CodeBase.Substring(8)); compilerParameters.GenerateInMemory = Configuration.CompileToMemory; if (!Configuration.CompileToMemory) compilerParameters.OutputAssembly = Path.Combine(Configuration.TempAssemblyPath, "_" + Guid.NewGuid().ToString("n") + ".dll"); CompilerResults compilerResults = codeProvider.CompileAssemblyFromDom(compilerParameters, razorResults.GeneratedCode); if (compilerResults.Errors.Count > 0) { var compileErrors = new StringBuilder(); foreach (System.CodeDom.Compiler.CompilerError compileError in compilerResults.Errors) compileErrors.Append(String.Format(Resources.LineX0TColX1TErrorX2RN, compileError.Line, compileError.Column, compileError.ErrorText)); this.SetError(compileErrors.ToString() + "\r\n" + LastGeneratedCode); return null; } AssemblyCache.Add(compilerResults.CompiledAssembly.FullName, compilerResults.CompiledAssembly); return compilerResults.CompiledAssembly.FullName; } Think of the internal CreateHost() method as setting up the assembly generated from each template. Each template compiles into a separate assembly. It sets up namespaces, and assembly references, the base class used and the name and namespace for the generated class. ParseAndCompileTemplate() then calls the CreateHost() method to receive the template engine generator which effectively generates a CodeDom from the template – the template is turned into .NET code. 
The code generated from our earlier example looks something like this: //------------------------------------------------------------------------------ // <auto-generated> // This code was generated by a tool. // Runtime Version:4.0.30319.1 // // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // </auto-generated> //------------------------------------------------------------------------------ namespace RazorTest { using System; using System.Text; using System.Collections.Generic; using System.Linq; using System.IO; using System.Reflection; public class RazorTemplate : RazorHosting.RazorTemplateBase { #line hidden public RazorTemplate() { } public override void Execute() { WriteLiteral("Hello "); Write(Context.FirstName); WriteLiteral("! Your entry was entered on: "); Write(Context.Entered); WriteLiteral("\r\n\r\n"); // Code block: Update the host Windows Form passed in through the context Context.WinForm.Text = "Hello World from Razor at " + DateTime.Now.ToString(); WriteLiteral("\r\nAppDomain Id:\r\n "); Write(AppDomain.CurrentDomain.FriendlyName); WriteLiteral("\r\n \r\nAssembly:\r\n "); Write(Assembly.GetExecutingAssembly().FullName); WriteLiteral("\r\n\r\nCode based output: \r\n"); // Write output with Response object from code string output = string.Empty; for (int i = 0; i < 10; i++) { output += i.ToString() + " "; } } } } Basically the template’s body is turned into code in an Execute method that is called. Internally the template’s Write method is fired to actually generate the output. Note that the class inherits from RazorTemplateBase which is the generic parameter I used to specify the base class when creating an instance in my RazorEngine host: var engine = new RazorEngine<RazorTemplateBase>(); This template class must be provided and it must implement an Execute() and Write() method. Beyond that you can create any class you chose and attach your own properties. My RazorTemplateBase class implementation is very simple: public class RazorTemplateBase : MarshalByRefObject, IDisposable { /// <summary> /// You can pass in a generic context object /// to use in your template code /// </summary> public dynamic Context { get; set; } /// <summary> /// Class that generates output. Currently ultra simple /// with only Response.Write() implementation. /// </summary> public RazorResponse Response { get; set; } public object HostContainer {get; set; } public object Engine { get; set; } public RazorTemplateBase() { Response = new RazorResponse(); } public virtual void Write(object value) { Response.Write(value); } public virtual void WriteLiteral(object value) { Response.Write(value); } /// <summary> /// Razor Parser implements this method /// </summary> public virtual void Execute() {} public virtual void Dispose() { if (Response != null) { Response.Dispose(); Response = null; } } } Razor fills in the Execute method when it generates its subclass and uses the Write() method to output content. As you can see I use a RazorResponse() class here to generate output. This isn’t necessary really, as you could use a StringBuilder or StringWriter() directly, but I prefer using Response object so I can extend the Response behavior as needed. 
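    Since the generated class derives from whatever base you hand to RazorEngine<T>, extending the ‘API’ your templates see is just a matter of subclassing. Here’s a small hypothetical sketch (my own example, not part of the RazorHosting samples) of a custom base class that adds an HTML-encoding helper and an extra property on top of the RazorTemplateBase class shown above; the class and member names are mine:

    public class HelpTemplateBase : RazorTemplateBase
    {
        // Extra property the host can set before rendering; templates can then
        // use @Title directly instead of going through the dynamic Context.
        public string Title { get; set; }

        // Helper callable from templates as @HtmlEncode(...)
        public string HtmlEncode(object value)
        {
            if (value == null)
                return string.Empty;

            return System.Net.WebUtility.HtmlEncode(value.ToString());
        }
    }

    A template compiled with var engine = new RazorEngine<HelpTemplateBase>() then has these members available inside the template in addition to Context, Response, Write() and WriteLiteral().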
The RazorResponse class is also very simple and merely acts as a wrapper around a TextWriter: public class RazorResponse : IDisposable { /// <summary> /// Internal text writer - default to StringWriter() /// </summary> public TextWriter Writer = new StringWriter(); public virtual void Write(object value) { Writer.Write(value); } public virtual void WriteLine(object value) { Write(value); Write("\r\n"); } public virtual void WriteFormat(string format, params object[] args) { Write(string.Format(format, args)); } public override string ToString() { return Writer.ToString(); } public virtual void Dispose() { Writer.Close(); } public virtual void SetTextWriter(TextWriter writer) { // Close original writer if (Writer != null) Writer.Close(); Writer = writer; } } The Rendering Methods of RazorEngine At this point I’ve talked about the assembly generation logic and the template implementation itself. What’s left is that once you’ve generated the assembly is to execute it. The code to do this is handled in the various RenderXXX methods of the RazorEngine class. Let’s look at the lowest level one of these which is RenderTemplateFromAssembly() and a couple of internal support methods that handle instantiating and invoking of the generated template method: public string RenderTemplateFromAssembly( string assemblyId, string generatedNamespace, string generatedClass, object context, TextWriter outputWriter) { this.SetError(); Assembly generatedAssembly = AssemblyCache[assemblyId]; if (generatedAssembly == null) { this.SetError(Resources.PreviouslyCompiledAssemblyNotFound); return null; } string className = generatedNamespace + "." + generatedClass; Type type; try { type = generatedAssembly.GetType(className); } catch (Exception ex) { this.SetError(Resources.UnableToCreateType + className + ": " + ex.Message); return null; } // Start with empty non-error response (if we use a writer) string result = string.Empty; using(TBaseTemplateType instance = InstantiateTemplateClass(type)) { if (instance == null) return null; if (outputWriter != null) instance.Response.SetTextWriter(outputWriter); if (!InvokeTemplateInstance(instance, context)) return null; // Capture string output if implemented and return // otherwise null is returned if (outputWriter == null) result = instance.Response.ToString(); } return result; } protected virtual TBaseTemplateType InstantiateTemplateClass(Type type) { TBaseTemplateType instance = Activator.CreateInstance(type) as TBaseTemplateType; if (instance == null) { SetError(Resources.CouldnTActivateTypeInstance + type.FullName); return null; } instance.Engine = this; // If a HostContainer was set pass that to the template too instance.HostContainer = this.HostContainer; return instance; } /// <summary> /// Internally executes an instance of the template, /// captures errors on execution and returns true or false /// </summary> /// <param name="instance">An instance of the generated template</param> /// <returns>true or false - check ErrorMessage for errors</returns> protected virtual bool InvokeTemplateInstance(TBaseTemplateType instance, object context) { try { instance.Context = context; instance.Execute(); } catch (Exception ex) { this.SetError(Resources.TemplateExecutionError + ex.Message); return false; } finally { // Must make sure Response is closed instance.Response.Dispose(); } return true; } The RenderTemplateFromAssembly method basically requires the namespace and class to instantate and creates an instance of the class using InstantiateTemplateClass(). 
    It then invokes the method with InvokeTemplateInstance(). These two methods are broken out because they are re-used by various other rendering methods, and also to allow subclassing and providing additional configuration tasks to set properties and pass values to templates at execution time. In the default mode, instantiation sets the Engine and HostContainer (discussed later) so the template can call back into the template engine, and the context is set when the template method is invoked. The various RenderXXX methods use similar code although they create the assemblies first. If you’re after caching assemblies, RenderTemplateFromAssembly() is the method to call, and that’s exactly what the two HostContainer classes do. More on that in a minute, but before we get into HostContainers let’s talk about AppDomain hosting and the like.

    Running Templates in their own AppDomain

    With the RazorEngine class above, when a template is parsed into an assembly and executed, the assembly is created (in memory or on disk – you can configure that) and cached in the current AppDomain. In .NET, once an assembly has been loaded it can never be unloaded, so if you’re loading lots of templates and at some point want to release them, there’s no way to do so. If however you load the assemblies into a separate AppDomain, that AppDomain can be unloaded along with the assemblies loaded into it. To host the templates in a separate AppDomain, the easiest thing to do is to run the entire RazorEngine in a separate AppDomain. Then all interaction occurs in the other AppDomain and no further changes have to be made. To facilitate this there is a RazorEngineFactory which has methods that can instantiate the RazorHost in a separate AppDomain as well as in the local AppDomain. The factory creates the remote instance and then hangs on to it to keep it alive, and it also provides methods to shut down the AppDomain and reload the engine. Sounds complicated, but cross-AppDomain invocation is actually fairly easy to implement. Here’s some of the relevant code from the RazorEngineFactory class. Like the RazorEngine, this class is generic and requires a template base type as the generic parameter: public class RazorEngineFactory<TBaseTemplateType> where TBaseTemplateType : RazorTemplateBase Here are the key methods of interest: /// <summary> /// Creates an instance of the RazorHost in a new AppDomain. This /// version creates a static singleton that is cached and you /// can call UnloadRazorHostInAppDomain to unload it. /// </summary> /// <returns></returns> public static RazorEngine<TBaseTemplateType> CreateRazorHostInAppDomain() { if (Current == null) Current = new RazorEngineFactory<TBaseTemplateType>(); return Current.GetRazorHostInAppDomain(); } public static void UnloadRazorHostInAppDomain() { if (Current != null) Current.UnloadHost(); Current = null; } /// <summary> /// Instance method that creates a RazorHost in a new AppDomain. /// This method requires that you keep the Factory around in /// order to keep the AppDomain alive and be able to unload it. /// </summary> /// <returns></returns> public RazorEngine<TBaseTemplateType> GetRazorHostInAppDomain() { LocalAppDomain = CreateAppDomain(null); if (LocalAppDomain == null) return null; /// Create the instance inside of the new AppDomain /// Note: remote domain uses local EXE's AppBasePath!!! 
RazorEngine<TBaseTemplateType> host = null; try { Assembly ass = Assembly.GetExecutingAssembly(); string AssemblyPath = ass.Location; host = (RazorEngine<TBaseTemplateType>) LocalAppDomain.CreateInstanceFrom(AssemblyPath, typeof(RazorEngine<TBaseTemplateType>).FullName).Unwrap(); } catch (Exception ex) { ErrorMessage = ex.Message; return null; } return host; } /// <summary> /// Internally creates a new AppDomain in which Razor templates can /// be run. /// </summary> /// <param name="appDomainName"></param> /// <returns></returns> private AppDomain CreateAppDomain(string appDomainName) { if (appDomainName == null) appDomainName = "RazorHost_" + Guid.NewGuid().ToString("n"); AppDomainSetup setup = new AppDomainSetup(); // *** Point at current directory setup.ApplicationBase = AppDomain.CurrentDomain.BaseDirectory; AppDomain localDomain = AppDomain.CreateDomain(appDomainName, null, setup); return localDomain; } /// <summary> /// Allow unloading of the created AppDomain to release resources /// All internal resources in the AppDomain are released including /// in memory compiled Razor assemblies. /// </summary> public void UnloadHost() { if (this.LocalAppDomain != null) { AppDomain.Unload(this.LocalAppDomain); this.LocalAppDomain = null; } } The static CreateRazorHostInAppDomain() is the key method that startup code usually calls. It uses a Current singleton instance to an instance of itself that is created cross AppDomain and is kept alive because it’s static. GetRazorHostInAppDomain actually creates a cross-AppDomain instance which first creates a new AppDomain and then loads the RazorEngine into it. The remote Proxy instance is returned as a result to the method and can be used the same as a local instance. The code to run with a remote AppDomain is simple: private RazorEngine<RazorTemplateBase> CreateHost() { if (this.Host != null) return this.Host; // Use Static Methods - no error message if host doesn't load this.Host = RazorEngineFactory<RazorTemplateBase>.CreateRazorHostInAppDomain(); if (this.Host == null) { MessageBox.Show("Unable to load Razor Template Host", "Razor Hosting", MessageBoxButtons.OK, MessageBoxIcon.Exclamation); } return this.Host; } This code relies on a local reference of the Host which is kept around for the duration of the app (in this case a form reference). To use this you’d simply do: this.Host = CreateHost(); if (host == null) return; string result = host.RenderTemplate( this.txtSource.Text, new string[] { "System.Windows.Forms.dll", "Westwind.Utilities.dll" }, this.CustomContext); if (result == null) { MessageBox.Show(host.ErrorMessage, "Template Execution Error", MessageBoxButtons.OK, MessageBoxIcon.Exclamation); return; } this.txtResult.Text = result; Now all templates run in a remote AppDomain and can be unloaded with simple code like this: RazorEngineFactory<RazorTemplateBase>.UnloadRazorHostInAppDomain(); this.Host = null; One Step further – Providing a caching ‘Runtime’ Once we can load templates in a remote AppDomain we can add some additional functionality like assembly caching based on application specific features. One of my typical scenarios is to render templates out of a scripts folder. So all templates live in a folder and they change infrequently. So a Folder based host that can compile these templates once and then only recompile them if something changes would be ideal. Enter host containers which are basically wrappers around the RazorEngine<t> and RazorEngineFactory<t>. 
They provide additional logic for things like file caching based on changes on disk or string hashes for string based template inputs. The folder host also provides for partial rendering logic through a custom template base implementation. There’s a base implementation in RazorBaseHostContainer, which provides the basics for hosting a RazorEngine, which includes the ability to start and stop the engine, cache assemblies and add references: public abstract class RazorBaseHostContainer<TBaseTemplateType> : MarshalByRefObject where TBaseTemplateType : RazorTemplateBase, new() { public RazorBaseHostContainer() { UseAppDomain = true; GeneratedNamespace = "__RazorHost"; } /// <summary> /// Determines whether the Container hosts Razor /// in a separate AppDomain. Seperate AppDomain /// hosting allows unloading and releasing of /// resources. /// </summary> public bool UseAppDomain { get; set; } /// <summary> /// Base folder location where the AppDomain /// is hosted. By default uses the same folder /// as the host application. /// /// Determines where binary dependencies are /// found for assembly references. /// </summary> public string BaseBinaryFolder { get; set; } /// <summary> /// List of referenced assemblies as string values. /// Must be in GAC or in the current folder of the host app/ /// base BinaryFolder /// </summary> public List<string> ReferencedAssemblies = new List<string>(); /// <summary> /// Name of the generated namespace for template classes /// </summary> public string GeneratedNamespace {get; set; } /// <summary> /// Any error messages /// </summary> public string ErrorMessage { get; set; } /// <summary> /// Cached instance of the Host. Required to keep the /// reference to the host alive for multiple uses. /// </summary> public RazorEngine<TBaseTemplateType> Engine; /// <summary> /// Cached instance of the Host Factory - so we can unload /// the host and its associated AppDomain. /// </summary> protected RazorEngineFactory<TBaseTemplateType> EngineFactory; /// <summary> /// Keep track of each compiled assembly /// and when it was compiled. /// /// Use a hash of the string to identify string /// changes. /// </summary> protected Dictionary<int, CompiledAssemblyItem> LoadedAssemblies = new Dictionary<int, CompiledAssemblyItem>(); /// <summary> /// Call to start the Host running. Follow by a calls to RenderTemplate to /// render individual templates. Call Stop when done. /// </summary> /// <returns>true or false - check ErrorMessage on false </returns> public virtual bool Start() { if (Engine == null) { if (UseAppDomain) Engine = RazorEngineFactory<TBaseTemplateType>.CreateRazorHostInAppDomain(); else Engine = RazorEngineFactory<TBaseTemplateType>.CreateRazorHost(); Engine.Configuration.CompileToMemory = true; Engine.HostContainer = this; if (Engine == null) { this.ErrorMessage = EngineFactory.ErrorMessage; return false; } } return true; } /// <summary> /// Stops the Host and releases the host AppDomain and cached /// assemblies. /// </summary> /// <returns>true or false</returns> public bool Stop() { this.LoadedAssemblies.Clear(); RazorEngineFactory<RazorTemplateBase>.UnloadRazorHostInAppDomain(); this.Engine = null; return true; } … } This base class provides most of the mechanics to host the runtime, but no application specific implementation for rendering. There are rendering functions but they just call the engine directly and provide no caching – there’s no context to decide how to cache and reuse templates. 
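    To make that point concrete, here’s a deliberately simplified, hypothetical caching wrapper (my own code, not the library’s RazorStringHostContainer) that shows the kind of context a container adds. It wraps a RazorEngine<RazorTemplateBase> directly, keys compiled assembly Ids by the template string’s hash code, and only recompiles when it sees a new string; it relies solely on the ParseAndCompileTemplate() and RenderTemplateFromAssembly() calls demonstrated earlier, and the class and method names are mine:

    using System.Collections.Generic;
    using System.IO;
    using RazorHosting;

    public class SimpleStringTemplateCache
    {
        private readonly RazorEngine<RazorTemplateBase> _engine = new RazorEngine<RazorTemplateBase>();

        // Maps a template string's hash code to the Id of its compiled assembly
        private readonly Dictionary<int, string> _assemblyIds = new Dictionary<int, string>();

        public string ErrorMessage { get; private set; }

        public string Render(string templateText, string[] referencedAssemblies, object context)
        {
            int key = templateText.GetHashCode();
            string assemblyId;

            if (!_assemblyIds.TryGetValue(key, out assemblyId))
            {
                // First time this exact string is seen: compile once and remember the Id
                using (var reader = new StringReader(templateText))
                {
                    assemblyId = _engine.ParseAndCompileTemplate(referencedAssemblies, reader);
                }

                if (assemblyId == null)
                {
                    ErrorMessage = _engine.ErrorMessage;
                    return null;
                }

                _assemblyIds[key] = assemblyId;
            }

            // Subsequent renders of the same string reuse the cached assembly
            return _engine.RenderTemplateFromAssembly(assemblyId, context);
        }
    }

    The real host containers do essentially this, plus AppDomain hosting, file-time checks and the start/stop lifecycle discussed next.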
The key methods are Start and Stop and their main purpose is to start a new AppDomain (optionally) and shut it down when requested. The RazorFolderHostContainer – Folder Based Runtime Hosting Let’s look at the more application specific RazorFolderHostContainer implementation which is defined like this: public class RazorFolderHostContainer : RazorBaseHostContainer<RazorTemplateFolderHost> Note that a customized RazorTemplateFolderHost class template is used for this implementation that supports partial rendering in form of a RenderPartial() method that’s available to templates. The folder host’s features are: Render templates based on a Template Base Path (a ‘virtual’ if you will) Cache compiled assemblies based on the relative path and file time stamp File changes on templates cause templates to be recompiled into new assemblies Support for partial rendering using base folder relative pathing As shown in the startup examples earlier host containers require some startup code with a HostContainer tied to a persistent property (like a Form property): // The base path for templates - templates are rendered with relative paths // based on this path. HostContainer.TemplatePath = Path.Combine(Environment.CurrentDirectory, TemplateBaseFolder); // Default output rendering disk location HostContainer.RenderingOutputFile = Path.Combine(HostContainer.TemplatePath, "__Preview.htm"); // Add any assemblies you want reference in your templates HostContainer.ReferencedAssemblies.Add("System.Windows.Forms.dll"); // Start up the host container HostContainer.Start(); Once that’s done, you can render templates with the host container: // Pass the template path for full filename seleted with OpenFile Dialog // relativepath is: subdir\file.cshtml or file.cshtml or ..\file.cshtml var relativePath = Utilities.GetRelativePath(fileName, HostContainer.TemplatePath); if (!HostContainer.RenderTemplate(relativePath, Context, HostContainer.RenderingOutputFile)) { MessageBox.Show("Error: " + HostContainer.ErrorMessage); return; } webBrowser1.Navigate("file://" + HostContainer.RenderingOutputFile); The most critical task of the RazorFolderHostContainer implementation is to retrieve a template from disk, compile and cache it and then deal with deciding whether subsequent requests need to re-compile the template or simply use a cached version. Internally the GetAssemblyFromFileAndCache() handles this task: /// <summary> /// Internally checks if a cached assembly exists and if it does uses it /// else creates and compiles one. Returns an assembly Id to be /// used with the LoadedAssembly list. 
/// </summary> /// <param name="relativePath"></param> /// <param name="context"></param> /// <returns></returns> protected virtual CompiledAssemblyItem GetAssemblyFromFileAndCache(string relativePath) { string fileName = Path.Combine(TemplatePath, relativePath).ToLower(); int fileNameHash = fileName.GetHashCode(); if (!File.Exists(fileName)) { this.SetError(Resources.TemplateFileDoesnTExist + fileName); return null; } CompiledAssemblyItem item = null; this.LoadedAssemblies.TryGetValue(fileNameHash, out item); string assemblyId = null; // Check for cached instance if (item != null) { var fileTime = File.GetLastWriteTimeUtc(fileName); if (fileTime <= item.CompileTimeUtc) assemblyId = item.AssemblyId; } else item = new CompiledAssemblyItem(); // No cached instance - create assembly and cache if (assemblyId == null) { string safeClassName = GetSafeClassName(fileName); StreamReader reader = null; try { reader = new StreamReader(fileName, true); } catch (Exception ex) { this.SetError(Resources.ErrorReadingTemplateFile + fileName); return null; } assemblyId = Engine.ParseAndCompileTemplate(this.ReferencedAssemblies.ToArray(), reader); // need to ensure reader is closed if (reader != null) reader.Close(); if (assemblyId == null) { this.SetError(Engine.ErrorMessage); return null; } item.AssemblyId = assemblyId; item.CompileTimeUtc = DateTime.UtcNow; item.FileName = fileName; item.SafeClassName = safeClassName; this.LoadedAssemblies[fileNameHash] = item; } return item; } This code uses a LoadedAssembly dictionary which is comprised of a structure that holds a reference to a compiled assembly, a full filename and file timestamp and an assembly id. LoadedAssemblies (defined on the base class shown earlier) is essentially a cache for compiled assemblies and they are identified by a hash id. In the case of files the hash is a GetHashCode() from the full filename of the template. The template is checked for in the cache and if not found the file stamp is checked. If that’s newer than the cache’s compilation date the template is recompiled otherwise the version in the cache is used. All the core work defers to a RazorEngine<T> instance to ParseAndCompileTemplate(). The three rendering specific methods then are rather simple implementations with just a few lines of code dealing with parameter and return value parsing: /// <summary> /// Renders a template to a TextWriter. Useful to write output into a stream or /// the Response object. Used for partial rendering. /// </summary> /// <param name="relativePath">Relative path to the file in the folder structure</param> /// <param name="context">Optional context object or null</param> /// <param name="writer">The textwriter to write output into</param> /// <returns></returns> public bool RenderTemplate(string relativePath, object context, TextWriter writer) { // Set configuration data that is to be passed to the template (any object) Engine.TemplatePerRequestConfigurationData = new RazorFolderHostTemplateConfiguration() { TemplatePath = Path.Combine(this.TemplatePath, relativePath), TemplateRelativePath = relativePath, }; CompiledAssemblyItem item = GetAssemblyFromFileAndCache(relativePath); if (item == null) { writer.Close(); return false; } try { // String result will be empty as output will be rendered into the // Response object's stream output. 
However a null result denotes // an error string result = Engine.RenderTemplateFromAssembly(item.AssemblyId, context, writer); if (result == null) { this.SetError(Engine.ErrorMessage); return false; } } catch (Exception ex) { this.SetError(ex.Message); return false; } finally { writer.Close(); } return true; } /// <summary> /// Render a template from a source file on disk to a specified outputfile. /// </summary> /// <param name="relativePath">Relative path off the template root folder. Format: path/filename.cshtml</param> /// <param name="context">Any object that will be available in the template as a dynamic of this.Context</param> /// <param name="outputFile">Optional - output file where output is written to. If not specified the /// RenderingOutputFile property is used instead /// </param> /// <returns>true if rendering succeeds, false on failure - check ErrorMessage</returns> public bool RenderTemplate(string relativePath, object context, string outputFile) { if (outputFile == null) outputFile = RenderingOutputFile; try { using (StreamWriter writer = new StreamWriter(outputFile, false, Engine.Configuration.OutputEncoding, Engine.Configuration.StreamBufferSize)) { return RenderTemplate(relativePath, context, writer); } } catch (Exception ex) { this.SetError(ex.Message); return false; } return true; } /// <summary> /// Renders a template to string. Useful for RenderTemplate /// </summary> /// <param name="relativePath"></param> /// <param name="context"></param> /// <returns></returns> public string RenderTemplateToString(string relativePath, object context) { string result = string.Empty; try { using (StringWriter writer = new StringWriter()) { // String result will be empty as output will be rendered into the // Response object's stream output. However a null result denotes // an error if (!RenderTemplate(relativePath, context, writer)) { this.SetError(Engine.ErrorMessage); return null; } result = writer.ToString(); } } catch (Exception ex) { this.SetError(ex.Message); return null; } return result; } The idea is that you can create custom host container implementations that do exactly what you want fairly easily. Take a look at both the RazorFolderHostContainer and RazorStringHostContainer classes for the basic concepts you can use to create custom implementations. Notice also that you can set the engine’s PerRequestConfigurationData() from the host container: // Set configuration data that is to be passed to the template (any object) Engine.TemplatePerRequestConfigurationData = new RazorFolderHostTemplateConfiguration() { TemplatePath = Path.Combine(this.TemplatePath, relativePath), TemplateRelativePath = relativePath, }; which when set to a non-null value is passed to the Template’s InitializeTemplate() method. This method receives an object parameter which you can cast as needed: public override void InitializeTemplate(object configurationData) { // Pick up configuration data and stuff into Request object RazorFolderHostTemplateConfiguration config = configurationData as RazorFolderHostTemplateConfiguration; this.Request.TemplatePath = config.TemplatePath; this.Request.TemplateRelativePath = config.TemplateRelativePath; } With this data you can then configure any custom properties or objects on your main template class. It’s an easy way to pass data from the HostContainer all the way down into the template. The type you use is of type object so you have to cast it yourself, and it must be serializable since it will likely run in a separate AppDomain. 
This might seem like an ugly way to pass data around – normally I’d use an event delegate to call back from the engine to the host, but since this is running over AppDomain boundaries events get really tricky and passing a template instance back up into the host over AppDomain boundaries doesn’t work due to serialization issues. So it’s easier to pass the data from the host down into the template using this rather clumsy approach of set and forward. It’s ugly, but it’s something that can be hidden in the host container implementation as I’ve done here. It’s also not something you have to do in every implementation so this is kind of an edge case, but I know I’ll need to pass a bunch of data in some of my applications and this will be the easiest way to do so. Summing Up Hosting the Razor runtime is something I got jazzed up about quite a bit because I have an immediate need for this type of templating/merging/scripting capability in an application I’m working on. I’ve also been using templating in many apps and it’s always been a pain to deal with. The Razor engine makes this whole experience a lot cleaner and more light weight and with these wrappers I can now plug .NET based templating into my code literally with a few lines of code. That’s something to cheer about… I hope some of you will find this useful as well… Resources The examples and code require that you download the Razor runtimes. Projects are for Visual Studio 2010 running on .NET 4.0 Platform Installer 3.0 (install WebMatrix or MVC 3 for Razor Runtimes) Latest Code in Subversion Repository Download Snapshot of the Code Documentation (CHM Help File) © Rick Strahl, West Wind Technologies, 2005-2010Posted in ASP.NET  .NET  

    Read the article

  • Can't capture https traffic with Fiddler and Firefox (works with IE)

    - by Tony_Henrich
    I am trying to debug an asp.net app using Firefox. Firefox tries to connect to https://www.paypal.com and I get "Secure Connection Failed" error (Error code: sec_error_bad_signature). I installed Fiddler's cert in Firefox and the local computer and current user Cert stores. I set up the manual proxy in FF to 127.0.0.1 and port 8888 and for all protocols. Fiddler captures http traffic but gives an error with https traffic. Fiddler captures https from IE fine. What's the problem? (Is FiddlerHook FF extension needed and what setting should it be using: auto, disabled or Force traffic to Fiddler?)

    Read the article

  • Cannot build/create dll in Visual Studio because of admin rights??

    - by Vidar
    I have local admin rights on the PC I am using - but some things are not allowed, e.g. I can't right-click on My Computer and see the properties! Anyway, I was trying to build my solution in Visual Studio - I just added a simple class library with barely any code in it and it won't build. It gives the error:

    C:\WINDOWS\Microsoft.NET\Framework\v3.5\Microsoft.Common.targets(3019,9): error MSB3216: Cannot register assembly "C:\Documents and Settings\fooUser\My Documents\Visual Studio 2008\Projects\FooSolution\Foo.Bar\bin\Debug\Foo.Bar.Utility.dll" - access denied. Please make sure you're running the application as administrator. Access to the registry key 'HKEY_CLASSES_ROOT\Record' is denied.

    I take it I need FULL admin rights?? The project did compile before I added this class library, so I don't understand why the addition of this library has had such a drastic effect now!

    Read the article

  • How to Use Windows’ Advanced Search Features: Everything You Need to Know

    - by Chris Hoffman
    You should never have to hunt down a lost file on modern versions of Windows — just perform a quick search. You don’t even have to wait for a cartoon dog to find your files, like on Windows XP. The Windows search indexer is constantly running in the background to make quick local searches possible. This enables the kind of powerful search features you’d use on Google or Bing — but for your local files.

    Controlling the Indexer

    By default, the Windows search indexer watches everything under your user folder — that’s C:\Users\NAME. It reads all these files, creating an index of their names, contents, and other metadata. Whenever they change, it notices and updates its index. The index allows you to quickly find a file based on the data in the index. For example, if you want to find files that contain the word “beluga,” you can perform a search for “beluga” and you’ll get a very quick response as Windows looks up the word in its search index. If Windows didn’t use an index, you’d have to sit and wait as Windows opened every file on your hard drive, looked to see if the file contained the word “beluga,” and moved on. Most people shouldn’t have to modify this indexing behavior. However, if you store your important files in other folders — maybe you store your important data on a separate partition or drive, such as at D:\Data — you may want to add these folders to your index. You can also choose which types of files you want to index, force Windows to rebuild the index entirely, pause the indexing process so it won’t use any system resources, or move the index to another location to save space on your system drive. To open the Indexing Options window, tap the Windows key on your keyboard, type “index”, and click the Indexing Options shortcut that appears. Use the Modify button to control the folders that Windows indexes or the Advanced button to control other options. To prevent Windows from indexing entirely, click the Modify button and uncheck all the included locations. You could also disable the search indexer entirely from the Programs and Features window.

    Searching for Files

    You can search for files right from your Start menu on Windows 7 or Start screen on Windows 8. Just tap the Windows key and perform a search. If you wanted to find files related to Windows, you could perform a search for “Windows.” Windows would show you files that are named Windows or contain the word Windows. From here, you can just click a file to open it. On Windows 7, files are mixed with other types of search results. On Windows 8 or 8.1, you can choose to search only for files. If you want to perform a search without leaving the desktop in Windows 8.1, press Windows Key + S to open a search sidebar. You can also initiate searches directly from Windows Explorer — that’s File Explorer on Windows 8. Just use the search box at the top-right of the window. Windows will search the location you’ve browsed to. For example, if you’re looking for a file related to Windows and know it’s somewhere in your Documents library, open the Documents library and search for Windows.

    Using Advanced Search Operators

    On Windows 7, you’ll notice that you can add “search filters” from the search box, allowing you to search by size, date modified, file type, authors, and other metadata. On Windows 8, these options are available from the Search Tools tab on the ribbon. These filters allow you to narrow your search results.
If you’re a geek, you can use Windows’ Advanced Query Syntax to perform advanced searches from anywhere, including the Start menu or Start screen. Want to search for “windows,” but only bring up documents that don’t mention Microsoft? Search for “windows -microsoft”. Want to search for all pictures of penguins on your computer, whether they’re PNGs, JPEGs, or any other type of picture file? Search for “penguin kind:picture”. We’ve looked at Windows’ advanced search operators before, so check out our in-depth guide for more information. The Advanced Query Syntax gives you access to options that aren’t available in the graphical interface. Creating Saved Searches Windows allows you to take searches you’ve made and save them as a file. You can then quickly perform the search later by double-clicking the file. The file functions almost like a virtual folder that contains the files you specify. For example, let’s say you wanted to create a saved search that shows you all the new files created in your indexed folders within the last week. You could perform a search for “datecreated:this week”, then click the Save search button on the toolbar or ribbon. You’d have a new virtual folder you could quickly check to see your recent files. One of the best things about Windows search is that it’s available entirely from the keyboard. Just press the Windows key, start typing the name of the file or program you want to open, and press Enter to quickly open it. Windows 8 made this much more obnoxious with its non-unified search, but unified search is finally returning with Windows 8.1.     
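The same index that powers these searches can also be queried programmatically. Purely as an illustration (this is not part of the article, and the provider string and Windows Search SQL dialect used here are assumptions worth verifying against the Windows Search documentation), a small C# sketch that runs the “beluga” example against the local index might look like this:

using System;
using System.Data.OleDb;

class IndexSearchSample
{
    static void Main()
    {
        // Windows Search exposes the system index through an OLE DB provider.
        const string connectionString =
            "Provider=Search.CollatorDSO;Extended Properties='Application=Windows';";

        using (var connection = new OleDbConnection(connectionString))
        {
            connection.Open();

            // FREETEXT matches indexed names and contents, much like typing
            // "beluga" into the Start menu or Start screen search box.
            const string sql =
                "SELECT TOP 10 System.ItemName, System.ItemPathDisplay " +
                "FROM SystemIndex WHERE FREETEXT('beluga')";

            using (var command = new OleDbCommand(sql, connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} ({1})", reader[0], reader[1]);
                }
            }
        }
    }
}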

    Read the article

  • Blending the Sketchflow Action

    - by GeekAgilistMercenary
    Started a new Sketchflow Prototype in Expression Blend recently and documented each of the steps.  This blog entry covers some of those steps, which are the basic elements of any prototype.  I will have more information regarding design, prototype creation, and the process of the initial phases for development in the future.  For now, I hope you enjoy this short walk through.  Also, be sure to check out my last quick entry on Sketchflow. I started off with a Sketchflow Project, just like I did in my previous entry (more specifics in that entry about how to manipulate and build out the Sketchflow Map). Once I created the project I setup the following Sketchflow Map. The CoreNavigation is a ComponentScreen setup solely for the page navigation at the top of the screen.  The XAML markup in case you want to create a Component Screen with the same design is included below. <UserControl xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d" xmlns:i="clr-namespace:System.Windows.Interactivity;assembly=System.Windows.Interactivity" xmlns:pb="clr-namespace:Microsoft.Expression.Prototyping.Behavior;assembly=Microsoft.Expression.Prototyping.Interactivity" x:Class="RapidPrototypeSketchScreens.CoreNavigation" d:DesignWidth="624" d:DesignHeight="49" Height="49" Width="624">   <Grid x:Name="LayoutRoot"> <TextBlock HorizontalAlignment="Stretch" Margin="307,3,0,0" Style="{StaticResource TitleCenter-Sketch}" Text="Aütøchart Scorecards" TextWrapping="Wrap"> <i:Interaction.Triggers> <i:EventTrigger EventName="MouseLeftButtonDown"> <pb:NavigateToScreenAction TargetScreen="RapidPrototypeSketchScreens.Screen_1"/> </i:EventTrigger> </i:Interaction.Triggers> </TextBlock> <Button HorizontalAlignment="Left" Margin="164,8,0,11" Style="{StaticResource Button-Sketch}" Width="144" Content="Scorecard"> <i:Interaction.Triggers> <i:EventTrigger EventName="Click"> <pb:NavigateToScreenAction TargetScreen="RapidPrototypeSketchScreens.Screen_1_2"/> </i:EventTrigger> </i:Interaction.Triggers> </Button> <Button HorizontalAlignment="Left" Margin="8,8,0,11" Style="{StaticResource Button-Sketch}" Width="152" Content="Standard Reports"> <i:Interaction.Triggers> <i:EventTrigger EventName="Click"> <pb:NavigateToScreenAction TargetScreen="RapidPrototypeSketchScreens.Screen_1_1"/> </i:EventTrigger> </i:Interaction.Triggers> </Button> </Grid> </UserControl> Now that the CoreNavigation Component Screen is done I built out each of the others.  In each of those screens I included the CoreNavigation Screen (all those little green lines in the image) as the top navigation.  In order to do that, as I created each of the pages I would hover over the CoreNavigation Object in the Sketchflow Map.  When the utilities drawer (the small menu that pops down under a node when you hover over it) shows click on the third little icon and drag it onto the page node you want a navigation screen on. Once I created all the screens I setup the navigation by opening up each screen and right clicking on the objects that needed to point to somewhere else in the prototype. Once I was done with the main page, my Home Navigation Page, it looked something like this in the Expression Blend Designer. I fleshed out each of the additional screens.  Once I was done I wanted to try out the deployment package.  
To deploy a Sketchflow prototype, merely click on File –> Package SketchFlow Project and a prompt will appear.  In the prompt, enter what you want the package to be called. I like to see the files generated afterwards too, so I checked the box to see that.  When Expression Blend is done generating everything you’ll have a directory like the one shown below, with all the needed files for deployment. Now these files can be copied or moved to any location for viewing.  One can even copy them (such as via FTP) to a server location to share with others.  Once they are deployed and you run the "TestPage.html", the other features of the Sketchflow package are available. In the image below I have tagged a few sections to show the Sketchflow Player features.  At the top left is the navigation, which provides a clearly defined area of movement in a list.  At the center right is the actual prototype application.  I have placed lists of things and made edits.  On the left hand side is the highlight feature, which is available in the Feedback section at the lower left.  On the right hand list I underlined the Autochart with an orange marker, and marked out two list items with a red marker. In the lower left hand side, in the Feedback section, is also an area to type in your feedback.  This can be useful for time-based feedback, when you post this somewhere and want people to provide subsequent follow-up feedback. Overall, lots of great features that enable some fairly rapid prototyping with customers.  Once one is familiar with the steps and parts of these Sketchflow prototype capabilities, it is easy to step through an application without even stopping.  It really is that easy.  So get hold of Expression Blend 3 and get ramped up on Sketchflow; it will pay off in the design phases to do so! Original Entry

    Read the article

  • Guarding against CSRF Attacks in ASP.NET MVC2

    - by srkirkland
Alongside XSS (Cross Site Scripting) and SQL Injection, Cross-site Request Forgery (CSRF) attacks represent the three most common and dangerous vulnerabilities to common web applications today. CSRF attacks are probably the least well known but they are relatively easy to exploit and extremely and increasingly dangerous. For more information on CSRF attacks, see these posts by Phil Haack and Steve Sanderson. The recognized solution for preventing CSRF attacks is to put a user-specific token as a hidden field inside your forms, then check that the right value was submitted. It's best to use a random value which you’ve stored in the visitor’s Session collection or in a Cookie (so an attacker can't guess the value). ASP.NET MVC to the rescue ASP.NET MVC provides an HTMLHelper called AntiForgeryToken(). When you call <%= Html.AntiForgeryToken() %> in a form on your page you will get a hidden input and a Cookie with a random string assigned. Next, on your target Action you need to include [ValidateAntiForgeryToken], which handles the verification that the correct token was supplied. Good, but we can do better Using the AntiForgeryToken is actually quite an elegant solution, but adding [ValidateAntiForgeryToken] on all of your POST methods is not very DRY, and, worse, can be easily forgotten. Let's see if we can make this easier on the programmer by moving from an "Opt-In" model of protection to an "Opt-Out" model. Using AntiForgeryToken by default In order to mandate the use of the AntiForgeryToken, we're going to create an ActionFilterAttribute which will do the anti-forgery validation on every POST request. First, we need to create a way to Opt-Out of this behavior, so let's create a quick action filter called BypassAntiForgeryToken: [AttributeUsage(AttributeTargets.Method, AllowMultiple=false)] public class BypassAntiForgeryTokenAttribute : ActionFilterAttribute { } Now we are ready to implement the main action filter which will force anti forgery validation on all post actions within any class it is defined on: [AttributeUsage(AttributeTargets.Class, AllowMultiple = false)] public class UseAntiForgeryTokenOnPostByDefault : ActionFilterAttribute { public override void OnActionExecuting(ActionExecutingContext filterContext) { if (ShouldValidateAntiForgeryTokenManually(filterContext)) { var authorizationContext = new AuthorizationContext(filterContext.Controller.ControllerContext);   //Use the authorization of the anti forgery token, //which can't be inherited from because it is sealed new ValidateAntiForgeryTokenAttribute().OnAuthorization(authorizationContext); }   base.OnActionExecuting(filterContext); }   /// <summary> /// We should validate the anti forgery token manually if the following criteria are met: /// 1. The http method must be POST /// 2. There is not an existing [ValidateAntiForgeryToken] attribute on the action /// 3. There is no [BypassAntiForgeryToken] attribute on the action /// </summary> private static bool ShouldValidateAntiForgeryTokenManually(ActionExecutingContext filterContext) { var httpMethod = filterContext.HttpContext.Request.HttpMethod;   //1. The http method must be POST if (httpMethod != "POST") return false;   // 2. There is not an existing anti forgery token attribute on the action var antiForgeryAttributes = filterContext.ActionDescriptor.GetCustomAttributes(typeof(ValidateAntiForgeryTokenAttribute), false);   if (antiForgeryAttributes.Length > 0) return false;   // 3.
There is no [BypassAntiForgeryToken] attribute on the action var ignoreAntiForgeryAttributes = filterContext.ActionDescriptor.GetCustomAttributes(typeof(BypassAntiForgeryTokenAttribute), false);   if (ignoreAntiForgeryAttributes.Length > 0) return false;   return true; } } The code above is pretty straight forward -- first we check to make sure this is a POST request, then we make sure there aren't any overriding *AntiForgeryTokenAttributes on the action being executed. If we have a candidate then we call the ValidateAntiForgeryTokenAttribute class directly and execute OnAuthorization() on the current authorization context. Now on our base controller, you could use this new attribute to start protecting your site from CSRF vulnerabilities. [UseAntiForgeryTokenOnPostByDefault] public class ApplicationController : System.Web.Mvc.Controller { }   //Then for all of your controllers public class HomeController : ApplicationController {} What we accomplished If your base controller has the new default anti-forgery token attribute on it, when you don't use <%= Html.AntiForgeryToken() %> in a form (or of course when an attacker doesn't supply one), the POST action will throw the descriptive error message "A required anti-forgery token was not supplied or was invalid". Attack foiled! In summary, I think having an anti-CSRF policy by default is an effective way to protect your websites, and it turns out it is pretty easy to accomplish as well. Enjoy!
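As a quick illustration of the opt-out side (a hedged sketch, not from the original post; the controller and action names are invented), an individual action that legitimately receives token-less POSTs can be excluded while everything else stays protected:

using System.Web.Mvc;

public class WebhookController : ApplicationController
{
    // Ordinary POST actions inherit the anti-forgery check from the
    // [UseAntiForgeryTokenOnPostByDefault] attribute on ApplicationController.
    [HttpPost]
    public ActionResult UpdateProfile()
    {
        // A missing or invalid token throws before this code runs.
        return View();
    }

    // A third-party callback that cannot include the hidden form token
    // opts out explicitly with the attribute defined above.
    [HttpPost]
    [BypassAntiForgeryToken]
    public ActionResult ExternalCallback()
    {
        return new EmptyResult();
    }
}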

    Read the article

  • SAP DCOM Connector on Windows Server 2008

    - by Michael
Does anybody know if it is possible to use the "old" SAP DCOM Connector on Windows Server 2008? I want to migrate an old ASP web solution with a DCOM connection to SAP from Windows 2000 Server to Windows Server 2008. When I try to install the DCOM Connector I get the error message: "Setup could not find ActivX(R) Data Objects verion 2.5 or higher on your computer...." That is strange, because ADO is there under C:\Program Files\Common Files\System\ado! Thanks in advance for your help!

    Read the article

  • MSDeploy: error using runCommand provider to call remote .cmd file (timeout)

    - by mjw.
    We are running into an error when trying to use the MSDeploy "runCommand" provider to execute a .cmd file on a remote machine. The expected run time should be about 10 seconds, but MSDeploy only runs it for about 2-3 seconds, after which time error details are returned. Here is the complete MSDeploy "runCommand" command line text I am using: msdeploy.exe -verb:sync -source:runCommand="D:\web deploy tester\test_cmd.cmd",dontUseCommandExe=false,waitAttempts=5,waitInterval=1000 -dest:auto,computername=http://test-machine:89/MsDeployAgentService/,userName=aaa,password=bbb Here are the error details returned: Error 'Error: (4/21/2010 12:19:25 PM) An error occurred when the request was processed on the remote computer. Error: The process 'C:\WINDOWS\system32\cmd.exe' (command line '/c "D:\web deploy tester\test_cmd.cmd"') was terminated because it exceeded the wait time. Error count: 1. ' occurred in call to RunCommand Any ideas as to why this is happening and how to resolve it?
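One observation (a hedged suggestion rather than a confirmed fix): with waitAttempts=5 and waitInterval=1000 the remote agent only waits roughly five seconds in total, which is shorter than the expected ten-second run time, so the script may simply be killed before it finishes. Raising the wait so it comfortably exceeds the run time would look like the variation below; verify the exact semantics of these settings against the Web Deploy runCommand documentation:

msdeploy.exe -verb:sync -source:runCommand="D:\web deploy tester\test_cmd.cmd",dontUseCommandExe=false,waitAttempts=5,waitInterval=4000 -dest:auto,computername=http://test-machine:89/MsDeployAgentService/,userName=aaa,password=bbb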

    Read the article

  • Adding a self-signed certificate to iphone Simulator?

    - by jr
    I have a self-signed certificate at the endpoint of my API. I'm trying to test some things using the simulator but am getting "untrusted server certificate". I have tried to use safari on the simulator to download the .crt file, but that doesn't seem to work. Where does iPhone Simulator get its keychain from? How can I add a trusted certificate so my application will work? UPDATE I got it to work by creating a CA and then adding a CA certificate using the iPhone provisioning tool. Then I was able to have a certificate signed by that CA certificate on the API server and the NSConnection just worked. I was not able to get it to work using a self-signed certificate for some reason. I need to re-attempt this using the provisioning software. My real question is how do I get this to work on the simulator? I would think that the simulator uses the keychain of the actual computer.

    Read the article

  • Associating File Types with AutoVue Desktop Deployment

    - by [email protected]
    Windows users take for granted that when they double click on a document or design, that it will open up in its application automatically. One of the questions I'm commonly asked is "How can I get the same behavior with AutoVue Desktop Deployment?". It's pretty easy, but there are a few tricks to doing it. Step 1: Download new jvue_direct.bat and icon The first thing you'll need to do is download a slightly modified version of jvue_direct.bat. You can find it here (Document 1075784.1) on Oracle's Support Portal. You also want to download the AV.ico file. This is the icon that will be used for all file types associated with AutoVue. Place both of these files in your <AutoVueInstallDirectory>\bin directory. Step 2: Associate File Types With AutoVue There are two ways to do this. You can do this through the Windows user interface, or you can set up a batch file to do this. Associating File Types Through Windows The way most people associate file types to an application is using the Windows user interface. You've probably tried to open a file type that Windows doesn't recognize and seen this window pop up: Although you can use this dialog to associate that file type with AutoVue, I don't recommend it. I much prefer using a batch file to associate file types with AutoVue. Associating File Types Using A Batch File There are a few good reasons to associate file types using a batch file instead of using the pop-up dialog method: If you have several file types to associate with AutoVue, it's much easier to use a batch file to do them all at once. Doing it through the Windows user interface requires having files of each type available. Using a batch file doesn't require having the files you're associating. Associating file types through the dialog may work well for one person, but what if you're an administrator doing an enterprise wide deployment of AutoVue Desktop Deployment for several hundred users? You don't want to do this manually for each user. You can have one simple batch file that's run on each user's PC to set up all the file types. You can easily associate an icon with the file types you're opening with AutoVue. To use the batch file method follow these steps: Create a file called filetype.bat using a text editor and copy and paste the following into it: @assoc .dwg=AVFile @assoc .jpg=AVFile @assoc .doc=AVFile @ftype AVFile="%~dp0jvue_direct.bat" "%%1" @reg add HKEY_CLASSES_ROOT\AVFile\DefaultIcon /v "" /f /d "%~dp0AV.ico" Change the lines starting with @assoc. Each of these lines associates a file extension with AutoVue. You can have as many @assoc lines as you want. Save this file in your <AutoVueInstallDirectory>\bin directory. Double click this file, or run it from a command prompt. Restart Windows to get the icons to show up. How Does This Work? The first three lines are creating a file type called AVFile. We are associating the extensions .dwg, .jpg, and .doc with this file type. You will want to change these lines when creating your own batch file. For example, to associate Microstation designs, which have extension .dgn, you should delete the @assoc lines above and add the line: @assoc .dgn=AVfile The line beginning with @ftype tells Windows that all AVFile type files should be opened using AutoVue Desktop Deployment. The final line associates the AutoVue icon with these file types. You may need to restart Windows to see the new icons. 
Warning: One Size Doesn't Fit All When deciding which file types should be associated with AutoVue, remember that there are different types of users using it. Your engineers may be pretty surprised to find that after installing AutoVue, double clicking their .dwg file opens up AutoVue instead of AutoCAD. If you have more than one type of AutoVue user, make sure you've considered what file types each user group will and will not want to be associated with AutoVue. If necessary, create a separate file association batch file for each user type. So that's it. In two simple steps you can double click your favorite designs and have them open automatically in AutoVue Desktop Deployment. I'd love to hear how you are using AutoVue Desktop Deployment. What other deployment tips would you be interested in learning about?
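As a hedged illustration of that last suggestion (the AVFile2 name and the extension list below are invented for the example, not part of the original post), a second batch file for a non-CAD user group could follow exactly the same pattern while leaving .dwg alone:

@assoc .jpg=AVFile2
@assoc .tif=AVFile2
@assoc .pdf=AVFile2
@ftype AVFile2="%~dp0jvue_direct.bat" "%%1"
@reg add HKEY_CLASSES_ROOT\AVFile2\DefaultIcon /v "" /f /d "%~dp0AV.ico"

Run it from the same <AutoVueInstallDirectory>\bin directory so the %~dp0 references still resolve to jvue_direct.bat and AV.ico.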

    Read the article

  • "device-mapper resume ioctl failed" when run a instance

    - by user1490377
I installed Ubuntu 12.04 Server and OpenStack on two computers. When I launch an image, the instance can't run and shows an error state! nova-compute.log details: 2012-06-24 12:02:00 DEBUG nova.utils [req-71fdca27-5f93-438e-a4be-ddc54f698171 c737b66b2102415f817ca50b9649fd8f 5b1da4eaee3643919a230efc06473720] Unexpected error while running command. Command: sudo nova-rootwrap kpartx -a /dev/nbd15 Exit code: 1 Stdout: '' Stderr: 'device-mapper: resume ioctl failed: Invalid argument\ncreate/reload failed on nbd15p1\n' from (pid=1267) trycmd /usr/lib/python2.7/dist-packages/nova/utils.py:277 2012-06-24 12:02:00 DEBUG nova.utils [req-71fdca27-5f93-438e-a4be-ddc54f698171 c737b66b2102415f817ca50b9649fd8f 5b1da4eaee3643919a230efc06473720] Running cmd (subprocess): sudo nova-rootwrap qemu-nbd -d /dev/nbd15 from (pid=1267) execute /usr/lib/python2.7/dist-packages/nova/utils.py:219 2012-06-24 12:02:02 DEBUG nova.virt.disk.api [req-71fdca27-5f93-438e-a4be-ddc54f698171 c737b66b2102415f817ca50b9649fd8f 5b1da4eaee3643919a230efc06473720] Failed to map partitions: Unexpected error while running command. Command: sudo nova-rootwrap kpartx -a /dev/nbd15 Exit code: 1 Stdout: '' Stderr: 'device-mapper: resume ioctl failed: Invalid argument\ncreate/reload failed on nbd15p1\n' from (pid=1267) mount /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:205

    Read the article

  • Visual Studio has insufficient privileges to debug this process. To debug this process, Visual Studi

    - by ritu-kothari
I have developed a Windows service and this service is running on my local computer under my account. When I try to debug this service by attaching to it as a process in Visual Studio 2008 I get “Unable to attach to the process. Visual Studio has insufficient privileges to debug this process. To debug this process, Visual Studio must be run as an administrator.” I am logged in to my system as an administrator, so when VS 2008 is launched it runs as administrator; I'm not sure why I get this error. I am using Windows XP Pro SP3.

    Read the article

  • Cannot start TFS Build service: Error 1227

    - by Joni
When I try to start the TFS 2008 Build service on port 9191 I get the following error message: Windows could not start the Visual Studio Team Foundation Build service on Local Computer. Error 1227: The network transport endpoint already has an address associated with it. If I use another port it works, but I need it to be the default, 9191. I tried using the following commands: wcfhttpconfig.exe free 9191 wcfhttpconfig.exe reserve Domain\ServiceAccount 9191 Both commands succeed, but the service does not start. I would appreciate any help!
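Error 1227 generally means something else already has that endpoint bound, so a quick diagnostic (a hedged suggestion, not from the original question; replace 1234 with whatever PID netstat reports) is to see what is currently listening on 9191 before starting the service:

netstat -ano | findstr :9191
tasklist /fi "PID eq 1234"

If another process or a stale reservation shows up, freeing that endpoint first may let the build service claim the default port.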

    Read the article

  • Visual studio 2010 remote debugging is very slow (across domains, over vpn)

    - by alex
Overall debugging works, but each step through code takes dozens of seconds. I've already closed all additional windows like stack trace, watches, and autos, and deleted all breakpoints. The server and dev machine are located in different domains, so I set up a local user on both, with matching passwords. The remote debugger is running as a service. Looking at the security log, I found quite a lot of entries about the remote debugging account logging in (a record about every minute). Any suggestions on how I can speed up remote debugging? Dev computer: quad core, 8 GB mem, Win 7 x64, Visual Studio 2010 Ultimate. Target server: ASP.NET website, 2x dual-core Xeon, 2 GB mem, Remote Debugger 2010. Communication channel: VPN, 5 Mbit, latency about 20 ms (it seems that debugging never uses more than 20 KB/s).

    Read the article

  • IIS6 - 301 permanent redirect response accessing an asp.net web site

    - by omatrot
    I've installed an ASP.NET 2.0 virtual directory application on the default web site in IIS6 (Windows 2003 computer) with a Visual Studio generated web deployment setup. Unfortunately, following a successfull installation, I'm unable to access our web site. I get a permanent redirect from IIS as shown in the log below : 2010-02-25 15:32:41 W3SVC1 127.0.0.1 GET /naweb3 - 80 - 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+5.2;+Trident/4.0;+.NET+CLR+1.1.4322;+.NET+CLR+2.0.50727;+.NET+CLR+3.0.04506.30;+.NET+CLR+3.0.4506.2152;+.NET+CLR+3.5.30729) 301 0 0 This is the first time I have this kind of problem. Before I go with completely resetting IIS6 on this machine, I'm wondering what could cause this problem. The virtual directory configuration seems fine to me, no redirection at all at this level. Any help appreciated. TIA.

    Read the article

  • FSharp.Compiler.CodeDom for VS2008 and VS2010 side-by-side

    - by SztupY
I'm using FSharp.Compiler.CodeDom to dynamically create F# classes. The problem is that I have both VS2008 and VS2010 on my computer side-by-side (they work fine), and using F# in this configuration is buggy at best: If I don't install InstallFSharp.msi, then under VS2008 the built classes complain about not finding FSharp.Core (even if they're referenced). If I install InstallFSharp.msi, then under VS2008 the built classes will use the F# built for VS2010, and will throw a binary-incompatibility exception, because it will load the .NET 4 variant: FSC: error FS0219: The referenced or default base CLI library 'mscorlib' is binary-incompatible with the referenced F# core library 'C:\Program Files (x86)\Microsoft F#\v4.0\FSharp.Core.dll'. Consider recompiling the library or making an explicit reference to a version of this library that matches the CLI version you are using. If I replace the F# found at the previous location with the separately installed DLLs, then of course VS2010 will complain about binary incompatibility. Am I overlooking something, or will they simply not work in a shared environment like this? This might mean real problems when I deploy the applications. Thanks

    Read the article

  • MVC 2 Editor Template for Radio Buttons

    - by Steve Michelotti
A while back I blogged about how to create an HTML Helper to produce a radio button list.  In that post, my HTML helper was “wrapping” the FluentHtml library from MvcContrib to produce the following html output (given an IEnumerable list containing the items “Foo” and “Bar”): 1: <div> 2: <input id="Name_Foo" name="Name" type="radio" value="Foo" /><label for="Name_Foo" id="Name_Foo_Label">Foo</label> 3: <input id="Name_Bar" name="Name" type="radio" value="Bar" /><label for="Name_Bar" id="Name_Bar_Label">Bar</label> 4: </div> With the release of MVC 2, we now have editor templates we can use that rely on metadata to allow us to customize our views appropriately.  For example, for the radio buttons above, we want the “id” attribute to be differentiated and unique and we want the “name” attribute to be the same across radio buttons so the buttons will be grouped together and so model binding will work appropriately. We also want the “for” attribute in the <label> element to be set to correctly point to the id of the corresponding radio button.  The default behavior of the RadioButtonFor() method that comes OOTB with MVC produces the same value for the “id” and “name” attributes so this isn’t exactly what I want out of the box if I’m trying to produce the HTML markup above. If we use an EditorTemplate, the first gotcha that we run into is that, by default, the templates just work on your view model’s property. But in this case, we *also* want the list of items to populate all the radio buttons. It turns out that the EditorFor() methods do give you a way to pass in additional data. There is an overload of the EditorFor() method where the last parameter allows you to pass an anonymous object for “extra” data that you can use in your view – it gets put on the view data dictionary: 1: <%: Html.EditorFor(m => m.Name, "RadioButtonList", new { selectList = new SelectList(new[] { "Foo", "Bar" }) })%> Now we can create a file called RadioButtonList.ascx that looks like this: 1: <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl" %> 2: <% 3: var list = this.ViewData["selectList"] as SelectList; 4: %> 5: <div> 6: <% foreach (var item in list) { 7: var radioId = ViewData.TemplateInfo.GetFullHtmlFieldId(item.Value); 8: var checkedAttr = item.Selected ? "checked=\"checked\"" : string.Empty; 9: %> 10: <input type="radio" id="<%: radioId %>" name="<%: ViewData.TemplateInfo.HtmlFieldPrefix %>" value="<%: item.Value %>" <%: checkedAttr %>/> 11: <label for="<%: radioId %>"><%: item.Text %></label> 12: <% } %> 13: </div> There are several things to note about the code above. First, you can see in line #3, it’s getting the SelectList out of the view data dictionary. Then on line #7 it uses the GetFullHtmlFieldId() method from the TemplateInfo class to ensure we get unique IDs. We pass the Value to this method so that it will produce IDs like “Name_Foo” and “Name_Bar” rather than just “Name” which is our property name. However, for the “name” attribute (on line #10) we can just use the normal HtmlFieldPrefix property so that we ensure all radio buttons have the same name which corresponds to the view model’s property name. We also get to leverage the fact that a SelectListItem has a Boolean Selected property so we can set the checkedAttr variable on line #8 and use it on line #10. Finally, it’s trivial to set the correct “for” attribute for the <label> on line #11 since we already produced that value.
Because the TemplateInfo class provides all the metadata for our view, we’re able to produce this view that is widely re-usable across our application. In fact, we can create a couple of HTML helpers to better encapsulate this call and make it more user friendly: 1: public static MvcHtmlString RadioButtonList<TModel, TProperty>(this HtmlHelper<TModel> htmlHelper, Expression<Func<TModel, TProperty>> expression, params string[] items) 2: { 3: return htmlHelper.RadioButtonList(expression, new SelectList(items)); 4: } 5:   6: public static MvcHtmlString RadioButtonList<TModel, TProperty>(this HtmlHelper<TModel> htmlHelper, Expression<Func<TModel, TProperty>> expression, IEnumerable<SelectListItem> items) 7: { 8: var func = expression.Compile(); 9: var result = func(htmlHelper.ViewData.Model); 10: var list = new SelectList(items, "Value", "Text", result); 11: return htmlHelper.EditorFor(expression, "RadioButtonList", new { selectList = list }); 12: } This allows us to simplify the call like this: 1: <%: Html.RadioButtonList(m => m.Name, "Foo", "Bar" ) %> In that example, the values for the radio button are hard-coded and passed in directly. But if you had a view model that contained a property for the collection of items you could call the second overload like this: 1: <%: Html.RadioButtonList(m => m.Name, Model.FooBarList ) %> The Editor templates introduced in MVC 2 definitely allow for much more flexible views/editors than previously available. By knowing about the features you have available to you with the TemplateInfo class, you can take these concepts and customize your editors with extreme flexibility and re-usability.
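To make that last call concrete, here is a small sketch (an assumption for illustration, not from the original post; the class, controller, and property names are invented) of a view model exposing the FooBarList property used above:

using System.Collections.Generic;
using System.Web.Mvc;

public class PersonEditViewModel
{
    // The property the radio buttons bind to; the selected value round-trips on POST.
    public string Name { get; set; }

    // The options rendered as individual radio buttons.
    public IEnumerable<SelectListItem> FooBarList { get; set; }
}

public class PersonController : Controller
{
    public ActionResult Edit()
    {
        var model = new PersonEditViewModel
        {
            Name = "Foo",
            FooBarList = new List<SelectListItem>
            {
                new SelectListItem { Text = "Foo", Value = "Foo" },
                new SelectListItem { Text = "Bar", Value = "Bar" }
            }
        };
        return View(model);
    }
}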

    Read the article

  • HttpWebRequest and BindIPEndPointDelegate getting socket exception

    - by Evgeny Gavrin
I've got C# code running on a computer with multiple network interfaces, and the following code to select an IP address for an HttpWebRequest to bind its ServicePoint: HttpWebRequest request = (HttpWebRequest)WebRequest.Create(remoteFilename); request.KeepAlive = false; request.ServicePoint.BindIPEndPointDelegate = delegate( ServicePoint servicePoint, IPEndPoint remoteEndPoint, int retryCount) { return new IPEndPoint(IPAddress.Parse(ipAddr), 0); }; But it works only for one of the available network interfaces. Trying to access the remote server through the others throws an exception: System.Net.WebException: Unable to connect to the remote server --- System.Net.Sockets.SocketException: A socket operation was attempted to an unreachable network How can it be solved?

    Read the article

  • Oracle OpenWorld Preview: Real World Perspectives from Oracle WebCenter Customers

    - by Christie Flanagan
If you frequent the Oracle WebCenter blog you’ve probably read a lot about the customer experience revolution over the last few months.  An important aspect of the customer experience revolution is the increasing role that peers play in influencing how others perceive a product, brand or solution, simply by sharing their own, real-world experiences.  Think about it, who do you trust more -- marketers and sales people pitching polished messages or peers with similar roles and similar challenges to the ones you face in your business every day? With this spirit in mind, this polished marketer personally invites you to hear directly from Oracle WebCenter customers about their real-life experiences during our customer panel sessions at Oracle OpenWorld next week.  If you’re currently using WebCenter, thinking about it, or just want to find out more about best practices in social business, next-generation portals, enterprise content management or web experience management, be sure to attend these sessions: CON8899 - Becoming a Social Business: Stories from the Front Lines of Change Wednesday, Oct 3, 11:45 AM - 12:45 PM - Moscone West - 3000 Priscilla Hancock - Vice President/CIO, University of Louisville Kellie Christensen - Director of Information Technology, Banner Engineering What does it really mean to be a social business? How can you change your organization to embrace social approaches? What pitfalls do you need to avoid? In this lively panel discussion, customer and industry thought leaders in social business explore these topics and more as they share their stories of the good, the bad, and the ugly that can happen when embracing social methods and technologies to improve business success. Using moderated questions and open Q&A from the audience, the panel discusses vital topics such as the critical factors for success, the major issues to avoid, how to gain senior executive support for social efforts, how to handle undesired behavior, and how to measure business impact. This session will take a thought-provoking look at becoming a social business from the inside. CON8900 - Building Next-Generation Portals: An Interactive Customer Panel Discussion Wednesday, Oct 3, 5:00 PM - 6:00 PM - Moscone West - 3000 Roberts Wayne - Director, IT, Canadian Partnership Against Cancer Mike Beattie - VP Application Development, Aramark Uniform Services John Chen - Utilities Services Manager 6, Los Angeles Department of Water & Power Jörg Modlmayr - Head of Product Management, Siemens AG Social and collaborative technologies have changed how people interact, learn, and collaborate, and providing a modern, social Web presence is imperative to remain competitive in today’s market. Can your business benefit from a more collaborative and interactive portal environment for employees, customers, and partners?
Attend this session to hear from Oracle WebCenter Portal customers as they share their strategies and best practices for providing users with a modern experience that adapts to their needs and includes personalized access to content in context. The panel also addresses how customers have benefited from creating next-generation portals by migrating from older portal technologies to Oracle WebCenter Portal. CON8898 - Land Mines, Potholes, and Dirt Roads: Navigating the Way to ECM Nirvana Thursday, Oct 4, 12:45 PM - 1:45 PM - Moscone West - 3001 Stephen Madsen - Senior Management Consultant, Alberta Agriculture and Rural Development Himanshu Parikh - Sr. Director, Enterprise Architecture & Middleware, Ross Stores, Inc. Ten years ago, people were predicting that by this time in history, we’d be some kind of utopian paperless society. As we all know, we're not there yet, but are we getting closer? What is keeping companies from driving down the road to enterprise content management bliss? Most people understand that using ECM as a central platform enables organizations to expedite document-centric processes, but most business processes in organizations are still heavily paper-based. Many of these processes could be automated and improved with an ECM platform infrastructure. In this panel discussion, you’ll hear from Oracle WebCenter customers that have already solved some of these challenges as they share their strategies for success and roads to avoid along your journey. CON8897 - Using Web Experience Management to Drive Online Marketing Success Thursday, Oct 4, 2:15 PM - 3:15 PM - Moscone West - 3001 Blane Nelson - Chief Architect, Ancestry.com Mike Remedios - CIO, Arbonne Caitlin Scanlon - Product Manager, Monster Worldwide Every year, the online channel becomes more imperative for driving organizational top-line revenue, but for many companies, mastering how to best market their products and services in a fast-evolving online world with high customer expectations for personalized experiences can be a complex proposition. Come to this panel discussion, and hear directly from customers on how they are succeeding today by using Web experience management to drive marketing success, using capabilities such as targeting and optimization, user-generated content, mobile site publishing, and site visitor personalization to deliver engaging online experiences. Your Handy Guide to WebCenter at Oracle OpenWorld Want a quick and easy guide to all the keynotes, demos, hands-on labs and WebCenter sessions you definitely don't want to miss at Oracle OpenWorld? Download this handy guide, Focus on WebCenter. More helpful links: * Oracle OpenWorld * Oracle Customer Experience Summit @ OpenWorld * Oracle OpenWorld on Facebook * Oracle OpenWorld on Twitter * Oracle OpenWorld on LinkedIn * Oracle OpenWorld Blog

    Read the article

  • [Windows 8] Update TextBox’s binding on TextChanged

    - by Benjamin Roux
Since UpdateSourceTrigger is not available in WinRT we cannot update the text binding of a TextBox at will (or at least not easily), especially when using MVVM (I surely don’t want to write code-behind to do that in each of my apps!). Since this kind of demand is frequent (for example, to disable a button if the TextBox is empty) I decided to create some attached properties to simulate this missing behavior. namespace Indeed.Controls { public static class TextBoxEx { public static string GetRealTimeText(TextBox obj) { return (string)obj.GetValue(RealTimeTextProperty); } public static void SetRealTimeText(TextBox obj, string value) { obj.SetValue(RealTimeTextProperty, value); } public static readonly DependencyProperty RealTimeTextProperty = DependencyProperty.RegisterAttached("RealTimeText", typeof(string), typeof(TextBoxEx), null); public static bool GetIsAutoUpdate(TextBox obj) { return (bool)obj.GetValue(IsAutoUpdateProperty); } public static void SetIsAutoUpdate(TextBox obj, bool value) { obj.SetValue(IsAutoUpdateProperty, value); } public static readonly DependencyProperty IsAutoUpdateProperty = DependencyProperty.RegisterAttached("IsAutoUpdate", typeof(bool), typeof(TextBoxEx), new PropertyMetadata(false, OnIsAutoUpdateChanged)); private static void OnIsAutoUpdateChanged(DependencyObject sender, DependencyPropertyChangedEventArgs e) { var value = (bool)e.NewValue; var textbox = (TextBox)sender; if (value) { Observable.FromEventPattern<TextChangedEventHandler, TextChangedEventArgs>( o => textbox.TextChanged += o, o => textbox.TextChanged -= o) .Do(_ => textbox.SetValue(TextBoxEx.RealTimeTextProperty, textbox.Text)) .Subscribe(); } } } } The code is composed of two attached properties. The first one, “RealTimeText”, reflects the text in real time (updated after each TextChanged event). The second one is only used to enable the functionality. To subscribe to the TextChanged event I used Reactive Extensions (the Rx-Metro package on NuGet).
If you’re not familiar with this framework, just replace that code with a simple: textbox.TextChanged += (o, e) => textbox.SetValue(TextBoxEx.RealTimeTextProperty, textbox.Text); To use these attached properties, it’s fairly simple: <TextBox Text="{Binding Path=MyProperty, Mode=TwoWay}" ic:TextBoxEx.IsAutoUpdate="True" ic:TextBoxEx.RealTimeText="{Binding Path=MyProperty, Mode=TwoWay}" /> Just make sure to create a binding (in TwoWay mode) for both Text and RealTimeText. Hope this helps!
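For context, here is a minimal hedged sketch of the view model side (the class name and the CanSubmit helper are invented for illustration; MyProperty matches the binding in the XAML above). Because RealTimeText pushes every keystroke into MyProperty, anything derived from it, such as a button's enabled state, can react immediately:

using System.ComponentModel;

public class MainViewModel : INotifyPropertyChanged
{
    private string myProperty;

    // Updated on every TextChanged via the RealTimeText attached property.
    public string MyProperty
    {
        get { return myProperty; }
        set
        {
            if (myProperty == value) return;
            myProperty = value;
            OnPropertyChanged("MyProperty");
            OnPropertyChanged("CanSubmit");
        }
    }

    // Example consumer: a Button's IsEnabled could bind to this.
    public bool CanSubmit
    {
        get { return !string.IsNullOrEmpty(myProperty); }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null) handler(this, new PropertyChangedEventArgs(propertyName));
    }
}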

    Read the article

  • CodePlex Daily Summary for Thursday, November 18, 2010

    CodePlex Daily Summary for Thursday, November 18, 2010Popular ReleasesSitefinity Migration Tool: Sitefinity Migration Tool 0.2 Alpha: - Improvements for the Sitefinity RC releaseMiniTwitter: 1.57: MiniTwitter 1.57 ???? ?? ?????????????????? ?? User Streams ????????????????????? ???????????????·??????·???????VFPX: VFP2C32 2.0.0.7: fixed a bug in AAverage - NULL values in the array corrupted the result removed limitation in ASum, AMin, AMax, AAverage - the functions were limited to 65000 elements, now they're limited to 65000 rows ASplitStr now returns a 1 element array with an empty string when an empty string is passed (behaves more like ALINES) internal code cleanup and optimization: optimized FoxArray class - results in a speedup of 10-20% in many functions which return the result in an array - like AProcesses...Microsoft SQL Server Product Samples: Database: AdventureWorks 2008R2 SR1: Sample Databases for Microsoft SQL Server 2008R2 (SR1)This release is dedicated to the sample databases that ship for Microsoft SQL Server 2008R2. See Database Prerequisites for SQL Server 2008R2 for feature configurations required for installing the sample databases. See Installing SQL Server 2008R2 Databases for step by step installation instructions. The SR1 release contains minor bug fixes to the installer used to create the sample databases. There are no changes to the databases them...VidCoder: 0.7.2: Fixed duplicated subtitles when running multiple encodes off of the same title.Razor Templating Engine: Razor Template Engine v1.1: Release 1.1 Changes: ADDED: Signed assemblies with strong name to allow assemblies to be referenced by other strongly-named assemblies. FIX: Filter out dynamic assemblies which causes failures in template compilation. FIX: Changed ASCII to UTF8 encoding to support UTF-8 encoded string templates. FIX: Corrected implementation of TemplateBase adding ITemplate interface.Prism Training Kit: Prism Training Kit - 1.1: This is an updated version of the Prism training Kit that targets Prism 4.0 and fixes the bugs reported in the version 1.0. This release consists of a Training Kit with Labs on the following topics Modularity Dependency Injection Bootstrapper UI Composition Communication Note: Take into account that this is a Beta version. If you find any bugs please report them in the Issue Tracker PrerequisitesVisual Studio 2010 Microsoft Word 2007/2010 Microsoft Silverlight 4 Microsoft S...Craig's Utility Library: Craig's Utility Library Code 2.0: This update contains a number of changes, added functionality, and bug fixes: Added transaction support to SQLHelper. Added linked/embedded resource ability to EmailSender. Updated List to take into account new functions. Added better support for MAC address in WMI classes. Fixed Parsing in Reflection class when dealing with sub classes. Fixed bug in SQLHelper when replacing the Command that is a select after doing a select. Fixed issue in SQL Server helper with regard to generati...MFCMAPI: November 2010 Release: Build: 6.0.0.1023 Full release notes at SGriffin's blog. If you just want to run the tool, get the executable. If you want to debug it, get the symbol file and the source. The 64 bit build will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit build, regardless of the operating system. 
Facebook BadgeDotNetNuke® Community Edition: 05.06.00: Major HighlightsAdded automatic portal alias creation for single portal installs Updated the file manager upload page to allow user to upload multiple files without returning to the file manager page. Fixed issue with Event Log Email Notifications. Fixed issue where Telerik HTML Editor was unable to upload files to secure or database folder. Fixed issue where registration page is not set correctly during an upgrade. Fixed issue where Sendmail stripped HTML and Links from emails...mVu Mobile Viewer: mVu Mobile Viewer 0.7.10.0: Tube8 fix.EPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.8.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the serverNew Features Improved chart support Different chart-types series on the same chart Support for secondary axis and a lot of new properties Better styling Encryption and Workbook protection Table support Import csv files Array formulas ...and a lot of bugfixesAutoLoL: AutoLoL v1.4.2: Added support for more clients (French and Russian) Settings are now stored sepperatly for each user on a computer Auto Login is much faster now Auto Login detects and handles caps lock state properly nowTailspinSpyworks - WebForms Sample Application: TailspinSpyworks-v0.9: Contains a number of bug fixes and additional tutorial steps as well as complete database implementation details.ASP.NET MVC Project Awesome (rich jQuery AJAX helpers): 1.3 and demos: a library with mvc helpers and a demo project that demonstrates an awesome way of doing asp.net mvc. tested on mozilla, safari, chrome, opera, ie 9b/8/7/6 new stuff in 1.3 Autocomplete helper Autocomplete and AjaxDropdown can have parentId and be filled with data depending on the value of the parent PopupForm besides Content("ok") on success can also return Json(data) and use 'data' in a client side function Awesome demo improved (cruder, builder, added service layer)Nearforums - ASP.NET MVC forum engine: Nearforums v4.1: Version 4.1 of the ASP.NET MVC forum engine, with great improvements: TinyMCE added as visual editor for messages (removed CKEditor). Integrated AntiSamy for cleaner html user post and add more prevention to potential injections. Admin status page: a page for the site admin to check the current status of the configuration / db / etc. View Roadmap for more details.UltimateJB: UltimateJB 2.01 PL3 KakaRoto + PSNYes by EvilSperm: Voici une version attendu avec impatience pour beaucoup : - La Version PSNYes pour pouvoir jouer sur le PSN avec une PS3 Jailbreaker. - Pour l'instant le PSNYes n'est disponible qu'avec les PS3 en firmwares 3.41 !!! - La version PL3 KAKAROTO intégre ses dernières modification et prépare a l'intégration du Firmware 3.30 !!! Conclusion : - UltimateJB PSNYes => Valide l'utilisation du PSN : Uniquement compatible avec les 3.41 - ultimateJB DEFAULT => Pas de PSN mais disponible pour les PS3 sui...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 2.0: Fluent Ribbon Control Suite 2.0(supports .NET 4.0 RTM and .NET 3.5) Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples (only for .NET 4.0) Foundation (Tabs, Groups, Contextual Tabs, Quick Access Toolbar, Backstage) Resizing (ribbon reducing & enlarging principles) Galleries (Gallery in ContextMenu, InRibbonGallery) MVVM (shows how to use this library with Model-View-ViewModel pattern) KeyTips ScreenTips Toolbars ColorGallery NEW! 
*Walkthrough (documenta...patterns & practices: Prism: Prism 4 Documentation: This release contains the Prism 4 documentation in Help 1.0 (CHM) format and PDF format. The documentation is also included with the full download of the guidance. Note: If you cannot view the content of the CHM, using Windows Explorer, select the properties for the file and then click Unblock on the General tab. Note: The PDF version of the guidance is provided for printing and reading in book format. The online version of the Prism 4 documentation can be read here.Farseer Physics Engine: Farseer Physics Engine 3.1: DonationsIf you like this release and would like to keep Farseer Physics Engine running, please consider a small donation. What's new?We bring a lot of new features in Farseer Physics Engine 3.1. Just to name a few: New Box2D core Rope joint added More stable CCD algorithm YuPeng clipper Explosives logic New Constrained Delaunay Triangulation algorithm from the Poly2Tri project. New Flipcode triangulation algorithm. Silverlight 4 samples Silverlight 4 debug view XNA 4.0 relea...New Projectsbizicosoft crm: crmBlog Migrator: The Blog Migrator tool is an all purpose utility designed to help transition a blog from one platform to another. It leverages XML-RPC, BlogML, and WordPress WXR formats. It also provides the ability to "rewrite" your posts on your old blog to point to the new location.bzr-tfs integration tests: Used to test bzr-tfs integrationC++ Open Source Advanced Operating System: C++ Open Source Advanced Operating System is a project which allows starter developers create their own OS. For now it is at a really initial stage.Chavah - internet radio for Yeshua's disciples: Chavah (pronounced "ha-vah") is internet radio for Yeshua's disciples. Inspired by Pandora, Chavah is a Silverlight application that brings community-driven Messianic Jewish tunes for the Lord over the web to your eager ears.CodePoster: An add-in for Visual Studio which allows you to post code directly from Visual Studio to your blog. CRM 2011 Plugin Testing Tools: This solution is meant to make unit testing of plugins in CRM 2011 a simpler and more efficient process. This solution serializes the objects that the CRM server passes to a plugin on execution and then offers a library that allows you to deserialize them in a unit test.Edinamarry Free Tarot Software for Windows: A freeware yet an advanced Tarot reading divinity Software for Psychics and for all those who practice Divinity and Spirituality. This software includes Tarot Spread Designer, Tarot Deck Designer, Tarot Cards Gallery, Client & Customer Profile, Word Editor, Tarot Reader, etc.EPiSocial: Social addons for EPiServer.first team foundation project: this is my first project for the student to teach them about the ms visual studio 201o and team foundation serverFKTdev: Proyecto donde subiremos las pruebas, códigos de ejemplo y demás recursos en nuestro aprendizaje en XNA, hasta que comencemos un desarrollo estable.Gardens Point Component Pascal: Gardens Point Component Pascal is an implementation for .NET of the Component Pascal Language (CP). CP is an object oriented version of Pascal, and shares many design features with Oberon-2. Geoinformatics: geoinformaticsGREENHOUSEMANAGER: GREENHOUSE es un proyecto universitario para manejar los distintos aspectos de un invernadero. El sistema esta desarrollado en c# con interfaz grafica en WPFHousing: This project is only for the asp.net learning. HR-XML.NET: A .NET HR-XML Serialization Library. 
Also supports the Dutch SETU standard and some proprietary extensions used in the Netherlands. The project is currently targeting HR-XML version 2.5 and Setu standard 2008-01.InternetShop2: ShopLesson4: Lesson4 for M.Logical Synchronous Circuit Simulator: As part of a student project, we are trying to make a logic synchronous circuit simulator, with the ultimate goal of simulating a processor and a digital clock running on it.MediaOwl: MediaOwl is a music (albums, artists, tracks, tags) and movie (movies, series, actors, directors, genres) search engine, but above all, it is a Microsoft Silverlight 4 application (C#), that shows how to use Caliburn Micro.N2F Yverdon Solar Flare Reflector: The solar flare reflector provides minimal base-range protection for your N2F Yverdon installation against solar flare interference.Netduino Plus Home Automation Toolkit: The Netduino Plus Home Automation project is designed to proivde a communication platform from various consumer based home automation products that offer a common web service endpoint. This will hopefully create a low cost DIY alternative to the expensive ethernet interfaces.NRapid: NRapidOfficeHelper: Wrapper around the open xml office package. You can easily create xlsx documents based on a template xlsx document and reuse parts from that document, if you mark them as named ranges (i.e. "names").OffProjects: This is a private project which for my dev investigationParis Velib Stations for Windows Mobile: Allow to find the closest Velib bike station in Paris on a Windows Mobile Phone (6.5)/ Permet de trouver la station de Vélib la plus proche dans Paris ainsi que ses informations sur un smartphone Windows MobilePolarConverter: Adjust the measured distance of HRM files created by Polar Heart Rate monitorsSexy Select: a jQuery plugin that allows for easy manipulation of select options. Allows for adding, removing, sorting, validation and custom skinningSilverlight Progress Feedback: Demonstrates how to get progress feedback from slow running WPF processes in Silverlight.Silverlight Tabbed Panel: Tabbed Panel based on Silverlight targeted for both developers and designers audience. Tabbed Control is used in this project. This is a basic application. More features will be added in further releases. XAML has been used to design this panel. slabhid: SLABHIDDevice.dll is used for the SLAB MCU example code on PC, the original source code is written by C++. This wrapper class brings SLABHIDDevice.dll to the .Net world, so it will be possible to make some quick solution for firmware testing purpose.SuperWebSocket: A .NET server side implementation of WebSocket protocol.test1-jjoiner: just a test projectTotem Alpha Developer Framework For .Net: ????tadf??VS.NET???????????,????jtadf???????????????。 ?????????tadf??????????????J2EE???????VS.NET?????????,??tadf?????.NET??,???????????,????????????,??????C#??????????Java???????,??????。 tadf?????????????,????HTML???????????,???????,?????????,?????。tadf???????????,????????RICH UI?????WEB??。??????,??。 tadf?????????????????????,????WEB??????????。???????,???????????,?Ajax???????,????????????????,????????,????????????????。???????????,???????????????????????????????,?xml??????,?????????????xml...Ukázkové projekty: Obsahuje ukázkové projekty uživatele TenCoKaciStromy.WPFDemo: This Peoject is only for the WPF learning.Xinx TimeIt!: TinyAlarm is a small utility that allows you to configure an Alarm so that you can opt for 1. Shutdown computer 2. Play a sound 3. Show a note with sound 4. 
Disconnect a dial-up connection 5. Connect via dial-up connection

    Read the article

  • WCF AuthenticationService in IIS7 Error

    - by germandb
I have a WCF server running on IIS 7 using the default application pool, with SSL activated; the service is installed on an SBS 2008 server. I implemented client application services with WCF and SQL Server 2005 to set up access control in my application. The application runs under Windows Vista and is made with WPF. On my development machine the application and the WCF services run well; the IIS I'm using for the trials is the local IIS 7 and the database is the SQL Server 2005 database hosted on my server. I'm using the Visual Studio project designer to enable and configure client application services, using https://localhost/WcfServidorFundacion. When I change the authentication service location to https://WcfServices:5659/WcfServidorFundacion and recompile the application, the following error shows up. Message: The web service returned the error status code: InternalServerError. Details of service failure: {"Message":" Error while processing your request ","StackTrace":"","ExceptionType":""} Stack Trace: at System.Net.HttpWebRequest.GetResponse() at System.Web.ClientServices.Providers.ProxyHelper.CreateWebRequestAndGetResponse(String serverUri, CookieContainer& cookies, String username, String connectionString, String connectionStringProvider, String[] paramNames, Object[] paramValues, Type returnType) InnerException: System.Net.WebException Message="Remote Server Error: (500) Internal Server Error." I can access the WCF service from the browser using the URL mentioned above and even add a web reference in my project. I made a capture of the response but I can't post it because I don't have 10 reputation points. I activated the error log on the IIS 7 server, and the result is a warning in the ManagedPipelineHandler. I'd appreciate it if anyone can help me. Errors & Warnings No. Severity Event Module Name 132. Warning MODULE_SET_RESPONSE_ERROR_STATUS ModuleName ManagedPipelineHandler Notification 128 HttpStatus 500 HttpReason Internal Server Error HttpSubStatus 0 ErrorCode 0 ConfigExceptionInfo Notification EXECUTE_REQUEST_HANDLER ErrorCode The operation completed successfully. (0x0) Maybe this can help; here is the web.config of my service: <?xml version="1.0" encoding="utf-8"?> <!-- Nota: como alternativa para editar manualmente este archivo, puede utilizar la herramienta Administración de sitios web para configurar los valores de la aplicación. Utilice la opción Sitio Web->Configuración de Asp.Net en Visual Studio.
Encontrará una lista completa de valores de configuración y comentarios en machine.config.comments, que se encuentra generalmente en \Windows\Microsoft.Net\Framework\v2.x\Config --> <configuration> <configSections> <sectionGroup name="system.web.extensions" type="System.Web.Configuration.SystemWebExtensionsSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <sectionGroup name="scripting" type="System.Web.Configuration.ScriptingSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <section name="scriptResourceHandler" type="System.Web.Configuration.ScriptingScriptResourceHandlerSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" /> <sectionGroup name="webServices" type="System.Web.Configuration.ScriptingWebServicesSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <section name="jsonSerialization" type="System.Web.Configuration.ScriptingJsonSerializationSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="Everywhere" /> <section name="profileService" type="System.Web.Configuration.ScriptingProfileServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" /> <section name="authenticationService" type="System.Web.Configuration.ScriptingAuthenticationServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" /> <section name="roleService" type="System.Web.Configuration.ScriptingRoleServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" /> </sectionGroup> </sectionGroup> </sectionGroup> </configSections> <appSettings /> <connectionStrings> <remove name="LocalMySqlServer" /> <remove name="LocalSqlServer" /> <add name="fundacionSelfAut" connectionString="Data Source=FUNDACIONSERVER/PRUEBAS;Initial Catalog=fundacion;User ID=wcfBaseDatos;Password=qwerty_2009;" providerName="System.Data.SqlClient" /> </connectionStrings> <system.web> <profile enabled="true" defaultProvider="SqlProfileProvider"> <providers> <clear /> <add name="SqlProfileProvider" type="System.Web.Profile.SqlProfileProvider" connectionStringName="fundacionSelfAut" applicationName="fundafe" /> </providers> <properties> <add name="FirstName" type="String" /> <add name="LastName" type="String" /> <add name="PhoneNumber" type="String" /> </properties> </profile> <roleManager enabled="true" defaultProvider="SqlRoleProvider"> <providers> <clear /> <add name="SqlRoleProvider" type="System.Web.Security.SqlRoleProvider" connectionStringName="fundacionSelfAut" applicationName="fundafe" /> </providers> </roleManager> <membership defaultProvider="SqlMembershipProvider"> <providers> <clear /> <add name="SqlMembershipProvider" type="System.Web.Security.SqlMembershipProvider" connectionStringName="fundacionSelfAut" applicationName="fundafe" enablePasswordRetrieval="false" enablePasswordReset="false" requiresQuestionAndAnswer="true" requiresUniqueEmail="true" passwordFormat="Hashed" /> </providers> </membership> <authentication mode="Forms" /> <compilation 
debug="true" strict="false" explicit="true"> <assemblies> <add assembly="System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089" /> <add assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> </assemblies> </compilation> <!-- La sección <authentication> permite la configuración del modo de autenticación de seguridad utilizado por ASP.NET para identificar a un usuario entrante. --> <!-- La sección <customErrors> permite configurar las acciones que se deben llevar a cabo/cuando un error no controlado tiene lugar durante la ejecución de una solicitud. Específicamente, permite a los desarrolladores configurar páginas de error html que se mostrarán en lugar de un seguimiento de pila de errores. <customErrors mode="RemoteOnly" defaultRedirect="GenericErrorPage.htm"> <error statusCode="403" redirect="NoAccess.htm" /> <error statusCode="404" redirect="FileNotFound.htm" /> </customErrors> --> <pages> <controls> <add tagPrefix="asp" namespace="System.Web.UI" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> </controls> </pages> <httpHandlers> <remove verb="*" path="*.asmx" /> <add verb="*" path="*.asmx" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> <add verb="*" path="*_AppService.axd" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> <add verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validate="false" /> </httpHandlers> <httpModules> <add name="ScriptModule" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> </httpModules> <sessionState timeout="40" /> </system.web> <system.codedom> <compilers> <compiler language="c#;cs;csharp" extension=".cs" warningLevel="4" type="Microsoft.CSharp.CSharpCodeProvider, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"> <providerOption name="CompilerVersion" value="v3.5" /> <providerOption name="WarnAsError" value="false" /> </compiler> </compilers> </system.codedom> <!-- La sección webServer del sistema es necesaria para ejecutar ASP.NET AJAX en Internet Information Services 7.0. Sin embargo, no es necesaria para la versión anterior de IIS. 
--> <system.webServer> <validation validateIntegratedModeConfiguration="false" /> <modules> <add name="ScriptModule" preCondition="integratedMode" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> </modules> <handlers> <remove name="WebServiceHandlerFactory-Integrated" /> <add name="ScriptHandlerFactory" verb="*" path="*.asmx" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> <add name="ScriptHandlerFactoryAppServices" verb="*" path="*_AppService.axd" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> <add name="ScriptResource" preCondition="integratedMode" verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" /> </handlers> <tracing> <traceFailedRequests> <add path="*"> <traceAreas> <add provider="ASP" verbosity="Verbose" /> <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" /> <add provider="ISAPI Extension" verbosity="Verbose" /> <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module" verbosity="Verbose" /> </traceAreas> <failureDefinitions statusCodes="401.3,500,403,404,405" /> </add> </traceFailedRequests> </tracing> <security> <authorization> <add accessType="Allow" users="germanbarbosa,informatica" /> </authorization> <authentication> <windowsAuthentication enabled="false" /> </authentication> </security> </system.webServer> <system.web.extensions> <scripting> <webServices> <authenticationService enabled="true" requireSSL="true" /> <profileService enabled="true" readAccessProperties="FirstName,LastName,PhoneNumber" /> <roleService enabled="true" /> </webServices> </scripting> </system.web.extensions> <system.serviceModel> <services> <!-- this enables the WCF AuthenticationService endpoint --> <service behaviorConfiguration="AppServiceBehaviors" name="System.Web.ApplicationServices.AuthenticationService"> <endpoint address="" binding="basicHttpBinding" bindingConfiguration="userHttps" bindingNamespace="http://asp.net/ApplicationServices/v200" contract="System.Web.ApplicationServices.AuthenticationService" /> </service> <!-- this enables the WCF RoleService endpoint --> <service behaviorConfiguration="AppServiceBehaviors" name="System.Web.ApplicationServices.RoleService"> <endpoint binding="basicHttpBinding" bindingConfiguration="userHttps" bindingNamespace="http://asp.net/ApplicationServices/v200" contract="System.Web.ApplicationServices.RoleService" /> </service> <!-- this enables the WCF ProfileService endpoint --> <service behaviorConfiguration="AppServiceBehaviors" name="System.Web.ApplicationServices.ProfileService"> <endpoint binding="basicHttpBinding" bindingNamespace="http://asp.net/ApplicationServices/v200" bindingConfiguration="userHttps" contract="System.Web.ApplicationServices.ProfileService" /> </service> </services> <bindings> <basicHttpBinding> <!-- Set up a binding that uses Username as the client credential type --> <binding name="userHttps"> <security mode="Transport"> </security> </binding> </basicHttpBinding> </bindings> <behaviors> <serviceBehaviors> <behavior name="AppServiceBehaviors"> 
<serviceMetadata httpGetEnabled="false" httpsGetEnabled="true" /> <serviceDebug includeExceptionDetailInFaults="true" /> <serviceAuthorization principalPermissionMode="UseAspNetRoles" roleProviderName="SqlRoleProvider" /> <serviceCredentials> <userNameAuthentication userNamePasswordValidationMode="MembershipProvider" membershipProviderName="SqlMembershipProvider" /> </serviceCredentials> </behavior> </serviceBehaviors> </behaviors> <serviceHostingEnvironment aspNetCompatibilityEnabled="true" /> </system.serviceModel> </configuration>
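For context on how the client side exercises this endpoint, here is a minimal sketch (not the poster's actual code; it assumes the standard client application services setup, where the Visual Studio project designer registers ClientFormsAuthenticationMembershipProvider and points it at the authentication service URL above):

    using System;
    using System.Web.Security;   // Membership; the client providers live in System.Web.Extensions

    class ClientLoginSketch
    {
        static void Main()
        {
            // With client application services enabled, ValidateUser is routed through
            // ClientFormsAuthenticationMembershipProvider, which posts the credentials to the
            // AuthenticationService endpoint configured in app.config (for example
            // https://WcfServices:5659/WcfServidorFundacion/AuthenticationService.svc).
            bool authenticated = Membership.ValidateUser("someUser", "somePassword");
            Console.WriteLine(authenticated ? "Authenticated" : "Login failed");
        }
    }

It is this call that ultimately fails inside ProxyHelper.CreateWebRequestAndGetResponse when the remote service returns the 500 error described above.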

    Read the article

  • How do I send emails in Java?

    - by Cris Carter
    Hey. I currently want to develop a simple program in Java that sends out email. Not just a few emails, but actually a lot (10k+). I have a subscribers list that all agreed to it, by the way. Anyway, I cannot send these emails via Gmail or anything like that; they do not allow that many emails to be sent. So the basic question is: how do I send emails by making the actual sending computer an email server? I'm sure I should use some libraries; I heard about Chilkat or something like that. Could anyone explain or help me out? It would be very much appreciated.

    Read the article

  • For buffer overflows, what is the stack address when using pthreads?

    - by t2k32316
    I'm taking a class in computer security and there is an extra-credit assignment to inject executable code via a buffer overflow. I have the C source code for the target program I'm trying to manipulate, and I've gotten to the point where I can successfully overwrite the EIP for the current function's stack frame. However, I always get a segmentation fault, because the address I supply is always wrong. The problem is that the current function runs inside a pthread, and therefore the address of the stack seems to change between different runs of the program. Is there any method for finding the stack address within a pthread (or for estimating it)? (Note: pthread_create's second argument is NULL, so we're not manually assigning a stack address.)

    Read the article

  • Announcing the release of the Windows Azure SDK 2.1 for .NET

    - by ScottGu
    Today we released the v2.1 update of the Windows Azure SDK for .NET. This is a major refresh of the Windows Azure SDK and it includes some great new features and enhancements. These new capabilities include:
• Visual Studio 2013 Preview Support: The Windows Azure SDK now supports using the new VS 2013 Preview
• Visual Studio 2013 VM Image: Windows Azure now has a built-in VM image that you can use to host and develop with VS 2013 in the cloud
• Visual Studio Server Explorer Enhancements: Redesigned with improved filtering and auto-loading of subscription resources
• Virtual Machines: Start and stop VMs with suspend billing directly from within Visual Studio
• Cloud Services: New Emulator Express option with reduced footprint and Run as Normal User support
• Service Bus: New high availability options, Notification Hub support, improved VS tooling
• PowerShell Automation: Lots of new PowerShell commands for automating Web Sites, Cloud Services, VMs and more
All of these SDK enhancements are now available to start using immediately and you can download the SDK from the Windows Azure .NET Developer Center. Visual Studio’s Team Foundation Service (http://tfs.visualstudio.com/) has also been updated to support today’s SDK 2.1 release, and the SDK 2.1 features can now be used with it (including with automated builds + tests). Below are more details on the new features and capabilities released today.
Visual Studio 2013 Preview Support
Today’s Windows Azure SDK 2.1 release adds support for the recent Visual Studio 2013 Preview. The 2.1 SDK also works with Visual Studio 2010 and Visual Studio 2012, and works side by side with the previous Windows Azure SDK 1.8 and 2.0 releases. To install the Windows Azure SDK 2.1 on your local computer, choose the “install the sdk” link from the Windows Azure .NET Developer Center. Then choose which version of Visual Studio you want to use it with; clicking the third link will install the SDK with the latest VS 2013 Preview. If you don’t already have the Visual Studio 2013 Preview installed on your machine, this will also install Visual Studio Express 2013 Preview for Web.
Visual Studio 2013 VM Image Hosted in the Cloud
One of the requests we’ve heard from several customers has been to have the ability to host Visual Studio within the cloud (avoiding the need to install anything locally on your computer). With today’s SDK update we’ve added a new VM image to the Windows Azure VM Gallery that has Visual Studio Ultimate 2013 Preview, SharePoint 2013, SQL Server 2012 Express and the Windows Azure 2.1 SDK already installed on it. This provides a really easy way to create a development environment in the cloud with the latest tools. With the recent shutdown and suspend billing feature we shipped on Windows Azure last month, you can spin up the image only when you want to do active development, and then shut down the virtual machine and not have to worry about usage charges while the virtual machine is not in use. You can create your own VS image in the cloud by using the New->Compute->Virtual Machine->From Gallery menu within the Windows Azure Management Portal, and then selecting the “Visual Studio Ultimate 2013 Preview” template.
Visual Studio Server Explorer: Improved Filtering/Management of Subscription Resources
With the Windows Azure SDK 2.1 release you’ll notice significant improvements in the Visual Studio Server Explorer. The explorer has been redesigned so that all Windows Azure services are now contained under a single Windows Azure node.
From the top-level node you can now manage your Windows Azure credentials, import a subscription file or filter Server Explorer to only show services from particular subscriptions or regions. Note: The Web Sites and Mobile Services nodes will appear outside the Windows Azure node until the final release of VS 2013. If you have installed the ASP.NET and Web Tools Preview Refresh, though, the Web Sites node will appear inside the Windows Azure node even with the VS 2013 Preview. Once your subscription information is added, Windows Azure services from all your subscriptions are automatically enumerated in the Server Explorer. You no longer need to manually add services to Server Explorer individually. This provides a convenient way of viewing all of your cloud services, storage accounts, service bus namespaces, virtual machines, and web sites from one location.
Subscription and Region Filtering Support
Using the Windows Azure node in Server Explorer, you can also now filter your Windows Azure services in the Server Explorer by the subscription or region they are in. If you have multiple subscriptions but need to focus your attention on just a few subscriptions for some period of time, this is a handy way to hide the services from the other subscriptions until they become relevant. You can do the same sort of filtering by region. To enable this, just select “Filter Services” from the context menu on the Windows Azure node, then choose the subscriptions and/or regions you want to filter by. In the example below, I’ve decided to show services from my pay-as-you-go subscription within the East US region; Visual Studio will then automatically filter the items that show up in the Server Explorer appropriately. With storage accounts and service bus namespaces, you sometimes need to work with services outside your subscription. To accommodate that scenario, those services allow you to attach an external account (from the context menu). You’ll notice that external accounts have a slightly different icon in Server Explorer to indicate they are from outside your subscription.
Other Improvements
We’ve also improved the Server Explorer by adding additional properties and actions to the services exposed. You now have access to most of the properties on a cloud service, deployment slot, role or role instance, as well as the properties on storage accounts, virtual machines and web sites. Just select the object of interest in Server Explorer and view the properties in the property pane. We also now have full support for creating/deleting/updating storage tables, blobs and queues directly within Server Explorer. Simply right-click on the appropriate storage account node and you can create them directly within Visual Studio.
Virtual Machines: Start/Stop within Visual Studio
Virtual Machines now have context menu actions that allow you to start, shut down, restart and delete a Virtual Machine directly within the Visual Studio Server Explorer. The shutdown action enables you to shut down the virtual machine and suspend billing when the VM is not in use, and easily restart it when you need it. This is especially useful in Dev/Test scenarios where you can start a VM – such as a SQL Server – during your development session and then shut it down / suspend billing when you are not developing (and no longer be billed for it). You can also now directly remote desktop into VMs using the “Connect using Remote Desktop” context menu command in VS Server Explorer.
Cloud Services: Emulator Express with Run as Normal User Support
You can now launch Visual Studio and run your cloud services locally as a Normal User (without having to elevate to an administrator account) using a new Emulator Express option included as a preview feature with this SDK release. Emulator Express is a version of the Windows Azure Compute Emulator that runs in a restricted mode – one instance per role – and it doesn’t require administrative permissions and uses 40% fewer resources than the full Windows Azure Emulator. Emulator Express supports both web and worker roles. To run your application locally using the Emulator Express option, simply change the following settings in the Windows Azure project: on the shortcut menu for the Windows Azure project, choose Properties, and then choose the Web tab. Check the setting for IIS (Internet Information Services) and make sure that the option is set to IIS Express, not the full version of IIS; Emulator Express is not compatible with full IIS. Then, on the Web tab, choose the option for Emulator Express.
Service Bus: Notification Hubs
With the Windows Azure SDK 2.1 release we are adding support for Windows Azure Notification Hubs as part of our official Windows Azure SDK, inside of Microsoft.ServiceBus.dll (previously the Notification Hub functionality was in a preview assembly). You are now able to create, update and delete Notification Hubs programmatically, manage your device registrations, and send push notifications to all your mobile clients across all platforms (Windows Store, Windows Phone 8, iOS, and Android). Learn more about Notification Hubs on MSDN here, or watch the Notification Hubs //BUILD/ presentation here.
Service Bus: Paired Namespaces
One of the new features included with today’s Windows Azure SDK 2.1 release is support for Service Bus “Paired Namespaces”. Paired Namespaces enable you to better handle situations where a Service Bus service namespace becomes unavailable (for example, due to connectivity issues or an outage) and you are unable to send or receive messages to the namespace hosting the queue, topic, or subscription. Previously, to handle this scenario you had to manually set up separate namespaces that could act as a backup, and then implement manual failover and retry logic, which was sometimes tricky to get right. Service Bus now supports Paired Namespaces, which enables you to connect two namespaces together. When you activate the secondary namespace, messages are stored in the secondary queue for delivery to the primary queue at a later time. If the primary container (namespace) becomes unavailable for some reason, automatic failover routes the messages into the secondary queue. For detailed information about paired namespaces and high availability, see the new topic Asynchronous Messaging Patterns and High Availability.
Service Bus: Tooling Improvements
In this release, the Windows Azure Tools for Visual Studio contain several enhancements and changes to the management of Service Bus messaging entities using Visual Studio’s Server Explorer. The most noticeable change is that the Service Bus node is now integrated into the Windows Azure node, and supports integrated subscription management. Additionally, there has been a change to the code generated by the Windows Azure Worker Role with Service Bus Queue project template: the code now uses an event-driven “message pump” programming model based on the QueueClient.OnMessage method.
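As a rough illustration of that pattern (this is just a sketch, not the template's exact code; the connection string and queue name are placeholders), an OnMessage-based receive loop looks something like this:

    using System;
    using Microsoft.ServiceBus.Messaging;   // from the WindowsAzure.ServiceBus NuGet package

    class MessagePumpSketch
    {
        static void Main()
        {
            // Placeholder connection string and queue name.
            var connectionString = "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
            var client = QueueClient.CreateFromConnectionString(connectionString, "myqueue");

            // OnMessage starts the message pump; the callback runs for each received message.
            client.OnMessage(message =>
            {
                Console.WriteLine("Processing message " + message.MessageId);
                message.Complete();   // explicitly mark the message as processed
            },
            new OnMessageOptions { AutoComplete = false, MaxConcurrentCalls = 2 });

            Console.WriteLine("Listening; press Enter to exit.");
            Console.ReadLine();
            client.Close();
        }
    }

And for the Notification Hubs support described above, a hedged sketch of creating a hub and sending a Windows Store toast from Microsoft.ServiceBus.dll (again with placeholder names, and assuming a connection string with manage and send rights):

    using Microsoft.ServiceBus;                 // NamespaceManager
    using Microsoft.ServiceBus.Notifications;   // NotificationHubClient

    class NotificationHubSketch
    {
        static void Main()
        {
            // Placeholder connection string (needs manage rights to create the hub).
            var connectionString = "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";

            // Create the hub if it doesn't already exist.
            var manager = NamespaceManager.CreateFromConnectionString(connectionString);
            if (!manager.NotificationHubExists("myhub"))
                manager.CreateNotificationHub(new NotificationHubDescription("myhub"));

            // Send a Windows Store toast notification to all registered Windows clients.
            var hub = NotificationHubClient.CreateClientFromConnectionString(connectionString, "myhub");
            var toast = "<toast><visual><binding template=\"ToastText01\">" +
                        "<text id=\"1\">Hello from Windows Azure</text></binding></visual></toast>";
            hub.SendWindowsNativeNotificationAsync(toast).Wait();
        }
    }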
PowerShell: Tons of New Automation Commands
Since my last blog post on the previous Windows Azure SDK 2.0 release, we’ve updated Windows Azure PowerShell (which is a separate download) five times. You can find the full change log here. We’ve added new cmdlets in the following areas: China instance and Windows Azure Pack support, environment configuration, VMs, Cloud Services, Web Sites, Storage, SQL Azure and Service Bus.
China Instance and Windows Azure Pack
We now support the following cmdlets for the China instance and Windows Azure Pack, respectively:
• China Instance: Web Sites, Service Bus, Storage, Cloud Service, VMs, Network
• Windows Azure Pack: Web Sites, Service Bus
We will have full cmdlet support for these two Windows Azure environments in PowerShell in the near future.
Virtual Machines: Stop/Start Virtual Machines
Similar to the Start/Stop VM capability in VS Server Explorer, you can now stop your VM and suspend billing. If you want to keep the original behavior of keeping your stopped VM provisioned, you can pass in the -StayProvisioned switch parameter.
Virtual Machines: VM endpoint ACLs
We’ve added and updated a bunch of cmdlets for you to configure fine-grained network ACLs on your VM endpoints. You can use the following cmdlets to create an ACL config and apply it to a VM endpoint:
• New-AzureAclConfig
• Get-AzureAclConfig
• Set-AzureAclConfig
• Remove-AzureAclConfig
• Add-AzureEndpoint -ACL
• Set-AzureEndpoint -ACL
The following example shows how to add an ACL rule to an existing endpoint of a VM. Other improvements for Virtual Machine management include:
• Added the -NoWinRMEndpoint parameter to New-AzureQuickVM and Add-AzureProvisioningConfig to disable Windows Remote Management
• Added the -DirectServerReturn parameter to Add-AzureEndpoint and Set-AzureEndpoint to enable/disable direct server return
• Added the Set-AzureLoadBalancedEndpoint cmdlet to modify load-balanced endpoints
Cloud Services: Remote Desktop and Diagnostics
Remote Desktop and Diagnostics are popular debugging options for Cloud Services. We’ve introduced cmdlets to help you configure these two Cloud Service extensions from Windows Azure PowerShell.
Windows Azure Cloud Services Remote Desktop extension:
• New-AzureServiceRemoteDesktopExtensionConfig
• Get-AzureServiceRemoteDesktopExtension
• Set-AzureServiceRemoteDesktopExtension
• Remove-AzureServiceRemoteDesktopExtension
Windows Azure Cloud Services Diagnostics extension:
• New-AzureServiceDiagnosticsExtensionConfig
• Get-AzureServiceDiagnosticsExtension
• Set-AzureServiceDiagnosticsExtension
• Remove-AzureServiceDiagnosticsExtension
The following example shows how to enable Remote Desktop for a Cloud Service.
Web Sites: Diagnostics
With our last SDK update, we introduced the Get-AzureWebsiteLog -Tail cmdlet to stream the logs of your Web Sites. Recently, we’ve also added cmdlets to configure Web Site application diagnostics:
• Enable-AzureWebsiteApplicationDiagnostic
• Disable-AzureWebsiteApplicationDiagnostic
The following two examples show how to enable application diagnostics to the file system and to a Windows Azure Storage Table.
SQL Database
Previously, you had to know the SQL Database server admin username and password if you wanted to manage the databases on that SQL Database server. Recently, we’ve made the experience much easier by not requiring the admin credential if the database server is in your subscription, so you can simply specify the -ServerName parameter to tell Windows Azure PowerShell which server you want to use for the following cmdlets:
• Get-AzureSqlDatabase
• New-AzureSqlDatabase
• Remove-AzureSqlDatabase
• Set-AzureSqlDatabase
We’ve also added an -AllowAllAzureServices parameter to New-AzureSqlDatabaseServerFirewallRule so that you can easily add a firewall rule to whitelist all Windows Azure IP addresses. Besides the above experience improvements, we’ve also added cmdlets to get the database server quota and set the database service objective. Check out the following cmdlets for details:
• Get-AzureSqlDatabaseServerQuota
• Get-AzureSqlDatabaseServiceObjective
• Set-AzureSqlDatabase -ServiceObjective
Storage and Service Bus
Other new cmdlets include:
• Storage: CRUD cmdlets for Azure Tables and Queues
• Service Bus: cmdlets for managing authorization rules on your Service Bus Namespace, Queue, Topic, Relay and NotificationHub
Summary
Today’s release includes a bunch of great features that enable you to build even better cloud solutions. All of the above features/enhancements are shipped and available to use immediately as part of the 2.1 release of the Windows Azure SDK for .NET. If you don’t already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.
Hope this helps,
Scott
P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article
