Search Results

Search found 62759 results on 2511 pages for 'view first'.


  • NetBeans Podcast 62

    - by TinuA
    Download mp3: 49 minutes – 39.5 MB. Subscribe to the NetBeans Podcast on iTunes.

    NetBeans Community News with Geertjan and Tinu
    What's NEW? Recap of a SUCCESSFUL NetBeans Community Day at JavaOne 2012! Want to know what you missed? Download slides for the NetBeans Community Keynote, the NetBeans and JavaFX panel, the NetBeans and Java EE panel, and the NetBeans Platform panel. Visit the JavaOne Content Catalog for slides, and audio and video recordings of all NetBeans sessions at JavaOne 2012. (Type in keyword "NetBeans".)
    NetBeans Governance Board elections are done. Congratulations to Anton Epple and Hermien Pellissier, the new members of the 20th Board!
    How would you grade the NetBeans team on NetBeans IDE 7.2? Take the NetBeans 7.2 Satisfaction Survey.
    NetBeans IDE 7.3 Beta 2 is available for download. The first beta debuted at JavaOne with support for HTML5. Watch videos of HTML5 support in NetBeans and visit Geertjan's blog for a beginner's guide to HTML5 development.
    It's a busy Fall on the NetBeans Calendar, with stops at Devoxx 2012, JavaOne Latin America, Jay Day Munich, and Jay Days Sweden.

    JavaOne 2012 Reflections
    NetBeans had a fantastic showing at JavaOne 2012--from the full-day lineup of NetBeans Community Day to the numerous BOFs, Labs, and sessions at the main conference. But better to hear it in these short interviews with members of the community who attended JavaOne 2012. Veteran attendees and first-timers, panel participants and award winners, the interviewees share their experience of the conference, from highlights and insights to new discoveries and inspiration. Listen in to why attending JavaOne is a tech pilgrimage every Java developer ought to make.
    07:50   Anton Epple - Eppleton Consulting (Germany); Recipient of 2012 NetBeans Community Recognition Award
    17:10   Henry Arousell and Thomas Boqvist - Bjorn Lunden Information (Sweden)
    24:45   Glenn Holmer - Weyco Group, Inc. (USA); Recipient of 2012 NetBeans Community Recognition Award
    33:09   Timon Veenstra - Agrosense (The Netherlands); 2012 Duke's Choice Award winner (Agrosense in the Nov/Dec '12 issue of Java Magazine)
    40:19   Rob Terplowski - Linden, Inc. (USA)
    More thoughts about NetBeans Day and JavaOne can also be found in two recent NetBeans Zone articles: "Reflections on JavaOne 2012 by the NetBeans Community", Part 1 and Part 2.
    *Have ideas for NetBeans Podcast topics? Send them to nbpodcast at netbeans dot org.
    *Subscribe to the official NetBeans page on Facebook! Check us out as well on Twitter, YouTube, and Google+.

    Read the article

  • Dynamically loading Assemblies to reduce Runtime Dependencies

    - by Rick Strahl
    I've been working on a request to the West Wind Application Configuration library to add JSON support. The config library is a very easy-to-use, code-first approach to configuration: you create a class that holds the configuration data and inherits from a base configuration class, and then assign a persistence provider at runtime that determines where and how the configuration data is stored. Currently the library supports .NET Configuration stores (web.config/app.config), XML files, SQL records and string storage. About once a week somebody asks me about JSON support, and I've deflected this question for the longest time because frankly I think that JSON as a configuration store doesn't really buy a heck of a lot over XML. Both formats require the user to perform some fixup of the plain configuration data - in XML into XML tags, with JSON using JSON delimiters for properties and property formatting rules. Sure, JSON is a little less verbose and maybe a little easier to read if you have hierarchical data, but overall the differences are pretty minor in my opinion. And yet - the requests keep rolling in.
    Hard Link Issues in a Component Library
    Another reason I've been hesitant is that I really didn't want to pull a dependency on an external JSON library - in this case JSON.NET - into the core library. If you're not using JSON.NET elsewhere, I don't want a user to have to take a hard dependency on JSON.NET unless they want to use the JSON feature. JSON.NET is also sensitive to versions and doesn't play nice with multiple versions when hard linked. For example, when you have a reference to V4.4 in your project but the host application has a reference to version 4.5, you can run into assembly load problems. NuGet's Update-Package can solve some of this *if* you can recompile, but that's not ideal for a component that's supposed to be just plug and play. This is no criticism of JSON.NET - this really applies to any dependency that might change. So hard linking the DLL can be problematic for a number of reasons, but the primary reason is to not force loading of JSON.NET unless you actually need it when you use the JSON configuration features of the library.
    Enter Dynamic Loading
    So rather than adding an assembly reference to the project, I decided that it would be better to dynamically load the DLL at runtime and then use dynamic typing to access the various classes. This allows me to run without a hard assembly reference and allows more flexibility with version number differences now and in the future. But there are also a couple of downsides:
    - No assembly reference means only dynamic access - no compiler type checking or Intellisense
    - Requirement for the host application to have a reference to JSON.NET or else get runtime errors
    The former is minor, but the latter can be problematic. Runtime errors are always painful, but in this case I'm willing to live with it. If you want to use JSON configuration settings, JSON.NET needs to be loaded in the project. If this is a Web project, it'll likely be there already. So there are a few things that are needed to make this work:
    - Dynamically create an instance and optionally attempt to load an Assembly (if not loaded)
    - Load types into dynamic variables
    - Use Reflection for a few tasks like statics/enums
    The dynamic keyword in C# makes the formerly most difficult Reflection part - method calls and property assignments - fairly painless. But as cool as dynamic is, it doesn't handle all aspects of Reflection.
Specifically it doesn't deal with object activation, truly dynamic (string based) member activation or accessing of non instance members, so there's still a little bit of work left to do with Reflection.Dynamic Object InstantiationThe first step in getting the process rolling is to instantiate the type you need to work with. This might be a two step process - loading the instance from a string value, since we don't have a hard type reference and potentially having to load the assembly. Although the host project might have a reference to JSON.NET, that instance might have not been loaded yet since it hasn't been accessed yet. In ASP.NET this won't be a problem, since ASP.NET preloads all referenced assemblies on AppDomain startup, but in other executable project, assemblies are just in time loaded only when they are accessed.Instantiating a type is a two step process: Finding the type reference and then activating it. Here's the generic code out of my ReflectionUtils library I use for this:/// <summary> /// Creates an instance of a type based on a string. Assumes that the type's /// </summary> /// <param name="typeName">Common name of the type</param> /// <param name="args">Any constructor parameters</param> /// <returns></returns> public static object CreateInstanceFromString(string typeName, params object[] args) { object instance = null; Type type = null; try { type = GetTypeFromName(typeName); if (type == null) return null; instance = Activator.CreateInstance(type, args); } catch { return null; } return instance; } /// <summary> /// Helper routine that looks up a type name and tries to retrieve the /// full type reference in the actively executing assemblies. /// </summary> /// <param name="typeName"></param> /// <returns></returns> public static Type GetTypeFromName(string typeName) { Type type = null; // Let default name binding find it type = Type.GetType(typeName, false); if (type != null) return type; // look through assembly list var assemblies = AppDomain.CurrentDomain.GetAssemblies(); // try to find manually foreach (Assembly asm in assemblies) { type = asm.GetType(typeName, false); if (type != null) break; } return type; } To use this for loading JSON.NET I have a small factory function that instantiates JSON.NET and sets a bunch of configuration settings on the generated object. The startup code also looks for failure and tries loading up the assembly when it fails since that's the main reason the load would fail. Finally it also caches the loaded instance for reuse (according to James the JSON.NET instance is thread safe and quite a bit faster when cached). 
Here's what the factory function looks like in JsonSerializationUtils:/// <summary> /// Dynamically creates an instance of JSON.NET /// </summary> /// <param name="throwExceptions">If true throws exceptions otherwise returns null</param> /// <returns>Dynamic JsonSerializer instance</returns> public static dynamic CreateJsonNet(bool throwExceptions = true) { if (JsonNet != null) return JsonNet; lock (SyncLock) { if (JsonNet != null) return JsonNet; // Try to create instance dynamic json = ReflectionUtils.CreateInstanceFromString("Newtonsoft.Json.JsonSerializer"); if (json == null) { try { var ass = AppDomain.CurrentDomain.Load("Newtonsoft.Json"); json = ReflectionUtils.CreateInstanceFromString("Newtonsoft.Json.JsonSerializer"); } catch (Exception ex) { if (throwExceptions) throw; return null; } } if (json == null) return null; json.ReferenceLoopHandling = (dynamic) ReflectionUtils.GetStaticProperty("Newtonsoft.Json.ReferenceLoopHandling", "Ignore"); // Enums as strings in JSON dynamic enumConverter = ReflectionUtils.CreateInstanceFromString("Newtonsoft.Json.Converters.StringEnumConverter"); json.Converters.Add(enumConverter); JsonNet = json; } return JsonNet; }This code's purpose is to return a fully configured JsonSerializer instance. As you can see the code tries to create an instance and when it fails tries to load the assembly, and then re-tries loading.Once the instance is loaded some configuration occurs on it. Specifically I set the ReferenceLoopHandling option to not blow up immediately when circular references are encountered. There are a host of other small config setting that might be useful to set, but the default seem to be good enough in recent versions. Note that I'm setting ReferenceLoopHandling which requires an Enum value to be set. There's no real easy way (short of using the cardinal numeric value) to set a property or pass parameters from static values or enums. This means I still need to use Reflection to make this work. I'm using the same ReflectionUtils class I previously used to handle this for me. The function looks up the type and then uses Type.InvokeMember() to read the static property.Another feature I need is have Enum values serialized as strings rather than numeric values which is the default. To do this I can use the StringEnumConverter to convert enums to strings by adding it to the Converters collection.As you can see there's still a bit of Reflection to be done even in C# 4+ with dynamic, but with a few helpers this process is relatively painless.Doing the actual JSON ConversionFinally I need to actually do my JSON conversions. For the Utility class I need serialization that works for both strings and files so I created four methods that handle these tasks two each for serialization and deserialization for string and file.Here's what the File Serialization looks like:/// <summary> /// Serializes an object instance to a JSON file. 
/// </summary> /// <param name="value">the value to serialize</param> /// <param name="fileName">Full path to the file to write out with JSON.</param> /// <param name="throwExceptions">Determines whether exceptions are thrown or false is returned</param> /// <param name="formatJsonOutput">if true pretty-formats the JSON with line breaks</param> /// <returns>true or false</returns> public static bool SerializeToFile(object value, string fileName, bool throwExceptions = false, bool formatJsonOutput = false) { dynamic writer = null; FileStream fs = null; try { Type type = value.GetType(); var json = CreateJsonNet(throwExceptions); if (json == null) return false; fs = new FileStream(fileName, FileMode.Create); var sw = new StreamWriter(fs, Encoding.UTF8); writer = Activator.CreateInstance(JsonTextWriterType, sw); if (formatJsonOutput) writer.Formatting = (dynamic)Enum.Parse(FormattingType, "Indented"); writer.QuoteChar = '"'; json.Serialize(writer, value); } catch (Exception ex) { Debug.WriteLine("JsonSerializer Serialize error: " + ex.Message); if (throwExceptions) throw; return false; } finally { if (writer != null) writer.Close(); if (fs != null) fs.Close(); } return true; }You can see more of the dynamic invocation in this code. First I grab the dynamic JsonSerializer instance using the CreateJsonNet() method shown earlier which returns a dynamic. I then create a JsonTextWriter and configure a couple of enum settings on it, and then call Serialize() on the serializer instance with the JsonTextWriter that writes the output to disk. Although this code is dynamic it's still fairly short and readable.For full circle operation here's the DeserializeFromFile() version:/// <summary> /// Deserializes an object from file and returns a reference. /// </summary> /// <param name="fileName">name of the file to serialize to</param> /// <param name="objectType">The Type of the object. Use typeof(yourobject class)</param> /// <param name="binarySerialization">determines whether we use Xml or Binary serialization</param> /// <param name="throwExceptions">determines whether failure will throw rather than return null on failure</param> /// <returns>Instance of the deserialized object or null. Must be cast to your object type</returns> public static object DeserializeFromFile(string fileName, Type objectType, bool throwExceptions = false) { dynamic json = CreateJsonNet(throwExceptions); if (json == null) return null; object result = null; dynamic reader = null; FileStream fs = null; try { fs = new FileStream(fileName, FileMode.Open, FileAccess.Read); var sr = new StreamReader(fs, Encoding.UTF8); reader = Activator.CreateInstance(JsonTextReaderType, sr); result = json.Deserialize(reader, objectType); reader.Close(); } catch (Exception ex) { Debug.WriteLine("JsonNetSerialization Deserialization Error: " + ex.Message); if (throwExceptions) throw; return null; } finally { if (reader != null) reader.Close(); if (fs != null) fs.Close(); } return result; }This code is a little more compact since there are no prettifying options to set. 
Here JsonTextReader is created dynamically and it receives the output from the Deserialize() operation on the serializer.You can take a look at the full JsonSerializationUtils.cs file on GitHub to see the rest of the operations, but the string operations are very similar - the code is fairly repetitive.These generic serialization utilities isolate the dynamic serialization logic that has to deal with the dynamic nature of JSON.NET, and any code that uses these functions is none the wiser that JSON.NET is dynamically loaded.Using the JsonSerializationUtils WrapperThe final consumer of the SerializationUtils wrapper is an actual ConfigurationProvider, that is responsible for handling reading and writing JSON values to and from files. The provider is simple a small wrapper around the SerializationUtils component and there's very little code to make this work now:The whole provider looks like this:/// <summary> /// Reads and Writes configuration settings in .NET config files and /// sections. Allows reading and writing to default or external files /// and specification of the configuration section that settings are /// applied to. /// </summary> public class JsonFileConfigurationProvider<TAppConfiguration> : ConfigurationProviderBase<TAppConfiguration> where TAppConfiguration: AppConfiguration, new() { /// <summary> /// Optional - the Configuration file where configuration settings are /// stored in. If not specified uses the default Configuration Manager /// and its default store. /// </summary> public string JsonConfigurationFile { get { return _JsonConfigurationFile; } set { _JsonConfigurationFile = value; } } private string _JsonConfigurationFile = string.Empty; public override bool Read(AppConfiguration config) { var newConfig = JsonSerializationUtils.DeserializeFromFile(JsonConfigurationFile, typeof(TAppConfiguration)) as TAppConfiguration; if (newConfig == null) { if(Write(config)) return true; return false; } DecryptFields(newConfig); DataUtils.CopyObjectData(newConfig, config, "Provider,ErrorMessage"); return true; } /// <summary> /// Return /// </summary> /// <typeparam name="TAppConfig"></typeparam> /// <returns></returns> public override TAppConfig Read<TAppConfig>() { var result = JsonSerializationUtils.DeserializeFromFile(JsonConfigurationFile, typeof(TAppConfig)) as TAppConfig; if (result != null) DecryptFields(result); return result; } /// <summary> /// Write configuration to XmlConfigurationFile location /// </summary> /// <param name="config"></param> /// <returns></returns> public override bool Write(AppConfiguration config) { EncryptFields(config); bool result = JsonSerializationUtils.SerializeToFile(config, JsonConfigurationFile,false,true); // Have to decrypt again to make sure the properties are readable afterwards DecryptFields(config); return result; } }This incidentally demonstrates how easy it is to create a new provider for the West Wind Application Configuration component. Simply implementing 3 methods will do in most cases.Note this code doesn't have any dynamic dependencies - all that's abstracted away in the JsonSerializationUtils(). From here on, serializing JSON is just a matter of calling the static methods on the SerializationUtils class.Already, there are several other places in some other tools where I use JSON serialization this is coming in very handy. With a couple of lines of code I was able to add JSON.NET support to an older AJAX library that I use replacing quite a bit of code that was previously in use. 
    And for any other manual JSON operations (in a couple of apps I use JSON serialization for 'blob'-like document storage) this is also going to be handy.
    Performance?
    Some of you might be thinking that using dynamic and Reflection can't be good for performance. And you'd be right… In performing some informal testing it looks like the performance of the native code is nearly twice as fast as the dynamic code. Most of the slowness is attributable to type lookups. To test, I created a native class that uses an actual reference to JSON.NET, and performance was consistently around 85-90% faster with the referenced code. That being said though - I serialized 10,000 objects in 80ms vs. 45ms, so this is hardly slouchy. For the configuration component speed is not that important because both read and write operations typically happen once on first access and then every once in a while. But for other operations - say a serializer trying to handle AJAX requests on a Web server - one would be well served to create a hard dependency.
    Dynamic Loading - Worth it?
    On occasion dynamic loading makes sense. But there's a price to be paid in added code complexity and a performance hit. Still, for some operations that are not pivotal to a component or application and are only used under certain circumstances, dynamic loading can be beneficial to avoid having to ship extra files and loading down distributions. These days, when you create new projects in Visual Studio with 30 assemblies before you even add your own code, trying to keep file counts under control seems a good idea. It's not the kind of thing you do on a regular basis, but when needed it can be a useful tool. Hopefully some of you find this information useful…
    © Rick Strahl, West Wind Technologies, 2005-2013. Posted in .NET, C#.
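    The post mentions a ReflectionUtils.GetStaticProperty() helper without showing it. As a rough illustration of the technique it describes - resolving a type by name and reading a static member such as an enum value via Type.InvokeMember() - here is a minimal C# sketch. It is not the actual ReflectionUtils implementation; it simply reuses the GetTypeFromName() helper shown earlier in the post and mirrors its return-null-on-failure style:

    // Sketch only; requires: using System; using System.Reflection;
    // Example call shape from the post:
    //   GetStaticMember("Newtonsoft.Json.ReferenceLoopHandling", "Ignore")
    public static object GetStaticMember(string typeName, string memberName)
    {
        Type type = GetTypeFromName(typeName);   // helper shown earlier in the post
        if (type == null)
            return null;

        try
        {
            // Looks up a public static field or property by name and returns its value.
            return type.InvokeMember(memberName,
                BindingFlags.Public | BindingFlags.Static |
                BindingFlags.GetField | BindingFlags.GetProperty,
                null, null, null);
        }
        catch
        {
            // Mirrors the post's convention of returning null on failure
            return null;
        }
    }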

    Read the article

  • Review the New Migration Guide to SQL Server 2012 Always On

    - by KKline
    I had the pleasure of meeting Mr. Cephas Lin, of Microsoft, last year at the SQL Saturday in Indianapolis and then later at the PASS Summit in the fall. Cephas has been writing content for SQL Server 2012 Always On. He has recently published his first whitepaper, a migration guide to SQL Server AlwaysOn. Read it and then pass along any feedback: HERE. Enjoy, -Kev. Follow me on Twitter!...(read more)

    Read the article

  • Share and Deliver BI Publisher Reports in Multiple Languages

    - by kanichiro.nishida
    When you share your reports with people who speak and read different languages, you want your reports to be shown in their language, right? Well, translating reports with BI Publisher is not only easy but also greatly reduces the maintenance cost. Many of us on the BI Publisher product development team used to work in Globalization and Multilingual Support, which enables Oracle products and applications to be used in many different languages, countries and territories, so we have a lot of experience in this area. In fact, with BI Publisher being a strategic reporting platform for Oracle EBS, PeopleSoft, JD Edwards, Siebel, and many other Oracle application products, our customers from all over the world are generating thousands upon thousands of reports - including out-of-the-box pre-developed reports from Oracle and customer-created or customized reports - in their own local language every day as they operate and manage their business. Today, I'm going to talk about this very topic: how to translate my reports with BI Publisher 11G.
    Translation Grows, Not the Number of Reports
    Most reporting tools, whether traditional or new, put translation on the back burner. They require their users to copy an original report and translate the whole thing. So when you want to support an additional 10 languages you will need 10 copies of the original. Imagine when you have 50 reports: you end up with 500 reports (50 x 10)! Now you need to maintain these 500 reports; whenever you need to make a change in a report you need to apply the same change to the other 10 copies. And as you can imagine, this is not only a nightmare for IT management but simply not acceptable, especially for applications like Oracle EBS that support over 30 languages. So the first thing we did was very simple: we separated the translation out of the report and marry it to the report only at report generation time. This means that regardless of how many languages you need to support, you have only one report plus a translation file for each language, containing the translated strings. So let's say you have 50 reports and need to support 10 languages for those reports: you still have only 50 reports, and each report now has 10 language translation files. Yes, translation is the thing that should grow as you add more languages to support, not the reports themselves! Second, we provide the translation files in XLIFF format, which is an international standard, XML-based format for exchanging and maintaining translation strings. So once you generate the XLIFF files for your reports with BI Publisher, you can work with any translation vendor in the world to do a mass translation, or you can translate the XML files yourself by manually updating the translatable strings in this text file. Lastly, we made it easier to manage the translation process, from generating the XLIFF files to uploading the translated XLIFF files back to the BI Publisher server. You can generate, download and upload the XLIFF files from BI Publisher's Web interface with your browser, and you can see the translated reports right away without needing to shut down or restart your server. While the translated reports are displayed based on your language preference setting, you can also specify a different language when you schedule or deliver the reports so that they can be generated in your customer's preferred language.
    What Can I Translate?
    When it comes to translation there are three things.
    First, report content translation. When you receive a report you like to see content such as the report title, section titles, comments, annotations, table column headers, and anything else that is static and embedded in the report in your preferred language. We call this Reports Content translation. Second, when you open a report online you might want to see not only the report content translated but also the report UI - the report name, parameter names, layout names, and anything that helps you navigate around the reports - translated into your language. We call this Reports UI translation. This separation of Reports Content and Reports UI translation is very useful, especially when you want to navigate the reports in your preferred UI language but generate them in your customer's preferred language. Imagine you are a native English speaker and need to generate and send a report to your customers in China. You like to see the report name and parameter names in English so that you can comfortably navigate to the report and generate the report output, but you want the report generated in Chinese so that your customers in China can understand it when they receive it. And lastly, you might want even the data presented in the report to be translated. For example, you might want product names in an Order Status report to be translated based on the report viewer's language preference. We call this Reporting Data translation. Since Reporting Data translation is maintained at the data source level, such as database tables, along with the main data, you need to prepare the translation at the data source level first. Then you want to make sure that your query switches accordingly, based on the language preference setting, so that the translated data is retrieved.
    How to Translate BI Publisher Reports?
    Now, when it comes to 'how to translate BI Publisher reports?', the main focus here is the translation of the Report Content and Report UI. I just created this video to show you how to create and manage the translation with BI Publisher 11G. Please take a look at the clip below.
    In today's business world, customers and suppliers are from all over the world regardless of the size of the company or organization. Supporting multiple languages for your reports is no longer something 'nice to have'; it's mandatory. BI Publisher is designed to support multilingual reports from the beginning, without the extra hidden license or configuration cost of other reporting tools such as Crystal Reports. You can support translation into additional languages at any time with the very simple steps shown in the video above. Happy translation! Please share your translation experience with us!
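    Since the post leans on XLIFF without showing what a translation unit looks like, here is a small, purely illustrative C# sketch (using System.Xml.Linq) that builds one translation unit with the generic XLIFF 1.x element names. It is not output from BI Publisher, and the strings and language codes are invented; it only shows the source/target structure a translator or vendor edits:

    using System;
    using System.Xml.Linq;

    class XliffSketch
    {
        static void Main()
        {
            // One translatable string: the original text goes in <source>,
            // the translation supplied by a vendor goes in <target>.
            var transUnit = new XElement("trans-unit",
                new XAttribute("id", "REPORT_TITLE"),
                new XElement("source", "Order Status Report"),
                new XElement("target", "Informe de estado de pedidos"));

            var xliff = new XElement("xliff",
                new XAttribute("version", "1.0"),
                new XElement("file",
                    new XAttribute("source-language", "en-US"),
                    new XAttribute("target-language", "es-ES"),
                    new XElement("body", transUnit)));

            Console.WriteLine(xliff);
        }
    }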

    Read the article

  • TechEd 2012: Day 3 &ndash; Morning TFS

    - by Tim Murphy
    My morning sessions for day three were dominated by Team Foundation Server.  This has been a hot topic for our clients lately, so it really struck a chord. The speaker for the first session was from Boeing.  It was nice to hear how a company mixes both agile and waterfall project management.  The approaches that he presented were very pragmatic.  For their needs, reporting was the crucial part of their decision to use TFS.  This was interesting since it is probably the last aspect that most shops would think about. The challenge of getting users to adopt TFS was brought up by the audience.  As with the other discussion points he took a very level-headed stance.  The approach he was prescribing was to eat the elephant a bite at a time instead of all at once.  If you try to convert your entire shop at once the culture shock will most likely kill the effort. Another key point he reminded us of is that you need to make sure that standards and compliance are taken into account when you set up TFS.  If you don't implement the tool and the processes around it in a way that complies with the standards bodies that govern your business, you are in for a world of hurt. Ultimately the reason they chose TFS was that it was the first tool that incorporated all the ALM features they needed, and it reduced licensing costs compared with all the different tools they would otherwise need to buy to complete the same tasks.  They got to this point by doing an industry evaluation.  Although TFS came out on top, he said that it still has a big gap in the Java area.  Of course in this market there are vendors helping to close that gap. The second session was on how continuous feedback in agile is a new focus in VS2012.  The problems they intended to address included cycle time, average time to repair, and root cause analysis. The speakers fired features at us as if they were firing a machine gun.  I will just say that I am looking forward to digging into the product after seeing this presentation.  Beyond that I will simply list some of the key features that caught my attention:
    - Ability to link documents into tasks as artifacts
    - Web access portal
    - PowerPoint storyboards
    - Exploratory testing
    - Request feedback (allows users to record notes, screen shots and video/audio)
    See you after the second half. del.icio.us Tags: TechEd, TechEd 2012, TFS, Team Foundation Server

    Read the article

  • Enhancing performance in Entity Framework applications by precompiling LINQ to Entities queries

    - by nikolaosk
    This is going to be the tenth post of a series of posts regarding ASP.Net and the Entity Framework and how we can use Entity Framework to access our datastore. You can find the first one here, the second one here, the third one here, the fourth one here, the fifth one here, the sixth one here, the seventh one here, the eighth one here and the ninth one here. I have a post regarding ASP.Net and EntityDataSource. You can read it here. I have 3 more posts on Profiling Entity Framework applications...(read more)
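    The excerpt above is only the series index; the technique named in the title is classic LINQ to Entities query precompilation. As a rough, hedged illustration (not code from the post; the MyEntities context and Customer entity below are invented, and MyEntities is assumed to derive from ObjectContext), a query can be compiled once and reused with CompiledQuery.Compile from System.Data.Objects:

    using System;
    using System.Data.Objects;   // EF 4.x ObjectContext API (System.Data.Entity.dll)
    using System.Linq;

    static class CustomerQueries
    {
        // Compiled once and reused, avoiding repeated LINQ-to-Entities
        // translation cost on every execution.
        public static readonly Func<MyEntities, int, IQueryable<Customer>> ByCountry =
            CompiledQuery.Compile((MyEntities ctx, int countryId) =>
                ctx.Customers.Where(c => c.CountryId == countryId));
    }

    // Usage (sketch):
    // using (var ctx = new MyEntities())
    // {
    //     var customers = CustomerQueries.ByCountry(ctx, 30).ToList();
    // }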

    Read the article

  • My Body Summary template for Orchard

    - by Bertrand Le Roy
    By default, when Orchard displays a content item such as a blog post in a list, it uses a very basic summary template that removes all markup and then extracts the first 200 characters. Removing the markup has the unfortunate effect of removing all styles and images, in particular the image I like to add to the beginning of my posts. Fortunately, overriding templates in Orchard is a piece of cake. Here is the Common.Body.Summary.cshtml file that I drop into the Views/Parts folder of pretty much all Orchard themes I build:
    @{
        Orchard.ContentManagement.ContentItem contentItem = Model.ContentPart.ContentItem;
        var bodyHtml = Model.Html.ToString();
        var more = bodyHtml.IndexOf("<!--more-->");
        if (more != -1) {
            bodyHtml = bodyHtml.Substring(0, more);
        }
        else {
            var firstP = bodyHtml.IndexOf("<p>");
            var firstSlashP = bodyHtml.IndexOf("</p>");
            if (firstP >= 0 && firstSlashP > firstP) {
                bodyHtml = bodyHtml.Substring(firstP, firstSlashP + 4 - firstP);
            }
        }
        var body = new HtmlString(bodyHtml);
    }
    <p>@body</p>
    <p>@Html.ItemDisplayLink(T("Read more...").ToString(), contentItem)</p>
    This template does not remove any tags, but instead looks for an HTML comment delimiting the end of the post's intro: <!--more--> This is the same convention that is being used in WordPress, and it's easy to add from the source view in TinyMCE or Live Writer. If such a comment is not found, the template will extract the first paragraph (delimited by <p> and </p> tags) as the summary. And if it finds neither, it will use the whole post. The template also adds a localizable link to the full post.

    Read the article

  • SQL SERVER – Thinking about Deprecated, Discontinued Features and Breaking Changes while Upgrading to SQL Server 2012 – Guest Post by Nakul Vachhrajani

    - by pinaldave
    Nakul Vachhrajani is a Technical Specialist and systems development professional with iGATE, with more than 7 years of total IT experience. Nakul is an active blogger with BeyondRelational.com (150+ blogs), and can also be found on forums at SQLServerCentral and BeyondRelational.com. Nakul has also been a guest columnist for SQLAuthority.com and SQLServerCentral.com. Nakul presented a webcast on the "Underappreciated Features of Microsoft SQL Server" at the Microsoft Virtual Tech Days Exclusive Webcast series (May 02-06, 2011) on May 06, 2011. He is also the author of a research paper on database upgrade methodologies, which was published in a nationally distributed CSI journal. In addition to his passion for SQL Server, Nakul also contributes to academia out of personal interest. He visits various colleges and universities as external faculty to judge project activities carried out by the students. Disclaimer: The opinions expressed herein are his own personal opinions and do not represent his employer's view in any way. Blog | LinkedIn | Twitter | Google+
    Let us hear the thoughts of Nakul in first person -
    Those who have been following my blogs would be aware that I am currently running a series on the database engine features that have been deprecated in Microsoft SQL Server 2012. Based on the response that I have received, I was quite surprised to learn that most of the audience found these to be breaking changes, when in fact they were not! It was then that I decided to write a little piece on how to plan your database upgrade such that it works with the next version of Microsoft SQL Server. Please note that the recommendations made in this article are high-level markers and are intended to help you think over the specific steps that you would need to take to upgrade your database.
    Refer to the documentation – understand the terms
    Change is the only constant in this world. Therefore, whenever customer requirements or newer architectures and designs require software vendors to make a change to keywords, functions, etc., they ensure that they provide their end users sufficient time to migrate over to the new standards before dropping the old ones. Microsoft does that too with its Microsoft SQL Server product. Whenever a new SQL Server release is announced, it comes with a list of the following features:
    Breaking changes
    - These are changes that would break your currently running applications, scripts or functionality based on an earlier version of Microsoft SQL Server
    - These are mostly features whose behavior has been changed keeping in mind the newer architectures and designs
    - Lesson: These are the changes that you need to be most worried about!
    Discontinued features
    - These features are no longer available in the associated version of Microsoft SQL Server
    - These features used to be "deprecated" in the prior release
    - Lesson: Without addressing these changes, your database would not be compliant with, and may not work with, the version of Microsoft SQL Server under consideration
    Deprecated features
    - These features are still available in the current version of Microsoft SQL Server, but are scheduled for removal in a future version
    - They may be removed in either the next version or any other future version of Microsoft SQL Server
    - The features listed for deprecation will compose the list of discontinued features in the next version of SQL Server
    - Lesson: Plan to make the changes required to remove/replace usage of the deprecated features with the latest recommended replacements
    Once a feature appears on the list, it moves from bottom to top, i.e. it is first marked as "Deprecated" and then "Discontinued". We know that "Breaking change" comes later on in the product life cycle. What this means is that if you want to know what features would not work with SQL Server 2012 (and you are currently using SQL Server 2008 R2), you need to refer to the list of breaking changes and discontinued features in SQL Server 2012.
    Use the tools!
    There are a lot of tools and technologies around us, but it is rare that I find teams using these tools religiously and to the best of their potential. Below are the top two tools, from Microsoft, that I use every time I plan a database upgrade.
    The SQL Server Upgrade Advisor
    Ever since SQL Server 2005 was announced, Microsoft has provided a small, very lightweight tool called the "SQL Server Upgrade Advisor". The Upgrade Advisor analyzes installed components from earlier versions of SQL Server, and then generates a report that identifies issues to fix either before or after you upgrade. The analysis examines objects that can be accessed, such as scripts, stored procedures, triggers, and trace files. Upgrade Advisor cannot analyze desktop applications or encrypted stored procedures. Refer to the links towards the end of the post to know how to get the Upgrade Advisor.
    The SQL Server Profiler
    Another great tool that you can use is the one most SQL Server developers & administrators use often – the SQL Server Profiler. SQL Server Profiler provides functionality to monitor the "Deprecation" event, which contains:
    - Deprecation announcement – equivalent to features to be deprecated in a future release of SQL Server
    - Deprecation final support – equivalent to features to be deprecated in the next release of SQL Server
    You can learn more using the links towards the end of the post.
    A basic checklist
    There are a lot of finer points that need to be taken care of when upgrading your database. But it would be worthwhile to identify a few basic steps in order to make your database compliant with the next version of SQL Server:
    - Monitor the current application workload (on a test bed) via the Profiler in order to identify usage of features marked as Deprecated
    - If none appear, you are all set! (This almost never happens)
    - Note down all the offending queries and feature usages
    - Run analysis sessions using the SQL Server Upgrade Advisor on your database
    - Based on the inputs from the analysis report and Profiler trace sessions, incorporate solutions for the breaking changes first
    - Next, incorporate solutions for the discontinued features
    - Revisit and document the upgrade strategy for your deployment scenarios
    - Revisit the fall-back, i.e. rollback, strategies in case the upgrades fail
    - Because some programming changes are dependent upon the SQL Server version, this may need to be done in consultation with the development teams
    - Before any other enhancements are incorporated by the development team, send out the database changes into QA
    - The QA strategy should involve a comparison between an environment running the old version of SQL Server and one running the new version. Because minimal application changes have gone in (essential changes for SQL Server version compliance only), this would be possible
    - As an ongoing activity, keep incorporating changes recommended as per the deprecated features list
    - As a DBA, update your coding standards to ensure that the developers are using ANSI-compliant code – this code will require a change only if the ANSI standard changes
    Remember this: Change management is a continuous process. Keep revisiting the product release notes and incorporate recommended changes to stay prepared for the next release of SQL Server. May the power of SQL Server be with you!
    Links referenced in this post
    - Breaking changes in SQL Server 2012: Link
    - Discontinued features in SQL Server 2012: Link
    - Get the Upgrade Advisor from the Microsoft Download Center at: Link
    - Upgrade Advisor page on MSDN: Link
    - Profiler: Review T-SQL code to identify objects no longer supported by Microsoft: Link
    - Upgrading to SQL Server 2012 by Vinod Kumar: Link
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology. Tagged: Upgrade
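    The post relies on the Upgrade Advisor and the Profiler "Deprecation" event. As a complementary, hedged sketch (not from the original article), you can also read SQL Server's built-in "Deprecated Features" performance counters from C# to see which deprecated features an instance has actually used since it last started; the connection string is an assumption to adjust for your environment:

    using System;
    using System.Data.SqlClient;

    class DeprecatedFeatureCheck
    {
        static void Main()
        {
            // Assumed connection string; point it at the server you are assessing.
            const string connectionString = "Server=.;Database=master;Integrated Security=true";
            const string query = @"
                SELECT instance_name, cntr_value
                FROM   sys.dm_os_performance_counters
                WHERE  object_name LIKE '%Deprecated Features%'
                  AND  cntr_value > 0
                ORDER BY cntr_value DESC;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(query, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // instance_name = deprecated feature, cntr_value = uses since SQL Server started
                        Console.WriteLine("{0,-60} {1}", reader.GetString(0).Trim(), reader.GetInt64(1));
                    }
                }
            }
        }
    }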

    Read the article

  • Keep website and webservices warm with zero coding

    - by oazabir
    If you want to keep your websites or webservices warm and save users from seeing the long warm-up time after an application pool recycle, an IIS restart, a new code deployment or even a Windows restart, you can use the tinyget command line tool, which comes with the IIS Resource Kit, to hit the site and services and keep them warm. Here's how: First, get tinyget from here. Download and install the IIS 6.0 Resource Kit on some PC. Then copy tinyget.exe from "c:\program files…\IIS 6.0 ResourceKit\Tools'\tinyget" to the server where your IIS 6.0 or IIS 7 is running. Then create a batch file that will hit the pages and webservices. Something like this:
    SET TINYGET=C:\Program Files (x86)\IIS Resources\TinyGet\tinyget.exe
    "%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/ -status:200
    "%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/WidgetService.asmx?WSDL -status:200
    First I am hitting the homepage to keep the webpage warm. Then I am hitting the webservice URL with the ?WSDL parameter, which allows ASP.NET to compile the service if not already compiled, and to walk through all the operations and reflect on them, thus loading all related DLLs into memory and reducing the warm-up time when the service is hit. Tinyget takes the server name or IP in the –srv parameter and then the actual URI in the –uri parameter. I have specified which HTTP response code to expect in the –status parameter. It ensures the site is alive and is returning an HTTP 200 code. Besides just warming up a site, you can do some load testing on the site. Tinyget can run in multiple threads and run loops to hit some URL. You can literally blow up a site with commands like this:
    "%TINYGET%" -threads:30 -loop:100 -srv:google.com -uri:http://www.google.com/ -status:200
    Tinyget is also pretty useful for running automated tests. You can record HTTP posts in a text file and then use it to make HTTP posts to some page. Then you can add a matching clause to check for a certain string in the output to ensure the correct response is given. Thus with some simple command line commands, you can warm up, do some transactions, validate that the site is giving the correct response, as well as run a load test to ensure the server is performing well. A very cheap way to get a lot done.
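    If you cannot install tinyget, a few lines of C# accomplish the same warm-up hits. This is a hedged alternative sketch, not part of the original post; the URLs are the same ones used above:

    using System;
    using System.Net;

    // Warm up the same URLs with plain HttpWebRequest calls.
    // GetResponse() throws a WebException for HTTP error codes,
    // so a successful call here means the site answered.
    class WarmUp
    {
        static void Main()
        {
            string[] urls =
            {
                "http://dropthings.omaralzabir.com/",
                "http://dropthings.omaralzabir.com/WidgetService.asmx?WSDL"
            };

            foreach (var url in urls)
            {
                var request = (HttpWebRequest)WebRequest.Create(url);
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("{0} -> {1}", url, (int)response.StatusCode);
                }
            }
        }
    }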

    Read the article

  • Propagating Apache rules Automatically for all folders Beneath Root

    - by Sam
    Hi folks, my webroot folder /httpdocs contains a .htaccess file. The first lines look like this:
    RewriteEngine on
    RewriteBase /
    AddDefaultCharset UTF-8
    Options +FollowSymLinks -Indexes -ExecCGI
    # DirectoryIndex index.php /index.php
    # ServerSignature Off
    Now, I want all the settings I have set here to be propagated automatically to the other folders as well. How can I do that? Thanks very much for suggestions.

    Read the article

  • How can I use JIRA for project management with GreenHopper

    - by user22
    I am thinking of using JIRA + GreenHopper for my project management. I have seen that GreenHopper is for making user stories and sprints. I am not able to find out how I need to add tasks, or how to break user stories into sub-stories. Do I first need to create a project in JIRA and then use GreenHopper, or can I use GreenHopper standalone for project management? I have thought of JIRA as an issue tracker, not project management.

    Read the article

  • CodePlex Daily Summary for Friday, November 19, 2010

    CodePlex Daily Summary for Friday, November 19, 2010Popular ReleasesSQL Server CLR Function for Address Correction and Geocoding: Release 2.0: Release 2.0. New User Defined Function fields added.MiniTwitter: 1.58: MiniTwitter 1.58 ???? ?? ??????????????????、????????????????????????LateBindingApi.Excel: LateBindingApi.Excel Release 0.7d: Release+Samples V0.7: - Enthält Laufzeit DLL und Beispielprojekte Beispielprojekte: COMAddinExample - Demonstriert ein versionslos angebundenes COMAddin Example01 - Background Colors und Borders für Cells Example02 - Font Attributes undAlignment für Cells Example03 - Numberformats Example04 - Shapes, WordArts, Pictures, 3D-Effects Example05 - Charts Example06 - Dialoge in Excel Example07 - Einem Workbook VBA Code hinzufügen Example08 - Events Example09 - Eigene Gui Elemente erstellen und Ere...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.4 Released: Hi, Today we are releasing Visifire 3.6.4 with few bug fixes: * Multi-line Labels were getting clipped while exploding last DataPoint in Funnel and Pyramid chart. * ClosestPlotDistance property in Axis was not behaving as expected. * In DateTime Axis, Chart threw exception on mouse click over PlotArea if there were no DataPoints present in Chart. * ToolTip was not disappearing while changing the DataSource property of the DataSeries at real-time. * Chart threw exception ...Opalis Community Releases: Opalis Architecture and Workflow Deployment Docs: Opalis Architecture & Workflow Deployment Process Documentation The Opalis Architecture & Workflow Deployment Process Documentation includes two documents (for presentation purposes only, one is a DOCX with and embedded Visio diagram and the other is a PDF). The documentation is here as a "Best Practice Guide". The phrase "Best Practice Guide" is in quotes because this is an UNOFFICIAL example of an Opalis Architecture as well as an UNOFFICIAL example of a Workflow Deployment Process. The int...Sexy Select: sexyselect.0.2: Review index.html inside the source code for a working demoMicrosoft SQL Server Product Samples: Database: AdventureWorks 2008R2 SR1: Sample Databases for Microsoft SQL Server 2008R2 (SR1)This release is dedicated to the sample databases that ship for Microsoft SQL Server 2008R2. See Database Prerequisites for SQL Server 2008R2 for feature configurations required for installing the sample databases. See Installing SQL Server 2008R2 Databases for step by step installation instructions. The SR1 release contains minor bug fixes to the installer used to create the sample databases. There are no changes to the databases them...VidCoder: 0.7.2: Fixed duplicated subtitles when running multiple encodes off of the same title.Razor Templating Engine: Razor Template Engine v1.1: Release 1.1 Changes: ADDED: Signed assemblies with strong name to allow assemblies to be referenced by other strongly-named assemblies. FIX: Filter out dynamic assemblies which causes failures in template compilation. FIX: Changed ASCII to UTF8 encoding to support UTF-8 encoded string templates. FIX: Corrected implementation of TemplateBase adding ITemplate interface.Prism Training Kit: Prism Training Kit - 1.1: This is an updated version of the Prism training Kit that targets Prism 4.0 and fixes the bugs reported in the version 1.0. This release consists of a Training Kit with Labs on the following topics Modularity Dependency Injection Bootstrapper UI Composition Communication Note: Take into account that this is a Beta version. 
If you find any bugs please report them in the Issue Tracker PrerequisitesVisual Studio 2010 Microsoft Word 2007/2010 Microsoft Silverlight 4 Microsoft S...Craig's Utility Library: Craig's Utility Library Code 2.0: This update contains a number of changes, added functionality, and bug fixes: Added transaction support to SQLHelper. Added linked/embedded resource ability to EmailSender. Updated List to take into account new functions. Added better support for MAC address in WMI classes. Fixed Parsing in Reflection class when dealing with sub classes. Fixed bug in SQLHelper when replacing the Command that is a select after doing a select. Fixed issue in SQL Server helper with regard to generati...MFCMAPI: November 2010 Release: Build: 6.0.0.1023 Full release notes at SGriffin's blog. If you just want to run the tool, get the executable. If you want to debug it, get the symbol file and the source. The 64 bit build will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit build, regardless of the operating system. Facebook BadgeDotNetNuke® Community Edition: 05.06.00: Major HighlightsAdded automatic portal alias creation for single portal installs Updated the file manager upload page to allow user to upload multiple files without returning to the file manager page. Fixed issue with Event Log Email Notifications. Fixed issue where Telerik HTML Editor was unable to upload files to secure or database folder. Fixed issue where registration page is not set correctly during an upgrade. Fixed issue where Sendmail stripped HTML and Links from emails...mVu Mobile Viewer: mVu Mobile Viewer 0.7.10.0: Tube8 fix.EPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.8.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the serverNew Features Improved chart support Different chart-types series on the same chart Support for secondary axis and a lot of new properties Better styling Encryption and Workbook protection Table support Import csv files Array formulas ...and a lot of bugfixesAutoLoL: AutoLoL v1.4.2: Added support for more clients (French and Russian) Settings are now stored sepperatly for each user on a computer Auto Login is much faster now Auto Login detects and handles caps lock state properly nowTailspinSpyworks - WebForms Sample Application: TailspinSpyworks-v0.9: Contains a number of bug fixes and additional tutorial steps as well as complete database implementation details.ASP.NET MVC Project Awesome (rich jQuery AJAX helpers): 1.3 and demos: a library with mvc helpers and a demo project that demonstrates an awesome way of doing asp.net mvc. tested on mozilla, safari, chrome, opera, ie 9b/8/7/6 new stuff in 1.3 Autocomplete helper Autocomplete and AjaxDropdown can have parentId and be filled with data depending on the value of the parent PopupForm besides Content("ok") on success can also return Json(data) and use 'data' in a client side function Awesome demo improved (cruder, builder, added service layer)UltimateJB: UltimateJB 2.01 PL3 KakaRoto + PSNYes by EvilSperm: Voici une version attendu avec impatience pour beaucoup : - La Version PSNYes pour pouvoir jouer sur le PSN avec une PS3 Jailbreaker. - Pour l'instant le PSNYes n'est disponible qu'avec les PS3 en firmwares 3.41 !!! - La version PL3 KAKAROTO intégre ses dernières modification et prépare a l'intégration du Firmware 3.30 !!! 
Conclusion : - UltimateJB PSNYes => Valide l'utilisation du PSN : Uniquement compatible avec les 3.41 - ultimateJB DEFAULT => Pas de PSN mais disponible pour les PS3 sui...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 2.0: Fluent Ribbon Control Suite 2.0(supports .NET 4.0 RTM and .NET 3.5) Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples (only for .NET 4.0) Foundation (Tabs, Groups, Contextual Tabs, Quick Access Toolbar, Backstage) Resizing (ribbon reducing & enlarging principles) Galleries (Gallery in ContextMenu, InRibbonGallery) MVVM (shows how to use this library with Model-View-ViewModel pattern) KeyTips ScreenTips Toolbars ColorGallery NEW! *Walkthrough (documenta...New ProjectsALARM - ALert Application for Resource Managers: ALert Application for Resource ManagersAmino: Amino coming soon. A very dynamic MVVM application environment. More details to follow.ASDF-CRM: A Cleanly Designed CRM system based on .Net4 technologies with Silverlight GUI.Auto Slideshow with description: Auto Slideshow with image description targeted for webpage banners or website introduction. This is developed in XAML and C# using stroryboard and defining the timelines. This is a Silverlight 4 application. Can be resized depending on your requirements. Base De Datos: proyecto de base de datosBesteam Developments Safe Driving: School project developed with Visual Studio 2010, C# 4.0 and .NET 4.0BizTalk Mapper Extensions UtilityPack: BizTalk Mapper Extensions UtilityPack is a set of libraries with several useful functoids to include and use it in a map, which will provide an extension of BizTalk Mapper capabilities.BlogEngine Additions (Widgets,Extensions,Custom Code): Additions and custom code for BlogEngine. Widgets Extensions Custom code that can't be use in Widgets or Extenstions Equals Verifier .Net: Equals Verifier .Net is a small library to verify if classes implement Equals according to msdn guidelines.ERPSia: Proyecto de SIA TecFalafel Solution Rename Script: The Falafel Software Solution Rename PowerShell Script makes it easy to reuse an existing solution/project by performing a global rename. If you have a solution named ExampleSolution and want to reuse it as WidgetSolution, this script will rename everything for you.fOrganiz: This application allows you to automatically organize by date in specific subdirectories your picturesGEChecker: GEChecker makes it easier for you to view your RuneScape Grand Exchange offers whilst offline. It's developed in VB.NetHBUIMIS: HBUIMISHospital Management: 3 -tier architectinterpool: proyecto interpool - pis 2010 loud tweets: loud tweetsLuminous: Luminous library consists of various .NET components, controls and classes which make programming easier: WPF and Windows Forms TaskDialog (previously VDialog), Simple Popup Control, Glass Button, Linq to CSS, Linq to XHTML and various useful classes and extension methods.MailMonitr: MailMonitr helps improve email push notifications to your iPhone by using Prowl to deliver notifications, instead of the default "ding." Prowl has the ability to set "quiet hours." Also, a summery of unread messages is displayed on the lock screen for each push notification.MemoryGames: TODOMiko Ling's Open Source Projects: Open SourceMSCRM - Duplicate Checker - Plugin: Plugin to handle real-time Duplicate Checking and Constraining on any Int attribute specified. I use this to prevent service calls that are creating entities from creating duplicates. 
The external systems making the service calls use int as the primary key.MyTestProject: Test net projectNellen.dk: Det kan blive meget vildere....Orange Library: Orange LibraryRateIt: Rating plugin for jQuery RTL support, Progressive enhancement, Unobtrusive javascript (using HTML5 data-* attributes), supports as many stars as you'd like, and also any step size.RDPAddins .NET: With RDPAddins .NET framework you can build rdp channel addins in your favourite .NET languageREMS - Real State Management System based on ASP.NET 4.0: real state management system based on ASP.NET 4.0Repository of pan: My source code repository. record my idea and test code here. SharePoint Log Investigation Tool (SPLIT): SPLIT makes searching SharePoint logs easy. SQL Monitor: monitor sql server processes and jobs, view executing sql query, kill process / stop jobTestCodePlexForMe: TestCodePlexForMeweibo wp7 client: It's a project to create a windows phone 7 client for sina weibo, which is http://t.sina.com.cnWM2Day: WM2Day is a Windows Mobile (both Smartphone and PDA) client for Me2day.net, the Korean micro-blogging service.X10Dispatcher: Interface for automating x10 cm15a home automation powerline control unit. Requires physical cm15a control unit connected to computer running this program. Extends remote monitoring and automation of computer activities based on sensors and events.XNA 2D Particle Engine: XNA 2D Particle Engine is a flexible, extensible particle engine written in XNA Game Studio 4.0. The engine can emit texture-based particles in almost anyway you like and can easily be integrated as a (drawable) Game component in your XNA Game Studio 4.0 projects.

    Read the article

  • finite state machine used in mario like platform game

    - by juakob
    I don't understand how to use a finite state machine for the entity controlled by the player. For example, I have a Mario-style 2D platformer where the player can jump, run, walk, take damage, swim, etc., so my first thought was to use these actions as states. But what happens when you take damage while running? Or when you are jumping, taking damage and shooting at the same time? I just want to add functionality (actions) to the player in a clean way, without a pile of ifs for every action in the entity update.
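    One common answer is that "running", "taking damage" and "shooting" don't all belong in the same machine: locomotion can be one small state machine, health/invulnerability another, and one-off actions such as shooting can be plain calls or timers layered on top. Below is a minimal C++ sketch of that split; all names (Player, Locomotion, onGround and so on) are hypothetical and not tied to any particular engine.

        // Sketch: several small, orthogonal state machines instead of one big FSM.
        #include <cstdio>

        enum class Locomotion { Standing, Walking, Running, Jumping, Swimming };
        enum class Health     { Normal, Damaged, Dead };

        struct Input { bool jump, run, shoot; };

        class Player {
        public:
            void update(const Input& in, float dt) {
                updateLocomotion(in);   // each machine only cares about its own axis
                updateHealth(dt);
                if (in.shoot) shoot();  // overlapping action: just a call, not a state
            }
            void takeDamage() {
                // Damage does not care what the locomotion machine is doing:
                // the player can be Jumping and Damaged at the same time.
                if (health == Health::Normal) { health = Health::Damaged; invulnTimer = 1.0f; }
            }
        private:
            void updateLocomotion(const Input& in) {
                switch (locomotion) {
                    case Locomotion::Standing:
                        if (in.jump)     locomotion = Locomotion::Jumping;
                        else if (in.run) locomotion = Locomotion::Running;
                        break;
                    case Locomotion::Running:
                        if (in.jump)      locomotion = Locomotion::Jumping;
                        else if (!in.run) locomotion = Locomotion::Standing;
                        break;
                    case Locomotion::Jumping:
                        if (onGround())   locomotion = Locomotion::Standing;
                        break;
                    default: break;      // other transitions omitted
                }
            }
            void updateHealth(float dt) {
                if (health == Health::Damaged) {
                    invulnTimer -= dt;
                    if (invulnTimer <= 0.0f) health = Health::Normal;
                }
            }
            void shoot()          { std::printf("pew\n"); }
            bool onGround() const { return true; }   // stub for the demo

            Locomotion locomotion = Locomotion::Standing;
            Health     health     = Health::Normal;
            float      invulnTimer = 0.0f;
        };

        int main() {
            Player p;
            p.update({ /*jump*/ true, /*run*/ false, /*shoot*/ true }, 0.016f);
            p.takeDamage();                       // works regardless of locomotion state
            p.update({ false, false, false }, 0.016f);
        }

    Because each machine only owns its own axis, "jumping while damaged while shooting" is just three independent pieces of state, and no combinatorial explosion of states (or chains of ifs in the entity update) is needed.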

    Read the article

  • Issue with distinguishing levels in isometric game

    - by Konrad
    I'm working on an isometric game, but I am having trouble visually distinguishing between levels in the game. Take the example below: the first image shows concrete blocks at ground level, and the following images show an attempt to build a few blocks a level above. As you can see, the level above is visually swallowed by the one below. I've tried shading to make lower levels darker with respect to the camera, but this doesn't work that well. Any ideas?
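    One approach that often works better than only darkening the lower floors is to treat the floor the camera (or player) is focused on as the reference: floors below it are drawn dimmed, floors above it are drawn increasingly translucent so they stop hiding what is underneath, and exposed top edges get an outline or drop shadow. The snippet below is only a sketch of the tint/alpha calculation, assuming a renderer with per-tile color modulation; Color and the constants are made up for illustration.

        #include <algorithm>
        #include <cstdint>
        #include <cstdio>

        struct Color { std::uint8_t r, g, b, a; };

        // Tint for a tile on `tileLevel` when the camera is focused on `focusLevel`.
        Color levelTint(int tileLevel, int focusLevel) {
            int d = tileLevel - focusLevel;
            if (d > 0) {
                // Floors above the focus: full brightness, increasingly translucent
                // so they no longer swallow what is below them.
                std::uint8_t alpha = static_cast<std::uint8_t>(std::max(40, 255 - 70 * d));
                return { 255, 255, 255, alpha };
            }
            // Focus floor and floors below: fully opaque, dimmed with depth (d <= 0).
            std::uint8_t shade = static_cast<std::uint8_t>(std::max(90, 255 + 35 * d));
            return { shade, shade, shade, 255 };
        }

        int main() {
            Color above = levelTint(1, 0), below = levelTint(-1, 0);
            std::printf("above: a=%u  below: r=%u\n", above.a, below.r);
        }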

    Read the article

  • Commercial Gaming, Coming Soon to Linux?

    Linux Magazine: "The inability to play the latest off the shelf commercial games has been a thorn in the side of Linux for a long time. With companies such as Valve starting to embrace other platforms, will that be the catalyst Linux needs to become a first class citizen?"

    Read the article

  • CodePlex Daily Summary for Friday, October 05, 2012

    CodePlex Daily Summary for Friday, October 05, 2012
    Popular Releases
    Configuration Manager 2012 Automation: Beta Code (v0.1): Beta code
    fastJSON: v2.0.7: 2.0.7 - bug fix: missing comma with single property and extension enabled
    WinRT XAML Toolkit: WinRT XAML Toolkit - 1.3.2: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit. Features: AsyncUI extensions, Controls and control extensions, Converters, Debugging helpers, Imaging, IO helpers, VisualTree helpers, Samples. Recent changes NOTE: Namespace changes DebugConsol...
    Snoop, the WPF Spy Utility: Snoop 2.8.0: Announcing Snoop 2.8.0! It's been exactly six months since the last release, and this one has a bunch of goodies in it. In particular, there is now a PowerShell scripting tab, compliments of Bailey Ling. With this tab, the possibilities are limitless. It basically lets you automate/script the application that you are Snooping. Bailey has a couple blog posts (one and two) on his tab already, and I am sure more is to come. Please note that if you do not have PowerShell installed, y...
    .NET Micro Framework: .NET MF 4.3 (Beta): This is the 4.3 Beta version of the .NET Micro Framework. Feature List for v4.3: Support for Visual Studio 2012 (including the Windows Desktop Express version); All v4.2 QFE features and bug fixes (PWM enhancements, lwIP and network driver reliability improvements, Analog Output, WinUSB and latest GCC support); Improved diagnostic information for deployment; Decreased boot time; Bug fixes. Work Item 1736 - Create link for MFDeploy under start menu. Work Item 1504 - Customizing lwIP o...
    MCEBuddy 2.x: MCEBuddy 2.3.1: All new Remote Client Server architecture. Recommended download. The Remote Client Installation is OPTIONAL; you can extract the files from the zip archive into a local folder and run MCEBuddy.GUI directly. 2.2.15 was the last standalone release. Changelog for 2.3.1 (32bit and 64bit): 1. All remote MCEBuddy Client Server architecture (GUI runs remotely/independently from engine now) 2. Fixed bug in Audio Offset 3. Added support for remote MediaInfo (right click on file in queue to get ...
    D3 Loot Tracker: 1.5: Support for session upload to website. Support for theme change through general settings. Time played counter will now also display a count for days. Tome of secrets are no longer logged as items.
    NTCPMSG: V1.2.0.0: Allocate an identify cableid for each single connection cable. * Server can send to a specified cableid directly.
    Team Foundation Server Word Add-in: Version 1.0.12.0622: Welcome to the Visual Studio Team Foundation Server Word Add-in. Supported Environments: Microsoft Office Word 2007 and 2010 X86 (32-bit); Team Foundation Server 2010 Object Model; TFS 2010, 2012 and TFS Service supported, using TFS OM / Explorer 2010. Quality-Bar Details: Tool has been reviewed by Visual Studio ALM Rangers; Tool has been through an independent technical and quality review; All critical bugs have been resolved. Known Issues / Bugs: WI#43553 - The Acceptance criteria is not pu...
    Korean String Extension for .NET: (v0.2.3.0) [release notes are mis-encoded Korean in the source listing; legible fragments: string.KExtract(), string.AppendJosa(...), KAppendJosa(...)]
    UMD????? - PC?: UMDEditor V2.7: http://jianyun.org/archives/948.html - 2.7.0 (2012-10-3) [remaining release notes are mis-encoded Chinese in the source listing; legible fragments: "UMD???.exe" / "UMDEditor.exe", 64-bit bug fixes, reg.bat, txt/u...]
    SharePoint Column & View Permission: SharePoint Column and View Permission v1.5: Version 1.5 of this project. If you find any bugs please let me know at enti@zoznam.sk or post your findings in the Issue Tracker.
    Untangler: Untangler 1.0.0: Add a missing file from first release
    DirectX Tool Kit: October 2012: October 2, 2012. Added ScreenGrab module. Added CreateGeoSphere for drawing a geodesic sphere. Put DDSTextureLoader and WICTextureLoader into the DirectX C++ namespace. Renamed project files for better naming consistency. Updated WICTextureLoader for Windows 8 96bpp floating-point formats. Win32 desktop projects updated to use Windows Vista (0x0600) rather than Windows 7 (0x0601) APIs. Tweaked SpriteBatch.cpp to work around ARM NEON compiler codegen bug.
    Home Access Plus+: v8.1: HAP+ Web v8.1.1003.0000. 79318 Fixed: Issue with the Help Desk and updating a ticket as an admin. 79319 Fixed: formatting issue with the booking system admin header. 79321 Moved to using the arrow with a circle symbol on the homepage instead of the > and <. 79541 Added: 480px wide mobile theme to login page. 79541 Added: 480px wide mobile theme to home page. 79541 Added: slide events for homepage. 79553 Fixed: Booking System Multiple Lesson Bug. 79553 Fixed: IE Error Message. 79684 Fixed: jQuery issue ...
    CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1002.3): What's New: Multi-language support for Labels/Tooltips for custom buttons and groups; Support for base language other than English (1033); Connect dialog will not require organization name for ADFS / IFD connections; Automatic creation of missing labels for all provisioned languages; Minor connection issues fixed. Notes: Before saving the ribbon to CRM server, editor will check Ribbon XML for any missing <Title> elements inside existing <LocLabel> elements...
    SubExtractor: Release 1029: Feature: Added option to make i and ¡ characters movie-specific for improved OCR on Spanish subs (Special Characters tab in Options). Feature: Allow switch to Word Spacing dialog directly from Spell Check dialog. Fix: Added more default word spacings for accented characters. Fix: Changed Word Spacing dialog to show all OCR'd characters in current sub. Fix: Removed application focus grab during OCR. Fix: Tightened HD subs fuzzy logic to reduce false matches in small characters. Fix: Improved Arrow k...
    WallSwitch: WallSwitch 1.0.6: Version 1.0.6 changes: Added hotkeys to perform a variety of operations (next/previous image, pause, clear history, etc.). Added color effects for grayscale, sepia and intense color. Various fixes.
    Readable Passphrase Generator: KeePass Plugin 0.7.1: See the KeePass Plugin Step By Step Guide for instructions on how to install the plugin. Changes: Built against KeePass 2.20
    Windows 8 Toolkit - Charts and More: Beta 1.0: The First Compiled Version of my Library
    New Projects
    3DGL: League bot with basic commands for a PvPGN server; it needs to be refactored and cleaned up but is fully functional. Currently OmbuServer is used.
    ADCMS: one
    Aptech.eProject.Batch59B - Online Book Store: eProject of Batch59B (SOFTECH - APTECH)
    AutoconsultaUB: this is a test
    Azure Active Directory Expense Demo Application: This application has been written to provide you with a quick and easy way to set up your first application that connects seamlessly to Azure Active Directory
    Central: This is a WPF project that implements PRISM with MEF as the IoC. This is an IDE with a modern UI where developers can run and centralize all their different apps.
    DataWay Connectors: [description is mis-encoded Cyrillic in the source listing; it describes connectors for the DataWay platform]
    DLock - Open source distributed lock: DLock is an open source project based on the .NET Framework that can provide a distributed lock.
    DoubleKeyDictionary: Project is just a single file, tests to be added as we go.
    Embedded Excel OLE Orphan Preventer: Prevents orphaned instances of embedded Excel workbooks from hanging around even after the parent container document has been closed.
    File Duplicate Utility: This project will quickly detect duplicate files in a directory, then allow some minimal post processing on those duplicate files (i.e. move them to a new dir).
    Geoprocessing: A set of tools for processing sea ice concentrations
    Google oAuth2 Service Account .NET: Google oAuth2 Service Account .NET
    HelperStart: Nothing
    iDeal: C# library for connecting with iDeal
    JCI-System: This project doesn't have a summary
    Layoutting a long string: Takes a long string and splits it according to a displacing enum layout
    Microsoft Dynamics CRM VB.Net SoapLogger: The VB.NET SoapLogger makes it easy to see what the SOAP looks like behind a Dynamics CRM 2011 call. Similar to the C# version included in the SDK.
    MinecraftRepository: Minecraft data repository
    ModBUSLib.NET: Test
    Mood Tracker - Decisions: An application to help you track your mood changes.
    Relying Party Federation Metadata Editor: This is a federation metadata editor for relying party trust applications. RPs can be created on any platform (as long as it's based on the OASIS standard).
    Rest MVC: Personal project to play around with MVC API
    Sharepoint 2010 Image carousel Sandboxed: This is an Image Slider sandboxed solution web part for SharePoint 2010. Hope this helps. Cheers!
    Synapse - Micro Framework for: IoCC, AoP, Messages, Pipe and Filter Pattern: Micro framework for: Inversion of Control Container, Aspect Oriented Programming, Messages Pattern, Pipe and Filter Pattern.
    testdd10042012tfs01: b
    testnewgit1004201201: c
    Wikimentation: A very simple wiki to integrate in existing ASP.NET MVC4 sites, implemented as an Area. Use it for a simple one-page project documentation.
    Windows 8 AppBox: This application will help you start developing apps for the Windows 8 Store; begin by building a simple, content-only app :)
    Windows Phone Stateful Framework: A framework to correctly implement the Windows Phone tombstoned state.

    Read the article

  • Count a row VS Save the Row count after each update

    - by SAFAD
    I want to know whether saving a row count in a table is better than counting the rows each time. Quick example: a visitor goes to a clan's group page, which displays the clan information and the members who have joined it. Should the page look up all the users who joined the clan and count them, or just display a member count already saved in the table? I think the first option can't be manipulated (it is always derived from the real data), but it might cost performance. Your ideas?
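    The trade-off is the classic one between a denormalized counter and a recount: the saved count is O(1) to read but is only as trustworthy as every code path that updates it, while recounting is always correct but costs a pass over the data (in SQL, a COUNT query). The toy C++ sketch below is not database code; it only illustrates the two options and why the cached value must be updated everywhere membership changes. All names are made up for the example.

        #include <algorithm>
        #include <cstdio>
        #include <vector>

        struct Member { int id; bool active; };

        class Clan {
        public:
            void join(int id)  { members.push_back({id, true}); ++activeCount; }
            void leave(int id) {
                for (Member &m : members)
                    if (m.id == id && m.active) { m.active = false; --activeCount; }
            }
            // "Saved" count: O(1), but only as accurate as the update code above.
            std::size_t savedCount() const { return activeCount; }
            // Recount every time: O(n), but always derived from the real data.
            std::size_t recount() const {
                return static_cast<std::size_t>(std::count_if(
                    members.begin(), members.end(),
                    [](const Member &m) { return m.active; }));
            }
        private:
            std::vector<Member> members;
            std::size_t activeCount = 0;
        };

        int main() {
            Clan c;
            c.join(1); c.join(2); c.leave(1);
            std::printf("saved=%zu recount=%zu\n", c.savedCount(), c.recount());
        }

    In a database the same idea means updating the stored counter in the same transaction as the join/leave row, so it cannot drift from the real membership.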

    Read the article

  • Converting #MDX to #DAX and PowerPivot Workshop online #ppws

    - by Marco Russo (SQLBI)
    I just published the article Converting MDX to DAX – First Steps on the renewed SQLBI web site about converting MDX to DAX. The reason is that with BISM Tabular in Analysis Services 2012 you will be able to write queries in both DAX and MDX. If you already know MDX, you might wonder how to “translate” your MDX knowledge into DAX. I think this is another way to improve your knowledge of DAX: it is built on different concepts, and the comparison should be helpful for that purpose. This is...(read more)

    Read the article

  • Fluke AirCheck Wi-Fi Tester Reviewed

    Wi-Fi Planet's review of the Fluke AirCheck Wi- Fi Tester finds that even with some problems including PC-only configuration and inflexible reporting, "it could well become our first-look-go-to for routine trouble-shooting."

    Read the article

  • Gparted can't create partition table

    - by William
    Here's the problem. About a day ago I used the GParted live CD to create 3 NTFS primary partitions on my external 500 GB GoFlex, plus one extended partition with 2 logical partitions. I had planned to install Windows 8 on the first partition, then Ubuntu and Kubuntu on the other two. After I finished partitioning the drive with GParted, I booted into Windows Vista to make a bootable Windows 8 USB to install from, and I also decided to check that all my partitions were working properly. I found that they were, and they weren't. The 50 GB first partition I had planned to install Windows on showed up as normal, and so did the 300 GB of space left in the extended partition, but the rest showed up as raw. So I figured something had gone wrong while making the partitions and booted up GParted once again. To my surprise, GParted showed the entire drive as unallocated, and when I refreshed the list it showed all the partitions I had made earlier, but with an exclamation mark next to each of them. I figured it might be a problem with the partition table, as I'd seen a similar problem in the past on a drive that was not partitioned at all, so I decided to create a new partition table and take the time to sit and wait again. Then I got a message saying GParted could not create the partition table, followed by it showing the entire drive as formatted NTFS. After that I took a break and came back an hour later, having booted up Windows, and plugged the drive in to see if by some miracle Windows could access it. When I plugged the drive in, Disk Management would freeze attempting to read it, as I'd seen in the past with raw disks, yet when I unplugged it I got a glimpse of Disk Management showing a perfectly fine NTFS file system on the drive, followed by a "you must format disk K in order to use it" message. So I assumed the disk was raw, as that is what had happened in the past, and planned a new partition table through GParted to fix the problem and a 10-hour format in Windows. I booted up GParted once again, only to get the message "error fsyncing/closing/dev/sdg:input/output error" followed by "error opening dev/sdg No such file in directory" after I refreshed; somehow the disk then showed up as perfectly fine NTFS, and I tried to create a new partition table to wipe out all my problems and start over again. And now GParted only shows the drive on about 1 in 10 refreshes; the rest of the time I get the directory error. If anybody can assist me in any way, shape or form I will be thankful.

    Read the article

  • Search Engine Keyword Optimization - The Best Online Marketing Strategy

    Online business people's dreams are fulfilled when they succeed at search engine keyword optimization. It is always a pleasure when, out of 79,000,000 advertisers competing for a certain keyword, you find your website or article appearing on the first page of Google. Apart from Google, the other places to concentrate on in order to generate organic traffic are MSN, Yahoo and Bing.

    Read the article

  • Banner Ad Development: What programming should be required?

    - by FXquincy
    Career questions are potentially off-topic, but I've come across job postings looking for Banner Ad Developers. I'm not sure whether these roles are development, design, or a niche particular to ActionScript designers. First, what programming knowledge is required for banner ads (IDEs, languages, frameworks, etc.)? Second, is this a niche for designers or developers? How differently do designers and developers see their skills in JavaScript?

    Read the article

  • Yellow Dog Enterprise Linux for GPU computing

    The H Open: "The Japanese Fixstars Corporation, which specialises in software for the Cell processors, has announced the release of Yellow Dog Enterprise Linux (YDEL) 6.2 for CUDA, the first enterprise Linux OS optimised for GPU computing."

    Read the article

  • Novell Files Motion for Judgment and Motion to Strike

    Groklaw: "Novell points out that the only evidence SCO presented regarding malice is testimony by Maureen O'Gara of a conversation with Chris Stone, and no one corroborates her story, first of all, and second, O'Gara admitted she can't recall exactly what was said..."

    Read the article

  • Implement Negascout Algorithm with stack

    - by Dan
    I'm not familiar with how these Stack Exchange accounts work, so if this is a double post I apologize; I asked the same thing on Stack Overflow. I have added an AI routine to a game I am working on using the negascout algorithm. It works great, but when I set a higher maximum depth it can take a few seconds to complete. The problem is that it blocks the main thread, and the framework I am using does not have a way to deal with multi-threading properly across platforms. So I am trying to change this routine from recursively calling itself to just managing a stack (vector), so that I can progress through the routine at a controlled pace and not lock up the application while the AI is thinking. I am getting hung up on the second recursive call in the loop: it relies on a value returned from the first call, so I don't know how to add these to a stack. My working C++ recursive code:

        MoveScore abNegascout(vector<vector<char> > &board, int ply, int alpha, int beta, char piece) {
            if (ply == mMaxPly) {
                return MoveScore(evaluation.evaluateBoard(board, piece, oppPiece));
            }
            int currentScore;
            int bestScore = -INFINITY;
            MoveCoord bestMove;
            int adaptiveBeta = beta;
            vector<MoveCoord> moveList =
                evaluation.genPriorityMoves(board, piece, findValidMove(board, piece, false));
            if (moveList.empty()) {
                return MoveScore(bestScore);
            }
            bestMove = moveList[0];
            for (int i = 0; i < moveList.size(); i++) {
                MoveCoord move = moveList[i];
                vector<vector<char> > newBoard;
                newBoard.insert(newBoard.end(), board.begin(), board.end());
                effectMove(newBoard, piece, move.getRow(), move.getCol());
                // First call: null-window (scout) search ******
                MoveScore current = abNegascout(newBoard, ply + 1, -adaptiveBeta, -max(alpha, bestScore), oppPiece);
                currentScore = -current.getScore();
                if (currentScore > bestScore) {
                    if (adaptiveBeta == beta || ply >= (mMaxPly - 2)) {
                        bestScore = currentScore;
                        bestMove = move;
                    } else {
                        // Second call: re-search with the full window ******
                        current = abNegascout(newBoard, ply + 1, -beta, -currentScore, oppPiece);
                        bestScore = -current.getScore();
                        bestMove = move;
                    }
                    if (bestScore >= beta) {
                        return MoveScore(bestMove, bestScore);
                    }
                    adaptiveBeta = max(alpha, bestScore) + 1;
                }
            }
            return MoveScore(bestMove, bestScore);
        }

    If someone could explain how to get this to work with a simple stack, I would appreciate it; example code would be even better. While C++ would be perfect, any language that demonstrates how would be great. Thank you.
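    The usual trick is to make each stack entry remember not just the position and the search window but also which of the two recursive calls it is currently waiting on, so that when a child finishes you can resume the parent at the right place. Below is a self-contained sketch of that pattern. To keep it runnable it searches a dummy tree of node ids with a fixed branching factor instead of the board, MoveScore, move-generation and evaluation classes from the question, and it omits bestMove tracking; all names (Frame, negascoutIterative, BRANCH and so on) are made up for the example.

        #include <algorithm>
        #include <climits>
        #include <cstdio>
        #include <vector>

        const int BRANCH  = 3;   // dummy branching factor
        const int MAX_PLY = 6;   // search depth

        // Dummy leaf evaluation; any deterministic function works for the demo.
        int evaluate(long node) { return static_cast<int>(node * 37 % 201) - 100; }

        struct Frame {
            long node;                                  // position searched by this frame
            int  ply, alpha, beta;
            int  best;                                  // bestScore so far
            int  adaptiveBeta;                          // null-window bound
            int  child;                                 // index of the child being explored
            enum Stage { FIRST_CALL, RESEARCH } stage;  // which recursive call is pending
        };

        int negascoutIterative(long root) {
            std::vector<Frame> stack;
            stack.push_back({root, 0, -INT_MAX, INT_MAX, -INT_MAX, INT_MAX, 0, Frame::FIRST_CALL});
            int  returned   = 0;        // value produced by the frame that was just popped
            bool haveReturn = false;

            while (!stack.empty()) {
                Frame &f = stack.back();

                if (f.ply == MAX_PLY) {                    // leaf: evaluate and "return"
                    returned = evaluate(f.node);
                    haveReturn = true;
                    stack.pop_back();
                    continue;
                }

                if (haveReturn) {                          // a child just finished: resume f
                    haveReturn = false;
                    int score = -returned;
                    bool improved = false;
                    if (f.stage == Frame::FIRST_CALL) {
                        if (score > f.best) {
                            if (f.adaptiveBeta == f.beta || f.ply >= MAX_PLY - 2) {
                                f.best = score;
                                improved = true;
                            } else {                       // null window failed high: re-search
                                f.stage = Frame::RESEARCH;
                                stack.push_back({f.node * BRANCH + 1 + f.child, f.ply + 1,
                                                 -f.beta, -score, -INT_MAX, -score, 0,
                                                 Frame::FIRST_CALL});
                                continue;                  // its result comes back later
                            }
                        }
                    } else {                               // result of the re-search
                        f.best = score;
                        f.stage = Frame::FIRST_CALL;
                        improved = true;
                    }
                    if (improved) {
                        if (f.best >= f.beta) {            // beta cutoff: "return" early
                            returned = f.best;
                            haveReturn = true;
                            stack.pop_back();
                            continue;
                        }
                        f.adaptiveBeta = std::max(f.alpha, f.best) + 1;
                    }
                    ++f.child;                             // move on to the next child
                }

                if (f.child == BRANCH) {                   // no children left: "return" best
                    returned = f.best;
                    haveReturn = true;
                    stack.pop_back();
                    continue;
                }
                // Push the next child with the null (scout) window, exactly like the
                // first recursive call in the original code.
                int cBeta = -std::max(f.alpha, f.best);
                stack.push_back({f.node * BRANCH + 1 + f.child, f.ply + 1,
                                 -f.adaptiveBeta, cBeta, -INT_MAX, cBeta, 0,
                                 Frame::FIRST_CALL});
            }
            return returned;
        }

        int main() {
            std::printf("root score: %d\n", negascoutIterative(0));
        }

    Because control comes back to the top of the while loop after every node, you can cap the number of frames processed per game tick and keep the rest of the application responsive without any extra threads.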

    Read the article

< Previous Page | 684 685 686 687 688 689 690 691 692 693 694 695  | Next Page >