Search Results

Search found 17911 results on 717 pages for 'load balancing'.


  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: With all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems and so performance differences are to be expected. But still I was curious to see how the various technologies performed relative to each just for raw throughput of the request getting to the endpoint and back out to the client with as little processing in the actual endpoint logic as possible (aka Hello World!). I want to clarify that this is merely an informal test for my own curiosity and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements and because overall ASP.NET performs very well even for fairly high loads so that often it's not that critical to test load performance. This post is not meant to make a point  or even come to a conclusion which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub I looked at this data for these technologies: ASP.NET Web API ASP.NET MVC WebForms ASP.NET WebPages ASMX AJAX Services  (couldn't get AJAX/JSON to run on IIS8 ) WCF Rest Raw ASP.NET HttpHandlers It's quite a mixed bag, of course and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head scratcher as the results were sometimes surprising. What I describe here is more to satisfy my curiosity more than anything and I thought it interesting enough to discuss on the blog :-) First test: Raw Throughput The first thing I did is test raw throughput for the various technologies. This is the least practical test of course since you're unlikely to ever create the equivalent of a 'Hello World' request in a real life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this request I create the simplest Hello World request that I could come up for each tech. Http Handler The first is the lowest level approach which is an HTTP handler. public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public bool IsReusable { get { return true; } } } WebForms Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simple does this in CodeBehind without any markup in the ASPX page: public partial class HelloWorld_CodeBehind : System.Web.UI.Page { protected void Page_Load(object sender, EventArgs e) { Response.Write("Hello World. Time is: " + DateTime.Now.ToString() ); Response.End(); } } while the Markup page only contains some static output via an expression:<%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %> Hello World. Time is <%= DateTime.Now %> ASP.NET WebPages WebPages is the freestanding Razor implementation of ASP.NET. 
Here's the simple HelloWorld.cshtml page:Hello World @DateTime.Now WCF REST WCF REST was the token REST implementation for ASP.NET before WebAPI and the inbetween step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class: [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World" + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } } WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client. ASP.NET AJAX (ASMX Services) I also wanted to test ASP.NET AJAX services because prior to WebAPI this is probably still the most widely used AJAX technology for the ASP.NET stack today. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012  removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried I always ended up getting XML results even when explicitly adding the ScriptHandler. So, I didn't test this (but the code is there - you might be able to test this on a Windows 7 box). ASP.NET MVC Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller: public class MvcPerformanceController : Controller { public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } } ASP.NET WebAPI Next up is WebAPI which looks kind of similar to MVC. Except here I have to use a StringContent result to return the response: public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } } Testing Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal since it was mainly to satisfy my curiosity - here's how I did this: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line  looks like this: ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld which hits the specified URL 100,000 times with a load factor of 20 concurrent requests. This results in output like this:   It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of varience. You might also want to do an IISRESET to clear the Web Server. 
    Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test with: 100,000 iterations, a load factor of 20 concurrent connections, an IISReset before starting, and a short warm-up run for API and MVC to make sure startup cost is mitigated. Here is the batch file I used for the test: IISRESET REM make sure you add REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin REM to your path so ab.exe can be found REM Warm up ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson ab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt I ran each of these tests 3 times and took the average score for Requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests, but the values used here are the medians. Part of this has to do with the fact that I ran the tests on my local machine - results would probably be more consistent running the load test from a separate machine hitting across the network. I ran these tests locally on my laptop, which is a Dell XPS with a quad-core Sandy Bridge i7-2720QM @ 2.20GHz and a fast SSD drive on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you could try running these tests from a separate machine hitting the local machine. If I remember correctly, IIS 7 and 8 on client OSs don't throttle, so the performance here should be representative. Results Ok, let's cut straight to the chase. Below are the results from the tests… It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms, and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and WebAPI are a little slower, and the slowest by far is WCF REST (which again I find surprising). As mentioned at the start, the raw throughput tests are not overly practical as they don't test scripting performance for the HTML generation engines or serialization performance of the data engines. All it really does is give you an idea of the raw throughput for the technology, from the time of the request to reaching the endpoint and returning minimal text data back to the client, which indicates full round-trip performance. But it's still interesting to see that Web Forms performs better in throughput than MVC, WebAPI or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on them, but that's beyond the scope of this throughput test. But what's also amazing about this test is the sheer amount of traffic that a laptop computer is handling.
Even the slowest tech managed 5700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24 hour period. Remember these are not static pages, but dynamic requests that are being served. Another test - JSON Data Service Results The second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test since that doesn't make a ton of sense to return data from the them (OTOH, returning text from the APIs didn't make a ton of sense either :-) In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this: public class Person { public Person() { Id = 10; Name = "Rick"; Entered = DateTime.Now; } public int Id { get; set; } public string Name { get; set; } public DateTime Entered { get; set; } } Here are the updated handler classes that use Person: Handler public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { var action = context.Request.QueryString["action"]; if (action == "json") JsonRequest(context); else TextRequest(context); } public void TextRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public void JsonRequest(HttpContext context) { var json = JsonConvert.SerializeObject(new Person(), Formatting.None); context.Response.ContentType = "application/json"; context.Response.Write(json); } public bool IsReusable { get { return true; } } } This code adds a little logic to check for a action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response. WCF REST   [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } [OperationContract] [WebGet(ResponseFormat=WebMessageFormat.Json,BodyStyle=WebMessageBodyStyle.WrappedRequest)] public Person HelloWorldJson() { // Add your operation implementation here return new Person(); } } For WCF REST all I have to do is add a method with the Person result type.   ASP.NET MVC public class MvcPerformanceController : Controller { // // GET: /MvcPerformance/ public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } public JsonResult HelloWorldJson() { return Json(new Person(), JsonRequestBehavior.AllowGet); } } For MVC all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer. ASP.NET WebAPI public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. 
Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } [HttpGet] public Person HelloWorldJson() { return new Person(); } [HttpGet] public HttpResponseMessage HelloWorldJson2() { var response = new HttpResponseMessage(HttpStatusCode.OK); response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter); return response; } } Testing and Results To run these data requests I used the following ab.exe commands:REM JSON RESPONSES ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt The results from this test run are a bit interesting in that the WebAPI test improved performance significantly over returning plain string content. Here are the results:   The performance for each technology drops a little bit except for WebAPI which is up quite a bit! From this test it appears that WebAPI is actually significantly better performing returning a JSON response, rather than a plain string response. Snag with Apache Benchmark and 'Length Failures' I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the graph shows performance improved significantly from with JSON results from 5580 to 6530 or so which is a 15% improvement (while all others slowed down by 3-8%). However, I was skeptical at first because the WebAPI test reports showed a bunch of errors on about 10% of the requests. Check out this report: Notice the Failed Request count. What the hey? Is WebAPI failing on roughly 10% of requests when sending JSON? Turns out: No it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, and so to make sure I re-ran the test with Fiddler attached and runiisning the ab.exe test by using the -X switch: ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson which showed that indeed all requests where returning proper HTTP 200 results with full content. However ab.exe was reporting the errors. After some closer inspection it turned out that the dates varying in size altered the response length in dynamic output. For example: these two results: {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"} {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"} are different in length for the number which results in 68 and 69 bytes respectively. The same URL produces different result lengths which is what ab.exe reports. I didn't notice at first bit the same is happening when running the ASHX handler with JSON.NET result since it uses the same serializer that varies the milliseconds. Moral: You can typically ignore Length failures in Apache Benchmark and when in doubt check the actual output with Fiddler. Note that the other failure values are accurate though. Another interesting Side Note: Perf drops over Time As I was running these tests repeatedly I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6500 req/sec and in subsequent runs it keeps dropping until it would stabalize somewhere around 5900 req/sec occasionally jumping lower. 
    This is why, for these tests, I did the IISRESET and warm-up before individual tests. This is a little puzzling. Looking at Process Monitor while the tests are running, memory very quickly levels out, as do handles and threads, on the first test run. On subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - Handlers, Web Forms, MVC, Web API - curious to see if others test this and see similar results. Doing an IISRESET then resets everything and performance starts off at peak again… Summary As I stated at the outset, these tests were informal, done to satisfy my curiosity, not to prove that any technology is better or even faster than another. While there clearly are differences in performance, the differences (other than WCF REST, which was by far the slowest, and the raw handler, which was by far the fastest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance; it's also about suitability for the job and ease of implementation. The strengths of each technology will more than make up for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Oftentimes, old stuff that's been optimized and designed for a time of less horsepower can utterly blow the doors off newer tech, and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0, which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those that caught my tweets late last week regarding WebAPI's slow responses: that was with string content, which is in fact considerably slower. Luckily, where it counts - with serialized JSON and XML - WebAPI actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: Don't take a single test as the final gospel, and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediately that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff… Resources Source Code on GitHub Apache HTTP Server Project (ab.exe is part of the binary distribution) © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, Web API.

    Read the article

  • Simulating production load to anticipate errors

    - by [email protected]
    The pressure for agility in day-to-day business and for consistently achieving high service levels makes quality management a basic imperative. In that regard, Oracle offers, through its ATS (Application Testing Suite) solution, services to meet quality objectives. Oracle Functional Testing automates tedious testing tasks, reducing the level of effort within test teams and guaranteeing quality with every change to production systems. Oracle Load Testing simulates production load on your environments and anticipates errors stemming from concurrency, congestion, performance and lack of capacity, without affecting end users. The Oracle suite is tested and certified on the following platforms: Siebel 7.x and 8.x, e-Business Suite 11i10 and later, Hyperion, Peoplesoft, JD Edwards, web applications, web services, and databases. Brochure: Oracle Load Testing

    Read the article

  • make IIS 7.5 cache static content files over diferent pages

    - by Achilles
    On a Windows 2008 R2, using DNS and IIS I've established my development test server; i.e. I'll have a web application that I can browse on http://test.dev I've moved all the static content files like images, js files and css files into another application which is visible on http://cdn.test.dev test.dev, uses cdn.test.dev urls like http://cdn.test.dev/js/jquery.js to load js, css and images. When I first load "~/" of test.dev, all files will load with a response code of 200; when I press F5 in Firefox, all files, except the "~/default.aspx", will load with 304 response code; but pressing Ctrl+F5 loads them again with a 200 code; if I browse another url like "~/pages/" in test.dev, all of those static files will reload with a 200 code... Is this normal or I'm doing something wrong? Actually, I'm looking for a behavior like this: I want the client to load http://cdn.test.dev/js/jquery.js, only once. I want the client's browser to use this jquery.js file, from cache, in all other pages of test.dev Is this possible? This is the web.config file I have in the root directory of cdn.test.dev: <configuration> <system.webServer> <caching> <profiles> <add extension=".png" policy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> <add extension=".gif" policy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> <add extension=".jpg" policy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> <add extension=".js" policy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> <add extension=".css" policy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> <add extension=".axd" kernelCachePolicy="CacheUntilChange" varyByHeaders="User-Agent" location="Client" /> </profiles> </caching> <httpProtocol allowKeepAlive="true"> <customHeaders> <add name="Cache-Control" value="public, max-age=31536000" /> </customHeaders> </httpProtocol> <validation validateIntegratedModeConfiguration="false" /> <modules runAllManagedModulesForAllRequests="true"> <remove name="RadUploadModule" /> <remove name="RadCompression" /> <add name="RadUploadModule" type="Telerik.Web.UI.RadUploadHttpModule" preCondition="integratedMode" /> <add name="RadCompression" type="Telerik.Web.UI.RadCompression" preCondition="integratedMode" /> </modules> <handlers> <remove name="ChartImage_axd" /> <remove name="Telerik_Web_UI_SpellCheckHandler_axd" /> <remove name="Telerik_Web_UI_DialogHandler_aspx" /> <remove name="Telerik_RadUploadProgressHandler_ashx" /> <remove name="Telerik_Web_UI_WebResource_axd" /> <add name="ChartImage_axd" path="ChartImage.axd" type="Telerik.Web.UI.ChartHttpHandler" verb="*" preCondition="integratedMode" /> <add name="Telerik_Web_UI_SpellCheckHandler_axd" path="Telerik.Web.UI.SpellCheckHandler.axd" type="Telerik.Web.UI.SpellCheckHandler" verb="*" preCondition="integratedMode" /> <add name="Telerik_Web_UI_DialogHandler_aspx" path="Telerik.Web.UI.DialogHandler.aspx" type="Telerik.Web.UI.DialogHandler" verb="*" preCondition="integratedMode" /> <add name="Telerik_RadUploadProgressHandler_ashx" path="Telerik.RadUploadProgressHandler.ashx" type="Telerik.Web.UI.RadUploadProgressHandler" verb="*" preCondition="integratedMode" /> <add name="Telerik_Web_UI_WebResource_axd" path="Telerik.Web.UI.WebResource.axd" type="Telerik.Web.UI.WebResource" verb="*" preCondition="integratedMode" /> </handlers> <security> <requestFiltering> <requestLimits maxAllowedContentLength="10485760" /> </requestFiltering> </security> <staticContent> <clientCache 
cacheControlMode="UseExpires" httpExpires="Wed, 01 Jan 2020 00:00:00 GMT"/> </staticContent> </system.webServer> <appSettings /> <system.web> <compilation debug="false" targetFramework="4.0" /> <pages> <controls> <add tagPrefix="telerik" namespace="Telerik.Web.UI" assembly="Telerik.Web.UI" /> </controls> </pages> <httpHandlers> <add path="ChartImage.axd" type="Telerik.Web.UI.ChartHttpHandler" verb="*" validate="false" /> <add path="Telerik.Web.UI.SpellCheckHandler.axd" type="Telerik.Web.UI.SpellCheckHandler" verb="*" validate="false" /> <add path="Telerik.Web.UI.DialogHandler.aspx" type="Telerik.Web.UI.DialogHandler" verb="*" validate="false" /> <add path="Telerik.RadUploadProgressHandler.ashx" type="Telerik.Web.UI.RadUploadProgressHandler" verb="*" validate="false" /> <add path="Telerik.Web.UI.WebResource.axd" type="Telerik.Web.UI.WebResource" verb="*" validate="false" /> </httpHandlers> <httpModules> <add name="RadUploadModule" type="Telerik.Web.UI.RadUploadHttpModule" /> <add name="RadCompression" type="Telerik.Web.UI.RadCompression" /> </httpModules> <httpRuntime maxRequestLength="10240" /> </system.web> </configuration> and this is the resulting response header for http://cdn.test.dev/css/global.css: Cache-Control: private,public, max-age=31536000 Content-Type: text/css Content-Encoding: gzip Expires: Wed, 01 Jan 2020 00:00:00 GMT Last-Modified: Mon, 06 Sep 2010 08:53:06 GMT Accept-Ranges: bytes Etag: "0454eca04dcb1:0" Vary: Accept-Encoding Server: Microsoft-IIS/7.5 X-Powered-By: ASP.NET Date: Mon, 06 Sep 2010 14:57:08 GMT Content-Length: 4495
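    One detail that stands out in the response above is the combined header Cache-Control: private,public, max-age=31536000: the static customHeaders entry appends "public, max-age=31536000" on top of whatever cache policy the pipeline already emits, and the contradictory "private" token can cause browsers to revalidate or re-request the files on each new page. As a hedged sketch (not verified against this exact setup), one alternative is to drop the custom header and let a small IHttpModule in the cdn.test.dev application set a single, public, long-lived policy:

        // Hypothetical module: emits one clean Cache-Control header instead of layering
        // a static custom header on top of the default policy. Register it in <modules>
        // alongside the Telerik modules (requires the IIS 7+ integrated pipeline).
        using System;
        using System.Web;

        public class StaticCachePolicyModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.PreSendRequestHeaders += (sender, e) =>
                {
                    HttpResponse response = ((HttpApplication)sender).Response;
                    // Clear whatever Cache-Control values have accumulated so far.
                    response.Headers.Remove("Cache-Control");
                    response.Cache.SetCacheability(HttpCacheability.Public);
                    response.Cache.SetMaxAge(TimeSpan.FromDays(365));
                };
            }

            public void Dispose() { }
        }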

    Read the article

  • Recover Ubuntu grub without LiveCD - Can't boot with Ubuntu LiveCD after install Windows

    - by Paulocoghi
    I installed Windows after installing Ubuntu, but Ubuntu is still intact in its partition. I'm trying to run the Ubuntu LiveCD to recover grub, but the LiveCD no longer works: it stops during the boot process and does not load completely, so I cannot run Ubuntu in live mode to recover grub. Is there any way to recover grub/grub2 without the LiveCD? Edit: [Important] I've downloaded a new Ubuntu 10.10 ISO and did the MD5 check - it's all right. I then burned this .iso, and the LiveCD still does not finish loading.

    Read the article

  • "exception at 0x53C227FF (msvcr110d.dll)" with SOIL library

    - by Sean M.
    I'm creating a game in C++ using OpenGL, and decided to go with the SOIL library for image loading, as I have used it in the past to great effect. The problem is, in my newest game, trying to load an image with SOIL throws the following runtime error: This error points to this part: // SOIL.c int query_NPOT_capability( void ) { /* check for the capability */ if( has_NPOT_capability == SOIL_CAPABILITY_UNKNOWN ) { /* we haven't yet checked for the capability, do so */ if( (NULL == strstr( (char const*)glGetString( GL_EXTENSIONS ), "GL_ARB_texture_non_power_of_two" ) ) ) //############ it points here ############// { /* not there, flag the failure */ has_NPOT_capability = SOIL_CAPABILITY_NONE; } else { /* it's there! */ has_NPOT_capability = SOIL_CAPABILITY_PRESENT; } } /* let the user know if we can do non-power-of-two textures or not */ return has_NPOT_capability; } Since it points to the line where SOIL tries to access the OpenGL extensions, I think that for some reason SOIL is trying to load the texture before an OpenGL context is created. The problem is, I've gone through the entire solution, and there is only one place where SOIL has to load a texture, and it happens long after the OpenGL context is created. This is the part where it loads the texture... //Init glfw if (!glfwInit()) { fprintf(stderr, "GLFW Initialization has failed!\n"); exit(EXIT_FAILURE); } printf("GLFW Initialized.\n"); //Process the command line arguments processCmdArgs(argc, argv); //Create the window glfwWindowHint(GLFW_SAMPLES, g_aaSamples); glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2); g_mainWindow = glfwCreateWindow(g_screenWidth, g_screenHeight, "Voxel Shipyard", g_fullScreen ? glfwGetPrimaryMonitor() : nullptr, nullptr); if (!g_mainWindow) { fprintf(stderr, "Could not create GLFW window!\n"); closeOGL(); exit(EXIT_FAILURE); } glfwMakeContextCurrent(g_mainWindow); printf("Window and OpenGL rendering context created.\n"); //Create the internal rendering components prepareScreen(); //Init glew glewExperimental = GL_TRUE; int err = glewInit(); if (err != GLEW_OK) { fprintf(stderr, "GLEW initialization failed!\n"); fprintf(stderr, "%s\n", glewGetErrorString(err)); closeOGL(); exit(EXIT_FAILURE); } printf("GLEW initialized.\n"); <-- Sucessfully creates an OpenGL context //Initialize the app g_app = new App(); g_app->PreInit(); g_app->Init(); g_app->PostInit(); <-- Loads the texture (after the context is created) ...and debug printing to the console CONFIRMS that the OpenGL context was created before the texture loading was attempted. So my question is if anyone is familiar with this specific error, or knows if there is a specific instance as to why SOIL would think OpenGL isn't initialized yet.

    Read the article

  • Unable to run Tor even in terminal, Vidalia exit code 127

    - by Ubuntu Newb
    I'm using Ubuntu 12.10 (quantal) and I recently installed Tor (32-bit) following the instructions on the Tor project's page. After extracting it, I started the script from the console and got this: master@ubuntu:~/tor-browser_en-US$ ./start-tor-browser Launching Tor Browser Bundle for Linux in /home/master/tor-browser_en-US ./start-tor-browser: 225: ./start-tor-browser: ./App/vidalia: not found Vidalia exited abnormally. Exit code: 127 Then I ran Vidalia from the console and got: master@ubuntu:~/tor-browser_en-US$ vidalia (<unknown>:11354): IBUS-WARNING **: Unable to load /var/lib/dbus/machine-id: Failed to open file '/var/lib/dbus/machine-id': Permission denied master@ubuntu:~/tor-browser_en-US$ vidalia (<unknown>:11358): IBUS-WARNING **: Unable to load /var/lib/dbus/machine-id: Failed to open file '/var/lib/dbus/machine-id': Permission denied And after Vidalia's GUI opens I get the error prompt about starting Tor: "Vidalia was unable to start Tor. Check your settings to ensure the correct name and location of your Tor executable is specified." How can I start Tor?

    Read the article

  • E_FAIL: An undetermined error occurred (-2147467259) when loading a cube texture

    - by Boreal
    I'm trying to implement a skybox into my engine, and I'm having some trouble loading the image as a cube map. Everything works (but it doesn't look right) if I don't load using an ImageLoadInformation struct in the ShaderResourceView.FromFile() method, but it breaks if I do. I need to, of course, because I need to tell SlimDX to load it as a cubemap. How can I fix this? Here is my new loading code after the "fix": public static void LoadCubeTexture(string filename) { ImageLoadInformation loadInfo = new ImageLoadInformation() { BindFlags = BindFlags.ShaderResource, CpuAccessFlags = CpuAccessFlags.None, Depth = 32, FilterFlags = FilterFlags.None, FirstMipLevel = 0, Format = SlimDX.DXGI.Format.B8G8R8A8_UNorm, Height = 512, MipFilterFlags = FilterFlags.Linear, MipLevels = 1, OptionFlags = ResourceOptionFlags.TextureCube, Usage = ResourceUsage.Default, Width = 512 }; textures.Add(filename, ShaderResourceView.FromFile(Graphics.device, "Resources/" + filename, loadInfo)); } Each of the faces of my cube texture are 512x512.
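    One thing worth ruling out first (an assumption on my part, since the question doesn't say what kind of file is being loaded): ResourceOptionFlags.TextureCube only works when the source file itself contains all six faces (typically a DDS cube map); asking FromFile to treat a single 512x512 PNG/JPG as a cube map is one plausible source of E_FAIL. A small diagnostic sketch, with a hypothetical path:

        // Hedged diagnostic: load the file without any ImageLoadInformation and inspect
        // the description to see whether it really is a cube map (ArraySize == 6 and
        // OptionFlags containing ResourceOptionFlags.TextureCube).
        using SlimDX.Direct3D11;

        public static void InspectTexture(Device device, string path)
        {
            using (var texture = Texture2D.FromFile(device, path))
            {
                Texture2DDescription desc = texture.Description;
                System.Diagnostics.Debug.WriteLine(string.Format(
                    "ArraySize={0}, OptionFlags={1}, Format={2}, {3}x{4}",
                    desc.ArraySize, desc.OptionFlags, desc.Format, desc.Width, desc.Height));
            }
        }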

    Read the article

  • Minecraft shows black screen on watt-os 64 after logon

    - by uffe hellum
    Minecraft appears to launch with oracle java 7, but crashes after logon. $ java -Xmx1024M -Xms512M -cp ./minecraft.jar net.minecraft.LauncherFrame asdf Exception in thread "Thread-3" java.lang.UnsatisfiedLinkError: /home/uffeh/.minecraft/bin/natives/liblwjgl.so: /home/uffeh/.minecraft/bin/natives/liblwjgl.so: wrong ELF class: ELFCLASS32 (Possible cause: architecture word width mismatch) at java.lang.ClassLoader$NativeLibrary.load(Native Method) at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1939) at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1864) at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1825) at java.lang.Runtime.load0(Runtime.java:792) at java.lang.System.load(System.java:1059) at org.lwjgl.Sys$1.run(Sys.java:69) at java.security.AccessController.doPrivileged(Native Method) at org.lwjgl.Sys.doLoadLibrary(Sys.java:65) at org.lwjgl.Sys.loadLibrary(Sys.java:81) at org.lwjgl.Sys.(Sys.java:98) at net.minecraft.client.Minecraft.F(SourceFile:1857) at aof.(SourceFile:20) at net.minecraft.client.Minecraft.(SourceFile:77) at anw.(SourceFile:36) at net.minecraft.client.MinecraftApplet.init(SourceFile:36) at net.minecraft.Launcher.replace(Launcher.java:136) at net.minecraft.Launcher$1.run(Launcher.java:79)

    Read the article

  • when to use a scaled/enterprise agile software development framework and when to let agile processes 'emerge'?

    - by SHC
    There are quite a few enterprise agile software development frameworks available: Scott Ambler: Disciplined Agile Delivery Dean Leffingwell: Scaled Agile Framework Alan Shalloway: Enterprise Agile Book Craig Larman: Scaling Lean and Agile Barry Boehm: Balancing Agility and Discipline Brian Wernham: Agile Project Management in Government - DSDM I've also spoken with people that state that your enterprise agile processes should just 'emerge' and that you shouldn't need or use a framework because they constrain you. Question 1: When should one choose an enterprise agile software development framework, and when should one just let their agile processes 'emerge'. Question 2: If choosing an enterprise agile software development framework, how does one select the appropriate framework to use for their organisation? Please provide evidence of your experience or research when answering questions rather than just presenting opinions.

    Read the article

  • Pure Front end JavaScript with Web API versus MVC views with ajax

    - by eyeballpaul
    This was more a discussion for what peoples thoughts are these days on how to split a web application. I am used to creating an MVC application with all its views and controllers. I would normally create a full view and pass this back to the browser on a full page request, unless there were specific areas that I did not want to populate straight away and would then use DOM page load events to call the server to load other areas using AJAX. Also, when it came to partial page refreshing, I would call an MVC action method which would return the HTML fragment which I could then use to populate parts of the page. This would be for areas that I did not want to slow down initial page load, or areas that fitted better with AJAX calls. One example would be for table paging. If you want to move on to the next page, I would prefer it if an AJAX call got that info rather than using a full page refresh. But the AJAX call would still return an HTML fragment. My question is. Are my thoughts on this archaic because I come from a .net background rather than a pure front end background? An intelligent front end developer that I work with, prefers to do more or less nothing in the MVC views, and would rather do everything on the front end. Right down to web API calls populating the page. So that rather than calling an MVC action method, which returns HTML, he would prefer to return a standard object and use javascript to create all the elements of the page. The front end developer way means that any benefits that I normally get with MVC model validation, including client side validation, would be gone. It also means that any benefits that I get with creating the views, with strongly typed html templates etc would be gone. I believe this would mean I would need to write the same validation for front end and back end validation. The javascript would also need to have lots of methods for creating all the different parts of the DOM. For example, when adding a new row to a table, I would normally use the MVC partial view for creating the row, and then return this as part of the AJAX call, which then gets injected into the table. By using a pure front end way, the javascript would would take in an object (for, say, a product) for the row from the api call, and then create a row from that object. Creating each individual part of the table row. The website in question will have lots of different areas, from administration, forms, product searching etc. A website that I don't think requires to be architected in a single page application way. What are everyone's thoughts on this? I am interested to hear from front end devs and back end devs.
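    For readers weighing the two approaches, here is a hedged, self-contained sketch (class and view names are made up) of the same "table row" feature done both ways - the server-rendered fragment the question describes versus a Web API endpoint that returns only data for the client to render:

        using System.Web.Http;
        using System.Web.Mvc;

        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        // Option 1: an MVC action returns a rendered HTML fragment that AJAX injects into
        // the table, keeping templating and server-side concerns in the Razor partial view
        // ("_ProductRow" is hypothetical).
        public class ProductRowController : Controller
        {
            public ActionResult Row(int id)
            {
                var product = new Product { Id = id, Name = "Sample" }; // stand-in for a real lookup
                return PartialView("_ProductRow", product);
            }
        }

        // Option 2: a Web API action returns the object only; JavaScript builds the row from
        // the JSON, so display logic (and any duplicated validation) moves to the front end.
        public class ProductApiController : ApiController
        {
            public Product Get(int id)
            {
                return new Product { Id = id, Name = "Sample" };
            }
        }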

    Read the article

  • A DirectoryCatalog class for Silverlight MEF (Managed Extensibility Framework)

    - by Dixin
    In the MEF (Managed Extension Framework) for .NET, there are useful ComposablePartCatalog implementations in System.ComponentModel.Composition.dll, like: System.ComponentModel.Composition.Hosting.AggregateCatalog System.ComponentModel.Composition.Hosting.AssemblyCatalog System.ComponentModel.Composition.Hosting.DirectoryCatalog System.ComponentModel.Composition.Hosting.TypeCatalog While in Silverlight, there is a extra System.ComponentModel.Composition.Hosting.DeploymentCatalog. As a wrapper of AssemblyCatalog, it can load all assemblies in a XAP file in the web server side. Unfortunately, in silverlight there is no DirectoryCatalog to load a folder. Background There are scenarios that Silverlight application may need to load all XAP files in a folder in the web server side, for example: If the Silverlight application is extensible and supports plug-ins, there would be a /ClinetBin/Plugins/ folder in the web server, and each pluin would be an individual XAP file in the folder. In this scenario, after the application is loaded and started up, it would like to load all XAP files in /ClinetBin/Plugins/ folder. If the aplication supports themes, there would be a /ClinetBin/Themes/ folder, and each theme would be an individual XAP file too. The application would qalso need to load all XAP files in /ClinetBin/Themes/. It is useful if we have a DirectoryCatalog: DirectoryCatalog catalog = new DirectoryCatalog("/Plugins"); catalog.DownloadCompleted += (sender, e) => { }; catalog.DownloadAsync(); Obviously, the implementation of DirectoryCatalog is easy. It is just a collection of DeploymentCatalog class. Retrieve file list from a directory Of course, to retrieve file list from a web folder, the folder’s “Directory Browsing” feature must be enabled: So when the folder is requested, it responses a list of its files and folders: This is nothing but a simple HTML page: <html> <head> <title>localhost - /Folder/</title> </head> <body> <h1>localhost - /Folder/</h1> <hr> <pre> <a href="/">[To Parent Directory]</a><br> <br> 1/3/2011 7:22 PM 185 <a href="/Folder/File.txt">File.txt</a><br> 1/3/2011 7:22 PM &lt;dir&gt; <a href="/Folder/Folder/">Folder</a><br> </pre> <hr> </body> </html> For the ASP.NET Deployment Server of Visual Studio, directory browsing is enabled by default: The HTML <Body> is almost the same: <body bgcolor="white"> <h2><i>Directory Listing -- /ClientBin/</i></h2> <hr width="100%" size="1" color="silver"> <pre> <a href="/">[To Parent Directory]</a> Thursday, January 27, 2011 11:51 PM 282,538 <a href="Test.xap">Test.xap</a> Tuesday, January 04, 2011 02:06 AM &lt;dir&gt; <a href="TestFolder/">TestFolder</a> </pre> <hr width="100%" size="1" color="silver"> <b>Version Information:</b>&nbsp;ASP.NET Development Server 10.0.0.0 </body> The only difference is, IIS’s links start with slash, but here the links do not. Here one way to get the file list is read the href attributes of the links: [Pure] private IEnumerable<Uri> GetFilesFromDirectory(string html) { Contract.Requires(html != null); Contract.Ensures(Contract.Result<IEnumerable<Uri>>() != null); return new Regex( "<a href=\"(?<uriRelative>[^\"]*)\">[^<]*</a>", RegexOptions.IgnoreCase | RegexOptions.CultureInvariant) .Matches(html) .OfType<Match>() .Where(match => match.Success) .Select(match => match.Groups["uriRelative"].Value) .Where(uriRelative => uriRelative.EndsWith(".xap", StringComparison.Ordinal)) .Select(uriRelative => { Uri baseUri = this.Uri.IsAbsoluteUri ? 
this.Uri : new Uri(Application.Current.Host.Source, this.Uri); uriRelative = uriRelative.StartsWith("/", StringComparison.Ordinal) ? uriRelative : (baseUri.LocalPath.EndsWith("/", StringComparison.Ordinal) ? baseUri.LocalPath + uriRelative : baseUri.LocalPath + "/" + uriRelative); return new Uri(baseUri, uriRelative); }); } Please notice the folders’ links end with a slash. They are filtered by the second Where() query. The above method can find files’ URIs from the specified IIS folder, or ASP.NET Deployment Server folder while debugging. To support other formats of file list, a constructor is needed to pass into a customized method: /// <summary> /// Initializes a new instance of the <see cref="T:System.ComponentModel.Composition.Hosting.DirectoryCatalog" /> class with <see cref="T:System.ComponentModel.Composition.Primitives.ComposablePartDefinition" /> objects based on all the XAP files in the specified directory URI. /// </summary> /// <param name="uri"> /// URI to the directory to scan for XAPs to add to the catalog. /// The URI must be absolute, or relative to <see cref="P:System.Windows.Interop.SilverlightHost.Source" />. /// </param> /// <param name="getFilesFromDirectory"> /// The method to find files' URIs in the specified directory. /// </param> public DirectoryCatalog(Uri uri, Func<string, IEnumerable<Uri>> getFilesFromDirectory) { Contract.Requires(uri != null); this._uri = uri; this._getFilesFromDirectory = getFilesFromDirectory ?? this.GetFilesFromDirectory; this._webClient = new Lazy<WebClient>(() => new WebClient()); // Initializes other members. } When the getFilesFromDirectory parameter is null, the above GetFilesFromDirectory() method will be used as default. Download the directory’s XAP file list Now a public method can be created to start the downloading: /// <summary> /// Begins downloading the XAP files in the directory. /// </summary> public void DownloadAsync() { this.ThrowIfDisposed(); if (Interlocked.CompareExchange(ref this._state, State.DownloadStarted, State.Created) == 0) { this._webClient.Value.OpenReadCompleted += this.HandleOpenReadCompleted; this._webClient.Value.OpenReadAsync(this.Uri, this); } else { this.MutateStateOrThrow(State.DownloadCompleted, State.Initialized); this.OnDownloadCompleted(new AsyncCompletedEventArgs(null, false, this)); } } Here the HandleOpenReadCompleted() method is invoked when the file list HTML is downloaded. Download all XAP files After retrieving all files’ URIs, the next thing becomes even easier. 
HandleOpenReadCompleted() just uses built in DeploymentCatalog to download the XAPs, and aggregate them into one AggregateCatalog: private void HandleOpenReadCompleted(object sender, OpenReadCompletedEventArgs e) { Exception error = e.Error; bool cancelled = e.Cancelled; if (Interlocked.CompareExchange(ref this._state, State.DownloadCompleted, State.DownloadStarted) != State.DownloadStarted) { cancelled = true; } if (error == null && !cancelled) { try { using (StreamReader reader = new StreamReader(e.Result)) { string html = reader.ReadToEnd(); IEnumerable<Uri> uris = this._getFilesFromDirectory(html); Contract.Assume(uris != null); IEnumerable<DeploymentCatalog> deploymentCatalogs = uris.Select(uri => new DeploymentCatalog(uri)); deploymentCatalogs.ForEach( deploymentCatalog => { this._aggregateCatalog.Catalogs.Add(deploymentCatalog); deploymentCatalog.DownloadCompleted += this.HandleDownloadCompleted; }); deploymentCatalogs.ForEach(deploymentCatalog => deploymentCatalog.DownloadAsync()); } } catch (Exception exception) { error = new InvalidOperationException(Resources.InvalidOperationException_ErrorReadingDirectory, exception); } } // Exception handling. } In HandleDownloadCompleted(), if all XAPs are downloaded without exception, OnDownloadCompleted() callback method will be invoked. private void HandleDownloadCompleted(object sender, AsyncCompletedEventArgs e) { if (Interlocked.Increment(ref this._downloaded) == this._aggregateCatalog.Catalogs.Count) { this.OnDownloadCompleted(e); } } Exception handling Whether this DirectoryCatelog can work only if the directory browsing feature is enabled. It is important to inform caller when directory cannot be browsed for XAP downloading. private void HandleOpenReadCompleted(object sender, OpenReadCompletedEventArgs e) { Exception error = e.Error; bool cancelled = e.Cancelled; if (Interlocked.CompareExchange(ref this._state, State.DownloadCompleted, State.DownloadStarted) != State.DownloadStarted) { cancelled = true; } if (error == null && !cancelled) { try { // No exception thrown when browsing directory. Downloads the listed XAPs. } catch (Exception exception) { error = new InvalidOperationException(Resources.InvalidOperationException_ErrorReadingDirectory, exception); } } WebException webException = error as WebException; if (webException != null) { HttpWebResponse webResponse = webException.Response as HttpWebResponse; if (webResponse != null) { // Internally, WebClient uses WebRequest.Create() to create the WebRequest object. Here does the same thing. WebRequest request = WebRequest.Create(Application.Current.Host.Source); Contract.Assume(request != null); if (request.CreatorInstance == WebRequestCreator.ClientHttp && // Silverlight is in client HTTP handling, all HTTP status codes are supported. webResponse.StatusCode == HttpStatusCode.Forbidden) { // When directory browsing is disabled, the HTTP status code is 403 (forbidden). error = new InvalidOperationException( Resources.InvalidOperationException_ErrorListingDirectory_ClientHttp, webException); } else if (request.CreatorInstance == WebRequestCreator.BrowserHttp && // Silverlight is in browser HTTP handling, only 200 and 404 are supported. webResponse.StatusCode == HttpStatusCode.NotFound) { // When directory browsing is disabled, the HTTP status code is 404 (not found). 
    error = new InvalidOperationException( Resources.InvalidOperationException_ErrorListingDirectory_BrowserHttp, webException); } } } this.OnDownloadCompleted(new AsyncCompletedEventArgs(error, cancelled, this)); } Please notice that a Silverlight 3+ application can work in either client HTTP handling or browser HTTP handling. One difference is: in browser HTTP handling, only HTTP status codes 200 (OK) and 404 (everything not OK, including 500, 403, etc.) are supported, while in client HTTP handling, all HTTP status codes are supported. So in the above code, exceptions in the 2 modes are handled differently. Conclusion That is the whole DirectoryCatalog implementation. Please click here to download the source code; a simple unit test is included. This is a rough implementation, and, for convenience, some of the design and coding simply follow the built-in AggregateCatalog and DeploymentCatalog classes. Please feel free to modify the code, and please kindly tell me if any issue is found.
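    As a rough usage sketch (my own addition, not from the post - it assumes DirectoryCatalog behaves as a ComposablePartCatalog, just like the built-in DeploymentCatalog it is modeled on, and IPlugin is a made-up contract), composing plug-in parts once the download completes could look like this:

        using System;
        using System.Collections.Generic;
        using System.ComponentModel.Composition;
        using System.ComponentModel.Composition.Hosting;

        public class PluginLoader
        {
            [ImportMany(AllowRecomposition = true)]
            public IEnumerable<IPlugin> Plugins { get; set; }

            public void LoadPlugins()
            {
                // Null second argument falls back to the default directory-listing parser.
                var catalog = new DirectoryCatalog(new Uri("Plugins/", UriKind.Relative), null);
                catalog.DownloadCompleted += (sender, e) =>
                {
                    if (e.Error == null && !e.Cancelled)
                    {
                        var container = new CompositionContainer(catalog);
                        container.ComposeParts(this); // populates Plugins from the downloaded XAPs
                    }
                };
                catalog.DownloadAsync();
            }
        }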

    Read the article

  • Playing with aspx page cycle using JustMock

    - by mehfuzh
    In this post , I will cover a test code that will mock the various elements needed to complete a HTTP page request and  assert the expected page cycle steps. To begin, i have a simple enumeration that has my predefined page steps: public enum PageStep {     PreInit,     Load,     PreRender,     UnLoad } Once doing so, i  first created the page object [not mocking]. Page page = new Page(); Here, our target is to fire up the page process through ProcessRequest call, now if we take a look inside the method with reflector.net,  the call trace will go like : ProcessRequest –> ProcessRequestWithNoAssert –> SetInstrinsics –> Finallly ProcessRequest. Inside SetInstrinsics ,  it requires calls from HttpRequest, HttpResponse and HttpBrowserCababilities. Using this clue at hand, we can easily know the classes / calls  we need to mock in order to get through the expected call. Accordingly, for  HttpBrowserCapabilities our required mock code will look like: var browser = Mock.Create<HttpBrowserCapabilities>(); // Arrange Mock.Arrange(() => browser.PreferredRenderingMime).Returns("text/html"); Mock.Arrange(() => browser.PreferredResponseEncoding).Returns("UTF-8"); Mock.Arrange(() => browser.PreferredRequestEncoding).Returns("UTF-8"); Now, HttpBrowserCapabilities is get though [Instance]HttpRequest.Browser. Therefore, we create the HttpRequest mock: var request = Mock.Create<HttpRequest>(); Then , add the required get call : Mock.Arrange(() => request.Browser).Returns(browser); As, [instance]Browser.PerferrredResponseEncoding and [instance]Browser.PreferredResponseEncoding  are also set to the request object and to make that they are set properly, we can add the following lines as well [not required though]. bool requestContentEncodingSet = false; Mock.ArrangeSet(() => request.ContentEncoding = Encoding.GetEncoding("UTF-8")).DoInstead(() =>  requestContentEncodingSet = true); Similarly, for response we can write:  var response = Mock.Create<HttpResponse>();    bool responseContentEncodingSet = false;  Mock.ArrangeSet(() => response.ContentEncoding = Encoding.GetEncoding("UTF-8")).DoInstead(() => responseContentEncodingSet = true); Finally , I created a mock of HttpContext and set the Request and Response properties that will returns the mocked version. 
    var context = Mock.Create<HttpContext>();   Mock.Arrange(() => context.Request).Returns(request); Mock.Arrange(() => context.Response).Returns(response); As Page internally calls the RenderControl method, we just need to replace that with our own, and optionally we can check whether it was invoked properly: bool rendered = false; Mock.Arrange(() => page.RenderControl(Arg.Any<HtmlTextWriter>())).DoInstead(() => rendered = true); That's it. The rest of the code is simple, where I asserted the page cycle with the PageSteps that I defined earlier: var pageSteps = new Queue<PageStep>();   page.PreInit +=delegate { pageSteps.Enqueue(PageStep.PreInit); }; page.Load += delegate { pageSteps.Enqueue(PageStep.Load); }; page.PreRender += delegate { pageSteps.Enqueue(PageStep.PreRender);}; page.Unload +=delegate { pageSteps.Enqueue(PageStep.UnLoad);};   page.ProcessRequest(context);   Assert.True(requestContentEncodingSet); Assert.True(responseContentEncodingSet); Assert.True(rendered);   Assert.Equal(pageSteps.Dequeue(), PageStep.PreInit); Assert.Equal(pageSteps.Dequeue(), PageStep.Load); Assert.Equal(pageSteps.Dequeue(), PageStep.PreRender); Assert.Equal(pageSteps.Dequeue(), PageStep.UnLoad);   Mock.Assert(request); Mock.Assert(response); You can get the test class shown in this post here to try it out yourself with, of course, JustMock :-). Enjoy!!

    Read the article

  • Firefox 4.0 Error [closed]

    - by Nik
    Possible Duplicate: Firefox crashes very often - Attempting to load the system libmoon Hi, I am running Ubuntu 10.10 and installed Firefox 4.0 beta 10 using the ppa:mozillateam/firefox-next PPA, by running the command sudo apt-get install firefox-4.0. However, whenever I try running it, it starts up and then crashes within 2 seconds without any warning. I also don't get the usual prompt asking whether to send the crash report to Firefox. So I tried running firefox-4.0 from the terminal and I get the following error message: Attempting to load the system libmoon Segmentation fault What does it mean? And what do I do to fix this error? I really want to try the latest Firefox beta on Ubuntu.

    Read the article

  • Goodbye XML&hellip; Hello YAML (part 2)

    - by Brian Genisio's House Of Bilz
    Part 1 After I explained my motivation for using YAML instead of XML for my data, I got a lot of people asking me what type of tooling is available in the .Net space for consuming YAML.  In this post, I will discuss a nice tooling option as well as describe some small modifications to leverage the extremely powerful dynamic capabilities of C# 4.0.  I will be referring to the following YAML file throughout this post Recipe: Title: Macaroni and Cheese Description: My favorite comfort food. Author: Brian Genisio TimeToPrepare: 30 Minutes Ingredients: - Name: Cheese Quantity: 3 Units: cups - Name: Macaroni Quantity: 16 Units: oz Steps: - Number: 1 Description: Cook the macaroni - Number: 2 Description: Melt the cheese - Number: 3 Description: Mix the cooked macaroni with the melted cheese Tooling It turns out that there are several implementations of YAML tools out there.  The neatest one, in my opinion, is YAML for .NET, Visual Studio and Powershell.  It includes a great editor plug-in for Visual Studio as well as YamlCore, which is a parsing engine for .Net.  It is in active development still, but it is certainly enough to get you going with YAML in .Net.  Start by referenceing YamlCore.dll, load your document, and you are on your way.  Here is an example of using the parser to get the title of the Recipe: var yaml = YamlLanguage.FileTo("Data.yaml") as Hashtable; var recipe = yaml["Recipe"] as Hashtable; var title = recipe["Title"] as string; In a similar way, you can access data in the Ingredients set: var yaml = YamlLanguage.FileTo("Data.yaml") as Hashtable; var recipe = yaml["Recipe"] as Hashtable; var ingredients = recipe["Ingredients"] as ArrayList; foreach (Hashtable ingredient in ingredients) { var name = ingredient["Name"] as string; } You may have noticed that YamlCore uses non-generic Hashtables and ArrayLists.  This is because YamlCore was designed to work in all .Net versions, including 1.0.  Everything in the parsed tree is one of two things: Hashtable, ArrayList or Value type (usually String).  This translates well to the YAML structure where everything is either a Map, a Set or a Value.  Taking it further Personally, I really dislike writing code like this.  Years ago, I promised myself to never write the words Hashtable or ArrayList in my .Net code again.  They are ugly, mostly depreciated collections that existed before we got generics in C# 2.0.  Now, especially that we have dynamic capabilities in C# 4.0, we can do a lot better than this.  With a relatively small amount of code, you can wrap the Hashtables and Array lists with a dynamic wrapper (wrapper code at the bottom of this post).  The same code can be re-written to look like this: dynamic doc = YamlDoc.Load("Data.yaml"); var title = doc.Recipe.Title; And dynamic doc = YamlDoc.Load("Data.yaml"); foreach (dynamic ingredient in doc.Recipe.Ingredients) { var name = ingredient.Name; } I significantly prefer this code over the previous.  That’s not all… the magic really happens when we take this concept into WPF.  With a single line of code, you can bind to the data dynamically in the view: DataContext = YamlDoc.Load("Data.yaml"); Then, your XAML is extremely straight-forward (Nothing else.  No static types, no adapter code.  
Nothing): <StackPanel> <TextBlock Text="{Binding Recipe.Title}" /> <TextBlock Text="{Binding Recipe.Description}" /> <TextBlock Text="{Binding Recipe.Author}" /> <TextBlock Text="{Binding Recipe.TimeToPrepare}" /> <TextBlock Text="Ingredients:" FontWeight="Bold" /> <ItemsControl ItemsSource="{Binding Recipe.Ingredients}" Margin="10,0,0,0"> <ItemsControl.ItemTemplate> <DataTemplate> <StackPanel Orientation="Horizontal"> <TextBlock Text="{Binding Quantity}" /> <TextBlock Text=" " /> <TextBlock Text="{Binding Units}" /> <TextBlock Text=" of " /> <TextBlock Text="{Binding Name}" /> </StackPanel> </DataTemplate> </ItemsControl.ItemTemplate> </ItemsControl> <TextBlock Text="Steps:" FontWeight="Bold" /> <ItemsControl ItemsSource="{Binding Recipe.Steps}" Margin="10,0,0,0"> <ItemsControl.ItemTemplate> <DataTemplate> <StackPanel Orientation="Horizontal"> <TextBlock Text="{Binding Number}" /> <TextBlock Text=": " /> <TextBlock Text="{Binding Description}" /> </StackPanel> </DataTemplate> </ItemsControl.ItemTemplate> </ItemsControl> </StackPanel> This nifty XAML binding trick only works in WPF, unfortunately. Silverlight handles binding differently, so it does not support binding to dynamic objects as of this writing (March 2010). This, in my opinion, is a major missing feature in Silverlight and I really hope we will see it in the Silverlight 4 release. (I am not very optimistic for Silverlight 4, but I can hope for the feature in Silverlight 5, can’t I?) Conclusion I still have a few things I want to say about using YAML in the .Net space, including de-serialization and using IronRuby for your YAML parser, but this post is hopefully enough to see how easy it is to incorporate YAML documents in your code. Codeplex Site for YAML tools Dynamic wrapper for YamlCore
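
    The wrapper code referenced above ("wrapper code at the bottom of this post") is not reproduced in this excerpt. Purely as an illustration of the idea, and not the author's original implementation, a minimal DynamicObject wrapper over YamlCore's Hashtable/ArrayList output might look like the sketch below; the YamlNode class name and YamlDoc.Load helper are assumptions chosen to match the calls shown above, and YamlLanguage.FileTo is the YamlCore parser entry point used in the earlier examples.

        using System.Collections;
        using System.Dynamic;

        public class YamlNode : DynamicObject
        {
            private readonly Hashtable _map;

            public YamlNode(Hashtable map)
            {
                _map = map;
            }

            // Invoked for member access such as doc.Recipe or doc.Recipe.Title.
            public override bool TryGetMember(GetMemberBinder binder, out object result)
            {
                // Returning true with a null result keeps missing keys from throwing.
                result = _map.ContainsKey(binder.Name) ? Wrap(_map[binder.Name]) : null;
                return true;
            }

            // Maps become nested YamlNodes, sets become ArrayLists of wrapped items,
            // and scalar values (usually strings) pass through unchanged.
            private static object Wrap(object value)
            {
                var map = value as Hashtable;
                if (map != null)
                    return new YamlNode(map);

                var list = value as ArrayList;
                if (list != null)
                {
                    var wrapped = new ArrayList();
                    foreach (var item in list)
                        wrapped.Add(Wrap(item));
                    return wrapped;
                }

                return value;
            }
        }

        public static class YamlDoc
        {
            // Mirrors the YamlDoc.Load(...) call used in the examples above.
            public static dynamic Load(string file)
            {
                return new YamlNode((Hashtable)YamlLanguage.FileTo(file));
            }
        }

    With a wrapper along these lines, the doc.Recipe.Title and DataContext examples above work unchanged; whether it matches the author's version exactly depends on details only his original post shows.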

    Read the article

  • Graphics module: Am I going the right way?

    - by Paul
    I'm trying to write the graphics module of my engine. That is, this part of the code only provides an interface through which to load images, fonts, etc., and draw them on the screen. It is also a wrapper for the library I'm using (SDL in this case). Here are the interfaces for my Image, Font and GraphicsRenderer classes. Please tell me if I'm going the right way. Image class Image { public: Image(); Image(const Image& other); Image(const char* file); ~Image(); bool load(const char* file); void free(); bool isLoaded() const; Image& operator=(const Image& other); private: friend class GraphicsRenderer; void* data_; }; Font class Font { public: Font(); Font(const Font& other); Font(const char* file, int ptsize); ~Font(); void load(const char* file, int ptsize); void free(); bool isLoaded() const; Font& operator=(const Font& other); private: friend class GraphicsRenderer; void* data_; }; GraphicsRenderer class GraphicsRenderer { public: static GraphicsRenderer* Instance(); void blitImage(const Image& img, int x, int y); void blitText(const char* string, const Font& font, int x, int y); void render(); protected: GraphicsRenderer(); GraphicsRenderer(const GraphicsRenderer& other); GraphicsRenderer& operator=(const GraphicsRenderer& other); ~GraphicsRenderer(); private: void* screen_; bool initialize(); void finalize(); };

    Read the article

  • XNA extending an existing Content type

    - by Maarten
    We are doing a game in XNA that reacts to music. We need to do some offline processing of the music data and therefore we need a custom type containing the Song and some additional data: // Project AudioGameLibrary namespace AudioGameLibrary { public class GameTrack { public Song Song; public string Extra; } } We've added a Content Pipeline extension: // Project GameTrackProcessor namespace GameTrackProcessor { [ContentSerializerRuntimeType("AudioGameLibrary.GameTrack, AudioGameLibrary")] public class GameTrackContent { public SongContent SongContent; public string Extra; } [ContentProcessor(DisplayName = "GameTrack Processor")] public class GameTrackProcessor : ContentProcessor<AudioContent, GameTrackContent> { public GameTrackProcessor(){} public override GameTrackContent Process(AudioContent input, ContentProcessorContext context) { return new GameTrackContent() { SongContent = new SongProcessor().Process(input, context), Extra = "Some extra data" // Here we can do our processing on 'input' }; } } } Both the Library and the Pipeline extension are added to the Game Solution and references are also added. When trying to use this extension to load "gametrack.mp3", however, we run into problems: // Project AudioGame protected override void LoadContent() { AudioGameLibrary.GameTrack gameTrack = Content.Load<AudioGameLibrary.GameTrack>("gametrack"); MediaPlayer.Play(gameTrack.Song); } The error message: Error loading "gametrack". File contains Microsoft.Xna.Framework.Media.Song but trying to load as AudioGameLibrary.GameTrack. AudioGame contains references to both AudioGameLibrary and GameTrackProcessor. Are we maybe missing other references? EDIT Selecting the correct content processor helped; it now loads the audio file correctly. However, when I try to process some data, e.g.: public override GameTrackContent Process(AudioContent input, ContentProcessorContext context) { int count = input.Data.Count; // With this commented out it works fine return new GameTrackContent() { SongContent = new SongProcessor().Process(input, context) }; } It crashes with the following error: Managed Debugging Assistant 'PInvokeStackImbalance' has detected a problem in 'C:\Users\Maarten\Documents\Visual Studio 2010\Projects\AudioGame\DebugPipeline\bin\Debug\DebugPipeline.exe'. Additional Information: A call to PInvoke function 'Microsoft.Xna.Framework.Content.Pipeline!Microsoft.Xna.Framework.Content.Pipeline.UnsafeNativeMethods+AudioHelper::OpenAudioFile' has unbalanced the stack. This is likely because the managed PInvoke signature does not match the unmanaged target signature. Check that the calling convention and parameters of the PInvoke signature match the target unmanaged signature. Information from the logger right before the crash: Using "BuildContent" task from assembly "Microsoft.Xna.Framework.Content.Pipeline, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553". Task "BuildContent" Building gametrack.mp3 -> bin\x86\Debug\Content\gametrack.xnb Rebuilding because asset is new Importing gametrack.mp3 with Microsoft.Xna.Framework.Content.Pipeline.Mp3Importer I'm experiencing exactly this: http://forums.create.msdn.com/forums/t/75996.aspx

    Read the article

  • How do you keep code with continuations/callbacks readable?

    - by Heinzi
    Summary: Are there some well-established best-practice patterns that I can follow to keep my code readable in spite of using asynchronous code and callbacks? I'm using a JavaScript library that does a lot of stuff asynchronously and heavily relies on callbacks. It seems that writing a simple "load A, load B, ..." method becomes quite complicated and hard to follow using this pattern. Let me give a (contrived) example. Let's say I want to load a bunch of images (asynchronously) from a remote web server. In C#/async, I'd write something like this: disableStartButton(); foreach (myData in myRepository) { var result = await LoadImageAsync("http://my/server/GetImage?" + myData.Id); if (result.Success) { myData.Image = result.Data; } else { write("error loading Image " + myData.Id); return; } } write("success"); enableStartButton(); The code layout follows the "flow of events": First, the start button is disabled, then the images are loaded (await ensures that the UI stays responsive) and then the start button is enabled again. In JavaScript, using callbacks, I came up with this: disableStartButton(); var count = myRepository.length; function loadImage(i) { if (i >= count) { write("success"); enableStartButton(); return; } myData = myRepository[i]; LoadImageAsync("http://my/server/GetImage?" + myData.Id, function(success, data) { if (success) { myData.Image = data; } else { write("error loading image " + myData.Id); return; } loadImage(i+1); } ); } loadImage(0); I think the drawbacks are obvious: I had to rework the loop into a recursive call, the code that's supposed to be executed in the end is somewhere in the middle of the function, the code starting the download (loadImage(0)) is at the very bottom, and it's generally much harder to read and follow. It's ugly and I don't like it. I'm sure that I'm not the first one to encounter this problem, so my question is: Are there some well-established best-practice patterns that I can follow to keep my code readable in spite of using asynchronous code and callbacks?

    Read the article

  • HPCM 11.1.2.2.x - How to find data in an HPCM Standard Costing database

    - by Jane Story
    When working with a Hyperion Profitability and Cost Management (HPCM) Standard Costing application, there can often be a requirement to check data or allocated results using reporting tools, e.g. Smartview. To do this, you are retrieving data directly from the Essbase databases related to your HPCM model. For information, running reports is covered in Chapter 9 of the HPCM User documentation. The aim of this blog is to provide a quick guide to finding this data for reporting in the HPCM generated Essbase database in v11.1.2.2.x of HPCM. In order to retrieve data from an HPCM generated Essbase database, it is important to understand each of the following dimensions in the Essbase database and where data is located within them: Measures dimension – identifies Measures AllocationType dimension – identifies Direct Allocation Data or Genealogy Allocation data Point Of View (POV) dimensions – there must be at least one, maximum of four. Business dimensions: Stage Business dimensions – these will be identified by the Stage prefix. Intra-Stage dimension – these will be identified by the _Intra suffix. Essbase outlines and reporting are explained in the documentation here: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/ch09s02.html For additional details on reporting measures, please review this section of the documentation: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/apas03.html Reporting requirements in HPCM quite often start with identifying non-balanced items in the Stage Balancing report. The following documentation link provides help with identifying some of the items within the Stage Balancing report: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/generatestagebalancing.html The following are some types of data upon which you may want to report: Stage Data: Direct Input Assigned Input Data Assigned Output Data Idle Cost/Revenue Unassigned Cost/Revenue Over Driven Cost/Revenue Direct Allocation Data Genealogy Allocation Data Stage Data Stage Data consists of: Direct Input i.e. input data, the starting point of your allocation e.g. in Stage 1 Assigned Input Data i.e. the cost/revenue received from a prior stage (i.e. stage 2 and higher). Assigned Output Data i.e. for each stage, the data that will be assigned forward is assigned post stage data. Reporting on this data is explained in the documentation here: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/ch09s03.html Dimension Selection Measures Direct Input: CostInput RevenueInput Assigned Input (from previous stages): CostReceivedPriorStage RevenueReceivedPriorStage Assigned Output (to subsequent stages): CostAssignedPostStage RevenueAssignedPostStage AllocationType DirectAllocation POV One member from each POV dimension Stage Business Dimensions Any members for the stage business dimensions for the stage you wish to see the Stage data for. All other Dimensions NoMember Idle/Unassigned/OverDriven To view Idle, Unassigned or Overdriven Costs/Revenue, first select the stage for which you want to view this data.
If multiple Stages have unassigned/idle, resolve the earliest first and re-run the calculation, as differences in early stages will create unassigned/idle in later stages. Dimension Selection Measures Idle: IdleCost IdleRevenue Unassigned: UnAssignedCost UnAssignedRevenue Overdriven: OverDrivenCost OverDrivenRevenue AllocationType DirectAllocation POV One member from each POV dimension Dimensions in the Stage with Unassigned/Idle/OverDriven Cost All the Stage Business dimensions in the Stage with Unassigned/Idle/Overdriven. Zoom in on each dimension to find the individual members that have Unassigned/Idle/OverDriven data. All other Dimensions NoMember Direct Allocation Data Direct allocation data shows the data received by a destination intersection from a source intersection where a direct assignment(s) exists. Reporting on direct allocation data is explained in the documentation here: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/ch09s04.html You would select the following to report direct allocation data: Dimension Selection Measures CostReceivedPriorStage AllocationType DirectAllocation POV One member from each POV dimension Stage Business Dimensions Any members for the SOURCE stage business dimensions and the DESTINATION stage business dimensions for the direct allocations for the stage you wish to report on. All other Dimensions NoMember Genealogy Allocation Data Genealogy allocation data shows the indirect data relationships between stages. Genealogy calculations run in the HPCM Reporting database only. Reporting on genealogy data is explained in the documentation here: http://docs.oracle.com/cd/E17236_01/epm.1112/hpm_user/ch09s05.html Dimension Selection Measures CostReceivedPriorStage AllocationType GenealogyAllocation (IndirectAllocation in 11.1.2.1 and prior versions) POV One member from each POV dimension Stage Business Dimensions Any stage business dimension members from the STARTING stage in Genealogy Any stage business dimension members from the INTERMEDIATE stage(s) in Genealogy Any stage business dimension members from the ENDING stage in Genealogy All other Dimensions NoMember Notes If you still don’t see data after checking the above, please check the following: Check that the calculation has been run. Here are a couple of indicators that might help with that: Note the size of the Essbase cube before and after calculations to ensure that a calculation was run against the database you are examining. Export the Essbase data to a text file to confirm that some data exists. Examine the date and time in the task area to see when, if any, calculations were run and what choices were used (e.g. Genealogy choices). If data does not exist in places where you expect it, it could be that: no calculations/genealogy were run; no calculations were successfully run; or the model/data at the feeder location were either absent or incompatible, resulting in no allocation, e.g. no driver data. Smartview Invocation from HPCM From version 11.1.2.2.350 of HPCM (this version will be GA shortly), it is possible to directly invoke Smartview from HPCM. There is guided navigation before the Smartview invocation and it is then possible to see the selected value(s) in SmartView. Click to Download HPCM 11.1.2.2.x - How to find data in an HPCM Standard Costing database (Right click or option-click the link and choose "Save As..." to download this PDF file)

    Read the article

  • WidgetBlock Speeds Up Browsing by Removing Social Media Widgets

    - by Jason Fitzpatrick
    Chrome: If you’re tired of web pages cluttered with social media buttons, WidgetBlock bans the buttons and speeds up the load time of web pages in the process. Even on a snappy internet connection you’ve likely noticed, thanks to the deluge of social media buttons loading in the background, a noticeable lag on popular web sites. WidgetBlock blocks widgets from loading (just like popular ad blocking software blocks ads from loading). The above screenshot, taken from a popular media site, shows just how much screen real estate is taken up by social media widgets. Installing WidgetBlock banishes the social media widgets and speeds load time. Hit up the link below to grab a free copy. WidgetBlock [Chrome Web Store]

    Read the article

  • E_INVALIDARG: An invalid parameter was passed to the returning function (-2147024809) when loading a cube texture

    - by Boreal
    I'm trying to implement a skybox into my engine, and I'm having some trouble loading the image as a cube map. Everything works (but it doesn't look right) if I don't load using an ImageLoadInformation struct in the ShaderResourceView.FromFile() method, but it breaks if I do. I need to, of course, because I need to tell SlimDX to load it as a cubemap. How can I fix this? Here is my new loading code after the "fix": public static void LoadCubeTexture(string filename) { ImageLoadInformation loadInfo = ImageLoadInformation.FromDefaults(); loadInfo.OptionFlags = ResourceOptionFlags.TextureCube; textures.Add(filename, ShaderResourceView.FromFile(Graphics.device, "Resources/" + filename, loadInfo)); }
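
    Assuming the TextureCube load above succeeds, the resulting ShaderResourceView is what eventually gets bound to the skybox effect before drawing. The snippet below is only a sketch under assumptions: the Effect instance and the "SkyCube" variable name are hypothetical and not taken from the poster's engine; only the SlimDX Direct3D11 effect-variable calls themselves are standard API.

        using SlimDX.Direct3D11;

        public static class SkyboxBinding
        {
            // Bind a previously loaded cube-map view (e.g. the entry added by
            // LoadCubeTexture above) to the skybox effect before rendering.
            public static void Apply(Effect skyEffect, ShaderResourceView cubeView)
            {
                skyEffect.GetVariableByName("SkyCube").AsResource().SetResource(cubeView);
            }
        }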

    Read the article

  • Should I group all of my .js files into one large bundle?

    - by Scottie
    One of the difficulties I'm running into with my current project is that the previous developer spaghetti'd the JavaScript code across lots of different files. We have modal dialogs that are reused in different places, and I find that the same .js file is often loaded twice. My thinking is that I'd like to just load all of the .js files in _Layout.cshtml, so that I know each one is loaded once and only once. Also, the client should only have to download this file once as well. It should be cached and therefore shouldn't really be a performance hit, except for the first page load. I should probably note that I am using ASP.NET bundling as well and loading most of the jQuery/Bootstrap/etc. from CDNs. Is there anything else that I'm not thinking of that would cause problems here? Should I bundle everything into a single file?
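
    Since ASP.NET bundling is already in play, one common arrangement for the situation described above is a single site-wide ScriptBundle registered once and rendered from _Layout.cshtml, with the CDN-hosted libraries left out of it. The sketch below is illustrative only; the bundle name and file paths are assumptions, not taken from the project in question.

        using System.Web.Optimization;

        public class BundleConfig
        {
            public static void RegisterBundles(BundleCollection bundles)
            {
                // One bundle for all shared scripts, including the reusable modal
                // dialog code, so each file is combined, minified and cached once.
                bundles.Add(new ScriptBundle("~/bundles/site").Include(
                    "~/Scripts/modal-dialogs.js",
                    "~/Scripts/site.js"));

                // Combine and minify even with debug="true", to mirror production.
                BundleTable.EnableOptimizations = true;
            }
        }

    _Layout.cshtml would then emit the bundle once with @Scripts.Render("~/bundles/site"), while jQuery and Bootstrap keep coming from CDNs as described above.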

    Read the article

  • CodePlex Daily Summary for Sunday, March 18, 2012

    CodePlex Daily Summary for Sunday, March 18, 2012Popular ReleasesRiP-Ripper & PG-Ripper: RiP-Ripper 2.9.28: changes NEW: Added Support for "PixHub.eu" linksSmartNet: V1.0.0.0: DY SmartNet ?????? V1.0callisto: callisto 2.0.21: Added an option to disable local host detection.Javascript .NET: Javascript .NET v0.6: Upgraded to the latest stable branch of v8 (/tags/3.9.18), and switched to using their scons build system. We no longer include v8 source code as part of this project's source code. Simultaneous multithreaded use of v8 now supported (v8 Isolates), although different contexts may not share objects or call each other. 64-bit .Net 4.0 DLL now included. (Download now includes x86 and x64 for both .Net 3.5 and .Net 4.0.)MyRouter (Virtual WiFi Router): MyRouter 1.0.6: This release should be more stable there were a few bug fixes including the x64 issue as well as an error popping up when MyRouter started this was caused by a NULL valuePulse: Pulse Beta 4: This version is still in development but should include: Logging and error handling have been greatly improved. If you run into an error or Pulse crashes make sure to check the Log folder for a recently modified log file so you can report the details of the issue A bunch of new features for the Wallbase.cc provider. Cleaner separation between inputs, downloading and output. Input and downloading are fairly clean now but outputs are still mixed up in the mix which I'm trying to resolve ...Google Books Downloader for Windows: Google Books Downloader-2.0.0.0.: Google Books DownloaderFinestra Virtual Desktops: 2.5.4501: This is a very minor update release. Please see the information about the 2.5 and 2.5.4500 releases for more information on recent changes. This update did not even have an automatic update triggered for it. Adds error checking and reporting to all threads, not only those with message loopsAcDown????? - Anime&Comic Downloader: AcDown????? v3.9.2: ?? ●AcDown??????????、??、??????,????1M,????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。??????AcPlay?????,??????、????????????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? ??????????????,??????????: ??"AcDo...ArcGIS Editor for OpenStreetMap: ArcGIS Editor for OSM 2.0 Release Candidate: Your feedback is welcome - and this is your last chance to get your fixes in for this version! Includes installer for both Feature Server extension and Desktop extension, enhanced functionality for the Desktop tools, and enhanced built-in Javascript Editor for the Feature Server component. This release candidate includes fixes to beta 4 that accommodate domain users for setting up the Server Component, and fixes for reporting/uploading references tracked in the revision table. See Code In-P...C.B.R. : Comic Book Reader: CBR 0.6: 20 Issue trackers are closed and a lot of bugs too Localize view is now MVVM and delete is working. Added the unused flag (take care that it goes to true only when displaying screen elements) Backstage - new input/output format choice control for the conversion Backstage - Add display, behaviour and register file type options in the extended options dialog Explorer list view has been transformed to a custom control. 
New group header, colunms order and size are saved Single insta...Windows Azure Toolkit for Windows 8: Windows Azure Toolkit for Windows 8 Consumer Prv: Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.1Minor updates to setup experience: Check for WebPI before install Dependency Check updated to support the following VS 11 and VS 2010 SKUs Ultimate, Premium, Professional and Express Certs Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.0 Please download this for Windows Azure Toolkit for Windows 8 functionality on Windows 8 Consumer Preview. The core features of the toolkit include:...Facebook Graph Toolkit: Facebook Graph Toolkit 3.0: ships with JSON Toolkit v3.0, offering parse speed up to 10 times of last version supports Facebook's new auth dialog supports new extend access token endpoint new example Page Tab app filter Graph Api connections using dates fixed bugs in Page Tab appsCODE Framework: 4.0.20312.0: This version includes significant improvements in the WPF system (and the WPF MVVM/MVC system). This includes new styles for Metro controls and layouts. Improved color handling. It also includes an improved theme/style swapping engine down to active (open) views. There also are various other enhancements and small fixes throughout the entire framework.ScintillaNET: ScintillaNET 2.4: 3/12/2012 Jacob Slusser Added support for annotations. Issues Fixed with this Release Issue # Title 25012 25012 25018 25018 25023 25023 25014 25014 Visual Studio ALM Quick Reference Guidance: v3 - For Visual Studio 11: RELEASE README Welcome to the BETA release of the Quick Reference Guide preview As this is a BETA release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these BETA artifacts in a production environment. Quality-Bar Details Documentation has been reviewed by Visual Studio ALM Rangers Documentation has not been through an independent technical review Documentation ...AvalonDock: AvalonDock 2.0.0345: Welcome to early alpha release of AvalonDock 2.0 I've completely rewritten AvalonDock in order to take full advantage of the MVVM pattern. New version also boost a lot of new features: 1) Deep separation between model and layout. 2) Full WPF binding support thanks to unified logical tree between main docking manager, auto-hide windows and floating windows. 3) Support for Aero semi-maximized windows feature. 4) Support for multiple panes in the same floating windows. For a short list of new f...Windows Azure PowerShell Cmdlets: Windows Azure PowerShell Cmdlets 2.2.2: Changes Added Start Menu Item for Easy Startup Added Link to Getting Started Document Added Ability to Persist Subscription Data to Disk Fixed Get-Deployment to not throw on empty slot Simplified numerous default values for cmdlets Breaking Changes: -SubscriptionName is now mandatory in Set-Subscription. -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added ...IronPython: 2.7.2.1: On behalf of the IronPython team, I'm happy to announce the final release IronPython 2.7.2. This release includes everything from IronPython 54498 and 62475 as well. Like all IronPython 2.7-series releases, .NET 4 is required to install it. Installing this release will replace any existing IronPython 2.7-series installation. 
Unlike previous releases, the assemblies for all supported platforms are included in the installer as well as the zip package, in the "Platforms" directory. IronPython 2...Kooboo CMS: Kooboo CMS 3.2.0.0: Breaking changes: When upgrade from previous versions, MUST reset the all the content type templates, otherwise the content manager might get a compile error. New features Integrate with Windows azure. See: http://wiki.kooboo.com/?wiki=Kooboo CMS on Azure Complete solution to deploy on load balance servers. See: http://wiki.kooboo.com/?wiki=Kooboo CMS load balance Update Jquery and Jquery ui to the lastest version(Jquery 1.71, Jquery UI 1.8.16). Tree style text content editing. See:h...New Projects4sr4ss1: Assignment 1 WDT Due date: 26 March 2012ADLN: Project to ADLNbook2guest: book2guestBundlingTweaks: BundlingTweaks makes it easier for developers to develop ASP.NET MVC 4 projects with Bundling/Minification support. You can now take advantage of Bundling, but still see non-minified JavaScript when debugging.COBOL SCREENSAVER: This project will show how to create a Windows screen saver using 1) an object-oriented isCOBOL GUI application, 2) an OpenCobol Windows native executable wrapper for the isCOBOL GUI, and 3) XML, XSLT, HTML5, and a Java FX 2.0 web browser component. Release 0.1 consists of a Java GUI application, the OpenCobol wrapper, and a README file detailing usage, system requirements, and installation of the recommended OpenCobol binaries for Windows.Colorado Time System Get CTS Results to Text File: This application will query a Colorado Time System 5, retrieve the lane results by event and heat. These can then be written to a text file.CommonEventLog: This project provides a simple means of logging messages and exceptions to a custom event log.ContainerTrack: Application for tracking container locationContent Widgets Orchard module: This Orchard module makes it possible to add arbitrary widgets to content types (with the option to disable per item).Cup of Badminton: This is a application for draw cup of badminton. You can insert, edit and delete player. You can choose many variant of draw.FSGREE: FSGREEKeepSafety: ??、??、?????????LibGrafx: LibGrafix is a C++ DLL that contains several classes useful when developing computer graphics applications, particularly OpenGL applications, in the Windows environment. It also includes a class that loads and displays the ModelX format exported by the XNA Model Viewer program. Linq to SQL code generator for Windows Phone: Linq to SQL code generator for Windows Phone is an add-in for Visual Studio 2010 or later to generate the Linq to SQL classes from a DBML diagram. Offline Navigation for Windows Phone 7: This is a simple implementation of offline navigation for windows phone 7 with map offset adjustment support.PayrollSystemRedux: Payroll case study book Robert Martin (Agile PPP in C#)PhoneFlipMenu Control for Windows Phone: A control that mimics the behaviour of the Email app's reply/reply all/forward menu.QTscenarist: QTscenarist makes it easier for Synfig animation to create Synfig movies. You can write scenaries and create your movie with it. It's developed in QT. recycle: This is a recycle web pageSharePoint 2010 Accordian Launch: SharePoint 2010 and SharePoint Foundation Quick Launch Accordian Solution.SharePoint 2010 JavaScript Registration Panel: The SharePoint 2010 JavaScript Registration Panel allows you to add JavaScript files to a site collection without SharePoint 2010 Designer intervention. 
This is helpful in scenarios where SharePoint Designer 2010 is disabled globally, but you have to add some JavaScript files (e.g. jQuery) to your site collection for every page load. The aim of this project is to provide following features: - load JavaScript files on every page load of your site collection - define the sequence for loadi...Sistema de Gestion Escolar: Projecto final monografico universidad dominicana O&MSmartNet: DYSmartNet????????????????????????。???TCP??,??????????,??????????... solution: solution team explorerTanmiaGrp: This is the tanmia base project library.VideoMobileSteraming: Will Update soonXNASter: XNASter is a short library that allows beginner developers to create simple XNA games. Right now it is in the very early stages of development. Included are the following parts: * 2D generic sprite-based object that supports movement, acceleration, etc. In the future, we are also planning to support: * KinectSkeleton controller Together with the project, there are also a few sample games that show how to use it.???WP: HateCoWP is Hatena Coco Client.???: ????????????????????。

    Read the article

< Previous Page | 153 154 155 156 157 158 159 160 161 162 163 164  | Next Page >