Search Results

Search found 2011 results on 81 pages for 'raw'.


  • free Raw-File Converter/Editor

    - by RCIX
    I have RAW files output by a program with a specific set of properties (Photoshop RAW, 16 bits, IBM PC byte order, no header, 1 non-interleaved channel, variable sizes like 257x257 or 129x513). Does anyone know of a free tool that will let me convert to and from this format, and possibly do basic editing (selection, copy/paste, rotation of a selection)? I've tried Picasa, XnView, and Paint Shop Pro 7 and none of them work properly. The closest I get is Paint Shop Pro, which at least makes a serviceable attempt to open these files, but I can't set all of the proper settings. XnView just might be able to edit them if I can figure out how to change the open settings for a particular raw file. So my questions at the moment are: how do I tell XnView to open a raw file a particular way? Failing that, is there any free tool (other than Photoshop) that can open Photoshop RAW files with the above settings? If it helps, I'm trying to import/export/edit heightmap data for Supreme Commander maps.
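
    For reference, a headerless 16-bit little-endian raw file like this can be read and written with a short NumPy sketch along these lines; the file name, the 257x257 size and the unsigned sample type are assumptions based on the description above (the data could just as well be signed, i.e. "<i2"):

        import numpy as np

        # a minimal sketch, assuming unsigned 16-bit little-endian ("IBM PC byte order"),
        # no header, one channel, and a known size such as 257x257
        width, height = 257, 257
        heightmap = np.fromfile("map.raw", dtype="<u2").reshape(height, width)

        # basic edits are plain array operations, e.g. rotate a square region and paste it back
        region = np.rot90(heightmap[0:64, 0:64])
        heightmap[0:64, 0:64] = region

        heightmap.astype("<u2").tofile("map_edited.raw")   # write back in the same raw format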

    Read the article

  • Create "raw disk file" from WIM file

    - by Joe Baltimore
    First timer here. I've searched around here, but haven't found a question like the one I have. Apologies if I missed it. The challenge at hand: produce a "raw disk image file" from a given WIM file. What I am pursuing so far is to use imagex.exe with the "/apply" operation to take the WIM and lay it down in a directory on a server. That seems to produce all the necessary "stuff" I need in that directory. How would I take that content and produce a "raw disk image file"? I'm told the definition of "raw disk image file" is a block-by-block copy of the disk image, which I hope is the output of the "imagex.exe /apply" command I use currently, but stored in a single file I can hand back to another system in our solution. The command I use currently is:

        imagex.exe /apply image.wim 1 R:\WimImagePoint

    I would like to take the contents of R:\WimImagePoint and produce the elusive (to me) "raw disk image file". ISO is not what they want, nor is anything requiring WinPE. Any pointers? External utilities' references are welcome. Would like to avoid unmanaged code solutions as much as possible, but will entertain them if that's the only route. Also, I am not married to the idea of imagex /apply as the starting point, it's just the comfort zone so far.

    Read the article

  • Command line raw image processing tools in Linux?

    - by ???
    I'm wondering if there is any command to process raw images, for example:

        cat raw1.img | raw2jpg -w 640 -h 480 -pitch 1024 -pixelformat R8G8B8

    and more examples:

        cat raw1.img raw2.img >y-merge.img
        tr='transpose -pitch 1024 -depth 24'
        cat <(cat raw1.img | $tr) <(cat raw2.img | $tr) | transpose -pitch 480 >x-merge.img

    and something like this:

        cat gamebitmap.dat | ( w=`readint32` h=`readint32` raw2png -w $w -h $h -depth 24 -pixelformat R8G8B8 ) | png2svg -extractoutline -fuzzy -error 8 -smooth

    Seems a little tricky, but is it possible? Does ImageMagick support such raw formats?
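
    ImageMagick can read headerless raw data through its rgb:/gray: pseudo-formats as long as you pass -size and -depth explicitly (something like convert -size 640x480 -depth 8 rgb:raw1.img raw1.jpg). The same one-shot conversion can be sketched in Python with Pillow; the file names, the 640x480 size and the tightly packed R8G8B8 layout below are assumptions taken from the first example rather than anything the question guarantees:

        from PIL import Image

        # a minimal sketch, assuming a headerless 640x480 image with tightly packed
        # 8-bit-per-channel RGB pixels; a non-trivial row pitch would need a stride
        # argument to the "raw" decoder instead of the defaults used here
        with open("raw1.img", "rb") as f:
            data = f.read()

        img = Image.frombytes("RGB", (640, 480), data)   # the "raw" decoder is the default
        img.save("raw1.jpg")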

    Read the article

  • Software for handling camera RAW-files

    - by Eikern
    I use a digital SLR, as most other photographers do today, and have quickly realised that capturing images using camera RAW files is the way to go. Personally I use Adobe Lightroom to handle my photo library, but I know there is other software available, like Apple Aperture. These applications are quite hard to use for a novice, and are quite expensive too. I've often recommended that other photographers switch to camera RAW, but they won't do it because Windows can't handle it natively. Are there any free or cheaper applications out there that can do simple file handling and adjustments? Preferably so simple that my mom can do it. I know Nikon offers a codec that allows you to view NEF files natively inside Windows, but it still limits the uses of the file and slows the system down if the file is big. Does anybody know of a drag-and-drop application that converts camera RAW to JPG on the fly, in case I or someone else needs to upload an image to the web or use it inside a Word document? Thanks.
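
    For the drag-and-drop conversion part, a small script is one cheap option. A hedged sketch using the rawpy and imageio Python packages is shown below; the NEF file name is just an example, and wrapping this in a loop over sys.argv would give basic drag-onto-the-script behaviour on Windows:

        import rawpy
        import imageio

        # a minimal sketch: demosaic one camera RAW file (e.g. a Nikon NEF) with default
        # settings and save it as a JPG next to the original
        with rawpy.imread("DSC_0001.NEF") as raw:
            rgb = raw.postprocess()          # 8-bit RGB array with default processing
        imageio.imwrite("DSC_0001.jpg", rgb)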

    Read the article

  • Read video from dvd in RAW filesystem

    - by kpinhack
    Hello, I've got some DVDs that were recorded on a DVD recorder. I cannot read those DVDs on Windows PCs (I tried multiple PCs). The properties of the discs say that the filesystem of the disc is RAW, and I think that is the reason why Windows does not read the disc. If that is true, what can I do to read the DVDs? Or what else could be the problem? EDIT: The Windows PCs I tried play back regular DVDs without any problem, and the "RAW" DVDs recorded with the DVD recorder will not play on a standalone DVD player. Thank you!

    Read the article

  • checksum in raw sockets and pcap

    - by hero
    I am using the pcap library to sniff some packets, change their TCP data, and then inject my packet back onto the network. My question is: if I change the TCP data, should I recalculate the length field in the TCP header? Should I also change the checksum? I read on a page about how to create raw sockets that if you set the TCP checksum to 0, the kernel will automatically calculate it and fill it in; is this true for Windows machines as well?
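
    Worth noting when reading this: if the payload length changes, the IP total-length field and the TCP checksum both have to be updated (and the IP header checksum too, since the IP header changed). The TCP checksum itself is the standard ones'-complement sum over a pseudo-header plus the segment. A minimal Python sketch of that calculation, per the RFC 793 layout and not tied to any particular pcap wrapper, looks like this:

        import struct

        def ones_complement_sum(data: bytes) -> int:
            """Sum 16-bit big-endian words, folding carries back into the low 16 bits."""
            if len(data) % 2:
                data += b"\x00"              # pad odd-length data with a zero byte
            total = 0
            for (word,) in struct.iter_unpack("!H", data):
                total += word
                total = (total & 0xFFFF) + (total >> 16)
            return total

        def tcp_checksum(src_ip: bytes, dst_ip: bytes, tcp_segment: bytes) -> int:
            """src_ip/dst_ip are 4-byte IPv4 addresses; tcp_segment must have its checksum field zeroed."""
            pseudo = src_ip + dst_ip + struct.pack("!BBH", 0, 6, len(tcp_segment))
            return (~ones_complement_sum(pseudo + tcp_segment)) & 0xFFFF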

    Read the article

  • Windows 7 system drive says it is raw, but System Recovery starts without issues

    - by iulianchira
    I have been running Windows 7 RC1 since it was available a couple of months ago and had no issues whatsoever until today. When I start my laptop, Windows does not boot; instead Windows System Recovery starts. I've used diskpart to list the partitions on the drive and my system partition (C:) has a RAW filesystem. I really need to save all the data on the disk as fast as I can, and I would really like not to have to reinstall my system.

    Read the article

  • Raw Sockets on Android

    - by Tingo
    I want to create an application that runs on Android and uses Raw Sockets. I see there isn't any raw socket support in the java.net.* or the android.net.* libraries. Are raw sockets possible on Android?

    Read the article

  • Recover hard disk from Raw format

    - by user1632736
    I have been all over the web today with no results. My whole drive, where Windows resided, was encrypted with TrueCrypt. I decided to partition it to install Windows 8 and forgot it was encrypted, so the drive got damaged and became inaccessible. When connected to a computer it asks to be formatted. Somehow I mounted the drive through TrueCrypt on another computer and I could see and get all the files. Then I decided to decrypt the drive, thinking that everything would be back to normal. After decryption my drive is not NTFS; it is in RAW format. I am trying every possible way to recover it, and I am desperate enough to ask, lol. I have tried:

        ddrescue (Linux): not mountable, no signature, ntfsfix no good
        TestDisk (Linux and Windows): sees the partitions but can't do anything
        many recovery applications, etc.

    I read in different places that doing a quick format to NTFS and then doing a data recovery might help. I would definitely like a second opinion. Any suggestion would be really helpful.

    Read the article

  • casting raw strings python

    - by dave
    In Python, given a variable that holds a string, is there a quick way to cast it into another raw-string variable? The following code should illustrate what I'm after:

        def checkEqual(x, y):
            print True if x==y else False

        line1 = "hurr..\n..durr"
        line2 = r"hurr..\n..durr"
        line3 = "%r" % line1
        print "%s \n\n%s \n\n%s \n" % (line1, line2, line3)

        checkEqual(line2, line3)        # outputs False
        checkEqual(line2, line3[1:-1])  # outputs True

    The closest I have found so far is the %r formatting flag, which seems to return a raw string, albeit within single quote marks. Is there any easier way to do this, like a line3 = raw(line1) kind of thing?
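
    One thing worth keeping in mind when reading this: raw strings only exist at the source-code level, so at runtime the "cast" really means escaping the control characters in an ordinary string. A hedged sketch of that, written against Python 2 to match the print syntax above (Python 3 would use s.encode('unicode_escape').decode('ascii') instead):

        # escape control characters instead of "casting"; there is no raw() builtin
        line1 = "hurr..\n..durr"
        line3 = line1.encode('string_escape')   # Python 2 codec: '\n' becomes '\\n'
        print line3 == r"hurr..\n..durr"        # True, without %r's surrounding quote marks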

    Read the article

  • How to migrate existing udp application to raw sockets

    - by osgx
    Hello. Is there a tutorial for migrating from plain UDP sockets (Linux, C99/C++, the recv syscall is used) to raw sockets? According to http://aschauf.landshut.org/fh/linux/udp_vs_raw/ch03s04.html, a raw socket is much faster than UDP. The application is client-server; the client is proprietary and must use exactly the same protocol as it did with the UDP server, but the server can be a bit faster with raw sockets. What parts of UDP must I implement in the server? Are there any "quick migration" libraries?
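
    The short version of "what parts of UDP must I implement" is: everything the kernel's UDP code used to do for you, i.e. parsing the 8-byte UDP header, demultiplexing by destination port, and (optionally) verifying the checksum; on the send side you also build that header yourself. A rough sketch of the receive side, written in Python rather than C purely for brevity (the port number is illustrative, and raw sockets need root):

        import socket
        import struct

        PORT = 5000   # assumed server port; a raw socket sees all UDP packets on the host

        s = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_UDP)
        while True:
            packet, addr = s.recvfrom(65535)                          # includes the IP header on Linux
            ihl = (packet[0] & 0x0F) * 4                              # IP header length in bytes
            src_port, dst_port, udp_len, checksum = struct.unpack("!HHHH", packet[ihl:ihl + 8])
            if dst_port != PORT:
                continue                                              # port filtering is now your job
            payload = packet[ihl + 8:ihl + udp_len]
            print(addr[0], src_port, len(payload), "bytes")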

    Read the article

  • Virtualbox: Raw linux partition not booting

    - by abalter
    I have a dual-boot laptop with Windows 7 and Ubuntu 12.04. I am trying to boot the Ubuntu partition from Windows using VirtualBox. I have successfully created the .vmdk and created the virtual machine. However, I can't get it to boot (in VirtualBox); all I get is a black screen with the cursor in the top left. I wonder if I'm specifying the partitions correctly. My Ubuntu install has 3 partitions: /, /boot and /home (no swap partition). These are all on Disk 0, partitions 3, 4 and 5 respectively. The command I used to create the .vmdk is:

        VBoxManage internalcommands createrawvmdk -filename C:\Users\abalter\.virtualbox\ubuntu.vmdk -rawdisk \\.\PhysicalDrive0 -partitions 3,4,5

    Then I create a virtual machine based on that .vmdk. Why won't it boot?

    Read the article

  • Injecting raw TCP packets with Python

    - by Evgeniy Arbatov
    Hello! What would be a suitable way to inject a raw TCP packet with Python? For example, I have a payload consisting of hexadecimal numbers and I want to send that sequence of hexadecimal numbers to a network daemon, so that if I choose to send 'abcdef', I see 'abcdef' on the wire too, and not '616263646566' as in the case of:

        new = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        new.connect(('127.0.0.1', 9999))
        new.send('abcdef')

    Can I use Python's SOCK_RAW for this purpose? If so, can you give me an example of sending raw TCP packets with SOCK_RAW (since I did not get it working myself)? Thanks! Evgeniy
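
    A note on what is actually being asked here: the daemon saw '616263...' because send('abcdef') transmits the ASCII codes of those six characters. To put the literal bytes 0xAB 0xCD 0xEF into the TCP payload you only need to send binary data; SOCK_RAW (which typically requires root) is for forging the TCP/IP headers themselves. A minimal sketch, reusing the host and port from the snippet above:

        import socket

        payload = bytes.fromhex("abcdef")              # b'\xab\xcd\xef', three raw bytes
        with socket.create_connection(("127.0.0.1", 9999)) as s:
            s.sendall(payload)                         # these exact bytes appear as the TCP payload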

    Read the article

  • linux raw socket programming question

    - by user194420
    Hi all, I am trying to create a raw socket which sends and receives messages with IP/TCP headers under Linux. I can successfully bind to a port and receive a TCP message (i.e. a SYN). However, the message seems to be handled by the OS, not by me; I am just a reader of it (like Wireshark). My raw socket binds to port 8888, and then I try to telnet to that port. Wireshark shows that port 8888 replies with an "RST ACK" when it receives the "SYN" request. My program shows that it receives a new message, but it does not reply with any message. Is there any way to actually bind to that port (and prevent the OS from handling it)? Here is part of my code; I cut out the error checking for easier reading:

        sockfd = socket(AF_INET, SOCK_RAW, IPPROTO_TCP);
        int tmp = 1;
        const int *val = &tmp;
        setsockopt(sockfd, IPPROTO_IP, IP_HDRINCL, val, sizeof(tmp));
        servaddr.sin_family = AF_INET;
        servaddr.sin_addr.s_addr = htonl(INADDR_ANY);
        servaddr.sin_port = htons(8888);
        bind(sockfd, (struct sockaddr*)&servaddr, sizeof(servaddr));
        // call recv in loop

    Read the article

  • How to send raw XML in Python?

    - by davywahd
    Hi, I am trying to send raw XML to a service in Python. I have the address of the service, and my question is how I would wrap the XML in Python and send it to the service. The address is in the format below:

        192.1100.2.2:54239

    And say the XML is:

        <xml version="1.0" encoding="UTF-8"><header/><body><code><body/>

    Anyone know what to do?
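
    "Raw" here presumably just means writing the XML bytes straight to a TCP connection, with no HTTP or SOAP wrapper. A minimal sketch under that assumption; the host below is a placeholder (the address quoted in the question is not a valid IP), the port is taken from the question, and the XML is treated as an opaque string:

        import socket

        HOST, PORT = "192.0.2.2", 54239      # placeholder host; port from the question
        xml = '<xml version="1.0" encoding="UTF-8"><header/><body><code><body/>'

        with socket.create_connection((HOST, PORT)) as s:
            s.sendall(xml.encode("utf-8"))   # send the XML as one raw byte stream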

    Read the article

  • Marshal.StringToCoTaskMemAnsi converting non-Latin characters when sending raw data to a printer

    - by rem
    For sending raw data to a thermal DATAMAX printer I'm using the RawPrinterHelper class from this Microsoft KB article. When a string sent to the printer contains only Latin characters, everything is OK, but non-Latin characters (in my case Russian characters) in a string are not printed correctly. I think the problem is in using the Marshal.StringToCoTaskMemAnsi method for converting the string:

        public static bool SendStringToPrinter(string szPrinterName, string szString)
        {
            IntPtr pBytes;
            Int32 dwCount;
            // How many characters are in the string?
            dwCount = szString.Length;
            // Assume that the printer is expecting ANSI text, and then convert
            // the string to ANSI text.
            pBytes = Marshal.StringToCoTaskMemAnsi(szString);
            // Send the converted ANSI string to the printer.
            SendBytesToPrinter(szPrinterName, pBytes, dwCount);
            Marshal.FreeCoTaskMem(pBytes);
            return true;
        }

    Just to note, Russian characters in the string are put in hex format, like "\x83", but the method doesn't put this hex value in unmanaged memory as-is; it converts it, I think, according to the ANSI code page into a character, and then the printer cannot read it correctly. If I compose a file using a hex editor, put the correct hex values in place of the non-Latin characters, and then send the file to the printer using another method from the same class, SendFileToPrinter, everything, including the Russian characters, is printed correctly. How could the problem of sending a string containing non-Latin characters be solved in this case?

    Read the article

  • Raw types and subtyping

    - by Dmitrii
    We have a generic class:

        SomeClass<T>{ }

    We can write the line:

        SomeClass s = new SomeClass<String>();

    It's OK, because a raw type is a supertype of the generic type. But

        SomeClass<String> s = new SomeClass();

    is correct too. Why is it correct? I thought that type erasure happens before type checking, but that's wrong. From the Hacker's Guide to Javac: when the Java compiler is invoked with the default compile policy it performs the following passes:

        parse: Reads a set of *.java source files and maps the resulting token sequence into AST nodes.
        enter: Enters symbols for the definitions into the symbol table.
        process annotations: If requested, processes annotations found in the specified compilation units.
        attribute: Attributes the syntax trees. This step includes name resolution, type checking and constant folding.
        flow: Performs data flow analysis on the trees from the previous step. This includes checks for assignments and reachability.
        desugar: Rewrites the AST and translates away some syntactic sugar.
        generate: Generates source files or class files.

    Generics are syntactic sugar, hence type erasure happens in pass 6 (desugar), after type checking, which happens in pass 4 (attribute). I'm confused.

    Read the article

  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: with all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems and so performance differences are to be expected. But still I was curious to see how the various technologies performed relative to each other just for raw throughput of the request getting to the endpoint and back out to the client with as little processing in the actual endpoint logic as possible (aka Hello World!). I want to clarify that this is merely an informal test for my own curiosity and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements and because overall ASP.NET performs very well even for fairly high loads, so it's often not that critical to test load performance. This post is not meant to make a point or even come to a conclusion about which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub

    I looked at this data for these technologies:

        ASP.NET Web API
        ASP.NET MVC
        WebForms
        ASP.NET WebPages
        ASMX AJAX Services (couldn't get AJAX/JSON to run on IIS 8)
        WCF REST
        Raw ASP.NET HttpHandlers

    It's quite a mixed bag, of course, and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head scratcher as the results were sometimes surprising. What I describe here is to satisfy my curiosity more than anything, and I thought it interesting enough to discuss on the blog :-)

    First Test: Raw Throughput

    The first thing I did is test raw throughput for the various technologies. This is the least practical test of course, since you're unlikely to ever create the equivalent of a 'Hello World' request in a real life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this I created the simplest Hello World request that I could come up with for each tech.

    Http Handler

    The first is the lowest level approach, which is an HTTP handler:

        public class Handler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
            }
            public bool IsReusable { get { return true; } }
        }

    WebForms

    Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simply does this in CodeBehind without any markup in the ASPX page:

        public partial class HelloWorld_CodeBehind : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
                Response.End();
            }
        }

    while the Markup page only contains some static output via an expression:

        <%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %>
        Hello World. Time is <%= DateTime.Now %>

    ASP.NET WebPages

    WebPages is the freestanding Razor implementation of ASP.NET.
    Here's the simple HelloWorld.cshtml page:

        Hello World @DateTime.Now

    WCF REST

    WCF REST was the token REST implementation for ASP.NET before WebAPI, and the in-between step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class:

        [ServiceContract(Namespace = "")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class WcfService
        {
            [OperationContract]
            [WebGet]
            public Stream HelloWorld()
            {
                var data = Encoding.Unicode.GetBytes("Hello World" + DateTime.Now.ToString());
                var ms = new MemoryStream(data);
                // Add your operation implementation here
                return ms;
            }
        }

    WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client.

    ASP.NET AJAX (ASMX Services)

    I also wanted to test ASP.NET AJAX services because, prior to WebAPI, this is probably still the most widely used AJAX technology for the ASP.NET stack today. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012 removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried I always ended up getting XML results even when explicitly adding the ScriptHandler. So, I didn't test this (but the code is there - you might be able to test this on a Windows 7 box).

    ASP.NET MVC

    Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller:

        public class MvcPerformanceController : Controller
        {
            public ActionResult Index()
            {
                return View();
            }
            public ActionResult HelloWorldCode()
            {
                return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() };
            }
        }

    ASP.NET WebAPI

    Next up is WebAPI, which looks kind of similar to MVC, except here I have to use a StringContent result to return the response:

        public class WebApiPerformanceController : ApiController
        {
            [HttpGet]
            public HttpResponseMessage HelloWorldCode()
            {
                return new HttpResponseMessage()
                {
                    Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain")
                };
            }
        }

    Testing

    Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal since it was mainly to satisfy my curiosity - here's how I did it: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second, etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line looks like this:

        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld

    which hits the specified URL 100,000 times with a load factor of 20 concurrent requests. This results in output summarizing the run (requests per second, time per request, transfer rate and timing percentiles). It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of variance. You might also want to do an IISRESET to clear the web server.
    Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test with:

        100,000 iterations
        a load factor of 20 concurrent connections
        an IISRESET before starting
        a short warm-up run for API and MVC to make sure startup cost is mitigated

    Here is the batch file I used for the test:

        IISRESET
        REM make sure you add
        REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin
        REM to your path so ab.exe can be found
        REM Warm up
        ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson
        ab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson
        ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld
        ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt
        ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt

    I ran each of these tests 3 times and took the average score for requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests, but the values used here are the medians. Part of this has to do with the fact that I ran the tests on my local machine - results would probably be more consistent running the load test on a separate machine hitting across the network. I ran these tests locally on my laptop, which is a Dell XPS with a quad-core Sandy Bridge i7-2720QM @ 2.20GHz and a fast SSD drive on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you can try running these tests on a separate machine hitting the local machine. If I remember correctly IIS 7 and 8 on client OSs don't throttle, so the performance here should be representative.

    Results

    OK, let's cut straight to the chase. Below are the results from the tests… It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms, and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and WebAPI are a little slower, and the slowest by far is WCF REST (which again I find surprising). As mentioned at the start, the raw throughput tests are not overly practical as they don't test scripting performance for the HTML generation engines or serialization performance of the data engines. All they really do is give you an idea of the raw throughput for the technology, from the time of the request to reaching the endpoint and returning minimal text data back to the client, which indicates full round-trip performance. But it's still interesting to see that Web Forms performs better in throughput than either MVC, WebAPI or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on them, but that's beyond the scope of this throughput test. But what's also amazing about this test is the sheer amount of traffic that a laptop computer is handling.
    Even the slowest tech managed 5700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24 hour period. Remember these are not static pages, but dynamic requests that are being served.

    Another Test - JSON Data Service Results

    For the second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test since it doesn't make a ton of sense to return data from them (OTOH, returning text from the APIs didn't make a ton of sense either :-). In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this:

        public class Person
        {
            public Person()
            {
                Id = 10;
                Name = "Rick";
                Entered = DateTime.Now;
            }
            public int Id { get; set; }
            public string Name { get; set; }
            public DateTime Entered { get; set; }
        }

    Here are the updated handler classes that use Person:

    Handler

        public class Handler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                var action = context.Request.QueryString["action"];
                if (action == "json")
                    JsonRequest(context);
                else
                    TextRequest(context);
            }
            public void TextRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString());
            }
            public void JsonRequest(HttpContext context)
            {
                var json = JsonConvert.SerializeObject(new Person(), Formatting.None);
                context.Response.ContentType = "application/json";
                context.Response.Write(json);
            }
            public bool IsReusable { get { return true; } }
        }

    This code adds a little logic to check for an action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response.

    WCF REST

        [ServiceContract(Namespace = "")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class WcfService
        {
            [OperationContract]
            [WebGet]
            public Stream HelloWorld()
            {
                var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString());
                var ms = new MemoryStream(data);
                // Add your operation implementation here
                return ms;
            }
            [OperationContract]
            [WebGet(ResponseFormat=WebMessageFormat.Json,BodyStyle=WebMessageBodyStyle.WrappedRequest)]
            public Person HelloWorldJson()
            {
                // Add your operation implementation here
                return new Person();
            }
        }

    For WCF REST all I have to do is add a method with the Person result type.

    ASP.NET MVC

        public class MvcPerformanceController : Controller
        {
            //
            // GET: /MvcPerformance/
            public ActionResult Index()
            {
                return View();
            }
            public ActionResult HelloWorldCode()
            {
                return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() };
            }
            public JsonResult HelloWorldJson()
            {
                return Json(new Person(), JsonRequestBehavior.AllowGet);
            }
        }

    For MVC all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer.

    ASP.NET WebAPI

        public class WebApiPerformanceController : ApiController
        {
            [HttpGet]
            public HttpResponseMessage HelloWorldCode()
            {
                return new HttpResponseMessage()
                {
                    Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain")
                };
            }
            [HttpGet]
            public Person HelloWorldJson()
            {
                return new Person();
            }
            [HttpGet]
            public HttpResponseMessage HelloWorldJson2()
            {
                var response = new HttpResponseMessage(HttpStatusCode.OK);
                response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter);
                return response;
            }
        }

    Testing and Results

    To run these data requests I used the following ab.exe commands:

        REM JSON RESPONSES
        ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt
        ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt
        ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt

    The results from this test run are a bit interesting in that the WebAPI test improved performance significantly over returning plain string content. The performance for each technology drops a little bit, except for WebAPI, which is up quite a bit! From this test it appears that WebAPI actually performs significantly better returning a JSON response rather than a plain string response.

    Snag with Apache Benchmark and 'Length Failures'

    I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the graph shows, performance with JSON results improved significantly, from 5580 to 6530 or so, which is a 15% improvement (while all others slowed down by 3-8%). However, I was skeptical at first because the WebAPI test reports showed a bunch of errors on about 10% of the requests: notice the Failed Request count in the report. What the hey? Is WebAPI failing on roughly 10% of requests when sending JSON? Turns out: no, it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, and so to make sure I re-ran the test with Fiddler attached, running the ab.exe test with the -X switch:

        ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson

    which showed that indeed all requests were returning proper HTTP 200 results with full content. However, ab.exe was reporting the errors. After some closer inspection it turned out that the dates varying in size altered the response length in the dynamic output. For example, these two results:

        {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"}
        {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"}

    are different in length for the number, which results in 68 and 69 bytes respectively. The same URL produces different result lengths, which is what ab.exe reports. I didn't notice it at first, but the same thing happens when running the ASHX handler with the JSON.NET result, since it uses the same serializer that varies the milliseconds. Moral: you can typically ignore Length failures in Apache Benchmark, and when in doubt check the actual output with Fiddler. Note that the other failure values are accurate though.

    Another Interesting Side Note: Perf Drops over Time

    As I was running these tests repeatedly I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6500 req/sec and in subsequent runs it keeps dropping until it stabilizes somewhere around 5900 req/sec, occasionally jumping lower.
    For these tests this is why I did the IISRESET and warm-up for individual tests. This is a little puzzling. Looking at Process Monitor while the tests are running, memory very quickly levels out, as do handles and threads, on the first test run. On subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - Handlers, Web Forms, MVC, Web API - and I'm curious to see if others test this and see similar results. Doing an IISRESET then resets everything and performance starts off at peak again…

    Summary

    As I stated at the outset, these were informal tests to satiate my curiosity, not to prove that any technology is better or even faster than another. While there clearly are differences in performance, the differences (other than WCF REST, which was by far the slowest, and the raw handler, which was by far the highest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance; it's also about the adequacy for the job and the ease of implementation. The strengths of each technology will make up for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Oftentimes old stuff that's been optimized and designed for a time of less horsepower can utterly blow the doors off newer tech, and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0, which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those that caught my tweets late last week regarding WebAPI's slow responses: that was with string content, which is in fact considerably slower. Luckily, where it counts (with serialized JSON and XML), WebAPI actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: don't take a single test as the final gospel, and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediately that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff…

    Resources

        Source Code on GitHub
        Apache HTTP Server Project (ab.exe is part of the binary distribution)

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, Web API.

    Read the article

  • 'Photo editor' and 'RAW editor' in Shotwell

    - by Chris Wilson
    The preference menu in Shotwell allows the user to specify both an 'External photo editor' and an 'External RAW editor', but I'm confused as to why two external editors would be required. I'm not a photographer, so this confusion may simply be a result of my ignorance, but I thought RAW images were unprocessed photographs, in which case two editors would be kinda redundant. Am I simply missing one of the finer details of photograph processing?

    Read the article

  • How to convert from wav or mp3 to raw PCM [on hold]

    - by Komyg
    I am developing a game using Cocos2d-X and Marmalade SDK, and I am looking for any recommendations of programs that can convert audio files in mp3 or wav format to raw PCM 16 format. The problem is that I am using the SimpleAudioEngine class to play sounds in my game and in Marmalade it only supports files that are encoded as raw PCM 16. Unfortunately I've been having a very hard time finding a program that can do this type of conversion, so I am looking for a recommendation.
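
    For WAV input the conversion is essentially just dropping the RIFF header, since a standard WAV file already stores PCM samples. A minimal Python sketch under that assumption follows; the file names are placeholders, the input is assumed to already be 16-bit PCM WAV, and MP3 input would first need decoding with something like ffmpeg or Audacity:

        import wave

        with wave.open("sound.wav", "rb") as w:       # assumes a 16-bit PCM WAV input
            frames = w.readframes(w.getnframes())     # raw little-endian PCM sample data

        with open("sound.raw", "wb") as out:          # headerless raw PCM 16 output
            out.write(frames)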

    Read the article
