Search Results

Search found 87957 results on 3519 pages for 'system error'.

Page 658/3519 | < Previous Page | 654 655 656 657 658 659 660 661 662 663 664 665  | Next Page >

  • How to access Oracle system tables from inside of a PL/SQL function or procedure?

    - by mjumbewu
    I am trying to access information from an Oracle meta-data table from within a function. For example (purposefully simplified): CREATE OR REPLACE PROCEDURE MyProcedure IS users_datafile_path VARCHAR2(100); BEGIN SELECT file_name INTO users_datafile_path FROM dba_data_files WHERE tablespace_name='USERS' AND rownum=1; END MyProcedure; / When I try to execute this command in an sqlplus process, I get the following errors: LINE/COL ERROR -------- ----------------------------------------------------------------- 5/5 PL/SQL: SQL Statement ignored 6/12 PL/SQL: ORA-00942: table or view does not exist I know the user has access to the table, because when I execute the following command from the same sqlplus process, it displays the expected information: SELECT file_name FROM dba_data_files WHERE tablespace_name='USERS' AND rownum=1; Which results in: FILE_NAME -------------------------------------------------------------------------------- /usr/lib/oracle/xe/oradata/XE/users.dbf Is there something I need to do differently?

    Read the article

  • Why does the compiler give an ambiguous invocation error when passing inherited types?

    - by Matt Mitchell
    What is happening in the C# compiler to cause the following ambiguous invocation compilation error? The same issue applies to extension methods, or when TestClass is generic and using instance rather than static methods. class Type1 { } class Type2 : Type1 {} class TestClass { public static void Do<T>(T something, object o) where T : Type1 {} public static void Do(Type1 something, string o) {} } void Main() { var firstInstance = new Type1(); TestClass.Do(firstInstance, new object()); // Calls Do(Type1, obj) TestClass.Do(firstInstance, "Test"); // Calls Do<T>(T, string) var secondInstance = new Type2(); TestClass.Do(secondInstance, new object()); // Calls Do(Type1, obj) TestClass.Do(secondInstance, "Test"); // "The call is ambiguous" compile error }
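
    A minimal sketch (not part of the original question) of two ways the last call can be disambiguated, assuming the Type1/Type2/TestClass definitions above; the Demo wrapper is only a hypothetical harness for illustration:
      class Type1 { }
      class Type2 : Type1 { }

      class TestClass
      {
          public static void Do<T>(T something, object o) where T : Type1 { }
          public static void Do(Type1 something, string o) { }
      }

      class Demo
      {
          static void Main()
          {
              var secondInstance = new Type2();

              // Option 1: name the type argument so only the generic overload is considered.
              TestClass.Do<Type2>(secondInstance, "Test");

              // Option 2: widen the first argument to Type1 so the non-generic overload,
              // which is better on the string parameter, wins overload resolution.
              TestClass.Do((Type1)secondInstance, "Test");
          }
      }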

    Read the article

  • How do I get transparent, efficient, file system snapshotting or versioning on ext3/4?

    - by shovas
    I've long thought about versioning file systems. This is a killer feature and I've looked at Wayback, ext3cow, zfs, fuse solutions, or just cvs/svn/git overlays. I consider ext3cow the model for my requirements. Transparent, efficient, but I can do without the extra ls abc@timestamp feature. As long as I somehow get automated, transparent versioning of my files. It could be instantaneous or it could be based on snapshots on intervals of 10s, 30s, 1m, 5m, 15m, etc. Just something that will efficiently deal with thousands of files in a given directory all of various sizes, most small, but some upwards of 100m to 1gb. ZFS isn't really an option as I'm on linux (and would prefer not to use it through fuse as I already have an ext3 setup I want to version, not something new). What solutions are out there?

    Read the article

  • Error about TypeError (wrong argument type Module (expected Class)): app/controllers/player_profiles_controller.rb:1:in `<top (required)>'

    - by edi susanto
    Hi guys, I'm new at this, so sorry for any wording that's hard to understand and for the easy question. I'd like to ask about the error shown below: TypeError (wrong argument type Module (expected Class)): app/controllers/player_profiles_controller.rb:1:in `' I want to test the result by rendering JSON in soapUI. Does anyone know what the problem is that makes this error show up? Thanks in advance. Regards, edy

    Read the article

  • Are there actually lag times to remove an email address from "the system"? [closed]

    - by Alex Gosselin
    For example, you send an unsubscribe message to a legitimate company or a spam, they reply that they will remove you and it may take up to 72 hours to take effect. I find it hard to believe anything that simple could take more than 3/4 of a second to take effect system wide. Another example would be when you call the visa activation line, there is a "delay" of several minutes while they try to sell you some kind of insurance. Usually just as you get the point across that you don't want it they will tell you your card has been activated and let you go. Are these delays real?

    Read the article

  • Using Google Docs as a Storage System like S3. Is there any limit?

    - by mickthomp
    Hi all, I'm considering upgrading to a Google Docs Premium Account (gDrive). I'm wondering if it can be used the way I'm using Amazon S3 at the moment. I'd like to upload images. Do you know if there is any limit on the number of images I can upload to my 200GB Google Docs account? I think it could be really useful to have something like that, and we could save some money on our webapps. Thank you ;)

    Read the article

  • How to configure permissions for win2008 task running as Network Service to stop/start a service on different system?

    - by weiji
    Well... title says it all, actually. We've got a Scheduled Task set up on a windows server 2003 box running as the Network Service, and the batch file it runs will invoke "sc" to stop and then start a service on another windows box, however sc reports: [SC] OpenService FAILED 5: Access is denied. Running the same batch file via the windows explorer has no issues, and my user account is part of the Administrators group so I believe this is why there are no issues when I try it manually. Is this a permissions thing I enable for Network Service on the first server? Or do I enable permissions for Network Service somehow on the target server? This question (http://serverfault.com/questions/19382/why-sc-query-fails-from-one-machine-but-works-from-another) touches on something similar, but I'm looking for enabling the Network Service to access the service via the scheduled task.

    Read the article

  • How to do inline paste from system buffer in Vim?

    - by yetapb
    When pasting from the system buffer in a line like foo( someVal , <cursor is here>, someVal3); If I use "*p I get foo( someVal, , someVal3); <pasted text> If I use "*P I get <pasted text> foo( someVal, , someVal3); but I want foo( someVal, <pasted text>, someVal3 ); How can I get the result I want? edit If there is a newline in the buffer as @amardeep suspects, is there a way I can tell vim to ignore it?

    Read the article

  • What is a good operating system for relative newbies for setting up a website and source control? [on hold]

    - by Zeroth
    I'm part of a small group of people working on video games. We want to set up our own boxes to serve the website and handle the source control for development work. I have the most technical expertise, but it's been a few years since I've set up or played with Linux or other open source operating systems. So I'm a bit out of the loop: what user-friendly open source operating systems are out there right now?

    Read the article

  • Else without if

    - by user2808951
    I'm trying to write a code for my computer programming class for a project due Monday, and I'm pretty new to Java, but I'm trying to write a program that will first determine if a number the user inputs is even or odd and then determine if the number is prime or not. I'm not sure if I did the algorithm right or not, so if anyone has any corrections on the program to my algorithm or anything else please say so, but my real issue is that the program is refusing to compile. Every time I try, it says it's having an else without if problem. Here's a link to my command box: http://s1341.photobucket.com/user/Emi_Nightshade/media/Capture_zps45f9a2ea.png.html Here's my code: import java.io.*; import java.util.*; public class Lesson9p1_ThuotteEmily { public static void main(String args[]) { Scanner kbReader0=new Scanner(System.in); System.out.print("\n\nPlease enter an integer. An integer is whole number, and it can be either negative or positive. Please enter your number: "); long num=kbReader0.nextLong(); if(num%2==0) //if and else with braces { System.out.println("Your integer " + num + " is even."); } else { System.out.println("Your integer " + num + " is odd."); } Scanner kbReader1=new Scanner(System.in); System.out.print("\n\nWould you like to know if your number is prime? Please enter yes or no: "); String yn=kbReader1.nextLine(); if(yn.equals.IgnoreCase("Yes")) { System.out.println("Okay. Give me a moment."); { if(num%2==0) { System.out.println("Your number isn't prime."); } else if(num==2) { System.out.println("Your number is 2, which is the only even prime number in existence. Cool, right?"); } for(int i=3;i*i<=n;i+=2) { if(n%1==0) { System.out.println("Your number isn't prime."); } } else { System.out.println("Your number is prime!"); } } } if(yn.equals.IgnoreCase("No")) { System.out.println("Okay."); } } } If anyone could help me out with this and also any problems I may have made elsewhere in the program, I'd be very grateful! Thanks.

    Read the article

  • Parallelism in .NET – Part 9, Configuration in PLINQ and TPL

    - by Reed
    Parallel LINQ and the Task Parallel Library contain many options for configuration.  Although the default configuration options are often ideal, there are times when customizing the behavior is desirable.  Both frameworks provide full configuration support. When working with Data Parallelism, there is one primary configuration option we often need to control – the number of threads we want the system to use when parallelizing our routine.  By default, PLINQ and the TPL both use the ThreadPool to schedule tasks.  Given the major improvements in the ThreadPool in CLR 4, this default behavior is often ideal.  However, there are times that the default behavior is not appropriate.  For example, if you are working on multiple threads simultaneously, and want to schedule parallel operations from within both threads, you might want to consider restricting each parallel operation to using a subset of the processing cores of the system.  Not doing this might over-parallelize your routine, which leads to inefficiencies from having too many context switches. In the Task Parallel Library, configuration is handled via the ParallelOptions class.  All of the methods of the Parallel class have an overload which accepts a ParallelOptions argument. We configure the Parallel class by setting the ParallelOptions.MaxDegreeOfParallelism property.  For example, let’s revisit one of the simple data parallel examples from Part 2: Parallel.For(0, pixelData.GetUpperBound(0), row => { for (int col=0; col < pixelData.GetUpperBound(1); ++col) { pixelData[row, col] = AdjustContrast(pixelData[row, col], minPixel, maxPixel); } }); Here, we’re looping through an image, and calling a method on each pixel in the image.  If this was being done on a separate thread, and we knew another thread within our system was going to be doing a similar operation, we likely would want to restrict this to using half of the cores on the system.  This could be accomplished easily by doing: var options = new ParallelOptions(); options.MaxDegreeOfParallelism = Math.Max(Environment.ProcessorCount / 2, 1); Parallel.For(0, pixelData.GetUpperBound(0), options, row => { for (int col=0; col < pixelData.GetUpperBound(1); ++col) { pixelData[row, col] = AdjustContrast(pixelData[row, col], minPixel, maxPixel); } }); Now, we’re restricting this routine to using no more than half the cores in our system.  Note that I included a check to prevent a single core system from supplying zero; without this check, we’d potentially cause an exception.  I also did not hard code a specific value for the MaxDegreeOfParallelism property.  One of our goals when parallelizing a routine is allowing it to scale on better hardware.  Specifying a hard-coded value would contradict that goal. Parallel LINQ also supports configuration, and in fact, has quite a few more options for configuring the system.
The main configuration option we most often need is the same as our TPL option: we need to supply the maximum number of processing threads.  In PLINQ, this is done via a new extension method on ParallelQuery<T>: ParallelEnumerable.WithDegreeOfParallelism. Let’s revisit our declarative data parallelism sample from Part 6: double min = collection.AsParallel().Min(item => item.PerformComputation()); Here, we’re performing a computation on each element in the collection, and saving the minimum value of this operation.  If we wanted to restrict this to a limited number of threads, we would add our new extension method: int maxThreads = Math.Max(Environment.ProcessorCount / 2, 1); double min = collection .AsParallel() .WithDegreeOfParallelism(maxThreads) .Min(item => item.PerformComputation()); This automatically restricts the PLINQ query to half of the threads on the system. PLINQ provides some additional configuration options.  By default, PLINQ will occasionally revert to processing a query serially.  This occurs because many queries, if parallelized, typically actually cause an overall slowdown compared to a serial processing equivalent.  By analyzing the “shape” of the query, PLINQ often decides to run a query serially instead of in parallel.  This can occur for (taken from MSDN): Queries that contain a Select, indexed Where, indexed SelectMany, or ElementAt clause after an ordering or filtering operator that has removed or rearranged original indices. Queries that contain a Take, TakeWhile, Skip, SkipWhile operator and where indices in the source sequence are not in the original order. Queries that contain Zip or SequenceEquals, unless one of the data sources has an originally ordered index and the other data source is indexable (i.e. an array or IList(T)). Queries that contain Concat, unless it is applied to indexable data sources. Queries that contain Reverse, unless applied to an indexable data source. If the specific query follows these rules, PLINQ will run the query on a single thread.  However, none of these rules look at the specific work being done in the delegates, only at the “shape” of the query.  There are cases where running in parallel may still be beneficial, even if the shape is one where it typically parallelizes poorly.  In these cases, you can override the default behavior by using the WithExecutionMode extension method.  This would be done like so: var reversed = collection .AsParallel() .WithExecutionMode(ParallelExecutionMode.ForceParallelism) .Select(i => i.PerformComputation()) .Reverse(); Here, the default behavior would be to not parallelize the query unless collection implemented IList<T>.  We can force this to run in parallel by adding the WithExecutionMode extension method in the method chain. Finally, PLINQ has the ability to configure how results are returned.  When a query is filtering or selecting an input collection, the results will need to be streamed back into a single IEnumerable<T> result.  For example, the method above returns a new, reversed collection.  In this case, the processing of the collection will be done in parallel, but the results need to be streamed back to the caller serially, so they can be enumerated on a single thread. This streaming introduces overhead.  IEnumerable<T> isn’t designed with thread safety in mind, so the system needs to handle merging the parallel processes back into a single stream, which introduces synchronization issues.
There are two extremes of how this could be accomplished, but both extremes have disadvantages. The system could watch each thread, and whenever a thread produces a result, take that result and send it back to the caller.  This would mean that the calling thread would have access to the data as soon as data is available, which is the benefit of this approach.  However, it also means that every item is introducing synchronization overhead, since each item needs to be merged individually. On the other extreme, the system could wait until all of the results from all of the threads were ready, then push all of the results back to the calling thread in one shot.  The advantage here is that the least amount of synchronization is added to the system, which means the query will, on a whole, run the fastest.  However, the calling thread will have to wait for all elements to be processed, so this could introduce a long delay between when a parallel query begins and when results are returned. The default behavior in PLINQ is actually between these two extremes.  By default, PLINQ maintains an internal buffer, and chooses an optimal buffer size to maintain.  Query results are accumulated into the buffer, then returned in the IEnumerable<T> result in chunks.  This provides reasonably fast access to the results, as well as good overall throughput, in most scenarios. However, if we know the nature of our algorithm, we may decide we would prefer one of the other extremes.  This can be done by using the WithMergeOptions extension method.  For example, if we know that our PerformComputation() routine is very slow, but also variable in runtime, we may want to retrieve results as they are available, with no buffering.  This can be done by changing our above routine to: var reversed = collection .AsParallel() .WithExecutionMode(ParallelExecutionMode.ForceParallelism) .WithMergeOptions(ParallelMergeOptions.NotBuffered) .Select(i => i.PerformComputation()) .Reverse(); On the other hand, if we are already on a background thread, and we want to allow the system to maximize its speed, we might want to allow the system to fully buffer the results: var reversed = collection .AsParallel() .WithExecutionMode(ParallelExecutionMode.ForceParallelism) .WithMergeOptions(ParallelMergeOptions.FullyBuffered) .Select(i => i.PerformComputation()) .Reverse(); Notice, also, that you can specify multiple configuration options in a parallel query.  By chaining these extension methods together, we generate a query that will always run in parallel, and will always complete before making the results available in our IEnumerable<T>.
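
    To pull the fragments above together, here is a minimal self-contained sketch (the PerformComputation method below is a stand-in for real CPU-bound work, not code from the article) that applies both the TPL and the PLINQ configuration options discussed above:
      using System;
      using System.Linq;
      using System.Threading.Tasks;

      class ConfigurationSketch
      {
          // Placeholder for an expensive, CPU-bound operation.
          static int PerformComputation(int item)
          {
              return item * item;
          }

          static void Main()
          {
              int maxThreads = Math.Max(Environment.ProcessorCount / 2, 1);
              var data = Enumerable.Range(1, 1000).ToArray();

              // TPL: cap the degree of parallelism through ParallelOptions.
              var options = new ParallelOptions { MaxDegreeOfParallelism = maxThreads };
              Parallel.For(0, data.Length, options, i => data[i] = PerformComputation(data[i]));

              // PLINQ: cap the thread count, force parallel execution, and stream results unbuffered.
              int min = data
                  .AsParallel()
                  .WithDegreeOfParallelism(maxThreads)
                  .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
                  .WithMergeOptions(ParallelMergeOptions.NotBuffered)
                  .Select(PerformComputation)
                  .Min();

              Console.WriteLine(min);
          }
      }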

    Read the article

  • ASP.NET GZip Encoding Caveats

    - by Rick Strahl
    GZip encoding in ASP.NET is pretty easy to accomplish using the built-in GZipStream and DeflateStream classes and applying them to the Response.Filter property.  While applying GZip and Deflate behavior is pretty easy there are a few caveats that you have to watch out for as I found out today for myself with an application that was throwing up some garbage data. But before looking at caveats let’s review GZip implementation for ASP.NET. ASP.NET GZip/Deflate Basics Response filters basically are applied to the Response.OutputStream and transform it as data is written to it through the ASP.NET Response object. So a Response.Write eventually gets written into the output stream, which, if a filter is assigned, is also written through the filter stream's interface. To perform the actual GZip (and Deflate) encoding typically used by Web pages .NET includes the GZipStream and DeflateStream stream classes which can be readily assigned to the Response.Filter. With these two stream classes in place it’s almost trivially easy to create a couple of reusable methods that allow you to compress your HTTP output. In my standard WebUtils utility class (from the West Wind Web Toolkit) I created two static utility methods – IsGZipSupported and GZipEncodePage – that check whether the client supports GZip encoding and then actually encode the current output (note that although the method includes ‘Page’ in its name this code will work with any ASP.NET output). /// <summary> /// Determines if GZip is supported /// </summary> /// <returns></returns> public static bool IsGZipSupported() { string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"]; if (!string.IsNullOrEmpty(AcceptEncoding) && (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate"))) return true; return false; } /// <summary> /// Sets up the current page or handler to use GZip through a Response.Filter /// IMPORTANT: /// You have to call this method before any output is generated! /// </summary> public static void GZipEncodePage() { HttpResponse Response = HttpContext.Current.Response; if (IsGZipSupported()) { string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"]; if (AcceptEncoding.Contains("deflate")) { Response.Filter = new System.IO.Compression.DeflateStream(Response.Filter, System.IO.Compression.CompressionMode.Compress); Response.Headers.Remove("Content-Encoding"); Response.AppendHeader("Content-Encoding", "deflate"); } else { Response.Filter = new System.IO.Compression.GZipStream(Response.Filter, System.IO.Compression.CompressionMode.Compress); Response.Headers.Remove("Content-Encoding"); Response.AppendHeader("Content-Encoding", "gzip"); } } } As you can see the actual assignment of the Filter is as simple as: Response.Filter = new DeflateStream(Response.Filter, System.IO.Compression.CompressionMode.Compress); which applies the filter to the OutputStream. You also need to ensure that your response reflects the new GZip or Deflate encoding and ensure that any pages that are cached in Proxy servers can differentiate between pages that were encoded with the various different encodings (or no encoding).
To use this utility function now is trivially easy: In any ASP.NET code that wants to compress its Response output you simply use: protected void Page_Load(object sender, EventArgs e) { WebUtils.GZipEncodePage(); Entry = WebLogFactory.GetEntry(); var entries = Entry.GetLastEntries(App.Configuration.ShowEntryCount, "pk,Title,SafeTitle,Body,Entered,Feedback,Location,ShowTopAd", "TEntries"); if (entries == null) throw new ApplicationException("Couldn't load WebLog Entries: " + Entry.ErrorMessage); this.repEntries.DataSource = entries; this.repEntries.DataBind(); } Here I use an ASP.NET page, but the above WebUtils.GZipEncode() method call will work in any ASP.NET application type including HTTP Handlers. The only requirement is that the filter needs to be applied before any other output is sent to the OutputStream. For example, in my CallbackHandler service implementation by default output over a certain size is GZip encoded. The output that is generated is JSON or XML and if the output is over 5k in size I apply WebUtils.GZipEncode(): if (sbOutput.Length > GZIP_ENCODE_TRESHOLD) WebUtils.GZipEncodePage(); Response.ContentType = ControlResources.STR_JsonContentType; HttpContext.Current.Response.Write(sbOutput.ToString()); Ok, so you probably get the idea: Encoding GZip/Deflate content is pretty easy. Hold on there Hoss – Watch your Caching. Or is it? There are a few caveats that you need to watch out for when dealing with GZip content. The first issue is that you need to deal with the fact that some clients don't support GZip or Deflate content. Most modern browsers support it, but if you have a programmatic Http client accessing your content GZip/Deflate support is by no means guaranteed. For example, WinInet Http clients don't support GZip out of the box – it has to be explicitly implemented. Other low level HTTP clients on other platforms too don't support GZip out of the box. The problem is that your application, your Web Server and Proxy Servers on the Internet might be caching your generated content. If you return content with GZip once and then again without, either caching is not applied or, worse, the wrong type of content is returned back to the client from a cache or proxy. The result is an unreadable response for *some clients* which is also very hard to debug and fix once in production. You already saw the issue of Proxy servers addressed in the GZipEncodePage() function: // Allow proxy servers to cache encoded and unencoded versions separately Response.AppendHeader("Vary", "Content-Encoding"); This ensures that any Proxy servers also check for the Content-Encoding HTTP Header to cache their content – not just the URL. The same thing applies if you do OutputCaching in your own ASP.NET code. If you generate output for GZip on an OutputCached page the GZipped content will be cached (either by ASP.NET's cache or in some cases by the IIS Kernel Cache). But what if the next client doesn't support GZip? She'll get served a cached GZip page that won't decode and she'll get a page full of garbage. Wholly undesirable.
To fix this you need to add some custom OutputCache rules by way of the GetVaryByCustom() HttpApplication method in your global.asax file: public override string GetVaryByCustomString(HttpContext context, string custom) { // Override Caching for compression if (custom == "GZIP") { string acceptEncoding = HttpContext.Current.Response.Headers["Content-Encoding"]; if (string.IsNullOrEmpty(acceptEncoding)) return ""; else if (acceptEncoding.Contains("gzip")) return "GZIP"; else if (acceptEncoding.Contains("deflate")) return "DEFLATE"; return ""; } return base.GetVaryByCustomString(context, custom); } In a page that uses Output caching you then specify: <%@ OutputCache Duration="180" VaryByParam="none" VaryByCustom="GZIP" %> to use that custom rule. It's all Fun and Games until ASP.NET throws an Error. Ok, so you're up and running with GZip, you have your caching squared away and your pages that you are applying it to are jamming along. Then BOOM, something strange happens and you get a lovely garbled page that looks like this: Lovely, isn't it? What's happened here is that I have WebUtils.GZipEncode() applied to my page, but there's an error in the page. The error falls back to the ASP.NET error handler and the error handler removes all existing output (good) and removes all the custom HTTP headers I've set manually (usually good, but very bad here). Since I applied the Response.Filter (via GZipEncode) the output is now GZip encoded, but ASP.NET has removed my Content-Encoding header, so the browser receives the GZip encoded content without a notification that it is encoded as GZip. The result is binary output. Here's what Fiddler says about the raw HTTP header output when an error occurs when GZip encoding was applied: HTTP/1.1 500 Internal Server Error Cache-Control: private Content-Type: text/html; charset=utf-8 Date: Sat, 30 Apr 2011 22:21:08 GMT Content-Length: 2138 Connection: close ?`I?%&/m?{J?J??t??` … binary output stripped here Notice: no Content-Encoding header and that's why we're seeing this garbage. ASP.NET has stripped the Content-Encoding header but left our filter intact. So how do we fix this? In my applications I typically have a global Application_Error handler set up and in this case I've been using that. One thing that you can do in the Application_Error handler is explicitly clear out the Response.Filter and set it to null at the top: protected void Application_Error(object sender, EventArgs e) { // Remove any special filtering especially GZip filtering Response.Filter = null; … } And voila I get my Yellow Screen of Death or my custom generated error output back via uncompressed content. BTW, the same is true for Page level errors handled in Page_Error or ASP.NET MVC Error handling methods in a controller.
Another and possibly even better solution is to check whether a filter is attached just before the headers are sent to the client as pointed out by Adam Schroeder in the comments: protected void Application_PreSendRequestHeaders() { // ensure that if GZip/Deflate Encoding is applied that headers are set // also works when error occurs if filters are still active HttpResponse response = HttpContext.Current.Response; if (response.Filter is GZipStream && response.Headers["Content-encoding"] != "gzip") response.AppendHeader("Content-encoding", "gzip"); else if (response.Filter is DeflateStream && response.Headers["Content-encoding"] != "deflate") response.AppendHeader("Content-encoding", "deflate"); } This uses the Application_PreSendRequestHeaders() pipeline event to check for compression encoding in a filter and adjusts the content accordingly. This is actually a better solution since this is generic – it'll work regardless of how the content is cleaned up. For example, an error Response.Redirect() or short error display might get changed and the filter not cleared and this code actually handles that. Sweet, thanks Adam. It's unfortunate that ASP.NET doesn't natively clear out Response.Filters when an error occurs just as it clears the Response and Headers. I can't see where leaving a Filter in place in an error situation would make any sense, but hey - this is what it is and it's easy enough to fix as long as you know where to look. Riiiight! IIS and GZip: I should also mention that IIS 7 includes good support for compression natively. If you can defer encoding to let IIS perform it for you rather than doing it in your code by all means you should do it! Especially any static or semi-dynamic content that can be made static should be using IIS built-in compression. Dynamic caching is also supported but is a bit more tricky to judge in terms of performance and footprint. John Forsyth has a great article on the benefits and drawbacks of IIS 7 compression which gives some detailed performance comparisons and impact reviews. I'll post another entry next with some more info on IIS compression since information on it seems to be a bit hard to come by. Related Content: Built-in GZip/Deflate Compression in IIS 7.x; HttpWebRequest and GZip Responses. © Rick Strahl, West Wind Technologies, 2005-2011. Posted in ASP.NET, IIS7
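
    As a footnote to the client-support caveat above, a programmatic .NET client can explicitly opt into compressed responses and let the framework decompress them transparently. This is only an illustrative sketch (the URL is a placeholder), not code from the article:
      using System;
      using System.IO;
      using System.Net;

      class GZipClientSketch
      {
          static void Main()
          {
              // Placeholder URL - point this at an endpoint that serves GZip/Deflate content.
              var request = (HttpWebRequest)WebRequest.Create("http://example.com/");

              // Sends an Accept-Encoding header and decompresses the response automatically.
              request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

              using (var response = (HttpWebResponse)request.GetResponse())
              using (var reader = new StreamReader(response.GetResponseStream()))
              {
                  Console.WriteLine(reader.ReadToEnd());
              }
          }
      }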

    Read the article

  • Physics in my game confused after restructuring the Game loop

    - by Julian Assange
    Hello! I'm working on a game in Java. Now I have some trouble with an interpolation based game loop in my calculations. Before I used that system, the calculation of a falling object looked like this: Delta based system private static final float SPEED_OF_GRAVITY = 500.0f; @Override public void update(float timeDeltaSeconds, Object parentObject) { parentObject.y = parentObject.y + (parentObject.yVelocity * timeDeltaSeconds); parentObject.yVelocity -= SPEED_OF_GRAVITY * timeDeltaSeconds; ...... What you see here is that I used the delta value from the previous frame to the current frame to calculate the physics. Now I have switched to an interpolation based system and moved away from the delta-based approach I was using to calculate my physics. However, with the interpolation system the delta time is removed - and now my calculations are screwed up; I've spent the whole day trying to solve this: Interpolation based system private static final float SPEED_OF_GRAVITY = 500.0f; @Override public void update(Object parentObject) { parentObject.y = parentObject.y + (parentObject.yVelocity); parentObject.yVelocity -= SPEED_OF_GRAVITY; ...... I'm totally clueless - how should this be solved? The rendering part is solved with a simple prediction method. With the delta system I could see my object rendered smoothly to the screen, but with this interpolation/prediction method the object just appears stuck for one second and then it's gone. The core of this game loop is actually from the deWiTTERS Game Loop article, where I'm trying to implement the last solution he describes. In short - my physics are a mess and this needs to be solved. Any ideas? Thanks in advance!
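
    For comparison, here is a small fixed-timestep sketch in C# (the question's code is Java, but the structure carries over). The point it illustrates: when the variable per-frame delta is replaced by a constant update rate, the physics step still has to be scaled by that fixed step, otherwise gravity is applied "per update" instead of "per second". The names and constants here are illustrative, not taken from the original code:
      using System;
      using System.Diagnostics;

      class FixedStepLoopSketch
      {
          const float SpeedOfGravity = 500.0f;        // units per second, as in the question
          const float FixedStepSeconds = 1f / 25f;    // e.g. 25 physics updates per second

          static float y;
          static float yVelocity;

          // Same math as the delta-based version, but fed the constant step.
          static void Update(float dt)
          {
              y += yVelocity * dt;
              yVelocity -= SpeedOfGravity * dt;
          }

          static void Main()
          {
              var clock = Stopwatch.StartNew();
              double accumulator = 0, previous = 0;

              for (int frame = 0; frame < 100; frame++)   // stand-in for the render loop
              {
                  double now = clock.Elapsed.TotalSeconds;
                  accumulator += now - previous;
                  previous = now;

                  // Run as many fixed-size physics steps as the elapsed time allows.
                  while (accumulator >= FixedStepSeconds)
                  {
                      Update(FixedStepSeconds);
                      accumulator -= FixedStepSeconds;
                  }

                  // Render here, interpolating between the last two physics states
                  // with (accumulator / FixedStepSeconds) as the blend factor.
              }

              Console.WriteLine("y = " + y + ", yVelocity = " + yVelocity);
          }
      }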

    Read the article

  • Gnome 3 freezes on logon on samsung RV 509

    - by Noufal
    I have a Samsung NP-RV509 A0FIN and I tried to install GNU/Linux with gnome 3.2 on it. I tried Fedora 16, Ubuntu 11.10 and Linux Mint 12 RC, but with no success. All of these freezes upon login into gnome shell. I think it is the problem with graphics driver, so I tried xorg-edgers ppa on my last installation, ie., Linux Mint. I also tried various intel graphics packages listed on Synaptic package manager, but no success again. My device configuration is as follows(obtained from windows 7): More details about my computer Component Details Subscore Base score Processor Intel(R) Pentium(R) CPU P6200 @ 2.13GHz 5.6 4.6 Memory (RAM) 4.00 GB 7.2 Graphics Intel(R) HD Graphics 4.6 Gaming graphics 1562 MB Total available graphics memory 5.2 Primary hard disk 12GB Free (50GB Total) 5.9 Windows 7 Ultimate System -------------------------------------------------------------------------------- Manufacturer SAMSUNG ELECTRONICS CO., LTD. Model RV409/RV509/RV709 Total amount of system memory 4.00 GB RAM System type 32-bit operating system Number of processor cores 2 64-bit capable Yes Storage -------------------------------------------------------------------------------- Total size of hard disk(s) 418 GB Disk partition (C:) 12 GB Free (50 GB Total) Media drive (D:) CD/DVD Disk partition (E:) 526 MB Free (191 GB Total) Disk partition (F:) 101 GB Free (177 GB Total) Graphics -------------------------------------------------------------------------------- Display adapter type Intel(R) HD Graphics Total available graphics memory 1562 MB Dedicated graphics memory 64 MB Dedicated system memory 0 MB Shared system memory 1498 MB Display adapter driver version 8.15.10.2202 Primary monitor resolution 1366x768 DirectX version DirectX 10 Network -------------------------------------------------------------------------------- Network Adapter Realtek PCIe GBE Family Controller Network Adapter Broadcom 802.11n Network Adapter Network Adapter Microsoft Virtual WiFi Miniport Adapter Notes -------------------------------------------------------------------------------- The gaming graphics score is based on the primary graphics adapter. If this system has linked or multiple graphics adapters, some software applications may see additional performance benefits. Any help is appreciated, and thanks in advance.

    Read the article

  • Beware: Upgrade to ASP.NET MVC 2.0 with care if you use AntiForgeryToken

    - by James Crowley
    If you're thinking of upgrading to MVC 2.0, and you take advantage of the AntiForgeryToken support, then be careful - you can easily kick out all active visitors after the upgrade until they restart their browser. Why's this? For the anti-forgery validation to take place, ASP.NET MVC uses a session cookie called "__RequestVerificationToken_Lw__". This gets checked for and de-serialized on any page where there is an AntiForgeryToken() call. However, the format of this validation cookie has apparently changed between MVC 1.0 and MVC 2.0. What this means is that when you make the switch to MVC 2.0 on your production server, suddenly all your visitors' session cookies are invalid, resulting in calls to AntiForgeryToken() throwing exceptions (even on a standard GET request) when de-serializing it: [InvalidCastException: Unable to cast object of type 'System.Web.UI.Triplet' to type 'System.Object[]'.]   System.Web.Mvc.AntiForgeryDataSerializer.Deserialize(String serializedToken) +104[HttpAntiForgeryException (0x80004005): A required anti-forgery token was not supplied or was invalid.]   System.Web.Mvc.AntiForgeryDataSerializer.Deserialize(String serializedToken) +368   System.Web.Mvc.HtmlHelper.GetAntiForgeryTokenAndSetCookie(String salt, String domain, String path) +209   System.Web.Mvc.HtmlHelper.AntiForgeryToken(String salt, String domain, String path) +16   System.Web.Mvc.HtmlHelper.AntiForgeryToken() +10  <snip> So you've just kicked all your active users out of your site with exceptions until they think to restart their browser (to clear the session cookies). The only workaround for now is to either write some code that wipes this cookie - or disable use of AntiForgeryToken() in your MVC 2.0 site until you're confident all session cookies will have expired. That in itself isn't very straightforward, given how frequently people tend to hibernate/standby their machines - the session cookie will only clear once the browser has been shut down and re-opened. Hope this helps someone out there!
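
    One way to implement the "wipe the cookie" workaround mentioned above is to catch the anti-forgery failure in Global.asax, expire the stale MVC 1.0 cookie, and retry the request so MVC 2.0 issues a fresh one. This is only a sketch of that idea (not code from the post), so treat the exception checks and the redirect as assumptions to adapt:
      using System;
      using System.Web;
      using System.Web.Mvc;

      public class MvcApplication : HttpApplication
      {
          protected void Application_Error(object sender, EventArgs e)
          {
              var error = Server.GetLastError();
              if (error == null)
                  return;

              if (error is HttpAntiForgeryException || error.GetBaseException() is InvalidCastException)
              {
                  // Expire the old-format token cookie so the next request regenerates it.
                  var staleCookie = new HttpCookie("__RequestVerificationToken_Lw__")
                  {
                      Expires = DateTime.Now.AddYears(-1)
                  };
                  Response.Cookies.Add(staleCookie);

                  Server.ClearError();
                  Response.Redirect(Request.RawUrl);
              }
          }
      }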

    Read the article

  • sudo: /usr/lib/sudo/sudoers.so must be owned by uid 0

    - by 7UR7L3
    Whenever I try to do anything at all that requires my password it returns this: u7ur7l3@ubuntu:~$ sudo sudo: /usr/lib/sudo/sudoers.so must be owned by uid 0 sudo: fatal error, unable to load plugins u7ur7l3@ubuntu:~$ So I can't install anything from the Software Center / package manager or run any commands in terminal that require my password. I can log in, but that's pretty much it. I accidentally changed the permissions of some files, then changed some more trying to fix it :/. Now I'm completely lost as to what to do. This is what happened when I tried to get sudo working again using pkexec: u7ur7l3@ubuntu:~$ pkexec chown root /usr/lib/sudo/sudoers.so Error getting authority: Error initializing authority: Error calling StartServiceByName for org.freedesktop.PolicyKit1: GDBus.Error:org.freedesktop.DBus.Error.Spawn.ExecFailed: Failed to execute program /usr/lib/dbus-1.0/dbus-daemon-launch-helper: Success u7ur7l3@ubuntu:~$ sudo ls sudo: /usr/lib/sudo/sudoers.so must be owned by uid 0 sudo: fatal error, unable to load plugins And to change permissions I was using Root Actions as a dolphin service/ plugin thing, so history doesn't show me the permission changes. I just realized that sounds don't work at all anymore. When I go into Phonon my default settings and playback devices aren't even there. Also I don't have the option to shutdown, I can only log out or leave.

    Read the article

  • How do I create a popup banner before login with Lightdm?

    - by Rich Loring
    When Ubuntu was using gnome I was able to create a popup banner like the banner below before the login screen using zenity in the /etc/gdm/Init/Default. The line of code would be like this: if [ -f "/usr/bin/zenity" ]; then /usr/bin/zenity --info --text="`cat /etc/issue`" --no-wrap; else xmessage -file /etc/issue -button ok -geometry 540X480; fi How can I accomplish this with Unity? NOTICE TO USERS This is a Federal computer system (and/or it is directly connected to a BNL local network system) and is the property of the United States Government. It is for authorized use only. Users (authorized or unauthorized) have no explicit or implicit expectation of privacy. Any or all uses of this system and all files on this system may be intercepted, monitored, recorded, copied, audited, inspected, and disclosed to authorized site, Department of Energy, and law enforcement personnel, as well as authorized officials of other agencies, both domestic and foreign. By using this system, the user consents to such interception, monitoring, recording, copying, auditing, inspection, and disclosure at the discretion of authorized site or Department of Energy personnel. Unauthorized or improper use of this system may result in administrative disciplinary action and civil and criminal penalties. By continuing to use this system you indicate your awareness of and consent to these terms and conditions of use. LOG OFF IMMEDIATELY if you do not agree to the conditions stated in this warning.

    Read the article

  • CodePlex Daily Summary for Monday, April 05, 2010

    CodePlex Daily Summary for Monday, April 05, 2010New Projects.Net Data Form Wizard: A Basic .Net application that will connect to a SQL Server, allow you to select a database, then select from the user created tables, read the tabl...Agilisa Data.Controls: Agilisa DataControls provides ready to use Databound controls, encapsulating the data connection logic, caching, for ASP.NET Controls. Just drop th...algoritmia: A Python 3.1+ library of Data Structures and Algorithms. This library is being used to teach a course on Algorithms in my university. It contains ...Bag of Tricks: The original WPF Bag of tricks, now maintained by your friends at Pixel Lab.DIMIS: It is a simple asp.net system, just for practice!DotNetNuke® Postgres Data Provider: DNN PG Provider is a DotNetNuke® 4.9.2 Data Provider for PostgreSQL, an enterprise class open source database system. With DNN PG DataProvider y...Home Finance: This project develop to manage your home finance.House Repair Management System: House Repair Management SystemLaunchpadNET: LaunchpadNET is a C# library for interfacing your .NET program with the Novation Launchpad controller.Mapsui - UI for maps: Mapsui is a UI library for mapping applications. It is based on BruTile and SharpMap. It is designed to be fast and responsive.micronovo: micronovomicronovomicronovomicronovoNPlurk: The project goal is to provide a .NET implemented Plurk API wrapper. PowerExt: PowerExt is a Windows Explorer add-in written in C++. Primarily targeted at programmers, it adds an additional .NET tab to the File Properties dial...Python Multiple Dispatch: Multiple dispatch (AKA multimethods) for Python 3 via a metaclass and type annotations.SpugDisposeCheck - Visual Studio Addin for validating Sharepoint dispose objects: AddIn that wraps the SPDisposeCheck Tool from Microsoft and fully integrate it with Visual Studio.System.Tuples: System.Tuples is a small tuple library. It uses T4 to generate tuples, and is made to be compatible with .net 2.0, .net 3.0 and .net 3.5.WebStatistics Server for Windows Server: WebStatistics Server for Windows Server is a tool to create visitor and traffic statistics of a Windows Server running IIS Webserver. It includes a...whileActivity Test: This is a temporary project to test the whileActivity and the updateResourceActivity (Forefront Identity Manager 2010 rtm)XBMC NFO Exporter: XBMC Nfo Exporter is a simple utility that allows you to create reports based on your media XMBC NFO files.XML Flattener: A simple tool to flatten "pretty"-printed XML files into a single line for use in web service test situations, etc. xvanneste: Sources et exemples utilisés sur le site http://www.xvanneste.com et http://media.xvanneste.comzhengym: 这是我个人的测试项目New Releases.Net Data Form Wizard: Alpha: I am only providing the logical code at this point. I will release a completed project once it has basic functionality, at the moment it only gener....Net Data Form Wizard: Alpha Code: This is only the basic VB code to create a form from the database information.Alter gear SQL index Management: Setup 1.1.1: Changes : Added ability to save / delete connection stringsExcelDna: ExcelDna Version 0.24: This versions adds packing support for .config files, and fixes a bug where temp files were not cleaned up.Hash Calculator: HashCalculator 1.1: Added drag-and-drop support Fixed some bugsHeadCounter: HeadCounter 1.2.4 'Vaelastrasz': Added a basic bbcode option for forum posting to sites that do not support full bbcode implementations (e.g. 
Guild Portal)Home Access Plus+: v3.2.5.0: v3.2.5.0 Release Change Log: Added the booking system File Changes: ~/app_data/* ~/bin/CHS.dll ~/bin/CHS.pdb ~/bin/CHS Extranet.dll ~/bin/...Home Access Plus+: v3.2.5.1: v3.2.5.1 Release Change Log: Fixed access to the booking system for non domain admin File Changes: ~/bin/CHS Extranet.dll ~/bin/CHS Extranet.pdb...Howard van Rooijen's Code Samples: Getting Started with MongoDB and NoRM: Code to accompany the blog post A .NET Developer Guide to: MongoDB and NoRM This download contains the a solution with the following structure: G...iExporter - iTunes playlist exporting: iExporter gui v2.5.1.0 - console v1.2.1.0: Paypal donate! Fixed small bug for iExporter Gui When pressing the Select button more then once, the Deselect button would not disable the Export...IST435: Lab 2 Demo Solution: Lab 2 Demo Solution - OverviewThis is a demo solution for Lab 2 which meets the basic requirements of the lab. Note that this solution has the foll...JSINQ - LINQ to Objects for JavaScript: JSINQ 1.0.0.1: Minor bugfixes with the Enumerable and Dictionary implementations.Mavention: Mavention Instant Page Create: Mavention Instant Page Create allows you to create new Publishing Pages with a single mouse click. Screenshots and more information available @ htt...Microsoft Dynamics CRM 4.0 Marketing List Member Importer: Nocelab ExcelAddin - Release 2.2: Version: 2.2 Release Note: - Added tab in the task panel - Added test button to check MSCRM connection How to install: - Uninstall previous ve...Multiplayer Quiz: Release 1_6_903_0b: Latest beta release - please leave any bugs etc in comments.MVVM Light Toolkit: MVVM Light Toolkit V3 SP1: This release can be installed on top of V3, and adds the following features: Project and Item templates for Visual Studio 10 Express (phone editio...NPlurk: First release: This is first release of NPlurk and it's almost completely workable. Enjoy!Performance Analysis of Logs (PAL) Tool: PAL v2.0 Alpha 5: Export to Perfmon Log Template or Data Collector Set Added: Added the feature to export perfmon log templates (*.htm) for WinXP/2003 computers or D...Python Multiple Dispatch: v0.1: Initial release. I believe it is working fine.ReRemind: V7: - Added new notification: "Unread MMS" <- Default is enabled, so be sure to go into Config if you don't want this. - Config now supports sound and ...SharePhone: SharePhone v.1.0.2: Added support for retrieving user profiles and saving back to SharePoint Use clientContext.GetUserProfile(..) or clientContext.UpdateUserProfile(..)Shinkansen: compress, crunch, combine, and cache JavaScript and CSS: Shinkansen 1.0.0.033010: Added support for ASP.NET MVC. 
Download contains binaries only.SpugDisposeCheck - Visual Studio Addin for validating Sharepoint dispose objects: SpugDisposeCheck Beta Release [Stable]: SpugDisposeCheck - Visual Studio Addin for validating Sharepoint dispose objects You can download the Microsoft SPDisposeCheck Tool from here:http...Starter Kit Mytrip.Mvc.Entity: Mytrip.Mvc.Entity 1.0 RC2: EF Membership XML Membership UserManager FileManager Localization Captcha ClientValidation Theme CrossBrowser VS 2010 RC MVC 2 R...System.Tuples: System.Tuples for .net 2.0: The System.Tuples release for .net 3.0System.Tuples: System.Tuples for .net 3.0: The System.Tuples release for .net 3.0 Extension methods have been removed to remain compatible with 3.0System.Tuples: System.Tuples for .net 3.5: The System.Tuples release for .net 3.5 Contains full functionality of the library.WatchersNET.TagCloud: WatchersNET.TagCloud 01.03.00: Whats New Html (non Flash) TagCloud can be skinned 11 Skins added for Html Cloud Skin Orange Skin Purple Import/Export of Custom Tags Sett...whileActivity Test: Activity1.zip: ActivityLibrary1.zip contains the source code to do a testWindows Phone 7 Panorama control: panorama control v0.5: Control source code for v0.5. This is the first drop. Doens't include sample project.Windows Phone 7 Panorama control: panorama control v0.5 + samples: Control source code and sample project. This drop includes 2 samples projects : - PhoneApp - Windows Phone sample - SilverlightApp - Silverlight...XML Flattener: XML Flattener: A simple WinForms app--paste in your XML, hit Flatten, and copy the result.xvanneste: RestFul SharePoint: ListItem.xslt ListItems.xslt Lists.xslt ListItemSPChat.xslt RestFull.htm SPChat.htmZinc Launcher: Zinc Launcher 1.0.1.0: Zinc Launcher requires that Zinc be properly installed. It should work under Vista Media Center and 7 Media Center, although Vista is untested. Zin...Most Popular ProjectsRawrWBFS ManagerMicrosoft SQL Server Product Samples: DatabaseASP.NET Ajax LibrarySilverlight ToolkitAJAX Control ToolkitWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesDotNetNuke® Community EditionMost Active ProjectsGraffiti CMSnopCommerce. Open Source online shop e-commerce solution.RawrFacebook Developer ToolkitjQuery Library for SharePoint Web ServicesBlogEngine.NETFarseer Physics EngineNcqrs Framework - A CQRS framework for .NETpatterns & practices – Enterprise LibraryN2 CMS

    Read the article
