Search Results

Search found 1581 results on 64 pages for 'compilation'.

Page 16/64 | < Previous Page | 12 13 14 15 16 17 18 19 20 21 22 23  | Next Page >

  • Plan Caching and Query Memory Part I – When not to use stored procedure or other plan caching mechanisms like sp_executesql or prepared statement

    - by sqlworkshops
      The most common performance mistake SQL Server developers make: SQL Server estimates the memory requirement for a query at compilation time. This mechanism is fine for dynamic queries that need memory, but not for queries that cache the plan. With dynamic queries the plan is not reused for different sets of parameter values / predicates, and hence a different amount of memory can be estimated for each set of parameter values / predicates. Common memory-allocating queries are those that perform Sort and Hash Match operations like Hash Join, Hash Aggregation or Hash Union. This article covers Sort with examples. It is recommended to read Plan Caching and Query Memory Part II after this article, which covers Hash Match operations.

      When the plan is cached by using a stored procedure or other plan caching mechanisms like sp_executesql or prepared statements, SQL Server estimates the memory requirement based on the first set of execution parameters. Later, when the same stored procedure is called with a different set of parameter values, the same amount of memory is used to execute the stored procedure. This might lead to underestimation / overestimation of memory on plan reuse. Overestimation of memory might not be a noticeable issue for Sort operations, but underestimation of memory will lead to a spill over tempdb, resulting in poor performance.

      This article covers underestimation / overestimation of memory for Sort. Plan Caching and Query Memory Part II covers underestimation / overestimation for Hash Match operations. It is important to note that underestimation of memory for Sort and Hash Match operations leads to spills over tempdb and hence negatively impacts performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note that with Hash Match operations, overestimation of memory can actually lead to poor performance.

      To read additional articles I wrote click here.

      In most cases it is cheaper to pay the compilation cost of dynamic queries than the huge cost of a spill over tempdb, unless the memory requirement for a stored procedure does not change significantly based on predicates.

      The best way to learn is to practice. To create the below tables and reproduce the behavior, join the mailing list by using this link: www.sqlworkshops.com/ml and I will send you the table creation script. Most of these concepts are also covered in our webcasts: www.sqlworkshops.com/webcasts

      Enough theory, let's see an example where we initially sort 1 month of data and then use the same stored procedure to sort 6 months of data.

      Let's create a stored procedure that sorts customers by name within a certain date range.

      --Example provided by www.sqlworkshops.com
      create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
      begin
            declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
            select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                  where c.CreationDate between @CreationDateFrom and @CreationDateTo
                  order by c.CustomerName
            option (maxdop 1)
      end
      go

      Let's execute the stored procedure initially with a 1 month date range.

      set statistics time on
      go
      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-01-31'
      go

      The stored procedure took 48 ms to complete. The stored procedure was granted 6656 KB based on 43199.9 rows being estimated.
      The estimated number of rows, 43199.9, is similar to the actual number of rows, 43200, and hence the memory estimation should be ok. There was no Sort Warnings event in SQL Profiler.

      Now let's execute the stored procedure with a 6 month date range.

      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      go

      The stored procedure took 679 ms to complete. The stored procedure was granted 6656 KB based on 43199.9 rows being estimated. The estimated number of rows, 43199.9, is way different from the actual number of rows, 259200, because the estimation is based on the first set of parameter values supplied to the stored procedure, which is 1 month in our case. This underestimation will lead to a sort spill over tempdb, resulting in poor performance. There was a Sort Warnings event in SQL Profiler.

      To monitor the amount of data written to and read from tempdb, one can execute select num_of_bytes_written, num_of_bytes_read from sys.dm_io_virtual_file_stats(2, NULL) before and after the stored procedure execution; for additional information refer to the webcast: www.sqlworkshops.com/webcasts.

      Let's recompile the stored procedure and then execute it first with the 6 month date range. In a production instance it is not advisable to use sp_recompile; instead one should use DBCC FREEPROCCACHE (plan_handle). This is due to locking issues involved with sp_recompile; refer to our webcasts for further details.

      exec sp_recompile CustomersByCreationDate
      go
      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      go

      Now the stored procedure took only 294 ms instead of 679 ms. The stored procedure was granted 26832 KB of memory. The estimated number of rows, 259200, is similar to the actual number of rows of 259200. The better performance of this stored procedure is due to better estimation of memory and avoiding the sort spill over tempdb. There was no Sort Warnings event in SQL Profiler.

      Now let's execute the stored procedure with a 1 month date range.

      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-01-31'
      go

      The stored procedure took 49 ms to complete, similar to our very first stored procedure execution. This stored procedure was granted more memory (26832 KB) than necessary (6656 KB) based on the 6 months of data estimation (259200 rows) instead of the 1 month of data estimation (43199.9 rows). This is because the estimation is based on the first set of parameter values supplied to the stored procedure, which is 6 months in this case. This overestimation did not affect performance, but it might affect the performance of other concurrent queries requiring memory, and hence overestimation is not recommended. This overestimation might affect the performance of Hash Match operations; refer to the article Plan Caching and Query Memory Part II for further details.

      Let's recompile the stored procedure and then execute it first with a 2 day date range.

      exec sp_recompile CustomersByCreationDate
      go
      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-01-02'
      go

      The stored procedure took 1 ms. The stored procedure was granted 1024 KB based on 1440 rows being estimated. There was no Sort Warnings event in SQL Profiler.

      Now let's execute the stored procedure with a 6 month date range.
      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      go

      The stored procedure took 955 ms to complete, way higher than the 679 ms or 294 ms we noticed before. The stored procedure was granted 1024 KB based on 1440 rows being estimated. But we noticed in the past that this stored procedure with a 6 month date range needed 26832 KB of memory to execute optimally without a spill over tempdb. This is a clear underestimation of memory and the reason for the very poor performance. There was a Sort Warnings event in SQL Profiler. Unlike before, this was a Multiple pass sort instead of a Single pass sort. This occurs when the granted memory is too low.

      Intermediate Summary: This issue can be avoided by not caching the plan for memory allocating queries. Another possibility is to use the recompile hint or the optimize for hint to allocate memory for a predefined date range.

      Let's recreate the stored procedure with the recompile hint.

      --Example provided by www.sqlworkshops.com
      drop proc CustomersByCreationDate
      go
      create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
      begin
            declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
            select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                  where c.CreationDate between @CreationDateFrom and @CreationDateTo
                  order by c.CustomerName
            option (maxdop 1, recompile)
      end
      go

      Let's execute the stored procedure initially with a 1 month date range and then with a 6 month date range.

      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-01-30'
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      go

      The stored procedure took 48 ms and 291 ms, in line with the previous optimal execution times. The stored procedure with the 1 month date range has good estimation like before. The stored procedure with the 6 month date range also has good estimation and memory grant like before because the query was recompiled with the current set of parameter values. The compilation time and compilation CPU of 1 ms is not expensive in this case compared to the performance benefit.

      Let's recreate the stored procedure with an optimize for hint of the 6 month date range.

      --Example provided by www.sqlworkshops.com
      drop proc CustomersByCreationDate
      go
      create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as
      begin
            declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime
            select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c
                  where c.CreationDate between @CreationDateFrom and @CreationDateTo
                  order by c.CustomerName
            option (maxdop 1, optimize for (@CreationDateFrom = '2001-01-01', @CreationDateTo ='2001-06-30'))
      end
      go

      Let's execute the stored procedure initially with a 1 month date range and then with a 6 month date range.

      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-01-30'
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      go

      The stored procedure took 48 ms and 291 ms, in line with the previous optimal execution times. The stored procedure with the 1 month date range has overestimation of rows and memory. This is because we provided a hint to optimize for 6 months of data. The stored procedure with the 6 month date range has good estimation and memory grant because we provided a hint to optimize for 6 months of data.
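      As mentioned earlier, the amount of data written to and read from tempdb around an execution can be tracked with sys.dm_io_virtual_file_stats. The following is a minimal sketch of that before/after comparison; the delta calculation is an illustration and not part of the original article's script, and it assumes the CustomersByCreationDate procedure from the examples above.

      --Sketch: capture tempdb I/O before and after an execution and report the difference
      declare @bytes_written_before bigint, @bytes_read_before bigint
      select @bytes_written_before = sum(num_of_bytes_written), @bytes_read_before = sum(num_of_bytes_read)
            from sys.dm_io_virtual_file_stats(2, NULL)   --database_id 2 = tempdb, NULL = all files
      exec CustomersByCreationDate '2001-01-01', '2001-06-30'
      select sum(num_of_bytes_written) - @bytes_written_before as tempdb_bytes_written,
             sum(num_of_bytes_read) - @bytes_read_before as tempdb_bytes_read
            from sys.dm_io_virtual_file_stats(2, NULL)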
      Let's execute the stored procedure with a 12 month date range using the currently cached plan for the 6 month date range.

      --Example provided by www.sqlworkshops.com
      exec CustomersByCreationDate '2001-01-01', '2001-12-31'
      go

      The stored procedure took 1138 ms to complete. 259200 rows were estimated based on the optimize for hint value for the 6 month date range. The actual number of rows is 524160 due to the 12 month date range. The stored procedure was granted enough memory to sort the 6 month date range and not the 12 month date range, so there will be a spill over tempdb. There was a Sort Warnings event in SQL Profiler. As we see above, the optimize for hint cannot guarantee enough memory and optimal performance compared to the recompile hint.

      This article covers underestimation / overestimation of memory for Sort. Plan Caching and Query Memory Part II covers underestimation / overestimation for Hash Match operations. It is important to note that underestimation of memory for Sort and Hash Match operations leads to spills over tempdb and hence negatively impacts performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note that with Hash Match operations, overestimation of memory can actually lead to poor performance.

      Summary: A cached plan might lead to underestimation or overestimation of memory because the memory is estimated based on the first set of execution parameters. It is recommended not to cache the plan if the amount of memory required to execute the stored procedure has a wide range of possibilities. One can mitigate this by using the recompile hint, but that will lead to compilation overhead. However, in most cases it might be ok to pay for compilation rather than spilling the sort over tempdb, which could be very expensive compared to the compilation cost. The other possibility is to use the optimize for hint, but in case one sorts more data than hinted by the optimize for hint, this will still lead to a spill. On the other hand there is also the possibility of overestimation, leading to unnecessary memory issues for other concurrently executing queries. In the case of Hash Match operations, this overestimation of memory might lead to poor performance. When the values used in the optimize for hint are archived from the database, the estimation will be wrong, leading to worse performance, so one has to exercise caution before using the optimize for hint; the recompile hint is better in this case.

      I explain these concepts with detailed examples in my webcasts (www.sqlworkshops.com/webcasts); I recommend you watch them. The best way to learn is to practice. To create the above tables and reproduce the behavior, join the mailing list at www.sqlworkshops.com/ml and I will send you the relevant SQL Scripts.

      Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011, click here to register / Microsoft UK TechNet. These are hands-on workshops with a maximum of 12 participants and not lectures. For consulting engagements click here.

      Disclaimer and copyright information: This article refers to organizations and products that may be the trademarks or registered trademarks of their various owners. Copyright of this article belongs to R Meyyappan / www.sqlworkshops.com. You may freely use the ideas and concepts discussed in this article with acknowledgement (www.sqlworkshops.com), but you may not claim any of it as your own work.
This article is for informational purposes only; you use any of the suggestions given here entirely at your own risk.   R Meyyappan [email protected] LinkedIn: http://at.linkedin.com/in/rmeyyappan
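      The article recommends DBCC FREEPROCCACHE (plan_handle) over sp_recompile for removing a single cached plan in production, but does not show how to locate the plan handle. The following is a minimal sketch of one way to do that, assuming the CustomersByCreationDate procedure used in the examples above; it is an illustration, not part of the original article's script.

      --Sketch: find the cached plan for the stored procedure, then remove just that plan
      select cp.plan_handle, cp.usecounts, cp.objtype, st.text
            from sys.dm_exec_cached_plans cp
            cross apply sys.dm_exec_sql_text(cp.plan_handle) st
            where st.text like '%CustomersByCreationDate%'
      go
      --dbcc freeproccache (<plan_handle returned by the query above>)  --uncomment and paste the handle to evict the plan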

    Read the article

  • Adding Fake Build Information in TFS 2010

    - by Jakob Ehn
    We have been using TFS 2010 Build for distributing a build in parallel on several agents, but where the actual compilation is done by a bunch of external tools and compilers, i.e. no MSBuild involved. We are using the ParallelTemplate.xaml template that Jim Lamb blogged about previously, which distributes each configuration to a different agent. We developed custom activities for running these external compilers and collecting the information and errors by reading standard out/error and pushing it back to the build log. But since we aren't using MSBuild, we don't get the nice configuration summary section on the build summary page that we are used to. We would like to show the result of each configuration with any errors/warnings as usual, together with a link to the log file.

    TFS 2010 API to the rescue! What we need to do is add information to the InformationNode structure that is associated with every TFS build. The log that you normally see in the Log view is built up as a tree structure of IBuildInformationNode objects. This structure can be accessed by using the InformationNodeConverters class. This class also contains some helper methods for creating BuildProjectNodes, which contain the information about each project that was built, for example which configuration, the number of errors and warnings, and a link to the log file.

    Here is a code snippet that first creates a "fake" build from scratch and then adds two BuildProjectNodes, one for Debug|x86 and one for Release|x86, with some result information:

    TfsTeamProjectCollection collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://lt-jakob2010:8080/tfs"));
    IBuildServer buildServer = collection.GetService<IBuildServer>();
    var buildDef = buildServer.GetBuildDefinition("TeamProject", "BuildDefinition");

    //Create fake build with random build number
    var detail = buildDef.CreateManualBuild(new Random().Next().ToString());

    // Create Debug|x86 project summary
    IBuildProjectNode buildProjectNode = detail.Information.AddBuildProjectNode(DateTime.Now, "Debug", "MySolution.sln", "x86", "$/project/MySolution.sln", DateTime.Now, "Default");
    buildProjectNode.CompilationErrors = 1;
    buildProjectNode.CompilationWarnings = 1;
    buildProjectNode.Node.Children.AddBuildError("Compilation", "File1.cs", 12, 5, "", "Syntax error", DateTime.Now);
    buildProjectNode.Node.Children.AddBuildWarning("File2.cs", 3, 1, "", "Some warning", DateTime.Now, "Compilation");
    buildProjectNode.Node.Children.AddExternalLink("Log File", new Uri(@"\\server\share\logfiledebug.txt"));
    buildProjectNode.Save();

    // Create Release|x86 project summary
    buildProjectNode = detail.Information.AddBuildProjectNode(DateTime.Now, "Release", "MySolution.sln", "x86", "$/project/MySolution.sln", DateTime.Now, "Default");
    buildProjectNode.CompilationErrors = 0;
    buildProjectNode.CompilationWarnings = 0;
    buildProjectNode.Node.Children.AddExternalLink("Log File", new Uri(@"\\server\share\logfilerelease.txt"));
    buildProjectNode.Save();

    detail.Information.Save();
    detail.FinalizeStatus(BuildStatus.Failed);

    Running this code creates a build whose summary shows the two configurations with error and warning information and a link to a log file, just like a regular MSBuild build would have done. This is very useful when using TFS 2010 Build in heterogeneous environments. It would also be possible to do this when running compilations completely outside TFS Build, but then push the results into TFS for easy access.
    You can push all information, including the compilation summary, drop location, test results, etc., using the API.

    Read the article

  • Reporting services 2008: ReportExecution2005.asmx does not exist

    - by Shimrod
    Hi everyone, I'm trying to generate a report directly from code (to send it by mail afterwards). I'm doing this in a Windows service. So here is what I'm doing:

    Dim rview As New ReportViewer()
    Dim reportServerAddress As String = "http://server/Reports_client"
    rview.ServerReport.ReportServerUrl = New Uri(reportServerAddress)
    Dim paramList As New List(Of Microsoft.Reporting.WinForms.ReportParameter)
    paramList.Add(New Microsoft.Reporting.WinForms.ReportParameter("param1", t.Value))
    paramList.Add(New Microsoft.Reporting.WinForms.ReportParameter("CurrentDate", Date.Now))
    Dim reportsDirectory As String = "AppName.Reports"
    Dim reportPath As String = String.Format("/{0}/{1}", reportsDirectory, reportName)
    rview.ServerReport.ReportPath = reportPath
    rview.ServerReport.SetParameters(paramList) 'This is where I get the exception
    Dim mimeType, encoding, extension, deviceInfo As String
    Dim streamids As String()
    Dim warnings As Microsoft.Reporting.WinForms.Warning()
    deviceInfo = "<DeviceInfo><SimplePageHeaders>True</SimplePageHeaders></DeviceInfo>"
    Dim format As String = "PDF"
    Dim bytes As Byte() = rview.ServerReport.Render(format, deviceInfo, mimeType, encoding, extension, streamids, warnings)

    When debugging this code, I can see it throws a MissingEndpointException where I call SetParameters(paramList), with this message: The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version. Looking in the server's log file, I can see this:

    ui!ReportManager_0-8!878!06/02/2010-11:34:36:: Unhandled exception: System.Web.HttpException: The file '/Reports_client/ReportExecution2005.asmx' does not exist.
    at System.Web.UI.Util.CheckVirtualFileExists(VirtualPath virtualPath)
    at System.Web.Compilation.BuildManager.GetVPathBuildResultInternal(VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
    at System.Web.Compilation.BuildManager.GetVPathBuildResultWithNoAssert(HttpContext context, VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
    at System.Web.Compilation.BuildManager.GetVPathBuildResult(HttpContext context, VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
    at System.Web.Compilation.BuildManager.GetVPathBuildResult(HttpContext context, VirtualPath virtualPath)
    at System.Web.UI.WebServiceParser.GetCompiledType(String inputFile, HttpContext context)
    at System.Web.Services.Protocols.WebServiceHandlerFactory.GetHandler(HttpContext context, String verb, String url, String filePath)
    at System.Web.HttpApplication.MapHttpHandler(HttpContext context, String requestType, VirtualPath path, String pathTranslated, Boolean useAppConfig)
    at System.Web.HttpApplication.MapHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

    I didn't find any resource on the web that fits my problem. Does anyone have a clue? I'm able to view the reports from a web application, so I'm sure the server is running. Thanks in advance.

    Read the article

  • ASP.net MVC HttpException strange file not found

    - by Paddy
    I'm running an ASP.NET MVC site on IIS6 - I've edited my routing to look like the following:

    routes.MapRoute( "Default", "{controller}.aspx/{action}/{id}", new { controller = "Home", action = "Index", id = "" } );
    routes.MapRoute( "Root", "", new { controller = "Home", action = "Index", id = "" } );

    So all my urls now contain .aspx (as per one of the solutions from Phil Haack). Now, I catch all unhandled exceptions using Elmah, and for almost every page request, I get the following error caught by Elmah, which I never see on the front end (everything works perfectly):

    System.Web.HttpException: The file '/VirtualDirectoryName/Home.aspx' does not exist.
    System.Web.HttpException: The file '/VirtualDirectoryName/Home.aspx' does not exist.
    at System.Web.UI.Util.CheckVirtualFileExists(VirtualPath virtualPath)
    at System.Web.Compilation.BuildManager.GetVPathBuildResultInternal(VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
    at System.Web.Compilation.BuildManager.GetVPathBuildResultWithNoAssert(HttpContext context, VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
    at System.Web.Compilation.BuildManager.GetVirtualPathObjectFactory(VirtualPath virtualPath, HttpContext context, Boolean allowCrossApp, Boolean noAssert)
    at System.Web.Compilation.BuildManager.CreateInstanceFromVirtualPath(VirtualPath virtualPath, Type requiredBaseType, HttpContext context, Boolean allowCrossApp, Boolean noAssert)
    at System.Web.UI.PageHandlerFactory.GetHandlerHelper(HttpContext context, String requestType, VirtualPath virtualPath, String physicalPath)
    at System.Web.UI.PageHandlerFactory.System.Web.IHttpHandlerFactory2.GetHandler(HttpContext context, String requestType, VirtualPath virtualPath, String physicalPath)
    at System.Web.HttpApplication.MapHttpHandler(HttpContext context, String requestType, VirtualPath path, String pathTranslated, Boolean useAppConfig)
    at System.Web.HttpApplication.MapHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
    at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

    There is a Home controller, and it should be found, but I'm not sure a) where this is being called from, and b) why I don't see this error on the front end. Any ideas?

    Edited with answer: I think the answer for this can be found in this question: http://stackoverflow.com/questions/34194/asp-net-mvc-on-iis6

    Read the article

  • Updated MVC 4 to 5.2.0 via NuGet - site compiles but won't run

    - by hjavaher
    I had the bright idea of updating my perfectly working ASP.NET MVC 4 application to MVC 5 via NuGet. Everything compiles just fine, but when I try to run the application I get the following yellow screen of death. Has anyone encountered this, or does anyone know how to solve it? I've searched for it and couldn't find any solutions. Please let me know if there is any further information I can give you that would help.

    Attempt by security transparent method 'WebMatrix.WebData.PreApplicationStartCode.Start()' to access security critical method 'System.Web.WebPages.Razor.WebPageRazorHost.AddGlobalImport(System.String)' failed.

    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

    Exception Details: System.MethodAccessException: Attempt by security transparent method 'WebMatrix.WebData.PreApplicationStartCode.Start()' to access security critical method 'System.Web.WebPages.Razor.WebPageRazorHost.AddGlobalImport(System.String)' failed.

    Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

    Stack Trace:

    [MethodAccessException: Attempt by security transparent method 'WebMatrix.WebData.PreApplicationStartCode.Start()' to access security critical method 'System.Web.WebPages.Razor.WebPageRazorHost.AddGlobalImport(System.String)' failed.]
    WebMatrix.WebData.PreApplicationStartCode.Start() +112

    [InvalidOperationException: The pre-application start initialization method Start on type WebMatrix.WebData.PreApplicationStartCode threw an exception with the following error message: Attempt by security transparent method 'WebMatrix.WebData.PreApplicationStartCode.Start()' to access security critical method 'System.Web.WebPages.Razor.WebPageRazorHost.AddGlobalImport(System.String)' failed..]
    System.Web.Compilation.BuildManager.InvokePreStartInitMethodsCore(ICollection`1 methods, Func`1 setHostingEnvironmentCultures) +556
    System.Web.Compilation.BuildManager.InvokePreStartInitMethods(ICollection`1 methods) +132
    System.Web.Compilation.BuildManager.CallPreStartInitMethods(String preStartInitListPath, Boolean& isRefAssemblyLoaded) +102
    System.Web.Compilation.BuildManager.ExecutePreAppStart() +153
    System.Web.Hosting.HostingEnvironment.Initialize(ApplicationManager appManager, IApplicationHost appHost, IConfigMapPathFactory configMapPathFactory, HostingEnvironmentParameters hostingParameters, PolicyLevel policyLevel, Exception appDomainCreationException) +516

    [HttpException (0x80004005): The pre-application start initialization method Start on type WebMatrix.WebData.PreApplicationStartCode threw an exception with the following error message: Attempt by security transparent method 'WebMatrix.WebData.PreApplicationStartCode.Start()' to access security critical method 'System.Web.WebPages.Razor.WebPageRazorHost.AddGlobalImport(System.String)' failed..]
    System.Web.HttpRuntime.FirstRequestInit(HttpContext context) +9885060
    System.Web.HttpRuntime.EnsureFirstRequestInit(HttpContext context) +101
    System.Web.HttpRuntime.ProcessRequestNotificationPrivate(IIS7WorkerRequest wr, HttpContext context) +254

    Read the article

  • Plan Caching and Query Memory Part II (Hash Match) – When not to use stored procedure - Most common performance mistake SQL Server developers make.

    - by sqlworkshops
    SQL Server estimates the memory requirement at compile time. When a stored procedure or other plan caching mechanisms like sp_executesql or prepared statements are used, the memory requirement is estimated based on the first set of execution parameters. This is a common reason for spills over tempdb and hence poor performance. Common memory-allocating queries are those that perform Sort and Hash Match operations like Hash Join, Hash Aggregation or Hash Union. This article covers Hash Match operations with examples. It is recommended to read Plan Caching and Query Memory Part I before this article, which covers an introduction and query memory for Sort. In most cases it is cheaper to pay the compilation cost of dynamic queries than the huge cost of a spill over tempdb, unless the memory requirement for a query does not change significantly based on predicates.

    This article covers underestimation / overestimation of memory for Hash Match operations. Plan Caching and Query Memory Part I covers underestimation / overestimation for Sort. It is important to note that underestimation of memory for Sort and Hash Match operations leads to spills over tempdb and hence negatively impacts performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note that with Hash Match operations, overestimation of memory can actually lead to poor performance.

    To read additional articles I wrote click here.

    The best way to learn is to practice. To create the below tables and reproduce the behavior, join the mailing list by using this link: www.sqlworkshops.com/ml and I will send you the table creation script. Most of these concepts are also covered in our webcasts: www.sqlworkshops.com/webcasts

    Let's create a CustomersState table that has 99% of customers in NY and the remaining 1% in WA. The Customers table used in Part I of this article is also used here. To observe Hash Warnings, enable 'Hash Warning' in SQL Profiler under Events 'Errors and Warnings'.

    --Example provided by www.sqlworkshops.com
    drop table CustomersState
    go
    create table CustomersState (CustomerID int primary key, Address char(200), State char(2))
    go
    insert into CustomersState (CustomerID, Address) select CustomerID, 'Address' from Customers
    update CustomersState set State = 'NY' where CustomerID % 100 != 1
    update CustomersState set State = 'WA' where CustomerID % 100 = 1
    go
    update statistics CustomersState with fullscan
    go

    Let's create a stored procedure that joins customers with the CustomersState table with a predicate on State.

    --Example provided by www.sqlworkshops.com
    create proc CustomersByState @State char(2) as
    begin
          declare @CustomerID int
          select @CustomerID = e.CustomerID from Customers e
                inner join CustomersState es on (e.CustomerID = es.CustomerID)
                where es.State = @State
          option (maxdop 1)
    end
    go

    Let's execute the stored procedure first with parameter value 'WA', which will select 1% of the data.

    set statistics time on
    go
    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'WA'
    go

    The stored procedure took 294 ms to complete. The stored procedure was granted 6704 KB based on 8000 rows being estimated. The estimated number of rows, 8000, is similar to the actual number of rows, 8000, and hence the memory estimation should be ok. There was no Hash Warning event in SQL Profiler.

    Now let's execute the stored procedure with parameter value 'NY', which will select 99% of the data.
    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'NY'
    go

    The stored procedure took 2922 ms to complete. The stored procedure was granted 6704 KB based on 8000 rows being estimated. The estimated number of rows, 8000, is way different from the actual number of rows, 792000, because the estimation is based on the first set of parameter values supplied to the stored procedure, which is 'WA' in our case. This underestimation will lead to a spill over tempdb, resulting in poor performance. There was a Hash Warning (Recursion) event in SQL Profiler.

    Let's recompile the stored procedure and then execute it first with parameter value 'NY'. In a production instance it is not advisable to use sp_recompile; instead one should use DBCC FREEPROCCACHE (plan_handle). This is due to locking issues involved with sp_recompile; refer to our webcasts, www.sqlworkshops.com/webcasts, for further details.

    exec sp_recompile CustomersByState
    go
    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'NY'
    go

    Now the stored procedure took only 1046 ms instead of 2922 ms. The stored procedure was granted 146752 KB of memory. The estimated number of rows, 792000, is similar to the actual number of rows of 792000. The better performance of this stored procedure execution is due to better estimation of memory and avoiding the spill over tempdb. There was no Hash Warning event in SQL Profiler.

    Now let's execute the stored procedure with parameter value 'WA'.

    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'WA'
    go

    The stored procedure took 351 ms to complete, higher than the previous execution time of 294 ms. This stored procedure was granted more memory (146752 KB) than necessary (6704 KB) based on parameter value 'NY' for the estimation (792000 rows) instead of parameter value 'WA' (8000 rows). This is because the estimation is based on the first set of parameter values supplied to the stored procedure, which is 'NY' in this case. This overestimation leads to poor performance of this Hash Match operation; it might also affect the performance of other concurrently executing queries requiring memory, and hence overestimation is not recommended. The estimated number of rows, 792000, is much more than the actual number of rows of 8000.

    Intermediate Summary: This issue can be avoided by not caching the plan for memory allocating queries. Another possibility is to use the recompile hint or the optimize for hint to allocate memory for a predefined data range.

    Let's recreate the stored procedure with the recompile hint.

    --Example provided by www.sqlworkshops.com
    drop proc CustomersByState
    go
    create proc CustomersByState @State char(2) as
    begin
          declare @CustomerID int
          select @CustomerID = e.CustomerID from Customers e
                inner join CustomersState es on (e.CustomerID = es.CustomerID)
                where es.State = @State
          option (maxdop 1, recompile)
    end
    go

    Let's execute the stored procedure initially with parameter value 'WA' and then with parameter value 'NY'.

    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'WA'
    go
    exec CustomersByState 'NY'
    go

    The stored procedure took 297 ms and 1102 ms, in line with the previous optimal execution times. The stored procedure with parameter value 'WA' has good estimation like before. The estimated number of rows of 8000 is similar to the actual number of rows of 8000.
    The stored procedure with parameter value 'NY' also has good estimation and memory grant like before because the stored procedure was recompiled with the current set of parameter values. The estimated number of rows of 792000 is similar to the actual number of rows of 792000. The compilation time and compilation CPU of 1 ms is not expensive in this case compared to the performance benefit. There was no Hash Warning event in SQL Profiler.

    Let's recreate the stored procedure with an optimize for hint of 'NY'.

    --Example provided by www.sqlworkshops.com
    drop proc CustomersByState
    go
    create proc CustomersByState @State char(2) as
    begin
          declare @CustomerID int
          select @CustomerID = e.CustomerID from Customers e
                inner join CustomersState es on (e.CustomerID = es.CustomerID)
                where es.State = @State
          option (maxdop 1, optimize for (@State = 'NY'))
    end
    go

    Let's execute the stored procedure initially with parameter value 'WA' and then with parameter value 'NY'.

    --Example provided by www.sqlworkshops.com
    exec CustomersByState 'WA'
    go
    exec CustomersByState 'NY'
    go

    The stored procedure took 353 ms with parameter value 'WA'; this is much slower than the optimal execution time of 294 ms we observed previously. This is because of the overestimation of memory. The stored procedure with parameter value 'NY' has an optimal execution time like before. The stored procedure with parameter value 'WA' has overestimation of rows because of the optimize for hint value of 'NY'. Unlike before, more memory was granted to this stored procedure based on the optimize for hint value 'NY'. The stored procedure with parameter value 'NY' has good estimation because of the optimize for hint value of 'NY'. The estimated number of rows of 792000 is similar to the actual number of rows of 792000. An optimal amount of memory was granted to this stored procedure based on the optimize for hint value 'NY'. There was no Hash Warning event in SQL Profiler.

    This article covers underestimation / overestimation of memory for Hash Match operations. Plan Caching and Query Memory Part I covers underestimation / overestimation for Sort. It is important to note that underestimation of memory for Sort and Hash Match operations leads to spills over tempdb and hence negatively impacts performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note that with Hash Match operations, overestimation of memory can actually lead to poor performance.

    Summary: A cached plan might lead to underestimation or overestimation of memory because the memory is estimated based on the first set of execution parameters. It is recommended not to cache the plan if the amount of memory required to execute the stored procedure has a wide range of possibilities. One can mitigate this by using the recompile hint, but that will lead to compilation overhead. However, in most cases it might be ok to pay for compilation rather than spilling the sort over tempdb, which could be very expensive compared to the compilation cost. The other possibility is to use the optimize for hint, but in case one sorts more data than hinted by the optimize for hint, this will still lead to a spill. On the other hand there is also the possibility of overestimation, leading to unnecessary memory issues for other concurrently executing queries. In the case of Hash Match operations, this overestimation of memory might lead to poor performance.
    When the values used in the optimize for hint are archived from the database, the estimation will be wrong, leading to worse performance, so one has to exercise caution before using the optimize for hint; the recompile hint is better in this case.

    I explain these concepts with detailed examples in my webcasts (www.sqlworkshops.com/webcasts); I recommend you watch them. The best way to learn is to practice. To create the above tables and reproduce the behavior, join the mailing list at www.sqlworkshops.com/ml and I will send you the relevant SQL Scripts.

    Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011, click here to register / Microsoft UK TechNet. These are hands-on workshops with a maximum of 12 participants and not lectures. For consulting engagements click here.

    Disclaimer and copyright information: This article refers to organizations and products that may be the trademarks or registered trademarks of their various owners. Copyright of this article belongs to R Meyyappan / www.sqlworkshops.com. You may freely use the ideas and concepts discussed in this article with acknowledgement (www.sqlworkshops.com), but you may not claim any of it as your own work. This article is for informational purposes only; you use any of the suggestions given here entirely at your own risk.

    R Meyyappan [email protected] LinkedIn: http://at.linkedin.com/in/rmeyyappan
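    Both parts of this article work from the memory grant reported for the executing query. A quick way to watch requested versus granted memory while the stored procedure runs is the sys.dm_exec_query_memory_grants DMV. The following is a minimal sketch, run from a second session during execution; this DMV query is an illustration and not part of the original article's scripts.

    --Sketch: compare requested, granted, used and ideal memory for currently executing queries
    select session_id, requested_memory_kb, granted_memory_kb, used_memory_kb, max_used_memory_kb, ideal_memory_kb
          from sys.dm_exec_query_memory_grants
    go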

    Read the article

  • How to compile x64 asp.net website?

    - by Eran Betzalel
    I'm trying to compile (using Visual Studio) an ASP.NET website with the Chilkat library. The compilation fails due to this error: Could not load file or assembly 'ChilkatDotNet2, Version=9.0.8.0, Culture=neutral, PublicKeyToken=eb5fc1fc52ef09bd' or one of its dependencies. An attempt was made to load a program with an incorrect format. I've been told that this error occurs because of platform noncompliance. The weird thing is that although the compilation fails, the site works once accessed from a browser. My theory is that the IIS compilation uses the csc.exe compiler from the Framework64 (64-bit) folder while Visual Studio uses the csc.exe compiler from the Framework (32-bit) folder. If this is actually the case, how can I configure my Visual Studio to use the 64-bit compiler for ASP.NET sites? This is my current development configuration: Windows 7 (x64). Visual Studio 2008 Pro (x86 of course...). Chilkat library (x64). IIS/ASP.NET (x64).

    Read the article

  • How can I resolve gstreamer dependencies in Ubuntu

    - by michael
    Hi, can you please tell me how I can resolve these dependencies on Ubuntu:

    checking for GSTREAMER... configure: error: Package requirements (gstreamer-0.10 >= 0.10 gstreamer-app-0.10 gstreamer-base-0.10 gstreamer-pbutils-0.10 gstreamer-plugins-base-0.10 >= 0.10.25 gstreamer-video-0.10) were not met: No package 'gstreamer-app-0.10' found No package 'gstreamer-pbutils-0.10' found No package 'gstreamer-plugins-base-0.10' found No package 'gstreamer-video-0.10' found

    I have tried:

    $ sudo apt-get install *gstreamer-video*
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    E: Regex compilation error - Invalid preceding regular expression

    $ sudo apt-get install *gstreamer-app*
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    E: Regex compilation error - Invalid preceding regular expression

    $ sudo apt-get install *gstreamer-base*
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    E: Regex compilation error - Invalid preceding regular expression

    Read the article

  • Why does a module compile by itself but fail when used from elsewhere?

    - by chrisribe
    I have a Perl module that appears to compile fine by itself, but is causing other programs to fail compilation when it is included: me@host:~/code $ perl -c -Imodules modules/Rebat/Store.pm modules/Rebat/Store.pm syntax OK me@host:~/code $ perl -c -Imodules bin/rebat-report-status Attempt to reload Rebat/Store.pm aborted Compilation failed in require at bin/rebat-report-status line 4. BEGIN failed--compilation aborted at bin/rebat-report-status line 4. The first few lines of rebat-report-status are ... 3 use Rebat; 4 use Rebat::Store; 5 use strict; ...

    Read the article

  • Alter a single attribute with XSLT

    - by Pete Montgomery
    Hello! <?xml version="1.0" encoding="utf-8" ?> <configuration> <system.web> <compilation debug="true" defaultLanguage="C#"> <!-- this is a comment --> </compilation> </system.web> </configuration> What's the simplest XSLT you can think of to transform the value of the first, in this case only, /configuration/system.web/compilation/@debug attribute from true to false?

    Read the article

  • Make errors - can gcc compiler warnings prevent a C file from being compiled into an object file

    - by Xolstice
    I'm trying to compile a wireless network card driver for my Linux box and I ran into a problem with the Make command. During the compilation process I normally see warnings on some of the C files that are being compiled; despite the warnings, these files were still able to be compiled to object files. When the Make process comes to a file called rtmp_wext.c, however, the compiler generates a large number of warnings and then the whole Make process stops and returns an exit status of error 1, i.e. make: *** [rtmp_wext.o] Error 1. Usually I would expect an error in a C file, not warnings, to be what halts compilation. This is the first time where it seems compiler warnings are preventing the file from being turned into an object file; is this possible, or is something else the cause of the unsuccessful compilation?

    Read the article

  • C: Why does gcc allow char array initialization with string literal larger than array?

    - by Ashwin
    int main() { char a[7] = "Network"; return 0; } A string literal in C is terminated internally with a nul character. So, the above code should give a compilation error since the actual length of the string literal Network is 8 and it cannot fit in a char[7] array. However, gcc (even with -Wall) on Ubuntu compiles this code without any error or warning. Why does gcc allow this and not flag it as compilation error? gcc only gives a warning (still no error!) when the char array size is smaller than the string literal. For example, it warns on: char a[6] = "Network"; [Related] Visual C++ 2012 gives a compilation error for char a[7]: 1>d:\main.cpp(3): error C2117: 'a' : array bounds overflow 1> d:\main.cpp(3) : see declaration of 'a'

    Read the article

  • Bitwise operators in DX9 ps_2_0 shader

    - by lapin
    I've got the following code in a shader: // v & y are both floats nPixel = v; nPixel << 8; nPixel |= y; and this gives me the following error in compilation: shader.fx(80,10): error X3535: Bitwise operations not supported on legacy targets. shader.fx(92,18): ID3DXEffectCompiler::CompileEffect: There was an error compiling expression ID3DXEffectCompiler: Compilation failed The error is on the following line: nPixel |= y; What am I doing wrong here?

    Read the article

  • Installing Cairo to get FastRWeb working for R gWidgetsWWW2 -pkg

    - by hhh
    I want to install FastRWeb for R, but it requires Cairo. How can I install Cairo?

    compilation terminated.
    make: *** [xlib-backend.o] Error 1
    ERROR: compilation failed for package ‘Cairo’
    * removing ‘/home/xfz/R/i686-pc-linux-gnu-library/2.13/Cairo’
    ERROR: dependency ‘Cairo’ is not available for package ‘FastRWeb’
    * removing ‘/home/xfz/R/i686-pc-linux-gnu-library/2.13/FastRWeb’
    The downloaded packages are in ‘/tmp/Rtmpno8hhF/downloaded_packages’
    Warning messages:
    1: In install.packages("FastRWeb", , "http://rforge.net/", type = "source") : installation of package 'Cairo' had non-zero exit status
    2: In install.packages("FastRWeb", , "http://rforge.net/", type = "source") : installation of package 'FastRWeb' had non-zero exit status

    I cannot find what Cairo is here; 16 entries come up with this search term below. It is apparently some library.

    $ apt-cache search libcairo|wc
    16 132 996

    Perhaps related: http://stackoverflow.com/questions/9826128/r-making-r-rook-program-into-rscript-program-r http://stackoverflow.com/questions/9812547/r-gui-vizualiser-with-command-line-access-browser-based-letting-users-to-s Some related packages: FastRWeb and RServe for the gWidgetsWWW2 -pkg.

    Read the article

  • Is it important to obfuscate C++ application code?

    - by user827992
    In the Java world, it sometimes seems to be a problem, but what about C++? Are there different solutions? I was thinking about the fact that someone can replace the C++ library of a specific OS with a different version of the same library, but full of debug symbols, to understand what my code does. Is it a good thing to use standard or popular libraries? This can also happen with a DLL under Windows replaced with the "debug version" of that library. Is it better to prefer static compilation? In commercial applications, I see that for the core of their app they compile everything statically, and for the most part the DLLs (dynamic libraries in general) are used to offer some third party technologies like anti-piracy solutions (I see this in many games), GUI libraries (like Qt), OS libraries, etc. Is static compilation the equivalent of obfuscation in the Java world? In better terms, is it the best and most affordable solution to protect your code?

    Read the article

  • How to detect GLSL warnings?

    - by msell
    After compiling a shader with glCompileShader, I can call glGetShaderiv with GL_COMPILE_STATUS to check if the shader compiled successfully. I can also call glGetShaderInfoLog to get information about possible errors, warnings or other info. The information log returned by this function is unspecified. In a tool where users can write their own shaders, I would like to print all errors and warnings from the compilation, but nothing if no warnings or errors were found. The problem is that GL_COMPILE_STATUS returns only false if the compilation failed and true otherwise. If no problems were found, some drivers return an empty info log from glGetShaderInfoLog, but some drivers can return something else, such as "No errors.", which I do not want to print to the user. How is this problem generally solved?

    Read the article

  • How to compile and build fast in Visual Studio 2010

    - by anirudha
    Sometimes a solution includes many things besides the main project. For example, an ASP.NET MVC solution may include a test project for the project you run, or several other projects attached to the current one. Debugging then takes a long time, because the solution has many subprojects or attached projects and compiling all of them can take a while. The solution is to build and debug only the current project instead of all of them; this saves compilation time in Visual Studio. To build only the current project, open Configuration Manager in Visual Studio, keep the project you are currently working on checked, and uncheck all the others. After that, debugging in Visual Studio is faster.

    Read the article

  • Generating CMakeLists.txt

    - by vanna
    I've got a bunch of C++ source files and headers. They may use external libraries such as Boost. I am interested in the process of building binaries for Windows and *nix. Makefiles (*nix) and .vcproj files (Windows) call compilers with some specifications such as the order of compilation, compilation options and so on. CMakeLists.txt can be used by CMake to generate either makefiles or .vcproj files, and it offers very helpful commands such as recursive search of files, automatic linkage with known libraries, installers, variables that can be used in source files... Is there any existing tool that would generate a CMakeLists.txt from specified options? Options could be like: scan this folder and make a library out of it, then scan this other folder and make an executable, automatically link both with Boost, along with a user-friendly installer and generated INSTALL.txt and README.txt. Something very powerful like that.

    Read the article

  • Why can't the W3C validator find Perl's SGML::Parser::OpenSP?

    - by coure06
    I have installed the W3C validator locally on Windows, following their instructions. I am getting this error while validating a site:

    Can't locate loadable object for module SGML::Parser::OpenSP in @INC (@INC contains: C:/www/perl/site/lib C:/www/perl/lib .) at C:/www/validator/httpd/cgi-bin/check line 60
    Compilation failed in require at C:/www/validator/httpd/cgi-bin/check line 60.
    BEGIN failed--compilation aborted at C:/www/validator/httpd/cgi-bin/check line 60.

    I am using the SGML::Parser::OpenSP 0.991 module.

    Read the article

  • How to manually throw a compiler error in GCC and Xcode

    - by coneybeare
    In Xcode, while compiling apps with gcc, I want to throw compile-time errors if things like NSZombieEnabled are on for a distribution release, thus ensuring that compilation will fail and I won't accidentally do something stupid. I did some googling, but could not figure out how to cause the compiler to bail if a certain condition is met. Surely it must be easy; am I just not finding it?

    Read the article

  • How to figure out why a .NET app runs as 32-bit on a 64-bit machine

    - by user54064
    I have a .NET app (WebForms, .NET 3.5) that is running on a 64-bit server as 32-bit (I checked the IntPtr.Size result). The compilation is set to AnyCPU, so I would expect that on a 64-bit machine the app would run as 64-bit. There are many third-party programs incorporated into the app; could they be causing a problem? How do I figure out why 64-bit compilation is not being done?

    Read the article

  • Cannot deploy asp.net openid library on shared hosting service

    - by asksuperuser
    I have successfully deployed the DotNetOpenId DLL under IIS7, but on my shared hosting service it says:

    Compilation Error
    Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.
    Compiler Error Message: CS0246: The type or namespace name 'DotNetOpenId' could not be found (are you missing a using directive or an assembly reference?)

    Why?

    Read the article
