Search Results

Search found 16126 results on 646 pages for 'wcf performance'.

Page 11/646 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • Bad 3D Performance in Ubuntu 12.04

    - by Pandem
    I already posted a question before but I didn't really get any advice/help. I'll be a bit more brief/general in the hope it'll help. I have an MSI HD 7850 with the Catalyst 12.4 drivers installed. I've found that I'm having bad 3D performance for some reason, but I'm not entirely sure why. I suspect it may just be that the graphics card is new and AMD still needs to work on its drivers, but it would be nice to get advice and narrow the problem down so that I can be sure rather than wait for driver updates that may not even help. I ran glxgears to get a general idea of how bad the performance is. At default size it averages around 2000 FPS. The command glxinfo confirms the renderer is using AMD Radeon HD 7800 Series with OpenGL version 4.2. Edits below, as asked for by others: lspci -v output is here. fglrxinfo output is here. xvinfo output is here. glxinfo | grep rendering says yes for direct rendering. These confirmed that everything was configured correctly. Within Unity and GNOME Classic: glxgears ran at around 2000 FPS and fgl_glxgears at around 544 FPS. Within LXDE: glxgears ran at around 4600 FPS and fgl_glxgears at around 1600 FPS. In the end it was discovered that Compiz was causing a large performance decrease, and the solution was simply to change the window manager for the time being. Thanks to TechZilla for all his help!

    Read the article

  • Fix: WCF - The type provided as the Service attribute value in the ServiceHost directive could not be found

    - by Ken Cox [MVP]
    I wanted to expose some raw data to users in my current ASP.NET 3.5 web site project. I created a subdirectory called ‘datafeeds’ and added a WCF Data Service. I wired the dataservice up to the Entity Framework class and, on running the ItemDataService.svc file, was greeted with: The type  <> provided as the Service attribute value in the ServiceHost directive could not be found So why couldn’t it find the class? It was right there in the… oops! Instead of putting the ItemDataService.vb...(read more)

    Read the article

  • Getting WCF Bindings and Behaviors from any config source

    - by cibrax
    The need of loading WCF bindings or behaviors from different sources such as files in a disk or databases is a common requirement when dealing with configuration either on the client side or the service side. The traditional way to accomplish this in WCF is loading everything from the standard configuration section (serviceModel section) or creating all the bindings and behaviors by hand in code. However, there is a solution in the middle that becomes handy when more flexibility is needed. This solution involves getting the configuration from any place, and use that configuration to automatically configure any existing binding or behavior instance created with code.  In order to configure a binding instance (System.ServiceModel.Channels.Binding) that you later inject in any endpoint on the client channel or the service host, you first need to get a binding configuration section from any configuration file (you can generate a temp file on the fly if you are using any other source for storing the configuration).  private BindingsSection GetBindingsSection(string path) { System.Configuration.Configuration config = System.Configuration.ConfigurationManager.OpenMappedExeConfiguration( new System.Configuration.ExeConfigurationFileMap() { ExeConfigFilename = path }, System.Configuration.ConfigurationUserLevel.None); var serviceModel = ServiceModelSectionGroup.GetSectionGroup(config); return serviceModel.Bindings; }   The BindingsSection contains a list of all the configured bindings in the serviceModel configuration section, so you can iterate through all the configured binding that get the one you need (You don’t need to have a complete serviceModel section, a section with the bindings only works).  public Binding ResolveBinding(string name) { BindingsSection section = GetBindingsSection(path); foreach (var bindingCollection in section.BindingCollections) { if (bindingCollection.ConfiguredBindings.Count > 0 && bindingCollection.ConfiguredBindings[0].Name == name) { var bindingElement = bindingCollection.ConfiguredBindings[0]; var binding = (Binding)Activator.CreateInstance(bindingCollection.BindingType); binding.Name = bindingElement.Name; bindingElement.ApplyConfiguration(binding); return binding; } } return null; }   The code above does just that, and also instantiates and configures the Binding object (System.ServiceModel.Channels.Binding) you are looking for. As you can see, the binding configuration element contains a method “ApplyConfiguration” that receives the binding instance that needs to be configured. A similar thing can be done for instance with the “Endpoint” behaviors. You first get the BehaviorsSection, and then, the behavior you want to use.  
private BehaviorsSection GetBehaviorsSection(string path) { System.Configuration.Configuration config = System.Configuration.ConfigurationManager.OpenMappedExeConfiguration( new System.Configuration.ExeConfigurationFileMap() { ExeConfigFilename = path }, System.Configuration.ConfigurationUserLevel.None); var serviceModel = ServiceModelSectionGroup.GetSectionGroup(config); return serviceModel.Behaviors; }public List<IEndpointBehavior> ResolveEndpointBehavior(string name) { BehaviorsSection section = GetBehaviorsSection(path); List<IEndpointBehavior> endpointBehaviors = new List<IEndpointBehavior>(); if (section.EndpointBehaviors.Count > 0 && section.EndpointBehaviors[0].Name == name) { var behaviorCollectionElement = section.EndpointBehaviors[0]; foreach (BehaviorExtensionElement behaviorExtension in behaviorCollectionElement) { object extension = behaviorExtension.GetType().InvokeMember("CreateBehavior", BindingFlags.InvokeMethod | BindingFlags.NonPublic | BindingFlags.Instance, null, behaviorExtension, null); endpointBehaviors.Add((IEndpointBehavior)extension); } return endpointBehaviors; } return null; }   In this case, the code for creating the behavior instance is more tricky. First of all, a behavior in the configuration section actually represents a set of “IEndpoint” behaviors, and the behavior element you get from the configuration does not have any public method to configure an existing behavior instance. This last one only contains a protected method “CreateBehavior” that you can use for that purpose. Once you get this code implemented, a client channel can be easily configured as follows  var binding = resolver.ResolveBinding("MyBinding"); var behaviors = resolver.ResolveEndpointBehavior("MyBehavior"); SampleServiceClient client = new SampleServiceClient(binding, new EndpointAddress(new Uri("http://localhost:13749/SampleService.svc"), new DnsEndpointIdentity("localhost"))); foreach (var behavior in behaviors) { if(client.Endpoint.Behaviors.Contains(behavior.GetType())) { client.Endpoint.Behaviors.Remove(behavior.GetType()); } client.Endpoint.Behaviors.Add(behavior); }   The code above assumes that a configuration file (in any place) with a binding “MyBinding” and a behavior “MyBehavior” exists. That file can look like this,  <system.serviceModel> <bindings> <basicHttpBinding> <binding name="MyBinding"> <security mode="Transport"></security> </binding> </basicHttpBinding> </bindings> <behaviors> <endpointBehaviors> <behavior name="MyBehavior"> <clientCredentials> <windows/> </clientCredentials> </behavior> </endpointBehaviors> </behaviors> </system.serviceModel>   The same thing can be done of course in the service host if you want to manually configure the bindings and behaviors.  

    Read the article

  • What are Silverlight, WCF RIA services or applications?

    - by Pankaj Upadhyay
    I asked a question here on Programmers yesterday about learning HTML & CSS, and the community was generous enough to provide great answers. One of the answers, given by Emmad Kareem, was: "if you can't do HTML, don't give up. Consider using Silverlight". This answer made me visit Silverlight.net, where I came across the terms WCF RIA Services and Silverlight applications. After going through the website and some of its articles, I am unable to draw a conclusive understanding of what this is all about. Is this another way of building websites using .NET, just another framework like ASP.NET MVC 3? What scenarios and requirements are Silverlight applications targeted at, or are we free to use either ASP.NET MVC or Silverlight for any web application requirement?

    Read the article

  • Making your WCF Web APIs speak multiple languages

    - by cibrax
    One of the key aspects of how the web works today is content negotiation. The idea of content negotiation is based on the fact that a single resource can have multiple representations, so user agents (or clients) and servers can work together to choose one of them. The HTTP specification defines several “Accept” headers that a client can use to negotiate content with a server, and among all those, there is one for restricting the set of natural languages that are preferred as a response to a request, “Accept-Language”. For example, a client can specify “es” in this header to say that it prefers to receive the content in Spanish, or “en” for English. However, there are certain scenarios where the “Accept-Language” header is just not enough, and you might want to have a way to pass the “accepted” language as part of the resource URL, as an extension. For example, “http://localhost/ProductCatalog/Products/1.es” returns all the descriptions for the product with id “1” in Spanish. This is useful for scenarios in which you want to embed the link somewhere, such as a document, an email or a page.  Supporting both scenarios, the header and the URL extension, is really simple in the new WCF programming model. You only need to provide a processor implementation for either of them. Let’s say I have a resource implementation as part of a product catalog I want to expose with the WCF Web APIs. [ServiceContract][Export]public class ProductResource{ IProductRepository repository;  [ImportingConstructor] public ProductResource(IProductRepository repository) { this.repository = repository; }  [WebGet(UriTemplate = "{id}")] public Product Get(string id, HttpResponseMessage response) { var product = repository.GetById(int.Parse(id)); if (product == null) { response.StatusCode = HttpStatusCode.NotFound; response.Content = new StringContent(Messages.OrderNotFound); }  return product; }} The Get method implementation in this resource assumes the desired culture will be attached to the current thread (Thread.CurrentThread.CurrentCulture). Another option is to pass the desired culture as an additional argument of the method, so my processor implementation will handle both options. This method also uses an auto-generated class for handling string resources, Messages, which is available in the different cultures that the service implementation supports. For example, Messages.resx contains “OrderNotFound”: “Order Not Found”, and Messages.es.resx contains “OrderNotFound”: “No se encontro orden”. The processor implementation below tackles the first scenario, in which the desired language is passed as part of the “Accept-Language” header.
    public class CultureProcessor : Processor<HttpRequestMessage, CultureInfo>{ string defaultLanguage = null;  public CultureProcessor(string defaultLanguage = "en") { this.defaultLanguage = defaultLanguage; this.InArguments[0].Name = HttpPipelineFormatter.ArgumentHttpRequestMessage; this.OutArguments[0].Name = "culture"; }  public override ProcessorResult<CultureInfo> OnExecute(HttpRequestMessage request) { CultureInfo culture = null; if (request.Headers.AcceptLanguage.Count > 0) { var language = request.Headers.AcceptLanguage.First().Value; culture = new CultureInfo(language); } else { culture = new CultureInfo(defaultLanguage); }  Thread.CurrentThread.CurrentCulture = culture; Messages.Culture = culture;  return new ProcessorResult<CultureInfo> { Output = culture }; }}   As you can see, the processor initializes a new CultureInfo instance with the value provided in the “Accept-Language” header, and sets that instance on the current thread and on the auto-generated resource class with all the messages. In addition, the CultureInfo instance is returned as an output argument called “culture”, making it possible to receive that argument in any method implementation.   The following code shows the implementation of the processor for handling languages as URL extensions.   public class CultureExtensionProcessor : Processor<HttpRequestMessage, Uri>{ public CultureExtensionProcessor() { this.OutArguments[0].Name = HttpPipelineFormatter.ArgumentUri; }  public override ProcessorResult<Uri> OnExecute(HttpRequestMessage httpRequestMessage) { var requestUri = httpRequestMessage.RequestUri.OriginalString;  var extensionPosition = requestUri.LastIndexOf(".");  if (extensionPosition > -1) { var extension = requestUri.Substring(extensionPosition + 1);  var query = httpRequestMessage.RequestUri.Query;  requestUri = string.Format("{0}?{1}", requestUri.Substring(0, extensionPosition), query);  var uri = new Uri(requestUri);  httpRequestMessage.Headers.AcceptLanguage.Clear();  httpRequestMessage.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue(extension));  var result = new ProcessorResult<Uri>();  result.Output = uri;  return result; }  return new ProcessorResult<Uri>(); }} The last step is to inject both processors as part of the service configuration, as shown below: public void RegisterRequestProcessorsForOperation(HttpOperationDescription operation, IList<Processor> processors, MediaTypeProcessorMode mode){ processors.Insert(0, new CultureExtensionProcessor()); processors.Add(new CultureProcessor());} Once you have configured the two processors in the pipeline, your service will start speaking different languages :). Note: URL extensions don’t seem to work in the current bits when they are used on a base address. As far as I could see, ASP.NET intercepts the request first and tries to route it to a registered ASP.NET HTTP handler with that extension. For example, “http://localhost/ProductCatalog/products.es” does not work, but “http://localhost/ProductCatalog/products/1.es” does.

    Read the article

  • Exposing business logic as WCF service

    - by Oren Schwartz
    I'm working on a middle-tier project which encapsulates the business logic (it uses a DAL layer and serves a web application server [ASP.NET]) of a product deployed in a LAN. The BL consists of a bunch of services and data objects that are invoked upon user action. At present, the DAL acts as a separate application that the BL uses, whereas the BL itself is consumed by the web application as a DLL. The DAL and the web application are deployed on different servers inside the organization, and since the BL DLL is consumed by the web application, it resides on the same server. The worst thing about exposing the BL as a DLL is that we lose track of what we expose. Deployment is not such a big issue since, mostly, product versions are deployed together. Would you recommend migrating from a DLL to a WCF service? If so, why? Do you know anyone who has had a similar experience?
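
    Migrating from a referenced DLL to a WCF service mostly means making the exposed surface explicit: a contract, DTOs instead of business objects, and a thin service class that delegates to the existing BL. A minimal sketch of that shape is below; the repository, order type and operation are illustrative stand-ins, not taken from the post.

        using System.Runtime.Serialization;
        using System.ServiceModel;

        // Stand-ins for whatever the existing business layer already exposes.
        public interface IOrderRepository { Order GetById(int id); }
        public class Order { public int Id; public string Status; }

        // The explicit, versionable surface the web application is allowed to call.
        [ServiceContract]
        public interface IOrderService
        {
            [OperationContract]
            OrderDto GetOrder(int orderId);
        }

        // A DTO rather than the internal business object, so BL internals stay hidden.
        [DataContract]
        public class OrderDto
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public string Status { get; set; }
        }

        // A thin service implementation that simply delegates to the existing BL.
        public class OrderService : IOrderService
        {
            private readonly IOrderRepository repository;
            public OrderService(IOrderRepository repository) { this.repository = repository; }

            public OrderDto GetOrder(int orderId)
            {
                var order = repository.GetById(orderId);
                return new OrderDto { Id = order.Id, Status = order.Status };
            }
        }

    Keeping the contract this small is also what restores the "track of what we expose" that the DLL approach loses.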

    Read the article

  • Dictionary as DataMember in WCF after installing .NET 4.5 [migrated]

    - by Mauricio Ulate
    After installing .NET Framework 4.5 with Visual Studio 2012, whenever I obtain a service reference from a WCF service, my dictionaries are changed into arrays. For example, Dictionary<int, double> is changed into ArrayOfKeyValueOfintdoubleKeyValueOfintdouble. This happens in both Visual Studio 2012 and 2010 (both Express). I've reviewed my configuration, and the dictionary data type in the service reference configuration is System.Collections.Generic.Dictionary. Changing this doesn't make a difference. Reverting to just using Visual Studio 2010 and .NET 4.0 is not an option.
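
    For reference, the ArrayOfKeyValueOfintdouble type is simply how the DataContractSerializer describes a dictionary in the service's WSDL/XSD; what differs between proxies is only the CLR type the generated code maps it to. A hedged sketch of a contract carrying a dictionary, plus the svcutil switch that controls that mapping (the contract name is made up):

        using System.Collections.Generic;
        using System.Runtime.Serialization;
        using System.ServiceModel;

        [ServiceContract]
        public interface IRatesService
        {
            // On the wire this is always a key/value-pair list (hence the
            // ArrayOfKeyValueOfintdouble schema type); the client-side type
            // depends entirely on how the proxy is generated.
            [OperationContract]
            Dictionary<int, double> GetRates();
        }

    When generating the proxy manually, svcutil /collectionType:System.Collections.Generic.Dictionary`2 is the switch that maps key/value pairs back to a generic dictionary; the "Configure Service Reference" dialog exposes the same setting as the dictionary collection type, which appears to be the setting that stopped taking effect in this case.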

    Read the article

  • What is the difference between a WCF service and a simple Web service developed using the .NET Framework?

    - by Steve Johnson
    My questions are: What is the difference between a WCF service and a simple Web service in the .NET Framework? What can a WCF service do that a .NET Web service can't? In other words, what are the limitations of .NET Web services that were overcome in WCF services? I understand that WCF services are REST-based and .NET web services are SOAP-based, but I need to know more than that. How will a developer make the design decision whether to develop a Web service or a WCF service?
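
    For a concrete feel of the difference, here is the same operation written as a classic ASMX web service and as a WCF service (a minimal sketch). Note that WCF is not limited to REST; the same contract can be exposed over SOAP/HTTP, TCP, named pipes or MSMQ purely by changing the binding, which is a large part of the design decision.

        using System.ServiceModel;
        using System.Web.Services;

        // Classic ASMX web service: tied to ASP.NET, IIS and HTTP/SOAP.
        public class QuoteWebService : WebService
        {
            [WebMethod]
            public double GetQuote(string symbol) { return 42.0; }
        }

        // WCF: the contract is transport-agnostic; the binding chosen in config
        // (basicHttpBinding, wsHttpBinding, netTcpBinding, webHttpBinding, ...)
        // decides how and where it is exposed.
        [ServiceContract]
        public interface IQuoteService
        {
            [OperationContract]
            double GetQuote(string symbol);
        }

        public class QuoteService : IQuoteService
        {
            public double GetQuote(string symbol) { return 42.0; }
        }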

    Read the article

  • WCF using ChannelFactory.CreateChannel with webHttp behavior

    - by BrettRobi
    I've got a simple REST-based service for which I am trying to create a client proxy using ChannelFactory. I want to do without a configuration file, so I am trying to do this in code, and I believe I have everything I used to have in .config except for the behavior. Can anyone tell me how I can get this config into C# code? Here is the stripped-down C# code I have now: var endpoint = new EndpointAddress(urlCommServer); var binding = new WebHttpBinding(); return ChannelFactory.CreateChannel(binding, endpoint);
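
    If the behavior in question is the usual <webHttp/> endpoint behavior for REST endpoints, it can be attached in code by adding a WebHttpBehavior to the factory's endpoint before creating the channel. A sketch along the lines of the question's code is below; IMyService stands in for the actual service contract.

        using System.ServiceModel;
        using System.ServiceModel.Description;
        using System.ServiceModel.Web;

        static IMyService CreateClient(string urlCommServer)
        {
            var endpoint = new EndpointAddress(urlCommServer);
            var binding = new WebHttpBinding();

            var factory = new ChannelFactory<IMyService>(binding, endpoint);
            // Code equivalent of <behavior><webHttp/></behavior> in config.
            factory.Endpoint.Behaviors.Add(new WebHttpBehavior());

            return factory.CreateChannel();
        }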

    Read the article

  • WCF service: The maximum array length quota (16384) has been exceeded

    - by dmitry.baranovsky
    I have a WCF service and a client application. While trying to get the client and the service to communicate, I got the following message: "The formatter threw an exception while trying to deserialize the message: There was an error while trying to deserialize parameter http://tempuri.org/:blob. The InnerException message was 'There was an error deserializing the object of type FileBlob. The maximum array length quota (16384) has been exceeded while reading XML data. This quota may be increased by changing the MaxArrayLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader. Line 1, position 25931.'. Please see InnerException for more details." I have a customBinding element, and it doesn't allow me to insert a "readerQuotas" section. In both the client and service configs I have the following binding element: <customBinding> <binding name="LicenseServiceBinding" closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"> <security authenticationMode="UserNameOverTransport"> <localClientSettings maxClockSkew="00:07:00" /> <localServiceSettings maxClockSkew="00:07:00" /> </security> <windowsStreamSecurity /> <httpsTransport maxReceivedMessageSize="2147483646"/> </binding> </customBinding> Thanks in advance for any help :)
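
    The readerQuotas element is rejected as a direct child of a custom binding because the quotas belong to the message encoding element, not to the binding itself. In config that means nesting it inside the encoder (for example <textMessageEncoding><readerQuotas maxArrayLength="5242880"/></textMessageEncoding>); the sketch below shows the equivalent custom binding built in code. It mirrors only part of the binding from the question (the user-name-over-transport security and the HTTPS transport), and the quota value is illustrative.

        using System.ServiceModel.Channels;

        // readerQuotas live on the message encoding binding element.
        var encoding = new TextMessageEncodingBindingElement();
        encoding.ReaderQuotas.MaxArrayLength = 5 * 1024 * 1024;            // enough for the FileBlob payload
        encoding.ReaderQuotas.MaxStringContentLength = 5 * 1024 * 1024;

        var transport = new HttpsTransportBindingElement { MaxReceivedMessageSize = 2147483646 };

        var binding = new CustomBinding(
            SecurityBindingElement.CreateUserNameOverTransportBindingElement(),
            encoding,
            transport);

    The same quota has to be raised on both the client and the service side, since deserialization happens on whichever end receives the large message.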

    Read the article

  • Monitoring disk performance with MRTG

    - by Ghostrider
    I use MRTG to monitor vital stats on my servers like disk space, CPU load, memory usage, temperatures, etc. It all works fine and well for parameters that don't change rapidly. By running a small VB script I can also get any performance counter. However, these scripts are called by MRTG every 5 minutes, while performance counters like physical disk idle time return a snapshot value from the previous few seconds, so a lot of data is missed. Surely I could write a service that would poll all required counters in the background and store average values somewhere on disk where MRTG would pick them up. However, before I do so I would like to find out if there is some ready-made solution that would allow me to get the average value of a counter for the last 5 minutes, as opposed to an immediate snapshot.
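
    The polling service described here is only a few lines with System.Diagnostics: sample the counter every few seconds, keep roughly five minutes of samples, and write the running average to a file that the MRTG probe reads instead of querying the counter itself. A sketch in C# rather than VBScript (the counter, interval and path are illustrative; a real deployment would wrap this in a Windows service):

        using System;
        using System.Collections.Generic;
        using System.Diagnostics;
        using System.IO;
        using System.Linq;
        using System.Threading;

        class CounterAverager
        {
            static void Main()
            {
                // "% Idle Time" of all physical disks; the first NextValue() always
                // returns 0, so prime the counter before sampling.
                var counter = new PerformanceCounter("PhysicalDisk", "% Idle Time", "_Total");
                counter.NextValue();

                var samples = new Queue<float>();
                while (true)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(10));
                    samples.Enqueue(counter.NextValue());
                    while (samples.Count > 30) samples.Dequeue();   // keep ~5 minutes of samples

                    // MRTG's probe script just reads this file.
                    File.WriteAllText(@"C:\mrtg\diskidle.txt", samples.Average().ToString("F0"));
                }
            }
        }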

    Read the article

  • WCF Http Bindings, Require SSL

    - by JoshKraker
    I have the following binding I'm using with my wsHttpBinding web service. <binding name="wsHttpConfig"> <security> <transport clientCredentialType="None"/> </security> </binding> The issue is that it allows the client to connect using either HTTP or HTTPS. I would like to require them to use SSL. I tried adding the following: <system.web.extensions> <scripting> <webServices> <authenticationService enabled="true" requireSSL = "true"/> </webServices> </scripting> </system.web.extensions> But it had no effect; the client could still connect over HTTP. I then tried checking "Require SSL" in the IIS7 SSL settings, with the client certificates radio button set to Accept. Now, when I try to view the service I get the error "Could not find a base address that matches scheme http for the endpoint with binding WSHttpBinding. Registered base address schemes are [https]." Does anyone know exactly how to fix this error? I have been googling for the last 3 hours trying 500 different combinations (not 500, but too many to list) and could not get anything to run.
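
    For what it's worth, that last error usually means the endpoint is still registered for the http scheme while IIS (with "Require SSL") now only provides an https base address. wsHttpBinding defaults to Message security, which uses http; switching the binding to Transport security makes it expect https, so the schemes line up again. A short sketch of the binding in code, the config equivalent being mode="Transport" on the security element of wsHttpConfig:

        using System.ServiceModel;

        // Config equivalent: <binding name="wsHttpConfig"><security mode="Transport"> ...
        var binding = new WSHttpBinding(SecurityMode.Transport);
        binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;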

    Read the article

  • Consuming secured WCF service through basicHTTPbinding

    - by Jason M
    I am consuming a secured service hosted over basicHttpBinding, and I have to pass credentials to the service for authentication. Here's the config setting for the client: <security mode="TransportWithMessageCredential"> <transport clientCredentialType="None" proxyCredentialType="None" realm="" /> <message clientCredentialType="UserName" algorithmSuite="Default" /> </security> While calling the service, I am getting the following exception message: "An unsecured or incorrectly secured fault was received from the other party. See the inner FaultException for the fault code and detail. Message = 'An invalid security token was provided (Bad UsernameToken Values)'" I'm not sure how to get it working. I would be grateful if somebody could help me out or point me to a URL where I could find the solution.
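
    That fault is typically returned when the UserName token is missing or rejected by the service. With TransportWithMessageCredential the credentials have to be set on the proxy before the first call; a sketch (the proxy type, endpoint name and credential values are placeholders):

        using System.ServiceModel;

        var client = new SampleServiceClient("SecureBasicHttpEndpoint");   // endpoint from config

        // Required because the message security uses clientCredentialType="UserName".
        client.ClientCredentials.UserName.UserName = "someUser";
        client.ClientCredentials.UserName.Password = "somePassword";

        var result = client.GetData(42);                                    // any service call
        client.Close();

    If the credentials are already being set, the same fault can also mean the service-side validator (membership provider or custom UserNamePasswordValidator) is rejecting them, so it is worth checking the server trace as well.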

    Read the article

  • Performance analysis strategies

    - by Bernd
    I am assigned to a performance-tuning-debugging-troubleshooting task. Scenario: a multi-application environment running on several networked machines using databases. The OS is Unix, the DB is Oracle. Business logic is implemented across applications using synchronous/asynchronous communication. The applications are multi-user, with several hundred call center users at peak time. User interfaces are web-based. The applications are third party; I can get access to the developers and source code. I only have the production system and a functional test environment, no load test environment. Problem: bad performance! I need fast results. Management is going crazy. I got symptom examples like these: user interface actions taking minutes to complete; searching for a customer usually takes 6 seconds, but an immediate subsequent search with the same parameters may take 6 minutes. What would be your strategy for finding the root causes?

    Read the article

  • Can a WCF contract use multiple callback interfaces?

    - by mafutrct
    I'm trying something like this: [ServiceContract ( CallbackContract = typeof (CallbackContract_1), CallbackContract = typeof (CallbackContract_2), CallbackContract = typeof (CallbackContract_3)) ] public interface SomeWcfContract { I know it does not work like this. Is there still a way to make a single contract use multiple callback interfaces?
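
    Not with several CallbackContract values, but the same effect can be had by declaring one callback interface that inherits the others and passing that single type to the attribute. A sketch using the names from the question (the callback operations themselves are made up):

        using System.ServiceModel;

        public interface CallbackContract_1 { [OperationContract(IsOneWay = true)] void OnEvent1(); }
        public interface CallbackContract_2 { [OperationContract(IsOneWay = true)] void OnEvent2(); }
        public interface CallbackContract_3 { [OperationContract(IsOneWay = true)] void OnEvent3(); }

        // One composite callback interface that aggregates the three.
        public interface ICompositeCallback : CallbackContract_1, CallbackContract_2, CallbackContract_3 { }

        [ServiceContract(CallbackContract = typeof(ICompositeCallback))]
        public interface SomeWcfContract
        {
            [OperationContract]
            void DoWork();
        }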

    Read the article

  • Interfaces, Adapters, exposing business objects via WCF design

    - by Onam
    I know there have been countless discussions about this, but I think this question is slightly different and may perhaps prompt a heated discussion (let's keep it friendly). The scene: I am developing a system as a means for me to learn various concepts, and I came across a predicament my brain is conflicting over: whether to keep my interfaces in a separate class library or have them live side by side with my business objects. I want to expose certain objects via WCF, but I refuse to expose them in their entirety. I am sure most will agree that exposing properties such as IDs is not good practice, but I also don't want to have my business objects decorated with attributes. The question: Essentially, I'll be having a separate interface for each of my objects that will be exposed to the outside world (it could end up being quite a few), so does it make sense to create a separate class library for the interfaces? This also brings up the question of whether adapters should live in a separate class library too, as ideally I want a mechanism for transferring from one object to the other and vice versa.
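
    One common shape for this is a thin "contracts" assembly that holds only the public interfaces (and any DTOs), with the adapters living next to the service implementation rather than with the business objects. A rough sketch of the idea; all names are illustrative:

        // Contracts assembly: the only thing the outside world (and WCF) sees.
        public interface ICustomerInfo
        {
            string DisplayName { get; }
            string Country { get; }
        }

        // Business assembly: the full object, including members you do not want exposed.
        public class Customer
        {
            public int Id { get; set; }              // internal identifier, not exposed
            public string FirstName { get; set; }
            public string LastName { get; set; }
            public string Country { get; set; }
        }

        // Adapter: maps the business object onto the public contract (and back if needed),
        // keeping serialization attributes and wire concerns out of the business object.
        public class CustomerAdapter : ICustomerInfo
        {
            private readonly Customer inner;
            public CustomerAdapter(Customer inner) { this.inner = inner; }

            public string DisplayName { get { return inner.FirstName + " " + inner.LastName; } }
            public string Country { get { return inner.Country; } }
        }

    With that split, the business library has no reference to the contracts or to WCF, and the service layer is the only project that needs both.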

    Read the article

  • WCF service hosted in IIS7 with administrator rights?

    - by Allan Baker
    Hello, how do I grant administrator rights to a running WCF service hosted in IIS7? The problem is, my code works fine in a test console application run as an administrator, but the same code used from a WCF service in IIS7 fails. When I run the same console test application without admin rights, the code fails too. So, how do I grant admin rights to a WCF service hosted in IIS7? Do I grant admin rights to the IIS7 service? Can I grant rights to a specific WCF service? How do I do 'Run as administrator' for IIS7 or a specific website? Thanks! (That's the question; here is a more detailed description of the situation: I am trying to capture frames from a webcam into a jpg file using the Touchless library, and I can do that from a console application with admin rights. When I run that same console app without admin rights I cannot access the webcam in code. The same thing happens in a WCF service with the same code.)

    Read the article

  • Practical Performance Monitoring and Tuning Event

    - by Andrew Kelly
      For any of you who may be interested or know of someone in the market for a performance Monitoring and Tuning class I have just the ticket for you. It’s a 3 day event that will be held in Atlanta Ga. on January 25th to the 27th 2011. For those of you that know me or have been to my sessions you realize I like to provide more than just classroom theory and like to teach real world and above all practical methodology when it comes to performance in SQL Server. This class covers all the essentials...(read more)

    Read the article

  • SQL SERVER – Simple Example of Incremental Statistics – Performance improvements in SQL Server 2014 – Part 2

    - by Pinal Dave
    This is the second part of the series Incremental Statistics. Here is the index of the complete series. What is Incremental Statistics? – Performance improvements in SQL Server 2014 – Part 1 Simple Example of Incremental Statistics – Performance improvements in SQL Server 2014 – Part 2 DMV to Identify Incremental Statistics – Performance improvements in SQL Server 2014 – Part 3 In part 1 we have understood what is incremental statistics and now in this second part we will see a simple example of incremental statistics. This blog post is heavily inspired from my friend Balmukund’s must read blog post. If you have partitioned table and lots of data, this feature can be specifically very useful. Prerequisite Here are two things you must know before you start with the demonstrations. AdventureWorks – For the demonstration purpose I have installed AdventureWorks 2012 as an AdventureWorks 2014 in this demonstration. Partitions – You should know how partition works with databases. Setup Script Here is the setup script for creating Partition Function, Scheme, and the Table. We will populate the table based on the SalesOrderDetails table from AdventureWorks. -- Use Database USE AdventureWorks2014 GO -- Create Partition Function CREATE PARTITION FUNCTION IncrStatFn (INT) AS RANGE LEFT FOR VALUES (44000, 54000, 64000, 74000) GO -- Create Partition Scheme CREATE PARTITION SCHEME IncrStatSch AS PARTITION [IncrStatFn] TO ([PRIMARY], [PRIMARY], [PRIMARY], [PRIMARY], [PRIMARY]) GO -- Create Table Incremental_Statistics CREATE TABLE [IncrStatTab]( [SalesOrderID] [int] NOT NULL, [SalesOrderDetailID] [int] NOT NULL, [CarrierTrackingNumber] [nvarchar](25) NULL, [OrderQty] [smallint] NOT NULL, [ProductID] [int] NOT NULL, [SpecialOfferID] [int] NOT NULL, [UnitPrice] [money] NOT NULL, [UnitPriceDiscount] [money] NOT NULL, [ModifiedDate] [datetime] NOT NULL) ON IncrStatSch(SalesOrderID) GO -- Populate Table INSERT INTO [IncrStatTab]([SalesOrderID], [SalesOrderDetailID], [CarrierTrackingNumber], [OrderQty], [ProductID], [SpecialOfferID], [UnitPrice],   [UnitPriceDiscount], [ModifiedDate]) SELECT     [SalesOrderID], [SalesOrderDetailID], [CarrierTrackingNumber], [OrderQty], [ProductID], [SpecialOfferID], [UnitPrice],   [UnitPriceDiscount], [ModifiedDate] FROM       [Sales].[SalesOrderDetail] WHERE      SalesOrderID < 54000 GO Check Details Now we will check details in the partition table IncrStatSch. -- Check the partition SELECT * FROM sys.partitions WHERE OBJECT_ID = OBJECT_ID('IncrStatTab') GO You will notice that only a few of the partition are filled up with data and remaining all the partitions are empty. Now we will create statistics on the Table on the column SalesOrderID. However, here we will keep adding one more keyword which is INCREMENTAL = ON. Please note this is the new keyword and feature added in SQL Server 2014. It did not exist in earlier versions. -- Create Statistics CREATE STATISTICS IncrStat ON [IncrStatTab] (SalesOrderID) WITH FULLSCAN, INCREMENTAL = ON GO Now we have successfully created statistics let us check the statistical histogram of the table. Now let us once again populate the table with more data. This time the data are entered into a different partition than earlier populated partition. 
-- Populate Table INSERT INTO [IncrStatTab]([SalesOrderID], [SalesOrderDetailID], [CarrierTrackingNumber], [OrderQty], [ProductID], [SpecialOfferID], [UnitPrice],   [UnitPriceDiscount], [ModifiedDate]) SELECT     [SalesOrderID], [SalesOrderDetailID], [CarrierTrackingNumber], [OrderQty], [ProductID], [SpecialOfferID], [UnitPrice],   [UnitPriceDiscount], [ModifiedDate] FROM       [Sales].[SalesOrderDetail] WHERE      SalesOrderID > 54000 GO Let us check the status of the partition once again with following script. -- Check the partition SELECT * FROM sys.partitions WHERE OBJECT_ID = OBJECT_ID('IncrStatTab') GO Statistics Update Now here has the new feature come into action. Previously, if we have to update the statistics, we will have to FULLSCAN the entire table irrespective of which partition got the data. However, in SQL Server 2014 we can just specify which partition we want to update in terms of Statistics. Here is the script for the same. -- Update Statistics Manually UPDATE STATISTICS IncrStatTab (IncrStat) WITH RESAMPLE ON PARTITIONS(3, 4) GO Now let us check the statistics once again. -- Show Statistics DBCC SHOW_STATISTICS('IncrStatTab', IncrStat) WITH HISTOGRAM GO Upon examining statistics histogram, you will notice that now the distribution has changed and there is way more rows in the histogram. Summary The new feature of Incremental Statistics is indeed a boon for the scenario where there are partitions and statistics needs to be updated frequently on the partitions. In earlier version to update statistics one has to do FULLSCAN on the entire table which was wasting too many resources. With the new feature in SQL Server 2014, now only those partitions which are significantly changed can be specified in the script to update statistics. Cleanup You can clean up the database by executing following scripts. -- Clean up DROP TABLE [IncrStatTab] DROP PARTITION SCHEME [IncrStatSch] DROP PARTITION FUNCTION [IncrStatFn] GO Reference: Pinal Dave (http://blog.sqlauthority.com)Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SQL Statistics, Statistics

    Read the article

  • How to achieve best performance in DirectX 9.0 while rendering on multiple monitors

    - by Vibhore Tanwer
    I am new to DirectX and trying to learn best practices. Please suggest: what are the best practices for rendering different things on multiple monitors at the same time? How can I boost the performance of the application? I have gone through this article http://msdn.microsoft.com/en-us/library/windows/desktop/bb147263%28v=vs.85%29.aspx . I am making use of some pixel shaders to achieve certain effects. At most 4 effects (4 shader effects) can be applied at the same time. What are the best practices to achieve the best performance with DirectX 9.0? I read somewhere that DirectX 11 provides support for parallel rendering, but I am not able to get any working sample for DirectX 11.0. Please help me with this; any help would be of great value. Thanks

    Read the article

  • SQL SERVER – BI Quiz – Troubleshooting Cube Performance

    - by pinaldave
    My friend Jacob Sebastian runs the SQL BI Quiz competition, where there is a different question on each day of the month. Participants get the opportunity to take part in the quiz, learn something new and win great awards. Working with huge data is very common when it comes to data warehousing. It is necessary to create cubes on the data to make it meaningful and consumable. There are cases when retrieving the data from a cube takes a lot of time. Let us assume that your cube is returning data very quickly. Suddenly, one day, it starts returning the data very slowly. What are the three things you would do in order to diagnose this? After diagnosing, what would you do to resolve the performance issue? Participate in my question over here Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Pinal Dave, PostADay, Readers Question, SQL, SQL Authority, SQL Performance, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Will JVisualVM degrade application performance?

    - by rocky
    I have doubts about the JVisualVM profiler tool related to performance. I have a requirement to implement a JVM monitoring tool for my enterprise Java application. I have gone through some profiling tools on the market, but all of them have some kind of agent file which we need to include in the server startup. I fear that these agents will degrade my application's performance even more. So I have decided on JVisualVM, because this profiler tool comes with the JDK itself. Before implementing JVisualVM, has anybody faced any issues with the JVisualVM profiler tool? Also, is it safe to use in my application?

    Read the article

  • Programmer performance

    - by RSK
    I am a PHP programmer with 1 year of experience. As I am just starting my career, I am learning a lot of things now. I can say I am a little bit of a perfectionist. When I am assigned a problem I start off by Googling. Then, even when I find a solution, I keep trying for a better one until I find 2-3 options. Then I start learning it and choose the best performing solution. Even though I am learning a lot, this process gets me labeled as a low performer. My questions: As a novice, should I continue to use this learning process and not worry about my performance? Should I focus more on my performance and less on how the code performs?

    Read the article

  • Using TPL and PLINQ to raise performance of feed aggregator

    - by DigiMortal
    In this posting I will show you how to use Task Parallel Library (TPL) and PLINQ features to boost performance of simple RSS-feed aggregator. I will use here only very basic .NET classes that almost every developer starts from when learning parallel programming. Of course, we will also measure how every optimization affects performance of feed aggregator. Feed aggregator Our feed aggregator works as follows: Load list of blogs Download RSS-feed Parse feed XML Add new posts to database Our feed aggregator is run by task scheduler after every 15 minutes by example. We will start our journey with serial implementation of feed aggregator. Second step is to use task parallelism and parallelize feeds downloading and parsing. And our last step is to use data parallelism to parallelize database operations. We will use Stopwatch class to measure how much time it takes for aggregator to download and insert all posts from all registered blogs. After every run we empty posts table in database. Serial aggregation Before doing parallel stuff let’s take a look at serial implementation of feed aggregator. All tasks happen one after other. internal class FeedClient {     private readonly INewsService _newsService;     private const int FeedItemContentMaxLength = 255;       public FeedClient()     {          ObjectFactory.Initialize(container =>          {              container.PullConfigurationFromAppConfig = true;          });           _newsService = ObjectFactory.GetInstance<INewsService>();     }       public void Execute()     {         var blogs = _newsService.ListPublishedBlogs();           for (var index = 0; index <blogs.Count; index++)         {              ImportFeed(blogs[index]);         }     }       private void ImportFeed(BlogDto blog)     {         if(blog == null)             return;         if (string.IsNullOrEmpty(blog.RssUrl))             return;           var uri = new Uri(blog.RssUrl);         SyndicationContentFormat feedFormat;           feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);           if (feedFormat == SyndicationContentFormat.Rss)             ImportRssFeed(blog);         if (feedFormat == SyndicationContentFormat.Atom)             ImportAtomFeed(blog);                 }       private void ImportRssFeed(BlogDto blog)     {         var uri = new Uri(blog.RssUrl);         var feed = RssFeed.Create(uri);           foreach (var item in feed.Channel.Items)         {             SaveRssFeedItem(item, blog.Id, blog.CreatedById);         }     }       private void ImportAtomFeed(BlogDto blog)     {         var uri = new Uri(blog.RssUrl);         var feed = AtomFeed.Create(uri);           foreach (var item in feed.Entries)         {             SaveAtomFeedEntry(item, blog.Id, blog.CreatedById);         }     } } Serial implementation of feed aggregator downloads and inserts all posts with 25.46 seconds. Task parallelism Task parallelism means that separate tasks are run in parallel. You can find out more about task parallelism from MSDN page Task Parallelism (Task Parallel Library) and Wikipedia page Task parallelism. Although finding parts of code that can run safely in parallel without synchronization issues is not easy task we are lucky this time. Feeds import and parsing is perfect candidate for parallel tasks. We can safely parallelize feeds import because importing tasks doesn’t share any resources and therefore they don’t also need any synchronization. 
After getting the list of blogs we iterate through the collection and start new TPL task for each blog feed aggregation. internal class FeedClient {     private readonly INewsService _newsService;     private const int FeedItemContentMaxLength = 255;       public FeedClient()     {          ObjectFactory.Initialize(container =>          {              container.PullConfigurationFromAppConfig = true;          });           _newsService = ObjectFactory.GetInstance<INewsService>();     }       public void Execute()     {         var blogs = _newsService.ListPublishedBlogs();                var tasks = new Task[blogs.Count];           for (var index = 0; index <blogs.Count; index++)         {             tasks[index] = new Task(ImportFeed, blogs[index]);             tasks[index].Start();         }           Task.WaitAll(tasks);     }       private void ImportFeed(object blogObject)     {         if(blogObject == null)             return;         var blog = (BlogDto)blogObject;         if (string.IsNullOrEmpty(blog.RssUrl))             return;           var uri = new Uri(blog.RssUrl);         SyndicationContentFormat feedFormat;           feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);           if (feedFormat == SyndicationContentFormat.Rss)             ImportRssFeed(blog);         if (feedFormat == SyndicationContentFormat.Atom)             ImportAtomFeed(blog);                }       private void ImportRssFeed(BlogDto blog)     {          var uri = new Uri(blog.RssUrl);          var feed = RssFeed.Create(uri);           foreach (var item in feed.Channel.Items)          {              SaveRssFeedItem(item, blog.Id, blog.CreatedById);          }     }     private void ImportAtomFeed(BlogDto blog)     {         var uri = new Uri(blog.RssUrl);         var feed = AtomFeed.Create(uri);           foreach (var item in feed.Entries)         {             SaveAtomFeedEntry(item, blog.Id, blog.CreatedById);         }     } } You should notice first signs of the power of TPL. We made only minor changes to our code to parallelize blog feeds aggregating. On my machine this modification gives some performance boost – time is now 17.57 seconds. Data parallelism There is one more way how to parallelize activities. Previous section introduced task or operation based parallelism, this section introduces data based parallelism. By MSDN page Data Parallelism (Task Parallel Library) data parallelism refers to scenario in which the same operation is performed concurrently on elements in a source collection or array. In our code we have independent collections we can process in parallel – imported feed entries. As checking for feed entry existence and inserting it if it is missing from database doesn’t affect other entries the imported feed entries collection is ideal candidate for parallelization. 
internal class FeedClient {     private readonly INewsService _newsService;     private const int FeedItemContentMaxLength = 255;       public FeedClient()     {          ObjectFactory.Initialize(container =>          {              container.PullConfigurationFromAppConfig = true;          });           _newsService = ObjectFactory.GetInstance<INewsService>();     }       public void Execute()     {         var blogs = _newsService.ListPublishedBlogs();                var tasks = new Task[blogs.Count];           for (var index = 0; index <blogs.Count; index++)         {             tasks[index] = new Task(ImportFeed, blogs[index]);             tasks[index].Start();         }           Task.WaitAll(tasks);     }       private void ImportFeed(object blogObject)     {         if(blogObject == null)             return;         var blog = (BlogDto)blogObject;         if (string.IsNullOrEmpty(blog.RssUrl))             return;           var uri = new Uri(blog.RssUrl);         SyndicationContentFormat feedFormat;           feedFormat = SyndicationDiscoveryUtility.SyndicationContentFormatGet(uri);           if (feedFormat == SyndicationContentFormat.Rss)             ImportRssFeed(blog);         if (feedFormat == SyndicationContentFormat.Atom)             ImportAtomFeed(blog);                }       private void ImportRssFeed(BlogDto blog)     {         var uri = new Uri(blog.RssUrl);         var feed = RssFeed.Create(uri);           feed.Channel.Items.AsParallel().ForAll(a =>         {             SaveRssFeedItem(a, blog.Id, blog.CreatedById);         });      }        private void ImportAtomFeed(BlogDto blog)      {         var uri = new Uri(blog.RssUrl);         var feed = AtomFeed.Create(uri);           feed.Entries.AsParallel().ForAll(a =>         {              SaveAtomFeedEntry(a, blog.Id, blog.CreatedById);         });      } } We did small change again and as the result we parallelized checking and saving of feed items. This change was data centric as we applied same operation to all elements in collection. On my machine I got better performance again. Time is now 11.22 seconds. Results Let’s visualize our measurement results (numbers are given in seconds). As we can see then with task parallelism feed aggregation takes about 25% less time than in original case. When adding data parallelism to task parallelism our aggregation takes about 2.3 times less time than in original case. More about TPL and PLINQ Adding parallelism to your application can be very challenging task. You have to carefully find out parts of your code where you can safely go to parallel processing and even then you have to measure the effects of parallel processing to find out if parallel code performs better. If you are not careful then troubles you will face later are worse than ones you have seen before (imagine error that occurs by average only once per 10000 code runs). Parallel programming is something that is hard to ignore. Effective programs are able to use multiple cores of processors. Using TPL you can also set degree of parallelism so your application doesn’t use all computing cores and leaves one or more of them free for host system and other processes. And there are many more things in TPL that make it easier for you to start and go on with parallel programming. In next major version all .NET languages will have built-in support for parallel programming. There will be also new language constructs that support parallel programming. 
    Currently you can download Visual Studio Async to get some idea about what is coming. Conclusion Parallel programming is very challenging, but good tools offered by Visual Studio and the .NET Framework make it much easier for us. In this posting we started with a feed aggregator that imports feed items serially. With two steps we parallelized feed importing and entry inserting, gaining a 2.3x improvement in performance. Although this number is specific to my test environment, it shows clearly that parallel programming may raise the performance of your application significantly.
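
    As a small addendum to the "degree of parallelism" remark above, capping the number of cores the aggregator may use looks roughly like this (a sketch reusing names from the post's code; the cap value is arbitrary):

        using System;
        using System.Linq;
        using System.Threading.Tasks;

        var cap = Math.Max(1, Environment.ProcessorCount - 1);
        var options = new ParallelOptions { MaxDegreeOfParallelism = cap };

        // Task-style loop with a capped degree of parallelism...
        Parallel.ForEach(blogs, options, blog => ImportFeed(blog));

        // ...and the PLINQ equivalent for the data-parallel part
        // (inside ImportRssFeed, where feed and blog are in scope).
        feed.Channel.Items
            .AsParallel()
            .WithDegreeOfParallelism(cap)
            .ForAll(item => SaveRssFeedItem(item, blog.Id, blog.CreatedById));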

    Read the article
