Search Results

Search found 35708 results on 1429 pages for 'default copy constructor'.

Page 615/1429 | < Previous Page | 611 612 613 614 615 616 617 618 619 620 621 622  | Next Page >

  • Unity very slow while Gnome Classic running just fine

    - by Sorin Sbarnea
    I see tons of people complaining about Unity's speed, and I think the problem is not with the video drivers. When I log in to Gnome Classic the system behaves just fine, but on Unity I can barely use it: windows are hard to move and the terminal is painfully slow. Is there any solution or bug that I should track? Details: Ubuntu 11.10; two-monitor setup; latest Nvidia proprietary drivers (tested with the default ones also, no change); 6GB RAM, Xeon @ 2.8; Nvidia driver 280.13 - Quadro NVS 295 with 8 cores, 256MB RAM. lspci | grep VGA: 02:00.0 VGA compatible controller: nVidia Corporation G98 [Quadro NVS 295] (rev a1). uname -a: Linux sorins 3.0.0-16-generic #29-Ubuntu SMP Tue Feb 14 12:48:51 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
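
    For what it's worth, a quick way to check whether Unity considers the GPU supported is the test tool shipped (if I recall correctly) by the nux-tools package on 11.10:

        /usr/lib/nux/unity_support_test -p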

    Read the article

  • How To: Use Monitoring Rules and Policies

    - by Owen Allen
    One of Ops Center's most useful features is its asset monitoring capability. When you discover an asset - an operating system, say, or a server - a default monitoring policy is applied to it, based on the asset type. This policy contains rules that specify what properties are monitored and what thresholds are considered significant. Ops Center will send a notification if a monitored asset passes one of the specified thresholds. But sometimes you want different assets to be monitored in different ways. For example, you might have a group of mission-critical systems, for which you want to be notified immediately if their file system usage rises above a specific threshold. You can do so by creating a new monitoring policy and applying it to the group. You can also apply monitoring policies to individual assets, and edit them to meet the requirements of your environment. The Tuning Monitoring Rules and Policies How-To walks you through all of these procedures.

    Read the article

  • Announcing Oracle Audit Vault and Database Firewall

    - by Troy Kitch
    Today, Oracle announced the new Oracle Audit Vault and Database Firewall product, which unifies database activity monitoring and audit data analysis in one solution. This new product expands protection beyond Oracle and third party databases with support for auditing the operating system, directories and custom sources. Here are some of the key features of Oracle Audit Vault and Database Firewall: Single Administrator Console; Default Reports; Out-of-the-Box Compliance Reporting; Report with Data from Multiple Source Types; Audit Stored Procedure Calls - Not Visible on the Network; Extensive Audit Details; Blocking SQL Injection Attacks; Powerful Alerting Filter Conditions. To learn more about the new features in Oracle Audit Vault and Database Firewall, watch the on-demand webcast.

    Read the article

  • Is gstreamer the best encoder for vorbis or is there a better encoding engine I should use?

    - by sayth
    I have Sound Juicer installed and I want to rip to Vorbis (.ogg). Is gstreamer the best encoder for Vorbis, or is there a better encoding engine I should use? The default gstreamer profile is audio/x-raw-float,rate=44100,channels=2 ! vorbisenc name=enc quality=0.5 ! oggmux. I am going to raise the quality to 0.7, but that's all for nothing if gstreamer isn't the best encoder. Any suggestions for high-quality ripping? Edit: a good answer to this will also be the top search result in Google for "best vorbis encoding engine". Double edit: It appears oggenc itself is the best encoder, which rules out using Sound Juicer to rip CDs as it uses gstreamer. I have installed oggenc and am testing the command-line ripper abcde. Found a good configuration for it here: oggenc config for abcde
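
    For reference, a minimal oggenc invocation at the quality I'm aiming for (assuming a track already ripped to WAV; the file names are just placeholders) looks like:

        oggenc -q 7 -o track01.ogg track01.wav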

    Read the article

  • 10 Package Management Operations You Need Synaptic for on Ubuntu

    - by Chris Hoffman
    The Ubuntu Software Center is a solid, user-friendly application, but sometimes you need more power. The Synaptic package manager – previously included with Ubuntu by default – can do many things the Ubuntu Software Center can’t. You can install Synaptic from the Ubuntu Software Center – just search for Synaptic. You can also perform all these operations from the terminal – but, if you need a powerful graphical application for managing packages, Synaptic can’t be beat.
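
    If you prefer to skip the Software Center for that step too, Synaptic can also be installed from a terminal (assuming the standard Ubuntu repositories are enabled):

        sudo apt-get update && sudo apt-get install synaptic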

    Read the article

  • SCCM2012 R2 – How to integrate MDT with SCCM

    - by Waclaw Chrabaszcz
    Originally posted on: http://geekswithblogs.net/Wchrabaszcz/archive/2013/11/12/sccm2012-r2--how-to-integrate-mdt-with-sccm.aspx
    There are two products that are not so much competing as parallel: Microsoft Deployment Toolkit and System Center Configuration Manager. A few years ago I wondered why they were separate and why I could not share task sequences between them. And, as usually happens in real life, while I was focused on other technologies, MDT and SCCM became best friends. Let's integrate MDT with SCCM:
    As a first step you need to download MDT: http://www.microsoft.com/en-us/download/details.aspx?id=25175
    Install MDT on your SCCM server box, accept the EULA, and join the CEIP if you like.
    Once you have completed the installation, I would recommend finishing the MDT configuration before integrating it with SCCM. Start the Deployment Workbench, install updates (you will need to download and install the WAIK), create a new deployment share, and leave the default values.
    Create the MDT database. Make sure you create a separate database; DO NOT use the existing SCCM DB.
    Now we are ready to integrate MDT with SCCM. The integration tool should discover your server automatically. After reopening the SCCM console you should have new cool features in Task Sequences. How to use them? That's another story …

    Read the article

  • Creating alias and script alias in Ubuntu

    - by Jesi
    I am configuring the LG looking glass on Ubuntu. I have followed this link. In step 3 they say to add the following two lines to the web server config:
    Alias /lg/favicon.ico /usr/local/httpd/htdocs/lg/favicon.ico
    ScriptAlias /lg /usr/local/httpd/htdocs/lg/lg.cgi
    I have added it to my web server config (#vi /etc/apache2/sites-available/default):
    Alias /lg/favicon.ico "/usr/local/httpd/htdocs/lg/favicon.ico"
    <Directory "/usr/local/httpd/htdocs/lg/favicon.ico">
        Options Indexes MultiViews FollowSymLinks
        AllowOverride None
        Order deny,allow
        Deny from all
        Allow from 127.0.0.0/255.0.0.0 ::1/128
    </Directory>
    ScriptAlias /lg/ "/usr/local/httpd/htdocs/lg/lg.cgi"
    <Directory "/usr/local/httpd/htdocs/lg/lg.cgi">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Order allow,deny
        Allow from 127.0.0.0/255.0.0.0 ::1/128
    </Directory>
    When I try http://127.0.0.1/lg in my browser, it shows Not Found. I am new to web servers; can anyone help me, please?
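
    For reference, after each config change I'm checking syntax, reloading, and watching the error log with the usual Apache commands (nothing specific to looking glass here):

        sudo apache2ctl configtest
        sudo service apache2 reload
        tail -n 20 /var/log/apache2/error.log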

    Read the article

  • How do I point a new domain to start on a page that's not index.html on separate hosting? [closed]

    - by Owen Campbell-Moore
    I'm using a service called Squarespace to host my site, and today I'm registering the domain for it. Basically, how do I make it so that when somebody types www.tedxoxford.com it points at http://www.tedxoxford.com/landing (currently http://tedxoxford.squarespace.com/landing) instead of the default index? Is this possible? Squarespace is quite a restricted CMS and makes the logos etc. all point to the index, so I don't want people ending up on my landing/splash page every time they want the home page, only the first time they type in the URL. A dirty hack would be to check the referrer and redirect anyone hitting the index to the landing page, but that's a lot of loading overhead I'd rather avoid...

    Read the article

  • Audiencing with Forms-Based Authentication (FBA)

    - by PeterBrunone
    This really is no different from when you create an audience with regular old NTLM (Windows Authentication).  The difference is that while the AD provider is set up by default in all environments, the extra membership provider (that you use for Forms Authentication) isn't included anywhere except in the web application where you install it.  To be able to find your FBA users in the audience creation tool, you'll need to add the extra membership provider(s) to the web.config for your SSP site in IIS.  At that point, the People Picker should start recognizing your Forms Auth users, and you can create your audience as needed.

    Read the article

  • Timeout Considerations for Solicit Response – Part 2

    - by Michael Stephenson
    To follow up a previous article about timeouts and how they can affect your application, I have extended the sample we were using to include WCF. I will execute some test scenarios and discuss the results.
    The sample: We begin by consuming exactly the same web service, which is sitting on a remote server. This time I have created a .NET 3.5 application which consumes the web service using the basicHttp binding; the configuration used for consuming the web service is shown in a diagram in the full article, and you can see that, like before, we also have the connectionManagement element in the configuration file. I have added a WCF service reference (also using the asynchronous proxy methods), and the application contains a code sample which asynchronously makes the web service calls and handles the responses in a callback method invoked by a delegate. If you have read the previous article you will notice that the code is almost the same.
    Sample 1 – WCF with Default Timeouts
    In this test I set about recreating the same scenario as previously, but this time using WCF as the messaging component. For the first test I used the default configuration settings which WCF had set up when we added a reference to the web service. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
    The test: we simulated 21 calls to the web service. Some observations on the client-side and server-side traces:
    - The timeouts happened more quickly than in the previous tests, because some calls were timing out before they attempted to connect to the server.
    - The first few calls that timed out did actually connect to the server and did execute successfully on the server.
    Test 2 – Increase Open Connection Timeout & Send Timeout
    In this test I wanted to increase both the send and open timeout values to try and give everything a chance to go through. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:10:00"
    The test: we simulated 21 calls to the web service. The observation on the client-side and server-side traces:
    - This test proved that if the timeouts are high enough, everything will just go through.
    Test 3 – Increase Just the Send Timeout
    In this test we wanted to increase just the send timeout. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:10:00"
    The test: we simulated 21 calls to the web service. Some observations on the client-side and server-side traces:
    - From both the client and server perspective everything ran through fine.
    - The open connection timeout did not seem to have any effect.
    Test 4 – Increase Just the Open Connection Timeout
    In this test I wanted to validate the change to the open connection setting by increasing just this on its own. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
    The test: we simulated 21 calls to the web service. Some observations on the client-side and server-side traces:
    - The increase in the open connection timeout (which relates to opening the channel) was not what stopped the calls timing out; it is the sending of data which is timing out.
    - On the server you can see that the few successful calls were fine, but there were also a few calls which hit the server yet timed out on the client.
    - You can see that not all calls hit the server, which was one of the problems with the WSE and ASMX options.
    Test 5 – Smaller Increase in Send Timeout
    In this test I wanted to make a smaller increase to the send timeout than before, just to prove that it was the key setting controlling what was timing out. The timeout values for this test are: openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:02:30"
    The test: we simulated 21 calls to the web service. Some observations on the client-side and server-side traces:
    - Most of the calls got through fine.
    - On the client you can see that call 20 timed out but still hit the server and executed fine.
    Summary
    At this point, between the two articles, we have quite a lot of scenarios showing the different ways the timeout settings have played into our original performance issue, and now we can see how WCF could offer an improved way to handle the problem. To summarise the differences in the timeout properties for the three technology stacks:
    ASMX: The timeout value only applies to the execution time of your request on the server. The timeout does not consider how long your code might be waiting client side to get a connection.
    WSE: The timeout value includes both the time to obtain a connection and the time to execute the request. A timeout will not be thrown as an error until an attempt to connect to the server is made. This means a 40 second timeout setting may not throw the error until 60 seconds, when the connection to the server is made. If the connection to the server is made, you should be aware that your message will be processed, and you should design for this.
    WCF: The WCF send timeout is the setting most equivalent to the settings we were looking at previously. Like WSE, the counter for this setting includes the time to get a connection as well as the time to execute on the server. Unlike WSE and ASMX, an error will be thrown as soon as the send timeout has elapsed from making your call in user code, regardless of whether we are waiting for a connection or have an open connection to the server. To a user this may appear to give better latency in getting an error response compared to WSE or ASMX.

    Read the article

  • Flash 10.2 RC + Crystal HD for HW accelerated video on Ubuntu

    - by Gee
    I have a netbook with an N450 Atom and a BCM70012, aka Crystal HD, card. On Windows 7 I can play HD Flash video with very little CPU usage because of the RC of Flash 10.2. I did some reading and saw posts claiming that the Crystal HD card is finally supported by the newer Flash 10.2 RC in Ubuntu, but I can't get it to work. I can confirm that Flash 10.2 is loaded and used, and there's even a HW acceleration option that is enabled in the settings, but performance is horrible. From what I read, the Crystal HD card is supposed to be enabled on 10.10 by default - I don't know if it is. I tried installing drivers for it in various ways, but HD Flash video is still a slideshow. So does anyone have it working? If so, how'd you set it up?
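
    For what it's worth, the Flash-side switch that usually goes with this is hardware video decoding in /etc/adobe/mms.cfg - something along these lines (both flags are my assumption of what's needed here, and the file may not exist yet):

        echo "EnableLinuxHWVideoDecode=1" | sudo tee -a /etc/adobe/mms.cfg
        echo "OverrideGPUValidation=true" | sudo tee -a /etc/adobe/mms.cfg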

    Read the article

  • .htaccess Redirect 301 in Wordpress – From Post to Page

    - by elocman
    By default, WordPress posts are added to the RSS feed. For my website, I want to include WordPress pages in the RSS feed. I know that some plugins could help me. Instead, I am trying to use a 301 redirect in the .htaccess file. My question is, will this approach work fine for Google and other search engines? Here's what I did:
    1. Published a new page and then a new post with the same title, description, keywords and content (though I know that if there's a 301 redirect, Google won't "read" the post but will switch to the page).
    2. Added the Redirect 301 line to my .htaccess file.
    Now my post is listed in the RSS feed, and when you click on it you're redirected to the page.
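
    For reference, the redirect itself is just one line in .htaccess; the slugs below are placeholders for my real post and page URLs:

        cat >> .htaccess <<'EOF'
        # hypothetical slugs - replace with the actual post and page paths
        Redirect 301 /2012/05/my-old-post/ http://example.com/my-new-page/
        EOF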

    Read the article

  • google-chrome video application association

    - by Ben Lee
    Is there any way to tell google-chrome to launch video files of particular types in an external application (or even just to bring up the download box as if the type were un-handled), instead of showing the video inside the browser? Searching online, it seems that chrome is supposed to use xdg-mime for file associations, but apparently ignores this for video. For example, when I do: xdg-mime query default video/mpeg It returns dragonplayer.desktop. But when I click on an mpeg video link, chrome displays it internally instead of launching Dragon Player (if I double-click on an mpeg file in my file manager, on the other hand, it does open Dragon Player). So is there a way to tell chrome to respect this setting, or another way to coax chrome into opening the file externally? If it matters, I'm running the latest version of google-chrome stable (not chromium) at the time of writing, v. 18.0.1025.151, on kubuntu 11.10.
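
    For reference, the association was set with xdg-mime in the first place (the file manager honours it; Chrome apparently does not):

        xdg-mime default dragonplayer.desktop video/mpeg
        xdg-mime query default video/mpeg   # prints dragonplayer.desktop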

    Read the article

  • Ubuntu 11.04: Add right-click menu item "Compress as ZIP"

    - by Ananthavel Sundararajan
    Step 1: I want to add a "Compress as ZIP" item to the right-click menu. I know I can change the default compression format to ZIP using gconf-editor, but I want a new menu item that compresses as ZIP without opening any other option dialog. Step 2: I want to compress a file as ZIP and rename it to ".epub". Please let me know: is it possible to zip and rename by adding a single menu item? I'm using Ubuntu 11.04 and installed "Nautilus-Action-Configurations", but was unsuccessful. N.B. I have read this Ask Ubuntu Q&A; I don't want to open a new window to choose the format. It should be saved as ZIP straight away.
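
    A rough sketch of what I have in mind, as a Nautilus script (GNOME 2 on 11.04 looks for these in ~/.gnome2/nautilus-scripts; the archive name is just illustrative):

        #!/bin/bash
        # Save as ~/.gnome2/nautilus-scripts/"Compress as ZIP epub" and make it executable.
        # Nautilus passes the selected files/folders as arguments.
        zip -r "archive.zip" "$@"
        # Step 2: rename the archive so it carries the .epub extension
        mv "archive.zip" "archive.epub"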

    Read the article

  • Plymouth did not install properly

    - by David Starkey
    I was installing Plymouth Manager in the hope of making a custom loading screen. While the terminal was working, my computer unexpectedly powered off. I can open up the manager and it appears to do what it is supposed to (minus the fact that I can't make my own theme), and the boot screen only shows when powering down. Anyway, all of the advice I have seen so far has resulted in errors and nothing getting fixed. For some reason I do not have permissions to simply select the folder and delete it, and I have not been able to find out how to grant myself those permissions. I guess my question then is: how do I get rid of Plymouth Manager so I can reinstall it properly?
    Already tried:
    - Installation - http://www.noobslab.com/2011/11/install-plymouth-manager-and-change.html
    - Removal - How to remove Plymouth Boot Animation manager and keep the default boot screen
    - Permissions - How do I change my user permissions to edit /etc/apt/sources.list?
    - Theming Guide - http://brej.org/blog/?p=158
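
    For what it's worth, this is roughly what I'd expect a clean removal to look like (the package name is my guess based on the noobslab PPA, and the folder path is a placeholder for whichever directory refuses to be deleted):

        sudo apt-get remove --purge plymouth-manager
        sudo apt-get autoremove
        # the folder that can't be deleted from the file manager usually just needs root:
        sudo rm -r /path/to/the/stubborn/folder   # placeholder path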

    Read the article

  • Strange display language in gnome shell

    - by khalafuf
    I logged in to gnome-shell and found that the display language was set to some strange Asian language (I think) without my doing anything. I tried to change the locale settings but found that the default language is English (how?) despite that strange language. Here's a snapshot; see the strange word instead of "Activity". I'm on Ubuntu 12.04 LTS. Output of locale: LANG=zh_CN.UTF-8 LANGUAGE=zh_CN:en_US:en LC_CTYPE="zh_CN.UTF-8" LC_NUMERIC=en_US.UTF-8 LC_TIME=en_US.UTF-8 LC_COLLATE="zh_CN.UTF-8" LC_MONETARY=en_US.UTF-8 LC_MESSAGES="zh_CN.UTF-8" LC_PAPER=en_US.UTF-8 LC_NAME=en_US.UTF-8 LC_ADDRESS=en_US.UTF-8 LC_TELEPHONE=en_US.UTF-8 LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=en_US.UTF-8 LC_ALL= Output of locale -a: C C.UTF-8 de_CH.utf8 en_AG en_AG.utf8 en_AU.utf8 en_BW.utf8 en_CA.utf8 en_DK.utf8 en_GB.utf8 en_IE.utf8 en_IN en_IN.utf8 en_NG en_NG.utf8 en_NZ.utf8 en_PH.utf8 en_SG.utf8 en_US.utf8 en_ZA.utf8 en_ZM en_ZM.utf8 en_ZW.utf8 POSIX zh_CN.utf8 zh_SG.utf8 Solved: This answer did it.
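
    For reference, a minimal way to force the locale back to English from a terminal (assuming nothing in ~/.pam_environment is overriding it) is:

        sudo update-locale LANG=en_US.UTF-8 LANGUAGE=en_US:en
        # check for a per-user override too
        cat ~/.pam_environment 2>/dev/null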

    Read the article

  • Week in Geek: New Malware Steals Bitcoin Currency

    - by Asian Angel
    This week we learned how to easily change a dual-booting PC’s default OS, “extract audio from any video using VLC, sneak around paywalls, & delay Windows Live Mesh during boot”, shrink videos to fit an Android phone with VLC, fix damaged or broken audio cables, “decide between an ISO or TS folder, help Windows 7 remember folder locations, & convert books for the Kindle”, and more. Photo by Profound Whatever.

    Read the article

  • Black screen when running xubuntu 13.10 after upgrade

    - by user213030
    I have a Xubuntu v12 installation that I upgraded to v13.10. Since the upgrade I get a black screen at startup. I can get to the console and log in. How can I run it in graphical mode? I run it on Oracle VirtualBox. The console shows: Starting the VirtualBox Guest Additions ...done. Starting VirtualBox Guest Addition service ...done. saned disabled; edit /etc/default/saned; * Restoring resolver state... [ OK ] And it hangs there.
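
    For what it's worth, the usual things to try from that console after a release upgrade inside VirtualBox are restarting the display manager and rebuilding the Guest Additions kernel modules - roughly (assuming the Additions were installed from the VirtualBox CD, which provides the vboxadd init script):

        sudo service lightdm restart
        sudo /etc/init.d/vboxadd setup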

    Read the article

  • Gateway IP for eth0 and gateway IP for pptp vpn are same

    - by user286630
    My problem is: 1) I'm using a laptop at school. 2) At school, the default gateway for Ethernet is 192.168.1.1. 3) I want to connect to a PPTP VPN server. The gateway over the VPN connection is also 192.168.1.1. (The VPN server assigns 192.168.100.1 to my laptop, and I confirmed that this address is not used at school.) In this situation, there is no problem in Windows 7. I think it is smart enough to distinguish two different gateways with the same IP; all connection requests may be forwarded to the VPN gateway. But in Ubuntu I cannot access a file server at the remote site. I guess every connection request is forwarded to the Ethernet gateway. How can I send all connection requests to the VPN gateway whose IP is the same as the Ethernet gateway's?
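
    For reference, one workaround I'm considering is to route by interface instead of by gateway address once the tunnel (ppp0) is up, since the address itself is ambiguous - roughly (the remote subnet below is a guess):

        # send all traffic through the VPN interface
        sudo ip route replace default dev ppp0
        # or route only the remote site's subnet through the tunnel
        sudo ip route add 192.168.100.0/24 dev ppp0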

    Read the article

  • ReSharper File Location

    - by Ben Griswold
    By default, the ReSharper cache is stored in the solution folder.  It’s one extra folder and one extra .user file.  It’s no big deal but it does clutter up your solution a bit – especially since the files provide no real value. I prefer to store the ReSharper cache in the system Temp folder.  This setting is available by visiting ReSharper > Options > Environment > General. Just update where you’d like to store the ReSharper cache and you’re good to go.  Note, the .user file continues to linger around the solution folder but at least the _ReSharper.SolutionName folder is moved out of sight.

    Read the article

  • Oracle E-Business Suite 12 Certified on Oracle Linux 6 (x86-64)

    - by John Abraham
    Oracle E-Business Suite Release 12 (12.1.1 and higher) is now certified on 64-bit Oracle Linux 6 with the Unbreakable Enterprise Kernel (UEK). New installations of the E-Business Suite R12 on this OS require version 12.1.1 or higher. Cloning of existing 12.1 Linux environments to this new OS is also certified using the standard Rapid Clone process. There are specific requirements to upgrade technology components such as the Oracle Database (to 11.2.0.3) and Fusion Middleware as necessary for use on Oracle Linux 6. These and other requirements are noted in the Installation and Upgrade Notes (IUN) below. Certification for other Linux distros still underway Certifications of Release 12 with 32-bit Oracle Linux 6, 32-bit and 64-bit Red Hat Enterprise Linux (RHEL) 6 and the Red Hat default kernel are in progress. References Oracle E-Business Suite Installation and Upgrade Notes Release 12 (12.1.1) for Linux x86-64 (My Oracle Support Document 761566.1) Cloning Oracle Applications Release 12 with Rapid Clone (My Oracle Support Document 406982.1) Interoperability Notes Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0) (My Oracle Support Document 1058763.1) Oracle Linux website

    Read the article

  • jMonkey Quest Database

    - by theJollySin
    I am building a game in jMonkey (Java) and I have so far only used default quest text. But now I need to start populating a lot of quests with text. My design requires A LOT of quest text. What is the best way to build a database of quest texts in jMonkey? I don't have a lot of real experience with databases. Is there a database that integrates well with jMonkey? Here are the ideal properties I want in my database, in order of priority: a reasonably light learning curve; easy portability (in Java) to Windows, Linux, and Mac OSX; a good interface with Java; a good interface with jMonkey; the ability to add properties to the quests (ID, level, gender, quest chain ID, etc.). Or am I wrong in thinking I need to use some giant monster like SQL? I haven't been able to find much information on this, so are people using some non-database methods for storing things like quest text in jMonkey?

    Read the article

  • Why won't Opera let me use the Ubuntu font?

    - by Roddie
    This is driving me crazy. I'm using monochrome rendering for fonts, and this causes a few problems in my browser, so I wanted to make Ubuntu the standard sans-serif font. I changed it in the preferences and it initially works okay, but after a while it reverts to the default. If I go into the font section in the menu, it still lists Ubuntu, and if I click OK the pages will correct themselves. Does anyone know how I can stop this behaviour? I'm using Opera 11 on Ubuntu 10.10

    Read the article

  • Tuning Red Gate: #4 of Some

    - by Grant Fritchey
    First time connecting to these servers directly (keys to the kingdom, bwa-ha-ha-ha. oh, excuse me), so I'm going to take a look at the server properties, just to see if there are any issues there. Max memory is set, cool, first possible silly mistake clear. In fact, these look to be nicely set up. Oh, I'd like to see the ANSI Standards set by default, but it's not a big deal. The default location for database data is the F:\ drive, where I saw all the activity last time. Cool, the people maintaining the servers in our company listen, parallelism threshold is set to 35 and optimize for ad hoc is enabled. No shocks, no surprises. The basic setup is appropriate. On to the problem database. Nothing wrong in the properties. The database is in SIMPLE recovery, but I think it's a reporting system, so no worries there. Again, I'd prefer to see the ANSI settings for connections, but that's the worst thing I can see. Time to look at the queries, tables, indexes and statistics because all the information I've collected over the last several days suggests that we're not looking at a systemic problem (except possibly not enough memory), but at the traditional tuning issues. I just want to note that, I started looking at the system, not the queries. So should you when tuning your environment. I know, from the data collected through SQL Monitor, what my top poor performing queries are, and the most frequently called, etc. I'm starting with the most frequently called. I'm going to get the execution plan for this thing out of the cache (although, with the cache dumping constantly, I might not get it). And it's not there. Called 1.3 million times over the last 3 days, but it's not in cache. Wow. OK. I'll see what's in cache for this database: SELECT  deqs.creation_time,         deqs.execution_count,         deqs.max_logical_reads,         deqs.max_elapsed_time,         deqs.total_logical_reads,         deqs.total_elapsed_time,         deqp.query_plan,         SUBSTRING(dest.text, (deqs.statement_start_offset / 2) + 1,                   (deqs.statement_end_offset - deqs.statement_start_offset) / 2                   + 1) AS QueryStatement FROM    sys.dm_exec_query_stats AS deqs         CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest         CROSS APPLY sys.dm_exec_query_plan(deqs.plan_handle) AS deqp WHERE   dest.dbid = DB_ID('Warehouse') AND deqs.statement_end_offset > 0 AND deqs.statement_start_offset > 0 ORDER BY deqs.max_logical_reads DESC ; And looking at the most expensive operation, we have our first bad boy: Multiple table scans against very large sets of data and a sort operation. a sort operation? It's an insert. Oh, I see, the table is a heap, so it's doing an insert, then sorting the data and then inserting into the primary key. First question, why isn't this a clustered index? Let's look at some more of the queries. The next one is deceiving. Here's the query plan: You're thinking to yourself, what's the big deal? Well, what if I told you that this thing had 8036318 reads? I know, you're looking at skinny little pipes. Know why? Table variable. Estimated number of rows = 1. Actual number of rows. well, I'm betting several more than one considering it's read 8 MILLION pages off the disk in a single execution. We have a serious and real tuning candidate. Oh, and I missed this, it's loading the table variable from a user defined function. Let me check, let me check. YES! A multi-statement table valued user defined function. And another tuning opportunity. 
This one's a beauty, seriously. Did I also mention that they're doing a hash against all the columns in the physical table? I'm sure that won't lead to scans of a 500,000-row table, no, not at all. OK. I lied. Of course it is. At least it's on the top part of the Loop, which means the scan is only executed once. I just did a cursory check on the next several poor performers - all calling the UDF. I think I found a big tuning opportunity. At this point, I'm typing up internal emails for the company. Someone just had their baby called ugly. In addition to a series of suggested changes that we need to implement, I'm also apologizing for being such an unkind monster as to question whether that third eye & those flippers belong on such an otherwise lovely child.

    Read the article

  • SQL SERVER – Beginning New Weekly Series – Memory Lane – #002

    - by pinaldave
    Here is a list of curated articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes. Let me know which one of the following is your favorite article from memory lane.
    2006
    Query to Find ByteSize of All the Tables in Database – This was my second blog post, and today I do not remember what the business need was that made me build this query. It was built for SQL Server 2000 and will not run directly on SQL Server 2005 or later versions now. It measured the byte size of the tables in the database. This can be done in many different ways, for example with SP_HELPDB as well as SP_HELP. I wish to build a similar script for 2005 and later versions.
    2007
    This week I completed my first year (365 blogs) and my very first one million views. I was pretty excited at the time with this new achievement.
    SQL SERVER Versions, CodeNames, Year of Release – When I started with SQL Server I did not know all the names correctly for each version, and I often used to get confused. However, as time passed I started to remember all the code names as well. In this blog post I have not included SQL Server 2012's code name, as it was not released at the time. SQL Server 2012's code name is Denali. Here is the question for you – does anyone know the internal name of SQL Server's next version?
    Searching String in Stored Procedure – I had already started to work with 2005 by this time, and I was personally converting each of my stored procedures to be SQL Server 2005 compatible. As we were upgrading from SQL Server 2000 to SQL Server 2005, we had to search each of the stored procedures and make sure we removed incompatible code from it. For example, syscolumns of SQL Server 2000 was being replaced by sys.columns in SQL Server 2005. This stored procedure was pretty helpful at that time. Later on I built a few additional versions of the same stored procedure. Version 1: finds the stored procedures related to a table. Version 2: a specific version which works with SQL Server 2005 and later versions.
    2008
    Clear Drop Down List of Recent Connection From SQL Server Management Studio – It happens to all of us: we connect to some remote client server and never have to connect to it again. However, it keeps bothering us that the name shows up in the list all the time. In this blog post I covered a quick tip about how to remove it. I also wrote a small article about How to Check Database Integrity for all Databases, and there was a funny question from a reader requesting T-SQL code to refresh databases.
    2009
    Stored Procedure are Compiled on First Run – SP is taking Longer to Run First Time – A myth is quite prevalent in the industry that stored procedures are pre-compiled and should always run faster. It is not true. Stored procedures are compiled on their very first execution, and that is the reason why they take longer the first time they are executed. In this blog post I had a great time discussing this concept. If you do not agree with it, you are welcome to read the post.
    Removing Key Lookup – Seek Predicate – Predicate – An Interesting Observation Related to Datatypes – Performance tuning is an interesting concept and a personal favorite of mine. In many blog posts I have described how to do performance tuning and how to improve the performance of queries.
    In this quick tip I have explained how one can remove the Key Lookup and improve performance. Here are very relevant articles on this subject: Article 1 | Article 2 | Article 3
    2010
    Recycle Error Log – Create New Log file without a Server Restart – During one of my consulting assignments I noticed a DBA restarting the server to create a new log file. This is absolutely not necessary, and restarting the server might have many other negative impacts. There is a common sp_cycle_errorlog which can do the same task efficiently and properly. Have you ever used this SP or feature? Additionally, I had a great time presenting on SQL Server Best Practices at the SharePoint Conference.
    2011
    SSMS 2012 Reset Keyboard Shortcuts to Default – It is very much possible that we mix up various SQL Server shortcuts and at times feel like resetting them to the default. In SQL Server 2012 it is not easy to do; there is a process to follow, and I enjoyed blogging about it.
    Fundamentals of Columnstore Index – The columnstore index was introduced in SQL Server 2012 and has been a very popular subject. It increases the speed of the server dramatically and can be an extremely useful feature for data warehousing. However, updating a columnstore index is not as simple as issuing a simple UPDATE statement. Read a detailed blog post about how UPDATE works with a columnstore index. Additionally, you can watch a quick video on this subject.
    SQL Server 2012 New Features – I had decided to explore SQL Server 2012 features last year and went through pretty much every single concept introduced in separate blog posts. Here are a few blog posts where I describe how SQL Server 2012 functions work: Introduction to CUME_DIST – Analytic Functions; Introduction to FIRST_VALUE and LAST_VALUE – Analytic Functions; OVER clause with FIRST_VALUE and LAST_VALUE – Analytic Functions. I indeed enjoyed writing about SQL Server 2012 functions last year. Have you gone through all the new features introduced in SQL Server 2012? If not, it is still not too late to go through them.
    Reference: Pinal Dave (http://blog.sqlauthority.com)   Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

< Previous Page | 611 612 613 614 615 616 617 618 619 620 621 622  | Next Page >