Search Results

Search found 31357 results on 1255 pages for 'database indexes'.


  • Troubleshooting the source of heavy resource usage on a Windows Server 2008 machine running multiple sites

    - by batman_man
    Hi, I am running about 10 ASP.NET websites on a hosted virtual server. The server runs Windows Server 2008; each website is backed by its own database running on SQL Server 2008 on the same box. Lately the box has seemed really slow. The only kind of diagnosis I could think of was watching Task Manager, where I can see w3wp.exe and sqlservr.exe jumping to 40% CPU usage every 5-10 seconds. What steps can I take to determine which of my websites is taking these resources, and/or which database is getting hit the most? I have SSMS installed on the machine as well. As you can tell, my sysadmin skills are very, very limited - any help would be much appreciated.
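
    One starting point, offered as a sketch: SQL Server's dynamic management views can attribute cumulative CPU time to each database, which narrows down which site's database is the hot one (the numbers reset whenever the service restarts).

        -- Run in SSMS: aggregate CPU time per database from the plan cache.
        SELECT DB_NAME(st.dbid) AS database_name,
               SUM(qs.total_worker_time) AS total_cpu_time
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        WHERE st.dbid IS NOT NULL
        GROUP BY DB_NAME(st.dbid)
        ORDER BY total_cpu_time DESC;

    For the w3wp side, giving each website its own application pool makes each site show up as its own w3wp.exe in Task Manager, so the CPU spikes become attributable per site.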

    Read the article

  • How come my Apache can't read my media folder, but it can load the site? (static files don't work)

    - by Alex
    Alias /media/ /home/matt/repos/hello/media
    <Directory /home/matt/repos/hello/media>
        Options -Indexes
        Order deny,allow
        Allow from all
    </Directory>
    WSGIScriptAlias / /home/matt/repos/hello/wsgi/django.wsgi

    /media is my directory. When I go to mydomain.com/media/, it says 403 Forbidden, and the rest of my site doesn't work because all static files are 404s. Why? The page loads, just not the media folder. Edit: hello is my project folder. I have tried chmod 777 on all the permissions of that folder.
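
    One likely culprit, offered as a guess from the snippet alone: when the URL path of an Alias ends in a slash but the filesystem path does not, Apache concatenates literally, so /media/style.css maps to .../mediastyle.css (a 404) and /media/ maps to the directory itself with indexes disabled (a 403) - which matches the symptoms exactly. A corrected sketch:

        Alias /media/ /home/matt/repos/hello/media/
        <Directory /home/matt/repos/hello/media>
            Options -Indexes
            Order deny,allow
            Allow from all
        </Directory>

    It is also worth confirming that the Apache user can traverse every parent directory (/home/matt and so on), since 777 on the media folder alone is not enough.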

    Read the article

  • Selectively disable APC caching

    - by Victor
    I installed APC on my VPS and it works great with the W3 Total Cache WordPress plugin. My problem is that there is one database in MySQL which is pinged by the client end every few seconds to see if there are new updates. This database contains time-sensitive information, so it can't be part of the cached data. How can I disable APC for this database or these files? Or can I set a very short expiry for certain types of data? Any help is highly appreciated.
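
    Two hedged notes, since the plugin's internals aren't shown here: APC caches PHP opcodes and user data rather than MySQL results, so the database traffic is more likely governed by the plugin's database-cache settings than by APC itself. If the goal is to keep a specific PHP file out of APC's opcode cache, the apc.filters directive takes exclusion regexes; the path below is hypothetical:

        ; php.ini: keep one time-sensitive script out of the opcode cache
        apc.filters = "-/var/www/site/wp-content/plugins/poller/check-updates.php"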

    Read the article

  • Changing an MSSQL clustered index field from containing "random" GUIDs to sequential GUIDs - how will this affect existing data?

    - by Eyvind
    We have an MSSQL database in which all the primary keys are GUIDs (uniqueidentifiers). The GUIDs are produced on the client (in C#), and we are considering changing the client to generate sequential (comb) GUIDs instead of just using Guid.NewGuid(), to improve db performance. If we do this, how will this affect installations that already have data with "random" GUIDs as clustered PKs? Can anything be done (short of changing all the PK values) to rebuild the indexes to avoid further fragmentation and bad insert performance? Please give explicit and detailed answers if you can; I am a C# developer at heart and not all too familiar with all the intricacies of SQL Server. Thanks!
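
    On the rebuild question, a hedged sketch (the table and index names are hypothetical): fragmentation accumulated from random GUIDs can be cleared with an index rebuild, and a fill factor below 100 leaves room for rows that still sort into existing pages.

        -- How fragmented is the clustered index right now?
        SELECT index_id, avg_fragmentation_in_percent
        FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Orders'),
                                            NULL, NULL, 'LIMITED');

        -- Rebuild the clustered PK with some free space per page.
        ALTER INDEX PK_Orders ON dbo.Orders REBUILD WITH (FILLFACTOR = 90);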

    Read the article

  • SQL Import to update existing records?

    - by Kenundrum
    I've got a table in a database that contains costs for items and gets updated monthly. To update these costs, we have someone export the table, do some magic in Excel, and then import the table back into the database. We're running MSSQL 2005 and using the built-in SQL Server Management Studio. The problem is that when importing back into the table, we have to delete all the records before we import, or else we get errors. The ideal situation would be for the import to recognize the primary keys and update the records, instead of trying to create a second record with a duplicate key and halting the import. The best illustration of the behavior we're after is the update-or-insert example at http://sqlmanager.net/en/products/mssql/dataimport/documentation/hs2180.html. Is something like this possible with the built-in tools, or do we have to get third-party software to make it happen?
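
    With only the built-in tools, one common pattern is to import into a staging table and then do the update/insert in T-SQL yourself; MERGE only arrives in SQL Server 2008, so on 2005 it takes two statements. A sketch with hypothetical table and column names:

        -- 1. Import the spreadsheet into dbo.Costs_Staging via the wizard.
        -- 2. Update rows whose keys already exist.
        UPDATE c
        SET c.Cost = s.Cost
        FROM dbo.Costs AS c
        JOIN dbo.Costs_Staging AS s ON s.ItemID = c.ItemID;
        -- 3. Insert rows whose keys don't.
        INSERT INTO dbo.Costs (ItemID, Cost)
        SELECT s.ItemID, s.Cost
        FROM dbo.Costs_Staging AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.Costs AS c WHERE c.ItemID = s.ItemID);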

    Read the article

  • Secure external connection to SQL Server (from third party software)

    - by Bart
    I have a SQL Express 2008 R2 server running on a server in an internal LAN. A few databases are used by some third-party software to store data, and a SQL Server user is used by this application to connect to the database. Now I need to access this database from an external PC, using a local installation of the software. In this particular case a VPN connection is not the solution I am looking for. I have access to an external Linux server, so I tried SSH tunneling from the Windows server to the Linux server and using the external PC to tunnel it back from the Linux server to the client, but this is working very, very slowly. What are my other options for allowing this external connection in a safe way?
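
    One hedged thought before abandoning SSH: the slowness may come from the double hop (two SSH layers, each adding encryption and TCP-over-TCP). Exposing the reverse-tunneled port publicly on the Linux box removes one layer. Hostnames and ports below are placeholders, and the Linux server needs GatewayPorts set to yes or clientspecified in its sshd_config:

        # On the Windows server: publish the local SQL port on the linux box
        ssh -R 0.0.0.0:14330:localhost:1433 user@linux-server.example.com
        # The external PC then connects straight to linux-server.example.com,14330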

    Read the article

  • How do I serve only internal intranet requests for a site with Apache?

    - by purpletonic
    I have an externally facing web server on our domain that we use for testing multiple sites. I have a site on this server that I want only people from within our intranet to view. How do I prevent requests originating from outside the intranet from seeing this website? I tried the following in my Apache config file, but I get a 403 error.

        <Directory />
            Options FollowSymLinks
            Order Deny,Allow
            Allow from domain.com
            Allow from 10.0.0.0/10.255.255.255
            Deny from All
            AllowOverride None
        </Directory>
        <Directory /var/www/sitename/public>
            Options Indexes FollowSymLinks MultiViews
            Order Deny,Allow
            Allow from domain.com
            Allow from 10.0.0.0/10.255.255.255
            Deny from All
            AllowOverride None
        </Directory>
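
    Two details stand out as likely causes, offered as guesses: Allow from 10.0.0.0/10.255.255.255 is not a network/netmask pair Apache understands (it wants CIDR or a real netmask), and Allow from domain.com relies on reverse DNS of the client, which frequently fails. A minimal sketch assuming a 10.0.0.0/8 intranet:

        <Directory /var/www/sitename/public>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order Deny,Allow
            Deny from all
            Allow from 10.0.0.0/8
        </Directory>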

    Read the article

  • Perl throwing 403 errors!

    - by Jamie
    When I first installed Perl in my WAMP setup, it worked fine. Then, after installing ASP.NET, it began throwing 403 errors. Here's my ASP.NET config:

        # Load the ASP.NET module
        LoadModule aspdotnet_module "modules/mod_aspdotnet.so"
        # Set ASP.NET extensions
        AddHandler asp.net asp asax ascx ashx asmx aspx axd config cs csproj licx rem resources resx soap vb vbproj vsdisco webinfo
        # Mount application
        AspNetMount /asp "c:/users/jam/sites/asp"
        # ASP directory alias
        Alias /asp "c:/users/jam/sites/asp"
        # Directory setup
        <Directory "c:/users/jam/sites/asp">
            # Options
            Options Indexes FollowSymLinks Includes +ExecCGI
            # Permissions
            Order allow,deny
            Allow from all
            # Default pages
            DirectoryIndex index.aspx index.htm
        </Directory>
        # aspnet_client files
        AliasMatch /aspnet_client/system_web/(\d+)_(\d+)_(\d+)_(\d+)/(.*) "C:/Windows/Microsoft.NET/Framework/v$1.$2.$3/ASP.NETClientFiles/$4"
        # Allow ASP.NET scripts to be executed in the temp folder
        <Directory "C:/Windows/Microsoft.NET/Framework/v*/ASP.NETClientFiles">
            Options FollowSymLinks
            Order allow,deny
            Allow from all
        </Directory>

    Also, what are the code tags for this site?
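
    A hedged guess, since the Perl side of the config isn't shown: a 403 on CGI scripts that appears after another handler is bolted on is often the Perl directory losing its own Options/handler directives. It may be worth confirming that something like the following still exists for the CGI directory (the path is hypothetical):

        <Directory "c:/wamp/www/cgi-bin">
            Options +ExecCGI
            AddHandler cgi-script .pl .cgi
            Order allow,deny
            Allow from all
        </Directory>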

    Read the article

  • Apache: domains working fine, subdomains not working anymore

    - by David Lawson
    Hi there, I'm not sure when, but suddenly subdomains aren't working on my server: e.g. www.davidlawson.co works, but david.lawson.co isn't working.

        <VirtualHost 173.203.109.191:80>
            ServerAdmin [email protected]
            ServerName david.lawson.co
            ServerAlias davidlawson.co
            ServerAlias www.davidlawson.co
            DocumentRoot /var/www/lawson/david
            <Directory /var/www/lawson/david/>
                Options -Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog /var/log/apache2/lawson/david/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog /var/log/apache2/lawson/david/access.log combined
        </VirtualHost>

    Any suggestions on how to debug this further, or what the problem might be?
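
    Two quick checks, as a sketch: confirm that DNS still resolves the failing name to this server, and ask Apache how it is mapping names to virtual hosts.

        # Does the subdomain still have a DNS record pointing here?
        dig +short david.lawson.co
        # Which vhost does Apache pick for each name?
        apache2ctl -S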

    Read the article

  • PowerShell and long-running external tools?

    - by leeand00
    I'm trying to compact an MS Access database using JetComp.exe from a PowerShell script. Here are the operative lines:

        # 4. Run JetComp
        LogWrite("Begin: Running JetComp")
        .\JETCOMP.EXE -src: $srcDB -dest: $dstDB | Out-Null # Run this command and wait for it to finish...
        IfErrorExit("Error Compacting Database")
        LogWrite("End: Running JetComp")

    The JETCOMP.EXE program seems to complete long before it is actually finished, and $dstDB ends up being smaller than the compact should even make it. Initially $srcDB is about 1.8 GB, and by the time the command finishes the destination is about 300,000 KB (about 0.29 GB); that's a pretty long way off from 1.8 GB, which when compacted manually ends up being about 1.6 GB. Is there some sort of timeout I don't know about in PowerShell scripts? P.S. I know that when running JETCOMP.EXE manually, the system often reports it as "not responding" even though it's actually getting the job done, and waiting long enough will allow it to complete.
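
    A hedged alternative to piping to Out-Null: launch the tool explicitly and block on the process handle, which also surfaces the exit code. (Whether JetComp detaches a child process and returns early is an assumption worth testing; Start-Process at least takes the pipeline out of the picture.)

        # Run JetComp and wait for the process itself to exit.
        $proc = Start-Process -FilePath ".\JETCOMP.EXE" `
                              -ArgumentList "-src:$srcDB", "-dest:$dstDB" `
                              -NoNewWindow -PassThru -Wait
        if ($proc.ExitCode -ne 0) { LogWrite("JetComp exited with code $($proc.ExitCode)") }

    Note also that the original line puts a space after -src: and -dest:, so each switch and its path arrive as separate arguments; JetComp may be parsing them differently than intended.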

    Read the article

  • Linux server remounted read-only

    - by Eustahije
    I have tried to find an answer on SF, but no luck; nothing worked. (I have just basic knowledge of Linux systems - I'm more of a developer.) An hour ago I noticed that the database is not reacting anymore. For some reason the system went into read-only mode, and the complete server is now, of course, unavailable. The server is a VPS at a Dutch company, and I have no idea what I can do with it now to unlock it. Every suggestion would be more than appreciated. I am trying to save as much of the database as possible, but there are 20 GB of images that would be hard to back up; I can do that too, if that is the smart thing to do.
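
    A sketch of the usual first steps, assuming a Linux kernel that remounted the filesystem read-only after detecting an I/O or filesystem error (the log normally says which):

        # Why did the kernel remount read-only?
        dmesg | grep -iE 'read-only|i/o error|ext[34]'
        # The disk is still readable, so copy the database out now
        # (both paths are placeholders):
        rsync -a /var/lib/mysql/ user@backup-host:/backup/mysql/

    The durable fix is usually an fsck run from the VPS provider's rescue console, since the root filesystem cannot be checked while it is mounted.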

    Read the article

  • Cannot get apache2 redirect working for a site

    - by benson
    What I want to do is redirect all visitors going to example.com to www.example.com. It seems a very common task, but for some reason it is not working for this specific site; it always points to the default one. Strangely, if I replace the domain with another one (yyyyy.com and www.yyyyy.com), it works all right. I checked my DNS, and it resolves to the right IP. Here's my virtual host configuration:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/html/example.com
            ServerName www.example.com
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /var/www/html/example.com>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>
        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName example.com
            Redirect 301 / http://www.example.com
        </VirtualHost>
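
    Two details worth checking, as guesses from the snippet: the redirect target should end in a slash, because Redirect appends the remainder of the request path directly to it, and name-based matching only happens when NameVirtualHost is in effect for *:80. A sketch:

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName example.com
            Redirect 301 / http://www.example.com/
        </VirtualHost>

    Running apache2ctl -S shows which vhost Apache actually selects for each hostname, which should make the difference between this domain and yyyyy.com visible.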

    Read the article

  • How to query Oracle from SQL Server?

    - by Albert Widjaja
    Hi everyone, I'm having difficulty creating a connection from my SQL Server 2008 Enterprise SP2 x64 to an Oracle Database 10g instance, even though I have already installed Oracle Client 11g R2. I've followed the steps in this article: http://www.ideaexcursion.com/2009/01/05/connecting-to-oracle-from-sql-server/ and additionally added TNS_ADMIN to the server environment variables, pointing to C:\Oracle\product\11.2.0\client_1\network\admin. What is working now:

    - TNSNAMES.ORA has been copied successfully from another developer's workstation
    - I can TNSPING the DB instance
    - I can connect to the database using SQL*Plus and perform any SQL commands
    - I can create the DSN, but ONLY when using "C:\Windows\SysWOW64\odbcad32.exe"; the normal odbcad32 doesn't show the DSN I have just created
    - the DSN created above works fine in the test connection

    My goal: to be able to select the Oracle connection in the Linked Server object, but there is still no effect after I restart the server (Windows Server 2008 Enterprise 64-bit SP2). Any ideas for resolving this problem would be greatly appreciated. Thanks.
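
    One hedged observation: a DSN that only appears in the SysWOW64 odbcad32 was created by a 32-bit driver, while a linked server on 64-bit SQL Server needs a 64-bit provider. With a 64-bit Oracle provider installed, a linked-server sketch looks like this (server name, TNS alias and credentials are placeholders):

        EXEC sp_addlinkedserver
             @server = N'ORA_LINK', @srvproduct = N'Oracle',
             @provider = N'OraOLEDB.Oracle', @datasrc = N'MYTNSALIAS';
        EXEC sp_addlinkedsrvlogin
             @rmtsrvname = N'ORA_LINK', @useself = 'FALSE',
             @rmtuser = N'scott', @rmtpassword = N'tiger';
        -- Smoke test:
        SELECT * FROM OPENQUERY(ORA_LINK, 'SELECT 1 FROM dual');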

    Read the article

  • Magento on Zend Server (Win7) installation error

    - by czerasz
    I'm trying to install Magento for the first time. I've created a database with the name "project". At the end of my C:\Zend\Apache2\conf\httpd.conf I added:

        <Directory "C:\Zend\Apche2\htdocs\project">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    In ZendServer/Server Setup/Extensions, PDO_MySQL, simplexml, mcrypt, hash, GD, DOM, iconv, curl and SOAP are on. In C:\Zend\ZendServer\etc\php.ini I set:

        safe_mode = Off ; <-- was set to off
        ...
        memory_limit = 512M ; maximum amount of memory a script may consume (128MB)

    After the "Configuration" step of the Magento installation (with "Use Web Server (Apache) Rewrites" enabled) I get: Internal Server Error. My database is full of tables (so that should be OK). My Zend Server event log shows:

        27-Oct 06:55  6  Severe Slow Request Execution (Absolute)  http://localhost/project/index.php/install/wizard/installDb/  Critical  Open
        27-Oct 06:55  4  Fatal PHP Error  C:\Zend\Apache2\htdocs\project\lib\Varien\Db\Adapter\Pdo\Mysql.php  Critical  Open
        27-Oct 06:55  5  Slow Function Execution  curl_exec  Warning  Open
        27-Oct 06:55  5  Slow Request Execution (Absolute)  http://localhost/project/index.php/install/wizard/configPost/

    What can be wrong?
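
    A hedged first step: the "Fatal PHP Error" logged against Varien\Db\Adapter\Pdo\Mysql.php will have its full message in the PHP/Apache error log, and that message usually names the real cause (commonly a missing PDO driver, or max_execution_time running out during installDb). The log path below is an assumption for a default Zend Server layout:

        type "C:\Zend\Apache2\logs\error.log"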

    Read the article

  • SQL Server 2005 Default Backup Plan

    - by tylerl
    I noticed that a newly imported database on SQL Server 2005 had configured itself (without my knowledge) to perform daily backups, but it's not deleting old files and is quickly filling up the disk. I don't know how the backup job got configured (maybe that's something that gets transferred when you move a database?), but I'm having trouble modifying it. The backup runs as part of a SQL Server Agent job called "Daily Backups". This job runs a package called "(SSIS Packages)\Maintenance Plans\Backup Plan", which I can't find; the "Management\Maintenance Plans" area for my server is empty. I imagine I could delete the existing plan and re-create it manually, but I was hoping to just modify what is already there, since all that's missing is deleting old files.
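
    A sketch for locating the orphaned plan: maintenance plans live in msdb even when Management Studio doesn't list them, so querying msdb directly can confirm whether the package still exists and which Agent job runs it.

        -- Maintenance plans stored in msdb (SQL Server 2005):
        SELECT name, id FROM msdb.dbo.sysmaintplan_plans;
        -- Agent jobs, to match against "Daily Backups":
        SELECT name, job_id FROM msdb.dbo.sysjobs;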

    Read the article

  • How can we achieve a 403 Permission Denied for a subdomain?

    - by marikamitsos
    We have a multisite installed in the root directory (multisite.com) and also a single WordPress installation on a subdomain (help.multisite.com). In the root .htaccess we placed:

        # START Security: Disallow access to folders
        Options All -Indexes
        # END Security

    On the main site, as expected, we get "403 Permission Denied. You do not have permission for this request /wp-content/blogs.dir/83/". Nice. :) BUT we just noticed that when trying to access the subdomain's folders we get "Internal Server Error. The server encountered an internal error or misconfiguration and was unable to complete your request...", i.e. a 500 Internal Server Error. This is something we do NOT want. So the question is: how can we avoid this result and make the subdomain return "403 Permission Denied" (the same as the main site), NOT "500 Internal Server Error" (as it is now)? We put what, where?
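
    A hedged guess at the mechanism: a 500 triggered by an .htaccess line usually means the vhost doesn't permit that directive (here, an AllowOverride that excludes Options), so Apache rejects the whole file. If so, the subdomain's vhost either needs to allow the override or set the option itself (the path is hypothetical):

        <Directory /var/www/help.multisite.com>
            # either permit .htaccess to set it...
            AllowOverride Options
            # ...or set it directly in the vhost:
            Options -Indexes
        </Directory>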

    Read the article

  • XAMPP: pointing to a folder outside the root folder

    - by Ravi
    I am using XAMPP for the first time, on a Mac, and I am running into problems accessing anything other than the root folder (htdocs). When I place my web application inside htdocs with the default httpd.conf, it works; when I try to point to my web application in httpd.conf, it throws an error. I am aware that to modify the root folder I need to make changes to my XAMPP/etc/httpd.conf file. Starting from the default XAMPP Mac settings, I am trying to change the server root, document root and directory in XAMPP/etc/httpd.conf to the following:

        ServerRoot "/Users/ravi/Documents/Development/Backbone/backboneboilerplate"
        DocumentRoot "/Users/ravi/Documents/Development/Backbone/backboneboilerplate"
        <Directory />
            Options FollowSymLinks
            AllowOverride All
            Order deny,allow
            Deny from all
        </Directory>
        <Directory "/Users/ravi/Documents/Development/Backbone/backboneboilerplate">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    It throws this error when trying to start XAMPP:

        httpd: Syntax error on line 54 of /Applications/XAMPP/xamppfiles/etc/httpd.conf: Cannot load /Users/ravi/Documents/Development/Backbone/backboneboilerplate/modules/mod_authn_file.so into server: cannot create object file image or add library
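
    The error message itself points at the likely fix: LoadModule paths are resolved relative to ServerRoot, so moving ServerRoot to the project folder makes Apache look for its own modules there. A sketch that leaves ServerRoot at the XAMPP default and changes only the document root:

        ServerRoot "/Applications/XAMPP/xamppfiles"
        DocumentRoot "/Users/ravi/Documents/Development/Backbone/backboneboilerplate"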

    Read the article

  • MySQL & tmpfs: performance

    - by Serty Oan
    I was wondering if, and by how much, using tmpfs could improve MySQL performance, and how it should be done. My guess would be to do mount -t tmpfs -o size=256M /path/to/mysql/data/DatabaseName and then use the database normally, but maybe I'm wrong (I'm using MyISAM tables only). Would an hourly rsync between the tmpfs /path/to/mysql/data/DatabaseName and /path/to/mysql/data/DatabaseName_backup penalize performance? If so, how should I make the backup of the tmpfs database? So, is this a good way to do things, is there a better way, or am I wasting my time?
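
    One caution, offered as a sketch: rsyncing live MyISAM files risks copying tables mid-write. For MyISAM, mysqlhotcopy flushes and locks the tables for the duration of the copy, so syncing its output rather than the live tmpfs directory is safer, at the cost of a brief read lock each hour:

        # Consistent copy of one database's MyISAM files (target path is a placeholder)
        mysqlhotcopy DatabaseName /path/to/backup/dir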

    Read the article

  • Can't perform ODBC connection to MySQL server on local network

    - by Emmanuel
    I have a WAMP server running on LAN IP address 192.168.1.101. From the browser on my PC, which is on the LAN, I can access the web server, and I have also set up the phpmyadmin.conf file to be able to access the phpMyAdmin interface; this works smoothly. On the WAMP server I have a database which I need to access from any PC on the LAN using MySQL Connector/ODBC. The problem is that I cannot manage to set up the connection correctly. Here are the parameters I use:

        Data Source Name: test_connection
        Description: test connection
        Server: 192.168.1.101
        Port: 3306
        User: root
        Password:
        Database:

    The error message I get is the following:

        Connection Failed: [HY000][MySQL][ODBC 5.1 Driver] Can't connect to MySQL server on '192.168.1.101' (10060)

    Would anybody have a hint for setting up the connection correctly?
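
    Error 10060 is a TCP connection timeout, so this is usually reachability rather than credentials. Two hedged things to check: that my.ini doesn't bind MySQL to 127.0.0.1 (bind-address) or disable TCP (skip-networking), and that the Windows firewall allows port 3306; then make sure root is actually allowed to connect from other LAN hosts (the password below is a placeholder):

        GRANT ALL PRIVILEGES ON *.* TO 'root'@'192.168.1.%' IDENTIFIED BY 'secret';
        FLUSH PRIVILEGES;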

    Read the article

  • Amazon RDS Pros/Cons of Multiple DBs per instance

    - by Joe Flowers
    I run two completely independent websites, and I am moving their MySQL databases to Amazon RDS. I'm not going to do a Multi-AZ deployment; let's remove that variable from this question. I'm not sure whether to create a single RDS instance with two databases, or two RDS instances with a single database each. Ignore cost for the sake of this question. I will not hit the 1 TB data limit, so let's ignore that too. However, it is extremely important that crashing one of the websites doesn't impact the other. Based on this document - http://docs.amazonwebservices.com/AmazonRDS/latest/UserGuide/Concepts.DBInstance.html - I'm assuming that if I write terrible code that crashes one of the databases in a given RDS instance, it could possibly take down the entire RDS instance (and thus inadvertently affect the other database). Is that correct? Thanks

    Read the article

  • How to configure SSL on an instance of SQL Server to allow dedicated users to remotely access it?

    - by The Good Boy
    I have configured an instance of SQL Server to allow dedicated users to access it remotely. The connection string Data Source = 192.168.1.2,1433\sqlexpress;etc... has been tested and works. However, I have not configured SSL to secure the communication. How do I configure SSL on an instance of SQL Server so that dedicated users can access it remotely? Edit 1: The dedicated user will administer their database using SQL Server Management Studio. What I want to do is secure the communication while they administer the database through SQL Server Management Studio.
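
    In outline (a sketch; the certificate steps happen in SQL Server Configuration Manager rather than in code): install a server certificate whose subject matches the server's name, enable Force Encryption under Protocols for the instance, and restart the service. SSMS then has an "Encrypt connection" checkbox in the connection dialog's Options, and applications can request the same thing in the connection string:

        Data Source=192.168.1.2,1433;Encrypt=True;TrustServerCertificate=False;...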

    Read the article

  • PSQL 64-bit driver error

    - by Alex Holsgrove
    I have an Ubuntu 12.04 64-bit server set up under Hyper-V. I have installed the Pervasive 64-bit SQL drivers so that a stock-updater script can run daily (it updates an external MySQL database from another local server running Exchequer software on a PSQL database). These drivers seem to conflict with the system libraries, as I found out when trying to run any apt-get command:

        apt-get update
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.9' not found (required by apt-get)
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by apt-get)
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.11' not found (required by apt-get)
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.11' not found (required by /usr/lib/x86_64-linux-gnu/libapt-pkg.so.4.12)
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.9' not found (required by /usr/lib/x86_64-linux-gnu/libapt-pkg.so.4.12)
        apt-get: /usr/local/psql/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /usr/lib/x86_64-linux-gnu/libapt-pkg.so.4.12)

    Any help would be great.
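
    A hedged reading of those errors: the driver install has most likely put /usr/local/psql/lib64 on the global library path, so every binary, apt-get included, now loads Pervasive's older libstdc++. If a file under /etc/ld.so.conf.d/ points there, removing it and scoping the path to just the one script is the usual cure (script path is a placeholder):

        # Find the offending global loader entry:
        grep -r psql /etc/ld.so.conf.d/
        # After removing it, refresh the cache:
        sudo ldconfig
        # Give the library path only to the script that needs it:
        LD_LIBRARY_PATH=/usr/local/psql/lib64 /path/to/stock-updater.sh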

    Read the article

  • Proper management of PGPool II

    - by Cathy
    Currently I have a site with one Postgres database server. It serves just a select number of users (fewer than ten), but it needs the maximum uptime possible. I would like some kind of automatic failover for the database, so I was thinking of something like this: one server running pgpool-II, one running Postgres as master, and one running Postgres as slave. But then, if the machine running pgpool suddenly loses power (or dies, or whatever), there's a single point of failure and the whole thing goes down. Is there a solution, assuming that outsourcing this to someone else isn't possible?
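
    One direction worth examining, sketched with placeholder values: pgpool-II's watchdog mode runs two pgpool nodes that monitor each other and move a virtual IP between them, so clients keep a single address and neither pgpool box is a single point of failure.

        # pgpool.conf excerpt, on both pgpool nodes:
        use_watchdog = on
        delegate_IP = '192.168.1.50'   # virtual IP that clients connect to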

    Read the article

  • SQL Server 2008 data directories on SSD

    - by Kuroro
    I am going to install a new SQL Server 2008 instance on my development/testing machine. My machine has one 7200 rpm 500 GB SATA disk (C:, OS) and one Intel X25-G2 80 GB SSD (D:). The machine's configuration is: CPU i7 860, RAM 8 GB. Microsoft gives the option to place the following directories on different disks, so I plan to place the user databases and tempdb on the SSD and the rest on the traditional disk. Is this a good choice for gaining a performance boost from the fast SSD?

        Data root directory:      C:\Program Files\Microsoft SQL Server
        User database directory:  D:\Data
        User log directory:       C:\Logs
        Temp DB directory:        D:\TempDB
        Temp log directory:       C:\TempDB
        Backup directory:         C:\Backups
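
    Worth knowing, as a sketch: tempdb's location isn't locked in at setup and can be moved later with ALTER DATABASE (the logical file names below are the defaults); the change takes effect when the service restarts.

        ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'D:\TempDB\tempdb.mdf');
        ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'C:\TempDB\templog.ldf');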

    Read the article
