Search Results

  • Determining which database instance generates the most I/O

    - by user2008937
    Assume I have a dedicated server running multiple instances of MySQL and PostgreSQL. Without iotop, how can I determine which instance is generating the most I/O (and thus driving up IOWAIT) at a particular moment? /proc/<pid>/io only shows counters accumulated over a period of time. When lots of people are working on a DB, I can usually tell which instance is causing the load by its high CPU usage, but I recently had a situation where CPU usage was normal yet a very high iowait put a huge load on the server, and I had trouble finding the process responsible for the unusual I/O.
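    A minimal sketch of the sample-twice-and-diff approach (assuming Linux, root access to read other users' /proc/<pid>/io, and daemons named mysqld/postgres; the 5-second window is arbitrary):

        #!/usr/bin/env python3
        # Sample /proc/<pid>/io twice and report per-process deltas, turning
        # the accumulated counters into a rate over the sampling window.
        import os, time

        def io_bytes(pid):
            # (read_bytes, write_bytes) for a pid, or None if it vanished
            try:
                with open(f"/proc/{pid}/io") as f:
                    fields = dict(line.split(":") for line in f)
                return int(fields["read_bytes"]), int(fields["write_bytes"])
            except (FileNotFoundError, PermissionError, KeyError):
                return None

        def db_pids():
            for pid in filter(str.isdigit, os.listdir("/proc")):
                try:
                    with open(f"/proc/{pid}/comm") as f:
                        name = f.read().strip()
                except FileNotFoundError:
                    continue
                if "mysqld" in name or "postgres" in name:
                    yield int(pid), name

        WINDOW = 5  # seconds
        before = {pid: io_bytes(pid) for pid, _ in db_pids()}
        time.sleep(WINDOW)
        for pid, name in db_pids():
            now, old = io_bytes(pid), before.get(pid)
            if now and old:
                delta = (now[0] - old[0]) + (now[1] - old[1])
                print(f"{pid:>7} {name:<12} {delta / WINDOW / 1024:.0f} KiB/s")

    The process with the largest delta is the one doing the most I/O during the window; mapping its pid back to a specific instance is then a matter of checking its command line or data directory.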

  • 1080p monitor: connected through VGA - perfect, HDMI - awful

    - by develroot
    When I connect my 23" monitor (1920x1080) to my pc through HDMI, I encourage some problems. It's not full screen That's it. There are ~1cm black borders on right and left side and ~0,5cm black borders on the top and the bottom of the monitor. That's pretty frustrating. I tried adjusting overscan, but I can't mannualy type the % of overscan that I need. I can only select between ex. 8 and 10% in AMD Vision Engine Control center, but what I need is 9%. Next, even if i select 10% and the image fits all the corners, but after log off all the settings are lost and I have to do it again and again. Text, images, everything looks blurry That surprised me a lot. Should'nt HDMI quality be better than VGA's one? When connected through HDMI, the text isn't readable. It's like a very low refresh rate, although i'm running at 60Hz. Also the text has something like little shadows, very very annoying. Are there any tips to get the same quality as with VGA, with HDMI ? (running on integrated ATI Radeon HD4200, which, appearently, is the best card I have ever seen in terms of integrated ones)

  • rsync --files-from (find + cat)

    - by Edward
    I'm trying the command rsync -v --files-from=/path/to/list.lst /home/user /path/to/backup, where list.lst contains, for example:

        .gnupg/
        .pki/
        .gnome2/keyrings/
        .mozilla/firefox/*.default/bookmarkbackups/
        .mozilla/firefox/*.default/bookmarks.html
        .mozilla/firefox/*.default/*.db
        .mozilla/firefox/*.default/*.sqlite

    For every line containing a * I get the error "failed: No such file or directory". Can anybody help me? Alternatively, can I combine find or `cat /path/to/list.lst` with rsync?
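    The underlying issue is that --files-from takes literal file names, not glob patterns. A sketch of one workaround is to let the shell expand the patterns first and feed the result to rsync on stdin (assuming bash; the paths are the ones from the question):

        cd /home/user || exit 1
        while read -r pat; do
            compgen -G "$pat"    # expand the glob; prints nothing when there is no match
        done < /path/to/list.lst | rsync -v --files-from=- /home/user /path/to/backup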

  • Help with Ubuntu and Windows, separate HDs

    - by LuxuryMode
    Need some major help. I'm running a Dell XPS/Dimension 630i. It came with "SATA 2 RAID 0 with dual 500GB hard drives." I have installed a new, third non-RAID drive and put Ubuntu on it, so now I have Windows on the original RAID array and Ubuntu Linux on the new HD. When I get to the boot menu where I can select an OS, selecting Windows gives an error: "No such drive, no such disk." Also, strangely, just to reach the bootloader menu at all I have had to disable ALL ports under the RAID config; unless I do this, I just get a never-ending blinking cursor. I have tried every conceivable CMOS config and nothing else works: setting port 3 (the new HD with Ubuntu) to first hard-disk boot priority, disabling all other ports and enabling only the Ubuntu HD's port, and vice versa. I have some pictures of the boot-up: the first is a strange error I get after tweaking the CMOS to finally make the Ubuntu install work: http://imgur.com/5sqJa, then the boot menu: http://imgur.com/TWtLq, then the error: http://imgur.com/TJ1mS. Also, please note that I can actually access all files on the RAIDed Windows drive from Ubuntu.

  • TFS 2010 Subfolder Permissions

    - by gmcalab
    I am a TFS admin, and I have a TFS project in which a subfolder needs specific permissions to deny some users. So I right-click the folder in question, hit Properties, and click the Security tab. There I select the Windows User or Group radio button, then click Add, put in the AD user that I want specific permissions for, and hit Check Names. That resolves, so I click OK. Next, I select the permissions to Allow or Deny below in the Permissions for list and hit OK. The permissions are honored by TFS: the user no longer has the PendChange permission, just as I expected. The odd thing is, I expected to be able to go back into the Security tab and see that user in the Users and Groups list along with the current state, but the list is always empty. Not sure why, because the permissions are definitely being honored; I can re-add the user with different permissions and those are honored as well. Any ideas why the current users are not showing up in the Users and Groups list under the Security tab of a folder's properties? I also used tf permission $/... to see if there were any permissions set, but it always returns "There are no permissions set for this item (Inherit: Yes)".

  • SQL Server 2008 permissions and encryption

    - by Paranjai
    I have encrypted columns in some of the tables in SQL Server 2008. As the db owner, I have the access needed to encrypt and decrypt the data using the symmetric key and certificate. But some other users currently have only datareader and datawriter rights, and when they execute any SP referring to logic that uses the key and certificate, they get an error saying the user does not have rights on the certificate. What rights / exact permissions should I grant them to solve this problem?
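    A minimal T-SQL sketch of the grants that typically unblock this (the key, certificate, procedure, and user names here are hypothetical placeholders; substitute the objects that actually protect the columns):

        -- let the user open the symmetric key and use the protecting certificate
        GRANT VIEW DEFINITION ON SYMMETRIC KEY::MySymKey TO SomeUser;
        GRANT CONTROL ON CERTIFICATE::MyCert TO SomeUser;
        -- and let them run the procedure that does the encrypt/decrypt work
        GRANT EXECUTE ON OBJECT::dbo.MyProc TO SomeUser;

    If the stored procedures are the only intended access path, an alternative worth considering is signing the procedures with the certificate (ADD SIGNATURE) so the users themselves never need permissions on the key at all.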

  • postgresql login from remote

    - by Hellnar
    I want to give remote access to my PostgreSQL db (8.2) to computers on the same LAN. Starting from the default config, I have added these lines to pg_hba.conf, where xxx.xx.xx.xx is the IP of the machine that hosts PostgreSQL (a Windows 2000 server):

        # IPv4 local connections:
        host    all    all    0.0.0.0/0          password
        host    all    all    xxx.xx.xx.xx/24    password

    There is no firewall or anything similar blocking the connection, and listen_addresses = '*' is set in postgresql.conf.
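    For checking each link in the chain, a short sketch (the user and database names are placeholders; pg_hba.conf edits only take effect after a reload):

        # on the server: re-read pg_hba.conf
        pg_ctl reload
        # on a LAN client: attempt an actual login
        psql -h xxx.xx.xx.xx -U some_user -d some_db

    If psql times out rather than failing authentication, the problem is connectivity (listen_addresses, port 5432, or a firewall) rather than pg_hba.conf.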

  • Creating a test database with copied data *and* its own data

    - by Jordan Reiter
    I'd like to create a test database that is refreshed each day with data from the production database. BUT, I'd like to be able to create records in the test database and retain them rather than having them overwritten. I'm wondering if there is a simple, straightforward way to do this. Both databases run on the same server, so apparently that rules out replication? For clarification, here is what I would like to happen:

    1. The test database is created with production data.
    2. I create some test records that I want to keep on the test server (basically so I can have example records to play with).
    3. The next day, the database is completely refreshed, but the records I created that day are retained; records that were untouched that day are replaced with records from the production database.

    The complication is that if a record is deleted in the production database, I want it deleted in the test database too. So I do want to get rid of records in the test database that no longer exist in the production database, unless those records were created within the test database. It seems the only way to do this would be some sort of table storing metadata about the records being created, for example:

        CREATE TABLE MetaDataRecords (
            id        integer not null primary key auto_increment,
            tablename varchar(100),
            action    char(1),
            pk        varchar(100)
        );

        DELETE FROM testdb.users
        WHERE NOT EXISTS (SELECT * FROM proddb.users
                          WHERE proddb.users.id = testdb.users.id)
          AND NOT EXISTS (SELECT * FROM testdb.MetaDataRecords
                          WHERE testdb.MetaDataRecords.pk = testdb.users.id
                            AND testdb.MetaDataRecords.action = 'C'
                            AND testdb.MetaDataRecords.tablename = 'users');
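    Building on that metadata table, the refresh half of the job could be as simple as this sketch (assuming MySQL and the users table from the example; rows whose ids exist only in testdb are never matched, so locally created records survive, though retaining locally modified rows would additionally require filtering 'U' actions the same way):

        -- overwrite test rows with fresh production data, matched by primary key
        REPLACE INTO testdb.users SELECT * FROM proddb.users;
        -- then run the DELETE above to drop rows that disappeared from production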

  • how to import an existing VM into the VMware Workstation 8 inventory

    - by Wimmel
    I'd like to add existing VMware (Player) virtual machines to the VMware Workstation 8 inventory on Linux. When I create a new virtual machine, it is stored in /var/lib/vmware/Shared VMs/, but copying new directories into that folder does not make them appear in the Workstation window. I found out the inventory is stored in /etc/vmware/hostd/vmInventory.xml:

        <ConfigRoot>
          <ConfigEntry id="0000">
            <objID>1</objID>
            <vmxCfgPath>/var/lib/vmware/Shared VMs/test 1234/test 1234.vmx</vmxCfgPath>
          </ConfigEntry>
        </ConfigRoot>

    But I don't know if I'll break anything by adding entries myself and giving each a unique ID. Besides, adding a large number of VMs this way is a bit cumbersome. On ESX it was possible to use vmware-cmd -s register, but I don't have vmware-cmd installed. In another question it was suggested to use VMware Converter, but VMware Converter 5 (on Windows) only allows a destination file location when I select Workstation as the destination type. When I select VMware Infrastructure as the destination type, it says the destination is unsupported; it requires VMware vCenter Server.
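    A sketch of scripting that edit (this assumes the vmInventory.xml layout quoted above, that unique objID/id values are all that is needed, and that the VMware services are stopped while the file is rewritten; worth trying on a backup copy first):

        #!/usr/bin/env python3
        # Append a ConfigEntry for every .vmx under the shared-VMs folder
        # that is not yet present in the Workstation inventory.
        import glob
        import xml.etree.ElementTree as ET

        INVENTORY = "/etc/vmware/hostd/vmInventory.xml"

        tree = ET.parse(INVENTORY)
        root = tree.getroot()
        known = {e.findtext("vmxCfgPath") for e in root.findall("ConfigEntry")}
        ids = [int(e.findtext("objID")) for e in root.findall("ConfigEntry")]
        next_id = max(ids, default=0) + 1

        for vmx in sorted(glob.glob("/var/lib/vmware/Shared VMs/*/*.vmx")):
            if vmx in known:
                continue
            entry = ET.SubElement(root, "ConfigEntry", id=f"{next_id:04d}")
            ET.SubElement(entry, "objID").text = str(next_id)
            ET.SubElement(entry, "vmxCfgPath").text = vmx
            next_id += 1

        tree.write(INVENTORY)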

  • how many tables can an MS SQL database hold?

    - by Peter Turner
    I've run into this cryptic statement for SQL Server: "Files Per Database: 32,767". What does that mean exactly? Is there a maximum number of tables for a given version of SQL Server? We try to support SQL Server 2005 and later, 32-bit and 64-bit, so if anyone has a handy-dandy table they use to figure out how many tables they can have per DB for Microsoft SQL Server, I'd heartily appreciate seeing it.

  • Force Windows 7 to store thumbnails locally

    - by kotekzot
    I want Windows 7 to store thumbnail cache files in the same folder as the files themselves (thumbs.db) instead of using the centralized location for all thumbnails (by default %userprofile%\AppData\Local\Microsoft\Windows\Explorer). How would one achieve this? Alternatively, if that is implausible, I'd settle for no thumbnail caching at all, forcing Windows to regenerate thumbnails each time a folder is accessed.
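    For the fallback (no caching at all), a sketch of the usual policy tweak follows. NoThumbnailCache is the registry value behind the "Turn off the caching of thumbnail pictures" Explorer policy; treat the exact key as an assumption to verify on your build before importing:

        Windows Registry Editor Version 5.00

        ; disable Explorer's thumbnail cache entirely (log off and on to apply)
        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
        "NoThumbnailCache"=dword:00000001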

  • SharePoint 2010: error moving a site collection to a different content database

    - by Brandon Ulasiewicz
    I am trying to move a site collection from one content database to another. First I used the following PowerShell command:

        New-SPContentDatabase -Name New_DB -WebApplication http://portal/

    I confirmed that this did in fact create the DB on the SQL Server. I then used:

        Move-SPSite http://portal/sites/hr -DestinationDatabase New_DB

    This generates an error stating that the "Operation is not valid due to the current state of the object". Can anyone help point me in the right direction? Thanks
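    A sketch of a workaround often suggested for this error (the assumption being that the PowerShell session which created the database has a stale view of it, so a brand-new session is used, and that IIS must be restarted after the move):

        # in a NEW SharePoint 2010 Management Shell session
        $db = Get-SPContentDatabase -Identity New_DB
        Move-SPSite -Identity http://portal/sites/hr -DestinationDatabase $db
        iisreset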

  • School Management System

    - by BoundforPNG
    I am looking for a school management system to replace a homegrown Access db. It should be able to handle the following for both a primary and a secondary school:

    - Scheduling classes
    - Student enrollment
    - Allowing teachers to enter grades and comments
    - Generating transcripts and report cards
    - Handling attendance
    - Handling tuition billing

    It should store data in a server database like SQL Server, and it would be nice to have a web interface. We are open to a commercial system or an open-source system that comes with support.

  • What's the difference between pulling from a branch into master and pushing that branch onto master?

    - by Justin808
    In TortoiseGit, I right-click on the repository and select Sync. At the top of the dialog there are options for Local Branch and Remote Branch. If the local branch is named DeveloperA and the remote branch is master and I do a Push, what happens? If the local branch is master and the remote branch is DeveloperA and I Pull, what happens? If I am on the master branch, right-click, select Merge, and change the From to my DeveloperA branch, what happens? If I try to push from master to the remote master and the remote has been updated, git stops and tells me to pull. It seems that if I push from DeveloperA to master it doesn't stop, it just clobbers; is that correct? We're having an issue where the remote master branch gets clobbered at times, and we are trying to figure out why. For example, a developer working on his DeveloperA branch will pull from master to get any updates, then push to master to push out his changes. But sometimes the push lists more files in the outgoing commit list than he has edited. The odd thing is he can't revert those files, as git says they are up to date and unmodified, yet when he pushes, git pushes them out. The problem is that if there were changes between his pull and his push, those changes get clobbered.
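    The command-line equivalents of those TortoiseGit actions may make the semantics clearer (a sketch; the branch names are from the question, and note that a plain push refuses a non-fast-forward update unless it is forced):

        git push origin DeveloperA:master   # ask the remote to move its master
                                            # forward to local DeveloperA's commit
        git pull origin master              # fetch remote master and merge it into
                                            # whatever branch is checked out
        git merge DeveloperA                # while on master: merge DeveloperA locally

    If pushes are "clobbering" master, it is worth checking whether a force-push option is enabled somewhere, since stock git would reject that push and demand a pull first.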

  • MySQL Backup - incremental

    - by Tiffany Walker
    I know that you can use mysqldump. I am currently dumping the following way:

        ${MYSQLDUMP} --single-transaction -u ${MUSER} -h ${MHOST} -p${MPASS} $db | ${GZIP} -9 > $FILE

    From my understanding this locks the database, prevents any use of it, and can even lock up websites. Is there a better way to do daily/hourly backups of the MySQL database once the database grows into the hundreds of MBs or even GBs?
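    A sketch of the usual incremental pattern built on the binary log (this assumes log-bin is enabled in my.cnf and the backup user has the needed privileges; paths are illustrative, and note that --single-transaction avoids locking only for InnoDB tables, while MyISAM tables are still locked during the dump):

        # daily full backup: --flush-logs starts a fresh binlog, --master-data=2
        # records the matching binlog position as a comment in the dump
        ${MYSQLDUMP} --single-transaction --flush-logs --master-data=2 \
            -u ${MUSER} -h ${MHOST} -p${MPASS} $db | ${GZIP} -9 > $FILE

        # hourly incremental: rotate the binlog and archive the completed ones
        mysqladmin -u ${MUSER} -h ${MHOST} -p${MPASS} flush-logs
        cp /var/lib/mysql/mysql-bin.[0-9]* /backup/binlogs/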

  • Installed Windows 7 Ultimate on D Drive and previous Windows 7 Enterprise on C Drive has stopped starting up

    - by teenup
    Please please help! I have installed Windows 7 Ultimate on the D drive of my laptop, and the previous Windows 7 Enterprise, which was installed on the C drive, is not booting up now. When I turn on the laptop I see two Windows 7 entries on the screen; when I select the newer one it starts, but when I select the older one, the Enterprise edition, the system won't start and I get a black screen with this error message:

        Windows Boot Manager
        Windows failed to start. A recent hardware or software change might be the cause. To fix the problem:
        Insert your Windows installation disc and restart your computer.
        Choose your language settings, and then click "Next."
        Click "Repair your computer."
        Info: The boot selection failed because a required device is inaccessible.

    I notice that when I run the newer OS, the previous OS's drive (which is now D: instead of C:) has become unusable: when I double-click it, it asks me to format the drive. The data that I had on my D drive (which is now the C drive for the new OS) I had copied to a network path, and it is still available; it contained the Windows 7 Users folder, which I copied when installing the new Windows. I have copied that Users folder back to the new OS's C drive thinking it would run again, but to no avail. Please please please, if someone can help, it is extremely urgent for me. Thanks a lot in advance.

  • Redis connection issue

    - by mre
    We are currently experiencing a lot of Redis errors with the message:

        Unable to connect: read error on connection, trying next server

    We run Redis on FreeBSD using phpredis, and we have a hard time reproducing the error on Ubuntu, so this might be a hint. There's a long-running issue on this topic on GitHub. Basically we get a socket from the operating system with a call to connect(host, port, timeout) in phpredis, but when we then do a select(db_index), we get an exception. Could there be an issue with persistence? I assume that connect() does nothing in the background and select() tries to access the connection, which is actually closed. We don't run into a timeout. We tried tuning TIME_WAIT without success. Any other ideas on where the problem might come from? What is the best way to track the issue down, dtrace maybe?

    Update: We are currently looking into our BGSAVE settings. Interestingly, it takes half a second or more to create a fork of the process that regularly writes the data to disk (persistence), and maybe Redis can't respond to connect() requests during that timespan.
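    While the root cause is being tracked down, a defensive pattern is to retry the connect/select pair from scratch when select() hits a dead socket (a sketch using the phpredis calls from the question; the helper name and retry count are made up):

        <?php
        // retry connect+select so a connection that died between the two
        // calls gets re-established instead of bubbling up an exception
        function redisSelect($host, $port, $timeout, $dbIndex, $retries = 2) {
            for ($i = 0; $i < $retries; $i++) {
                $redis = new Redis();
                try {
                    if ($redis->connect($host, $port, $timeout)) {
                        $redis->select($dbIndex);
                        return $redis;
                    }
                } catch (RedisException $e) {
                    // stale or half-closed socket; loop and reconnect
                }
            }
            throw new RuntimeException("Redis unreachable after $retries attempts");
        }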

  • INSERT DELAYED on locked tables blocks PHP processes

    - by sw0x2A
    Our webservers write some tracking information into a MySQL database (using INSERT DELAYED into a MyISAM table). When a huge SELECT query is executed on this table, or when it is locked for another reason, the webserver processes doing the INSERT DELAYED end up waiting for the database, and in some cases the MaxServer limit is reached in Apache, so it stops serving requests. We use INSERT DELAYED because:

        "The DELAYED option for the INSERT statement is a MySQL extension to standard SQL that is very useful if you have clients that cannot or need not wait for the INSERT to complete. This is a common situation when you use MySQL for logging and you also periodically run SELECT and UPDATE statements that take a long time to complete." (MySQL documentation)

    I am wondering why the Apache processes are waiting for the INSERT DELAYED to finish, and what I can do to just send the data and forget about it. (Since this is logging data, I do not care if we lose some entries.) Even when the table is locked, the PHP script should just go on and not wait for an answer from MySQL. (We do not want to set up master-slave replication for this table, and we are thinking about moving this data to a NoSQL database, but for now I would like to know why INSERT DELAYED is not working as expected.)
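    One fire-and-forget pattern that sidesteps MySQL entirely at request time is spooling to a local file and bulk-loading later (a sketch; the spool path, field names, and the cron-driven load step are made up for illustration):

        <?php
        // append one tab-separated tracking line; no database round-trip here
        $line = implode("\t", [time(), $userId, $event]) . "\n";
        file_put_contents('/var/spool/tracking/current.log', $line,
                          FILE_APPEND | LOCK_EX);

        // a cron job then rotates the file and loads it in one statement, e.g.:
        //   mysql -e "LOAD DATA INFILE '/var/spool/tracking/rotated.log'
        //             INTO TABLE tracking"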

  • Postgresql Internals - Documentation

    - by NogginTheNog
    I'm looking for up-to-date information about PostgreSQL internals, specifically the query optimizer. I've found this link (referred to in the "Further Reading" section of the 8.4 docs): http://db.cs.berkeley.edu//papers/UCB-MS-zfong.pdf but it seems quite old. That in itself is not a problem, but I want to be sure the information is still relevant. Is this the best resource for understanding how PostgreSQL processes queries (using plans, statistics, etc.), or are there others?

  • Dropbox doesn't recognize camera (for Camera Upload)

    - by Lee
    I'm running OS X 10.6.8 on a 13" MacBook Pro (2011 model), trying to use Camera Upload in Dropbox 1.4.7. When I connect my Sony HDR-XR260 video camera or my BlackBerry Torch 9800 via USB, or even an SD card via the built-in reader, Dropbox never recognizes any of it, and the pop-up dialog asking whether I want to import my videos never appears. I do have the option enabled in Dropbox's preferences. Any solutions?

  • disk write cache buffer and separate power supply

    - by HugoRune
    Windows has a setting to turn off the write-cache buffer (see image):

        Turn off Windows write-cache buffer flushing on the device
        To prevent data loss, do not select this check box unless the device has a separate power supply that allows the device to flush its buffer in case of power failure.

    Is it feasible and economical to get such a "separate power supply" for the internal SATA drives of a non-server PC? Under what name is such a power supply sold? I know there are UPS devices that can be connected to external drives, but what is required to be able to switch this setting on safely for an internal disk? The setting has different descriptions in different versions of Windows:

        Windows XP ("Enable write caching on the disk"): This setting enables write caching in Windows to improve disk performance, but a power outage or equipment failure might result in data loss or corruption.

        Windows Server 2003 ("Enable write caching on the disk"): Recommended only for disks with a backup power supply. This setting further improves disk performance, but it also increases the risk of data loss if the disk loses power.

        Windows Vista ("Enable advanced performance"): Recommended only for disks with a backup power supply. This setting further improves disk performance, but it also increases the risk of data loss if the disk loses power.

        Windows 7 and 8 ("Turn off Windows write-cache buffer flushing on the device"): To prevent data loss, do not select this check box unless the device has a separate power supply that allows the device to flush its buffer in case of power failure.

    This article by Raymond Chen has some more detailed information about what the setting does.

  • Network folder image thumbnails taking long time to load

    - by Steve
    Our internal network folder can take up to 5 seconds to generate a thumbnail for each photo it contains. Usually only 10% of the thumbnails actually display; the rest are default JPG icons. A thumbs.db file already exists inside this folder, so presumably the thumbnails have been generated before. Why do they have to be generated again? The PC is Windows 7 64-bit; the server is Windows Server 2003 SP2.
