Search Results

Search found 40933 results on 1638 pages for 'database tools'.


  • Use Windows 7 offline sync with an external USB HD

    - by René
    Yes, the whole question is really in the title. Is there a way to use Windows 7 offline sync (which we know from network-mapped drives) with an external USB HD? If not, are there similar built-in tools or good third-party tools? My scenario: I want to buy an ultrabook with an SSD, which is fairly limited in space, so I'm going to put all files on an external HD and only store current projects on the local SSD. Let's say I have to change projects: it would be easy to just change the sync folders and have the second project synced to my HD too. With network-mapped drives it's that easy. Paths do not differ when the drive is offline, so in most situations you don't even notice that a folder is offline, and you only have to activate offline files for the folders you currently need for work. So, is there a similar solution for USB hard drives?
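
    If offline files can't be pointed at a USB drive, one rough stand-in is robocopy, which ships with Windows 7 and can keep a one-way mirror of a project folder; a sketch with placeholder paths:

        REM mirror the current project from the SSD to the external drive
        robocopy C:\Projects\CurrentProject E:\Projects\CurrentProject /MIR /FFT /R:2 /W:5
        REM swap projects by mirroring the other one back onto the SSD
        robocopy E:\Projects\OldProject C:\Projects\OldProject /MIR /FFT /R:2 /W:5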

    Read the article

  • Change default code page of Windows console to UTF-8

    - by Regent
    Currently I'm running Windows 7 x64 and I usually want all console tools to work with UTF-8 rather than with the default code page 850. Running chcp 65001 in the command prompt prior to using any tools helps, but is there any way to set it as the default code page? Update: Changing the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Nls\CodePage\OEMCP value to 65001 appeared to make the system unable to boot in my case. The proposed change of HKEY_LOCAL_MACHINE\Software\Microsoft\Command Processor\Autorun to @chcp 65001>nul served my purpose just fine. (thanks to Ole_Brun)
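
    For reference, a sketch of setting that Autorun value from the command line (the value name and data are exactly those mentioned above; the HKLM variant needs an elevated prompt, HKCU limits the change to the current user):

        REM per-user: run chcp 65001 automatically for every new console
        reg add "HKCU\Software\Microsoft\Command Processor" /v Autorun /t REG_SZ /d "@chcp 65001>nul" /f
        REM or machine-wide, matching the HKLM key discussed above
        reg add "HKLM\Software\Microsoft\Command Processor" /v Autorun /t REG_SZ /d "@chcp 65001>nul" /f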

    Read the article

  • Debian Squeeze can't install php-pear

    - by Lennier
    I use Debian 6.0.6. Running sudo apt-get install php-pear results in: Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: initscripts : Breaks: console-setup (< 1.74) but 1.68+squeeze2 is to be installed Breaks: initramfs-tools (< 0.104) but 0.98.8 is to be installed Breaks: nfs-common (< 1:1.2.5-3) but 1:1.2.2-4squeeze2 is to be installed keyboard-configuration : Breaks: console-setup (< 1.71) but 1.68+squeeze2 is to be installed klibc-utils : Breaks: initramfs-tools (< 0.103) but 0.98.8 is to be installed E: Broken packages. How can I solve this?
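
    The Breaks lines suggest some packages from a newer release (wheezy/sid) may be mixed in with squeeze. A few hedged diagnostic commands that could help pin it down:

        # check which releases the apt sources point at
        grep -rn '^deb ' /etc/apt/sources.list /etc/apt/sources.list.d/ 2>/dev/null
        # see where the conflicting packages come from and which versions are installed
        apt-cache policy initscripts keyboard-configuration klibc-utils console-setup initramfs-tools
        # try to repair any broken/half-configured state, then retry
        sudo apt-get -f install
        sudo apt-get update && sudo apt-get install php-pear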

    Read the article

  • ASP Guidance - Development on laptop with limited internet access (nonhosted)

    - by Joshua Enfield
    I am fairly experienced with the .NET family of languages, as well as web development (from a PHP perspective). I am home for winter break and have limited internet access, but would like to learn ASP.NET using C#. Am I able to do development for ASP.NET (and see the results) for free on my laptop (with no internet access), and if so, what tools do I need? Ideally I'd like to do the development in Visual Studio and see the results in my browser via localhost. Suggestions for any extra tools I might need would be helpful as well.

    Read the article

  • View changelog of all packages to be upgraded before upgrading

    - by Stein G. Strindhaug
    When using Synaptic on my Ubuntu desktop computer I can review the changelogs of all the packages to be upgraded, and deselect a package from the upgrade if I want. On my desktop I usually install everything, but I like to at least review what the changes are so that I can delay an upgrade if I suspect it could cause problems with the development tools I use. On a server (Ubuntu Server) with no X server, how can I do the same thing on the console: list all packages that will be upgraded (apt-get --dry-run upgrade does this, along with a lot of noisy simulated install messages), view the changelog (if any) from the currently installed version to the version it will be upgraded to, and select which packages I want to ignore and which I want to upgrade. I've searched a lot for this but haven't found anything; possibly I'm not using the correct terminology, but surely this must be possible. Synaptic must get its info from some lower-level tool, I assume? Complicated shell scripts are welcome too, if this is not already easily done with the existing tools.
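
    A hedged sketch of how this could be stitched together on the console (the package name is a placeholder; apt-get changelog needs a reasonably recent apt, and the apt-listchanges package is an older alternative):

        # list only the packages that would be upgraded
        apt-get -s upgrade | awk '/^Inst/ {print $2}'
        # read one package's changelog before deciding
        apt-get changelog openssl | less
        # put a package on hold so "apt-get upgrade" skips it, and release it again later
        echo "openssl hold" | sudo dpkg --set-selections
        echo "openssl install" | sudo dpkg --set-selections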

    Read the article

  • Is the SQL backend the right choice for LDAP?

    - by skomak
    Hi, I have run into some trouble with my LDAP database after unexpected system reboots. The database is only ever read from, so it is puzzling why it has had errors. So I'm searching for a replacement for this backend, and I think SQL would be more reliable. What do you think, is it? I also need to know how much of a performance loss I would take, and roughly how many more IOPS (I/O operations per second), in percentage terms, it would need. Thanks in advance, skomak

    Read the article

  • Connecting SQL 2005 to Oracle 10g

    - by Lorn
    Environment: an Oracle 10g database on a Windows Server 2003 32-bit machine, and a SQL Server 2005 database on a Windows Server 2003 32-bit machine. I am trying to connect the two databases through heterogeneous services and have updated the following files: tnsnames.ora, listener.ora and hs.ora. When performing a test connection from SQL Developer, I get the following error - ORA-28500 - indicating that the login for the SA user is incorrect. I also tried another authenticated user that has rights to the database. I can successfully connect with SQL Server 2000. Has anyone experienced such a problem before?
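
    One frequently cited cause of ORA-28500 against SQL Server is credential case: unless they are double-quoted in the database link definition, the user name and password are passed in uppercase. A hedged sketch to try from SQL*Plus on the Oracle side (link name, password and DSN are placeholders):

        CREATE DATABASE LINK mssql_link
          CONNECT TO "sa" IDENTIFIED BY "YourPassword"  -- double quotes preserve case
          USING 'MSSQL_HS_ENTRY';
        -- then test with a simple query against a known SQL Server table via @mssql_link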

    Read the article

  • When to use MySQL replication or DRBD for HA on Xen VM?

    - by user62513
    I'm setting up a database which needs to provide high availability. My primary concern is high performance and robustness (I don't want something that fails fast and badly). The database is accessed by the application at an average of 300 qps. It will run on Xen VMs and it has some InnoDB tables as well as MyISAM tables. The VMs are connected via 100 Mbit/s Ethernet. Which of the two - MySQL replication or DRBD - would you recommend in such a situation? Or should I use DRBD to make the master database highly available and use MySQL replication on the slaves? I'm a developer, so these things are not easy for me to make a sound judgement on.
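
    For comparison purposes, a minimal MySQL replication sketch (host names, credentials and binlog coordinates are placeholders; the master also needs server-id and log-bin enabled in my.cnf):

        # on the master: create a replication account and note the binlog position
        mysql -e "GRANT REPLICATION SLAVE ON *.* TO 'repl'@'192.168.0.%' IDENTIFIED BY 'secret';"
        mysql -e "SHOW MASTER STATUS;"
        # on the slave: point it at the master using the file/position shown above
        mysql -e "CHANGE MASTER TO MASTER_HOST='master-vm', MASTER_USER='repl', MASTER_PASSWORD='secret', MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=107; START SLAVE;"
        mysql -e "SHOW SLAVE STATUS\G"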

    Read the article

  • Cross subnet connection [closed]

    - by user30472
    My internal Windows 2008 AD network is 172.20.xxx.xxx. My Apple wireless base station only allows DHCP in the 172.16.xxx.xxx range. Private IP address ranges: 10.0.0.0 to 10.255.255.255, 172.16.0.0 to 172.31.255.255, 192.168.0.0 to 192.168.255.255. The problem: the internet works from my iPad, which has a 172.16.xxx.xxx address, but I can't access (browse to) my tools server at 172.20.xxx.xxx, which hosts my FileMaker database. Is it possible to add the 172.16.xxx.xxx range somewhere in DHCP or DNS on my Windows domain server so I can access the tools server? Or is there another way to make this connection work? Thanks

    Read the article

  • TFS 2010 Check-in Policies

    - by Liam
    Currently we have check-in policies that are implemented by installing the TFS 2010 Power Tools on each developer machine. I was wondering if there is a way we could store those policies centrally within the TFS server itself and push them out to the developer machines in a group-policy fashion, without having to install anything additional on the developer machines, as realistically we only want the Power Tools on a couple of people's machines. I can't seem to find any documentation on how to do this, or on whether it's possible, so if someone could point me in the right direction I'd be very grateful.

    Read the article

  • SQL Server 2005/2008 Licensing Decision

    - by Hakim
    Hello, I have purchased a dedicated server from a reputable hosting company. They only have the Windows Server 2008 OS installed on it and no SQL Server. The server configuration is an Intel dual-core processor with 2GB of RAM and a 100GB HDD. I want to host my web services on that server, using MS SQL Server 2005 at the back end. There are multiple web services, each using a different database. Microsoft has CAL-based licensing, which I understand is based on the number of users accessing the database directly (I may be wrong), but my users will access the web services and have no direct connection to the database as such, and the number of users accessing the web server cannot be known and is not under my control. Which licensing is best suited for this kind of setup? I don't need analysis and BI services right now, but I may want to add them in the future. Any help will be appreciated. Thanks

    Read the article

  • Fastest booting Linux distribution on a live CD

    - by Avindra Goolcharan
    I'm looking for a Linux distro with the following: boots quickly, as fast as possible; has the expected tools such as a file browser, a web browser, etc.; doesn't need to have extraneous recovery stuff such as partition editors and whatnot. These are the tools I have and use already: ophcrack, Ultimate Boot CD for Windows (UBCD4Win), chntpw (Offline NT Password and Registry Editor), Hiren's BootCD, gparted or Parted Magic, Ubuntu, nubuntu. Any and all suggestions are welcome :-) The primary objective is to get a quick-booting Linux distro that I can grab / delete / move / copy files with. Currently I prefer using ophcrack; it boots relatively fast and I can manipulate files well. The one that takes the longest is Ubuntu, of course.

    Read the article

  • Mediawiki create user error after migration

    - by ing0
    So I had MediaWiki installed on Windows with MySQL (running on AWS RDS). I've since moved it to a Debian server for various reasons, but I think I've messed up the database because of the different versions of MediaWiki I have used. The Windows install was v1.20alpha (58f390e). The new Debian install is v1.15.5-2squeeze4. I've tried to update Debian but it doesn't find an update, so is this the latest squeeze version? Everything seems to work OK except adding users: it gives me a database error, so I ran php maintenance/update.php, which ran some things OK but didn't make a difference. I don't think I've taken the correct approach to this sort of move; does anyone know of a better way of doing it? I still have the old wiki running - but not used - on Windows (using the same database), so I could always try this again.

    Read the article

  • Repairing a TFS 2005 install or starting anew.

    - by Johan Buret
    Following: Installing Team Foundation Server on a shared database instance. We did use a shared database instance setup for TFS 2005, and that was not a good idea because of the Reporting Services dependency; the reporting instance on the server gives error code 404. What works now: basic source control - we're able to check source code in and out. What doesn't work: everything else, including opening and creating new team projects, build automation, and internal bug tracking. Goal: having a fully working TFS install while keeping the history. 1) A full install of TFS 2005 on the same server, but within its own database and reporting instance. 2) Using another server might be an option, but it's really not preferred. Downtime should be minimal, as my colleagues need to be able to work on the source. Readings: I've read the MSDN page about moving/restoring TFS 2005, but I'm still unsure about what to do. Thanks in advance for help.

    Read the article

  • Migrating Magento Concern

    - by Pankaj Upadhyay
    We have a Magento 1.5.0.1 store running at a hosting provider and now need to migrate it from that server to a new hosting provider. I talked with a technical guy from the new hosting provider, who told me to do the following: go into the cPanel Backup Wizard, make a FULL BACKUP and download the zip file, then upload that zip file to their server in my root folder, then tell them and they will do the restore. My concern: will everything work as expected? What about the connection strings and the database? Will the database be created automatically and work the same? Also, I read somewhere that version 1.5.0.1 used an older database format which might not work on newer MySQL versions - can this have an impact too? Should I proceed in the same manner, or do I need to take care of some additional things to ensure smooth running?
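
    One detail that may be worth a sanity check after the restore: Magento 1.x keeps its database credentials in app/etc/local.xml, so on the new server something like the following (run from the Magento root) would show what it will try to connect to:

        # show the database host, user and name Magento is configured to use
        grep -A8 '<connection>' app/etc/local.xml
        # clear cached configuration so any corrected settings are picked up
        rm -rf var/cache/*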

    Read the article

  • Solaris 10: Identify a PID and the CPU it's running on

    - by Marcus
    I have multiple instances of a database running on a Solaris system. I'd like to prove that each database process is being handled by a different CPU. Essentially, I want to be able to do something like ps -ef | grep <process_name> to get the PIDs and then run another command (if required) to identify the CPU... Is prstat able to do this? I'm assuming that as each database instance is started, each one uses a different CPU; I'm not sure if I'm understanding this correctly. The reason I want to do this is that Sun hardware has slow CPUs, but lots of them, so to get the best performance out of it I need to try and spread the load among the CPUs... Thanks
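
    A hedged sketch of commands that may help on Solaris 10 (the PIDs and processor ids are placeholders): mpstat shows how the load is spread across processors, and pbind can query or set explicit CPU bindings if you want to force each instance onto its own CPU.

        # list the processors in the system
        psrinfo
        # per-processor utilisation, sampled every 5 seconds
        mpstat 5
        # check whether a given database PID is bound to a particular processor
        pbind -q 1234
        # explicitly bind one instance per processor (processor id first, then PID)
        pbind -b 0 1234
        pbind -b 1 5678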

    Read the article

  • MySQL transfer / update (a bit specific)

    - by Jeff
    Before posting I dug through the whole site but didn't find help for my problem, so I hope someone can help... Facts: a 30 GB MySQL database on a remote server (about 20,000,000 rows); the data is updated once weekly on the local network (MySQL); I need to transfer the updated local database and replace the remote one; the connection is about 2 Mb (real Mb, not MB/s) up/down. The point is that I can't have any downtime on the remote MySQL server. Until now I have tried: Navicat data sync - OK, but takes about 3 days to finish; dbForge - OK, but needs 5 days to finish; a mysqldump transferred to the remote server and executed there - about a day, but a lot of downtime; rsyncing the database folder /mysql/lib/MY_DATABASE - 4 hours, but after that I always need to run a repair on the remote server, which takes about 2 hours, plus a lot of downtime; a mysqldump piped from the command line directly to the remote server - still not satisfied, many problems (I could give you more things that I tried...); MySQL replication - slow. Anyway, what is the best way to refresh the remote MySQL database on a weekly basis and at the same time have zero downtime and no huge server load? If you have any idea please share.
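
    One pattern that may fit the zero-downtime requirement, sketched with placeholder database and table names (and assuming a spare olddb schema exists to hold the displaced copy): load the weekly dump into a staging schema on the remote server, then swap the tables into the live schema atomically with RENAME TABLE, so the live data is never mid-import.

        # compress the weekly dump and load it into a staging schema over the slow link
        mysqldump --single-transaction --quick localdb | gzip | ssh user@remote "gunzip | mysql stagingdb"
        # atomically swap the freshly loaded table into place (repeat per table)
        ssh user@remote "mysql -e \"DROP TABLE IF EXISTS olddb.items; RENAME TABLE livedb.items TO olddb.items, stagingdb.items TO livedb.items;\""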

    Read the article

  • Facing difficulty with migrating from WordPress to Drupal

    - by rakibtg
    One of my blogs was built on WordPress, but now I want to use Drupal as the CMS for it. To do so I deleted all the WordPress files from my server, along with the database and MySQL user associated with the WordPress blog, and uploaded the Drupal files into the server directory where the WordPress files were. But when I open the blog it still shows the WordPress blog, even though everything has been deleted and the Drupal installation interface should appear instead. So I have re-checked my server directories and the database: there are no WordPress files, the WP database is gone, and only the Drupal files are there, but when I go to the blog to install Drupal the WordPress blog is still shown. I have checked the blog in several web browsers, so it is not a browser cache problem. My hosting server is Linux-based. I can't understand what to do - any ideas? Thanks

    Read the article

  • Programmatic, script-based, or command line method to change starting program for user on Windows Server 2000/2003?

    - by Joe Majsterski
    I have written an app that we want to distribute to a large number of customers to be used as the shell program when they log onto their server with a particular admin account. I have figured out how to change the starting program by going to Administrative Tools->Computer Management->System Tools->Local Users and Groups->Users, selecting the properties for the user, going to the Environment tab, and changing the program file name under "Starting program" to my new app. But is there a way I could do this with some code that could be sent out and run on all these servers?

    Read the article

  • User mapping lost after manual failover

    - by fordan
    I have two Microsoft SQL Server instances set up for mirroring, each with a number of databases. There are a number of logins and, for each database, one or more user/login mappings. When I restore a backup of a database I always have to redo the login/user mappings; I understand this, because the logins live at the server level. So after restoring the databases on the principal I redid the login/user mappings. This was not possible for the mirror because its databases were in the 'restoring' state. After a manual failover I could not use the databases because the user credentials were missing. This was not unexpected, so I did the login/user mapping again. I then did another manual failover to make the initial principal, which was now the mirror, the principal again. To my surprise I could not use the databases, because the login/user mappings were gone. Is this the expected behaviour?
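
    For what it's worth, this looks like the classic orphaned-users situation: the mirrored databases carry their users (and the users' SIDs) with them, but server-level logins never move, so a database user only lines up with a login that has the matching SID. A hedged sketch using sqlcmd (server names, login name, password and SID are placeholders; Microsoft's sp_help_revlogin script automates the same idea):

        REM on the principal: read the login's SID
        sqlcmd -S PRINCIPAL -Q "SELECT name, sid FROM sys.server_principals WHERE name = 'appuser'"
        REM on the mirror: create the login with that exact SID so the mapping survives failover
        sqlcmd -S MIRROR -Q "CREATE LOGIN appuser WITH PASSWORD = 'S0mePassword!', SID = 0x1234ABCD"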

    Read the article

  • Upgrading Fedora on Amazon to 12 but getting libssl.so.* & libcrypto.so.* are missing

    - by bateman_ap
    I am upgrading to Fedora 12 on an Amazon EC2 instance using the help here: http://www.ioncannon.net/system-administration/894/fedora-12-bootable-root-ebs-on-ec2/ I managed to do a 64-bit instance OK, however I am facing some problems with a standard one. On the final bit of the install from 11 to 12 I am getting an error: Error: Missing Dependency: libcrypto.so.8 is needed by package httpd-tools-2.2.1.5-1.fc11.1.i586 (installed) Error: Missing Dependency: libssl.so.8 is needed by package httpd-tools-2.2.1.5-1.fc11.1.i586 (installed) This is referenced in the comments at the link above, but all it says is: Q: Apache failed, or libssl.so.* & libcrypto.so.* are missing A: These versions are missing the symlinks they require. Easy fix, go symlink them to the newest versions in /lib. However, I am afraid I don't know how to do this. If it is any help, I tried running the command locate libssl.so and got: /lib/libssl.so.0.9.8b /lib/libssl.so.6
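
    Two hedged notes: yum resolves dependencies from package metadata, so a filesystem symlink by itself won't satisfy that "Missing Dependency" message, but the symlinks below are what the quoted advice literally describes for the runtime side. The library names are taken from the error and the locate output, and the libcrypto path assumes a matching 0.9.8b file exists:

        # see whether any available package actually provides the soname yum wants
        yum provides "libssl.so.8" "libcrypto.so.8"
        # the symlink workaround from the quoted comment (helps runtime loading only)
        ln -s /lib/libssl.so.0.9.8b /lib/libssl.so.8
        ln -s /lib/libcrypto.so.0.9.8b /lib/libcrypto.so.8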

    Read the article

  • How many visitors could my server handle?

    - by coolboycsaba
    I have a website and I want to host it on my own computer, but I'm wondering if the machine is good enough. The website checks if the user is logged in and then displays 15 items (title, description) from a MySQL database, plus the rating (stored in another database) and the comments (another database) for each item. It also displays some stats (number of items, comments), and there is an image for each item. My specs are: AMD Athlon 64 X2 Dual Core Processor 5600+ 2.90 GHz, 4.00 GB RAM, Windows 7 64-bit. So what do you think - how many visitors and items could it handle (at once, or daily)? My internet connection is good, around 7-10 Mb upload and the same download speed.
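
    Capacity will depend far more on the per-page database and application work than on the raw specs, so it may be more useful to measure it than to guess. A hedged sketch using ApacheBench (bundled with Apache; the URL and counts are placeholders):

        # 1000 requests, 50 concurrent, against the item listing page
        ab -n 1000 -c 50 http://localhost/index.php
        # watch the "Requests per second" and "Time per request" figures, then raise -c
        # until response times or error counts become unacceptable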

    Read the article
