Search Results

Search found 31356 results on 1255 pages for 'database backups'.


  • Optimal Disk Setup for OLTP SQL Server

    - by Chris
    We have a high-transaction (lots of reads and writes) database server (running SQL 2005) that is currently set up with a RAID 1 OS partition (C:) and a RAID 5 data/log/tempdb partition (D:). The C: has 2 drives and the D: has 4 drives. The server has around 300 databases ranging from 10 MB to 2 GB in size. I have been reading up on best practices for partitioning the disks, but would like some opinions on our setup since we are so limited in the number of disks. It seems like RAID 10 is popular, but I don't think we could use it with only 6 total disks to work with. Thanks. Update: I went with three RAID 1 arrays (2 disks each): Partition 1: OS, tempdb, backups; Partition 2: logs; Partition 3: data.
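
    Whatever layout you settle on, it can help to confirm where every data and log file actually lives before and after the move. A minimal sketch, assuming Python with pyodbc and a trusted connection to the instance (the driver name and server are placeholders):

    ```python
    import pyodbc  # assumption: pyodbc and a SQL Server ODBC driver are installed

    conn = pyodbc.connect("DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes")

    # sys.master_files lists every data/log file for every database on the instance,
    # so you can check which physical drive each one sits on
    for db, name, physical in conn.execute(
        """
        SELECT DB_NAME(database_id), name, physical_name
        FROM sys.master_files
        ORDER BY physical_name
        """
    ):
        print(f"{physical}  ({db}: {name})")
    ```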

    Read the article

  • Routers with USB plug to connect external storage

    - by sixtyfootersdude
    I am just about to buy a new wireless router. I want to be able to hook up a hard drive to it and let that drive serve the entire network. I will mostly be storing media and some backups on the drive. I know I could get some kind of NAS, but I would prefer to just hook up one of my many unused hard drives directly to my router. It looks like D-Link has several products that do this using SharePort. If you wanted network storage, how would you do it: with a NAS, or with a router that has a USB port? Are these systems robust? What router would you buy?

    Read the article

  • Limited access to Amazon S3 buckets

    - by Tomas Markauskas
    Is it possible to somehow limit the access to an Amazon S3 account? I don't really like the idea of distributing my secret access key to all of my applications that only need to access a single bucket on my account. If someone gains access to one of the applications, I could lose all my data stored on S3. One way I was thinking of doing it would be to create a second S3 account and give it access to just one bucket of the main account, but that's not really a great solution. Another nice thing for me would be to give the secondary account only write (but not modify/delete) and read access. That way I could upload backups or other files and be sure that they won't get lost.
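
    These days the usual answer to this is AWS IAM: keep the root keys to yourself and create a per-application user whose policy only covers the one bucket. A minimal sketch with boto3 (the bucket and user names are hypothetical); note that s3:PutObject still allows overwriting an existing key, so the "write but never lose data" part also needs bucket versioning turned on:

    ```python
    import json
    import boto3  # assumption: boto3 is installed and admin credentials are configured

    BUCKET = "my-backup-bucket"  # hypothetical bucket name

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {   # read and upload objects in this one bucket only
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:::{BUCKET}/*",
            },
            {   # allow listing the bucket, but nothing else (no delete, no policy changes)
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{BUCKET}",
            },
        ],
    }

    iam = boto3.client("iam")
    iam.create_user(UserName="backup-uploader")
    iam.put_user_policy(
        UserName="backup-uploader",
        PolicyName="single-bucket-read-write",
        PolicyDocument=json.dumps(policy),
    )
    key = iam.create_access_key(UserName="backup-uploader")["AccessKey"]
    print("give this key pair to the application:", key["AccessKeyId"], key["SecretAccessKey"])
    ```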

    Read the article

  • Is it secure to store the cert/key on a private AMI?

    - by Phillip Oldham
    Are there any major security implications to bundling a private AMI which contains the private key/certificate and environment variables? For resiliency I'm creating an EC2 image which should be able to boot and configure itself without any intervention. After boot it will attempt to: attach and mount specific EBS volume(s); associate a specific Elastic IP; and start issuing backups of the EBS volume(s) to S3. However, to do this it will need the private key/pem files and will need certain environment variables to be available on start-up. Since this is a private AMI, I'm wondering if it will be "safe" to store these variables/files directly in the image so that I don't need to specify any user-data information and can therefore start a new instance remotely (from my iPhone, if needed) should the instance be terminated for any reason.
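
    Whether baking the secrets in is acceptable mostly depends on who can launch, copy or snapshot that AMI; a common compromise is to keep the image generic and pull the sensitive bits at boot from the instance metadata service (user-data, or an IAM role today). A minimal sketch of the fetch-at-boot side, assuming the metadata service is reachable in the usual IMDSv1 way:

    ```python
    import urllib.request

    METADATA = "http://169.254.169.254/latest"  # only reachable from inside the instance

    def fetch(path):
        with urllib.request.urlopen(f"{METADATA}/{path}", timeout=2) as resp:
            return resp.read().decode()

    # user-data supplied at launch time, e.g. an encrypted blob or a pointer to where
    # the real key/pem material can be downloaded from (404s if no user-data was set)
    user_data = fetch("user-data")
    instance_id = fetch("meta-data/instance-id")

    print(f"{instance_id}: received {len(user_data)} bytes of boot-time configuration")
    ```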

    Read the article

  • Suggestions? Password & Encrypted Read/Write File like a Mac (.dmg or .SparseBundle) also R/W on Windows, Ubuntu

    - by Jeff Drew
    For years I have used .dmg or .sparsebundle images (encrypted and password protected) to safely keep home directory backups on my Mac. Now I am looking for a similar full-permissions, read/write container that maintains an encrypted, password-protected file and is tri-platform. I'd like the future ability to use it on Mac OS X, Windows 7/8, and Ubuntu (current releases and later). I appreciate your recommendations. Thank you. (I like mounting a DMG and having a file directory structure that can be easily maintained and organized, then un-mounting the file when done.) (I've seen Windows tools that claim to open encrypted DMG files, and I will explore those options, but since I also want to keep the file accessible on all three OSes, someone might have additional suggestions.)

    Read the article

  • Recommended Tape Library Backup software

    - by D4
    Hi, I recently "inherited" a tape library (PowerVault 136T / Scalar 100), and I am asking for some advice on the backup software to manage the library. My goal is to be able to manage backups of all my servers (Linux and Windows) and also back up VIPs' laptop computers over the network. I am hoping for a GUI application since I will not be the one managing the process after a couple of months... Any idea is more than welcome... thanks in advance.

    Read the article

  • Recovering from bad ownership

    - by Christian Sciberras
    I was going to change the ownership of a directory to apache:apache, but I ended up running: chown -R apache:apache / Bad! Very bad! I knew what was going on when it started saying: chown: changing ownership of `/proc/2694/fd/48': Permission denied That's when I stopped everything (Ctrl+C). The current system I have is a server running VirtualBox running CentOS 5; this problem happened inside the VM. Currently everything seems to be working, but I have not restarted the system yet, and to be honest, I'm afraid that if I do, something will break. I do not know what order chown walked the filesystem in; should I be concerned and assume something will break after a reboot? Is there a way to recover from this problem without having to rely on backups? I do have a daily one, but I thought there might be a simpler way out.

    Read the article

  • Why can't I create a Windows backup on my secondary disk?

    - by Brian Sullivan
    I've installed Windows 7 Ultimate on an SSD that I've added to the XPS desktop that I bought from Dell. I would like to use the built-in backup functionality to create incremental backups and store them on the large drive that came with the machine. I formatted the large drive and turned it into a basic disk. However, when I try to set the backup location to the large internal disk (E:) in the "Set up backup" wizard, I get a message saying, "A system image cannot be saved on a drive that your computer boots from or that Windows is installed on." Windows is not installed on that disk. I even deleted the OEM partition that was on the disk, and removed it completely from the boot order in the BIOS. Any clue why Windows is griping at me about this?

    Read the article

  • ext4: error loading journal

    - by cloudyOutside
    I have an external hard drive with two partitions: a small FAT32 partition, which is mostly empty and works fine, and a large ext4 partition with tons of data, most of which isn't backed up. The ext4 partition is visible but can't be mounted; I get an "error loading journal" error. The drive is a Western Digital Caviar Blue 500GB. Roughly 30GB of that is FAT32 and the rest is the ext4. The light on the enclosure (made by Cavalry) turns red when reading from the bad partition. There wasn't any warning, but coincidentally, I've been thinking lately that I should get two large-capacity drives for real backups. Is there anything that can be done? I'm not even sure I have enough storage to back everything up, even if it is recoverable.

    Read the article

  • Painless way of consolidating files across multiple machines/OSes

    - by 5arx
    Just bought a NAS. So I thought I'd get all our photos, media files and PDFs consolidated, de-duplicated, de-junked and virus-checked and stick them all on it. We have 3 laptops, one running Windows, the others OS X. We have a file server running Windows - it was the result of an earlier attempt at a networked file server - and a Mac Pro that is also kind of a server (previous attempts at this job have resulted in most of our stuff being on it). There are also memory cards/sticks, CD backups and so on. I would be grateful if anyone could suggest a strategy or, ideally, tool(s) I could use to solve this problem. It is probably no more than one or two terabytes of data in total, but I can imagine that going through it all manually, file by file, may well drive me insane.
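
    For the de-duplication pass specifically, a small script that hashes file contents and groups identical files is often enough before anything gets copied to the NAS. A rough sketch in Python (the source folders are hypothetical):

    ```python
    import hashlib
    import os
    from collections import defaultdict

    ROOTS = [r"C:\Users\me\Pictures", "/Volumes/MacPro/Media"]  # hypothetical source folders

    def sha256(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    by_hash = defaultdict(list)
    for root in ROOTS:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                try:
                    by_hash[sha256(full)].append(full)
                except OSError:
                    pass  # unreadable file, skip it

    for digest, paths in by_hash.items():
        if len(paths) > 1:
            print("duplicates:", *paths, sep="\n  ")
    ```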

    Read the article

  • How Do You Stress-Test Your Hard Drives?

    - by MetaHyperBolic
    When looking for large new drives (>= 1 TB) on Newegg and the like, I note a number of reviews talking about drives being either D.O.A. or hitting the Click of Death (or even releasing the Magic Smoke) within a week or so of use. A portion of the reviews mention this phenomenon whether the drive in question is Western Digital or Hitachi or whatever. For those of you using Windows, what do you do to: 1) place a large initial stress on the drive to see if it can take it, and for how long? 2) test the drive afterwards (presumably with some sort of S.M.A.R.T. tool or other) to see if any negative changes have been noted? Note: this is one component of a larger plan for both high availability and backups for my home data.
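
    For the initial stress part, one low-tech option is to fill the drive with known data, flush it, read it all back and compare checksums, then review the SMART attributes (reallocated and pending sector counts in particular) with smartctl or the vendor tool. A rough Python sketch of the write/verify pass, assuming a spare drive mounted as E: (path and size are placeholders):

    ```python
    import hashlib
    import os

    TARGET = r"E:\burnin.bin"   # hypothetical path on the drive under test
    SIZE_GB = 100               # how much to write; use more than your RAM so reads hit the disk
    CHUNK = 8 * 1024 * 1024     # 8 MiB blocks

    # write pass: deterministic data so the read pass can be verified
    block = hashlib.sha256(b"burn-in").digest() * (CHUNK // 32)
    write_hash = hashlib.sha256()
    with open(TARGET, "wb") as f:
        for _ in range(SIZE_GB * 1024 // 8):
            f.write(block)
            write_hash.update(block)
        f.flush()
        os.fsync(f.fileno())

    # read pass: re-hash what actually came back off the drive
    read_hash = hashlib.sha256()
    with open(TARGET, "rb") as f:
        for chunk in iter(lambda: f.read(CHUNK), b""):
            read_hash.update(chunk)

    print("OK" if read_hash.digest() == write_hash.digest() else "MISMATCH - drive suspect")
    ```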

    Read the article

  • VPS hosting for a social network

    - by Jana
    Hi, I've developed a social network and I've been using shared hosting for it since it was launched. With that I wasn't able to send emails in bulk for things like newsletters and "invitations to join my site". Most importantly, most of the mails I send end up in users' spam folders. I'm planning to move to a VPS, as it may not have such limits. I'm wondering what the cheapest VPS host available is. I'm not very familiar with Linux commands and am looking to cPanel to do the work for me. Will the following configuration suit a new social network like mine, which has little load? 1000 MHz guaranteed CPU, 512 MB guaranteed RAM, 20 GB (RAID) disk space, 1000 GB/month bandwidth, 2 IPs, 5 backups, semi-managed. Thanks in advance

    Read the article

  • How do I make a backup of a live server?

    - by Jurily
    At my new job, I have a production server with the following qualities: Windows (XP, I think) on ancient hardware; an absolutely vital database; no backups whatsoever; everyone in the company has full admin rights, and the passwords are stored in a .txt on the global share; no installers, except for the OS; the machine itself is sitting on a wooden shelf 5 feet above the ground against an external wall with frequent truck traffic on the other side, and the shelf is already bent from the constant load; it hasn't been rebooted in $DEITY knows how long, and my predecessor wasn't even sure it would survive a reboot; a UPS is installed, but since everything is hooked up to it, it would last 10 minutes tops; no spare parts or hardware budget. How do I make a full backup with minimal impact on the server? I'm not sure how close it is to a total meltdown. For all I know, plugging in a USB stick could kill the company, and of course it will be all my fault, since "it was running fine before you touched it". The ideal solution would be a VM, so I have a test environment as well (separate, of course).

    Read the article

  • stsadm farm backup exits with ffffffff

    - by overbyte
    I have a SharePoint 2007 farm that uses stsadm through Scheduled Tasks to run farm backups. It always worked fine; however, one day it ran for a couple of seconds and just exited with code ffffffff. I looked at Event Viewer and the SharePoint logs themselves, and nothing unusual happened at the time this job ran. No files were created, so an spbackup.log doesn't exist. I searched the net for batch files and STSADM return codes, but the error code isn't even documented. Any other recommended place to look for issues like this?
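
    One way to get more to go on than the bare task result is to wrap stsadm in a small script that logs its output and raw exit code on every run. A hedged sketch (the switches are the standard SP2007 farm-backup ones, but the paths and share are placeholders for your environment):

    ```python
    import datetime
    import subprocess

    STSADM = r"C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe"
    CMD = [STSADM, "-o", "backup", "-directory", r"\\backupserver\spfarm", "-backupmethod", "full"]
    LOG = r"C:\Logs\spfarm-backup.log"  # hypothetical log location

    with open(LOG, "a") as log:
        log.write(f"--- {datetime.datetime.now().isoformat()} ---\n")
        result = subprocess.run(CMD, capture_output=True, text=True)
        log.write(result.stdout)
        log.write(result.stderr)
        # anything non-zero is a failure; 0xffffffff will show up here as a large number or -1
        log.write(f"exit code: {result.returncode}\n")
    ```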

    Read the article

  • Rsync-like Windows backup tool

    - by Halfgaar
    I need to back up some Windows machines and have been unable to find the proper tool. What I need is a tool that does efficient copying of changed files to a Windows network location, like rsync does. In turn, the server will then back that up using rdiff-backup, a tool which does very clever incremental backups. Right now I'm using Windows 7's included backup feature, but I really don't get on with it; the details are too off-topic here, but it doesn't suffice (and seems buggy as well). I looked into Amanda, but as soon as it wanted to install MySQL, I aborted. I also tried DeltaCopy, but unfortunately I don't remember what the problem with that was... Any advice for an rsync-like tool that just does daily syncs to a network location?
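
    If nothing off the shelf fits, the "only copy what changed" part is also easy to script yourself; the sketch below uses the same quick check rsync defaults to (size plus modification time) against a network share (paths are hypothetical):

    ```python
    import os
    import shutil

    SRC = r"C:\Users\me\Documents"          # hypothetical source
    DST = r"\\server\backups\documents"     # hypothetical network destination

    def changed(src, dst):
        if not os.path.exists(dst):
            return True
        s, d = os.stat(src), os.stat(dst)
        # same quick check rsync uses by default: size plus modification time
        return s.st_size != d.st_size or int(s.st_mtime) > int(d.st_mtime)

    for dirpath, _dirs, files in os.walk(SRC):
        rel = os.path.relpath(dirpath, SRC)
        target_dir = os.path.join(DST, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            if changed(src, dst):
                shutil.copy2(src, dst)  # copy2 preserves mtime so the next run can skip it
    ```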

    Read the article

  • Backing up a Linux VPS with RSync to Vista

    - by Frank
    I've been working to set up a Linux VPS to host a couple of WordPress sites and eventually a Mercurial server. I've set up one site and things have gone well. However, before I start moving other things to the VPS, I need to set up a backup solution. My provider, Linode, suggests rsync (among a couple of other options) for backups. I've seen a few posts on this site that suggest other backup solutions, including going to the Amazon cloud, but that costs money, and the VPS is all the money I want to spend on this for the time being. So, to help solve that, I want my backup computer to be my home desktop computer. Assuming I'm using rsync, is it possible to use my Vista-based home computer as the destination for the backup? And if it is possible, what type of command or connection would I need to configure on the Vista machine? Any insight would be helpful. It's probably obvious, but I've never used rsync.
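
    It is possible: DeltaCopy and cwRsync both package rsync for Windows, so the Vista box can run an rsync daemon (or an SSH server) and the VPS pushes to it on a schedule. A hedged sketch of the push side, wrapped in Python so it can sit in cron (the host name and module are hypothetical and assume a DeltaCopy-style daemon on the Vista machine):

    ```python
    import subprocess
    import sys

    SRC = "/var/www/"                           # what to back up on the VPS
    DST = "backupuser@home.example.com::vps"    # hypothetical rsync-daemon module on the Vista machine

    cmd = [
        "rsync",
        "-avz",        # archive mode, verbose, compress over the wire
        "--delete",    # mirror deletions so the copy tracks the source
        SRC,
        DST,
    ]
    sys.exit(subprocess.run(cmd).returncode)
    ```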

    Read the article

  • Home Server Restore

    - by Bryan Avery
    I have had to reinstall Home Server on my server and I would now like to restore it to the state it was in at the moment it stopped. I still have the old hard drive in the state it was last in, which is a small 250 GB disk. I have now installed 1.5 TB hard disks and a full licensed copy, as the original copy was a trial version. So I'm in a state where I have a new install and one of the old drives plugged in, but I can't transfer the old backups across; how do I do this?

    Read the article

  • Uninstall nginx on Ubuntu 10.04.3 LTS remote machine

    - by user831740
    I was given a server to set up; the problem is that this server was not reset, and the provider is quite slow at resetting it, so I have to completely uninstall some of the stuff it has, one piece of which is nginx. I had a few problems when setting up nginx on my local machine due to multiple installations of it, so I want to avoid the same mistake now. The problem is, I have no idea how nginx was installed here, and I need to remove it. When I access the server through SSH, I only see this folder: $HOME/backups/nginx. Any idea how to uninstall this? Whenever I Google it, I only come up with apt-get remove and so forth. Thank you
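
    Before removing anything, it is worth checking whether that nginx came from a package or from a source build, because the cleanup differs. A small Python sketch of that check (it just shells out to dpkg; on a source build, nginx -V shows the configure-time paths you would delete by hand):

    ```python
    import shutil
    import subprocess

    binary = shutil.which("nginx") or "/usr/sbin/nginx"

    # dpkg -S reports which package (if any) owns the binary
    owner = subprocess.run(["dpkg", "-S", binary], capture_output=True, text=True)

    if owner.returncode == 0:
        package = owner.stdout.split(":")[0]
        print(f"{binary} belongs to package '{package}'")
        print(f"remove it with: sudo apt-get purge {package}")
    else:
        print(f"no package owns {binary}; it was probably compiled from source")
        print("run 'nginx -V' to see the --prefix and --conf-path it was built with")
    ```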

    Read the article

  • Windows Home Server is showing signs of death

    - by Guy
    I have a Windows Home Server (an HP EX485 MediaSmart Home Server). It started acting up about 4 weeks ago, and a few days ago it complained of a corrupt database and asked whether I would like to try to recover it. Yes, I would, but I ended up losing all backups. I have to reboot it frequently for the client machines to be able to see it. I have 4 hard disks in the computer. I suspect that the primary hard disk is going bad. My first question: how can I confirm whether it's going bad or not? I'm thinking about removing the primary disk, replacing it with one of the others and reloading the OS with the server restore disk. I know that I'll lose everything (but I recently did anyway), but is there any other reason why I should not do this?

    Read the article

  • Auto backup a user folder to a usb when usb is plugged in

    - by Azztech Computers
    I'm a computer technician and help customers every day with their computers and smartphones, and I have a really basic (I think) request but don't know how to go about it. Customers always come in with broken phones, water damage, needing updates, or just wanting me to back up their information. I currently have a program that I use when I back up their computers; it backs up their iOS folder C:\Users\USER\AppData\Roaming\Apple Computer\MobileSync\Backup, but what I want is a quick, easy way to do this in customers' houses. What I require is a way so that when I plug in a USB drive, it AUTOMATICALLY searches for the folder and starts transferring it to a predefined folder on the USB drive. This way I can just plug it in and begin work on their computer or phone without the risk of losing their information. I'm sure there is a .bat/.ini file I could use, but I'm wondering if someone has already done this or something similar, as I would need it to search all the user folders, not just the one I'm logged into. Thanks in advance
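
    The copy itself is simple to script. The sketch below walks every profile under C:\Users, finds the MobileSync backup folder, and mirrors it onto the stick; the drive letter is a placeholder, and triggering it automatically on insert would still need a scheduled task on device arrival or a small launcher on the stick, since AutoRun for USB drives is disabled on current Windows:

    ```python
    import glob
    import os
    import shutil

    USB_ROOT = r"F:\phone-backups"  # hypothetical drive letter of the inserted USB stick
    PATTERN = r"C:\Users\*\AppData\Roaming\Apple Computer\MobileSync\Backup"

    for backup_dir in glob.glob(PATTERN):
        # keep backups from different user profiles separate on the stick
        user = backup_dir.split(os.sep)[2]
        dest = os.path.join(USB_ROOT, user, "MobileSync", "Backup")
        print(f"copying {backup_dir} -> {dest}")
        shutil.copytree(backup_dir, dest, dirs_exist_ok=True)
    ```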

    Read the article

  • How to delete "System Volume Information" folder from external drives?

    - by Nadude
    I'm running Vista Professional 32-bit on a Lenovo W500 ThinkPad. I have four external drives and use 4 different PCs; the drives all have System Volume Information folders taking up lots of space, and I can't delete them. I don't even know which computer's files are backed up on which external drive. I've used ThinkVantage Rescue and Recovery to delete all backups, and checked the System Restore settings so that only my main C: drive is used. I checked all the PCs to ensure only the main drive keeps system restore points, deleted previous versions, and ran Disk Cleanup too. But I can't figure out how to get rid of these large folders from my external drives.

    Read the article

  • Can I redeploy Citrix Xencenter Images in Amazon EC2?

    - by Mike Pinch
    Okay - I'm running Citrix XenServer (managed with XenCenter) in my datacenter. My understanding is that Amazon EC2 runs a heavily customized flavor of Xen. Based on that, I would like to know whether there is a method to utilize the .VHD disk images generated by XenCenter in the Amazon EC2 cloud. I realize that I can install tools into the OS and make my own images, but that is not what I'm looking for. I'd like the efficiency of taking the images and redeploying them, or at least converting and then redeploying them. I basically would like to have all my backups running in the cloud, so that is my motivation.
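
    AWS's VM Import/Export service accepts VHD images, so one route is to export the VM from XenCenter as a VHD, upload it to S3, and import it as an EC2 image. A hedged boto3 sketch of the import call (the bucket, key and the prerequisite "vmimport" service role are assumptions about your setup):

    ```python
    import boto3  # assumption: boto3 available; the VM Import/Export "vmimport" role already exists

    ec2 = boto3.client("ec2")

    # hypothetical bucket/key: the exported XenCenter .vhd uploaded to S3 first
    response = ec2.import_image(
        Description="XenCenter VM converted for EC2",
        DiskContainers=[
            {
                "Format": "VHD",
                "UserBucket": {"S3Bucket": "my-import-bucket", "S3Key": "exports/appserver.vhd"},
            }
        ],
    )
    print(response["ImportTaskId"])  # poll with describe_import_image_tasks until it completes
    ```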

    Read the article

  • Linux using the link command

    - by Xavier
    Here it goes: I have a folder called /data/backup on a filesystem that doesn't have much free space. I have been told that if I link that folder to a much bigger area, for example /bigdata/backup, I will still be able to run backups to /data/backup, because it will just be a link: the data will be visible in both folders but will actually live under /bigdata/backup, and since /bigdata/backup has far more disk space, the backup will no longer fail because of space problems on /data/backup. Is this true? Thanks, Xav
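
    That is exactly how a symbolic link behaves: /data/backup becomes a pointer, writes to it land on the filesystem behind /bigdata/backup, and free space is governed by the bigger filesystem. A small sketch of setting it up safely (it assumes any existing contents get moved over first):

    ```python
    import os
    import shutil

    OLD = "/data/backup"      # the path the backup jobs are configured to use
    NEW = "/bigdata/backup"   # where the data should physically live

    os.makedirs(NEW, exist_ok=True)
    if os.path.isdir(OLD) and not os.path.islink(OLD):
        # preserve anything already in the old location, then remove it
        shutil.copytree(OLD, NEW, dirs_exist_ok=True)
        shutil.rmtree(OLD)
    os.symlink(NEW, OLD)

    # both names now resolve to the same directory
    print(os.path.realpath(OLD))   # -> /bigdata/backup
    ```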

    Read the article

  • Smart backup software

    - by gisek
    I use a laptop on a daily basis. As I have a lot of important data on it, it would be nice to back up some directories every day. Can you recommend a specific application that would take care of it? Maybe there is an app that would instantly commit changes I make in a directory on my laptop to the backup folder? The important thing is that I have some big files (a few GB each) that have minor changes very often - I'm talking about VirtualBox disk images - and it would be nice if the software could handle that smartly. Also note that I'd like to store the backups on an external USB HDD, which sometimes isn't plugged in.
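
    Two details worth building in regardless of the tool: skip the run quietly when the USB disk is not mounted, and use a delta-capable backend (rdiff-backup, or rsync --inplace) so a small change inside a multi-GB VirtualBox image does not mean recopying the whole file. A minimal sketch of that wrapper, assuming rdiff-backup is installed and the disk mounts at a fixed point (paths are placeholders):

    ```python
    import os
    import subprocess
    import sys

    DEST_ROOT = "/media/backup-hdd"   # hypothetical mount point of the external USB HDD

    # skip quietly when the disk is not attached, so a daily cron/anacron job never errors out
    if not os.path.ismount(DEST_ROOT):
        sys.exit(0)

    # delegate the actual copying to a delta-capable tool; rdiff-backup keeps reverse
    # increments, so only the changed parts of a large disk image are stored each day
    subprocess.run(
        ["rdiff-backup", os.path.expanduser("~/important"), os.path.join(DEST_ROOT, "laptop")],
        check=True,
    )
    ```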

    Read the article
