Search Results

Search found 9847 results on 394 pages for 'cloud backup'.

  • SQL Server backup

    - by zzz777
    I have this backup sequence: Full-Backup-A, Transaction-Log-Backup-A, Transaction-Log-Backup-B (*) (this is the point I have to restore to), then Full-Backup-B. How do I do it? It seems that the only way is: Full-Backup-A, Transaction-Log-Backup-A, Transaction-Log-Backup-B, shut off client access, Transaction-Log-Backup-C, Full-Backup-B, allow client access. Are there any other ways to guarantee that nothing happened to the database between the last transaction log backup and the next full backup? I was thinking about: a. starting the transaction log backup simultaneously with the full backup; b. using differential backups while clients are connected and taking full backups only during a maintenance window; c. running replication and backing up the replica, stopping and restarting replication services at steps 4 and 7. I feel it is actually hopeless.
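    A minimal T-SQL sketch of the restore chain described above (database name and file paths are hypothetical placeholders):

      -- Restore the chain up to the desired point; only the last step recovers the database.
      RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\Full-Backup-A.bak' WITH NORECOVERY, REPLACE;
      RESTORE LOG MyDb FROM DISK = N'D:\Backups\Transaction-Log-Backup-A.trn' WITH NORECOVERY;
      RESTORE LOG MyDb FROM DISK = N'D:\Backups\Transaction-Log-Backup-B.trn' WITH RECOVERY;

      -- To fence off client activity before taking Full-Backup-B, a tail-log backup taken
      -- WITH NORECOVERY captures the remaining log and leaves the database in RESTORING state,
      -- so no changes can slip in between the last log backup and the next full backup.
      BACKUP LOG MyDb TO DISK = N'D:\Backups\Transaction-Log-Backup-C.trn' WITH NORECOVERY;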

    Read the article

  • Failed Backup Job With Backup Exec 12 and AOFO

    - by Mort
    I am backing up a Windows 2003 Small Business Server with SP2. We are running Backup Exec 12 with SP4. Recently the backup job started failing on the system state with the following error: V-79-57344-34110 - AOFO: Initialization failure on: "System?State". Advanced Open File Option used: Microsoft Volume Shadow Copy Service (VSS). Snapshot provider error (0xE000FE7D): Access is denied. To back up or restore System State, administrator privileges are required. Check the Windows Event Viewer for details. According to Symantec's website the error indicates a credential problem, but when I test the credentials they come back with no failures. I have found another forum post here referencing a similar error and have tried what was suggested with no successful results. I have also created new jobs based on new selection lists, again with no successful results. I suspect a recent update, possibly from Microsoft, may be causing this, but I have no idea which one. I am looking for feedback. Thanks.

    Read the article

  • How is"cloud computing"different from "client-server"?

    - by BellevueBob
    Watching a CEO for a new "cloud computing" company describe his company on a finance TV program today, he said something like "Cloud computing is superior to old-fashioned client-server computing". Now I'm confused. Can someone please explain what "cloud computing" means in contrast to "client-server"? As far as I understand it, cloud computing is more of a network services model, such that I do not own or maintain the physical hardware. The "cloud" is all the back-end stuff, but I still might have an application that communicates with that "cloud" environment. And if I run a web site that presents a form a user fills out, pushes a button on the page, and returns some report that was generated by the web server, isn't that the same as "cloud" computing? And would you not consider my web browser the "client"? Please note my question is specific to the concept of "cloud computing" with respect to "client-server". Sorry if this is an inappropriate question for this site; it's the closest one in the Stack universe and this is my first time here. I'm an old timer, programming since mainframe days in the late '70s.

    Read the article

  • See the latest Applications Cloud user experiences at Oracle OpenWorld 2014

    - by mvaughan
    By Misha Vaughan, Oracle Applications User Experience

    OAUX Day: Oracle Applications Cloud User Experience Strategy & Roadmap. This event is for partners, Oracle sales, and customers who are passionate about Oracle's commitment to the ongoing user experience investment in Oracle's Applications Cloud. If you want to see where we are going firsthand, contact the Applications UX team to attend this special event, scheduled the week before Oracle OpenWorld. All attendees must be approved to attend and have signed Oracle's non-disclosure agreement. Register HERE. Date and time: 8 a.m. - 5 p.m. Wednesday, Sept. 24, 2014. Location: Oracle Conference Center, Redwood City, Calif.

    Oracle Applications Cloud User Experience Partner & Sales Briefing. This event is for Oracle Applications partners and Oracle sales who want to find out what's up with release 9 user experience highlights for Oracle Sales Cloud, Oracle HCM Cloud, cloud extensibility, and PaaS4SaaS. It will be held the day before Oracle OpenWorld kicks off. All attendees must be approved to attend. Register HERE. Date and time: 10:30 a.m. - 12:30 p.m. Sunday, Sept. 28, 2014. Location: Intercontinental Hotel, 888 Howard Street, San Francisco, Calif., in the Telegraph Hill room.

    Oracle OpenWorld 2014 OAUX Applications Cloud Exchange. This daylong, demo-intensive event is for Oracle customers, partners, and sales representatives who want to see what the future of Oracle's cloud user experiences will look like. Attendees will also see what's cooking in Oracle's research and development kitchen – concepts that aren't products … yet. All attendees must be approved to attend and have signed Oracle's non-disclosure agreement. Register HERE. Date and time: 1 - 4 p.m. and 6 - 8 p.m. Monday, Sept. 29, 2014. Location: Intercontinental Hotel, 888 Howard Street, San Francisco, Calif., on the Spa Terrace.

    Read the article

  • Windows 7 Backup not backing up custom library?

    - by James McMahon
    I have created a custom library under Windows 7 64-bit Professional to handle my source code. When I tried Windows Backup and Restore for the first time I got the following error: Backup encountered a problem while backing up file C:\Windows\System32\config\systemprofile\Source. Error: (The system cannot find the file specified. (0x80070002)) I've found a thread on this error on the Microsoft Answers site, but it appears to be 404 (there is a version in Google's cache) and the thread starter never gets an answer that works. The official Microsoft answer on this is: This problem is due to one or more profiles under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList with a missing ProfileImagePath. To check whether you have missing profiles: open regedit, navigate to the above registry key (HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList), expand the list, and click on each of the profiles listed. The first three profiles should have ProfileImagePath values of %SystemRoot%\System32\Config\SystemProfile, %SystemRoot%\ServiceProfiles\LocalService, and %SystemRoot%\ServiceProfiles\NetworkService respectively. Starting from the fourth profile, the ProfileImagePath should contain the path to a user profile on your machine, such as C:\Users\Christine. If one or more of the profiles has no ProfileImagePath, then you have missing profiles. To work around this, delete the profile in question (Caution: the registry contains critical settings that are necessary for your system to function properly. Take extra caution while making changes.) First, export the ProfileList key for safekeeping (right-click on the key, choose "Export", and save it to the desktop). Then right-click on the profile in question and choose delete. Try backup again. This does not work for me. Anyone have any idea what is going on here?

    Read the article

  • Mac Backup Plan

    - by Chuy77
    I'm reviewing my backup plan and would appreciate any thoughts about what more I should do (if anything) to make sure I'm properly covered in case of all hell breaking loose. :-) I have one machine. 1) I run a nightly clone with SuperDuper. I alternate the clone drive weekly so I have two clones, one never more than a week old. 2) I use BackBlaze as a sort of Time Machine in the cloud. It runs all the time and keeps everything on my machine backed up online. 3) I sync all my 1Password logins, etc. to my iPhone once a week. ...And that's it. I feel pretty covered. But I'm always reading stuff like this: http://www.43folders.com/2010/03/15/yes-another-backup-lecture And that doesn't even mention online backup, and seems like a huge pain in the behind. But maybe I'm being naive? Should I have more backups? Thanks for any feedback. I really appreciate it.

    Read the article

  • SQL SERVER – Retrieve and Explore Database Backup without Restoring Database – Idera virtual database

    - by pinaldave
    I recently downloaded Idera's SQL virtual database and tested it. There are a few things about this tool which caught my attention.

    My Scenario: It is quite common in real life that observing or retrieving older data is sometimes necessary, even though it has changed as time passed by. Our full database backup was 40 GB in size, and restoring it on our production server usually takes around 16 to 22 minutes, depending on the load on the server at the time. This range in time varies from one server to another as per the configuration of the computer. Some other issues we used to have are the following: when we try to restore a large 40 GB database, we need at least that much free space on our production server; and once in a while we even had to make changes in the restored database and use the changed, restored database for our purpose, making it even more time-consuming.

    My Solution: I had heard a lot about Idera's SQL virtual database tool. Well, right after we started to test this tool, we found out that it really delivers what it promises. Using this software was very easy and we were able to restore our database from backup in less than 2 minutes, sparing us the usual 16 to 22 minutes. The whole job was finished in a total of 10 minutes. Another interesting observation is that there is no need for additional space for restoring the database; for a complete database restoration, not even a single additional MB on the drive is required. We can use the database in the same way as our regular database, and there is no need for any additional configuration and setup. Let us look at the most relevant points of this product based on my initial experience:

    - Quick restoration of the database backup
    - No additional space required for database restoration
    - The virtual database has no physical .MDF or .LDF
    - The database which is restored is, in fact, the backup file presented as a virtual database
    - DDL and DML queries can be executed against this virtually restored database
    - A regular backup operation can be run against the virtual database, creating a physical .bak file that can be used later
    - There was no observed degradation in performance on the original database or on the restored virtual database
    - Additional T-SQL queries can be run against the virtual database

    Well, this summarizes my quick review. And, as I was saying, I am very impressed with the product and I plan to explore it more. There are many features that I have noticed in this tool which I think can be very useful if properly understood. I took a few screenshots using my demo database afterwards. Let us see what other things this tool can do besides the mentioned activities. I am surprised by its performance, so I want to know how exactly this feature works, specifically why it does not create any additional files and yet still allows updates on the virtually restored database. I guess I will have to send an e-mail to the developers at Idera and try to figure this out from them. I think this tool is very useful, and it delivers a level of performance well beyond what I expected. Soon, I will write a review of additional uses of SQL virtual database. If you are using SQL virtual database in your production environment, I am eager to learn more about it and your experience while using it.

    The 'Virtual' Part of virtual database: When I set out to test this software, I thought virtual database had something to do with Hyper-V or virtualization.
    In fact, a virtual database is a database which shows up in your SQL Server Management Studio without actually being restored or even created. This tool creates a database in SSMS from the backup of that same database; the backup, however, behaves virtually the same way as the original database.

    Potential Usage of virtual database: As soon as I described this tool to my teammate, his very first reaction was, "hey, if we have this then there is no need for log shipping." I find his comment very interesting, as log shipping is something where logs are moved to another server; here, no log updates are applied to the database, so I would rather compare it with snapshot replication. In fact, wherever a snapshot-replicated database can be used, a virtual database can be similarly used and configured. I totally believe that we can use it for reporting purposes. After this database was configured, I think the uses of this tool are unlimited. I will have to spend some more time studying it and will get back to you.

    Screenshots (click on images to see larger versions): virtual database console; hard drive space before virtual database setup; attach full backup screen; backup on hard drive; attach full backup screen with settings; virtual database setup in less than 60 seconds; virtual database online; hard drive space after virtual database setup; point-in-time recovery option (timeline view); virtual database summary; no performance difference between regular DB and virtual DB.

    Please note that all SQL Server MVPs get a free license of this software. Reference: Pinal Dave (http://blog.SQLAuthority.com), Idera (virtual database)

    Read the article

  • Calculating Cloud Service Costs [closed]

    - by capdragon
    Possible Duplicate: Can you help me with my capacity planning? I would like to scale a web application to the cloud and wanted to know if anyone has experience calculating costs and could tell me how I would go about that. I have NO experience with cloud services at all. Currently my production environment consists of two web servers and one database server. If the application continues on a linear growth path, eventually I want to scale to the cloud to avoid any more long-term commitments to extra hardware. I want to be able to create an environment similar to what I have now as a baseline, and have this as my fixed cost that I will always have. I also want to calculate my variable costs that will increase with more users or bandwidth. I don't have a preferred cloud vendor. Amazon, Rackspace, Terremark or any other is fine as long as I understand how to calculate my fixed and variable costs.

    Read the article

  • How to convert my backup.cmd into something I can run in Linux?

    - by blade19899
    Back in the day when i was using windows(and a noob at everything IT) i liked batch scripting so much that i wrote a lot of them and one i am pretty proud of that is my backup.cmd(see below). I am pretty basic with the linux bash sudo/apt-get/sl/ls/locate/updatedb/etc... I don't really know the full power of the terminal. If you see the code below can i get it to work under (Ubuntu)linux :) by rewriting some of the windows code with the linux equivalent (btw:this works under xp/vista/7 | dutch/english) @echo off title back it up :home cls echo ÉÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ» echo º º echo º typ A/B for the options º echo º º echo ÌÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ͹ echo º º echo º "A"=backup options º echo º º echo º "B"=HARDDISK Options º echo º º echo º º echo ÈÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍŒ set /p selection=Choose: Goto %selection% :A cls echo ÉÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ» echo º º echo º typ 1 to start that backup º echo º º echo ÌÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ͹ echo º º echo º "A"=backup options º echo º È1=Documents,Pictures,Music,Videos,Downloads º echo º º echo º "B"=HARDDISK Options º echo º º echo ÈÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍŒ set /p selection=Choose: Goto %selection% :B cls echo ÉÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ» echo º º echo º typ HD to start the disk check º echo º º echo ÌÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ͹ echo º º echo º "A"=backup options º echo º º echo º "B"=HARDDISK Options º echo º ÈHD=find and repair bad sectors º echo º º echo ÈÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍŒ set /p selection=Choose: Goto %selection% :1 cls if exist "%userprofile%\desktop" (set desk=desktop) else (set desk=Bureaublad) if exist "%userprofile%\documents" (set docs=documents) else (set docs=mijn documenten) if exist "%userprofile%\pictures" (set pics=pictures) else (echo cant find %userprofile%\pictures) if exist "%userprofile%\music" (set mus=music) else (echo cant find %userprofile%\music) if exist "%userprofile%\Videos" (set vids=videos) else (echo cant find %userprofile%\videos) if exist "%userprofile%\Downloads" (set down=downloads) else (echo cant find %userprofile%\Downloads) cls echo. examples (D:\) (D:\Backup) (D:\Backup\18-4-2011) echo. echo. if there is no "D:\backup" folder then the folder will be created echo. 
set drive= set /p drive=storage: echo start>>backup.log echo Name:%username%>>backup.log echo Date:%date%>>backup.log echo Time:%time%>>backup.log echo ========================================%docs%===========================================>>backup.log echo %docs% echo Source:"%userprofile%\%docs%" echo Destination:"%drive%\%username%\%docs%" echo %time%>>backup.log xcopy "%userprofile%\%docs%" "%drive%\%username%\%docs%" /E /I>>Backup.log echo 20%% cls echo ========================================"%pics%"=========================================>>backup.log echo "%pics%" echo Source:"%userprofile%\%pics%" echo Destination:"%drive%\%username%\%pics%" echo %time%>>backup.log xcopy "%userprofile%\%pics%" "%drive%\%username%\%pics%" /E /I>>Backup.log echo 40%% cls echo ========================================"%mus%"=========================================>>backup.log echo "%mus%" echo Source:"%userprofile%\%mus%" echo Destination:"%drive%\%username%\%mus%" echo %time%>>backup.log xcopy "%userprofile%\%mus%" "%drive%\%username%\%mus%" /E /I>>Backup.log echo 60%% cls echo ========================================"%vids%"========================================>>backup.log echo %vids% echo Source:"%userprofile%\%vids%" echo Destination:"%drive%\%username%\%vids%" echo %time%>>backup.log xcopy "%userprofile%\%vids%" "%drive%\%username%\%vids%" /E /I>>Backup.log echo 80%% cls echo ========================================"%down%"========================================>>backup.log echo "%down%" echo Source:"%userprofile%\%down%" echo Destination:"%drive%\%username%\%down%" echo %time%>>backup.log xcopy "%userprofile%\%down%" "%drive%\%username%\%down%" /E /I>>Backup.log echo end>>backup.log echo %username% %date% %time%>>backup.log echo 100%% cls echo backup Compleet copy "backup.log" "%drive%\%username%" del "backup.log" pushd "%drive%\%username%" echo close backup.log to continue with backup script "backup.log" echo press any key to retun to the main menu pause>nul goto :home :HD echo finds and repairs bad sectors echo typ in harddisk letter (C: D: E:) set HD= set /p HD=Hard Disk: chkdsk %HD% /F /R /X pause goto :home
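    A rough bash sketch of the same idea, assuming the standard English folder names under $HOME and using rsync for the copy step (folder names, destination path, and log location are placeholders to adjust):

      #!/bin/bash
      # Back up common home folders to a destination the user types in, logging as we go.
      read -rp "Destination (e.g. /media/usb/backup): " dest
      mkdir -p "$dest/$USER"
      log="$dest/$USER/backup.log"
      echo "start $USER $(date)" >> "$log"
      for dir in Documents Pictures Music Videos Downloads; do
          src="$HOME/$dir"
          if [ -d "$src" ]; then
              echo "==== $dir $(date +%T) ====" >> "$log"
              rsync -a "$src" "$dest/$USER/" >> "$log" 2>&1
          else
              echo "cannot find $src" >> "$log"
          fi
      done
      echo "end $USER $(date)" >> "$log"
      echo "backup complete; log written to $log"
      # Rough equivalent of the chkdsk /F /R menu option: run fsck on an unmounted partition,
      # e.g.  sudo fsck -f /dev/sdb1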

    Read the article

  • Real tortoises keep it slow and steady. How about the backups?

    - by Maria Zakourdaev
      … Four tortoises were playing in the backyard when they decided they needed hibiscus flower snacks. They pooled their money and sent the smallest tortoise out to fetch the snacks. Two days passed and there was no sign of the tortoise. "You know, she is taking a lot of time," said one of the tortoises. A little voice from just outside the fence said, "If you are going to talk that way about me, I won't go." Is it too much to ask of a quite expensive 3rd party backup tool that it be way faster than the SQL Server native backup? Or at least that it save a respectable amount of storage by producing really smaller backup files? By saying "really smaller", I mean at least getting a file of half the size. After Googling the internet in an attempt to understand what other "SQL people" are using for database backups, I see that most people are using one of the three main players in the SQL backup area: LiteSpeed by Quest, SQL Backup by Red Gate, and SQL Safe by Idera. The feedback about those tools is truly emotional and happy. However, while reading the forums and blogs I wondered whether many are simply accustomed to using the above tools since SQL 2000 and 2005. This is easy to understand: a 300 GB database backup, for instance, using a regular SQL 2005 backup statement, would have run for about 3 hours and produced a ~150 GB file (depending on the content, of course). Then you take a 3rd party tool which performs the same backup in 30 minutes, resulting in a 30 GB file and leaving you speechless; you run to management persuading them to buy it because it is definitely worth the price. In addition to the increased speed and disk space savings you would also get backup file encryption and virtual restore - features that are still missing from SQL Server. But in case you, like me, don't need these additional features and only want a tool that performs a full backup MUCH faster AND produces a far smaller backup file (like the gain you observed back in the SQL 2005 days), you will be quite disappointed. SQL Server's backup compression feature has totally changed the market picture. Medium size database: Take a look at the table below and check out how my SQL Server 2008 R2 compares to other tools when backing up a 300 GB database. It appears that when talking about backup speed, SQL 2008 R2 compresses and performs the backup in similar overall times as all three other tools. The 3rd party tools' maximum compression level takes twice as long. The backup file gain is not that impressive, except at the highest compression levels, but the price you pay is very high CPU load and much longer run time. Only SQL Safe by Idera was quite fast at its maximum compression level, but it used 95% CPU on the server for most of the run time. Note that I have used two types of destination storage, SATA (11 disks) and FC (53 disks), and, obviously, on the faster storage I got my backup ready in half the time. Looking at the above results, should we spend money and bother with another layer of complexity and a software middle-man for medium sized databases? I'm definitely not going to do so. Very large database: As the next phase of this benchmark, I moved to a 6 terabyte database, which was actually my main backup target. Note how using multiple files enables the SQL Server backup operation to use parallel I/O and remarkably increases its speed, especially when the backup device is heavily striped.
    SQL Server supports a maximum of 64 backup devices for a single backup operation, but the most speed is gained when using one file per CPU; in the case above that means 8 files for a 2 quad-core CPU server. The impact of additional files is minimal. However, SQLsafe doesn't show any speed improvement between 4 files and 8 files. Of course, with such huge databases every half percent of compression translates into noticeable numbers. Saving almost 470 GB of space may turn the backup tool into quite a valuable purchase. Still, the backup speed and the high CPU load are variables that should be taken into consideration. As for us, the backup speed is more critical than the storage, and we cannot allow a production server to sustain 95% CPU for such a long time. Bottom line, 3rd party backup tool developers, we are waiting for a breakthrough release. There are a few unanswered questions, like the restore speed comparison between the different tools and the impact of multiple backup files on the restore operation. Stay tuned for the next benchmarks.

    Benchmark server: SQL Server 2008 R2 SP1, 2 quad-core CPUs. Database location: NetApp FC 15K aggregate, 53 disks.

    Backup statements: No matter how good a tool's UI is, we need to run the backup tasks from inside SQL Server Agent to make sure they are covered by our monitoring systems. I have used extended stored procedures (command line execution is also an option; I haven't noticed any impact on backup performance).

    Native SQL backup:
    backup database <DBNAME> to disk= '\\<networkpath>\par1.bak', disk= '\\<networkpath>\par2.bak', disk= '\\<networkpath>\par3.bak' with format, compression

    LiteSpeed:
    EXECUTE master.dbo.xp_backup_database @database = N'<DBName>', @backupname= N'<DBName> full backup', @desc = N'Test', @compressionlevel=8, @filename= N'\\<networkpath>\par1.bak', @filename= N'\\<networkpath>\par2.bak', @filename= N'\\<networkpath>\par3.bak', @init = 1

    SQL Backup (Red Gate):
    EXECUTE master.dbo.sqlbackup '-SQL "BACKUP DATABASE <DBNAME> TO DISK= ''\\<networkpath>\par1.sqb'', DISK= ''\\<networkpath>\par2.sqb'', DISK= ''\\<networkpath>\par3.sqb'' WITH DISKRETRYINTERVAL = 30, DISKRETRYCOUNT = 10, COMPRESSION = 4, INIT"'

    SQL safe:
    EXECUTE master.dbo.xp_ss_backup @database = 'UCMSDB', @filename = '\\<networkpath>\par1.bak', @backuptype = 'Full', @compressionlevel = 4, @backupfile = '\\<networkpath>\par2.bak', @backupfile = '\\<networkpath>\par3.bak'

    If you still insist on using 3rd party tools for the backups in your production environment with maximum compression level, you will definitely need to consider limiting CPU usage, which will increase the backup operation time even more:

    - Red Gate: use the THREADPRIORITY option (values 0 – 6)
    - LiteSpeed: use @throttle (a percentage, like 70%)
    - SQL safe: the only thing I have found was the @Threads option

    Yours, Maria

    Read the article

  • tar incremental backup is backing everything up, every time when used on the Dropbox directory

    - by Cyclic
    I made an incremental backup about 10 months ago (on Jan 27, 2013), creating a .snar metadata file. Now, when I try to make an incremental backup using

    tar --create --file=dropbox_incremental_1.tar --listed-incremental=dropbox_0.snar Dropbox

    the command just re-backs up everything. I'm not an expert at Unix timestamps, but I noticed that virtually all of my directory timestamps are way more recent than the last time they changed. For my actual files, they look like this:

    Access: 2013-03-12 19:04:51.000000000 -0500
    Modify: 2012-09-30 15:10:47.000000000 -0500
    Change: 2013-03-12 19:04:51.306209672 -0500

    The 'Modify' timestamp seems correct, but the files were definitely not changed (at least not by anything that I know of) at the time they say they were. These files still seem to go into the incremental archive. What's happening here? Is there a way to tell tar to look at the 'modify' timestamp? Isn't this what it's supposed to be doing?
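    One thing to check (a hedged guess, since Dropbox's sync client rewrites files and can change the inode and device numbers that GNU tar records in the .snar file): the --no-check-device option relaxes the device-number comparison, e.g.:

      # Snapshot and archive names as in the question; --no-check-device tells GNU tar
      # not to treat a changed device number as "everything is new".
      tar --create \
          --file=dropbox_incremental_1.tar \
          --listed-incremental=dropbox_0.snar \
          --no-check-device \
          Dropbox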

    Read the article

  • Bare bones backup / restore for single Win 2003 server

    - by s.mihai
    I have a single server running Windows Server 2003 and would like to set up a system that lets me perform a bare bones restore if needed (just plug in a CD or something and get everything back). Ideally the backup could be performed while the server is powered on, so that I don't have to schedule downtime for it, and to restore I would reboot and use some sort of live CD. Any ideas on this, software and all...? (The backup will be done to a remote FTP server with plenty of bandwidth.)

    Read the article

  • Incremental Backup which also is imageable

    - by qwertymk
    I'm looking for a backup program that does incremental backups and that I can use to completely restore my main HD. For example, I use C:\ as my main drive and E:\backups... as my backup; what I want is to have it make incremental backups, but such that if my computer becomes infected I can just choose an earlier snapshot and restore my entire HD to that image. I'm also looking for something that has auto scheduling, but I'm guessing they all do. I would really like an open source option that does this, but everything I tried doesn't seem to have an "imaging" option. Are there any open source programs that do this (for Windows 7 64-bit)? If not, I would also use any free non-open-source option.

    Read the article

  • Exchange 2007 backup / restore

    - by Matt
    To explain my situation: I have an Exchange 2007 server and I have recently upgraded to SP2 so that I can use Windows Server Backup to take an "Exchange aware" backup of the server. Is there an easy way of restoring this data onto a second server on which I have pre-installed Server 2008 / Exchange 2007, without upsetting my current Exchange server set-up? Also, is there a better way to do this, such as running one server as a failover should the primary Exchange server develop a fault? I am still new to all this, so please excuse any stupidity in my questions. Thanks in advance for any help and assistance.

    Read the article

  • Incremental backup and sync software

    - by martjno
    I need free software for Windows (with a GUI or command line) that does incremental backups, copying all files and storing changed or deleted files in a directory named after the last change date (or a progressive number). To be more precise: D:\ is my data drive and E:\ is my backup drive. If I want to back up all my data from D:\:

    - E:\d_lastbackup\ will contain a plain copy of all the files and folder content (no compression or archiving, same file attributes) of D:\
    - E:\d_20090822\ will contain all files (with their full path) that were changed or deleted in the last version (since the previous one)
    - E:\d_20090820\ will contain all files (with their full path) that were changed or deleted in the last version (since the previous one)

    and so on... I had software that worked perfectly with an old Maxtor USB hard disk, but it works only on that device. Any suggestion?
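    A hedged sketch of that layout using rsync (available on Windows via Cygwin or similar; the drive letters are shown as Cygwin-style paths and are placeholders):

      # Keep a plain mirror in d_lastbackup; any file that changed or was deleted since the
      # previous run is moved into a dated directory before being overwritten or removed.
      src="/cygdrive/d/"
      mirror="/cygdrive/e/d_lastbackup"
      old="/cygdrive/e/d_$(date +%Y%m%d)"
      rsync -a --delete --backup --backup-dir="$old" "$src" "$mirror/"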

    Read the article

  • Offset AND incremental backup

    - by Pyrolistical
    I already do backups from my main computer to my server computer using SyncToy, but now I also want to do off-site backup. My idea so far, as a numbered routine:

    1. Have the source hard drive (we'll call it S) at home
    2. Have a backup hard drive at work called B
    3. Have a transport hard drive called T
    4. Connect T at work and record an index of the files on B
    5. Take T home, check the index against S, note new/changed/deleted files, and copy the changed files to T
    6. Take T to work and update B
    7. Repeat

    It's basically a sneakernet, using all of its advantages: high bandwidth, low latency. Is there some software to do this, or do I have to write it myself?

    Read the article

  • rsync remote to local automatic backup

    - by Mark Molina
    Because all my work is stored on a remote server, I would like to automatically back up my server weekly and monthly. My server is running CentOS 5.5, and while searching the web I found a tool named rsync. I got my first backup manually by using this command in a terminal:

    sudo rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP

    I am then prompted for that user's password, and Bob's my uncle. This backs up the necessary files from my remote server to my local device, but does somebody know how I can automate this? Like running this script automatically every Sunday? EDIT: I forgot to mention that I let DirectAdmin back up the files I need and then copy those files from the remote server to a local server.
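    A minimal sketch of the automation with cron (paths are placeholders; an unattended run cannot type a password, so key-based SSH authentication, set up with ssh-keygen and ssh-copy-id, is assumed):

      # Edit the crontab with:  crontab -e
      # Fields: minute hour day-of-month month day-of-week (0 = Sunday), then the command.
      0 3 * * 0  rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP >> /var/log/remote-backup.log 2>&1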

    Read the article

  • How to backup mirror copies of C: drive?

    - by metal gear solid
    I've installed everything I need on my C: drive: Windows 7, updated drivers, utilities, software, and so on. Now I want to take a mirror backup of everything to a DVD, or I can keep the backup on another USB HDD, so that if I face any Windows or hard-drive failure in the future I can restore everything exactly as it is today. I don't want to reinstall everything again: Windows, drivers, all utilities, and all the software I need. My C: drive's total capacity is 108 GB, but the data on it is only 12 GB. What should I do? What is the best solution for me? I need a free solution.

    Read the article

  • Backup software for Ubuntu - which one?

    - by Industrial
    Hi everybody, I have spent some time testing different backup solutions for my small home office during the last few weeks, but still haven't found anything that works out well. We can definitely work with a non-GUI script if that's what it takes, as long as the requirements are fulfilled:

    - Upload to Amazon S3 Europe. We get unbelievably slow upload speeds to the US, so uploading 400+ GB of data will not be happening anytime this year...
    - Incremental backups - only changed files should be uploaded, or we will have a big bill from Amazon at the end of each month.
    - Files should not be uploaded as one big per-folder archive. This is not efficient at all, since if we change one file in a subfolder, a huge two-digit-GB file would have to be uploaded during the next backup. Not good for the bill again, or for the traffic overhead on our internet connection.

    What options are available to us? Thanks!
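    One candidate worth testing (a hedged sketch, not a recommendation from experience): duplicity does incremental, per-file, encrypted uploads to S3 and can target EU buckets. The bucket name and paths are placeholders, and the S3 flags below apply to the classic boto backend, so they may differ in newer releases:

      # Credentials for the S3 backend are read from the environment.
      export AWS_ACCESS_KEY_ID="..."
      export AWS_SECRET_ACCESS_KEY="..."
      # Incremental by default; force a fresh full backup once a month.
      duplicity --full-if-older-than 1M \
          --s3-use-new-style --s3-european-buckets \
          /home/office s3+http://my-backup-bucket/office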

    Read the article

  • cloud programming for OpenStack in C / C++

    - by Basile Starynkevitch
    (Sorry for such a fuzzy question, I am a complete newbie to cloud programming.) I am interested in designing (and developing) a free-software program in C or C++ (probably with most of it meta-programmed, i.e. part of the C code being generated). I am still in the thinking/designing phase, and I might give up. For reference, I am the main architect and implementor of GCC MELT, a domain-specific language to extend the GCC compiler (the MELT language is translated to C/C++ and is bootstrapped: the MELT-to-C/C++ translator is written in MELT). And I am dreaming of extending it with some cloud computing abilities. But I am a newbie in cloud computing. (I am only interested in free-software, GPLv3-friendly cloud computing, which probably means OpenStack.) I believe that "compiling on the cloud with some enhanced GCC" could make sense (for super-optimizations or static analysis of, e.g., an entire Linux distribution, or at least a massive piece of GCC-compiled free software like Qt, GCC itself, or the Linux kernel). I'm dreaming of a MELT-specific monitoring program which would store, communicate, and enhance GCC compilations (extended by MELT). So the picture would be that each GCC process (actually the cc1 or cc1plus started by the gcc driver, suitably extended by some MELT extension) would communicate with some monitor. That "monitoring/persisting" program would run "on the cloud" (and probably manage some information produced by GCC, e.g. in NoSQL databases). So, how should such a (yet to be written) C program (some Linux daemon) be designed to be cloud-friendly? So far, I understand that it should provide some web service, probably through a RESTful interface, so it should use an HTTP server library like onion. And OpenStack is able to start (e.g. a dozen of) such services. But I don't have a clear picture of what OpenStack brings. So far, I have noticed the ability to manage (and distribute) virtual machines (with some Python API). It is less clear how it can distribute an ELF executable, how it can start it, etc. Do you have any references or examples of C/C++ programming for the cloud? How should a "cloud-friendly" (actually, OpenStack-friendly) C/C++ server application be designed?

    Read the article

  • Oracle Announces Leading ISV Integration With Oracle Sales and Marketing Cloud Service

    - by Richard Lefebvre
    More Than 100 ISVs, including BigMachines, Marketo and Xactly, now Provide Integrated Offerings to Help Maximize Sales and a Single Customer Viewpoint. Demonstrating its continued commitment to business value via open standards and the cloud, Oracle today announced that more than 100 leading ISVs are integrating in the cloud with Oracle Sales and Marketing Cloud Service, a service available through Oracle Cloud. For the first time, Oracle Sales and Marketing Cloud Service users can choose from a wide array of directly integrated third-party solutions, providing a new level of choice, seamless deployment and a single view of customers with preferred implementations. Top partners, including ActivePrime, Avaya, BigMachines, Box, Brainshark, Callidus Software, CirrusPath, Clicktools, CRMIT, DBSync, EchoSign from Adobe, Eloqua, Fliptop, FPX, HarQen, HubSpot, iHance, InsideSales.com, InsideView, Interactive Intelligence, Lingotek, LinkPoint360, Marketo, Nuance, PerspecSys, Postcode Anywhere, Revegy, salesElement, StrikeIron, upsourceIT, White Springs, X+1 and Xactly, have announced their availability and integration today. By integrating with Oracle Sales and Marketing Cloud Service, ISV solutions can easily be leveraged by customers. By choosing Oracle Sales and Marketing Cloud Service as a sales platform, customers will continue to have complete choice of their own quoting, lead management and sales methodology solutions, and it will all be pre-integrated with Oracle Sales and Marketing Cloud Service. With demonstrable integration fusing standards-based technologies, such as SOAP web services, Oracle Sales and Marketing Cloud Service customers choosing ISV integrations will also benefit from familiar ease-of-use and the Oracle Sales and Marketing Cloud Service user interface, including buttons, links and custom objects for a rich user experience. ISV integration with Oracle Sales and Marketing Cloud Service also enables on-demand contextual data exchange capabilities, linking Oracle Sales and Marketing Cloud Service business data with third-party application data for a complete CRM view. ISVs building robust, repeatable integrations with Oracle Sales and Marketing Cloud Service can begin the process of achieving Oracle Validated Integration, an Oracle PartnerNetwork program that recognizes Oracle partner solutions with proven integration to Oracle Applications. ISVs can learn more about Oracle Validated Integration here. For customers, Oracle Validated Integration means that a partner's integration has been tested and validated as functionally and technically sound, that the partner solution is integrated with Oracle Sales and Marketing Cloud Service in a reliable, standardized way, and that the integration operates and performs as documented. Oracle Cloud provides a broad portfolio of Platform Services, Application Services, and Social Services, all on a subscription basis. Oracle Cloud delivers instant value and productivity for end users, administrators, and developers through functionally rich, integrated, secure, enterprise cloud services.

    Supporting Quotes

    "BigMachines is a leader in Configure, Price, and Quote solutions in the Cloud. Our solution delivers accurate quotes directly from an opportunity, integrated with the leading Oracle Sales and Marketing Cloud application from Oracle," says John Pulling, Senior Vice President of Products at BigMachines.
“Together, Big Machines and Oracle efficiently automate changes, enabling a faster, more efficient sales process for our joint customers.”   ”Modern marketing and sales must engage customers and prospects in real time across the web, email, social media, online and offline channels to understand where and how to allocate their budgets for maximum return,” said Srini Venkatesan, Senior VP, Products and Engineering at Marketo. “Alignment and integration with Oracle Sales and Marketing Cloud Service allows Marketo’s solutions to deliver innovative capabilities for sales and marketing to adapt and grow their business on the core Oracle platform for CRM.”   “Sales incentives are the best way to drive better performance. Well managed incentives improve the bottom line, particularly when combined with effective sales systems,” said Christopher Cabrera, president and CEO of Xactly Corporation. “With Oracle Sales and Marketing Cloud Service and Xactly working together, customers gain insight and efficiencies. The combination can create more effective compensation programs, while motivating sales to work to its full potential."   “The tremendous integration of leading ISVs with Oracle Sales and Marketing Cloud Service is a testament to the undeniable business value and demand from customers,” said Anthony Lye, SVP of Oracle CRM. “Oracle Sales and Marketing Cloud Service continues to define the industry, and we are proud to work with these leading ISVs to help users simultaneously maximize sales and revenue and extend their current deployments for a deeper and single customer viewpoint.” Supporting Resources Oracle Sales and Marketing Cloud Service Learn More About Oracle Cloud

    Read the article

  • Windows 7: Setting up backup to an external hard drive on another computer on the network

    - by seansand
    I have an external hard drive connected to a Windows 7 (Home Edition) computer. I have another computer (with Windows 7 Ultimate), and I want the Windows 7 Ultimate machine to back up to the same external hard drive, without having to disconnect the drive and move it from the Home Edition PC. When I get to the "Set up backup" dialog within Windows 7, it asks me where to save the backup. I select "Save on a network". However, when I enter "\\computername\harddrivename" under Browse, the "OK" button remains grayed out. The button stays grayed out unless I also enter a username and password under "Network credentials". However, the account I have on the other computer doesn't have a password. To un-gray the button I must enter a fake password, allowing me to click "OK", but then obviously I get a "bad password" error. Does anyone know how to get around this problem? (Seems kind of ridiculous.) I made sure that the security settings for the external hard drive on the other computer give full access to Everyone, so permissions are not the problem. I also thought about using HomeGroup instead of the regular security settings, but there is no obvious way to go about it that way, either.

    Read the article

  • Taking HRMS to the Cloud to Simplify Human Resources Management

    - by HCM-Oracle
    By Anke Mogannam With human capital management (HCM) a top-of-mind issue for executives in every industry, human resources (HR) organizations are poised to have their day in the sun—proving not just their administrative worth but their strategic value as well.  To make good on that promise, however, HR must modernize. Indeed, if HR is to act as an agent of change—providing the swift reallocation of employees  and the rapid absorption of employee data required for enterprises to shift course on a dime—it must first deal with the disruptive change at its own front door. And increasingly, that means choosing the right technology and human resources management system (HRMS) for managing the entire employee lifecycle. Unfortunately, for most organizations, this task has proved easier said than done. This is because while much has been written about advances in HRMS technology, until recently, most of those advances took the form of disparate on-premises solutions designed to serve very specific purposes. Although this may have resulted in key competencies in certain areas, it also meant that processes for core HR functions like payroll and benefits were being carried out in separate systems from those used for talent management, workforce optimization, training, and so on. With no integration—and no single system of record—processes were disconnected, ease of use was impeded, user experience was diminished, and vital data was left untapped.  Today, however, that scenario has begun to change, and end-to-end cloud-based HCM solutions have moved from wished-for innovations to real-life solutions. Why, then, have HR organizations been so slow in adopting them? The answer—it would seem—is, “It’s complicated.” So complicated, in fact, that 45 percent of the respondents to PwC’s “Annual HR Technology Survey” (for 2013) reported having no formal HR software roadmap, and 40 percent stated that they “did not know” whether their organizations would be increasing their use of cloud or software as a service (SaaS) for HR.  Clearly, HR organizations need help sorting through the morass of HR software options confronting them. But just as clearly, there’s an enormous opportunity awaiting those that do. The trick will come in charting a course that allows HR to leverage existing technology while investing in the cloud-based solutions that will deliver the end-to-end processes, easy-to-understand analytics, and superior adaptability required to simplify—and add value to—every aspect of employee management. The Opportunity therefore is to cut costs, drive Innovation, and increase engagement by moving to cloud-based HCM.  Then you will benefit from one Interface, leverage many access points, and  gain at-a-glance insight across your entire workforce. With many legacy on-premises HR systems not being efficient anymore and cloud-based, integrated systems that span the range of HR functions finally reaching maturity, the time is ripe for moving core HR to the cloud. Indeed, for the first time ever there are more HRMS replacement initiatives than HRMS upgrade initiatives under way, and the majority of them involve moving to the cloud per Cedar Crestone’s 2013-2014 HRMS survey. To learn how you can launch your own cloud HCM initiative and begin using HR to power the enterprise, visit Oracle HRMS in the Cloud and Oracle’s new customer 2 cloud program. 
Anke Mogannam brings more than 16 years of marketing and human capital management experience in the technology industries to her role at Oracle where she is part of the Human Capital Management applications marketing team. In that role, Anke drives content marketing, messaging, go-to-market activities, integrated marketing campaigns, and field enablement. Prior to joining Oracle, Anke held several roles in communications, marketing, HCM product strategy and product management at PeopleSoft, SAP, Workday and Saba. Follow her on Twitter @amogannam

    Read the article
