Search Results

Search found 698 results on 28 pages for 'rsync'.

Page 5/28

  • rsync not using forwarded ssh credentials

    - by Mat
    I have a situation where I would like to rsync some files from a remote server to a server in my office. The source server requires key-based authentication, and I have an appropriate key set up on my desktop machine. If I ssh into the local server and then ssh from there to the remote server, ssh agent forwarding works correctly. However, when I try to rsync over ssh I get "permission denied". The chain is: Desktop -- Local server -- Remote server. When logged in to the local server,

        ssh user@remote

    works, but

        rsync -avPe ssh user@remote:/src /dest

    fails with "Permission denied (publickey)".
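
    A quick check worth trying before digging deeper (a minimal sketch; "localserver" and the paths are placeholders): confirm the forwarded agent is actually visible in the session that runs rsync, and request agent forwarding explicitly on the first hop.

        # On the desktop: make sure the key is loaded and forwarding is requested
        ssh-add -l                 # should list the key that the remote server accepts
        ssh -A user@localserver    # -A enables forwarding; "localserver" is a placeholder

        # On the local server, inside that same session:
        echo "$SSH_AUTH_SOCK"      # must be non-empty, otherwise forwarding is not active
        ssh user@remote true       # should succeed without a password prompt
        rsync -avPe ssh user@remote:/src /dest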

    Read the article

  • rsync: files copied with hidden attribute

    - by haritan
    I run a backup from a Windows 7 machine to a Mac running Mountain Lion, using the rsync that is packaged with the DeltaCopy application. I can't use the DeltaCopy interface because the destination is a mapped drive (the Mac's Samba share). So here is my setup: I have a folder on the Mac that is the destination folder, and I share this folder via Samba. On the Windows machine I map this Samba share to a drive (let's say M:) and run rsync:

        rsync -arv --delete "/cygdrive/C//origin/" "/cygdrive/M//mybackup/"

    It runs fine, except that all files on the destination are hidden. Does anyone have an idea what's happening here? I really appreciate any feedback. Thank you.
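
    One workaround worth sketching (an assumption, not a confirmed diagnosis: the hidden attribute often comes from rsync trying to preserve the source's POSIX permissions and ownership onto the SMB-mapped destination): keep the same command but skip permission, owner and group preservation, so the destination assigns its own defaults.

        # Same transfer, but let the mapped drive use default attributes
        rsync -arv --delete --no-perms --no-owner --no-group \
            "/cygdrive/C//origin/" "/cygdrive/M//mybackup/"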

    Read the article

  • rsync filtering

    - by biomed
    I use an rsync command to sync two directories, remote to local. The command (used in a Python script) is:

        os.system('rsync --verbose --progress --stats --recursive\
         --copy-links --times --include="*/" --include="*good_name*.good_ext*"\
         --exclude-from "/myhome/mydir/src/rsync.exclude"\
         %s %s' % (remotepath, localpath))

    I want to exclude certain directories that contain the same files I also want to include. I want to include, recursively, any_dir_name/any_file_name.good, but I want to exclude any and all files that are in bad_dir_name/. I used --exclude-from, and here is my exclude-from file:

        *
        /*.bad_dir_name/

    Unfortunately it doesn't work. I suspect it may have something to do with --include="*/", but if I remove it the command doesn't sync any files at all. Thanks for the help.
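
    A sketch of one likely fix (the directory and pattern names are the placeholders from the question): rsync checks filter rules in order and the first match wins, so the exclude for the unwanted directories has to appear before the catch-all --include="*/", and a final --exclude="*" is still needed to drop everything that matched neither include.

        rsync --verbose --progress --stats --recursive --copy-links --times \
            --exclude="bad_dir_name/" \
            --include="*/" \
            --include="*good_name*.good_ext*" \
            --exclude="*" \
            "$remotepath" "$localpath"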

    Read the article

  • What are the advantages and disadvantages of git or bzr + rsync vs rdiff-backup?

    - by Azendale
    I used to use rsync to do backups, but then I switched to rdiff-backup for incremental backups. Recently, I discovered git and bzr while working on a coding project. So I was thinking: I could make my backup disk a repository in either git or bzr, then rsync to the repository and commit the changes. Would there be any performance concerns with this? Any other issues that I'm not thinking of? The benefit I see in using rsync is that you can restart an interrupted transfer, while rdiff-backup reverts to the last version and then starts again. Any reason not to do it this way? Anything I'm not thinking of?
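
    For concreteness, a minimal sketch of the workflow being described, with placeholder paths (this assumes a plain git repository on the backup disk and is not a judgement on whether it beats rdiff-backup):

        # One-time setup on the backup disk; paths are placeholders
        git init /mnt/backupdisk/backup

        # Each backup run: mirror the data (protecting the repo metadata),
        # then record a snapshot
        rsync -a --delete --exclude=/.git /home/me/ /mnt/backupdisk/backup/
        cd /mnt/backupdisk/backup
        git add -A
        git commit -m "backup $(date +%F)"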

    Read the article

  • Is it possible to use rsync over sftp (without an ssh shell)?

    - by Tom Feiner
    Rsync over ssh works great every time. However, trying to rsync to a host which allows only sftp logins, but not ssh logins, produces the following error:

        rsync -av /source ssh user@remotehost:/target/
        protocol version mismatch -- is your shell clean?
        (see the rsync man page for an explanation)
        rsync error: protocol incompatibility (code 2) at compat.c(171) [sender=3.0.6]

    Here's the relevant section from the rsync man page:

        This message is usually caused by your startup scripts or remote shell
        facility producing unwanted garbage on the stream that rsync is using
        for its transport. The way to diagnose this problem is to run your
        remote shell like this:

            ssh remotehost /bin/true > out.dat

        then look at out.dat. If everything is working correctly then out.dat
        should be a zero length file. If you are getting the above error from
        rsync then you will probably find that out.dat contains some text or
        data. Look at the contents and try to work out what is producing it.
        The most common cause is incorrectly configured shell startup scripts
        (such as .cshrc or .profile) that contain output statements for
        non-interactive logins.

    Trying this on my system produced the following in out.dat:

        ssh-dummy-shell: Command not allowed.

    As I thought, the host is not allowing ssh logins. The following link shows that it is possible to accomplish this task using fuse with sshfs - however it is extremely slow and not fit for production use. Is there any chance of getting rsync over sftp to work?
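
    For reference, a minimal sketch of the sshfs workaround the question mentions (hostnames and mount points are placeholders); it works when only sftp is allowed, though as noted it is much slower than native rsync over ssh:

        # Mount the sftp-only host via FUSE, then rsync against the mount point
        mkdir -p /mnt/remotehost
        sshfs user@remotehost:/target /mnt/remotehost
        rsync -av /source/ /mnt/remotehost/
        fusermount -u /mnt/remotehost   # unmount when done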

    Read the article

  • Rsync and wildcards

    - by Jay White
    I am trying to back up both the "Last Session" and "Current Session" files for Google Chrome in one command, but using a wildcard doesn't seem to work. I am trying with the following command:

        rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete \
            '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session' \
            user@host:"/chrome\ sessions/"

    and get the following error:

        rsync: link_stat "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session" failed: No such file or directory (2)

    What am I doing wrong?
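
    A sketch of one likely cause (assuming the command is run from a Cygwin bash shell): the single quotes stop the local shell from expanding *Session, and for a local source rsync does not expand wildcards itself, so it looks for a file literally named "*Session". Quoting only the part with spaces and leaving the wildcard outside the quotes lets the shell expand it first:

        rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete \
            "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/"*Session \
            user@host:"/chrome\ sessions/"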

    Read the article

  • Rsync problem...filenames

    - by Jay White
    I'm trying to back up users' Chrome sessions with rsync, using the following command:

        rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete \
            '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/Current Session' \
            user@host:"/chrome sessions/"

    Except this doesn't work exactly: I end up with a file called "chrome" inside a "sessions" directory that is already present on the server. Why is this?
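
    A sketch of the usual explanation: the destination path is passed through the remote shell, which splits "/chrome sessions/" into two words, so the space has to be escaped for the remote side too, or rsync told not to let the remote shell interpret the arguments (-s / --protect-args, available in rsync 3.0 and later):

        # Escape the space for the remote shell ...
        rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete \
            '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/Current Session' \
            user@host:'/chrome\ sessions/'

        # ... or protect the arguments end to end
        rsync -s -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete \
            '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/Current Session' \
            user@host:'/chrome sessions/'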

    Read the article

  • Migrate data from one server to another using rsync

    - by Leonid Shevtsov
    I'm moving from one VPS to another, and I figured that the simplest way to transfer data would be rsync. However, the data is owned by a user, www-data, which doesn't have ssh privileges, and I'd like it to be owned by the same (named) user on the target machine. Obviously I need all file permissions preserved. I have SSH access via another user with sudo privileges on both machines. Is it possible to do this with rsync?
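
    One common pattern worth sketching (user names and paths are placeholders; it assumes the sudo-capable user on the target may run rsync as root, ideally via a NOPASSWD sudoers rule so the remote sudo does not prompt): run rsync with root privileges on both ends so ownership by name and all permissions survive the copy.

        # Run on the source VPS; "admin", "newvps" and /var/www are placeholders.
        sudo rsync -aH -e ssh --rsync-path="sudo rsync" \
            /var/www/ admin@newvps:/var/www/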

    Read the article

  • Rsync --backup-dir seems to be ignored

    - by Patrik
    I want to use rsync to back up a directory from a local location to a remote location, and store changed files in another remote location. I used:

        rsync -rcvhL --progress --backup [email protected]:/home/user/Changes/`date +%Y.%m.%d` . [email protected]:/home/user/Files/

    The --backup-dir stays empty, while it should be filled. Is what I'm trying to accomplish possible, and am I doing something wrong? Thanks
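
    A sketch of the likely issue: --backup takes no argument, so the command above actually hands rsync three paths; the directory for changed files has to be given with --backup-dir, and it must live on the receiving side (rsync cannot send the backups to a third host in the same run). Keeping the question's placeholder paths:

        rsync -rcvhL --progress --backup \
            --backup-dir=/home/user/Changes/`date +%Y.%m.%d` \
            . [email protected]:/home/user/Files/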

    Read the article

  • Re-sync deleted files from rsync

    - by hfranco
    I need to recover files that have been deleted. My scenario: I have an rsync script that runs at 9 PM and mirrors everything from a directory on server1 to another directory on backup server2. A couple of files have been accidentally deleted from server1. How do I recover those files onto server1 with rsync?
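
    A minimal sketch of one way to do it (hostnames and paths are placeholders): run rsync in the opposite direction, from the mirror on server2 back to server1, and leave out --delete so nothing else on server1 is touched; --ignore-existing restricts the copy to files that no longer exist on server1.

        # Run on server1; the paths are placeholders
        rsync -av --ignore-existing server2:/backup/dir/ /original/dir/

        # Or pull back just the specific files that were deleted
        rsync -av server2:/backup/dir/path/to/lostfile /original/dir/path/to/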

    Read the article

  • Port forwarding for Rsync

    - by malfist
    Every port on my server is blocked except port 222, which is where ssh connects. This server is pretty much a backup server, and I have my clients rsync to it. I currently do this using ssh's port forwarding (-P 222 -L 873:myserver.com:873); however, I want to do this with just the rsync command. Is that possible?
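
    If the transfer can simply go over ssh (rather than to the rsync daemon on port 873), no tunnel is needed at all; rsync can be told which ssh port to use. A sketch, with the local path and the module name as placeholders:

        # Plain rsync-over-ssh on the non-standard port
        rsync -av -e "ssh -p 222" /local/data/ user@myserver.com:/backups/

        # Or reach an rsync daemon module through ssh on port 222 in one command
        # ("backupmodule" is a placeholder module name)
        rsync -av -e "ssh -p 222" /local/data/ user@myserver.com::backupmodule/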

    Read the article

  • rsync --files-from (find + cat)

    - by Edward
    I am trying the command

        rsync -v --files-from=/path/to/list.lst /home/user /path/to/backup

    list.lst contains, for example:

        .gnupg/
        .pki/
        .gnome2/keyrings/
        .mozilla/firefox/*.default/bookmarkbackups/
        .mozilla/firefox/*.default/bookmarks.html
        .mozilla/firefox/*.default/*.db files
        .mozilla/firefox/*.default/*.sqlite

    and I get "failed: No such file or directory" errors for all lines with *. Can anybody help me? Or, as an alternative, can I combine find / `cat /path/to/list.lst` with rsync?
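
    --files-from takes its entries as literal path names and does not expand wildcards, so one workaround worth sketching (paths are the ones from the question; the temp file name is arbitrary) is to expand the globs into a concrete list first and feed that to rsync:

        # Expand the patterns relative to the source directory, then hand the
        # resulting literal paths to rsync.
        ( cd /home/user && ls -d .gnupg .pki .gnome2/keyrings \
              .mozilla/firefox/*.default/bookmarkbackups \
              .mozilla/firefox/*.default/bookmarks.html \
              .mozilla/firefox/*.default/*.sqlite ) > /tmp/list.expanded
        rsync -av --files-from=/tmp/list.expanded /home/user /path/to/backup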

    Read the article

  • rsync & rdiff-backup combination giving errors

    - by Maikel van Leeuwen
    On the server I make a backup every day with rdiff-backup, like:

        rdiff-backup /home/ /backup/home

    Then every week I want to make an offsite rsync backup over sshfs, like:

        rsync -avz /home/server/backup/home /backup/server-home/

    This gives me the following errors:

        Fatal Error: Previous backup to /backup/server-home/. seems to have failed.
        Rerun rdiff-backup with --check-destination-dir option to revert directory
        to state before unsuccessful session.

    Does anybody have a good solution for these errors / this situation?
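
    For what it's worth, the quoted error comes from rdiff-backup (it is seeing a destination that looks like an interrupted rdiff-backup session), and it already names the recovery step. A sketch, using the path from the error message itself:

        rdiff-backup --check-destination-dir /backup/server-home/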

    Read the article

  • rsync not writing files

    - by Cyrcle
    I'm trying to set up rsync to back up a remote directory to my local drive. I cd to the directory that I want to pull the files into, then enter:

        rsync -vrtW [email protected]:~/public_html

    I enter the password, then it starts running. I get all the files listed, but none of them actually transfer. What am I missing? Thanks
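
    One thing worth checking (a sketch using the obfuscated host from the question): with a single remote argument and no destination, rsync only lists the remote files, much like ls, instead of copying them. Adding the local destination (here the current directory) makes it actually transfer:

        rsync -vrtW [email protected]:~/public_html .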

    Read the article

  • Backup with bash and rsync...

    - by Roger
    Is there a way to auto-rename an existing file on the receiver? For example: if a filename already exists, automatically rename it to something like filename_001, filename_002 and so on. So far all I have is this:

        $ rsync -rh --progress --stats --exclude '.thumb' \
            --update --perms /origin /destination

    By the way, I know rsync has --ignore-existing to "skip updating files that exist on receiver", but I guess what I need would be something like --rename-existing.
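
    There is no --rename-existing, but the --backup family gets close: before a file on the receiver would be overwritten, the old copy is renamed with a suffix (or moved aside into a directory given by --backup-dir). Note this keeps one renamed copy per run rather than an incrementing _001/_002 counter. A sketch, keeping the question's paths:

        # Keep the previous version as filename_YYYY-MM-DD before overwriting
        rsync -rh --progress --stats --exclude '.thumb' --update --perms \
            --backup --suffix="_$(date +%F)" /origin /destination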

    Read the article

  • rsync delete remote duplicates

    - by BlakBat
    I'm trying to delete remote duplicate files without transferring the non-existing files and without updating the existing files. If I specify both --existing and --ignore-existing (along with "-av --remove-source-files"), the operation is a no-op: nothing is transferred, but nothing is deleted either. The best I've got so far is to make a local copy of the destination, use rsync without --ignore-existing, and then rsync my local copy back on top of the destination.

    Read the article

  • Backing up Windows machines using rsync over SSH

    - by user38118
    We have a number of Windows XP / Windows 7 machines which need to be backed up nightly to a Linux file server. We would like to do it with rsync and rsnapshot, as that's what we're already familiar with from the rest of our Linux/FreeBSD machines. We tried DeltaCopy, but it proved troublesome: lots of problems getting it to log in via SSH automatically, and the Windows Scheduled Tasks seem to fail often. Is there a reliable way or application that can back up Windows machines via rsync over SSH to a Linux server?

    Read the article

  • Rsync with a list of variables

    - by EMKA
    I am trying to write a bash script that will rsync only a specific subset of folders. I am trying to figure out something slicker, so that I can just add a variable such as FOLDER1='name of folder in home directory' and then run

        rsync -arvz --delete /home/emka/$FOLDER1/ /home/emka/Desktop/Mount/$FOLDER1

    Currently I have FOLDER1 through FOLDER13, but I do not want to repeat the above line thirteen times. Could someone give me a push on how to do this?
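
    A minimal sketch of one way to do it (the folder names are placeholders): keep the folder names in a bash array and loop over it instead of numbering thirteen variables.

        #!/bin/bash
        # Folders under /home/emka to sync; the names here are placeholders.
        folders=(
            "Documents"
            "Pictures"
            "projects"
        )

        for f in "${folders[@]}"; do
            rsync -arvz --delete "/home/emka/$f/" "/home/emka/Desktop/Mount/$f"
        done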

    Read the article

  • Complex includes/excludes with rsync

    - by brianmathis
    I'm trying to work out the rsync filter syntax to perform complex includes/excludes, aiming for the following:

        Include /
        Exclude /home
        Include /home/user1/*

    I've tried many variations on the filter syntax, and despite reading the man page many times, I cannot get this sort of effect. Rsync filters seem to be very powerful, and I find it hard to believe they couldn't handle a common scenario such as this.
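
    A sketch of filter rules that would typically express this (assuming the transfer is rooted at /; "user1" is from the question and the destination path is a placeholder): rules are checked in order and the first match wins, so the /home directory itself and the wanted subtree must be included before the rest of /home is excluded.

        rsync -av \
            --include="/home/" \
            --include="/home/user1/" \
            --include="/home/user1/**" \
            --exclude="/home/*" \
            / /path/to/destination/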

    Read the article

  • Has anyone achieved true differential sync with rsync in ESXi?

    - by Julius
    Berate me later on the fact that I'm using the service console to do anything in ESXi... I've got a working rsync binary (v3.0.4) that I can use in ESXi 4.1U1. I tend to use rsync over cp when copying VMs or backups from one local datastore to another local datastore. I've used rsync to copy data from one ESXi box to another, but that was just for small files. I'm now trying to do true differential syncs of backups taken via ghettoVCB between my primary ESXi machine and a secondary one. But even when I do this locally (one datastore to another datastore on the same ESXi machine), rsync appears to copy the files in their entirety. I've got two VMDKs totalling 80GB in size, and rsync still takes anywhere between 1 and 2 hours, but the VMDKs aren't growing that much daily. Below is the rsync command I'm executing. I am copying locally because ultimately these files will get copied onto a datastore created from a LUN on a remote system. It's not an rsync that'll be serviced by an rsync daemon on a remote system.

        rsync -avPSI VMBACKUP_2011-06-10_02-27-56/* VMBACKUP_2011-06-01_06-37-11/ \
            --stats --itemize-changes --existing --modify-window=2 --no-whole-file

        sending incremental file list
        >f..t...... VM-flat.vmdk
          42949672960 100%   15.06MB/s    0:45:20 (xfer#1, to-check=5/6)
        >f..t...... VM.vmdk
                  556 100%    4.24kB/s    0:00:00 (xfer#2, to-check=4/6)
        >f..t...... VM.vmx
                 3327 100%   25.19kB/s    0:00:00 (xfer#3, to-check=3/6)
        >f..t...... VM_1-flat.vmdk
          42949672960 100%   12.19MB/s    0:56:01 (xfer#4, to-check=2/6)
        >f..t...... VM_1.vmdk
                  558 100%    2.51kB/s    0:00:00 (xfer#5, to-check=1/6)
        >f..t...... STATUS.ok
                   30 100%    0.02kB/s    0:00:01 (xfer#6, to-check=0/6)

        Number of files: 6
        Number of files transferred: 6
        Total file size: 85899350391 bytes
        Total transferred file size: 85899350391 bytes
        Literal data: 2429682778 bytes
        Matched data: 83469667613 bytes
        File list size: 129
        File list generation time: 0.001 seconds
        File list transfer time: 0.000 seconds
        Total bytes sent: 2432530094
        Total bytes received: 5243054

        sent 2432530094 bytes  received 5243054 bytes  295648.92 bytes/sec
        total size is 85899350391  speedup is 35.24

    Is this because ESXi is itself making so many changes to the VMDKs that, as far as rsync is concerned, the entire file has to be retransmitted? Has anyone actually achieved a true differential sync with ESXi?
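
    For what it's worth, the stats above suggest the delta algorithm did engage (about 2.4 GB of literal data against roughly 83 GB of matched data), so most of the time goes into reading both copies of each 40 GB VMDK and rewriting the destination. A hedged sketch of a variant that at least avoids rebuilding the whole destination file; --inplace is only appropriate when nothing else is using those backup files during the sync:

        # --inplace updates the existing destination file instead of building a
        # full temporary copy; older rsync (such as 3.0.4) does not allow
        # --sparse together with --inplace, so -S is dropped here.
        rsync -avPI --inplace --existing --modify-window=2 --no-whole-file \
            --stats --itemize-changes \
            VMBACKUP_2011-06-10_02-27-56/* VMBACKUP_2011-06-01_06-37-11/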

    Read the article

  • rsync useful w/ encrypted files?

    - by barrycarter
    Is rsync efficient for transferring encrypted files? More specifically:

        1. I encrypt 'x' with my public key and call the result 'y'.
        2. I rsync 'y' to my backup server.
        3. 'x' changes slightly.
        4. I encrypt the modified 'x' and rsync the modified 'y' to my backup server.

    Is this efficient? I know a small change in 'x' yields a large change in 'y', but is the change localized? Or has 'y' changed so thoroughly that rsync is not much better than scp? I currently back up my "critical" files by tarring/bzipping them nightly, then encrypting the .tar.bz file and rsyncing it to my backup server. Many of the individual files don't change, but, of course, the tar file changes if even one of the files changes. Is this efficient? Should I be encrypting and backing up each file individually? That way, unchanged files will take no time to rsync.
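
    A minimal sketch of the per-file variant proposed at the end (the gpg recipient, paths and flat output layout are placeholders/simplifications): each file is encrypted on its own, and only re-encrypted when the plaintext is newer than the existing ciphertext, so files that have not changed keep the same ciphertext and rsync skips them entirely.

        # Encrypt each changed file individually, then sync the encrypted tree.
        mkdir -p ~/backup-encrypted
        find ~/critical -type f | while read -r f; do
            out=~/backup-encrypted/"$(basename "$f").gpg"
            # Re-encrypt only when the source is newer than the existing ciphertext
            [ "$f" -nt "$out" ] && gpg --yes --output "$out" \
                --encrypt --recipient me@example.com "$f"
        done
        rsync -av ~/backup-encrypted/ backupserver:/backups/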

    Read the article

  • Rsync module path needs to be a home directory

    - by Malfist
    I'm trying to use rsync to back up Windows servers to an rsync server. I'm having problems with rsync on the Linux side, though: it doesn't like symlinks. Currently I'm trying to use the module path ~/backup, but rsync says that the chroot failed. I looked it up and saw that I needed to add the options "use chroot = no" and "munge symlinks = no". That fixed "@ERROR: chroot failed", but now it's telling me "@ERROR: chdir failed", and the log files say that there is no ~/backup directory. I know the user I'm authenticating as has a backup folder in his home directory. How can I fix this? For reference, I'm using a .NET port of rsync called NetSync and tunneling it over a port-forwarded SSH connection created with Granados.
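
    One detail worth sketching (the module name, user name and path below are placeholders, and the auth lines are purely illustrative): an rsync daemon module's path is not expanded through the authenticating user's shell, so ~ does not point at that user's home directory; spelling out the absolute home path in rsyncd.conf usually avoids the chdir error.

        # /etc/rsyncd.conf -- module path spelled out absolutely
        [backup]
            path = /home/backupuser/backup
            use chroot = no
            munge symlinks = no
            read only = no
            auth users = backupuser
            secrets file = /etc/rsyncd.secrets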

    Read the article
