Search Results

Search found 498 results on 20 pages for 'sftp'.


  • sftp Bad message - (badly formatted packet or protocol incompatibility)

    - by culter
    I have two servers connected through SFTP. When I upload the file DONATE_SPLATNOSTSFRB-1503_20120315.xls.gpg via WinSCP, it works fine, but when I change the file name to DONATE_SPLATNOSTSFRB-1503_20120315.gpg it sometimes uploads to the server and sometimes does not. When it does upload, I have trouble deleting it and get this error message:

        Bad message - (badly formatted packet or protocol incompatibility)
        Error code: 5
        Error message from server: Bad Message
        Request code: 13

    Other files work fine, e.g. DONATE_PREDSFRB-0212_20120315.gpg. Thank you.


  • Fastest SFTP client

    - by Stan
    Protocol: SFTP (port 22). I've tested CuteFTP, FileZilla, SecureCRT, and several others. CuteFTP seems to have the best throughput, usually 200%-400% of what the others achieve. I've also read here that Secure FTP may have slower transfer rates. Can anyone explain why CuteFTP has better throughput? And is there any other FTP client even faster than CuteFTP? Thanks a lot!


  • Redirect user to a directory upon SFTP connection

    - by Gabriel A. Zorrilla
    Hi guys. I have an FTP user that I log in to the server with, via SFTP and a key file. Right, lovely, works like a charm. Now, by default the user lands in /home/ftp, which is not cool. Is there a way to drop the user, right upon connection, into a directory such as /var/www/site.com/public/files/ instead?
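
    For reference, a minimal sketch of one approach, assuming the account (here called "ftpuser", a placeholder) may simply have its home directory relocated: point the home at the upload area so SFTP sessions land there.

        # Hypothetical user and path; run while the user is logged out.
        sudo usermod -d /var/www/site.com/public/files/ ftpuser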


  • Load Balancer sftp persistence when a server goes offline

    - by Cobra Kai Dojo
    Let's say we have the following scenario: two identical *nix servers using a shared filesystem. We connect through SFTP (not FTPS) to one of them to upload a file to the shared filesystem, the server goes offline, and we get redirected to the second system, which is still available. My question is: would there be any connection persistence, or would the user have to log in again? I guess a new login would be needed, because the SSH sessions are not shared between the two systems... Thanks in advance :)


  • How do I limit concurrent sftp / port forwarding logins

    - by Kyoku
    I have SSH set up so my users can only access SFTP and port forwarding; how can I limit the number of concurrent logins on a per-user basis? In my sshd_config I have UsePAM set to yes, and in /etc/security/limits.conf I have:

        username - maxlogins 1

    I also tried:

        username hard maxlogins 1

    Neither of these works, and the users can still log in multiple times.
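
    For reference, maxlogins is enforced by pam_limits, so one hedged check is that the module actually sits in sshd's PAM session stack (Debian-style path assumed below). A known caveat: maxlogins counts utmp entries, and SFTP or forwarding-only sessions allocate no tty, so they may never be counted at all.

        # In /etc/pam.d/sshd (path is distro-dependent):
        session    required     pam_limits.so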


  • Allow SFTP to a single folder not in home directory

    - by Brandon
    I have a web server that I use to host my websites, all of which live in folders under /srv/www. There is a WordPress site that I want to give another developer SFTP access to. How would I go about this so that they only have access to /srv/www/thesite.com and not to any other directories? Running Ubuntu 9.04.
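
    A minimal sketch of the usual OpenSSH approach (ChrootDirectory needs OpenSSH 4.9 or later; "devuser" is a placeholder), appended to /etc/ssh/sshd_config. The chroot target and every path component above it must be root-owned and not group- or world-writable, so the developer would do their writing inside a subdirectory of the site:

        Match User devuser
            ChrootDirectory /srv/www/thesite.com
            ForceCommand internal-sftp
            AllowTcpForwarding no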


  • SFTP logging: is there a way?

    - by Darryl Hein
    I'm wondering if there is a way to log the commands the server receives. It can be all SSH commands, as long as the log includes the commands related to file transfer. I'm having issues with an SFTP client, and its creator is asking for logs, but I am unable to find any existing ones. I'm looking to log this on either CentOS or OS X (although I suspect that if it's possible, it'd be similar on both).
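
    For what it's worth, OpenSSH's sftp-server takes log-level and syslog-facility flags; a sketch for sshd_config, assuming a typical Linux binary path (the binary lives elsewhere on OS X):

        # Log each SFTP operation at INFO level to the AUTH facility;
        # messages then land wherever syslog routes auth traffic.
        Subsystem sftp /usr/lib/openssh/sftp-server -l INFO -f AUTH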


  • SFTP vs "FTP over SSH" in C#

    - by sundar venugopal
    I am looking for an SFTP (SSH File Transfer Protocol) client library for C#. I came across http://sourceforge.net/projects/sharpssh and http://granados.sourceforge.net/ and will use one of them after exploring. While trying to understand the basics, I came across the help text at http://en.wikipedia.org/wiki/FTP_over_SSH#FTP_over_SSH_.28not_SFTP.29, which is confusing. Could someone explain the difference between SFTP and FTP over SSH? No library seems to support "FTP over SSH", if it really is something different.


  • SFTP in Android

    - by Mr. Kakakuwa Bird
    Hi. I want to use SFTP in my Android project. Can anybody tell me whether Android already has an SFTP library, or whether I have to implement one myself? If anybody has already built an SFTP client on Android, please point me to it. Thanks in advance.


  • SFTP Libraries for .NET

    - by John Doe
    Can anyone recommend a good SFTP library to use? Right now I'm looking at products such as SecureBlackbox, IPWorks SSH, WodSFTP, and Rebex SFTP. However, I have never used any SFTP library before so I'm not sure what I'm looking for. If anyone has used these before, is there any reason why I should go with product "X" over "Y"? Thanks!


  • Lock down SFTP access on OpenSolaris

    - by Simon
    Hi all, I have an OpenSolaris 2009.06 server and I'd like to enable a user to remotely change files in a specific directory, ideally via SFTP or FTP-via-SSH. This user does not yet have an account on the machine and I'd like to create it so it's as restricted as possible. Is there a canonical way of doing this? I know about OpenSolaris' role-based access control and authorizations model, but I figure it's a lot of work (i.e., a lot I can mess up) to really lock down a full-blown user account (prevent fork bombs, make sure there's really no other file in the file system which can be written to...). Any hint is greatly appreciated. Thanks, Simon


  • SFTP: file symlinks in a jailed (chrooted) directory

    - by Kevin Duke
    I'm trying to set up SFTP so that a few trusted people can access/edit/create some files. I have jailed a user into their home directory (/home/name) but have run into a problem. The VPS is also a game server, web host, etc., and I want these people to have full control of files outside their jailed directory as well. I tried making a symlink (ln -s) to the desired directory, but inside the jail it does not work, as expected. I then tried cp -rl on the files I wanted to share, and that partly worked: they can edit the files in their directory, and the changes show up in the originals stored outside the jail. But when they create new files, nothing appears outside the jail. I know I'm probably not doing this the "right way", but what can I do to get what I want?
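
    Symlinks cannot point outside a chroot because they are resolved inside it; a bind mount is the usual workaround. A sketch with illustrative paths:

        # Make /var/www appear inside the jail (one-off):
        sudo mount --bind /var/www /home/name/www
        # To persist across reboots, an /etc/fstab entry such as:
        # /var/www  /home/name/www  none  bind  0  0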


  • Secure PHP environments with PHP-FPM and SFTP

    - by pdd
    I'd like to set up secure environments for a small number of untrusted PHP websites on a Debian server. Right now everything runs on the same Apache2 with mod_php5, with vsftpd for administrative file access, so there is room for improvement. The idea is to use nginx instead of Apache, SFTP through OpenSSH instead of vsftpd (chrooted in sshd_config), and individual users for each website with their own pool of PHP processes. All these users and nginx are part of the same group. In theory I can then set 700 permissions on all PHP scripts and 750 on the static files that nginx has to serve up. Theoretically, if a website is compromised, all the other users' data is safe, right? Are there better solutions that require less setup time and memory per website? Cheers
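
    As a sketch of the per-site pool idea (file path, names, and sizes below are placeholders), each site runs PHP as its own user on its own Unix socket, which nginx then addresses via fastcgi_pass:

        ; /etc/php5/fpm/pool.d/site1.conf
        [site1]
        user = site1
        group = sites
        listen = /var/run/php5-fpm-site1.sock
        listen.owner = www-data
        listen.group = www-data
        pm = dynamic
        pm.max_children = 4
        pm.start_servers = 1
        pm.min_spare_servers = 1
        pm.max_spare_servers = 2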


  • Openssh sftp-server: .filepart support?

    - by Guillaume Bodi
    I am trying to set up an SFTP server running off Ubuntu Server 11.04. I installed openssh-server to provide SSH access. What I am trying to do is make file uploads arrive with a suffix (.filepart or whatever), which would be removed upon transfer completion. The flow idea is:

    1. The user uploads cat.jpg.
    2. The server starts writing cat.jpg.filepart in the destination directory.
    3. Once the upload completes, the server trashes the previous cat.jpg (if any) and renames cat.jpg.filepart to cat.jpg.

    This is to make sure that incomplete file uploads do not overwrite the existing files. Any idea on how I can do this? Thanks


  • FTP vs SFTP vs FTPS

    - by susmits
    We're setting up a web server at our workspace. In conjunction, we're planning to install an FTP server, but I'm stuck on which protocol to employ: FTP, SFTP, or FTPS. I googled around trying to see what each protocol offers, coming across articles like this, but I can't make up my mind. Only simple, once-in-a-while file transfer is desired; however, security is a concern, since the file server is intended to be accessible from the internet. Which protocol is the most apt for my use, and why?


  • Unix: Sync directory with FTP or SFTP directory

    - by Svish
    I have a website on my local computer running Mac OS X. I am wondering if there is any built-in command that I can run in the Terminal to upload that website to my web server, either through FTP or, if possible, SFTP. Installing new commands through MacPorts is also a possibility. A big bonus would be if it only uploaded the files that need to be updated and not everything else. It would also be nice if I could tell it, once in a while, to delete the files on the server that no longer exist locally. Any good tips?
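
    One commonly used sketch: rsync ships with OS X, runs over SSH, transfers only changed files, and can prune remote files that no longer exist locally (host and paths are placeholders; the remote end needs rsync installed and shell access, not an SFTP-only account):

        # Trailing slash on the source syncs its contents, not the
        # directory itself; --delete removes remote-only files.
        rsync -avz --delete ~/Sites/mysite/ user@webserver:/var/www/mysite/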


  • Symbolic link and FileZilla over SFTP

    - by Doc
    I'm pretty new to Debian, and I'm trying to set up a server. I created a user which can only access its own folder, /home/username (and its subdirectories). Now I want to use that user for the web server I set up, so I gave it access to /var/www, but I can't see /var/www through SFTP, even though I made a symbolic link like this:

        root@server:/home/username# ln -s /var/www www
        root@server:/home/username# cd www
        root@server:/home/username/www# chown username:username *

    Now, with FileZilla, I can see the www folder, but when I try to open it I get an error. Where am I going wrong? Sorry for my awful English; I hope you can understand my problem...


  • Launch script after SFTP disconnect

    - by Mates
    I'm currently using Caja (basically the same as Nautilus) to connect to my server over SSH and work with files. What I'm looking for is a way to launch a simple script when I disconnect. I can launch a script after disconnecting from the TTY by putting it into ~/.bash_logout, but that is not executed when disconnecting from a file manager. The only idea I have is to set up a cron job which would periodically check for existing sftp-server or sshd processes and launch the script when no such process is running. Is there an easier way to do this?
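
    A cron-able sketch of that fallback idea (username, hook path, and process name are assumptions; note that sessions served by OpenSSH's internal-sftp show no separate sftp-server process):

        #!/bin/sh
        # Run every minute from cron, e.g.:
        #   * * * * * /usr/local/bin/sftp-watch.sh
        # Remember that a session was seen, then fire the hook once
        # when the last sftp-server process for the user disappears.
        FLAG=/tmp/mates-sftp-active
        if pgrep -u mates -x sftp-server >/dev/null; then
            touch "$FLAG"
        elif [ -e "$FLAG" ]; then
            rm -f "$FLAG"
            /home/mates/on-sftp-disconnect.sh
        fi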


  • SFTP (or similar) server automated setup for group spaces

    - by spikeheap
    I need to build a dedicated machine which will be used to allow our clients to upload and download files in a secure manner. Each client has multiple users, and I would rather not hand out generic client accounts shared by multiple people. Each client should have access to their own files only, and no others. There is no use case (yet) for multiple clients interacting with a single file or space. Is there an existing solution for automating the creation and maintenance of these accounts, preferably with a view to integration with LDAP? Currently it looks like if we want to use SFTP with chrooted spaces, they will need to be set up manually (or the automation hand-rolled, as sketched below). If a solution exists for a different (but still secure) transfer method, such as FTPS, I'm all ears.
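
    Absent a packaged tool, a hand-rolled sketch of the per-client steps (all names and paths are placeholders; it presumes an sshd_config Match Group block that chroots members of "sftponly" to their home directories):

        #!/bin/sh
        # Create one chrooted client account with a writable subdirectory.
        CLIENT=acme
        BASE=/srv/sftp/$CLIENT
        useradd -d "$BASE" -s /usr/sbin/nologin -G sftponly "$CLIENT"
        mkdir -p "$BASE/files"
        chown root:root "$BASE"          # chroot root must be root-owned
        chmod 755 "$BASE"
        chown "$CLIENT": "$BASE/files"   # the client writes in here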


  • Get Directory Size through SFTP

    - by Nongo
    A client has a managed web server; on this server is an e-commerce script, and part of the script dumps a backup every week. The backup is stored on the web server (not under the HTTP root). The ISP takes a copy, and my client wants to take a copy too. Before downloading the file, I want to be able to calculate the size of the backup directory, but the only access I have is through SFTP. Is it possible to easily get the directory size and then use it in a PowerShell script? Note: I have already written an automatic download script in PowerShell, and I want to extend it. Forgive me if this sounds vague; I can provide further info if you have any specific questions.
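
    If the account is restricted to the SFTP subsystem, du is unavailable on the remote side, but the listing can be summed client-side; a rough shell sketch (host and path are placeholders, and it ignores the contents of subdirectories). The PowerShell script could shell out to something like this, or sum an SFTP library's directory listing the same way:

        # Sum the size column (field 5) of a long listing.
        echo "ls -l /backups" | sftp -b - user@server \
            | awk '$5 ~ /^[0-9]+$/ {sum += $5} END {print sum}'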


  • How to remotely open gedit with SFTP URL in Gnome through SSH?

    - by Álvaro Justen
    My setup is weird and I can't change it now. I have two machines: local-machine (my desktop, running Ubuntu with GNOME) and remote-machine (a virtual machine, also running Ubuntu but without X). On both machines I have my private and public SSH keys. I need to SSH from remote-machine to local-machine and run gedit on local-machine (under the default $DISPLAY), opening a file that lives on remote-machine through SFTP. Something like this:

        myuser@remote-machine:~$ ssh local-machine "DISPLAY=:0.0 gedit sftp://remote-machine/some/file"

    The command above doesn't work; gedit shows this message:

        Could not open the file sftp://remote-machine/some/file.
        gedit cannot handle sftp: locations.

    Note that:
    - /some/file exists on remote-machine.
    - I can SSH normally from remote-machine to local-machine using my SSH key without any problems!
    - I can run DISPLAY=:0.0 gedit sftp://remote-machine/some/file in a terminal on local-machine, and gedit opens the file on remote-machine without any problems; but that terminal is running in DISPLAY :0 (really, it's gnome-terminal).
    - I also tried the -t option of the SSH client (to force pseudo-tty allocation), but it didn't help.
    - If I run DISPLAY=:0.0 gedit sftp://remote-machine/some/file on local-machine but from a plain tty (for example tty1, by pressing <Ctrl>+<Alt>+<F1>), it does not work either; I get the same error as when running from remote-machine.

    I found that if I pass the environment variable DBUS_SESSION_BUS_ADDRESS with a correct value, it works! So if I do something like this:

        myuser@local-machine:~$ env | grep DBUS_SESSION_BUS_ADDRESS > env.txt
        myuser@local-machine:~$ scp env.txt remote-machine:

    and then:

        myuser@remote-machine:~$ ssh local-machine "DISPLAY=:0.0 $(cat env.txt) gedit sftp://remote-machine/some/file"

    it works! The problem is that I'm not on local-machine, so I can't get the correct value for this environment variable. Is there any other way to make this work?
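
    One hedged workaround: recover the variable on local-machine itself, from the environment of a process that is already running inside the desktop session (gnome-session is assumed here to be running for myuser):

        myuser@remote-machine:~$ ssh local-machine '
            PID=$(pgrep -u myuser -n gnome-session)
            export $(tr "\0" "\n" < /proc/$PID/environ | grep ^DBUS_SESSION_BUS_ADDRESS)
            DISPLAY=:0.0 gedit sftp://remote-machine/some/file'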


  • SFTP only works occasionally

    - by 82din
    I suddenly get this error using SFTP:

        Status: Connecting to example.com...
        Response: fzSftp started
        Command: open "[email protected]" 22
        Command: Pass: *********
        Status: Connected to example.com
        Status: Retrieving directory listing...
        Command: pwd
        Response: Current directory is: "/root"
        Command: ls
        Status: Listing directory /root
        Error: Connection timed out
        Error: Failed to retrieve directory listing

    I tried FileZilla, Cyberduck, and the shell (Terminal), with the same result. However, it worked fine today (just for a few seconds) in passive mode. I guess something changed in my network, so I have tried both active and passive modes:

        Connecting to probe.filezilla-project.org
        Response: 220 FZ router and firewall tester ready
        USER FileZilla
        Response: 331 Give any password.
        PASS 3.6.0.2
        Response: 230 logged on.
        Checking for correct external IP address
        Retrieving external IP address from http://checkip.dyndns.org:8245/
        Checking for correct external IP address
        IP <external IP> big-bf-ccc-f
        Response: 200 OK
        PREP 49565
        Response: 200 Using port 49565, data token 380352881
        PORT 186,15,222,5,193,157
        Response: 200 PORT command successful
        LIST
        Response: 150 opening data connection
        Response: 503 Failure of data connection.
        Server sent unexpected reply.
        Connection closed

    Because I'm working behind a router, I get my external IP from http://checkip.dyndns.org:8245/. I have also tested different ranges of ports.


  • Can't get rsync over sftp to work

    - by Patrik
    I'm trying to set up a backup system from an Ubuntu server to a Synology NAS (DS413j) using rsync and SFTP. I have created a user for this that we can call ubuntu-backup, with a directory in its home directory called www where the backup will be saved. I have enabled Network Backup in DSM, and the user ubuntu-backup has full access to its home directory. Here is my rsync config file on the Synology NAS:

        #motd file = /etc/rsyncd.motd
        #log file = /var/log/rsyncd.log
        pid file = /var/run/rsyncd.pid
        lock file = /var/run/rsync.lock
        use chroot = no

        [NetBackup]
        path = /var/services/NetBackup
        comment = Network Backup Share
        uid = root
        gid = root
        read only = no
        list = yes
        charset = utf-8
        auth users = root
        secrets file = /etc/rsyncd.secrets

        [ubuntu-backup]
        path = /volume1/homes/ubuntu-backup/www
        comment = Ubuntu Backup
        uid = ubuntu-backup
        gid = users
        read only = false
        auth users = ubuntu-backup
        secrets file = /etc/rsyncd.secrets

    The permissions on /volume1/homes/ubuntu-backup/www are ubuntu-backup:users 777. Here is the command I'm running:

        rsync -aHvhiPb /var/www/ [email protected]:./

    The result:

        sending incremental file list
        ERROR: module is read only
        rsync error: syntax or usage error (code 1) at main.c(1034) [Receiver=3.0.9]
        rsync: connection unexpectedly closed (9 bytes received so far) [sender]
        rsync error: error in rsync protocol data stream (code 12) at io.c(605) [sender=3.0.9]

    If I run this instead:

        rsync -aHvhiPb /var/www/ [email protected]

    it looks like it's sending files, with no errors, but I can't find anything on the NAS.

