Search Results

Search found 22866 results on 915 pages for 'ftp client'.

Page 11 of 915

  • Error with FTP since binding via httpcfg

    - by Linda
    I was in a similar position to this question and bound two IP addresses using httpcfg. Since doing this, FTP does not seem to be working on IIS6 in Windows Server 2003. Any ideas what could be wrong? The command I ran was:
        httpcfg set iplisten -i xxx.xxx.x.x
    I get the following when I try to connect via FileZilla:
        Error: Connection timed out
        Error: Failed to retrieve directory listing
    The log file is returning the following:
        #Software: Microsoft Internet Information Services 6.0
        #Version: 1.0
        #Date: 2009-08-17 13:54:05
        #Fields: date time c-ip cs-username cs-method cs-uri-stem sc-status sc-win32-status
        2009-08-17 13:54:05 91.85.70.17 Client [1]USER Client 331 0
        2009-08-17 13:54:05 91.85.70.17 Client [1]PASS - 230 0
    In the FTP site settings I have the site pointing to the IP address used with httpcfg and the port set to 21. Update: I can see a directory listing if I connect via the built-in command-line FTP client in Windows Vista. If I try to connect via Windows Explorer, I start in the wrong folder and no files are listed, just directories.

    Read the article

  • Alternatives to FTP

    - by Jack Hickerson
    I need to share files with clients outside of my business, and unfortunately our FTP server is becoming too much of a hassle (with regard to clients' use of an FTP client and creating password-protected downloads based on customized account privileges). Essentially, I need:
        a remote service that mimics an FTP server with a web interface (easy for basic internet users to comprehend)
        over 100 GB of storage
        file transfer size over 2 GB
        customizable user account privileges (password-protected downloads)
        secure storage and data transfer
        preferably less than $100/mo
    I have already looked into some services that almost meet my requirements (StreamFile.com, box.net, onehub.com, filesanywhere.com) - has anyone used a service they would recommend? cheers, jack

    Read the article

  • Mounting FTP as a filesystem in Debian using curlftpfs

    - by Karel Bílek
    I am trying to mount an FTP as a filesystem in Debian using curlftpfs. What I get after running
        curlftpfs -o allow_other username:[email protected] /mnt/myftp/
    is just:
        fuse: failed to open /dev/fuse: Permission denied
    even when run as root. What am I doing wrong? (curlftpfs is at version curlftpfs 0.9.2 libcurl/7.21.0 fuse/2.8) edit: When I run ls -lah /dev/fuse, I see
        crw-rw---- 1 root fuse 10, 229 Apr 9 00:34 /dev/fuse
    ...but even when I add both myself and the root user to the group fuse, neither I (as a user) nor root can mount the FTP; I still see
        fuse: failed to open /dev/fuse: Permission denied
    edit2: Even if I run this fairly insecure and crazy line:
        sudo chmod a+rwx /dev/fuse
    I still get the permission denied message. Which permissions could be denied? edit3: I forgot to mention I am on a VPS with OpenVZ. I thought this was not an issue, but apparently it is! I am adding the resolution as the answer.

    Read the article

  • Incremental backup from a remote FTP box to a Windows server

    - by user65712
    I need to back up a website to a Windows server every week, and I only have access to remote FTP. I'd like to use an incremental backup program so that I can just copy the files every week on a schedule and not worry too much about the size of the backups becoming an issue. Unfortunately, I can't find a Windows program that will automatically make incremental backups of specific FTP folders and files, as most programs are designed to back up to FTP, not from it. Are there any applications that can do this? I also have an Ubuntu 10.04 box I could use to relay the site to the Windows server if I needed to run Linux programs, but I would prefer a Windows-only solution over a Linux/Windows one, and a combined Linux/Windows solution over not having it work at all.
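
    The general pattern can also be scripted directly against the FTP server. Below is a minimal, hedged sketch using Python's ftplib: it lists the remote folder, downloads only files that are missing locally or whose size has changed, and could be run weekly from Task Scheduler. The host, credentials and paths are placeholders, and it assumes the server answers the SIZE command; timestamps (MDTM) would be a more precise check where supported.

        from ftplib import FTP
        import os

        # Minimal incremental-pull sketch; HOST, USER, PASS and the two paths
        # are placeholders, not taken from the original post.
        HOST, USER, PASS = "ftp.example.com", "user", "secret"
        REMOTE_DIR, LOCAL_DIR = "/site", r"C:\backups\site"

        def mirror():
            ftp = FTP(HOST, USER, PASS)
            ftp.voidcmd("TYPE I")              # SIZE is defined for binary mode
            ftp.cwd(REMOTE_DIR)
            os.makedirs(LOCAL_DIR, exist_ok=True)
            for name in ftp.nlst():
                local_path = os.path.join(LOCAL_DIR, name)
                try:
                    remote_size = ftp.size(name)
                except Exception:
                    continue                   # directories or servers without SIZE
                # fetch only new files or files whose size has changed
                if (not os.path.exists(local_path)
                        or os.path.getsize(local_path) != remote_size):
                    with open(local_path, "wb") as fh:
                        ftp.retrbinary("RETR " + name, fh.write)
            ftp.quit()

        if __name__ == "__main__":
            mirror()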

    Read the article

  • Strange FTP issues - some files are not downloaded

    - by FractalizeR
    I have a machine which cannot fetch some files from remote servers by FTP. The machine is powered by CentOS. I tested FTP on three files:
        12.09.2012 21:21    166 007  ll091212.002
        13.09.2012 11:32     23 040  ll091212.003
        13.09.2012 11:50     61 313  ll091212.004
    Of them, I can always successfully download only one - ll091212.004. The two others are downloaded to about 90% (I can see them on disk) and then the FTP transfer hangs without any error messages. I have moved and copied the files around the remote server - no luck. Another machine from the same subnet can download all three of them easily. I just don't know what the reason for this could be.

    Read the article

  • Issues with FTP on my server

    - by homestead
    I installed FTP on my Windows 2003 box and created an FTP site on an IP address. I am not allowing anonymous access. I created a user that has access to the folder c:\ftptest. Connecting to my server using FileZilla shows:
        connecting to -0.0.0.0
        connection established, waiting for welcome message
        could not connect to server
    I tried both active and passive modes. The port on the server is open, i.e. TCP 21. I can connect to other FTP sites, so my local firewall isn't the issue. (Now I know why sysadmin work is so fun!)

    Read the article

  • FTP - 530 Sorry, the maximum number of clients...?

    - by aSeptik
    Hi all! I know this is not properly a code question, but who among you doesn't use an FTP client!? ;-) OK, my problem is that my FTP works great, except when I upload files to one particular client's server! On this server, some files are uploaded fine and others not - they stop while uploading at half their size, then this error is displayed:
        530 Sorry, the maximum number of clients (4) from your host are already connected. Unable to make a connection. Please try again.
    Obviously this is not true; I'm the only one who is uploading! Has anyone had the same experience with this? PS: I have tried many different FTP clients; they all display the same error or just hang up! Thanks

    Read the article

  • Owner of uploads directory is `www-data` but this prevents FTP access via PHP scripts

    - by letseatfood
    To allow write access to Apache, I needed to run chown www-data:www-data /var/www/mysite/uploads on my site's uploads folder. This allows me to delete files from the folder via unlink() in a PHP script. Unfortunately, it prevents another PHP script, which uses FTP functions, from working. I think it is because the FTP user is mike, and now that the uploads directory is owned by www-data, mike cannot access it. I added mike to the group www-data, but this does not fix the issue. Can somebody advise me on how to allow PHP FTP functions to work in addition to file deletion using PHP's unlink() function?

    Read the article

  • IIS FTP - Users Last Logon

    - by Izzy
    How would you determine the last FTP logon time/date for a bunch of local user accounts on a DMZ (standalone/workgroup) server running IIS FTP? I know I could use a log aggregator and sift through it that way, but this server has been operational for approximately 8 years and I don't fancy that vector. I have also tried the scripting route, but this is of no use because the users have never actually logged on to the machine, so there's no profile (rendering the WMI classes Win32_UserAccount and Win32_UserProfile useless). They're just used to access the FTP service. Thanks in advance
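
    Since the IIS FTP service writes W3C-format logs (the same layout shown in the httpcfg question above: date time c-ip cs-username cs-method cs-uri-stem sc-status sc-win32-status), one low-tech option is to scan those files and keep the latest timestamp of each successful PASS (status 230) per account. A rough sketch, assuming the default MSFTPSVC log location and field order:

        import glob
        import os

        # The log directory is an assumption (IIS 6 default for the first FTP
        # site); adjust to the path configured on the server.
        LOG_DIR = r"C:\WINDOWS\system32\LogFiles\MSFTPSVC1"

        last_logon = {}
        for path in glob.glob(os.path.join(LOG_DIR, "*.log")):
            with open(path) as fh:
                for line in fh:
                    if line.startswith("#"):
                        continue                      # skip header lines
                    parts = line.split()
                    if len(parts) < 7:
                        continue
                    date, time_, user = parts[0], parts[1], parts[3]
                    method, status = parts[4], parts[6]
                    # a 230 reply to PASS marks a successful logon
                    if "PASS" in method and status == "230":
                        stamp = date + " " + time_
                        if stamp > last_logon.get(user, ""):
                            last_logon[user] = stamp

        for user, stamp in sorted(last_logon.items()):
            print(user, stamp)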

    Read the article

  • Adding FTP Users via Simple Control Panel

    - by Aristotle
    I just got set up with CentOS yesterday through GoDaddy, and today I'm trying to get into the server and start setting up some of my projects. I'm able to get in through PuTTY just fine, but I'm not able to connect through FTP with the same (root) account. I'm using Simple Control Panel, and have ensured that "Enable Server" is checked beneath System Configuration > FTP. Further, I've checked, and double-checked, that my root password is correct when providing the FTP details. Is there some other common setting I'm missing here that will prevent me from getting connected to the server to begin transferring files?

    Read the article

  • The FTP commands ls and get are not working in VMware

    - by mnish
    Hi, I am using VMware Player version 3.1 to boot a Minix 3 OS image. After booting the Minix OS I want to get some files from a server using FTP. The FTP connection to the server works, but when I use the commands "ls" or "get" nothing happens, except that it says "200 PORT command successful" and then hangs there. The only thing I can do after typing ls+enter or get+enter is to exit the FTP client with ctrl+c. Does anyone know a solution to this? Please help. Thank you
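
    Hanging right after "200 PORT command successful" is the classic signature of active-mode FTP: the server accepts the PORT command but cannot open the data connection back into the guest (NAT in the VM usually blocks it), so listings and transfers stall. Passive mode keeps both connections outbound from the client. As a general illustration of the difference (not Minix-specific), a minimal Python ftplib sketch with placeholder host and credentials:

        from ftplib import FTP

        # ftplib defaults to passive mode; set_pasv(True) just makes it explicit.
        # In passive mode the client opens the data connection, so NAT between
        # the VM and the server is not a problem.
        ftp = FTP("ftp.example.com", timeout=10)
        ftp.login("anonymous", "guest@example.com")
        ftp.set_pasv(True)
        print(ftp.nlst())                      # directory listing ("ls")
        with open("somefile.txt", "wb") as fh:
            ftp.retrbinary("RETR somefile.txt", fh.write)   # "get"
        ftp.quit()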

    Read the article

  • Create FTP accounts with access to just some folders in the web directory

    - by Karevan
    I own a VPS server. At the moment I haven't installed any FTP server on it; I am using SSH and SFTP only. I am using Debian 6 Squeeze and the Apache2 service. The web directory is /var/www/. Well, I wanted to create different FTP accounts and give some people access to them (one account per user). In my web directory I have a structure like this:
        /var/www/mtaplugins/music/mplayer/music/
        /var/www/mapuploader/
    and more folders inside. I want to create an FTP account which should be able to access just one of those folders and the folders inside it. I would appreciate some recommendations or steps to follow before installing or doing anything, because I don't have any idea about this. I was thinking of using ProFTPd, but as I saw in the documentation it would just create an account for each user on my server, and I don't want to create more users (I always use root). Thanks in advance

    Read the article

  • Unable to FTP, any ideas?

    - by Nick
    I'm using Windows Server 2003. I have the FTP services installed, the router set to DMZ, and currently anonymous logins allowed. (I know, security risk, but there's nothing important on there and I'm not worried at the moment.) So here's the thing... I can FTP to my computer, list the directory, get files etc., BUT only if I'm using the command prompt. If I try to log in using IE or any FTP client it just times out. I've tried:
        username@ipaddress
        ipaddress
        username:password@ipaddress
    and was not able to get any of them to work. Anyone have any ideas? Thanks!!!

    Read the article

  • Setup IIS 7 as FTP Server that is connectable outside of my local network

    - by Usta
    I was able to set up an FTP site that I could access via ftp://127.0.0.1/ or my local (static) IP. To do this I followed these instructions (with the exception that I did not bind to 127.0.0.1 as suggested): http://learn.iis.net/page.aspx/301/creating-a-new-ftp-site-in-iis-7/ I have created a firewall exception for ports 20 and 21, and set up port forwarding on my wireless router. But I can only access the site via localhost, and I need a friend to have read access to it. So how do I enable remote access to it? (I'd rather not purchase a domain name.) My setup:
        IIS 7.5
        Windows 7 Professional
        Wireless Network
        Norton Internet Security 2012
        An Internal Static IP Address

    Read the article

  • FTP server (vsftpd) with web GUI

    - by manutenfruits
    I want to build a file server to let users upload and download mostly multimedia, but also common files. Right now I have an Arch installation with vsftpd, and I'm about to install miniDLNA for multimedia sharing. The only problem is that FTP doesn't seem to fit my needs, because it almost always forces users to install a client such as FileZilla to make the server friendly. I have been looking for a web frontend for vsftpd, but apart from management interfaces there's nothing. I need a frontend accessible from a browser through which users can navigate through the folders in an easier and more elegant way than the plain FTP display that browsers show by default. It should let users upload files and, as an awesome extra, let them play the multimedia directly in the browser. For this, I am willing to dump FTP if needed; I've heard about HTTP file servers but don't know too much about them. I could code everything myself, but there's gotta be something out there already.
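
    To make the "HTTP file server" idea concrete, here is a deliberately bare-bones sketch using only Python's standard library: it serves a directory over HTTP with an auto-generated HTML listing, so users only need a browser. It has no uploads, authentication or media player, so it is a starting point rather than a solution; the directory path and port are placeholders.

        from functools import partial
        from http.server import HTTPServer, SimpleHTTPRequestHandler

        SHARE_DIR = "/srv/share"     # placeholder path to the shared files

        # SimpleHTTPRequestHandler renders directory listings as HTML and
        # serves files read-only; bind to all interfaces on port 8080.
        handler = partial(SimpleHTTPRequestHandler, directory=SHARE_DIR)
        HTTPServer(("0.0.0.0", 8080), handler).serve_forever()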

    Read the article

  • Linux FTP server with virtual users

    - by kjertil
    I know there are already similar questions on this matter, but the answers don't really make much sense to anyone who is not technically comfortable in Linux. I've already tried articles like this one, for example: http://howto.gumph.org/content/setup-virtual-users-and-directories-in-vsftpd/ with the result of accidentally breaking the whole system. The problem is that, while there are several technical possibilities for setting up virtual users with an FTP server, it is not as easy as managing, for instance, a FileZilla server on Windows. I've seen some web-based GUIs, but most of them seem to be out of date. The different flavours of Linux and the large number of popular FTP servers also seem to make the matter more complicated. I guess my question is: is there a way to set up virtual FTP users on Linux without the hassle of having to manually edit PAM, MySQL and config files?

    Read the article

  • AWS EC2: Could not connect FTP client?

    - by heathub
    My server OS: Amazon Linux. I am trying to set up FTP. I have:
        installed vsftpd
        opened ports 20-21
        opened ports 1024-1048
    Basically, I followed each of these steps and started the vsftpd service (the status indicates [ok]). I use FileZilla for my FTP client. Here is my setting/configuration:
        Host: ec2-XX-XX-XXX-XX.compute-1.amazonaws.com
        Port: - (blank, but I have tried 20 and 21 though)
        Server Type: FTP - File Transfer Protocol
        Logon Type: Normal
        Username: (tried root and ec2-user)
        Transfer mode: Tried passive and active
    I always get this error:
        Status: Waiting to retry...
        Status: Resolving address of ec2-XX-XX-XXX-XX.compute-1.amazonaws.com
        Status: Connecting to XX.XX.XXX.XX:21...
        Error: Connection timed out
        Error: Could not connect to server
    Have I missed any configuration/settings? EDIT: After executing /sbin/iptables -L -n, here is the result:
        Chain INPUT (policy ACCEPT)
        target     prot opt source               destination
        Chain FORWARD (policy ACCEPT)
        target     prot opt source               destination
        Chain OUTPUT (policy ACCEPT)
        target     prot opt source               destination
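
    Since the iptables chains are empty (policy ACCEPT), the local firewall is not blocking anything, so a connect timeout on port 21 usually points at the EC2 security group rather than vsftpd itself. A quick, hedged reachability check from the client side - the hostname below reuses the placeholder public DNS name from the post:

        import socket

        # If connect() times out here while vsftpd is running, the security
        # group (or an intermediate firewall) is dropping the traffic.
        HOST = "ec2-XX-XX-XXX-XX.compute-1.amazonaws.com"
        for port in (21, 1024):
            s = socket.socket()
            s.settimeout(5)
            try:
                s.connect((HOST, port))
                print(port, "reachable")
            except OSError as exc:
                print(port, "unreachable:", exc)
            finally:
                s.close()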

    Read the article

  • FTP Error 550 when trying to access a folder via symbolic link

    - by OrangeTux
    I'm configuring vsftpd on a Linux machine. At the moment local users can log in via FTP, they see their home dir listed, and they have write access to it. Now I want the users to be able to write in the /var/www/ dir. Therefore I created a new group, apache, added the users to the group and gave the group write access to /var/www. Via the terminal all users can write to /var/www/. I created a link in the home directory to /var/www via
        ln -s /var/www/ /home/user/www
    ls gives:
        drwxr-xr-x 2 orangetux orangetux 4096 Jun 23 15:06 ftp
        lrwxrwxrwx 1 orangetux orangetux   21 Jun 23 15:00 www -> /var/www/
    But when I use FTP I see the link, yet I cannot follow it: error 550, which means file not found or bad access. How can I solve this, so that the users have access to /var/www via their home dir?

    Read the article

  • Odd FTP ASCII/BINARY transfer behavior after migrating to a new server

    - by Incognita Mundi
    I recently got a dedicated server, and after migrating to the new machine I started noticing problems with file transfers. My FTP client, despite being set to auto, keeps uploading PHP files in binary mode and the content of those files is messed up. Since I mostly upload files of different kinds, it would be annoying to change from binary to ASCII every single time; besides, I never had such problems before. What could be the cause of this behavior? My dedicated server runs CentOS 6.4 and the FTP server is Pure-FTPd. I tried different FTP clients and they all have the same problem, so I assume it is some misconfiguration on the server side. Thanks
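
    For background on what the two modes actually do: ASCII (TYPE A) transfers rewrite line endings between platforms, while binary/image (TYPE I) transfers send the bytes untouched, and "auto" simply guesses per file extension. A small hedged sketch with Python's ftplib (host, login and file names are placeholders) showing how a script can pin the mode explicitly instead of relying on auto-detection:

        from ftplib import FTP

        ftp = FTP("ftp.example.com")
        ftp.login("user", "secret")

        # storbinary() switches the session to TYPE I and uploads byte-for-byte,
        # which is the safe choice for PHP sources as well as images/archives.
        with open("index.php", "rb") as fh:
            ftp.storbinary("STOR index.php", fh)

        # storlines() uses TYPE A and performs CRLF translation on the way out.
        with open("notes.txt", "rb") as fh:
            ftp.storlines("STOR notes.txt", fh)

        ftp.quit()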

    Read the article

  • Monitor a folder and send files via FTP to clients

    - by user73109
    I am looking for software that will monitor a specific folder and, when a file is created in it, send that file off via FTP to a client associated with that folder by the software. I have tried software such as SmartFTP and CuteFTP, and they don't seem to monitor folders very consistently. Some of the options with them were to write scripts to delete duplicated files from the transfer queue. I really don't want to have to write scripts for software I purchase. I am not opposed to scripting, but I feel I shouldn't have to write scripts to make the software properly do something it says it does out of the box. I am currently trying to do this on a Windows XP box, though running on Server 2003 is an option if it would make things easier. I really just want to be pointed in the correct direction; this is all fairly foreign to me.
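
    For what it's worth, the watch-and-send idea is small enough to script directly. Below is a hedged Python sketch (paths, host and credentials are placeholders): it polls one client's folder, uploads anything new with ftplib, then moves the file into a "sent" subfolder so nothing is transferred twice; one copy of the loop runs per client folder. A library such as watchdog could replace the polling with real filesystem events.

        import os
        import time
        from ftplib import FTP

        WATCH_DIR = r"C:\outgoing\clientA"          # folder tied to this client
        SENT_DIR = os.path.join(WATCH_DIR, "sent")  # archive of delivered files
        HOST, USER, PASS = "ftp.clienta.example.com", "user", "secret"

        os.makedirs(SENT_DIR, exist_ok=True)
        while True:
            for name in os.listdir(WATCH_DIR):
                path = os.path.join(WATCH_DIR, name)
                if not os.path.isfile(path):
                    continue
                # upload the new file, then move it out of the watched folder
                with FTP(HOST, USER, PASS) as ftp, open(path, "rb") as fh:
                    ftp.storbinary("STOR " + name, fh)
                os.replace(path, os.path.join(SENT_DIR, name))
            time.sleep(30)                           # poll every 30 seconds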

    Read the article

  • Using the Dropbox API instead of an FTP server

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to make backups of source code/databases on your server, you use a second FTP server, a cronjob to tar.gz your DB dumps and source files, and you send this file to your FTP server from your application server. Dropbox created an API to use its infrastructure. Since they provide 2 GB for free accounts, I thought about being able to upload to it instead of an FTP server. So, if you do some freelance work, you can create a free account for each client and use this approach, maybe encrypting the files you send. You even gain a revision of each sent file, like a revision control system, for free, from the last 30 days. What do you think of this approach? Is it possible? And, more importantly: what are the security risks involved? (That's why I'm asking this on Server Fault, since this POV from sysadmins will be more accurate.) Thanks!
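
    The upload side is only a few lines with the current Dropbox Python SDK (the original post predates this v2 API, so treat the sketch as illustrative). The access token and paths are placeholders; one app/token per client account, and encrypting the archive before upload stays the caller's job. Note that this simple call is limited to files of roughly 150 MB - larger archives need the SDK's upload-session calls.

        import dropbox

        ACCESS_TOKEN = "CLIENT_SPECIFIC_TOKEN"   # placeholder, one per client
        dbx = dropbox.Dropbox(ACCESS_TOKEN)

        # Upload the nightly archive produced by the cronjob; overwrite keeps a
        # single path while Dropbox still records revisions of it.
        with open("/var/backups/site.tar.gz", "rb") as fh:
            dbx.files_upload(fh.read(), "/backups/site.tar.gz",
                             mode=dropbox.files.WriteMode.overwrite)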

    Read the article

  • Stream tar.gz file from FTP server

    - by linker
    Here is the situation: I have a tar.gz file on an FTP server, which can contain an arbitrary number of files. Now what I'm trying to accomplish is to have this file streamed and uploaded to HDFS through a Hadoop job. The fact that it's Hadoop is not important; in the end what I need to do is write some shell script that would take this file from FTP with wget and write the output to a stream. The reason why I really need to use streams is that there will be a large number of these files, and each file will be huge. It's fairly easy to do if I have a gzipped file and I'm doing something like this:
        wget -O - "ftp://${user}:${pass}@${host}/$file" | zcat
    But I'm not even sure if this is possible for a tar.gz file, especially since there are multiple files in the archive. I'm a bit confused about what direction to take for this; any help would be greatly appreciated.
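
    tar itself is a streamable format, so the archive never has to land on local disk. As a hedged illustration of the idea (host, credentials and the archive name are placeholders), the sketch below takes the RETR data socket from ftplib and feeds it straight into tarfile's streaming "r|gz" mode, visiting one member at a time; handing each member on to HDFS is left out:

        import tarfile
        from ftplib import FTP

        ftp = FTP("ftp.example.com", "user", "secret")
        ftp.voidcmd("TYPE I")                          # binary mode for the data channel
        conn = ftp.transfercmd("RETR backup.tar.gz")   # raw data-connection socket

        # "r|gz" is tarfile's non-seeking stream mode: members are read in order
        # as the bytes arrive, so memory use stays bounded.
        with conn.makefile("rb") as stream, tarfile.open(fileobj=stream, mode="r|gz") as tar:
            for member in tar:
                if member.isfile():
                    data = tar.extractfile(member).read()     # or read in chunks
                    print(member.name, len(data), "bytes")

        conn.close()
        ftp.voidresp()                                 # consume the final 226 reply
        ftp.quit()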

    Read the article

  • Copy or move a file from one FTP server to another

    - by Oleg Pavliv
    I have a Java application which copies or moves a bunch of giga files from one FTP server to another. Currently it copies a file from the first FTP server to the local computer (where it runs) using FTP get, and then copies it to the second FTP server using FTP put. I use the net library from Apache. Is it possible to copy it directly from one FTP server to another, bypassing the local computer? One idea is to create a Java telnet session and send a couple of FTP commands. Will it work? Any other suggestions?
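
    Two patterns are worth naming here. True server-to-server transfer is FXP (issue PASV on one server, feed the returned address to a PORT command on the other, then pair RETR with STOR), but both servers must allow it and many disable it. The fallback is to relay the stream through the machine running the code without ever writing it to disk. A hedged sketch of that relay, shown with Python's ftplib purely to illustrate the pattern (the same stream piping is possible with Apache Commons Net); hosts and credentials are placeholders:

        from ftplib import FTP

        src = FTP("ftp.source.example.com", "user1", "pass1")
        dst = FTP("ftp.dest.example.com", "user2", "pass2")

        # Open the download as a raw data socket and feed it directly into the
        # upload; the file streams through this host but never touches its disk.
        src.voidcmd("TYPE I")
        data_sock = src.transfercmd("RETR big-file.bin")
        with data_sock.makefile("rb") as stream:
            dst.storbinary("STOR big-file.bin", stream)
        data_sock.close()
        src.voidresp()          # read the trailing 226 reply from the source

        src.quit()
        dst.quit()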

    Read the article

  • How to give a timeout to an FTP connection

    - by dierre
    The story behind it: an old script written in Ruby 1.8.6 opens a connection to an FTP server and downloads a configuration file. For one specific client with a Windows FTP server, the script just hangs; the log stops writing after it opens the connection to the FTP server. It's an old script, it's in Ruby, and I'm not an expert in it. What I tried: I tried this implementation of a timeout to check whether an FTP connection hangs:
        Timeout::timeout(5) { ftp = Net::FTP.new(host,pass,host) }
    The problem is that this isn't working. My guess is that the interpreter stops on opening the connection and the timeout doesn't kill the connection because the interpreter is stuck. Is it possible that that's the problem? Could you tell me if there is maybe an alternative solution, or if I'm doing something wrong?
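
    The usual alternative to wrapping the call is a socket-level timeout on the FTP object itself, so connect and read operations fail on their own instead of relying on an external timer (newer Ruby versions expose Net::FTP#open_timeout and #read_timeout for exactly this, though Ruby 1.8.6 may not have them). For comparison with the other scripting examples on this page, a hedged Python ftplib sketch of the same idea, with placeholder host and credentials:

        import socket
        from ftplib import FTP

        # The timeout applies to the connect and to every subsequent read, so a
        # silent server raises socket.timeout instead of hanging the script.
        try:
            ftp = FTP("ftp.example.com", timeout=5)
            ftp.login("user", "secret")
            with open("config.ini", "wb") as fh:
                ftp.retrbinary("RETR config.ini", fh.write)
            ftp.quit()
        except socket.timeout:
            print("FTP server did not respond within 5 seconds")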

    Read the article

  • How to delete files older than 7 days from an FTP server with a Python script?

    - by Tom
    Hello, I would like to write a Python script which allows me to delete files from an FTP server after they have reached a certain age. I prepared the script below, but it throws the error message:
        WindowsError: [Error 3] The system cannot find the path specified: '/test123/*.*'
    Does someone have an idea how to resolve this issue? Thank you in advance!
        import os, time
        from ftplib import FTP

        ftp = FTP('127.0.0.1')
        print "Automated FTP Maintainance"
        print 'Logging in.'
        ftp.login('admin', 'admin')

        # This is the directory that we want to go to
        directory = 'test123'
        print 'Changing to:' + directory
        ftp.cwd(directory)
        files = ftp.retrlines('LIST')
        print 'List of Files:' + files
        # ftp.remove('LIST')
        #-------------------------------------------
        now = time.time()
        for f in os.listdir(directory):
            if os.stat(f).st_mtime < now - 7 * 86400:
                if os.directory.isfile(f):
                    os.remove(os.directory.join(directory, f))
        #except:
        #exit ("Cannot delete files")
        #-------------------------------------------
        print 'Closing FTP connection'
        ftp.close()
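
    For context, the os.listdir/os.stat/os.remove calls in the script above operate on the local Windows filesystem, not on the FTP server, which is where the WindowsError about '/test123/*.*' comes from. A hedged sketch of doing the cleanup entirely over FTP instead - it reuses the placeholder host, credentials and directory from the script above, and assumes the server answers the MDTM command (support varies):

        from datetime import datetime, timedelta
        from ftplib import FTP, error_perm

        CUTOFF = datetime.utcnow() - timedelta(days=7)

        ftp = FTP('127.0.0.1')
        ftp.login('admin', 'admin')
        ftp.cwd('test123')

        for name in ftp.nlst():
            try:
                resp = ftp.sendcmd('MDTM ' + name)   # e.g. "213 20240101120000"
            except error_perm:
                continue                              # directories / unsupported entries
            mtime = datetime.strptime(resp[4:18], '%Y%m%d%H%M%S')
            if mtime < CUTOFF:
                print('Deleting', name)
                ftp.delete(name)

        ftp.quit()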

    Read the article
