Search Results

Search found 5793 results on 232 pages for 'ftp sync'.

Page 57/232 | < Previous Page | 53 54 55 56 57 58 59 60 61 62 63 64  | Next Page >

  • What is the best way to keep a folder synchronized with my USB drive?

    - by Ivo Flipse
    I know there is a similar topic on syncing between computers, but I'm looking for an application to run on one computer that will sync a "document/file" folder with a folder on my secondary/external USB drive. What would be the best solution? I know I could use Dropbox & Live Mesh, but they use up bandwidth, which isn't great when I drop in a lot of large files. I'm running Windows 7, but I assume any solution for Windows Vista would work just fine.
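
    One low-tech option is Windows' built-in robocopy, which can mirror a folder to the drive on demand or from a scheduled task; a minimal sketch, assuming the documents live in C:\Users\Me\Documents and the USB drive is E: (both paths are placeholders):

        robocopy "C:\Users\Me\Documents" "E:\Documents" /MIR /FFT /R:2 /W:5

    Note that /MIR deletes files at the destination that no longer exist at the source, and /FFT relaxes timestamp comparison for FAT-formatted USB drives.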

    Read the article

  • How to sync SkyDrive files to Microsoft Surface device?

    - by Sam
    Today I got a brand new Microsoft Surface. With the touch keyboard this really feels like a small notebook. My only problem is: how do I sync my SkyDrive files to the device? There is a SkyDrive app installed, but it can't sync; I can only access the files one by one. And the usual desktop sync program can't be installed, since the system is not x86 but ARM. So how do I sync SkyDrive to this device?

    Read the article

  • NTP daemon or ntpdate doesn't synchronize

    - by user2862333
    I'm having some problems with synchronization with an NTP server.

    1) The NTP daemon doesn't sync the system clock at all, even though it's running (confirmed with /etc/init.d/ntp status). Forcing it to sync with ntpd -q or ntpd -gq does not work either.

    2) Stopping the NTP daemon and syncing manually with ntpdate gives me the following output:

        ~# ntpdate -d 0.debian.pool.ntp.org
        6 Nov 16:48:53 ntpdate[4417]: ntpdate [email protected] Sat May 12 09:07:19 UTC 2012 (1)
        transmit(79.132.237.5)
        receive(79.132.237.5)
        transmit(85.234.197.2)
        receive(85.234.197.2)
        transmit(194.50.97.34)
        receive(194.50.97.34)
        transmit(79.132.237.1)
        receive(79.132.237.1)
        transmit(79.132.237.5)
        receive(79.132.237.5)
        transmit(85.234.197.2)
        receive(85.234.197.2)
        transmit(194.50.97.34)
        receive(194.50.97.34)
        transmit(79.132.237.1)
        receive(79.132.237.1)
        transmit(79.132.237.5)
        receive(79.132.237.5)
        transmit(85.234.197.2)
        receive(85.234.197.2)
        transmit(194.50.97.34)
        receive(194.50.97.34)
        transmit(79.132.237.1)
        receive(79.132.237.1)
        transmit(79.132.237.5)
        receive(79.132.237.5)
        transmit(85.234.197.2)
        receive(85.234.197.2)
        transmit(194.50.97.34)
        receive(194.50.97.34)
        transmit(79.132.237.1)
        receive(79.132.237.1)
        server 79.132.237.5, port 123
        stratum 2, precision -20, leap 00, trust 000
        refid [79.132.237.5], delay 0.05141, dispersion 0.00145
        transmitted 4, in filter 4
        reference time:      d624e3b1.f490b90d  Wed, Nov  6 2013 16:50:09.955
        originate timestamp: d624e457.eaaf787c  Wed, Nov  6 2013 16:52:55.916
        transmit timestamp:  d624e36c.4a7036fd  Wed, Nov  6 2013 16:49:00.290
        filter delay:  0.08537  0.05141  0.05151  0.06346  0.00000  0.00000  0.00000  0.00000
        filter offset: 235.6038 235.6087 235.6095 235.6068 0.000000 0.000000 0.000000 0.000000
        delay 0.05141, dispersion 0.00145
        offset 235.608782
        server 85.234.197.2, port 123
        stratum 2, precision -20, leap 00, trust 000
        refid [85.234.197.2], delay 0.05151, dispersion 0.00336
        transmitted 4, in filter 4
        reference time:      d624e3e7.dc6cd02b  Wed, Nov  6 2013 16:51:03.861
        originate timestamp: d624e458.1c91031f  Wed, Nov  6 2013 16:52:56.111
        transmit timestamp:  d624e36c.7da1d882  Wed, Nov  6 2013 16:49:00.490
        filter delay:  0.05765  0.07750  0.06013  0.05151  0.00000  0.00000  0.00000  0.00000
        filter offset: 235.6048 235.6014 235.6035 235.6078 0.000000 0.000000 0.000000 0.000000
        delay 0.05151, dispersion 0.00336
        offset 235.607826
        server 194.50.97.34, port 123
        stratum 3, precision -23, leap 00, trust 000
        refid [194.50.97.34], delay 0.03021, dispersion 0.00090
        transmitted 4, in filter 4
        reference time:      d624e38d.2bce952c  Wed, Nov  6 2013 16:49:33.171
        originate timestamp: d624e458.4dbbc114  Wed, Nov  6 2013 16:52:56.303
        transmit timestamp:  d624e36c.b0d38834  Wed, Nov  6 2013 16:49:00.690
        filter delay:  0.03030  0.03636  0.03091  0.03021  0.00000  0.00000  0.00000  0.00000
        filter offset: 235.6095 235.6085 235.6098 235.6105 0.000000 0.000000 0.000000 0.000000
        delay 0.03021, dispersion 0.00090
        offset 235.610589
        server 79.132.237.1, port 123
        stratum 3, precision -20, leap 00, trust 000
        refid [79.132.237.1], delay 0.05113, dispersion 0.00305
        transmitted 4, in filter 4
        reference time:      d624dfcb.6acea332  Wed, Nov  6 2013 16:33:31.417
        originate timestamp: d624e458.838672ad  Wed, Nov  6 2013 16:52:56.513
        transmit timestamp:  d624e36c.e405181c  Wed, Nov  6 2013 16:49:00.890
        filter delay:  0.06345  0.05113  0.05681  0.05656  0.00000  0.00000  0.00000  0.00000
        filter offset: 235.6087 235.6038 235.6010 235.6074 0.000000 0.000000 0.000000 0.000000
        delay 0.05113, dispersion 0.00305
        offset 235.603888
        6 Nov 16:49:00 ntpdate[4417]: step time server 79.132.237.5 offset 235.608782 sec

    Clearly, ntpdate can reach the NTP server(s), but checking the clock afterwards shows it hasn't changed and is still displaying the wrong time. Any ideas as to what the problem might be would be much appreciated.
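
    Two hedged diagnostic notes, assuming a stock ntpd/ntpdate install: first, ntpdate's -d flag is debug mode, which walks through all the steps (including printing the "step time server ..." line) without actually setting the clock, so the run above would leave the time unchanged by design. To actually step the clock, and to inspect what the running daemon thinks of its peers:

        # with ntpd stopped: -b steps the clock instead of slewing; -u uses an
        # unprivileged source port (helps where a firewall blocks port 123)
        ntpdate -b -u 0.debian.pool.ntp.org

        # with ntpd running: a '*' in the first column marks the selected peer
        ntpq -p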

    Read the article

  • SQL Server 'Real Time' Mirroring Possible?

    - by Ryan
    We have a SQL Server that hosts important databases for our clients. If the server goes down, we want another server ready to be switched over to (we would just change the IP). The question is, how can we automatically sync the primary SQL Server to the secondary one periodically throughout the day? Or even in real time? Thanks!

    Read the article

  • How do I fix: The handshake failed due to an unexpected packet format?

    - by Greg Finzer
    I am connecting from Windows Server 2008 R2 to a Linux FTP server running vsFTPd 2.0.7. I am connecting via SSL. Here is the line of code it is failing on:

        sslStream = new SslStream(stream, false, CertificateValidation);

    Here is the log:

        220 (vsFTPd 2.0.7)
        AUTH SSL
        234 Proceed with negotiation.

    I receive the following error:

        System.IO.IOException: The handshake failed due to an unexpected packet format.
           at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
           at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
           at System.Net.Security.SslState.ForceAuthentication(Boolean receiveFirst, Byte[] buffer, AsyncProtocolRequest asyncRequest)
           at System.Net.Security.SslState.ProcessAuthentication(LazyAsyncResult lazyResult)
           at KellermanSoftware.NetFtpLibrary.ProxySocket.InitSsl()
           at KellermanSoftware.NetFtpLibrary.FTP.Connect(Boolean implicitConnection)
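
    "Unexpected packet format" typically means the bytes read at the start of the TLS handshake are not TLS records - for example, leftover plaintext from the "234" reply that wasn't fully consumed before wrapping the stream, or a port that expects implicit rather than explicit TLS. For comparison, here is the same explicit-FTPS negotiation sketched in Python's ftplib, which consumes the AUTH reply before handshaking (host and credentials are placeholders):

        from ftplib import FTP_TLS

        ftps = FTP_TLS()
        ftps.connect("ftp.example.com", 21)   # hypothetical host
        ftps.auth()                           # sends AUTH TLS/SSL, then handshakes
        ftps.login("user", "pass")
        ftps.prot_p()                         # protect the data channel as well
        ftps.retrlines("LIST")
        ftps.quit()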

    Read the article

  • Python's FTPLib too slow?

    - by PyNEwbie
    I have been playing around with Python's FTP library and am starting to think that it is too slow compared to using a script file in DOS. I run sessions where I download thousands of data files (I think I have over 8 million right now). My observation is that the download process seems to take five to ten times as long in Python as it does using the ftp commands in the DOS shell. Since I don't want anyone to fix my code, I have not included any. I am more interested in understanding whether my observation is valid or whether I need to tinker more with the arguments.
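
    Without the code it is hard to say, but one argument worth tinkering with is retrbinary's blocksize, which defaults to 8192 bytes; a hedged sketch that reuses a single connection for many files (host, credentials, and file names are placeholders):

        from ftplib import FTP

        ftp = FTP("ftp.example.com")        # hypothetical host
        ftp.login("user", "pass")
        ftp.set_pasv(True)
        for name in ["a.dat", "b.dat"]:     # placeholder file list
            with open(name, "wb") as f:
                # a larger buffer cuts per-chunk callback overhead on fast links
                ftp.retrbinary("RETR " + name, f.write, blocksize=262144)
        ftp.quit()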

    Read the article

  • when using an FTPS connection to transfer a file, what is the difference between 'Binary mode transfer' and 'ASCII mode transfer'?

    - by shaleen mohan
    I am using an FTPS connection to send a text file [this file will contain EDI (Electronic Data Interchange) information] to an INOVIS mailbox. I have configured the system to open an FTPS connection, and using the PUT command I write the file to a folder on the FTP server. The problem is: what mode of file transfer should I use? How do I switch between modes? Moreover, which mode is the best practice to use when transferring files over an FTPS connection? If someone could provide a small ftp script, it would be helpful.
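
    As a rule of thumb, ASCII mode is for plain text (line endings are translated in transit) and binary mode is a byte-for-byte copy; binary is the safer default, though ASCII is reasonable for text-based EDI. A hedged sketch in Python's ftplib, where storlines issues TYPE A (ASCII) and storbinary issues TYPE I (binary); host and credentials are placeholders:

        from ftplib import FTP_TLS

        ftps = FTP_TLS("ftps.example.com")   # hypothetical host
        ftps.login("user", "pass")
        ftps.prot_p()                        # encrypt the data channel too
        with open("invoice.edi", "rb") as f:
            # ASCII mode: storlines sends TYPE A before the transfer
            ftps.storlines("STOR invoice.edi", f)
        with open("invoice.edi", "rb") as f:
            # binary mode: storbinary sends TYPE I before the transfer
            ftps.storbinary("STOR invoice.edi", f)
        ftps.quit()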

    Read the article

  • Explorer right-click uploader for CMS?

    - by Pekka
    I'm looking for a "right-click upload" application like RightLoad - an application that can upload media files to a remote FTP server from the Windows Explorer context menu. I want to customize the application to serve as a customized image uploading tool for a PHP-based CMS. The user would upload images and other media files to a defined FTP account (I'm also very open to other methods of transport, as long as they are supported by run-of-the-mill web hosting stacks) that they could then use in the CMS they log in to. For me to be able to make these customizations, the application would have to be open source - RightLoad is "only" freeware. Alternatively, I'm open to closed-source and commercial suggestions, as long as they allow "pre-packaged" server settings that can easily be deployed to the user. Does anybody know of such a tool compatible with at least the most current versions of Windows (XP, Vista, 7)?

    Read the article

  • Generate Windows .lnk file with PHP

    - by Andrei
    Hello, I'm working on a project which involves an FTP server running ProFTPd and a PHP/MySQL backend that creates accounts for users. Upon the creation of accounts, users are sent e-mails with their account details and instructions for downloading FileZilla or CyberDuck, depending on their OS, detected via the user-agent string. To make things easier for novices, I thought of having .lnk files generated for FileZilla with the account login details as parameters, so they would just have to click on the .lnk file to open up the server. This is not a crucial feature, but more of a technical challenge. My questions are:
    - Is this even feasible?
    - Are there any alternatives (e.g. generating a .bat with a script pointing to the FileZilla executable)?
    - Are there any issues, perhaps with relative/absolute paths pointing to the executable?
    - To go even further, what would be the simplest way of providing users with software with FTP access on a single account/single server (a web interface is not an option)?
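
    A .lnk file is a binary format, so generating one server-side is awkward; the .bat alternative is much simpler. A hedged sketch (Python for illustration; the same string-building translates directly to PHP), relying on FileZilla accepting an ftp:// URL as a command-line argument; the install path is an assumption, and embedding the password in a downloadable file is a real security trade-off:

        def write_launcher(user, password, host, port=21):
            url = "ftp://%s:%s@%s:%d" % (user, password, host, port)
            with open("connect.bat", "w") as f:
                f.write('@echo off\r\n')
                # hypothetical default install path for FileZilla
                f.write('start "" "C:\\Program Files\\FileZilla FTP Client\\filezilla.exe" "%s"\r\n' % url)

        write_launcher("alice", "secret", "ftp.example.com")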

    Read the article

  • Partial Upload With storbinary in python

    - by brian
    I've written some Python code to download an image using urllib.urlopen().read() and then upload it to an FTP site using ftplib.FTP().storbinary(), but I'm having a problem. Sometimes the image file is only partially uploaded, so I get images with the bottom 20% or so cut off. I've checked the locally downloaded version and I have successfully downloaded the entire image, which leads me to believe that it is a problem with storbinary. I believe I am opening and closing all of the files correctly. Does anyone have any clues as to why I'm getting a partial upload with storbinary? Update: When I run through the commands in the Python shell, the upload completes successfully. I don't know why it would be different from when run as a script...
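
    The shell-vs-script difference suggests the local file may not be fully flushed before the upload starts. A hedged sketch that sidesteps the intermediate file by uploading straight from memory and surfacing the server's final reply; the URL and host are placeholders, and urllib.request is the Python 3 spelling of urllib:

        import io
        import urllib.request
        from ftplib import FTP

        data = urllib.request.urlopen("http://example.com/image.jpg").read()
        ftp = FTP("ftp.example.com")
        ftp.login("user", "pass")
        # storbinary raises error_perm/error_temp if the final reply is not 2xx,
        # so a silently truncated upload would surface here
        print(ftp.storbinary("STOR image.jpg", io.BytesIO(data)))
        ftp.quit()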

    Read the article

  • How do I dynamically specify a file in DOS?

    - by donde
    I am trying to use C# in .NET to run DOS commands to FTP a file. Technically, it calls a BAT file which calls a CMD file which executes the FTP commands. Everything works up to the CMD file: the CMD file will work if I hardcode the path, but I need to dynamically specify the path of the file. BAT file:

        ftp.exe -s:%~dp0\mycmdfile.cmd

    And in the CMD file:

        open <my host>
        <my user name>
        <my pw>
        quote site cyl pri=1 sec=1 lrecl=1786 blksize=0 recfm=fb retpd=30
        put <here is where I need the dynamic path> + localfilename remotefilename
        quit
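
    Since ftp.exe -s: scripts don't expand variables, one workaround is to generate the CMD file on the fly with the path already filled in. A hedged sketch (Python for illustration; the same approach works from the C# side, or from the BAT file with echo redirection; host and credentials are placeholders):

        import subprocess

        def ftp_put(local_path, remote_name):
            lines = [
                "open myhost.example.com",   # hypothetical host/credentials
                "myusername",
                "mypassword",
                "quote site cyl pri=1 sec=1 lrecl=1786 blksize=0 recfm=fb retpd=30",
                "put %s %s" % (local_path, remote_name),
                "quit",
            ]
            # write the command file with the dynamic path baked in
            with open("mycmdfile.cmd", "w") as f:
                f.write("\r\n".join(lines) + "\r\n")
            # then replay it exactly as the hardcoded version did
            subprocess.run(["ftp.exe", "-s:mycmdfile.cmd"], check=True)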

    Read the article

  • PHP - ftp_get only works once

    - by William
    I'm connecting to an FTP server that I have no control over, and I'm pretty sure it is running something old and outdated, due to other issues I've run into. I'm simply using this code in a loop to get all the files in a directory:

        ftp_get($this->conn_id, $local, $remote, FTP_ASCII);

    The first time all goes well, but after that I get this error thrown for each file I try to get: "There is already an active transaction". I've tried both passive and active mode with no luck. It's the exact same code I use to connect to other FTP servers and get files with no problem. Any ideas? I suppose my last resort could be disconnecting and reconnecting for each file, but that seems like a huge waste.

    Read the article

  • How can I compare two JPEG files' encoding and other information?

    - by Subhen
    We have created a driver program which connects to a remote host using FTP and mounts the remote host as a network drive. So when I copy some data, it is copied using an FTP retrieve request and then pasted to the destination. The copy-paste works fine, as we can see the source and destination file sizes are the same. But when we try to open a .jpg file that has been copied, it says there is no preview. I suspect some bytes are being corrupted when we copy and paste. Are there any tools I can use to compare the source and destination files and find the differences?
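
    Comparing the bytes directly is a good first step (corruption in transit is plausible; ASCII-mode FTP transfers are a classic culprit for binary files). A minimal sketch, with placeholder paths, that hashes both files and reports the offset of the first differing byte:

        import hashlib

        def sha256(path):
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        def first_difference(path_a, path_b):
            with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
                offset = 0
                while True:
                    a, b = fa.read(4096), fb.read(4096)
                    if a != b:
                        # locate the exact byte within this chunk
                        n = min(len(a), len(b))
                        for i in range(n):
                            if a[i] != b[i]:
                                return offset + i
                        return offset + n  # one file is shorter
                    if not a:
                        return None        # files are identical
                    offset += len(a)

        print(sha256("source.jpg") == sha256("copy.jpg"))
        print(first_difference("source.jpg", "copy.jpg"))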

    Read the article

  • What strategy do you use to sync your code when working from home

    - by Ben Daniel
    At my work I currently have my development environment inside a virtual machine. When I need to do work from home, I copy my VM and any databases I need onto a laptop-drive-sized external USB drive. After about 10 minutes of copying, I put the drive in my pocket and head home, copy the VM and databases back onto my personal computer, and I'm ready to work. I follow the same steps to take the work back with me. So if I count the total amount of time I spend waiting around for files to finish copying in order to take work home and bring it back again, it comes to around 40 minutes!

    I do have a VPN connection to my work from home (providing the internet is up at both sites) and a decent internet speed (8mbits down/?up), but I find Remote Desktop sessions to my work machine laggy enough that I want to work on my VM directly. So in looking at what other options I have, or how I could improve my existing option, I'm interested in what strategy you use or recommend for working at home and keeping your code/environment in sync.

    EDIT: I'd prefer an option where I don't have to commit my changes into version control before I leave work - as I like to make meaningful, descriptive comments in my commits, committing would take longer than just copying my VM onto a portable drive! lol Also, I'd prefer a solution where my dev environment stays in sync too. Having said that, I'm still very interested in your own solutions, even if they don't solve my problem exactly as I'd like. :)

    Read the article

  • Git repos over multiple machines - backups and keeping in sync

    - by a-or-b
    I'm new to git so please feel free to RTFM me... I have multiple development sites (none of which can communicate with each other over a network) and am working on a few projects (with a few people) at any one time. What I would ideally have is a centralized repository at each site that can be pulled from, but development would occur in our own (personal) repos. Then I would like to be able to sync the centralized repos across sites (via USB key, for example).

    I want a centralized repo at each location because (1) I'm new to git and do break my (personal) local repo by playing around, and (2) some projects get put on hold, so I want to be able to free up disk space by deleting them. This is the "backup" part of my question.

    I was also hoping to be able to use 'git clone --bare' for my centralized repos (and the USB key repos too?), as we don't need the full checkout, just the git benefits. However, I can't seem to get a bare repo to work as a repo I can push from. I've used 'git remote' to set up a remote origin (similar to http://toolmantim.com/thoughts/setting_up_a_new_remote_git_repository) but I can't get 'git push' to work - it seems I need a checked-out repo.

    Does anyone else use this sort of repo/development structure, or is there something fundamental about git usage that I'm missing?

    A solution that I thought about that might not work: if I had a 'git clone --bare' at each site and then used a git repo on my removable media which has remotes set up for each site, then I could ('pull') sync my USB key with each repo. But then can I update the site repo from my USB key? Could I push from USB?
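
    For what it's worth, the usual pattern is that a bare repository is the thing you push into from a working clone; the bare repo itself never needs a checkout. A hedged sketch, assuming the key mounts at /mnt/usb, a site repo lives at /srv/git, and the branch is master:

        # on the USB key: a bare repo acts as the portable central copy
        git init --bare /mnt/usb/project.git

        # at each site: clone it, work, then push back to the key
        git clone /mnt/usb/project.git ~/work/project
        cd ~/work/project
        # ...edit, add, commit as usual...
        git push origin master     # update the key
        git pull origin master     # bring in work pushed from other sites

        # a site's bare "central" repo is kept current the same way,
        # by pushing to it from a working clone:
        git remote add site /srv/git/project.git
        git push site master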

    Read the article

  • Send file FTP over SSL with custom port number

    - by JM4
    I have asked this question before, but in a different manner. I am taking form data, compiling it into a temporary CSV file, and trying to send it over to a client via FTP over SSL (this is the only route I am interested in hearing solutions for; unless there is a workaround, I cannot make changes). I have tried the following:
    - ftp_connect - nothing happens, the page just times out
    - ftp_ssl_connect - nothing happens, the page just times out
    - curl library - same thing; given the URL, it also gives an error

    I am given the following information:
    - FTPS Server IP Address
    - TCP Port (1234)
    - Username
    - Password
    - Data Directory to dump the file
    - FTP Mode: Passive

    Very, very basic code (which I believe should at minimum initiate a connection):

        <?php
        $ftp_server = "00.000.00.000"; // masked for security
        $ftp_port = "1234";            // masked, but not 990
        $ftp_user_name = "username";
        $ftp_user_pass = "password";

        // set up basic ssl connection
        $conn_id = ftp_ssl_connect($ftp_server, $ftp_port, 20);

        // login with username and password
        $login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

        // the client's spec says passive mode
        ftp_pasv($conn_id, true);

        echo ftp_pwd($conn_id); // /
        echo "hello";

        // close the ssl connection
        ftp_close($conn_id);
        ?>

    When I run this over a SmartFTP client, everything works just fine. I just can't get it to work using PHP (which is a necessity). Has anybody had success doing this in the past? I would be very interested to hear your approach.
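
    One thing worth checking: PHP's ftp_ssl_connect speaks explicit FTPS (AUTH TLS on a plain connection), and a silent timeout is what you would see if port 1234 actually expects implicit FTPS (TLS from the first byte, as on port 990). For comparison, here is the same explicit-mode attempt sketched in Python's ftplib; host, port, and credentials are the placeholders from above:

        from ftplib import FTP_TLS

        ftps = FTP_TLS()
        ftps.connect("00.000.00.000", 1234, timeout=20)
        ftps.login("username", "password")
        ftps.prot_p()           # encrypt the data channel
        ftps.set_pasv(True)     # passive mode, per the client's spec
        with open("data.csv", "rb") as f:
            ftps.storbinary("STOR data.csv", f)
        ftps.quit()

    If this also hangs at connect, implicit TLS (or a firewall) is the likely culprit rather than the PHP code.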

    Read the article

  • Sync data between a windows desktop app and windows mobile client app

    - by Chris W
    I need to knock up a very quick prototype/proof-of-concept application to demo to someone within the next couple of days, so I've minimal time to research this as fully as I normally would. The set-up is a very simple database application running on a laptop - there will only ever be a single user updating a couple of tables - so I was thinking of knocking up a basic WinForms app against SQL Compact. Visual Studio's auto-generated data grid edit screens will be fine with a little customisation.

    The second aspect is to then add a Windows Mobile client application that can pull data from both tables stored on the laptop, edit some data, and insert some extra rows before sending the changes back to the laptop copy of the database. I've not done any WinMo development, so what's the best approach for me to look at? Is it easy enough to sync data between the two databases when the WinMo device is connected to the laptop with USB? Most of the samples I've looked at so far seem to sync SQL Compact with SQL Standard using IIS, which seems a bit of overkill. The volumes of data to be synced are so small that I can easily write some manual sync code, if it's easy to query/update the Compact DB from the laptop application while the device is connected.

    Read the article

  • Sync video play over network

    - by Nemesis
    Hi, I have made a media player that plays basically anything that's scheduled to it via a text file. The player can also play the exact same clip on multiple machines (PCs). The problem is the syncing. The same video starts playing on each of the machines, but they are out by about 400ms, which looks crap, and if there's sound it's even worse.

    What I do at the moment: one machine is set up as the master and all other machines are set up as slaves. The master decides what item will be played. It waits for a message from each of the slaves; once all slaves are connected (or after the timeout), it broadcasts the item id of the file that needs to be played. All machines then start playing that file.

    What I also tried: I thought that the file loading time might be the major driving factor in the sync mismatch, so I changed the code to do the following. The master still decides what file to play. It waits for the connect message from each slave (or timeout) and transmits the item id of the file to play. All machines start playing that file but pause it immediately. The master then again waits for a ready message from each of the slaves. As soon as all slaves have responded, the master sends a play message to all slaves. All machines then resume the file. This unfortunately did not improve the problem.

    I am now pretty sure the sync mismatch is due to network delay. How can I compensate for this? Or maybe determine the delay to each slave? All network comms are done with winsock. Any thoughts or ideas are much appreciated.
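
    One standard approach is an NTP-style handshake on the existing control connection: the master timestamps a ping, the slave replies immediately with its own clock, and half the round trip approximates the one-way delay. A hedged sketch (Python for illustration; the arithmetic is the part that matters and ports directly to winsock code):

        import struct
        import time

        def measure(sock):
            """Estimate one-way delay and clock offset for one slave."""
            t0 = time.time()
            sock.sendall(struct.pack("!d", t0))           # master ping
            t_slave, = struct.unpack("!d", sock.recv(8))  # slave's clock at reply
            t1 = time.time()
            delay = (t1 - t0) / 2.0                       # one-way delay estimate
            offset = t_slave - (t0 + delay)               # slave clock minus master clock
            return delay, offset

    The master can then schedule playback at a common future instant (master_now + max_delay + margin) and send each slave that instant converted to the slave's own clock via its measured offset, instead of broadcasting "play now".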

    Read the article

  • What NAS setup for two-way syncing over the internet?

    - by Jamse
    I have family living a few hours away and have a lot of files that I would like to share - especially lots of folders of digital photos, but also documents etc. - partially so they can see them, partially so I can have access when I visit them, and partially for backup/redundancy purposes. My current hard drives on my main machine are getting pretty full anyway, and I have a MythTV box where my music is currently stored, so I was thinking of getting a NAS anyway. And at the other end my family have a few computers, so they would probably benefit from a NAS too.

    My general idea (though I'm willing to shift on this if there are any bright ideas about other ways of achieving my objectives) is to get a matching pair of NASs and have them sync over the internet. (To cut down on bandwidth use I would get them in sync locally to start with.) Having read around as best I can, it seems that syncing over the internet is generally only a feature on quite high-end units. However, I have seen that QNAP seem to offer this on their TS-110 and TS-210 units, which might work (they call it "remote replication"). They seem pretty reasonably priced for what they are, but of course with buying two of them and then adding the drives (say 1TB or 2TB each) I'd be looking at about £400 total.

    So, I'm looking for recommendations really. I don't want to spend more than the QNAPs would cost me, but any other ideas would be most appreciated. I am comfortable with technology and tinkering around, but I don't have as much time for that as I would like, so I guess I would favour solutions that require less tinkering rather than more (even though that's less fun!). Any thoughts would be welcome, as would any comments from people who have used the QNAP boxes for this. Thanks in advance.

    Some specifications:
    - Two-way syncing. Changes made at either end should be synced to the other. There shouldn't be one unit that is effectively a read-only mirror of the other.
    - Not real time. The syncing doesn't need to be real time - if it updated, say, daily overnight, that would be fine.
    - Set and forget. I would prefer minimal user interaction once set up - it would be great if syncs were scheduled and automatic.
    - OS independence. I am running Windows XP plus an Ubuntu-based MythTV box. At the other end there are Windows 7 and Windows XP machines, plus a networked TV set-top box which I think can play files off the network.
    - Machine independence. I would favour a system that is self-contained, i.e. not reliant on any particular PC being switched on. If the system had enough else going for it I could perhaps work around that at this end, where I only have one PC that's used as such, but it would be harder at the other end, where there are at least two PCs that might be accessing the files.
    - Notifications. I guess things like getting an email notification if the syncing fell over for any reason would be useful, though it's not a deal breaker.

    Update: I've been digging some more and it looks like QNAP's Remote Replication function is actually just rsync, so it is only really suitable for one-way syncing. I've posted on their forum to double check, but I think that's the case. In which case, I think the focus of my question is now either: do any reasonably-priced NASs support bidirectional syncing over the internet?, or has anyone had any luck installing suitable software onto NASs for this purpose? (Also, I've updated the question to clarify that I'm after two-way syncing.)
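
    If the NAS can run third-party software, Unison is the usual tool for true two-way file sync and can run over ssh on a schedule; a hedged sketch of a Unison profile, with paths and hostname as placeholders:

        # ~/.unison/photos.prf
        root = /volume1/photos
        root = ssh://family-nas.example.com//volume1/photos
        batch = true       # no interactive prompts, suitable for a cron job
        auto = true
        prefer = newer     # on conflict, keep the most recently changed copy
        times = true       # preserve modification times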

    Read the article
