Using AT on Ubuntu to Background Downloads (w/ Queue)

Posted by Nicholas Yost on Server Fault, 2013-11-07

I am writing a PHP script, and I want to use the at command on Ubuntu to fetch a remote file via wget. I'm basically looking to background the process so that PHP can finish fairly quickly.

I cannot find any questions on here about how to use both, but I basically want to do the following pseudo-code:

<?php
    // Pseudo-code: at reads its command from stdin, and queue names are single letters (a-z, A-Z)
    exec("echo 'wget http://path.to/remote/file.ext' | at now -q a");
?>
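
While testing, I've been checking what actually got queued with something like the following (just a side note, and the queue letter 'a' is only an example):

<?php
    // atq lists pending at jobs, optionally limited to one queue letter;
    // each output line is roughly "3  Thu Nov  7 01:00:00 2013 a nick"
    exec('atq -q a', $jobs);
    print_r($jobs);
?>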

Additionally, I'd like to queue these per provider. I'd like each provider (each path.to domain) to have its own queue, so I only download one file from each provider at a time. Meaning:

<?php
    // Pseudo-code: one single-letter queue per provider, the command piped into at,
    // and wget's destination directory given with -P
    exec("echo 'wget -P /local/path http://path.to/remote/file.ext' | at now -q a");
    exec("echo 'wget -P /local/path http://vendor.one/remote/file.ext' | at now -q b");
    exec("echo 'wget -P /local/path http://vendor.two/remote/file.ext' | at now -q c");
    exec("echo 'wget -P /local/path http://vendor.one/remote/file.ext' | at now -q b");
?>

This should start downloading the files from path.to, vendor.one, and vendor.two immediately; once the first vendor.one download finishes, the second one should start.
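
To make the intent concrete, this is roughly the helper I have in mind (just a sketch; the queue-letter mapping and the download() name are made up, and whether at really runs same-queue jobs one at a time is exactly what I'm asking):

<?php
    // Sketch of the intent: one single-letter at queue per provider.
    $queues = array(
        'path.to'    => 'a',
        'vendor.one' => 'b',
        'vendor.two' => 'c',
    );

    function download($url, $dest, $queue) {
        // Hand the wget off to at so PHP returns immediately.
        $cmd = sprintf("echo %s | at now -q %s",
            escapeshellarg("wget -P $dest $url"), $queue);
        exec($cmd);
    }

    download('http://path.to/remote/file.ext',    '/local/path', $queues['path.to']);
    download('http://vendor.one/remote/file.ext', '/local/path', $queues['vendor.one']);
    download('http://vendor.two/remote/file.ext', '/local/path', $queues['vendor.two']);
    download('http://vendor.one/remote/file.ext', '/local/path', $queues['vendor.one']);
?>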

Does that make sense? I can't find anything like this anywhere on the web, much less on SO/SF. If we can use crontab to run a one-off wget command, that's fine too.
