Ruby: Streaming large AWS S3 object freezes

Posted by Peter on Stack Overflow on 2010-03-31

Hi, I am using the Ruby aws/s3 library to retrieve files from Amazon S3. I stream an object and write it to a file as per the documentation (with debug output every 100 chunks to confirm progress).

This works for small files, but it randomly freezes while downloading large (150 MB) files on an Ubuntu VPS. Fetching the same 150 MB files from my Mac over a much slower connection works just fine.

When it hangs, no error is thrown and the last line of debug output is 'Finished chunk'. I've seen it write anywhere between 100 and 10,000 chunks before freezing.

Has anyone come across this, or have any ideas about what the cause might be?

Thanks

The code that hangs:

  # local_file, key and @s3_bucket are set elsewhere; the S3 connection is already established
  i = 1
  open(local_file, 'w') do |f|
    AWS::S3::S3Object.value(key, @s3_bucket) do |chunk|
      puts "Writing chunk #{i}"
      f.write chunk.read_body
      puts "Finished chunk #{i}"
      i += 1
    end
  end
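
For comparison, the gem's documented streaming helper looks roughly like the sketch below (placeholder bucket/key names; based on the aws/s3 README rather than my actual code):

  # Sketch of the documented streaming API from the aws/s3 README;
  # 'big-file.dat' and 'backups' are placeholder key/bucket names.
  open('big-file.dat', 'w') do |file|
    AWS::S3::S3Object.stream('big-file.dat', 'backups') do |chunk|
      file.write chunk
    end
  end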
