I am working on a PHP-based ticket management system. While creating a ticket, one can upload an attachment.
I want to put a limit (say 10 MB) per file upload.
To implement this I plan the following:
1. In php.ini set
post_max_size = 10M
2. In the PHP script which receives the POST:
Since the file is larger than post_max_size, $_FILES will be empty. But I can still check the Content-Length header and reject the upload if the size is more than 10 MB.
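A minimal sketch of the check in step 2. The constant name and the 413 response are my choices, not anything PHP mandates; the limit is hardcoded here to match the post_max_size value above:

```php
<?php
// Must match post_max_size in php.ini (10M), expressed in bytes.
const MAX_UPLOAD_BYTES = 10 * 1024 * 1024;

// When the POST body exceeds post_max_size, PHP discards it:
// $_POST and $_FILES arrive empty, but the Content-Length header
// still reflects the size the client tried to send.
function uploadTooLarge(array $server, array $files): bool
{
    $contentLength = (int) ($server['CONTENT_LENGTH'] ?? 0);
    return empty($files) && $contentLength > MAX_UPLOAD_BYTES;
}

if (uploadTooLarge($_SERVER, $_FILES)) {
    http_response_code(413); // Payload Too Large
    exit('File exceeds the 10 MB upload limit.');
}
```

Note that this check only runs after the request body has already been received, which is exactly the problem described below.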
While testing this I tried uploading a 1 GB file and analysed the HTTP traffic, and this is what I found:
- the entire 1 GB of data is first uploaded to the server temporarily and discarded once the HTTP request completes. I couldn't find out exactly where the file was getting saved (it was not in the server's temporary directory), but my HTTP traffic analyzer showed that the browser did send the full 1 GB to the server.
- the PHP script execution started only after the HTTP request completed (i.e. after the entire 1 GB had been uploaded).
Now I have two concerns:
a) People may exploit my server's bandwidth by uploading large files, which I will have to discard anyway.
b) Even worse, if someone starts uploading a huge file (say 100 GB), the entire 100 GB is first received by the server temporarily, which means that for that period it will consume that much memory or disk space on my server.
What's the common solution for this?
Am I missing something here?