Search Results

Search found 4900 results on 196 pages for 'upload'.


  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure. Or we can store our VHD files in blob storage and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunked read and write, which covers the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock, and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files, as we can upload blocks in parallel and provide a pause-resume feature.

    There are many documents, articles and blog posts describing how to upload a block blob. Most of them focus on the server side: once you have received a big file, stream or binary content, how to upload it into blob storage in blocks through the .NET SDK. But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer to help them build a network disk product on top of Azure. The end users upload their files from the web portal, and then the files are stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user's machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed, and save them as blocks into Windows Azure Blob Storage.

    Traditional Upload, Works with Limitation

    The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.

    @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
    {
        <input type="file" name="file" />
        <input type="submit" value="upload" />
    }

    And then in the backend controller we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file in blocks, upload them in parallel and commit. The code has been well blogged in the community.
    [HttpPost]
    public ActionResult About(HttpPostedFileBase file)
    {
        var container = _client.GetContainerReference("test");
        container.CreateIfNotExists();
        var blob = container.GetBlockBlobReference(file.FileName);
        var blockDataList = new Dictionary<string, byte[]>();
        using (var stream = file.InputStream)
        {
            var blockSizeInKB = 1024;
            var offset = 0;
            var index = 0;
            while (offset < stream.Length)
            {
                var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
                var blockData = new byte[readLength];
                offset += stream.Read(blockData, 0, readLength);
                blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);

                index++;
            }
        }

        Parallel.ForEach(blockDataList, (bi) =>
        {
            blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
        });
        blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());

        return RedirectToAction("About");
    }

    This works perfectly if we select an image, a piece of music or a small video to upload. But if I select a large file, let's say a 6GB HD movie, then after uploading for a few minutes the request fails and the upload is terminated. In ASP.NET there is a limitation on the request length; the maximum request length is defined in the web.config file, and it must be a value less than about 4GB. So if we want to upload a really big file, we cannot simply implement it this way. Also, in Windows Azure, the cloud service network load balancer will terminate the connection if it exceeds the timeout period; from my tests the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, the simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and resume. So we need to find a better way to upload large files from the client to the server.

    Upload in Chunks through HTML5 and JavaScript

    In order to break the limitations mentioned above we will upload the large file in chunks. This gives us some benefits:
    - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is.
    - No timeout problem: the size of the chunks is controlled by us, which means we can make sure the request for each chunk upload does not exceed the timeout period of either ASP.NET or the Windows Azure load balancer.

    It was a big challenge to upload a big file in chunks until we got HTML5. There are some new features and improvements introduced in HTML5 and we will use them to implement our solution.

    In HTML5, the File interface has been improved with a new method called "slice". It can be used to read part of a file by specifying the start byte index and the end byte index. For example, if the entire file is 1024 bytes, file.slice(512, 768) reads the part of the file from byte index 512 up to (but not including) byte index 768, and returns a new object of an interface called "Blob", which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about the Blob please refer here. File and Blob are very useful for implementing the chunked upload.
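    As a quick aside (a minimal sketch, not part of the original post), note that the end index passed to slice is exclusive, so adjacent chunks share no bytes:

    // assumes "file" is a File object the user already selected
    var chunkSize = 512 * 1024;                        // 512KB per chunk
    var first = file.slice(0, chunkSize);              // bytes 0 .. chunkSize-1
    var second = file.slice(chunkSize, 2 * chunkSize); // bytes chunkSize .. 2*chunkSize-1
    console.log(first.size + ", " + second.size);      // each Blob reports its own byte length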
    We will use the File interface to represent the file the user selected in the browser, and then use File.slice to read the file in chunks of the size we want. For example, if we want to upload a 10MB file with 512KB chunks, we can read it in 512KB blobs by using File.slice in a loop.

    Assume we have a web page as below. The user can select a file, an input box to specify the block size in KB, and a button to start the upload.

    <div>
        <input type="file" id="upload_files" name="files[]" /><br />
        Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br />
        <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" />
    </div>

    Then we can have the JavaScript function to upload the file in chunks when the user clicks the button.

    <script type="text/javascript">
        $(function () {
            $("#upload_button_blob").click(function () {
            });
        });
    </script>

    First we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke File, Blob and FormData from the "window" object. If any of them is "undefined" the condition result will be "false", which means your browser doesn't support these premium features and it's time for you to get your browser updated. FormData is another new feature we are going to use later. It can generate a temporary form for us. We will use this interface to create a form with the chunk and its associated metadata when invoking the service through ajax.

    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        if (window.File && window.Blob && window.FormData) {
            alert("Your browser is awesome, let's rock!");
        }
        else {
            alert("Oh man, please update to a modern browser before trying this cool stuff out.");
            return;
        }
    });

    Each browser supports these interfaces through its own implementation; currently Blob, File and File.slice are supported by Chrome 21, Firefox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we work on the files the user selected, one by one, since in HTML5 the user can select multiple files in one file input box.

    var files = $("#upload_files")[0].files;
    for (var i = 0; i < files.length; i++) {
        var file = files[i];
        var fileSize = file.size;
        var fileName = file.name;
    }

    Next, we calculate the start index and end index for each chunk based on the size the user specified in the browser. We put them into an array together with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks, since we need to specify the target blob name and the block index. At the same time we store the list of all indexes in another variable, which will be used to commit the blocks into a blob in Azure Storage once all chunks have been uploaded successfully.
    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...

        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;

            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            var blockSizeInKB = $("#block_size").val();
            var blockSize = blockSizeInKB * 1024;
            var blocks = [];
            var offset = 0;
            var index = 0;
            var list = "";
            while (offset < fileSize) {
                var start = offset;
                var end = Math.min(offset + blockSize, fileSize);

                blocks.push({
                    name: fileName,
                    index: index,
                    start: start,
                    end: end
                });
                list += index + ",";

                offset = end;
                index++;
            }
        }
    });

    Now we have all the chunks' information ready. The next step is to upload them one by one to the server side; on the server side, when a chunk is received, it will be uploaded as a block into Blob Storage, and finally all blocks are committed with the index list through BlockBlob.PutBlockList. But all these invocations are ajax calls, which are asynchronous, so we need to introduce a JavaScript library to help us coordinate the asynchronous operations. It is named "async.js". You can download this JavaScript library here, and you can find the documentation here. I will not explain this library too much in this post. We will put all the procedures we want to execute into a function array and pass it into the proper function defined in async.js, to let it control the execution sequence, in series or in parallel. Hence we define an array and push the chunk upload functions into it.

    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...

        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            ... ...

            // define the function array and push all chunk upload operations into it
            blocks.forEach(function (block) {
                putBlocks.push(function (callback) {
                });
            });
        }
    });

    As you can see below, I use the File.slice method to read each chunk based on the start and end byte index we calculated previously, and construct a temporary HTML form with the file name, chunk index and chunk data through another new HTML5 feature named FormData. Then I post this form to the backend server through jQuery.ajax. This is the key part of our solution.
    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...
        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            ... ...
            // define the function array and push all chunk upload operations into it
            blocks.forEach(function (block) {
                putBlocks.push(function (callback) {
                    // load the blob based on the start and end index of each chunk
                    var blob = file.slice(block.start, block.end);
                    // put the file name, index and blob into a temporary form
                    var fd = new FormData();
                    fd.append("name", block.name);
                    fd.append("index", block.index);
                    fd.append("file", blob);
                    // post the form to the backend service (asp.net mvc controller action);
                    // contentType must be false so the browser sets the multipart boundary itself
                    $.ajax({
                        url: "/Home/UploadInFormData",
                        data: fd,
                        processData: false,
                        contentType: false,
                        type: "POST",
                        success: function (result) {
                            if (!result.success) {
                                alert(result.error);
                            }
                            callback(null, block.index);
                        }
                    });
                });
            });
        }
    });

    Then we invoke these functions one by one by using async.js. Once all functions have executed successfully, I invoke another ajax call to the backend service to commit all these chunks (blocks) as one blob in Windows Azure Storage.

    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...
        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            ... ...
            // define the function array and push all chunk upload operations into it
            ... ...
            // invoke the functions one by one,
            // then invoke the commit ajax call to put the blocks into a blob in azure storage
            async.series(putBlocks, function (error, result) {
                var data = {
                    name: fileName,
                    list: list
                };
                $.post("/Home/Commit", data, function (result) {
                    if (!result.success) {
                        alert(result.error);
                    }
                    else {
                        alert("done!");
                    }
                });
            });
        }
    });

    That's all on the client side. The outline of our logic is:
    - Calculate the start and end byte index of each chunk based on the block size.
    - Define the functions that read a chunk from the file and upload its content to the backend service through ajax.
    - Execute the functions defined in the previous step with "async.js".
    - Finally, commit the chunks by invoking the backend service.

    Save Chunks as Blocks into Blob Storage

    Above, we finished the client side JavaScript code; it uploads the file in chunks to the backend service, which we are going to implement in this step. We will use ASP.NET MVC as our backend service. It will receive the chunks, upload them into Windows Azure Blob Storage as blocks, and finally commit them as one blob. Since the client side uploads chunks by invoking ajax calls against the URL "/Home/UploadInFormData", I created a new action under the Home controller that only accepts HTTP POST requests.

    [HttpPost]
    public JsonResult UploadInFormData()
    {
        var error = string.Empty;
        try
        {
        }
        catch (Exception e)
        {
            error = e.ToString();
        }

        return new JsonResult()
        {
            Data = new
            {
                success = string.IsNullOrWhiteSpace(error),
                error = error
            }
        };
    }

    Then I retrieve the file name, index and chunk content from the Request.Form object, which was passed from our client side.
    And then I use the Windows Azure SDK to get a blob container (in this case we use the container named "test") and create a blob reference with the blob name (the same as the file name). Then I upload the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with it so that finally we can put all the blocks together as one blob by specifying their block ID list.

    [HttpPost]
    public JsonResult UploadInFormData()
    {
        var error = string.Empty;
        try
        {
            var name = Request.Form["name"];
            var index = int.Parse(Request.Form["index"]);
            var file = Request.Files[0];
            var id = Convert.ToBase64String(BitConverter.GetBytes(index));

            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(name);
            blob.PutBlock(id, file.InputStream, null);
        }
        catch (Exception e)
        {
            error = e.ToString();
        }

        return new JsonResult()
        {
            Data = new
            {
                success = string.IsNullOrWhiteSpace(error),
                error = error
            }
        };
    }

    Next, I created another action to commit the blocks into a blob once all chunks have been uploaded. Similarly, I retrieve the blob name from the Request.Form. I also retrieve the chunk ID list, which is the block ID list, from the Request.Form as a string, split it into a list, and then invoke the BlockBlob.PutBlockList method. After that our blob will be shown in the container and be ready to be downloaded.

    [HttpPost]
    public JsonResult Commit()
    {
        var error = string.Empty;
        try
        {
            var name = Request.Form["name"];
            var list = Request.Form["list"];
            var ids = list
                .Split(',')
                .Where(id => !string.IsNullOrWhiteSpace(id))
                .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                .ToArray();

            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(name);
            blob.PutBlockList(ids);
        }
        catch (Exception e)
        {
            error = e.ToString();
        }

        return new JsonResult()
        {
            Data = new
            {
                success = string.IsNullOrWhiteSpace(error),
                error = error
            }
        };
    }

    Now we have finished all the code we need. Below is the full client side JavaScript code.
    <script type="text/javascript" src="~/Scripts/async.js"></script>
    <script type="text/javascript">
        $(function () {
            $("#upload_button_blob").click(function () {
                // assert the browser supports html5
                if (window.File && window.Blob && window.FormData) {
                    alert("Your browser is awesome, let's rock!");
                }
                else {
                    alert("Oh man, please update to a modern browser before trying this cool stuff out.");
                    return;
                }

                // start to upload each file in chunks
                var files = $("#upload_files")[0].files;
                for (var i = 0; i < files.length; i++) {
                    var file = files[i];
                    var fileSize = file.size;
                    var fileName = file.name;

                    // calculate the start and end byte index for each block (chunk),
                    // along with the index, file name and index list for future use
                    var blockSizeInKB = $("#block_size").val();
                    var blockSize = blockSizeInKB * 1024;
                    var blocks = [];
                    var offset = 0;
                    var index = 0;
                    var list = "";
                    while (offset < fileSize) {
                        var start = offset;
                        var end = Math.min(offset + blockSize, fileSize);

                        blocks.push({
                            name: fileName,
                            index: index,
                            start: start,
                            end: end
                        });
                        list += index + ",";

                        offset = end;
                        index++;
                    }

                    // define the function array and push all chunk upload operations into it
                    var putBlocks = [];
                    blocks.forEach(function (block) {
                        putBlocks.push(function (callback) {
                            // load the blob based on the start and end index of each chunk
                            var blob = file.slice(block.start, block.end);
                            // put the file name, index and blob into a temporary form
                            var fd = new FormData();
                            fd.append("name", block.name);
                            fd.append("index", block.index);
                            fd.append("file", blob);
                            // post the form to the backend service (asp.net mvc controller action);
                            // contentType must be false so the browser sets the multipart boundary itself
                            $.ajax({
                                url: "/Home/UploadInFormData",
                                data: fd,
                                processData: false,
                                contentType: false,
                                type: "POST",
                                success: function (result) {
                                    if (!result.success) {
                                        alert(result.error);
                                    }
                                    callback(null, block.index);
                                }
                            });
                        });
                    });

                    // invoke the functions one by one,
                    // then invoke the commit ajax call to put the blocks into a blob in azure storage
                    async.series(putBlocks, function (error, result) {
                        var data = {
                            name: fileName,
                            list: list
                        };
                        $.post("/Home/Commit", data, function (result) {
                            if (!result.success) {
                                alert(result.error);
                            }
                            else {
                                alert("done!");
                            }
                        });
                    });
                }
            });
        });
    </script>

    And below is the full ASP.NET MVC controller code.
    public class HomeController : Controller
    {
        private CloudStorageAccount _account;
        private CloudBlobClient _client;

        public HomeController()
            : base()
        {
            _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString"));
            _client = _account.CreateCloudBlobClient();
        }

        public ActionResult Index()
        {
            ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

            return View();
        }

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var index = int.Parse(Request.Form["index"]);
                var file = Request.Files[0];
                var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlock(id, file.InputStream, null);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

        [HttpPost]
        public JsonResult Commit()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var list = Request.Form["list"];
                var ids = list
                    .Split(',')
                    .Where(id => !string.IsNullOrWhiteSpace(id))
                    .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                    .ToArray();

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlockList(ids);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }
    }

    If we now select a file in the browser, we will see our application upload chunks of the size we specified to the server through ajax calls in the background, and then commit all the chunks into one blob. Then we can find the blob in our Windows Azure Blob Storage.

    Optimized by Parallel Upload

    In the previous example we just uploaded our file in chunks. This solves the ASP.NET request content size limitation as well as the Windows Azure load balancer timeout. But it might introduce a performance problem, since we upload the chunks in sequence. In order to improve the upload performance we can modify our client side code a bit to invoke the upload operations in parallel. The good news is that the "async.js" library provides a parallel execution function. If you remember, the code that invokes the service to upload chunks utilized "async.series", which means all functions are executed in sequence. Now we will change this code to "async.parallel", which invokes all functions in parallel.
    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...
        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            ... ...
            // define the function array and push all chunk upload operations into it
            ... ...
            // invoke the functions in parallel,
            // then invoke the commit ajax call to put the blocks into a blob in azure storage
            async.parallel(putBlocks, function (error, result) {
                var data = {
                    name: fileName,
                    list: list
                };
                $.post("/Home/Commit", data, function (result) {
                    if (!result.success) {
                        alert(result.error);
                    }
                    else {
                        alert("done!");
                    }
                });
            });
        }
    });

    In this way all chunks are uploaded to the server side at the same time to maximize the bandwidth usage. This works well if the file is not very large and the chunk size is not very small. But for a large file this might introduce another problem: too many ajax calls are sent to the server at the same time. So the best solution is to upload the chunks in parallel with a maximum concurrency limit. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time.

    $("#upload_button_blob").click(function () {
        // assert the browser supports html5
        ... ...
        // start to upload each file in chunks
        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
            // calculate the start and end byte index for each block (chunk),
            // along with the index, file name and index list for future use
            ... ...
            // define the function array and push all chunk upload operations into it
            ... ...
            // invoke the functions in parallel with a limit of 4,
            // then invoke the commit ajax call to put the blocks into a blob in azure storage
            async.parallelLimit(putBlocks, 4, function (error, result) {
                var data = {
                    name: fileName,
                    list: list
                };
                $.post("/Home/Commit", data, function (result) {
                    if (!result.success) {
                        alert(result.error);
                    }
                    else {
                        alert("done!");
                    }
                });
            });
        }
    });

    Summary

    In this post we discussed how to upload files in chunks to a backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leveraged three new features introduced in HTML5:
    - File.slice: read part of the file by specifying the start and end byte index.
    - Blob: a file-like interface which contains part of the file content.
    - FormData: a temporary form element through which we can pass a chunk along with some metadata to the backend service.

    Then we discussed the performance considerations of chunked uploading. Sequential upload cannot provide the maximum upload speed, but unlimited parallel upload might crash the browser and server if there are too many chunks in flight. So we finally arrived at the solution of uploading chunks in parallel with a concurrency limit. We also demonstrated how to utilize the "async.js" JavaScript library to help us control the asynchronous calls and the parallel limit.

    Regarding the chunk size and the parallel limit there is no single "best" value. You need to test various combinations and find the best one for your particular scenario. It depends on the local bandwidth, the client machine cores, and the cores, memory and bandwidth of the server side (the Windows Azure Cloud Service virtual machine). Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores, on the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) on one small web role (1 core 1.7GHz CPU, 1.75GB memory, 100Mbps bandwidth).
    The test cases were:
    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequential, parallel (unlimited), parallel with limit (4 threads, 8 threads).
    - Chunk format: base64 string, binary.
    - Target file: 100MB.
    - Each case was tested 3 times.

    Some thoughts from the test results, but not guidance or best practice:
    - Parallel gets better performance than sequential.
    - There is no significant performance difference between parallel with 4 threads and 8 threads.
    - Transferring binary chunks provides better performance than base64.
    - In all cases, a chunk size of 1MB - 2MB gets the best performance.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • Upload an image using class.upload.php via AJAX, display image in form using jQuery

    - by Benjamin
    I am using class.upload.php to handle an image upload from a form that submits employee details to a MySQL database. class.upload.php does EXACTLY what I want it to do: resize the image and rename it. What I am now trying to accomplish is to upload via Ajax after the user selects the file, and then display the image in the form while they continue entering details. How would this best be accomplished with jQuery?
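    A minimal sketch of one common approach (the endpoint name upload_ajax.php and the #photo_preview element are hypothetical; it assumes a PHP script that runs class.upload.php and echoes back the resized image's URL as JSON):

    // on file selection, post the file via FormData and preview the server's resized copy
    $('#employee_photo').change(function () {
        var fd = new FormData();
        fd.append('photo', this.files[0]);
        $.ajax({
            url: 'upload_ajax.php',   // hypothetical wrapper around class.upload.php
            type: 'POST',
            data: fd,
            processData: false,       // hand the FormData to XHR untouched
            contentType: false,       // let the browser set the multipart boundary
            dataType: 'json',
            success: function (result) {
                // assumes the server responds with {"url": "path/to/resized.jpg"}
                $('#photo_preview').attr('src', result.url);
            }
        });
    });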


  • Javascript upload in spite of same origin policy

    - by Brian M. Hunt
    Can one upload files to a domain other than the domain a script originates from? For example, suppose you're hosting files on www.example.com and you want to upload files to uploads.example.com. Would the following script violate the same origin policy (using uploadify)?

    <!-- from http://www.example.com/upload.html -->
    <input id="fileInput" name="fileInput" type="file" />
    <script type="text/javascript">// <![CDATA[
    $(document).ready(function() {
        $('#fileInput').uploadify({
            'uploader'  : 'uploadify.swf',
            'script'    : 'http://upload.example.com/uploadify.php',
            'cancelImg' : 'cancel.png',
            'auto'      : true,
            'folder'    : '/uploads'
        });
    });
    // ]]></script>

    I haven't seen a clear reference on the same origin policy that would indicate one way or the other whether this would be an issue with any browsers.
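    Worth noting as a hedged aside: uploadify performs the actual POST from Flash (uploadify.swf), so the governing rule is Flash's cross-domain policy rather than the browser's same origin policy. A sketch of the crossdomain.xml the upload host would need to serve at its document root (an illustration, not a tested config):

    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
        <!-- allow the Flash movie served from www.example.com to post here -->
        <allow-access-from domain="www.example.com" />
    </cross-domain-policy>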


  • Why can't I upload three files with FileUpload?

    - by Valter Henrique
    Hi everyone, I'm trying to upload three images to my server. It works, but it always uploads only the last file selected by the user, not all three. Here's my code:

    protected void doPost(HttpServletRequest request, HttpServletResponse response) {
        boolean multipart = ServletFileUpload.isMultipartContent(request);
        if (multipart) {
            DiskFileItemFactory fileItemFactory = new DiskFileItemFactory();
            fileItemFactory.setSizeThreshold(5 * 1024 * 1024); // 5 MB
            fileItemFactory.setRepository(tmpDir);
            ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
            try {
                List items = uploadHandler.parseRequest(request);
                Iterator itr = items.iterator();
                while (itr.hasNext()) {
                    FileItem item = (FileItem) itr.next();
                    File file = new File(dir, generateNewName());
                    item.write(file);
                }
            } catch (FileUploadException ex) {
            } catch (Exception ex) {
            }
        }
    }

    UPDATE:

    <html>
    <head>
        <title>Upload</title>
    </head>
    <body>
        <form action="Upload" method="post" enctype="multipart/form-data">
            <input type="file" name="file1" /> <br />
            <input type="file" name="file2" /> <br />
            <input type="file" name="file3" /> <br />
            <input type="submit" value="Enviar" />
        </form>
    </body>
    </html>

    Best regards, Valter Henrique.
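    One common cause of this exact symptom (an assumption, since generateNewName() is not shown) is that generateNewName() returns the same value for files written within the same instant, e.g. a timestamp, so each write overwrites the previous file and only the last one survives. A minimal sketch of a collision-proof loop body:

    // derive a unique target name per uploaded item instead of relying on generateNewName()
    while (itr.hasNext()) {
        FileItem item = (FileItem) itr.next();
        if (!item.isFormField()) {                              // skip ordinary form fields
            String unique = java.util.UUID.randomUUID() + "_" + item.getName();
            File file = new File(dir, unique);                  // distinct file per item
            item.write(file);
        }
    }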


  • PHP File upload fails with blank screen when size exceeds 742 KB.

    - by user284839
    Hi friends, this is one of the most bizarre errors I've come across. I've written a little file upload web app for my friend, and it works fine for any file less than or equal to 742KB in size. Needless to say, I arrived at this precise number through relentless testing. The weird part is that if the file size is just a few KB more, for example 743 or 750, I get an error saying "MySQL has gone away". But if it is 1MB or more, I just get a blank screen. And it happens less than 2 seconds after I hit the upload button, so it doesn't look like a time-out to me. I checked the php.ini file for the post size and upload size; they are all set to 5MB or more, and the timeout is set to 60 seconds. The uploaded file sits in a MySQL database in a field of datatype mediumblob. I tried changing that to longblob, but that didn't help either. Any help? Thanks for reading, Girish
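    "MySQL has gone away" on blob inserts is the classic symptom of the query packet exceeding MySQL's max_allowed_packet, which historically defaulted to 1MB; escaping binary data inflates the payload, which would explain a cutoff well below 1MB of file on disk. A hedged check (the 16MB value is just an example):

    -- check the current limit
    SHOW VARIABLES LIKE 'max_allowed_packet';

    -- raise it for the running server (value in bytes);
    -- make it permanent under [mysqld] in my.cnf
    SET GLOBAL max_allowed_packet = 16 * 1024 * 1024;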


  • PHP File Upload second file does not upload, first file does without error

    - by Curtis
    So I have a script I have been using, and it generally works well with multiple files... When I upload a very large file in a multiple file upload, only the first file is uploaded. I am not seeing any errors as to why. I figure this is related to a timeout setting but cannot figure it out - any ideas? I have the following set in my .htaccess file:

    php_value post_max_size 1024M
    php_value upload_max_filesize 1024M
    php_value memory_limit 600M
    php_value output_buffering on
    php_value max_execution_time 259200
    php_value max_input_time 259200
    php_value session.cookie_lifetime 0
    php_value session.gc_maxlifetime 259200
    php_value default_socket_timeout 259200
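    Since php_value directives in .htaccess only take effect when PHP runs as mod_php (they are ignored under CGI/FastCGI), a hedged first step is to confirm the values PHP actually sees at runtime, and to inspect the per-file error codes:

    <?php
    // a minimal sketch: dump the effective limits PHP is really running with
    $keys = array('post_max_size', 'upload_max_filesize', 'memory_limit',
                  'max_execution_time', 'max_input_time');
    foreach ($keys as $key) {
        echo $key, ' = ', ini_get($key), "\n";
    }
    // each entry in $_FILES also carries an 'error' code
    // (UPLOAD_ERR_INI_SIZE, UPLOAD_ERR_PARTIAL, ...) that says why a file was dropped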


  • Upload multiple Images and Ping an address to upload them

    - by azz0r
    Hello! I am trying to allow administrators to upload 20-30 images in one go. I have tried jQuery Uploadify, SWFUpload and various other jQuery scripts, but I have had many issues doing this. Uploadify worked fine, but then it stopped working, and after a day of debugging I couldn't figure out why (and by god, I tried to figure out why). So I have decided to try a Java applet; however, they all seem to cost quite a bit of money. All I need is something to select some images, and then have each image upload ping an address where I can resize it in PHP, and done. Does anyone know a freeware solution that's easy to use?


  • Rename image file on upload php

    - by blasteralfred
    Hi, I have a form which uploads and resizes an image. The HTML file submits data to a PHP file. The script is as follows;

    Index.html

    <form action="resizer.php" method="post" enctype="multipart/form-data">
        Image: <input type="file" name="file" />
        <input type="submit" name="submit" value="upload" />
    </form>

    Resizer.php

    <?php
    require_once('imageresizer.class.php');

    $imagename = "myimagename";

    //Path To Upload Directory
    $dirpath = "uploaded/";

    //MAX WIDTH AND HEIGHT OF IMAGE
    $max_height = 100;
    $max_width = 100;

    //Create Image Control Object - Parameters(file name, file tmp name, directory path)
    $resizer = new ImageResizer($_FILES['file']['name'], $_FILES['file']['tmp_name'], $dirpath);

    //RESIZE IMAGE - Parameters(max height, max width)
    $resizer->resizeImage($max_height, $max_width);

    //Display Image
    $resizer->showResizedImage();
    ?>

    imageresizer.class.php

    <?php
    class ImageResizer {

        public $file_name;
        public $tmp_name;
        public $dir_path;

        //Set variables
        public function __construct($file_name, $tmp_name, $dir_path) {
            $this->file_name = $file_name;
            $this->tmp_name = $tmp_name;
            $this->dir_path = $dir_path;
            $this->getImageInfo();
            $this->moveImage();
        }

        //Move the uploaded image to the new directory and rename
        public function moveImage() {
            if (!is_dir($this->dir_path)) {
                mkdir($this->dir_path, 0777, true);
            }
            if (move_uploaded_file($this->tmp_name, $this->dir_path.'_'.$this->file_name)) {
                $this->setFileName($this->dir_path.'_'.$this->file_name);
            }
        }

        //Define the new filename
        public function setFileName($file_name) {
            $this->file_name = $file_name;
            return $this->file_name;
        }

        //Resize the image to the new max height and width
        public function resizeImage($max_height, $max_width) {
            $this->max_height = $max_height;
            $this->max_width = $max_width;
            if ($this->height > $this->width) {
                $ratio = $this->height / $this->max_height;
                $new_height = $this->max_height;
                $new_width = ($this->width / $ratio);
            }
            elseif ($this->height < $this->width) {
                $ratio = ($this->width / $this->max_width);
                $new_width = $this->max_width;
                $new_height = ($this->height / $ratio);
            }
            else {
                $new_width = $this->max_width;
                $new_height = $this->max_height;
            }
            $thumb = imagecreatetruecolor($new_width, $new_height);
            switch ($this->file_type) {
                case 1: $image = imagecreatefromgif($this->file_name); break;
                case 2: $image = imagecreatefromjpeg($this->file_name); break;
                case 3: $image = imagecreatefrompng($this->file_name); break;
                case 4: $image = imagecreatefromwbmp($this->file_name);
            }
            imagecopyresampled($thumb, $image, 0, 0, 0, 0, $new_width, $new_height, $this->width, $this->height);
            switch ($this->file_type) {
                case 1: imagegif($thumb, $this->file_name); break;
                case 2: imagejpeg($thumb, $this->file_name, 100); break;
                case 3: imagepng($thumb, $this->file_name, 0); break;
                case 4: imagewbmp($thumb, $this->file_name);
            }
            imagedestroy($image);
            imagedestroy($thumb);
        }

        public function getImageInfo() {
            list($width, $height, $type) = getimagesize($this->tmp_name);
            $this->width = $width;
            $this->height = $height;
            $this->file_type = $type;
        }

        public function showResizedImage() {
            echo "<img src='".$this->file_name."' />";
        }

        public function onSuccess() {
            header("location: index.php");
        }
    }
    ?>

    Everything works well. The image is uploaded with its original filename and extension, plus a "_" prefix. But I want to rename the image to "myimagename" on upload, which is a variable in "Resizer.php". How can I make this possible? Thanks in advance :) blasteralfred
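    A hedged sketch of one way to do it: pass the desired name in as an extra (new) constructor argument, store it, and build the target path from it plus the original extension. Resizer.php would then call `new ImageResizer($_FILES['file']['name'], $_FILES['file']['tmp_name'], $dirpath, $imagename)`:

    //Move the uploaded image, renaming it to the caller-supplied name
    //(assumes the constructor stored the 4th argument as $this->new_name)
    public function moveImage() {
        if (!is_dir($this->dir_path)) {
            mkdir($this->dir_path, 0777, true);
        }
        // keep the original extension, replace the base name
        $ext = pathinfo($this->file_name, PATHINFO_EXTENSION);
        $target = $this->dir_path . $this->new_name . '.' . $ext;
        if (move_uploaded_file($this->tmp_name, $target)) {
            $this->setFileName($target);
        }
    }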


  • Use WatiN for automation upload file on the website

    - by MrDuDuDu
    I need to upload a file on a website, but I have a problem: I can't choose the file automatically in code. The browser always shows me the choose file window. What is wrong in my code?

    IE ie = new IE("https://www.xxxx.com/WFrmlogin.aspx");
    FileUploadDialogHandler uploadHandler = new FileUploadDialogHandler(@"D:\065-6405_URGENT.xls");
    ie.WaitForComplete();
    ie.TextField(Find.ById("txtUser")).TypeText("login");
    ie.TextField(Find.ById("txtPassWord")).TypeText("***");
    ie.Button(Find.ById("btnok")).Click();
    ie.WaitForComplete();
    ie.GoTo("https://www.orientspareparts.com/inq/WFrmUpOption.aspx");
    ie.WaitForComplete();
    ie.DialogWatcher.Clear();
    ie.AddDialogHandler(uploadHandler);
    // This code shows the choose file dialog
    ie.FileUpload(Find.ById("FilUpload")).ClickNoWait();
    ie.Button(Find.ById("butUpload")).Click();
    ie.WaitForComplete();


  • HTTP Upload Problems

    - by jfoster
    We are running a marketplace on ColdFusion 8 and IIS with a widely geographically distributed user base, and have been receiving complaints of issues with some HTTP uploads. Most of the complaints come from locations geographically distant from our main datacenter on the US east coast. I've attempted to upload the same 70MB file from a US west coast test server to both our main site and a backup running the same code on a different network route, and I saw the same issues fairly consistently in both places, so I've ruled out the code, the route, and internal network errors. I've also tested uploads using both the native cf upload tag and a third party tool called SaFileUp. I saw the same issues with both upload tools, so I also don't think this is necessarily a ColdFusion problem. I don't have any problems uploading the test file from the east coast to other east coast servers, so I'm beginning to think that the distance between our users and our equipment is a factor. I've also found that smaller files (< 10MB) are more likely to succeed than large ones. I tried the test upload with both IE and FF and noticed a difference in the way the browsers seemed to handle packet errors: IE seemed to have a tough time continuing an upload after dropped or bad packets, whereas FF seemed able to gracefully resume an upload after experiencing packet problems. Has anyone experienced similar issues? Is there anything we can do on our side to make uploads more forgiving of packet loss, or resumable after an error? A different upload tool etc… Do we need upload servers in more than one location to shorten the network routes between clients and servers? Does anyone think that switching uploads to SSL will help (no layer-7 packet sniffing may lead to a smoother upload)? Thanks.


  • Is it possible to upload only files that have been updated into a server?

    - by kamikaze_pilot
    Hi guys, suppose I have a server accessible via FTP that hosts websites. Suppose I want to edit a website locally so it won't affect the live site, and suppose I edit a whole bunch of files and don't want to deal with the hassle of keeping track of which files I've edited. Once I've finished editing I want to upload the changes to the server via FTP. Is there some FTP software that automatically detects which files have been edited and uploads and overwrites only those files, rather than having me manually choose the files I've edited (and hence having to keep track of edited files) or upload the entire site, which is a waste of time? Thanks in advance.
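    One hedged option, assuming a Unix-like client (the host and paths below are placeholders): lftp's mirror command in reverse mode compares timestamps and sizes and transfers only the files that have changed locally:

    # upload only the files that are newer locally than on the server
    lftp -u user,password ftp.example.com -e "mirror --reverse --only-newer --verbose /local/site /public_html; quit"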


  • Upload picture and send path to database

    - by opa.mido
    I am trying to upload a picture and link it to my database, and I used the code below:

    Upload2.php

    <?php
    // Check if a file has been uploaded
    if (isset($_FILES['uploaded_file'])) {
        // Make sure the file was sent without errors
        if ($_FILES['uploaded_file']['error'] == 0) {
            // Connect to the database
            $dbLink = new mysqli('localhost', 'root', '1234', 'fyp');
            if (mysqli_connect_errno()) {
                die("MySQL connection failed: " . mysqli_connect_error());
            }

            // Gather all required data
            $name = $dbLink->real_escape_string($_FILES['uploaded_file']['name']);
            $mime = $dbLink->real_escape_string($_FILES['uploaded_file']['type']);
            $data = $dbLink->real_escape_string(file_get_contents($_FILES['uploaded_file']['tmp_name']));
            $size = intval($_FILES['uploaded_file']['size']);

            // Create the SQL query
            $query = "
                INSERT INTO `pic` (
                    `Name`, `size`, `data`, `created`
                )
                VALUES (
                    '{$name}', '{$mime}', {$size}, '{$data}', NOW()
                )";

            // Execute the query
            $result = $dbLink->query($query);

            // Check if it was successful
            if ($result) {
                echo 'Success! Your file was successfully added!';
            }
            else {
                echo 'Error! Failed to insert the file' . "<pre>{$dbLink->error}</pre>";
            }
        }
        else {
            echo 'An error occurred while the file was being uploaded. '
               . 'Error code: ' . intval($_FILES['uploaded_file']['error']);
        }

        // Close the mysql connection
        $dbLink->close();
    }
    else {
        echo 'Error! A file was not sent!';
    }

    // Echo a link back to the main page
    echo '<p>Click <a href="index.html">here</a> to go back</p>';
    ?>

    I reached the last stage and it gives the error that a file was not sent. I don't know what I have missed. Thank you.
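    "A file was not sent" means $_FILES['uploaded_file'] was empty, which usually points at the form rather than this script. A sketch of the form this code expects; the field name and the enctype both matter:

    <!-- without enctype="multipart/form-data", $_FILES stays empty -->
    <form action="Upload2.php" method="post" enctype="multipart/form-data">
        <input type="file" name="uploaded_file" />
        <input type="submit" value="Upload" />
    </form>

    Separately, note that the INSERT lists four columns but supplies five values, so it will fail even once a file does arrive; assuming a column for the mime type exists (the `mime` name here is an assumption), the query would be something like:

    $query = "
        INSERT INTO `pic` (`Name`, `mime`, `size`, `data`, `created`)
        VALUES ('{$name}', '{$mime}', {$size}, '{$data}', NOW())";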


  • PHP Image Upload object won't create directory on windows

    - by ThinkingInBits
    I'm writing a class that accepts a product_id and a $_FILES array in its constructor. The class saves these variables and uses them to create a new directory named after the product_id, and attempts to copy the temp file to my specified directory. For some reason my class isn't even getting as far as creating the directory (which should happen in the constructor). Here's a link to my code: http://pastie.org/955454


  • File upload help in JSP /Struts 1.2

    - by user57421
    Hi all, I have seen a lot of reviews and have not been able to figure out how to solve my problem. The problem: currently we have a page to upload a file from the local machine to the repository. It currently uses the Struts upload. Since users upload files of around 1GB, they are made to wait for a long time. So the requirement was changed: the user browses for and selects the file to upload, and hitting the upload button should return control to the next screen immediately. The upload process somehow needs to be pushed to the backend, which should complete the upload and send out an email once it is done. I'm not able to figure out how to transfer control to the next page while the upload is still running... any idea or help will be appreciated. Regards, Senny
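    A minimal sketch of one approach (all helper names are hypothetical): once the Struts action has streamed the uploaded FormFile to a temporary location, hand the slow repository copy and the notification email to a background thread and forward immediately:

    // imports assumed: java.io.File, java.util.concurrent.ExecutorService,
    // java.util.concurrent.Executors, plus the usual Struts 1.2 action types
    private static final ExecutorService BACKGROUND = Executors.newSingleThreadExecutor();

    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request, HttpServletResponse response) {
        final File tempFile = saveToTempDir(form);            // hypothetical: quick local save of the FormFile
        final String userEmail = request.getParameter("email"); // hypothetical field identifying the user

        BACKGROUND.submit(new Runnable() {                    // Struts 1.2 era, so no lambdas
            public void run() {
                copyToRepository(tempFile);                   // hypothetical: the slow 1GB transfer
                sendCompletionMail(userEmail);                // hypothetical notification
            }
        });
        return mapping.findForward("uploadStarted");          // control returns to the user at once
    }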


  • Limit vsftpd upload to a given set of file-names

    - by Chen Levy
    I need to configure an anonymous ftp with upload. Given this requirement I try to lock this server down to the bare minimum. One of the restrictions I wish to impose is to enable the upload of only a given set of file-names. I tried to disallow write permission on the upload folder, and put in it some empty files with write permission:

    /var/ftp/            [root.root] [drwxr-xr-x]
    |-- upload/          [root.root] [drwxr-xr-x]
    |   |-- upfile1      [ftp.ftp]   [--w-------]
    |   `-- upfile2      [ftp.ftp]   [--w-------]
    `-- download/        [root.root] [drwxr-xr-x]
        `-- ...

    But this approach didn't work, because when I tried to upload upfile1, the client tried to delete the file and create a new one in its place, and there are no permissions for that. Is there a way to make this work, or perhaps a different approach, like abusing the deny_file option?



  • ISPs with good upload speeds? [closed]

    - by Josh Comley
    I am a web developer, and I spend most of my time waiting not for downloads but for my latest build of a website to publish to a test site. I use the excellent BeyondCompare to ensure I only upload what I need to upload. But still, if a 2MB C# DLL has changed, so be it, and I must wait. At work we host our own servers and have a dedicated line, so for 8.5 hours of my day I have blistering upload speeds across the web, which is nice. At home, however, it's a different story. I am with Virgin Media on their XL internet package (I think); I believe that means I get 256Kb upload and 20Mb download. So I have begun to wonder - are there any ISPs in the UK with good upload speeds? If you know of any for other countries, do post for others' sake, but please specify which country your post is relevant to.


  • Best Practice: Apache File Upload

    - by matnagel
    I am looking for a solution for trusted users to upload PDF files via HTML forms (with maybe PHP involved). This is a quite standard Ubuntu Linux server with Apache 2.x and PHP 5. I am wondering what the benefits of the Apache file upload module are. There have been no updates for some time; is it actively maintained? What are the advantages over a traditional PHP upload with Apache 2 without this module? http://commons.apache.org/fileupload I remember traditional PHP file upload is difficult, with some pitfalls; will the Apache file upload module improve the situation? The solution I am looking for will be part of an existing website and be integrated into the admin web frontend. Things I am not considering are WebDAV, SSH, FTP, FTPS, and FTP over SSH. It should work with a browser and without installing special client software, so I am asking about a browser-based upload without special client side requirements. I can require a modern browser like Firefox >= 3.5 or a modern WebKit browser like Chrome or Safari from the users.
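    For reference, the "traditional PHP upload" being compared against is roughly this (a minimal sketch; the pitfalls are mostly about checking the error code and using move_uploaded_file rather than trusting the client-supplied path):

    <?php
    // minimal sketch of a plain PHP form upload handler for trusted users
    if (isset($_FILES['pdf']) && $_FILES['pdf']['error'] === UPLOAD_ERR_OK) {
        // never use the client-supplied name as a path; strip it to a base name
        $target = '/var/www/uploads/' . basename($_FILES['pdf']['name']);
        // the reported MIME type is advisory only, it comes from the client
        if ($_FILES['pdf']['type'] === 'application/pdf'
            && move_uploaded_file($_FILES['pdf']['tmp_name'], $target)) {
            echo 'stored ' . htmlspecialchars($target);
        }
    }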


  • Upload using python script takes very long on one laptop as compared to another

    - by Engr Am
    I have Python 2.7 code which uses the STORBINARY function for uploading files to an FTP server and RETRBINARY for downloading from this server. However, the issue is that the upload takes a very long time on three laptops from different brands as compared to a Dell laptop. The strange part is that when I manually upload any file, it takes the same time on all the systems. The manual upload rate and the upload rate with the Python script are the same on the Dell laptop. However, on every other brand of laptop (I have tried IBM, Toshiba and Fujitsu-Siemens) the Python script has a much lower upload rate than a manual attempt. Also, on all these other laptops the upload rate using the Python script is the same (1Mbit/s) while the manual upload rate is approximately 8Mbit/s. I have tried varying the file size for the upload, to no avail. TCP Optimizer improved the download rate on all the systems but had no effect on the upload rate. The download rate using this script is fine on all the systems and the same as the manual download rate. I have checked the server and it has more than 90% free space. The network connection is the same for all the laptops, and I try uploading with only one laptop at a time. All the laptops have almost the same system configuration, the same operating system and approximately the same free drive space. If anything, the Dell laptop has a little less processing power and RAM than two of the others, but I suppose this has no effect, as I have checked many times how much CPU and network were used during these uploads and downloads, and I am sure that no virus or other program was eating up my bandwidth. Here is the code ('ftp' and 'file_path' are inputs to the function):

    path, filename = os.path.split(file_path)
    filesize = os.path.getsize(file_path)
    deffilesize = (filesize / 1024) / 1024
    f = open(file_path, "rb")
    upstart = time.clock()
    print ftp.storbinary("STOR " + filename, f)
    upende = time.clock() - upstart
    outname = "Upload "
    f.close()
    return upende, deffilesize, outname
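    One knob worth trying (a hedged suggestion, not a diagnosis): ftplib's storbinary sends the file in 8192-byte blocks by default, and small socket writes can interact badly with some network stacks; the blocksize parameter is part of the standard ftplib API:

    # same call, but with a larger block size (256KB here, an arbitrary example)
    print ftp.storbinary("STOR " + filename, f, 262144)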


  • Couldn't upload files to Sharepoint site while passing through Squid Proxy

    - by Ecio
    Hi all, we have this issue: one of our employees is collaborating with a supplier, and he needs to upload documents to a SharePoint site hosted on the supplier's main site. In our environment we use a Squid proxy to allow people to navigate the net (we have NTLM authentication, and users transparently authenticate while using IE and FF). It seems that this specific SharePoint site uses Integrated Windows Authentication only, and according to some research on the net this can have trouble with proxies. More specifically, we have tried two Squid versions: with Squid 3.0 we are unable to log in to the site (the browser loads an empty page); with Squid 2.7 (which supports "connection pinning") we are able to log in to the site and move around the different sections, BUT when we try to upload a file bigger than a couple of kilobytes (e.g. 10KB) the browser loads an error page (I think it's a 401 Unauthorized, but I must verify that). We've tried changing a couple of Squid options (in 2.7); what we got is that when you try to upload the file you get an authentication box (just like the initial login) and it refuses to go on even if you enter the same authentication credentials. What's really strange is that when you try to upload a small file (e.g. a 1KB text or binary file) the upload succeeds. I initially thought that maybe something was misconfigured on their SharePoint site, but I also tried this site: www.xsolive.com (it's a SharePoint 2007 demo site) and experienced the same problem. Has any of you experienced such behaviour? Thanks! Of course we've suggested that the supplier also activate Basic+SSL, and we're waiting for their reply.


  • Nginx upload PUT and POST

    - by w00t
    I am trying to make nginx accept the POST and PUT methods to upload files. I have compiled nginx_upload_module-2.2.0, but I can't find any how-to. I simply want to use only nginx for this: no reverse proxy, no other backend and no PHP. Is this achievable? This is my conf:

    nginx version: nginx/1.2.3 TLS SNI support enabled
    configure arguments: --prefix=/etc/nginx --sbin-path=/usr/sbin/nginx --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --http-client-body-temp-path=/var/cache/nginx/client_temp --http-proxy-temp-path=/var/cache/nginx/proxy_temp --http-fastcgi-temp-path=/var/cache/nginx/fastcgi_temp --http-uwsgi-temp-path=/var/cache/nginx/uwsgi_temp --http-scgi-temp-path=/var/cache/nginx/scgi_temp --user=nginx --group=nginx --with-http_ssl_module --with-http_realip_module --with-http_addition_module --with-http_sub_module --with-http_dav_module --with-http_flv_module --with-http_mp4_module --with-http_gzip_static_module --with-http_random_index_module --with-http_secure_link_module --with-http_stub_status_module --with-mail --with-mail_ssl_module --with-file-aio --with-ipv6 --with-cc-opt='-O2 -g' --add-module=/usr/src/nginx-1.2.3/nginx_upload_module-2.2.0

    server {
        listen 80;
        server_name example.com;

        location / {
            root /html;
            autoindex on;
        }

        location /upload {
            root /html;
            autoindex on;
            upload_store /html/upload 1;
            upload_set_form_field $upload_field_name.name "$upload_file_name";
            upload_set_form_field $upload_field_name.content_type "$upload_content_type";
            upload_set_form_field $upload_field_name.path "$upload_tmp_path";
            upload_aggregate_form_field "$upload_field_name.md5" "$upload_file_md5";
            upload_aggregate_form_field "$upload_field_name.size" "$upload_file_size";
            upload_pass_form_field "^submit$|^description$";
            upload_cleanup 400 404 499 500-505;
        }
    }

    And as an upload form I'm trying to use the one listed at the end of this page: http://grid.net.ru/nginx/upload.en.html


  • File upload permission problem IIS 7

    - by krish
    I am unable to upload files to a website hosted under IIS 7. I have already given write permissions to "IUSR_websitename" and set the property in web.config as well. I am able to upload files without logging in to the application, at the time of user registration. But once logged in to the application, if I upload files it gives an "Access denied" error. Please help me.


  • Panic Transmit file upload

    - by 1ndivisible
    I've ditched Coda and bought Transmit. I'm a little confused by the file uploading. I have exactly the same folder structure remotely and locally, but if I right-click a file and choose Upload "SomeFileName.html", the file is always uploaded into the root of the remote site, even if the file is in a folder. If I choose to upload a file at assets/images/some_image.png, I would expect it to be uploaded to the same folder on the remote server, not the root. Coda dealt with this perfectly and also told me which files had been modified and needed uploading. Transmit doesn't seem to do either of these things. So my questions are: How can I upload a file to the same path on the remote server without having to drag and drop? Is there any way to have Transmit mark edited files or upload only edited files? [There is no tag for Transmit, so if someone with more rep could make and add one, that would be grand.]


  • Multiple file upload with asp.net 4.5 and Visual Studio 2012

    - by Jalpesh P. Vadgama
    This post is part of the Visual Studio 2012 feature series. In earlier versions of ASP.NET there was no way to upload multiple files at the same time; we needed to use a third party control or create a custom control for that. But with ASP.NET 4.5 it is now possible to upload multiple files with the file upload control. In ASP.NET 4.5 Microsoft has enhanced the file upload control to support the HTML5 multiple attribute. There is a property called 'AllowMultiple' to support that attribute, and with it you can easily upload multiple files. So what are we waiting for? It's time to create an example. On the default.aspx page I have written the following:

    <asp:FileUpload ID="multipleFile" runat="server" AllowMultiple="true" />
    <asp:Button ID="uploadFile" runat="server" Text="Upload files" onclick="uploadFile_Click"/>

    Here you can see that I have given the file upload control the id multipleFile and set AllowMultiple to true. I have also added one button for uploading the files, with an event handler attached to its click event. For this example I am going to upload files into the Images folder. So it's time to write the server side code. The following is the server side:

    protected void uploadFile_Click(object sender, EventArgs e)
    {
        if (multipleFile.HasFiles)
        {
            foreach (HttpPostedFile uploadedFile in multipleFile.PostedFiles)
            {
                uploadedFile.SaveAs(System.IO.Path.Combine(Server.MapPath("~/Images/"), uploadedFile.FileName));
                Response.Write("File uploaded successfully");
            }
        }
    }

    In the above code you can see that I check whether the multiple file upload control has files, and then save each of them in the Images folder of the web application. If you run the application in the browser and select two files, clicking the upload button reports that the files were uploaded successfully, and you can then see them in Windows Explorer. As you can see, it's very easy to upload multiple files in ASP.NET 4.5. Stay tuned for more; till then, happy programming. P.S.: This feature is only supported in browsers that support HTML5 multiple file upload. In other browsers it will work like the normal file upload control in ASP.NET.

