Azure – Part 6 – Blob Storage Service

Posted by Shaun on Geeks with Blogs
Published on Wed, 05 May 2010 00:33:16 GMT

When you migrate your application onto Azure, one of the biggest concerns is the external files. In the traditional model we know exactly which machine and folder our application (website or web service) lives in, so we can use MapPath or some other method to read and write external files such as images, text files, XML files, etc. But things change when we deploy to Azure. Azure is not a server or a single machine; it's a set of virtual machines running under the Azure OS. Even worse, your application might be moved between these machines, so we cannot rely on the local file system for external files on Azure. In order to resolve this issue, Windows Azure provides another storage service for us: Blob.

Unlike the table service, the blob service is used to store text and binary data rather than structured data. It provides two types of blobs: Block Blobs and Page Blobs.

  • Block Blobs are optimized for streaming. They are comprised of blocks, each of which is identified by a block ID, and each block can be at most 4 MB in size.
  • Page Blobs are optimized for random read/write operations and provide the ability to write to a range of bytes in a blob. They are a collection of pages, and the maximum size for a page blob is 1 TB. The sketch below contrasts the two upload models.
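To make the difference concrete, here is a minimal sketch of mine (not from the original post) showing both upload models with the v1.x StorageClient library; the container name and the sample data are assumptions:

    // assumes: using System; using System.IO; using Microsoft.WindowsAzure.StorageClient;
    // and an initialized CloudBlobClient named blobClient
    CloudBlobContainer container = blobClient.GetContainerReference("demo");
    container.CreateIfNotExist();

    // block blob: upload named blocks (each at most 4 MB), then commit the block list
    CloudBlockBlob blockBlob = container.GetBlockBlobReference("streamed.dat");
    byte[] chunk = new byte[4 * 1024 * 1024];
    string blockId = Convert.ToBase64String(BitConverter.GetBytes(0)); // block IDs must be base64 strings of equal length
    blockBlob.PutBlock(blockId, new MemoryStream(chunk), null);
    blockBlob.PutBlockList(new[] { blockId });

    // page blob: create with a fixed size, then write 512-byte-aligned page ranges
    CloudPageBlob pageBlob = container.GetPageBlobReference("random.dat");
    pageBlob.Create(1024 * 1024);                             // size must be a multiple of 512 bytes
    pageBlob.WritePages(new MemoryStream(new byte[4096]), 0); // random write at offset 0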

 

In the managed library, the Azure SDK allows us to communicate with the blobs through the CloudBlobClient, CloudBlobContainer, CloudBlockBlob and CloudPageBlob classes.

Similar to the table service managed library, the CloudBlobClient allows us to reach the blob service by passing in our storage account information, and it is also responsible for creating the blob container if it does not exist. Then from the CloudBlobContainer we can save or load block blobs and page blobs through the CloudBlockBlob and CloudPageBlob classes.
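To make the object model concrete, here is a small sketch of mine (not from the original post) walking the chain from the storage account down to a blob; it assumes a CloudStorageAccount named storageAccount has already been initialized:

    // account -> client -> container -> blob
    CloudBlobClient client = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = client.GetContainerReference("account-logo");
    container.CreateIfNotExist();

    // a blob is addressed by its name within the container
    CloudBlockBlob blob = container.GetBlockBlobReference("sample.jpg");

    // the blobs in a container can also be enumerated
    foreach (IListBlobItem item in container.ListBlobs())
    {
        Console.WriteLine(item.Uri);
    }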

 

Let’s improve our example from the previous posts and add a service method that allows the user to upload a logo image.

On the server side I created a method named UploadLogo with two parameters: email and image. I created the storage account from the config file, and I also added validation to ensure that the email passed in is valid.

    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var accountContext = new DynamicDataContext<Account>(storageAccount);

    // validation
    var accountNumber = accountContext.Load()
        .Where(a => a.Email == email)
        .ToList()
        .Count;
    if (accountNumber <= 0)
    {
        throw new ApplicationException(string.Format("Cannot find the account with the email {0}.", email));
    }
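One caveat worth mentioning: in the 1.x SDK, FromConfigurationSetting throws an InvalidOperationException unless a configuration setting publisher has been registered first, typically in the role's OnStart method. A minimal sketch of the usual registration:

    // typically placed in WebRole.OnStart() / WorkerRole.OnStart();
    // requires a reference to Microsoft.WindowsAzure.ServiceRuntime
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // read the value from the service configuration (.cscfg) file
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    });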

Then there are three steps for saving the image into the blob service. First, just like with the table service, I get a reference to the container by a unique name and create it if it does not exist.

    // create the blob container for account logos if it does not exist
    CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobStorage.GetContainerReference("account-logo");
    container.CreateIfNotExist();

Then, since in this example I will just send the blob's access URL back to the client, I need to open read permission on that container.

    // configure blob container for public access
    BlobContainerPermissions permissions = container.GetPermissions();
    permissions.PublicAccess = BlobContainerPublicAccessType.Container;
    container.SetPermissions(permissions);
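Note that the Container access level also allows anonymous clients to list all the blobs in the container. If we only wanted the individual blob URLs to be readable, the Blob level would be the tighter choice:

    // Blob: anonymous clients can read a blob when they know its URL,
    // but they cannot enumerate the container's contents
    permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(permissions);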

And at the end I build a unique blob name from the email and a Guid, then save the image to a block blob using the UploadByteArray method. Finally I return the URL of this blob back to the client side.

    // save the blob into the blob service
    string uniqueBlobName = string.Format("{0}_{1}.jpg", email, Guid.NewGuid().ToString());
    CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
    blob.UploadByteArray(image);

    return blob.Uri.ToString();
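Putting the three steps together with the validation, the whole service method would look something like this (a sketch assembled from the fragments above; DynamicDataContext<T> comes from the earlier posts in this series):

    public string UploadLogo(string email, byte[] image)
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        var accountContext = new DynamicDataContext<Account>(storageAccount);

        // validation: make sure the account exists
        var accountNumber = accountContext.Load()
            .Where(a => a.Email == email)
            .ToList()
            .Count;
        if (accountNumber <= 0)
        {
            throw new ApplicationException(string.Format("Cannot find the account with the email {0}.", email));
        }

        // create the blob container for account logos if it does not exist
        CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobStorage.GetContainerReference("account-logo");
        container.CreateIfNotExist();

        // configure the blob container for public read access
        BlobContainerPermissions permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);

        // save the blob into the blob service and return its URL
        string uniqueBlobName = string.Format("{0}_{1}.jpg", email, Guid.NewGuid().ToString());
        CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
        blob.UploadByteArray(image);

        return blob.Uri.ToString();
    }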

Let’s update the client side application a bit and see the result. Here I just use my simple console application and let the user input the email and the file name of the image. If everything is OK it will show the URL of the blob returned from the server side, so that we can open it in a web browser.
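The client itself is nothing special; a sketch of what it might look like (the LogoServiceClient proxy name is an assumption, standing in for whatever the generated service proxy is called):

    static void Main(string[] args)
    {
        Console.Write("Email: ");
        string email = Console.ReadLine();
        Console.Write("Image file: ");
        string fileName = Console.ReadLine();

        // read the image and send it to the UploadLogo service method
        byte[] image = File.ReadAllBytes(fileName);
        var client = new LogoServiceClient();
        string url = client.UploadLogo(email, image);
        Console.WriteLine("Logo uploaded to: {0}", url);
    }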

[Image: the console client showing the blob URL returned by the service]

Then we can see the logo I’ve just uploaded by visiting that URL.

[Image: the uploaded logo viewed in a web browser through the blob URL]

You may notice that the blob URL is based on the container name and the blob's unique name. The Azure SDK documentation has a page on the rules for naming them, but the simple rule would be: they must be valid as part of a URL. So you cannot name the container with a dot or slash, as the storage REST request will be rejected. For example, if you name the blob container Account.Logo, it will throw an exception saying 400 Bad Request.
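For reference, the documented rule for container names is stricter than just being URL-safe: 3 to 63 characters, lowercase letters, digits and hyphens only. A quick sketch of what passes and what fails:

    CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();

    // valid: lowercase letters, digits and hyphens, 3-63 characters
    var ok = blobStorage.GetContainerReference("account-logo");
    ok.CreateIfNotExist(); // succeeds

    // invalid: uppercase letters and dots are rejected by the service
    var bad = blobStorage.GetContainerReference("Account.Logo");
    bad.CreateIfNotExist(); // throws StorageClientException: 400 Bad Request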

 

Summary

In this short entry I covered the simple usage of the blob service to save images onto Azure. Since the Azure platform does not give us a stable local file system, we have to migrate our file reading/writing code to the blob service before deploying to Azure.

To reduce this effort, Microsoft provides a new approach named Windows Azure Drive, which allows us to read and write NTFS files just like we did before. It's built on top of the blob service but is better suited for file access. I will discuss it more in the next post.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
