
Revisiting Windows Azure Shared Access Signature

02.14.2013

In this blog post, we’ll talk about the Shared Access Signature (SAS) functionality in Windows Azure. Steve Marx (@smarx) wrote excellent blog posts on the same subject a few years ago, which you can read here: http://blog.smarx.com/posts/shared-access-signatures-are-easy-these-days, http://blog.smarx.com/posts/uploading-windows-azure-blobs-from-silverlight-part-1-shared-access-signatures. However, a lot has changed since Steve wrote those posts, and I thought it might be useful to write about the subject taking those new things into consideration (hence the word “Revisiting” in the title :)). This blog will start with a basic explanation of shared access signatures and the purpose they serve, and then I will show you some code to perform some basic operations against Windows Azure Storage, especially Blob Storage.

Now that the agenda is set, let’s start :)

Shared Access Signature (SAS) – Why and What?

This section describes why we need SAS and what exactly it is. If you’re already familiar with these concepts, feel free to skip this section; otherwise, read on!

Why?

First, let’s understand why we need SAS. As you may already know, access to Windows Azure Storage is not anonymous. Rather, you need an “account key” to access resources in a storage account. This account key (actually there are two account keys – primary and secondary) is generated automatically for you by Windows Azure Storage, and you also get the ability to regenerate the keys on demand. The problem with account-key-based access is that the account key grants “admin” privileges on the storage account in question: anybody who has access to this account key can perform any allowed operation on the storage account. This is obviously a security concern, and one must keep the account key secret at all times and guard it with full zeal.

However, there are scenarios where you would want to grant a user (who may be a partner or a customer, etc.) restricted access to your storage account. You would want that user to perform only a few operations, and only for a limited amount of time. Obviously, handing that user your account key won’t do you any good because that would make the user an admin. The answer to this problem is SAS. A SAS allows you to grant temporary access to private resources in your storage account.

I was recently reading Bill Wilder’s (@codingoutloud) book, Cloud Architecture Patterns, and he explains this concept very nicely using a “Valet Key” analogy. If you’re planning on building a cloud application in Windows Azure, I would strongly recommend reading this book before writing code.

What?

Simply put, a SAS is a URL (or rather a part of one). You, as the creator of the SAS, define how long it is valid and what a user in possession of it can do with the resources it protects. For example, you could define a SAS which allows a user to upload a file into a blob container in your storage account and which expires after, say, 1 hour. Within that hour, any user who has the SAS will be able to upload files into the blob container. After that hour, the link becomes useless.

You could create a SAS for blob resources (blob containers and blobs), table resources (tables and PartitionKey/RowKey ranges) and queue resources (queues and messages). You could grant one or more CRUD operation permissions in a SAS. Lastly, you could define a start and an end date/time for which the SAS is valid.
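To make this concrete, here is what a SAS URI for a blob container might look like (the account name, container name and signature value below are made up for illustration – a real SAS would not work with these values):

```
https://myaccount.blob.core.windows.net/mycontainer?sv=2012-02-12&st=2013-02-14T09%3A00%3A00Z&se=2013-02-14T10%3A00%3A00Z&sr=c&sp=wl&sig=dD80ihBh5jfNpym...
```

Everything before the “?” is the regular resource URI; the query string is the SAS itself: “sv” is the storage service version used to construct the signature, “st” is the (optional) start time, “se” is the expiry time, “sr” identifies the resource type (“c” for container, “b” for blob), “sp” lists the permissions (“r” read, “w” write, “d” delete, “l” list), and “sig” is an HMAC-SHA256 signature computed over these values with the account key, which is how the service validates the SAS without the account key ever being shared.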

With blob storage, you could create a SAS for a blob container or a blob. The following table summarizes the permissions and the purpose they serve on the various blob resources:

| Permission Type | Blob Container | Blob |
| --- | --- | --- |
| Read | Not applicable on a blob container. | Gives the ability to read the contents of the blob, as well as its properties and metadata. |
| Write | Gives the ability to upload one or more blobs into the blob container. Also gives the ability to update properties and metadata and to create snapshots of a blob. | Gives the ability to upload a new blob (by the same name) or overwrite an existing blob (by the same name). Also gives the ability to update properties and metadata and to create snapshots of the blob. |
| List | Lists the blobs in the blob container. | Not applicable on blobs. |
| Delete | Not applicable on a blob container. | Gives the ability to delete the blob. |

With table storage, you could create a SAS for a table or for entity ranges within a table; for entity ranges, you specify PartitionKey/RowKey ranges. The following table summarizes the permissions and the purpose they serve on the various table resources:

| Permission Type | Table | Entity Ranges (PartitionKey/RowKey Ranges) |
| --- | --- | --- |
| Query | Gives the ability to query the table. | Gives the ability to query the table, but the query is limited to the entity range specified in the SAS. |
| Add | Gives the ability to add an entity to the table. | Gives the ability to add an entity to the table, provided the entity’s PartitionKey/RowKey values are within the specified range. |
| Update | Gives the ability to update an entity in the table. | Gives the ability to update an entity in the table, provided the entity’s PartitionKey/RowKey values are within the specified range. |
| Delete | Gives the ability to delete an entity from the table. | Gives the ability to delete an entity from the table, provided the entity’s PartitionKey/RowKey values are within the specified range. |

With queue storage, you could create a SAS for a queue. The following table summarizes the permissions and the purpose they serve on queue resources:

| Permission Type | Queue |
| --- | --- |
| Read or Peek | Gives the ability to peek at the messages in the queue. Please note that peeking lets the user view messages in read-only mode and does not alter the visibility of the messages. |
| Add | Gives the ability to add messages to the queue. |
| Process | Gives the ability to get messages from the queue. Please note that this also gives the user the ability to delete messages, and it alters the visibility of the messages. |
| Update | Gives the ability to update a message in the queue. This permission also requires the Process permission. |
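Although the code later in this post focuses on blob storage, creating a SAS for a queue follows the same pattern. Here is a minimal sketch using storage client library 2.0; the queue name and the chosen permissions are just assumptions for illustration:

```csharp
// Sketch only: "myqueue" is a hypothetical queue, and queueClient is assumed
// to be a CloudQueueClient obtained from your CloudStorageAccount.
CloudQueue queue = queueClient.GetQueueReference("myqueue");
string queueSas = queue.GetSharedAccessSignature(new SharedAccessQueuePolicy()
{
    // Allow peeking plus getting/deleting messages.
    Permissions = SharedAccessQueuePermissions.Read | SharedAccessQueuePermissions.ProcessMessages,
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5), // allow for clock skew
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
}, null); // null = no stored access policy, i.e. an anonymous SAS
```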

A few more comments about SAS before we jump into the code :) .

Revocable SAS

You could make a SAS revocable, i.e., you could make it invalid before its expiry time. This is achieved through what is known as a “blob container access policy”. When you create a SAS using a blob container access policy, you can revoke the SAS by changing the access policy’s identifier or by deleting the access policy altogether. At the time of writing, a blob container can have a maximum of 5 access policies.
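As a sketch (this is not from the code samples later in this post, and the policy name “mypolicy” is made up), here is how you could create and later revoke such a SAS with storage client library 2.0:

```csharp
// Store an access policy on the container; the SAS will reference it by name.
BlobContainerPermissions permissions = blobContainer.GetPermissions();
permissions.SharedAccessPolicies.Add("mypolicy", new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
});
blobContainer.SetPermissions(permissions);

// The SAS carries only the policy identifier; permissions and expiry
// live in the stored policy on the server.
string sas = blobContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "mypolicy");

// To revoke the SAS before it expires, remove (or modify) the stored policy.
permissions.SharedAccessPolicies.Remove("mypolicy");
blobContainer.SetPermissions(permissions);
```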

Anonymous SAS

When a SAS is created without a blob container access policy, it’s called an anonymous SAS. Since these are not revocable, proper care must be taken in handling them. Generally speaking, you should create anonymous SAS with very short durations.

Never Expiring SAS

Though not a recommended practice, it is possible to create a SAS which never expires: you simply don’t specify an expiry date/time when creating the SAS. In earlier versions of the storage service, only revocable SAS could be never-expiring, and anonymous SAS were valid for a maximum of 1 hour. However, with the latest version of the storage service, it is possible to create a never-expiring anonymous SAS.

SAS Start Date/Time

Please note that SAS start and expiry date/times are in UTC, and that they are validated in Windows Azure. There may be a mismatch between the clock on your local computer (where the SAS is generated) and the clock in Windows Azure. To counter this, it is generally recommended that you set your SAS start time 5 – 10 minutes behind your local computer’s current time (in UTC). For example:

var sas = blobContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = permission,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5), // SAS start time is 5 minutes in the past to account for clock skew
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    });

MSDN has good documentation about SAS functionality and much more, which I recommend reading: http://msdn.microsoft.com/en-us/library/windowsazure/ee393343.aspx

The Code!

OK, enough talking! Let’s see some code. In the code examples below, I have used the latest version of the storage client library (2.0.4 at the time of writing this post). I’ve used the storage client library to create SAS URIs; however, you could also use tools like Cloud Storage Studio, which support this functionality. Then, using that SAS URI, I’ll demonstrate how to perform some common operations against blob storage. For each operation, I’ve shown how to perform it using the storage client library as well as by consuming the REST API directly using HttpWebRequest/HttpWebResponse. The common theme with the storage client library is that it allows you to create instances of CloudBlobContainer and CloudBlockBlob (I’ve focused only on block blobs in this post) using the SAS URI alone. For simplicity, I went with anonymous SAS and did not use blob container access policies when creating the SAS URIs.

Helper Functions

I created two helper functions to create a SAS on a blob container and on a blob. For the sake of simplicity, I hard-coded the start and expiry times:

/// <summary>
/// Creates a SAS URI for the blob container.
/// </summary>
/// <param name="blobContainer"></param>
/// <param name="permission"></param>
/// <returns></returns>
static string GetSaSForBlobContainer(CloudBlobContainer blobContainer, SharedAccessBlobPermissions permission)
{
    var sas = blobContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
        {
            Permissions = permission,
            SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5), // SAS start time is 5 minutes in the past to account for clock skew
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
        });
    return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blobContainer.Uri, sas);
}
/// <summary>
/// Creates a SAS URI for the blob.
/// </summary>
/// <param name="blob"></param>
/// <param name="permission"></param>
/// <returns></returns>
static string GetSaSForBlob(CloudBlockBlob blob, SharedAccessBlobPermissions permission)
{
    var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = permission,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    });
    return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas);
}

Listing Blobs

Here’s the code for listing blobs in a blob container. For listing blobs, the “List” permission must be included in the SAS.

Using Storage Client Library
/// <summary>
/// List blobs in a blob container using storage client library.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void ListBlobsWithStorageClientLibrary(string blobContainerSasUri)
{
    CloudBlobContainer blobContainer = new CloudBlobContainer(new Uri(blobContainerSasUri));
    var blobs = blobContainer.ListBlobs(null, true);
    foreach (var blob in blobs)
    {
        Console.WriteLine(blob.Uri);
    }
}
Using REST API
/// <summary>
/// List blobs in a blob container using REST API.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void ListBlobsWithRestAPI(string blobContainerSasUri)
{
    string requestUri = string.Format(CultureInfo.InvariantCulture, "{0}&comp=list", blobContainerSasUri);
    HttpWebRequest request = (HttpWebRequest) WebRequest.Create(requestUri);
    request.Method = "GET";
    using (HttpWebResponse resp = (HttpWebResponse) request.GetResponse())
    {
        using (Stream s = resp.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(s, true))
            {
                string blobsData = reader.ReadToEnd();
                Console.WriteLine(blobsData);
            }
        }
    }
}

You must append “&comp=list” to the SAS URI to list the blobs in a blob container. Also note that the output is XML.

I could take the URI (in the requestUri variable above), paste it into the browser’s address bar and see the list of blobs in the blob container. See the screenshot below for an example.

[Screenshot: XML listing of the blobs in the blob container, displayed in the browser]

Upload Blobs

Here’s the code for uploading a blob into a blob container. For uploading a blob, you could create a SAS with “Write” permission on either the blob container or the blob. The blob need not already be present in the blob container: if it is not present, it will be created; otherwise it will be overwritten.

Using Storage Client Library
/// <summary>
/// Uploads a blob in a blob container where SAS permission is defined on a blob container using storage client library.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void UploadBlobWithStorageClientLibrarySasPermissionOnBlobContainer(string blobContainerSasUri)
{
    CloudBlobContainer blobContainer = new CloudBlobContainer(new Uri(blobContainerSasUri));
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference("sample.txt");
    string sampleContent = "This is sample text.";
    using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(sampleContent)))
    {
        blob.UploadFromStream(ms);
    }
}
/// <summary>
/// Uploads a blob in a blob container where SAS permission is defined on a blob using storage client library.
/// </summary>
/// <param name="blobSasUri"></param>
static void UploadBlobWithStorageClientLibrarySasPermissionOnBlob(string blobSasUri)
{
    CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobSasUri));
    string sampleContent = "This is sample text.";
    using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(sampleContent)))
    {
        blob.UploadFromStream(ms);
    }
}
Using REST API
/// <summary>
/// Uploads a blob in a blob container where SAS permission is defined on a blob container using REST API.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void UploadBlobWithRestAPISasPermissionOnBlobContainer(string blobContainerSasUri)
{
    string blobName = "sample.txt";
    string sampleContent = "This is sample text.";
    int contentLength = Encoding.UTF8.GetByteCount(sampleContent);
    string queryString = (new Uri(blobContainerSasUri)).Query;
    string blobContainerUri = blobContainerSasUri.Substring(0, blobContainerSasUri.Length - queryString.Length);
    string requestUri = string.Format(CultureInfo.InvariantCulture, "{0}/{1}{2}", blobContainerUri, blobName, queryString);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = "PUT";
    request.Headers.Add("x-ms-blob-type", "BlockBlob");
    request.ContentLength = contentLength;
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(Encoding.UTF8.GetBytes(sampleContent), 0, contentLength);
    }
    using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
    {
         
    }
}

When working with the REST API, you must specify the “x-ms-blob-type” request header. For block blobs its value must be “BlockBlob”, and for page blobs it must be “PageBlob”.

Upload Blobs in Blocks

Here’s the code if you want to upload a blob by splitting it into blocks. Again, you could create a SAS with “Write” permission on either the blob container or the blob. The blob need not already be present in the blob container: if it is not present, it will be created; otherwise it will be overwritten.

Using Storage Client Library
/// <summary>
/// Uploads a blob by splitting into blocks in a blob container where SAS permission is defined on a blob container using storage client library.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void UploadBlobInBlocksWithStorageClientLibrarySasPermissionOnBlobContainer(string blobContainerSasUri)
{
    CloudBlobContainer blobContainer = new CloudBlobContainer(new Uri(blobContainerSasUri));
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(Guid.NewGuid().ToString() + ".txt");
    List<string> blockIds = new List<string>();
    for (int i = 0; i < 10; i++)
    {
        string sampleContent = string.Format(CultureInfo.InvariantCulture, "Line {0}: This is sample text.\r\n", i);
        string blockId = i.ToString("d4");
        blockIds.Add(blockId);
        using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(sampleContent)))
        {
            blob.PutBlock(blockId, ms, null);
        }
    }
    blob.PutBlockList(blockIds);
}
/// <summary>
/// Uploads a blob by splitting into blocks in a blob container where SAS permission is defined on a blob using storage client library.
/// </summary>
/// <param name="blobSasUri"></param>
static void UploadBlobInBlocksWithStorageClientLibrarySasPermissionOnBlob(string blobSasUri)
{
    CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobSasUri));
    List<string> blockIds = new List<string>();
    for (int i = 0; i < 10; i++)
    {
        string sampleContent = string.Format(CultureInfo.InvariantCulture, "Line {0}: This is sample text.\r\n", i);
        string blockId = i.ToString("d4");
        blockIds.Add(blockId);
        using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(sampleContent)))
        {
            blob.PutBlock(blockId, ms, null);
        }
    }
    blob.PutBlockList(blockIds);
}
Using REST API
/// <summary>
/// Uploads a blob by splitting into blocks in a blob container where SAS permission is defined on a blob container using REST API.
/// </summary>
/// <param name="blobContainerSasUri"></param>
static void UploadBlobInBlocksWithRestAPISasPermissionOnBlobContainer(string blobContainerSasUri)
{
    string blobName = Guid.NewGuid().ToString() + ".txt";
    string queryString = (new Uri(blobContainerSasUri)).Query;
    string blobContainerUri = blobContainerSasUri.Substring(0, blobContainerSasUri.Length - queryString.Length);
    List<string> blockIds = new List<string>();
    for (int i = 0; i < 10; i++)
    {
        string sampleContent = string.Format(CultureInfo.InvariantCulture, "Line {0}: This is sample text.\r\n", i);
        string blockId = i.ToString("d4");
        blockIds.Add(blockId);
        int contentLength = Encoding.UTF8.GetByteCount(sampleContent);
        string requestUri = string.Format(CultureInfo.InvariantCulture, "{0}/{1}{2}&comp=block&blockid={3}", blobContainerUri, blobName, queryString, Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId)));
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
        request.Method = "PUT";
        request.ContentLength = contentLength;
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(Encoding.UTF8.GetBytes(sampleContent), 0, contentLength);
        }
        using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
        {
 
        }
    }
    string commitBlockListRequestBodyFormat = @"<?xml version=""1.0"" encoding=""utf-8"" ?><BlockList>{0}</BlockList>";
 
    string blockListRequestBodyFormat = "<Latest>{0}</Latest>";
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < blockIds.Count; i++)
    {
        sb.AppendFormat(CultureInfo.InvariantCulture, blockListRequestBodyFormat,
                        Convert.ToBase64String(Encoding.UTF8.GetBytes(blockIds[i])));
    }
    string requestPayLoad = string.Format(CultureInfo.InvariantCulture, commitBlockListRequestBodyFormat,
                                          sb.ToString());
    int contentLengthForCommitBlockList = Encoding.UTF8.GetByteCount(requestPayLoad);
    string requestUriForCommitBlockList = string.Format(CultureInfo.InvariantCulture, "{0}/{1}{2}&comp=blockList", blobContainerUri, blobName, queryString);
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(requestUriForCommitBlockList);
    req.Method = "PUT";
    req.ContentLength = contentLengthForCommitBlockList;
    using (Stream requestStream = req.GetRequestStream())
    {
        requestStream.Write(Encoding.UTF8.GetBytes(requestPayLoad), 0, contentLengthForCommitBlockList);
    }
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    {
 
    }
}
/// <summary>
/// Uploads a blob by splitting into blocks in a blob container where SAS permission is defined on a blob using REST API.
/// </summary>
/// <param name="blobSasUri"></param>
static void UploadBlobInBlocksWithRestAPISasPermissionOnBlob(string blobSasUri)
{
    List<string> blockIds = new List<string>();
    for (int i = 0; i < 10; i++)
    {
        string sampleContent = string.Format(CultureInfo.InvariantCulture, "Line {0}: This is sample text.\r\n", i);
        string blockId = i.ToString("d4");
        blockIds.Add(blockId);
        int contentLength = Encoding.UTF8.GetByteCount(sampleContent);
        string requestUri = string.Format(CultureInfo.InvariantCulture, "{0}&comp=block&blockid={1}", blobSasUri, Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId)));
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
        request.Method = "PUT";
        request.ContentLength = contentLength;
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(Encoding.UTF8.GetBytes(sampleContent), 0, contentLength);
        }
        using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
        {
 
        }
    }
    string commitBlockListRequestBodyFormat = @"<?xml version=""1.0"" encoding=""utf-8"" ?><BlockList>{0}</BlockList>";
 
    string blockListRequestBodyFormat = "<Latest>{0}</Latest>";
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < blockIds.Count; i++)
    {
        sb.AppendFormat(CultureInfo.InvariantCulture, blockListRequestBodyFormat,
                        Convert.ToBase64String(Encoding.UTF8.GetBytes(blockIds[i])));
    }
    string requestPayLoad = string.Format(CultureInfo.InvariantCulture, commitBlockListRequestBodyFormat,
                                          sb.ToString());
    int contentLengthForCommitBlockList = Encoding.UTF8.GetByteCount(requestPayLoad);
    string requestUriForCommitBlockList = string.Format(CultureInfo.InvariantCulture, "{0}&comp=blockList", blobSasUri);
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(requestUriForCommitBlockList);
    req.Method = "PUT";
    req.ContentLength = contentLengthForCommitBlockList;
    using (Stream requestStream = req.GetRequestStream())
    {
        requestStream.Write(Encoding.UTF8.GetBytes(requestPayLoad), 0, contentLengthForCommitBlockList);
    }
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    {
 
    }
}

When working with the REST API, for putting blocks you must specify “comp=block” and a Base64-encoded block id in the request URI. For putting the block list, you must specify “comp=blockList” in the request URI.

Download Blobs

Here’s the code for downloading a blob from a blob container. For downloading a blob, the “Read” permission must be included in the SAS.

Using Storage Client Library
/// <summary>
/// Downloads a blob from a blob container using storage client library.
/// </summary>
/// <param name="blobSasUri"></param>
static void DownloadBlobWithStorageClientLibrary(string blobSasUri)
{
    CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobSasUri));
    using (MemoryStream ms = new MemoryStream())
    {
        blob.DownloadToStream(ms);
        byte[] data = new byte[ms.Length];
        ms.Position = 0;
        ms.Read(data, 0, data.Length);
        string blobContents = Encoding.UTF8.GetString(data);
        Console.WriteLine(blobContents);
    }
}
Using REST API
/// <summary>
/// Downloads a blob from a blob container using REST API.
/// </summary>
/// <param name="blobSasUri"></param>
static void DownloadBlobWithRestAPI(string blobSasUri)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(blobSasUri);
    request.Method = "GET";
    using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
    {
        using (Stream s = resp.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(s, true))
            {
                string blobData = reader.ReadToEnd();
                Console.WriteLine(blobData);
            }
        }
    }
}

Deleting Blobs

Here’s the code for deleting a blob from a blob container. For the delete operation, the “Delete” permission must be included in the SAS.

Using Storage Client Library
/// <summary>
/// Deletes a blob from a blob container using storage client library.
/// </summary>
/// <param name="blobSasUri"></param>
static void DeleteBlobWithStorageClientLibrary(string blobSasUri)
{
    CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobSasUri));
    blob.Delete();
}
Using REST API
/// <summary>
/// Deletes a blob from a blob container using REST API.
/// </summary>
/// <param name="blobSasUri"></param>
private static void DeleteBlobWithRestAPI(string blobSasUri)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(blobSasUri);
    request.Method = "DELETE";
    using (HttpWebResponse resp = (HttpWebResponse) request.GetResponse())
    {
         
    }
}

Summary

That’s it for this post. I could have included more samples, but these should give you an idea of how to use SAS to perform various operations. Maybe in another post I will show some SAS functionality related to tables and queues. I hope you’ve found this post useful. If you find any issues with it, please let me know and I’ll try to fix them ASAP. Feel free to leave your comments below.

Happy Coding!!!




Published at DZone with permission of Gaurav Mantri, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)