There
are times when the code you’re working with will need to read from and
write to the local filesystem. Windows Azure allows you to request
and access a piece of the local disk on your role instance.
You can create this space by
using the configuration of your role. You won’t have control over the
path of the directory you’re given access to, so you should make sure
that the file path your code needs to access is part of your
configuration. A hardcoded path will never remain accurate in the cloud
environment.
We recommend that you only use
local storage when you absolutely have to, because of some limitations
we’ll cover later in this section. You’ll most likely need local
storage when you’re migrating existing frameworks
or applications that require local disk access to the cloud.
1. Setting up local storage
You can configure the
local storage area you need by adding a few simple
lines of configuration to your role. The tag we’re going to work with is
the LocalStorage tag. It tells the Fabric Controller to allocate local file storage space on each server your role instances run on.
In the configuration
element, you need to name the storage space. This name will become the
name of the folder that’s reserved for you. You’ll also need to define how
much filesystem space you want. The current limit is 20 GB per role
instance, with a minimum of 1 MB.
<LocalResources>
  <LocalStorage name="FilesUploaded" cleanOnRoleRecycle="false" sizeInMB="15" />
  <LocalStorage name="VirusScanPending" cleanOnRoleRecycle="true" sizeInMB="5" />
</LocalResources>
You can declare multiple local
storage resources, as shown in the preceding code snippet. It’s
important to use local file storage only for temporary,
unimportant files. The local file store isn’t replicated or preserved in
any way. If the instance fails and
is moved by the Fabric Controller to a new server, the local file
store isn’t carried over, and any files that were present will be
lost.
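In case you’re wondering where these lines live, the LocalResources element sits inside your role’s definition in the ServiceDefinition.csdef file. Here’s a rough sketch of that context (the role name is only a placeholder, and the other child elements are elided):
<WebRole name="LocalStorage_WebRole">
  <!-- ...sites, endpoints, configuration settings, and so on... -->
  <LocalResources>
    <!-- the LocalStorage elements shown earlier go here -->
  </LocalResources>
</WebRole>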
Tip
There is one time when the
local file storage won’t be lost, and that’s when the role is recycled,
either as part of a service management event on your part, or when the
Fabric Controller is responding to a minor issue with your server. In
these cases, if you’ve set the cleanOnRoleRecycle attribute to false, the current files will still be there when your instance comes back online.
Instances may only access
their own local storage. An instance may not access another instance’s
storage. You should use Azure BLOB storage if you need more than one
instance to access the same storage area.
Now that you’ve defined your local storage, let’s look at how you can access it and work with it.
2. Working with local storage
Working with files in local
storage is just like working with normal files. When your role instance
is started, the agent creates a folder with the name you defined in the
configuration in a special area on the C: drive on your server. Rules
are put in place to make sure the folder doesn’t exceed its assigned
size quota. To start using it, you simply need to get a handle to
it.
To get a handle to your local storage area, you need to use the GetLocalResource method. You’ll need to provide the name of the local resource you defined in the service definition file. This will return a LocalResource object:
public static LocalResource uploadFolder = RoleEnvironment.GetLocalResource("FilesUploaded");
After you have this reference
to the local folder, you can start using it like a normal directory. To
get the physical path, so you can check the directory contents or write
files to it, you would use the uploadFolder reference from the preceding code.
string rootPathName = uploadFolder.RootPath;
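As a quick illustration (the file name and contents here are made up, and you’ll need a using directive for System.IO), you can treat that path like any other directory:
// Hypothetical scratch file under the local storage root; remember this folder is not durable
string scratchFile = Path.Combine(rootPathName, "scratch.txt");
File.WriteAllText(scratchFile, "temporary data only");
string contents = File.ReadAllText(scratchFile);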
In the sample code provided
with this book, there’s a simple web role that uses local storage to
store uploaded files. Please remember that this is just a sample, and
that you wouldn’t normally persist important files to the local store,
considering its transient nature. You can view the code we used to do
this in listing 1. When we read the RootPath property in the local development fabric, Brian’s storage is located here:
C:\Users\brprince\AppData\Local\dftmp\s0\deployment(32)\res\deployment(32).AiA_15___Local_Storage_post_pdc.LocalStorage_WebRole.0\directory\FilesUploaded\
When we publish this little application to the cloud, it returns the following path:
C:\Resources\directory\0c28d4f68a444ea380288bf8160006ae.LocalStorage_WebRole.FilesUploaded\
Listing 1. Working with local file storage
Now
that we know where the files will be stored, we can start working with
them. In the sample application, we have a simple file-upload control. When the web page is loaded, we write out the path to the local storage folder we’ve been assigned. Once a file is uploaded, we store it in local storage and write out its filename and path.
We then write the file back out to the browser using normal file APIs.
Our example code was designed to work only with text files, to
keep things simple.
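Since the full listing ships with the book’s sample code, here’s only a minimal sketch of the approach. The page class, control, and label names (fileUpload, btnUpload, lblPath, lblSavedFile) are placeholders of ours rather than the sample’s exact names, and the code assumes references to System.IO and Microsoft.WindowsAzure.ServiceRuntime:
// Sketch of a web form code-behind for the upload scenario described above
public partial class _Default : System.Web.UI.Page
{
    // Handle to the "FilesUploaded" local resource from the service definition
    public static LocalResource uploadFolder =
        RoleEnvironment.GetLocalResource("FilesUploaded");

    protected void Page_Load(object sender, EventArgs e)
    {
        // Show where the local storage folder lives on this instance
        lblPath.Text = uploadFolder.RootPath;
    }

    protected void btnUpload_Click(object sender, EventArgs e)
    {
        if (!fileUpload.HasFile) return;

        // Save the uploaded (text) file into the local storage folder
        string savedPath = Path.Combine(uploadFolder.RootPath, fileUpload.FileName);
        fileUpload.SaveAs(savedPath);
        lblSavedFile.Text = savedPath;

        // Write the file back out to the browser using normal file APIs
        Response.ContentType = "text/plain";
        Response.WriteFile(savedPath);
        Response.End();
    }
}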
The local storage option is
great for volatile local file access, but it isn’t durable and may
disappear on you. If you need durable storage, look at Azure storage or
SQL Azure. If you need shared storage that’s super-fast, you should
consider the Windows Server AppFabric distributed cache. This is a
peer-to-peer caching layer that can run on your roles and provide a
shared in-memory cache for your instances to work with.