Azure and multi website packages created with cspack.exe

by ingvar 19 April 2011 07:49

Introduction

I wanted to create Azure deploy packages through the command line, or rather, I wanted to create them outside Visual Studio. I started out with two websites in the same role with different binding endpoints, and the setup worked when I deployed from VS. In my setup, the WebRole copied files from the blob to each website. So the initial websites were empty, and the WebRole populated them in the OnStart method.

I discovered one very important thing when creating packages with cspack.exe, and I want to share it with other developers, because it took me a great deal of time to find this small but very important detail: the /rolePropertiesFile argument.

Result

First let me show what I found out was needed to create a working package:

Here is the command:

cspack.exe WindowsAzureProject\ServiceDefinition.csdef 
           /role:MyWebRole;MyWebRole  
           /rolePropertiesFile:MyWebRole;MyWebRole\RoleProperties.txt

 

Here WindowsAzureProject is the folder containing the Azure project and MyWebRole is the folder containing the web role added to the Azure project.

And here is the content of the file RoleProperties.txt

EntryPoint=WebRole.dll
TargetFrameWorkVersion=v4.0

Process

Here are the things I went through to get to the working result above.

I started out with the command that can be found in many places through some simple searching:

cspack.exe WindowsAzureProject\ServiceDefinition.csdef 
           /role:MyWebRole;MyWebRole  

This resulted in this IIS error message: Parser Error Message: Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.

Then I did some more, well a lot more, searching and found this blog post. I added the argument /RolePropertiesFile:RoleProperties.txt to the command and tried it out. But no luck, it did not work. The resulting site just gave me a new error: 403, as if no files existed on the website. The strange part was that the WebRole did not seem to run. In WebRole.OnStart I had added some remote logging. When I deployed the command-line package there was no logging, but if I deployed through VS the logging did happen. So the package was deployed, but the WebRole either did not run or threw an exception very early. Almost all the code was inside a try-catch block and any exception was logged remotely, but there were no log lines. So I was pretty sure it did not run, which also turned out to be the case after some more digging.

Then I tried adding the /copyOnly argument to the command. This generated a folder containing the files that the emulator can use when running the WebRole. I started comparing the files in this folder against the folder generated by VS and found a very interesting file: RoleModel.xml. Both versions contained a Properties element with a series of Property elements. The command-line version of the file only had one Property element, namely the TargetFrameWorkVersion that I had added in the RoleProperties.txt file. The VS version of RoleModel.xml contained a lot more Property elements. So I added all of them to the RoleProperties.txt file, ran the command, redeployed, and it worked!
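
To give an idea of what I was comparing: after adding the two properties, the relevant part of the command-line RoleModel.xml looks roughly like this (a sketch from memory, the exact attribute layout is my assumption and not copied from an actual build output):

<RoleModel name="MyWebRole"
           xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <Properties>
    <Property name="EntryPoint" value="WebRole.dll" />
    <Property name="TargetFrameWorkVersion" value="v4.0" />
    <!-- the VS-built version contains a lot more Property elements here -->
  </Properties>
  <!-- Sites, Endpoints etc. -->
</RoleModel>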

Then I started removing properties one by one, redeploying in between, until I found the single extra property that was needed for the command-line generated package to work (yup, it took a long time to do this). And here is the resulting RoleProperties.txt file:

EntryPoint=WebRole.dll
TargetFrameWorkVersion=v4.0

I retried the whole setup, but this time with only one website in the web role. And for some strange reason, unknown to me, it worked without the extra property in the RoleProperties.txt file.

So the lesson here is that using the /copyOnly argument and comparing the directory generated by cspack.exe against the one generated by VS can be very helpful.
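
For reference, the /copyOnly variant is just the working command from above with the extra argument appended:

cspack.exe WindowsAzureProject\ServiceDefinition.csdef 
           /role:MyWebRole;MyWebRole  
           /rolePropertiesFile:MyWebRole;MyWebRole\RoleProperties.txt
           /copyOnly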

Tags:

Azure | C# | C1

Microsoft.WindowsAzure.* assemblies not in GAC on Azure sites

by ingvar 16 April 2011 16:34

I discovered something odd with Azure deployments that 99% of developers will probably never encounter. To put it short, I discovered that if the website you deploy to Azure does not have a reference to the assembly Microsoft.WindowsAzure.StorageClient.dll, the assembly cannot be used by the website at all. In other words, this assembly is not in the GAC. Many will probably find this scenario strange, but I encountered the issue when I was working with the CMS Composite C1. C1 is deployed to Azure by the WebRole.OnStart method, which downloads a zip file and unzips it to the website. Why? Well, there are many reasons, but one of them is that this allows developers to deploy existing C1 sites to Azure with the same Azure deployment package (no need to rebuild the package). They just zip the site and point to it from the ServiceConfiguration.cscfg file. 

I wanted to share this, so I have found an easy way to reproduce it. Here are the steps (a zipped version can be downloaded here):

  1. Create a new Windows Azure project.
  2. Add an ASP.NET WebRole (clean the site so only WebRole.cs and web.config are left).
  3. Add an empty ASP.NET website.
  4. Add physicalDirectory="../Website" to the Site element in the ServiceDefinition.csdef file (see the sketch after this list).
  5. Add a Default.aspx to the website (not the WebRole) with the code below.
  6. Deploy to Azure.
  7. Note that the CloudBlob type is not found!
  8. Add a Default.aspx with the same code to the WebRole project.
  9. Change the physicalDirectory to "../WebRole" in the ServiceDefinition.csdef file.
  10. Deploy to Azure.
  11. Now the type can be found! (The WebRole has a reference to the needed assembly.)
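
For step 4, the WebRole part of ServiceDefinition.csdef ends up looking roughly like this (a sketch; the site and endpoint names are the Visual Studio defaults and are my assumption here):

<WebRole name="WebRole">
  <Sites>
    <Site name="Web" physicalDirectory="../Website">
      <Bindings>
        <Binding name="Endpoint1" endpointName="Endpoint1" />
      </Bindings>
    </Site>
  </Sites>
  <Endpoints>
    <InputEndpoint name="Endpoint1" protocol="http" port="80" />
  </Endpoints>
</WebRole>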

Here is the code I added to the Default.aspx.cs files:

protected void Page_Load(object sender, EventArgs e)
{
    Type cloudBlobType = Type.GetType("Microsoft.WindowsAzure.StorageClient.CloudBlob, " +
      "Microsoft.WindowsAzure.StorageClient");
    if (cloudBlobType != null)
    {
        PlaceHolder.Controls.Add(new LiteralControl("Type found! <br />"));
    }
    else
    {
        PlaceHolder.Controls.Add(new LiteralControl("Type NOT found! <br />"));
    }
}

I used Type.GetType here only for the reproduction. The same thing can happen, as it did with C1, if an assembly that references Microsoft.WindowsAzure.StorageClient.dll is added to the website from WebRole.OnStart and is then used in code. In that case the site is totally dead.

The fix is easy: just add the needed references to the website. The reason the WebRole in this reproduction works is that it has a reference to the needed assembly, which results in the assembly being copied to the bin folder.



Tags:

.NET | Azure | C#

Paths for each site on an Azure deployment

by ingvar 15 April 2011 22:49

If you need to do any file-related work in your WebRole on the deployed files, or if you want to add files from the WebRole, this post presents one way to get the physical paths and names of each website on an Azure deployment. It works for both single and multiple site setups.

First we need to get the role model, which Azure creates from ServiceDefinition.csdef, so some aspects of this role model file will be recognizable. The root directory of the role can be found in the environment variables, like this:

string roleRootPath = Environment.GetEnvironmentVariable("RdRoleRoot");

The file name of the role model is 'RoleModel.xml', so the full path of the role model can be constructed like this:

string roleModelPath = Path.Combine(roleRootPath, "RoleModel.xml");

We need one more thing before we start parsing the RoleModel.xml file: the current application directory. We need this because the paths in RoleModel.xml are relative, so to get the full path we have to combine them with the application directory. We can get it like this:

string currentAppRootPath = Path.GetDirectoryName(AppDomain.CurrentDomain.BaseDirectory);

The Sites construct in this file is very similar to the one in ServiceDefinition.csdef, so finding the website names and physical paths can be done like this:

XNamespace roleModelNS = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition";
XDocument roleModel = XDocument.Load(roleModelPath);
var siteElements = roleModel.Root.Element(roleModelNS + "Sites").
                                   Elements(roleModelNS + "Site");

var results = 
    from siteElement in siteElements
    where siteElement.Attribute("name") != null &&
          siteElement.Attribute("physicalDirectory") != null
    select new {
        Path = Path.Combine(currentAppRootPath, siteElement.Attribute("physicalDirectory").Value),
        Name = siteElement.Attribute("name").Value };
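
A small usage sketch of the query above (my addition): iterating the sites and writing out their names and full physical paths.

foreach (var site in results)
{
    /* Each entry holds the website name and its full physical path on the instance */
    Trace.WriteLine(string.Format("Site '{0}' is deployed at '{1}'", site.Name, site.Path));
}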

Tags:

.NET | Azure | C#

Azure and the REST API

by ingvar 13 April 2011 21:09

Introduction

In this post I'll write about the interesting things I discovered when using the Azure REST API, and I'll also post the code needed to create a hosted service, create a new deployment, and upgrade an existing deployment.

The three most important findings for me were how to get meaningful error messages when a web request fails, the right way to use Uri for some REST operations, and the right way to encode service configuration files in the body of the request.

Documentation for the Azure REST API can be found here: http://msdn.microsoft.com/en-us/library/ee460799.aspx

Certificate

You need a certificate to identify your REST calls to Azure.

You can create a new certificate by issuing this command:

makecert -r -pe -a sha1 -n CN=AzureMgmt -ss My "AzureMgmt.cer"

The file 'AzureMgmt.cer' created by this command should be added to your Azure subscription. You can do this through the Azure Management Portal.

Generic setup

Here is a generic setup for issuing REST operations. 

Edit (2011-04-13): Setting the request method to "POST" should only be done if there is a body to send, thanks to Christian Horsdal for pointing this out.

string requestUrl = "";
string certificatePath = "";
string headerVersion = "";
string requestBody = "";

HttpWebRequest httpWebRequest = (HttpWebRequest)HttpWebRequest.Create(new Uri(requestUrl, true));
httpWebRequest.ClientCertificates.Add(new X509Certificate2(certificatePath));
httpWebRequest.Headers.Add("x-ms-version", headerVersion);
httpWebRequest.ContentType = "application/xml";

if (!string.IsNullOrEmpty(requestBody))
{
   httpWebRequest.Method = "POST";
   byte[] requestBytes = Encoding.UTF8.GetBytes(requestBody);
   httpWebRequest.ContentLength = requestBytes.Length;

   using (Stream stream = httpWebRequest.GetRequestStream())
   {
      stream.Write(requestBytes, 0, requestBytes.Length);
   }
}

try
{
   using (HttpWebResponse httpWebResponse = (HttpWebResponse)httpWebRequest.GetResponse())
   {
      Console.WriteLine("Response status code: " + httpWebResponse.StatusCode);       WriteRespone(httpWebResponse);
   }
}
catch (WebException ex)
{
   Console.WriteLine("Response status code: " + ex.Status);
   WriteResponse(ex.Response);
}

Here is the code for the WriteResponse method:

static void WriteResponse(WebResponse webResponse)
{
   using (Stream responseStream = webResponse.GetResponseStream())
   {
      if (responseStream == null) return;

      using (StreamReader reader = new StreamReader(responseStream))
      {
         Console.WriteLine("Response output:");
         Console.WriteLine(reader.ReadToEnd());
      }
   }
}

The three variables requestUrl, headerVersion and requestBody are the only things you need to change to do any REST operation.

The certificatePath variable is the same for all operations; it just needs to be the path to your certificate file.

The requestUrl variable is constructed to match the specific REST operation. See the documentation for the different REST operations here [http://msdn.microsoft.com/en-us/library/ee460799.aspx].

The headerVersion varies from operation to operation, though some operations share the same version. You can find the correct version in the documentation.

The requestBody is a small XML document. Not all operations need a body, and this is handled by the generic code above. See the documentation [http://msdn.microsoft.com/en-us/library/ee460799.aspx] for whether the body is needed and, if it is, how it is constructed.

 

Important findings

Here is a short list of the problems I ran into when doing REST operations and the solutions to them:

  • WebException.Response.GetResponseStream() is your friend! When an operation fails with an exception and error code 400, reading the WebException's response stream can help you find the reason why.
  • The Uri constructor should have dontEscape=false for the request URL. If this is not the case, REST operations like Upgrade Deployment fail.
  • The ordering of the elements in the body matters. It is XML, so that is not so surprising.
  • If an Azure deployment package is needed, it has to be located in blob storage. This is also covered in the documentation.
  • If an Azure service configuration is needed, it should be read as a string and base64 encoded as I did in the code below. My first attempt read the file as bytes (File.ReadAllBytes), but that made the operation fail.

 

Create Hosted Service Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices";
string headerVersion = "2010-10-28";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<CreateHostedService xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<ServiceName>myservicename</ServiceName>" +
      "<Label>" + Convert.ToBase64String(Encoding.UTF8.GetBytes("myservicename")) + "</Label>" +
      "<Location>North Central US</Location>" +
   "</CreateHostedService>";

Create Deployment Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices/myservicename/deploymentslots/staging";
string headerVersion = "2009-10-01";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<CreateDeployment xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<Name>mydeployment</Name>" +
      "<PackageUrl>http://myblob.blob.core.windows.net/MyPackage.cspkg</PackageUrl>" +
      "<Label>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes("mydeployment")) +
      "</Label>" +
      "<Configuration>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes(
         File.ReadAllText(@"C:\MyServiceConfiguration.cscfg"))) +
      "</Configuration>" +
      "<StartDeployment>true</StartDeployment>" +
   "</CreateDeployment>";

Upgrade Deployment Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices/myservicename/deploymentslots/staging/?comp=upgrade";
string headerVersion = "2009-10-01";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<UpgradeDeployment xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<PackageUrl>http://myblob.blob.core.windows.net/MyPackage.cspkg</PackageUrl>" +
      "<Configuration>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes(
            File.ReadAllText(@"C:\MyServiceConfiguration.cscfg"))) +
      "</Configuration>" + "<Mode>auto</Mode>" +
      "<Label>"
      Convert.ToBase64String(Encoding.UTF8.GetBytes("mydeployment")) + 
      "</Label>" +
   "</UpgradeDeployment>";

Tags:

.NET | Azure | C#

Azure and blob write performance

by ingvar 6 January 2011 21:07

Edit - 22nd August

Please read the comment below by Joe Giardino regarding the relatively poor performance of the OpenWrite method. He has a very good explanation!

Introduction

During my work at Composite on C1, I found out that some ways of adding/uploading data to Azure blob storage are faster than others. So I did some benchmarking of ways to add data to a block blob. I will start by listing the results, and below you can see the code I used to do the testing. In all tests the loopCount was 50; see the code below for more information on the loopCount. The numbers in the tables are the average number of milliseconds over these 50 loops.

Results

Azure test results

As expected, the blob is a little slower than writing to the local disk. But what surprised me is that the OpenWrite method is much, much slower than the other methods for adding data to the blob. Unfortunately I started out using the OpenWrite method and used it a lot. This really slowed down my solution. It got so slow that I started getting ThreadAbortExceptions and timeouts.

Size in KB | Local disk | UploadByteArray | UploadFile | UploadFromStream | OpenWrite
50         | 0          | 31              | 64         | 33               | 222
100        | 0          | 41              | 33         | 37               | 240
150        | 0          | 42              | 40         | 43               | 235
200        | 0          | 50              | 43         | 44               | 227
250        | 0          | 56              | 52         | 44               | 227
300        | 0          | 55              | 53         | 51               | 242
350        | 57         | 55              | 79         | 53               | 246
400        | 57         | 60              | 90         | 60               | 263
450        | 55         | 72              | 95         | 61               | 258
500        | 50         | 76              | 73         | 68               | 278

 

Local dev fabric test results

I included this because I made everything work locally first and then deployed to Azure. The numbers follow the same pattern as the Azure run.

Size in KB | Local disk | UploadByteArray | UploadFile | UploadFromStream | OpenWrite
50         | 0          | 89              | 78         | 80               | 338
100        | 0          | 90              | 86         | 89               | 398
150        | 0          | 128             | 124        | 129              | 411
200        | 0          | 138             | 137        | 136              | 426
250        | 0          | 144             | 151        | 148              | 436
300        | 0          | 179             | 178        | 185              | 521
350        | 19         | 197             | 193        | 192              | 561
400        | 16         | 234             | 230        | 229              | 550
450        | 14         | 245             | 249        | 245              | 541
500        | 14         | 248             | 247        | 249              | 555

 

The test code

Base code for all tests

Just finding the path of a local test file and creating the buffer to write to local disk or block blob.

int loopCount = 50;               /* Number of iterations per test */
int testBufferSize = 50 * 1024;   /* Buffer size; varied from 50 KB to 500 KB in the tests */

string localPath = Path.Combine(
    HttpContext.Current.Request.PhysicalApplicationPath, "BlobTestFile.dat");

if (File.Exists(localPath)) File.Delete(localPath);

/* Setting up the buffer */
byte[] testBuffer = new byte[testBufferSize];
for (int i = 0; i < testBufferSize; i++)
{
    testBuffer[i] = (byte)(i % 256);
}

Base code for all blob tests

CloudStorageAccount account =
   CloudStorageAccount.FromConfigurationSetting("BlobConnectionString");

CloudBlobClient client =
   account.CreateCloudBlobClient();

CloudBlobContainer container =
   client.GetContainerReference("mycontainer"); /* Remember lower casing only */

container.CreateIfNotExist();

Local disk writes test code

Simply using the System.IO.FileStream class for writing the buffer to disk.

int diskWriteTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    using (FileStream fileStream = new FileStream(localPath, FileMode.Create))
    {
        fileStream.Write(testBuffer, 0, testBufferSize);
    }
}
int diskWriteTime2 = Environment.TickCount;

UploadByteArray test code

Uploading the testBuffer using the UploadByteArray method.

int blobUploadByteArrayTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");
    blob.UploadByteArray(testBuffer);
}
int blobUploadByteArrayTime2 = Environment.TickCount;

UploadFile test code

Uploading the file written by the local disk write test. The local file has the same size as the testBuffer.


int blobUploadFileTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");
    /* Reusing the local file, written in the local test*/
    blob.UploadFile(localPath);
}
int blobUploadFileTime2 = Environment.TickCount;

UploadFromStream test code

Uploading the file written by the local disk write test using a FileStream for reading the file. The local file has the same size as the testBuffer.


int blobUploadFromStreamTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");

    using (FileStream fileStream = new FileStream(localPath, FileMode.Open))
    {
        /* Reusing the local file, written in the local test*/
        blob.UploadFromStream(fileStream);
    }
}
int blobUploadFromStreamTime2 = Environment.TickCount;

OpenWrite test code

Uploading the testBuffer using the OpenWrite method.

int blobOpenWriteTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");

    using (Stream stream = blob.OpenWrite())
    {
        stream.Write(testBuffer, 0, testBufferSize);
    }
}
int blobOpenWriteTime2 = Environment.TickCount;
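
The tables report averages; here is a minimal sketch (my assumption, not from the original test harness) of turning the tick counts above into the average milliseconds per iteration:

int openWriteAverageMs = (blobOpenWriteTime2 - blobOpenWriteTime1) / loopCount;
/* The same calculation applies to the disk, UploadByteArray, UploadFile and
   UploadFromStream timings. */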

Tags:

.NET | Azure | Blob | C#

Azure and edit or delete deployed files from your website

by ingvar 21 December 2010 09:32

If you change the file permissions on deployed files in the WebRole, you will be able to edit or delete those files from your website. This includes the web.config file!
Also, if you write new files from the WebRole, you need to change the file permissions for those files as well.

My colleague at Composite, Marcus Wendt, found the code for changing the file permissions here. I'll restate the code below. The code alone is not enough; you also need to edit the ServiceDefinition.csdef file and add '<Runtime executionContext="elevated" />' to give your WebRole the rights to change file permissions:


<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="AzureTest"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole">
    
    <Runtime executionContext="elevated" />

  </WebRole>
</ServiceDefinition>

And here is the code for changing the permission for a given file:

void ChangePermission(string filePath)
{
    SecurityIdentifier sid = new SecurityIdentifier(WellKnownSidType.WorldSid, null);

    IdentityReference act = sid.Translate(typeof(NTAccount));

    FileSecurity sec = File.GetAccessControl(filePath);
    sec.AddAccessRule(new FileSystemAccessRule(act, FileSystemRights.FullControl,
                                               AccessControlType.Allow));

    File.SetAccessControl(filePath, sec);
}
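
A hypothetical usage sketch from the WebRole: the path below is an assumption and has to be resolved to the actual location of the deployed file (the 'Paths for each site on an Azure deployment' post above shows one way to find site paths).

public override bool OnStart()
{
    /* Hypothetical example: allow the website itself to edit its own web.config.
       The hardcoded path is an assumption, not something to rely on. */
    string webConfigPath = @"E:\sitesroot\0\web.config";
    ChangePermission(webConfigPath);

    return base.OnStart();
}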

Tags:

.NET | Azure | C#

Azure and Response is not available in this context

by ingvar 17 December 2010 10:29

Application_Start and 'Response is not available in this context'

I have experienced the HttpException 'Response is not available in this context' a lot when working with Azure. It happens if you try to do Azure stuff, like accessing blob storage, in the Application_Start method.

After some time I found out that it happened because HttpContext.Current.Response was being accessed while it was not available. So I thought: why does Azure need access to the Response object? I knew that it was possible to access blob storage from a console app. I'm not sure about this, but I think Azure uses the Response object for logging. In any case, if you need to access the blob in your web app, or do any other Azure stuff, and you want to do this even though there is no Response object, like in the Application_Start method, you simply set HttpContext.Current to null, but in a smart way!

Here is my code for Application_Start, and below it is the code for the smart way of setting HttpContext.Current to null:

protected void Application_Start(object sender, EventArgs e)
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    });

    using (new AzureContext())
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.FromConfigurationSetting("BlobConnectionString");

        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
        container.CreateIfNotExist();
        container.FetchAttributes();
    }
}

 

And the code for AzureContext:

public class AzureContext : IDisposable
{
    HttpContext _oldHttpContext;
    bool _restoreOldHttpContext = false;


    public AzureContext(bool forceSettingContextToNull = false)
    {
        if (forceSettingContextToNull)
        {
            _oldHttpContext = HttpContext.Current;
            HttpContext.Current = null;
            _restoreOldHttpContext = true;
        }
        else
        {
            try
            {
                HttpResponse response = HttpContext.Current.Response;
            }
            catch (HttpException)
            {
                _oldHttpContext = HttpContext.Current;
                HttpContext.Current = null;
                _restoreOldHttpContext = true;
            }
        }
    }


    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            if (_restoreOldHttpContext)
            {
                HttpContext.Current = _oldHttpContext;
            }
        }
    }


    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }


    ~AzureContext()
    {
        Dispose(false);
    }
}
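
The forceSettingContextToNull parameter is not used in the Application_Start example above; here is a small sketch (my assumption about its intended use) of forcing the context to null even when a Response object happens to be available:

using (new AzureContext(forceSettingContextToNull: true))
{
    /* Azure storage calls that must not see the current HttpContext go here */
}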

Tags:

.NET | Azure | C#

Azure Blob AppendAllText

by ingvar 14 December 2010 10:04

There is a very limited set of ways to read from and write to a blob on Windows Azure. Basically you are stuck with OpenRead and OpenWrite, which return a read stream and a write stream respectively. So if you want to append some text to a blob, you have to do some coding yourself. Here is an example that reads and rewrites the same blob as streams. I use an Exists method to determine whether a blob exists or not; the implementation of that method can be found here.

CloudBlobContainer container; /* Initialize this */
CloudBlob blob = container.GetBlobReference("myblob.txt");
string contents; /* content to append */

if (blob.Exists())
{
    using (Stream blobStream = blob.OpenRead())
    {
        byte[] buffer = new byte[4096];
        using (Stream tempBlobStream = blob.OpenWrite())
        {
            int read;
            while ((read = blobStream.Read(buffer, 0, 4096)) > 0)
            {
                tempBlobStream.Write(buffer, 0, read);
            }

            using (StreamWriter writer = new StreamWriter(tempBlobStream))
            {
                writer.Write(contents);
            }
        }                       
    }
}
else
{
    using (Stream blobStream = blob.OpenWrite())
    {
        using (StreamWriter writer = new StreamWriter(blobStream))
        {
            writer.Write(contents);
        }
    }
}

If you prefer, you can do it in an in-memory version like this:

CloudBlobContainer container; /* Initialize this */
CloudBlob blob = container.GetBlobReference("myblob.txt");
string contents; /* content to append */

string oldContent;
if (!blob.Exists())
{
    oldContent = "";
}
else
{
    using (StreamReader reader = new StreamReader(blob.OpenRead()))
    {
        oldContent = reader.ReadToEnd();
    }
}

using (StreamWriter writer = new StreamWriter(blob.OpenWrite()))
{
    writer.Write(oldContent);
    writer.Write(contents);
}
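
Wrapped up as an extension method, this becomes easy to reuse. The following is a sketch of my own (not from the original post), using the in-memory version above together with the Exists method from the post below:

public static class CloudBlobExtensions
{
    /* Appends text to a block blob by rewriting its full content */
    public static void AppendAllText(this CloudBlob blob, string contents)
    {
        string oldContent = "";
        if (blob.Exists())
        {
            using (StreamReader reader = new StreamReader(blob.OpenRead()))
            {
                oldContent = reader.ReadToEnd();
            }
        }

        using (StreamWriter writer = new StreamWriter(blob.OpenWrite()))
        {
            writer.Write(oldContent);
            writer.Write(contents);
        }
    }
}

Usage is then a one-liner: blob.AppendAllText("some text");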

Tags:

.NET | Azure | C# | Blob

Azure Blob Exists

by ingvar 14 December 2010 08:40

Here is a really simple way of determining whether a blob exists or not. I have added the 'DebuggerStepThrough' attribute so my debugger won't break every time a blob does not exist.

public static class CloudBlobUtils
{
    [DebuggerStepThrough]
    public static bool Exists(this CloudBlob blob)
    {
        try
        {
            blob.FetchAttributes();
            return true;
        }
        catch (StorageClientException ex)
        {
            if (ex.ErrorCode == StorageErrorCode.ResourceNotFound)
            {
                return false;
            }

            throw;
        }
    }
}

Tags:

.NET | Azure | C# | Blob

Moving C1 to Azure (File I/O)

by ingvar 17 November 2010 12:33

This blog post focuses on file I/O when moving a complex solution to the Azure platform.

Composite C1 is a state of the art ASP.NET 4.0 CMS and is currently not 100% Azure ready. There are a lot of things to consider when moving a really complex product like Composite C1 to the Azure platform. For a full description of all the problems we have moving C1 to Azure, and all the people that attended a three day workshop on it, see this very descriptive blog post by Marcus Wendt.

Brainstorm and one-to-one implementation

First we sat down and listed all the classes we knew did file I/O. Some of the classes that we found were StreamReader, StreamWriter, TextReader and XmlReader. We also found some methods: XElement.Load, XElement.Save, XDocument.Load and XDocument.Save.

After we completed this list, we created one-to-one implementations of the found classes. These classes had all the same methods as the original classes and contained one private field of the original class type. The methods mapped one-to-one to the original class. Here is a simple example:


[Serializable]
public class C1StreamWriter : TextWriter, IDisposable
{
    private StreamWriter streamWriter;

    public C1StreamWriter(string path)
    {
        streamWriter = new StreamWriter(path);
    }

    /* TextWriter's abstract Encoding property also has to be forwarded */
    public override Encoding Encoding
    {
        get { return streamWriter.Encoding; }
    }

    public override void Write(char value)
    {
        streamWriter.Write(value);
    }

    /* And all the other constructors and methods */
}

Then we did a search'n'replace on the whole solution to insert our classes instead of the originals. We also created extension methods to replace the found methods. The next task was to see if we had found ALL the classes/methods that did file I/O, which proved to be a rather hard task.

IntelliTrace

To see if we had found all the classes, we used the new feature in Visual Studio 2010, IntelliTrace. We enabled IntelliTrace with only the File events marked and then started C1 in debug mode. This quickly showed that we had missed several classes and methods. One-to-one implementations of these newly found classes were made and IntelliTrace was fired up again.

After some time this became rather time consuming and error prone, looking through several thousand stack traces of file I/O events. So I started looking at the IntelliTrace API and developed a tool that we could use to filter the events in an intelligent way. See my blog post on how to get started with the IntelliTrace API.

FxCop

After doing all this work on finding all the file I/O, it would be nice to have a way of ensuring that no C1 developer by mistake used one of the, now forbidden, classes or methods. Also, it would be nice to have something to give all developers using C1, so they could do their development knowing that their code would also be Azure ready. FxCop and custom rules to the rescue! See my blog post on how to make custom FxCop rules.

Findings

So which classes and methods did we find? And how hard was it to exchange them with our own classes and methods? To answer these questions I will group them into 4 groups: static classes, non-static classes, methods and configuration classes. In the following I'll describe the findings in each group in more detail.

Static classes

The found classes in this group were:

  • System.IO.File
  • System.IO.Directory

These were really easy to create our own implementations of: just create a static class and do a one-to-one mapping of all methods.

Non-static classes

The found classes in this group were:

  • System.IO.StreamReader (Disposable)
  • System.IO.StreamWriter (Disposable)
  • System.IO.FileStream (Disposable)
  • System.IO.FileSystemWatcher
  • System.IO.FileInfo
  • System.IO.DirectoryInfo

These were also really easy to create our own implementations of. One important thing to remember here is to implement the dispose pattern the right way; otherwise there will be problems with writing to unclosed files etc. Here is an example of doing it the right way:

~FileStream()
{
    Dispose(false);
}

protected override void Dispose(bool disposing)
{
    if (disposing)
    {
        originalClass.Dispose();
    }
}

Methods

The found methods in this group were:

  • System.Xml.Linq.XDocument.Load (Local and remote)
  • System.Xml.Linq.XDocument.Save
  • System.Xml.Linq.XElement.Load (Local and remote)
  • System.Xml.Linq.XElement.Save
  • System.Xml.XmlReader.Create
  • System.Xml.XmlWriter.Create
  • System.Xml.XmlSchemaSet.Add
  • System.Xml.Xsl.XslCompiledTransform.Load

Here we simply created new static methods, some of them extension methods, and used a stream approach rather than a path/URI approach. So instead of passing a path string to XmlReader.Create, we passed our own implementation of System.IO.FileStream to the XmlReader.Create method.


Special care had to be taken when making new methods for XDocument and XElement, because their string versions of the Load method can also fetch a file over the network. Here we had to look at the inputUri string and see if it was a local file (use our own FileStream implementation) or remote (use WebRequest to fetch the file).
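
A minimal sketch (names and structure are my own, not Composite's actual API) of the stream-based replacement approach described above for XDocument.Load:

public static class C1XDocument
{
    public static XDocument Load(string inputUri)
    {
        if (inputUri.StartsWith("http://", StringComparison.OrdinalIgnoreCase) ||
            inputUri.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
        {
            /* Remote file: fetch it over the network */
            WebRequest request = WebRequest.Create(inputUri);
            using (WebResponse response = request.GetResponse())
            using (Stream stream = response.GetResponseStream())
            {
                return XDocument.Load(stream);
            }
        }

        /* Local file: go through the file I/O abstraction (plain FileStream here for brevity) */
        using (Stream stream = new FileStream(inputUri, FileMode.Open, FileAccess.Read))
        {
            return XDocument.Load(stream);
        }
    }
}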

Configuration classes

Found classes in this group were:

  • System.Configuration.Configuration
  • System.Configuration.ExeConfigurationFileMap (The Load method)

These classes/methods were the hardest ones. There was no way of replacing their file I/O functionality in a nice way, like we did with the other classes. So in this case we had to accept some local file I/O. But what about the Azure platform and sharing configuration files across multiple instances etc.? One way of solving this is to have some hooks for when a configuration file is loaded and saved. This way we could 'fetch' a configuration file on load and 'store' it on save. Here is the full implementation of a new Configuration class that supports this:

public class C1Configuration
{
    Configuration _configuration;

    public static C1Configuration Load(string path)
    {            
        ExeConfigurationFileMap map = new ExeConfigurationFileMap();
        map.ExeConfigFilename = path;
        Configuration configuration =
            ConfigurationManager.
            OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);            

        return new C1Configuration(configuration);
    }

    protected C1Configuration(Configuration configuration)
    {
        _configuration = configuration;
    }

    public ConfigurationSectionCollection Sections
    {
        get
        {
            return _configuration.Sections;
        }
    }

    public ConfigurationSection GetSection(string sectionName)
    {
        return _configuration.GetSection(sectionName);
    }

    public void Save()
    {
        _configuration.Save();
    }
}
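
A small usage sketch of the class above (the path and section name are hypothetical):

C1Configuration configuration = C1Configuration.Load(@"C:\MySite\MyApp.config");
ConfigurationSection section = configuration.GetSection("mySection");
/* ... read or modify the section here ... */
configuration.Save();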

API and plugin architecture

The next step was to create an API for C1 developers to use when doing file I/O, and a plugin architecture so that we could make C1 run on a local IIS, on the Azure platform, or possibly other platforms. The API is for the most part the same as the API for the original classes and methods, so this was just simple make-it-so work. C1 uses the Microsoft Enterprise Library (ObjectBuilder) as its plugin architecture, so this work was also pretty straightforward. At the moment we are not done with this work, but when we are, I'll post a link to the API.

Final thoughts

At the moment we have not created an Azure implementation of our file I/O plugin. This implementation will use blob storage for keeping the files. So in the near future I'll post how it went with the Azure implementation.

Also still missing is the ASP.NET/web server file I/O part. This can be resolved by using Virtualizing Access to Content. Another solution could be having the website files locally and doing some kind of synchronization when files are added/updated/deleted. This synchronization is possible through our new file I/O abstraction layer and can be implemented in the Azure implementation. It could also be used to solve the System.Configuration.Configuration problem.

Stay tuned for details regarding the Azure implementation and other cool stuff!

Tags:

.NET | C# | C1

About the author

Martin Ingvar Kofoed Jensen

Architect and Senior Developer at Composite on the open source project Composite C1 - C#/4.0, LINQ, Azure, Parallel and much more!

Follow me on Twitter

Read more about me here.

Read press and buzz about my work and me here.
