A basic inter webrole broadcast communication on Azure using the service bus

by ingvar 19 May 2011 21:48

In this blog post I'll try to show a bare-bones setup that does inter-webrole broadcast communication. The code is based on Valery M's blog post. The code in his post is based on a customer solution and contains a lot more code than is needed to get the setup working; it also provides much more robust broadcast communication, with retries and other things that make the communication reliable. I have omitted all of this to make the sample as easy to understand and recreate as possible. The idea is that the code I provide could be used as a basis for setting up your own inter-webrole broadcast communication. You can download the code here: InterWebroleBroadcastDemo.zip (17.02 kb)

Windows Azure AppFabric SDK and using Microsoft.ServiceBus.dll

We need a reference to Microsoft.ServiceBus.dll in order to do the inter-webrole communication. The Microsoft.ServiceBus.dll assembly is part of the Windows Azure AppFabric SDK found here.
When you use Microsoft.ServiceBus.dll you need to add it as a reference like any other assembly, by browsing to the directory where the AppFabric SDK was installed. But unlike most other references you add, you need to set the "Copy Local" property of the reference to true (the default is false).
I have put all my code in a separate assembly, and the main classes are then used in the WebRole.cs file. Even though I have added Microsoft.ServiceBus.dll to my assembly and set "Copy Local" to true, I still have to add it to the WebRole project and set "Copy Local" to true there as well. This is a very important detail!
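In the project file this corresponds to a Reference entry with Private set to True (Private is MSBuild's name for "Copy Local"). The hint path below is only an example and depends on where the SDK is installed on your machine:

```xml
<Reference Include="Microsoft.ServiceBus">
  <!-- Example path; adjust to your AppFabric SDK install location -->
  <HintPath>C:\Program Files\Windows Azure AppFabric SDK\V1.0\Assemblies\NET4.0\Microsoft.ServiceBus.dll</HintPath>
  <!-- "Copy Local" = true, so the assembly is copied to the bin folder of the role -->
  <Private>True</Private>
</Reference>
```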

Creating a new Service Bus Namespace

Here is a short step-by-step guide on how to create a new Service Bus namespace. If you have already done this, you can skip it and just use the existing namespace and its values.

  1. Go to the section "Service Bus, Access Control & Caching"
  2. Click the button "New Namespace"
  3. Check "Service Bus"
  4. Enter the desired Namespace (This namespace is the one used for EndpointInformation.ServiceNamespace)
  5. Click "Create Namespace"
  6. Select the newly created namespace
  7. Under properties (To the right) find Default Key and click "View"
  8. Here you will find the Default Issuer (This value should be used for EndpointInformation.IssuerName) and Default Key (This value should be used for  EndpointInformation.IssuerSecret)

The code

Here I will go through all the classes in my sample code. The full project, including the WebRole project, can be downloaded here: InterWebroleBroadcastDemo.zip (17.02 kb)

BroadcastEvent

We start with the BroadcastEvent class. This class represents the data we send across the wire. It is made serializable with the DataContract class attribute and the DataMember member attributes. In this sample code I only send two simple strings. SenderInstanceId is not required, but I use it to display where the message came from.

[DataContract(Namespace = BroadcastNamespaces.DataContract)]
public class BroadcastEvent
{
    public BroadcastEvent(string senderInstanceId, string message)
    {
        this.SenderInstanceId = senderInstanceId;
        this.Message = message;
    }

    [DataMember]
    public string SenderInstanceId { get; private set; }

    [DataMember]
    public string Message { get; private set; }
}

BroadcastNamespaces

This class only contains some constants that are used by some of the other classes.

public static class BroadcastNamespaces
{
    public const string DataContract = "http://broadcast.event.test/data";
    public const string ServiceContract = "http://broadcast.event.test/service";
}

IBroadcastServiceContract

This interface defines the contract that the web roles use when communicating with each other. In this simple example the contract only has one method, Publish. In the implementation of the contract (BroadcastService), this method is used to send BroadcastEvents to all web roles that have subscribed to this channel. There is another method, Subscribe, which is inherited from IObservable&lt;BroadcastEvent&gt;. It is used to subscribe to the BroadcastEvents published by any web role, and it is also implemented in the BroadcastService class.

[ServiceContract(Name = "BroadcastServiceContract",
    Namespace = BroadcastNamespaces.ServiceContract)]
public interface IBroadcastServiceContract : IObservable<BroadcastEvent>
{
    [OperationContract(IsOneWay = true)]
    void Publish(BroadcastEvent e);
}

IBroadcastServiceChannel

This interface defines the channel through which the web roles communicate. This is done by adding the IClientChannel interface.

public interface IBroadcastServiceChannel : IBroadcastServiceContract, IClientChannel
{
}

BroadcastEventSubscriber

The web role subscribes to the channel by creating an instance of this class and registering it. For testing purposes, this implementation only logs when it receives a BroadcastEvent.

public class BroadcastEventSubscriber : IObserver<BroadcastEvent>
{
    public void OnNext(BroadcastEvent value)
    {
        Logger.AddLogEntry(RoleEnvironment.CurrentRoleInstance.Id +
            " got message from " + value.SenderInstanceId + " : " +
            value.Message);
    }

    public void OnCompleted()
    {
        /* Handle on completed */
    }

    public void OnError(Exception error)
    {
        /* Handle on error */
    }
}

BroadcastService

This class implements the IBroadcastServiceContract interface. It handles the publish scenario by calling the OnNext method on all subscribers in parallel. The reason for doing this in parallel is that the OnNext method is blocking, so there is a good chance of a performance gain.
The other method is Subscribe. It adds the BroadcastEvent observer to the subscribers and returns an object of type UnsubscribeCallbackHandler that, when disposed, unsubscribes the observer. This is part of the IObserver/IObservable pattern.

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
    ConcurrencyMode = ConcurrencyMode.Multiple)]
public class BroadcastService : IBroadcastServiceContract
{
    private readonly IList<IObserver<BroadcastEvent>> _subscribers =
        new List<IObserver<BroadcastEvent>>();

    public void Publish(BroadcastEvent e)
    {
        ParallelQuery<IObserver<BroadcastEvent>> subscribers =
            from sub in _subscribers.AsParallel().AsUnordered()
            select sub;

        subscribers.ForAll((subscriber) =>
        {
            try
            {
                subscriber.OnNext(e);
            }
            catch (Exception ex)
            {
                try
                {
                    subscriber.OnError(ex);
                }
                catch (Exception)
                {
                    /* Ignore exception */
                }
            }
        });
    }

    public IDisposable Subscribe(IObserver<BroadcastEvent> subscriber)
    {
        if (!_subscribers.Contains(subscriber))
        {
            _subscribers.Add(subscriber);
        }

        return new UnsubscribeCallbackHandler(_subscribers, subscriber);
    }

    private class UnsubscribeCallbackHandler : IDisposable
    {
        private readonly IList<IObserver<BroadcastEvent>> _subscribers;
        private readonly IObserver<BroadcastEvent> _subscriber;

        public UnsubscribeCallbackHandler(IList<IObserver<BroadcastEvent>> subscribers,
            IObserver<BroadcastEvent> subscriber)
        {
            _subscribers = subscribers;
            _subscriber = subscriber;
        }

        public void Dispose()
        {
            if ((_subscribers != null) && (_subscriber != null) &&
                (_subscribers.Contains(_subscriber)))
            {
                _subscribers.Remove(_subscriber);
            }
        }
    }
}

ServiceBusClient

The main purpose of the ServiceBusClient class is to set up and create a ChannelFactory&lt;IBroadcastServiceChannel&gt; and an IBroadcastServiceChannel instance through the factory. The channel is used by the web role to send BroadcastEvents through the Publish method. It is in this class that all the Azure Service Bus magic happens: setting up the binding and the endpoint. A few Service Bus related constants are used here; they are all kept in the EndpointInformation class.

public class ServiceBusClient<T> where T : class, IClientChannel, IDisposable
{
    private readonly ChannelFactory<T> _channelFactory;
    private readonly T _channel;
    private bool _disposed = false;

    public ServiceBusClient()
    {
        Uri address = ServiceBusEnvironment.CreateServiceUri("sb",
            EndpointInformation.ServiceNamespace, EndpointInformation.ServicePath);

        NetEventRelayBinding binding = new NetEventRelayBinding(
            EndToEndSecurityMode.None,
            RelayEventSubscriberAuthenticationType.None);

        TransportClientEndpointBehavior credentialsBehaviour =
            new TransportClientEndpointBehavior();
        credentialsBehaviour.CredentialType =
            TransportClientCredentialType.SharedSecret;
        credentialsBehaviour.Credentials.SharedSecret.IssuerName =
            EndpointInformation.IssuerName;
        credentialsBehaviour.Credentials.SharedSecret.IssuerSecret =
            EndpointInformation.IssuerSecret;

        ServiceEndpoint endpoint = new ServiceEndpoint(
            ContractDescription.GetContract(typeof(T)), binding,
            new EndpointAddress(address));
        endpoint.Behaviors.Add(credentialsBehaviour);

        _channelFactory = new ChannelFactory<T>(endpoint);

        _channel = _channelFactory.CreateChannel();
    }

    public T Client
    {
        get
        {
            if (_channel.State == CommunicationState.Opening) return null;

            if (_channel.State != CommunicationState.Opened)
            {
                _channel.Open();
            }

            return _channel;
        }
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    public void Dispose(bool disposing)
    {
        if (!_disposed)
        {
            if (disposing)
            {
                try
                {
                    if (_channel.State == CommunicationState.Opened)
                    {
                        _channel.Close();
                    }
                    else
                    {
                        _channel.Abort();
                    }
                }
                catch (Exception)
                {
                    /* Ignore exceptions */
                }

                try
                {
                    if (_channelFactory.State == CommunicationState.Opened)
                    {
                        _channelFactory.Close();
                    }
                    else
                    {
                        _channelFactory.Abort();
                    }
                }
                catch (Exception)
                {
                    /* Ignore exceptions */
                }

                _disposed = true;
            }
        }
    }

    ~ServiceBusClient()
    {
        Dispose(false);
    }
}

ServiceBusHost

The main purpose of the ServiceBusHost class is to set up, create and open a ServiceHost. The service host is used by the web role to receive BroadcastEvents, by registering a BroadcastEventSubscriber instance. Like in the ServiceBusClient, it is in this class that all the Azure Service Bus magic happens.

public class ServiceBusHost<T> where T : class
{
    private readonly ServiceHost _serviceHost;
    private bool _disposed = false;

    public ServiceBusHost()
    {
        Uri address = ServiceBusEnvironment.CreateServiceUri("sb",
            EndpointInformation.ServiceNamespace, EndpointInformation.ServicePath);

        NetEventRelayBinding binding = new NetEventRelayBinding(
            EndToEndSecurityMode.None,
            RelayEventSubscriberAuthenticationType.None);

        TransportClientEndpointBehavior credentialsBehaviour =
            new TransportClientEndpointBehavior();
        credentialsBehaviour.CredentialType =
            TransportClientCredentialType.SharedSecret;
        credentialsBehaviour.Credentials.SharedSecret.IssuerName =
            EndpointInformation.IssuerName;
        credentialsBehaviour.Credentials.SharedSecret.IssuerSecret =
            EndpointInformation.IssuerSecret;

        ServiceEndpoint endpoint = new ServiceEndpoint(
            ContractDescription.GetContract(typeof(T)), binding,
            new EndpointAddress(address));
        endpoint.Behaviors.Add(credentialsBehaviour);

        _serviceHost = new ServiceHost(Activator.CreateInstance(typeof(T)));

        _serviceHost.Description.Endpoints.Add(endpoint);

        _serviceHost.Open();
    }

    public T ServiceInstance
    {
        get
        {
            return _serviceHost.SingletonInstance as T;
        }
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    public void Dispose(bool disposing)
    {
        if (!_disposed)
        {
            if (disposing)
            {
                try
                {
                    if (_serviceHost.State == CommunicationState.Opened)
                    {
                        _serviceHost.Close();
                    }
                    else
                    {
                        _serviceHost.Abort();
                    }
                }
                catch
                {
                    /* Ignore exceptions */
                }
                finally
                {
                    _disposed = true;
                }
            }
        }
    }

    ~ServiceBusHost()
    {
        Dispose(false);
    }
}

EndpointInformation

This class keeps all the Service Bus related constants. I have put in dummy constants for ServiceNamespace, IssuerName and IssuerSecret. You have to find these in the Windows Azure Management Portal [URL]. Read above how to create a new Service Bus namespace and obtain these values.

public static class EndpointInformation
{
    public const string ServiceNamespace = "CHANGE THIS TO YOUR NAMESPACE";
    public const string ServicePath = "BroadcastService";
    public const string IssuerName = "CHANGE THIS TO YOUR ISSUER NAME";
    public const string IssuerSecret = "CHANGE THIS TO YOUR ISSUER SECRET";
}


BroadcastCommunicator

This class abstracts all the dirty details away and is the main class that the web role uses. It has two methods: Publish, for publishing BroadcastEvent instances, and Subscribe, for subscribing to the broadcast events by creating an instance of BroadcastEventSubscriber and handing it to the Subscribe method.

public class BroadcastCommunicator : IDisposable
{
    private ServiceBusClient<IBroadcastServiceChannel> _publisher;
    private ServiceBusHost<BroadcastService> _subscriber;
    private bool _disposed = false;

    public void Publish(BroadcastEvent e)
    {
        if (this.Publisher.Client != null)
        {
            this.Publisher.Client.Publish(e);
        }
    }

    public IDisposable Subscribe(IObserver<BroadcastEvent> subscriber)
    {
        return this.Subscriber.ServiceInstance.Subscribe(subscriber);
    }

    private ServiceBusClient<IBroadcastServiceChannel> Publisher
    {
        get
        {
            if (_publisher == null)
            {
                _publisher = new ServiceBusClient<IBroadcastServiceChannel>();
            }

            return _publisher;
        }
    }

    private ServiceBusHost<BroadcastService> Subscriber
    {
        get
        {
            if (_subscriber == null)
            {
                _subscriber = new ServiceBusHost<BroadcastService>();
            }

            return _subscriber;
        }
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    public void Dispose(bool disposing)
    {
        if (!_disposed && disposing)
        {
            try
            {
                _subscriber.Dispose();
                _subscriber = null;
            }
            catch
            {
                /* Ignore exceptions */
            }

            try
            {
                _publisher.Dispose();
                _publisher = null;
            }
            catch
            {
                /* Ignore exceptions */
            }

            _disposed = true;
        }
    }

    ~BroadcastCommunicator()
    {
        Dispose(false);
    }
}

WebRole

This is a pretty straightforward web role. In the OnStart method an instance of BroadcastCommunicator is created, and an instance of BroadcastEventSubscriber is used to subscribe to the channel.
The Run method is an endless loop with a random sleep in each iteration, for testing purposes. In every iteration it sends a "Hello world!" message including its own role instance id.
The OnStop method cleans up by disposing the disposable objects.

public class WebRole : RoleEntryPoint
{
    private volatile BroadcastCommunicator _broadcastCommunicator;
    private volatile BroadcastEventSubscriber _broadcastEventSubscriber;
    private volatile IDisposable _broadcastSubscription;
    private volatile bool _keepLooping = true;

    public override bool OnStart()
    {
        _broadcastCommunicator = new BroadcastCommunicator();
        _broadcastEventSubscriber = new BroadcastEventSubscriber();

        _broadcastSubscription =
            _broadcastCommunicator.Subscribe(_broadcastEventSubscriber);

        return base.OnStart();
    }

    public override void Run()
    {
        /* Just keep sending messages */
        while (_keepLooping)
        {
            int secs = ((new Random()).Next(30) + 60);

            Thread.Sleep(secs * 1000);
            try
            {
                BroadcastEvent broadcastEvent =
                    new BroadcastEvent(RoleEnvironment.CurrentRoleInstance.Id,
                        "Hello world!");

                _broadcastCommunicator.Publish(broadcastEvent);
            }
            catch (Exception ex)
            {
                Logger.AddLogEntry(ex);
            }
        }
    }

    public override void OnStop()
    {
        _keepLooping = false;

        if (_broadcastCommunicator != null)
        {
            _broadcastCommunicator.Dispose();
        }

        if (_broadcastSubscription != null)
        {
            _broadcastSubscription.Dispose();
        }

        base.OnStop();
    }
}


Logger

The Logger class is used in several places in the code. If a logger action has been set, logging will be done. Read more below about how I did the logging.

public static class Logger
{
    private static Action<string> AddLogEntryAction { get; set; }

    public static void Initialize(Action<string> addLogEntry)
    {
        AddLogEntryAction = addLogEntry;
    }

    public static void AddLogEntry(string entry)
    {
        if (AddLogEntryAction != null)
        {
            AddLogEntryAction(entry);
        }
    }

    public static void AddLogEntry(Exception ex)
    {
        while (ex != null)
        {
            AddLogEntry(ex.ToString());

            ex = ex.InnerException;
        }
    }
}

Simple but effective logging

When I developed this demo I used a web service on another server for logging. This web service has just one method, taking one string argument: the line to log. I then have a page for displaying and clearing the log. This is a very simple way of doing logging, but it gets the job done.
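As a sketch of how the Logger class above could be wired up to such a service, the web role might initialize it once at startup. The log service URL and the plain HTTP POST shape are hypothetical placeholders for whatever logging endpoint you have available:

```csharp
/* Hypothetical wiring of the Logger; the URL is a placeholder. */
Logger.Initialize(line =>
{
    using (var client = new System.Net.WebClient())
    {
        // The remote log service has one method taking one string:
        // post the log line as the request body
        client.UploadString("http://example.com/logservice/add", line);
    }
});
```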

The output

Below is the output from a run of the demo project with 4 web role instances. The first two lines are the most interesting: here you can see that only web role instances WebRole_IN_0 and WebRole_IN_2 were ready to receive (and send) events. The reason for this is the late creation (create when needed) of the ServiceBusClient and ServiceBusHost in the BroadcastCommunicator class, and the sleep period in the WebRole. This illustrates that web roles can join the broadcast channel at any time and start sending and receiving events.
 
20:30:40.4976 : WebRole_IN_2 got message from WebRole_IN_0 : Hello world!
20:30:40.7576 : WebRole_IN_0 got message from WebRole_IN_0 : Hello world!
20:30:43.0912 : WebRole_IN_2 got message from WebRole_IN_3 : Hello world!
20:30:43.0912 : WebRole_IN_1 got message from WebRole_IN_3 : Hello world!
20:30:43.0912 : WebRole_IN_0 got message from WebRole_IN_3 : Hello world!
20:30:43.1068 : WebRole_IN_3 got message from WebRole_IN_3 : Hello world!
20:30:45.4505 : WebRole_IN_0 got message from WebRole_IN_2 : Hello world!
20:30:45.4505 : WebRole_IN_3 got message from WebRole_IN_2 : Hello world!
20:30:45.4505 : WebRole_IN_1 got message from WebRole_IN_2 : Hello world!
20:30:45.4662 : WebRole_IN_2 got message from WebRole_IN_2 : Hello world!
20:30:59.4816 : WebRole_IN_0 got message from WebRole_IN_1 : Hello world!
20:30:59.4816 : WebRole_IN_3 got message from WebRole_IN_1 : Hello world!
20:30:59.4972 : WebRole_IN_2 got message from WebRole_IN_1 : Hello world!
20:30:59.4972 : WebRole_IN_1 got message from WebRole_IN_1 : Hello world!
20:31:59.1371 : WebRole_IN_2 got message from WebRole_IN_3 : Hello world!
20:31:59.2621 : WebRole_IN_1 got message from WebRole_IN_3 : Hello world!
20:31:59.3871 : WebRole_IN_0 got message from WebRole_IN_3 : Hello world!
20:31:59.5746 : WebRole_IN_3 got message from WebRole_IN_3 : Hello world!
20:32:03.1683 : WebRole_IN_2 got message from WebRole_IN_0 : Hello world!
20:32:03.1683 : WebRole_IN_0 got message from WebRole_IN_0 : Hello world!
20:32:03.1683 : WebRole_IN_1 got message from WebRole_IN_0 : Hello world!
20:32:03.1839 : WebRole_IN_3 got message from WebRole_IN_0 : Hello world!

Tags:

.NET | Azure | C#

Azure and CopyFromBlob performance

by ingvar 2 May 2011 21:47

I have earlier looked at the performance of some of the methods for uploading content to an Azure blob. The results of those tests can be found here. While I was working on getting Composite C1 multi-tenancy working in the cloud, I ran into a performance problem. Thanks to Henrik Westergaard, I took a look at the CopyFromBlob method and its performance.

I used the same setup as in my earlier test. The test code ran in the WebRole's OnStart. The table below holds the test results. Unlike the other methods I tested, CopyFromBlob seems to be largely invariant as the file size grows. The CopyFromBlob method is 2-3 times faster for files larger than 250 kb, and never significantly slower. So if the setup permits it, this method is preferable to the other upload methods.

Size in kb   CopyFromBlob
50           70
100          75
150          73
200          75
250          75
300          75
350          75
400          75
450          73
500          70

The code for the test loop is here:

/* container, sourceBlob and loopCount are initialized earlier in the test */
int copyFromBlobTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob targetBlob = container.
        GetBlobReference(string.Format("TargetTestFile{0}.dat", i));
    targetBlob.CopyFromBlob(sourceBlob);
}
int copyFromBlobTime2 = Environment.TickCount;
/* Elapsed time in milliseconds: copyFromBlobTime2 - copyFromBlobTime1 */

Tags:

.NET | Azure | C# | Blob

Azure and creating website dynamically

by ingvar 28 April 2011 21:21

In this post I will show how to create a new website dynamically in an already deployed web role on Azure. This is especially interesting for multi-tenant scenarios. Another way of doing this would be to create a new deployment package and upgrade the running role, but that would mean downtime for the existing sites and would take longer than dynamically creating the site.

I will start with some important things to know when creating websites dynamically from the web role, and then I will show the code that does it.

The web role needs to have elevated privileges. This is done by adding the following element to the WebRole element in the ServiceDefinition.csdef file.


<Runtime executionContext="elevated" />

Only some ports are open in the firewall. When you normally create new sites, you either run them on the same port as the deployed site, using a host header (host name), or on a different port. On Azure, using a different port is not an option: only the ports that are specified for sites (or Remote Desktop Connection) when the package is deployed are open in the firewall. So the only option here is to use a host header (host name) for new sites. You can see which ports are open in the Windows Azure Management Portal.
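For reference, a minimal sketch of the relevant part of ServiceDefinition.csdef: only the endpoint ports declared here (port 80 below) are opened in the firewall, so dynamically created sites must share them via host headers. The names are illustrative:

```xml
<WebRole name="MyWebRole">
  <Runtime executionContext="elevated" />
  <Sites>
    <Site name="Web">
      <Bindings>
        <!-- Port 80 is the only HTTP port opened; sites created at
             runtime must bind to it with their own host headers -->
        <Binding name="HttpIn" endpointName="HttpIn" />
      </Bindings>
    </Site>
  </Sites>
  <Endpoints>
    <InputEndpoint name="HttpIn" protocol="http" port="80" />
  </Endpoints>
</WebRole>
```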

Microsoft.Web.Administration reference. You need to add a reference to the assembly Microsoft.Web.Administration to use the ServerManager class. Microsoft.Web.Administration.dll can be found in the %WinDir%\System32\InetSrv directory. You also need to change the "Copy Local" property to true, because the assembly is not in the GAC on the Azure host.

The code! Now let's get to the fun part: the code! Below is the code needed to create a new site. This could be done in OnStart or wherever you see fit, though it has to be in the web role to have the privileges to do it. Here is the code:

string newWebsitePath = "SOME PATH!";
using (ServerManager serverManager = new ServerManager())
{
    /* Create the app pool */
    ApplicationPool applicationPool =
           serverManager.ApplicationPools.Add("MyApplicationPool");
    applicationPool.AutoStart = true;
    applicationPool.ManagedPipelineMode = ManagedPipelineMode.Integrated;
    applicationPool.ManagedRuntimeVersion = "v4.0";

    /* Create the web site */
    Site site = serverManager.Sites.
           Add("MyNewSite", "http", "*:80:www.mynewwebsite.com", newWebsitePath);
    site.Applications.First().ApplicationPoolName = "MyApplicationPool";
    site.ServerAutoStart = true;
    serverManager.CommitChanges();
}

Testing. An easy way to test this is to edit your hosts file (C:\Windows\System32\drivers\etc\hosts). First you need to find the IP address of your web role. This can be done either by pinging it or by looking in the Windows Azure Management Portal. So let's say the IP address is 111.222.333.444 and you have used www.mynewwebsite.com as the host name. Then you need to add the following line to your hosts file:

111.222.333.444    www.mynewwebsite.com

When you have added this line and saved the file you can open www.mynewwebsite.com in your browser and the request will go to your web role.

Tags:

.NET | Azure | C#

Azure Shared Access Signature and pitfalls

by ingvar 26 April 2011 21:03

To start with, I did not think I would write a blog post about Azure Shared Access Signatures (SAS). But after having worked with them for some time, I have stumbled into some things I think are worth sharing. The things I found are shown below the code. Thanks to @Danielovich for pointing me in the right direction.

I'll start by showing how to create a SAS. You need access to the Primary Access Key (or the Secondary Access Key) for the blob storage you wish to use. These keys can be obtained through the Windows Azure Platform Portal. The code below shows how to create a SAS, how to use it, and what you can and cannot do with it.

/* Here is how to create the SAS */
StorageCredentialsAccountAndKey masterCredentials = 
     new StorageCredentialsAccountAndKey("[Name]", "[AccessKey]");
CloudStorageAccount account = new CloudStorageAccount(masterCredentials, false);
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("mytestcontainer");
container.CreateIfNotExist();

SharedAccessPolicy sharedAccessPolicy = new SharedAccessPolicy();
sharedAccessPolicy.Permissions = 
     SharedAccessPermissions.Delete |
     SharedAccessPermissions.List |
     SharedAccessPermissions.Read |
     SharedAccessPermissions.Write;
sharedAccessPolicy.SharedAccessStartTime = DateTime.UtcNow;
sharedAccessPolicy.SharedAccessExpiryTime = DateTime.UtcNow + TimeSpan.FromHours(1);

string sharedAccessSignature = container.GetSharedAccessSignature(sharedAccessPolicy);

/* Here is how to use the sharedAccessSignature */
StorageCredentialsSharedAccessSignature sasCredentials = 
    new StorageCredentialsSharedAccessSignature(sharedAccessSignature);
CloudBlobClient sasClient = new CloudBlobClient(account.BlobEndpoint, sasCredentials);

CloudBlobContainer sasContainer = sasClient.GetContainerReference("mytestcontainer");
CloudBlob sasBlob = sasContainer.GetBlobReference("myblob.txt");

/* This will work if SharedAccessPermissions.Write is used */
sasBlob.UploadText("Hello!");

/* This will work if SharedAccessPermissions.Read is used */
sasBlob.DownloadText();

/* This will work if SharedAccessPermissions.Delete is used */
sasBlob.Delete();

/* This will work if SharedAccessPermissions.List is used */
sasContainer.ListBlobs();

/* This will always fail */
sasContainer.FetchAttributes();

/* This will always fail */
sasClient.ListContainers(); 

Here are some points that I think are worth noting when working with SAS. They might even save you some time:

  • Remember to use the Utc methods on DateTime. If you use anything else, the time window in which the SAS is valid might not be the one you think.
  • The FetchAttributes method does not work on the container/blob that the SAS was generated for. This is interesting because FetchAttributes is very often used to determine whether a container/blob exists. But it will work for blobs inside a container if the SAS was generated for that container.
  • A StorageClientException with the message "The specified resource does not exist" is thrown if the SAS does not grant enough access. So Azure hides the container/blob if the client does not have the right access level.
  • DeleteIfExists will never fail if SharedAccessPermissions.Delete is not specified. As mentioned above, Azure hides containers/blobs if access rights are missing.
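The exists-check pitfall can be sketched like this, reusing the sasContainer object from the code above. Since FetchAttributes always throws for the container the SAS was generated for, an exists probe has to catch StorageClientException, and the helper name below is of course just illustrative:

```csharp
// Hypothetical helper: FetchAttributes as an "exists" probe.
// For the SAS-covered container itself this ALWAYS returns false,
// because Azure hides resources the credentials cannot see.
private static bool LooksLikeItExists(CloudBlobContainer container)
{
    try
    {
        container.FetchAttributes();
        return true;
    }
    catch (StorageClientException)
    {
        // Thrown both when the container is missing and when the
        // SAS simply does not grant access to the operation
        return false;
    }
}
```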

Tags:

.NET | Azure | C#

Azure and multi website packages created with cspack.exe

by ingvar 19 April 2011 07:49

Introduction

I wanted to create Azure deployment packages through the command line; well, I wanted to create them outside Visual Studio. I started out with a setup with two websites in the same role, with different binding endpoints, and the setup worked when I deployed from VS. In my setup, the WebRole copied files from the blob to each website. So the initial websites were empty, and the WebRole created them in the OnStart method.

I discovered one very important thing when creating packages with cspack.exe, and I want to share it with other developers, because it took me a great deal of time to find this small but very important detail: the argument /rolePropertiesFile.

Result

First, let me show what I found was needed to create a working package:

Here is the command:

cspack.exe WindowsAzureProject\ServiceDefinition.csdef 
           /role:MyWebRole;MyWebRole  
           /rolePropertiesFile:MyWebRole;MyWebRole\RoleProperties.txt

 

Here WindowsAzureProject is the folder containing the Azure project and MyWebRole is the folder containing the web role added to the Azure project.

And here is the content of the file RoleProperties.txt

EntryPoint=WebRole.dll
TargetFrameWorkVersion=v4.0

Process

Here are the things I went through to get to the working result above.

I started out with the command that can be found in many places through some simple searching:

cspack.exe WindowsAzureProject\ServiceDefinition.csdef 
           /role:MyWebRole;MyWebRole  

This resulted in this IIS error message: Parser Error Message: Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.

Then I did some more (well, a lot more) searching and found this blog post. I added the argument /rolePropertiesFile:RoleProperties.txt to the command and tried it out. But no luck; it did not work. The resulting site just gave me a new error, 403, as if no files existed on the website. The strange part was that it seemed the WebRole did not run. In WebRole.OnStart I had added some remote logging. When I deployed the command-line package there was no logging, but if I deployed through VS the logging was done. So the package was deployed, but the WebRole did not run, or threw an exception very early. Almost all the code was inside a try-catch block and any exception was remote logged, but there were no log lines. So I'm pretty sure it did not run, which also turned out to be the case after some more digging.

Then I tried adding the /copyOnly argument to the command. This generates a folder containing the files that the emulator can use when running the WebRole. I started comparing files in this folder against the folder generated by VS and found a very interesting file: RoleModel.xml. Both files contained a Properties element with a series of Property elements. The command-line version of this file only had one Property element, namely the TargetFrameWorkVersion that I had added in the RoleProperties.txt file. The VS version of the RoleModel.xml file contained a lot more Property elements. So I added all of them to the RoleProperties.txt file, ran the command, redeployed, and it worked!

Then I started removing properties one by one, redeploying in between, until I found the single property that was needed for the command-line generated package to work (yup, it took a long time to do this). Here is the resulting RoleProperties.txt file:

EntryPoint=WebRole.dll
TargetFrameWorkVersion=v4.0

I retried the whole setup, but this time with only one website in the web role. And for some reason unknown to me, it worked without the extra property in the RoleProperties.txt file.

So the lesson here is that using the /copyOnly argument and comparing the directory generated by cspack.exe against the one generated by VS can be very helpful.
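Putting the two lessons together, the debugging workflow sketched as a command; this simply adds /copyOnly to the working command shown earlier so the output folder can be diffed against the VS-generated one (check the cspack documentation for your SDK version):

```shell
cspack.exe WindowsAzureProject\ServiceDefinition.csdef ^
           /role:MyWebRole;MyWebRole ^
           /rolePropertiesFile:MyWebRole;MyWebRole\RoleProperties.txt ^
           /copyOnly
```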

Tags:

Azure | C# | C1

Microsoft.WindowsAzure.* assemblies not in GAC on Azure sites

by ingvar 16 April 2011 16:34

I discovered something odd with Azure deployments, which 99% of developers and Azure deployments will probably never encounter. To put it short: I discovered that if the website you deploy to Azure does not have a reference to the assembly Microsoft.WindowsAzure.StorageClient.dll, the assembly cannot be used by the website at all. In other words, this assembly is not in the GAC. Many will probably find this scenario strange, but I encountered this issue when I was working with the CMS Composite C1. C1 is deployed to Azure by the WebRole.OnStart method, which downloads a zip file and unzips it to the website. Why? Well, there are many reasons, but one of them is that this allows developers to deploy existing C1 sites to Azure with the same Azure deployment package (no need to rebuild the package). They just zip the site and point to it from the ServiceConfiguration.cscfg file.

I wanted to share this, so I have found an easy way to reproduce it. Here are the steps (a zipped version can be downloaded here):

  1. Create a new Windows Azure project.
  2. Add an ASP.NET WebRole (clean the site so only WebRole.cs and web.config are left).
  3. Add an Empty ASP.NET website.
  4. Add physicalDirectory="../Website" to the Site element in the ServiceDefinition.csdef file.
  5. Add a Default.aspx to the website (not the WebRole) with the code below.
  6. Deploy to Azure.
  7. Note that the CloudBlob type is not found!
  8. Add a Default.aspx with the same code to the WebRole project.
  9. Change the physicalDirectory to "../WebRole" in the ServiceDefinition.csdef file.
  10. Deploy to Azure.
  11. Now the type can be found! (The WebRole has a reference to the needed assembly.)

Here is the code I added to the Default.aspx.cs files:

protected void Page_Load(object sender, EventArgs e)
{
    Type cloudBlobType = Type.GetType("Microsoft.WindowsAzure.StorageClient.CloudBlob, " +
        "Microsoft.WindowsAzure.StorageClient");
    if (cloudBlobType != null)
    {
        PlaceHolder.Controls.Add(new LiteralControl("Type found! <br />"));
    }
    else
    {
        PlaceHolder.Controls.Add(new LiteralControl("Type NOT found! <br />"));
    }
}

I used Type.GetType here only for the reproduction. The same thing can happen, as it did with C1, if an assembly that references Microsoft.WindowsAzure.StorageClient.dll is added to the website from WebRole.OnStart and used in the code. In that case the site is totally dead.  

The fix is easy: just add the needed references to the website. The reason the WebRole works in this reproduction is that it has a reference to the needed assembly, which puts the assembly in the bin folder. 



Tags:

.NET | Azure | C#

Paths for each site on a Azure deployment

by ingvar 15. april 2011 22:49

If you need to do any file-related work in your WebRole on the deployed files, or if you want to add files from the WebRole, this post presents one way to get the physical paths and names of each website in an Azure deployment. This works for both single and multiple site setups.

First we need to get the role model, which Azure creates from the ServiceDefinition.csdef file, so some aspects of the role model file will be recognisable. The root directory of the role can be found in the environment variables, like this:

string roleRootPath = Environment.GetEnvironmentVariable("RdRoleRoot");

The file name of the role model is ‘RoleModel.xml’, so the full path of the role model can be constructed like this:

string roleModelPath = Path.Combine(roleRootPath, "RoleModel.xml");

We need one more thing before we start parsing the RoleModel.xml file: the current application directory. The paths in RoleModel.xml are relative, so to get the full path of a site we have to combine it with the application directory. We can get it like this:

string currentAppRootPath = Path.GetDirectoryName(AppDomain.CurrentDomain.BaseDirectory);

The site elements in this file are very similar to those in ServiceDefinition.csdef, so finding the website names and physical paths can be done like this:

XNamespace roleModelNS = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition";
XDocument roleModel = XDocument.Load(roleModelPath);
var siteElements = roleModel.Root.Element(roleModelNS + "Sites").
                                   Elements(roleModelNS + "Site");

var results = 
    from siteElement in siteElements
    where siteElement.Attribute("name") != null &&
          siteElement.Attribute("physicalDirectory") != null
    select new {
        Path = Path.Combine(currentAppRootPath, siteElement.Attribute("physicalDirectory").Value),
        Name = siteElement.Attribute("name").Value };
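To see the parsing in isolation, here is a self-contained sketch run against a hypothetical RoleModel.xml fragment. The fragment and the application root path are made up for illustration; the element and attribute names match the snippets above:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class RoleModelDemo
{
    static void Main()
    {
        // A hypothetical RoleModel.xml fragment, for illustration only
        string xml =
            "<RoleModel xmlns=\"http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition\">" +
            "  <Sites>" +
            "    <Site name=\"Web\" physicalDirectory=\"Website\" />" +
            "  </Sites>" +
            "</RoleModel>";

        string currentAppRootPath = @"C:\approot"; // hypothetical application root

        XNamespace roleModelNS = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition";
        XDocument roleModel = XDocument.Parse(xml);

        var results =
            from siteElement in roleModel.Root.Element(roleModelNS + "Sites")
                                              .Elements(roleModelNS + "Site")
            where siteElement.Attribute("name") != null &&
                  siteElement.Attribute("physicalDirectory") != null
            select new
            {
                Name = siteElement.Attribute("name").Value,
                Path = Path.Combine(currentAppRootPath,
                                    siteElement.Attribute("physicalDirectory").Value)
            };

        foreach (var site in results)
        {
            // e.g. "Web -> C:\approot\Website" on Windows
            Console.WriteLine(site.Name + " -> " + site.Path);
        }
    }
}
```

On a deployed instance you would of course use the real roleModelPath and currentAppRootPath from the snippets above instead of the inline fragment.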

Tags:

.NET | Azure | C#

Azure and the REST API

by ingvar 13. april 2011 21:09

Introduction

In this post I’ll write about the interesting things I discovered when doing REST calls to Azure. I’ll also post the code needed to create a hosted service, create a new deployment and upgrade an existing deployment.

The three most important findings for me were: how to get meaningful error messages when the web request fails, the right way to use Uri for some REST operations, and the right way to encode service configuration files in the body of the request.

Documentation for the Azure REST API can be found here: http://msdn.microsoft.com/en-us/library/ee460799.aspx

Certificate

You need a certificate to identify your REST calls to Azure.

You can create a new certificate by issuing this command:

makecert -r -pe -a sha1 -n CN=AzureMgmt -ss My "AzureMgmt.cer"

The file 'AzureMgmt.cer' created by this command should be added to your Azure subscription. You can do this through the Azure Management Portal.
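The portal lists management certificates by thumbprint, so it can be handy to look the new certificate up in the CurrentUser\My store that makecert wrote to. A small sketch (the subject name "AzureMgmt" matches the makecert command above):

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

class FindCert
{
    static void Main()
    {
        // Open the CurrentUser\My store, where "makecert -ss My" put the certificate
        X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);

        // Print the thumbprint of every certificate with subject name "AzureMgmt"
        foreach (X509Certificate2 cert in store.Certificates.Find(
            X509FindType.FindBySubjectName, "AzureMgmt", false))
        {
            Console.WriteLine(cert.Subject + ": " + cert.Thumbprint);
        }

        store.Close();
    }
}
```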

Generic setup

Here is a generic setup for issuing REST operations. 

Edit (2011-04-13): Setting the request method to "POST" should only be done if there is a body to send, thanks to Christian Horsdal for pointing this out.

string requestUrl = "";
string certificatePath = "";
string headerVersion = "";
string requestBody = "";

HttpWebRequest httpWebRequest = (HttpWebRequest)HttpWebRequest.Create(new Uri(requestUrl, true));
httpWebRequest.ClientCertificates.Add(new X509Certificate2(certificatePath));
httpWebRequest.Headers.Add("x-ms-version", headerVersion);
httpWebRequest.ContentType = "application/xml";

if (!string.IsNullOrEmpty(requestBody))
{
   httpWebRequest.Method = "POST";
   byte[] requestBytes = Encoding.UTF8.GetBytes(requestBody);
   httpWebRequest.ContentLength = requestBytes.Length;

   using (Stream stream = httpWebRequest.GetRequestStream())
   {
      stream.Write(requestBytes, 0, requestBytes.Length);
   }
}

try
{
   using (HttpWebResponse httpWebResponse = (HttpWebResponse)httpWebRequest.GetResponse())
   {
      Console.WriteLine("Response status code: " + httpWebResponse.StatusCode);
      WriteResponse(httpWebResponse);
   }
}
catch (WebException ex)
{
   Console.WriteLine("Response status code: " + ex.Status);
   WriteResponse(ex.Response);
}

Here is the code for the WriteResponse method:

static void WriteResponse(WebResponse webResponse)
{
   if (webResponse == null) return;

   using (Stream responseStream = webResponse.GetResponseStream())
   {
      if (responseStream == null) return;

      using (StreamReader reader = new StreamReader(responseStream))
      {
         Console.WriteLine("Response output:");
         Console.WriteLine(reader.ReadToEnd());
      }
   }
}

The three variables requestUrl, headerVersion and requestBody are the only things you need to change to do any REST operation. 

The certificatePath variable is the same for all operations; it just needs to be the path to your certificate file. 

The requestUrl variable is constructed to match the specific REST operation. See the documentation for the different REST operations here [http://msdn.microsoft.com/en-us/library/ee460799.aspx].

The headerVersion varies from operation to operation; some operations share the same version. You can find the correct version in the documentation.

The requestBody is a small XML document. Not all operations need a body, and this is handled by the generic code above. See the documentation [http://msdn.microsoft.com/en-us/library/ee460799.aspx] for whether a body is needed and, if it is, how it is constructed.

 

Important findings

Here is a short list of the problems I ran into when doing REST operations and the solutions to them:

  • WebException.Response.GetResponseStream() is your friend! When an operation fails with an exception and error code 400, reading the WebException's response stream can help you find the reason why.
  • The Uri constructor's dontEscape argument matters for the request URL; the generic code above passes true so the URL is not re-escaped. Getting this wrong makes REST operations like Upgrade Deployment fail.
  • The ordering of the elements in the body matters. It is XML, so that is not so surprising.
  • If an Azure deployment package is needed, it should be located in blob storage. This is also covered in the documentation.
  • If an Azure service configuration is needed, it should be read as a string and base64 encoded as I did in the code. On my first try I read the file as bytes (File.ReadAllBytes), but this made the operation fail.
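A plausible explanation for that last point is the UTF-8 byte order mark: File.ReadAllBytes includes the 3-byte BOM that Visual Studio writes at the start of a .cscfg file, while File.ReadAllText strips it, so the two approaches produce different base64 strings. A minimal sketch of the difference (the temp file stands in for a real .cscfg):

```csharp
using System;
using System.IO;
using System.Text;

class BomDemo
{
    static void Main()
    {
        string path = Path.GetTempFileName();

        // File.WriteAllText with Encoding.UTF8 writes a 3-byte BOM (EF BB BF),
        // just like Visual Studio does when it saves a .cscfg file.
        File.WriteAllText(path, "<ServiceConfiguration />", Encoding.UTF8);

        byte[] rawBytes  = File.ReadAllBytes(path);                        // BOM included
        byte[] textBytes = Encoding.UTF8.GetBytes(File.ReadAllText(path)); // BOM stripped

        Console.WriteLine(rawBytes.Length - textBytes.Length);  // 3
        Console.WriteLine(Convert.ToBase64String(rawBytes) ==
                          Convert.ToBase64String(textBytes));   // False

        File.Delete(path);
    }
}
```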

 

Create Hosted Service Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices";
string headerVersion = "2010-10-28";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<CreateHostedService xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<ServiceName>myservicename</ServiceName>" +
      "<Label>" + Convert.ToBase64String(Encoding.UTF8.GetBytes("myservicename")) + "</Label>" +
      "<Location>North Central US</Location>" +
   "</CreateHostedService>";

Create Deployment Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices/myservicename/deploymentslots/staging";
string headerVersion = "2009-10-01";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<CreateDeployment xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<Name>mydeployment</Name>" +
      "<PackageUrl>http://myblob.blob.core.windows.net/MyPackage.cspkg</PackageUrl>" +
      "<Label>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes("mydeployment")) +
      "</Label>" +
      "<Configuration>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes(
         File.ReadAllText(@"C:\MyServiceConfiguration.cscfg"))) +
      "</Configuration>" +
      "<StartDeployment>true</StartDeployment>" +
   "</CreateDeployment>";

Upgrade Deployment Example

Here is the value of the needed three variables:

string requestUrl = "https://management.core.windows.net/<subscription-id>/services/hostedservices/myservicename/deploymentslots/staging/?comp=upgrade";
string headerVersion = "2009-10-01";
string requestBody = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
   "<UpgradeDeployment xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
      "<PackageUrl>http://myblob.blob.core.windows.net/MyPackage.cspkg</PackageUrl>" +
      "<Configuration>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes(
            File.ReadAllText(@"C:\MyServiceConfiguration.cscfg"))) +
      "</Configuration>" + "<Mode>auto</Mode>" +
      "<Label>" +
      Convert.ToBase64String(Encoding.UTF8.GetBytes("mydeployment")) + 
      "</Label>" +
   "</UpgradeDeployment>";

Tags:

.NET | Azure | C#

Azure and blob write performance

by ingvar 6. januar 2011 21:07

Edit - 22nd August

Please read the comment below by Joe Giardino regarding the relatively poor performance of the OpenWrite method. He has a very good explanation!

Introduction

During my work at Composite on C1 I found out that some ways of adding/uploading data to Azure blob storage are faster than others. So I did some benchmarking on ways to add data to a block blob. I will start by listing the results, and below you can see the code I used for the testing. In all tests the loopCount was 50 (see the code below for more on loopCount). The numbers in the tables are the average number of milliseconds over these 50 loops.

Results

Azure test results

As expected, the blob is a little slower than writing to the local disk. But what surprised me is that the OpenWrite method is much, much slower than the other methods for adding data to the blob. Unfortunately I started out using the OpenWrite method and used it a lot. This really slowed down my solution. It got so slow that I started getting ThreadAbortExceptions and timeouts.

Size in kb Local disk UploadByteArray UploadFile UploadFromStream OpenWrite
50 0 31 64 33 222
100 0 41 33 37 240
150 0 42 40 43 235
200 0 50 43 44 227
250 0 56 52 44 227
300 0 55 53 51 242
350 57 55 79 53 246
400 57 60 90 60 263
450 55 72 95 61 258
500 50 76 73 68 278

 

Local dev fabric test results

I included this because I made it work locally before deploying to Azure. The numbers show the same pattern as the Azure run.

Size in kb Local disk UploadByteArray UploadFile UploadFromStream OpenWrite
50 0 89 78 80 338
100 0 90 86 89 398
150 0 128 124 129 411
200 0 138 137 136 426
250 0 144 151 148 436
300 0 179 178 185 521
350 19 197 193 192 561
400 16 234 230 229 550
450 14 245 249 245 541
500 14 248 247 249 555

 

The test code

Base code for all tests

This code finds the path of a local test file and creates the buffer that is written to the local disk or the block blob.

/* Test parameters; testBufferSize varied from 50 kb to 500 kb in the tests above */
int loopCount = 50;
int testBufferSize = 50 * 1024;

string localPath = Path.Combine(
    HttpContext.Current.Request.PhysicalApplicationPath, "BlobTestFile.dat");

if (File.Exists(localPath)) File.Delete(localPath);

/* Setting up the buffer */
byte[] testBuffer = new byte[testBufferSize];
for (int i = 0; i < testBufferSize; i++)
{
    testBuffer[i] = (byte)(i % 256);
}

Base code for all blob tests

CloudStorageAccount account =
   CloudStorageAccount.FromConfigurationSetting("BlobConnectionString");

CloudBlobClient client =
   account.CreateCloudBlobClient();

CloudBlobContainer container =
   client.GetContainerReference("mycontainer"); /* Remember lower casing only */

container.CreateIfNotExist();

Local disk writes test code

Simply using the System.IO.FileStream class for writing the buffer to disk.

int diskWriteTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    using (FileStream fileStream = new FileStream(localPath, FileMode.Create))
    {
        fileStream.Write(testBuffer, 0, testBufferSize);
    }
}
int diskWriteTime2 = Environment.TickCount;

UploadByteArray test code

Uploading the testBuffer using the UploadByteArray method.

int blobUploadByteArrayTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");
    blob.UploadByteArray(testBuffer);
}
int blobUploadByteArrayTime2 = Environment.TickCount;

UploadFile test code

Uploading the file written by the local disk write test. The local file has the same size as the testBuffer.


int blobUploadFileTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");
    /* Reusing the local file, written in the local test*/
    blob.UploadFile(localPath);
}
int blobUploadFileTime2 = Environment.TickCount;

UploadFromStream test code

Uploading the file written by the local disk write test using a FileStream for reading the file. The local file has the same size as the testBuffer.


int blobUploadFromStreamTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");

    using (FileStream fileStream = new FileStream(localPath, FileMode.Open))
    {
        /* Reusing the local file, written in the local test*/
        blob.UploadFromStream(fileStream);
    }
}
int blobUploadFromStreamTime2 = Environment.TickCount;

OpenWrite test code

Uploading the testBuffer using the OpenWrite method.

int blobOpenWriteTime1 = Environment.TickCount;
for (int i = 0; i < loopCount; i++)
{
    CloudBlob blob = container.GetBlobReference("BlobTestFile.dat");

    using (Stream stream = blob.OpenWrite())
    {
        stream.Write(testBuffer, 0, testBufferSize);
    }
}
int blobOpenWriteTime2 = Environment.TickCount;
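The TickCount differences above are turned into the per-upload averages listed in the tables by dividing by the loop count. A small self-contained sketch (the tick values here are hypothetical):

```csharp
using System;

class AverageDemo
{
    static void Main()
    {
        // Hypothetical tick values, for illustration only
        int loopCount = 50;
        int time1 = 1000;
        int time2 = 12350;

        // Environment.TickCount is in milliseconds, so this is ms per upload
        double averageMs = (time2 - time1) / (double)loopCount;
        Console.WriteLine(averageMs); // 227
    }
}
```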

Tags:

.NET | Azure | Blob | C#

Azure and edit or delete deployed files from your website

by ingvar 21. december 2010 09:32

If you change the file permissions on deployed files in the WebRole, you will be able to edit or delete those files from your website. This includes the web.config file!
Likewise, if you write new files from the WebRole, you also need to change the file permissions for those files.

My colleague at Composite, Marcus Wendt, found the code for changing the file permission here. I'll restate the code below. The code alone is not enough; you also need to edit the ServiceDefinition.csdef file and add '<Runtime executionContext="elevated" />' to give your WebRole the rights to change file permissions:


<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="AzureTest"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole">
    
    <Runtime executionContext="elevated" />

  </WebRole>
</ServiceDefinition>

And here is the code for changing the permission for a given file:

/* Requires System.IO, System.Security.AccessControl and System.Security.Principal */
void ChangePermission(string filePath)
{
    SecurityIdentifier sid = new SecurityIdentifier(WellKnownSidType.WorldSid, null);

    IdentityReference act = sid.Translate(typeof(NTAccount));

    FileSecurity sec = File.GetAccessControl(filePath);
    sec.AddAccessRule(new FileSystemAccessRule(act, FileSystemRights.FullControl,
                                               AccessControlType.Allow));

    File.SetAccessControl(filePath, sec);
}
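Calling it for the deployed web.config could look like this. This is only a sketch; the path is hypothetical, and on a real instance you would resolve the physical site path first rather than hard-code it:

```csharp
// Hypothetical example: grant Everyone full control of the site's web.config
// so the website itself can modify it. The path below is illustrative only;
// resolve the real physical site path on the instance before calling this.
string webConfigPath = System.IO.Path.Combine(@"E:\sitesroot\0", "web.config");

if (System.IO.File.Exists(webConfigPath))
{
    ChangePermission(webConfigPath);
}
```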

Tags:

.NET | Azure | C#

About the author

Martin Ingvar Kofoed Jensen

Architect and Senior Developer at Composite on the open source project Composite C1 - C#/4.0, LINQ, Azure, Parallel and much more!
