Azure SDK 1.4 Refresh – Web Deploy is disabled

April 16, 2011

Hurray, we now have Web Deploy integrated in the SDK! That is a major step towards rapid deployments to Azure 🙂

I immediately installed the 1.4 Refresh and tried to deploy my application, but the Web Deploy checkbox was disabled…

I read the notes in the Windows Azure Team post:

  • Web Deploy only works with a single role instance.
  • The tool is intended only for iterative development and testing scenarios.
  • Web Deploy bypasses package creation, so changes to the web pages are not durable. To preserve changes, you must package and deploy the service.
Everything was OK there, so looking closer at the publish window I noticed a warning: Remote Desktop must be enabled! Nothing wrong there either, it was already enabled.
I created a new Azure application to see if the checkbox was enabled, and it was! I suspected something was wrong with the *.csdef file, and checking it helped me solve the problem: I was still in HWC (Hostable Web Core) mode, with no “Sites” element in the XML. After changing to Full IIS mode the checkbox was enabled 🙂
So, one requirement to use Web Deploy is that you must be in Full IIS mode!
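For reference, a web role runs in Full IIS mode when its ServiceDefinition.csdef contains a Sites element. A minimal sketch (the service, role, site, and endpoint names are placeholders):

<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <!-- The Sites element is what switches the role from HWC to Full IIS mode -->
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>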

[Screenshot: the Web Deploy checkbox enabled]

Categories: Uncategorized

Azure SDK and the CommunicationObjectFaultedException

January 31, 2011

I see quite a few posts on the Azure forums related to this issue.

Usually the person says they created a new project, ran it, and boom:

System.ServiceModel.CommunicationObjectFaultedException was unhandled

This usually happens because the Azure SDK 1.3 tries to change the machine key in web.config, and that file is read-only, probably because it is checked in to a source control system.

Checking out web.config will solve this; hopefully the SDK’s 1.4 version will too.
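If you just need to unblock a local run, you can also clear the read-only flag or check the file out from the command line. A sketch, assuming it runs from the web project folder:

REM clear the read-only flag on web.config
attrib -r web.config

REM or, if you use TFS, check the file out
tf checkout web.config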

Categories: Azure Tags:

Azure SDK 1.3

December 20, 2010

As usual, I wanted my project to have the latest SDK available, so after receiving the news of the SDK 1.3 release and having some spare time, I downloaded it from Microsoft.

What’s New:

  • Virtual Machine (VM) Role (Beta): Allows you to create a custom VHD image using Windows Server 2008 R2 and host it in the cloud.
  • Remote Desktop Access: Enables connecting to individual service instances using a Remote Desktop client.
  • Full IIS Support in a Web role: Enables hosting Windows Azure web roles in an IIS hosting environment.
  • Elevated Privileges: Enables performing tasks with elevated privileges within a service instance.
  • Virtual Network (CTP): Enables support for Windows Azure Connect, which provides IP-level connectivity between on-premises and Windows Azure resources.
  • Diagnostics: Enhancements to Windows Azure Diagnostics enable collection of diagnostics data in more error conditions.
  • Networking Enhancements: Enables roles to restrict inter-role traffic, and fixed ports on InputEndpoints.
  • Performance Improvement: Significant performance improvement in local machine deployment.

After installing the SDK and refreshing my project references, I had a very strange error when starting the Web Role:

The communication object, System.ServiceModel.Channels.ServiceChannel, cannot be used for communication because it is in the Faulted state.

Googling it gave me the answer: somehow the web.config file must be writable! I hope the Azure Team will fix this quickly.

This was the only problem I had upgrading my project to the 1.3 version.

Categories: Azure

Table Storage : Performance Tips, Part 2 – Merge Entity

December 13, 2010

When updating an entity, why do we need to send all its fields when only one field has changed, such as a “published” field?

The table service API addresses this quite well; you can check MSDN here.

The trick is simply to use the “MERGE” HTTP method in the request. Other considerations must be attended to, like the If-Match header, which is required.

If-Match

Required. Specifies the condition for which the update should be performed.

To force an unconditional update, set If-Match to the wildcard character (*).

You reap more advantages from this technique on tables that have a large number of columns, since it reduces the request size.
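For illustration, here is a hedged sketch of what such a request could look like on the wire; the account, table, key values, and the Published property are placeholders, and the authentication headers are omitted. Only the changed property travels in the body:

MERGE http://myaccount.table.core.windows.net/mytable(PartitionKey='p1', RowKey='r1') HTTP/1.1
If-Match: *
Content-Type: application/atom+xml
x-ms-version: 2009-09-19

<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"
       xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <content type="application/xml">
    <m:properties>
      <d:Published m:type="Edm.Boolean">true</d:Published>
    </m:properties>
  </content>
</entry>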

Categories: Azure, Uncategorized

Table Storage : Performance Tips, Part 1 – Batch

December 11, 2010

Whenever there is a need to do bulk inserts or updates in table storage, performance matters.

Using TableServiceContext from Azure’s SDK, we can set the property SaveChangesDefaultOptions to SaveChangesOptions.Batch; this sends batches of operations to the table storage service instead of one request per operation (see the sketch after the list below). There are some limitations:

  • All entities subject to operations as part of the transaction must have the same PartitionKey value.
  • An entity can appear only once in the transaction, and only one operation may be performed against it.
  • The transaction can include at most 100 entities, and its total payload may be no more than 4 MB in size.
  • All entities are subject to the limitations described in Understanding the Table Service Data Model.

You can check MSDN documentation here.
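As mentioned above, here is a minimal sketch of the idea. The entity collection is an assumption (entities previously queried through this same context), and flushing in chunks of 100 keeps each transaction within the limit:

var tableService = new TableServiceContext(account.TableEndpoint.AbsoluteUri, account.Credentials);
tableService.SaveChangesDefaultOptions = SaveChangesOptions.Batch;

int pending = 0;
foreach (var entity in entitiesToUpdate) // all entities share the same PartitionKey
{
    tableService.UpdateObject(entity); // entities were previously queried through this context
    if (++pending % 100 == 0)
        tableService.SaveChanges(); // flushes one batch of 100 operations
}
tableService.SaveChanges(); // flushes the remaining operations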

Anyway, below is a Fiddler analysis of the two approaches.

Test case: update 108 entities

  1. SaveChangesOptions.Batch – two requests: one with 100 entities, and a second with the remaining 8.
    Request 1 – Bytes Sent: 121,131; Bytes Received: 28,413; Overall Elapsed: 00:00:03.856
    Request 2 – Bytes Sent: 10,362; Bytes Received: 2,844; Overall Elapsed: 00:00:00.591

    Total Bytes Sent: 131,493
    Total Bytes Received: 31,257
    Total Overall Elapsed: 00:00:05.447

  2. SaveChangesOptions.None
    Total Bytes Sent: 149,904
    Total Bytes Received: 31,968
    Total Overall Elapsed: 00:00:14.040

Judging by the results, I saved about 60% of the time using this technique.

Categories: Azure

Table Storage : Storing Dictionaries

November 15, 2010

Table Storage is really great! Why? It uses the concept of Property Bags for table content.

I don’t have to create a table and define every field it has. All I need to do is create the table and start saving data to it.

CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudTableClient tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist(tableName); // creates the table only if it doesn't exist yet

After that I can insert my object:

var tableService = new TableServiceContext(account.TableEndpoint.AbsoluteUri, account.Credentials);
tableService.AddObject(tableName, new Something());
tableService.SaveChanges();

TableServiceContext will then read my object’s properties and map them into the XML that is sent in the request to table storage.

The problem with this is that the object must have those properties declared on it, nothing dynamic here…

So… I implemented an indexer in that object, something like this:

class Something {
    private readonly Dictionary<string, object> _properties = new Dictionary<string, object>();
    public object this[string key] {
        get { return _properties[key]; }
        set { _properties[key] = value; }
    }
    // Lists all keys inserted through the indexer
    public IEnumerable<string> SomethingProperties { get { return _properties.Keys; } }
}

This allows me to extend the object properties in a simple way.
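For example (the keys below are arbitrary):

var something = new Something();
something["Color"] = "blue"; // no compile-time property needed
something["Price"] = 9.99;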

The next step is to hook into the XML creation for the request, as well as the object creation when reading from table storage. The ReadingEntity and WritingEntity events of the TableServiceContext got me there.

void FacilitusCloudContext_WritingEntity(object sender, ReadingWritingEntityEventArgs e)
{
    XNamespace d = "http://schemas.microsoft.com/ado/2007/08/dataservices";
    XNamespace m = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";

    var entity = e.Entity as Something;
    if (entity == null) return; // only our property-bag type is handled

    // Append one <d:key> element per dynamic property to the outgoing payload
    XElement properties = e.Data.Descendants(m + "properties").First();
    foreach (var key in entity.SomethingProperties)
    {
        properties.Add(new XElement(d + key, entity[key]));
    }
}

void FacilitusCloudContext_ReadingEntity(object sender, ReadingWritingEntityEventArgs e)
{
    XNamespace m = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";

    var entity = e.Entity as Something;
    if (entity == null) return; // only our property-bag type is handled

    // Any element that doesn't match a declared CLR property becomes a dynamic one
    var entityProperties = e.Entity.GetType().GetProperties().Select(p => p.Name);
    var dataElements = e.Data.Descendants(m + "properties").First().Descendants().ToList();

    foreach (var item in dataElements)
    {
        if (!entityProperties.Contains(item.Name.LocalName))
            entity[item.Name.LocalName] = item.Value;
    }
}
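To wire everything up, the handlers just need to be subscribed on the context before saving or querying. A minimal sketch, assuming the handlers are in scope:

var tableService = new TableServiceContext(account.TableEndpoint.AbsoluteUri, account.Credentials);
tableService.WritingEntity += FacilitusCloudContext_WritingEntity;
tableService.ReadingEntity += FacilitusCloudContext_ReadingEntity;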

This is a simple solution to a problem that can be very tricky.
Hope you liked it.

Categories: Azure

WCF Data Services : Ambiguous match found

November 15, 2010

I am using WCF Data Services to expose my POCO model as OData.

Today, a colleague asked me to help him solve an “Ambiguous match found” exception.
Looking at the stack trace led me to the solution: Type.GetProperty() was the last statement.

I created a little test to point me to the guilty property:

var types = Assembly.GetExecutingAssembly().GetTypes();
foreach (var type in types)
{
    Debug.WriteLine(type.FullName);
    var props = type.GetProperties();
    foreach (var prop in props)
        Debug.WriteLine(" " + type.GetProperty(prop.Name).Name); // throws on ambiguous names
}

Running this code, I got the same exception.
Conclusion:

class A {
    // An indexer is exposed to reflection as a property named "Item"
    public string this[string key] {
        get { return null; }
        set { }
    }
}
class B : A {
    // Clashes with the inherited indexer's reflection name
    public string Item { get; set; }
}

When asking for the Item property through reflection, there is an ambiguity because the indexer also shows up under the name “Item”.

I renamed the property on class B…
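An alternative to renaming, when you control the base class, is the IndexerName attribute, which changes the name the indexer gets in metadata (the name “Bag” below is arbitrary):

class A {
    [System.Runtime.CompilerServices.IndexerName("Bag")]
    public string this[string key] {
        get { return null; }
        set { }
    }
}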