Let's rokk! [Tudor Cret's blog]

June 14, 2011

10 reasons to use Azure for your cloud apps

Filed under: Windows Azure — Tudor Cret @ 11:09 am

1. Familiarity of Windows

Because Azure is based on Windows, you can write .NET apps in C#, VB, or C++ using ASP.NET, MVC, or Silverlight.

It is also easy to migrate existing Windows applications.

2. 64-bit Windows Virtual Machines

Each instance of the app runs in its own VM on the 64-bit Windows Server 2008 operating system, on a hypervisor designed specifically for the cloud. You don’t have to supply your own VMs or deal with managing and maintaining the OS, because apps are developed as web role or worker role instances that run in their own VMs.
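As a sketch of what that looks like (the service and role names here are hypothetical), a service with one web role and one worker role is declared in ServiceDefinition.csdef roughly like this:

```xml
<!-- Hypothetical sketch: one web role, one worker role; names are placeholders -->
<ServiceDefinition name="MyCloudService"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
  <WorkerRole name="MyWorkerRole" vmsize="Small" />
</ServiceDefinition>
```

The fabric controller reads this definition and provisions a VM per role instance; you never install or patch the OS yourself.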

3. Azure SDK

With the SDK you can run and debug an application locally (on your PC) and then move it to the cloud.

4. Scalability and flexibility

Using Azure, you can easily create applications that run reliably and scale from 10 to 10 thousand or even 10 million users — without any additional coding. Azure Storage provides scalable, secure, performance-efficient storage services in the cloud.
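Scaling out without additional coding comes down to raising the instance count in ServiceConfiguration.cscfg (the role name below is hypothetical):

```xml
<!-- Hypothetical sketch: scale from 2 to 8 instances with a config change only -->
<ServiceConfiguration serviceName="MyCloudService"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWebRole">
    <Instances count="8" />
  </Role>
</ServiceConfiguration>
```

The configuration can be updated on a running deployment, so capacity changes don’t require a redeploy.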

5. Cost benefits and pricing model

No costs for building and/or expanding on-premises resources; a pay-as-you-go model.

6. Data Center in the cloud

SQL Azure offers a relational database engine in the cloud. You get high availability and reliability with redundant copies of your data and automatic failover.
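Because SQL Azure speaks the standard SQL Server (TDS) protocol, an application reaches it through an ordinary connection string; a sketch, with placeholder server, database, and credential names:

```xml
<!-- Hypothetical web.config fragment; server, database, and user are placeholders -->
<connectionStrings>
  <add name="SqlAzure"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=MyDb;User ID=user@yourserver;Password=YOUR_PASSWORD;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

Note the `user@server` login format and `Encrypt=True`, which SQL Azure requires.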

7. Support resources

Because Azure uses the same familiar tools and technologies as other Windows platforms, you can take advantage of the well-established support structure within Microsoft and company-provided resources. There are also strong communities and worldwide forums.

8. Interoperability

You can develop hybrid applications that allow your on-premises applications to use cloud services, such as the cloud database and storage services. Communications services work between on-premises applications and the cloud, as well as mobile devices (HTTP, XML, SOAP, REST). SDKs are available for Java, PHP, and Ruby.

9. Security

Windows Azure AppFabric provides a powerful mechanism to secure your application and your communications to the app.

10. Something for everyone

Windows Azure can benefit hosting providers, ISVs, systems integrators, and custom software developers. Hosting providers can expand their services to areas where they don’t have existing infrastructure and add new services without more infrastructure investment. ISVs can use Azure to create, deploy, and manage Web apps and SaaS without large capital expenditures, and they can scale those applications more quickly and cost effectively. Systems integrators can take advantage of Azure’s ability to work with existing on-premises infrastructures. Custom software developers can create software solutions for customers who can’t afford the costs of in-house development, including hardware costs, and they can deliver their applications to customers as services without building and maintaining an expensive data center.


Azure Named Fastest Cloud Service

Filed under: Windows Azure — Tudor Cret @ 10:34 am

According to tests by Compuware’s CloudSleuth service, Azure was named the fastest cloud service.

In a comparative measure of cloud service providers, Microsoft’s Windows Azure has come out ahead. Azure offered the fastest response times to end users for a standard e-commerce application. But the amount of time that separated the top five public cloud vendors was minuscule.

These are the first results I know of that try to show the ability of various providers to deliver a workload result. The same application was placed in each vendor’s cloud, then banged on by thousands of automated users over the course of 11 months. The top of the ranking:

  1. Windows Azure
  2. GoGrid
  3. Amazon EC2
  4. Rackspace

The test involved the ability to deliver a Web page filled with catalog-type information consisting of many small images and text details, followed by a second page consisting of a large image and labels. The top five were all within 0.8 second of each other.

The test application is designed to require a multi-step transaction that’s being requested by users from a variety of locations around the world. CloudSleuth launches queries to the application from an agent placed on 150,000 user computers.

The response times were:

  1. Windows Azure (data center outside Chicago) – 10.142 seconds
  2. GoGrid – 10.468 seconds
  3. Amazon EC2 Northern Virginia – 10.942 seconds
  4. Rackspace – 10.999 seconds
  5. Amazon EC2 West (Washington State) – 11.838 seconds
  6. OpSource, Calif. – 12.440 seconds
  7. GoGrid West – 12.604 seconds
  8. Terremark – 12.971 seconds
  9. CloudSigma – 18.079 seconds
  10. Amazon EC2 Europe/Ireland – 18.161 seconds
  11. Windows Azure for Southeast Asia – 27.534 seconds
  12. Amazon EC2 Asia/Pacific Singapore – 30.965 seconds

The response times include all the latencies of the last mile of service as the message moves off the Internet backbone and onto a local network segment. The response times reflect what end users are likely to see “at the edge of the network”.

The results listed:

  • are averages for the month of December, when traffic increased at many providers. Results for October and November were slightly lower, between 9 and 10 seconds.
  • are an average for each vendor, a composite response time compiled from 90,000 browser calls a month to the target application placed in each service provider’s cloud.

The original article is available on informationweek.com here.

April 29, 2011

Windows Azure at MIX 2011

Filed under: Windows Azure — Tudor Cret @ 9:17 am

Microsoft announced several updates to Windows Azure at MIX11. These new capabilities will help developers deploy applications faster, accelerate their application performance, and enable access to applications through popular identity providers including Microsoft, Facebook and Google.

New Services and Functionality

  • A preview of the Windows Azure Content Delivery Network (CDN) for Internet Information Services (IIS) Smooth Streaming capabilities, which allows developers to upload IIS Smooth Streaming-encoded video to a Windows Azure Storage account and deliver that video to Silverlight, iOS and Android Honeycomb clients. A CTP of this service will be released by the end of this fiscal year.
  • An update to the Windows Azure SDK that includes a Web Deployment Tool to simplify the migration, management and deployment of IIS Web servers, Web applications and Web sites. This new tool integrates with Visual Studio 2010 and the Web Platform Installer.
  • Updates to the Windows Azure AppFabric Access Control service, which provides a single-sign-on experience to Windows Azure applications by integrating with enterprise directories and Web identities.
  • Release of the Windows Azure AppFabric Caching service in the next 30 days, which will accelerate the performance of Windows Azure and SQL Azure applications.
  • A community technology preview (CTP) of Windows Azure Traffic Manager, a new service that allows Windows Azure customers to more easily balance application performance across multiple geographies.

Windows Azure Platform Offer Changes

Microsoft also announced several offer changes including:

  • The extension of the expiration date and increases to the amount of free storage, storage transactions and data transfers in the Windows Azure Introductory Special offer. This promotional offer now includes 750 hours of extra-small instances and 25 hours of small instances of the Windows Azure service, 20GB of storage, 50,000 storage transactions, and 40GB of data transfers provided each month at no charge until September 30, 2011. More information can be found here.
    • An existing customer who signed up for the original Windows Azure Introductory Special offer will get a free upgrade as of today. An existing customer who signed up for a different offer (other than the Windows Azure Introductory Special) would need to sign up for the updated Windows Azure Introductory Special Offer separately.
  • The Cloud Essentials Pack for Microsoft partners now includes 750 hours of extra-small instances and 25 hours of small instances of the Windows Azure service, 20GB of storage and 50GB of data transfers provided each month at no charge. In addition, the Cloud Essentials Pack also contains other Microsoft cloud services including SQL Azure, Windows Azure AppFabric, Microsoft Office 365, Windows Intune and Microsoft Dynamics CRM Online. More information can be found here.

Please read the press release or visit the MIX11 Virtual Press Room to learn more about announcements at MIX11. More information about the Windows Azure AppFabric announcements, can be found on the blog post, "Announcing the Commercial Release of Windows Azure AppFabric Caching and Access Control" on the Windows Azure AppFabric blog.

April 18, 2011

Azure Diagnostics

Filed under: Windows Azure — Tudor Cret @ 2:49 pm
Tags: ,

Diagnostics and monitoring for services and applications that don’t run in the cloud don’t require special attention, since you have more or less physical access to the production server (depending on how you’ve decided to host them). But in the cloud, diagnostics faces several challenges:

  • Many instances
  • They move around
  • Massive amount of data
  • Can’t remote desktop in
  • No remote tools (yet)

So Microsoft has implemented a monitoring agent (MonAgentHost.exe) that runs on each instance in the cloud. The agent is started automatically by default. The listener is wired up in app.config/web.config like any other TraceListener. In addition, you need to define a storage account connection string for this listener. In a nutshell, it works in five steps:


  1. The role instance starts, and by default the monitoring agent starts with it.
  2. The Diagnostic Monitor starts.
  3. The monitor is configured at start time, or remotely at any time, using the service configuration file.
  4. The monitor starts buffering data locally. The user may set a quota, too.
  5. The user initiates a transfer to storage. The transfer can be scheduled or on demand. I recommend a scheduled transfer; that way diagnostics storage stays up to date with the locally buffered data.
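Wiring the listener into web.config looks roughly like this (the assembly version and public key token below match the 1.x SDK and are an assumption; check the ones your SDK installs):

```xml
<!-- Registering the Azure trace listener; assembly version may differ per SDK -->
<system.diagnostics>
  <trace>
    <listeners>
      <add name="AzureDiagnostics"
           type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </listeners>
  </trace>
</system.diagnostics>
```

With this in place, ordinary `System.Diagnostics.Trace.WriteLine` calls flow through the agent.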


Below is the table with the items that can be monitored in the cloud, whether they are collected by default, and the destination storage type (the defaults listed are those documented for the 1.x diagnostics agent):

Data Source | Collected by default | Destination
Trace Logs | Yes | Azure Table
Diagnostic Infrastructure Logs | Yes | Azure Table
IIS Logs | Yes | Azure Blob
Performance Counters | No | Azure Table
Windows Event Logs | No | Azure Table
IIS Failed Request Logs | No | Azure Blob
Crash Dumps | No | Azure Blob
Arbitrary Files | No | Azure Blob
Implementing Azure Diagnostics:

First, the diagnostic agent is loaded as an Azure module in ServiceDefinition.csdef:
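A minimal sketch of that import (the role name is hypothetical; `moduleName="Diagnostics"` is the standard value):

```xml
<!-- Importing the Diagnostics module in ServiceDefinition.csdef -->
<WebRole name="MyWebRole">
  <Imports>
    <Import moduleName="Diagnostics" />
  </Imports>
</WebRole>
```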


Then the module expects a connection string:
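For local development, the setting in ServiceConfiguration.cscfg can simply point at development storage:

```xml
<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="UseDevelopmentStorage=true" />
</ConfigurationSettings>
```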


Attention: a production connection string must use https! Like this:

<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;
AccountName=YOURACCOUNT;AccountKey=YOURKEY" />

The common pattern for configuring diagnostics looks like this:

  • Get Config
    • From default
    • Current running
  • Make a change to the config
    • If changed within the instance, it affects only that instance. Don’t forget to start the agent immediately.
    • If changed from outside for all roles, then:
      • change the central file
      • agent notices a change and reloads
      • affects all instances of the role
  • Start the Diagnostics agent with the new configuration

The code:

Method 1:

  • I’ve set up a transfer to storage every minute
  • I’ve also added “System” and “Application” transfers from the Event Viewer
Code Snippet

//—————————————-method 1—————————-//
// get the current config
CloudStorageAccount storageAcc = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
RoleInstanceDiagnosticManager ridm = storageAcc.CreateRoleInstanceDiagnosticManager(
    RoleEnvironment.DeploymentId,
    RoleEnvironment.CurrentRoleInstance.Role.Name,
    RoleEnvironment.CurrentRoleInstance.Id);
DiagnosticMonitorConfiguration dmc = ridm.GetCurrentConfiguration();

// change the config: transfer logs to storage every minute
TimeSpan tsOneMinute = TimeSpan.FromMinutes(1.0);
dmc.Logs.ScheduledTransferPeriod = tsOneMinute;
dmc.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = tsOneMinute;
dmc.Directories.ScheduledTransferPeriod = tsOneMinute;
dmc.WindowsEventLog.ScheduledTransferPeriod = tsOneMinute;

dmc.WindowsEventLog.DataSources.Add("System!*");
dmc.WindowsEventLog.DataSources.Add("Application!*");
dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Information;

ridm.SetCurrentConfiguration(dmc);
//—————————————-end method 1—————————-//

Method 2:

  • I’ve loaded the default configuration and set up a scheduled transfer
Code Snippet

//—————————————-method 2—————————-//
// start the diagnostic monitor with the given configuration
DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
TimeSpan tsOneMinute = TimeSpan.FromMinutes(1.0);
dmc.Logs.ScheduledTransferPeriod = tsOneMinute;
// transfer verbose, critical, etc. logs
dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", dmc);
//—————————————-end method 2—————————-//

Method 3:

  • I’ve created a custom listener that writes log entries to a dedicated Azure table
Code Snippet

//—————————————-method 3—————————-//
CloudStorageAccount storageAcc = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
CloudTableClient tableStorage = storageAcc.CreateCloudTableClient();
tableStorage.CreateTableIfNotExist(TableStorageTraceListener.DIAGNOSTICS_TABLE);
AzureDiagnostics.TableStorageTraceListener listener =
    new AzureDiagnostics.TableStorageTraceListener("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString")
    {
        Name = "TableStorageTraceListener"
    };
System.Diagnostics.Trace.Listeners.Add(listener);
System.Diagnostics.Trace.AutoFlush = true;
Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
//—————————————-end method 3—————————-//

Full code available here.

Visualizing the data

You can use Azure Storage Explorer to browse the diagnostics storage account, or Cerebrata’s Azure Diagnostics Manager, a dedicated tool for diagnostics management. It displays a friendlier UI than Storage Explorer.

