Let's rokk! [Tudor Cret's blog]

August 24, 2011

Securing NopCommerce stores

Filed under: Uncategorized — Tudor Cret @ 9:29 am

Why are NopCommerce-based online stores secure? I’ve pointed out some arguments below:

  • NopCommerce uses the Forms Authentication Provider and the ASP.NET membership provider with ASP.NET login controls (together they provide a way to collect, authenticate and manage user credentials using little or no code)
  • Database access is secure – no dynamic SQL statements are used; all queries are parameterized, either manually or by Entity Framework
  • Error messages are safe – the application doesn’t show detailed errors to users (achievable by configuring the customErrors section properly in web.config)
  • Sensitive information is kept safely – passwords are stored hashed (MD5) and encryption keys are kept encrypted in the database, SSL can be turned on, and NopCommerce stores only the last 4 digits of the credit card number, masked
  • Guard against denial-of-service threats – file uploads are limited (4,096 KB by default). You can use the httpRuntime section in web.config to adjust this limit.
  • Guard against SQL statement exploits – the application uses parameterized SQL statements for data access
  • Guard against scripting exploits – ASP.NET performs request validation against query-string and form variables as well as cookie values. By default, if the current Request contains HTML-encoded elements or certain HTML character entities (such as &#151; for an em dash), the ASP.NET page framework raises an error.
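Several of the points above map directly to web.config settings. As a sketch (the section names are standard ASP.NET; the concrete values and the error-page path are illustrative, not NopCommerce defaults):

```xml
<configuration>
  <system.web>
    <!-- Show a generic error page to remote users; keep detailed errors on the local machine -->
    <customErrors mode="RemoteOnly" defaultRedirect="~/ErrorPage.htm" />
    <!-- Cap request size (value is in KB) to limit upload-based denial of service -->
    <httpRuntime maxRequestLength="4096" executionTimeout="110" />
    <!-- Request validation guards against scripting exploits; it is on by default -->
    <pages validateRequest="true" />
  </system.web>
</configuration>
```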

June 14, 2011

10 reasons to use Azure for your cloud apps

Filed under: Windows Azure — Tudor Cret @ 11:09 am

1. Familiarity of Windows

Because it is based on Windows, you can write .NET apps using C#/VB/C++ and ASP.NET/MVC/Silverlight.

It is also easy to migrate existing Windows applications.

2. 64-bit Windows Virtual Machines

Each instance of the app runs in its own VM on the 64-bit Windows Server 2008 operating system, on a hypervisor designed specifically for the cloud. You don’t have to supply your own VMs or deal with managing and maintaining the OS, because apps are developed as Web role or worker role instances that run in their own VMs.

3. Azure SDK

You can run an application locally (on your PC) while developing and debugging it, and then move it to the cloud.

4. Scalability and flexibility

Using Azure, you can easily create applications that run reliably and scale from 10 to 10 thousand or even 10 million users — without any additional coding. Azure Storage provides scalable, secure, performance-efficient storage services in the cloud.

5. Cost benefits and pricing model

No costs for building and/or expanding on-premises resources, and a pay-as-you-go model.

6. Data Center in the cloud

SQL Azure offers a relational database engine in the cloud. You get high availability and reliability with redundant copies of your data and automatic failover.

7. Support resources

Because Azure uses the same familiar tools and technologies as other Windows platforms, you can take advantage of the well-established support structure within Microsoft and company-provided resources, plus strong communities and worldwide forums.

8. Interoperability

You can develop hybrid applications that allow your on-premises applications to use cloud services, such as the cloud database and storage services. Communications services work between on-premises applications and the cloud, as well as mobile devices (HTTP, XML, SOAP, REST). SDKs are available for Java, PHP and Ruby.

9. Security

Windows Azure AppFabric provides a powerful mechanism to secure your application and your communications to the app.

10. Something for everyone

Windows Azure can benefit hosting providers, ISVs, systems integrators, and custom software developers. Hosting providers can expand their services to areas where they don’t have existing infrastructure and add new services without more infrastructure investment. ISVs can use Azure to create, deploy, and manage Web apps and SaaS without large capital expenditures, and they can scale those applications more quickly and cost effectively. Systems integrators can take advantage of Azure’s ability to work with existing on-premises infrastructures. Custom software developers can create software solutions for customers who can’t afford the costs of in-house development, including hardware costs, and they can deliver their applications to customers as services without building and maintaining an expensive data center.

Azure Named Fastest Cloud Service

Filed under: Windows Azure — Tudor Cret @ 10:34 am

According to tests by Compuware’s CloudSleuth service, Azure was named the fastest cloud service.

In a comparative measure of cloud service providers, Microsoft’s Windows Azure has come out ahead. Azure offered the fastest response times to end users for a standard e-commerce application. But the amount of time that separated the top five public cloud vendors was minuscule.

These are the first results I know of that try to show the ability of various providers to deliver a workload result. The same application was placed in each vendor’s cloud, then banged on by thousands of automated users over the course of 11 months. The top of the ranking board:

  1. Windows Azure
  2. GoGrid
  3. Amazon EC2
  4. Rackspace

The test involved the ability to deliver a Web page filled with catalog-type information consisting of many small images and text details, followed by a second page consisting of a large image and labels. The top five were all within 0.8 second of each other.

The test application is designed to require a multi-step transaction that’s being requested by users from a variety of locations around the world. CloudSleuth launches queries to the application from an agent placed on 150,000 user computers.

The response times were:

  1. Windows Azure (data center outside Chicago) – 10.142 seconds
  2. GoGrid – 10.468 seconds
  3. Amazon EC2 Northern Virginia – 10.942 seconds
  4. Rackspace – 10.999 seconds
  5. Amazon EC2 West (Washington State) – 11.838 seconds
  6. OpSource, Calif. – 12.440 seconds
  7. GoGrid West – 12.604 seconds
  8. Terremark – 12.971 seconds
  9. CloudSigma – 18.079 seconds
  10. Amazon EC2 Europe/Ireland – 18.161 seconds
  11. Windows Azure for Southeast Asia – 27.534 seconds
  12. Amazon EC2 Asia/Pacific Singapore – 30.965 seconds

The response times include all the latencies of the last mile of service as the message moves off the Internet backbone and onto a local network segment. The response times reflect what end users are likely to see “at the edge of the network”.

The results listed:

  • are averages for the month of December, when traffic increased at many providers. Results for October and November were slightly lower, between 9 and 10 seconds.
  • are an average for each vendor, a composite response time compiled from 90,000 browser calls a month to the target application placed in each service provider’s cloud.

The original article is available on informationweek.com here.

April 29, 2011

Windows Azure at MIX 2011

Filed under: Windows Azure — Tudor Cret @ 9:17 am

Microsoft announced several updates to the Windows Azure platform at MIX11. These new capabilities will help developers deploy applications faster, accelerate their application performance and enable access to applications through popular identity providers including Microsoft, Facebook and Google.

New Services and Functionality

  • A preview of the Windows Azure Content Delivery Network (CDN) for Internet Information Services (IIS) Smooth Streaming capabilities, which allows developers to upload IIS Smooth Streaming-encoded video to a Windows Azure Storage account and deliver that video to Silverlight, iOS and Android Honeycomb clients. A CTP of this service will be released by the end of this fiscal year.
  • An update to the Windows Azure SDK that includes a Web Deployment Tool to simplify the migration, management and deployment of IIS Web servers, Web applications and Web sites. This new tool integrates with Visual Studio 2010 and the Web Platform Installer.
  • Updates to the Windows Azure AppFabric Access Control service, which provides a single-sign-on experience to Windows Azure applications by integrating with enterprise directories and Web identities.
  • Release of the Windows Azure AppFabric Caching service in the next 30 days, which will accelerate the performance of Windows Azure and SQL Azure applications.
  • A community technology preview (CTP) of Windows Azure Traffic Manager, a new service that allows Windows Azure customers to more easily balance application performance across multiple geographies.

Windows Azure Platform Offer Changes

Microsoft also announced several offer changes including:

  • The extension of the expiration date and increases to the amount of free storage, storage transactions and data transfers in the Windows Azure Introductory Special offer. This promotional offer now includes 750 hours of extra-small instances and 25 hours of small instances of the Windows Azure service, 20GB of storage, 50K of storage transactions, and 40GB of data transfers provided each month at no charge until September 30, 2011. More information can be found here.
    • An existing customer who signed up for the original Windows Azure Introductory Special offer will get a free upgrade as of today. An existing customer who signed up for a different offer (other than the Windows Azure Introductory Special) would need to sign up for the updated Windows Azure Introductory Special Offer separately.
  • The Cloud Essentials Pack for Microsoft partners now includes 750 hours of extra-small instances and 25 hours of small instances of the Windows Azure service, 20GB of storage and 50GB of data transfers provided each month at no charge. In addition, the Cloud Essentials Pack also contains other Microsoft cloud services including SQL Azure, Windows Azure AppFabric, Microsoft Office 365, Windows Intune and Microsoft Dynamics CRM Online. More information can be found here.

Please read the press release or visit the MIX11 Virtual Press Room to learn more about announcements at MIX11. More information about the Windows Azure AppFabric announcements, can be found on the blog post, "Announcing the Commercial Release of Windows Azure AppFabric Caching and Access Control" on the Windows Azure AppFabric blog.

April 18, 2011

Azure Diagnostics

Filed under: Windows Azure — Tudor Cret @ 2:49 pm

Diagnostics and monitoring for services and applications that don’t run in the cloud don’t require special attention, since you have physical access to the production server – more or less, depending on how you’ve decided to host them. In the cloud, though, diagnostics comes with several challenges:

  • Many instances
  • They move around
  • Massive amount of data
  • Can’t remote desktop in
  • No remote tools (yet)

So Microsoft has implemented a monitoring agent (MonAgentHost.exe) that runs on each instance in the cloud. The agent is started automatically by default. The listener is wired up in app.config/web.config like any other TraceListener. In addition, you need to define a storage account connection string for this listener. In a nutshell, it works in 5 steps:
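Wiring up the listener looks like this in web.config; this is the standard DiagnosticMonitorTraceListener registration shipped with the Azure SDK of that era (the assembly version may differ in your SDK):

```xml
<system.diagnostics>
  <trace>
    <listeners>
      <!-- Route System.Diagnostics.Trace output to the Azure diagnostic monitor -->
      <add name="AzureDiagnostics"
           type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </listeners>
  </trace>
</system.diagnostics>
```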


  1. The role instance starts and, by default, the monitoring agent starts too.
  2. The Diagnostic Monitor starts.
  3. The monitor must be configured at start time, or remotely at any time, using the service configuration file.
  4. The monitor starts to buffer data locally. The user can set a quota, too.
  5. The user initiates a transfer to storage. The transfer can be scheduled or on demand. I recommend a scheduled transfer; this way the diagnostics storage stays up to date with the live local data.


Below is the table with the items that can be monitored in the cloud, whether they are collected by default or not, and the storage destination type:

Data Source                    | Collected by default | Destination
Trace Logs                     | Yes                  | Azure Table
Diagnostic Infrastructure Logs | Yes                  | Azure Table
IIS Logs                       | Yes                  | Azure Blob
Performance Counters           | No                   | Azure Table
Windows Event Logs             | No                   | Azure Table
IIS Failed Request Logs        | No                   | Azure Blob
Crash Dumps                    | No                   | Azure Blob
Arbitrary Files                | No                   | Azure Blob
Implementing Azure Diagnostics:

First, the Diagnostics agent is loaded as an Azure module in ServiceDefinition.csdef:
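A typical import in ServiceDefinition.csdef looks like the following (the service and role names here are illustrative):

```xml
<ServiceDefinition name="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole">
    <Imports>
      <!-- Pulls in the Diagnostics plugin (the monitoring agent) for this role -->
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>
```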


Then the module expects a connection string:
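During development, the setting in ServiceConfiguration.cscfg is usually pointed at the local storage emulator:

```xml
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
         value="UseDevelopmentStorage=true" />
```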


Attention: a production connection string must use https! Like this:

<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;
AccountName=YOURACCOUNT;AccountKey=YOURKEY" />

The common pattern for configuring diagnostics is:

  • Get Config
    • From default
    • Current running
  • Make a change to the config
    • If changed within the instance, it affects only that instance. Don’t forget to start the agent immediately.
    • If changed from outside, for all roles, then:
      • change the central file
      • the agent notices the change and reloads
      • all instances of the role are affected
  • Start the diagnostics agent with the new configuration

The code:

Method 1:

  • I’ve set up a transfer to the storage every one minute
  • Also I’ve added “System” and “Application” transfer from Event Viewer
Code Snippet

//---------------------------------------- method 1 ----------------------------------------//
// get config
TimeSpan transferPeriod = TimeSpan.FromMinutes(1); // scheduled transfer interval
CloudStorageAccount storageAcc = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
RoleInstanceDiagnosticManager ridm = storageAcc.CreateRoleInstanceDiagnosticManager(
    RoleEnvironment.DeploymentId,
    RoleEnvironment.CurrentRoleInstance.Role.Name,
    RoleEnvironment.CurrentRoleInstance.Id);
DiagnosticMonitorConfiguration dmc = ridm.GetCurrentConfiguration();

// change config: transfer logs to storage every minute
dmc.Logs.ScheduledTransferPeriod = transferPeriod;
dmc.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = transferPeriod;
dmc.Directories.ScheduledTransferPeriod = transferPeriod;
dmc.WindowsEventLog.ScheduledTransferPeriod = transferPeriod;

// also transfer the "System" and "Application" Windows event logs
dmc.WindowsEventLog.DataSources.Add("System!*");
dmc.WindowsEventLog.DataSources.Add("Application!*");
dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Information;

ridm.SetCurrentConfiguration(dmc);
//---------------------------------------- end method 1 ----------------------------------------//

Method 2:

  • I’ve loaded the default configuration and I’ve set up a scheduled transfer
Code Snippet

//---------------------------------------- method 2 ----------------------------------------//
// Start up the diagnostic monitor with the given configuration
TimeSpan transferPeriod = TimeSpan.FromMinutes(1); // scheduled transfer interval
DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
dmc.Logs.ScheduledTransferPeriod = transferPeriod;
// Transfer verbose, critical, etc. logs
dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", dmc);
//---------------------------------------- end method 2 ----------------------------------------//

Method 3:

  • I’ve created a custom listener that writes log entries in a dedicated Azure table
Code Snippet

//---------------------------------------- method 3 ----------------------------------------//
CloudStorageAccount storageAcc = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
CloudTableClient tableStorage = storageAcc.CreateCloudTableClient();
tableStorage.CreateTableIfNotExist(TableStorageTraceListener.DIAGNOSTICS_TABLE);
AzureDiagnostics.TableStorageTraceListener listener =
    new AzureDiagnostics.TableStorageTraceListener("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString")
    {
        Name = "TableStorageTraceListener"
    };
System.Diagnostics.Trace.Listeners.Add(listener);
System.Diagnostics.Trace.AutoFlush = true;
Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
//---------------------------------------- end method 3 ----------------------------------------//

Full code available here.

Visualizing the data

You might use Azure Storage Explorer to explore the diagnostics storage account, or Cerebrata’s Azure Diagnostics Manager, a dedicated tool for diagnostics management that displays a friendlier UI than Storage Explorer:


March 27, 2011

Windows Azure @ExperienceWorks(“v.1.0”)–Day 2

Filed under: Uncategorized — Tudor Cret @ 12:32 pm


ExperienceWorks is a partnership program initiated by the MSP group from the Technical University of Cluj-Napoca and the local IT companies BitStar, IQuest, ForTech and Evozon. The goal of the program is to link students, companies and academics through a common element – Microsoft technologies. This is the first edition – v1.0.

I want to thank all the participants; it was a nice experience to share Windows Azure with you. I’ll be waiting until April 10 for your email with the members of your team and the project you intend to build. You’ll have to finish it by May 1–14, when I suppose you’ll deliver it to the faculty as well. Also, don’t forget that we have the Windows Azure licenses – no credit card or payment is required. As usual, I’ve made a summary of day 2. Just look below:

April 2, Windows Azure Day 2 summary:

We dedicated the entire day to hands-on labs. We also discussed cloud computing patterns, scenarios, and applications you can build with Windows Azure for your academic assessment and for ExperienceWorks. Topics we covered:

  • Azure storage: tables, blobs and queues
    • We’ve built and run <WATKroot>\Labs\ExploringWindowsAzureStorageVS2010
  • SQL Azure
  • Cloud Computing Patterns & Practices
    • I’ve tried to share some ideas, scenarios and areas that could give you a starting point for your project. Some of them are:
      • audio/video content processing
      • SaaS solutions like ticketing platforms or ERPs
      • Data backup & restore
      • Social media and online marketing
      • Other domains where fast growth represents a major requirement

Day 1 materials available here.

March 25, 2011

Experience Works

Filed under: Uncategorized — Tudor Cret @ 1:14 pm




What is “ExperienceWorks”?

It’s a partnership program initiated by the MSP group from the Technical University of Cluj-Napoca and the local IT companies BitStar, IQuest, ForTech and Evozon. The goal of the program is to link students, companies and academics through a common element – Microsoft technologies. This is the first edition – v1.0.

Between March 26 and May 14, each Saturday a trainer will hold a lab on a particular technology. For this year we’ll have: Windows Azure, ASP.NET MVC, WP7 and Silverlight. More details about the event here. So, I’ll dedicate the next two Saturdays to sharing my work experience with Windows Azure.

The complete agenda for the Windows Azure lab is:

Day 1 – March, 26

  • Intro to Cloud Computing and Azure
  • Windows Azure Roles
  • SQL Azure
  • Diagnostics and Service Management
  • Solution deployment

Day 2 – April, 2

  • Storage basics
  • Queues
  • Using Azure Tables
  • Using BLOB Storage
  • AppFabric
  • Cloud Computing Patterns & Scenarios

Full event description available here.


December 9, 2010

RONUA TechEd Review Roadshow 2010

Filed under: Others — Tudor Cret @ 9:59 am


For those who don’t know yet, RONUA = Romanian .NET User Association, that is, the community of .NET developers in Romania.

What does RONUA do? – Whatever a community built around a shared technology can do, in this case .NET.

What did RONUA do this autumn? A roadshow: http://newsletter.ronua.ro/tsp/ This time it covered 10 cities: Cluj, Oradea, Arad, Timisoara, Sibiu, Constanta, Galati, Iasi, Brasov, Bucuresti.

What is this roadshow good for? Very simple: you learn HTML5 and CSS3 and hear the latest news about Windows Azure and more. I have almost finished my first demos using HTML5 and CSS3.

And... stay tuned, the spring roadshow is almost certainly next!

May 25, 2010

Why I hate Apple!

Filed under: Uncategorized — Tudor Cret @ 3:52 pm

These days and weeks I happened to need some information about the capabilities of running a web app on the iPhone. Specifically, I need to find out how (and whether) I can capture video and audio from the iPhone’s camera and microphone from a web app. The final purpose is to send the captured A/V stream to a streaming server, which means somehow encoding the captured bytes.

I started to google, with no results. I thought QuickTime was the solution – no resources found. In the end I called developer support in the UK and told them my problem. Their answer: “Yes, it’s possible; if you need to find out how, please enroll in the program dedicated to developers”. That means $99/year. I’m not an Apple developer; I just want to build my app to run on the iPhone.

Posts on the dev forums at Apple? Maybe if I pay Apple. On stackoverflow.com? Of course – but no one had an answer. Moreover, no iPhone simulators are available on Windows. Buy one? No way. I’m waiting for Windows Phone 7.

Conclusion: if you need something from Apple, first of all please sell your house, car, kids and wife, and then come back to Apple. Why can Apple do this? They have no competitors: Android is too weak, Microsoft’s Windows Phone 7 is still in the MS labs, and Nokia & the rest sell just phones. Apple, I hate you!

January 25, 2010

Behind ciripescu.ro, a Windows Azure cloud application (part 1)

Filed under: Technologies — Tudor Cret @ 2:18 pm


January 4th marks an important step towards enabling Microsoft’s customers and partners to build and grow businesses on the Windows Azure platform. Microsoft announced that upgrades from Community Technology Preview (CTP) accounts of the Windows® Azure™ platform (i.e., Windows Azure, SQL Azure and/or Windows Azure platform AppFabric) to paid commercial subscriptions are available – at the moment only for a list of countries, a list in which Romania is not included. The CTP accounts can be used at least until February 1, 2010; after this date the storage accounts will become read-only.

The good news is that we finally don’t have a CTP version anymore. The bad news is that it is no longer free. And it costs. How much? Just check the pricing list here. And before totaling it up... take a pen and paper, or try the TCO and ROI calculator. In our case, for ciripescu.ro – one worker role and one web role – we’ll start from a minimum of $50 per month.

But what services do we get? In three words: availability, durability, scalability. That means:

  • Data from the storage (I’m not speaking about the database, I’m talking about storage...) will always be available. Data is replicated across at least 3 nodes, so no request to the storage should fail. Also, the uptime for my worker and web roles is around 99.99%, because at least 3 machines serve a worker or web role.
  • Data in storage is never lost. As I said, data is replicated over at least 3 nodes; at least one node is available and durable enough to serve the requested data.
  • I can scale my application as easily as clicking a button, using a pay-as-you-go service. Et voilà, I can add a new role to my application. With a click I have a new machine working for me, which means no hardware, no administrative tasks, etc.

Of course, these are not all the benefits. More can be found here.      

The purpose of the current post is to summarize what is behind ciripescu.ro, a simple micro-blogging application developed around the concept of Twitter. The platform can be extended and customized for any domain.

So our intention is to develop a simple one-way communication and content delivery network using Windows Azure. Let’s define some entities:      

  • User – of course we can do nothing without usernames, passwords and emails :) Additional info can be whatever you want, such as an avatar, full name, etc.
  • Cirip – defines an entry of maximum 255 characters, visible to everybody. Users’ content is delivered through this entity
  • Private Message – a private message between two users
  • Invitation – represents an invitation a user can send to invite other users to use the application.

I won’t go into more detail about what the application does; you can watch it in action (at least until the end of January, sadly) at www.ciripescu.ro. In the next sections I will concentrate on how the application does all of this. Configuration and deployment aren’t so interesting, since they are covered in any hello-world Azure application video or tutorial.

High level view

First let’s take a look at the architecture:      

High level design of Ciripescu.ro


The web role is what people traditionally view as a web application. The VS project contains everything a normal ASP.NET web application would: aspx pages, user controls, web services, code-behind (including the business layer or data access layer), etc. In fact, it can be converted to a normal web application with two clicks from the context menu. It may even be an ASP.NET MVC application. All HTTP requests are served by Windows Azure to one of these web roles. I’m saying “one of them” because scaling up such an application involves running more web roles, and possibly more worker roles. If one web role has “enough” requests to process, a new incoming request will be handled by one of the other web roles. With just one click you can have more web roles and more worker roles. Just one click... and a bigger bill at the end of the month. Yes, but what if you have your own datacenter? More web roles and worker roles there mean electricity, hardware and manpower – and hardware is part of the company’s assets...

The worker role can best be compared to a Windows service. It’s an application that runs in the cloud and performs tasks that are not suitable for a web application. Anything that takes too long to process and would unreasonably delay the loading of a page should clearly execute asynchronously and outside the normal IIS pipeline. For instance, ciripescu.ro sends all its emails using such a worker role. Why? Well, like most other social networking sites, ciripescu.ro can import contacts from, for instance, a Yahoo Messenger account and then let the user select which of them he wishes to invite. This means that with one click a user could make ciripescu.ro send out a few hundred emails. Such a task could take from a few minutes up to an hour; obviously the user can’t wait that long for the page to load. Another example: let’s say I have to send an SMS message from my web application, and sending an SMS through an SMS gateway takes more than 10 seconds. Et voilà, I already have the worker; it does the job while the user is free to walk through the rest of the application.

So we can have as web roles any web application (website, ASP.NET MVC application) and as worker roles any class library project (C# or VB). I don’t see a reason why we couldn’t have multiple worker or web roles written in different languages. It would be interesting to see a mix of worker and web roles, each developed on a different platform – PHP, Java, .NET – working together in the same application in the cloud. What would that mean? That Azure is flexible and not dedicated only to .NET? I haven’t yet tested whether an ASP.NET MVC web role can use a Java worker role, but it’s a challenge for the future. In any case, Azure is programmatically flexible: you can easily integrate existing web applications with new ones, and integration with on-premises applications can be done via the Service Bus.
More about the Service Bus and Windows Azure flexibility can be found in the Windows Azure training kit or by watching videos from PDC 2009.
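To make the email example concrete, here is a minimal sketch of a worker role draining a queue of invitations, in the style of the Azure SDK 1.x API of the time. It is not the actual ciripescu.ro code: the "DataConnectionString" setting, the queue name, and the SendInvitationEmail helper are all hypothetical.

```csharp
using System;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class InvitationWorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        // The web role enqueues one message per invitation; the worker drains them here,
        // so sending a few hundred emails never blocks a page load.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("invitations");
        queue.CreateIfNotExist();

        while (true)
        {
            CloudQueueMessage msg = queue.GetMessage();
            if (msg == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5)); // nothing to do, back off
                continue;
            }
            SendInvitationEmail(msg.AsString); // hypothetical helper
            queue.DeleteMessage(msg);          // delete only after a successful send
        }
    }

    private void SendInvitationEmail(string recipient)
    {
        // SMTP call omitted from this sketch
    }
}
```

Because the message is deleted only after the email is sent, a worker that crashes mid-batch simply leaves the message to reappear and be retried – the usual at-least-once pattern with Azure queues.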

More available in part 2, here.

Next Page »

Blog at WordPress.com.