Automatically Configure Azure AD in your Web Applications using VS 2015

Some of you might have noticed that Visual Studio 2015 offers a ‘Configure Azure AD Authentication’ option in the project context menu.

When you click this, a wizard opens and does most of the heavy lifting of configuring Azure AD for your application. The wizard is quite rich and also reasonably flexible. In this post I walk you through the most direct way of configuring Azure AD authentication in your application and explain the code generated by the wizard. Read this article to know more about Azure AD authentication scenarios.

First you need an Azure AD domain in which to register your application, and the user performing this action should be in the Global Administrator role for that Azure AD.

Assuming that you have the prerequisites to configure the authentication, click the above option.

Step 0


Step 1

  • Enter your Azure AD domain
  • Choose Create a new Azure AD application using the selected domain.
  • If you already have an application configured in your Azure AD, you can instead enter the client ID and the redirect URI of that application.


Step 2

  • Tick the Read directory data option if the application should read directory data from Azure AD
  • The directory properties you read can be used as claims in your application
  • Click Finish


This will take a few minutes and configure Azure AD authentication for your application. It also adds the [Authorize] attribute to your controllers.

Now if you log into the Azure portal and open the Applications section of the Azure AD, you will notice the newly provisioned application.

Note that the wizard provisions the app as a single-tenant application; you can later change it to a multi-tenant application if you want.

Configuring single-tenant versus multi-tenant applications is beyond the scope of this post.

Code

The wizard retrieves the information about the app it created, stores it in the configuration file and writes code to access it. The main part is Startup.Auth.cs in the App_Start folder of the web project.

The image below shows the code generated by the wizard on a web project that had no authentication configured before.

  • The first line creates the data context object of the provisioned EF model. It is not mandatory if you are not accessing any data from the database; you can simply delete it.
  • The second line sets the DefaultSignInAsAuthenticationType property – this is passed to Azure AD during authentication.
  • The third line sets up cookie authentication for the web application.
  • The fourth line looks big and confusing, but it really isn’t: it is a common OWIN method, commonly used in OpenID Connect authentication.
  • Here we specify the options for the Azure AD OpenID Connect authentication and the callbacks.
  • Since the above code is a bit unclear, I have broken it down in a simpler way, as sketched below.
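For reference, here is a minimal sketch of what the wizard-generated Startup.Auth.cs typically looks like. The exact class, property and Web.config key names can differ in your project; the clientId and authority values below are placeholders.

using System.Threading.Tasks;
using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.OpenIdConnect;
using Owin;

public partial class Startup
{
    // Values the wizard stores in Web.config (placeholders here)
    private static string clientId = "<client id from Web.config>";
    private static string authority = "https://login.microsoftonline.com/<your AAD domain>";

    public void ConfigureAuth(IAppBuilder app)
    {
        // 1. Data context of the provisioned EF model (safe to remove if unused; name may differ)
        var db = new ApplicationDbContext();

        // 2. Default sign-in authentication type, passed to Azure AD
        app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);

        // 3. Cookie authentication for the web application
        app.UseCookieAuthentication(new CookieAuthenticationOptions());

        // 4. OpenID Connect authentication against Azure AD, with callbacks
        app.UseOpenIdConnectAuthentication(
            new OpenIdConnectAuthenticationOptions
            {
                ClientId = clientId,
                Authority = authority,
                Notifications = new OpenIdConnectAuthenticationNotifications
                {
                    AuthorizationCodeReceived = context =>
                    {
                        // the authorization code is available here; the wizard-generated
                        // code goes further and redeems it for a token
                        var code = context.Code;
                        return Task.FromResult(0);
                    }
                }
            });
    }
}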

I have simplified the code to the point of retrieving the authorization code. The wizard-generated code goes to the next level and retrieves the authentication token as well. I haven’t included this because I wanted to keep the logic as simple as possible. After retrieving the authorization code you can request a token and perform actions; this is a separate topic and is part of the Azure AD authentication workflow.

I will discuss Azure AD authentication workflow and how to customize applications in a separate post.


SQL Azure Database best practices and TDD

This post summarizes the good practices you should follow when developing applications on SQL Azure and how to arrange TDD or unit tests around those practices.

Before you continue reading this post, I assume that you are familiar with SQL Azure databases, Entity Framework and TDD.

One of the highly regarded and strongly recommended cloud development strategies is resilient development, meaning that the application should expect failures and handle them. This is more than catching exceptions and informing users; it includes retry logic, trying alternatives, or any other way to mitigate the failure and reduce user frustration.

Resilient Development in SQL Azure

When accessing a SQL Azure database we should implement resilience mechanisms. EF provides an execution strategy interface for plugging in custom retry logic on transient errors. You can read more about execution strategies here.

There’s a dedicated execution strategy implementation named SqlAzureExecutionStrategy specifically for SQL Azure databases.

Before continuing, let us discuss why we need this execution strategy and why it is highly recommended when using SQL Azure databases. You can list many transient failures in accessing databases, but the following two are the main culprits.

  • SQL Azure is a database as a service. Applications talk to SQL Azure through the Internet and there is a high probability of communication breakage.
  • SQL Azure has the concept of the DTU (Database Throughput Unit), a number assigned to a database which directly maps to its CPU, memory and IO usage. You can say that a database with a higher DTU has more throughput. Since the DTU is a limiting factor, when a database reaches its maximum allocated DTU usage it starts to throttle requests, which can produce timeout exceptions.

Implementing the execution strategy comes with limitations: mainly, we cannot use our own transactions while a retrying execution strategy is active. Needless to say, we should implement it in a way that lets us control it.

This link describes the limitations and provides the pattern for implementing a controllable execution strategy.
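As a rough illustration of that pattern, here is a minimal sketch assuming EF6; the class name AppDbConfiguration is my own. The SQL Azure strategy is registered through a DbConfiguration and can be suspended when the code needs to run its own transactions.

using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.Entity.SqlServer;
using System.Runtime.Remoting.Messaging;

public class AppDbConfiguration : DbConfiguration
{
    public AppDbConfiguration()
    {
        // Use SqlAzureExecutionStrategy by default, but fall back to the
        // non-retrying default strategy when it is suspended (e.g. inside tests).
        SetExecutionStrategy("System.Data.SqlClient",
            () => SuspendExecutionStrategy
                ? (IDbExecutionStrategy)new DefaultExecutionStrategy()
                : new SqlAzureExecutionStrategy());
    }

    // Flag stored in the logical call context so it flows with the current execution
    public static bool SuspendExecutionStrategy
    {
        get { return (bool?)CallContext.LogicalGetData("SuspendExecutionStrategy") ?? false; }
        set { CallContext.LogicalSetData("SuspendExecutionStrategy", value); }
    }
}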

Limitation on TDD / Unit Tests

Before proceeding, do not let the above sub-heading mislead you into concluding that TDD is the same as unit testing; they are different approaches.

Having read the above link, we now know the importance of implementing an execution strategy and its limitations. In our unit tests we wrap operations in transactions because we can roll them back. This is essential when running tests against production databases: we roll back the changes made by the tests and leave the database unchanged after test execution, regardless of whether the tests passed or not.

This is a great article which provides details on how to implement such a test base class.
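A minimal sketch of such a test base class follows, assuming MSTest, EF6 and the AppDbConfiguration shown earlier; the class and context names are illustrative, not taken from the linked article.

using System.Data.Entity;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public abstract class RollbackTestBase
{
    protected MyDbContext Context;           // your EF context type (illustrative name)
    private DbContextTransaction transaction;

    [TestInitialize]
    public void SetUp()
    {
        // Suspend the retrying strategy so we can use an explicit transaction
        AppDbConfiguration.SuspendExecutionStrategy = true;

        Context = new MyDbContext();
        transaction = Context.Database.BeginTransaction();
    }

    [TestCleanup]
    public void TearDown()
    {
        // Roll back everything the test changed and restore the retry strategy
        transaction.Rollback();
        transaction.Dispose();
        Context.Dispose();

        AppDbConfiguration.SuspendExecutionStrategy = false;
    }
}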

Adaptive Bitrate (ABR) Playback in HTML 5 and Azure Media Services

I highly recommend reading the following posts before continuing with this one.

Once your media is ready to be streamed, we need some mechanism for ABR playback in the browser without any plugins, because the HTML 5 video tag does not perform ABR; it just performs progressive download. (This is explained in the first link above.)

MPEG-DASH is the protocol / standard for ABR streaming playback in HTML 5.

YouTube adopted this in January 2015.

Creating a DASH player is a custom implementation on top of the MSE (Media Source Extensions) standard defined by the W3C.

Dash.js (https://github.com/Dash-Industry-Forum/dash.js) is an open source project which helps in building a DASH player. The source contains samples ranging from simple streaming to complex scenarios like captioning and ad insertion.

Azure Media Player (AMP)

Azure Media Player is an implementation of a DASH player adhering to MSE and EME (Encrypted Media Extensions). AMP also provides a fallback mechanism to play the media using Flash / Silverlight when DASH is not supported by the browser. Its limitation is that it will only play back media streamed from Azure Media Services, but for the intended purpose it serves very well.

Read the AMP documentation here: http://amp.azure.net/libs/amp/latest/docs/

You can check an online AMP sample here: http://amsplayer.azurewebsites.net/azuremediaplayer.html

A simple implementation of AMP using DASH without any fallback mechanisms would look very similar to the code snippet below.
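A sketch of such a page follows. The stylesheet and script URLs follow the AMP documentation linked above; the manifest URL is a placeholder for your own streaming endpoint, and the tech name in techOrder should be verified against the docs.

<link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet" />
<script src="//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>

<!-- techOrder limited to the MSE-based tech, so no Flash / Silverlight fallback -->
<video id="azuremediaplayer" class="azuremediaplayer amp-default-skin"
       controls width="640" height="400"
       data-setup='{"techOrder": ["azureHtml5JS"], "nativeControlsForTouch": false}'>
    <source src="http://<your streaming endpoint>/<locator id>/minions.ism/manifest(format=mpd-time-csf)"
            type="application/dash+xml" />
</video>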

Delivering media content using Azure Media Services – Part 2

Preparing the video for the HTTP Adaptive Streaming using AMS

I assume you have read Part 1 of this series and have an understanding of HTTP adaptive streaming and why we need to support it in HTML 5.

HTTP adaptive streaming allows browsers to request video fragments in varying qualities based on the available bandwidth. In order to serve video fragments in varying qualities, the media should be encoded in different qualities and kept ready to be streamed. For example, if you have a video in 4K quality, you have to encode it in other qualities such as 1080p, 720p and 480p. This eventually produces individual files in different qualities.

Note: Typically, upscale encoding does not produce quality videos. If your original video is in 2K quality, encoding it to lower qualities will produce good output, but encoding it up to 4K will not produce clear 4K most of the time.

So, in the process of encoding the video files in different qualities, the first step is uploading the source file. This is known as ingestion in the media workflow.

Assuming that you have AMS up and running in your Azure account, the following code snippet uploads your video file to AMS. When creating the AMS account you have to create a new storage account, or associate an existing one, with it. This storage account is used to store the source files and the encoding output files (these files are known as assets).

First, install the AMS NuGet packages (the Media Services SDK and its extensions).

var mediaServicesAccountName = "<your AMS account name>";
var mediaServicesKey = "<AMS key>";

var credentials = new MediaServicesCredentials(mediaServicesAccountName, mediaServicesKey);
var context = new CloudMediaContext(credentials);

var ingestAsset = context.Assets.CreateFromFile("minions.mp4", AssetCreationOptions.None);

After the upload we should encode the file with one of the available HTTP adaptive streaming encoding presets. The encoding job will produce files in different qualities for the same source. At the time of this writing, AMS supports 1080p as the maximum ABR encoding quality.

The following lines of code submit the encoding job to the AMS encoder. You can scale the number of encoding units and their type based on your demand, like any other service in Azure. When the encoding is completed, AMS puts the output files in the storage account under the specified name (adaptive minions MP4).

var job = context.Jobs.CreateWithSingleTask(
    MediaProcessorNames.AzureMediaEncoder,
    MediaEncoderTaskPresetStrings.H264AdaptiveBitrateMP4Set1080p,
    ingestAsset,
    "adaptive minions MP4",
    AssetCreationOptions.None);

job.Submit();

job = job.StartExecutionProgressTask(p =>
{
    Console.WriteLine(p.GetOverallProgress());
},
CancellationToken.None).Result;

Once the job is finished, we can retrieve the encoded asset, apply policies to it and create locators. Locators can carry conditions on access permissions and availability. When we create locators, the assets get published.

var encodedAsset = job.OutputMediaAssets.First();

var policy = context.AssetDeliveryPolicies.Create(
    "Minion Policy",
    AssetDeliveryPolicyType.NoDynamicEncryption,
    AssetDeliveryProtocol.Dash,
    null);

encodedAsset.DeliveryPolicies.Add(policy);

context.Locators.Create(LocatorType.OnDemandOrigin, encodedAsset, AccessPermissions.Read, TimeSpan.FromDays(7));

var dashUri = encodedAsset.GetMpegDashUri();

We can retrieve the URLs for different streaming protocols like HLS, Smooth Streaming and DASH. Since our discussion focuses on delivering HTTP adaptive streaming over HTML 5, we will construct the DASH URI for the asset.

This is the manifest URI, which should be used in HTTP adaptive streaming playback.
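As an illustration (the account name and locator id are placeholders), the DASH manifest URI returned by GetMpegDashUri typically has this shape:

http://<your account>.streaming.mediaservices.windows.net/<locator id>/minions.ism/manifest(format=mpd-time-csf)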

In Part 1 of this series, I explained what the HTML 5 video tag does and that the browser on its own does not know how to request varying fragments based on varying bandwidth. In order to do this we need a DASH-capable player in the browser.

There are several DASH player implementations; we will be using Azure Media Player for playback since it falls back to other technologies if DASH isn’t supported.

Note: AMS should be configured with at least one streaming endpoint in order to enable HTTP adaptive streaming.

Azure Active Directory (AAD): Common Application Development Scenarios

Azure Active Directory is a cloud identity management solution, but it is not limited to cloud identity alone. In this post let’s discuss how AAD can be used in designing multi-tenant applications in the cloud. As usual, consider MassRover to be an ISV.

MassRover had the great idea of developing a document management application named ‘Minion Docs‘ for enterprises, and they initially developed it as an intranet application using Windows Authentication. VTech was the first customer of Minion Docs, and MassRover installed the application on premises (in the VTech data centers).

After a while VTech reported that their users want to access it from outside the organization, in a secure way and from different devices, and VTech also mentioned that they are planning to move to Azure.

MassRover decided to move the application to the cloud in order to retain its customers; they also realized that moving to the cloud would open the opportunity to cater to multiple clients and introduce new business models.

Creating Multi-Tenant Applications

The intranet story I described is very common in enterprises.

Most organizations have the burning requirement of handling modern application demands like mobility, Single Sign-On and BYOD without compromising their existing infrastructure and investments.

The MassRover team decided to move the application to Azure in order to solve those problems and leverage the benefits of the cloud.

First, MassRover got an Azure subscription and registered Minion Docs as a multi-tenant application in their AAD. For an existing intranet application this requires minimal rewriting and mostly configuration.

The setup window below is the out-of-the-box ASP.NET Azure Active Directory multi-tenant template you see in Visual Studio.

Registering an application in AAD as a multi-tenant application allows other AAD administrators to sign up and start using our application. Given that Minion Docs is an AAD application, there are two primary ways VTech can use it.

  • Sync the local AD to AAD along with passwords – users get single sign-on with their Active Directory credentials even though there’s no live connection between the local AD and AAD.
  • Federate the authentication to the local AD – users use the same Active Directory credentials, but the authentication takes place in the local AD.

The only significant difference between the two methods is where the authentication takes place: in AAD or in the federated local AD.

Local AD synced with Azure Active Directory with passwords

VTech IT decides to sync their local AD with their AAD along with the passwords. The VTech AD administrator then signs up for Minion Docs and grants it the requested permissions (read / write).

What happens here?

  • MassRover created and registered Minion Docs as a multi-tenant Azure Active Directory application in their Azure Active Directory.
  • VTech has their local AD, the domain controller that was used by the Minion Docs intranet application.
  • VTech purchases an Azure subscription and syncs their local AD to their Azure Active Directory along with the passwords.
  • The VTech Azure Active Directory admin signs up for the Minion Docs application; during this process the admin also grants permissions to Minion Docs.
  • After the sign-up, Minion Docs is displayed under the ‘Applications my company uses’ category in VTech’s AAD.
  • Now a user named ‘tom’ can sign in to the Minion Docs application with the same local AD credentials.

Sign in Work Flow


A few things to note

  • Minion Docs is not involved in the authentication process.
  • Minion Docs gets the AAD token based on the permission granted to Minion Docs application by the VTech AAD admin.
  • Minion Docs can request additional claims about the user using the token, and if they are allowed, it will get them.
  • Authorization within the application is handled by Minion Docs.

Local AD is federated using ADFS

This is the second use case, where the local AD is synced with AAD but VTech decides to federate the local Active Directory. In order to do this, VTech should first enable and configure ADFS. ADFS doesn’t allow any claims to be retrieved by default, so the VTech admin should configure the claim rules as well.

Federated Sign in Work Flow

In the federated scenario the authentication happens in the local AD.

As an ISV, MassRover does not need to care whether the authentication happens in the customer’s AAD or in their federated local AD; Minion Docs works the same either way.

Tenant Customization

Anyone with the right AAD admin rights can sign up for Minion Docs, but this is not the desired behavior.

A better approach would be to notify the Minion Docs administrators with the tenant details during the first sign-up so they can decide whether to approve the tenant; this could also be automated in subscription scenarios.

As a simple example, consider Voodo, another customer who wants to use Minion Docs. The Voodo admin signs up, and before Voodo is added as an approved tenant in the Minion Docs database they have to complete a payment. Once the payment is done, Voodo is added to the database. This is very simple and easy to implement.
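A minimal sketch of how such tenant filtering could be wired into the OWIN OpenID Connect pipeline follows; the approvedTenants store is hypothetical, and only the tenant-id claim type is standard.

using System.IdentityModel.Tokens;
using System.Threading.Tasks;
using Microsoft.Owin.Security.OpenIdConnect;

// inside OpenIdConnectAuthenticationOptions
Notifications = new OpenIdConnectAuthenticationNotifications
{
    SecurityTokenValidated = context =>
    {
        // Tenant (directory) id of the signed-in user's organization
        string tenantId = context.AuthenticationTicket.Identity
            .FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;

        // approvedTenants is a hypothetical store of tenants that completed sign-up / payment
        if (!approvedTenants.Contains(tenantId))
        {
            throw new SecurityTokenValidationException("Tenant is not approved to use Minion Docs.");
        }

        return Task.FromResult(0);
    }
};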

Delivering media content using Azure Media Services – Part 1

Introduction

Media is a big business on the Internet, and it is the most bandwidth-consuming content on the web. This post gives you detailed information about delivering media over the Internet, the concepts and technologies associated with it, and how Azure Media Services (AMS) can be used to deliver media.

Before talking about what AMS is and how to use it to deliver media content over HTTP, let me explain some media basics (the discussion here is about video, but the concepts apply to audio as well).

Video formats are just containers for media content. A container packages the video content, audio content and other extras like the poster, headers and much more. There are many video container formats available; *.mp4, *.ogv, *.flv and *.webm are a few well-known ones.

Codecs are used to encode the media content stored in those containers. The term codec describes both the encoding and the decoding. So for any playback the player should know two things.

  • How to open / read the specific media container.
  • How to decode the media content using the codec that was used to encode it.

H.264, Theora and VP8 are a few well-known video codecs.

I highly recommend this post in order to get more information about media containers and codecs.

The real world experience

Talking about video creation, it is very easy for any of us to create a video, even in 4K quality, using a smartphone. We record videos on our devices, edit them and share them.

From that point onwards, what happens to that content and how it is handled by sites like YouTube is the core of this discussion.

Before the introduction of the HTML 5 video tag, we had to install a third-party plugin (mostly the Flash Player) in order to view videos. This is because, before HTML 5, browsers did not know how to perform the two tasks mentioned above to play a video. (They didn’t know how to open the container and they didn’t know how to decode the content.)

When delivering media to a highly diversified audience (different operating systems, different browsers and different devices) we should concentrate on the format (container and codec) of the media. Prior to HTML 5, we had no choice other than installing plugins in our browsers.

These plugins were smart enough to do the following main tasks.

  • They could read and decode the media
  • They could detect bandwidth variations and request the media fragments in the right quality, which is known as adaptive streaming / adaptive bitrate (ABR) streaming

The second task is very important; it is what makes streaming a pleasant experience even in low-bandwidth conditions. I’m pretty sure everyone has experienced this on YouTube. (Don’t tell me that you haven’t used YouTube.)

In order to provide this experience, not only do the client players (plugins) need special capabilities, the streaming servers should also serve the video encoded in different qualities.

HTML 5

HTML 5 made most browser plugins obsolete. It has the video tag, which can play video: browsers no longer need a third-party plugin to play back video / audio. Put your video file in a static directory in your web application and open it using the <video> tag, and things work perfectly well.

This will work perfectly fine as long as your browser supports *.mp4 / H.264 (all modern browsers do support this).
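For reference, the HTML snippet referred to below would be a plain video tag along these lines (a minimal sketch; the file name is a placeholder and autoplay is deliberately left off, as the following text assumes):

<!-- progressive download playback of a static mp4 file, no autoplay -->
<video controls width="640" height="360">
    <source src="minions.mp4" type="video/mp4" />
    Your browser does not support the HTML 5 video tag.
</video>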

This is how the browser handles the request.

You can notice that, before hitting play, there are a few requests to the media content in order to render the initial frame. The main content request is made after hitting play (as autoplay is not enabled in the above HTML snippet) and it stays in the Pending state because the browser has made the request to the HTTP server (localhost in this case) and the response is not yet complete.

But your browser starts to play the video before the response is complete, because browsers are capable of rendering the content they already have. When you seek on the progress bar you have to wait until the content is loaded from the start up to that point, because your browser just sent an HTTP request for the video file and is simply downloading it; the browser does not know how to request content based on varying bandwidth or how to request from the middle of the file. It is just a start-to-end download. This is known as progressive download.

Progressive download can give us a streaming-like experience, though technically it is not streaming; it works because browsers know how to render / play back whatever content is already available to them.

HTML 5 video / audio tags do exactly this: they render / play back the content.

In this case we do not have a client that can detect varying bandwidth, and we cannot request video chunks (known as fragments) in different qualities since we have only one video file. With progressive download you get uninterrupted video playback only if your bandwidth is higher than the bitrate of the video.

The above diagram shows the flow. It is very similar to how you deliver any other static content like images and scripts.

So it is obvious that with static content delivery we do not get ABR, which is very important in the modern world of multiple devices and varying bandwidth.

So we require a technology in HTML 5 that can perform ABR streaming. This requires changes on both the client and the streaming side. Part 2 of this series explains how to do HTTP adaptive streaming and ABR playback in HTML 5 without plugins.

Azure Key Vault Manager

Azure Key Vault is now generally available. If you use Azure Key Vault in your projects, there’s a high probability that you have felt the need for a handy dev tool to manage your vault.

Here it is. http://keyvaultmanager.azurewebsites.net

GitHub: https://github.com/thuru/AzureKeyVaultManager

More about Azure Key Vault: https://thuruinhttp.wordpress.com/2015/05/30/azure-key-vault-setup-and-usage-scenarios/

Redis on Azure : Architecting applications on Redis and its data types

Redis is the recommended option among the Azure PaaS caching techniques (read why). Azure still provides and supports other caching options too (what are the other available options).

Redis cache is available in Azure. View the Azure Redis Documentation and Pricing details.

When talking about caching, we often think of a key-value pair store. But Redis is not just a key-value store; it has its own data types and supports transactional operations.

Having a clear idea about the Redis cache and Redis data types will certainly help us design applications in a more Redis-oriented way.

Clone this git repository to get an idea of the Redis data types: https://github.com/thuru/azure-redis-cache

The application is a crude sample of posting messages. Users can post messages and like posts. The most recent posts and the most liked posts are shown in different grids, and users can also tag posts.

Key-value pairs, lists (capped lists), Redis functions, hashes and sets are used in the sample.
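To give a feel for these data types, here is a hedged sketch (not the repository’s actual code; the connection string and key names are placeholders) of how such a posting scenario maps onto Redis using StackExchange.Redis:

using StackExchange.Redis;

var connection = ConnectionMultiplexer.Connect(
    "<your cache>.redis.cache.windows.net:6380,password=<key>,ssl=true");
var db = connection.GetDatabase();

// Hash: a post and its fields
db.HashSet("post:42", new[] { new HashEntry("title", "Hello Redis"), new HashEntry("author", "thuru") });

// Capped list: keep only the 100 most recent post ids
db.ListLeftPush("posts:recent", 42);
db.ListTrim("posts:recent", 0, 99);

// Sorted set: rank posts by like count
db.SortedSetIncrement("posts:mostliked", 42, 1);

// Set: tag a post
db.SetAdd("tag:azure", 42);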

Read more on Redis data types

I will update the source with more samples and information. Apart from the Redis features themselves, in the Azure portal we get comprehensive dashboards about cache usage, cache hits, cache misses, reads, writes, memory usage and much more.