Azure Active Directory, Microsoft Azure accounts, Graph API and multi-tenant application development

Last week I did a session on Azure AD and multi-tenant application development using Azure AD. Azure AD is a big topic, and when we combine it with the other services and account provisioning it becomes huge. In the session I managed to cover the common scenarios in Microsoft Azure account management and application development.

Feeling gloomy and too lazy to write it all up in the blog, I have shared the topics I discussed and the presentation. Please feel free to throw your questions at me in the comments section. The following list contains the topics covered in the session.

  • Provisioning Azure AD
  • How Azure AD relates to Microsoft Azure (and how it doesn't)
  • Accessing Azure AD using PowerShell (a small example follows this list)
  • Directory integration (on-premises AD) with Azure AD
  • Azure AD with Office 365
  • Multi-tenant application development with Azure AD (both federated and non-federated scenarios)
  • Azure AD Graph API and the .NET client libraries
  • Branding Azure AD
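As a small taste of the PowerShell topic, and assuming the Azure AD PowerShell (MSOnline) module is installed, connecting to the directory and listing its users is just:

# Prompts for the directory admin credentials, then connects to Azure AD.
Connect-MsolService

# Lists the users in the directory.
Get-MsolUser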

AWS login best practices – IAM Policies

This post explains the highly recommended best practices for AWS accounts and Identity and Access Management (IAM) policies in the login process.

First, the user account that is used to create the AWS account is the Root Account (the composite credential of email address and password). For example, if I go to the AWS portal and submit the email address thuru@qbe.com and a password along with credit card payment details, this account, or the credential holder, becomes the Root Account.

It is not recommended to use the Root Account for any purpose, even to log in to the AWS portal (unless required in very specific scenarios, such as changing core account details or payment methods).

The best practice is to create an IAM user with administrator privileges. An IAM administrator has all the privileges except the core account privileges mentioned above. You can use the built-in IAM administrator policy template for this.

Follow these steps.

  • First, log in to the portal as the AWS Root User (in the beginning, the Root User is the only one with the privileges to create administrators). The Root User goes to https://console.aws.amazon.com/console/home and enters the email address and password to log in to the portal. Note that this is a global URL for all AWS users all over the world.
  • In the portal, under the Administration & Security section, click on IAM.
  • Go to Users and click Create Users. Name the user (e.g. admin). You can also generate an access key for the user here, for accessing AWS services via the APIs; it's not recommended to use the admin account for development purposes, so it is better not to generate keys for admin. Click Create to create the user.
  • By default, new users do not get any permissions. To assign permissions, click on admin (the user created in the above step will appear in the users grid). Under the Permissions tab, click Attach Policy.
  • The first option is the AdministratorAccess policy. Select it and click the Attach Policy button. (The policy document itself is shown after these steps.)
  • Back on the user's page, under the Security Credentials section, click Manage Password to create a password for admin.
  • There you can auto-generate a password or assign a custom one. You also get the option to force the user to change the password at first login.
  • You also have the option to configure MFA (I will post a separate blog on this topic).
  • Now go back to the IAM dashboard and you will see the IAM user sign-in link section.
  • This is the IAM users' sign-in link. You can click Customize and replace the account number in the URL with an alias (your organization's name, for example). This URL should be used by IAM users, including admin, to access the AWS web console.
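For reference, the AdministratorAccess managed policy attached above essentially boils down to the following policy document: allow every action on every resource (the core account operations mentioned earlier remain root-only regardless).

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}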


The Root user account and the global sign-in URL should be left untouched unless there is a specific requirement.

The contribution of cloud computing to Agile

I can be pretty sure that almost every time we hear the word Agile, our minds relate it to the Agile software development process rather than to the English word agile. Even Google thinks so. True enough, the meaning of the English word agile is the key to why the so-called process is named Agile.

The reason I gave such an introduction to Agile is to bring out how much popularity the process has gained over time. There are different ways to implement Agile; I don't know any of them properly by the rules, but my understanding is that the core of Agile is iterative thinking in an incremental delivery mode. That's the key; the rest is how you do it.

Thinking about current software delivery, the Agile process and how it evolved from the much-blamed waterfall model, I felt a little happy about myself for knowing some old-school stuff. I think I was lucky enough to work with computers with huge keyboards that sounded like a shutting clam, and green monochrome screens. They used to run DOS 6.2. I have written programs in GW-BASIC and FoxPro and used 5 1/4-inch floppy disks.

Software used to be developed and delivered totally differently in those days. An ISV had to write the software and ship it on some physical medium (floppy disks or optical discs), mostly with a serial key for licensing purposes. We couldn't think of iterative delivery in that model. A huge, complex piece of software would have ended up as hundreds of CDs delivered to the client every two weeks, probably requiring a delivery service like DHL or FedEx.

So the delivery and development practices were forced to stay locked inside the boundary of the waterfall model, because frequent deliveries were mostly impossible due to technology limitations. And in those days most software was written for desktop computers.

With time, the industry evolved and cloud computing became the heart and soul of IT. Software development practices started to change, and most development now targets the cloud.

Cloud not only facilitates different licensing models and changes how organizations manage their resources; it has also changed the entire software development process. It brought the trends of continuous delivery, online build automation, continuous integration, cloud source control and many more features which are core to iterative development and Agile methodologies.

Without those tools and technical processes we could not think of implementing Agile in modern software development. Cloud facilitates modern Agile software development and DevOps.

Each and every line of change is reflected to the customers in near real time with full automation. Iterative development is fueled by fast feedback loops, and to gain faster feedback loops continuous delivery is vital. Cloud computing facilitates this phenomenon.

[Diagram: the continuous developer and operations feedback loop]

The developer and operations workflow is seamless with cloud computing. Platforms like Microsoft Azure provide an end-to-end DevOps workflow with tools like Visual Studio Online, Azure Web Apps and Application Insights, which maps exactly to the above diagram.

Cloud is not simply a platform; it's the trend setter.

Project Oxford – Behind the scenes of how-old.net

http://www.how-old.net has been trending recently on social media. You simply upload a picture to this website and it detects the faces in the photo and tells you the gender and age of the person each face belongs to.

The site uses the Face API behind the scenes, which is available here: http://gallery.azureml.net/MachineLearningAPI/b0b2598aa46c4f44a08af8891e415cc7

You can try it by subscribing to the service. It's offered through Microsoft Azure, and you need an Azure subscription to subscribe to it. Currently it is free, and the subscription allows 20 transactions per minute.


Once you are done with the purchase, the Face API is available, like any other service, in the Azure Marketplace section.


In the management section you can get the key for the API; the Face API is managed by Azure API Management (read more about Azure API Management here).

The Face API team also provides a sample WPF application, with a portable client library as a wrapper around their REST service.

Getting Started with the Face API .NET client SDK

A simple face detection method would be very similar to this:

// Inside an async method; requires the Microsoft.ProjectOxford.Face client library (NuGet).
var faceClient = new FaceServiceClient("<your subscription key>");

// Detect faces in the image stream. The boolean flags request face landmarks,
// age, gender and head pose respectively; here we ask for age and gender only.
var faces = await faceClient.DetectAsync(fileStream, false, true, true, false);

var collection = new List<MyFaceModel>();

foreach (var face in faces)
{
    collection.Add(new MyFaceModel()
    {
        FaceId = face.FaceId.ToString(),
        Gender = face.Attributes.Gender,
        Age = face.Attributes.Age
    });
}

// MyFaceModel is a simple DTO holding the values we care about:
public class MyFaceModel
{
    public string FaceId { get; set; }
    public string Gender { get; set; }
    public double Age { get; set; }
}

A direct JSON output would be like the following (you can test it here – http://www.projectoxford.ai/demo/face).

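For a single detected face, the response looks roughly like this (the values are illustrative, and the exact field set depends on the attributes you request):

[
  {
    "faceId": "c5c24a82-6845-4031-9d5d-978df9175426",
    "faceRectangle": { "width": 78, "height": 78, "left": 394, "top": 54 },
    "attributes": { "gender": "male", "age": 71.0 }
  }
]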

Face detection is made immensely easy by this research project. 🙂 Happy face detection.

The library has loads of other features, like face matching, grouping, highlighting and more.

Azure Blob Storage Data Security

The question of security in cloud computing is one of the most argued, discussed and debated topics. It's also one of the most common questions I get in my sessions.

Whatever is said and done, I personally believe that security, whether in the cloud or not, has two major characteristics.

  1. Security is a policy-based feature
  2. Security is a shared responsibility

Ok, leaving the arguments about security aside, let's jump into Azure Blob storage data security.

What are the concerns of data security in Azure Blob storage?

  1. How secure are the Azure data centers?
  2. How is data in transit secured?
  3. Does Microsoft look at or access my data stored in Azure Blob storage?
  4. What policies should I follow in order to store data securely in Azure Blob storage?

Let's discuss each point one by one.

How secure are the Azure data centers?

If you're concerned that the Azure data centers aren't secure enough and that a physical security breach could easily happen, then you aren't going to use Azure in the first place. There are tons of articles and videos on the Internet explaining the data center security features of Microsoft Azure.

How is data in transit secured?

All communication with Azure, and between different Azure data centers, happens over HTTPS, so it is secured via TLS. Some services (the blob service included) can also be accessed over plain HTTP, but it is not recommended to allow this unless specifically required.

Does Microsoft look at or access my data stored in Azure Blob storage?

The direct answer from Microsoft is no, and to my knowledge that is indeed true. But as per the agreement, Microsoft might collect telemetry data about your blob storage for billing, usage-pattern and service-improvement purposes.

What policies should I follow in order to store data securely in Azure Blob storage?

All the corporate policies that you apply on premises, plus any additional policies your organization has defined for storing data outside the corporate firewall, should be applied when using Azure Blob storage. But this is unique and different for each company.

With all that information, why do you need to encrypt data at rest with any public cloud provider?

This is a question I ask customers when they request me to encrypt their data. Of course there are several reasons. Mostly it is due to corporate policies which state that any data at rest outside the corporate firewall should be encrypted; there are also corporate policies that require the data to stay within a specific political or geographic boundary, which is a separate concern when architecting applications and selecting the appropriate redundancy options.

Other than the above reasons, some customers have said that they want to protect the data from the public cloud service provider itself and prevent them from reading it. This is acceptable to some extent, but again it raises plenty of other questions, such as how and where to keep the encryption keys and how to secure them.

Encrypting data at rest in Azure Blob Storage

Azure Blob storage does not provide any built-in mechanism to encrypt data at rest; we have to handle this in our application.

First, the choice of encryption scheme is very important. Since we're talking about encrypting data at rest, symmetric encryption serves the purpose well; we do not need to bring in asymmetric encryption and overcomplicate the situation.

Second, where to keep the keys. In most application designs, keeping the keys in a database, in either encrypted or unencrypted form, is a common practice. If you're encrypting your keys, then you need a master key. You can design your own key management policy for your application, or use Azure Key Vault (still in preview) for key management. One idea I've been researching is using an algorithm as the master key: one which can derive master keys for given parameters and handle master key expiration policies, saving us from the problems of persisting keys. (I will write a separate post about this.)

Third, we should know when and how to apply the encryption. As a best practice, always encrypt and decrypt the memory streams of your blobs, so that the data is encrypted before it hits the disk.

In a web application, encryption and decryption happen on the web server. The following diagram explains a scenario with a web app and a blob storage account, and how encryption/decryption happens for the data at rest.

[Diagram: encryption and decryption of blob data at the web server]

Azure Encryption Extensions is a great library for this (https://github.com/stefangordon/azure-encryption-extensions).

NuGet – Install-Package AzureEncryptionExtensions

Performing the encryption and decryption on the stream protects the data before it hits the disk; a sketch of the idea follows.
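This is a minimal sketch of the idea, assuming the WindowsAzure.Storage SDK and plain .NET AES; key and IV management is deliberately left out (they would come from your key store), and this is not the repo sample, just the shape of it.

using System.IO;
using System.Security.Cryptography;
using Microsoft.WindowsAzure.Storage.Blob;

public static class EncryptedBlobWriter
{
    // Encrypts the content stream in memory and uploads only the ciphertext,
    // so unencrypted data never reaches the storage disk.
    public static void UploadEncrypted(CloudBlockBlob blob, Stream content, byte[] key, byte[] iv)
    {
        using (var aes = Aes.Create())
        using (var encryptor = aes.CreateEncryptor(key, iv))
        using (var cipherStream = new MemoryStream())
        using (var cryptoStream = new CryptoStream(cipherStream, encryptor, CryptoStreamMode.Write))
        {
            content.CopyTo(cryptoStream);
            cryptoStream.FlushFinalBlock();

            cipherStream.Position = 0;
            blob.UploadFromStream(cipherStream); // only ciphertext leaves the process
        }
    }
}

Decryption on download is the mirror image: download the blob to a stream and read it through a CryptoStream created with aes.CreateDecryptor.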

You can access sample code, with a Web API service for uploading and downloading content to Azure Blob storage with symmetric encryption, from my git repo:

https://github.com/thuru/azurestorageencryptionsample

Which Azure Cache offering to choose?

The above is one of the burning questions among Azure devs, and with all the cache offerings from Microsoft Azure, along with their sub-categories, the confusion over what to choose multiplies.

Currently there are three types of cache services available in Azure (and probably not for much longer):

  • Azure Redis Cache
  • Managed Cache Service
  • In-Role Cache (Role based Cache)

Ok, now let me answer the question directly (especially if you're too lazy to read the rest of the post): for any new development, Redis Cache is the recommended option.

https://msdn.microsoft.com/en-us/library/azure/dn766201.aspx

So what is the purpose of the other two cache offerings?

I blogged about the Managed Cache Service and Role-based Cache some time back (I highly recommend reading that article here before continuing). The diagram below has the summary.

[Diagram: summary of the Azure cache offerings]

Read this blog post to learn how to create and use Role-based Cache and the Azure Managed Cache Service.

Pricing, future existence and other details

Role-based Cache:

Since Role-based Cache is technically a web/worker role, regardless of whether it is co-located or dedicated, it is a cloud service by nature. You create it as a cloud service in Visual Studio and deploy it to Cloud Services, and you can see and manage these roles under the Cloud Services section in the portal. Cloud service pricing applies, based on the role size. Role-based cache templates are still available in Azure SDK 2.5 and you can create them, but this is not recommended, and future versions of the Azure SDK might not include the Visual Studio project template for Role-based Cache.

Azure Managed Cache Service:

The blog post shows how to create the Managed Cache in the Azure management portal and how to develop applications against it using C#.NET. But if you try to create the Managed Cache Service now, you will not find the option in the Azure management portal, because it has been removed (at the time of writing that blog it was available). The reason it was removed is obvious: Microsoft recommends Redis Cache as the alternative. Apps which use the Managed Cache Service will continue to function properly, but it is highly recommended to migrate to Redis Cache. Creating a Managed Cache is still possible through Azure PowerShell. I'm not discussing the pricing of the Azure Managed Cache Service since it has been discontinued. I have a personal feeling that the Managed Cache Service will soon be eliminated from Azure services; Microsoft might be waiting for that last customer to move away from it 😛

Azure Redis Cache:

This is the newly available cache option based on Redis (using the Redis on Windows port). The link below has information about usage, pricing and other details of Azure Redis Cache.

http://azure.microsoft.com/en-us/services/cache/
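To give a feel for the recommended option, here is a minimal connection sketch assuming the StackExchange.Redis client; the cache name and access key are placeholders for the values from your cache in the portal.

using StackExchange.Redis;

// Connect to the cache over SSL using the endpoint and access key from the portal.
var connection = ConnectionMultiplexer.Connect(
    "yourcache.redis.cache.windows.net:6380,ssl=true,password=<access key>");

// Get a handle to the default database and use simple key/value operations.
IDatabase cache = connection.GetDatabase();
cache.StringSet("greeting", "hello from Azure Redis Cache");
string greeting = cache.StringGet("greeting");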

HttpResponseMessage vs IHttpActionResult

IHttpActionResult was introduced in Web API 2. Read this post, which explains the Web API 2 response types and the benefits of IHttpActionResult.

Assuming you've read the above article: yes, it is recommended to use IHttpActionResult.

Apart from the benefits of clean code and unit testing, the main design argument for using IHttpActionResult is the single responsibility principle, stating that actions have the responsibility of serving HTTP requests and should not be involved in creating the HTTP response messages. The argument makes sense; but keeping it aside, if we look at the implementation of IHttpActionResult, it is its ExecuteAsync method that creates the HttpResponseMessage object.

But overall it is new, easy to unit test and a recommended practice. I personally prefer IHttpActionResult for the clean code and the ability to write neat unit tests.

Still, HttpResponseMessage provides more control over the HTTP response message sent across the wire. Do we have that control with IHttpActionResult, especially when the HTTP response message creation is hidden from us?

Yes, you can get full control, because, as the above article mentions, the ExecuteAsync method is called in the pipeline when constructing the HTTP response. So the solution is simple: we write a custom type which implements the IHttpActionResult interface and provides the logic for creating the HttpResponseMessage object (a sketch follows below).

This GitHub repo has the code for a generic type which implements IHttpActionResult, which you can use or extend; in the sample I have shown how to implement caching in the response header.

The main class is CacheableHttpActionResult<T>.
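As an illustration, a trimmed-down version of the idea would look like the sketch below; the class name CacheableResult<T> and its constructor parameters here are illustrative, and the repo's CacheableHttpActionResult<T> is the complete version.

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Http;

// Wraps a payload and emits a Cache-Control header along with the response.
public class CacheableResult<T> : IHttpActionResult
{
    private readonly HttpRequestMessage _request;
    private readonly T _content;
    private readonly TimeSpan _maxAge;

    public CacheableResult(HttpRequestMessage request, T content, TimeSpan maxAge)
    {
        _request = request;
        _content = content;
        _maxAge = maxAge;
    }

    // ExecuteAsync is where we get full control over the HttpResponseMessage.
    public Task<HttpResponseMessage> ExecuteAsync(CancellationToken cancellationToken)
    {
        var response = _request.CreateResponse(HttpStatusCode.OK, _content);
        response.Headers.CacheControl = new CacheControlHeaderValue
        {
            Public = true,
            MaxAge = _maxAge
        };
        return Task.FromResult(response);
    }
}

An action can then simply return new CacheableResult<Product>(Request, product, TimeSpan.FromMinutes(5)) (Product being whatever type your action returns), and the caching concern stays out of the controller logic.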

Deleting all unused Azure Resource Groups

Execute the PowerShell script at the end of this post to delete all the unused, empty Resource Groups from Azure.

Resource Groups help us to logically group Azure resources together and monitor them. In the new Azure preview portal you can view your Resource Groups. Whenever you create a resource in Azure, it's created under a Resource Group.

The new preview portal allows us to create Resource Groups and visualize them. But the pain point is that Resource Groups are not deleted even after you delete all the resources attached to them. The idea behind this is that you can reuse them without creating them from scratch.

But sometimes this is annoying, because most of the Resource Groups listed in the new Azure portal do not have any resources; most of them were created automatically when provisioning a resource in the old portal.

I had a huge list of empty Resource Groups, so I created this PowerShell script to delete them all.
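Here is the script: a sketch written against the ARM-mode cmdlets available at the time of writing (Switch-AzureMode, Get-AzureResourceGroup, Get-AzureResource, Remove-AzureResourceGroup), so verify the cmdlet names against your Azure PowerShell version, as they change across releases.

# Switch Azure PowerShell to Resource Manager mode.
Switch-AzureMode AzureResourceManager

# Remove every resource group that contains no resources.
foreach ($group in Get-AzureResourceGroup)
{
    $resources = Get-AzureResource -ResourceGroupName $group.ResourceGroupName

    if (-not $resources)
    {
        Remove-AzureResourceGroup -Name $group.ResourceGroupName -Force
    }
}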


Read more about Azure Resource Groups here.