How the different DNAs of Amazon, Microsoft and Google influence their Cloud Platforms.

Disclaimer: This is an opinionated post. The views and opinions are solely based on my own experience and observations.

AAG – AWS, Azure & GCP

AWS, Azure and GCP come from Amazon, Microsoft, and Google respectively. These companies have different roots, values, strengths and weaknesses. Each of them has a different DNA, which influences its cloud services in diverse ways. This article offers a different perspective on how the DNA of each of these organizations has been shaping its cloud platform.

AWS – The Retail DNA

AWS has the retail DNA from the roots of Amazon's e-commerce business culture. A few notable traits of the retail DNA are shipping fast to be first, focusing more on volume than margins, and packaging goods under one's own brand, known as private labeling.

Amazon launched AWS in 2006. Early adopters and open-source folks went ahead with AWS; this includes many of today's successful startups that were coming up during 2008-2013. Although Microsoft launched Azure in 2010, it was less mature at the time, and Microsoft did not have a good rapport with open-source communities back then. Being first to market without serious competition, AWS took full advantage of that period.

AWS follows a continuous innovation cycle and keeps releasing new services even when those services are less popular or useful only to a small set of customers. AWS does this to be first in the market, without worrying about the bottom line.

Another interesting trait of the retail DNA is private labeling. Private labeling is a business technique used by retail players to package common goods from suppliers under their own labels, with some value additions. AWS uses this technique very cleverly. AWS has an inherent weakness: it has no established software or operating systems of its own. That does not play well for AWS when it comes to cloud lock-in or giving customers generous discounts on software licenses. Using private labeling, however, AWS has successfully battled this challenge by creating its own services: Aurora, a private label of MySQL/PostgreSQL, is one example, and Redshift is another.

Azure – The Modern Enterprise DNA

Azure has the DNA of a modern enterprise. The modern enterprise DNA combines old traits, like bottom-line focus, a partner ecosystem and fluency in corporate lingo, with modern traits such as innovation, openness, and a platform strategy.

Azure is no laggard when it comes to innovation; it has its own share of innovative services, with more focus on developer productivity and enterprise adoption. Azure Active Directory, Azure Cosmos DB, Azure Functions and Azure Lighthouse are a few of these enterprise-focused innovations.

Generally, Azure targets its innovations at stable markets where it anticipates greater adoption; it does not invest much in niche market areas just to appear cool. This may come from the traditional bottom-line-focused business orientation. Because of this trait, we sometimes see Azure terminate services at the preview stage without ever releasing them to general availability, favoring stable, high-reach, bottom-line-friendly innovation over sheer diversity of the service portfolio.

Having a rich partner ecosystem is another key strength of Microsoft. This has given Microsoft an unbeatable position in the hybrid cloud market with its Azure Stack suite. Azure Stack is a portfolio of products that extends Azure capabilities to any environment. It has three products: Azure Stack Edge, Azure Stack HCI and Azure Stack Hub. In other words, Azure Stack is Azure in different versions, loaded on different hardware and bundled for customers with different hybrid cloud demands. Only Microsoft can pull this off, because of its long-standing partner ecosystem and OEM partner network.

GCP – Internet Services DNA

GCP has the DNA of an Internet services company; no surprise, as it comes from Google. Google leads in Internet-based consumer services, and we all use Google services in our day-to-day lives. The Internet services DNA prioritizes individual services over a whole platform, and B2C over B2B.

GCP is the third largest cloud provider by revenue, but the gap between GCP and Azure is big. GCP also faces serious competition from Alibaba Cloud.

GCP has all the required foundational building blocks of a modern cloud, but it lacks the rich portfolio of services that AWS and Azure have. GCP tries to sell the same thing under different packaging; one example is its API management service, which is listed both as 'New Business Channels using APIs' and as 'Unlocking Legacy Applications using APIs'. Those are two use cases of the same product, not two different services. Some may argue this is an approach to attract customers with two different needs, but other cloud providers do not play the same trick in their product lists.

Google is a successful Internet services company, and it should have been the leader in cloud computing. Ironically, that did not happen, because Google did not believe in enterprise business. It was so focused on Internet-based services and advertising revenue that individual users mattered more than big businesses. By the time Google realized that big corporations are the big customers of the cloud business, it was a bit too late, and it had to bring in leadership from outside to instill that thinking.

Google's Internet services DNA has made GCP fragmented; the perception of GCP as one solid platform is largely missing. Most of us use GCP services without paying much attention to the platform as a whole. We use Google Maps in applications, Firebase has become a necessity for mobile development, and we use Google search APIs, but we see them as individual services, not as a single cloud platform. Single-platform thinking is essential to win enterprise customers, and the lack of that perception is a major downside for GCP.

However, it is not all bad for GCP; against these odds, Google seems happy with what it is doing. Revenue is trending upward, and GCP recently won a few notable enterprise customers.

Build your SaaS right with Azure

Cloud has a proven promise of great opportunity and agility for ISVs. Modern cloud platforms have low entry barriers and a huge array of service offerings beyond traditional enterprise application requirements. Cloud platforms give SaaS applications a rich environment, with cutting-edge services, intelligence as a service, continuous integration and continuous delivery, and compute and storage that scale for global reach.

The current digitized environment, device proliferation and the span of intelligent cloud services give the best mix of social, technical and business conditions for SaaS products to emerge and prevail.

Cloud gives every SaaS player an equal opportunity. Technical and business-domain expertise are the vital elements for success in the SaaS playground: knowing the business and knowing the technology are the two most important things.

From a SaaS consumer's point of view, a customer has ample choices among SaaS providers. Having the right mix of features, availability, security and business model is important, and choosing the right tools at the right time at the right cost is the skill to master.

Figure 1: What customers expect from SaaS providers.

Source: Frost & Sullivan, 2017

In order to deliver a successful SaaS application, ISVs should have attributes such as concrete DevOps practices to deliver features and fixes seamlessly, a responsible SaaS adoption model that accounts for administration and shadow IT, trustworthy handling of data privacy and encryption, promising service uptime, and many more.

DevOps with Azure Tooling

Azure tooling brings agile development practices and continuous integration and continuous delivery. Code changes take immediate effect in the build pipeline through VSTS build definitions and are deployed to the respective environments in Azure.

Figure 2: The simple DevOps model with Azure tooling


Environment and resource provisioning is handled via automated ARM template deployments from the VSTS build and release pipelines. The model depicted in Figure 2 varies with the context and complexity of the project, with multiple environments, workflows and services.

Centralized Administration and Shadow IT

Customers are concerned with how a SaaS product supports centralized organizational access management. SaaS providers, on the other hand, want a frictionless adoption path that onboards as many users as possible.

Azure-based organizational SaaS implementations often utilize Azure Active Directory (AAD) based integration and Single Sign-On (SSO).
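As a rough illustration, here is a minimal sketch of wiring up AAD sign-in with OpenID Connect in an ASP.NET Core app (using the Microsoft.AspNetCore.Authentication.OpenIdConnect package). The authority and client id are placeholders you would take from your own AAD app registration, and a real multi-tenant SaaS would also need proper issuer validation:

    // Minimal sketch of AAD single sign-on via OpenID Connect; placeholders marked.
    using Microsoft.AspNetCore.Authentication.Cookies;
    using Microsoft.AspNetCore.Authentication.OpenIdConnect;

    var builder = WebApplication.CreateBuilder(args);

    builder.Services.AddAuthentication(options =>
    {
        options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
        options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
    })
    .AddCookie()
    .AddOpenIdConnect(options =>
    {
        // "common" allows sign-in from any AAD tenant (multi-tenant SaaS);
        // a single-tenant app would use its own tenant id here.
        options.Authority = "https://login.microsoftonline.com/common";
        options.ClientId = "<client-id-from-your-app-registration>"; // placeholder
        options.ResponseType = "code";
    });
    builder.Services.AddAuthorization();

    var app = builder.Build();
    app.UseAuthentication();
    app.UseAuthorization();
    app.MapGet("/", (HttpContext ctx) => $"Hello {ctx.User.Identity?.Name}")
       .RequireAuthorization();
    app.Run();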

Data Security and Encryption

Customers trust SaaS providers with their data. It is the most valuable asset a SaaS provider takes responsibility for in delivering value and helping the customer's business. Data security and encryption are prime concerns, and the surrounding regulatory and compliance requirements are complex and evolving fast.

Azure has strong compliance support and a good set of tools and services for data protection. It offers many out-of-the-box data encryption and protection features, such as Transparent Data Encryption (TDE), Dynamic Data Masking (DDM), Row-Level Security (RLS) and built-in blob encryption.

In certain cases the built-in security features do not provide sufficient protection and compliance. In such sensitive environments we can leverage additional Azure services that provide a higher degree of data security.

Figure 3: Advanced data security implementation in Azure


Azure Key Vault based encryption with SQL Database Always Encrypted, blob encryption (envelope encryption), AAD-based access control and MFA can be implemented in such cases. This also enables Bring Your Own Key (BYOK) models, where customers provide and manage their own keys.
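To give a flavor of how Always Encrypted looks from application code, here is a minimal sketch. The table and column names are made up, and it assumes the column master key lives in Azure Key Vault and the Key Vault provider has already been registered with the Microsoft.Data.SqlClient driver:

    using System;
    using Microsoft.Data.SqlClient;

    // The "Column Encryption Setting=Enabled" keyword tells the driver to
    // transparently encrypt parameters and decrypt protected columns.
    var connStr = "<your-connection-string>;Column Encryption Setting=Enabled;";

    using var conn = new SqlConnection(connStr);
    conn.Open();

    using var cmd = new SqlCommand(
        "SELECT CustomerName, Ssn FROM dbo.Customers", conn); // hypothetical table
    using var reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        // Ssn is stored encrypted at rest; the driver decrypts it client-side,
        // so the plaintext never appears on the server.
        Console.WriteLine($"{reader.GetString(0)}: {reader.GetString(1)}");
    }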

Uptime

Service uptime should be considered not only during unexpected failures but also during planned updates.

Azure provides built-in geo-replication for databases, storage and certain other services. Application-tier redundancy is implemented with Traffic Manager. Configuring geo-replication and redundancy introduces its own concerns, such as geographic data regulations, synchronization issues and performance.

Azure tools such as Application Insights for application monitoring and telemetry, auto-scaling, geo-replication, Traffic Manager and many others combine with architectural practices to deliver the required uptime for a SaaS application.

Conclusion

Apart from the technologies and tools, SaaS application development on a cloud platform requires expertise on the platform of choice in order to achieve cost effectiveness, business agility and innovation.

How a SaaS application is bundled and sold drives crucial decisions in technology strategy, such as cache priming, tenant isolation, security aspects, centralized security and multi-tenancy at different service layers.

This article provides a high-level view of what customers look for from SaaS providers and how Azure tools and services can help in achieving it.

 

 

Contribution of cloud computing to Agile

I can be pretty sure that almost every time we hear the word Agile, our minds relate it to the Agile software development process rather than to the English word agile. Even Google thinks so. True enough, the semantics of the English word agile is the key to the name of the so-called process.


The reason I gave such an introduction to Agile is to bring out how much popularity the process has gained over time. There are different ways to implement Agile, and I do not claim to know all of them by the rules, but my understanding is that the core of Agile is iterative thinking in an incremental delivery mode. That is the key; the rest is how you do it.

Thinking about current software delivery, the Agile process and how it evolved from the much-blamed waterfall model, I felt a little happy about knowing some old-school stuff. I was lucky enough to work with computers with huge keyboards that sounded like shutting clams and green monochrome screens. They used to run the so-called DOS 6.2. I have written programs in GW-BASIC and FoxPro and used 5 1/4 inch floppy disks.

Software used to be developed and delivered totally differently in those days. An ISV had to write the software and ship it on physical media (floppy disks or optical discs), mostly with a serial key for licensing purposes. We could not think of iterative delivery in that model. A huge, complex piece of software would have ended up as hundreds of CDs delivered to the client every two weeks, probably requiring a delivery service like DHL or FedEx.

So development and delivery practices were forced to stay locked inside the boundary of the waterfall model, because frequent deliveries were mostly impossible due to technology limitations. And in those days most software was written for desktop computers.

With time, the industry evolved and cloud computing became the heart and soul of IT. Software development practices started to change, and most development now happens for the cloud.

Cloud not only facilitates different licensing models and changes how organizations manage their resources; it has also changed the entire software development process. It brought continuous delivery, online build automation, continuous integration, cloud-hosted source control and many more capabilities that are core to iterative development and Agile methodologies.

Without those tools and technical processes we could not think of implementing Agile in modern software development. Cloud facilitates modern Agile software development and DevOps.

Each and every line of change is reflected to customers in near real time with end-to-end automation. Iterative development is fueled by fast feedback loops, and continuous delivery is vital to getting those faster feedback loops. Cloud computing facilitates this.


The developer and operations workflow is seamless with cloud computing. Platforms like Microsoft Azure provide an end-to-end DevOps workflow with tools like Visual Studio Online, Azure Web Apps and Application Insights, which map exactly to this feedback loop.

Cloud is not simply a platform; it is the trend setter.

How to create custom NuGet packages

NuGet provides an easy and very efficient way to distribute packages. A NuGet package can contain assemblies, content files and other tools you want to distribute. A NuGet package is described by a manifest file known as a nuspec. View the nuspec reference.

You can use three subfolders in the NuGet package structure: \lib, \content and \tools. \lib stores the assemblies; \content stores scripts, images, style sheets, etc.; and \tools contains PowerShell scripts that mostly handle package events (installation and uninstallation of the package).

The \content folder also contains the transformation files that apply changes to files such as web.config and app.config. NuGet packages can also be used to insert code and create code files.

You can create NuGet packages using either the visual designer or the command line. In order to use the command line, we need NuGet.exe, which can be downloaded from this link.

Open CMD and go to the NuGet.exe path (or add the NuGet.exe path to the PATH environment variable so you can access it from anywhere).

Type the following command to make sure that you're running the latest version of NuGet.exe:

nuget update -self

Then we have to create the nuspec for the package. Create the three folders mentioned above and copy the assemblies into \lib. You can have subfolders inside those folders; for example, \content\images stores images, and when your package is referenced a folder named 'images' is created in the project.
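For example, a package source layout could look like this (the file names are only illustrative):

    mypackage\
        lib\MyLibrary.dll
        content\images\logo.png
        tools\install.ps1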

After setting up the file and folder structure, run the following command to create the nuspec file.

nuget spec

This will create the nuspec manifest file, with tokens for parameters like author, owner, description and URLs. Open the nuspec file and fill in those details. A sample nuspec file looks like this:

    <?xml version="1.0"?>
    <package>
      <metadata>
        <id>Package</id>
        <version>1.0.0</version>
        <authors>Thuru</authors>
        <owners>Thuru</owners>
        <licenseUrl>https://thuruinhttp.wordpress.com</licenseUrl>
        <projectUrl>https://thuruinhttp.wordpress.com</projectUrl>
        <iconUrl>http://ICON_URL_HERE_OR_DELETE_THIS_LINE</iconUrl>
        <requireLicenseAcceptance>false</requireLicenseAcceptance>
        <description>Package description</description>
        <releaseNotes>Summary of changes made in this release of the package.</releaseNotes>
        <copyright>Copyright 2014</copyright>
        <tags>Tag1 Tag2</tags>
        <dependencies>
        </dependencies>
      </metadata>
    </package>

Once the nuspec file is created (say it is named mypackage.nuspec), you can create the NuGet package. Run the command below:

nuget pack mypackage.nuspec

This will create the NuGet package. You may get errors and warnings based on some of the values in the nuspec.
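You can also override values at pack time with options of the nuget pack command; for example (the version number and output folder here are just illustrative):

nuget pack mypackage.nuspec -Version 1.0.1 -OutputDirectory dist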

This is the folder structure I used. (Note: I copied NuGet.exe into my working folder because I did not set the environment variable.)


A few good-to-know features of Entity Framework

I have been reading about advanced EF features for the last couple of weeks and got to know some interesting things. This post is a compilation of that information, with links to the relevant sites.

Starting with EF 4.1, ObjectContext was replaced with DbContext as the default API. So what's the difference between ObjectContext and DbContext?

This link gives the simplest answer to that question. And using the following code you can get the ObjectContext from your DbContext. Say you have a model class named MyEntities: the best practice is to create a partial class (partial classes are the safest option, since the auto-generated classes can be overwritten whenever code regeneration happens) and add a property that returns the ObjectContext. The code below describes the idea:

    using System.Data.Entity;                 // DbContext
    using System.Data.Entity.Infrastructure;  // IObjectContextAdapter

    public partial class MyEntities : DbContext
    {
        public System.Data.Entity.Core.Objects.ObjectContext ObjectContextProperty
        {
            get
            {
                // Every DbContext implements IObjectContextAdapter,
                // which exposes the underlying ObjectContext.
                return ((IObjectContextAdapter)this).ObjectContext;
            }
        }
    }
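Using it is then a one-liner; for example:

    using (var db = new MyEntities())
    {
        var objectContext = db.ObjectContextProperty; // drop down to ObjectContext APIs when needed
    }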

 

Compiled Queries

Compiled queries are helpful when we want to squeeze the final drop of performance out of an application. The idea is to cache the T-SQL statement produced from a LINQ to Entities query. Since we write queries in LINQ, they first get translated into T-SQL statements before being executed against SQL Server; by caching the T-SQL of a LINQ query we cut out the translation time. Microsoft has mentioned this provides around a 7% performance improvement. Details here.

This article provides information on creating compiled queries.

MSDN: CompiledQuery class

The reason I have not included any samples here is that we do not need to worry much about query compilation in EF 5.0 and above, because it happens automatically.

This MSDN blog post explains the automatic query compilation in EF 5.0 and above. But if you want more advanced, granular control over compilation, you should consider implementing your own compiled queries; it is better to keep them in your data repository classes.
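For that granular-control case, here is a minimal sketch of a hand-compiled query. Note that CompiledQuery.Compile works against an ObjectContext-derived context (not a DbContext directly), and the MyObjectContext type, Customers set and CountryId property here are hypothetical:

    using System;
    using System.Linq;
    using System.Data.Entity.Core.Objects; // CompiledQuery (System.Data.Objects in EF 4.x)

    public static class CustomerQueries
    {
        // Compiled once; later calls reuse the cached LINQ-to-Entities translation.
        public static readonly Func<MyObjectContext, int, IQueryable<Customer>> ByCountry =
            CompiledQuery.Compile((MyObjectContext ctx, int countryId) =>
                ctx.Customers.Where(c => c.CountryId == countryId));
    }

    // Usage: var customers = CustomerQueries.ByCountry(context, 94).ToList();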

Performance

Read this Data Access Blog article about EF 5.0 performance; the chart there gives a good summary.

Updating to EF 5.0 on .NET 4.5 gives a significant performance boost to LINQ to Entities, and automatic query compilation is one of the main reasons for that gain.

EF versions: http://msdn.microsoft.com/en-us/data/jj574253

This Channel 9 video is a good one about EF 5 and EF 6: http://channel9.msdn.com/Shows/Visual-Studio-Toolbox/Entity-Framework-5-and-6

Microsoft doesn't embrace its own products

I have noticed a few things that make me feel MS doesn't embrace its own products sometimes. For example, when MS launched Windows Phone 7 and 7.5 there were massive marketing campaigns about the phone.

But on the Live/Hotmail (now Outlook) login page, the iPhone sat in the middle as the highlighted smartphone that supports Hotmail/Live. Then thankfully someone noticed it and changed it; later a Windows Phone was in the middle.

Yesterday I came across a big, frustrating problem when dealing with Windows Azure websites. MS has been doing a really great job with Azure, and Azure Websites deployment provides plenty of options for hosting websites. We can pull the website files from various sources, and Dropbox is one of them. But there is no option to pull a folder from SkyDrive to Azure Websites, even though Dropbox is supported.

I can't believe this. Really confused.


After a few minutes I saw a tweet about the same issue.

I really don't know what's going on, but I think this is the real problem in MS now: there's no communication, and everyone does something on their own. MS is not a company that became big yesterday, though. They should have, and I hope they do have, processes for integration and streamlined communication between products. I wonder whether they lack those processes or someone has forgotten them along the way.

Microsoft’s confuzzled.

You might wonder what this confuzzled is. It's a word I heard in a movie or a TV series (I don't remember exactly where), a combination of confused and puzzled. Cool, isn't it? But it perfectly suits the current situation of Silverlight.

Silverlight is a good platform; no one can deny that. MS released it as an RIA plugin for the web (initially to battle Adobe Flash and JavaFX). Then Microsoft realized HTML5 is great and started promising an open web.

They started campaigning that Silverlight is not meant for the web. At that time Mango was the hype, so Silverlight got a safe shelter under the Mango tree.

Microsoft also announced LightSwitch, which is based on Silverlight (using the out-of-browser feature).

LightSwitch has gotten its own attention from some ISVs and has been growing, but it is not very popular in the broader market. Though it is a pleasing platform to develop applications on, deploying a LightSwitch app is a little hard.

But now Silverlight has a very serious problem: it does not know where to go and where exactly to fit in, because of the new boss, WinRT. Just kidding.

We all know that Windows 8 apps run on top of WinRT, and Windows Phone 8 is also a WinRT platform. No more Silverlight on Windows Phone.

Poor Silverlight; it lost one of its major safe houses.

So as of now, LightSwitch is the only guaranteed place for Silverlight to rest in peace.

But you might wonder: we still use XAML in WinRT, so how come we lose Silverlight? XAML is a powerful UI standard created for WPF, then ported to Silverlight, and now used in WinRT.

So you can still use the XAML skills you have gained in layout, user interaction, data binding and so on in WinRT, but Silverlight itself is no longer there. That's it.

So what about the existing WP7 (Mango) apps written in Silverlight? MS promised an automatic recompile of the apps in the marketplace for WinRT.

And since XAML is there, Silverlight devs need not panic much, because they can still use XAML and patterns like the command model and converter model in WinRT. (I think dependency properties are still there; I am not so sure.)

Silverlight also has the promise of BI 2.0, but again in a very limited scope; in SQL Server 2012, Power View has some sort of a Silverlight touch.

So, calculating all the formulas, I think Silverlight 5 will be the last version.

Everything about Windows Processes

This is a great website which has almost all the information we need about the processes running on the Windows operating system.

http://www.neuber.com/taskmanager/process/index.html

From time to time we find ourselves staring at Task Manager, wondering what a process is for. Is it a Windows process, a third-party one, or a virus? The above link gives you all the data about Windows processes.

Here I have listed some of them (a summarized version from the above site) which look suspicious but are not dangerous.

igfxsrvc.exe

Igfxsrvc.exe is installed alongside Intel graphics accelerator cards and on-board graphics chipsets. It is the common user interface; it starts with the operating system and occupies a slot in the system tray. It provides the graphical interface for all the display settings of the chipset, including color quality and screen resolution. http://www.neuber.com/taskmanager/process/igfxsrvc.exe.html

csrss.exe

This is the user-mode portion of the Win32 subsystem; win32k.sys is the kernel-mode portion. Csrss stands for Client/Server Run-Time Subsystem, and it is an essential subsystem that must be running at all times. Csrss is responsible for console windows, creating and/or deleting threads, and implementing some portions of the 16-bit virtual MS-DOS environment.

hkcmd.exe

"hkcmd.exe" is Intel’s "extreme" graphics hot key interceptor. If you never use the Intel hotkeys. You can turn off the hot keys and close this process.

dwm.exe

One of the new features in Windows Vista/7 is the Desktop Window Manager (DWM). It is responsible for graphical effects such as live window previews and the glass-like frame around windows (Aero Glass), without draining your CPU.

rundll32.exe

This program is part of Windows and is used to run program code in DLL files as if it were within the actual program. However, many viruses also use this name or similar ones, and the file is also commonly used by spyware to launch its own malicious code. http://www.neuber.com/taskmanager/process/rundll32.exe.html