Twitter V2: Twitter with Elon Musk

It is all over the Internet that Elon Musk is on a mission to buy Twitter, and the Internet is going crazy about it. Part of that is simply because Elon is doing it; anything Elon is involved in gets amplified on the Internet for some reason. I thought about what would happen if Elon succeeds in this deal and how it would change Twitter.

I'm not a fanatic follower of Elon. If I have to name one thing I like about him, it is his ability to gather great engineering teams and relentlessly push them to achieve great things. I have listed here a few things I would love to see in Twitter, hoping Elon would do them.

Inherent authenticity should be strengthened with Web3 features

Twitter started as a micro-blogging platform but quickly earned a reputation for legitimacy. When someone tweets, it is often treated as a statement on record. Tweets cannot be edited, and this characteristic gives Twitter a strong essence of authenticity. Twitter can augment this with Web3 technologies. Web3 is interpreted in many ways, but the decentralized and verifiable characteristics are the focus here. Tweets can be stored on, or backed by, a verifiable cryptographic platform, which would establish authenticity beyond a single entity (Twitter itself) controlling it.

Bots should be regulated

Twitter is infested with bots. Getting rid of all the bots is not easy, and it also has implications for some existing platforms and business models. Having the right checks and balances and regulating the bots is the more realistic path to success. Validating the purpose of a bot, giving bots limited reach and letting them earn trust to expand that reach, flagging bots distinctly, and identifying and validating the real entities behind them are a few of the many ways to regulate bots.

New Business Models

Twitter has weak revenue compared to other social media platforms. Major social media platforms are focused on the content economy. Twitter does not have a content-based economic model, so it is almost impossible for Twitter to earn substantial revenue from content advertisements.

But Twitter is known for its legitimacy, so Twitter should validate the entities behind each and every Twitter account. This is feasible considering the advancements in AI. If you feel I'm being too dreamy, consider that Uber already does this with a high success rate. When accounts are verified, several opportunities open up.

  1. Twitter becomes the only verified IAM system at such a scale on the Internet, verified and established with Web3 technologies. (Panic button ON for FB and Apple ID.) When an identity is verified, many services can leverage it.
  2. Combining this globally verifiable IAM model with metaverse elements, such as verified metaverse beings, opens many other business models in the metaverse. A digital twin of a verified person can hold a verified presence in the metaverse.
  3. Subscription charges for certain Twitter accounts. But this should be a small flat fee, to keep the power balanced, and it should not be coupled with any promotional benefits.
  4. Twitter is already testing a few Web3 features. Each tweet from a verified account (as discussed above, all accounts can be verified), linked to a decentralized platform, automatically becomes an NFT.

Putting it all together: say you have a Twitter account with your username and password. First, Twitter would let you verify yourself by submitting the required information. Then you become a verified user (this does not mean you have to be a celebrity). Once verified, Twitter can issue you a verifiable Decentralized Identifier (DID). These identities would become platform-neutral (hopefully this can be achieved with the liberal and innovative culture Elon would bring in), decentralized identities for users. Users could then navigate across other digital platforms, including many metaverse options, with these verified decentralized identities. This would enable users to create an authentic digital presence. Digital service providers would benefit from linking authentic digital beings with real humans, enriching their services. At the heart of this, users would have full control over their identities and could decide whether or not to trust their digital interactions. If these verifiable decentralized identities become a reality, then even future elections could use Twitter-backed (but not Twitter-controlled) decentralized identities.
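Just to make 'DID' a little more concrete, here is roughly what a W3C-style DID document looks like. This is purely illustrative; the did:example method and the key below are placeholders in the spirit of the spec's examples, not anything Twitter has announced.

    # A minimal, illustrative DID document (W3C DID Core style).
    # Nothing here is a real Twitter feature; the identifier and key are placeholders.
    did_document = {
        "@context": "https://www.w3.org/ns/did/v1",
        "id": "did:example:123456789abcdefghi",
        "verificationMethod": [{
            "id": "did:example:123456789abcdefghi#key-1",
            "type": "Ed25519VerificationKey2020",
            "controller": "did:example:123456789abcdefghi",
            "publicKeyMultibase": "z6Mk...placeholder-key...",
        }],
        "authentication": ["did:example:123456789abcdefghi#key-1"],
    }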

Twitter was once a revolutionary and very forward-thinking company. Vine and Periscope were good examples of this. However, they are also good examples of how such forward-thinking ideas can end up in the garbage without strong visionary leadership. Elon has a great talent for building great engineering teams. Personally, I like Twitter very much and would love to see its future with Elon.

Making of the Aventude Calendar 2020

Ok, this is another post about Aventude, but this time it's not about a customer case; it's about the calendar we produced for the year 2020. When the idea was first proposed, it felt a little unconventional, because technology companies do not usually create calendars; they would rather produce a notebook, or nothing at all.

But some of us felt we should do it for some fun and creativity, and needless to say I was one of them. Using animals was the first idea, and everyone agreed, so there was not much noise in deciding the theme of the calendar. The challenge was that we needed to map technological concepts to nature. We wanted to take a different approach there; we wanted to present the technology in a more explanatory way, rather than using buzzwords directly.

Also, we wanted to check the quality of the print, images, fonts etc. with one sample. So, without spending much time, we took one sentence from our corporate slide deck, 'Speed & Quality at Scale'. The cheetah came in, and we needed a cheetah that conveyed both quality and hunting speed. We reviewed the morphing method eight times, because it had to be as natural as possible without any hard finishes, and the colour fade should not disturb the calendar view. For the last part, we wanted to bring a tech feel to the photos, with a neural-connection kind of mesh. The mesh should be based on triangles, and three shades of blue were needed to create a beautiful pattern; we reviewed many combinations of blue on screen and in print. At this point the full concept had not even started, but we had reviewed the execution viability more than twenty times.

The first draft took around two months to reach a final cut. I did not want to spend much time and money without seeing it in print, because if the print is not right, the whole effort is wasted. The first sample was good and fine to go ahead.

Now we needed 12 concepts. All of us fell in love with the cheetah; it was amazing on the big screen. So we decided to keep him, which meant we needed 11 more. We started with buzzwords and came up with lovely, elegant phrases to bring out the inner meaning.

Month – Phrase
January – Quality & Speed at Scale
February – Collaboration across Geographies
March – Cost Effective Architecture
April – Three Lane App Modernization
May – Assimilated Engineering with DevOps
June – Data Driven Decisions
July – Simple & Elegant User Experience
August – Effortless & Powerful Serverless Architecture
September – Composable Service Architecture
October – Democratic Authority & Blockchain
November – Rationale Intelligence
December – Unified Experience & Digital Convergence

Now we had to find the right images. We described what kind of image was needed, rather than pointing at this one or that one. Some descriptions were very specific.

We were adamant that we needed a peacock, but not the fancy one with a fully opened train of feathers. We needed a good, healthy-looking one: proud but pure, resembling elegance without showing off. I'm pretty sure the design team thought they were stuck with a bunch of lunatics. After 28 reviews, we got the right one. Some of the image descriptions are too crazy to write here.

However, some were quite straightforward and quick. 'Cost-Effective Architecture' came out perfect after a single review, and it is my personal favourite.

After completing all the images, we thought it was time to distribute, but our CEO wanted nice packaging. Indeed, yes! An excellent product needs beautiful packaging.

Now all set, and it went out, and we received excellent comments. I was a little scared that people would start thinking of Aventude as a calendar printing company. Lol. The most remarkable comment was from the print agent, who wanted to use this for their portfolio, and the design team was happy, telling us they had not done such a thoughtful design before.

It was an excellent and extraordinary effort from them, and no one could have grasped the idea better than Pixolines. I appreciate and recommend them.

So what now? Are we working on such a thing for 2021? The answer is no. We may work on something, but it is too early to decide on anything; whatever it is, it will not be a calendar.

Construction & Interior Design post-COVID-19 – Simulations & AI

We are living in a very unusual time; in fact, for me personally, 2020 is the most challenging year thus far, and I see reflections of it in my business and personal life. But such times are filled with transformational opportunities, rather than letting us sharpen the same old knife again and again.

At Aventude Spark-E, we are working on some interesting business cases induced by social distancing, and one aspect exciting to me is building architecture and social distancing. One customer requested technical advisory on how to augment existing evacuation planning simulations to map social distancing. If you haven't heard about evacuation planning simulations, simple googling will help; they are well-established agent-based simulations used to study and aid evacuation planning in the event of a catastrophe.

We started with a matching eye-contact-based simulation (I was surprised when I first saw it), and it seemed to work well. However, the critical issue is that these simulations are expensive and often bundled as part of pricey software. This software also needs specialized hardware and processing. Those factors did not map well when we did a cost curve analysis for a SaaS application.

Either we had to reduce the cost of the implementation, or we had to broaden the problem statement to attract investors and expand the audience. The second option seemed more feasible than the first, but how do we take it to a mass audience? An idea came across the table: why don't we make it a standard and publish a social distancing index for each building, so that every building has to go through the process and qualify against this index?

An engineering simulation soon became a standardization business. In the back of my head I was thinking, ok, this is how standards are born, lol. We gave that task to someone who specializes in that area and started thinking about how to bring in crisper use cases.

At this stage, we were working mostly at the conceptual level, or the thought leadership level, as our PR team prefers to put it (wink). Whether the social distancing index flies or not, the need to modify existing buildings and their interiors is a fascinating use case. Most businesses are struggling with how to bring customers back; it is not enough for them to show that they are cleaning shoes and tables every hour. Something has to be structurally convincing for people to feel safe, because we are fighting an invisible enemy.

We did some R&D with Revit, together with a structural engineering designer from the customer's side, who helped us with standard simulations and interior basics. An eye-contact simulation already existed, and we thought we would tap into it and see how things could work together.

The prototype seemed to be working well.

  1. The Revit Python SDK is used to study the existing CAD drawing of a building structure.
  2. A simulation is run to identify the eye-contact rate at a given occupancy rate (a toy sketch of this idea follows the list). There are suggestions to use ray-tracing and lighting simulations as well, but we haven't tried that yet.
  3. A Revit layer suggests interior changes to reduce the eye-contact rate, which is mapped to the social distancing index.
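To make step 2 a little more concrete, here is a purely illustrative, throwaway sketch of an 'eye-contact rate' over randomly placed agents. It has nothing to do with the actual Revit pipeline or the commercial simulation we evaluated; every name, threshold and number in it is made up.

    import math
    import random

    def eye_contact_rate(positions, headings, max_dist=2.0, fov_deg=60.0):
        """Fraction of agent pairs that are close enough and roughly facing
        each other - a toy proxy for an 'eye-contact rate'."""
        contacts, pairs = 0, 0
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                pairs += 1
                (x1, y1), (x2, y2) = positions[i], positions[j]
                if math.hypot(x2 - x1, y2 - y1) > max_dist:
                    continue
                bearing_ij = math.degrees(math.atan2(y2 - y1, x2 - x1))
                bearing_ji = (bearing_ij + 180) % 360
                facing_i = abs((bearing_ij - headings[i] + 180) % 360 - 180) <= fov_deg / 2
                facing_j = abs((bearing_ji - headings[j] + 180) % 360 - 180) <= fov_deg / 2
                if facing_i and facing_j:
                    contacts += 1
        return contacts / pairs if pairs else 0.0

    # Hypothetical 20 m x 10 m floor: scatter agents at a chosen occupancy and measure.
    random.seed(42)
    n_agents = 30
    positions = [(random.uniform(0, 20), random.uniform(0, 10)) for _ in range(n_agents)]
    headings = [random.uniform(0, 360) for _ in range(n_agents)]
    print(f"eye-contact rate: {eye_contact_rate(positions, headings):.3f}")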

#3 is challenging but doable; the real challenge is suggesting a building layout with meaning and taste. Say you're a coffee shop: the algorithm should know what things could be placed and how to rearrange them in a way that is relevant to your business. It is entirely different from modelling a library. This is the ultimate end goal, but it will not be part of the initial release, or it may be available as a preview feature for a particular segment of buildings.

Leaving the details aside, this is a compelling case in terms of how we are trying to address a creative industry and apply AI to augment it. At this level, a complete AI would be very expensive (or, to put it better, we still do not know how to make it cost-effective). For now, the model suggests the structural elements. These suggestions are purely structural and do not carry aesthetic value; that's where human creativity and emotion play a role. It is not a real-time human-to-AI interaction; the baby steps are more of a guiding mechanism for the designers.

I would love to see an AI that remodels the interior of an existing building: we send a drone or something to capture the building model, and it suggests changes that require minimal investment, work with the current stuff, and adhere to the preferred interior design choices.

Buzz Engineering

Disclaimer: This is not a post to discourage your next new thing; in fact, as developers and technology decision makers, we all have our own perils and survival dependencies in buzz engineering. But it is good to understand it, since there is increasing noise around high technology churn and failures.

So, what is buzz engineering? It is easy to spot but hard to digest.

  • Aren’t you working on some next level microservices with serverless orchestration on containers?
  • Aren’t you guys releasing at least once a week?
  • Oops – are you still using relational databases?
  • Why isn’t there any AI in your program – AI should have rendered that CSS better in IE. #lol
  • You must be kidding me, you haven't started that project in Rust and blockchain?
  • Oh gosh, you take time up front to design – you're waterfall-y, yuck!! (my personal favorite)

Yes, that is exactly what buzz engineering is.

It's not that we shouldn't use anything new: we SHOULD, and we should INNOVATE, but the innovation should revolve around the problem rather than around a technology buzz. The reasoning should come from the problem, rather than discovering a problem because you already have a hot technology in your hand.

The world's real problems need far more complex and robust engineering and technology innovation, along with careful digestion of the problem. Understanding the existing problem context, the future, the changing landscape, and continuous improvement of the innovation are essential. Sometimes it is disappointing to see buzz engineering overtaking the problem. Not all problems need the same formula, and not all problems can be solved with the same technology.

From a practical developer's point of view, getting into technologies and implementations because someone else is doing it is quite common. In my opinion this is a killer of creative engineering. But at the same time, we are all both victims and beneficiaries of buzz engineering.

Low Code Development Platform (LCDP) – love, hate & rise

Over the last couple of weeks or so, I happened to hear about some 'Low Code Development Platforms' (LCDPs) from a few project engagements at Tiqri and also from some technical discussions outside. In fact, Tiqri has its own LCDP – Compose (http://www.onctg.com/).

What is an LCDP? According to Wikipedia, low-code development platforms represent a type of technology that allows for creating apps through configuration of functions, rather than coding those functions.

The Love

There are a number of LCDPs available where you can plug and play components and create screens. These screens are often simple web-based pages or mobile apps.

If you’re an enterprise and you need some screens to read and write data from the devices of your sales team to your ERP,

  • You need it instantly
  • Don't want to spend time on discussions of native/hybrid or web app/SPA
  • Want to get it done with a small developer footprint and cost

LCDPs are good at achieving this, and they are often targeted at business stakeholders.

Just as a developer with little business context gets obsessed with the latest conference he attended and wants to convert the systems to the great technology he saw there, business stakeholders with little technical context love the promise of high productivity and low cost that LCDPs make.

In fact, we cannot build an intrinsic argument on the idea that LCDPs do not offer productivity. They do offer faster feature delivery compared to typical development, for obvious reasons.


The Hate

The general perception about LCDPs is that, of course, they offer quick feature delivery, but when it comes to customization and granular business requirements they hit a limit at some point and then become a pain.


It is hard to evaluate any technology completely before using it; after all, the evaluation itself has a shelf life, as technologies are frequently upgraded. But LCDPs have an inherent disadvantage (not all, but most of them) of being coupled with certain types of systems. If your organization has different systems, the conformity of the LCDP with those different systems should be considered.

LCDPs remain mostly in internal applications. If you're launching an app for public usage and high penetration, technical stakeholders mostly do not opt for LCDPs, because they can have their own limitations, and reverting from such an implementation is very costly compared to reverting an internal application developed for your sales team. Application branding and look & feel customization is one big challenge.

Finally, a psychological reason: as developers we favor the engineering feat of the solution more than the business solution. I accept this; it does not mean we do not care about the business. We do, and we make our fullest effort to solve the business problems, but we do not like to do it at the cost of losing the passion for coding. I know there can be many arguments on this point.

The Rise

LCDPs are not new to the market; even the bigger development technology providers like Microsoft, Google and Oracle have their own share of LCDPs in the market.

A few platforms which are connected to major products (SAP, Salesforce) have their fair share, and some generic LCDPs have been living in their own sweet internal world, specific to a few industries. Having started at one point, some LCDPs carry the influence of their first-phase customer's industry, which makes them unsuitable for other industries.

According to the Forrester Wave Q2 2016, LCDPs are categorized under these five categories. Does 'general purpose' mean anything that cannot fit into the other four categories? ;)

[Figure: Forrester Wave Q2 2016 LCDP categories]

I have marked two solid green boxes and one dashed box, based on my opinion and observation of LCDPs. These categories will see a rise in the future, and there are reasons for this.

  1. APIs everywhere – Most LCDPs rely on data pipelines for their functionality; if the data cannot be accessed from a system, they cannot begin their work. But now the proliferation of APIs helps LCDPs break this initial barrier.
  2. Modern cloud-based tools – Modern services like AI, machine learning, data pipelines and integrations have cloud-based LCDP-style tools. Azure, AWS and IBM Watson all have drag & drop or wizard-based implementations of machine learning development platforms.
  3. Serverless (BaaS) – Most Serverless Back-end as a Service models rely on the LCDP platform model. Visual designers and cloud-based tools aid this. Also, the nature of BaaS itself relies heavily on third-party integrations and configuration, which makes it a natural fit for an LCDP.

Conclusion

Based on the trends in APIs, cloud-based tools and Serverless, LCDPs have more potential to grow, but that does not mean they will have solutions for all the problems at hand, mainly in the areas of handling dynamic data structures and modeling, UI, data governance, performance and so on. (Individual LCDPs have some solutions for these problems in certain ways.)

As of now, back-end-based LCDPs which come under the flavor of Serverless are promising in request handling and some process flows. These platforms are often offered by cloud giants like Azure & AWS. This will be a competition for some LCDPs whose strengths are in those areas.

Overall, I believe the future of LCDPs is promising in terms of Serverless, APIs and integrations, but it doesn't look as convincing for frontier application development, except for internal collateral application screens for enterprises.

Visual programming, Rapid Application Development (RAD) and high-productivity application PaaS (hpaPaaS) are some terms closely connected to LCDPs or used to refer to them.


Shadow IT – Tradeoff between frictionless user experience and being responsible with AAD V2

Introduction

First, let me begin with 'What is meant by shadow IT?'. In a broader view, shadow IT is any sort of IT usage without the direct governance of your organization's IT department.

Sometimes this remains a violation of company policies, but the proliferation of cloud SaaS applications and BYOD trends makes shadow IT an unavoidable practice.

[Comic from CloudTweaks]

A simple example would be a cloud-based file sharing application used within an organization without being officially approved by IT.

In most cases, organizations are not aware of the tools used by their employees or of their shadow IT usage. Only 8% of organizations are aware of their shadow IT usage.

[Chart: percentage of organizations aware of their shadow IT usage]

Taken from the Cloud Adoption Practices & Priorities report 2015, Cloud Security Alliance.

In my opinion, two major reasons fuel the increasing shadow IT practice. First, employees have better and more diverse devices than the ones available at work. Second, they find more sophisticated SaaS tools than the ones available at work.

Another notable reason is communication across contextual boundaries, such as departments, SBUs and other companies; people tend to use cloud-based SaaS tools either for convenience or because of an already existing shadow IT practice of one party.

How to do it with AAD V2

So, what is the relevance of shadow IT to software development? One of the projects I'm involved with has been going through a transformation from an internal system to a public one. We decided to open it up as a SaaS tool that anyone with an Azure Active Directory (AAD) credential can use.

Behind the scenes, the application has rules to track the user, the tenant, recurrence of the tenant, other users in the tenant, and the list grows. But anyone with a valid AAD account can simply create an account and start using it. This makes the application a perfectly fitted candidate for shadow IT.

As a SaaS provider we want as many users as possible using our system; after all, we charge per transaction 🙂

  • The application is registered as an AAD V2 app in the home directory.
  • We removed the friction in enrollment by keeping only the minimal delegated permission (User.Read) in the app.

But in order to provide a more sophisticated experience inside the application, we require access to AAD and read permissions on other users. To obtain this, we thought of an approved shadow IT practice via application permissions.

  • We added the admin-consent-level application permissions and generated application secrets.

The configured AAD V2 app registration looks similar to the one below.

[Screenshot: configured AAD V2 app permissions]

From the experience point of view, we trigger the normal AAD login URL for the user login. We use the organizations endpoint (restricting personal Microsoft accounts) with the following URL. (You can try the URL.)

https://login.microsoftonline.com/organizations/oauth2/v2.0/authorize?client_id=412e0485-15f1-4df6-be94-25ce5fcc62db&response_type=id_token&redirect_uri=https://localhost:8080&scope=user.read openid profile&nonce=3c9d2ab9-2d3b-4
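As a minimal sketch of how that authorize URL can be assembled in code (the client_id, redirect URI, scopes and nonce here are just the sample values from the URL above, not anything you should reuse):

    from urllib.parse import urlencode

    # Sample values copied from the URL above - replace with your own app registration.
    params = {
        "client_id": "412e0485-15f1-4df6-be94-25ce5fcc62db",
        "response_type": "id_token",
        "redirect_uri": "https://localhost:8080",
        "scope": "user.read openid profile",
        "nonce": "3c9d2ab9-2d3b-4",
    }

    # 'organizations' restricts sign-in to work/school (AAD) accounts only.
    authorize_url = (
        "https://login.microsoftonline.com/organizations/oauth2/v2.0/authorize?"
        + urlencode(params)
    )
    print(authorize_url)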

This pops up the login, and after successful validation of the credentials you'll see the following consent screen.

[Screenshot: user consent screen]

The user accepts the consent and she's inside the SaaS application; shadow IT begins here. In order to get the additional rights, we let the user inform her IT administrator and ask for the additional permissions.

The IT administrator is notified via the email address entered by the user, with the following admin consent link.

https://login.microsoftonline.com/[tenantid]/adminconsent?client_id=412e0485-15f1-4df6-be94-25ce5fcc62db&response_type=id_token&redirect_uri=https://localhost:8080

Here we obtain the tenant id from the id_token of the user who logged in during the previous step. When the IT administrator, who has admin rights in AAD, hits the above URL and successfully validates his credentials, he will see the following admin consent screen.

[Screenshot: admin consent screen]

The permission list varies based on the application permissions configured in the app. After a successful consent grant, the redirection happens to a URL similar to this:

https://localhost:8080/?admin_consent=True&tenant=[tenant id]

Now the application can use the app secret to access the specific tenant.
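As a rough sketch of that step (using today's MSAL Python library rather than whatever AAD library we used back then; the tenant id and secret below are placeholders, and the client id is the sample one from the URLs above):

    import msal

    TENANT_ID = "<tenant id captured from the admin consent redirect>"
    CLIENT_ID = "412e0485-15f1-4df6-be94-25ce5fcc62db"   # sample app id from above
    CLIENT_SECRET = "<application secret generated for the app>"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        client_credential=CLIENT_SECRET,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    )

    # Client credentials flow: the app acts with its granted application permissions,
    # with no signed-in user involved.
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    access_token = result.get("access_token")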

Note: From the AAD principal point of view, the service principal of the application is registered in the tenant in the first step (this has to be allowed in AAD; by default it is enabled), and during the admin consent process the service principal is granted the additional permissions.

Summary

We achieved both a frictionless experience for the user and the ability for the administrator to grant the permissions when required. The image below summarizes the process.

[Diagram: consent flow summary]

  • Through the request, the IT admin learns that the application is in use, which sheds some light on the usage of such SaaS tools.
  • By granting access, the IT admin allows it for organizational use, removing the shadow IT context.
  • If the admin rejects the consent, the organizational user knows that he's in a shadow IT context.
  • Blocking such SaaS may follow, based on organizational policies.


The point of polyglot

Recently I spoke about polyglot persistence at one of the SQL Saturday events. The basic idea of the session revolved around not getting overwhelmed by the NoSQL boom, while at the same time understanding that modern application requirements demand more features that side with NoSQL capabilities.

Enterprise application development is going through a bigger shift than ever before. Enterprises look for more consumer application and social features in their enterprise software. Examples include a chat feature in a banking system, tag-based image search, and heavy blob handling features like bookmarking and read-resume state; some go beyond the traditional limits and add AI features with cognitive services.

So would NoSQL technologies help us in mapping, modeling, designing and developing these applications? Sure they would. But how the adoption of NoSQL technologies happens, and the mentality of the people around it, is quite interesting to see.

In my opinion, two major concerns prevail in the industry about the adoption of NoSQL technology. They are:

  • NoSQL for no reason – People who believe that NoSQL is the way to go in all projects. NoSQL is the ultimate savior. NoSQL replaces relational stores. The world does not require relational databases. I often hear complaints that a database table has more than 1 million rows, or the database has grown beyond 2 TB, so now we think we need to move it to NoSQL; or they say it is very slow, so we need to move to NoSQL.
  • Fear among traditional relational database people – People who have relational database skills and think their skills do not match the NoSQL world, and are afraid of it. To them, NoSQL is an alien technology that is going to replace relational databases. Their fear is made worse by the group mentioned above, who believe in NoSQL for no reason.

Both parties miss the big picture. The better option is to use the right technology based on the requirement. The better case is opting for polyglot persistence, a hybrid of both relational and NoSQL technologies.

Let's name the decision point of when to make the move to polyglot persistence the point of polyglot. Below I present two real cases of polyglot persistence, focusing mainly on the stage at which it happened.

Scenario of moving to polyglot from relational only – A product used in banking risk analysis; it handles many transactions, with an Azure SQL Database running on the premium tier. A feature came along where users should be able to create their own forms and collect data (custom surveys); we needed to store the HTML of the survey template and the data filled in by the users. At this point we thought about NoSQL, but we sided with relational. We stored the template as HTML and the data as JSON in the SQL Database. We made this decision because no search had to be performed over it, and the new feature seemed unlikely to be used frequently. Later, another feature-rich chat module came along, with the ability to send attachments and hold group conversations. This is the point at which we decided to use Document DB (the Azure document-type NoSQL store). The user-related data is in SQL Database and the chat messages are in Document DB, leading to polyglot persistence.

Things to note: We were reluctant to move to NoSQL when the survey requirement came, because although a survey is dynamic during creation, it is very much static after creation, and we didn't want to add NoSQL just because of this one feature, which is part of a bigger module. But we readily made the decision to use Document DB for chat, because chat is a replacement for an internal email system and is not a good candidate for modelling with a relational schema.
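As a toy illustration of what that split can look like in code (all the names, connection strings and containers here are hypothetical, and I'm using the current azure-cosmos Python SDK since Document DB has since become Cosmos DB; the real product was built on .NET):

    import uuid
    import pyodbc
    from azure.cosmos import CosmosClient

    # Relational side: core risk/user data and the survey JSON stay in Azure SQL
    # (hypothetical DSN and schema).
    sql = pyodbc.connect("DSN=riskdb;UID=app;PWD=<secret>")
    sql.execute(
        "INSERT INTO SurveyResponses (TemplateId, UserId, DataJson) VALUES (?, ?, ?)",
        (42, 1001, '{"q1": "yes", "q2": "no"}'),
    )
    sql.commit()

    # Document side: chat messages go to the document store (hypothetical account/key).
    cosmos = CosmosClient("https://<account>.documents.azure.com", credential="<key>")
    chat = cosmos.get_database_client("appdb").get_container_client("chatMessages")
    chat.create_item({
        "id": str(uuid.uuid4()),
        "conversationId": "grp-7",
        "senderId": 1001,
        "body": "Here is the attachment you asked for",
        "attachments": [{"name": "report.xlsx", "blobUrl": "https://<account>.blob.core.windows.net/chat/report.xlsx"}],
    })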

Scenario of moving to polyglot from NoSQL only – This is the back-end service and persistence layer of an emerging mobile app, with loads of unstructured data about places and reviews. It started with Azure Document DB. Later the app expanded, and the places and restaurants wanted to be able to log in through a portal and adjust their payment plans for promotions. We needed to persist metadata and payment information; that's the point at which we set up an Azure SQL Database, and everything is smooth.

Things to note: It's not that a NoSQL database cannot handle that transaction/accounting kind of information, but it is not a natural fit for reporting and auditing purposes.

As you can see, there's no strict rule on when one should decide to move to NoSQL or to a relational schema. I refer to this balance as the natural fit.

Having strict demarcations between relational and NoSQL wouldn't help to achieve the best use cases. It is hard to define the crossing point, but it is easy to look at the overall business case and decide.

The figure below shows the point of polyglot (author's concept).

[Figure: the point of polyglot]

Natural fit plays a major role in deciding the point of polyglot. But that doesn't mean it is always somewhere in the middle; it can be anywhere, based on the product features, roadmap and team skills. There are products which have polyglot persistence from the very beginning of the implementation.

Though the point of polyglot can be mapped as above, the implementation of polyglot is influenced by two major factors: the cost of implementation and the available skills. The figure below shows the decision matrix (author's concept).

[Figure: polyglot implementation decision matrix]

Conclusion – There are two groups of people with opposing mindsets about adopting either NoSQL or relational stores. At some point most projects will go through the point of polyglot, but this is not the implementation point. On general ground, the implementation decision is highly influenced by the decision matrix.

Dev Day 2015 FB app – powered by Azure Storage

An FB app was around during the Dev Day 2015 season; it generated a picture merging the Dev Day logo with the user's current profile picture and posted it to his/her Facebook timeline. The user who generated the most pictures was announced as the winner.

Anuradha presenting the FREE ticket to the Winner.


There were 478 unique users who generated 1023 images. These numbers aren't that staggering, but let's see how this app was modeled.

The app used Azure Blob storage and Table storage. Blob storage was used to store the users' merged images.

In Azure Blob storage there were two containers: one public and one private. The app-specific images, including the Dev Day logo, were stored in the private container. The generated images were kept in the public container, making it easy to post them to Facebook using their public URLs.
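A rough sketch of that container setup with today's azure-storage-blob Python SDK (the original app was .NET; the connection string and container names here are hypothetical):

    from azure.storage.blob import BlobServiceClient

    blobs = BlobServiceClient.from_connection_string("<storage connection string>")

    # Private container: app assets such as the Dev Day logo (no anonymous access).
    blobs.create_container("app-assets")

    # Public container: merged images, readable anonymously via their public URL so
    # they can be posted straight to Facebook.
    blobs.create_container("generated-images", public_access="blob")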

The privacy policy was aligned to cover the behavior of keeping the merged images in a public repository: "according to the app privacy policy, the merged images are considered processed content of the app and can be used outside the scope of the app itself. The app did not store the raw profile pictures anywhere."

Table storage was used to store participant information. The initial rule was that you share the picture and one user will be selected at random as the winner, so the design was like this.

[Diagram: initial table design – single partition, Facebook user Id as the RowKey]

There was only a single partition, and not much to worry about there. But the Facebook user Id was used as the RowKey, so even if a user generated an image more than once, there would be only a single entry in the table. As a lazy programmer, I just used a single Upsert operation to write data to this table.
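Roughly, the write path looked like this (a sketch with today's azure-data-tables Python SDK rather than the .NET storage library the app actually used; the connection string, table and property names are hypothetical):

    from azure.data.tables import TableClient, UpdateMode

    table = TableClient.from_connection_string("<storage connection string>", "Participants")

    fb_user_id = "1000123456789"          # sample Facebook user Id
    entity = {
        "PartitionKey": "devday2015",     # single partition for the whole app
        "RowKey": fb_user_id,             # one row per user under the original design
        "Name": "Sample User",
        "LastGeneratedImageUrl": "https://<account>.blob.core.windows.net/generated-images/xyz.png",
    }

    # Upsert: insert if new, overwrite if the same user generates another image.
    table.upsert_entity(entity, mode=UpdateMode.REPLACE)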

But soon after launching the app, I noticed that the usage pattern was significantly different: the same users had been generating more than one image. I tracked this using the Table storage Timestamp column, and I also had another column to track the last updated time.

To make the competition fair and increase traffic, I redesigned it and announced a new rule: the person who generates the most images wins. The RowKey was changed to a GUID, and another Id column was added to track users.

[Diagram: redesigned table – GUID RowKey plus an Id column holding the Facebook user Id]

At the end of the competition, a simple group-by on the Id column with a count revealed the winner.
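Table storage has no server-side GROUP BY, so one way to do this grouping is on the client; a rough sketch of that, again with the Python SDK and the same hypothetical names:

    from collections import Counter
    from azure.data.tables import TableClient

    table = TableClient.from_connection_string("<storage connection string>", "Participants")

    # Pull every entry and count image generations per Facebook user Id on the client.
    counts = Counter(entity["Id"] for entity in table.list_entities())
    winner_id, image_count = counts.most_common(1)[0]
    print(f"winner: {winner_id} with {image_count} images")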

Special thanks to @Madhawee for helping with the UI of the app.

A portion of the collage generated from the first 100 photos created by the app. (Images are posted here under the privacy policy accepted by the users, which states that merged images can be used externally, outside the scope of the app itself.)


[Image: collage of the first 100 generated photos]