Monday, March 28, 2016

BI Capabilities

As a result of this weekend's reading, here's an interesting Gartner update on "Critical Capabilities for Business Intelligence (BI) and Analytics Platforms".

The very first use of what we now mostly call business intelligence was in 1951, on the Lyons Electronic Office (LEO), a computer powered by over 6,000 vacuum tubes. At its core, BI is about "meeting business needs through actionable information". The linear growth of BI is depicted in the attachment.

The BI and analytic platform market has undergone a fundamental shift. During the past 10 years, BI platform investments have mostly been in IT-led consolidation and standardization projects for large-scale system-of-record reporting.

Current Trend
As demand from business users for pervasive access to data discovery capabilities grows, IT wants to deliver on this requirement without sacrificing governance — in a managed or governed data discovery mode.

The business analytics of tomorrow are focused on the future (predictive) and try to answer (prescriptive) the questions: What will happen? How can we make it happen?

Predictive analytics encompasses a variety of techniques from statistics, data mining, and game theory that analyze current and historical facts to make predictions about future events.
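At its simplest, predicting a future value from historical facts can be a least-squares trend line extrapolated one step ahead. The sketch below uses hypothetical revenue numbers; real predictive analytics layers far richer techniques on top of this idea.

```python
# Minimal sketch of predictive analytics: fit a linear trend to a
# historical series and extrapolate the next value. The revenue
# figures below are purely illustrative.

def linear_forecast(history):
    """Least-squares line through (0, y0), (1, y1), ...;
    returns the predicted value for the next period."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope * n + intercept  # predict at x = n (one step ahead)

# Hypothetical quarterly revenue; predict the next quarter.
print(linear_forecast([100, 110, 120, 130]))  # -> 140.0
```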


  • BI has passed a tipping point as it shifts away from IT-centric, reporting-based platforms
  • Early entrants to the data discovery market may have strong capabilities in interactive visual data discovery
  • Vendors earn higher differentiation scores by offering emerging capabilities — such as search, embedded analytics, collaboration, self-service data preparation and big data support.

Road Map
A few predictions define the BI and analytics road map:

  • By 2018, data discovery and data management evolution will drive most organizations to augment centralized analytic architectures with decentralized approaches.
  • By 2018, smart, governed, Hadoop-based, search-based and visual-based data discovery will converge into a single set of next-generation components.
  • By 2020, 80% of all enterprise reporting will be based on modern business intelligence and analytics platforms; the remaining 20% will still be on IT-centric, reporting-based platforms because the risk of change outweighs the value.

Closure Note
As per the Gartner report, the BI market has shifted to more user-driven, agile development of visual, interactive dashboards with data from a broader range of sources. With my rich experience in building a financial enterprise data hub, I can sense the breadth & depth of that "broader range of sources".

Thursday, March 24, 2016

Altiscale Insight Cloud

Hadoop-as-a-Service (HaaS) vendor Altiscale is moving up the stack with a new service called Altiscale Insight Cloud, which sits on top of its existing service, Altiscale Data Cloud. How does it work?

Ingest services consist of a user interface over jobs that run on Apache Oozie, and allow the definition of validation rules on the ingested data. Analysis functionality is provided by an OEM'd implementation of Alation, a product which acts as a data catalog. Underneath Alation, Altiscale has configured the Hadoop cluster so that Hive and Spark SQL point to exactly the same data files, and either technology can be used to satisfy queries.

Insight Cloud nicely finishes off the raw infrastructure of Altiscale Data Cloud with some basic functionality that makes the combination of Hadoop and Spark more usable, without reinventing the wheels that BI and Big Data analytics players already have in-market.

Altiscale says Insight Cloud is a Hadoop/Spark offering that is very BI tool-ready, so that users of Tableau, Excel or other common self-service tools can more readily attach to and analyze Big Data.

Pricing is consumption-driven, starting at $9,000/month for 20TB of storage and 10,000 "task hours". Having Insight Cloud in-market makes Altiscale more competitive with fellow HaaS provider Qubole.

Saturday, March 19, 2016

AWS Decade

Looking back at the past decade, it is pretty impressive to see just how much the IT world has changed. Even more impressive, the change is not limited to technology. Business models have changed, as has the language around it.

A decade ago we would not have spoken of the cloud, micro-services, server-less applications, the Internet of Things, containers, or lean startups. We would not have practiced continuous integration, continuous delivery, DevOps, or ChatOps.

Today, keeping current means staying abreast of developments in programming languages, system architectures, and industry best practices. It means that you spend time every day improving your current skills and looking for new ones.

It's great to look back at the success path of AWS:
  • 2006 – Amazon EC2 & S3 – elastic compute & storage
  • 2009 – Amazon RDS – relational database service on demand
  • 2012 – Amazon DynamoDB – internet-scale data storage
  • 2012 – Amazon Redshift – data warehouses in minutes, not quarters
  • 2013 – Amazon Kinesis – real-time data capture and processing
  • 2014 – AWS Lambda – a new programming model
  • 2015 – AWS IoT – devices are the future

To mark AWS's 10-year anniversary, qwikLABS is offering unlimited access to any AWS lab in the qwikLABS catalog. Ref:

Authorized AWS training with live console access, for a limited time, from now until March 31!  Take as many of our 95 labs as you can handle and see how many badges you can earn. 

But hurry—this special event is available until the end of March.

To AWS: congratulations on an awesome ten years. To us: keep on learning, keep on building, and keep on sharing successes with society!

Thursday, March 10, 2016

Central Authentication Service

In today's application architectures, any type of distributed app needs 3A support.

  1. Authentication
  2. Authorization
  3. Auditing

Software Engineers are always called on to authenticate or authorize access to applications, systems, or apps. And if they’re not, they should be. Constantly.
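The 3A pipeline can be sketched as a minimal request handler; the user names, credential store, and actions below are hypothetical, and a real system would use hashed credentials and a proper policy engine.

```python
# Hypothetical sketch of the 3A pipeline: every request passes through
# authentication (who are you?), authorization (what may you do?), and
# auditing (record what happened). All names and data are illustrative.
audit_log = []
USERS = {"alice": "s3cret"}              # credential store (demo only)
ROLES = {"alice": {"read_reports"}}      # permission store (demo only)

def handle_request(user, password, action):
    # 1. Authentication: verify the caller's identity
    if USERS.get(user) != password:
        audit_log.append((user, action, "auth-failed"))
        return "401 Unauthorized"
    # 2. Authorization: verify the caller may perform this action
    if action not in ROLES.get(user, set()):
        audit_log.append((user, action, "forbidden"))
        return "403 Forbidden"
    # 3. Auditing: record the successful access
    audit_log.append((user, action, "ok"))
    return "200 OK"

print(handle_request("alice", "s3cret", "read_reports"))  # -> 200 OK
print(handle_request("alice", "s3cret", "delete_all"))    # -> 403 Forbidden
```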

The CAS flow is roughly outlined in the four steps below:

  1. A user accesses the Audit Console with an unauthenticated request
  2. The user is redirected to the CAS login server
  3. After the user logs in, the CAS server issues a ticket for the username
  4. The Audit Console validates the ticket and loads the authenticated user from the user database
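Step 4 can be sketched in a few lines: the application calls the CAS server's `/serviceValidate` endpoint (defined in the CAS 2.0/3.0 protocol) and parses the XML response to extract the username. The server and service URLs below are hypothetical placeholders.

```python
# Sketch of CAS ticket validation: build the /serviceValidate URL and
# parse the XML response defined by the CAS protocol specification.
import urllib.parse
import xml.etree.ElementTree as ET

CAS_NS = {"cas": "http://www.yale.edu/tp/cas"}

def validation_url(cas_server, service, ticket):
    """URL the application calls to validate a ticket."""
    query = urllib.parse.urlencode({"service": service, "ticket": ticket})
    return f"{cas_server}/serviceValidate?{query}"

def parse_validation_response(xml_text):
    """Return the authenticated username, or None on failure."""
    root = ET.fromstring(xml_text)
    user = root.find("cas:authenticationSuccess/cas:user", CAS_NS)
    return user.text if user is not None else None

# Example success response in the shape the CAS protocol defines:
sample = """<cas:serviceResponse xmlns:cas="http://www.yale.edu/tp/cas">
  <cas:authenticationSuccess><cas:user>alice</cas:user></cas:authenticationSuccess>
</cas:serviceResponse>"""
print(parse_validation_response(sample))  # -> alice
```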

Industry Usage
There are three major standards, namely OAuth, SAML, and OpenID. Google, Amazon, Facebook, and just about every other major internet application provider supports at least one of these standards, and most support all of them.

An emerging authentication technology in the industry is CAS (Central Authentication Service).

CAS is an enterprise Single Sign-On solution for web services. Single Sign-On (SSO) means a better user experience when running a multitude of web services, each with its own means of authentication. With an SSO solution, different web services authenticate against one authoritative source of trust that the user logs in to, instead of requiring the end user to log in to each separate service.

A number of out-of-the-box solutions exist to enable web services written in a specific language, or based on a framework, to use CAS. This enables deployers to implement an SSO solution in a matter of hours.

CAS Protocol 3.0 Specification was retrieved on Jan'15. Ref:

CAS is open source, licensed under Apache v2. Ref:

Along similar lines, let's catch up on the other standards quickly.

OASIS started work on SAML in 2001. The common implementations use XML data passed between various subsystems to authenticate users. OpenID addresses the same problem, and experience has shown it to be a bit more scalable.

Brad Fitzpatrick initiated OpenID in 2005, with support for signing and strong encryption. By 2008, it was supported by Sun Microsystems and Yahoo! for authentication and authorization.

Blaine Cook started working on OAuth in 2006 while working on OpenID at Twitter. Designed for HTTP, OAuth uses access tokens as the basis for user authentication, a common pattern.

Enterprise Use Case
You might be a little confused about which one to use. The answer depends purely on your application's use case.

CAS centralizes authentication: use it if you want all your applications to ask users to log in to a single server.

OpenID decentralizes authentication: use it if you want your application to accept logins from whatever authentication service users choose.

OAuth is not about Single Sign-On. It handles authorization, which is about letting the user control how their resources may be accessed by third parties.
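The access-token pattern mentioned above looks like this in practice: once a third party has been granted a token, it presents it on every HTTP request instead of the user's password. The endpoint URL and token value in this sketch are hypothetical placeholders.

```python
# Sketch of the OAuth 2.0 bearer-token pattern: the client attaches the
# granted access token to each request via the Authorization header.
import urllib.request

def authorized_request(url, access_token):
    """Build an HTTP request carrying an OAuth 2.0 bearer token."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {access_token}")
    return req

# Hypothetical protected resource and token:
req = authorized_request("https://api.example.com/v1/photos", "ya29.demo-token")
print(req.get_header("Authorization"))  # -> Bearer ya29.demo-token
```

The resource server validates the token and checks its scope before serving the request, so the third party never sees the user's credentials.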

Based on your application's requirements and specification, the appropriate industry-standard pattern/practice should be leveraged — not the other way round!

Saturday, March 5, 2016


Today, I attended a coding contest for the first time @ HackerRank. It was quite an interesting exercise.

The competition was a 4-hour coding contest, timed from when you first logged in. Out of the six given problem statements, I ranked 100-150 in three problems, 400+ in two, and 630+ in one.

Though I didn't make great progress, it was exciting to apply for career opportunities with Accel Partners' fast-growing portfolio companies (attached here).

Lesson learnt: I need to push hard to earn top ranks in these highly competitive, standardized exercises. The final score card is:

Your Final Position: 636  You finished in the top 82%

Gearing up for the next turn !!!

Tuesday, March 1, 2016

Microsoft Show

Last week, I got an opportunity to participate in Microsoft's Bangalore event, and here I'm sharing the conference info.

Keynote
The event kicked off with a keynote address on the strategy and messaging about the platforms and productivity services for a mobile-first, cloud-first world.

The keynote centered on business transformation, with three core essences:
  1. Efficiency
  2. Agility
  3. Differentiation
Efficiency produces better results, agility makes things faster and cheaper, and differentiation brings the innovation factor.

With these essences, a Microsoft product walkthrough was presented covering DevOps, analytics, Cortana, IoT, cloud infrastructure, and mobile-first.

Cortana Analytics is a fully managed big data management and advanced analytics suite that enables the customer to transform the data into intelligent action. To the customer, this solution enhances the business applications with machine intelligence to evolve from simple descriptive analytics to prescriptive analytical recommendations.

Windows 10
To me, Windows 10 is not only a Universal Application platform but also brings other features like:
  • Enterprise Data Protection - differentiation between private and work data
  • Wipe and Reload - migrates old files, settings, and programs from your previous Windows system to the new one
  • Biometrics - native security that will essentially eliminate the need for passwords
  • Continuum - the user interface adapts automatically depending on the situation/device
  • Edge/IE 11 - the modern 'Edge' browser and the traditional 'IE' work better together
To me, Delve (code-named "Oslo") is the next-generation connect platform (built on SharePoint), with the interesting addition of personal analytics reports.

Delve displays information based on "the work they are doing and the people with whom they are engaging," in a card-like user interface. With Delve, the idea is users won't have to remember where their information is stored or who shared it. That information will be surfaced for users automatically, but only when the appropriate permissions are granted.

Delve uses Microsoft's internally built Office Graph to ascertain relationships between people, content and activity across Office 365. Microsoft officials have said Delve is just the first app that will make use of Office Graph information.

SQL 2016
SQL 2016 comes packed with more in-depth encrypted security, in-memory improvements, a mobile-friendly BI and analytics dashboard, a Stretch Database for hyper-scale in the cloud, built-in advanced analytics, and more.

To me, the top three most attractive features are:
  1. PolyBase
  2. Workload Insight
  3. Built-in security
Open Source Strategy
As a developer, I'm so excited on Microsoft's Open Source Strategy implementation.

.NET Core was created so that .NET could be open source, cross-platform, and usable in more resource-constrained environments. .NET Core and the .NET Framework have (for the most part) a subset-superset relationship.

The .NET Core platform is part of the .NET Foundation and was open-sourced by Microsoft Open Technologies, Inc. It comprises:
  1. .NET Core Project
  2. .NET Compiler Platform ("Roslyn" Project)
  3. ASP.NET Project
ASP.NET 5 is a new cross-platform version of ASP.NET that is designed for the cloud, and runs on Windows, Linux and Mac.

To me, the lesson is adaptability. If you or your company can't adapt to industry change, you'll eventually be thrown out.