Sunday, May 31, 2015

IBM SoftLayer


IBM is two years into its SoftLayer acquisition, which has transformed its cloud strategy and now forms the backbone of its cloud offerings.

The SoftLayer acquisition is really exciting and transformational for IBM, and I think it provides a tremendous amount of value for our customers small and large.

Increasingly, what we find is that customers want to put high-performance workloads on the cloud. For example, they might be doing big data analysis, or they might have mobile or social applications that they want to stand up on the cloud.

What SoftLayer provides is a full expanse of cloud capabilities. You can have what are known as bare metal dedicated servers, which are ideal for I/O-intensive workloads. They don’t have a virtualization layer, so they can scale vertically, which is very important when you're doing big data analysis.

Then there are virtualized servers, as you would expect from many of the other providers out there. SoftLayer can also deliver all of this over a set of network options: if you want a private cloud, you can stand that up on SoftLayer; if you want a public cloud, you can stand that up on SoftLayer. And it gives you a very high-performance network as well.

So it’s the servers. It’s the security, including firewalls, security devices, and other capabilities. It’s a high-performance network. All of those are capabilities that SoftLayer delivers for IBM.

Sunday, May 24, 2015

Amazon Cloud Free Tier

The Amazon Web Services (AWS) Free Tier is designed to let you get hands-on experience with AWS at no charge for 12 months after you sign up.

AWS Marketplace offers more than 700 free and paid software products that run on the AWS Free Tier. If you qualify for the AWS Free Tier, you can use these products on an Amazon EC2 t2.micro instance for up to 750 hours per month and pay no additional charges for the Amazon EC2 instance (during the 12 months).  Software charges may still apply for paid software.

After creating your AWS account you can use any of 21 products and services for free within certain usage limits.  A few samples are:
  • Amazon EC2: 750 hours per month
  • Amazon S3: 5 GB of Standard Storage; 20,000 Get Requests; 2,000 Put Requests
  • Amazon DynamoDB: 25 GB of Storage; 25 Units of Write Capacity; 25 Units of Read Capacity
  • AWS Lambda: 1,000,000 free requests per month; Up to 3.2 million seconds of compute time per month
  • AWS Key Management Service: 20,000 free requests per month
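As a rough illustration (my own sketch, not an AWS tool), you can sanity-check whether planned monthly usage stays within the free-tier limits listed above. The metric names and structure here are hypothetical:

```python
# Sketch: check planned monthly usage against AWS Free Tier limits.
# Limits are taken from the list above; the dict keys are my own naming.

FREE_TIER_LIMITS = {
    "ec2_hours": 750,            # t2.micro hours per month
    "s3_storage_gb": 5,          # S3 Standard storage
    "s3_get_requests": 20_000,
    "s3_put_requests": 2_000,
    "lambda_requests": 1_000_000,
}

def within_free_tier(planned_usage):
    """Return the list of metrics that exceed the free-tier limit."""
    return [metric for metric, used in planned_usage.items()
            if used > FREE_TIER_LIMITS.get(metric, float("inf"))]

# One t2.micro running all month (24 * 31 = 744 hours) fits in 750 hours;
# two such instances would not.
print(within_free_tier({"ec2_hours": 744}))      # []
print(within_free_tier({"ec2_hours": 1488}))     # ['ec2_hours']
```

Note that anything beyond these limits, or any paid Marketplace software, is billed at standard rates.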

These free tiers are only available to new AWS customers, and are available for 12 months following your AWS sign-up date. You will not be eligible for the Offer if you or your organization create(s) more than one account to receive additional benefits under the Offer or if the new account is included in Consolidated Billing. You will be charged standard rates for use of AWS services if we determine that you are not eligible for the Offer.

You can sign up today to automatically take advantage of AWS’s free tier at https://portal.aws.amazon.com/gp/aws/developer/registration/index.html

Tuesday, May 19, 2015

AWS Storage Services


Today, I was invited to participate in an AWS Storage Lunch & Learn session at the Amazon Web Services office.  Recently, I read that AWS launched a new storage service, Elastic File System (EFS), on CTO Werner Vogels' blog http://www.allthingsdistributed.com/2015/04/amazon-elastic-filesystem-machine-learning.html and so I was curious to learn about their storage ecosystem, as shown in the attached diagram.

AWS now offers 4 major storage solutions, namely object, archive, block, and file storage.  Let's take a snapshot of each model.

1. Object Storage
Amazon Simple Storage Service (S3) provides object storage, presented as buckets of objects and accessible over the Internet using URLs or APIs.  Each stored item, like an image, video, HTML page, DB file, feed file, etc., is treated as an object or resource and is accessible via URL.  So AWS object storage is neither a file system nor a database system.  Each object can range from 1 byte to 5 TB; object storage is virtually unlimited, with auto scaling.
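Since every object is addressable by URL, a publicly readable object can be reached at a predictable address. Here is a minimal sketch of building such an address; the bucket and key names are hypothetical, and the URL shapes follow S3's standard virtual-hosted style as of the time of writing:

```python
from urllib.parse import quote

def s3_object_url(bucket, key, region="us-east-1"):
    """Build the virtual-hosted-style URL for an S3 object."""
    if region == "us-east-1":
        host = f"{bucket}.s3.amazonaws.com"
    else:
        host = f"{bucket}.s3-{region}.amazonaws.com"
    return f"https://{host}/{quote(key)}"  # quote() escapes spaces, etc.

print(s3_object_url("my-demo-bucket", "images/logo.png"))
# https://my-demo-bucket.s3.amazonaws.com/images/logo.png
```

Actually fetching the object still requires the bucket policy or object ACL to permit public reads.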

2. Archive Storage
Amazon Glacier provides archival storage, in the form of vaults of objects, available for infrequent access via APIs over the Internet.  Amazon Glacier is a secure, durable, and extremely low-cost storage service for data archiving and online backup. Customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month, a significant saving compared to on-premises solutions. To keep costs low, Amazon Glacier is optimized for infrequently accessed data where a retrieval time of several (roughly 3 to 5) hours is suitable. Glacier is typically reached via S3 rather than directly: users can set lifecycle rules to archive live S3 data into Glacier and to purge Glacier after a specific interval.
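The archive-then-purge lifecycle rule described above can be expressed as configuration. A sketch follows; the rule ID, prefix, and day counts are illustrative, while the dict shape matches what boto3's `put_bucket_lifecycle_configuration` accepts (applying it needs real AWS credentials, so that call is left commented out):

```python
# Illustrative S3 lifecycle rule: move objects to Glacier after 30 days,
# delete them after 365 days. Numbers and names are my own examples.

lifecycle_rule = {
    "ID": "archive-then-expire",
    "Prefix": "logs/",                            # objects under this prefix
    "Status": "Enabled",
    "Transitions": [
        {"Days": 30, "StorageClass": "GLACIER"},  # archive to Glacier
    ],
    "Expiration": {"Days": 365},                  # purge after a year
}

# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-demo-bucket",
#     LifecycleConfiguration={"Rules": [lifecycle_rule]},
# )

print(lifecycle_rule["Transitions"][0]["StorageClass"])  # GLACIER
```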

3. Block Storage
Amazon Elastic Block Store (EBS) provides SAN-style block storage, presented as disk volumes that can be attached to one Amazon Elastic Compute Cloud (EC2) instance at a time. I see it as more of a cloud-mounted disk that can be attached to and detached from any type of system, i.e. Windows, Linux, etc.  Multiple EBS volumes can be combined to build RAID storage.

In simple terms, block storage (EBS) is used as the database layer during application development.  DB content can be backed up with a combination of EBS, S3 & Glacier. Let me explain the business use case: DB storage on EBS can be snapshotted/versioned as flat storage in S3 for the running instance.  On arrival of the next snapshot, the older version can be pushed to Glacier as an archive.  It is completely automated via rules provided by AWS storage services.
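The rotation described above (latest snapshot live in S3, older versions pushed toward Glacier) can be sketched as plain logic. This is my own toy illustration with hypothetical snapshot IDs; real automation would use the AWS lifecycle rules just mentioned:

```python
def rotate_snapshots(snapshots, new_snapshot):
    """Keep the newest snapshot 'live'; mark all older ones for archive.

    snapshots: list of snapshot IDs, oldest first.
    Returns (live_snapshot, snapshots_to_archive).
    """
    history = snapshots + [new_snapshot]
    return history[-1], history[:-1]

# A new snapshot arrives; the two older versions go to the archive queue.
live, to_archive = rotate_snapshots(["db-v1", "db-v2"], "db-v3")
print(live)        # db-v3
print(to_archive)  # ['db-v1', 'db-v2']
```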

4. File System
Amazon EFS is a fully-managed service that makes it easy to set up and scale shared file storage in the AWS Cloud. With a few clicks in the AWS Management Console, customers can use Amazon EFS to create file systems that are accessible to EC2 instances and that support standard operating system APIs and file system semantics. Amazon EFS file systems can automatically scale from small file systems to petabyte-scale without needing to provision storage or throughput. Amazon EFS can support thousands of concurrent client connections with consistent performance, making it ideal for a wide range of uses that require on-demand scaling of file system capacity and performance.

Apart from the above 4 storage services, AWS Storage Gateway makes the power of secure and reliable cloud storage accessible from customers' on-premises applications.

My observation is that Amazon is achieving wonderful things in the cloud platform space.  This is the tip of the iceberg; enormously brilliant minds and efforts lie underneath. Most companies talk about an API or service model, but Amazon made it real, and robust, after dogfooding it on their own global online store.

Sunday, May 17, 2015

Business Analytics @IIT


Yesterday, I attended an 'Analytics for Business' session by Aaum Analytics at the Indian Institute of Technology Park.  It featured pretty interactive and deep technical discussions.  In spite of the weekend, a significant set of participants came for the session, which reflects the thirst for big data technology.

A few takeaway points:

Recommended solution
  1. Persist raw inbound data in the original file format at low cost, if required
  2. Process the inflow data, summarize the information, then persist it in a traditional DB
  3. This helps the reporting layer to extract and present reports in a better way

Challenges to Adoption
  1. Return on Investment (RoI) business justification
  2. Missing accuracy in Key Performance Indicators (KPIs)
  3. Unavailable talent pool
  4. Growing requirements gap between customers, analysts, and developers

Assessment Attributes for Analytics
  1. Demographic
  2. Behavioral
  3. Psycho-graphic
  4. Processing parameters

Reporting Methodology Evolution
  • What happened? Reporting
  • Why did it happen? Analytics
  • What is happening? Monitoring
  • What will happen? Prediction
  • How to resolve it? Prescriptive

Visualization Methodology
4 key factors to build visualization models:
  1. Distribution
  2. Relationship
  3. Comparison
  4. Composition

A few industry-wide popular analysis methods:
  • Weighted average rating
  • Logistic regression
  • Time Series Analysis
  • Stationary series
  • Principal Component analysis
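As a tiny illustration of the first method above, a weighted average rating combines scores with weights (the numbers here are my own example, not from the session):

```python
def weighted_average(ratings, weights):
    """Weighted average rating: sum(r * w) / sum(w)."""
    if len(ratings) != len(weights):
        raise ValueError("ratings and weights must align")
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

# A product rated 5, 4, and 3 by reviewer groups of size 10, 30, and 60:
# (5*10 + 4*30 + 3*60) / 100 = 3.5
print(weighted_average([5, 4, 3], [10, 30, 60]))  # 3.5
```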

R Language
R is considered a key programming language for analytics because it is:
  • Open source, running on Windows, Mac, Linux, Unix, etc.
  • Equipped with industry-standard statistical packages (e.g. measures of dispersion)
  • Available in 3 versions: Simple R, Parallel R, Distributed R
  • Backed by a robust, vibrant support community
  • Embedded with data visualization techniques

Sector-wise demo sites:
http://genisights.com/retail/home.jsp
http://genisights.com/finance/home.jsp

Friday, May 15, 2015

Azure Camp


Yesterday, I took part in a Microsoft Dev Camp on Azure Essentials.  After getting my hands dirty at Azure Camp 2011, I didn't have the opportunity to dive into Azure again; I was more focused on the big data space.  Now, it's great to see Azure's maturity over the last few years.

Here are the takeaway points from yesterday's sessions.

Why Cloud? 
3 Core Advantages: Speed, Scale, Cost

How Is Azure Spread?
17 regions, as of now. CEO Satya Nadella recently shared a plan for a few additional centres within India, starting with Pune. Local, zone, and geo-redundant replication options are available to build high availability.

Few Stats

  • 57% of Fortune 500 companies
  • 1 million SQL DBs
  • 3 million requests per second
  • 300 million active users


Non Microsoft Support

  • Azure is no longer Microsoft-only; it is heterogeneous
  • Runtime: beyond .NET, there is Java, Python, Node.js, PHP
  • Storage: beyond SQL Server, there is MySQL, Redis, CouchDB, Hadoop, Cassandra
  • Operating System: beyond Windows, there is Linux


Scalability
Auto scaling is possible with workload thresholds and maximum instance limits. Instances scale up to a RAM threshold of 445 GB.
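A threshold-based auto-scaling decision like the one described can be sketched as follows. This is an illustrative toy, not Azure's actual algorithm; the thresholds and instance counts are my own assumptions:

```python
def desired_instances(current, cpu_percent, scale_out_at=75, scale_in_at=25,
                      min_instances=1, max_instances=10):
    """Scale out above the high threshold, in below the low one, within limits."""
    if cpu_percent > scale_out_at:
        return min(current + 1, max_instances)   # add an instance, capped at max
    if cpu_percent < scale_in_at:
        return max(current - 1, min_instances)   # remove one, floored at min
    return current                               # inside the band: no change

print(desired_instances(3, 90))   # 4
print(desired_instances(3, 10))   # 2
print(desired_instances(10, 95))  # 10 (capped at the max instance limit)
```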

Cost Model
Online cost calculator is available at http://azure.microsoft.com/en-in/pricing/calculator/

Managed Services
Azure services are completely self-managed via dashboards, configured notifications, real-time monitoring, etc.
Services are offered in 3 major modes: Basic, Standard, Premium
ISO 27001 and Cloud Security Alliance compliant

IoT (Internet of Things) Adaptability
Using 5 emerging concepts: Event Hubs, Notification Hubs, Stream Analytics, Machine Learning, HDInsight

App Model
Web Apps & Mobile Apps are the current trend; Logic Apps & API Apps are the futuristic app models. Azure supports them all.

My recommendation is to read the book Microsoft Azure Essentials by Michael Collier & Robin Shahan. It's freely available at http://windowsitpro.com/azure/fundamentals-azure-microsoft-azure-essentials-free-microsoft-press-ebook

Wednesday, May 13, 2015

Visual Studio 2015 RC


Last week, I installed Visual Studio 2015 Release Candidate with the .NET 4.6 preview.  It's always fun to taste the tech food as an early bird.

Having been privileged to work on the .NET platform from the beginning, I can sense the power and vision of Visual Studio as the development platform for .NET-based applications, whether desktop, web, mobile, or cloud apps.  It's amazing to take in the journey of Visual Studio from 2003 to 2015.  What a difference it made!!!

Developers targeting today’s mobile and cloud platforms require greater levels of choice than ever before.  Whether you are targeting iOS, Android or Windows on the client, targeting Windows or Linux on the server, or using a wide variety of languages and frameworks, our goal is to deliver developer tools and services that support the breadth of today’s developer needs.

Visual Studio 2015 RC includes great new productivity and diagnostics features, tools for building Universal Windows applications, and cross-platform development support for building applications spanning Windows, Linux, iOS, and Android.  For developers targeting the Windows ecosystem, Visual Studio 2015 RC provides great tools for building apps and games for the Universal Windows Platform that run across all Windows 10 devices, including phones, tablets, PCs, Xbox, IoT and HoloLens.  With new UI debugging tools, an improved XAML designer, and enhanced profiling and debugging features, developing apps for Windows has never been easier.

Let me start swimming in this ocean and share the experience !!!

Thursday, May 7, 2015

Double Century


Exactly 5 years on, my weekly tech blog reaches its 200th artifact, nearing 55K views with 21 registered followers.  Thanks everyone for the continuous support as I ink my point of view.

In this special edition, I would like to share my recent interview experience with Amazon. I wish to be part of the top-notch product companies, if only for the work culture.  The wish list includes Google, Microsoft, Amazon, etc.

The shortlisting process itself was an immense learning experience for me.  After a complete scan of my profile, with its consistent industry contributions, I was shortlisted for 9 rounds of battle for a senior leadership position.  I heard the position had been open for more than a year because the right fit was missing. I was a little surprised and excited to start the tech war with the legend. My goal was to evaluate my skills against the industry giant; to be honest, just to pass the first round.

Rounds 1 & 2 were a deep technical d(r)ive of exactly an hour each, covering the end-to-end solution architecture of the systems with code-level details. After a couple of days, I got the big surprise that I had cleared the initial rounds; now I was a little scared too.  Oh my god!  On Tuesday, I was invited to a full-day mela for the next 7 rounds, if required.

The day came!!! A cab pickup was arranged to start the day at 9 AM. It was jam-packed with fun and learning through all the interview rounds until 7 PM.  I really admired seeing such an innovative and inspiring interview process for the first time. Amazon is so transparent, even to the candidate. Though I didn't clear it in the end, I'm really impressed not only by the interviews but also by the selection process.  The final due diligence with all interviewers is a decision call across all dimensions; even a single person can influence the result with their point of view on the firm's goals.

The majority of my results were outstanding, especially in technology; I just slipped on the financial leadership needed to execute the given L7 role. To be honest, I didn't expect the positive spots, but I agreed with my improvement area. In spite of the result, I have so many lessons from this process (at least a top 5) to share:

1. Deep Dive
Getting into the nitty-gritty of a solution enables you to analyze the problem at a deeper level.  Amazon expects a deep dive from the candidate.

2. Customer Focus
Basically, Amazon aspires to be Earth's most customer-centric company. I noticed this extends to every person.

3. No Deviation
I was not tweaked into a tech role, because the position in question required a variety of leadership hats.  No deviation; they stick with their focus/need.

4. Huge Expectation
Past is experience, present is experiment, future is expectation. Amazon expects experience in your experiments to meet their high expectations.

5. Rigid Standards
Positions are more important than persons.  Meaning, they are ready to wait for the right candidate until they find one; time doesn't matter.

This is dedicated to my buddy Raja, for stretching me to go beyond, through a friendship since childhood.  With a good amount of (l)earning experience, my passionate journey drives on with the 'Continuous Learning & Continuous Sharing' motto.