Thursday, December 27, 2018

.NET Core Journey


A couple of weeks ago, Microsoft announced .NET Core 3 Preview 1, the first public release of .NET Core 3. Let's trace the journey of .NET Core so far.

.NET Core 1
The .NET Core journey began a few years ago, with version 1 in 2016, with the goal of building the first version of .NET that was open source and cross-platform (Windows, macOS and Linux).

Version 1 shipped with new versions of ASP.NET and Entity Framework (EF) and primarily targeted Web applications.

.NET Core 2
While version 1 got .NET running on new platforms, it supported only a limited set of .NET APIs. In order to address this, we created .NET Standard, which specified the APIs that any .NET runtime must implement so that code and binaries can be shared across .NET platforms and versions.

With .NET Standard 2.0, more than 20,000 APIs were added to the .NET Standard spec. .NET Core 2 also continued the push to make .NET one of the fastest full-stack frameworks.
 
.NET Core 3.0
.NET Core 3.0 is the next major version of the .NET Core platform. It includes many exciting new features, such as support for Windows desktop applications with Windows Forms (WinForms), Windows Presentation Foundation (WPF) and Entity Framework 6. For Web development it adds support for building client-side Web applications with C# using Razor Components (formerly known as Blazor). And it includes support for C# 8.0 and .NET Standard 2.1.

.NET Core 3.0 will also fully support ML.NET, our open source machine learning framework built for .NET developers, along with support for Internet-of-Things (IoT) scenarios.

You can see complete details of the release in the .NET Core 3 Preview 1 release notes at https://github.com/dotnet/core/blob/master/release-notes/3.0/preview/3.0.0-preview1.md

Wednesday, December 19, 2018

Google Analytics working model


Google Analytics collects data the moment a visitor navigates to one of the web pages in a browser. Embedded within the web page content is a small block of JavaScript code referred to as the GATC (Google Analytics Tracking Code).

When the visitor's browser loads the web page, it runs the GATC, which performs the key analytics operations. First, it loads the Google Analytics master JavaScript file (ga.js) by downloading it from Google's servers. This code collects data about the visitor (browser type, version, screen resolution, page title, etc.) from the browser. Five cookies determine whether the visitor is a repeat visitor to the site, carrying the tracking information. Finally, it transmits the data to the data collection server, where it is persisted as log files at Google.

During the day, Google performs a first pass over these log files on an hourly basis, then runs another, more thorough reprocessing pass at the end of the day. The data sources are raw click data and aggregated summary data in Google's Bigtable database. End users can then view the resulting reports on the Google Analytics website.
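Conceptually, the collection step boils down to gathering visitor fields and transmitting them as a beacon. The real GATC is JavaScript; the sketch below is an illustrative Python analogue that assembles such a payload, with field names borrowed from Google's Measurement Protocol and placeholder values of my own:

```python
from urllib.parse import urlencode

def build_hit(property_id, client_id, page_url, page_title, resolution):
    """Assemble the query-string payload a tracking beacon might send.

    Field names follow Google's Measurement Protocol (tid, cid, dl, dt, sr);
    the real ga.js collects many more fields than this.
    """
    params = {
        "v": "1",            # protocol version
        "t": "pageview",     # hit type
        "tid": property_id,  # tracking/property ID
        "cid": client_id,    # anonymous client ID (persisted in a cookie)
        "dl": page_url,      # document location
        "dt": page_title,    # document title
        "sr": resolution,    # screen resolution
    }
    return urlencode(params)

payload = build_hit("UA-12345-1", "555", "https://example.com/", "Home", "1920x1080")
print(payload)
```

The browser sends a request carrying such a payload to Google's collection endpoint, which is where the log files described above originate.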

Sunday, December 16, 2018

Microsoft Digital IDentity


Over the last couple of years, Microsoft has invested in incubating a set of ideas for using blockchain and other distributed ledger technologies to create new types of digital identities—identities that are designed from the ground up to enhance personal privacy, security, and control.

Digital IDentity (DID) will be a first-class citizen of the Microsoft identity stack.

Vision
  • Every user needs a digital identity they own, that can securely and privately store all their personal data.
  • This self-owned identity must be intuitive and convenient to manage, and provide complete control over how identity data is accessed and used.

Business Benefits
  • Deeply engage with users while minimizing privacy and security risks
  • Transact with customers, partners, and suppliers over a unified data protocol
  • Improve transparency and auditability of your business operations

Microsoft's Whitepaper of DID is available at https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RE2DjfY

Saturday, December 8, 2018

DataStax Enterprise 6.7


This week, DSE 6.7 has been launched with multi-workload support for operational analytics, geospatial search, increased data protection in the cloud, better performance insights, Docker production support, and connectivity to Apache Kafka.

The top 5 improvements in DataStax Enterprise 6.7 include:
  1. Production-ready Kafka and Docker integration
  2. Easier, more scalable operational analytics for today’s cloud applications
  3. Simplified enterprise search for geospatial applications
  4. Improved data protection with smart cloud backup/restore support
  5. Improved performance diagnostics with new insights engine and third-party integration

DSE 6.7 and updated versions of OpsCenter, Studio, and DSE Drivers are available for download, as is updated documentation to guide installation and upgrading.


Sunday, December 2, 2018

AWS CEO at ReInvent 2018



The keynote from Andy Jassy, AWS CEO, included 20 new announcements during this week's AWS ReInvent 2018.

Available Now:
  1. Amazon FSx for Windows File Server – Fully-managed Windows file system on Windows native servers
  2. Amazon FSx for Lustre – Fully-managed file servers for compute-intensive workloads
  3. Amazon DynamoDB On-Demand – Flexible DynamoDB with no capacity planning necessary
  4. Amazon Elastic Inference – GPU-Powered Deep Learning Inference Acceleration
  5. Amazon SageMaker Ground Truth – Build accurate datasets and reduce costs at the same time
  6. AWS Marketplace for Machine Learning – ML algorithms and model packages are now available in the Marketplace
  7. Amazon SageMaker RL – Managed Reinforcement Learning with Amazon SageMaker
  8. Amazon Forecast – Time series using Amazon’s forecast algorithms

Coming Soon:
  1. Amazon S3 Glacier Deep Archive – New long-term data archival class for S3
  2. AWS Control Tower – Automate setting up a well-architected multi-account AWS environment
  3. Amazon Textract – Optical Character Recognition to extract data from most documents
  4. Amazon Personalize – Real-time personalization and recommendation
  5. AWS Outposts – Bringing AWS hardware and software on-premises
  6. Amazon RDS on VMware – Fully-managed service for on-premises databases
  7. Amazon Quantum Ledger Database – Fully-managed ledger databases
  8. AWS Managed Blockchain – Managed blockchain service supporting Hyperledger Fabric and Ethereum
  9. Amazon Timestream – Database service designed specifically for time-series data
  10. AWS Lake Formation – Fully-managed service will help you to build, secure and manage a data lake
  11. AWS Security Hub – Centrally view and manage security alerts and automate compliance checks
  12. AWS DeepRacer – Go hands-on with Reinforcement Learning with the DeepRacer

Saturday, November 10, 2018

Talend Stitch


Five years back, I came to know about Talend as an open source ETL product, in conjunction with Pentaho. It was part of an earlier assignment of mine to evaluate open source ETL products against Informatica. Talend has been expanding its business base with greater focus.

Yesterday, there was industry news that Talend is buying Stitch, a two-year-old Philadelphia-based spinoff of RJ Metrics, for $60 million in cash.

Stitch offers a cloud-based self-service offering that automates data ingestion pipelines into the cloud. It's an emerging space where the closest competitors are Alooma and Fivetran, but where Confluent and StreamSets also play. In its two years, Stitch has already built a customer base exceeding 1,000 customers.

Talend has not yet announced a closing date for the deal.

Tuesday, November 6, 2018

.NET Standard 2.1


.NET Core is the open source, cross-platform, and fast-moving version of .NET. Because of its side-by-side nature, it can take changes that we can’t risk applying back to .NET Framework. This means that .NET Core will get new APIs and language features over time that .NET Framework cannot. At Build we showed a demo of how the file APIs are faster on .NET Core. If we put those same changes into .NET Framework, we could break existing applications, and we don’t want to do that.

The .NET Standard specification is a standardized set of APIs. The specification is maintained by .NET implementors, specifically Microsoft (includes .NET Framework, .NET Core, and Mono) and Unity. A public feedback process is used as part of establishing new .NET Standard versions through GitHub.

Since Microsoft shipped .NET Standard 2.0 about a year ago, it’s time to update the standard to include some new concepts, as well as a number of small improvements that make your life easier across the various implementations of .NET.

In total, about 3k APIs are planned to be added in .NET Standard 2.1. A good chunk of them are brand-new APIs while others are existing APIs that we added to the standard in order to converge the .NET implementations even further.

Announcement details are available at https://github.com/dotnet/standard/blob/master/docs/planning/netstandard-2.1/README.md

Thursday, November 1, 2018

IBM RedHat

This Monday (29 Oct), IBM announced that it would pay a record $34 billion in cash and debt to acquire enterprise open source provider Red Hat.

Eclipsing Microsoft’s $26.2 billion acquisition of LinkedIn, this is the biggest software acquisition in history. It’s not the biggest tech acquisition ever, though, as that title belongs to Dell’s $67 billion buyout of data storage business EMC.

During the Red Hat Summit in May 2018, there was a press release about the long-standing relationship between IBM and Red Hat and the combined power of both companies’ technologies in private and public clouds. Ref: https://www.redhat.com/en/about/press-releases/ibm-and-red-hat-join-forces-accelerate-hybrid-cloud-adoption

Saturday, October 27, 2018

Microsoft GitHub


After the EU approved Microsoft’s acquisition of GitHub last week, Nat Friedman, CEO of GitHub, announced yesterday (26 Oct) that the company is now officially owned by Microsoft.

The Redmond giant spent $7.5 billion in stock to acquire the web-based hosting service earlier this year, but just like LinkedIn, GitHub will continue to operate independently as a company.

Back in June, the initial acquisition announcement was met with some hostility from developers, but that apparently didn’t impact GitHub’s business. As spotted by Geekwire, GitHub actually gained 3 million developers since June, and has now crossed 31 million users overall.

In line with its dogfooding strategy, Microsoft has 30 open source repositories on GitHub, with top products like Visual Studio Code, the .NET Core framework, TypeScript, etc. Ref: https://github.com/Microsoft

Sunday, October 21, 2018

Apache HBase on Amazon S3


By using Amazon S3 as a data store for Apache HBase, you can separate your cluster’s storage and compute nodes.

Many customers have taken advantage of the benefits of running Apache HBase on Amazon S3 for data storage. These benefits include lower costs, data durability, and more efficient scalability.
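Running HBase with S3 as its storage layer is, on Amazon EMR, largely a configuration change. A minimal sketch, assuming a hypothetical bucket name (the classification names and properties follow EMR's HBase-on-S3 configuration):

```python
# EMR configuration classifications for running Apache HBase with Amazon S3
# as its root directory. The bucket name below is a placeholder.
hbase_on_s3_config = [
    {
        # Tell EMR's HBase to use S3 rather than local HDFS for storage.
        "Classification": "hbase",
        "Properties": {"hbase.emr.storageMode": "s3"},
    },
    {
        # Point the HBase root directory at the S3 bucket.
        "Classification": "hbase-site",
        "Properties": {"hbase.rootdir": "s3://my-hbase-bucket/hbase-root"},
    },
]

# This list would be supplied as the Configurations parameter when launching
# the cluster, e.g. via boto3's EMR client:
#   emr.run_job_flow(..., Configurations=hbase_on_s3_config)
```

With this in place, the cluster's compute nodes can be resized or terminated independently of the data, which stays durable in S3.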

Customers such as the Financial Industry Regulatory Authority (FINRA) have lowered their costs by 60% by moving to an Apache HBase on Amazon S3 architecture. They have also experienced the operational benefits that come with decoupling storage from compute and using Amazon S3 as the storage layer.

AWS's whitepaper provides an overview of Apache HBase on Amazon S3 and guides data engineers and software developers through the migration of an on-premises or HDFS-backed Apache HBase cluster to Apache HBase on Amazon S3.

The whitepaper offers a migration plan that includes detailed steps for each stage of the migration, including data migration, performance tuning, and operational guidance.

Whitepaper Ref: https://d1.awsstatic.com/whitepapers/Migrating_to_Apache_Hbase_on_Amazon_S3_on_Amazon_EMR.pdf

Friday, October 12, 2018

CoCo



Microsoft's Coco Framework enables high-scale, confidential blockchain networks that meet all key enterprise requirements.

It provides a means to accelerate production enterprise adoption of blockchain technology, working across different underlying blockchain protocols.

Sunday, October 7, 2018

Neuton


Deep learning is the hottest technology today, with countless applications and deep investment from the usual suspects.  Deep learning neural networks are behind much of the progress in AI these days.

Neuton is a new framework that claims to be much faster and more compact, and to require less skill and training, than anything the AWSs, Googles, and Facebooks of the world have.

Bell Integrator says Neuton is an Auto ML solution with resulting models that are self-growing and learning. And, to top that off, says Bell Integrator, Neuton is so easy to use that no special AI background is needed.

Ref: https://neuton.ai/

Monday, August 27, 2018

Amazon QuickSight


This week, Amazon announced the availability of email reports and data labels in QuickSight.

With email reports in Amazon QuickSight, you can receive scheduled and one-off reports delivered directly to your email inbox. 

Using email reports, you have access to the latest information without logging in to your Amazon QuickSight accounts. You also get offline access to your data with email reports. 

For deeper analysis and exploration, you can easily click through from the email report to the interactive dashboard in Amazon QuickSight.

Wednesday, August 15, 2018

Blockchain of Transportation Mobility


Interesting read from IBM, on blockchain of Transportation Mobility.

Blockchain can be used to:
* Verify vehicle identity and vehicle history
* Track auto components through the supply chain
* Automate machine payments
* Establish a mobility commerce platform
* Facilitate car and ride sharing
* Support usage-based insurance and taxes

Ref: https://www.ibm.com/blogs/blockchain/2018/07/road-to-the-future-blockchain-for-transportation-mobility/

Wednesday, August 8, 2018

Amazon ES Cloudwatch


Last week, Amazon Elasticsearch Service (Amazon ES) announced support for publishing error logs to Amazon CloudWatch Logs. This new feature gives you the ability to capture error logs, so you can access information about errors and warnings raised during the operation of the service.

These details can be useful for troubleshooting. You can then use this information to work with your users to identify patterns that cause error or warning scenarios on your domain.

Access to the feature is enabled as soon as your domain is created.  You can turn the logs on and off at will, paying only for the CloudWatch charges based on their usage.

By enabling the error logs feature, you can gain more insight into issues with your Amazon ES domains and identify issues with domain configurations.  Additionally, you can also use the integration of CloudWatch Logs and Amazon ES to send application error logs to a different Amazon ES domain and monitor your domain’s performance.
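Enabling the log publishing described above boils down to one configuration structure. A minimal sketch, assuming hypothetical domain and log-group names (with boto3, this dict would be passed to `update_elasticsearch_domain_config`):

```python
# Log publishing settings for an Amazon ES domain. The log-group ARN and
# domain name below are placeholders.
log_publishing_options = {
    "ES_APPLICATION_LOGS": {  # the error/warning log category
        "CloudWatchLogsLogGroupArn": (
            "arn:aws:logs:us-east-1:123456789012:"
            "log-group:/aws/aes/domains/my-domain:*"
        ),
        "Enabled": True,
    }
}

# With boto3, this dict would be applied to the domain like so:
#   es = boto3.client("es")
#   es.update_elasticsearch_domain_config(
#       DomainName="my-domain",
#       LogPublishingOptions=log_publishing_options)
```

Setting `Enabled` to False later turns the publishing off again, which matches the "on and off at will" behaviour described above.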

Sunday, August 5, 2018

Kafka Summit 2018


Kafka Summit is the premier event for data architects, engineers, DevOps professionals, and developers who want to learn about streaming data. It brings the Apache Kafka community together to share best practices, write code, and discuss the future of streaming technologies.

This year's summit will be held on October 16–17 in San Francisco; attend if you can.  Ref:
https://kafka-summit.org/events/kafka-summit-san-francisco-2018/

Thursday, August 2, 2018

What is Blockchain


Blockchain is a buzzword in today's emerging technology industry.

By resolving the 5 W (What, Why, When, Where, Who) and 1 H (How) questions, any new technology can be learned seamlessly, in both breadth and depth. Today, I'm just inking the first question: What is blockchain?

Preface
This series is not a deep technical dive; instead, it addresses the fundamental understanding of this distributed technology and the business value it adds.

Coming from a financial industry background, I'll define four business concepts in alignment with blockchain technology.

  1. Asset
  2. Ledger
  3. Transaction
  4. Contract


Asset
Any tangible or intangible object capable of being owned or controlled, ultimately to produce value. Good examples are a house (tangible) and its mortgage (intangible).

Ledger
It is an accounting principle and practice to record the business activities meticulously.

Transaction
Any activity executed on the asset. Good examples are buying a new house or selling an existing one.

Contract
Any business condition required for a transaction to occur. A good example: on settling the mortgage amount, the house title is transferred to the owner.

Executive Summary
In a nutshell, blockchain is a decentralized public digital ledger of all transactions under the various business contracts of the underlying assets. Transaction details can't be tampered with, which builds trust in the data.
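The tamper-evidence claim can be illustrated with a toy, purely educational sketch of a hash-chained ledger (this is not any production blockchain protocol, and the transaction contents are invented to echo the asset/contract examples above):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents together with the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transaction):
    """Append a block whose hash covers the transaction and the prior hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"transaction": transaction, "prev_hash": prev}
    block["hash"] = block_hash({"transaction": transaction, "prev_hash": prev})
    chain.append(block)
    return chain

def is_valid(chain):
    """Recomputing every hash exposes any tampering with past transactions."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev_hash"] != prev:
            return False
        if block["hash"] != block_hash(
            {"transaction": block["transaction"], "prev_hash": prev}
        ):
            return False
    return True

chain = []
append_block(chain, {"asset": "house", "contract": "transfer title on settlement"})
append_block(chain, {"asset": "mortgage", "contract": "monthly payment"})
assert is_valid(chain)

chain[0]["transaction"]["asset"] = "boat"  # tamper with history...
assert not is_valid(chain)                 # ...and validation fails
```

Because each block's hash covers the previous block's hash, rewriting any historical transaction invalidates every block after it, which is the basis of the trust described above.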

Hope you can leverage this one-pager on "What is blockchain?"

Saturday, July 21, 2018

Bitcoin 3 million wallet


Blockchain began with a simple idea: to make Bitcoin transactions and the Bitcoin economy easier to understand and use.

The industry has experienced exponential growth since the first launch in 2013.

The industry expanded by a whopping 2,900%, from 100,000 to 3 million wallets.

Sunday, June 10, 2018

SAP Blockchain


This week, SAP made an official announcement that its new blockchain service will support Hyperledger Fabric and MultiChain, and will be built on top of the SAP HANA data management system.

Ref: https://www.sap.com/products/leonardo/blockchain.html

Sunday, May 13, 2018

Google Duplex

I'm so excited to share industry updates from an AI-packed week at the Microsoft Build and Google I/O annual tech conferences. This article is about Google's AI digital assistant, Duplex.

During my graduation in the early '90s, I took a paper on Artificial Intelligence. To be honest, I never dreamed I would see and feel, in my lifetime, the theoretical concepts studied in college. You know what? It's real now, and this disruptive technology has reached great heights.

During this week's Google I/O summit, Sundar Pichai demonstrated Google Duplex, which is designed to sound human, with enough human-like functionality to make mundane phone calls in the real world.

As Sundar said, Google's AI technology has come a very long way. The demo was pretty incredible; do watch it if you haven't.

This new AI-based digital assistant improves daily life by intelligently making simple, boring phone calls on your behalf. The demo showcased a revolution in the ability of computers to understand and generate natural speech, especially through the application of deep neural networks.

The Google Duplex technology is built to sound natural, to make the conversation experience comfortable. How is this technically possible?

Duplex is a recurrent neural network (RNN), built using TensorFlow Extended (TFX).  The network uses the output of Google’s automatic speech recognition (ASR) technology, as well as features from the audio, the history of the conversation, the parameters of the conversation and more.

Getting the data ready is quite tricky: the understanding model is trained separately for each task, but leverages a shared corpus across tasks. As a last step, hyperparameter optimization from TFX is used to improve the model further.

As artificial intelligence continues to improve, voice quality will improve and the AI will become better and faster at answering more and more types of questions. We’re obviously still a long way from creating a conscious AI.

Saturday, May 12, 2018

Google 2018 Summit


This week, AI "intellectual democratization" was showcased at the Google I/O Summit. Catch the highlights in a 14-minute snippet at https://youtu.be/BRUvbiWLwFI

VPS (Visual Positioning System) leverages disruptive AR (Augmented Reality) technology to go beyond GPS, in alignment with the Maps domain.

Sunday, March 18, 2018

ReturnTrucks


ReturnTrucks is an online platform connecting truck owners and load owners for an effective, timely and economic engagement.

It helps the load owners post their requirements and truck owners post their availability for free of cost.

Interesting industry news related to Transportation & Logistics domain at https://returntrucks.in/

Sunday, February 25, 2018

Redshift Spectrum


Redshift Spectrum lets you run SQL queries against data in an Amazon S3 data lake as easily as you analyze data stored in Amazon Redshift. It achieves this without loading data or resizing the Amazon Redshift cluster as data volumes grow.

Redshift Spectrum separates compute and storage to meet workload demands for data size, concurrency, and performance.  It scales processing across thousands of nodes, so results are fast, even with massive datasets and complex queries. It is possible to query open file formats that you already use—such as Apache Avro, CSV, Grok, ORC, Apache Parquet, RCFile, RegexSerDe, SequenceFile, TextFile, and TSV—directly in Amazon S3, without any data movement.
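As an illustration of how such S3-resident data is exposed, the DDL below is a hedged sketch held as Python strings; the schema, table, bucket, and IAM role names are all placeholders of my own, not from the post, and the Parquet layout is assumed:

```python
# Register an external schema backed by the AWS Glue/Athena data catalog.
create_external_schema = """
CREATE EXTERNAL SCHEMA spectrum_demo
FROM DATA CATALOG DATABASE 'demo_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
CREATE EXTERNAL DATABASE IF NOT EXISTS;
"""

# Define an external table over Parquet files sitting in S3; no data is
# loaded into the Redshift cluster.
create_external_table = """
CREATE EXTERNAL TABLE spectrum_demo.sales (
    sale_id   BIGINT,
    sale_date DATE,
    amount    DECIMAL(10,2)
)
STORED AS PARQUET
LOCATION 's3://demo-bucket/sales/';
"""

# Once defined, the S3-resident table is queried with ordinary SQL and can
# be joined against local Redshift tables.
query = "SELECT sale_date, SUM(amount) FROM spectrum_demo.sales GROUP BY sale_date;"
```

The point of the sketch is that the table definition only records metadata; the scan itself is pushed out to the Spectrum layer at query time.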

Top 3 performance features are:
  1. Short Query Acceleration - speed up execution of queries such as reports, dashboards, and interactive analysis
  2. Results Caching - deliver sub-second response times for queries that are repeated, such as dashboards, visualizations, and those from BI tools
  3. Late Materialization - reduce the amount of data scanned for queries with predicate filters by batching and factoring in the filtering of predicates before fetching data blocks in the next column
AWS Summit video at https://www.youtube.com/watch?v=gchd2sDhSuY

Sunday, February 18, 2018

MongoDB ACID Tackle


MongoDB is making the multi-document ACID transactions feature of 4.0 available for testing in a public beta.

With the most recent release improving the robustness of BI queries, database writes, schema validation, and Python support, the upcoming 4.0 release will tackle ACID.

Ref: https://www.mongodb.com/transactions?jmp=hero

Friday, February 9, 2018

Amazon Spheres


Amazon’s Spheres, the centerpiece of the retail juggernaut’s $4 billion urban campus, opened to employees at the end of January 2018.

Amazon has built an architecturally ambitious focal point and symbol of its status as Seattle’s largest employer and one of the most dynamic entities in corporate America.

The flora inside the domes is the work of a team of Amazon horticulturists who were charged with scouring the globe for interesting plants and growing them in a 40,000-square-foot greenhouse in Woodinville.

Four years of growth later, and supplemented by a few older plants acquired from other conservatories, the Spheres feel like a mature, if unusually well-manicured, jungle.  Back to nature, now !!

Wednesday, February 7, 2018

C# Nullable


Technology is fulfilled when it adds business value.

Nullable data types help represent optional business parameters. As an example, an app shouldn't store '0' in an optional 'manufacturing year' integer field; that leads to a wrong business context.

The current issue of MSDN Magazine illustrates the tech concept using C#.

Ref: https://msdn.microsoft.com/en-us/magazine/mt829270
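The same guard against a misleading sentinel can be sketched outside C# as well; below is a minimal Python analogue of the nullable idea (the function and field names are mine, not from the article):

```python
from typing import Optional

def vehicle_age(current_year: int, manufacturing_year: Optional[int]) -> Optional[int]:
    """Treat the optional field as 'unknown' rather than storing a fake 0.

    A sentinel like 0 would make a vehicle appear ~2000 years old; None
    keeps the 'not provided' business meaning intact, much like C#'s `int?`.
    """
    if manufacturing_year is None:
        return None
    return current_year - manufacturing_year

assert vehicle_age(2018, 2015) == 3      # known year: a real age
assert vehicle_age(2018, None) is None   # unknown year: stays unknown
```

In C#, the equivalent declaration would be a nullable value type (`int? manufacturingYear`), which the referenced article covers in depth.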

Sunday, February 4, 2018

Angular 3 year support


AngularJS is planning one more significant release, version 1.7, and on July 1, 2018 it will enter a 3-year Long Term Support period.

AngularJS is an extremely stable framework for building web applications, and has been used by millions of developers across the web. Angular is its successor and uses the same philosophies such as declarative templates and dependency injection. 

Many developers are still using AngularJS, and the migration process to Angular takes time and energy.

All AngularJS applications that work now, will continue to work in the future. All published versions of AngularJS, on npm, bower, CDNs, etc will continue to be available.

Saturday, January 27, 2018

Redshift ETL


In the ETL world, Amazon Redshift is revolutionary in simplifying the developer's life. It is used to calculate daily, weekly, and monthly aggregations, which are then unloaded to S3, where they can be further processed and made available for end-user reporting using a number of different tools, including Redshift Spectrum and Amazon Athena.

The proposed ETL process has 4 key steps to execute:

Step 1: Extract from the RDBMS source to an S3 bucket
In this ETL process, the data extract job fetches change data every hour and stages it into multiple hourly files.

Step 2: Stage data to the Amazon Redshift table for cleansing
Ingesting the data can be accomplished using a JSON-based manifest file. Using the manifest file ensures that S3 eventual-consistency issues are eliminated, and it also provides an opportunity to deduplicate files if needed.

Step 3: Transform data to create daily, weekly, and monthly datasets and load into target tables
Data is staged in the “stage_tbl” from where it can be transformed into the daily, weekly, and monthly aggregates and loaded into target tables.

Step 4: Unload the daily dataset to populate the S3 data lake bucket
The transformed results are then unloaded into another S3 bucket, where they can be further processed and made available for end-user reporting using a number of different tools, including Redshift Spectrum and Amazon Athena.
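The manifest mentioned in Step 2 is just a small JSON document. A minimal sketch, with placeholder bucket and key names (the entries/mandatory layout follows Redshift's COPY manifest format):

```python
import json

# A COPY manifest listing the hourly change files for one load window.
# Bucket and key names below are placeholders.
manifest = {
    "entries": [
        {"url": "s3://etl-bucket/changes/2018-01-27-00.gz", "mandatory": True},
        {"url": "s3://etl-bucket/changes/2018-01-27-01.gz", "mandatory": True},
    ]
}
manifest_json = json.dumps(manifest, indent=2)

# After uploading the manifest to S3, the COPY command references it
# instead of a key prefix, so only the listed files are loaded:
#   COPY stage_tbl
#   FROM 's3://etl-bucket/manifests/2018-01-27.manifest'
#   IAM_ROLE '...' GZIP MANIFEST;
```

Marking each entry `mandatory` makes the load fail fast if a listed hourly file is missing, rather than silently loading a partial window.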

Monday, January 22, 2018

India Economics 2018


Over the last three years, Indian politics has held differing perspectives on the current Prime Minister, Mr. Narendra Modi. Set aside the criticism of his initiatives: in a family of four, we tend to have differences of opinion; now think about 1.4 billion people with different cultures, languages, religions, etc.

As an individual from my motherland, I find he inspires fellow citizens that everything is possible in life, from poor boy to Prime Minister of the world's largest democracy.

As an industry leader, I have two dimensions along which to review his efforts (technically, not politically), namely leadership skill and business domain.

(1) Leadership Skill
As Warren Bennis says, "Leadership is the capacity to translate vision into reality." The Prime Minister demonstrated strong leadership with the courageous execution of a few long-pending initiatives like the Goods and Services Tax (GST), demonetization, the unique identity (Aadhaar) rollout, citizens' welfare through health policy, digital innovation, etc.

(2) Business Domain
Indian Prime Minister prepares this week to address global business and political leaders in Davos, Switzerland, as his country passes France and the U.K. to become the world’s fifth-largest economy, underscoring the South Asian nation’s drive for recognition as a great power.

Some economists calculate that India’s gross domestic product jumped into the top five last quarter as it continued to outgrow every country in Europe—and for that matter most of the rest of the world. It has been reflected in Wall Street Journal (WSJ).
Ref: https://www.wsj.com/articles/davos-offers-modi-stage-to-push-muscular-vision-for-india-1516552979

Big salute to all the contributors.  Jai Hind !

Saturday, January 20, 2018

AD integration with EMR

Active Directory (AD) is a directory service that Microsoft developed for Windows domain networks. It is included in most Windows Server operating systems as a set of processes and services. In recent times, many enterprises use Microsoft Active Directory to manage users, groups, and computers in a network.

This article is about the seamless integration of Active Directory on Amazon EMR with the same single sign-on (SSO) experience.

Ref: https://aws.amazon.com/blogs/big-data/use-kerberos-authentication-to-integrate-amazon-emr-with-microsoft-active-directory/

The ability to authenticate users and services with Kerberos not only allows you to secure your big data applications, but it also enables you to easily integrate Amazon EMR clusters with an Active Directory environment.  It is also possible to use AWS CloudFormation to automate the deployment of this solution.

Sunday, January 14, 2018

Effective vs Efficient

In the back of my mind, a question keeps popping up about the difference between effective and efficient. A couple of management books and a few blogs enlightened me, and I'm sharing that knowledge over the Pongal holidays.

Definition

From the fundamentals of computing, three stages are vital in any system. They are
  1. Input
  2. Process
  3. Output
In this context, efficiency is focused on lower input and higher output, with a core processing focus on 'quantity'. For example, a business is efficient if it delivers a high-quantity output with fewer resources. Assume a building is normally constructed in 6 months by 100 construction professionals. To be more efficient, the business is expected to complete the same task in less time (say 3 months) with fewer people (50). Each input has been cut by 50%, so the project is claimed to be 50% more efficient.
Coming to the effectiveness factor, there is a correlation between the input and output parameters. It relates to the fine-tuned processing of doing the right things. In turn, effectiveness has a 'quality' focus rather than a 'quantity' focus.

Expert Opinion

In essence, management guru Peter Drucker describes it as: "Efficiency is doing things right; effectiveness is doing the right things."

Mathematical Mode

With a mathematical derivation, the matrix is represented in a 2 x 2 mode: the most effective approach succeeds, but at a high cost, while the most efficient approach fails due to cost control. The optimum solution lies at the balance between effective and efficient.

Management Mode

Let us move into business management mode. The objective captures the key goals in the business context, and Return on Investment (RoI) and cost are the vital parameters driving the effective-and-efficient model in business management.
A highly effective but inefficient mode pursues the goal at high cost, so it yields high RoI at high cost. A highly efficient but ineffective mode delivers lower production with a low-cost model, and so yields lower RoI at lower cost.
Ideally, business management targets high RoI and low cost using a model that is both highly effective and highly efficient. Strive to thrive is the success mantra!

Conclusion

From an industry leader's perspective, the top 5 summary points are:
  1. Effective and efficient are useful tools to leverage concurrently and in an interrelated way
  2. Both are performance indicators for getting things done on time
  3. They motivate you to share transparent customer feedback with your team
  4. They engage the team and foster a sense of belonging
  5. In turn, they foster a positive work environment for better business results
As Ron Kaufman said, "First be effective and then be efficient."

Monday, January 1, 2018

AWS Digital Training


AWS Training and Certification recently released free digital training courses that will make it easier for you to build your cloud skills and learn about using AWS Big Data services. This training includes courses like Introduction to Amazon EMR and Introduction to Amazon Athena.

You can get free and unlimited access to more than 100 new digital training courses built by AWS experts at aws.training. It’s easy to access training related to big data. Just choose the Analytics category on our Find Training page to browse through the list of courses. You can also use the keyword filter to search for training for specific AWS offerings.

Reference link: https://www.aws.training/

Recommended training
Just getting started, or looking to learn about a new service? Check out the following digital training courses:

Introduction to Amazon EMR (15 minutes)
Covers the available tools that can be used with Amazon EMR and the process of creating a cluster. It includes a demonstration of how to create an EMR cluster.

Introduction to Amazon Athena (10 minutes)
Introduces the Amazon Athena service along with an overview of its operating environment. It covers the basic steps in implementing Athena and provides a brief demonstration.

Introduction to Amazon QuickSight (10 minutes)
Discusses the benefits of using Amazon QuickSight and how the service works. It also includes a demonstration so that you can see Amazon QuickSight in action.

Introduction to Amazon Redshift (10 minutes)
Walks you through Amazon Redshift and its core features and capabilities. It also includes a quick overview of relevant use cases and a short demonstration.

Introduction to AWS Lambda (10 minutes)
Discusses the rationale for using AWS Lambda, how the service works, and how you can get started using it.

Introduction to Amazon Kinesis Analytics (10 minutes)
Discusses how Amazon Kinesis Analytics collects, processes, and analyzes streaming data in real time. It discusses how to use and monitor the service and explores some use cases.

Introduction to Amazon Kinesis Streams (15 minutes)
Covers how Amazon Kinesis Streams is used to collect, process, and analyze real-time streaming data to create valuable insights.

Introduction to AWS IoT (10 minutes)
Describes how the AWS Internet of Things (IoT) communication architecture works, and the components that make up AWS IoT. It discusses how AWS IoT works with other AWS services and reviews a case study.

Introduction to AWS Data Pipeline (10 minutes)
Covers components like tasks, task runner, and pipeline. It also discusses what a pipeline definition is, and reviews the AWS services that are compatible with AWS Data Pipeline.