Resources

Case Studies

About the client

Sanofi is one of the largest European-headquartered pharmaceutical companies.

Business Challenges

The client lacked visibility of consolidated payments made to doctors, because doctor data was duplicated across different CRM and data-originating systems. A doctor can be associated with multiple hospitals and clinics, and the same doctor is seen by different sales reps, each of whom created a new doctor ID, thereby duplicating the data. As a result, several operational staff were engaged in manually validating and de-duplicating the data, which increased the cost of operations, increased risk, and reduced efficiency. In addition, the client had to comply with the Transparency Act.

Solution

Data was extracted from Sanofi’s source systems, including CRM, and then validated, cleansed, deduplicated, matched, and loaded into a master data hub called the Doctors Hub. This hub contains the golden record for each doctor and their associations with hospitals. A master data governance process was established using a workflow model to prevent data deterioration and ensure that the systems remained duplicate-free. The Doctors Hub delivered a single version of the truth and enabled accurate information sharing across the enterprise.
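The actual validate–match–merge processing was done inside IBM MDM, but the idea behind building a golden record can be sketched in a few lines. The following is a simplified Python illustration, not the product's logic; the field names (`registration_no`, `hospital`, `source`) and the name-similarity threshold are hypothetical:

```python
from difflib import SequenceMatcher

def normalize(rec):
    """Standardize a raw doctor record for matching (hypothetical fields)."""
    return {
        "name": " ".join(rec["name"].lower().split()),
        "registration_no": rec.get("registration_no", "").replace("-", ""),
        "hospital": rec.get("hospital", "").lower().strip(),
        "source": rec.get("source", ""),
    }

def is_match(a, b, threshold=0.9):
    """Two records match if registration numbers agree, or names are near-identical."""
    if a["registration_no"] and a["registration_no"] == b["registration_no"]:
        return True
    return SequenceMatcher(None, a["name"], b["name"]).ratio() >= threshold

def build_golden_records(raw_records):
    """Cluster duplicates into one golden record per doctor, accumulating
    all hospital associations and contributing source systems."""
    golden = []
    for rec in map(normalize, raw_records):
        for g in golden:
            if is_match(g, rec):
                g["hospitals"].add(rec["hospital"])
                g["sources"].add(rec["source"])
                break
        else:
            golden.append({**rec,
                           "hospitals": {rec["hospital"]},
                           "sources": {rec["source"]}})
    return golden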

Benefits

  • Deduplication of data
  • Spend Analysis
  • Doctor Hub
  • Compliance
  • Information Governance

Technology

IBM MDM SE (Standard Edition) on-prem, IBM Information Server, IBM Db2.

About the client

One of the leading payment solutions providers in India.

Business Challenges

The client had undertaken an initiative to implement a Cloudera-based data lake and was facing huge challenges acquiring data from its mainframe applications running on z/OS, as well as from dozens of satellite applications running on Oracle. z/OS, being proprietary in nature, did not allow easy access to data. The client needed a sophisticated solution to extract data in real time, not only from the mainframes but also from the various satellite applications, and store it in Cloudera. The client also wanted to leverage native Hadoop capabilities for business intelligence reporting as well as for predictive analytics.

Solution

IBM CDC was used to extract changed data from the mainframes as well as from the Oracle RDBMS. These changes were fed to a Kafka cluster and then posted to Cloudera. IBM BigIntegrate jobs processed the data on Cloudera to generate the output data needed for business intelligence and analytics. As part of the engagement, the governance catalog was also configured to track technical assets and provide end-to-end lineage.
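CDC tooling of this kind emits one change event per source operation (insert, update, delete), and the downstream consumer applies those events to the target in order. As a rough illustration only, here is a minimal Python sketch of replaying such events from a stream of JSON messages; the event schema (`op`, `key`, `row`) is invented for this example and is not IBM CDC's actual wire format:

```python
import json

def apply_change_event(target, event):
    """Apply one CDC-style change event (hypothetical schema) to an
    in-memory target table keyed by primary key."""
    op, key, row = event["op"], event["key"], event.get("row")
    if op == "I":                      # insert a new row
        target[key] = row
    elif op == "U":                    # update: merge changed columns
        target.setdefault(key, {}).update(row)
    elif op == "D":                    # delete the row if present
        target.pop(key, None)
    return target

def replay(messages):
    """Replay a stream of JSON-encoded change events, as a Kafka consumer
    feeding the data lake might, and return the final table state."""
    target = {}
    for msg in messages:
        apply_change_event(target, json.loads(msg))
    return target
```

Applying the events in commit order is what keeps the replica consistent with the source; in the real deployment, Kafka partitioning by key preserves that ordering per record.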

Benefits

Real-time replication of data from various source systems to Cloudera (the data lake).

Leveraged native Hadoop capabilities through IBM products to transform data and speed up processing of terabytes of data.

Enabled day-to-day operational reporting.

Enabled Analytics: supported downstream analytics users in building models with near real-time data.

Technology

IBM CDC, IBM BigIntegrate (DataStage, QualityStage, Governance Catalog).

About the client

A top mid-market general insurance company in India providing health, motor, and commercial insurance.

Business Challenges

With siloed applications and no central data repository, business users found it difficult to generate their day-to-day reports and had to perform considerable manual work to produce key insurance-specific KPIs. Actuarial users found it difficult to consolidate data and build their models.

Solution

Data from the core insurance applications running on AS/400 was replicated in real time, using IBM’s CDC technology, into an Operational Data Store (ODS). IBM’s data integration tooling was used to process and transform the input data and then populate the star schema defined in the IBM Db2 warehouse. Key reports and KPIs were developed in Cognos. Real-time availability of data in a central place allowed the actuarial users to build and test their models with very little effort.
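A star schema separates descriptive attributes (dimensions) from measurable events (facts) joined by surrogate keys. The real load was done in IBM DataStage; as a simplified illustration of the dimension/fact split, here is a Python sketch with hypothetical column names (`policyholder_id`, `claim_amount`, `claim_date`):

```python
def load_star_schema(source_rows):
    """Split flat ODS rows into a policyholder dimension and a claims fact
    table (hypothetical columns), assigning surrogate keys to dimensions."""
    surrogate_keys = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for row in source_rows:
        nk = row["policyholder_id"]
        if nk not in surrogate_keys:
            sk = len(surrogate_keys) + 1        # next surrogate key
            surrogate_keys[nk] = sk
            dim_rows.append({"sk": sk, "policyholder_id": nk,
                             "name": row["name"]})
        fact_rows.append({
            "policyholder_sk": surrogate_keys[nk],   # FK to dimension
            "claim_amount": row["claim_amount"],
            "claim_date": row["claim_date"],
        })
    return dim_rows, fact_rows
```

Each policyholder appears once in the dimension however many claims they file, which is what makes KPI queries (claims per policyholder, loss ratio by segment) cheap to express.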

Benefits

Real-time availability of data from various source systems in the ODS, with implementation of purpose-built data marts.

Revenue Growth:

  • Improved tracking of key KPIs such as revenue per policyholder, average cost per claim, loss ratio, and underwriting speed.
  • Implemented key predictive models, namely the persistency report, customer churn, campaign analytics, and analytics to predict surrender.
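The KPIs above have conventional definitions; for example, loss ratio is incurred claims divided by earned premiums. A small sketch using the standard formulas (the numbers below are illustrative, not client figures):

```python
def insurance_kpis(earned_premium, incurred_claims, n_policyholders, n_claims):
    """Compute three of the KPIs named above, using their standard
    textbook definitions rather than any client-specific variant."""
    return {
        # Claims paid out per unit of premium earned
        "loss_ratio": incurred_claims / earned_premium,
        # Average premium income per policyholder
        "revenue_per_policyholder": earned_premium / n_policyholders,
        # Average payout per settled claim
        "average_cost_per_claim": incurred_claims / n_claims,
    }
```

With real-time data in the ODS, these ratios can be recomputed continuously rather than at month-end.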

Business users were able to generate their own reports.

Reduced IT dependency.

Technology

IBM CDC, IBM DataStage, IBM Db2 Warehouse, IBM Cognos Analytics.

About the client

A leading retailer and manufacturer in India known for leading brands in eyecare, jewelry, fashion clothing, and perfumes. The client has a pan-India presence and leverages analytics for critical decision-making.

Business Challenges

The client wanted a single vendor to manage all of their analytics needs. They had the complete IBM Analytics stack at the core of their data management and data integration operations. They needed agility in delivering to business needs, as their business was growing exponentially, and they also wanted to migrate to the AWS stack for scalability.

Solution

We provided a team of qualified resources who acted as an extended arm of the IT department, took over from the existing vendor, and helped stabilize the existing deployment. We managed the administration of the following components:

1. IBM Information Server (on-prem and AWS Cloud):
a) Patching
b) Monitoring
c) Performance improvement
d) Housekeeping
e) Helping in the backup and recovery process

2. HortonWorks:

a) Day to Day administration
b) User Management
c) HBase maintenance
d) Hadoop Services maintenance

3. IBM Netezza:

a) Managing User Creation and access
b) Creation of New Tables / Modification of Tables
c) View Creation
d) Performance Tuning of Queries and ensuring proper table design as per NZ best practices
e) Regular Housekeeping
f) Migrating data to AWS

  • L2 and L3 support
  • Execution of BAU jobs
  • Establishing and implementing a comprehensive framework for exception handling and AWS Redshift data uploads
  • Fine-tuning of jobs to run within their stipulated times
  • Catering to new ETL development needs across various lines of business for reporting purposes (designing new tables, writing ETL routines to populate data)
  • Troubleshooting and fixing issues, and coordinating with IBM on PMRs raised

Benefits

Improved reporting across lines of business, helping meet reporting SLAs.

Improved automation of jobs, reduced manual intervention, optimum utilization of IT resources.

Technology

IBM Information Server, IBM Netezza, HortonWorks, Tableau, AWS Redshift, AWS S3, AWS DynamoDB.

News and Events

Blogs

More Than Meets the AI

The future of artificial intelligence is becoming increasingly uncertain as it shifts from hype to scepticism, along with growing concerns about its use.

Dr Ganesh Natarajan

Artificial Intelligence seems to have crossed the Gartner peak of inflated expectations and is sliding down the trough of disillusionment for many investors and corporate users.

Leaders are concerned that Gen AI is nothing more than a good and speedy summariser of information that can hasten the process of garnering knowledge and forming an opinion, but does not provide value beyond that. For investors, whose feeding frenzy took Nvidia to a market value of over $3 trillion, with $2.5 trillion of that increase happening in the past two years, and who are investing in OpenAI at an expected valuation north of $150 billion, the threat of the world pivoting away from massive processing and large language models in search of alternate business models is very real. Are we headed for a major crash in AI, or is there a slope of enlightened growth to follow?

REDUCING ENERGY COST

A recent issue of The Economist argues that the huge monies flowing into early AI leaders like Nvidia and OpenAI and the $100 billion of angel and VC money poured into AI start-ups may dry up if the present big concerns about energy guzzling by AI are not addressed. The stark reality is that the world cannot afford the energy cost to power the Large Language Models (LLMs) that will be needed in the current process of training AI to be a partner and in some cases a replacement for human effort. The large GPUs and Nvidia chip models of today could also be wrecked by this reality.

There are workarounds to this problem: new specialist AI chip companies may emerge to compete with Nvidia, and LLMs could give way to, or be supplemented by, narrower AI systems that choose between reasoning and text generation as their core focus. The computation costs that are today excessive and, in many cases, comparable with LLM training costs will have to be reduced by a constellation of AI models working in concert rather than by mega LLMs.

Nimble countries like China will see this as an opportunity to find new ways to AI success. In fact, the denial of access to new chip production to other economies by the US may hasten the process of innovation in China and even in countries like India and parts of Europe, which have been AI laggards. They may find it worthwhile to explore new models and provide easier funding conduits to less aggressive PEs that worry about burning their fingers with today’s rampaging hectocorns like Nvidia and OpenAI.

IN THE WRONG HANDS 

Celebrity author Yuval Noah Harari, in his recent talk in New York and his recent book Nexus, has also warned that the propensity of large AI to centralise data and provide instant analysis of the behaviour and intent of every citizen in a country can play into the hands of dictatorial regimes. He makes the point that the much-feared KGB in the Soviet Union, and possibly in Vladimir Putin’s Russia, and the hated regimes of Mussolini, Stalin and Hitler would have benefited greatly from tools that could have helped them identify potential dissenters and eliminate them even before the thoughts and actions were fully crystallised in the minds of the perpetrators.

Harari also cautions about rogue algorithms created by some of the internet giants, and mentions Myanmar as a case in point, where Facebook algorithms sensed that hate posts against the Rohingyas got more attention and hence chose to propagate them much faster than the healing voices. He argues that an algorithm designed with wrong or inadequate rules has the potential to drive behaviour and initiate actions that may not have been envisaged by its creators themselves.

While we exult at the capabilities of tech firms to push the frontiers of AI and look forward to the era of Artificial General Intelligence (AGI) and Artificial Super Intelligence (ASI), we need to be wary of both the environmental consequences and the threat of AI moving humans out of the loop and taking over the world.

Let us be conscious about how we push our businesses and our lives towards a future filled with AI agents. There is no doubt that we live in interesting times!

 

The author is chairman of 5F World, GTT Data Solutions and Honeywell Automation India Limited. Views expressed are personal.