Madhava Reddy G.


Eden Prairie, MN

Summary

Results-oriented professional with 14 years of experience in Hadoop, Apache Spark, Java, Alteryx, and SQL. Strong expertise in Hadoop and its ecosystem components, including MapReduce, Hive, Pig, Sqoop, NoSQL databases, Storm, and Kafka. Currently designing and implementing the Operational Intelligence (OPSI) Risk Analytics application for Optum in the US health care domain, responsible for designing and developing Sqoop and Hive jobs for data ingestion on the Big Data platform. Developed Oozie workflows to integrate and automate application components, and used Alteryx for syntactic and QA checks. Extensive experience with the Apache Spark ecosystem, including Spark SQL and Streaming. Familiar with NoSQL databases such as MongoDB, Cassandra, and HBase. Skilled at understanding business requirements and assessing the feasibility of integrating other systems with Hadoop to provide effective solutions aligned with organizational goals. Proficient in promoting code to production environments with a strong emphasis on quality. Adaptable to new technologies and able to scale up quickly.

Overview

14
years of professional experience
1
Certification

Work History

Data Analytics Manager

Optum Global Solutions
02.2017 - Current
  • OPSI (Operations Intelligence): The OPSI analytics platform is a unified computation engine from Optum RQNS (Risk, Quality and Network Solutions) for payers, health plans, providers, and other downstream applications. The platform orchestrates and computes quality and risk gaps, metrics, and indicators, and provides automated operational assistance to additional services (care managers, medical records) through one configurable platform with data quality checks, deployment, and security.
  • Roles & Responsibilities:
  • Involved in member claims processing on the Big Data analytics platform.
  • Processed the RAPS data shared by CMS (Centers for Medicare & Medicaid Services).
  • Calculated members' risk scores, classified members into buckets based on their risk scores, and shared the results with providers.
  • Performed data cleansing of results (pre- and post-processing).
  • Participated in data ingestion from RDBMS sources to Big Data platforms.
  • Built workflows for orchestration using Oozie and Alteryx.
  • Created the rollup module for member gaps using PySpark.
  • Created the Sqoop outbound process to store results in an RDBMS.
  • Developed Apache Pig scripts for data transformations and statistics generation.
  • Handled production deployments, Ops team support, and bug fixes; worked on UAT and E2E testing with 7+ downstream teams.
  • Participated in project strategic planning and created the delivery plan.
  • Built a robust, configurable, and generic QA layer using Alteryx Designer.
  • Automated the complete workflow using Hive, HBase, and the Alteryx scheduler.
  • Worked with business stakeholders to understand business requirements and problem areas, then developed solutions to business problems.
  • Demonstrated the team's solutions to business stakeholders, showing how business problems can be solved and process efficiencies improved.
  • Reviewed project deliverables for performance and quality improvements.
  • Built POCs for new initiatives, designed and finalized the architecture and road map for project deliveries, and provided technical and functional training for team members.
  • Skills: Hadoop ecosystem, Apache Spark, PySpark, Sqoop, Alteryx, HBase, Azure.
  • Streamlined data collection methods, improving the quality and reliability of gathered information.
  • Managed a team of data analysts, fostering professional growth and ensuring high-quality output.
  • Led process improvement initiatives by analyzing workflow inefficiencies and proposing actionable solutions.
  • Mentored junior analysts as they developed their skills in various aspects of data analytics management techniques.
  • Optimized data analysis processes by implementing advanced analytical tools and software.
  • Enhanced data-driven decision-making for company leadership through comprehensive reporting and visualization.
  • Designed and implemented customized dashboards that provided real-time insights into business performance.
  • Analyzed large amounts of data to identify trends and find patterns, signals and hidden stories within data.
  • Created and automated data visualizations to present insights and tell compelling stories.

Big Data Consultant

Karvy Analytics
09.2015 - 02.2017
  • Roles & Responsibilities:
  • Provided substitute, cost-effective, and robust solutions using Big Data ecosystem tools.
  • Built data pipelines for real-time and batch processing.
  • Worked with the pre-sales team to design POCs for prospects and mentored the team on Big Data.
  • Developed utilities for raw data cleansing.
  • Helped identify the technology stack for Big Data solutions.

Associate

Cognizant Technology Solutions
11.2014 - 09.2015
  • Project title: Building a data lake for reporting using MongoDB.
  • Environment: MongoDB and Informatica.
  • Duration: Nov 2014 – Jun 2015.
  • Roles & Responsibilities:
  • Analyzed the existing SQL Server system and the MongoDB assessment server.
  • Created logical data models for both the staging and reporting databases in MongoDB.
  • Developed physical data models for the staging database and helped the Informatica team load source data into them.
  • Prepared and executed aggregation and transformation logic to generate report data and inserted it into the physical data models of the reporting database.
  • Performed unit testing to validate reports generated from data inserted into the reporting engine database.

Sr. Engineer

Happiest Minds
09.2013 - 10.2014
  • Project title: Pearson Reporting Engine
  • Environment: Hadoop and other Big Data tools.
  • Duration: Sep 2013 – Jun 2014
  • Project description: Pearson Education is the world's largest online education company, offering books and resources that help students learn, teachers teach, and professionals evolve throughout their careers. Its carefully designed learning tools help people around the world expand their knowledge and develop their skills. We developed a platform to process historical data from its users (students, professionals, and teachers) and make better decisions to improve the learning system and its methodologies.
  • Skills: Apache Hadoop, Kafka, Storm, MongoDB, Camus, and JSON/Avro file formats.
  • Roles & Responsibilities:
  • Involved in requirement gathering, preparation of design document, configuration activities, and customizations.
  • Implemented utilities in Kafka to consume data from source systems.
  • Built Storm topologies to consume and validate the JSON data.
  • Involved in testing each component and its flow in architecture.
  • Implemented test cases using JUnit.
  • Involved in preparing test reports.
  • Performed end-to-end deployment of the developed components.

Software Engineer

Tech Mahindra
12.2010 - 09.2013
  • Business credit card e-mail statement data generation using Hadoop.
  • National Australia Bank (NAB) is one of the four largest financial institutions in Australia in terms of market capitalization and customers. It offers account holders the option of receiving account statements via e-mail.
  • Skills: Apache Hadoop, Pig, Hive, Sqoop.
  • Roles & Responsibilities:
  • Assessed the technical feasibility of the solution along with functional team members.
  • Involved in requirement gathering, preparation of design document, configuration activities, and customizations.
  • Developed MapReduce, Hive, and Pig scripts to process data.

Education

Master of Science - Computer Applications

Jawaharlal Nehru Technological University
Hyderabad, India
01.2010

Skills

  • Hadoop ecosystem, Apache Spark, PySpark, Sqoop, Alteryx, HBase, Azure
  • Data analytics expertise
  • Strategic insights generation
  • Python development
  • Managing multiple priorities
  • Effective organizational skills
  • Project leadership
  • Project strategy development
  • Effective project execution management
  • Data quality management

Certification

Cloudera Certified Developer For Apache Hadoop (CCDH-310)
