
Govindaraj Sanjeevi

Austin, TX

Summary

A highly motivated, goal-oriented, and collaborative data professional with experience in Data Engineering, Data Analytics, Data Warehousing, and Business Intelligence. Passionate about applying quantitative techniques, programming, and analytical skills to support strategic decision-making for stakeholders at all levels. Excellent analytical, problem-solving, and documentation skills. Team player with strong interpersonal and communication skills.

Overview

13 years of professional experience
1 Certification

Work History

Manager/Senior Data Engineer

Wayfair
09.2021 - Current
  • Spearheaded efforts to migrate legacy systems onto cloud-based platforms, resulting in improved scalability and cost efficiency.
  • Increased team productivity by implementing streamlined processes and effective communication strategies.
  • Reduced operational costs through comprehensive process improvement initiatives and resource management.
  • Mentored junior team members for career advancement, fostering a pipeline of future leaders within the organization.
  • Delivered multiple concurrent projects within established timeframes.
  • Reengineered existing ETL workflows to improve performance by identifying bottlenecks and optimizing code accordingly.
  • Ingested data from a variety of sources into the foundational layer and curated it for downstream consumption.
  • Supported user consumption by creating GraphQL endpoints and publishing messages to Kafka topics.
  • Built CDC components and automation scripts to migrate existing jobs from the internal scheduler to Airflow.

Data Analytics Consultant

Nationwide Insurance
10.2019 - 09.2021
  • Built a Unified Claims Analytical System by consolidating data from various sources into a single system (Bronze, Silver, and Gold layers) in Delta Lake, then loading it into Snowflake for consumption.
  • Created Apache Spark pipelines to ingest data from various sources, optimizing the code to handle the massive historical volume dating back to the 1970s.
  • Built fact tables of over 20 billion rows to support machine learning capabilities.
  • Built a Claim Events table consolidating every stage a claim can pass through into a single table.
  • Trained team members on best practices in data management and analytics methodologies, promoting a culture of continuous learning and improvement.

Data Engineer - Big Data

Lbrands
03.2018 - 10.2021
  • Migrated all data warehouse and business intelligence reports from the Teradata warehouse to a MapR Hadoop cluster.
  • Automated the ingestion process, moving all data from Teradata to the Hadoop Distributed File System.
  • Created Hive Query Language (HiveQL) scripts to populate the data warehouse tables.
  • Merged the data into a semantic layer (HBase, a NoSQL database), where other sources also load data, enabling the reporting team to present it to business users.
  • The business objective of the project was a centralized data repository that supports users' business development decision-making with better performance.

Consultant

Nationwide Insurance
06.2016 - 03.2018
  • Created Informatica mappings to map XML data arriving from a JMS (Java Message Service) queue into Teradata tables for the underwriting team's analytics.
  • Created Informatica mappings to produce datasets in XML format and send them to a third-party company (Ausum) for analytics, within the requested 5 MB file size.
  • Contributed to all stages of the project to build the pipeline end to end.

Associate

Blue Cross Blue Shield of Minnesota
02.2015 - 06.2016
  • Built datasets to calculate Member Out-of-Pocket expenses, supporting decision-making for health plan members.
  • Performed business analysis, requirements gathering, and project architecture.
  • Coordinated with the offshore team to develop the project.
  • Built several Informatica mappings and WhereScape RED scripts to automate the process.
  • Ensured quality through technical guidance, code reviews, and mentoring of junior developers.

Associate

Blue Cross Blue Shield of Minnesota
02.2011 - 01.2015
  • Migrated the existing mainframe DB2 data architecture and allied components (including JCL and COBOL scripts) to an improved Teradata-based data warehousing model.
  • Analyzed the existing processes and scripts and designed enhancements.
  • Built several Informatica mappings, WhereScape RED scripts, and SAS scripts to automate the process.
  • Ensured quality through technical guidance, code reviews, and mentoring of junior developers.

Education

Bachelor of Engineering - Computer Science And Engineering

Magna College of Engineering
Chennai, India
05.2010

Skills

  • PySpark
  • Python
  • SQL
  • Databricks
  • Apache Airflow
  • GitHub
  • Google Cloud - BigQuery, Dataproc, Dataflow, Bigtable, Composer, Cloud Storage, Cloud SQL, Looker, Vertex AI
  • AWS Cloud - S3, Redshift, EC2, EMR, Amazon RDS
  • REST API, GraphQL, Postman
  • Docker
  • Kubernetes
  • Terraform
  • Snowflake
  • TensorFlow basics
  • Informatica
  • SAS
  • Teradata
  • Tableau

Certification

  • Google Cloud Certified - Data Engineer
  • SAS Certified Base Programmer for SAS 9
  • Informatica PowerCenter Developer
  • Oracle 9i: SQL
  • Cognizant Certified Programmer – AHM250 (Healthcare)

Additional Information

https://www.linkedin.com/in/govindarajsanjeevi/

Timeline

Manager/Senior Data Engineer

Wayfair
09.2021 - Current

Data Analytics Consultant

Nationwide Insurance
10.2019 - 09.2021

Data Engineer - Big Data

Lbrands
03.2018 - 10.2021

Consultant

Nationwide Insurance
06.2016 - 03.2018

Associate

Blue Cross Blue Shield of Minnesota
02.2015 - 06.2016

Associate

Blue Cross Blue Shield of Minnesota
02.2011 - 01.2015

Bachelor of Engineering - Computer Science And Engineering

Magna College of Engineering