PALLAVI JINJIRAMPALLI

Belleville

Summary

8+ years of IT work experience across all phases of software development using Agile methodology, including user interaction, business analysis and modeling, design and development, integration, planning and testing, migration, and documentation for applications built on ETL pipelines and distributed systems.

Overview

8 years of professional experience
1 Certification

Work History

Data Engineer

CVS/Aetna
08.2023 - 01.2024
  • Conducted in-depth data assessment and profiling to identify data quality issues, resulting in a 30% improvement in data accuracy through cleansing and validation processes (see the profiling sketch after this list)
  • Spearheaded patient data management initiatives, enhancing the accuracy and accessibility of patient information across systems
  • Championed the implementation of Master Data Management (MDM) practices, ensuring data consistency and quality across enterprise platforms
  • Designed and developed advanced dashboard reports using SSIS and SSRS, facilitating insightful analytics and decision-making
  • Architected and implemented high-performance, adaptable data solutions optimized for cloud platforms including GCP, AWS, and Azure, aligning with strategic business objectives
  • Led the development and execution of the data architecture strategy and roadmap, ensuring optimal alignment with evolving business needs
  • Designed and maintained advanced data models, schemas, and structures, facilitating scalable and efficient data warehouse solutions
  • Advocated for and implemented data governance principles, ensuring data integrity and compliance across all data management processes
  • Designed and implemented data mapping processes, ensuring alignment between source and target systems, reducing data mapping errors by 25%
  • Served as a liaison between the project team and the business
  • Utilized ETL tools like Talend and Informatica for robust data integration and transformation processes, prioritizing Talend for its flexibility and performance
  • Developed interactive dashboards in Tableau (with a keen interest in integrating Power BI), providing actionable insights to business stakeholders
  • Utilized ETL tools like Infoworks to extract, transform, and load data from legacy systems into modern cloud-based platforms, optimizing data processing efficiency by 40%
  • Collaborated with business analysts and IT teams to define migration requirements, facilitating smooth communication and alignment between technical and business objectives
  • Orchestrated comprehensive migration testing, including UAT and SIT, resulting in 98% data accuracy rate and 20% reduction in post-migration issues
  • Proactively identified and resolved data migration issues, minimizing data loss and downtime, and ensuring smooth transition for end-users
  • Drove problem resolution through active communication among all relevant parties, development of action plans identifying the steps required to solve each problem, and regular status checks and updates during the resolution period so the business was kept informed.
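
A minimal sketch of the kind of profiling and cleansing pass described in the first bullet above, assuming a hypothetical patients.csv file with patient_id and age columns; the actual work used project-specific sources and validation rules.

import pandas as pd

# Hypothetical input file and column names, for illustration only.
df = pd.read_csv("patients.csv")

# Profile: null rates and distinct counts per column, plus two simple quality checks.
profile = pd.DataFrame({
    "null_rate": df.isna().mean(),
    "distinct_values": df.nunique(),
})
duplicate_ids = df["patient_id"].duplicated().sum()
out_of_range_ages = ((df["age"] < 0) | (df["age"] > 120)).sum()
print(profile)
print(f"duplicate patient_ids: {duplicate_ids}, out-of-range ages: {out_of_range_ages}")

# Simple cleansing pass: drop exact duplicates and clip implausible ages.
clean = df.drop_duplicates().assign(age=lambda d: d["age"].clip(0, 120))
clean.to_csv("patients_clean.csv", index=False)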

GCP Data Engineer

CVS/Aetna
01.2021 - 08.2023
  • Participated in requirement grooming meetings, which involved understanding functional requirements from a business perspective and providing estimates to convert those requirements into software solutions; designed, developed, and delivered code to IT/UAT/PROD and validated and managed data pipelines from multiple applications in a fast-paced Agile environment using sprints and the JIRA management tool
  • Developed end-to-end data pipelines on GCP, extracting data from diverse sources, transforming it using Apache Beam and Dataflow, and loading it into BigQuery for analysis (see the pipeline sketch after this list)
  • Built and architected multiple data pipelines, including Kafka-based end-to-end ETL and ELT processes, for data ingestion and transformation in GCP
  • Designed and optimized data models and schemas in BigQuery to support efficient querying and reporting, resulting in a 30% improvement in data retrieval times
  • Worked extensively on data modeling, including dimensional modeling, star and snowflake schemas, fact and dimension tables, and physical and logical data models
  • Designed snowflake schemas for the data warehouse and ODS architecture using modeling tools such as Erwin Data Modeler
  • Developed data models and data migration strategies utilizing concepts of snowflake schema
  • Worked with Google Cloud (GCP) services such as Compute Engine, Kafka, Cloud Functions, Cloud DNS, Cloud Storage, Cloud Deployment Manager, and TensorFlow, applying SaaS, PaaS, and IaaS cloud computing concepts on GCP
  • Coordinated onsite and offshore teams to ensure deliverables
  • Migrated legacy systems to modern big-data technologies, improving performance and scalability while minimizing business disruption.
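
A minimal sketch of an Apache Beam pipeline of the shape described above, reading JSON records from Cloud Storage and writing to BigQuery via Dataflow; the project, bucket, table, and field names are hypothetical placeholders, not the actual resources.

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical project, bucket, and table names, for illustration only.
options = PipelineOptions(
    runner="DataflowRunner",
    project="example-gcp-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

def parse_record(line):
    # Parse one JSON line and keep only the fields the target table expects.
    rec = json.loads(line)
    return {
        "claim_id": rec["claim_id"],
        "member_id": rec["member_id"],
        "amount": float(rec.get("amount", 0)),
    }

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/claims/*.json")
        | "Transform" >> beam.Map(parse_record)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-gcp-project:analytics.claims",
            schema="claim_id:STRING,member_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )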

Data Engineer

PWC
11.2020 - 01.2021
  • Analyzed and developed data integration templates to extract, cleanse, transform, integrate, and load data into data marts for user consumption
  • Implemented Master Data Management strategies, aligning data governance with business objectives and enhancing data integrity across systems
  • Played a pivotal role as a Data Engineer, showcasing deep proficiency in data modelling and management principles to support business analytics and decision-making
  • Developed robust data pipelines using ETL tools such as Informatica / IICS and Talend, enhancing data integration and transformation capabilities
  • Delivered sophisticated analytical solutions utilizing Power BI, Tableau, and Looker, driving actionable insights and supporting strategic business initiatives
  • Fostered an understanding of various data access approaches, including microservices and event-based architectures, to improve data availability and usability
  • Expertly managed patient data, ensuring high standards of confidentiality, integrity, and availability
  • Utilized Azure Snowflake for data warehousing tasks, capitalizing on its powerful data storage and analytics capabilities
  • Designed semantic layers to simplify data access and analysis, improving end-user experience and data comprehension
  • Developed efficient data pipelines using Talend, optimizing data flow and processing tasks
  • Enhanced reporting capabilities with SSIS/SSRS and Tableau dashboards, enabling comprehensive data analysis and visualization
  • Reviewed code against standards and checklists
  • Performed analysis on the existing source systems, understood the Informatica/ETL/SQL/Unix-based applications, and provided the services required for development and maintenance of those applications
  • Created high- and low-level design documents for the various modules
  • Reviewed designs to ensure adherence to standards, templates, and corporate guidelines
  • Validated design specifications against results from proofs of concept and technical considerations
  • Coordinated with the application support team, helping them understand the business and the components necessary for data integration, extraction, transformation, and loading
  • Worked on multiple modules, including HCM global integration across different regions and ONECRM Salesforce Cloud
  • Gathered and analyzed requirements and prepared business requirements documentation
  • Conducted a proof of concept on GCP, primarily on the BigQuery platform, for a cloud migration project (see the load sketch after this list)
  • Migrated from on-premises Hadoop 3.1 to BigQuery and Dataproc
  • Developed custom algorithms for advanced analytics, driving actionable insights from large datasets
  • Designed scalable and maintainable data models to support business intelligence initiatives and reporting needs.
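
A minimal sketch of the kind of BigQuery load used in such a migration proof of concept, assuming Parquet files exported from the on-prem cluster to a hypothetical Cloud Storage path; the project, dataset, and table names are placeholders.

from google.cloud import bigquery

# Hypothetical project, dataset, and GCS path, for illustration only.
client = bigquery.Client(project="example-gcp-project")
table_id = "example-gcp-project.staging.hcm_employees"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load Parquet files exported from the legacy Hadoop cluster into BigQuery.
load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/hcm_employees/*.parquet",
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load job completes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")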

Data Engineer

Radial Inc
03.2019 - 12.2020
  • Performed analysis on existing source systems, understood the Informatica/ETL/SQL/Unix-based applications, and provided the services required for development and maintenance of the applications
  • Implemented Master Data Management (MDM) solutions to standardize and harmonize critical business data across the organization
  • Specialized in designing and implementing data platforms and pipelines on traditional RDBMS platforms like SQL Server and Oracle, ensuring data consistency and accessibility
  • Leveraged hands-on experience in designing data pipelines with technologies like Spark, Python, and Scala to support complex data processing and analytics tasks
  • Contributed to the Healthcare industry by applying specific data models and practices, enhancing the quality and utility of patient data management systems
  • Embraced Agile methodologies and contributed to the adoption of the Scaled Agile Framework, facilitating more efficient and collaborative project management processes
  • Leveraged Azure Snowflake and Data Lake to handle complex data warehousing needs, ensuring scalability and performance
  • Applied Talend for ETL processes, streamlining data integration and enhancing data quality
  • Developed and maintained semantic layers, facilitating efficient data access and reporting
  • Created dynamic reports and dashboards using SSIS/SSRS and Tableau, supporting business intelligence initiatives
  • Create High & Low level design documents for various modules
  • Review design to ensure adherence to standards, templates and corporate guidelines
  • Coordinate with Application support team and help them assist understand business and necessary components for Integration, Extraction, Transformation and load data
  • Used Python with OpenStack, OpenERP (presently ODOO), SQL Alchemy, DJango CMS and so forth
  • Worked with Google Cloud (GCP) Services like Compute Engine, Cloud Functions, Cloud DNS, Cloud Storage and Cloud Deployment Manager and SaaS, PaaS and IaaS concepts of Cloud Computing and Implementation using GCP
  • Deployed application using Docker and AWSConsole services
  • DevOps role converting existing AWS infrastructure to Server-less architecture (AWS Lambda, Kinesis) deployed via CloudFormation
  • Designed data models for complex analysis needs
  • Developed database architectural strategies at modeling, design, and implementation stages to address business or industry requirements.
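
A minimal sketch of a serverless consumer in the style described above, assuming a hypothetical order stream; it shows the shape of an AWS Lambda handler attached to a Kinesis event source, not the actual production function.

import base64
import json

def handler(event, context):
    """Lambda handler invoked by a Kinesis event source mapping.

    Decodes each record and, in this sketch, simply counts well-formed orders;
    a real pipeline would validate and forward them to a downstream store.
    """
    processed = 0
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        order = json.loads(payload)
        if order.get("order_id"):  # hypothetical field name
            processed += 1
    return {"processed": processed}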

Data Engineer

First Republic Bank
12.2017 - 02.2019
  • Worked on implementing pipelines and analytical workloads using big data technologies such as Hadoop, Spark, Hive, and HDFS (see the aggregation sketch after this list)
  • Directed the development and maintenance of the data architecture, prioritizing scalability, performance, and adaptability on modern cloud platforms
  • Championed the design and implementation of comprehensive data models and schemas, significantly improving the efficiency of data warehouse solutions
  • Implemented data governance frameworks, ensuring data integrity, security, and compliance across all data operations
  • Collaborated closely with business, operations, and engineering stakeholders to drive project success, demonstrating excellent communication and teamwork skills
  • Played a key role in proposal solutioning, defining execution approaches, and providing accurate project estimations, contributing to the successful acquisition and implementation of strategic projects
  • Designed and deployed Hadoop clusters and various big data analytics tools, including Pig, Hive, HBase, Oozie, Sqoop, Kafka, Spark, and Impala
  • Directed Master Data Management initiatives, ensuring a single, accurate view of critical data entities in a decentralized environment
  • Integrated Azure Snowflake and Data Lake into the data architecture, achieving high-performance data analytics and storage solutions
  • Employed Talend for advanced ETL operations, achieving superior data integration and transformation results
  • Designed semantic layers to bridge the gap between raw data and business insights, enhancing data usability
  • Advanced reporting capabilities with the development of SSIS/SSRS and Tableau dashboards, delivering comprehensive analytics to inform strategic decisions
  • Built and architected multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation in GCP
  • Performed analysis on the existing source systems, understood the Informatica/Teradata-based applications, and provided the services required for development and maintenance of those applications
  • Coordinated with the GCP application support team, helping them understand the business and the components necessary for integration, extraction, transformation, and loading of data
  • Analyzed and developed data integration templates to extract, cleanse, transform, integrate, and load data into data marts for user consumption
  • Reviewed code against standards and checklists.
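
A minimal PySpark sketch of the kind of analytical workload described in the first bullet above, reading a Hive table backed by HDFS and writing aggregated results back to Hive; the database, table, and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical database, table, and column names, for illustration only.
spark = (
    SparkSession.builder
    .appName("daily_account_aggregates")
    .enableHiveSupport()
    .getOrCreate()
)

# Read raw transactions from a Hive table stored on HDFS.
txns = spark.table("raw.transactions")

# Aggregate net amount and transaction count per account per day.
daily = (
    txns
    .groupBy("account_id", F.to_date("txn_ts").alias("txn_date"))
    .agg(
        F.sum("amount").alias("net_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Write the result to a curated Hive table, partitioned by date.
daily.write.mode("overwrite").partitionBy("txn_date").saveAsTable("curated.daily_account_balances")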

ETL Developer

InRythm
07.2016 - 09.2017
  • Implemented the extract, transform, and load process with Informatica PowerCenter and PowerExchange, mainframes, and shell scripts to populate the database tables used for generating reports with Business Objects
  • Built data pipelines in Airflow on GCP for ETL-related jobs using different Airflow operators (see the DAG sketch after this list)
  • Led offshore developers, helping them understand the business and the components necessary for data integration, extraction, transformation, and loading
  • Developed ETL designs and tooling to extract data from DB2, Oracle, and XML data sources
  • Performed build, unit testing, system testing, and user acceptance testing
  • Designed and documented report processing logic and standardized the report-interaction process for non-technical business users
  • Built new ETL designs to load data marts.
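
A minimal Airflow DAG sketch of the kind of GCP ETL job described above; the operator choice, bucket, dataset, and script path are hypothetical placeholders rather than the actual jobs.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# Hypothetical bucket, dataset, and extract script, for illustration only.
with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2017, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    extract = BashOperator(
        task_id="extract_from_source",
        bash_command="python /opt/etl/extract_sales.py",  # hypothetical extract step
    )

    load = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket="example-etl-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.daily_sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    extract >> load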

ETL Developer

Arvind Technologies Pvt Ltd
03.2016 - 06.2016
  • Developed data archiving, data loading, and performance test suites using ETL tools such as PowerCenter, DMExpress, Teradata utilities, Unix, and SSIS
  • Involved in data scaling and data dicing
  • Performed data analysis, complex query building, and performance tuning
  • Extracted and analyzed data from existing data stores using PowerCenter and DMExpress, and ran ad-hoc queries against warehouse environments such as Teradata
  • Participated in the SDLC using Informatica PowerCenter; the DMExpress ETL tool was implemented with the help of Teradata load utilities
  • Worked proficiently on different database versions across all platforms (Windows, Unix)
  • Worked extensively on ad-hoc requests using different ETL tools to load data.

Education

Master of Science

New England College
07.2022

Bachelor of Technology in Electronics and Communication

Visvesvaraya Technological University
08.2016

Skills

  • ETL development
  • Data Warehousing
  • Data Modeling
  • Data Pipeline Design
  • Data Migration
  • Database Design
  • Big Data Processing
  • Data lakes
  • Azure Snowflake
  • SQL and Databases

Certification

  • GCP Internal Certification (CVS)
  • ITIL 4 Certification

Skill Technology Summary

  • Java
  • Python
  • C++
  • C#
  • JavaScript
  • Shell Script
  • Ruby
  • HTML
  • CSS
  • TypeScript
  • Node.js
  • PHP
  • Ruby on Rails
  • ASP.NET
  • MySQL
  • PostgreSQL
  • MS SQL Server
  • Oracle
  • MongoDB
  • Cassandra
  • Couchbase
  • Redis
  • Google BigQuery
  • Amazon Redshift
  • Snowflake
  • Git
  • SVN
  • Jenkins
  • GitLab CI/CD
  • Travis CI
  • Maven
  • Gradle
  • Ant
  • Docker
  • Kubernetes
  • Ansible
  • Puppet
  • Chef
  • Hadoop
  • Spark
  • Kafka
  • Flink
  • Hadoop HDFS
  • Amazon S3
  • AWS (Amazon Web Services)
  • Microsoft Azure
  • Google Cloud Platform
  • Linux (Ubuntu, CentOS, Red Hat)
  • Windows Server
  • MacOS for development
  • Angular
  • React
  • Vue.js
  • Spring Boot (Java)
  • Django (Python)
  • Express (Node.js)
  • Agile
  • Scrum
  • Kanban
  • Waterfall
  • Nessus
  • Wireshark
  • Slack
  • Microsoft Teams
  • JIRA

Timeline

Data Engineer

CVS/Aetna
08.2023 - 01.2024

GCP Data Engineer

CVS/Aetna
01.2021 - 08.2023

Data Engineer

PWC
11.2020 - 01.2021

Data Engineer

Radial Inc
03.2019 - 12.2020

Data Engineer

First Republic Bank
12.2017 - 02.2019

ETL Developer

InRythm
07.2016 - 09.2017

ETL Developer

Arvind Technologies Pvt Ltd
03.2016 - 06.2016

Master of Science

New England College

Bachelor of Technology in Electronics and Communication

Visvesvaraya Technological University
Visvesvaraya Technological University