Rakesh Gourishetty

Cincinnati, OH

Summary

Enthusiastic Data Engineer with over 3.5 years of hands-on experience architecting robust data solutions. Proven track record of optimizing data workflows, ensuring data quality, and implementing efficient, scalable ETL processes.

Overview

3.5 years of professional experience

Work History

Data Engineer

Data Economy Global Services
2021.03 - 2022.12
  • Managed the migration of a decade's worth of customer credit card data from Oracle and MySQL to Redshift and AWS Athena, using Talend ETL for extraction, transformation, and loading while ensuring data integrity and compliance through rigorous validation and error-checking. This improved data accessibility and analytical capabilities, reducing query response times by 40%.
  • Integrated and processed financial customer data from various sources, applying transformations to streamline the data and eliminate inaccuracies. This simplified data integration and processing and made the data readily usable for analysis.
  • Improved data processing efficiency and integrated multiple data sources into the target data warehouse by designing, developing, and deploying ETL jobs. Incorporating Azure services such as Azure Data Lake Storage and Azure Synapse Analytics into existing pipelines achieved a 40% decrease in data processing time and ensured efficient loading into the production environment.
  • Ensured the accuracy and reliability of customer data by crafting meticulous, comprehensive unit test cases, achieving a 98% accuracy rate in identifying data errors. Collaborated with a cross-functional team to architect and implement solutions on Azure's cloud services, enhancing the scalability and performance of data processes.
  • Automated job scheduling by writing JIL to create and manage Autosys job schedules, specifying job dependencies and run conditions. The resulting gains in data processing efficiency contributed to a revenue increase of $1.5M per quarter.
  • Maintained high availability and reliability of data processing operations by using Azure Monitor and Azure Log Analytics to track job performance in real time and promptly troubleshoot issues, ensuring minimal downtime and a smooth data processing workflow.
  • Before initiating a data integration project spanning diverse sources, analyzed the initial project plan and systematically gathered key details to ensure seamless integration aligned with stakeholder requirements. This reduced project setup time by 25%, enabling faster project initiation and cost savings through improved efficiency.

Data Engineer

Tech9kApp Solutions Private Limited
2019.08 - 2020.12
  • Enhanced healthcare data quality and accuracy while minimizing manual intervention by implementing streamlined, automated data-validation processes.
  • Boosted data extraction and integration efficiency by crafting essential SQL queries within Talend, improving extraction speed by 35% and enabling faster insight generation and decision-making. These efforts contributed to a $1M increase in revenue per quarter.
  • Deployed multiple ETL jobs into production to ensure uninterrupted data processing, reducing downtime by 50% and improving system reliability, performance, and operational continuity.
  • Oversaw data migration, profiling, modeling, and warehousing tasks while improving the efficiency of ETL workflows; streamlining processes and better utilizing resources boosted overall efficiency by 30%, making data operations smoother and more effective.
  • Built robust error-handling mechanisms for data transformation, contributing at every stage from requirements gathering to translating business needs into visual designs. The resulting measures maintained data integrity and minimized disruptions during transformation processes.
  • Streamlined data ingestion into the AWS data lake while maintaining data quality by developing a Python script that ingested data from source files, accompanied by comprehensive test cases. This increased data migration speed by 20%, contributing to an annual revenue increase of $2M by accelerating the delivery of actionable insights.

Education

Master of Science in Information Technology

University of Cincinnati
04.2024

Bachelor of Technology in Information Technology

Jawaharlal Nehru Technological University, Hyderabad
05.2018

Skills

  • Databases: Oracle, MySQL, Redshift
  • Cloud Services: AWS Athena, Azure Data Lake Storage, Azure Synapse Analytics
  • Big Data Technologies: Apache Spark, Apache Kafka
  • ETL Tools: Talend, Informatica
  • Reporting Tools: Tableau, Power BI
  • Management Tools: JIRA, Autosys
  • Version Control: GitHub
  • Languages: Java, Python
  • Data Expertise: Data Modeling, Data Cleansing, Data Analysis, Data Warehousing, Data Ingestion
  • Microsoft Office Tools: Excel, PowerPoint, Word
  • Soft Skills: Strong communication, problem-solving, and interpersonal skills

Academic Projects

Impact of Layoffs on Unemployment Rate
  • Conducted a comprehensive study to analyze the effects of corporate layoffs on unemployment rates, aiming to understand broader economic trends. Utilized data analysis and visualization techniques, leveraging Tableau to examine datasets sourced from Kaggle.
  • Employed quantitative analysis and statistical methods to identify correlations between layoffs and unemployment rates, focusing on patterns over time and industry-specific vulnerabilities.
  • Discovered a direct correlation between layoffs and unemployment rates, with significant impacts on individuals' lives. Found that technical advancements and financial instability were the primary drivers of layoffs. The project contributes to strategic decision-making processes and suggests future longitudinal studies incorporating machine learning algorithms for deeper insights into economic dynamics.

Accomplishments

  • Honored with the SUPER DEBUT award for outstanding performance at Data Economy.
  • Earned the IBM Data Analysis with Python certification and badge.
  • Earned the IBM Python 101 for Data Science certification and badge.
