Bhargavi Chimata

Kansas City, MO

Summary

Dedicated Data Engineer skilled in developing efficient ETL workflows and optimizing data processes. Experienced in validating and cleansing datasets, contributing to improved data accuracy and reliability across teams for enhanced decision-making.

Overview

2 years of professional experience
3 Certifications
1 year of post-secondary education

Work History

Data Engineer

S&M Scholarly Solutions
06.2024 - 12.2025
  • Designed and implemented ETL workflows integrating structured and unstructured data from 10+ sources, reducing manual preparation time by 35%.
  • Built and optimized SQL queries to validate and cleanse over 1 million records weekly, achieving a 20% reduction in data errors.
  • Assisted in performance tuning of Python scripts, enhancing data processing speed by 40% for near real-time analytics.
  • Delivered comprehensive data solutions that improved pipeline efficiency and accelerated analytics delivery by 30%.

Data Engineer Intern

S&M Scholarly Solutions
03.2024 - 05.2025
  • Proactively monitored and optimized database performance, reducing query response times by 25%.
  • Implemented data validation techniques to enhance dataset accuracy by 30% and minimize reporting errors.
  • Collaborated with cross-functional teams to establish data capture and reporting requirements, achieving 100% on-time report delivery.
  • Established repeatable validation and QA processes, strengthening data integrity and reducing manual rework by 20%.

Education

Master of Science - Computer Science

University of Central Missouri
Warrensburg, MO
01.2025 - Current

Bachelor of Technology - Computer Science and Engineering

Vignan's Lara Institute of Technology and Science
Vadlamudi, Tenali
01.2024

Skills

  • Python
  • SQL
  • Statistical modeling
  • Talend
  • Data warehousing
  • Data modeling
  • MySQL
  • PostgreSQL
  • Data analysis
  • ETL processes
  • Machine learning
  • Data pipeline design
  • Query optimization

Projects

Building a Real-Time Data Pipeline

  • Objective: Design and implement an end-to-end data pipeline that ingests, processes, and stores real-time streaming data from sources such as social media feeds or IoT sensors.
  • Technologies: Apache Kafka for data ingestion, Apache Spark Streaming or Apache Flink for real-time processing, and a NoSQL database such as Cassandra or MongoDB for storage.
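The processing stage of such a pipeline can be sketched in a few lines. This is an illustrative stand-in only: the function and field names (`process_stream`, `sensor_id`) are hypothetical, and Python's standard library substitutes for what Kafka would deliver and Spark Streaming or Flink would compute with windowed aggregations.

```python
import json
from collections import Counter

def process_stream(events, window_size=3):
    """Group raw JSON event strings into fixed-size windows and count
    events per sensor, a simplified analogue of a windowed aggregation
    in Spark Streaming or Flink."""
    window, counts = [], []
    for raw in events:
        window.append(json.loads(raw))
        if len(window) == window_size:
            counts.append(Counter(e["sensor_id"] for e in window))
            window = []
    if window:  # flush the final partial window
        counts.append(Counter(e["sensor_id"] for e in window))
    return counts

# Example: five events split into a full window of three plus a partial window
stream = [json.dumps({"sensor_id": s}) for s in ["a", "b", "a", "c", "a"]]
print(process_stream(stream))
```

In a real deployment the input iterable would be a Kafka consumer and the per-window counts would be written to Cassandra or MongoDB rather than returned.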

Developing a Data Warehouse for an E-commerce Business

  • Objective: Create a scalable data warehouse to support business intelligence and analytics by integrating data from sources such as sales, customer, and inventory systems.
  • Technologies: Apache Airflow or dbt to orchestrate ETL/ELT transformations, a cloud-based data warehouse such as Snowflake or Amazon Redshift for storage, and a business intelligence tool such as Tableau or Power BI for analysis.
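The core transformation can be sketched as a dimensional-model join. This is a hedged illustration, not the project's actual code: the names (`build_fact_sales`, `customer_key`) are hypothetical, and plain Python dictionaries stand in for the SQL that dbt would run inside Snowflake or Redshift.

```python
def build_fact_sales(sales, customers):
    """Join raw sales rows to a customer dimension by natural key and
    emit star-schema fact rows; unmatched customers get a null key,
    mirroring a LEFT JOIN in a dbt model."""
    dim = {c["customer_id"]: c for c in customers}  # natural key -> dimension row
    facts = []
    for s in sales:
        cust = dim.get(s["customer_id"])
        facts.append({
            "order_id": s["order_id"],
            "customer_key": cust["customer_key"] if cust else None,
            "amount": s["amount"],
        })
    return facts

sales = [{"order_id": 1, "customer_id": "C1", "amount": 25.0}]
customers = [{"customer_id": "C1", "customer_key": 100}]
print(build_fact_sales(sales, customers))
```

In the warehouse itself this step would be a dbt SQL model; Airflow would schedule the loads upstream and Tableau or Power BI would query the resulting fact table.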

Certifications

  • Google Cloud Certified - Professional Data Engineer
  • IBM Data Engineering Professional Certificate
  • dbt Fundamentals Certificate
