
Naresh Chintalapudi

Frisco, TX

Summary

Experienced Data Engineer with 9+ years of expertise in designing and optimizing cloud-based data architectures, data warehousing solutions, and big data technologies. Skilled in building and managing scalable ETL/ELT pipelines using SQL, Azure Data Factory (ADF), Databricks, Spark, Python, and Snowflake. Proficient in automating data processes, optimizing data delivery, and redesigning infrastructure for greater scalability. Specialized in identifying and implementing process improvements that enhance performance and reduce manual workflows. Known for leveraging AWS big data technologies to build efficient, scalable data solutions and for collaborating with cross-functional teams to deliver actionable insights that support data-driven decision-making.

Overview

9 years of professional experience
1 Certification

Work History

Azure Data Engineer

T-Mobile
11.2020 - Current
  • Developed scalable, reusable ETL pipelines using Azure Data Factory (ADF), Spark (Databricks), Kafka, EventHub, APIs, and Python for transforming and delivering data across multiple environments (batch/real-time).
  • Optimized Spark jobs and Snowflake queries, reducing execution times through partitioning strategies, caching, and tuning memory settings for large datasets and complex transformations.
  • Built and deployed real-time streaming pipelines using EventHub, Kafka, Spark, and Python, ensuring efficient data processing, event offset management, and high throughput for millions of records.
  • Led development of the corporate data warehouse and business analytics solutions, designing the Snowflake architecture, implementing multi-schema designs, and optimizing storage, query performance, and cost efficiency.
  • Created automated validation and error-handling frameworks in ADF, Spark, and Python to ensure data consistency, reduce discrepancies, and improve data quality.
  • Integrated Snowflake and Databricks environments, optimizing data transfers and leveraging the Snowflake Spark Connector for efficient and scalable data flow between systems.
  • Implemented monitoring, alerting, and security measures using Azure Data Lake Storage (ADLS), Databricks, Snowflake, and Python, ensuring high availability, compliance, and pipeline health across multiple data sources.
  • Worked with internal stakeholders to support their data infrastructure needs, solving technical issues and optimizing data systems to enhance analytics functionality.
  • Managed multiple data engineering projects, delivering solutions in an agile, iterative manner while ensuring timelines, quality, and scalability of deliverables.
  • Designed and implemented data solutions that incorporated data encryption, data privacy, and governance standards, ensuring compliance with relevant regulations such as CCPA, USGCI, and COPPA.
  • Deployed ADF pipelines and Control-M jobs via CI/CD Git processes, ensuring seamless integration and automated deployment in a collaborative environment.

Azure Data Engineer

Depository Trust & Clearing Corporation
07.2017 - 11.2020
  • Developed and optimized data pipelines using ADF, PySpark, and Azure Synapse to ingest, transform, and store data in Azure Data Lake Storage (ADLS) and Synapse Pools for batch and real-time processing.
  • Created reusable pipelines for file-based data ingestion (CSV, Text, Excel) using Synapse Data Studio and Azure Copy Activity, transforming raw data to Parquet format for downstream processing.
  • Built incremental and full data processing pipelines in Synapse, implementing truncation and reload mechanisms to ensure data consistency and high performance.
  • Configured Azure Data Shares for seamless data transfer between ADLS accounts and Synapse Pools, optimizing cross-system data sharing.
  • Automated reporting and analytics using SQL, Python, and PySpark, delivering timely business insights with efficient data transformation and reporting workflows.
  • Tuned Spark applications for optimal performance, adjusting batch intervals, parallelism, and memory settings to handle large data volumes effectively.
  • Applied analytical problem-solving techniques to address key business questions, leveraging SQL, Python, and Scala to manipulate data and develop high-performing data solutions.
  • Collaborated with business stakeholders and technical teams to enhance data system functionality, ensuring systems met evolving business needs and delivered insight into marketing and customer acquisition efforts.
  • Managed data pipelines and infrastructure, ensuring seamless integration and efficient data delivery, while continuously driving process improvements and re-designing infrastructure for greater scalability.
  • Provided production support by troubleshooting pipeline issues, fine-tuning processes, and managing CI/CD deployments via GitHub to ensure smooth, automated deployments.

Test Analyst

Bank of America
02.2016 - 07.2017
  • Designed and developed Windows applications using C# .NET and ASP.NET, ensuring functionality and user-friendly interfaces.
  • Created and automated tests using Microsoft Coded UI and C#, designing comprehensive test strategies and a Hybrid-driven automation framework for seamless test execution.
  • Developed customized HTML reports in UFT using VBScript, HTML, and CSS, making them easily understandable for non-technical users.
  • Implemented browser compatibility testing across multiple browsers (e.g., Firefox, Chrome, Safari) to ensure consistent application performance.
  • Designed and implemented automated testing frameworks using QTP and VBScript, focusing on back office trading systems, reporting, and front office applications.
  • Performed backend validation using SQL queries in Oracle 12c, ensuring data integrity and validation across data transactions.
  • Worked with Unix and Linux environments for log file analysis, batch job processing, and troubleshooting SOAP Web Services using SOAPUI and WSDL for service verification.

Education

Master’s - Computer Science

SILICON VALLEY UNIVERSITY
San Jose, CA
12.2015

Bachelor’s - Electronics And Communication Engineering

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY
Kakinada, Andhra Pradesh, India
11.2012

Skills

  • Programming Languages: Python, Scala, Java, SQL, VBScript
  • Cloud Platforms: Azure (ADF, ADLS, Synapse, Databricks), AWS
  • Data Engineering Tools: Azure Data Factory (ADF), Spark SQL, Spark Streaming, Kafka, Snowflake, HDFS, Hive, Pig, U-SQL
  • ETL/ELT Tools: ADF, Spark SQL, U-SQL, Sqoop, Databricks, Synapse Analytics
  • Database & Data Warehousing: Snowflake, SQL Server, Oracle, Teradata, MySQL
  • Big Data Technologies: Apache Hadoop, Apache Spark, Kafka, HBase, Flume
  • Real-Time Data Processing: Kafka, Spark Structured Streaming, HBase
  • Version Control & CI/CD: Git, Azure DevOps, Jenkins
  • Job Scheduling & Monitoring: Control-M, Oozie
  • Other Tools: Apache Nifi, LoadRunner, JIRA, SOAPUI, REST APIs

Certification

  • Azure Data Engineer Associate
  • DP-200: Implementing an Azure Data Solution
  • DP-201: Designing an Azure Data Solution
