Nikhita Kandagatla

Charlotte, NC

Summary


- Proficient in articulating complex technical concepts effectively, both verbally and in writing.
- Demonstrated expertise in creating databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, functions, cursors, and indexes using T-SQL.
- Skilled in the high-level design of ETL DTS and SSIS packages for data integration from heterogeneous sources (Excel, CSV, SAS, Oracle, flat files, text-format data) using various SSIS transformations (a rough Python sketch of this kind of load appears after this list).
- Experience in handling SQL connections through web services.
- Strong skills in deploying and operating large-scale cloud infrastructure and application services on Amazon Cloud and other cloud providers.
- Proficient in shell scripting for automating routine tasks.
- Exceptional ability to create and manage various database objects like tables and views.
- Extensive work on ETL processes using SSIS packages.
- Experience in both on-premises and cloud solutions, with expertise in Microsoft Azure (Power BI Service, ADF2, Azure SQL Server, Azure SQL IaaS/SaaS/PaaS, Azure SQL DW, Azure Blob, AKV Management) and hands-on experience with AWS cloud solutions.
- Expertise in data conversion and migration using SSIS and DTS services across different databases (Oracle, MS Access, flat files).
- Good experience in working with APIs to access specific database features.
- Proven experience in performance tuning and query optimization.
- Hands-on experience in the maintenance and administration of SSIS by creating jobs, alerts, SQL Mail Agent, and scheduling DTS/SSIS packages.
- Generated data to support SAP Business Objects Interface for users to access data from the Data Warehouse.
- Hands-on experience with Azure Data Factory V2, Data Flows, Snowflake, Azure Data Lake, Azure Analysis Services.
- Experience in creating packages to transfer data between DB2, Oracle, MS Access, and flat files to SQL Server using DTS/SSIS.
- Expertise in creating Star Schema and Snowflake Schemas.
- Three years of Azure cloud experience with ADF V2, ADF Data Flows, Azure Databricks, SQLDB/Hyperscale, SQLDW, ADLS Gen 2, and other Azure services. ADF source code version control using Azure DevOps or Git.
- Expertise in SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and Power BI DAX with good knowledge of SQL Server Analysis Services (SSAS).
- Experience in report writing using SQL Server Reporting Services (SSRS) and Power BI, creating various types of reports like drill-down, parameterized, cascading, conditional, table, matrix, chart, and sub-reports.
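
As a rough illustration of the heterogeneous-source loads described above, the sketch below stages a few common file formats into SQL Server with pandas and SQLAlchemy; the file names, staging table names, and connection string are hypothetical placeholders rather than details of any actual engagement.

```python
# Minimal sketch of heterogeneous-source ingestion into SQL Server staging tables.
# File names, table names, and the connection string are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://etl_user:password@my-sql-server/StagingDB"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

sources = {
    "excel": pd.read_excel("vendors.xlsx", sheet_name="Sheet1"),
    "csv": pd.read_csv("transactions.csv"),
    "flat": pd.read_csv("legacy_extract.txt", sep="|"),  # pipe-delimited flat file
}

for name, frame in sources.items():
    # Conform column names before loading, mirroring a simple SSIS derived-column step.
    frame.columns = [c.strip().lower().replace(" ", "_") for c in frame.columns]
    frame.to_sql(f"stg_{name}", engine, if_exists="replace", index=False)
```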

Overview

9 years of professional experience

Work History

ETL Developer

ABC Supply
Charlotte, NC
02.2021 - Current
  • Successfully executed Azure Data Factory pipelines, resulting in a 20% enhancement in data processing efficiency.
  • Crafted and fine-tuned data models, leading to a 15% reduction in query response time and an overall boost in system performance.
  • Streamlined ETL processes, cutting data load times by 25% and ensuring timely availability of crucial business insights.
  • Effectively managed Azure Data Lake Storage, achieving a 30% reduction in storage costs through strategic data partitioning and compression techniques.
  • Implemented data quality checks and validation procedures, improving data accuracy and integrity by 95% (a rough PySpark sketch of this kind of step appears after this list).
  • Conducted performance optimization on Azure SQL Data Warehouse, significantly improving query execution times and achieving a 20% boost in overall system performance.
  • Implemented scalable solutions on Azure, empowering the system to handle a 50% increase in data volume without compromising performance.
  • Developed comprehensive documentation for data engineering processes, and conducted knowledge transfer sessions, reducing onboarding time for new team members by 30%.
  • Orchestrated ETL workflows, processing an average of 1 terabyte of data per day, ensuring prompt and precise data delivery.
  • Implemented Azure SQL Data Warehouse to streamline data storage, successfully reducing storage costs by 15%.
  • Leveraged Azure Databricks for processing and analyzing streaming data, reducing processing time by 30%.
  • Implemented robust security policies for Azure Data Lake Storage, achieving a 100% compliance rating during internal and external audits.
  • Deployed Azure Monitor and Azure Log Analytics for proactive monitoring, leading to a 25% reduction in system downtime.
  • Led the development team in understanding requirements, writing stable, high-quality code, and guiding developers in line with Confidential and client processes.
  • Developed Informatica mappings based on client requirements for the analytics team.
  • Conducted end-to-end system integration testing and participated in functional testing and regression testing.
  • Reviewed and wrote SQL scripts to verify data from source systems to target, utilizing HP Quality Center to store and maintain test repositories.
  • Worked on transformations to prepare data for the analytics team's visualization and business decisions.
  • Participated in knowledge transfer sessions, provided feedback on requirements, and helped migrate the client data warehouse architecture from on-premises to the Azure cloud.
  • Created pipelines in Azure Data Factory for extracting, transforming, and loading data from multiple sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse.
  • Designed data auditing and data masking for security purposes, monitored end-to-end integration using Azure Monitor, and implemented Azure Active Directory for specific user roles.
  • Deployed Azure Data Lake Storage accounts and SQL Databases, ensuring a smooth transition into the Azure cloud environment.
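
As a rough sketch of the partitioning, compression, and data-quality steps mentioned above (assuming a Databricks/PySpark environment; the lake paths and column names are hypothetical placeholders):

```python
# Illustrative PySpark step: basic data-quality filtering, then a partitioned,
# compressed write to the data lake. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-load").getOrCreate()

raw = spark.read.option("header", "true").csv(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/2021/"
)

# Data-quality checks: drop rows missing keys, cast and reject negative amounts.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") >= 0)
)

# Partition by load date and compress with Snappy to cut storage and scan costs.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .option("compression", "snappy")
      .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales/"))
```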

Data Engineer / ETL Developer

Abc123
Charlotte, NC
01.2018 - 12.2020

- Served as a key member of the BI team, meeting development needs with proficiency in SSIS, SSAS, and SSRS.
- Provided support to the Project Manager in leading multi-functional project teams, successfully implementing data conversion strategies and effectively communicating essential information to team members.
- Developed complex Stored Procedures, Triggers, Functions (UDFs), Indexes, Tables, Views, and other T-SQL code, along with SQL joins for SSIS packages and SSRS reports (a short Python sketch of invoking one such procedure follows this list).
- Collaborated directly with end users/clients to gather, analyze, and document business requirements and rules.
- Created intricate Stored Procedures and Execute SQL tasks, incorporated C# code in Script Tasks for ETL, and maintained high-performing ETL packages.
- Published SSRS reports in Report Manager, SharePoint web pages, and custom-made ASP.NET web applications.
- Conducted SSIS development and support, devising ETL solutions for integrating data from diverse sources (Flat Files, Excel, SQL Server, Raw File, DB2, Oracle, XML files, SharePoint Lists) into the central OLTP database.
- Extracted, transformed, and loaded data from systems like ART, WANDA, EDR, and IES.
- Developed C# console applications for database manipulation and calculations.
- Extended SSIS capabilities by creating custom SSIS components for SharePoint.
- Configured SSIS packages with XML configuration files, environment variables, registry entries, parent package variables, and SQL Server tables.
- Created SSRS and Power BI reports, sourcing from SSAS's Multidimensional OLAP and Tabular model cubes with MDX and DAX.
- Scheduled various SQL Jobs using SQL Server Agent to perform administrative tasks.
- Trained and guided new or junior developers in specific development areas to enhance their understanding of the project process.
- Assisted in developing systems with a high level of reliability and performance.
- Successfully resolved numerous critical and challenging issues raised by the team about ETL processes.
- Participated in software quality assurance activities, including code reviews and unit, system, integration, regression, and performance testing.
- Understood and completed necessary documentation for the application/infrastructure change control process.
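
As a minimal sketch of driving one of the stored-procedure-based load steps above from a Python orchestration script (the server, database, procedure name, and parameter are assumptions for illustration only):

```python
# Minimal sketch: invoking a T-SQL stored procedure as one ETL step via pyodbc.
# Server, database, procedure name, and parameter value are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server;DATABASE=SalesDW;Trusted_Connection=yes;"
)
try:
    cursor = conn.cursor()
    # Load one business date into the fact table, then commit the transaction.
    cursor.execute("EXEC dbo.usp_LoadFactSales @LoadDate = ?", "2020-06-30")
    conn.commit()
finally:
    conn.close()
```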

ETL Developer

ABCD
Charlotte, NC
01.2015 - 12.2018
  • Managed the integration of data from diverse sources into Hadoop Distributed File System (HDFS).
  • Utilized MySQL and Oracle databases to query data based on specific requirements.
  • Developed data warehouses and conducted analyses using various business intelligence tools.
  • Executed ETL tasks with Hadoop technologies such as Hive, Sqoop, and Oozie, extracting records from different databases into HDFS.
  • Conducted preprocessing of data using Hadoop, employing different components or tools.
  • Imported and exported data between different sources and HDFS for further processing using Apache Sqoop.
  • Automated Sqoop jobs and data ingestion workflows into HDFS (see the short automation sketch after this list).
  • Crafted SQL queries for retrieving information from databases based on requirements and for data cleanup processes.
  • Validated data transformations and performed end-to-end data validation for ETL and BI systems.
  • Contributed to the development of test strategy, test plans, and designs, executing test cases for ETL and BI systems.
  • Participated in the development and execution of ETL-related functionality, performance, and integration test cases, along with documentation.
  • Analyzed the developed ETL workflows to build a thorough understanding of their design.
  • Coordinated daily with the offshore team to execute test cases as per requirements.
  • Worked on data completeness, data transformation, and data quality for various data feeds from sources.
  • Conducted data modeling and data analysis.
  • Supported all phases of the project development lifecycle using SDLC and other project methodologies.
  • Used various transformations, including Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router, to implement business logic.
  • Created complex mappings using Connected and Unconnected Lookups, Aggregate, Update Strategy, and Router transformations for populating target tables efficiently.
  • Participated in team meetings with Data Modelers, Data Architects, and Business Analysts to analyze and resolve modeling issues.
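
As a small sketch of how the Sqoop ingestion above might be automated from Python (the JDBC URL, credentials file, table, and HDFS target directory are hypothetical placeholders):

```python
# Minimal sketch of automating a Sqoop import into HDFS from a Python wrapper.
# JDBC URL, password file, table, and target directory are hypothetical.
import subprocess

def sqoop_import(table: str, target_dir: str) -> None:
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://db-host:3306/sales",
        "--username", "etl_user",
        "--password-file", "/user/etl/.db_password",
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", "4",
    ]
    # Fail loudly so the calling workflow (e.g., Oozie or cron) can alert and retry.
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    sqoop_import("orders", "/data/raw/orders")
```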

Education

Bachelor of Science - Computer Science

JNTU
HYDERABAD, INDIA
05.2014

Skills

Data Extraction: Azure Synapse Analytics Workspace, Azure Data Factory, Azure Databricks, Azure Logic Apps

Automation: Microsoft Power Automate
Data Warehousing: SQL Server, MySQL, PostgreSQL, SSRS, SSIS
Visualization: Power BI, Tableau, Apache Superset
Backend Development: Python, Docker, Kubernetes
Programming Languages: Python, SQL
Scripting: JavaScript, HTML, CSS
Versioning Tools: Git
Build Tools: tox, Poetry
Databases: SQL Server, PostgreSQL
Virtualization: Docker
Monitoring Tools: Azure Monitor
Cloud Technologies: Azure Data Factory, Azure Databricks, Azure DevOps, and Power BI

Timeline

ETL Developer

ABC Supply
02.2021 - Current

Data Engineer / ETL Developer

Abc123
01.2018 - 12.2020

ETL Developer

ABCD
01.2015 - 12.2018

Bachelor of Science - Computer Science

JNTU