Skilled in client interaction and customer relationship management
Experienced professional with expertise in requirement analysis and in creating high-level design (HLD), low-level design (LLD), and migration documents
Experienced ETL developer skilled in utilizing various Talend components to create efficient ETL jobs. Proficient in developing data integration solutions and optimizing data quality. Able to effectively analyze business requirements and implement tailored solutions.
Experienced in coordinating with onshore and offshore counterparts to ensure accurate and timely delivery of project deliverables.
Software development professional with experience in enhancing software systems.
Significant expertise in analyzing, designing, and developing solutions on AWS and Azure data platforms
Experienced in designing and developing data pipelines with Snowflake.
Experienced SQL professional skilled in implementing and developing stored procedures, triggers, nested queries, joins, cursors, views, user-defined functions, and indexes.
Experienced in data modeling, data cleansing, data profiling, and data analysis. Demonstrated ability to understand complex datasets and derive insights.
Experienced in Agile/Scrum and Waterfall SDLC methodologies, effectively applying them to drive project success. Adept at seamlessly integrating core competencies and key skills to consistently deliver high-quality results.
Strong communicator with excellent collaboration and team-building skills. Quick to grasp new technical concepts and apply them effectively.
Overview
12 years of professional experience
Work History
Senior Data Engineer
Innover Digital
Hyderabad, Telangana
09.2023 - 07.2024
Extensive experience in SQL, including implementing and developing stored procedures, triggers, nested queries, joins, cursors, views, user-defined functions, and indexes.
Strong understanding of data modeling and experience with data cleansing, data profiling, and data analysis.
Demonstrated expertise in Azure Data Factory (ADF) by creating Linked Services, Datasets, and Pipelines for various data sources, including File System and Data Lake Gen2.
•Orchestrated data movement from Data Lake Storage to Azure SQL Data Warehouse using Azure Data Factory.
Developed Databricks notebooks for data transformations and automated the pipeline to run daily using Azure Data Factory (ADF).
Developed distributed data processing pipelines using Apache Spark, Python, and Azure Databricks.
Worked extensively on AWS S3 data transfers and used AWS Redshift for cloud data warehousing.
Handled data extraction and ingestion from different data sources into S3 by creating ETL pipelines using Talend.
Senior Software Engineer
Kenn IT Business Solutions
Hyderabad, Telangana
04.2023 - 07.2023
Developed and implemented cloud-based solutions for clients using Amazon Web Services.
Developed and deployed ETL processes to extract data from multiple sources into a single Data Warehouse.
Designed and implemented database objects such as tables, views, stored procedures, functions, and triggers in the Data Warehouse.
Optimized performance of existing ETL jobs by analyzing query plans and indexes used in the process.
Monitored ETL job performance on a daily basis and troubleshot any issues encountered during execution.
Performed unit testing of developed ETL code prior to deployment to the production environment.
Developed stored procedures, functions and triggers to support application requirements.
Senior Data Engineer
Synechron Technologies Pvt Ltd
Pune, Maharashtra
06.2018 - 04.2023
Developed ETL jobs to extract data from multiple sources such as Oracle, MySQL and flat files using Talend Studio.
Created reusable routines in Java code for custom logic implementation within the Talend job framework.
Implemented error handling techniques in Talend jobs by configuring components like tDie, tWarn and tJavaFlex.
Developed and maintained data pipelines to ingest, store, process and analyze large datasets in AWS S3 buckets.
Created ETL processes using Python scripts to move data from various sources into the target databases on AWS Redshift or RDS.
Developed stored procedures, functions and triggers to support application requirements.
Created database objects like tables, views, indexes and synonyms in Oracle 11g and 12c databases.
Optimized existing queries for better performance using SQL query tuning techniques.
Performed database tuning activities such as indexing, partitioning and performance monitoring.
Created web services for data exchange between client-server applications using SOAP and RESTful web services.
Talend Developer
Dynpro India Pvt Ltd
Bangalore, Karnataka
06.2014 - 02.2018
Developed ETL jobs to extract data from multiple sources such as Oracle, MySQL and flat files using Talend Studio.
Designed complex mappings for data transformation from source to target systems with Talend components like tMap, tJoin, tAggregate.
Implemented error handling techniques in Talend jobs by configuring components like tDie, tWarn and tJavaFlex.
Created reusable routines in Java code for custom logic implementation within the Talend job framework.
Migrated existing ETL scripts into Talend jobs for better performance optimization and scalability.
Tuned Talend jobs for improved performance through query optimization and parallelization of tasks.
Integrated various web services with Talend jobs to access remote APIs or resources using REST and SOAP protocols.
Monitored the execution of scheduled jobs on a daily basis and troubleshot any issues encountered at runtime.
Involved in setting up data integration processes between different applications using Talend DI suite of products.
Utilized version control tools like SVN and Git to manage concurrent development activities and maintain job versions.
Coordinated with business users to understand requirements related to data extraction, transformation and loading process.
Configured scheduling options to automate the execution of developed ETL processes at regular intervals.
Optimized existing jobs by restructuring mappings and component configurations based on best practices suggested by vendors.
Reviewed project requirements to identify customer expectations and resources needed to meet goals.
OBIEE/ ODI Developer
Value Labs
Hyderabad, Telangana
04.2012 - 01.2013
Developed ETL packages to extract, transform, and load data from multiple sources into a data warehouse.
Created complex ad-hoc reports, dashboards, and scorecards in OBIEE to support business needs.
Developed technical solutions using Oracle Business Intelligence Enterprise Edition (OBIEE) and Oracle Data Integrator (ODI) tools.
Tuned queries with proper usage of hints, aggregation functions, and other features available in the OBIEE toolset.
Integrated various applications with OBIEE to provide seamless access to BI content across the enterprise.
Education
Bachelor of Technology - Computer Science and Engineering
P.V.P.S.T. College of Engineering
Vijayawada
Skills
Data Warehousing
Data Modeling
Business Intelligence
Data Migration
Data Validation
Python Programming
Big Data Technologies
SQL and Databases
Talend
Informatica
OWB
ODI
OBIEE
SQL
PL/SQL
Oracle
Postgres
SQL Server
Snowflake
Redshift
Databricks
ADF
Insurance
Banking
Retail
GitHub