RAKHESH KASAGANI

Atlanta, GA

Summary

  • 14 years of experience in Information Technology and Engineering, focused on delivering data-driven solutions that improve the efficiency, accuracy, and utility of internal and external data processing.
  • Extensive experience managing various databases and file formats (txt, csv, tab-delimited, Excel, XML, PDF), performing data integration, transformation, aggregation, pattern evaluation, cleansing, mining, and other data-wrangling requests from the business.
  • Expert in data handling and ETL process design.
  • Familiar with the major cloud platforms: AWS, Azure, and GCP.
  • Good knowledge of Hadoop, Couchbase (NoSQL), Kafka, AWS, APIs, Python, R, and statistical tools and methods.

Overview

14 years of professional experience
2 Certifications

Work History

Sr Data Engineer/AI Engineer

CDW
12.2022 - Current

Client: CDC (Centers for Disease Control and Prevention – Federal project)


Project: NBS Modernization

  • Analyzed the HL7 RIM model to map relationships within the CDC NBS environment.
  • Participated in sprint reviews, backlog grooming, and planning for effective task prioritization.
  • Optimized stored procedures, improving query performance and reducing ETL processing time.
  • Demonstrated POCs using Talend, NiFi, and DBT for data migration to AWS S3.
  • Developed NiFi and Talend ETL jobs, replacing SQL stored procedures with efficient pipelines.
  • Proficient in Talend and Apache NiFi, conducting demos and building workflows for diverse data sources.
  • Environment: Talend, Apache NiFi, SQL Server, JIRA, Confluence, Mural, Agile


Project: NIOSH

  • Documented JIRA tasks and organized them in Confluence for clear tracking.
  • Developed AWS Glue jobs to move data from S3 to various databases (a sketch follows this list).
  • Explored on-prem to AWS Cloud migration options.
  • Built a time-series model POC using AWS SageMaker and Jupyter Notebook.
  • Assigned user policies and managed service permissions in AWS.
  • Environment: AWS (Glue, Athena, S3, RDS, EC2, Secrets Manager, etc.)
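To make the Glue pattern above concrete, here is a minimal sketch of a Glue PySpark job that copies CSV data from S3 into a JDBC-connected database. The bucket, connection, and table names are hypothetical placeholders, not the actual project resources.

    # Minimal AWS Glue (PySpark) job sketch: copy CSV data from S3 into a
    # JDBC target. All resource names below are hypothetical placeholders.
    import sys
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw CSV files from S3 into a DynamicFrame.
    source = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-bucket/raw/"]},
        format="csv",
        format_options={"withHeader": True},
    )

    # Rename and cast columns before loading.
    mapped = ApplyMapping.apply(
        frame=source,
        mappings=[("id", "string", "id", "int"),
                  ("name", "string", "name", "string")],
    )

    # Write through a connection registered in the Glue Data Catalog.
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=mapped,
        catalog_connection="example-rds-connection",
        connection_options={"dbtable": "public.target_table", "database": "exampledb"},
    )
    job.commit()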


Project: Sales Enablement Bot

  • Created Sales Enablement chatbots using Microsoft Copilot Studio and Power Automate.
  • Experienced in building topics and flows and publishing bots on Microsoft Teams.
  • Integrated Power Automate with AWS Lambda, handling AWS Signature (SigV4) traffic between AWS Bedrock (RAG bot) and Azure.
  • Evaluated Anthropic models against OpenAI and chose the Claude v3 model to build the RAG bot.
  • Retrieved AWS Signature via API calls between Azure and Lambda function URL endpoints (see the sketch after this list).
  • Created Azure Functions to handle security between the Azure and AWS endpoints.
  • Worked on configuring app registrations so other applications could interact with Teams.
  • Gained expertise in generative AI and prompt engineering.
  • Environment: Microsoft Copilot Studio, Power Apps, Microsoft Azure, AWS (Bedrock, Lambda)
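The signature handling above can be illustrated with a short, hedged sketch: an Azure-side caller signs a Lambda function URL request with SigV4 before forwarding a prompt to the Bedrock-backed RAG bot. The URL, region, and credentials below are hypothetical placeholders.

    # Minimal SigV4 signing sketch for an IAM-protected Lambda function URL.
    # URL, region, and credentials are hypothetical placeholders.
    import json
    import requests
    from botocore.auth import SigV4Auth
    from botocore.awsrequest import AWSRequest
    from botocore.credentials import Credentials

    FUNCTION_URL = "https://abc123.lambda-url.us-east-1.on.aws/"
    creds = Credentials(access_key="AKIA...", secret_key="...")

    payload = json.dumps({"question": "Summarize the latest sales playbook."})
    request = AWSRequest(method="POST", url=FUNCTION_URL, data=payload,
                         headers={"Content-Type": "application/json"})

    # Lambda function URLs with IAM auth expect a signature for the
    # "lambda" service in the function's region.
    SigV4Auth(creds, "lambda", "us-east-1").add_auth(request)

    # Forward the signed headers with the same body.
    response = requests.post(FUNCTION_URL, data=payload,
                             headers=dict(request.headers))
    print(response.status_code, response.text)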




Sr Talend Engineer

Elastic
02.2022 - 11.2022
  • Worked in an agile Scrum environment with biweekly sprints
  • Actively participated in backlog and refinement meetings, weighing the workload of various integrations
  • Involved in gathering business requirements and documenting the various attributes of a project
  • Independently led a project, communicating with business users and an external vendor
  • Identified new, constructive techniques to retrieve data from the cloud by looping the retrieval process multiple levels down (see the sketch after this list)
  • Implemented error handling and notifications inside the Talend integration tool
  • Improved ETL job performance by introducing variables for various internal configurations and applying other techniques to speed up the overall process
  • Dove deep into triggering, executing, retrieving, and parsing payloads and reports using the Talend data integration tool
  • Used various Talend components to read, write, and parse data coming from various cloud systems and integrate it into other applications (involving cloud and Snowflake environments)
  • Tested, peer-reviewed, and deployed code into production as needed by the team
  • Dealt with complex integration jobs by supporting production fixes
  • Analyzed various Talend jobs and documented important steps to enhance the overall process
  • Worked on Workday data integration and supported enhancements recommended by the business
  • Retrieved various reports and payloads from API applications, parsed them, and loaded them into their target systems
  • Worked with Azure Data Factory to build pipelines and data flows for data migrations
  • Designed and built ETL and ELT pipelines using Azure Data Factory and Azure Databricks
  • Designed and built pipelines to migrate data from on-prem databases and REST APIs to Azure Data Lake Storage and Azure Synapse Analytics
  • Retrieved data from various containers and transformed it into complex structured files per the requirements provided
  • Connected to and worked on various VMs using RDP and Bastion
  • Analyzed various settings to learn and enhance the Azure platform
  • Created a small framework to integrate various data flows via Talend as well as Azure Data Factory
  • Environment: Talend Cloud Data Management Platform, Artiva, Azure, GCP (Google BigQuery), Salesforce, NetSuite, Okta, GitHub, JIRA, Monday, Confluence, SaaS, Zuora, Workday, Concur, Slack, Lucidchart, Elastic Academy (Docebo), Elastic Cloud, Snowflake, etc.
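As a sketch of the multi-level retrieval technique noted above, assuming a generic paginated REST API with a next-page link (all endpoint, token, and field names are hypothetical):

    # Page through a cloud API, then fetch the detail payload behind each
    # list entry. Endpoint, token, and field names are hypothetical.
    import requests

    BASE = "https://api.example.com"
    HEADERS = {"Authorization": "Bearer <token>"}

    def fetch_all(path):
        """Yield items from every page until the API stops returning a next link."""
        url = f"{BASE}{path}"
        while url:
            resp = requests.get(url, headers=HEADERS, timeout=30)
            resp.raise_for_status()
            body = resp.json()
            yield from body["items"]
            url = body.get("nextPage")  # None ends the loop

    # Level 1: list the reports; level 2: pull each report's payload.
    for report in fetch_all("/v1/reports"):
        detail = requests.get(f"{BASE}/v1/reports/{report['id']}",
                              headers=HEADERS, timeout=30)
        detail.raise_for_status()
        # ...parse the payload and stage it for the Talend job...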




Data Integration Specialist

Domino's Pizza, LLC
12.2015 - 02.2022
  • As a Data Integration Specialist, used various techniques to identify, analyze, and transform data for various stakeholders and business partners
  • Worked on various predictive-analysis projects using forward-looking data (sales, weather, shifts, warehouse, etc.)
  • Built various data integration solutions to handle batch/streaming data on the ETL platform for various applications and microservices
  • Strong understanding of relational database concepts and data processing concepts
  • Integrated API calls, transforming JSON files into the required format through SQL as well as Talend components, and made the data available for reporting
  • Used Talend Data Integration to send/retrieve data to/from Salesforce (internal and external)
  • Also executed queries and reports inside Salesforce to validate the data flow
  • Converted DataStage ETL jobs to Talend ETLs with proper validations and unit testing
  • Created Tidal/Control-M jobs to run the ETL jobs on specific schedules (considering all predecessors and successors)
  • Extensive experience with TAC (Talend Administration Center): setting up users, granting required permissions, removing locks on ETLs, monitoring job runs, etc.
  • Created Talend ETL processes to load sources (files: txt, csv, xml; Camelot DB2 database; Axway; FTP; other databases) into targets (Hadoop, Netezza, SQL Server, files, APIs, Couchbase, S3, etc.) with the required transformations
  • Used Kafka and N1QL scripts to load data from the data warehouse into Couchbase (see the sketch after this list)
  • Used Talend Data Integration to integrate data between the data warehouse and various applications such as SAP, Salesforce, SmartRecruiters (ATS), Pulse, S3, Couchbase, etc.
  • Resolved production issues and corrected/enhanced ETL jobs to maintain data integrity across all environments (source, staging, and target)
  • Good hands-on experience with Aqua, SSMS, Git/Stash, JIRA, etc.
  • Responsible for delivering assigned projects in a timely manner with the desired quality
  • Environment: Talend Data Integration 5.x/6.x/7.x, DataStage 8.x, MicroStrategy, DB2, Netezza, SQL Server, Salesforce, Aqua Data Studio, S3, Couchbase, Greenplum, Profisee, Kafka, ATS, APIs, Hadoop, Microsoft Azure, etc.
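A minimal sketch of the Kafka + N1QL load path above, assuming warehouse change records arrive as JSON on a Kafka topic and are upserted through Couchbase's N1QL REST query service; the topic, bucket, host, and credential names are hypothetical placeholders.

    # Consume warehouse records from Kafka and upsert them into Couchbase
    # via the N1QL REST query service. All names are hypothetical.
    import json
    import requests
    from kafka import KafkaConsumer  # kafka-python

    consumer = KafkaConsumer(
        "warehouse-changes",
        bootstrap_servers=["kafka-host:9092"],
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    N1QL_ENDPOINT = "http://couchbase-host:8093/query/service"
    STATEMENT = "UPSERT INTO `stores` (KEY, VALUE) VALUES ($key, $doc)"

    for message in consumer:
        record = message.value
        # Named N1QL parameters are passed alongside the statement.
        resp = requests.post(
            N1QL_ENDPOINT,
            auth=("svc_user", "secret"),
            json={"statement": STATEMENT,
                  "$key": f"store::{record['id']}",
                  "$doc": record},
        )
        resp.raise_for_status()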



ETL Developer

Ohio State University Wexner Medical Center
12.2014 - 12.2015

  • Domain: Health Care (State project)
  • Project: Informatics for Integrating Biology & the Bedside (i2b2) Enterprise Implementation
  • Environment: i2b2 1.7.0, DataStage 8.7, SQL Server 2012, Oracle 11/12, UNIX, EPIC

Client

State of Ohio Department of Taxation
04.2013 - 12.2014

  • Domain: State Tax (State project)
  • Project: STARS
  • Environment: ETPM, DataStage 8.5, Cognos, Oracle, PeopleSoft, UNIX, ESB, Agile methodology, Scrum meetings

Client

McKesson Provider Technology
05.2011 - 03.2013

  • Domain: Health Care
  • Project: MTS FIN Optimization – BPR Data Conversion
  • Environment: IBM Information Server, Oracle, HP 3000, PeopleSoft, UNIX, SAP, EPIC

Client

Emblem Health
01.2011 - 04.2011

  • Domain: Health Care
  • Project: Inventory Health
  • Environment: IBM Information Server 8.1, Oracle 9i, AutoSys, IBM AIX 5.3, VSS

Client

C R BARD
07.2010 - 12.2010

  • Domain: Health Care
  • Project: Global Inventory
  • Environment: DataStage 7.5.2, MicroStrategy 8, Erwin, Progress DB, Oracle, PeopleSoft, UNIX, shell scripts, Control-M


DataStage responsibilities:

  • Engineered and executed IBM DataStage ETL processes, including design, development, and deployment
  • Performed data mappings and transformations to move data from source to target systems
  • Optimized ETL processes by identifying and resolving performance bottlenecks
  • Collaborated in code reviews and deployed robust ETL scripts into production
  • Utilized Parallel Extender stages (e.g., Remove-Duplicates, Sort, Funnel) to streamline data processing
  • Created DataStage jobs using Stage Variables, Derivations, and Constraints
  • Addressed discrepancies from User Acceptance Testing to ensure ETL process integrity
  • Managed job transitions across DataStage versions with impact analyses
  • Transformed UNIX shell scripts and PL/SQL into DataStage ETL jobs with documentation
  • Supported DataStage production environment for stability and performance


Education

Master’s Degree

The University of New Haven
CT, USA
01.2009

Skills

    ETL: IBM Information Server (8.x)/DataStage 8.x/7.x/6.x, Pentaho Data Integration, Talend 5.4.2/6.1/7.0/8, Apache NiFi, DBT

    AI: Generative AI, NLP (Natural Language Processing), Chatbots

    Reporting: Cognos, MicroStrategy

    Databases: SQL Server, Oracle, MS Access, DB2, Progress DB, PeopleSoft, HP 3000, Siebel, SAP, Netezza, Couchbase

    Cloud Technologies: AWS, Azure, GCP

    Other tools: ETPM, PL/SQL and UNIX shell scripts, i2b2, Maestro, Snowflake, Microsoft Copilot Studio, Power Apps, Salesforce, etc.

Certifications

  • AWS Certified Cloud Practitioner, https://www.credly.com/earner/earned/badge/afe57080-8ff8-4c8d-8839-47be05cc0683
  • AWS Certified Data Analytics – Specialty, https://www.credly.com/badges/0dadb8a0-4dff-4521-87a8-19e03b716a3d

Availability

Immediately 

Work Authorization

Green Card
