
Yamini Suguru

PITTSBURGH

Summary

Dynamic Sr. Application Developer with expertise in ETL processes and Informatica tools. Proven ability to automate workflows and enhance data integration, leading to improved operational efficiency. Strong communicator and collaborator, adept at translating complex requirements into actionable solutions. Committed to delivering high-quality data-driven insights for strategic decision-making.

Overview

8 years of professional experience

Work History

Sr. Application Developer

Highmark Health
04.2022 - 04.2025

· Worked as Production Support Lead, managing a team of 6 developers, leading project discussions, and working closely with customers on requirements gathering.

· Developed Informatica mappings and SQL stored procedures, working with different transformations and parameterization.

· Hands-on experience with data ingestion and data analysis.

· Applied database concepts related to Hive and Oracle, and performance tuning procedures in Informatica.

· Worked as an ETL developer using the Informatica DEI tool for on-premises-to-cloud migrations; also worked on the Low Code Extract Framework UI using BigQuery.

· Automated jobs using parameterization, scheduling, and event-driven triggers.

· Worked on Z140 with business analysts, using the data mapping document to initiate mappings.

· Created Hive tables in the Hadoop environment, created connections in the tool to databases such as Teradata, Hadoop, Hive, and Salesforce, and migrated code between environments.

· Performed analysis, design, and development of ETL processes to support project requirements.

· Developed mappings for the CDC (Change Data Capture) recycle process.

· Developed end-to-end data pipelines using Microsoft Fabric to support data analytics and real-time insights across business units.

· Leveraged Microsoft Fabric’s integrated data environment to streamline ETL processes and manage data workflows for large-scale data operations.

· Worked closely with data scientists to integrate machine learning models into business processes using Microsoft Fabric’s built-in ML tools.

· Used Fabric to simplify data engineering tasks such as ETL (Extract, Transform, Load), automating and scaling data workflows.

· Worked on Lakehouse architecture, which combines the benefits of data lakes and data warehouses to handle both structured and unstructured data efficiently.

· Worked with both the Blaze and Spark engines.

· Worked with large volumes of data, performing data analysis to understand requirements and determine the best approach.

· Worked with customers on development possibilities, requesting additional information as needed for quality delivery.

· Used HDFS and Hive as sources in BDM, loading data to Oracle.

· Worked on ESP scheduling with the PPS team for batch and real-time loads.

· Performed unit testing, helped testers in QA environments by providing test cases, and worked with customers to resolve issues or concerns related to UAT.

· Performed peer reviews of mappings and workflows when required.

· Continuously self-reviewed code and made changes where required.

· Worked on issues related to Integration Services and OS profiles.

· Worked with the offshore team on development activities.

· Worked with the production support team on issues related to failures.

· Maintained documentation for all code-related designs and SQL scripts.

· Outstanding ability to communicate, both verbally and in writing.

Environment: Informatica BDM 10.5.5, Informatica Administrator DEI tool for Monitoring, Mappings, Workflows, Sessions, Reusable Transformations, Microsoft Fabric for ETL activities, Control Center, Hue, Teradata SQL Assistant, TOAD for Oracle 12.12, PuTTY, LCEF UI

Application Developer

HM Health Solutions Inc.
02.2022 - 04.2022

· Developed Informatica mappings and SQL stored procedures, working with different transformations and parameterization.

· Hands-on experience with data ingestion and data analysis.

· Applied database concepts related to Hive and Oracle, and performance tuning procedures in Informatica.

· Worked as an ETL developer using the Informatica DEI tool for on-premises-to-cloud migrations; also worked on the Low Code Extract Framework UI using BigQuery.

· Automated jobs using parameterization, scheduling, and event-driven triggers.

· Worked on Z140 with business analysts, using the data mapping document to initiate mappings.

· Created Hive tables in the Hadoop environment, created connections in the tool to databases such as Teradata, Hadoop, Hive, and Salesforce, and migrated code between environments.

· Performed analysis, design, and development of ETL processes to support project requirements.

· Developed mappings for the CDC (Change Data Capture) recycle process.

· Developed end-to-end data pipelines using Microsoft Fabric to support data analytics and real-time insights across business units.

· Leveraged Microsoft Fabric’s integrated data environment to streamline ETL processes and manage data workflows for large-scale data operations.

· Worked closely with data scientists to integrate machine learning models into business processes using Microsoft Fabric’s built-in ML tools.

· Used Fabric to simplify data engineering tasks such as ETL (Extract, Transform, Load), automating and scaling data workflows.

· Worked on Lakehouse architecture, which combines the benefits of data lakes and data warehouses to handle both structured and unstructured data efficiently.

· Worked with both the Blaze and Spark engines.

· Worked with large volumes of data, performing data analysis to understand requirements and determine the best approach.

· Worked with customers on development possibilities, requesting additional information as needed for quality delivery.

· Used HDFS and Hive as sources in BDM, loading data to Oracle.

· Worked on ESP scheduling with the PPS team for batch and real-time loads.

· Performed unit testing, helped testers in QA environments by providing test cases, and worked with customers to resolve issues or concerns related to UAT.

· Performed peer reviews of mappings and workflows when required.

· Continuously self-reviewed code and made changes where required.

· Worked on issues related to Integration Services and OS profiles.

· Worked with the offshore team on development activities.

· Worked with the production support team on issues related to failures.

· Maintained documentation for all code-related designs and SQL scripts.

· Outstanding ability to communicate, both verbally and in writing.

Environment: Informatica BDM 10.5.5, Informatica Administrator DEI tool for Monitoring, Mappings, Workflows, Sessions, Reusable Transformations, Microsoft Fabric for ETL activities, Control Center, Hue, Teradata SQL Assistant, TOAD for Oracle 12.12, PuTTY, LCEF UI

Application Developer

DATAEDGE INC
10.2020 - 02.2022

· Configured and managed cloud data integration and API services for data flow.

· Developed Informatica mappings and SQL stored procedures, working with different transformations and parameterization.

· Hands-on experience with data ingestion and data analysis.

· Applied database concepts related to Hive and Oracle, and performance tuning procedures in Informatica.

· Worked as an ETL developer using the Informatica DEI tool for on-premises-to-cloud migrations; also worked on the Low Code Extract Framework UI using BigQuery.

· Automated jobs using parameterization, scheduling, and event-driven triggers.

· Worked on Z140 with business analysts, using the data mapping document to initiate mappings.

· Created Hive tables in the Hadoop environment, created connections in the tool to databases such as Teradata, Hadoop, Hive, and Salesforce, and migrated code between environments.

· Performed analysis, design, and development of ETL processes to support project requirements.

· Developed mappings for the CDC (Change Data Capture) recycle process.

· Developed end-to-end data pipelines using Microsoft Fabric to support data analytics and real-time insights across business units.

· Leveraged Microsoft Fabric’s integrated data environment to streamline ETL processes and manage data workflows for large-scale data operations.

· Worked closely with data scientists to integrate machine learning models into business processes using Microsoft Fabric’s built-in ML tools.

· Used Fabric to simplify data engineering tasks such as ETL (Extract, Transform, Load), automating and scaling data workflows.

· Worked on Lakehouse architecture, which combines the benefits of data lakes and data warehouses to handle both structured and unstructured data efficiently.

· Worked with both the Blaze and Spark engines.

· Worked with large volumes of data, performing data analysis to understand requirements and determine the best approach.

· Worked with customers on development possibilities, requesting additional information as needed for quality delivery.

· Used HDFS and Hive as sources in BDM, loading data to Oracle.

· Worked on ESP scheduling with the PPS team for batch and real-time loads.

· Performed unit testing, helped testers in QA environments by providing test cases, and worked with customers to resolve issues or concerns related to UAT.

· Performed peer reviews of mappings and workflows when required.

· Continuously self-reviewed code and made changes where required.

· Worked on issues related to Integration Services and OS profiles.

· Worked with the offshore team on development activities.

· Worked with the production support team on issues related to failures.

· Maintained documentation for all code-related designs and SQL scripts.

· Outstanding ability to communicate, both verbally and in writing.

Environment: Informatica BDM 10.5.5, Informatica Administrator DEI tool for Monitoring, Mappings, Workflows, Sessions, Reusable Transformations, Microsoft Fabric for ETL activities, Control Center, Hue, Teradata SQL Assistant, TOAD for Oracle 12.12, PuTTY, LCEF UI

Application Developer

DATAEDGE INC
08.2019 - 10.2020

· Configured and managed cloud data integration and API services for data flow.

· Developed Informatica mappings and SQL stored procedures, working with different transformations and parameterization.

· Hands-on experience with data ingestion and data analysis.

· Applied database concepts related to Hive and Oracle, and performance tuning procedures in Informatica.

· Worked as an ETL developer using the Informatica DEI tool for on-premises-to-cloud migrations; also worked on the Low Code Extract Framework UI using BigQuery.

· Automated jobs using parameterization, scheduling, and event-driven triggers.

· Worked on Z140 with business analysts, using the data mapping document to initiate mappings.

· Created Hive tables in the Hadoop environment, created connections in the tool to databases such as Teradata, Hadoop, Hive, and Salesforce, and migrated code between environments.

· Performed analysis, design, and development of ETL processes to support project requirements.

· Developed mappings for the CDC (Change Data Capture) recycle process.

· Developed end-to-end data pipelines using Microsoft Fabric to support data analytics and real-time insights across business units.

· Leveraged Microsoft Fabric’s integrated data environment to streamline ETL processes and manage data workflows for large-scale data operations.

· Worked closely with data scientists to integrate machine learning models into business processes using Microsoft Fabric’s built-in ML tools.

· Used Fabric to simplify data engineering tasks such as ETL (Extract, Transform, Load), automating and scaling data workflows.

· Worked on Lakehouse architecture, which combines the benefits of data lakes and data warehouses to handle both structured and unstructured data efficiently.

· Worked with both the Blaze and Spark engines.

· Worked with large volumes of data, performing data analysis to understand requirements and determine the best approach.

· Worked with customers on development possibilities, requesting additional information as needed for quality delivery.

· Used HDFS and Hive as sources in BDM, loading data to Oracle.

· Worked on ESP scheduling with the PPS team for batch and real-time loads.

· Performed unit testing, helped testers in QA environments by providing test cases, and worked with customers to resolve issues or concerns related to UAT.

· Performed peer reviews of mappings and workflows when required.

· Continuously self-reviewed code and made changes where required.

· Worked on issues related to Integration Services and OS profiles.

· Worked with the offshore team on development activities.

· Worked with the production support team on issues related to failures.

· Maintained documentation for all code-related designs and SQL scripts.

· Outstanding ability to communicate, both verbally and in writing.

Environment: Informatica BDM 10.5.5, Informatica Administrator DEI tool for Monitoring, Mappings, Workflows, Sessions, Reusable Transformations, Microsoft Fabric for ETL activities, Control Center, Hue, Teradata SQL Assistant, TOAD for Oracle 12.12, PuTTY, LCEF UI

Software Developer

Premier IT Solutions LLC
01.2019 - 08.2019

1) Held meetings and discussions with professional staff; analyzed computer software system and network needs and problems, including designing necessary modifications and training users;
2) Studied existing systems, read manuals, and applied information systems principles to best modify new and existing software;
3) Designed software systems, networks, and modifications as necessary; designed training procedures for use of the systems; provided troubleshooting and technical assistance; upgraded computer hardware and software as necessary;
4) Wrote code implementing fundamental changes to the system software;
5) Tested the code and modified it as necessary

Software Developer

Nemo It Solutions
02.2017 - 01.2019

1) Held meetings and discussions with professional staff; analyzed computer software system and network needs and problems, including designing necessary modifications and training users;
2) Studied existing systems, read manuals, and applied information systems principles to best modify new and existing software;
3) Designed software systems, networks, and modifications as necessary; designed training procedures for use of the systems; provided troubleshooting and technical assistance; upgraded computer hardware and software as necessary;
4) Wrote code implementing fundamental changes to the system software;
5) Tested the code and modified it as necessary

Education

Master of Science - Electrical Engineering

NORTHWESTERN POLYTECHNIC UNIVERSITY
Fremont, CA
12-2016

Master of Science - Information Technology

University of The Cumberlands
Williamsburg, KY
09-2019

Ph.D. - Information Technology

University of The Cumberlands
Williamsburg, KY

Bachelor of Science -

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY
HYDERABAD, INDIA
04-2014

Intermediate

Narayana JR College
Dilsukhnagar, INDIA
03-2010

Secondary Education

Bhashyam Public School
HYDERABAD, INDIA
03-2008

Skills

    ETL Tools

    Informatica PowerCenter 10.2/10.1/9.6.1/9.5/9.1, Administration, Informatica BDM 10.2.1/10.2.2/10.5.3/10.5.5, Informatica Intelligent Cloud Services (IICS), Informatica PowerExchange, Metadata Manager, Informatica Data Explorer (IDE), Low Code Extract Framework UI (BigQuery)

    Database Tools

    SQL Server Management Studio (2008), Oracle SQL Developer (3.0), HUE, Toad 11.6 (Oracle), DB2, Teradata, Oracle, SQL Browser (Oracle Sybase), Visio, ERWIN

    Reporting Tools

    Business Objects XI R2/6.1/5.0, QlikView, MicroStrategy, Oracle Analytics

    Operating Systems

    Windows Server 2008/2003, Windows 7/XP/98/95

    Scheduling tools

    Informatica Scheduler, Control-M

    Web Technologies

    HTML, XML

    Languages

    C, SQL, XML

Timeline

Sr. Application Developer

Highmark Health
04.2022 - 04.2025

Application Developer

HM Health Solutions Inc.
02.2022 - 04.2022

Application Developer

DATAEDGE INC
10.2020 - 02.2022

Application Developer

DATAEDGE INC
08.2019 - 10.2020

Software Developer

Premier IT Solutions LLC
01.2019 - 08.2019

Software Developer

Nemo It Solutions
02.2017 - 01.2019

Master of Science - Electrical Engineering

NORTHWESTERN POLYTECHNIC UNIVERSITY

Master of Science - Information Technology

University of The Cumberlands

Ph.D. - Information Technology

University of The Cumberlands

Bachelor of Science -

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY

Intermediate

Narayana JR College

Secondary Education

Bhashyam Public School