Prasanthi Gogineni

Beaumont, TX

Summary

  • Around 4 years of experience in the IT industry with an emphasis on ETL Development, Data Modeling, Data Analysis, SQL, and PL/SQL Development.
  • Worked across multiple domains, including Data Integration, Data Mining, and Database Security, for various clients.
  • Currently working with Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor. Experienced in building SSIS packages that move data from different sources to destination servers.
  • Proficient in leveraging diverse data feeds to design and create robust solutions tailored to organizational needs.
  • Skilled in utilizing SQOOP for seamless data transfer between Hive tables and Oracle databases, facilitating efficient data manipulation and report generation for users.
  • Demonstrated expertise in processing and optimizing data workflows within Hadoop Distributed File System (HDFS) using Spark, ensuring enhanced processing speed and performance.
  • Proficient in employing SQL ETL techniques to develop Data Warehouse applications in relational database management systems (RDBMS), enabling streamlined data integration and analysis.

Overview

6 years of professional experience

Work History

Informatica ETL / SQL Developer

Amadeus IT Group
12.2019 - 08.2022
  • Worked with Informatica PowerCenter tools (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Business Glossary, Informatica Lineage, and DVO; analyzed the customized package and PL/SQL code provided by Informatica to deliver a solution for lineage automation.
  • Converted stored procedures to ETL mappings using Informatica.
  • Created the Incremental Aggregated Transaction Fact.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables (a minimal SQL sketch follows this role's environment line).
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, Union, and SQL transformation to develop robust mappings in the Informatica Designer.
  • Processed the data load using Informatica, grouped into four batches: initial load to the staging area, initial load to the facts and dimensions, incremental load from source systems to the staging area, and incremental load from staging to the facts and dimensions, plus a few independent sessions (a watermark-based incremental-load sketch also follows this role's environment line).
  • Created SSIS packages using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, and Execute Package Task.
  • Generated the underlying data for reports and exported cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Created shared dimension tables, measures, hierarchies, levels, cubes, and aggregations on MS OLAP / Analysis Services (SSAS).
  • Generated periodic reports based on statistical analysis of data from various time frames and divisions using SQL Server Reporting Services (SSRS).
  • Created and partitioned SSAS OLAP cubes and monitored full and incremental loads using the Cube Wizard.
  • Gathered functional and non-functional client requirements to optimize the design of BI deliverables (reports, dashboards, alerts, visualizations).
  • Designed and developed SSIS packages for loading data from text and CSV files into SQL Server databases.
  • Extracted data from legacy systems and provided it in the new environment.
  • Utilized Hadoop infrastructure to process large data sets using Spark, MapReduce, and Hive; built frameworks with Spark to perform data transformations.
  • Designed and developed end-to-end ETL and BI solutions by sourcing data from various data feeds.
  • Used SQOOP to export/import Hive tables and load them into Oracle for data manipulation and user reporting.
  • Processed and manipulated data in HDFS using Spark, designing optimal solutions for faster processing.
  • Wrote shell scripts to maintain production environments.
  • Environment: Informatica PowerCenter 9.5.2 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), OBIEE 10.1.3.4, SQL Query Analyzer 8.0, Oracle 10g/11g, MS SQL Server, SQL Developer, Windows, Unix/Linux, PuTTY, Spark, SQOOP.
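
A minimal SQL sketch of the Type 2 SCD load referenced above, assuming hypothetical STG_CUSTOMER and DIM_CUSTOMER tables and a DIM_CUSTOMER_SEQ sequence (Oracle syntax); the actual mappings implement equivalent logic with Lookup and Update Strategy transformations:

    -- Step 1: expire the current version of any dimension row whose
    -- source attributes have changed
    UPDATE dim_customer d
       SET d.effective_end_date = SYSDATE,
           d.is_current         = 'N'
     WHERE d.is_current = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.city <> d.city));

    -- Step 2: insert a new current version for changed and brand-new
    -- customers (changed customers no longer have a current row)
    INSERT INTO dim_customer
           (customer_key, customer_id, customer_name, city,
            effective_start_date, effective_end_date, is_current)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.city,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.is_current = 'Y');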
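
A watermark-based sketch of the incremental staging load described above, assuming hypothetical SRC_ORDERS, STG_ORDERS, and ETL_WATERMARK tables; the production batches implement the same pattern through Informatica sessions:

    -- Stage only the source rows modified since the last successful load
    INSERT INTO stg_orders (order_id, customer_id, order_amount, last_update_ts)
    SELECT o.order_id, o.customer_id, o.order_amount, o.last_update_ts
      FROM src_orders o
     WHERE o.last_update_ts > (SELECT w.last_load_ts
                                 FROM etl_watermark w
                                WHERE w.table_name = 'STG_ORDERS');

    -- Advance the watermark once the load commits
    UPDATE etl_watermark
       SET last_load_ts = (SELECT MAX(last_update_ts) FROM stg_orders)
     WHERE table_name = 'STG_ORDERS';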

SQL MSBI Developer

Amdocs
01.2017 - 12.2019
  • Designed conceptual, logical, and physical data models, data dictionaries, and metadata repositories.
  • Worked with the project management team to prioritize business, information, and reporting needs.
  • Worked with enterprise data architects to drive acceptance of an enterprise view of data and its use.
  • Coordinated with DBAs in creating and managing tables, indexes, tablespaces, sequences, views, and materialized views.
  • Generated DDL scripts using forward engineering to create objects and deploy them into the databases (an example DDL sketch follows this role's environment line).
  • Partnered with Data Governance and other data committees to implement a common vocabulary and a shared understanding of business entities and the relationships between them.
  • Provided thought leadership in advanced data techniques, including data modeling, data access, data integration, data visualization, text mining, data discovery, statistical methods, and database design and implementation.
  • Developed and documented data requirements and design specifications in the form of data models, data mappings, and data quality metrics.
  • Mapped out the structure and organization of the relevant data for Aflac.
  • Worked with internal customers and business units to understand business requirements and processes, design data warehouse schemas, and define extract, transform, and load (ETL) processes for data teams.
  • Assisted with the analysis, design, development, and implementation of logical data models, physical database objects, data conversion, integration and loading processes, query and reporting functions, data management and governance, and data quality assurance processes.
  • Worked in collaboration with data analysts and ETL developers to provide source-to-target ETL requirements and ensure efficient transformation and loading.
  • Assisted in maintaining and enhancing the metadata infrastructure, the data dictionary, and business metadata, and facilitated publishing the information to the business and technical communities.
  • Partnered with the business to create appropriate business rules for data usage.
  • Environment: Informatica PowerCenter 9.5.1, Oracle, Windows 7, IDQ, SQL Developer 2005, UNIX, PL/SQL, Autosys, Business Objects, AIX, SVN, Tidal, Wave Analytics.
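
An example of the kind of DDL generated from the physical model, assuming a hypothetical CLAIMS table (Oracle syntax); the real scripts were produced by forward engineering from the modeling tool:

    -- Table, primary key, supporting index, and surrogate-key sequence
    CREATE TABLE claims (
        claim_id      NUMBER(12)     NOT NULL,
        policy_id     NUMBER(12)     NOT NULL,
        claim_amount  NUMBER(12,2),
        claim_date    DATE           DEFAULT SYSDATE,
        CONSTRAINT pk_claims PRIMARY KEY (claim_id)
    );

    CREATE INDEX ix_claims_policy ON claims (policy_id);

    CREATE SEQUENCE claims_seq START WITH 1 INCREMENT BY 1;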

Education

Masters - Computer Science

Lamar University
Beaumont, Texas
12.2023

Bachelors - Information Technology

SRK Institute Of Technology
Vijayawada, Andhra Pradesh, India
05.2013

Skills

  • Data extraction, transformation, and loading using Informatica
  • Proficient in writing complex SQL queries, stored procedures, normalization, database design, creating indexes, functions, triggers, and sub-queries (a PL/SQL sketch follows this list)
  • Created reports for users using Tableau by connecting to multiple data sources like flat files, MS Excel, CSV files, SQL Server, and Oracle
  • Extensively used VLOOKUPs, pivot tables, and Excel functions; created pivot charts, reports, and dashboards
  • Configured Power BI dashboards and reports
  • Experience working in team-based Agile/Scrum development environments
  • Good command of the Business Intelligence (BI) tool Cognos
  • Extensively used Shell scripting to maintain production environments
  • Experience with data extraction, transformation, and loading (ETL) using tools such as Data Transformation Services (DTS) and SSIS
  • Good knowledge of Business Intelligence, OLAP, dimensional modeling, star and snowflake schemas, and the extraction, transformation, and loading (ETL) process
  • Strong working knowledge of Oracle, SQL, PL/SQL, Teradata, DB2, Flat Files, MS Access and Sybase
  • Experienced in analysis, requirements gathering, design, development, implementation, and management of full-lifecycle data warehouse projects
  • Creating ETL flows using Informatica PowerCenter 9.5.2
  • Utilized SQOOP for exporting/importing data to Hive tables and loading into Oracle databases, enabling effective data manipulation and report generation
  • Implemented frameworks with Spark to execute data transformations efficiently, improving processing speed and scalability
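
A minimal PL/SQL sketch of the stored-procedure work listed above, assuming hypothetical ORDERS and DAILY_SALES tables:

    CREATE OR REPLACE PROCEDURE refresh_daily_sales (p_run_date IN DATE) IS
    BEGIN
      -- Remove rows from any prior run for the same day (idempotent reload)
      DELETE FROM daily_sales
       WHERE sales_date = TRUNC(p_run_date);

      -- Aggregate the day's orders into the summary table
      INSERT INTO daily_sales (sales_date, customer_id, total_amount)
      SELECT TRUNC(o.order_date), o.customer_id, SUM(o.order_amount)
        FROM orders o
       WHERE o.order_date >= TRUNC(p_run_date)
         AND o.order_date <  TRUNC(p_run_date) + 1
       GROUP BY TRUNC(o.order_date), o.customer_id;

      COMMIT;
    END refresh_daily_sales;
    /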

Timeline

Informatica ETL / SQL Developer

Amadeus IT Group
12.2019 - 08.2022

SQL MSBI Developer

Amdocs
01.2017 - 12.2019

Masters - Computer Science

Lamar University

Bachelors - Information Technology

SRK Institute Of Technology