
Ellari Kota

Charlotte, NC

Summary

  • Over 14 years of experience designing, developing, and maintaining large business applications, including data migration, integration, conversion, data warehousing, and testing
  • Expert in all phases of the software development life cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, and post-implementation support and maintenance
  • Experience working in Agile environments and leading sprint meetings for multiple projects using Jira and Kanban boards
  • Hands-on experience with data warehousing concepts: Ralph Kimball and Bill Inmon methodologies, star and snowflake schemas, fact tables, and dimension tables
  • Skilled in business requirements review, assessment, and gap identification; defining business processes; and delivering project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation
  • Design and develop ad-hoc reports using SQL scripts; perform data analysis with SQL tools and IBM Visual Explain
  • Expertise in tuning databases, database queries, and ETL processes
  • Experienced in loading large volumes of data from different databases and migrating them into various targets per client requirements
  • Experience with Ab Initio Express>IT with BRE, Metadata Hub, and Data Quality Environment (DQE)
  • Experience in data warehousing, ETL architecture, and data profiling
  • Involved in projects covering data modeling, system/data analysis, design, and development for both OLTP and data warehousing environments
  • Experience as an Informatica MDM developer and on production support teams
  • Experience with Informatica PowerCenter 9.1/8.6/8.5 and Informatica MDM client and server tools
  • Experienced in estimation, planning, risk management, finalization of technical and functional specifications, communication management, and quality management
  • Expertise in tuning the performance of Informatica mappings and sessions and identifying performance bottlenecks
  • Experience creating pre-session and post-session scripts to ensure timely, accurate processing and balanced job runs
  • Experience integrating data sources such as SQL Server, Oracle, flat files, and DB2 mainframes
  • Expert in troubleshooting, debugging, and improving performance at the database, workflow, and mapping levels
  • Thorough knowledge of OLAP variants: DOLAP, MOLAP, ROLAP, and HOLAP
  • Experienced in writing unit test cases for complex scenarios
  • Domain knowledge spanning business financial systems, banking, investments, healthcare IT, insurance, pharmacy claims systems, and hospitality
  • Established and maintained comprehensive data model documentation, including detailed descriptions of business entities, attributes, and data relationships
  • Designed and developed ETL methodology supporting data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter
  • Experienced in monitoring database performance, troubleshooting issues, and optimizing database environments
  • Goal-oriented team player with strong analytical, problem-solving, communication, interpersonal, and leadership skills; equally confident working independently or collaboratively

Overview

16 years of professional experience

Work History

Sr Data Integration Analyst

Marriott Vacations Worldwide
03.2023 - 10.2023
  • Led a data migration initiative from Oracle to the cloud (Azure)
  • Performed data analysis on the existing time-sharing dataset to understand data lineage across various sources
  • Designed tables for various data models using the Erwin data modeling tool to enhance data sources
  • Developed a conceptual model in Erwin based on requirements analysis
  • Developed normalized logical and physical database models to design an OLTP system for e-commerce applications
  • Designed E-R diagrams to help business stakeholders and IT development teams visualize current- and future-state architecture
  • Created star and snowflake schema diagrams to support the new data architecture serving legacy time-sharing sources
  • Performed data normalization on logical data models to handle data duplication and anomalies
  • Developed testing scripts for ~90 tables identified as part of the time-sharing sources
  • Led the testing initiative with various teams and application owners by tracking defects and enhancement requests in the HP ALM tool.

Sr Data Analyst

Fidelity & Guaranty Life Insurance Company
03.2021 - 03.2023
  • Performed extensive analysis to understand the current state of the Investment Account application, which hosts investment, hedging, and annuities data
  • Performed in-depth analysis of the existing SSKC source system, which was being replaced by a new source system, Clearwater
  • Developed ETL mappings in Informatica using transformations such as Aggregator, Expression, Joiner, Filter, Source Qualifier, XML, Union, connected/unconnected Lookup, Sequence Generator, Update Strategy, Stored Procedure, Java, Router, and HTTP to read data from source files and load it into the IT foundation layer for downstream applications
  • Performed intensive data mapping and created various artifacts supporting the source system replacement project
  • Designed and conducted complex analyses to identify and remediate data redundancy, quality, and integrity issues
  • Designed the future-state system architecture data flow by cataloging all data points and source systems, including third-party vendor systems
  • Worked on data enrichment processes, customer analytics, and reporting, and drafted recommendations for data hosting, consumption, analytics, and reporting
  • Designed and developed a data hosting, enrichment, and transformation process for a scalable analytics and reporting solution
  • Developed recommendations for optimal approaches to govern, manage, and secure data.

Sr. Data Management Consultant

Wells Fargo Bank NA
12.2018 - 02.2021
  • Led initiatives to ensure data quality was maintained within the application
  • Consulted with source partners to assess the current state of data and metadata quality
  • Designed and conducted complex analyses to identify and remediate data quality and integrity issues
  • Developed and implemented technical logic applying business quality controls when onboarding new data sources
  • Developed and implemented data quality rules using the Ab Initio Express>IT tool
  • Identified missing data quality metrics and executed data quality audits to benchmark the state of data quality
  • Led or assisted in developing and implementing plans for assessing the quality of new data sources
  • Performed intensive data analysis using SQL to ensure incoming and outgoing data met enterprise quality standards
  • Responsible for understanding and analyzing source data by connecting to the Hadoop cluster on EDL 1.5/2.0 using ADS 18.5.

Sr. Data Analyst

Wells Fargo Corporate Finance
07.2016 - 12.2018
  • Assessed the impacts of various applications before onboarding
  • Performed data analysis and created technical documents for both sourcing and downstream systems of record (SOR) for regulatory reporting
  • Responsible for understanding SOR requests and evaluating technical impacts
  • Created functional specification documents based on technical impacts introduced by applications during the onboarding process
  • Worked in an Oracle 11g environment and generated reports through OBIEE
  • Created impact analysis assessments by performing data analysis on existing customer data
  • Responsible for creating logical and physical data models using the CA Erwin Data Modeler tool
  • Performed end-to-end data mappings for newly onboarded source systems
  • Served as the single point of contact between the Line of Business (LOB) and the technology team during integration with the RDR application
  • Generated ad hoc reports based on user requests using OBIEE
  • Responsible for handling requests from various partners and granting them the database access required for testing
  • Worked closely with BAs and system partners in JAD sessions to gather requirements and translate them into IT logic
  • Responsible for code reviews and bug fixes during various project phases, and for creating flow diagrams and designs per user requirements
  • Responsible for performing data validation and analysis by reading data in OFSAA (FSDF framework) as part of the FDR project
  • Involved in validating regulatory reporting requirements and schedules (FR Y-14M, FR Y-14Q, FFIEC).

Data Analyst

Wells Fargo Wholesale Technology Services
01.2015 - 06.2016
  • Assessed the impacts of various applications as part of a tech refresh project
  • Responsible for understanding upstream/downstream partner requests and evaluating technical impacts
  • Created technical impact assessment documents based on changes introduced by applications
  • Worked in an MS SQL Server 2012 database environment and generated SQL reports through SSRS
  • Created ETL mappings using Informatica 9.5 and applied data transformation logic per business requirements
  • Created impact analysis assessments by performing data analysis on existing customer data
  • Responsible for handling requests from various partners and granting them the database access required for testing
  • Responsible for validating data and identifying gaps within the system
  • Responsible for data loads during deployments and alerting partners in case of failures
  • Worked closely with BAs and system partners in JAD sessions to gather requirements and translate them into IT logic
  • Responsible for testing web services in lower environments using SOAP UI 4.6 before promoting changes to the production environment
  • Responsible for code reviews and bug fixes during various project phases, and for creating flow diagrams and designs per user requirements
  • Extensively used normal, full outer, detail outer, and master outer joins in the Joiner transformation
  • Extensively worked with lookup caches (shared, persistent, static, and dynamic) to improve the performance of lookup transformations
  • Responsible for unit testing of mappings and workflows
  • Developed slowly changing dimension mappings for Type 1 and Type 2 SCDs
  • Responsible for implementing incremental-load mappings using mapping variables and parameter files
  • Responsible for understanding and analyzing source data by connecting to the Hadoop ecosystem using ADS.

System Analyst

OptumRx
03.2012 - 01.2015
  • Worked on Rx Claims systems and ran ad hoc reports per client requests
  • Worked in a SQL Server 2008 R2 database environment performing various database activities
  • Responsible for performing DDL/DML activities and providing reports per business requirements
  • Responsible for requirements gathering and analysis for various projects within the organization
  • Served as project planner for assigned small and medium-scale projects
  • Responsible end to end for infrastructure transition/migration from one system to another
  • Responsible for technical and functional support for the project
  • Converted functional requirements to technical requirements with the help of architects, and led various modules in the project
  • Responsible for code reviews and bug fixes during various project phases, and for creating flow diagrams and designs per user requirements
  • Contributed as a developer and mentored the offshore team in development activities
  • Played a key role in estimating modules at different stages and assessing project risks
  • Played a vital role in impact analysis during the code merge for scalability as part of Fresh Start
  • Used SSIS as the ETL tool to build packages, with SSRS reporting on top of those packages
  • Involved in query optimization and database performance tuning of slow queries using SQL explain plans (DB2 iSeries Navigator tool)
  • Worked with the managed services team to run those packages in different environments
  • Implemented join conditions as required, using inner, left, right, and full outer joins in queries to produce the needed outputs
  • Used Informatica 9.1 for ETL development in a 24/7 environment in coordination with the offshore team
  • Extensively worked with lookup caches (shared, persistent, static, and dynamic) to improve the performance of lookup transformations
  • Responsible for unit testing of mappings and workflows
  • Developed slowly changing dimension mappings for Type 1 and Type 2 SCDs
  • Responsible for implementing incremental-load mappings using mapping variables and parameter files
  • Created pre-SQL and post-SQL scripts to be run at the Informatica level
  • Identified bugs in existing mappings by analyzing data flow and evaluating transformations with the Debugger
  • Extensively worked with both connected and unconnected lookups.

Informatica Developer

Cigna Health Insurance
01.2012 - 03.2012
  • Worked closely with 835 inbound/outbound, 277CA, and 837 inbound/outbound transactions
  • Worked rigorously in a 24/7 environment, recoding and debugging existing mappings per business requirements
  • Worked mainly in PowerCenter 9.1.0, developing code, designing workflows per the data model, and unit testing before promoting code to SYT
  • Worked in an Oracle database environment, using developer tools such as TOAD to write queries and modify existing Source Qualifiers to balance the load between the database and the Informatica code
  • Cleared existing defects raised in production and developed code as needed per the logic
  • Used nearly all transformation types in this project, including Expression, Router, Filter, Aggregator, Sorter, Lookup, and Normalizer, and used XML parsers when required
  • Worked closely with BAs and business owners on requirements gathering, and debugged or modified code per the logic
  • Worked with the performance tuning team to reduce the run time of workflows that exceeded their time limits
  • Used all join types as required and created tables, indexes, and views under DBA guidance to shorten the timeline
  • Created parameters and variables at the mapping and workflow levels, and wrote or modified stored procedures
  • Worked closely with architects and team leads in redesigning the existing architecture
  • Provided Tier 2 production support for issues related to job scheduling, failures, and session runs
  • Worked closely with the data cleansing group to identify bad data in source and target columns.

Informatica Developer

State of Tennessee
11.2011 - 01.2012
  • Worked with various legacy systems such as COTS, VRTS, ACCENT, TENNCARE, and TCCMS as part of conversion projects from mainframes to Oracle
  • Developed mappings, workflows, and tasks using Informatica PowerCenter 9.1.0
  • Identified, troubleshot, and corrected bugs and bottlenecks
  • Wrote complex stored procedures used day to day per business requirements
  • Worked closely with the conversion team on implementing maps and workflows per the BRD
  • Extensively modified existing code per requirements, applying various transformations in mappings per the logic
  • Worked closely with the project's BAs, QAs, and DAs through the analysis, discovery, design, construction, and testing steps to deliver code to the functional group
  • Responsible for performance tuning at the mapping, session, source, and target levels
  • Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters
  • Played a vital role in the Data Factory group by resolving tickets raised by different users and customers
  • Responsible for deploying code and migrating it through the UAT/SIT environments using Informatica
  • Responsible for identifying and fixing bottlenecks through performance tuning
  • Wrote various procedures, created indexes and primary keys, and performed database testing
  • Created pre-SQL and post-SQL scripts to be run at the Informatica level
  • Extensively worked with both connected and unconnected lookups.

Informatica Developer

Quintiles Inc.
08.2011 - 11.2011
  • Worked as an Informatica MDM developer across various legacy systems
  • Worked with heterogeneous source systems such as INNTRAX, QLIMS, TRIO, CDW, INFORM, PeopleSoft, Salesforce, ECG, and CTMS
  • Worked across multiple environments, developing various mappings for new products
  • Worked as a production support analyst in a 24/7 environment, providing solutions within the required time frame
  • Worked extensively with an Oracle database using TOAD and created many complex queries for day-to-day job loads
  • Created complex Type 1 and Type 2 SCD mappings for jobs, most of which were truncate-and-load runs on a weekly basis
  • Responsible for Informatica PowerCenter performance tuning at the target, source, mapping, session, and system levels
  • Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters
  • Played a vital role in the Data Factory group by resolving tickets raised by different users and customers
  • Responsible for deploying code and migrating it through the UAT/SIT environments using Informatica
  • Responsible for identifying and fixing bottlenecks through performance tuning
  • Wrote various procedures, created indexes and primary keys, and performed database testing
  • Created pre-SQL and post-SQL scripts to be run at the Informatica level
  • Extensively worked with both connected and unconnected lookups.

Informatica Developer

Microsoft Corporation
01.2011 - 02.2011
  • Extracted raw files from SQL Server and loaded them into various targets
  • Extensively worked on creating mappings, variables, mapplets, and parameters used across workflows for reusability
  • Worked with SQL Server 2008 R2 as the source and target database for some requirements
  • Worked with various active and passive Informatica PowerCenter transformations, including Filter, Aggregator, Joiner, Rank, Router, Sorter, Source Qualifier, Update Strategy, Expression, and Sequence Generator
  • Extensively used SQL Server views as reference tables in many mappings
  • Developed Type 1 and Type 2 slowly changing dimensions for complex scenarios
  • Worked on performance tuning at the mapping, session, source, and target levels
  • Extensively worked with lookup caches (shared, persistent, static, and dynamic) to improve the performance of lookup transformations
  • Extensively worked with aggregate functions such as AVG, MIN, MAX, FIRST, and LAST in the Aggregator transformation
  • Extensively used SQL overrides in the Source Qualifier transformation
  • Extensively used normal, full outer, detail outer, and master outer joins in the Joiner transformation
  • Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters
  • Worked on post-production support activities after deployment.

Programmer Analyst

Tech Mahindra
06.2007 - 07.2008
  • Designed blueprints of navigational maps for cities and states using Microsoft Visio
  • Developed UML diagrams, including use case, class, and sequence diagrams, using Visual Studio
  • Designed and developed road maps for counties and cities in the United States using HTML and CSS
  • Performed regression and system-level testing to verify software quality prior to release
  • Identified root causes of application issues and determined which modifications were needed
  • Worked closely with clients to establish specifications and system designs
  • Authored code fixes and enhancements for inclusion in future code releases and patches

Education

Master of Science - Computer Engineering

International Technological University
San Jose, CA
08.2010

Bachelor of Science - Instrumentation & Control Engineering

Jawaharlal Nehru Technological University
Hyderabad, India
06.2007

Skills

  • Data Analysis
  • Data Integrity Assurance
  • Data Modeling
  • Data Research and Validation
  • Data Processing
  • SQL and Databases
  • Issue Identification
  • System Analysis
  • Root Cause Analysis
  • Documentation and Reporting
  • Project Management
  • Process Improvements
