
Sravya V

Charlotte, NC

Summary

  • Over 12 years of IT experience in designing, implementing, developing, testing, and maintaining ETL components for building data warehouses and data marts across different domains
  • Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse
  • Well versed in Informatica PowerCenter 10.x/9.x/8.x/7.x Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet and Workflow Designer), Repository Manager, and the Admin Console
  • Extensively worked on various transformations such as Lookup, Joiner, Router, Rank, Sorter, Aggregator, and Expression
  • Exposure to data warehousing, data architecture, data modeling, data analysis, and SDLC methods
  • Good understanding of and experience with the Ralph Kimball and Bill Inmon methodologies, and with entity-relational and dimensional-relational table modeling using data modeling concepts such as star schema and snowflake schema modeling
  • Extensive experience with data extraction, transformation, and loading (ETL) from heterogeneous sources across multiple relational databases such as Oracle, Teradata, DB2, SQL Server, MySQL, and Sybase; worked on integrating data from flat files (fixed-width and delimited), CSV, and XML into a common reporting and analytical data model (DWH) using Informatica PowerCenter
  • Involved in troubleshooting data warehouse bottlenecks and in performance tuning: session and mapping tuning, session partitioning, and implementing pushdown optimization
  • Strong knowledge of reporting tools and experience working with reporting and business users
  • Experience with Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump, and TPT, and in creating BTEQ scripts
  • Experience with Teradata performance tuning using Explain plans and in proper usage of different Teradata table types and indexes (PI, SI, and JI)
  • Experience with Oracle utilities such as SQL*Loader; extensively used SQL and PL/SQL for development of procedures, functions, packages, and triggers
  • Experience using Informatica command-line utilities such as pmcmd to execute workflows in Unix; used versioning for code migration activities
  • Proficient in data warehousing techniques such as slowly changing dimensions, surrogate key assignment, normalization and de-normalization, cleansing, and performance optimization, along with change data capture (CDC)
  • Experience in unit testing and in working with the QA team on system testing
  • Good experience in UNIX shell scripting and Windows batch scripting for parsing files and automating batch ETL jobs
  • Involved in supporting data warehousing jobs as well as data issues for business users through a trouble-ticket management system
  • Versatile team player with excellent analytical, communication, and presentation skills
  • Experience with cloud databases and data warehouses (SQL Azure and Redshift/RDS)
  • Dynamic ETL developer practiced in helping companies with diverse transitions, including sensitive [Type] data and massive big data installations; promotes extensive simulation and testing to provide smooth ETL execution; known for providing quick, effective tools to automate and optimize database management tasks

Overview

12 years of professional experience

Work History

BI Developer

Charter Communications
10.2021 - Current
  • This project involved migrating data from source systems into XDW
  • The process included extracting data from source systems, applying transformations, and loading the data after query tuning and performance checks
  • Teradata SQL played a significant role in migrating the data to the data warehouse and achieving the expected gains
  • Performed data analysis and gathered column metadata of source systems to understand requirements
  • Worked on Teradata stored procedures and functions to conform the data and load it into target tables
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance
  • Worked with complex SQL queries to test the data generated by the ETL process against the target database
  • Worked on impact assessment in terms of schedule changes, dependency impact, code changes for various change requests on the existing data warehouse applications that are running in production environment
  • Worked on providing production fixes and proactively involved in fixing production support issues
  • Worked in a UNIX environment and executed TPT scripts from the UNIX platform
  • Worked on framework conversion
  • Worked on ETL and data validation using AWS Redshift

Environment: Teradata, AWS Redshift, Oracle, Unix.

ETL Informatica developer

CIGNA
09.2020 - 09.2021
  • Worked in Cigna's Global Applications Operations organization, which is responsible for maintaining all of Cigna's health data in the most efficient manner possible and for transforming that data into data marts for use by customers
  • Designed and developed Informatica ETL workflows and mappings; analyzed requirements and prepared technical specification documents
  • Developed ETL logic using Informatica workflows, scripting, and load utilities; developed and executed quality assurance and test scripts
  • Worked on ESP conversion project
  • Worked on converting Informatica workflows to ESP
  • Worked with business analysts to understand business requirements and use cases
  • Participated with project & delivery teams on sessions for new applications transitioning to production support
  • Researched and evaluated alternative solutions and recommended the most efficient application programming solution for continuous improvement opportunities
  • Problem solving and fixing technical issues
  • Registered and extracted Oracle CDC tables using the PowerExchange Navigator
  • Imported PowerExchange registration tables to implement CDC in Informatica
  • Created Database Objects like Tables and stored procedures in Azure Data warehouse
  • Created Cosmos scripts which pulls data from upstream structured streams and include business logic and transformations to meet the requirements
  • Environment: Informatica PowerCenter, PowerExchange, Oracle, ESP, Unix.

ETL Informatica Developer

Wellcare
06.2019 - 08.2020
  • Wellcare provides Medicare and Medicaid managed care health plans for 2.2 million members, partners with over 91,000 physicians, and employs over 3,500 associates
  • Wellcare Health Plans, Inc. is the holding company for several subsidiaries, including WellCare, Staywell, HealthEase, and Harmony
  • Worked with business users on requirements gathering and business analysis
  • Analyzed business requirements and prepared detailed design documents
  • Implemented an audit process for the HRA (Health Risk Assessment) subject area to ensure the data warehouse matched the source systems
  • Good experience in health insurance membership, provider, and claims subsystems
  • Experience in all aspects of the Software Development Life Cycle which includes Analysis, Design, Development, Testing, Implementation and Support with specific focus on development
  • Involved in loading data from UNIX file system to HDFS
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs
  • Load and transform large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts
  • Created DDL and DML for the Greenplum database
  • Developed scripts to load processed data from HDFS into Greenplum database
  • Loaded data into a Greenplum database instance using external tables, SQL copy and insert commands, and parallel load utilities
  • Worked on Greenplum features, benefits, and understanding architecture and how Greenplum supports redundancy and high availability
  • Designed and implemented table partitioning for handling very large tables
  • Improved query performance by following performance-enhancement tips and database best practices
  • Created AutoSys jobs for job automation
  • Experience in Unix scripting and PowerShell
  • Worked on the ETL side of the process to load data into the database from different servers
  • Environment: Informatica Power Center 10.1, Hive 0.9, HQL, SQL, SQL Server, Greenplum, Unix.

ETL Informatica Developer

Optum
07.2018 - 05.2019
  • As part of this project, the primary objective was to capture claims and claim details from multiple source systems and vendors
  • Extracted, transformed, and loaded data into the EDW using Informatica PowerCenter and generated various reports on a daily, weekly, monthly, and yearly basis
  • These reports give details about claims to the stakeholders
  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment
  • Created complex mappings that implemented business logic to load data into the staging area
  • Used Informatica reusability at various levels of development
  • Developed mappings/sessions using Informatica Power Center 10.2 for data loading
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union
  • Developed Workflows using task developer, worklet designer and workflow designer in workflow manager and monitored the results using workflow monitor
  • Built reports according to user requirements
  • Implemented slowly changing dimension methodology for accessing the full history of accounts
  • Optimized performance at the source, target, mapping, and session levels
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings
  • Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from sources like XML, Teradata, and Delimited Flat files
  • Worked on data extraction, transformation, and loading from XML files and large data volumes into the EDW using B2B Data Transformation
  • Environment: Informatica Power Center 10.1/9.6.1, Teradata, Flat Files, XML, SVN, JIRA, UNIX, Windows.

ETL Informatica Developer

Bank of West
10.2017 - 06.2018
  • (Wipro contractor) As part of this project, provided IT services to the bank in different models across multiple data warehouse, ETL, and reporting technologies
  • Worked closely with the ETL team, customers, business analysts, and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema, and develop ETL processes for the creation, maintenance, administration, and overall support of the data warehouse
  • Worked with business analysts and the DBA for requirements gathering, business analysis, and design of the data marts
  • Estimated and planned development work using Agile software development
  • Ensured performance metrics were met and tracked; developed processes to generate daily, weekly, and monthly data extracts and sent the data files to downstream applications
  • Designing and Developing ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems such as Oracle, Flat files, XML
  • Worked on slowly changing dimensions Type 1 and Type 2
  • Configured tasks and workflows using workflow manager
  • Worked extensively with Informatica Designer and Workflow Manager
  • Designed and developed various mappings and mapplets in mapping designer, sessions and workflows in workflow manager to extract data from various sources and load into Oracle database tables
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager
  • Helped in Performance Tuning for Oracle RDBMS using Bulk binding, Explain Plan, Parallelism while insert, select and creating required table indexes
  • Prepared Impact analysis, technical design, Run-Book documents for each assigned task
  • Supported integration testing and user acceptance testing, using HP QC for defect lifecycle management
  • Involved in supporting existing applications by resolving abends, suggesting code changes, and transferring knowledge to new team members
  • Responsible for creating plans for the code migration using SVN as a version control tool following security policy
  • Wrote UNIX shell scripts for FTP of files from remote servers, repository and folder backups, and sending exception emails with reports
  • Implemented a couple of health checks to make sure data loads were accurate
  • Involved in leveraging data for reporting purposes to support further business analysis; conducted internal and external reviews as well as formal walkthroughs among various teams and documented the proceedings
  • Involved in analyzing, documenting, and testing the Informatica upgrade from 9.6.1 to 10.1; identified poorly performing code and tuned it to complete jobs within the given SLA time
  • Coordinating with DBA in creating and managing tables, indexes, constraints, Table-spaces and data quality checks
  • Helped set up TEST and UAT environments to reproduce production issues and proposed solutions; involved in troubleshooting existing production issues, discussing feasible solutions with the team, and getting sign-off from the business for further development tasks
  • Explored new solutions for existing applications to improve data quality, performance, and customer experience
  • Programmed ad-hoc reports to obtain business approval and executed data warehouse testing effectively
  • Prepared mock data to satisfy all positive and negative business rules for unit and UAT testing
Environment: Informatica Power Center 10.1/9.6.1, Oracle 12c, Flat Files, XML, SVN, Toad, Unix, Windows.

ETL Informatica Developer

Texas Medicaid and Health Care Partnership
03.2015 - 09.2017
  • Texas Medicaid and Health Care Partnership is an innovative leader in the health and well-being industry, dedicated to making business decisions that reflect its commitment to improving the health and well-being of its members, associates, and communities
  • During this project I worked closely with the ETL team, customers, business analysts and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema and develop ETL processes for the creation, maintenance, administration and overall support of the data warehouse
  • Worked with business analysts and the DBA for requirements gathering, business analysis, and design of the data marts
  • Prepared technical specification documents for the development of Informatica extraction, transformation, and loading (ETL) mappings to load data into various data mart tables and defined ETL standards; estimated and planned development work using Agile software development
  • Good experience on Agile Methodology and the scrum process
  • Ensured performance metrics were met and tracked; developed processes to generate daily, weekly, and monthly data extracts and sent the data files to downstream applications
  • Involved in Data modeling review sessions - E/R diagrams, normalization and de-normalization as per business requirements
  • Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management and Data transportation
  • Wrote Teradata BTEQ scripts to move data from staging to base tables and used utilities such as MultiLoad, FastLoad, FastExport, TPump, and TPT
  • Experience in Performance tuning of Source SQL queries and Teradata Queries
  • Designed and developed the ETL (extract, transform, and load) strategy to populate the data warehouse from various source systems such as Oracle, Oracle GoldenGate, flat files, XML, and MySQL
  • Sourced transaction data staged by the GoldenGate CDC (Change Data Capture) tool through Informatica PowerCenter and loaded it into the target; worked on slowly changing dimensions Type 1 and Type 2
  • Configured tasks and workflows using workflow manager
  • Involved in the creation of Oracle Sql and PL/SQL stored procedures and functions
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data
  • Tuned the Sessions for better performance by eliminating various performance bottlenecks
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager
  • Wrote UNIX and Perl scripts to load data from sources to staging tables, create indirect file lists, and generate parameter files for respective paths
  • Used SVN for version control and Maestro for Job scheduling
  • Applied Agile methodology throughout the development life cycle of application
  • Environment: Informatica Power Center 9.6.1/9.5.1, Teradata 14/13, Teradata Tools and Utilities, Maestro, MySQL, UNIX, Putty, Perl, MS Visio.

ETL Informatica Developer

Anthem Inc
01.2014 - 02.2015
  • Anthem Inc. is an American health insurance company founded in the 1940s and known as WellPoint, Inc. prior to 2014; it is the largest for-profit managed health care company in the Blue Cross and Blue Shield Association
  • This project extracts Provider Directory and Provider Network File data from the Enterprise Reporting group to the PDX group, creates new extracts, standardizes Provider Directory extracts and Provider Network File data onto a single platform (PDX) for all Medicaid markets, and aligns responsibility for maintaining them with the team supporting PDX
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (SQL Server and flat files), incorporating business requirement rules
  • Using Informatica PowerCenter 9.5.1, extracted data from flat files and SQL Server to build the data source and applied business requirement logic to load the Facets, vendor, and pharmacy data into data warehouse and data mart tables
  • Designed and developed Mappings using different transformations such as Source Qualifier, Expression, Aggregator, Filter, Joiner, and Lookup to load data from source to target tables
  • Created Stored Procedures to handle the selection criteria such as Address, Provider, Specialty, Chapters and Credentialing and to load the data for the Extract and Exclusion reports based on the business requirements
  • Created Stored Procedures to load the crosswalks data for Medicare Provider and Printed Directories to SQL Staging tables
  • Created variables and parameter files for mappings and sessions so they could be migrated easily across environments and databases
  • Worked with reusable sessions, decision tasks, control tasks, and email tasks for on-success/on-failure mails
  • Used Teradata utilities like Fast load, Fast Export, Multi Load, TPUMP & TPT and experience in creating BTEQ scripts
  • Involved in Teradata performance tuning using Explain plans
  • Involved in proper usage of Different Teradata Tables, Indexes (PI, SI & JI)
  • Migration of ETL code from Development to QA and from QA to Production environments
  • Involved in working with business analyst in resolving the issues related with the migration of designs from Dev Server to Prod Server
  • Performed unit testing by writing simple test scripts in the database and was involved in integration testing
  • Created and deployed SSRS reports for provider directory, health plan, and vendor data, generating daily, weekly, monthly, and quarterly reports
  • Created various batch Scripts for scheduling various data cleansing scripts and loading process
  • Created subscriptions and generated subscription IDs for the Extract and Exclusion reports from SSRS
  • Created Tidal jobs to schedule the Informatica workflows using the Tidal scheduler
  • Extensively worked on creating technical design documentation
  • Environment: Informatica Power Center 9.5, SQL Server 2012, Oracle 11g, Flat Files, Teradata 12, SSRS, Microsoft Visual Studio 2012, Tidal, JIRA, Microsoft Visio, SharePoint, Facets, Tortoise SVN.

ETL Developer

ICICI Bank
07.2012 - 12.2012
  • Description: ICICI is one of the leading banks in India
  • The project brings together the bank's different product systems
  • It provides various levels of aggregation and summarization of the bank's products
  • Systems include the following products and related information: Customer Information, General Ledger, Current Accounts, Credit Card, Loans, Trade Finance, Treasury, Collaterals, Facilities, NPL and MFA/MRA
  • Translated business requirements into the data warehouse design; involved in understanding logical and physical enterprise data warehouse schemas
  • Used Star Schema approach for designing of Data Warehouse
  • Integrated different systems using Informatica server configuration
  • Extracted data from Flat files, Oracle, SQL Server, MS-Access and Loaded data into Oracle using Informatica
  • Created Source definitions, Target definitions, Mappings, Transformations, Reusable Transformations, Mapplet using Informatica Designer tool which includes Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer
  • Created sessions, database connections and batches, Scheduled and monitored transformation processes using Informatica Server Manager
  • Used Informatica repository manager to backup and migrate metadata in development, test and production systems
  • Environment: Informatica Power Center 7.1.1/8.1.1, Cognos Impromptu 6.0, Impromptu web reports (IWR 6.0), UNIX, PL/SQL, Oracle 8.0/8i and Win NT.

Education

Bachelor of Technology

JNTU (Jawaharlal Nehru Technical University)

MS EE

SUNY at New Paltz

Skills

  • Windows
  • Unix
  • Linux
  • MS-DOS
  • Quality Center
  • Mercury Test Director
  • AWS Redshift
  • Teradata 15/14/13/12
  • Oracle 8.x/9i/10g/11g/12c
  • DB2
  • MS SQL Server 2000
  • Sybase
  • MySQL
  • Teradata Studio
  • SQL Assistant
  • SQL Developer
  • Toad
  • Putty
  • WinSCP
  • SQL Plus
  • SQL Loader
  • Netezza
  • Greenplum
  • Hive 0.9
  • Informatica Power Center 10.1.0/9.6.1/9.5.1/9.1/8.6/8.5/8.1/7.1
  • ETL development
  • Business intelligence tools
  • Data Modeling
  • Data Warehousing
