Sreekanth Palla

San Antonio, TX

Summary

  • Software Development Life Cycle (SDLC) Expertise: Over 12 years of proven experience in the complete SDLC, adept at system requirements gathering, architecture, design, coding, development, and testing.
  • ETL Methodology and Data Migration: Demonstrated proficiency in developing and designing ETL methodologies, with approximately 12 years of hands-on experience in data migration. Skilled in using Datastage and Informatica Power Center for data transformations and processing.
  • Cloud Data Management: Around 3 years of successful work experience with Snowflake Database on AWS, specializing in extraction, transformation, and loading of data.
  • Actuarial Application Development: Possessing around 5 years of expertise in end-to-end design, development, and enhancement of Actuarial Applications, focusing on modeling and experience studies.
  • OFSAA Data Model: Over 8 years of practical experience working with the OFSAA Data Model, ensuring efficient data warehousing and reporting capabilities.
  • Agile and Scrum Development: Proficient in Agile, Scrum, and Waterfall methodologies to drive efficient and collaborative development practices.
  • Data Warehousing and Modeling: Strong expertise in designing and developing Data Marts, Data Modeling, and Data Warehouses using multi-dimensional models such as Snowflake Schema and Star Schema. Well-versed in OLTP, OLAP systems, FACT, and Dimensional tables.
  • Database Expertise: Skilled in analyzing database design and working with Oracle 11g, Netezza, SQL Server 2008, PL/SQL, Teradata, and DB2 databases. Strong database programming abilities, including stored procedures, triggers, functions, and packages.
  • Python: Proficient in Python programming, with a strong understanding of data structures, algorithms, and object-oriented principles.
  • UNIX Shell Scripting: Proficient in UNIX shell scripting, automating daily tasks such as file copying, archive/migration, formatting, and FTP/SFTP processes.
  • Performance Tuning and Troubleshooting: Adept at analyzing and implementing performance tuning techniques, error handling, and debugging mappings for efficient ETL development.
  • Data Profiling and Analysis: Proficient in data profiling and analysis using Microsoft Excel and SQL to extract meaningful insights from source data.
  • Data Quality and Reporting: Collaborated with Business users to define data quality rules and designed ETL processes to run data quality checks after batch runs, reporting exceptions in OBIEE reports.
  • Version Control and Code Promotion: Skilled in working with version control tools like StarTeam, GitHub, and Serena to efficiently manage and promote code to higher environments.
  • Business Intelligence Reporting: Knowledgeable in using business intelligence reporting tools like OBIEE 12c and Tableau Desktop 10.x to visualize and communicate data insights.

Dynamic ETL Developer experienced in guiding companies through diverse data transitions, including sensitive data and massive big data installations. Promotes extensive simulation and testing to ensure smooth ETL execution. Known for providing quick, effective tools to automate and optimize database management tasks.

Overview

12 years of professional experience

Work History

Lead ETL Consultant

USAA
01.2020 - Current

The United Services Automobile Association (USAA) is a San Antonio-based Fortune 500 diversified financial services group offering banking and insurance to people and families who serve, or have served, in the United States Armed Forces.

The Enterprise Management Ledger Financial Reporting (EMLFR) application serves as the CFO's single source of truth for General Ledger actuals generated by OFSAA and forecasts published by Hyperion Essbase Planning for various lines of business.

  • Played a key role within an Agile Scrum team, actively engaging in Daily Stand-ups, Iteration Planning, Sprint Planning, Backlog Refinement, and Retrospectives, enhancing project coordination and efficiency.
  • Collaborated closely with the Business team to ensure effective requirement gathering and resolution of production-related data issues, fostering seamless communication and alignment.
  • Pioneered the generation of ServiceNow events for Control-M job failures by leveraging Python scripting for XML production, subsequently loading these XMLs into ServiceNow via DataStage's web-service integration (see the first sketch after this list).
  • Executed pivotal tasks within the Oracle-to-Snowflake migration project, crafting Snowflake Tasks to automate the Oracle-to-Snowflake data load through efficient COPY commands, ensuring a smooth and accurate transition.
  • Leveraged Streams in Snowflake to implement SCD Type 2 strategies, maintaining data integrity and historical tracking (see the second sketch after this list).
  • Implemented Snowflake's virtual warehouse scale-up and scale-out approaches to improve ETL job performance while maintaining cost-effectiveness.
  • Applied Time Travel and Fail-safe concepts within Snowflake, contributing to the system's robustness and reliability.
  • Contributed significantly to the DataStage upgrade from 11.5 to 11.7, ensuring a seamless transition and enhanced functionality.
  • Migrated source systems from PeopleSoft to Data Foundation while retaining vital business logic, ensuring continuity and accuracy.
  • Orchestrated the deployment of applications to higher environments using GitHub and UCD, streamlining the development process.
  • Designed, developed, and monitored ETL loads through Control-M, optimizing performance and ensuring timely data processing.
  • Enhanced reconciliation processes between FIH and EMLFR across all lines of business, refining timing and accuracy for daily loads.
  • Innovated the suspension process, seamlessly integrating reprocessing of rejects into daily batches, generating ServiceNow events for rejected records, and reporting them to the business.
  • Crafted comprehensive test suites for ETL jobs, meticulously comparing actual outcomes with expected results to ensure data integrity and system robustness.
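
For illustration, a minimal Python sketch of the ServiceNow event XML generation described above; the event fields and drop-off path are assumptions, not the production schema:

```python
# Sketch: build a ServiceNow-style event XML for a failed Control-M job.
# Field names and the drop-off path are illustrative assumptions; the
# production schema consumed by the DataStage web-service load differed.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_event_xml(job_name: str, error_text: str) -> bytes:
    event = ET.Element("event")
    ET.SubElement(event, "source").text = "ControlM"
    ET.SubElement(event, "node").text = job_name
    ET.SubElement(event, "severity").text = "critical"
    ET.SubElement(event, "description").text = error_text
    ET.SubElement(event, "time_of_event").text = datetime.now(timezone.utc).isoformat()
    return ET.tostring(event, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    xml_bytes = build_event_xml("EMLFR_DAILY_LOAD", "Job ended not-OK: return code 12")
    with open("/tmp/snow_event.xml", "wb") as f:  # hypothetical drop-off location
        f.write(xml_bytes)
```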
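And a minimal sketch of the Snowflake Stream-plus-Task pattern used for the SCD Type 2 loads, driven from Python via the Snowflake connector; all object names, the schedule, and the credentials are hypothetical placeholders:

```python
# Sketch of a Snowflake Stream + scheduled Task applying SCD Type 2 logic.
# Object names are hypothetical; credentials came from a secrets store.
import snowflake.connector

STATEMENTS = [
    # Stream captures row changes on the staging table populated by COPY INTO.
    "CREATE STREAM IF NOT EXISTS STG_GL_ACTUALS_STRM ON TABLE STG_GL_ACTUALS",
    # Task wakes on a schedule and applies simplified SCD Type 2 logic:
    # close the current dimension row when the source hash changed, and
    # insert brand-new accounts. (A production SCD2 would also re-insert
    # the new version of changed rows, e.g. by unioning the source twice.)
    """
    CREATE OR REPLACE TASK LOAD_GL_ACTUALS_SCD2
      WAREHOUSE = ETL_WH
      SCHEDULE = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('STG_GL_ACTUALS_STRM')
    AS
    MERGE INTO DIM_GL_ACCOUNT tgt
    USING STG_GL_ACTUALS_STRM src
      ON tgt.ACCOUNT_ID = src.ACCOUNT_ID AND tgt.CURRENT_FLAG = 'Y'
    WHEN MATCHED AND tgt.ACCOUNT_HASH <> src.ACCOUNT_HASH THEN UPDATE
      SET tgt.CURRENT_FLAG = 'N', tgt.END_DATE = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN INSERT (ACCOUNT_ID, ACCOUNT_HASH, CURRENT_FLAG, BEGIN_DATE)
      VALUES (src.ACCOUNT_ID, src.ACCOUNT_HASH, 'Y', CURRENT_TIMESTAMP())
    """,
    "ALTER TASK LOAD_GL_ACTUALS_SCD2 RESUME",  # tasks are created suspended
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="EMLFR_DB", schema="CORE",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```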

Technical Environment: DataStage 11.7/11.5/8.5, Informatica Power Center 9.6, Snowflake, AWS, Oracle, Netezza, Control-M, WinSCP, qTest, Jira, Git, Putty, Unix, Teradata, Slack, Microsoft Visual Studio

Lead ETL Consultant

Whataburger
10.2018 - 01.2020

Whataburger is an American privately held, regional fast-food restaurant chain, with more than 670 stores in Texas and over 150 across New Mexico, Arizona, and the southern United States.

As part of the EIS-ETL team, I worked on multiple projects: enhancing the data warehouse, extracting data from the DW and sending it to an external vendor (Sale Guard) via SFTP, building a data mart per business requirements, and extracting survey data from an external source, then transforming and loading it into the data warehouse.

  • Designed and developed a data model for survey data on employee training at restaurants, enabling comprehensive analysis and insights.
  • Implemented ETL jobs for text analysis, extracting key insights and scores from surveys based on specific keywords, improving the evaluation process.
  • Extracted data from the data warehouse to generate daily operational files, securely sending them to the external vendor (Sale Guard) via SFTP for fraud analysis.
  • Converted SQL stored procedures into efficient ETL processes, optimizing data flow and performance.
  • Developed ETL jobs to extract Facebook JSON data using the REST API, load it into Netezza, and subsequently transport it into the data warehouse (see the sketch after this list).
  • Utilized Sqoop for smooth data migration from Netezza to Hive/HDFS, effectively managing large datasets for analysis.
  • Integrated Hadoop into traditional ETL pipelines, accelerating data processing for both structured and unstructured data sources.
  • Employed HDFS for inbound/outbound file storage by creating an NFS Gateway, enabling efficient data exchange.
  • Implemented an automated process to extract school dining sales data from the data warehouse and load it into the Azure database for further analysis.
  • Developed Windows batch scripts for each ETL process to trigger smoothly from the ActiveBatch scheduler, ensuring timely execution.
  • Developed Unix scripts for secure SFTP file transfers and file processing, enhancing data security and reliability.
  • Developed ETL jobs for API calls, efficiently pulling JSON data from external sources.
  • Managed ETL tasks and tracked progress using Azure DevOps, ensuring seamless project management and collaboration.
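
For illustration, a minimal Python sketch of the REST-API JSON extraction pattern; the Graph API endpoint, token handling, and staging layout are assumptions (the production jobs ran in DataStage):

```python
# Sketch of the REST API -> staging-table pattern used for the Facebook
# JSON feed. The endpoint, token, and column mapping are illustrative.
import json
import requests

GRAPH_URL = "https://graph.facebook.com/v12.0/me/posts"  # hypothetical endpoint

def fetch_posts(token: str):
    """Follow the Graph API's cursor-based paging and yield raw records."""
    params = {"access_token": token, "limit": 100}
    url = GRAPH_URL
    while url:
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload.get("data", [])
        url = payload.get("paging", {}).get("next")  # absolute URL on later pages
        params = {}  # the "next" URL already carries the query string

def to_staging_rows(records):
    """Flatten each JSON record into an (id, created_time, raw_json) tuple
    ready for a bulk insert into the Netezza staging table."""
    return [(r.get("id"), r.get("created_time"), json.dumps(r)) for r in records]
```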

Technical Environment: DataStage 11.5/11.7/8.5, Informatica Power Center 9.6, IBM BigInsights, Netezza, ActiveBatch, WinSCP, SQL Server, Aginity Workbench, TOAD, Jira, Git, Putty, Unix, HDFS, Big Data, Azure DevOps

Lead ETL Consultant

TIAA-CREF
09.2016 - 09.2018

The FA-Modeling project replaces an existing legacy system responsible for generating model point files (MPFs) for the Milliman Model and automates MPF generation. This involves integrating assets and liabilities data from multiple source systems, such as IRW and FSDF, into the target data warehouse (M-Star). The generated models are used for valuation, asset-liability management, business planning, risk analysis, and other purposes. A model is a simplified representation of a group of assets and liabilities, used to estimate the future financial performance of a company.

  • Worked throughout the entire life cycle of data warehousing projects, from requirements gathering to code deployment and document preparation at various stages.
  • Analyzed business requirements and incorporated key metrics/counts into the facts for data marts.
  • Built data marts for each subject area, including Pension DA and IA, Assets, and Life Insurance.
  • Collaborated with business users and stakeholders to determine expected business reporting requirements and ensure the data warehouse met all of them.
  • Designed and implemented the DEI component, which communicates with the Milliman utility to parse input XML requests and load the appropriate files into the Milliman model; model results are copied back to the landing zone via SFTP after processing.
  • Developed complex dimensions (SCD Type 1 and Type 2) and fact tables, writing intricate queries to integrate data from upstream systems.
  • Designed and developed end-to-end scheduling of ETL jobs for the entire process/project in Autosys.
  • Created Unix shell scripts for FTP and SFTP of MPFs to the Milliman utility and set up JIL jobs to run them from Autosys (see the sketch after this list).
  • Performed data profiling and analysis using SQL and Microsoft Excel.
  • Implemented and monitored monthly data quality checks and conducted research on bad data and system failures.
  • Reviewed test summary plans and test cases with the QA team and provided valuable inputs.
  • Analyzed and resolved production incidents per SLAs.
  • Managed both onsite and offsite resources and tracked development status in JIRA.
  • Uploaded ETL code to the Git repository and facilitated migration through Serena to higher environments.
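
For illustration, a minimal Python (paramiko) sketch of the SFTP hand-off to the Milliman utility; in the project this was done with Unix shell scripts scheduled via Autosys, and the host, paths, and credentials here are placeholders:

```python
# Sketch: upload a model point file (MPF) to the Milliman gateway over SFTP.
# Host, user, key path, and directories are placeholder assumptions.
import paramiko

def sftp_put_mpf(local_path: str, remote_path: str) -> None:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # demo only; pin keys in production
    ssh.connect("milliman-gateway.example.com", username="etl",
                key_filename="/home/etl/.ssh/id_rsa")
    try:
        sftp = ssh.open_sftp()
        sftp.put(local_path, remote_path)  # upload the MPF
        sftp.close()
    finally:
        ssh.close()

if __name__ == "__main__":
    sftp_put_mpf("/data/mpf/pension_da.mpf", "/inbound/pension_da.mpf")
```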


Technical Environment: DataStage 11.5, Oracle 11g, Teradata, Autosys, WinSCP, SQL Developer, TOAD, Jira, Git, Serena, Putty, HP ALM, Teradata SQL Assistant, OBIEE, Unix, Big Data

Sr Datastage Developer

TIAA-CREF
01.2014 - 09.2016

The main objective of the Finance and Actuarial Experience Study project is to collect actual participant transaction data in an automated and organized manner, so that transaction frequency and magnitude can be compiled, categorized across the behavior spectrum, and summarized for further study. The study involves understanding participant behavior from pension investments through annuitization.


Provided data to support BI reports (roll-forward reports, mortality cash and count reports, and fund transfer reports) for each subject area: Pension, Immediate Annuity, Life Insurance, ATA, Reserves, and Re-insurance.


  • Conducted Requirement Gathering sessions with Business Users to understand their needs and ensure SLA compliance.
  • Designed and developed a logical data model, incorporating conformed dimensions and fact tables to integrate multiple subject areas like Pension, Immediate Annuity, ATA, Re-insurance, and Life Insurance.
  • Established a Centralized Staging Repository (FSDF) to efficiently stage data extracted from multiple record-keeping systems before processing.
  • Collaborated with the OBIEE team during design sessions to ensure the data warehouse fulfilled all business requirements.
  • Implemented Oracle compression techniques to optimize storage and enhance the performance of ETL processes and OBIEE reports.
  • Designed Flattened tables on top of Summary tables to fine-tune OBIEE Reports, ensuring efficiency and ease of analysis.
  • Developed PL/SQL scripts to build historical indicative data using snapshots provided from the source.
  • Extensively implemented SCD Type 2 for historical data using SQL and Datastage, providing comprehensive historical insights.
  • Optimized ETL jobs and database design to handle substantial historical loads using Bulk load techniques.
  • Created Unix shell scripts for file movement, formatting, and count validations between control and data files (see the sketch after this list).
  • Designed an Audit framework for simplified data quality checks after loads, complemented by OBIEE reports to analyze trends and identify anomalies.
  • Developed Autosys jobs for each ETL process, ensuring smooth execution according to SLAs.
  • Provided On-Call Production Support, effectively addressing any failures promptly.
  • Collaborated with QA teams to resolve defects logged during weekly batch cycles.
  • Utilized StarTeam and Serena for seamless code promotion to higher environments.
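
For illustration, a minimal Python sketch of the control-file versus data-file count validation; the project used Unix shell scripts, and the one-line control-file layout is an assumption:

```python
# Sketch: compare the record count declared in a control file with the
# actual line count of its data file, failing the batch on a mismatch.
import sys

def validate_counts(data_file: str, control_file: str) -> bool:
    with open(control_file) as cf:
        expected = int(cf.readline().strip())  # assumed layout: count on line 1
    with open(data_file) as df:
        actual = sum(1 for _ in df)
    if actual != expected:
        print(f"COUNT MISMATCH: data={actual} control={expected}", file=sys.stderr)
        return False
    return True

if __name__ == "__main__":
    # Usage: validate_counts.py <data_file> <control_file>
    sys.exit(0 if validate_counts(sys.argv[1], sys.argv[2]) else 1)
```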

Technical Environment: DataStage 11.5/8.1, Informatica Power Center 9.6, Oracle 11g, Teradata, Autosys, WinSCP, SQL Developer, TOAD, Jira, Git, Serena, Putty, HP ALM, Teradata SQL Assistant, OBIEE, Unix, Big Data

Datastage Developer

TIAA-CREF
05.2011 - 12.2013

The main objective of the Confirms project is to automate confirm statement generation for all financial transactions processed through record-keeping systems, at or before completion of the transactions.


  • Extracted data from multiple sources, such as Omni, EDW, and text files, and generated XML files that were sent to BRMS for validation.
  • After validation, loaded XMLs into the BPEL system through MQ to produce the final PDFs mailed to participants (see the sketch after this list).
  • Worked on the Plan Data Management project, integrating data from various source systems, such as Omni Plus, Siebel, Plan Rules Display, and the Institution Information System, into the tactical Plan Data ODS.
  • Exposed plan data to pension operations via web services and reports.
  • Involved in all phases of the project, from requirement gathering and data analysis to design and implementation of the data model.
  • Automated the extraction of data from record-keeping systems like OMNI and RDW, then transformed and loaded the data into the target data warehouse to generate confirm statements.
  • Installed, set up, and tested file transfers between the ETL, BPEL, and BRMS systems.
  • Developed file transfer scripts in Unix and created JIL scripts to automate them from Autosys.
  • Hands-on experience with the XML, Sequential File, Transformer, Lookup, Join, SCD, Merge, Filter, MQ, and Database Connector stages in DataStage.
  • Developed master controlling sequence jobs to call parallel jobs.
  • Hands-on experience working with XML files in DataStage.
  • Provided support for UAT batches and worked with the QA team to resolve defects.
  • Performed performance tuning, error handling, and debugging of ETL jobs and SQL scripts.
  • Provided development status to the PM through daily Scrum calls.
  • Developed Autosys jobs to run file watchers, DataStage jobs, and file transfer scripts with dependencies and schedules.
  • Experienced working in an offshore and onsite model.
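
For illustration, a minimal sketch of putting a validated confirm XML onto MQ for the BPEL system, using the pymqi client as a stand-in for the DataStage MQ stage; the queue manager, channel, host, and queue names are placeholders:

```python
# Sketch: put a validated confirm XML onto an MQ queue for the BPEL system,
# which renders the final PDF and mails it to the participant. Connection
# details below are placeholder assumptions.
import pymqi

def send_confirm(xml_message: bytes) -> None:
    qmgr = pymqi.connect("CONF.QM", "APP.SVRCONN", "mqhost.example.com(1414)")
    try:
        queue = pymqi.Queue(qmgr, "BPEL.CONFIRMS.IN")
        queue.put(xml_message)  # BPEL consumes the message downstream
        queue.close()
    finally:
        qmgr.disconnect()

if __name__ == "__main__":
    with open("confirm_statement.xml", "rb") as f:
        send_confirm(f.read())
```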

Technical Environment: DataStage 8.1, Oracle 11g, Netezza, DB2, SQL Server, Autosys, WinSCP, SQL Developer, TOAD, Jira, StarTeam, Serena, BRMS, HP ALM, Putty

Education

Master’s - Computer Science

University of Houston Clear Lake
Houston, TX
12.2011

Bachelor of Science - Computer Science and Information Technology

G Pulla Reddy Engineering College
Kurnool, AP
06.2008

Skills

  • ETL: DataStage, Informatica Power Center
  • Databases: Snowflake, Oracle, Teradata, DB2, Netezza, SQL Server
  • Programming Languages: SQL, PL/SQL, Python, UNIX Shell Scripting
  • Version Control Tools: StarTeam, GitHub, Serena, UCD
  • Scheduling Tools: Autosys, ActiveBatch, Control-M
  • Business Intelligence: Tableau Desktop 10.x, OBIEE 12c
  • Web Services: SOAP and REST
  • Other Tools: SQL Plus, SQL Developer, TOAD, Putty, HP Quality Center, Teradata SQL Assistant, WinSCP, Microsoft Office, Jira, Hadoop ecosystem, Aginity Workbench, JSON

Timeline

Lead ETL Consultant

USAA
01.2020 - Current

Lead ETL Consultant

Whataburger
10.2018 - 01.2020

Lead ETL Consultant

TIAA-CREF
09.2016 - 09.2018

Sr Datastage Developer

TIAA-CREF
01.2014 - 09.2016

Datastage Developer

TIAA-CREF
05.2011 - 12.2013

Master’s - Computer Science

University of Houston Clear Lake

Bachelor of Science - Computer Science and Information Technology

G Pulla Reddy Engineering College