
Sravan Kumar Vanguru

Hoover, AL

Summary

  • Over 12 years of experience across the complete software development life cycle (SDLC): analysis, design, development, documentation, testing, and implementation of Windows and web-based applications.
  • Manage and lead a team of data engineers, providing guidance, mentorship, and support to ensure they meet their objectives and project deadlines.
  • Oversee the design and development of data pipelines that extract, transform, and load (ETL) data from various sources into data warehouses and other storage systems.
  • Collaborate with other teams (e.g., Data Scientists, Analysts, DevOps) to design and maintain a robust data infrastructure that supports the organization's data needs; work closely with other departments to understand their data requirements and provide solutions that meet them.
  • Experience managing and leading a team of data engineers, including performance evaluation and team development.
  • Ability to identify and address data engineering challenges and optimize processes for better performance; analytical mindset for understanding complex data problems and devising effective solutions.
  • Certified SAFe 4 Agile Practitioner.
  • Significant Big Data experience, with extensive use of Hive, Impala, HDFS, Sqoop, Spark, Python, Scala, Kafka, and Flume.
  • Excellent command of databases such as Oracle 12c, SQL Server 2000, and MS Access 2003; worked with different source systems such as flat files and databases.
  • Good working skills in optimizing SQL to improve performance.
  • Wrote UNIX shell scripts for Big Data automation; specialist in Sqoop automation (moving whole RDBMS datasets in a single shot).
  • Experience with Kafka streaming and Spark streaming.
  • Experience with Jira, SourceTree, Bitbucket, and GitHub.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, businesspeople, and developers across multiple disciplines.
Responsive expert experienced in monitoring database performance, troubleshooting issues, and optimizing database environments. Possesses strong analytical skills, excellent problem-solving abilities, and a deep understanding of database technologies and systems; equally confident working independently or collaboratively, with excellent communication skills. Detail-oriented Lead Senior Data Engineer who designs, develops, and maintains highly scalable, secure, and reliable data structures. Accustomed to working closely with system architects, software architects, and design analysts to translate business and industry requirements into comprehensive data models. Proficient at developing database architecture strategies at the modeling, design, and implementation stages.

Overview

11 years of professional experience

Work History

Lead Senior Data Engineer

Regions Financial Corporation
10.2014 - Current
  • Worked closely with Business Analysts to gather Business Requirement Specifications (BRS) and prepare technical specifications
  • Manage and lead a team of data engineers, providing guidance, mentorship, and support to ensure they meet their objectives and project deadlines
  • Oversee the design and development of data pipelines that extract, transform, and load (ETL) data from various sources into data warehouses or other storage systems
  • Experience working on Snowflake and AWS
  • Collaborate with other teams (e.g., Data Scientists, Analysts, DevOps) to design and maintain a robust data infrastructure that supports the organization's data needs
  • Extensively worked on Cloudera Manager
  • Extensively used Python for writing ETL jobs
  • Extensively worked on big data, sourcing raw data into HDFS and Hive for data analysis and predictions
  • Extensively worked on creating RDDs with Spark and Scala
  • Automated relational data movement using Sqoop so that a whole database can be stored on HDFS and exposed to Hive with a single click
  • Created Kafka topics for real-time data transfer using Spark Streaming, loading data into HDFS and exposing it to Hive
  • Used SBT for compiling Scala scripts
  • Extensively used HQL scripts for cleansing data
  • Extensively worked on applying regular expressions and cleansing data using Spark/Scala code
  • Worked with the Data Science team on their data analytics and predictions
  • Hands-on experience with Kudu for storing analytical data
  • Analyze, design, and modify the existing educational Data Warehouse's data structures in Oracle Database
  • Worked on UNIX shell scripting for major automation processes for the team
  • Worked with the Erwin Data Modeler tool
  • Created Parameter files and validation scripts
  • Expertise in handling special projects in the bank
  • Performed Unit testing and Data validation testing using Validation scripts
  • Served as QA for the project, reviewing all code and analysis
  • Utilized several transformations in OWB (Oracle Warehouse Builder), such as Sequence, Splitter, Duplicator, and Constant
  • Migrated data from staging to ODS and from ODS to RPT
  • Worked with numerous flat files, loaded data from flat files to Oracle
  • Performed UAT for different projects internally
  • Handled multiple tasks concurrently and completed them on time without any loss of quality
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically
  • Identified the errors by analyzing the session logs
  • Wrote documentation to describe program development, logic, coding, testing, changes and corrections
  • Worked on Production Support for Different Issues in Different Projects
  • Environment: Cloudera Manager, Oracle 12c, Hue, StreamSets, Control-M, UNIX, Red Hat, Flat Files, DB2, SQL, PL/SQL, Mainframes, Java.
  • Optimized data pipelines by implementing advanced ETL processes and streamlining data flow.
  • Enhanced system performance by designing and implementing scalable data solutions for high-traffic applications.
  • Developed custom algorithms for efficient data processing, enabling faster insights and decision-making.
  • Collaborated with cross-functional teams to define requirements and develop end-to-end solutions for complex data engineering projects.
  • Reduced operational costs by automating manual processes and improving overall data management efficiency.
  • Ensured data quality through rigorous testing, validation, and monitoring of all data assets, minimizing inaccuracies and inconsistencies.
  • Mentored junior team members in best practices for software development, code optimization, and troubleshooting techniques.
  • Evaluated emerging technologies and tools to identify opportunities for enhancing existing systems or creating new ones.
  • Designed robust database architecture that supported seamless integration of new datasets and facilitated rapid analysis capabilities.
  • Implemented cutting-edge machine learning algorithms to unlock valuable insights from large volumes of structured and unstructured data.
  • Spearheaded efforts to migrate legacy systems onto cloud-based platforms, resulting in improved scalability and cost efficiency.
  • Established standard procedures for version control, code review, deployment, and documentation to ensure consistency across the team's work products.
  • Leveraged advanced analytics tools to create interactive dashboards that provided actionable insights into key business metrics.
  • Reengineered existing ETL workflows to improve performance by identifying bottlenecks and optimizing code accordingly.
  • Participated in strategic planning sessions with stakeholders to assess business needs related to data engineering initiatives.
  • Contributed significantly towards setting up an automated monitoring mechanism using state-of-the-art technologies and tools, which led to proactive issue identification and resolution.
  • Acted as a trusted advisor for clients by providing thought leadership on best practices in data engineering, ensuring their systems were optimized for performance and scalability.
  • Championed the adoption of agile methodologies within the team, resulting in faster delivery times and increased collaboration among team members.
  • Delivered exceptional results under tight deadlines, consistently prioritizing tasks effectively to meet project timelines without compromising quality or accuracy.
  • Developed polished visualizations to share results of data analyses.

Programmer Analyst III

Arkstek INC
10.2013 - 10.2014
  • Gathered functional & technical specifications
  • Extracted data from various sources like Oracle, flat files, and SQL Server
  • Developed Mappings as per the Technical specification approved by the Client
  • Involved with the Business team to gather Business requirements for the project
  • Worked in the Business analytics team to analyze their data
  • Developed Complex mappings by extensively using Informatica Transformations
  • Implemented SCD Type 2 to keep track of historical data
  • Created and Configured Workflows, Worklets, and Sessions to load data into target warehouse tables using Informatica Workflow Manager
  • Extensively Worked with Variables and Parameters in the mappings to pass the values between sessions
  • Monitored scheduled, running, completed, and failed sessions using the Workflow Monitor
  • Debugged mappings for failed sessions
  • Involved in scheduling Informatica sessions using the Workflow Manager to automate the loading process, and in importing and exporting data from the production box to the development box for testing
  • Optimized and tuned mappings for better performance and efficiency
  • Environment: Informatica PowerCenter, Oracle 12c, Teradata, Erwin, Control-M, UNIX, Red Hat, Flat Files, DB2, SQL, PL/SQL, Mainframes.

Programmer Analyst (Intern)

Sarga solutions
09.2012 - 10.2013
  • Worked with business analysts for requirements gathering and business analysis, and translated the business requirements into technical specifications to build the enterprise data warehouse
  • Analyzed the system for the functionality required as per the requirements and created a System Requirement Specification document (Functional Requirement Document)
  • Involved in understanding and analyzing the existing Ab Initio code and prepared the mapping docs
  • Involved in the development of the conceptual, logical, and physical data models of the star schema using Erwin
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager, and Informatica Workflow Manager
  • Designed and developed daily audit and daily/weekly reconcile process ensuring the data quality of the Data warehouse
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator transformations
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager
  • Created users and user groups with appropriate privileges and permissions, folders, and folder permissions in Repository Manager
  • Worked extensively with Teradata SQL Assistant to analyze existing data and implemented new business rules to handle various source-data anomalies
  • Developed UNIX shell scripts using PMCMD utility and scheduled ETL load using Maestro
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations, and tuned accordingly for better performance
  • Worked with various Agile teams to deliver projects
  • Involved in unit and integration testing of Informatica Sessions, Batches, and the Target Data
  • Extensively involved in coding the Business Rules through PL/SQL using the Functions, Cursors and Stored Procedures
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment
  • Prepared Run books providing guidelines to troubleshoot the errors that occurred during the run time and instructions on how to restart the loads
  • Actively involved in production support
  • Implemented fixes/solutions to issues/tickets raised by user community
  • Environment: Informatica tools, Oracle 12c, Teradata, Erwin, Control-M, UNIX, Red Hat.

Education

MASTER'S IN COMPUTER SCIENCE - Information Technology

Virginia International University
Fairfax, VA
12.2012

BACHELOR OF COMPUTER SCIENCE

JNTU

Skills

  • MS-DOS, Oracle 12c, DB2 8.0/7.0/6.0, MS SQL Server 2005/2000/7.0/6.5/2008, and MS Access 7.0/2000
  • SQL, PL/SQL, Transact-SQL, SQL*Plus, Visual Basic 6.0/5.0, HTML, DHTML, C, and Unix Shell Scripting
  • Business Objects XI/6.5/6.0, Business Objects Universe Developer, Business Objects Supervisor, Business Objects Set Analyzer 2.0, and Cognos Series 7.0
  • Sun Solaris 2.6/2.7, Red Hat servers, HP-UX 10.20/9.0, IBM AIX 4.2/4.3, MS-DOS 6.22, Novell NetWare 4.11/3.61, Win 3.x/95/98, Win NT 4.0, Sun Ultra, Sun SPARC, Sun Classic, SCO Unix, HP9000 and RS6000, Cloudera Manager
  • Informatica PowerCenter, Informatica PowerMart (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations), Informatica PowerAnalyzer, ETL, Metadata, Data Mart, Autosys, OLAP, OLTP, SQL*Plus, and SQL*Loader
  • API Development
  • ETL Development
  • Data Warehousing
  • Data Curating
  • Python Programming
  • Performance Tuning
  • Scala Programming
  • Spark Development
  • Metadata Management
  • Real-time Analytics
  • Team Leadership
  • Hadoop Ecosystem
  • SQL and Databases
  • Data Migration
  • RDBMS
  • AWS
  • Airflow
  • Snowflake

Accomplishments

    Best Employee for the year 2022-2023

Timeline

Lead Senior Data Engineer

Regions Financial Corporation
10.2014 - Current

Programmer Analyst III

Arkstek INC
10.2013 - 10.2014

Programmer Analyst (Intern)

Sarga solutions
09.2012 - 10.2013

MASTER'S IN COMPUTER SCIENCE - Information Technology

Virginia International University

BACHELOR OF COMPUTER SCIENCE

JNTU
JNTU