Srinivas Naveen Kumar Vanapally

Prosper

Summary

Professional with 9 years of IT experience specializing in Data Warehousing. Expertise includes Business Requirements Analysis, Application Design, Data Modeling, Development, Testing, and Documentation within Financial and Communication sectors. Proven ability to deliver high-quality data solutions and drive project success. Seeking to contribute skills to a Data Engineer position.

Insightful report developer known for high productivity and efficient task completion, with specialized skills in SQL database management, data visualization in Tableau and Power BI, and advanced Excel functions. Proactive ETL developer with several years of experience handling and transforming data loads to suit customer needs, versed in the dominant data modeling designs and mapping technologies, and dedicated to ETL integrity and data security during database maintenance and modification. Resourceful cloud analyst specializing in cloud infrastructure management, data security protocols, and performance analysis to ensure optimal operations. Experienced data warehouse developer whose ETL, database optimization, and data modeling skills strengthen workflow and data integrity. Strong background in data collection, analysis, and reporting, translating complex data into clear business insights that drive strategic decisions, with a proven record of improving operational efficiency through business intelligence tools. Excels at problem-solving, time management, communication, teamwork, and adaptability, ensuring successful project delivery and stakeholder satisfaction.

Overview

10 years of professional experience

Work History

Power BI Developer

T-Mobile
Prosper
01.2022 - Current
  • Developed interactive dashboards to visualize client data insights.
  • Collaborated with stakeholders to gather requirements and define project scope.
  • Designed and implemented data models supporting reporting needs.
  • Conducted data analysis to identify trends influencing decision-making processes.
  • Optimized reports for enhanced performance and usability.
  • Integrated multiple data sources into Power BI for comprehensive reporting solutions (a minimal sketch follows this list).
  • Maintained documentation for processes, models, and dashboards created in projects.
  • Analyzed code to correct errors, optimizing output.
  • Completed day-to-day duties accurately and efficiently.
  • Contributed innovative ideas and solutions to enhance team performance and outcomes.
  • Worked successfully with a diverse group of coworkers to accomplish goals and address issues related to our products and services.
  • Promoted high customer satisfaction by resolving problems with knowledgeable and friendly service.
  • Prioritized and organized tasks to efficiently accomplish service goals.
  • Collaborated closely with team members to achieve project objectives and meet deadlines.
  • Worked with cross-functional teams to achieve goals.
  • Maintained updated knowledge through continuing education and advanced training.
  • Implemented new technologies designed to improve efficiency of facilities operations.
  • Assisted in developing a standardized approach for migrating legacy applications to modern platforms such as Big Data and cloud computing.
  • Analyzed cloud infrastructure performance and identified areas for optimization.
  • Developed documentation for cloud processes and user guides for stakeholders.
  • Evaluated new cloud technologies and tools to enhance organizational capabilities.
  • Maintained and monitored cloud-based applications, services and systems to ensure optimal performance.
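
A minimal sketch of the kind of source-consolidation step behind these dashboards, assuming hypothetical CSV extracts, file names, and column names; pandas is used here to conform two sources into one pre-aggregated table that Power BI can import:

  import pandas as pd

  # Hypothetical extracts from two source systems (file names and columns are
  # illustrative, not taken from the actual project).
  subscriptions = pd.read_csv("subscriptions_extract.csv", parse_dates=["activation_date"])
  usage = pd.read_csv("usage_extract.csv", parse_dates=["usage_date"])

  # Conform keys and join the sources into a single reporting table.
  subscriptions["customer_id"] = subscriptions["customer_id"].astype(str).str.strip()
  usage["customer_id"] = usage["customer_id"].astype(str).str.strip()

  report = usage.merge(
      subscriptions[["customer_id", "plan_name", "activation_date"]],
      on="customer_id",
      how="left",
  )

  # Pre-aggregate to month grain so the Power BI model stays small and fast.
  report["usage_month"] = report["usage_date"].dt.to_period("M").dt.to_timestamp()
  monthly = (
      report.groupby(["usage_month", "plan_name"], as_index=False)
            .agg(total_minutes=("minutes_used", "sum"),
                 active_customers=("customer_id", "nunique"))
  )

  # Written out as CSV; Power BI can import this file directly.
  monthly.to_csv("monthly_usage_report.csv", index=False)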

Azure Data Engineer

Tasfresh
Tasmania
05.2018 - 09.2021
  • Involved in business requirements gathering, analysis, design, development, testing, and implementation of business rules.
  • Led an offshore team by assigning tasks, tracking status updates, and helping the team deliver technical solutions.
  • Created pipelines, data flows, and complex data transformations using Azure Data Factory (ADF) and PySpark on Databricks.
  • Created and provisioned multiple Databricks clusters for batch and continuous streaming data processing, and installed the required libraries on the clusters.
  • Designed and developed ADF pipelines to extract data from relational sources such as Teradata, Oracle, SQL Server, and DB2, and from non-relational sources such as flat files, JSON files, XML files, and shared folders.
  • Developed streaming pipelines using Apache Spark with Python.
  • Developed Azure Databricks notebooks to apply business transformations and perform data cleansing operations (a minimal sketch follows this list).
  • Developed Databricks Python notebooks to join, filter, pre-aggregate, and process files stored in Azure Data Lake Storage.
  • Ingested large volumes and varieties of data from disparate source systems into Azure Data Lake Storage Gen2 using Azure Data Factory V2.
  • Created reusable pipelines in Data Factory to extract, transform, and load data into Azure SQL DB and SQL Data Warehouse.
  • Implemented both ETL and ELT architectures in Azure using Data Factory, Databricks, SQL DB, and SQL Data Warehouse.
  • Developed an audit, balance, and control framework using SQL DB audit tables to control the ingestion, transformation, and load process in Azure.
  • Used Azure Logic Apps to develop workflows that send alerts and notifications for different jobs in Azure.
  • Used Azure DevOps to build and release different versions of code across environments.
  • Automated jobs using schedule, event-based, and tumbling window triggers in ADF.
  • Created external tables in Azure SQL Database for data visualization and reporting purposes.
  • Created and set up self-hosted integration runtimes on virtual machines to access private networks.
  • Well-versed in Azure authentication mechanisms such as service principals, managed identities, and Key Vault.
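
A minimal sketch of the kind of Databricks cleansing notebook referenced above, assuming hypothetical ADLS Gen2 paths, container names, and column names; on Databricks a SparkSession named spark is already provided, and it is built explicitly here only to keep the sketch self-contained:

  from pyspark.sql import SparkSession, functions as F

  # Build a session so the sketch runs outside a notebook; Databricks supplies one.
  spark = SparkSession.builder.appName("cleanse_orders").getOrCreate()

  # Hypothetical ADLS Gen2 path; the real storage account, container, and
  # credentials would come from the cluster/ADF configuration.
  raw_path = "abfss://raw@mystorageaccount.dfs.core.windows.net/orders/"

  orders = spark.read.option("header", "true").csv(raw_path)

  cleansed = (
      orders
      .withColumn("order_ts", F.to_timestamp("order_ts"))           # normalize timestamps
      .withColumn("amount", F.col("amount").cast("decimal(18,2)"))  # enforce numeric type
      .filter(F.col("order_id").isNotNull())                        # drop unusable rows
      .dropDuplicates(["order_id"])                                 # de-duplicate on key
  )

  # Light pre-aggregation before loading downstream (Azure SQL DB / SQL Data Warehouse).
  daily = (
      cleansed
      .groupBy(F.to_date("order_ts").alias("order_date"))
      .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
  )

  daily.write.mode("overwrite").parquet(
      "abfss://curated@mystorageaccount.dfs.core.windows.net/orders_daily/"
  )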

Data Engineer/Teradata Developer

Tasfresh
Tasmania
08.2017 - 05.2018
  • Participated in business meetings to gather requirements, and contributed to analysis, design, reviews, development, and testing.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain; responsible for COLLECT STATISTICS on fact tables.
  • Developed Python scripts for ETL load jobs using pandas functions (a minimal sketch follows this list).
  • Created appropriate Primary Indexes, considering both planned data access paths and even data distribution across all available AMPs.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created temporal and columnar tables using the advanced features of Teradata 14.0.
  • Created tables and views in Teradata according to requirements.
  • Provided architecture and development for initial load programs to migrate production databases from Oracle data marts to the Teradata data warehouse, as well as an ETL framework to supply continuous engineering and manufacturing updates to the data warehouse (Oracle, Teradata, MQ Series, ODBC, HTTP, and HTML).
  • Performed ongoing delivery, migrating client mini data warehouses and functional data marts from the Oracle environment to Teradata.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS using BTEQ, MultiLoad, and FastLoad.
  • Used various Informatica transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, Update Strategy, Expression, Sorter, Normalizer, Stored Procedure, and Union.
  • Used Informatica PowerExchange to handle change data capture (CDC) data from the source and load it into the data mart following the slowly changing dimension (SCD) Type II process.
  • Used PowerCenter Workflow Manager to create workflows and sessions, and used tasks such as Command, Event Wait, Event Raise, and Email.
  • Designed, created and tuned physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models.
  • Created a cleanup process to remove all intermediate temp files used prior to the loading process.
  • Created several Tableau dashboard reports and heat map charts, and supported numerous dashboards, pie charts, and heat maps built on the Teradata database.
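
A minimal sketch of a pandas-based ETL load job of the kind referenced above, assuming hypothetical file names, delimiter, and column names; it cleanses a source extract and writes a pipe-delimited load-ready file for a Teradata utility such as FastLoad or MultiLoad:

  import pandas as pd

  # Hypothetical source extract and load-ready output paths; column names are
  # illustrative, not taken from the actual project.
  SOURCE_FILE = "daily_sales_extract.csv"
  LOAD_READY_FILE = "daily_sales_load_ready.txt"

  def build_load_ready_file(source_path: str, target_path: str) -> int:
      """Cleanse a source extract with pandas and write a pipe-delimited
      load-ready file for a Teradata load utility."""
      df = pd.read_csv(source_path, dtype=str)

      # Basic cleansing: trim whitespace, drop rows missing the key, de-duplicate.
      df = df.apply(lambda col: col.str.strip())
      df = df.dropna(subset=["sale_id"]).drop_duplicates(subset=["sale_id"])

      # Conform the amount column to a numeric value with two decimal places.
      df["amount"] = pd.to_numeric(df["amount"], errors="coerce").round(2)

      df.to_csv(target_path, sep="|", index=False, header=False)
      return len(df)

  if __name__ == "__main__":
      rows = build_load_ready_file(SOURCE_FILE, LOAD_READY_FILE)
      print(f"Wrote {rows} rows to {LOAD_READY_FILE}")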

ETL Developer

Tasfresh
Tasmania
08.2015 - 07.2017
  • Developed scripts for loading data into the base tables in the EDW using the Teradata FastLoad, MultiLoad, and BTEQ utilities.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files per requirements.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process.
  • Used volatile tables and derived queries to break complex queries into simpler ones.
  • Loaded data into Teradata from legacy systems and flat files using complex MultiLoad and FastLoad scripts.
  • Developed MultiLoad scripts to load data from load-ready files into the Teradata warehouse.
  • Wrote complex SQL queries to pull required information from the database using Teradata SQL Assistant.
  • Created a shell script that checks data files for corruption prior to the load (a minimal sketch follows this list).
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs.
  • Created and enhanced Teradata stored procedures to generate automated testing SQL.
  • Involved in troubleshooting the production issues and providing production support.
  • Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
  • Analyzed end-user requirements and business rules from provided documentation, working closely with tech leads and analysts to understand the current system.
  • Created a cleanup process to remove all intermediate temp files used prior to the loading process.
  • Developed unit test plans and involved in system testing.
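
A minimal sketch of the pre-load corruption check referenced above, written here in Python rather than shell to stay consistent with the other sketches; the expected field count and delimiter are hypothetical:

  import sys
  from pathlib import Path

  EXPECTED_FIELD_COUNT = 12   # hypothetical layout of the load-ready file
  DELIMITER = "|"

  def check_data_file(path: str) -> bool:
      """Pre-load sanity check: file exists, is non-empty, and every record
      has the expected number of delimited fields."""
      data_file = Path(path)
      if not data_file.is_file() or data_file.stat().st_size == 0:
          print(f"FAIL: {path} is missing or empty")
          return False

      with data_file.open("r", encoding="utf-8", errors="strict") as handle:
          for line_no, line in enumerate(handle, start=1):
              fields = line.rstrip("\n").split(DELIMITER)
              if len(fields) != EXPECTED_FIELD_COUNT:
                  print(f"FAIL: line {line_no} has {len(fields)} fields, "
                        f"expected {EXPECTED_FIELD_COUNT}")
                  return False

      print(f"OK: {path} passed pre-load checks")
      return True

  if __name__ == "__main__":
      # Usage: python check_data_file.py <load_ready_file>
      sys.exit(0 if check_data_file(sys.argv[1]) else 1)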

Education

M.S. - Information Technology

Charles Sturt University
Melbourne, Australia
01.2015

Bachelor of Technology - Information Technology

JNTU
Hyderabad, Telangana
01.2012

Skills

  • Azure cloud platform expertise
  • Data storage solutions
  • SQL database management
  • Data analytics services
  • Data engineering tools
  • ETL process optimization
  • Programming proficiency
  • Database technologies
  • Big data frameworks
  • Requirements gathering
  • Data visualization
  • Power BI
  • Decision-making
  • Agile methodologies

Timeline

Power BI Developer

T-Mobile
01.2022 - Current

Azure Data Engineer

Tasfresh
05.2018 - 09.2021

Data Engineer/Teradata Developer

Tasfresh
08.2017 - 05.2018

ETL Developer

Tasfresh
08.2015 - 07.2017

M.S. - Information Technology

Charles Sturt University

Bachelor of Technology - Information Technology

JNTU