Ramesh Vankayalapati

Rocklin, CA

Summary

  • Experienced with Apache Sqoop for ingesting data from DB2 into the Hadoop Distributed File System (HDFS).
  • Experienced in handling different file formats, including text, Avro, Sequence, XML, and JSON files.
  • Extensive experience working with structured data using HiveQL, join operations, and custom UDFs, and in optimizing Hive queries.
  • Experience with the Oozie Workflow Engine, running workflow jobs with actions that execute Hadoop MapReduce, Hive, and Spark jobs.
  • Strong knowledge of Hadoop architecture and ecosystem components such as HDFS, NameNode, DataNode, YARN, MapReduce, Spark, HBase, Hive, and Pig.
  • Used Spark to parse XML files, extract values from tags, and load them into multiple Hive tables.
  • Implemented partitioning and bucketing in Hive for better organization of data.
  • 1+ years of combined experience in the big data ecosystem; created reports with SAP BusinessObjects on top of Apache Hive.
  • Accomplished engineer offering extensive cloud monitoring, deployment, and troubleshooting skills; defined, built, and maintained infrastructure using vendor-neutral and platform-specific tools.
  • Organized and focused, with strong leadership acumen; resourceful Cloud Data/ETL Engineer experienced in evaluating client requirements and implementing infrastructure to solve identified problems.
  • Harnessed code and cloud-native technologies to create scalable, user-centric systems; strong negotiator who delivers value-driven solutions.
  • Detail-oriented Computer Systems Engineer committed to improving system design and operations for reliable workflow management; researches and implements budget-conscious security and encryption solutions to protect user privacy and improve overall network health.
  • Creates easy-to-follow guidelines and troubleshooting documentation for non-technical staff.

Overview

12 years of professional experience
1 certification

Work History

Sr Cloud Data/ETL Engineer

PNC Bank, C&IB
01.2019 - Current
  • Designed and implemented data lake architecture, including data ingestion, storage, and access layers
  • Created data pipelines using Azure Data Factory to extract, transform, and load data from different sources into Azure SQL databases
  • Worked on SQL Server Integration Services packages to load the warehouse while ensuring data quality and integrity
  • Collaborated with cross-functional teams to architect data models and schema structures in Azure SQL Data Warehouse, improving query performance and reducing data latency
  • Designed and implemented data cleansing, validation, and transformation logic to ensure data accuracy and consistency
  • Developed user-defined functions (UDFs) for efficient use within reports
  • Designed and implemented streaming ETL batch packages using change data capture functionality in SQL Server
  • Gained extensive experience in data processing and transformation using Databricks
  • Built dashboards and reports for the CAT and MDS lines of business, providing data insights for decision making and supporting day-to-day business activities
  • Identified, analyzed, and resolved infrastructure vulnerabilities and application deployment issues
  • Worked with the cloud architect to generate assessments and to develop and implement actionable recommendations based on results and reviews

ETL Developer/Programmer

BEST, LA County Office of Education
07.2018 - 01.2019
  • Created design mock-ups for reports and developed reports using SAP BusinessObjects
  • Created universes in BusinessObjects using techniques including hierarchies, loops, cardinality, contexts, conditions, LOVs, derived tables, and prompts
  • Created reports in the Web Intelligence Rich Client (WebI) reporting tool using functionality such as filters, ranking, and sections
  • Created ad hoc reporting-ready universes using Universe Designer
  • Resolved loops in universes using table aliases and contexts to remove cyclic dependencies
  • Developed various WebI reports and made them user friendly by implementing alerts, prompts, conditions, and filters and by formatting reports to user requirements
  • Created derived tables in universes using complex SQL to fill gaps in ETL designs and reduce ETL development time
  • Trained end users in basic ad hoc reporting skills using Web Intelligence.
  • Wrote and optimized in-application SQL statements.
  • Collaborated with business intelligence staff at customer facilities to produce customized ETL solutions for specific goals.

ETL Developer

EDR, Franchise Tax Board
04.2013 - 06.2018
  • Designed and developed ETL jobs to extract, cleanse, and load data into various targets from sources such as XML files, flat files, relational databases, and external sources
  • Developed ETL jobs as services and deployed and managed them using IBM DataStage and QualityStage 8.7/11.5
  • Developed real-time jobs using ISD input/output stages and the web service transformer to call the master data management system (IBM Initiate)
  • Implemented and supported address standardization jobs to standardize and certify US and global addresses using data quality stages such as MNS, Address Verification Interface, and CASS in DataStage
  • Integrated DataStage with the Pitney Bowes Address Verification Interface to replace the QualityStage Address Verification Interface
  • Created ETL jobs to shred large XML files into tables
  • Reviewed and optimized SQL queries and ETL packages for better performance
  • Documented and tested ETL jobs and supported production data cleaning efforts
  • Worked with clients as a developer to complete and qualify technical requirement documents
  • Designed and developed indexes on tables to improve the performance of ETL package updates and inserts
  • Created universes for Personal Income Tax and Business Entities audit purposes, created Crystal Reports, and assisted users with report issues.
  • Collaborated with business intelligence staff at customer facilities to produce customized ETL solutions for specific goals.
  • Wrote and optimized in-application SQL statements.

Oregon Health and Science University
02.2013 - 04.2013
  • Designed and developed many jobs in DataStage Designer using Sequential File, Change Capture, SCD, ODBC Connector, Oracle Connector, Transformer, Sort, Lookup, Join, and Column Export stages
  • Implemented incremental loading using Change Capture and Surrogate Key stages
  • Implemented Type 1 and Type 2 dimensions using the SCD stage
  • Developed test cases and executed test scenarios per requirements
  • Installed, configured, and administered SQL Server 2008 R2/2012/2014 in production and non-production clustered and standalone server environments
  • Designed and developed ETL packages, stored procedures, configuration files, tables, views, and functions, implementing best practices to maintain optimal database performance.

ORR & SCMAS, Southern California Edison
06.2011 - 02.2013
  • As a DataStage developer, participated in designing CDC PAD and stage-to-core documents
  • Developed batch jobs using Sequential File, Teradata Connector, Transformer, Sort, Lookup, Join, Surrogate Key Generator, and Column Export stages
  • Used Type 1 and Type 2 slowly changing dimensions to load the dimension tables.

Education

MS - Information Technology

Arkansas Tech University
Russellville, AR
05.2011

Bachelor's - Information Technology

V.R. Siddhartha Engineering College
Vijayawada
05.2008

Skills

  • Azure Cloud, ADF, Azure Databases
  • Data Warehousing
  • IBM InfoSphere DataStage & QualityStage 8.7/11.5, Metadata Workbench & Information Analyzer
  • Informatica PowerCenter
  • SQL (DB2, Teradata, Oracle, SQL Server, HiveQL), T-SQL
  • Databricks, Python, PySpark, Pandas, Jupyter Notebooks
  • Pitney Bowes Spectrum Address Verification
  • SAP BusinessObjects 4.2, Tableau
  • Application Development
  • Apache Sqoop, Apache Hadoop, Apache Hive, Kafka, Apache Spark
  • Maintenance Organization and Development

Accomplishments

  • Microsoft Certified: Azure Data Engineer Associate

Experience snapshot (years of experience, proficiency level: 1 = basic, 2 = familiar, 3 = competent, 4 = expert)

Application knowledge
  • UNIX / AIX - 3 years, level 3
  • IBM InfoSphere DataStage and QualityStage - 7 years, level 4
  • IBM InfoSphere Metadata Workbench & Information Analyzer, CDC - 5 years, level 3
  • Informatica PowerCenter - 5 years, level 4
  • SAP BusinessObjects - 2 years, level 3
  • Apache Hadoop, Hive, Sqoop, Kafka, Spark
  • Big Data - 2 years, level 3

IT disciplines
  • Data Warehouse Development - 7 years, level 4
  • Application Development - 6 years, level 4
  • Database Administration - 1 year, level 3

Industry knowledge
  • Tax and Revenue - 5 years, level 3
  • Utilities - 2 years, level 3
  • Education and Health - 2 years, level 3

Industry expertise
  • Tax & Revenue
  • Utilities
  • Education & Health
  • Local Government

Certification

Azure Data Engineer Associate

  • Certified Microsoft Azure Data Engineer Associate

Timeline

Sr Cloud Data/ETL Engineer

PNC Bank, C&IB
01.2019 - Current

ETL Developer/Programmer

BEST, LA County Office of Education
07.2018 - 01.2019

ETL Developer

EDR, Franchise Tax Board
04.2013 - 06.2018

Oregon Health and Science University
02.2013 - 04.2013

ORR & SCMAS, Southern California Edison
06.2011 - 02.2013

MS - Information Technology

Arkansas Tech University

Bachelor's - Information Technology

V.R. Siddhartha Engineering College