
Vamsi Krishna Kosuri
Senior Data Engineer

Miami, Florida

Summary

  • Around 15 years of ETL/ELT experience covering analysis, design, development, implementation, migration, and troubleshooting of data warehousing solutions.
  • Experience in ETL and ELT of data from disparate sources into data warehouses and data marts using Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter. Proficient across Amazon Web Services (AWS) offerings such as EC2, S3, IAM, and Lambda.
  • Extensively worked with data ingestion tools such as Fivetran and Qlik to ingest raw data into the data lake, and with Continuous Integration and Continuous Delivery (CI/CD) pipelines using Jira and GitHub.
  • Good knowledge of data warehousing concepts such as star and snowflake schemas.
  • In-depth understanding of Snowflake multi-cluster virtual warehouses and warehouse sizing; optimized long-running queries to save Snowflake credits and storage.
  • Good knowledge of developing streams, tasks, Snowpipes, views, and stored procedures, and of classifying PII and sensitive data in Snowflake (a minimal stream-and-task sketch follows this list).
  • Good knowledge of building and maintaining data classification, data profiling, data cleansing, data security, and data governance solutions. Involved in performance tuning of Informatica mappings, including pushdown optimization.
  • Good knowledge of UNIX commands and shell scripting to pull data from UNIX, FTP, and SFTP servers.
  • Experience preparing evaluation and decision registers, weighing pros and cons through proofs of concept (PoCs) on the best-fit tools and technologies in the market.
  • Excellent analytical, problem-solving, and written and verbal communication skills, with the ability to interact with individuals at all levels.
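
A minimal sketch of the stream-and-task pattern referenced above, using the snowflake-connector-python package. The account credentials and object names (RAW_ORDERS, ORDERS_DIM, ETL_WH) are illustrative placeholders, not details from the projects below.

```python
import snowflake.connector

# Placeholder credentials; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream records row-level changes (CDC) on the raw table.
cur.execute("CREATE OR REPLACE STREAM RAW_ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task runs on a schedule, but only when the stream actually has data,
# and pushes the new rows downstream.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS_DIM
      WAREHOUSE = ETL_WH
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO ORDERS_DIM
      SELECT order_id, customer_id, amount, updated_at
      FROM RAW_ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")

# Tasks are created suspended; RESUME starts the schedule.
cur.execute("ALTER TASK LOAD_ORDERS_DIM RESUME")
conn.close()
```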

Overview

  • 15 years of professional experience
  • 1 certification

Work History

Senior Data Engineer

IBM Corp
08.2015 - Current
  • Project 1: Client: WORLD KINECT CORPORATION
  • Duration: AUG 2015 - Current
  • Role: SENIOR DATA ENGINEER
  • Project Description:
  • World Kinect Corporation (WKC), formerly World Fuel Services (WFS) Corporation, is an energy, commodities, and services company focused on the marketing, trading, and financing of aviation, marine, corporate, and ground transportation energy commodities.
  • As a Senior Data Engineer, I am responsible for analysis, design, development, testing, and implementation of enterprise data warehousing solutions for new implementations and legacy system migrations.
  • Roles & Responsibilities:
  • Integrate data shared across legacy systems; develop and maintain advanced database systems, including the EDW, data marts, and highly normalized models needed for business and operational analysis and reporting.
  • Gather requirements from business stakeholders and liaise with the client IT team to prioritize new feature and enhancement requests, triage production issues, and initiate remediation.
  • Build ETL pipelines using Snowflake, Informatica Intelligent Cloud Services, Qlik, Fivetran, and Python to load data from various sources into the enterprise data warehouse built on the Snowflake cloud data platform.
  • Migrated existing code from Informatica PowerCenter to IICS.
  • Design and implement data transformations using dbt models, ensuring data integrity and consistency.
  • Build and manage data catalog tools such as Atlan and Informatica EDC, which aid business users in data discovery, data stewardship, data lineage, and metadata management.
  • Implement new ideas and features to enrich metadata, focused on cloud technologies and business applications.
  • Apply expertise in managing AWS compute services such as Elastic Compute Cloud (EC2) through the AWS Command Line Interface (AWS CLI), storage services such as Simple Storage Service (Amazon S3), and event-driven services such as Lambda (a boto3 sketch follows this list).
  • Eliminate bottlenecks through performance tuning, providing optimized solutions and reducing the overall cost of cloud resources.
  • Stream raw data into the data lake through ingestion tools such as Fivetran and Qlik, and manage these tools.
  • Design and develop ETL processes that meet program specifications and client requirements using Informatica PowerCenter tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Run sprint planning, create Jira stories with estimates, and mentor the offshore team by providing guidance and sharing technical expertise to ensure closure of all tickets committed in the sprint, strictly following Agile methodologies.
  • Environment: Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter, Snowflake, Amazon Web Services (AWS), ORACLE, Atlan, EDC, UNIX, Jira, GitHub, Fivetran, Qlik, Postman
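
The EC2 and S3 housekeeping described above could look roughly like the following in boto3, the Python counterpart of the AWS CLI calls mentioned; the instance filter, bucket, and key names are illustrative assumptions.

```python
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# List running EC2 instances, e.g. to spot idle capacity when tuning cost.
resp = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)
for reservation in resp["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["InstanceType"])

# Stage an extract in S3; an S3 event notification on this prefix can then
# trigger a Lambda function to kick off downstream processing.
s3.upload_file("daily_extract.csv", "my-etl-bucket", "landing/daily_extract.csv")
```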

Data Engineer

IBM Corp
08.2014 - 08.2015
  • Project 2: Name: ANTI MONEY LAUNDERING
  • Client: ROYAL BANK OF SCOTLAND
  • Duration: AUG 2014 - AUG 2015
  • Role: DATA ENGINEER
  • Project Description:
  • As a Data Engineer, I was responsible for analysis, design, development, testing, and implementation of data warehousing solutions for new implementations and legacy system migrations for the IBM UK client Royal Bank of Scotland (RBS).
  • Roles & Responsibilities:
  • Requirements gathering, estimation, requirement review, detailed design, development, code review, and unit testing.
  • Interacted with business partners to understand the business.
  • Understood business requirements and provided technical resolutions.
  • Prioritized work queue items, assigned them to team members, and tracked them to closure to meet scheduled timelines.
  • Developed ETL jobs using Informatica PowerCenter tools.
  • Wrote UNIX shell scripts in KSH (a rough Python equivalent is sketched after this list).
  • Performed tuning by identifying and eliminating bottlenecks.
  • Prepared release documents and migrated code between environments.
  • Environment: Informatica PowerCenter, ORACLE, SQL, PL/SQL, UNIX
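
The KSH file pulls mentioned in this role could be expressed in Python roughly as follows, using paramiko for SFTP; the host, credentials, and paths are placeholders rather than project details.

```python
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # sketch only; pin host keys in practice
ssh.connect("sftp.example.com", username="etl_user", password="secret")

sftp = ssh.open_sftp()
# Download each landed file into the local staging area for the ETL load.
for name in sftp.listdir("/outbound"):
    if name.endswith(".dat"):
        sftp.get(f"/outbound/{name}", f"/staging/{name}")

sftp.close()
ssh.close()
```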

Technology Analyst

Infosys Limited
06.2012 - 07.2014
  • Project 1: Name: NEXT GENERATION BILLING
  • Client: CHARLES SCHWAB & CO
  • Duration: JUN 2012 - JUL 2014
  • Role: TECHNOLOGY ANALYST
  • Project Description:
  • As a Technology Analyst, I was responsible for analysis, development, testing, and implementation of data warehousing solutions for new implementations and legacy system migrations for the Infosys US client Charles Schwab & Co.
  • Roles & Responsibilities:
  • Requirements gathering, estimation, impact analysis, detailed design, development, and code review.
  • Interacted with business partners to understand the existing billing systems.
  • Understood business requirements and provided technical resolutions.
  • Performed ABP/OES impact analysis.
  • Developed ETL jobs in Informatica implementing SCD Type 2 (the pattern is illustrated after this list).
  • Wrote UNIX shell scripts in KSH to extract data from files into the database.
  • Implemented pushdown optimization.
  • Performed tuning by identifying and eliminating bottlenecks.
  • Prioritized work queue items, assigned them to team members, and tracked them to closure to meet scheduled timelines.
  • Prepared scheduling and release documents and validated the install.
  • Environment: Informatica PowerCenter, Teradata, ORACLE, SQL, PL/SQL, UNIX
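
A self-contained illustration of the SCD Type 2 pattern referenced above: when a tracked attribute changes, the current dimension row is end-dated and a new current row is appended. The field names (customer_id, city) are invented for the example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DimRow:
    customer_id: int
    city: str
    effective_from: date
    effective_to: date | None = None   # None marks the open-ended current row
    is_current: bool = True

def apply_scd2(dim: list[DimRow], customer_id: int, new_city: str, load_date: date) -> None:
    """Close the current version if the attribute changed, then append a new one."""
    for row in dim:
        if row.customer_id == customer_id and row.is_current:
            if row.city == new_city:
                return                     # no change: keep history as-is
            row.effective_to = load_date   # end-date the old version
            row.is_current = False
    dim.append(DimRow(customer_id, new_city, effective_from=load_date))

dim = [DimRow(1, "Miami", date(2012, 1, 1))]
apply_scd2(dim, 1, "Tampa", date(2013, 6, 1))
for row in dim:
    print(row)
```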

System Engineer

Infosys Limited
02.2009 - 06.2012
  • Project 2: Name: ERISA LO
  • Client: WELLPOINT
  • Duration: FEB 2009 - JUN 2012
  • Role: SYSTEM ENGINEER
  • Project Description:
  • As a System Engineer, I was responsible for the development and testing of data warehousing solutions for the Infosys US client WellPoint.
  • Roles & Responsibilities:
  • Understood and designed solutions for complex client business processes and defined development strategies.
  • Developed ETL jobs using Informatica PowerCenter tools.
  • Created and documented a test plan for testing the master workflow.
  • Ensured quality deliverables to the client within scheduled timelines.
  • Prepared scheduling and release documents.
  • Environment: Informatica PowerCenter, Teradata, UNIX

Education

Bachelor of Technology (B.Tech.) in Computer Science Engineering (CSE)

D.M.S.S.V.H College of Engineering, Acharya Nagarjuna University, Andhra Pradesh, India
04.2008

Skills

  • Cloud Technologies: Informatica Intelligent Cloud Services (IICS), Amazon Web Services (AWS)
  • ETL Tools: Informatica PowerCenter 8.6.1/9.5, dbt Core
  • Cloud Data Platform: Snowflake
  • Databases: Oracle 9i/10g/11g, SQL Server, R12, Teradata 12/13
  • Data Ingestion Tools: Fivetran, Qlik
  • Languages: Python (beginner level)
  • Data Catalogs or Data Governance Tools: Atlan Data Catalog, Informatica Enterprise Data Catalog (EDC)
  • Database Utilities: SQL Developer, DBeaver, Teradata SQL Assistant, Toad for Oracle
  • Version Control Tools: WinCVS, Bitbucket
  • CI/CD: Jira, Bamboo, GitHub
  • Scripting: Unix shell scripting
  • Schedulers: Crontab
  • API: Postman
  • Documentation: Confluence
  • Data Warehousing
  • Data Governance
  • Metadata Management
  • Problem-Solving Abilities
  • Analytical Thinking
  • Adaptability and Flexibility

Certification

  • SnowPro Core certified (COF-C02)
  • AWS Certified Cloud Practitioner (CLF-C01)
  • IBM Certified Agile Explorer
  • IBM Certified Data Science Foundations Level 1 and Level 2
