
Ramarao Kavuri

Round Rock, TX

Summary


● Experience in Data Engineering, Data Migrations, Data Lake formation, Data Analytics, and Data Quality.

● Skilled, results-oriented, and organized architect with 16+ years of impactful experience.

● Strong foundation in data integration tools and visualization.

● Progressive career growth from development and design through leading teams to architecture.

● Experience with client-centric, Agile-driven deliveries across offshore-onsite models, with significant client-site experience.

● Expertise in data masking with custom algorithms using the Delphix tool.

● Good knowledge of data modeling concepts such as entity-relationship (ER) modeling, data vault, dimensional modeling, logical and physical database design, normalization, and data integration.

● Experienced in dimensional modeling using ER/Studio, Erwin, and Data Vault 2.0.

● Knowledge and experience with the full SDLC and with Lean/Agile development methodologies.

● Solid understanding of delivery methodology (SDLC); lead teams in implementing solutions according to the design/architecture.

● Experience in designing and implementing scalable and efficient data architecture within Snowflake, including schema design, data models, tables, and data integration patterns.

● Analyzing and optimizing query performance in Snowflake by designing partitioning strategies and clustering keys; monitoring and tuning query execution plans to improve system performance and resource utilization.

● Implementing robust metadata management, security controls, access privileges, and encryption mechanisms within Snowflake to ensure data security and compliance.

● Extensively worked on Snowpipe, Streams, scheduled Tasks, Data Sampling, and Data Masking.

● Experience in using data sharing, Snowflake cloning, access management, and Time Travel.

● Experience in copying data from AWS S3 buckets and Azure Blob Storage to Snowflake.

● Monitoring the performance and health of Snowflake data warehouses, clusters, and queries; analyzing system usage, resource utilization, and query execution statistics to identify and resolve performance bottlenecks.

● Experience with the DBT, Fivetran, and Airflow tools.

● Experience in Qlik Replicate and Qlik Sense.

● Knowledge of building charts, tables, and maps using Qlik Sense.

● Good experience in data ingestion using Qlik Replicate.

● Good expertise in DBT models, macros, snapshots, seeds, and materializations.

● Good experience in creating YAML files, Jinja templates and macros, hooks, and operations using DBT.

● Good knowledge of Python as used in creating Airflow DAGs.

● Experience with the ETL tools IBM DataStage, Informatica IICS, and Informatica PowerCenter.

● Expertise in AWS S3, AWS Glue, IAM, EC2 fundamentals, SQS, KMS, and SNS messaging.

● Good knowledge of Python, Spark, PySpark, and Azure Databricks.

● Expertise in the pandas and NumPy Python libraries.

● Good knowledge of Spark SQL for querying unstructured and semi-structured data.
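The custom data-masking work mentioned above can be illustrated with a minimal Python sketch. This is a hypothetical example of deterministic masking, not Delphix's actual algorithm; the function name and salt are invented for illustration.

```python
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a sensitive string (illustrative only).

    The same input always yields the same token, which preserves
    referential integrity when the value appears in multiple tables.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASKED_" + digest[:12]

# The same SSN masks to the same token in every table it appears in.
ssn = "123-45-6789"
print(mask_value(ssn) == mask_value(ssn))        # True: deterministic
print(mask_value(ssn) != mask_value("987-6543"))  # True: distinct inputs differ
```

Deterministic hashing is one common approach; production masking tools also support format-preserving transformations so masked values still pass downstream validation.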

Overview

22 years of professional experience

Work History

Snowflake/Senior Data Architect

WEX INC
06.2023 - Current

Environment: Snowflake, DBT, Fivetran, Qlik Replicate, AWS, Airflow, Oracle, Informix, GitHub

Description:

● Our current data system was built by the central tech team over the last 25 years. The build of Lakefront 1.0 started in 2019, bringing all WEX data into a single platform on AWS, with Redshift serving as the data lake; on top of Redshift, DBT transformed the data and created data marts for each business. In Lakefront 2.0 we introduced Qlik Replicate and Fivetran to ingest data into Snowflake, where DBT runs the models and creates the data marts; all models were repointed from Redshift to Snowflake.

● Lakefront is broken into Raw, Prep, and Mart layers. The Raw layer holds all the raw data coming from multiple sources. The Prep layer creates views on top of the Raw layer, masking PCI data and applying any filters required on the raw data using DBT models. Finally, the Mart layer is built with the transformed data and business-specific marts.
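The three-layer flow above can be sketched in plain Python. This is a hypothetical, simplified illustration of the layer responsibilities (the field names and values are invented); in Lakefront itself the Prep and Mart layers are DBT models over Snowflake, not Python code.

```python
# Raw layer: rows land exactly as ingested from the source systems.
raw = [
    {"order_id": 1, "card_number": "4111111111111111", "amount": 120.0, "business": "fleet"},
    {"order_id": 2, "card_number": "5500000000000004", "amount": 80.0,  "business": "travel"},
    {"order_id": 3, "card_number": "4111111111111111", "amount": 40.0,  "business": "fleet"},
]

# Prep layer: a view over raw that redacts PCI data (keep only last 4 digits).
prep = [{**row, "card_number": "****" + row["card_number"][-4:]} for row in raw]

# Mart layer: business-specific aggregates built from the prep layer.
mart: dict[str, float] = {}
for row in prep:
    mart[row["business"]] = mart.get(row["business"], 0.0) + row["amount"]

print(prep[0]["card_number"])  # ****1111
print(mart)                    # {'fleet': 160.0, 'travel': 80.0}
```

The key design point is that each layer only reads from the one below it, so PCI redaction happens exactly once and every mart inherits it.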

Snowflake Data Engineer

Saksoff5th
06.2022 - 04.2023

Project Title: SO5 Data Re-platform (migration from on-premises to cloud)

Environment: Snowflake, DBT, Fivetran, AWS, Airflow, Oracle.

Description:

● Our current data system was built by the central tech team over the last 30 years. It consists of a large number of data pipelines, multiple data warehouses with overlapping data sources, and different reporting tools used by different teams to get similar metrics. The current data platform has the following drawbacks:

● Data is mingled with other banners across data sources, data pipelines, data warehouses, and reporting tools. This tightly coupled system makes it hard to maintain and to add new features.

● There are no consistent, well-defined data models for the different business domains, which causes inconsistency of data and low reusability of business logic. There is no clear data governance process, which makes it hard to maintain a single source of truth: for example, one team calculates store sales from the sales audit data while another calculates store sales from the CMS GL table. We also have not enabled the self-service capability that would let the business team easily access the data and serve themselves.

● To solve this problem, we will build a new data platform in Snowflake as the single source of truth to serve both business intelligence and machine learning needs. We aim to provide clean, well-documented, consistent, and accessible data to unlock our business users' ability to make informed decisions at greater speed.

● At a high level, we will adopt the ELT (Extract, Load, Transform) approach instead of ETL, because ELT moves complete raw data sets into Snowflake without losing information. We will design a common data model using dimensional modeling based on the business characteristics and divide the data warehouse into several layers.
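The ELT approach described above can be sketched with Python's built-in sqlite3 standing in for Snowflake. This is a minimal illustration under that substitution (the table and column names are invented): raw data is loaded unchanged first, and the dimensional summary is then built inside the warehouse with SQL.

```python
import sqlite3

# An in-memory database stands in for the Snowflake warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (store TEXT, amount REAL)")

# Extract + Load: raw rows land without transformation, so no information is lost.
rows = [("store_a", 100.0), ("store_b", 50.0), ("store_a", 25.0)]
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)

# Transform: a dimensional-style summary built inside the warehouse itself.
conn.execute("""
    CREATE TABLE mart_store_sales AS
    SELECT store, SUM(amount) AS total_sales
    FROM raw_sales
    GROUP BY store
""")
totals = dict(conn.execute("SELECT store, total_sales FROM mart_store_sales"))
print(totals["store_a"], totals["store_b"])  # 125.0 50.0
```

Because the raw table is kept intact, transformations can be re-run or revised later without re-extracting from the source systems, which is the main advantage ELT has over ETL here.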

Education

Master of Science - Power Electronics

SRM University
11.2005

Skills

  • Snowflake
  • DBT
  • AWS
  • Airflow
  • Fivetran
  • Qlik Replicate
  • Github
  • Data Modelling
  • ETL Tools (DataStage, Informatica)
  • Python, Spark, Databricks
  • Delphix

Timeline

Snowflake/Senior Data Architect

WEX INC
06.2023 - Current

Snowflake Data Engineer

Saksoff5th
06.2022 - 04.2023

Master of Science - Power Electronics

SRM University