
Ramesh Kandimalla

Minneapolis, MN

Summary

Detail-oriented Snowflake Developer with more than 4 years of experience in data integration, ETL development, and cloud data solutions. Skilled in Snowflake, AWS S3, EC2, Glue, Athena, Python, SQL, PL/SQL, Informatica PowerCenter, data migration, performance tuning, and scalability. Strong background in designing ETL architectures from the ground up and improving extract, transform, and load processes. Accustomed to working hand in hand with business analysts to identify requirements, plan and improve work processes, and deliver relevant solutions. Proven team player with expertise in data conversion solutions, designing and implementing complex stored procedures, tuning SQL queries, and providing post-implementation support to deliver high-quality results that increase the value of the firm's data-dependent projects.

Overview

3 years of professional experience
1 Certification

Work History

Snowflake Developer

Qualcomm
09.2023 - Current
  • Reviewed business requirements with the client, worked with the Business Analysts to iron out gaps in the requirements, and advised on application improvements and caveats.
  • Wrote SnowSQL scripts and stored procedures, and designed Snowflake tables (temporary, transient, permanent).
  • Worked with Snowflake objects: warehouses, roles, databases, schemas, tables, views, constraints, and Snowflake table clustering keys.
  • Developed and led an end-to-end solution integrating various technologies for the project, using Informatica ETL jobs, sequencers, mappings, transformations, UDFs, reusable components, command tasks, Oracle PL/SQL programming, and stored procedures in Microsoft SQL Server.
  • Responsible for loading data into S3 buckets from the internal server and the Snowflake data warehouse.
  • Built the framework for efficient data extraction, transformation, and loading (ETL) from a variety of data sources.
  • Launched Amazon EC2 cloud instances on AWS (Linux/Ubuntu) and configured the launched instances for specific applications.
  • Tuned several complex SQL queries, improving the performance of multiple SQL scripts.
  • Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
  • Develop and customize MDM solutions using Informatica MDM tools.
  • Wrote complex PL/SQL procedures and functions to extract data in the required format for interfaces.
  • Developed and managed data pipelines to integrate data from AWS S3, relational databases, and external APIs into Snowflake using AWS Glue and custom Python scripts.
  • Extracted, transformed, and loaded data from various sources into CSV data files using Python and SQL queries.
  • Developed Snowflake views to load and unload data from and to an AWS S3 bucket, and promoted the code to production.
  • Designed, developed, and implemented performant ETL pipelines using PySpark and AWS.
  • Developed ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, and wrote SQL queries against Snowflake.
  • Extensively handled Equipment Data and Vision Link data using XML, XSD, and JSON files, loading them into a Teradata database.
  • Created ETL mapping document and ETL design templates for the development team.
  • Created external tables on top of S3 data which can be queried using AWS services like Athena.
  • Architected and implemented very large-scale data intelligence solutions around the Snowflake Data Warehouse.
  • Migrated data from SAP and Oracle, created a data mart using Cloud Composer (Airflow), and moved Hadoop jobs to Datapost workflows.
  • Created Code Review Standards for UNIX Scripting, SQL, and PL/SQL.
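The Python/SQL-to-CSV pipeline work above can be illustrated with a minimal sketch. This is not the project's actual code: sqlite3 stands in for the real source database, and the table and column names (`equipment`, `vendor`, etc.) are illustrative assumptions.

```python
# Minimal ETL-to-CSV sketch: extract rows from a relational source,
# apply a small transformation, and load them into a CSV file.
# sqlite3 is a stand-in for the real source database; the schema is assumed.
import csv
import sqlite3


def extract_transform_load(db_path: str, csv_path: str) -> int:
    """Extract equipment rows, normalise the vendor name, write a CSV.

    Returns the number of data rows written.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT equipment_id, vendor, hours_used FROM equipment"
        ).fetchall()
    finally:
        conn.close()

    with open(csv_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["equipment_id", "vendor", "hours_used"])
        for equipment_id, vendor, hours in rows:
            # Transform step: trim whitespace and upper-case the vendor name.
            writer.writerow([equipment_id, vendor.strip().upper(), hours])
    return len(rows)
```

In a production pipeline the extract query, the transformation rules, and the CSV target would each come from the mapping document rather than being hard-coded.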

Snowflake Developer

TCS
10.2021 - 09.2023
  • Led data conversion projects, ensuring accurate and efficient migration of data from legacy systems to Oracle HCM Cloud across workforce structure, Core HR, Payroll, Benefits, and Talent modules.
  • Automated the HDL creation process using SQL and PL/SQL queries in the SQL Developer tool.
  • Drove data conversion mapping sessions with the business team.
  • Utilized HCM Extracts and Fast Formulas to streamline data extraction processes and for building complex integration solutions such as General Ledger, Seniority and service dates and grade step progression processes.
  • Collaborated with functional teams to understand business needs and translate them into technical specifications.
  • Led technical delivery during post go-live support and CRs implementation phase.
  • Involved in all stages of the project life cycle, from requirement gathering to go-live, and supported the project during the hypercare phase with post-go-live defects and changes.
  • Provided technical support and troubleshooting for Oracle HCM Cloud application issues during testing cycles and post go-live support in production environment.
  • Created comprehensive technical documentation for data conversion processes and report generation.
  • Conducted training sessions for end-users on reports, Fast formulas and HCM extracts technical build.
  • Proficient in stakeholder management, adept at fostering productive relationships, addressing concerns, and ensuring alignment to achieve project objectives.
  • Maintained consistent communication and engagement with clients throughout the project lifecycle, assisting from inception to completion by developing timelines and providing actionable insights.
  • Worked closely with functional leads, extraction teams, and business teams to resolve conversion defects and complete conversions in a timely manner.
  • Collaborated with other internal project teams to leverage objects already built for similar requirements, reducing the effort required for the build.
  • Part of top 1% performers at TCS for 2 consecutive years.
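The automated HDL creation bullet above can be sketched as a small generator. The object name ("Worker") and attribute list are illustrative assumptions; a real HCM Data Loader .dat file uses the attributes defined by the target Oracle HCM Cloud business object.

```python
# Hedged sketch of generating HCM Data Loader (HDL) file lines from extracted
# records. Attribute names and the "Worker" object are assumptions, not the
# real conversion spec; HDL files use pipe-delimited METADATA/MERGE lines.
def build_hdl_lines(object_name, attributes, records):
    """Build HDL METADATA/MERGE lines for one business object.

    records is a list of dicts keyed by attribute name; missing
    attributes are emitted as empty fields.
    """
    lines = ["METADATA|%s|%s" % (object_name, "|".join(attributes))]
    for rec in records:
        values = "|".join(str(rec.get(attr, "")) for attr in attributes)
        lines.append("MERGE|%s|%s" % (object_name, values))
    return lines
```

In practice the records would come from the legacy-system extract queries, and the resulting lines would be written to a .dat file and zipped for upload to HCM Cloud.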

Education

Master of Science - Information Technology Management

Concordia University - St. Paul
Saint Paul, MN
12.2024

Bachelor of Engineering - CSE

Vignan's Foundation For Science, Technology & Research Deemed University
01.2021

Skills

  • Operating Systems - Linux, Unix, Windows
  • Databases - Oracle, DB2, SQL Server, MySQL, Teradata, Mongo DB, NoSQL
  • ETL - Informatica 9.6/9.1 and Tableau
  • Programming Languages - SQL, PL/SQL, Python, Pyspark, Unix
  • Data Modeling Tools - Snowflake modelling, Erwin Data Modeler, ER Studio v17, Star Schema, Snowflake Schema
  • Reporting Tools - Informatica PowerAnalyzer, Informatica IICS, ADF, Azure Databricks, Redshift, Tableau, SQL, PL/SQL, Python
  • Data Warehousing - Informatica IDMC, MDM, Informatica Cloud Data Integration (CDI), PowerCenter 10.2/9.6/9.5/9.1/8.6, ADF, Databricks, Informatica PowerExchange 10.x/9.x/8.x/7.x, Repository Manager, Workflow Manager and Workflow Monitor, Informatica IICS, DataStage
  • Cloud Platform - AWS, Azure, GCP
  • Scripting - UNIX Shell Scripting, Korn Shell Scripting, Python
  • Methodologies - System Development Life Cycle (SDLC), Agile, RAD
  • Visualization - Tableau, Power BI, Data studio

Certification

  • Oracle Cloud 2023 Certified Implementation Professional, Oracle
  • Beginner to Advance PL/SQL, Udemy
  • Python for Everybody and Python Data Structures, Coursera
  • MS Excel, MS Word, Udemy
