Around 15 years of ETL/ELT experience in the analysis, design, development, implementation, migration, and troubleshooting of Data Warehousing solutions and methodologies.
Experience in ETL and ELT of data from disparate sources into Data Warehouses and Data Marts using Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter. Proficient across various Amazon Web Services (AWS) such as EC2, S3, IAM, and Lambda.
Extensively worked on data ingestion tools such as Fivetran and Qlik to ingest raw data into the Data Lake, and on Continuous Integration and Continuous Delivery (CI/CD) pipelines using Jira and GitHub.
Good knowledge of data warehousing concepts such as star schema and snowflake schema.
In-depth understanding of Snowflake multi-cluster virtual warehouses and warehouse sizing; optimized long-running queries to save Snowflake credits and storage.
Good knowledge of developing streams, tasks, Snowpipes, views, and stored procedures in Snowflake, as well as data classification of PII and sensitive data (an illustrative sketch follows this summary).
Good knowledge of building and maintaining Data classification, Data profiling, Data cleansing, Data Security, and Data governance solutions. Involved in performance tuning of Informatica mappings and the pushdown optimization technique.
Good knowledge of UNIX commands and shell scripting to pull data from UNIX, FTP, and SFTP servers.
Experience in preparing evaluation and decision registers based on pros and cons by conducting PoCs on the best-fit tools and technologies available in the market.
Excellent analytical, problem-solving, and written and verbal communication skills, with the ability to interact with individuals at all levels.
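A minimal sketch of the Snowflake stream-and-task pattern referenced above, assuming the snowflake-connector-python package and purely illustrative object names (RAW_ORDERS, ORDERS_STREAM, ORDERS_TASK, ETL_WH); connection details are placeholders, not actual project code.

import snowflake.connector  # assumes snowflake-connector-python is installed

# Connect with placeholder credentials; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="EDW", schema="STAGING",
)
cur = conn.cursor()

# Capture changes on a raw table via a stream (illustrative object names).
cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task that periodically loads new stream rows into a curated table
# (column list is illustrative).
cur.execute("""
    CREATE TASK IF NOT EXISTS ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO EDW.CORE.ORDERS (ORDER_ID, AMOUNT, UPDATED_AT)
      SELECT ORDER_ID, AMOUNT, UPDATED_AT
      FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK ORDERS_TASK RESUME")  # tasks are created suspended

cur.close()
conn.close()

In practice a task like this would also handle updates and deletes (for example via METADATA$ISUPDATE and a MERGE), but the stream-plus-task pairing shown is the core pattern.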
Overview
15 years of professional experience
1 Certification
Work History
Senior Data Engineer
IBM Corp
08.2015 - Current
Project 1: Client: WORLD KINECT CORPORATION
Duration: AUG 2015 - Current
Role: SENIOR DATA ENGINEER
Project Description:
World Kinect Corporation (WKC), formerly known as World Fuel Services (WFS) Corporation, is an energy, commodities, and services company focused on the marketing, trading, and financing of aviation, marine, corporate, and ground transportation energy commodities.
As a Senior Data Engineer, I am responsible for the analysis, design, development, testing, and implementation of enterprise data warehousing solutions, covering both new implementations and the migration of legacy systems.
Roles & Responsibilities:
Integrating data shared across legacy systems; developing and maintaining advanced database systems including the EDW, Data Marts, and highly normalized models needed for business and operational analysis and/or reporting.
Gathering requirements from business stakeholders and liaising with the client IT team to prioritize new features, enhancement requests, and production issues, and to initiate remediation.
Building ETL pipelines using Snowflake, Informatica Intelligent Cloud Services, Qlik, Fivetran and Python to load data from various sources into enterprise data warehouse built on Snowflake cloud data platform.
Also migrated code from Informatica PowerCenter to IICS.
Design and implement data transformations using DBT models, ensuring data integrity and consistency.
Build and manage Data Catalog tools like Atlan and Informatica EDC, which aid business users in Data Discovery, Data Stewardship, Data Lineage, and metadata management.
Implementing new ideas and features to enrich metadata focused on cloud technologies and business applications.
Apply expertise in managing AWS compute services such as Elastic Compute Cloud (EC2) through the AWS Command Line Interface (AWS CLI), storage services such as Simple Storage Service (Amazon S3), and event-driven services such as Lambda (a brief sketch follows this list).
Eliminating bottlenecks through performance tuning, providing optimized solutions, and reducing the overall cost of cloud resources.
Streaming raw data into the Data Lake through data ingestion tools such as Fivetran and Qlik, and managing these tools.
Involved in designing and developing ETL processes that meet program specifications as well as client requirements using Informatica PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
Sprint planning, creating stories in Jira with estimates, and mentoring the offshore team by providing guidance and sharing technical expertise to ensure closure of all tickets committed in the sprint, strictly following Agile methodologies.
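As an illustration of the AWS pieces mentioned above, here is a minimal, hedged sketch using boto3; the bucket name, Lambda function name, and file paths are hypothetical placeholders, not the project's actual resources.

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Land an extracted file in the raw zone of the data lake (bucket and key are placeholders).
s3.upload_file(
    Filename="orders_20240101.csv",
    Bucket="example-raw-zone",
    Key="orders/2024/01/01/orders.csv",
)

# Asynchronously trigger a hypothetical downstream load function once the file has landed.
lambda_client.invoke(
    FunctionName="example-load-orders",
    InvocationType="Event",  # fire-and-forget so the upload step does not block on the load
    Payload=b'{"key": "orders/2024/01/01/orders.csv"}',
)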
As a Data Engineer, I was responsible for the analysis, design, development, testing, and implementation of data warehousing solutions, covering new implementations and the migration of legacy systems, for the IBM UK client Royal Bank of Scotland (RBS).
Roles & Responsibilities:
Requirements gathering, providing estimations, requirement reviews, detailed design, development, code reviews, and unit testing.
Interacting with business partners to understand the business.
Understanding business requirements and providing technical resolutions.
Prioritizing work queue items, assigning them to team members and tracking them to closure to adhere to scheduled timelines.
Developing ETL jobs using Informatica PowerCenter tools.
Involved in writing UNIX shell scripts in KSH (an illustrative sketch of a comparable file pull follows this list).
Performed tuning by identifying and eliminating bottlenecks.
Preparing release documents and migrating code from one environment to another.
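The KSH scripts themselves are not reproduced here; below is an illustrative Python analogue of the same file-pull pattern, assuming the paramiko package and placeholder host, credentials, and directory names.

import paramiko  # assumes the paramiko package is available

# Connect to a placeholder SFTP host; credentials would normally come from a secure store.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="etl_user", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)

# Pull each daily extract from the remote landing directory into the local staging area.
for name in sftp.listdir("/outbound/daily"):
    if name.endswith(".csv"):
        sftp.get(f"/outbound/daily/{name}", f"/data/staging/{name}")

sftp.close()
transport.close()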
As a Technology Analyst, I was responsible for the analysis, development, testing, and implementation of data warehousing solutions, covering new implementations and the migration of legacy systems, for the Infosys US client Charles Schwab & Co.