RAJA RAM CH

Westlake, TX

Summary

Accomplished Snowflake Architect & Senior ETL Developer with a proven track record at United Services Automobile Association, enhancing data processes and security. Expert in IBM InfoSphere DataStage and Azure Data Factory, with a knack for automating and refining data workflows. Demonstrated leadership in cross-functional teams, significantly improving project efficiency and data quality.

Overview

18 years of professional experience
1 Certification

Work History

Snowflake Architect & Senior ETL Lead Developer

Teachers Insurance and Annuity Association (TIAA)
06.2023 - Current
  • Collaborated with product owners, business units, and development teams to identify data gaps and refine requirements for federal filings (TIC Schedule 3, Form PF)
  • Worked closely with Fund Managers to understand their data needs and ensure data accuracy and completeness
  • Built, maintained, and analyzed data from multiple sources in a centralized repository (IDW), providing a comprehensive view of the business across departments
  • Automated manual filing processes by developing scripts and workflows using CSS and Signal to generate automated extracts
  • Designed and implemented ETL workflows on Azure Data Factory to extract, transform, and load data into Snowflake, ensuring data quality and consistency
  • Created data connections (linked services) to diverse data sources such as databases, cloud storage, and APIs
  • Defined the sequence of data pipeline activities, scheduled data flows, and managed dependencies between stages of the process
  • Extensively used DataStage designer and components for data processing
  • Documented technical design and source-to-target mapping to translate the business logic
  • Created jobs in DataStage to extract from heterogeneous data sources like Oracle, SQL Server, and flat files
  • Implemented robust data access controls in Snowflake to restrict unauthorized access to sensitive data (see the sketch after this list)
  • Created CI/CD pipelines to automate Azure data ingestion processes, improving efficiency and reducing manual effort
  • Built Autosys jobs to schedule data flows from various sources
  • Created ini files to automate the jobs and configured requirement-specific calendars for scheduling
  • Provided L2/L3 support
  • Environment: IBM DataStage 11.7, Snowflake, SnowSQL, AWS, MSSQL, Python, Admaster, Oracle, HTML, JIRA, Kanban, Azure, MS Visio, Tableau, Autosys, Snowflake SCIM
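
The access-control bullet above refers to role-based grants in Snowflake. Below is a minimal sketch of that general pattern using the snowflake-connector-python driver; the account, database, schema, role, and user names are hypothetical placeholders, not the actual TIAA configuration.

```python
# Sketch: role-based access control in Snowflake via snowflake-connector-python.
# All object names below are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_admin",       # placeholder administrative user
    password="***",         # in practice, pull credentials from a secrets manager
    role="SECURITYADMIN",
)

grants = [
    "CREATE ROLE IF NOT EXISTS REPORTING_RO",
    # Let the role see the database and schema...
    "GRANT USAGE ON DATABASE FILINGS_DB TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA FILINGS_DB.CURATED TO ROLE REPORTING_RO",
    # ...but only read, never modify, the curated tables.
    "GRANT SELECT ON ALL TABLES IN SCHEMA FILINGS_DB.CURATED TO ROLE REPORTING_RO",
    # Assign the read-only role to a reporting service account.
    "GRANT ROLE REPORTING_RO TO USER FUND_REPORTING_SVC",
]

cur = conn.cursor()
try:
    for stmt in grants:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```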

Senior ETL Snowflake Developer

United Services Automobile Association
San Antonio, TX
09.2021 - 06.2023
  • Worked with product owners, cross-functional business units, and development teams on requirements and data gaps in the existing model
  • Extensively used DataStage designer to develop the parallel jobs and server jobs
  • Designed several DataStage jobs using join, merge, lookup, change apply, change capture, funnel, filter, column generator, transformer, modify, surrogate key, aggregator, row generator, and XML stages
  • Implemented data access controls in Snowflake to restrict unauthorized access to sensitive data
  • Worked in cross-functional Agile teams to ensure smooth coordination and knowledge sharing
  • Managed and maintained project workflows in Jira to support the ETL development process
  • Designed and developed SQL scripts in Snowflake to extract data from various sources
  • Extensively used DataStage designer and components for data processing
  • Documented technical design and source-to-target mapping to translate the business logic
  • Created jobs in DataStage to extract from heterogeneous data sources like Oracle, SQL Server, and flat files
  • Created and assigned tasks to team members using Jira to facilitate project planning and task distribution
  • Built SQL scripts to handle data updates and upsert operations within Snowflake
  • Developed ETL processes to extract and load data from various sources into the data warehouse
  • Collaborated with data engineers and architects to design data models and schemas for the data warehouse
  • Collaborated with security teams to conduct regular security audits and vulnerability assessments
  • Worked with the Hierarchical stage to connect to APIs
  • Created a new physical data model for the HRAE core module
  • Created external stages and loaded data from AWS S3 into Snowflake Layer 0
  • Worked on DBT models to transform data and load into Snowflake tables in Layer 1 and 2
  • Worked with data models using dbt, including staging, intermediate, and final analytics tables
  • Experienced with dbt source, staging, intermediate, and load models used to populate data marts
  • Wrote complex SQL queries and worked on performance tuning
  • Hands-on experience developing Workday integrations to build the Talent Acquisition data mart
  • Worked closely with Workday to develop integrations and API calls, and consumed the data for the HR data mart through dbt and Snowflake
  • Conducted data migration projects from on-premises databases to Snowflake on Azure
  • Created Snowpipe pipes to automatically load incoming vendor files into Snowflake (see the sketch after this list)
  • Worked on creating pipelines to deploy objects into Snowflake
  • Designed jobs to extract and cleanse data, parameterized them for portability and runtime flexibility, applied business rules and logic at the transformation stage, and loaded data into the data warehouse
  • Conducted weekly design review sessions and reviewed code with application and platform teams throughout the lifecycle
  • Used Python to connect to Snowflake, import/export data, and perform API pulls
  • Involved in the daily scrum and sprint review sessions
  • Attended Program Increment meetings, created stories from features, and added subtasks in Jira Cloud
  • Environment: IBM DataStage 11.7, Snowflake, SnowSQL, AWS, Netezza, MSSQL, Unix, Git, ServiceNow, UrbanCode Deploy, Azure, Athena, Amazon S3, dbt (Data Build Tool), Python, Jira Cloud
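
The S3-to-Layer-0 and Snowpipe bullets above typically combine an external stage over the bucket with an auto-ingesting pipe. The sketch below shows that general pattern; the storage integration, stage, pipe, table, and bucket names are illustrative assumptions, not the project's actual objects.

```python
# Sketch: S3 -> Snowflake Layer 0 ingestion via an external stage plus Snowpipe.
# All object names are placeholders.
import snowflake.connector

DDL = [
    """
    CREATE STAGE IF NOT EXISTS L0.VENDOR_STAGE
      URL = 's3://vendor-inbound-bucket/files/'
      STORAGE_INTEGRATION = VENDOR_S3_INT   -- assumed pre-created integration
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    """,
    """
    CREATE PIPE IF NOT EXISTS L0.VENDOR_PIPE
      AUTO_INGEST = TRUE                    -- S3 event notifications trigger loads
    AS
      COPY INTO L0.VENDOR_RAW
      FROM @L0.VENDOR_STAGE
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_svc", password="***",
    warehouse="LOAD_WH", database="EDW", role="SYSADMIN",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```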

Senior ETL Snowflake Developer

United Services Automobile Association
San Antonio, TX
01.2020 - 08.2021
  • The Consumable Warehouse is intended to be a multipurpose data environment that is intuitive, easy to learn, and holds Auto Policy and Claims data
  • Designed and implemented the ETL process using Informatica Power Center
  • Designed and developed ETL processes using Python and Snowflake to extract, transform, and load data from various sources (see the sketch after this list)
  • Configured and managed Azure Data Factory triggers and monitored ETL jobs for successful execution
  • Implemented data partitioning and indexing strategies for optimized query performance in the data warehouse
  • Regularly reviewed Snowflake security logs to identify and mitigate potential security threats
  • Wrote complex SQL queries to perform data transformations and aggregations in the data warehouse
  • Encouraged open communication and transparency within the Agile team
  • Customized Jira boards and filters to visualize and prioritize ETL development tasks effectively
  • Utilized Jira dashboards and reports to provide project status updates to stakeholders
  • Created complex SQL queries to perform data transformations and enrichments in Snowflake
  • Optimized Python code and SQL queries for performance and scalability in Snowflake
  • Extracted OLTP data from different source systems like Oracle, Netezza, and flat files
  • Designed and developed Mappings and Mapplets to load data from source to target database using Informatica Power Center, and tuned mappings to improve performance
  • Implemented data archiving and purging strategies in Azure and Snowflake to manage storage costs effectively
  • Extensively used transformations like Router, Aggregator, Lookup, Source qualifier, Joiner, Expression, and Sequence generator in extracting data in compliance with the business logic developed
  • Created Sessions and Batches using Informatica Workflow
  • Environment: Informatica 10.1, Snowflake, Hive, Unix, Netezza, Tableau 2018.2, IBM RTC client, Jira, GitLab, UCD
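
The Python-and-Snowflake ETL bullets above commonly rely on a MERGE-based incremental load: stage the day's extract, then upsert it into the target. The following is a minimal sketch of that pattern; the staging and target tables, columns, and connection parameters are hypothetical, not the actual warehouse model.

```python
# Sketch: incremental upsert into Snowflake driven from Python.
# Table and column names are placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO CW.AUTO_POLICY tgt
USING CW.AUTO_POLICY_STG src
  ON tgt.POLICY_ID = src.POLICY_ID
WHEN MATCHED AND src.UPDATED_TS > tgt.UPDATED_TS THEN UPDATE SET
  STATUS     = src.STATUS,
  PREMIUM    = src.PREMIUM,
  UPDATED_TS = src.UPDATED_TS
WHEN NOT MATCHED THEN INSERT (POLICY_ID, STATUS, PREMIUM, UPDATED_TS)
  VALUES (src.POLICY_ID, src.STATUS, src.PREMIUM, src.UPDATED_TS)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_svc", password="***",
    warehouse="ETL_WH", database="EDW", schema="CW",
)
try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)              # update changed policies, insert new ones
    print("rows affected:", cur.rowcount)
finally:
    conn.close()
```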

Senior ETL Snowflake Developer

United Services Automobile Association
San Antonio, TX
06.2018 - 12.2019
  • HRAE is the analytical application for all human resource analytics at USAA
  • The following are core components of HRAE
  • Integrated Workforce Analytics, which handles organizational employee details such as department, job, pay, and personal data on a daily basis
  • Voice of Employee & Employee Job Survey, the current employee feedback system, and Talent Recruiting and Succession Planning, which captures USAA recruitment events daily along with their aggregated data
  • Worked on project planning, roadmap presentations, and capacity planning
  • Worked with the product owner and scrum master on defining KPIs, epics, and stories
  • Documented technical design and source-to-target mapping to translate the business logic
  • Created jobs in DataStage to extract from heterogeneous data sources like Oracle, SQL Server, and flat files
  • Built and managed automated data validation processes in Azure to ensure data integrity before loading it into Snowflake (see the sketch after this list)
  • Wrote efficient SQL scripts to load data into Snowflake tables from staging areas
  • Built and maintained data pipelines in Python to ensure the smooth flow of data from source systems to Snowflake
  • Implemented secure coding practices to prevent common security vulnerabilities in ETL workflows
  • Integrated Jira with other tools and platforms to streamline the ETL development workflow
  • Managed and resolved bugs and issues reported in Jira by conducting root cause analysis
  • Implemented data security and access controls to protect sensitive information within the data warehouse
  • Implemented transaction control and error handling in SQL scripts for data integrity
  • Developed error-handling mechanisms and logging frameworks to ensure the reliability of ETL processes in the data warehouse
  • Designed jobs to extract and cleanse data, parameterized them for portability and runtime flexibility, applied business rules and logic at the transformation stage, and loaded data into the data warehouse
  • Created a GIT repository and maintained versioning of the code using GIT
  • Stayed up-to-date with the latest Azure services and features to enhance ETL processes and data integration capabilities
  • Worked as the Level 3 SME for the HRAE application, validating daily and monthly data loads and reports
  • Worked on Workday HCM Reports & Integration for Talent Acquisition and Benefits
  • Worked on building change release packages and promoted them to higher environments
  • Environment: DataStage, Python 2.7/3.5, Git 2.7, Netezza 7.2.1, IBM UCD, SAP BO, ServiceNow
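
The pre-load validation bullet above can be illustrated with a simple Python gate that rejects a feed file before it reaches the load job. This is a generic sketch of the idea, not the Azure implementation; the file name and required columns are illustrative assumptions.

```python
# Sketch: validate a staged feed file before handing it to the load job.
import csv

# Required columns for the (hypothetical) workforce feed file.
REQUIRED_COLUMNS = {"EMPLOYEE_ID", "DEPARTMENT", "EFFECTIVE_DATE"}

def validate_feed(path):
    """Return a list of validation errors; an empty list means the file may be loaded."""
    errors = []
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return ["missing columns: " + ", ".join(sorted(missing))]
        rows = 0
        for line_no, row in enumerate(reader, start=2):  # header is line 1
            rows += 1
            if not (row["EMPLOYEE_ID"] or "").strip():
                errors.append(f"line {line_no}: empty EMPLOYEE_ID")
        if rows == 0:
            errors.append("file contains no data rows")
    return errors

if __name__ == "__main__":
    problems = validate_feed("hrae_workforce_feed.csv")
    if problems:
        raise SystemExit("validation failed: " + "; ".join(problems[:5]))
    # Only a clean file is handed off to the downstream Snowflake load job.
```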

Sr ETL Developer

United Services Automobile Association
San Antonio, TX
04.2015 - 10.2017
  • The Voice of Employee program centers on listening to employees and acting on their feedback to improve best-in-class service, with dedicated listening posts ranging from the employee job satisfaction tool to Pride, UCount, and internal social media channels
  • Designed and implemented the ETL process using Informatica Power Center
  • Extracted OLTP data from different source systems like Oracle, SQL Server, and flat files
  • Designed and developed Mappings and Mapplets to load data from source to target database using Informatica Power Center, and tuned mappings to improve performance
  • Advocated for the use of Agile metrics to track team performance and project progress
  • Created and maintained documentation for project processes and guidelines within Jira
  • Conducted regular Jira data backups and ensured data integrity and continuity
  • Conducted security assessments of third-party applications integrated with Snowflake
  • Contributed to Agile team capacity planning and resource allocation discussions
  • Designed and optimized SQL scripts for incremental data loading to minimize processing time
  • Created and managed scheduled jobs and workflows using Python libraries (see the sketch after this list)
  • Involved in performance tuning of the ETL process and performed the data Warehouse testing
  • Extensively used transformations like Router, Aggregator, Lookup, Source qualifier, Joiner, Expression, and Sequence generator in extracting data in compliance with the business logic developed
  • Created Sessions and Batches using Informatica Workflow
  • Environment: Informatica 9.1, Hadoop 2.6.5, Hive, Python 3.5, Unix, Netezza, SAP BO 4.2, Tableau 2018.2, IBM RTC client, Service Manager, Windows 7
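
One way to schedule Python jobs and workflows, as mentioned above, is the third-party `schedule` library. The sketch below assumes that library and a placeholder refresh function; it is not the actual production scheduler setup.

```python
# Sketch: lightweight in-process job scheduling with the `schedule` library.
import time
import schedule  # third-party: pip install schedule

def refresh_voe_extract():
    # Placeholder for the nightly refresh logic (e.g., kicking off an
    # Informatica workflow or re-running an extract); here it only logs.
    print("refreshing Voice of Employee extract...")

# Nightly refresh at 02:00 plus an hourly heartbeat for monitoring.
schedule.every().day.at("02:00").do(refresh_voe_extract)
schedule.every().hour.do(lambda: print("scheduler heartbeat"))

if __name__ == "__main__":
    while True:
        schedule.run_pending()
        time.sleep(30)
```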

Sr ETL Developer

United Services Automobile Association
San Antonio, TX
10.2013 - 04.2015
  • Company Overview: Oracle IAM is a suite of products that integrate to meet the security, compliance, and efficiency requirements of USAA
  • Increased operational efficiency by streamlining the processes for user access requests, user transfers, and user terminations
  • Provided an improved security framework for applications
  • Reduced access control risks
  • Compliance: reported on who has access to what
  • Regularly attested that people have the right access for their jobs
  • Provided post-implementation support
  • Used IAM tools like OIA & OIM to load the feeds with user access information
  • Developed and scheduled jobs using Control-M
  • Environment: DataStage v8.5, Oracle 10g, Oracle IAM Suite, UNIX

ETL Developer

Lloyds Banking Group
UK
01.2010 - 09.2013
  • CRR delivers a single platform providing regulatory reporting for credit and large exposure risk to Group Finance
  • CRR is a strategic solution supporting the internal and external reporting requirements of the combined Wholesale and WI divisions
  • This is achieved using Moody's regulatory reporting engine, Ray
  • Ray produces regulatory capital calculations and reporting for credit risk

ETL Developer

Standard Chartered Bank
India
12.2008 - 12.2009
  • The objective of the project was to develop PD rating models (both application and behavioral) for the Small Business (SB), Medium Enterprise (ME), and Micro segments in five countries: Singapore, Hong Kong, Malaysia, China, and India

ETL Developer

Standard Chartered Bank
India
01.2007 - 11.2008
  • The Business & Client Performance Reporting (“B&CPR”) Project is a strategic initiative approved by the Group and Wholesale Bank Investment Committees
  • The initiative aims to rebuild the Wholesale Bank’s Performance Management capability by replacing all silo-based MIS applications with a single instance Data Warehouse and Related Applications / Engines

Education

Master of Technology - Computers & Communications

Bharath University
India
01.2005

Bachelor of Technology - Electronics & Communications

JNTU
India
01.2003

Skills

  • ETL/ELT Tools: IBM InfoSphere DataStage, dbt, Informatica PowerCenter, IICS, Azure Data Factory, Matillion, Ab Initio, SSIS
  • Databases & Data Warehouses: Snowflake, Netezza, Oracle, SQL Server
  • Cloud Platforms: AWS, Azure, Informatica Cloud, AWS Lake Formation
  • Big Data Technologies: Hadoop, HDFS, Hive, Sqoop
  • Scripting and Automation: Python, SQL, Shell Scripting
  • CI/CD & Scheduling: GitLab CI/CD, Urban Code Deploy, Control-M, Autosys, SCM, SVN
  • Reporting Tools: SAP Business Objects, Azure Synapse Analytics, Tableau
  • ERP: Workday HCM
  • Domain Experience: HR, Banking Core, Healthcare, Insurance

Certification

  • Microsoft Certified: Azure Data Fundamentals
  • IBM Certified Solution Developer – DataStage
  • Snowflake Certified by Edureka
  • Matillion ETL Foundations

Timeline

Snowflake Architect & Senior ETL Lead Developer

Teachers Insurance and Annuity Association (TIAA)
06.2023 - Current

Senior ETL Snowflake Developer

United Services Automobile Association
09.2021 - 06.2023

Senior ETL Snowflake Developer

United Services Automobile Association
01.2020 - 08.2021

Senior ETL Snowflake Developer

United Services Automobile Association
06.2018 - 12.2019

Sr ETL Developer

United Services Automobile Association
04.2015 - 10.2017

Sr ETL Developer

United Services Automobile Association
10.2013 - 04.2015

ETL Developer

Lloyds Banking Group
01.2010 - 09.2013

ETL Developer

Standard Chartered Bank
12.2008 - 12.2009

ETL Developer

Standard Chartered Bank
01.2007 - 11.2008

Master of Technology - Computers & Communications

Bharath University

Bachelor of Technology - Electronics & Communications

JNTU