Vijay C

Raleigh, NC

Summary

Talented Informatica Developer with strong requirements-gathering, mapping, and logic-design skills. Detail-oriented and thorough, with a good grasp of business processes and needs developed over 6 years in the ETL field.

Overview

7 years of professional experience

Work History

Sr Informatica Developer

Walgreens
11.2021 - Current
  • Worked with IT architects and program managers on requirements gathering, analysis, and project coordination
  • Developed Data Integration Platform components/processes using Informatica Cloud Platform, Azure SQL Data Warehouse, Azure Data Lake Store and Azure Blob Storage technologies
  • Analyzed existing ETL Data Warehouse process and ERP/NON-ERP Applications interfaces and created design specification based on new target Cloud Data Warehouse (Azure Synapse) and Data Lake Store
  • Created ETL and Data Warehouse standards documents - Naming Standards, ETL methodologies and strategies, Standard input file formats, data cleansing and preprocessing strategies
  • Created mapping documents with detailed source to target transformation logic, Source data column information and target data column information
  • Designed, Developed and Implemented ETL processes using IICS Data integration
  • Created IICS connections using various cloud connectors in IICS administrator
  • Installed and configured the Windows Secure Agent and registered it with the IICS org
  • Extensively used performance tuning techniques while loading data into Azure Synapse using IICS
  • Extensively used cloud transformations - Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, Union Transformations
  • Extensively used cloud connectors: Azure Synapse (SQL DW), Azure Data Lake Store V3, Azure Blob Storage, Oracle, Oracle CDC, and SQL Server
  • Developed Cloud integration parameterized mapping templates (DB, and table object parametrization) for Stage, Dimension (SCD Type1, SCD Type2, CDC and Incremental Load) and Fact load processes
  • Extensively used Parameters (Input and In-Out parameters), Expression Macros, and Source Partitioning
  • Extensively used the Push Down Optimization option to move processing into Azure Synapse (SQL DW) and take advantage of its parallel compute
  • Extracted data from Snowflake to push the data into Azure warehouse instance to support reporting requirements
  • Performed loads into Snowflake instance using Snowflake connector in IICS for a separate project to support data analytics and insight use case for Sales team
  • Created Python scripts to create on-demand Cloud Mapping Tasks using the Informatica REST API
  • Created Python scripts to start and stop cloud tasks using Informatica Cloud API calls (see the sketch after this list)
  • Developed a CDC load process for moving data from PeopleSoft to the SQL data warehouse using 'Informatica Cloud CDC for Oracle Platform'
  • Developed complex Informatica Cloud Task Flows (parallel) with multiple mapping tasks and task flows
  • Developed MASS Ingestion tasks to ingest large datasets from on-prem to Azure Data Lake Store - File ingestion
  • Designed Data Integration Audit framework in Azure SqlDw to track data loads, data platform workload management and produce automated reports for SOX compliance
  • Worked with a team of 4 onshore and 6 offshore developers and prioritized project tasks
  • Involved in Development, Unit Testing, SIT and UAT phases of project.
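
A minimal sketch of the start/stop scripts referenced above, assuming the public Informatica Cloud v2 REST API (login at /ma/api/v2/user/login, job control at /api/v2/job); the credentials and task ID shown are placeholders:

    # Sketch only: endpoint paths follow the Informatica Cloud v2 REST API;
    # username, password, and the task ID are placeholders.
    import requests

    LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

    def login(username, password):
        """Authenticate and return (serverUrl, icSessionId) for later calls."""
        resp = requests.post(
            LOGIN_URL,
            json={"@type": "login", "username": username, "password": password},
            headers={"Accept": "application/json"},
        )
        resp.raise_for_status()
        body = resp.json()
        return body["serverUrl"], body["icSessionId"]

    def control_task(server_url, session_id, task_id, action="start"):
        """Start ("start") or stop ("stop") a mapping task (taskType MTT)."""
        url = f"{server_url}/api/v2/job" + ("/stop" if action == "stop" else "")
        resp = requests.post(
            url,
            json={"@type": "job", "taskId": task_id, "taskType": "MTT"},
            headers={"icSessionId": session_id, "Accept": "application/json"},
        )
        resp.raise_for_status()
        return resp.json() if action == "start" else None

    if __name__ == "__main__":
        server_url, session_id = login("user@example.com", "secret")
        print(control_task(server_url, session_id, "0012345B000000000002"))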

ETL Informatica & IDQ Developer

State of Indiana
12.2020 - 10.2021
  • Designed ETL-Data Architecture for Personal Health Record (PHR) Datamart
  • Prepared ETL mapping documents for all stage, dimension, and fact tables from the source tables
  • Designed, developed, and implemented data quality solutions using Informatica PowerCenter and IDQ
  • Mentored team members and developed Informatica flows for various dimension and fact tables in the project
  • Designed and developed the incremental/delta logic for all data loads by using the Run/load status table
  • Developed and automated Aetna IMS jobs that deliver 100+ incentive files per day to various clients/vendors using Linux shell scripts and Informatica workflows
  • Optimized and tuned mappings for better performance and efficiency using Informatica functionality
  • Studied the existing environment and gathered requirements by consulting clients on various aspects
  • Took complete responsibility for Informatica installation on Linux, including upgrades from 8.1 to 8.6, 8.6 to 10.1, and 10.1 to 10.5, and handled Informatica administration after each installation
  • Developed a Perl script to convert CSV files to XLS, since Informatica on Linux cannot produce XLS output and clients required XLS rather than TXT or CSV (a Python equivalent is sketched after this list)
  • Developed a Linux shell script to upload the incentive files to vendors/clients using the SFTP protocol
  • Worked with different kinds of source feed files, such as delimited, fixed-width, and XML files
  • Worked with offshore team members on flow development and production support activities
  • Supported post-production activities for all applications and implemented ongoing modifications and enhancements
  • Involved in migrating the client data warehouse architecture from on-premises into Azure cloud
  • Analyzed Tableau dashboard business requirements for various systems
  • Implemented Azure Data Factory operations and deployments for moving data from on-premises into the cloud
  • Worked with Workflow Monitor displays and all Integration Services associated with the repository
  • Worked on SFTP connections to use secure protocols, such as SSH, to access source and target files
  • Implemented Informatica Intelligent Cloud Services data integration tasks with FTP/SFTP target connections; each task creates a target file based on the target defined in the task, and on completion IICS writes the file to the remote directory, overwriting existing files
  • Implemented data quality processes using Informatica Data Quality (IDQ) to cleanse, standardize, and enrich data.
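
The CSV-to-Excel conversion above was a Perl script; the sketch below shows an equivalent step in Python with openpyxl (writing .xlsx rather than the legacy .xls format). File names are placeholders:

    # Illustrative Python equivalent of the Perl CSV-to-Excel conversion;
    # paths are placeholders, and .xlsx is used instead of legacy .xls.
    import csv
    from openpyxl import Workbook

    def csv_to_xlsx(csv_path, xlsx_path, delimiter=","):
        """Copy a delimited text file into a single-sheet Excel workbook."""
        wb = Workbook()
        ws = wb.active
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.reader(f, delimiter=delimiter):
                ws.append(row)
        wb.save(xlsx_path)

    if __name__ == "__main__":
        csv_to_xlsx("incentives.csv", "incentives.xlsx")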

Sr. ETL Consultant

Clarios
12.2019 - 10.2020
  • Analyzed functional requirements
  • Created design specification and technical specifications based on functional requirements
  • Extensively worked on developing Informatica Mapplets, Mappings, Sessions, Worklets and Workflows for data loading
  • Worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Sorter, Router, Sequence Generator, XML transformation etc
  • Created initial-load mappings for all source systems, cleansed the data, and loaded it into the staging area
  • Worked on the Web Service Consumer transformation to gather meteorological data
  • Extensively used ETL to load data from a wide range of sources, such as relational databases, XML, and flat files (fixed-width or delimited)
  • Wrote pre-/post-session SQL and SQL overrides as per requirements
  • Extensively worked on using the PDO (Push down optimization), CDC (Change data capture) mechanism
  • Extensively used the capabilities of Power Center such as File List, pmcmd, Target Load Order, Constraint Based Loading, Concurrent Lookup Caches etc
  • Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions
  • Identified, researched, and resolved root causes of ETL production issues and system problems
  • Worked with Pre-Session and Post-Session UNIX scripts for automation of ETL jobs using CONTROL-M schedulers and involved in migration deployment of ETL codes from development to production environment
  • Knowledge of star and snowflake schemas
  • Created parameter files to connect to the correct environment and database (see the sketch after this list), and was responsible for monitoring sessions
  • Debugged mappings of failed sessions
  • Handled production support for monitoring daily and monthly jobs as part of the implementation team.
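
A minimal sketch of the environment-specific parameter files mentioned above; the [folder.WF:workflow] section syntax is standard PowerCenter parameter-file format, while the folder, workflow, and connection names below are hypothetical:

    # Sketch only: generates a PowerCenter parameter file per environment.
    # Folder, workflow, and connection names are hypothetical placeholders.
    ENVIRONMENTS = {
        "dev":  {"$DBConnection_SRC": "ORA_DEV",  "$DBConnection_TGT": "DW_DEV"},
        "prod": {"$DBConnection_SRC": "ORA_PROD", "$DBConnection_TGT": "DW_PROD"},
    }

    def write_param_file(env, path, folder="ETL_FOLDER", workflow="wf_daily_load"):
        """Write a parameter file whose connections point at one environment."""
        with open(path, "w") as f:
            f.write(f"[{folder}.WF:{workflow}]\n")
            for name, value in ENVIRONMENTS[env].items():
                f.write(f"{name}={value}\n")

    if __name__ == "__main__":
        write_param_file("dev", "wf_daily_load_dev.parm")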

ETL Developer

Simon Property Group
02.2017 - 11.2019
  • Developed ETL (data transformations and data movement) using Talend Data Integration and SQL Server
  • Supported the production release during migration and the hypercare period
  • Created Java Routines, Reusable transformations, Joblets using Talend as an ETL Tool
  • Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loading
  • Developed CDC (Change Data Capture) jobs in Talend using routines to process inserts, updates, and deletes (the classification logic is sketched after this list)
  • Refined and performed data migrations with SQL Server and complex Excel macros
  • Created SQL stored procedures and functions for data cleansing, formatting, and the consolidation of business rules
  • Performed issue analysis and migration issue resolution through daily client calls, providing customer support throughout the migration
  • Migrated Cast Iron (DataStage) jobs to Talend
  • Prepared documentation for support resources on new functionality and material for analysis of client issues
  • Notable accomplishments: designed and implemented a product data vault to extract, transform, and load product data with a history of changes to product attributes
  • Created new SQL stored procedures to improve the accuracy and speed of existing forms
  • Extensively used database connections, file component, tMap, tJoin, tReplicate, tParallelize, tConvertType, tflowtoIterate, aggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavarow, tAggregateRow, tWarn, tLogCatcher
  • Involved in design, development of Talend mappings
  • Used the tSplitRow component to split rows and tUniqRow to remove duplicates
  • Experienced with SQL across multiple database platforms, including MS SQL Server and DB2
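
The CDC jobs above were built with Talend routines; the sketch below illustrates the underlying classification logic in Python, comparing source and target snapshots by business key to route rows as inserts, updates, or deletes. The sample data is illustrative only:

    # Minimal sketch of CDC classification: compare snapshots keyed by
    # business key and route each row as an insert, update, or delete.
    def classify_changes(source, target):
        """source/target: dicts mapping business key -> row attributes."""
        inserts = [source[k] for k in source.keys() - target.keys()]
        deletes = [target[k] for k in target.keys() - source.keys()]
        updates = [source[k] for k in source.keys() & target.keys()
                   if source[k] != target[k]]
        return inserts, updates, deletes

    if __name__ == "__main__":
        src = {1: ("A", 10), 2: ("B", 20), 4: ("D", 40)}
        tgt = {1: ("A", 10), 2: ("B", 25), 3: ("C", 30)}
        ins, upd, dele = classify_changes(src, tgt)
        print("inserts:", ins)   # [('D', 40)]
        print("updates:", upd)   # [('B', 20)]
        print("deletes:", dele)  # [('C', 30)]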

Education

Master of Science - Information Technology

Northwestern Polytechnic University
12.2016

Bachelor's - Electronics and Communication Engineering

JNTUH - Hyderabad
05.2014

Skills

  • Oracle 12c, 11gR2 (11.2.0.1.0)/10g/9i, MS SQL Server 2012/2008
  • SQL, PL/SQL, Korn & Bourne Shell Scripting

Additional Information

  • 8+ years of experience in the IT industry across data integration and data warehousing, using ETL tools including Informatica PowerCenter 10.2/9.6/9.1/8.6, Informatica PowerExchange 10.2/9.6, and Informatica Intelligent Cloud Services (IICS)
  • Experience integrating data to/from on-premises databases and cloud-based database solutions using Informatica Intelligent Cloud Services
  • Experience working with cloud-based database solutions including Azure Synapse, Azure Data Lake Store, AWS Redshift, and Snowflake
  • Experience working with traditional on-premises databases including Oracle, SQL Server, and Teradata
  • Expert in all phases of the software development life cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance
  • Worked with different non-relational sources such as flat files, XML files, and mainframe files
  • Worked extensively with data migration, data cleansing, and the extraction, transformation, and loading of data from multiple sources to the data warehouse
  • Instrumental in setting up standard ETL naming standards and best practices throughout the ETL process (transformations, sessions, maps, workflow names, log files, bad files, input, variable, and output ports)
  • Worked on Informatica performance tuning issues at the source, target, mapping, transformation, and session levels, and fine-tuned transformations for better session performance
  • Experience implementing complex business rules by creating reusable transformations, developing complex Mapplets and Mappings, and writing PL/SQL stored procedures and triggers
  • Experience creating ETL design documents; strong experience with complex PL/SQL packages, functions, cursors, indexes, views, and materialized views
  • Excellent communication, presentation, and project management skills; a good team player and self-starter with the ability to work independently and as part of a team
  • Extensive experience in UNIX shell scripting, AWK, and file manipulation techniques
  • Demonstrated ability to define project goals and objectives, prioritize tasks, develop project plans, and provide a framework for effective communication while maximizing responsiveness to change
  • Experience working on concurrent projects in demanding, high-pressure situations
  • Proficient in Informatica PowerCenter, Autosys job scheduling, SQL, and database technologies

Personal Information

Title: ETL Developer
