
Parthiban Venkatesan

Frisco, TX

Summary

14 years of IT experience in software development, analysis, ETL, requirements gathering, and data warehousing for clients in the Banking, Healthcare, and Retail domains. 7+ years of experience in the Banking domain with expertise in retail banking concepts; also worked for Healthcare and Retail clients. Experienced in attending business meetings, gathering requirements, and converting business requirements into design documents. Exposure to projects in both on-prem and cloud environments.

Overview

16 years of professional experience
1 certification

Work History

Senior Data Engineer

MGM Resorts
Plano, TX
09.2021 - Current

Tools and Technologies: Teradata, Azure Data Factory, Azure Databricks, Synapse, Oracle, SSIS, SQL Server, Confluence, GitHub, Python, Atomic Job Scheduler, ADLS

Job Responsibilities:

  • Understand the various business data and their corresponding databases; serve as an SME assisting business users in building dashboards
  • Create new mapping sheets for any additional stream of business data related to operations such as guest services, call centers, gaming, labor, and ticket reservations
  • Migrate existing code and logic from on-prem Teradata to Azure Databricks, converting Teradata SQL to Spark SQL
  • Develop data pipelines to ingest and process large datasets
  • Orchestrate jobs in ADF for new marts and for migrations from on-prem
  • Work with admins on resource and storage account setup
  • Perform data modeling activities for the migration of the marts
  • Design, build, and maintain high-performance databases for reporting and analysis
  • Write SQL queries in Teradata to create views that feed dashboards built on Power BI and Cognos
  • Write new and modify existing Python scripts that pull data from multiple sources and databases, convert it to CSV, and populate Teradata using the load utilities
  • Perform unit testing on Azure Databricks using Python and Spark SQL, including SCD1 and SCD2 validation
  • Create new tables and views in Teradata for various business streams, adhering to the data model
  • Analyze the existing data and data flows in the EDW, built on Teradata 16, that holds the client's enterprise data
  • Maintain the job schedule in Atomic, monitor daily jobs, and provide support during schedule maintenance
  • Collaborate with other teams to understand their requirements and deliver solutions accordingly
  • Perform gap analysis between the middleware application and the database sources built on DB2 and sourced from Oracle and Teradata
  • Analyze data outside the EDW residing in databases such as Teradata, Oracle (PL/SQL), and SQL Server
  • Use Confluence to upload documentation shared across the team
  • Maintain Confluence spaces for knowledge sharing and documentation
  • Maintain code under version control in GitHub; create branches and perform peer reviews
  • Write and modify ETL scripts in Databricks; perform scheduling and monitoring in ADLS
  • Modify existing and create new Python scripts for file handling
  • Attend business meetings to gather requirements, create a design strategy, convert functional requirements into technical designs, and write design documents
  • Lead the team through design and development, with support until warranty
  • Write complex Teradata SQL queries involving various join strategies to arrive at the design; create Unix shell scripts to handle CSV, XML, and other file formats
  • Perform data analysis before creating the design and cleanse incoming files
  • Create source-to-target mappings before coding by studying the physical and logical data models alongside the data modeler
  • Write Teradata components using the BTEQ, TPT, MLOAD, and FLOAD utilities for the migration activity, based on the design
  • Lead the analysis for the Teradata-to-cloud migration

Senior ETL Analyst

Compunnel Software Group Inc
Plano, TX
08.2019 - 08.2021

Tools and Technologies: Teradata, Oracle, SSIS, Informatica, Unix, SQL Server, Workato, Confluence, GitHub, Python

Job Responsibilities:

  • Analyzed the existing data and data flows in the EDW, built on Teradata 16, that held the client's enterprise data
  • Performed gap analysis between the middleware application and the database sources built on DB2 and sourced from Oracle and Teradata
  • Analyzed data outside the EDW residing in databases such as Teradata, Oracle (PL/SQL), and SQL Server, with code in Mainframes and UNIX, and provided ETL solutions using Oracle, Teradata, DataStage, and Informatica PowerCenter 10.1 to bring it into the EDW
  • Used Confluence to upload documentation shared across the team
  • Maintained Confluence spaces for knowledge sharing and documentation
  • Maintained code under version control in GitHub; created branches and performed peer reviews
  • Modified existing and created new Python scripts for file handling
  • Executed SQL in SQL Server using SAS
  • Used the business intelligence tool Tableau for retail data reporting
  • Attended business meetings to gather requirements, created design strategies, converted functional requirements into technical designs, and wrote design documents
  • Led the team through design and development, with support until warranty
  • Wrote stored procedures and triggers using T-SQL in SQL Server and PL/SQL in Oracle; created packages to perform ETL
  • Created error-handling workflows and mappings to generate email alerts
  • Wrote complex Teradata SQL queries involving various join strategies to arrive at the design; created Unix shell scripts to handle CSV, XML, and other file formats
  • Performed data analysis before creating designs and cleansed incoming files
  • Created source-to-target mappings before coding by studying the physical and logical data models alongside the data modeler
  • Wrote Oracle scripts using SQL*Plus and SQL*Loader
  • Wrote Teradata components using the BTEQ, TPT, MLOAD, and FLOAD utilities for the migration activity, based on the design
  • Used Teradata TASM for workload management and performance monitoring to improve efficiency
  • Performed data and object migration between systems using Teradata Data Mover
  • Wrote Teradata stored procedures and triggers to perform CDC on each table
  • Created Informatica mappings and workflows using various transformations for the ETL aspects of the data migration
  • Updated the Erwin data model to incorporate changes and new tables
  • Performance-tuned Informatica mappings and workflows in PowerCenter, utilizing pushdown optimization
  • Analyzed data matches and tables across various databases to modify the source
  • Performed detailed testing on the developed queries and tuned underperforming ones
  • Defined an approach for collecting statistics on newly created and modified tables
  • Calculated the space (in GB) needed for incoming data and raised requests to the DBA
  • Implemented performance tuning and wrote advanced SQL and PL/SQL queries to identify the physical columns within the EDW equivalent to the corresponding business terms
  • Created workflows and jobs in Workato to access web data via REST APIs and populate databases
  • Used SQL Server Management Studio to create tables in SQL Server, populated data from APIs, and created ETL jobs using SSIS
  • Wrote and interpreted T-SQL for performing data modifications and reporting on the tables

Senior ETL Programmer Analyst Teradata

Systems Technology Group, Inc, Ford Motor Credit Company
Dearborn, MI
10.2017 - 08.2019

Tools: Teradata, Mainframes, Datastage, Autosys

Responsibilities:

  • Worked on data integration and migration projects across different Teradata databases, utilizing its ETL capabilities
  • Validated data compatibility between systems across various databases
  • Created designs in Teradata and ETL to migrate new data into the system
  • Wrote stored procedures and triggers using T-SQL in SQL Server, utilized load utilities, and created packages to perform ETL
  • Worked in the Enterprise Data Warehouse (EDW); analyzed jobs built in Mainframes, Teradata, and Extract Transform Load (ETL) utilities such as DataStage used in the client environment to bring new data into the EDW and create reports on existing data
  • Performed migration of components using Job Control Language (JCL), parmcards, BTEQ, FastLoad, and MLoad
  • Understood the logical and physical data models and table relationships, and created mapping documents based on analysis of existing jobs to build new components
  • Performed proofs of concept on converting SQL into DataStage stages to enable lineage
  • Attended business meetings, created design strategies, and converted functional requirements into designs
  • Interpreted the various business terms and identified/mined their equivalent technical terms inside the data warehouse built on Teradata
  • Wrote various PL/SQL queries to update existing data based on business requirements
  • Implemented performance tuning and wrote advanced SQL queries to find the physical columns equivalent to the corresponding business terms
  • Worked with DBAs to create DDLs, ensure proper standards were set up, and avoid skewness in queries
  • Tweaked underperforming queries and worked with DBAs to set up spaces
  • Developed new components in Teradata and Mainframes, along with design documents
  • Created DataStage components utilizing the various stages of IBM DataStage
  • Performed proofs of concept on converting complex SQL code into DataStage mappings

ETL Developer

Cognizant Technology Solutions, PepsiCo
Plano, TX
01.2016 - 09.2017
  • Worked as a Data Analyst and Developer in the retail domain, responsible for gathering requirements, developing SQL code, and unit testing
  • Attended business meetings, gathered requirements, and converted business requirements into design documents
  • Understood the data model and wrote source-to-target mappings for the ETL process
  • Performed data quality checks based on the test data
  • Created mappings and workflows in Informatica PowerCenter 9.5.1, 10.1, and Cloud; performed publications and subscriptions in the cloud using various transformations; created packages and ETL jobs in SSIS
  • Wrote shell scripts to handle CSV, XML, and TXT files in Unix, using sed and awk commands for file handling and writing trigger scripts
  • Involved in performance tuning of T-SQL queries, creating source-to-target mapping sheets, and handling statistics on the tables
  • Wrote stored procedures and triggers using PL/SQL in Oracle, utilized load utilities, and created packages to perform ETL
  • Generated reports by writing ad-hoc T-SQL
  • Responsible for creating and running unit tests and ensuring performance tuning using Informatica PowerCenter 9.5.1 and 10.1
  • Developed new components in Informatica Data Integration Hub (DIH)
  • Used Teradata utilities FastLoad, MLoad, BTEQ, and TPump; responsible for performance tuning
  • Worked in Oracle 11g, using load utilities such as SQL*Plus and SQL*Loader
  • Environment: Oracle, Teradata, SQL Server, SSIS, Informatica PowerCenter 9.5.1 and 10.1, Data Integration Hub (DIH), SQL Assistant, UNIX, DbVisualizer

Programmer Analyst, SSIS and ETL Analyst

Wipro Technologies Ltd, CVS Caremark
Richardson, Dallas, TX
02.2015 - 01.2016
  • As an ETL Lead, responsible for gathering requirements, designing, and assisting with development and code review on healthcare projects in Teradata 13 and SQL Server, with Informatica PowerCenter 9.5.1 and SSIS as ETL tools on the UNIX operating system
  • Performed data analysis alongside solution architects for requirements specifications
  • Created history files after project deployment based on business requirements, using ETL and utilities
  • Responsible for creating design documents and source-to-target mappings per functional requirements, and presenting them to client-side leads and SMEs for approval
  • Provided QA support during ST, QAT, and UAT
  • Performed the release management process in Serena
  • Good knowledge of healthcare subject areas: CLIENT, MEMBER, ACCOUNT, GROUP, DRUG, PRESCRIBER
  • Created T-SQL, tables in SQL Server, Teradata components (BTEQ, MLOAD, FLOAD), and Informatica workflows, sessions, and mappings based on the data model and requirements
  • Environment: Teradata, SQL Server, SSIS, Informatica PowerCenter 9.5.1, SQL Assistant, UNIX, Oracle, design and documentation

Subject Matter Expert

Wipro Technologies Ltd, Lloyds Banking Group
Manchester, UK
12.2012 - 01.2015
  • As the bank's data warehouse and Teradata lead, successfully handled and delivered the data migration and integration of two major banks, and also handled SDLC projects in the Retail portfolio
  • In-depth knowledge of the various databases holding each type of data in the warehouse
  • Worked on Teradata 12, Mainframes, ETL, and FSLDM
  • The core warehouse database was built on Teradata FSLDM; expertise in the FSLDM subject areas AGREEMENT, PARTY, FEATURE, EVENT, and PRODUCT
  • Knowledge of different index types such as Join Indexes, multi-table JIs, and single-table JIs
  • Worked with DBAs to define DDLs based on logical data models, set up coding standards, and assign the generic parameters used to find highly skewed queries
  • Worked with data modelers to map the data into the FSLDM-based database
  • Responsible for performance tuning, space additions, scheduling, and related tasks for handling new data
  • Used FastLoad, FastExport, and BTEQ scripts for handling ETL requirements
  • Created Impact Assessments (IAs) before study to identify any impacts to the platform and give an estimate with a variance level of +/-50
  • Environment: Teradata, TD utilities, ETL, performance tuning, mapping, SQL Assistant, Unix, Mainframes z/OS, JCLs

Senior Software Engineer - Teradata Analyst

Wipro Technologies Ltd, Lloyds Banking Group
Chennai, India
01.2011 - 11.2012
  • Developed the mid-level platform design document and data mapping sheets
  • Raised queries to the business and architect teams to close out all possible design clarifications and avoid any hindrance to the build
  • Created impact assessments and Teradata BTEQs, JCLs, and other components
  • Supported QA testing using HP Quality Center
  • Gathered requirements, provided updates to the CIO and necessary clarifications to EAD for any queries related to GDW data, and supported project implementation
  • Environment: Teradata, TD utilities, ETL, mapping, SQL Assistant, UNIX, Mainframes z/OS, JCLs, Connect:Direct

Senior Systems Analyst – Teradata Developer

Infosys Technologies LTD, Bank of America
Chennai, India
07.2007 - 12.2010
  • Built Teradata components such as BTEQ, MLOAD, and FLOAD in Mainframes
  • Built Mainframe components: JCLs, PROCs, and parmcards
  • Created different table types (SET, MULTISET, volatile) based on the requirements and the defined physical data model
  • Handled multiple files and used MLOAD to load the Teradata staging tables
  • Applied mapping logic to the incoming data from landing and staging tables and loaded it to the target tables
  • Worked on collecting statistics, decoding explain plans, and analyzing query confidence levels and different join types
  • Presented performance tuning to a team of around 15, explaining the efficiency of the different Teradata table types, the architecture, and the different load utilities
  • Environment: Teradata, TD utilities, Oracle PL/SQL, ETL, mapping, SQL Assistant, UNIX, Mainframes z/OS, Changeman, File-AID, JCLs, CA7, data migration and system migration

Education

Bachelor of Engineering - Electrical, Electronics and Communications Engineering

SA Engineering College
Chennai
03.2007

Skills

Technical Skills:

Databases and systems

  • Teradata, Oracle, SQL Server, Databricks, DB2
  • Mainframes, Unix, Python
  • Azure Databricks, ADLS, Azure Data Factory
  • Teradata and Oracle tools and utilities
  • ETL: Informatica PowerCenter 9.5.1, PowerCenter 10.2, IICS, Data Integration Hub, DataStage 11.5, Workato, SSIS, SSMS; data governance, data cleansing
  • Waterfall, Agile, Jira, GitHub, Confluence
  • Scheduling and other client tools: CA7, ENDEVOR, Autosys, WCC, Ctrl-M
  • SQL Assistant, Visio, SSMS, Teradata Studio
  • Friendly, positive attitude

Certification

Post Graduate Program in Artificial Intelligence and Machine Learning: Business Applications, Texas McCombs School of Business, Sep 2021 - Aug 2022

  • ePortfolio: https://eportfolio.mygreatlearning.com/parthiban-venkatesan

Timeline

Senior Data Engineer

MGM Resorts
09.2021 - Current

Senior ETL Analyst

Compunnel Software Group Inc
08.2019 - 08.2021

Senior ETL Programmer Analyst Teradata

Systems Technology Group, Inc, Ford Motor Credit Company
10.2017 - 08.2019

ETL Developer

Cognizant Technology Solutions, PepsiCo
01.2016 - 09.2017

Programmer Analyst, SSIS and ETL Analyst

Wipro Technologies Ltd, CVS Caremark
02.2015 - 01.2016

Subject Matter Expert

Wipro Technologies Ltd, Lloyds Banking Group
12.2012 - 01.2015

Senior Software Engineer - Teradata Analyst

Wipro Technologies Ltd, Lloyds Banking Group
01.2011 - 11.2012

Senior Systems Analyst – Teradata Developer

Infosys Technologies LTD, Bank of America
07.2007 - 12.2010

Bachelor of Engineering - Electrical, Electronics and Communications Engineering

SA Engineering College