
Tymur Aurora

Data Integration Lead - IICS
Oviedo, FL

Summary

  • 14+ years of work experience in ETL development, data visualization, data warehousing, and data integration using cutting-edge technologies
  • 4+ years of professional experience in data integration on Informatica Intelligent Cloud Services (IICS) and Informatica Cloud Real Time (ICRT) processes
  • 10+ years of strong experience in data warehousing and ETL using Informatica PowerCenter 10.2.1/9.6.5.1/8.x/7.x, PowerExchange 10.1/9.6.5.1/8.6/8.1, Oracle 12c/11g/10g/9i, Teradata 14/13/12/V2R6, and Erwin
  • 2+ years of experience building data integration solutions with Informatica Data Quality (IDQ) 10.2/9.6.1
  • 3+ years of experience in real-time data warehouse development using the CDC tool Informatica PowerExchange 10.1/9.6.5.1/8.6/8.1
  • 1+ year of experience building data integration solutions with SnapLogic

Overview

18 years of professional experience

Work History

Data Integration Lead-IICS

AbbVie
03.2022 - Current
  • Interacted with business partners to identify and understand requirements, design specifications, expectations, and vision on key projects
  • Identified ETL specification based on business requirements and created ETL IICS mapping document (STTM), high level design document (HLD) and technical design document (TDD)
  • Developed TaskFlow, DataTask, Mapping using IICS cloud services for Marketo load
  • Developed REST API/Bulk process for Marketo extracts
  • Built data feeds and Extracts for Marketo loads
  • Extensively worked and developed on Web Service transformation, Business Service as a part of the inventory sync between DataMart and Marketo loads
  • Managed code reviews and ensured all solutions aligned with architectural/requirements specifications, SLAs, and standards
  • Worked on REST API, Bulk extract downloaded from Marketo application for data reconciliation using Postman
  • Managed and supported daily MA-Ops data loads
  • Loaded high volumes of data into Marketo using the cURL command
  • Managed and supported US 3RD party data loads
  • Maintained and supported the S3 SFTP folder for the US 3rd-party vendor
  • Maintained and supported AWS secret and access keys for US and global vendors
  • Maintained and supported the AWS S3 bucket for DataMart loads using a Hadoop cluster
  • Maintained and supported the Azure DevOps code repository for DataMart
  • Maintained and supported the Informatica real-time application process (ICRT)
  • Worked on various ICRT processes that load data from different sources using REST APIs
  • Performed routine Autosys tasks such as synchronizing databases, opening cases with CDL as needed, and processing ServiceNow requests
  • Supported Autosys scheduling infrastructure by monitoring all components of the infrastructure to ensure high availability
  • Worked on Hadoop cluster loading data from Unix file system to HDFS
  • Used Spark-SQL to create temp table in Hive database and files created in AWS S3
  • Read files in AWS S3 using Spark-SQL
  • Worked on DevOps Repo for code migration
  • Worked on API Manager (iPaaS) to set up rate limits for IICS extracts
  • Transformed source Excel and PDF files into CSV and loaded the data into Informatica PIM (Product 360) Structure features.

Data Integration Lead-IICS

Sanofi
01.2021 - 02.2022
  • Managed US 3rd-party Scrum boards and led the team's efforts toward continuous improvement
  • Performed project management on several business enhancements and work requests; managed sprint backlog items and tasks
  • Managed the team, assigned tasks, reviewed/validated code, and helped fix critical issues in testing and development
  • Provided guidance to the MCE OPS team on data integration solutions; led troubleshooting for complicated issues; supervised junior staff on technical resolutions; provided training to peers with specialized technical skills; transferred knowledge and skills to developers and business analysts based on Agile and Sprint methodology; actively joined daily scrums
  • Managed and supported daily MCE OPS data loads
  • Interpreted business rules and sourced data from multiple source systems
  • Interacted with users and supported multiple projects for Data integration team (Merkle analytics, Adobe, SFMC and MCE global team)
  • Extracted raw data from SharePoint, SQL Server, and flat files to staging tables and loaded data into Cloud Integration Hub (CIH), flat files, SharePoint, and SQL Server using Informatica Cloud
  • Hands-on experience loading Customer, Response, Contact, Fulfillment, and Undeliverable data; performed upserts using an External ID (hash key) to maintain SCD Type 1 within CDI using Taskflows, the Bulk API, and the Standard API
  • Created file mass ingestion tasks to move data files between FTP, SFTP, and local folders, and database mass ingestion tasks to load data from on-premises databases to the Salesforce cloud
  • Created, maintained, and supported S3 SFTP folder for US 3rd party vendor and global 3rd party vendor
  • Created, maintained, and supported AWS Secret and Access key for US and global vendors
  • Developed Informatica Cloud Data Integration mappings and Taskflows to extract and load data between on-premises systems, Amazon S3, SQL Workbench Data Warehouse, and Data Lake Store; created and configured all kinds of cloud connections and runtime environments with Informatica IICS
  • Operated the AWS console to configure services and settings
  • Handled data profiling, source analysis and data quality analysis
  • Prepared high-level solution architecture designs and reviewed them with top management.
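As a minimal illustration of the hash-key upsert pattern described above (SCD Type 1: incoming records overwrite the existing row matched on an External ID), a sketch in Python with hypothetical field names:

```python
import hashlib

def external_id(record, key_fields):
    """Build a deterministic hash key from the record's natural key fields."""
    raw = "|".join(str(record[f]) for f in key_fields)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def upsert_scd1(target, incoming, key_fields):
    """SCD Type 1: insert new rows, overwrite matches in place (no history)."""
    for rec in incoming:
        target[external_id(rec, key_fields)] = rec
    return target

# Hypothetical contact records keyed on email
target = {}
upsert_scd1(target, [{"email": "a@x.com", "status": "Sent"}], ["email"])
upsert_scd1(target, [{"email": "a@x.com", "status": "Undeliverable"}], ["email"])
```

The same record arriving twice yields a single row holding the latest values, which is the Type 1 behavior the Taskflows above enforce via the External ID.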

Informatica Cloud Engineer

Indiana University Health
07.2020 - 12.2020
  • Gathered requirements from business stakeholders and the IU Health Intelligence and Medical Intelligence applications for IICS testers
  • Created high-level designs and architecture directions to be used as inputs to detailed designs
  • Delivered IICS ETL specifications and Detailed Design by developing Informatica mapping designs, Data Integration Workflows and load processes
  • Responsible for analyzing functional requirements
  • Created IICS design specifications and technical specifications based on functional requirements
  • Created end to end error handling process using Hierarchy Parser transformation, XML file and Batch script
  • Created end to end automated process parameter file for extensive IU Health extract generation
  • Extensively worked on developing IICS Mapplets, Mappings, Sessions, Worklets and Workflows for data loading
  • Worked on IICS transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Sorter, Router, Sequence Generator, XML transformation etc
  • Extensively used IICS ETL to load data from a wide range of sources such as relational databases, XML, and flat files (fixed-width or delimited)
  • Expertise in writing pre- and post-SQL and SQL overrides as per requirements
  • Extensively worked with the PDO (Pushdown Optimization) and CDC (Change Data Capture) mechanisms
  • Responsible for creating parameter files in order to connect to the right environment and database; responsible for monitoring sessions
  • Assisted with interpretation of the IT strategy and aligned resources and people to meet strategy goals
  • Made operational and strategic decisions, executing business operations for the department/function managed
  • Involved in testing QNXT member, provider, claim processing, utilization management, accumulators, contracts and benefits
  • Analyzed the change detection process on QNXT database tables to capture the daily changes done by the user through online QNXT application
  • Involved in impact analysis of the QNXT adjudication system as a result of changes in EDI transactions.

Informatica Cloud Engineer

Healthfirst
10.2019 - 06.2020
  • Provided recommendations and expertise on data integration and ETL methodologies
  • Built, developed, and provided solutions for over 15 data integrations sourced from Workday
  • Created and integrated SOAP call for Workday-IICS integration
  • Created connections for HR, Payroll, and Finance data in IICS
  • Converted specifications to programs and integration data mappings for the Informatica Cloud ETL environment
  • Designed, developed, tested, and implemented ETL processes using Informatica Cloud
  • Created and provided solutions for high-level design documentation
  • Created detailed-level design document specifications for Informatica Cloud mappings
  • Built mapping tasks, taskflows, and linear taskflows, and provided a solution for a custom error-handling process
  • Built data validation pipelines for vendors such as Equifax, Aetna, and Cigna
  • Sent files to vendors using FTP/SFTP connections
  • Developed and wrote files into the EC2 server and AWS S3 bucket
  • Developed reusable integration components, transformations, and logging processes
  • Built Unix shell scripts to rename files and change headers and footers per file specifications
  • Built post-session command scripts per file specifications
  • Designed and developed Informatica jobs to trigger Unix shell scripts for importing data from Workday and bringing data into AWS S3 storage
  • Managed the IICS administrator console
  • Built the Informatica Cloud deployment process
  • Deployed Informatica Cloud jobs into various environments (ST, RT, and Prod)
  • Maintained project timelines and communicated with clients to ensure projects progressed satisfactorily using Agile methodology
  • Worked with JIRA and the client's internal VersionOne agile tools.
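The header/footer rewrite those file-specification scripts performed can be sketched as follows; this is an illustrative Python equivalent with a hypothetical file layout (one header line, data records, one trailer line), not the original shell code:

```python
def rewrite_header_footer(lines, header, footer):
    """Replace the first and last lines of a vendor file per the file spec."""
    body = lines[1:-1] if len(lines) >= 2 else []
    return [header] + body + [footer]

# Hypothetical vendor file: old header/trailer swapped for the spec'd ones
lines = ["OLD_HDR", "rec1", "rec2", "OLD_TRL"]
out = rewrite_header_footer(lines, "HDR|20200601", "TRL|2")
```

In practice the same transformation was done with `sed`/`awk` in a post-session command, but the logic is the same: preserve the data body and emit the header and trailer the downstream vendor expects.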

Sr. Software Engineer-Cloud Developer

PwC
10.2018 - 09.2019
  • Provided recommendations and expertise on data integration and ETL methodologies
  • Responsible for analyzing UltiPro interfaces; created design specifications and technical specifications based on functional requirements/analysis
  • Designed, developed, tested, and implemented ETL processes using Informatica Cloud
  • Developed Cloud mappings to extract data for different regions (APAC, UK and America)
  • Design, Development, Testing and Implementation of ETL processes using SnapLogic
  • Integrated the full IDQ installation process
  • Arranged and wrote test cases for PoT (Proof of Technology) data quality vendor selection
  • Cleansed data prior to loading it into target systems using IDQ
  • Designed and executed IDQ mappings that cleanse and de-duplicate data
  • Implemented data quality processes including transliteration, parsing, analysis, standardization, and data enrichment using IDQ transformations
  • Guided and migrated PostgreSQL and MySQL databases to AWS Aurora
  • Special interest and experience in AWS cloud infrastructure database migrations, including converting existing Oracle and MS SQL Server databases to PostgreSQL, MySQL, and Aurora
  • Developed ETL pipelines in and out of the data warehouse using SnowSQL, writing SQL queries against Snowflake
  • Designed and implemented a fully operational production grade data solution on Snowflake Data Warehouse
  • Experienced in developing human task workflows and IDQ scorecards to support data remediation
  • Converted specifications to programs and data mappings in the Informatica Cloud ETL environment
  • Developed reusable integration components, transformations, and logging processes
  • Used Address Doctor extensively for North America Address validations
  • Built several reusable components in IDQ using standardization and reference tables, which can be applied directly to standardize and enrich address information
  • Created custom rules to validate zip codes, states and segregated address data based on country
  • Created web services for address mapplets of different countries to integrate with SOAP UI
  • Supported system and user acceptance testing activities, including issue resolution
  • Maintained and supported SQL Server, MemSQL, PostgreSQL, and TDV (Tibco Data Virtualization) databases
  • Executed change management activities supporting production deployment for developers, quality control analysts, and environment management personnel
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics
  • Performed unit and integration testing and documented test strategy and results
  • Formulated highly detailed DW solutions using the Informatica toolset that can be practically implemented.
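A minimal sketch of the kind of zip-code validation rule mentioned above (US 5-digit ZIP or ZIP+4); the function name and regex here are illustrative, not the actual IDQ rule:

```python
import re

# US ZIP: exactly 5 digits, optionally followed by a hyphen and 4 digits
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def valid_us_zip(zip_code):
    """Accept 5-digit ZIP or ZIP+4 (e.g. 12345 or 12345-6789)."""
    return bool(US_ZIP.match(zip_code.strip()))
```

In IDQ this kind of check is typically expressed as a labeler/parser rule or an expression transformation; segregating addresses by country then routes each record to the matching country-specific rule.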

Tech Lead/Lead Developer

Salesforce
10.2017 - 09.2018
  • Developed and advocated development standards and practices for the development team, i.e., coding, code management, and documentation
  • Developed new workflow components for the Salesforce system
  • Designed and developed highly efficient, high-performance ETL mappings/workflows
  • Developed the audit activity for all the cloud mappings
  • Created file watcher jobs to set up dependencies between Cloud and PowerCenter jobs
  • Designed, developed, and tested ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.1
  • Developed Informatica code; designed, developed, and modified Informatica mappings and workflows
  • Involved with management in terms of supplying input for key design and architecture decisions, as well as work estimation and resource planning.

Lead Informatica Developer

BlueShield of CA
11.2016 - 10.2017
  • Developed ETL programs using Informatica to implement the business requirements
  • Performed data mapping of source-to-target data sets
  • Loaded aggregate data into a relational database for reporting, dashboarding, and ad-hoc analysis
  • Oversaw the inbound and outbound interface development process by working closely with functional staff, developers, and testers
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
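The Type 2 pattern above (keep history by expiring the current row and inserting a new version when a tracked attribute changes) can be sketched in Python; field names like `member_id` and `plan` are hypothetical:

```python
from datetime import date

def apply_scd2(history, incoming, key, today=None):
    """SCD Type 2: expire the current row and append a new versioned row
    when a tracked attribute changes; full history is preserved."""
    today = today or date.today().isoformat()
    for rec in incoming:
        current = [r for r in history if r[key] == rec[key] and r["is_current"]]
        # Skip records whose tracked attributes are unchanged
        if current and all(current[0].get(k) == v for k, v in rec.items()):
            continue
        for r in current:               # close out the old version
            r["is_current"] = False
            r["end_date"] = today
        history.append({**rec, "start_date": today, "end_date": None,
                        "is_current": True})
    return history

# Hypothetical member changes plan: two rows result, one current
hist = apply_scd2([], [{"member_id": 1, "plan": "Gold"}], "member_id",
                  today="2017-01-01")
hist = apply_scd2(hist, [{"member_id": 1, "plan": "Silver"}], "member_id",
                  today="2017-06-01")
```

In a PowerCenter mapping the same decision is made with a lookup on the dimension plus a router splitting unchanged, changed, and new rows; the sketch only shows the versioning logic.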

Lead Informatica Developer

Actavis
11.2014 - 01.2016
  • Involved and proficient in defining and validating protocols for clinical studies and handling trial responsibility throughout the data-management lifecycle
  • Worked on CED-EDC source system (Electronic Data Capture) and database design and hypothesis
  • Supported clinical trials for CROs (Contract Research Organizations) by providing meticulous data management
  • Designed and maintained databases, queries, reports, graphics, and data-analysis tools; performed data entry, check reviews, database audits, and coding; defined and validated study protocols
  • Worked on Oracle Clinical development for the design, testing, and implementation of study databases
  • Developed clear clinical data sets enabling the standardized collection and analysis of massive amounts of cross-boundary data content in a timely manner and with a high level of accuracy
  • Tracked the progress of clinical studies, ensuring projects meet timelines and quality expectations.

Sr. Informatica Developer

Tufts Health Plan
08.2013 - 10.2014
  • Developed technical specifications of the ETL process flow
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in SQL Server and Oracle
  • Worked on various issues in existing Informatica mappings to produce correct output
  • Debugged mappings by creating logic that assigns a severity level to each error and sending the error rows to error table so that they can be corrected and re-loaded into a target system
  • Analyzed the existing system and developed business documentation (TRD) on required changes
  • Analyzed existing mappings and, through reverse engineering, created DLDs
  • Analyzed existing Health Plan issues and redesigned processes as required.

Sr. Informatica Developer

Long Island Railroad
10.2011 - 07.2013
  • As a lead member of ETL Team, responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developer, ETL and Informatica development, administration and mentoring
  • Participated in business analysis, ETL requirements gathering, physical and logical data modeling and documentation
  • As Scrum Master, managed stand-ups, backlogs, and sprint planning meetings
  • Delivered quality solutions on time and within budget using approved scheduling tools, techniques, and methodologies
  • Accustomed to working well with internal and external stakeholders at multiple levels, navigating the organization, and successfully completing complex projects under tight deadlines
  • Performed self and peer reviews for Informatica and Oracle objects
  • Designed data transformation mappings and data quality verification programs using Informatica and PL/SQL
  • Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.

Informatica Developer

Northrop Grumman
10.2009 - 09.2011
  • Designed and developed Logical/physical Data Model, Forward/Reverse engineering Using Erwin 7.2
  • Designed and developed Workflows as per ETL Specification for Stage load and Warehouse load
  • Provided on-call production support; efficiently tracked HEAT tickets, resolved production issues in a timely manner, proactively escalated when appropriate, and drove resolution and closure of trouble issues and tickets
  • Designed ETL functional specifications and converted them into technical specifications
  • Interacted with management to identify dimensions and measures
  • Reviewed source systems and proposed data acquisition strategies
  • Developed ETL methodology to custom fit the ETL needs of sales.

ETL Developer

Health Care USA
11.2007 - 09.2009
  • Involved in creation of Logical Data Model for ETL mapping and the process flow diagrams
  • Worked with the Informatica versioned repository using the check-in and check-out objects feature
  • Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions
  • Provided guidance to less experienced personnel
  • Conducted quality assurance activities such as peer reviews
  • Developed, executed, and maintained appropriate ETL development best practices and procedures
  • Monitored and tuned ETL processes for performance improvements; identified, researched, and resolved data warehouse load issues.

Network System Administrator

CUNY York College
06.2005 - 10.2007
  • Worked with and/or managed vendors to implement and maintain network-related solutions
  • Analyzed data network documentation and communicated with management regarding the current operational status of networks
  • Assisted technical support staff and end users in managing basic and expedited support for all network-related issues
  • Worked independently while providing sysadmin support to multiple software development teams
  • Developed and reviewed policies and guidelines established for all levels of network and systems administrators/engineers to follow
  • Worked with wired and wireless data network providers to debug and resolve customer-affecting service issues
  • Documented and performed network changes to fulfill network change requests.

Education

Certificate - Scrum Master

Scrum Alliance
Web
12.2021

Bachelor of Science - Information System Management

York College of The City University of New York
Jamaica, NY
02.2005

Skills

  • Data Warehouse/Data Mart development life cycle using dimensional modeling of STAR and SNOWFLAKE schemas, OLAP, ROLAP, MOLAP, fact and dimension tables, and logical & physical data modeling using Erwin 7.5/4.2 and MS Visio
  • Business Intelligence experience using OBIEE 11g/10g, Business Objects XI R2, and MS Access Reports
  • Extensive experience using Oracle 11g/10g/9i, DB2 8/7, MS SQL Server 2008/2000, Teradata 14/13/12/V2R6, MS Access 7.0/2000, Erwin, XML, SQL, PL/SQL, SQL*Loader and MS SQL Developer 2000, Win 7/XP, and Sun Solaris
  • Worked with Teradata loading utilities like MultiLoad, FastLoad, TPump, and BTEQ
  • Extensively worked on Oracle functions, cursors, stored procedures, packages, and triggers
  • Experience in data modeling and creating LDMs and PDMs for Star and Snowflake schemas using MS Visio and Erwin 7.1/4.5
  • Exposure in overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation and production support
  • Excellent working knowledge of UNIX shell scripting, job scheduling on multiple platforms, experience with UNIX command line and LINUX
  • Experience working in an onsite–offshore model
  • Worked as Agile Scrum call facilitator and track status and discussion using BaseCamp and Rally
  • Extensive experience in effort estimation, work distribution, status tracking and reporting, team coordination, and client updates
  • Highly motivated to take independent responsibility as well as ability to contribute and be a productive team member
  • Providing Production Support, Resolution & Closure of Trouble issues tickets
  • ETL Technology: IICS R29, Informatica PowerCenter 10.2.1/9.6.5.1/8.x/7.1/6.1, PowerExchange 10.1/9.6.5/8.6/8.1, IDQ 10.2, SnapLogic
  • Data Modeling: Star Schema Modeling, Snowflake Modeling, MS Visio, Erwin
  • Databases: Oracle 12c/11g/10g/9i, MS SQL Server 2008/2000, MS Access, DB2, Teradata 14/13/12/V2R6, Sybase, Netezza, PostgreSQL, MemSQL
  • Reporting Tools: Business Objects XI R2, OBIEE 11g/10g
  • Languages: C, C++, SQL, PL/SQL, HTML, UNIX Scripting
  • Other Tools: Toad, SCM, Putty, Tidal, Autosys, ESP, Tableau, QlikView, Power BI
  • Operating Systems: LINUX, Sun Solaris, AIX, Windows 7/XP/2000/98, Mac OS
  • Version Control: GitHub, SVN (Tortoise as SVN client, Subclipse for Eclipse), ClearCase
  • Data Integration

Work Availability

Monday through Sunday: morning, afternoon, and evening
