Accomplished IT professional with over 25 years of expertise in Data Architecture and development, specializing in data quality assurance and the implementation of critical business applications across Insurance, Financial, Retail, and Telecommunications sectors. Proven track record of enhancing data integrity and optimizing systems through innovative design and effective maintenance strategies. Core competencies include project management, system integration, and analytical problem-solving. Aiming to leverage extensive experience to drive efficiency and innovation in future technology initiatives.
Overview
29 years of professional experience
1 Certification
Work History
Lead Consultant / Sr. Data Architect
Infosys
02.2017 - Current
As part of these assignments, built ETL processes to extract data from different source applications using Informatica, Azure Data Factory, real-time CDC functionality, and the DMF/Dixie, TDI, and TVD frameworks, then transform and load it into target databases such as Teradata, Netezza, Data Lake/Hive, and Snowflake for warehousing and BI reporting
Upgraded the Lawson system from on-premises to Lawson CloudSuite, including the exchange of inbound and outbound files with vendors such as Transamerica (401(k) plans), ACA (employee tax documents), ATS iCIMS (online hiring and onboarding), Equifax work numbers (employment verification), benefits focus (employee benefit enrollments), and the Asset Protection area
Architected and designed Snowflake Data Cloud solutions, optimizing performance, security, and reliability across multiple environments
Interacted regularly with clients to understand the business use of requirements in detail
Engineered Snowflake databases, schemas, virtual warehouses, clustering, and time travel strategy
Established Snowflake non-production environments, streamlining testing and development processes
Provided extensive training and continuous support to teams transitioning to Snowflake, empowering them to fully exploit its analytics and reporting capabilities
Led migration of on-premises Teradata applications to Snowflake for a leading communications provider, delivering significant licensing savings and performance improvements
Analyzed current pain points and the KPIs required for immediate business needs, and delivered a summary report
Prepared scope-of-work, effort estimation, ETL design, and technical specification documents
Built ETL workflows with Real-time CDC using Informatica Power Exchange, Azure Pipelines and NiFi Flows
Responsible for developing workflows/mappings and unit testing documents that help offshore and junior team members understand the requirements and develop the code
Managed DB2 databases, handling system configurations, performance tuning, and data migrations for large-scale data warehousing projects
Performed data analysis and handled ad hoc client requests
Implemented disaster recovery strategies, ensuring data retention policies are adhered to on Snowflake, AWS, and Azure platforms
Developed PL/SQL, UNIX, Scala, and Pig scripts for ETL routines to fulfill requirements
Developed and automated database processes using Python and Unix Shell scripting, enhancing workflow efficiency
Designed and managed Snowflake virtual warehouses, ensuring right-sizing and cost-effectiveness
Responsible for code migration and code version control maintenance in each deployment
Optimized Snowflake queries, using SnowSQL and Snowflake's advanced features (SnowPipe, Streams, Tasks, etc.) to enhance data pipeline performance (see the Streams/Tasks sketch at the end of this role)
Responsible for peer ETL code reviews to improve the overall quality of deliverables
Performed analysis and troubleshooting for application issues
Developed and maintained Python automation scripts to streamline standard database processes, and troubleshot and enhanced existing Unix shell scripts for database management
Built and maintained processes around CI/CD pipelines, integrating Jenkins, testing frameworks, GitHub, and JIRA
Technical Environment: Snowflake, Teradata, Azure, Azure Data Factory, Databricks, Scala, Python, Informatica Power Center 10.0, Power Exchange, CDC Real Time, XML, SSIS, SQL Server, Netezza, DB2, AS/400
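For illustration of the Streams/Tasks work above: a minimal sketch of the incremental-load pattern, submitted through snowflake-connector-python. The raw.orders source, curated.orders target, etl_wh warehouse, and all credentials are hypothetical placeholders, not objects from the actual engagement.

```python
# Minimal sketch: incremental loading with a Snowflake Stream plus a
# scheduled Task. All object names and credentials are hypothetical.
import snowflake.connector

DDL = [
    # A stream records row-level changes (CDC) on the source table.
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw.orders",
    # A task merges the captured changes on a schedule, and only runs
    # when the stream actually holds data, so idle cycles cost nothing.
    """
    CREATE TASK IF NOT EXISTS merge_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      MERGE INTO curated.orders t
      USING orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
           VALUES (s.order_id, s.status)
    """,
    # Tasks are created suspended; RESUME activates the schedule.
    "ALTER TASK merge_orders RESUME",
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    database="analytics", schema="public",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```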
Senior ETL Consultant
AVIVA Insurance
11.2015 - 12.2016
As part of the Horizon Guidewire Policy Center (PC) / Billing Center (BC) implementation, PC/BC data is extracted and merged with existing N0110 data from the general ledger system for Personal Lines (Auto/Property) to feed the MIS and EDW systems
Analyzed the Policy Center (Guidewire) Business requirements and created detailed design and technical design documents for loading data from the Policy Center to downstream Systems
Designed and developed the Batch Integration Framework and the financial reconciliation process against PC/BC/N0110 (see the reconciliation sketch at the end of this role)
Extensively used Informatica Power Center to design multiple mappings with embedded business logic
Extensively used advanced PL/SQL concepts (e.g., arrays, PL/SQL tables, cursors, user-defined object types, exception handling, database packages, nested tables) to manipulate data
Promoted project code (Informatica, UNIX, and Oracle scripts) across multiple environments using deployment groups and provided support for PROD deployments
Worked on performance tuning of ETL jobs to achieve faster data loads as part of ETL code refactoring
Set up ETL load schedules for batch processing using Zena
Evaluated the business impact of specific data quality issues
Measured and audited large volumes of varying data for quality issues
Technical Environment: Informatica Power Center 9.1, XML, Oracle 12.1, PL/SQL, GIT, ALM, JIRA, Guidewire Policy Center, Downstream N0110 process
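As a rough illustration of the PC/BC-to-N0110 financial reconciliation mentioned above: a simplified, self-contained Python sketch. The policy_id and premium field names, the tolerance-free exact comparison, and the sample rows are assumptions for demonstration, not the project's actual specification.

```python
# Simplified reconciliation sketch: compare keys and amount totals
# between a source extract (e.g., PC/BC) and a target ledger feed
# (e.g., N0110). Field names are illustrative assumptions.
from decimal import Decimal

def reconcile(source_rows, target_rows, key="policy_id", amount="premium"):
    """Return discrepancies between two row sets."""
    src = {r[key]: Decimal(r[amount]) for r in source_rows}
    tgt = {r[key]: Decimal(r[amount]) for r in target_rows}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "missing_in_source": sorted(tgt.keys() - src.keys()),
        "amount_mismatches": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
        "source_total": sum(src.values()),
        "target_total": sum(tgt.values()),
    }

if __name__ == "__main__":
    source = [{"policy_id": "P1", "premium": "100.00"},
              {"policy_id": "P2", "premium": "250.50"}]
    target = [{"policy_id": "P1", "premium": "100.00"},
              {"policy_id": "P3", "premium": "75.00"}]
    print(reconcile(source, target))
```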
Team Lead / Technical Manager
Scotiabank
07.2011 - 11.2015
The Scotiabank Enterprise Data Warehouse was created to give users access to key decision-making business information
Data feeds are received from multiple source systems on daily, weekly, and monthly frequencies
A single, detailed view of the entire Scotiabank data warehouse covers more than 3,000 business attributes
Interacted with end users and business analysts to gather requirements and prepare design specifications
Analyzed the business requirements and created detailed design and technical documents for loading data from source to target
Designed and developed Informatica Mappings to load data from Source systems to a data warehouse
Used Informatica Power Center to design multiple mappings with embedded business logic
Created session tasks and workflows in the workflow manager and deployed them across the DEV, QA, UAT and PROD repositories
Worked on performance tuning of SQL queries, Informatica mappings, sessions, and workflows to achieve faster data loads into the data warehouse (see the tuning sketch at the end of this role)
Fine-tuned and incorporated changes to complex PL/SQL procedures and packages that update the existing dimension tables, using PL/SQL Developer
Scheduled Jobs and Job streams using Maestro (Tivoli)
Preserved versions of code using Informatica and PVCS
Carried the pager on an on-call rotation supporting all EDW batch processes
Resolved Incident tickets within SLA and documented the solutions
Acted as team lead, conducted impact analyses, and provided feedback on problems and recommended solutions
Supported the team through the SAS migration from version 9.1.3 to 9.3
Performed SAS administration for the SAS platforms (Base SAS, SAS MO, Enterprise Miner/Guide, SNA, and Fraud Framework)
Performed Root Cause and Diagnostic Analysis
Supported all EDW applications across a 30 TB production environment
Technical Environment: Informatica Power Center 9.1, DB2, SQL Server, PL/SQL, Business Objects, PVCS, SharePoint, SAS 9.1.3, SAS 9.3, DataFlux 8.2, Tivoli, UNIX, Cognos 10, Netezza, Big Data
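To make the tuning bullet above concrete: a small, self-contained sketch of the measure-change-remeasure loop. Python's built-in sqlite3 stands in for the warehouse database so the example runs anywhere; the real work targeted DB2/Netezza through Informatica, and the table and index names here are illustrative.

```python
# Time a query before and after adding an index; sqlite3 is only a
# stand-in for the actual warehouse platform.
import sqlite3
import time

def timed(cur, sql, args=()):
    start = time.perf_counter()
    cur.execute(sql, args).fetchall()
    return time.perf_counter() - start

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE txn (id INTEGER, acct INTEGER, amt REAL)")
cur.executemany(
    "INSERT INTO txn VALUES (?, ?, ?)",
    [(i, i % 5000, i * 0.01) for i in range(300_000)],
)
query = "SELECT SUM(amt) FROM txn WHERE acct = ?"

before = timed(cur, query, (42,))            # full table scan
cur.execute("CREATE INDEX ix_txn_acct ON txn (acct)")
after = timed(cur, query, (42,))             # indexed lookup
print(f"before: {before:.4f}s  after: {after:.4f}s")
conn.close()
```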
Senior ETL Consultant
Allstate Insurance Company of Canada
01.2007 - 05.2011
Company Overview: Allstate Canada is one of Canada's leading producers and distributors of home and auto insurance products
Project: Property and Casualty Insurance Claims Data mart
This project built a property and casualty insurance claims data mart and generated reports to help management and claims employees identify trends and implement timely action plans to improve claims data quality and cost management
The ETL process reads data from the source system flat files and writes to target SQL Server DataMart
Conducted requirement gathering, analysis, coding, code review, testing, and Implementation
Involved in designing and implementing the process for loading monthly P&C feeds into staging tables in the SQL Server database using TSQL scripts, stored procedures, and functions
Involved in dimensional modeling to design and develop Star schemas using Erwin 4.1
Performed data analysis on all feeds to ensure granularity matching on targets
Wrote standard documents and code templates and introduced reusable objects to speed up the development
Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner
Used Power Exchange for mainframe sources
Conducted code walkthroughs, peer reviews, and documentation
Wrote stored procedures and packages for implementing business rules and transformations
Loaded operational data from Oracle, flat files, XML files and Excel Worksheets into various data marts
Designed error-handling strategy for ETL loads and performance tuning
Created and configured UNIX shell scripts for transferring files, archiving data feeds, and executing Informatica sessions (see the workflow-launch sketch at the end of this role)
Technical Environment: Informatica Power Center 8.6/8.1.1, SQL Server, TSQL, Stored Procedures & functions, Business Objects, Quality Center, PVCS
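As a sketch of scripting Informatica session execution (done in UNIX shell on this project), the same idea in Python via the standard PowerCenter pmcmd CLI. The service, domain, folder, and workflow names are placeholders, and pmcmd is assumed to be on the PATH.

```python
# Launch an Informatica workflow with pmcmd and wait for completion.
# All connection details and object names below are placeholders.
import subprocess
import sys

def start_workflow(service, domain, user, password, folder, workflow):
    """Run pmcmd startworkflow and return its exit code (0 = success)."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-p", password,
        "-f", folder, "-wait", workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    rc = start_workflow("IS_DEV", "Domain_Dev", "etl_user", "***",
                        "CLAIMS_DM", "wf_load_claims_monthly")
    sys.exit(rc)
```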
Company Overview: Fairfax, a global financial services holding company, is engaged in property, casualty, and life insurance, as well as reinsurance, investment management, and insurance claims management
The project was implemented in multiple phases; its main objective was to build an insurance data mart
As an ETL developer, I was responsible for claims and finance data marts using Informatica Power Center
Coordinated with the client in gathering user requirements
Worked with business analysts to develop business requirements and translate them into technical specifications
Designed the technical documentation for source-to-target mappings
Extracted, transformed and loaded the data from the mainframe to the Oracle target database
Developed PL/SQL packages for processing data in the staging tables according to the Client's requirements
Created various transformations in Power Center Designer
Worked on caching optimization techniques in Aggregator, Lookup, and Joiner transformations
Developed mappings to implement Type 2 slowly changing dimensions (see the Type 2 sketch at the end of this role)
Technical Environment: IBM Mainframe, Informatica Power Center 7.1.1/6.1, Striva, COBOL, DB2, Oracle, PL/SQL, MS Access, TOAD
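A simplified illustration of the Type 2 slowly-changing-dimension logic mentioned above, in plain Python rather than an Informatica mapping. The customer dimension and its columns (cust_id, city, eff_date, end_date, current_flag) are assumptions made for the example.

```python
# Type 2 SCD sketch: expire changed rows, insert new versions.
from datetime import date

OPEN_END = date(9999, 12, 31)  # conventional "open" end date

def scd2_apply(dimension, incoming, today):
    """Apply incoming rows to the dimension with Type 2 history."""
    current = {r["cust_id"]: r for r in dimension if r["current_flag"]}
    for row in incoming:
        old = current.get(row["cust_id"])
        if old is None or old["city"] != row["city"]:
            if old is not None:            # expire the old version
                old["end_date"] = today
                old["current_flag"] = False
            dimension.append({             # insert the new version
                "cust_id": row["cust_id"], "city": row["city"],
                "eff_date": today, "end_date": OPEN_END,
                "current_flag": True,
            })
    return dimension

dim = [{"cust_id": 1, "city": "Toronto", "eff_date": date(2000, 1, 1),
        "end_date": OPEN_END, "current_flag": True}]
scd2_apply(dim, [{"cust_id": 1, "city": "Ottawa"},
                 {"cust_id": 2, "city": "Montreal"}], date(2001, 6, 1))
for r in dim:
    print(r)
```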
Systems Analyst
Singapore Network Services, Singapore
01.1998 - 07.2002
Project: BizNet
An electronic information database service provided by the Registry of Companies and Businesses, offering users online access to information on Singapore companies and businesses
Project: LawNet
This application links government agencies such as the Judiciary, Attorney General's Chambers and the Ministry of Law to private organizations
Involved in analysis and design, held requirements discussions with clients, and prepared the necessary specifications and analysis-and-design documents using Rational Rose
Developed and maintained the mainframe online systems and migrated them to a J2EE application
Generated various types of web reports using JBuilder 3.5
Migrated applications from CSP V3.3 to CSP V4.1 and from CICS/ESA to CICS TS for S/390
Technical Environment: COBOL, CICS, CSP, DB2, VSAM, IMS, JCL and EASYTRIEVE PLUS 6.2, PL/SQL, Web: Oracle WebServer, Java 1.3.1, Servlets 2.1, Java Beans, JBuilder 3.5, JDBC 2.0, JSP 1.0, HTML and DHTML, XML, VB Script, Java Script, Oracle 9i, UNIX, WIN NT
Software Engineer
Indus Computers Pvt Ltd, India
01.1996 - 12.1997
The conversion process included impact analysis, conversion planning, source code conversion, testing, and delivery
Received all source code from the client and prepared the inventory and analysis needed to produce the various requisite reports
Involved in designing and developing software for issuing purchase orders to vendors, maintaining inventory, opening letters of credit with banks, and analyzing the status of vendors and letters
Involved in impact analysis for Y2K conversions and documentation of application programs
Technical Environment: VS COBOL-II, CICS, DB2, VSAM and JCL, XPEDITER, MF-Revolve/2000