IT leader involved in the entire life cycle of software application development, data delivery and processing, software and platform architecture, and team and platform management.
16+ years of experience in Data Warehousing, Data Modeling, Data Integration, Data Migration, and Business Intelligence for Health Care, BFSI (Banking, Financial Services & Insurance), Telecom, Energy & Utilities, and Retail.
Experience working with Teradata, MySQL, PostgreSQL, and SingleStore (MemSQL) databases across Linux and Windows operating systems.
Played a key role in the end-to-end implementation of Data Warehousing projects and led multiple successful database migration and upgrade projects to the latest versions.
Experienced in configuring SingleStore (MemSQL) pipelines to transfer data securely from GCP to on-premises SingleStore environments.
On Google Cloud Platform (GCP), experienced with Cloud SQL (MySQL), integrating and managing it effectively within hybrid cloud environments.
Involved in various projects covering Data Modeling, System/Data Analysis, Design, and Development for both OLTP and Data Warehousing environments; assists with the development of data and process improvement plans.
Successfully managed critical production MySQL, SingleStore, Teradata, and PostgreSQL databases in 24/7 environments across platforms including Windows, Solaris, and Linux.
Works autonomously, applying judgment and decision making when monitoring workflow and handling new data requests, changes, and deletions; manages business priorities while driving process improvements and data quality enhancements.
Understands data sources, data structures, data capture, and reporting systems in TIAA Asset Management; develops and implements data mapping, integration, and consolidation approaches, methodologies, and tools.
Experience in database administration, design, development, maintenance, and production support of relational databases and business applications, including new server setup and MySQL Server installation, upgrade, and migration.
Experienced in database optimization and in developing stored procedures, triggers, cursors, joins, views, and SQL on MySQL and MemSQL.
Experienced in performance tuning, query optimization, client/server connectivity, and database consistency checks using various utilities; expertise in snapshots, import/export, and database optimization with the help of explain plans.
Expert in writing shell and Perl scripts for Linux/Unix.
Good understanding of views, synonyms, indexes, joins, and partitioning.
Experienced with Informatica Server and Repository Server Manager, and in ETL testing using DataStage/Informatica and Teradata.
Experience in Teradata development and OLAP operations using Teradata 12, 13, 14, and 15, with SQL, PL/SQL, SQL*Plus, SQL*Loader, TOAD, MySQL, MS SQL Server, Oracle, and DB2.
Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump).
Broad experience in Teradata/ETL performance tuning, including effective use of database objects, SQL Trace, Explain Plan, different types of optimizers, hints, indexes, table partitions, sub-partitions, materialized views, global temporary tables, and DataStage transformations and workflows.
Sound knowledge of Data Warehousing concepts, E-R modeling (3NF), and dimensional modeling (Star Schema, Snowflake Schema); database architecture for OLTP and OLAP applications; data analysis; and ETL processes.
Overview
19 years of professional experience
1 Certification
Work History
Senior Database Engineer
American Express
07.2017 - Current
Set up, administer, and support MySQL, MemSQL, and Jethro database servers for Production, QA, and Development.
Install, configure, and maintain MySQL InnoDB Cluster environments.
Manage cluster topology, add/remove nodes, and handle scaling operations.
Optimize cluster parameters for performance and consistency.
Perform full and incremental MySQL Enterprise Backup (MEB) backups of cluster data to ensure data integrity.
Identify and resolve cluster-related issues such as replication conflicts, network problems, and node failures.
Proficient with MySQL cluster management commands and configuration options (see the cluster health-check sketch at the end of this section).
Upgraded MySQL databases to version 8.0.27 and MemSQL databases to version 8.9.
Effectively configured MySQL replication as part of the HA solution.
Migrated MemSQL database management from memsql-ops to memsql-toolbox.
Perform cluster schema and partitioning design, data migration, and pipeline configuration on MemSQL databases.
Performed MySQL replication setup and administration in master-slave and master-master topologies; administered large-scale MySQL installations, including backup and recovery strategies (a replication setup sketch appears at the end of this section).
Troubleshoot and fine-tune database performance and concurrency.
Monitor MemSQL databases using Splunk.
Monitor databases for performance, bottlenecks, and other issues, and identify and deploy solutions. Perform appropriate backup, restoration, and upgrades of database servers.
Design and create database objects such as table structures, stored procedures, views, triggers, and reports; perform database administration, configuration, maintenance, and support.
Implement and maintain security and integrity controls including backup and disaster recovery strategies for the document management system and MySQL databases.
Help translate business requirements into technical design specifications and prepare high-level documentation for the database design and database objects.
Create alerts, notifications, operators, and database mail configuration for system errors, insufficient resources, and fatal database errors.
Perform database partitioning, replication, and migration of MySQL, MemSQL, and Jethro databases (see the partitioning sketch at the end of this section).
Develop, implement, and maintain change control and testing processes for modifications to databases.
Index and implement auto cubes in Jethro databases to deliver fast performance for all BI queries, from highly selective to inherently aggregated.
Ensure all database systems meet business and performance requirements.
Automated data ingestion from GCS into on-prem SingleStore (MemSQL) using pipelines and scripts, enabling near real-time analytics and seamless cloud-to-on-prem integration (a pipeline sketch appears at the end of this section).
Managed and administered Cloud SQL MySQL and PostgreSQL databases on Google Cloud Platform (GCP), including provisioning, configuration, performance tuning, backup and recovery, and implementing high availability and security best practices.
Managed IAM role bindings for service accounts in Google Cloud Platform (GCP) to enforce least-privilege access, enabling secure and controlled access to GCP resources such as Cloud SQL, GCS, and BigQuery.
Worked on GRF capability applications across credit and fraud risk management, technology systems, global credit administration, compliance, and operational risk to achieve risk management goals. These are real-time, automated applications running in a high-availability environment, processing requests from various real-time data feeds, flat files, and worksheets. The project also involved creating a new data-driven architecture to develop and maintain capabilities with minimal time and effort and to provide high transparency to business partners.
Designed an internal portal to help application developers configure their applications for the new architecture, using agile methodologies; responsible for the portal's deployment and maintenance and for production support of GRF applications.
Worked on a project entitled RRD (Regulatory Report Delivery) which serves as the base for Oracle Financial Services Analytical Applications (OFSAA).
Administration and management of the entire development, QA and production environment.
Installed and configured MySQL on Linux and Windows environments.
Managed and troubleshot MySQL 5.0.22 and 5.1.24 in production and development environments on both Linux and Mac OS X.
Performed installation, new database design, configuration, backup, recovery, security, upgrades, schema changes, tuning, and data integrity checks.
Increased database performance through MySQL configuration changes, multiple instances, and hardware upgrades.
Assisted with sizing, query optimization, buffer tuning, backup and recovery, installations, upgrades, and security, among other administration functions, as part of the profiling plan.
Ensured production data was replicated into the data warehouse without any data anomalies from the processing databases.
Worked with the engineering team to implement new design systems of databases used by the company.
Designed databases for referential integrity and involved in logical design plan.
Performed daily performance tuning to prevent issues and support capacity planning using MySQL Enterprise Monitor.
Developed stored procedures and triggers in MySQL to lower traffic between servers and clients (see the stored procedure sketch at the end of this section).
Proficient in Unix/Linux shell commands.
Created and deleted users and groups, set up restrictive permissions, configured sudo files, etc.
Created data extracts as part of data analysis and exchanged with internal staff.
Tools/Technologies: MySQL, Jethro, MemSQL, Teradata/Informatica 9.1, Oracle, SQL Server Management Studio, TOAD for DB2, PuTTY, Agile
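Cluster health-check sketch: InnoDB Cluster is built on MySQL Group Replication, whose member states are exposed through performance_schema. The minimal SQL below is a generic illustration of the kind of check used when managing cluster topology; it assumes nothing beyond a running MySQL 8.0 cluster node.

    -- List all cluster members; a healthy node reports MEMBER_STATE = 'ONLINE'.
    SELECT MEMBER_HOST, MEMBER_PORT, MEMBER_STATE, MEMBER_ROLE
    FROM performance_schema.replication_group_members;

    -- In single-primary mode, confirm which node currently accepts writes.
    SELECT MEMBER_HOST
    FROM performance_schema.replication_group_members
    WHERE MEMBER_ROLE = 'PRIMARY';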
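Replication setup sketch: the master-slave (source-replica) configuration referenced above follows a standard MySQL 8.0 pattern. The host name and replication account below are hypothetical placeholders, not values from the actual environment.

    -- On the source: create a dedicated replication account (hypothetical names).
    CREATE USER 'repl'@'%' IDENTIFIED BY '********';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

    -- On the replica: point at the source with GTID auto-positioning
    -- (MySQL 8.0.23+ syntax), then start and verify replication.
    CHANGE REPLICATION SOURCE TO
        SOURCE_HOST = 'db-source.example.com',
        SOURCE_USER = 'repl',
        SOURCE_PASSWORD = '********',
        SOURCE_AUTO_POSITION = 1;
    START REPLICA;
    SHOW REPLICA STATUS\G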
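Partitioning sketch: the database partitioning work above typically uses range partitioning so that old data can be retired without long-running deletes. The table and columns below are illustrative, assuming a MySQL table with no conflicting unique keys.

    -- Range-partition a large table by year (table and columns are illustrative).
    ALTER TABLE audit_events
    PARTITION BY RANGE (YEAR(event_date)) (
        PARTITION p2021 VALUES LESS THAN (2022),
        PARTITION p2022 VALUES LESS THAN (2023),
        PARTITION p2023 VALUES LESS THAN (2024),
        PARTITION pmax  VALUES LESS THAN MAXVALUE
    );

    -- Retire the oldest slice instantly instead of running a bulk DELETE.
    ALTER TABLE audit_events DROP PARTITION p2021;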
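Pipeline sketch: the GCS-to-SingleStore ingestion described above is built on SingleStore pipelines. The sketch shows the general shape of such a pipeline; the bucket, target table, and HMAC credentials are hypothetical.

    -- Continuously load CSV files from a GCS bucket into an on-prem table.
    -- Bucket, table, and HMAC credentials are hypothetical placeholders.
    CREATE PIPELINE payments_from_gcs AS
    LOAD DATA GCS 'analytics-bucket/payments/*.csv'
    CREDENTIALS '{"access_id": "<hmac_access_id>", "secret_key": "<hmac_secret_key>"}'
    INTO TABLE payments_staging
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';

    START PIPELINE payments_from_gcs;

    -- Monitor batch progress and errors for the pipeline.
    SELECT * FROM information_schema.PIPELINES_BATCHES_SUMMARY
    WHERE PIPELINE_NAME = 'payments_from_gcs';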
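Stored procedure sketch: procedures lower client-server traffic by bundling multi-statement work into a single call. This is a minimal MySQL example with illustrative table and procedure names.

    -- One CALL replaces several client round trips (names are illustrative).
    DELIMITER $$
    CREATE PROCEDURE archive_closed_orders(IN cutoff DATE)
    BEGIN
        INSERT INTO orders_archive
        SELECT * FROM orders
        WHERE status = 'CLOSED' AND closed_on < cutoff;

        DELETE FROM orders
        WHERE status = 'CLOSED' AND closed_on < cutoff;
    END$$
    DELIMITER ;

    CALL archive_closed_orders('2017-01-01');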
Data Steward Analyst/Business Intelligence Engineer
Fifth Third Bank
01.2015 - 07.2017
Lead developer for creating a Payments Data Archive to consolidate (warehouse) customer payment transactions into a central data archive, addressing the gap of no central, long-term archive of payment details.
Led the Data team through requirements, design, and development for a complete overhaul of the Business Intelligence system.
Provided support to implement Data Warehouse/BI solutions, drawing on an extensive portfolio of experience and best practices.
Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions.
Developed a conceptual model using Erwin based on requirements analysis.
Developed normalized logical and physical database models to design an OLTP system for insurance applications.
Created a dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.1.
Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model (a DDL sketch appears at the end of this section).
Worked with Database Administrators, Business Analysts, and Content Developers to conduct design reviews and validate the developed models.
Identified, formulated, and documented detailed business rules and use cases based on requirements analysis.
Facilitated development, testing, and maintenance of quality guidelines and procedures, along with the necessary documentation.
Responsible for defining the naming standards for the data warehouse.
Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
Exhaustively collected business and technical metadata and maintained naming standards.
Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity relationships, and elicit more information.
Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and DataStage.
Created batch stored procedures for the Report Scheduler to run on monthly, weekly, or daily schedules.
Generated Tableau dashboards with quick/context/global filters, parameters, and calculated fields in Tableau (7.x/8.x) reports.
Helped configure the physical, logical, and presentation layers as per requirements.
Worked with business users to develop subject areas and metadata reporting, such as schemas, required hierarchies, and data sources.
Worked with the ETL team on loading tables from legacy data using ETL tools.
Prepared design document (Source to Target) for the ETL load process.
Reviewed the DataStage mappings to check the proper implementation of business rules, and performed load testing for final deployment in production.
Responsible for system design concerning data integration and preparation of Technical Design Document (TDD).
ETL Development for Control Architecture, Common Modules, Sequence Controls and major critical interfaces.
The objective of the project was to provide visibility into Direct Spend (Material Procurement: raw materials + finished goods) by supplier and material for the MD&D Supply Chain business team.
Used the Cognos Ad-hoc Reporting tool - Query Studio for the reporting needs to answer the business questions at high level.
Seven scanned reports are in the current scope of the Project 20/20 execution.
Reviewed and maintained Informatica mappings to check the proper implementation of business rules, performed load testing for the mappings, and deployed them into production.
Responsible for system design concerning data integration and preparation of System Design Document (SDD).
Reviewed deliverables, including technical documents, unit testing, and unit testing logs, and performed load testing for Informatica mappings.
Designed and developed dashboards using Tableau for various ROI metrics.
Automated monthly Excel reports into Tableau workbooks and dashboards.
Designed dashboards using filters, parameters, and action filters.
Produced standard monthly reports using Tableau.
Worked extensively with Teradata SQL Assistant 6.2 to interface with the client's database.
Used Teradata Data Mover to copy data and objects, such as tables and statistics, from one system to another, including copying indexes and global temporary tables as part of overall data management.
Used BTEQ scripts to batch-load large data sets into the existing database (see the BTEQ sketch at the end of this section).
Executed monthly, weekly, and daily reports in UNIX and automated them using shell scripts.
Unit tested the developed ETL scripts, created test SQLs, and handled UAT issues.
Reverse engineered application specific name-value pair database into a query friendly normalized data model to support reports.
Responsible for designing and building the data mart as per the requirements.
Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN; responsible for collecting statistics on FACT tables (see the statistics/EXPLAIN sketch at the end of this section).
Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
Worked extensively on views, stored procedures, triggers, and SQL queries for loading data into staging, to enhance and maintain the existing functionality.
Analyzed sources, requirements, and existing OLTP functions, and identified the required dimensions and facts from the database.
Created common reusable objects for the ETL team and oversaw coding standards.
Reviewed high-level design specification, ETL coding and mapping standards.
Designed new database tables to meet business information needs. Designed the mapping document, which serves as a guideline for ETL coding.
Used ETL to extract files for the external vendors and coordinated that effort.
Migrated mappings from Development to Testing and from Testing to Production.
Extensively used the MultiLoad, FastLoad, and FastExport Teradata utilities in both ETL and UNIX scripts for high-volume flat files (a FastLoad sketch appears at the end of this section).
Worked with the business and application development teams to ensure the database infrastructure meets/exceeds business expectations.
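DDL sketch: forward engineering turns the dimensional model into physical tables. The minimal star-schema example below uses illustrative names, not the actual project schema.

    -- One dimension plus one fact; the fact references the dimension
    -- by surrogate key (all names are illustrative).
    CREATE TABLE dim_customer (
        customer_key  INTEGER      NOT NULL PRIMARY KEY,
        customer_id   VARCHAR(20)  NOT NULL,
        customer_name VARCHAR(100),
        segment       VARCHAR(30)
    );

    CREATE TABLE fact_payment (
        payment_key   INTEGER       NOT NULL PRIMARY KEY,
        customer_key  INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        payment_dt    DATE          NOT NULL,
        amount        DECIMAL(18,2) NOT NULL
    );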
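BTEQ sketch: batch loads in BTEQ follow the standard .IMPORT/.REPEAT pattern. The system name, account, input file, and staging table below are hypothetical.

    /* Hypothetical system, account, input file, and staging table. */
    .LOGON tdprod/etl_user,********
    .IMPORT VARTEXT ',' FILE = /data/stage/payments.txt
    .QUIET ON
    .REPEAT *
    USING (payment_id VARCHAR(18),
           amount     VARCHAR(18))
    INSERT INTO stg.payments (payment_id, amount)
    VALUES (:payment_id, :amount);
    .QUIT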
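Statistics/EXPLAIN sketch: collecting statistics and reading the optimizer plan are the core of Teradata query tuning. Table and column names below are illustrative, not from the actual project.

    -- Refresh optimizer statistics on the fact table's join/filter columns
    -- (names are illustrative).
    COLLECT STATISTICS ON fact_payment COLUMN (payment_dt);
    COLLECT STATISTICS ON fact_payment COLUMN (customer_key);

    -- Inspect the optimizer's plan (and its confidence levels) before tuning.
    EXPLAIN
    SELECT c.segment, SUM(f.amount)
    FROM fact_payment f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    WHERE f.payment_dt >= DATE '2016-01-01'
    GROUP BY c.segment;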
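FastLoad sketch: FastLoad scripts for high-volume flat files follow the utility's standard shape, assuming an empty target table. The system, account, file, and staging table below are hypothetical.

    /* Hypothetical TDPID, account, input file, and (empty) staging table. */
    SESSIONS 4;
    LOGON tdprod/etl_user,********;
    SET RECORD VARTEXT ",";
    DEFINE payment_id (VARCHAR(18)),
           amount     (VARCHAR(18))
    FILE = /data/stage/payments.csv;
    BEGIN LOADING stg.payments
        ERRORFILES stg.payments_err1, stg.payments_err2;
    INSERT INTO stg.payments (payment_id, amount)
    VALUES (:payment_id, :amount);
    END LOADING;
    LOGOFF;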
Royal Mail Group - United Kingdom (Public Services)
08.2010 - 08.2011
Prepared test scripts to perform load testing on the application with various user loads.
Designed and developed jobs for extracting, transforming, integrating, and loading data into the data mart using DataStage Designer; used DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements.
Developed user-defined routines and transformations to implement complex business logic.
Extensively used Shared Containers and Job Sequencer to make complex jobs simple and to run the jobs in sequence.
Involved in the preparation of ETL documentation, following the business rules, procedures, and naming conventions.
Created a mapping sheet from the given data model.
Identified the performance bottlenecks based on the table used and the volume of data.
Created high-level design (HLD) and low-level design (LLD) documents for job creation.
Created UNIX scripts to maintain and update the status table.
Managed the project delivery & client communication.
SFR – France (Communication and Telecommunications)
01.2009 - 07.2010
Solved various production issues with high severity levels.
Wrote various Teradata and UNIX scripts for maintenance-related issues.
Worked on syncing databases across two different servers using the ARCMAIN utility and scheduler scripts.
Designed the ETL processes using Teradata and BTEQ scripts to load data from base tables to stage, and from stage to the data mart.
Used Teradata utilities such as FastLoad and MultiLoad to load data into base tables.
Developed mapping to load the data into staging tables and worktables.
Technologies Used: DataStage (ETL), UNIX, SQL, Oracle, Teradata, and PVCS GCL
ETL (Datastage) Developer
Travelers Insurance – US
01.2007 - 01.2009
Engaged in the code review and test plan review processes, and reviewed documentation.
Provided data models and data maps (extract, transform and load analysis) of the data marts for systems
Involved in extracting, cleansing, transforming, integrating, and loading data into the data warehouse using DataStage Designer.
Used the DataStage Director and its runtime engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).
Coordinated with team members in resolving issues.
Developed shell scripts to automate file manipulation and data loading procedures.