Technical Architect and Data Engineer with extensive expertise in IBM IIS Toolkit, Informatica, Talend, Azure Synapse, Databricks, ADF, AWS Glue, and Snowflake. Demonstrated success in designing efficient data pipelines and optimizing performance, resulting in improved data accuracy and compliance. Strong analytical capabilities and a collaborative approach align projects with business objectives and regulatory standards.
Overview
14 years of professional experience
IBM DataStage
Azure Synapse
Azure Fabric
Power BI
Git/GitHub
Work History
Technical Architect
The Northern Trust
Chicago, IL
11.2021 - Current
Facilitated data extraction from ORMB into Datamart through Oracle GoldenGate replication.
Engineered scalable architecture solutions for financial services applications.
Led the design and development of ETL processes, enhancing data integration efficiency.
Provided guidance to junior team members on architecture best practices and methodologies.
Performed unit testing of all ETL mappings prior to production deployment.
Implemented advanced data cleansing techniques to elevate reporting accuracy.
Analyzed large datasets for patterns, informing strategic business decisions.
Configured and managed replication services across clusters to maintain synchronized datasets.
Collaborated with cross-functional teams to define technical requirements.
Evaluated emerging technologies to enhance system performance and security.
Developed integration strategies for third-party systems and tools.
Conducted risk assessments to identify potential system vulnerabilities.
Streamlined development processes through effective documentation and communication.
Developed technical architecture plans and documented them for implementation.
Provided technical support during the development lifecycle from initial concept through deployment.
Provided technical guidance on the design of systems to ensure scalability, reliability, and performance.
Determined scope of projects based on objectives and specifications.
Collaborated with stakeholders to develop proposed architectures and guide implementation.
Maintained knowledge base of best practices related to technical architecture designs.
Identified potential risks associated with proposed solutions and provided mitigation strategies.
Worked with clients to develop and customize technology solutions according to specific business requirements.
Coordinated with teams across multiple departments to ensure successful integration between systems.
Assisted in troubleshooting complex problems related to distributed systems architectures.
Analyzed business processes and identified opportunities for automation or optimization.
Monitored system performance metrics such as uptime, response time, and throughput.
Assessed existing applications and infrastructure to determine areas for improvement.
Designed efficient database structures for storing large amounts of data.
Redesigned and developed architecture, resulting in increased productivity for clients.
Conducted research into emerging technologies to evaluate their potential impact on current systems.
Created detailed diagrams illustrating application, data, and system architectures.
Data Engineer /Senior ETL Consultant
The Northern Trust
Chicago, IL
08.2021 - 11.2021
The FAD project identifies financial reporting solutions for capital expenditures, fixed assets, depreciation and amortization, and project information by integrating existing data elements from multiple systems (financial general ledger, project portfolio management, ServiceNow, and Hyperion planning and forecasting) to support multiple financial reporting scenarios (plan, forecast, and actual).
Responsibilities:
Designed and maintained logical and physical data models using SQL Developer, aligning them with evolving project requirements.
Interpreted business strategies and transformed them into actionable IT and data engineering solutions, addressing complex data challenges.
Leveraged IBM DataStage to build robust ETL workflows for data extraction, transformation, and integration across diverse target systems.
Extracted and transformed data from PeopleSoft (FGL) and PPM systems, delivering curated datasets for Cognos-based reporting.
Led efforts in defect identification and resolution, contributing to data accuracy and stable production deployments.
Delivered production support by monitoring data pipelines through Control-M, ensuring workflow reliability and SLA compliance.
Contributed to high-level architecture design and project planning documentation to align technical solutions with business objectives.
Worked extensively with Snowflake, utilizing tools such as SnowSQL, the Snowflake CLI, and Snowpipe for near-real-time and scheduled data ingestion.
Refactored existing DataStage ETL jobs to support Snowflake’s cloud-native architecture, enhancing performance and scalability.
Built automated data ingestion pipelines using Snowpipe and the COPY command to handle high-volume batch processing.
Enabled secure data sharing across Snowflake accounts, improving inter-system collaboration and data accessibility.
Developed and maintained Snowflake stages (internal/external), embedding transformation logic during data loading to streamline ingestion.
Enhanced Snowflake view performance through architectural redesign and SQL optimization.
Authored stored procedures and sophisticated views to support advanced analytics and business logic within the Snowflake environment.
Applied a deep understanding of RDBMS concepts to write efficient SQL and PL/SQL queries for data processing and troubleshooting.
Participated in seamless database migrations from legacy systems to Snowflake, with minimal disruption to ongoing operations.
Partnered with data scientists, engineers, and domain specialists to refine and optimize data workflows and analytical infrastructure.
Managed multi-cluster Snowflake environments, ensuring balanced workload distribution and consistently high performance across tenants.
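As context for the Snowpipe and COPY work above, a minimal Python sketch of the idempotency idea behind that style of batch ingestion: a load history is tracked per file, so re-delivered files are skipped rather than double-loaded. This is illustrative only, not Snowflake's actual API; all class and file names here are hypothetical.

```python
# Hypothetical sketch of load-history-based idempotent ingestion,
# mimicking how Snowpipe / COPY skip files that were already loaded.
class BatchIngestor:
    def __init__(self):
        self.load_history = set()   # file names already ingested
        self.target_rows = []       # stand-in for the target table

    def copy_into(self, files):
        """Ingest only files not seen before; return names actually loaded."""
        loaded = []
        for name, rows in files.items():
            if name in self.load_history:
                continue            # idempotency: skip re-delivered files
            self.target_rows.extend(rows)
            self.load_history.add(name)
            loaded.append(name)
        return loaded

ingestor = BatchIngestor()
first = ingestor.copy_into({"2021-11-01.csv": [1, 2], "2021-11-02.csv": [3]})
# Re-running with an overlapping batch loads only the genuinely new file.
second = ingestor.copy_into({"2021-11-02.csv": [3], "2021-11-03.csv": [4]})
```

The same dedup-by-file-name behavior is what makes retrying a failed batch safe in high-volume pipelines.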
IBM IIS Admin/Senior ETL Consultant
New York State Department of Motor Vehicles
Albany, NY
04.2020 - 08.2021
The New York State DMV seeks services to transform and modernize the DMV's systems by replacing existing legacy systems and improving business processes and data quality. This initiative focuses on providing better customer service, reducing transaction processing times and in-office wait times, responding to regulatory and legislative requirements more efficiently, enhancing security, reducing system outages, streamlining processes, and increasing self-service capabilities.
To support the growing number of services and various delivery methods, the DMV has incrementally constructed over 200 applications upon the core systems and master files over the past 50 years.
Responsibilities:
Participated in requirement-gathering sessions with both ITS and COTS teams, aligning technical deliverables with business objectives.
Contributed to the development of the project's conceptual architecture and provided input for various planning deliverables, ensuring alignment with project scope and timelines.
Responsible for preparing and submitting Deliverable Expectation Documentation in accordance with the approved RFP (Request for Proposal).
Assisted in compiling and maintaining the Bill of Materials (BOM) for environment provisioning and setup activities.
Led efforts in designing and updating logical and physical data models based on core business processes (e.g., Client, License, Financial, Registration, Ticketing, Title), and System of Record (SOR) structures, referencing target state catalogs provided by ITS/DMV.
Developed a comprehensive Cleanse/Enriched Data Dictionary and published data definitions based on SOR/PK-FK attributes, supporting the COTS development team by leveraging key documents such as MDM mappings, legacy lineage, Master Data Catalog, Compass, and LifeTime Abstract Key documents.
Performed installation and configuration of the IBM IIS Toolkit on both Windows Server 2012 and Red Hat Linux, and documented the full installation process for future operational use.
Resolved critical installation issues, including problems with Data Masking stages and the Operational Console, by executing IBM DataStage administrative functions.
Designed and implemented highly reusable ETL jobs to extract data from legacy systems, handling diverse formats, including 113 VSAM files, multiple MS SQL Server databases, and DB2 schemas, using minimal, centralized job structures.
Built scalable ETL frameworks to ingest data from over 1,000 legacy tables across various file systems into the RAW/Stage layers, ensuring consistent and efficient data load processes.
Enforced data quality rules in response to DMV-submitted legacy issue logs, helping to ensure consistency and accuracy in data transformations.
Conducted in-depth data analysis at the table and attribute levels, mapping business requirements to technical specifications.
Developed reusable ETL processes to move data from RAW/Stage to the Cleanse Layer, integrating type conversions, data quality validations, and error-handling mechanisms.
Created modular ETL workflows for data movement from the Cleanse Tier to the Pre-Publish Tier, isolating only SOR, PK, and FK columns to support ITS team validation and approval workflows prior to final deployment.
Established ETL pipelines to transfer validated data from Pre-Publish to the Publish Tier, again focusing on SOR-aligned attributes, to enable COTS application development.
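The tier promotions above repeatedly isolate only SOR, PK, and FK columns before handing data to the ITS team. A small Python sketch of that projection step, with made-up column names for illustration (the real column sets came from the ITS/DMV target-state catalogs):

```python
# Hypothetical sketch of Cleanse-to-Pre-Publish promotion: each cleansed row
# is projected onto the approved SOR/PK/FK column set; everything else is dropped.
def promote_to_pre_publish(rows, sor_pk_fk_columns):
    """Keep only the approved SOR/PK/FK columns from each cleansed row."""
    return [
        {col: row[col] for col in sor_pk_fk_columns if col in row}
        for row in rows
    ]

cleansed = [
    {"client_id": 1, "license_no": "A123", "audit_note": "fixed zip"},
    {"client_id": 2, "license_no": "B456", "audit_note": "ok"},
]
approved = ["client_id", "license_no"]   # e.g. from a target-state catalog
pre_publish = promote_to_pre_publish(cleansed, approved)
# pre_publish now carries only the approved columns for validation.
```

Keeping the projection in one reusable function is what makes the same workflow serve both the Pre-Publish and Publish tiers.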
ETL Architect
The Northern Trust Bank
Chicago, IL
07.2019 - 03.2020
Financial institutions need a robust Anti-Money Laundering (AML) program to satisfy the Bank Secrecy Act (BSA), the Patriot Act, and other U.S. AML regulations. Institutions that fail to implement adequate anti-money-laundering programs face severe fines, penalties, and reputational risk. The KYC program allows NT to better comply with BSA guidelines and regulatory expectations, and enables risk assessment and due diligence prior to creating and funding accounts for new customers.
This project involves the execution of all development and enhancements to risk and compliance projects (AML, KYC, governance, Basel) as and when requested by business partners.
Responsibilities:
Engaged with business stakeholders to gather and refine functional and technical requirements, ensuring alignment with compliance objectives.
Played a key role in the design and implementation of robust ETL processes to feed data into the Nice Actimize Anti-Money Laundering (AML) platform.
Collaborated with clients and vendors during architectural workshops to support ongoing KYC framework enhancements and scalability planning.
Translated complex business goals and compliance drivers into actionable IT requirements, delivering appropriate, scalable technical solutions.
Produced comprehensive high-level and low-level design documentation, including detailed source-to-target data mappings, to support development efforts.
Developed optimized IBM DataStage 11.3 parallel jobs to perform data extraction, transformation, and loading (ETL) into the target data warehouse environment.
Actively participated in project tracking activities, contributing to weekly and monthly status meetings and supporting the delivery of regulatory reporting milestones.
Supported various testing phases, including System Integration Testing (SIT), User Acceptance Testing (UAT), and regression testing, to ensure solution quality and compliance.
Conducted peer reviews and performed defect root-cause analysis and resolution, maintaining high standards for code quality and performance.
Contributed to Control-M job design, ensuring efficient scheduling, monitoring, and automation of ETL workflows.
ETL Architect
The Northern Trust Bank
Chicago, IL
02.2018 - 07.2019
Northern Trust had undertaken the CMRM (Capital Markets Risk Management) program to replace the existing end-of-life version of IBM Algorithmics with Murex to support market and counterparty credit-risk functions.
The reporting work stream is part of the overall implementation plan to build an enrichment layer (CDW) and a multidimensional data warehouse (MDW) to replace the existing reports.
Responsibilities:
Collaborated closely with business users to gather, clarify, and document functional requirements, enabling accurate technical planning and delivery.
Led the design and development of ETL pipelines to load structured data into the Cognos reporting environment, supporting enterprise-level reporting needs.
Analyzed business drivers and translated strategic goals into scalable IT solutions, aligning technical architecture with organizational objectives.
Managed large-scale data warehousing operations, handling data volumes of over 50 million records per day, ensuring performance and reliability.
Created both high-level and detailed technical designs, including source-to-target data mappings, to guide development activities.
Built and deployed IBM DataStage 11.3 parallel jobs for efficient data extraction, transformation, and loading (ETL) into target data warehouse layers.
Participated in project governance meetings, including weekly progress updates and monthly regulatory reporting reviews, to track milestones and risks.
Supported testing phases including System Integration Testing (SIT), User Acceptance Testing (UAT), and regression testing, ensuring end-to-end solution quality.
Conducted peer code reviews, contributed to issue analysis and resolution, and enforced best practices in job design and data integrity.
Contributed to the design and configuration of Control-M job schedules, facilitating automated and reliable ETL job execution and monitoring.
ETL/E-LT Lead (Onsite)
The Northern Trust Bank
INDIA & MEXICO
02.2014 - 02.2018
Northern Trust is a global leader in delivering innovative investment management, asset and fund administration, and banking solutions to corporations, institutions, and affluent individuals. This project is to implement a credit risk warehouse to comply with the Basel II accord.
Responsibilities:
Utilized IBM DataStage extensively to design and implement ETL jobs for data extraction, transformation, integration, and loading into various target tables and files.
Led the DataStage upgrade initiative from version 8.5 to 11.3, including validation and testing of the UNIX system transition to NFS4, to ensure seamless system integration.
Conducted column-level data lineage analysis by source system, enabling accurate metadata job creation using the Metadata Workbench.
Worked with operational metadata files, validating them to enhance understanding of data flows and dependencies across business processes.
Implemented error-handling mechanisms for lookup failures and database connectivity issues, ensuring reliability and fault tolerance in ETL jobs.
Participated in Oracle database upgrade testing from 10g to 12c, validating schema integrity and ETL compatibility post-upgrade.
Created custom SQL scripts and reconciliation jobs to perform extensive ETL validation and testing, ensuring data accuracy across environments.
Performed file-level data validation by comparing source data against generated outputs, using spooling techniques.
Translated complex business requirements into scalable ETL solutions that aligned with data governance and reporting needs.
Developed ETL frameworks for data archival and historical tracking using Slowly Changing Dimensions (SCD) Type 1 and Type 2 techniques.
Created BCA jobs for executing monthly financial adjustments as part of recurring processing.
Worked on the CCAR ST model process, designing reconciliation jobs to deliver compliance data to the Federal Reserve (FED).
Focused on ETL and database performance optimization, leveraging parallel processing and load balancing to improve data throughput.
Provided production support for DataStage jobs, ensuring timely data availability to business users in alignment with SLA commitments.
Conducted peer reviews and rigorous ETL testing to maintain job quality and consistency across environments.
Offered hands-on support as a Database and DataStage Administrator, contributing to infrastructure, job maintenance, and metadata management.
Developed custom routines in DataStage to meet specialized business logic requirements.
Managed metadata import/export processes to streamline ETL development and promote reusable design practices.
Supported the QA team during test phases, addressing data issues and defects and ensuring overall test coverage.
Monitored job execution and status using Control-M, providing end-to-end visibility for ETL workflows across multiple layers.
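The historical tracking mentioned above relied on Slowly Changing Dimension techniques. A minimal Python sketch of the SCD Type 2 pattern: rather than overwriting a changed attribute (Type 1), a new row version is appended and the prior version is closed out with an end date. Field names and dates here are illustrative, not taken from the actual warehouse.

```python
# Hypothetical sketch of SCD Type 2 versioning with effective-date ranges.
OPEN_END = "9999-12-31"   # sentinel end date marking the current version

def scd2_upsert(dim, key, attrs, load_date):
    """Close the current version of `key` if attrs changed, then append a new one."""
    current = next(
        (r for r in dim if r["key"] == key and r["end_date"] == OPEN_END), None
    )
    if current is not None:
        if current["attrs"] == attrs:
            return dim                    # no change: keep the current version
        current["end_date"] = load_date   # expire the old version
    dim.append({"key": key, "attrs": attrs,
                "start_date": load_date, "end_date": OPEN_END})
    return dim

dim = []
scd2_upsert(dim, "CUST-1", {"rating": "BBB"}, "2016-01-01")
scd2_upsert(dim, "CUST-1", {"rating": "A"}, "2017-06-01")   # rating changed
# dim now holds the expired BBB version and the current A version.
```

The open-ended sentinel date is a common convention; it lets "current row" queries filter on a single equality instead of a null check.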
ETL/E-LT Developer/Production Support (Offshore)
The Northern Trust Bank/Hexaware
Mumbai, INDIA
08.2012 - 01.2014
This is a PeopleSoft EPM reporting project involving application support, maintenance, and migration through ETLs using IBM WebSphere DataStage 7.5 and IBM InfoSphere DataStage 8.5 to load data from multiple source systems into PeopleSoft EPM 8.9.
Responsibilities:
Participated in requirement-gathering meetings with the onsite coordinator and business analyst.
Participated in design-phase meetings with project architects.
Migrated DataStage 7.5 server jobs to the IBM InfoSphere DataStage 8.5 parallel version.
Performed extensive ETL testing by creating SQL scripts and reconciliation jobs.
Tested and peer-reviewed ETL jobs.
Prepared unit and system test case documents.
Prepared the UAT and production migration plans.
Performed defect analysis and defect fixing.
Version-controlled ETL jobs during migration using the TFS tool.
Contributed to Control-M schedule design.
ETL Developer
Sears Holdings Corporation
Pune, INDIA
04.2012 - 08.2012
The customer operates Shop Your Way, a leading integrated social shopping experience where members can earn points and receive benefits across a wide variety of physical and digital formats through ShopYourWay.com.
Responsibilities:
Understood business requirements and prepared ETL specification documents.
Developed jobs to load data from legacy sources into target tables and files.
Developed jobs using various processing stages, such as Transformer, Copy, Filter, Join, Lookup, Aggregator, Funnel, and Sort, along with file and database stages.
ETL Development, Maintenance
Fifth Third Bank/Wipro
Pune, INDIA
08.2011 - 04.2012
The customer is a diversified financial services company with 15 affiliates and 1,135 banking centers. The goal is to manage money for individuals and institutions by offering numerous solutions that help customers achieve their investment and wealth management goals.
Responsibilities:
Understood existing systems, suggested improvements, and created new systems using MS BI tools.
Analyzed user needs and requirements to determine the feasibility of designs within time and cost constraints.
Created multi-dimensional databases using Oracle Express and Oracle Sales Analyzer cube slices.
Provided demonstrations and training on market stands for CPG producers and retailers, using Excel scorecards, Oracle Sales Analyzer reports, and Symphony RPM Analytics.
Created complex SQL stored procedures and SSIS packages to extract and transform EPOS data.
Worked on trackers for maintenance activities, covering analysis, design, development, testing, and release.
Reviewed work products developed by peers.
Education
Bachelor of Technology - Computer Science Engineering
JNTU
AP, India
01.2009
Skills
ETL tools: IBM IIS Toolkit, Talend, and Apache Kafka
Cloud platforms: MS Azure Synapse, Databricks, AWS Glue, and Snowflake
Version control: Git, GitHub, Microsoft TFS, and PVCS