Bijit Ghosh

Dallas, TX

Summary

Personable and analytical, with a knack for problem-solving and data-driven insights. Proficient in data modeling and ETL processes, leveraging SQL and Python to manage and optimize data pipelines. Committed to delivering scalable data solutions that drive business performance.

Overview

11 years of professional experience
1 certification

Work History

Lead Data Engineer

Virtusa Consulting Services
01.2022 - Current
  • M&T Bank – Early Warning Indication and Reporting (Banking), July 2024 – Present
  • Role: Lead ETL Developer
  • Technology: Informatica PowerCenter, Informatica ICS, Snowflake, Oracle, Teradata 17.x, Microsoft SQL Server 2012, Salesforce (nCino), JIRA, Agile, GitLab, Workload Automation
  • Responsibilities:
  • Design Informatica workflows and task flows to extract data from the enterprise data warehouse and transform it for a third-party application that detects money laundering (AML) and analyzes customer data (KYC) in the lending process
  • Collaborate effectively with various business units and project teams to comprehend requirements and translate them into precise technical specifications
  • Fulfill ad hoc data requests using SQL, Toad, Power BI and other available tools
  • Align business objectives with technical requirements through close collaboration with product owners, providing insights based on available data sources
  • Identify areas for improvement within existing ETL frameworks to maximize efficiency without sacrificing the accuracy or integrity of transformed data
  • Provide analytical support for resolving data quality issues and redundancy
  • Leverage database knowledge to fine-tune ETL workflows and reporting queries, minimizing resource utilization in a distributed environment
  • Migrate ETL workflows from Informatica PowerCenter to Informatica ICS
  • Write shell scripts for file pre- and post-processing, automated error logging, and execution of ETL jobs
  • Perform unit testing and collaborate with other application teams for end-to-end integration testing
  • Provide 24/7 support on a weekly rotational basis, including monitoring jobs, conducting root cause analysis, resolving issues, and maintaining transparent communication with customers regarding expected ETAs and resolutions
  • Troubleshoot and resolve production defects and bugs for ETL Jobs and workflows
  • Collaborate with an offshore team to drive efficient development work
  • Identify major gaps, issues, and new product or capability implementations, and work with the Solution Lead/Solution Director and Technical Solution Architect on resolution
  • Fiserv – Enterprise Service Framework (ESF) Integration (FinTech), June 2023 – June 2024
  • Role: Senior ETL Developer
  • Technology: Informatica PowerCenter, Oracle, UNIX shell script, Postman, Microsoft SQL Server 2012, MS Visio, REST API Integration, JIRA, Agile
  • Description: The Enterprise Service Framework (ESF) Integration project involves designing customized applications for Fiserv clients. The ETL development team collaborates closely with business partners, internal project managers, and development resources to identify business needs and create technical solutions for the applications. As a Senior ETL Developer, the role involves data analysis, providing valuable business insights, and offering data-driven strategy recommendations for various products
  • Responsibilities:
  • Build extracts, data loads, and jobs for seamless data transformation using Informatica PowerCenter, REST API integration, and Informatica ICS
  • Collaborate effectively with various business units and project teams to comprehend requirements and translate them into precise technical specifications
  • Design and develop application architecture, offering estimations, designing data security layers, and crafting comprehensive design documents
  • Fulfill ad hoc data requests using SQL, Toad, Power BI and other available tools
  • Apply shell scripting skills to wrap Informatica jobs so they can run on schedulers
  • Provide 24/7 support on a weekly rotational basis, including monitoring jobs, conducting root cause analysis, resolving issues, and maintaining transparent communication with customers regarding expected ETAs and resolutions
  • Troubleshoot and resolve production defects and bugs for ETL Jobs and workflows
  • Collaborate with an offshore team to drive efficient development work
  • Leverage knowledge of schedulers (CA7, AutoSys, or Control-M) to orchestrate seamless data operations
  • Identify major gaps, issues, and new product or capability implementations, and work with the Solution Lead/Solution Director and Technical Solution Architect on resolution
  • Reduced latency in real-time analytics applications by optimizing query performance through indexing strategies and sound database design (see the query-tuning sketch after this list)
  • Established strong working relationships with stakeholders across multiple departments, facilitating clear communication regarding project requirements and progress updates
  • Support delivery process as SME and primary point of contact for the ETL solution during delivery test and deployment phases
  • Identify opportunities for new products and technical components to add value, in terms of both financial performance and client experience
  • Maintain documentation for ETL process flows, source-target mappings, job scheduling manuals, latency reporting, etc.
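
An illustrative sketch of the kind of index-driven query tuning referenced above; the table, column, and index names are hypothetical, not taken from the actual engagements:

    -- Hypothetical reporting query: daily transaction totals per customer.
    -- A composite index covering the filter and join columns lets the
    -- optimizer avoid a full scan of the transactions table.
    CREATE INDEX idx_txn_customer_date
        ON transactions (customer_id, txn_date);

    SELECT c.customer_id,
           c.customer_name,
           SUM(t.amount) AS daily_total
    FROM   customers    c
    JOIN   transactions t
           ON t.customer_id = c.customer_id
    WHERE  t.txn_date = DATE '2024-07-01'
    GROUP BY c.customer_id, c.customer_name;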

Senior Consultant – Data Integration Lead

M&T Bank
06.2022 - 06.2023
  • M&T Bank – Commercial Finance Risk Regulatory (Banking), June 2022 – June 2023
  • Technology: Informatica PowerCenter, Oracle, Teradata, MSSQL, Salesforce Integration, nCino data extraction, MS Excel, CA7
  • Description: Develop data pipelines and extract-transform-load solutions for regulatory reporting and business intelligence support in commercial banking, covering data quality assessments and strategies, data storage and retention requirements, analytics, and data integration support. Perform complex data review, research, and reconciliation (see the sketch after this list); interpret results and present findings to influence strategic decisions within the department or division
  • Responsibilities:
  • Play a lead role in refining data requirements, identifying data sources, designing and validating data objects, and enhancing the commercial loan product within the banking ecosystem
  • Assist with third-party data user coordination to ensure data is used according to user contracts
  • Design data integration pipelines using Informatica, SQL, and Salesforce integration
  • Acquire data, design database structures, and create output for analytical use cases
  • Develop a thorough understanding of the business and its functions, processes, and operations
  • Apply performance optimization techniques to improve data extraction, transformation, and load jobs, as well as reporting queries
  • Analyze substantial amounts of data and information to provide meaningful insights and professionally communicate those insights to management
  • Drive continuous improvement of existing processes, develop new processes, or enhance existing processes where required including maintenance plans, procedural documentation, and custom tools for automation
  • Identify resources that can be utilized to support business operations or improve existing business processes
  • Provide input and recommendations to management
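
A minimal sketch of the kind of source-to-target reconciliation query used in such data reviews; all table and column names are hypothetical:

    -- Hypothetical reconciliation: compare row counts and amount totals
    -- between staging and warehouse tables to flag load discrepancies.
    SELECT s.load_date, s.src_rows, t.tgt_rows, s.src_amount, t.tgt_amount
    FROM  (SELECT load_date, COUNT(*) AS src_rows, SUM(amount) AS src_amount
           FROM   stg_commercial_loans
           GROUP BY load_date) s
    FULL OUTER JOIN
          (SELECT load_date, COUNT(*) AS tgt_rows, SUM(amount) AS tgt_amount
           FROM   dw_commercial_loans
           GROUP BY load_date) t
      ON s.load_date = t.load_date
    WHERE s.src_rows   <> t.tgt_rows
       OR s.src_amount <> t.tgt_amount
       OR s.load_date IS NULL   -- date present only in target
       OR t.load_date IS NULL;  -- date present only in source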

Data Engineer

Oncor Electric
07.2015 - 06.2022
  • Oncor Electric – AMS Outage Reporting (Energy & Utility), July 2015 – June 2022
  • Technology: Informatica PowerCenter, Informatica ICS, Informatica Power-Exchange CDC, Ab initio, Oracle, MSSQL, Informix, FME (Feature Manipulation Engine), Jenkins, JIRA, Workload Automation, WinSCP, Git, Bitbucket
  • Project: Oncor Electric LLC
  • Responsibilities:
  • Improve performance of data replication workflows from transactional systems, files, and other relational databases to a central repository, and integrate the data warehouse with downstream applications
  • Install and configure Informatica PowerExchange Logger/Listener for multiple source application systems
  • Develop and implement Informatica SCD (Slowly Changing Dimension) logic to store premise and meter combination data (a simplified SCD sketch follows this list)
  • Analyze requirements from users and the Data Management team, and create and review ETL specifications
  • Review modifications to existing data flows to improve efficiency and performance, and examine new application designs, recommending corrections where required
  • Worked with complex mappings using transformations such as Expression, Router, Lookup, Filter, Joiner, Source Qualifier, Stored Procedure, and Aggregator
  • Migrate workflows from the on-prem system to Informatica ICS
  • Identify root causes of ETL job failures, troubleshoot complex problems, and rerun processes to avoid issues affecting customers and market transactions
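
A simplified sketch of the Slowly Changing Dimension (Type 2) logic described above, expressed in plain SQL rather than Informatica mappings; the premise/meter table and columns are hypothetical:

    -- Hypothetical SCD Type 2 load for premise/meter combinations.
    -- Step 1: expire the current row when the incoming record differs.
    UPDATE dim_premise_meter d
    SET    d.effective_end = CURRENT_DATE,
           d.is_current    = 'N'
    WHERE  d.is_current = 'Y'
      AND EXISTS (
            SELECT 1
            FROM   stg_premise_meter s
            WHERE  s.premise_id   = d.premise_id
              AND  s.meter_id     = d.meter_id
              AND  s.meter_status <> d.meter_status);

    -- Step 2: insert a new current row for new or changed combinations.
    INSERT INTO dim_premise_meter
           (premise_id, meter_id, meter_status, effective_start,
            effective_end, is_current)
    SELECT s.premise_id, s.meter_id, s.meter_status, CURRENT_DATE,
           DATE '9999-12-31', 'Y'
    FROM   stg_premise_meter s
    WHERE  NOT EXISTS (
            SELECT 1
            FROM   dim_premise_meter d
            WHERE  d.premise_id = s.premise_id
              AND  d.meter_id   = s.meter_id
              AND  d.is_current = 'Y');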

Data Specialist – ETL Informatica

American Express
04.2014 - 07.2015
  • Company Overview: The American Express Company is a multinational financial services corporation
  • American Express – Basel & Regulatory Reporting (Banking & Finance), April 2014 – July 2015
  • Technology: Informatica PowerCenter, Putty, WinSCP, Citrix, TDM, UC4 Automic, Control-M, SQL Developer, Notepad++, UNIX shell scripting
  • Description: The project covered Basel II and the upgrade from Basel II to the Basel III standard, regulatory reporting (Y9C, Y9LP, Y14M, 2052A), and financial instruments (IFRS 9 & CECL), meeting data criteria and preparation per federal reporting requirements
  • Responsibilities:
  • Worked exclusively on ETL for Level 1 as-is staging, Level 2 staging, and DW loads based on business rules and transformation specifications (a simplified staging-flow sketch follows this list)
  • Analyzed user requirements and created and reviewed specifications for the ETL processes to be developed
  • Created complex ETL mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, connected and unconnected Lookups, Filter, Sequence, Router, and Update Strategy
  • Designed, developed, and unit-tested the ETL process for loading bank customer records across the STAGE, LOAD, and ACCESS schemas
  • Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings
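
A minimal sketch of the staged load pattern referenced above (Level 1 as-is staging, rule-driven Level 2 staging, then the warehouse load); schema, table, and column names are hypothetical:

    -- Level 1: land source data as-is, with no transformations.
    INSERT INTO stage.customer_l1 (cust_id, cust_name, balance, load_ts)
    SELECT cust_id, cust_name, balance, CURRENT_TIMESTAMP
    FROM   src.customer_extract;

    -- Level 2: apply business rules (trim names, reject null keys,
    -- round amounts) before the warehouse load.
    INSERT INTO stage.customer_l2 (cust_id, cust_name, balance_usd, load_ts)
    SELECT cust_id,
           TRIM(cust_name),
           ROUND(balance, 2),
           load_ts
    FROM   stage.customer_l1
    WHERE  cust_id IS NOT NULL;

    -- DW load: merge the cleansed records into the access-layer table.
    MERGE INTO access.dim_customer d
    USING stage.customer_l2 s
       ON (d.cust_id = s.cust_id)
    WHEN MATCHED THEN
      UPDATE SET d.cust_name = s.cust_name, d.balance_usd = s.balance_usd
    WHEN NOT MATCHED THEN
      INSERT (cust_id, cust_name, balance_usd)
      VALUES (s.cust_id, s.cust_name, s.balance_usd);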

Education

Bachelor of Technology - Computer Science and Engineering

West Bengal University of Technology
Kolkata, India
01.2013

Diploma - Computer Science and Technology

West Bengal State Council of Technical Education
West Bengal, India
01.2010

Skills

  • Process improvement and modernization
  • ETL development
  • Relational databases
  • Data warehousing
  • SQL programming
  • Data migration
  • Data pipeline design
  • Data extraction process consultation
  • Business process reengineering
  • Problem solving
  • Communication
  • Leadership

Certification

  • Microsoft Azure Fundamentals Certification
  • AWS Certified Cloud Practitioner
  • Informatica Cloud Data Integration for PowerCenter Developers
  • Agile Advocate – IBM
  • Energy & Utilities Industry Foundation - IBM
