Results-oriented IT professional with over 15 years of experience in diverse roles across industry verticals, including Banking, Insurance, Healthcare, and Telecom.
Data Warehousing Expertise: Proven track record in successfully implementing data warehouse solutions for prominent Insurance clients, overseeing design, development, and project management aspects to optimize data storage, ensure data integrity, and implement efficient retrieval systems.
ETL Development and Architecture: Highly skilled ETL Architect and Informatica Developer with extensive experience in designing and implementing data warehousing solutions. Proficient in Informatica PowerCenter, IICS, SQL, and data modeling, with a strong focus on optimizing ETL workflows and improving system performance.
Data Engineering and Integration: Seasoned Senior Data Engineer with expertise in developing, testing, and maintaining robust data architectures. Proficient in utilizing databases, flat files, XML, and cloud technologies (AWS S3, Redshift) for complex data transformations and integrations.
API and Real-Time Integration: Experienced in API-based and real-time integration workflows, with a solid understanding of REST API endpoints. Demonstrated ability in building application and service connectors to facilitate seamless data integration.
Project Management and Collaboration: Accomplished project manager adept at leading ETL projects in multi-vendor environments, with a strong capability to translate business requirements into technical specifications. Known for effective communication and collaboration with stakeholders to align data solutions with business objectives.
Technical Proficiency: Strong command of Unix, Python, and Shell scripting for automation purposes. Results-driven with a hands-on approach to technical coding, able to tackle projects from foundational to advanced levels.
Interpersonal Skills: Highly motivated self-starter with exceptional communication and interpersonal relations, recognized as a valuable team player who contributes to achieving shared objectives.
Overview
16 years of professional experience
Work History
Data Engineer
Compunnel Software Group Inc. [Client - US Bank]
Hopkins, United States
02.2023 - Current
Executed data migration initiatives following Union Bank acquisition by USBank
Partnered with Union Bank's team to assess data sources and file systems
Integrated Informatica workflows into USBank ETL servers
Validated migrated workflows for accuracy using newly created Autosys jobs.
Executed comprehensive dry runs for workflows following the project timeline
Established an audit framework for session tracking.
Produced detailed design documentation to enhance understanding
Migrated code across higher environments, ensuring stability
Engineered data workflows using PySpark for efficient processing
Built an integrated processing system with Python and PySpark
Executed advanced preprocessing of source files for Python framework compatibility
Acted as Subject Matter Expert for offshore team, directing ETL configuration development
Refined SSIS packages to support database migrations
Generated JSON files to facilitate CSV data integration with Cassandra
Performed thorough unit testing to maintain data integrity post-migration
Designed data integration solutions alongside data architects and business analysts
Engaged in User Acceptance Testing (UAT) to ensure migrated processes meet functional standards
Enhanced query efficiency by indexing database tables
Developed and implemented data models, database designs, data access routines, and table maintenance code.
Analyzed user requirements, designed and developed ETL processes to load enterprise data into the Data Warehouse.
Created stored procedures for automating periodic tasks in SQL Server.
Configured SSIS packages for scheduled data loads from source systems to target tables.
Documented data architecture designs and changes, ensuring knowledge transfer and system maintainability.
Participated in agile development processes, contributing to sprint planning, stand-ups, and reviews to ensure timely delivery of data projects.
Optimized SQL queries and database schemas for performance improvements in data retrieval operations.
Provided technical mentorship to junior data engineers, guiding them on best practices and project execution.
Researched and integrated new data technologies and tools to keep the data architecture modern and efficient.
Designed, constructed, and maintained scalable data pipelines for data ingestion, cleaning, and processing using Python and SQL.
Collaborated with cross-functional teams to gather requirements and translate business needs into technical specifications for data solutions.
Managed version control and deployment of data applications using Git, Docker and Jenkins.
Trained non-technical users and answered technical support questions.
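The CSV-to-JSON preparation step described above can be sketched as follows. This is a minimal illustration, not the actual project code; the column names and key field are hypothetical.

```python
import csv
import io
import json

def csv_rows_to_json_docs(csv_text, key_field):
    """Convert CSV rows into JSON documents for a Cassandra-style load.

    Returns one JSON string per row. Rows missing the partition key
    are skipped so a downstream loader never sees an unkeyed record.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    docs = []
    for row in reader:
        if not row.get(key_field):
            continue  # skip rows missing the partition key
        docs.append(json.dumps(row, sort_keys=True))
    return docs

# Hypothetical sample input
sample = "account_id,balance\nA1,100.50\nA2,250.00\n"
docs = csv_rows_to_json_docs(sample, "account_id")
```

In practice the JSON documents would be handed to a Cassandra loader; keeping the transform as a pure function makes the unit testing mentioned above straightforward.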
Data Warehouse Specialist
Smart works [Client - BlueShield Of California]
Minnetonka, United States
06.2022 - 02.2023
Partnered with business users and analysts to gather IT requirements
Performed impact analysis, lab testing, and module development
Ensured design consistency and robustness through daily design reviews
Created and updated Netezza scripts to enhance operational efficiency
Designed and executed robust ETL processes
Developed scalable and efficient database solutions using SQL stored procedures
Participated in design, development, unit testing, and Quality Assurance stages of software projects
Identified quality improvement areas in deployment, testing support, and user acceptance processes
Facilitated testing and production teams by supplying essential tools and expertise
Utilized various flat file and XML formats for data sourcing and targeting
Enhanced data processing efficiency using Netezza's high-performance analytical platform
Constructed Denodo views to establish virtualized data layers
Seamlessly integrated Denodo with existing infrastructures, ensuring system compatibility
Exploited Denodo’s data virtualization features, minimizing data redundancy
Leveraged Netezza's parallel processing to manage vast data volumes efficiently
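The flat-file and XML sourcing described above can be sketched in Python; the element names and record layout here are hypothetical, not the client's actual feed format.

```python
import xml.etree.ElementTree as ET

def xml_to_flat_records(xml_text):
    """Flatten <member> elements from an XML feed into dict records,
    suitable for staging into a warehouse table."""
    root = ET.fromstring(xml_text)
    records = []
    for member in root.iter("member"):
        records.append({
            "id": member.findtext("id", default=""),
            "plan": member.findtext("plan", default=""),
        })
    return records

# Hypothetical sample feed
feed = "<members><member><id>M1</id><plan>PPO</plan></member></members>"
records = xml_to_flat_records(feed)
```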
Data Warehouse Specialist
TCS [Client - Allianz Life Insurance Of NA]
Golden Valley, United States, and Bangalore, India
03.2015 - 06.2022
Spearheaded SAP SuccessFactors Employee Central (SF EC) and SAP Cloud Platform Integration (CPI) projects across multiple regions
Enhanced interoperability in SAP SuccessFactors utilizing SFAPIs
Exhibited strong expertise in SAP CPI, focusing on iFlows and REST protocols.
Utilized Groovy Scripting to optimize functionalities within SAP CPI
Developed comprehensive data mapping rules to enhance system integration
Instituted proactive monitoring procedures to minimize downtime and ensure seamless operations
Enhanced established SAP SuccessFactors integrations utilizing extensive knowledge of SAP APIs and CPI flows
Enhanced response times by monitoring system performance
Optimized SAP CPI build and rollout with pre-packaged solutions
Leveraged SAP CPI monitoring tools for real-time integration insights
Ensured HR systems were updated according to the latest features in SAP SuccessFactors
Contributed to requirement gathering sessions, offering technical insights on ETL solutions.
Engaged in drafting both low-level and high-level designs
Implemented effective data integration techniques to ensure reliability
Worked alongside data architects and business analysts to grasp data needs
Created and managed intricate ETL workflows with Informatica PowerCenter
Aligned Informatica PowerCenter with existing systems to enable streamlined data flow
Led code reviews to maintain high standards in Informatica PowerCenter projects
Created and updated documentation for Informatica PowerCenter mappings, workflows, configurations
Developed comprehensive test cases, ensuring precision in user acceptance testing
Partnered with database and system administrators to optimize deployment of Informatica PowerCenter workflows
Created Unix scripts to automate tasks
Led each phase of software development, from initiation through completion
Developed ETL process to extract, transform and load data from Oracle database into the Data Warehouse using Informatica PowerCenter.
Created mappings, sessions and workflows to move data from source systems to target systems using Informatica PowerCenter.
Tested and validated the loaded data in the target system for accuracy and completeness.
Optimized existing ETL processes by analyzing performance bottlenecks and implementing best practices in Informatica PowerCenter.
Performed unit testing of mappings, sessions and workflows developed in Informatica PowerCenter.
Documented all ETL processes created using Informatica PowerCenter.
Provided production support for all ETL related issues involving Informatica PowerCenter.
Monitored loading jobs on a daily basis to ensure all ETL jobs ran as scheduled.
Designed complex transformations using various transformation objects like Joiner, Lookup, Aggregator, Sorter in Informatica PowerCenter.
Identified data quality issues and worked with business users to resolve them in a timely manner.
Implemented error handling logic for failed ETL loads using error log tables in Oracle Database or flat files generated by Informatica PowerCenter.
Developed shell scripts to schedule the execution of ETL jobs through Cron scheduler on UNIX platform.
Automated the scheduling of multiple concurrent tasks within the application environment using the Workflow Manager feature of Informatica PowerCenter 8.x and 9.x.
Configured session parameters, such as commit intervals and buffer sizes, based on the volume of data processed through each mapping or workflow.
Troubleshot problems encountered during the UAT phase and delivered the necessary fixes before go-live.
Assisted the QA team by providing the information needed for unit testing activities.
Applied performance-tuning techniques, including partitioning and bulk-loader options, while developing mapping logic.
Worked with business analysts to gather and define project requirements.
Identified, debugged, and fixed system bottlenecks and problems.
Produced technical specifications and design documentation.
Deployed business logic frameworks for model implementation.
Reviewed data warehouse system details such as execution time and storage to improve efficiency.
Enforced procedures to maintain overall data integrity.
Resolved warehouse performance and access problems by troubleshooting.
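The error-handling pattern for failed ETL loads mentioned above (routing rejects to an error table or flat file) can be sketched as below. The field names and the in-memory reject file are hypothetical stand-ins for the actual Oracle error tables.

```python
import csv
import io

def load_with_error_log(rows, required_fields):
    """Split incoming rows into loadable records and rejects.

    Rows missing required fields are routed to an error log
    (here an in-memory CSV) instead of failing the whole load,
    mirroring an error-table / reject-file pattern.
    """
    loaded, error_log = [], io.StringIO()
    writer = csv.writer(error_log)
    writer.writerow(["row", "reason"])
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            writer.writerow([row, "missing: " + ",".join(missing)])
        else:
            loaded.append(row)
    return loaded, error_log.getvalue()

# Hypothetical source rows
rows = [{"policy_id": "P1", "premium": "120"},
        {"policy_id": "", "premium": "90"}]
loaded, rejects = load_with_error_log(rows, ["policy_id", "premium"])
```

Separating good rows from rejects up front keeps a single bad record from aborting the whole nightly load, which is the point of the error-log tables described above.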
Consultant
TCS [Client - Swiss Re-Insurance]
Bangalore, India
07.2012 - 11.2012
Managed daily and weekly interface activities, including Jupiter interface
Handled month-end interface tasks with focus on Jupiter Month End Close interfaces
Monitored P Log metrics to track service performance
Oversaw file issues in lower-environment Informatica processes, enabling swift resolution
Oversaw ISO Price Monitoring tasks, maintaining accurate and reliable data
Facilitated issue resolution for minimal operation disruption in coordination with Zurich team
Monitored interface process activities, ISO Plus, P Log Metrics, and escalation and upload processes to ensure seamless performance
Oversaw incident management and service coordination
Identified and documented problem tickets, enhancing incident tracking
Ensured data accuracy and integrity through collaboration with Swiss Re application managers
Carried out small-scale data corrections ensuring user experience remained uninterrupted
Designed solutions to manage open incidents and service requests without hindering operations
Secured approval from Swiss Re for known errors and workarounds
Facilitated resolution handover to Change Management for priority realignment and scheduling
Conducted trend and root cause analysis on problem tickets
Kept operational documentation up-to-date for accurate data maintenance
Led induction of new applications into Service Operations, ensuring seamless integration
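The trend and root cause analysis on problem tickets described above can be sketched as a simple frequency ranking; the ticket fields and categories here are hypothetical examples, not Swiss Re data.

```python
from collections import Counter

def top_root_causes(tickets, n=3):
    """Rank root-cause categories across problem tickets to surface
    recurring issues for trend analysis."""
    counts = Counter(t["root_cause"] for t in tickets if t.get("root_cause"))
    return counts.most_common(n)

# Hypothetical ticket extract
tickets = [
    {"id": 1, "root_cause": "file transfer"},
    {"id": 2, "root_cause": "file transfer"},
    {"id": 3, "root_cause": "data quality"},
]
top = top_root_causes(tickets)
```

Ranking recurring causes like this is what lets a problem ticket graduate to a known error with an approved workaround, as described above.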