Sampada Ashok Marathe

Eden Prairie, MN

Summary

Results-oriented IT professional with over 15 years of experience in diverse roles across industry verticals, including Banking, Insurance, Healthcare, and Telecom.

  • Data Warehousing Expertise: Proven track record in successfully implementing data warehouse solutions for prominent Insurance clients, overseeing design, development, and project management aspects to optimize data storage, ensure data integrity, and implement efficient retrieval systems.
  • ETL Development and Architecture: Highly skilled ETL Architect and Informatica Developer with extensive experience in designing and implementing data warehousing solutions. Proficient in Informatica PowerCenter, IICS, SQL, and data modeling, with a strong focus on optimizing ETL workflows and improving system performance.
  • Data Engineering and Integration: Seasoned Senior Data Engineer with expertise in developing, testing, and maintaining robust data architectures. Proficient in utilizing databases, flat files, XML, and cloud technologies (AWS S3, Redshift) for complex data transformations and integrations.
  • API and Real-Time Integration: Experienced in API-based and real-time integration workflows, with a solid understanding of REST API endpoints. Demonstrated ability in building application and service connectors to facilitate seamless data integration.
  • Project Management and Collaboration: Accomplished project manager adept at leading ETL projects in multi-vendor environments, with a strong capability to translate business requirements into technical specifications. Known for effective communication and collaboration with stakeholders to align data solutions with business objectives.
  • Technical Proficiency: Strong command of Unix, Python, and Shell scripting for automation. Hands-on technical coder, able to take projects from foundational to advanced levels.
  • Interpersonal Skills: Highly motivated self-starter with exceptional communication and interpersonal relations, recognized as a valuable team player who contributes to achieving shared objectives.

Overview

16 years of professional experience

Work History

Data Engineer

Compunnel Software Group Inc. [Client - US Bank]
Hopkins, United States
02.2023 - Current
  • Executed data migration initiatives following the acquisition of Union Bank by US Bank
  • Partnered with Union Bank's team to assess data sources and file systems
  • Integrated Informatica workflows into US Bank ETL servers
  • Optimized workflow testing for accuracy using newly created Autosys jobs
  • Executed comprehensive dry runs for workflows following the project timeline
  • Established an audit framework for session tracking.
  • Produced detailed design documentation to enhance understanding
  • Migrated code across higher environments, ensuring stability
  • Engineered data workflows using PySpark for efficient processing
  • Constructed an integrated system with Python and PySpark
  • Executed advanced preprocessing of source files for Python framework compatibility
  • Acted as Subject Matter Expert for offshore team, directing ETL configuration development
  • Refined SSIS packages to aid database migrations
  • Generated JSON files to facilitate CSV data integration with Cassandra
  • Performed thorough unit testing to maintain data integrity post-migration
  • Designed data integration solutions alongside data architects and business analysts
  • Engaged in User Acceptance Testing (UAT) to ensure migrated processes meet functional standards
  • Enhanced query efficiency by indexing database tables
  • Developed and implemented data models, database designs, data access and table maintenance codes.
  • Analyzed user requirements, designed and developed ETL processes to load enterprise data into the Data Warehouse.
  • Created stored procedures for automating periodic tasks in SQL Server.
  • Configured SSIS packages for scheduled data loads from source systems to target tables.
  • Documented data architecture designs and changes, ensuring knowledge transfer and system maintainability.
  • Participated in agile development processes, contributing to sprint planning, stand-ups, and reviews to ensure timely delivery of data projects.
  • Optimized SQL queries and database schemas for performance improvements in data retrieval operations.
  • Provided technical mentorship to junior data engineers, guiding them on best practices and project execution.
  • Researched and integrated new data technologies and tools to keep the data architecture modern and efficient.
  • Designed, constructed, and maintained scalable data pipelines for data ingestion, cleaning, and processing using Python and SQL.
  • Collaborated with cross-functional teams to gather requirements and translate business needs into technical specifications for data solutions.
  • Managed version control and deployment of data applications using Git, Docker and Jenkins.
  • Trained non-technical users and answered technical support questions.

Data Warehouse Specialist

Smart works [Client - BlueShield Of California]
Minnetonka, United States
06.2022 - 02.2023
  • Partnered with business users and analysts to gather IT requirements
  • Performed impact analysis, lab testing, and module development
  • Ensured consistent robustness by conducting daily design reviews
  • Created and updated Netezza scripts to enhance operational efficiency
  • Designed and executed robust ETL processes
  • Developed scalable and efficient database solutions using SQL stored procedures
  • Participated in design, development, unit testing, and Quality Assurance stages of software projects
  • Identified quality improvement areas in deployment, testing support, and user acceptance processes
  • Facilitated testing and production teams by supplying essential tools and expertise
  • Utilized various flat file and XML formats for data sourcing and targeting
  • Enhanced data processing efficiency using Netezza's high-performance analytical platform
  • Constructed Denodo views to establish virtualized data layers
  • Seamlessly integrated Denodo with existing infrastructures, ensuring system compatibility
  • Exploited Denodo’s data virtualization features, minimizing data redundancy
  • Leveraged Netezza's parallel processing to manage vast data volumes efficiently

Data Warehouse Specialist

TCS [Client - Allianz Life Insurance Of NA]
Golden Valley, United States, and Bangalore, India
03.2015 - 06.2022
  • Spearheaded SAP SuccessFactors Employee Central (SF EC) and SAP Cloud Platform Integration (SAP CPI) projects across multiple regions
  • Enhanced interoperability in SAP SuccessFactors utilizing SFAPIs
  • Exhibited strong expertise in SAP CPI, focusing on iFlows and REST protocols.
  • Utilized Groovy Scripting to optimize functionalities within SAP CPI
  • Developed comprehensive data mapping rules to enhance system integration
  • Instituted proactive monitoring procedures to minimize downtime and ensure seamless operations
  • Enhanced established SAP SuccessFactors integrations utilizing extensive knowledge of SAP APIs and CPI flows
  • Enhanced response times by monitoring system performance
  • Optimized SAP CPI build and rollout with pre-packaged solutions
  • Leveraged SAP CPI monitoring tools for real-time integration insights
  • Ensured HR systems were updated according to the latest features in SAP SuccessFactors
  • Effectively resolved complex technical issues promptly
  • Conducted rigorous design reviews for functional integrity
  • Designed and established ETL mappings, sessions, workflows
  • Developed SAP CPI interfaces and crafted XSLT scripts to streamline data integration processes, improving efficiency and accuracy
  • Developed SQL stored procedures and views for scalable database solutions
  • Participated in comprehensive project lifecycle management
  • Suggested insights and solutions for better performance
  • Assisted test and production teams, supplying essential tools
  • Crafted efficient data workflows employing Informatica PowerCenter
  • Employed Oracle SQL and UNIX shell scripting to craft custom project code
  • Executed defect triage and issue analysis, ensuring optimal system performance
  • Analyzed system weaknesses and implemented creative resolutions to boost business efficiency
  • Oversaw change management for functional improvements
  • Facilitated governance calls on incidents impacting client operations
  • Executed crisis management strategies mitigating adverse impacts during project rollouts
  • Directed smooth transition from delivery to support services
  • Evaluated code developed by offshore team
  • Oversaw warranty support for production consistency
  • Maintained system integrity through proactive measures
  • Provided ad-hoc business support for data queries across technical and nontechnical domains
  • Enhanced delivery and ensured data integrity of reporting tools
  • Efficiently created comprehensive documentation
  • Developed and sustained documentation throughout the development lifecycle
  • Oversaw acceptance testing alongside stakeholders and testing team
  • Aligned with environment coordination team to ensure smooth progression to higher environments and production.

Team Lead and Developer

TCS [Client - Swiss Re-Insurance]
Bangalore, India
01.2013 - 05.2014

Project: Corporate Solutions – Conversion and Integration

  • Collaborated with on-site teams for thorough requirement analysis
  • Led development processes, leveraging expertise for scalable outcomes
  • Created ETL mappings, sessions, and workflows to streamline data processes
  • Implemented efficient data integration processes for optimal system performance.
  • Created Informatica mappings for enhanced performance.
  • Scheduled sessions, batches, workflows in Informatica PowerCenter
  • Ensured seamless operations by preemptively managing potential performance problems
  • Supported user acceptance processes by confirming solution robustness

Software Developer

TCS [Client - Swiss Re-Insurance]
Bangalore, India
11.2012 - 12.2012

Project: Corporate Solutions - Application Notification Database

  • Contributed to requirement gathering sessions, offering technical insights on ETL solutions.
  • Engaged in drafting both low-level and high-level designs
  • Implemented effective data integration techniques to ensure reliability
  • Worked alongside data architects and business analysts to grasp data needs
  • Created and managed intricate ETL workflows with Informatica PowerCenter
  • Aligned Informatica PowerCenter with existing systems to enable streamlined data flow
  • Led code reviews to maintain high standards in Informatica PowerCenter projects
  • Created and updated documentation for Informatica PowerCenter mappings, workflows, configurations
  • Developed comprehensive test cases ensuring precision in user acceptance testing
  • Partnered with database and system administrators to optimize deployment of Informatica PowerCenter workflows
  • Created Unix scripts to automate tasks
  • Led each phase of software development, from initiation through completion
  • Developed ETL process to extract, transform and load data from Oracle database into the Data Warehouse using Informatica PowerCenter.
  • Created mappings, sessions and workflows to move data from source systems to target systems using Informatica PowerCenter.
  • Tested and validated the loaded data in the target system for accuracy and completeness.
  • Optimized existing ETL processes by analyzing performance bottlenecks and implementing best practices in Informatica PowerCenter.
  • Performed unit testing of mappings, sessions and workflows developed in Informatica PowerCenter.
  • Documented all ETL processes created using Informatica PowerCenter.
  • Provided production support for all ETL related issues involving Informatica PowerCenter.
  • Monitored loading jobs on a daily basis to ensure that all ETL jobs ran as scheduled.
  • Designed complex transformations using various transformation objects like Joiner, Lookup, Aggregator, Sorter in Informatica PowerCenter.
  • Identified data quality issues and worked with business users to resolve them in a timely manner.
  • Implemented error handling logic for failed ETL loads using error log tables in Oracle Database or flat files generated by Informatica PowerCenter.
  • Developed shell scripts to schedule the execution of ETL jobs through Cron scheduler on UNIX platform.
  • Automated the scheduling of multiple concurrent tasks within the application environment using the Workflow Manager feature of Informatica PowerCenter 8.x and 9.x.
  • Configured session parameters, such as commit intervals and buffer size, based on the volume of data processed through each mapping or workflow.
  • Troubleshot problems encountered during the UAT phase and provided necessary fixes before go-live.
  • Assisted the QA team by providing information needed for unit testing activities.
  • Applied performance-tuning techniques, such as partitioning and bulk loader options, while developing mapping logic.
  • Worked with business analysts to gather and define project requirements.
  • Identified, debugged, and fixed system bottlenecks and problems.
  • Produced technical specifications and design documentation.
  • Deployed business logic frameworks for model implementation.
  • Reviewed data warehouse system details such as execution time and storage to improve efficiency.
  • Enforced procedures to maintain overall data integrity.
  • Resolved warehouse performance and access problems by troubleshooting.

Consultant

TCS [Client - Swiss Re-Insurance]
Bangalore, India
07.2012 - 11.2012
  • Managed daily and weekly interface activities, including Jupiter interface
  • Handled month-end interface tasks with focus on Jupiter Month End Close interfaces
  • Monitored P Log metrics to track service performance
  • Oversaw file issues in Informatica processes in lower environments, enabling swift resolution
  • Oversaw ISO Price Monitoring tasks, maintaining accurate and reliable data
  • Facilitated issue resolution for minimal operation disruption in coordination with Zurich team
  • Monitored interface process activities, ISO Plus, P Log Metrics, and escalation and upload processes to ensure seamless performance
  • Oversaw incident management and service coordination
  • Identified and documented problem tickets enhancing incident tracking
  • Ensured data accuracy and integrity through collaboration with Swiss Re application managers
  • Carried out small-scale data corrections ensuring user experience remained uninterrupted
  • Designed solutions to manage open incidents and service requests without hindering operations
  • Secured approval from Swiss Re for known errors and workarounds
  • Facilitated resolution handover to Change Management for priority realignment and scheduling
  • Conducted trend and root cause analysis on problem tickets
  • Kept operational documentation up-to-date for accurate data maintenance
  • Led induction of new applications into Service Operations, ensuring seamless integration
  • Managed escalations effectively, ensuring continuous operation
  • Enhanced issue resolution by coordinating routine escalations and metric assessments
  • Managed end-to-end operations of ETL data pipelines, maintaining uptime.
  • Implemented best practices, such as error handling and logging, while developing ETL packages.

Team Leader

TCS [Ericsson]
Mumbai, India
10.2008 - 06.2012

Project: Wireless Custom Design - Planning

  • Collaborated with clients to fully understand Feature Requirement Specifications.
  • Prepared detailed Design Estimates for client review.
  • Facilitated the creation and review of development and testing documents to ensure project alignment.
  • Maintained active participation across Planning, Development, and Testing processes.
  • Supported senior developers during the planning, development, and testing phases.
  • Developed stored procedures, functions, and triggers to support application requirements.
  • Created complex workflows, incorporating tasks such as session, email, command, and decision tasks.
  • Developed Informatica mappings for data validation, transformation, and loading into target systems.
  • Integrated multiple data sources into a unified view using SQL views or joins.
  • Optimized existing queries using SQL query tuning techniques for better performance.
  • Conducted database tuning and performance monitoring to ensure optimal system functionality.
  • Offered support during User Acceptance Testing (UAT) to align outcomes with user expectations.
  • Reviewed Internal Quality Assurance (IQA) and External Quality Assurance (EQA) processes to meet quality standards.
  • Led the migration of databases to new platforms or versions, minimizing downtime and data loss.
  • Implemented data migration from legacy systems to new databases.
  • Provided support during migration activities by troubleshooting production issues.
  • Pinpointed and rectified bottlenecks to enhance system functionality.
  • Streamlined data flows within the ETL framework through session creation.
  • Monitored database performance metrics and adjusted configurations to meet scalability demands.
  • Documented all ETL processes according to corporate standards.
  • Assisted in ETL processes under guidance, utilizing transformations such as Joiner, Sorter, Lookup, Update Strategy, and Router.
  • Implemented ETL processes using SQL and additional tools to migrate data across different systems.
  • Involved in creating technical design documents related to ETL projects.

Education

Bachelor of Engineering - Electronics And Communications Engineering

Visvesvaraya Technological University
Belgaum, India
07.2008

Skills

  • Informatica Intelligent Cloud Services (IICS)
  • Informatica PowerCenter
  • SAP CPI
  • SAP SuccessFactors
  • Denodo
  • Netezza
  • Python Programming
  • PySpark Programming
  • Unix Shell Scripting
  • AWS Cloud Services
  • SQL
  • PL/SQL
  • Kafka Streaming
  • Data Warehousing & Dimensional Modeling
  • PostgreSQL
  • Snowflake
  • MongoDB
  • Cassandra
  • XSLT Scripting
  • Apache Groovy Scripting
  • Bitbucket
  • Control-M
  • Autosys
  • Jenkins
  • Rancher
  • Kibana
