
Senthilkumar Perumal

Celina, TX

Summary

Accomplished IT Application Programmer Lead with 16 years of experience architecting, developing, and delivering enterprise applications and advanced data products. Specialized in designing and implementing scalable, event-driven data platforms and modernizing legacy systems using Snowflake, AWS, and leading ETL tools. Demonstrated expertise in building robust data pipelines, developing AI-enabled data solutions, and leveraging Snowflake features—including Snowpark, Streams, Tasks, and Cortex—for high-performance analytics and real-time processing. Proven track record of technical ownership for mission-critical data products, from requirements gathering through deployment and support, with a strong focus on automation, data governance, and security. Adept at leading cross-functional teams, mentoring developers, and driving innovation to deliver secure, reliable, and business-aligned data products that empower analytics, BI, and data science initiatives.

Overview

16 years of professional experience
1 Certification

Work History

IT Apps Programmer Lead

Progressive Insurance
09.2024 - Current
Project: Claims Triage Automation – Event-Driven Data Platform Modernization

Overview:

Leading the design and implementation of an event-driven, AI-enabled data platform that combines AWS-based ingestion with Snowflake's streaming and AI capabilities for automated claims triage.

Key Contributions:

  • Designed AWS Lambda + SQS + Glue-based ingestion to normalize high-volume nested JSON into Snowflake.
  • Built recursive JSON shredding logic using Snowpark, with surrogate keys for parent-child linking.
  • Integrated Snowflake Cortex to infer schemas and classify tables and columns dynamically.
  • Implemented Snowflake Streams and Tasks for near-real-time incremental CDC processing.
  • Created end-to-end Prefect v3 orchestration with alerting and rollback features.
  • Maintained observability dashboards and alert triggers with Snowflake Alerts.
  • Led upgrade and incident management responsibilities as TAO across the Snowflake and Prefect ecosystems.
  • Provided training, playbooks, and mentoring to cross-functional teams.

Technology Stack:
AWS (Lambda, Glue, S3, EventBridge, SQS), Snowflake (Streams, Tasks, Cortex, Snowpark, Alerts), Prefect
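The recursive JSON shredding with surrogate keys mentioned above can be sketched in plain Python. This is a minimal illustration only; the production logic runs on Snowpark DataFrames, and all names here (`shred`, the sample claim record) are hypothetical:

```python
import itertools

# Illustrative sketch: recursively flatten nested JSON into per-entity rows,
# linking each child row to its parent via a generated surrogate key.
# (The real pipeline does this with Snowpark; names here are hypothetical.)

_key_counter = itertools.count(1)

def shred(record, entity, parent_key=None, rows=None):
    """Flatten one JSON record into rows grouped by entity name."""
    if rows is None:
        rows = {}
    sk = next(_key_counter)                        # surrogate key for this row
    flat = {"_sk": sk, "_parent_sk": parent_key}
    for field, value in record.items():
        if isinstance(value, dict):                # nested object -> child entity
            shred(value, f"{entity}.{field}", sk, rows)
        elif isinstance(value, list):              # array -> one child row per item
            for item in value:
                if isinstance(item, dict):
                    shred(item, f"{entity}.{field}", sk, rows)
                else:
                    rows.setdefault(f"{entity}.{field}", []).append(
                        {"_sk": next(_key_counter), "_parent_sk": sk, "value": item}
                    )
        else:                                      # scalar -> column on this row
            flat[field] = value
    rows.setdefault(entity, []).append(flat)
    return rows

# Example: a claim with a nested notes array shreds into three flat "tables".
claim = {"claim_id": "C-100", "notes": [{"text": "towed", "tags": ["auto"]}]}
tables = shred(claim, "claim")
```

Each nested object or array becomes its own flat row set, with `_parent_sk` pointing back to the owning row, so arbitrarily nested payloads land as relational tables.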

IT Apps Programmer Senior

Progressive Insurance
12.2022 - 09.2024

Project: Claims IT Data & Reporting – Unified View for Claim Notes

Overview:

Designed and developed high-performance data pipelines to create a unified view of claim notes by integrating AWS live streams and legacy DB2 historical data in a cloud environment. This unified dataset supports analysts, BI teams, and data scientists for advanced modeling and querying.

Key Contributions:

  • Built and managed AWS infrastructure using Terraform to automate resource provisioning for ingesting JSON datasets into S3.
  • Developed AWS Glue jobs and PySpark scripts to extract, normalize, and partition high-volume JSON data for efficient processing.
  • Configured Glue Crawlers and schedules to update table metadata and optimize S3 partitioning.
  • Created Snowflake external tables and leveraged COPY commands to ingest S3 data into raw tables.
  • Collaborated with business analysts to document data lineage and finalize requirements.
  • Wrote complex SnowSQL queries to join, transform, and load data into ODS and target layers, producing unified claim note views.
  • Implemented CDC mechanisms using Snowflake Streams to efficiently capture and process data deltas.
  • Supported QA teams by troubleshooting bugs and addressing change requests.
  • Developed Python workflows orchestrated with Prefect for pipeline automation, monitoring, and alerting.
  • Established reconciliation processes with pre- and post-validation queries to ensure data consistency across source and target tables.

Technology Stack:
AWS (Glue, Kinesis Firehose, EventBridge, S3, Athena), Snowflake, PySpark, Terraform, GitHub, Jenkins, Prefect
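The reconciliation process with pre- and post-validation described above can be sketched as a simple metric comparison. This is illustrative only; the actual checks run as SnowSQL queries against source and target tables, and the function and metric names below are hypothetical:

```python
def reconcile(source_metrics, target_metrics, tolerance=0):
    """Compare pre-load (source) and post-load (target) validation metrics.

    Each dict maps a check name (e.g. a row count or a checksum over a key
    column) to its value. Returns the mismatched checks; empty means the
    source and target tables are consistent.
    """
    mismatches = []
    for check, expected in source_metrics.items():
        actual = target_metrics.get(check)
        if actual is None or abs(actual - expected) > tolerance:
            mismatches.append((check, expected, actual))
    return mismatches

# Example: metrics captured before the load vs. after the load.
pre = {"row_count": 10_000, "claim_amount_sum": 1_234_567.89}
post = {"row_count": 10_000, "claim_amount_sum": 1_234_567.89}
assert reconcile(pre, post) == []   # consistent: no mismatches
```

A nonzero `tolerance` accommodates checks where minor float drift is acceptable, while row counts are typically compared exactly.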

Senior Software Developer

Randstad Technologies
04.2022 - 12.2022

Project: Cloud Data Migration & Reporting Enablement

Overview:

Responsible for architecting and building AWS infrastructure and services to support the migration, processing, and storage of data in AWS S3 and Snowflake. Led end-to-end data extraction from source systems, transformation, and loading into cloud-based targets, ensuring seamless data availability for reporting teams both in the cloud and on-premises.

Key Responsibilities:

  • Designed and provisioned AWS infrastructure (Glue, Lambda, S3) to support scalable data processing and migration initiatives.
  • Extracted, transformed, and loaded data from PWD/Hadoop and other source systems into AWS S3 and Snowflake.
  • Set up and configured Snowflake environments and services to make data accessible for cloud and on-prem reporting teams.
  • Collaborated with Data and Reporting teams to migrate critical datasets from legacy platforms to the cloud.
  • Developed PySpark and Python scripts for data transformation, normalization, and workflow automation.
  • Enabled secure access to cloud data from on-premises environments, ensuring business continuity and usability.
  • Mentored junior developers, fostering skill growth and enhancing team collaboration.
  • Conducted code reviews, ensuring adherence to best practices and high-quality standards.

Technology Stack:
AWS (Glue, Lambda, S3), Snowflake, PySpark, Python

Senior Associate

Cognizant Technology Solutions
07.2016 - 03.2022

Project: CLDW – Commercial Lines Data Warehousing

Client: The Hartford Insurance
Overview:
Supported agile product delivery with automated ETL execution, data validation, and status reporting for commercial lines data warehousing. Identified cross-sell opportunities arising from acquisitions and mergers, maintained code repositories, and ensured seamless CI/CD operations.

Key Contributions:

  • Designed and maintained data architecture, ETL design, and reporting capabilities.
  • Set up CI/CD pipelines for PLSQL, Talend, AWS EMR, and Snowflake using GitHub, Jenkins, Udeploy, and Liquibase.
  • Developed data ingestion and processing pipelines with Apache NiFi, Amazon EMR, Spark, PySpark, Athena, Glue, and AWS Data Pipeline to land Guidewire PolicyCenter data in an S3 data lake.
  • Guided developers on CI/CD best practices and pipeline usage.
  • Collaborated with scrum and systems teams to refine release strategies.
  • Created account and policy-level datasets, performed SQL queries for metrics, and implemented MDM for data cleansing.
  • Built dashboards, reports, and automated ETL execution, validation, and reporting.
  • Deployed custom tables, views, procedures, and indexes to SQL Server for staging and data-mart environments.
Technology Stack:
Linux, Talend, Informatica PowerCenter, Liquibase, Autosys, AWS EMR, Snowflake, AWS S3, GitHub, Jenkins, Nexus, Oracle, MSSQL Server

Project: AWS Adaptive Data Foundation

Client: Merck & Co. (Pharmaceutical)
Overview:
Enabled complete and accurate migration of datasets from zCloud mainframe to AWS, replicating reporting functionality in the cloud and optimizing data access and performance.
Key Contributions:

  • Advised on technical decisions for data intelligence and cloud provisioning.
  • Built a responsive cloud data ecosystem for sourcing, transforming, and consuming data.
  • Designed optimized reports, queries, S3 folders, and file rationalization for Athena.
  • Performed post-archival validation to ensure data integrity.
  • Developed AWS IAM role-based access controls for S3 data.
  • Participated in AWS foundational planning and growth.
  • Migrated Informatica ILM TDM applications to AWS EC2 reserved instances.
Technology Stack:
Talend, DMX-SyncSort, AWS (S3, EC2, Glue, Athena, CLI), IBM DB2 Mainframe, Virtual Tapes

Project: Data Governance and Security

Client: The Hartford Insurance
Overview:
Designed and implemented data masking and subsetting strategies to ensure production data is not used in non-production environments, enhancing data privacy and compliance.
Key Contributions:

  • Led data masking strategy and ILM framework design for application decommissioning.
  • Estimated effort and forecasted resources for future projects.
  • Coordinated with stakeholders for smooth project execution.
  • Developed masking and subsetting methods in Informatica PowerCenter and TDM.
  • Loaded lower environments with PII-protected production data.
  • Rapidly profiled data for analysis and validation.
  • Built BI reports to analyze vulnerabilities in PII and sensitive data.
  • Designed ETL pipelines for future business process integration.
Technology Stack:
UNIX/Linux, ILM Test Data Manager, Informatica TDM/PowerCenter, ILM Data Validation Option, DDS, MongoDB, Oracle, MSSQL, AWS CloudWatch, CloudTrail, IAM, Jsonar

Associate

Cognizant Technology Solutions
11.2014 - 06.2016

Project: Information Lifecycle Management (ILM) – Application Retirement & Live Archival

Client: Genentech, Inc. (Biotech)
Overview:
Decommissioned legacy applications (SAP, Siebel, Oracle/MSSQL) using Informatica ILM, maintaining archived data for compliance and reducing maintenance costs.
Key Contributions:

  • Collaborated with customers to understand system requirements and retirement solutions.
  • Coordinated with Informatica support and R&D to resolve critical issues.
  • Provided feasibility analysis, proof of concepts, and business presentations.
  • Analyzed requirements and resolved post-production issues related to retirement.
Technology Stack:
Informatica ILM Data Archive, ILM Data Validation Option, JReports

Project: Medsupp Migration

Client: Anthem, Inc. (Insurance)
Overview:
Migrated Medicare Supplement business from legacy mainframe to Medisys platform, designing ETL routines and ensuring secure, accurate data transition.
Key Contributions:

  • Developed and maintained ETL maps/scripts and data models.
  • Built and tested Informatica mappings, sessions, workflows, and tasks.
  • Executed data extracts for legacy application retirement.
  • Defined transformation rules and mapped IMS segments to relational tables.
  • De-identified PII and sensitive data in test environments.
  • Performed code reviews and unit testing, supporting QA and release management.
Technology Stack:
Mainframe z/OS, Linux, Informatica PowerCenter, PowerExchange Navigator, IBM Optim, COBOL, DB2, Oracle

Associate | Programmer Analyst

Cognizant Technology Solutions
02.2009 - 11.2014

Project: ARC Program – Architecture Rationalization and Consolidation

Client: Barclays (Banking & Investment)
Overview:
Rationalized and consolidated architecture to simplify systems, reduce legacy dependencies, and enable strategic replacements.
Key Contributions:

  • Implemented data archival solutions and developed business objects.
  • Built automation frameworks for archiving and reporting.
  • Designed and developed secured SSRS reports on archived and live data.
  • Analyzed and optimized SQL performance for source/target databases.
  • Interacted with SMEs and vendors for requirements gathering and reporting.
Technology Stack:
MSSQL, InterSystems Caché, Informatica ILM, SSIS, SSRS

Education

Bachelor of Science - Mechanical Engineering

Kumaraguru College of Technology, Anna University
Coimbatore, Tamil Nadu, India
04.2008

Skills

  • Cloud Platforms: AWS (S3, Glue, Lambda, EMR, Kinesis, EventBridge, Athena, EC2, IAM, CloudWatch, CloudTrail)
  • Data Engineering: ETL (Talend, Informatica PowerCenter/TDM/ILM, Apache NiFi), Data Modeling, Data Migration, Data Masking & Subsetting, Data Governance, CDC, Data Validation, Data Archival
  • Application Programming: Application Design & Development, Technical Application Ownership, System Integration, Troubleshooting & Incident Management, Performance Tuning, Application Maintenance & Support, Automation of Workflows
  • Programming & Scripting: Python (PySpark), SQL, Shell Scripting, COBOL
  • Big Data & Analytics: Apache Spark, PySpark, Snowflake, AWS Athena, Data Lakes
  • DevOps & CI/CD: GitHub, Jenkins, Nexus, UDeploy, Terraform, Liquibase, Autosys
  • Databases: Oracle, MSSQL Server, DB2, MongoDB, InterSystems Caché
  • Reporting & BI: SSRS, JReports, Dashboard Development, Data Lineage Documentation
  • Version Control & Automation: Git, Jenkins, Prefect Orchestration
  • Other Tools: IBM Optim, DMX-SyncSort, PowerExchange Navigator, Jsonar
  • Collaboration & Leadership: Agile/Scrum Methodologies, Cross-functional Team Leadership, Training & Mentoring, Documentation & Playbook Creation

Certification

  • AWS Certified Cloud Practitioner
  • AWS Certified Solutions Architect – Associate

Languages

English: Full Professional
Tamil: Native or Bilingual
French: Elementary
