Accomplished IT Application Programmer Lead with 16 years of experience architecting, developing, and delivering enterprise applications and advanced data products. Specialized in designing and implementing scalable, event-driven data platforms and in modernizing legacy systems with Snowflake, AWS, and leading ETL tools. Demonstrated expertise in building robust data pipelines, developing AI-enabled data solutions, and applying Snowflake features (Snowpark, Streams, Tasks, and Cortex) to high-performance analytics and real-time processing. Proven track record of technical ownership for mission-critical data products, from requirements gathering through deployment and support, with a strong focus on automation, data governance, and security. Adept at leading cross-functional teams, mentoring developers, and driving innovation to deliver secure, reliable, business-aligned data products that support analytics, BI, and data science initiatives.
Project: Claims IT Data & Reporting – Unified View for Claim Notes
Overview:
Designed and developed high-performance data pipelines that build a unified view of claim notes by integrating real-time AWS event streams with historical DB2 data in the cloud. The unified dataset supports analysts, BI teams, and data scientists for advanced modeling and querying (an illustrative PySpark sketch follows the technology stack below).
Key Contributions:
Technology Stack:
AWS (Glue, Kinesis Firehose, EventBridge, S3, Athena), Snowflake, PySpark, Terraform, GitHub, Jenkins, Prefect
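A minimal PySpark sketch of the unification pattern described in the overview above. It is illustrative only: the S3 paths, column names, and output location are hypothetical placeholders, and the Kinesis Firehose/EventBridge delivery and Prefect orchestration around it are not shown.

    # Illustrative sketch only: paths, columns, and output locations are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("unified_claim_notes").getOrCreate()

    # Live notes landed in S3 by the streaming feed (JSON) and historical notes
    # extracted from DB2 (Parquet).
    live = spark.read.json("s3://claims-landing/live/claim_notes/")
    hist = spark.read.parquet("s3://claims-landing/history/claim_notes/")

    # Normalize both sources to one schema before combining them.
    common_cols = ["claim_id", "note_id", "note_text", "created_ts", "source_system"]
    live_std = (live
                .withColumn("created_ts", F.to_timestamp("created_ts"))
                .withColumn("source_system", F.lit("live_stream"))
                .select(common_cols))
    hist_std = (hist
                .withColumn("created_ts", F.to_timestamp("created_ts"))
                .withColumn("source_system", F.lit("db2_history"))
                .select(common_cols))

    # Union and de-duplicate so each claim note appears once in the unified view.
    unified = (live_std.unionByName(hist_std)
               .dropDuplicates(["claim_id", "note_id"]))

    # Publish a partitioned Parquet layer that Athena or Snowflake external tables can query.
    (unified.write.mode("overwrite")
            .partitionBy("source_system")
            .parquet("s3://claims-analytics/unified_claim_notes/"))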
Project: Cloud Data Migration & Reporting Enablement
Overview:
Architected and built AWS infrastructure and services to support the migration, processing, and storage of data in AWS S3 and Snowflake. Led end-to-end extraction of data from source systems, transformation, and loading into cloud-based targets, ensuring seamless data availability for reporting teams both in the cloud and on-premises (an illustrative Snowflake load sketch follows the technology stack below).
Key Responsibilities:
Technology Stack:
AWS (Glue, Lambda, S3), Snowflake, PySpark, Python
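A hedged sketch of the S3-to-Snowflake load step for the reporting targets mentioned above, assuming an external stage already points at the curated S3 prefix; the connection parameters, stage, and table names are placeholders, not the project's actual objects.

    # Illustrative sketch only: credentials, stage, and table names are placeholders.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="REPORTING_WH",   # hypothetical warehouse
        database="REPORTING_DB",    # hypothetical database
        schema="STAGING",           # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Load Parquet files produced by the Glue/PySpark jobs from an external stage
        # (@S3_CURATED_STAGE is assumed to be defined over the curated S3 prefix).
        cur.execute("""
            COPY INTO STAGING.CURATED_TARGET
            FROM @S3_CURATED_STAGE/curated/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)
        for row in cur.fetchall():   # per-file load results returned by COPY INTO
            print(row)
    finally:
        conn.close()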
Project: CLDW – Commercial Lines Data Warehousing
Client: The Hartford Insurance
Overview:
Supported agile product delivery with automated ETL validation, data validation, and status reporting for commercial lines data warehousing. Identified cross-sell opportunities arising from acquisitions and mergers, maintained code repositories, and kept CI/CD operations running smoothly (an illustrative validation-and-reporting sketch follows below).
Key Contributions:
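A simplified Python sketch of the kind of automated data validation and status reporting described in the overview; the table names, counts, and report format are hypothetical stand-ins, not the warehouse's actual checks.

    # Illustrative sketch only: tables and counts are hypothetical stand-ins for
    # values queried from the source system and the warehouse target.
    from datetime import datetime, timezone


    def validate(source_counts: dict[str, int], target_counts: dict[str, int]) -> list[dict]:
        """Compare source vs. target row counts and record a pass/fail status per table."""
        results = []
        for table, src in source_counts.items():
            tgt = target_counts.get(table, 0)
            results.append({
                "table": table,
                "source_rows": src,
                "target_rows": tgt,
                "status": "PASS" if src == tgt else "FAIL",
                "checked_at": datetime.now(timezone.utc).isoformat(),
            })
        return results


    def status_report(results: list[dict]) -> str:
        """Render a plain-text summary suitable for an email or a CI/CD job log."""
        lines = [f"{r['table']}: {r['status']} ({r['source_rows']} source vs {r['target_rows']} target)"
                 for r in results]
        failed = sum(r["status"] == "FAIL" for r in results)
        lines.append(f"{failed} of {len(results)} tables failed validation.")
        return "\n".join(lines)


    if __name__ == "__main__":
        # Hypothetical counts; in practice these come from queries against each system.
        src = {"policy": 1_204_332, "claim": 998_120}
        tgt = {"policy": 1_204_332, "claim": 998_004}
        print(status_report(validate(src, tgt)))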
Project: AWS Adaptive Data Foundation
Client: Merck & Co. (Pharmaceutical)
Overview:
Enabled the complete and accurate migration of datasets from the zCloud mainframe to AWS, replicating reporting functionality in the cloud and improving data access and performance (an illustrative cloud-reporting sketch follows below).
Key Contributions:
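A sketch of how a retired mainframe report might be reproduced over the migrated datasets, assuming (purely for illustration) that they land in S3 and are queried through Athena; the database, table, query, and output bucket are hypothetical.

    # Illustrative sketch only: database, table, query, and output location are hypothetical.
    import time
    import boto3

    athena = boto3.client("athena", region_name="us-east-1")

    # Re-run a legacy report as SQL over the migrated S3 datasets.
    execution = athena.start_query_execution(
        QueryString="""
            SELECT source_system, COUNT(*) AS record_count
            FROM migrated_db.archived_records
            GROUP BY source_system
            ORDER BY record_count DESC
        """,
        QueryExecutionContext={"Database": "migrated_db"},
        ResultConfiguration={"OutputLocation": "s3://adf-query-results/reports/"},
    )

    query_id = execution["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        for row in rows:
            print([col.get("VarCharValue") for col in row["Data"]])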
Project: Data Governance and Security
Client: The Hartford Insurance
Overview:
Designed and implemented data masking and subsetting strategies to ensure that unmasked production data is never used in non-production environments, enhancing data privacy and compliance (an illustrative masking sketch follows below).
Key Contributions:
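A minimal sketch of the masking-and-subsetting idea, written here as a PySpark job for illustration (the actual project may have used dedicated masking tooling); the source path, PII columns, and masking rules are hypothetical.

    # Illustrative sketch only: paths, PII columns, and masking rules are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("nonprod_masking").getOrCreate()

    prod = spark.read.parquet("s3://prod-data/policyholders/")

    masked = (prod
              # Irreversibly hash direct identifiers so joins still work on consistent keys.
              .withColumn("ssn", F.sha2(F.col("ssn").cast("string"), 256))
              .withColumn("email", F.sha2(F.col("email").cast("string"), 256))
              # Redact free-text fields that may contain personal information.
              .withColumn("notes", F.lit("REDACTED"))
              # Subset: carry only a small, recent slice into non-production.
              .filter(F.col("policy_year") >= 2020)
              .sample(fraction=0.05, seed=42))

    masked.write.mode("overwrite").parquet("s3://nonprod-data/policyholders_masked/")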
Project: Information Lifecycle Management (ILM) – Application Retirement & Live Archival
Client: Genentech, Inc. (Biotech)
Overview:
Decommissioned legacy applications (SAP, Siebel, Oracle/MSSQL) using Informatica ILM, maintaining archived data for compliance and reducing maintenance costs.
Key Contributions:
Project: Medsupp Migration
Client: Anthem, Inc. (Insurance)
Overview:
Migrated the Medicare Supplement business from a legacy mainframe to the Medisys platform, designing ETL routines and ensuring a secure, accurate data transition.
Key Contributions:
Project: ARC Program – Architecture Rationalization and Consolidation
Client: Barclays (Banking & Investment)
Overview:
Rationalized and consolidated architecture to simplify systems, reduce legacy dependencies, and enable strategic replacements.
Key Contributions: