Results-driven Data Director and Enterprise Cloud Data Architect with nearly 20 years of experience designing, implementing, and managing scalable, secure, and innovative data solutions. Proven track record of leading cross-functional teams to deliver high-quality, regulatory-compliant solutions that improve lives and drive business value. Expertise in AWS cloud architectures, big data platforms, data warehousing, and distributed systems. Passionate about fostering innovation, building high-performing teams, and aligning technology strategies with enterprise goals.
Overview
19 years of professional experience
7 Certifications
Work History
Enterprise Principal Data Director
Tek Systems Global Services
07.2021 - Current
Designed a HIPAA-Compliant Serverless Architecture: Built secure, scalable, and compliant serverless solutions using AWS Lambda, API Gateway, and DynamoDB, ensuring adherence to healthcare regulations like HIPAA and GDPR
Enabled Real-Time Data Processing: Implemented serverless workflows for real-time patient data ingestion, processing, and analytics using AWS Step Functions and Kinesis
Cost Optimization and Scalability: Delivered a cost-effective, auto-scaling solution that reduced infrastructure overhead while handling unpredictable healthcare workloads
Integrated FHIR APIs for Interoperability: Leveraged AWS HealthLake and serverless services to enable seamless integration with FHIR APIs for standardized healthcare data exchange
Improved Patient Outcomes: Accelerated development cycles and improved system responsiveness, enabling faster decision-making and enhanced patient care
Enterprise Principal Cloud Architect
NucleusTeq
11.2019 - 06.2021
Designed and deployed enterprise-grade cloud data platforms on AWS, enabling seamless data integration, storage, and analytics for global clients
Led the migration of on-premises data warehouses to cloud-based solutions, reducing costs by 30% and improving query performance by 40%
Partnered with product and engineering teams to define data architecture standards, ensuring alignment with business objectives and regulatory requirements
Built and optimized big data pipelines using Apache Spark, Hadoop, and AWS Glue, processing terabytes of data daily
Implemented IoT architectures to collect, process, and analyze sensor data, enabling predictive maintenance and real-time decision-making
Conducted architectural reviews and provided recommendations to improve system performance, security, and scalability
AWS Architect & Big Data Architect
BNSF Railways
08.2018 - 11.2019
Designed a Scalable Real-Time Ingestion Pipeline: Implemented a serverless architecture using AWS IoT Core and Kinesis Data Streams to capture and process high-frequency track measurement data from geo cars in real time
Ensured Data Reliability and Durability: Leveraged Amazon Kinesis Data Firehose to reliably ingest and store raw data in Amazon S3, ensuring durability and enabling batch analytics
Enabled Real-Time Processing and Alerts: Used AWS Lambda to process incoming data streams, detect anomalies, and trigger real-time alerts for track maintenance teams via Amazon SNS
Integrated with Analytics Tools: Streamed processed data into Amazon Redshift and QuickSight for advanced analytics and visualization, enabling actionable insights for track health monitoring
Optimized for Scalability and Cost: Designed the pipeline to auto-scale with fluctuating data volumes, ensuring cost-efficiency while maintaining low-latency processing for critical track measurements
Big Data Solution Architect
Scotiabank/Ministry of Transportation/Roche Canada/Bell Canada
11.2012 - 11.2019
Designed and deployed a scalable big data solution at Scotiabank, leveraging Apache Spark and Hadoop to process and analyze millions of credit card transactions in real time, enabling rapid fraud detection
Integrated machine learning models with the pipeline to identify anomalous patterns, reducing fraud incidents by 30% while ensuring low-latency processing and seamless integration with existing banking systems
Implemented a cloud-based solution on AWS to capture and process images from speed cameras using Amazon Rekognition for real-time image detection and analysis, identifying traffic violations and unsafe driving behaviors
Leveraged AWS Lambda, S3, and QuickSight to store, process, and visualize data, enabling the Ministry of Transportation to enhance road safety measures and reduce accidents by 25% through data-driven insights and actionable analytics
Designed and implemented an AWS Data Lake and Modern Data Analytics platform, migrating Roche Canada's clinical data from on-premises systems to Amazon S3, and transformed raw data into structured formats using AWS Glue and Lambda for seamless analytics
Enabled advanced clinical insights and reporting by integrating Amazon Redshift, Athena, and QuickSight, achieving a 40% reduction in data processing time and ensuring HIPAA compliance while improving data accessibility for researchers and healthcare providers
Implemented a Big Data Solution using Informatica as the integration tool to capture and process hardware performance data from switches, towers, and network products, enabling comprehensive analytics on device efficiency and energy consumption
Leveraged Hadoop and Spark for data processing and Tableau for visualization, resulting in data-driven decisions for hardware vendor contract renewals and a 15% reduction in energy consumption across Bell Canada's network infrastructure
ETL/Informatica Solution Architect
Credit Suisse
06.2006 - 11.2012
Designed and implemented a robust data integration platform to source trade and position data from Front Office (FO) systems, transforming and processing it to calculate Back Office (BO) Profit and Loss (PnL) figures
Built complex ETL workflows in Informatica to ensure accurate PnL calculations, which were posted to the downstream General Ledger (GL) system for financial reporting
Designed and implemented a reconciliation framework to validate and align PnL figures between FO and BO systems, ensuring data consistency and reducing discrepancies by 95%
Improved processing efficiency, reducing the PnL calculation and posting cycle time by 40% while ensuring compliance with regulatory requirements
Enhanced transparency, streamlined financial reporting, and provided stakeholders with reliable data for decision-making
Successfully architected and implemented a company-wide data integration platform, reducing data processing time by 40%
Mentored and trained a team of 5 ETL developers, resulting in a 30% improvement in team productivity
Education
Bachelor's Degree - Computer Applications
Osmania University
08.2001
Skills
Cloud Security
Data Visualization
Machine Learning
Project Management
Stakeholder Engagement
Cost Optimization
Performance Tuning
AWS Architecture
Big Data Architecture
AWS AI Architect
Data Integrations
Data Architectures
Enterprise Architecture
Data Lakes
Delta Lakes
Cloud Data Analytics
Certification
AWS Certified Cloud Practitioner
AWS Certified Data Engineer Associate
AWS Certified AI Practitioner
AWS Certified Solutions Architect - Professional
ITIL Version 3 Foundation Certified
Oracle PL/SQL Certified
Project Planning, Analysis, and Control Certified by George Washington University