I am a highly skilled Data Engineer with over 11 years of experience specializing in AWS- and GCP-based big data solutions. I have a strong track record of designing and implementing scalable ETL/ELT pipelines, real-time data streaming architectures, and cloud-native data platforms across industries including finance, insurance, telecommunications, and healthcare.

My technical expertise spans a wide range of cloud services and tools, including AWS Glue, Redshift, Lambda, EMR, Kinesis, Terraform, Snowflake, Google Cloud Dataflow, BigQuery, and Cloud Composer (Apache Airflow). I have also worked extensively with dbt (data build tool) to build modular, testable data transformations and enforce data quality in modern data stack environments.

I bring deep experience in developing high-performance data lakes, orchestrating pipelines, managing data governance, and deploying infrastructure as code, with an emphasis on cost efficiency, performance optimization, and compliance with regulatory standards (HIPAA, GDPR, SOC 2, PCI-DSS). Proficient in Python, SQL, Apache Spark, and Apache Beam, I thrive in fast-paced, collaborative environments and enjoy solving complex data challenges. I am passionate about enabling data-driven decision-making through robust, scalable, and secure data solutions built on cutting-edge cloud and big data technologies.