Data Engineer with ten years of experience designing and optimizing data pipelines on AWS, GCP, and Azure. Built large-scale data processing with Hadoop (HDFS, MapReduce), Spark, and Databricks, improving the speed and reliability of analytics workflows. Developed ETL frameworks with AWS Glue, Talend, and DataStage to streamline data ingestion and transformation, and architected scalable solutions on AWS EC2 with Auto Scaling, Azure Synapse, and GCP BigQuery to support high-volume data and complex queries.

Managed real-time data streaming with Kafka, Pub/Sub, and Azure Event Hubs to deliver low-latency analytics, and tuned database performance in Snowflake, PostgreSQL, MySQL, and DB2 to improve query execution and resource utilization. Delivered interactive dashboards and reports with Tableau, Power BI, and SSRS; hardened data platforms using IAM policies, VPC configuration, and security best practices; and automated workflows with Python and Unix shell scripting to reduce manual intervention and accelerate deployments. Experienced with Agile methodologies and microservices architectures, coordinating cross-functional teams and supporting continuous integration on data-driven projects.

Proactive and goal-oriented, with strong time management and problem-solving skills, a reputation for reliability and adaptability, and a quick grasp of new tools and technologies. Committed to applying these strengths to drive team success and organizational growth.