Vipinkumar Galipoina

Fuquay-Varina, NC

Summary

Seasoned Cloud Engineer at Fidelity Investments with expertise in AWS and Azure, driving automation through CI/CD pipelines and Infrastructure as Code. Proven ability to enhance cloud performance and security while leading cross-functional teams. Skilled in Python and strategic collaboration, achieving significant efficiency improvements in cloud deployments.

Overview

9 years of professional experience
1 Certification

Work History

Cloud Engineer

Fidelity Investments
Durham, NC
01.2024 - Current
  • Configured and maintained cloud-based data infrastructure on platforms like AWS and Azure to enhance data storage and computation capabilities.
  • Implemented CI/CD pipelines for automated deployment and configuration in AWS and Azure using Terraform, ARM templates, and CloudFormation.
  • Streamlined database provisioning and maintenance with CloudFormation, Terraform, OpenTofu, and Ansible.
  • Built custom Oracle AMIs for AWS EC2 and Azure VM images using Packer and Ansible, reducing manual instance setup time.
  • Automated deployment processes by creating efficient scripts in Python, Shell, and Groovy.
  • Enhanced deployment accuracy by setting up CI/CD pipelines using Jenkins and GitHub Actions.
  • Executed performance-focused projects, including RDS evaluations and SCSI-to-NVMe migrations on cloud systems.
  • Developed Jenkins core CI/CD pipelines using Shell scripts, AWS CLI, and Azure CLI to optimize infrastructure workflows.
  • Secured multi-cloud environments by managing AWS IAM policies and Azure Service Principal credentials.
  • Configured AWS SSO with SAML and Azure CLI (az login) authentication, securely storing secrets in Vault paths.
  • Enhanced cloud workload performance by adjusting kernel parameters such as vm.swappiness and TCP buffer sizes.
  • Diagnosed multi-threaded service issues using Linux tools (GDB, strace, perf).
  • Performed system administration tasks to maintain high availability of cloud services.
  • Built automated diagnostic processes with Bash scripts for monitoring the health of Linux systems in cloud environments.
  • Implemented integration of SonarQube with Jenkins, lowering code vulnerabilities.
  • Integrated concepts from vRealize Automation into projects using Terraform and Ansible.
  • Configured load balancers and auto-scaling groups to ensure high availability and fault tolerance.
  • Evaluated firewalls and monitored threats to establish secure cloud environments.
  • Configured authentication protocols such as LDAP and Kerberos within the enterprise environment.

Senior Data Engineer

Ernst & Young
Durham, NC
01.2022 - 01.2024
  • Collaborated with cross-functional teams to gather requirements and translate business needs into technical specifications for data solutions.
  • Developed a Common Data Platform (CDP) for data ingestion, API integrations, and distribution.
  • Migrated Oracle databases from on-premises to AWS EC2, leveraging Jenkins for deployment automation.
  • Implemented robust RESTful APIs with Java and Spring Boot, ensuring seamless data handling on AWS EKS.
  • Developed and containerized REST APIs using Docker and Kubernetes.
  • Integrated Informatica mappings with Oracle to automate ETL workflows.
  • Deployed APIs using Helm and Kubernetes within the CI/CD framework.
  • Established local containerized environments with Docker Desktop, Kubernetes, and Helm.
  • Implemented SonarQube within CI/CD pipelines to enhance code quality and security compliance.
  • Implemented Kubernetes Horizontal Pod Autoscaler (HPA) scaling, enabling proactive issue resolution.
  • Implemented message queues such as Kafka to incorporate APIs efficiently.
  • Instructed junior engineers in database schema protocols, elevating knowledge sharing across the team.
  • Enhanced test coverage and system reliability by developing test cases in utPLSQL, Mockito, and JaCoCo.
  • Optimized package dependency management with JFrog and MyBatis for efficient builds.
  • Engineered scalable Data Warehouse & Data Lake solution with Snowflake, AWS S3, and DBT.
  • Optimized and developed ELT workflows in Apache Airflow.
  • Developed standard data modeling, schema design, and access control policies adhering to best practices and business needs.
  • Implemented DBT transformations to boost data quality and reporting efficiency.
  • Streamlined Snowflake query performance for faster data access.
  • Implemented and optimized big data storage solutions, including Hadoop and NoSQL databases, to improve data accessibility and efficiency.
  • Performed advanced analytics on structured and unstructured data using SQL, Python, and R.
  • Optimized queries for better performance on relational databases such as PostgreSQL and MySQL.
  • Implemented serverless architectures with AWS Lambda, API Gateway, and DynamoDB for cost-efficient cloud operations.
  • Implemented distributed computing frameworks such as Apache Spark, Hadoop, Kafka, and Flink for large-scale data processing tasks.

Senior Data Engineer

Credit Suisse
06.2018 - 01.2022
  • Designed and developed a Legal Compliance Data Warehouse to consolidate investment banking and asset management data, enabling faster regulatory reporting and compliance tracking.
  • Designed a scalable data ingestion framework with Azure Data Factory and Databricks.
  • Developed event-driven data pipelines with Azure Functions and Event Grid for real-time data streaming.
  • Optimized data transformations for multi-terabyte datasets, boosting ETL job performance with Spark & PySpark.
  • Transitioned Oracle data models to Azure SQL Data Warehouse, achieving a 30% reduction in infrastructure costs.
  • Developed serverless data-sharing solutions with Azure Data Factory, PolyBase connectors, and Data Lake.
  • Designed automated ETL job deployments and infrastructure-as-code (IaC) processes, minimizing manual tasks by 60%.
  • Enhanced Azure Synapse Analytics efficiency using partitioning, indexing, and query optimization techniques.
  • Managed security improvements with implementation of RBAC and encryption techniques, achieving regulatory compliance.
  • Enhanced efficiency and reduced costs by tuning Azure Data Lake and SQL Data Warehouse.
  • Enhanced regulatory reporting through collaboration with Data Science, Risk & Compliance, and Cloud Engineering teams.
  • Enhanced DevOps automation standards by coaching peers in technical methodologies.
  • Built a SharePoint-based Work Intake Tool for Project Management, storing data in SharePoint Lists and automating workflows with custom REST services and InfoPath-designed forms, streamlining request tracking and approvals.

Oracle Developer

Wells Fargo
01.2017 - 06.2018
  • Developed complex Oracle PL/SQL packages, procedures, functions, and triggers to support Online Payment application development.
  • Developed UNIX Shell scripts to streamline the scheduling of data cleansing processes.
  • Enhanced performance by optimizing SQL queries using indexing.
  • Crafted Python scripts to automate data ingestion processes.
  • Implemented ETL workflows using Informatica for enhanced transaction data processing.
  • Performed data migration from legacy systems into Oracle Database using SQL-based methods.
  • Performed code reviews, debugging, optimization and fine-tuning of SQL queries and PL/SQL programs.
  • Worked with end-users and functional analysts to capture and document business requirements and inform development.

Software Intern

Trigger IT LLC
09.2016 - 12.2016
  • Created comprehensive queries using Oracle Forms 10g for enterprise applications.
  • Enhanced workflow efficiency through automation of data ingestion with Korn shell scripting.
  • Designed effective SQL queries to bolster backend support and address troubleshooting.
  • Assisted in the preparation of project timelines and delivery schedules.

Education

Master's - Computer Science

University of Central Missouri
Missouri
12.2016

Skills

  • Python and C
  • Bash
  • Groovy
  • Scala
  • Java
  • SQL
  • Oracle PL/SQL
  • AWS
  • EKS
  • EC2
  • Lambda
  • RDS
  • CloudFormation
  • Route 53
  • S3
  • EFS
  • EBS
  • IAM
  • Azure
  • Azure VM
  • Entra ID
  • ADF
  • Synapse
  • ADLS Gen2
  • Azure DevOps
  • Terraform
  • OpenTofu
  • Ansible
  • Kubernetes
  • Docker
  • CI/CD
  • Jenkins
  • JFrog
  • GitHub Actions
  • Snowflake
  • Hadoop
  • Spark
  • Kafka
  • Oracle
  • SQL Server
  • HBase
  • Hive
  • MongoDB
  • Informatica
  • SnapLogic
  • Databricks
  • Airflow
  • Glue
  • DBT
  • Splunk
  • Datadog
  • Power BI
  • Tableau
  • Cloud Cost Optimization
  • Security Best Practices
  • High Availability Architectures
  • Strategic Team Leadership
  • Cross-squad collaboration
  • Scrum and Agile Practices
  • CI/CD pipelines
  • Infrastructure as Code
  • Containerization techniques

Certification

Snowflake SnowPro Core Certified, Snowflake, 06.2023
