
Sunil Kumar

Salt Lake City, UT

Summary

Detail-oriented data engineer who designs, develops, and maintains highly scalable, secure, and reliable data structures. Accustomed to working closely with system architects, software architects, and design analysts to translate business and industry requirements into comprehensive data models. Proficient at developing database architectural strategies at the modeling, design, and implementation stages. Responsive expert experienced in monitoring database performance, troubleshooting issues, and optimizing database environments, with strong analytical skills, excellent problem-solving abilities, and a deep understanding of database technologies and systems. Equally confident working independently and collaboratively, with excellent communication and organizational skills, and able to handle multiple projects simultaneously with a high degree of accuracy. Seeking a full-time position that offers professional challenges and draws on interpersonal, time-management, and problem-solving skills.

Overview

9 years of professional experience

Work History

Cloud Data Engineer

Goldman Sachs
01.2022 - Current
  • Led end-to-end implementation of multiple high-impact projects from requirements gathering through deployment and post-launch support stages.
  • Evaluated various tools, technologies, and best practices for potential adoption in the company's data engineering processes.
  • Increased efficiency of data-driven decision making by creating user-friendly dashboards enabling quick access to key metrics.
  • Collaborated with cross-functional teams for seamless integration of data sources into the company's data ecosystem.
  • Conducted extensive troubleshooting to identify root causes of issues and implement effective resolutions in a timely manner.
  • Enhanced data quality by performing thorough cleaning, validation, and transformation tasks.
  • Designed scalable and maintainable data models to support business intelligence initiatives and reporting needs.
  • Developed custom algorithms for advanced analytics, driving actionable insights from large datasets.
  • Migrated legacy systems to modern big-data technologies, improving performance and scalability while minimizing business disruption.
  • Automated routine tasks using Python scripts, increasing team productivity and reducing manual errors.
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Applied GDP to validation protocols, test cases, and change control documents.
  • Developed and delivered business information solutions.
  • Managed identification, protection and use of data assets.
  • Communicated new or updated data requirements to global team.
  • Implemented security best practices within the AWS environment, safeguarding sensitive data and ensuring compliance with industry regulations.
  • Developed and maintained CI/CD pipelines using Jenkins, increasing deployment speed and reliability.
  • Migrated legacy applications to AWS, reducing downtime and improving overall efficiency.
  • Facilitated better decision-making with advanced analytical tools like Amazon Redshift or EMR for business intelligence insights.
  • Enhanced data security through the implementation of encryption algorithms and access controls.
  • Ensured high availability of critical systems by designing and deploying fault-tolerant architectures on AWS infrastructure.
  • Developed custom ETL processes for efficient data ingestion and transformation.
  • Upheld best practices in code quality, testing, documentation, deployment, and continuous integration within a DevOps-focused team environment.
  • Simplified database management tasks, leveraging managed services such as RDS or DynamoDB in the cloud environment.
  • Optimized data processing by implementing AWS-based big data solutions.
  • Enabled real-time analytics capabilities by designing streaming data pipelines using Amazon Kinesis or Apache Kafka.
  • Reduced costs by automating manual processes using AWS Lambda functions.
  • Increased productivity by providing training and support for team members on various AWS services, tools, and best practices.
  • Minimized downtime with proactive monitoring and troubleshooting of complex data pipelines.
  • Improved scalability and performance by migrating legacy systems to cloud-based platforms.
  • Worked with teams of talented software engineers to define, build and maintain cloud infrastructure.
  • Identified, analyzed and resolved infrastructure vulnerabilities and application deployment issues.
  • Utilized code and modern cloud-native deployment techniques to design, plan and integrate cloud computing and virtualization systems.
  • Wrote and maintained custom scripts to increase system efficiency and performance time.
  • Participated in system development life cycle from requirements analysis through system implementation.

Cloud Engineer

Valeant
07.2019 - 12.2021
  • Developed custom scripts for automating routine tasks, increasing team productivity and reducing manual errors.
  • Reduced server downtime by proactively monitoring cloud resources and addressing potential issues before they escalated.
  • Streamlined complex workflows by breaking them down into manageable components for easier implementation and maintenance.
  • Provided technical guidance and mentorship to junior team members, fostering a collaborative learning environment within the organization.
  • Automated routine tasks using Python scripts, increasing team productivity and reducing manual errors.
  • Established robust monitoring processes to detect system anomalies.
  • Conducted extensive troubleshooting to identify root causes of issues and implement effective resolutions in a timely manner.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Developed and maintained CI/CD pipelines using Jenkins, increasing deployment speed and reliability.
  • Implemented security best practices within the AWS environment, safeguarding sensitive data and ensuring compliance with industry regulations.
  • Enhanced system performance by implementing AWS CloudFormation templates for infrastructure automation.
  • Facilitated better decision-making with advanced analytical tools like Amazon Redshift or EMR for business intelligence insights.
  • Ensured compliance with industry regulations by implementing robust security measures across all stages of the data lifecycle.
  • Ensured high availability of critical systems by designing and deploying fault-tolerant architectures on AWS infrastructure.
  • Developed custom ETL processes for efficient data ingestion and transformation.
  • Upheld best practices in code quality, testing, documentation, deployment, and continuous integration within a DevOps-focused team environment.
  • Simplified database management tasks, leveraging managed services such as RDS or DynamoDB in the cloud environment.
  • Optimized data processing by implementing AWS-based big data solutions.
  • Accelerated time-to-market for new products with agile methodologies in developing end-to-end data engineering solutions.
  • Enabled real-time analytics capabilities by designing streaming data pipelines using Amazon Kinesis or Apache Kafka.
  • Reduced costs by automating manual processes using AWS Lambda functions.
  • Increased productivity by providing training and support for team members on various AWS services, tools, and best practices.
  • Minimized downtime with proactive monitoring and troubleshooting of complex data pipelines.
  • Worked with teams of talented software engineers to define, build and maintain cloud infrastructure.
  • Used metrics to monitor application and infrastructure performance.
  • Utilized code and modern cloud-native deployment techniques to design, plan and integrate cloud computing and virtualization systems.
  • Managed installation, upgrade and deployment projects and provided on-site direction for network engineers.
  • Participated in system development life cycle from requirements analysis through system implementation.
  • Managed use of various types of databases and configured, installed and upgraded new ones.

Data Engineer

Anthem
03.2017 - 06.2019
  • Enhanced data quality by performing thorough cleaning, validation, and transformation tasks.
  • Streamlined complex workflows by breaking them down into manageable components for easier implementation and maintenance.
  • Automated routine tasks using Python scripts, increasing team productivity and reducing manual errors.
  • Fine-tuned query performance and optimized database structures for faster, more accurate data retrieval and reporting.
  • Designed scalable and maintainable data models to support business intelligence initiatives and reporting needs.
  • Increased efficiency of data-driven decision making by creating user-friendly dashboards enabling quick access to key metrics.
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design.
  • Designed compliance frameworks for multi-site data warehousing efforts to verify conformity with state and federal data security guidelines.
  • Contributed to internal activities for overall process improvements, efficiencies and innovation.
  • Designed advanced analytics ranging from descriptive to predictive models to machine learning techniques.
  • Prepared written summaries to accompany results and maintain documentation.
  • Developed database architectural strategies at modeling, design, and implementation stages to address business or industry requirements.
  • Ensured high availability of critical systems by designing and deploying fault-tolerant architectures on AWS infrastructure.
  • Developed custom ETL processes for efficient data ingestion and transformation.
  • Simplified database management tasks, leveraging managed services such as RDS or DynamoDB in the cloud environment.
  • Increased productivity by providing training and support for team members on various AWS services, tools, and best practices.
  • Boosted collaboration between teams through effective communication of technical concepts and project requirements to non-technical stakeholders.
  • Managed use of various types of databases and configured, installed and upgraded new ones.

Data Engineer

Symphony
08.2015 - 02.2017
  • Automated routine tasks using Python scripts, increasing team productivity and reducing manual errors.
  • Evaluated various tools, technologies, and best practices for potential adoption in the company's data engineering processes.
  • Designed scalable and maintainable data models to support business intelligence initiatives and reporting needs.
  • Increased efficiency of data-driven decision making by creating user-friendly dashboards enabling quick access to key metrics.
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Communicated new or updated data requirements to global team.
  • Contributed to internal activities for overall process improvements, efficiencies and innovation.
  • Applied GDP to validation protocols, test cases, and change control documents.
  • Developed and delivered business information solutions.
  • Reviewed project requests describing database user needs to estimate time and cost required to accomplish projects.
  • Managed identification, protection and use of data assets.

Education

MBA

Campbellsville University
Campbellsville, KY
08.2021

Master of Science - Information Technology

Campbellsville University
Campbellsville, KY
08.2019

Master of Science - Computer And Information Sciences

The American College of Commerce And Technology
Falls Church, VA
03.2017

Bachelor of Science - Pharmacy

Acharya Nagarjuna University
Guntur, India
04.2010

Skills

  • ETL development
  • Data Warehousing
  • Data Migration
  • API Development
  • Scripting Languages
  • SQL Programming
  • RESTful APIs
  • Software Development
  • Object-Oriented Programming
  • Teamwork and Collaboration
  • Agile Methodology
  • Amazon Web Services (AWS)
  • Databricks
  • Python
  • Redshift
  • Elasticsearch
  • DynamoDB
  • Lambda
  • Kinesis
  • Glue and AI/ML tools

Timeline

Cloud Data Engineer

Goldman Sachs
01.2022 - Current

Cloud Engineer

Valeant
07.2019 - 12.2021

Data Engineer

Anthem
03.2017 - 06.2019

Data Engineer

Symphony
08.2015 - 02.2017

MBA

Campbellsville University

Master of Science - Information Technology

Campbellsville University

Master of Science - Computer And Information Sciences

The American College of Commerce And Technology

Bachelor of Science - Pharmacy

Acharya Nagarjuna University