Kirthna Conjeepuram

Charlotte, NC

Summary

Aspiring DevOps engineer seeking opportunities in DevOps and cloud engineering

Overview

1 year of professional experience

Work History

DevOps Engineer

Random Trees
03.2021 - 02.2022
  • Set up and configured development, testing, and production environments on Linux and AWS
  • Assisted in automating deployment processes using tools such as Jenkins and GitLab CI/CD
  • Collaborated with development and operations teams to troubleshoot issues and ensure smooth software delivery
  • Contributed to improvements of CI/CD pipelines for faster, more reliable software delivery
  • Participated in on-call rotations to address urgent issues arising outside regular hours
  • Documented processes, procedures, and configurations for future reference and knowledge sharing

Education

Master of Science -

Trine University
Phoenix, AZ
12-2023

Bachelor of Science -

Aurora Degree College
Hyderabad, India
06-2021

Skills

  • AWS
  • Azure
  • Linux
  • Docker
  • Kubernetes
  • CI/CD
  • Bash
  • Terraform
  • Gitlab
  • Kafka
  • Redis

Projects

End-to-End CI/CD Implementation - Jenkins

  • Implemented end-to-end CI/CD for a Java application using Jenkins declarative pipelines; built Docker images and deployed them to Kubernetes using Argo CD

Implemented Three Tier Architecture - AWS

  • Implemented a highly available and scalable three-tier architecture in AWS using Terraform and deployed the application on EC2 instances
  • AWS components involved: VPC, EC2, IAM, SG, NACL, S3, Route 53

Cloud Cost Optimization - AWS

  • Used a combination of AWS CloudWatch and Lambda functions to reduce cloud usage cost
  • Created a Lambda function in Python that uses the boto3 module to interact with AWS service APIs and is triggered by CloudWatch events; the function watches for unused EBS snapshots and notifies the owner via SNS
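The snapshot-cleanup flow above can be sketched as a Lambda handler. This is a minimal illustration, not the exact implementation: the SNS topic ARN, the 30-day staleness threshold, and the helper name `find_unused_snapshots` are assumptions; the boto3 calls (`describe_snapshots`, `describe_volumes`, `sns.publish`) are real APIs.

```python
from datetime import datetime, timedelta, timezone

def find_unused_snapshots(snapshots, attached_volume_ids, min_age_days=30):
    """Return IDs of snapshots older than min_age_days whose source
    volume is no longer attached to any instance."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=min_age_days)
    unused = []
    for snap in snapshots:
        if snap["StartTime"] > cutoff:
            continue  # too recent to consider stale
        if snap.get("VolumeId") not in attached_volume_ids:
            unused.append(snap["SnapshotId"])
    return unused

def lambda_handler(event, context):
    # boto3 ships with the AWS Lambda Python runtime
    import boto3
    ec2 = boto3.client("ec2")
    sns = boto3.client("sns")
    snapshots = ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "attachment.status", "Values": ["attached"]}]
    )["Volumes"]
    attached = {v["VolumeId"] for v in volumes}
    stale = find_unused_snapshots(snapshots, attached)
    if stale:
        sns.publish(
            TopicArn="arn:aws:sns:us-east-1:123456789012:ebs-cleanup",  # hypothetical topic
            Subject="Unused EBS snapshots",
            Message="Snapshots with no attached source volume: " + ", ".join(stale),
        )
    return {"unused_snapshots": stale}
```

Keeping the selection logic in a pure function makes it testable without AWS credentials; the handler only wires it to the EC2 and SNS clients.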

Data Processing and Caching System using Kafka and Redis

  • Set up an Apache Kafka cluster to handle ingestion of data streams; configured topics to organize and partition data efficiently; implemented a consumer module that subscribes to Kafka topics; and integrated Redis as a caching layer for frequently accessed data
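The caching layer described above can be sketched as a read-through cache plus a consumer loop. This is a sketch under assumptions: the cache object is expected to follow the redis-py interface (`get`, `set` with an `ex` TTL), the consumer is assumed to yield messages with a JSON `value` carrying an `id` field, and `load_record` is a hypothetical loader for cache misses.

```python
import json

def get_with_cache(key, cache, load_record, ttl_seconds=300):
    """Read-through cache: return the cached value if present,
    otherwise load the record, cache it with a TTL, and return it."""
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    record = load_record(key)
    cache.set(key, json.dumps(record), ex=ttl_seconds)
    return record

def consume_loop(consumer, cache, ttl_seconds=300):
    """For each Kafka message, refresh the cache entry for that record."""
    for message in consumer:
        record = json.loads(message.value)
        cache.set(record["id"], json.dumps(record), ex=ttl_seconds)
```

With the real clients, `cache` would be a `redis.Redis(...)` instance and `consumer` a `kafka.KafkaConsumer(topic, bootstrap_servers=...)`; the functions themselves only depend on their interfaces, so the miss/hit logic can be exercised with fakes.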
