Nithin Kumar Kollu

Land O' Lakes, FL

Summary

DevOps Engineer with 10+ years of IT experience spanning DevOps, infrastructure automation, and Big Data. Proficient in architecting and automating infrastructure on AWS, Azure, GCP, and Kubernetes using Terraform and Ansible. Experienced in setting up Kafka and Hadoop clusters, building frameworks for logging (ELK) and for monitoring and alerting (Prometheus, Grafana), and optimising cloud costs.

Overview

11 years of professional experience

Work History

DevOps Engineer

Unravel Data Systems Pvt Ltd
2021.07 - Current
  • Deployed the Unravel product in multiple customer environments.
  • Automated Unravel product deployment using Ansible.
  • Worked on CloudStack for server virtualization; migrated VMs and servers from Vagrant to CloudStack.
  • Enabled monitoring for VMs using Prometheus and Grafana.
  • Created Hortonworks, CDP, and MapR clusters for Unravel integration.
  • Worked on Dataproc, EMR, BigQuery, and Databricks clusters.
  • Generated workloads on all clusters using Jenkins and Ansible.
  • Used Terraform to deploy and terminate clusters in the cloud.
  • Built a framework that simplifies cloud resource creation with scheduled termination times, saving $1 million in one year.
  • Resolved issues reported by customers in Salesforce.
  • Wrote scripts to build dashboards for cost utilisation across cloud platforms.
  • Integrated Unravel with open-source Kafka for collecting metrics.
  • Packaged Unravel builds using Jenkins.
  • Working on containerising the product for SaaS deployment.
  • Created runbooks documenting issues encountered and the steps to fix them.
  • Handled server procurement, OS installation, and network and RAID configuration.
  • Worked on templates for AutoActions to trigger notifications.
  • Wrote test cases for failure scenarios of Spark, Hive, and Oozie jobs.
  • Tracked all tickets using JIRA and participated in sprint planning.
  • Integrated LDAP for the Unravel server and its associated clusters.

Senior Associate

DBS Asia HUB 2
2019.07 - 2021.07
  • Created automation for infrastructure provisioning, deployment, and upgrades for Druid, Hadoop, and Kafka on VMs and AWS using Ansible.
  • Migrated the Confluent Kafka cluster from a service domain to the bank domain.
  • Enabled RBAC for Confluent Kafka; created Kafka-S3 and MySQL-Kafka connectors.
  • Enabled log and metric monitoring for Kafka, Druid, and Hadoop clusters with Prometheus, Grafana, Nagios, and ELK.
  • Built self-healing scripts using Bash and Python.
  • Troubleshot incidents related to Kafka, Druid, and Hadoop, tuning the clusters for better performance and permanent closure of incidents.
  • Implemented Kafka-based ingestion into Druid to track each tenant's real-time usage of Hadoop queue resources for chargeback.
  • Architected and deployed Imply's Druid and migrated existing tenants from open-source Druid.
  • Ran Alluxio with SAN as the storage layer.
  • Created charts in Pivot to let application teams monitor their query response times.
  • Fixed all NVAs to keep the platforms in compliance with the bank's policies.
  • Enabled KMS for encryption of data at rest.
  • Created Terraform automation to provision AWS infrastructure such as EC2, S3, RDS, EBS, and EFS.
  • Created notifications to maintain cost-efficient use of AWS infrastructure using Cost Explorer, CloudWatch, and SNS.
  • Created and maintained Kubernetes clusters.
  • Created Jenkins pipelines for automated CI/CD to containerize applications and deploy them on OpenShift.

Software Developer Senior Analyst

Accenture Solutions Pvt Ltd
2018.03 - 2019.07
  • Wrote Ansible playbooks to deploy AMP, a Kogentix product.
  • Spun up HDInsight clusters and managed Data Lake Stores on Azure; built Hadoop clusters on AWS and on-prem with TLS encryption on the Hortonworks and Cloudera platforms.
  • Installed and configured Elasticsearch, Kibana, and Logstash for name pattern matching.
  • Interacted with Cloudera and Hortonworks support on issues and fixed them per their recommendations.
  • Managed Cloudera Data Science Workbench.
  • Automated deployment of Alluxio and integrated it with HDFS.
  • Handled user management, patch and package administration, and filesystem management.

Change Control Specialist

Teradata India Pvt Ltd
2013.09 - 2018.03
  • Installed Hortonworks clusters for Mayo Clinic and Horizon Health Services, and a Cloudera cluster for AMC Theatres.
  • Upgraded clusters to new releases as required and maintained cluster health.
  • Transferred and loaded data between Teradata/Aster and Hadoop.
  • Set up Kerberos and Sentry for authentication and authorization.
  • Installed and configured the Teradata QueryGrid connector for the Hortonworks and Cloudera distributions.
  • Configured ACLs for different users on the cluster to access HDFS.

Education

Bachelor of Science - Computer Science

St. Peter's Engineering College
Hyderabad
2013

Intermediate (XII Standard) - Mathematics

Narayana Junior College
Hyderabad
2009

Schooling (Xth Standard)

City Central School
Kodad
2007

Skills

  • Kubernetes
  • Kafka
  • Hadoop
  • Druid
  • Linux
  • Jenkins
  • AWS, GCP, Azure
  • Prometheus
  • Grafana
  • Git
  • MySQL, PostgreSQL
  • CloudStack
  • Databricks, EMR, Dataproc, BigQuery
  • Terraform, Ansible
  • Shell Scripting, Python
