Abdullah Anwar

Murrieta, CA

Summary

  • Overall 10 years of IT experience, including 4+ years with Confluent Kafka installing and configuring Kafka ecosystem components in existing clusters, and 5 years of Linux administration on RHEL 4.x/5.x/6.x/7.x/8.x, SUSE Linux 11.x/10/9 and CentOS 5.x/6.x/7.x
  • Experienced with various data file formats, including Parquet, JSON, ORC, CSV and Avro
  • Experience developing and deploying Apache/Confluent Kafka
  • Experience using change management tools such as Git and AWS CodeCommit
  • Hands-on experience installing, patching, upgrading and configuring Linux-based operating systems (RHEL and CentOS) across large clusters
  • Knowledge of the SDLC: requirement gathering and analysis, planning, designing, developing, testing and implementing
  • Expert in enabling security on Hadoop clusters using Kerberos
  • Proficient in handling hardware issues, migrations and data center operations
  • Experience writing shell scripts in Bash and Perl for process automation of databases, applications, backups and scheduling
  • Experienced in performance monitoring, security, troubleshooting, backup, disaster recovery, maintenance and support of UNIX systems
  • Experience installing, upgrading and configuring Red Hat Linux 3.x, 4.x, 5.x using Kickstart servers and interactive installation
  • Knowledge of networking (TCP/IP, Ethernet), NFS, DHCP, SMTP and RAID
  • Documented systems processes and procedures for future reference
  • Worked on an automation framework to create, delete and alter topics and to add/remove ACLs (a brief sketch follows this summary)
  • Experience with monitoring tools such as Grafana, Prometheus and Elasticsearch
  • Bachelor's degree in Computer Science
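
A minimal, illustrative sketch of the topic/ACL automation mentioned above, using the stock Kafka CLI tools. The broker address, topic name and principal are hypothetical placeholders; on a secured cluster an admin --command-config properties file would normally be passed as well.

    #!/usr/bin/env bash
    # Sketch only: create a topic and grant a producer ACL with the standard Kafka CLIs.
    # BOOTSTRAP, TOPIC and PRINCIPAL are hypothetical placeholder values.
    set -euo pipefail

    BOOTSTRAP="broker1:9092"
    TOPIC="orders.v1"
    PRINCIPAL="User:orders-app"

    # Create the topic with explicit partition and replication settings
    kafka-topics --bootstrap-server "$BOOTSTRAP" \
      --create --topic "$TOPIC" --partitions 6 --replication-factor 3

    # Allow the application principal to produce to the topic
    kafka-acls --bootstrap-server "$BOOTSTRAP" \
      --add --allow-principal "$PRINCIPAL" --operation Write --topic "$TOPIC"

(Apache Kafka packages ship the same tools as kafka-topics.sh and kafka-acls.sh.)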

Overview

11 years of professional experience

Work History

Kafka Engineer

TCS
12.2020 - Current
  • Experience in Configuring and Setting up Schema Registry, Kafka Connect, Rest Proxy, Confluent Control Center & KSQL clusters
  • Designed and implemented the self-managed multi-datacenter architecture from scratch for On-Premise Deployments
  • Managed Kafka Access Control Lists (ACLs) and Simple Authentication and Security Layer (SASL) components to lock down and secure clusters
  • Experience in Apache Kafka, ZooKeeper, partitioning of topics, and adding/deleting nodes
  • Handle all Kafka environment builds, including design, capacity planning, cluster setup and performance tuning
  • Perform high-level, day-to-day operational maintenance, support, and upgrades for over 15 production Kafka Clusters
  • Experience in building Kafka pipelines using Ansible
  • Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices
  • Successfully implemented tier-1 active-active design architecture to minimize the downtime and data loss resulting from a disaster
  • Utilizing the Kafka Connect framework to connect to Oracle/MongoDB databases and push/pull data to/from Kafka
  • Using KSQL to create tables from Apache Kafka topics and to create tables of query results from other tables or streams
  • Setting up LDAP for user authentication, using bearer tokens against MDS endpoints
  • Implementing client authentication over SASL_SSL and configuring a separate principal for the worker and the connectors (a configuration sketch follows this list)
  • Worked with a variety of source and sink connectors such as IBM MQ, FileStream, MongoDB, Elasticsearch, etc.
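
As an illustration of the SASL_SSL client authentication mentioned in this list, the sketch below writes a client properties file and produces a test record over the secured listener. Hostnames, credentials, the SCRAM mechanism and the truststore path are assumed placeholders, not values from any actual environment.

    #!/usr/bin/env bash
    # Sketch only: SASL_SSL client configuration for the stock Kafka CLI clients.
    # All hostnames, credentials and paths below are hypothetical placeholders.
    set -euo pipefail

    {
      echo 'security.protocol=SASL_SSL'
      echo 'sasl.mechanism=SCRAM-SHA-512'
      echo 'sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="app-user" password="changeit";'
      echo 'ssl.truststore.location=/etc/kafka/secrets/truststore.jks'
      echo 'ssl.truststore.password=changeit'
    } > client.properties

    # Produce a test record over the authenticated, encrypted listener
    # (older Kafka releases use --broker-list instead of --bootstrap-server)
    echo 'hello' | kafka-console-producer --bootstrap-server broker1:9093 \
      --topic test.topic --producer.config client.properties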

Infrastructure Engineer

Walgreens
12.2017 - 01.2020
  • Involved in all aspects of system design, including domain modeling, microservice design, security, DevOps, and infrastructure
  • Implemented Kafka Security Features using SSL and Kerberos
  • Designed and participated in architectural discussions around DR functionality intended to prevent loss of business revenue, and prepared a runbook for better understanding of the scenario
  • Resolved Kafka pipeline issues, providing knowledgeable support and quality service
  • Configured multi-node environments, including Production, QA, UAT and DEV
  • Set up Confluent Control Center (C3) for managing and monitoring Kafka clusters across environments, and customized C3 for better performance
  • Deployed Confluent Replicator on the DC/OS platform as a connector that replicates data among multiple datacenter clusters in near real-time
  • Designed and configured the clusters to support message sizes larger than 1 MB (up to 25 MB), which is unusual for a streaming platform like Kafka
  • Installed single-node/single-broker and multi-node/multi-broker clusters, encrypted with SSL/TLS and authenticated with SASL/PLAIN, SASL/SCRAM and SASL/GSSAPI (Kerberos)
  • Installed and configured enterprise features that provide additional critical functionality, namely Replicator and Auto Data Balancer
  • Managed and monitored the Replicator connector from Confluent Control Center for near real-time data transfer between four different Kafka clusters in the Production/DR and Production Testing/DR environments
  • Upgraded the Confluent Enterprise and Community edition platforms across versions, identifying and fixing issues so production versions met general needs
  • Updated kernel and security patches in the Linux environment and handled out-of-memory issues in Linux kernels during rebalances in the Kafka cluster
  • Implementation of Ansible Playbooks as an automation platform for configuration management, application deployment, and task automation
  • Generated consumer group lag reports in Kafka (a sample command follows this list).
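
A minimal example of the consumer-group lag reporting mentioned above, using the standard Kafka CLI; the broker address and group name are placeholders.

    # List consumer groups, then describe one to see per-partition current offset, end offset and lag
    kafka-consumer-groups --bootstrap-server broker1:9092 --list
    kafka-consumer-groups --bootstrap-server broker1:9092 \
      --describe --group payments-consumer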

Linux/DevOps Engineer

DXC
03.2013 - 11.2017
  • Involved in DevOps automation processes for the build and deployment of a 3-tier web architecture
  • Worked closely with project managers to understand the scope of each code and configuration release
  • Built and continuously improved the build infrastructure for DevOps engineering, thereby implementing scalable infrastructure
  • Responsible for maintaining process documentation on Company SharePoint
  • Provided 24x7 on-call / Remote Support for LINUX Production Problems on weekly rotation basis
  • Provided day-to-day user administration, such as adding/deleting users in local and global groups on the Red Hat Linux platform, and managed user queries
  • Responsible for consistently repeatable builds/deployments to company production and non-production environments using Jenkins build pipelines, Ant, Maven and shell scripts
  • Experienced in patch installation, patch upgrades and package installation on Red Hat Linux servers using RPM and YUM (a brief patching sketch follows this list)
  • Expertise in configuring and managing Linux virtual machines under VMware 5.x
  • Performed regular disk management, such as adding/replacing hot-swappable drives on existing servers and workstations, partitioning according to requirements, creating new file systems or growing existing ones, and managing file systems
  • Managed and documented all post-deployment issues using the Post Deployment Issue Log
  • Virtualized servers using Docker for test and development environment needs
  • Configured Docker containers for branching purposes
  • Developed Perl and shell scripts for automation of the build and release process.
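
An illustrative outline of the RPM/YUM-based patching workflow referenced above; plugin availability and reboot handling differ across RHEL releases, so this is a sketch under those assumptions rather than a drop-in script.

    #!/usr/bin/env bash
    # Sketch only: apply security errata on a RHEL/CentOS host and reboot if required.
    set -euo pipefail

    # --security relies on the yum security plugin on older RHEL releases
    sudo yum -y update --security

    # needs-restarting ships with yum-utils; -r exits non-zero when a reboot is needed
    sudo needs-restarting -r || sudo reboot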

Education

Bachelor of Science - Computer Science

Skills

  • Confluent-Kafka-Kraft
  • Zookeeper
  • ELK stack
  • Nagios
  • Jenkins
  • MongoDB
  • MariaDB
  • Oracle
  • SQL Server
  • MySQL
  • Ansible
  • Puppet
  • Ant
  • Maven
  • JIRA
  • GitHub
  • GitLab
  • TCP/IP
  • DHCP
  • DNS
  • HTTP/HTTPS
  • SOAP
  • AWS
  • C
  • Java
  • WebLogic
  • WebSphere
  • Apache Tomcat
  • Shell/Bash
  • Python
  • JavaScript
