Veera Shankara Ravindra Reddy Kakarla

Pittsburgh, USA

Summary

Over 14 years of professional experience in the IT industry, with diversified exposure to software process engineering and to designing and building web applications using Java/J2EE, Spring Boot, microservices, and cloud services. Experience working as a Technical Architect, identifying business problems, designing solutions, and developing software architecture using Java-based solutions.

  • Experience in software planning, requirement analysis, and designing and building enterprise applications for manual and automated processes.
  • Experience developing projects in the banking, healthcare, lifestyle media, autism, ETL, and market data domains. Experience in object-oriented methodology and design patterns.
  • Requirement gathering and analysis, and UML design for developing application-specific object models, use case diagrams, class diagrams, sequence diagrams, and state diagrams.
  • Experience in frameworks such as Spring Boot 2, Spring Boot 3.2.0, Spring MVC, Spring ORM, Spring JDBC, Spring AOP, Spring Context, Spring Security, and Hibernate. Experience working with BPM tools such as Camunda.
  • Experience in developing frameworks using Quarkus and GraalVM. Experience in developing applications using Java technologies including Core Java, OOP, multithreading, JDK 8, JDK 11, JDK 17, JDK 19, JDK 21, J2EE, Angular, ReactJS, Java Server Pages (JSP), Java Server Faces (JSF), Servlets, JavaScript, JDBC, JavaMail, JMS, and EJB. Experience in implementing web services based on Service-Oriented Architecture (SOA) using SOAP, RESTful web services, JAX-WS, UDDI, WSDL, and Apache Axis, with technologies such as JSON, XML, JAXB, Swagger 2.0, OpenAPI, and Jersey. Knowledge of OAuth 2.0 and SAML.
  • Hands-on experience working on microservice-based applications. Experience with AWS services such as RDS, S3, EC2, ECS, Lambda, CloudWatch, CloudFront, CloudFormation, API Gateway, Route 53, SQS, and SNS. Experience in handling messaging services using Apache Kafka, pub/sub architecture, RabbitMQ, and Amazon SNS/SQS. Good experience writing APIs using Go. Good experience with Angular and ReactJS.
  • Good experience with DevOps technologies and Ansible deployment scripts. Good experience in AWS Cloud, Google Cloud, and Azure Cloud (Containers, Blob Storage, Cognito, Azure Repos, Cosmos DB, AKS).
  • Good experience migrating applications from on-premises to the cloud (AWS).
  • Good experience working with Spring Cloud features such as Feign Client, Resilience4j, Zipkin, Sleuth, and Jaeger tracing.
  • Experience with Docker containers, leveraging Linux containers and AMIs to create Docker images/containers, and with Kubernetes, Jenkins, and Terraform for continuous integration and deployment (CI/CD) of microservices. Specialized in Java/J2EE-based application development using Java, JSP, Servlets, JSF, EJB, Struts, Spring, Hibernate, Spring Boot, Spring WebFlux (reactive Java programming), Spring Cloud, and REST APIs. Experience in configuring build tools such as Maven, Gradle, and Ant for development and deployment, with version control management using CVS, Git, Bitbucket, SourceTree, TortoiseSVN, and SVN.
  • Used Apache JMeter for API performance testing, SonarQube for code quality and coverage, and New Relic, AppDynamics, and Dynatrace for performance monitoring.
  • Used Splunk for tracing logs, Kibana (ELK) for monitoring, and LogScale for checking logs. Experience developing unit and integration tests with frameworks such as JUnit, Mockito, TestNG, and PowerMock. Good experience with TDD and BDD frameworks. Good experience handling onshore and offshore teams.
  • Good understanding of the latest Java version features; started a POC to migrate some of the market data applications to JDK 21 and Spring Boot 3.2.0.
  • Highly experienced with relational (Oracle, PostgreSQL, MySQL, H2) and NoSQL (MongoDB, DynamoDB, Cassandra) databases. Hands-on experience with Waterfall, Agile, and SAFe methodologies. Excellent team player, quick learner, and self-starter with effective communication, motivation, and organizational skills, combined with attention to detail and business process improvement.
  • Adopting AI tools such as GitHub Copilot and ChatGPT for code generation.
  • Good experience with GitHub Copilot on client projects and personal applications.

Overview

14 years of professional experience
1 Certification

Work History

Senior Engineer

Tata Consultancy Services
Pittsburgh, USA
03.2024 - Current
  • Involved in SDLC requirements gathering, analysis, design, development, and testing of applications.
  • Provide technical guidance to the development teams.
  • Establish and enforce coding standards, best practices, and design patterns.
  • Application design and implementation of microservices, APIs, and event-based applications using Kafka.
  • Developing the applications using Java, Spring Boot 3.2.0, Data JPA, and Spring Cloud, applying design patterns, and applying SOLID principles during the application design and development phase.
  • Fixing the security violations.
  • Implemented the scheduler jobs using Spring Boot, and scheduling using CA7.
  • Implemented applications using NoSQL MongoDB.
  • Implemented the distributed cache using Redis.
  • Perform peer reviews.
  • Writing unit test cases using JUnit, Mockito, and following the BDD framework.
  • Creating Docker builds and deploying applications in OpenShift.
  • Creating MonPo monitoring for the production applications.
  • Used JUnit, Mockito, and PowerMock framework for unit testing of the application, and implemented TDD and BDD methodologies.
  • Implemented JPA repositories to perform the CRUD operations using Oracle and SQL Server databases.
  • Supporting production applications and collaboration with the stakeholders.
  • Migrating monolithic applications to microservice-based applications and migrating on-premises to OpenShift.
  • Rotational production stats monitoring and fixing the production issues.
  • Writing SQL queries to generate the reports from the database, and database queries to verify the data from the database directly.
  • Used SonarQube and fixed the Sonar violations.
  • Migrated some of the microservice applications using JDK 21, virtual threads, and Spring Boot 3.3.0.
  • Used the synchronous RestClient, introduced in recent Spring Boot versions as an alternative to RestTemplate and WebClient, for API calls; a brief usage sketch follows this entry.
  • Perform load and performance testing before the changes are deployed in production using Apache JMeter.
  • Verifying the production application performance using Dynatrace.

Environment: Java, Spring Boot, Spring WebFlux, Spring Cloud, Microservices, Oracle, SQL Server, MongoDB, GraphQL, Data JPA, Docker, Kubernetes, CI/CD pipeline, LogScale, Redis, OpenShift
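
A minimal sketch of the synchronous RestClient usage mentioned in this entry, assuming Spring Boot 3.2+; the base URL and the MarketQuote record are hypothetical placeholders, not details from the actual project.

```java
import org.springframework.web.client.RestClient;

public class QuoteClientExample {

    // Hypothetical response type used only for this illustration.
    public record MarketQuote(String symbol, double price) {}

    public static void main(String[] args) {
        // Build a synchronous RestClient (available since Spring Framework 6.1 / Spring Boot 3.2).
        RestClient restClient = RestClient.builder()
                .baseUrl("https://example.internal/api")   // placeholder base URL
                .build();

        // Perform a GET call and map the JSON response body onto the record.
        MarketQuote quote = restClient.get()
                .uri("/quotes/{symbol}", "ACME")
                .retrieve()
                .body(MarketQuote.class);

        System.out.println(quote);
    }
}
```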

Senior Engineer

OMV America
New Jersey, USA
04.2023 - 02.2024

Worked on market data applications for the NBC Universal live market data platform and on-air applications.

  • Involved in SDLC requirements gathering, analysis, design, development, and testing of applications.
  • Developing the applications using Java, Spring Boot 3.2.0, Data JPA, and Spring Cloud, applying design patterns, and applying SOLID principles during the application design and development phase.
  • Implemented OneTick queries, RESTful APIs, and GraphQL services.
  • Implemented the scheduler jobs using Spring Boot
  • Implemented some of the quote applications using Go.
  • Implemented Reactive Redis (Lettuce) and a Redis cluster to store market data; a brief configuration sketch follows this entry.
  • Used the Spring Data JPA ORM to map entities and perform the API/application CRUD operations.
  • Used OneTick DB in the market data applications for current and historical market data.
  • Developing and maintaining applications that interact with the OneTick database via APIs.
  • Designing and developing the new applications to handle the stocks.
  • Used the Windows OneTick client to query OTQs.
  • Writing data ingestion pipelines to load data from various sources into OneTick.
  • Good experience in deploying applications on OpenShift.
  • Creating Docker builds and deploying applications into AWS-managed Kubernetes.
  • Creating dashboards using Grafana, and alerting using New Relic.
  • Implemented some of the market data services using NoSQL MongoDB.
  • Used JUnit, Mockito, and PowerMock framework for unit testing of the application, and implemented TDD and BDD methodologies.
  • Writing SQL queries to generate the reports from the database, and database queries to verify the data from the database directly.
  • Used Eclipse and EMMA for code coverage, and Splunk for application log analysis.
  • POC on Java virtual threads to migrate some of the applications into JDK 21 and Spring Boot 3.2.0.
  • Migrated some of the microservice applications using JDK 21, virtual threads, and Spring Boot 3.2.0.
  • Used the synchronous RestClient, introduced in recent Spring Boot versions as an alternative to RestTemplate and WebClient, for API calls.

Environment: Java, Spring Boot, Spring Webflux, Spring Cloud, Microservices, AWS, MySQL, MongoDB, GraphQL, Data JPA, Docker, CI/CD pipeline, Splunk, Redis, Apache Kafka.
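
A minimal sketch of a Reactive Redis (Lettuce) configuration along the lines of the caching described in this entry; the Quote record, bean name, and serializer choices are illustrative assumptions, not the project's actual code.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.ReactiveRedisConnectionFactory;
import org.springframework.data.redis.core.ReactiveRedisTemplate;
import org.springframework.data.redis.serializer.Jackson2JsonRedisSerializer;
import org.springframework.data.redis.serializer.RedisSerializationContext;
import org.springframework.data.redis.serializer.StringRedisSerializer;

@Configuration
public class ReactiveRedisConfig {

    // Hypothetical market data payload used only for this illustration.
    public record Quote(String symbol, double last) {}

    @Bean
    public ReactiveRedisTemplate<String, Quote> quoteTemplate(ReactiveRedisConnectionFactory factory) {
        // Plain string keys, JSON-serialized values.
        Jackson2JsonRedisSerializer<Quote> valueSerializer = new Jackson2JsonRedisSerializer<>(Quote.class);
        RedisSerializationContext<String, Quote> context = RedisSerializationContext
                .<String, Quote>newSerializationContext(new StringRedisSerializer())
                .value(valueSerializer)
                .build();
        return new ReactiveRedisTemplate<>(factory, context);
    }
}
```

With such a bean in place, a call like quoteTemplate.opsForValue().set("quote:ACME", quote) returns a reactive Mono that completes once the value is stored.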

Automation Engineer

Informatica Business Solutions
Hyderabad
09.2021 - 02.2023

CDMA (Cloud Data Management Analytics), CAI (Cloud Application Integration), and API Management (iPaaS)

  • Worked on multiple low-code and no-code platform integrations, such as iPaaS platforms.
  • Involved in SDLC requirements gathering, analysis, design, development, and testing of applications.
  • Developing the applications using Java, Spring Boot, Data JPA, and Spring Cloud, applying design patterns, and applying SOLID principles during the application design and development phase.
  • Implemented the RESTful APIs and GraphQL services.
  • Some of the services were converted from monolithic to microservice architecture using Spring Boot, and followed 12-factor application principles.
  • Used the Spring Data JPA ORM to map entities and perform the API/application CRUD operations.
  • Implemented Spring Boot Kafka Streams to process real-time events for both producer and consumer applications; a minimal topology sketch follows this entry.
  • Created services using Apache Iceberg to store the data in the lake store.
  • Storing the data in GraphDB using Neo4j.
  • Access the servers using the PuTTY tool.
  • Implemented the applications using Redis Cluster to store the different tenants' data.
  • Working on iPaaS application development.
  • Creating Docker builds and deploying applications into AWS-managed Kubernetes.
  • Creating dashboards using Grafana, and alerting using AppDynamics.
  • Implemented distributed tracing using Jaeger tracing for all the microservices.
  • Implemented the services using the NoSQL MongoDB.
  • Used JUnit, Mockito, and PowerMock framework for unit testing of the application, and implemented TDD and BDD methodologies.
  • Writing SQL queries to generate the reports from the database, and database queries to verify the data from the database directly.
  • Performing CRUD operations with AWS DynamoDB.
  • Used AppDynamics for performance monitoring, and ELK for log monitoring and dashboard creation.

Environment: Java, Spring Boot, Spring WebFlux, Spring Cloud, Microservices, Azure Cloud, AWS, PostgreSQL, DynamoDB, GraphQL, Data JPA, Neo4j, Docker, Kubernetes, CI/CD pipeline, ELK, Apache Kafka Streams, Redis, Reactive Kafka, MySQL, and Apache Iceberg.
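
A minimal Kafka Streams topology in the spirit of the producer/consumer event processing described above; the topic names, application id, and filtering logic are illustrative assumptions only.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class EventStreamExample {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-processor");   // hypothetical application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop empty payloads, normalize, and forward downstream.
        builder.stream("raw-events", Consumed.with(Serdes.String(), Serdes.String()))
               .filter((key, value) -> value != null && !value.isBlank())
               .mapValues(String::trim)
               .to("processed-events", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```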

Senior Technical Analyst

Ivy Comptech
Hyderabad
01.2020 - 09.2021

Worked on a betting/gaming integration platform, called BWIN Integration, that processes the live feed from BWIN. We integrated multiple sports through the BWIN feed data.

  • Involved in technical design meetings and designing technical solutions.
  • Working on Java using versions 1.8 and 11, Spring Boot, and Microservices-based application design and development.
  • Created a service to produce and consume feed from BWIN Kafka.
  • Implemented integrations with the AWS S3 SDK.
  • Exposing the services using AWS API Gateway.
  • Launching AWS EC2 instances using CloudFormation templates.
  • Pushing the Docker images to AWS ECR, Deploying applications in AWS ECS.
  • Storing the feed data using in-memory and Redis cache for the live sports.
  • Implemented some of the sports microservices using Redis Lettuce, Reactive Kafka, and Spring WebFlux reactive programming; a minimal controller sketch follows this entry.

Environment: Java, Spring Boot, Spring, Hibernate, AWS ECS, EC2, S3, Redis, Apache Kafka, Spring WebFlux, Redis Lettuce, Reactive Kafka, MongoDB
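
A minimal Spring WebFlux controller sketch in line with the reactive programming mentioned above; the FeedEvent record and the ticking in-memory source are hypothetical stand-ins for the real BWIN feed service.

```java
import java.time.Duration;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
@RequestMapping("/feeds")
public class LiveFeedController {

    // Hypothetical event payload used only for this illustration.
    public record FeedEvent(String sportId, String payload) {}

    // Stream live events for one sport as server-sent events.
    @GetMapping(value = "/{sportId}", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<FeedEvent> streamFeed(@PathVariable String sportId) {
        // In the real service the events would come from Reactive Kafka / Redis;
        // here a ticking placeholder stream stands in for that source.
        return Flux.interval(Duration.ofSeconds(1))
                   .map(i -> new FeedEvent(sportId, "event-" + i));
    }
}
```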

Senior Technical Lead II

OptumGlobal
Hyderabad
08.2019 - 12.2019

Worked on Optum's internal products, such as provider enrollment and claims-related applications.

  • Requirement discussions with the Product Owner, and solution approach discussions with the Solution Architect.
  • Requirement discussions with the team, and guide team members to follow the best coding practices.
  • Working on Sprint Development activities
  • Developing the microservice applications using Apache Kafka Streams and Java Spring Boot, implementing the services using Spring Boot RESTful services.
  • Containerizing the services using Docker and deploying the applications using Kubernetes.
  • Distributed tracing using Zipkin and Sleuth. Testing the services using JUnit, Mockito, and Postman.
  • Alerting the services using Prometheus.
  • Deploying the Kafka Stream services and APIs using Kubernetes deployment.
  • Created dashboards using Angular by calling the backend REST APIs.
  • Used the JUnit, Mockito, and PowerMock frameworks for unit testing of the application, and followed the Test-Driven Development (TDD) methodology; an example test sketch follows this entry.

Environment: Java, Spring Boot, Spring Webflux, Apache Kafka Stream, Spring Cloud Stream, Spring Cloud, gRPC, Protobuf, Angular 8, Hibernate, Data JPA, Azure Cloud, AWS S3, DynamoDB, EC2, ECS, ECR, CloudFront, Google Kubernetes Engine, Azure Cognito, Apache Kafka, Camunda, WinSCP, PuTTY.
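
A small JUnit 5 / Mockito test sketch reflecting the TDD approach mentioned above; Claim, ClaimRepository, and ClaimService are hypothetical types defined inline so the example stays self-contained.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.Optional;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class ClaimServiceTest {

    // Hypothetical domain types, defined inline only for this illustration.
    record Claim(String id, String status) {}

    interface ClaimRepository {
        Optional<Claim> findById(String id);
    }

    static class ClaimService {
        private final ClaimRepository repository;
        ClaimService(ClaimRepository repository) { this.repository = repository; }
        Claim getClaim(String id) { return repository.findById(id).orElseThrow(); }
    }

    @Mock
    private ClaimRepository claimRepository;   // mocked collaborator

    @InjectMocks
    private ClaimService claimService;         // service under test

    @Test
    void returnsClaimStatusWhenClaimExists() {
        // Arrange: stub the repository lookup.
        when(claimRepository.findById("C-100"))
                .thenReturn(Optional.of(new Claim("C-100", "APPROVED")));

        // Act
        Claim result = claimService.getClaim("C-100");

        // Assert
        assertEquals("APPROVED", result.status());
        verify(claimRepository).findById("C-100");
    }
}
```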

Technical Lead

SenecaGlobal IT Services Private
Hyderabad
09.2018 - 08.2019

Cognitivebotics is an innovative application that helps people with autism build different skills, for example audio-video visualization, how to greet others, and how to talk to others.

  • Analysis and designing of the microservice applications using Spring Boot and Spring Cloud.
  • Implemented the content storage service API using AWS S3; a brief upload sketch follows this list.
  • Implemented Zookeeper distributed configuration system for API and gRPC.
  • Deployed the applications in AWS EC2 and AWS ECS (Fargate cluster).
  • Maintain the Docker images in AWS ECR.
  • Implemented the AI service by consuming the AWS and Azure Face Detection API.
  • Implemented Swagger API documentation for REST API endpoints.
  • Implemented a coding best practices checklist and PMD rules.
  • Visited multiple autism schools to understand the project domain.
  • Providing the estimations and billing estimates for cloud deployment.
  • CI/CD using AWS CodePipeline.
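
A brief sketch of an S3 content upload in the spirit of the storage service mentioned above, written against the AWS SDK for Java v2 (the original project may have used a different SDK version); the bucket name, object key, and local file path are placeholders.

```java
import java.nio.file.Paths;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class ContentUploadExample {

    public static void main(String[] args) {
        // The client picks up credentials from the default provider chain.
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {

            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("content-bucket-example")      // placeholder bucket name
                    .key("videos/greeting-lesson.mp4")     // placeholder object key
                    .contentType("video/mp4")
                    .build();

            // Upload a local media file to S3.
            s3.putObject(request, RequestBody.fromFile(Paths.get("greeting-lesson.mp4")));
        }
    }
}
```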

Senior Java Developer

Infinite Computer Solutions
Bangalore
08.2012 - 09.2018

I worked on the MMIS and NLCD development projects.

MMIS is a health care application funded by the state and federal government in the United States.

  • The NLCD project is based on scheduling nonlinear content, referred to as non-linear clips and VOD programs, which are made available on Scripps Networks Interactive websites, business partner websites, and Video On Demand offerings, respectively.
  • The content distribution involves publishing nonlinear videos, closed caption files, metadata, and images to various receivers.
  • Attended daily scrum meetings and weekly backlog grooming sessions, and analyzed business requirements.
  • Getting user stories from the Rally tool.
  • Perform the implementation of user stories.
  • Perform the deployment of NLV applications.
  • Worked with Hibernate, jBPM, the Wicket framework, JSP, Servlets, web services (CXF and REST), and XSLT.

Software Engineer

NIIT Technologies Ltd and Marlabs Software (P) Ltd
Noida
12.2010 - 08.2012

US banks come to SEI as customers of the GWP desktop, which requires SEI to customize the GWP desktop for every new bank. SEI decided to develop an Open Architecture (OA) for the GWP desktop that offers the desktop's core services to customer banks through a collection of web services. In release 12.1 we returned to the OA team: before the code moved to the production environment, we reduced the codebase by removing unused and duplicate methods and by implementing some methods as common across all the web services. The 12.1 release also implemented XML Schema Validation and added relevant method comments; schema validation provides better negative-scenario validation (a minimal validation sketch follows).
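
A minimal JAXP schema-validation sketch in the spirit of the 12.1 release work described above; the schema and request file names are placeholders.

```java
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.xml.sax.SAXException;

public class RequestSchemaValidation {

    public static void main(String[] args) throws Exception {
        // Load the XSD once and reuse the compiled Schema.
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("request.xsd"));          // placeholder schema file

        Validator validator = schema.newValidator();
        try {
            // Throws SAXException if the request does not conform to the schema.
            validator.validate(new StreamSource(new File("request.xml")));   // placeholder request file
            System.out.println("Request is valid.");
        } catch (SAXException e) {
            System.out.println("Request rejected: " + e.getMessage());
        }
    }
}
```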

  • Creating common methods using Java.
  • Deploying an application using the WebLogic server.
  • Unit test the application using SoapUI.
  • Supporting QAs to fix the issues.
  • Performing the CRUD operations using JDBC, and web screen implementation using JSP and Servlets.
  • Reviewing the test scenarios.
  • Getting the input test data using the DB SP script.
  • Daily status reports are submitted to the manager and onsite client.
  • Support the MasterCraftLite tool at the project level for other teams.
  • Environment: Java, Struts, Spring (AOP), MasterCraftLite, Web services - EJB (JAX-WS), and Oracle.

Education

Master of Computer Applications -

JNTU
Hyderabad
01.2010

Bachelor of Science -

Acharya Nagarjuna University
Guntur
01.2007

Skills

Java, J2EE, JSP, Mockito, Servlets, Hibernate, JPA, JSF, Struts, Spring, Spring Boot, Spring WebFlux, Redis, Apache Kafka, Confluent Kafka, Microservices Design Patterns, JUnit, Maven, Ant, Gradle, OpenAPI, Kong API Gateway, Zuul API Gateway, AWS API Gateway, PuTTY, Rally, Postman, RestClient, JIRA, SonarQube, Jenkins, AppDynamics, Dynatrace, Kibana, Swagger 2, Swagger 3, Grafana, Prometheus, Ansible, Linux, Apache Camel, GitHub Copilot, HTML, CSS, XML, XSLT, JavaScript, JSON, JDBC, JMS, Log4j, Microservices, Web Services, Spring Cloud, AWS, GCP, Azure, OpenShift, Apache Tomcat, WebLogic, JBoss, SVN, GitHub, Bitbucket, Oracle, MySQL, PostgreSQL, MongoDB, DynamoDB, Cassandra, GraphQL, OneTick DB, Windows, Mac, UNIX/Linux, Eclipse, IntelliJ, SoapUI

Certification

  • AWS Solution Architect Associate
  • Scrum internal certified

Timeline

Senior Engineer

Tata Consultancy Services
03.2024 - Current

Senior Engineer

OMV America
04.2023 - 02.2024

Automation Engineer

Informatica Business Solutions
09.2021 - 02.2023

Senior Technical Analyst

Ivy Comptech
01.2020 - 09.2021

Senior Technical Lead II

OptumGlobal
08.2019 - 12.2019

Technical Lead

SenecaGlobal IT Services Private
09.2018 - 08.2019

Senior Java Developer

Infinite Computer Solutions
08.2012 - 09.2018

Software Engineer

NIIT Technologies Ltd and Marlabs Software (P) Ltd
12.2010 - 08.2012

Master of Computer Applications -

JNTU

Bachelor of Science -

Acharya Nagarjuna University