ARIVUMANI RAMALINGAM

Summary

Experienced Senior Java Architect and Big Data Engineer with over 20 years of expertise in architecting, designing, and developing advanced technology solutions. Skilled in mentoring teams, creating scalable data architectures, and optimizing Java applications and data pipeline frameworks for enterprise environments.

Overview

24 years of professional experience

Work History

Big Data Consultant / Senior Engineer

GEICO
01.2013 - Current
  • Architected, Designed, and Deployed Data Pipeline Frameworks
    Built and deployed scalable data pipeline frameworks in Azure Kubernetes for moving polyglot data payloads to the Data Lakehouse. Actively utilized in production by all data vertical teams, these pipelines streamline data workflows across GEICO zones. Trained and mentored over 100 developers to adopt and leverage these services effectively.
  • Developed Data Pipelines for Analytical and Data Science Consumption
    Designed and implemented pipelines for transferring polyglot data payloads to ADLS and Snowflake, supporting analytics and data science initiatives. Leveraged tools such as Apache Flink, Airbyte, Spark, Azure Data Factory, Azure Databricks, and DBT, implemented in Java, Python, Scala, SQL, and PL/SQL.
  • Integrated with Data Lakehouse
    Enabled seamless data storage in RAW and Conformed zones of the Data Lakehouse using technologies like Apache Iceberg, Polaris REST Catalog, ADLS Gen2, Hive, Spark, Snowflake, Hudi, Blaze, DBT, Ranger, and related tools.
  • Implemented Data Quality and Balance Control Framework
    Designed an end-to-end framework ensuring data quality and accuracy. Utilized SQL Server PL/SQL, control data frameworks, Guaranteed Delivery API frameworks, DBT SQL engines, and Power BI to monitor and validate data integrity.
  • Automated Deployment Workflows
    Developed deployment automation pipelines using Airflow, Azure DevOps (ADO), and Python libraries, leveraging the REST APIs of the various ingestion tools.
  • Deployed Scalable Solutions in Azure Kubernetes
    Deployed microservices-based operators, executors, and ingestion frameworks in Azure Kubernetes. Focused on system stability, resource optimization, scaling, and business continuity and disaster recovery measures.


Technologies - Apache Flink, Airbyte, Spark, Azure Data Factory, Azure Databricks, DBT, Java, Python, Scala, SQL, PL/SQL, Apache Iceberg, Polaris REST Catalog, ADLS Gen2, Hive, Snowflake, Ranger, Airflow, Azure DevOps (ADO), Azure Kubernetes, Helm, Shell Scripting, Kyuubi, Blaze



Adopted and Implemented Licensed PaaS Data Ingestion Tools ( 2020 - 2022 )
Adopted, customized, and implemented licensed PaaS ingestion tools such as Fivetran, HVR, and Confluent Kafka to seamlessly move data from diverse sources to ADLS Gen2 and Snowflake. Collaborated closely with vendors to align these tools with GEICO's security standards and compliance requirements.


Tools and Technologies - Fivetran, HVR, Confluent Kafka, ADLS Gen2, Snowflake, DBT, Azure DevOps, Java, Python, Scala, Spark, SQL/PLSQL, Databricks, Azure Data Factory, SQL Server, Oracle, Flat Files, and Message Hubs


Architected and Implemented Kafka Connect-Based Ingestion Architecture ( 2018 - 2020 )
Designed and developed a Kafka Connect-based ingestion framework to build robust data pipelines. Collaborated closely with Confluent, Microsoft, and Pivotal developers to enhance pipeline capabilities and ensure seamless integration with enterprise systems.


Tools and Technologies - Kafka Connect, Open Source Kafka, Debezium, JDBC Connectors, ADLS Gen2 Sink, Snowflake Sink, DBT, Databricks, Spark, Java, Python, Scala, Jenkins, Git



Architected and Implemented Big Data Pipelines Using Open-Source Frameworks ( 2015 - 2017 )
Designed, developed, and implemented scalable big data pipelines leveraging open-source ingestion platforms. Utilized technologies such as Spring Boot, Spring Cloud Dataflow, Kubernetes, Apache Kafka, Hive, Hadoop, HBase, Camus Framework, Spark, MapReduce, and Jenkins to streamline data processing and integration.


Tools and Technologies - Spring Boot, Spring Frameworks, Java, Spring Cloud Dataflow, Kubernetes, Apache Kafka, Hive, Hadoop, HBase, Camus Framework, Spark, MapReduce, Jenkins


Enhanced Auto Insurance Coverage Handling for GEICO Online Portal ( 2013 - 2014 )
Developed and maintained functionality to manage auto insurance coverages based on user requests. Utilized Blaze Rules Engine to implement decision-making logic, ensuring accurate and efficient processing of coverage options.


Technologies - Java, Spring, JSF, Spring MVC, Blaze, Jenkins, DB2, Oracle, SQL Server, Sonar

Senior Java Engineer / Architect

Blue Cross and Blue Shield of Florida
03.2008 - 12.2012

Led Java-Based System Development for Health Insurance Coverage Management
Designed and developed a Java-based system to manage health insurance coverages for members and agents. Contributed to the design and creation of a Common Sales Tool, enabling adoption and utilization across multiple health insurance providers.


Technologies - Java, J2EE, Struts, Spring, JSP, JavaScript, Ajax, JQuery, SQL/PLSQL, DB2, JRules, Jenkins

Senior Java Developer

Minnesota Department of Employment and Economic Development
09.2006 - 03.2008

Designed and developed an Unemployment Insurance application portal for the State of Minnesota to streamline the processing of unemployment claims. Collaborated closely with the business team to gather requirements and build a Java-based system to collect, process, and deliver unemployment benefits to Minnesota applicants.


Technologies - Java, Struts, JSP, JavaScript, MySQL

Java / Database Developer

Department of Health, State of Maine
05.2005 - 08.2006

Developed and implemented a data extraction process for water quality XML files, enabling seamless integration with MQ to support critical subsystems such as the Immunization Registry (ImmPact), Laboratory Information Tracking System (LITS+), and Laboratory Management System (StarLIMS), improving data accessibility and workflow efficiency.


Technologies - Java, Oracle, XML, SQL, PL/SQL, MQ Series

Java Developer

GE-Penske Truck Leasing
01.2001 - 04.2005
  • Orchestrated development of Rentalnet truck leasing application
  • Streamlined reservation, contract, and payment processes
  • Enhanced system efficiency through Big Data engineering


Technologies - Java, J2EE, JavaScript, Struts, JSP

Education

Bachelor of Science ( Computer Science ) -

AVC College ( Bharathidasan University )
Mayuram, TN, India
05-1997

Master of Computer Applications -

Annamalai University
Annamalai Nagar, TN, India
05-2010

Skills

  • Java
  • Python
  • Docker
  • Kubernetes
  • Cloud Services (Azure, IBM, Google)
  • Apache Flink
  • Data Warehouse / Lakehouse (Apache Iceberg, Delta Lake, Snowflake, ADLS Gen2, Polaris Catalog, Hudi, Hadoop)
  • Kafka (Kafka Connect, Apache Kafka, KSQL, Kafka Streams)
  • Microservices
  • Fivetran/HVR
  • Automation (DevOps, Airflow, Jenkins)
  • Database Technologies (MySQL, MS SQL Server, Oracle, DB2, SQL, PL/SQL)
  • Spring Frameworks (Spring Boot, Spring MVC, Spring Cloud Dataflow), Struts
  • AngularJS, JSF, MVC, jQuery
  • Spark
  • Scala
  • DBT
  • Hadoop, HBase, Hive
  • JBPM, Blaze, JRules
  • Databricks
  • Monitoring Tools (Dynatrace, Prometheus, Grafana, Splunk)
  • Gradle, Maven, Git

Accomplishments

  • Pioneered five iterations of big data pipelines at GEICO, significantly enhancing data processing speed and decision-making efficiency.
  • Adopted a range of batch and streaming ingestion tools, spanning vendor products and custom open-source development, including Apache Flink, Spark, Kafka Connect, Fivetran/HVR, Spring Cloud Dataflow, and Spring XD.
  • Utilized advanced engineering skills to solve complex data challenges, ensuring robust system performance.
  • Designed and maintained customer web apps for enrollment and coverage management, enhancing user experience.
  • Architected a quality sales platform using Java Enterprise Technologies.
  • Collaborated with multiple vendors to deliver integrated solutions.
  • Influenced project outcomes through innovative problem-solving.
  • Led Java-based system development for unemployment insurance solutions.
  • Streamlined application processing, boosting department productivity.
  • Conducted complex requirement analysis and optimized database queries.
  • Implemented scalable solutions for high-volume applications.
  • Engineered a data extraction process for water quality XML, integrating with MQ to feed subsystems such as the Immunization Registry (ImmPact), Laboratory Information Tracking System (LITS+), and Laboratory Management System (StarLIMS), enhancing data accessibility.
  • Orchestrated development of Rentalnet truck leasing application.
  • Streamlined reservation, contract, and payment processes.
  • Enhanced system efficiency through Big Data engineering.

Timeline

Big Data Consultant / Senior Engineer

GEICO
01.2013 - Current

Senior Java Engineer / Architect

Blue Cross and Blue Shield of Florida
03.2008 - 12.2012

Senior Java Developer

Minnesota Department of Employment and Economic Development
09.2006 - 03.2008

Java / Database Developer

Department of Health, State of Maine
05.2005 - 08.2006

Java Developer

GE-Penske Truck Leasing
01.2001 - 04.2005

Bachelor of Science ( Computer Science ) -

AVC College ( Bharathidasan University )

Master of Computer Applications -

Annamalai University