Experienced Senior Java Architect and Big Data Engineer with over 20 years of expertise in designing, architecting, and developing advanced technology solutions. Skilled in mentoring teams, creating scalable data architectures, and optimizing Java applications and data pipeline frameworks for enterprise environments.
Technologies - Apache Flink, Airbyte, Spark, Azure Data Factory, Azure Databricks, DBT, Java, Python, Scala, SQL, PL/SQL, Apache Iceberg, Polaris REST Catalog, ADLS Gen2, Hive, Snowflake, Ranger, Airflow, Azure DevOps (ADO), Azure Kubernetes, Helm, Shell Scripting, Kyuubi, Blaise
Adopted and Implemented Licensed PaaS Data Ingestion Tools ( 2020 - 2022 )
Developed, customized, and implemented licensed PaaS ingestion tools such as Fivetran, HVR, and Confluent Kafka to seamlessly move data from diverse sources to ADLS Gen2 and Snowflake. Collaborated closely with vendors to align these tools with GEICO's security standards and compliance requirements.
Tools and Technologies - Fivetran, HVR, Confluent Kafka, ADLS Gen2, Snowflake, DBT, Azure DevOps, Java, Python, Scala, Spark, SQL/PLSQL, Databricks, Azure Data Factory, SQL Server, Oracle, Flat Files, and Message Hubs
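Illustrative sketch only: a Kafka producer configured for SASL_SSL, the kind of secured publish path a managed ingestion tool had to satisfy before approval against enterprise security standards. The broker address, credentials, and topic name are placeholders, not actual configuration.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureIngestProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker endpoint and topic name are hypothetical placeholders.
        props.put("bootstrap.servers", "broker.example.com:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // SASL_SSL settings illustrate the security hardening expected of
        // any ingestion path into the enterprise environment.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<api-key>\" password=\"<api-secret>\";");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("policy-changes", "policy-123",
                "{\"event\":\"coverage_updated\"}"));
            producer.flush();
        }
    }
}
```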
Architected and Implemented Kafka Connect-Based Ingestion Architecture ( 2018 - 2020 )
Designed and developed a Kafka Connect-based ingestion framework to build robust data pipelines. Collaborated closely with Confluent, Microsoft, and Pivotal developers to enhance pipeline capabilities and ensure seamless integration with enterprise systems.
Tools and Technologies - Kafka Connect, Open Source Kafka, Debezium, JDBC Connectors, ADLS Gen2 Sink, Snowflake Sink, DBT, Databricks, Spark, Java, Python, Scala, Jenkins, Git
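A minimal sketch of how such a framework can register a source connector through the Kafka Connect REST API, assuming a Debezium SQL Server source. The endpoint, connector name, and database settings are placeholders, and some required Debezium options (e.g., schema history) are omitted for brevity.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Connector name, host, and credentials are illustrative placeholders;
        // exact Debezium config keys vary by connector version.
        String config = """
            {
              "name": "orders-sqlserver-source",
              "config": {
                "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
                "database.hostname": "sql.example.com",
                "database.port": "1433",
                "database.user": "cdc_user",
                "database.password": "<secret>",
                "database.dbname": "orders",
                "database.server.name": "orders-db",
                "table.include.list": "dbo.orders"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://connect.example.com:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```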
Architected and Implemented Big Data Pipelines Using Open-Source Frameworks ( 2015 - 2017 )
Designed, developed, and implemented scalable big data pipelines leveraging open-source ingestion platforms. Utilized technologies such as Spring Boot, Spring Cloud Dataflow, Kubernetes, Apache Kafka, Hive, Hadoop, HBase, Camus Framework, Spark, MapReduce, and Jenkins to streamline data processing and integration.
Tools and Technologies - Spring Boot, Spring Frameworks, Java, Spring Cloud Dataflow, Kubernetes, Apache Kafka, Hive, Hadoop, HBase, Camus Framework, Spark, MapReduce, Jenkins
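A hedged sketch of a pipeline step of this kind, written as a Spring Cloud Stream processor in the current functional style (annotation-based bindings were typical in that era). The application and function names are hypothetical, and topic bindings would come from external configuration.

```java
import java.util.function.Function;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Deployed as a Spring Cloud Data Flow "processor" step, the function below is
// bound to Kafka topics by Spring Cloud Stream; binding names come from
// application.properties, so none are hard-coded here.
@SpringBootApplication
public class EnrichProcessorApplication {

    public static void main(String[] args) {
        SpringApplication.run(EnrichProcessorApplication.class, args);
    }

    // Hypothetical enrichment step: normalize incoming records before the sink.
    @Bean
    public Function<String, String> enrich() {
        return payload -> payload.trim().toUpperCase();
    }
}
```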
Enhanced Auto Insurance Coverage Handling for GEICO Online Portal ( 2013 - 2014 )
Developed and maintained functionality to manage auto insurance coverages based on user requests. Utilized Blaze Rules Engine to implement decision-making logic, ensuring accurate and efficient processing of coverage options.
Technologies - Java, Spring, JSF, Spring MVC, Blaze, Jenkins, DB2, Oracle, SQL Server, Sonar
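For illustration only, a plain-Java version of the kind of coverage decision such rules encode; it does not use the Blaze Advisor API, and the coverage type, limits, and state rule are hypothetical.

```java
import java.math.BigDecimal;

/**
 * Plain-Java illustration of a coverage-eligibility rule a rules engine would
 * encode; all names, limits, and the state-specific cap are hypothetical.
 */
public class CoverageRules {

    public record CoverageRequest(String state, String coverageType, BigDecimal requestedLimit) {}

    public static boolean isEligible(CoverageRequest request) {
        // Example rule: comprehensive coverage limits are capped per state.
        if ("COMPREHENSIVE".equals(request.coverageType())) {
            BigDecimal cap = "NY".equals(request.state())
                    ? new BigDecimal("100000")
                    : new BigDecimal("250000");
            return request.requestedLimit().compareTo(cap) <= 0;
        }
        // Other coverage types pass through to downstream rules (not shown).
        return true;
    }
}
```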
Led Java-Based System Development for Health Insurance Coverage Management
Designed and developed a Java-based system to manage health insurance coverages for members and agents. Contributed to the design and creation of a Common Sales Tool, enabling adoption and utilization across multiple health insurance providers.
Technologies - Java, J2EE, Struts, Spring, JSP, JavaScript, Ajax, JQuery, SQL/PLSQL, DB2, JRules, Jenkins
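A rough sketch of the provider abstraction implied by a shared sales tool: each insurer supplies its own quoting implementation behind a common interface. All type names and fields are hypothetical.

```java
import java.util.List;

/**
 * Sketch of a shared sales tool aggregating quotes across carriers; each
 * participating health insurer implements CoverageProvider once.
 */
public class CommonSalesTool {

    public record PlanQuote(String planId, String carrier, double monthlyPremium) {}

    /** Implemented once per participating insurance provider. */
    public interface CoverageProvider {
        String carrierName();
        List<PlanQuote> quote(String memberZip, int memberAge);
    }

    private final List<CoverageProvider> providers;

    public CommonSalesTool(List<CoverageProvider> providers) {
        this.providers = providers;
    }

    /** Aggregates quotes across all registered carriers for an agent's view. */
    public List<PlanQuote> quoteAll(String memberZip, int memberAge) {
        return providers.stream()
                .flatMap(p -> p.quote(memberZip, memberAge).stream())
                .toList();
    }
}
```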
Designed and developed an Unemployment Insurance application portal for the State of Minnesota to streamline the processing of unemployment claims. Collaborated closely with the business team to gather requirements and build a Java-based system to collect, process, and deliver unemployment benefits to Minnesota applicants
Technologies - Java, Struts, JSP, JavaScript, MySQL
Developed and implemented a data extraction process for water quality XML files, enabling seamless integration with MQ to support critical subsystems such as the Immunization Registry (ImmPact), Laboratory Information Tracking System (LITS+), and Laboratory Management System (StarLIMS), improving data accessibility and workflow efficiency
Technologies - Java, Oracle, XML, SQL, PL/SQL, MQ Series
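A simplified sketch of that extraction flow, assuming JMS 2.0 against an IBM MQ connection factory configured elsewhere; the XML element and queue names are placeholders.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.File;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class WaterQualityExtractor {

    // The ConnectionFactory would be an IBM MQ implementation configured
    // elsewhere (host, channel, queue manager); that setup is omitted here.
    private final ConnectionFactory connectionFactory;

    public WaterQualityExtractor(ConnectionFactory connectionFactory) {
        this.connectionFactory = connectionFactory;
    }

    // Element and queue names are hypothetical placeholders.
    public void extractAndSend(File xmlFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(xmlFile);
        NodeList samples = doc.getElementsByTagName("Sample");

        try (Connection connection = connectionFactory.createConnection()) {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer =
                    session.createProducer(session.createQueue("LAB.RESULTS.QUEUE"));
            for (int i = 0; i < samples.getLength(); i++) {
                // Forward each sample record as a text message for the
                // downstream registry and lab systems to consume.
                producer.send(session.createTextMessage(samples.item(i).getTextContent()));
            }
        }
    }
}
```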
Technologies - Java, J2EE, JavaScript, Struts, JSP