
Neeraja Saddi

Novi

Summary

Experienced Data Engineer and Data Warehouse Developer with a strong background in Oracle, Databricks, Apache Spark, Delta Lake, and AWS for building scalable data solutions. Skilled in database programming (SQL, PL/SQL), ETL pipeline development, data modeling, governance, and optimization to drive business insights. Expert in enterprise data migration, big data processing, and automation using Python. Proficient in Agile methodologies and JIRA, collaborating with cross-functional teams to deliver high-impact data solutions.

Overview

21 years of professional experience
1 Certification

Work History

Sr. Data Engineer / Data Warehouse Developer / Oracle Developer

Amerisure Insurance
02.2014 - Current
  • Designed and implemented scalable ETL pipelines using Databricks, Apache Spark, and Delta Lake to optimize data ingestion, transformation, and storage
  • Led the Enterprise Loss-Sensitive data ingestion project onto the Lakehouse, ensuring enhanced reporting, data integrity, security, and compliance
  • Built Delta Live Tables (DLT) pipelines to accelerate ETL development, improve data quality, and enhance analytics team productivity
  • Developed cost-efficient data solutions leveraging Databricks and AWS services (S3, Glue, Lambda, and SNS) to streamline data processing and improve performance
  • Orchestrated Databricks jobs and AWS Glue Workflows, reducing manual effort and enhancing data pipeline reliability
  • Optimized big data processing performance, reducing compute and storage costs while maintaining high availability and reliability
  • Implemented data governance best practices, including data quality validation, schema enforcement, and role-based access control for secure and compliant data handling
  • Collaborated with cross-functional teams, including data analysts and business stakeholders, to deliver actionable insights from large datasets
  • Contributed to Amerisure’s Enterprise Data Migration, transitioning from an on-premises Oracle Data Warehouse to an AWS Data Lake
  • Designed dimensional data models to support complex analytical needs, enabling data-driven decision-making across the organization
  • Designed and implemented scalable data models (conceptual, logical, and physical) to optimize storage, retrieval, and performance in the enterprise data warehouse and Lakehouse
  • Developed a high-performance, governed Data Lake on AWS S3, enabling batch and streaming data transformations
  • Used Databricks Auto Loader and the Salesforce and Guidewire connectors to efficiently hydrate the Data Lake from various sources
  • Implemented a Data Vault model for enterprise applications, improving data consistency and governance and enabling a flexible analytics architecture in the Data Lakehouse
  • Worked with Enterprise Architecture to integrate warehouse models into the Lakehouse, ensuring enhanced scalability and performance
  • Created data mapping and gap analysis documents to facilitate Guidewire Policy Center & Claim Center data integration into Enterprise Oracle Data Warehouse
  • Designed and implemented normalized and denormalized schemas to support both OLTP and OLAP workloads, ensuring efficient data integration and business intelligence reporting
  • Created ER diagrams and data dictionaries, improving transparency and governance across data teams
  • Developed new dimensions and fact tables to enhance the Enterprise Data Warehouse, aligning with evolving business requirements
  • Developed and maintained complex PL/SQL packages, stored procedures, functions, and triggers to efficiently process and load Policy Premium and Claims data into Warehouse Dimensions and Fact tables, supporting Amerisure's analytical and reporting needs
  • Optimized PL/SQL code objects and SQL queries, significantly reducing execution time and enhancing warehouse performance
  • Created complex database Views and Materialized Views to build a semantic layer for enterprise data access, supporting applications such as Sure-Connect, Clara-Analytics, Loss Control, DRS (Deductible Recovery), Actuarial, and Agency Dashboard, improving ease of data access and analytics capabilities
  • Conducted a proof of concept on the PL/SQL unit testing framework using utPLSQL to evaluate its effectiveness and feasibility for automated testing
  • Integrated Underwriter Advantage data into the Enterprise Data Warehouse, supporting risk assessment and underwriting processes using ODI (Oracle Data Integrator) and Tidal workload automation
  • Developed ETL pipelines using SnapLogic iPaaS to ingest Guidewire Datahub data into the AWS Data Lake
  • Developed automated data cleansing processes in Python, utilizing libraries such as Pandas, cx_Oracle, Boto3, and PySpark DataFrames
  • Created an automation framework in Python for data validation checks across the Lakehouse layers (Access, Conformed, and Analytical)
  • Followed Agile methodology, actively participating in sprint planning, daily stand-ups, and retrospectives to ensure smooth project execution
  • Used JIRA to track work items, manage user stories, and collaborate with cross-functional teams for efficient project delivery
  • Environment: Oracle 19c, PL/SQL, SQL, PL/SQL Developer 11.0, Python, PySpark, AWS, Amazon S3, AWS Glue, AWS Lambda, Amazon Athena, ODI, SnapLogic iPaaS, Databricks

Software Developer

ProQuest
11.2010 - 02.2014
  • As part of the database team, wrote Oracle procedures, functions, packages, and triggers on online and manufacturing databases to fulfill business requirements
  • Created and modified APIs for PAM and One Search applications
  • Created batch jobs and schedulers using DBMS_SCHEDULER
  • Created SQL Loader scripts to move the data from external files into the database
  • Generated and modified XML files using Oracle XML DB options and XML Document Object Model (DOM)
  • Created materialized views and compound triggers for better performance
  • Supported the TRACS (title) daily data loads from the legacy application into the Morningstar database using scheduler jobs
  • Worked on day-to-day customer problems and data fixes using SQL scripts
  • Worked on enhancements and development of Morningstar AdminApps using JDeveloper and ADF technology
  • Supported the design and creation of an interface for an internal application using Oracle Application Development Framework (Oracle ADF)
  • Environment: PL/SQL, SQL, Oracle 11g, Toad 9.0, SQL Loader, Oracle JDeveloper 10.1.3.40, ADF 10g, Oracle Application Server 10g, Windows XP

Programmer Analyst

DTE Energy
07.2004 - 01.2008
  • Worked on production support to fix problems and enhance the functionality of the various modules for Customer Service and Billing (CSB)
  • Enhanced existing on-line screens and created new screens as required
  • Designed and developed PL/SQL procedures to enhance the functionality
  • Created and modified UNIX shell scripts and control files for new and existing production jobs, and handled errors that occurred during processing
  • Wrote PL/SQL procedures to read the data from complex tables and to generate flat files
  • Wrote SQL Loader scripts to move the data from external files into the database
  • Performance tuning of several SQL queries and batch modules using EXPLAIN PLAN and TKPROF
  • Modified code and queries to improve performance
  • Modified and created PC (middle tier) files for accurate on-line and database communication
  • Created unit test plans and performed unit testing
  • Documented the changes and new requirements
  • Analyzed and developed several PL/SQL scripts for data fixes/analysis and data setup for testing different business test cases
  • Created new on-line screens for the High Bill Analyzer and integrated them with the Customer Service and Billing application to help resolve high-bill inquiries
  • Created new PC (middle tier) files for on-line and database communication
  • Created new PL/SQL procedures, functions, and packages to calculate and retrieve current and previous bills for customers and compare invoices
  • Interacted with the users and created a user manual
  • Worked with testers to implement various test scenarios
  • Developed Customer Information Security techniques by encrypting and decrypting customer identity data in the Customer Service and Billing application
  • Created PL/SQL script to encrypt the data and load into new columns and drop the old columns
  • Created new PL/SQL function to encrypt and decrypt the RAW type data
  • Modified the procedures to send (HTTP) and receive encrypted data to and from the external server
  • Modified on-line screens for Customer Service and Billing application to mask the personal identification information
  • Wrote new shell scripts to securely FTP files to Experian
  • Created detailed documentation for the changes
  • Environment: Oracle Forms 6i, PL/SQL, SQL, Oracle 9i/10g/11g, Toad 8.0, SQL Loader, Tuxedo 6.0, Rational ClearCase 6/7, Request Tracking System (RTS), Peregrine Service Center 5.1, Windows NT, UNIX & Linux

Education

Master of Computer Applications

Osmania University

Bachelor of Science

Osmania University

Skills

  • Azure
  • AWS
  • DynamoDB
  • SnapLogic iPaaS
  • Databricks
  • Data Lake
  • SQL
  • PL/SQL
  • Python
  • C
  • Agile/Scrum
  • Windows / Linux
  • Oracle
  • SQL Server
  • MS Access

Certification

  • AWS Certified Cloud Practitioner
  • Databricks Accredited Lakehouse Fundamentals
