DHEERAJ HUNDLANI

Director
Piscataway, NJ

Summary

Experienced Data Architect well-versed in defining requirements, planning solutions and implementing structures at the enterprise level. Analytical problem-solver with a detail-oriented and methodical approach. Prepared to offer 12+ years of related experience to a dynamic new position with room for advancement.

Overview

12 years of professional experience
4 years of post-secondary education
1 certification

Work History

Director

Exusia Inc
Jersey City, NJ
04.2019 - Current
  • Project 1: Data content & modeling: catalog feed metadata for all RBC data processing applications.
  • Organize collected data according to RBC data governance policies, develop a strategy for automated metadata collection, build a workflow for metadata attestation, and identify areas of cost savings in the current application architecture
  • Tools used: Collibra Data Governance Centre, Collibra API, ASG Rochade – Data Intelligence, Rochade As A Service, Dremio, Swagger UI
  • Roles & Responsibilities:
  • Manage client relationships by providing strategic solutions for metadata management
  • Work with data custodians and data owners to gather application metadata
  • Build technical, functional & testing specifications
  • Understand existing data governance policies and find ways to integrate distributed metadata repositories
  • Configure Rochade Metability and Becubic scanners and develop Python scripts to automate data collection and schema extraction
  • Design & develop Python scripts to transform the collected metadata for import into the Collibra data governance management tool
  • Support the decision-making process for choosing a metadata management tool to collect, manage, and report technical metadata
  • Work towards developing Proof of Value implementations of the tools under consideration
  • Project 2: Data Control Tool 3.0: DCT 3.0 is an in-house data ingestion tool built in Ab Initio by the Wells Fargo team
  • The tool ingests data into a Hadoop data lake after sanitizing it against valid data values
  • The tool has features like data movement controls, audit and logging, ingestion of multiple data formats, etc
  • The framework is used by various teams across the enterprise for moving data from one area to another
  • The project uses advanced Ab Initio products like Acquire>It and Testing Framework to build the Data Acquisition Layer by acquiring data from various data domains in the bank
  • Roles & Responsibilities:
  • Add new features required by the various teams using the framework
  • Read data from various sources including the Hadoop Distributed File System (HDFS), Kafka topics, AWS S3 buckets, JMS queues, and MongoDB
  • Work with a team of 2-5 developers to develop various features of the application
  • Align application development with the Agile framework through story creation, story point estimation & project improvement planning
  • Design and develop the backend layer for reporting data to providers & the business team
  • Build CI/CD pipelines for code deployment using Python scripts, Jenkins, and uDeploy
  • Ensure consistent data lineage, data quality, and profiling metrics using data governance methodologies
  • Conduct code reviews and prepare implementation plans for deliverables going live
  • Work with the business team to estimate effort on various features in the application
  • Work with the scrum team to slot stories for the team.

Managing Consultant

Exusia
Memphis, TN
04.2018 - 03.2019
  • Cargo Security 2.0: Cargo Security is an application designed to provide backend and front-end support to fraud and security analytics teams at FedEx
  • The application currently runs in batch, and the 2.0 project is an initiative to make the system as close to real time as possible within the Ab Initio framework
  • The project uses advanced Ab Initio products like Acquire>It, Testing Framework & Control Center to build the Data Acquisition Layer by acquiring shipment, customer, location & network activity data
  • Roles & Responsibilities:
  • Interacted with clients to define and understand long-term goals and strategies.
  • Analyzed problematic areas to provide recommendations and solutions.
  • Developed analysis methodologies and task requirements in tandem with senior management.
  • Worked with a team of 2-5 developers to develop various features of the application
  • Aligned application development with the Agile framework through story creation, story point estimation & project improvement planning
  • Designed and developed the backend layer for reporting data to providers & the business team
  • Conducted code reviews and prepared implementation plans for deliverables going live
  • Worked with the business team to estimate effort on various features in the application
  • Worked with the scrum team to slot stories for the team.

Senior Consultant

Transunion
Chicago, IL
09.2015 - 03.2018
  • Quote Exchange: online application to process the personal information (PI) of customers seeking auto insurance quotes
  • The application used an always-on Ab Initio continuous service to process PI data and enrich it with the customer's credit information and driver history
  • The enriched data is then sent to the auto insurance carriers registered with the application, which respond with their own quotes for the customer
  • Roles & Responsibilities:
  • Worked with a team of 2-5 developers to develop various features of the application
  • Prioritized projects and project tasks depending upon key milestones and deadline dates.
  • Aligned application development with the Agile framework through story creation, story point estimation & project improvement planning
  • Troubleshot issues by understanding each issue, diagnosing the root cause, and devising effective solutions.
  • Designed and developed the backend layer for reporting data to auto insurance providers & the business team
  • Conducted code reviews and prepared implementation plans for deliverables going live
  • Worked with the business team to estimate effort on various features in the application.

Associate - Projects

Cognizant Technology Solutions
Cleveland, OH
08.2012 - 08.2015
  • Master Data Management (MDM) is an IBM-provided framework used by KeyBank to get a 360-degree view of the customer
  • The MDM – ETL project facilitates generation of the XML files that form the input data for the MDM system
  • Data from various data sources is converted into XML format and sent as input to the MDM system
  • The MDM system is based on an Oracle Exadata database
  • The project is developed using the Agile SDLC model, where an incremental approach is used to incorporate new functionality
  • Roles and Responsibilities:
  • Understood business requirements and created high-level designs
  • Made low-level design decisions
  • Reviewed & tested ETL objects created by teammates
  • Distributed tasks among team members
  • Continuously improved the project design to make it easier & faster to deliver quality code

Analyst - Programmer

Syntel Ltd
01.2010 - 01.2012
  • Project 1: Marketing – Customer Focused Re-Invention – Leads is an ETL application for the standard layer of marketing data
  • Data is extracted from the raw layer of the Allcorp Data Warehouse, business transformations are applied, and the results are loaded into the standard layer tables
  • A similar process is then applied to load the presentation layer tables, which form the base for the reporting and analytics environments
  • Roles and Responsibilities:
  • Developed Ab Initio objects (customized graphs, DMLs, XFRs, psets, reusable components & graphs, and Unix scripts) that facilitate the loading of the standard layer tables
  • Performance-tuned generic graphs to speed up processing of large volumes of data
  • Applied various lookup concepts such as dynamic lookups, shared lookups, and range lookup operations
  • Extracted data by connecting to multiple database schemas using a generic graph
  • Designed and analyzed dependency charts among various Unix wrapper scripts to execute graphs simultaneously
  • Coordinated closely with onsite and offshore teams across locations
  • Prepared daily task reports and kept the team in sync with daily developments
  • Performed unit & integration testing of the developed ETL objects and wrote documentation
  • Migrated code from the development EME to the QA EME
  • Project 2: ADW – Producers Standard: a sub-track of the Sales and Marketing Information Strategy (SMIS) program
  • This sub-track covers ETL standard layer processing for the initial population and update of the ADW's (Allcorp Data Warehouse) acquisition of Customer Information Centers' (CIC) Customer Information Professionals' (CIP) person-indicative data from the Allstate Representative Repository (ARR) system of record
  • The project extracted data from ARR's raw layer and, using the given business rules, fed the data to the data warehouse's standard layer
  • Roles and Responsibilities:
  • Analyzed high-level technical documentation and prepared an approach for developing the ETL objects
  • Prepared detailed technical specifications for the Ab Initio graphs to be developed to implement the solution
  • Implemented surrogate key assignment
  • Implemented run control and record count balancing mechanisms for auditing, better support, and analysis
  • Implemented restartability (resuming from the last completed checkpoint) in graphs with heavy loads and in PL/SQL scripts handling large amounts of data
  • Designed, developed, and tested (unit & integration) the ETL objects created during the project
  • Kept the onsite team coordinator informed about daily development progress through detailed status reports.

Education

Bachelor of Engineering (Hons.) - Computer Science

University of Rajasthan
India
07.2005 - 06.2009

Skills

    Data Analytics

Accomplishments

  • Comprehensive understanding of data governance methodologies – data lineage, data quality standards, data stewardship, reference data management, etc
  • Broad understanding of metadata management policies, procedures, and systems used to administer data that describes other data
  • Hands-on with governance tools like Collibra Data Governance Centre and ASG Rochade – Data Intelligence
  • Experience working with Hadoop data lakes, automated data ingestion frameworks, and auditing & change management
  • Good knowledge of building AWS cloud infrastructure and the various services involved, including Amazon S3, EC2, and Lambda functions
  • Deploy application code and analytical models using CI/CD tools and techniques, and provide support for deployed data applications and analytical models
  • Worked with real-time applications, Master Data Management (MDM), and batch data processing applications
  • Experience working with Agile project management methodologies using JIRA & Agile Central
  • Worked with various Ab Initio concepts like continuous graphs, online web services, parallel batch processing, micrographs, database logging, data reconciliation, and many other advanced concepts
  • Mentored associates and peers for collective team growth and solution delivery
  • Extensively worked on advanced Ab Initio ETL products like Acquire>It, Query>It & Metadata Hub, building end-to-end frameworks for data acquisition including BRE and the Data Quality Engine (DQE)
  • Certifications: AWS Certified Cloud Practitioner – Jun 2022
  • Visa status: H-1B visa active until March 2024; Green Card EB-2 priority date 2020

Additional Information

  • Regularly participated in volunteering activities such as teaching school kids and charity runs. Received employee appreciation awards within the first 6 months of joining Cognizant (2013) & Exusia (2016), and an outstanding project effort award at Syntel (2012).

Certification

AWS Cloud Practitioner

Timeline

AWS Cloud Practitioner

08.2022

Director

Exusia Inc
04.2019 - Current

Managing Consultant

Exusia
04.2018 - 03.2019

Senior Consultant

Transunion
09.2015 - 03.2018

Associate - Projects

Cognizant Technology Solutions
08.2012 - 08.2015

Analyst - Programmer

Syntel Ltd
01.2010 - 01.2012

Hons. in Bachelor of Engineering - Computer Science

University of Rajasthan
07.2005 - 06.2009