Eesha Kumar

Summary

With over 18 years of experience in Data and Analytics, I bring deep expertise across Data Engineering, Architecture, and Administration. I design and deliver scalable, end-to-end data solutions that drive business intelligence and analytics. My work spans modern cloud-based platforms—including Data Lakes, Warehouses, and real-time streaming—on AWS and Azure. I’ve led major ETL and ELT data integration efforts across both on-premises and cloud environments, consolidating sources like SAP, databases, and APIs into unified, analytics-ready systems.

I architect high-availability, fault-tolerant, and serverless data platforms optimized for performance and cost. Through right-sizing and efficient designs, I’ve helped reduce cloud spend while enabling seamless scaling. My experience includes full lifecycle data modeling—conceptual, logical, and physical—for relational (SQL Server, Oracle) and NoSQL systems, tailored for analytics and reporting.

Beyond architecture, I lead analytics initiatives using advanced statistics and machine learning to convert data into insights. I evaluate and implement technologies aligned with evolving business needs, ensuring scalable, secure, and high-performing solutions. I also create impactful dashboards and visualizations with Tableau, Power BI, QuickSight, and MicroStrategy to support data-driven decisions.

Overview

18 years of professional experience
3 certifications

Work History

Analytics Solutions Architect

Amazon (AWS)
05.2021 - Current
  • As a member of the World-Wide Sales Organization (WWSO), worked as an Analytics Specialist Solutions Architect with customers ranging from small businesses to large enterprises, helping them adopt AWS analytics services to quickly and efficiently extract valuable insights from their data. This involves working with a broad range of AWS services as well as each customer's technical environment.
  • Led the adoption of cloud-native data solutions, helping enterprise customers transition from monolithic on-premises architectures to distributed, scalable, and flexible AWS Analytics Services.
  • Actively work with existing AWS customers to review their production environments and provide recommendations to optimize for cost and performance.
  • Engage with new customers and competitive scenarios looking to modernize their analytics applications on AWS: evaluate current workloads, design and review cloud-native architectures, rapidly implement proofs of concept centered on customer success criteria (performance, cost savings, scalability, reliability), and present results to executive management teams to support data-driven decisions.
  • Worked with several customers to design solutions spanning multiple analytics services, including AWS Glue, Amazon Athena (Presto), Amazon EMR, data lakes, AWS Lake Formation, Amazon QuickSight, data governance, Amazon DataZone, Amazon DynamoDB, Amazon SageMaker, Amazon Redshift, Amazon OpenSearch Service, Amazon MSK, Amazon Kinesis Data Streams, AWS Lambda, AWS Database Migration Service, and AWS Glue DataBrew, by understanding their business needs and designing customized analytics architectures that address data ingestion, transformation, storage, processing, machine learning, and visualization.
  • Spearheaded cloud data migrations, successfully moving legacy on-premises data warehouses, other cloud database technologies, and applications to modern cloud-native architectures as part of proofs of concept (POCs), using migration tools such as AWS Database Migration Service (DMS) and the AWS Schema Conversion Tool.
  • Developed and implemented data transformation strategies utilizing AWS Glue, Amazon EMR, and Amazon Kinesis for seamless ETL (Extract, Transform, Load) processing and real-time analytics.
  • Utilized AWS CloudFormation templates and Terraform to automate the provisioning, configuration, and management of AWS resources, ensuring scalable, repeatable infrastructure deployment and reducing manual intervention.
  • Diagnosed performance bottlenecks and optimized customer applications using performance-tuning best practices.
  • Implemented product and content recommendation systems combining Amazon OpenSearch Service vector search with Amazon SageMaker and Amazon Bedrock machine learning (AI/ML) models.
  • Developed cost-effective architectures for big data processing using Amazon EMR and AWS Data Pipeline.
  • Conducted data migration assessments and roadmap planning, creating detailed migration strategies, risk assessments, and timelines for clients transitioning from legacy systems to cloud-based platforms.
  • Enable internal teams through presentations and working demos.
  • Document product feature requests (PFRs) with customer influence, and work with product managers and service teams to improve the services and drive better customer adoption.
  • Collaborated on and presented at several multi-customer AWS analytics events.
  • Perform scale testing and provide feedback on new instance types and feature releases.
  • Through tailored immersion days, guided customers in understanding and maximizing the value of our analytics products, ensuring their successful adoption and integration into their business operations.
  • Implemented data pipelines using AWS Lambda, Python, AWS Glue, Amazon Kinesis, AWS Step Functions, AWS Data Pipeline, Apache Airflow, Amazon MSK, and Apache Kafka for real-time data processing and batch ETL workflows.
  • Developed ETL pipelines using Apache Spark to read from and write to Apache Iceberg tables in S3, with seamless integration into Amazon Redshift for high-performance data querying and analytics.
  • Leveraged Redshift Spectrum to directly query Apache Iceberg tables stored in S3, enabling low-latency access to big data without requiring data movement.

Data Architect/Engineer

REAP Tech Solutions (Independent Consulting)
11.2014 - 05.2021
  • Client: US Health Group (Aug 2019 – May 2021)
  • Environment: Azure, Snowflake, WhereScape, SQL Server and Tableau
  • US Health Group was building a data warehouse to support and modernize its reporting and analytics. Existing data systems could not provide the right answers because they operated in isolation, with no corporate-wide view.
  • Architect data warehouse layers with Snowflake cloud database as the target, using the WhereScape ETL tool.
  • Build a data lake to support ad-hoc exploratory queries from actuarial analysts for risk analysis.
  • Evaluate master data products and present architecture-fit recommendations to senior executives.
  • Establish ELT standards and processes, define data warehouse architecture layers, and define rules for data flow and the semantic layer.
  • Develop best practices to take advantage of Snowflake while delivering data for business needs in an agile fashion.
  • Leverage cloud architecture for the modern data warehouse, designing for best performance while keeping cloud service costs low.

Managing Consultant

REAP Tech Solutions (Independent Consulting)
10.2018 - 06.2019
  • Client: Misumi USA
  • Environment: AWS Redshift, AWS S3, Talend, SQL Server and PowerBI
  • A Japanese auto parts manufacturer needed to build a roadmap for its reporting and analytics projects. My role was to engage with the client as BI Solutions Architect, perform requirements analysis, and develop a roadmap for the final solution.
  • Analyze the current architecture and conduct requirements workshops.
  • Provide architecture recommendations for the DW system and get buy-in from corporate.
  • Develop a high-level roadmap and Phase 1 project plan with Talend MDM.
  • Implement the project for the Sales and Marketing analytics domain.
  • Help Misumi develop a data governance plan and work with stakeholders to define the process for 'Golden Copy Customer' records.
  • Mentor analysts and other technical team members and assist the client team in managing business expectations.

Data Architect

REAP Tech Solutions (Independent Consulting)
02.2018 - 08.2018
  • Client: Thomson Reuters
  • Environment: Informatica, SAP, and SQL Server
  • Project Connect was to deliver a central repository to store data from across Tax & Accounting businesses (Government, Professional & Corporate), enabling standardized and automated reporting. Data would be extracted from 10+ heterogeneous data sources (SAP, Unison, Great Plains, and home-grown solutions) and reported across multiple subject areas (Sales, Marketing, Finance, HR, and Product Usage), with KPIs derived from the subject areas for management dashboards. As Data Architect, I was responsible for data architecture deliverables from the IBM implementation team.
  • Review and validate that IBM project deliverables meet TR standards.
  • Review business requirements and analyze source data structure.
  • Review the data model against KPI needs and future enhancements; responsible for the final logical data model design.
  • Interact with business leads to understand business goals and how BI can help them achieve those goals.
  • Work with implementation partner IBM to ensure the data model, ETL design, and testing strategy are aligned to meet the required business outcomes.
  • Lead the offshore COE team on testing strategy and help resolve issues via the IBM team.

Enterprise Architect – BI and Analytics

REAP Tech Solutions (Independent Consulting)
11.2014 - 12.2017
  • Client: MillerCoors
  • Environment: SAP BW, Rev R, Cloudera Hadoop Distribution, SQL Server, HANA, Business Objects reporting tool set, Datastage, SSAS, Data Services (BODS), Informatica.
  • As a Solution Architect for BI/Analytics, I was responsible for leading technical development and the architectural solution blueprint for assigned projects, including implementation of a BW on HANA solution and a data lake project for big data analytics in Agile mode. Responsibilities included providing architecture guidance to project teams in the areas of BI and analytics.
  • Drive proof of concept projects that are aligned to MillerCoors’s strategy.
  • Research and lead development of prototype projects to assess and build new capabilities in the enterprise based on new use-cases.
  • Present projects to business stakeholders to evaluate ROI and funding.
  • Work across resources from different capabilities as needed.
  • Evaluate data archiving platforms and vendor tools (space planning), and assess a Hadoop environment as an option for big data analysis and data archiving.
  • Manage Agile Analytics Solution project to provide rapid development process for data discovery and statistical analytics.
  • Ensuring compliance with established architectural principles, standards, and processes.
  • Design security architecture with dynamic access privileges for accessing content via the enterprise portal and BO.
  • Architecture review and development of data integration into HANA from Teradata, covering both native HANA and BW extractors (SAP Data Services and SQL).
  • Responsible for end-to-end solution design review and implementation of architecture standards.
  • Review Big Data technology stack/tools to select required tools that fit MillerCoors architecture and standards.
  • Develop MillerCoors specific Data Lake architecture to address Data archiving, Data Migration and Agile Analytics.

BI Architect

Commercial Metals Company
08.2012 - 10.2014
  • Environment: SAP BW, SQL Server, HANA, Business Objects Reporting tool set, Datastage, SSAS, Data Services (BODS), Informatica.
  • I designed and led development of a data warehouse that draws data from SAP ERP solutions, Oracle Business Suite, Salesforce, and other legacy applications. The goal was to provide one data layer (Enterprise Data Warehouse and Operational Data Store) that is system agnostic and can be consumed by various tools.
  • I provided technical and project management leadership in design and development activities.
  • Managed development of data marts for Oracle and SQL Server applications and reviewed team deliverables.
  • Self-service BI: developed and proved a design pattern that uses a logical data warehouse for analytical purposes via a broad range of reporting tools (BOBJ, SSRS, Power Pivot, Power View).
  • Managed the migration and rebuilding of various BW processes from 7.3 to 7.4 (BW on HANA), along with HANA modeling.
  • Responsible for meeting team goals and performance evaluation of the team on the above BI projects (7 employees and 12 off-site contract resources).
  • Mentor junior members of team and play the role of BI project SME.
  • Manage offshore vendors and budgets for their resources.
  • Evaluate project risks and keep business stakeholders informed.
  • Decommission the data replication server using the HANA SLT process and SQL scripting.
  • Design and build HANA views.
  • Manage delivery of ETL packages in BODS for integration into the data warehouse.

Specialist Senior

Deloitte
05.2010 - 09.2012
  • Company Overview: Deloitte is an industry leader in strategic management and consulting.
  • Environment: Oracle, SQL server, SAP BW/BI 3.5/7.0, Business Objects, Xcelsius.
  • At Sysco, the SAP implementation covered most SAP modules and a complete overhaul of Sysco's business processes. I managed deployment and testing efforts for BI systems, developing the deployment plan and managing deployment waves and related test efforts as additional Sysco business units joined the SAP systems.
  • Maintenance and support of existing modules and OPCOs.
  • Managing BW testing efforts across all landscapes and keeping systems in sync in the transport path.
  • Supported several upgrade efforts for ECC, CRM, and BO systems.
  • Maintenance and support of BW data model and BO Reports.
  • At Deloitte, ITS developed Consulting Management Reporting System (CMRS) to deliver key management reports and their supporting metrics to enable Consulting Leadership to monitor and evaluate performance.
  • Managed BW data model effort to support reporting initiative during the project and managed all QA responsibilities that included front end testing.
  • Responsible for developing the ETL solution and supporting BW systems (DataSources, DSOs, Cubes, MultiProviders, Queries, Transformations, DTPs, InfoPackages, Process Chains, Aggregates, ABAP routines, etc.).
  • Led a POC project to leverage Business Objects reporting. Created diverse types of reports (Master/Detail, Cross Tab, and Chart) and applied Joins, Filters, and Alerts to restrict and highlight the data.
  • As a Data Architect at Bell Helicopter for BSM (Business System Modernization), designed and documented the project's ETL architecture and standards.
  • Involved in conceptual and logical data modeling of Enovia (Engineering) and Visiprise (Manufacturing) application data into a federated data warehouse. Developed the naming standard and approach for the EIM data dictionary.

BI Developer

Tyson Foods
10.2007 - 04.2010
  • Environment: SAP BW/BI 3.5/7.0, BEx Analyzer, Business Objects, SAP R/3, ABAP, Oracle, and SQL Server EDW, SSIS, SSAS, SSRS, Datastage.
  • Responsibilities: As a BI Developer, I implemented the SAP BW solution for the COPA, MM, PM, HR, and PU modules and integrated data from SAP BI into the enterprise data warehouse in Oracle, with subject-oriented data marts.
  • Involved in all phases of the SAP BI project cycle, from project preparation, blueprinting, and architecture design through development, documentation, integration testing, and go-live.
  • Successfully managed the project to integrate SAP BI data into Legacy Oracle Warehouse. This saved six months in getting data to critical users.
  • Developed complex reports using multiple data providers, Union, Intersection, and Minus operators, and Master/Detail, cross-tab, and chart layouts.
  • Created Transformations and DTPs for SAP 7.0 objects. Used start routines, end routines, and other transformation options to modify data as needed.
  • Designed and implemented a custom solution for 'grouping systems' that included a custom DSO to hold grouping system data and that can be maintained by each business group. This solution reduced the number of requests related to custom grouping and provided a uniform solution across all business units.

Education

MBA

Sam M Walton College of Business
Fayetteville, AR
05.2006

Skills

  • Data warehouse
  • Modern Cloud Data Architecture
  • Data Lake
  • Data Architecture
  • Big Data Analytics
  • Data Security & Governance
  • Machine Learning /Generative AI
  • ETL/ELT Data Integration
  • Gen AI – LLM Analytics
  • Cloud Computing
  • Data Modeling
  • Data Visualization
  • Database Management/Optimization
  • MDM
  • Data Quality Management
  • Near Real Time Analytics
  • Statistical Analysis Algorithms
  • Stakeholder management
  • Cloud architecture design

Technologies


  • AWS cloud: Amazon Redshift, AWS Glue, AWS Lambda, AWS Data Pipeline
  • Amazon SageMaker, Amazon Bedrock, Amazon DynamoDB, AWS DMS, Amazon RDS, Amazon Kinesis
  • Microsoft Azure Data Factory, Microsoft BI, Databricks, Snowflake, DBT, WhereScape, Informatica
  • Apache Airflow, Python, PySpark
  • PowerBI, Tableau, MicroStrategy
  • SQL Server, Oracle, Teradata, SAP BW, SAP HANA
  • Terraform, AWS CDK/CFT

Certifications

  • AWS Certified Solutions Architect – Professional
  • AWS Certified Data Engineer – Associate
  • AWS Certified Data Analytics – Specialty

Accomplishments

  • Delivered multiple customer training workshops and technical blogs as part of AWS modern data strategy adoption.
  • Delivered a DynamoDB-to-Redshift integration tool for real-time data analysis and schema-agnostic integration; onboarded 7 customers using the tool ahead of the AWS feature release for this integration.
  • Created a customizable, customer-facing pricing tool to facilitate discussion of various pricing options.
  • Enhanced customer Amazon Redshift infrastructure evaluations to identify potential issues and deliver targeted recommendations, reducing repeat customer meetings by 30%.
  • Reduced the monthly billing process time for a large entertainment customer by 80% by adopting a modern data lake and real-time data ingestion methods on scalable infrastructure.

Technical Blogs

1) https://aws.amazon.com/blogs/big-data/simplify-data-ingestion-from-amazon-s3-to-amazon-redshift-using-auto-copy/


2) https://aws.amazon.com/blogs/big-data/simplify-analytics-on-amazon-redshift-using-pivot-and-unpivot/


3) https://aws.amazon.com/blogs/big-data/build-a-big-data-lambda-architecture-for-batch-and-real-time-analytics-using-amazon-redshift/

Timeline

Analytics Solutions Architect

Amazon (AWS)
05.2021 - Current

Managing Consultant

REAP Tech Solutions (Independent Consulting)
10.2018 - 06.2019

Data Architect

REAP Tech Solutions (Independent Consulting)
02.2018 - 08.2018

Data Architect/Engineer

REAP Tech Solutions (Independent Consulting)
11.2014 - 05.2021

Enterprise Architect – BI and Analytics

REAP Tech Solutions (Independent Consulting)
11.2014 - 12.2017

BI Architect

Commercial Metals Company
08.2012 - 10.2014

Specialist Senior

Deloitte
05.2010 - 09.2012

BI Developer

Tyson Foods
10.2007 - 04.2010

MBA

Sam M Walton College of Business