Naveen Mucharla

Aldie, VA

Summary

Dynamic IT data specialist with 15+ years of experience delivering scalable, secure, and high-performing data solutions across RDBMS and Big Data ecosystems. Proficient in data architecture, data modeling, and data engineering on cloud platforms (GCP, AWS) and on-premises systems. Skilled in aligning data strategies with business goals, driving end-to-end digital transformation, and leading global teams to build impactful data products. Expert in Big Data technologies, ETL processes, and visualization tools, with industry experience across the Retail, Financial, Insurance, and Healthcare domains. Adept at optimizing cloud performance, enhancing data governance, and enabling actionable insights through analytics.

Overview

15 years of professional experience

Work History

Lead Data Architect

Walmart
04.2019 - Current
  • Business Interface & Requirements Gathering: Collaborated with Business/Product teams to gather, document, and translate business problems into technical solutions adhering to Walmart standards.
  • Technical Solutions & Migration: Led end-to-end data migration from Hive to GCP, including configuring Google Cloud Storage (GCS) buckets and developing BigQuery schemas. Designed scalable data architectures for enterprise data warehousing.
  • Team Leadership & Documentation: Managed a team of 8 engineers, designed workflows, standards, and documented metadata, data flows, and migration plans in Confluence.
  • Data Architecture & Security: Designed enterprise data models, implemented data security measures (Ranger, access controls), and managed multi-platform data warehouses (Teradata, Hadoop, GCP).
  • Data Processing & Performance Optimization: Ingested and processed complex data sets (Kafka, Flat files, RDBMS), created optimized partitioning/bucketing strategies for Hive schemas, and reduced cloud storage costs.
  • Visualization & Reporting: Developed Power BI, Tableau, and Looker dashboards to highlight critical KPIs, improve decision-making, and support strategic planning.
  • Collaboration & Support: Partnered with engineering, network, and program teams to troubleshoot production issues, improve data governance, and validate business rules to raise data quality (DQ) scores.
  • Innovation & Cost Efficiency: Architected data models for Customer & Marketing domains, implemented code enhancements, and streamlined workflows to reduce cloud spending and improve data reliability.
  • Planned migration strategies for transitioning legacy systems to modernized architectures without compromising operational continuity or end-user experience during critical transformation phases.
  • Increased customer satisfaction by developing targeted marketing campaigns informed by analysis of user behavior patterns in large datasets.
  • Enhanced data processing efficiency by implementing optimized Big Data architecture solutions.
  • Evaluated emerging industry trends and tools to keep Big Data practices aligned with current methodologies and maintain competitive advantage.

Big Data Architect

CareFirst BCBS, FEPOC
02.2017 - 04.2019
  • Requirements Gathering & Analysis: Collaborated with BCBS Plans, Actuaries, Product Owners, and SMEs to capture On-Demand Reporting requirements and analyze legacy reports, translating them into database models for the Enterprise Data Hub (EDH).
  • Data Architecture & Flow Design: Designed end-to-end data architecture, integrating Mainframe and DB2 source data into EDH using Raw, Trusted, and Publish zones for both Batch and Near Real-Time (NRT) reporting.
  • Data Modeling & Optimization: Created Hive and Kudu structures for optimized performance, leveraging de-normalized techniques, aggregated structures, and Sqoop/Impala for seamless querying and data analysis.
  • Consumption & Reporting Models: Built consumption layers in data marts for reporting and file generation using ETL tools (Ab Initio, MicroStrategy) and supported flattened and reference schemas for Actuaries' data mart.
  • Process Automation & Data Quality: Automated MDM workflows to reduce manual tasks while designing data flows to address data quality and lineage issues, migrating enrollment reports into EDP.
  • Documentation & Standards: Authored data modeling standards, naming conventions, data architecture documents, and table-level technical designs for both Dimensional and Big Data methodologies.
  • Stakeholder Collaboration: Worked closely with BCBS Plans to communicate requirements, address data constraints, and resolve downstream application dependencies on Data Lake Trusted Zone models.
  • Healthcare Analytics Expertise: Extensive experience with healthcare data (Claims, Enrollment, Benefits), Legacy COBOL modules, and providing innovative, simplified Big Data solutions for performance improvement.
  • Contributed to the development of long-term strategic roadmaps for the evolution of big data solutions within the organization, aligning business objectives with technical capabilities.
  • Streamlined database management processes for improved performance and resource utilization.
  • Spearheaded cross-functional collaboration, driving successful implementation of strategic big data initiatives across departments.

Enterprise Data Architect

Fannie Mae
12.2014 - 02.2017
  • Requirements Gathering & Analysis: Collaborated with business partners, product owners, and analysts to gather and translate requirements into conceptual, logical, and physical data models. Validated alignment with the Enterprise Logical Data Model (ELDM).
  • Data Modeling & Design: Designed OLTP and OLAP models, including Loan Sourcing Data Mart (LSDM) using Dimensional Modeling (Star/Snowflake Schemas) and Kimball’s Conformed Dimensions for seamless data integration and reporting.
  • Data Governance & Standards: Ensured data governance by maintaining centralized models, metadata repositories, and adhering to Fannie Mae’s naming standards. Tagged attributes for security classifications (NPI, critical data) in alignment with Security Architecture.
  • ODS and Data Mart Development: Created, reviewed, and maintained relational (ODS) and dimensional (Data Mart) models for Delivery and Acquisition systems, collaborating with ETL and reporting teams to meet business and reporting needs.
  • Documentation & Collaboration: Authored source-to-target mapping, design documents, data dictionaries, and metadata records. Conducted model reviews with stakeholders, ensuring transparency and alignment.
  • Cloud Migration Support: Contributed to AWS cloud migration projects, including creating S3 buckets and validating high-level design changes. Improved performance using indexing and optimization techniques.
  • Legacy & Reverse Engineering: Worked on legacy models by reverse engineering databases using ER/Studio, addressing data structure anomalies, and standardizing object naming conventions.
  • Project Management & Agile Delivery: Supported multiple projects in an agile environment, identified risks, and coordinated with stakeholders to maintain project scope, timelines, and budgets while improving application performance.
  • Adhered to industry best practices for modeling standards such as UML notation, ERD diagrams, or IDEF1X methodology during development phases.

Senior Data Modeler

Allianz Life Insurance Company of North America
09.2011 - 11.2014
  • Facilitated collaborative sessions with business users and analysts to define, refine, and document reporting requirements, ensuring alignment with organizational goals. Actively identified and mapped source data from systems like Oracle Apps and Salesforce to meet reporting needs.
  • Spearheaded logical and physical data modeling efforts for various projects, implementing robust database designs and leveraging Star & Snowflake schemas for the creation of reliable and scalable Data Warehouse solutions.
  • Designed and maintained relational models (3NF) for the Enterprise ODS and dimensional models in Data Warehouses, enabling end-users to derive actionable insights through dimensional reporting. Integrated master data across platforms by developing and maintaining effective MDM workflows.
  • Utilized tools like Informatica Analyst and Informatica Data Quality to profile and analyze source data, ensuring clean, conforming, and transformed data meets the reporting requirements within defined project timelines.
  • Delivered efficient Data Marts using Dimensional Modeling techniques (Star & Snowflake schema) and incorporated Kimball’s Conformed Dimensions and Enterprise Bus Matrix to ensure drill-through capabilities across multiple subject areas.

Data Analyst

Deloitte
02.2010 - 09.2011
  • Collaborated with SMEs, Business Analysts, and Architects to gather and analyze requirements, while defining ETL transformation logic and data migration rules for OLTP, ODS, and Data Warehouse systems.
  • Enhanced logical and physical data models, created source-to-target mappings, and performed data profiling to ensure data quality and alignment with business requirements.
  • Produced monthly reports using advanced Excel spreadsheet functions.

Education

Master of Science - Computer Science

Herguan University
Sunnyvale
12-2009

Bachelor of Science - Electrical, Electronics And Communications Engineering

Jawaharlal Nehru Technological University
Hyderabad, India
05-2007

Skills

    Data Modeling

  • ER Studio 9.5.1/7.5/7.1.1, PowerDesigner, Erwin 9.8/8.2/7.3/4.5, Dimensional Modeling, Relational Modeling, Star/Snowflake Modeling, Data Marts, NoSQL, Facts & Dimensions, and Big Data flattened structures

    Big Data Technologies

  • HDFS, Hive, Sqoop, Flume, Kafka, Kudu, Spark, Cloudera CDH 5.x, Ranger, Tez & MapReduce, Cassandra, MongoDB, Elasticsearch, HBase, YARN

    ETL Tools

  • Informatica PowerCenter 9.1/8.1/7.x, SSIS, Informatica IDQ Analyst 9.5.1, Ab Initio, Metadata Hub, AutoSys, Automic, Control-M, Airflow, Dataproc

    Databases

  • Oracle 11g/10g, MS SQL Server, Redshift, Teradata 12.1/13, Netezza 7.0.2, DB2, Mainframes, Hive, GCP Cloud SQL & AWS RDS, BigLake

    Query Tools

  • TOAD, PL/SQL Developer, SQL*Plus, Aginity Workbench, SQL Server, Teradata SQL Assistant, CLI, HUE, DBeaver, Hive, Spark, Presto

    Reporting Tools

  • Power BI, MicroStrategy, Business Objects, Cognos 8.0, Tableau, Looker

    Front End Tools

  • MS Visio, MS Office Suite, Rational RequisitePro, Documentum, Caliber RM, DOORS, SharePoint, AGM, and Rally (Agile project tool)

    Languages

  • HiveQL, SQL, Python, PL/SQL, XML, HTML, JSON

    Cloud Platforms

  • Google Cloud Platform (GCP), Azure, AWS EC2 & EMR, Google Compute Engine (GCE), Google BigQuery, Apache Hudi

    Other Tools

  • GitHub, IntelliJ IDEA, MITI, Collibra, JIRA, Web Services (REST) API, JDBC, ODBC

LinkedIn

www.linkedin.com/in/mucharla-naveen-1393911b
