NUSRATH MOHAMMED

Bentonville, AR, USA

Summary

Staff Data Engineer at Walmart with deep expertise in Hadoop and data pipeline development. Led initiatives to build unified data models, markedly boosting analytics capabilities. Demonstrated strong project management and client relationship skills, contributing to increased reporting efficiency and comprehensive data governance practices.

Overview

20 years of professional experience
1 Certification

Work History

Staff Data Engineer and Senior Data Engineer

Walmart
Bentonville, AR
05.2020 - Current
  • Instrumental lead on the Intelligent Reporting team, which delivers Enterprise, Walmart, Sam's, and Ecommerce reports.
  • Built end-to-end data pipelines that read information from GBQ and OneStream and insert it into GBQ for analysis.
  • Designed and built a single-source-of-truth unified data model for each of Sam's, Walmart, and Ecommerce, powering the semantic models for their respective reports.
  • Developed data solutions across multiple environments, including GCP, Azure Synapse, Azure SQL Server, and Hive, to power the analytics and tools used by the Finance organization.
  • Led the end-to-end process of creating consumption models for the Sam's Daily Sales Tracker, Sam's Weekly Trade Meeting, Sam's Merch P&L, Walmart Weekly Trade Meeting, and Walmart Ops P&L reports.
  • Built the data models for the Sam's P&L, Sam's Membership, Sam's ICx, and Sam's H&W semantic models.
  • Structured the data models so they can be consumed by Gen AI to generate insights.

Senior Hadoop Admin

Walmart
Bentonville, AR
01.2016 - 04.2020
  • Instrumental member of the Admin team managing 20 Hortonworks Hadoop clusters (12 development and 8 production, totaling 3,460 nodes and 62.4 PB) for the Walmart Information Systems Division.
  • Regularly upgraded multiple clusters from Pivotal 3.0 to HDP 2.4, HDP 2.4.2, HDP 2.5.3, and HDP 2.6.3.
  • Used Ambari to monitor clusters and wrote scripts against the ambari-agent log that send an alert when the status of the cluster changes.
  • Automated scripts that add users to Kerberos for authentication.
  • Trained users on writing relevant Hive queries and using Spark functionality where applicable.
  • Installed the WANdisco tool, used for disaster recovery (DR) by replicating Hadoop clusters.
  • Built five secure clusters and seven non-secure clusters in GCP; enabled the Knox gateway and SSL for all 12 clusters.
  • Utilized Ranger and KMS for secure clusters.
  • Built and distributed a monitoring tool using Python modules that gathers and analyzes metrics from REST API calls, writes the data to a Cassandra-backed Graphite database, and uses Grafana to display metrics on a dashboard.
  • Developed the Queue Metrics dashboard currently used by management to determine how teams consume resources in the data lake cluster.

Technical Lead and Scrum Master

Walmart
Bentonville, AR
10.2014 - 01.2016
  • Gathered and analyzed requirements to develop project plans that adequately address technical specifications.
  • Created workflows using Oozie and coordinated time-based and touch-file-based jobs.
  • Developed a custom MapReduce job that loaded data from HDFS to Oracle and performed data cleaning and transformation.
  • Designed Hive queries and Pig scripts to analyze large datasets, and a custom MapReduce job to load data into Cassandra for generating reports for each campaign.
  • Wrote data management queries for Experian Datasets.

IT Systems Analyst III - Hadoop

Sprint
Overland Park, KS
10.2013 - 10.2014
  • Collaborated with the Analytics COE team to enforce data governance practices within the Enterprise Analytics Platform (EAP) environment.
  • Fielded ad-hoc requests for reporting and data analysis, and assessed business and IT projects for inclusion in EAP procedures.
  • Developed POCs for various business accounts and implemented incoming solutions.
  • Utilized Hadoop to execute projects for churn analysis; analyzed customer calls, texts, and data usage while driving and sent safe-driving recommendations; and analyzed customer over-the-web usage to validate revenue.
  • Imported and exported data between Teradata and HDFS using Sqoop, and created decision trees using Alteryx's predictive module for customer churn analysis.
  • Revamped a Hive job that originally required 24 hours to process only text messages and data transactions; the improved job completes calls, text messages, and data processing in 30 minutes.

Application Developer III

Sprint
Overland Park, KS
03.2006 - 09.2014
  • Provided ongoing support and technical expertise on the EAI team tasked with designing and developing business logic modules that interface with the various front-end and back-end systems integrated with Sprint, supporting its nationwide wireless network.
  • Major project engagements included the Unified Billing Project and the Wireless Resellers conversion.
  • Gathered and analyzed requirements, identified and mitigated risks, performed system and technical design reviews, and implemented improvements to streamline processes and increase productivity.
  • Participated in code reviews for fellow developers and ensured compliance with coding and design standards.
  • Connected to the Nextel database by developing an XML-over-MQ connection framework.
  • Designed and wrote the core logic for the cell phone and text message usage summary referenced by five different front ends, including the Sprint Web UI.

Education

MBA - Information Systems, Corporate Finance and Personal Finance

University of Kansas
Lawrence, KS, USA

Bachelor of Engineering (B.E) - Computer Science

Osmania University
Hyderabad, AP, INDIA

Skills

  • Full life cycle project management
  • Systems implementation
  • IT storage solutions
  • Client relationship management
  • Software development life cycle
  • Web development and design
  • Software testing and quality assurance
  • Backup and recovery strategies
  • Problem analysis and resolution
  • User training and support
  • Hadoop ecosystem: HDFS, MapReduce, Hive, Pig, Spark, HBase, Cassandra, ZooKeeper, Flume, Sqoop, Python, Oozie
  • RDBMS: Oracle, SQL Server 7.0, DB2, SQL*Loader, IMS DB/DC, MySQL
  • Programming: C, C++, Java, XML, XSL, Visual Basic, COBOL, SQL, PL/SQL, Tuxedo, UNIX
  • Web technologies: Servlets, JDBC, JSP, Applets, Swing, JUnit, Struts, Spring, SOAP, Web Services, MQ
  • Scripting languages: UNIX Shell script, JavaScript, DHTML, HTML
  • IDEs: Eclipse, IntelliJ IDEA, RAD, Visual Studio 6.0
  • Data modeling
  • SQL query optimization
  • Data governance
  • ETL processes
  • Data pipeline development

Certification

Six Sigma Green Belt Certification, Johnson County Community College (JCCC), Overland Park, KS

Professional Development

Dale Carnegie Training
