Nishant Adat

Data Science Specialist
Alpharetta, GA

Summary

Results-driven AI Engineer Lead with over 14 years of experience managing teams to deliver Data and AI products. Specializes in designing and scaling GenAI-powered solutions using modern data architecture, LLMOps practices, and agent-based systems. Proven expertise in leading cross-functional teams, implementing agile methodologies, and delivering high-impact AI/ML products in production. Adept at implementing large-scale generative AI systems, orchestrating data-to-insight workflows, and aligning AI initiatives with enterprise goals.

Overview

10 years of professional experience
7 years of post-secondary education

Work History

Data Science Specialist/ML Architect

UST Global, Contractor for T-Mobile
02.2024 - Current
  • Leading the design and deployment of scalable GenAI and Agentic Mesh systems using OpenAI, AutoGen, and Snowflake, powering telecom analytics and Customer 360 intelligence.
  • Leading an offshore team of data scientists to build and deliver enterprise-grade Generative AI applications.
  • Built production-grade LLM applications on Azure and Databricks, integrating MLflow, Unity Catalog, and ADF for GenAI workflow orchestration.
  • Implemented robust LLMOps and FMOps practices, including prompt versioning, automated evaluation, and continuous feedback for high-performance LLM pipelines.
  • Developed metadata-aware agents for dynamic SQL generation and natural-language insights from structured data using Cosmos DB, Snowflake, and a team of agents built on the OpenAI SDK, incorporating state-of-the-art approaches such as CHASE-SQL and evaluating the SQL agents against the BIRD-SQL benchmark (see the sketch after this list).
  • Championed GenAI and Agentic Ops adoption across engineering teams, establishing governance, scaling best practices, and delivering measurable business impact.
  • Strategic Leader & Mentor in GenAI Adoption: Spearheaded cross-functional GenAI initiatives for telecom giants like T-Mobile, mentored AI engineers on best practices for LLM deployment, and advised leadership on roadmap, tooling, and scalability of generative AI systems in production.
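
For illustration, a minimal sketch of the metadata-aware text-to-SQL pattern described above, using the OpenAI Python SDK. The schema, table names, and model choice are hypothetical, and the full agent orchestration and CHASE-SQL/BIRD-SQL evaluation layers are omitted.

```python
# Minimal sketch: metadata-aware SQL generation with the OpenAI SDK.
# Table names, columns, and the model are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Schema metadata the agent injects into the prompt (hypothetical tables).
SCHEMA = """
Table subscribers(subscriber_id BIGINT, region TEXT, plan TEXT)
Table usage_events(subscriber_id BIGINT, event_ts TIMESTAMP, bytes_used BIGINT)
"""

def question_to_sql(question: str) -> str:
    """Generate a SQL query for `question`, grounded in the schema metadata."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=[
            {"role": "system",
             "content": "You translate questions into SQL. "
                        "Use only these tables:\n" + SCHEMA +
                        "\nReturn a single SQL statement, nothing else."},
            {"role": "user", "content": question},
        ],
        temperature=0,
    )
    return response.choices[0].message.content.strip()

print(question_to_sql("Total bytes used per region last month"))
```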

Data/ML Architect

Gbee.ai
09.2023 - 02.2024
  • Led technical teams of data science specialists to create scalable, enterprise-grade AI/ML solutions that deliver business value.
  • Engineered a scalable anomaly-detection system with Apache Spark and Spark ML libraries, achieving a 20% boost in processing efficiency.
  • Architected a robust ML system that processes broadband usage data from 40 million subscribers in near real time.
  • Extended the system to deliver trend analyses for each subscription across hours, days, weeks, months, and weekends, with segmentation by region, CMTS, and plan to provide nuanced insights for strategic decision-making.
  • Engineered a machine learning system for a multinational order-technology platform, enabling real-time adaptive SLA computation for shipments.
  • Led the inception, development, and execution of an AI Factory for a major telecom, integrating data across diverse lines of business into a single data product.
  • Established a repository of derived behaviors that translated raw data from the integrated data lake into actionable insights, and implemented a scoring system that let the telecom personalize customer experiences by aligning offers and plans with individual behaviors, lifting sales and customer engagement.
  • As a Data Science Architect, spearheaded the data strategy for managing and storing organizational data for anomaly-detection use cases at a major cable service provider.
  • The initiative handled large volumes of data with high velocity, variety, and veracity, requiring robust and scalable data solutions.
  • The solution improved customer relationships and prevented revenue leakage, primarily for broadband customer usage, and was later extended to order-management platforms.
  • Designed and implemented data models, including conceptual, logical, and physical data models, ensuring data integrity and consistency across the organization.
  • Collaborated with stakeholders, including business leaders, data analysts, and IT teams, to understand business objectives and data requirements.
  • Developed and maintained the overall data strategy and architecture roadmap in alignment with the organization's goals and data-driven initiatives.
  • As part of a tech-uplift initiative, redesigned the complete data architecture for a large internet and voice service platform.
  • Designed and created data models that represent the structure and relationships of various data entities and attributes.
  • Developed conceptual, logical, and physical data models for databases and data warehouses.
  • Defined data standards and data naming conventions to ensure consistency and data integrity.
  • Designed and built pipelines covering data ingestion, transformation, model training, scaling, algorithm registry, and model inference over big data, delivering recommendations and predictions that improved business outcomes over time. Initial implementations used Python libraries such as pandas, NumPy, Matplotlib, and Seaborn; the design was later rewired onto the more manageable and effective pipelines provided by the Gbee platform.
  • Designed and developed Machine Learning (ML) workflows/pipelines using scientific analysis and mathematical models to predict and measure outcomes from business data; work involved architecting applications and configuring templates.
  • Deployed an event-driven platform for a large cable service provider to prevent incorrect billing and identify anomalous broadband data usage.
  • Delivered critical business insights by applying statistics, machine learning, and deep learning libraries such as Spark ML to business data.
  • Created data-extraction pipelines from a variety of source channels, including Kafka, HDFS, REST services, and sockets: socket channels read RADIUS-protocol data for a WiFi business, Kafka channels carried real-time broadband usage data for pricing, and HDFS fed report generation from archived data.
  • Ensured operational excellence of data-intensive applications by analyzing, debugging, and correcting data pipeline issues and implementing appropriate data-quality checks.
  • Built reusable frameworks and libraries to ingest data into data lakes with technologies like Apache Spark and Apache NiFi.
  • Calibrated and optimized databases (Oracle, MongoDB, Cassandra) and I/O performance. For sharded MongoDB, set the right read preference to prevent overloading the primary replica and combined proper indexing with data-organization techniques to achieve optimum performance (see the sketch after this list).
  • Planned and coordinated installation and timely upgrades of technologies required by the ML/AI platform.
  • Delivered technical documentation and support guidelines for production deployments and maintenance.
  • Used MongoDB to persist high-volume datasets, perform optimized CRUD operations, and analyze data with aggregation pipelines; applied sharding, replication, and other key MongoDB techniques to optimize the environment.
  • Cloud experience includes AWS S3 for object storage and EC2 instances for installing and maintaining the Kafka cluster.
  • Ensured timely delivery of business reports to decision-makers by designing and implementing ETL and ELT jobs, initially in PySpark connecting to the Hadoop datastore and later migrated to pipelines on the GBee platform, a proprietary framework by Tulasea Inc.
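
For illustration, a minimal sketch of the MongoDB tuning pattern mentioned above, using PyMongo: route analytical reads to secondaries and back the common query shape with a compound index. The connection string, database, and field names are illustrative, and the `$dateTrunc` stage assumes MongoDB 5.0+.

```python
# Minimal sketch: read-preference and index tuning for a replicated MongoDB.
from pymongo import MongoClient, ASCENDING, ReadPreference

client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")  # assumed URI

# secondaryPreferred keeps heavy analytical reads off the primary replica.
db = client.get_database("usage", read_preference=ReadPreference.SECONDARY_PREFERRED)
events = db["broadband_events"]

# Compound index matching the common query shape (subscriber + time range).
events.create_index([("subscriber_id", ASCENDING), ("event_ts", ASCENDING)])

# Aggregation pipeline: daily usage for one subscriber over the indexed fields.
pipeline = [
    {"$match": {"subscriber_id": 42}},
    {"$group": {
        "_id": {"$dateTrunc": {"date": "$event_ts", "unit": "day"}},
        "bytes_used": {"$sum": "$bytes_used"},
    }},
    {"$sort": {"_id": 1}},
]
for row in events.aggregate(pipeline):
    print(row)
```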

SME in Mediation and Transaction Processing

Comcast Cable Services
09.2015 - 08.2020
  • Served as SME for mediation and transaction-processing engines built on Kafka, MongoDB, and Apache NiFi clusters.
  • SME in Apache Kafka for high-volume solutions, with more than five years of hands-on experience.
  • Engineered robust data pipelines that process 600 million daily WiFi usage events and generate usage charges for over 60 million subscribers.
  • The modernized system surpassed legacy limitations and cut usage disputes by 20%.
  • Architected a real-time data analysis solution for the internal legal team, enabling swift detection and response to security threats.
  • This cut the average response time from one day to one minute, ensuring heightened security measures.
  • Modernized the processing of High-Speed Data usage events, handling 60,000 transactions per second from real-time Kafka streams and reducing real-time usage-meter latency from the previous 8 hours to under an hour.
  • Developed Scala applications to obfuscate user details in compliance with the California Consumer Privacy Act (CCPA).
  • The real-time system masked both existing and new data using SHA-256 hashing, safeguarding sensitive information and upholding data-privacy standards (see the sketch after this list).
  • Streamlined ETL processes with advanced optimization techniques, reducing processing time from 8 hours to 1 hour.
  • This optimization enabled faster, more efficient data delivery for critical business operations.
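
For illustration, a minimal sketch of the SHA-256 masking idea described above. The production implementation was in Scala; this Python version uses the standard library, and the record fields and salt are hypothetical.

```python
# Minimal sketch: CCPA-style field masking via salted SHA-256 hashing.
import hashlib

def mask(value: str, salt: str = "per-tenant-salt") -> str:
    """One-way mask: salted SHA-256 digest of the sensitive value."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

record = {"account_id": "123456", "email": "user@example.com", "usage_gb": 42}
PII_FIELDS = {"account_id", "email"}  # illustrative choice of sensitive fields

# Mask only the PII fields; non-sensitive fields pass through unchanged.
masked = {k: mask(v) if k in PII_FIELDS else v for k, v in record.items()}
print(masked)
```

Because the digest is deterministic for a given salt, masked values still join and deduplicate correctly across datasets while remaining irreversible.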

Education

MS - Data Science and Machine Learning

Woolf University
01.2023 - Current

Bachelor of Engineering

Karpagam College of Engineering
01.2006 - 01.2010

Skills

  • Strategic technology alignment
  • Collaborative teamwork across departments
  • Talent development facilitation
  • Experience in designing agentic systems
  • LLMOps and FMOps integration
  • Big data solutions
  • Advanced data modeling
  • Data engineering and data pipeline integration

Professional Title

Data Science Specialist

Timeline

Data Science Specialist/ML Architect

UST Global, Contractor for T-Mobile
02.2024 - Current

Data/ML Architect

Gbee.ai
09.2023 - 02.2024

MS - Data Science and Machine Learning

Woolf University
01.2023 - Current

SME in Mediation and Transaction Processing

Comcast Cable Services
09.2015 - 08.2020

Bachelor of Engineering

Karpagam College of Engineering
01.2006 - 01.2010