Summary
Overview
Skills
Accomplishments
Timeline
Work History
Education
Certification
Languages
Hi, I’m Debasish Ghosh

Jersey City, NJ

Summary

Strategic and results-driven Data & AI Solution Director with over 18 years of experience in designing and implementing enterprise-grade data and AI solutions that drive significant business transformation. Expertise includes Enterprise Data Architecture, LLM-powered AI, and cloud-native platforms, with a strong focus on addressing real-world business challenges across the Banking, Finance, Insurance, and Utility sectors. Proven track record in architecting and delivering business-centric data ecosystems that facilitate impactful use cases such as customer intelligence, fraud detection, risk analytics, intelligent document processing, and multilingual conversational AI. Recognized as a trusted advisor to business and technology leaders, skilled in aligning data strategy with enterprise objectives while enabling Generative AI adoption, data governance, and large-scale application modernization.

Overview

18 years of professional experience
5 Certifications

Skills

  • Presales solution architecture
  • Data integration framework design
  • Cloud network management
  • Snowflake Cortex AI/ML
  • Cloud storage solutions: Azure Blob Storage, AWS S3
  • Document intelligence solutions - Azure AI Document Intelligence
  • Data warehouse - Snowflake
  • Custom ETL solutions leveraging cloud technologies
  • Multilingual conversational AI and Q&A
  • Power Platform - Power Automate
  • n8n, CrewAI, custom metrics tools
  • System Design

Accomplishments

  • Led technical discovery and solution design for enterprise AI initiatives, resulting in over $25M in closed deals across financial services, healthcare, and retail sectors.
  • Architected scalable AI platforms integrating traditional ML and generative AI models, tailored to client-specific use cases such as fraud detection, personalized marketing, and document summarization.
  • Designed and presented high-impact demos and POCs using tools like Azure ML, Databricks, and Hugging Face, accelerating client buy-in and shortening sales cycles by 30%.
  • Collaborated with cross-functional teams (sales, product, engineering) to translate business requirements into technical blueprints, ensuring alignment with client goals and compliance standards.
  • Enabled multi-cloud deployments (AWS, Azure, GCP) with secure data pipelines and federated governance, supporting decentralized AI workloads across 5+ global regions.
  • Advised C-level stakeholders on AI strategy, ROI modeling, and implementation roadmaps, positioning the company as a trusted innovation partner.
  • Contributed to RFP responses and technical proposals, increasing win rates by 40% through differentiated solutioning and clear value articulation.

Timeline

Solution Director - Presales Architect

HCLTech (USA)
05.2024 - Current

Business Data Architect

TCS (USA)
02.2017 - 04.2024

Data Architect

TCS (INDIA)
04.2014 - 01.2017

Technical Lead

Tech Mahindra, INDIA
01.2013 - 01.2014

Tech Lead/Developer

Cognizant, INDIA
01.2011 - 01.2013

Lead Engineer

HCL Technologies, INDIA
01.2007 - 01.2011

West Bengal University of Technology

Bachelor of Technology in Computer Science & Engineering

Work History

HCLTech (USA)

Solution Director - Presales Architect
05.2024 - Current

Job overview

  • Architected scalable, secure data platforms and governance frameworks across AWS, Azure, and Snowflake to support enterprise-wide analytics and AI initiatives.
  • Defined business-centric data architectures to support key initiatives, such as customer 360, sales performance optimization, and operational efficiency across cloud platforms.
  • Enabled cross-cloud interoperability using Apache Iceberg and Delta Sharing for seamless data exchange between Snowflake, Databricks, and the AWS platform.
  • Led cost optimization initiatives by implementing storage tiering, query pruning, and workload-aware compute provisioning across cloud data warehouses.
  • Collaborated with business stakeholders to translate analytical needs into scalable data products, driving measurable impact in marketing, finance, and operations.
  • Collaborated with business stakeholders to translate analytical and reporting needs into scalable data models and pipelines, enabling real-time insights and self-service BI capabilities.
  • Led cloud modernization of legacy systems using AWS and Snowflake, improving performance and cost efficiency.
  • Designed data security architectures to redact PII/SII and protect PCI data using encryption, IAM, and cloud-native controls.
  • Implemented data governance and lineage tracking across multi-cloud environments to ensure compliance and data quality.
  • Delivered open-source-based GenAI/LLM solutions with integrated MLOps pipelines for monitoring and deployment.
  • Embedded LLMs (OpenAI, Azure OpenAI) into enterprise workflows for intelligent automation and contextual Q&A.
  • Designed AI-driven solutions for customer analytics, fraud detection, and demand forecasting, aligned with business goals.
  • Built and deployed end-to-end ML pipelines using MLflow, Airflow, and Docker, with model monitoring and drift detection.
  • Optimized query performance and cost using partitioning, clustering, and materialized views.
  • Implemented secure data access and governance policies in BigQuery to ensure compliance with PII and PCI standards.

TCS (USA)

Business Data Architect
02.2017 - 04.2024

Job overview

  • Executed solutions on a large-scale AI and data modernization program across nine business units, aligning LLM initiatives with enterprise goals and driving measurable business outcomes.
  • Led strategic planning to align AI/ML, data, and IT strategies with business objectives, enabling cost optimization, digital transformation, and improved decision-making.
  • Partnered with business leaders to identify high-impact use cases, such as customer churn prediction, intelligent document processing, and supply chain optimization.
  • Designed low-level architectures for complex system integrations, enabling seamless data flow across upstream and downstream applications and hybrid cloud environments.
  • Delivered a cloud transformation program that reduced infrastructure costs by 27–30% while integrating GenAI capabilities into legacy systems for enhanced automation.
  • Architected enterprise microservices and API-based systems for scalable LLM deployment, using loosely coupled design patterns and event-driven architectures.
  • Built reusable ETL frameworks using AWS Glue, Step Functions, and PySpark with model-controller design patterns; deployed via CloudFormation and Jenkins.
  • Implemented serverless GenAI pipelines using AWS Lambda, Python, and event-driven triggers, achieving approximately 30% infrastructure savings and a faster time-to-market.
  • Engineered data platforms using Snowflake, PySpark, AWS Glue, S3, Lambda, and Step Functions to support scalable LLM data pipelines and real-time analytics.
  • Developed GenAI prototypes using LangChain and Python, and deployed ML models via Amazon SageMaker for use cases such as anomaly detection, semantic search, and intelligent assistants.
  • Collaborated with business and technical directors to prioritize AI investments, improving project success rates by 20% and reducing delivery timelines by approximately 15%.

TCS (INDIA)

Data Architect
04.2014 - 01.2017

Job overview

  • Directed foundational data and AI/ML engineering and model lifecycle orchestration, including experimentation, deployment, monitoring, and governance for scalable enterprise use cases.
  • Engineered data pipelines to support business-critical use cases, such as customer segmentation, risk scoring, and operational forecasting.
  • Extracted, transformed, and cleaned large datasets from diverse sources (SQL, APIs, CSVs) to enable analytics and model development.
  • Designed data models and schemas aligned with business KPIs, enabling efficient reporting and predictive analytics across departments.
  • Identified patterns, trends, and anomalies using statistical summaries and visualizations (e.g., Seaborn, Matplotlib) to support data-driven decision-making.
  • Performed feature engineering to enhance model accuracy and interpretability, improving outcomes in classification and regression tasks.
  • Built and tuned predictive models using machine learning algorithms (e.g., regression, classification, clustering).
  • Assessed model performance using cross-validation and metrics such as accuracy, precision, recall, and AUC to ensure reliability in production.
  • Deployed models via Flask APIs and cloud platforms (AWS, Azure), integrating them into business applications and monitoring performance in real time.

Tech Mahindra, INDIA

Technical Lead
01.2013 - 01.2014

Job overview

  • Delivered hands-on Python and .NET-based Azure and Microsoft SharePoint solutions.
  • Azure Cloud Architecting: Designed and deployed scalable, secure solutions using Azure services (App Services, Functions, Logic Apps, Key Vault, Azure DevOps).
  • SharePoint Engineering: Delivered custom SharePoint Online/On-Prem solutions, including SPFx web parts, Power Automate workflows, and integration with Microsoft Graph.
  • Solution Design & Architecture: Defined technical architecture, data flow, and integration patterns for cloud-native and hybrid enterprise applications.
  • DevOps & CI/CD: Implemented CI/CD pipelines using Azure DevOps and GitHub Actions for automated builds, testing, and deployments.

Cognizant, INDIA

Tech Lead/Developer
01.2011 - 01.2013

Job overview

  • Domain: Banking and Finance.
  • Developed scalable and maintainable applications using Python frameworks such as Flask, Django, and FastAPI.
  • Designed and implemented RESTful APIs, and integrated third-party services and internal systems.
  • Wrote clean, efficient, and reusable code, following best practices and coding standards.
  • Automated data workflows, ETL processes, and system tasks using Python scripting.
  • Processed and analyzed structured and unstructured data using Pandas, NumPy, and SQL.
  • Collaborated with front-end developers, data engineers, and product teams to deliver end-to-end solutions.

HCL Technologies, INDIA

Lead Engineer
01.2007 - 01.2011

Job overview

  • Domain: Banking and Finance.
  • Designed and developed on-premises applications using N-tier architecture and a service-driven framework.
  • Developed backend services and APIs using Flask and Django for scalable web applications.
  • Wrote automation scripts to streamline data processing and system operations.

Education

West Bengal University of Technology
India

Bachelor of Technology in Computer Science & Engineering
01.2007

Certification

Dataiku AI/ML Practitioner

AWS Data Analytics

AWS Solution Architect - Associate

Databricks Gen AI

Azure Fundamentals

Professional Scrum Master (PSM I)

Languages

English
Full Professional