Results-driven AI/ML Engineer with 4 years of experience delivering end-to-end data and machine learning solutions across diverse domains, including investment analytics, retail forecasting, and healthcare operations. Proven track record in designing scalable models, building robust data pipelines, and extracting insights from large, complex datasets to inform strategic business decisions. Adept in Python, SQL, and cloud platforms (AWS, Databricks), with a strong command of supervised learning, NLP, and model interpretability techniques. Known for leading client-facing implementations, collaborating cross-functionally with engineers, analysts, and stakeholders, and translating technical results into actionable outcomes. Passionate about building intelligent systems that drive efficiency, automation, and real-world value.
The client, a field service solutions provider, required an internal analytics system to monitor field operations efficiency, track service request volumes, and reduce technician dispatch errors across regions. They needed a centralized reporting framework to support operational planning, identify high-delay zones, and automate the flagging of SLA violations. The goal was to help the business reduce downtime, optimize technician assignments, and improve customer satisfaction across high-volume regions.
Responsibilities:
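A minimal sketch of the SLA-violation flagging described above, assuming a simple tabular service-request log; the column names, regions, and 8-hour SLA threshold are illustrative assumptions, not the client's actual schema:

```python
import pandas as pd

# Hypothetical service-request log; column names and values are illustrative.
requests = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "response_hours": [2.5, 9.0, 4.0, 12.5],
})

SLA_HOURS = 8  # assumed SLA threshold for illustration

# Flag individual violations, then summarize violation rates per region
# to surface high-delay zones for operational planning.
requests["sla_violation"] = requests["response_hours"] > SLA_HOURS
by_region = requests.groupby("region")["sla_violation"].mean().rename("violation_rate")
print(by_region)
```

In a production pipeline this aggregation would run on the centralized reporting layer and feed automated alerts rather than a print statement.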
The client required a machine learning-based inventory forecasting system to support demand-driven procurement across their retail network. The goal was to develop a scalable solution capable of processing large transactional datasets, producing accurate SKU-level forecasts, and integrating seamlessly with existing ERP and POS platforms. Key priorities included reducing overstock and stockouts, improving visibility into supply chain trends, and ensuring compliance with data governance standards.
Responsibilities:
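A baseline version of the SKU-level forecasting described above can be sketched with a trailing moving average; the weekly sales figures and 4-week window are illustrative assumptions (a real deployment would use a proper time-series model such as Prophet or an LSTM):

```python
import pandas as pd

# Hypothetical weekly unit sales for a single SKU; values are illustrative.
sales = pd.Series([120, 130, 125, 140, 150, 145, 160, 155], name="units_sold")

# Naive baseline: forecast next week's demand as the mean of the
# trailing 4 weeks. Such baselines anchor the evaluation (e.g. MAPE)
# of more sophisticated models.
WINDOW = 4
forecast_next = sales.tail(WINDOW).mean()
print(forecast_next)
```

A simple baseline like this also makes overstock/stockout trade-offs easy to reason about before committing to a heavier model.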
We were engaged by a regional healthcare provider to modernize its fragmented claims data system. The objective was to integrate structured and semi-structured data across EMR, billing, and compliance systems into a centralized analytics platform. This integration aimed to support value-based care initiatives and predictive cost modeling.
Responsibilities:
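The EMR/billing integration described above can be sketched as an outer join that keeps unmatched records visible for reconciliation; the table schemas and keys here are illustrative assumptions, not the provider's actual data model:

```python
import pandas as pd

# Hypothetical extracts from the EMR and billing systems; schemas are assumed.
emr = pd.DataFrame({"patient_id": [1, 2, 3], "diagnosis": ["A10", "B20", "C30"]})
billing = pd.DataFrame({"patient_id": [1, 2, 4], "claim_amount": [250.0, 90.0, 310.0]})

# Outer join with an indicator column: records present in only one
# system are retained and flagged, supporting reconciliation before
# the data feeds predictive cost modeling.
claims = emr.merge(billing, on="patient_id", how="outer", indicator=True)
unmatched = claims[claims["_merge"] != "both"]
print(len(claims), len(unmatched))
```

Keeping the unmatched rows, rather than silently dropping them, is what makes a fragmented-source integration auditable.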
The client required a robust AI-driven investment analytics platform capable of processing large-scale financial datasets to optimize portfolio strategies, improve risk assessment accuracy, and deliver actionable insights to institutional investors. Their objective was to integrate real-time and historical market data sources, enhance predictive modeling for asset performance, and enable intelligent decision-making through explainable AI solutions. Additionally, they sought automation of investment trend detection, sentiment analytics from financial news, and scenario simulations to support the investment team's forecasting accuracy. Compliance with SEC and internal audit policies, along with data transparency, interpretability, and scalability, was paramount.
Responsibilities:
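The sentiment-analytics component described above can be sketched with a toy lexicon-based scorer; the word lists and headline are illustrative stand-ins for the transformer-based NLP model a production system would use:

```python
# Toy lexicon-based headline scorer; POSITIVE/NEGATIVE word sets are
# illustrative assumptions, not a real financial sentiment lexicon.
POSITIVE = {"beats", "growth", "upgrade", "record"}
NEGATIVE = {"misses", "downgrade", "losses", "lawsuit"}

def headline_sentiment(headline: str) -> int:
    """Return a signed count of positive minus negative words."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(headline_sentiment("Acme beats estimates on record growth"))
```

A transparent rule-based scorer like this also serves as an interpretable baseline when explainability and audit compliance constrain the modeling choices.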
Programming: Python (NumPy, Pandas, Scikit-learn, XGBoost, TensorFlow, Keras, Matplotlib, Seaborn), R (basic statistical modeling), SQL (PostgreSQL, MySQL, T-SQL), Bash/Shell Scripting (for deployment pipelines), PL/SQL (Oracle environments)
Big Data & Databases: Snowflake, Databricks, Google BigQuery, Oracle, SQL Server
ML/AI: Supervised & Unsupervised Learning, Time-Series Forecasting (Prophet, LSTM), Regression, Classification, Clustering (K-Means, Hierarchical), NLP (BERT, spaCy, NLTK, text summarization, sentiment analysis), A/B Testing & Statistical Hypothesis Testing, Model Evaluation (RMSE, MAPE, Confusion Matrix, ROC-AUC), Feature Engineering and Selection, Model Explainability (SHAP, LIME)
Data Engineering: Data Preprocessing & ETL (Airflow, pandas, openpyxl, Informatica), Data Pipelines (Apache Airflow, Luigi), API Integration (RESTful APIs), Data Warehousing Concepts (Star Schema, Fact/Dim Tables)
Visualization & Reporting: Power BI, Tableau, Matplotlib / Seaborn (Python), Excel Dashboards (PivotTables, VLOOKUP, Data Validation)
Cloud & DevOps Tools: AWS (S3, EC2, Lambda, SageMaker, CloudWatch), Docker (model containerization), Git (version control), GitHub/GitLab, Jenkins (basic CI/CD)