Jordan Voves

Old Saybrook, CT

Summary

Data Architect and Databricks specialist with a proven track record of building end-to-end AI systems that power both customer-facing products and business insights. Experienced in designing platforms from the ground up, implementing real-time evaluation frameworks, and embedding feedback loops and contextual signals to continuously improve model performance. Recognized for originating prototypes that became flagship products, driving company-wide adoption of Databricks, and serving as a technical leader who bridges architecture, data engineering, and AI/ML to deliver measurable customer value.

Overview

7 years of professional experience

Work History

Data Architect / Sr Data Engineer

Conexiom
Vancouver, BC
07.2022 - Current

Data Architect (2025–Present), Senior Data Engineer (2022–2025)

Joined as one of the founding members of a new AI team at Conexiom, a leading sales order automation platform. As the sole Data Engineer for nearly three years, I established Databricks as the company’s Data + AI foundation and architected a cutting-edge lakehouse in a greenfield Azure environment—now supporting multiple AI products, BI initiatives, real-time evaluation, and comprehensive operational monitoring. Promoted to Data Architect to scale the platform and guide cross-functional delivery.

Leadership

  • Served as the primary technical interviewer for four hires and hiring manager for a Senior Data Engineer; now manage and mentor that role.
  • Technical lead of an 8-person pod (AI Engineer, 3 Software Engineers, Front-End Engineer, UI Designer, PM) delivering the Insights and Actions product.
  • Provided data architecture leadership and strategic guidance as the most senior data engineering professional in the company.

Key Products

  • Insights and Actions – Created and validated a new approach for generating automation insights from CSR correction data. Presented it through a Databricks App (Insights Lab) with Lakebase; strong reception led to business prioritization and production delivery.
  • Navigator – Built the data foundation for a customer-facing app with KPIs and metrics giving customers real-time visibility into automation performance across 15 million+ trade documents annually.
  • Express (AI Document Extraction) – Architected the data and evaluation layers underpinning Conexiom’s flagship AI extraction product, enabling >99.9% uptime with full operational and cost transparency.

Context & Evaluation Frameworks

  • Built a unified evaluation framework combining ground-truth metrics for extraction with LLM-as-judge for unlabeled tasks, enabling rapid iteration and confident production releases.
  • Developed a context moat to enrich inference and continuously improve extraction accuracy:
    - Part Number Recommendation: Five-million-part catalog powered by vector search for context-aware matching when part numbers are missing.
    - Trading Partner Insights: Durable contextual assets built from user corrections, interactions, and historical performance.
    - Layout & Partner Detection: Embedded document layouts to accurately identify trading partners and document types.
    - Few-Shot Retrieval: Surfaced verified extractions from prior documents as high-quality examples during inference.

Platform Architecture

  • Championed Databricks as the company-wide AI/data platform.
  • Built and maintained 20+ production-grade Spark streaming/batch pipelines with automated CI/CD, alerting, and self-healing.
  • Established Unity Catalog governance, evolving the lakehouse into a secure, production-grade platform supporting real-time monitoring and end-to-end observability across the AI system.
  • Developed and maintained 10+ department-specific dashboards for finance, product, customer success, and engineering.
  • Created Genie spaces for retrieval-augmented analytics and maintained a semantic layer (via Pydantic) to improve the effectiveness of Genie and AI/BI.

Sr. Data Engineer / Data Engineer

SQAD LLC
Tarrytown, New York
07.2018 - 07.2022

Sr. Data Engineer (Jan 2022 – Jul 2022), Data Engineer (Jul 2018 – Jan 2022)

  • Led migration from legacy MSSQL workflows to a modern data lake architecture.
  • Designed and implemented an Airflow-based orchestration platform, retiring dozens of legacy ETL jobs and enabling scalable production pipelines.
  • Rebuilt large-scale aggregations (3B+ records) using PySpark and Presto, cutting runtimes from 48 hours to 20 minutes.
  • Mentored junior engineers and data scientists on Python design patterns and AWS technologies (Kinesis, Glue, Lambda, ECS, Athena).

Education

B.S.E - Computer Science & Engineering

Bucknell University
Lewisburg, PA
01.2018

Skills

Cloud Providers

Azure, AWS

Data Platform

Delta Lake, Unity Catalog, Lakebase

Data Engineering

PySpark, Autoloader, schema inference/evolution, streaming architectures, ELT design, data modeling, Databricks SQL, MongoDB, Postgres, Airflow

AI/ML Systems

Model Serving, Embeddings, Vector Search, Prompt Engineering, Structured Outputs, Evaluation Pipelines, MLflow 3.0, LangGraph, Prompt Registry, LLM-as-Judge, FastAPI, Kubernetes

Visualization & Business Enablement

AI/BI Dashboards, Databricks Apps, Genie, semantic layer design, retrieval-augmented generation (RAG), Tableau, Dash Apps, Streamlit

Development Practices

Dependency management (uv), automated testing (pytest), secrets management, RBAC, Azure DevOps CI/CD, Infrastructure as Code (Databricks Asset Bundles, Terraform, Python tooling)

Timeline

Data Architect / Sr Data Engineer

Conexiom
07.2022 - Current

Sr. Data Engineer / Data Engineer

SQAD LLC
07.2018 - 07.2022

B.S.E - Computer Science & Engineering

Bucknell University

References

References available upon request.