Conner Snavely

Fort Lauderdale, FL

Summary

Data & AI Engineer with a proven track record at SailPlan Maritime, building efficient ELT pipelines and improving data processing speed. Skilled in AWS and GCP, with experience delivering analytics solutions and collaborating closely with stakeholders. Focused on applying data to drive operational excellence and innovation.

Overview

3
years of professional experience

Work History

Senior Data & AI Engineer

SailPlan Maritime
Fort Lauderdale, FL
01.2025 - Current
  • Primary data engineer building and maintaining ELT pipelines that process gigabytes of near-real-time maritime telemetry, transforming raw data in BigQuery into analytics-ready models that power Looker BI dashboards.
  • Migrated the dbt Core pipeline to dbt Cloud, reducing pipeline runtime from 30 minutes to 5 minutes and significantly accelerating development for the rest of the team.
  • Developed an event-driven, terabyte-scale NOAA weather data ingestion microservice, enriching maritime data tables for machine learning models using Cloud Functions, Cloud Run Jobs, GCS, and BigQuery.
  • Designed and launched an automated, event-driven dbt Cloud/Cloud Run pipeline for marine data intelligence that ingests AIS data by MMSI, integrates NOAA weather, processes voyage metrics, and emails automated vessel reports to the user within 1 minute.
  • Pioneered AI-driven analytics by developing multi-agent systems with Google's Agent Development Kit, giving users a natural-language interface for querying their data, ingesting third-party data, and generating insights for operational intelligence and benchmarking.
  • Built an AI-powered PDF parsing system that lets customers upload automated ship reports containing operating metrics and interact with that data on their analytics platform within minutes.

Data Engineer

CapTech Consulting
Richmond, VA
09.2022 - 01.2025
  • Independently designed and implemented modern data pipeline architecture, including Airflow (Python), GitHub Actions, automated testing and monitoring, enhanced logging, and multiple environments, reducing pipeline runtime by 50% and significantly improving resiliency.
  • Engineered an automated process to convert over 100 Python 2.7 scripts to Python 3.12 as part of Enterprise Data Warehouse conversion efforts, completing ahead of target by 30 days.
  • Created a data reconciliation tool in Python, ensuring that 1,000+ tables in Snowflake identically matched Teradata, saving several hundred hours of manual intervention in the new Snowflake Production Environment.
  • Built a fully automated Change Data Capture ETL solution for the PGA of America in AWS to consolidate 10+ disparate data sources into an Amazon Redshift data mart for BI (Looker) dashboards.
  • Created PySpark AWS Glue scripts that read from an S3 data lake (JSON, CSV, Parquet), performed transformations, and wrote gigabytes of data to a warehouse daily.
  • Developed and implemented ERD changes using Flyway database version control.

Education

B.S. - Systems Engineering

University of Virginia
Charlottesville, VA
05.2022

Skills

  • AWS, GCP, Azure
  • BigQuery, Snowflake, Redshift
  • Spark, Glue, ADF
  • Python
  • SQL
  • Tableau, Looker
  • Airflow
  • CI/CD
  • Git
  • ETL/ELT
  • dbt
  • Google Agent Development Kit
  • GitHub Copilot

Certifications

  • AWS Certified Data Engineer - Associate
  • Databricks Certified Data Engineer Associate
  • AWS Certified Solutions Architect - Associate
  • AWS Certified Cloud Practitioner
  • Microsoft Azure Data Fundamentals
