Madhavi Polisetti

Houston

Summary

Dynamic and hands-on data engineering leader with a proven track record at Magnolia Capital, Solera, Omnitracs, and Argonne National Lab, excelling in architecting data warehouses and automating data modeling and ETL processes. Expert in Snowflake, SQL, and Python. A collaborative leader, she empowers teams to deliver impactful BI solutions that align with business objectives.

Overview

24 years of professional experience
1 Certification

Work History

Senior Data Engineer/Architect/Team Leader

Magnolia Capital
09.2022 - Current
  • Direct a team of two junior data engineers in architecting a 350 GB enterprise-wide data warehouse on Snowflake, pioneering an automated data warehouse build process.
  • Developed and refined Python/SQL-based frameworks for defining and maintaining KPIs aligned with measurable business outcomes.
  • Design and maintain Power BI dashboards and visualizations tailored to stakeholder needs, ensuring clarity and alignment with organizational priorities.
  • Accelerated leadership's strategic decision-making by 40% through dashboards and investor reporting.
  • Communicate KPIs effectively to operations teams and stakeholders.
  • Architected ETL solutions that accelerated data delivery and enhanced financial, marketing, and other reporting by 40%.
  • Implemented star-schema data marts in both Redshift and Snowflake environments housing the private equity firm's data.
  • Improved operational efficiencies by 40% through BI and analytics solutions.
  • Leveraged WhereScape and dbt to automate 10-15 daily ETL jobs, speeding up data ingestion and metadata management and reducing manual ETL pipeline-building effort by 80%.
  • Implemented 50-60 Azure, Python, and SQL-based daily data pipelines and 10-15 WhereScape jobs supporting hundreds of daily pipelines critical to business operations and daily analytics needs.
  • Automated data ingestion, modeling, and loading processes, reducing manual effort by 80% and accelerating data delivery timelines.
  • Created self-service financial, accounting, and marketing dashboards visualizing complex KPIs and key metrics for Asset Management, Marketing, and Accounting teams, reporting monthly financial performance and marketing funnels.
  • Lead development of BI dashboards for asset management and leadership oversight of proprietary real estate platform data in the AWS Redshift environment.
  • Maintain data science models developed in Databricks to predict rent growth of markets and submarkets for the upcoming year.

Data Architect

Solera
01.2022 - 09.2022
  • Led a team of 3 engineers in building high-impact data pipelines in Snowflake
  • Architected data warehouses and pipelines leveraging AWS services to automate ETL processes, reducing manual processing time by 40%
  • Led the team in creating high-quality data using AWS, Python, Snowflake, and SQL-based data ingestion processes, and automated 90% of Data Warehouse builds using Talend
  • Developed an enterprise-wide data warehouse on Snowflake that integrated data from 15 different sources, creating single-source-of-truth databases that provide consistent data across the organization
  • Integrated data from 5 different database types, including MySQL and SQL Server, into the Snowflake warehouse, improving data retrieval times by 30%
  • Documented data models and data flows for historical reference, significantly enhancing system architecture, collaboration, and knowledge transfer
  • Created 2 single-source-of-truth databases that enabled 5 new customer-facing products, resulting in a 20% increase in customer engagement

Data Engineering Manager/Senior Data Engineer

Omnitracs
Chicago
08.2019 - 01.2022
  • Developed a comprehensive Single Source of Truth Shipper/Receiver locations database with over 6.5 million rows (~40 GB)
  • Led a team of 3 Data Engineers/Analysts in developing over 20 high-performance ETL data pipelines connecting AWS/Snowflake, and managed data for 100+ customer reports using Talend, Looker, and Cloud ETL tools
  • Championed data-driven product development, leading the creation of Python algorithms and pipelines to derive geofences and wait times at trucking locations using Databricks, Python, GeoPandas, geospatial data, and Microsoft geofence datasets
  • Engineered data for 2 innovative products: Location Intelligence for Truck Carriers and Weather Intelligence for Truckers
  • Crafted algorithms, data pipelines, and machine learning models using AWS/Python/SQL/Snowflake to address logistics industry challenges, including warehouse detention calculations, dwell-time prediction at shipper/receiver locations, and automatic text summarization of Google reviews
  • Developed new algorithms and delivered critical decision-making insights to key stakeholders
  • Significantly accelerated data science-based product development by analyzing logistics IoT data totaling 95 billion rows (~4 TB)

Data Science Project/Data Engineer

Argonne National Lab
Chicago
12.2018 - 12.2019
  • Developed IoT data analysis to assess the health of 100+ sensors for the Chicago Urban Sensing Project
  • Reported directly to the IT Director and led a driven team of three in designing, building, and delivering predictive modeling insights using Python, PySpark, and Hadoop clusters
  • Processed 100 GB of IoT data, constructed prediction models, and provided strategic recommendations on the health of IoT nodes and sensors
  • Delivered sensor health analysis and failure prediction algorithms that enabled the Chicago city-wide project to select the most reliable sensors from 10-15 vendor options
  • Presented capstone findings to the University of Chicago community

Senior Web Developer

Coilcraft
Chicago
09.2013 - 08.2019
  • Engineered single-page applications (SPAs) with JavaScript, HTML, and CSS to elevate interactive user experiences, leveraging SQL Server data and managing all of the company's user interactions
  • Spearheaded the automation of vendor management, sales funnels, internal messaging, and other workflows, slashing manual data entry by 50-60%
  • Crafted a secure and intuitive website with enhanced UI/UX, driving a 60% surge in user engagement and a 40% boost in retention
  • Mastered Git version control to track changes and uphold code quality, ensuring a robust, maintainable code base of over 10,000 lines

Senior Software Engineer

Submittal Exchange (now Oracle)
Chicago
05.2012 - 09.2013
  • Coordinated with 4 other software engineers to evaluate the company's SaaS implementation and upgrades
  • Analyzed and proposed technical solutions based on customer requirements, enhancing the company's SaaS solution for construction management
  • Developed robust, scalable, modular and API-centric infrastructures to support the product
  • Delivered and unit-tested C#.NET web applications within customer-prescribed timeframes

Software Engineer Consultant

TekSystems, Tech Mahindra, Optyinc
Chicago
11.2000 - 05.2012
  • Consulted with 4 insurance clients in the Chicago area, aligning software development with customer priorities
  • Served as primary liaison between the customer development team and the consultant team, relaying requirements and feedback for development efforts
  • Provided software-related technical support, developed COBOL, DB2, and JCL-based software, and performed implementation and post-implementation support
  • Operated within cross-functional environments, coordinating multi-disciplinary teams and translating highly technical language into easily understood requirements
  • Aided software implementation for clients in the insurance industry, providing training services
  • Served as technical or subject matter expert for development efforts and implementation support

Education

Master of Science - Applied Data Science

University of Chicago
Chicago, IL
12.2019

Bachelor of Engineering - Electrical Engineering

Andhra University
Visakhapatnam, India
06.2000

Skills

  • Data Vault 2.0 Implementation
  • Snowflake Data Warehousing
  • Snowflake Clusters, UDFs, and Tasks & Streams
  • SQL Programming
  • AWS Redshift Data Mart Development
  • AWS
  • Microsoft Azure
  • Azure Data Factory
  • ETL Tools: WhereScape, dbt
  • Algorithm Development
  • ETL Process Proficiency
  • Python, PySpark
  • Data Modeling: Dimensional, Agile
  • Microsoft SQL Server
  • Data Pipelines
  • Data Warehousing
  • Power BI Dashboards
  • Data Governance
  • Real Estate, IoT, Financial Data Analysis
  • Databricks
  • Team collaboration
  • Adaptability
  • Time management abilities

Certification

Certified Data Vault 2.0 Practitioner:

https://www.credential.net/b227de62-0624-418e-b8d5-359896e6049a

Timeline

Senior Data Engineer/Architect/Team Leader

Magnolia Capital
09.2022 - Current

Data Architect

Solera
01.2022 - 09.2022

Data Engineering Manager/Senior Data Engineer

Omnitracs
08.2019 - 01.2022

Data Science Project/Data Engineer

Argonne National Lab
12.2018 - 12.2019

Senior Web Developer

Coilcraft
09.2013 - 08.2019

Senior Software Engineer

Submittal Exchange (now Oracle)
05.2012 - 09.2013

Software Engineer Consultant

TekSystems, Tech Mahindra, Optyinc
11.2000 - 05.2012

Master of Science - Applied Data Science

University of Chicago

Bachelor of Engineering - Electrical Engineering

Andhra University