
Madhavi Polisetti

Houston, TX

Summary

  • Data Engineer with 20+ years of hands-on leadership experience in driving software and data development initiatives.
  • Skilled in spearheading enterprise-wide data warehouse implementations and engineering scalable data pipelines and data marts for BI and analytics applications.
  • Successfully implemented Data Vault 2.0 modeling methodologies to warehouse complex financial services data, utilizing WhereScape for automated ETL workflows and enhanced metadata governance (see the sketch below this summary).
  • Expertise in optimizing processes and generating actionable insights using cloud platforms such as AWS and Azure, along with advanced ETL tools like WhereScape and Talend.
  • Proficient in working with cloud data platforms including AWS Redshift, Snowflake, and Databricks.
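
A minimal Python sketch of the Data Vault 2.0 hashing pattern mentioned above, shown for a hypothetical account hub and satellite. The column names, MD5-based hash keys, and use of pandas are illustrative assumptions, not the actual WhereScape-generated warehouse objects.

```python
# Illustrative Data Vault 2.0-style hash keys and hash diffs for a
# hypothetical "account" hub and satellite; column names are examples
# only, not the real warehouse schema.
import hashlib

import pandas as pd


def dv_hash(*values) -> str:
    """MD5 hash of pipe-delimited, upper-cased business key parts."""
    normalized = "||".join(str(v).strip().upper() for v in values)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


source = pd.DataFrame({
    "account_id": ["A-100", "A-200"],
    "account_name": ["Maple Fund", "Oak Fund"],
    "balance": [125000.00, 98000.50],
})

# Hub: one row per business key, identified by its hash key.
hub_account = pd.DataFrame({
    "hub_account_hk": source["account_id"].map(dv_hash),
    "account_id": source["account_id"],
})

# Satellite: descriptive attributes plus a hash diff for change detection.
sat_account = pd.DataFrame({
    "hub_account_hk": hub_account["hub_account_hk"],
    "hash_diff": [dv_hash(n, b) for n, b in zip(source["account_name"], source["balance"])],
    "account_name": source["account_name"],
    "balance": source["balance"],
    "load_dts": pd.Timestamp.now(tz="UTC"),
})
print(sat_account)
```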

Overview

24 years of professional experience
1 Certification

Work History

Senior Data Engineer/Architect

Magnolia Capital
09.2022 - Current
  • Directed a team of 2 junior Data Engineers to architect a 350 GB enterprise data warehouse on Snowflake, pioneering automated data processes tailored for financial services
  • Engineered comprehensive ETL solutions that accelerated data delivery and enhanced financial and other reporting by 40%
  • Implemented star-schema-based data marts in Snowflake to drive Power BI dashboards
  • Delivered analytics solutions that improved operational efficiency by 40%
  • Leveraged WhereScape solutions to automate 10-15 daily ETL jobs, enhancing data ingestion and metadata management, which resulted in an 80% reduction in manual efforts
  • Implemented 50-60 daily Azure/Python/SQL data pipelines and 10-15 WhereScape jobs that support hundreds of daily pipeline runs, meeting analytics needs critical to business operations (see the sketch below)
  • Automated data ingestion, modeling, and loading processes, reducing manual effort by 80% and accelerating data delivery timelines
  • Created self-service financial, accounting, and marketing dashboards visualizing complex KPIs and key metrics for the Asset Management, Marketing, and Accounting teams, reporting monthly financial performance and marketing funnels
  • Accelerated strategic decision-making for these teams by 40% with these dashboards
  • Led efforts to develop BI dashboards for Asset Management and leadership oversight of proprietary real estate platform data in an AWS Redshift environment
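
A simplified sketch of what one of the daily Python/SQL pipeline steps described above could look like: an incremental MERGE into a star-schema fact table in Snowflake via the snowflake-connector-python package. The credentials, warehouse, database, and table names are hypothetical placeholders.

```python
# Sketch of a daily incremental load into a star-schema fact table in
# Snowflake; all object names and credentials below are placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="FINANCE_MART",
    schema="REPORTING",
)

merge_sql = """
MERGE INTO fact_monthly_performance AS tgt
USING stg_monthly_performance AS src
   ON tgt.property_key = src.property_key
  AND tgt.month_key = src.month_key
WHEN MATCHED THEN UPDATE
    SET tgt.net_operating_income = src.net_operating_income
WHEN NOT MATCHED THEN INSERT (property_key, month_key, net_operating_income)
    VALUES (src.property_key, src.month_key, src.net_operating_income)
"""

cur = conn.cursor()
try:
    cur.execute(merge_sql)  # upsert only new or changed fact rows
finally:
    cur.close()
    conn.close()
```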

Data Architect

Solera
01.2022 - 09.2022
  • Led a team of 3 engineers to orchestrate high-impact data pipelines in Snowflake, leveraging AWS services to automate ETL processes and reducing manual processing time by 40%
  • Led the team in creating high-quality data using AWS, Python, Snowflake, and SQL-based data ingestion processes, and automated 90% of Data Warehouse builds
  • Developed an enterprise data warehouse on Snowflake that integrated data from 15 different sources, leading to single source of truth databases providing consistent data across the organization
  • Integrated data from 5 different database types, including MySQL, SQL Server, and Snowflake, into the Snowflake warehouse, improving data retrieval times by 30%
  • Documented data models and data flows for historical reference, strengthening system architecture documentation, collaboration, and knowledge transfer
  • Created 2 single source of truth databases, which led to the development of 5 new customer-facing products, resulting in a 20% increase in customer engagement

Data Engineering Manager/Senior Data Engineer

Omnitracs
08.2019 - 01.2022
  • Developed a comprehensive Single Source of Truth Shipper/Receiver locations database with over 6.5 million rows (~40 GB)
  • Led a team of 3 Data Engineers/Analysts in developing over 20 high-performance ETL data pipelines connecting AWS/Snowflake, and managed data for 100+ customer reports using Talend, Looker, and Cloud ETL tools
  • Championed data-driven product development, leading the creation of Python algorithms and pipelines for deriving geofences and wait times at trucking locations using Databricks, Python, GeoPandas, geospatial data, and Microsoft Geofence datasets (see the sketch below)
  • Engineered data for 2 innovative products: Location Intelligence for Truck Carriers and Weather Intelligence for Truckers
  • Crafted algorithms, data pipelines, and machine learning models using AWS/Python/SQL/Snowflake to address logistics industry challenges, including algorithms for warehouse detention calculations, predicting dwell times at shipper/receiver locations, and automatic text summarization of Google reviews
  • Developed new algorithms and delivered critical decision-making insights to key stakeholders
  • Significantly accelerated data science-based product development by analyzing logistics IoT data totaling 95 billion rows (~4 TB)
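
An illustrative sketch of the geofence and dwell-time idea noted above, assuming a simple rectangular geofence and a handful of GPS pings; the coordinates, column names, and crude min/max dwell estimate are made-up examples rather than the production Databricks pipeline.

```python
# Flag GPS pings inside a (hypothetical) geofence polygon and estimate
# time spent inside it; all values are illustrative.
import pandas as pd
from shapely.geometry import Point, Polygon

geofence = Polygon([(-87.65, 41.85), (-87.63, 41.85),
                    (-87.63, 41.87), (-87.65, 41.87)])

pings = pd.DataFrame({
    "ts": pd.to_datetime(["2021-06-01 08:00", "2021-06-01 08:15",
                          "2021-06-01 09:45", "2021-06-01 10:00"]),
    "lon": [-87.70, -87.64, -87.64, -87.58],
    "lat": [41.80, 41.86, 41.86, 41.90],
})

pings["inside"] = [geofence.contains(Point(lon, lat))
                   for lon, lat in zip(pings["lon"], pings["lat"])]

inside = pings[pings["inside"]]
dwell = inside["ts"].max() - inside["ts"].min()  # rough dwell estimate
print(f"Approximate dwell time at this location: {dwell}")
```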

Data Science Engineer

Argonne National Lab
12.2018 - 12.2019
  • Developed IoT data analysis to assess the health of 100+ sensors for the Chicago Urban Sensing Project
  • Reported directly to the IT Director and led a driven team of three in designing, building, and delivering predictive modeling insights using Python, PySpark, and Hadoop clusters
  • Processed 100 GB of IoT data, constructed prediction models, and provided strategic recommendations on the health of IoT nodes and sensors
  • Sensor health analysis and failure prediction algorithms enabled the Chicago city-wide project to select the most reliable sensors from 10-15 vendor options (see the sketch below)
  • Delivered the capstone presentation to the University of Chicago community
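
A rough PySpark sketch of the kind of sensor-health feature aggregation described above; the schema, sample readings, and derived metrics (missing-reading rate, reading variability) are assumptions for illustration, not the project's actual pipeline.

```python
# Aggregate per-sensor health features (missing rate, variability) from
# raw IoT readings; schema and data are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sensor-health-sketch").getOrCreate()

readings = spark.createDataFrame(
    [("node01", "temp", 21.5), ("node01", "temp", None),
     ("node02", "temp", 19.8), ("node02", "temp", 20.1)],
    "node_id string, sensor string, value double",
)

health = (
    readings.groupBy("node_id", "sensor")
    .agg(
        F.count(F.lit(1)).alias("n_readings"),    # all readings
        F.count("value").alias("n_valid"),        # non-null readings
        F.stddev("value").alias("value_stddev"),  # reading variability
    )
    .withColumn("missing_rate", 1 - F.col("n_valid") / F.col("n_readings"))
)

health.show()  # high missing_rate or erratic stddev suggests a failing sensor
```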

Senior Web Developer

Coilcraft
09.2013 - 08.2019
  • Engineered single-page applications (SPAs) with JavaScript, HTML, and CSS to elevate interactive user experiences, leveraging SQL Server data and handling all of the company's user interactions
  • Spearheaded the automation of vendor management, sales funnels, internal messaging, and other workflows, slashing manual data entry by 50-60%
  • Crafted a secure and intuitive website with enhanced UI/UX, driving a 60% surge in user engagement and a 40% boost in retention
  • Applied Git version control to track changes and uphold code quality, maintaining a robust code base of over 10,000 lines

Senior Software Engineer

Submittal Exchange (now Oracle)
05.2012 - 09.2013
  • Coordinated with 4 other software engineers to evaluate the company's SaaS implementation and upgrades.
  • Analyzed and proposed technical solutions based on customer requirements, enhancing the company's SaaS solution for construction management.
  • Developed robust, scalable, modular, and API-centric infrastructure to support the product.
  • Delivered and unit-tested C#.NET web applications within customer-prescribed timeframes.

Software Engineer Consultant

TekSystems, Tech Mahindra, Optyinc
11.2000 - 05.2012
  • Consulted with 4 insurance clients in the Chicago area, aligning software development with customer priorities.
  • Served as primary liaison between customers' development teams and the consultant team, relaying requirements and feedback for development efforts.
  • Provided software-related technical support, developed COBOL programs, and performed implementations and implementation support.
  • Operated within cross-functional environments, coordinating multi-disciplinary teams and translating highly technical language into easily understood requirements.
  • Aided in software implementation for clients in the insurance industry, providing training services.
  • Served as technical and subject matter expert for development efforts and implementation support.

Education

Master of Science - Applied Data Science

University of Chicago
Chicago, IL
12.2019

Bachelor of Engineering (B.E.)

Andhra University
Visakhapatnam, India
06.2000

Skills

  • Data Vault 2.0 Implementation
  • WhereScape
  • Snowflake Data Management
  • SQL Data Analysis
  • Database Management
  • Algorithm Development
  • ETL Process Proficiency
  • Python
  • Data Analysis
  • Data Modeling
  • Data Mart
  • PySpark
  • Star Schema
  • Microsoft SQL Server
  • Azure Data Factory
  • Amazon Web Services
  • Data Pipelines
  • Data Warehousing
  • Git
  • Dashboards
  • Data Governance
  • Team Management
  • Financial Data Analysis

Certification

  • Certified Data Vault 2.0 Practitioner (https://datavaultalliance.com/certification/)

