ALEKHYA DUDYALA

San Antonio

Summary

  • Over 6 years of experience in Business/Data Analysis, ETL development, and data visualization using Tableau, Power BI, and SQL
  • Specialized in building interactive Tableau dashboards with filters, dropdowns, KPIs, and real-time visual storytelling to support business intelligence
  • Strong skills in SQL for writing complex queries (joins, window functions) and Python for scripting, automation, and data transformation
  • Experienced in ETL processes, including data extraction, cleansing, transformation, and loading using Python and SQL-based tools
  • Hands-on with Microsoft Power BI to deliver supplementary insights and create end-to-end reporting ecosystems
  • Solid understanding of Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, and Databricks for cloud-based analytics pipelines
  • Well-versed in AWS services such as S3, EC2, Lambda, DynamoDB, and EMR to support data operations and reporting infrastructure
  • Proficient in collaborating with cross-functional teams to define KPIs, gather requirements, and deploy scalable BI dashboards
  • Familiar with QA processes, UAT testing, and the full Software Development Life Cycle (SDLC) for analytics solutions
  • Skilled in NumPy, Pandas, and Matplotlib for data manipulation, analysis, and basic statistical modeling
  • Knowledgeable in machine learning libraries such as Scikit-learn and TensorFlow for applying predictive techniques and detecting data trends
  • Strong communication and problem-solving abilities with a focus on turning complex data into simple, visual business insights

Overview

6 years of professional experience

Work History

Data Engineer

Ford Motor Company
Dearborn
08.2023 - Current
  • Built and maintained ETL pipelines using Python and SQL for high-volume vehicle production and logistics datasets
  • Developed data models (star and snowflake schemas) to structure manufacturing and warranty data for optimized BI consumption
  • Created and scheduled data pipelines in Azure Data Factory, integrating sources into Synapse and Azure Data Lake for centralized analytics
  • Used AWS S3 and Lambda to manage data ingestion and processing of sensor/telemetry data from connected vehicles
  • Implemented Snowflake for data warehousing and built views for deduplication, aggregation, and operational reporting
  • Automated infrastructure provisioning and deployments using Terraform and CI/CD pipelines with Jenkins
  • Designed interactive Tableau dashboards for operations, highlighting metrics like part defects, plant efficiency, and inventory delays
  • Wrote Python scripts for log parsing, anomaly detection, and preparation of streaming and batch data
  • Applied Spark (PySpark and Spark SQL) to optimize batch processing jobs across massive, distributed datasets
  • Leveraged Git and Jira for version control and agile sprint-based project delivery
  • Used Postman and REST API integrations to test data flow across systems and connect to external tools
  • Built data validation layers to ensure completeness and quality across ingestion layers
  • Defined data governance and access control policies in collaboration with IT and security teams
  • Conducted root cause analysis and performance tuning to support executive reporting use cases
  • Delivered detailed documentation for pipeline architecture, data lineage, and dashboard logic to support scaling
  • Environment: Azure Synapse Analytics, Azure Data Factory, Azure Data Lake, Databricks, AWS (S3, EC2, Lambda, DynamoDB, EMR), Snowflake, Python (Pandas, NumPy), SQL, Spark (PySpark, Spark SQL, Spark Streaming), Tableau, Power BI, Jenkins, Git, Postman, REST APIs, Jira, Windows, Linux

Senior Data Analyst / Tableau Developer

Wells Fargo
Charlotte
07.2019 - 07.2023
  • Designed and delivered enterprise-wide Tableau dashboards for fraud detection, compliance monitoring, and operational efficiency, consumed by 100+ users across departments
  • Partnered with business units to translate analytical needs into scalable Tableau solutions using parameters, level-of-detail (LOD) expressions, dynamic filters, and interactive charts
  • Wrote and optimized complex SQL queries in MySQL and PostgreSQL to extract and transform data across multiple banking systems and warehouses
  • Used Python (Pandas, NumPy) to automate data cleansing, reshape large datasets, and push updates to Tableau extracts
  • Collaborated with stakeholders to define KPI frameworks and dashboard wireframes, increasing decision-making speed by 40%
  • Created scheduled data refreshes, published workbooks, and managed access through Tableau Server and Tableau Online
  • Integrated Tableau with Power BI and Excel reports to provide layered insights across finance, risk, and customer experience domains
  • Built ETL processes using SQL-based procedures and Python scripting to extract data from Oracle and flat files into centralized reporting schemas
  • Conducted data profiling, normalization, and metadata management to ensure data quality and traceability across BI systems
  • Implemented row-level security (RLS) and dashboard performance tuning techniques for faster rendering and secure data access
  • Collaborated with QA and engineering teams to automate UAT test cases and validate dashboard outputs against source systems
  • Delivered executive dashboards used in quarterly board reviews, enabling data-driven budgeting and operational forecasting
  • Created documentation for dashboard logic, field definitions, and user guides, reducing onboarding time for analysts by 50%
  • Supported migration of legacy reporting systems into Tableau Cloud and Power BI, leading to a unified BI stack
  • Managed projects through Agile sprints using Jira, ensuring on-time delivery and iterative feedback cycles
  • Environment: Tableau Desktop, Tableau Server, Power BI, SQL, Python, PostgreSQL, MySQL, Excel, Jira, Git, REST APIs, Oracle, Agile (Scrum), Windows, Linux

Education

M.S., Information Sciences

University of Texas at Arlington
Arlington, TX
12.2024

Skills

  • Languages: SQL, Python, R
  • Data Visualization: Tableau (Advanced), Power BI, Excel (PivotTables, Power Query)
  • ETL & Data Preparation: Tableau Prep, Python (Pandas, NumPy), SQL-based ETL
  • Cloud Platforms: Azure Synapse, Azure Data Factory, Azure Data Lake, Databricks, AWS (S3, EC2, Lambda, DynamoDB, EMR)
  • Databases: MySQL, PostgreSQL
  • Project Management: Jira, Agile (Scrum), SDLC documentation
  • Testing & QA: Postman, Selenium (basic), REST API log analysis
  • Machine Learning Tools: Scikit-learn, TensorFlow (basic), Matplotlib, Seaborn
  • Version Control & CI/CD: Git, GitHub, Jenkins (basic)
  • Other Tools: Tableau Public, Power BI Service, VS Code
