Suresh Bandaru

Delivery Lead & Data Engineer
Alpharetta, GA

Summary

Results-driven Delivery Lead and Data Engineer with over 15 years of experience specializing in building scalable data products and delivering complex analytics solutions. Skilled in cloud platforms, data architecture, and stakeholder management, applying agile methodologies to achieve impactful business outcomes. Experienced in leading cross-functional teams and adept at developing end-to-end data pipelines, dashboards, and governance frameworks that support organizational success.

Overview

  • 16 years of professional experience
  • 5 years of post-secondary education
  • 2 certifications

Work History

Delivery Lead & Data Engineer

AT&T
03.2018 - Current
  • Delivered enterprise data products (e.g., Account, Subscription, OneID) by integrating data and curation processes to create a unified source of truth and improve the user experience.
  • Designed subject-specific data libraries (Account, Identity, Viewership) and business-specific data domains using agile methodologies.
  • Prioritized business needs through use-case validation and source data identification.
  • Developed proofs of concept and Minimum Viable Products (MVPs) with automation, metadata, and basic data quality.
  • Transitioned validated MVPs to full-scale production with advanced automation, robust SLAs, SOX compliance, and operational support.
  • Documented end-to-end data lineage and validated architectural designs for standards compliance.
  • Created physical and logical data models and implemented proactive data quality controls with ticketing systems.
  • Leveraged Azure Databricks and Snowflake for data transformation.
  • Developed Power BI and Tableau dashboards for end-to-end reporting.
  • Acted as a Subject Matter Expert (SME) for data domains like Viewership, Accounts, and Identity.
  • Collaborated with stakeholders to define goals, timelines, and roadmaps.
  • Allocated resources effectively, tracked performance, resolved deviations, and shared updates regularly.
  • Coordinated onshore and offshore teams, delivering numerous user stories.

Big Data Engineer

AT&T
05.2016 - 02.2018
  • Built scalable data pipelines and ETL workflows using Hadoop ecosystem tools.
  • Processed large datasets using HiveQL, Spark, and Pig for advanced analytics and reporting.
  • Automated workflows with Oozie and optimized system performance through advanced tuning.
  • Evaluated Hadoop and its ecosystem for project suitability, implementing proofs of concept (POCs) for adoption under Big Data initiatives.
  • Designed and implemented Hadoop clusters, including backup and disaster recovery systems.
  • Imported large datasets from RDBMS into Hive using Sqoop and transformed data using MapReduce and Spark.
  • Handled unstructured and semi-structured datasets in terabyte-scale sizes.
  • Implemented advanced Hive features such as partitioning, dynamic partitions, and bucketing to improve query efficiency.
  • Optimized MapReduce jobs with compression mechanisms to efficiently utilize HDFS.

Hyperion Essbase & Planning Consultant

Logic Planet Inc.
01.2009 - 04.2016
  • Built ASO (Aggregate Storage Option) and BSO (Block Storage Option) cubes in a multidimensional structure and facilitated data export between them.
  • Designed and optimized outlines, data load rules, and calculation scripts for efficient cube operation.
  • Developed calculation scripts to forecast six years of data based on actuals and different vintages for P&L accounts.
  • Implemented Transparent Partitions and report scripts to extract and transfer data to staging areas.
  • Improved performance by experimenting with Sparse/Dense combinations to reduce block size.
  • Tagged outline members as Dynamic Calc to reduce block size and improve calculation time.
  • Performed database tuning, backups, and restoration for Essbase.
  • Created batch files using MaxL for automation of tasks.
  • Developed reports for Income Statements, Balance Sheets, Key Metrics, Operating/Admin Expenses, Corporate P&L, and Revenue using Financial Reporting and OBIEE.
  • Created budget and forecasting applications, web forms, and business rules.
  • Managed security groups, user access, and database/security refreshes in Hyperion Planning.
  • Migrated reports from development to production environments.
  • Performed DBA activities such as database performance tuning, backups, and migration of cubes and reports across environments.
  • Set up new servers, performed software upgrades, and ensured smooth migration processes.
  • Supported users, maintained cubes, and regularly updated cube outlines.
  • Created load rules to validate and feed data into Essbase automatically.
  • Developed documentation for knowledge transfer to users and for administration/support purposes.

Education

Master’s - Interdisciplinary Studies

Texas A&M University-Commerce
01.2007 - 01.2008

Bachelor’s - Computer Science Engineering

Anna University
01.2001 - 01.2005

Skills

  • Azure Databricks

  • Snowflake

  • Hadoop

  • Big Data

  • Power BI

  • Tableau

  • Hyperion Essbase

  • Hyperion Planning

  • OBIEE

  • Business Objects

  • ETL pipelines

  • Agile methodologies

  • CI/CD

  • Spark

  • Kafka

  • Python

  • Java

  • SQL

  • Scala

  • Teradata

  • Star-Schema

  • Snowflake Schema Modeling

  • JIRA

  • GitHub

Certification

Azure Fundamentals (AZ-900)

Core Expertise

  • Designed and delivered high-impact data products with seamless integration of sources, ensuring a single source of truth.
  • Defined product roadmaps, data models, and governance frameworks in collaboration with data architects and business clients.
  • Built scalable data pipelines using tools such as Apache Airflow and Azure Data Factory, storing data in Snowflake and Azure Data Lake for fast querying.
  • Created dashboards using Power BI and Tableau, enabling actionable insights and enhanced decision-making.
  • Implemented CI/CD pipelines, reducing release cycles by 30%, and established robust data quality processes.
  • Led agile project delivery, translating high-level business goals into actionable tasks, creating detailed backlogs and sprint plans, and ensuring on-time delivery.
  • Delegated tasks effectively, matching team members' skills and expertise with project requirements, resulting in a 40% increase in efficiency.
  • Monitored task progress through Jira and conducted daily stand-ups to address blockers, ensuring on-time completion of milestones.
  • Mentored team members to improve their task management skills, enhancing overall team performance.
  • Expertise in Hadoop ecosystem tools: Spark, Hive, Pig, Sqoop, and Flume.
  • Skilled in real-time data processing using Kafka and Spark Streaming.
  • Developed and optimized MapReduce programs and ETL pipelines for large-scale data handling.
  • Proficient in cloud platforms such as Azure and Snowflake for data storage, processing, and transformation.
  • Extensive experience with Hyperion BI, OLAP, Essbase, and Planning for financial and multidimensional modeling.
  • Expertise in calc scripts, dynamic dimension builds, data load rules, tuning, and optimization.
  • Developed web forms with the Hyperion Planning web client for end users to input Forecast, Budget, and Actual values.
  • Proficient in star and snowflake schema modeling techniques for data warehouses.
  • Created dynamic dashboards and reports using Tableau, Power BI, and Hyperion Financial Reporting.
