Chaitanya Anchala

Wylie, Texas

Summary

Accomplished Lead Data Architect and Data Modeler with 14 years of expertise in designing and implementing robust data architecture solutions across diverse domains, including corporate loans, asset management, energy commodities, and business travel. Proficient in conceptualizing, developing, and refining data models that drive business insights and operational efficiency. Recognized as a hands-on leader adept at orchestrating cross-functional teams and overseeing data professionals to ensure alignment with organizational objectives.

Overview

14 years of professional experience
1 certification

Work History

Lead Data Architect & Modeler

American Century Investments
Dallas, TX
12.2023 - Current
  • Designing and maintaining comprehensive data models using ER Studio, aligning with business requirements and industry best practices
  • Integrating ER Studio into the data architecture lifecycle to streamline processes and facilitate collaboration among team members
  • Leveraging ER Studio's features for documentation and visualization of data models, including entity-relationship diagrams
  • Incorporating ER Studio into data governance initiatives to enforce data standards and compliance throughout the data lifecycle
  • Collaborating with data engineers to incorporate ER Studio-generated data models into the broader data ecosystem
  • Providing training and support on ER Studio usage and best practices to empower team members in data modeling activities
  • Modeling APIs and generating OpenAPI contracts to define interfaces for app development
  • Ensuring data integrity and adherence to defined schemas through the use of Pydantic classes
  • Overseeing the development and maintenance of data models, dictionaries, and standards
  • Leading data integration projects to ensure data quality and seamless data flows across platforms
  • Designing scalable data pipelines and implementing data transformation processes
  • Leveraging technologies like Apache Spark, AWS Glue, and Azure Data Factory for data orchestration and automation
  • Collaborating with business analysts to design optimal data structures for business intelligence and analytics
  • Providing technical guidance and mentorship to team members to foster their professional growth.

Vice President - Lead Data Architect

Goldman Sachs
Dallas, TX
08.2019 - 12.2023
  • Leading a high-performing team of data modelers, engineers, and administrators for end-to-end data architecture and management
  • Developing and maintaining comprehensive data models for corporate loans and asset management using Erwin and FINOS
  • Building data pipelines with Azure Data Factory for migrating on-premises SQL Server databases to the cloud and AWS Glue for AWS environments
  • Creating data platforms with Python PySpark, AWS Glue Catalog, and Athena on S3 for scalable data processing
  • Designing enterprise-level data architectures for warehouses, Operational Data Stores (ODS), and reporting platforms in AWS and Azure
  • Building serverless functions with AWS Lambda and Azure Functions and developing complex workflows with AWS Step Functions
  • Leveraging Terraform for infrastructure as code to ensure reliable deployment of data solutions on both AWS and Azure
  • Collaborating with cross-functional teams to translate data requirements into scalable solutions
  • Leading data integration projects to ensure data quality, integrity, and seamless flows across systems
  • Implementing Amazon RDS Performance Insights and Azure SQL Database monitoring to optimize database performance
  • Designing and developing scalable data pipelines for efficient data ingestion and processing
  • Collaborating with business analysts to design optimal data structures for BI and analytics
  • Providing technical guidance and mentorship to foster team members' professional growth.

Lead Data Architect and Data Modeler

Copart
Dallas, TX
08.2018 - 08.2019
  • Spearheaded the design and data modeling of operational data stores and data warehouse applications for the auction engine and member registration, using Azure ETL tools such as Azure Data Factory and Azure Databricks to drive efficient data processing and analysis
  • Introduced and implemented a robust reference data management platform on Azure, strengthening data governance and ensuring consistent, accurate reference data across the organization
  • Collaborated with cross-functional teams, including data scientists and business analysts, providing data engineering support for advanced analytics and reporting initiatives
  • Partnered proactively with data scientists, ensuring seamless integration of high-quality data from Azure sources into their models and algorithms
  • Gathered and synthesized requirements from business analysts, using Azure ETL capabilities to ensure accurate, readily available data for critical reporting needs
  • Fostered effective communication and coordination across diverse teams, enabling the successful execution of data projects and initiatives
  • Mentored junior data engineers, supporting their professional growth with a focus on Azure best practices in data engineering.

Senior Data Engineer

Radius Travel
Bethesda, MD
05.2016 - 08.2018
  • Mentored a high-performing team in a fast-paced environment, cultivating a collaborative atmosphere that drove successful project delivery
  • Designed and reviewed machine learning models that uncovered data trends within the reservation system, delivering valuable insights to customers
  • Collaborated with multiple cross-functional teams to understand application-specific data requirements and engineer machine learning and reporting models
  • Supported marketing initiatives by building reports on the enterprise data warehouse, giving teams actionable insights into system data trends
  • Tuned the performance of Teradata, SQL Server, SQL queries, and ETL processes, improving data processing speed and efficiency
  • Developed automation tools in Python that improved the productivity of support and development teams
  • Architected and built robust ETL processes to extract, transform, and load new data into the enterprise data warehouse (EDW)
  • Designed and implemented deployment tools in Python, streamlining the deployment of database objects
  • Worked with a broad set of tools and technologies, including SSIS, Python, SQL Server, Teradata, Informatica, Unix, and MSTR.

Data Engineer

Cognizant Technology Solutions Corp
Cleveland, OH
12.2015 - 05.2016
  • Designed and implemented robust ETL processes using Informatica to capture mortgage and vehicle loans, ensuring efficient data extraction, transformation, and loading
  • Specialized in IBM Banking Reference Data Model, leveraging expertise in both Hadoop and Teradata environments to optimize table structures and indexes for enhanced performance
  • Developed shell scripts to automate the invocation of Informatica overnight batch jobs, ensuring seamless and timely data processing
  • Conducted performance tuning of existing stored procedures and macros while creating new ones, improving the overall efficiency and speed of data operations
  • Utilized Teradata tools such as Teradata Viewpoint for monitoring, Teradata Administrator for workload management, and NetVault for backups, with hands-on experience in Teradata utilities such as MLoad, TPT, and FastLoad
  • Actively involved in performance tuning efforts for Informatica workflows and Teradata SQL queries, optimizing data processing and query performance
  • Participated in the successful upgrade of Informatica from version 8.6 to 9, as well as Teradata 12/13 upgrades, ensuring smooth transitions and minimal disruption to operations
  • Provided effort estimation for each release and played an integral role in warehousing projects within the banking domain
  • Served as a platform support specialist, responsible for setting up Informatica and Unix environments for seamless data processing
  • Developed Python 2.x scripts for data wrangling and efficient data manipulation
  • Mentored and onboarded new colleagues in the EDW framework, imparting essential functional knowledge and fostering a collaborative team environment.

DW BI Developer

Westpac Banking Corporation
Sydney, NSW, Australia
03.2011 - 12.2014
  • Contributed to a range of projects within the enterprise data warehouse (EDW), ERR Data Mart, and framework activities at Westpac
  • Assumed various roles, including technical standard documentation for each tool and framework, ensuring adherence to best practices and industry standards
  • Translated business requirements into comprehensive technical specification mapping documents, facilitating seamless communication and alignment between business and technical teams
  • Deployed ETL and database code while conducting meticulous build reviews to maintain code quality and compliance with project requirements
  • Played a key role in building the ETL architecture and defining source-to-target mappings for data loading into the data warehouse
  • Leveraged a wide array of transformations, including Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union, to develop robust mappings in Informatica Designer
  • Processed and transformed delta feeds of customer data on a daily basis, ensuring accurate and up-to-date information
  • Conducted data analysis and profiling to gain insights into data quality and integrity
  • Developed and maintained Control-M jobs in non-production environments, ensuring smooth job scheduling and execution
  • Maintained detailed documentation for over 35,000 Control-M jobs, enabling efficient job management and troubleshooting
  • Extracted and loaded data using Teradata Utilities, optimizing data movement and performance
  • Performed performance tuning activities on Informatica workflows and Teradata SQL queries, enhancing data processing efficiency and query performance
  • Developed and executed Teradata loading utilities such as MLoad, FastLoad, and TPT to efficiently load data into the Teradata database
  • Engaged in Teradata DBA activities, including space allocation, user access level management, table creations, and partitioning
  • Managed Control-M scheduling for production, testing, and development databases, ensuring timely and reliable job execution
  • Assumed responsibility for clarifying technical and functional aspects of ETL processing to testing, reporting, and business teams, fostering effective collaboration and understanding.

ETL Production Support Analyst

Bank of America, Tata Consultancy Services Ltd
Chennai, India
01.2010 - 01.2011
  • Led table creation and size allocation activities, ensuring optimal data storage and efficient data retrieval
  • Managed incident, change, and problem management processes, promptly addressing and resolving production bugs and performance issues
  • Played an integral role in change, incident, and problem management as part of business-as-usual (BAU) operations, ensuring smooth and reliable system performance
  • Scheduled jobs and ensured timely delivery of data to upstream and downstream systems, meeting SLA requirements
  • Orchestrated migration activities during enterprise releases, coordinating seamless data transfers and minimizing disruptions
  • Collected statistics and verified spaces to maintain the health and integrity of the data environment
  • Oversaw the quality process, delivering weekly status reports to clients and providing comprehensive metrics reporting
  • Conducted impact analysis and effort estimation for change requests, efficiently allocating tasks to team members and resolving technical issues
  • Demonstrated proficiency in debugging and identifying production-related issues, promptly resolving them within expected time frames
  • Gained hands-on experience scheduling jobs with AutoSys and Control-M, ensuring efficient job execution and timely data processing
  • Worked on enhancements, value adds, permanent fixes, performance tuning, reloads, and user tickets within the supported applications, continuously improving system functionality and user experience.

Education

Master of Science - Computer Science

University of Central Missouri
Warrensburg, MO
12.2015

Skills

  • Leadership in Data Architecture: Trusted leader of teams of data modelers and data architects building robust data architecture solutions. Leverages strong interpersonal skills and technical expertise to guide and mentor team members, ensuring successful project execution and delivery of high-quality solutions
  • Data Modeling Excellence: Demonstrated proficiency in conceptual, logical, and physical data modeling for structured and unstructured data. Specializes in designing adaptable data structures tailored to specific business requirements, fostering scalability and agility in data management
  • Innovative Framework Creation: Creator of the DMDD (Data Model-Driven Development) framework, which pioneers the integration of data models into code. This approach streamlines development, ensures seamless alignment between data architecture and application development, and improves efficiency and maintainability. DMDD also serves as a foundational stepping stone toward AI-driven code generation from data models
  • Data Integrity and Consistency: Committed to upholding data accuracy and consistency standards, drawing on a deep understanding of reference data management principles. Implements strategies that ensure data quality and reliability throughout the data lifecycle
  • Technological Proficiency: Skilled in industry-leading ETL tools and technologies, including Python, Informatica, SSIS, and PySpark for big data processing. Expertise in Delta Lake for data versioning and reliability and in Databricks for scalable data engineering and analytics workflows
  • Cloud-Native Expertise: Specializes in cloud-native data integration and processing tools such as Azure Data Factory, Azure Logic Apps, Azure Synapse, AWS Lambda, AWS Step Functions, and EMR. Extensive experience executing migrations from on-premises to cloud environments using AWS Database Migration Service (DMS) and Azure Database Migration Service to ensure uninterrupted data flow
  • Wide-Ranging Database Proficiency: Proficient with a diverse range of database technologies, including Microsoft SQL Server, Oracle, PostgreSQL, MariaDB, MongoDB, DynamoDB, Redshift, Snowflake, Teradata, MySQL, Amazon Aurora, and Azure SQL Database, enabling seamless integration and optimization of data solutions across platforms

Excellent communication and collaboration skills, with a talent for translating complex technical concepts into actionable insights and ensuring alignment with industry trends and best practices. Committed to driving innovation and efficiency through strategic data modeling initiatives.

Languages

  • English: Full Professional
  • Hindi: Native/Bilingual
  • Tamil: Native/Bilingual
  • Telugu: Native/Bilingual
  • Malayalam: Limited

Certification

  • AWS Certified Solutions Architect - Associate
  • Python Data Engineer
