Lakshman Teja Lagadapati

Azure Middleware Developer
Frisco

Summary

  • Over 7 years of IT experience with special emphasis on the design, development, and implementation of data warehouses, including around 6 years of ETL development and 3 years of Snowflake/cloud migration
  • Solid understanding of data modeling and evaluating data sources, with a strong grasp of data warehouse/data mart design, ETL, BI, OLAP, and client/server applications
  • Strong experience in data extraction, transformation, and loading (ETL) from/to multiple sources such as Excel, MS Access, Oracle, Teradata, and DB2 using ETL tools
  • Experienced in migrating mappings and transformations from one environment to another
  • Designed data marts using dimensional data modeling with star and snowflake schemas
  • Used stored procedures, SnowSQL, and data movement servers to migrate on-premises data to Snowflake; experienced in building Snowpipe and creating multi-cluster warehouses sized to stakeholder requirements (a brief sketch follows this summary)
  • Worked closely with application developers to create and improve data flows between internal/external systems and the data warehouse
  • Expertise in integrating various data sources such as DB2 UDB, MF DB2, MF flat files, Oracle 10g/9i/8i/7.x, SQL Server, MS Access, Teradata, and Sybase
  • Expertise in using Azure services, such as containers, for ETL processing into Snowflake
  • Experienced in database design and modeling and the Software Development Life Cycle (SDLC)
  • Coordinated the offshore team and handled projects as onshore lead for issue resolution, technical reviews, and all offshore deliveries
  • Created ETL processes to ensure data integrity (error handling) and documented the appropriate procedures
  • Prepared discussion and process diagrams to explain and support business and IT management decision-making
  • Strong analytical, organizational, presentation, and problem-solving skills, as well as excellent interpersonal and communication skills
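A minimal sketch of the multi-cluster warehouse and SnowSQL-style load work described above, using the Snowflake Python connector; the account, credentials, warehouse, and table names are hypothetical placeholders, not details from any engagement.

import snowflake.connector

# Connection details below are placeholders only.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    role="SYSADMIN",
)
cur = conn.cursor()

# Multi-cluster warehouse sized to the expected workload (illustrative settings).
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS ETL_WH
      WAREHOUSE_SIZE = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3
      AUTO_SUSPEND = 300
      AUTO_RESUME = TRUE
""")

# Bulk load from an internal stage into a staging table (hypothetical names).
cur.execute("USE WAREHOUSE ETL_WH")
cur.execute("""
    COPY INTO EDW.STG.CUSTOMER
    FROM @EDW.STG.CUSTOMER_STAGE
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
conn.close()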

Overview

10
years of professional experience

Work History

Azure Cloud Engineer

United Services Automobile Association (USAA)
01.2025 - Current
  • Designed and implemented scalable middleware solutions using Azure services to facilitate seamless integration between diverse systems.
  • Developed and deployed APIs, microservices, and middleware components using Azure Logic Apps, Azure Functions, and API Management (a representative handler is sketched after this list)
  • Integrated on-premises and cloud-based systems using Azure Integration Services, including Service Bus, Event Grid, and Data Factory.
  • Built real-time and batch processing pipelines for secure and efficient data exchange between systems.
  • Optimized middleware solutions to ensure high availability, low latency, and scalability to support growing business requirements.
  • Conducted performance tuning and debugging of middleware components to resolve bottlenecks.
  • Implemented secure communication protocols and authentication mechanisms using OAuth, OpenID Connect, and Azure Active Directory.
  • Ensured middleware solutions adhered to organizational and industry compliance standards, such as GDPR or HIPAA.
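A representative middleware handler for the Service Bus integration described in this role, written against the Azure Functions Python v2 programming model; the queue name, connection app setting, and payload fields are hypothetical, and the real components were more involved.

import json
import logging

import azure.functions as func

app = func.FunctionApp()

# Service Bus queue trigger: the function scales with queue backlog and keeps
# producers decoupled from downstream services.
@app.service_bus_queue_trigger(
    arg_name="msg",
    queue_name="claims-inbound",          # hypothetical queue name
    connection="SERVICE_BUS_CONNECTION",  # app setting holding the connection string
)
def route_claim(msg: func.ServiceBusMessage) -> None:
    # Decode the message body and hand it off to downstream processing
    # (e.g., an API Management-fronted service).
    payload = json.loads(msg.get_body().decode("utf-8"))
    logging.info("Received claim %s for routing", payload.get("claimId"))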

Azure Cloud Engineer

AT&T
07.2021 - 12.2023
  • Company Overview: AT&T Inc. is an American multinational telecommunications holding company headquartered in Dallas, Texas
  • It is the world's largest telecommunications company by revenue and the third-largest provider of mobile telephone services in the U.S.
  • AT&T was ranked 13th on the Fortune 500 list of the largest United States corporations, with revenues of $168.8 billion
  • The 'CDO ECDW Migration' project aims to transition the existing data warehouse infrastructure from Teradata to Snowflake
  • Script Conversion: Translate existing BTEQ and TPT scripts into Snowflake-compatible SQL and scripting languages
  • Ensure functional equivalence between the original Teradata scripts and the new Snowflake scripts
  • Optimize the translated scripts for performance improvements in the Snowflake environment
  • ETL Development: Develop and implement ETL processes for data extraction, transformation, and loading into Snowflake
  • Utilize Snowflake-specific features and functionalities to enhance ETL processes
  • Ensure ETL processes are scalable, efficient, and maintainable
  • Data Migration: Plan and execute the migration of data from Teradata to Snowflake
  • Implement strategies to ensure data integrity, consistency, and accuracy during the migration
  • Minimize downtime and disruption to business operations during the data migration
  • Collaboration: Work closely with data architects, data engineers, and DBAs to design and implement the Snowflake architecture
  • Collaborate with QA teams to test and validate the migrated scripts and data
  • Engage with business analysts to understand business requirements and ensure the new system meets user needs
  • Documentation: Document the new ETL processes, data models, and scripts in Snowflake
  • Maintain comprehensive documentation for the migration process and any changes made
  • Provide clear and detailed instructions for operational and support teams
  • Environment: Snowflake, SnowSQL, Azure, Azure Blob, Stored Procedures, SQL, DataStage 11.5, Linux, UNIX, DB2, Teradata, Vertica, SQL Server, JIRA, Bitbucket, TWS
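An illustrative sketch of the BTEQ-to-Snowflake script conversion referenced in this entry; the rewrite rules and sample script are hypothetical, and the real conversion covered far more cases plus manual review and testing.

import re

# Rule-based rewrites: BTEQ dot-commands move to the orchestration layer, and
# Teradata keyword shorthands are expanded to Snowflake-compatible SQL.
TERADATA_TO_SNOWFLAKE = [
    (re.compile(r"^\s*\.(LOGON|LOGOFF|QUIT|IF|SET)\b.*$", re.IGNORECASE | re.MULTILINE), ""),
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    (re.compile(r"\bDEL\b", re.IGNORECASE), "DELETE"),
]

def convert_bteq_to_snowflake(script: str) -> str:
    """Apply simple mechanical rewrites; manual review is still required."""
    for pattern, replacement in TERADATA_TO_SNOWFLAKE:
        script = pattern.sub(replacement, script)
    return script.strip()

if __name__ == "__main__":
    bteq = """.LOGON tdprod/etl_user,secret;
SEL order_id, order_dt FROM edw.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY order_dt DESC) = 1;
.QUIT;"""
    print(convert_bteq_to_snowflake(bteq))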

ETL Developer

United Services Automobile Association (USAA)
02.2019 - 06.2021
  • Company Overview: The United Services Automobile Association (USAA) is a San Antonio-based Fortune 500 diversified financial services group of companies including a Texas Department of Insurance-regulated reciprocal inter-insurance exchange and subsidiaries offering banking, investing, and insurance to people and families who serve, or served, in the United States Armed Forces
  • The project focused on providing development and support for the Customer Data Warehouse data used for analytical reporting by Marketing, Billing, and various other user communities, delivering return-to-service work and process enhancements for consistent and timely data delivery
  • Hands-on experience with Snowflake utilities, Snowflake SQL, Snowpipe, etc.
  • Created Snowpipe for continuous data loading and used COPY INTO for bulk loads (see the staging and Snowpipe sketch at the end of this entry)
  • Worked with advanced Snowflake concepts such as Resource Monitors, Role-Based Access Control, Data Sharing, virtual warehouse sizing, query performance tuning, Snowpipe, Tasks, Streams, and zero-copy cloning
  • Used Snowflake as SaaS and migrated Teradata data to Snowflake using stored procedures, SnowSQL, and data movement servers
  • Created internal and external stages and transformed data during load
  • Used Azure services for external staging and ETL processing
  • Used Azure containers for external file hosting
  • Used IBM Information Server DataStage/QualityStage to create ETL designs and develop jobs that perform data cleansing, data loads, and data transformations
  • Developed Python utilities for modeling data before loading it to the staging area
  • Implemented batch processing using the Control-M tool
  • Extracted data from various databases including DB2, SQL Server, Oracle, and Teradata
  • Designed shared containers for code reusability and to implement predefined business logic
  • Provided ongoing maintenance and support of ETL flows and their target applications
  • Designed complex job control processes to manage a large job network
  • Extensively used QualityStage to convert data (trimming, parsing, collapsing, etc.) from legacy sources into consolidated, high-quality information within the enterprise warehouse and data mart
  • Responsible for unit, system, and integration testing
  • Developed test scripts, test plans, and test data
  • Participated in UAT (User Acceptance Testing)
  • Responsible for maintaining and managing metadata for data warehousing environments
  • Part of the ETL production support team and helped resolve day-to-day production issues
  • Environment: Snowflake, SnowSQL, AWS, SQL, DataStage 11.5, Linux, UNIX, DB2, Netezza, SQL Server, JIRA, Bitbucket, Jenkins
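A brief sketch of the Azure external staging and Snowpipe setup referenced in this entry, issued through the Snowflake Python connector; the stage URL, SAS token, notification integration, and table names are placeholders, not real values.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    role="SYSADMIN", warehouse="ETL_WH", database="EDW", schema="STG",
)
cur = conn.cursor()

# External stage backed by an Azure Blob container (placeholder URL and token).
cur.execute("""
    CREATE STAGE IF NOT EXISTS AZURE_LANDING
      URL = 'azure://myaccount.blob.core.windows.net/landing'
      CREDENTIALS = (AZURE_SAS_TOKEN = '***')
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Snowpipe that auto-ingests new files; INTEGRATION names a notification
# integration wired to Azure Event Grid (placeholder name).
cur.execute("""
    CREATE PIPE IF NOT EXISTS CUSTOMER_PIPE
      AUTO_INGEST = TRUE
      INTEGRATION = 'AZURE_EVENTGRID_INT'
      AS COPY INTO EDW.STG.CUSTOMER FROM @AZURE_LANDING/customer/
""")
conn.close()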

ETL Developer

KPI Partners
05.2015 - 10.2018
  • The EMV Debit Card Pilot enables Deposits applications to identify EMV cards with chip indicator ‘8’ (Apple Pay payments) and write their transactions, flagged as ‘CHIP CARD’, to the Log and History database
  • Used the DataStage Designer to develop processes for Extracting, Cleansing, Transforming, Integrating, and Loading data into Data warehouse
  • Performed requirements analysis and gathering to provide technical and architectural support to the team
  • Documented proofs of concept and delivered them to clients
  • Wrote the technical specification for the project
  • Documented the purpose of each mapping so that personnel could understand the process and incorporate changes when necessary
  • Primarily involved in Job Design, Technical Reviews and Troubleshooting of jobs
  • Extensively involved in different Team review meetings and conferences with remote team
  • Worked on the logical and physical design of the Data warehouse
  • Identified sources/targets and analyzed source data for dimensional modeling
  • Implemented SCD Type 1 and SCD Type 2 using different stages for building the EDW (an illustrative Type 2 sketch follows this entry)
  • Developed various Server and Parallel jobs using Oracle, ODBC, FTP, Peek, Aggregator, Filter, Funnel, Copy, Hash File, Change Capture, Merge, Lookup, Join, and Sort stages
  • Performed unit testing and performance tuning for updates to data warehouse tables, using DataStage Director for job monitoring and troubleshooting
  • Developed the data warehouse repository using DataStage Manager by importing the source and target database schemas
  • Created shared containers to use in multiple jobs
  • Extensively worked with the DataStage Job Sequencer to schedule jobs and run them in sequence
  • Prepared various project-specific documents such as the Technical Design Document, UTD, Code Review Document, Data Validation Checklist, and Project Description Document
  • Developed PL/SQL Procedures, Functions, Packages, Triggers, Normal and Materialized Views
  • Worked with the reporting team on extensive reporting from the data mart, including slice & dice, drill-down, and drill-through analysis
  • Handled defect tracking, unit testing, defect reporting, results analysis, and documentation
  • Environment: DataStage 9.1, Oracle, IBM AIX, Toad for Oracle 9.7, DB2, Flat files, JIRA, GIT, UCD
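An illustrative sketch of the SCD Type 2 rule applied in this role; the production implementation used DataStage Change Capture stages, so this plain-Python version with hypothetical fields only shows the expire-and-insert logic.

from datetime import date
from typing import Optional

def apply_scd2(dim_rows: list, incoming: dict, business_key: str,
               today: Optional[date] = None) -> list:
    """Expire the current dimension row when tracked attributes change, then insert a new version."""
    today = today or date.today()
    current = next(
        (r for r in dim_rows if r[business_key] == incoming[business_key] and r["end_date"] is None),
        None,
    )
    tracked = [k for k in incoming if k != business_key]
    if current and all(current[k] == incoming[k] for k in tracked):
        return dim_rows                      # no attribute change: keep current version
    if current:
        current["end_date"] = today          # expire the old version
    dim_rows.append({**incoming, "start_date": today, "end_date": None})
    return dim_rows

# Example: a changed city expires the old row and inserts a new current version.
dim = [{"cust_id": 1, "city": "Dallas", "start_date": date(2017, 1, 1), "end_date": None}]
apply_scd2(dim, {"cust_id": 1, "city": "Frisco"}, business_key="cust_id")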

Skills

Snowflake

AWS Services

Azure services

Azure Blob

IBM InfoSphere DataStage 11.5

IBM InfoSphere DataStage 9.1

Shell Script

Python

Agile

Waterfall

Teradata

Vertica

Oracle

Areas of Expertise

Snowflake, AWS Services, Azure services, Azure Blob, IBM InfoSphere DataStage 11.5, 9.1, Shell Script, Python, Agile, Waterfall, Teradata, Vertica, Oracle, SQL Server, DB2, Tableau, AutoSys, Control M, TWS, UNIX, Linux, Windows 11, 10, 9x, 2K, XP

Timeline

Azure Cloud Engineer

United Services Automobile Association (USAA)
01.2025 - Current

Azure Cloud Engineer

AT&T
07.2021 - 12.2023

ETL Developer

United Services Automobile Association (USAA)
02.2019 - 06.2021

ETL Developer

KPI Partners
05.2015 - 10.2018