
Kiran Bethamcherla

Columbus

Summary

Around 18 years of experience in the finance, royalty, credit card, retail, and investment domains.

  • Expertise in data modeling, data analysis, data integration, star schemas, snowflake schemas, dimensions, and fact tables.
  • Worked extensively with all Informatica components, including IDQ, PowerExchange, PowerCenter, IICS, and MDM implementations.
  • Implemented data ingestion strategies in Snowflake using Python and DBT models and macros.
  • Solid experience implementing large-scale data warehousing programs and end-to-end data integration solutions on Snowflake Cloud, Azure Databricks, and Informatica PowerCenter integrated with multiple relational databases (MySQL, Postgres, Oracle, Sybase, SQL Server, DB2).
  • Experience migrating data from DB2 and Postgres to the Snowflake cloud data warehouse and Azure SQL Database.
  • Experience with Snowflake features: zero-copy cloning, data sharing, bulk loading, external tables, partitioning, and ETL workflows.
  • Knowledge and experience with AWS S3, including managed policies for S3 buckets; used S3 and Glacier for storage and backup on AWS.
  • Experience granting permissions, managing resources, and provisioning users in AWS IAM.
  • Experience in Python programming for data processing and for data integration between on-prem and cloud databases and data warehouses.
  • Designed Apache Kafka streaming to read data from various sources and send it to data pipelines; created topics with producers and consumers to ingest data into applications.
  • Experience with Unix shell, Perl, and Python scripting; proficient with Python libraries such as Pandas and NumPy for data analysis.
  • Experience designing and building data architecture on Azure using services such as Azure Data Lake, Azure Databricks, and Azure SQL Data Warehouse.
  • Used OpenAI to prepare business-need statements based on keywords used in search optimization and ranking.
  • Working on an AI model to create applications and APIs for vendors, reducing workforce and time by more than 50%.
  • Load tested applications with virtual users ranging from 25,000 to almost 1 million.
  • Designed the system architecture, implementation, and integration of microservices for real-time, customer-facing applications.

Overview

21 years of professional experience
5 Certifications

Work History

Product Owner

FitnessAtYourHome.com, ChangeHabits.org, SaaSForCloudKitchen.com
02.2021 - Current
  • Designed the app model, functionality, and use cases, and worked with the team to create a desktop app and mobile apps for both Android and iOS.
  • Oversaw development, UX, and the roadmap for upcoming releases of the application.
  • Spearheaded core work on configuring the systems, integrating payment processing, and fine-tuning the system throughout development and implementation.
  • Worked with product vendors to customize the application and created tickets to troubleshoot issues.
  • Collaborated with the development and AWS implementation teams to ensure functional cohesiveness of the software implementation.
  • Managed and maintained AWS services: EC2, S3, ELB, Route 53, IAM, CLI, and VPC.
  • Designed and led the team in building out AWS infrastructure, including RDS, Elastic Load Balancing, Auto Scaling, SES, SNS, and other required services.
  • Used AWS Lambda to automate tasks, reducing manual intervention by one third of the total effort.
  • Implemented auto scaling and load balancing to adjust the number of active EC2 instances to match demand.
  • Integrated AWS services with CloudWatch for real-time monitoring of EC2, RDS, and load balancer metrics.
  • Monitored AWS services and MySQL databases in AWS RDS.
  • Used SNS for event-driven notifications via email.
  • Deployed the application across two availability zones for high availability.
  • Managed in-app purchases with the Apple App Store and Google Play Store.
  • Used Kafka message brokering to create topics for producers and consumers and to exchange data between microservices.
  • Load tested the application with virtual users ranging from 25,000 to almost 1 million.
  • Worked with customers on defects and change management.
  • Worked with the AWS team to implement dynamic elastic allocation of servers based on load.
  • Environment: Angular, Ionic framework, Cordova, Bootstrap, ASP.NET Core MVC, MS SQL, Bitbucket, AWS, Python, Visual Studio Code.
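The producer/consumer topic pattern used between the microservices above can be sketched with a standard-library stand-in for the broker. The production system used Kafka; the `orders` topic name and the payloads here are hypothetical, and only the pattern is illustrated:

```python
import json
import queue
import threading

# Minimal stand-in for a message broker: one queue per topic.
# Kafka played this role in the actual system.
topics = {"orders": queue.Queue()}  # hypothetical topic name

def produce(topic, message):
    """Serialize a message and publish it to a topic."""
    topics[topic].put(json.dumps(message))

def consume(topic, handler, count):
    """Pull `count` messages from a topic and hand each to a service callback."""
    for _ in range(count):
        handler(json.loads(topics[topic].get()))

received = []
consumer = threading.Thread(target=consume, args=("orders", received.append, 2))
consumer.start()
produce("orders", {"order_id": 1, "status": "paid"})
produce("orders", {"order_id": 2, "status": "shipped"})
consumer.join()
print(received)
```

The decoupling shown here is what lets independently scaled services exchange data: producers never call consumers directly, they only agree on a topic name and message shape.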

ETL Lead/Analyst/Architect/Developer

Alliance Data Services
Columbus
09.2017 - Current
  • Served as architect, lead, designer, and developer on multiple crucial projects.
  • Interacted with business users on requirements, design presentations, and design approvals.
  • Worked extensively with Informatica to meet challenging requirements and create complex mappings within limited timeframes.
  • Used Informatica PowerExchange to load and retrieve CDC data from DB2 databases on mainframes.
  • Created extraction groups for all production source databases, and added and maintained registration groups.
  • Developed automated AIX scripts to register new tables with PowerExchange.
  • Created end-to-end automation using Python and AIX scripts that read table metadata from DB2 on the mainframe, generate Informatica Change Data Capture maps, and check the code into the Informatica repository.
  • Developed an in-house tool to migrate deployment groups across environments and log all migrations in a database.
  • Designed and developed Python automation scripts that generate XML files for Informatica mappings, reducing manual development effort by 90%.
  • Wrote a homegrown Informatica code-standards checker in Linux and Python, comparable to the commercial ETL tool Undraleu.
  • Participated in data vault modeling to identify pieces that could be automated.
  • Worked directly with Informatica and Microsoft to troubleshoot ODBC driver connectivity issues on the AIX platform.
  • Worked on IICS components such as Data Integration and Application Integration.
  • Performed Informatica upgrades from 9.6 to 10.1, from 10.1 to 10.2 Hotfix, and from 10.2 to 10.5.
  • Worked with semi-structured data and loaded it into Snowflake tables using the Python Pandas library.
  • Designed, built, and managed ETL/ELT data pipelines in the Snowflake data warehouse, leveraging Python and DBT models and macros.
  • Handled duplicate records with HASH/MD5 functions in Snowflake to maintain table integrity per business requirements.
  • Participated in daily data validation between DB2, Yellowbrick, and Snowflake to ensure data matches exactly, with no loose ends.
  • Implemented complex logic in Snowflake using pushdown optimization in Informatica.
  • Implemented slowly changing dimensions in Snowflake using Informatica PowerExchange and PowerCenter.
  • Created views, materialized views, and secure views per business needs.
  • Led the team to meet project deadlines, test the project, and implement it in production.
  • Performed integration testing for the migrations from Netezza to Yellowbrick and from Yellowbrick to Snowflake.
  • Led the migration of the on-prem Yellowbrick database to Azure SQL Database, reducing operating costs by 25%.
  • Developed and maintained ETL pipelines using Spark on Azure Databricks and Azure Functions, processing over 10 TB of data daily and over 2 billion transactions monthly with a 98.0% success rate.
  • Streamlined data processing workflows with Azure Functions, reducing batch processing windows by 25% for critical nightly jobs.
  • Environment: Informatica PowerCenter 10.2, Informatica PowerExchange/IDQ, Informatica Intelligent Cloud Services, Netezza, Yellowbrick, Python 3.10.0, AIX 7.2, Snowflake, DBT (data build tool), Red Hat Linux 9.2, Azure Databricks, Azure SQL Database.
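The HASH/MD5 duplicate-record handling mentioned above can be sketched outside Snowflake: hash the concatenated business columns and keep only the first record per hash. A minimal Python sketch of the same idea, with hypothetical column names (`acct`, `amt`) standing in for the real business keys:

```python
import hashlib

def row_key(row, columns):
    """MD5 over the concatenated business columns, mirroring the
    Snowflake MD5() function used to detect duplicate records."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def deduplicate(rows, columns):
    """Keep the first record for each hash key; drop later duplicates."""
    seen, unique = set(), []
    for row in rows:
        key = row_key(row, columns)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [  # hypothetical staging records
    {"acct": 101, "amt": 25.00},
    {"acct": 101, "amt": 25.00},  # exact duplicate, dropped below
    {"acct": 102, "amt": 40.00},
]
print(deduplicate(rows, ["acct", "amt"]))
```

Hashing the columns rather than comparing them one by one keeps the duplicate check to a single short key, which is the same reason the Snowflake tables used MD5 for integrity checks.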

ETL Consultant

Nationwide Insurance
Columbus
03.2012 - 08.2017

Informatica Developer

DSW Shoes Inc.
Columbus
07.2010 - 02.2012

ETL Informatica Developer

Allstate Insurance
Northbrook
01.2010 - 06.2010

ETL / Informatica Developer

Sony BMG Music Entertainment
10.2007 - 12.2009

ETL / Oracle Developer

Washington Mutual
Orange County
06.2005 - 09.2007

Education

Bachelor's -

Computer Science

Skills

  • ETL tools: Informatica PowerCenter, IICS, SSIS, AWS Glue, Azure Data Factory
  • Data cleansing: Informatica Data Quality, Informatica Data Explorer
  • Cloud technologies: AWS S3, Snowflake, Microsoft Azure
  • Programming languages: SQL, PL/SQL, UNIX shell scripting, Python, Perl, Java, C, VB.NET
  • Visualization tools: Smartsheet, Tableau, Power BI
  • Reporting tools: IBM Cognos Analytics
  • Operating systems: Windows, UNIX
  • Databases: Oracle 10g/9i/8i, Teradata, Netezza, Yellowbrick, DB2, Informix, MS SQL Server
  • Project management and development tools: Postman, Rally, JIRA, GitHub, Confluence
  • Utilities and tools: SQL Loader, PL/SQL Developer, Teradata Queryman

Certification

PCAP

Timeline

Product Owner

FitnessAtYourHome.com, ChangeHabits.org, SaaSForCloudKitchen.com
02.2021 - Current

ETL Lead/Analyst/Architect/Developer

Alliance Data Services
09.2017 - Current

ETL Consultant

Nationwide Insurance
03.2012 - 08.2017

Informatica Developer

DSW Shoes Inc.
07.2010 - 02.2012

ETL Informatica Developer

Allstate Insurance
01.2010 - 06.2010

ETL / Informatica Developer

Sony BMG Music Entertainment
10.2007 - 12.2009

ETL / Oracle Developer

Washington Mutual
06.2005 - 09.2007

Bachelor's -

Computer Science