Rachana Choudhary

Manchester, CT

Summary

Over 18 years of IT experience in Data Modernization; designing Data Warehouse, Data Mart, BI Reporting, and OLTP applications; Data Integration and Data Management; and Development, Implementation, and Testing for verticals such as Banking (American Express, Citi Group, Capital One, and SunTrust) and Insurance (Travelers Inc.).

Overview

  • 19 years of professional experience
  • 5 certifications

Work History

Technical (ETL) Architect and Lead

Travelers
Hartford, CT
08.2016 - Current

Data Modernization

  • Worked with multiple cloud architects to establish common architecture and engineering designs that provide guardrails and enable organization-wide technology decisions, helping all teams leverage common capabilities, improve efficiency, and build skills.
  • Designing and developing processes to manage Data Quality, Security, and Integrity throughout the data lifecycle.
  • Advanced Analytics on Cloud – using MicroStrategy and Qlik Sense to generate business insights, predictions, and recommendations.
  • Migrating on-prem data to the AWS cloud (data moving to S3, Aurora (Postgres), Data Lakes, and Snowflake tables).
  • RDU (Reference Data Units) – receiving data from the business as files, loading it from S3 buckets into Aurora DB, and pushing CDC data into the Data Lake for other teams, which can subscribe to events to consume the data. The team also provides APIs to consume data directly from Aurora Postgres.
  • Developed a common process to capture Audit and Logging details in DynamoDB, a semi-structured (key-value) database, using Lambda functions (see the sketch after this list).
  • Glue Python scripts – run periodically via the EventBridge scheduler to move data files extracted from Postgres (in Parquet format) to the Data Lake in a cross-account setup.
  • KMS and IAM services are used to secure database credentials and enforce least-privilege access.
  • Amazon Athena is used to perform analysis.
  • Higher Environment Migration – AWS services provisioned using Terraform scripts (automated CI/CD), Glue/Lambda deployments via UCD, and database changes managed with Liquibase.
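
A minimal sketch of the kind of audit/logging Lambda described above, assuming a hypothetical DynamoDB table named rdu_audit_log and an S3 object event as the trigger; the table, key, and field names are illustrative, not the actual implementation.

```python
import json
import uuid
from datetime import datetime, timezone

import boto3

AUDIT_TABLE = "rdu_audit_log"          # hypothetical audit table name
dynamodb = boto3.resource("dynamodb")


def lambda_handler(event, context):
    """Write one key-value audit record per S3 object event into DynamoDB."""
    table = dynamodb.Table(AUDIT_TABLE)
    records = event.get("Records", [])
    for record in records:
        table.put_item(Item={
            "audit_id": str(uuid.uuid4()),                       # partition key
            "logged_at": datetime.now(timezone.utc).isoformat(),
            "bucket": record["s3"]["bucket"]["name"],
            "object_key": record["s3"]["object"]["key"],
            "event_name": record.get("eventName", "UNKNOWN"),
            "detail": json.dumps(record)[:4000],                 # raw event payload, truncated
        })
    return {"status": "logged", "count": len(records)}
```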

Business Insurance and Personal Insurance Billing Data Warehouse and Reporting

  • Utilized extensive data warehouse experience to design and architect critical Insurance Billing data pipelines and the end-to-end solution for the new Enterprise Data Warehouse.
  • Used STAR schema and SNOWFLAKE schema designs; efficiently handled the granularity, indexing, and partitioning of the underlying data storage platform.
  • Integrated various Operational Data Stores.
  • Designed and developed the Billing Data Mart as well as business reports for better decision making, using QlikView for BI Billing and MicroStrategy for PI Billing data. Designed and developed reporting that helps agents understand how to retain existing customers, the reasons for policy cancellations, and options to improve the business, plus billing reports on better-performing payment channels and the payment methods customers use.
  • Leading all ETL enhancements and supporting the current system, which contains all billing/financial data for accounts and policies.
  • Worked directly with the business and different stakeholders to discuss application and data requirements and to identify data issues; documented and reviewed use cases with the business and partners to get their sign-off.
  • Determined the optimal approach for obtaining data from diverse source system platforms and moving it to the data analytics environment.
  • Designed and implemented the framework to manage data from different sources, including EBCDIC/VSAM files, Oracle, Teradata, mainframe DB2, and SQL Server; created in-house accelerators for process automation and optimization.
  • Worked on Data Marts, the SCD (Slowly Changing Dimension) process, different types of dimension tables (including conformed dimensions) and fact tables, and surrogate key generation (see the sketch after this list).
  • Designing and developing the ETL framework (tool: Ab-Initio) and working with DAR/BA/Dev team members globally.
  • Ensuring effective ETL functional decomposition into deliverable objects and ETL quality; making sure industry best practices are followed.
  • Involved in data modeling; worked directly with data modelers to create the data model, used dimensional modeling for Data Marts, and defined the surrogate key, CDC, and archive/purge processes.
  • Streamlining and automating the Data Governance and Management process.
  • Involved in iteration planning, story prioritization, backlog grooming, retrospectives, burn-up charts, and velocity estimation.
  • Implementation and execution of Agile practices to deliver technology solutions; expertise in writing features and functional and technical stories in the Rally tool.
  • Setting up the roadmap for the team to improve current products, enhance the customer experience, and improve data quality.
  • Project estimation and resource planning.
  • Excellent communication and interpersonal skills.
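
To illustrate the SCD and surrogate-key handling referenced in the list above, here is a minimal Type 2 slowly-changing-dimension sketch in plain Python. The dimension name, tracked attribute, and in-memory storage are hypothetical; the actual work was implemented with Ab-Initio graphs against warehouse tables.

```python
from datetime import date
from itertools import count

# Hypothetical in-memory stand-in for a billing account dimension table.
dim_account = []               # rows: sk, account_id, pay_plan, eff_date, end_date, is_current
surrogate_keys = count(1)      # surrogate-key generator


def apply_scd2(incoming, as_of=None):
    """Expire the current version of a changed row and insert a new version with a new surrogate key."""
    as_of = as_of or date.today()
    current = next((r for r in dim_account
                    if r["account_id"] == incoming["account_id"] and r["is_current"]), None)
    if current and current["pay_plan"] == incoming["pay_plan"]:
        return current                         # no attribute change: keep the current version
    if current:                                # tracked attribute changed: close out the old version
        current["is_current"] = False
        current["end_date"] = as_of
    new_row = {
        "sk": next(surrogate_keys),            # surrogate key, independent of the natural key
        "account_id": incoming["account_id"],  # natural/business key
        "pay_plan": incoming["pay_plan"],
        "eff_date": as_of,
        "end_date": None,
        "is_current": True,
    }
    dim_account.append(new_row)
    return new_row


apply_scd2({"account_id": "A-100", "pay_plan": "MONTHLY"})
apply_scd2({"account_id": "A-100", "pay_plan": "QUARTERLY"})   # version 2; version 1 is expired
```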

Environment

AWS Services (S3, Lambda, Glue, EventBridge, KMS, IAM, DynamoDB, Aurora DB), GitHub, Ab-Initio GDE 4.1.1.0, Co>Operating Sys 4.0.1.4, Postgres DB, Oracle, Teradata, DB2, SQL Server, PL/SQL, UNIX, Autosys

ETL Architect and Technical Lead

Citi Group
Jersey City, NJ
10.2012 - 08.2016

Citi Group is one of the world's leading financial services companies, with 200 million accounts in 160 countries. It works to provide consumers, corporations, and institutions with a broad range of financial products and services.

PFG - Promontory Financial Group (AML)

All banks and many other money-handling institutions are required to perform Anti-Money Laundering (AML) checks, and penalties for failing to do so can be high at both the corporate and personal level. Citi has therefore been mandated by the OCC under a Consent Order to evaluate the effectiveness of its AML surveillance and transaction monitoring system (MANTAS).

The PFG implementation services consist of details of Account Posting transactions and SWIFT MT 103/202 messages.

  • Worked with business stakeholders and other programmers and analysts to develop a solution.
  • Worked on different use cases, reviewing them with the business and partners to get sign-off.
  • Designed and implemented a common Metadata Management Framework for more than 50 countries, in which multiple services can be reused across countries.
  • Handled around 40 source systems (50 countries) around the globe with multiple character sets, such as Cyrillic, ASCII, and UTF-8.
  • Developed procedures for data management; created procedures for developing and maintaining metadata and data mapping documents.
  • Provided XML camt.052 handoff files to all partner systems through common batch processing, with a specific format for each partner (see the sketch after this list).
  • Worked with the EMEA team (Singapore) and the India team, and led the technical team that developed the hand-off files.
  • Worked on the Partner Interface Control document, which includes the SLA information for all partners.
  • Processed SWIFT messages and used Binary Large Objects (BLOBs).
  • Completed a proof of concept (POC) for sending XML messages using MQs (continuous flow).
  • Developed the ETL data flow process using Microsoft Visio and presented it to business stakeholders for high-level design sign-off.
  • Worked on source-to-target mapping; participated in various data cleansing and data quality exercises.
  • Developed various Ab-Initio graphs, including extracting vector/hierarchical files and loading them into XML format.
  • Performed code reviews and performance tuning at the Ab-Initio and database levels.
  • Created/modified the UNIX shell scripts for the application.
  • Performed code migration and job scheduling for multiple countries with different time zones, multiple servers, and the corresponding migration teams.
  • Coordinated with senior management on day-to-day activities.
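
A rough illustration of producing a partner handoff file in XML with Python's standard library. The element names below are simplified placeholders, not the actual camt.052 schema or the Ab-Initio graphs used on the project; partner-specific formats were defined in the Partner Interface Control document.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone


def build_handoff(postings, partner_id):
    """Build a simplified, hypothetical handoff document for one partner."""
    root = ET.Element("PartnerHandoff", attrib={"partner": partner_id})
    header = ET.SubElement(root, "Header")
    ET.SubElement(header, "CreatedAt").text = datetime.now(timezone.utc).isoformat()
    ET.SubElement(header, "RecordCount").text = str(len(postings))
    body = ET.SubElement(root, "Postings")
    for p in postings:
        posting = ET.SubElement(body, "Posting")
        ET.SubElement(posting, "Account").text = p["account"]
        ET.SubElement(posting, "Amount").text = f'{p["amount"]:.2f}'
        ET.SubElement(posting, "Currency").text = p["currency"]
    return ET.tostring(root, encoding="unicode")


print(build_handoff([{"account": "001-XYZ", "amount": 125.50, "currency": "USD"}], "PARTNER-01"))
```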

Flexcube Data Conversion and Metadata Hub

Multiple versions of Flexcube were in use, and the data was scattered in various non-standardized formats. The purpose of this initiative was to standardize and profile the data based on business requirements; a generic sketch of this kind of profiling follows below.

Metadata Hub was also used to provide end-to-end lineage.
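
The profiling itself was done with Ab-Initio Data Profiler; purely as an illustration of the kind of checks involved (null counts, distinct values, duplicate keys), a minimal pandas sketch with hypothetical column names might look like this.

```python
import pandas as pd

# Hypothetical Flexcube extract; the project itself used Ab-Initio Data Profiler.
df = pd.DataFrame({
    "account_no": ["001", "002", "002", None],
    "branch":     ["NYC", "NYC", "SGP", "SGP"],
    "balance":    [1200.0, -50.0, 0.0, 310.5],
})

# Column-level profile: nulls and distinct values per column.
profile = pd.DataFrame({
    "null_count": df.isna().sum(),
    "distinct_count": df.nunique(dropna=True),
})
print(profile)

# Sample values and a duplicate-key check, the kind of findings fed back to the business.
for col in df.columns:
    print(col, "samples:", df[col].dropna().unique()[:3].tolist())
print("duplicate account_no rows:", int(df["account_no"].dropna().duplicated().sum()))
```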

Environment

Ab-Initio GDE 3.1.7.4, Co>Operating Sys 3.1.1.10, Metadata Hub, Data Profiler, Ab-Initio ACE, Oracle 10g, SQL Server, PL/SQL, UNIX, Autosys

Technical Lead

Capital One Client
Richmond, VA
12.2011 - 10.2012

Senior Software Engineer

American Express
New York, NY
06.2009 - 12.2011

Senior Software Engineer

SunTrust Bank
Atlanta, GA
09.2008 - 06.2009

Senior Software Engineer

American Express
Phoenix, AZ
01.2008 - 09.2008

Software Engineer- ETL Designer & Developer

American Express
Mumbai, MH
07.2006 - 01.2008

Software Engineer - ETL Developer

Metavante Corp
Mumbai, MH
06.2005 - 07.2006

Education

Bachelor's degree in Information Technology

Sobhasaria Engineering College - Sikar, Rajasthan University (Jaipur, India)
01.2004

Skills

  • Cloud Services - AWS S3, Lambda, Glue, EventBridge, KMS, IAM, DynamoDB, Aurora DB
  • Operating Systems - Windows, Linux, Unix
  • Databases - Postgres, Oracle, Teradata, DB2, SQL Server, MS Access
  • ETL Tools - Ab-Initio, Informatica, DataStage
  • Data Visualization Tools - Tableau, QlikView, MicroStrategy
  • Schedulers - Autosys, Control-M
  • Miscellaneous - Python, GitHub, MS Office, InfoPath, Quality Center, HPSM, Web-EME, Informatica, DataStage, SQL/PL-SQL, Unix Scripting, HTML, XML
  • Training - Talend and Talend Big Data Integration (attended Talend Connect 2018 in New York); knowledge of Azure cloud concepts and trained as an Azure Data Engineer

Certification

  • CLF-C01: AWS Certified Cloud Practitioner
  • AWS Certified Solutions Architect – Associate Certification with Academy
  • Certified Celonis Data Engineer 2020/2021
  • Celonis Process Connection Advanced Course

Extracurricular Activities

  • Head of the Organizing Committee at Travelers, Hartford, for festival celebrations such as Diwali and Christmas.

  Volunteered at:

  • KNOX Greater Hartford Green, CT.
  • Travelers T-Factor, Hartford, to raise money for charity.
  • Travelers Championship, to raise money for charity.
  • New York Cares, while working in New York and New Jersey.
  • Jersey City Library Literacy Program for Immigrants & Refugees.
  • American Express for the CRY (Child Rights and You) NGO, to raise funds in Arizona.

Languages

  • English – Professional
  • Hindi – Professional

References

References available upon request.
