TECHNICAL STACK: Snowflake, Oracle 19c, ODI 12c, Informatica, ADF, BIACM, Airflow (Python), PySpark, Azure Function Apps, Power BI, Tableau, GitHub, Azure Databricks
· Participated in business discussions to understand functional requirements and collaborated with technical teams to design modern pipelines, capturing requirement nuances and building them to spec.
· Designed Snowflake database objects based on Oracle PVO extracts from BIACM, staged through Azure Blob Storage containers and Oracle Cloud Infrastructure (OCI) buckets.
· Developed the full migration of the existing Oracle file-ingestion solution into Snowflake; as part of the data ingestion process, created Snowflake external and internal stages, file formats, warehouses, roles, and other required objects (sketch after this list).
· Designed and created the appropriate containers per environment along with the corresponding Snowflake integration objects.
· Developed Python wrappers around Oracle BIACM APIs to trigger job schedules and build extracts (sketch after this list).
· Used dbt in pipelines to move cleansed, augmented data from staging areas into the silver layer via dbt models, leveraging dbt macros for reusable logic.
· Used Snowflake Streams and Tasks to build a CDC mechanism so that downstream tables/views can pull and merge data and tag transactions; Power BI dataflows can then refresh based on the CDC-tagged data (sketch after this list).
· Built the various integration objects needed to support connectivity between Snowflake and the tenant environments.
· Created Snowflake stored procedures to process the JSON files that hold metadata about the files pulled from Oracle BIACM (sketch after this list).
· Developed Python code for workflow management covering task definitions, dependencies, an SLA watcher, and a time sensor for each job (sketch after this list).
· Built data pipelines with control mechanisms for failure handling, spanning from the BIACM PVO extracts through to the replicated Snowflake tables.
· Built stored procedures implementing business logic to process data within the pipelines.
· Leveraged Snowpipe to build a continuous stream of data supporting the IC module from external systems, using Azure ADLS Gen2 and a queue notification mechanism (sketch after this list).
· Used Rclone to copy files from OCI buckets to Blob containers based on job-type subject areas related to Oracle Financials (sketch after this list).
· Created Azure Function Apps and registered them as external functions in Snowflake.
· Built out the required capabilities, such as warehouse setup and performant queries in Snowflake, so that data loads complete within the stipulated window for day-to-day business activities.
· Aligned with standard practices by keeping upload file sizes within Snowflake's recommended range and maintaining resource monitors (sketch after this list).
· Designed and developed the FBDI load process for all master data into Oracle Fusion using data pipelines built in ODI and ADF.
· Developed ADF ELT flows covering the migration of all financial objects, including AR, AP, and GL, from Oracle EBS R12 to Oracle Cloud via the FBDI process, staging them into ADLS containers.
· Wrote PySpark scripts to pull data from homegrown MDM applications through their APIs (sketch after this list).
· Built an error-handling process that identifies mismatches between loaded and failed records in Oracle Cloud tables using the BIP process.
· Built Python scripts that call Oracle APIs and construct payloads to process bank and bank-account information (sketch after this list).
· Developed a reconciliation process that produces an overall summary with process counts and delivers it via email to the respective teams.
· Developed stored procedures to pull data from transaction tables and stage it so that ETL routines can pick it up, build FBDI files, and use them to create transactional data in Oracle Cloud.
· Analyzed Oracle table structures and built the required extract queries in consultation with functional teams, obtaining sign-off.
· Managed the team and maintained overall communication with stakeholders whenever roadblocks arose in development activity.
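Sketch for the ingestion-setup bullet above: creating the storage integration, file format, and external stage that front the BIACM extract container. This is a minimal illustration; the account, container URL, and object names (PVO_INT, PVO_CSV_FF, PVO_STAGE) are placeholders, not the actual environment values.

    # Illustrative Snowflake ingestion objects for BIACM PVO extracts landed in Azure Blob.
    # All names and URLs below are placeholders.
    import snowflake.connector

    cur = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        role="SYSADMIN", warehouse="LOAD_WH", database="RAW", schema="BIACM",
    ).cursor()

    # Storage integration pointing at the environment-specific Blob container.
    cur.execute("""
        CREATE STORAGE INTEGRATION IF NOT EXISTS PVO_INT
          TYPE = EXTERNAL_STAGE
          STORAGE_PROVIDER = 'AZURE'
          ENABLED = TRUE
          AZURE_TENANT_ID = '<tenant-id>'
          STORAGE_ALLOWED_LOCATIONS = ('azure://<storageacct>.blob.core.windows.net/biacm-dev/')
    """)

    # Compressed-CSV file format matching the extract files.
    cur.execute("""
        CREATE FILE FORMAT IF NOT EXISTS PVO_CSV_FF
          TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1 COMPRESSION = GZIP
    """)

    # External stage over the container, using the integration and file format.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS PVO_STAGE
          URL = 'azure://<storageacct>.blob.core.windows.net/biacm-dev/'
          STORAGE_INTEGRATION = PVO_INT
          FILE_FORMAT = PVO_CSV_FF
    """)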
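Sketch for the BIACM API bullet: submitting an extract job schedule from Python. The host, path, and payload fields are hypothetical; actual BIACM endpoints and schedule identifiers vary by environment.

    # Trigger a BIACM extract job schedule over REST (hypothetical endpoint and payload).
    import requests

    BIACM_URL = "https://<biacm-host>/api/jobSchedules/<schedule-id>/run"  # placeholder

    resp = requests.post(
        BIACM_URL,
        auth=("<service-user>", "<password>"),
        json={"extractName": "AR_TRANSACTIONS_PVO"},  # placeholder payload
        timeout=60,
    )
    resp.raise_for_status()
    print("Extract submitted:", resp.json())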
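Sketch for the Streams and Tasks CDC bullet: a stream on the staging table plus a scheduled task that merges the captured changes downstream and tags each row with the stream action. Table, column, and warehouse names are illustrative.

    # CDC pattern: stream captures changes, task merges them downstream on a schedule.
    import snowflake.connector

    cur = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>", role="SYSADMIN",
    ).cursor()

    cur.execute("CREATE STREAM IF NOT EXISTS RAW.BIACM.AR_STG_STREAM ON TABLE RAW.BIACM.AR_STG")

    cur.execute("""
        CREATE TASK IF NOT EXISTS RAW.BIACM.AR_MERGE_TASK
          WAREHOUSE = LOAD_WH
          SCHEDULE  = '15 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('RAW.BIACM.AR_STG_STREAM')
        AS
          MERGE INTO SILVER.FIN.AR_TXN t
          USING RAW.BIACM.AR_STG_STREAM s ON t.TXN_ID = s.TXN_ID
          WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT, t.CDC_TAG = s.METADATA$ACTION
          WHEN NOT MATCHED THEN INSERT (TXN_ID, AMOUNT, CDC_TAG)
            VALUES (s.TXN_ID, s.AMOUNT, s.METADATA$ACTION)
    """)
    cur.execute("ALTER TASK RAW.BIACM.AR_MERGE_TASK RESUME")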
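Sketch for the JSON-metadata stored procedure bullet: a Snowflake Scripting procedure that flattens staged JSON manifests into a metadata table. The table names and JSON attributes are assumptions about the manifest shape, not the actual schema.

    # Stored procedure that flattens JSON file manifests into a metadata table (illustrative schema).
    import snowflake.connector

    cur = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>", role="SYSADMIN",
    ).cursor()

    cur.execute("""
        CREATE OR REPLACE PROCEDURE RAW.BIACM.LOAD_FILE_METADATA()
        RETURNS STRING
        LANGUAGE SQL
        AS
        $$
        BEGIN
          INSERT INTO RAW.BIACM.FILE_METADATA (FILE_NAME, PVO_NAME, ROW_COUNT, EXTRACT_TS)
          SELECT f.value:fileName::STRING,
                 f.value:pvoName::STRING,
                 f.value:rowCount::NUMBER,
                 f.value:extractTime::TIMESTAMP_NTZ
          FROM RAW.BIACM.JSON_MANIFEST m,
               LATERAL FLATTEN(input => m.PAYLOAD:files) f;
          RETURN 'metadata loaded';
        END;
        $$
    """)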
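Sketch for the workflow-management bullet: an Airflow DAG with task dependencies, per-task SLAs, and a time sensor gating the run. The DAG id, schedule, and callables are placeholders.

    # Airflow DAG with a time sensor, per-task SLAs, and explicit dependencies (placeholder logic).
    from datetime import datetime, time, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.sensors.time_sensor import TimeSensor

    def pull_pvo_extracts():
        """Placeholder: call the BIACM API and download the extract files."""

    def load_to_snowflake():
        """Placeholder: COPY the staged files into Snowflake."""

    with DAG(
        dag_id="biacm_to_snowflake",
        start_date=datetime(2023, 1, 1),
        schedule_interval="0 2 * * *",
        catchup=False,
    ) as dag:
        # Hold the run until the upstream BIACM extract window has passed.
        wait_for_window = TimeSensor(task_id="wait_for_window", target_time=time(2, 30))

        extract = PythonOperator(
            task_id="pull_pvo_extracts",
            python_callable=pull_pvo_extracts,
            sla=timedelta(hours=1),  # SLA watcher flags the task if it runs long
        )
        load = PythonOperator(
            task_id="load_to_snowflake",
            python_callable=load_to_snowflake,
            sla=timedelta(hours=2),
        )

        wait_for_window >> extract >> load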
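Sketch for the Snowpipe bullet: a notification integration over an Azure storage queue plus an auto-ingest pipe that copies newly arrived ADLS Gen2 files into a landing table. The queue URI, tenant id, and table/stage names are placeholders.

    # Continuous ingestion: storage-queue notification integration + auto-ingest pipe (placeholder names).
    import snowflake.connector

    cur = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>", role="SYSADMIN",
    ).cursor()

    cur.execute("""
        CREATE NOTIFICATION INTEGRATION IF NOT EXISTS IC_QUEUE_INT
          ENABLED = TRUE
          TYPE = QUEUE
          NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
          AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://<storageacct>.queue.core.windows.net/ic-events'
          AZURE_TENANT_ID = '<tenant-id>'
    """)

    cur.execute("""
        CREATE PIPE IF NOT EXISTS RAW.IC.IC_PIPE
          AUTO_INGEST = TRUE
          INTEGRATION = 'IC_QUEUE_INT'
        AS
          COPY INTO RAW.IC.IC_LANDING
          FROM @RAW.IC.IC_STAGE
          FILE_FORMAT = (FORMAT_NAME = 'RAW.BIACM.PVO_CSV_FF')
    """)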
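Sketch for the Rclone bullet: copying one subject area's extract files from an OCI bucket to the matching Blob container. The remote names ("oci", "azblob") are assumed to be preconfigured in rclone.conf, and the paths are placeholders.

    # Copy one subject area's files from OCI Object Storage to Azure Blob via rclone.
    import subprocess

    job_type = "AR"  # illustrative subject area
    subprocess.run(
        [
            "rclone", "copy",
            f"oci:biacm-extracts/{job_type}/",    # OCI bucket remote (preconfigured)
            f"azblob:biacm-dev/{job_type}/",      # Azure Blob remote (preconfigured)
            "--include", "*.csv.gz",
            "--transfers", "8",
        ],
        check=True,
    )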
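Sketch for the resource-monitor bullet: a monthly credit quota on the load warehouse with notify/suspend triggers. The quota and warehouse name are placeholders; creating resource monitors requires ACCOUNTADMIN.

    # Resource monitor guarding the load warehouse (placeholder quota and names).
    import snowflake.connector

    cur = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>", role="ACCOUNTADMIN",
    ).cursor()

    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR LOAD_WH_RM
          WITH CREDIT_QUOTA = 100
          FREQUENCY = MONTHLY
          START_TIMESTAMP = IMMEDIATELY
          TRIGGERS ON 80 PERCENT DO NOTIFY
                   ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE LOAD_WH SET RESOURCE_MONITOR = LOAD_WH_RM")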
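Sketch for the PySpark/MDM bullet: pulling records from an in-house MDM REST API and landing them as Parquet. The endpoint, response shape, schema, and ADLS path are hypothetical.

    # Pull MDM records over REST and land them as Parquet via Spark (hypothetical API and schema).
    import requests
    from pyspark.sql import SparkSession
    from pyspark.sql.types import StringType, StructField, StructType

    spark = SparkSession.builder.appName("mdm_pull").getOrCreate()

    resp = requests.get(
        "https://<mdm-host>/api/v1/customers",           # placeholder endpoint
        headers={"Authorization": "Bearer <token>"},
        timeout=60,
    )
    resp.raise_for_status()

    schema = StructType([
        StructField("customer_id", StringType()),
        StructField("name", StringType()),
        StructField("status", StringType()),
    ])
    rows = [
        (r.get("customerId"), r.get("name"), r.get("status"))
        for r in resp.json().get("items", [])
    ]
    df = spark.createDataFrame(rows, schema=schema)
    df.write.mode("overwrite").parquet("abfss://landing@<storageacct>.dfs.core.windows.net/mdm/customers/")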
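Sketch for the bank/bank-account bullet: building a payload and posting it to an Oracle Cloud REST resource. The resource path, version alias, and payload attributes are hypothetical; the real Oracle Fusion REST resources and field names differ.

    # Build and submit a bank / bank-account payload to an Oracle Cloud REST endpoint (hypothetical).
    import requests

    BASE = "https://<fusion-host>/fscmRestApi/resources/latest"  # placeholder base URL

    payload = {
        "BankName": "EXAMPLE BANK",                                   # placeholder fields
        "CountryCode": "US",
        "BankAccounts": [{"AccountNumber": "12345678", "CurrencyCode": "USD"}],
    }
    resp = requests.post(
        f"{BASE}/banks",                       # placeholder resource name
        auth=("<integration-user>", "<password>"),
        json=payload,
        timeout=60,
    )
    resp.raise_for_status()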
ETL: Informatica, IICS, dbt, ADF, ODI 12c