
Anand Soleti

Vernon Hills, IL

Summary

  • Experienced Data Warehouse Technical Lead and Data Engineer with 13+ years of experience in data warehouse design and development using modern data architectures.
  • Manage a team of data engineers, delegating work and reporting project status to lead architects and managers.
  • Keen to learn modern technology stacks and invest time in building POCs on the modern data stack.
  • Process data in multiple file formats (Parquet, Delta, JSON, etc.) using both batch processing and stream processing with Snowflake streaming.
  • Organized and independent; successful at managing multiple priorities with a positive outlook. Willing to take on added responsibilities to meet team goals.
  • Use Databricks clusters for heavy workloads in pipelines.
  • Experienced in data warehouse modelling, such as dimensional modelling, with good exposure to Data Vault 2.0 methodology and design.
  • Worked in Agile methodology; involved in sprint planning, story design, and story-point estimation.
  • Worked with warehouse tools such as Databricks, Snowflake, OBIA, and OAC, and ETL tools such as Informatica, IICS, dbt, ODI, and ADF.
  • Used Snowflake to design and build a brand-new warehouse supporting the ongoing migration of replications from on-premises to the cloud.
  • Well experienced in building Tableau data sources in full and incremental modes to support ongoing business activity.
  • Built data flows and datasets in Power BI for the ongoing migration of webMethods API reports to Power BI.

Overview

10 years of professional experience
3 years of post-secondary education
2 Certifications

Work History

Sr. Data Engineer

Fujitsu America
02.2019 - Current

TECHNICAL STACK: Snowflake, Oracle 19c, ODI 12c, Informatica, ADF, BIACM, Airflow, Python, PySpark, Azure Function Apps, Power BI, Tableau, GitHub, Azure Databricks

· Participated in business discussions to understand functional requirements, and collaborated with technical teams on designing modern pipelines, working through the nuances of the requirements to build them to spec.

· Designed Snowflake database objects based on Oracle PVO extracts from BIACM, using Azure Blob Storage containers and Oracle Cloud Infrastructure (OCI) buckets.

· Involved in development of the full migration of the current Oracle file-ingestion solution into Snowflake. As part of the data ingestion process, created Snowflake external and internal stages, file formats, warehouses, roles, and other needed objects.
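
The kind of ingestion objects this bullet describes can be sketched with snowflake-connector-python; the account, credentials, object names, container URL, and warehouse sizing below are hypothetical placeholders, not the project's actual values:

    import snowflake.connector  # pip install snowflake-connector-python

    # All connection parameters are placeholders.
    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL_USER", password="***",
        role="SYSADMIN", database="ANALYTICS_DB", schema="STG",
    )
    ddl = [
        # File format for pipe-delimited BIACM PVO extracts (assumed layout).
        """CREATE FILE FORMAT IF NOT EXISTS pvo_csv_ff
               TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1
               FIELD_OPTIONALLY_ENCLOSED_BY = '"' """,
        # External stage over the Azure Blob container holding the extracts.
        """CREATE STAGE IF NOT EXISTS pvo_ext_stage
               URL = 'azure://myacct.blob.core.windows.net/biacm-extracts'
               CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
               FILE_FORMAT = pvo_csv_ff""",
        # A small warehouse dedicated to the ingest workload.
        """CREATE WAREHOUSE IF NOT EXISTS LOAD_WH
               WAREHOUSE_SIZE = XSMALL AUTO_SUSPEND = 60 AUTO_RESUME = TRUE""",
    ]
    with conn.cursor() as cur:
        for stmt in ddl:
            cur.execute(stmt)
    conn.close()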

· Designed and created appropriate containers per environment, along with the corresponding Snowflake integration objects.

· Developed Python code against the Oracle BIACM APIs to invoke job schedules and build extracts.

· Used dbt in pipelines to move cleansed, augmented data from staging areas into the silver layer with the help of models, leveraging dbt macros.

· Used Snowflake streams and tasks to build a CDC mechanism so that downstream tables and views can merge data and tag transactions; Power BI dataflows can then refresh based on the CDC-tagged data.
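
A minimal sketch of this streams-and-tasks CDC pattern, again via snowflake-connector-python; the table, stream, task, and column names are hypothetical:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL_USER", password="***",
        warehouse="LOAD_WH", database="ANALYTICS_DB", schema="STG",
    )
    stmts = [
        # The stream records inserts/updates/deletes on the staging table.
        "CREATE STREAM IF NOT EXISTS invoice_stg_strm ON TABLE invoice_stg",
        # The task merges stream rows into the silver table every 5 minutes,
        # but only when the stream actually has data.
        """CREATE TASK IF NOT EXISTS merge_invoice_task
               WAREHOUSE = LOAD_WH
               SCHEDULE = '5 MINUTE'
               WHEN SYSTEM$STREAM_HAS_DATA('INVOICE_STG_STRM')
           AS
           MERGE INTO silver.invoice t
           USING invoice_stg_strm s ON t.invoice_id = s.invoice_id
           WHEN MATCHED THEN UPDATE SET
               t.amount = s.amount, t.cdc_ts = CURRENT_TIMESTAMP()
           WHEN NOT MATCHED THEN INSERT (invoice_id, amount, cdc_ts)
               VALUES (s.invoice_id, s.amount, CURRENT_TIMESTAMP())""",
        "ALTER TASK merge_invoice_task RESUME",  # tasks start suspended
    ]
    with conn.cursor() as cur:
        for s in stmts:
            cur.execute(s)
    conn.close()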

· Built different integration objects to support integration between Snowflake and tenant systems.

· Created stored procedures in Snowflake to process the JSON files that hold metadata about the files pulled from Oracle BIACM.
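
One way such a JSON-metadata procedure could look, sketched as a Snowflake SQL-scripting procedure created through the Python connector; the control table, manifest keys, and procedure name are hypothetical:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL_USER", password="***",
        warehouse="LOAD_WH", database="ANALYTICS_DB", schema="STG",
    )
    proc = """
    CREATE OR REPLACE PROCEDURE register_extract(meta VARIANT)
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
        -- Record one BIACM extract file from its JSON manifest.
        INSERT INTO file_ctl (file_name, row_count, extracted_at)
        SELECT GET(:meta, 'fileName')::STRING,
               GET(:meta, 'rowCount')::NUMBER,
               GET(:meta, 'extractTime')::TIMESTAMP_NTZ;
        RETURN 'registered';
    END;
    $$"""
    with conn.cursor() as cur:
        cur.execute(proc)
        # PARSE_JSON turns a manifest string into the VARIANT argument.
        cur.execute("""CALL register_extract(PARSE_JSON(
            '{"fileName":"AP_INV_001.csv","rowCount":120,
              "extractTime":"2023-01-01 00:00:00"}'))""")
    conn.close()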

· Developed Python code for tasks, dependencies, an SLA watcher, and a time sensor for each job for workflow management.
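
Given Airflow in the stack above, the workflow-management piece might look like this sketch; the DAG id, schedule, callables, and SLA value are illustrative assumptions:

    from datetime import datetime, time, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.sensors.time_sensor import TimeSensor

    # Hypothetical callables standing in for the real extract/load steps.
    def run_biacm_extract(**_):
        print("trigger BIACM PVO extract")

    def load_into_snowflake(**_):
        print("COPY INTO staging tables")

    with DAG(
        dag_id="biacm_to_snowflake",
        start_date=datetime(2023, 1, 1),
        schedule_interval="0 2 * * *",              # nightly run
        catchup=False,
        default_args={"sla": timedelta(hours=2)},   # SLA watcher per task
    ) as dag:
        # Time sensor: hold the run until the source batch window opens.
        wait_for_window = TimeSensor(task_id="wait_for_window",
                                     target_time=time(2, 30))
        extract = PythonOperator(task_id="extract",
                                 python_callable=run_biacm_extract)
        load = PythonOperator(task_id="load",
                              python_callable=load_into_snowflake)
        wait_for_window >> extract >> load          # job dependencies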

· Built data pipelines with failure-control mechanisms, from the BIACM PVO extracts through to the replicated Snowflake tables.

· Built stored procedures to implement business logic for processing data in pipelines.

· Leveraged Snowpipe to build a continuous stream of data supporting the IC module from external systems, using Azure ADLS Gen2 and a queue mechanism.
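
The Snowpipe piece can be sketched as follows; the notification integration, queue URI, tenant id, stage, and target table are hypothetical placeholders:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL_USER", password="***",
        warehouse="LOAD_WH", database="ANALYTICS_DB", schema="STG",
    )
    stmts = [
        # Notification integration bound to the Azure Storage Queue that
        # receives blob-created events from the ADLS Gen2 container.
        """CREATE NOTIFICATION INTEGRATION IF NOT EXISTS ic_queue_int
               ENABLED = TRUE
               TYPE = QUEUE
               NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
               AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://myacct.queue.core.windows.net/snowpipe-events'
               AZURE_TENANT_ID = '<tenant-id>'""",
        # The pipe auto-ingests each new file from the stage as it lands.
        """CREATE PIPE IF NOT EXISTS ic_pipe
               AUTO_INGEST = TRUE
               INTEGRATION = 'IC_QUEUE_INT'
           AS COPY INTO ic_raw
              FROM @ic_ext_stage
              FILE_FORMAT = (FORMAT_NAME = pvo_csv_ff)""",
    ]
    with conn.cursor() as cur:
        for s in stmts:
            cur.execute(s)
    conn.close()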

· Used Rclone to copy files from OCI buckets to Blob containers based on job-type subject areas related to Oracle Financials.

· Created Azure Function Apps and used them as external functions in Snowflake.
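
Wiring an Azure Function into Snowflake as an external function generally takes an API integration plus the function definition; everything below (the URLs, ids, and the enrichment example) is an illustrative assumption:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ETL_USER", password="***",
        warehouse="LOAD_WH", database="ANALYTICS_DB", schema="STG",
    )
    stmts = [
        # API integration pointing at an Azure API Management front end
        # that proxies the Function App (all ids/URLs are placeholders).
        """CREATE API INTEGRATION IF NOT EXISTS az_func_int
               API_PROVIDER = AZURE_API_MANAGEMENT
               AZURE_TENANT_ID = '<tenant-id>'
               AZURE_AD_APPLICATION_ID = '<app-id>'
               API_ALLOWED_PREFIXES = ('https://myapim.azure-api.net/')
               ENABLED = TRUE""",
        # External function callable from SQL; the Function App does the work.
        """CREATE EXTERNAL FUNCTION enrich_address(address VARCHAR)
               RETURNS VARIANT
               API_INTEGRATION = az_func_int
               AS 'https://myapim.azure-api.net/enrich-address'""",
    ]
    with conn.cursor() as cur:
        for s in stmts:
            cur.execute(s)
    conn.close()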

· Built up needed capabilities, such as warehouse setups and performant queries in Snowflake, so that data loads complete within the stipulated window for day-to-day business activities.

· Aligned with standard practices: kept uploaded file sizes within Snowflake's recommendations and maintained resource monitors.
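
A resource monitor of the kind mentioned here can be sketched like this; the quota and thresholds are illustrative, and creating monitors requires the ACCOUNTADMIN role:

    import snowflake.connector

    # Resource monitors are account-level objects.
    conn = snowflake.connector.connect(
        account="myorg-myaccount", user="ADMIN_USER", password="***",
        role="ACCOUNTADMIN",
    )
    stmts = [
        # Monthly credit quota with notify/suspend thresholds.
        """CREATE OR REPLACE RESOURCE MONITOR load_wh_monitor
               WITH CREDIT_QUOTA = 100
               FREQUENCY = MONTHLY
               START_TIMESTAMP = IMMEDIATELY
               TRIGGERS ON 80 PERCENT DO NOTIFY
                        ON 100 PERCENT DO SUSPEND""",
        "ALTER WAREHOUSE LOAD_WH SET RESOURCE_MONITOR = load_wh_monitor",
    ]
    with conn.cursor() as cur:
        for s in stmts:
            cur.execute(s)
    conn.close()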

Data Conversion Lead at WWT

Fujitsu America
02.2018 - 02.2020

· Designed and developed the FBDI load process for all master data into Oracle Fusion, using data pipelines developed in ODI and ADF.

· Developed ADF ELT flows covering migration of all financial objects, including AR, AP, and GL, from Oracle EBS R12 to Oracle Cloud using the FBDI process, staging them in ADLS containers.

· Wrote PySpark scripts to pull data from home-grown MDM applications using APIs.
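
A PySpark pull from a REST-style MDM API might look like this sketch; the endpoint, auth token, field shape, and output path are hypothetical:

    import requests
    from pyspark.sql import SparkSession

    # Illustrative stand-in for the home-grown MDM API.
    MDM_URL = "https://mdm.example.com/api/v1/customers"

    spark = SparkSession.builder.appName("mdm_pull").getOrCreate()

    # Pull one page of master records over REST (driver-side; fine for
    # modest volumes, paginate for larger ones).
    resp = requests.get(MDM_URL,
                        headers={"Authorization": "Bearer <token>"},
                        timeout=60)
    resp.raise_for_status()
    records = resp.json()  # expected: a list of flat dicts

    # Land as a DataFrame and stage it for downstream FBDI file generation.
    df = spark.createDataFrame(records)
    df.write.mode("overwrite").parquet("/stage/mdm/customers")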

· Built error-handling mechanisms that identify mismatches between loaded and failed records in Oracle Cloud tables using the BIP process.

· Built Python scripts to call Oracle APIs and build payloads to process bank and bank-account information.

· Developed a reconciliation process that delivers an overall summary and process counts over email to the respective teams.

· Developed stored procedures to pull data from transaction tables and stage it, so that ETL routines can pick it up and build FBDI files, which are then used to create transactional data in Oracle Cloud.

· Analyzed Oracle table structures and built the needed extract queries, consulting with functional teams and obtaining sign-off.

· Managed the team and kept stakeholders informed whenever there were roadblocks in development activity.

Data Integration Consultant at Zebra Technologies

Fujitsu America
12.2015 - 12.2018


  • TECHNICAL STACK: OBIEE 11.1.1.7.x, OBIA 7.9.5, Informatica 9.5.1, ODI 12.2.1, DAC 11.1.1.7, Datameer 5.4, Oracle 11g, CDH 5.1.0-mr2, Informatica Cloud, Java, Red Hat Enterprise Server 6.5, SVN, GitHub
  • Design & Analysis:
  • Performed design to build Informatica routines and extended the behavior of existing OOB Informatica mappings
  • Designed STAR schemas to accommodate reports based on MD-50 documents
  • Held discussions with business and technical teams to understand the OLTP behavior referenced in the MD-50s
  • Designed execution plans/fact groups to perform data loads based on new OBIEE subject areas
  • Performed analysis to figure out the relationships between data entities in EBS and Siebel
  • Built MD-70 technical design documents and mentored offshore team members on the implementation
  • Technical Work:
  • Designed new custom Informatica mappings for different functional areas, including AR, AP, and Inventory Items
  • Extended OOB behavior of Informatica mappings to support ongoing development work
  • Involved in custom development of mappings, procedures, and packages, added to custom fact groups
  • Supported production-environment issues and planned new releases based on the issue log
  • Built metadata in different layers of the RPD to support report build-out
  • Migrated RPDs and catalogs across different environments
  • Built mappings to load data from Siebel, EBS, and SFDC data sources, working with objects in conjunction with EBS Contracts, ERP Payables, SFDC Case Management, etc.
  • Wrote shell scripts to invoke the scheduler and circulate iBots to different business streams
  • Used Java transformations to connect to the InContact cloud system and pull the data needed to build call-center KPIs
  • Built SQL queries, packages, and functions per technical requirements
  • Used Informatica Cloud (IICS) to load data into SFDC DWH tables using replication tasks
  • Built new dashboards and reports on top of new and old subject areas pertaining to Finance, Call Center, Service Contracts, Services Repair Management, Accounts Payable, Receivables, etc.
  • Developed complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL
  • Created tables, indexes, views, and triggers
  • Created Oracle packages with advanced SQL, such as analytic and aggregate functions, to process and load daily files
  • Used classic RPD features such as hierarchical drilldowns, federation, and date functions based on the Time dimension
  • Involved in post-go-live production support, handling incidents and root-cause fixes within SLA
  • Collaborated with a team of developers to design, develop, and implement a BI solution for Sales, Product, and Customer KPIs
  • Project: BI Implementation

Sr Data Integration Lead at Toyota Motors

Ascentt Systems
Torrance
12.2013 - 12.2015
  • Involved in the inception of a new data-mart design for financial data integration between the Toyota Parts Data Mart and Toyota Incentive Systems
  • Participated in requirements elaboration and drafted the design and development approach for OBIEE
  • Elaboration phase: built a data model aligned with the STAR schema, interviewed business-process owners, captured business events and derived requirements, mocked up reports, and derived sample data
  • Integrated different home-grown data marts with the OBIA warehouse holding financial incentives (GL, AP) and vehicle-related transactions
  • Implemented the Slowly Changing Dimension Type 2 (SCD2) methodology for accessing the full history of account and transaction information (see the sketch after this list)
  • Designed and developed OBIEE work in the BMM layer: hierarchies, proper design of logical table sources for facts and dimensions, setting proper dimensionality on fact tables, derivation of measures, and applying business logic to fact measures
  • Developed advanced OBIEE features such as drill-through and drill-across within and between multiple data sources
  • Built financial transaction detail reports using Oracle BIP templates
  • Built aggregate navigation based on user experience to boost the performance of the financial dashboard
  • Switched standard dashboard and report design to the standards of the Oracle SampleApp and followed SampleApp OBIEE VM practices
  • Set up session variables (user session) and repository variables (static and dynamic), such as location as a static variable and current day/month/year as dynamic variables, using various initialization blocks
  • Implemented data-level security based on user application roles and created new application roles to support different verticals (Sales DM, Vehicles DM, Financials DM) having multiple apps
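
The SCD2 pattern referenced above can be sketched as a two-step expire-and-insert against Oracle, here via python-oracledb; the dimension and staging tables and the tracked attribute are hypothetical:

    import oracledb  # pip install oracledb

    # Connection details and table/column names are illustrative.
    conn = oracledb.connect(user="dw", password="***", dsn="dwhost/ORCLPDB")
    expire_sql = """
        UPDATE dim_account d
           SET d.eff_end_dt = SYSDATE, d.current_flg = 'N'
         WHERE d.current_flg = 'Y'
           AND EXISTS (SELECT 1 FROM stg_account s
                        WHERE s.account_id = d.account_id
                          AND s.account_name <> d.account_name)"""
    insert_sql = """
        INSERT INTO dim_account
            (account_id, account_name, eff_start_dt, eff_end_dt, current_flg)
        SELECT s.account_id, s.account_name, SYSDATE, NULL, 'Y'
          FROM stg_account s
         WHERE NOT EXISTS (SELECT 1 FROM dim_account d
                            WHERE d.account_id = s.account_id
                              AND d.current_flg = 'Y')"""
    with conn.cursor() as cur:
        cur.execute(expire_sql)   # close out changed versions
        cur.execute(insert_sql)   # open new current versions
    conn.commit()
    conn.close()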

Education

Master of Science - Computer Science

Osmania University
Hyderabad
06.1999 - 06.2002

Skills

ETL: Informatica, IICS, dbt, ADF, ODI 12c


Certification

SnowPro Certification from Snowflake

Timeline

Sr. Data Engineer

Fujitsu America
02.2019 - Current

Data Conversion Lead at WWT

Fujitsu America
02.2018 - 02.2020

Data Integration Consultant at Zebra Technologies

Fujitsu America
12.2015 - 12.2018

Sr Data Integration Lead at Toyota Motors

Ascentt Systems
12.2013 - 12.2015

Master of Science - Computer Science

Osmania University
06.1999 - 06.2002