Sathish Vardhineedi

Summary

Experienced Microsoft Certified Data Engineer with 10+ years of expertise in designing and managing enterprise data architectures. Skilled in data modeling, database management, and data integration. Committed to collaborative teamwork and consistently delivering impactful results. Recognized for adaptability and reliability in dynamic environments. Passionate about using data to drive business insights and innovation. Holds an advanced-level English certification, has working knowledge of Dell Boomi, and brings compelling presentation and reporting skills.

Overview

9 years of professional experience
1 Certification

Work History

Data Architect

AJG Gallagher
02.2023 - Current
  • Designed an application leveraging Power BI and Power Automate to determine the market value of compensation for the client's job positions
  • The Gallagher team compiles completed compensation surveys, matches them against the jobs that need to be priced, and delivers the resulting survey list for each job position to the client
  • Salesforce: shared file location (file system)
  • Power BI: visualizes the existing master data with different slicers and hierarchies
  • Power Automate / Power Apps: used to add a new survey and to upload jobs in bulk or individually; for example, a row inserted through Power Apps is loaded into the dedicated SQL pool

Lead Data Engineer

NBFC
04.2022 - 12.2022
  • The objective of this project is to develop a data cloud solution for NBFC in which data from all the different entities is stored in a central location, serving as a single source of insights for internal and external stakeholders
  • Data arrives from various internal and external source systems in structured, semi-structured, and unstructured formats
  • Extraction, transformation, and load are performed with Azure Data Factory and SSIS
  • Data lake: the project uses a staging layer that loads incremental and full-load files and creates an archive folder in ADLS containers
  • These files are loaded in CSV format
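The staging-and-archive pattern above can be sketched in plain Python. This is a minimal sketch: the file-name prefix and folder layout are hypothetical, and the real pipeline runs in Azure Data Factory against ADLS containers rather than a local file system.

```python
from datetime import date
from pathlib import PurePosixPath

def archive_path(file_name: str, load_date: date) -> PurePosixPath:
    """Route a staged file into an archive folder inside an ADLS container.

    Files prefixed 'FULL_' are treated as full loads; everything else is
    treated as incremental. Prefix and folder names are illustrative only.
    """
    load_type = "full" if file_name.startswith("FULL_") else "incremental"
    # Partition the archive by load type and load date, e.g.
    # staging/archive/incremental/2022-06-01/loans_delta.csv
    return PurePosixPath("staging/archive") / load_type / load_date.isoformat() / file_name

print(archive_path("FULL_customers.csv", date(2022, 6, 1)))
# → staging/archive/full/2022-06-01/FULL_customers.csv
print(archive_path("loans_delta.csv", date(2022, 6, 1)))
# → staging/archive/incremental/2022-06-01/loans_delta.csv
```

Partitioning the archive by load type and date keeps incremental reloads and full refreshes from overwriting each other.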

Lead ETL DEVELOPER

Dover
09.2020 - 04.2022
  • The objective of this project is to develop a solution to verify the quality of the reports being displayed
  • The project also focuses on facilitating export options in the BI report
  • This is achieved by scraping the required attributes from the website using Databricks Python code with Selenium WebDriver
  • The framework is scheduled to scrape information once a week
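The attribute-scraping step can be sketched with the standard library's HTML parser. The production framework used Selenium WebDriver inside Databricks; this stand-in only shows the extraction logic, and the `report-metric` class name and sample markup are hypothetical.

```python
from html.parser import HTMLParser

class AttributeScraper(HTMLParser):
    """Collect the text of elements tagged with a given CSS class.

    A minimal stand-in for the Selenium WebDriver element lookups used
    in the real framework; 'report-metric' is a hypothetical class name.
    """
    def __init__(self, target_class: str = "report-metric"):
        super().__init__()
        self.target_class = target_class
        self._capturing = False
        self.values = []

    def handle_starttag(self, tag, attrs):
        if dict(attrs).get("class") == self.target_class:
            self._capturing = True

    def handle_endtag(self, tag):
        self._capturing = False

    def handle_data(self, data):
        if self._capturing and data.strip():
            self.values.append(data.strip())

page = '<div class="report-metric">Revenue: 1.2M</div><div class="other">x</div>'
scraper = AttributeScraper()
scraper.feed(page)
print(scraper.values)  # → ['Revenue: 1.2M']
```

The scraped values can then be compared against the figures the BI report displays, which is the quality check the project describes.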

Lead ETL DEVELOPER

Dover
02.2020 - 04.2022
  • The objective of this project is to develop a data cloud solution for Dover in which data from all the different entities is stored in a central location, serving as a single source of insights for internal and external stakeholders
  • Data arrives from various internal and external source systems in structured, semi-structured, and unstructured formats
  • Extraction, transformation, and load are performed with either Azure Data Factory or Databricks
  • In Databricks, Python, PySpark, and spark.sql are used to perform ETL and the required transformations
  • Data lake: the project uses a staging layer that loads incremental and full-load files and creates an archive folder in ADLS containers
  • These files are loaded in Parquet format

Software Engineer

Dover
02.2020 - 09.2020
  • The objective of this project is to develop a data cloud solution for Dover in which data from all the different entities is stored in a central location, serving as a single source of insights for internal and external stakeholders
  • Data arrives from various internal and external source systems in structured, semi-structured, and unstructured formats
  • Extraction, transformation, and load are performed with either Azure Data Factory or Databricks
  • In Databricks, Python, PySpark, and spark.sql are used to perform ETL and the required transformations
  • Data lake: the project uses a staging layer that loads incremental and full-load files and creates an archive folder in ADLS containers
  • These files are loaded in Parquet format

Big Data, Data Warehouse, Data Lake Developer

Omics software inc
06.2019 - 02.2020
  • The objective of this project is to develop an analytics solution for ME Dubai
  • Data arrives from various internal and external source systems in structured and semi-structured formats
  • Tasks include copying incremental data into the Azure Data Lake Store UDL folder, validating schema and data quality, and performing data cleansing, referential-integrity checks, restatement, and error logging
  • Write the curated data into ADLS and into the enterprise data warehouse in Azure SQL Data Warehouse
  • Perform aggregations and calculate KPIs
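The aggregation-and-KPI step can be sketched in plain Python. The column names and the KPI itself are illustrative stand-ins for the curated warehouse schema; the real computation runs in Azure SQL Data Warehouse.

```python
from collections import defaultdict

def revenue_per_region(rows):
    """Aggregate cleansed rows into a per-region revenue KPI.

    'region' and 'revenue' are hypothetical column names standing in
    for the curated warehouse schema.
    """
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["revenue"]
    return dict(totals)

curated = [
    {"region": "Dubai", "revenue": 120.0},
    {"region": "Dubai", "revenue": 80.0},
    {"region": "Abu Dhabi", "revenue": 50.0},
]
print(revenue_per_region(curated))  # → {'Dubai': 200.0, 'Abu Dhabi': 50.0}
```

In the warehouse this is a GROUP BY over the curated fact table; the sketch only shows the shape of the aggregation.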

Big Data, Data Warehouse, ELT and Power BI Reports Developer

Unilever
05.2018 - 06.2019
  • The objective of this project is to automate loading of the incremental and restatement data, delivered as text-delimited files on a local file share, into Azure Data Lake and Azure SQL Data Warehouse
  • Tasks include copying incremental data into the Azure Data Lake Store RAW folder, validating schema and data quality, and performing data cleansing, referential-integrity checks, restatement, and error logging
  • Merge the curated data into a big file in ADLS and into the enterprise data warehouse in Azure SQL Data Warehouse
  • Create Power BI reports and dashboards from the curated data in SQL Data Warehouse
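The merge of incremental and restatement data into one curated set can be sketched as a key-based upsert. The key and column names are illustrative; the real merge targets the big file in ADLS and the warehouse tables.

```python
def merge_restatements(existing, restated, key="id"):
    """Upsert restatement rows over the existing curated data.

    Rows sharing a key are replaced by the restated version; new keys
    are appended. Mirrors the 'merge into big file' step at a high level.
    """
    merged = dict(existing)
    for row in restated:
        merged[row[key]] = row
    return merged

existing = {"1": {"id": "1", "sales": 10}, "2": {"id": "2", "sales": 20}}
restated = [{"id": "2", "sales": 25}, {"id": "3", "sales": 5}]
merged = merge_restatements(existing, restated)
print(merged["2"]["sales"])  # → 25 (restated value wins)
```

Letting restatement rows overwrite by key is what keeps historical corrections from duplicating records in the curated set.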

T-SQL, ELT and Power BI Reports Developer

Unilever
04.2017 - 05.2018
  • The objective of this project is to load data arriving from various sources in nested JSON format into Azure SQL Data Warehouse
  • Tasks include extracting data from some of the sources and placing it into the data lake
  • Read nested JSON files and copy the data into SQL staging tables
  • Apply data quality checks and error logging, then load the data into the enterprise data warehouse
  • Consume the curated data in Power BI reports and dashboards
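Reading nested JSON into flat staging rows can be sketched with a recursive flattener. The field names in the sample document are illustrative; the real load writes to SQL staging tables via the pipeline.

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON objects into a single staging-row dict,
    joining nested keys with underscores (e.g. order_customer_city)."""
    row = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, f"{name}_"))
        else:
            row[name] = value
    return row

doc = json.loads('{"order": {"id": 7, "customer": {"city": "Pune"}}, "total": 9.5}')
print(flatten(doc))
# → {'order_id': 7, 'order_customer_city': 'Pune', 'total': 9.5}
```

Flattening to underscore-joined column names gives each nested attribute a stable column in the staging table, after which the quality checks and warehouse load proceed on flat rows.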

T-SQL, ETL and SSRS Reports Developer

Conference Series
06.2016 - 04.2017
  • The main objective of this project is to create ETL packages for the client's departments according to business requirements and add them to their existing enterprise data warehousing solution
  • Responsibilities mainly include creating ODS packages, facts, and dimensions, and optimizing existing packages
  • Worked as a systems analyst following an agile development model
  • Requirement gathering, analysis, design, development, unit testing, deployment, validation, and support for production issues

Education

Bachelor of Technology - Computer And Information Sciences

MLR Institute of Technology
Hyderabad
06-2015

Skills

  • Azure Synapse Expertise
  • Azure Data factory
  • SQL Server Data Tools (SSDT) 2012/2010 (SSIS)
  • SQL Server Management Studio
  • Experience with Microsoft Fabric
  • Power BI Data Visualization
  • GIT Repository Management
  • Fundamental Salesforce Workbench Skills
  • T-SQL Database Management
  • Fundamental Python Skills
  • Basic SOQL Proficiency
  • Azure SQL Database Management
  • SQL Server 2012/2014/2016/2019
  • Big Data
  • ETL/ELT development

Certification

  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • Microsoft Certified: Azure Enterprise Data Analyst Associate (DP-500)
  • Microsoft Certified: Power BI Data Analyst Associate (PL-300)

Languages

English: Professional working proficiency

LinkedIn profile

https://www.linkedin.com/public-profile/settings?trk=d_flagship3_profile_self_view_public_profile

Interests

  • Tech enthusiast, passionate about exploring the latest advancements and innovations
  • Participating in cultural exchange programs and homestays

Workshop

Attended an immersive NVIDIA AI workshop hosted by my employer.
