Rajesh is a seasoned Business Intelligence/Data Engineering/Data Science/Big Data consultant and architect with a knack for understanding core systems, performing system analysis, and taking a hands-on approach to shaping system architecture, implementing solutions, and delivering measurable business value and tangible outcomes.
Throughout his career he has been responsible for business requirement analysis, client engagement and team coordination, preparing technical and functional documentation to meet business requirements, and participating in the end-to-end SDLC process.
Engaged with marketing directors, enterprise architects, and business PMs as a data consultant, identifying IT needs and driving their execution and delivery.
16+ years of IT experience in Data Engineering, Data Science, Data Modelling, Data Mart development, and maintenance of data warehousing and client-server applications in the Banking & Finance, Manufacturing, and Retail domains.
Architected and optimized the Databricks Lakehouse framework to support multi-terabyte datasets, incorporating best practices in data modeling, ETL processes, and data partitioning to ensure efficient data storage and access.
Designed and implemented complex pipelines for ingesting and transforming data for data science use cases.
Architected and implemented data science infrastructure on Microsoft Azure, including design and implementation of a VNET-based architecture.
Architected and designed the model creation and deployment strategy for Data Science projects.
Well versed in architecting and building data-intensive applications for performance and scale.
Well versed in architecting and designing data platforms and solutions across multiple clouds while managing security, performance, and scale.
Experienced in data extraction, migration, transformation, and loading (ETL) between homogeneous and heterogeneous systems.
Strong experience in Python, stored procedure/T-SQL coding, performance tuning, and query optimization on both SQL Server and Databricks.
Highly proficient in T-SQL for developing complex stored procedures, triggers, tables, user-defined functions, relational database models, data integrity constraints, and SQL joins.
Worked on all phases of software development life cycle including Analysis, Design, Development, Testing, Peer Code Review and Support.
Good knowledge of Big Data, Hadoop, and Microsoft Azure concepts, tools, and technologies.
Good knowledge of Cloud integration using SSIS and Dell Boomi.
Drafted business proposals and architectures for various projects.
Good knowledge of private banking, investment banking, leasing, and manufacturing processes.
Broad knowledge of multi-cloud application design and implementation (Azure, AWS, GCP).
Overview
16 years of professional experience
Work History
Senior Data Engineer & Solution Architect
Medical Solutions
03.2021 - Current
As Solutions Architect and Senior Data Engineer, designed solutions with a layered architecture using a configuration-driven approach to support multiple requirements.
Developed robust data pipelines using Databricks, Delta Lake, and Apache Spark, facilitating data ingestion, cleansing, transformation, and aggregation from diverse sources into the Lakehouse with high throughput and low latency (a representative pattern is sketched at the end of this section).
Led a team of data engineers and collaborated with cross-functional teams (including Data Scientists, Business Analysts, and IT Security) to align the Lakehouse project with business objectives and security requirements.
Participated in strategic discussions, requirement workshops, business process descriptions, use case scenarios, and workflow analysis.
Acted as the key Solutions Architect, presenting technical strategies, project progress, and outcomes to stakeholders, including technical leads, project managers, and executive leadership, ensuring project alignment with business goals.
Responsible for architecting, designing, implementing, and supporting scalable, secure infrastructure and solutions for cloud-based data science projects.
Was instrumental in designing and developing multi-environment deployment of pipelines and ML models.
Implemented security best practices using Azure ACLs and Unity Catalog.
Provided consulting and cloud architecture on the Microsoft Azure platform for high availability of services and low operational costs.
Azure Databricks Delta Lake implementation
Azure Log Analytics and reporting for smooth product support.
Data platform design and development for ingesting data from multiple sources
NoSQL design strategy and implementation for consistent query performance and scalability using MongoDB Atlas
Designed technical data flows with specs and diagrams and led the team to deliver solutions based on those specs.
Gathered requirements from the business, performed data analysis on those requirements, and provided business reports.
Designed and developed a configurable batch processing mechanism to process large volumes of records on a daily basis, with easy control over batch execution.
Engaged with the business to understand pain points and provide solutions.
Created complex Azure Data Factory and Azure Databricks pipelines to integrate data from multiple sources.
Created PySpark notebooks to integrate and process data from the Data Lake.
Migrated all ADF/ADB pipelines from Azure Data Lake Gen1 to Azure Data Lake Gen2.
Planned solution architecture for cloud-based Azure big data solutions using Azure services and components. Built big data solutions to acquire, prepare, process, predict, and visualize data insights using services such as Azure Data Factory, Azure Storage, Azure Data Lake, Apache Spark clusters, Azure Databricks, Power BI, the Microsoft BI stack, and Python on Azure.
Provided architectural guidance on Azure Data Flows and their implementation.
Implemented security using Azure Key Vault.
Worked on and supported Azure DevOps (CI/CD pipelines) for continuous integration and deployment.
Followed Agile/Scrum methodology on the project and played a pivotal role in planning, development, and sprint alignment.
Used Python to build Databricks notebooks to refine data.
Developed and operationalized ML solutions.
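Below is a minimal, illustrative sketch of the kind of Databricks ingestion pattern referenced above; the secret scope, storage account, paths, and table names are hypothetical placeholders rather than actual project values.

```python
# Minimal Databricks (PySpark) ingestion sketch: land raw files from ADLS Gen2
# into a partitioned Delta table. All names below are illustrative placeholders.
from pyspark.sql import functions as F

# Storage credential pulled from a Key Vault-backed secret scope (hypothetical
# scope/key names); `spark` and `dbutils` are provided by the notebook runtime.
spark.conf.set(
    "fs.azure.account.key.examplelake.dfs.core.windows.net",
    dbutils.secrets.get(scope="kv-scope", key="adls-account-key"),
)

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"

# Read raw JSON, de-duplicate, and stamp an ingestion date for partitioning.
df = (
    spark.read.format("json").load(raw_path)
    .dropDuplicates(["order_id"])
    .withColumn("ingest_date", F.current_date())
)

# Append into a Delta table in the Lakehouse, partitioned by ingestion date.
(
    df.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("bronze.orders")
)
```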
Principal Software Engineer
GEP
09.2019 - 03.2021
As an architect and Principal Engineer, designed solutions with a layered architecture using a configuration-driven approach to support multiple requirements.
Participated in strategic discussions, requirement workshops, business process descriptions, use case scenarios, and workflow analysis.
Provided consulting and cloud architecture on the Microsoft Azure platform for high availability of services and low operational costs.
Contributed to multi-cloud design and development of products and services.
Azure Databricks Delta Lake implementation
Azure Log Analytics and reporting for smooth product support.
Data platform design and development for ingesting data from multiple sources
NoSQL design strategy and implementation for consistent query performance and scalability using MongoDB Atlas
Designed technical data flows with specs and diagrams and led the team to deliver solutions based on those specs.
Gathered requirements from the business, performed data analysis on those requirements, and provided business reports.
Designed and developed a configurable batch processing mechanism to process large volumes of records on a daily basis, with easy control over batch execution.
Engaged with the business to understand pain points and provide solutions.
Created complex Azure Data Factory and Azure Databricks pipelines to integrate data from multiple sources.
Created PySpark notebooks to integrate and process data from the Data Lake.
Migrated all ADF/ADB pipelines from Azure Data Lake Gen1 to Azure Data Lake Gen2.
Planned solution architecture for cloud-based Azure big data solutions using Azure services and components. Built big data solutions to acquire, prepare, process, predict, and visualize data insights using services such as Azure Data Factory, Azure Storage, Azure Data Lake, Apache Spark clusters, Azure Databricks, Power BI, the Microsoft BI stack, and Python on Azure.
Provided architectural guidance on Azure Data Flows and their implementation.
Implemented security using Azure Key Vault.
Worked on and supported Azure DevOps (CI/CD pipelines) for continuous integration and deployment.
Followed Agile/Scrum methodology on the project and played a pivotal role in planning, development, and sprint alignment.
Migration strategy from SQL (Relational) to NoSQL
Used Python to build frameworks in Databricks to refine data.
Used Python notebooks to build migration logic from SQL to MongoDB (NoSQL); a simplified sketch of this pattern follows this section.
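A simplified, hypothetical sketch of such a SQL-to-MongoDB migration step, assuming pyodbc and pymongo; connection strings, table names, and collection names are placeholders rather than actual project values.

```python
# Simplified SQL Server -> MongoDB migration sketch. All connection details,
# table names, and collection names are hypothetical placeholders.
import pyodbc
from pymongo import MongoClient

BATCH_SIZE = 5_000

sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=example-sql;DATABASE=sales;"
    "UID=etl_user;PWD=<secret>"
)
mongo = MongoClient("mongodb+srv://example-cluster.mongodb.net")
collection = mongo["sales"]["orders"]

cursor = sql_conn.cursor()
cursor.execute("SELECT order_id, customer_id, status FROM dbo.Orders")
columns = [c[0] for c in cursor.description]

# Stream rows out of SQL Server in batches and bulk-insert them as documents.
while True:
    rows = cursor.fetchmany(BATCH_SIZE)
    if not rows:
        break
    collection.insert_many([dict(zip(columns, row)) for row in rows])

sql_conn.close()
mongo.close()
```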
Solution Architect
Chevron
03.2019 - 09.2019
As a Solutions Architect and BI Technical Lead, designed solutions with a layered architecture using a configuration-driven approach to support multiple requirements.
Planned solution architecture for cloud-based Azure big data solutions using Microsoft Azure services and components.
Built big data solutions to acquire, prepare, process, predict, and visualize data insights using services such as Azure Data Factory, Azure Storage, Azure Data Lake, Apache Spark clusters, Azure Databricks, Power BI, the Microsoft BI stack, and Python on Azure.
Participated in strategic discussions, requirement workshops, business process descriptions, use case scenarios, and workflow analysis.
Provided consulting and cloud architecture on the Microsoft Azure platform for high availability of services and low operational costs.
Designed technical data flows with specs and diagrams and led the team to deliver solutions based on those specs.
Gathered requirements from the business, performed data analysis on those requirements, and provided business reports.
Designed and developed a configurable batch processing mechanism to process large volumes of records on a daily basis, with easy control over batch execution (a simplified sketch appears after this section).
Engaged with the business to understand pain points and provide solutions.
Created complex ADF pipelines to integrate data from multiple sources.
Created complex dashboards and reports using Power BI for directors and executives.
Followed Agile/Scrum methodology on the project and played a pivotal role in planning, development, and sprint alignment.
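A minimal sketch of a configurable batch processing loop in the spirit of the mechanism described above; the batch size, record source, and processing step are hypothetical.

```python
# Minimal configurable batch-processing sketch: records are consumed in
# fixed-size batches so throughput can be tuned without changing the pipeline.
from typing import Iterable, Iterator, List

def chunked(records: Iterable[dict], batch_size: int) -> Iterator[List[dict]]:
    """Yield records in fixed-size batches."""
    batch: List[dict] = []
    for record in records:
        batch.append(record)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

def process_batch(batch: List[dict]) -> None:
    # Placeholder for the real transform/load step (e.g. write to a staging table).
    print(f"processed {len(batch)} records")

def run_daily_job(source: Iterable[dict], batch_size: int = 10_000) -> None:
    for batch in chunked(source, batch_size):
        process_batch(batch)

if __name__ == "__main__":
    sample = ({"id": i} for i in range(25_000))
    run_daily_job(sample, batch_size=10_000)  # batch size is the tuning knob
```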
Technical Lead
Microsoft
08.2017 - 02.2019
Interacted with stakeholders such as business analysts, supply chain managers, and store managers to create plans that meet their business requirements.
Conducted technical and code reviews to ensure adherence to industry-standard development guidelines at all stages of the software development process, including requirements gathering, architecture design, development, testing, and deployment.
Planned solution architecture for cloud-based Azure big data solutions using Azure services and components. Built big data solutions to acquire, prepare, process, predict, and visualize data insights using services such as Azure Data Factory, Azure Storage, Azure Data Lake, Apache Spark clusters, Azure Databricks, Power BI, the Microsoft BI stack, and Azure Machine Learning with Python.
Functioned as a technical lead in the cloud computing space on Azure and acted as the primary coordination point for all projects in the Devices Supply Chain Management Data Sciences team.
Evaluated options for delivering solutions on PaaS vs. IaaS vs. SaaS models.
Planned and implemented solutions using Azure services, components, and Microsoft tools.
Implemented BI/DW solutions on Azure and on-premises using Azure Analysis Services, Azure SQL Data Warehouse, SQL Server, and the MSBI stack.
Designed ETL/ELT pipelines using Azure Data Factory, with Data Lake/Azure Storage for data storage, Azure Data Lake Analytics/Apache Spark for data processing, and Azure Machine Learning for predictive analytics.
Planned, designed, and implemented a DW environment using a Lambda architecture in the cloud with Azure Databricks (a minimal sketch of the streaming and batch layers follows this section). Built multidimensional or tabular cubes using either a star or snowflake schema based on data volume, exposing fact data at the cube level that can be easily sliced by dimensions.
Designed and implemented well-laid-out visuals displaying data insights using Power BI/SQL Server Reporting Services.
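A minimal sketch of the Lambda-style speed and batch layers on Azure Databricks described above; the paths, schema, and table names are hypothetical and the actual implementation would differ.

```python
# Minimal Lambda-style sketch on Databricks: a streaming speed layer appends
# events to a Delta table, and a batch layer aggregates it for serving.
# Paths, schema, and table names are hypothetical placeholders.
from pyspark.sql import functions as F

events_path = "abfss://events@examplelake.dfs.core.windows.net/telemetry/"

# Speed layer: incrementally ingest newly arriving files into a Delta table.
stream = (
    spark.readStream.format("json")
    .schema("device_id STRING, reading DOUBLE, event_time TIMESTAMP")
    .load(events_path)
)
(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/telemetry")
    .outputMode("append")
    .toTable("bronze.telemetry")
)

# Batch layer: periodic aggregation over the full history for the serving layer.
daily = (
    spark.read.table("bronze.telemetry")
    .groupBy(F.to_date("event_time").alias("day"), "device_id")
    .agg(F.avg("reading").alias("avg_reading"))
)
daily.write.format("delta").mode("overwrite").saveAsTable("gold.telemetry_daily")
```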
Technical Lead
Corning
09.2014 - 07.2017
Understood the business process and performed data analysis.
Gathered requirements from business users and obtained document sign-off from them.
Created functional requirement, implementation plan, and unit testing documents.
Designed and implemented ETL mechanisms for business requirements, including full loads and transformations, using SSIS.
Worked with SSIS/SSAS/SSRS to create packages, data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and to deploy SSIS/SSRS/SSAS projects.
Performed SSAS cube analysis using MS Excel, PowerPivot, and Power BI.
Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modelling using star and snowflake schemas.
Defined regular, referenced, and many-to-many relationships and created attribute hierarchies.
Created DAX/MDX queries with calculations, KPIs, actions, partitions, and aggregations in SSAS.
Defined attribute properties in parent-child dimensions; responsible for hiding and disabling attribute hierarchies and sorting attribute members based on a secondary attribute.
Cloud integration (REST API) using SSIS and Dell Boomi.
Created role-based user security (permissions) for cubes, dimensions, measure groups, etc.
Deployed cubes, dimensions, and partitions to DEV, UAT, and production/live servers.
Created tables and views/indexed views with relationships in MS SQL Server 2012.
Configured ETL packages and cubes as jobs and scheduled the jobs using SQL Server 2012.
Component testing and peer code review.
Worked on deployment and regression testing strategy for the application.
Data Integration using SSIS/Informatica.
Reporting using SSRS/SSAS/Power View.
SharePoint Integration
Lead
Schneider Electric
07.2012 - 08.2014
Understood the business process and performed data analysis.
Gathered requirements from business users and obtained document sign-off from them.
Created functional requirement, implementation plan, and unit testing documents.
Designed and implemented ETL mechanisms for business requirements, including full loads and transformations, using SSIS.
Worked with SSIS/SSAS/SSRS to create packages, data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and to deploy SSIS/SSRS/SSAS projects.
Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modelling using star and snowflake schemas.
Created MDX queries with calculations, KPIs, actions, partitions, and aggregations in SSAS.
Created role-based user security (permissions) for cubes, dimensions, measure groups, etc.
Deployed cubes, dimensions, and partitions to DEV, UAT, and production/live servers.
Created tables and views/indexed views with relationships in MS SQL Server 2012.
Configured ETL packages and cubes as jobs and scheduled the jobs using SQL Server 2012.
Component testing and peer code review.
Worked on deployment and regression testing strategy for the application.
Data Integration using SSIS/Informatica.
Reporting using SSRS/SSAS.
Developed components using C#.NET.
Senior Software Engineer
Societe Generale
02.2008 - 07.2012
Involved in Requirement gathering and Business design.
Reviewed requirements and prepared design documents, technical specifications, and unit test cases.
Worked with cross-functional teams to define the scope of project and to define the customer requirements.
Deployed cubes, dimensions, and partitions to DEV, UAT, and production/live servers.
Created tables and views/indexed views with relationships in MS SQL Server.
Integrated data using SSIS; configured ETL packages and cubes as jobs and scheduled the jobs using SQL Server.
Coding/implementation (developed various GUI components and backend code).
Component testing and peer code review.
Involved in performance tuning and denormalization engine development for the application.
Worked on Regression testing strategy for the application.
Education
Master of Science - Computer Science
University of Calcutta
India
07.2007
Bachelor of Science - Computer Science
University of Calcutta
India
07.2005
Skills
Microsoft Windows Azure Services
Spark SQL, Hive SQL, U-SQL
Microsoft Azure services (ADLS, ADF, Blob Storage, VM Configuration, HDFS management, YARN)