An innovative and highly motivated professional with 10+ years of experience in Data Engineering, Data Analytics, and Business Intelligence. Seeking a challenging position as a Senior Data Engineer/BI Developer in a growth-oriented organization that values creative and analytical professionals.
Hands-on experience working with Azure Cloud and its services, such as Azure Data Factory, Azure Blob Storage, Azure Data Flows, Azure Databricks, Azure Synapse, and Azure Key Vault.
Extensive experience in data loading and integration using Azure Data Factory (ADF).
Migrated data solutions to Azure, using Azure Data Factory, Azure SQL Database, and integrating data with Azure Databricks.
Collaborated with cross-functional teams to streamline supply chain processes by analyzing data on procurement, transportation, and warehouse operations, resulting in a 15% improvement in efficiency.
Involved in data migration using SQL, Azure SQL, Azure Storage, SSIS, and PowerShell.
Proficient in data pipeline development, data modeling, and designing & developing data warehouses and business intelligence solutions.
Strong experience in developing complex Stored Procedures, Functions, Views, Joins and Sub queries with T-SQL.
Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification, Identifying data mismatch.
Skilled in preparing complex T-SQL queries, views, and stored procedures for data loading.
Hands on experience in creating Reports and Dashboards as per the business requirements using SSRS, Power BI and Tableau.
Developed and maintained dashboards in Power BI/Tableau to monitor key supply chain metrics such as lead times, order fulfillment, and supplier performance.
Implemented advanced Excel models to predict demand trends based on historical sales data, seasonality, and market analysis.
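The demand-trend models described above can be sketched in a few lines; the following is a minimal Python analogue of a seasonal forecast (trailing mean scaled by a seasonal index), with made-up quarterly figures rather than real sales data.

```python
# Hypothetical sketch of a seasonal demand forecast: deseasonalized mean
# scaled by the next period's seasonal index. All numbers are illustrative.

def seasonal_indices(sales, period=4):
    """Each season's average as a share of the overall mean (e.g. 4 quarters)."""
    overall = sum(sales) / len(sales)
    indices = []
    for s in range(period):
        season_vals = sales[s::period]
        indices.append((sum(season_vals) / len(season_vals)) / overall)
    return indices

def forecast_next(sales, period=4):
    """Forecast the next period: overall mean * that season's index."""
    idx = seasonal_indices(sales, period)
    base = sum(sales) / len(sales)
    next_season = len(sales) % period
    return base * idx[next_season]

# Two years of quarterly sales with a strong Q4 peak.
history = [100, 110, 105, 160, 102, 112, 108, 168]
print(round(forecast_next(history), 1))
```

In a spreadsheet the same logic maps to an AVERAGEIF per season divided by the overall AVERAGE, multiplied back into the base forecast.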
Proven proficiency in data transformations such as Derived Column, Conditional Split, Aggregate, Merge Join, Lookup, Sort, and Execute SQL Task to load data.
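For readers unfamiliar with these SSIS transformations, the sketch below restates a few of them (Derived Column, Lookup, Conditional Split, Sort, Aggregate) as plain Python over in-memory rows; the field names and reference table are hypothetical.

```python
# Illustrative Python analogue of common SSIS Data Flow transformations;
# rows and field names are made up for demonstration.

rows = [
    {"id": 3, "qty": 5, "unit_price": 2.0, "region_code": "E"},
    {"id": 1, "qty": 0, "unit_price": 4.0, "region_code": "W"},
    {"id": 2, "qty": 7, "unit_price": 1.5, "region_code": "E"},
]
region_lookup = {"E": "East", "W": "West"}          # Lookup reference table

for r in rows:
    r["amount"] = r["qty"] * r["unit_price"]        # Derived Column
    r["region"] = region_lookup[r["region_code"]]   # Lookup

valid = [r for r in rows if r["qty"] > 0]           # Conditional Split
valid.sort(key=lambda r: r["id"])                   # Sort

totals = {}                                         # Aggregate: sum by region
for r in valid:
    totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]

print(totals)   # {'East': 20.5}
```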
Experience in automating routine database tasks, such as backups, index maintenance, and deployments, using PowerShell scripts integrated with SQL Server.
Expertise in creating SSIS packages to load data into the database with the required transformations.
Well-versed in handling semi-structured and unstructured data in Azure Blob Storage for data retrieval and efficient storage.
Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
Experience in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.
Extensive experience in designing, developing, and deploying SSIS packages to extract, transform, and load (ETL) data from multiple sources, including flat files, XML, and databases, into SQL Server databases.
Created SSIS Catalog Utility and deployed the SSIS package on various servers.
Validated incoming data against predefined rules and set up alerts for anomalies using Snowflake's SQL functions and constraints.
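The rule-based validation described above can be sketched generically; the following Python version (recast from the Snowflake SQL setting, with hypothetical rule names and thresholds) shows the pattern of checking each record against named rules and collecting alerts for violations.

```python
# Generic sketch of rule-based data validation with anomaly alerts;
# rule names, fields, and allowed statuses are illustrative assumptions.

RULES = {
    "order_id is present": lambda r: r.get("order_id") is not None,
    "qty is positive":     lambda r: isinstance(r.get("qty"), int) and r["qty"] > 0,
    "status is known":     lambda r: r.get("status") in {"NEW", "SHIPPED", "CLOSED"},
}

def validate(record):
    """Return the names of the rules the record violates (empty = clean)."""
    return [name for name, check in RULES.items() if not check(record)]

def alert_on_anomalies(records):
    """Collect one alert message per rule violation across a batch."""
    alerts = []
    for rec in records:
        for violation in validate(rec):
            alerts.append(f"record {rec.get('order_id')}: {violation}")
    return alerts

batch = [
    {"order_id": 101, "qty": 3, "status": "NEW"},
    {"order_id": 102, "qty": -1, "status": "UNKNOWN"},
]
print(alert_on_anomalies(batch))
```

In Snowflake itself, the equivalent checks would typically live in `NOT NULL`/`CHECK`-style constraints or scheduled validation queries that feed an alerting task.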
Developed a dynamic pipeline within Azure Data Factory for data extraction and orchestration.
Built and published customized interactive reports and dashboards using Tableau Desktop and Qlik Sense Desktop.
Demonstrated expertise in leveraging Azure Databricks for distributed data processing, transformation, validation, cleansing, and ensuring data quality and integrity.
Possess hands-on experience working with APIs and web services.
Experienced with Azure Repos, Azure Functions, and Git repositories.
Implemented copy activity using custom Azure Data Factory pipelines.
Worked on bug fixes and performance enhancements, and supported production runs.
Excellent communication and reporting skills.
Exceptionally well-qualified Development Engineer with sophisticated technical skills and a passion for resolving complex problems or challenges through innovation.
Ability to meet deadlines, handle pressure, and complete tasks on time.
A self-learner, a decisive problem-solver.
Overview
10 years of professional experience
1 Certification
Work History
Senior Data Analyst
STAR COMPLIANCE
09.2021 - Current
Gather business requirements from stakeholders and transform them into reporting requirements
Implement a strategy to source all data requirements directly from third-party data platforms
Develop and maintain the data dictionary/document metadata and other relevant documentation for the Data Hub
Perform SQL development, including creation of complex stored procedures and performance tuning
Define and document features using JIRA to manage the product backlog, user stories, and acceptance criteria, and to track task details
Analyzed large datasets to identify trends, patterns, and anomalies in supply chain operations, leading to optimized inventory levels and reduced costs
Utilized joins and subqueries to simplify complex queries involving multiple tables while optimizing procedures and triggers used in production
Designed and developed ETL pipelines to extract, transform, and load data from various sources, ensuring seamless integration into target databases or data warehouses
Automated data loading tasks using SQL Server Integration Services (SSIS) and custom scripts, reducing manual intervention and increasing overall process efficiency
Managed ETL process scheduling and orchestration using tools like SQL Server Agent, Airflow, or Azure Data Factory, ensuring timely and reliable data delivery
Design and build data pipelines to efficiently process and transfer data
Utilize Azure Data Factory for data integration and orchestration
Develop and maintain code in Python for data processing and analysis
Performed data analysis and data profiling using SQL on various source systems, including Oracle and Teradata
Wrote SQL scripts to test mappings and developed a traceability matrix mapping business requirements to test scripts, ensuring that requirement changes lead to test-case updates
Implement and optimize solutions within the Azure Data stack
Developed and maintained dashboards in Power BI/Tableau to monitor key supply chain metrics such as lead times, order fulfillment, and supplier performance
Develop KNIME workflows to load data into Teradata databases with required transformations
Identified bottlenecks in the supply chain using data modeling and provided actionable insights that reduced cycle time by 10%
Create mock Reports and Dashboard wireframes based on requirements
Automated data collection and reporting processes using Python, reducing manual effort by 40% and increasing data accuracy
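As a flavor of the Python report automation described above, the sketch below aggregates a small supplier-delivery dataset into an on-time-rate summary; the CSV content and column names are made up for illustration.

```python
# Minimal sketch of automated reporting in Python: read raw rows,
# aggregate per supplier, and emit a summary. Data is illustrative.

import csv
import io

RAW = """supplier,on_time
Acme,1
Acme,0
Globex,1
Globex,1
"""

def on_time_rate(csv_text):
    """Per-supplier on-time delivery rate from a supplier,on_time CSV."""
    totals, hits = {}, {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        s = row["supplier"]
        totals[s] = totals.get(s, 0) + 1
        hits[s] = hits.get(s, 0) + int(row["on_time"])
    return {s: hits[s] / totals[s] for s in totals}

print(on_time_rate(RAW))   # {'Acme': 0.5, 'Globex': 1.0}
```

A scheduled job running a script like this (against real extracts instead of an inline string) is what replaces the manual collection step.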
Make changes to Reports/Dashboards based on feedback from users in UAT
Designed and implemented key performance indicators (KPIs) to monitor and improve supply chain performance, driving continuous improvement initiatives
Prepare user guides to help end-users analyze data using Reports and Dashboards
Ability to multitask and meet strict deadlines
Prepare technical and design specification documents
Utilize GitHub for source code management and version control
Integrated Fabric workspaces with a Git repository for code sync-up and deployed code to PROD.
Senior Data Analyst
LORVENKA GLOBAL
05.2016 - 08.2021
Provided specialist data analysis skills and support as required for key Identity and Access Management initiatives
Uplifted the existing monitoring and reporting capability across Staff Identity services
Matured and expanded the Identity Analytics capability within Staff Identity
Provided identity and access data extracts and reports for other asset/project teams
Provided identity and access data and reports for monitoring and audit purposes
Created SSIS packages to load data from Different Source Systems to Staging and Staging to Data Warehouse
Worked on complex stored procedures to implement required business logic
Worked with various Control Flow items (For Each loop, Sequence containers, Execute SQL Task, File System task, Execute Package Task, and Send Mail Task) and Data Flow items (Flat File Source, Excel Source, OLEDB source, OLEDB Destination, Derived Column, Sort, Conditional Split, and Row Count)
Integrated data from multiple sources (ERP, WMS, TMS) to create a unified view of the supply chain, enhancing decision-making capabilities
Performed complex data transformations and cleansing, including table joins, filtering, aggregation, and data quality checks, to meet business requirements and ensure data accuracy
Generated reports through MDX queries using various MDX functions
Generated SSRS Parameterized Reports using MDX Queries
Developed reports like Tabular, Matrix and Chart with various features like Drilldown, Drill through, Sub Reports, and Parameterized reports using SSRS
Developed PySpark code to call the API using the requests library
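A hedged sketch of the requests-based API call mentioned above: the endpoint and payload shape are hypothetical, and in a PySpark job a function like this would typically run inside a UDF or `mapPartitions` against a `requests.Session`. The JSON-parsing step is split out so it can be exercised without a live endpoint.

```python
# Hypothetical sketch of calling a REST API with the requests library;
# endpoint path and response fields are assumptions, not the real service.

import json
# import requests  # assumed available in the cluster environment

def fetch_status(session, base_url, order_id):
    """GET one order's status; `session` is a requests.Session."""
    resp = session.get(f"{base_url}/orders/{order_id}", timeout=10)
    resp.raise_for_status()
    return parse_status(resp.text)

def parse_status(body):
    """Pull the fields the pipeline needs from the JSON response."""
    data = json.loads(body)
    return {"order_id": data["id"], "status": data["status"]}

# The parsing step can be tested without a live endpoint:
print(parse_status('{"id": 7, "status": "SHIPPED"}'))
```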
Analyzed data to identify and manage risks involved in Identity and Access Management
Understood user requirements for reports and gathered the required data from multiple systems
Developed SSIS packages to load data into the database with required transformations
Implemented logging functionality to trace package execution
Used XML package configuration to update the connections in the SSIS package
Created mock reports and dashboard wireframes based on requirements
Developed reports and dashboards per business requirements using SSRS, Power BI, and Tableau
Managed reports and user permissions on the Tableau server and in Power BI workspaces/apps
Made changes to the Reports/Dashboards based on the feedback from users in UAT
Scheduled Data Factory pipelines per business requirements and monitored the jobs on a regular basis in PROD
Maintained a Git repository for Databricks code, Data Factory pipeline code, and SQL scripts.
Data Analyst
Wells Fargo India Solutions
05.2014 - 05.2016
Gathered user requirements, analyzed source systems, and source data
Developed SSIS packages to load data into staging tables with required transformations
Implemented logging functionality to trace package execution
Used Event Handlers to handle the errors during the execution
Used XML package configuration to update the connections in the SSIS package
Developed Tabular Models using SSAS
Develop SSIS packages for data loading and transformations into staging tables, implement logging, error handling, XML configurations, and SSIS catalog utility, and deploy and schedule SSIS package executions across servers using SQL Server Agent
Created mock reports and dashboard wireframes based on requirements
Created complex MDX queries to fetch data from Tabular Model for Reporting
Developed reports and dashboards per business requirements using SSRS, Tableau, and Power BI
Reconciled reports with the data in source systems
Used TFS for source code management and version control
Deployed Reports and Dashboards to different SharePoint sites (on Dev, SIT, UAT, and Prod)
Made changes to the Reports/Dashboards based on the feedback from users in UAT
Solid understanding of data warehousing concepts and big data technologies
Prepared a user guide to help the end users in analyzing the data using Reports and Dashboards.
Education
Bachelor of Technology in Electronics & Communication Engineering -
JNTUK
Skills
Cloud Platform: Azure, Snowflake, AWS
Databases: MS SQL Server, Azure SQL, Cosmos DB, Teradata, PostgreSQL