
Puja Roy

Montclair, NJ

Summary

Data Analyst and Report Developer at Montclair State University since March 2019, with end-to-end experience in software requirement analysis, estimation, documentation, ETL development, execution and management, defect resolution, BI reporting, implementation/deployment, support/enhancement, and maintenance.

  • In-depth knowledge of Extract, Transform and Load (ETL) development using SQL Server Management Studio (SSMS), SQL Server Integration Services (SSIS), and Oracle SQL Developer.
  • Designed conceptual data models from requirements; interacted with non-technical end users to understand business logic; modeled logical and physical diagrams of the future state; delivered BRDs and low-level design documents.
  • Developed data models, data flows, and data mappings with the development team; integrated multiple data models into a single unified model.
  • Translated and assembled business requirements into detailed, production-level technical specifications detailing new features and enhancements to existing business functionality.
  • Experienced in database design, including fact and dimension concepts; created data models for OLTP (online transaction processing) and OLAP (online analytical processing) systems; efficient in implementing normalization to 3NF and de-normalization techniques for optimum performance in relational and dimensional database environments.
  • Experienced in debugging, bug fixing, query optimization, and performance tuning; unit tested T-SQL and PL/SQL code by creating test data.
  • Software Development Life Cycle experience with Waterfall and Agile methodologies.
  • Reporting experience with SAP Business Objects, IBM Cognos Analytics, and SAS Enterprise Guide; create and refine data visualizations and interactive dashboards using Tableau, Power BI, SQL Server Reporting Services, and other software.
  • Engaged in data profiling to integrate data from different sources; performed gap and impact analyses.
  • Assisted DBAs with support needs and provided guidance on architectural issues; partnered with IT staff to develop efficient data structures, provide access to data, and participate in data conversion and migration efforts.
  • Worked closely with clients from academic and administrative units to define requirements and solutions and provide decision-support consultations; collaborated with financial analysts on complex analyses; participated in the maintenance and development of online web reporting capabilities.
  • Documented best practices, processes, utilities, and reports; experienced with JIRA, Confluence, and ServiceNow.
  • Facilitated QA in developing test plans and test cases for unit, system, and enterprise testing; collaborated on source-to-target data mapping documents and data quality assessments for source data.
  • Leveraged analytics on large data sets: data exploration, data cleaning, and implementation of advanced statistical and machine learning algorithms.
  • Troubleshot production issues within SLA; excellent oral and written communication skills; adaptable to change, with an analytical approach to problem solving and the ability to handle multiple assignments with critical deadlines and changing priorities.
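The normalization-to-3NF work described above can be illustrated with a minimal sketch. All table and column names here are hypothetical, and SQLite stands in for the production SQL Server/Oracle databases:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized source: customer_name is repeated on every order row.
# This violates 3NF, since customer_name depends on customer_id,
# not on the table's key (order_id).
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    customer_name TEXT,
    amount REAL)""")
cur.executemany(
    "INSERT INTO orders_flat VALUES (?,?,?,?)",
    [(1, 10, "Acme", 99.0), (2, 10, "Acme", 45.0), (3, 20, "Beta", 12.5)])

# Normalized (3NF): every non-key attribute depends only on its table's key.
cur.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY, customer_name TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)""")
cur.execute("""INSERT INTO customer
               SELECT DISTINCT customer_id, customer_name FROM orders_flat""")
cur.execute("""INSERT INTO orders
               SELECT order_id, customer_id, amount FROM orders_flat""")

# A join reproduces the original flat view without the redundancy.
rows = cur.execute("""SELECT o.order_id, c.customer_name, o.amount
                      FROM orders o JOIN customer c USING (customer_id)
                      ORDER BY o.order_id""").fetchall()
print(rows)  # [(1, 'Acme', 99.0), (2, 'Acme', 45.0), (3, 'Beta', 12.5)]
```

De-normalization for reporting performance is the reverse trade-off: the flat shape is kept deliberately so dashboards avoid the join cost.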

Overview

14 years of professional experience
1 Certification

Work History

Data Analytics and Report Engineer

Montclair State University
03.2019 - Current
  • Assisting in the documentation of requirements and specifications
  • Create visualizations in Tableau, identifying patterns and meaningful insights in the extracted data
  • Manage Tableau Server
  • Tuning SQL queries to improve performance; examining glitches in business processes and resolving them in Tableau
  • Experience with Tableau desktop with a strong understanding of Tableau architecture and administration.
  • Develop, manage and support migration of objects (dimensions, measures, hierarchies, reports, workbooks, dashboards), from development to production environments.
  • Providing hands-on training to a small group of reporting staff on a sustained basis to transfer report/dashboard development skills.
  • Facilitating change and innovation to legacy systems and processes
  • Transforming higher-education organizations through digital and cloud strategies.
  • Automate business workflows across cloud apps using Workato.
  • Design, build and enhance integrations with various systems of Workato to ensure data integrity.
  • Oracle programming through Oracle SQL Developer.
  • Prepare management reports using Cognos Report Studio.
  • Prepare designs and code for Cognos reporting objects, ensuring compliance with industry best practices.
  • Troubleshoot Cognos helpdesk tickets, resolve content queries, and prepare standard reports while ensuring their accuracy.
  • Created the enhanced logical model in 3NF using ER/Studio, resolving many-to-many relationships between entities with associate tables.
  • Performed Normalization of the existing OLTP systems (3rd NF), to speed up the DML statements execution time.
  • Database programming in Oracle SQL through Oracle SQL Developer which includes the concept of Join, Cursors, Ref-cursors, Procedures, Functions, Partitioned Tables, Triggers, Table Indexing for Extract, Transform and Load (ETL) of source data.
  • Tools: Oracle 12c, Tableau, Tableau Prep, Cognos Analytics, Oracle ODS, Navigate for Colleges and Universities, Workato
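As an illustration of the associate-table pattern mentioned above (resolving a many-to-many relationship in a 3NF model), here is a minimal sketch; the student/course schema is hypothetical, with SQLite standing in for Oracle:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE student (student_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE course  (course_id  INTEGER PRIMARY KEY, title TEXT);
-- Associate (bridge) table: one row per (student, course) pair,
-- turning the many-to-many link into two one-to-many links.
CREATE TABLE enrollment (
    student_id INTEGER REFERENCES student(student_id),
    course_id  INTEGER REFERENCES course(course_id),
    PRIMARY KEY (student_id, course_id));
INSERT INTO student VALUES (1, 'Ann'), (2, 'Bob');
INSERT INTO course  VALUES (10, 'SQL'), (20, 'ETL');
INSERT INTO enrollment VALUES (1, 10), (1, 20), (2, 10);
""")

# Joining through the associate table recovers the pairs.
pairs = cur.execute("""
    SELECT s.name, c.title
    FROM enrollment e
    JOIN student s USING (student_id)
    JOIN course  c USING (course_id)
    ORDER BY s.name, c.title""").fetchall()
print(pairs)  # [('Ann', 'ETL'), ('Ann', 'SQL'), ('Bob', 'SQL')]
```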

Technical Support Engineer and Data Analyst

GEP Worldwide
01.2018 - 01.2019
  • Analyse concerns raised by clients about the company's procurement product and provide resolutions by executing scripts at the back end.
  • Transforming procurement organizations through digital and cloud strategies.
  • Independently create data addition/update scripts for clients' UAT and production sites.
  • Independently conduct root cause analysis of issues and take appropriate actions to resolve the issues.
  • Coordinate with QC, development, and TSO team members to complete data setup tasks.
  • Collaborating with the existing technology team to understand and leverage current technology components available under existing products.
  • Work flexible hours, especially during product launches and production issues.
  • Write effective, scalable code with Python libraries, including Matplotlib, NumPy, Pandas, etc.
  • Created predictive model using Microsoft Azure.
  • Tools: SQL Server Management Studio and Microsoft Azure, Python
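The Pandas/NumPy work mentioned above can be sketched minimally; the ticket data and column names are hypothetical examples of the kind of analysis involved:

```python
import numpy as np
import pandas as pd

# Hypothetical support-ticket data with a missing resolution time.
df = pd.DataFrame({
    "ticket_id": [101, 102, 103, 104],
    "category":  ["invoice", "invoice", "po", "po"],
    "hours_to_resolve": [4.0, 6.0, np.nan, 2.0],
})

# Fill the missing value with the overall median (a simple imputation
# choice), then summarize mean resolution time per category.
df["hours_to_resolve"] = df["hours_to_resolve"].fillna(
    df["hours_to_resolve"].median())
summary = df.groupby("category")["hours_to_resolve"].mean()
print(summary)  # invoice: 5.0, po: 3.0
```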

Data Warehouse Analyst-Data Modeler & Lead SQL Developer (Offshore and Onshore)

General Electric through Tata Consultancy Services Ltd.
01.2012 - 01.2017
  • Actively participated in the requirement gathering from different stakeholders, project planning, effort estimation, design, and development of ETL framework and Quality assurance, emphasizing on automation and streamlining of the ETL process.
  • Attended daily status calls, provided input on day-to-day activities, and shared thoughts on important decisions.
  • As a Data Warehouse Analyst, the responsibility was to assume a key role in day-to-day support of Business Intelligence tools, Decisions and Reporting solutions, Business Objects, and underlying ETL framework.
  • Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF)
  • Involved in analysis and cleansing of data. Developed reporting tools for use by other Transmission staff.
  • Reviewed the logical model with application developers, ETL Team, DBA's, and testing team to provide information about the Data Model and business requirements.
  • Prepared logical data models using ERwin, comprising diagrams and supporting documents that capture the essential business elements, detailed definitions, and descriptions of the relationships between data elements, to analyse and document business data requirements.
  • Analysed large data sets with basic statistical methods, interpreted results, and provided written summaries of the analysis.
  • Collaborated on the data mapping document from source-to-target and the data quality assessments for the source data.
  • Database programming in T-SQL through SQL Server Management Studio which includes the concept of Join, Cursors, Ref-cursors, Procedures, Functions, Partitioned Tables, Triggers, Table Indexing for Extract, Transform and Load (ETL) of source data.
  • Reported on source data using SQL Server Integration Services (SSIS), SAP Business Objects XI 3.1, and SAS Enterprise Guide 9.4.
  • Wrote complex queries (aggregating, truncating, dropping, and modifying tables) and built joins and source queries to pull data from other sources into the data warehouse via full and incremental loads.
  • Built the DW environment using SQL Server tables with terabytes of data; using SQL Server Integration Services (SSIS) as the ETL tool, data was extracted, transformed, and loaded from Excel and flat files into MS SQL Server.
  • Designed scalable, flexible, and affordable Data warehouse models, created surrogate keys, and worked with bridge tables.
  • Facilitated QA for developing test plans and test cases for unit, system, and Enterprise testing.
  • Documented monthly status reports for enhancement and modification requests for the development team to assist them in efficient tracking and monitoring of open issues of the project.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models
  • Created the data models for OLTP (online transaction processing) and OLAP (online analytical processing) systems.
  • User creation for SAS Enterprise Guide 9.4 and SAS Enterprise Guide 9.1
  • Library creation both ODBC and Flat file for SAS Enterprise Guide 9.4 and SAS Enterprise Guide 9.1
  • Creation of jobs in SQL Server Management Studio, SQL Server Integrated Services and SAS Enterprise Guide.
  • Created Web Intelligence reports by creating objects (dimensions and measures) and universes in SAP Business Objects.
  • Tools: SQL Server Management Studio, SQL Server Integration Services, SAP Business Objects XI 3.1 and SAS Enterprise Guide 9.4
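The full vs. incremental load pattern referenced above can be sketched as follows; the transaction tables and watermark column are hypothetical, with SQLite standing in for SQL Server:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE source_tx (tx_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
CREATE TABLE dw_tx     (tx_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
INSERT INTO source_tx VALUES
    (1, 10.0, '2016-01-01'), (2, 20.0, '2016-01-02'), (3, 30.0, '2016-01-03');
""")

def incremental_load(cur):
    # Watermark: the latest updated_at already present in the warehouse.
    # Only rows newer than it are pulled, so repeat runs are cheap.
    (watermark,) = cur.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM dw_tx").fetchone()
    cur.execute("""
        INSERT INTO dw_tx
        SELECT tx_id, amount, updated_at FROM source_tx
        WHERE updated_at > ?""", (watermark,))

incremental_load(cur)  # first run behaves like a full load (3 rows)
cur.execute("INSERT INTO source_tx VALUES (4, 40.0, '2016-01-04')")
incremental_load(cur)  # second run picks up only the new row
count = cur.execute("SELECT COUNT(*) FROM dw_tx").fetchone()[0]
print(count)  # 4
```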

Education

Bachelor of Technology (B-Tech)

West Bengal University of Technology

Master of Science (M.S.) - Business Analytics

Montclair State University

Skills

  • Software: SQL Server Management Studio (2005, 2008 R2 & 2012), Microsoft Business Intelligence Development Studio (2005, 2008 R2 & 2012), SAP Business Objects XI 3.1, SAS Enterprise Guide 9.1, SAS Enterprise Guide 9.4, IBM Cognos Analytics
  • Tools: Analytical Tools - SQL Server Management Studio (2005, 2008 R2 & 2012), R, Oracle SQL Developer; ETL Tool - Microsoft Business Intelligence Development Studio (2005, 2008 R2 & 2012); Reporting Tools - Tableau, Microsoft Business Intelligence Development Studio (2008 R2 & 2012), SAP Business Objects XI 3.1, SAS Enterprise Guide 9.1, SAS Enterprise Guide 9.4, IBM Cognos Analytics; Visualization Tool - Tableau; Cloud Services - Workato
  • Methods: Waterfall and Agile methodologies of Software Development Life Cycle
  • Domain Experience: Mortgage Loan Business, Banking, Higher Education

Accomplishments

  • Reduced reporting latency by 60% by redesigning ETL workflows in SQL and automating data refresh schedules via Python scripts and cron jobs.
  • Delivered 40+ executive dashboards in Tableau and Power BI that provided near real-time insights into financial and operational KPIs.
  • Built scalable data pipelines integrating data from Workday, Salesforce, and Banner into centralized analytics marts using SQL and SSIS.
  • Optimized complex queries and introduced materialized views, reducing runtime from hours to minutes on large financial datasets.
  • Collaborated with business stakeholders to define data quality metrics, increasing trust in analytics deliverables and enabling faster decision-making.
  • Implemented error-handling and audit logs for ETL jobs, reducing manual data correction effort by 35%.
  • Migrated 15+ legacy reports from Cognos to Tableau, introducing dynamic parameters and row-level security for secure data access.
  • Automated data validation checks with Python and SQL scripts, preventing recurring reconciliation issues between source and reporting systems.
  • Trained 20+ functional users on self-service analytics using Tableau and Workday reporting tools, improving report turnaround time.
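The automated source-to-reporting validation described above can be sketched minimally; the payment tables and thresholds are hypothetical, with SQLite standing in for the production systems:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE src_payments (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE rpt_payments (id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO src_payments VALUES (1, 100.0), (2, 250.0), (3, 75.0);
INSERT INTO rpt_payments VALUES (1, 100.0), (2, 250.0);  -- row 3 missing
""")

def reconcile(cur, src, rpt):
    """Compare row counts and amount totals between a source table
    and its reporting copy; return a list of mismatch descriptions."""
    s_count, s_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    r_count, r_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {rpt}").fetchone()
    issues = []
    if s_count != r_count:
        issues.append(f"row count mismatch: {s_count} vs {r_count}")
    if s_sum != r_sum:
        issues.append(f"amount total mismatch: {s_sum} vs {r_sum}")
    return issues

issues = reconcile(cur, "src_payments", "rpt_payments")
print(issues)  # flags both the missing row and the total discrepancy
```

A check like this, scheduled after each load, surfaces reconciliation problems before reports reach end users.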

Certification

  • Microsoft Certified Professional - 70-461
  • Big Data & Hadoop
  • BI SAS Foundation
  • Cloud Computing

Languages

English
Native or Bilingual

Timeline

Data Analytics and Report Engineer

Montclair State University
03.2019 - Current

Technical Support Engineer and Data Analyst

GEP Worldwide
01.2018 - 01.2019

Data Warehouse Analyst-Data Modeler & Lead SQL Developer (Offshore and Onshore)

General Electric through Tata Consultancy Services Ltd.
01.2012 - 01.2017

Master of Science (M.S.) - Business Analytics

Montclair State University

Bachelor of Technology (B-Tech)

West Bengal University of Technology