Around 4 years of experience as a Data Analyst with strong skills in analyzing data using SQL Server, Python, Tableau, SAS, Data Management, and Microsoft Excel.
Coordinated UAT sessions, guiding end users through the testing process and gathering input for continual product improvement.
Worked on SDLC (Software Development Life Cycle) models such as Agile and Waterfall.
Experienced in using Agile approaches such as Scrum and Kanban to manage projects successfully, improve team collaboration, and deliver value incrementally.
Developed complex SQL queries and PL/SQL scripts to extract, transform, and load data from diverse sources into Oracle databases, improving data accuracy and usability.
Conducted Scrum Meetings and JAD Sessions with representatives and Project Managers to break down functional and non-functional requirements.
Skilled in extracting relevant insights from large data sets using advanced Excel functions such as VLOOKUP, INDEX-MATCH, PivotTables, and conditional formatting.
Implemented FHIR standards for healthcare data exchange, enabling seamless interoperability between disparate systems and improving data accessibility.
Developed SQL scripts for data extraction, transformation, and loading (ETL) operations, maintaining data integrity across the organization.
Designed and built Extract, Transform, and Load (ETL) processes for OMOP data, enabling the integration of disparate healthcare datasets.
Used Python libraries such as NumPy, pandas, and matplotlib for data processing, analysis, and visualization, delivering actionable insights for decision-making.
Developed data transformation rules and logic for the migration process to preserve data quality and integrity during the Snowflake transition.
Used MongoDB within Databricks notebooks to query and manipulate data, performing aggregation, filtering, and join operations for analysis.
Databases: SQL Server, MySQL, MS Access