Self-directed, proactive Data Analyst skilled in requirements analysis, software development, and database management, with 5+ years of experience collecting, cleaning, and interpreting data sets. Natural problem-solver with a strong cross-functional understanding of information technology and business processes.
Overview
6 years of professional experience
Work History
Data Analyst
EUniverse Technologies LLC
09.2023 - Current
Designed and deployed data visualizations using Tableau Desktop
Created workbooks by importing data and defining relationships across Excel, Oracle, and MySQL sources
Tested, cleaned, and standardized data to meet business standards using Execute SQL Task, Conditional Split, Data Conversion, and Derived Column transformations in different environments
Wrote complex SQL queries involving joins, date functions, inline functions, and subqueries to generate reports
Applied data warehouse architecture and design procedures, including ETL and workflow development, data cleansing, and data quality validation
Used advanced Tableau features including calculated fields, parameters, table calculations, row-level security, R integration, joins, data blending, and dashboard actions
Designed, developed, and deployed reports in the MS SQL Server environment using SSRS 2008
Created complex stored procedures and functions and called them directly from reports to generate output on the fly
Identified performance bottlenecks in SSIS packages through in-depth analysis of ad hoc T-SQL queries, stored procedures, and functions
Fine-tuned ETL procedures for optimal performance, reducing processing times
Implemented data quality checks within the data warehousing environment to ensure the consistency, accuracy, and integrity of data
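The data quality checks described in the bullets above can be sketched as follows; this is a minimal illustration using SQLite in place of the warehouse, and the `orders` table and its columns are hypothetical:

```python
import sqlite3

# Minimal sketch of warehouse data-quality checks: completeness (nulls)
# and uniqueness (duplicate business keys). SQLite stands in for the
# warehouse; table and column names are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (2, None, 50.0), (3, "zenith", 75.0), (3, "zenith", 75.0)],
)

# Check 1: rows with a missing customer (completeness)
null_rows = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer IS NULL"
).fetchone()[0]

# Check 2: order_ids appearing more than once (uniqueness)
dup_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT order_id FROM orders "
    "GROUP BY order_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_rows, dup_keys)  # counts of rows/keys flagged for remediation
```

In practice such checks would run inside the ETL flow (for example as Execute SQL Tasks) and route failing rows to a quarantine table rather than just counting them.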
Graduate Assistant
Texas Tech University
Lubbock, Texas
10.2021 - 05.2023
Analyzed datasets using Python libraries, built bespoke tables in Power BI with SQL queries, and visualized preliminary results through interactive dashboards
Designed and implemented ETL (Extract, Transform, Load) pipelines for seamless data flow between systems
Developed data transformation scripts using Apache Spark, Python, and SQL
Designed and built data warehouses to support efficient querying and reporting
Implemented snowflake schema designs for optimal data storage and retrieval
Drafted SQL queries to construct custom tables in Power BI, serving as data sources for visually appealing, informative dashboards
Managed and optimized database systems, ensuring high performance and scalability
Performed database tuning and optimization for efficient query execution
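The snowflake schema design mentioned above can be sketched with a toy example; SQLite stands in for the warehouse, and all table and column names are hypothetical:

```python
import sqlite3

# Minimal snowflake-schema sketch: the product dimension is normalized
# into a separate category table (that extra normalization is what makes
# it a snowflake rather than a star schema). Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY, name TEXT,
                           category_id INTEGER REFERENCES dim_category);
CREATE TABLE fact_sales   (sale_id INTEGER PRIMARY KEY,
                           product_id INTEGER REFERENCES dim_product,
                           amount REAL);
INSERT INTO dim_category VALUES (1, 'electronics'), (2, 'books');
INSERT INTO dim_product  VALUES (10, 'laptop', 1), (20, 'novel', 2);
INSERT INTO fact_sales   VALUES (100, 10, 999.0), (101, 20, 15.0), (102, 10, 899.0);
""")

# Typical reporting query: revenue by category, walking the snowflake joins
rows = conn.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p  ON f.product_id = p.product_id
    JOIN dim_category c ON p.category_id = c.category_id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('books', 15.0), ('electronics', 1898.0)]
```

The trade-off is the usual one: a snowflake saves storage and keeps dimension attributes consistent at the cost of an extra join per normalized level.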
Data Analyst
Matrimony.com
Chennai, India
06.2018 - 05.2021
Conducted data extraction and manipulation using Python and SQL to gather relevant information from various databases, APIs, and flat files for analysis
Created and automated SQL scripts to process, transform, and load data from various sources into data warehouse systems
Developed and optimized complex SQL queries, joins, subqueries, and aggregations to analyze large datasets and provide valuable insights for decision-making
Applied advanced statistical methods for ad hoc analysis, including clustering, segmentation, and trend analysis, uncovering patterns and relationships that contributed to process optimization
Created interactive, informative data visualizations using Amazon QuickSight to present findings to stakeholders
Used Python libraries such as Pandas, NumPy, and Matplotlib for advanced data analysis, including statistical analysis, trend forecasting, and data visualization, presenting actionable insights to stakeholders and driving data-driven decision-making
Designed and executed A/B tests to evaluate the effectiveness of marketing campaigns, resulting in a 15% increase in conversion rates and providing actionable insights for campaign optimization
Extracted data from various source systems and established tables and schemas in the AWS Glue Data Catalog by creating Glue Crawlers
Automated processes using AWS Step Functions, CloudFormation, Lambda, and CI/CD with Azure DevOps, reducing manual effort by 50%
Developed PySpark scripts for data aggregation, queries, and writes back to S3, using DataFrames and Spark SQL for transformations
Leveraged AWS serverless services such as Step Functions, Lambda, Glue, Redshift Spectrum, Athena, and CloudWatch
Collaborated closely with Business Analysts, data modelers, and visualization teams to achieve the desired output, incorporating data warehousing concepts and schema design
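The A/B test evaluation mentioned above can be sketched with a two-proportion z-test, a standard way to compare conversion rates between campaign variants; the counts below are made up for illustration, not results from the actual campaigns:

```python
from math import sqrt, erf

# Hedged sketch of A/B test evaluation via a two-proportion z-test.
# conv_* are conversion counts, n_* are sample sizes; all values here
# are hypothetical.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 17% vs. control's 12% on 1,000 users each
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=170, n_b=1000)
print(f"z={z:.2f}, p={p:.4f}")
```

A p-value below the chosen significance level (commonly 0.05) would support rolling out the winning variant; in practice one would also check sample-size/power requirements before the test.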