TEJASWINI GURRAM

Flagstaff

Summary

6+ years of experience in Data Modeling, Data Analysis, Database Design, and SQL Development in the IT, Finance, Retail, and Software domains. Strong command of the Software Development Life Cycle (SDLC) with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling. Expert in writing and optimizing SQL queries in Oracle and SQL Server. Excellent knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches. Extensive experience analyzing huge volumes of data in industries such as Finance, Healthcare, and Retail. An excellent team player and technically strong professional, capable of working with business users, project managers, team leads, architects, and peers to maintain a healthy project environment.

Overview

5 years of professional experience

Work History

Data Analyst

Cyient
11.2020 - 04.2023
  • Performed data cleansing and made the data available for the data scientists to score their models.
  • Developed a data quality framework using shell scripts, Python, and UNIX scripts.
  • Tested the compatibility of different open-source software with existing systems.
  • Worked in the Microsoft Azure environment (Blob Storage, Data Lake, AzCopy) using Hive as the extraction language.
  • Utilized Azure Data Factory for transforming and moving data from virtual machine to Data Factory, BLOB storage, and SQL Server.
  • Designed and assisted in developing and customizing the scripts for Quality Framework using Python.
  • Analyzed the data and created models to store data in the big data environment.
  • Established and deployed pipelines and activities with datasets and linked services according to the business requirements.
  • Used statistical sampling methods to perform data analysis of the dataset using Python.
  • Reviewed data for master data management, maintained existing data models, and wrote queries to determine the quality and consistency of the data in the DB2 database.
  • Worked on decrypting and encrypting files.
  • Performed performance tuning of SQL Server after loading data into tables.

Data Analyst

Cryptograph Technologies
08.2018 - 05.2020
  • Interacted with business users to clarify the business logic required for the data models.
  • Involved in gathering complete requirements by organizing and managing meetings with business analysts, data stewards and subject matter experts on a regular basis.
  • Analyzed the OLTP source systems and the Operational Data Store, and researched the tables/entities required for the project.
  • Designed the measures, dimensions, and facts matrix document to ease the modeling work.
  • Analyzed the specifications and identified the source data from disparate data sources such as Oracle, MS SQL Server, and flat files that needed to be moved to the data warehouse.
  • Designed and maintained the Logical & Physical dimensional data models using ER/Studio.
  • Generated the DDL statements and worked with the database team to create the tables, views, and keys in the database.
  • Managed the metadata for the subject-area models in the data warehouse environment.
  • Performed Data profiling and identified the risks involved with Data integration to avoid time delays in the project.
  • Performed data scrubbing to remove incomplete and irrelevant data, and maintained consistency in the target data warehouse by cleaning dirty data.
  • Worked on specifications given by the data governance team and data quality team that required managing the master data from all the business units and ensuring data quality across the enterprise.
  • Validated the models with production data and delivered the Source-to-Target mapping matrix to the ETL developers as the blueprint for designing and developing the data-loading mappings.
  • Performed extensive Excel work using pivot tables and complex formulas to manipulate large data structures.
  • Performed data cleansing for accurate reporting; thoroughly analyzed data and integrated different data sources to process matching functions.
  • Conducted frequent meetings with the ETL coding and development team to coordinate the process and efficiently organize and distribute the workflow among the team.

Education

Master's - Computer Science

Northern Arizona University
Flagstaff, AZ
12.2024

Skills

  • Windows
  • UNIX
  • Linux
  • MS-DOS
  • Erwin Data Modeler
  • Power BI
  • ER/Studio
  • Informatica
  • DataStage
  • MS Office
  • MS Project
  • MySQL
  • DB2
  • MS SQL
  • Access
  • Oracle
  • Teradata
  • Snowflake
  • Python
  • PostgreSQL

Timeline

Data Analyst

Cyient
11.2020 - 04.2023

Data Analyst

Cryptograph Technologies
08.2018 - 05.2020

Master's - Computer Science

Northern Arizona University