
SHILPA T

Plano, TX

Summary

8+ years of overall experience as an IT Data Analyst. Expertise in managing and maintaining database objects such as tablespaces, tables, indexes, sequences, materialized views, and partitions, in tuning queries, and in writing stored procedures and functions per business needs. Expertise in Db2 12 for z/OS, IBM DB2 utilities, IBM DB2 performance management and tuning, and Structured Query Language (SQL). Proficient with ETL tools such as Informatica and with the Db2 Administration Tool for z/OS. 2 years of experience as an Azure Data Engineer, with extensive hands-on experience implementing data processing using Azure services: Azure Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Event Hubs, Azure Stream Analytics, Azure Analysis Services, and HDInsight.

Overview

9 years of professional experience

1 Certification

Work History

Azure Data Engineer

INFOWEB SYSTEMS INC, USA
02.2022 - Current
  • Experience working with database/storage technologies such as SQL, data lakes (Azure Data Lake), and Cosmos DB
  • Experience designing and developing databases and data pipelines
  • Designed and implemented an efficient, scalable data engineering framework that powers the company's key data applications, such as experimentation and metrics reporting
  • Experience with Azure cloud technologies for creating data pipelines (Data Factory, Azure Synapse)
  • In-depth understanding of SQL, database management systems, online analytical processing (OLAP), and Extract, Transform, Load (ETL) frameworks.

Data Analyst

COHASH LLC, USA
10.2021 - 01.2022
  • Demonstrated knowledge and experience developing data pipelines to automate data processing workflows
  • Designed and built the integration jobs that move data from existing applications into and throughout the Microsoft cloud environment (Data Lake, Azure SQL, Azure DWH), leveraging common standard integration patterns, tools, and languages (Azure Data Factory, Databricks, Spark SQL, Python, PySpark)
  • Worked with different file formats such as text, Parquet, Delta, JSON, and CSV
  • Good knowledge of Azure (Blob, ADLS, Azure Functions, Log Analytics, Event Hubs, etc.), Databricks, Delta Lake, and notebook development
  • Experience in designing job orchestration (ADF), sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.

IT Data Analyst

INFOWEB SYSTEMS INC, Iowa
07.2018 - 09.2021
  • Managed and clearly assigned tasks, set ground rules and team goals, evaluated progress, and enabled the team to work collaboratively
  • Able to work through complex issues, identify themes, and develop solutions in time-critical situations
  • Assist application teams when necessary with technical advice and diagnosis of SQL errors
  • Perform application DBA functions (maintain and implement database objects, handle image copy & maintenance activities) for an enterprise
  • Participate in the selection process of establishing and implementing backup and recovery policies and procedures and participate in Virtual Teams assembled to architect new software solutions
  • Managing production system monitoring of databases and database-driven applications
  • Maintain and enforce database standards and best practices across multiple sections of the company
  • Provide troubleshooting and problem determination during outage or daily issues
  • Perform basic database tuning (e.g., index design)
  • Conduct database design and performance evaluation reviews
  • Design and develop packages, stored procedures, functions, views, queries and other related jobs to facilitate application access to data
  • Selects and establishes security access best practices and the provisioning model for users and application access to databases
  • Create logical data models for an enterprise and project level where projects are of medium to high complexity and moderate to high in risk, while presenting information to other team members and superiors
  • Ensures databases are protected with a suitable disaster recovery solution
  • Design, implement and support Logical and Physical database design for an Enterprise and/or Project level
  • Work collaboratively across the organization providing support to all Channel specific databases from development to production
  • Participate with team members in analysis of complex database problems and provide recommendations
  • Review and provide analysis to the application developers on the proposed database changes
  • Coordinate with the team members and the application developers in the development and delivery of all database changes assigned to the group
  • Capture all SQL DDL and DCL statements in the correct order using BMC Change Manager for DB2
  • Involved in performing database changes such as creating Tables, Views, and Materialized Query Tables (MQTs), and running ALTER TABLE scripts
  • Trigger an ESP event to reset all identity columns under a schema and capture the DB changes in BMC Change Manager for DB2 for the changes specified by the application team in the CIS DB change request spreadsheet
  • Apply performance optimization techniques such as query tuning by adding indexes, and use utilities to minimize downtime and attain the highest degree of data integrity
  • Consult the Database Architect and senior DBAs on troubleshooting and improving database performance
  • Monitor database backup and recovery for all Channel specific databases
  • Experience designing and implementing the provisioning of data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources to satisfy business needs
  • Expertise in designing and implementing data solutions that use the following Azure services: Azure Cosmos DB, Azure Synapse Analytics, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob Storage
  • Experience with streaming data and use of Azure Event Hub and Synapse streaming
  • Knowledge in developing/optimizing PySpark, and SparkSQL code
  • Worked on data lake implementation, with knowledge of data lake concepts and ETL implementations (both batch and streaming).

Software Engineer

REDBUD TECHNOLOGIES, INC
McKinney
06.2018 - 07.2018
  • Worked extensively on the Agile Methodologies, and Sprint delivery models
  • Created and maintained database objects like Tables, Views, Indexes (B-Tree, Bitmap, and Function-Based), Constraints, Sequences, and Synonyms
  • Extensively worked on Oracle front-end programming and back-end development using tools like PL/SQL, SQL*Plus, and TOAD
  • Effectively made use of Table Functions, Indexes, Table Partitioning, Collections and Analytical functions, Materialized Views, Query rewrite and Transportable table spaces.

Programmer Analyst

SEPALS CORP
Danbury
11.2017 - 06.2018
  • Enhanced, developed and deployed reports based on new requirements for a separate module
  • Exposure to Data flow diagrams, Data dictionary, Database normalization theory techniques, Entity relation modeling and design techniques
  • Wrote stored procedures for cleaning up data and providing underlying structure for reporting
  • Prepared the DDLs for the staging/work tables and coordinated with the DBA to create the development environment and data models
  • Worked with various PL/SQL constructs like ref cursors, BULK COLLECT, PL/SQL collections, and dynamic SQL
  • Resolved production problems for the applications and ensured all support service level agreements were met
  • Extensively worked in the performance tuning of the programs, ETL Procedures and processes.

Data Analyst

FABEC IT
Hamilton
04.2017 - 10.2017
  • Create users and assign permissions based on the level of database access the user would need
  • Resetting the password and unlocking the user accounts
  • Escalating database backup failures to senior DBAs
  • Assist developers/application team on defining database structures
  • Interact with application team on any database related issues
  • Monitoring databases in Oracle Enterprise Manager for long-running queries and critical alerts
  • Coordinated with other DBAs during UAT and PROD deployment changes
  • Conduct meetings with application team and Performance test team during the load testing
  • Support Application teams during releases
  • Knowledge of PostgreSQL.

Programmer Analyst

AVERON SOLUTIONS INC
Edison
01.2017 - 04.2017
  • Extensive experience with data Extraction, Transformation, and Loading (ETL) from heterogeneous data sources across multiple relational databases such as Oracle, SQL Server (T-SQL), IBM DB2, and Teradata
  • Integrated data from VSAM files and flat files such as fixed-width and delimited CSV, as well as XML files
  • Worked with Oracle, PL/SQL Stored Procedures, Functions, Indexes and Triggers and involved in Query Optimization and worked on Teradata SQL, Stored Procedures and Utilities
  • Experience identifying bottlenecks in ETL processes and performance tuning using database tuning, partitioning, index usage, session partitioning, load strategies, commit intervals, transformation tuning, and pushdown optimization.

Trainee Program Analyst

BRAINWARE INC
Bridgewater
08.2016 - 12.2016
  • Analyzed the business through walkthroughs with the mapper to clear up business-related concerns, ensured that business queries were documented in the Work Product Inspection (WPI) form, and obtained sign-off by placing it in Subversion (SVN) before build activities took place
  • Created the Microdesign or Low-Level Design (LLD) document, Source-to-Target Mapping (STM), Unit Test Case (UTC), and sample data documents as references for the build activities
  • Involved in Informatica build activities using Informatica PowerCenter 9.6, such as Change Requests (CRs) and a proof of concept for an unstructured file format; also developed SCD Type-II mappings to build the warehouse in the integration layer and generate reports in OBIEE (11g) dashboards in the semantic layer
  • Used various transformations such as Lookup (connected and unconnected), Aggregator, Joiner, and Sorter to generate SCD Type-II dimensions
  • Documented all queries related to test data and the business needed to build the code in the Work Product Inspection (WPI) form and obtained sign-off from the architect by placing it in Subversion (SVN)
  • Created the Code Traceability Document, comprising the Informatica objects, as part of the build activities
  • Involved in Code Review and Peer Review
  • Captured the test results in the Unit Test Plan (UTC).

Informatica Developer

IBM India Pvt Ltd
Hyderabad
08.2013 - 07.2015
  • Analyzed the HLD by having Walk-through on the business with the Architect and gathered the Mapping Specification and Technical Specification from the SharePoint
  • Extensively worked with all the client components of Informatica: Repository Manager, Designer, Workflow Manager, and Workflow Monitor
  • Generated the DDL scripts of source and target tables in the database
  • Generated the Audit entries for the Audit Tables
  • Developed ETL design using various transformations like Source Qualifier, Aggregator, Sorter, Joiner, Lookup, Stored Procedure, Router, Filter, Sequence Generator, Expression for source-to-target data mappings and to load the target table
  • Involved in writing SQL queries in the Source Qualifier to extract data from homogeneous sources
  • Wrote SQL queries, stored procedures for implementing business rules and transformations
  • Implemented Slowly Changing Dimensions (SCD) phenomenon using Informatica ETL mapping to load SCD Type1 and Type2 tables and captured the test results in the Unit Test Plan
  • Involved in Code review and Peer review
  • Worked with Informatica workflow manager and workflow monitor to schedule, run, debug and test the application on development and to obtain the performance statistics
  • Created the deployment groups to deploy the code from Development box to SIT Box
  • Helped the Team Leads in preparing Deployment Document and Release notes
  • Ran the jobs using CONTROL-M in QA, UAT environment
  • Testing database applications to ensure functionality
  • Performed Sanity Checks once after the code got deployed into SIT
  • Collaborated with DBAs to implement structures
  • Proactively fixed defects by logging them in Application Lifecycle Management (HP ALM QC)
  • Troubleshot database-related errors
  • Interacted with DBAs on optimizing queries
  • Participated in database design.

MAXX Technologies
Bangalore
08.2011 - 07.2013
  • Involved in code and application development & enhancements
  • Developed various mappings and Mapplets to load data from various sources, using transformations like Router, Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Expression, and Sequence Generator to store the data in the target table
  • Created the Mapplets & Reusable Transformations
  • Involved in development of Change requests
  • Worked on SCD type-1 and SCD Type-2 dimensions
  • Involved in the performance Tuning
  • Created sessions, workflows and ran for scheduling jobs and implemented error-handling mechanism for ETL mappings
  • Performed code reviews of Informatica Mapplets, Mappings, Workflows, and other documents required by onshore.

Education

Master's - Information Assurance

WILMINGTON UNIVERSITY
New Castle, DE

Bachelor of Technology - Electronics & Communications

JNTUH

Skills

  • Operating Systems: Windows XP/2000, Linux 6, QWS3270 Secure 4.8
  • Databases: Oracle SQL/PL/SQL 11g/12c, PostgreSQL, IBM DB2 for z/OS
  • Database tools: SQL Developer, TOAD for Oracle, AQT, SQLWorkbench, Data Studio 4.1.1 Client
  • Scheduler tools: Control-M, ESP
  • ETL tool: Informatica PowerCenter 9.1
  • Defect tracker: HP ALM Quality Center 11.52
  • File transfer: FileZilla 3.40.0
  • Tools: WinMerge 2.14.0, UltraEdit 64-bit, Beyond Compare 4.2.9
  • Azure Services: Azure Synapse Analytics, Azure Data Lake Storage, Azure Data Factory, Azure Databricks, and Azure Blob Storage

Certification

DP-900 - Microsoft Azure Data Fundamentals

AWS Certified Solutions Architect – Associate
