Sindhu Pulakani

Data Analyst
Minneapolis, Minnesota

Summary

  • 8+ years of experience in business and data analysis, data migration, data conversion, data integration, and other database development activities
  • Expertise in data modeling, data architecture, data integration (ETL/ELT), and business intelligence
  • Skilled in implementing SQL tuning techniques such as join indexes, aggregate join indexes, and table changes including indexes
  • Sound knowledge of the SDLC process; involved in all phases of the Software Development Life Cycle: analysis, design, development, testing, implementation, and maintenance of applications
  • Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modeling and design
  • Experience with DAX (Data Analysis Expressions) functions in creating Power BI dashboards, reports, and Tabular Models
  • Experience working with IBM DataStage to improve testing quality, including testing the accuracy and security of data using IBM DataStage TDM
  • Thorough knowledge of creating DDL, DML, and SQL queries for SQL Server, Oracle, and Teradata databases
  • Extensive experience developing jobs using Data Transformation Services (DTS) and SQL Server Integration Services (SSIS), and developing reports with SQL Server Reporting Services (SSRS)
  • Expertise in data analysis, design, development, implementation, and testing using data conversions, Extraction, Transformation, and Loading (ETL), SQL Server, Oracle, and other relational and non-relational databases
  • Expertise in working with relational databases such as Oracle 11g/10g/9i/8x and SQL Server 2012/2008/2005
  • Responsible for architecture design, data modeling, and implementation of Big Data platforms and analytic applications

Overview

10 years of professional experience

Work History

ETL/Data Analysis Lead

US Bank
Minneapolis, MN
09.2022 - Current
  • As part of the remediation project for PCI data across the organization (structured and unstructured), responsible for training the team according to project requirements and preparing the related documentation
  • As lead, helping and guiding the team to meet the expected goals for PCI data remediation
  • Interacting with the offshore team and managing the deliverables required for the project
  • Responsible for source data cleansing, refinement, and validation
  • Performing data mining and predicting future outcomes from the data
  • Performing various data analyses according to project requirements
  • Extracting, transforming, and loading data into the database that serves as the source for the Tableau reporting dashboard
  • Creating ETL jobs with required transformations
  • Designing ETL jobs using IBM WebSphere Information Server 11.5 to extract, transform, and load data into staging and then into the Teradata database, DB2, and Salesforce
  • Monitoring scheduled ETL components in Control-M
  • Extracting required information from data sources using SQL, Unix, and Python scripting
  • Designing, developing, and maintaining ongoing analyses, reports, and dashboards that support key business decisions
  • Performing gap analysis/L2 validation and publishing the findings
  • Identifying, analyzing, and interpreting trends and patterns in complex datasets
  • Creating ad hoc reports for meetings
  • Creating batch jobs in Python to automate ongoing team processes
  • Creating macros to identify and send automated notifications to employees with non-compliant data
  • Planning, designing, developing, and testing software systems and applications for software enhancements and new products
  • Planning and directing studies of potential electronic data processing applications
  • Assisting in coordinating the testing of changes and upgrades, ensuring servers operate correctly in current and upgraded future environments
  • Making recommendations on functional and technical improvements to the environment
  • Participating in performance and data volume analysis and design
  • Accurately setting the severity of identified defects
  • Providing input to training and documentation materials regarding the latest technical and functional design changes
  • Environment: Python, SQL Server, Tableau, Control-M, MS Excel, Agile, Data Analytics, IBM DataStage

ETL Developer and Data Analyst

Hilton
Memphis, TN
04.2020 - 08.2022
  • Created physical/detailed design documents reflecting the business requirements and the conceptual and logical constraints of the solution architecture documents
  • Developed and enhanced systems in conformance with development guidelines
  • Provided systems support in conformance with development and support guidelines
  • Created and executed unit test cases corresponding to the validated requirements and the constraints of the design documents
  • Validated XML data
  • Created and executed performance test cases corresponding to the validated requirements and the constraints of the design documents
  • Worked on IBM DataStage TDM to improve application quality as part of the testing life cycle
  • Participated in all phases of data mining, data mapping, data cleaning, data collection, model development, validation, and visualization, and performed gap analysis
  • Assisted a team of data scientists in building data lake in Snowflake environment
  • Validated the data to make sure the data has been loaded as desired by having regular comparison and counts between source and destination
  • Performed data validation and data cleansing as part of the ETL process
  • Monitored the data pipelines and rendered production support
  • Performed data cleansing and analysis by applying various data cleansing rules, designed data standards and architecture, and designed the relational models
  • Implemented various performance tuning techniques at the ETL and SQL Server levels for efficient development and performance
  • Evaluated data mining request requirements and helped develop the queries for the requests
  • Worked with data blending tools such as Alteryx to gather and filter data from different sources, and saved the resulting documents in Alteryx Connect
  • Environment: Snowflake, MS SQL Server 2014/2012/2008R2, Control-M, SQL Server Integration Services (SSIS), Tableau, MS Excel, Windows XP, JIRA, Waterfall.

Data Analyst

Apollo Healthcare
Hyderabad
06.2018 - 02.2020
  • Implemented various types of change data captures according to source data behavior and business requirements
  • Performed Data cleansing and data validation on the data that was being loaded in BigQuery
  • Performed extensive analysis on bad data and bot data and flagged that data as part of the data cleansing process
  • Administered the quality of data that is necessary to fulfill the needs of automation using IBM DataStage TDM
  • Implemented various Performance tuning techniques at ETL & SQL Server for efficient development and performance
  • Generated test data sets on complex data scenarios and addressed data privacy risks using IBM DataStage TDM
  • Created and maintained Logical and Physical models for the data mart and created partitions and indexes for the tables in the data mart
  • Participated in all phases of data mining, data mapping, data cleaning, data collection, developing models, validation, and visualization and performed Gap analysis
  • Performed data profiling and analysis, applied various data cleansing rules, and designed data standards, architecture, and the relational models
  • Created new data designs and ensured they fell within the overall enterprise BI architecture
  • Validated XML file formats
  • Updated/created technical specifications: data mappings, data flows, dashboard content, and relational diagrams
  • Developed and reviewed SQL queries using join clauses (inner, left, right) in Tableau Desktop to validate static and dynamic data
  • Collaborated with ETL teams to create data landing and staging structures as well as source to target mapping document
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing
  • Developed target data architecture, design principles, quality control, and data standards for the organization
  • Developed and maintained data models, data dictionaries, data maps, and other artifacts across the organization, including conceptual and physical models and a metadata repository
  • Identified dataflow/ workflow issues, as well as finding the source of the issues and resolving them
  • Extracted data from IBM Cognos to create automated visualization reports and dashboards on Tableau
  • Extensively used DAX (Data Analysis Expressions) functions in creating Power BI dashboards, reports and Tabular Models
  • Created many calculated columns and measures using DAX in Power BI based on report requirements and published Power BI reports to end user
  • Designed and developed use cases, activity diagrams, sequence diagrams, Visio diagrams, and business process models
  • Evaluated data mining request requirements and helped develop the queries for the requests
  • Participated in sessions with management, SMEs, vendors, users, and other stakeholders on open and pending issues; involved in reviewing and analyzing business requirement documents, functional specifications, and use cases
  • Worked with Google Analytics, BigQuery and Google Data Studio to generate reports and perform web analytics
  • Analyzed web analytics and performed reporting via Google Analytics
  • Built daily ad hoc queries in various SQL platforms like SQL Server and BigQuery
  • Environment: BigQuery, MS SQL Server 2014/2012/2008R2, SQL Server Reporting Services (SSRS), Power BI, MS Excel, SQL Server Management Studio (SSMS), Autosys, Agile, IBM DataStage TDM, IBM DataStage Powercenter 10.0.

Data Analyst

Energytech Global
Hyderabad
10.2016 - 05.2018
  • Worked with the business analyst team on requirements gathering, preparing functional specifications, and converting them into technical specifications
  • Used Erwin and Visio to create 3NF and dimensional data models and published to the business users and ETL / BI teams
  • Developed data mapping specifications used to create and execute detailed system test plans; the specifications defined what data would be extracted from an internal data warehouse, transformed, and sent to an external entity
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model
  • Created and maintained Logical and Physical models for the data mart
  • Created partitions and indexes for the tables in the data mart
  • Performed data profiling and analysis, applied various data cleansing rules, and designed data standards, architecture, and the relational models
  • Gathered and reviewed business requirements and analyzed data sources from Excel/SQL for the design, development, testing, and production rollout of reporting and analysis projects within Tableau Desktop
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects
  • Interacted with the Business Users for gathering design requirements and taking feedback on improvements
  • Wrote python scripts to validate and test source to target mapping (STTM) migration from Oracle to Redshift
  • Implemented ETL logic in python which was originally written in Scala
  • Developed SQL queries in SQL Server Management Studio and Toad, and generated complex reports for the end users
  • Performed extensive data validation by writing several complex SQL queries, participated in back-end testing, and worked on data quality issues
  • Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues
  • Participated in the Master Data Management (MDM) effort; provided technical advice and support toward the development of strategic and tactical plans for the client MDM strategy, data inventories, data governance, data management, storage, and distribution alternatives
  • Used data analysis techniques to validate business rules and identified low-quality and missing data in the existing Enterprise Data Warehouse (EDW)
  • Analyzed the business information requirements and researched the OLTP source systems to identify the measures, dimensions, and facts required for the reports
  • Performed data mapping from source systems to target systems, performed logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data
  • Performed data migration from an RDBMS to a NoSQL database, providing a complete picture of the data deployed across the various data systems
  • Environment: DB2, MS SQL Server 2014/2012/2008R2, SQL Server Integration Services (SSIS), Tableau, MS Excel, Windows XP, SQL Server Management Studio (SSMS), Autosys, Agile.

Business Data Analyst

ICICI Bank
05.2014 - 09.2016
  • Responsible for gathering requirements from Business Analysts and Operational Analysts and identifying the data sources required for the request
  • Performed data analysis for many ad hoc requests and critical projects that informed key business decisions
  • Performed data verification and validation to confirm that the data generated according to the requirements was appropriate and consistent
  • Optimized the data environment for efficient access to data marts and implemented efficient data extraction routines for data delivery
  • Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL
  • Analyzed, designed, coded, tested, implemented, and supported data warehousing extract programs and end-user reports and queries
  • Built a Python class whose objects represented batch jobs categorized by severity
  • Used Python modules such as NumPy, pandas, and datetime to perform extensive data analysis
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources
  • Executed aggregate functions on measures in the OLAP cube to generate information about dynamic trends, including bandwidth consumption and cost analysis
  • Generated periodic reports from the OLAP cubes using SQL Server Reporting Services (SSRS), based on statistical analysis of location and bandwidth trends across various time frames and departments, to project various KPIs
  • Environment: DB2, MS SQL Server 2014/2012/2008R2, Python, SQL Server, MS Excel, Agile, Crystal reports.

Skills

  • DBMS Programming Languages: PL/SQL (Oracle), T-SQL (SQL Server), Python
  • SQL Interpreters: Toad, SQL Developer, SQL*Plus, SQL*Loader
  • Operating Systems: Windows, Unix
  • ETL Tools: SSIS, IBM DataStage
  • Scheduling Tools: Control-M, TDS, Autosys
  • Data Modeling: Erwin, SQL Database Modeler, Apache Spark
  • BI & Reporting Tools: Business Objects, Power BI, Cognos, SSRS, Crystal Reports
  • MS Office Package: Microsoft Office (Word, Excel, PowerPoint, Visio, Project)

Timeline

ETL/Data Analysis Lead

US Bank
09.2022 - Current

ETL Developer and Data Analyst

Hilton
04.2020 - 08.2022

Data Analyst

Apollo Healthcare
06.2018 - 02.2020

Data Analyst

Energytech Global
10.2016 - 05.2018

Business Data Analyst

ICICI Bank
05.2014 - 09.2016