
Rushi Parekh

Charlotte, NC

Summary

  • 10+ years of experience developing, supporting, and consulting in the IT industry
  • Extensive experience across the complete Software Development Life Cycle (SDLC), covering requirements management, data analysis, data modeling, data mapping, business systems analysis, design, development, and deployment of business applications
  • As a software engineer, worked with Agile methodology on various teams, managing product design both independently and as a team member
  • Proficient in technical and business writing, business process flow, business process modeling, business analysis, and various testing methodologies
  • Collaborated with data engineers and data management analysts on conversions to resolve gaps in business specifications and ensured that decisions were documented for future reference
  • Worked on Domain-Driven Design (DDD) modeling through to target-specific physical data models
  • Supported analysis of complex, large-scale technology solutions for tactical and strategic business objectives in an enterprise technology environment, including technical challenges requiring in-depth evaluation of multiple factors, intangibles, and unprecedented technical considerations
  • Expertise normalizing tables and dimensions up to 3NF to optimize performance
  • Experienced in data integration techniques such as Extraction, Transformation, and Loading (ETL) from disparate sources (Oracle, SQL Server, MongoDB, MS Access, flat files, CSV files, and XML files) into a target warehouse (sketched below)
  • Strong in source-to-target data mapping, standardization documents, slowly changing dimension mapping, star/snowflake schema mapping, RDBMS, building data marts, and metadata management
  • Extensive experience loading high-volume data; worked extensively with data migration, data cleansing, and ETL processes
  • Well versed in programming languages, relational database systems, operating systems, data warehouse management, and web server and cloud environments (PaaS, IaaS)
  • Supported the delivery of business value for the Consumer Lending lines of business by interviewing business experts to understand their information needs, representing that understanding with conceptual and logical business models, and designing physical database structures for relational and non-relational databases
  • Managed reference data, designed data scripts, and supported the production and publication of data design artifacts while supporting multiple scrum teams and using Kanban to manage work
  • Effective communicator who works with minimal direction and supervision; a consensus builder who motivates cross-functional teams to deliver superior products and services, bringing organizational, analytical, and presentation skills, a positive approach, and leadership qualities
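As a concrete illustration of the ETL and star-schema points above, here is a minimal SQL sketch; all table and column names (stg_loan_balance, dim_customer, fact_loan_balance) are hypothetical, not from any actual engagement:

    -- Minimal star-schema and source-to-target load sketch; names are illustrative.
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,   -- surrogate key
        customer_id   VARCHAR(20) NOT NULL,  -- natural key from the source system
        customer_name VARCHAR(100),
        state_code    CHAR(2)
    );

    CREATE TABLE fact_loan_balance (
        customer_key  INTEGER REFERENCES dim_customer (customer_key),
        as_of_date    DATE NOT NULL,
        balance_amt   DECIMAL(18,2)
    );

    -- Load step: map staged source rows onto the star schema.
    INSERT INTO fact_loan_balance (customer_key, as_of_date, balance_amt)
    SELECT d.customer_key, s.as_of_date, s.balance_amt
    FROM   stg_loan_balance s
    JOIN   dim_customer d ON d.customer_id = s.customer_id;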

Overview

13 years of professional experience
1 Certification

Work History

Senior Software Engineer

Wells Fargo
Charlotte, NC
11.2016 - Current
  • The Data Standards, Practices and Oversight Team (SPOT) is a horizontal team that provides standards, practices, tooling, and oversight across the Consumer Lending and Consumer & Small Business Banking lines of business
  • The SPOT team is part of the Data Governance team, which is responsible for data modeling, metadata management, data mapping, data lineage, and related tooling aligned with industry standards
  • It also includes creating and maintaining a training curriculum for our Data Management Self-Service (DMSS) product, which enables scrum teams to stay compliant with Enterprise Data Governance policy as part of normal Agile development practices
  • Performed relational and dimensional data modeling, creating logical and physical database designs and ER diagrams using data modeling tools
  • Created, documented, and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise data stores within a metadata repository (a minimal sketch follows this list)
  • Reviewed and analyzed business, operational, and technical challenges and performed root-cause analysis to remediate them
  • Well-versed in systems analysis, ER/dimensional modeling, database design, and implementing RDBMS and non-RDBMS schemas
  • Worked with multiple tools for JSON, NoSQL methodologies, and data streaming with Kafka
  • Built proofs of concept for cloud strategy across various cloud providers, including Microsoft Azure and Google Cloud
  • Working knowledge of migrating procedures from on-premises applications using a lift-and-shift cloud strategy
  • Utilized data to enable various lines of business across the enterprise to serve customers and improve efficiency and profitability while meeting regulatory, compliance, surveillance, and risk management requirements
  • Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation
  • Managed reference/lookup data and supported data provisioning design
  • Built non-production environments via CI/CD pipelines using GitHub and managed DMI extension files in a central repository
  • Involved in research on Data Management and Insights (DMI) which is transforming the way that Wells Fargo uses and manages data
  • As part of that transformation DMI is establishing the Automated Data Management Framework (ADMF), a one-stop-shop for all data management related information including metadata and lineage, data controls, data quality defects, etc
  • Part of a data platform and enablement organization focused on automation across Data Platform & Analytics at the product level
  • Supported development of integrated business and/or enterprise application solutions to ensure specifications are flexible, scalable, and maintainable and meet architectural standards
  • Experience in systems integration testing (SIT) and user acceptance testing (UAT) for large, complex, cross-functional application initiatives, providing insight to testing teams to ensure the appropriate depth of test coverage
  • Drove metric-based reliability OKRs and implemented non-functional requirements (NFRs) to follow the standards
  • Conducted cost analysis for the NoSQL modeling tool vendor Hackolade, presented the benefits and cost savings of a perpetual enterprise license for the organization, and was involved in the MSA between the vendor and the organization
  • Individually handled the Hackolade license POC for all users for the newest version upgrade at Wells Fargo
  • Experience in the work environment consisting of Production Support teams, Subject Matter Experts, Database Administrators, Database developers, Data Engineers, and Data Architects
  • Researched current NoSQL tools and competing tools available in the market to identify a consistent, single tool for both RDBMS and non-RDBMS databases
  • Experience building microservices using Kafka, MongoDB, and Domain-Driven Design
  • Co-led the Peer Progress Center of Excellence in the Charlotte tech community; served as an advisor developing quarterly syllabi and events to share technology topics at the organization level
  • Passionate about learning and growing with current technology, focusing on big data, Hadoop/Hive, Python, artificial intelligence, machine learning, cloud, and advanced analytics technologies to tackle large-scale, complex data management, data governance, standardization, data quality, data warehousing, analytics, and reporting challenges across the enterprise
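As a hedged illustration of the metadata-repository work noted above, a minimal SQL sketch; the tables and the lineage query are assumptions for illustration, not the actual DMI/ADMF design:

    -- Illustrative metadata-repository tables; not an actual enterprise schema.
    CREATE TABLE md_column (
        column_id   INTEGER PRIMARY KEY,
        store_name  VARCHAR(100),    -- data store that holds the column
        table_name  VARCHAR(100),
        column_name VARCHAR(100),
        definition  VARCHAR(1000)    -- corporate metadata definition
    );

    CREATE TABLE md_lineage (
        source_column_id INTEGER REFERENCES md_column (column_id),
        target_column_id INTEGER REFERENCES md_column (column_id)
    );

    -- Trace one hop of lineage back to the source for a target column.
    SELECT src.store_name, src.table_name, src.column_name
    FROM   md_lineage l
    JOIN   md_column src ON src.column_id = l.source_column_id
    JOIN   md_column tgt ON tgt.column_id = l.target_column_id
    WHERE  tgt.table_name  = 'fact_loan_balance'
      AND  tgt.column_name = 'balance_amt';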

Data/Risk Reporting Engineer

Bank of America
Charlotte, NC
06.2016 - 11.2016
  • Involved with Global Risk Management (GRM), including Global Compliance for Bank of America
  • Part of a team responsible for overseeing the company's governance and strategy for global risk management
  • Focused on application software implementing and maintaining appropriate risk management principles and policies, internal controls, and processes designed to identify and mitigate risks
  • Conducted one-on-one interviews with the Portfolio Manager to gather business requirements and was involved in documenting Business Requirement Documents used to develop reports through MicroStrategy and update code
  • Designed, developed, implemented, and rolled out MicroStrategy Business Intelligence applications
  • Architected Extract, Transform, and Load (ETL) processes to populate an operational data store from various sources, including Teradata and SQL databases, spreadsheets, and flat files
  • Extensively worked on performance tuning of programs, ETL procedures, and processes
  • Analyzed requirements using various methods: sample data from SQL queries, mock-ups, reports, prototype screens, sourcing information, and other data models
  • Worked on data modeling and produced data mapping and data definition documentation for the new requirements
  • Conducted functional walkthroughs and User Acceptance Testing (UAT), directed the development of user manuals for customers, and enhanced risk technology, data, and analytics capabilities
  • Performed data analysis and ran various SQL queries to critically evaluate test results in various production and non-production environments
  • Worked with developers to make sure they understood the use cases and updated database code accordingly
  • Designed and implemented basic SQL queries for QA testing and report/data validation (a representative sketch follows this list)
  • Partnered with the technical areas in the research and resolution of system and user acceptance testing issues.
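To make the validation bullet concrete, a minimal sketch of the kind of reconciliation SQL involved; the table names (src_positions, ods_positions) are hypothetical:

    -- 1. Row-count reconciliation between source and the operational data store.
    SELECT 'src' AS side, COUNT(*) AS row_count FROM src_positions
    UNION ALL
    SELECT 'ods', COUNT(*) FROM ods_positions;

    -- 2. Rows loaded into the ODS without a matching source record.
    SELECT o.position_id
    FROM   ods_positions o
    LEFT JOIN src_positions s ON s.position_id = o.position_id
    WHERE  s.position_id IS NULL;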

Software Engineer

Wells Fargo
Des Moines, IA
06.2014 - 05.2016

Senior Data Analyst/Modeling

MUFG Union Bank
Los Angeles, CA
12.2013 - 06.2014
  • MUFG Union Bank (stylized as Union Bank) is an American full-service bank with 398 branches in California, Washington, and Oregon, wholly owned by The Bank of Tokyo-Mitsubishi UFJ; the project was to develop a financial analysis data warehouse for analyzing customers, their transactions, accounts, balances, and the authorizations between customer accounts
  • This project was developed for the research and analysis needs of the Finance Department to assist them in their business
  • The system provides accurate and timely daily and monthly information on their portfolios regarding credit gains and losses for potential candidates, so that they may take appropriate actions proactively
  • Worked with data producers, critical downstream consumers, and the enterprise data management team to identify and document the data elements most critical to enterprise financial, risk, and regulatory reporting
  • Developed prototypes using a data warehouse and created cubes for multidimensional analysis of data using MOLAP/ROLAP processing (a ROLAP-style sketch follows this list)
  • Normalized the database up to 3NF before organizing it into the star schema of the data warehouse
  • In the role of Data Analyst, performed analysis and design of extensions to an existing data warehouse/mart business intelligence platform
  • Documentation included design documents, SOPs, mapping specifications, and process flowcharts
  • Involved in quality assurance to ensure quality, validity and accuracy of data across the servers
  • Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams
  • Involved in different team review meetings for logical data model walkthroughs and validation
  • Developed Functional Specification Document and Supplementary Specification (non-functional) Document
  • Participated in the Logical and Physical Design sessions and developed Design Documents.
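To illustrate the multidimensional analysis mentioned above, a minimal ROLAP-style query; the table and column names are hypothetical, and GROUP BY CUBE is the relational analogue of the aggregates a cube stores:

    -- Hypothetical ROLAP-style rollup: aggregates balances for every
    -- combination of the listed dimensions, as a cube would materialize.
    SELECT region,
           product_type,
           SUM(balance_amt) AS total_balance
    FROM   fact_account_balance
    GROUP  BY CUBE (region, product_type);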

Data Analyst/Data Modeling

Wells Fargo
Des Moines, IA
05.2012 - 11.2013
  • Same client and team as the Software Engineer role at Wells Fargo, Des Moines; worked with the business team to analyze requirements for the CORE-based application through different UIs and translate them into data requirements via Data Assessment documents
  • Used an IBM data mapping repository tool to map data to various downstream systems
  • Worked with the business community and Source of Record IT analysts to gather the information needed to perform data modeling using PowerDesigner and to understand the data flow from source to target
  • Analyzed data reported via bugs in the JIRA tool and utilized JIRA to assign and track tasks
  • Converted all data types from legacy systems to Teradata standards with SQL queries
  • Worked with different downstream systems to analyze data, including MIDE (Mortgage Integrated Data Environment) and MODE (Mortgage Operational Data Environment)
  • Involved in various EDW (Enterprise Data Warehouse) activities like Modeling, Mapping and Analysis
  • Validated various extract files for government compliance projects such as MISMO
  • Participated in ETL Functional Specification Document (FSD) reviews and responded to questions related to the mappings, modeling, and processing logic flows
  • Created data mapping spreadsheets for each release, for views as well as UI mappings
  • Worked on metadata validation using the Ab Initio Metadata Hub to validate lineage between source and target
  • Created logical dimensional data models and physical data models for the data warehouse using Sybase PowerDesigner 15
  • Utilized PowerDesigner 15 for the design and development of models for different subject areas such as Loan Account Production, Customer/Party, Location, and Real Estate, as well as source-to-target mapping for each subject area
  • Performed data profiling to analyze and support data quality controls, covering valid value ranges, relationships between data elements, business rules, and reasonability checks (a representative profile follows this list)
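As a hedged sketch of that profiling work, with hypothetical table and column names (loan_account, customer):

    -- Value-range and null-rate profile for a loan attribute.
    SELECT MIN(loan_amount) AS min_amt,
           MAX(loan_amount) AS max_amt,
           AVG(loan_amount) AS avg_amt,
           SUM(CASE WHEN loan_amount IS NULL THEN 1 ELSE 0 END) * 100.0
               / COUNT(*)   AS pct_null
    FROM   loan_account;

    -- Relationship check: loans referencing a customer that does not exist.
    SELECT l.loan_id
    FROM   loan_account l
    LEFT JOIN customer c ON c.customer_id = l.customer_id
    WHERE  c.customer_id IS NULL;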

Data Analyst/Data Mapper/Data Modeling

Luxottica
Cincinnati, Ohio
01.2012 - 05.2012
  • Developed a reporting system comprising different universes, canned and ad-hoc reports, and corporate repositories for the Finance Department from the underlying Oracle and SQL Server databases
  • This system supports analysis and decision-making for Finance, Marketing, Risk, Collections, and Consumer Relations
  • Using different ad-hoc analyses, the reports also assist in defining strategy for each customer category
  • Completed a study of the in-house requirements for the data warehouse
  • Analyzed the DW project database requirements from the users in terms of the dimensions they want to measure and the facts for which the dimensions need to be analyzed
  • Defined ETL strategy to capture new data, store historical data and update data warehouse in the most efficient way
  • Extracted the Business Requirements from the end users keeping in mind their need for the application and prepared Business Requirement Documents (BRD) using RequisitePro
  • Designed and developed Informatica mappings and workflows/sessions to load data from source systems to the ODS and then to the data mart (a rough SQL equivalent follows this list)
  • Created Conceptual/Logical/ Physical Data Models, with emphasis on Optimization techniques and Data Management for high volume transactions
  • Developed estimates, project plans (Microsoft Project), training material, and BI reports using MicroStrategy
  • Developed and managed Project Plans and Schedules
  • Conducted Functional Walkthroughs, User Acceptance Testing (UAT), and supervised the development of User Manuals for customers.
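The load logic above lived in Informatica mappings built in the Designer GUI rather than in SQL; as a rough, hedged SQL equivalent of an ODS-to-mart load with a type-2 slowly changing dimension (all names hypothetical):

    -- Expire the current version of rows whose attributes changed in the ODS.
    UPDATE dim_product
    SET    current_flag = 'N',
           end_date     = CURRENT_DATE
    WHERE  current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   ods_product o
                   WHERE  o.product_id   = dim_product.product_id
                     AND  o.product_name <> dim_product.product_name);

    -- Insert new and changed rows as the current version.
    INSERT INTO dim_product (product_id, product_name, start_date, end_date, current_flag)
    SELECT o.product_id, o.product_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   ods_product o
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_product d
                       WHERE  d.product_id   = o.product_id
                         AND  d.product_name = o.product_name
                         AND  d.current_flag = 'Y');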

Data Analyst

MMC Systems Inc
Herndon, VA
08.2010 - 04.2012
  • Performed data analysis to check the validity of issues and to resolve them
  • Updated data requirements and performed data mapping and high-level design in a support capacity for the data warehouse
  • Worked mostly with Oracle SQL, TOAD, Microsoft Excel, and Pac2000
  • Generated reports using SQL from an Oracle 11g database for comparison with the legacy system (a representative comparison query follows this list)
  • Involved in quality assurance to ensure quality, validity and accuracy of data across the servers
  • Involved in different team review meetings
  • Conducted logical data model walkthroughs and validation
  • Integrated data from multiple systems and made that data available to all business users
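A hedged sketch of the legacy-comparison reporting, using Oracle's MINUS set operator and hypothetical table names:

    -- Rows present in the new extract but missing from the legacy extract.
    SELECT account_id, balance_amt FROM new_account_extract
    MINUS
    SELECT account_id, balance_amt FROM legacy_account_extract;

    -- The reverse direction, to catch rows dropped by the migration.
    SELECT account_id, balance_amt FROM legacy_account_extract
    MINUS
    SELECT account_id, balance_amt FROM new_account_extract;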

Education

Master's: MBA - Project Management

Bachelor: Bachelor of Science - Engineering

Certified in SQL Server & T-SQL; certified in Microsoft Azure Cloud Fundamentals

Skills

  • Teradata SQL Assistant
  • Data Warehousing: Informatica PowerCenter, Informatica Designer, Workflow Manager, Workflow Monitor, ETL, data marts, OLAP, OLTP, transformations, SQL*Loader
  • Databases: Oracle 8i/9i/10g/11g/12c, IBM DB2, MS SQL Server 2005/2008, Teradata, Informix, Sybase, MS Access, and Apache on UNIX and Windows
  • Tracking Tools: JIRA, IBM Rational Team Concert V5.0.1, HP OpenView Service Center
  • Big Data & Cloud: Cloudera, Hadoop, Hive, NoSQL, Data Lake, Microsoft Azure & Google Cloud
  • Additional Tools & Expertise: TortoiseSVN, CI/CD with GitHub, Pac2000, ServiceNow, erwin Data Intelligence Suite, Clarity project management, Ab Initio Metadata Hub, Atlassian Confluence Wiki, JavaScript, Tableau, Adobe Photoshop, SDLC/Agile/Hybrid, Enterprise Metadata Hub, Collibra

Certification

Data Modeling: dimensional data modeling using star join schema and snowflake modeling, XML Schema, relational modeling, OLTP, OLAP, fact and dimension tables, data mapping, data staging areas, logical and physical data modeling, JSON Schema, Erwin data modeling, ER/Studio, Oracle Designer, Toad, Visio, and Sybase PowerDesigner 15/16.1

References

Available upon request

Timeline

Senior Software Engineer - Wells Fargo
11.2016 - Current
Data/Risk Reporting Engineer - Bank of America
06.2016 - 11.2016
Software Engineer - Wells Fargo
06.2014 - 05.2016
Senior Data Analyst/Modeling - MUFG Union Bank
12.2013 - 06.2014
Data Analyst/Data Modeling - Wells Fargo
05.2012 - 11.2013
Data Analyst/Data Mapper/Data Modeling - Luxottica
01.2012 - 05.2012
Data Analyst - MMC Systems Inc
08.2010 - 04.2012
- Master's: MBA, Project Management
- Bachelor: Bachelor of Science, Engineering
- Certified in SQL Server & T-SQL; Microsoft Azure Cloud Fundamentals