Ramesh VenkatasamyRamasamy

Westborough, MA

Summary

15+ years of IT expertise leading and implementing Cloud platform, data modeling, AWS, Azure and Python projects. Built and nurtured a high-performing team of cloud solution developers and consultants.

Proven capability working with relational databases such as Vertica, Netezza, Oracle and SQL Server, along with cloud database platforms such as Snowflake and HIVE. Expert in developing HIVEQL, NoSQL, PL/SQL and T-SQL (DDL, DML), data integrity (constraints), data quality and validation rules, performance tuning and query optimization. Deep knowledge of developing ETL mappings, workflows, data pipelines and reusable plug-ins using SSIS, Informatica and Clover.

Experience ingesting near real-time data from multiple source systems for effective and timely business decision-making using AWS stacks such as Kinesis, Lambda, EC2, EMR, API Gateway, S3, SQS and SNS Topics. Proficient in building Data Lake solutions on S3 (AWS), Delta Lake solutions on Databricks (Azure) and cloud-based database platforms such as Snowflake and HIVE.

Good understanding of creating, populating and maintaining data marts. Thorough knowledge of the features, structure, attributes, hierarchies, and star and snowflake schemas of data marts. Proficient in database design using normalization forms, Entity Relationship Design (ERD), logical and physical data modeling using Erwin and Visio, and application-oriented design.

Domain skills include Online Retail; Pharma, Commercial Operations and Sales; Insurance (Multinational and P&C); and Logistics.

Overview

15 years of professional experience
2 certifications

Work History

Senior Data Engineer

Chewy
Boston, MA
12.2018 - Current
  • Modernized the data warehouse by converting ETLs into ELT and building flat, de-normalized models for faster data processing and consumption
  • Led the Denver retirement project, successfully migrating 10+ applications from the legacy platform to AWS
  • Architected analytic solutions that give the business data to drive decisions and operations
  • Led several initiatives to modernize Data Warehouses for faster ad-hoc analysis and query performance.
  • Monitored incoming data analytics requests, executed analytics and efficiently distributed results to support Sales and Supply chain strategies.
  • Replicated source data from DynamoDB (NoSQL) using Kinesis, Lambda, API Gateway and S3
  • Ingested real-time order events from SNS topics into Snowflake to track items back-ordered due to out-of-stock inventory
  • Developed Python scripts to read live stream data, process it and prepare it for analytics (an illustrative sketch follows this list)
  • Created data pipelines using Clover/Python to integrate payment, Autoship and customer MDM data into Snowflake and the data warehouse
  • Built data integration solutions using Python and AWS big data technologies to process large transaction volumes in real time and in batch
  • Determined design feasibility within time and cost constraints by researching and analyzing user needs and software requirements
  • Built ETLs on the Snowflake data lake to create an aggregated data consumption layer for internal business units and Tableau reporting
  • Designed the Autoship and payment data mart to better track customer subscriptions and surface customer behavioral patterns
  • Technology Stack: Vertica, Snowflake, Clover, Python, AWS (Kinesis, S3, Lambda, SNS, DynamoDB, Aurora), Airflow, GitHub
  • Contributed to internal activities for overall process improvements, efficiencies and innovation.
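
Illustrative sketch (not the production code): a minimal AWS Lambda handler of the kind described above, assuming a Kinesis-triggered function that decodes order events and lands them in S3 as newline-delimited JSON for a downstream Snowflake load. The bucket, prefix and field layout are hypothetical placeholders.

```python
import base64
import json
import os
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

# Hypothetical landing location; the real bucket/prefix are not part of the resume.
BUCKET = os.environ.get("LANDING_BUCKET", "example-order-events-landing")
PREFIX = os.environ.get("LANDING_PREFIX", "order-events")


def handler(event, context):
    """Decode Kinesis records and land them in S3 as newline-delimited JSON."""
    rows = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        rows.append(json.loads(payload))

    if not rows:
        return {"written": 0}

    # One object per invocation, partitioned by ingestion date; a downstream
    # Snowpipe/COPY INTO step would load these objects into Snowflake.
    now = datetime.now(timezone.utc)
    key = f"{PREFIX}/dt={now:%Y-%m-%d}/{now:%H%M%S}-{context.aws_request_id}.json"
    body = "\n".join(json.dumps(r) for r in rows)
    s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
    return {"written": len(rows), "key": key}
```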

Azure Architect

Accenture/Shire
Waltham, MA
06.2017 - 12.2018
  • Strategized, architected and implemented cloud solutions for COMM03 project.
  • Analyzed complex data and identified anomalies, trends and risks to provide useful insights to improve internal controls.
  • Designed HIVE data model and data lake architecture in Azure platform
  • Reverse-engineered the existing application, reimplemented the best of its solutions and modified them based on source system changes
  • Addressed ad hoc analytics requests and facilitated data acquisitions to support internal projects, special projects and investigations.
  • Compiled, cleaned and manipulated data for proper handling.
  • Reviewed IQVIA and other vendor files, automated the QTA process and designed the system to be flexible for mid-quarter alignments
  • Involved in data profiling, quality, governance, retention, storage and reconciliation
  • Ingested, parsed and processed data in various formats (JSON, CSV, Parquet) received via REST API and sFTP using Python
  • Developed Python scripts to load data from HIVE tables into the data exploration tool (see the sketch after this list)
  • Reviewed data models, ETL loads, schedules and data design with the enterprise architecture team and obtained approval for implementation
  • Created 100+ HIVE internal and external tables, including partitioned and bucketed tables
  • Created reusable plug-ins so that quality checks can be added or modified without code changes
  • Provided feedback on test files, documented test cases, technical design and file requirements, and obtained sign-off for all planned milestones
  • Led the technical work stream and reported risks, mitigation plans and key priorities to the steering committee
  • Assigned daily tasks to an offshore team of 12, worked with the ETL team to clarify requirements and design, reviewed work and provided feedback, discussed complex technical challenges and validated test results
  • Collaborated with enterprise, operations and Citrix teams to fix environmental issues affecting delivery
  • Devised overall strategy for documentation and identified as-built designs and final building information models (BIM).
  • Attended all team meetings to resolve technical and project issues, coordinate with team members and review project schedules.
  • Technology Stack: Azure HDInsight Cluster, HIVE, Informatica BDM, Qlik Sense, Erwin
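
Illustrative sketch (not the production script): a minimal example of loading a HIVE table into a flat extract for an exploration tool, assuming PyHive as the Hive client and CSV as the hand-off format. The host, database, table and file names are hypothetical placeholders.

```python
import pandas as pd
from pyhive import hive  # assumes PyHive as the client; any Hive DBAPI driver would do

# Hypothetical connection parameters, not the real HDInsight endpoint.
conn = hive.Connection(host="hdinsight-head-node", port=10000,
                       username="etl_user", database="commercial_ops")

# Pull a curated slice of a HIVE table; table and filter are illustrative only.
query = """
    SELECT *
    FROM sales_aligned_quarterly
    WHERE quarter = '2018-Q3'
"""
df = pd.read_sql(query, conn)

# Hive prefixes result columns with the table name; strip it and drop empty rows
# before handing the extract to the exploration tool as CSV.
df.columns = [c.split(".")[-1] for c in df.columns]
df = df.dropna(how="all")
df.to_csv("sales_aligned_quarterly_2018q3.csv", index=False)
conn.close()
```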

BI Architect

Accenture/Shire
Waltham, MA
09.2014 - 05.2017
  • Designed and built Comprehensive data warehouse for Rare Disease Business Unit
  • Developed database architectural strategies at modeling, design and implementation stages to address business or industry requirements
  • Collaborated with system architects, design analysts and others for data model management, enforce compliance rules, standards and best practices around data modeling efforts.
  • Created project and application architecture deliverables consistent with architecture principles, standards, methodologies and best practices
  • Designed SSIS packages, stored procedures, configuration files, tables, views and functions; implemented best practices to maintain optimal performance
  • Built efficient ETL packages to process fact and dimension tables with complex transforms and type 1 and type 2 changes (an illustrative sketch follows this list)
  • Coordinated development of interfaces to integrate vendor files and data coming from Salesforce
  • Automated the manual uploads for IC changes and yearly forecast loading, reducing manual effort
  • Provided solution blueprints to data (ETL) and report teams, translated functional business needs into technical requirements and assigned tasks to offshore teams
  • Coordinated with the offshore team for unit and system testing
  • Created wireframes and data models for development of Qlikview dashboards
  • Participated in collaborative development of data management, business intelligence and analytic architecture standards and facilitated understanding and adherence to standards
  • Resolved conflicts and negotiated mutually beneficial agreements between parties
  • Technology Stack: SSIS, SQL Server, T-SQL, Netezza, Informatica, Datameer and QlikView
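
The actual packages were built in SSIS; the sketch below only illustrates, in plain Python, the type 1 (overwrite) versus type 2 (history-preserving) handling applied to dimension rows. Column names and values are hypothetical.

```python
from datetime import date

# Hypothetical customer dimension: email is handled as type 1 (overwrite),
# territory as type 2 (expire the current row and insert a new one).
dim = [
    {"customer_id": 101, "email": "a@example.com", "territory": "NE",
     "effective_from": date(2015, 1, 1), "effective_to": None, "is_current": True},
]

def apply_change(dim, customer_id, email, territory, change_date):
    """Apply one incoming source row using type 1 / type 2 rules."""
    current = next(r for r in dim
                   if r["customer_id"] == customer_id and r["is_current"])

    # Type 1 attribute: overwrite in place, no history kept.
    current["email"] = email

    # Type 2 attribute: if it changed, close the current row and add a new one.
    if current["territory"] != territory:
        current["effective_to"] = change_date
        current["is_current"] = False
        dim.append({"customer_id": customer_id, "email": email,
                    "territory": territory, "effective_from": change_date,
                    "effective_to": None, "is_current": True})

apply_change(dim, 101, "a@new-domain.com", "SE", date(2016, 6, 1))
for row in dim:
    print(row)
```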

BI Team Lead

Accenture/Sanofi
Chennai, TN
03.2011 - 08.2014
  • Led the Application Services team, which handled DevOps activities for 110+ UI applications.
  • Managed team of 14 employees, overseeing hiring, training, and professional growth of employees.
  • Analyzed new change requests, estimated the effort required, and planned and scheduled them
  • Collaborated with business partners in reviewing requirements, designing and code changes
  • Designed data models, built relational and dimensional models, and developed stored procedures and tables
  • Integrated multiple data sources to CTDM application using SSIS packages
  • Automated packages by scheduling them via SQL Server Agent jobs
  • Ensured validated systems implemented HIPAA rules and were documented per guidelines
  • Prepared status reports for senior management and participated in project status meetings
  • Proved successful working within tight deadlines and fast-paced atmosphere.
  • As continuous improvement tower lead, drove technical and operational process innovations and helped reduce costs for clients
  • Saved $100k by implementing cost-saving initiatives that addressed long-standing problems.
  • Technology Stack: SSRS, SSIS, SQL Server 2008 & 2005, T-SQL, Visual Studio 2010, VB.Net, ASP, IIS, BMC Remedy, Erwin

Data Modeler and MSBI Developer

Cognizant/ACE Insurance
Chennai, TN
12.2009 - 02.2011
  • Responsibilities in Foreign Casualty Database project: Led team, handled database design, Data warehouse modeling, Performance tuning, Data migration and Functionality mapping
  • Designed data warehouse model (Star Model) that required mastery of domain and business knowledge.
  • Identified and described relationships between data fields in enterprise databases.
  • Analyzed enterprise data holdings to reduce redundancy of data within existing systems or improve conflation of datasets from one system to another.
  • Performed physical data modeling using Microsoft Visio, designed database objects, and integrated disparate databases and lookup files into the centralized data warehouse using SSIS
  • Reviewed project requests describing database user needs to estimate time and cost required to accomplish projects.
  • Involved in functionality mapping of source and destination database fields and in implementation of coding standards and best practices in SSIS
  • Responsible for optimizing decision support query performance in relational databases and performance tuning of SSIS packages
  • Contributed heavily to the development of various reports using SSRS
  • Automated report generation, scheduled SSIS packages on a monthly basis, created configuration files and deployed SSIS packages
  • Performed process-related activities such as risk identification, estimation, project planning, resource planning and task allocation
  • Technology Stack: SSRS, SSIS, SQL Server 2005, Oracle PL/SQL, DB2, Toad, Windows XP, Microsoft Visual SourceSafe, MS Visio, Autosys

Data Modeler and MSBI Developer

Cognizant/ACE Insurance
Chennai, TN
11.2008 - 11.2009
  • Responsibilities in Multinational Account Exchange Reporting: led the team in planning and strategizing end-to-end development
  • Took responsibility for Report generation, Database design, Data warehouse modeling, Performance tuning and Functionality mapping.
  • Collaborated with system architects, design analysts and others to understand business and industry requirements
  • Identified and described relationships between data fields in enterprise databases
  • Analyzed and designed the data warehouse data model by partitioning transactional data into facts and dimensions
  • Developed logical and physical data models using Erwin
  • Designed and developed the overall integration: a scheduled, structured ETL process (SSIS) fetches data from the transactional source and feeds the new reporting database, with report data delivered through SSRS
  • Involved in Unit testing and System testing
  • Handled process-related activities: metrics collection, reporting, project plan preparation, resource planning, task allocation and weekly status reports
  • Technology Stack: Oracle PL/SQL, SQL Server 2008, DB2, Excel macros, Erwin, SSIS and SSRS

ETL and Database Developer

Cognizant/WWL
Chennai, TN
07.2006 - 10.2008
  • Responsibilities in the WWL Reporting System & GATE Cleanup projects included collaborating with system architects, design analysts and others to understand business and industry requirements
  • Designed and developed Oracle PL/SQL queries, stored procedures, packages, queues, triggers and jobs for trading partners
  • Performed application tuning of PL/SQL Stored Procedures, Functions, Triggers and Packages
  • Automated monitoring activities, significantly reducing the team's manual effort
  • Analyzed and optimized existing message flows by implementing subscriptions, queue concepts and reducing cross-database querying
  • Developed new mappings using Data mapper builder
  • Integrated new data sources and applications using SSIS
  • Tested packages, integration components and objects
  • Actively prepared documentation of all delivery documents such as Business Requirements document and Technical Design Document
  • As DP coordinator, performed root cause analysis of defects and reported to top management
  • Technology Stack: Oracle PL/SQL, SQL Server 2005, SSIS, SSRS, UNIX and Integration Broker.

Education

Bachelor's Degree - Engineering

Anna University
Tirunelveli
04.2006

Skills

  • Languages : Unix/Linux Scripting, Python
  • Big Data Technologies : Azure, AWS
  • Database : HIVE, SQL Server, Oracle PL/SQL, Netezza, Vertica, Snowflake
  • ETL Tools : SSIS, Informatica, Informatica BDM, Clover
  • Analytical Tools : SSRS, SAP BO, Qlik Sense, Tableau
  • Data Modeling Tools : MS Visio, Erwin
  • Scheduling Tools : Autosys, UC4, Control-M, Airflow
  • Streaming Tools: Kinesis Streaming, Spark

Accomplishments

  • Led multiple projects to ingest near real-time data from different source systems for effective and timely business decision-making using AWS technologies such as Kinesis, Lambda, API Gateway, S3 and SNS Topics. Some of the most impactful integrations:
    a. Clickstream data from Segment
    b. Source data replication from DynamoDB (NoSQL)
    c. Real-time order events ingestion from SNS topics into Snowflake to track items back-ordered due to out-of-stock inventory
  • Built a Python framework to seamlessly replicate data between Snowflake and Vertica, letting developers replicate data between the two databases without writing additional code and reducing development time for downstream data delivery by 80% (see the sketch after this list)
  • Led several initiatives to modernize Data Warehouses for faster ad-hoc analysis and query performance.
  • Supervised team of 5 staff members.
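
Illustrative sketch (not the actual framework): what the core copy step might look like, assuming the standard snowflake-connector-python and vertica-python drivers. Connection settings, table and column names are placeholders; the real framework added configuration, type mapping and orchestration on top of a step like this.

```python
import snowflake.connector   # snowflake-connector-python
import vertica_python        # vertica-python

# Placeholder connection settings; the real framework read these from configuration.
SNOWFLAKE_CFG = dict(user="etl_user", password="***", account="example_account",
                     warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC")
VERTICA_CFG = dict(host="vertica-host", port=5433, user="etl_user",
                   password="***", database="dwh")


def replicate(table, columns, batch_size=10_000):
    """Copy one table from Snowflake into Vertica in batches."""
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))  # adjust if the driver's paramstyle differs
    insert_sql = f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"

    sf_conn = snowflake.connector.connect(**SNOWFLAKE_CFG)
    v_conn = vertica_python.connect(**VERTICA_CFG)
    try:
        sf_cur = sf_conn.cursor()
        v_cur = v_conn.cursor()
        sf_cur.execute(f"SELECT {col_list} FROM {table}")
        while True:
            rows = sf_cur.fetchmany(batch_size)
            if not rows:
                break
            v_cur.executemany(insert_sql, rows)  # batched insert into Vertica
        v_conn.commit()
    finally:
        sf_conn.close()
        v_conn.close()


# Example call with a hypothetical aggregate table:
# replicate("orders_daily_agg", ["order_date", "sku", "units", "revenue"])
```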

Certification

70-475 Certification: Designing and Implementing Big Data Analytics Solutions

70-463 Certification: Implementing a Data Warehouse with Microsoft SQL Server 2012

OCA Oracle certification.

Multiple Udemy certifications for Cloud platforms

Work Availability

Monday - Sunday: morning, afternoon and evening

Quote

Innovation distinguishes between a leader and a follower.
Steve Jobs

Timeline

Senior Data Engineer

Chewy
12.2018 - Current

Azure Architect

Accenture/Shire
06.2017 - 12.2018

BI Architect

Accenture/Shire
09.2014 - 05.2017

BI Team Lead

Accenture/Sanofi
03.2011 - 08.2014

Data Modeler and MSBI Developer

Cognizant/ACE Insurance
12.2009 - 02.2011

Data Modeler and MSBI Developer

Cognizant/ACE Insurance
11.2008 - 11.2009

ETL and Database Developer

Cognizant/WWL
07.2006 - 10.2008

Bachelor's Degree - Engineering

Anna University
04.2006