
BALAKRISHNAN SELLAM

New Egypt, NJ

Summary

Certified, senior-level professional with 20+ years of industry experience, including hands-on technical and leadership roles focused on driving adoption of cloud technologies. Hands-on experience with Snowflake, AWS Redshift data warehousing, and cloud technologies: AWS EC2, S3, SQS, SNS, CloudFormation, Data Pipeline, Streams, and cloud servers. Experience in data center and DB2-to-Aurora PostgreSQL migrations, managing onsite-offshore DBA teams, and training support teams. Broad exposure and hands-on experience with a wide range of RDBMS and BI tools. Deep data domain knowledge and excellent understanding of the financial services and healthcare industries. Creative problem solver, quick learner, and proactive team leader with excellent prioritization skills and the aptitude to balance multiple priorities. Strong presentation and communication skills. Passionate about learning new technologies.

Overview

14 years of professional experience

Work History

Data Consultant

StateFarm Insurance
11.2022 - 06.2023
  • Snowflake, DB2, PostgreSQL, MSSQL, MongoDB, Redshift
  • Experienced in DWH Redshift performance tuning for columnar databases
  • Proficient with Redshift architecture, storage, and query processing
  • Created Redshift clusters to improve query throughput, concurrency, and cost savings.

Sr. Cloud Data Consultant / Modeler

OPM .VA
08.2019 - 01.2020
  • Hands-on experience with Time Travel for recovery from accidental data removal
  • Created pipeline with S3 stage and notifications to trigger the pipeline
  • Automated file loads from S3 buckets
  • Managed roles (Account Admin, Sys Admin, Security Admin & User Admin) and privileges
  • Snowflake and AWS storage pricing knowledge for cost savings and effective usage
  • Hands-on experience with bulk loading and continuous loading
  • Hands-on experience with scaling up and scaling out for workloads and WLM
  • Created streams for CDC to capture online DML changes
  • Created data pipelines using Python
  • Good in Python programming
  • Created data samples using row and block methods
  • Created EC2 instance and migrated UDB 11.5 database to PostgreSQL 14.3 on AWS RDS
  • Escalation support for production data issues and performance tuning
  • Hands-on experience in PostgreSQL
  • Created EC2 instances with General Purpose SSD for best performance
  • Support for PostgreSQL, DB2 UDB, and MSSQL databases
  • Implemented EDB PostgreSQL Advanced Server for Oracle compatibility
  • Performance expert: tuning, VACUUM, VACUUM FULL, bloat management, and parameters
  • Hands-on experience with NoSQL MongoDB (big data)
  • Good understanding of Hadoop technologies
  • Expert in the Erwin data modeling tool
  • Created logical/physical models using reverse engineering
  • Developed PL/SQL packages to load/purge large volumes of data
  • Hands-on experience as an AWS cloud architect
  • Configured S3 storage options, including versioning and encryption
  • Configured VPN and security groups with NAT gateway
  • Implemented range partitioning using extensions
  • Experienced in CloudFormation, CloudTrail, IAM, Route 53, S3, and RBAC
  • Hands-on experience with CodePipeline, API Gateway, Lambda, and DynamoDB
  • Responsibilities:
  • Created EC2 instance and installed UDB 11.1 with FP4
  • Escalation support for production issues and performance tuning of business queries
  • Installed fix packs to avoid vulnerabilities and security risks
  • Performance tuning of complex DWH, OLTP, and OLAP queries
  • Successfully implemented CPU shares in Workload Manager
  • Hands-on experience with performance tools Dynatrace, DBI, DSM, and Zenoss
  • Expert in UDB DPF; built multi-host databases
  • Hands-on experience and strong architectural knowledge in Splunk and ELK
  • Automated routine tasks with shell scripts.

Data Consultant

CMS
01.2009 - 01.2020
  • Snowflake, DB2, PostgreSQL, MSSQL, MongoDB, Redshift
  • Environment: Snowflake XL 16-server warehouse
  • AWS EC2 m5 instance, 192 GB, AWS RA3 nodes
  • Responsibilities:
  • Experienced in DWH Snowflake, Redshift, and NoSQL MongoDB
  • Proficient with Snowflake architecture, storage, query processing, and cloud services
  • Created and configured a virtual warehouse with XL size (16 servers)
  • Created clusters to improve query throughput under high concurrency
  • Configured Auto Suspend and Auto Resume for the virtual warehouses

Cloud Database Developer

Huntington Bank
11.2018 - 08.2019
  • Environment: Five Physical Node DPF, with 20 Logical Partitions
  • Responsibilities:
  • Implemented WLM for the bank's DWH database on UDB 11.4
  • Escalation support for production issues and performance tuning
  • Installed fix packs to avoid vulnerabilities and security risks
  • Implemented the new RCAC feature for HR databases (row and column masking)
  • Performance tuning of complex DWH, OLTP, and OLAP queries
  • Successfully implemented CPU shares in WLM
  • Hands-on experience with performance tools Dynatrace, DBI, DSM, and Zenoss
  • Modeled and implemented database schemas in DB2 UDB

Replication Architect

Morgan Stanley
10.2015 - 09.2018
  • Hands-on experience with the pureScale cluster solution
  • Installed and upgraded from 10.5 to 11.1 for a 4-CF setup
  • Level 3 support for UDB DB2
  • Implemented new UDB 11 features for critical DPF and ESE applications
  • RCAC and RADIX for OLTP applications
  • Worked closely with application DBAs and the engineering team on enhancements
  • Performance issues and tuning using tools db2batch and db2caem
  • Expert in UDB DB2 DPF built on NAS storage

Performance Engineer MSSQL DBs

Bank of America
04.2015 - 10.2015
  • Env: HP DL380 Gen 8, 16 cores, 2.9 GHz, Windows 2008, MSSQL 2012
  • Responsibilities:
  • Tuning exercises on applications using the latest DMVs and DMFs
  • Worked on table range partitions for roll-in/roll-out
  • Created columnstore indexes for query performance
  • Implemented new features in MSSQL 2014 & 2016
  • Always Encrypted & Stretch Database
  • Set up HADR with multiple read-only standbys.

DB Admin / Developer

Global Support Team BARCLAYS
12.2012 - 03.2015
  • Env: HP DL380 Gen 8, 16 cores, 2.9 GHz, Red Hat Enterprise Linux Server release 6.4 (Santiago)
  • UDB 10.5 FP4 ESE/DPF multi-node cluster server; MS SQL 2008 R2, 2012 & 2014
  • Role: Level 3 support for the Data Warehouse team and other applications
  • DB2 UDB Responsibilities:
  • Expert in DPF partitioned database architectural design, setup, and installation
  • Developed stored procedures and automated scripts to roll in/roll out data for table-partitioned databases
  • Found and resolved an architectural issue by separating TMP, LOGS, and DATA file systems into dedicated volume groups for best OLTP performance
  • Support for production issues on a 24/7 basis
  • Very good with volume groups, file system setup, and SRDF configuration
  • Hands-on experience with SRDF setup, failover, and fallback
  • Worked on UDB 10.5 FP5 new features: HADR multiple standby and the new db2prereqcheck command
  • Explored new multi-temperature storage features for hot, warm, and cold data using STOGROUP and adaptive compression
  • Support for apply lag, capture queue depth, and ASN schema table issues
  • Cataloged databases and nodes (db2cfexp & db2cfimp)
  • Used tools DB2DART, DB2SUPPORT, and DB2PD
  • Experienced in DB2 recovery (crash recovery, roll-forward recovery, version recovery) and movement of data using the Load, Import, and Export utilities
  • Expert in analysis of RUNSTATS, table reorgs, and REORGCHK reports for SQL performance
  • Strong in partitioning concepts (hash, range & group by)
  • Strong with DB Artisan tools, Erwin, Quest Central / Spotlight, BMC Change Manager & Patrol
  • Expert in session log and workflow monitoring to improve ETL load performance
  • Knowledgeable in star and snowflake schema concepts and cube creation
  • Strong in SQL analysis: hash joins, nested-loop joins, and merge joins
  • Set up federated databases to move data from one environment to another
  • Perl and shell scripting.

Application DBA

CHASE
06.2012 - 12.2012
  • Environment: IBM P570 POWER6 Servers & IBM 3150 BCU DW
  • Responsibilities:
  • Set up bank application with HADR for high availability with minimal time and effort.

Architect

Bank of America
11.2010 - 06.2012
  • For the Data Warehouse ETL team
  • Environment: front end Cognos reports; back end DB2 UDB, five physical nodes, 23 TB database
  • Responsibilities:
  • Built on UDB 9.5 FP9 DPF database
  • Support for production issues on a 24/7 basis.

Education

Bachelor's - Electronics and Communication

SKA POLYTECHNIC
New York, NY
12.2003

Skills

  • Multiple RDBMS, SNOWFLAKE & REDSHIFT
  • SQL, NOSQL Architect & Admin
  • System Mapping
  • Database Updates
  • Reverse Engineering
  • Data Lakes
  • Data Migration
  • Performance Tuning

Accomplishments

  • CERTIFICATIONS:
  • IBM Certified Specialist (DB2 UDB)
  • IBM Certified Solutions Expert (DB2 UDB Database Administration for Linux, UNIX, Windows and OS/2)
