
VARUN S

Summary

  • More than 9 years of experience across all phases of the Software Development Life Cycle, including analysis, design, development, and maintenance of business applications on the Oracle relational database management system.
  • Expertise in the design and implementation of PL/SQL stored procedures, functions, packages, ref cursors, constraints, database links, UTL_FILE routines, and database triggers for Oracle relational databases.
  • Proficient in data manipulation using SQL (inner joins, outer joins, GROUP BY, ORDER BY, cursors, etc.) and in Oracle tools and utilities such as SQL*Loader, TOAD, SQL Developer, and PL/SQL Developer.
  • Experience in query optimization and performance tuning using Explain Plan and Performance Monitor; implemented tuning techniques at the application, database, and system levels.
  • Strong experience developing SQL and PL/SQL scripts and programs for data analysis and Extract, Transform, Load (ETL) processing, including data design, mapping, loading, testing, and database tuning.
  • Extensive client-server development experience with Oracle 10g/9i/8i/7.x on Windows and UNIX/Linux platforms, profound knowledge of Oracle 11g, and experience migrating Oracle 11g to Oracle 19c on AWS RDS.
  • Very good experience with UNIX cron jobs, shell scripting, and batch processing against highly complex data models.
  • Extensive experience with data warehouse platforms such as Oracle Exadata and Teradata, including tuning queries on Exadata with OEM Cloud Control 13c.
  • Solid understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schema design methodologies, and metadata management.
  • Expert in coding complex, advanced PL/SQL using Oracle's object-oriented features: collections, records, index-by tables, object types and methods, parallel queries, materialized views, bulk collects, bulk loads, regular expressions, FORALL, and MERGE.
  • Excellent proficiency in data transformation and loading using Export/Import and SQL*Loader; worked with external tables and transportable tablespaces.
  • Experience developing dashboards, including drill-down dashboards, in Tableau, and developing and supporting applications based on SQL Server and SSIS.
  • Experience developing PowerShell and batch scripts; strong experience with GitLab administration.
  • Extensive experience with SAS 9.4 for data integration and ETL processing.
  • Working knowledge of NoSQL and open-source databases such as MongoDB and PostgreSQL; good knowledge of data modeling and reporting.
  • Working experience with the Hadoop ecosystem and data lakes: querying data with Hive, Impala, and Spark; resolving discrepancies among them (data types and functions); automating workflows with Python; and using Hive for data analytics and data-quality checks.
  • Strong analytical skills with the ability to quickly understand clients' business needs and identify available data sources and the relationships among them.
  • Able to learn and adapt quickly in a dynamic environment and to work on multiple projects against aggressive deadlines.
  • Good communication, writing, presentation, workforce-management, interpersonal, and analytical skills; a self-starter who works well independently and as a team player.

Strategic, detail-oriented Data Engineer with a proven track record in fast-paced environments and a 9-year background managing the development, design, and delivery of database solutions. Skilled at distilling and analyzing large data sets, designing algorithms and data-collection processes, and testing, validating, and reformulating models; employs statistical software to manipulate data and forecast outcomes. Brings in-depth knowledge of data manipulation techniques and computer programming, with expertise in integrating new software packages and products into existing systems. Strong abilities in Oracle PL/SQL and Hive, outstanding communication and organizational skills, and a proactive, punctual, team-oriented mentality.

Overview

9 years of professional experience

Work History

Sr. Data Engineer

Bank of America
Charlotte, NC
03.2022 - Current
  • Develop stored procedures, packages and functions to load data from different sources that include equities, monetary and exposure amounts of different counterparties
  • Provide reports to business on demand
  • Develop and deploy shell scripts to support automation and other utilities for application support
  • Develop and tune queries to support several views in tableau dashboard
  • Work with business analysts and product owners on requirements gathering and application analysis
  • Work with the team or as an individual contributor to perform analysis, design, development and testing of solutions to meet requirements
  • Read execution plans and apply tuning methodology to SQL statements to enhance load performance
  • Develop and recommend strategies and specifications for (technical) data solutions based on the analysis of the business goals, objectives, needs, and existing data systems infrastructure
  • Develop new logical, conceptual, and physical data models for data and system flows
  • Support Hadoop ecosystem to perform various data related activities between environments in lower lanes
  • Support regression runs in lower lanes and deploy code to preprod environments for testing code via regression runs
  • Enforce data quality standards and governance processes
  • Collaborate with data stewards and business users to ensure data accuracy, consistency, and integrity
  • Identify and address performance bottlenecks in data pipelines and databases
  • Make use of existing on-premises engineered applications to enhance performance
  • Monitor yarn jobs to track progress of data movement jobs between higher and lower lanes
  • Lead troubleshooting efforts and resolve issues related to data infrastructure
  • Ensure adherence to coding standards and contribute to the development of best practices within the organization.
  • Maintain open communication with team members and stakeholders to ensure successful project outcomes.
  • Deploy code to lower lanes using Ansible Tower.
  • Use the qTest Manager suite to add and execute test cases in lower lanes.
  • Coordinate with multiple teams on release items.
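The data-quality and governance bullets above describe checks for accuracy, consistency, and integrity. A minimal Python sketch of that kind of batch-level check (field names and the record shape are illustrative assumptions, not any specific production pipeline):

```python
def run_quality_checks(rows, key_field, required_fields):
    """Run basic data-quality checks on a batch of records.

    Flags rows with missing required fields and duplicate key values.
    Teaching sketch only -- field names are hypothetical.
    """
    missing = []            # rows with a null/absent required field
    seen, dupes = set(), set()
    for row in rows:
        if any(row.get(f) in (None, "") for f in required_fields):
            missing.append(row)
        key = row.get(key_field)
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return {"missing_required": missing, "duplicate_keys": sorted(dupes)}
```

In a Hadoop setting the same assertions would typically be expressed as Hive queries (COUNT of NULLs, GROUP BY ... HAVING COUNT(*) > 1) driven from a Python workflow.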

Sr. Oracle Developer

IsolveTech
Morrisville, NC
10.2021 - 03.2022
  • Generated DDL scripts and Created and modified database objects such as tables, views, sequences, functions, synonyms, indexes, packages, stored procedures using TOAD tool
  • Created and used Table Partitions to further improve the performance while using tables containing large number of columns and rows
  • Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable
  • Used Bulk collect and For all in stored procedures to improve the performance and make the application to run faster
  • Performed SQL and PL/SQL tuning to improve the performance with the help of SQL Trace, Explain Plan, Indexing and Hints
  • Implemented CTAS (Create Table As Select) approach to optimize archive process
  • Provide quality operations support for production environment
  • Work with QA and provide support to provide timely technical resolutions for defects
  • Involved in performance tuning using Indexes, Hints, Explain Plan, Stats gathering etc
  • Analyze and reverse-engineer existing ETL processes built in Oracle DB
  • Create requirement specification document to discuss further on application enhancements with stakeholders
  • Create technical specification document for Migrating existing ETLs to Oracle on AWS RDS
  • Parse response flat files from the data source and load them into Oracle using external tables
  • Involve in requirements gathering meetings to drive technical discussions
  • Validate file generation using RDS logs
  • Use file utilities in AWS RDS to read and write files to Oracle directories.
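The flat-file parsing step above can be sketched in Python: a delimited response file is read into dicts of the shape typically staged via an Oracle external table. The pipe delimiter and column names are assumptions for illustration:

```python
import csv
import io

def parse_response_file(text, delimiter="|"):
    """Parse a delimited response file (header row + data rows) into a
    list of dicts, ready for staging. Delimiter and columns are
    illustrative assumptions, not a specific vendor's file layout."""
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return [dict(row) for row in reader]
```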

Sr. Oracle Developer

Bank of America
Charlotte, NC
10.2020 - 09.2021
  • Develop stored procedures, packages and functions to load data from different sources that include equities, monetary and exposure amounts of different counterparties
  • Provide reports to business on demand
  • Support and monitor Autosys jobs for batch scheduling
  • Develop and deploy shell scripts to support automation and other utilities for application support
  • Develop and tune queries to support several views in tableau dashboard
  • Work with business analysts and product owners on requirements gathering and application analysis
  • Work with the team or as an individual contributor to perform analysis, design, development and testing of solutions to meet requirements
  • Read execution plans and apply tuning methodology to SQL statements to enhance load performance
  • Develop and recommend strategies and specifications for (technical) data solutions based on the analysis of the business goals, objectives, needs, and existing data systems infrastructure
  • Develop new logical, conceptual, and physical data models for data and system flows
  • Develop technical strategies and architectures for high-performance, scalable, and complex enterprise-grade system
  • Provide architectural guidance and best practices for the application team's use of the Oracle Exadata platform
  • Ensure the health and wellness of the Oracle Exadata platform as it relates to application usage
  • Assist infrastructure DBAs, Oracle support and application development team members with resolving platform issues
  • Develop, enhance, and debug existing Python scripts and PostgreSQL scripts.

Oracle PLSQL Developer

Kaiser Permanente-DOR
Oakland, CA
06.2018 - 10.2020
  • As part of a collaborative effort across Kaiser Northern California, built an Oracle-based virtual data warehouse (VDW) to provision data analytics for health-care providers in different Kaiser departments
  • The VDW is refreshed daily, monthly, and quarterly
  • Worked extensively on the Teradata-to-Oracle Exadata migration
  • Worked on generating reports for further data analysis
  • Analyze cohort dataset based on medications, procedures and diagnosis
  • Used PL/SQL to extract, transform, and load (ETL) data into the data warehouse and developed PL/SQL scripts in accordance with the necessary business rules and procedures
  • Developed UNIX scripts and stored procedures/packages to run ETL jobs and to extract data from different data sources and load it into Oracle
  • Imported and exported data to an external portal using a JSON API
  • Used external tables to load data into Oracle
  • Used Java programs to extract data from Snappy DB and load it into the Oracle data warehouse
  • Used Oracle Scheduler to automate ETL job runs daily, weekly, monthly, and quarterly
  • Used Exadata to extract, transform, and load data to improve the performance of data loads
  • Provision Data sources to data scientists for further analysis
  • Administered and assisted setting up GIT repository for entire team
  • Worked extensively on production issues as priority
  • Wrote shell scripts to support ETL loads and ad-hoc requests
  • Created crontab entries to automate data loads via shell scripts
  • Created dynamic SQL and PL/SQL blocks to enhance the performance of updates on larger tables
  • Worked with external tables using Oracle directory objects
  • Used exchange partitions to address performance issues in data loads
  • Used Tableau Desktop to create dashboards visualizing data-load status and metrics
  • Used Tableau Online to host dashboards for users and the team itself
  • Responsible for developing complex SQL queries, stored procedures, functions, and packages in Oracle Exadata.
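The JSON API export step above can be sketched in Python: query rows are serialized into a JSON payload for an external portal. The envelope keys and date handling are assumptions, not the portal's actual contract:

```python
import json
from datetime import date

def rows_to_json_payload(rows):
    """Serialize query rows (a list of dicts) into a JSON payload for
    an external portal. Envelope keys ("records", "count") and the
    ISO date encoding are illustrative assumptions."""
    def encode(value):
        # Oracle DATE columns surface as Python dates; JSON needs strings.
        return value.isoformat() if isinstance(value, date) else value

    payload = {
        "records": [{k: encode(v) for k, v in row.items()} for row in rows],
        "count": len(rows),
    }
    return json.dumps(payload, sort_keys=True)
```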

Oracle PLSQL Developer

Credit Suisse
Raleigh, NC
01.2016 - 06.2018
  • Worked on the technical implementation of multiple global Orders, trades, products, positions, Reference data and Market Data ETL’s sourcing projects for the Actimize IBTS application to serve Credit Suisse’s Trades Surveillance
  • Worked on migrating data feeds, scripts, and models from Mantas to Actimize IBTS Platform
  • Worked on the trades, positions, reference data, Market data load process (ETL) within IBTS
  • Tuned huge Inside Quote data loads (3.7 Billion Records) to around 1 hour
  • Involved in getting the User Requirements, Data Modeling & Development of the system
  • Engaged in CAB meetings for upcoming changes, DR Tests, and Production Releases
  • Played key role in designing and implementing Strategic and Tactical sourcing of data from various systems, processing it and feeding into CDS Schema of IBTS System
  • Used PL/SQL to extract, transform, and load (ETL) data into the data warehouse and developed PL/SQL scripts in accordance with the necessary business rules and procedures
  • Developed Unix scripts, stored procedures / packages to run ETL jobs and to extract FUT Orders, Trades, Exchange information from LCDB database
  • Developed Linux Shell Scripts to implement Holiday Check, File watcher, Gunzip, Rename, Stats Collection and other generic modules
  • Configured above modules as Database Commands which helps in easy maintenance and reduce dependency on Control M
  • Worked on production issues in Mantas, HFT, AMTS, LCR, LCD, IBTS Applications
  • Build several Unix Shell scripts for PLSQL programs to schedule them on Control M
  • Performance Tuning of complex SQL queries using Explain Plan to improve the performance of the application
  • Upgraded/migrated the production and development databases from 9i to 11g
  • Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable
  • Used Bind Variables while writing Dynamic SQL to improve performance
  • Extensively worked on BULK COLLECTS, BULK INSERTS, and BULK UPDATES & BULK DELETES for loading, updating and deleting huge data
  • Used Oracle Analytical functions such as RANK, DENSE RANK, LEAD, LAG, LISTAGG & ROW_NUMBER Functions for sorting data
  • Extensively worked on creating partitioned indexes for large tables
  • Successfully validated all shell scripts on new Linux environment
  • Worked on control-M Scheduler for successfully working of these validated scripts to run in daily and weekly basis
  • Involved in developing SQL*Loader control files which are actively called from current shell scripts to load staging and master tables
  • Developed shell scripts in the Linux environment for automated load processes using existing procedures and packages related to sales/orders, service, and manufacturing data
  • Worked extensively on Ref cursors and alter table exchange partition and implemented in many procedures
  • Designed and developed several complex database procedures, packages
  • Extensively used features such as cursors, autonomous transactions, distributed transactions, exception handling, dynamic SQL, PL/SQL tables, bulk-load methods, optimizer hints, and cursor variables, and returned multi-record sets from procedures and functions
  • Performed SQL and PL/SQL tuning to improve the performance with the help of SQL Trace, Explain Plan, Indexing and Hints.
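The analytic functions bullet above (RANK, LEAD, LAG, LISTAGG, ROW_NUMBER) describes windowed computations over partitions. A small Python sketch of the windowing idea behind LAG(): within each partition, ordered by a key, attach the previous row's value. Column names here are hypothetical:

```python
from itertools import groupby
from operator import itemgetter

def add_lag(rows, partition_key, order_key, value_key, lag_name="prev"):
    """Emulate Oracle's LAG() analytic function: within each partition,
    ordered by order_key, attach the previous row's value_key value
    (None for the first row). A teaching sketch, not production ETL."""
    out = []
    rows = sorted(rows, key=itemgetter(partition_key, order_key))
    for _, group in groupby(rows, key=itemgetter(partition_key)):
        prev = None
        for row in group:
            row = dict(row)          # avoid mutating caller's rows
            row[lag_name] = prev
            prev = row[value_key]
            out.append(row)
    return out
```

In SQL this is `LAG(px) OVER (PARTITION BY sym ORDER BY day)`; the Python version just makes the partition/order/carry mechanics explicit.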

Oracle Developer

T-Mobile
GA, USA
10.2014 - 12.2015
  • Wrote SQL, PL/SQL, and SQL*Plus programs required to retrieve the selected data from the data repository using cursors and exception handling
  • Used PLSQL to Extract Transform and Load (ETL) the data into Data Warehouse and Developed PL/SQL scripts in accordance with the necessary Business rules and procedures
  • Before ETL, Involved in Data Analysis, Data Design, Data Integration and Data Mapping
  • After ETL, Involved in Data Validation, Data Performance in loading target table and for reporting purpose
  • Developed control files for SQL*Loader and PL/SQL scripts for Data Extraction/Transformation/Loading (ETL), loading data into interface tables (staging area) from different source systems and validating the data
  • Developed various Mappings to load data from various sources using different Transformations
  • Extracted data from different sources like Oracle, Flat files, External files and transformed the data based on Business requirements and loading into Oracle target database
  • Created and modified database objects such as Tables, Views, Materialized views, Indexes, Sequences and constraints, SQL queries (Sub queries and Join conditions)
  • Used Bulk Collect, Bulk Insert, Update functions in the ETL Programs
  • Created and used Table Partitions to further improve the performance while using tables containing large number of columns and rows
  • Extensive use of Oracle External Tables which is used to load the flat files (Essbase Extract) into Oracle Database
  • Extensively involved in performance tuning using Explain Plan, DBMS_PROFILER and Optimized SQL queries, created Materialized views for better performance
  • Analyzed over 80 objects to ensure that modifying existing structures would not create potentially unwanted behavior in downstream processes
  • Documented the PL/SQL packages, log files, locations and descriptions, log tables and possible error codes and message descriptions
  • Detected and corrected bugs during system integration and user acceptance testing.
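The Bulk Collect/Bulk Insert bullet above relies on processing rows in fixed-size batches so memory stays bounded during large loads, the same pattern PL/SQL expresses as `BULK COLLECT ... LIMIT n` inside a loop. A minimal Python sketch of that batching pattern (batch size is illustrative):

```python
def batched(iterable, size):
    """Yield successive fixed-size batches from any iterable -- the
    memory-bounding pattern behind BULK COLLECT ... LIMIT in PL/SQL.
    The final batch may be shorter than `size`."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:                 # flush the final partial batch
        yield batch
```

Each yielded batch would then be handed to a bulk DML call (e.g. `executemany` in a DB driver, or FORALL in PL/SQL) rather than row-by-row inserts.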

Education

Masters in Computer Science

University of Texas at Tyler

Bachelors in Computer Science and Engineering

Jawaharlal Nehru University

Skills

  • Tableau
  • WinSCP
  • Windows 95/98/2000/XP/NT 4.0, UNIX (Sun Solaris, HP-UX)
  • SQL, PL/SQL, C, Java
  • Oracle 12c/11g/10g/9i/8i, SQL Server 2000/2005, MS Access (2000, XP)
  • Oracle SQL Developer 12
  • Reports 6i, TOAD 10.1/9.6.1/8.5/8.1, Eclipse IDE
  • HTML, XML Publisher
  • Gitlab, SVN
  • Performance Tuning
  • Data Warehousing
  • Continuous Integration
  • Machine Learning
  • Data Modeling
  • Python Programming
  • NoSQL Databases
  • Data Analysis
  • SQL Transactional Replications
  • SQL and Databases
  • Data Migration
  • Big Data Technologies
  • Relational Databases
  • Database Design
  • Statistical Analysis
  • Advanced Data Mining
  • Database Maintenance
  • Software Development Life Cycle (SDLC)
  • SAS Programming
  • Database Programming and SQL
  • UNIX Systems
  • Data Integrity Validation
  • Git

Timeline

Sr. Data Engineer

Bank of America
03.2022 - Current

Sr. Oracle Developer

IsolveTech
10.2021 - 03.2022

Sr. Oracle Developer

Bank of America
10.2020 - 09.2021

Oracle PLSQL Developer

Kaiser Permanente-DOR
06.2018 - 10.2020

Oracle PLSQL Developer

Credit Suisse
01.2016 - 06.2018

Oracle Developer

T-Mobile
10.2014 - 12.2015

Masters in Computer Science

University of Texas at Tyler

Bachelors in Computer Science and Engineering

Jawaharlal Nehru University