
MOTHIRAM RAJASEKARAN

Jacksonville, Florida

Summary

Dependable Senior Consultant skilled at building lasting relationships with clients and customers by carefully addressing their needs and fulfilling business goals. Consistently available to address customer concerns, and keen to help companies connect clients with solutions that meet their needs while driving internal revenue growth.

Overview

  • 14 years of professional experience
  • 5 certifications

Work History

Senior Solution Consultant

Cloudera Inc
06.2021 - Current


  • Perform proof-of-concept demonstrations, instructing potential customers on the benefits of Cloudera
  • Analyze complex distributed production deployments and make recommendations to optimize performance
  • Work directly with customers' technical resources to devise and recommend solutions based on understood requirements
  • Recommend specific security products to customers based on the unique technical requirements of each
  • Coordinate support responses to customer issues, verifying closure of concerns and correction of deficiencies
  • Perform detailed research of customer business structures to accurately tailor Cloudera to the unique needs of each
  • Correct, modify and upgrade software to improve performance
  • Keep current with Hadoop and big data ecosystem technologies
  • Attend speaking engagements as needed
  • Communicate with advertising and sales teams to integrate customer feedback into future promotional efforts
  • Develop and maintain long-term relationships with clients, fostering strong service bonds and encouraging return patronage

Lead Hadoop Developer

Synergy Technologies, Florida Blue
07.2019 - 05.2021
  • Successfully migrated POPHealth, Census and Claims data from SQL Server stored procedures to PostgreSQL using Hadoop technologies (a minimal sketch of this kind of migration job follows this list).
  • Developed highly maintainable Hadoop code and followed all best practices regarding coding.
  • Authored documentation for data dictionaries, business rules and intake parameters and presented collected information to decision-makers.
  • Verified new product development effort alignment with supportability goals and proposed Service Level Agreement (SLA) parameters.
  • Performed data cleaning on structured information using various Hadoop tools.
  • Met with key stakeholders to discuss and understand project scope, tasks required and deadlines.
  • Supervised Hadoop projects and offered assistance and guidance to junior developers.
  • Contributed ideas and suggestions in team meetings and delivered updates on deadlines, designs, and enhancements.
  • Corrected, modified and upgraded software to improve performance.
  • Authored code fixes and enhancements for inclusion in future code releases and patches.
  • Coordinated deployments of new software, feature updates and fixes.
  • Tested troubleshooting methods and documented resolutions for inclusion in knowledge base for support team use.
  • Translated technical concepts and information into terms parties could easily comprehend.
  • Tuned systems to boost performance.
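
A minimal PySpark sketch of the kind of SQL Server-to-PostgreSQL migration job described in the first bullet above, under the assumption that Spark was among the Hadoop technologies used; the table names, JDBC URLs and credentials are placeholders, not the actual project configuration.

    from pyspark.sql import SparkSession

    # Placeholder connection details -- not the actual project settings.
    SQLSERVER_URL = "jdbc:sqlserver://sqlserver-host:1433;databaseName=pophealth"
    POSTGRES_URL = "jdbc:postgresql://postgres-host:5432/pophealth"

    spark = SparkSession.builder.appName("claims-sqlserver-to-postgres").getOrCreate()

    # Read the source table over JDBC from SQL Server.
    claims = (
        spark.read.format("jdbc")
        .option("url", SQLSERVER_URL)
        .option("dbtable", "dbo.claims")
        .option("user", "etl_user")
        .option("password", "etl_password")
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .load()
    )

    # Logic formerly embedded in stored procedures is re-expressed as
    # DataFrame transformations (illustrative filter only).
    active_claims = claims.filter(claims["claim_status"] == "ACTIVE")

    # Write the result to the target PostgreSQL database.
    (
        active_claims.write.format("jdbc")
        .option("url", POSTGRES_URL)
        .option("dbtable", "public.claims")
        .option("user", "etl_user")
        .option("password", "etl_password")
        .option("driver", "org.postgresql.Driver")
        .mode("overwrite")
        .save()
    )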

Senior Hadoop Developer

Hexaware Technologies, Freddie Mac
04.2018 - 07.2019
  • Freddie Mac is a public, government-sponsored enterprise whose securities represent participation in pools of mortgages guaranteed by the Federal Home Loan Mortgage Corporation.
  • Loan Processing Advisory data residing in different source systems (mainframe, SQL Server and file systems) had to be migrated to the Hadoop environment.
  • Implemented and operated runtime data environments to maintain uptime targets.
  • Gathered client requirements by studying functional documents and conducting functional-spec discussions with the business team.
  • Gained knowledge of both source and target processes and data models for effective migration.
  • Created Apache NiFi flows to migrate data from the mainframe to the Hadoop environment.
  • Created Apache NiFi flows to migrate data from SQL Server to Hadoop.
  • Checked data quality and completeness using a checksum mechanism.
  • Created Apache Spark/Python scripts to apply business logic and wrote Hive queries to load data into external and internal tables (a minimal sketch follows this list).
  • Stored data in ORC file format with compression to improve performance.
  • Created partitions and buckets for performance improvement.
  • Developed highly maintainable Hadoop code and followed all best practices regarding coding.
  • Authored documentation for data dictionaries, business rules and intake parameters and presented collected information to decision-makers.
  • Met with key stakeholders to discuss and understand project scope, tasks required and deadlines.
  • Supervised Hadoop projects and offered assistance and guidance to junior developers.
  • Contributed ideas and suggestions in team meetings and delivered updates on deadlines, designs, and enhancements.
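
A minimal PySpark sketch of the load pattern described above: an illustrative checksum-style completeness check followed by a compressed, partitioned ORC write into a Hive external table. The landing path, table layout and column names are hypothetical, not the actual Freddie Mac schema.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("loan-advisory-load")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Hypothetical landing area populated by the NiFi flows.
    raw = spark.read.option("header", "true").csv("/data/landing/loan_advisory/")

    # Simple completeness/quality check: a CRC32-based checksum over all columns,
    # to be compared between source and curated output.
    checksum = raw.select(F.sum(F.crc32(F.concat_ws("|", *raw.columns))).alias("chk"))
    checksum.show()

    # Illustrative business rule applied with the DataFrame API.
    curated = (
        raw.filter(F.col("loan_status").isNotNull())
        .select(
            "loan_id",
            "loan_status",
            F.col("balance").cast("double").alias("balance"),
        )
        .withColumn("load_date", F.current_date().cast("string"))
    )

    # External Hive table stored as compressed ORC, partitioned for performance.
    spark.sql("""
        CREATE EXTERNAL TABLE IF NOT EXISTS loan_advisory (
            loan_id STRING,
            loan_status STRING,
            balance DOUBLE
        )
        PARTITIONED BY (load_date STRING)
        STORED AS ORC
        LOCATION '/data/curated/loan_advisory'
        TBLPROPERTIES ('orc.compress' = 'SNAPPY')
    """)

    # Write partitioned ORC files to the table location, then register partitions.
    (
        curated.write.mode("append")
        .partitionBy("load_date")
        .option("compression", "snappy")
        .orc("/data/curated/loan_advisory")
    )
    spark.sql("MSCK REPAIR TABLE loan_advisory")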

Hadoop Developer

Hexaware Technologies, IQVIA
12.2015 - 03.2018
  • Researched and designed platform migration strategies, creating frameworks to transfer assets from IBM Mainframe to Hadoop utilities.
  • Met with key stakeholders to discuss and understand project scope, tasks required and deadlines
  • Authored documentation for data dictionaries, business rules and intake parameters and presented collected information to decision-makers
  • Performed data cleaning on structured information using various Hadoop tools (see the sketch after this list)
  • Developed highly maintainable Hadoop code and followed all best practices regarding coding
  • Corrected, modified and upgraded software to improve performance
  • Tested troubleshooting methods and documented resolutions for inclusion in knowledge base for support team use
  • Designed and implemented scalable applications for data extraction and analysis
  • Tuned systems to boost performance
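
A minimal PySpark sketch of the kind of structured-data cleaning mentioned above (the role also used other Hadoop tools); the input path, column names and rules are hypothetical examples, not the actual IQVIA data.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("structured-data-cleaning").getOrCreate()

    # Hypothetical extract landed from the mainframe migration.
    records = spark.read.option("header", "true").csv("/data/landing/claims_extract/")

    cleaned = (
        records
        .dropDuplicates(["record_id"])                   # drop duplicate keys
        .na.drop(subset=["record_id", "service_date"])   # require mandatory fields
        .withColumn("service_date", F.to_date("service_date", "yyyyMMdd"))
        .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
        .withColumn("provider_name", F.trim(F.upper(F.col("provider_name"))))
    )

    cleaned.write.mode("overwrite").parquet("/data/clean/claims_extract/")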

Hadoop Developer

Cognizant Technology Service, TD Ameritrade
01.2013 - 11.2015
  • One of the customer's key goals is to provide a unique, personalized customer experience, which makes understanding each customer's likes and dislikes essential
  • Collected and analyzed large amounts of customer data around the clock from several touchpoints: websites, mobile apps, the credit card program, the loyalty program, social media and online chat
  • Data from these touchpoints could be structured, semi-structured or, in a few cases, unstructured
  • All of this data was collected, aggregated and analyzed in the Hadoop cluster to find trading patterns and customer preferences, identify cross-sell or upsell opportunities and devise targeted marketing strategies, improving the overall user experience (a minimal query sketch follows this list)
  • Worked on a live 50-node Hadoop cluster running Hortonworks with highly unstructured and semi-structured data of 1 TB in size (replication factor of 3)
  • Extracted data from Netezza, Oracle and SQL Server into HDFS using Sqoop
  • Created and ran Sqoop (version 1.4.3) jobs with incremental loads to populate Hive external tables
  • Wrote extensive Pig (version 0.11) scripts to transform raw data from several data sources into baseline data
  • Developed Hive (version 0.10) scripts to meet end-user and analyst requirements for ad hoc analysis
  • Used SequenceFile and ORC file formats
  • Gained working knowledge of HBase.
  • Authored documentation for data dictionaries, business rules and intake parameters and presented collected information to decision-makers.
  • Met with key stakeholders to discuss and understand project scope, tasks required and deadlines.
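
A minimal sketch of the kind of ad hoc aggregation described above, shown here as a HiveQL query submitted through PySpark for consistency with the other sketches (the role itself used Hive 0.10 and Pig scripts). The database, table and column names are illustrative placeholders.

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("trading-pattern-analysis")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Aggregate per-customer trading activity to surface patterns and
    # candidate cross-sell segments.
    trading_patterns = spark.sql("""
        SELECT customer_id,
               asset_class,
               COUNT(*)          AS trade_count,
               SUM(trade_amount) AS total_traded
        FROM analytics.trades
        GROUP BY customer_id, asset_class
        HAVING COUNT(*) > 10
    """)

    # Persist the aggregate for downstream marketing and upsell analysis.
    trading_patterns.write.mode("overwrite").saveAsTable("analytics.trading_patterns")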

Education

Master of Science - Computer Science

Middlesex University
London
02.2010

Bachelor of Science - Electronic and Communication

Anna University
India
06.2007

Skills

  • Hadoop Development
  • Hadoop Administration
  • Data Migration
  • Application Migration
  • Apache Spark
  • Apache Spark SQL
  • Distributed Programming
  • Data Analysis
  • Data Collection, Transformation and Loading
  • Technical Support
  • System Architecture
  • Security Planning
  • Data Lakes
  • Data Warehousing
  • Quality Assurance
  • Best Practices
  • Code Validation

Accomplishments

Received a Code Award at Cloudera during a major upgrade for one of its leading financial customers.

Certification

  • Certified Modern Data Architecture. Cloudera - 2022
  • Cloudera Data Platform: Security Administration for CDP Base. Cloudera - 2022
  • Microsoft Certified: Azure Fundamentals. Microsoft - 2021
  • CCA 175 Spark Hadoop Developer. Cloudera - 2017
  • Cognizant Certified Professional in Hadoop. Cognizant - 2015
  • Cognizant Certified Professional in Informatica. Cognizant - 2015
  • Putting Research into Practice: RDF Realized as a Best Technology. Middlesex University - 2010
  • Microsoft Certified System Administrator. Microsoft - 2008

Timeline

Senior Solution Consultant

Cloudera Inc
06.2021 - Current

Lead Hadoop Developer

Synergy Technologies, Florida Blue
07.2019 - 05.2021

Senior Hadoop Developer

Hexaware Technologies, Freddie Mac
04.2018 - 07.2019

Hadoop Developer

Hexaware Technologies, IQVIA
12.2015 - 03.2018

Hadoop Developer

Cognizant Technology Service, TD Ameritrade
01.2013 - 11.2015

Master of Science - Computer Science

Middlesex University

Bachelor of Science - Electronic and Communication

Anna University