Rakesh Tatineni

Louisville,KY

Summary

Seasoned technology professional with 14+ years of experience in the IT industry.

  • Strong experience with IBM Information Server (DataStage); comprehensive knowledge of data warehousing and business intelligence in the retail and insurance domains.
  • Extensive experience in design and development with DataStage Designer, including troubleshooting DataStage jobs, addressing production issues, fixing defects, and performance tuning.
  • Experience with highly scalable parallel processing infrastructure using DataStage Enterprise Edition (PX).
  • Working knowledge of Big Data and Apache Hadoop, Ruby on Rails, and RESTful APIs.
  • Proficient in ETL processes across SQL Server, Oracle, and DB2 databases.
  • Extensive configuration and integration work on ClaimCenter, PolicyCenter, and BillingCenter applications.
  • GOSU: wrote and enhanced multiple utility classes, query builder classes, mapping classes, and GOSU enhancements to meet complex business needs; extensive use of transaction bundles.
  • Extensive design and development experience with Dell Boomi; good understanding of Amazon Web Services and Docker.
  • Well versed in the UNIX operating system: shell scripting and commands.
  • Excellent knowledge of studying data dependencies using DataStage metadata and preparing job sequences for existing jobs to facilitate scheduling of multiple jobs.
  • Involved in all phases of the software development lifecycle (SDLC).
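The query builder classes mentioned above follow the fluent-builder pattern. A minimal Python sketch of that pattern (the `Policy` entity, field names, and rendering to SQL text are purely illustrative, not Guidewire's actual API):

```python
class Query:
    """Minimal fluent query builder: collects conditions, renders them as SQL text."""

    def __init__(self, entity):
        self.entity = entity
        self.conditions = []

    def compare(self, field, op, value):
        # Record one condition; returning self is what enables method chaining.
        self.conditions.append(f"{field} {op} '{value}'")
        return self

    def to_sql(self):
        where = " AND ".join(self.conditions)
        return f"SELECT * FROM {self.entity}" + (f" WHERE {where}" if where else "")

sql = Query("Policy").compare("State", "=", "KY").compare("Status", "=", "InForce").to_sql()
print(sql)  # SELECT * FROM Policy WHERE State = 'KY' AND Status = 'InForce'
```

The pattern keeps call sites readable: each condition is one chained call, and the query is only rendered once `to_sql()` is invoked.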

Overview

13 years of professional experience
1 Certification

Work History

Distributed Programmer/Analyst III

Kentucky Farm Bureau
08.2019 - Current
  • GWPC Integration Point #1: Document Generation
  • Description: Business wanted to generate real-time documents with policy information for all transactions
  • These documents were required to print both to a printer and as PDFs
  • Used the XSD functionality available in Studio to send and receive payloads from Dialogue (the document generator).
  • GWPC Integration Point #2: Archive Documents
  • Description: Business required documents to be archived with policy information for all transactions
  • Sent payloads to Dialogue (the document generator), which uses the message to generate the document and store it in OnBase.
  • GWPC Integration Point #3: Batch Print
  • Description: Documents sent to customers via mail do not require all information on the policy
  • Information for batch print is mapped and sent to database tables, which are then picked up by ETL and sent to Dialogue.
  • GWPC Integration Point #4: Rating Mapping
  • Description: Sent requests to and accepted responses from the Blaze rating engine
  • Built cost objects and rating factors from the data returned by the Blaze rating engine.
  • GWPC Integration Point #5: Diffs Generation
  • Description: Generated the diffs required for the view differences (VOV) screen on policy change transactions
  • Generated only the diffs the business wants, since unwanted rating changes come back from PolicyCenter during change transactions
  • Also configured the VOV screen when required.
  • GWPC Integration Point #6: Configuring Forms
  • Description: Configured availability and inference of forms as required by the business
  • Used Product Designer extensively during print and rating integration
  • Sound working knowledge of Product Designer.
  • Maintained existing applications and designed and delivered new applications.
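The diffs-generation point above boils down to comparing two policy snapshots and filtering out the rating noise. A minimal Python sketch of that idea (field names and the `Rating` prefix convention are hypothetical, not KFB's actual data model):

```python
def policy_diffs(before, after, ignore_prefixes=("Rating",)):
    """Compare two flat policy snapshots and keep only business-relevant changes,
    dropping any field whose name starts with an ignored prefix (rating noise)."""
    diffs = {}
    for field in set(before) | set(after):
        if any(field.startswith(p) for p in ignore_prefixes):
            continue  # unwanted rating changes are suppressed from the VOV diff
        if before.get(field) != after.get(field):
            diffs[field] = (before.get(field), after.get(field))
    return diffs

old = {"Deductible": 500, "RatingFactor": 1.02, "VIN": "ABC123"}
new = {"Deductible": 1000, "RatingFactor": 1.07, "VIN": "ABC123"}
print(policy_diffs(old, new))  # {'Deductible': (500, 1000)}
```

Only the deductible change survives: the rating-factor change is filtered, and unchanged fields never appear.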

Software Developer Analyst

V-soft consulting
04.2016 - 08.2018
  • KFB PAS R4: brought property lines of business from PolicyCenter to the target databases ODS 2.0 (Operational Data Store) and ADW (Atomic Data Warehouse) via ETL processes built in DataStage
  • Responsible for all outgoing print mailed to insureds and additional interests
  • Also played a key role in the KFB Lloyds Earthquake project, creating the OTR file that feeds the newly created Earthquake Property Cash database, and created the Lloyds ACORD report sent to Lloyd's of London via FTP.
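The core of an ETL step like the PolicyCenter-to-ODS load above is a per-record mapping from the source layout to the target layout. A minimal Python sketch (field names are illustrative only, not the actual KFB schemas):

```python
def to_ods_row(pc_row):
    """Map one PolicyCenter extract record to an ODS-style target layout:
    trim keys, derive codes, and coerce types on the way through."""
    return {
        "POLICY_NBR": pc_row["PolicyNumber"].strip(),            # clean the natural key
        "LOB_CD": pc_row["LineOfBusiness"][:3].upper(),          # derive a short LOB code
        "EFF_DT": pc_row["EffectiveDate"],                       # pass-through, already ISO
        "PREMIUM_AMT": round(float(pc_row["Premium"]), 2),       # string -> currency amount
    }

src = {"PolicyNumber": " HO-001 ", "LineOfBusiness": "Homeowners",
       "EffectiveDate": "2017-06-01", "Premium": "1234.567"}
print(to_ods_row(src))
```

In DataStage this kind of mapping lives in a Transformer stage; expressing it as a pure function makes the rule easy to unit test before it is wired into the job.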

Software Developer

Coupa
10.2014 - 04.2016
  • Coupa is a cloud-based suite of financial applications
  • It is a spend management solution that provides visibility and control over all the ways spend happens: procurement, expenses, and AP

System Analyst

Gemini It Labs
05.2014 - 10.2014
  • Macy's is one of the leading retail chains in the US
  • The business wanted to develop a process called RDPP (Rules Driven Product Placement): a process to display products on Macy's web portal based on calculated metric scores
  • Twenty different values/metrics were taken into account to perform these calculations
  • These included on-hand quantity available, newness, rebates, promotions, ratings, etc.
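A metric-driven placement score like RDPP's is, at heart, a weighted sum over normalized metrics. A minimal Python sketch (the weights, metric names, and five-metric subset are hypothetical; the real process used twenty metrics):

```python
def placement_score(metrics, weights):
    """Weighted composite score over normalized (0..1) metrics;
    a missing metric contributes 0. Higher scores rank the product earlier."""
    return sum(weights[m] * metrics.get(m, 0.0) for m in weights)

weights = {"on_hand_qty": 0.3, "newness": 0.2, "rebate": 0.1,
           "promotion": 0.2, "rating": 0.2}
product = {"on_hand_qty": 0.8, "newness": 1.0, "rebate": 0.0,
           "promotion": 0.5, "rating": 0.9}
print(placement_score(product, weights))  # approximately 0.72
```

Ranking the catalog by this score, highest first, yields the display order for the portal.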

System Analyst

Gemini It Labs
04.2012 - 04.2014
  • KFB (Kentucky Farm Bureau) had a requirement to capture operational data pertinent to insurance, such as daily transactions on insurance policies and any events on, or changes to, existing policies
  • Based on these changes, notifications had to be sent to the respective customers
  • Data extracted from the operational mainframe systems and PolicyCenter needed to be transformed through the DataStage ETL tool by applying the appropriate business rules
  • This data was finally loaded into the ODS 2.0 database, where event letters and notifications were generated and sent to customers using various modes of communication.

SQL Developer

IT Keysource Inc
05.2011 - 03.2012
  • Captured customer preferences from a source system and updated the corresponding data in a customer hub with the respective preferences
  • The primary intent was to enhance the customer experience around preferences for receiving promotional messages through email and/or mobile
  • Data needed to be consumed from different sources and cleansed in DataStage on the staging layer
  • Data then passed through an integration layer where various business transformations/rules were applied
  • Finally, the data was loaded into the target DB2 database.
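The staging-then-integration layering above can be sketched as two small, composable functions: the staging layer normalizes raw values, and the integration layer applies a business rule before load. A minimal Python sketch (field names and the `no_contact` rule are illustrative assumptions, not the actual hub schema):

```python
def cleanse(record):
    """Staging layer: trim identifiers and normalize raw Y/N flags to booleans."""
    truthy = ("Y", "YES", "1")
    return {
        "customer_id": record["customer_id"].strip(),
        "email_opt_in": record["email_opt_in"].strip().upper() in truthy,
        "mobile_opt_in": record["mobile_opt_in"].strip().upper() in truthy,
    }

def integrate(record):
    """Integration layer: derive a business flag -- a customer with no channel
    opted in is marked no_contact so downstream jobs skip promotions."""
    record["no_contact"] = not (record["email_opt_in"] or record["mobile_opt_in"])
    return record

raw = {"customer_id": " C42 ", "email_opt_in": "n", "mobile_opt_in": "N"}
print(integrate(cleanse(raw)))
```

Keeping the layers separate mirrors the DataStage job design: the cleansing rules can change without touching the business rules, and each layer is testable on its own.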

Education

Master's in Electrical Engineering

University of New Haven
New Haven, CT
12.2010

Bachelor's in Electronics and Instrumentation

JNTU
Hyderabad, India
05.2008

Skills

  • Industry Standard Products: Guidewire PolicyCenter, IBM DataStage
  • Programming Languages: Java, Gosu, Ruby on Rails, Python, WebFOCUS, Groovy
  • Scripting: JavaScript, jQuery, CSS, HTML, XML, SQL, Shell
  • Frameworks: Spring Core, Spring Web MVC, Spring Batch, Hibernate, Rails
  • Web Services: SOAP & REST
  • Other Technologies: DocuSign, AWS, Docker imaging
  • Tools: Eclipse, JUnit, Gradle, IntelliJ, Apache MQ, Dell Boomi
  • Methodologies: Waterfall, Agile, SDLC
  • Application Servers: IBM WebSphere, WebLogic, Tomcat
  • Database Platforms: IBM DB2, Oracle, MySQL

Certification

Guidewire Certified ACE - PolicyCenter Configuration

Guidewire Certified ACE - InsuranceSuite Integration

Specialist Certification - InsuranceSuite 10.0 Integration

Specialist Certification - PolicyCenter 10.0 Configuration

Associate Certification - InsuranceSuite 10.0 Developer


Software

Java

Python

SQL

AWS

Docker

Boomi
