
Jaya Krishna Vemuri

Summary

Skilled IT professional with over 14 years of experience in the software industry implementing high-volume, high-performance investment banking products using Oracle, Snowflake, MS Azure, and DBT. Proficient in cloud computing tools for building data warehousing platforms that perform efficiently and meet client expectations. Played a major role in all large-scale production deployments, with strong working expertise in investment accounting products such as PAM (Portfolio Asset Management) and GLS (General Ledger System), and in the Capital Markets domain.

Overview

14 years of professional experience
1 Certification

Work History

Vice President

State Street Bank and Trust Company
Princeton, NJ
06.2022 - Current

Middle Office Data Services:

Middle Office Data Services is a new cloud data warehouse and information delivery platform for Middle and Back Office applications that adheres to core design principles and delivers critical business capabilities:

  • Open Cloud-Native Architecture with Elastic Scalability and optimized Total Cost of Ownership.
  • Standardized Data Models and Information Delivery mechanisms.
  • Streamlined Client Onboarding processes.
  • Built-in Data Quality framework.
  • Comprehensive Operational tools and end-to-end automated delivery of Client reports across multiple geographical Regions with SLAs.

Responsibilities:

· Utilize the Data Vault modeling technique to deploy an innovative data warehousing solution.

· Implement Oracle GoldenGate replication to extract data from multi-tenant client database applications.

· Perform advanced data transformations for each domain, such as Valuations, Cash, Transactions, Portfolios, and Securities, in staging environments using Oracle packages before ingesting the data into the cloud for best performance.

· Transmit payload files and metadata events via Oracle AQ (Advanced Queuing) to Azure Data Lake and Event Hub.

· Configure data ingestion in Snowflake for efficient data transfer from Azure Data Lake to client-specific cloud database systems using Snowpipe, Streams, and Snowflake Tasks.

· Design and develop Snowflake stored procedures to implement advanced data handling, such as a row-versioning mechanism to identify the latest data for processing and a materialization process to ensure fast, accurate data retrieval for client dashboards and reports.

· Implement Snowflake views and database functions to perform data transformations and aggregations per business logic and to enable code reuse across multiple applications.

· Design and configure extracts/reports using end-to-end API calls for automatic generation and delivery of reports to the end client, with multiple delivery options such as SFTP, cloud, continuous data streaming, data share to another client's Snowflake account, and Managed File Transfer.

· Create a framework to unload the data and events between Snowflake Cloud and Azure Blob storage.

· Utilize DBT to perform complex aggregations and transformations and store the transformed data in Snowflake objects.

· Lead and manage DevOps operations to perform CI/CD across 20+ databases using AKS (Azure Kubernetes Service), schemachange, and Python libraries.

· Create a framework to perform local deployments in lower environments using AKS (Azure Kubernetes Service), schemachange, and Python libraries.

· Enhance performance using Snowflake's advanced features, such as multi-cluster warehouses and clustering keys, to partition large-volume tables.

· Engage with the Snowflake core team to discuss and advise on core advanced features that can be used by Snowflake users across the globe.

· Apply a strong understanding of and hands-on experience with various Azure services, including Key Vault, Event Hub, Cosmos DB, Data Lake, Blob Storage, Queues, AKS, and ACR.
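
The row-versioning mechanism described above can be sketched in plain Python. This is an illustrative sketch, not the production code: the real implementation lives in Snowflake stored procedures, and field names such as `client_id` and `row_version` are assumptions rather than the actual schema.

```python
def latest_versions(rows, key_fields=("client_id", "record_id"),
                    version_field="row_version"):
    """Keep only the highest-versioned row for each business key.

    Mirrors a MAX(version)-per-key dedup performed before materializing
    data for client dashboards and reports.
    """
    best = {}
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        # Retain the row only if it is newer than what we have seen so far.
        if key not in best or row[version_field] > best[key][version_field]:
            best[key] = row
    return list(best.values())
```

In Snowflake itself the same effect is typically achieved with a window function ranking rows per business key; the Python version just makes the selection rule explicit.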

Vice President

State Street Bank and Trust Company
Princeton, NJ
06.2022 - Current

IMS Middle Office Conversion Reconciliation:

IMS (Investment Management Services) Middle Office Conversion Reconciliation serves as a crucial and powerful recon tool to perform front-to-back reconciliation services to existing State Street Customers. It also serves as a single platform to perform end-to-end client conversion activities in multiple tranches to onboard a new client.

For clients migrating from legacy systems, it loads end-to-end legacy data feeds, including reference market data, cash, transactions, and valuations. It performs pre-data-quality checks for data availability, accuracy, and validation before running the reconciliation. It then reconciles Cash, Trades, Market Value, Portfolio Totals, NAV, etc., between the legacy system and the new Alpha Platform to determine whether a State Street client is onboarded with the end-to-end flow.

This recon tool is also capable of replacing legacy State Street applications with a single recon platform for all financial reconciliation needs. This will significantly reduce cost and maintenance for State Street, and users/clients will have a single system for end-to-end reconciliation: it performs pre-data-quality checks, transforms the data into a downstream-acceptable format, receives acknowledgements back from multiple downstream systems, and delivers final data analyst reports to the client per client expectations, with end-to-end SLAs.

Responsibilities:

· Create ingestion pipelines for continuous integration of multiple data feeds from conversion/business users via multiple sources, such as EFR (Enterprise Financial Reference system), the MYSS platform, MQ, and SFTP, in multiple formats (XML, JSON, CSV, Avro, and TXT) using Azure and Snowflake.

· Create a framework to generate data-load events after each successful ingestion into Snowflake target systems, publishing to Event Grid, Event Hub, the Blob connector, and Cosmos DB on the Azure platform to trigger subsequent processes/applications.

· Create transformation logic based on business requirements using Snowflake stored procedures, views, functions, and DBT, for use in report/extract generation.

· Create various pre-data-quality checks to ensure availability, accuracy, and completeness of data such as Valuations, Transactions, Cash, and Portfolios using Snowflake SQL and Azure Databricks.

· Create a framework to send transactional data for multiple asset types, such as Equity, Fixed Income, Bonds, and Futures, to RKS applications used by investment managers.

· Create logic to receive acknowledgements and responses from RKS and display them in the MYSS UI (State Street user interface) so users can take the necessary steps.

· Design and develop the recon transformation logic for Cash, Transactions and valuations.

· Create a factor preprocessing event for a contribution date for users to perform the data factor analysis.

· Lead and architect DevOps for continuous integration and deployment of multiple systems.

· Work as a subject matter expert and primary contact for major incidents in Production.
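
The pre-data-quality checks described above (availability, accuracy, completeness) follow a pattern like the sketch below. The field names and the specific check set are illustrative assumptions; the production checks run in Snowflake SQL and Azure Databricks.

```python
def pre_dq_checks(feed_rows, required_fields, expected_count=None):
    """Run basic availability/completeness checks on a feed before
    reconciliation. Returns a list of human-readable failures;
    an empty list means all checks passed.
    """
    failures = []
    # Availability: the feed must contain data at all.
    if not feed_rows:
        failures.append("feed is empty")
    # Completeness: optional control-total check on the row count.
    if expected_count is not None and len(feed_rows) != expected_count:
        failures.append(
            f"row count {len(feed_rows)} != expected {expected_count}")
    # Accuracy/validation: every required field must be populated.
    for i, row in enumerate(feed_rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            failures.append(f"row {i} missing fields: {missing}")
    return failures
```

Returning a list of failures (rather than raising on the first one) lets the full DQ report be surfaced to conversion users in one pass.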

Vice President

State Street Bank and Trust Company
Princeton, NJ
09.2020 - 05.2022

Front Office Data Control (FODC):

FODC (Front Office Data Control) is a powerful State Street proprietary tool for the Portfolio Services team to perform reviews and take multiple actions: assign portfolios to investment managers to review assets, auto-annotate, comment on individual records with user-accepted inputs, auto-match cash balances and transactions on a business key, and allow users to perform manual matches through an approval workflow that ensures data quality, security, and accuracy. It performs the reconciliation between the Investment Book of Record (IBOR) and the client's records (Cash, Transactions, and Valuations) without manual intervention.

Responsibilities:

· Configure data ingestion in Snowflake for efficient data transfer from Azure Data Lake to client-specific Snowflake databases using Snowpipe, Streams, and Tasks.

· Implement Snowflake views and functions to transform raw data into multiple data points, perform aggregations per complex business logic for each data point, and package the result as a generic code platform that serves all clients' data-manipulation needs across multiple cloud UI platforms and State Street in-house tools.

· Create and configure extracts/reports using workflow API calls that access the business views; configure SLAs and an auto-trigger mechanism to deliver files to the TLP system with multiple delivery options such as SFTP, cloud, continuous data streaming, data share to another client's Snowflake account, and Managed File Transfer.

· Configure IBOR (Investment Book of Records) extracts for the End-of-day Cash, Positions and Transactional data.

· Configure client data into the State Street cloud platform from the front-office source system (Bloomberg) for continuous integration of data across multiple regions (APAC, EMEA, and NA).

· Create a framework to perform reconciliation between the client's Bloomberg data and State Street RKS data to identify data breaks, and configure multiple tolerance checks (e.g., market value, price, and unrealized gain/loss) using Azure Databricks, Snowflake, and API calls.

· Create business logic for Cash proofing and client’s Transactional data for multiple funds for each client across all the regions.

· Create delta extracts/APIs for Cash and Stock for each client across all regions, triggered by an event, and send them to Bloomberg.
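
The tolerance-based break detection between two sources can be illustrated with a minimal Python sketch. The key structure, field names, and tolerance values are assumptions for illustration; the production framework runs on Azure Databricks and Snowflake.

```python
def reconcile(source_a, source_b, tolerances):
    """Compare records keyed by e.g. (portfolio, security) across two
    sources and flag breaks where a field differs by more than its
    configured tolerance.
    """
    breaks = []
    for key, rec_a in source_a.items():
        rec_b = source_b.get(key)
        if rec_b is None:
            # Record exists in one source only: always a break.
            breaks.append((key, "missing in second source"))
            continue
        for field, tol in tolerances.items():
            diff = abs(rec_a[field] - rec_b[field])
            if diff > tol:
                breaks.append((key, f"{field} off by {diff:.2f}"))
    return breaks
```

Per-field tolerances (market value, price, unrealized gain/loss) let immaterial pricing noise pass while genuine data breaks are surfaced for review.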

Senior Consultant

Atos Syntel Inc
Princeton, NJ
05.2016 - 08.2020

Enterprise Servicing Platform:

State Street's Enterprise Servicing Platform (ESP) is the central repository of data from different upstream applications, such as RKS (Record Keeping System), IBOR (Investment Book of Record), GTM (Global Trade Manager), Asset Control, OTC (Over the Counter), WSO (Wall Street Office), and BAA (Bank Account Interest Accruals). It gives asset owners and managers a dynamic, customizable, and scalable self-service platform for all their data needs. It can store and aggregate information from multiple sources and dynamically created hierarchies. ESP provides industry leaders with one platform that is dynamic enough to meet customized needs yet complete enough to provide end-to-end middle-office, back-office, and information delivery capabilities. ESP lets users create data marts, categories, feeds, interactive views, and extracts through a UI in which they can define data transformations, joins, and calculations. The data in the data marts can be consumed through interactive views, extracts, and IRD reports.

Responsibilities:

· Interact with Product Owner, Business Analyst and stakeholders to gather project requirements and convert them into technical stories in the JIRA board.

· Estimate the efforts required to implement the business requirements based on the complexity of the Data mart and Stored procedure logic and work on the resource allocation.

· Develop various database components, like tables, views, indexes and packages to implement the business logic.

· Create a package to identify only the impacted market valuations for each client which helps to improve the performance of Backdated valuations reports.

· Create a framework for data purging based on client retention policies to optimize storage space and extraction performance, and streamline real-time data management by automating the removal of redundant records.

· Create custom events to trigger the delivery of data file to the client after successful validation of transformation and data integrity.

· Automate the end-to-end extract generation process by creating Autosys and cron jobs (which trigger the notify events, state monitor checks, workflows, and extracts, and deliver the extracts/reports to the end user once all validations are performed).

· Optimize queries to improve view performance using Explain Plan, indexes, hints, nested loops, FORALL, BULK COLLECT, and table partitions, and resolve issues in large-scale production environments with query optimization techniques.

· Resolve MQ system failures and data validation issues in production by collaborating with multiple teams, such as L4, Production Support, Exadata, and DBA, and ensure client deliverables meet SLA requirements.

· Participate in meetings with ESP core business team and MYSS teams to understand the new functionalities implemented in ESP Application to drive towards root cause identification and resolution.

· Work with the production support team to deploy the changes in production during the release window, validate the integration changes by generating the extracts and reports post-deployment, notify the business and get the sign-off on the new changes.

· Compile daily execution metrics, analyze process improvements, and communicate project updates during internal meetings.

· Provide technical know-how, support, mentoring, and coaching to team members on ESP application knowledge and technical flows in Oracle PL/SQL, Unix, Snowflake, DBT, data modeling techniques, the IMS business overview, and other project-specific aspects.
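
Retention-driven purging of the kind described above reduces to selecting rows older than each client's retention window. A minimal sketch, assuming a simple `as_of_date` field (the production framework is implemented in Oracle with client-specific retention policies):

```python
from datetime import date, timedelta

def rows_to_purge(rows, retention_days, today):
    """Select rows whose as-of date falls outside the client's
    retention window; these are candidates for purging/archival.
    """
    cutoff = today - timedelta(days=retention_days)
    return [r for r in rows if r["as_of_date"] < cutoff]
```

Passing `today` explicitly (rather than reading the clock inside the function) keeps the selection deterministic and easy to test.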

Consultant

Syntel Inc.
Kansas City
06.2015 - 05.2016

General Ledger System Reporting/Data export:

DataExport is a reporting application in State Street Alternative Investment Solutions (AIS) that extracts raw data from multiple servers and FTPs the data files to clients in CSV, PSV, or TXT format based on their requirements. Scheduled AutoSys jobs can be created to send these files to clients on a regular basis. Clients upload these files into their internal systems for data processing.

Responsibilities:

· Interact with Business Analyst and stakeholders to capture business needs and translate them into precise technical tasks.

· Provide critical inputs for story development based on past release challenges and user feedback received during defect triage.

· Design and develop the Data Flows and control flows to perform Data cleansing.

· Configure clients and funds in development environment according to client requirements.

· Create stored procedure to implement the business logic as per the requirement.

· Create and enhance reports using Jasper Reports by understanding the business logic from the user.

· Enhance and validate report extract logic and generate various report types.

· Create ETL jobs and monitor the ETL schedule.

· Identify high-level production fixes, perform validation, provide sign-off during onsite hours, and communicate the fixes to the offshore team for additional assistance.

· Organize daily internal status calls with the offshore team to accumulate status, share project updates, clarify project queries, and track productivity; perform audit tasks such as effort estimation sheets, accumulation of daily execution metrics, skill mapping documents, cause-and-analysis documents, and ideas for process improvements.
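
The CSV/PSV/TXT extract formats mentioned above differ mainly in their delimiter, which can be sketched as below. This is a simplified stand-in for the actual extract engine, and the format keys are assumptions.

```python
import csv
import io

def render_extract(rows, fieldnames, fmt="csv"):
    """Render rows as a flat-file extract in the client's requested
    format: comma-, pipe-, or tab-separated.
    """
    delimiter = {"csv": ",", "psv": "|", "txt": "\t"}[fmt]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, delimiter=delimiter)
    writer.writeheader()        # first line lists the column names
    writer.writerows(rows)      # one line per record
    return buf.getvalue()
```

Reusing `csv.DictWriter` with a configurable delimiter keeps quoting and escaping consistent across all three output formats.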

Consultant

Princeton Financial Systems
07.2010 - 05.2015

Portfolio Account Management:

This integrated investment accounting and management system includes multi-basis accounting for investment reporting, such as local, GAAP, tax, statutory, management, and IFRS. The tool is an integrated subledger with multiple charts of accounts. It supports investment types across many financial transactions, asset classes, and managers. PAM performs Net Asset Value (NAV) calculations for single- or multi-class funds.

This tool also supports Web-based reporting with user-defined dashboards.

Investment managers can access add-on modules for complex/alternative investment accounting, data warehousing, bespoke reporting, reconciliation, compliance, and risk.

Responsibilities:

· Interpret client feedback for precise business need assessment.

· Configure portfolios, equities, and fixed incomes according to client needs.

· Perform a thorough examination of data sets from all sources, maintaining accuracy by cleaning up redundant and inaccurate data.

· Deal with multiple trade types, such as Cash, FX Forwards, Purchases and Sales, Income, and Non-Marketable data.

· Analyze different corporate actions data and process it into the production system to assess the impact on positions or market value.

· Accumulate and maintain client’s transactional data from external sources including State Street, TNR, Private Equity and Real Estate Investments.

· Load daily Market Data information such as FX Rates, Market Prices, Ratings and Corporate Actions.

· Upload data into Client Test and Production Environments.

· Match off data on a monthly basis, analyze unmatched records, and correct inaccurate records.

· Reconcile Cash, Security Positions, Market Value, and FX Forwards against legacy and current system reports, and send the reports to the client manager.

· Analyze recon differences and correct them if there are any processing issues.

· Perform the daily period close to ensure no manipulations occur after trade processing.
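
The NAV calculation PAM performs reduces, in the single-class case, to net assets divided by shares outstanding. A minimal sketch of that core formula (the real system additionally handles multi-class allocation, accruals, and multi-basis adjustments):

```python
def nav_per_share(total_assets, total_liabilities, shares_outstanding):
    """Net Asset Value per share for a single-class fund:
    NAV = (assets - liabilities) / shares outstanding.
    """
    if shares_outstanding <= 0:
        raise ValueError("shares outstanding must be positive")
    return (total_assets - total_liabilities) / shares_outstanding
```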

Consultant

Princeton Financial Systems
10.2010 - 09.2012

StatPRO:

StatPro is a PFS investment banking product that deals with financial instruments used by portfolio managers of asset management companies. It is used to calculate managers' performance returns for historical investment assets.

Responsibilities:

· Analyze the historical financial instruments of different managers for errors such as formatting errors and missing data.

· Ingest Transaction, Asset, and Holding files from historic data into StatPro for performance measurement.

· Compute investment manager performance using the StatPro tool (manager-level performance) to analyze the historical growth of the financial instruments invested in.

· Compare reported historical performance returns with the performance returns generated by StatPro to decide on the future financial actions of individual asset managers.

· Rework per the client's requirements to match off the differences between StatPro portfolio return values and historical returns.

· Determine Net Asset Value (NAV) and perform reconciliation to ensure the desired results.

· Perform Manager Cash vs. Bank Cash reconciliation to ensure the manager's cash balance matches the bank cash in the client asset data.
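
Performance returns of the kind StatPro computes are conventionally chain-linked geometrically across periods. A minimal sketch of that linking step (illustrative only; StatPro's actual methodology also covers weighting, flows, and fees):

```python
def linked_return(period_returns):
    """Geometrically link per-period returns (as decimals, e.g. 0.10
    for 10%) into a cumulative performance return.
    """
    total = 1.0
    for r in period_returns:
        total *= (1.0 + r)   # compound each period's growth factor
    return total - 1.0
```

For example, +10% followed by -5% links to (1.10 × 0.95) − 1 = 4.5%, not the 5% a naive sum would give.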

Education

Master of Computer Applications -

B.V.RAJU INSTITUTE OF COMPUTER EDUCATION
01.2008

Bachelor of Science - Electronics

NNR & CL DEGREE COLLEGE
01.2005

Skills

  • Snowflake
  • Microsoft Azure
  • Oracle
  • SQL
  • PL/SQL
  • Cloud DBT
  • AWS
  • Sybase
  • Autosys
  • GIT
  • Subversion
  • Python

Employment History

  • State Street Bank and Trust Company - Jun 2020 to Present
  • Syntel - Sep 2010 to Jun 2020

Certification

Snowflake SnowPro Certification

AWS Certified Cloud Practitioner

Oracle Database Fundamentals 11g

Neo4j completion

AWS training and certification

Timeline

Vice President

State Street Bank and Trust Company
06.2022 - Current

Vice President

State Street Bank and Trust Company
06.2022 - Current

Vice President

State Street Bank and Trust Company
09.2020 - 05.2022

Senior Consultant

Atos Syntel Inc
05.2016 - 08.2020

Consultant

Syntel Inc.
06.2015 - 05.2016

Consultant

Princeton Financial Systems
10.2010 - 09.2012

Consultant

Princeton Financial Systems
07.2010 - 05.2015

Master of Computer Applications -

B.V.RAJU INSTITUTE OF COMPUTER EDUCATION

Bachelor of Science - Electronics

NNR & CL DEGREE COLLEGE