Financial-services professional bringing valuable experience in client relationship management and financial analysis. Recognized for a collaborative approach and reliability in dynamic environments. Known for leveraging analytical skills and attention to detail to drive results and support team objectives.
Overview
16 years of professional experience
1 Certification
Work History
Java Developer (Training)
Newgen Software Technologies
Senior Software Engineer
Fidelity Investments
02.2022 - 03.2024
Acquired deep insight into securities, plans, and rules related to model performance across different development assignments.
Gathered requirements from the business and translated them into technical solutions.
Gave demos to business stakeholders to walk them through the developed features of each task.
Built strong knowledge of reporting tools such as Tableau, Power BI, and OBIEE to generate reports against business requirements.
Wrote SQL scripts and analyzed data to gather correct datasets for the business.
Documented all tasks assigned through JIRA.
Built an automation framework in Python that web-scrapes data from the EDGAR system, performs ETL, and sends periodic email notifications for new reports.
Integrated RESTful APIs with the EDGAR application using Python.
Used RESTful APIs to connect to and retrieve files from S3.
Applied advanced working SQL knowledge and experience with relational databases, including query authoring across a variety of database platforms.
Defined epics, user stories, and acceptance criteria; facilitated story-refinement sessions; and demoed sprint deliverables.
Performed end-to-end business acceptance testing and ensured traceability of test cases.
Performed end-to-end deployments using CI/CD pipelines.
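The new-report notification step of the EDGAR automation above can be sketched as follows. This is a minimal illustration, assuming filings have already been scraped into dicts keyed by accession number; the function and field names are hypothetical, not the production framework's API.

```python
def find_new_filings(filings, seen_accessions):
    # Keep only filings whose accession number has not been seen before.
    return [f for f in filings if f["accession"] not in seen_accessions]

def build_notification(new_filings):
    # Compose a plain-text email body listing each new report.
    lines = [f"{f['form']}  {f['company']}  ({f['accession']})"
             for f in new_filings]
    return "New EDGAR reports:\n" + "\n".join(lines)

# Illustrative sample data (not real filings).
filings = [
    {"accession": "0000000000-24-000001", "form": "10-K", "company": "ACME"},
    {"accession": "0000000000-24-000002", "form": "8-K", "company": "ACME"},
]
new = find_new_filings(filings, seen_accessions={"0000000000-24-000001"})
print(build_notification(new))
```

Run on a schedule, a step like this would diff each scrape against the accession numbers already processed and email only the delta.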
System Analyst
ReacHire (Client: Fidelity Investments)
09.2021 - 01.2022
Facilitated the translation of complex data into actionable insights, with a commitment to continuous improvement, optimized data delivery, and solutions that scale and adapt to evolving business needs.
Translated business requirements into technical solutions; worked with developers to clarify requirements and the project life cycle.
Applied advanced working SQL knowledge and experience with relational databases, including query authoring across a variety of database platforms.
Used Snowflake to retrieve data and build reports.
Integrated APIs with the system; scheduled AutoSys jobs and monitored their performance.
Gathered requirements from the business and gave demos of the developed features.
Created user manuals and training materials to decrease onboarding time and provide efficient support for the end-user base.
Troubleshot and debugged application issues in a timely manner, resulting in improved performance metrics.
Data Engineer
Endurance International Group
03.2015 - 03.2017
Endurance acquired many small companies and, as a result, inherited different billing and CRM systems, each with its own proprietary data format and billing plans. DMP was the centralized platform used to unify company-wide billing practices. Feeds from the different billing systems were collected into DMP after proper ETL processing. This data was then queried using Hive according to different business/marketing requirements to build daily CSV reports, e.g., customer retention, promotion, and upgrade reports.
Responsibilities:
Gathered and processed raw data at scale, including writing scripts, web scraping, calling APIs, and authoring HQL/SQL queries.
Designed and developed code, scripts, and data pipelines leveraging structured and unstructured data integrated from multiple sources (streaming and batch) via CSV/XLS files.
Performed software installation and configuration.
Participated in requirements and design workshops for enterprise dashboard creation.
Built a data platform capable of supporting data visualization using BIRT (Eclipse).
Processed unstructured data, particularly log files, into a form suitable for analysis, then performed the analysis.
Developed analytics-ready datasets for collaboration with data scientists.
Imported Hive tables into Impala to generate reports with Tableau.
Used Oozie to automate end-to-end data pipelines and Oozie coordinators to schedule workflows.
Enabled speedy reviews and first-mover advantage by defining Oozie job flows to automate data loading into the Hadoop Distributed File System (HDFS).
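The DMP unification described above can be sketched in miniature: normalize feeds from acquired billing systems with different proprietary column names into one schema, then emit a daily CSV report. The column names and mappings here are illustrative assumptions, not the actual DMP schema.

```python
import csv
import io

def normalize_feed(rows, mapping):
    # Rename each source system's proprietary columns to the unified schema.
    # mapping: {unified_column: source_column}
    return [{unified: row[src] for unified, src in mapping.items()}
            for row in rows]

# Two acquired systems naming the same fields differently (sample data).
feed_a = [{"cust": "C1", "plan_code": "PRO"}]
feed_b = [{"customer_id": "C2", "billing_plan": "BASIC"}]

unified = (normalize_feed(feed_a, {"customer": "cust", "plan": "plan_code"})
           + normalize_feed(feed_b, {"customer": "customer_id",
                                     "plan": "billing_plan"}))

# Write the daily CSV report consumed downstream.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["customer", "plan"])
writer.writeheader()
writer.writerows(unified)
report = buf.getvalue()
```

In the real pipeline the unified rows landed in HDFS and were queried with Hive; the per-source mapping dict is the piece that absorbed each acquisition's proprietary format.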
Hadoop Developer
Loan-IQ Misys
05.2011 - 08.2012
Smalltalk and Java Developer
Drapsa Technologies
01.2008 - 08.2009
Education
Master's Degree - Computer Science
Maharishi Dayanand University
01.2007
PGDCA - Computer Applications
M.J.P Rohilkhand University
01.2005
Bachelor of Science
M.J.P Rohilkhand University
01.2003
Skills
Reporting – OBIEE, Tableau, and Power BI for report generation
Languages – Python, Java, Bash/shell scripting, Node.js, Angular, RESTful APIs
Cloud – AWS (EKS, EC2, S3, RDS), Datadog
Deployment – Jenkins, CI/CD pipelines
Databases – SQL, Hive, and Oracle
Methodologies – Agile; familiar with the full SDLC, from requirements analysis and system study through design, testing, debugging, documentation, and implementation