Databricks Responsibilities:
- Designed and implemented scalable cloud architectures on AWS, integrating Databricks with various platforms such as BI tools, APIs, SMTP servers, and Autosys, ensuring seamless data workflows and automation.
- Developed and optimized data pipelines and workflows within Databricks, enabling efficient data ingestion, processing, and analytics across multiple platforms, significantly enhancing data accessibility and insights.
- Collaborated with stakeholders to facilitate onboarding and integration of Databricks solutions, providing technical guidance and support for platform connectivity, data security, and best practices in cloud and big data environments.
Responsibilities:
- Designing, implementing, and maintaining Java applications CCAR (Comprehensive Capital Analysis and Review), BUK (Bank of UK), and RRP (Recovery and Resolution Planning) using core Java, the Spring Framework, Hibernate, Spark, Hadoop, and web services.
- Developing the BOE (Bank of England) and EMF (Empirical Model Framework) applications using core Java, Spring, Hibernate (with annotation-based mapping), REST-based web services, XML APIs, and related tools.
- Leading the Controls service and a team of 4 members, ensuring team members meet their deadlines.
- Communicating with quants and users to discuss requirements and expectations.
- Training new joiners through an initial 6-week program to bring them up to speed.
- Migrating complex, multi-tier applications to AWS.
- Implementing application functionality using Spring Boot and Hibernate ORM (Object-Relational Mapping); upgrading the Spring Boot version to remediate known vulnerabilities.
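A Spring Boot version upgrade of this kind typically amounts to bumping the parent version in the Maven POM; a minimal sketch (the version number shown is illustrative, not the one actually used):

```xml
<!-- pom.xml: pin the Spring Boot parent to a patched release -->
<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <!-- illustrative version; in practice, choose the latest patch release
       that fixes the reported CVEs -->
  <version>2.7.18</version>
  <relativePath/>
</parent>
```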
- Building on-premises data pipelines with Spark Streaming, consuming the feed from the API streaming gateway's REST (Representational State Transfer) service.
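A minimal sketch of the ingestion side of such a pipeline, assuming a hypothetical gateway endpoint and a simple `key = value` record layout (both are illustrative, not the actual gateway contract); records parsed this way would then be handed to Spark Streaming:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.AbstractMap;
import java.util.Map;

// Hypothetical sketch: polling a REST streaming gateway and parsing its
// records before they are fed into a Spark Streaming job.
public class GatewayFeed {

    // Build the polling request for the gateway (the /feed path is an assumption).
    static HttpRequest feedRequest(String baseUrl) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/feed"))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    // Parse one "key = value" record into an entry; the delimiter
    // format is assumed purely for illustration.
    static Map.Entry<String, String> parseRecord(String line) {
        int sep = line.indexOf('=');
        if (sep < 0) throw new IllegalArgumentException("bad record: " + line);
        return new AbstractMap.SimpleEntry<>(
                line.substring(0, sep).trim(),
                line.substring(sep + 1).trim());
    }

    public static void main(String[] args) {
        Map.Entry<String, String> rec = parseRecord("tradeId = T-1001");
        System.out.println(rec.getKey() + " -> " + rec.getValue());
    }
}
```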
- Integrating Amazon Web Services (AWS) with the EMF and RRP application infrastructure.
- Using Amazon CloudWatch to monitor AWS services and Amazon CloudWatch Logs to monitor applications.
- Managing and upgrading CDH (Cloudera Distribution of Hadoop) to the latest version and testing all Java applications in DEV (development) and SIT (System Integration Testing) environments.
- Writing and maintaining Liquibase scripts to manage database changes.
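A Liquibase changelog of this kind is typically an XML file with one changeset per schema change; a minimal sketch, in which the changeset id, table, and column names are purely illustrative:

```xml
<!-- db.changelog.xml: one changeset per schema change, with a rollback -->
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.3.xsd">

    <changeSet id="20230115-01" author="dev">
        <addColumn tableName="controls_run">
            <column name="review_status" type="varchar(32)"/>
        </addColumn>
        <rollback>
            <dropColumn tableName="controls_run" columnName="review_status"/>
        </rollback>
    </changeSet>
</databaseChangeLog>
```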
- Participating in software architecture, detailed design, coding, testing, and creation of functional specifications for the application.
- Analyzing Java application code quality using SonarQube.
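SonarQube analysis of a Java project is usually driven by a small scanner configuration; a minimal sketch, where the project key, paths, and server URL are illustrative assumptions:

```properties
# sonar-project.properties: minimal scanner config for a Maven-built Java app
sonar.projectKey=emf-controls
sonar.sources=src/main/java
sonar.java.binaries=target/classes
sonar.host.url=https://sonarqube.example.com
```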
- Participating in code reviews with the team and design reviews with the architects.