Aspiring Robotics Engineer with a passion for autonomous systems, SLAM, path planning, and computer vision. I have worked on real-world projects involving UAV systems and cutting-edge technologies such as online visual tracking and real-time sensor fusion. With experience in Python, C++, and ROS, I enjoy tackling complex challenges in robotics, from multi-robot coordination to advanced tracking solutions. I’m excited to bring my skills, creativity, and fresh perspective to help your team innovate and succeed.
As a Graduate Research Assistant in the ADAMS Lab at the University at Buffalo, I worked on advanced robotics and autonomous systems projects, focusing on UAV (drone) development, multi-robot coordination, and real-time sensor integration. A large part of my work involved using hardware platforms like the Nvidia Jetson Nano, Raspberry Pi, and Intel RealSense Depth Camera D435i to solve challenging robotics problems such as drone tracking, path planning, and real-time data processing.
UAV Tracking and Real-Time Object Detection
One of the major projects I worked on was an autonomous UAV tracking system that used deep reinforcement learning (RL) for control and YOLOv8 for real-time object detection. The system was equipped with an Intel RealSense D435i depth camera and RGB cameras, allowing the drone to accurately perceive and track targets. We used the Holybro X500 V2 drone kit as the platform, with the Pixhawk Cube Orange Plus flight controller running ArduPilot firmware. For onboard data processing and control, I integrated an Nvidia Jetson Nano, which allowed the UAV to dynamically adjust its flight path and maintain precise tracking, even in complex environments. The system was thoroughly tested in both simulation and real-world scenarios, resulting in significant improvements in tracking accuracy, real-time response, and overall autonomy.
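To give a flavor of the perception side, here is a minimal sketch of a YOLOv8 detection loop over RealSense D435i frames, assuming the ultralytics and pyrealsense2 Python packages. The weights, stream settings, and how the result feeds the controller are illustrative placeholders, not the exact configuration flown on the drone.

```python
# Minimal sketch: YOLOv8 detections on Intel RealSense D435i color/depth frames.
# Assumes `pip install ultralytics pyrealsense2`; weights and stream settings are illustrative.
import numpy as np
import pyrealsense2 as rs
from ultralytics import YOLO

model = YOLO("yolov8n.pt")          # placeholder weights, not the trained tracking model

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        color = frames.get_color_frame()
        depth = frames.get_depth_frame()
        if not color or not depth:
            continue
        frame = np.asanyarray(color.get_data())
        # Run detection; each box gives pixel coordinates of a candidate target.
        for result in model(frame, verbose=False):
            for box in result.boxes:
                x1, y1, x2, y2 = box.xyxy[0].tolist()
                cx, cy = int((x1 + x2) / 2), int((y1 + y2) / 2)
                # Depth at the box center gives the range to the target in meters.
                distance_m = depth.get_distance(cx, cy)
                # The (cx, cy, distance_m) tuple would feed the RL tracking controller.
finally:
    pipeline.stop()
```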
Multi-UAV Coordination and Coverage Path Planning
I also worked on a project coordinating multiple UAVs for large-scale area coverage, particularly in disaster response and environmental monitoring. We developed a Scalable Coverage Path Planning (SCoPP) framework to cover non-convex areas efficiently with a fleet of drones. Alongside SCoPP, I implemented a Multi-Robot Task Allocation (MRTA) algorithm, deployed on custom-built drones running ArduPilot firmware as well as on Parrot Anafi AI and Anafi Thermal drones. For control and communication, the custom drones carried Raspberry Pi 5 companion computers communicating over Pymavlink, while the Parrot drones were controlled through the Olympe Python package. The system was rigorously tested in both simulation and real-world environments, including the SOAR facility at the University at Buffalo, where we successfully applied it to scenarios such as post-flood assessment and large-area monitoring.
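For the custom ArduPilot drones, commands from the companion computer went over MAVLink. The sketch below shows a typical Pymavlink connection, arm, and takeoff sequence; the connection string, flight mode, and altitude are illustrative rather than the values used in the field.

```python
# Minimal Pymavlink sketch: connect to an ArduPilot vehicle, arm, and command a takeoff.
# Connection string, mode, and altitude are illustrative placeholders.
from pymavlink import mavutil

# e.g. a UDP link from SITL, or a serial link from the Raspberry Pi companion computer
master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()
print(f"Heartbeat from system {master.target_system}, component {master.target_component}")

# Switch to GUIDED so the companion computer can send position commands.
master.set_mode("GUIDED")

# Arm and wait until the autopilot reports the motors armed.
master.arducopter_arm()
master.motors_armed_wait()

# Command a takeoff to 10 m (the last parameter of MAV_CMD_NAV_TAKEOFF is altitude).
master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_NAV_TAKEOFF,
    0,              # confirmation
    0, 0, 0, 0,     # unused parameters
    0, 0, 10,       # latitude, longitude, altitude (m)
)
```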
Hardware Integration and Field Deployments
A significant part of my work involved implementing and integrating hardware systems to support our research. I used Nvidia Jetson Nano and Raspberry Pi boards for onboard computation and control of custom drones, and we equipped these drones with Intel RealSense D435i depth cameras and RGB cameras for real-time visual processing. In addition, I worked with TurtleBot platforms (Burger and Waffle) for ground-based experiments, where we developed systems for real-time sensor fusion and autonomous navigation. These projects demonstrated the importance of high-performance computing and multi-sensor data fusion for reliable UAV operations in real-world environments.
Mentoring Undergraduate Students
In addition to my technical contributions, I mentored several undergraduate students in the lab, helping them build and customize their own drones, guiding them through firmware installation and configuration, and leading workshops on UAV setup and control systems.
Simulation and Real-World Testing
To bridge the gap between simulation and real-world testing, I contributed to the creation of a digital twin of the SOAR facility using Unreal Engine and AirSim. This allowed us to rigorously test our UAV coordination algorithms in a simulated environment before deploying them in real-world conditions. The transition to real-world testing was seamless, with field tests at the SOAR facility demonstrating the effectiveness of our algorithms. The UAV fleet successfully completed various missions, validating both our hardware and software designs in dynamic environments.
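As a rough sketch of how missions were exercised against the digital twin before flying, the snippet below flies a placeholder waypoint pattern through the standard AirSim Python client. The waypoints and speed are illustrative and are not the SCoPP output used in the actual tests.

```python
# Minimal AirSim sketch: fly a placeholder coverage pattern in the simulated environment.
# Assumes the `airsim` Python package and a running Unreal/AirSim instance.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()

# Illustrative lawnmower-style waypoints in the NED frame (z is negative up), not SCoPP output.
waypoints = [(0, 0, -10), (40, 0, -10), (40, 10, -10), (0, 10, -10)]
for x, y, z in waypoints:
    client.moveToPositionAsync(x, y, z, velocity=5).join()

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```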
NLP-Powered Recommendation System Development:
As a Software Developer at TCS, I designed and implemented an advanced recommendation system powered by natural language processing (NLP). This system automated the support ticket resolution process by analyzing three years of historical ticket data, including descriptions, summaries, and resolutions. It significantly improved the triage and resolution workflow by accelerating the process of identifying and addressing recurring issues.
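The production pipeline is proprietary, but the core idea can be sketched as a TF-IDF similarity search over historical tickets using scikit-learn. The file name and column names below are hypothetical stand-ins for the real ticket schema.

```python
# Hedged sketch of the core idea: recommend past resolutions for a new ticket
# by TF-IDF similarity over historical ticket text. File and column names are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical historical dump: description, summary, and resolution per ticket.
history = pd.read_csv("tickets_3yr.csv")
corpus = history["summary"].fillna("") + " " + history["description"].fillna("")

vectorizer = TfidfVectorizer(stop_words="english", max_features=50000)
ticket_matrix = vectorizer.fit_transform(corpus)

def recommend_resolutions(new_ticket_text: str, top_k: int = 3) -> pd.DataFrame:
    """Return the resolutions of the top_k most similar historical tickets."""
    query = vectorizer.transform([new_ticket_text])
    scores = cosine_similarity(query, ticket_matrix).ravel()
    best = scores.argsort()[::-1][:top_k]
    return history.iloc[best][["summary", "resolution"]].assign(similarity=scores[best])

# Example: suggest fixes for an incoming ticket.
print(recommend_resolutions("User cannot authenticate to the reporting portal"))
```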
Enhancing Workflow Efficiency:
The NLP recommendation system I developed played a critical role in boosting the efficiency of issue resolution. By automating the ticket handling process, it minimized human intervention and reduced the time required to resolve frequently encountered problems, thus improving overall support operations.
SSO Management for Enterprise Applications:
In my role as a Technical Support Engineer, I was responsible for managing Single Sign-On (SSO) integrations, utilizing SAML2 and OAuth2 protocols across more than 600 enterprise-wide applications. I provided continuous support for these systems, ensuring seamless authentication processes for users.
Consistently High Ticket Resolution Rate:
I managed and resolved an average of 60 support tickets each month, focusing primarily on authentication and access-related issues. My ability to quickly troubleshoot and resolve complex technical problems helped maintain the reliability and security of enterprise applications.
Recognition for Outstanding Performance:
For my consistent contributions, I was honored as Employee of the Month on three separate occasions. This recognition highlighted my commitment to delivering top-tier technical support, ensuring smooth operations, and meeting high standards of customer satisfaction.
Scalable Coverage Path Planning (SCoPP) for Multi-UAV Coordination
Developed the Scalable Coverage Path Planning (SCoPP) framework for coordinating multiple UAVs in large-scale disaster response and environmental monitoring tasks. I integrated Multi-Robot Task Allocation (MRTA) algorithms into custom-built drones and Parrot Anafi AI and Anafi Thermal UAVs, using Olympe and Pymavlink for control. The system was tested in simulation and real-world environments, including the SOAR facility at the University at Buffalo, where it successfully managed coverage of non-convex areas for disaster relief.
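On the Parrot side, mission commands went through Olympe. The sketch below shows the kind of connect, take off, move, and land sequence involved; the IP address is the Sphinx simulator default (a physical Anafi is typically 192.168.42.1), and the motion values are illustrative.

```python
# Minimal Olympe sketch: connect to a Parrot Anafi, take off, fly a short leg, and land.
# The IP below is the Sphinx simulator default; values are illustrative placeholders.
import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, moveBy, Landing
from olympe.messages.ardrone3.PilotingState import FlyingStateChanged

drone = olympe.Drone("10.202.0.1")
drone.connect()

# Take off and wait until the drone reports it is hovering.
drone(TakeOff() >> FlyingStateChanged(state="hovering", _timeout=10)).wait()

# Fly 5 m forward (dx, dy, dz, dpsi in the body frame), then land.
drone(moveBy(5, 0, 0, 0) >> FlyingStateChanged(state="hovering", _timeout=10)).wait()
drone(Landing()).wait()

drone.disconnect()
```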
Autonomous UAV Tracking and Pursuit using Reinforcement Learning
Designed a robust UAV tracking system that combines YOLOv8 for high-precision object detection with reinforcement learning for navigation and control. The system uses a state-space representation of the target to forecast its future states, improving the tracking of dynamic aerial objects. The reward-based controller was trained and tested in AirSim simulation environments, achieving higher tracking accuracy and making the approach well suited to applications such as aerial surveillance.
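As a simplified illustration of the forecasting idea, the sketch below propagates a target state forward under a constant-velocity model in numpy. This is an illustrative stand-in; the representation actually learned in the project may differ.

```python
# Simplified sketch of state-space target forecasting with a constant-velocity model.
# State x = [px, py, vx, vy]; this is an illustrative stand-in, not the trained model.
import numpy as np

dt = 0.1  # control period in seconds (illustrative)

# Constant-velocity transition matrix: position advances by velocity * dt.
F = np.array([
    [1, 0, dt, 0],
    [0, 1, 0, dt],
    [0, 0, 1,  0],
    [0, 0, 0,  1],
], dtype=float)

def forecast(state: np.ndarray, steps: int) -> np.ndarray:
    """Propagate the target state `steps` timesteps ahead."""
    return np.linalg.matrix_power(F, steps) @ state

# Example: target at (2 m, 1 m) moving at (0.5, -0.2) m/s, forecast 1 s (10 steps) ahead.
target = np.array([2.0, 1.0, 0.5, -0.2])
predicted = forecast(target, 10)
# The predicted position can be fed to the RL policy as part of its observation.
print(predicted[:2])
```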
Concurrent Design of Unmanned Aerial Vehicles for Real-Time Drone Tracking
This project focused on optimizing UAV tracking systems by integrating YOLOv2 for object detection with MOSSE correlation filtering to enhance tracking accuracy and responsiveness. Using depth cameras, the system was able to maintain precise positioning relative to its target. I employed MATLAB and simulation tools for system modeling, applying concurrent design principles for both software and hardware optimization. This research has potential real-world applications in surveillance, search and rescue, and geographic mapping.
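OpenCV's contrib module ships a MOSSE correlation tracker that captures the filtering side of this design. The sketch below initializes it from a detector box and updates it per frame; depending on the OpenCV build the constructor may live under cv2 rather than cv2.legacy, and the video path and initial box are placeholders rather than actual detector output.

```python
# Hedged sketch: MOSSE correlation filtering on top of a detector's initial bounding box.
# Requires opencv-contrib-python; the constructor location varies by OpenCV version.
import cv2

cap = cv2.VideoCapture("flight_clip.mp4")   # placeholder video source
ok, frame = cap.read()

# In the project the initial box came from YOLOv2; here it is a hard-coded placeholder.
init_box = (300, 200, 80, 60)               # (x, y, w, h)

tracker = cv2.legacy.TrackerMOSSE_create()  # cv2.TrackerMOSSE_create() on older builds
tracker.init(frame, init_box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    success, box = tracker.update(frame)
    if success:
        x, y, w, h = map(int, box)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("MOSSE tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```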
Imitation Learning for Autonomous UAV Navigation and Last Mile Delivery
Developed an advanced navigation framework for UAVs using inverse reinforcement learning (IRL) to optimize routes for last-mile delivery. The system improves real-time decision-making and dynamic adaptability in urban environments, enhancing delivery accuracy and safety. By employing machine learning techniques, the UAV was able to autonomously adjust its behavior based on its environment, making it highly efficient for autonomous deliveries.
Monocular Vision-Based Object Detection and Depth Estimation for Autonomous Vehicles
Developed a system using YOLOv5 for object detection and depth estimation in monocular vision settings. This project focused on improving the perception abilities of autonomous vehicles in dynamic environments. The system enhanced the detection accuracy and depth estimation, enabling the vehicle to navigate more effectively in real-world scenarios.
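A sketch of how detection and monocular depth can be combined per frame is shown below, assuming YOLOv5 and the MiDaS depth model loaded via torch.hub. MiDaS produces relative (inverse) depth rather than metric depth, and the image path is a placeholder.

```python
# Hedged sketch: YOLOv5 detections combined with MiDaS relative depth on a single image.
# Both models load via torch.hub; the image path is a placeholder.
import cv2
import torch

detector = torch.hub.load("ultralytics/yolov5", "yolov5s")
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

img_bgr = cv2.imread("road_scene.jpg")                 # placeholder image
img_rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)

# Relative inverse depth map (larger values are closer for MiDaS output).
with torch.no_grad():
    depth = midas(transform(img_rgb))
    depth = torch.nn.functional.interpolate(
        depth.unsqueeze(1), size=img_rgb.shape[:2], mode="bicubic", align_corners=False
    ).squeeze().cpu().numpy()

# Attach a relative depth estimate (mean over the box) to each detection.
detections = detector(img_rgb).xyxy[0]                 # columns: x1, y1, x2, y2, conf, class
for x1, y1, x2, y2, conf, cls in detections.tolist():
    box_depth = depth[int(y1):int(y2), int(x1):int(x2)]
    if box_depth.size:
        print(f"class {int(cls)} conf {conf:.2f} relative depth {float(box_depth.mean()):.1f}")
```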
Stereo Visual SLAM for Autonomous Robots
Implemented a Stereo Visual SLAM system using ORB-SLAM2 for enhanced localization and mapping in unknown environments. This system was deployed on TurtleBot platforms and custom drones, significantly improving autonomous navigation through robust mapping and environmental understanding.
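ORB-SLAM2 itself is a C++ system; the sketch below only illustrates the stereo ORB feature matching and disparity-based depth step at its core using plain OpenCV, assuming rectified stereo images. The image paths, focal length, and baseline are placeholders rather than the calibration of the TurtleBot or drone rigs.

```python
# Illustrative sketch of the stereo ORB matching step underlying stereo visual SLAM.
# OpenCV only, not ORB-SLAM2; assumes rectified images, and all parameters are placeholders.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp_l, des_l = orb.detectAndCompute(left, None)
kp_r, des_r = orb.detectAndCompute(right, None)

# Hamming-distance matching with Lowe's ratio test to keep distinctive matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
good = [m for m, n in matcher.knnMatch(des_l, des_r, k=2) if m.distance < 0.75 * n.distance]

# Placeholder stereo rig parameters: focal length (px) and baseline (m).
fx, baseline = 700.0, 0.12
for m in good[:10]:
    disparity = kp_l[m.queryIdx].pt[0] - kp_r[m.trainIdx].pt[0]
    if disparity > 1.0:
        depth = fx * baseline / disparity   # triangulated depth for this feature
        print(f"feature depth ~ {depth:.2f} m")
```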
Advanced Path Planning Algorithms: A*, Dijkstra, RRT, RRT*
Implemented a variety of path planning algorithms, including A*, Dijkstra, RRT, and RRT* for efficient autonomous navigation. These algorithms were integrated into custom-built UAVs and ground robots to optimize their route planning and ensure collision-free navigation in complex and dynamic environments. The algorithms were tested across multiple scenarios, proving effective for both indoor and outdoor robotic applications.
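As a compact illustration of the shared structure of these planners, here is a grid-based A* sketch in Python. The toy occupancy grid and start/goal cells are placeholders; the on-robot versions operated on real maps, and replacing the heuristic with zero recovers Dijkstra, while RRT/RRT* sample the space instead of searching a grid.

```python
# Compact A* on a toy occupancy grid (0 = free, 1 = obstacle).
import heapq

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance
    open_set = [(heuristic(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:
            continue
        came_from[current] = parent
        if current == goal:
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = current[0] + dr, current[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + heuristic((nr, nc), goal), ng, (nr, nc), current))
    return None

# Placeholder map and query.
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```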
Volleyball
Cricket
Fitness
Sketching