
M.S. Human-Robot Interaction candidate with expertise in swarm robotics, embodied interaction design, and real-time embedded medical devices. Led development and evaluation of a 9-robot swarm system and a pressure-feedback G-tube training device showcased at NECHFES 2025. Proficient in ROS2 and TensorFlow. Open to roles in HRI, medical robotics, or perception engineering starting June 2026.
Embodied Swarm HRI with 5 Sphero Bolts, Co-Lead (Interaction Design), Tufts University, Fall 2024
Pressure-Feedback G-Tube Feeding Training Device, Hardware & Embedded Lead, Tufts Human Factors (Prof. Chaiwoo Lee), Spring 2025
G-EYE: A Handheld Retinal Imaging Device, Research Assistant, G. Narayanamma Institute, 08/23 – 10/23
Robotics & HRI: Hierarchical State Machines, embodied communication, user studies (t-tests, chi-square), Monte Carlo Localization, SwarmPRM, overhead vision tracking, ROS2/ROS, Gazebo, MoveIt, Navigation2, URDF
Embedded & Hardware: ESP32 firmware (C++), force/pressure sensors, real-time LED/OLED feedback, MQTT/RabbitMQ microservices, 3D printing, laser cutting
ML & Perception: PyTorch, OpenCV, Hugging Face, Whisper ASR
Programming & Tools: Python (expert), C/C++, Node.js, Docker, Git, Blender, LaTeX