Ph.D. Candidate | University of Melbourne
Human–Robot Interaction Researcher Designing Failure-Aware and Trustworthy Robotic Systems
I investigate robot behaviour under failure, with a focus on maintaining transparency, adaptability, and trust in human–robot interaction. My research integrates robotic systems, perception, and reasoning based on vision–language and large language models (VLMs/LLMs) to study failure detection and recovery in collaborative settings.
I examine how humans perceive and respond to robot failures, how trust evolves across repeated interactions, and how behavioral signals (e.g., gaze and interaction patterns) can inform failure detection. In parallel, I develop external reasoning frameworks that enable robots to autonomously detect failures and select appropriate recovery strategies.
My work is grounded in end-to-end system development and validated through real-robot experiments, simulation environments, and controlled user studies.
Current role
Ph.D. candidate in HRI
I am based in the School of Computing and Information Systems at the University of Melbourne, working with A/Prof. Wafa Johal and Prof. Vassilis Kostakos in the Human-Computer Interaction Group and the Human-Robot Interaction Lab.
Research question
What happens when robots fail?
I study how people interpret robot failures, how trust evolves across repeated interactions, and how robots can detect, communicate, and recover from errors during collaboration through both human behavioral signals and external reasoning systems (e.g., LLMs/VLMs).
Hands-on systems
From experiments to live demos
I have built and evaluated collaborative robot behaviours on Tiago and Furhat, combining user studies, gaze sensing, and real-time analysis in both research and public-facing demonstrations.
Research
Selected Research and Engineering Work
My work challenges the assumption that robots and AI systems are flawless. I focus on failure detection, trust calibration, and recovery strategies for human-robot collaboration.
Study 1 / 2
01
Gaze-Based Robot Failure Detection
For robots, detecting and predicting failures as early as possible is vital to prevent damage and negative user experiences. I studied user non-verbal behaviour, especially gaze patterns, to identify cues that indicate when a failure is about to occur.
- User gaze behaviour can signal the onset of a robot failure.
- Gaze patterns are related to the type of failure the robot makes.
- A random forest classifier showed strong potential for detecting failures within a few seconds after they occur.
02
Trust Dynamics Across Multiple Robot Failures
Robot failures often happen more than once during collaboration, and their effects on user trust can accumulate. I examine how repeated failures change trust, perceived robot intelligence, and the value of showing failure awareness.
- User trust is influenced by both the current failure and the failures that came before it.
- Different failure sequences with similar severity can produce different trust and intelligence judgments.
- Failure awareness helps after serious failures, but it can reduce trust when the failure is minor or barely noticeable.
Capabilities
Technical Strengths
Robotics & HRI
- Robotics systems: ROS workflows, Tiago (PAL Robotics), Furhat social robot, collaborative task design, interaction behaviours
- Human studies: experimental design, study execution, trust measurement, multimodal behavioral analysis
- Demonstration work: live showcases, public-facing robotics demos, rapid prototyping for interaction scenarios
Perception, Programming & Analysis
- Perception and data: Pupil Labs, gaze tracking and processing, annotation workflows, OpenCV, computer vision pipelines
- Programming and systems: Python, ROS, LLM- and VLM-based reasoning via OpenAI APIs, API integration
- Modeling and analysis: Unity, SolidWorks, statistical modeling (linear mixed-effects models, cumulative link mixed models)
Career
Experience
University of Melbourne
Researcher
Dec 2023 – Present · Interactive Technologies Lab (IXT)
- Tiago Robot
- Built ROS-based collaborative tasks with diverse failure conditions and conducted controlled user studies with 50+ participants on real robots.
- Developed pipelines integrating eye tracking, ROS synchronization, and real-time gaze feature extraction for human-centered failure detection.
- Designed a simulation-based failure detection framework in NVIDIA Isaac using behavior trees and VLM-based reasoning for autonomous detection and recovery.
- Furhat Robot
- Developed complex emotional expressions using facial action units, with VLM- and user-based evaluation to identify appropriate responses.
- Contributed to live robotics showcases and demos for academic and public audiences.
Tutor & Project Supervisor
Mar 2024 – Present · Teaching and supervision
- Elements of Data Processing · Tutor · Semester 2, 2025 and Semester 1, 2026
- Machine Learning · Tutor · Semester 1, 2026
- Master's Project · Supervisor · Mar 2024 – Jul 2024 · Supervised development of a web app for annotating ROSBag data.
Sharif University of Technology
Research Assistant
Sep 2021 – Sep 2023 · CEDRA
- Programmed and ran HRI experiments with Nao and Opo robots to study gaze behaviour in children and young adults.
- Built deep learning models for lip reading and facial emotion recognition from video data.
Tutor
Sep 2022 – Jun 2023 · Teaching roles
- Social Cognitive Robotics · Jan 2023 – Jun 2023
- Advanced Math 1 · Sep 2022 – Dec 2022
Academic Path
Education
University of Melbourne
Ph.D., Computing & Information Systems
Dec 2023 – Present · Thesis: Exploring and Exploiting Human Behavioural Responses to Robot Failures in Human-Robot Interaction
Supervisors: A/Prof. Wafa Johal & Prof. Vassilis Kostakos
Sharif University of Technology
MSc, Mechanical Engineering
Sep 2021 – Jun 2023 · GPA: 18.10/20 (= 3.87/4.00)
Thesis: Empirical motion-time pattern for human gaze behaviour in social situations using DNNs
Supervisors: Dr. Alireza Taheri & Prof. Ali Meghdari
University of Tehran
BSc, Mechanical Engineering
Sep 2017 – Sep 2021 · GPA: 17.45/20 (= 3.80/4.00)
Thesis: Controller design for a refrigerator using Peltier modules
Supervisor: Dr. Ehsan Hosseinian
Engagement
Demos & Public Engagement
- Innovation Week (Sep 2025): designed and delivered a Furhat robot social interaction demo with rapid behaviour scripting and multi-party interaction.
- University of Melbourne Showcase Event (Sep 2025): presented an interactive robotics pipeline combining real-time perception, behaviour control, and HRI concepts.
- Post-HRI Academic Visit (Mar 2025): demonstrated an office assistant robot on Tiago with autonomous navigation and interaction.
- CIS Doctoral Colloquium (Oct 2024): poster presentation, Gazing at Failure: Investigating Human Gaze in Response to Robot Failure in Collaborative Tasks.
- UbiComp 2025 Demo Session (Sep 2024): live demo, Robot Failures in Human-Robot Collaboration Using the Tiago Robot.
- University Open Day (Aug 2024): public demo, Autonomous Social Robotics.
Research Output
Selected Publications
- Tabatabaei, Kostakos, Johal. Oops, I Did It Again (But I Know It): Robot Failure Consistency and Awareness in Human-Robot Collaboration. ACM CHI 2026.
- Tabatabaei, Kostakos, Johal. Gazing at Failure: Investigating Human Gaze in Response to Robot Failure in Collaborative Tasks. ACM/IEEE HRI 2025.
- Tabatabaei, Kostakos, Johal. Real-Time Detection of Robot Failures Using Gaze Dynamics in Collaborative Tasks. ACM/IEEE HRI 2025.
- Zhang, Li, Tabatabaei, Johal. ROSAnnotator: A Web Application for ROSBag Data Analysis in Human-Robot Interaction. ACM/IEEE HRI 2025.
- Pan, Schömbs, Zhang, Tabatabaei, Bilal, Johal. OfficeMate: Pilot Evaluation of an Office Assistant Robot. ACM/IEEE HRI 2025.
Recognition
Awards & Honors
- Winner, HRI24 Robot Challenge, Office Assistant on Tiago (Team Melbourne) · Mar 2024
- Top 0.2% National Entrance Exam (Master's) · Aug 2021
- Top 15% of graduating class · Jun 2021
- Top 1% National Entrance Exam (Bachelor's) · Jul 2017
Languages
Languages
- English: IELTS 7.0 (L 7.0, R 7.5, W 6.5, S 6.5)
- Persian: Native
