1. Problem Statement

Effective communication between clinicians and patients is essential for safe, high-quality healthcare. However, in intensive care units (ICUs), many patients cannot speak due to intubation, sedation, neurological conditions, or trauma. This creates a critical communication gap that can lead to:

  • Misinterpretation of symptoms
  • Delayed or incorrect diagnoses
  • Increased patient distress
  • Poor clinical outcomes and longer hospital stays

Existing communication methods, such as gestures, yes/no systems, or manual pain charts, are often unreliable, cognitively demanding, and inadequate in high-stress clinical environments.

A particularly important challenge is pain localization: clinicians often cannot accurately determine where a nonverbal patient is experiencing pain.

2. Project Purpose

This project aims to develop and evaluate an innovative system called PainPoint, which enables nonverbal patients to communicate using eye gaze.

The core idea is that eye movements are a natural, direct expression of attention and intent. By tracking where a patient looks, the system can infer that intent and enable communication without speech or prior training.

Key Concept

Patients will interact with a 3D virtual body model displayed on a screen. By simply looking at a body part:

  • The system automatically rotates, zooms, and centres on that region
  • The clinician can clearly see the selected pain location
  • Communication becomes intuitive and occurs in real time

This approach leverages natural eye–brain coordination, allowing even patients who are groggy, stressed, or recovering from anaesthesia to communicate effectively without learning complex systems.
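The gaze-to-selection step described above is commonly implemented with a dwell-time filter: a body region counts as selected only once the patient's gaze has rested on it for a fixed interval, so stray glances do not trigger false selections. The sketch below illustrates the idea; the class, region names, and one-second threshold are illustrative assumptions, not PainPoint's actual implementation.

```python
# Illustrative sketch: selecting a body region by gaze dwell time.
# All names and thresholds are hypothetical, not the PainPoint implementation.

class DwellSelector:
    """Confirms a selection when gaze rests on one region long enough."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self._region = None   # region currently under gaze
        self._start = None    # timestamp when gaze entered that region

    def update(self, timestamp, region):
        """Feed one gaze sample; return the region once dwell completes, else None."""
        if region != self._region:
            # Gaze moved to a new region: restart the dwell timer.
            self._region = region
            self._start = timestamp
            return None
        if region is not None and timestamp - self._start >= self.dwell_seconds:
            selected = region
            self._region = None  # reset so the same region can be reselected later
            self._start = None
            return selected
        return None


selector = DwellSelector(dwell_seconds=1.0)
# Simulated 30 Hz gaze samples resting on the "left_shoulder" region for 1.5 s.
result = None
for i in range(45):
    result = selector.update(i / 30.0, "left_shoulder") or result
print(result)  # prints: left_shoulder
```

Requiring a sustained fixation rather than a single glance is what makes the interaction robust for sedated or fatigued patients, at the cost of a small, tunable selection delay.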

3. Project Objectives

The project has four main objectives:

Objective 1: System Development

Design and implement a clinically ready prototype that includes:

  • Eye-tracking integration
  • Interactive 3D anatomical body model
  • Gaze-controlled camera navigation
  • Clinician interface for monitoring responses
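The gaze-controlled camera navigation component can be sketched as a simple per-frame easing loop: once a region is selected, the virtual camera moves a fixed fraction of the remaining distance toward that region each frame. The function name and easing factor below are assumptions for illustration, not the project's actual design.

```python
# Illustrative sketch of gaze-controlled camera centring: each frame, the
# virtual camera eases toward the target set by the last gaze selection.
# Names and the smoothing factor are assumptions for illustration only.

def step_camera(camera, target, smoothing=0.2):
    """Move each camera coordinate a fraction of the way toward the target.

    Exponential easing keeps motion smooth, so the view does not jump
    abruptly when the patient's gaze lands on a new body part.
    """
    return tuple(c + smoothing * (t - c) for c, t in zip(camera, target))


camera = (0.0, 0.0, 5.0)   # current camera position
target = (1.0, 2.0, 2.0)   # centre of the selected body region
for _ in range(30):        # ~0.5 s of frames at 60 fps
    camera = step_camera(camera, target)
# After 30 steps the remaining offset is about 0.1% of the original distance.
```

The same easing pattern applies to rotation and zoom; a real engine would typically use its built-in interpolation utilities rather than hand-rolled tuples.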

Objective 2: Clinical Feasibility Evaluation

Test the system in a real ICU environment to assess:

  • Whether patients can use it intuitively without training
  • Accuracy of pain localization
  • Ease of use for clinicians

Objective 3: Analysis and Insights

Identify:

  • Technical limitations (e.g., eye tracking accuracy, head movement)
  • Clinical workflow challenges
  • Opportunities for extending the system to broader communication tasks

Objective 4: System Refinement

Improve usability, robustness, and scalability toward a commercially ready product and integration into a broader platform (EyeVoice).

School

Computer Science and Engineering

Research Area

Assistive technology and accessible computing | Medical/health informatics | Eye-tracking systems | User-centred design | UX in critical environments

Suitable for recognition of Work Integrated Learning (industrial training)?

Yes

CSE VR lab

This project will:

  1. Demonstrate the feasibility of gaze-based medical communication
  2. Improve care for nonverbal and vulnerable patients
  3. Enable faster and more accurate clinical decision-making
  4. Lay the foundation for a broader communication platform (EyeVoice)
  5. Contribute to research in human-computer interaction, healthcare technology, and assistive systems
Lecturer

Ali Darejeh

Eye Gaze Company