
Envisioning Anaesthesia in 2032

An interface proposal that reduces the attention required for anaesthesiologists to run and supervise the system so that more focus can be directed towards the patient and communication with colleagues. 


 


Project details

Duration: 10 weeks, fall 2017 at UID

Team: Selvi Olgac (SE), Toby Whelan (GB)

Corp. partner: Maquet, Getinge Group

Type: Human-machine interaction

My contribution

Research, co-creation workshop, microinteractions, wireframing, filming, visual effects, prototyping, video prototyping.

Challenge

Helping Anaesthesia nurses to explore, learn, and fail during surgery.

Concept

Sandbox – a handheld device that enables nurses to simulate drug adjustments and predict individual patient responses.


 
 

Challenge


 

Anaesthesia nurses don't know their Anaesthesia machines as well as they could. They are not using the machine's full potential because the sensitive operating theatre environment leaves no room to learn and fail with the machine in practice. This project explores how nurses could experiment and deepen their knowledge of how the Anaesthesia machine affects the patient.

 

An Anaesthesia nurse monitoring a patient.

 
 
 
 

Concept


 

With Sandbox, part of tomorrow's Anaesthesia machine, nurses can see a prediction of the patient’s response to drug parameter adjustments before administering any changes. Through this system, the nurse can explore, fail and learn without risk of harming the patient.

 

Sandbox

When Sandbox is passive: the user can press and hold the black screen to activate it.

When Sandbox is active: 1/ the user selects a drug parameter (blue, yellow, cyan), 2/ turns the outer metal ring manually, 3/ sees the predictions.

When the user executes adjustments with Sandbox: the user must place four fingers on the indicated spots to carry out the execution.

When the user gives feedback to the Sandbox system: the user can provide additional context to Sandbox's predictions, making them more accurate.
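
To make this flow concrete, below is a minimal, hypothetical sketch of the Sandbox's four states as a TypeScript state machine. It is not part of the actual prototype, which was a visual and video mock-up; all type names, thresholds, and the simulate() stand-in are illustrative assumptions.

// Hypothetical model of the Sandbox interaction flow; names and values are illustrative only.
type SandboxState = "passive" | "active" | "executing" | "feedback";
type DrugParameter = "blue" | "yellow" | "cyan";

class Sandbox {
  private state: SandboxState = "passive";
  private selected: DrugParameter | null = null;

  // Press & hold the black screen to wake the device (hold threshold assumed).
  pressAndHold(durationMs: number): void {
    if (this.state === "passive" && durationMs >= 800) this.state = "active";
  }

  // Select a drug parameter, then turn the outer metal ring to preview predictions.
  selectParameter(p: DrugParameter): void {
    if (this.state === "active") this.selected = p;
  }

  turnOuterRing(delta: number): Record<string, number> | null {
    if (this.state !== "active" || this.selected === null) return null;
    return simulate(this.selected, delta); // simulated readings only, never live changes
  }

  // Four fingers on the indicated spots are required before anything is executed.
  beginExecution(fingerCount: number): void {
    if (this.state === "active" && fingerCount === 4) this.state = "executing";
  }

  // Afterwards the nurse can feed observations back to refine future predictions.
  submitFeedback(observation: string): void {
    if (this.state === "executing") this.state = "feedback";
    // 'observation' would be passed on to the prediction system here.
  }
}

// Stand-in for the prediction model the concept assumes.
function simulate(p: DrugParameter, delta: number): Record<string, number> {
  const sensitivity = p === "blue" ? 2 : 1; // illustrative only
  return { heartRate: 70 - sensitivity * delta, meanArterialPressure: 90 - delta };
}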

 
 
 
 

Future vision


 
 

In our future vision, Anaesthesia nurses will not lose their manual skills. Instead, they will combine their cognitive abilities with the machine's, working in tandem to deliver better patient care: acting rather than reacting to situations, and staying closer and more attentive to the patient while remaining communicative with OR colleagues when using Sandbox.

 
 
 
 

FIELD RESEARCH – Understanding Anaesthesia nurses' workflow


 

During the project we visited two hospitals, where we attended six different surgeries. Being in the field, we applied ethnographic research methods, which helped us understand the Anaesthesia nurses' thoughts and feelings about their workflows and daily routines.

 
 
 
 
 

INSIGHTS – WHAT WE LEARNED FROM OUR RESEARCH


 

20/80 knowledge

Currently, nurses do not know the full capacity of the Anaesthesia machines because of their complexity, and there is only limited time for exploring and learning more about the machine during surgery.

Double-checking screen and patient

Nurses must continually monitor the patient, both through the data displayed on the screen interface and by visually checking the patient directly. Double-checking their readings against two sources, the patient and the values on the master screen, gives assurance.

Boring autopilot phase

Whilst the beginning and end of an operation are fast-paced and high-pressure for the nurse, the lengthy middle section, dubbed the 'autopilot phase', is generally uneventful. Here the nurse may need to monitor the patient for more than six hours.

 
 
 
 

INITIATING RESEARCH WORKSHOP – TURNING INSIGHTS INTO TANGIBLE IDEAS


 
 

Together with our collaboration partner, colleagues, and tutors, we hosted and participated in a wide range of activities revolving around our research findings. The purpose of this engagement was to speculate on and imagine scenarios of what the operating theatre of the future might look like.

 
 
 
 

Design sprint & concept development


 

We focused on empathy and human touch. We were interested in skilling up the nurse's human sensory apparatus, as it might offer a different way of learning how to monitor the Anaesthesia administered to the patient. Technology is not what makes good patient care; rather, it is empathy and confidence in one's human abilities.

 
 
To us, good interactive products respect all of man's skills: his cognitive, perceptual-motor and emotional skills.
— Djajadiningrat, Wensveen, Overbeeke
 
 

Initial concepts – Second skin

We developed the wearable device 'Second skin', an elastic, breathable display that fits on the skin, measures the patient's vital signs, and transmits the biometric data wirelessly to the master screen.


Initial concepts – Layered touch

Another concept is the faceless gestural interaction 'Layered touch', which enables the Anaesthesia nurse to apply Anaesthesia machine settings through gestures performed on the patient.


Initial concepts – Checkpoint touch 

In case of an emergency alarm from the master screen, the Anaesthesia nurse is supposed to stabilise the situation by touching the patient.

 
 
 
 

Concept user testing with Anaesthesia providers


 
 

Through prototype testing and role-plays, we gained a better understanding of what empathy and human touch mean in relation to the nurse's daily work routine. User testing showed that the human touch concepts were difficult to implement within a human-centered design approach, as there are practical reasons for the current lack of empathy and human touch.

 
 
 
 

Staying human-centered


 

The way we had set up our design approach so far opposed the current march of technology, even though rapid, widespread technological advancement is inevitable. Instead of saying we are working against technology, or working for technology, we decided to take the perspective that we are working with technology. Machines, for example, are good at fast processing of large amounts of data, while humans have far greater situational awareness. Combining human and machine cognitive abilities may enable effective human-machine work in tandem. We see potential for such human-machine teamwork in the Anaesthesia nurse's work environment in 2032.

 
 
 
 
 

Prototyping the Sandbox


 
 

For a clearer link between the readings on the existing master screen and our final concept, we adopted the master screen's visual identity in our concept as well. This ensures comparability between the simulated data (from our concept) and the live patient data (from the master screen).

 
 
 
 

Interface User Testing


 

We tested the product with both nurses and other students at the university. Some of the results revealed logical as well as visual discrepancies in the interface prototypes.

 
 
 
 
 

Final Interface


 

Sandbox's adjustment mode

When adjusting parameters, simulated patient readings appear individually as they are affected. Tapping any of the seven readings highlights it in the central area. Once highlighted, more information is displayed.


Sandbox's execution mode

To execute changes, four fingers are placed on the touch points at the outer edge of the Sandbox and held there for three seconds, preventing accidental changes to the selected values. The product provides the nurse with feedback through haptics and a green completion ring.
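
As an illustration of this confirmation mechanism, the following is a minimal, hypothetical TypeScript sketch of how a four-finger, three-second hold could drive the progress ring and haptic feedback. The four-finger and three-second values come from the concept; every function and callback name is an assumption made for the sketch.

// Hypothetical confirmation-hold logic for Sandbox's execution mode.
const REQUIRED_FINGERS = 4;
const HOLD_DURATION_MS = 3000;

interface ExecutionCallbacks {
  onProgress: (fraction: number) => void; // drives the green completion ring
  onHapticPulse: () => void;              // haptic feedback when committed
  onCommit: () => void;                   // apply the selected values
  onAbort: () => void;                    // a finger lifted too early
}

function startExecutionHold(activeFingers: () => number, cb: ExecutionCallbacks): void {
  const start = Date.now();
  const timer = setInterval(() => {
    if (activeFingers() < REQUIRED_FINGERS) {
      clearInterval(timer);
      cb.onAbort();                        // accidental changes are prevented
      return;
    }
    const fraction = Math.min((Date.now() - start) / HOLD_DURATION_MS, 1);
    cb.onProgress(fraction);               // ring fills as the hold continues
    if (fraction === 1) {
      clearInterval(timer);
      cb.onHapticPulse();
      cb.onCommit();
    }
  }, 50);
}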


Sandbox's feedback mode

In the final stage, the nurse submits observations of the patient's physical responses to the executed changes. Here, the nurse's tacit knowledge and the machine's knowledge are united in a cooperative network system.

 
 
 
 

Concept video


 

Our first video prototypes of user scenarios were filmed in a self-built hospital environment. We decided to show the final interface via screen-replacement effects, which required intensive motion-tracking work in Adobe After Effects – one of the most technically challenging tasks in this project.

 
 
 
 
 

FEEDBACK & REFLECTION


 

To conclude, we have presented a concept that solves a real problem, addressing challenges identified through our research and validated through user testing. In our future vision, Anaesthesia nurses will not lose their manual skills. Instead, they will combine their cognitive abilities with the machine's, working in tandem to deliver better patient care.

The concept relies on highly accurate prediction software, with machine learning that needs access to a huge quantity of data. If the system were even slightly inaccurate, the implications for trust would be significant, so system-level integration is required. That said, the benefits for the nurse of an accurate patient-prediction system are substantial. Additionally, this proposal could have an impact far beyond the OR, with applications for the learning system in the wider hospital environment and beyond.