Virtual Immersive Training And Learning (VITAL)
- Our research explores the use of virtual and immersive technologies for training humans to perform complex or high-risk tasks.
- We adopt these technologies to develop experimental paradigms that help us tackle fundamental research questions relating to sensorimotor control and perceptual-cognitive expertise.
- We work closely with industry partner Cineon Training.
- Our research is underpinned by psychological theories of human perception, cognition and emotion. We adopt a number of concurrent measurement techniques to examine important research questions relating to human learning, psychomotor control and psychophysiology. (See the Cognitive and psychophysiological determinants of human performance group for more information.)
Modulating multisensory information to explore pain
A PhD project examining how VR technologies can be used to alleviate chronic and acute wrist pain, with clear applied relevance but also a unique opportunity to understand the psychological underpinnings and fundamental drivers of pain.
Virtual Reality and Stroke
Building novel assessments for upper-limb and hand function following impairment: a PhD project, funded through the EPSRC DTP, that maps the upper-limb functional ability of stroke survivors in reach-to-grasp tasks using motion capture and virtual reality. These novel assessments will inform bespoke rehabilitation methods for upper-limb impairments caused by stroke.
Using analogies to overcome freezing of gait: A first step towards making the first step
Virtual reality was used to create a stressful environment (a raised walkway), enabling us to induce freezing episodes in people with Parkinson’s disease and then explore interventions to help them initiate a successful step. The project was in collaboration with Dr Will Young at Brunel University London and was funded by Parkinson’s UK.
Augmented reality as a tool for cognitive de-biasing
This project is funded by the Defence Science and Technology Laboratory and will explore how individuals’ eye and reaching movements, recorded while completing bespoke tasks in augmented reality, can be assessed using machine learning to characterise levels of cognitive bias. The project is led by Professor Mark Wilson in collaboration with Dr Piotr Slowinski in Mathematics.
- Access to programmers and coders via our industry partner Cineon Training, including expertise in Unity and Unreal Engine 4 (UE4).
- VR-integrated and mobile eye tracking.
- Oculus and Vive VR headsets.
- Access to 360 video production.
- Psychophysiological measurement.
- Mixed reality object interaction (with motion tracking).
Military simulation research
We delivered a research project, funded by the Defence Science and Technology Laboratory, which utilised a weapon simulation (see image).
The research explored the use of quiet eye training to improve marksmanship skills in simulated environments.
With funding from the Higher Education Funding Council for England (HEFCE), we worked with Exeter-based airline Flybe to use flight simulators in the assessment of pilots’ reactions to pressure.
Surgery simulation and robotics
We performed a number of studies exploring the fidelity of surgical simulations for laparoscopic surgery. We tested the construct validity of a TURP (transurethral resection of the prostate) simulator and also adopted eye-tracking technology to validate the simulation against real-life operations.
With funding from Intuitive Surgical, we also explored the performance and cognitive benefits of the da Vinci surgical robot from the perspective of the surgeon, including learning, stress and workload.
We also explore the role of observational learning in the acquisition of robotic surgical skills.
- Face validity, construct validity and training benefits of a virtual reality TURP simulator.
- Assessing visual control during simulated and live operations: gathering evidence for the content validity of simulation using eye movement metrics.
- Robotic technology results in faster and more robust surgical skill acquisition than traditional laparoscopy.
- Robotically assisted laparoscopy benefits surgical performance under stress.
- Surgeons display reduced mental effort and workload while performing robotically assisted surgical tasks, when compared to conventional laparoscopy.
- Action observation for sensorimotor learning in surgery.
- A randomised trial of observational learning from 2D and 3D models in robotically assisted surgery.
Additionally, simulated driving is an ideal environment in which to explore the psychophysiological determinants of flow, a peak performance state of intense concentration and motivation (see publications).
- Prevention of coordinated eye movements and steering impairs driving performance.
- The role of effort in moderating the anxiety–performance relationship: Testing the prediction of processing efficiency theory in simulated rally driving.
- Is flow really effortless? The complex role of effortful attention.
- An external focus of attention promotes flow experience during simulated driving.
- Could virtual reality revolutionise training in nuclear? (Energy Live News)