Virtual Immersive Training And Learning (VITAL)
- Our research explores the use of virtual and immersive technologies for training humans to perform complex or high-risk tasks.
- We adopt these technologies to develop experimental paradigms that help us tackle fundamental research questions relating to sensorimotor control and perceptual-cognitive expertise.
- We work closely with industry partner Cineon Training.
- Our research is underpinned by psychological theories of human perception, cognition and emotion, and we adopt a number of concurrent measurement techniques to examine important research questions relating to human learning, psychomotor control and psychophysiology. (See the Human Movement Science: Psychology group for more information.)
Modulating multisensory information to explore pain
A PhD project examining how VR technologies can be used to alleviate chronic and acute wrist pain, with clear applied value but also a unique opportunity to understand the psychological underpinnings and fundamental drivers of pain.
Mixed reality learning: robots and dinosaurs
A £5.6M project (£4M from Innovate UK and £1.6M from the private sector) to develop a new mixed and virtual reality educational experience. The funding comes from the Audience of the Future Demonstrator fund (Industrial Strategy Challenge Fund). Exeter received ~£100k to tackle research questions relating to immersive technology for education and creative experience. The lead organisation is FACTORY 42, and other partners include the Natural History Museum, the Science Museum, the Almeida Theatre, Magic Leap, and Sky VR.
Developing evidence-based methods of validation
VR simulations are often adopted within training programmes before they have been rigorously tested. We have recently been developing a methodology for testing and validating virtual environments; for more information, see: Testing the fidelity and validity of a virtual reality golf putting simulator, and Development and validation of a simulation workload measure: The Simulation Task Load Index (SIM-TLX).
Evaluating motion capture technology in virtual reality
Many commercially produced VR devices pair a quality head-mounted display with integrated motion tracking, which makes them potentially valuable tools for researchers. To assess the suitability of these systems for scientific research (particularly in sport science and biomechanics), we are comparing the position/orientation estimates of the HTC Vive with those of an industry-standard motion capture system during dynamic movement tasks. This research forms an important first step in the PhD project of Jack Evans, who is undertaking an EPSRC-funded project to integrate virtual reality into the care and rehabilitation of stroke patients.
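One common way to quantify agreement between a headset's tracking and a reference motion capture system is the root-mean-square error (RMSE) between time-synchronised position traces. The sketch below is illustrative only, with made-up data and a hypothetical `position_rmse` helper; it is not the analysis pipeline used in the project.

```python
# Minimal sketch (hypothetical data): comparing position traces from a VR
# headset tracker against a reference motion capture system using RMSE.
import numpy as np

def position_rmse(reference: np.ndarray, device: np.ndarray) -> float:
    """RMSE between two synchronised (N, 3) position traces, in shared units."""
    # Per-sample Euclidean distance between the two estimates of position.
    errors = np.linalg.norm(reference - device, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

# Illustrative traces: the 'device' trace equals the reference
# plus a constant 2 mm offset along the x-axis.
t = np.linspace(0, 1, 100)
reference = np.stack([t, np.sin(t), np.zeros_like(t)], axis=1)
device = reference + np.array([0.002, 0.0, 0.0])

print(round(position_rmse(reference, device), 4))  # constant 2 mm offset -> 0.002
```

In practice the two systems would first need temporal synchronisation and spatial alignment into a common coordinate frame before an error metric like this is meaningful.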
- Access to programmers and coders via our industry partner Cineon Training, including Unity and UE4.
- VR integrated and mobile eye tracking.
- Oculus and Vive VR headsets.
- Access to 360 video production.
- Psychophysiological measurement.
- Mixed reality object interaction (with motion tracking).
To ask a question or discuss working with us, please contact Dr Sam Vine - firstname.lastname@example.org
Military simulation research
We delivered a research project, funded by the Defence Science and Technology Laboratory, which utilised a weapon simulation.
The research explored the use of Quiet eye training to improve marksmanship skills in simulated environments.
With funding from the Higher Education Funding Council for England (HEFCE) we worked with Exeter-based airline Flybe to use flight simulators in the assessment of a pilot’s reaction to pressure.
Surgery simulation and robotics
We performed a number of studies to explore the fidelity of surgical simulations for laparoscopic surgery. We tested the construct validity of a TURP simulator and also adopted eye tracking technology to validate the simulation against real-life operations.
With funding from Intuitive Surgical, we also explored the performance and cognitive benefits of the da Vinci surgical robot from the perspective of the surgeon, including learning, stress and workload.
We also explore the role of observational learning in the acquisition of robotic surgical skills.
- Face validity, construct validity and training benefits of a virtual reality TURP simulator.
- Assessing visual control during simulated and live operations: gathering evidence for the content validity of simulation using eye movement metrics.
- Robotic technology results in faster and more robust surgical skill acquisition than traditional laparoscopy.
- Robotically assisted laparoscopy benefits surgical performance under stress.
- Surgeons display reduced mental effort and workload while performing robotically assisted surgical tasks, when compared to conventional laparoscopy.
- Action observation for sensorimotor learning in surgery.
- A randomised trial of observational learning from 2D and 3D models in robotically assisted surgery.
We have previously used simulated rally driving to explore the link between eye movements and steering movements and test theories about the disruptive effects of anxiety on attention.
Additionally, simulated driving is an ideal environment in which to explore the psychophysiological determinants of flow, a peak performance state of intense concentration and motivation (see publications).
- Prevention of coordinated eye movements and steering impairs driving performance
- The role of effort in moderating the anxiety–performance relationship: Testing the prediction of processing efficiency theory in simulated rally driving
- Is Flow Really Effortless? The Complex Role of Effortful Attention
- An external focus of attention promotes flow experience during simulated driving