University of Texas Medical Branch augments curriculum with virtual reality simulations

Nursing, medicine, respiratory care and other healthcare students learn to identify safety hazards in a simulation; practice medical interviews with a virtual human nicknamed “Mike White”

The University of Texas Medical Branch first opened the door to its Health Education Center in 2019. The five-story facility boasts 161,000 square feet of training space with 77 simulated patient beds. The HEC provides human, non-human (i.e., mannequins) and virtual reality (VR) simulations to train students from five different healthcare schools across the University.

UTMB has been using simulations to train healthcare professionals since the 1970s. For example, it was one of the first universities to begin using standardized patients – actors paid to role-play for students and bring realism to medical training. The goal has always been the same: to provide the highest quality education possible.

Today, the HEC at UTMB has a new addition to its educational toolset: extended reality (XR) provided by Virti. XR is an overarching term used to describe immersive learning technologies such as virtual reality (VR) and augmented reality (AR). 

UTMB has turned to Bruce Adcock, an assistant professor of Respiratory Care, and Richard Briley, an instructional technologist at the school, to help integrate XR into the curriculum.

Simulation: the patient safety room

School of Nursing students at UTMB will find all sorts of hazards in the patient safety room if they look carefully. It could be a used needle forgotten on a hospital bed. Or it could be a urinary catheter bag left on the railing. None of these hazards pose a real physical risk because the exercise is a simulation. 

The patient safety room simulation gives students the opportunity to apply what they’ve learned in the classroom. In other words, students practice spotting potential hazards (and HIPAA violations) they might encounter in a future clinical environment. Bruce created the simulation by recording a hospital room with a 360-degree video camera and uploading the footage to Virti.

Students put on a VR headset, enter the patient safety room – a completely immersive learning environment – and attempt to identify these and other hazards. When the exercise is complete, the class collectively debriefs with their professor to discuss what they saw – or didn’t see.

The immersive nature of this class is a vast improvement over how it was previously taught: with traditional video and lectures. The class used to be held later in the semester, but students and faculty alike requested it be conducted sooner. “When they get out to clinical, they’ve already had practice,” Bruce explained. 

Planning and scene development aside, the filming of the simulation took him just four minutes. Most students complete the exercise in 10-20 minutes. The lessons, the University hopes, will be remembered by students for their entire careers. 

Simulation: medical interviews with a patient named “Mike White”

A second simulation is a medical interview with “Mike White,” a nondescript nickname for a virtual human – an avatar – in Virti. Powered by generative artificial intelligence (AI), the avatar can speak 20 different languages and can be made to look like any ethnicity, gender or background the curriculum calls for.

“It’s geared towards students learning how to communicate with the patient and ask appropriate questions,” said Richard. “There's a medical interview component of it with about five or six objectives that they have to meet. The faculty makes sure that the students ask the necessary questions regarding the patient's condition.”

Even though the environment is virtual, the health assessments also include a physical examination component. During the simulation, students are required to check vital signs: measuring the patient’s blood pressure, taking their temperature, listening to their heart and lungs – and even performing a capillary refill test.

Interpersonal training – demonstrating good bedside manner – is as important as asking the right questions. Because Mike White is powered by generative AI, the avatar can be programmed with a backstory that enables it to answer off-script questions. For example, if a student asks a rapport-building question – ‘What’s your favorite baseball team?’ – the avatar will answer based on its backstory.

All of this solves a logistical challenge for the School of Nursing. With 120 students per semester, the school has limited capacity to conduct role-play simulations. Previously, a student might have had just one chance to conduct a medical interview with a standardized patient. With virtual reality, students can run through the simulation many times over.

“So, timewise, the students get a lot out of it,” said Bruce. “They actually get to execute the interview, compared to just sitting there listening to it or watching videos in a classroom.”

Accessibility and convenience today; summative assessments tomorrow

Although the work with XR simulations at UTMB is still in the pilot phase, the fact that students can practice such interviews repeatedly underscores one of the “biggest benefits” of XR: “accessibility and convenience,” according to Richard.

For example, standardized patients for role-play simulations must be booked a year in advance. These sessions require a detailed plan: learning objectives, script writing, scene development and the budget allocation to hire an actor.

By contrast, XR can cut that lead time in half – to about six months for most simulations. The six-month timeline applies mainly to simulations built on 360-degree video, because those still require actors. If a script is already written, however, the teams can program it into a virtual human and have that simulation ready in just three or four days.

Bruce and Richard are careful to emphasize that XR simulations have to meet or exceed the performance of established methods like the standardized patient. UTMB’s long history with simulation means the faculty has high expectations for what a simulation will provide.

“Bruce and I will not do an XR simulation unless we can get it very close, similar, or better than what non-human and human can offer,” noted Richard. 

To date, XR simulations have been used to train about 1,000 students in the pilot phase. Usage has also grown beyond the School of Nursing: XR now augments curricula in the Schools of Medicine, Health Professions, Public and Population Health and the Graduate School of Biomedical Sciences.

The HEC is carefully collecting benchmarking data about XR simulations at UTMB. In the future, they hope the data can support a comparative study that quantifies the difference in student outcomes between XR and traditional simulations. If all goes well, they foresee XR one day being used to help evaluate overall student performance and even prepare students for board exams.

“That’s the challenge that we're seeing now,” said Richard. “XR simulation creates accessibility and students can utilize this to practice and hone their skills, but the real question on everybody's mind is how can this be utilized as a formative assessment or a summative assessment?”