Virtual Humans: More Honest Data in the Future of Healthcare

Any dancer or doctor knows full well what an incredibly expressive device your body is. 300 joints! 600 muscles! Hundreds of degrees of freedom!

The next time you make breakfast, pay attention to the exquisitely intricate choreography of opening cupboards and pouring the milk — notice how your limbs move in space, how effortlessly you use your weight and balance. The only reason your mind doesn’t explode every morning from the sheer awesomeness of your balletic achievement is that everyone else in the world can do this as well.

With an entire body at your command, do you seriously think the future of interaction should be a single finger? – Bret Victor

The Future Will Be Virtual, Augmented, and Wearable

The USC Institute for Creative Technologies (ICT) is a pioneer in Virtual Human (VH) technology. ICT's virtual humans are digital characters that look, sound, and behave like real people.

Understanding the human face is an especially complex task. The face contains 43 muscles, and it takes as few as five of them to show whether we are happy, sad, afraid, angry, disgusted, or surprised. Understanding and sensing emotions in real humans is key to making virtual characters more realistic.
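One common way researchers describe facial expressions is the Facial Action Coding System (FACS), which labels individual muscle movements as numbered Action Units. As a rough illustration of how detected muscle movements could map to the six basic emotions above, here is a minimal sketch; the prototype combinations follow commonly cited FACS/EMFACS descriptions, and a production system like MultiSense would use trained statistical models rather than a lookup table:

```python
# Illustrative mapping from FACS Action Units (AUs) to Ekman's six basic
# emotions. AU combinations follow commonly cited EMFACS prototypes;
# this lookup-table approach is a teaching sketch, not MultiSense's method.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},              # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},           # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  {1, 2, 5, 26},        # brow raisers, upper lid raiser, jaw drop
    "fear":      {1, 2, 4, 5, 20, 26}, # brows + lid raiser, lip stretcher, jaw drop
    "anger":     {4, 5, 7, 23},        # brow lowerer, lid raiser/tightener, lip tightener
    "disgust":   {9, 15},              # nose wrinkler, lip corner depressor
}

def classify_emotion(detected_aus):
    """Return the emotion whose AU prototype best matches the detected
    set (Jaccard similarity), or None if nothing overlaps."""
    best, best_score = None, 0.0
    for emotion, prototype in EMOTION_PROTOTYPES.items():
        overlap = len(prototype & detected_aus)
        if overlap == 0:
            continue
        score = overlap / len(prototype | detected_aus)
        if score > best_score:
            best, best_score = emotion, score
    return best
```

For example, detecting AU 6 and AU 12 together (a "Duchenne smile") would classify as happiness.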

VH technology is currently being used to help clinicians better interact with patients.

MultiSense and SimSensei in Healthcare 2014

ICT developed MultiSense as a way to quantify facial expressions, body posture, and speech patterns. Algorithms combine this data to create a complete picture of a user’s emotional state in real time, and with a profile that recognizes changes over time. MultiSense drives SimSensei – a next generation Virtual Human platform designed to improve healthcare decision-making and delivery.
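MultiSense's actual pipeline is not detailed here, but the idea of combining several channels into one real-time emotional picture, with a profile that tracks change over time, can be sketched as a late-fusion step plus temporal smoothing. All channel names, weights, and the smoothing factor below are hypothetical placeholders, not MultiSense's parameters:

```python
# Hypothetical sketch of late fusion across sensing channels (face,
# posture, voice), followed by an exponential moving average so the
# user's profile reflects changes over time. Weights are illustrative.

WEIGHTS = {"face": 0.5, "posture": 0.2, "voice": 0.3}

def fuse(channel_scores):
    """channel_scores: {channel: {emotion: score in [0, 1]}}.
    Returns one weighted score per emotion for the current frame."""
    fused = {}
    for channel, scores in channel_scores.items():
        w = WEIGHTS[channel]
        for emotion, s in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * s
    return fused

def smooth(prev_profile, current, alpha=0.3):
    """Blend the new frame into the running profile (EMA), so momentary
    noise is damped but genuine shifts in emotional state show through."""
    emotions = set(prev_profile) | set(current)
    return {e: (1 - alpha) * prev_profile.get(e, 0.0)
               + alpha * current.get(e, 0.0)
            for e in emotions}
```

In use, each incoming frame of sensor estimates would be fused, then folded into the running profile with `smooth`, giving both the instantaneous picture and the longer-term trend.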


Learn more about MultiSense and SimSensei in healthcare.

Patients More Honest with Virtual Humans

Virtual Human technology is used in role-playing and training to help clinicians improve their interactions with patients. But new research by ICT has netted some surprising results.

New research finds patients are more likely to respond honestly to personal questions when talking to a Virtual Human.

Originally, ICT trained clinicians by having them interact with a Virtual Human patient. In the new research, the tables were turned: patients interacted with Virtual Human interviewers, which asked the kinds of questions a physician normally would. The interviews started with general getting-to-know-you questions and gradually led to more personal and revealing ones, like, "How close are you to your family?"

“Half of the participants were told that their conversation was entirely computer-driven and not being observed. The others were informed they were being watched by a person in another room who was also manipulating the machine to ask certain questions. In all cases, video images of their faces were recorded and later analyzed to gauge their level of emotional expression.” – Tom Jacobs, “I’ll Never Admit That to My Doctor”

Surprisingly, Virtual Humans were able to extract better patient data. In discussing private matters with the computer-generated entities, patients disclosed more information. Why? According to Gale Lucas, who led the study for ICT, participants did not feel like they were being observed or judged. They also reported “significantly lower fear of self-disclosure.”

You can read more about the study in the journal Computers in Human Behavior.

Virtual Humans Will Help Predict Treatment

Across the pond, researchers in England are using Virtual Physiological Humans “to engineer a simulation of the body so true to life, any data could be potentially input to create a personalized health plan, and predictions for any future patient.”

According to Marco Viceconti, Director of the Insigneo Institute at the University of Sheffield,

“If I now feed to my simulations the data related to a particular individual, that simulation will make health predictions about the status of that individual. This is not personalized medicine, this is individualized medicine, we can finally say something about you not because you are about the same age and sex and disease as another thousand people, but because you are you with your condition and your history.”

Virtual Human Insight for Wearable Technologies

In a recent opinion piece for CIO, Brian Eastwood writes that wearable tech’s dilemma is too much data and not enough insight. He explains that even though he runs marathons and writes about healthcare IT, he still does not have a fitness tracker.

I started thinking about how Virtual Human technology could combine with wearable devices. Although speech recognition technology is already used with Google Glass, it is not at the level of sophistication of VH. Imagine your own Virtual Human personal trainer who would have an understanding of your emotions and behaviors, and your personal weaknesses and motivators. Interacting with your VH through speech-recognition technology would minimize the need to display lots of data on a small screen. Your VH-enabled wearable device could know just the right words and cues to promote healthy behaviors, and maximize your personal wellness.

There will be no distinction, post-Singularity, between human and machine and between physical and virtual reality. – Ray Kurzweil



HealthIsCool

HealthIsCool writes for HL7standards.com on health innovation, wearable tech, Quantified Self, IoT, mHealth, and future trends. Follow on Twitter at @HealthIsCool.

