“Humans interacting with avatars have an experience of being with another person and are willing to disclose information that is considered highly personal. Moreover, recent evidence indicates that avatars are trusted to a similar extent as humans.”
Andrija Javor, Department of Neurology II, Kepler University Clinic, Johannes Kepler University, Linz, Austria
Hospitals and health care providers serve large, diverse communities. However, for a healthcare brand to resonate with anyone, it must be developed with someone specific in mind. Persona models distill patient research into a simple but powerfully informative guide that brings patients to life in ways that demographics alone can never achieve.
Persona models also help us understand how patients perceive their disease states: whether they feel stigma about a health condition or, on the flip side, are likely to embrace the condition as part of their identity.
Persona models also help healthcare providers in patient segmentation. Because the purchase cycle for health care is complex, and because hospitals treat so many people of different ages, ethnicities, incomes, and lifestyles, it’s a fool’s errand to focus on just one or two “patients” when developing care delivery strategies.
Personifying patients and giving them a voice makes it easier to evaluate the strengths of care delivery strategies and concepts.
Up until now, patient persona creation has primarily been a paper exercise. Health care marketing and strategy professionals create visual representations of the personas they want to model, incorporating as much demographic and research information on their patient population as possible. In addition to the visual guide, they also write a narrative description of the patient that goes beyond the basic demographics and speaks to the human qualities of their patient: who she is, how she lives, what good health care means to her, and her desired relationship with a provider.
I’ve been interested in photogrammetry and medical avatars for over a decade. In several of my conference presentations (where my colleagues often conducted a poll on whether what I was presenting was “cool” or “creepy” – hence the title of this post), I featured some of the work done by Dr. Leslie Saxon, Founder and Executive Director of the USC Center for Body Computing, where the goal was to create avatars for experts and researchers from the Keck School of Medicine of USC. Below is one of the early videos (2015) of her during the development of her medical avatar:
Dr. Saxon used this medical avatar to create an app for her patients with atrial fibrillation called DocOn (2017). Here’s a look at how that app came out:
The technology used to create Dr. Saxon’s medical avatar required expensive 3D photo scanning equipment that is beyond the reach of most organizations. The “Tent” is a canvas enclosure that holds a circular array of 100 individual cameras hooked up to a computer that melds the 100 images into a single 3D representation.
“It’s exciting and it’s valuable because it’s just going to change the way people interact with their doctors. With an app like this, it allows you to talk to a doctor in the comfort of your own home.”
Ketetha Olengue, USC Center for Body Computing, featured in the DocOn application video
Technology has evolved considerably since then. Companies like Soul Machines have created digital humans across multiple industries, including health care. Lucien Engelen, CEO of @Transform.Health in The Netherlands, has worked with Deloitte Nederland and UneeQ to develop digital human applications in pharmacy, mental health, and COVID-19 screening. Here’s a video (2021) of Wendy, their COVID-19 assistant, along with some user comments (audio in Dutch with English subtitles):
But now, even beyond contracting with companies like Soul Machines and UneeQ, we can create digital “MetaHuman” avatars on any desktop computer using cloud computing software. Enter the MetaHuman Creator from Epic Games.
The MetaHuman Creator is a web app that lets you assemble a digital human simply by picking eyes, skin color, hair, and other attributes, much like customizing an avatar for a video game. It’s hard to describe how real the results look, so instead, take a look at the video below:
MetaHuman Creator can do years’ worth of character work in a much shorter amount of time, and in one person’s browser, no less. Now, I’m no computer graphics artist, nor am I a CGI programmer. But I’ve been learning how to use MetaHuman Creator through Office Hours, a global community of almost 4,000 media and event professionals who meet each day to learn from one another, work together, and forge a new path for their industries. The ability to easily create photorealistic human characters is genuinely unique, and my experience with the platform so far led me to wonder whether we might supplement the paper-based exercise of creating patient personas with realistic video-based characters that could be used for empathy training and to explore what different patients might expect from care delivery.
So, what’s the plan? I’ve enlisted four of my former students to learn the ins and outs of MetaHuman Creator and help create a series of six digital patient personas. We’re currently interviewing health system strategy and marketing professionals in each city where we live to develop a list of typical patient personas to explore.
Once we have that list compiled, we’ll create the six digital patient personas, considering an appropriate mix of gender, ethnicity, age (no children, both because of some current online concerns and because they are not supported in MetaHuman Creator), and clinical conditions. When the digital patients are complete, we’ll review the look and feel with the folks who suggested the personas. If they seem right, we’ll animate the digital patients and record them asking questions about their specific conditions and needs. Our current plan is to use voice-over artists for the audio and Live Link Face for Unreal Engine for the facial animation.
Finally, once everything is complete and rendered in Unreal Engine, we’ll review the results with our advisors and see whether there is any value in adding the digital patient personas to their existing work.
I’ll provide updates on our progress in future blog posts as we complete each stage of the process. I’ll include the good, the bad, and the ugly, and show the various personas as they develop. Once we’re done, we’ll post everything online, including the source data as an Autodesk Maya file with meshes, skeleton, facial rig, animation controls, and materials, so others can further edit and refine the personas. We’ll see how this all plays out over the next six months or so. No guarantees, but it will undoubtedly be an interesting crowd-sourced project. Stay tuned…
In the meantime, drop a comment to this post with your opinion: “Is the development of MetaHuman patient persona avatars cool or creepy?” And tell me why. Thanks for reading!