Medical Education
Feb 2019

Emerging Roles of Virtual Patients in the Age of AI

C. Donald Combs, PhD and P. Ford Combs, MS
AMA J Ethics. 2019;21(2):E153-159. doi: 10.1001/amajethics.2019.153.

Abstract

Today’s web-enabled and virtual approach to medical education is different from the 20th century’s Flexner-dominated approach. Lectures now receive less emphasis, and more is placed on learning via early clinical exposure, standardized patients, and other simulations. This article reviews literature on virtual patients (VPs) and their underlying virtual reality technology, examines VPs’ potential through the example of psychiatric intake teaching, and identifies promises and perils posed by VP use in medical education.

Virtual Patients in Medical Education

Over the past 20 years, a revolution has taken place in the use of health care simulation. Technological advances in computational power, graphics, display systems, tracking, interface technology, haptic devices, authoring software, and artificial intelligence (AI) have supported creation of low-cost, user-friendly virtual reality (VR) technology and virtual patients (VPs). VPs are defined by the Association of American Medical Colleges as “a specific type of computer-based program that simulates real-life clinical scenarios; learners emulate the roles of health care providers to obtain a history, conduct a physical exam, and make diagnostic and therapeutic decisions.”1-3 VPs represent a fusion of simulation technologies and VR, which is generally defined as “a three-dimensional, computer-generated environment which can be explored and interacted with by a person.”4

Research documents many settings in which VR and VPs add value in education and in clinical practice. No longer merely props in a virtual world, VPs are designed to interact in 2D and 3D virtual worlds and to engage in face-to-face dialogues with users.5,6 Artificially intelligent VPs interact verbally and nonverbally, and the most sophisticated VPs approach verisimilitude by engaging in rich conversations, recognizing nonverbal cues, and reasoning about social and emotional factors.7 Learners interact with avatars (computer representations of patients that can speak and answer learner questions) in ways that mimic real and standardized patients. VPs provide a safe, effective means by which learners practice clinical skills before interacting with patients.
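To make this interaction model concrete, the following minimal sketch, written in Python and entirely our own illustration rather than the design of any system cited here, shows how a rule-based VP might match a learner’s history-taking question to a scripted patient reply. Production VPs replace this naive keyword matching with natural language understanding, speech, and nonverbal behavior.

    import re

    # Minimal illustrative sketch of a rule-based conversational virtual
    # patient. The scenario, keywords, and replies are hypothetical.

    SCRIPT = {
        ("brings", "today", "help"): "I've been feeling down for about two months.",
        ("sleep", "sleeping"): "I wake up around 3 a.m. and can't fall back asleep.",
        ("medication", "medications", "taking"): "Just ibuprofen now and then.",
        ("hurt", "hurting", "harm", "suicide"):
            "Sometimes I think everyone would be better off without me.",
    }
    DEFAULT_REPLY = "I'm not sure what you mean. Could you ask that another way?"

    def vp_reply(question: str) -> str:
        """Return the scripted reply whose keywords best match the question."""
        words = set(re.findall(r"[a-z']+", question.lower()))
        best_reply, best_score = DEFAULT_REPLY, 0
        for keywords, reply in SCRIPT.items():
            score = len(words.intersection(keywords))
            if score > best_score:
                best_reply, best_score = reply, score
        return best_reply

    print(vp_reply("What brings you in today?"))          # presenting complaint
    print(vp_reply("How have you been sleeping?"))        # sleep history
    print(vp_reply("Any thoughts of hurting yourself?"))  # suicide risk screen

Even this toy version shows why VP behavior is perfectly uniform: the same question always elicits the same scripted answer, a property we return to below.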

This article reviews literature on VPs and their underlying VR technology, examines the application of VPs in teaching psychiatric intake, and identifies promises and perils posed by VPs in medical education. We use psychiatric intake as an example because psychiatric issues such as opioid overuse, posttraumatic stress disorder, and suicidality are pervasive. Shortages of mental health professionals and services limit learners’ exposure to these clinical problems,10 so VPs could play an especially useful role.11

The Promise of VPs

Technologically savvy learners have expectations about learning methods that differ from those of previous generations, and some faculty have been slow to respond to this change. Devices such as laptops, tablets, and smartphones linked with sensors and applications through ubiquitous Wi-Fi networks are no longer merely peripheral to learners’ experiences: they have become indispensable elements of education and practice. This evolution coincides with significant changes in the health care sector. An aging, often chronically ill population demands increasing attention from health care practitioners, and stringent clinical productivity expectations can reduce the time available for clinician-educators to participate in traditional teaching models. These trends are exacerbated by patients’ decreasing lengths of stay in hospitals, which further limit opportunities for students to participate in longitudinal treatments.12 One response to these constraints is broader use of VPs.

Use of VPs offers several advantages when compared to traditional methods of teaching clinical skills. Online learning materials, such as VPs, are accessible any time and almost anywhere there is a computer with an internet connection. Once the VP software is developed, it can be reused without additional cost and VP “knowledge” can be updated quickly.13 VPs also have advantages over standardized patients. VPs are more uniform than standardized patients because there is no variation in VP behavior once the software is completed and, unlike standardized patients, VPs do not need to be physically present with a learner. VPs might also convey more didactic information than standardized patients, who rely on recall. Additionally, VPs combine images, animations, video, and audio clips, which digital natives find more stimulating than textbooks.14 VPs can help students learn clinical and ethical decision making, basic practitioner-patient communication, and history-taking skills.15,16

Nevertheless, it is important to acknowledge that VPs are not equivalent to real patients and cannot replace traditional clinic-based teaching. Even state-of-the-art simulations remain limited compared with the full range of symptoms real patients exhibit. Additionally, learning through VPs outside a classroom requires substantial self-discipline, and enthusiasm for learning can deteriorate without face-to-face feedback from teachers and fellow students. Those observations aside, VR and VPs have uses in patient care (including exposure therapy, autism treatment, and responding to phantom limb pain) and in learning anatomical analysis, team training, surgical management, expressing empathy, and facilitating patient wellness.17,18

Using VPs to replicate clinical conditions and settings can provide a useful context for learning. Psychiatric intake, for example, is important because information elicited during the intake process can either be a prelude to accurate diagnosis and appropriate treatment or, to varying degrees, lead to mistakes, misunderstandings, and inappropriate care. A typical intake process includes gathering information on patient characteristics: address, sex, family, income, education, primary care practitioner, other clinicians, past and current health problems, relationship information, current functioning, and mental and physical symptoms.19 This information can influence the quality of clinician-patient interactions, the accuracy of a diagnosis, and the effectiveness of a treatment plan. Research suggests that VPs can successfully facilitate learners’ acquisition of core knowledge in psychiatry and help develop their skills in history taking, interviewing, clinical reasoning, decision making, and assessing suicide risk.8,9
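To illustrate the scope of such a record, the sketch below defines a hypothetical intake data structure with one field per category named above; the field names are our own and are not drawn from the cited Virginia intake form.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical sketch of a psychiatric intake record; field names are
    # illustrative only, not taken from any cited form.

    @dataclass
    class IntakeRecord:
        address: str = ""
        sex: str = ""
        family_situation: str = ""
        income: Optional[float] = None        # annual, self-reported
        education: str = ""
        primary_care_practitioner: str = ""
        other_clinicians: List[str] = field(default_factory=list)
        past_health_problems: List[str] = field(default_factory=list)
        current_health_problems: List[str] = field(default_factory=list)
        relationship_information: str = ""
        current_functioning: str = ""         # work, self-care, social life
        mental_symptoms: List[str] = field(default_factory=list)
        physical_symptoms: List[str] = field(default_factory=list)

        def missing_fields(self) -> List[str]:
            """Fields not yet elicited during the interview."""
            return [name for name, value in vars(self).items()
                    if value in ("", None, [])]

A VP platform built along these lines could, for example, call missing_fields() at the end of a student interview to give automated feedback on the completeness of the intake.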

The Peril of VPs

VPs have proven useful, but they have shortcomings. Current VPs might not represent the diversity of a population and, when racial or ethnic diversity is represented, VPs with darker skin tones could trigger learners’ unconscious bias.20,21 Stigmatizing language used in health records similarly influences learners’ attitudes toward patients and their subsequent prescribing behavior.22 People who are sincere in renouncing prejudice can remain vulnerable to biased habits of mind; good intentions are not enough. Studies demonstrate bias affecting nearly every group of people.

If you are Latino, you will get less pain medication than a white patient. If you’re an elderly woman, you will receive fewer life-saving interventions than an elderly man.… If you are an obese child, your teacher is more likely to assume you’re less intelligent than if you were slim.23

Bias is, no doubt, reflected in VP construction as well. Joanna Bryson, an expert in AI, notes that sexist AI could be a consequence of AI programming being done predominantly by “white, single guys from California”24 and that it might be addressed, at least partially, by diversifying the software development workforce. According to Bryson, it should come as no surprise that machines express opinions of the people who program them: “When we train machines by choosing our culture, we necessarily transfer our own biases. There is no mathematical way to create fairness. Bias is not a bad word in machine learning. It just means that the machine is picking up regularities.”24 This concern about bias applies not only to AI but also to VPs.

In addition to concerns about bias in VP creation and use, there is significant potential for malicious intent in their programming. One example is a virtual human named Norman, created at the Massachusetts Institute of Technology.24 Norman illustrates that the data used to teach a machine learning algorithm can significantly influence what is learned and how a VP “behaves”: when the output of an AI algorithm is biased and unfair, the culprit is usually not the algorithm but the biased data used in training. Norman was subjected to extended exposure to the darkest corners of Reddit and has been called the “world’s first psychopath AI,”25 a case study of machine learning gone wrong when it is fed biased data.
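The mechanism is easy to demonstrate. The toy sketch below, our own illustration and not the MIT system, trains one and the same trivial caption classifier on balanced and on skewed data; the learning code never changes, yet the skewed model labels a benign scene sinister.

    from collections import Counter

    # Toy demonstration that identical learning code yields different
    # "behavior" when the training data are skewed.

    def train(examples):
        """Count how often each word co-occurs with each label."""
        counts = {"benign": Counter(), "sinister": Counter()}
        for caption, label in examples:
            counts[label].update(caption.split())
        return counts

    def classify(counts, caption):
        """Pick the label whose training vocabulary best overlaps the caption."""
        scores = {label: sum(c[w] for w in caption.split())
                  for label, c in counts.items()}
        return max(scores, key=scores.get)

    balanced = [("a person waves from a window", "benign"),
                ("a dog runs in a park", "benign"),
                ("a person falls from a window", "sinister"),
                ("a dog attacks in a park", "sinister")]
    # The same captions, every one labeled through a dark lens.
    skewed = [(caption, "sinister") for caption, _ in balanced]

    scene = "a person waves at a window"
    print(classify(train(balanced), scene))  # -> benign
    print(classify(train(skewed), scene))    # -> sinister

Nothing in the classifier changed between the two runs; the “psychopathy” lives entirely in the labels. The same risk applies whenever VP dialogue or feedback models are trained on unvetted data.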

For most users of AI, VR, and VPs, what goes on in the “black box” of programming is unknown and assumed to be trustworthy. As Marc Goodman, a law enforcement agency adviser, notes, “The thing people don’t get is that cybercrime is becoming automated and it is scaling exponentially.”26 The most dangerous type of AI system, and the one most difficult to defend against, is one made malevolent on purpose.27 The easiest way to compromise a user immersed in VR and VPs is for the software programmer to subject the user, unknowingly, to content designed to change, persuade, or influence the user’s decisions in harmful ways.28 Additionally, any software, including VPs, can be hacked. In 2016, the number of reported data breaches increased by 40% over the previous year.29,30

Malicious intent, embedded bias, and mistaken connections among patient characteristics are all perils of VPs, and they raise interesting legal questions. The implications of using VPs in teaching psychiatric intake, for example, are frighteningly broad. Opportunities to amplify or distort a broad range of variables represented in intake records are numerous and could negatively influence students’ learning.

Evaluating Promises and Perils of VPs

The Nuffield Council on Bioethics released a briefing note in 2018 on what it sees as big ethical questions about uses of AI in health care.31 Modifying those questions to apply to VPs, we ask the following:

  • What is the danger of VPs providing incorrect feedback?
  • Who is responsible when the feedback is flawed?
  • What is the potential for the malicious use of VPs?
  • Will VPs diminish in-person interactions among teachers and learners?
  • What impact does the growing use of VPs have on teaching and learning?

These questions can help clarify our thinking about the appropriate roles of VPs in education, how they should be constructed and used, and how we might increase the promise and decrease the peril of their use.

References

  1. Association of American Medical Colleges Institute for Improving Medical Education. Effective use of educational technology in medical education. Colloquium on educational technology: recommendations and guidelines for medical educators. https://members.aamc.org/eweb/upload/Effective%20Use%20of%20Educational.pdf. Published March 2007. Accessed August 8, 2018.
  2. Kononowicz AA, Zary N, Edelbring S, Corral J, Hege I. Virtual patients—what are we talking about? A framework to classify the meanings of the term in healthcare education. BMC Med Educ. 2015;15(11):1-7.
  3. Cook DA, Triola MM. Virtual patients: a critical literature review and proposed next steps. Med Educ. 2009;43(4):303-311.
  4. Virtual Reality Society. What is virtual reality? https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html. Accessed July 17, 2018.
  5. Parsons TD. Virtual standardized patients for assessing the competencies of psychologists. In: Khosrowpour M, ed. Encyclopedia of Information Science and Technology. Vol 9. 3rd ed. Hershey, PA: IGI Global; 2015:6484-6493.
  6. Maichen K, Danforth D, Price A, et al. Developing a conversational virtual standardized patient to enable students to practice history-taking skills. Simul Healthc. 2017;12(2):124-131.
  7. Rizzo A, Parsons T, Buckwalter JG, Lange B, Kenny P. A new generation of intelligent virtual clinical patients for clinical training. http://ict.usc.edu/pubs/A%20New%20Generation%20of%20Intelligent%20Virtual%20Patients%20for%20Clinical%20Training-ABS.pdf. Accessed December 6, 2018.
  8. Pantziaras I, Fors U, Ekblad S. Training with virtual patients in transcultural psychiatry: do the learners actually learn? J Med Internet Res. 2015;17(2):e46.
  9. Foster A, Chaudhary N, Murphy J, Lok B, Waller J, Buckley PF. The use of simulation to teach suicide risk assessment to health profession trainees—rationale, methodology, and a proof of concept demonstration with a virtual patient. Acad Psychiatry. 2015;39(6):620-629.
  10. Weiner S. Addressing the escalating psychiatrist shortage. AAMC News. February 13, 2018. https://news.aamc.org/patient-care/article/addressing-escalating-psychiatrist-shortage/. Accessed October 17, 2018.
  11. Doolen J, Giddings M, Johnson M, Guizado de Nathan G, O Badia L. An evaluation of mental health simulation with standardized patients. Int J Nurs Educ Scholarsh. 2014;11(1):55-62.
  12. American Hospital Association. Trends affecting hospitals and health systems. https://www.aha.org/guidesreports/2018-05-22-trendwatch-chartbook-2018. Published 2018. Accessed October 17, 2018.
  13. Triola M, Feldman H, Kalet AL, et al. A randomized trial of teaching clinical skills using virtual and live standardized patients. J Gen Intern Med. 2006;21(5):424-429.
  14. Prensky M. Digital natives, digital immigrants. On the Horizon. 2001;9(5):1-6.
  15. Kenny P, Parsons T, Gratch J, Leuski A, Rizzo A. Virtual patients for clinical therapist skills training. Lect Notes Comput Sci. 2007;4722:197-210.
  16. Stevens A, Hernandez J, Johnsen K, et al. The use of virtual patients to teach medical students history taking and communication skills. Am J Surg. 2006;191(6):806-811.
  17. Craig E, Georgieva M. VR and AR: driving a revolution in medical education and patient care. Educause Review. August 30, 2017. https://er.educause.edu/blogs/2017/8/vr-and-ar-driving-a-revolution-in-medical-education-and-patient-care. Accessed July 8, 2018.
  18. Hsieh MC, Lee JJ. Preliminary study of VR and AR applications in medical and healthcare education. J Nurs Health Stud. 2018;3(1):1-5.
  19. Virginia mental health intake and evaluation. https://www.apadivisions.org/division-31/publications/records/virginia-intake-form.docx. Accessed July 17, 2018.
  20. Urresti-Gundlach M, Tolks D, Kiessling C, Wagner-Menghin M, Härtl A, Hege I. Do virtual patients prepare medical students for the real world? Development and application of a framework to compare a virtual patient collection with population data. BMC Med Educ. 2017;17(1):174.
  21. Zipp SA, Krause T, Craig SD. The impact of user biases toward a virtual human’s skin tone on triage errors within a virtual world for emergency management training. Proc Hum Factors Ergon Soc Annu Meet. 2017;61(1):2057-2061.
  22. P Goddu A, O’Conor KJ, Lanzkron S, et al. Do words matter? Stigmatizing language and the transmission of bias in the medical record. J Gen Intern Med. 2018;33(5):685-691.
  23. Nordell J. Is this how discrimination ends? Atlantic. May 7, 2017. https://www.theatlantic.com/science/archive/2017/05/unconscious-bias-training/525405/. Accessed June 4, 2018.
  24. Wakefield J. Are you scared yet? Meet Norman, the psychopathic AI. BBC. June 2, 2018. https://www.bbc.com/news/technology-44040008. Accessed July 17, 2018.
  25. Massachusetts Institute of Technology. Norman—world’s first psychopath AI. http://norman-ai.mit.edu. Accessed July 17, 2018.
  26. Markoff J. As artificial intelligence evolves, so does its criminal potential. New York Times. October 23, 2016. https://www.nytimes.com/2016/10/24/technology/artificial-intelligence-evolves-with-its-criminal-potential.html. Accessed June 4, 2018.
  27. Yampolskiy RV. Taxonomy of pathways to dangerous AI. Paper presented at: 30th AAAI Conference on Artificial Intelligence; February 12-13, 2016; Phoenix, AZ. https://arxiv.org/ftp/arxiv/papers/1511/1511.03246.pdf. Accessed June 4, 2018.
  28. Andrasik AJ. Hacking humans: the evolving paradigm with virtual reality. SANS Institute. https://www.sans.org/reading-room/whitepapers/testing/hacking-humans-evolving-paradigm-virtual-reality-38180. Published November 2017. Accessed July 17, 2018.
  29. Donaldson S. Virtual and augmented reality: transforming the way we look at the internet and data security. Fossbytes. March 19, 2017. https://fossbytes.com/virtual-and-augmented-reality-security/. Accessed July 8, 2018.
  30. CyberScout. Identity Theft Resource Center Data Breach Reports: 2016 End of Year Report. https://www.idtheftcenter.org/images/breach/2016/DataBreachReport_2016.pdf. Accessed November 19, 2018.
  31. Nuffield Council on Bioethics. Artificial intelligence (AI) in healthcare and research. http://nuffieldbioethics.org/wp-content/uploads/Artificial-Intelligence-AI-in-healthcare-and-research.pdf. Published May 2018. Accessed June 4, 2018.


Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.