Letter to the Editor
October 2019

Response to “Emerging Roles of Virtual Patients in the Age of AI”

Frederick W. Kron, MD, Timothy C. Guetterman, PhD, and Michael D. Fetters, MD, MPH, MA
AMA J Ethics. 2019;21(10):E920-925. doi: 10.1001/amajethics.2019.920.


We appreciate the AMA Journal of Ethics’ forward-looking issue on artificial intelligence (AI), and we select ideas in C. Donald Combs and P. Ford Combs’ article, “Emerging Roles of Virtual Patients in the Age of AI,” for further discussion.

  1. Conflation of virtual patients and virtual humans. The term virtual patient (VP) has been applied to numerous applications with different designs, technologies, and educational objectives. This heterogeneity can lead to confusion.1 The Association of American Medical Colleges’ definition of VP that the authors reference was developed in 2006 and refers to computerized clinical case simulations.2 These applications, which are largely text based with multimedia content, focus on clinical reasoning and decision making and do not utilize AI.1,3,4 The authors conflate VPs of this type with virtual humans (VHs), computer-driven conversational agents with human form that interact with humans using the full range of behaviors found in human-to-human, face-to-face interaction.5 VHs utilize AI in computer-based interpersonal communication training simulations—as virtual standardized patients,4,6 physicians, or any other human across the health care enterprise.
  2. Overstatement of virtual patient perils. The authors present material about “sexist AI,” cybercrime, malicious intent in programming, and psychopathic AI. Without an accompanying account of educational software, AI, or VH development, readers may overestimate the risk of using these agents. The following clarification should mitigate the sense of menace the article evokes.

    AI is broadly defined as any task performed by a program or machine that, if performed by a human, would require applied intelligence to accomplish.7 The current state of the art is narrow AI,8 which might utilize natural language processing and machine learning to solve specific problems. By contrast, strong AI is an assemblage of cognitive processes sufficient to enable self-awareness and intentionality. Strong AI is far removed from realization and may not even be possible.9,10,11 Personified as Norman Bates, the serial killer in Alfred Hitchcock’s Psycho,12 the Norman program mentioned in the article is suggestive of the strong AI of dystopian films like Ex Machina and Blade Runner. By referring to the program using the personal pronoun “he” and stating that “Norman was subjected to the darkest corners of Reddit,” as if a person had been subjected to a terrifying ordeal, the authors make Norman seem much “stronger” than it is. Norman is merely a sensationalized example of narrow AI that was intentionally derived using data-driven machine learning applied to an unvetted set of data from a now-banned Reddit website where users posted videos of people dying and gave textual explanations of the manner of death.13,14 When shown a series of Rorschach inkblots, Norman unsurprisingly interprets inkblots as people dying, because that is what MIT researchers trained it to do.14

    By generalizing from this example to VH creation, the article misses the point that development of VHs for medical education is wholly under the control of medical educators and trusted experts. It would be ethically irresponsible for educators to use unvetted data sets to train a VH, to implement AI algorithms that allow unwanted degrees of freedom from desirable VH behaviors, or to abrogate responsibility for human oversight in VH program development.15 To ensure positive learning outcomes, educators must stipulate evidence-based design requirements, create content, and then iteratively evaluate the sufficiency of materials passed back to them by software engineers. This agile development process16,17 requires transparency to stakeholders, effectively eliminating the “black box” of programming and minimizing the risk of VH applications being tainted by the unintended introduction of undesirable content.

  3. VH opportunities. The article overlooks the most noteworthy opportunity that AI-enabled VH simulation offers to medical education: training in basic and complex communication skills (eg, facial expression, verbal and nonverbal behaviors) along with cultivating awareness and application of ethical principles. Communication and ethics are deeply interrelated. Verbal and nonverbal communication proficiency18,19,20 is necessary for clinicians to develop trust, encourage patient disclosure, and determine patients’ needs, values, beliefs, and concerns.21,22,23 Good practice in complex communication is therefore inseparable from the ethical practice of medicine.24,25 Ethics and communication have both proven challenging to teach, however.26,27,28,29

    With its capacity for standardized presentation of materials, distributed learning across institutions, and fine-grained uniform assessment, AI-enabled VH simulation can help address the variability of current undergraduate and graduate ethics education.30,31 Learners can engage one-on-one with VH patients, family members, or colleagues in realistic situations drawn from everyday clinical encounters that focus on ethical challenges and complex communication.32,33,34,35 These simulated situations can pose a range of ethical challenges for learners—from informed consent to breaking bad news, dealing with cultural disparities, and more. AI-enabled VH simulation can improve how students learn, remember, perceive, and make decisions.29 By scaffolding learning materials, simulations can increase in complexity as learners advance along their educational trajectory from premedical study to postgraduate continuing medical education. Moreover, their round-the-clock accessibility provides flexibility for busy learners.

In summary, VH education offers a promising frontier in health care education into which educators should not fear to stride.


  1. Kononowicz AA, Zary N, Edelbring S, Corral J, Hege I. Virtual patients—what are we talking about? A framework to classify the meanings of the term in healthcare education. BMC Med Educ. 2015;15:11.
  2. Institute for Improving Medical Education, Association of American Medical Colleges. Effective Use of Educational Technology in Medical Education: Colloquium on Educational Technology: Recommendations and Guidelines for Medical Educators. Washington, DC: Association of American Medical Colleges; March 2007.
  3. Doloca A, Ţănculescu O, Ciongradi I, Trandafir L, Stoleriu S, Ifteni G. Comparative study of virtual patient applications. Proc Rom Acad A Math Phys Tech Sci. 2015;16(3):466-473.
  4. Talbot TB, Sagae K, John B, Rizzo AA. Sorting out the virtual patient: how to exploit artificial intelligence, game technology and sound educational practices to create engaging role-playing simulations. Int J Gaming Comput Mediat Simul. 2012;4(3):1-19.
  5. Gratch J, Rickel J, André E, Cassell J, Petajan E, Badler N. Creating interactive virtual humans: some assembly required. IEEE Intell Syst. 2002;17(4):54-63.
  6. Parsons TD. Virtual standardized patients for assessing the competencies of psychologists. In: Khosrow-Pour M, ed. Encyclopedia of Information Science and Technology. 3rd ed. Hershey, PA: IGI Global; 2015:6484-6492.
  7. McCarthy J, Minsky ML, Rochester N, Shannon CE. A proposal for the Dartmouth summer research project on artificial intelligence. AI Mag. 2006;27(4):12-14.
  8. Weinbaum D, Veitas V. Open ended intelligence: the individuation of intelligent agents. J Exp Theor Artif Intell. 2016;29(2):371-396.
  9. Kalanov TZ. Man versus computer: difference of the essences. The problem of the scientific creation. Brain (Bacau). 2017;8(2):151-178.
  10. Braga A, Logan R. The emperor of strong AI has no clothes: limits to artificial intelligence. Information. 2017;8(4):156.
  11. Russell S, Hauert S, Altman R, Veloso M. Ethics of artificial intelligence. Nature. 2015;521(7553):415-416.
  12. Mac R. After the proliferation of the New Zealand shooting video, Reddit has banned two channels showing human death. BuzzFeed. March 15, 2019. https://www.buzzfeednews.com/article/ryanmac/reddit-bans-groups-death-gore-new-zealand-massacre-video. Accessed July 15, 2019.
  13. Zetlin M. MIT researchers use Reddit to create world’s first psychopath AI. Inc. June 12, 2018. https://www.inc.com/minda-zetlin/mit-psychopath-ai-norman-reddit-violence-captions.html. Accessed June 28, 2019.
  14. Floridi L, Sanders JW. Artificial evil and the foundation of computer ethics. Ethics Inf Technol. 2001;3(1):55-66.
  15. Mor Y, Cook J, Santos P, et al. Patterns of practice and design: towards an agile methodology for educational design research. In: Conole G, Klobučar T, Rensing C, Konert J, Lavoué É, eds. Design for Teaching and Learning in a Networked World: 10th European Conference on Technology Enhanced Learning, EC-TEL 2015, Toledo, Spain, September 15-18, 2015, Proceedings. Cham, Switzerland: Springer; 2015:605-608.
  16. Chan FKY, Thong JYL. Acceptance of agile methodologies: a critical review and conceptual framework. Decis Support Syst. 2009;46(4):803-814.
  17. Schön EM, Thomaschewski J, Escalona MJ. Agile requirements engineering: a systematic literature review. Comput Stand Interfaces. 2017;49:79-91.
  18. Gorawara-Bhat R, Hafskjold L, Gulbrandsen P, Eide H. Exploring physicians’ verbal and nonverbal responses to cues/concerns: learning from incongruent communication. Patient Educ Couns. 2017;100(11):1979-1989.
  19. Bommier C, Mamzer MF, Desmarchelier D, Hervé C. How nonverbal communication shapes doctor-patient relationship: from paternalism to the ethics of care in oncology. J Int Bioethique. 2014;25(4):29.
  20. Mast MS. On the importance of nonverbal communication in the physician-patient interaction. Patient Educ Couns. 2007;67(3):315-318.
  21. Plotkin JB, Shochet R. Beyond words: what can help first year medical students practice effective empathic communication? Patient Educ Couns. 2018;101(11):2005-2010.
  22. Hannawa AF. Disclosing medical errors to patients: effects of nonverbal involvement. Patient Educ Couns. 2014;94(3):310-313.
  23. Forde R, Vandvik IH. Clinical ethics, information, and communication: review of 31 cases from a clinical ethics committee. J Med Ethics. 2005;31(2):73-77.
  24. Hain R, Saad T. Foundations of practical ethics. Medicine. 2016;44(10):578-582.
  25. Alfandre D. From “I’m not staying!” to “I’m not leaving!”: ethics, communication, and empathy in complicated medical discharges. Mt Sinai J Med. 2008;75(5):466-471.
  26. Onguti S, Mathew S, Todd C. Communication and ethics in the clinical examination. Med Clin North Am. 2018;102(3):485-493.
  27. Swetz KM, Crowley ME, Hook CC, Mueller PS. Report of 255 clinical ethics consultations and review of the literature. Mayo Clin Proc. 2007;82(6):686-691.
  28. Branch WT. Supporting the moral development of medical students. J Gen Intern Med. 2000;15(7):503-508.
  29. DuBois JM, Burkemper J. Ethics education in US medical schools: a study of syllabi. Acad Med. 2002;77(5):432-437.
  30. Carrese JA, Malek J, Watson K, et al. The essential role of medical ethics education in achieving professionalism: the Romanell Report. Acad Med. 2015;90(6):744-752.
  31. Mozer MC, Wiseheart M, Novikoff TP. Artificial intelligence to support human instruction. Proc Natl Acad Sci U S A. 2019;116(10):3953-3955.
  32. Kron FW, Fetters MD, Scerbo MW, et al. Using a computer simulation for teaching communication skills: a blinded multisite mixed methods randomized controlled trial. Patient Educ Couns. 2017;100(4):748-759.
  33. Guetterman TC, Kron FW, Campbell TC, et al. Initial construct validity evidence of a virtual human application for competency assessment in breaking bad news to a cancer patient. Adv Med Educ Pract. 2017;8:505-512.
  34. Fetters MD, Guetterman TC, Scerbo MW, Kron FW. A two-phase mixed methods project illustrating development of a virtual human intervention to teach advanced communication skills and a subsequent blinded mixed methods trial to test the intervention for effectiveness. Int J Mult Res Approaches. 2018;10(1):296-316.
  35. Kelly E, Nisker J. Increasing bioethics education in preclinical medical curricula: what ethical dilemmas do clinical clerks experience? Acad Med. 2009;84(4):498-504.






This work was supported by a Small Business Innovation Research (SBIR) phase II grant (01/5R44TR000360-04), “Modeling Professionalism and Teaching Humanistic Communication in Virtual Reality,” from the National Institutes of Health and a career development award (1-K01-LM-012739-01) from the National Library of Medicine and the National Institutes of Health (Dr Guetterman).

Conflict of Interest Disclosure

Dr Kron serves as president of, and Dr Fetters has stock options in, Medical Cyberworlds, Inc, which received the SBIR phase II grant funding that supported this research. The University of Michigan Conflict of Interest Office considered potential for conflict of interest and concluded that no formal management plan was required. Dr Guetterman had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.