Painimation: Nuanced Descriptions and AI Improve Diagnoses
“When you ask patients about pain, they want to tell a story, not give a number,” says Charles Jonassaint, associate professor of medicine, clinical psychologist, and co-developer of Painimation. The tool, a union of language, art, and machine learning, takes a nuanced approach to pain communication that goes beyond the classic numeric rating scale.
Jonassaint and Nema Rao, at the time a master’s candidate at Carnegie Mellon University’s School of Design, worked with designers to translate the words people use to describe their pain into abstract, moving images.
The animations were then validated by those with neuropathic pain—“Yes, this looks like my pain”—and incorporated into an iPad app that asks people to color in the parts of the body that hurt, then select and adjust the animations that best match their pain. That adjustment is translated into a numerical rating from one to 100.
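As a rough illustration of how such an adjustment might be turned into a score, the sketch below maps a user’s animation choice and intensity setting onto a one-to-100 rating. The animation names, slider range, and scaling formula here are assumptions made for illustration, not the app’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical catalog of abstract pain animations; the real app's set may differ.
ANIMATIONS = {"pounding", "shooting", "burning", "electrifying", "cramping"}

@dataclass
class PainSelection:
    """One animation a user picked and tuned to match their pain."""
    animation: str            # which abstract animation was chosen
    intensity: float          # slider position the user set, 0.0 (barely moving) to 1.0 (maximal)
    body_regions: list        # regions the user colored in, e.g. ["left knee", "lower back"]

def intensity_rating(selection: PainSelection) -> int:
    """Translate the user's adjustment into a 1-100 rating (illustrative scaling only)."""
    if selection.animation not in ANIMATIONS:
        raise ValueError(f"Unknown animation: {selection.animation}")
    # Scale the 0-1 slider onto the 1-100 range described in the article.
    return max(1, round(selection.intensity * 100))

if __name__ == "__main__":
    pick = PainSelection(animation="burning", intensity=0.62, body_regions=["left foot"])
    print(intensity_rating(pick))  # -> 62
```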
Taken together, the data provide a meaningful description of pain that artificial intelligence (AI) can mine for patterns and correlations with particular conditions. As more data are collected and the machine-learning models improve, more connections among pain, disease, treatment, and recovery are likely to surface.
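One way such pattern-finding could work in principle is to treat each patient’s animation selections as a feature vector and fit a simple classifier against diagnosed conditions. The sketch below is a minimal illustration under that assumption, not the project’s actual pipeline; the feature layout, toy data, and model choice are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy feature matrix: one row per patient, one column per animation's chosen intensity
# (0 if that animation was not selected). Column order: pounding, shooting, burning, electrifying.
# The values and labels below are entirely made up for illustration.
X = np.array([
    [0.0, 0.8, 0.1, 0.7],   # mostly shooting/electrifying pain
    [0.0, 0.9, 0.0, 0.6],
    [0.7, 0.0, 0.6, 0.0],   # mostly pounding/burning pain
    [0.8, 0.1, 0.7, 0.0],
    [0.1, 0.7, 0.0, 0.8],
    [0.6, 0.0, 0.8, 0.1],
])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = hypothetical diagnosis of interest, 0 = other

# A simple linear classifier can surface which animation features correlate with the condition.
model = LogisticRegression()
scores = cross_val_score(model, X, y, cv=3)
model.fit(X, y)
print("cross-validated accuracy:", scores.mean())
print("feature weights (per animation):", model.coef_[0])
```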