Doctors need to learn what illnesses look like on darker skin

This new book will teach them.

Doctors can learn a lot about a patient’s health just by looking at their skin. Red rash on their face? That’s a symptom of lupus. Blue lips? Might want to test them for heart disease.

But while learning about these clinical signs of disease during his second year of med school, 20-year-old student Malone Mukwende found himself repeatedly asking the same question:

“But what will it look like on darker skin?”

The problem was that his curriculum at St George’s Hospital Medical School often only showed pictures of what symptoms would look like on white skin. The language used to describe skin symptoms also defaulted to their appearance on white patients.

So Mukwende, collaborating with two of his school’s lecturers, decided to write a guidebook focused specifically on identifying and describing clinical signs on darker skin.

Mind the Gap

In “Mind the Gap: A Handbook of Clinical Signs in Black and Brown Skin,” readers will find side-by-side images of the same illness symptoms on white and darker skin.

In some cases, the difference between the two images can be profound, and a doctor unaware that such differences exist could put a patient's life at risk.

For example, the rare Kawasaki disease — recently found to be a complication of COVID-19 — causes a bright red rash on white skin. But on darker skin, it looks more “like goose bumps,” Mukwende told CBC Radio.

“If both those patients presented to a hospital, we can almost tell that, if there was not this knowledge, who would be getting sent home and who would be receiving treatment,” he said. “Consequently, it can mean that someone will end up dying because of that.”

This problem of missing clinical signs on darker skin isn’t just a hypothetical scenario, either.

In an opinion piece for STAT, dermatology specialist Art Papier describes seeing patients with darker skin suffer due to the same curriculum gap Mukwende hopes to address with his book.

“I’ve seen patients of color misdiagnosed because clinicians could not recognize their rashes,” he wrote. “I’ve seen immunologic diseases such as lupus, life-threatening drug reactions, and other conditions that manifest themselves on the skin get missed for the same reasons.”

AI systems designed to detect skin conditions stumble over the same pitfalls as their human counterparts. Because they're trained on image datasets that, like Mukwende's textbooks, are dominated by white skin tones, they're less adept at spotting signs of disease on darker skin.

A Living Document

Mukwende and his co-authors are now in the process of finding a publisher for the book, but their effort to bridge the curriculum gap won’t end when it starts rolling off the presses.

Because they had such a hard time tracking down any photos of some conditions on darker skin, they’ve decided to launch a companion website for the book where people can send in photos themselves.


The authors will include some of those images in future iterations of “Mind the Gap,” along with images and descriptions submitted by med students and healthcare workers.

The ultimate goal is for the book to be a “live document” that becomes more useful the longer it exists.

“My hope is that the handbook will become a staple resource in medical settings around the world,” Mukwende told the Washington Post. “I want it to empower medical professionals, so they feel more competent, and so patients can be confident that their doctors understand them.”

