
Image databases of skin conditions are notoriously biased towards lighter skin. Rather than wait for the slow process of collecting more images of conditions like cancer or inflammation on darker skin, one group wants to fill in the gaps using artificial intelligence. It’s working on an AI program to generate synthetic images of diseases on darker skin — and using those images for a tool that could help diagnose skin cancer.
“Having real images of darker skin is the ultimate solution,” says Eman Rezk, a machine learning expert at McMaster University in Canada working on the project. “Until we have that data, we need to find a way to close the gap.”
But other experts working in the field worry that synthetic images could introduce problems of their own. The focus should be on adding more diverse real images to existing databases, says Roxana Daneshjou, a clinical scholar in dermatology at Stanford University. “Creating synthetic data sounds like an easier route than doing the hard work to create a diverse data set,” she says.
There are dozens of efforts to use AI in dermatology. Researchers build tools that scan images of rashes and moles and suggest the most likely condition, and dermatologists can then use those results to help them make diagnoses. But most of these tools are built on image databases that either don’t include many examples of conditions on darker skin or don’t have good information about the range of skin tones they include. That makes it hard to be confident that a tool will be as accurate on darker skin as it is on lighter skin.