Experts warn that deepfake detection tools must use data sets inclusive of darker skin tones to avoid bias
"Deepfake detection tools must work with dark skin tones, experts warn", 17 August 2023
Detection tools being developed to combat the growing threat of deepfakes – realistic-looking false content – must use training datasets that are inclusive of darker skin tones to avoid bias, experts have warned.
Most deepfake detectors rely on machine-learning models whose performance depends heavily on the dataset used to train them. The detector then looks for signs of manipulation that may not be visible to the human eye.
This can include monitoring blood flow and heart rate. However, these detection methods do not always work on people with darker skin tones, and if training sets do not cover all ethnicities, accents, genders, ages and skin tones, they are open to bias, experts warned.
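To illustrate the kind of signal involved: blood-flow detection typically works by tracking the faint, periodic colour changes in facial skin caused by the pulse (a technique known as remote photoplethysmography). Below is a minimal sketch of the idea, not the method of any particular detector; the function name, frame format and sampling rate are illustrative assumptions.

```python
import numpy as np

def estimate_heart_rate(face_frames, fps=30.0):
    """Rough remote-photoplethysmography sketch: estimate pulse rate
    from the green channel of cropped face frames (T, H, W, 3).

    The pulse shows up as a very small colour fluctuation, so low
    contrast between that signal and the skin's baseline reflectance
    (e.g. darker skin tones or poor lighting) makes extraction harder,
    which is the failure mode the experts quoted here describe.
    """
    # Mean green intensity per frame -> one-dimensional temporal signal
    signal = face_frames[:, :, :, 1].mean(axis=(1, 2))
    signal = signal - signal.mean()  # remove the DC offset

    # Frequency spectrum of the temporal signal
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))

    # Keep only plausible heart rates (0.7-4 Hz, i.e. 42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # beats per minute
```

A detector trained mostly on faces where this signal is strong can end up with thresholds that misfire on faces where it is weak, which is one way the bias described below can arise.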
Over the last couple of years, AI and deepfake detection experts have raised concerns that bias is being built into these systems.
Rijul Gupta, a synthetic media expert and the co-founder and CEO of DeepMedia, which uses AI and machine learning to assess visual and audio cues for underlying signs of synthetic manipulation, said: “Datasets are always heavily skewed towards white middle-aged men, and this type of technology always negatively affects marginalised communities.”
Gupta added that deepfake detection tools that use visual cues, such as blood-flow and heart-rate detection, can have “underlying biases towards people with lighter skin tones, because darker skin tones in a video stream are much harder to extract a heart rate out of”.
The “inherent bias” in these tools means that they will perform worse on minorities.
“We will see an end result of an increase of deepfake scams, fraud and misinformation caused by AI that will be highly targeted and focused on marginalised communities,” Gupta said.
Mutale Nkonde, AI policy adviser and the CEO and founder of AI for the People, said the concerns tap into larger exclusions minorities face.
“We are well educated around the issues that facial recognition has in recognising dark skin, but the general public don’t realise that just because the technology has a new name, function or use doesn’t mean that the engineering has advanced.”
Ellis Monk, professor of sociology at Harvard University and a visiting faculty researcher at Google, developed the Monk Skin Tone Scale, a 10-point scale that is more inclusive than the tech-industry standard and provides a broader spectrum of skin tones for use in datasets and machine-learning models.
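Because the scale gives each of its ten tones a fixed label, dataset coverage becomes straightforward to check. Here is a minimal sketch of such an audit, assuming each sample has already been annotated with a Monk scale value from 1 to 10; the function name and the under-representation threshold are illustrative assumptions, not part of the scale itself.

```python
from collections import Counter

MONK_TONES = range(1, 11)  # the Monk Skin Tone Scale defines 10 tones

def audit_skin_tone_coverage(annotations, min_share=0.05):
    """Report each Monk tone's share of the dataset and flag tones
    below a minimum share (the 5% threshold here is arbitrary; a real
    audit would set it per use case).

    annotations: iterable of Monk tone labels (ints 1-10), one per sample.
    """
    counts = Counter(annotations)
    total = sum(counts.values())
    report = {}
    for tone in MONK_TONES:
        share = counts.get(tone, 0) / total if total else 0.0
        report[tone] = (share, share < min_share)
    return report

# Example: a dataset heavily skewed towards lighter tones
labels = [1] * 400 + [2] * 300 + [3] * 200 + [8] * 50 + [10] * 50
for tone, (share, flagged) in audit_skin_tone_coverage(labels).items():
    print(f"tone {tone:2d}: {share:5.1%}"
          + ("  <- under-represented" if flagged else ""))
```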