


[Image: A woman using a smart home assistant. Credit: RossHelen/Shutterstock]
Freely available software that mimics a specific individual's voice can produce speech convincing enough to fool both people and voice-activated tools such as smart home assistants.
Security researchers are increasingly concerned by deepfake software, which uses artificial intelligence to alter videos or photographs to map one person’s face onto another.
Emily Wenger at the University of Chicago and her colleagues wanted to investigate audio versions of these tools, which generate realistic speech based on a sample of a person’s voice, after reading about …
