Instagram head says company is evaluating how to handle deepfakes

Posted 9:06 AM ET, Jun 25, 2019 | Updated 11:06 AM ET, Jun 25, 2019

Instagram head Adam Mosseri said the photo-sharing platform is still figuring out how to address AI-manipulated videos known as deepfakes.

“We don’t have a policy against deepfakes currently,” Mosseri told CBS This Morning co-host Gayle King in an interview that aired Tuesday. “We’re trying to evaluate if we wanted to do that and if so, how you would define deepfakes.”

Mosseri joined Instagram’s parent company, Facebook, in 2008 and took the helm of Instagram in October 2018 after founders Kevin Systrom and Mike Krieger left the company.

King asked Mosseri about a deepfake video of Facebook CEO Mark Zuckerberg, which falsely portrays him as saying, “Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures.”

The video was created from 2017 footage of Zuckerberg that aired on CBS’s streaming news service CBSN, with artificial intelligence used to manipulate Zuckerberg’s face so it appears he said something he didn’t. The Facebook CEO’s voice is replaced by an actor’s.

Facebook and Instagram did not remove the video.

“We are not going to make a one-off decision to take a piece of video down just because it’s of Mark and Mark happens to run this place,” Mosseri said. “That would be really inappropriate and irresponsible. We need to have defined principles and we need to be transparent about those principles.”

In May, Facebook also declined to take down a manipulated video that made it appear House Speaker Nancy Pelosi was slurring her words. An Instagram spokesperson said at the time that the company would treat the video “the same way we treat all misinformation on Instagram.” If third-party fact checkers mark it as false, the platform’s algorithms won’t recommend it to users, the spokesperson said.

CBS’s King said deepfakes are upsetting because they could influence people with information that isn’t true.

“I agree with you, but if it takes too long to identify it, and a million people see a video like that in the first 24 hours or the first 48 hours, the damage is done,” Mosseri said in response. “The thing we are focused on right now, internally, is not if we take it down when we find it, but how do you find it more quickly.”

Mosseri also said the company is trying to balance safety and speech, which can be “really, really tricky.”

“Right now, I think the most important thing for us to focus on is getting to the content quicker,” he said. “Once we can do that, then we can have the next debate about whether or not to take it down when we find it.”

Mosseri said it’s an issue not only facing the company but something he struggles with on a personal level, too.

“I don’t feel good about it,” he said.