We are not ready for better deepfakes


If you’re like me, then lately you’ve scrolled past something on social media and thought, “Wait, was that real?” Deepfakes are everywhere, and they’re getting a lot more convincing.
That brings me to my Decoder guest today: Gaurav Misra, the CEO of Captions. You may not have heard of Captions yet, but you’ve probably seen a video that was generated using its AI models. The company’s Mirage Studio platform lets anyone generate AI versions of real people, and the results are alarmingly realistic.
Captions just put out a blog post titled, “We Build Synthetic Humans. Here’s What’s Keeping Us Up at Night.” It’s a good overview of the state of deepfakes and where they’re headed.
Since Gaurav is the CEO of a company building deepfake technology, I wanted to know what specifically keeps him up at night, which you’ll hear us get into. I’m generally more optimistic about the long-term impacts of AI than a lot of people, but as you’ll hear in this conversation, I’m a lot more nervous about this topic.
Ultimately, I came away from this episode unsettled: the deepfakes of today are the least believable they’ll ever be, we are not ready for what’s coming, and the companies building this technology are racing ahead anyway.
If you’d like to read more on what we talked about in this episode, check out the links below:
- We build synthetic humans. Here’s what’s keeping us up at night | Captions
- Google’s Veo 3 AI video generator is a slop monger’s dream | The Verge
- Gemini AI can now turn photos into videos | The Verge
- Trump just unveiled his plan to put AI in everything | The Verge
- Racist videos made with AI are going viral on TikTok | The Verge
- Microsoft wants Congress to outlaw AI-generated deepfake fraud | The Verge
- YouTube is supporting the ‘No Fakes Act’ targeting unauthorized AI replicas | The Verge
- This Tom Cruise impersonator is using deepfake tech to impressive ends | The Verge
Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!