YouTube’s much-discussed AI likeness detection tool started its rollout on Tuesday, with the company notifying eligible creators in the YouTube Partner Program via email.
The use of AI to create unauthorized deepfakes of actors, musicians, politicians, activists and creators has been a pressing concern ever since the technology took off. That’s what led to YouTube and CAA’s collaboration last December, which gave several influential figures access to the pilot program of YouTube’s AI likeness detection tool. The tool let those users identify and manage AI-generated videos that used the likenesses of CAA’s clients.
That test is now over, and YouTube is rolling out the deepfake detection tool to more than 5,000 creators to start.
“Essentially, those that are selected are creators we think may have the most immediate use for the tool,” Jack Malon, YouTube’s policy communications manager, told TheWrap. “That’s going to help us keep developing the tech as we continue to roll out because the more we can actually, practically use a tool, the more we can test it, the more we can improve it.”
YouTube plans to make the tool available to all monetized YouTube creators worldwide by January 2026.
As part of its first-wave rollout, YouTube also released a video on its Creator Insider channel explaining how the deepfake detection tool works. Eligible users will be able to access the tool through YouTube Studio under the “Content detection” option. Creators will then have to opt in to the technology, which requires them to sign a consent agreement, verify their identity by scanning a QR code with their phone and provide a photo ID. They will then be asked to record a selfie video, performing an assortment of actions (e.g. turning their head left, looking up) in a randomized order. All of that information will then be reviewed by YouTube. As long as the information on a user’s photo ID and video matches up, they will be approved for the program. YouTube says this process will likely take a few days.
Once a creator is approved, AI deepfake videos featuring their likeness will appear under the newly added “Likeness” tab. Videos that seem especially concerning will be labeled “High priority.” Creators will then have three options for dealing with those videos: file a removal request, file a copyright request or archive the video.
Creators will also have the option to opt out of the likeness detection tool after they’ve opted in. Once they submit a request to be removed from the offering, removal will take roughly 24 hours.