YouTube has rolled out its AI likeness detection tool to all adult users. This means anyone over 18 with a YouTube account can now have the platform scan for deepfakes featuring their face.
To enroll, you complete a selfie-style face scan; the system then searches uploaded videos for facial matches. If a match is found, it alerts you, and you can request that YouTube remove the content. The company says it has received only a small number of removal requests so far, but concerns about privacy and misuse remain high.
Initially tested with content creators, then expanded to government officials, politicians, journalists, and finally the entertainment industry, the tool now aims to give everyday users an extra layer of protection. Its scope is limited, however: it covers only facial likeness and does not address other identifying features such as voice or body shape.
The announcement came on YouTube’s creator forum, though a spokesperson noted there are no specific requirements for what constitutes a ‘creator.’ Anyone can benefit from this protection, whether they’ve been creating content for years or just started. Removal requests will be evaluated against criteria such as how realistic the deepfake is and whether it features unique identifying information.
While this move could help curb the spread of fake content, there are also concerns about overreach and the potential misuse of personal data. As we become more reliant on AI for protection, the line between privacy and surveillance might blur ever so slightly.