
Gemini SynthID Watermarks 20B AI Images for Authenticity


In the rapidly shifting landscape of generative AI, Google is taking a bold step to address growing concerns about image authenticity. The company's Gemini platform has rolled out SynthID, a watermarking tool designed to tag AI-generated images with an invisible digital signature.

The move comes as trust in digital imagery becomes increasingly fragile. Deepfakes, misinformation, and AI-generated content have blurred the lines between real and synthetic media, leaving consumers and professionals struggling to distinguish genuine visuals.

Google's approach aims to provide transparency in an era of visual uncertainty. By embedding imperceptible markers into AI-generated images, the tech giant hopes to create a verifiable system that can help identify the origin of digital content.

But how effective is this new detection method? The numbers are intriguing: Google claims to have already watermarked a staggering 20 billion AI-generated images. The company is now testing its SynthID Detector with journalists and media professionals, signaling a serious commitment to combating digital misinformation.

According to Google's announcement: "Since then, over 20 billion AI-generated pieces of content have been watermarked using SynthID, and we have been testing our SynthID Detector, a verification portal, with journalists and media professionals."

How it works

If you see an image and want to confirm it was made with Google AI, upload it to the Gemini app and ask a question such as "Was this created with Google AI?" or "Is this AI-generated?" Gemini will check for the SynthID watermark and use its own reasoning to return a response that gives you more context about the content you encounter online.
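The flow Google describes is manual: upload an image in the Gemini app and ask about it. For readers who want to script a similar provenance check, here is a hypothetical sketch using the google-genai Python SDK. The model name, the file name, and the `build_query` helper are illustrative assumptions; this is not Google's documented SynthID Detector workflow, and Gemini's answer depends on its own reasoning as well as any watermark it finds.

```python
"""Hypothetical sketch: asking Gemini about an image's provenance.

Assumes the `google-genai` Python SDK and an API key in the environment.
The article describes a manual flow in the Gemini app; this programmatic
version is an illustration, not a documented Detector API.
"""

# The two sample questions quoted in Google's announcement.
VERIFICATION_QUESTIONS = (
    "Was this created with Google AI?",
    "Is this AI-generated?",
)

def build_query(question: str) -> str:
    """Return the question unchanged if it is one of the article's examples."""
    if question not in VERIFICATION_QUESTIONS:
        raise ValueError(f"unexpected question: {question!r}")
    return question

if __name__ == "__main__":
    # Hypothetical dependency: pip install google-genai
    from google import genai
    from google.genai import types

    client = genai.Client()  # reads the API key from the environment
    with open("suspect_image.png", "rb") as f:  # assumed local file
        image_bytes = f.read()

    response = client.models.generate_content(
        model="gemini-2.0-flash",  # assumed model; any multimodal Gemini model
        contents=[
            types.Part.from_bytes(data=image_bytes, mime_type="image/png"),
            build_query("Is this AI-generated?"),
        ],
    )
    # Gemini's free-text answer, e.g. whether a SynthID watermark was found.
    print(response.text)
```

The network call sits under the `__main__` guard, so the question-validation helper can be reused or tested without the SDK installed.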

Watermarking 20 billion images is a substantial undertaking, and the SynthID tool represents an intriguing first step toward helping users verify AI-generated content directly within the Gemini app.

Journalists and media professionals are getting early access to test the detection capabilities, which could become important as AI-generated images proliferate. Users can now simply upload an image and ask Gemini whether it was created using Google's AI tools.

The verification process appears straightforward: Gemini checks for embedded watermarks and uses its reasoning to confirm an image's origin. While the full effectiveness remains to be seen, it's a proactive approach to addressing concerns about AI image authenticity.

Still, questions linger about how robust the detection will prove and whether other AI companies will adopt similar verification methods. For now, Google seems focused on building trust through technological solutions that empower users to understand the content they're viewing.

The SynthID tool suggests we're entering an era where AI-generated content's origins might become more transparent and traceable.


Common Questions Answered

How many AI-generated images have been watermarked by Google's SynthID tool?

Google's SynthID has watermarked over 20 billion AI-generated images since its launch. The tool provides an invisible digital signature that helps verify the origin of AI-created content.

How can users verify if an image was created using Google AI?

Users can upload an image to the Gemini app and ask specific questions like "Was this created with Google AI?" or "Is this AI-generated?" The Gemini app will then check for the SynthID watermark and use its reasoning to confirm the image's origin.

Who is currently testing the SynthID Detector?

Journalists and media professionals are currently getting early access to test the SynthID Detector verification portal. This initial testing phase is part of Google's efforts to enhance transparency in synthetic media creation.