No one is ready for this

The Verge



A blast emanating from the side of an ancient brick structure. A bicycle that has crashed in a city crosswalk. A cockroach in a takeout box. Using the Reimagine tool in the Pixel 9’s Magic Editor, it took less than ten seconds to create each of these images. They’re crisp. Their color is vibrant. They have great fidelity. No ominous background blur or telltale sixth finger is present. These photos are all incredibly fucking fake, but they look so real.

The newest iteration of Google’s flagship phone, the Pixel 9, goes on sale this week and comes with the smoothest, most user-friendly interface for premium lies yet, integrated directly into the phone. With comparable features already present on rival devices and soon to arrive on more, this is almost guaranteed to become the standard. Normally, it’s a good thing when a smartphone “just works”; in this case, that’s the entire problem.

Photography has been employed in deceptive ways for as long as it has existed. (Remember the famous Loch Ness monster photo, the Victorian spirit photos, or Stalin’s photographic purges of his real-life purged allies.) But it would be untrue to claim that photos have never been regarded as trustworthy evidence. Everyone reading this article in 2024 grew up in a time when a photograph was assumed, by default, to be an accurate representation of reality. There were deceptions to be aware of, such as a digitally altered photo, a deepfake, or a staged scene with special effects, but they were anomalies. Undermining the intuitive trust in a photograph required specific knowledge and specific tools. Fake was the exception, not the rule.

When I mention Tiananmen Square, you’ll probably think of the same picture that I do. The same goes for Napalm Girl or Abu Ghraib. These pictures have defined revolutions and wars; they have captured reality to an extent that words cannot adequately describe. There was no need to explain why these images mattered or why we placed such a high value on them. Our faith in photography ran so deep that when we debated the authenticity of an image, we felt compelled to stress that photographs could, on occasion, be fakes.

Everything is about to flip: because producing realistic, convincing fake photos is now trivially easy, the default assumption about a photo will soon be that it is faked. We’re not ready for what comes next.

No one alive today has ever lived in a society where photographs were not the cornerstone of social consensus; for as long as any of us can remember, photographs have served as evidence of events. Think of all the times the assumed authenticity of a picture has confirmed the truth of your personal account: the preexisting dent in your rental car’s fender. The leak in your ceiling. A package showing up. A real cockroach in your takeout order. When wildfires encroach on your neighborhood, how do you show friends and acquaintances how much smoke is outside?

Furthermore, up until now, the burden of proof has primarily rested with those contesting the veracity of a photo. The flat-earther is out of step with the mainstream not because they are ignorant of astrophysics (how many of us are truly knowledgeable about it, after all?), but because they must constantly supply long, convoluted explanations for why the images and videos they see are not real. The steady release of satellite images capturing the Earth’s curvature requires them to posit a massive state conspiracy; the 1969 Moon landing requires a soundstage.

The burden of proof falls on them, and we take that for granted. In the era of the Pixel 9, we should probably all start brushing up on our astrophysics.

The typical image produced by these AI tools will, in and of itself, be rather benign: an extra tree in a landscape, an alligator in a pizzeria, a silly costume on a cat. But in aggregate, the flood of such images changes the way we think about every picture, and that alone has enormous implications. Consider the extraordinary social upheaval in the United States over the past ten years sparked by grainy footage of police brutality. Those recordings revealed the truth in places where the authorities had hidden or distorted it.

The constant exclamation of “Fake News!” coming from Trumpist circles heralded the start of this age of pure bullshit, where the firehose of lies will smother the impact of the truth. A sea of AI-generated war crime snuff will bury the next Abu Ghraib. The next George Floyd will remain undiscovered and unproven.

The shape of what’s to come is already apparent. In the Kyle Rittenhouse trial, the defense argued that Apple’s pinch-to-zoom feature manipulates photos and successfully convinced the judge to shift the burden onto the prosecution to prove that zooming had not altered the footage. More recently, Donald Trump falsely claimed that a picture of a packed Kamala Harris rally was generated by AI, a claim he could only make because people might actually believe it.

Media professionals were carefully examining the provenance and details of every image long before AI, looking for photo manipulation or misleading context; a deluge of false information follows every significant news event, after all. But this coming paradigm shift puts something far more fundamental at stake than the never-ending minefield of suspicion sometimes referred to as digital literacy.

Google knows exactly what it is doing to the photograph as an institution. In an interview with Wired, the group product manager for the Pixel camera said the editing tool is “help[ing] you create the moment that is the way you remember it, that’s authentic to your memory and to the greater context, but maybe isn’t authentic to a particular millisecond.” In this world, a photograph becomes a mirror of human memory rather than an aid to it. And when photos become little more than visual hallucinations, the silliest of disputes will turn into legal struggles over the credibility of witnesses and the availability of corroborating documentation.

The Pixel 9 isn’t the sole cause of the deterioration of social consensus; that began earlier. But given the low barrier to entry, the phone’s new AI capabilities are noteworthy for how remarkably inadequate the safeguards we encountered were. The industry’s proposed AI image watermarking standard is stuck in the usual standards saga, and when The Verge tested the Pixel 9’s Magic Editor, Google’s much-heralded AI watermarking system was nowhere to be found. All that is added to images altered with the Reimagine tool is a line of easily removable metadata. (Google created the theoretically unremovable SynthID watermark precisely to address the intrinsic fragility of this kind of metadata.) Google told us that outputs from Pixel Studio, a pure prompt-based generator more akin to DALL-E, will carry a SynthID watermark; but it is the Magic Editor’s Reimagine tool, which alters preexisting photos, that has the far more frightening capabilities, and we found no such watermark there.
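To make concrete why a line of removable metadata is such a weak safeguard, here is a minimal, hypothetical Python sketch. It has nothing to do with Google’s actual label format or with SynthID (which embeds its mark in the pixels themselves); it simply tags a tiny PNG with an invented “AI-edited” text chunk, then shows that a routine re-save keeping only the critical image chunks silently discards the tag while the pixels survive:

```python
# Sketch: metadata-based provenance labels are fragile.
# We build a tiny PNG with an ancillary tEXt chunk (a stand-in "AI-edited"
# label), then strip every non-critical chunk -- the kind of operation any
# screenshot, re-save, or upload pipeline can perform.
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Serialize one PNG chunk: 4-byte length, type, data, CRC."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def make_png_with_label(label: str) -> bytes:
    """A 1x1 grayscale PNG carrying `label` in a tEXt metadata chunk."""
    sig = b"\x89PNG\r\n\x1a\n"
    # Width 1, height 1, bit depth 8, grayscale, default compression/filter/interlace.
    ihdr = chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
    text = chunk(b"tEXt", b"Software\x00" + label.encode())
    idat = chunk(b"IDAT", zlib.compress(b"\x00\x80"))  # filter byte + 1 pixel
    iend = chunk(b"IEND", b"")
    return sig + ihdr + text + idat + iend

def strip_ancillary(png: bytes) -> bytes:
    """Re-emit the PNG keeping only critical chunks (IHDR/IDAT/IEND).
    The pixels survive; the provenance label does not."""
    out, pos = png[:8], 8
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        end = pos + 12 + length  # length + type + data + CRC
        if ctype in (b"IHDR", b"IDAT", b"IEND"):
            out += png[pos:end]
        pos = end
    return out

png = make_png_with_label("AI-edited")
assert b"AI-edited" in png                    # the label is present...
assert b"AI-edited" not in strip_ancillary(png)  # ...until any re-save drops it
```

A pixel-domain watermark like SynthID is designed to survive exactly this kind of round trip, which is why its absence from Reimagine’s output matters.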

Google says the Pixel 9 won’t be a free-for-all bullshit factory, though it offers few concrete guarantees. “We design our Generative AI tools to respect the intent of user prompts and that means they may create content that may offend when instructed by the user to do so,” Google communications manager Alex Moriconi told The Verge in an email. “That said, it’s not that anything goes. We provide clear guidelines in our Terms of Service and policies about the types of content we accept and reject, as well as preventive measures against misuse. Certain prompts have the potential to circumvent the limitations of these tools, but we’re dedicated to improving and perfecting the protections we have in place going forward.”

As one might anticipate, Google’s terms prohibit using its services to encourage or facilitate criminal activity or acts of violence. Certain prompts we attempted returned the error message “Magic Editor cannot finish this edit. Attempt typing something different.” But as evidenced throughout this story, plenty of concerning prompts did work. When it comes down to it, standard-fare content moderation will not stop what’s coming: the photograph’s status as a signal of truth is about to end.

For a short while, the photograph was a shortcut to knowledge, to reality, to holding a smoking gun. It helped us navigate the world around us. Fast-forward to now, and reality is simply about to become harder to know. The cutting edge of handheld technology is a phone that could fit the entire lost Library of Alexandria onto a microSD card smaller than a fingernail, and that now spews lies.
