Apple: Protect Authentic Photography

Below is a guest essay submitted on June 10, 2024, on the occasion of Apple's Worldwide Developers Conference announcement of plans to include generative AI "clean up" editing in Apple's Photos app.


How will he know if this photo was not generated by AI?
He won’t. That’s the problem. #ShotOniPhone

Apple: Protect Authentic Photography from Generative AI

Since its founding in 1976, Apple has distinguished its brand by protecting the privacy and security of its customers through the integration of its hardware, software, and services. With 1.5 billion iPhone customers, Apple now has an opportunity to protect authentic photography. This is the next chapter of its customer-centric legacy as the company approaches its 50th anniversary.

As revealed in the Worldwide Developers Conference opening session on Apple’s plans for generative AI, Apple will not compete with Google’s generative gimmicks. Apple also does not subscribe to Samsung’s belief that “there is no such thing as a real picture” or Adobe’s advice to “skip the photoshoot.” To corner the market, these companies only amplify the generative AI hype cycle, which began in late 2022 (we’re now in the “bargaining” phase).

It has been estimated that by mid-2023, artificial intelligence had already generated more images than all photographers had taken until the invention of the first portable digital camera in 1975. AI-generated “phictions” are now so realistic that most people can’t tell the difference between them and an authentic photograph.

The metastasizing of generative AI over the last 18 months has opened the door for bad-faith actors to deceive us. Many observers are right to focus on deepfakes in elections. But consider another scenario: a family member documenting a wedding manipulates photographs taken on a smartphone—erasing your LGBTQ+ cousin or ex-felon sibling as the snapshot is taken—to control what others in the future understand about their family. What photos will archives preserve for posterity?

Because the iPhone is the most widely used camera in the world, Apple has a special obligation to protect authentic photography. This obligation extends the company’s long-standing ethic of producing realistic photos—without generative AI—especially as viewed on a screen. Apple’s “computational photography” already makes it easier for users to create readable photographs, even in difficult situations, with little to no fuss at the moment the photo is taken.

Those interested in saying “I was there” by recording what they witnessed with their cameras (including iPhones)—especially visual journalists—often follow a code of ethics for image-making. Two hundred years of image-making with lens-based cameras have taught photographers and their audiences the parameters of photography’s trust relationship, a trust now being broken by generative AI.

Elsewhere, I’ve cited articles about how photographers can protect themselves and their audiences from generative AI. Rather than put the onus on photographers (let alone their viewers) to attest to what is trustworthy, it’s time that camera makers do everything they can to help protect authentic photography. Apple can approach the problem in three interrelated ways:

1. Become actively involved in the Coalition for Content Provenance and Authenticity (C2PA), a multi-year effort to develop open technical standards to establish universal content provenance, and the Content Authenticity Initiative (CAI), an aligned group working to develop C2PA-compliant, open-source tools to implement these open standards. (The New York Times, which, among other international publications, chose not to publish this as a guest essay, is a founding member of the CAI.) Several camera companies, including Nikon, Canon, Sony, and Leica, are working to build C2PA/CAI-based digital provenance into their cameras' hardware and software.

Yet no smartphone company is involved in the C2PA or the CAI. That’s a gaping void Apple could fill by extending its chip technology to cryptographically protect the provenance of digital assets produced on iPhones and edited or shared with macOS, iOS, and iPadOS apps. If Apple does nothing else to protect authentic photography, this alone would be a significant step.
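The hardware-backed provenance described above can be sketched in miniature: at capture, the device hashes the image bytes and signs the hash with a device-held key; any later edit breaks verification. The sketch below is illustrative only, using Python's standard-library HMAC with a shared secret in place of the asymmetric, hardware-protected signatures a real C2PA implementation would use; all names here are hypothetical.

```python
import hashlib
import hmac

# Hypothetical stand-in for a hardware-protected device key.
# A real implementation would sign with an asymmetric key pair
# held in secure hardware, never a shared secret.
DEVICE_KEY = b"device-secret-key"

def sign_capture(image_bytes: bytes) -> dict:
    """Create a minimal provenance record at the moment of capture."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    signature = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "signature": signature}

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """True only if the image is byte-identical to what was signed."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    expected = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(expected, record["signature"])

photo = b"raw sensor data"
record = sign_capture(photo)
print(verify_capture(photo, record))              # unmodified photo verifies
print(verify_capture(photo + b"edited", record))  # any edit breaks verification
```

The point of the design is that trust attaches to the capture event itself, so downstream apps and sharing services can check authenticity without trusting one another.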

2. Distinguish between photographs and “phictions” when applying a content credential to photos produced on an Apple device. The CAI promotes an embedded and indelible Content Credential (CC) to declare the provenance of an image to digital content consumers. However, the CC specification does not visibly distinguish between authentic and AI-generated content (unless users root around in the credential). Many companies involved in the CAI are also selling generative AI products: Adobe, the convener of C2PA/CAI/CC and promoter of AI as “the new digital camera,” has included generative AI capabilities in most of its subscriptions, such as Photoshop and Firefly. In effect, AI image generators produced by Adobe, Microsoft, Google, OpenAI, Stability AI, Midjourney, and TikTok—all CAI members (Meta is missing)—receive free advertising displayed in CAI’s credentials, further monetizing our attention. Since AI-generated content will soon dwarf authentic content on social media (if it has not already), CAI’s Content Credential will quickly become meaningless even if it is widely adopted by social media (which is in question).

Apple can do better: limit its credential to appear only on digital content to which no generative AI has been applied—whether on the iPhone or by any application used to “clean up” the original photo. Perhaps Apple could add an unobtrusive “+” on the corner of a photograph to indicate that it is entirely authentic (see below).

Everyone should be able to tell at a glance if a photograph is authentic or not.
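The badge policy proposed in step 2 amounts to a simple predicate over a photo's edit history: any generative action, at capture or afterward, forfeits the mark, while conventional edits do not. A minimal sketch, assuming a hypothetical edit log of named actions (the action names and log format are invented for illustration):

```python
# Hypothetical set of actions that count as generative AI edits.
GENERATIVE_ACTIONS = {"clean_up", "generative_fill", "ai_expand"}

def earns_authentic_badge(edit_log: list) -> bool:
    """The '+' badge appears only if no generative AI action was ever applied."""
    return not any(action in GENERATIVE_ACTIONS for action in edit_log)

print(earns_authentic_badge(["crop", "exposure"]))  # True: conventional edits keep the badge
print(earns_authentic_badge(["crop", "clean_up"]))  # False: one generative edit revokes it
```

Note the asymmetry this encodes: the badge cannot be earned back once a generative edit touches the image, which is what makes it legible at a glance.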

3. Reimagine photo sharing on the internet by promoting authentic photographs, publicly and widely shared. Photo-sharing sites, such as Google Photos, Flickr, and Instagram, are being inundated with AI-generated content. Apple should simplify and expand access to its photo sharing by creating a new photo-sharing service—perhaps called Photos+—specifically to aggregate only protected authentic photographs, accessible to anyone. A low-cost subscription would enable anyone with authentic photographs to post them on this website, free from advertising and its attendant data collection.

Clicking on the + embedded in an authenticated photograph would reveal provenance details, as in any content credential, with a difference: the provenance of the photo’s production would be subordinated (the + already designates the photo as authentic) in favor of information added by the photographer to frame the photo with a title and/or caption. Adapting a solution developed by the Four Corners Project, the + credential would primarily represent the context of an authentic photo. Promote the service with an “i Was There” tagline to establish the relationship between the author and authenticated photography.

These steps represent the integration of hardware, software, and services, an integration upon which trust in the Apple brand has historically relied to protect customers’ privacy and security. Apple, to expand its brand in the age of AI, should apply this same process to protect authentic photography.

Given that industry lobbyists will likely prevent effective legislation to regulate generative AI, it is essential to scale quickly and broadly to counteract the corrosive effect of generative AI on authentic photography—and on our shared understanding of the world it makes possible. Scaling the technology to authenticate and share billions of digital photographs is a big initiative: Apple may be the only company that can rise to this challenge.

Instead of focusing on the contents of a photograph—the reason any photo is taken by a family photographer or a professional in the first place—people are now being forced by the companies generating AI images to focus on a photograph’s provenance (or lack thereof). This is an unacceptable burden on the viewer, further degrading notions of what is real and what is not.

Seeing is no longer believing. Our trust in authentic photography is a democratic and inter-generational institution that needs Apple’s protection. Sign the petition.


Marshall Mayer, an amateur photographer and new grandfather based in Montana, produced the Writing with Light Bibliography, which cites articles relevant to generative AI and its effect on authentic photography. Mayer has used Apple products and services, professionally and personally, since 1986.