
I love this question from YouTuber Marques Brownlee, who goes by MKBHD. He asks: What is a photo? It's a deep question.
Just think about how early black-and-white film cameras worked. The photographer pointed the camera at a tree, for example, and pressed a button. This opened the shutter so that light could pass through the lens (or more than one lens) to project an image of the tree onto the film. Once that film was developed, it showed an image: a photograph. But that image is just a stand-in for what was actually there, or even for what the photographer saw with their own eyes. The color is missing. The photographer tweaked settings like the camera's focus, depth of field, or shutter speed, and the film they chose affected things like the brightness or sharpness of the image. Adjusting camera and film parameters is the photographer's job. It's what makes photography such an art form.
Now jump forward in time. Instead of film, we use digital smartphone cameras, and these phones have made huge improvements: better sensors, more than one lens, and features like image stabilization, longer exposure times, and high dynamic range, where the phone takes multiple photos at different exposures and combines them into a single, better-looking picture.
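To get a feel for what that multi-exposure merge is doing, here is a minimal sketch in Python with NumPy. It is not any phone's actual pipeline (real systems also align frames, denoise, and tone-map far more carefully); it just shows the core idea of weighting each exposure by how well-exposed its pixels are:

```python
import numpy as np

def merge_exposures(frames, exposure_times):
    """Toy HDR merge: weight each exposure by how well-exposed each
    pixel is, estimate scene brightness, then tone-map for display."""
    frames = [f.astype(np.float64) / 255.0 for f in frames]
    radiance_sum = np.zeros_like(frames[0])
    weight_sum = np.zeros_like(frames[0])
    for frame, t in zip(frames, exposure_times):
        # Pixels near mid-gray carry the most information; blown-out
        # highlights and crushed shadows contribute little.
        w = 1.0 - 2.0 * np.abs(frame - 0.5)
        radiance_sum += w * (frame / t)  # divide by exposure time -> brightness estimate
        weight_sum += w
    radiance = radiance_sum / np.maximum(weight_sum, 1e-6)
    return radiance / (1.0 + radiance)   # simple Reinhard-style tone map for display

# Hypothetical usage with three bracketed shots (uint8 arrays from the sensor):
# hdr = merge_exposures([short, mid, long], exposure_times=[1/500, 1/125, 1/30])
```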
But these phones can also do something the photographer used to do: Their software can modify the photo. In one video, Brownlee used the camera in his Samsung Galaxy S23 Ultra to take a picture of the moon. Using the 100x zoom, he got a very nice, stable image of the moon. Maybe too nice, too good.
The video, and others like it, sparked a response on Reddit from a user who goes by "ibreakphotos." As a test, they used the camera to photograph a deliberately blurred image of the moon displayed on a computer screen, and the phone still produced a clear and detailed image. What happened?
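The setup is easy to reproduce. Here is a short Python sketch of the idea using Pillow; the file names and exact numbers are placeholders, but the logic matches the test: destroy the fine detail first, so any crisp detail in the resulting phone photo must have been added by software rather than captured by the optics.

```python
from PIL import Image, ImageFilter

# Start from a detailed moon photo, throw away its fine detail,
# and blur what's left. "moon_hires.jpg" is a hypothetical input file.
img = Image.open("moon_hires.jpg").convert("L")

small = img.resize((170, 170))                       # discard fine detail
blurred = small.filter(ImageFilter.GaussianBlur(3))  # smear what remains
test = blurred.resize(img.size)                      # upscale for display

test.save("moon_blurred_test.png")  # show full-screen, then photograph it
```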
Brownlee followed up with another video, saying he repeated the test with similar results. He concluded that the detail is a product of the camera's AI software, not just its optics. In the video, he says the camera's processing will "essentially sharpen what you see in the viewfinder toward what you know the moon is supposed to look like." Ultimately, he says, "the stuff that comes out of a smartphone camera isn't so much reality as it is this computer's interpretation of what it thinks you want the real thing to look like."
(When WIRED's Gear team covered the dustup over the moon shots, a Samsung spokesperson told them, "When a user takes a photo of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for multi-frame composition, after which AI enhances the details of the image quality and colors." You can read more about computational photography here, and see more from Brownlee on the subject here.)