The truth behind the viral photo of a wedding dress: the artificial intelligence in an iPhone did its thing

After a flurry of stories, the bride uploaded the photo to her feed. (Instagram: wheatpraylove)

Tessa Coates is a podcaster and writer who is about to get married. She has shared part of the preparations on her Instagram profile and, as at many weddings, the bride's dress became one of the most talked-about topics.

However, it was not the dress itself that sparked the debate. Recently, in her stories, she shared a photo of herself wearing it while standing in front of two mirrors.

At first glance nothing seems wrong, but look at the photo for a second or two and it starts to look like a glitch in The Matrix, or a scene from an episode of Black Mirror.

The point is that the writer is posing differently from what appears in her reflections. To spot this "glitch in the Matrix," pay close attention to the position of her hands.

The controversy over the podcaster's outfit prompted some curious comments.

After the controversy, Coates uploaded the photo to her feed along with this text: "This is a real photo, not retouched in Photoshop, not a panorama, and not a Live Photo. Please enjoy this glitch in the matrix / the photo that nearly made me vomit in the street."

In reality, it is not a scene from some science fiction series. Everything points to a failure of the artificial intelligence built into the camera of the phone that took the photo.

First, we know the photo was taken on an iPhone, since Coates herself mentioned Live Photo, a feature of Apple's phones. We also know it was not retouched.


Now, remember that Apple has been applying computational photography to iPhone pictures for quite some time.

With the launch of the iPhone 11, the Cupertino company introduced Deep Fusion, a technology whose goal is not only to improve the quality of low-light photos, but also to better capture colors and textures so the end result is sharper overall.

Computational photography came to Apple with the iPhone 11. (Apple)

How does it work? The camera takes multiple photos of the same scene (some even before the user presses the shutter button), applying different exposures and parameters, and combines them automatically using artificial intelligence.
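Apple has not published the internals of Deep Fusion, but the general idea of combining several exposures can be illustrated with a minimal sketch. The weighting scheme below (favoring well-exposed, mid-gray pixels) is an assumption for illustration, loosely modeled on classic exposure-fusion techniques, not Apple's actual algorithm:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend several exposures of one scene into a single image.

    Each pixel is weighted by how well exposed it is (how close it
    sits to mid-gray, 0.5), and the frames are averaged with those
    weights -- a simplified stand-in for multi-frame fusion.
    Pixel values are assumed to be in the range [0, 1].
    """
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    # Well-exposedness weight: Gaussian centered on mid-gray.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12  # avoid division by zero
    return np.sum([w * f for w, f in zip(weights, frames)], axis=0) / total

# Two synthetic "exposures" of the same gradient scene:
under = np.linspace(0.0, 0.5, 8)   # underexposed frame
over = np.linspace(0.5, 1.0, 8)    # overexposed frame
fused = fuse_exposures([under, over])
```

Because the result is a weighted average, each fused pixel lies between the corresponding pixels of the input frames, with well-exposed regions contributing the most.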

In 2022, with the launch of the iPhone 14, Apple introduced an optimized version of this technology: the Photonic Engine. It works similarly to Deep Fusion, taking multiple captures and combining them to produce a better result in a single image.

However, the Photonic Engine works on uncompressed images and starts earlier in the imaging pipeline. Here is how Apple explained it in its press release:

"Photonic Engine enables a dramatic increase in quality by applying Deep Fusion earlier in the imaging process to deliver extraordinary detail, preserve subtle textures, provide better color, and maintain more information in a photo."

While it is not known which iPhone model took the photo of the bride, the mismatch between Coates and her reflections can be attributed to a glitch in the Photonic Engine or Deep Fusion processing, whichever the device uses.

Deep Fusion and Photonic Engine are the names Apple gave its computational photography technologies. (Apple)

In conclusion, the artificial intelligence and machine learning behind Apple's computational photography still cannot reliably handle scenes with mirrors.


To the integrated technology, there were three different people in the photograph, when in fact only one was real and the other two were her reflections in the mirrors. As a result, from the burst of frames the iPhone captured, it automatically selected the best version of each of the bride's "variants," producing the viral photo.
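This failure mode can be sketched in a few lines. The sketch below is hypothetical (Apple's pipeline is not public): it picks the "best" frame independently for each region of the image, which is exactly what lets a subject and her reflection end up frozen at different moments:

```python
def merge_best_regions(frames, scores):
    """For each region, keep the version from the highest-scoring frame.

    frames: list of dicts mapping region name -> content (here, a pose
            label, to keep the idea simple).
    scores: per-frame dicts of hypothetical quality scores (e.g. sharpness).
    Choosing each region independently means the bride and her mirror
    reflections can come from *different* captured instants.
    """
    merged = {}
    for region in frames[0]:
        best = max(range(len(frames)), key=lambda i: scores[i][region])
        merged[region] = frames[best][region]
    return merged

# Two frames captured milliseconds apart; the bride moved her hands.
frame_a = {"bride": "hands down", "mirror_left": "hands down",
           "mirror_right": "hands down"}
frame_b = {"bride": "hands up", "mirror_left": "hands up",
           "mirror_right": "hands up"}
# Hypothetical per-region quality scores for each frame.
scores = [{"bride": 0.7, "mirror_left": 0.9, "mirror_right": 0.6},
          {"bride": 0.8, "mirror_left": 0.5, "mirror_right": 0.9}]
result = merge_best_regions([frame_a, frame_b], scores)
# The left mirror keeps frame_a's pose while the bride and right mirror
# take frame_b's pose: a physically impossible reflection.
```

Because each "person" is scored and selected separately, the pipeline has no notion that the three figures must agree, which matches the explanation above.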
