Sunday, May 26, 2024

Samsung responds to faux Moon controversy



Samsung has published an English-language blog post explaining the methods its phones use to photograph the Moon. The post's content isn't exactly new (it appears to be a lightly edited translation of an article posted in Korean last year) and doesn't offer much new detail on the process. But because it's an official translation, we can scrutinize more closely its explanation of what Samsung's image processing technology is doing.

The explanation is a response to a viral Reddit post that showed in stark terms just how much extra detail Samsung's camera software adds when taking a photo of what appears to be the Moon. These criticisms aren't new (Input published a lengthy piece about Samsung's Moon photos in 2021), but the simplicity of the test brought the issue greater attention: Reddit user ibreakphotos simply snapped a photo of an artificially blurred image of the Moon using a Samsung phone, which added in extra detail that didn't exist in the original. You can see the difference for yourself below:

Samsung's blog post today explains that its "Scene Optimizer" feature (which has supported Moon photography since the Galaxy S21 series) combines multiple techniques to generate better photos of the Moon. To start with, the company's Super Resolution feature kicks in at zoom levels of 25x and higher, and uses multi-frame processing to combine more than 10 images to reduce noise and enhance clarity. It also optimizes exposure so the Moon doesn't appear blown out against the dark sky, and uses a "Zoom Lock" feature that combines optical and digital image stabilization to reduce blur.
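Samsung doesn't publish its pipeline, but the noise-reduction benefit of merging 10+ frames follows from simple statistics: averaging N aligned exposures cuts random sensor noise by roughly the square root of N. A minimal sketch (a toy simulation, not Samsung's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" scene: a simple gradient standing in for a moon shot.
scene = np.linspace(0.0, 1.0, 64).reshape(8, 8)

# Simulate 12 noisy exposures of the same, perfectly aligned scene.
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(12)]

# Averaging N frames reduces noise standard deviation by roughly sqrt(N).
merged = np.mean(frames, axis=0)

noise_single = (frames[0] - scene).std()
noise_merged = (merged - scene).std()
assert noise_merged < noise_single
```

In a real camera the frames must first be aligned (hence the "Zoom Lock" stabilization) and outliers rejected; the square-root improvement only holds for well-registered frames.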

Actually identifying the Moon in the first place is done with an "AI deep learning model" that's been "built based on a variety of moon shapes and details, from full through to crescent moons, and is based on images taken from our view from the Earth."

But the key step, and the one that's generated all the controversy, appears to be the use of an under-explained "AI detail enhancement engine." Here's how Samsung's blog post describes the process:

"After Multi-frame Processing has taken place, Galaxy camera further harnesses Scene Optimizer's deep-learning-based AI detail enhancement engine to effectively eliminate remaining noise and enhance the image details even further."


And here's Samsung's flow chart of the process, which describes the Detail Enhancement Engine as a convolutional neural network (a type of machine learning model commonly used to process imagery) that ultimately compares the enhanced result against a "Reference with high resolution."

Samsung's flow chart shows how the Moon is identified, after which its "Detail Enhancement Engine" gets to work.
Image: Samsung
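Samsung doesn't disclose the network's architecture, but the basic building block of any convolutional model is a learned 2D filter slid across the image. A toy illustration (using a hand-picked sharpening kernel in place of learned weights, not Samsung's model):

```python
import numpy as np

def conv2d(img, kernel):
    """Naive 'valid' 2D convolution, the core operation of a convolutional network."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A hand-picked sharpening kernel. A trained detail-enhancement network
# would instead learn many such kernels by comparing its output against
# a high-resolution reference and minimizing the difference.
sharpen = np.array([[ 0., -1.,  0.],
                    [-1.,  5., -1.],
                    [ 0., -1.,  0.]])

blurry = np.full((5, 5), 0.5)
blurry[2, 2] = 1.0  # one bright pixel, softened by the surrounding gray

enhanced = conv2d(blurry, sharpen)
assert enhanced[1, 1] > blurry[2, 2]  # center detail amplified
```

The controversy is over what the learned kernels encode: trained against high-resolution Moon references, a network like this can reintroduce lunar texture that the input frame never contained.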

It appears to be this stage that's adding detail that wasn't present when the image was originally taken, and it could explain why ibreakphotos' followup test (placing a plain gray square onto a blurry picture of the Moon) resulted in the blank square being given a Moon-like texture by Samsung's camera software.

While this new blog post offers more detail in English than what Samsung has said publicly before, it's unlikely to satisfy those who see any software capable of producing a realistic image of the Moon from a blurry photo as essentially faking the whole thing. And when these AI-powered capabilities are used to promote phones, Samsung risks misleading customers about what its phones' zoom features are capable of.

But, as my colleague Allison wrote yesterday, Samsung's camera software isn't a million miles away from what smartphone computational photography has been doing for years to get increasingly crisp and vibrant photos out of relatively small image sensors. "Year after year, smartphone cameras go a step further, trying to make smarter guesses about the scene you're photographing and how you want it to look," Allison wrote. "These things all happen in the background, and generally, we like them."

Samsung's blog post ends with a telling line: "Samsung continues to improve Scene Optimizer to reduce any potential confusion that may occur between the act of taking a picture of the real moon and an image of the moon." (Our emphasis.)

On one level, Samsung is essentially saying: "We don't want to get fooled by any more creative Redditors who take pictures of images of the Moon that our camera thinks is the Moon itself." But on another, the company is also highlighting just how much computational work goes into producing these pictures, and will continue to in the future. In other words, we're left asking the same question: "what is a photograph, anyway?"


