September 12th 17, 08:55 PM posted to rec.photo.digital
Sandman
Next step in iPhone photography

While we're not really encouraging people to use their smartphones *more* to
take photos, I still think it's interesting to see Apple try to make the most
of the limited camera technology that fits in such a small device.

With the iPhone 7 Plus, they introduced dual lenses and "Portrait mode",
which uses both lenses to gauge depth in the scene and separate the subject
from the background, creating fake bokeh that is based on actual 3D geometry
rather than on just blurring parts of a 2D image.
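For anyone curious what that means in practice, here's a rough sketch of the
idea in Python. This is just my own illustration, not Apple's pipeline: the
blur strength for each pixel comes from a depth map and a chosen focal
distance instead of a flat 2D mask. The inputs (image, depth, focus_dist) and
the function name are placeholders for the example.

  # Sketch of depth-driven synthetic bokeh (not Apple's actual pipeline).
  # Assumes image is an H x W x 3 float array and depth is H x W (meters).
  import numpy as np
  from scipy.ndimage import gaussian_filter

  def synthetic_bokeh(image, depth, focus_dist, max_sigma=8.0):
      # Distance from the focal plane decides how strongly each pixel is blurred.
      blur_amount = np.clip(np.abs(depth - focus_dist) / depth.max(), 0.0, 1.0)

      # Pre-blur the whole image at a few strengths, then blend per pixel.
      levels = np.linspace(0.0, max_sigma, 5)
      stack = [image if s == 0 else
               np.dstack([gaussian_filter(image[..., c], s) for c in range(3)])
               for s in levels]

      # Map each pixel's blur amount to the nearest pre-blurred level.
      idx = np.round(blur_amount * (len(levels) - 1)).astype(int)
      out = np.zeros_like(image)
      for i, layer in enumerate(stack):
          mask = idx == i
          out[mask] = layer[mask]
      return out

The point of the sketch is just that the blur follows the scene's depth, which
is what separates this from slapping a blur filter on part of a flat photo.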

With the iPhone 8 and iPhone X, the next step is to use this 3D data to
create lighting for the scene: they isolate the subject and use the 3D
topographic data of the face to produce what looks like pretty realistic
lighting across it.

Here's a video that shows how a normal photo can be separated and relit:

https://images.apple.com/media/us/ip...18c-4bb3-a900-
2d84eb37a5d7/overview/primary/cameras_portrait_lighting/large.mp4

The important part here is that these are not "filters" laid on top of a 2D
image; they are virtual lights being shone on a 3D representation of the
subject's "facial landscape".

Whether you like the end result or not isn't really important; it's just
fascinating to see a company make the most of what it has to emulate
something that has always required big cameras or a big studio. Even if they
don't emulate it perfectly, it's still pretty noteworthy.

This is, after all, the world's most used camera.

--
Sandman