Published On: Tue, Nov 14th, 2017

Apple plans to add a 3D sensor to the back of the iPhone, too

(Image credit: Samuel Axon)

The iPhone X’s front-facing TrueDepth sensor array could be used for more than just Face ID authentication, and it fits neatly into Apple’s broader march into augmented reality on the iPhone. For AR on the back of the device, though, the iPhone X still relies on a combination of motion sensors and its two rear cameras. That could change in next year’s iPhone: sources cited by Bloomberg claim that Apple plans to add 3D camera technology to the rear of the device, in addition to the TrueDepth array already on the iPhone X’s front.

The rear camera might not use the same technology as the TrueDepth sensor array that powers Face ID on the front of the iPhone X, however. Rather, the rear array might use time-of-flight sensors, which map objects in 3D space by calculating how long it takes for light from a laser to bounce off an object in the sensor’s field of view. Bloomberg’s sources say that adoption of this technology is not certain, but it appears to be what Apple is testing right now. The technology is in development at Sony, Panasonic, Infineon Technologies, and STMicroelectronics.
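The arithmetic behind time-of-flight is straightforward: the measured round-trip time of a light pulse, multiplied by the speed of light and halved, gives the one-way distance to a surface. The Swift snippet below is a minimal sketch of that calculation; the function name and sample timing are illustrative, not drawn from any actual sensor API.

```swift
import Foundation

let speedOfLight = 299_792_458.0 // meters per second

/// Distance to a surface given the measured round-trip time of a light
/// pulse. The pulse travels out and back, so the one-way distance is
/// half the total path length.
func distance(roundTripTime seconds: Double) -> Double {
    return (speedOfLight * seconds) / 2.0
}

// A pulse that returns after roughly 6.67 nanoseconds corresponds to a
// surface about one meter away.
print(distance(roundTripTime: 6.67e-9)) // ≈ 1.0 meters
```

At these scales a sensor has to resolve timing differences of picoseconds to distinguish centimeters of depth, which is why the technology is still maturing at suppliers like Sony and STMicroelectronics.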

In the iPhone X, Apple aligned the telephoto and wide-angle cameras on the back vertically (instead of horizontally, as on the iPhone 8 Plus) to make augmented reality applications more effective. But without a more advanced way to read and track 3D space, AR apps will remain limited. Unlike more robust hardware such as Microsoft’s HoloLens, the current iPhones’ rear cameras can’t deal well with surfaces that aren’t flat. They can’t even tell when a real object sits between the camera and virtual content: current iPhone AR apps place an object in space relative to a detected flat surface but can’t partially hide it behind a real-world obstacle, as the sketch below illustrates.
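To make the limitation concrete, here is a minimal sketch using ARKit’s 2017-era plane-detection API, which is how current iPhone AR apps anchor virtual content. The class name and box geometry are illustrative; the point is that tracking is built around detected flat planes, and nothing in this pipeline gives the renderer per-pixel depth for occlusion.

```swift
import UIKit
import ARKit
import SceneKit

class PlaneViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Plane detection is limited to flat surfaces; the rear cameras
        // provide no depth-based understanding of arbitrary geometry.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)
    }

    // Called when ARKit detects a flat surface. A virtual box is placed
    // relative to that plane, but anything physically between the camera
    // and the box still renders behind it, because there is no per-pixel
    // depth to occlude the virtual content.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0.05, 0) // rest the box on the plane
        node.addChildNode(box)
    }
}
```

A rear-facing depth sensor would give ARKit a live depth map of the scene, which is the missing ingredient for occlusion and for tracking non-flat surfaces.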
