Apple and AR

Following up on a prior thread about Apple and some speculation about what might be coming next ("One more thing…"), I came across the following article today, which may give the idea more credence.

Smartphones boasting “dual cameras” are becoming more common, and news that they will feature on the just-announced iPhone 7 Plus indicates their arrival into the mainstream. But while dual cameras may stem from efforts to improve picture quality, they have the potential to lead us down much more interesting paths: The real story may be that Apple is using dual cameras to position itself for the augmented reality world ushered in by the Pokemon Go phenomenon.

Augmented Reality, or AR, has been a solution in search of a problem for years. In the last few months, Pokemon Go has taken AR into the mainstream after its long stint in the wilderness. The new dual-camera system in the iPhone 7 Plus may just be the platform on which to expand fully into AR.

Building stereoscopic scenes

Having two slightly different viewpoints means live images can be processed to estimate depth for each captured pixel, giving images an extra dimension of data. Since the baseline distance between the two cameras is known, software can triangulate in real time to determine the distance to corresponding points in the two images. Our own brains do something similar, called stereopsis, which is how we are able to perceive the world in three dimensions.
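The triangulation step can be sketched in a few lines. This is a minimal illustration of the similar-triangles relationship, not Apple's implementation; the focal length and baseline values below are invented for the example, not actual iPhone specifications.

```python
# Sketch of stereo triangulation: recovering depth from pixel disparity.
# Focal length (in pixels) and baseline (in metres) are illustrative only.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Distance to a point seen by both cameras.

    disparity_px: horizontal shift (in pixels) of the same scene point
                  between the left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("point must appear shifted between the two views")
    # Similar triangles: depth = (focal length * baseline) / disparity
    return focal_length_px * baseline_m / disparity_px

# A nearby object shifts a lot between views; a distant one barely moves.
near = depth_from_disparity(40, focal_length_px=2800, baseline_m=0.01)
far = depth_from_disparity(4, focal_length_px=2800, baseline_m=0.01)
# The smaller the disparity, the farther the point.
```

Running this correspondence search over every pixel, in real time, is what yields the per-pixel depth map described below.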

The iPhone uses machine learning algorithms to scan objects within a scene, building up a real-time 3D depth map of the terrain and objects. Currently, the iPhone uses this to separate the background from the foreground in order to selectively focus on foreground objects. This effect of blurring out background details, known as bokeh, is a feature of DSLRs and is not readily available on smaller cameras, such as those in smartphones. The depth map allows the iPhone to simulate a variable aperture, rendering chosen areas of the image out of focus. While an enviable addition for smartphone camera users, this is a gimmick compared to what the depth map can really do.
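The idea of a simulated aperture can be illustrated with a toy example: blur each pixel by an amount proportional to how far its depth is from the chosen focal plane. This is a deliberately simplified 1-D sketch with invented pixel and depth values, not how Apple's pipeline works; real systems operate on 2-D images with far more sophisticated blur kernels.

```python
# Toy sketch of depth-driven "portrait mode" blur on a 1-D scanline.
# Pixel values and depths are invented for illustration.

def synthetic_aperture_blur(pixels, depths, focus_depth, strength=2.0):
    """Blur each pixel in proportion to its distance from the focal
    plane, crudely approximating a variable aperture."""
    out = []
    for i, (p, d) in enumerate(zip(pixels, depths)):
        # Blur radius grows as the pixel's depth moves away from focus.
        radius = int(abs(d - focus_depth) * strength)
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))  # simple box average
    return out

# Foreground (depth 1.0) stays sharp; background (depth 5.0) gets averaged.
pixels = [10, 200, 10, 200, 10, 200]
depths = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
blurred = synthetic_aperture_blur(pixels, depths, focus_depth=1.0)
```

Because the blur is computed from the depth map rather than by the lens, the "aperture" and focal plane can be changed in software, even after the shot is taken.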

Software that analyses people's poses and locations within a scene could turn a dual-camera smartphone into a virtual window onto the real world. With hand gesture recognition, users could interact naturally with a mixed-reality world, while the phone's accelerometer and GPS data detect and drive changes to how that world is presented and updated.

There has been speculation that Apple intends to use this in Apple Maps to augment real-world objects with digital information. Other uses will come as third-party manufacturers and app designers link their physical products to the social media, shopping, and payment opportunities available through a smartphone.