Huawei Technologies Co. is planning to unveil a new phone with a camera capable of taking three-dimensional pictures for augmented-reality applications.
The phone, code-named Princeton internally, will be announced this month and go on sale within a few weeks. The technology uses sensors developed by Sony Corp. that can accurately measure distances by bouncing light off surfaces.
The science behind Sony’s “Time of Flight” technology is as follows:
- The ToF sensor inside the camera pulses out infrared light.
- Light bounces off surfaces and back to sensors.
- Sensor calculates depth from how long the light takes to return.
- Depth data creates 3D map of environment.
- App makers can get creative: design rooms with virtual furniture or have a Pikachu lounge on your desk.
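The steps above reduce to a simple distance calculation: light travels at a known speed, so timing the round trip of a pulse gives the range to whatever it bounced off. A minimal sketch of that principle (illustrative only, not Huawei's or Sony's actual implementation):

```python
# Time-of-flight principle: distance from the round-trip time of a light pulse.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface, given the pulse's round-trip travel time."""
    # The pulse travels out to the surface and back, so halve the path.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface roughly 1.5 m away returns the pulse after about 10 nanoseconds:
print(tof_distance(10e-9))  # ≈ 1.499 m
```

Repeating this measurement for every pixel of the sensor is what yields the per-pixel depth map in the final step.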
The new feature — dubbed “3D Camera” at Huawei — comes at a critical juncture for the smartphone industry, which is facing cooling global demand as consumers find fewer reasons to upgrade to new phones. Huawei is aiming to boost sales and win market share from competitors such as Apple Inc. by letting users generate 3-D models of themselves and their environment in real time, and share them with others.
This is technology that has never been seen before and, at the extreme, has the potential to change how we view the world.
Besides generating pictures that can be viewed from numerous angles, Huawei’s new camera can create 3-D models of people and objects for use in augmented-reality apps. The new camera will also let developers control apps and games in new ways, such as with hand gestures.
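Building a 3-D model from a depth map is typically done by back-projecting each pixel and its depth reading into a point in camera space using the pinhole camera model. A minimal sketch, with made-up intrinsic parameters (`fx`, `fy`, `cx`, `cy` are illustrative values, not Huawei's actual calibration):

```python
# Illustrative: back-project one depth-map pixel into a 3-D camera-space point
# using the pinhole camera model. Intrinsics here are hypothetical.
def depth_to_point(u: int, v: int, depth: float,
                   fx: float = 500.0, fy: float = 500.0,
                   cx: float = 320.0, cy: float = 240.0):
    """Return the (x, y, z) point for pixel (u, v) at the given depth in metres."""
    x = (u - cx) * depth / fx  # horizontal offset from the optical axis
    y = (v - cy) * depth / fy  # vertical offset from the optical axis
    return (x, y, depth)

# The centre pixel of a 640x480 depth map at 2 m lies on the optical axis:
print(depth_to_point(320, 240, 2.0))  # (0.0, 0.0, 2.0)
```

Running this over every pixel produces a point cloud, the raw material for the 3-D models and gesture recognition described above.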
For Sony, the world leader in image sensors used in regular cameras, 3-D cameras could generate billions in additional revenue from the sale of its new components. The company bought Brussels-based Softkinetic in 2015, combining the Belgian startup’s time-of-flight technology with its own semiconductor manufacturing prowess to create 3-D chips small enough to fit inside smartphones.
While Apple’s FaceID facial-recognition feature is also powered by 3-D sensors, it relies on a different technology called Structured Light, which can measure depth at shorter distances. Sony’s time-of-flight sensors can do so at longer distances.
Is this a game changer?