With smartphone sales stagnating in some markets, device makers have begun ramping up their innovation efforts with features like multiple rear camera sensors, hole-punch selfie cameras, and foldable form factors.
Time-of-flight (ToF) camera technology has been appearing in a growing number of flagship smartphones, and that trend looks set to continue. You may also see it referred to as a range camera or even a 3D sensor. But what is it?
What is a time-of-flight camera?
- Emits an infrared light signal
- Measures how long the signal takes to return
- Determines depth based on extracted data
A ToF camera uses infrared light (lasers invisible to human eyes) to determine depth information – a bit like how a bat senses its surroundings. The sensor emits a light signal, which hits the subject and returns to the sensor. The time the signal takes to bounce back is then measured, providing depth-mapping capabilities. This gives ToF a huge advantage over other technologies, as it can accurately measure distances across a complete scene with a single laser pulse.
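The underlying maths is simple: light travels at a known speed, and the pulse covers the camera-to-subject path twice, so halving the round trip gives the distance. A minimal sketch (our own illustration, not code from any sensor vendor):

```python
# Illustrative only: converting a ToF sensor's measured round-trip
# time into a distance estimate. Names and values are assumptions.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the subject: the pulse travels out and back,
    so halve the round-trip path."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A subject roughly 1.5 m away returns the pulse in about 10 nanoseconds.
print(round(tof_distance_m(10e-9), 3))  # ~1.499 m
```

The tiny time scales involved (nanoseconds for everyday distances) are why ToF sensors need very precise timing electronics.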
In comparison to other 3D depth-scanning technologies, such as structured-light camera/projector systems, ToF technology is relatively cheap. The sensors can reach up to 160 frames per second – that’s 160 depth readings every second – which makes them great for real-time applications such as background blur in on-the-fly video. Better yet, they require relatively little processing power. And once distance data has been collected, features like object detection can be easily implemented with the right algorithms.
What can a time-of-flight camera do?
- Object scanning, indoor navigation, gesture recognition
- Also helps with 3D imaging and improving AR experiences
- Theoretically, it can better blur backgrounds in ‘portrait mode’
A ToF camera sensor can be used to measure distance and volume, as well as for object scanning, indoor navigation, obstacle avoidance, gesture recognition, object tracking, and reactive altimeters. Data from the sensor can also help with 3D imaging and improving augmented reality (AR) experiences. In phones, ToF camera sensors will likely be used for 3D photography, AR, and in particular portrait mode.
Theoretically, ToF cameras can better blur photo backgrounds in portrait mode. We say “theoretically” because the process still requires software processing, and in the end, it’s up to each manufacturer to decide how it applies the data the ToF camera collects.
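To see why depth data helps here, consider how a portrait-mode pipeline might separate subject from background. A hedged sketch, assuming a toy depth map and a simple distance threshold (real phone pipelines are far more sophisticated):

```python
import numpy as np

# Hypothetical 4x4 depth map in metres from a ToF sensor (assumed values):
# the subject sits under 1 m away, the background around 2.5 m.
depth = np.array([
    [0.8, 0.8, 2.5, 2.5],
    [0.8, 0.9, 2.5, 2.6],
    [0.9, 0.9, 2.6, 2.6],
    [2.4, 2.5, 2.6, 2.7],
])

# Pixels closer than the threshold are treated as the subject and kept
# sharp; everything else would be handed to a blur filter.
THRESHOLD_M = 1.5
foreground = depth < THRESHOLD_M

print(int(foreground.sum()))  # number of "keep sharp" pixels -> 6
```

With only a regular 2D camera, this mask has to be guessed from image content alone, which is where software-only portrait modes tend to fail around hair and glasses; a per-pixel depth map makes the split far more direct.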
ToF cameras can also assist in low-light situations – since the sensor uses infrared light to pick up “distance-to-subject”, it could help smartphones focus even in darkness.
Are time-of-flight cameras new?
- Microsoft used ToF cameras in second-gen Kinect
- Lidar sensors use ToF cameras – such as in the iPhone 12 Pro
ToF technology isn’t very new, as various companies have been experimenting with it for at least the past decade. Microsoft, for instance, used ToF cameras in its second-generation Kinect devices.
Lidar – which is popular in self-driving cars, but more recently appeared in the iPhone 12 Pro and Pro Max – also commonly relies on ToF sensing. The difference is that lidar uses multiple laser pulses to build ‘point maps’ of a scene, which is why it makes great sense in car safety and automation systems.
Even drone companies have adopted them – the Chouette drone uses a TeraRanger ToF camera to surveil vineyards.
So, while ToF cameras weren’t invented yesterday, they’re still cutting-edge and rapidly becoming more efficient, affordable, and accessible.
Who makes time-of-flight camera sensors?
Sony makes next-generation 3D sensors with ToF technology, and it is the supplier Apple uses.
Other ToF camera sensor makers and products include AMS/Heptagon, ASC’s TigerCub, the TeraRanger One, Riegl, Lucid’s Helios, and Adafruit.