3D cameras are expensive in part because they need two of almost everything, but a new method developed by Toshiba replaces that expensive second lens with something much simpler: a set of colored filters.
Dual-lens camera systems create 3D footage by combining the views from two slightly offset lenses, much the way our two eyes give us depth perception. The problem is that the two-lens design is expensive to produce, especially for applications where image quality isn’t the top priority, such as backup cameras or giving a robot “sight.”
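For context, a stereo pair recovers depth from how far a feature shifts between the two views, and a wider lens separation gives finer depth resolution. The sketch below shows the standard pinhole-stereo relation; the focal length and disparity values are made up for illustration and are not tied to any particular camera.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from the standard pinhole-stereo relation: Z = f * B / d.

    focal_px     -- focal length in pixels (assumed value for illustration)
    baseline_m   -- separation between the two lenses, in metres
    disparity_px -- horizontal shift of a feature between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: a 35 cm baseline (the separation Toshiba compares its system to)
# and an assumed 1000-pixel focal length. A 70-pixel disparity puts the
# object about 5 metres away.
print(stereo_depth(focal_px=1000, baseline_m=0.35, disparity_px=70))  # -> 5.0
```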
Earlier this month, however, Toshiba announced a method it developed to create 3D images using a color-depth map instead of a second lens. The system is set up much like a traditional camera, except that a blue filter and a yellow filter are attached to the lens. The blue filter blurs and color-codes everything in front of the subject, while the yellow filter does the same to everything behind it.
With the image color-coded, the software can measure how far away objects are, building a depth “map” of the surroundings. Toshiba claims that the dual-filter setup is just as accurate as a stereo camera with two lenses 35 cm apart.
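To make the idea concrete, here is a minimal sketch of the general color-coded depth-from-defocus approach, assuming (as the description above suggests) that the blue channel is blurred for objects in front of the focus plane and the yellow channel for objects behind it. This is not Toshiba’s algorithm; the function name, the gradient-energy sharpness measure, and the sign convention are all assumptions made for illustration.

```python
import numpy as np

def relative_depth_map(rgb, patch=16):
    """Crude illustration of color-coded depth-from-defocus.

    Assumption (not Toshiba's actual method): if blue is blurred for objects
    in front of the focus plane and yellow (red + green) for objects behind
    it, then comparing local sharpness of the two channels hints at which
    side of focus each region sits on and, roughly, how far.

    rgb   -- float image array of shape (H, W, 3), values in [0, 1]
    patch -- size of the square regions over which sharpness is compared
    """
    blue = rgb[..., 2]
    yellow = 0.5 * (rgb[..., 0] + rgb[..., 1])  # crude "yellow" channel

    def sharpness(channel):
        # Gradient energy as a simple local sharpness measure.
        gy, gx = np.gradient(channel)
        return gx**2 + gy**2

    sb, sy = sharpness(blue), sharpness(yellow)
    h, w = blue.shape
    depth = np.zeros((h // patch, w // patch))
    for i in range(depth.shape[0]):
        for j in range(depth.shape[1]):
            block = (slice(i * patch, (i + 1) * patch),
                     slice(j * patch, (j + 1) * patch))
            b, y = sb[block].mean(), sy[block].mean()
            # Positive -> yellow sharper than blue (nearer than the focus
            # plane, under the stated assumption); negative -> the reverse.
            # The magnitude is only a relative cue, not a calibrated distance.
            depth[i, j] = (y - b) / (y + b + 1e-8)
    return depth
```

In a real system the per-channel blur would be calibrated so that these relative cues map to actual distances, which is presumably where Toshiba’s claimed 35 cm-baseline accuracy comes from.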
The technology is designed for applications such as self-driving cars, robots, and park-assist cameras; professional videographers likely won’t be seeing single-lens 3D cameras any time soon. That’s because, while the method is cheaper to produce, the image quality isn’t up to par: the filters intentionally blur parts of the image in order to measure distance. The filters also mean that only green light gets through to the camera, so this type of single-lens 3D system would be horrible in low light.
Still, if the accuracy is as good as Toshiba says, this technology could help to cut some of the costs of backup cameras, self-flying drones, and robots — as long as you don’t mind your robot being a Cyclops.