Lidar-Based AR Screen Can Improve Road Safety | Research and Technology | Apr 2021
CAMBRIDGE, England, April 29, 2021 – A lidar-based AR heads-up display allows drivers to “see through” objects, alerting them to potential hazards without distraction. Researchers at the University of Cambridge, the University of Oxford, and University College London (UCL) developed the technology, which uses lidar to create ultrahigh-definition holographic representations of road objects.
These objects are then projected directly into the driver’s eyes, contrasting with the 2D windshield projections used in most head-up displays.
“Our results show that 2D projections onto the windshield could distract the driver, since they appear in a small area of the windshield and the driver must constantly shift their gaze from the road to the windshield,” Jana Skirnewskaja, lead author of a study describing the technology and a doctoral candidate at the University of Cambridge, told Photonics Media. “In the case of the 3D augmented reality optical configuration, the holograms are projected directly into the driver’s eyes, so that the pupil acts as a lens to focus the holographic objects projected onto the road, matching the distance and size of the real objects.”
An image of a tree based on lidar data (left). The same image converted to a hologram (right). Courtesy of Jana Skirnewskaja.
The setup, she said, consists of a helium-neon (HeNe) laser, linear polarizers, a half-wave plate, an ultrahigh-definition spatial light modulator, and convex and concave lenses. It takes input from a lidar sensor that feeds information into algorithms, which then pass the relevant data to the optical system.
Using lidar, the researchers scanned Malet Street, a busy area of the UCL campus in central London. Co-author Phil Wilkes, a geographer who typically uses lidar to scan rainforests, scanned the entire street with a technique called terrestrial laser scanning. Millions of pulses were sent from several positions along Malet Street to create a 3D model.
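Combining terrestrial scans captured from several positions into one scene amounts to registering each scan's local point cloud into a common frame with a known rigid transform. A minimal sketch of that merging step, assuming the per-position rotations and translations have already been determined (the names and toy data here are illustrative, not from the study):

```python
import numpy as np

def merge_scans(scans, rotations, translations):
    """Register scans from several scanner positions into one global point cloud.

    scans        - list of (N_i, 3) arrays of points in each scanner's local frame
    rotations    - list of (3, 3) rotation matrices, one per scan position
    translations - list of (3,) translation vectors, one per scan position
    """
    registered = [pts @ R.T + t for pts, R, t in zip(scans, rotations, translations)]
    return np.vstack(registered)

# Two toy "scans": the second scanner sits 5 m farther down the street (x axis).
scan_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 2.0]])
scan_b = np.array([[0.0, 1.0, 0.0]])
identity = np.eye(3)
cloud = merge_scans(
    [scan_a, scan_b],
    [identity, identity],
    [np.zeros(3), np.array([5.0, 0.0, 0.0])],
)
print(cloud.shape)  # (3, 3)
```

In practice the transforms come from the scanner's registration workflow (targets or cloud-to-cloud alignment); the math of stitching the scans together is just this rotation-plus-translation applied per point.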
“That way we can put the scans together and build an entire scene, which captures not only trees but also cars, trucks, people, signs, and everything else you would see on a typical city street,” Wilkes said. “Although the data we captured came from a stationary platform, it’s similar to the sensors that will be in the next generation of autonomous or semi-autonomous vehicles.”
After the 3D model of Malet Street was completed, the researchers turned various street objects into holographic projections. The lidar data, in the form of point clouds, was processed by segmentation algorithms to identify and extract the target objects. Another algorithm converted the target objects into computer-generated diffraction patterns. These data points were then sent to the optical setup.
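The article does not specify which algorithm generates the diffraction patterns, but a common way to compute a phase-only hologram for a spatial light modulator is the Gerchberg-Saxton iteration: alternate between the SLM plane and the image plane, keeping unit amplitude at the SLM and the target amplitude at the image. A minimal sketch of that idea (the toy target stands in for an extracted street object; this is an assumption about the method, not the study's own code):

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=20, seed=0):
    """Compute a phase-only hologram whose far-field diffraction pattern
    approximates target_amplitude (Gerchberg-Saxton iteration)."""
    rng = np.random.default_rng(seed)
    # Start from the target amplitude with a random phase.
    field = target_amplitude * np.exp(
        1j * rng.uniform(0, 2 * np.pi, target_amplitude.shape))
    for _ in range(iterations):
        slm_field = np.fft.ifft2(field)
        slm_phase = np.angle(slm_field)           # phase-only constraint at the SLM
        field = np.fft.fft2(np.exp(1j * slm_phase))
        field = target_amplitude * np.exp(1j * np.angle(field))  # enforce target image
    return slm_phase

# Toy target: a bright square, a stand-in for a segmented street object.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase = gerchberg_saxton(target)
print(phase.shape)  # (64, 64)
```

The resulting phase map is what would be written to the spatial light modulator; the laser illuminating the modulator then reconstructs the object's image optically.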
“With the help of an algorithm, we are able to project multiple layers, and therefore multiple holographic objects, into the driver’s eyes, creating augmented reality in the driver’s field of vision on the road,” Skirnewskaja told Photonics Media.
The holographic projection seen by the driver is faithful to the scale and position of the real object depicted on the street. For example, an obscured street sign would appear as a holographic projection at its actual position behind the obstacle, acting as a warning mechanism.
The researchers plan to refine the system by customizing the layout of the heads-up display. They created an algorithm capable of projecting several layers of different objects that can be freely arranged in the driver’s viewing space. For example, in the first layer, a traffic sign farther away can be projected at a smaller size; in the second layer, a warning closer by may display the sign at a larger size.
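The layer arrangement described above can be sketched with a simple pinhole-style rule: projected size scales inversely with distance, and nearer objects go to a closer layer. The function, layer scheme, and threshold below are illustrative assumptions, not the researchers' actual algorithm:

```python
def layered_projection(objects, near_threshold_m=10.0):
    """Assign each object to a display layer and scale its projected size
    inversely with distance (small-angle pinhole model: size ~ 1/distance)."""
    layers = {1: [], 2: []}  # layer 1: far objects; layer 2: near objects
    for name, real_size_m, distance_m in objects:
        angular_size = real_size_m / distance_m   # radians, small-angle approximation
        layer = 2 if distance_m < near_threshold_m else 1
        layers[layer].append((name, round(angular_size, 3)))
    return layers

objects = [
    ("speed sign", 0.6, 30.0),  # far: layer 1, projected small
    ("warning",    0.6, 5.0),   # near: layer 2, projected large
]
print(layered_projection(objects))
# {1: [('speed sign', 0.02)], 2: [('warning', 0.12)]}
```

The same 0.6 m sign subtends six times the angle at 5 m as at 30 m, which is exactly the far-small, near-large layering the researchers describe.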
“Currently, we are testing the technology in an automotive environment. We intend to experiment with different light sources to decrease the size of the optical setup and to reduce the number of lenses by implementing an advanced algorithm that creates virtual lenses,” Skirnewskaja said. “This will allow us to practically adapt the optical configuration to the car environment.”
The research was published in Optics Express (www.doi.org/10.1364/oe.420740).