Cameras, radar, and lidar sensors combine to enable fully autonomous vehicles.
The challenge has been to replace the human eye, and the human brain's ability to make decisions based on what the eye sees. Autonomous vehicles have often relied on data from cameras and radar. Radar sensors are reliable, but their resolution is insufficient to identify pedestrians and other small objects at a distance. Cameras offer a sufficient level of detail and a good overview in two dimensions, but require substantial software to convert 2D images of the surrounding environment into a 3D representation. Generally, for autonomous vehicles, software matters more than hardware.
For safe autonomous driving, an additional sensor is therefore needed: lidar, which stands for light detection and ranging. There needs to be overlap between the sensors, so that one can serve as a backup and take over from another if need be. For example, what happens if the camera and radar report conflicting information? Which of these sensors should we trust? With lidar, we can obtain a better basis for such decisions.
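The tie-breaking idea described above can be sketched as a simple majority vote across redundant sensors. This is a minimal illustration only; the function name, inputs, and voting rule are invented for this sketch and are not Scania's actual fusion logic.

```python
def fuse_obstacle_detections(camera: bool, radar: bool, lidar: bool) -> bool:
    """Return True if an obstacle should be assumed present.

    Hypothetical 2-out-of-3 majority vote: any two agreeing sensors
    decide, so a single faulty or conflicting sensor cannot flip
    the outcome on its own.
    """
    votes = sum((camera, radar, lidar))
    return votes >= 2

# Camera reports an obstacle, radar does not: lidar breaks the tie.
print(fuse_obstacle_detections(camera=True, radar=False, lidar=True))   # True
print(fuse_obstacle_detections(camera=True, radar=False, lidar=False))  # False
```

Real fusion systems weigh sensor confidence, range, and failure modes rather than casting equal boolean votes, but the redundancy principle is the same: no single sensor's reading is trusted in isolation.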
Scania’s first fully autonomous self-driving concept truck, the Scania AXL, is equipped with cameras, radar, lidar, and GPS receivers. The system is designed for a level of autonomy that meets the operational needs of mines. Scania indicates the system isn’t yet street smart, but it is smart enough to be used in mines. The human eye is not easily replaced, but sensors can provide a reasonably good overview of the surroundings. Driving in a mine is fairly simple and predictable; driving in a more dynamic and less predictable environment will require more work.
It has been difficult to decide just how complex the system should be. This involves balancing the opportunity to develop a more general system for many applications against the need for a robust and reliable system for the mining industry.