Autonomous driving is said to be safer simply because machines work more quickly than we do. Our eyes capture fewer frames per second than a high-speed camera, our brain processes data and reacts to it more slowly than a CPU, and our hands and feet are trounced for pace by electronic actuators.
In addition to safety and mobility benefits, autonomous cars would also relieve us of driving’s regular tedium. Why waste your attention edging through town or slogging up the motorway when you could be reading or working? Or, as one demo showed us, why spend your time parking when you could be shopping?
Nissan says the logical progression is for cars that communicate with each other to create optimum traffic flow without stop lights or lanes. Same space, more cars, less congestion. The company’s laser-equipped knee-high robots (‘EPOROs’) preview the possibility by moving together like a school of fish.
Here's the technology that will move autonomous vehicles from the blackboard to the blacktop.
Once the driver steps out, a ‘valet’ button on the key fob locks the doors and sets the car off to find a space, having recorded the drop-off point by GPS. Using lasers and cameras, the Leaf navigates around cars (both parked and moving) and painted lanes. When a space is identified, the Leaf signals, pulls past it, checks the size of the gap and then reverses in with the help of conventional radar sensors. Shopping done, the owner presses the valet button again and the Leaf navigates its way back to the starting point.
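The valet sequence above reads naturally as a small state machine. Here is a minimal sketch in Python; the state names, the gap lengths and the minimum-gap margin are illustrative assumptions, not figures from Nissan:

```python
from enum import Enum, auto

class ValetState(Enum):
    SEARCHING = auto()      # cruising the car park for a space
    MEASURING_GAP = auto()  # pulled past a candidate, checking its size
    REVERSING_IN = auto()   # radar-assisted reverse park
    PARKED = auto()
    RETURNING = auto()      # heading back to the GPS-recorded drop-off

# Illustrative figures: a Leaf is roughly 4.45 m long, and we assume
# it needs about 1.2 m of extra room to reverse into a bay.
CAR_LENGTH_M = 4.45
MIN_GAP_M = CAR_LENGTH_M + 1.2

def valet_cycle(gap_lengths_m):
    """Walk the valet sequence described in the article: search for
    spaces, pull past each candidate to measure it, reverse into the
    first gap that fits, then (once the owner presses the fob again)
    navigate back to the drop-off point."""
    log = [ValetState.SEARCHING]
    for gap in gap_lengths_m:
        log.append(ValetState.MEASURING_GAP)
        if gap >= MIN_GAP_M:
            log.append(ValetState.REVERSING_IN)
            log.append(ValetState.PARKED)
            break
    log.append(ValetState.RETURNING)
    return log
```

With two candidate gaps of 4.0 m and 6.1 m, for instance, the car measures both, rejects the first as too short, and reverses into the second.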
Fully equipped, the autonomous Leaf uses four cameras that combine to give a near-360-degree view (the front camera is a hi-res, long-range unit to help read road signs). A front radar sensor reads up to 200m ahead of the car, there is a further radar sensor at each rear corner (whose arrays overlap and extend to 70m), plus there's the new bit: six laser scanners, positioned front, rear and at each corner. These scanners are the boxy addenda you see on the silver car and have a useful range of 100m. The laser control unit in the rear of the car collates the feeds, then sends signals to the steering actuator, accelerator and brakes.
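The quoted ranges can be tabulated in a few lines. This sketch simply asks which sensor types could, on range alone, reach an object at a given distance; the field-of-view detail (and the cameras' range, which the article doesn't quote) is left out:

```python
# Sensor suite with the ranges quoted in the article; the camera
# range is not given, so it is recorded as None.
SENSORS = {
    "front_radar":       {"count": 1, "range_m": 200},
    "rear_corner_radar": {"count": 2, "range_m": 70},
    "laser_scanner":     {"count": 6, "range_m": 100},
    "camera":            {"count": 4, "range_m": None},
}

def sensors_reaching(distance_m):
    """Return the sensor types whose quoted range covers an object
    at distance_m. Field of view is ignored, since the article does
    not fully specify it."""
    return [name for name, spec in SENSORS.items()
            if spec["range_m"] is not None and spec["range_m"] >= distance_m]
```

So at 150m out, only the front radar has the reach; at 80m the laser scanners join it.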
The Leaf observes lane discipline via laser scanners, passing slower vehicles using the outside lane and signalling appropriately. Unexpected obstacles such as errant pedestrians are recognised by the lasers, too, and if braking alone won't avoid them, the steering actuators kick in. An 'SOS' button pulls the car in, then triggers a call to the emergency services. All the while, speed limits are observed via sign recognition.
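The "brake if you can, steer if you must" logic comes down to comparing stopping distance with the distance to the obstacle. A minimal sketch, assuming a flat-road stopping-distance formula and a made-up braking limit (Nissan's actual thresholds are not published in the article):

```python
def avoidance_action(obstacle_distance_m, speed_mps, max_brake_mps2=6.0):
    """Decide between braking alone and braking plus evasive steering.
    Stopping distance is v^2 / (2a); controller reaction time is
    ignored since the system monitors continuously. The 6 m/s^2
    deceleration limit is an illustrative assumption."""
    stopping_m = speed_mps ** 2 / (2 * max_brake_mps2)
    return "brake" if stopping_m < obstacle_distance_m else "brake_and_steer"
```

At 50km/h (about 13.9 m/s) the car stops in roughly 16m, so a pedestrian 30m ahead gets braking alone; at higher speed or shorter range, the steering actuators kick in as well.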
Working with or without sat-nav assistance, the Leaf identifies and negotiates complex urban scenarios such as US four-way stop sign intersections, acting on the first-stop, first-go convention by recognising road signs and monitoring the activity of other vehicles via laser scanning. Traffic lights are observed, accompanied by a “signal is red/green” voice message to explain the car’s behaviour. The Leaf also pauses behind parked vehicles before passing via the oncoming lane when it’s clear to do so.
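The first-stop, first-go convention at a four-way stop is just an ordering by arrival time. A toy sketch, with a tie-break by vehicle id standing in for the real US right-of-way-to-the-right rule:

```python
def departure_order(arrivals):
    """Order vehicles at a four-way stop by the first-stop, first-go
    convention. arrivals is a list of (vehicle_id, stop_time_s) pairs;
    ties are broken by id here, a simplification of the real
    yield-to-the-right rule."""
    return [vid for vid, _ in sorted(arrivals, key=lambda a: (a[1], a[0]))]
```

If the Leaf stops last of three vehicles, it waits for both of the others before pulling away.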
The passenger experience
Though prone to adopting overly cautious driving lines around obstacles, the autonomous Nissan Leaf isn’t shy about calling on the electric drivetrain’s instant torque for rapid acceleration, and braking is fairly aggressive, too.
The actuated steering inputs are a revelation, though; close your eyes and there could almost be human hands on the wheel, so smooth are the inputs — even during emergency manoeuvres.