Photo captions: Nissan has said it wants autonomous vehicles ready by 2020, a target Mercedes-Benz shares; a whole host of sub-systems will need to work together to make the concept a reality; the on-board computers deliver smooth, well-calculated self-steering inputs; the Leaf prototype uses four cameras to provide a 360-degree view around the car; a 'valet' option on the key fob sends the car off to park itself; some vehicles are already equipped with emergency-services call buttons; this Leaf can identify and negotiate a variety of driving scenarios, observing lane discipline via its laser scanners.
The race for mass-market self-driving cars is well and truly on.
Mercedes-Benz recently tested an autonomous S-class on public roads and declared it wants to be first to market such a car, tentatively suggesting it would do so this decade. But Nissan has boldly announced plans to sell affordable self-driving cars by 2020, so we took the chance to assess its progress via an autonomous Leaf prototype.
It’s all part of Nissan’s ‘Blue Citizenship’ social responsibility plan, a trinity of goals consisting of zero emissions (the Leaf is already the world’s best-selling EV by far), near-zero fatalities and mobility for all. It’s the latter two aims that autonomous driving will tackle; after all, 93 per cent of the US’s road accidents are caused by human error, and an autonomous car could also grant new independence to those unable to drive.
The tech is based on Nissan’s existing ‘Safety Shield’, a radar- and camera-based set-up that has featured on 730,000 cars to date, bringing with it features such as lane departure warning, adaptive cruise and all-around camera views. The autonomous Leaf adds continuously scanning lasers (ten times more precise than radar), a raft of new software algorithms to fuse, process and react to the sensor feeds, and actuation systems for the steering, accelerator and brakes. (Only the steering wheel is physically actuated; the Leaf’s accelerator and brakes are electronically controlled, so there’s no need for pedal movement, and the controls retain precautionary manual override.)
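As a rough illustration of that fuse-process-react chain, the loop might be sketched as below. To be clear, this is our own sketch, not Nissan's software; every class, function and weight in it is an assumption made for readability.

```python
from dataclasses import dataclass

@dataclass
class FusedView:
    nearest_obstacle_m: float   # closest object ahead, in metres
    lane_offset_m: float        # lateral offset from the lane centre, in metres

def fuse(laser_m: float, radar_m: float, camera_offset_m: float) -> FusedView:
    # Trust whichever range sensor reports the closer object: the lasers are
    # the more precise source, the front radar the longer-range backstop.
    return FusedView(nearest_obstacle_m=min(laser_m, radar_m),
                     lane_offset_m=camera_offset_m)

def react(view: FusedView, speed_mps: float) -> dict:
    # Steering correction proportional to lane offset; brake if the gap ahead
    # shrinks inside a simple two-second headway.
    steering = -0.1 * view.lane_offset_m
    brake = view.nearest_obstacle_m < 2.0 * speed_mps
    return {"steering": steering, "brake": brake, "accelerate": not brake}

print(react(fuse(laser_m=45.0, radar_m=60.0, camera_offset_m=0.3), speed_mps=13.9))
```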
Autonomous driving is said to be safer simply because machines work more quickly than we do. Our eyes capture fewer frames per second than a high-speed camera, our brain processes data and reacts to it more slowly than a CPU, and our hands and feet are trounced for pace by electronic actuators.
In addition to safety and mobility benefits, autonomous cars would also relieve us of driving’s regular tedium. Why waste your attention edging through town or slogging up the motorway when you could be reading or working? Or, as one demo showed us, why spend your time parking when you could be shopping?
Nissan says the logical progression is for cars that communicate with each other to create optimum traffic flow without stop lights or lanes. Same space, more cars, less congestion. The company’s laser-equipped knee-high robots (‘EPOROs’) preview the possibility by moving together like a school of fish.
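For the curious, that school-of-fish behaviour boils down to the three classic flocking rules: cohesion, separation and alignment. The sketch below is a generic textbook version of those rules, not Nissan's EPORO code, and the weights and distances in it are assumptions.

```python
import math

def flock_step(positions, velocities, dt=0.1):
    """One update of a generic flocking model: each robot drifts toward the
    group (cohesion), backs away from close neighbours (separation) and
    nudges its velocity toward the average heading (alignment)."""
    new_velocities = []
    for i, (p, v) in enumerate(zip(positions, velocities)):
        others = [(q, w) for j, (q, w) in enumerate(zip(positions, velocities)) if j != i]
        cx = sum(q[0] for q, _ in others) / len(others)
        cy = sum(q[1] for q, _ in others) / len(others)
        cohesion = ((cx - p[0]) * 0.01, (cy - p[1]) * 0.01)
        separation = [0.0, 0.0]
        for q, _ in others:
            d = math.hypot(p[0] - q[0], p[1] - q[1])
            if 0 < d < 2.0:                      # closer than 2 m: back off
                separation[0] += (p[0] - q[0]) / d
                separation[1] += (p[1] - q[1]) / d
        ax = sum(w[0] for _, w in others) / len(others)
        ay = sum(w[1] for _, w in others) / len(others)
        alignment = ((ax - v[0]) * 0.05, (ay - v[1]) * 0.05)
        new_velocities.append((v[0] + cohesion[0] + separation[0] + alignment[0],
                               v[1] + cohesion[1] + separation[1] + alignment[1]))
    new_positions = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
                     for p, v in zip(positions, new_velocities)]
    return new_positions, new_velocities

# e.g. flock_step([(0, 0), (1, 0), (0, 1)], [(0.1, 0), (0, 0.1), (0.1, 0.1)])
```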
Here's the technology that will move autonomous vehicles from the blackboard to the blacktop.
Once the driver steps out, a ‘valet’ button on the key fob locks the doors and sets the car off to find a space, having recorded the drop-off point by GPS. Using lasers and cameras, the Leaf navigates around cars (both parked and moving) and painted lanes. When a space is identified, the Leaf signals, pulls past it, checks the size of the gap and then reverses in with the help of conventional radar sensors. Shopping done, the owner presses the valet button again and the Leaf navigates its way back to the starting point.
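One plausible way to structure that valet loop is as a simple state machine. The sketch below is only that, a sketch: the state names and trigger flags are our own invention, drawn from the sequence described above rather than from Nissan's software.

```python
from enum import Enum, auto

class ValetState(Enum):
    RECORD_DROPOFF = auto()   # store the drop-off point by GPS
    SEARCH = auto()           # cruise the car park looking for a space
    MEASURE_GAP = auto()      # signal, pull past the space, size up the gap
    REVERSE_IN = auto()       # park using the conventional radar sensors
    PARKED = auto()
    RETURN = auto()           # navigate back to the stored drop-off point

def next_state(state: ValetState, *, space_found=False,
               gap_big_enough=False, owner_recalled=False) -> ValetState:
    if state is ValetState.RECORD_DROPOFF:
        return ValetState.SEARCH
    if state is ValetState.SEARCH and space_found:
        return ValetState.MEASURE_GAP
    if state is ValetState.MEASURE_GAP:
        return ValetState.REVERSE_IN if gap_big_enough else ValetState.SEARCH
    if state is ValetState.REVERSE_IN:
        return ValetState.PARKED
    if state is ValetState.PARKED and owner_recalled:
        return ValetState.RETURN
    return state

# e.g. next_state(ValetState.MEASURE_GAP, gap_big_enough=True) -> REVERSE_IN
```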
Fully equipped, the autonomous Leaf uses four cameras that combine to give a near-360-degree view (the front camera is a hi-res, long-range unit to help read road signs). A front radar sensor reads up to 200m ahead of the car, and a further radar sensor sits at each rear corner (their arrays overlap and extend to 70m). Then there’s the new bit: six laser scanners, positioned front, rear and at each corner. These scanners are the boxy addenda you see on the silver car and have a useful range of 100m. The laser control unit in the rear of the car collates the feeds, then sends signals to the steering actuator, accelerator and brakes.
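Laid out as a configuration table, the suite looks something like the sketch below. The counts and ranges are the figures quoted above; the field names, and the placement of the three cameras other than the front one, are our assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Sensor:
    kind: str
    position: str
    range_m: Optional[float] = None   # None where no range is quoted

LEAF_SENSORS = [
    Sensor("camera (hi-res, long-range, reads road signs)", "front"),
    Sensor("camera", "rear"),          # positions of the other three cameras
    Sensor("camera", "left"),          # are assumed; together they give a
    Sensor("camera", "right"),         # near-360-degree view
    Sensor("radar", "front", 200.0),
    Sensor("radar", "rear-left corner", 70.0),    # corner arrays overlap
    Sensor("radar", "rear-right corner", 70.0),
] + [
    Sensor("laser scanner", position, 100.0)
    for position in ("front", "rear", "front-left corner", "front-right corner",
                     "rear-left corner", "rear-right corner")
]
```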
The Leaf observes lane discipline via laser scanners, passing slower vehicles using the outside lane and signalling appropriately. Unexpected obstacles such as errant pedestrians are recognised by the lasers, too, and if braking alone won’t avoid them, the steering actuators kick in. An ‘SOS’ button pulls the car in then triggers a call to the emergency services. All the while, speed limits are observed via sign recognition.
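The brake-or-steer decision reduces to simple physics: compare the distance needed to stop with the distance to the obstacle. Here's a minimal sketch under a constant-deceleration assumption; the 8 m/s² figure and the function names are ours, not Nissan's.

```python
from typing import Optional

def avoid(obstacle_range_m: float, speed_mps: float,
          max_decel_mps2: float = 8.0) -> str:
    # Distance needed to stop from the current speed: v^2 / (2a).
    stopping_distance_m = speed_mps ** 2 / (2.0 * max_decel_mps2)
    if stopping_distance_m <= obstacle_range_m:
        return "brake"           # braking alone clears the obstacle
    return "brake and steer"     # otherwise the steering actuators kick in

def target_speed_kph(current_limit_kph: float,
                     recognised_sign_kph: Optional[float]) -> float:
    # Sign recognition raises or lowers the speed ceiling as limits change.
    return recognised_sign_kph if recognised_sign_kph is not None else current_limit_kph

print(avoid(obstacle_range_m=15.0, speed_mps=20.0))   # -> "brake and steer"
```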
Working with or without sat-nav assistance, the Leaf identifies and negotiates complex urban scenarios such as US four-way stop sign intersections, acting on the first-stop, first-go convention by recognising road signs and monitoring the activity of other vehicles via laser scanning. Traffic lights are observed, accompanied by a “signal is red/green” voice message to explain the car’s behaviour. The Leaf also pauses behind parked vehicles before passing via the oncoming lane when it’s clear to do so.
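The first-stop, first-go rule itself is almost trivial once the laser scanners have time-stamped everyone's arrival; a minimal sketch of the arbitration, with illustrative names and numbers of our own, is below.

```python
def my_turn(my_arrival_s: float, other_arrivals_s: list) -> bool:
    """First-stop, first-go: proceed only if no other stopped vehicle
    arrived at the four-way intersection before we did."""
    return all(my_arrival_s <= t for t in other_arrivals_s)

print(my_turn(12.4, [12.9, 13.1]))   # -> True: we stopped first, so we go
```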
The passenger experience
Though prone to adopting overly cautious driving lines around obstacles, the autonomous Nissan Leaf isn’t shy about calling on the electric drivetrain’s instant torque for rapid acceleration, and braking is fairly aggressive, too.
The actuated steering inputs are a revelation, though: close your eyes and there could almost be human hands on the wheel, so smooth is the action, even during emergency manoeuvres.