Plugged In: Explaining Electric & Hybrid Vehicle Tech

This month, I want to continue the discussion on autonomous driving.

In the next 10 years, most vehicles on the road will be, in some way, electrified. That could mean hybrid or full electric and, as I stressed in the last issue, training in this technology is essential (the MTA Institute runs a government-backed course in which you could consider enrolling).

In the next 50 years, full autonomy will be the norm and your smartphone will be the controller for everything. An example of how quickly things are moving comes from Tesla, whose new Autopilot feature, called ‘Enhanced Summon’, will allow you to summon your parked car to your location if you are within 150 metres of it.

The level of autonomy is what I want to discuss here.

There are six levels of autonomy – 0 to 5 – as set down by the SAE (Society of Automotive Engineers) in its J3016 standard, summarised below (with a short code sketch after the list).

  • Level 0 – Non-Automated: The driver is in complete control.
  • Level 1 – Assisted: Some level of assistance is provided to support the driver. The operator, however, must constantly supervise these support systems.
  • Level 2 – Partial Automation: The vehicle offers more support systems for the driver, which may include lane centering, adaptive cruise control and automatic emergency braking.
  • Level 3 – Conditional Automation: The vehicle can drive itself under certain conditions, but the driver must be ready to take over when the system requests it.
  • Level 4 – High Automation: In some situations, a vehicle can drive itself without input from the driver.
  • Level 5 – Full Automation: At this level, a vehicle is fully autonomous and not even a steering wheel is required. It is a full-time automated driving system under all conditions.
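
For readers who prefer to see things laid out in code, the sketch below summarises the J3016 levels as a simple Python lookup table. The short descriptions are my own paraphrase, not the official SAE wording.

    # A rough summary of the SAE J3016 levels as a Python dictionary.
    # Descriptions are paraphrased, not official SAE definitions.
    SAE_LEVELS = {
        0: ("Non-Automated", "driver is in complete control"),
        1: ("Assisted", "driver is supported but must supervise constantly"),
        2: ("Partial Automation", "more support systems; driver still supervises"),
        3: ("Conditional Automation", "car drives itself at times; driver must be ready to take over"),
        4: ("High Automation", "car drives itself in some situations without driver input"),
        5: ("Full Automation", "car drives itself under all conditions; no steering wheel needed"),
    }

    def requires_driver_supervision(level):
        # At Level 2 and below, the human must monitor the road at all times.
        return level <= 2

    for level, (name, summary) in SAE_LEVELS.items():
        print(f"Level {level} – {name}: {summary}")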

While there is some belief amongst the public that Level 5 autonomy is now available, that is not the case. At the time of writing, there is no vehicle for sale that has Level 5 autonomous capability.

To achieve full Level 5 autonomy, a high level of sensing and perception has to take place. While several sensing systems are used – including cameras and radar – the major sensor used on most vehicles is LiDAR (Light Detection and Ranging), which uses pulsed laser light to illuminate objects and measures the reflected pulses. It can perceive shapes with high resolution and detect distance with high accuracy.

Interestingly, Tesla does not use LiDAR in its Autopilot system. Instead, Tesla uses eight cameras (providing a 360-degree view of up to 250 metres), 12 ultrasonic sensors, and a forward-facing radar capable of seeing through heavy rain, fog and dust. This system is really semi-autonomous – Level 2 on the SAE J3016 chart – because the driver must supervise it at all times.

LiDAR systems are often the most expensive items on cars with autonomous capabilities – at one point, the bulky, spinning LiDAR units seen on the roofs of autonomous concept cars cost more than the cars on which they were installed. The technology has advanced, however, to the point that prices are coming down considerably, and solid-state systems about the size of two decks of cards can be installed behind a vehicle’s grille. Quanergy, a Californian company that develops LiDAR systems, currently has a solid-state unit available for less than $250.

There are systems that can be compared to LiDAR.

Echo-location, or SONAR, for example, is a system found in the natural world – bats use sonar to navigate around obstacles, sending out a sound wave that, when it echoes back, gives them a three-dimensional picture of their surroundings.

RADAR (Radio Detection and Ranging) uses a similar process with radio waves (a type of electromagnetic radiation).

LiDAR uses the same principle – sending out pulsed laser light and timing how long the reflection takes to return to the unit. A picture of the vehicle’s surroundings is formed from that incoming data.

Laser light has a much shorter wavelength than the radio and sound waves used by radar and sonar and, because of this, LiDAR can create a much more accurate picture of smaller objects – making millions of depth measurements in all directions simultaneously and building a detailed three-dimensional map.
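
To make the timing idea concrete, here is a minimal sketch of the time-of-flight arithmetic behind LiDAR (and radar) ranging; the function name and the sample figure are illustrative only.

    # Time-of-flight ranging: the pulse travels out to the object and back,
    # so the one-way distance is half the round trip.
    SPEED_OF_LIGHT = 299_792_458  # metres per second

    def distance_from_echo(round_trip_seconds):
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    # A laser pulse that returns after 500 nanoseconds has hit something
    # roughly 75 metres away.
    print(round(distance_from_echo(500e-9), 1))  # prints 74.9

Sonar works on exactly the same arithmetic, only with the speed of sound (roughly 343 metres per second in air) in place of the speed of light, which is why a bat’s echoes return in milliseconds rather than nanoseconds.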

To achieve Level 5 autonomy, a great deal of processing power and testing will be needed, and while LiDAR will not be the only sensor used to attain fully autonomous vehicles, it will likely be the major piece of the puzzle.

The automotive landscape is changing rapidly and it can, perhaps, be compared to the era of the birth of the motor car when there were dozens of small companies developing and building their own vehicles. As it was then, so it is now, and we have many start-up companies building electric vehicles, electric controllers, autonomous technology and the like. It’s an exciting time to be involved in the industry and the opportunities as this new era gets underway are legion. Training and preparing for these new technologies is important if you want to take advantage of those opportunities.

Thanks for reading. In my next column I will be looking at vehicle-to-vehicle (V2V), vehicle-to-everything (V2X), and vehicle-to-infrastructure (V2I) technology.

If you have any questions about hybrid, electric vehicle or autonomous vehicle technology, you can email me at [email protected]

5 April 2019