Case Study

CO2 Avoidance or Resource Depletion: How Ecological Is Autonomous Driving?

Magazine #2 | Summer 2023

Self-driving minibuses are seen by many as a possible solution for ensuring rural mobility and keeping people in the countryside connected in an environmentally friendly way – thus enabling rural residents to be part of the transition to a carbon-neutral economy. However, we need a realistic assessment of the extent to which the AI built into these vehicles, and the resources they consume, actually benefit the environment.

Sensors in Autonomous Vehicles

Autonomous vehicles rely on a range of sensors to detect their surroundings and make decisions based on that information:

Cameras

Object detection and classification, scene understanding, localization and other functions rely on cameras. Typically, image sensors capture the environment, producing data that is then processed by computer vision algorithms.

Ultrasonic Sensors

These sensors use high-frequency sound waves to detect objects and determine how close they are. They are usually inexpensive and compact, and are used for parking assistance, object detection and obstacle avoidance.
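
A minimal sketch of the underlying time-of-flight calculation; the speed of sound and the echo time are illustrative values:

```python
# Minimal sketch: estimating distance from an ultrasonic echo (assumed values).
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 °C

def distance_from_echo(echo_time_s: float) -> float:
    """The pulse travels to the obstacle and back, so the
    one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# Example: a 5.8 ms round trip corresponds to roughly 1 m.
print(f"{distance_from_echo(0.0058):.2f} m")
```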

Radar (Radio Detection and Ranging)

This sensor uses radio waves to detect objects and their distance in addition to measuring speeds and angles. The radar function is reliable even in poor weather conditions. It can be used for obstacle detection and tracking, lane detection or vehicle tracking.
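
The speed measurement rests on the Doppler effect; a minimal sketch, assuming a typical 77 GHz automotive carrier frequency and an illustrative frequency shift:

```python
# Minimal sketch: relative speed from the Doppler shift of a radar return.
C = 3.0e8          # speed of light in m/s
F_CARRIER = 77e9   # assumed automotive radar carrier frequency in Hz

def relative_speed(doppler_shift_hz: float) -> float:
    """Radial speed of the reflecting object; the factor 2 appears
    because the wave travels to the target and back."""
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A shift of about 5.1 kHz corresponds to roughly 10 m/s (36 km/h).
print(f"{relative_speed(5.13e3):.1f} m/s")
```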

LiDAR (Light Detection and Ranging)

This remote sensing technology uses laser light to measure distances quickly and accurately and to create a high-resolution 3D map of the environment for obstacle detection and navigation.
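
A minimal sketch of how individual laser returns (range plus beam angles) become 3D points in the vehicle frame; the ranges and angles are illustrative:

```python
# Minimal sketch: turning LiDAR range measurements into 3D points
# (spherical-to-Cartesian conversion; values are illustrative).
import numpy as np

def to_point_cloud(ranges, azimuths, elevations):
    """Each return (range r, azimuth a, elevation e) becomes an (x, y, z) point."""
    r = np.asarray(ranges)
    a = np.asarray(azimuths)
    e = np.asarray(elevations)
    x = r * np.cos(e) * np.cos(a)
    y = r * np.cos(e) * np.sin(a)
    z = r * np.sin(e)
    return np.stack([x, y, z], axis=1)

cloud = to_point_cloud([12.0, 7.5], [0.1, -0.3], [0.02, 0.0])
print(cloud.shape)  # (2, 3) – two points in the vehicle frame
```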

GPS (Global Positioning System)

The satellite-based navigation system provides accurate location and time information. In autonomous vehicles, it enables global localization for navigation and mapping.
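
A minimal sketch of how two GPS fixes translate into a distance on the ground, using the standard haversine formula; the coordinates are made up:

```python
# Minimal sketch: great-circle distance between two GPS fixes.
import math

EARTH_RADIUS_M = 6_371_000

def haversine(lat1, lon1, lat2, lon2):
    """Haversine formula for the distance between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Two made-up fixes roughly 150 m apart.
print(f"{haversine(52.5200, 13.4050, 52.5210, 13.4065):.0f} m")
```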

Inertial Measurement Units (IMUs)

IMUs are sensors that measure acceleration and angular velocity. They are used in self-driving vehicles to detect and control movement.
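
A minimal dead-reckoning sketch in 2D, integrating assumed acceleration and yaw-rate readings over a single time step:

```python
# Minimal sketch: 2D dead reckoning from IMU readings (illustrative values).
import math

def integrate_imu(pose, velocity, accel, yaw_rate, dt):
    """One integration step: update heading, speed and position."""
    x, y, heading = pose
    heading += yaw_rate * dt
    velocity += accel * dt
    x += velocity * math.cos(heading) * dt
    y += velocity * math.sin(heading) * dt
    return (x, y, heading), velocity

pose, v = (0.0, 0.0, 0.0), 5.0          # start: 5 m/s, driving straight ahead
pose, v = integrate_imu(pose, v, accel=0.2, yaw_rate=0.05, dt=0.1)
print(pose, v)
```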

Encoders

These sensors measure the rotational position of the wheels to provide information about the vehicle’s movement and position.
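
A minimal sketch of wheel odometry; the tick resolution and wheel radius are assumed example values:

```python
# Minimal sketch: distance travelled from wheel encoder ticks (assumed values).
import math

TICKS_PER_REVOLUTION = 1024
WHEEL_RADIUS_M = 0.35

def distance_from_ticks(ticks: int) -> float:
    """Convert encoder ticks into metres via wheel revolutions and circumference."""
    revolutions = ticks / TICKS_PER_REVOLUTION
    return revolutions * 2 * math.pi * WHEEL_RADIUS_M

# 4,650 ticks correspond to roughly 10 m of travel.
print(f"{distance_from_ticks(4650):.2f} m")
```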

How Much Hardware Is Needed?

In addition to sensors, self-driving minibuses require further hardware. The average power consumption of such a bus is around 550 Wh/km, of which the hardware components account for only around 5 percent. However, producing those components is responsible for more than a quarter of an autonomous minibus's total CO2 emissions.
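
A quick back-of-the-envelope check of these rounded figures:

```python
# Rough check of the figures above (rounded values from the text).
consumption_wh_per_km = 550      # average consumption of the minibus, in Wh/km
hardware_share = 0.05            # hardware components: roughly 5 percent

hardware_wh_per_km = consumption_wh_per_km * hardware_share
print(f"{hardware_wh_per_km:.1f} Wh/km for the hardware components")  # ~27.5 Wh/km
```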

AI in Autonomous Vehicles

AI systems are used in machine perception, localization, trajectory planning and control.

Machine Perception

For a self-driving minibus to navigate safely, its sensors must scan the surroundings. With the help of Machine Learning, the large amount of sensor data can be processed in real time. Algorithms are trained on large datasets of annotated imagery, LiDAR point clouds and radar data to detect and classify objects. Convolutional neural networks (CNNs) can identify vehicles, other road users, road signs, lanes and traffic lights. The vehicle uses the high-resolution map of the surroundings created from the data collected by its sensors to plan its route.
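
A minimal sketch of camera-based object detection with a pretrained CNN detector, assuming a recent torchvision installation; a production system would use models trained on automotive data and run on dedicated inference hardware:

```python
# Minimal sketch: CNN object detection on a single camera frame.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Placeholder camera frame: 3-channel image, values in [0, 1].
frame = torch.rand(3, 480, 640)

with torch.no_grad():
    detections = model([frame])[0]

# Keep only confident detections (bounding boxes, class labels, scores).
keep = detections["scores"] > 0.8
print(detections["boxes"][keep], detections["labels"][keep])
```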

Localization

Simultaneous localization and mapping (SLAM) algorithms are often used. Data from the different sensors is processed in real time to create a map of the surroundings, on which the estimated position and orientation of the vehicle is recorded. Deep Learning algorithms such as Convolutional Neural Networks and Recurrent Neural Networks (RNNs) are increasingly being relied on for localization.
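
This is not a full SLAM system, but a minimal sketch of the predict-and-correct idea behind it: a motion update from odometry or IMU data is blended with an absolute position fix such as GPS:

```python
# Minimal sketch of the predict/correct loop used in localization
# (a simplified, Kalman-style blend; real SLAM also maintains the map).
import numpy as np

def predict(pose, motion):
    """Motion update: apply the measured displacement to the last estimate."""
    return pose + motion

def correct(pose, measurement, gain=0.3):
    """Measurement update: pull the prediction towards the position fix."""
    return pose + gain * (measurement - pose)

pose = np.array([0.0, 0.0])
pose = predict(pose, motion=np.array([1.0, 0.2]))       # odometry says we moved
pose = correct(pose, measurement=np.array([1.1, 0.1]))  # position fix disagrees slightly
print(pose)  # blended estimate
```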

Trajectory Planning

To ensure that an autonomous vehicle follows a safe and efficient driving route, algorithms dynamically process information about the surroundings based on mathematical modeling. Machine Learning algorithms are increasingly being used to improve trajectory planning. Deep Learning algorithms such as CNNs and RNNs can recognize patterns in sensor data such as camera images and point clouds from LiDAR sensors to estimate the position and speed of other road users.
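
A minimal sketch of the simplest such estimate, a constant-velocity extrapolation of another road user's position from two successive detections; learned models replace this with far richer motion patterns:

```python
# Minimal sketch: predicting another road user's near-future position
# under a constant-velocity assumption (illustrative positions).
import numpy as np

def predict_position(p_prev, p_now, dt_observed, dt_ahead):
    velocity = (p_now - p_prev) / dt_observed   # estimated speed vector
    return p_now + velocity * dt_ahead          # extrapolated position

p_prev = np.array([10.0, 2.0])   # detected one frame ago
p_now = np.array([10.8, 2.0])    # detected now, 0.1 s later
print(predict_position(p_prev, p_now, dt_observed=0.1, dt_ahead=1.0))
# -> [18.8, 2.0]: the planner keeps the bus clear of this point.
```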

Control

The planned trajectory still needs to be executed safely and efficiently. In the control phase, the position, speed and orientation of the vehicle must be determined using sensor data. In trajectory following, control algorithms ensure that the vehicle follows the trajectory as closely as possible, taking into account the dynamics of the environment. Actuators such as the steering, throttle and brakes control the vehicle’s movement. For autonomous control, a model of the vehicle is often used to identify important process variables and the dynamic relationships between them (“model predictive control”).
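
A minimal sketch of the receding-horizon idea behind model predictive control, using a point-mass speed model and a brute-force search over a handful of candidate inputs; real controllers use richer vehicle models and proper optimizers:

```python
# Minimal sketch: receding-horizon control of vehicle speed.
# Simulate a simple model over a short horizon for each candidate input
# sequence, pick the cheapest one, apply only its first step, then repeat.
import itertools

DT, HORIZON = 0.2, 5
REF_SPEED = 8.0                     # target speed in m/s (illustrative)

def rollout_cost(speed, accels):
    """Predict speeds under the candidate inputs and penalize deviation and effort."""
    cost = 0.0
    for a in accels:
        speed += a * DT             # point-mass model: v' = v + a*dt
        cost += (speed - REF_SPEED) ** 2 + 0.1 * a ** 2
    return cost

def mpc_step(speed, candidates=(-1.0, 0.0, 1.0)):
    best = min(itertools.product(candidates, repeat=HORIZON),
               key=lambda seq: rollout_cost(speed, seq))
    return best[0]                  # apply only the first planned input

print(mpc_step(speed=6.5))          # -> 1.0: accelerate towards 8 m/s
```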

How Resource-Intensive Must the Mobility Transition Be?

Autonomous driving is only possible with the use of many sensors and algorithms. Self-driving minibuses cannot be sold as environmentally friendly lodestars of the mobility revolution without calculating the resource consumption of all their components. Will they contribute to a reduction in emissions by encouraging rural residents to rely less on their cars and more on convenient public transportation options? Or will the resource consumption of minibuses actually produce even more emissions? Might there be simpler means available for advancing the mobility revolution in rural areas? Answers to these questions are not yet available because there hasn’t been an honest debate so far about emissions reductions and the resource consumption of the technology required by autonomous minibuses.

ANDREAS MEYER

Research Associate at the Distributed Artificial Intelligence Lab at TU Berlin

He is researching applications of Machine Learning methods for load forecasting and the sustainability of AI systems.