The car as a sensor: Crowd sensing and cooperative learning for automated driving

May 01, 2015


As we move toward automated driving, it will be increasingly necessary to share real-time infrastructure information and digital map data with other road users, in addition to using it within an individual vehicle. The shared data will supplement classic map data and provide drivers with a comprehensive picture of the road network.

Automated driving will also necessitate high data precision, as well as information about relevance and validity. Conventional navigation systems – which are not cloud connected and only updated occasionally when the vehicle goes in for a service – cannot deliver this.

Crowd sensing

In the future, therefore, it will be critical for data from different traffic participants to be bundled (distributed perception and mapping). Sensor data won’t just be collected for individual vehicles, but will also be made available to all road users. Vehicles will communicate with each other and exchange data with other vehicles in their immediate environment. Vehicle data will also be stored and evaluated in the cloud for purposes such as map optimization when a section of road is surveyed. Many drivers are already participants in this process of indirect data collection because they make data available via their smartphones. One example of this is data on traffic jams used for traffic reports.

The quality of the information gained from the cloud depends, among other things, on the volume of data: the more vehicles providing information, the more precisely the environment can be mapped. Traffic sign recognition is a good example. Even the latest navigation systems don't have speed limit information for all roads; they usually only have it for main roads, and they cannot reflect recently implemented changes to speed limits. This information gap can be closed by a road sign recognition function in the front camera. However, the net recognition rate of a single camera isn't ideal, and in poor weather or bad lighting conditions it is even lower.

On the other hand, if a fleet of several hundred thousand vehicles regularly sends its data to a central cloud repository for evaluation, the net recognition rate improves vastly. This is called “crowd sensing,” and the sheer volume of data provides a far more precise and up-to-date picture than any map provider’s special survey vehicles. In addition to traffic sign data, information about the route that is relevant for transmission and drive strategies will be necessary for hybrid and electric vehicles, and cars with a proactive chassis will need data on road conditions such as slickness or iciness. For all types of vehicles, additional information about curves, lanes, and traffic routing will be necessary (Figure 1).
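The statistical intuition behind crowd sensing can be sketched with a simple majority-vote model. Assuming (hypothetically) that each vehicle's camera reads a sign correctly with probability 0.9 and that reads are independent, aggregating reports from many vehicles drives the fused recognition rate far above any single camera's:

```python
from math import comb

def majority_vote_accuracy(n: int, p: float) -> float:
    """Probability that a strict majority of n independent reports is
    correct, assuming each report is correct with probability p."""
    # Sum the binomial probabilities of all outcomes in which more than
    # half of the n reports agree on the correct reading.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# One camera read at 90% vs. 25 aggregated, independent reports.
print(majority_vote_accuracy(1, 0.9))
print(majority_vote_accuracy(25, 0.9))
```

With 25 reports the fused accuracy is already within a tiny fraction of a percent of 100 percent, which is why a fleet of hundreds of thousands of vehicles outperforms any single survey vehicle.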


Figure 1: Many vehicles have sensors that record data on speed, incline, curves and other information that is stored in the cloud. This information is used to supplement map material, thereby providing a more comprehensive picture of the worldwide road network.




Map information must be supplemented by metadata so that the vehicle can check the data's validity. For example, today's maps include neither the age of the data nor an aging model (i.e., information about how old data can be while still being classified as reliable). All of this is necessary for autonomous vehicles because different map features age at different rates. For example, data on black ice sensed by the vehicle won't be valid after several hours or even minutes. In contrast, information about a tunnel will probably still be valid even after several years.
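A minimal sketch of such an aging model, with feature classes and maximum trusted ages that are purely illustrative assumptions, might look like this:

```python
from datetime import datetime, timedelta

# Hypothetical aging model: a maximum trusted age per feature class.
# The classes and time spans below are illustrative assumptions only.
MAX_AGE = {
    "black_ice": timedelta(minutes=30),   # road-condition data ages in minutes
    "speed_limit": timedelta(days=180),   # signs change occasionally
    "tunnel": timedelta(days=3650),       # structural features are long-lived
}

def is_valid(feature_class: str, recorded_at: datetime, now: datetime) -> bool:
    """Check a map feature's validity against its aging model."""
    return now - recorded_at <= MAX_AGE[feature_class]

now = datetime(2015, 5, 1, 12, 0)
print(is_valid("black_ice", now - timedelta(hours=2), now))  # stale report
print(is_valid("tunnel", now - timedelta(days=365), now))    # still valid
```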

From the vehicle to the cloud

A modern, well-equipped vehicle generates several gigabytes of sensor data every minute. Not all of this data can be transferred due to limited network capacity, so the volume has to be reduced. To achieve this, the sensor data is first compared in the vehicle with the existing map data. In the case of traffic sign recognition, for example, the system checks whether the detected sign has an identical counterpart in the map material. If it doesn't, the information is uploaded to the cloud.

Even if there is a data match, it might still make sense to upload the information so that the map material can be validated. The decision on whether to transfer the data or not depends on the aging model, the objective being to transfer as little data as possible and as much as necessary.

Once the relevant data is in the cloud repository, it has to be preprocessed, grouped, sorted, and interpreted. In the preprocessing stage, obviously incorrect features are discarded. Then the information on individual map features is grouped. When 100 vehicles provide information on a specific road sign, there will initially be 100 data records that deviate slightly as to the sign's precise location or the perceived maximum speed limit. Data mining is then used to establish whether the 100 data records all relate to exactly the same sign, where the sign is located, and what kind of sign it is (Figure 2). Then a calculation is performed to see whether the feature can be assigned to existing map data. After that, the system interprets the data to establish whether the sign is new, and whether the aging model has to be adapted. This can happen, for instance, with variable message road signs.
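The grouping step can be illustrated with a toy fusion routine: records that lie within a small distance of each other are assumed to describe the same physical sign, its position is averaged, and its value is decided by majority vote. The 30 m threshold and the flat-earth distance approximation are simplifying assumptions:

```python
from collections import Counter
from statistics import mean

def fuse_records(records, max_dist_m=30.0, deg_per_m=1 / 111_000):
    """Cluster sign reports by proximity, then fuse each cluster."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            ref = cluster[0]
            if (abs(rec["lat"] - ref["lat"]) < max_dist_m * deg_per_m and
                    abs(rec["lon"] - ref["lon"]) < max_dist_m * deg_per_m):
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    # Fuse each cluster: mean position, majority vote on the speed limit.
    return [{
        "lat": mean(r["lat"] for r in c),
        "lon": mean(r["lon"] for r in c),
        "limit": Counter(r["limit"] for r in c).most_common(1)[0][0],
    } for c in clusters]

# Nine reports of one sign: seven read 100 km/h, two misread 120 km/h.
records = ([{"lat": 48.1351, "lon": 11.5820, "limit": 100}] * 7 +
           [{"lat": 48.1352, "lon": 11.5821, "limit": 120}] * 2)
print(fuse_records(records))
```

The deviating minority is outvoted, so the fused map entry carries the correct limit of 100 km/h.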


Figure 2: Several vehicles record and transfer data to the cloud about a specific road sign. These data records, which vary in terms of exact location and perceived maximum speed limit, are then analyzed by data mining. The objective is to update or verify existing sign information and to possibly include new road signs in the map material.




From the cloud to the ECU

When the information in the cloud has been analyzed, it has to be transferred back to the vehicle. Various protocols are necessary for the transfer process. When incremental updates are made over relatively long time intervals, the Navigation Data Standard (NDS) format is used[1]. There are also several proprietary formats, as well as the OpenLR Standard for shorter interval updates[2].

Data dissemination is modularized on different levels to separate fast-changing information from rarely updated information. This reduces bandwidth requirements during data transfer and simplifies the update process. At the same time, it ensures that the in-vehicle systems always have the latest information available. Combining this information with metadata for relevance also makes it possible to assess data reliability.
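As an illustration of this layering, one could version each layer separately so a vehicle only re-fetches the layers that have actually changed. The layer names, intervals, and version scheme below are assumptions for the sketch, not part of NDS or OpenLR:

```python
# Illustrative layering of map content by change rate. Splitting layers
# means a client only re-downloads the layers whose version advanced,
# which reduces bandwidth while keeping fast-changing data current.
LAYERS = {
    "base_geometry": {"update_interval_days": 90},    # roads, tunnels
    "regulatory":    {"update_interval_days": 7},     # speed limits, signs
    "live":          {"update_interval_days": 0.01},  # hazards, e.g. black ice
}

def layers_to_fetch(client_versions: dict, server_versions: dict) -> list:
    """Return only the layers whose server-side version is newer."""
    return [name for name, v in server_versions.items()
            if client_versions.get(name, -1) < v]

print(layers_to_fetch({"base_geometry": 4, "regulatory": 11, "live": 900},
                      {"base_geometry": 4, "regulatory": 12, "live": 905}))
```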

The information from the cloud is first integrated into the navigation system's map material. Then it is transferred to the relevant ECUs so that the assistance functions can use it via a protocol such as the Advanced Driver Assistance Systems Interface Specification (ADASIS). Anticipatory driving systems, such as the Elektrobit electronic horizon solution, use an ADASIS Reconstructor to receive the data sent by the ADASIS Provider (within the navigation system) and sort it correctly into the ECU's data structure (Figure 3). The ADASIS protocol ensures that the components interact properly.


Figure 3: Analyzed data is integrated into the navigation system's map material and made available to driver assistance features via a transceiver mechanism (provider/reconstructor).




Authenticity and data protection

The technical function of crowd sensing must comply with data protection legislation. To this end, the sensor data from vehicles is initially reduced and compressed, and then it is typically anonymized, signed, and encrypted before being transferred to the cloud.

Anonymity protects the drivers' privacy. Even after anonymization, however, the cloud must be able to assign consecutive messages of the same type to a particular sender. For example, if the system receives the message "vehicle stationary" several times within a short period, it has to know whether the information relates to one vehicle or several. If it is sent by several vehicles, it can be assumed that they are in a traffic jam. The vehicle's identity is irrelevant, as is whether different information types relate to the same vehicle.
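One common way to square this circle, sketched below under the assumption of short-lived random pseudonyms (the class and token scheme are illustrative, not from any cited standard), is for each vehicle to report under a session token that is unlinkable to its identity but stable enough to count distinct senders:

```python
import secrets

class VehicleSession:
    """Pseudonymous reporting session: the token lets the backend count
    distinct senders without learning any vehicle's identity."""
    def __init__(self):
        self.token = secrets.token_hex(8)  # random, unlinkable to the VIN

    def report(self, event: str) -> dict:
        return {"sender": self.token, "event": event}

a, b = VehicleSession(), VehicleSession()
reports = [a.report("vehicle_stationary"),
           a.report("vehicle_stationary"),  # same vehicle, repeated message
           b.report("vehicle_stationary")]

# Three messages, but only two distinct senders: likely two stationary
# vehicles rather than three, and neither is identifiable.
print(len({r["sender"] for r in reports}))
```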

However, there are some exceptions. For example, if the vehicle sends a technical defect message, it needs to be identified correctly by its manufacturer. So, with all information, it is necessary to find a balance between anonymity and identification. The data is signed so that its authenticity and credibility can be assessed and verified. It is also encrypted for secure transmission with standard encryption methods, such as symmetric and asymmetric encryption.
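The signing step can be sketched with a message authentication code. This is a deliberately minimal example using a symmetric HMAC and a placeholder key; a production system would typically use asymmetric signatures and add transport encryption, which are omitted here:

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-not-for-production"  # placeholder, assumption

def sign(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiver can verify authenticity."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "mac": tag}

def verify(message: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

msg = sign({"event": "speed_limit", "value": 80})
print(verify(msg))              # authentic message passes
msg["payload"]["value"] = 130   # tampered in transit
print(verify(msg))              # signature check now fails
```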

Self-learning systems

Many vehicle manufacturers already store sensor data in the cloud. However, they are still having problems using it efficiently. Their analyses tend to focus on specific functions and are performed manually. In the future, the data to be collected and evaluated will become increasingly complex. In order to exploit the potential of sensors, it will be necessary to increase the degree of automation along the data processing chain. Suitable systems will cover the entire process, from data mining and evaluation to data transfer and use. The vehicle manufacturers will retain ownership of the data at all times.

We are still in the very early stages of vehicle sensor data use. Growth in the number of driver assistance functions in the mid- and lower-price segment vehicles will cause the volume of analyzable data to increase over the next few years. The technologies described allow data from the cloud to be used for both comfort and safety functions. Crowd-sensing information will provide a precise picture of the road network, which will be indispensable as the basis for automated driving.


[1] Navigation Data Standard.

[2] OpenLR.


Dr. Michael Reichel is Senior Manager, Head of Technology and Innovation, Driver Assistance at Elektrobit Automotive.
