TY - JOUR
T1 - High-resolution radar road segmentation using weakly supervised learning
AU - Orr, Itai
AU - Cohen, Moshik
AU - Zalevsky, Zeev
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Nature Limited.
PY - 2021/3
Y1 - 2021/3
N2 - Autonomous driving has recently gained considerable attention due to its disruptive potential and impact on the global economy; however, these high expectations are hindered by strict safety requirements for redundant sensing modalities that are each able to independently perform complex tasks to ensure reliable operation. At the core of an autonomous driving algorithmic stack is road segmentation, which is the basis for numerous planning and decision-making algorithms. Radar-based methods fail in many driving scenarios, mainly because various common road delimiters barely reflect radar signals, coupled with a lack of analytical models for road delimiters and the inherent limitations in radar angular resolution. Our approach is based on radar data in the form of a two-dimensional complex range-Doppler array as input to a deep neural network (DNN) that is trained to semantically segment the drivable area using weak supervision from a camera. Furthermore, guided backpropagation was utilized to analyse radar data and design a novel perception filter. Our approach creates the ability to perform road segmentation in common driving scenarios based solely on radar data, and we propose to utilize this method as an enabler for redundant sensing modalities for autonomous driving.
AB - Autonomous driving has recently gained considerable attention due to its disruptive potential and impact on the global economy; however, these high expectations are hindered by strict safety requirements for redundant sensing modalities that are each able to independently perform complex tasks to ensure reliable operation. At the core of an autonomous driving algorithmic stack is road segmentation, which is the basis for numerous planning and decision-making algorithms. Radar-based methods fail in many driving scenarios, mainly because various common road delimiters barely reflect radar signals, coupled with a lack of analytical models for road delimiters and the inherent limitations in radar angular resolution. Our approach is based on radar data in the form of a two-dimensional complex range-Doppler array as input to a deep neural network (DNN) that is trained to semantically segment the drivable area using weak supervision from a camera. Furthermore, guided backpropagation was utilized to analyse radar data and design a novel perception filter. Our approach creates the ability to perform road segmentation in common driving scenarios based solely on radar data, and we propose to utilize this method as an enabler for redundant sensing modalities for autonomous driving.
UR - http://www.scopus.com/inward/record.url?scp=85100269121&partnerID=8YFLogxK
U2 - 10.1038/s42256-020-00288-6
DO - 10.1038/s42256-020-00288-6
M3 - Article
AN - SCOPUS:85100269121
SN - 2522-5839
VL - 3
SP - 239
EP - 246
JO - Nature Machine Intelligence
JF - Nature Machine Intelligence
IS - 3
ER -