Semantic fusion of exteroceptive sensor information with digital road map for autonomous navigation application
No. 126 · Access: participants only · Updated: 2021-12-03 10:14:29


Abstract
For autonomous navigation applications, real-time modeling of the surrounding environment is a key component. Many approaches focus on interpreting exteroceptive sensor data into obstacle information, with well-tailored sensor models that enable this transformation and handle sensor uncertainties. However, obstacle information alone is far from adequate for autonomous perception: the ego vehicle also needs semantic information to understand the accessibility of the space. In the literature, methods have been proposed to encode prior map information into grids; these so-called semantic grid maps offer a new way to represent prior information. In this article, we propose a grid-based evidential approach to fuse exteroceptive sensor information with prior map knowledge. Each source of information is transformed into a specially designed evidential grid, and the combination is carried out layer-wise based on belief function theory. We report results obtained on public roads with real-time software.
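The layer-wise combination described above typically relies on Dempster's rule of combination from belief function theory, applied independently in each grid cell. The following is a minimal sketch of that idea, assuming a two-element frame of discernment {Free, Occupied} per cell; the layer names, mass assignments, and grid contents are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def dempster_combine(m1, m2):
    """Combine two per-cell mass functions over {F}, {O}, {F,O}.

    Each argument is a tuple (mF, mO, mFO) of equally shaped arrays,
    where mFO is the mass assigned to ignorance {Free, Occupied}.
    """
    mF1, mO1, mFO1 = m1
    mF2, mO2, mFO2 = m2
    # Conflict mass: one source says Free where the other says Occupied
    K = mF1 * mO2 + mO1 * mF2
    norm = 1.0 - K
    # Dempster's rule: sum products of focal sets whose intersection
    # is the target set, normalized by the non-conflicting mass
    mF = (mF1 * mF2 + mF1 * mFO2 + mFO1 * mF2) / norm
    mO = (mO1 * mO2 + mO1 * mFO2 + mFO1 * mO2) / norm
    mFO = (mFO1 * mFO2) / norm
    return mF, mO, mFO

# Hypothetical 2x2 example: a sensor-derived layer and a prior-map layer.
# Cells in the bottom row of the sensor layer are unobserved (full ignorance).
sensor = (np.array([[0.7, 0.1], [0.0, 0.0]]),   # m({Free})
          np.array([[0.1, 0.8], [0.0, 0.0]]),   # m({Occupied})
          np.array([[0.2, 0.1], [1.0, 1.0]]))   # m({Free, Occupied})
prior  = (np.array([[0.6, 0.0], [0.6, 0.0]]),
          np.array([[0.0, 0.5], [0.0, 0.5]]),
          np.array([[0.4, 0.5], [0.4, 0.5]]))

mF, mO, mFO = dempster_combine(sensor, prior)
```

Note that where the sensor layer is fully ignorant, the combined mass reduces to the prior-map mass, which is the behavior that makes an evidential grid a natural container for prior knowledge.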
Keywords
CICTP
Presenter
Chunlei Yu
Tsinghua University

Authors
Chunlei Yu Tsinghua University
Important Dates
  • Conference dates: December 17–20, 2021
  • December 16, 2021: presentation submission deadline
  • December 24, 2021: registration deadline

Organizers
Chinese Overseas Transportation Association
Chang'an University