Mapping of linear road features with the inverse visual detector observation model

Oleg S. Shipit’ko
Institute for Information Transmission Problems of the Russian Academy of Sciences (Kharkevich Institute), Junior Research Scientist, 19-1, Bolshoy Karetny per., Moscow, 127051, Russia, tel.: +7(964)509-27-21

Anatoly E. Kabakov
Kharkevich Institute, Intern Research Scientist, 19-1, Bolshoy Karetny per., Moscow, 127051, Russia, tel.: +7(985)305-94-62


Received 16 June 2021

Abstract
The paper proposes an algorithm for mapping linear features detected on the roadway: road marking lines, curbs, and road boundaries. The algorithm is based on a mapping method with an inverse observation model; the proposed inverse model accounts for the spatial error of the visual detector of linear features. The influence of the model's parameters on the resulting mapping quality was studied. The mapping algorithm was tested on data recorded from an autonomous vehicle while driving at a test site, and its quality was assessed with several metrics known from the literature. In addition, mapping was treated as a binary classification problem, in which each map cell either does or does not contain the desired feature, and the ROC curve and the AUC-ROC metric were used for assessment. As a naive baseline, a map was built containing all detected linear features without any additional filtering. For the map built from the raw data the AUC-ROC was 0.75, while the proposed algorithm reached 0.81. The experimental results confirm that the proposed algorithm effectively filters noise and false-positive detections of the detector, which demonstrates the applicability of the algorithm and the inverse observation model to practical problems.
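The mapping scheme described in the abstract follows the classical occupancy-grid update with an inverse observation model: each detection raises the evidence (log-odds) of the cells it covers, and repeated passes without a detection lower it, so one-off false positives of the detector are filtered out over time. Below is a minimal illustrative sketch of this idea in pure Python; the grid size, the log-odds increments `L_HIT`/`L_MISS`, and the simulated observations are assumptions for illustration, not the values or the exact model used in the paper.

```python
import math

# Log-odds occupancy grid for linear road features (marking lines, curbs).
# Inverse observation model idea: a detection increases the log-odds of
# the observed cells; observing a cell without a detection decreases it.
L_HIT, L_MISS, L0 = 0.85, -0.4, 0.0  # illustrative log-odds increments

def make_grid(width, height):
    """Grid initialized to the prior log-odds L0 (probability 0.5)."""
    return [[L0] * width for _ in range(height)]

def update_cell(grid, x, y, detected):
    """Apply the inverse-model log-odds update to one observed cell."""
    grid[y][x] += L_HIT if detected else L_MISS

def probability(log_odds):
    """Convert accumulated log-odds back to occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

grid = make_grid(5, 5)

# A marking line along row 2 is re-detected on five passes...
for _ in range(5):
    for x in range(5):
        update_cell(grid, x, 2, detected=True)

# ...while a spurious detection at (0, 0) appears once and is then
# contradicted by five later observations of the same cell.
update_cell(grid, 0, 0, detected=True)
for _ in range(5):
    update_cell(grid, 0, 0, detected=False)

p_line = probability(grid[2][2])   # consistently re-detected feature
p_noise = probability(grid[0][0])  # filtered false positive
print(round(p_line, 3), round(p_noise, 3))  # prints: 0.986 0.24
```

Thresholding such a probability grid per cell is also what makes the binary-classification evaluation in the abstract natural: sweeping the threshold yields the ROC curve, and AUC-ROC summarizes it.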

Key words
Linear features, mapping, inverse observation model, road map, autonomous vehicle, digital road map.

DOI
https://doi.org/10.31776/RTCJ.9307

Bibliographic description
Shipit'ko, O. and Kabakov, A., 2021. Mapping of linear road features with the inverse visual detector observation model. Robotics and Technical Cybernetics, 9(3), pp.214-224.

UDC identifier:
004.942

References

  1. Abramov, M.P. et al., 2019. Sistema pozitsionirovaniya vnutri zdaniy mobil'noy robototekhnicheskoy platformy na osnove detektsii kraev [Positioning system inside buildings of a mobile robotic platform based on edge detection]. Sensory Systems, 33(1), pp.30-43. DOI: 10.1134/S0235009219010025. (in Russian).
  2. Hao Zhu et al., 2014. A path planning algorithm based on fusing lane and obstacle map. In: Proceedings of 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), pp.1442-1448. DOI: 10.1109/ITSC.2014.6957889.
  3. Kibalov, V. and Shipit’ko, O., 2020. Safe speed control and collision probability estimation under ego-pose uncertainty for autonomous vehicle. In: Proceedings of 23rd International IEEE Conference on Intelligent Transportation Systems (ITSC), pp.1–6. DOI: 10.1109/ITSC45102.2020.9294531.
  4. Poggenhans, F. et al., 2018. A high-definition map framework for the future of automated driving. In: Proceedings of 21st International IEEE Conference on Intelligent Transportation Systems (ITSC), pp.1672-1679. DOI: 10.1109/ITSC.2018.8569929.
  5. SAE International, 2014. Ground Vehicle Standard J3016_201609. Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles. Pp.1-16.
  6. Barandiarán, J. et al., 2020. Automated Annotation of Lane Markings Using LIDAR and Odometry. IEEE Transactions on Intelligent Transportation Systems, pp.1-11. DOI: 10.1109/TITS.2020.3031921.
  7. Elfes, A., 1991. Occupancy grids: A probabilistic framework for robot perception and navigation. Ph.D. in Electrical and Computing Engineering. Carnegie Mellon University, Pittsburgh.
  8. Moravec, H.P., 1989. Sensor fusion in certainty grids for mobile robots. In: Sensor devices and systems for robotics. Berlin: Springer, Heidelberg, pp.253-276. DOI: 10.1007/978-3-642-74567-6_19.
  9. Moravec, H. and Elfes, A., 1985. High resolution maps from wide angle sonar. In: Proceedings of IEEE international conference on robotics and automation, v.2, pp.116-121. DOI: 10.1109/ROBOT.1985.1087316.
  10. Thrun, S., 2003. Learning occupancy grid maps with forward sensor models. Autonomous robots, 15, pp.111-127. DOI: 10.1023/A:1025584807625.
  11. Thrun, S., 2001. Learning occupancy grids with forward models. In: Proceedings of 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems. Expanding the Societal Role of Robotics in the Next Millennium, v.3.
  12. Korthals, T., Kragh, M., Christiansen, P. and Rückert, U., 2018. Towards inverse sensor mapping in agriculture. arXiv preprint, 1805.08595.
  13. Ziegler, J., Lategahn, H. and Schreiber, M., 2014. Video based localization for bertha. In: IEEE Intelligent Vehicles Symposium Proceedings, pp.1231-1238.
  14. Konrad, M., Nuss, D. and Dietmayer, K., 2012. Localization in digital maps for road course estimation using grid maps. In: 2012 IEEE Intelligent Vehicles Symposium, pp.87-92.
  15. 2018. GOST R 51256-2018. Tekhnicheskie sredstva organizatsii dorozhnogo dvizheniya. Razmetka dorozhnaya. Klassifikatsiya. Tekhnicheskie trebovaniya [Russian National State Standard R 51256-2018. Traffic control devices. Road marking. Classification. Technical requirements]. Available at: <https://docs.cntd.ru/document/1200158480> [Accessed 15 July 2021].
  16. Panfilova, E.I. and Kunina, I.A., 2020. Ispol'zovanie okonnogo preobrazovaniya Khafa dlya poiska protyazhennykh granits na izobrazhenii [Using the Hough window transform to find extended boundaries in an image]. Sensory Systems, 34(4), pp.247–261. (in Russian).
  17. Bresenham, J.E., 1965. Algorithm for computer control of a digital plotter. IBM Systems journal, 4(1), pp.25-30.
  18. Collins, T., Collins, J.J. and Ryan, D., 2007. Occupancy grid mapping: An empirical evaluation. In: Proceedings of 2007 Mediterranean Conference on Control & Automation IEEE, pp.1-6. DOI: 10.1109/MED.2007.4433772.
  19. Martin, M.C. and Moravec, H.P., 1996. Robot Evidence Grids. Pittsburgh: The Robotics Institute Carnegie Mellon University.
  20. Kim, S. and Kim, J., 2012. Building occupancy maps with a mixture of Gaussian processes. In: 2012 IEEE International Conference on Robotics and Automation, pp.4756-4761. DOI: 10.1109/ICRA.2012.6225355.
  21. Panfilova, E., Shipitko, O. and Kunina, I., 2021. Fast Hough transform-based road markings detection for autonomous vehicle. In: Thirteenth International Conference on Machine Vision: Proceedings, v.11605. DOI: 10.1117/12.2587615.
Editorial office address: 21, Tikhoretsky pr., Saint-Petersburg, Russia, 194064, tel.: +7(812) 552-13-25 e-mail: zheleznyakov@rtc.ru