TRANSLATION-INVARIANT ESTIMATION OF AUTONOMOUS MOBILE ROBOT ANGULAR ORIENTATION BASED ON HOUGH TRANSFORM

D.N. Aldoshkin
Siberian Federal University, Institute of space and information technology, Teaching Assistant, 26, ul. Academica Kirenskogo, Krasnoyarsk, 660074, Russia, tel.: +7(950)404-36-33


Abstract
This article considers an algorithm for estimating the angular orientation of a mobile robot. The current configuration of the environment is described by measurement samples from the robot's proximity sensors, and the environment is assumed to have geometry dominated by straight edges. The problem is posed as estimating the two-dimensional rotation of an object abstracted as a polygon. The algorithm is based on the Hough transform, mapping the measurement space into the space of line parameters. This transformation preserves the angles between lines and is invariant to rotation, translation, and isotropic scaling. Applying the Hough transform reduces the rotation estimation problem to a one-dimensional optimization problem. The proposed algorithm is notable for its robustness to measurement noise and outliers.
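For illustration only, the Python sketch below shows one common way such a Hough-style, translation-invariant rotation estimate can be organised; it is not the paper's implementation, and the names `angle_spectrum` and `estimate_rotation` are invented here. Each scan is mapped to a histogram over line orientations (which is unaffected by translation and isotropic scaling), and the relative heading is then found by a one-dimensional search over circular shifts of that histogram, in line with the reduction to a one-dimensional optimization described in the abstract.

```python
import numpy as np

def angle_spectrum(points, n_bins=180):
    """Build an orientation histogram from an ordered 2-D scan (N x 2 array).

    Each pair of neighbouring points defines a local edge direction; its
    angle, folded modulo pi, votes into the histogram with a weight equal
    to the segment length.  The result depends only on line directions,
    so it is invariant to translation and isotropic scaling of the scan.
    """
    d = np.diff(points, axis=0)                       # edge vectors between consecutive samples
    angles = np.arctan2(d[:, 1], d[:, 0]) % np.pi     # line direction in [0, pi)
    hist, _ = np.histogram(angles, bins=n_bins, range=(0.0, np.pi),
                           weights=np.hypot(d[:, 0], d[:, 1]))
    return hist

def estimate_rotation(scan_ref, scan_cur, n_bins=180):
    """Estimate the relative rotation between two scans.

    The rotation is taken as the circular shift of the orientation
    histogram that maximises its correlation with the reference
    histogram, i.e. a brute-force one-dimensional search.
    """
    h_ref = angle_spectrum(scan_ref, n_bins)
    h_cur = angle_spectrum(scan_cur, n_bins)
    scores = [np.dot(h_ref, np.roll(h_cur, k)) for k in range(n_bins)]
    k_best = int(np.argmax(scores))
    # Because line directions are folded modulo pi, the estimate is
    # determined up to a 180-degree ambiguity.
    return k_best * np.pi / n_bins
```

In this sketch the angular resolution is fixed by `n_bins`; a finer grid or interpolation around the histogram peak would be needed for sub-degree accuracy, and the modulo-pi ambiguity would have to be resolved by other cues (e.g. odometry).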

Key words
Mobile robot, Hough transform, angular orientation, simultaneous localization and mapping, SLAM, similarity transformation.

Bibliographic description
Aldoshkin, D. (2017). Translation-Invariant Estimation of Autonomous Mobile Robot Angular Orientation Based on Hough Transform. Robotics and Technical Cybernetics, 2(15), pp.25-31.

UDC identifier
004.896

References

  1. Bailey, T. and Durrant-Whyte, H. (2006). Simultaneous localization and mapping (SLAM): part I. IEEE Robotics & Automation Magazine, 13(2), pp.99-110.
  2. Aldoshkin, D. and Tsarev, R. (2016). Mobile Robot Path Planning in the Presence of Obstacles and Lack of Information about the Environment. Mehatronika, avtomatizacia, upravlenie, 17(7), pp.465-470.
  3. Pshikhopov, V. and Ali, A. (2012). Upravlenie nazemnymi robotami v nedeterminirovannykh sredakh s prepyatstviyami opredelennogo klassa [Control of on-ground robots in undetermined environments with obstacles of a certain class]. In: Upravlenie v tekhnicheskikh, ergaticheskikh, organizatsionnykh i setevykh sistemakh [Control in technical, ergatic, organizational and network systems - 2012]. pp.790-793.
  4. Kuchersky, R. and Man'ko, S. (2012). Algoritmy lokal'noy navigatsii i kartografii dlya bortovoy sistemy upravleniya avtonomnogo mobil'nogo robota [Local navigation and mapping algorithms for the onboard control system of autonomous mobile robot]. Izvestiya SFedU. Engineering Sciences, 3(128), pp.13-22.
  5. Davydov, O. and Platonov, A. (2015). Metod opredeleniya pozitsii i orientatsii mobil'nogo robota s lazernym skanerom [Method of localization and orientation of mobile robot with laser scanner]. [online] Library.keldysh.ru - Preprinty IPM im. M.V.Keldysha [Preprint of the Keldysh institute of applied mathematics]. Available at: http://library.keldysh.ru/preprint.asp?id=2015-45 [Accessed 10 Feb. 2017].
  6. Bessmeltsev, V. and Bulushev, E. (2014). Bystryy algoritm sovmeshcheniya izobrazheniy dlya kontrolya kachestva lazernoy mikroobrabotki [Fast image registration algorithm for automated inspection of laser micromachining]. Computer Optics, 2, pp.343-350.
  7. Syryamkin, V. and Shidlovskiy, V. (2010). Korrelyatsionno-ekstremal'nye radionavigatsionnye sistemy [Correlational-extreme radionavigational systems]. 1st ed. Tomsk, Russia: Tomsk University Publ.
  8. Nguen, A. (2015). 3D sistema obnaruzheniya prostranstvennykh ob"ektov s pomoshch'yu manipulyatsionnogo robota [3D system of spatial objects detection with use of manipulation robot]. PhD in Technical Sciences. Bauman Moscow State Technical University.
  9. Alves de Araújo, S. and Kim, H. (2011). Ciratefi: An RST-invariant template matching with extension to color images. Integrated Computer-Aided Engineering, 18(1), pp.75-90.
  10. Iocchi, L. and Nardi, D. (1999). Self-Localization in the RoboCup Environment. In: 3rd International Workshop on RoboCup.
  11. Großmann, A. and Poli, R. (2001). Robust mobile robot localisation from sparse and noisy proximity readings using Hough transform and probability grids. Robotics and Autonomous Systems, 37(1), pp.1-18.
  12. Iocchi, L., Mastrantuono, D. and Nardi, D. (2001). A probabilistic approach to Hough localization. Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. No.01CH37164) - Seoul, South Korea, pp.4250-4255.
  13. Iocchi, L. and Nardi, D. (2002). Hough Localization for mobile robots in polygonal environments. Robotics and Autonomous Systems, 40(1), pp.43-58.
  14. Grisetti, G., Iocchi, L. and Nardi, D. (2002). Global Hough localization for mobile robots in polygonal environments. Proceedings 2002 IEEE International Conference on Robotics and Automation (Cat. No.02CH37292).
  15. Censi, A., Iocchi, L. and Grisetti, G. (2005). Scan Matching in the Hough Domain. Proceedings of the 2005 IEEE International Conference on Robotics and Automation, pp.2739-2744.
  16. Saeedi, S., Paull, L., Trentini, M., Seto, M. and Li, H. (2014). Map merging for multiple robots using Hough peak matching. Robotics and Autonomous Systems, 62(10), pp.1408-1424.
  17. Hough, P. (1962). Method and means for recognizing complex patterns. US3069654 A.
  18. Ballard, D. (1981). Generalizing the Hough transform to detect arbitrary shapes. Pattern Recognition, 13(2), pp.111-122.
  19. Duda, R. and Hart, P. (1972). Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM, 15(1), pp.11-15.
  20. Dudkin, A. and Vershok, D. (2004). Approksimatsiya pryamymi liniyami konturov ob"ektov na polutonovykh izobrazheniyakh [Straight line fit of object form in grayscale images]. Izvestiya SFedU. Engineering Sciences, 9(44), pp.190-196.