Induced virtual environment for control of a manipulator designed for working with radioactive materials

Alexey V. Sergeev
Russian State Scientific Center for Robotics and Technical Cybernetics (RTC), Leading Engineer, 21, Tikhoretsky pr., Saint Petersburg, 194064, Russia, tel.: +7(921)975-04-97, ORCID: 0000-0002-6196-1433

Victor V. Titov
RTC, Research Scientist, 21, Tikhoretsky pr., Saint Petersburg, 194064, Russia, tel.: +7(950)021-19-28, ORCID: 0000-0003-2817-5841

Igor V. Shardyko
RTC, Research Scientist, 21, Tikhoretsky pr., Saint Petersburg, 194064, Russia, tel.: +7(904)648-74-90, ORCID: 0000-0003-0622-9896


Received 15 October 2020

Abstract
This article discusses the issues of controlling a robotic arm for a hot cell based on the induced virtual reality methodology. A human-machine interface based on virtual reality is presented, comprising a set of interactive features designed to construct trajectories along which the end effector of the arm should move. The prospects of computer vision as a means of updating the state of the virtual environment are further considered. An experiment comparing two approaches to controlling the robotic arm in the virtual environment was carried out.
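
As an illustration of the trajectory-construction feature mentioned above, the sketch below shows how a waypoint-based path defined in a virtual scene could be turned into a dense position reference for the end effector. It is a minimal sketch under assumptions, not the authors' implementation; the names Waypoint, Trajectory, dwell_s and interpolate are hypothetical.

# A minimal, hypothetical sketch (plain Python, no external libraries): the operator
# places Cartesian waypoints in the virtual scene, and the densified path becomes the
# reference that would be handed to the manipulator controller (not shown here).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Waypoint:
    # A Cartesian target for the end effector, placed in the virtual model of the hot cell.
    x: float
    y: float
    z: float
    dwell_s: float = 0.0  # optional pause at this point, e.g. for tool actuation

@dataclass
class Trajectory:
    # An ordered sequence of waypoints built interactively in the virtual environment.
    waypoints: List[Waypoint] = field(default_factory=list)

    def interpolate(self, step: float = 0.01) -> List[Waypoint]:
        # Linearly densify the path so the controller receives a smooth position reference.
        dense: List[Waypoint] = []
        for a, b in zip(self.waypoints, self.waypoints[1:]):
            dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2) ** 0.5
            n = max(1, int(dist / step))
            for i in range(n):
                t = i / n
                dense.append(Waypoint(a.x + t * (b.x - a.x),
                                      a.y + t * (b.y - a.y),
                                      a.z + t * (b.z - a.z)))
        if self.waypoints:
            dense.append(self.waypoints[-1])
        return dense

# Example: three waypoints marked inside the hot-cell model, densified to 5 cm spacing.
path = Trajectory([Waypoint(0.40, 0.10, 0.25),
                   Waypoint(0.40, 0.30, 0.25),
                   Waypoint(0.55, 0.30, 0.10, dwell_s=2.0)])
reference = path.interpolate(step=0.05)
print(f"{len(reference)} reference points generated for the end effector")

In the system the abstract describes, such a reference would be tracked by the manipulator controller, while computer vision keeps the virtual model of the hot cell consistent with the real scene; both parts lie outside this sketch.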

Key words
Virtual reality, induced environment, human-robot interface, manipulator, end effector, robot, computer vision system.

Acknowledgements
The reported study was funded by Ural Federal University named after the first President of Russia B.N. Yeltsin within the framework of research project 17706413348200000540/686-20.

DOI
https://doi.org/10.31776/RTCJ.9104

Bibliographic description
Sergeev, A., Titov, V. and Shardyko, I., 2021. Induced virtual environment for control of a manipulator designed for working with radioactive materials. Robotics and Technical Cybernetics, 9(1), pp.32-41.

UDC identifier:
004.5

References

  1. Chizhevskij, R.A. and Shardyko, I.V., 2014. Manipulation control system for metal of the upper block branch pipes. Extreme robotics, 1(1), pp.253-256. (in Russian).
  2. Voinov, I.V. et al., 2018. Radiation-resistant manipulators and methods of expanding their functionality. Extreme robotics, 1, pp.113-125. (in Russian).
  3. Vasil'ev, D.B., Ivlev, A.K., Gordon, Yu.A. and Gordon, A.T., 2011. Distancionno upravlyaemoe ustrojstvo dlya remontnyh rabot, preimushchestvenno dlya osusheniya trubok reshetki kollektora parogeneratora AES [Remotely controlled device for repair work, mainly for draining the tubes of the collector grid of a steam generator at nuclear power plants]. Patent no.RU108578U1, Russian Federation.
  4. Voroshilov, M.S. et al., 1990. Manipulation device. Patent no.SU1541048A1, Russian Federation.
  5. Sergeev, S., 2006. Ergonomic problems of interface design based on induced virtual environments. Mir Avioniki, 3, pp.62-67. (in Russian).
  6. Minitaeva, A.M., 2013. Osnovnoj podhod v reshenii zadachi sozdaniya cheloveko-mashinnogo interfejsa s ispol'zovaniem dual'nogo principa [The main approach to solving the problem of creating a human-machine interface using the dual principle]. Programmnye produkty i sistemy [Software & Systems], 4, p.19. (in Russian).
  7. Tyrva, V.O., 2018. Sovmestnoe upravlenie ob"ektom v ergaticheskoj sisteme: modeli i realizacii [Joint control of an object in an ergatic system: models and implementations]. Vestnik Gosudarstvennogo universiteta morskogo i rechnogo flota imeni admirala S. O. Makarova [Bulletin of State University of Marine and River Fleet n.a. Admiral Makarov], 10(2), pp.430-443. (in Russian). DOI: 10.21821/2309-5180-2018-10-2-430-443.
  8. Popechitelev, E.P., 2016. Problemy sinteza biotekhnicheskih system [Problems of the synthesis of biotechnical systems]. Nauchnoe obozrenie. Tekhnicheskie nauki, 2, pp.54-62. (in Russian).
  9. Sergeyev, A. and Gook, M., 2018. Mobile Space Robot Control with the Use of Virtual Reality. Pilotiruemye Polety v Kosmos, 4(29), pp.44-52. (in Russian).
  10. Sergeev, A. and Sergeev, S., 2019. Complexity reduction of interfaces of robotic and ergatic systems. Robotics and Technical Cybernetics, 7(2), pp.109-118. (in Russian).
  11. Szeliski, R., 2011. Computer Vision: Algorithms and Applications. Springer. DOI: 10.1007/978-1-84882-935-0.
  12. Zhao, Z.-Q., Zheng, P., Xu, S.-T. and Wu, X., 2019. Object Detection with Deep Learning: A Review. IEEE Transactions on Neural Networks and Learning Systems, 30(11), pp.1-21. DOI: 10.1109/TNNLS.2018.2876865
  13. Filatov, N., Vlasenko, V., Fomin, I. and Bakhshiev, A., 2020. Application of deep neural network for the vision system of mobile service robot. Studies in Computational Intelligence, 856, pp.214-220.
  14. Devi, R., Chanu, Y.J. and Singh, K., 2016. A Survey on Different Background Subtraction Method for Moving Object Detection. International Journal for Research in Emerging Science and Technology, 3, pp.7-10.
  15. Marchand, E., Uchiyama, H. and Spindler, F., 2016. Pose Estimation for Augmented Reality: A Hands-On Survey. IEEE Transactions on Visualization and Computer Graphics, 22(12), pp.2633-2651. DOI: 10.1109/TVCG.2015.2513408.
  16. Lateef, F. and Ruichek, Y., 2019. Survey on Semantic Segmentation using Deep Learning Techniques. Neurocomputing. DOI: 10.1016/j.neucom.2019.02.003.
  17. Yuan, J., Wang, D.L. and Cheriyadat, A.M., 2015. Factorization-based texture segmentation. IEEE Transactions on Image Processing.
  18. Orlova, S.R. and Isakov, T.T., 2020. Primenenie glubokih nejronnyh setej v zadache segmentacii izobrazhenij dorozhnoj obstanovki [Application of deep neural networks in the problem of segmentation of traffic images]. XLII SPbPU Week of Science, 2, pp.43-46. (in Russian).
  19. Han, P. and Zhao, G., 2019. A review of edge-based 3D tracking of rigid objects. Virtual Reality & Intelligent Hardware, 1(6), pp.580-596.