Computer Science, 2023, 50(9): 90-100. doi: 10.11896/jsjkx.221200053

• Computer Software •

Active Observation Schemes and Software Implementation Architecture of Autonomous Robot

XIAO Huaiyu1,2, YANG Shuo3, MAO Xinjun1,2   

  1. College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
    2. Key Laboratory of Software Engineering for Complex Systems, Changsha 410073, China
    3. College of Systems Engineering, National University of Defense Technology, Changsha 410073, China
  • Received: 2022-12-08  Revised: 2023-04-20  Online: 2023-09-15  Published: 2023-09-01
  • About author: XIAO Huaiyu, born in 1999, postgraduate, is a student member of China Computer Federation. His main research interest is robot software engineering.
    YANG Shuo, born in 1992, Ph.D, lecturer, is a member of China Computer Federation. His main research interests include intelligent robot software engineering, software engineering theory and methods, and robot behavior modeling and task decision-making.
  • Supported by:
    National Natural Science Foundation of China (62172426).

Abstract: Autonomous robots operate in open environments where their perception is limited, making it difficult to obtain complete and timely information about the environment. To complete tasks effectively, autonomous robots need to observe the environment actively, that is, to decide, schedule and execute observation behaviors spontaneously according to task requirements and thereby obtain task-related environmental information. The demand for active observation poses two challenges to the observation schemes of autonomous robots and to the construction of their software systems. On the one hand, to support effective task execution, active observation schemes should be designed so that, at the mechanism level, autonomous robots are guaranteed to observe the environmental information their tasks require. On the other hand, active observation complicates the functional abstraction of, and data interaction among, software components such as observation and decision-making, so a software architecture suitable for implementing this complex mechanism needs to be designed. To address these challenges, this paper divides the behaviors of autonomous robots into task behaviors and observation behaviors, and proposes two active observation schemes that establish a collaborative mechanism between them, targeting two typical limited-information scenarios in the open environment: one-sided observation and outdated observation. Decision and scheduling algorithms for observation behaviors are designed on the basis of these two schemes. In addition, an autonomous robot software architecture based on the multi-agent system is designed to implement the proposed schemes. Finally, to verify the effectiveness of the proposed active observation schemes, a typical open-environment task, the book-transfer task of a library service robot, is selected for experimental verification. In this task, the robot's information about the location of the book is limited, which easily leads to failure of the book-transfer task. The reactive observation and accompanying observation schemes that are currently mainstream in the field of autonomous robots are selected as comparison methods, and the effectiveness of the proposed method is verified by comparing behavior execution processes, motion trajectories and time consumption.
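To make the collaboration between task behaviors and observation behaviors concrete, the following minimal Python sketch illustrates one plausible scheduling rule consistent with the two limited-information scenarios named in the abstract: an observation behavior is inserted before a task behavior when the required environmental fact has never been observed (one-sided observation) or has become outdated. All names here (WorldModel, TaskBehavior, schedule, the book-transfer example values) are illustrative assumptions and are not taken from the paper; the actual schemes, algorithms and multi-agent architecture are described in the full text.

import time
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional


@dataclass
class Fact:
    """A piece of environmental information plus the time it was observed."""
    value: object
    observed_at: float


@dataclass
class WorldModel:
    """The robot's local, possibly incomplete and outdated, view of the environment."""
    facts: Dict[str, Fact] = field(default_factory=dict)

    def get(self, key: str) -> Optional[Fact]:
        return self.facts.get(key)

    def update(self, key: str, value: object) -> None:
        self.facts[key] = Fact(value, time.time())


@dataclass
class TaskBehavior:
    name: str
    required_info: str        # environmental fact this behavior depends on
    max_staleness: float      # seconds before the fact counts as outdated
    run: Callable[[Fact], None]


def schedule(task: TaskBehavior, world: WorldModel,
             observe: Callable[[str], object]) -> None:
    """Decide whether an observation behavior must precede the task behavior.

    Two limited-information cases trigger active observation:
      * one-sided observation: the required fact has never been observed;
      * outdated observation: the fact exists but is older than max_staleness.
    """
    fact = world.get(task.required_info)
    missing = fact is None
    outdated = (fact is not None
                and time.time() - fact.observed_at > task.max_staleness)

    if missing or outdated:
        # Observation behavior: actively acquire the task-related information first.
        value = observe(task.required_info)
        world.update(task.required_info, value)
        fact = world.get(task.required_info)

    # Task behavior executes with sufficiently fresh information.
    task.run(fact)


# Hypothetical usage in a book-transfer scenario:
if __name__ == "__main__":
    world = WorldModel()
    fetch_book = TaskBehavior(
        name="fetch_book",
        required_info="book_location",
        max_staleness=30.0,
        run=lambda fact: print(f"navigating to shelf {fact.value}"),
    )
    # A real observe callback would drive sensors; here it returns a fixed shelf id.
    schedule(fetch_book, world, observe=lambda key: "B3")

In a deployed system this decision rule would sit inside the decision-making component and delegate the observe step to dedicated observation agents, but the sketch is only meant to show the shape of the mechanism, not its implementation.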

Key words: Autonomous robot, Open environment, Observation behavior, Active observation scheme, Autonomous robot software

CLC Number: TP311