Computer Science ›› 2024, Vol. 51 ›› Issue (11A): 240300063-5.doi: 10.11896/jsjkx.240300063
• Interdiscipline & Application • Previous Articles Next Articles
ZHAO Yufei1, JIN Cong2, LIU Xiaoyu3, WANG Jie2, ZHU Yonggui1, LI Bo4
CLC Number:
[1]HUA J,ZENG L C,LI G F,et al.Learning for a robot:Deep reinforcement learning,imitation learning,transfer learning[J].Sensors,2021,21(4):1278. [2]JAIN A,SHARMA A,PARHI D R.A survey on imitationlearning techniques for end-to-end autonomous vehicles[J].IEEE Transactions on Intelligent Transportation Systems,2022,23(9):14128-14147. [3]GHASEMIPOUR K S,ZEMEL R,GU S.A Divergence Minimi-zation Perspective on Imitation Learning Methods[J].arXiv:1911.02256,2019. [4]HU A,CORRADO G,GRIFFITHS N,et al.Model-based imitation learning for urban driving[J].Advances in Neural Information Processing Systems,2022,35:20703-20716. [5]RAJARAMAN N,YANG L,JIAO J,et al.Toward the fundamental limits of imitation learning[J].Advances in Neural Information Processing Systems,2020,33:2914-2924. [6]RASHIDINEJAD P,ZHU B,MA C,et al.Bridging offline reinforcement learning and imitation learning:A tale of pessimism[J].Advances in Neural Information Processing Systems,2021,34:11702-11716. [7]ORSINI M,RAICHUK A,HUSSENOT L,et al.What matters for adversarial imitation learning?[J].Advances in Neural Information Processing Systems,2021,34:14656-14668. [8]CELEMIN C,PÉREZ-DATTARI R,CHISARI E,et al.Interactive imitation learning in robotics:A survey[J].Foundations and Trends© in Robotics,2022,10(1/2):1-197. [9]BEHRENS G A,GREEN S B.The ability to identify emotional content of solo improvisations performed vocally and on three different instruments[J].Psychology of Music,1993,21(1):20-33. [10]WOODY R H.Emotion,imagery and metaphor in the acquisi-tion of musical performance skill[J].Music Education Research,2002,4(2):213-224. [11]WALTHAM C.The science of string instruments[M].NewYork:Springer,2010. [12]YOUNG D.The hyperbow controller:Real-time dynamics measurement of violin performance[C]//Proceedings of the 2002 Conference on New Interfaces for Musical Expression.2002:1-6. [13]SCHOONDERWALDT E,DEMOUCRON M.Extraction ofbowing parameters from violin performance combining motion capture and sensors[J].The Journal of the Acoustical Society of America,2009,126(5):2695-2708. [14]BEVILACQUA F,RASAMIMANANA N H,FLÉTY E,et al.The augmented violin project:research,composition and performance report[C]//6th International Conference on New Interfaces for Musical Expression(NIME 6).2006:402-406. [15]CAMPO A,MICHAŁKO A,VAN KERREBROECK B,et al.The assessment of presence and performance in an AR environment for motor imitation learning:A case-study on violinists[J].Computers in Human Behavior,2023,146:107810. [16]ALAM A.Employing adaptive learning and intelligent tutoring robots for virtual classrooms and smart campuses:reforming education in the age of artificial intelligence[M]//Advanced Computing and Intelligent Technologies:Proceedings of ICACIT 2022.Singapore:Springer Nature Singapore,2022:395-406. [17]AL HAKIM V G,YANG S H,LIYANAWATTA M,et al.Robots in situated learning classrooms with immediate feedback mechanisms to improve students' learning performance[J].Computers & Education,2022,182:104483. [18]VALAGKOUTI I A,TROUSSAS C,KROUSKA A,et al.Emotion recognition in human-robot interaction using the NAO robot[J].Computers,2022,11(5):72. [19]ZHANG Y,ZHANG C,CHENG L,et al.The use of deep lear-ning-based gesture interactive robot in the treatment of autistic children under music perception education[J].Frontiers in Psychology,2022,13:762701. [20]ZHU J,GIENGER M,KOBER J.Learning task-parameterized skills from few demonstrations[J].IEEE Robotics and Automation Letters,2022,7(2):4063-4070. [21]IJSPEERT A J,NAKANISHI J,HOFFMANN H,et al.Dynamical movement primitives:learning attractor models for motor behaviors[J].Neural Computer,2013,25(2):328-373. [22]ZHANG C Z.Study on Dynamic Vibration and Acoustic Behaviorof Violin[D].Guangzhou:South China University of Technology,2014. [23]TAKAGI T,SUGENO M.Fuzzy identification of systems and its applications to modeling and control[J].IEEE Trans.Syst.Man Cybern,1985(1):116-132. [24]SUN D,LIAO Q,LOUTFI A.Type-2 fuzzy model-based movement primitives for imitation learning[J].IEEE Transactions on Robotics,2022,38(4):2462-2480. [25]SHLENS J.A tutorial on principal component analysis[J].ar-Xiv:1404.1100,2014. [26]GUO D L.Theory and Application of Fuzzy Systems [M].Beijing:Science Press,2021. |
[1] | ZHOU Yangtao, CHU Hua, ZHU Feifei, LI Xiangming, HAN Zihan, ZHANG Shuai. Survey on Deep Learning-based Personalized Learning Resource Recommendation [J]. Computer Science, 2024, 51(10): 17-32. |
[2] | FAN Jia-kuan, WANG Hao-yue, ZHAO Sheng-yu, ZHOU Tian-yi, WANG Wei. Data-driven Methods for Quantitative Assessment and Enhancement of Open Source Contributions [J]. Computer Science, 2021, 48(5): 45-50. |
[3] | JIANG Chong, ZHANG Zong-zhang, CHEN Zi-xuan, ZHU Jia-cheng, JIANG Jun-peng. Data Efficient Third-person Imitation Learning Method [J]. Computer Science, 2021, 48(2): 238-244. |
|