Computer Science ›› 2020, Vol. 47 ›› Issue (11A): 46-51. doi: 10.11896/jsjkx.200600055

• Artificial Intelligence •

Analysis of the Emotional Degree of Poetry Reading Based on a Weighted-Division Unbalanced Decision Tree

DONG Ben-qing, LI Feng-kun

  1. Dalian Neusoft University of Information, Dalian, Liaoning 116023, China
  • Online: 2020-11-15 Published: 2020-11-17
  • Corresponding author: LI Feng-kun (lifengkun@neusoft.edu.cn)
  • About author: dongbenqing@neusoft.edu.cn
  • Supported by:
    Liaoning Provincial Doctoral Start-up Fund (20170520398); General Project of Science and Technology of the Liaoning Provincial Department of Education (L2015041)

Analysis of Emotional Degree of Poetry Reading Based on WDOUDT

DONG Ben-qing, LI Feng-kun   

  1. Dalian Neusoft University of Information, Dalian, Liaoning 116023, China
  • Online: 2020-11-15 Published: 2020-11-17
  • About author: DONG Ben-qing, born in 1981, Ph.D, associate professor, is a member of CCF. His main research interests include software application and computer education.
    LI Feng-kun, born in 1983, master. Her main research interests include intelligent algorithms and artificial intelligence.
  • Supported by:
    This work was supported by the Liaoning Provincial Doctoral Start-up Fund (20170520398) and the General Project of Science and Technology of the Liaoning Provincial Department of Education (L2015041).

Abstract: For the analysis of the emotional degree of poetry reading, this paper proposes a new unbalanced decision tree algorithm, called the Weighted Division of Unbalanced Decision Tree (WDOUDT). Based on a study of the indicators of the expressive appeal of poetry reading, Mel-frequency cepstral coefficients are extracted from the reading audio, and the decision tree, the most interpretable modelling method, is used to build the model. The WDOUDT induction algorithm uses neither evolutionary algorithms nor heuristic search. Applied to the emotional scoring of poetry reading audio, its time complexity is lower than that of traditional decision trees; the algorithm also has fewer nodes, better generalization ability, and better robustness to noisy data.

Key words: Unbalanced decision tree, Weighted division, Fast convergence, Emotional degree analysis

Abstract: In this paper, a new unbalanced decision tree algorithm for analyzing the infectious expression of poetry reading is proposed. The algorithm is called the Weighted Division of Unbalanced Decision Tree (WDOUDT). Through a study of the indicators of poetry reading appeal, mel-frequency cepstral coefficients are extracted from the reading audio, and the decision tree, the most interpretable method, is used for modelling. WDOUDT uses neither evolutionary algorithms nor heuristic information search. Applied to the emotional scoring of poetry reading audio, its time complexity is lower than that of traditional decision trees. The proposed algorithm has fewer nodes, better generalization ability, and better robustness to noisy data.

Key words: Fast convergence, Infectious expression, Unbalanced decision tree, Weighted division

CLC number: TP391.43
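
The abstract outlines a two-stage pipeline: MFCC features are extracted from each poetry-reading recording, and a decision tree then maps those features to an emotional-degree score. Since the weighted-division split rule of WDOUDT is not given on this page, the following is only a minimal sketch of that pipeline under stated assumptions: librosa is assumed for MFCC extraction, scikit-learn's DecisionTreeClassifier stands in for WDOUDT, and the file names and score labels are purely illustrative.

# Minimal sketch of the pipeline described in the abstract, not the authors' WDOUDT
# implementation: summarize each recording's MFCCs into a fixed-length vector and
# fit an interpretable tree model. A standard DecisionTreeClassifier stands in for
# the weighted-division unbalanced decision tree, whose split rule is not specified here.
import numpy as np
import librosa
from sklearn.tree import DecisionTreeClassifier

def mfcc_features(wav_path, n_mfcc=13):
    """Load one reading recording and summarize its frame-level MFCCs
    into per-coefficient means and standard deviations."""
    signal, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, n_frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical corpus: each recording is paired with an emotional-degree score.
wav_paths = ["reading_001.wav", "reading_002.wav", "reading_003.wav"]  # placeholder files
scores = [2, 4, 5]                                                     # placeholder labels

X = np.stack([mfcc_features(p) for p in wav_paths])
y = np.array(scores)

tree = DecisionTreeClassifier(max_depth=5, random_state=0)  # stand-in for WDOUDT
tree.fit(X, y)
print(tree.predict(X))  # predicted scores for the training clips

A single shallow tree keeps the model inspectable, which matches the interpretability argument in the abstract; the weighted-division criterion of WDOUDT would replace the default Gini-based split used in this sketch.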