计算机科学 (Computer Science) ›› 2009, Vol. 36 ›› Issue (10): 217-221.

• Artificial Intelligence •

Optimized k-NN Text Categorization Approach

YAN Peng, ZHENG Xue-feng, ZHU Jian-yong, XIAO Yun-hong

  1. (School of Information Engineering, University of Science and Technology Beijing, Beijing 100083); (State Information Center, Beijing 100045)
  • Online: 2018-11-16  Published: 2018-11-16

Abstract: k-NN is one of the classic text categorization (TC) algorithms and is particularly well suited to handling concept drift, but its very low running speed is a serious drawback. To avoid the curse of dimensionality and improve efficiency, it typically relies on feature selection (FS) to reduce the dimensionality of the feature space. FS, however, causes information loss and other problems that hinder the overall performance of the classification system. Starting from the sparsity of text vectors, this paper optimizes the traditional k-NN algorithm in several respects. The optimized algorithm simplifies the Euclidean distance classification model, greatly reducing computational cost and yielding a qualitative improvement in running efficiency. In addition, it discards the FS preprocessing step entirely, thereby avoiding all of the drawbacks FS introduces, and its classification performance far exceeds that of ordinary k-NN. Experiments show that the optimized algorithm performs excellently in both effectiveness and efficiency; it injects new vitality into the traditional k-NN algorithm and can play a greater role in tackling problems such as concept drift.

Key words: Text categorization, Feature selection, k-NN, Concept drift
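The abstract does not give the authors' exact formulation, but the general sparsity trick it alludes to can be sketched as follows: since squared Euclidean distance expands to ||q||² + ||d||² − 2·(q·d), and the norms can be precomputed, only documents sharing at least one nonzero term with the query need a dot-product computation. The names (`build_inverted_index`, `knn_classify`), the dictionary-based sparse-vector representation, and the majority-vote rule are all illustrative assumptions, not the paper's method:

```python
import heapq
from collections import defaultdict

def build_inverted_index(train_docs):
    """train_docs: list of (sparse_vector_dict, label) pairs.
    Returns a term -> [(doc_id, weight)] inverted index and
    the precomputed squared norm of each training vector."""
    index = defaultdict(list)
    norms = []
    for doc_id, (vec, _label) in enumerate(train_docs):
        for term, w in vec.items():
            index[term].append((doc_id, w))
        norms.append(sum(w * w for w in vec.values()))
    return index, norms

def knn_classify(query_vec, train_docs, index, norms, k=5):
    """Classify query_vec by majority vote among its k nearest
    training documents under squared Euclidean distance,
    computed as ||q||^2 + ||d||^2 - 2*(q . d)."""
    # Accumulate dot products only for documents that share a term
    # with the query -- this is where sparsity saves the work.
    dots = defaultdict(float)
    for term, qw in query_vec.items():
        for doc_id, dw in index.get(term, ()):
            dots[doc_id] += qw * dw
    q_norm = sum(w * w for w in query_vec.values())
    # Documents with zero overlap have distance q_norm + norms[d],
    # which is never smaller than any overlapping candidate's distance,
    # so restricting attention to `dots` keeps the true nearest neighbors
    # (assuming at least k overlapping documents exist).
    scored = [(q_norm + norms[d] - 2.0 * dot, d) for d, dot in dots.items()]
    neighbors = heapq.nsmallest(k, scored)
    votes = defaultdict(int)
    for _dist, d in neighbors:
        votes[train_docs[d][1]] += 1
    return max(votes, key=votes.get)
```

Because the inverted index touches only the postings of the query's nonzero terms, the cost per query scales with the number of shared terms rather than with the full vocabulary size, which is the efficiency gain the abstract attributes to exploiting sparsity.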
