Computer Science ›› 2015, Vol. 42 ›› Issue (8): 36-39.

• 2014 Jiangsu Provincial Conference on Artificial Intelligence •

Approach to Monotonicity Attribute Reduction in Quantitative Rough Set

JU Heng-rong, YANG Xi-bei, QI Yong and YANG Jing-yu

  1. School of Computer Science and Engineering, Jiangsu University of Science and Technology, Zhenjiang 212003, China; 2. Key Laboratory of Intelligent Perception and Systems for High-Dimensional Information of Ministry of Education (Nanjing University of Science and Technology), Nanjing 210094, China; 3. School of Economics and Management, Nanjing University of Science and Technology, Nanjing 210094, China
  • Online: 2018-11-14  Published: 2018-11-14
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (61100116,9,61305058), the Natural Science Foundation of Jiangsu Province (BK2011492, BK2012700, BK20130471), the Open Foundation of the Key Laboratory of Intelligent Perception and Systems for High-Dimensional Information of Ministry of Education (Nanjing University of Science and Technology) (30920130122005), the Natural Science Foundation of the Higher Education Institutions of Jiangsu Province (13KJB520003), and the Research and Innovation Plan for Graduate Students of Jiangsu Higher Education Institutions (CXLX13_707).

Abstract: It is well known that monotonicity plays an important role in attribute reduction for the classical rough set. However, this property does not always hold in generalized models, and the quantitative rough set is a typical example. To address this problem, the definition of a lower-approximation monotonic attribute reduct in the quantitative rough set model was presented, and a heuristic approach for computing such a reduct was given. Experimental results show that, compared with the lower-approximation preservation reduction algorithm, the lower-approximation monotonic reduction algorithm not only takes less time, but also increases the certainty expressed by the positive and negative regions and decreases the uncertainty coming from the boundary region.

Key words: Monotonicity, Lower approximation preservation, Lower approximation monotonicity, Rough set
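
To make the idea in the abstract concrete, the following is a minimal, hypothetical Python sketch of a graded ("quantitative") lower approximation together with a greedy attribute search that accepts an attribute only when the lower approximation grows. The threshold beta=0.7, the toy decision table, and the forward-selection heuristic are illustrative assumptions, not the authors' algorithm or the paper's exact definitions.

```python
# Hypothetical sketch: quantitative (graded) lower approximation plus a greedy
# attribute search that only grows the lower approximation. Thresholds, data
# and the search order are illustrative assumptions, not the paper's method.
from collections import defaultdict

def blocks(table, attrs):
    """Partition object indices into equivalence classes under the given attributes."""
    part = defaultdict(list)
    for i, row in enumerate(table):
        part[tuple(row[a] for a in attrs)].append(i)
    return list(part.values())

def lower_approx(table, attrs, target, beta=0.7):
    """Graded lower approximation: keep a block if at least a fraction beta
    of its objects belongs to the target decision class."""
    low = set()
    for blk in blocks(table, attrs):
        inside = sum(1 for i in blk if i in target)
        if inside / len(blk) >= beta:
            low.update(blk)
    return low

def monotonic_reduct(table, all_attrs, target, beta=0.7):
    """Greedy forward selection: add the attribute that most enlarges the
    lower approximation and stop once no attribute helps, so the lower
    approximation grows monotonically along the search."""
    chosen, best = [], set()
    remaining = list(all_attrs)
    while remaining:
        gains = [(len(lower_approx(table, chosen + [a], target, beta)), a) for a in remaining]
        size, a = max(gains)
        if size <= len(best):   # no attribute improves the lower approximation
            break
        chosen.append(a)
        best = lower_approx(table, chosen, target, beta)
        remaining.remove(a)
    return chosen

# Toy decision table: columns 0-2 are condition attributes,
# target = objects whose decision value is 1.
data = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 0, 1), (0, 0, 0)]
decisions = [1, 1, 0, 0, 1]
target = {i for i, d in enumerate(decisions) if d == 1}
print(monotonic_reduct(data, [0, 1, 2], target))
```

Run as-is, the sketch prints a one-attribute subset for the toy table; the point is only that an attribute is added only when the quantitative lower approximation strictly grows, which is the monotone behaviour the reduct definition in the paper is meant to restore.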

[1] Zdzisaw P.Rough sets-theoretical aspects of reasoning aboutdata [M].Dordrecht:Kluwer Academic,1991
[2] Luo Gong-zhi,Yang Xi-bei.Limited dominance-based rough set model and knowledge reductions in incomplete decision system [J].Journal of Information Science and Engineering,2010,26(6):2199-2211
[3] Xu Wei-hua,Wang Qiao-rong,Zhang Xian-tao.Multi-granula-tion fuzzy rough sets in a fuzzy tolerance approximation space [J].International Journal of Fuzzy Systems,2011,14:246-259
[4] Yang Xi-bei,Song Xiao-ning,Dou Hui-li,et al.Multi-granulation rough set:from crisp to fuzzy case [J].Annals Fuzzy Mathematics Information,2011,1(1):55-70
[5] Hu Qing-hua,Che Xun-jian,Zhang Lei,et al.Rank entropybased decision trees for monotonic classification [J].IEEE Transactions on Knowledge and Data Engineering,2012,24(11):2052-2064
[6] Qian Yu-hua,Liang Ji-ye,Yao Yi-yu,et al.MGRS:A multi-granulation rough set [J].Information Sciences,2010,180:949-970
[7] Yang Xi-bei,Qi Yun-song,Song Xiao-ning,et al.Test cost sensitive multigranulation rough set:model and minimal cost selection [J].Information Sciences,2013,250:184-199
[8] Zhao Yan,Yao Yi-yu,Luo Feng.Data analysis based on discer-nibility and indiscernibility [J].Information Sciences,2007,177:4959-4976
[9] Yao Yi-yu,Zhao Yan.Attribute reduction in decision-theoretic rough set model [J].Information Sciences,2008,178(17):3356-3373
[10] Yang Xi-bei,Song Xiao-ning,She Yan-hong,et al.Hierarchy on multigranulation structures:a knowledge distance approach [J].International Journal of General Systems,2013,42(7):754-773
[11] Min Fan,Zhu William.Attribute reduction of data with errorranges and test costs [J].Information Sciences,2012,211:48-67
