Computer Science ›› 2025, Vol. 52 ›› Issue (11A): 241200110-7. doi: 10.11896/jsjkx.241200110

• Database & Big Data & Data Science •

• Corresponding author: CHEN Yuming (myeant2024@163.com)

Granular Perceptron Based on GRAM Matrix

WU Shaohua1, CHEN Yuming2   

  1.Xiamen Meiya eAnt Information Technology Co.,Ltd.,Xiamen,Fujian 361008,China
    2.College of Computer and Information Engineering,Xiamen University of Technology,Xiamen,Fujian 361024,China
  • Online:2025-11-15 Published:2025-11-10
  • Supported by:
Natural Science Foundation of Fujian Province,China(2024J011192) and Natural Science Foundation of Xiamen,China(3502Z202473069).


Abstract: The perceptron is a simple linear classifier and the cornerstone of SVM and deep neural networks. However, most complex problems are nonlinear, and the perceptron performs poorly on them. Therefore, this paper introduces granular computing theory: using reference samples as templates, training samples are granulated into feature granules and feature granular vectors. A granular GRAM matrix is then defined, and a granular perceptron model based on the GRAM matrix is proposed. This model optimizes the dual form of the perceptron to construct a new granular perceptron. To handle nonlinear classification problems, a kernel function is introduced to construct a kernel GRAM matrix over granular vectors, and the loss function and learning method of the GRAM granular perceptron are given. Finally, experiments analyze the model's convergence, nonlinear processing capability, sensitivity to the number of reference samples, and classification performance; the results show the effectiveness and correctness of the GRAM granular perceptron.
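The dual-form construction the abstract describes can be illustrated with the standard kernel perceptron, where a precomputed Gram matrix replaces all raw-feature access. This is a minimal sketch only: the paper's granulation of samples into feature granular vectors is omitted, and the RBF kernel, `gamma` value, and XOR-style toy data are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values K(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_perceptron(X, y, gamma=1.0, epochs=100):
    """Dual-form perceptron: the Gram matrix G[i, j] = K(x_i, x_j) is
    computed once; each mistake updates the dual coefficients alpha."""
    n = len(y)
    G = rbf_kernel(X, X, gamma)            # the (kernel) Gram matrix
    alpha = np.zeros(n)
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            # Decision value uses only the Gram matrix, never raw features.
            f = (alpha * y) @ G[:, i] + b
            if y[i] * f <= 0:              # misclassified -> dual update
                alpha[i] += 1.0
                b += y[i]
                mistakes += 1
        if mistakes == 0:                  # converged on the training set
            break
    return alpha, b, G

# XOR-like data: linearly inseparable, but separable with an RBF kernel.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])
alpha, b, G = kernel_perceptron(X, y, gamma=2.0)
pred = np.sign((alpha * y) @ G + b)        # recovers y exactly
```

A linear perceptron cannot separate this XOR data; the dual form with an RBF Gram matrix converges in a few epochs, which is the mechanism the granular GRAM perceptron generalizes to granular vectors.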

Key words: Granular computing, Perceptron, GRAM matrix, Nonlinear classification, Kernel function

CLC number: TP181
[1]ZADEH L A.Fuzzy sets and information granularity [J].Fuzzy Sets,Fuzzy Logic,and Fuzzy Systems,1996,8:433-448.
[2]WANG G Y,ZHANG Q H,HU J.An overview of granular computing [J].CAAI Transactions on Intelligent Systems,2016,29(6):927-933.
[3]LIN T Y.Granular computing on binary relations I:data mining and neighborhood systems [J].Rough Sets in Knowledge Discovery,1998,1(1):107-121.
[4]XUE R X,YI S C,WANG P X.GBDEN:a fast clustering algorithm for large-scale data based on granular ball [J].Computer Science,2024,12:166-173.
[5]MIAO D Q,WANG J.On the relationships between information entropy and roughness of knowledge in rough set theory [J].PR & AI,1998,11(1):34-40.
[6]MIAO D Q,WANG J.An information representation of the concepts and operations in rough set theory [J].Journal of Software,1999,10(2):113-116.
[7]MIAO D Q,ZHANG Q H,et al.From human intelligence to machine implementation model:theories and applications based on granular computing [J].CAAI Transactions on Intelligent Systems,2016,11(6):743-757.
[8]YAO Y Y.Granular computing using neighborhood systems[M]//Advances in Soft Computing:Engineering Design and Manufacturing.London:Springer,1999:539-553.
[9]PEDRYCZ W.Granular computing for data analytics:a manifesto of human-centric computing [J].IEEE/CAA Journal of Automatica Sinica,2018,5(6):1025-1034.
[10]MIAO D Q,FAN S D.The calculation of knowledge granulation and its application [J].Systems Engineering-Theory & Practice,2002,22(1):48-56.
[11]HU Q H,YU D R,XIE Z X.Numerical attribute reduction based on neighborhood granulation and rough approximation [J].Journal of Software,2008,19(3):640-649.
[12]CHEN Y M,ZHU S Z,LI W,et al.Fuzzy granular convolutional classifiers [J].Fuzzy Sets and Systems,2022,426:145-162.
[13]CHEN Y M,QIN N,LI W,et al.Granule structures,distances and measures in neighborhood systems [J].Knowledge-Based Systems,2019,165:268-281.
[14]ZHENG C Y,CHEN Y Y,HOU X Y,et al.A neighbourhood granular fuzzy c-means clustering algorithm [J].Journal of Shandong University(Natural Science),2024,59(5):35-44.
[15]ROSENBLATT F.The perceptron:a probabilistic model for information storage and organization in the brain [J].Psychological Review,1958,65(6):386-408.
[16]LI W,YANG H.A non-linear blind source separation method based on perceptron structure and conjugate gradient algorithm[J].Circuits,Systems,and Signal Processing,2014,33:3573-3590.
[17]XU J H,ZHANG X G,LI Y D.A nonlinear perceptron algorithm based on kernel functions [J].Chinese Journal of Computers,2002,25(7):689-695.
[18]CHOLAQUIDIS A,FRAIMAN R,KALEMKERIAN J,et al.A nonlinear aggregation type classifier [J].Journal of Multivariate Analysis,2016,146:269-281.
[19]NIE F P,ZHU W,LI X L.Decision tree SVM:an extension of linear SVM for non-linear classification [J].Neurocomputing,2020,401:153-159.
[20]KWAK N.Nonlinear projection trick in kernel methods:an alternative to the kernel trick [J].IEEE Transactions on Neural Networks and Learning Systems,2013,24(12):2113-2119.
[21]KEMPFER K C,WANG Y,et al.A comparison study on nonlinear dimension reduction methods with kernel variations:visualization,optimization and classification [J].Intelligent Data Analysis,2020,24(2):267-290.
[22]FU X Y,CHEN Y Y,CHEN Y M,et al.A classification method of fully connected granular neural network [J].Journal of Shanxi University(Natural Science Edition),2023,46(1):91-100.
[23]YANG T,ZHONG X R,LANG G M,et al.Granular matrix:a new approach for granular structure reduction and redundancy evaluation [J].IEEE Transactions on Fuzzy Systems,2020,28(12):3133-3144.
[24]LIU W X,LI J J,WANG H W.A dynamic attribute reduction method for formal context based on matrix information entropy [J].Journal of Nanjing University(Natural Science),2025,61(1):117-128.