Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
Current Issue
Volume 44 Issue 9, 13 November 2018
Survey of Ontology Mapping
WANG Shun, KANG Da-zhou and JIANG Dong-yu
Computer Science. 2017, 44 (9): 1-10.  doi:10.11896/j.issn.1002-137X.2017.09.001
As a method for knowledge sharing and interoperability among different ontologies, ontology mapping has attracted more and more attention. Following the ontology mapping process, we divided ontology mapping systems into five parts and summarized the similarity algorithms used in different systems. We reviewed the state of the art in the ontology mapping field in recent years, and conducted an in-depth analysis and comparison of a series of typical ontology mapping systems at different levels and dimensions. Some classic ontology mapping systems were introduced, compared and evaluated, from which the challenges facing ontology mapping systems can be seen.
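Many of the surveyed systems start from element-level (string-based) similarity between concept names. As a hedged illustration only — not the method of any particular surveyed system — a normalized edit-distance similarity might look like:

```python
def edit_similarity(a, b):
    """Normalized Levenshtein similarity between two concept names (1 = identical)."""
    a, b = a.lower(), b.lower()
    n, m = len(a), len(b)
    dp = list(range(m + 1))          # single-row Levenshtein DP table
    for i in range(1, n + 1):
        prev, dp[0] = dp[0], i       # prev holds D[i-1][j-1]
        for j in range(1, m + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,              # deletion
                        dp[j - 1] + 1,          # insertion
                        prev + (a[i - 1] != b[j - 1]))  # substitution/match
            prev = cur
    return 1 - dp[m] / max(n, m, 1)
```

Real mapping systems combine several such element-level measures with structure- and instance-level similarities, which this sketch omits.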
Survey of 3D Object Recognition for Point Clouds
HAO Wen, WANG Ying-hui, NING Xiao-juan, LIANG Wei and SHI Zheng-hao
Computer Science. 2017, 44 (9): 11-16.  doi:10.11896/j.issn.1002-137X.2017.09.002
With the rapid development of 3D scanning technology, it is convenient to obtain point clouds of different scenes. Since point clouds are not influenced by light, shadows and textures, recognizing 3D objects from scene point clouds has become a research hotspot of computer vision. This paper first summarized the 3D object recognition methods for point clouds in recent years. Then the advantages and disadvantages of the existing methods were discussed. Finally, the challenges and further research directions of object recognition were pointed out.
Analysis of Equipment PHM Architecture Based on Condition Maintenance
Computer Science. 2017, 44 (9): 17-22, 33.  doi:10.11896/j.issn.1002-137X.2017.09.003
In order to promote the autonomic logistics concept for equipment and improve the integrated logistics support ability of electronic equipment, an establishment scheme of electronic equipment PHM based on condition maintenance was proposed. After analyzing the content and function of PHM, the PHM system architecture was studied from three dimensions: the data information dimension, the model dimension and the life cycle dimension. Based on an open-architecture PHM, the electronic equipment PHM architecture based on condition maintenance was constructed, and the implementation technology of the system was discussed. Finally, a PHM strategy design for a certain equipment simulation training and testing system reveals that the proposed scheme is effective.
Multi-granularity Clustering of Remote Sensing Image Based on Gaussian Cloud Transformation
LIU Xuan, WANG Guo-yin and LUO Xiao-bo
Computer Science. 2017, 44 (9): 23-27, 52.  doi:10.11896/j.issn.1002-137X.2017.09.004
With the development of remote sensing image technology, the limitations of traditional image analysis methods have become increasingly prominent. From a multi-granularity, multi-level perspective, the adaptive clustering problem of remote sensing images, with their large amount of information and complex structures, can be solved better. Gaussian cloud transformation, which is based on the cloud model and the Gaussian mixture model, is a new multi-granularity method that can extract multiple concepts at different granularities in a problem domain. However, due to its time complexity and noise sensitivity, its clustering results on remote sensing images are not ideal. An improved Gaussian cloud transformation method was proposed in this paper. First, K-Means is used to optimize the selection of the initial granularity, and amplitude cloud synthesis is used to modify the adaptive concept abstraction strategy. Then, the granularity division is obtained by using a membership distance. Finally, the method is applied to remote sensing images. The experimental results show the correctness and effectiveness of the proposed method.
Model for Type-2 Fuzzy Rough Attribute Reduction
LU Juan and LI De-yu
Computer Science. 2017, 44 (9): 28-33.  doi:10.11896/j.issn.1002-137X.2017.09.005
Attribute reduction is an important application of rough set theory, which aims to delete redundant attributes and simplify the information system while maintaining its classification ability. However, traditional rough set theory is based on an equivalence relation, which is a very restrictive condition that may limit the application of the rough set model. To overcome this shortcoming, the type-2 fuzzy rough set is obtained by combining rough sets with type-2 fuzzy sets. Two type-1 fuzzy sets, defined on the universe and on the product space of feature spaces respectively, are used to construct a type-2 fuzzy partition of the universe. A model for fuzzy rough attribute reduction is then extended to the framework of type-2 fuzzy rough sets, yielding a model for type-2 fuzzy rough attribute reduction. An example is given to show the application of this model.
Sequential Three-way Classifier with Local Reduction
JU Heng-rong, LI Hua-xiong, ZHOU Xian-zhong, HUANG Bing and YANG Xi-bei
Computer Science. 2017, 44 (9): 34-39, 57.  doi:10.11896/j.issn.1002-137X.2017.09.006
Sequential three-way decision is a novel decision approach developed in three-way decision theory in recent years. However, classical sequential three-way decision research has paid little attention to two important issues: one is the construction of sequential information granules, and the other is the application to classification learning. To address these issues, the intrinsic sequential properties of local and global reductions were studied first in this paper. Based on these properties, sequential information granules were constructed with the reducts' properties. Furthermore, a sequential three-way classifier was proposed and designed. The experimental results show that the proposed classifier is not only good at making classifications at an appropriate information granule, but also improves the classification accuracy when compared with several classical classifiers.
Similarity Algorithm Based on Three Way Decision of Time Warping Distance
XU Jian-feng, HE Yu-fan, ZHANG Yuan-jian and TANG Tao
Computer Science. 2017, 44 (9): 40-44, 61.  doi:10.11896/j.issn.1002-137X.2017.09.007
Dynamic time warping (DTW) is widely accepted as one of the most effective methods for the similarity measurement of time series, but it suffers from high time complexity. The fast search method for dynamic time warping (FTW) has been demonstrated to accelerate DTW. The core of its pruning, however, is a typical two-way decision rather than a three-way decision, which differs from how uncertain issues should be handled. By incorporating three-way decisions, an optimized DTW model, three-way decision DTW (3WD-DTW), is first developed. The decision thresholds α and β are derived by solving an optimization problem with the objective of minimizing the error rate, and a novel simulated annealing algorithm is proposed for this purpose. Finally, a similarity algorithm based on the three-way decision of time warping distance is presented. Experiments show that 3WD-DTW is comparable to FTW in computing complexity. In terms of accuracy, 3WD-DTW outperforms FTW significantly and approximates DTW.
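For context, the baseline DTW distance that 3WD-DTW accelerates can be sketched as follows — an illustrative textbook implementation, not the authors' code; its O(nm) cost table is exactly what motivates pruning:

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = minimal warping cost aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # advance both
    return dp[n][m]
```

Pruning methods such as FTW avoid filling cells of this table whose lower-bound cost already exceeds the best known distance.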
Determining Clustering Number of FCM Algorithm Based on DTRS
SHI Wen-feng and SHANG Lin
Computer Science. 2017, 44 (9): 45-48, 66.  doi:10.11896/j.issn.1002-137X.2017.09.008
Fuzzy C-Means (FCM), the most popular soft clustering algorithm, has been extensively used to produce compact and well-separated clusters. However, its sensitivity to the initial cluster number makes choosing a good C value very important, so determining the number of clusters is a key step when using FCM for cluster analysis. In this paper, the extended decision-theoretic rough set (DTRS) model is applied to clustering validity analysis to overcome this defect of the FCM algorithm. We proposed a method for determining the clustering number of the FCM algorithm based on DTRS, and verified the clustering effect through image segmentation. Good segmentation results can be obtained by comparing the costs of different numbers of clusters. We compared our results with the ant colony fuzzy c-means hybrid algorithm (AFHA) proposed by Z. Yu et al. in 2015 and with the improved AFHA (IAFHA). The experimental results show that our clustering results are better, with a higher Bezdek partition coefficient than the AFHA and IAFHA algorithms, as well as a better Xie-Beni index.
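For reference, the core FCM iteration whose cluster number C the paper selects can be sketched as follows — a minimal 1-D illustration with the standard membership and center updates; the deterministic initialization and names are our assumptions, not the authors' code:

```python
def fcm(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy C-means; returns the sorted cluster centers."""
    lo, hi = min(points), max(points)
    # deterministic spread of initial centers over the data range
    centers = [lo + (hi - lo) * k / (c - 1) for k in range(c)]
    for _ in range(iters):
        # membership u[i][k]: degree of point i belonging to cluster k
        u = []
        for x in points:
            d = [abs(x - v) + 1e-9 for v in centers]
            u.append([1.0 / sum((d[k] / d[j]) ** (2 / (m - 1)) for j in range(c))
                      for k in range(c)])
        # centers move to the membership-weighted means
        centers = [sum(u[i][k] ** m * points[i] for i in range(len(points)))
                   / sum(u[i][k] ** m for i in range(len(points)))
                   for k in range(c)]
    return sorted(centers)
```

A DTRS-style validity analysis would run this for several values of c and compare the decision costs of the resulting partitions.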
Extraction Method of LPR Characters Features Based on Soft K-segments Algorithm for Principal Curves
Computer Science. 2017, 44 (9): 49-52.  doi:10.11896/j.issn.1002-137X.2017.09.009
License plate recognition (LPR) is an important part of intelligent transportation systems. In order to improve the recognition rate of LPR characters, feature extraction is critical. Principal curves are nonlinear generalizations of principal component analysis; they are smooth, self-consistent curves that pass through the "middle" of the distribution. By analyzing existing principal curve algorithms, we learned that the soft K-segments algorithm for principal curves exhibits good performance when the data sets are concentrated around a highly curved or self-intersecting curve. Therefore, we attempted to use this algorithm to extract the structural features of LPR characters. Experimental results show that the algorithm is feasible for extracting the structural features of LPR characters. The proposed method provides a new approach to research on LPR character feature extraction.
Simplification of Triadic Contexts and Concept Trilattices
QI Jian-jun and WEI Ling
Computer Science. 2017, 44 (9): 53-57.  doi:10.11896/j.issn.1002-137X.2017.09.010
Triadic concept analysis (TCA) is an extension of formal concept analysis (FCA). As its data foundation, the triadic context is commonly encountered in the real world. The ternary relation shown in a triadic context is the basis for forming a concept trilattice and is more complicated than the binary relation shown in a formal context, which makes triadic concepts and concept trilattices intricate. Aiming at the information simplification of a concept trilattice, an approach to simplifying a triadic context and its concept trilattice on the basis of binary relations was proposed, since the binary relations are essential to some extent. The main idea is to delete the redundant elements in each universe while keeping all the binary relations of the original triadic context; in fact, the three universes of a triadic context can be considered at the same time using this method. After the simplification, we obtained some properties of the simplified concept trilattice, and we also discussed the relationship between the original and the simplified triadic concepts. These conclusions are the research basis for future algorithm research and applications, as well as the basis for deeper theoretical study.
NMF-Based Clustering Ensemble Algorithm
HE Meng-jiao, YANG Yan and WANG Shu-ying
Computer Science. 2017, 44 (9): 58-61.  doi:10.11896/j.issn.1002-137X.2017.09.011
An NMF-based K-means clustering ensemble (NBKCE) algorithm was proposed to address the loss of effective information in ensembles, which is caused by basic clustering results obtained from the original datasets. In NBKCE, an ensemble information matrix is first built by exploiting the results of K-means, and then a relationship matrix is formed based on the original dataset. Finally, nonnegative matrix factorization (NMF) is employed to construct the consensus function that yields the final results. The experiments demonstrate that NBKCE can capture the underlying information effectively and improve clustering performance.
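In many clustering-ensemble formulations, the matrix built from multiple base clusterings is a co-association matrix. A hedged sketch of that step — our naming and construction, not necessarily NBKCE's exact ensemble information matrix:

```python
def coassociation(labelings, n):
    """Co-association matrix: M[i][j] is the fraction of base clusterings
    that place points i and j in the same cluster."""
    M = [[0.0] * n for _ in range(n)]
    for labels in labelings:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    M[i][j] += 1.0
    r = len(labelings)
    return [[v / r for v in row] for row in M]
```

An NMF-style consensus function would then factorize such a nonnegative matrix to recover the final cluster assignments.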
Knowledge Discovery Method for Heterogeneous Data Based on Concept Lattice
NIU Jiao-jiao, FAN Min, LI Jin-hai and YIN Yun-qiang
Computer Science. 2017, 44 (9): 62-66.  doi:10.11896/j.issn.1002-137X.2017.09.012
Recently, much attention has been paid to concept-lattice-based knowledge discovery methods, and this topic has attracted much research interest from the formal concept analysis and rough set communities. In particular, substantial progress has been made in recent years on formal decision contexts. However, the existing knowledge discovery methods lack feasibility and effectiveness when applied to big data. Considering that heterogeneity is one of the main characteristics of big data, this paper investigated concept-lattice-based knowledge discovery methods for heterogeneous data. Specifically, the notion of a heterogeneous formal context and its corresponding concept lattice were proposed, heterogeneous formal contexts were further employed to define heterogeneous formal decision contexts, and rule acquisition was discussed. Moreover, an algorithm for mining non-redundant decision rules from a heterogeneous formal decision context was explored.
Attribute Reduction Based on Multicost Decision-theoretic Rough Set
YANG Zhi-rong, WANG Yu and YANG Xi-bei
Computer Science. 2017, 44 (9): 67-69, 92.  doi:10.11896/j.issn.1002-137X.2017.09.013
Compared with the classic rough set, the traditional decision-theoretic rough set takes cost into consideration, using a cost matrix to generate a pair of thresholds. However, the decision-theoretic rough set does not satisfy the monotonicity that is widely used in classic rough sets, which brings a new challenge to the study of attribute reduction. Moreover, the traditional decision-theoretic rough set uses only one cost matrix and does not consider the variability of cost. This paper first introduces the pessimistic and optimistic decision rules of the multicost decision-theoretic rough set, and applies the thresholds generated by multiple cost matrices to attribute reduction. A heuristic local attribute reduction method is then proposed, defined on individual decision classes rather than the whole decision class. Experimental results show that, compared with the method based on the whole decision class, it can obtain more positive rules under optimistic conditions than under pessimistic conditions.
Fuzzy Rough Sets Based on New Kernel Functions
YE Qiu-ping and ZHANG Hong-ying
Computer Science. 2017, 44 (9): 70-73, 87.  doi:10.11896/j.issn.1002-137X.2017.09.014
Fuzzy rough sets, as a combination of fuzzy sets and rough sets, can deal with the complexity and uncertainty of data sets effectively. Fuzzy granule structures derived from fuzzy similarity relations are used to study quantitative fuzzy rough sets. Kernel functions and fuzzy similarity relations are key factors in machine learning and fuzzy rough sets. Using the relationship between fuzzy similarity relations and kernel functions, this paper presented a new approach to constructing kernel functions and gave the corresponding fuzzy rough sets. Moreover, a comparative experimental analysis was given, and the results show that the new kernel function has good generality.
Method of Three-way Decision Spam Filtering Based on Head Information of E-mail
YUAN Guo-xin and YU Hong
Computer Science. 2017, 44 (9): 74-77, 114.  doi:10.11896/j.issn.1002-137X.2017.09.015
A method of three-way decision spam filtering based on the header information of e-mail was proposed in this paper. The header fields are ranked by a new measurement of attribute significance. The Bayesian probability based on the most significant attributes is computed to take three-way decision actions. When the information is not enough to make a decision, more attribute information is added to the computation of the Bayesian probability until the final decision is made. The results of comparative experiments show that the new method is reasonable and effective.
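The accumulate-until-decisive loop can be sketched as follows — illustrative only: the likelihood ratios, threshold values and names are our assumptions, not the paper's exact Bayesian computation:

```python
def classify(attr_likelihood_ratios, prior=0.5, alpha=0.8, beta=0.2):
    """Fold in header attributes (most significant first) until the posterior
    leaves the boundary region (beta, alpha): a three-way decision."""
    odds = prior / (1 - prior)
    p = prior
    for lr in attr_likelihood_ratios:
        odds *= lr                # fold in one more attribute's evidence
        p = odds / (1 + odds)     # posterior probability of spam
        if p >= alpha:
            return "spam", p      # positive region: reject the mail
        if p <= beta:
            return "ham", p       # negative region: accept the mail
    return "boundary", p          # still undecided after all attributes
```

The three-way thresholds α > β carve the posterior into accept, reject, and defer regions; deferring triggers the "add more attribute information" step.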
Distribution Reduction in Inconsistent Interval-valued Decision Systems
ZHANG Nan, XU Xin, TONG Xiang-rong, GAO Xue-yi and JIANG Li-li
Computer Science. 2017, 44 (9): 78-82, 104.  doi:10.11896/j.issn.1002-137X.2017.09.016
Knowledge reduction, one of the hottest issues in rough set theory, can preserve various classification features of decision systems, and distribution reduction keeps the confidence level of decision rules unchanged. To provide a measurement criterion for universe classification in interval-valued decision systems, the similarity coefficient was introduced in this paper. Extending the equivalence relation in Pawlak decision systems to the tolerance relation in interval-valued decision systems, we proposed the concept of distribution reduction in inconsistent interval-valued decision systems. For this concept, we provided the computational method of the corresponding discernibility matrix, and we also discussed the relation between distribution reduction and generalized decision reduction in interval-valued decision systems. Finally, experiments show that the novel method is effective.
Concept Construction and Attribute Reduction in Incomplete Decision Formal Contexts
ZU Hong-jiao, XIE Bin and MI Ju-sheng
Computer Science. 2017, 44 (9): 83-87.  doi:10.11896/j.issn.1002-137X.2017.09.017
We defined the incomplete decision formal context, and proposed a construction method for concepts with double-subset intensions as well as generation algorithms for the concept lattices of the incomplete sub-condition formal context and the sub-decision formal context. Furthermore, the judgment theorems of consistent sets and attribute reduction based on double-subset-intension concepts were given for incomplete decision formal contexts.
Fuzzy Entropy and Uncertain Measurement in Lattice-valued Information Systems
ZHANG Xiao-yan, SANG Bin-bin and WEI Ling
Computer Science. 2017, 44 (9): 88-92.  doi:10.11896/j.issn.1002-137X.2017.09.018
In lattice-valued information systems, the concepts of knowledge rough entropy, rough set rough entropy and uncertainty measurement are introduced, and their important properties are obtained. This paper proves that the knowledge rough entropy increases monotonically as the knowledge granules become coarser and the classification of the information system becomes rougher. Furthermore, by discussing the relation between them, it is shown that the rough entropy of rough sets can measure the roughness of rough sets more accurately. These conclusions lay a theoretical foundation for knowledge discovery in lattice-valued information systems.
Hybrid Optimization Algorithm Based on Grey Wolf Optimization and Differential Evolution for Function Optimization
ZHANG Xin-ming, TU Qiang, KANG Qiang and CHENG Jin-feng
Computer Science. 2017, 44 (9): 93-98, 124.  doi:10.11896/j.issn.1002-137X.2017.09.019
Grey wolf optimizer (GWO) is a novel intelligent optimization algorithm proposed recently. It has merits such as fast convergence speed and high optimization precision, but it is easily trapped in local optima. The differential evolution (DE) algorithm has strong global search ability, but its local search ability is poor and its performance is sensitive to the parameters. To take advantage of the merits of GWO and DE and overcome their defects in function optimization problems, a hybrid optimization algorithm based on grey wolf optimization and differential evolution (GWODE) was proposed. First, GWO embedded with an optima-inclined operator is utilized, which benefits the optimization precision and convergence rate of the algorithm in a shorter search process. Then, an adaptive differential strategy, which can automatically adjust the parameter values, is employed to further improve the optimization performance on complex functions. A high-performance hybrid algorithm is thus obtained that solves various function optimization problems more efficiently. The optimization results on 12 benchmark functions show that, compared with the standard GWO, ACS, DMPSO and SinDE, the new hybrid algorithm has higher search precision, better optimization performance and stronger applicability, and is more suitable for solving a variety of optimization problems.
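For orientation, the plain GWO position update that GWODE starts from can be sketched as follows — a bare-bones illustration without the optima-inclined operator or the DE stage; all names and parameter values are our assumptions:

```python
import random

def gwo_minimize(f, dim, bounds, n_wolves=20, iters=200, seed=1):
    """Bare-bones grey wolf optimizer minimizing f over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)
        # snapshot the three best wolves (alpha, beta, delta) as leaders
        leaders = [w[:] for w in wolves[:3]]
        a = 2 - 2 * t / iters              # coefficient decays linearly 2 -> 0
        for w in wolves:
            for d in range(dim):
                x = 0.0
                for leader in leaders:
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    # move toward a perturbed copy of each leader
                    x += leader[d] - A * abs(C * leader[d] - w[d])
                w[d] = min(hi, max(lo, x / 3))  # average pull, clipped to box
    return min(wolves, key=f)
```

As `a` decays, |A| shrinks and the pack shifts from exploration to exploitation around the three leaders; the DE stage in GWODE is meant to counter the resulting risk of premature convergence.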
Research of Automatic Verification Method about Radio Frequency Identification Protocol
SONG Lan, XUE Jin-yun, HU Qi-min, XIE Wu-ping, JIANG Dong-ming and YOU Zhen
Computer Science. 2017, 44 (9): 99-104.  doi:10.11896/j.issn.1002-137X.2017.09.020
The population protocol model, a computation model inspired by biology, was designed to represent interactions among multiple components with very limited computational capability in wireless networks. It provides a theoretical framework with computation and reasoning capabilities for wireless sensor networks. This paper introduced the population protocol model into the RFID anti-collision protocol, proposed a validation framework for the RFID anti-collision protocol, built a state transition model of the interaction between tags and the reader, and verified the self-stabilizing population protocols by using the Spin model checker and linear temporal logic (LTL). This work provides an effective method to analyze and verify the correctness of protocols in wireless sensor networks.
3D Localization Estimation Algorithm Based on Locality Preserving Canonical Correlation Analysis in Wireless Sensor Networks
CUI Hong-fei, LIU Jia, GU Jing-jing and ZHUANG Yi
Computer Science. 2017, 44 (9): 105-109, 130.  doi:10.11896/j.issn.1002-137X.2017.09.021
Since existing three-dimensional localization algorithms suffer from low positioning accuracy and poor stability, the 3D-LE-LPCCA algorithm was put forward. Firstly, locality preserving canonical correlation analysis is extended to three-dimensional space, and a mapping model between the signal space and the physical coordinate space is built; solving the model yields the adjacent node set of an unknown node in the physical coordinate space. Secondly, the algorithm selects the best positioning unit from the adjacent node set under constraints on the degree of coplanarity and the volume ratio. Finally, the coordinates of the unknown node are calculated through the best positioning unit. The simulation results show that the 3D-LE-LPCCA algorithm achieves a good localization effect, improves the accuracy and stability of three-dimensional localization, and reduces the energy consumption of the nodes.
Improved Adaptive Relaying Strategy for Full Duplex Relay System
LONG Zhi-peng, YU Jiang and CHANG Jun
Computer Science. 2017, 44 (9): 110-114.  doi:10.11896/j.issn.1002-137X.2017.09.022
This paper proposed a full duplex relay strategy based on a threshold on the signal-to-noise ratio from the source node to the relay nodes (SR) and on the minimum self-interference of the relay nodes, and constructed an improved adaptive full duplex relay system by combining it with the full duplex relay strategy based on the maximum SR signal-to-noise ratio. Under equal power allocation and DF protocol forwarding, the outage probability of the proposed full duplex relay system was analyzed. The results show that the outage probability of the proposed strategy is lower than that of an adaptive half duplex relay strategy; it is also lower than that of the strategy based on the maximum SR signal-to-noise ratio when the self-interference signal is strong and the SR signal-to-noise ratio is large. When the self-interference signal is weak, the proposed adaptive system falls back to the maximum-SNR full duplex relay strategy to maintain good outage performance.
Research on Routing Protocol Based on Gradient and Energy Awareness in Wireless Sensor Networks
ZHENG Zhi-yun, GUO Fang, WANG Zhen-fei, ZHANG Xing-jin and WANG Fei
Computer Science. 2017, 44 (9): 115-119, 147.  doi:10.11896/j.issn.1002-137X.2017.09.023
To solve the problem of unbalanced energy consumption among wireless sensor network nodes, a new distributed routing protocol based on gradient and energy awareness (EGRP) was proposed. EGRP introduces three parameters to form clusters: the residual energy of the node itself, its distance gradient, and the average residual energy of its neighbors. Then, according to the residual energy and distance gradient of the inner cluster heads, every outer cluster head chooses one inner cluster head as its forwarding route to build the routing tree. Theoretical derivation and simulation results show that EGRP achieves a nearly ideal optimization effect: it reduces the energy consumption of a single node by 10.9%, improves the balance of energy consumption among nodes, and prolongs the network lifetime.
Resource Allocation for D2D Communications Underlaying Cellular Networks Using Directed Weighted Bipartite
WANG Zhen-chao, ZHAO Yun and XUE Wen-ling
Computer Science. 2017, 44 (9): 120-124.  doi:10.11896/j.issn.1002-137X.2017.09.024
This paper aimed to design a low-complexity resource allocation algorithm for D2D-underlaid cellular networks in which at most one D2D pair and one cellular user can reuse the same channel. An integer program is formulated to maximize throughput. It is then transformed into an integer program that minimizes the sum of interference channel gains, because interference is considered the most decisive factor in whether two links can share a channel. To solve the optimization problem, which can be seen as a one-to-one matching problem, the directed weighted bipartite graph and related definitions were first proposed. Then, a greedy algorithm with complexity of only O(n) was proposed to search for optimal matching pairs. Simulation results show that our algorithm achieves better throughput and capacity than the weighted bipartite matching algorithm within a certain range, while the complexity is reduced by two orders of magnitude.
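To illustrate the one-to-one matching step, a hedged sketch — our own sort-based greedy pass (O(n² log n) for clarity) rather than the paper's O(n) procedure over the directed weighted bipartite graph:

```python
def greedy_assign(interference):
    """interference[i][j]: interference gain if D2D pair i reuses channel j.
    Greedily match pairs to channels in ascending order of gain."""
    edges = sorted((w, i, j)
                   for i, row in enumerate(interference)
                   for j, w in enumerate(row))
    used_pair, used_chan, match = set(), set(), {}
    for w, i, j in edges:
        if i not in used_pair and j not in used_chan:
            match[i] = j            # commit the cheapest remaining edge
            used_pair.add(i)
            used_chan.add(j)
    return match
```

Each D2D pair ends up on the least-interfering channel still available, approximating the minimum-sum-interference matching.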
Research on Pattern Matching Algorithm in Intrusion Detection System
XU Zhou-bo, ZHANG Yong-chao, GU Tian-long and NING Li-hua
Computer Science. 2017, 44 (9): 125-130.  doi:10.11896/j.issn.1002-137X.2017.09.025
Snort is a network intrusion detection system whose detection principle is based on pattern matching. In order to improve the efficiency of the matching algorithm, the BM algorithm in Snort was improved in two aspects. Firstly, to increase the shift distance after a mismatch, the two characters following the character aligned with the rightmost position of the pattern in the text, together with the two corresponding characters after shifting by the pattern length, are taken into consideration, so the maximum shift distance becomes 2m+2. Furthermore, the frequency of larger shift distances after a mismatch is increased by exploiting the probability characteristics of the combination of the rightmost character and its successor. Finally, experiments on these algorithms were conducted. The experimental results show that the proposed algorithm can effectively reduce the number of window shifts and character comparisons, so the matching efficiency is improved.
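For context, the unimproved baseline: Boyer-Moore-family matchers shift the search window using a precomputed bad-character table. A minimal Horspool-style sketch (illustrative only; the paper's variant additionally inspects two following characters to reach shifts of up to 2m+2):

```python
def horspool_search(text, pattern):
    """Boyer-Moore-Horspool: return first match index of pattern, or -1."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1
    # bad-character table: shift distance keyed by each character's last
    # position in the pattern (excluding the final character)
    shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        # shift by the table entry for the text character under the
        # window's last position; unseen characters allow a full-m shift
        i += shift.get(text[i + m - 1], m)
    return -1
```

Characters absent from the pattern already permit a shift of m; the improved algorithm's extra lookahead pushes the best case past 2m.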
Software Watermarking Scheme Based on Dynamic Graph Coding
LIU Jia-yi and YAN Xue-feng
Computer Science. 2017, 44 (9): 131-135.  doi:10.11896/j.issn.1002-137X.2017.09.026
A software watermark can prove information related to the software. At present, most watermarking algorithms are based on the classic dynamic graph watermarking algorithm, the CT algorithm: after the watermark is decomposed into segments, the segments are embedded through a coding scheme. The ExtendPPCT coding scheme changes the structure of the PPCT, so the concealment of the watermark is poor and nodes are easy to remove. A new hybrid encoding based on the planted plane cubic tree (PPCT) and rank-order encoding was put forward: the PPCT expresses the modulus, and the rank-order coding of the PPCT leaves expresses the remainder. This paired encoding halves the number of watermark segments embedded in the software, which has little effect on the software and makes the watermark hiding ability stronger. The encoding does not change the unique external loop of the original PPCT, and at the same time it can resist reducing attacks.
Network Anomaly Detection Model Based on Time-varying Weighted Markov Chain
WANG Xiao, QI Yong and LI Qian-mu
Computer Science. 2017, 44 (9): 136-141, 161.  doi:10.11896/j.issn.1002-137X.2017.09.027
With the rapid development of the Internet, network intrusion events are becoming more and more frequent, and intrusion detection is of great significance to the protection of network security. In view of the urgent demand for real-time intrusion detection, a network intrusion detection model based on a time-varying weighted Markov chain was proposed in this paper. The model uses combined state sequences to describe state transitions. Log events generated by the DARPA2000 data set on an NT system were used as the experimental data for simulation experiments, and the model was compared with a weighted Markov chain model. The simulation results show that the proposed model can be used for real-time intrusion detection with high accuracy and strong robustness, and can effectively reduce the false detection rate.
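A plain first-order version of the underlying idea — estimate transition probabilities from normal logs, then score new sequences by how unlikely their transitions are — can be sketched as follows; the paper's model additionally uses combined state sequences and time-varying weights, which this illustration omits:

```python
import math
from collections import defaultdict

def train_transitions(seq):
    """Estimate first-order transition probabilities from a normal state sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for s, t in zip(seq, seq[1:]):
        counts[s][t] += 1
    return {s: {t: c / sum(d.values()) for t, c in d.items()}
            for s, d in counts.items()}

def anomaly_score(seq, P, floor=1e-6):
    """Mean negative log-probability of observed transitions; high = anomalous.
    Unseen transitions get a small floor probability instead of zero."""
    logs = [math.log(P.get(s, {}).get(t, floor)) for s, t in zip(seq, seq[1:])]
    return -sum(logs) / len(logs)
```

Thresholding this score separates sequences that follow the learned transition structure from those that do not.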
Formal Design and Verification for Typical Security Gateway
WANG Rui-yun, ZHAO Guo-lei, CHANG Chao-wen and WANG Xue-jian
Computer Science. 2017, 44 (9): 142-147.  doi:10.11896/j.issn.1002-137X.2017.09.028
A security gateway designed from experience usually focuses on function implementation and is seldom designed according to a security model. To solve this problem, we proposed a method for formally designing and verifying a typical security gateway. Firstly, we designed the security gateway's security policy according to its security requirements. Secondly, we formally modeled the security policy and verified the security model's internal consistency by means of the BLP model. Finally, we verified the consistency between the security gateway's functional specifications and its security model. To guarantee the correctness of the reasoning procedure, we used the theorem prover Isabelle/HOL to formally describe the above work and assist the deduction. Our work ensures the security of a typical security gateway at the level of its top-level design and provides a useful reference for the formal design of security gateways.
AS Security Alliance Mechanism for Inter-domain Routing System Based on Mimicry Protection
MIAO Fu, WANG Zhen-xing, GUO Yi and ZHANG Lian-cheng
Computer Science. 2017, 44 (9): 148-155.  doi:10.11896/j.issn.1002-137X.2017.09.029
Abstract PDF(2265KB) ( 35 )   
References | Related Articles | Metrics
Large-scale low-rate denial of service attacks against BGP sessions can paralyze the inter-domain routing system as a whole. However, existing detection methods and protection measures can hardly detect and defend against such attacks effectively. Probing the topology of the inter-domain routing system and obtaining the key link parameters are the fundamental steps of a BGP-LDoS attack. Mimic transformation of the network can provide continuous dynamic transformation to puzzle the attacker, increase the cost and complexity of the attacker's detection and analysis, and reduce the attack's success probability. From the view of mimic security defense, this paper presented a security alliance mechanism for the inter-domain routing system. The method organizes neighboring autonomous systems into an alliance and performs equivalent topology transformation within the alliance. The concrete realization process was given, and the resilience against the BGP-LDoS attack after the mimic transformation was checked and analyzed. Experimental results demonstrate that the method can effectively reduce the accuracy of the attacker's network topology analysis and interfere with the attacker's target link selection process. It can provide reliable protection for the inter-domain routing system against BGP-LDoS attacks.
Hierarchical Privacy Protection of Multi-source Data Fusion for Sensitive Value
YANG Yue-ping, WANG Jian and XUE Ming-fu
Computer Science. 2017, 44 (9): 156-161.  doi:10.11896/j.issn.1002-137X.2017.09.030
Abstract PDF(517KB) ( 90 )   
References | Related Articles | Metrics
Data fusion technology enables users to obtain more comprehensive data and thus more effective services. However, the existing multi-source data fusion privacy protection models consider neither the importance of the data providers nor the sensitivity of different attributes and attribute values. To address these problems, this paper proposed a hierarchical privacy model for sensitive values. The model enables data providers to set the sensitivity of data attributes and attribute values through anonymity degree requirements, realizing individualized privacy protection of sensitive values. At the same time, this paper proposed a multi-source data fusion privacy protection algorithm for sensitive values that combines the k-anonymity privacy model with top-down specialization (TDS). Experiments show that the proposed algorithm not only realizes secure data fusion, but also achieves better privacy protection.
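To make the anonymity-degree idea concrete, here is a minimal sketch (not the paper's algorithm) of checking hierarchical k-anonymity, where each sensitive value carries its own required group size. The record layout and the `k_of_level` mapping are assumptions for illustration.

```python
from collections import defaultdict

def satisfies_hierarchical_k(records, quasi_ids, sensitive, k_of_level):
    """Check hierarchical k-anonymity: every equivalence class over the
    quasi-identifiers must be at least as large as the strictest k required
    by any sensitivity level present in that class."""
    groups = defaultdict(list)
    for r in records:
        key = tuple(r[q] for q in quasi_ids)
        groups[key].append(r)
    for group in groups.values():
        required = max(k_of_level[r[sensitive]] for r in group)
        if len(group) < required:
            return False
    return True
```

A generalization procedure such as TDS would repeatedly coarsen quasi-identifier values until this check passes.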
Secure Data Aggregation Scheme for Multiple Applications in Wireless Sensor Networks
CHEN Yan-li, ZHANG Qian, XU Jian and WANG Meng-han
Computer Science. 2017, 44 (9): 162-167.  doi:10.11896/j.issn.1002-137X.2017.09.031
Abstract PDF(479KB) ( 49 )   
References | Related Articles | Metrics
To address the security of multi-source heterogeneous data during data aggregation in wireless sensor networks (WSNs), this paper proposed a lightweight secure data aggregation scheme which guarantees data confidentiality, integrity and freshness. A hash function is used to update the key of each time slot, taking the current aggregation round and the preset key as input. Homomorphic encryption allows intermediate nodes to perform aggregation operations directly on ciphertext. A homomorphic message authentication code (HMAC) enables the base station to verify whether the aggregated data have been modified during transportation. Moreover, plaintext is coded before being encrypted so as to support multiple applications. Theoretical analysis and simulation verify that the proposed algorithm preserves data privacy with lower communication consumption and higher data aggregation accuracy.
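The additive-homomorphic idea can be sketched with a simple key-stream cipher in the style of Castelluccia et al. (an assumption; the paper's concrete cipher, coding and HMAC constructions may differ): each node masks its reading with a per-round key derived by hashing, intermediate nodes add ciphertexts directly, and the base station removes the key sum.

```python
import hashlib

M = 2 ** 32  # modulus; must exceed the largest possible aggregate sum

def round_key(preset_key: bytes, node_id: int, rnd: int) -> int:
    """Derive a per-round key by hashing the preset key with node id and round."""
    h = hashlib.sha256(preset_key + node_id.to_bytes(4, "big") + rnd.to_bytes(4, "big"))
    return int.from_bytes(h.digest()[:4], "big") % M

def encrypt(m: int, key: int) -> int:
    """c = (m + k) mod M is additively homomorphic: sums of ciphertexts
    decrypt to sums of plaintexts once the key sum is removed."""
    return (m + key) % M

def aggregate(ciphertexts):
    """Intermediate nodes aggregate by adding ciphertexts modulo M."""
    return sum(ciphertexts) % M

def decrypt_sum(agg: int, keys) -> int:
    """The base station, knowing every node's round key, recovers the sum."""
    return (agg - sum(keys)) % M
```

Because the key changes every round, replaying an old ciphertext fails to decrypt correctly, which is what provides freshness.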
Improved Meaningful Collision Attack on MD4
ZHOU Yong-peng and WANG Gao-li
Computer Science. 2017, 44 (9): 168-171, 177.  doi:10.11896/j.issn.1002-137X.2017.09.032
Abstract PDF(360KB) ( 74 )   
References | Related Articles | Metrics
At FSE 1996, Hans Dobbertin gave a meaningful collision on MD4 based on ASCII, which contains meaningless words at the beginning of the text. In 2009, Jia and Wang presented a meaningful collision on MD4 based on the Latin-1 character set, which contains meaningless words at the end of the text. In this paper, based on the modular differential method proposed by Wang, we gave two concrete meaningful collisions by using the differential characteristic proposed by Yu et al. at CANS 2005. One example of a meaningful collision is in Chinese and based on GBK; the other is in English and based on UTF-8. Moreover, an example of a tampered Python script was presented.
Design for Improving Energy-efficiency and Ensuring Physical-layer Security in Full-duplex Cognitive Relay Networks
ZHANG Pei, ZHANG Jian-ming and WANG Liang-min
Computer Science. 2017, 44 (9): 172-177.  doi:10.11896/j.issn.1002-137X.2017.09.033
Abstract PDF(461KB) ( 71 )   
References | Related Articles | Metrics
Against the background of “green” and secure communication, we studied co-time co-frequency full-duplex cognitive relay networks consisting of two secondary source nodes, multiple cognitive relay nodes, multiple primary nodes and multiple primary eavesdroppers. To improve the total energy efficiency on the premise of ensuring physical-layer security and the primary nodes' performance, whether the selected relay protocol is amplify-and-forward or decode-and-forward, we proposed power allocation schemes to obtain a cooperative beamforming coefficient and an artificial noise matrix, taking both self-interference cancellation and forwarding fairness into consideration. The problem is mainly optimized by a “hill climbing” algorithm combined with semidefinite relaxation (SDR). Simulation results and theoretical analysis show the effectiveness and rationality of our scheme. Moreover, choosing the amplify-and-forward relay protocol leads to higher total energy efficiency.
P2P-based Privacy Protection Strategy in Mobile-commerce Recommender System
WANG Li-e, XU Yuan-xin, LI Xian-xian and LIU Peng
Computer Science. 2017, 44 (9): 178-183.  doi:10.11896/j.issn.1002-137X.2017.09.034
Abstract PDF(539KB) ( 73 )   
References | Related Articles | Metrics
Mobile recommender systems have recently become one of the hottest topics in the domain of recommender systems. How to provide high-precision recommendations together with privacy protection has become the main challenge in the development of mobile commerce, since mobile devices hold private data and mobile networks are complex. Due to their limited computation power and bandwidth, mobile-commerce recommender systems cannot use the privacy-preserving techniques initially designed for traditional recommender systems. To address these problems, a P2P-based privacy protection approach specifically for mobile-commerce recommender systems was proposed in this paper. Our approach keeps incremental data intact to guarantee high-precision recommendations while preserving privacy by constructing friends' circles and forwarding data anonymously based on the k-anonymity model. Finally, experiments show that the P2P-based privacy protection approach is feasible and effective.
Static Semantics of Aspect-oriented Programming
XIE Gang, WEI Li and WU Xiang
Computer Science. 2017, 44 (9): 184-189, 215.  doi:10.11896/j.issn.1002-137X.2017.09.035
Abstract PDF(474KB) ( 49 )   
References | Related Articles | Metrics
To date, researchers have developed various formal semantics for aspect-oriented programs. However, none of these semantics characterizes both the aspect-oriented programming specification and the declaration section of an aspect comprehensively and precisely. To take a further step, we defined a unified aspect-oriented programming specification language in our research. Then, we provided formal definitions of joinpoint and pointcut for aspect-oriented programs. Next, we introduced structural variables into the static structure to represent aspect-oriented programs. Finally, we defined the static semantics of aspect-oriented programs using the notion of design in unifying theories of programming, and proved its soundness. The approach was illustrated with a case study to demonstrate the usage of the semantics.
Segmentation and Application of Multilevel Morphology Model in GUI Testing
WANG Hao-liang and GAO Jian-hua
Computer Science. 2017, 44 (9): 190-194, 199.  doi:10.11896/j.issn.1002-137X.2017.09.036
Abstract PDF(467KB) ( 73 )   
References | Related Articles | Metrics
Model-based GUI testing (MBGT) approaches are efficient since their test cases can be generated automatically. Employing the multilevel morphology model (MMM) in MBGT allows testers to explore the morphological differences of the GUI model and therefore increases fault detection effectiveness. However, MMM can only be extended as a whole when its level increases, so the model becomes more and more complex and harder to process. In this paper, we proposed an MMM segmentation approach based on event classification, and a corresponding test case generation strategy that employs BFS and the CPP algorithm. This approach enables MMM to focus on the important parts of the model while reducing the number and length of test cases, making MMM more agile and efficient. The experimental results indicate that the segmented MMM has almost the same fault detection effectiveness as the original model, and becomes relatively more efficient as the model level increases.
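A minimal sketch of BFS-based test-case generation over an event-flow graph (the event names and the length cut-off are illustrative; the paper additionally applies a CPP algorithm on the segmented model):

```python
from collections import deque

def bfs_test_paths(graph, start, max_len):
    """Enumerate event sequences (test cases) up to max_len events by BFS
    over an event-flow graph given as {event: [follow-up events]}."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        paths.append(path)
        if len(path) < max_len:
            for nxt in graph.get(path[-1], []):
                queue.append(path + [nxt])
    return paths
```

Each returned path is one executable test case; segmentation shrinks the graph each BFS runs over, which is where the reduction in test-case count comes from.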
Method of SBDD Based on Dynamic Fault Tree
ZHANG Xiao-ce, YAN Xue-feng and ZHOU Yong
Computer Science. 2017, 44 (9): 195-199.  doi:10.11896/j.issn.1002-137X.2017.09.037
Abstract PDF(368KB) ( 42 )   
References | Related Articles | Metrics
When analyzing Pandora dynamic fault trees, the SBDD method does not take the complex relations between events into account. As a result, the generated SBDD contains invalid branches, and invalid cut sets appear among the disjoint minimum cut sets. Aiming at this problem, an improved SBDD method was proposed in this paper which removes invalid nodes dynamically and avoids invalid branches. The improved method mainly includes two parts: a relation sorting method based on the structure sorting method, and a dynamic optimization generation algorithm for the SBDD. The basic idea of the relation sorting method is to give events different sorting priorities based on the structural relations of the fault tree and the relationships between events. The dynamic optimization generation algorithm computes the SBDD from the event ordering queue; during the computation it removes invalid nodes dynamically, so the results contain no invalid cut sets. Experiments show that, in approximately the same time, the SBDD generated by the improved method is smaller, and the disjoint minimum cut sets are fewer and contain no invalid sets.
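For the static gates, the notion of cut sets, and why non-minimal ones should be filtered out as invalid results, can be illustrated with a short sketch (Pandora's temporal gates and the SBDD construction itself are beyond this fragment):

```python
def cut_sets(node):
    """Cut sets of a static fault tree given as nested tuples:
    ('OR', child, ...), ('AND', child, ...), or a basic-event name."""
    if isinstance(node, str):
        return [frozenset([node])]
    op, *children = node
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':
        return [s for cs in child_sets for s in cs]
    # AND: cross-product union of the children's cut sets
    result = child_sets[0]
    for cs in child_sets[1:]:
        result = [a | b for a in result for b in cs]
    return result

def minimal(sets):
    """Drop non-minimal cut sets (proper supersets of another cut set)."""
    return [s for s in sets if not any(t < s for t in sets)]
```

For the tree OR(AND(A, B), A), the raw cut sets are {A, B} and {A}; only {A} is minimal, so {A, B} is exactly the kind of redundant set the improved SBDD avoids generating.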
K-nearest Neighbor Based Interest Group Query in Geo-social Networks
WANG Jia-nan, CHEN Mo, GONG Shu-feng and YU Ge
Computer Science. 2017, 44 (9): 200-207.  doi:10.11896/j.issn.1002-137X.2017.09.038
Abstract PDF(1192KB) ( 116 )   
References | Related Articles | Metrics
Users on geo-social networks may want to find other users with the same interests. Motivated by this, we proposed a new type of spatial query named the K-nearest neighbor based interest group query (KNNIG). Different from traditional spatial K-nearest neighbor queries, which only consider the distance constraint, the KNNIG query also considers the users' interest values for the query keyword, based on which we derived a D-I ranking function. The KNNIG query retrieves a user group of size k that maximizes the ranking function. In addition, we proposed three query processing algorithms: a basic algorithm KNNIG-G, an optimized algorithm KNNIG-G*, and a distance relaxation algorithm KNNIG-DR based on a grid index. Building on KNNIG-G, KNNIG-G* and KNNIG-DR employ a spatial pruning strategy and a distance relaxation strategy respectively, which effectively reduce the computational cost and improve query efficiency. Experimental results on a real dataset demonstrate the feasibility and effectiveness of the proposed algorithms.
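The exact D-I ranking function is defined in the paper; a plausible sketch of the basic KNNIG-G style scan, with a hypothetical linear trade-off between interest and distance, looks like this:

```python
import math
import heapq

def di_score(user, query_loc, keyword, alpha=0.5):
    """Hypothetical D-I ranking: trade interest value against distance.
    Higher interest and smaller distance yield a larger score."""
    dx = user["loc"][0] - query_loc[0]
    dy = user["loc"][1] - query_loc[1]
    dist = math.hypot(dx, dy)
    interest = user["interests"].get(keyword, 0.0)
    return alpha * interest - (1 - alpha) * dist

def knnig(users, query_loc, keyword, k):
    """Return the k users maximizing the D-I score (brute-force scan)."""
    return heapq.nlargest(k, users, key=lambda u: di_score(u, query_loc, keyword))
```

KNNIG-G* and KNNIG-DR would avoid scoring every user by pruning grid cells whose best possible score cannot enter the top k.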
Consistent Web Table Augmentation Based on Column Overlapping
QI Fei, WANG Ning, ZHANG Li-fang and SUN Wei-juan
Computer Science. 2017, 44 (9): 208-215.  doi:10.11896/j.issn.1002-137X.2017.09.039
Abstract PDF(617KB) ( 48 )   
References | Related Articles | Metrics
Web table augmentation extends table content based on the main column or other known information, helping people obtain the information they are interested in. Current research focuses on entity-attribute binary tables composed of a main column and one extended column, where the main column is the only matching basis. When this is applied to a table with multiple columns to be extended, the result table consolidated from binary tables suffers from the entity inconsistency problem. We proposed a consistency support degree based on the relationships between columns as well as between tuples in the table, and implemented the CCA system for consistent table augmentation based on column overlapping. Our method keeps the matching scores of candidate values high while using as few source tables as possible to avoid entity inconsistency. Experimental results show that the proposed CCA system achieves higher accuracy, coverage and consistency with lower query time cost compared with existing methods.
Research of Approximate Keyword Query on Fuzzy XML Documents
LI Ting and CHENG Hai-tao
Computer Science. 2017, 44 (9): 216-221, 226.  doi:10.11896/j.issn.1002-137X.2017.09.040
Abstract PDF(551KB) ( 71 )   
References | Related Articles | Metrics
The study of keyword queries on crisp XML documents is mainly based on the LCA semantics or its variants (SLCA, ELCA), and the most compact XML subtrees containing all keywords are returned as query results. However, the results generated under the LCA semantics often contain a lot of redundant information, and a great deal of uncertain and fuzzy information exists in the real world. How to obtain high-quality results for keyword queries on fuzzy XML documents is therefore an issue that needs to be studied. Aiming at approximate keyword queries on fuzzy XML documents, we first introduced the concept of the minimum connecting tree, formulated the All GDMCTs problem for keyword queries on fuzzy XML documents, and gave a stack-based algorithm, All fuzzy GDMCTs, to solve it. The algorithm obtains all GDMCT results satisfying the given subtree size threshold and possibility threshold. Experimental results show that the algorithm yields high-quality results for keyword queries on fuzzy XML documents.
Algorithm of Continuous Attribute Discretization Based on Binary Ant Colony and Rough Sets
CAO Feng, TANG Chao and ZHANG Jing
Computer Science. 2017, 44 (9): 222-226.  doi:10.11896/j.issn.1002-137X.2017.09.041
Abstract PDF(415KB) ( 55 )   
References | Related Articles | Metrics
Discretization is an important step in data preprocessing and has been widely applied in rule extraction, knowledge discovery, and classification. A discretization algorithm for continuous attributes based on a binary ant colony and rough sets was proposed in this paper. The algorithm constructs a binary ant colony network on the cut point set generated from multidimensional continuous attributes. Meanwhile, it searches for the globally optimal discretization cut point set using a fitness function constructed from the accuracy of approximation classification in rough sets. To validate its effectiveness, the proposed discretization algorithm was applied to seven UCI data sets, and the experimental results indicate that it performs relatively better.
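The accuracy of approximation driving the fitness function is a textbook rough-set quantity, shown here on toy data rather than on actual cut-point encodings: the ratio between the sizes of the lower and upper approximations of the target set under the partition induced by the chosen cut points.

```python
def approximation_accuracy(blocks, target):
    """Rough-set accuracy of approximation: |lower| / |upper| of the target
    set, where `blocks` is the partition induced by the discretization."""
    lower, upper = set(), set()
    for b in blocks:
        if b <= target:        # block entirely inside the target: lower approx.
            lower |= b
        if b & target:         # block overlapping the target: upper approx.
            upper |= b
    return len(lower) / len(upper) if upper else 1.0
```

A candidate cut-point set that yields accuracy closer to 1 preserves the classification ability of the original continuous attributes better.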
Research on Attitude Algorithm Based on Improved Extended Kalman Filter
FENG Shao-jiang, XU Ze-yu, SHI Ming-quan and WANG Xiao-dong
Computer Science. 2017, 44 (9): 227-229, 249.  doi:10.11896/j.issn.1002-137X.2017.09.042
Abstract PDF(312KB) ( 280 )   
References | Related Articles | Metrics
In order to solve the problem that the standard extended Kalman filter (EKF) has low accuracy in the multi-rotor UAV attitude solver, an improved extended Kalman filter algorithm (BPNN-EKF) was proposed. The parameters of the EKF prediction model are required to be known a priori, but in engineering practice it is difficult to obtain accurate parameters, and applying a linear model to a nonlinear system causes errors for the standard EKF. Aiming at these problems, we used the nonlinear mapping and adaptive abilities of a neural network to compensate the estimate of the standard EKF, reducing the impact of model and filter parameter errors on the optimal estimate and thereby enhancing its accuracy. The simulation results show that BPNN-EKF significantly improves the accuracy of the multi-rotor UAV attitude solver.
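The recursion being compensated can be shown in its simplest scalar form (a didactic 1-D random-walk case, not the full multi-rotor attitude EKF; the BP network correction term itself is omitted):

```python
def kalman_step(x, P, z, Q, R):
    """One predict/update cycle of a scalar Kalman filter for a random-walk
    state: x is the estimate, P its variance, z the new measurement,
    Q the process noise, R the measurement noise. BPNN-EKF would add a
    learned correction on top of the returned estimate."""
    # predict: state unchanged, uncertainty grows by the process noise
    P = P + Q
    # update: blend prediction and measurement by the Kalman gain
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1 - K) * P
    return x, P
```

When Q and R are mis-specified, the gain K is wrong and the estimate biased; that residual error is what the neural network learns to cancel.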
Research on Improved Cache Replacement Algorithm Serving for Wind Power System
LU Er-jie, CHEN Luan, LI Jian, HUANG Qi, ZHANG Zhen-yuan, JING Shi and ZHOU Tong-han
Computer Science. 2017, 44 (9): 230-233, 238.  doi:10.11896/j.issn.1002-137X.2017.09.043
Abstract PDF(397KB) ( 47 )   
References | Related Articles | Metrics
Aiming at problems such as the low cache hit rate in wind power systems, and after studying LRU (Least Recently Used), LFU (Least Frequently Used), SIZE and Hybrid, a replacement algorithm FST (Frequency, Object Size, Access Time) based on comprehensive factors was proposed to overcome the single-factor consideration and low system performance of traditional algorithms. The algorithm combines access frequency, object size, access time interval and the longest time without access, and segments objects according to the length of the most recent access time. In contrast tests with LRU, LFU and SIZE on the wind power system cache server, the FST algorithm shows better performance in improving the hit rate and reducing the delay time.
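A hypothetical scoring rule in the FST spirit (the actual weighting and segmentation details are in the paper) might rank cache entries as follows, evicting the lowest-scoring one:

```python
def fst_score(entry, now, w_f=1.0, w_s=1.0, w_t=1.0):
    """Illustrative FST-style cache value: frequently used, small, recently
    accessed objects score high; w_f, w_s, w_t weight the three factors."""
    age = now - entry["last_access"]
    return w_f * entry["freq"] / (w_s * entry["size"] * (1.0 + w_t * age))

def choose_victim(cache, now):
    """Evict the entry with the lowest combined score."""
    return min(cache, key=lambda k: fst_score(cache[k], now))
```

Compared with LRU or SIZE alone, a large, rarely used object is evicted even if it was touched recently, which is the intended multi-factor behavior.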
Neighborhood Collaborative Representation Based Classification Method
XU Su-ping, YANG Xi-bei, YU Hua-long and YU Dong-jun
Computer Science. 2017, 44 (9): 234-238.  doi:10.11896/j.issn.1002-137X.2017.09.044
Abstract PDF(398KB) ( 92 )   
References | Related Articles | Metrics
In the neighborhood rough set model, as the size of information granules increases, the neighborhood classifier (NC) based on the majority voting rule easily misjudges the classes of unknown samples. To remedy this deficiency, based on the idea of collaborative representation based classification (CRC), we proposed a neighborhood collaborative representation based classification method, namely the neighborhood collaborative classifier (NCC). NCC first performs feature selection for the classification task with the neighborhood rough set model, and then finds the neighborhood space of an unknown sample under the selected features. Finally, instead of applying the majority voting rule in the neighborhood space, NCC judges the class of the unknown sample by collaborative representation, taking the class with the minimal reconstruction error as the predicted category. Experimental results on 4 UCI data sets show that, compared with NC, the proposed NCC achieves satisfactory performance with larger information granules; compared with CRC, NCC greatly reduces the size of the dictionary while maintaining good classification accuracy, improving the efficiency of classification.
Bayesian Networks Structure Learning Algorithm Based on Cloud Genetic Annealing
CAO Ru-sheng, NI Shi-hong and ZHANG Peng
Computer Science. 2017, 44 (9): 239-242.  doi:10.11896/j.issn.1002-137X.2017.09.045
Abstract PDF(311KB) ( 57 )   
References | Related Articles | Metrics
In view of the demanding requirements of Bayesian network structure learning, a learning strategy based on a cloud genetic annealing algorithm, which combines the cloud genetic algorithm and the simulated annealing algorithm, was proposed. The solution update operation is accomplished by selection, cloud crossover and cloud mutation. To address the shortcoming that the algorithm may fall into local optima prematurely, this paper put forward an adaptive cloud crossover operator and cloud mutation operator. The simulation shows that both learning accuracy and operational efficiency are improved.
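The simulated-annealing half of the hybrid can be sketched generically (the cloud-model crossover and mutation operators are omitted; the `neighbor` callback stands in for them, and all parameters are illustrative):

```python
import math
import random

def anneal(init, neighbor, energy, t0=1.0, cooling=0.95, steps=200, seed=0):
    """Generic simulated-annealing loop: accept any improvement, and accept
    a worse candidate with probability exp(-delta / temperature), cooling
    the temperature geometrically each step."""
    rng = random.Random(seed)
    x, e, t = init, energy(init), t0
    best, best_e = x, e
    for _ in range(steps):
        y = neighbor(x, rng)
        ey = energy(y)
        if ey < e or rng.random() < math.exp((e - ey) / t):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t *= cooling
    return best, best_e
```

In the hybrid strategy, candidates produced by cloud crossover/mutation of network structures would be scored by a Bayesian network fitness (e.g. BIC) instead of the toy quadratic used below.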
Collaborative Filtering Algorithm Incorporating Time Factor and User Preference Properties
ZENG An, GAO Cheng-si and XU Xiao-qiang
Computer Science. 2017, 44 (9): 243-249.  doi:10.11896/j.issn.1002-137X.2017.09.046
Abstract PDF(536KB) ( 63 )   
References | Related Articles | Metrics
To alleviate the impact of data sparsity and overcome the limits of the traditional collaborative filtering algorithm, a novel collaborative filtering algorithm (CF-TP) incorporating a time factor and user preference properties was proposed. A user-item rating matrix is first converted into a user-item preference matrix by introducing a preference model, which helps reduce the impact of different users' rating habits. Then the asymmetric influence between users is taken into consideration when computing predicted scores. More importantly, to further improve the accuracy of top-N recommendation, a time weight function reflecting users' dynamic interests changing over time is designed. The experimental results on the HetRec2011 and MovieLens1M data sets suggest that the proposed algorithm is superior to other advanced approaches in precision, recall and F1.
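One common form of such a time weight function (an illustrative choice; the paper's exact function may differ) is exponential decay with a half-life, so that recent ratings dominate the prediction:

```python
def time_weight(t, t_now, half_life=30.0):
    """Exponential time decay: a rating loses half its influence every
    `half_life` time units."""
    return 0.5 ** ((t_now - t) / half_life)

def predict(ratings, t_now, half_life=30.0):
    """Time-weighted average rating for one item, given (time, rating) pairs."""
    num = sum(time_weight(t, t_now, half_life) * r for t, r in ratings)
    den = sum(time_weight(t, t_now, half_life) for t, _ in ratings)
    return num / den
```

With a 30-day half-life, a rating of 1.0 given 30 days ago and a rating of 5.0 given today average to 11/3 ≈ 3.67 rather than 3.0, reflecting the user's current interest.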
Parallel Collaborative Filtering Algorithm Based on User Recommended Influence
WANG Shuo, SUN Guang-ming, ZOU Jing-zhao and LI Wei-sheng
Computer Science. 2017, 44 (9): 250-255, 271.  doi:10.11896/j.issn.1002-137X.2017.09.047
Abstract PDF(601KB) ( 49 )   
References | Related Articles | Metrics
Similarity measures based on common ratings and the full item set fail to identify the recommendation influence of nearest neighbors, which leads to lower recommendation quality and poor scalability. Using non-commonly rated items, the categories of commonly rated items and user visit counts, this paper proposed a parallel collaborative filtering algorithm based on user recommendation influence. It computes a user's recommendation novelty degree and interest coincidence to measure the user's recommendation influence. By incorporating this influence into the similarity calculation, the algorithm can effectively restrain users that have high similarity but low recommendation influence, avoid similarity computation over the full item set, and improve recommendation quality. Furthermore, through MapReduce parallelization, the algorithm achieves good real-time performance and scalability. The experimental results show that the parallel algorithm offers higher recommendation quality and better scalability on big data.
Research on Sentence Semantic Similarity Calculation Based on Word2vec
LI Xiao, XIE Hui and LI Li-jie
Computer Science. 2017, 44 (9): 256-260.  doi:10.11896/j.issn.1002-137X.2017.09.048
Abstract PDF(1631KB) ( 425 )   
References | Related Articles | Metrics
Using the idea of deep learning, word2vec can automatically learn the essential information of data from large-scale text. Therefore, with the help of the LTP platform of Harbin Institute of Technology and based on the word2vec model, sentence processing is simplified to operations on vectors in a vector space, and the similarity between vectors represents the semantic similarity between sentences. In addition, sentence structure information is added to the sentence similarity calculation, the algorithm is improved for special sentence patterns, and the syntactic relationships between words are taken into account. The experimental results show that this method reveals the semantic relations between sentences more accurately; the syntactic structure and the improved extraction algorithm also solve the problem of computing the similarity of complex sentences, and finally improve the accuracy of the similarity calculation.
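The core computation, composing word2vec vectors into a sentence vector and comparing sentences by cosine similarity, can be sketched as follows (toy 2-D embeddings stand in for trained word2vec vectors; the structural and syntactic refinements are not shown):

```python
import math

def sentence_vector(words, embeddings):
    """Average the word vectors of a sentence (a simple word2vec-style
    composition); out-of-vocabulary words are skipped."""
    vecs = [embeddings[w] for w in words if w in embeddings]
    dim = len(next(iter(embeddings.values())))
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(a, b):
    """Cosine similarity between two vectors; 0.0 for a zero vector."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0
```

Words with similar contexts receive nearby vectors, so sentences built from them come out more similar than sentences about unrelated topics.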
Data Consolidation Based on Structure Granulation and Matrix Computation
YAN Lin, GAO Wei and YAN Shuo
Computer Science. 2017, 44 (9): 261-265, 299.  doi:10.11896/j.issn.1002-137X.2017.09.049
Abstract PDF(462KB) ( 39 )   
References | Related Articles | Metrics
In order to study the problem of data consolidation, and to make the consolidated data keep the associated information from before consolidation, a structure composed of a data set together with a weighted relation was constructed, called the weighted association structure. It gives a structured representation of various data information. Then, by using a granulation set, the weighted association structure is transformed into the weighted granulation structure. This allows the data in a granule to be consolidated according to the granulation information, while the consolidated data maintain or gather the associated information from before consolidation, leading to a structural granulation method of data consolidation. On this basis, two matrices were introduced, the weighted association matrix and the weighted granulation matrix, which are matrix representations of the weighted association structure and the weighted granulation structure respectively. Moreover, by matrix computations of elementary transformations and target transformations, the weighted association matrix is transformed into the weighted granulation matrix. This forms an equivalent way of performing data consolidation, and can serve as the basis of a programmed algorithm.
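The matrix route can be sketched directly: merging the rows and columns of a weighted association matrix granule by granule, gathering the weights, yields the weighted granulation matrix. Summation as the gathering rule is an illustrative choice.

```python
def granulate(matrix, granules):
    """Consolidate a weighted association matrix: rows and columns belonging
    to the same granule are merged, and their weights are gathered (summed),
    producing the weighted granulation matrix."""
    g = len(granules)
    out = [[0.0] * g for _ in range(g)]
    for gi, rows in enumerate(granules):
        for gj, cols in enumerate(granules):
            out[gi][gj] = sum(matrix[r][c] for r in rows for c in cols)
    return out
```

The gathered entries show that the associated weights from before consolidation survive in the consolidated structure rather than being discarded.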
Chinese Place-name Address Matching Method Based on Large Data Analysis and Bayesian Decision
XU Pu-le, WANG Yang, HUANG Ya-kun, HUANG Shao-fen, ZHAO Chuan-xin and CHEN Fu-long
Computer Science. 2017, 44 (9): 266-271.  doi:10.11896/j.issn.1002-137X.2017.09.050
Abstract PDF(500KB) ( 99 )   
References | Related Articles | Metrics
Traditional Chinese place-name address matching technologies can hardly handle fast matching of Chinese place-name addresses over massive, diverse and heterogeneous geographic information in the big data environment. An intelligent decision matching algorithm (AIDMA) based on the Spark computing framework was proposed. Firstly, geographical elements are analyzed from semantic information and from the separation of Chinese strings, numbers and letters. A Bayesian network is then constructed that effectively combines three kinds of distance with multi-criteria decision making. 514,957 desensitized gas account records and 1,770,979 grid address records containing spatial information of Wuhu City are used in the experiments. The results show that, compared with traditional algorithms, the execution time per record of AIDMA is reduced from about 1 min to about 2.2 s, and the matching results are better balanced between matching rate and precision rate. The proposed method has theoretical significance and application value for the construction of smart nations.
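One of the string-distance cues such a matcher typically combines is edit distance, shown here as a standalone sketch (the Bayesian combination and the Spark parallelization are not reproduced):

```python
def levenshtein(a, b):
    """Edit distance between two strings, computed row by row with O(len(b))
    memory: the minimum number of insertions, deletions and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[-1] + 1,            # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]
```

A Bayesian decision layer would convert this distance, together with semantic and spatial distances, into a posterior probability that two address records refer to the same place.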
Real-time Dynamic Vehicle Scheduling and Vehicle Routing Problem Based on GPS & GIS Collaboration
FENG Liang and LIANG Gong-qian
Computer Science. 2017, 44 (9): 272-276, 285.  doi:10.11896/j.issn.1002-137X.2017.09.051
Abstract PDF(1411KB) ( 142 )   
References | Related Articles | Metrics
Starting from the informatization and intelligent development of the logistics industry, we designed a GPS/GIS collaborative intelligent vehicle monitoring and scheduling system using the modern information and communication technologies represented by the Internet of Things. We then constructed a MIP model for the real-time dynamic vehicle routing problem (DVRP) based on the system's capability of real-time information acquisition and intelligent processing, and on the effect of real-time information about vehicles and customers on vehicle scheduling and path planning. This provides a basis and reference for the logistics industry to reduce operating costs and to improve distribution efficiency and logistics service quality.
Optimal Personalized Trip Planning Based on Spatio-Temporal Sequence Searching
ZHOU Chun-jie, QU Hai-ping and LIU Li
Computer Science. 2017, 44 (9): 277-285.  doi:10.11896/j.issn.1002-137X.2017.09.052
Abstract PDF(2734KB) ( 82 )   
References | Related Articles | Metrics
Nowadays, due to increasing user requirements for efficient and personalized services, good trip planning is urgently needed; however, it is currently hard for people to make a personalized travel plan. A satisfactory trip plan should have the following features: i) personalized recommendation based on the interests and habits of travelers; ii) maximal coverage of sites of interest; iii) minimal effort, such as transportation time, on the route. As a trip contains a sequence of stops at multiple scenes, trip planning becomes the problem of finding an optimal spatio-temporal sequence in which each stop is weighted. For each city, this paper generates a set of weighted scenes for each user, and then retrieves the optimal sequence of scenes in terms of distance, weight, visiting time and scene features. We developed alternative algorithms for searching optimal sequences, considering the weight of each scene, the preferences of users, and the travel time constraint. The experiments demonstrate the efficiency of the proposed algorithms on real datasets from social networks.
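A greedy baseline for this weighted spatio-temporal sequence search can be sketched as follows (the scene fields, payoff rule and budget handling are illustrative assumptions, not the paper's optimal algorithms):

```python
import math

def plan_trip(scenes, start, time_budget, speed=1.0):
    """Greedy sketch: repeatedly visit the scene with the best
    weight-per-hour payoff until the next pick would exceed the budget.
    Each scene is {"loc": (x, y), "weight": w, "visit": hours}."""
    route, loc, t = [], start, 0.0
    remaining = dict(scenes)
    while remaining:
        # payoff: scene weight divided by the time to reach and visit it
        name = max(remaining, key=lambda n: remaining[n]["weight"] /
                   (math.dist(loc, remaining[n]["loc"]) / speed
                    + remaining[n]["visit"]))
        s = remaining.pop(name)
        cost = math.dist(loc, s["loc"]) / speed + s["visit"]
        if t + cost > time_budget:
            break
        t += cost
        loc = s["loc"]
        route.append(name)
    return route, t
```

An exact search would instead explore weighted sequences with pruning, since greedy choices can miss a better ordering of the same scenes.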
Diffusion Method of Sample Points for Alleviating Staggered Situation of Classification
LIANG Lu, GONG Ben-long, LI Jian and TENG Shao-hua
Computer Science. 2017, 44 (9): 286-289, 295.  doi:10.11896/j.issn.1002-137X.2017.09.053
Abstract PDF(444KB) ( 50 )   
References | Related Articles | Metrics
A fixed similarity measure makes it difficult for a learner to reveal, with prior information, the inherent statistical rules of the data itself, and it is hard to achieve good results on data sets whose classes are staggered. In order to improve the classification accuracy on such data sets, this paper combined boundary detection with a sample diffusion method. The method uses the label and position information of samples to obtain boundary points, which are treated as centers; an appropriate control function is then selected to diffuse neighboring sample points so that the classes become more clearly separated, thereby enhancing learning accuracy. Different classifiers were used to validate the method, and its accuracy improved to different degrees. Compared with three classical supervised distance metric learning methods, the experimental results show that this method is suitable for processing highly interleaved data sets and can effectively improve the performance of SVM.
Obstacle Avoidance Path Planning for Irregular Obstacles
JIA Chun-xue, LUO Qi and GONG Yang-yang
Computer Science. 2017, 44 (9): 290-295.  doi:10.11896/j.issn.1002-137X.2017.09.054
Abstract PDF(467KB) ( 111 )   
References | Related Articles | Metrics
Traditional multi-agent obstacle avoidance algorithms exhibit path redundancy and high energy consumption when the shape of the obstacle is considered, and they are not universal. Therefore, firstly, a method for automatically recognizing convexity was defined to transform obstacles from irregular to regular shapes. Secondly, inspired by the idea of sub-goals, the path of the agent was decomposed into a superposition of multiple landing points on the regularized obstacle, ensuring the optimality of each path segment, from which the globally optimal path is selected. Finally, MATLAB was used to simulate the algorithm and compare it with two other algorithms, and its feasibility and effectiveness were verified.
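One standard way to regularize an irregular obstacle, which matches the convexity-recognition idea though it is not necessarily the paper's method, is to replace its outline by the convex hull; the hull vertices then serve as candidate landing points.

```python
def cross(o, a, b):
    """Cross product of vectors OA and OB; positive for a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: return the convex hull vertices in
    counter-clockwise order, with interior (concave) points removed."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

Planning around the hull instead of the raw outline removes the concave pockets that cause redundant detours.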
Research of Improved CPF Algorithm for Integrated Train Positioning
WANG Geng-sheng and ZHANG Min
Computer Science. 2017, 44 (9): 296-299.  doi:10.11896/j.issn.1002-137X.2017.09.055
Abstract PDF(299KB) ( 65 )   
To solve the problem that the extended Kalman filter (EKF) and the unscented Kalman filter (UKF),which are widely used in GNSS/INS integrated train positioning,cannot cope with the complex environment of high-speed train positioning,a new method based on an improved cubature particle filter (CPF) algorithm was proposed for the information fusion of integrated train positioning.The Markov chain Monte Carlo (MCMC) method was used to solve the particle degeneracy problem,improving filter performance.Matlab simulation results show that the improved CPF algorithm has smaller position and velocity errors,which improves positioning accuracy during the train's nonlinear motion.
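The MCMC remedy for particle degeneracy can be sketched on a 1-D toy model. The dynamics, noise levels, and Gaussian likelihood below are illustrative assumptions, not the paper's train model:

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(x, z, r=1.0):
    """Illustrative Gaussian measurement likelihood p(z | x)."""
    return np.exp(-0.5 * (z - x) ** 2 / r)

def resample_move(particles, weights, z, step=0.5):
    """Systematic resampling followed by a Metropolis-Hastings move.

    Resampling duplicates high-weight particles (the degeneracy problem);
    the MCMC move perturbs each copy and keeps the proposal with the usual
    acceptance test, restoring particle diversity."""
    n = len(particles)
    # systematic resampling
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    moved = particles[idx].copy()
    # Metropolis-Hastings move step
    proposal = moved + rng.normal(0.0, step, n)
    accept = rng.random(n) < likelihood(proposal, z) / likelihood(moved, z)
    moved[accept] = proposal[accept]
    return moved
```

Starting from a fully degenerate weight vector (all mass on one particle), the move step restores a spread of distinct particle values.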
Text Localization Based on Edge Detection and Features Fusion in Natural Scene
WANG Meng-di, ZHANG You-mei and CHANG Fa-liang
Computer Science. 2017, 44 (9): 300-303, 314.  doi:10.11896/j.issn.1002-137X.2017.09.056
Abstract PDF(2103KB) ( 91 )   
As the basis and premise of text recognition,text localization has an important influence on image analysis.Since text localization in natural scenes is significantly affected by illumination and complex backgrounds,we proposed a text localization method based on edge detection and feature fusion.The method begins with edge detection over three channels and eight directions;the detected edge images are then filtered with heuristic rules to extract candidate text regions.On top of that,HOG and LBP features are extracted and fused with adaptive weights.Finally,a support vector machine (SVM) classifies the candidate regions to realize text localization.Experimental results indicate that the proposed method can locate text regions accurately in natural scene images while effectively reducing the influence of illumination and complex backgrounds.
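The weighted HOG-LBP fusion step can be sketched as follows. The fixed scalar weight here stands in for the paper's adaptively learned weights, and the per-descriptor L2 normalization is our assumption:

```python
import numpy as np

def fuse_features(hog, lbp, w_hog=0.6):
    """Serial fusion of HOG and LBP descriptors with a weight.

    Each descriptor is L2-normalised so neither dominates by scale, then
    the two are concatenated with weights w_hog and 1 - w_hog.  The fused
    vector is what would be fed to the SVM classifier."""
    hog = np.asarray(hog, float)
    lbp = np.asarray(lbp, float)
    hog = hog / (np.linalg.norm(hog) + 1e-12)
    lbp = lbp / (np.linalg.norm(lbp) + 1e-12)
    return np.concatenate([w_hog * hog, (1.0 - w_hog) * lbp])
```

After fusion, the HOG part of the vector carries exactly weight `w_hog` of the total norm budget, so tuning the weight directly trades off the two cues.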
Moving Objects Detection under Complex Background Based on ViBe
ZHANG Wen-ya, XU Hua-zhong and LUO Jie
Computer Science. 2017, 44 (9): 304-307.  doi:10.11896/j.issn.1002-137X.2017.09.057
Abstract PDF(1973KB) ( 45 )   
The ViBe algorithm is simple and fast,has good foreground detection performance,and is one of the main background-modeling methods for moving object detection.However,it still struggles to detect the foreground in outdoor video with complex backgrounds,where camera shake or trembling leaves lead to inaccurate detection of the moving object.We presented a novel algorithm for moving object detection in video.The improved ViBe approach adopts a new background model that uses frame differences instead of raw pixel values.Since the fixed threshold of the ViBe algorithm cannot reflect background changes in real time,a self-adaptive threshold was also proposed.The experimental results show that the improved algorithm improves the accuracy of foreground detection and has good robustness against disturbance.
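The self-adaptive threshold idea can be sketched in a ViBe-style classifier. The spread-based radius rule (`k` times the per-pixel sample standard deviation, floored at `r_min`) is an illustrative assumption, not necessarily the paper's exact rule:

```python
import numpy as np

def vibe_classify(frame, samples, k=2.5, r_min=10.0, n_min=2):
    """ViBe-style foreground classification with a self-adaptive radius.

    `samples` has shape (N, H, W): N background samples per pixel.  A pixel
    is background when at least n_min samples lie within the matching
    radius; the radius adapts to the local sample spread (k * std) instead
    of ViBe's fixed threshold, so dynamic regions (e.g. swaying leaves)
    get a wider tolerance automatically."""
    radius = np.maximum(r_min, k * samples.std(axis=0))
    close = np.abs(samples - frame[None]) < radius[None]
    return close.sum(axis=0) < n_min   # True = foreground
```

On a static background model, only a pixel that jumps far from all its samples is flagged as foreground.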
Compressed Sensing Recovery Algorithm for Region of Interests of MRI/MRA Images Based on NLTV and NESTA
ZHAO Yang, WANG Wei, DONG Rong, WANG Jing-shi and TANG Min
Computer Science. 2017, 44 (9): 308-314.  doi:10.11896/j.issn.1002-137X.2017.09.058
Abstract PDF(2420KB) ( 117 )   
Compressed sensing (CS) is a newly developed theoretical framework for information acquisition and processing.Taking advantage of the inherent sparsity or compressibility of real-world signals,CS can collect compressed data at a sampling rate much lower than that required by Shannon's theorem.Applied to medical imaging,CS accelerates MRI/MRA scanning,improves scanning efficiency and alleviates patients' suffering.The proposed NLTV-ROI-NESTA algorithm proceeds as follows.Nonlocal total variation (NLTV) is applied to overcome the edge blurring and staircase effects of traditional TV.An improved NESTA algorithm then reconstructs the regions of interest (ROIs) of MRI/MRA images accurately and quickly,maintaining or enhancing the details of low-contrast vessels.Three indices,peak signal-to-noise ratio (PSNR),structural similarity index (SSIM) and relative l2-norm error (RLNE),are adopted to compare the reconstruction performance and clinical value of ordinary CS-MRI algorithms and our ROI-CS-MRI algorithm qualitatively and quantitatively.Experimental results demonstrate that the proposed NLTV-ROI-NESTA algorithm is superior in reconstruction accuracy and detail features for undersampling ratios from 10% to 50%,and can be extended and widely used in rapid medical imaging technology.
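Two of the three quality indices named above have standard closed forms and can be computed directly (SSIM is omitted here for brevity; the 8-bit peak value is an assumption):

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference image and a
    reconstruction; higher is better."""
    mse = np.mean((np.asarray(ref, float) - np.asarray(rec, float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def rlne(ref, rec):
    """Relative l2-norm error: ||rec - ref||_2 / ||ref||_2; lower is better."""
    ref = np.asarray(ref, float)
    rec = np.asarray(rec, float)
    return np.linalg.norm(rec - ref) / np.linalg.norm(ref)
```

A perfect reconstruction gives infinite PSNR and zero RLNE, which makes both indices easy to sanity-check before comparing algorithms.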
Improved SIFT Matching Method for Milk Beverage Recognition in Grocery
ZHENG Jian-bin, BAI Ya-xian, ZHAN En-qi and WANG Yang
Computer Science. 2017, 44 (9): 315-319.  doi:10.11896/j.issn.1002-137X.2017.09.059
Abstract PDF(2556KB) ( 69 )   
Using the scale-invariant feature transform (SIFT) to identify boxed milk beverages is vulnerable to mismatching errors,which affect identification accuracy.To reduce the impact of mismatching and increase recognition rates,a new method was put forward to eliminate mismatched points.Since the deformation of boxed milk beverage images is mostly a rigid transformation,the angle differences of key-points between the reference image and the observation image are calculated to remove some mismatched points.The ratios of pairwise distances between the two images are then used to mark abnormal ratios,and the remaining mismatched points are removed by majority vote.The proposed method was compared with the SC-RANSAC (spatially consistent random sample consensus) algorithm and the graph transformation matching method through experiments.The results show that the proposed method improves recognition accuracy effectively.
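The angle-difference filter can be sketched as follows. Using the median as the majority rotation estimate and a fixed tolerance in degrees are our assumptions for illustration:

```python
import numpy as np

def filter_by_angle(theta_ref, theta_obs, tol=10.0):
    """Flag SIFT mismatches via key-point orientation differences.

    Under a rigid transform all correct matches share (roughly) one
    rotation angle, so matches whose orientation difference strays from
    the majority value by more than `tol` degrees are marked as
    mismatches and removed."""
    diff = (np.asarray(theta_obs, float) - np.asarray(theta_ref, float)) % 360.0
    ref_angle = np.median(diff)          # robust estimate of the common rotation
    dev = np.abs(diff - ref_angle)
    dev = np.minimum(dev, 360.0 - dev)   # wrap-around angular distance
    return dev <= tol                    # True = keep the match
```

A match whose angle difference disagrees with the dominant rotation is rejected; the surviving matches would then go through the pairwise-distance-ratio vote.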