Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Volume 40 Issue 5, 16 November 2018
Research on PAM Probability Topic Model
YU Miao-miao,WANG Jun-li,ZHAO Xiao-dong and YUE Xiao-dong
Computer Science. 2013, 40 (5): 1-7. 
Recently, the topic model has emerged as a new research hotspot in computer science, with a wide range of applications in natural language processing, document classification, information retrieval and so on. This paper mainly analyzed the Pachinko Allocation Model (PAM) and its improved variants. The improved hierarchical PAM is effective at discovering mixtures of topic hierarchies. Nonparametric PAM has more expressive power for complex structures; it has a nonparametric Bayesian prior based on a variant of the hierarchical Dirichlet process. The theory and applications of PAM and its related topic models were summarized, and finally future research directions were discussed.
Area-location Association and Comprehensive Expression of Multi-source Spatial Information Based on Global Subdivision Framework
YANG Yu-bo,CHENG Cheng-qi,HAO Ji-gang and LU Nan
Computer Science. 2013, 40 (5): 8-10. 
Traditional GIS lacks the ability to establish relationships between a geographic object and its related global massive multi-source spatial information. The GeoSOT subdivision framework was used to organize and manage multi-source spatial information in a unified manner, and an "Object-Mesh-Data" three-layer spatial information organizational mechanism was put forward. Based on this mechanism, a multi-source spatial information area-location association model was designed to support the association and query of spatial information. To validate this model, a comprehensive multi-source spatial information expression experiment was carried out. The experimental results prove the effectiveness of the model, which can improve the application performance of massive multi-source spatial information.
Tensor Local Fisher Discriminant with Null Space Analysis
ZHENG Jian-wei,JIANG Yi-bo and WANG Wan-liang
Computer Science. 2013, 40 (5): 11-18. 
A tensor local Fisher discriminant algorithm with null space analysis (NSTLFDA for short) was proposed, which incorporates the merits of three techniques: tensor-based methods, local Fisher discriminant analysis, and null space analysis. The main features of our implementation include: (i) local Fisher discriminant analysis is improved with inter-class discriminant information, which yields better recognition performance and reduces time complexity; (ii) the tensor-based method employs two-sided transformations rather than single-sided ones, yielding a higher compression ratio; (iii) while TLFDA directly uses an iterative procedure to calculate the optimal solution of the two transformation matrices, NSTLFDA takes advantage of null space information when the number of training samples is less than the dimensionality of the vector samples. The effectiveness of the new method was demonstrated on the ORL, Yale, and ExYaleB face databases.
Duality of Plateaued Functions
WANG Wei-qiong and XIAO Guo-zhen
Computer Science. 2013, 40 (5): 19-20. 
The duality of Plateaued functions was analyzed. Based on the definition of Plateaued functions and their restrictions on certain affine subspaces, we proved that there are strong relationships between a Plateaued function and its dual, especially concerning restrictions on flats. We also derived a bound on the sum-of-squares indicator of the cross-correlation between any two Plateaued functions by using the duality as a tool.
Persuasion and Evaluation Model Based on Agent Positive Change of Mood
WU Jing-hua and SUN Hua-mei
Computer Science. 2013, 40 (5): 21-23. 
Agent persuasion can improve the automation and rationality of e-commerce negotiations, and is being researched worldwide. The change of an agent's mood can influence the process and result of agent persuasion. The changes of agent mood in agent persuasion were classified. A persuasion model based on positive changes of agent mood was built and explained with an example. The concept of the degree of positive change of agent mood was proposed, and a corresponding evaluation model was built on this basis. Finally, a calculation example was used to verify its effectiveness.
Behavior Forecasting Model of Cyber-physical Systems
SHE Wei and YE Yang-dong
Computer Science. 2013, 40 (5): 24-30. 
A cyber-physical system (CPS) is a new kind of interconnected system which integrates computation, communication networks, sensor networks, control systems and physical systems. Since the forms of communication, cooperation and interaction among its heterogeneous units are complex, and there is at present no unified model to describe and analyze them, modeling and forecasting the behavior of a CPS is a difficult problem. Based on hybrid systems, fuzzy set theory and human factors methods, we proposed a fuzzy time hybrid Petri net (FTHPN). Through modeling and quantitative analysis of a typical CPS behavior, dynamic behavior forecasting and state transformation of the CPS were realized. The simulation results verify the effectiveness of the model. The model can be used to analyze the relationship and interaction between the continuous states of the physical world and the discrete events of the information world in a CPS. It helps to study the uncertainty of CPS and the asynchronous concurrent relationships among system components, and thus provides effective methods to forecast, assess and control CPS in real time.
Remote Virtual Desktop Oriented Application Pushing Research
JIANG Yuan-yuan and WU Yan-jun
Computer Science. 2013, 40 (5): 31-34. 
A remote virtual desktop is the virtualization of a desktop working environment, which enables centralized management and efficient distribution and migration of operating systems and applications. The motivation is to let users access their own working environment given only basic hardware. A solution for pushing applications of a virtual desktop, named RVDvApp, was proposed. The core mechanism of RVDvApp is a filter implemented in the transmission of graphics commands. As a result, a separate application can be pushed to the client, and virtual-desktop-based services can also be distributed. Users can therefore entirely ignore the heterogeneous working environment and obtain application services centrally deployed on server-side virtual machines.
Floating-point CORDIC-based Real-valued Symmetric Matrix Eigenvalue Decomposition on FPGA
CHEN Gang,CHEN Xu,XU Yuan,BIAN Yi and LU Hua-xiang
Computer Science. 2013, 40 (5): 35-37. 
This paper presented an FPGA architecture for real-valued symmetric matrix eigenvalue decomposition based on the rotation-mode CORDIC algorithm. The proposed architecture adopts a direct mapping onto 32 floating-point CORDIC-based processing units that compute the eigenvalue decomposition of a 3×3 real symmetric matrix. For a comprehensive resource and performance evaluation, the whole design was realized on a Spartan-3 XC3SD1800A-5 FPGA. The evaluation results show that (1) the proposed architecture satisfies the 2^-20 accuracy requirement for 3×3 eigenvalue decomposition when the word length of the data path is 32 bits; (2) it occupies about 2467 (14%) slices, and the maximum frequency is 154MHz.
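The rotation-mode CORDIC kernel underlying such processing units can be sketched in a few lines. Below is a minimal Python model for illustration only; the iteration count and the use of Python floats are assumptions, while the paper's design uses dedicated 32-bit floating-point hardware units.

```python
import math

def cordic_rotate(x, y, theta, iterations=32):
    """Rotate the vector (x, y) by angle theta (radians) with rotation-mode
    CORDIC: each step adds or subtracts a fixed micro-angle atan(2^-i),
    which in hardware needs only shifts and adds."""
    # Precomputed elementary angles and the constant CORDIC gain correction.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    k = 1.0
    for i in range(iterations):
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = theta  # residual angle still to rotate through
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * k, y * k                       # undo the accumulated gain once
```

In hardware the gain correction `k` is a single constant multiply applied at the end, which is why the per-iteration datapath stays multiplier-free.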
Multiple Security Partition Communication Mechanism Based on MILS CORBA
CUI Xi-ning,WANG Cong-lin,PEI Qing-qi,LI Ya-hui and SHEN Yu-long
Computer Science. 2013, 40 (5): 38-41. 
With the rapid development of avionics systems, the requirement of preventing software of different security classifications from communicating in the avionics operating system is more and more strict. To this end, the multiple independent levels of security and safety (MILS) embedded system was proposed. The middleware in MILS is Real-time CORBA (RT CORBA), and MILS CORBA uses partition communication for components to communicate with each other. In order to meet the security requirements of partition communication and the strict control of security classifications in MILS, a multiple security partition communication mechanism in the MILS architecture based on RT CORBA was proposed. In the proposed mechanism, the MILS IOP, as a part of real-time CORBA, ensures multiple security partition communication in MILS CORBA. The mechanism guarantees that transmissions of data of different security classifications are independent, and it also guarantees the security of the whole system.
Relationships among Several Types of Kripke Structures Based on Fuzzy Logic
PAN Hai-yu,ZHANG Min and CHEN Yi-xiang
Computer Science. 2013, 40 (5): 42-44. 
According to whether the initial state, transition relation and propositional valuation function are crisp or not, fuzzy Kripke structures are classified into eight forms. This paper provided a kind of fuzzy computation tree logic to show that two different fuzzy Kripke structures are equivalent if they give the same truth values for every formula of fuzzy computation tree logic, and discussed the relationships among different fuzzy Kripke structures. The results provide theoretical foundations for the appropriate choice of models in practice and present a new method for model checking of fuzzy computation tree logic.
Consistency Degree of Fuzzy Rule Groups
FENG Ding-yun,YU Fu-sheng and WANG Xiao
Computer Science. 2013, 40 (5): 45-47. 
The consistency degree of fuzzy rules plays an important role in the study of rule bases. When determining whether a group of fuzzy rules should enter the rule base, we need to know the consistency degree between the new group and the rules already in the base. This paper defined the consistency degree of two fuzzy rule groups with the aid of fuzzy relation theory and closeness. Experiments show that this definition can include compatible rules and exclude contradictory ones, which is crucial in the process of building a rule base.
Redundancy Clause and Redundancy Literal of Propositional Logic
ZHAI Cui-hong and QIN Ke-yun
Computer Science. 2013, 40 (5): 48-50. 
We mainly studied the redundancy of clauses and literals in propositional logic formulas. Equivalent descriptions of necessary, useful and useless clauses for a particular set of clauses were given respectively, and the irredundant equivalent subsets of an arbitrary set of clauses were discussed. In addition, an approach for identifying redundant literals was presented. Finally, an equivalent condition for clause redundancy was obtained by using the concept of satisfiability. These results serve as the theoretical foundation of a new method for simplifying propositional logic formulas.
Application of AADL in Modeling Interrupt Control System
REN Fei,QIAO Ting-ting,LIU Jun-bo and SHAO Yang-feng
Computer Science. 2013, 40 (5): 51-53. 
With the wide application of interrupt control in embedded real-time systems, the reliability of the interrupt control system is a key problem in system design. Although formal methods based on the Architecture Analysis and Design Language (AADL) can handle this problem effectively, AADL fails to provide effective elements and methods for describing and modeling interrupts. For this purpose, this paper proposed a design method for interrupt control systems that combines the merits of AADL and interrupt controllers, and utilized a GSPN-based reliability computation model to analyze the reliability of the designed system, leading to a way to apply AADL in avionics systems.
Social Set Theory Based on Similarity Relations
LI Jin-ping and HUANG Yi-mei
Computer Science. 2013, 40 (5): 54-57. 
On the basis of studying social relations with classical similarity relations, we began to consider fuzziness in social relations. Using the concept of cut sets, we studied many properties of social relations satisfying a similarity relation (i.e., a fuzzy tolerance relation), such as the segmentation of social circles, the activity degree of individuals in the social relations, and the cost for an individual to accomplish various tasks. The corresponding algorithms were also presented. Simulations show that the proposed model can effectively describe social relation networks satisfying a similarity relation.
Research and Implementation of Mobile Cloud Services Access Mechanism Based on Proxy
ZHANG Yong-jun,SHI Dian-xi,XIAO Xi,WU Zhen-dong and DING Bo
Computer Science. 2013, 40 (5): 58-61. 
Mobile devices are usually limited in computing capability, storage capacity, network bandwidth and battery life. Existing cloud services are not optimized for these characteristics of mobile devices; as a result, the quality of mobile access to cloud services is seriously reduced. Aiming at this problem, a data transmission format optimization model and a service mashup model were presented for mobile devices, and a proxy-based framework for mobile cloud service access was designed and implemented, which enables mobile devices to access cloud services through a proxy server. Under four different situations, the request response time, response processing time, bandwidth usage and energy consumption of the mobile devices were tested. The test results show that this framework has practical significance.
PFAC: Pivot-based Fast Automatic Configuration for Data Center Networks
ZHANG Gan,LIANG Wei,BI Jing-ping and SHAO Ding-hong
Computer Science. 2013, 40 (5): 62-66. 
Currently, configuring addresses manually in a large-scale data center is not only time-consuming and labor-intensive but also error-prone. Related work fails to make full use of the data center network topology, resulting in a large number of unnecessary backtracking steps. To solve these problems, a pivot-based fast automatic configuration algorithm (PFAC) was proposed. After analyzing the topology, PFAC matches a pair of pivots to narrow the candidate set of the remaining devices, improving configuration efficiency. Experiments based on the FatTree network topology show that the processing time of PFAC is reduced by 35% compared to the existing classical work DAC.
Self-similarity Analysis of Satellite Network Traffic
WEI De-bin,PAN Cheng-sheng and HAN Rui
Computer Science. 2013, 40 (5): 67-69. 
The model parameters of the data sources were obtained based on the characteristics of the onboard packet data system and stochastic process theory. Furthermore, a satellite network model was established with STK and OPNET. The data-source model parameters were loaded onto the LEO satellites, and the network traffic was collected on the GEO satellite. Finally, R/S analysis, variance-time plots and periodogram-based analysis were applied to estimate the Hurst parameter, which measures the degree of self-similarity and burstiness of the satellite network traffic. Simulation results show that satellite network traffic is characterized by self-similarity.
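Rescaled-range (R/S) analysis, the first of the three Hurst estimators named above, can be sketched as follows. This is a minimal pure-Python illustration; the window sizes and the simple least-squares fit are assumed choices, not the paper's exact procedure.

```python
import math
import statistics

def hurst_rs(series, window_sizes=(16, 32, 64, 128)):
    """Estimate the Hurst exponent H by R/S analysis: for each window size n,
    average the rescaled range R/S over non-overlapping windows, then fit
    log(R/S) against log(n); the slope approximates H."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            mean = sum(w) / n
            run, dev = 0.0, []
            for v in w:                      # cumulative deviation from the mean
                run += v - mean
                dev.append(run)
            r = max(dev) - min(dev)          # range of cumulative deviations
            s = statistics.pstdev(w)         # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(math.log(n))
            log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # Least-squares slope of log(R/S) versus log(n).
    mx, my = sum(log_n) / len(log_n), sum(log_rs) / len(log_rs)
    num = sum((a - mx) * (b - my) for a, b in zip(log_n, log_rs))
    den = sum((a - mx) ** 2 for a in log_n)
    return num / den
```

Traffic with H significantly above 0.5 is taken as evidence of self-similarity; uncorrelated noise yields H near 0.5.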
Centralization Research of Scale-free Network Model (BA-S) Based on Coupling Coefficient
LIU Yu-hua,ZHENG Mei-rong,XU Kai-hua and XU Cui
Computer Science. 2013, 40 (5): 70-73. 
This paper studied the centralization of a scale-free network evolution model based on the coupling coefficient. It first analyzed the frequently used centralization indexes, then investigated the cumulative probability distributions of several node indexes for the classical scale-free (BA) model and the evolved (BA-S) model, and finally examined the centralization degree and efficiency of the two models. The results prove that the evolved (BA-S) model is more robust and failure-resistant than the BA model.
Improved Steiner Multiple Antenna Channel Estimation Algorithm by Direction of Arrival
LU Zhao-gan,YANG Yong-qiang,MA Xiao-fei and LIU Long
Computer Science. 2013, 40 (5): 74-77. 
In multiple-antenna OFDM systems, non-orthogonal training sequences are usually used as channel training sequences, which leads to matrix inversion operations of great complexity during channel estimation. Therefore, the training symbols of multi-user CDMA uplinks were used as time-domain channel training symbols in multiple-antenna OFDM systems, and the multiple-antenna spatial channels could be estimated with the classical Steiner channel estimation algorithm. In fact, the training symbol matrix of a transmit antenna is a circulant matrix, which can be diagonalized by the discrete Fourier transform at the receive antenna, so the matrix inversion operation can be avoided. Furthermore, based on the initial estimation results of the Steiner algorithm, one can obtain the direction of arrival (DoA) of the incident waves at the different receive antennas. This DoA information is further used to improve the Steiner algorithm. Numerical results show that, compared with the original Steiner method, the improved Steiner algorithm with DoA obtains a 0.5dB SNR improvement with QPSK modulation and a 2dB SNR improvement with 16QAM modulation.
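The key linear-algebra fact used above, that a circulant matrix is diagonalized by the DFT so a linear system with it can be solved without any matrix inversion, can be illustrated directly. This is a small self-contained sketch; the naive O(n²) DFT below stands in for the FFT a real receiver would use.

```python
import cmath

def dft(v, inverse=False):
    """Naive discrete Fourier transform, sufficient for a small demo."""
    n = len(v)
    sign = 1 if inverse else -1
    out = []
    for k in range(n):
        s = sum(v[j] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for j in range(n))
        out.append(s / n if inverse else s)
    return out

def solve_circulant(first_col, b):
    """Solve C x = b where C is circulant with first column first_col.
    C x is the circular convolution of first_col with x, so in the DFT
    domain DFT(b) = DFT(first_col) * DFT(x): three transforms and a
    pointwise division replace the matrix inversion."""
    eig = dft(first_col)              # eigenvalues of C = DFT of first column
    bf = dft(b)
    xf = [bv / ev for bv, ev in zip(bf, eig)]
    return [z.real for z in dft(xf, inverse=True)]
```

The same structure is what lets the receiver diagonalize the circulant training symbol matrix per tone instead of inverting it.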
Energy-balanced Unequal Clustering Algorithm in Wireless Sensor Network
LU Xian-ling,WANG Ying-ying,WANG Hong-bin and XU Bao-guo
Computer Science. 2013, 40 (5): 78-81. 
The distribution of nodes in a wireless sensor network (WSN) is random, and equal clustering algorithms cause unbalanced energy consumption. This paper therefore presented an energy-balanced unequal clustering algorithm (EBUCA). It selects cluster heads based on residual energy and density, then forms clusters of different sizes according to the density of the cluster heads and their distance to the sink, making clusters with higher density and closer to the base station smaller, so as to balance energy and load. The simulation results show that, compared with LEACH, DBCP and EEUC, the improved algorithm can balance the energy consumption of nodes and prolong the lifetime of the network.
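The unequal-cluster idea can be illustrated with the EEUC-style competition radius, shown here only as a point of comparison: EBUCA additionally weighs node density, and the constants below are hypothetical values, not taken from the paper.

```python
def competition_radius(d_to_bs, d_min, d_max, r_max=90.0, c=0.5):
    """EEUC-style unequal competition radius: a tentative cluster head
    closer to the base station gets a smaller radius (hence a smaller
    cluster), reserving its energy for relaying other clusters' traffic.
    d_to_bs: this node's distance to the base station;
    d_min, d_max: min/max node-to-base-station distances in the network;
    r_max: the largest allowed radius; c in (0, 1) controls the spread."""
    return (1.0 - c * (d_max - d_to_bs) / (d_max - d_min)) * r_max
```

A node at the far edge of the field (d_to_bs = d_max) competes with the full radius r_max, while the nearest node competes with only (1 - c) * r_max.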
Method of Realizing Chaotic Masking Communication Based on Duffing System
HAN Jian-qun and LUN Shu-xian
Computer Science. 2013, 40 (5): 82-84. 
The Duffing equation describes an important kind of dynamical system. The mathematical model of this system was studied in this paper. The structure of the system was changed with a variable decomposition method, and it was proved that applying the restructured Duffing system to realize chaotic masking is rational. Finally, the trajectories of the new Duffing system and the demodulation results of chaotic masking were given. Simulation results show that the proposed method is correct and effective.
Research on Network Management in Background of Green Computing
LI Ya,PENG Hai-yun,SHANG Xiao-pu and ZHANG Run-tong
Computer Science. 2013, 40 (5): 85-88. 
With the continuous development of information technology, the energy cost of the information industry is becoming more and more important in overall energy saving and emission reduction. This paper put forward the concepts of green computing and green networks, analysed the methods and possibilities of energy saving in various network devices, and proposed the energy management functions that should be added to network management. A plan and the relevant management information were given for monitoring, controlling and gathering statistics on energy cost. Energy cost can thus be monitored and controlled quantitatively, and comprehensive monitoring and management can be provided for green networks.
Research of Dynamic Load Balancing Based on Simulated Annealing Algorithm
SUN Jun-wen,ZHOU Liang and DING Qiu-lin
Computer Science. 2013, 40 (5): 89-92. 
This paper analyzed the characteristics and shortcomings of existing dynamic load balancing algorithms on server clusters, and combined the advantages of simulated annealing and dynamic weighted round-robin to propose a dynamic load balancing model and solution. The solution uses a simulated annealing algorithm to adaptively decide the vector of performance weights and uses the dynamic round-robin algorithm to balance load based on real-time server load, dynamically distributing requests according to the computed data. Our experiments show that the dynamic load balancing algorithm effectively balances load and fully utilizes server resources at different load levels.
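The weighted round-robin half of such a scheme can be sketched with the "smooth" weighted round-robin variant. The simulated-annealing tuning of the weight vector is beyond this sketch, so fixed example weights stand in for the adaptively computed ones.

```python
def weighted_round_robin(servers, weights, n_requests):
    """Smooth weighted round-robin dispatch: each round every server's
    current score grows by its weight, the highest-scoring server takes
    the request and pays back the total weight, so requests are spread
    in proportion to the weights without long bursts to one server."""
    current = {s: 0 for s in servers}
    total = sum(weights.values())
    order = []
    for _ in range(n_requests):
        for s in servers:
            current[s] += weights[s]
        best = max(servers, key=lambda s: current[s])  # ties -> first in list
        current[best] -= total
        order.append(best)
    return order
```

With weights {a: 5, b: 1, c: 1}, seven requests go to a, b and c in the ratio 5:1:1, interleaved rather than batched.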
Fast Multiple Reference Frame Selection Algorithm for H.264 Based on Temporal and Spatial Correlation
ZHONG Wei-bo and MENG Yan-ru
Computer Science. 2013, 40 (5): 93-96. 
The multi-reference-frame technique employed in H.264/AVC greatly improves video quality, but also increases coding complexity. In order to avoid searching redundant reference frames and increase the efficiency of the H.264 encoder, a fast multi-reference-frame selection algorithm was proposed that exploits the temporal correlation between frames and the temporal and spatial correlation among adjacent macroblocks. The number of reference frames is determined according to the degree of temporal correlation between frames, and the coding mode and the best reference frame of the current macroblock are determined adaptively by the degree of temporal and spatial correlation between its adjacent macroblocks. Experimental results show that the proposed algorithm saves about 60% of encoding time with almost unchanged encoding quality in comparison with JM16.2.
Internet of Things Architecture Based on IPv6 and Bluetooth Low Energy
WANG Jian-feng,CHEN Can-feng,LIU Jia and XI Min-jun
Computer Science. 2013, 40 (5): 97-102. 
Wireless sensor networks are an efficient means to sense and study the physical environment, and how to seamlessly converge wireless sensor networks with the Internet is a hot research topic. A complete IPv6-based Internet of Things architecture was proposed, which enables end-to-end IPv6 connectivity between Bluetooth low energy networks and the Internet. A lightweight IPv6 protocol stack was implemented on the constrained Bluetooth low energy nodes. According to the characteristics of Bluetooth low energy, the lightweight IPv6 protocol stack provides 6LoWPAN-optimized functions such as IPv6 header compression and packet segmentation and reassembly. The CoAP protocol was also implemented in the application layer. Actual measurement results show that the system has low delay and that the Bluetooth low energy nodes have very low power consumption, which satisfies the requirements for long-term operation.
Research on Cloud Service Composition Based on Culture Social Cognitive Optimization Algorithm
LIU Zhi-zhong,WANG Zhi-jian,XUE Xiao and LU Bao-yun
Computer Science. 2013, 40 (5): 103-106. 
QoS-aware cloud service composition is a challenging problem in the field of cloud computing. To solve it, this paper first improved social cognitive optimization (SCO), then put the improved SCO algorithm into the framework of the Culture Algorithm to construct a novel algorithm, culture social cognitive optimization (C-SCO), and finally used C-SCO to solve the QoS-aware cloud service composition problem. Experimental results show that C-SCO has stronger search ability and faster convergence when solving the cloud service composition problem. C-SCO can also be used to solve other combinatorial optimization problems.
Traffic-based Adaptive Sleep Mode Scheme for IEEE 802.16e
YANG Lei-lei and LI Hai-tao
Computer Science. 2013, 40 (5): 107-109. 
The sleep mode is utilized in IEEE 802.16e networks to reduce the energy consumption of the mobile station and the air-interface resource occupation of the serving base station. In order to improve energy-saving efficiency, a practical traffic-based sleep mode was proposed. The user's work mode is determined according to the user's activity level and frequency, so that switches between wake mode and sleep mode can be reduced in the proposed scheme. Simulations were undertaken to verify the proposed scheme and to analyze the effect of the counter thresholds on mean energy consumption and mean response time. The simulation results show that the proposed scheme can save at least 30% of energy compared with the standard method.
Ability-based Fuzzy Web Service Clustering and Searching Algorithm
ZHAO Wen-dong,ZHANG Jin,PENG Lai-xian and TIAN Chang
Computer Science. 2013, 40 (5): 110-114. 
With the rapid growth of Web services and the need to quickly find the right services, automatically clustering Web services becomes exceedingly important and challenging. The main drawback of current clustering algorithms is that they need all services, or a service training set, to mine service attributes, which does not suit a dynamic distributed environment. To address this problem, this paper first introduced the concept of service ability and gave a method to compute it. By means of ontology, the paper then presented a novel service clustering algorithm based on service ability. Without prior knowledge or pairwise similarity computation between services, this algorithm can cluster services with similar function and service ability. On the basis of this algorithm, a service searching algorithm was proposed. Experimental and theoretical results show that the clustering algorithm effectively reflects the functional clustering of services, and the searching algorithm can filter out many unrelated services.
Application of Wireless Sensor Network in Indoor Localization
WU Bin and LI Jun-e
Computer Science. 2013, 40 (5): 115-117. 
Indoor positioning is affected by floors, walls and moving personnel, and is prone to multipath effects that attenuate the signal, so that received signal strength indication (RSSI) algorithms alone cannot obtain high positioning accuracy. In order to improve positioning precision in indoor environments, this paper combined the advantages of RSSI and centroid localization into a joint indoor sensor node localization method. First, the RSSI values of the target node are read by the beacon nodes; these RSSI values are then used as the weights of the centroid localization algorithm, which yields the target node's position; lastly, tests were carried out. The results show that the proposed algorithm requires little computation, achieves high positioning precision, and can well meet the requirements of indoor positioning.
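The RSSI-weighted centroid step described above can be sketched directly. This is a minimal Python illustration; the 1/|RSSI| weighting is one simple assumed choice (stronger, i.e. less negative, RSSI in dBm gives a larger weight), and the paper's exact weighting function may differ.

```python
def rssi_weighted_centroid(beacons):
    """Estimate a target position as the RSSI-weighted centroid of beacons.
    beacons: list of (x, y, rssi_dbm) tuples, one per beacon node that
    heard the target; nearer beacons (higher RSSI) pull the estimate
    toward themselves."""
    # Convert negative dBm readings into positive weights.
    weights = [1.0 / abs(rssi) for _, _, rssi in beacons]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by, _) in zip(weights, beacons)) / total
    return x, y
```

Compared with a plain (unweighted) centroid, the weighting shifts the estimate toward the beacons with the strongest received signal, which is what recovers accuracy lost to attenuation.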
Wireless Mesh Network Routing Protocol Based on Statistics
WANG Hong-fei,MU Rong-zeng and YAN Yue-peng
Computer Science. 2013, 40 (5): 118-121. 
Traditional routing protocols for wireless mesh networks (WMN) use the routing information carried by management frames to calculate, compare and choose routes. However, the necessary routing information may be unavailable because management frames can be lost due to the instability of the wireless environment, which degrades network performance. This paper introduced a statistics-based routing protocol named B.A.T.M.A.N. (Better Approach To Mobile Ad-hoc Networking). In this protocol the loss of management frames itself carries information, because the protocol chooses best paths according to the packet delivery rate of management frames. Experiments show that B.A.T.M.A.N. outperforms OLSR in wireless mesh networks and achieves greater advantages in mobile networks.
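The statistical path choice can be sketched as a sliding window of originator messages (OGMs) per neighbour: the neighbour through which most recent OGMs from a given originator arrived is the statistically best next hop. The class and method names below are illustrative, not B.A.T.M.A.N.'s actual data structures.

```python
from collections import defaultdict, deque

class OriginatorTable:
    """Minimal sketch of B.A.T.M.A.N.-style statistical routing: remember,
    per originator, via which neighbour each of the last `window` OGMs
    arrived, and route toward that originator through the neighbour that
    delivered the most of them (i.e. the best-delivering path)."""

    def __init__(self, window=64):
        self.window = window
        # originator -> deque of neighbours, oldest entries evicted first
        self.recent = defaultdict(lambda: deque(maxlen=self.window))

    def record_ogm(self, originator, via_neighbour):
        self.recent[originator].append(via_neighbour)

    def best_next_hop(self, originator):
        seen = list(self.recent[originator])
        if not seen:
            return None                       # originator never heard of
        return max(set(seen), key=seen.count)
```

Because the table counts *received* OGMs, a lossy link automatically scores worse: its losses lower the count instead of corrupting an explicit metric exchange.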
Design and Implementation of Low Complexity Positioning Algorithm
HUANG Pei-yu and JIANG Zi-quan
Computer Science. 2013, 40 (5): 122-125. 
Currently, applications based on location technology have become one of the Internet of Things (IoT) businesses with the greatest development potential. Because the sensor nodes in the IoT sensing layer are limited by their power consumption and processing capability, they cannot directly solve the equations of wireless positioning algorithms. This paper modified the traditional trilateral positioning algorithm using geometric approximation. The new algorithm needs only simple comparison, addition, subtraction, multiplication and division operations to obtain the node's location. It introduces a certain error in the solving process, but significantly reduces solving complexity, providing a feasible way for low-capability processors to compute positioning information directly. Positioning experiments show that the average positioning error of the algorithm is 1.18m, which satisfies most small and medium-scale positioning requirements.
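For reference, the conventional trilateration computation that the paper approximates can itself be reduced to basic arithmetic: subtracting the three circle equations pairwise cancels the quadratic terms and leaves a 2x2 linear system, solvable with Cramer's rule. This sketch shows that baseline, not the paper's geometric approximation.

```python
def trilaterate(p1, p2, p3, d1, d2, d3):
    """Plane-trilateration baseline: given beacon positions p1, p2, p3 and
    measured distances d1, d2, d3, subtract the circle equations pairwise
    to get a 2x2 linear system and solve it with Cramer's rule
    (only +, -, * and / are needed)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1           # zero when the three beacons are collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With noisy RSSI-derived distances the three circles rarely meet in a point, which is where approximation schemes trade a known error for fewer operations.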
Live Memory Migration for Virtual Machine Based on Dirty Pages Delayed Copy Method
ZHANG Wei,ZHANG Xiao-xia and WANG Ru-chuan
Computer Science. 2013, 40 (5): 126-130. 
Through an analysis of the states of memory pages during live memory migration of virtual machines, a novel method was proposed to avoid retransmitting dirty pages in the pre-copy algorithm. In this method, the extent to which pages will next be dirtied is predicted, based on the principle of temporal locality, before the pages are transmitted in the iteration stage. According to the prediction, pages are classified into different states and appropriate transmission strategies are taken. Experiments show that the method reduces the total migrated memory by nearly 10% compared to the live migration approach taken by Xen.
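The temporal-locality prediction can be sketched as: a page dirtied in each of the last few rounds will likely be dirtied again, so transmitting it now would be wasted work and its copy is delayed to a later round. The function name, flag encoding and threshold below are all illustrative, not the paper's actual classification.

```python
def plan_round(dirty_history, hot_rounds=2):
    """Decide, for one pre-copy iteration, which dirty pages to send now
    and which to delay. dirty_history maps page_id -> list of 0/1 dirty
    flags for recent rounds (newest last). Pages dirty in each of the
    last `hot_rounds` rounds are predicted to be dirtied again and are
    delayed; other currently-dirty pages are sent."""
    send, delayed = [], []
    for page, flags in dirty_history.items():
        if not flags or flags[-1] == 0:
            continue                          # clean this round: nothing to do
        if sum(flags[-hot_rounds:]) == hot_rounds:
            delayed.append(page)              # hot page: sending it now is wasted
        else:
            send.append(page)
    return sorted(send), sorted(delayed)
```

Delayed pages are eventually copied in the stop-and-copy phase, so correctness is preserved; the saving comes from not copying the same hot pages in every iteration.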
Provably Secure Identity-based Threshold Ring Signcryption Scheme in Standard Model
SUN Hua,WANG Ai-min and ZHENG Xue-feng
Computer Science. 2013, 40 (5): 131-135. 
Signcryption is a cryptographic primitive which provides authentication and confidentiality simultaneously, with a computational cost lower than separate signing and encryption, while ring signcryption provides anonymity in addition to authentication and confidentiality. This paper presented an efficient identity-based threshold ring signcryption scheme without random oracles, by means of secret sharing and bilinear pairing techniques, and gave a security analysis of the scheme. Finally, we proved that the scheme satisfies indistinguishability against adaptive chosen-ciphertext attacks and existential unforgeability against adaptive chosen-message-and-identity attacks, based on the hardness of the DBDH and CDH problems.
“Reduction Attacks” in Partially Blind Signature Based on Bilinear Pairings
HOU Zheng-feng,WANG Xin,HAN Jiang-hong and ZHU Xiao-ling
Computer Science. 2013, 40 (5): 136-140. 
Abstract PDF(359KB) ( 419 )   
We studied three partially blind signature schemes and found similarities in their signature formulas. We then compared and analyzed the three corresponding "reduction attack" methods, and pointed out the basic reason why these schemes are vulnerable to attacks that tamper with the common information. From this analysis we deduced that a class of partially blind signature schemes based on bilinear pairings is vulnerable to "reduction attacks", and the deduction was verified.
Structured P2P Worm-anti-worm Model Based on Formalized Logic Matrix
TANG Hao-kun,LIU Yan-bing,HUANG Jun and ZHANG Heng
Computer Science. 2013, 40 (5): 141-146. 
Abstract PDF(558KB) ( 421 )   
P2P anti-worms are one of the effective countermeasures against malicious worms in structured P2P networks, but existing models are too complex in describing worm propagation under an attack-defense environment. To address this problem, a simple structured P2P worm-anti-worm model was presented. Supported by a logic matrix, the model formally describes the antagonistic propagation of P2P anti-worms and malicious worms in structured P2P networks, and a number of key parameters that affect the propagation speed of malicious worms under the attack-defense environment can be deduced rapidly from it. In addition, considering the significant influence of P2P churn on worm propagation, the change rate of nodes is added to the model to improve its accuracy. Experimental results show that the formalized logic matrix reduces the complexity of worm propagation models under the attack-defense environment, rapidly identifies the key factors that restrict the spread of worms, and provides a reference for subsequent research on worm defense.
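The abstract does not give the matrix formalism itself, so the following is only a generic illustration of describing propagation with a boolean (logic) matrix: one round of infection spread over an adjacency matrix, with some nodes immunized by the anti-worm.

```python
import numpy as np

def step(adj, infected, immunized):
    """One propagation round over a boolean adjacency matrix.
    A susceptible node becomes infected if at least one infected
    neighbor has an edge to it and it has not been immunized.
    adj[i, j] = True means node i can reach node j."""
    reached = (adj.T.astype(int) @ infected.astype(int)) > 0
    return (infected | reached) & ~immunized
```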
Analysis and Design of Network Coding Based on Chaotic Sequence
XU Guang-xian,LI Xiao-tong and LUO Hui-hui
Computer Science. 2013, 40 (5): 147-149. 
Abstract PDF(331KB) ( 309 )   
A minimum-overhead secure network coding scheme based on chaotic sequences was presented in this paper. Only the source needs to be modified; intermediate nodes implement a classical distributed network code, so the proposed scheme applies to all linear network codes. It combines a chaotic sequence with the original source information vector; because of the high randomness and sensitivity to initial state of chaotic sequences, the resulting network code acts as a "one-time pad" (OTP), and the secure network coding achieves complete secrecy. The scheme requires only one noisy symbol to be embedded in the original information symbol vector to achieve complete secrecy. Theoretical analysis confirms that, against an attacker with limited wiretapping ability, the scheme achieves the information-theoretic security condition with minimal signaling overhead.
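A minimal sketch of the source-side idea, assuming a logistic map as the chaotic generator (the paper's concrete map and mixing rule are not given in the abstract): a keystream derived from the chaotic sequence is XOR-mixed into the source symbols, OTP-style. This toy version is illustrative only and is not information-theoretically secure on its own.

```python
def logistic_keystream(x0, n, r=3.99):
    """Generate n key bytes from the logistic map x <- r*x*(1-x);
    its sensitivity to the initial state x0 gives a noise-like stream."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def mix(data, x0):
    """XOR the chaotic keystream into the source symbols; applying
    mix twice with the same x0 recovers the original data."""
    ks = logistic_keystream(x0, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```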
Opinion Propagation and Public Opinion Formation Model for Forum Networks
XU Hui-jie,CAI Wan-dong,CHEN Gui-rong and WANG Jian-ping
Computer Science. 2013, 40 (5): 150-152. 
Abstract PDF(337KB) ( 427 )   
According to the information dissemination and user interaction characteristics of online forums, a discrete-time model of opinion propagation and public opinion formation based on node influence was proposed. In the model, every node has an opinion and an influence, the latter defined as the node's weight in the network; in each period the nodes redistribute the weights and update their opinions. When the opinions of the nodes converge to a certain value, the public opinion is formed. Simulation results show that the model accurately emulates the interaction and convergence characteristics, and the results can serve as a reference for predicting public opinion propagation.
Identity-based Broadcast Signcryption with Proxy Re-signature
LI Chao-ling,CHEN Yue,WANG Cheng-liang,LI Wen-jun and WANG Shuang-jin
Computer Science. 2013, 40 (5): 153-157. 
Abstract PDF(443KB) ( 391 )   
To protect data confidentiality and integrity in cloud data sharing and other applications, an identity-based broadcast signcryption scheme with proxy re-signature was proposed. The scheme can transform a broadcast signcryption from the initial signcrypter to the re-signcrypter by executing a proxy re-signature. It is proved to have indistinguishability against chosen multiple-identity and adaptive chosen ciphertext attacks, and existential unforgeability against chosen multiple-identity and message attacks, based on the hardness of the CBDH (computational bilinear Diffie-Hellman) and CDH (computational Diffie-Hellman) problems. Finally, its application in cloud data sharing was introduced.
Efficient Identity-based Sanitizable Signature Scheme in Standard Model
MING Yang and LI Rui
Computer Science. 2013, 40 (5): 158-163. 
Abstract PDF(421KB) ( 512 )   
A sanitizable signature scheme allows a sanitizer to modify specific parts of the original message, without interacting with the signer, and to generate a valid signature for the modified message. This paper proposed an identity-based sanitizable signature scheme in the standard model, based on Waters' technique and Li's technique and using bilinear pairings. Security analysis shows that the proposed scheme satisfies unforgeability, indistinguishability and immutability. Compared with existing secure schemes in the standard model, the design has higher computational efficiency and smaller communication cost.
Statistically Significant Sequential Pattern Mining Applying to Software Defect Prediction
TANG Lei,LI Chun-ping and YANG Liu
Computer Science. 2013, 40 (5): 164-167. 
Abstract PDF(405KB) ( 410 )   
People rely more and more on software systems with high reliability and usability, and software defect prediction has become one of the most active areas of software engineering. This paper introduced software defect prediction based on sequential pattern mining and designed a defect prediction model built on mining statistically significant patterns. It described the architecture and implementation details of the InfoMiner and STAMP algorithms. The model, which uses InfoMiner and STAMP to mine patterns, the chi-square test for feature selection, and an SVM for classification, can find unknown defects with high probability. Experimental results show that the model achieves high prediction accuracy, so it is valuable and promising.
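The chi-square feature selection step can be illustrated for a binary pattern-occurrence feature against a binary defect label (a standard 2x2 chi-square statistic; the surrounding pipeline details are assumptions, not the paper's):

```python
def chi2_stat(x, y):
    """Chi-square statistic of a binary feature x (pattern occurs
    in a module or not) versus a binary label y (module defective
    or not), computed from the 2x2 contingency table. Higher values
    mean the pattern is more informative for defect prediction."""
    n = len(x)
    a = sum(1 for xi, yi in zip(x, y) if xi and yi)          # x=1, y=1
    b = sum(1 for xi, yi in zip(x, y) if xi and not yi)      # x=1, y=0
    c = sum(1 for xi, yi in zip(x, y) if not xi and yi)      # x=0, y=1
    d = n - a - b - c                                        # x=0, y=0
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0
```

Patterns ranked by this statistic would then feed an SVM classifier.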
Ontology Concept Learning Method for Compound Terms
LI Jiang-hua,SHI Peng and HU Chang-jun
Computer Science. 2013, 40 (5): 168-172. 
Abstract PDF(479KB) ( 395 )   
Term extraction plays an important role in text-based ontology concept learning. Because there are no explicit boundaries between words in Chinese text, domain terms, especially compound terms, are difficult to extract. Traditional term extraction methods usually require a large amount of computation and lack semantic support. A novel ontology concept learning method for compound terms was presented in this paper. First, natural language processing techniques are used to remove irrelevant parts and obtain candidate terms; sentences in the text are cut by punctuation marks and the removed parts, so that candidate compound terms are preserved from wrong segmentation. The candidate domain-specific terms are then filtered with a multi-strategy combination of term frequency and information entropy, according to the distributional and statistical characteristics of terms. Finally, the domain-specific concept set is obtained after synonymous term recognition. Experimental results show that the method extracts both single-word and compound domain terms with higher precision.
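The information-entropy filter can be sketched as the boundary entropy of a candidate term's adjacent characters — one common formulation (the paper's exact statistic is not specified in the abstract): a true term tends to appear in diverse contexts, giving high entropy on both sides.

```python
import math
from collections import Counter

def boundary_entropy(contexts):
    """Shannon entropy (bits) of the characters observed adjacent
    to a candidate term across a corpus; high entropy on both the
    left and right side suggests a genuine term boundary."""
    counts = Counter(contexts)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```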
Software Dependable Evolution Analysis Method Considering Software Historical Behavior Data
ZHAO Qian,FENG Guang-sheng and LI Li
Computer Science. 2013, 40 (5): 173-176. 
Abstract PDF(621KB) ( 354 )   
According to the dependability requirements of dependable software evolution, an analysis method that takes software historical behavior data into account was proposed. By collecting historical behavior data during software evolution, the method extracts and computes properties related to dependable evolution, establishes coloring principles, and builds a software evolution analysis chart. Hidden information embedded in the evolution data can be obtained in real time, effectively and intuitively, by observing software units in the chart. The method can analyze whether an evolution is dependable, supporting the analysis of evolution data.
Formalization for Model Element of UML Statechart in RSL
GUO Yan-yan and LIU Jing-lei
Computer Science. 2013, 40 (5): 177-183. 
Abstract PDF(549KB) ( 428 )   
As an important part of UML's dynamic description mechanism, the UML statechart plays an important role in describing the dynamic behavior of systems and models, but existing dynamic semantics of UML lack an accurate formal description. UML statecharts were defined as abstract syntax graphs, which extend the traditional finite automaton into a new finite automaton with added state hierarchy. This paper then formalized the model elements of the UML statechart in the RAISE Specification Language (RSL). The resulting formal semantics of the model elements is clearer and more accurate, and lays the foundation for later study of the operational semantics of UML statecharts.
Taint Trace with Noninterference Based Approach for Software Trust Analysis
CHEN Shu,YE Jun-min and ZHANG Fan
Computer Science. 2013, 40 (5): 184-188. 
Abstract PDF(691KB) ( 372 )   
A model for software trustworthiness analysis was proposed based on taint data tracing and noninterference theory. The approach extracts the core operating system APIs that may cause untrusted behaviors by tracing taint data imported from outside the software environment. These APIs form a taint-dependency behavior model, into which an information flow model is imported to analyze whether the behavior is trusted. A theorem for the trust determination is also given.
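A toy illustration of the taint-tracing step (the API names and the trace format below are hypothetical, not from the paper): values derived from outside sources are marked tainted, and tainted data reaching a sensitive API is flagged as a potentially untrusted behavior.

```python
def propagate_taint(calls, sources, sinks):
    """Toy taint trace. `calls` is a chronological list of
    (api, inputs, output) tuples; any value derived from an
    outside source is tainted, and a tainted value reaching a
    sensitive sink API is reported."""
    tainted = set(sources)
    alerts = []
    for api, inputs, output in calls:
        if tainted & set(inputs):      # taint flows through the call
            tainted.add(output)
            if api in sinks:
                alerts.append(api)
    return alerts
```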
Research of Process Knowledgebase Oriented to Typical Aerospace Product
GUO Ya-fei,HOU Jun-jie and SHI Sheng-you
Computer Science. 2013, 40 (5): 189-192. 
Abstract PDF(668KB) ( 365 )   
With the rapid development of the space industry, it is urgent to research process experience knowledge bases for typical aerospace products. Based on a study of process experience knowledge for typical aerospace products and its characteristics, this paper constructed knowledge representation models of basic resources, process projects and process experience, designed a process knowledge base oriented to typical aerospace products, and verified the practicality of the knowledge base with a prototype system.
Evaluation and Optimization of Clinical Pathway Based on Petri Net
TIAN Yan,MA Xiao-pu,ZHANG Xin-gang and ZHANG Ting
Computer Science. 2013, 40 (5): 193-197. 
Abstract PDF(408KB) ( 367 )   
In order to quantitatively evaluate and optimize clinical pathways, a clinical pathway model based on hierarchical timed colored Petri nets was proposed. According to the distribution law of patients' arrival times, the breast cancer clinical pathway of a hospital was modeled and tested in CPN Tools 3.4 by setting and adjusting the relevant parameters, yielding important data such as hospitalization days, medical expenses, the shortest length of stay and the resource allocation of each period. These provide an effective reference for the customization, screening and optimization of clinical pathways.
Matching Twig Patterns in Uncertain XML without Merging
LIU Li-xin,ZHANG Xiao-lin,LV Qing,ZHANG Huan-xiang and CHU Yan-hua
Computer Science. 2013, 40 (5): 198-200. 
Abstract PDF(311KB) ( 430 )   
In order to avoid storing and merging large amounts of intermediate results when matching twig patterns in uncertain XML, this paper proposed the algorithm ProTwigList. First, it uses Tag+Level stream pruning to reduce the number of nodes to be processed. Second, it extends the range encoding to encode the remaining ordinary nodes, and adopts rules to mark distributed nodes. Third, it uses the path of the common distributed node to deal with distributed nodes. Finally, it utilizes the probability of the lowest common ancestor node to calculate the probability of every result. Theoretical analysis and experimental results prove the efficiency of ProTwigList.
Partial Absorbing Markov Chain Based Multi-document Summarization
GAO Jing and FANG Jun
Computer Science. 2013, 40 (5): 201-205. 
Abstract PDF(382KB) ( 517 )   
Absorbing Markov chains have proven effective in text summarization. However, algorithms based on absorbing Markov chains are not only time-consuming, owing to matrix inversion, but also unable to integrate information other than the relationships among sentences, because of the limitation of the model. This paper presents a novel multi-document summarization approach based on partial absorbing Markov chains. The equivalence between the average expected visits in an absorbing Markov chain and the stationary probability in the corresponding partial absorbing Markov chain was demonstrated. The stationary probability of the partial absorbing Markov chain, which is easily calculated, then serves as the criterion for ranking sentences. In addition, other kinds of information are incorporated to produce a more accurate solution of the stationary probability. Experiments were performed on the TAC 2011 main task.
Rules String Extracting Method for B2B System Based on Double-array Trie
LI Hui,YANG Bing-ru,PAN Li-fang and QIAN Wen-bin
Computer Science. 2013, 40 (5): 206-208. 
Abstract PDF(317KB) ( 461 )   
To extract product specification data in B2B systems, a ruled-string extraction method based on a double-array trie was proposed. In B2B systems the parameters of a product specification take the form "name:value". The method constructs rules according to this data feature, and generates a double-array trie from the rule database for the extraction process. Optimization measures are adopted to improve the storage efficiency of the double-array trie, including giving high priority to subtrees with more child nodes. The method extracts all ruled strings by scanning the input text once, and the accuracy of the results is improved by filtering according to the restriction conditions of the rules. Experimental results show that the method improves accuracy and decreases complexity compared with the traditional method; the complexity of the extraction algorithm is O(n).
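A plain dict-based trie (a stand-in for the double-array layout, which is a storage optimization of the same structure) illustrates the left-to-right scan for "name:" rule strings; the rule strings below are hypothetical examples.

```python
def build_trie(patterns):
    """Build a nested-dict trie; '$' marks the end of a pattern.
    (A double-array trie stores the same structure in two flat
    arrays for compactness and cache efficiency.)"""
    root = {}
    for p in patterns:
        node = root
        for ch in p:
            node = node.setdefault(ch, {})
        node["$"] = p
    return root

def scan(text, trie):
    """At each offset of a single left-to-right pass, follow the
    trie as far as it matches and record every completed pattern."""
    hits = []
    for i in range(len(text)):
        node = trie
        for ch in text[i:]:
            if ch not in node:
                break
            node = node[ch]
            if "$" in node:
                hits.append((i, node["$"]))
    return hits
```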
Meta-association Rule Mining for Dynamic Association Rule Used GM(1,1) Based on Wavelet Transform
ZHANG Zhong-lin and XU Fan
Computer Science. 2013, 40 (5): 209-212. 
Abstract PDF(390KB) ( 398 )   
This paper put forward a method that applies the wavelet transform to meta-rule mining of dynamic association rules, to improve the forecast accuracy of the rules. First, the Daubechies wavelet is used to transform the support series of the meta-rules in dynamic association rules. The approximation part and the detail part are then extracted according to the multi-resolution property of the wavelet transform. Finally, each part is reconstructed separately, and GM(1,1) is used to forecast the two reconstructed parts to obtain the final prediction results. Experiments show higher prediction accuracy.
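The GM(1,1) forecasting step can be sketched as follows (standard grey-model fitting; the wavelet decomposition around it is omitted here):

```python
import numpy as np

def gm11(x, k):
    """GM(1,1) grey forecast: fit the model on series x and predict
    k steps ahead. The series is accumulated (AGO), the development
    coefficient a and grey input b are estimated by least squares on
    the whitened equation, and forecasts are restored by differencing."""
    x = np.asarray(x, dtype=float)
    x1 = np.cumsum(x)                       # accumulated generating series
    z = 0.5 * (x1[1:] + x1[:-1])            # background (mean) values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    n = len(x)

    def x1_hat(t):                          # time response function
        return (x[0] - b / a) * np.exp(-a * t) + b / a

    return [x1_hat(t) - x1_hat(t - 1) for t in range(n, n + k)]
```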
Improved ITO Algorithm for Solving the CVRP
YI Yun-fei,CAI Yong-le,DONG Wen-yong and LIN Guo-long
Computer Science. 2013, 40 (5): 213-216. 
Abstract PDF(332KB) ( 658 )   
In order to overcome the shortcomings of path selection in the vehicle routing problem, the ant colony algorithm's method was used to select customer points. In addition, a cooling schedule was used to control the temperature parameter, together with the drift operator and fluctuation operator, to improve the ITO algorithm, which is based on hypothesis testing and the Ito stochastic process. The improved ITO algorithm is effective in solving capacitated vehicle routing problems, and numerical results show that the improved algorithm is feasible.
Mining Fuzzy Association Rules Based on Multi-mutation Particle Swarm Optimization Algorithm
WANG Fei and GOU Jin
Computer Science. 2013, 40 (5): 217-223. 
Abstract PDF(555KB) ( 401 )   
To deal with the problems that continuous values in transaction databases are difficult to partition and that particle swarm optimization easily falls into local optima, this paper proposed a multi-mutation particle swarm optimization framework for extracting fuzzy association rules. First, the continuous values are divided into fuzzy intervals; the multi-mutation particle swarm optimization algorithm then mines fuzzy association rules from the partition results. The paper described the fuzzy partition method and the parameters and framework of the multi-mutation particle swarm optimization algorithm, and demonstrated the accuracy and efficiency of the method by comparative analysis in several experiments.
General Weighted Minkowski Distance and Quantum Genetic Clustering Algorithm
QIAN Guo-hong,HUANG De-cai and LU Yi-hong
Computer Science. 2013, 40 (5): 224-228. 
Abstract PDF(401KB) ( 348 )   
Dissimilarity and similarity measures are very important factors in clustering algorithms and always affect the results of clustering analysis. Many clustering algorithms use the Euclidean distance as their similarity measure. The Euclidean distance cannot reflect the global information of attributes and does not consider the differences in units between attributes, so it gives poor results when there are obvious differences in units and domains. This paper therefore put forward a generally weighted Minkowski distance whose weights are determined by the unit and domain information of each attribute's values. The characteristics of the whole data set are considered, the discord between attributes is removed, and the use of fractional weights weakens the influence of noisy data. We used the new distance measure in classic k-means and in quantum genetic k-means, and the experimental results show that the new algorithm is effective.
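A minimal sketch of a per-attribute weighted Minkowski distance, where the weights would be derived from each attribute's unit and domain information, e.g. the reciprocal of its value range (here they are simply passed in):

```python
def weighted_minkowski(a, b, w, p=2):
    """Minkowski distance of order p with per-attribute weights w,
    so attributes measured in different units and domains can be
    made to contribute comparably to the distance."""
    return sum(wi * abs(x - y) ** p for wi, x, y in zip(w, a, b)) ** (1 / p)
```

With p=2 and unit weights this reduces to the ordinary Euclidean distance, so it drops into k-means unchanged.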
Real-time Path Planning for Mobile Robots Based on Quantum Evolutionary Algorithm
SHEN Xiao-ning and XIE Wen-wu
Computer Science. 2013, 40 (5): 229-232. 
Abstract PDF(681KB) ( 420 )   
In order to solve the real-time path planning problem for mobile robots, an improved quantum evolutionary algorithm was proposed. The environment is modeled with the grid method, and a novel decoding method that transforms a quantum individual into a path described by grid points was presented. On the basis of the quantum rotation gate, the crossover and mutation operators of genetic algorithms and a repair operation designed specifically for the path planning problem were introduced to jointly update the quantum population, which improves the search efficiency. The robot's real-time path planning process was simulated with the Matlab GUI. Simulation results indicate that the proposed method obtains a feasible, short path in complex environments. Additionally, when a new obstacle appears suddenly, or the original obstacles move in different directions, the method can respond quickly and replan an optimal path in the new environment.
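The quantum rotation gate and the measurement of a quantum bit can be sketched as follows (generic quantum-evolutionary-algorithm operators; the paper's specific rotation-angle schedule and decoding are not reproduced):

```python
import math
import random

def rotate(alpha, beta, delta):
    """Quantum rotation gate: rotate a qubit's amplitudes (alpha, beta)
    by angle delta, steering the qubit toward the best individual's bit."""
    a = alpha * math.cos(delta) - beta * math.sin(delta)
    b = alpha * math.sin(delta) + beta * math.cos(delta)
    return a, b

def observe(alpha):
    """Collapse a qubit to a classical bit: P(bit = 1) = 1 - alpha**2."""
    return 1 if random.random() > alpha * alpha else 0
```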
Optimization Algorithm for Vehicle Scheduling Problem Based on Quantum Immune
REN Wei
Computer Science. 2013, 40 (5): 233-236. 
Abstract PDF(410KB) ( 529 )   
For the vehicle scheduling problem with time windows, a hybrid quantum evolutionary algorithm with an immune operator was put forward. The algorithm improves the quantum rotation gate so that individuals move toward the global optimum during evolution, avoiding prematurity and maintaining the diversity of the population. In the iteration process an immune operator is introduced: excellent gene fragments are extracted as vaccines and inoculated into the other individuals of the population, preventing the algorithm from degenerating. Experimental simulations on standard instances show that the proposed method not only solves the problem effectively but also significantly speeds up convergence.
Function P-sets and Internal-mining of Internal P-information Law Dependence
ZHAO Shu-li,ZHANG Huan-li and SHI Kai-quan
Computer Science. 2013, 40 (5): 237-241. 
Abstract PDF(370KB) ( 377 )   
Function P-sets (function packet sets) are obtained by introducing the function concept into P-sets (packet sets) and improving them; they have a dynamic characteristic. Function P-sets are a set pair composed of a function internal P-set S (function internal packet set) and a function outer P-set SF (function outer packet set); that is, (S, SF) is a function P-set. Using function internal P-sets, the paper gave the concept and attribute theorems of internal P-information law generation, the dependence characteristic and data loss theorem, and the orbital transfer characteristic of the information law in internal P-information law dependence. Finally, the internal mining of unknown internal P-information law dependence and its application were given.
Bagging-based Probabilistic Neural Network Ensemble Classification Algorithm
JIANG Yun,CHEN Na,MING Li-te,ZHOU Ze-xun,XIE Guo-cheng and CHEN Shan
Computer Science. 2013, 40 (5): 242-246. 
Abstract PDF(421KB) ( 951 )   
Research on neural network classification algorithms has concentrated mostly on the BP algorithm, the representative of neural networks. Considering the disadvantages of BP neural networks, and based on an analysis of probabilistic neural networks and machine learning combined with the idea of ensemble learning, we proposed a new classification algorithm: a probabilistic neural network ensemble based on Bagging. Theoretical analysis and experimental results show that the proposed algorithm effectively reduces the classification error and improves accuracy. It has good generalization ability and faster execution than traditional classification methods such as BP neural networks, and achieves better and more stable classification results.
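The Bagging wrapper itself can be sketched generically (the base learner below is a stand-in; the paper trains probabilistic neural networks on the bootstrap samples):

```python
import random
from collections import Counter

def bagging_predict(train, x, base_fit, rounds=15):
    """Bagging: train `rounds` base classifiers, each on a bootstrap
    sample of `train` (sampling with replacement), then classify x
    by majority vote. `base_fit(data)` must return a predict function."""
    votes = []
    for _ in range(rounds):
        boot = [random.choice(train) for _ in train]   # bootstrap sample
        votes.append(base_fit(boot)(x))
    return Counter(votes).most_common(1)[0][0]
```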
Scheduling Single Batch Processing Machine Using Ant Colony Algorithm
XIA Xin
Computer Science. 2013, 40 (5): 247-250. 
Abstract PDF(283KB) ( 625 )   
An ant colony optimization (ACO) algorithm was proposed for the problem of minimizing makespan on a single batch processing machine with non-identical job sizes. ACO is used to group several jobs into a batch directly, and the parameter settings are also discussed. A new local search algorithm was provided to improve the performance of the ACO. Computational experiments show the improvement of the ACO over some other algorithms.
Mathematical Formula Matching Algorithm Based on Binary Tree
QIN Yu-ping,TANG Ya-wei,LUN Shu-xian and WANG Xiu-kun
Computer Science. 2013, 40 (5): 251-252. 
Abstract PDF(225KB) ( 992 )   
A mathematical formula matching algorithm based on binary trees was proposed. First, the binary tree form of a mathematical formula is generated from its LaTeX form and the tree structure is normalized; the binary tree is then traversed in pre-order to obtain the sequence of formula elements, and the variable names are normalized. For two mathematical formulas to be matched, the similarity is computed from the number of equal formula elements at corresponding positions. Experimental results show that the algorithm recognizes mathematical formulas accurately, so it is a practical algorithm.
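The position-wise similarity over the normalized pre-order sequences can be sketched as follows (normalizing the score by the longer sequence's length is an assumption; the abstract does not give the exact formula):

```python
def similarity(seq1, seq2):
    """Similarity of two formulas: the fraction of equal elements at
    corresponding positions of their normalized pre-order sequences,
    divided by the length of the longer sequence."""
    n = max(len(seq1), len(seq2))
    same = sum(1 for a, b in zip(seq1, seq2) if a == b)
    return same / n if n else 1.0
```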
Rough Assessment System and Path Optimization
LI Zhuo-wen,YAN Lin,SONG Jin-peng and WANG Xu-fei
Computer Science. 2013, 40 (5): 253-256. 
Abstract PDF(316KB) ( 395 )   
By adding a binary relation and a number of assessment factors to an approximation space, a rough assessment system was obtained. Based on the assessment factors, the concepts of the weight and the comprehensive weight were defined, which led to the comprehensive lower value of an edge set. A path can thus be optimized using the comprehensive lower value, and a path optimization algorithm was formed. Moreover, a mathematical model of a practical problem was constructed from a rough assessment system describing supply relationships between enterprises. Supply paths were then optimized by applying the path optimization algorithm to them, demonstrating the validity of the algorithm.
Hybrid Differential Evolutionary Algorithm Based on Extremal Optimization
WANG Cong-jiao,WANG Xi-huai and XIAO Jian-mei
Computer Science. 2013, 40 (5): 257-260. 
Abstract PDF(321KB) ( 729 )   
A new hybrid algorithm based on differential evolution (DE) and extremal optimization (EO) was proposed to address the premature convergence and low precision of standard differential evolution on complex optimization problems. Its key point is that when the population becomes highly aggregated, the hybrid algorithm introduces population-based extremal optimization into the DE iteration, using the volatility of EO to increase population diversity and the ability to escape local optima. Simulations show that the hybrid algorithm has remarkable global convergence ability and effectively avoids premature convergence.
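One generation of standard DE/rand/1/bin, the component the hybrid builds on (the EO phase and the aggregation test are omitted), can be sketched as:

```python
import random

def de_step(pop, fitness, f=0.5, cr=0.9):
    """One DE/rand/1/bin generation: mutant v = a + F*(b - c) from
    three distinct random individuals, binomial crossover with the
    target vector, then greedy selection (lower fitness wins)."""
    dim = len(pop[0])
    new = []
    for i, x in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        jr = random.randrange(dim)          # guaranteed crossover index
        trial = [a[j] + f * (b[j] - c[j])
                 if (random.random() < cr or j == jr) else x[j]
                 for j in range(dim)]
        new.append(min(x, trial, key=fitness))
    return new
```

Because selection is greedy per individual, the best fitness in the population never worsens between generations.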
Real-time Action Recognition Based on Zone Shapes and Motion Features
PEI Li-shen,DONG Le,ZHAO Xue-zhuan and REN Peng
Computer Science. 2013, 40 (5): 261-265. 
Abstract PDF(928KB) ( 518 )   
This paper presented an efficient action recognition method based on Hu moment invariant features. First, the Hu moment invariants were refined into new features that are invariant to translation, rotation and scale. An action was then characterized by a 13-dimensional feature vector consisting of both Hu moment features and action speed features: the Hu moment features represent the zone shape of the action, while the speed features exhibit its motion characteristics. Finally, a support vector machine (SVM), trained on labeled action frames, was applied to classify test actions into different categories. The proposed method was evaluated on real-world videos and achieves acceptable recognition rates with desirable computational efficiency.
Subspace Decomposition Filtering Based on NNED for Polarimetric SAR
LIU Gao-feng,LI Ming,WANG Ya-jun and ZHANG Peng
Computer Science. 2013, 40 (5): 266-270. 
Abstract PDF(660KB) ( 356 )   
Although subspace decomposition filtering for polarimetric SAR preserves polarimetric information very well, its despeckling performance and ability to preserve edge and point information need to be enhanced. To this end, a subspace decomposition filter based on nonnegative eigenvalue decomposition (NNED) was proposed. For each pixel, the eigenvalues and eigenvectors of the parameter-vector covariance matrix are first calculated and each eigen-subspace is obtained; NNED is then used to select the optimal threshold separating the signal subspace from the noise subspace, with the minimal similarity measurement of scattering mechanisms as the selection criterion; finally, the filtered result is produced from the signal subspace. Experiments on real POLSAR data show that, compared with other algorithms of its kind, the proposed algorithm efficiently suppresses speckle noise while preserving edge and point information very well.
Face Hallucination via KNN Sparse Coding Mean Constrained
HUANG Ke-bin,HU Rui-min,HAN Zhen,LU Tao,JIANG Jun-jun and WANG Feng
Computer Science. 2013, 40 (5): 271-273. 
Abstract PDF(612KB) ( 410 )   
A novel sparse-representation-based super-resolution (SR) method was proposed to reconstruct a high-resolution (HR) face image from a low-resolution (LR) observation using training samples. First, a specific LR-HR over-complete dictionary pair is learned for each patch position over the co-located patches of all training samples. Second, a K-nearest-neighbor (KNN) sparse coding mean constraint is used to make the sparse representation of the input patch more accurate. Third, the HR patch is hallucinated from the sparse representation coefficients and the HR dictionary. Finally, the HR face image is formed by integrating the hallucinated HR patches. Experiments on extensive data validate the proposed method; compared with some state-of-the-art methods, it exhibits better performance in both subjective and objective quality.
Vessel Skeletonization Method for Lung CT Images
CHEN Gang,LV Xuan,WANG Zhi-cheng and CHEN Yu-fei
Computer Science. 2013, 40 (5): 274-278. 
Abstract PDF(952KB) ( 794 )   
Skeletonization has been widely used in image analysis and other areas; representing an image by its skeleton preserves the topological structure while reducing redundant information. In the clinical diagnosis and treatment of lung disease, effectively representing and analyzing the vascular structure is very important for computer-aided diagnosis and surgery. In this paper, a vessel skeletonization method for lung CT images was proposed. First, vessels are segmented from the thoracic tissues by a region growing algorithm. Second, morphological operators are applied to the vessel segmentation results. Finally, the skeleton of the blood vessels is obtained by a three-dimensional thinning algorithm. Experimental results show that the proposed method accurately and efficiently extracts vessel skeletons from lung CT images.
Pruned Incremental Extreme Learning Machine Fuzzy Neural Network
HU Rong and XU Wei-hong
Computer Science. 2013, 40 (5): 279-282. 
Abstract PDF(293KB) ( 345 )   
ELM for batch learning, developed by Huang et al., has been shown to be extremely fast, with generalization performance better than other batch training methods. To enable online learning, a pruned incremental extreme learning algorithm was developed for fuzzy neural networks on the basis of the previously proposed ELM method. First, a set of simple antecedents and random values for the parameters of the input membership functions are generated randomly. SVD is then used to rank the fuzzy basis functions, and the best number of fuzzy rules is selected by a fast computation of the leave-one-out validation error. Finally, the consequent parameters are determined analytically. The learning procedure does not need to memorize past data, so it is a truly incremental learning method: when a new datum arrives, the network need not be retrained. A comparison against well-known neuro-fuzzy methods shows that the proposed method is robust and competitive in terms of accuracy and speed.
Corner Matching Based on Edge Correlation Distance Constraints
WANG Wan-liang,JIN Yi-ting,ZHAO Yan-wei and ZHENG Jian-wei
Computer Science. 2013, 40 (5): 283-286. 
Abstract PDF(368KB) ( 469 )   
References | RelatedCitation | Metrics
In binocular vision, to address the incorrect matching of edge corners, a corner matching algorithm based on an edge correlation distance constraint was proposed. Firstly, an edge-based corner detection algorithm extracts the corners, and the set of candidate corner matches is confirmed by the epipolar constraint and a corner threshold constraint. Then an "edge correlation" constraint is put forward, and a contribution value for each candidate corner pair, based on corner distance, is constructed for fine matching. Finally, corner vectors are constructed and the matches are further verified by a sub-vector matching method. Experimental results show that this matching algorithm has a high accuracy rate, effectively solves the incorrect matching of edge corner pairs, and is well suited to edge-based binocular vision applications.
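The epipolar constraint used to prune candidate pairs can be sketched as follows. The fundamental matrix here is the one for an ideal rectified stereo pair (corresponding points share a row); a real system would estimate F from calibration.

```python
import numpy as np

def epipolar_residual(F, p_left, p_right):
    """Algebraic epipolar residual x_r^T F x_l for homogeneous image
    points; a candidate corner pair is kept only if this is near zero."""
    xl = np.append(p_left, 1.0)
    xr = np.append(p_right, 1.0)
    return float(xr @ F @ xl)

# rectified stereo: epipolar lines are image rows, so F = [e]_x, e = (1,0,0)
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])

good = epipolar_residual(F, (10.0, 5.0), (7.0, 5.0))  # same row  -> 0
bad  = epipolar_residual(F, (10.0, 5.0), (7.0, 9.0))  # off the epipolar line
print(good, bad)
```

The distance-based "contribution value" of the paper would then re-rank the pairs that survive this test.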
Backward Cloud-based Method for Image Transition Region Extraction and Thresholding
WU Tao,CHEN Yi-xiang and YANG Jun-jie
Computer Science. 2013, 40 (5): 287-290. 
Abstract PDF(682KB) ( 366 )   
References | RelatedCitation | Metrics
In order to select the optimal threshold for automatic image segmentation, a novel method for image thresholding, based on the cloud model and the transition region, was proposed. The method first calculates the cloud model of a given image with a backward cloud algorithm. Next, it defines a criterion to search for the threshold adaptively, minimizes this criterion over the skeleton interval of the cloud model, and generates the cloud model of the image transition region. Then, it extracts the transition region, with its uncertainty, using the maximum-determination rule. Finally, it achieves image thresholding according to the grayscale peak value of the pixels in the transition region. The proposed method addresses image transition region extraction and thresholding within the cloud model framework, and thus accounts for uncertainty. Quantitative and qualitative experiments indicate that the proposed method yields accurate and robust results, and is reasonable and effective.
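A backward cloud generator estimates the three numerical characteristics of a normal cloud, expectation Ex, entropy En, and hyper-entropy He, from samples. The sketch below uses the standard moment-based estimators on synthetic grey-level data; the paper's criterion search and transition-region extraction are not reproduced.

```python
import numpy as np

def backward_cloud(samples):
    """Backward cloud generator: estimate (Ex, En, He) from data.
    Ex = sample mean; En from the mean absolute deviation of a normal
    distribution (E|x-Ex| = En * sqrt(2/pi)); He from the excess variance."""
    x = np.asarray(samples, dtype=float)
    ex = x.mean()
    en = np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - ex))
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))
    return ex, en, he

rng = np.random.default_rng(1)
gray = rng.normal(128.0, 20.0, size=10000)   # toy grey-level samples
ex, en, he = backward_cloud(gray)
print(round(ex, 1), round(en, 1))
```

For a pure Gaussian, He is near zero; a larger He signals the kind of fuzziness the transition-region step exploits.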
Method of Map Matching Based on GPS Terminal
ZHU Zheng-yu,CUI Ming and LIU Lin
Computer Science. 2013, 40 (5): 291-295. 
Abstract PDF(383KB) ( 476 )   
References | RelatedCitation | Metrics
This paper presented a map matching method with lower complexity, better real-time performance, and higher matching accuracy. The method, based on the weight-measure and curve-fitting map matching ideology and combined with road-network topology information, provides vehicle location information in a timely and accurate manner, and can be applied to vehicle monitoring, real-time traffic information collection, and vehicle navigation. The paper also proposed a position-dependent arc-projection algorithm and a dead-reckoning algorithm for use when the GPS signal is not valid.
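The geometric core of arc projection is snapping a GPS fix onto the nearest road segment. A minimal planar sketch, with invented coordinates (real map matching would work on the candidate segments chosen by the topology and weight measures):

```python
import math

def project_to_segment(p, a, b):
    """Project GPS fix p onto road segment a-b (planar coordinates);
    return the matched point and its distance to the fix."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # parameter of the orthogonal projection, clamped to the segment
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    mx, my = ax + t * dx, ay + t * dy
    return (mx, my), math.hypot(px - mx, py - my)

pt, d = project_to_segment((2.0, 1.0), (0.0, 0.0), (4.0, 0.0))
print(pt, d)  # (2.0, 0.0) 1.0
```

Among all candidate segments, the match with the smallest (weighted) distance wins; dead reckoning would supply `p` when GPS is unavailable.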
2D Fuzzy Entropy Image Threshold Segmentation Method Based on CPSO
ZHAO Yue,LI Jing-jiao,XU Xin,CHEN Chao and BAI Xin
Computer Science. 2013, 40 (5): 296-299. 
Abstract PDF(562KB) ( 327 )   
References | RelatedCitation | Metrics
Because the PSO algorithm suffers from false convergence and prematurity, an adaptive chaotic particle swarm optimization (ACPSO) and its application to image segmentation were proposed. First, an improved adaptive particle swarm optimization algorithm (IAPSO) was presented. Second, chaos optimization was incorporated into the IAPSO: a chaotic variable is used to initialize the positions and velocities of the particles, and the Infinite Collapses chaotic mutation mapping is applied to the best part of the particles of the current population for chaos optimization. Finally, the ACPSO algorithm was applied to image segmentation. The maximum fuzzy Shannon entropy threshold segmentation method based on ACPSO was compared with the same method based on basic PSO. The results indicate that the fuzzy entropy threshold image segmentation method based on the adaptive CPSO has better performance.
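Chaotic initialization replaces uniform random numbers with an ergodic chaotic sequence mapped onto the search range. As a stand-in for the Infinite Collapses map named in the abstract (whose exact form is not given here), this sketch uses the familiar logistic map; the seed value and swarm size are illustrative.

```python
def logistic_chaos(n, x0=0.345, mu=4.0):
    """Generate n chaotic values in (0, 1) with the logistic map
    x_{k+1} = mu * x_k * (1 - x_k), used as a stand-in chaotic sequence."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        xs.append(x)
    return xs

# map the chaotic sequence onto the grey-level range to seed
# particle positions for threshold search
lo, hi = 0.0, 255.0
positions = [lo + c * (hi - lo) for c in logistic_chaos(10)]
print(all(lo <= p <= hi for p in positions))
```

The chaotic mutation step of ACPSO would perturb the current best particles with the same kind of sequence to escape premature convergence.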
Fast Image Segmentation Algorithm Combining CS-LBP Texture Features
LIU Yi,HUANG Bing,SUN Huai-jiang and XIA De-shen
Computer Science. 2013, 40 (5): 300-302. 
Abstract PDF(600KB) ( 420 )   
References | RelatedCitation | Metrics
The graph cuts algorithm is one of the most effective interactive image segmentation methods, but it is prone to segmentation errors and the shrinking-bias phenomenon when the colors of foreground and background are similar, and its interaction efficiency is low because it operates on individual pixels. To address these problems, an algorithm combining CS-LBP texture features was proposed in this paper. First, the mean shift algorithm is applied to pre-segment the original image into regions, from which a region adjacency graph is constructed. Then a cumulative histogram and the CS-LBP texture descriptor are used to extract color and texture features from each region. A texture-constraint term is added to the energy function, and a locally adaptive regularization parameter is used, so the segmentation quality and the shrinking-bias phenomenon are improved effectively. Experiments show that both interaction efficiency and segmentation accuracy are improved.
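The CS-LBP descriptor compares centre-symmetric neighbour pairs rather than each neighbour against the centre, so an 8-neighbourhood yields a 4-bit code (16 bins instead of 256). A minimal sketch on toy patches, with an illustrative threshold:

```python
import numpy as np

def cs_lbp(img, t=0.01):
    """Centre-symmetric LBP: for each interior pixel, compare the 4
    centre-symmetric neighbour pairs; each comparison sets one bit,
    giving a 4-bit (0..15) texture code per pixel."""
    g = img.astype(float)
    h, w = g.shape
    # pairs around (y, x): (E, W), (SE, NW), (S, N), (SW, NE)
    pairs = [((0, 1), (0, -1)), ((1, 1), (-1, -1)),
             ((1, 0), (-1, 0)), ((1, -1), (-1, 1))]
    code = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, ((dy1, dx1), (dy2, dx2)) in enumerate(pairs):
        a = g[1 + dy1:h - 1 + dy1, 1 + dx1:w - 1 + dx1]
        b = g[1 + dy2:h - 1 + dy2, 1 + dx2:w - 1 + dx2]
        code |= (a - b > t).astype(np.uint8) << bit
    return code

flat = np.full((5, 5), 100.0)                           # textureless patch
edge = np.tile(np.arange(5, dtype=float) * 50, (5, 1))  # horizontal gradient
print(cs_lbp(flat).max(), cs_lbp(edge)[1, 1])
```

Per-region histograms of these codes give the compact texture feature that the energy function's texture-constraint term compares.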
Algorithm of Image Rotation Based on New Interpolation Method
KANG Mu and LING Feng-cai
Computer Science. 2013, 40 (5): 303-306. 
Abstract PDF(570KB) ( 839 )   
References | RelatedCitation | Metrics
Rotation algorithms based on conventional image interpolation have shortcomings: they often blur the image or produce saw-toothed edges. This paper changed the model used to represent the image, and an image interpolation method combining planar interpolation and spherical interpolation was proposed. It avoids the traditional approach of approximating all pixels with a single unified model; different interpolation methods are used in different circumstances. Theoretical analysis and experimental results show the effectiveness of the method.
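The classical planar scheme that the paper improves upon is rotation by inverse mapping with bilinear interpolation: each output pixel samples the source at its back-rotated coordinate. A minimal sketch on a toy image:

```python
import math
import numpy as np

def rotate_bilinear(img, angle_deg):
    """Rotate about the image centre: for each output pixel, back-rotate
    its coordinate into the source image and sample it with bilinear
    interpolation; out-of-range samples stay zero."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    a = math.radians(angle_deg)
    ca, sa = math.cos(a), math.sin(a)
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            sx = ca * (x - cx) + sa * (y - cy) + cx   # inverse rotation
            sy = -sa * (x - cx) + ca * (y - cy) + cy
            x0, y0 = int(math.floor(sx)), int(math.floor(sy))
            if 0 <= x0 < w - 1 and 0 <= y0 < h - 1:
                fx, fy = sx - x0, sy - y0
                out[y, x] = ((1 - fx) * (1 - fy) * img[y0, x0]
                             + fx * (1 - fy) * img[y0, x0 + 1]
                             + (1 - fx) * fy * img[y0 + 1, x0]
                             + fx * fy * img[y0 + 1, x0 + 1])
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
same = rotate_bilinear(img, 0.0)
print(float(same[2, 2]))  # 12.0 (a 0-degree rotation leaves interior pixels unchanged)
```

The weighted average over four neighbours is exactly what blurs edges; the paper's idea is to switch to a spherical model where this planar averaging is a poor fit.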
Research on Face Occlusion Recovery Algorithm in Face Recognition
DU Xing-jing and GUO Ming-xiong
Computer Science. 2013, 40 (5): 307-310. 
Abstract PDF(619KB) ( 1537 )   
References | RelatedCitation | Metrics
Based on an analysis of various facial occlusion handling algorithms, an automatic multi-value mask PCA face reconstruction model (MMPCA model) was proposed. First, the facial features are extracted, and the eigenface difference between a test face and a standard face is calculated to judge facial occlusion accurately and determine the occlusion type. Then an occlusion mask is estimated with an M-estimator, the pixels' amplitude parameters are estimated, and the multi-value occlusion mask is generated. Last, three half-quadratic functions are used to ensure that the optimal synthesis coefficients are unique and convergent, and the optimal synthesis coefficients are obtained to reconstruct the face. Experimental results show that the occluded region is restored, and the influence of occlusion on face recognition accuracy is weakened, by the automatic multi-value mask PCA face reconstruction model.
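The reconstruction core of masked PCA can be sketched as follows: fit the eigenface synthesis coefficients using only unoccluded pixels, then synthesize the full face. This toy replaces the paper's M-estimator mask and half-quadratic optimization with a fixed binary mask and an ordinary least-squares solve; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy "faces": 50 samples of a 20-pixel signal spanned by 3 components
basis = rng.standard_normal((3, 20))
faces = rng.standard_normal((50, 3)) @ basis

mean = faces.mean(axis=0)
# eigenfaces = top right singular vectors of the centred data
_, _, Vt = np.linalg.svd(faces - mean, full_matrices=False)
eig = Vt[:3]

test = rng.standard_normal(3) @ basis    # a new face from the same subspace
mask = np.ones(20, dtype=bool)
mask[5:10] = False                       # pixels 5..9 are "occluded"

# fit synthesis coefficients on visible pixels only...
A = eig[:, mask].T
c, *_ = np.linalg.lstsq(A, (test - mean)[mask], rcond=None)
# ...then reconstruct all pixels, including the occluded ones
recon = mean + c @ eig

err = float(np.abs(recon - test)[~mask].max())
print(err < 1e-6)
```

Because the test face lies exactly in the eigenface subspace here, the occluded pixels are recovered essentially perfectly; real faces would be recovered approximately, which is why a robust mask estimate matters.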
Height Estimation Algorithm Based on Surface Normal
YU Hong,WU Zhe-fu,LIU Kai and HE Xiong-xiong
Computer Science. 2013, 40 (5): 311-314. 
Abstract PDF(581KB) ( 517 )   
References | RelatedCitation | Metrics
Aiming at the problem of recovering height from a normal map, this paper proposed an algorithm that reconstructs height information from a map of surface normals, each defined as a three-dimensional vector. The algorithm starts from a plane storing the surface normals and constructs a plane storing height differences, then accumulates neighboring pixel heights outward from a chosen location in the pixel lattice. After all pixels have been processed, it proceeds again from another location using the already-computed heights. This iteration ends when the required accuracy is reached, and the accumulated heights are averaged. Experimental results show that the algorithm can generate a relatively high-resolution height map and works well for relief mapping, giving a better bump effect.
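The height-difference plane comes from the slope identities dz/dx = -nx/nz and dz/dy = -ny/nz. The sketch below integrates those differences along two scan paths and averages them, a one-pass stand-in for the paper's iterative accumulate-and-average scheme, verified on a tilted plane:

```python
import numpy as np

def height_from_normals(normals):
    """Recover a height map (up to a constant) from unit normals
    (nx, ny, nz): form per-pixel slopes p = -nx/nz, q = -ny/nz,
    integrate along two scan paths, and average the results."""
    nx, ny, nz = normals[..., 0], normals[..., 1], normals[..., 2]
    p, q = -nx / nz, -ny / nz
    z1 = np.cumsum(q[:, :1], axis=0) + np.cumsum(p, axis=1)  # down, then right
    z2 = np.cumsum(p[:1, :], axis=1) + np.cumsum(q, axis=0)  # right, then down
    z = 0.5 * (z1 + z2)
    return z - z[0, 0]                                       # anchor the free constant

# normal map of the tilted plane z = 0.3*x + 0.2*y
a, b = 0.3, 0.2
n = np.array([-a, -b, 1.0]) / np.linalg.norm([-a, -b, 1.0])
normals = np.broadcast_to(n, (4, 4, 3))
z = height_from_normals(normals)
ys, xs = np.mgrid[0:4, 0:4]
print(bool(np.allclose(z, a * xs + b * ys)))  # True
```

On noisy normal maps the two paths disagree, which is why the paper iterates the accumulation from different start locations and averages until the desired accuracy is reached.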