Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40, Issue 10 (2013)
Regression Testing Selection Techniques: A State-of-the-art Review
CHEN Xiang,GU Wei-jiang,XU Hui,GU Qing and CHEN Dao-xu
Computer Science. 2013, 40 (10): 1-9. 
Regression test case selection (RTS) is a hot research topic in regression testing. The technique aims to identify modification-revealing test cases in an existing test suite, but to date researchers in China have not systematically summarized and compared existing work on the RTS problem. This paper first formulated the RTS problem and its underlying assumptions based on a classification of regression testing activities and test cases. It then classified existing RTS techniques into two categories, code-based RTS and model-based RTS, and further divided code-based RTS techniques into subcategories such as the integer programming approach, data-flow analysis approach, graph-walk approach, program slicing approach, and firewall approach. It also summarized commonly used experimental subjects and evaluation metrics, and finally suggested potential future work on this topic.
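The code-based RTS family surveyed above shares one core step: keep only the tests whose covered entities overlap the modified entities. A minimal sketch of that selection step, with hypothetical test names and coverage data:

```python
def select_tests(coverage, modified):
    # Keep every test whose covered entities intersect the changed set;
    # under this coverage model the unselected tests cannot be
    # modification-revealing.
    changed = set(modified)
    return sorted(t for t, entities in coverage.items() if changed & set(entities))

coverage = {  # hypothetical per-test coverage data
    "t1": ["A.foo", "A.bar"],
    "t2": ["B.baz"],
    "t3": ["A.bar", "B.baz"],
}
print(select_tests(coverage, ["A.bar"]))  # -> ['t1', 't3']
```

Graph-walk and slicing approaches differ mainly in how the "covered entities" and "modified entities" sets are computed, not in this final intersection step.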
Research of Keyboard Input Security
LI Peng-wei,FU Jian-ming,SHA Le-tian and DING Shuang
Computer Science. 2013, 40 (10): 10-17. 
Keyloggers are among the most serious threats to Internet users. To protect sensitive input, this study surveyed the threats that arise along the keyboard input path and the corresponding preventive measures. These threats operate at different levels (physical / ring 0 / ring 3) and use different attack approaches (query / hook / bypass). We summarized the threats faced by soft keyboards, such as peeping, screenshots, and information interception, and proposed new attacks based on mouse-behavior recording and element analysis, together with measures to defend against them. We then tested the performance of the security measures employed by existing research and applications. Finally, existing research on behavior-based keylogger detection was summarized.
Wire-length Driven Legalization Algorithm for Large-scale ASIC
GAO Wen-chao,ZHOU Qiang,QIAN Xu and CAI Yi-ci
Computer Science. 2013, 40 (10): 18-20. 
Legalization is a core task of detailed placement: after global placement of a large number of cells, it removes cell overlaps, aligns cells to sites and places them at their final positions. A wire-length-driven legalization algorithm was presented in this paper. The algorithm takes total wire length as its objective and considers site constraints and predefined blockages. Experiments on the ISPD'11 and DAC'12 benchmarks show that the algorithm achieves good wire-length results. These test cases are derived from modern industrial ASIC designs, which also confirms that the algorithm can effectively handle the legalization of large-scale ASIC layouts with varying circuit characteristics.
Fast 2-dimension Poisson Direct Solver Based on CUDA
YUE Xiao-ning,XIAO Bing-jia and LUO Zheng-ping
Computer Science. 2013, 40 (10): 21-23. 
The finite-difference approximation of the two-dimensional Poisson equation yields a block-tridiagonal system of equations. An algorithm suited to the Compute Unified Device Architecture (CUDA) was proposed. Through a discrete sine transform, the computation is divided into several completely independent parts; after these parts are solved in parallel, the final result is obtained through another discrete sine transform. Only two global synchronizations are needed during the computation. After careful optimization, a speedup of more than 10 times was obtained.
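The transform-divide-transform structure described above can be sketched on the CPU with SciPy's discrete sine transform (the GPU version parallelizes the independent per-mode solves); this assumes the standard 5-point stencil with zero Dirichlet boundaries:

```python
import numpy as np
from scipy.fft import dstn, idstn

def poisson_dirichlet(f, h):
    # Solve -Laplacian(u) = f (5-point stencil, zero Dirichlet boundary):
    # one DST decouples the modes, a pointwise divide solves each
    # independent part, and an inverse DST assembles the result.
    n = f.shape[0]
    k = np.arange(1, n + 1)
    lam = (2.0 - 2.0 * np.cos(np.pi * k / (n + 1))) / h ** 2  # stencil eigenvalues
    return idstn(dstn(f, type=1) / (lam[:, None] + lam[None, :]), type=1)

# manufactured check: apply the stencil to a known grid function, then solve
n = 31
h = 1.0 / (n + 1)
x = np.arange(1, n + 1) * h
u = np.outer(np.sin(np.pi * x), np.sin(np.pi * x))
up = np.pad(u, 1)  # zero Dirichlet border
f = (4 * up[1:-1, 1:-1] - up[:-2, 1:-1] - up[2:, 1:-1]
     - up[1:-1, :-2] - up[1:-1, 2:]) / h ** 2
```

The per-mode divisions are embarrassingly parallel, which is why only the two transforms need global synchronization.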
SLP Optimization Algorithm Using Across Basic Block Transformation and Loop Distribution
SUO Wei-yi,ZHAO Rong-cai,YAO Yuan and ZHANG Xiao-mei
Computer Science. 2013, 40 (10): 24-28. 
Existing SLP algorithms cannot handle dependence cycles or reductions in inner loops, and they generate many redundant pack/unpack and assignment statements at basic-block boundaries, which lowers vectorization efficiency. To address this, the paper proposed an SLP optimization algorithm using cross-basic-block transformation and loop distribution. Based on the control-flow graph, and on the define-use relations of array variables within basic blocks and the data dependences across basic blocks, the algorithm applies cross-basic-block transformation and loop distribution in order, and then unrolls inner loops to expose as much statement-level parallelism within a basic block as possible, so that the SLP auto-vectorizing compiler generates code with more SIMD instructions. Experimental results show that the algorithm hides more of the cost of redundant cross-basic-block operations and generates better SIMD instructions in the presence of cross-basic-block data dependences, effectively improving the speedup of vectorized programs.
Evaluating and Optimizing PCM Based GPU Memory Architecture
MU Shuai,SHAN Shu-chang,DENG Yang-dong and WANG Zhi-hua
Computer Science. 2013, 40 (10): 29-31. 
Recently, emerging non-volatile memories (NVM), exemplified by Phase Change Memory (PCM), have been considered as replacements for conventional DRAM in processors because of their large capacity and low static power. However, long write latency can cause severe performance degradation. We evaluated the feasibility of a PCM-based GPU memory architecture. Based on an analysis of the distinctive memory access behaviors captured from GPU benchmarks, a dedicated buffer was designed to relieve the pressure of frequent PCM accesses. Simulation results demonstrate the efficiency of the proposed buffer and show the great potential of PCM-based GPU architectures.
Rule-based Optimization of Reversible Toffoli Circuits
CHENG Xue-yun,GUAN Zhi-jin,ZHANG Hai-bao and DING Wei-pin
Computer Science. 2013, 40 (10): 32-38. 
Optimization of reversible circuits is one of the key problems in reversible logic synthesis. To address the high algorithmic complexity and poor scalability of optimizing reversible Toffoli circuits, this paper analyzed and summarized the relationships between adjacent Toffoli gates, proposed and proved moving and simplification rules for sub-sequences of circuits cascaded from Toffoli gates, and gave an optimization algorithm based on these rules. The algorithm examines the circuit bidirectionally according to the moving rules, searches for sub-sequences satisfying the simplification rules, performs the corresponding optimization, and repeats the process until the circuit no longer changes. The algorithm is independent of the number of input lines, needs no extra stored information, and is applicable to different Toffoli circuit synthesis methods; its complexity is O(s^3), which is superior to the O(n!·t^2·s^3) of the commonly used template optimization technique. Results of examples and experiments on all 3-bit reversible functions show that the number of gates and control bits can be reduced effectively, decreasing the cost of the reversible circuit.
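The simplest of the simplification rules above follows from the fact that a Toffoli gate is self-inverse: two identical adjacent gates cancel. A minimal sketch of that one rule (the paper's full rule set also includes moving rules that bring such pairs together), with gates represented as (control set, target) pairs:

```python
def simplify(circuit):
    # Repeatedly delete adjacent identical Toffoli gates: a Toffoli gate is
    # self-inverse, so the cascade T;T is the identity.
    changed = True
    while changed:
        changed = False
        out, i = [], 0
        while i < len(circuit):
            if i + 1 < len(circuit) and circuit[i] == circuit[i + 1]:
                i += 2          # cancel the identical pair
                changed = True
            else:
                out.append(circuit[i])
                i += 1
        circuit = out
    return circuit

a = (frozenset({0, 1}), 2)   # Toffoli: controls 0,1 -> target 2
b = (frozenset({2}), 0)      # CNOT: control 2 -> target 0
c = (frozenset({0}), 1)
print(simplify([a, b, b, a, c]))  # cascades collapse to [c]
```

Note how cancelling the inner pair exposes a new cancellable pair, which is why the scan is repeated until the circuit stops changing.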
Scheduling Method for Parallel Task of Dynamic Energy-aware of Computing Resources in Cloud Environment
CAO Jie and ZENG Guo-sun
Computer Science. 2013, 40 (10): 39-44. 
Cloud computing has recently become popular for large-scale computing and data storage because it enables the sharing of computing resources distributed all over the world. Cloud computing systems still face many challenges in achieving low-cost, efficient, safe and easy-to-use computing, and saving the energy of computing resources has become a significant research topic that urgently needs to be addressed. We proposed a sub-deadline distribution approach to satisfy the deadline requirements of parallel tasks. To save energy in an environment where the supply voltage of computing resources can be adjusted dynamically, we proposed two energy-efficient scheduling algorithms that satisfy sub-deadlines: an energy-first scheduling algorithm (Ssef) and an energy genetic scheduling algorithm (Egsa). Repeated experiments show that the two energy-efficient scheduling strategies can reduce energy consumption considerably while meeting deadline constraints.
Prediction Based Approximate Schema of Data Storage and Query Processing in Object-tracking Sensor Networks
HU Sheng-ze,XIE Yi,BAO Wei-dong,FENG Xiao-sheng and GE Bin
Computer Science. 2013, 40 (10): 45-51. 
Energy efficiency is one of the most critical issues in the design of wireless sensor networks. In object-tracking sensor networks, data storage and query processing should conserve energy by decreasing message complexity. In this paper, current algorithms for data storage and data dissemination were analyzed. To remedy the shortcomings of EASE, a Prediction-based Energy-conserving Approximate StoragE (P-EASE) scheme was proposed, which reduces the query error of EASE through prediction and employs a geography-based optimal query algorithm to select the proper storage node. Simulation experiments under semi-random-walk and random-waypoint mobility models compared overall messages, message complexity, average message complexity and query error, validating that P-EASE is more energy-conserving than EASE and also has a smaller query error.
Study of Novel Adaptive Multi-tree Anti-collision Search Algorithm
WEI Dong-xue,ZHENG Jia-li,LI Liang-liang and YAO Fu-shi
Computer Science. 2013, 40 (10): 52-55. 
This paper proposed an adaptive binary search algorithm based on IAMS. The algorithm chooses binary-tree or quad-tree search adaptively according to the number of collisions. If the number of detected collision bits is 2, the reader starts a binary-tree search, setting the highest collision position to "0" or "1". If more than two collisions are detected, the reader records the positions, sets the bit values of the highest and lowest collision positions, and then starts a quad-tree search. Finally, we simulated the three algorithms on the Matlab platform. The results and analysis show that, compared with the regressive binary search algorithm and IAMS, the average number of search commands in our algorithm is reduced by 46.7% and 31.52%, throughput is increased by 85.8% and 24.22% on average, and the amount of transmitted data is reduced by 85.3% and 82.54% on average.
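The adaptive split policy described above can be sketched as follows. This is a simplified model of the tag-reply step (tags as bit strings, collisions detected as bit positions where simultaneous replies disagree, bit 0 taken as the most significant position); the real protocol's command encoding is omitted:

```python
def collision_positions(tags):
    # bit positions at which the simultaneous replies disagree
    return [i for i in range(len(tags[0])) if len({t[i] for t in tags}) > 1]

def split(tags):
    # Adaptive fanout: exactly two collision bits -> binary split on the
    # higher-order one; more than two -> quad split on the highest and
    # lowest collision bits.
    pos = collision_positions(tags)
    if len(pos) < 2:
        return [set(tags)]
    if len(pos) == 2:
        hi = pos[0]
        return [{t for t in tags if t[hi] == b} for b in "01"]
    hi, lo = pos[0], pos[-1]
    return [{t for t in tags if t[hi] + t[lo] == p} for p in ("00", "01", "10", "11")]

print(split(["0010", "0110", "1010", "1110"]))  # two collision bits: binary split
print(split(["000", "011", "101", "110"]))      # three collision bits: quad split
```

The quad split resolves four tag subsets in one round where a plain binary tree would need two, which is where the reduction in search commands comes from.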
Three-dimensional Positioning Method for Wireless Intrusion Detection System
GUO Pei-yuan,HE Duo-duo and WU Xiao
Computer Science. 2013, 40 (10): 56-60. 
This paper addressed the application scenario of a wireless intrusion detection system (WIDS) that must locate a target host without knowing the transmit power of the target's wireless network card. By applying least-squares calculation to received-signal-strength-indication differences, based on a transformation of the shadowing model, the method eliminates the dependence on the transmit-power parameter. The feasibility of the method was verified through the design, implementation and testing of a positioning module for the WIDS. Finally, an indoor three-dimensional positioning module with room-level precision was achieved in the WIDS. Experimental data show that the precision and cost of the system are jointly optimal when a spatial point is covered by 8 detection nodes.
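The key trick above is that the unknown transmit power appears identically in every reading, so it cancels when RSSI differences are fitted. A minimal 2-D sketch under a log-distance (shadowing-mean) model, using a brute-force grid search in place of the paper's closed-form least squares; node positions and path-loss exponent are illustrative assumptions:

```python
import math

def rssi(p, node, ptx, n=2.0):
    # log-distance model: reading = ptx - 10*n*log10(distance)
    return ptx - 10 * n * math.log10(math.dist(p, node))

def locate(nodes, readings, n=2.0, step=0.25, extent=10.0):
    # Least squares over RSSI *differences* r_i - r_0: the unknown transmit
    # power is common to all readings and cancels out.
    meas = [r - readings[0] for r in readings[1:]]
    best, best_err = None, float("inf")
    steps = int(extent / step) + 1
    for xi in range(steps):
        for yi in range(steps):
            p = (xi * step, yi * step)
            try:
                base = rssi(p, nodes[0], 0.0, n)
                err = sum((rssi(p, nd, 0.0, n) - base - m) ** 2
                          for nd, m in zip(nodes[1:], meas))
            except ValueError:
                continue  # candidate coincides with a detection node
            if err < best_err:
                best, best_err = p, err
    return best

nodes = [(0, 0), (10, 0), (0, 10), (10, 10)]
readings = [rssi((3, 7), nd, -17.0) for nd in nodes]  # true position, unknown ptx
print(locate(nodes, readings))  # recovers (3.0, 7.0) without knowing -17.0
```

Each RSSI difference constrains the ratio of two distances, so with enough nodes the position is pinned down regardless of transmit power.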
Trust Computation Model of Nodes Based on Bayes Estimation in Wireless Sensor Networks
LIU Tao,XIONG Yan,HUANG Wen-chao,LU Qi-wei and GUAN Ya-wen
Computer Science. 2013, 40 (10): 61-64. 
Traditional network security policies can neither prevent attacks from, nor identify abnormal behavior of, internal nodes in a sensor network. Considering the resource-constrained nature of nodes, this paper presented a trust computation model of nodes based on Bayes estimation in wireless sensor networks (abbreviated TCM-BE). In the Bayes estimation, the direct trust, computed as the expectation of a beta-distributed reputation function over node behavior, serves as the prior information, while recommendations from neighbor nodes serve as the sample information. Simulations show that the scheme is stable and can effectively identify abnormal nodes, thereby preventing attacks from internal nodes of the network. Analysis shows that, compared with the RFSN scheme, it not only saves storage space, computing time and traffic, but also avoids malicious evaluation of node trust.
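The beta-reputation arithmetic behind such schemes is compact: with s good and f bad observed interactions, the Beta posterior has mean (a+s)/(a+b+s+f), and neighbor recommendations can be folded in as further pseudo-observations. A minimal sketch (uniform Beta(1,1) prior assumed; the paper's exact update may differ):

```python
def direct_trust(s, f, a=1.0, b=1.0):
    # Expectation of a Beta(a+s, b+f) reputation:
    # s successful and f failed interactions observed directly.
    return (a + s) / (a + b + s + f)

def bayes_trust(s, f, rec_s, rec_f, a=1.0, b=1.0):
    # Direct observations form the prior; neighbour recommendations
    # (rec_s positive / rec_f negative reports) act as sample information.
    return (a + s + rec_s) / (a + b + s + f + rec_s + rec_f)

print(direct_trust(8, 2))        # 0.75
print(bayes_trust(8, 2, 1, 3))   # negative recommendations pull trust down
```

Only two counters per neighbor are stored, which is why beta-based trust suits resource-constrained nodes.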
Digital Modulation Recognition Based on Sparse Representation and K-SVD
WANG Zhen-yu,QIN Li-long and DIAO Jun-liang
Computer Science. 2013, 40 (10): 65-67. 
Based on an analysis of pattern recognition using sparse representation, a new feature extraction method combining K-SVD and sparse representation was proposed to improve the accuracy of digital modulation recognition at low signal-to-noise ratios. Firstly, principal component analysis is applied to reduce the dimensionality of the samples. Secondly, a sparse dictionary is constructed with the K-SVD algorithm. Finally, the sparse representation of each sample is computed by l1-minimization, and features are extracted according to the distribution of the sparse coefficient values. The identification problem is then solved with an SVM classifier. Simulation results indicate that the features extracted by the new algorithm are feasible for engineering application.
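The sparse-coding step can be sketched with Orthogonal Matching Pursuit, the greedy solver K-SVD itself uses in its sparse-coding stage (the abstract's l1-minimization is an alternative route to the same sparse coefficients); the orthonormal dictionary here is a toy stand-in for a learned one:

```python
import numpy as np

def omp(D, y, k):
    # Orthogonal Matching Pursuit: greedily add the atom most correlated
    # with the residual, then re-fit y on the chosen atoms by least squares.
    x = np.zeros(D.shape[1])
    resid, cols = y.astype(float), []
    for _ in range(k):
        cols.append(int(np.argmax(np.abs(D.T @ resid))))
        coef, *_ = np.linalg.lstsq(D[:, cols], y, rcond=None)
        resid = y - D[:, cols] @ coef
    x[cols] = coef
    return x

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((12, 12)))
D = Q                              # toy orthonormal dictionary (atoms = columns)
y = 2.0 * D[:, 1] - 0.5 * D[:, 4]  # signal built from two atoms
x = omp(D, y, 2)                   # recovers the two coefficients exactly
```

The locations and magnitudes of the non-zero coefficients in `x` are what the method turns into classification features.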
Real-time Geographic Routing Algorithm in Wireless Multimedia Sensor Networks
ZHOU Kun and FU Yi-de
Computer Science. 2013, 40 (10): 68-71. 
To address the routing-void problem that arises in wireless multimedia sensor networks during geographic forwarding, this paper presented a new geographic routing algorithm, RTGR. The algorithm determines hole positions by periodically identifying which nodes lie on the boundary; the boundary nodes then construct a circular domain covering the hole. After reaching the domain, packets dynamically select a convex-hull node as an intermediate node to bypass the routing hole along the shortest path. Simulation results show that the average hop count is significantly reduced and that real-time transmission of multimedia data can be guaranteed in environments with multiple routing voids.
Study on Check-in and Related Behaviors of Location-based Social Network Users
LI Min,WANG Xiao-cong,ZHANG Jun and LIU Zheng-jie
Computer Science. 2013, 40 (10): 72-76. 
In the Web 2.0 era, location-based social networks (LBSNs) are developing rapidly along with the maturation of spatial positioning technology. The typical behaviors of LBSN users are checking in and commenting at check-in places. Exploring the rules and motivations behind check-in and related behaviors leads to a better understanding of users' needs, reveals mismatches between system design and user needs, and is therefore meaningful for the design and development of LBSN applications. GooSeeker, an online data capture tool, was used to crawl Digu, one of the most typical Chinese LBSNs. After the data were processed and analyzed, the characteristics of check-in behaviors were obtained, and comments at check-in places were also analyzed. Taking comments at McDonald's as an example, the classification tool SVMCLS was used to classify comments into different levels of sentiment inclination. Ultimately, the rules and features of check-in times and places were presented, and it was found that users tend to leave brief, positive comments at check-in places. These findings can help designers and developers better understand users and their real needs, so that they can refine their designs and provide more appropriate applications and services.
Research on Node Localization Algorithm Based on Square Area in Wireless Sensor and Actor Network
XU Huan-liang,LI Duo,REN Shou-gang,FANG He-he and WANG Hao-yun
Computer Science. 2013, 40 (10): 77-82. 
In a WSAN, actors make decisions and execute appropriate actions according to the data gathered from sensors, so accurate positioning of the sensors that perceive events is very important for implementing exact control policies. Unlike the range-free algorithms used in WSNs, a mobile localization algorithm based on square areas was proposed for WSANs. It uses mobile actors instead of anchors to locate sensors. First, the area containing the unknown node is determined through the movement of actors; this area is then shrunk iteratively, and once the required precision is reached, the centroid of the area is computed and taken as the coordinate of the unknown node. Simulation experiments show that the algorithm achieves good locating accuracy in the presence of RSSI and GPS errors, and that localizing unknown nodes with a few actors not only saves network deployment cost but also overcomes the heavy dependence of range-free WSN localization algorithms on anchor density.
Electromagnetic Field Volume Rendering Based on CUDA
ZHANG Wen-bo,CAO Yao-qin,SUN Wei and WANG Lian-feng
Computer Science. 2013, 40 (10): 83-86. 
Electromagnetic field data generation and volume rendering are computationally intensive and time-consuming on the CPU. To meet the requirement of rapid computation, a CUDA-based architecture for electromagnetic field data generation and a ray-casting rendering algorithm were designed to exploit the GPU's highly parallel capability. Considering the characteristics of electromagnetic field data, a method for judging ray-data intersection was presented that uses the 3D data's projection onto the detector along the view direction. Simulation results show that data generation on the GPU achieves a speedup of 158 over the CPU, and the ray-casting algorithm renders at a high frame rate.
Method of Real-time Update for PLC Routing Management Platform
CHEN Yan,LIU Hong-li and LIU Shu-gang
Computer Science. 2013, 40 (10): 87-91. 
This paper provided a method for remotely updating the routing management platform of power-line carrier communication. It designs a complete communication protocol and puts forward several mechanisms to ensure that updates are real-time and efficient, so that secure, reliable and low-cost remote management can be achieved. The method is generic and can guide practice on related problems. It has been tested in the laboratory and in the field, where it achieved real-time updates of the routing management platform, saved considerable manpower and material resources, and greatly reduced the maintenance cost of L-PLC routing software. It can be applied to automatic meter reading systems over low-voltage power-line carrier communication, especially systems consisting of a master station, concentrators, GPRS modules and a routing management platform.
Collaborative Position Privacy Protection Method Based on Game Theory
CHEN Yu-feng,LIU Xue-jun and LI Bin
Computer Science. 2013, 40 (10): 92-97. 
Location privacy protection is attracting increasing attention and research. Location privacy preservation based on mutual cooperation among users, without a central server, is currently a focus of study. To better protect users' location privacy in realistic untrusted environments, this paper provided a location privacy protection method, Privacy_l, based on the idea of user-collaboration games. An anonymous group is formed through user collaboration, and the density center of the group serves as the anchor that replaces the true position when initiating queries. By computing the anchor through a secure-sum protocol, the method eliminates situations in which users fail to cooperate in good faith in realistic untrusted circumstances. Meanwhile, according to users' different location-privacy requirements, different degrees of anonymity protection are achieved by setting different privacy-protection parameter levels. In addition, an improved incremental query method is used to improve the efficiency of nearest-neighbor queries. Simulation results show that the method performs well and can be applied in practice.
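The secure-sum step used to compute the anchor can be sketched with additive secret sharing: each user splits its coordinate into random shares, so the group total (and hence the density center) is computable while no single share reveals any user's position. A minimal sketch, not the paper's exact protocol:

```python
import random

def shares(value, n, modulus=1 << 32):
    # Split value into n additive shares modulo `modulus`;
    # any n-1 of them together look uniformly random.
    parts = [random.randrange(modulus) for _ in range(n - 1)]
    parts.append((value - sum(parts)) % modulus)
    return parts

def secure_sum(values, modulus=1 << 32):
    # Each user sends one share to every user; each user publishes only the
    # sum of the shares it received, and those published sums add to the total.
    n = len(values)
    share_matrix = [shares(v, n, modulus) for v in values]
    received = [sum(share_matrix[i][j] for i in range(n)) % modulus
                for j in range(n)]
    return sum(received) % modulus

print(secure_sum([12, 30, 7]))  # 49, with no coordinate ever sent in the clear
```

Running this once per coordinate and dividing by the group size yields the density-center anchor without exposing any member's true position.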
Research of Routing Protocol Based on Path Accumulation for Ad-hoc Networks
HUANG Ting-hui,LU Xiang-yuan,CUI Geng-shen and YANG Min
Computer Science. 2013, 40 (10): 98-103. 
Aiming at the problems of link breakage caused by node movement and the flooding of RREQ (Route Request) packets, a routing protocol based on path accumulation was proposed. With a path-accumulation mechanism and a link-disjoint multi-path algorithm, the protocol enhances a node's ability to obtain routes and increases the average number of valid routes residing at a node. Consequently, an RREQ is more likely to be answered, and the broadcast range and forwarding count of RREQs are restricted. Based on the exponential-distribution characteristics of the probability density of path maintenance time, the protocol chooses a routing strategy that preferentially uses the latest path while taking path length into account, extending the communication path maintenance time in a statistical sense. NS2 simulations show that, compared with AODV, AOMDV and AODV_PA, the proposed protocol achieves a higher packet delivery ratio, lower routing overhead and lower end-to-end packet delay.
Zone-based Controllable Routing Model
CAO Sheng-lin and MA Xu
Computer Science. 2013, 40 (10): 104-107. 
As the de facto inter-domain routing protocol, BGP is highly sensitive and slow to converge. To improve the controllability of routing, a zone-based controllable routing model (ZCR) was proposed in this paper. In ZCR, the ASes to be controlled can be organized into a zone, where a link-vector routing protocol is employed inside the zone and a path-vector protocol is used between zones. ZCR supports routing policies, improves convergence speed and reduces routing load; moreover, it can be deployed incrementally in the network. Experimental results show that ZCR is effective.
Network Software Test Data Generation Based on Decomposition and Reconstruction
LI Cheng,WEI Qiang,PENG Jian-shan and WANG Qing-xian
Computer Science. 2013, 40 (10): 108-113. 
Protocol fuzz testing can effectively detect vulnerabilities in network software, but in the face of encryption and checksum mechanisms, existing approaches can hardly generate valid test data. A test case generation method based on "decomposition and reconstruction" was proposed. By means of detection techniques for check points and decrypted memory, valid decoded test data is decomposed at the testing side. A memory-backtracking algorithm was also proposed, which identifies memory at the tested side that is not a duplicate of any other memory region and, on that basis, reconstructs the encoded test packet. A case study and comparison tests demonstrate that the method can effectively generate test cases.
Wireless Ad-hoc Network Group Key Management Scheme Based on Paillier Homomorphic Encryption
HE Wen-cai,DU Min,LIU Pei-he,CHEN Zhi-wei and ZHENG Zhao
Computer Science. 2013, 40 (10): 114-118. 
Based on the Paillier homomorphic cryptosystem, we presented a safe and effective homomorphic key management scheme. Resisting collusion attacks and providing forward and backward secrecy, the scheme is suitable for the group-oriented, rapidly changing topology of wireless mobile ad-hoc networks. Homomorphic operations on ciphertexts improve the efficiency of renewing the group key when external nodes join the group or internal members leave it. The security and correctness of the scheme were discussed in this paper. Compared with other approaches, the new scheme requires fewer interactions and has smaller communication and memory costs and stronger security.
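The additive homomorphism that makes cheap group-key renewal possible is the core Paillier property: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so an encrypted key can be shifted without decrypting it. A textbook toy sketch (tiny hard-coded primes for illustration only; not the paper's full scheme):

```python
import math, random

def keygen():
    p, q = 293, 433              # toy primes; real use needs >= 1024-bit primes
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)         # with g = n + 1, L(g^lam mod n^2) = lam mod n
    return (n, n + 1), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    # L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n
    return (pow(c, lam, n * n) - 1) // n * mu % n

pk, sk = keygen()
# homomorphic key shift: E(key) * E(delta) decrypts to key + delta
print(decrypt(pk, sk, encrypt(pk, 41) * encrypt(pk, 99) % (pk[0] ** 2)))
```

In the group setting this lets members update a shared key by operating on ciphertexts alone, which is what keeps join/leave rekeying cheap.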
Vulnerability Finding Using Symbolic Execution on Binary Programs
NIU Wei-na,DING Xue-feng,LIU Zhi and ZHANG Xiao-song
Computer Science. 2013, 40 (10): 119-121. 
Software vulnerabilities are a main source of computer security problems, and the key technique for finding them is fuzzing (fuzz testing), which randomly mutates the input; however, fuzzing cannot construct test cases effectively or eliminate redundancy among them. To overcome these shortcomings of traditional fuzzing, to generate test inputs effectively and to avoid analyzing the input format, we designed and implemented a vulnerability finding system (called SEVE) based on symbolic execution of binary programs. SEVE makes the inputs symbolic, uses dynamic instrumentation tools to establish the propagation relations of symbolic variables and to collect path constraints at branch statements, and then solves these path constraints to obtain test cases. Experimental results on mp3 and pdf software show that the system improves the efficiency and the degree of automation of vulnerability discovery.
Improved Algorithm for Large-scale CTL Formulae Verification
XI Qi,WANG Qing-xian,ZENG Yong-jun and QIN Yan-feng
Computer Science. 2013, 40 (10): 122-126. 
The labelling algorithm is a standard algorithm for model checking CTL formulae. This paper presented an algorithm that improves the efficiency of verifying large-scale sets of CTL formulae. It identifies common subformulae in the formula set and binds program states to the labels of these common subformulae, so that they are not checked repeatedly. Experimental results illustrate the efficiency advantages of the new algorithm.
Method of Target Threat Assessment Based on Cloudy Bayesian Network
ZHANG Yin-yan,LI Bi-cheng and CUI Jia-wei
Computer Science. 2013, 40 (10): 127-131. 
A cloudy Bayesian network was proposed by combining the cloud model with Bayesian networks, and a threat assessment model was built on top of it. Firstly, the Bayesian network structure was designed according to the operational background, and continuous observation nodes were transformed into cloud models. Secondly, observed variable values were input into the cloudy Bayesian network to reason out the probability that a target belongs to each threat level. Finally, repeated reasoning was performed to eliminate the influence of target-information uncertainty on the overall threat grade, and the final threat grade was obtained with a probability-composition formula. The validity of the method was checked by simulating aerial-target threat assessment against the background of joint air defense operations.
Similarity-based Trust Recommended Model
DONG Xiao-hua and ZHOU Yan-hui
Computer Science. 2013, 40 (10): 132-134. 
It is difficult to distinguish cheating and other malicious behaviors in trust recommendation. A similarity-based trust recommendation model was proposed. According to social psychology research, two users are more likely to trust each other when their behaviors are more similar, so similarity is used as the trust recommendation weight. Theoretical analysis and simulation results show that the model can effectively avoid malicious recommendation behavior in trust recommendation. Compared with existing models, it reduces the calculation error of trust remarkably and greatly enhances the precision of trust evaluation.
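Using similarity as the recommendation weight can be sketched as a similarity-weighted average: recommenders whose behavior vectors diverge from the evaluator's contribute little, which damps malicious recommendations. A minimal sketch (cosine similarity is one common choice; the paper's exact similarity measure may differ):

```python
import math

def cosine(u, v):
    # behavioural similarity between two users' interaction vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def recommended_trust(recs):
    # recs: (similarity_to_recommender, recommended_trust_value) pairs;
    # dissimilar (potentially malicious) recommenders get little weight.
    total = sum(w for w, _ in recs)
    return sum(w * t for w, t in recs) / total

# honest recommender (similarity 0.9) says 0.8; outlier (0.1) says 0.1
print(recommended_trust([(0.9, 0.8), (0.1, 0.1)]))  # close to the honest value
```

A badmouthing recommender would need behavior nearly identical to the evaluator's to gain weight, which is exactly what makes the attack costly.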
Improved Fast Algorithm of Scalar Multiplication for Fix Base Point
WANG Yu-xi,ZHANG Chuan-rong and ZHANG Bing-hong
Computer Science. 2013, 40 (10): 135-138. 
For scalar multiplication with a fixed base point, the LLECC algorithm performs well in efficiency, but its huge precomputation and storage requirements restrict its application. Through a new arrangement of the coefficient matrix, in which the scalar k is recoded in window-based non-adjacent form, the storage can be reduced thanks to the sparseness of the encoding. When the scalar is 160 bits long and the window width is 4 bits, the improved algorithm reduces computational complexity by 12.4% and storage cost by 53.3% compared with the original LLECC algorithm.
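The sparseness that the improvement relies on comes from width-w NAF recoding: every non-zero digit is odd with |d| < 2^(w-1) and is followed by at least w-1 zeros, so most positions need no point addition. A standard recoding sketch (w = 4 matching the abstract's window width):

```python
def wnaf(k, w=4):
    # Width-w non-adjacent form of a positive integer k: non-zero digits
    # are odd with |d| < 2**(w-1), and any two non-zero digits are at
    # least w positions apart, which is the sparseness exploited above.
    digits = []
    while k > 0:
        if k & 1:
            d = k % (1 << w)
            if d >= 1 << (w - 1):
                d -= 1 << w     # pick the signed residue of least magnitude
            k -= d
        else:
            d = 0
        digits.append(d)        # least-significant digit first
        k >>= 1
    return digits

print(wnaf(1234567891))  # sparse signed-digit recoding of the scalar
```

For a 160-bit scalar this leaves roughly one non-zero digit per w+1 positions, which is what cuts both the additions and the precomputed-point storage.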
Improved Certificateless Signcryption Scheme without Pairing
ZHOU Cai-xue and WANG Fei-peng
Computer Science. 2013, 40 (10): 139-143. 
A certificateless signcryption scheme without pairing was analyzed, and this paper showed that the scheme achieves neither confidentiality nor unforgeability. The mistakes in its security proofs were pointed out, and an improved scheme was proposed. The improved scheme was proved confidential under the computational Diffie-Hellman (CDH) assumption and existentially unforgeable under the discrete logarithm (DL) assumption in the random oracle model (ROM). Performance analysis shows that the improved scheme is highly efficient.
Application of Network Coding in Wiretap Network
CAO Zhang-hua,JI Xiao-dong and LIU Min
Computer Science. 2013, 40 (10): 144-147. 
Focusing on secure communication over a wiretap network in which a wiretapper can eavesdrop on a limited number of links, we proposed a secure communication scheme based on linear network coding. The key idea of our scheme is to combine the ability of network coding to let intermediate nodes mix information from different data flows with a one-time pad. Furthermore, the presented scheme achieves secure communication without employing a secret channel, and the utilization of network capacity reaches (n-1)/n. Moreover, we showed that if the finite field is large enough, the probability of achieving secure communication with random linear network coding tends to 1 in our scheme.
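The one-time-pad ingredient can be sketched over GF(2): the source injects a fresh random pad on one flow and the pad-masked message on another, so a wiretapper on any single link sees only a uniformly random string, while XOR-mixing at intermediate nodes (the network-coding part, simplified away here) still lets the sink recover the message:

```python
import secrets

def encode(message: bytes):
    # Send a fresh one-time pad on one flow and pad XOR message on another;
    # either flow alone is uniformly random, so a wiretapper on one link
    # learns nothing about the message.
    pad = secrets.token_bytes(len(message))
    masked = bytes(a ^ b for a, b in zip(pad, message))
    return pad, masked

def decode(pad, masked):
    # The sink, receiving both flows, XORs them to recover the message.
    return bytes(a ^ b for a, b in zip(pad, masked))

pad, masked = encode(b"attack at dawn")
print(decode(pad, masked))
```

One of the n flows carries the pad rather than payload, which is where the (n-1)/n capacity utilization in the abstract comes from.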
Verification Method for Concurrent Programs Properties Based on Separation Logic
WAN Liang,SHI Wen-chang and FENG Hui
Computer Science. 2013, 40 (10): 148-154. 
With the popularity of multi-core processors, multi-threading and parallel execution, there is an increasing demand for formal verification of parallel programs. The uncertainty of execution flows in parallel programs makes it difficult to determine the relation between verification contents and targets, and verifying the parallel programs directly leads to large-scale verification. To this end, we proposed a new verification method based on separation logic. Exploiting the fact that the semantics of separation logic's programming language is both interpretive and axiomatic, our method transforms the property formulae to be verified into logical composition expressions, then reforms and simplifies them. Separation logic's axiom system is then used to verify the expressions and to compute the value of the property formulae from verified assertions. Case studies further illustrate that the proposed method is effective and can reduce verification scale.
Improved RBAC Authorization Model Based on Time Partition
WANG Xiao-lin,SHI You-qun,TANG Cheng and XU Kang
Computer Science. 2013, 40 (10): 155-158. 
Abstract PDF(313KB) ( 713 )   
References | RelatedCitation | Metrics
The traditional RBAC model offers little flexibility in assigning permissions to users. This paper proposed an improved RBAC authorization model based on time partition. The whole system process is divided into several time partitions, and the permissions associated with each role are distributed across these independent partitions. The set of stages holding the permissions of each time partition, combined with the basic RBAC model, constructs the improved model. Once the stages are determined, the permissions of roles and users can be configured by the administrator. The proposed RBAC model has been applied stably in a public selection system and achieves good performance.
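A minimal sketch of the time-partitioned idea (partition names, roles and permissions below are hypothetical, not taken from the paper): role permissions are assigned per time partition, and an access check consults only the partition containing the current time.

```python
# Hypothetical example: a public selection system with two phases.
partitions = {"review": range(0, 10), "voting": range(10, 20)}
role_perms = {
    ("expert", "review"): {"read_application", "score"},
    ("expert", "voting"): {"vote"},
}

def check(role: str, perm: str, t: int) -> bool:
    # Find the partition containing time t, then look up that
    # partition's role-permission assignment only.
    for name, span in partitions.items():
        if t in span:
            return perm in role_perms.get((role, name), set())
    return False

assert check("expert", "score", 5)        # allowed during review
assert not check("expert", "score", 15)   # revoked once voting starts
```

The same role thus carries different permissions in different partitions, which is the flexibility the basic RBAC model lacks.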
Research on Software Defect Association Analysis and Software Defect Removal
LI Peng and ZHAO Feng-yu
Computer Science. 2013, 40 (10): 159-161. 
Abstract PDF(353KB) ( 925 )   
References | RelatedCitation | Metrics
Software defects propagate during the software development process. This propagation makes defects related, and related defects need to be located and removed. Software defect association analysis is of great significance for defect removal, quality assurance and process improvement. In this paper, the propagation process of software defects was analyzed, and the relationship between object association and object defect association was established based on the object-oriented analysis and design model. Tree association rules and characteristic-similarity association rules were presented based on the propagation process of software defects, and the procedures to create these rules and the steps to build the tree association and the characteristic-similarity association were given.
Dynamic Resource Allocation Method of Reliability Testing for Multi-module Software with Markov Transfer of Control
QI Bei and QIN Zhi-dong
Computer Science. 2013, 40 (10): 162-165. 
Abstract PDF(388KB) ( 678 )   
References | RelatedCitation | Metrics
Task modules are executed unevenly. To cut the test cost, a dynamic allocation method for module-level reliability testing was proposed for multi-module software with Markov control transfer, considering the specific situation of reliability growth testing for module-based software. Compared with static resource allocation methods, the proposed method makes optimal use of the testing resources and can reduce the total test cost while meeting the reliability target.
Model Checking of Temporal Description Logic ALC-LTL Based on Label Büchi Automata
ZHU Chuang-ying,CHANG Liang,XU Zhou-bo and LI Feng-ying
Computer Science. 2013, 40 (10): 166-171. 
Abstract PDF(524KB) ( 606 )   
References | RelatedCitation | Metrics
Temporal description logics introduce the description abilities of description logics into propositional temporal logic and are suitable for describing the temporal properties of systems in the Semantic Web environment. To verify such temporal properties efficiently, the model checking problem of the temporal description logic ALC-LTL was investigated in this paper. On the one hand, ALC-LTL formulas are used to express the specification to be checked; on the other hand, the description logic ALC is used to model the system under investigation. For the resulting model checking problem, an algorithm based on labeled Büchi automata was presented. The algorithm consists of three steps. First, the negation of the specification and the model of the system are each constructed as a labeled Büchi automaton. Second, a product automaton of the two labeled Büchi automata is constructed, with the reasoning mechanisms of ALC embedded in this construction. Finally, the emptiness problem for the product automaton is decided. Compared with model checking of propositional temporal logic LTL, model checking of ALC-LTL provides the representation ability and reasoning mechanisms of description logics and is therefore suitable for the Semantic Web environment.
Finding XML Pseudo-relevance Document Based on Search Results Clustering
ZHONG Min-juan,WAN Chang-xuan,LIU De-xi and LIAO Shu-mei
Computer Science. 2013, 40 (10): 172-177. 
Abstract PDF(535KB) ( 550 )   
References | RelatedCitation | Metrics
A recent study shows that traditional pseudo-relevance feedback may cause topic drift. To avoid topic drift effectively, it is essential to identify relevant documents and to form a pseudo-relevant document set for the user's query. In this paper, a method based on clustering XML search results was proposed to find good feedback documents. First, a cluster-label extraction method based on equalized weights was introduced, fully considering the content and structure features of XML documents. Second, a two-stage ranking strategy was presented, consisting of a candidate cluster ranking model and a document ranking model. Finally, experimental data show that, compared with the original retrieval method, the ranking models obtain better performance and find more relevant XML documents.
Purchasing Order Integration Technology for Auto Parts Industry Chain Cloud Services Platform
LI Bin-yong,SUN Lin-fu,TIAN Ran and WANG Shi-bo
Computer Science. 2013, 40 (10): 178-182. 
Abstract PDF(1063KB) ( 764 )   
References | RelatedCitation | Metrics
Aiming at the order integration problem of supplier enterprise groups in the auto parts industry chain, this paper put forward a technical scheme for order integration oriented to the cloud services platform. A multi-level mapping tree model was built to integrate the multiple mappings among orders. A flexible, dynamically configurable interface was used to construct the mapping rules for the auto parts industry chain cloud services platform, converting complex mappings into direct mappings. Through the order mapping tree hierarchy and a mapping integration algorithm, large quantities of orders from different vehicle plants are integrated into the heterogeneous component production plan sets required by the supplier enterprise group. Results show that the multi-level order mapping tree model can support extended order-integration applications of the auto parts industry chain cloud services platform.
Research on Fuzzy Logic in Database Information Retrieval
ZHANG Jun,GAO Yan and YU Su-hua
Computer Science. 2013, 40 (10): 183-189. 
Abstract PDF(655KB) ( 770 )   
References | RelatedCitation | Metrics
Applications of fuzzy logic in traditional information retrieval have been widely studied in recent years, and more and more attention has been paid to the intersection of fuzzy logic and database information retrieval. Using membership functions to express the uncertainty and imprecision of semantics in databases, establishing corresponding fuzzy indexes, and employing fuzzy inference mechanisms and object-level information retrieval methods can improve the effectiveness of database information retrieval to a great extent. First, fuzzy logic in traditional information retrieval was introduced and its advantages and disadvantages were analysed briefly. Second, fuzzy logic in database information retrieval was presented in detail. Finally, fuzzy-logic-based object-level information retrieval over relational databases was further discussed.
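A minimal sketch of the membership-function idea (the attribute, term and breakpoints are illustrative, not from the paper): a trapezoidal membership function turns a crisp attribute such as "price" into a degree of matching the fuzzy term "cheap", and rows are ranked by that degree instead of filtered by a hard predicate.

```python
def cheap(price: float) -> float:
    # Trapezoidal membership: fully "cheap" below 100,
    # not "cheap" above 300, linear fall-off in between.
    if price <= 100:
        return 1.0
    if price >= 300:
        return 0.0
    return (300 - price) / 200

rows = [("A", 80), ("B", 150), ("C", 280)]
ranked = sorted(((name, cheap(p)) for name, p in rows),
                key=lambda r: r[1], reverse=True)
assert ranked[0][0] == "A" and ranked[-1][0] == "C"
```

A fuzzy index would precompute such degrees so queries over the fuzzy term need not rescan the crisp column.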
MR-GSpar:A Distributed Large Graph Sparsification Algorithm Based on MapReduce
CHEN De-hua,ZHOU Meng,SUN Yan-qing and ZHENG Liang-liang
Computer Science. 2013, 40 (10): 190-193. 
Abstract PDF(441KB) ( 709 )   
References | RelatedCitation | Metrics
As an important data pre-processing operation, graph sparsification has attracted wide attention in the database area. Graph data is becoming both common and large in scale, so this paper proposed an efficient parallel graph sparsification algorithm, the MR-GSpar algorithm. MR-GSpar reforms the traditional MinHash algorithm into a parallel, distributed algorithm on the MapReduce framework, which can achieve efficient sparsification of large-scale graph data in large machine cluster environments. Experiments on real datasets show that the algorithm is feasible and effective.
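A sketch of the MinHash idea that MR-GSpar parallelizes (the paper's actual MapReduce job layout is not described here and is assumed): each node's neighbor set is reduced to a small signature, and the fraction of agreeing signature slots estimates neighborhood similarity, which sparsification can use to decide which edges to keep.

```python
import hashlib

def minhash(neighbors, num_hashes=16):
    # One min-hash value per seeded hash function.
    sig = []
    for seed in range(num_hashes):
        sig.append(min(
            int(hashlib.md5(f"{seed}:{v}".encode()).hexdigest(), 16)
            for v in neighbors))
    return tuple(sig)

def similarity(sig_a, sig_b):
    # Fraction of agreeing slots estimates Jaccard similarity.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

s1 = minhash({"a", "b", "c", "d"})
s2 = minhash({"a", "b", "c", "e"})   # heavy overlap with s1
s3 = minhash({"x", "y"})             # disjoint from s1
assert similarity(s1, s2) > similarity(s1, s3)
```

In a MapReduce setting, mappers emit (node, neighbor) pairs and reducers compute each node's signature, so no single machine needs the whole graph.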
Research and Realization of Data Storage Model for Multi-tenant under SaaS Mode
ZHOU Wen-qiong,LI Qing-zhong,FAN Lu-qiao and ZHENG Shu-zhao
Computer Science. 2013, 40 (10): 194-197. 
Abstract PDF(341KB) ( 1186 )   
References | RelatedCitation | Metrics
SaaS mode introduces the multi-tenant environment, which raises two concerns for database storage: isolation of tenants' data and expandability of tenants' data. Based on research into data storage and access modes for multi-tenant environments, this paper proposed a model of "shared schema for data storage and independent schema for data access on a shared database". This model separates the schemas for data storage and data access in SaaS applications, which resolves the data isolation problem. The paper also proposed an XML-based model for multi-tenant data expansion, which resolves the expandability problem. Furthermore, the paper described detailed solutions for the two models, which are proven practical and flexible.
Research on Method of Rules Mining Based on Attributes Hierarchy
YANG Wei,MIAO Duo-qian and WEI Zhi-hua
Computer Science. 2013, 40 (10): 198-202. 
Abstract PDF(390KB) ( 525 )   
References | RelatedCitation | Metrics
Attribute hierarchy trees were introduced and the concept of the attribute hierarchy decision table was defined. An algorithm was proposed for rule mining in the attribute hierarchy decision table, and numerical examples were employed to substantiate it. Experiments indicate that the number of objects in an attribute hierarchy decision table is smaller than in the original decision table, and attribute reduction correspondingly takes less time.
Character Analysis of Standardization Methods of Decision Matrix with Intervals
HU Ming-li,FAN Cheng-xian and SHI Kai-quan
Computer Science. 2013, 40 (10): 203-207. 
Abstract PDF(347KB) ( 936 )   
References | RelatedCitation | Metrics
Standardization of decision matrices with interval numbers is the basis of interval multi-attribute decision making: it makes attribute values of different types comparable and computable. The advantages and disadvantages of existing standardization formulas were analyzed, and several properties that a standardization function should satisfy were presented. To overcome the limits of existing methods, a new formula based on the range-transformation idea was given. After standardization, the attribute values lie in the range [0,1], and the order relation of different projects under the same attribute remains unchanged. It is proven that the new formula satisfies monotonicity, translation invariance, difference-ratio invariance, scaling invariance and interval stability. Finally, an example was given to verify the feasibility and effectiveness of the new method.
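A hedged sketch of range-transformation standardization for a benefit attribute (the paper's exact formula may differ; this shows the general idea): each interval [a, b] in a column is mapped into [0, 1] using the column-wide minimum and maximum endpoints, which is an increasing affine map and therefore preserves the order relation among projects.

```python
def standardize(column):
    # column: list of (lower, upper) interval endpoints for one attribute.
    lo = min(a for a, _ in column)
    hi = max(b for _, b in column)
    span = hi - lo
    return [((a - lo) / span, (b - lo) / span) for a, b in column]

col = [(2, 4), (3, 6), (5, 10)]
std = standardize(col)
assert std[0] == (0.0, 0.25)      # (2-2)/8, (4-2)/8
assert std[-1][1] == 1.0          # the global maximum maps to 1
```

Cost attributes would use the mirrored map (hi - x) / span so that smaller raw values score higher; that variant is omitted here.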
Landmark Guided Segmental Speech Decoding Algorithm for Continuous Mandarin Speech Recognition
CHAO Hao,YANG Zhan-lei and LIU Wen-ju
Computer Science. 2013, 40 (10): 208-212. 
Abstract PDF(722KB) ( 653 )   
References | RelatedCitation | Metrics
A framework was proposed that incorporates landmarks into a segment-based Mandarin speech recognition system. In this method, landmarks provide boundary information and phonetic class information, which are used to direct the decoding process. To prove the validity of the method, two kinds of reliably detectable landmarks were used to direct the decoding of a segment model (SM) based Mandarin LVCSR system. Experiments conducted on the "863-test" set show that about 12.92% of decoding time can be saved without obviously decreasing the recognition accuracy, demonstrating the potential of the method.
Memristor-based Successive Learning Chaotic Neural Network
ZHANG Yi,DUAN Shu-kai,WANG Li-dan and HU Xiao-fang
Computer Science. 2013, 40 (10): 213-217. 
Abstract PDF(949KB) ( 782 )   
References | RelatedCitation | Metrics
With their unique memory ability and continuously variable conductance states, memristors have promising prospects in artificial intelligence and artificial neural networks. This paper derived a charge-controlled memristor model in detail. Combining the nanometer-scale memristor with a chaotic neural network, a novel memristor-based successive-learning chaotic neural network model was proposed. The numerous feedback and iterative operations in the network, that is, the spatio-temporal summation of external inputs to neurons and the interactions between neurons, can be achieved by taking advantage of memristors. The proposed model uses the difference in the responses to input patterns to distinguish unknown patterns from the stored known patterns; when an input pattern is judged unknown, it is memorized in the network. The effectiveness was verified through simulation experiments. Given the memristor's nano-scale size and automatic memory capacity, the scheme is expected to greatly simplify the structure of chaotic neural networks.
Judgment of NP-NP Equivalence for 3-bit Reversible Logic Functions via Fixed Polarity Reed-Muller Forms
LUO Qing-bin,YANG Guo-wu,SHAO Yuan-hua and FAN Fu-you
Computer Science. 2013, 40 (10): 218-220. 
Abstract PDF(351KB) ( 788 )   
References | RelatedCitation | Metrics
In reversible logic synthesis, templates can be reused through classification. Extending the definition of NP-N equivalence for Boolean functions to reversible logic functions yields the definition of NP-NP equivalence for reversible logic functions. All 3-variable Boolean functions with exactly four minterms are divided into five classes by their cofactor weight vectors, and the fixed-polarity Reed-Muller form of each class is calculated. By comparing the sorted cofactor weight vectors, whether two reversible logic functions are NP-NP equivalent can be judged preliminarily. When the sorted cofactor weight vectors are identical, the reversible logic functions are NP-NP equivalent if and only if every pair of corresponding outputs, each a Boolean function, has the same variable mapping; otherwise, the reversible logic functions are not NP-NP equivalent. Thus, whether two 3-bit reversible logic functions are NP-NP equivalent can be judged by this method.
Semantic Modeling for Story Based Shallow Text Understanding and Event Frame
XIE Qiu-mei,GAO Chun-ming and WANG Xiao-lan
Computer Science. 2013, 40 (10): 221-225. 
Abstract PDF(564KB) ( 715 )   
References | RelatedCitation | Metrics
For the semantic understanding task of story text, this paper used open information extraction to capture N-ary facts from stories and then described N-ary fact frames as an event semantic model. Our method proposes extraction rules for frame elements based on dependency parsing and regular expressions, together with an event semantic model, the SOSDL ontology, for story text with representations of qualitative temporal and spatial relations. Our experiments indicate that this approach captures more facts per sentence with greater completeness, and that SOSDL can effectively model the semantic elements of N-ary fact frames and their relationships.
Double k-nearest Neighbor Clustering Algorithm for Heterogeneous Data Streams
HUANG De-cai,SHEN Xian-qiao and LU Yi-hong
Computer Science. 2013, 40 (10): 226-230. 
Abstract PDF(465KB) ( 647 )   
References | RelatedCitation | Metrics
On the one hand, most existing data stream clustering algorithms can handle data with numerical attributes but cannot cope with data containing both numeric and categorical attributes. On the other hand, there is also much room for heterogeneous data stream algorithms to improve data standardization and clustering. Therefore, a double k-nearest-neighbor clustering algorithm for heterogeneous data streams was proposed. The algorithm adopts CluStream's online/offline framework and a three-step clustering procedure. First, it uses double k-nearest neighbors and an improved dimension distance to form micro-clusters. Second, it uses a dynamic data standardization method and a mean-based cosine model to form initial macro-clusters. Third, it uses the mean-based cosine model and prior clusters to optimize the macro-clustering. Experimental results demonstrate that the proposed method improves clustering accuracy and scalability.
Modeling and Analysis of Railway Crossing Based on Differential Dynamic Logic
QIAN Lei and YU Wen-sheng
Computer Science. 2013, 40 (10): 231-234. 
Abstract PDF(599KB) ( 648 )   
References | RelatedCitation | Metrics
We presented an analysis of a railway crossing based on differential dynamic logic. When a train sends an approaching signal and enters the crossing, the time limit for this period helps identify safe ranges for train speed control. We also illustrated the modeling of this hybrid system using hybrid programs and differential dynamic logic. Using the theorem prover KeYmaera, we formally verified safety properties of the railway crossing concerning the train and obtained the correct speed control condition.
Improved NSGA-II Algorithm with Differential Evolution Local Search
XIE Cheng-wang,LI Kai and LIAO Guo-yong
Computer Science. 2013, 40 (10): 235-238. 
Abstract PDF(393KB) ( 666 )   
References | RelatedCitation | Metrics
With its Pareto-dominance selection mode and its strategy of using a solution-density estimation operator to select winning solutions, the NSGA-II algorithm has become a model of modern multi-objective evolutionary algorithms. However, its mechanism of maintaining population distribution by computing the crowding distance of individual solutions has certain defects. In view of this, this paper proposed an improved algorithm that combines a differential-evolution local search with NSGA-II. The new algorithm uses the directional guidance of the differential evolution mutation operator, takes the difference vector, and combines it with NSGA-II to improve the distribution of the solution population. Simulation results show that, compared with NSGA-II, the new algorithm obviously improves the uniformity and spread of the solution set, while its time complexity is the same as that of the classic NSGA-II algorithm.
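The directional guidance mentioned above comes from the classic DE/rand/1 mutation, sketched below under the assumption that this is the operator variant used (the abstract does not name one): the scaled difference of two population members perturbs a third, so trial points move along directions the population itself suggests.

```python
import random

def de_mutate(pop, F=0.5):
    # DE/rand/1: pick three distinct members; perturb the first by the
    # scaled difference of the other two. F is the scale factor.
    r1, r2, r3 = random.sample(range(len(pop)), 3)
    return [x1 + F * (x2 - x3)
            for x1, x2, x3 in zip(pop[r1], pop[r2], pop[r3])]

random.seed(0)
population = [[1.0, 2.0], [2.0, 0.0], [0.0, 1.0], [3.0, 3.0]]
trial = de_mutate(population)
assert len(trial) == 2            # same dimensionality as the parents
```

In the hybrid algorithm, such trial vectors refine the nondominated set found by NSGA-II's usual crossover and mutation before the next environmental selection.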
Novel Smooth Regularization Based Semi-supervised SVM Approach and its Application in Credit Evaluation
XUE Fei,LU Li-min and WANG Lei
Computer Science. 2013, 40 (10): 239-242. 
Abstract PDF(365KB) ( 642 )   
References | RelatedCitation | Metrics
This paper proposed a novel smooth-regularization-based semi-supervised support vector machine approach and applied it to build credit evaluation models for small and medium-sized enterprises. The approach computes a manifold-related smooth regularization term over both the few labeled samples and the plentiful unlabeled samples, and incorporates it into the learning process of maximal-margin classifiers. It then adopts a progressive method to acquire semi-labeled samples iteratively, so that the generalization performance of the support vector machine improves gradually. Experiments on a real-world dataset show that the testing accuracy of the proposed approach outperforms several popular ones, and the approach is well suited to evaluating the credit grades of small and medium-sized enterprises.
Optimized Semantic Conditioned Fuzzy C-Means
LI Hong-bo,LI Ren-pu,ZHANG Zhi-wang and ZHOU Chun-jie
Computer Science. 2013, 40 (10): 243-247. 
Abstract PDF(420KB) ( 555 )   
References | RelatedCitation | Metrics
On the basis of Conditioned Fuzzy C-Means, it was proposed that the external condition be determined by a computed user semantic, with the user semantic computed via the membership function of Axiomatic Fuzzy Sets. A new concept, the Adjusted Factor, was introduced to balance the impact on the clustering results of the semantics-based membership and the Euclidean-distance-based membership, yielding a unified framework covering both Fuzzy C-Means and Conditioned Fuzzy C-Means. In addition, the Semantic Strength Expectation was brought forward to assess clustering quality. Furthermore, to raise the clustering accuracy, Semantic Conditioned Fuzzy C-Means was run after the raw data was transformed into spectral data. Finally, based on multiple assessment indexes, FCM, Semantic Conditioned Fuzzy C-Means and its spectral optimization were tested on the Iris data set. Experimental results show that Semantic Conditioned Fuzzy C-Means can find the cluster closest to the user semantic.
Short-term Traffic Flow Forecasting Model Combining SVM and Kalman Filtering
ZHU Zheng-yu,LIU Lin and CUI Ming
Computer Science. 2013, 40 (10): 248-251. 
Abstract PDF(424KB) ( 1139 )   
References | RelatedCitation | Metrics
Aiming at short-term traffic flow forecasting, a prediction model combining Kalman filtering and support vector machines was proposed. In each prediction period the model intelligently selects the appropriate forecasting method by the criteria of the error sum of squares and the cosine of the vector angle, comprehensively utilizing the stability of the SVM and the real-time nature of the Kalman filter to take advantage of both models. Experiments show that the model's error indicators are lower than those of either single forecasting model. In particular, during peak hours the model's average relative error stays below 8%, making it a feasible and effective method for short-term traffic flow forecasting.
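The selection rule can be sketched as follows (an illustrative simplification: only the error-sum-of-squares criterion is shown, and the cosine-angle criterion from the paper is omitted): at each period, whichever predictor has the smaller recent SSE issues the next forecast.

```python
def sse(errors):
    # Error sum of squares over a recent window of residuals.
    return sum(e * e for e in errors)

def choose(svm_errors, kalman_errors):
    # The predictor with the smaller recent SSE forecasts this period.
    return "svm" if sse(svm_errors) <= sse(kalman_errors) else "kalman"

assert choose([1, -1, 2], [3, 0, 1]) == "svm"      # SSE 6 vs 10
assert choose([4, 0, 0], [1, 1, 1]) == "kalman"    # SSE 16 vs 3
```

This is what lets the hybrid track regime changes: the Kalman filter tends to win when flow shifts quickly, the SVM when it is stable.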
Text Feature Selection Methods Based on Information Gain and Feature Relation Tree
REN Yong-gong,YANG Xue,YANG Rong-jie and HU Zhi-dong
Computer Science. 2013, 40 (10): 252-256. 
Abstract PDF(452KB) ( 644 )   
References | RelatedCitation | Metrics
Due to the uneven distribution of classes and features, the classification performance of the traditional information gain algorithm declines sharply. Considering this, a text feature selection method based on information gain, UDsIG, was proposed. First, because uneven class distribution may influence feature selection, features were selected class by class. Second, feature distribution uniformity was used to reduce the influence of features unevenly distributed within a class. Then a feature relation tree model was adopted to process the class features, retaining strongly correlated features and deleting weakly correlated and irrelevant ones. Finally, the best feature subset was obtained using an information gain formula weighted by dispersion. Comparison experiments show that the method has better classification performance.
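The base quantity being improved is standard information gain of a binary term over document classes; UDsIG's per-class selection and dispersion weighting extend it. A minimal computation:

```python
from math import log2

def entropy(counts):
    # Shannon entropy of a class-count vector.
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

def info_gain(with_term, without_term):
    # with_term / without_term: per-class document counts for documents
    # that do / do not contain the term.
    n_t, n_nt = sum(with_term), sum(without_term)
    n = n_t + n_nt
    prior = entropy([a + b for a, b in zip(with_term, without_term)])
    cond = (n_t / n) * entropy(with_term) + (n_nt / n) * entropy(without_term)
    return prior - cond

# A term appearing only in class 0 separates two balanced classes
# perfectly, giving the maximal gain of 1 bit.
assert abs(info_gain([10, 0], [0, 10]) - 1.0) < 1e-9
```

The weakness the abstract targets shows up when class counts are skewed: a term concentrated in a tiny class contributes little to this global average, so per-class treatment is needed.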
Novel Classifier Algorithm Based on Kernel Fisher Discriminant and its Application in Language Recognition
LI Jin-hui,YANG Jun-an and XIANG Yao-jie
Computer Science. 2013, 40 (10): 257-260. 
Abstract PDF(323KB) ( 580 )   
References | RelatedCitation | Metrics
GMM and SVM complement each other well in modeling and recognition performance, so GMM-MMI-SVM has become a mainstream method in language recognition. However, the SVM uses only some special training samples, the support vectors, rather than all samples, which limits further improvement of the system's recognition performance. To solve this problem, a novel classification algorithm based on the Kernel Fisher Discriminant (KFD), called GMM-MMI-KFD, was proposed in this paper. Its core idea is to substitute KFD for the SVM: eigenvector sequences are extracted from voice segments and then fed into the GMM-MMI and GMM-KFD classifiers respectively for judgment. Compared with the SVM, KFD puts more emphasis on the nonlinear distribution of voice data; meanwhile, it maximizes the between-class spread and minimizes the within-class spread after projecting the samples into a high-dimensional space. Experimental data show that the GMM-MMI-KFD classifier achieves a higher recognition rate in language recognition.
Research on Distortion-free Fusion of Sequence Images
YANG Bo,ZHANG Wen-sheng and XIE Yuan
Computer Science. 2013, 40 (10): 261-264. 
Abstract PDF(611KB) ( 613 )   
References | RelatedCitation | Metrics
Aero-optical effects, atmospheric turbulence, water fluctuations and so on can cause image distortion. To analyze images and recognize objects precisely, we need distortion-free images. In this paper, we studied how to obtain a single high-quality image by fusing sequence images in an environment with time-varying distortion. With the help of non-rigid image registration, and taking into consideration the image quality index values of the images, we proposed a new algorithm for distortion-free fusion of sequence images. Experiments using real datasets show the efficiency of our method.
Feature-Preserving Image Denoising Method Combining EMD and Wavelet Analysis
WANG Wei-hong,CHENG Shi-wei,ZHANG Su-qiong and QIN Xu-jia
Computer Science. 2013, 40 (10): 265-268. 
Abstract PDF(1132KB) ( 592 )   
References | RelatedCitation | Metrics
During acquisition and transmission, images are degraded by various kinds of noise. Since details and edges are essential for describing image features, a novel denoising method combining empirical mode decomposition (EMD) and wavelet analysis, aimed at keeping these detail features, was proposed in this paper. First, the method decomposes the image by EMD to obtain the intrinsic mode functions (IMFs) and the residual component (R). Second, it decomposes each IMF by wavelet and filters it to remove noise while keeping the detail features. Finally, it adds the wavelet-filtered IMFs to the residual component from EMD to obtain the denoised image. Experiments show that the proposed method removes noise well while keeping the detail features, which cannot be achieved by the EMD or wavelet method alone.
Application and Method for Linear Projective Non-negative Matrix Factorization
HU Li-rui,WU Jian-guo and WANG Lei
Computer Science. 2013, 40 (10): 269-273. 
Abstract PDF(615KB) ( 519 )   
References | RelatedCitation | Metrics
To solve the problem that the iterative method of Linear Projection-Based Non-negative Matrix Factorization (LPBNMF) is complex, a method called Linear Projective Non-negative Matrix Factorization (LP-NMF) was proposed. In LP-NMF, an objective function in the Frobenius norm is considered from the angle of projection and linear transformation, and a Taylor series expansion is used. An iterative algorithm for the basis matrix and the linear transformation matrix is derived strictly, and a proof of convergence is provided. Experimental results show that the algorithm converges and, relative to Non-negative Matrix Factorization (NMF) and related methods, produces a basis matrix with better orthogonality and sparseness and achieves higher recognition accuracy in face recognition. The LP-NMF method is effective.
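For comparison, the baseline the paper improves on is standard NMF under the Frobenius objective, whose Lee-Seung multiplicative updates are sketched below; LP-NMF derives analogous update rules for a basis matrix plus a linear transformation matrix, which are not reproduced here.

```python
import numpy as np

def nmf(V, r, iters=200, seed=0):
    # Multiplicative updates for min ||V - W H||_F with W, H >= 0.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # epsilon avoids div by 0
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf(V, r=2)
err = np.linalg.norm(V - W @ H)
assert err < np.linalg.norm(V)    # reconstruction beats the zero matrix
```

The multiplicative form keeps W and H nonnegative automatically, which is the property any projective variant must also preserve.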
Fast Remote Sensing Image Segmentation Algorithm Based on Nearest Neighbor Direct Graph
CUI Bin-ge and MENG Ao-xiang
Computer Science. 2013, 40 (10): 274-278. 
Abstract PDF(692KB) ( 676 )   
References | RelatedCitation | Metrics
Existing region growing algorithms do not take into account the direction of nearest-neighbor relations, which results in frequent rebuilding of those relations. In this paper, a fast remote sensing image segmentation algorithm based on a nearest-neighbor directed graph was proposed. First, the remote sensing image is segmented with the watershed algorithm, and then a nearest-neighbor directed graph is established over the resulting region objects. In the region growing phase, adjacent region objects are merged along the directed edges. When a round finishes, the nearest-neighbor directed graph is rebuilt and the next round of region growing begins; this repeats until the number of regions no longer changes. The method avoids recalculating the neighbor relations after every merge, which reduces the computational complexity. The experimental results show that, compared with three other algorithms, the proposed algorithm is more reasonable and more efficient.
Face Recognition Based on HOG Multi-feature Fusion and Random Forest
GUO Jin-xin and CHEN Wei
Computer Science. 2013, 40 (10): 279-282. 
Abstract PDF(694KB) ( 664 )   
References | RelatedCitation | Metrics
A novel approach to face recognition based on HOG multi-feature fusion and random forests was proposed to address the low recognition rates seen in complex environments. The approach introduces the HOG (Histograms of Oriented Gradients) descriptor to extract facial feature information. First, a grid is placed over the face image to extract holistic HOG features of the entire face, and the image is divided into homogeneous sub-blocks, with local HOG features extracted from the sub-blocks containing key facial components. The dimensions of the holistic and local HOG features are then reduced using two-dimensional Principal Component Analysis (2DPCA) and Linear Discriminant Analysis (LDA), and the final classification features are formed by feature-level fusion. Finally, a random forest classifier classifies the final features. Experimental results on the FERET and CAS-PEAL-R1 databases and on a real scene database demonstrate that the proposed approach not only significantly raises the recognition rate and reduces computing time but also has a certain robustness to lighting changes.
Modeling and Simulation of Soft Tissue Deformation Based on Local Dynamic Model
LI Yan-dong,ZHU Ling,YE Xiu-fen and SUN Ming
Computer Science. 2013, 40 (10): 283-288. 
Abstract PDF(1066KB) ( 621 )   
References | RelatedCitation | Metrics
This paper took the human soft tissue model as its research object and proposed a local mass-spring/damper model (ALMSDM) according to the characteristics of medical palpation training. The dynamic characteristics of this model include a position-changing and extensible area, which removes the static limitation of local modeling methods in the existing literature and resolves the poor recovery ability and large data volume of global surface models. A strategy of locally updating vertex normals and pre-computation was also proposed based on ALMSDM's features. This strategy dramatically enhances the real-time performance of the system, which was assessed in terms of recovery, feedback force and real-time behavior under different models. The approach guarantees the accuracy and real-time performance of virtual soft tissue deformation and simulation, verifying the feasibility and generality of the proposed algorithm.
Research on Stylized Lines Rendering of 3D Models
JIANG Bao-ji and TANG Di
Computer Science. 2013, 40 (10): 289-291. 
Abstract PDF(1137KB) ( 727 )   
References | RelatedCitation | Metrics
A substantial challenge in making stylized line drawings from 3D models is the visibility computation. Current algorithms for computing line visibility in models of moderate complexity are either too slow for interactive rendering or too brittle for coherent animation. This paper introduced methods that exploit graphics hardware to provide fast and robust line visibility. A fully optimized pipeline can support line visibility and a broad range of stylization options. The experimental results show that the operation speed satisfies the requirements of real-time interactive systems.
Research on Traffic Parameter Acquisition System Based on ST-MRF Model
ZHOU Jun
Computer Science. 2013, 40 (10): 292-295. 
Abstract PDF(855KB) ( 588 )   
References | RelatedCitation | Metrics
A robust vehicle tracking algorithm is an important precondition for traffic event detection, and occlusion is a key factor affecting vehicle tracking. An adaptive vehicle tracking algorithm based on a spatial-temporal Markov random field (ST-MRF) model was proposed to deal with this problem. The tracking algorithm is used to acquire object maps and motion vector maps, from which traffic parameters can be obtained. Camera calibration is then used to obtain the real-time speed as well as the vehicle motion coordinates. Finally, through analysis of the three traffic flow parameters, real-time traffic flow operation characteristics are obtained, which can provide a basis for future traffic incident detection.
Human Action Recognition Using Action String
ZHAO Hai-yong and LI Jun-qing
Computer Science. 2013, 40 (10): 296-300. 
Abstract PDF(945KB) ( 699 )   
References | RelatedCitation | Metrics
A template-matching method that uses the silhouettes of an image sequence as representative descriptors of human posture was presented to achieve human action recognition. First, human silhouettes were extracted by background subtraction and shadow elimination. A new silhouette descriptor based on the R transform was defined, which transforms the silhouette into a 1D distance signal. A hierarchical clustering method was proposed to extract the key postures of a human action. The key postures were then coded into a template called an action string. Finally, dynamic time warping was used to measure the similarity between the templates and the test sequences. The algorithm was evaluated on six daily human actions. Experimental results show that the method achieves a correct recognition rate above 85%.
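The final matching step is standard dynamic time warping over the coded posture sequences. A minimal sketch (posture codes and sequences are invented for illustration) shows why DTW tolerates differences in tempo between template and test sequence:

```python
import numpy as np

def dtw(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i-1] - b[j-1])
            D[i, j] = cost + min(D[i-1, j], D[i, j-1], D[i-1, j-1])
    return D[n, m]

# Two "action strings" coded as key-posture indices; the test sequence
# differs from the template only in tempo (repeated postures).
template = [1, 2, 2, 3, 4]
test_seq = [1, 1, 2, 3, 3, 4]
print(dtw(template, test_seq))   # 0.0: only repetitions differ
```

A test sequence would be assigned the action label of the template with the smallest DTW distance.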
Three-dimensional Modeling of Soybean Leaf Based on Area Constraint
WANG Jing-wen and LIU Hong
Computer Science. 2013, 40 (10): 301-304. 
Abstract PDF(594KB) ( 613 )   
References | RelatedCitation | Metrics
Aiming at realistic image-based three-dimensional modeling of soybean leaves, and considering the impact of the leaf area index (LAI) on soybean production, a method for three-dimensional leaf modeling under a leaf area constraint was proposed. Firstly, the feature points and leaf area are extracted from the image, and the feature points are interpolated to obtain a bi-cubic uniform B-spline surface. The mesh is then deformed according to the curl of the leaf edge to construct the curly leaf model. Lastly, the maximum illuminated area of each candidate model is calculated to choose the optimal model. Experimental results show that the method can construct a realistic three-dimensional soybean leaf model, and the method has certain biological significance and application value.
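The surface fitting step rests on uniform cubic B-splines. A minimal 1-D sketch of evaluating one spline segment in matrix form (the paper's bi-cubic surface applies the same basis in both parameter directions; the control points here are invented):

```python
import numpy as np

def cubic_bspline(ctrl, t):
    """Evaluate a uniform cubic B-spline segment from 4 control points,
    t in [0, 1], using the matrix form of the basis."""
    M = np.array([[-1,  3, -3, 1],
                  [ 3, -6,  3, 0],
                  [-3,  0,  3, 0],
                  [ 1,  4,  1, 0]]) / 6.0
    T = np.array([t**3, t**2, t, 1.0])
    return T @ M @ np.asarray(ctrl, float)

# Four collinear control points: the spline segment stays on the line.
pts = np.array([[0, 0], [1, 1], [2, 2], [3, 3]], float)
p = cubic_bspline(pts, 0.5)
print(p)   # midpoint of the middle span: [1.5 1.5]
```

A bi-cubic surface patch evaluates this basis over a 4x4 grid of control points, one pass per parameter direction.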
Band Selection Method for Target Recognition in Hyperspectral Images
ZHANG Hai-tao,MENG Xiang-yu,CHEN Hong-yu and ZHANG Ye
Computer Science. 2013, 40 (10): 305-308. 
Abstract PDF(643KB) ( 675 )   
References | RelatedCitation | Metrics
The large data volume of remote sensing images, with many bands and much information redundancy, brings difficulties to further image interpretation. To solve this problem, a band selection method for a specific region of interest was proposed, which groups the bands using the mutual information between adjacent bands and the correlation coefficient matrix over all bands, and then applies a band index and the spectral angle mapping algorithm. First, all valid corrected bands are grouped and divided into subspaces, and the band with the largest band index in each subspace is extracted. Finally, the best band combination is selected based on the spectral separability of surface features. Experiments and comparison with common band selection methods show that the proposed method achieves an obvious object extraction effect.
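The grouping idea can be sketched with adjacent-band correlation alone (the paper also uses mutual information and a band index, which are omitted here; the threshold and synthetic data are illustrative assumptions):

```python
import numpy as np

def group_bands(cube, thresh=0.9):
    """Group adjacent hyperspectral bands whose correlation exceeds `thresh`.

    cube: (bands, pixels) array. A new subspace starts wherever the
    correlation between neighbouring bands falls below the threshold.
    """
    groups, start = [], 0
    for b in range(1, cube.shape[0]):
        r = np.corrcoef(cube[b-1], cube[b])[0, 1]
        if r < thresh:
            groups.append(list(range(start, b)))
            start = b
    groups.append(list(range(start, cube.shape[0])))
    return groups

# Synthetic cube: bands 0-2 share one signal, bands 3-4 share another.
rng = np.random.default_rng(1)
base1, base2 = rng.random(100), rng.random(100)
cube = np.array([base1,
                 base1 + 0.01 * rng.random(100),
                 base1 + 0.01 * rng.random(100),
                 base2,
                 base2 + 0.01 * rng.random(100)])
print(group_bands(cube))   # [[0, 1, 2], [3, 4]]
```

One representative band per group would then be picked, which is what reduces redundancy before the spectral-angle matching step.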
Matrix Multiplication for Line Clipping of Polygon
HUANG Wen-jun
Computer Science. 2013, 40 (10): 309-317. 
Abstract PDF(586KB) ( 683 )   
References | RelatedCitation | Metrics
This paper proposed a new method for clipping lines against a polygon. The method obtains the intersection points of a polygon and a line by matrix multiplication. For a set of line segments, the algorithm first tests them against a bounding box enclosing the polygon to discard segments that do not intersect the box. It then introduces homogeneous coordinates, builds a group of matrices, and applies matrix multiplication to the polygonal window and the straight line to perform successive affine transformations, obtaining the intersection points of the window and the line from the resulting matrix. After sorting and pairing the intersection points, the algorithm obtains the result of clipping the line segment against the polygon. Experiments show that the new method is effective and faster.
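The underlying idea of intersecting lines via products of homogeneous coordinates can be sketched with cross products (the paper's specific matrix construction is not reproduced; window and segment below are invented examples):

```python
import numpy as np

def line_h(p, q):
    """Homogeneous line through two 2-D points (cross product of their
    homogeneous coordinates)."""
    return np.cross(np.append(p, 1.0), np.append(q, 1.0))

def intersect(l1, l2):
    """Intersection of two homogeneous lines; None if they are parallel."""
    x = np.cross(l1, l2)
    return None if abs(x[2]) < 1e-12 else x[:2] / x[2]

# Intersect the segment's carrier line y = 1 with two window edges x = 0, x = 2.
seg   = line_h([-1.0, 1.0], [3.0, 1.0])
left  = line_h([0.0, 0.0], [0.0, 2.0])
right = line_h([2.0, 0.0], [2.0, 2.0])
print(intersect(seg, left), intersect(seg, right))   # [0. 1.] [2. 1.]
```

Sorting such intersection points along the line and pairing them yields the clipped sub-segments, which is the final step the abstract describes.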