Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40, Issue 7, 2013
  
Review on Image Content Representation Models
ZHANG Lin-bo,XIAO Bai-hua,WANG Feng and SHI Lei
Computer Science. 2013, 40 (7): 1-8. 
Content-based image representation has become one of the most actively studied problems in computer vision. To cope with the challenges of object deformation, object occlusion, scale variability and background confusion, many strategies have been proposed and a large body of work has been produced. This paper presented a review of the classic work on content-based image representation, organized into codebook-based models, part-structure models, contour-fragment-based models, biological-cognition-related models and context-related models. The advantages and limitations of each class of model were also provided.
Survey of Automatic Model Composition Based on Semantic Web Service
HUANG Hui,CHEN Xue-guang and WANG Zhi-wu
Computer Science. 2013, 40 (7): 9-14. 
This paper reviewed the proposition, development and implementation of the model composition function in decision support systems (DSS). A survey of distributed model management and model composition was given, and multiple model composition methods were compared. With the emergence of Web service technology and the innovative application of both the Semantic Web and AI planning, the automatic composition of DSS models has become possible, but some problems remain. Distributed model composition methods mainly borrow from automatic Web service composition: models are encapsulated as Web services, and the model composition problem is converted into a Web service composition problem. However, model composition has its own characteristics to which traditional automatic Web service composition methods are not fully applicable, for example the direct composition of qualitative and quantitative models.
Research on Autonomic Computing System and its Key Technologies
WANG Zhen-dong,WANG Hui-qiang,FENG Guang-sheng,LV Hong-wu and CHEN Xiao-ming
Computer Science. 2013, 40 (7): 15-18. 
Autonomic computing is an emerging research hotspot whose aim is to reduce system complexity for users by having technology manage technology. Autonomic theory has been applied in network security and autonomic control for some years and has achieved initial success; however, there is not yet a mature theory of autonomic computing at home or abroad. This paper summarized the concepts, architectures and key technologies of autonomic computing systems, in order to guide follow-up studies by researchers on autonomic computing.
Survey of Helper Thread Prefetching
ZHANG Jian-xun and GU Zhi-min
Computer Science. 2013, 40 (7): 19-23. 
Helper thread prefetching is one of the key techniques for improving the prefetch effect of irregular data-intensive applications, and it has become a hot research topic worldwide in recent years. Targeting the discontinuous locality that characterizes the memory accesses of irregular data-intensive applications, helper threading can effectively convert discontinuous locality into continuous spatial or temporal locality by using the shared LLC of a CMP platform, and as a result the application's performance can be improved. In this paper, helper thread prefetching techniques were classified from the perspective of implementation method, the limitations and strengths of the different types of prefetching were compared and surveyed, and current helper thread prefetching control policies were systematically analyzed and compared. Finally, several major issues and research directions of helper thread prefetching for further exploration were pointed out.
Specification of Bigraphical Categories
XU Dong,ZHU Gang and LI Jing
Computer Science. 2013, 40 (7): 24-27. 
Category theory is a mathematical theory for abstractly treating mathematical structures and the relations among them. Bigraphs, which take category theory as their mathematical basis, form a design, simulation and analysis platform for ubiquitous information systems. However, the definitions of bigraphical categories lack rigorous specification and contain some errors. In this paper, the definition of the bigraph basic signature was improved, the juxtaposition operation of place graphs was corrected, and the relations among precategories, categories, s-categories and symmetric partial monoidal categories were revealed, together with an algorithm for constructing bigraphical categories. On this basis, bigraph theory and its applications can be further investigated.
Anti-jamming Performance Test System Building Method of Beidou Satellite Navigation Receiver
GUO Shu-xia,DONG Zhong-yao,ZHANG Ning and LIU Meng-jiang
Computer Science. 2013, 40 (7): 28-31. 
In view of the increasingly complex space electromagnetic environment, users pay more and more attention to the anti-jamming performance of satellite navigation receivers. To test receiver anti-jamming performance, a method for constructing a test system was proposed. The test was realized by combining digital simulation technology and script-based instrument driving technology with signal simulation equipment, couplers and an anechoic chamber to build a semi-physical simulation system. The effective carrier-to-noise ratio at the receiver was used as the index of anti-jamming performance and was varied by adjusting the output power of the jamming signal transmitter. The simulation results show that the system can obtain the curve of bit error rate versus effective carrier-to-noise ratio, which provides a reference for Beidou receiver anti-jamming performance testing.
New MAC Layer Protocol of Tactical Data Link
PENG Sha-sha,ZHANG Hong-mei,BIAN Dong-liang and ZHAO Yu-ting
Computer Science. 2013, 40 (7): 32-35. 
Aiming at the problem that the Link-16 data link cannot satisfy the requirement of reliable transmission of sensitive information when a precision strike on a moving target is needed, this paper proposed a MAC protocol based on multi-channel priority statistics (MCPS). The MCPS protocol uses different priority thresholds and a channel occupancy statistic to determine whether data packets should be sent, which greatly reduces channel conflict while guaranteeing real-time data transmission. The simulation results show that the Link-16 data delay increases sharply as the number of nodes grows, whereas the end-to-end delay of the MCPS protocol remains constant at 3~4 ms. As network traffic increases, the Link-16 system cannot carry the heavy traffic and suffers severe packet loss, but MCPS can satisfy the requirements of a high-traffic network and keeps the first-attempt transmission success rate no lower than 95%.
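To make the send decision concrete, the following minimal Python sketch checks a measured channel-occupancy statistic against a per-priority threshold; the threshold values, function names and parameters are illustrative assumptions, not the MCPS specification from the paper.

```python
# Illustrative sketch (not the authors' MCPS implementation): a node decides whether
# to transmit by comparing a channel-occupancy statistic with its priority threshold.
PRIORITY_THRESHOLDS = {   # hypothetical occupancy thresholds per priority class
    "high": 0.90,         # high-priority traffic tolerates a busier channel
    "medium": 0.60,
    "low": 0.30,
}

def channel_occupancy(busy_slots, observed_slots):
    """Fraction of recently observed slots in which the channel was busy."""
    return busy_slots / observed_slots if observed_slots else 1.0

def may_send(priority, busy_slots, observed_slots):
    """Send only if the occupancy statistic is below this priority's threshold."""
    return channel_occupancy(busy_slots, observed_slots) < PRIORITY_THRESHOLDS[priority]

if __name__ == "__main__":
    # With 45 of 100 recent slots busy, high- and medium-priority packets may send,
    # while low-priority packets defer and avoid adding to channel conflict.
    for p in ("high", "medium", "low"):
        print(p, may_send(p, busy_slots=45, observed_slots=100))
```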
Balanced Algorithm to Suppress Free-riding in P2P Network
LIU Jian-hui,WANG Jun,JI Chang-peng and WANG Yang
Computer Science. 2013, 40 (7): 36-39. 
With the rapid development of P2P networks in recent years, a large number of "free-riding" nodes have appeared. A balanced algorithm was proposed to determine whether a node is free-riding. The algorithm takes into consideration not only the rational behavior of the node itself but also the characteristics of the physical environment in which nodes reside, and it slows down a node's resource download speed to suppress free-riding behavior. The simulation experiments show that the algorithm not only effectively reduces the number of free-riding nodes but also improves the download success rate of the network, enhancing the fairness and stability of the network and finally achieving the purpose of suppressing free-riding.
Dynamic Self-adaptive Gray Prediction Algorithm for RFID Tag Arrival Rate
CHEN Yi-hong,FENG Quan-yuan and YANG Xian-ze
Computer Science. 2013, 40 (7): 40-43. 
To solve the RFID tag-arrival-rate prediction problem in dynamic environments, a tag-arrival-rate prediction algorithm that dynamically self-adapts to changes in the arrival rate was proposed based on the gray model. The changing trend of the arrival rate is judged by rule knowledge denoting a reversal of the tag arrival rate, so that the model length can dynamically adapt to the arrival-rate change; as a result, the contradiction between prediction tracking speed and accuracy is overcome and prediction accuracy is improved. The tag-arrival-rate change was modeled by a non-homogeneous Poisson process with a sinusoidal intensity function. Simulation experiments show that the algorithm can effectively predict the arrival rate from only a few data points in a randomly varying environment, and its prediction accuracy exceeds that of the general gray prediction algorithm and the exponential smoothing algorithm.
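The core GM(1,1) gray-model step is sketched below with a fixed window length; the adaptive model-length control described in the paper is omitted, and the sample series is an assumed toy example.

```python
import numpy as np

def gm11_predict(x0, steps=1):
    """Minimal GM(1,1) gray-model forecast of the next `steps` values of series x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = 0.5 * (x1[:-1] + x1[1:])                        # mean sequence of x1
    B = np.column_stack((-z1, np.ones_like(z1)))
    y = x0[1:]
    a, b = np.linalg.lstsq(B, y, rcond=None)[0]          # least-squares estimate of a, b
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time-response function
    x0_hat = np.diff(x1_hat)                             # inverse accumulation
    return x0_hat[n - 1:n - 1 + steps]                   # forecasts beyond the sample

# Usage: predict the next tag arrival rates from the most recent observation window.
print(gm11_predict([12, 14, 17, 21, 26], steps=2))
```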
Flow-based Asynchronous Averaging Consensus Protocol on Communication Network
WANG De-yang,WANG Cong-yin,ZHUANG Lei and CHEN Hong-chang
Computer Science. 2013, 40 (7): 44-48. 
In large-scale asynchronous communication networks, computing system-level average parameters in real time is of great significance for guiding control decisions such as resource selection and load balancing. This paper concentrated on the problem of averaging consensus in asynchronous network environments and proposed a flow-based asynchronous averaging consensus protocol, FBAA. The FBAA protocol is applicable to dynamic asynchronous communication network systems and does not require global coordination during its whole running process. The simulation results presented in this paper show that the protocol converges to the average value quickly and that its convergence time is independent of the network scale. Furthermore, by analyzing the experimental data, we derived the relationship between the convergence time and other parameters, as well as the parameter settings under which the system achieves the optimal convergence time.
Utility Analyzing of Campus-wide Cloud Computing Platform Based on Markov Chain
JI Yao,XU Guang-hui,MAO Dong-fang and HAN Jing
Computer Science. 2013, 40 (7): 49-53. 
Many problems of high energy consumption, low efficiency and waste can be alleviated by cloud computing; however, how much cloud computing improves resource utilization, and with what influence, still needs to be quantified. This paper gave a brief description of a campus-wide cloud computing platform, analyzed the cost of the datacenter and of user instances, put forward a cost-utility function, built a Markov queuing model, studied resource allocation and scheduling policies, used CloudSim to simulate the Markov queuing model, and discussed optimization strategies. The modeling and simulation show that with proper policies the resource utilization of the campus network can be improved evidently and the cost reduced, or the QoS improved.
Node Scheduling Algorithm Based on Communication and Sensing Coverage in WSNs
YANG He,WANG Wen-yong and TANG Yong
Computer Science. 2013, 40 (7): 54-60. 
In randomly deployed wireless sensor networks (WSNs), there are usually some nodes that are redundant with respect to coverage and communication, which not only causes a huge waste of energy but also affects the network's quality of service. How to effectively schedule these redundant nodes has therefore become a hot issue in WSNs. This paper proposed a distributed algorithm for selecting active nodes with communication and sensing coverage in randomly deployed WSNs (RCSC). Based on a cellular structure, RCSC optimizes the working set of nodes by adding "bridge nodes" and filling "sensing holes", so as to ensure the "communication coverage" and "sensing coverage" of the whole network. Finally, combined with the LEACH protocol, RCSC can schedule all nodes dynamically. Simulation shows that in the network topology constructed by RCSC fewer working nodes are needed while the network remains stable; as a result, it extends the network lifetime by reducing the extra energy consumption caused by random communication among nodes.
Joint Spectrum Handoff Scheduling and QoS Re-routing for Performance Optimization
XIE Kun and LIU Xue-li
Computer Science. 2013, 40 (7): 61-66. 
Spectrum handoff has attracted wide interest because it can reduce interference among wireless transmissions and optimize the structure of wireless networks. Current studies on spectrum handoff cannot guarantee the connectivity and high throughput of a wireless network when spectrum handoff happens, because they ignore the fact that the spectrum handoff order among multiple links affects performance. To maximize the throughput of the wireless network, this paper proposed a novel spectrum handoff scheduling problem (SHSTM) and proved that SHSTM is NP-hard. To solve it, we proposed a cross-layer optimization algorithm that jointly considers spectrum handoff scheduling and QoS re-routing (JSHSQ-R). In JSHSQ-R, spectrum handoff executes in several rounds. To reduce the total delay of spectrum handoff and to guarantee network connectivity, JSHSQ-R computes the sets of links whose spectrum needs to hand off in every round based on a weighted minimum spanning tree, and to satisfy the QoS requirement of every flow it computes a QoS route for every flow in every round. Extensive simulations in NS2 demonstrate that the proposed algorithm makes full use of the multiple radios and channels in a wireless mesh network and obtains high throughput for multiple flows.
Multi-round Cluster Based Multi-hop Clustering Routing Protocol for Wireless Sensor Networks
CAO Jian-ling,CHEN Yong-chao,REN Zhi and LI Qing-yang
Computer Science. 2013, 40 (7): 67-70. 
A multi-round-cluster-based multi-hop clustering routing protocol (MCBMC) for wireless sensor networks (WSNs) was proposed. It improves the RBMC protocol as follows: it adds an energy parameter to the cluster-head election algorithm, uses multi-round clustering instead of clustering in every round, and reduces frequent clustering and the repeated establishment of multi-hop routes. The protocol was simulated in OPNET 14.5 and compared with LEACH and RBMC. The results show that MCBMC can reduce node energy consumption and prolong the network lifetime.
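As a rough illustration of adding an energy parameter to a cluster-head election, the sketch below weights a LEACH-style election threshold by residual energy; the formula and parameter values are assumptions, not the MCBMC election rule.

```python
import random

def election_threshold(p, round_no, residual_energy, initial_energy):
    """LEACH-style threshold weighted by residual energy (illustrative, not the MCBMC formula)."""
    if residual_energy <= 0:
        return 0.0
    base = p / (1 - p * (round_no % int(1 / p)))       # classic LEACH rotation term
    return base * (residual_energy / initial_energy)   # energy-aware weighting

def elect_cluster_heads(nodes, p=0.05, round_no=0):
    """nodes: dict node_id -> (residual_energy, initial_energy); returns elected head ids."""
    heads = []
    for node_id, (e_res, e_init) in nodes.items():
        if random.random() < election_threshold(p, round_no, e_res, e_init):
            heads.append(node_id)
    return heads

# Usage: 100 nodes with random residual energy; low-energy nodes are rarely elected.
nodes = {i: (random.uniform(0.2, 1.0), 1.0) for i in range(100)}
print(elect_cluster_heads(nodes, p=0.05, round_no=3))
```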
Trust Optimization Search p2p Node Localization Algorithm
SUN Tao,ZHAO Guo-sheng and WANG Bin
Computer Science. 2013, 40 (7): 71-73. 
The traditional HDHT method uses a trust mechanism to locate resources on heterogeneous nodes. When node selection takes place before initial trust has been established, some nodes with weaker processing power become congested, and this node congestion severely affects the performance of the search algorithm. This paper put forward a trust-optimized p2p node localization search algorithm which, based on credibility, establishes a search interval and in every search step uses directed search to optimize the search direction, ensuring that each node is searched along the trust-optimized direction and avoiding the repeated searching of nodes caused by the heterogeneous distribution of the p2p network in the traditional algorithm. Experimental results show that the algorithm improves node resource search efficiency during initial trust establishment and also improves the download success rate after trust has been set up. The algorithm can be realized on a p2p query cycle platform, and the experimental analysis verifies its effectiveness.
Block Diagonalization Algorithm of Eliminating Multiple Base Stations Time-varying Channel Asynchronous Interference
LI Su-ruo
Computer Science. 2013, 40 (7): 74-76. 
Due to the combined impact of the time-varying characteristics of channel state information and channel feedback error, beamforming algorithms cannot completely eliminate inter-cell interference, in particular asynchronous interference, resulting in a decline in the transmission rate and service quality of cell-edge users. Aiming at this problem, by studying the time-varying characteristics of the channel and the statistical characteristics of the feedback error, this article put forward a block-diagonalization-based beamforming scheme that takes the statistics of the time-varying channel into account. The base station precodes the transmitted signal to eliminate the asynchronous interference, thereby improving the service quality of cell-edge users. The simulation results show that the scheme effectively suppresses the disturbance caused by the time-varying channel and improves the system capacity.
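The basic block-diagonalization step on which such schemes build is shown in the sketch below: each user's precoder is taken from the null space of the other users' stacked channels so that their interference is zeroed. The antenna counts are assumed toy values, and the paper's time-varying-channel statistics are not modeled here.

```python
import numpy as np

def bd_precoders(H_list, tol=1e-10):
    """Block-diagonalization precoders: user k's precoder lies in the null space of the
    stacked channel matrices of all other users (H_list[k] has shape rx_k x tx).
    Illustrative sketch of the plain BD step only."""
    precoders = []
    for k, Hk in enumerate(H_list):
        H_others = np.vstack([H for j, H in enumerate(H_list) if j != k])
        _, s, Vh = np.linalg.svd(H_others)      # full SVD gives a basis of the null space
        rank = int(np.sum(s > tol))
        precoders.append(Vh.conj().T[:, rank:]) # columns spanning the null space
    return precoders

# Usage: 3 users with 2 receive antennas each, base station with 6 transmit antennas.
rng = np.random.default_rng(0)
H_list = [rng.standard_normal((2, 6)) + 1j * rng.standard_normal((2, 6)) for _ in range(3)]
W = bd_precoders(H_list)
print(np.linalg.norm(H_list[1] @ W[0]))  # ~0: user 0's precoded signal vanishes at user 1
```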
Precoding Method of Cancelling Asynchronous Interference for Cognitive Radio System
ZHANG Bao-jian and QU Pei-xin
Computer Science. 2013, 40 (7): 77-79. 
When the primary users and the secondary users are geographically separated by a certain distance, asynchronous interference arises, which affects not only the primary-user system but also the secondary-user system. In order to eliminate the influence of asynchronous interference on both systems, this paper put forward a precoding scheme based on the minimum mean square error criterion. Analysis and simulation results show that, under the interference constraint of the primary-user system, the proposed scheme effectively enhances the capacity of both the primary-user and secondary-user systems and improves the transmission reliability of the secondary-user system.
Remote Sensing Data Organization Model Based on Cloud Computing
LAI Ji-bao,LUO Xiao-li,YU Tao and JIA Pei-yan
Computer Science. 2013, 40 (7): 80-83. 
To solve the problem of massive remote sensing data storage in a cloud computing environment, a remote sensing image data organization model based on cloud computing (RSC-DOM) was put forward. The concept of Virtual Disk Space (VDS) was adopted to meet the requirement of direct addressing in a distributed storage environment. The distributed storage architecture and the VDS file architecture in cloud computing were analyzed, and a remote sensing storage architecture was built. Experimental results show that the proposed model is convenient for retrieval and that its efficiency is higher than that of the traditional Oracle-based management approach.
Privacy-preserving Dynamic Integrity-verification Algorithm in Data Aggregation
CHEN Wei,YANG Long and YU Le
Computer Science. 2013, 40 (7): 84-88. 
Privacy exposure, information tampering and false data injection are serious challenges in wireless sensor network data aggregation, and how to protect the privacy and integrity of data has become a hot research issue. To detect malicious data tampering and protect data privacy in data aggregation, a novel privacy-preserving dynamic integrity-verification (PDI) algorithm was proposed. The PDI algorithm uses data perturbation to protect data privacy. In order to protect the integrity of the private data, the PDI algorithm dynamically generates monitoring nodes between the current aggregator and its parent node depending on the network structure; therefore, if the data is tampered with by the aggregator, it can be detected at an early stage. The simulation results show that the PDI algorithm can realize privacy preservation and integrity verification with low communication and computation overhead.
Incomplete Exponential Sums over Galois Rings and their Applications in Kerdock-code Sequences Derived from Zp2
SUN Ni-gang,ZHENG Hong and LV Meng
Computer Science. 2013, 40 (7): 89-92. 
An upper bound for incomplete exponential sums over Galois rings was derived. Based on the incomplete exponential sums, a nontrivial upper bound for the aperiodic autocorrelation of the Kerdock-code p-ary sequences derived from Zp2 was given, where p is an odd prime. The result shows that these sequences have low aperiodic autocorrelation and strong potential for application in communication systems and cryptography. An estimate of the partial-period distributions of these sequences was also derived.
Dynamic Threshold Attributes-based Signature Scheme
FU Xiao-jing,ZHANG Guo-yin and MA Chun-guang
Computer Science. 2013, 40 (7): 93-97. 
A security flaw in an existing attribute-based signature scheme was first pointed out and analyzed, and on the basis of Li's attribute-based signature (ABS) a new efficient ABS was proposed in which the signing cost and signature size are reduced. The proposed ABS is proved secure in the random oracle model: it satisfies existential unforgeability against adaptive chosen message and predicate attacks under the standard computational Diffie-Hellman assumption, and it provides attribute-signer privacy. Simulation results show that the proposed ABS can be applied to data dissemination in mobile peer-to-peer networks to achieve message authentication.
Research of Transferring Attack Based on Supply Chain Network
LIU Hong,ZHOU Gen-gui,FU Pei-hua and MAO Guo-hong
Computer Science. 2013, 40 (7): 98-101. 
Unexpected events have negative impacts on the overall performance of a supply chain network. Based on complex network theory, and considering the transferring characteristic of node failures, this paper put forward a transferring attack strategy to simulate supply disruption and demand disruption in a supply chain system, and further analyzed the fragility and robustness of the supply chain network. A hypothetical supply chain network was constructed to illustrate the proposed attack strategy. The experimental results show that the proposed attack strategy can well simulate supply disruption and demand disruption and that the supply chain network exhibits a degree of fragility under transferring attacks, which also shows that the strategy is feasible and of practical significance.
Research on PRNG Suitable for UHF RFID Tag
GAO Shu-jing and WANG Hong-jun
Computer Science. 2013, 40 (7): 102-106. 
With the development of the Internet of Things, RFID applications are becoming more and more prevalent, and RFID security has been a hot topic in recent years. Due to the limitations of cost and power consumption, the only security components in EPC Class 1 Generation 2 (C1G2) passive tags are a random number generator (RNG) and a cyclic redundancy code (CRC). Designing an RNG with low hardware complexity is therefore critical to the security of C1G2 tags. A simple hash function named M-hash, suitable for hardware implementation, was proposed, and a pseudo-random number generator, M-PRNG, was designed based on the one-wayness of M-hash. The M-PRNG is based on an LFSR and has low hardware complexity, which makes it suitable for passive devices such as C1G2 tags. It is shown that the random sequences generated by the M-PRNG are fully compatible with the EPC C1G2 protocol and pass the demanding NIST randomness tests.
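For readers unfamiliar with the LFSR core that such tag PRNGs build on, the following is a minimal sketch of a 16-bit Fibonacci LFSR; it is not the M-PRNG or M-hash design from the paper, and a bare LFSR on its own is linear and cryptographically weak.

```python
def lfsr16(seed):
    """16-bit Fibonacci LFSR (maximal-length polynomial x^16 + x^14 + x^13 + x^11 + 1).
    Illustrative only; real tag PRNGs combine the LFSR with a nonlinear (e.g. hash) stage."""
    state = seed & 0xFFFF
    if state == 0:
        raise ValueError("seed must be non-zero")
    while True:
        # feedback bit is the XOR of the tapped register positions
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        yield state

gen = lfsr16(0xACE1)
print([next(gen) & 0xFF for _ in range(8)])   # a few pseudo-random bytes, e.g. for a tag reply
```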
Trust Negotiation-based Services Verification in Cloud Computing
YANG Shao-yu,WANG Shi-qing and GUO Xiao-feng
Computer Science. 2013, 40 (7): 107-112. 
In cloud computing, service resources are widely distributed and migrate frequently, so the trust relationships between them are hard to establish and maintain. Traditional remote attestation based on trusted computing suffers from problems such as performance bottlenecks and computational complexity. This article proposed a novel remote attestation mechanism based on property negotiation for cloud computing. Using a ring signature algorithm and sensitive-property-based protection, the mechanism improves computational efficiency and reduces the risk of sensitive property leakage. The security of the mechanism is verified with a security model, and its validity and feasibility are tested by experiments on the Hadoop platform.
Improved Trust Mechanism Based on EigenRep Trust Model
LI Jun,XUE Wei and GAN Xu-yang
Computer Science. 2013, 40 (7): 113-115. 
The current EigenRep trust model cannot apply reputation punishment to the cheating behavior of malicious peers. To solve this problem, this paper improved the global reputation computing method by introducing several comprehensive factors into the computation of trust values. Moreover, another defect of EigenRep is that the system has to perform an iterative calculation of global credibility over the whole network for each transaction, which causes a high system cost. To address this, this paper presented a new trust mechanism which combines risk, local trustworthiness and global reputation. Simulations show that this mechanism can greatly reduce the system cost. Finally, this paper realized an e-commerce platform system based on the proposed trust mechanism.
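A hypothetical weighted combination of the three factors is sketched below to make the idea concrete; the weights and the linear form are assumptions, not the paper's exact formula.

```python
def combined_trust(global_rep, local_trust, risk, alpha=0.5, beta=0.3, gamma=0.2):
    """Hypothetical combination of global reputation, direct (local) trust and a risk
    penalty, in the spirit of the mechanism described above; the actual formula may differ."""
    assert abs(alpha + beta + gamma - 1.0) < 1e-9   # weights assumed to sum to 1
    return alpha * global_rep + beta * local_trust - gamma * risk

# A peer with a good global reputation but recent risky (e.g. cheating) behaviour is demoted.
print(combined_trust(global_rep=0.9, local_trust=0.7, risk=0.8))
print(combined_trust(global_rep=0.9, local_trust=0.7, risk=0.1))
```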
Protecting Location Privacy with Voronoi Diagram over Road Networks
ZHAO Ping,MA Chun-guang,GAO Xun-bing and ZHU Wei
Computer Science. 2013, 40 (7): 116-120. 
Location privacy disclosure has become a main constraint on LBS applications, yet most existing location privacy protection methods do not consider the background of mobile users, namely the road network. A location privacy protection method over road networks was presented. The method consists of three phases. First, in order to meet the requirement of segment l-diversity, a road-network Voronoi diagram is constructed based on the structure of the road network. Second, a VK-privacy model is put forward, which satisfies the privacy requirements of all users in the cloaking set and effectively ensures the QoS of the LBS. Finally, a cloaking algorithm based on the VK-privacy model is presented, which improves processing efficiency and safety by cloaking multiple users in the same V-region together. The method takes full account of the structural characteristics of road networks and balances users' privacy requirements against the QoS of the LBS. The robustness of the method against inference attacks was proved through theoretical analysis, and its feasibility was demonstrated by the experimental data.
Application-layer DDoS Attack Detection Based on Request Keywords
XIE Bai-lin,JIANG Sheng-yi and ZHANG Qian-sheng
Computer Science. 2013, 40 (7): 121-125. 
Today, application-layer DDoS attacks can cause great harm to the security of the Internet. Existing detection methods lack versatility: each approach focuses on only one particular application-layer DDoS attack. In order to quickly and effectively identify several different application-layer DDoS attacks, this paper presented a detection method based on request keywords. The input of the method is the number of request keywords per unit time together with the distance of their frequency distribution from the normal profile, and a hidden Markov model is then used to detect application-layer DDoS attacks. The experimental results show that the proposed method can discover several different application-layer DDoS attacks with a relatively high detection rate and a low false positive rate.
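One way to form such per-window observations is sketched below: a keyword frequency distribution is computed for each time window and compared with a learned normal profile. The distance measure and the example profile are assumptions; the resulting pair would serve as an observation for an HMM-style detector.

```python
from collections import Counter

def keyword_distribution(requests):
    """Relative frequency of request keywords (e.g. requested URLs) in one time window."""
    counts = Counter(requests)
    total = sum(counts.values())
    return {k: c / total for k, c in counts.items()}

def distribution_distance(window_dist, profile_dist):
    """Total-variation-style distance from the normal profile; a hypothetical stand-in
    for the paper's frequency-distribution distance."""
    keys = set(window_dist) | set(profile_dist)
    return 0.5 * sum(abs(window_dist.get(k, 0.0) - profile_dist.get(k, 0.0)) for k in keys)

profile = {"/index": 0.5, "/news": 0.3, "/search": 0.2}     # assumed normal profile
window = ["/search"] * 80 + ["/index"] * 20                 # a suspicious burst on one keyword
obs = (len(window), distribution_distance(keyword_distribution(window), profile))
print(obs)   # (request count, distance) -> one observation for the detector
```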
Security Analysis and Improvement of Certificateless Strong Designated Verifier Signature Scheme
LIU Tang,WANG Xiao-fen and DING Xue-feng
Computer Science. 2013, 40 (7): 126-128. 
Hafizul Islam SK and G.P. Biswas recently proposed a certificateless strong designated verifier signature scheme based on elliptic curve bilinear pairings and claimed that their scheme is provably secure against three types of adversaries: the type 1 adversary, who only learns the system public parameters; the type 2 adversary, who cannot obtain the private key of the user or the system master key but can replace the user's public key; and the type 3 adversary, who has obtained the system master key. However, this paper pointed out that their scheme is not as secure as claimed, by presenting an attack launched by an adversary who has learned the system master key. Furthermore, to fix this flaw, we provided a revised certificateless strong designated verifier signature scheme in which the verifier's self-generated partial private key is included in the computation of the verification procedure, so that the above attack can be efficiently resisted.
Adaptive Blind Watermarking Based on DCT
JI Yan
Computer Science. 2013, 40 (7): 129-130. 
Aiming at the problem that general digital watermarking algorithms cannot achieve a good balance between imperceptibility and robustness, this paper put forward an improved adaptive blind digital watermarking scheme based on the DCT. When embedding the watermark, the odd-even quantization method is used to embed watermark bits in the DC component of each DCT block, and a fixed-coefficient method is used to embed watermark bits in the AC components according to the texture and shadow characteristics of the cloud image and the human visual system (HVS). The experimental results show that the proposed adaptive blind watermarking scheme not only is robust against Gaussian noise and common salt-and-pepper noise attacks but also better preserves the invisibility of the watermark, while reducing the watermark embedding and extraction time.
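The odd-even quantization of the DC coefficient can be illustrated by the minimal sketch below, which embeds and extracts one bit per 8x8 block; the block size and quantization step are assumed values, and the adaptive AC embedding described above is omitted.

```python
import numpy as np
from scipy.fftpack import dct, idct

def embed_bit_dc(block, bit, q=16.0):
    """Embed one watermark bit in the DC coefficient of an 8x8 block by odd-even quantization:
    the parity of the quantization index is forced to match the bit. Illustrative sketch only."""
    c = dct(dct(block.T, norm='ortho').T, norm='ortho')   # 2-D DCT of the block
    idx = int(np.round(c[0, 0] / q))
    if idx % 2 != bit:
        idx += 1                                          # flip parity to encode the bit
    c[0, 0] = idx * q
    return idct(idct(c.T, norm='ortho').T, norm='ortho')  # back to the pixel domain

def extract_bit_dc(block, q=16.0):
    c = dct(dct(block.T, norm='ortho').T, norm='ortho')
    return int(np.round(c[0, 0] / q)) % 2

rng = np.random.default_rng(1)
blk = rng.uniform(0, 255, (8, 8))
print(extract_bit_dc(embed_bit_dc(blk, 1)), extract_bit_dc(embed_bit_dc(blk, 0)))  # -> 1 0
```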
Research on Technology of Voice Instruction Recognition for Air Traffic Control Communication
LIU Wan-feng,HU Jun and YUAN Wei-wei
Computer Science. 2013, 40 (7): 131-137. 
Training in air traffic control (ATC) communication in English is the main content of ATC simulator training. This paper focused on voice instruction recognition in ATC simulator training, including an analysis of the basic characteristics of ATC communication, the grammar description for the language model, the handling of the special pronunciation of instructions during recognition, post-recognition processing, and acoustic model adaptation. Based on the speech recognition engine Sphinx-4, we designed and implemented a voice instruction recognition system, AIRS (ATC Instruction Recognition System). The experimental results show that the accuracy of voice instruction recognition after acoustic model adaptation can meet the demands of ATC simulator training.
Runtime Verification Method for Web Service Based on UML 2.0 Sequence Diagrams
ZHANG Ya-hong,ZHANG Lin-lin,ZHAO Kai,CHEN Jia-li and FENG Zai-wen
Computer Science. 2013, 40 (7): 138-138. 
To verify the consistency between the run-time behavior of a Web service and its specification, a runtime verification method for Web services was proposed. In this paper, UML 2.0 Sequence Diagrams were extended to describe the specification of a Web service in terms of both functionality and QoS; the Extended Sequence Diagrams (ESD) were then transformed into deterministic finite automata to give them semantics, and verification criteria were given to check the consistency between run-time behavior and the specification. In addition, a runtime verification tool for Web services (RVT4WS) was developed to support the proposed runtime verification method.
Design and Implementation of Web Service Vulnerability Testing System Based on SOAP Messages Mutation
CHEN Jia-mei,CHEN Jin-fu,ZHAN Yong-zhao,WANG Huan-huan and LI Qing
Computer Science. 2013, 40 (7): 143-146. 
Automatic tools for testing Web service vulnerabilities are of great value to Web-service-based software engineering, as they can effectively ensure the security and reliability of Web-service-based software. For widely used Web services, a prototype system, WSVTS (Web Service Vulnerability Testing System), was designed and implemented. Two mutation approaches for testing Web service vulnerabilities based on the input domain of SOAP messages, namely the worst-input mutation approach and the fuzz data-input mutation approach, were implemented. Based on the two approaches, two test case generation algorithms, Test Case generation based on Farthest Neighbor (TCFN) and the Fuzz Data-input Mutation Algorithm (FDMA), were also implemented. The test cases generated by the algorithms are injected into the SOAP request messages, and the vulnerabilities of the Web services can be detected from the response messages received by the client.
Continuous Probabilistic Reverse Skyline Query on Moving Objects with Uncertainty
TANG Zhi-jun,FAN Ming-suo,HE Xian-mang,CHEN Hua-hui and DONG Yi-hong
Computer Science. 2013, 40 (7): 147-152. 
Reverse skyline queries have proved useful in business planning, environmental monitoring and other applications, but existing research focuses on static reverse skylines. This paper considered reverse skyline query processing over moving target objects with uncertainty. On the basis of a detailed analysis of the reverse dominance relationship between moving objects, and by defining the reverse dominance probability, the reverse skyline probability and the processes that cause the reverse skyline set to change, a new process-based algorithm was proposed to handle continuous probabilistic reverse skyline queries over uncertain moving objects. The types of processes that affect the p-RSky set were defined, and by tracking and evaluating these processes the p-RSky at any time can be found quickly. Two pruning rules were proposed to avoid a large amount of invalid computation. Extensive experiments show that the algorithm is efficient and effective.
Study on Fuzzy Query of Heterogeneous Bipolarity Information
ZHAO Fa-xin and JIN Yi-fu
Computer Science. 2013, 40 (7): 153-156. 
In daily life, people often give both positive and negative information to state what they desire and what they reject for the same things. Because positive and negative statements do not necessarily mirror each other, this results in so-called heterogeneous bipolar information. Fuzzy querying in traditional information systems does not adequately support the handling of heterogeneous bipolar information. In this paper, on the basis of a regular database, vague sets were introduced into the modeling of heterogeneous bipolar information for dealing with such information in fuzzy queries, a bipolar query satisfaction modeling framework based on couples consisting of an independent degree of satisfaction and a degree of dissatisfaction was given, and the processing of heterogeneous bipolar queries that contain both positive and negative criteria was discussed.
Research and Application of NServiceBus Service Bus in SOA Environment
TANG Rong-jun,YE Bo and WEN Jun-hao
Computer Science. 2013, 40 (7): 157-161. 
In the traditional service-oriented environment, service consumers and service providers complete service binding through a UDDI server, which cannot provide an asynchronous, reliable and manageable process. This paper introduced NServiceBus, an open-source service bus, to provide an asynchronous communication model; its centralized management and quality-of-service control of registered services can effectively improve the flexibility and extensibility of service calls. In addition, this paper analyzed the message forwarding mechanism of the NServiceBus open-source service bus and proposed a service response time model to forecast the overload condition of the system under highly concurrent access and to provide overload protection. This paper also designed and implemented an adaptive priority queue to provide hierarchical scheduling of service requests, which to a certain degree guarantees service response time and improves the system response rate. Finally, evaluation experiments and comparative analysis in a real project demonstrate the validity and practicability of the method in improving the performance of NServiceBus-based services.
Opinion Combination Recognition and Orientation Judgment of Network Information
RU Cheng-sen,RAO Lan and WANG Ting
Computer Science. 2013, 40 (7): 162-166. 
With the rapid development of Internet technology and the explosive growth of online reviews, opinion mining has emerged, in which the extraction of opinion targets and opinion phrases is an important task. Template-based methods have several disadvantages: they require too much manual intervention, their templates lack coverage, and they have difficulty identifying opinion targets and opinion phrases that are far apart. This paper presented a method that extracts templates automatically, identifies opinion combinations by using probabilities, and identifies long-distance opinion targets and opinion phrases. The sentiment strength of sentiment words is calculated using thesauri, and the opinion orientation is judged by considering the impact of both sentiment words and qualifiers. The proposed method was evaluated on the COAE2011 corpus, compared with two baseline methods, and obtained good results.
Parallel Affinity Propagation Clustering Algorithm Based on Hybrid Measure
ZHANG Jian-peng,CHEN Fu-cai,LI Shao-mei and YU Hong-tao
Computer Science. 2013, 40 (7): 167-172. 
The affinity propagation (AP) clustering algorithm has difficulty obtaining ideal clustering results on datasets with complex manifold structure and non-uniform density. By studying the low-dimensional manifold structure, this paper proposed a density-adaptive "manifold distance kernel" (ad-MDK), which takes into account the local density information of data points as well as the overall structure of the dataset, enabling the algorithm to solve clustering problems on data with complex distributions. Meanwhile, in order to reduce the cost of computing the manifold distances, a parallel version of the proposed affinity propagation algorithm was introduced to effectively improve its speed. Experiments on several datasets verify that the proposed algorithm outperforms the traditional AP algorithm in dealing with large-scale, multi-scale datasets.
Incremental Approach for Updating Approximations of Gaussian Kernelized Fuzzy Rough Sets under Variation of Object Set
ZENG An-ping,LI Tian-rui and LUO Chuan
Computer Science. 2013, 40 (7): 173-177. 
In real applications, information systems contain many kinds of data, which may consist of categorical, numerical and fuzzy values. The fuzzy rough set model can deal with such complex data, and Gaussian kernels have been introduced to acquire fuzzy relations between samples described by fuzzy or numeric attributes in order to carry out fuzzy rough data analysis. In addition, information systems often vary with time, and how to use previous knowledge to update approximations in the fuzzy rough set model is a key step in its application to big data. This paper discussed the principles of updating approximations in fuzzy information systems under variation of the object set, presented an approach for incrementally updating the approximations of fuzzy rough sets, and employed some examples to illustrate the proposed approach.
System State Detection-Recognition Based on Random P-sets
GUO Zhi-lin,ZHAO Shu-li and SHI Kai-quan
Computer Science. 2013, 40 (7): 178-181. 
Based on the concept of packet sets (P-sets) and the random characteristics of element transfer and of the structure of random packet sets, the state function, the shrinking-extension theorem and the reduction-discernibility theorem for the state function were given, and then the measurement of state departure and the criterion for system state recognition were presented. An application of random P-sets in system state detection and recognition was given.
Algorithm for Choosing Optimal Uncertain Data of Complete Data Streams
XU Xue-song,XU Jia,GUO Li-wei,ZHANG Hong and ZHOU Jin-hai
Computer Science. 2013, 40 (7): 182-186. 
To address the information gap between RFID data and the requirements of upstream applications, as well as the real-time character of sensor data, an algorithm for choosing the optimal uncertain data of complete data streams was proposed. The drawbacks of the generic particle filter were analyzed, and then an entropy-based method was adopted to estimate the most likely attribute weight for each object, using a possibility degree matrix to select the optimal particles so as to efficiently capture the possible locations and containment of tagged objects, thereby improving the performance of the generic particle filter. In this method, through particle optimization, particles are moved to the regions where the posterior density function has larger values. The experimental results show the accuracy and efficiency of the method, and the number of particles needed for accurate location is reduced dramatically. Finally, a numerical example was given to show its feasibility and effectiveness in measuring the underlying uncertainty of RFID data.
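The entropy-based attribute weighting step can be illustrated by the standard entropy weight method sketched below; the score matrix is an assumed toy example, and the particle-filter optimization itself is not shown.

```python
import numpy as np

def entropy_weights(X, eps=1e-12):
    """Entropy weight method: attributes whose values vary more across objects carry more
    information and receive larger weights. X is an (n_objects x n_attributes) matrix of
    non-negative scores. Illustrative of the weighting step only."""
    X = np.asarray(X, dtype=float)
    P = X / (X.sum(axis=0, keepdims=True) + eps)              # column-wise proportions
    n = X.shape[0]
    entropy = -(P * np.log(P + eps)).sum(axis=0) / np.log(n)  # normalised entropy per attribute
    d = 1.0 - entropy                                         # degree of diversification
    return d / d.sum()

X = np.array([[0.9, 0.2, 0.5],
              [0.8, 0.9, 0.5],
              [0.1, 0.4, 0.5]])
print(entropy_weights(X))   # the constant third attribute receives (near-)zero weight
```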
Novel Domain Transfer Learning Approach Using Minimum Enclosing Ball
GU Xin and WANG Shi-tong
Computer Science. 2013, 40 (7): 187-191. 
Traditional machine learning methods assume that different learning tasks are unrelated, but in fact there are links between them. Transfer learning attempts to use these links, and even past learning experience from other tasks, to accelerate learning for new tasks. This paper integrated the minimum enclosing ball (MEB) algorithm with Parzen window probability estimation to develop a new transfer learning method named MEBTL (minimum enclosing ball transfer learning). We also used core vector machine (CVM) theory to develop CCMEBTL, a fast version of the proposed algorithm for large-scale domain adaptation. The experimental results on WiFi indoor positioning and face detection indicate the effectiveness of the proposed algorithms.
Distributive Reduction of Set-valued Decision Table Based on Restriction Tolerance Relation
QIAO Quan-xi and QIN Ke-yun
Computer Science. 2013, 40 (7): 192-195. 
This paper discussed the distributive reduction of set-valued decision tables based on a symmetric restriction tolerance relation. A distributive reduct is the minimal attribute subset that keeps the rough upper approximations of all decision classes unchanged. The authors defined a distributive compatible set and gave three necessary and sufficient conditions. An example shows that the algorithm can obtain the distributive reduction of a set-valued decision table.
Study on Game Theory in Decision Interaction for Multi Intelligent Agents Based on Information Fusion
XU Zhao-hui,LIAN Fei-yu and FU Mai-xia
Computer Science. 2013, 40 (7): 196-200. 
In high-level data fusion, the variation tendency of a situation needs to be perceived and predicted. Because the evolution of the situation of an interactive system composed of multiple intelligent agents is driven by their respective decision processes, pure probability and evidence theory are not satisfactory tools for this type of prediction. For such prediction, game theory can provide better decision and recognition capabilities and serves as a tool for process planning. By combining influence diagrams with game theory, we proposed a new architecture, the Bayesian game model, for decision support, to enhance the ability to perceive and predict complex situations caused by multiple interacting intelligent agents. Through a case study, we explained the effect of the policymakers' game strategies on situation assessment and gave the case's Bayesian game model. The model and method proposed in this paper overcome the disadvantage of traditional methods that neglect subjective factors, and may offer a new way to perform situation assessment with game features.
CP_SDD+RDS:An Algorithm Based on Row-divided Sorting and Single-direction Detecting for Finding Closest Pair of Points
YAO Hua-chuan,WANG Li-zhen,CHEN Hong-mei and HU Xin
Computer Science. 2013, 40 (7): 201-205. 
Finding the closest pair of points is widely applied in many areas, such as geographic information query systems and spatial databases, but so far there is no fully efficient algorithm for the problem. For example, the divide-and-conquer algorithm involves many comparisons, slow convergence and a large amount of distance computation between pairs of points, while grid-based methods used for the nearest neighbor problem cannot determine the grid size reasonably and are inefficient. For this reason, a single-direction detecting algorithm (CP_SDD) was presented in this paper, then a row-divided sorting algorithm was proposed, and finally an algorithm based on row-divided sorting and single-direction detecting for finding the closest pair of points (CP_SDD+RDS) was formed. Compared with the divide-and-conquer algorithm, our algorithm efficiently overcomes its drawbacks, and the row-divided strategy of the RDS algorithm also effectively addresses the problems of grid-based methods. A large number of experiments show that the CP_SDD+RDS algorithm is efficient and feasible.
Ensemble Model with Semisupervised SVM for Remote Sensing Land Cover Classification
LIU Ying,ZHANG Bai,WANG Ai-lian,SANG Juan and HE Yong-mei
Computer Science. 2013, 40 (7): 206-210. 
Most SVM-based remote sensing classification methods are challenged by incorrectly selected parameter values and small-sample problems. This paper proposed a novel ensemble model with semisupervised SVMs (EPS3VM) to address remote sensing image classification. The key characteristics of this approach are: 1) a self-adaptive mutation particle swarm optimizer is introduced to improve the generalization performance of the SVM classifier (PSVM); 2) a self-training semisupervised learning method that leverages large amounts of relatively inexpensive unlabeled data is used to produce a number of semisupervised classifiers (PS3VM), which are then combined by weighted voting to improve the generalization ability of the classification model. In order to reduce the impact of incorrect labels, the Gustafson-Kessel fuzzy clustering algorithm (GKclust) is used to select useful points from the unlabeled set. The effectiveness of the proposed approach is demonstrated by identifying different land cover regions in multispectral remote sensing imagery, and the performance of EPS3VM is compared with PSVM and PS3VM in terms of classification accuracy and kappa coefficient. On average, the EPS3VM model yields an overall accuracy of 96.88% against 88.48% for PSVM and outperforms PS3VM in overall accuracy by about 5%. The obtained results confirm the effectiveness and robustness of the EPS3VM approach for remote sensing land cover classification.
Research of Mining Word Category Knowledge Based on CABOSFV
WANG Dong-bo and ZHU Dan-hao
Computer Science. 2013, 40 (7): 211-215. 
According to the syntactic function distribution of Chinese words, this paper constructed a syntactic function distribution knowledge base based on the Tsinghua 973 treebank. Chinese word category knowledge was then mined by applying CABOSFV (Clustering Algorithm Based On Sparse Feature Vectors) to the syntactic function distribution knowledge base, and the resulting Chinese word categories were analyzed one by one.
Fast Approach to Mutual Information Based Gene Selection with Fuzzy Rough Sets
XU Fei-fei,WEI Lai,DU Hai-zhou and WANG Wen-huan
Computer Science. 2013, 40 (7): 216-221. 
Feature selection is an essential step in cancer classification with DNA microarrays. Rough set theory has already been successfully applied to gene selection. To avoid the information loss caused by discretizing continuous gene expression data in rough set theory, the theory of fuzzy rough sets is applied to gene selection. A fuzzy rough attribute reduction algorithm based on mutual information was proposed and applied to gene selection. Because the computational cost of the algorithm is too high when the number of selected genes is large, this paper proposed an approximate replacement for the mutual information computation, from the viewpoints of both maximum relevance and maximum significance, which improves efficiency and decreases complexity. Extensive experiments were conducted on three public gene expression datasets, and the results confirm the efficiency and effectiveness of the algorithm.
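The max-relevance idea can be illustrated by the plain mutual-information filter sketched below; it is only a baseline ranking, not the fuzzy-rough reduction with approximate relevance/significance computation proposed in the paper, and the toy data are assumed.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def rank_genes_by_relevance(X, y, top_k=20):
    """Max-relevance gene ranking: score each gene by its mutual information with the
    class label and keep the top_k genes."""
    mi = mutual_info_classif(X, y, random_state=0)
    order = np.argsort(mi)[::-1]
    return order[:top_k], mi[order[:top_k]]

# Usage with a toy expression matrix: 30 samples x 200 genes, binary class labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 200))
y = rng.integers(0, 2, 30)
X[:, 5] += 3 * y                          # make gene 5 informative about the label
idx, scores = rank_genes_by_relevance(X, y, top_k=5)
print(idx, np.round(scores, 3))           # gene 5 should rank near the top
```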
Question-Answering System Based on OWL Knowledge:Agile
ZHAO Xi-qing,ZHANG Li-ming and GAO Ming-xia
Computer Science. 2013, 40 (7): 222-225. 
Existing question-answering (QA) systems for the Web focus on textual sources. The Web Ontology Language (OWL) was recommended by the W3C in 2004 as the standard for knowledge representation and exchange on the Internet, so QA based on OWL knowledge has become an important research topic. A QA system named Agile was presented in this paper, and detailed schemes for formulating questions and indexing OWL were described. In order to acquire mapping units, Agile defines two data structures for formulating questions and OWL and formats them with existing natural language processing technology and OWL parsing methods. Agile is automatic and can deal with more kinds of questions than previous OWL-based QA systems.
Human-computer Hybrid Algorithm and its Application in Constrained Layout
CAO Juan,ZHANG Ying-chun and ZHAO Ling
Computer Science. 2013, 40 (7): 226-228. 
Complex engineering layout design is a typical constrained layout optimization problem with behavioral constraints (inertia, balance, stability, vibration, etc.) and is difficult to solve. Aiming at this problem, the paper put forward a human-computer hybrid algorithm that combines the layout strategies of human designers with the artificial bee colony optimization algorithm. The resulting human-computer interactive hybrid algorithm is used to solve practical engineering layout problems, so that the respective strengths of human and computer can be exploited to the utmost. Example tests and experimental comparison show that the proposed algorithm is feasible and effective.
Design of Multi-faults Diagnostic System
ZHANG Xue-nong,CHEN Ai-xiang and ZHANG Li-cheng
Computer Science. 2013, 40 (7): 229-231. 
This paper presented a framework of diagnostic models and diagnostic systems and discussed two important properties, soundness and completeness, that describe the power of a diagnostic system. To design a diagnostic system, a set of models is selected to be tested such that a diagnostic system based on these tests is sound and complete, and then tests are designed for all selected models to obtain the diagnostic system. Finally, the equivalence between the test-based diagnostic system and the model-based diagnostic system was proved.
Fault Data Optimization Mining Algorithm Based on Theory of Prediction Decision Homomorphism
LU Qing-mei and CHU Yu-xiao
Computer Science. 2013, 40 (7): 232-235. 
In large intelligent mechanical equipment environments, fault data grows increasingly varied and forms a strongly redundant, noisy environment in which mining is time-consuming because of the presence of unassociated rules. On the basis of a thorough study of association mining algorithms, a strongly-redundant-data mining algorithm based on the theory of prediction decision homomorphism was proposed. It constructs homomorphism intervals from the redundancy of the data with a penalty factor, constrains the correlation of the highly redundant associated data to these intervals so that related data stay close to the homomorphism interval, and within the nearby interval uses prediction-based decision operations to make the final fault confirmation. Experiments show that the method improves the accuracy of fault data mining in redundant environments, its computational cost is not high, and it has good robustness.
Application of Mind Evolution Based Ant Colony Algorithm in Typical Production Scheduling
WEI Xian-min
Computer Science. 2013, 40 (7): 236-238. 
Aiming at solving NP-hard workshop production scheduling problems, this paper proposed an ant colony algorithm based on mind evolution. The algorithm builds on the traditional ant colony algorithm; the combination of the evolutionary idea with local optimization overcomes the tendency of the basic ant colony algorithm to fall into local optima, improves the state transition rules, defines a pheromone range, improves the pheromone update strategy, and adds neighborhood search. Experimental results show that, for typical production scheduling problems, the mind-evolution-based ant colony algorithm can obtain the theoretical optimal solution, and in terms of the best solution, the obtained solutions and the average value it outperforms the basic ant colony algorithm, showing good performance.
Feature Gene Selection Based on Improved Binary Particle Swarm Optimization Algorithm and its Application in Detection of Colon Cancer
CHAI Xin,SUN Jing-yao,GUO Lei and WU You-xi
Computer Science. 2013, 40 (7): 239-243. 
In order to avoid the local optima of the binary particle swarm optimization algorithm, an improved binary particle swarm optimization (IBPSO) algorithm was presented. In this approach, crossover and mutation strategies are introduced to increase the diversity of the population and avoid premature convergence of the particles, and vaccine extraction, vaccination and immune selection are used to realize a vaccine mechanism that controls population degradation. In order to reduce the number of tumor features, the Wilcoxon test is used to remove useless genes, and the IBPSO algorithm is used to optimize the feature subset and the parameters of the support vector machine (SVM). Finally, the method was applied to detect the key genes of a colon cancer dataset. The experimental results show that the approach achieves higher classification accuracy with a smaller feature subset than several other approaches and that the selected genes are proven to be disease-causing, which verifies the correctness and effectiveness of the approach.
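A generic binary-PSO update step for such a feature-selection encoding is sketched below; the parameter values are assumptions, and the crossover, mutation and immune (vaccination) operators that distinguish IBPSO are not included.

```python
import numpy as np

def bpso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5, vmax=4.0, rng=None):
    """One velocity/position update of binary PSO: velocities follow the usual PSO rule and
    each bit is resampled with probability sigmoid(velocity). Generic BPSO only."""
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)
                  + c2 * r2 * (gbest - positions))
    velocities = np.clip(velocities, -vmax, vmax)
    prob = 1.0 / (1.0 + np.exp(-velocities))              # sigmoid transfer function
    positions = (rng.random(positions.shape) < prob).astype(int)
    return positions, velocities

# Usage: 10 particles selecting among 50 candidate genes (1 = gene kept in the subset).
rng = np.random.default_rng(0)
pos = rng.integers(0, 2, (10, 50))
vel = np.zeros((10, 50))
pbest, gbest = pos.copy(), pos[0].copy()
pos, vel = bpso_step(pos, vel, pbest, gbest, rng=rng)
print(pos.shape, pos.sum())
```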
Highly Efficient and Dynamic Binary Searcher Based on Node Group
ZHANG Zhao-xia,HAN Su-qing and QI Hui
Computer Science. 2013, 40 (7): 244-247. 
Abstract PDF(436KB) ( 514 )   
References | RelatedCitation | Metrics
Based on the analysis and summary of several improved binary search algorithms, we put forward a more efficient, dynamic binary searcher based on node groups, which improves not only the search efficiency but also the storage structure. The algorithm supports dynamic real-time search and, in particular, allows a whole group of elements to be inserted or deleted. In addition, the scheme is clearly better than previous binary search algorithms when searching large amounts of data.
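The following minimal sketch is only an illustrative approximation of the node-group idea: elements are kept in sorted blocks, binary search first locates the block by its maximum and then the element within it, and whole batches of elements can be inserted or deleted. The class name, group size and splitting rule are assumptions, not the authors' data structure.

import bisect

class GroupedSearcher:
    def __init__(self, group_size=64):
        self.group_size = group_size
        self.groups = [[]]                 # sorted blocks, globally ordered

    def _locate(self, key):
        """Binary search over group maxima to find the group a key belongs to."""
        if not self.groups[0]:
            return 0
        maxima = [g[-1] for g in self.groups]
        gi = bisect.bisect_left(maxima, key)
        return min(gi, len(self.groups) - 1)

    def search(self, key):
        g = self.groups[self._locate(key)]
        i = bisect.bisect_left(g, key)
        return i < len(g) and g[i] == key

    def insert_group(self, items):
        """Insert a batch of elements; split groups that grow too large."""
        for key in sorted(items):
            gi = self._locate(key)
            bisect.insort(self.groups[gi], key)
            if len(self.groups[gi]) > self.group_size:
                g = self.groups[gi]
                self.groups[gi:gi + 1] = [g[:len(g) // 2], g[len(g) // 2:]]

    def delete_group(self, items):
        """Delete a batch of elements; drop groups that become empty."""
        for key in items:
            gi = self._locate(key)
            g = self.groups[gi]
            i = bisect.bisect_left(g, key)
            if i < len(g) and g[i] == key:
                g.pop(i)
            if not g and len(self.groups) > 1:
                self.groups.pop(gi)

s = GroupedSearcher()
s.insert_group(range(0, 1000, 3))
print(s.search(300), s.search(301))   # True False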
Heuristic Algorithm for Solving Mixed Load School Bus Routing Problem
DANG Lan-xue,WANG Zhen,LIU Qing-song and KONG Yun-feng
Computer Science. 2013, 40 (7): 248-253. 
Abstract PDF(522KB) ( 450 )   
References | RelatedCitation | Metrics
The mixed load school bus routing problem (SBRP) for multiple schools in a city or county, which allows students from different schools to ride the same bus at the same time, aims to reduce the number of school buses needed and the total operational cost. Several algorithms for solving mixed load SBRP have been developed since the 1990s. However, due to the complexity of mixed load SBRP, these approaches do not fully explore the neighborhood solutions. This paper proposed a new heuristic method for mixed load SBRP that uses a record-to-record travel (RRT) algorithm and neighborhood operators from the pickup and delivery problem with time windows (PDPTW) to minimize the number of required buses. Test results show the effectiveness of the algorithm. Compared with existing SBRP solutions, it expands the neighborhood search strategy and is capable of finding better solutions that use fewer buses.
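A minimal sketch of the record-to-record travel acceptance rule on a generic routing cost is given below: a neighboring solution is accepted whenever its cost does not exceed the best (record) cost found so far by more than a fixed deviation. The toy cost function, neighborhood move and parameter values are illustrative assumptions, not the paper's SBRP operators.

import random

def rrt_search(initial, cost, neighbor, deviation=0.05, iters=10000):
    current = initial
    record = cost(initial)                  # best (record) cost so far
    best = initial
    for _ in range(iters):
        cand = neighbor(current)
        c = cost(cand)
        if c <= record * (1 + deviation):   # RRT acceptance rule
            current = cand
            if c < record:
                record, best = c, cand
    return best, record

# toy demonstration: minimize the route length of a permutation of points on a line
points = [random.uniform(0, 100) for _ in range(30)]

def tour_cost(perm):
    return sum(abs(points[perm[i]] - points[perm[i + 1]]) for i in range(len(perm) - 1))

def swap_neighbor(perm):
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

best, c = rrt_search(list(range(len(points))), tour_cost, swap_neighbor)
print(round(c, 2))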
Combination Prediction Model of Mooring Load Based on Wavelet Analysis Method and Neural Network
ZHENG Jian,BAI Xiang-en,XIAO Ying-jie and ZHANG Hao
Computer Science. 2013, 40 (7): 254-257. 
Abstract PDF(315KB) ( 310 )   
References | RelatedCitation | Metrics
To achieve high-precision prediction of mooring load data, a combination forecasting algorithm based on wavelet analysis and a BP neural network was proposed. Wavelet analysis is used to perform multi-scale decomposition and reconstruction of the original non-stationary mooring load series, yielding a set of smoother sub-series. A BP neural network prediction model is then built for each layer to carry out the forecast. Simulation results show that the combination algorithm attains high-precision forecasts, benefits from both the decomposition ability of wavelet analysis and the self-learning ability of the neural network, and can meet the requirements of short-term early warning of mooring load.
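The sketch below is one common way to realize such a decompose-predict-combine scheme: decompose a series with a discrete wavelet transform, reconstruct one sub-series per coefficient level, fit a small regressor per sub-series on lagged windows, and sum the one-step forecasts. The toy series, wavelet, lag length and the use of an MLP in place of a hand-built BP network are illustrative assumptions.

import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 512)) + 0.1 * rng.standard_normal(512)  # toy "load" series
LAG = 8

# multi-scale decomposition, then one reconstructed sub-series per coefficient level
coeffs = pywt.wavedec(series, "db4", level=3)
sub_series = []
for k in range(len(coeffs)):
    kept = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
    sub_series.append(pywt.waverec(kept, "db4")[:len(series)])

# fit one small network per sub-series on lagged windows and sum the one-step forecasts
prediction = 0.0
for s in sub_series:
    X = np.array([s[i:i + LAG] for i in range(len(s) - LAG)])
    y = s[LAG:]
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
    prediction += model.predict(s[-LAG:].reshape(1, -1))[0]

print(prediction)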
Image Understanding Model Based on Local Visual Perception and Semantic Association
ZHOU Hai-ying and MU Zhi-chun
Computer Science. 2013, 40 (7): 258-261. 
Abstract PDF(787KB) ( 395 )   
References | RelatedCitation | Metrics
On the basis of image information from visual perception, a model integrating bottom-up visual search and top-down semantic determination was proposed, establishing a perceptual link between image regions or blocks and image semantics or categories. By scanning and searching image regions, simulated eye movements shift the focus of attention to index and memorize content of interest, so that associations are produced under the drive of task events. The model blends the two aspects of visual perception and semantic interpretation in image understanding, which conforms to human cognitive laws.
Fast Implementation of Bilateral Filtering with Identification Point in Pixel Layer
ZHENG Li-ping,LI Jun-qing,YU Cheng-min and ZHANG Min
Computer Science. 2013, 40 (7): 262-265. 
Abstract PDF(585KB) ( 404 )   
References | RelatedCitation | Metrics
Bilateral filtering is a technique that removes image noise while effectively preserving edges. The naive implementation of the bilateral filter can be extremely slow, since its time complexity is high. Based on the concepts of an approximation layer and an intensity layer, an improved bilateral filter was proposed. The improved algorithm uses identification points and pixel layers to realize bilateral filtering and is therefore called Identification Bilateral Filtering (IBF). First, the pixel layers are specified according to gray-level differences; then an identification point is chosen in each pixel layer and the filtering value of every pixel layer is computed at its identification point. Finally, linear interpolation is used to compute the filtering value of each pixel and output the filtered image. Gray and color images were taken as research objects in the experiments. The experimental results show that the IBF algorithm runs in a short time and achieves good filtering results.
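The following minimal sketch illustrates the general layered acceleration idea: a spatially smoothed response is computed at a few identification intensity levels and each pixel's output is linearly interpolated between its two nearest levels. It assumes NumPy/SciPy and a grayscale float image, and follows the generic piecewise-linear scheme rather than the authors' exact IBF.

import numpy as np
from scipy.ndimage import gaussian_filter

def layered_bilateral(img, sigma_s=3.0, sigma_r=0.1, n_layers=8):
    img = img.astype(np.float64)
    levels = np.linspace(img.min(), img.max(), n_layers)      # identification intensities
    responses = []
    for level in levels:
        w = np.exp(-0.5 * ((img - level) / sigma_r) ** 2)     # range weight at this level
        num = gaussian_filter(w * img, sigma_s)               # spatial smoothing
        den = gaussian_filter(w, sigma_s)
        responses.append(num / np.maximum(den, 1e-12))
    responses = np.stack(responses)                           # (n_layers, H, W)
    # linear interpolation between the two nearest levels for every pixel
    idx = np.clip(np.searchsorted(levels, img) - 1, 0, n_layers - 2)
    lo, hi = levels[idx], levels[idx + 1]
    t = np.clip((img - lo) / np.maximum(hi - lo, 1e-12), 0.0, 1.0)
    rows, cols = np.indices(img.shape)
    return (1 - t) * responses[idx, rows, cols] + t * responses[idx + 1, rows, cols]

noisy = np.clip(np.random.default_rng(0).normal(0.5, 0.1, (128, 128)), 0, 1)
smoothed = layered_bilateral(noisy)
print(smoothed.shape)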
Novel Framework for Multi-view Object Detection through Combining Multiple Classifiers
YIN Wei-chong and LU Tong
Computer Science. 2013, 40 (7): 266-269. 
Abstract PDF(339KB) ( 786 )   
References | RelatedCitation | Metrics
We proposed a novel framework for detecting generic objects from arbitrary viewpoints described by varied object appearances. Our key insight is to exploit the multi-view detection patterns established by a number of detectors from different viewpoints and their relationships through the Multi-View Detector Sphere (MVDS), reflecting the underlying intrinsic structure for detecting multi-view objects. We first modeled the annotated objects from different viewpoints, and then triangulated the sphere into a number of uniformly distributed meshes to represent the explicit correspondences across view detectors. As a result, multi-view objects from untrained viewpoints can be detected by combining the outputs of the adjacent view detectors on the sphere. Our experiments on several public datasets give promising results for the experimental object classes.
Improved SIFT Algorithm
WU Jian and MA Yue
Computer Science. 2013, 40 (7): 270-272. 
Abstract PDF(247KB) ( 680 )   
References | RelatedCitation | Metrics
Feature extraction is an important technology in digital image processing and computer vision, and constructing image feature points with a feature descriptor is a crucial step in image feature extraction and image registration. The SIFT feature point detection operator is invariant to translation, rotation and scaling, so it is widely used in image registration. We mainly improved the 64-dimensional descriptor based on the SIFT features. Simulation results show that the improved algorithm achieves higher accuracy than the original algorithm while reducing the time complexity.
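For reference, a minimal sketch of the baseline SIFT detection-and-matching pipeline that the paper improves on is given below, assuming OpenCV 4 (cv2.SIFT_create is available in opencv-python 4.4 or later). The synthetic image pair stands in for real inputs, and the proposed 64-dimensional descriptor itself is not reproduced.

import cv2
import numpy as np

# synthetic test pair: a textured image and a shifted copy (stand-ins for real inputs)
rng = np.random.default_rng(0)
img1 = (rng.random((256, 256)) * 255).astype(np.uint8)
img1 = cv2.GaussianBlur(img1, (5, 5), 0)
img2 = np.roll(img1, (10, 15), axis=(0, 1))

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints + 128-D descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]      # Lowe's ratio test
print(len(kp1), len(kp2), len(good))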
Mutual Information Medical Image Registration Based on Firefly Algorithm
DU Xiao-gang,DANG Jian-wu,WANG Yang-ping,LIU Xin-guo and LI Sha
Computer Science. 2013, 40 (7): 273-276. 
Abstract PDF(557KB) ( 372 )   
References | RelatedCitation | Metrics
To address the problem that the objective function in mutual information registration easily falls into local optima because of its many local extrema, a mutual information medical image registration algorithm based on the firefly algorithm was put forward. Normalized mutual information is used as the similarity measure, and the registration parameters are encoded as the locations of fireflies. The mutual information value at each firefly's location is taken as its brightness, and the best registration parameters are retrieved by iteratively updating brightness and attractiveness until the mutual information reaches its maximum. Experimental results indicate that the algorithm effectively overcomes the tendency of the mutual information function to fall into local optima, and the precision of the registration result is obviously improved.
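The sketch below illustrates the firefly search over registration parameters for a translation-only toy case, with normalized mutual information as brightness. The synthetic images, parameter ranges and firefly constants are illustrative assumptions; the paper's method applies to real medical images and richer transforms.

import numpy as np

rng = np.random.default_rng(0)

def nmi(a, b, bins=32):
    """Normalized mutual information of two equally sized images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    return (hx + hy) / hxy

def translate(img, tx, ty):
    """Integer circular shift used as a stand-in for a real resampling transform."""
    return np.roll(img, (int(round(ty)), int(round(tx))), axis=(0, 1))

# smooth synthetic "fixed" image and a shifted "moving" image
xx, yy = np.meshgrid(np.linspace(-3, 3, 64), np.linspace(-3, 3, 64))
fixed = np.exp(-(xx ** 2 + yy ** 2)) + 0.5 * np.exp(-2 * ((xx - 1.5) ** 2 + (yy + 1) ** 2))
moving = translate(fixed, -5, 3)

N, ITERS = 15, 60
BETA0, GAMMA, ALPHA = 1.0, 0.01, 0.3
pos = rng.uniform(-10, 10, (N, 2))                    # candidate (tx, ty) parameters
bright = np.array([nmi(fixed, translate(moving, *p)) for p in pos])

for _ in range(ITERS):
    for i in range(N):
        for j in range(N):
            if bright[j] > bright[i]:                 # dimmer firefly moves toward brighter one
                beta = BETA0 * np.exp(-GAMMA * np.sum((pos[i] - pos[j]) ** 2))
                pos[i] += beta * (pos[j] - pos[i]) + ALPHA * rng.uniform(-0.5, 0.5, 2)
                bright[i] = nmi(fixed, translate(moving, *pos[i]))

print(np.round(pos[np.argmax(bright)]))               # ideally close to (5, -3)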
Edge Description and Matching Algorithm for Different-source Images
ZHU Ying-hong,LI Jun-shan,YANG Wei,YANG Ya-wei and ZHU Yi-juan
Computer Science. 2013, 40 (7): 277-279. 
Abstract PDF(788KB) ( 383 )   
References | RelatedCitation | Metrics
A point matching algorithm based on the edges around key-point regions was proposed to address the problem of matching IR and visible images. Firstly, feature points are extracted by the CSS corner detector and the edges of the key-point regions are reconstructed. Secondly, the normal direction of each feature point on the curve is adopted as the point's main direction, making the descriptor rotation invariant. Thirdly, by calculating a B-LBP weighted histogram in each interest point's neighborhood, the nearest feature point on the same edge is found for each extracted point, and histograms of the edge pixels of the two key-point regions are constructed. A 512-dimensional UB-LBP joint descriptor combining the two histograms is then constructed and normalized. Finally, feature matching is realized via the nearest neighbor algorithm. Experimental results show that the proposed algorithm matches feature points in IR and visible images more efficiently than the original SIFT.
Grain Classification Based on Edge Feature
LIU Chun-li and ZHANG Gong
Computer Science. 2013, 40 (7): 280-282. 
Abstract PDF(464KB) ( 327 )   
References | RelatedCitation | Metrics
This paper proposed an object classification approach based on edge detection. Firstly, a dynamic edge detection algorithm is used to extract the grain edge; secondly, the distance between each edge point and the center of gravity is computed, giving a vector that serves as the primitive feature, which is then normalized in amplitude and length to form the final feature. These features are used to train an SVM classifier. Finally, simulation experiments on grain images show that the proposed method efficiently extracts the edge feature and achieves a higher classification accuracy.
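As a rough sketch of the edge-to-centroid distance signature feeding an SVM, the code below builds the feature from synthetic binary shapes standing in for grain images, and uses plain Canny edge detection in place of the paper's dynamic edge detector; all image data, thresholds and the OpenCV 4 return signature of findContours are assumptions.

import cv2
import numpy as np
from sklearn.svm import SVC

def shape_signature(gray, n_points=64):
    """Distances from edge points to the edge's center of gravity, resampled to a
    fixed length and normalized in amplitude."""
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)
    center = contour.mean(axis=0)
    dist = np.linalg.norm(contour - center, axis=1)
    idx = np.linspace(0, len(dist) - 1, n_points).astype(int)   # length normalization
    sig = dist[idx]
    return sig / sig.max()                                      # amplitude normalization

def synthetic_grain(kind, rng):
    """Stand-in images: filled circles vs. squares acting as two 'grain' classes."""
    img = np.zeros((64, 64), np.uint8)
    if kind == 0:
        cv2.circle(img, (32, 32), int(rng.integers(15, 25)), 255, -1)
    else:
        s = int(rng.integers(12, 20))
        cv2.rectangle(img, (32 - s, 32 - s), (32 + s, 32 + s), 255, -1)
    return img

rng = np.random.default_rng(0)
X = np.array([shape_signature(synthetic_grain(k % 2, rng)) for k in range(40)])
y = np.array([k % 2 for k in range(40)])
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))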
On Selection and Parameter Calculation of Wave Spectra in Ocean Wave Rendering
CHEN Li-ning,JIN Yi-cheng,REN Hong-xiang and ZHANG Xiu-feng
Computer Science. 2013, 40 (7): 283-288. 
Abstract PDF(1193KB) ( 988 )   
References | RelatedCitation | Metrics
The structure of the Phillips wave spectrum was analyzed. The Phillips spectrum is a directional spectrum and can be separated into a frequency spectrum and a directional spreading function. Its frequency spectrum accords with the Neumann spectrum form and is similar to the P-M spectrum, while its directional spreading function is the one recommended by the International Towing Tank Conference. By referring to the P-M spectrum, the wind speed of the Phillips spectrum is specified and its constant is calculated, which resolves previously unsettled issues in its application. Comparing the rendering results of the PM-ITTC directional spectrum and the Phillips spectrum shows that long waves in the low-frequency band are more obvious in ocean waves rendered with the Phillips spectrum. In addition, the study shows that the peak spectral frequency of the Phillips spectrum is close to that of the PM-ITTC directional spectrum, but its spectral width is smaller and its wave energy is more concentrated in the low-frequency band. Fetch length is added as a parameter of the Phillips spectrum, so the rendering result can reflect the influence of both wind speed and fetch length on the rendered ocean waves. The propagation direction and wave height of the rendered waves vary with wind speed, which conforms to actual conditions in navigation. The method has been applied in a navigation simulator.
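For reference, the sketch below evaluates one commonly used form of the Phillips spectrum in ocean-wave rendering, P(k) = A * exp(-1/(kL)^2) / k^4 * |k_hat . w_hat|^2 with L = V^2 / g; the constant A and the wind parameters are illustrative placeholders rather than the values calibrated in the paper, and the fetch-length extension is not included.

import numpy as np

G = 9.81                                        # gravitational acceleration, m/s^2

def phillips(kx, ky, wind_speed=10.0, wind_dir=(1.0, 0.0), amplitude=1e-3):
    """Phillips spectrum value for wave-vector components (kx, ky)."""
    k = np.hypot(kx, ky)
    k = np.where(k == 0, 1e-12, k)              # avoid division by zero at k = 0
    L = wind_speed ** 2 / G                     # largest wave sustained by this wind
    w = np.asarray(wind_dir, dtype=float)
    w /= np.linalg.norm(w)
    cos_factor = (kx * w[0] + ky * w[1]) / k    # alignment of wave and wind direction
    return amplitude * np.exp(-1.0 / (k * L) ** 2) / k ** 4 * cos_factor ** 2

kx, ky = np.meshgrid(np.linspace(-1.0, 1.0, 5), np.linspace(-1.0, 1.0, 5))
print(phillips(kx, ky, wind_speed=12.0).round(6))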
Improved Skeleton Extraction Algorithm Based Active Contour Model Research
REN Shou-gang,MA Chao and XU Huan-liang
Computer Science. 2013, 40 (7): 289-292. 
Abstract PDF(872KB) ( 356 )   
References | RelatedCitation | Metrics
The active contour model is an effective image segmentation method, but few existing methods address how to determine its initial contour. This article therefore put forward an active contour model based on an improved skeleton extraction algorithm to solve this problem. The model creates an initial contour through the improved skeleton extraction algorithm and a contour repossession algorithm, and then evolves the contour towards the true object edge using an active contour model with a shape energy term to achieve the expected image segmentation. Example verification and comparison experiments show that the model segments the object region from the image more effectively and with good noise immunity. Compared with the traditional active contour, the model greatly improves segmentation accuracy.
Algorithm for Automatic Recognition of Red Tide Algal Images Captured by Flow Cytometry
XIE Jie-zhen,LUO Ting-wei,DAI Jun-wei,WANG Di,GAO Yan and RAN Sheng
Computer Science. 2013, 40 (7): 293-296. 
Abstract PDF(548KB) ( 458 )   
References | RelatedCitation | Metrics
Red tide is a global marine natural disaster. In order to predict and forecast the occurrence of red tide, a real-time harmful algae monitoring system was developed by combining flow cytometry, micro-imaging and image processing technologies. A method based on background subtraction is used to segment the algae images quickly and accurately. To cope with the morphological and individual differences of algal cells caused by different growth periods and environments, geometric features that are invariant to translation, rotation and scale, together with GLCM-based texture features, are extracted. Finally, a one-versus-one multi-class support vector machine is adopted for identification. The experimental results show that the average recognition accuracy is as high as 94.37%.
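A minimal sketch of GLCM texture statistics plus invariant shape features (Hu moments, used here as a stand-in for the paper's geometric features) feeding a one-versus-one SVM is shown below; the random images, labels and feature choices are hypothetical.

import numpy as np
import cv2
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def features(gray):
    """GLCM texture statistics plus Hu moments as simple invariant shape features."""
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p).mean()
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    hu = cv2.HuMoments(cv2.moments(gray)).ravel()
    return np.concatenate([texture, hu])

# hypothetical training data: random textures standing in for segmented algae images
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(20)]
labels = [i % 4 for i in range(20)]                       # four toy "species"

X = np.array([features(im) for im in images])
clf = SVC(decision_function_shape="ovo").fit(X, labels)   # one-versus-one multi-class SVM
print(clf.score(X, labels))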
Consistent View Construction Mechanism in Trustworthy and Controllable Network
LIU Ze-min
Computer Science. 2013, 40 (7): 297-301. 
Abstract PDF(524KB) ( 345 )   
References | RelatedCitation | Metrics
Multiple control nodes are used to cooperatively control an AS in a trustworthy and controllable network, and the views held by different control nodes may be inconsistent. To solve this problem, we proposed a consistent view construction mechanism in which a selection algorithm first generates a primary control node, and the primary control node is then responsible for organizing the other control nodes to build a consistent view for all view requests. This mechanism avoids inconsistency among the views of different control nodes. Simulation experiments verify that the proposed mechanism outperforms the prior view construction method in both time cost and communication overhead.
Filtering Method for Images Based on Adaptive Neuro-fuzzy Inference System
LUO Hai-chi,LI Yue-yang and SUN Jun
Computer Science. 2013, 40 (7): 302-306. 
Abstract PDF(864KB) ( 466 )   
References | RelatedCitation | Metrics
A neuro-fuzzy network approach to impulse noise filtering for gray-scale images was presented. The network is constructed by combining four neuro-fuzzy filters with a postprocessor, where each neuro-fuzzy filter is a first-order Sugeno-type fuzzy inference system with four inputs and one output. The proposed impulse noise filter has two modes of operation, namely training and testing (filtering). The experimental results demonstrate that the proposed filter not only attenuates noise but also preserves details well, and it significantly outperforms other conventional filters.
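A minimal sketch of the building block, a first-order Sugeno fuzzy inference system with four inputs and one output, is shown below; the membership parameters and rule consequents are untrained, illustrative values, and the paper's four-filter network, training procedure and postprocessor are not reproduced.

import numpy as np

# two Gaussian membership functions ("dark", "bright") per input
CENTERS = np.array([0.25, 0.75])
SIGMA = 0.2

def memberships(x):
    """Membership degrees of one input value in the two fuzzy sets."""
    return np.exp(-0.5 * ((x - CENTERS) / SIGMA) ** 2)

def sugeno_fis(inputs, consequents):
    """First-order Sugeno inference: each rule output is a linear function of the
    inputs, and the crisp output is the firing-strength-weighted average."""
    inputs = np.asarray(inputs, dtype=float)           # e.g. four neighbourhood pixels
    mf = np.array([memberships(x) for x in inputs])    # shape (4 inputs, 2 sets)
    outputs, weights = [], []
    for idx in np.ndindex(2, 2, 2, 2):                 # 2^4 = 16 rules
        w = np.prod([mf[i, s] for i, s in enumerate(idx)])   # product t-norm
        a = consequents[idx]                                 # [p1, p2, p3, p4, bias]
        outputs.append(a[:4] @ inputs + a[4])
        weights.append(w)
    weights = np.array(weights)
    return np.dot(weights, outputs) / weights.sum()

rng = np.random.default_rng(0)
consequents = rng.normal(size=(2, 2, 2, 2, 5))         # untrained rule consequents
print(sugeno_fis([0.2, 0.8, 0.5, 0.4], consequents))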
Algorithm on Strokes Separation for Chinese Characters Based on Edge
CHENG Li,WANG Jiang-qing,LI Bo,TIAN Wei,ZHU Zong-xiao,WEI Hong-yun and LIU Sai
Computer Science. 2013, 40 (7): 307-311. 
Abstract PDF(400KB) ( 1638 )   
References | RelatedCitation | Metrics
To extract the strokes of a Chinese character from its contour, the key lies in locating the intersection points of crossing strokes. An algorithm for separating strokes was proposed based on extracting the contour of the Chinese character image and detecting its feature points, and the algorithm was implemented as a VC program. Test results show that the algorithm is effective in extracting the strokes of printed Chinese characters and of handwritten Chinese characters without cursive strokes.
Research of Applying Streaming Media for Medical Image Based on Cloud Computing
ZHAO Mei-ze and DIAO Li-juan
Computer Science. 2013, 40 (7): 312-316. 
Abstract PDF(788KB) ( 359 )   
References | RelatedCitation | Metrics
This paper proposed a medical image streaming media transmission system based on cloud computing, which integrates medical devices and PACS. The system scans and converts images from different medical devices into digital images, which are submitted to remote servers in real time in the form of streaming media. On the virtual storage platform of the cloud database, advanced streaming media technology is adopted to solve the problems of sharing medical resources and continuously storing high-resolution images. The experimental results show that, compared with the transmission rates of traditional servers using FTP in a LAN (Local Area Network), TANet (Taiwan Academic Network) and the home network, the transmission rate of the proposed system is respectively 45.6%, 49.4% and 8.1% higher.