Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37, Issue 2, 2010
Survey of Sensor Networks Key Management and Authentication
ZHANG Guo-yin,SUN Rui-hua,MA Chun-guang,ZHU Hua-min
Computer Science. 2010, 37 (2): 1-6. 
Abstract PDF(605KB) ( 456 )   
Because of their flexibility and broad applicability, wireless sensor networks can be applied in many fields, such as civil, commercial and military applications. Owing to the characteristics of their nodes, sensor networks differ markedly from traditional networks, and key management and authentication are important for guaranteeing secure communication between nodes. Many key management and authentication schemes now exist; in this paper, the current schemes were classified and compared, and future research directions were pointed out.
Algorithms for Dominating Problems:A Survey
WANG Jian-xin,CHEN Bei-wei,CHEN Jian-er
Computer Science. 2010, 37 (2): 7-11. 
Abstract PDF(460KB) ( 640 )   
In complexity theory, the dominating problem is an important problem with applications in many fields, such as resource allocation, electric networks and wireless ad hoc networks. The two best-known dominating problems are the Vertex Dominating Set (VDS) and Edge Dominating Set (EDS) problems. Exact algorithms for the two problems have been designed and analyzed using dynamic programming and measure-and-conquer techniques, and many approximation algorithms for the EDS problem have been proposed by reducing it to the Edge Cover problem. Recently, the parameterized dominating problem has attracted considerable attention. The planar VDS and general EDS problems have been proved to be Fixed-Parameter Tractable (FPT), and techniques such as tree decomposition and branch-and-search have been used in designing FPT algorithms for them. The paper presented a classification for the two problems, and gave definitions and related algorithms for each derived problem. Furthermore, the Matrix Dominating Set problem and some research directions on the dominating problem were also introduced.
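For intuition about the vertex dominating problem discussed above, the classic greedy approximation can be sketched as follows. This is the well-known ln(n)-factor greedy, shown only as an illustration; it is not one of the exact or FPT algorithms the survey covers, and the adjacency-dictionary representation is our own choice.

```python
def greedy_dominating_set(adj):
    """Greedy vertex dominating set: repeatedly pick the vertex that
    dominates the most not-yet-dominated vertices (a vertex dominates
    itself and its neighbours). adj maps each vertex to its neighbour set."""
    undominated = set(adj)
    ds = []
    while undominated:
        # choose the vertex with the largest marginal coverage
        v = max(adj, key=lambda u: len(({u} | adj[u]) & undominated))
        ds.append(v)
        undominated -= {v} | adj[v]
    return ds
```

On a star graph the greedy immediately picks the centre; on a path it picks interior vertices, yielding a valid (though not always minimum) dominating set.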
Survey on the Research of Network Visualization
SUN Yang,JIANG Yuan-xiang,ZHAO Xiang,XIAO Wei-dong
Computer Science. 2010, 37 (2): 12-18. 
Abstract PDF(723KB) ( 1044 )   
The traditional way of depicting network data, text plus data sheets, cannot meet the requirements of analyzing and managing ever-larger network datasets. As an effective tool for helping users understand network structure and mine implicit information, network visualization techniques are now widely applied. This paper first summarized the tasks of network visualization. After a review of work on graph-drawing aesthetics, the rationale and characteristics of representative methods in each category of a taxonomy of network visualization were elaborated. The strategies for addressing the scalability of network visualization were then recapitulated. A comparison of the techniques in each category was presented at the end, followed by an outlook on future work and the challenges network visualization faces.
Overview on Target Tracking Methods in Binary Wireless Sensor Networks
SUN Xiao-yan,LI Jian-dong,HUANG Peng-yu,CHEN Ting
Computer Science. 2010, 37 (2): 19-22. 
Abstract PDF(460KB) ( 556 )   
Wireless sensor networks are one of the new technologies of recent years, and target tracking in such networks is one of the research emphases. Binary sensor networks are becoming more attractive due to their low-cost deployment and the small size, low energy consumption, and simple operation and data communication of the sensors; they are therefore considered for target tracking. The methods available for target tracking in binary wireless sensor networks can be classified into two categories: piecewise-linear-approximation-based methods and particle-filter-based methods. The current research results were analytically presented, and future work was also pointed out.
Overview on RRM Based on BoD in Broadband Multimedia Satellite Systems
QIN Yong,ZHANG Jun,ZHANG Tao
Computer Science. 2010, 37 (2): 23-30. 
Abstract PDF(806KB) ( 1005 )   
Radio Resource Management (RRM) is an important research topic in broadband multimedia satellite communication. Effectively handling the bursty characteristics of multimedia traffic to balance traffic QoS against resource utilization is the main challenge for RRM. RRM based on Bandwidth on Demand (BoD) is an effective approach to this challenge and has become a research hotspot. By analyzing the framework of BoD-based RRM for broadband satellite systems, this article decomposed the framework into five main functional parts, and mapped the existing research achievements onto these five parts. The classification and review of the various methods for each functional part were the main focus, and the characteristics and shortcomings of each class of algorithm were analyzed. Finally, suggestions on further research directions were proposed.
Survey of the Cross-layer Approaches of P2P over MANET
QU Da-peng, WANG Xing-wei,HUANG Min
Computer Science. 2010, 37 (2): 31-37. 
Abstract PDF(755KB) ( 514 )   
Mobile ad hoc networks (MANET) and peer-to-peer (P2P) systems share a common underlying paradigm: self-organization. However, the two fields had long been researched independently. In recent years, with the success of P2P over the Internet and the rapid development of wireless technology, P2P systems began to be applied to MANET. Because of limited resources and the dynamic topology, the performance of applying P2P directly over MANET is poor, and many cross-layer approaches have been proposed to improve it. In this paper, after analyzing the similarities and differences between P2P and MANET, existing cross-layer approaches were compared in terms of their characteristics and classification. Finally, the important features that good approaches possess, together with future research strategies and trends, were summarized.
Survey on Progressive Transmission Strategy of Large-scale Virtual Scenes
WANG Wei,JIA Jin-yuan,ZHANG Chen-xi,JIANG Yin
Computer Science. 2010, 37 (2): 38-43. 
Abstract PDF(710KB) ( 532 )   
With the rapid growth of 3D virtual scenes on the Internet, the contradiction between users' demand for fast downloading and limited bandwidth is becoming more serious. Many researchers have been working on fast downloading of large-scale virtual scenes from the Internet via progressive transmission strategies, so an overview of existing work is necessary. This paper first introduced the principle and main processes of progressive transmission, with a formula to evaluate different transmission strategies, and then described progressive transmission strategies for downloading large-scale virtual scenes from four aspects: (1) determination of the PVS (Potential Visible Scenes) and the incremental PVS; (2) simplification techniques and streaming encoding of 3D object models; (3) transmission mechanisms of virtual scenes in different DVEs (Distributed Virtual Environments); (4) policies for prefetching virtual scenes. A technical outlook on progressive transmission of large-scale virtual scenes on the Internet was given at the end.
Non-cluster Based Topology Control Method in Wireless Sensor Networks
ZHANG Wen-zhu,LIU Jia,ZHANG Lin,YUAN Jian,SHAN Xiu-ming
Computer Science. 2010, 37 (2): 44-47. 
Abstract PDF(324KB) ( 436 )   
Since sensors are constrained by limited energy and small communication radii, topology control is a primary problem in wireless sensor network engineering. We proposed a cellular-automata-based model for the topology control problem. Unlike traditional cluster-based methods, our approach maintains a longer system lifetime at the cost of a small reduction in coverage and connectivity rates. We found that the nodal state transition rule plays a key role in the system's topological characteristics, and that the stable patterns under specific rules meet the requirements of topology control in sensor networks well. We further discussed the feasibility of applying this mechanism in engineering design.
Self-feedback Fault Detection Algorithm for Peer-to-Peer Storage System
WAN Ya-ping,FENG Dan,OUYANG Li-jun,LIU Li,YANG Tian-ming
Computer Science. 2010, 37 (2): 48-52. 
Abstract PDF(531KB) ( 684 )   
Peer-to-Peer (P2P) storage systems have many attractive properties, such as self-organization, scalability and fault tolerance, and fault detection is one of the basic components in building a reliable P2P storage system. To address the highly dynamic behavior of system nodes, a self-tuning heartbeat fault detection algorithm was designed, which combines a heartbeat strategy with an unbiased grey prediction model to improve the fault detection quality of service (QoS) according to application needs and changes in the network environment. The results show that fault detectors implementing this algorithm achieve better performance.
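The grey-prediction idea in the abstract can be sketched with a plain GM(1,1) one-step forecast used to tune the next heartbeat timeout. This is a minimal sketch: the function names and the safety margin are our assumptions, and the paper's unbiased variant differs in its parameter estimation.

```python
import math

def gm11_predict(x0):
    """Plain GM(1,1) one-step forecast of the next value in the series
    x0 (e.g. recent heartbeat inter-arrival times)."""
    n = len(x0)
    # 1-AGO: accumulated generating sequence
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # background values: adjacent means of the accumulated sequence
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # least squares for a, b in the grey equation x0[k] = -a*z[k] + b
    szz = sum(v * v for v in z); sz = sum(z); sy = sum(y)
    szy = sum(v * w for v, w in zip(z, y)); m = len(z)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # time-response function, then de-accumulate to get the forecast
    f = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return f(n) - f(n - 1)

def next_timeout(intervals, margin=1.2):
    """Self-tuning heartbeat timeout: predicted next interval x margin
    (the margin value is our assumption, not the paper's)."""
    return gm11_predict(intervals) * margin
```

On a roughly exponential series such as [1, 2, 4, 8] the model forecasts a next value near 14, and the timeout is scaled above that prediction.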
Efficient Roaming Authentication with Anonymity Protocol for Wireless Vehicle Mesh Networks
WU Xiu-qiang,MA Hua,ZHANG Wei-dong,PEI Qing-qi
Computer Science. 2010, 37 (2): 53-55. 
Abstract PDF(350KB) ( 505 )   
Due to their characteristics, wireless mesh networks face more security challenges than traditional networks, and their security mechanisms should take into account both security and the application environment. Anonymous authentication and key negotiation during user roaming are an elementary protocol in constructing a secure network system and the implementation foundation of secure routing protocols. When users are authenticated in wireless mesh networks, protection of their identity information is very important, yet anonymous authentication and key negotiation for roaming users in wireless mesh networks have been studied very little. Motivated by these concerns, and with full consideration of the attack methods against wireless networks and the characteristics of wireless mesh networks, an efficient roaming authentication protocol with anonymity for wireless vehicle mesh networks was proposed, adopting identity-based public key cryptography and the Diffie-Hellman algorithm. Analysis shows that this protocol can guarantee the security of wireless mesh networks and, at the same time, meet the special application environment of wireless vehicle mesh networks.
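The Diffie-Hellman ingredient of such a protocol can be sketched as below. This is not the paper's protocol: the toy group parameters, the function names, and the hash binding of the two identities (standing in for the ID-based construction) are all our assumptions.

```python
import hashlib
import secrets

# Toy group: a Mersenne prime and a small generator, for illustration only;
# a real deployment would use a standardized group of at least 2048 bits.
P = 2**127 - 1
G = 5

def dh_keypair():
    """Generate an ephemeral Diffie-Hellman key pair (private, public)."""
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

def session_key(own_priv, peer_pub, roamer_id, visited_id):
    """Derive a shared session key and bind both identities into it,
    so the roamer and the visited network agree on who is talking."""
    shared = pow(peer_pub, own_priv, P)
    material = shared.to_bytes((P.bit_length() + 7) // 8, "big")
    return hashlib.sha256(material + roamer_id + visited_id).hexdigest()
```

Both sides derive the same key from their own private value and the peer's public value, and any change to either identity string changes the derived key.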
Optimal Search Flooding Protocol for Wireless Sensor Networks
YU Qin,WANG Wei-dong,QIN Zhi-guang,MAO Yu-ming
Computer Science. 2010, 37 (2): 56-60. 
Abstract PDF(439KB) ( 441 )   
Flooding is a fundamental methodology in wireless sensor networks: applications such as topology formation, route establishment, target detection and data query often rely on it. Most previous research on efficient flooding strategies focuses on producing an optimal broadcast tree, under the assumption that communication between all nodes is reliable. However, in real-world wireless sensor networks this aim, like the assumption, may be unrealistic. This paper proposed an efficient flooding protocol, the Optimal Search Flooding Protocol (OSFP), for applications such as target detection in sensor networks. OSFP combines a clustering methodology with optimal search theory to achieve good performance, and can be applied not only to networks with perfect data transmission but also to networks with imperfect transmission. Simulations show that OSFP finds the target with maximum probability and does not increase the search cost compared with other flooding protocols.
Novel Dynamic Security Analysis Model for Computing System Based on DBN
ZHAO Feng,ZHANG Qin,LI Min
Computer Science. 2010, 37 (2): 61-64. 
Abstract PDF(339KB) ( 445 )   
In recent years, computing system vulnerability analysis has attracted more and more researchers and become a hot spot in the field of system security. With the emergence of multi-core technology, computing systems are becoming more open and dynamic. An attack-graph-based dynamic security analysis model was proposed, which can measure the combined effect of dynamic computing system vulnerabilities. An improved attack graph generation algorithm was also presented to improve performance and simplify further security analysis by system administrators. Moreover, an example based on a virtual computing system illustrates the analysis process of the proposed method and validates its efficiency and performance. The experimental results show that our method is an effective approach to dynamic system security risk analysis.
Research and Implementation of Distributed Index Based on DHT
WU Wei,SU Yong-hong,LI Rui-xuan,LU Zheng-ding
Computer Science. 2010, 37 (2): 65-70. 
Abstract PDF(576KB) ( 781 )   
A DHT (Distributed Hash Table) based method for building a distributed inverted index was adopted to improve the efficiency of index creation and updating. The algorithm, using DHT technology on an improved Chord network, hashes terms and their related information to the distributed index servers and builds the index in parallel, reducing index building time by distributing the task across many nodes. Strategies for scheduling index-building tasks through chained index management servers, together with an incremental distributed inverted index updating method, were used to ensure index consistency and updating efficiency.
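The hash-the-terms-to-servers idea can be sketched as follows. Plain modular hashing stands in for the paper's improved Chord ring, and the class and method names are our own.

```python
import hashlib
from collections import defaultdict

class DistributedInvertedIndex:
    """Toy sketch: each term is hashed to a home index server, so
    postings for different terms can be built and queried in parallel."""

    def __init__(self, n_servers):
        # each "server" is just a local dict: term -> set of doc ids
        self.servers = [defaultdict(set) for _ in range(n_servers)]

    def _home(self, term):
        h = int(hashlib.sha1(term.encode()).hexdigest(), 16)
        return self.servers[h % len(self.servers)]

    def index_document(self, doc_id, text):
        # each (term -> posting) pair goes to the term's home server
        for term in set(text.lower().split()):
            self._home(term)[term].add(doc_id)

    def lookup(self, term):
        term = term.lower()
        return sorted(self._home(term).get(term, set()))
```

Because a term's postings always live on one server, updates for different terms never conflict, which is what makes parallel building straightforward.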
Research on Multi-peers Cluster P2P System
SHEN Xin-peng,LI Zhan-huai
Computer Science. 2010, 37 (2): 71-74. 
Abstract PDF(474KB) ( 481 )   
Distributed structured peer-to-peer networks can provide precise resource discovery and have good scalability and self-organizing capabilities, so they have become a focus of study. However, research often concentrates on improving routing efficiency and neglects other serious problems. This paper proposed a multi-peer cluster P2P system in which multiple peers may share the same identifier and form a cluster. All peers in a cluster store the same resource, so the resource is not lost even if one or more peers fail suddenly, and the multiple copies can work together to improve download speed effectively. While most structured peer-to-peer systems only provide precise key queries, this system can automatically collect related keywords along the query route, allowing fuzzy queries to be implemented from the related keywords. Experimental results show that this work is effective and efficient.
Consistency Protocol for Metadata Processing in KESS
DENG Ke-feng,HE Lian-yue,WANG Xiao-chuan,ZHOU Xian-feng
Computer Science. 2010, 37 (2): 75-77. 
Abstract PDF(277KB) ( 482 )   
In the Kylin Distributed Encrypting Storage System (KESS), data inconsistency may occur during distributed metadata processing under system exceptions. To solve this problem, the consistency protocol 2PC-MP was presented. In this protocol, a transaction number is introduced to ensure the consistency of log records and message interaction; an append queue is added to prevent process blocking under network exceptions; a redeem queue is added to allow a failed operation to roll back when the user's session becomes invalid; and distributed logging enables fast recovery after system failure. Results show that 2PC-MP can ensure the consistency of metadata processing and also enhance its performance.
Cluster-head Prediction Distributed Clustering Routing Protocol
TANG Qiang,TANG Xiao-ying,WANG Bing-wen
Computer Science. 2010, 37 (2): 78-81. 
Abstract PDF(319KB) ( 534 )   
A cluster-head prediction distributed clustering routing protocol (CP-DCRP) was proposed. In the initial stage, the BS computes the cluster heads from uniformly distributed positions and broadcasts the cluster-head information across the network. After a specific number of rounds, the cluster heads for the next interval of rounds are computed, via the cluster-head prediction mechanism, by the cluster heads of the last round of the current interval, and the resulting cluster-head information is broadcast by those last cluster heads. The effect of the round interval on the network's average energy consumption per round was analyzed, along with the optimal interval, the time complexity of the cluster-head prediction mechanism, and the optimal number of cluster heads. Simulation results show that, compared with LEACH, CP-DCRP improves the energy-consumption balance of the network and prolongs the network lifetime.
Research on Virtual Hardware Method for Carrier Ethernet
DAI Jin-you,YU Shao-hua,WANG Xue-shun,ZHU Guo-sheng
Computer Science. 2010, 37 (2): 82-86. 
Abstract PDF(471KB) ( 454 )   
Continual development of networks places higher demands on the hardware resources of network node devices, including carrier Ethernet systems. Switch chips are usually important components of carrier Ethernet products, and they are not very extensible because they cannot be programmed: the contents of important hardware resources such as the layer-2 table, layer-3 table and access control list cannot be enlarged without a hardware upgrade, yet these resources often fail to meet application needs. In current Ethernet products, no special measure other than upgrading hardware is adopted to solve this problem. On the other hand, the Pareto principle indicates that only a minority of the items in the hardware tables plays an important role in a practical network, while the majority has little influence on network behavior, so it is feasible to adopt a special method to manage and make full use of the hardware resources. A method was presented to solve the problem of hardware resource inadequacy. Based on the virtual memory technology of operating systems, a special algorithm was used to extend the hardware resources and improve forwarding performance: software tables are set up in system memory, the hardware tables are regarded as caches of their corresponding software tables, and the LFU algorithm is adopted to exchange data between them so that the hardware tables always store the more important items, improving the performance of the carrier Ethernet system. Experiments show that the method achieves its expected aim.
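The hardware-table-as-cache idea can be sketched as a small LFU cache in front of a full software table. The class and method names are our own; a real switch would do the fast path in silicon rather than in a dict.

```python
class LFUForwardingCache:
    """Sketch: the full forwarding table lives in software; a small
    'hardware' table caches entries, with LFU deciding what to evict."""

    def __init__(self, hw_capacity):
        self.software = {}   # full table: key -> next hop
        self.hardware = {}   # cached subset (the 'hardware' table)
        self.freq = {}       # access counts for cached keys
        self.cap = hw_capacity

    def install(self, key, next_hop):
        self.software[key] = next_hop

    def forward(self, key):
        if key in self.hardware:            # fast path: hardware hit
            self.freq[key] += 1
            return self.hardware[key]
        next_hop = self.software[key]       # slow path: software table
        if len(self.hardware) >= self.cap:  # evict least-frequently-used
            victim = min(self.freq, key=self.freq.get)
            del self.hardware[victim], self.freq[victim]
        self.hardware[key] = next_hop
        self.freq[key] = 1
        return next_hop
```

Frequently forwarded destinations stay in the hardware table while rarely used ones are swapped back out, which is exactly the Pareto behaviour the abstract relies on.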
QoS Routing Protocol for Wireless Sensor Networks Based on Strip of Area
CHENG Chen,LI La-yuan,YANG Shao-hua,ZHANG Peng
Computer Science. 2010, 37 (2): 87-89. 
Abstract PDF(364KB) ( 438 )   
A QoS routing protocol for wireless sensor networks based on a strip-shaped area was proposed. According to an analysis of the energy consumption model of network transmission paths, the forwarding nodes were restricted to a strip around the straight line between the source node and the sink node, effectively reducing transmission energy consumption along the path. In addition, in the improved QoS protocol the forwarding nodes dynamically set the width of the strip according to the current QoS constraints, so that the routing path fits the source-sink line as closely as possible and the path's transmission energy consumption is optimized. Simulation experiments show that, while meeting the network's QoS constraints, the protocol saves energy and prolongs the network's lifetime.
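The strip-of-area constraint reduces to a point-to-line distance test: a node is eligible to forward only if it lies within the strip around the source-sink line. The function name is ours, and in the protocol the width would be derived from the current QoS bound rather than passed in directly.

```python
from math import hypot

def in_forwarding_strip(node, src, sink, half_width):
    """True if node's perpendicular distance to the line through
    src and sink is at most half_width (all points are (x, y))."""
    (x, y), (x1, y1), (x2, y2) = node, src, sink
    # standard point-to-line distance for the line through src and sink
    d = abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1) \
        / hypot(x2 - x1, y2 - y1)
    return d <= half_width
```

Widening `half_width` admits more candidate forwarders (easier to satisfy delay or reliability constraints); narrowing it keeps the path closer to the straight line and saves energy.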
Risk Assessment of Information Security Based on Improved Wavelet Neural Network
ZHAO Dong-mei,LIU Jin-xing,MA Jian-feng
Computer Science. 2010, 37 (2): 90-93. 
Abstract PDF(349KB) ( 519 )   
Given the uncertainty and complexity of information security risk assessment and the limitations of traditional mathematical models in this field, we proposed an evaluation method based on a particle swarm-wavelet neural network (PWNN), integrating artificial neural networks, wavelet analysis and particle swarm optimization. First, the risk factors were quantified by a fuzzy evaluation method and the input of the neural network was fuzzily pretreated. Second, the wavelet neural network was trained by the particle swarm optimization algorithm. The simulation results show that the risk level of an information system can be evaluated quantitatively by the proposed PWNN model, which overcomes shortcomings of current assessment methods such as subjectivity, randomness and fuzzy conclusions, and that PWNN has better learning ability and a faster convergence rate than current methods.
Network Model for Convergence of WSN and Computer Network
WU Zuo-shun
Computer Science. 2010, 37 (2): 94-96. 
Abstract PDF(389KB) ( 439 )   
As basic architectures for the convergence of WSN and computer networks, the interconnected model, the tunneled model and the heterogeneous model were presented and compared, with an analysis of WSNs and computer networks. Node mobility management based on fuzzy logic in the heterogeneous network model received special focus. The routing model and protocol framework for a heterogeneous integrated network of WSN and computer network were also detailed, and key technologies based on the heterogeneous network model were examined at the end.
New Information-theoretic Security Channel Model
WANG Bao-cang, LIU Hui,HU Yu-pu
Computer Science. 2010, 37 (2): 97-98. 
Abstract PDF(256KB) ( 433 )   
The satellite broadcasting channel model is one of the practical models for realizing unconditional security. However, the satellite scenario suffers from drawbacks such as receiver synchronization and high communication costs. To overcome these drawbacks, a virtual satellite broadcasting channel model was proposed, which uses virtual binary symmetric channels to simulate a satellite model. The model uses only XOR operations to perform the initialization of the secret key agreement; it is therefore efficient, easy to implement and synchronize, and has lower communication costs. It was proven that, in the proposed model, the wiretapper has a larger receiving bit error probability.
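The model's premise can be simulated directly: a source broadcasts random bits, and the legitimate receiver and the wiretapper see them through independent binary symmetric channels, with the wiretapper's channel noisier. This is a sketch of the premise only, not of the paper's XOR-based initialization; all names are ours.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: each bit flips with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def receive_over_bscs(n, p_main, p_tap, seed=0):
    """Broadcast n random bits; return the measured bit error rates
    of the legitimate receiver and of the wiretapper."""
    rng = random.Random(seed)
    sent = [rng.getrandbits(1) for _ in range(n)]
    main = bsc(sent, p_main, rng)
    tap = bsc(sent, p_tap, rng)
    ber = lambda r: sum(a != b for a, b in zip(sent, r)) / n
    return ber(main), ber(tap)
```

With p_main < p_tap, the wiretapper's measured bit error rate is reliably larger, which is the advantage the secret key agreement then amplifies.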
Collaborative Filtering Recommendation Algorithm Based on Co-ratings and Similarity Weight
WANG Jing,YIN Jian,ZHENG Li-rong,HUANG Chuang-guang
Computer Science. 2010, 37 (2): 99-104. 
Abstract PDF(519KB) ( 642 )   
The collaborative filtering recommendation algorithm is one of the most successful technologies in e-commerce recommendation systems. This paper presented a collaborative filtering algorithm based on co-ratings and similarity weights. First, the co-ratings were selected to compute the similarity between users or items. Most importantly, the final prediction was acquired by combining the prior predicted rating with the similarity weight, from which the recommendation was produced. Experimental results on real data show that this algorithm consistently achieves better prediction accuracy and thereby better recommendation quality.
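The co-rating step can be sketched as a Pearson correlation computed only over the items both users rated. The function name and the minimum-overlap guard are our assumptions; the paper's weighting of the final prediction is not reproduced here.

```python
from math import sqrt

def co_rating_similarity(ratings_u, ratings_v, min_co=2):
    """Pearson correlation over the co-ratings (items rated by both
    users); returns 0.0 when the overlap is too small to be meaningful."""
    co = [i for i in ratings_u if i in ratings_v]
    if len(co) < min_co:
        return 0.0
    mu = sum(ratings_u[i] for i in co) / len(co)
    mv = sum(ratings_v[i] for i in co) / len(co)
    num = sum((ratings_u[i] - mu) * (ratings_v[i] - mv) for i in co)
    den = sqrt(sum((ratings_u[i] - mu) ** 2 for i in co)) * \
          sqrt(sum((ratings_v[i] - mv) ** 2 for i in co))
    return num / den if den else 0.0
```

Restricting the computation to co-rated items avoids treating "did not rate" as disagreement, which is one of the motivations behind co-rating-based similarity.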
Adaptive Control-based Proportional Delay Differentiated Service on DTN Web Servers
SUN Qi-lu,DAI Guan-zhong,PAN Wen-ping
Computer Science. 2010, 37 (2): 105-109. 
Abstract PDF(463KB) ( 440 )   
DTNs provide a new platform for future network applications, but no work has addressed deploying Web servers that provide service differentiation on DTNs. This paper designed a DTN Web server supporting differentiated service and implemented adaptive-control-based proportional delay differentiation. The adaptive controller dynamically calculates and adjusts the number of service threads in the Web server for different classes of clients according to pre-specified delay differentiation parameters. Simulation results demonstrate that proportional delay differentiation can be achieved on the DTN Web server even under dynamic workloads, different workload distributions and varying references.
Research on Sensor Data Description Protocol Based on Extended SDP
SHEN Chun-shan,WU Zhong-cheng, LUO Jia-rong
Computer Science. 2010, 37 (2): 110-112. 
Abstract PDF(370KB) ( 557 )   
The sensor network, a new kind of distributed measurement and control system, has developed rapidly, with high requirements for openness and interoperability, and standard description of sensor data is becoming a popular research topic. The paper presented a sensor data description protocol (SDDP) based on SDP (Session Description Protocol). SDDP defines important sensor parameters in a text-based way and gives a formal description of them in ABNF, which makes it convenient to connect sensors in measurement and control systems. Cooperating with SDP, the SDDP protocol can connect networked sensors more effectively. Experimental results indicate that SDDP-SIP offers good real-time performance, simplicity and openness.
Research on Hierarchical Security Model of Web Services Composition
SHANG Chao-wang,YANG Zong-kai,LIU Qing-tang,ZHAO Cheng-ling
Computer Science. 2010, 37 (2): 113-115. 
Abstract PDF(402KB) ( 407 )   
With the development and application of Web services technologies, the security issues of Web services composition are increasingly prominent. The present security specifications for Web services address only a single Web service, and no architecture for the security of composite Web services has gained broad acceptance. The insufficiency of present research on Web services composition security architectures was pointed out, and the requirements of Web service composition security were analyzed. We proposed a hierarchical security model for Web services composition (HSM-WSC) based on the Web services stack and the Web service application mode, and analyzed the security function of each layer in the model. The model is flexible and extensible and satisfies the various requirements of secure composite Web services. The paper also described the implementation architecture of HSM-WSC at the end.
Data Backup Scheme Based on Network Coding
LIN Guo-qing,CHEN Ru-wei,LI Ying,WANG Xin-mei
Computer Science. 2010, 37 (2): 116-119. 
Abstract PDF(347KB) ( 447 )   
To address the problems that data backup needs a great deal of storage space and that its privacy needs to be protected, a file backup scheme based on network coding was proposed. Specifically, using network coding, many backup files are encoded into one encoded file, which is then saved on the backup server. The scheme covers the following aspects: the principle of backup, the backup process, the influence of file updates on recovery and how to handle it, backup file updating, and the global control system. Experiments and theoretical analysis show that, compared with the traditional method, the scheme saves storage space greatly and enhances the privacy of the backup data, while slightly decreasing its recoverability and increasing system complexity.
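The encode-many-files-into-one idea can be sketched with the simplest network code, a single XOR parity. This sketch protects only one lost file and pads shorter files with zero bytes; the paper's full scheme additionally keeps the control metadata needed for updates and recovery.

```python
def xor_encode(files):
    """XOR a list of byte strings into one encoded byte string,
    padding shorter inputs with zero bytes."""
    size = max(len(f) for f in files)
    padded = [f.ljust(size, b"\x00") for f in files]
    out = bytearray(size)
    for f in padded:
        for i, byte in enumerate(f):
            out[i] ^= byte
    return bytes(out)

def xor_recover(encoded, others):
    """Recover one lost file from the encoded file and all the others:
    XOR-ing everything back together cancels the surviving files."""
    return xor_encode([encoded] + others)
```

The server stores one encoded file instead of n, which is where the storage saving comes from; the cost is that recovery needs the other n-1 files, matching the abstract's recoverability trade-off.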
Security Model for Image Hidden Communication
DING Yi-jun,ZHENG Xue-feng,YU Gui-rong
Computer Science. 2010, 37 (2): 120-122. 
Abstract PDF(336KB) ( 433 )   
Security is a precondition for the application of image hidden communication. Security definitions based on Shannon information entropy and relative entropy lack constraint conditions, because steganalysis can exploit the spatial correlation of images. By analyzing the correlation of adjacent image pixels, we presented generalized information entropy, generalized relative entropy and a generalized probability distribution, and proposed a security model for data hiding based on this generalized information measure. The model is suitable for image hidden communication, and the hidden data cannot be detected. Security constraint conditions and a steganography method were derived, and experimental data were given to verify the model. The experimental results indicate that the security model has both theoretical and practical value.
Research on BitTorrent-like Peer-to-Peer Media Streaming Systems
ZHENG Wei-ping,QI De-yu,XU Ke-fu
Computer Science. 2010, 37 (2): 123-125. 
Abstract PDF(399KB) ( 569 )   
Current research progress on BitTorrent-like systems that provide media streaming capabilities was presented. Modifications to the piece selection and peer selection algorithms in these systems were then fully discussed, along with the important performance metrics and design parameters.
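The piece-selection tension the survey discusses, playback deadlines versus rarest-first swarm health, can be sketched as a hybrid picker. The window policy and all names here are our own illustration, not a specific surveyed algorithm.

```python
def streaming_piece_selection(have, rarity, playhead, window=8):
    """Pick the next piece to request: pieces inside the playback
    window are fetched in order (earliest deadline first); outside
    it, rarest-first keeps the swarm healthy. have[p] says whether
    piece p is already downloaded; rarity[p] counts its replicas."""
    missing = [p for p in range(len(have)) if not have[p]]
    urgent = [p for p in missing if playhead <= p < playhead + window]
    if urgent:
        return min(urgent)                    # in-order for the deadline
    rest = [p for p in missing if p >= playhead + window]
    return min(rest, key=lambda p: rarity[p]) if rest else None
```

Shrinking the window makes the picker behave like plain sequential download (good for startup delay, bad for piece diversity); growing it recovers BitTorrent's rarest-first behaviour, which is exactly the trade-off the surveyed modifications navigate.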
Improved Watermarking Algorithm for Digital Images Robust to Geometric Distortion
LI Jian,YE You-pei,HAN Mou
Computer Science. 2010, 37 (2): 126-130. 
Abstract PDF(416KB) ( 435 )   
A novel digital watermarking algorithm based on singular value decomposition (SVD) was proposed. Data are collected from the amplitude of the image's frequency domain by setting the start and termination parameters of sampling, and a data matrix for embedding the watermark is constructed from the collected data. The algorithm applies SVD to blocks of the data matrix and obtains a sequence of second-largest singular values; the watermark is embedded by modifying these second-largest singular values. The modification is additive, and its intensity can be adjusted according to the image characteristics. The algorithm resists geometric distortion thanks to the data matrix construction method and the invariance of the second-largest singular values under geometric attacks. The experimental results show that the method is more flexible and robust than traditional methods.
Object-based Rate Control Scheme for H. 264 Video Coding
XUE Wei,DU Si-dan,YE Xi-jun
Computer Science. 2010, 37 (2): 131-133. 
Abstract PDF(225KB) ( 390 )   
A new rate control scheme for the H.264 coding standard was proposed. The scheme includes a MAD prediction model based on the spatial and temporal correlation of MAD. The bit usage of the previous frame is used as a penalty factor to modify the bit allocation rule and balance the buffer share. Furthermore, combining video text extraction techniques, the method realizes macroblock bit allocation and quantization step selection based on the text area. Experimental results show that the new rate control scheme controls the bits very well and improves the average PSNR.
Research on Web Service Discovering Algorithm Based on Information Description Framework
HUANGFU Xian-peng,WEI Wei,CHEN Hong-hui
Computer Science. 2010, 37 (2): 134-138. 
Abstract PDF(390KB) ( 485 )   
Web services have developed greatly as a distributed, loosely coupled networked system technology. But problems such as service invalidation, UDDI's lack of support for service selection and optimization, and discovered services failing to satisfy user requirements efficiently remain unsolved. The paper put forward a Web service discovery algorithm based on an information description framework. Through research on formal Web service description, the service selection algorithm and the service optimization sorting algorithm, and through experiments and analysis, it validated that the optimized algorithm efficiently improves the recall and precision of service discovery and lays a stable foundation for improving Web service discovery efficiency.
Formalizing Advanced Branching and Synchronization Patterns Using Pi Calculus
GUO Xiao-qun,HAO Ke-gang,HOU Hong,DING Jian-jie
Computer Science. 2010, 37 (2): 139-140. 
Abstract PDF(186KB) ( 503 )   
RelatedCitation | Metrics
Pi calculus is a computational model that can be used to model concurrent and dynamic systems. After a study of Pi calculus, it was proposed as a formal foundation for workflow; furthermore, the advanced branching and synchronization workflow patterns were described in detail using Pi calculus.
Approach for Evolution of OWL-S Requirements Specification Based on Reflection Mechanism
YUAN Wen-jie,YING Shi,WU Ke-jia,YAO Jun-feng
Computer Science. 2010, 37 (2): 141-145. 
Abstract PDF(573KB) ( 447 )   
RelatedCitation | Metrics
Software systems face challenges such as changes in user requirements, software resources and the system's context environment, so software requirements must support sustainable evolution. Reflection mechanisms have been successfully applied in the runtime management and dynamic evolution of software systems, but scarcely in the evolution of requirements specifications. This paper proposed an approach that supports the evolution of OWL-S requirements specifications based on a reflection mechanism. Requirements analysts can implement the evolution of an OWL-S requirements specification by describing and appropriately using the meta-information of the specification. With this approach, analysts can effectively manage requirements changes and complete the evolution of the requirements specification in a controlled and orderly manner.
Web Services Composition-oriented Framework of Active Agent Aggregation and Simulation Experiment Analysis
YE Rong-hua,WEI Shan-shan,ZHONG Fa-rong
Computer Science. 2010, 37 (2): 146-149. 
Abstract PDF(359KB) ( 458 )   
RelatedCitation | Metrics
Web services composition needs to aggregate adequate meta-services to meet a service request. In SOA, the task of finding these services is commonly fulfilled by the service requestor, but this method does not take into account the wishes of service providers, who want to promote the sale of their services. To this end, an active service aggregation framework oriented to Web service composition was proposed, which introduces Agents as Web service proxies. Through Agents, services can be packaged into abstract entities that are able to find service requests actively. Through the "Intention-Behavior-Achievement" mechanism, service capabilities and service requests are matched. Finally, in order to verify the feasibility of the method, the performance of the aggregation framework was analyzed and discussed using several sets of experimental data from a simulation program.
Integrated Scheduling Algorithm of Complex Product Based on Scheduling Long-path
XIE Zhi-qiang,ZHANG Lei,YANG Jing
Computer Science. 2010, 37 (2): 150-153. 
Abstract PDF(350KB) ( 564 )   
RelatedCitation | Metrics
Aiming at the problem that current integrated scheduling algorithms for processing and assembling complex products mainly consider horizontal optimization within vertical-and-horizontal scheduling optimization, neglecting the effect on manufacturing efficiency of the vertical constraints inherent in product operations, a vertical and horizontal scheduling optimization algorithm based on the critical path was proposed, namely an integrated scheduling algorithm of complex products based on the scheduling long-path. This algorithm considers the structure of the complex product's processing tree and determines the scheduling order of operations by the priority strategy, the scheduling long-path strategy and the long-time strategy. The priority strategy takes into account operations on the same level of other branches. The scheduling long-path strategy takes the other branches into account and first considers the effect of operations on the critical path on the total processing time. The long-time strategy first schedules operations that have a large influence on the processing time. The starting processing time of the operations whose scheduling order has been determined is confirmed according to a dispatching rule (Earliest Due Date, EDD). Analysis and examples validate that the proposed scheduling strategies are simple, convenient and feasible, and that they yield better scheduling results.
Research on Trace-based Network Storage System Evaluation
ZHAO Xiao-nan,LI Zhan-huai,ZHANG Xiao,ZENG Lei-jie
Computer Science. 2010, 37 (2): 154-157. 
Abstract PDF(473KB) ( 680 )   
RelatedCitation | Metrics
Network storage system evaluation is becoming more and more important as the scale of network storage grows sharply and its application scope widens. Network storage system evaluation includes performance, availability, manageability, power consumption and other metrics. This paper discussed the current research topics and status of network storage system evaluation, proposed a logical framework for it, and discussed how to use traces as a basic means to evaluate each metric, together with the challenges and solutions in trace collection, trace replay and trace availability. Finally, it gave an outlook on future research topics in this field.
Research on Negotiated System Model in B2B Based on Agent and Game Theory
LI Wei-wei
Computer Science. 2010, 37 (2): 158-160. 
Abstract PDF(270KB) ( 487 )   
RelatedCitation | Metrics
Aiming at the problem of automating and adding intelligence to bargaining in B2B e-commerce, this paper proposed a B2B e-commerce negotiation system model based on agent technology and Rubinstein bargaining game theory. In a multi-agent context, network communication traffic can be reduced, transaction efficiency improved, and automation and intelligence realized. The Rubinstein bargaining game provides a concrete negotiation policy for the system model. The paper also discussed the negotiation flow, negotiation algorithm and negotiation results of the system model.
Study on Trust Mechanism Based Component Selection Approach in Internetware
ZHANG Xiao-mei,ZHANG Wei-qun
Computer Science. 2010, 37 (2): 161-164. 
Abstract PDF(298KB) ( 435 )   
RelatedCitation | Metrics
Internetware has become a new form of software. However, how to choose reliable components to construct an internetware system with as high a reliability as possible has become a serious problem. We proposed a trust-mechanism-based component selection model, which selects the more reliable component by way of the credibility evaluation of components used before, the recommendations of friends, and the reputation of all components. We then proposed a component selection approach based on the selection model. Finally, experimental results demonstrate the feasibility and validity of the approach.
Complementarity Support Vector Machines
ZHANG Xiang-song,LIU San-yang
Computer Science. 2010, 37 (2): 165-166. 
Abstract PDF(245KB) ( 414 )   
RelatedCitation | Metrics
A complementarity support vector machine was obtained, based on an amended support vector machine problem. By using the Fischer-Burmeister function, a new descent algorithm for the support vector machine optimization problem was presented. The proposed algorithm is based not on the primal quadratic programming problem of the SVM but on a complementarity problem. It needs to compute neither the Hessian nor any inverse matrix, so the computational work is simple and small. It also overcomes the shortcoming of the Lagrangian method proposed by Mangasarian et al., which needs to compute an inverse matrix and is therefore not suited to nonlinear large-scale classification problems. Furthermore, global convergence is proved without any assumption. Numerical experiments show that the algorithm is feasible and effective.
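The key tool named above, the Fischer-Burmeister function, reformulates a complementarity condition as an equation, which is what makes a descent method applicable. A minimal sketch of the function itself:

```python
import math

def fischer_burmeister(a, b):
    """phi(a, b) = sqrt(a^2 + b^2) - a - b.
    phi(a, b) == 0 holds exactly when a >= 0, b >= 0 and a*b == 0,
    so the complementarity conditions of the SVM optimality system
    can be rewritten as a system of equations phi(.,.) = 0 and
    attacked with a descent algorithm."""
    return math.sqrt(a * a + b * b) - a - b
```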
Generalized Equivalence between RST and SPA
ZHU Hong-ning,ZHANG Bin
Computer Science. 2010, 37 (2): 167-170. 
Abstract PDF(280KB) ( 392 )   
RelatedCitation | Metrics
Based on an analysis of the essential concepts of the positive region, negative region and boundary region in RST and the degrees of sameness, difference and opposition in SPA, this paper discussed the equivalence between RST and SPA from the generalized perspective of how humans cognize the world.
Word Segmentation Approach in Military Text on the Basis of Word Combination
HUANG Wei,GAO Bing,LIU Yi,YANG Ke-wei
Computer Science. 2010, 37 (2): 171-174. 
Abstract PDF(383KB) ( 453 )   
RelatedCitation | Metrics
Since unknown words are excessive in military texts and the features of some words are incomplete, a word segmentation method that takes the lexical chunk as its unit was provided. Word segmentation is divided into several stages: bidirectionally scanning the text on the basis of a dictionary, marking ambiguous segments and segmenting the words; deleting the stop-words shared by both segmentation results, then counting word mutual information and adjacency frequency from the first-pass segmentation, from which lexical chunks of related words can be identified and marked; and finally, extracting the marked ambiguous segments and lexical chunks for manual checking and processing. The experiments show that after word combination, the lexical chunks carry much more feature information, which yields a better processing effect.
Diversity Strategies on Multiobjective Evolutionary Algorithms
XIE Cheng-wang,DING Li-xin
Computer Science. 2010, 37 (2): 175-179. 
Abstract PDF(444KB) ( 902 )   
RelatedCitation | Metrics
The intrinsic random errors of the evolutionary operators, together with selection pressure and selection noise in the evolutionary process, easily cause the loss of diversity in the evolutionary population. Maintaining population diversity is very important, however, because it not only benefits the search process but is also an essential objective in multiobjective optimization. Within a unified framework, this paper first classified the diversity strategies in MOEAs, described the principles and mechanisms of the different types of diversity strategies and analyzed their characteristics. It then analyzed the complexity of these diversity operators. Finally, it proved the convergence of MOEAs in the general sense and pointed out that, when designing a new diversity strategy, it is necessary to keep the monotonicity of the evolutionary population and avoid its degradation.
Algorithm of Dynamic Event System's Synchronization Diagnosis
WANG Xiao-yu,OUYANG Dan-tong,ZHAO Xiang-fu,CHANG Xiao-huan
Computer Science. 2010, 37 (2): 180-182. 
Abstract PDF(328KB) ( 442 )   
RelatedCitation | Metrics
Traditional system modeling requires the assumption that the model be complete. To address this limitation, this paper proposed a method of synchronizing component models. The method synchronizes incomplete models into a larger one to resolve the incomplete diagnoses caused by incomplete models. By optimizing dynamic diagnosis for discrete event systems, and using the idea of distributed systems and the nature of Petri nets, components can perform diagnosis in an independent and parallel way, thus improving the speed of obtaining a diagnosis.
Study of Obtaining Chinese Entity Relation Pattern Automatically
DENG Bo,ZHENG Yan-ning,FU Ji-bin
Computer Science. 2010, 37 (2): 183-185. 
Abstract PDF(286KB) ( 420 )   
RelatedCitation | Metrics
Obtaining Chinese entity relation patterns automatically is very important for an entire information extraction system. Based on the bootstrap method and the fact that Chinese can express the same meaning in many forms, statistical learning technology is used to obtain new patterns automatically. Experiments show that the method can find new patterns rapidly with very little manual work, and the process is not limited to a particular extraction region. The method is therefore effective for improving the performance of information extraction systems.
Novel Ant Colony Optimization Algorithm with Estimation of Distribution
XU Chang,CHANG Hui-you,XU Jun,YI Yang
Computer Science. 2010, 37 (2): 186-188. 
Abstract PDF(353KB) ( 543 )   
RelatedCitation | Metrics
In order to improve the performance of the ant colony optimization algorithm, a new ant colony optimization algorithm with estimation of distribution (ACO-ED) was presented. ACO-ED uses a probabilistic model based on estimating the distribution of promising solutions in the search space, and adjusts the state transition rule and the global updating rule accordingly. Furthermore, ACO-ED is significantly improved by extending it with a local search procedure. We applied ACO-ED to TSP instances and compared it with other ant colony optimization algorithms. Simulation results show that ACO-ED is an effective and efficient way to solve combinatorial optimization problems.
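One way the state transition rule can incorporate an estimated distribution is to mix the pheromone trail with the distribution model when computing transition probabilities. The linear mixing weight `lam` below is a hypothetical parameter for illustration; the paper's exact adjusted rule may differ.

```python
def transition_probs(pheromone, eda_model, lam=0.5):
    """Mix classic pheromone trails with an estimated distribution
    of promising solutions, then normalize to get the probability
    of each candidate move (a sketch of the ACO+EDA idea)."""
    scores = [lam * t + (1 - lam) * p
              for t, p in zip(pheromone, eda_model)]
    total = sum(scores)
    return [s / total for s in scores]
```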
Study on Term Relation Extraction from Domain Text
SUN Xia,WANG Xiao-feng,DONG Le-hong,WU Jiang
Computer Science. 2010, 37 (2): 189-191. 
Abstract PDF(350KB) ( 557 )   
RelatedCitation | Metrics
A term relation extraction approach was proposed, cast as a classification task. A hybrid classification algorithm combining the advantages of both naive Bayes and the perceptron was also presented. In this algorithm, one subset of the features is estimated from the training data, and another subset is trained by a discriminative function. The experimental results showed that the proposed hybrid algorithm almost always outperforms the naive Bayes and perceptron algorithms when the training set is small.
Hybrid Evolutionary Algorithm Based on Adaptive Operator and its Application
YOU Xiao-ming,LIU Sheng,SHUAI Dian-xun
Computer Science. 2010, 37 (2): 192-195. 
Abstract PDF(378KB) ( 465 )   
RelatedCitation | Metrics
A novel hybrid evolutionary algorithm based on adaptive operators for solving multi-objective optimization was proposed. Using niche methods, the population is automatically divided into subpopulations of real-coded chromosomes, and each subpopulation can obtain an optimal solution through a self-adaptive mechanism. Real-coded chromosomes were introduced to solve the precision and efficiency problems of binary coding, and the co-evolutionary niche strategy guarantees both population diversity and convergence speed quite well. The algorithm was applied to transfers in an urban public transportation system, and experimental results show its superiority. The convergence of the algorithm is proved in this paper based on Markov chains.
Parallel Gene Expression Programming Based on FDA
DU Xin,DING Li-xin,XIE Cheng-wang,CHEN Li
Computer Science. 2010, 37 (2): 196-199. 
Abstract PDF(346KB) ( 515 )   
RelatedCitation | Metrics
In order to reduce the computation time and improve the solution quality of Gene Expression Programming (GEP), synchronous and asynchronous distributed parallel GEP algorithms based on the Estimation of Distribution Algorithm (EDA) were proposed. The idea of introducing EDA into GEP is to accelerate convergence. Moreover, the improved GEP was implemented with synchronous and asynchronous distributed parallel methods based on the island parallel model. Experiments were conducted on a distributed network of twenty computers. The best results of the sequential and parallel algorithms were compared, and the speedup and the influence of some important parallel control parameters on this parallel algorithm were discussed. The experimental results show that the parallel algorithms approach linear speedup and have a better ability to find optimal solutions, as well as higher stability, than the sequential algorithm.
Novel Strategy for Selecting Incremental Support Vectors
SHEN Feng-shan,MA Yu-jun,ZHANG Jun-ying
Computer Science. 2010, 37 (2): 200-202. 
Abstract PDF(342KB) ( 408 )   
RelatedCitation | Metrics
Incremental support vectors determine the generalization performance of an incremental support vector machine, so their selection is significant. We proposed a novel strategy to choose the incremental support vectors by computing their probabilities. The method constructs an appropriate separating hyperplane for each training sample, passing through the corresponding training sample point; the probability of a sample being an incremental support vector is then estimated by the separating rate of the corresponding hyperplane. The samples with higher probabilities are chosen as incremental support vectors to train and update the incremental support vector machine. Comparative numerical experiments against existing methods show that our method has an outstanding advantage in training efficiency without deteriorating the generalization performance of the incremental support vector machine.
Improvement of Web Information Extraction Algorithm Based on HMM
ZHU Wei-hua,LU Yi,LIU Bin-bin
Computer Science. 2010, 37 (2): 203-206. 
Abstract PDF(356KB) ( 706 )   
RelatedCitation | Metrics
With the development of Internet technologies, information on the Internet increases exponentially. One important research focus is how to extract structured data from these great volumes of online documents in unstructured text. This paper mainly studied algorithms for Web information extraction based on the hidden Markov model (HMM), discussed how to use the HMM and how to mark data in text information extraction, offered several methods for improving the hidden Markov model in information extraction, introduced the establishment of an HMM-based Web information extraction model, comparatively analyzed the output data of the information extraction, and verified the validity of the algorithm through experiments.
Vehicle Type Recognition Based on Biological Vision Salience
CHEN Zhen-xue,LIU Cheng-yun,CHANG Fa-liang
Computer Science. 2010, 37 (2): 207-208. 
Abstract PDF(264KB) ( 389 )   
RelatedCitation | Metrics
Vehicle type recognition is a typical target recognition task. Based on biological vision and pattern recognition theory, the detection and recognition of vehicle types were studied, and a novel method based on feature salience was presented to recognize vehicle types. Firstly, the algorithm extracts multiple features of the vehicle type and compares their salience. Then, more salient features are given larger weights. Finally, the salient features are fused to recognize the vehicle type. The experimental results show that the algorithm achieves a better recognition rate.
Research of Text Clustering Based on Fuzzy Granular Computing
ZHANG Xia,WANG Su-zhen,YIN Yi-xin,ZHAO Hai-long
Computer Science. 2010, 37 (2): 209-211. 
Abstract PDF(272KB) ( 532 )   
RelatedCitation | Metrics
The traditional K-means algorithm is very sensitive to the initial clustering centers, and the clustering result varies with different initial inputs. To remove this sensitivity, a new method was proposed to obtain the initial clustering centers. The method is as follows: define a normalized distance function in the fuzzy granularity space of the data objects, then use this function to perform an initial clustering of the data objects whose distance is less than the granularity threshold, thereby obtaining the initial clustering centers. The tests show that this method increases accuracy and reduces running time.
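The initialization idea can be sketched as a greedy granulation: points within the granularity threshold of an existing granule's center join it, and granule centroids become the initial centers. The greedy assignment order and the per-dimension normalization below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def initial_centers(points, delta):
    """Greedy granulation: each point joins the first granule whose
    centroid lies within normalized distance `delta`, otherwise it
    starts a new granule; granule centroids then serve as the
    initial K-means centers."""
    points = np.asarray(points, dtype=float)
    span = points.max(axis=0) - points.min(axis=0)
    span[span == 0] = 1.0  # guard against constant dimensions
    granules = []
    for p in points:
        for g in granules:
            c = np.mean(g, axis=0)
            if np.linalg.norm((p - c) / span) <= delta:
                g.append(p)
                break
        else:
            granules.append([p])
    return [np.mean(g, axis=0) for g in granules]
```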
Trust Computing Based on Probability Theory in Unstructured P2P Network
WANG Ping,QIU Jing,QIU Yu-hui
Computer Science. 2010, 37 (2): 212-215. 
Abstract PDF(355KB) ( 438 )   
RelatedCitation | Metrics
Trust plays an important role in determining whom to interact with, and how, in open and dynamic P2P networks. To this end, a trust mechanism for unstructured P2P networks was proposed, in which peers evaluate trust based on the beta probability distribution and aggregate reputation with a gossip algorithm. The experimental data show that our approach can autonomously deal with biased information and identify trust beliefs with low overhead and good efficiency.
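The beta-distribution evaluation step reduces, in its standard form, to an expected value over a Beta posterior built from observed outcomes. A minimal sketch of that estimate (the paper's full mechanism, including gossip aggregation, is more involved):

```python
def beta_trust(successes, failures):
    """Expected trust under a Beta(r+1, s+1) posterior, where r and s
    count satisfactory and unsatisfactory interactions: with no
    history the estimate is neutral (0.5), and it moves toward the
    observed success ratio as evidence accumulates."""
    return (successes + 1) / (successes + failures + 2)
```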
Relationship between Uncertainty and Parallelism of Natural Computation
WANG Peng,LI Jian-ping
Computer Science. 2010, 37 (2): 216-220. 
Abstract PDF(483KB) ( 550 )   
RelatedCitation | Metrics
The uncertainty principle of quantum mechanics is a basic law of nature. By analyzing the relationship between parallelism and uncertainty, uncertainty is regarded as the root of the parallelism and intelligence of natural computation. Based on this understanding, the uncertainty principle of algorithms was proposed, and the uncertainty intelligence model (UIM) of natural computation was established using the Shannon entropy principle. The UIM indicates that prior knowledge and uncertainty information constitute a basic intelligent system, and that an intelligent system is essentially an information system. The information provided by prior knowledge keeps the algorithm searching in the correct direction, while the information provided by uncertainty realizes parallel search in the solution space. We can therefore conclude that increasing the entropy achieves a higher level of intelligence. The UIM is also verified in this paper by a Monte-Carlo algorithm for estimating Pi.
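The Monte-Carlo estimation of Pi used as verification above can be sketched directly: independent random samples (the "uncertainty") explore the solution space in parallel, and only the prior knowledge of the quarter-circle test steers the result.

```python
import random

def estimate_pi(n, seed=0):
    """Monte-Carlo estimate of pi: the fraction of uniform points in
    the unit square falling inside the quarter-circle approaches
    pi/4, with no coordination between samples."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n
```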
Multiple Features Based Intelligent Biometrics Verification Model
SUN Ao-bing,ZHANG De-xian,ZHANG Miao
Computer Science. 2010, 37 (2): 221-224. 
Abstract PDF(412KB) ( 428 )   
RelatedCitation | Metrics
With the development of biometric verification technology, related devices and implementations are being put on the market, making it possible for such devices to replace traditional means. However, current verification devices based on a single feature depend on one single modality and carry a very low cost of cheating, so they cannot defend against specific cheating modes. We analyzed the intelligent verification performed by human beings and brought forward a verification model based on multiple features. It can capture several biometric features at the same time and select several of them for verification as needed. A cross-index over the multiple features can improve the search efficiency for complicated features. Incorporating historical data into the model enables the system to focus on the features that have changed and to perform verification with high precision.
Research on a Clustering Algorithm Based on Generalized Quantum Particle Model and its Convergence
HUANG Liang-jun,SHUAI Dian-xun,ZHANG Bin
Computer Science. 2010, 37 (2): 225-228. 
Abstract PDF(470KB) ( 446 )   
RelatedCitation | Metrics
A novel generalized quantum particle model (GQPM) was presented for self-organizing data clustering. In this model the data clustering process is transformed into a stochastic self-organizing process of quantum particles in the state configuration space. The state configuration evolves to a stationary probability distribution, and the optimal state configuration of the particles can thus be obtained as the configuration with the highest probability in the stationary distribution. The convergence of the self-organizing process was proved in this paper. The GQPM algorithm clusters much faster than traditional clustering algorithms on large-scale databases, and its superiority was verified by simulation experiments.
Research of Support Vector Classifier Based on Neighborhood Rough Set
HAN Hu,DANG Jian-wu,REN En-en
Computer Science. 2010, 37 (2): 229-231. 
Abstract PDF(338KB) ( 407 )   
RelatedCitation | Metrics
Since support vector machines cannot directly deal with high-dimensional, large-scale training sets and are sensitive to abnormal samples, an improved support vector classifier based on neighborhood rough sets was proposed. In this paper, the training set was preprocessed from two different sides. On the one hand, neighborhood rough sets were used to find the samples on the boundary and obtain a reduced training set, while deleting those abnormal samples that both lead to over-learning and decrease generalization ability. On the other hand, attribute reduction was performed and feature weights were introduced based on attribute significance, because different features affect classification differently. Finally, several comparative experiments on synthetic and real-life data sets show the performance and effectiveness of the method.
Using Co-classification Approach to Detect the Type of Cancer
LU Xin-guo,CHEN Dong,DU Jia-yi,ZHOU Juan
Computer Science. 2010, 37 (2): 232-236. 
Abstract PDF(456KB) ( 557 )   
RelatedCitation | Metrics
Cancer recognition with gene expression profiles was studied. Owing to the large amount of redundant and noisy information in gene expression data and the sensitivity to the selected feature genes, such classification lacks generalization capability. By studying the gene expression profiles, two cancer recognition models were constructed: a global component model (GCM) and a cancer component model (CCM). A weighted voting strategy was then applied to propose a co-classification approach based on the GCM and CCM for cancer recognition (CAGC). Test experiments were conducted on the Leukemia, Breast, Prostate, DLBCL, Colon and Ovarian cancer datasets, and CAGC achieved great performance on all of them. The experimental results show that the combination of the GCM and CCM strengthens both the recognition solution and the generalization.
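The weighted-voting combination of the two models can be sketched as below, where each model produces per-class scores and a fixed weight `w` blends them. The weight and the score representation are assumptions for illustration; the paper's voting scheme may differ in detail.

```python
def weighted_vote(scores_gcm, scores_ccm, w=0.5):
    """Combine the class scores of two models (e.g. a global
    component model and a cancer component model) by weighted
    voting and return the winning class label."""
    combined = {c: w * scores_gcm[c] + (1 - w) * scores_ccm[c]
                for c in scores_gcm}
    return max(combined, key=combined.get)
```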
Clustered Microcalcification Detection in Digital Mammograms Based on an Active Learning with Support Vector Machine
FENG Jun,JIANG Jun,Ip Ho-Shing Horace,WANG Hui-ya
Computer Science. 2010, 37 (2): 237-241. 
Abstract PDF(595KB) ( 403 )   
RelatedCitation | Metrics
Clustered microcalcification is an important early-stage signal of breast cancer. However, the computer-aided detection of microcalcification is a challenge in the field of medical imaging. To improve the performance of a detection system, a large amount of lesion labeling is essential; besides the difficulty of collecting the samples themselves, manual labeling also takes experts much time. Few state-of-the-art techniques take this problem into account. We first applied the techniques of active learning with an SVM to this area to try to solve the problem, and proposed the basic conditions for the selected training set samples. The experiments on a benchmark dataset show that our approach can greatly reduce the work of labeling samples while maintaining the classification performance of the system for detecting interesting ROI regions.
Stability Analysis of Swarms with Interaction Time Delays
LIU Qun,WANG Lan-fen,LIAO Xiao-feng,WU Yu
Computer Science. 2010, 37 (2): 242-245. 
Abstract PDF(315KB) ( 390 )   
RelatedCitation | Metrics
This paper proposed a general delayed swarm model and studied its collective behavior in the presence of communication time delays under general conditions satisfied by several environment profiles. The results show that the swarm members eventually converge to a bounded region around the swarm center along an attractant/repellent profile. Simulations proved that the time-delay swarm may display more complex dynamics, including stability and oscillation, depending on the delay values; the results also show the motion of the center.
Colored Multiway Cuts in Almost Trees with Parameter k
LI Shu-guang,XIN Xiao
Computer Science. 2010, 37 (2): 246-249. 
Abstract PDF(330KB) ( 392 )   
RelatedCitation | Metrics
The colored multiway cut problem generalizes the traditional multiway cut problem and is motivated by data partitioning in peer-to-peer networks. Given a graph G with color-dependent edge weights and a partial coloring of some distinguished vertices, the colored multiway cut problem is to extend the partial coloring so that all vertices are colored and the total weight of edges with differently colored endpoints is minimized. A polynomial-time exact algorithm was presented for almost trees with parameter k. That is to say, the colored multiway cut problem is fixed-parameter tractable with respect to the parameter k, defined as the maximum, over all biconnected components C of the graph G, of the number of edges that must be removed from C to obtain a tree.
Design of Hierarchical Weighted Neural Network Control System for City Traffic in Single Intersection
XU Xin
Computer Science. 2010, 37 (2): 250-252. 
Abstract PDF(238KB) ( 488 )   
RelatedCitation | Metrics
A city traffic system is a very complicated non-linear system for which it is very difficult to build a precise mathematical model, while BP neural networks have advantages in self-learning and self-adaptation. In this paper, for the control problem at a single intersection, a hierarchical weighted neural network controller based on an improved BP neural network algorithm was proposed and designed, taking both key and non-key traffic flows into account. It is used to control traffic in real time. The simulation results show that this control method outperforms traditional methods.
Research on Adaptive Grey Prediction Method for Inverted Pendulum Networked Control Systems
WU Hong-ying,WEI Li-sheng
Computer Science. 2010, 37 (2): 253-255. 
Abstract PDF(249KB) ( 420 )   
RelatedCitation | Metrics
This paper proposed a new adaptive grey prediction control strategy based on research into grey theory, adaptive switching methods and MIMO networked control systems. The whole modeling procedure of this method was established, and the equal-dimension GM(1,1) model was built using the metabolic principle. The method identifies only two parameters and avoids solving the Diophantine equation and the inverse matrix online, so the computational load of the algorithm is greatly reduced and its real-time performance is improved. The efficacy of the proposed method is shown by simulation results from an inverted pendulum networked control system example.
Color Image Retrieval Based on Multiple Features of Image Edges
YANG Fang-yu,WANG Xiang-yang
Computer Science. 2010, 37 (2): 256-260. 
Abstract PDF(412KB) ( 474 )   
RelatedCitation | Metrics
An image edge is the boundary between an object and the background, and also indicates the boundary between overlapping objects. Image edges are an important clue to understanding image content, as they mark the locations of discontinuities in depth, surface orientation or reflectance. An edge-based color image retrieval method using multiple features was proposed. Firstly, the color edges are extracted using the Canny detection operator. Secondly, the color histogram and the direction histogram of the extracted color edge image are computed as image features. Finally, the similarity between color images is computed using a combined feature index based on the two kinds of histograms. Experimental results show that the proposed image retrieval method is more accurate and efficient in retrieving the images a user is interested in.
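Once the two edge histograms are computed, a combined similarity index can be formed by comparing each pair of histograms and blending the results. The histogram-intersection measure and the linear weight `w` below are illustrative assumptions; the paper's combined index may be defined differently.

```python
import numpy as np

def combined_similarity(ch_a, dh_a, ch_b, dh_b, w=0.5):
    """Similarity between two images from their color histograms
    (ch_*) and edge-direction histograms (dh_*), using normalized
    histogram intersection blended with weight w."""
    def intersect(h1, h2):
        h1 = np.asarray(h1, dtype=float)
        h2 = np.asarray(h2, dtype=float)
        return float(np.minimum(h1 / h1.sum(), h2 / h2.sum()).sum())
    return w * intersect(ch_a, ch_b) + (1 - w) * intersect(dh_a, dh_b)
```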
Registration for Breast Magnetic Resonance Imaging Using Demon Algorithm
WANG Yang-ping,DANG Jian-wu,DU Xiao-gang,LI Sha,TIAN Zhong-ze
Computer Science. 2010, 37 (2): 261-263. 
Abstract PDF(375KB) ( 662 )   
RelatedCitation | Metrics
In order to improve the registration speed of breast dynamic contrast-enhanced magnetic resonance images (DCE-MRI) obtained at different time points, this paper presented a registration model for breast DCE-MRI using a fast demon non-rigid registration algorithm with intensity correction. The original demon algorithm derives deformation parameters from intensity changes and is unsuitable for breast DCE-MRI. Polynomial-based intensity correction between the pre- and post-contrast images was therefore suggested to overcome this problem, according to the signal enhancement of the breast model. The experimental results show that the presented approach performs better than the demon algorithm alone and is remarkably faster than the free-form deformation non-rigid registration algorithm at nearly the same precision.
Two New Digital Image Encryption Effect Evaluation Criterions
ZHANG Xue-feng,FAN Jiu-lun
Computer Science. 2010, 37 (2): 264-268. 
Abstract PDF(439KB) ( 745 )   
RelatedCitation | Metrics
Image information entropy and the average gray modification are two familiar criteria for evaluating the effect of image encryption. We pointed out that these two criteria are obviously influenced by the size of the image, and presented two new evaluation criteria: the histogram proportion degree and the run statistic. The histogram proportion degree can be used to evaluate encryption algorithms based on modifying the gray values of image pixels, and the run statistic can be used to evaluate encryption algorithms based on permuting the coordinates of image pixels. The proposed criteria are simple to express and convenient to compute; their main advantages are that they are little affected by the image size and that the computation involves only the encrypted image.
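For reference, the familiar entropy criterion that the paper critiques can be computed as follows; an ideally encrypted 8-bit image approaches 8 bits per pixel, but on small images the histogram is too sparse to reach that bound, which is the size dependence the new criteria avoid.

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy (bits/pixel) of an 8-bit image's gray-level
    histogram: 0 for a constant image, up to 8 for a perfectly
    uniform gray-level distribution."""
    hist = np.bincount(np.asarray(gray, dtype=np.uint8).ravel(),
                       minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())
```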
Locate Tampered Image Region Automatically Based on Inconsistency of JPEG Blocking Artifacts
WANG Xin,LU Zhi-bo
Computer Science. 2010, 37 (2): 269-273. 
Abstract PDF(506KB) ( 746 )   
RelatedCitation | Metrics
Inconsistency of blocking artifacts is introduced when image regions with different JPEG quality factors or block grids are spliced together for forgery. A novel image forgery detection method was proposed to locate the tampered regions automatically. The compression noise was first extracted from the image using a wavelet-based denoising filter. The proposed method measured the JPEG blocking artifact using the compression noise so as to improve the SNR. Then an appropriate threshold was found by an iterative method to separate tampered regions in the histogram of blockiness metrics. Experimental results were presented to demonstrate the effectiveness and reliability of our method.
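The abstract does not specify which iterative thresholding scheme is used; a standard candidate for separating two modes of a histogram is the Ridler-Calvard (isodata) iteration, sketched below as an assumption, not the paper's exact method.

```python
import numpy as np

def iterative_threshold(values, tol=1e-3):
    """Ridler-Calvard style iterative threshold: repeatedly set the
    threshold to the midpoint of the two class means until it settles."""
    v = np.asarray(values, dtype=float)
    t = v.mean()
    while True:
        lo, hi = v[v <= t], v[v > t]
        if len(lo) == 0 or len(hi) == 0:
            return t                      # degenerate: one class is empty
        new_t = 0.5 * (lo.mean() + hi.mean())
        if abs(new_t - t) < tol:
            return new_t
        t = new_t
```

Applied to per-block blockiness scores, values above the returned threshold would be flagged as candidate tampered blocks.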
Robust Renal Artery Flow Signal Extraction Method for Ultrasonic Doppler Images
WANG Jie,YANG Meng,CAI Sheng,LI Jian-chu,TANG Ping
Computer Science. 2010, 37 (2): 274-276. 
Abstract PDF(297KB) ( 447 )   
RelatedCitation | Metrics
Extraction of the renal artery ultrasonic Doppler blood flow velocity signal is a prerequisite for the feature extraction of blood flow velocity signals, and features of these signals are an important means for early diagnosis of renal arterial stenosis (RAS). To extract signals from ultrasonic images, a locally adaptive confidence connected segmentation method was proposed to separate the signal region from the background. The segmentation is effective even under intensity inhomogeneity and strong noise. For the segmented image, connected-components labeling was used to fill small gaps and to eliminate the impact of interference information. The extracted signal was corrected according to its local statistics and filtered with Mean Shift for noise smoothing. All these steps result in a robust extraction of the blood flow velocity signal sequence in the renal artery. The experimental results show that the signal sequence can be extracted accurately and robustly with this method.
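Confidence connected segmentation grows a region from a seed, accepting neighbours whose intensity lies within mean ± k·std of the pixels already accepted. The sketch below is a minimal, non-adaptive version of that idea (the paper's locally adaptive variant is not specified in the abstract); seed-neighbourhood statistics and the multiplier k are the usual knobs.

```python
import numpy as np
from collections import deque

def confidence_connected(img, seed, k=2.5):
    """Grow a region from `seed`, keeping 4-connected neighbours whose
    intensity lies within mean +- k*std of the seed neighbourhood."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    y0, x0 = seed
    nb = img[max(0, y0 - 1):y0 + 2, max(0, x0 - 1):x0 + 2]
    lo, hi = nb.mean() - k * nb.std(), nb.mean() + k * nb.std()
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and lo <= img[ny, nx] <= hi:
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask
```

On a bright constant blob over a dark background, the grown mask recovers exactly the blob, which mirrors how the signal region would be separated from the image background.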
Graph Cut Method Based on Non-scalar Distance Metric for Texture Synthesis
ZOU Kun,HAN Guo-qiang,WO Yan,ZHANG Jian-wei
Computer Science. 2010, 37 (2): 277-281. 
Abstract PDF(463KB) ( 471 )   
RelatedCitation | Metrics
The graph cut technique is widely used in patch-based texture synthesis algorithms to optimize patch boundaries. The traditional graph cut method is based on a cumulative distance metric, which sometimes leads the path to take short cuts through high-cost areas. To overcome this problem, a graph cut method based on a non-scalar distance metric was proposed. A minimum cut algorithm based on this metric was presented, and its optimality was proved. The regularity problem of the improved graph cut method was discussed and a solution was provided. Experimental results show that the cutting paths produced by the improved method are smoother and more circuitous, and the patch seams are less obvious.
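For context, the cumulative-metric baseline the paper improves on can be illustrated by a dynamic-programming seam: the total path cost is the sum of per-pixel costs, which is exactly why a path can afford a few very expensive pixels if the rest are cheap. This sketch shows only that baseline, not the proposed non-scalar metric.

```python
import numpy as np

def min_cost_seam(cost):
    """Minimum-cost vertical seam under the cumulative (summed) metric,
    via dynamic programming with 8-connected steps."""
    cost = np.asarray(cost, dtype=float)
    h, w = cost.shape
    acc = cost.copy()
    for y in range(1, h):                      # accumulate best path cost
        for x in range(w):
            acc[y, x] += acc[y - 1, max(0, x - 1):min(w, x + 2)].min()
    seam = [int(np.argmin(acc[-1]))]           # backtrack from the bottom row
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo = max(0, x - 1)
        seam.append(lo + int(np.argmin(acc[y, lo:min(w, x + 2)])))
    return seam[::-1]                          # one column index per row
```

A non-scalar metric would replace the `+=` accumulation with an order on cost vectors (e.g. comparing worst pixels first), so a path cannot buy its way through a high-cost area.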
Simple and High Precision Fast Motion Estimation Algorithm
LU Ji-yuan,CHAO Hong-yang
Computer Science. 2010, 37 (2): 282-285. 
Abstract PDF(382KB) ( 513 )   
RelatedCitation | Metrics
Because of the intense image-quality requirements of digital television, high-precision motion estimation algorithms were recommended instead of simple ones. To benefit from both the speed of simple search and the accuracy of high-precision search, a new method exploiting inter-macroblock characteristics was proposed. According to experiments, a reduction of 70% in execution time is attained while still maintaining the same accuracy as high-precision search.
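One common way to exploit inter-macroblock correlation is to seed the search with a motion vector predicted from already-coded neighbouring blocks and refine only a small window around it, rather than searching the full range. The sketch below shows that generic idea with a SAD cost; it is an illustration under assumptions, not the paper's specific algorithm.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return int(np.abs(a.astype(np.int64) - b.astype(np.int64)).sum())

def predicted_search(cur, ref, bx, by, B=8, pred=(0, 0), r=2):
    """Refine a motion vector in a small window around a predictor taken
    from neighbouring macroblocks, instead of a blind full search."""
    best_cost, best_mv = None, pred
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            mx, my = pred[0] + dx, pred[1] + dy
            x, y = bx + mx, by + my
            if 0 <= x <= ref.shape[1] - B and 0 <= y <= ref.shape[0] - B:
                c = sad(cur[by:by + B, bx:bx + B], ref[y:y + B, x:x + B])
                if best_cost is None or c < best_cost:
                    best_cost, best_mv = c, (mx, my)
    return best_mv, best_cost
```

Because the refinement window has (2r+1)^2 candidates regardless of the full search range, a good predictor gives large speedups while a small local search preserves precision.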
Image Spatial Semi-fragile Watermarking Algorithm with High Classification Capability of Attack Types
XIAO Lei
Computer Science. 2010, 37 (2): 286-289. 
Abstract PDF(323KB) ( 422 )   
RelatedCitation | Metrics
An image spatial semi-fragile watermarking algorithm was proposed by exploiting the contrast sensitivity of the host image. The watermark bit is embedded into the host image by adaptive least significant bit (LSB) substitution, and the bit plane of each pixel used for data embedding is determined by the contrast sensitivity. The algorithm theoretically deduces the adaptive threshold for automatic tampering detection and location, and has good transparency because of the exploitation of contrast sensitivity. Experimental results show that the proposed scheme has good robustness against admissible incidental attacks, while it is sensitive to malicious attacks and can even localize the tampered region precisely. In addition, the algorithm can effectively distinguish incidental attacks from malicious ones and outperforms other semi-fragile watermarking algorithms in the classification of attacks.
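The core bit-plane substitution step can be sketched in a few lines; how the plane index is chosen per pixel from contrast sensitivity is the paper's contribution and is left abstract here (the `plane` argument stands in for that decision).

```python
def embed_bit(pixel, bit, plane):
    """Write one watermark bit into the given bit plane of a pixel value
    (plane 0 is the LSB; higher planes are more robust, less transparent)."""
    return (pixel & ~(1 << plane)) | (bit << plane)

def extract_bit(pixel, plane):
    """Read the watermark bit back from the same bit plane."""
    return (pixel >> plane) & 1
```

Embedding in a higher plane survives mild incidental distortion at the cost of visibility, which is why an adaptive, contrast-driven plane choice helps both transparency and robustness.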
Carry-free Addition in Parallel Based on Ternary Optical Computer
WANG Xian-chao,YAO Yun-fei,JIN Yi
Computer Science. 2010, 37 (2): 290-293. 
Abstract PDF(324KB) ( 535 )   
RelatedCitation | Metrics
This study implemented carry-free addition of two vectors in parallel on the ternary optical computer. Its key part is made up of a piece of monochromatic LCD, whose volume is 38.0 × 65.5 × 2.2 mm³ and power is 0.3 mW, with a layer of polaroid on either side. The parallel carry-free addition was realized in three steps by defining four transformations on the Modified Signed-Digit (MSD) number system, and the required time is independent of the number of digits of the operands. An experiment certified the feasibility and correctness of the parallel carry-free addition on the ternary optical computer. The system can finish the addition of two numbers with 680 bits in MSD code in three steps and in fully parallel fashion.
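The three-step structure can be sketched in software: each step applies a digit-wise transformation, so every position is computed independently and the step count never grows with operand length. The transformation tables below are one standard MSD formulation, assumed for illustration; they may differ from the four transformations the paper defines.

```python
# MSD digits are in {-1, 0, 1}, stored little-endian (digit i has weight 2**i).
# Step-1 table: a + b = 2*t + w, with w chosen opposite in sign to the carry t.
T1 = {2: (1, 0), 1: (1, -1), 0: (0, 0), -1: (-1, 1), -2: (-1, 0)}
# Step-2 table: a carry is produced only for a digit sum of +-2.
T2 = {2: (1, 0), 1: (0, 1), 0: (0, 0), -1: (0, -1), -2: (-1, 0)}

def transform(a, b, table):
    """Apply one digit-wise transform; every position is independent."""
    t, w = [0] * (len(a) + 1), [0] * (len(a) + 1)
    for i in range(len(a)):
        t[i + 1], w[i] = table[a[i] + b[i]]    # carry t has weight 2**(i+1)
    return t, w

def msd_add(x, y):
    """Three-step carry-free MSD addition of two digit lists."""
    n = max(len(x), len(y))
    a, b = x + [0] * (n - len(x)), y + [0] * (n - len(y))
    t, w = transform(a, b, T1)                 # step 1
    t2, w2 = transform(w, t, T2)               # step 2
    return [t2[i] + w2[i] for i in range(len(t2))]   # step 3: no carry left
```

Since each step is a per-digit table lookup, an optical implementation can evaluate all 680 digit positions simultaneously, which is the source of the constant three-step latency.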
Fast Vehicle-logo Location Based on Local Symmetrical Character
XIAO Fei,WANG Yun-qiong,LIU Li-mei,ZHAO Yang
Computer Science. 2010, 37 (2): 298-300. 
Abstract PDF(312KB) ( 531 )   
RelatedCitation | Metrics
A fast method of vehicle-logo location was presented, based on the symmetry of both the vehicle body area and the vehicle logo area. Firstly, the plate and lamp areas are located by vertical interval difference and horizontal maximum local projection, and a rough logo area is obtained from the geometric position and size constraints among the vehicle plate, lamp and logo areas. Secondly, by local symmetry checking and difference horizontal projection in the rough logo area, a more accurate vehicle-logo rectangle is obtained. The experimental results show that the average accuracy of vehicle-logo location is about 94%, and the average time is 21 ms.
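A minimal form of the local symmetry check is to compare a candidate window with its horizontal mirror: logo regions on the vehicle's centerline score near zero. This sketch is an assumed illustration of that check, not the paper's full procedure.

```python
import numpy as np

def symmetry_score(region):
    """Mean absolute difference between a region and its left-right
    mirror; values near 0 indicate strong horizontal symmetry."""
    r = np.asarray(region, dtype=float)
    return float(np.abs(r - r[:, ::-1]).mean())
```

Sliding such a window over the rough logo area and keeping the minimum-score rectangle is one cheap way to refine the location, consistent with the reported millisecond-level runtime.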