Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 36 Issue 11, 16 November 2009
  
Research and Development of Plangraph Based on Heuristic State Search
GU Wen-xiang,WANG Gai-ge,YIN Ming-hao,SUN Yan
Computer Science. 2009, 36 (11): 1-9. 
Traditionally, planning problems are cast in terms of imperative constraints that are either wholly satisfied or wholly violated. Heuristic search has greatly improved planner performance. This paper surveyed the heuristic search methods proposed over the past decade, discussed their pros and cons, and outlined the future of heuristic search.
Survey of Task Computing Paradigm in Ubiquitous Environment
JIANG Fa-qun,LI Jin-tao,ZHU Zhen-min,LUO Hai-yong
Computer Science. 2009, 36 (11): 10-13. 
To minimize distractions of a user's attention and realize user-centric ubiquitous computing, the task computing paradigm has gained increasing acceptance and is regarded as a promising approach for ubiquitous environments. Existing research on the task computing paradigm was analyzed and compared, and a summary of future trends in its development was proposed.
Survey of Research on Probabilistic XML Data Management Techniques
WANG Jian-wei,HAO Zhong-xiao
Computer Science. 2009, 36 (11): 14-17. 
With the rapid development of network applications, a large amount of XML data now exists, and XML has become the primary data style and the standard for data exchange and representation on the Internet. Because of the complexity of the external world, uncertainty is a common intrinsic attribute of data, and uncertain information exists universally. Such uncertain information can usually be represented as probability values in an XML document (a probabilistic XML document), so representing and processing probabilistic XML data is a new research field. Since 2001, a series of research achievements in probabilistic XML data management have been obtained. This paper surveyed the research techniques of probabilistic XML data management, including the probabilistic XML data model, PXML algebra, querying, and prototype systems. Existing problems in current research and new research issues were also discussed.
Survey of Case Decision Techniques and Case Decision Support System
NI Zhi-wei,LI Jian-yang,LI Feng-gang,YANG Shan-lin
Computer Science. 2009, 36 (11): 18-23. 
A case is an integrated representation of human sense, logic and creativity. Case-Based Reasoning (CBR) simulates human analogical learning, and great achievements have been made with CBR in domains where knowledge is lacking. This paper discussed the logical rationality of CBR and case-based intelligent decision techniques. An intelligent decision support system (IDSS) built on CBR can overcome defects such as the poor flexibility of traditional IDSS and provide decision support. Decision support system techniques and the research and development of case decision support systems were surveyed in the end.
Modeling and Analysis of Cooperative Services Based on Stochastic Petri Net
ZENG Ming,YANG Yang,WANG Yuan-zhuo,ZHANG Jing-le
Computer Science. 2009, 36 (11): 24-28. 
A modeling and analysis method for collaborative services based on stochastic Petri nets was put forward: several of the most representative collaborative enterprise service processes were selected and modeled with stochastic Petri nets. Then, taking the logistics subsystem of an e-commerce system as an example, we applied equivalent simplification and performance evaluation to the model, and compared the original and simplified models in terms of system performance and computing time.
Improved Topology Control Algorithm for Wireless Sensor Networks
ZHAO Xue-jian,ZHUANG Yi,OUYANG Jian
Computer Science. 2009, 36 (11): 29-31. 
Topology control algorithms for wireless sensor networks are significant for prolonging the network lifetime, reducing radio interference, and increasing the efficiency of routing protocols and MAC protocols, among other things. Based on an analysis of the XTC (eXemplary Topology Control) algorithm, this paper proposed a modified local distributed topology control algorithm, M-XTC (Modified XTC). The modified algorithm keeps the advantages of XTC: it is simple, practical, works without node position information, is applicable to nodes without special instruments, and suits heterogeneous and three-dimensional networks. Moreover, it is more conducive to prolonging the network lifetime and offers better real-time performance and robustness.
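As an illustration of the neighbor-ordering rule at the heart of the original XTC algorithm (a sketch under assumptions, not the paper's M-XTC: link quality is stood in for by Euclidean distance here, although XTC itself needs no position information):

```python
import math

def xtc_links(nodes):
    """Basic XTC rule (sketch): node u keeps neighbor v unless some
    already-kept neighbor w has better link quality (here: shorter
    distance) to both u and v."""
    def d(a, b):
        return math.dist(nodes[a], nodes[b])

    links = {}
    for u in nodes:
        kept = []
        # Process neighbors in order of decreasing link quality.
        for v in sorted((v for v in nodes if v != u), key=lambda v: d(u, v)):
            if all(not (d(u, w) < d(u, v) and d(v, w) < d(u, v)) for w in kept):
                kept.append(v)
        links[u] = kept
    return links

# Toy topology: node id -> (x, y) position.
nodes = {0: (0, 0), 1: (1, 0), 2: (2, 0.1), 3: (1, 1.5)}
print(xtc_links(nodes))
```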
Method of Web Service Selection Based on SPA
ZHU Hong-ning,ZHANG Bin
Computer Science. 2009, 36 (11): 32-35. 
A method of Web service selection based on SPA (Set Pair Analysis) was advanced. The method has two parts: one determines the sameness, difference and opposition measure intervals of QoS criteria from the non-functional constraints of the Web service; the other evaluates candidate services by connection degree and set pair situation. Testing on an instance shows that it is a simple and convenient method for Web service selection that still ensures accuracy.
Adaptive Deficit IEEE 802.11 PCF Polling Scheme Based on NS-2
LIAO Yong,YANG Shi-zhong,XU Chang-biao
Computer Science. 2009, 36 (11): 36-39. 
An adaptive deficit polling scheme was put forward, aiming to guarantee QoS (Quality of Service) beyond the Round Robin polling scheme under the inefficient WLAN IEEE 802.11 PCF (Point Coordination Function) mechanism. The core idea of the scheme and its adjustment mechanism were described, and it was simulated on the network simulator NS-2 by revising the PCF source code. The results indicate that the adaptive deficit IEEE 802.11 PCF polling scheme outperforms the Round Robin scheme in QoS metrics including end-to-end delay, system throughput, packet delay, etc.
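For readers unfamiliar with deficit-based polling, a minimal generic sketch of the idea follows (the paper's adaptive adjustment and NS-2 integration are not reproduced; the quantum and frame sizes are made up):

```python
from collections import deque

def deficit_poll(queues, quantum, rounds):
    """Deficit round-robin polling (sketch): each polled station earns a
    quantum of credit per round and may transmit frames while it has
    enough credit; unused credit carries over (the 'deficit')."""
    deficit = [0] * len(queues)
    sent = []
    for _ in range(rounds):
        for i, q in enumerate(queues):
            deficit[i] += quantum
            while q and q[0] <= deficit[i]:
                size = q.popleft()
                deficit[i] -= size
                sent.append((i, size))
            if not q:
                deficit[i] = 0  # no backlog: credit does not accumulate
    return sent

# Frame sizes queued at three polled stations.
queues = [deque([500, 800]), deque([1500]), deque([200, 200, 200])]
print(deficit_poll(queues, quantum=600, rounds=3))
```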
Analysis of Resource Competition for Internet Congestion Control Systems with Different Deployment of Source Algorithms
DAI Hang,MU DE-jun,WANG Lin,ZHANG Hui-xiang
Computer Science. 2009, 36 (11): 40-42. 
In future Internet congestion control protocols, several groups of users may deploy different source algorithms to meet different QoS requirements. Resource competition in congestion control systems with a mixed deployment of AIMD and MIMD algorithms was systematically studied, including the system model, steady-state analysis and stability analysis. Finally, NS2-based simulations validated the analysis results and characterized the resource competition between the AIMD and MIMD algorithms.
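A toy discrete-time illustration of the studied competition, with made-up parameters (not the paper's model):

```python
def compete(capacity=100.0, steps=200):
    """Toy competition between an AIMD flow and an MIMD flow sharing one
    link: both grow until the link is congested, then back off
    multiplicatively."""
    aimd, mimd = 1.0, 1.0
    for _ in range(steps):
        if aimd + mimd > capacity:          # congestion signal
            aimd *= 0.5                     # AIMD: multiplicative decrease
            mimd *= 0.5                     # MIMD: multiplicative decrease
        else:
            aimd += 1.0                     # AIMD: additive increase
            mimd *= 1.05                    # MIMD: multiplicative increase
    return aimd, mimd

a, m = compete()
print(f"steady-state shares: AIMD={a:.1f}, MIMD={m:.1f}")
```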
Dynamic Load-balancing Algorithm on Structured P2P Network
LU Chui-wei,LI Zhi-tang,LIN Huai-qin,HUANG Qing-feng, ZHANG Ye-jiang
Computer Science. 2009, 36 (11): 43-46. 
Load balancing is one of the research hotspots of P2P networks. Existing load-balancing techniques suffer from problems such as a low load-balancing degree and excessive assumptions. This paper proposed an enhanced load-balancing algorithm, ELB-P2P. The algorithm assigns to every peer a rational load and a corresponding ID address space that can be dynamically adjusted. In addition, the algorithm introduces a flow-control mechanism and automatically selects lightly loaded peers with low delay and high bandwidth for load diversion. Experiments demonstrate that, compared with the traditional Chord protocol, ELB-P2P balances load faster, has lower load-diversion overhead and better stability, and still achieves a high load-balancing degree when the network load is very heavy, without stern limitations or assumptions on peer properties.
Scheme of Cross-domain Hand-off Based on Hierarchical P2PSIP
LIU Tian-cheng,TAO Jun,DONG Yong-qiang,XIA Qin
Computer Science. 2009, 36 (11): 47-51. 
This paper presented a scheme of cross-domain hand-off based on hierarchical P2PSIP. It solves the problem of positioning a station that moves among different P2P networks by combining P2P overlay location with SIP application-layer mobility support. In actual deployments there are usually multiple adjacent P2P networks in an area; we specified a standalone P2PSIP network as a domain, and an infrastructure of hierarchical P2PSIP networks can be adopted to enable stations in different P2P networks to communicate. When a client moves among different P2P networks, a set of hand-off procedures ensures that other stations can locate the mobile one. Stations can communicate with each other directly in the hierarchical P2PSIP network, and a mobile station can be located and can initiate communication after hand-off.
Research on Intermittent Connectivity for Wireless Multiple-hop Space Network
QIAN Yan-bin,CHEN Xing-yuan,DU Xue-hui,ZHANG Chuan-fu
Computer Science. 2009, 36 (11): 52-55. 
Intermittent connectivity breaks a building block of the traditional network: the end-to-end principle. Space networks pose a further challenge because of long and variable latency, low bandwidth, high noise and asymmetric links. This paper analyzed the essence and definition of intermittent connectivity, and revisited the fundamental tradeoff between end-to-end and hop-by-hop transfer to discuss the performance of the basic store-and-forward mechanism in intermittently connected environments. Analysis and simulation results indicate that hop-by-hop transfer usually outperforms end-to-end transfer in wireless multi-hop space networks; its disadvantages are higher protocol complexity and additional memory and processing requirements.
Data Placement Scheme for Distributed Caching System in Media Streaming Service
GUO Pan-hong,YANG yang,LI Xin-you
Computer Science. 2009, 36 (11): 56-60. 
With the improvement of high-bandwidth access technologies, multimedia streaming has developed quickly and has wide application prospects. As an important solution for reducing server load and improving response time for customers, the streaming proxy technique has become one of the most popular topics in the streaming research community. This paper presented an optimized data placement scheme for a distributed caching system. The idea of the proposed scheme is to place cached data on a particular proxy server so as to minimize the transmission cost of future data accesses. Simulation results demonstrate that the proposed algorithm achieves smaller transmission cost and better scalability compared with traditional cache data placement schemes.
Self-healing Group Key Distribution Scheme Based on Secret Sharing with Access Structures
PENG Qing-quan,PEI Qing-qi,PANG Liao-jun
Computer Science. 2009, 36 (11): 61-64. 
Self-healing group key distribution enables a group of users to establish group session keys for secure communication over unreliable networks. Based on a secret sharing method with access structures in which each user's secret shadow is selected by the user himself, a self-healing group key distribution scheme was proposed. In the scheme each group member's personal secret key is selected by the member himself, the group manager does not have to distribute each member's personal secret key, and no secure channel is required between the group manager and each member. The security of the scheme was analyzed in a system security model; the analysis shows that it is a computationally secure self-healing group key distribution scheme with revocation capability that achieves both forward and backward secrecy. Finally, performance analysis shows that the proposed scheme has lower storage and communication costs.
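For background, a minimal sketch of classic Shamir threshold secret sharing, the family this work builds on; the paper's access-structure variant with self-chosen shadows is more elaborate and is not reproduced here:

```python
import random

P = 2**127 - 1  # a Mersenne prime; all polynomial arithmetic is mod P

def share(secret, n, t):
    """Shamir (t, n) sharing: secret is the constant term of a random
    degree-(t-1) polynomial; share i is the point (i, f(i)) mod P."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, n=5, t=3)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares suffice -> 123456789
```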
Self-adaptive Mechanism of Dynamic Forensics
CHEN Lin,LI Zhi-tang,GAO Cui-xia
Computer Science. 2009, 36 (11): 65-67. 
With the development of intrusion and computer crime technologies, dynamic forensics is becoming more and more important. Dynamic forensics based on intrusion detection and honeypot technologies has great advantages in real-time performance, but these methods are weak in evidence acquisition and system reliability, and it is hard to seize the right moment for investigation. A self-adaptive mechanism was proposed which uses an intrusion detection system as the forensics trigger and a shadow honeypot to verify suspicious attacks, further observing and analyzing attack activities to gather key evidence. The finite state machine model of this mechanism was illustrated, and key technologies such as the shadow honeypot, state transition timing and secure evidence storage were described. A dynamic forensics system with this mechanism can tolerate intrusion to a certain degree and keep the investigation process under control. Moreover, the amount of unnecessary evidence can be reduced considerably.
Study of Anomaly Intrusion Detection System on False Positive Rate
CHAI Zheng-yi, WANG Hong-hai
Computer Science. 2009, 36 (11): 68-70. 
The false positive rate of an intrusion detection system (IDS) affects its detection credibility. After analyzing IDS credibility and the false positive rate of anomaly IDS, methods to reduce the false positive rate were presented, including process-based detection and a multi-detection-system model. Emphasis was put on dynamically constructing the normal profile based on artificial immunity to restrain false positives, and a simulation experiment was conducted. The results show that the method can improve detection efficiency and reduce the false positive rate.
Security Analysis of Mechanisms for the Improved Polymorphic Cipher
YIN Yi-feng, HU Yu-pu
Computer Science. 2009, 36 (11): 71-74. 
In contrast to most commonly known symmetric encryption algorithm designs (including AES candidates such as Rijndael and Twofish), the Polymorphic Cipher (PMC) is immune to differential power attacks. The algorithm is mostly used to encrypt disk files. The problem to be solved is to improve the Polymorphic Cipher for P2P and to provide mass-produced session keys for two parties across a communication channel. A strong one-way function satisfying the Strict Avalanche Criterion (SAC) was constructed, and its security was analyzed through experimental results and related theory.
Multi-layer Integrated Anomaly Detection of Mobile Nodes Behaviors in Mobile Ad Hoc Networks
WANG Tao,YU Shun-zheng
Computer Science. 2009, 36 (11): 75-78. 
Mobile Ad hoc Networks are very vulnerable to malicious attacks due to the nature of the mobile computing environment, such as wireless communication channels, limited power and bandwidth, and a dynamically changing, distributed network topology. Existing Intrusion Detection Systems (IDS) have provided little evidence that they apply to a broader range of threats. Based on a generalized, cooperative intrusion detection architecture proposed as the foundation for all intrusion detection, we presented an anomaly detection mechanism to discriminate illegitimate network behaviors of mobile nodes. By collecting observation sequences from multiple protocol layers, a Hidden semi-Markov Model (HSMM) was explored to describe the network behaviors of legitimate nodes and to implement anomaly detection for various malicious attacks. We conducted extensive experiments in the ns-2 simulation environment to evaluate and validate our approach.
Study on GEP Rule Extraction Algorithm for Network Intrusion Detection
TANG Wan,CAO Yang,YANG Xi-min,QIN Jun
Computer Science. 2009, 36 (11): 79-82. 
Network intrusion detection based on machine learning suffers from a low detection ratio for unknown intrusions and low detection efficiency due to many complex rules. To solve these problems, a constraint-based gene expression programming (GEP) rule extraction algorithm (CGREA) was proposed. Intrusion detection rules were represented on the GEP model, and a constraint grammar was defined to guarantee the closeness and adequacy of the rules. The algorithm restricts the ratio of randomly selected symbols in the gene head of GEP rules and uses an elitist strategy to guarantee convergence. The KDD CUP'99 data set was used to evaluate the rules auto-extracted by CGREA. A detection probability of 91% was achieved, and the detection probabilities for three unknown attacks exceeded 88%. These results indicate that the rules extracted by CGREA are effective, simple, and capable of detecting unknown intrusions, while the efficiency of rule generation and detection is improved.
Optimization Deployment Algorithm for Network Efficiency of Linear Wireless Sensor Networks
LIU An-feng,NIE Hong-wei,WU Xian-you,XIAO Zhi-dong,CHEN Zhi-gang
Computer Science. 2009, 36 (11): 83-87. 
In a multi-hop wireless sensor network (WSN), the sensors closest to the sink tend to deplete their energy faster than other sensors, creating an energy hole around the sink. This paper adopted network efficiency, the network lifetime per unit deployment cost, as a more reasonable optimization goal. How to avoid the energy hole and maximize network efficiency through effective node deployment, knowing only the scale of the network and the sensing radius of the nodes, is a challenging problem. This paper gave an effective node deployment algorithm and worked out the optimal number of working nodes, the best deployment of relay nodes, and the optimal node transmission distance. Compared with uniform and non-uniform deployment algorithms, the algorithm not only avoids the energy hole but also effectively improves network efficiency. It is therefore of great significance for constructing low-cost wireless sensor networks.
Research on a Dynamic Grid Service Discovery Model and Algorithm
ZHANG Wei,LI Qin-hua,MA Dan
Computer Science. 2009, 36 (11): 88-92. 
Services join and withdraw from the grid frequently, which makes dynamic grid service discovery a difficult issue. This paper presented a two-dimensional model for dynamic service discovery. Based on this model, the service registration, refreshing and cancellation protocols and the discovery algorithm were described. A partially centralized, globally distributed service management and discovery mechanism was applied in the model, distributing function and property matching over different nodes and boosting the overall performance of dynamic grid service discovery. Simulation results show the model performs well on such indexes as grid scale, grid busy degree, grid dynamic degree and partition granularity.
Adaptive FH Cosite Interference Cancellation Using Laguerre Filter
YUAN Xiao-gang,HUANG Guo-ce,LIU Jian,GUO Xing-yang
Computer Science. 2009, 36 (11): 93-96. 
Frequency-selective frequency hopping (FH) cosite interference is a typical long-impulse-response interference. The Laguerre structure is a compromise between Finite Impulse Response (FIR) and Infinite Impulse Response (IIR) structures, so a Laguerre filter can match this long impulse response effectively while retaining good stability. Based on the characteristics of the cosite multi-path channel, a new adaptive Laguerre FH cosite interference canceller was proposed, and the canceller's algorithm and the estimation of the optimum pole of the Laguerre filter were presented. Simulation results show the new canceller achieves much better cancellation performance and stability in both time-invariant and time-varying channels.
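A minimal sketch of a Laguerre filter bank cancelling a long synthetic echo, assuming a pole of 0.95 and 8 taps (illustrative only; the paper's adaptive algorithm and optimum-pole estimation are not reproduced):

```python
import numpy as np
from scipy.signal import lfilter

def laguerre_outputs(x, a, n_taps):
    """Outputs of an n_taps-stage Laguerre network with pole a: a
    first-order low-pass followed by a chain of identical all-pass
    sections, so a few taps span a long impulse response."""
    g = np.sqrt(1 - a**2)
    stages = []
    s = lfilter([g], [1.0, -a], x)            # low-pass front end
    stages.append(s)
    for _ in range(n_taps - 1):
        s = lfilter([-a, 1.0], [1.0, -a], s)  # all-pass section
        stages.append(s)
    return np.array(stages)

# Fit tap weights by least squares to cancel a long-echo interference d.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)               # reference (own transmit) signal
echo = np.exp(-0.02 * np.arange(300))       # long exponential echo path
d = np.convolve(x, echo)[:2000]             # observed cosite interference
S = laguerre_outputs(x, a=0.95, n_taps=8)
w, *_ = np.linalg.lstsq(S.T, d, rcond=None)
residual = d - w @ S
print(f"cancellation: {10*np.log10(np.mean(d**2)/np.mean(residual**2)):.1f} dB")
```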
Method of Network Security Situation Prediction Based on Likelihood BP
TANG Cheng-hua,YU Shun-zheng
Computer Science. 2009, 36 (11): 97-100. 
Situation prediction is the advanced stage of network security situation awareness. To resolve the limitations of depending on experts to assign weights and the lack of self-learning in data processing for complex network systems, a method of network security situation prediction based on likelihood BP was proposed. The BP neural network was introduced to situation prediction, with the traditional error function replaced by a maximum likelihood error function. Situation sequences established through the situation assessment model were used as training input, and the designated parameters were adjusted by self-learning during back-propagation training. The method remains applicable as networks grow more complex and the granularity becomes finer, with higher efficiency. Experimental results show that the method performs well in situation prediction and provides a new solution for network security situation prediction.
Group Key Management Protocol Using Geometric Approach and Binary Key Tree
GE Li-na,TANG Shao-hua
Computer Science. 2009, 36 (11): 101-105. 
Many emerging applications are based on a group communication model. A new group key management scheme for secure group communication based on a geometric approach was proposed. The scheme has three phases: user registration, group key assignment and group key computation. In user registration, the group manager computes and gives a secret to the new user over a secure channel using geometric techniques. In group key assignment, the group manager first constructs a secret circle using the group key, then computes a shadow of the group key for each member based on the member's private key; each member also obtains an additional secret point based on his private key. In group key computation, a member reconstructs the secret circle from its shadow and the public information, and then obtains the group key. On top of this basic scheme, a binary tree of keys was introduced to redesign the scheme, reducing the rekeying computation complexity from O(m) to O(log m) while the public information on the notice board stays the same. No secure channel is needed when the group key is updated, so the scheme is scalable.
Cryptanalysis of the Hash Function MD5
MAO Ming, CHEN Shao-hui,YUAN Zheng,JIA Yong-xing
Computer Science. 2009, 36 (11): 106-108. 
This paper cryptanalyzes the hash function MD5, building on the MD5-breaking work of Wang et al. Taking the eighth step of the MD5 algorithm as an example, the properties of the F function and the effective control of the differential path were introduced. MD5 was cryptanalyzed through both manual calculation and a testing program, and part of Wang's paper was amended. Finally, the conditions and key points for meeting the differential characteristics were discussed. Overall, this paper contributes to the cryptanalysis and breaking of the MD5 algorithm.
Design of Structured LDPC Codes with Large Girth
ZHANG Wei,ZHU Guang-xi,PENG Li,SHEN Qiong-xia
Computer Science. 2009, 36 (11): 109-112. 
A parity-check matrix H with large girth is important for improving the performance of LDPC codes, and an algebraic code structure is key to encoder implementation. This paper proposed a novel low-complexity code construction algorithm based on Column-Difference Search (CDS), which can design regular quasi-cyclic LDPC codes with large girth and arbitrary code rate, has linear encoding complexity, and is friendly to hardware implementation. Experimental results show that CDS-LDPC codes of different rates perform better than Tanner codes and array codes, with gains of 0.79-3.28 dB over these two classical QC-LDPC codes, and also outperform comparable random codes. In addition, CDS-LDPC codes offer more flexibility in the design of code length and rate.
Secure Itinerary Protection Based on Mobile Agents
LIU Yi,DING Yong,PANG Liao-jun
Computer Science. 2009, 36 (11): 113-115. 
Mobile agents are software programs that can autonomously migrate from host to host to fulfill their goals. This also creates significant new security threats from malicious agents and hosts, and route protection is one of them. It is important to protect the route of a mobile agent that will visit a list of hosts or dispatch mobile agents to other hosts. Available route protections were analyzed and discussed, and a new mobile agent route protection scheme based on non-commutative associative one-way functions was given. Analysis shows that this method not only satisfies the general security properties for agent route protection but also has lower computational cost than existing schemes.
Home Networks-oriented Digital Rights Management System
LI Ping,LU Zheng-ding,ZOU Fu-hao,LING He-fei
Computer Science. 2009, 36 (11): 116-119. 
With the popularization of consumer electronic products, home networks are gradually prevailing; at the same time, copyright infringement of digital content on networks is very serious. A digital rights management (DRM) system for protecting the copyrights of digital content in home networks was proposed. In the proposed system, the home gateway device manages the user electronic terminals in the home network and interacts with the external DRM server to enforce DRM functions as their agent. The group key technique adopted by the system ensures that only legal user terminals can decrypt and consume encrypted digital content, and content can be shared in the home network through super-distribution between user terminals.
Anonymous Proxy Signature Scheme without Trusted Party Based on Elliptic Curve
JIN Hong,WANG Xiang-hai
Computer Science. 2009, 36 (11): 120-122. 
It is found that the Uu scheme lacks strong unforgeability, has low implementation efficiency, and carries redundant data due to its construction on the discrete logarithm problem. This paper proposed a new anonymous proxy signature scheme without a trusted party based on ECC and analyzed its essential security properties. The new scheme preserves unforgeability, has no redundant data in the signature, and offers high security and high implementation efficiency.
Data Organization Research of the High Availability Object Storage System
ZHAN Ling,ZHANG Qiang-shan,WAN Ji-guang
Computer Science. 2009, 36 (11): 123-126. 
Based on a thorough analysis of the fault tolerance of various existing storage systems, we proposed a new hierarchical, highly reliable, multi-disk fault-tolerant storage architecture: the High Availability Object Storage System (HAOSS). The HAOSS has two layers. The upper layer achieves high availability by storing multiple replicas of each storage object on a set of storage devices; individual replicas can serve I/O requests in parallel for high performance, but the effective disk space utilization of this layer is relatively low. The lower layer deploys RAID5, RAID6 or RAID-Blaum coding schemes to tolerate multi-disk failures; the disk utilization of coding schemes is higher than that of replication, at the price of more complicated fault-tolerant coding, which involves a large amount of encoding computation and adversely impacts I/O performance. The HAOSS puts new and hot objects in the upper layer so that the majority of requests are absorbed there, guaranteeing system I/O performance, while the lower layer mainly serves as a reservoir for cold data. In a 1000 Mb Ethernet environment with a request block size of 1024 KB, the sequential read performance of a HAOSS server reaches 104 MB/s, very close to the theoretical maximum effective bandwidth of Ethernet.
Research on Model-based Testing on AADL
WANG Geng,ZHOU Xing-she,ZHANG Fan,DONG Yun-wei
Computer Science. 2009, 36 (11): 127-130. 
As MDA becomes a popular software development methodology, ensuring the quality of model-based software is a hot issue. This article studied model-based testing (MBT) focusing on AADL models, and an algorithm was proposed to carry out model-based testing on AADL with Markov chains. An example was given at the end of the article to demonstrate the algorithm.
Deontic Petri Net for Modeling and Analyzing Social Interactions in Collaborative Organizations
CAI Guo-yong,DONG Rong-sheng,GU Tian-long
Computer Science. 2009, 36 (11): 131-135. 
In an open, dynamic networked environment filled with heterogeneous services, a challenge in constructing service-oriented collaborative organizations is the complicated social interaction among autonomous service agents. An extended reference Petri net (called deontic Petri net, DPN) was proposed, which incorporates the role and norm concepts of organizational modeling with the nets-within-nets and synchronous channel concepts of the reference net model. Methods for modeling and analyzing collaborative interactions among service agents with DPN were presented. An electronic institution model was applied to demonstrate the advantages of DPN, and a case study shows the usefulness of the presented approach.
Algorithm for Finding Expression of Petri Net Language
ZHANG Ji-jun,FAN Hao,GENG Xia
Computer Science. 2009, 36 (11): 136-139. 
A Petri net language is a set of sequences describing Petri net behavior. The behavioral characteristics of a Petri net were analyzed based on its state transition diagram in order to give a formula description of its language. The expression of a Petri net language was defined, and an algorithm for finding the expression was given, providing a new method for describing and analyzing the language of a Petri net.
Global Function Optimization Based on Gene Expression Programming with Differential Evolution
LI Tai-yong,TANG Chang-jie,WU Jiang,QIU Jiang-tao
Computer Science. 2009, 36 (11): 140-142. 
To improve the efficiency of function optimization via Gene Expression Programming (GEP), Differential Evolution (DE) was introduced into GEP and a novel algorithm called DEGEPO was proposed. The main work of this paper includes: (1) the GEP gene was redesigned for global function optimization; (2) novel mutation and crossover operations were applied; (3) a parameter optimization algorithm based on GEP with DE, DEGEPO, was proposed and analyzed; (4) experiments demonstrated the efficiency and effectiveness of DEGEPO. Compared with basic GEP, the precision of DEGEPO improved by 2-4 orders of magnitude on average.
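For reference, a plain DE/rand/1/bin loop on a test function, a sketch of standard differential evolution rather than the DEGEPO algorithm itself:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=40, F=0.5, CR=0.9, iters=200):
    """DE/rand/1/bin: mutate with a scaled difference of two random
    individuals, binomially cross with the parent, keep the better one."""
    rng = np.random.default_rng(0)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(p) for p in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True      # ensure at least one gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft < fit[i]:
                pop[i], fit[i] = trial, ft
    return pop[fit.argmin()], fit.min()

sphere = lambda x: float(np.sum(x**2))
x_best, f_best = differential_evolution(sphere, bounds=[(-5, 5)] * 10)
print(f_best)  # near 0
```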
VINCASim: A Simulation Toolkit for Evaluating Dependability of Grid Workflows
ZHAO Xiao-wei,ZHANG Li-yong, HAN Yan-bo
Computer Science. 2009, 36 (11): 143-147. 
As a special kind of "programming" technology for constructing problem-solving applications on top of grid resources, grid workflow has been widely applied, and methodologies for ensuring the dependability of grid workflows have attracted attention. However, evaluating the effectiveness of these methodologies remains a challenge due to the complexity and uncertainty of grid environments. Based on the VINCA grid workflow, key factors affecting dependability were systematically analyzed, and a general component model and dependability attribute model for grid workflow systems were established. A configurable and extensible simulation toolkit, VINCASim, for evaluating the dependability of grid workflows was developed on top of GridSim. The toolkit can simulate various failures of grid nodes, workflow engines, network connections and workflow execution in a configurable manner, and supports programmatically incorporating different dependability-ensuring methods, providing a controllable and repeatable experimental platform for evaluating them. Usability was demonstrated through a use case scenario.
Synopsis Data Structure Based Mixture Probabilistic Density Data Stream Clustering Approach
LIU Jian-wei, LI Wei-ming
Computer Science. 2009, 36 (11): 148-151. 
Many current and emerging applications require support for on-line analysis of rapidly changing data streams. The limitations of traditional DBMSs and data mining in supporting streaming applications have been recognized, prompting research to augment existing technologies, build new systems to manage streaming data, and propose new stream-mining algorithms. A synopsis-data-structure-based mixture probabilistic density data stream clustering approach was proposed, which requires only the newly arrived data, not the entire historical data, to be kept in memory. The approach incrementally updates the density estimate from only the newly arrived data and the previously estimated density, using three distance metric criteria to judge whether a newly arriving component should be merged into an existing component of the Gaussian mixture model or added as a new one. Experimental results demonstrate that the algorithm is feasible and yields high-quality clustering results.
Research and Application of Awareness in Social Network Sites
LI Jian-guo,TANG Yong,YAO Liang-chao,ZHANG Wen-sheng,FANG Wen-chong
Computer Science. 2009, 36 (11): 152-156. 
Social networks have experienced exponential growth in membership and have become a research hotspot in recent years. They help users organize friendship links and maintain social relations. The key points of social networks are maintenance of personal profiles and awareness of other users' information. Considering the lack of awareness information, this paper empirically explored whether the concept of awareness can be fruitfully adapted to the field of social networks. We analyzed the concept and connotation of awareness, compared groups in CSCW with those in social networks, discussed how social network awareness information forms, studied awareness from the perspective of social network environments and groups, extended and adjusted awareness technology to social networks, and improved communication and interactivity in social networks. Finally, we implemented an academic community, a social network site supporting awareness, which helps researchers find research foci and groups and advances academic exchange and innovation.
Lifecycle of Modeling Business Process Based on Pi Calculus
GUO Xiao-qun,HAO Ke-gang,HOU Hong,DING Jian-jie
Computer Science. 2009, 36 (11): 157-159. 
With the dramatic competition among enterprises, business modeling technology is becoming more and more important. Formal business process models receive much attention in the scientific community because formal methodology has proved effective in avoiding ambiguity and feasible in supporting model analysis and verification. But at present there is no suitable systematic theory and methodology to support modeling business processes easily and analyzing and verifying models formally. This paper discussed how to use formal methodology when modeling business processes from the viewpoint of the lifecycle. The contribution of this paper is a lifecycle based on the Pi calculus, with a discussion of the technology and tools used in every stage of the lifecycle.
Parallel-computing Strategy for Large-scale Heat Equation Based on PETSc
CHENG Tang-pei,WANG Qun
Computer Science. 2009, 36 (11): 160-164. 
A parallel-computing strategy was presented to solve large-scale heat equations. Distributed memory and compressed matrix technology were adopted for both storage and evaluation of large-scale sparse matrices. Krylov subspace methods and preconditioners were introduced to assemble and solve the linear systems of equations. The implementation was written in high-level abstractions based on object-oriented technology, which promotes code reuse and flexibility and helps decouple issues of parallelism from algorithm choices. Experiments on Linux clusters demonstrate that this strategy achieves desirable speedup and efficiency.
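PETSc itself is a C library (usable from Python via petsc4py); as a rough serial analog of the same ingredients, here is a sketch of a sparse heat-equation system solved with a Krylov method (conjugate gradients) and a simple Jacobi preconditioner, using SciPy:

```python
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import cg, LinearOperator

# One implicit-Euler step of the 2-D heat equation on an n x n grid:
# (I + dt * c * A) u_new = u_old, with A the 5-point Laplacian stencil.
n, dt = 128, 1e-3
T = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = kron(identity(n), T) + kron(T, identity(n))    # sparse 2-D Laplacian
K = (identity(n * n) + dt * (n + 1) ** 2 * A).tocsr()
u_old = np.random.default_rng(0).random(n * n)

# Jacobi (diagonal) preconditioner, the simplest PETSc-style PC.
inv_diag = 1.0 / K.diagonal()
precond = LinearOperator(K.shape, matvec=lambda v: inv_diag * v)

u_new, info = cg(K, u_old, M=precond)
print(info, np.linalg.norm(K @ u_new - u_old))     # info == 0 -> converged
```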
Research on Integration of Safety Analysis in Model-driven Software Development
CHEN Feng,LI Wei-hua,FANG Ding-yi,CHEN Xiao-jiang
Computer Science. 2009, 36 (11): 165-168. 
This paper proposed a new method aiming to integrate safety analysis into software design and development. Model Driven Architecture is used as the basic framework. By making use of UML extensions, UMLsec models the Platform Independent Model of software security, capturing more security requirements in the initial stage of the system design cycle. This approach reduces the risk and cost of software development and improves reusability.
Workflow Process Definition Reuse Method in Software Product Line Architecture
GONG Xiao-qing,LIU Feng,GE Wei,HAO Ke-gang
Computer Science. 2009, 36 (11): 169-172. 
A problem in the practical application of workflow management systems is that workflow processes are defined with low efficiency and productivity due to the complexity of actual business processes. An approach to this problem was proposed by adopting the ideas and techniques of software reuse. The solution is based on software product line architecture and aims to improve the efficiency of defining domain-specific workflow processes. Two types of reusable assets, domain business ontology and workflow templates, were identified, and how to construct, describe and retrieve these assets was discussed in detail. The steps of reuse-based workflow process definition were explained. As a demonstration, a definition tool was developed, and the method proved feasible and more productive in practice.
Design and Implementation of a High-extensible Satellite Scheduling System
CHEN Hao,JING Ning,TANG Yu,LI Jun,YANG Jian
Computer Science. 2009, 36 (11): 173-176. 
To address the poor extensibility of existing satellite scheduling systems, a highly extensible satellite scheduling system named HES3 (High-Extensible Satellite Scheduling System) was designed and implemented on a framework similar to the High Level Architecture (HLA). A two-stage scheduling process composed of an expert system and a constraint-independent scheduling algorithm was proposed to ensure the high extensibility of HES3. A Graphical User Interface (GUI) was designed to illustrate the scheduling result in a Gantt-like view and a Geographic Information System (GIS) based view. Finally, a demonstration showed how HES3 works.
Distributed Function Mining for GEP on Grid Services
DENG Song,WANG Ru-chuan,REN Xun-yi
Computer Science. 2009, 36 (11): 177-181. 
This paper presented distributed function mining for GEP on grid services (DFMGEP-GS), which combines grid services and the GEP algorithm so as not only to realize GEP function finding on a grid platform, but also to improve the global optimization of GEP across grid nodes. It also proved the method by which the global data model is obtained from local data models on the grid. Simulation shows that, for data with a known function type, as datasets grow the average time of DFMGEP-GS is less than that of three other algorithms while still mining the target function successfully, and that as grid nodes increase the convergence speed of DFMGEP-GS improves by up to about 17 times. For very complex data with unknown function type, the error of DFMGEP-GS is the smallest.
Researches on Ubiquitous Computing Service Matching
LU Qing-cong,ZHOU Ji-liang,YANG Fan,CAO Qi-ying
Computer Science. 2009, 36 (11): 182-185. 
Ubiquitous computing environments should provide suitable services according to users' requirements, so services must be matched between service providers and user requirements. However, syntax-based and semantic-based service matching methods do not adapt well to ubiquitous computing requirements. By analyzing syntax-based and semantic-based service matching for ubiquitous computing, and considering the special requirements of efficiency, resource constraints and context awareness in ubiquitous environments, we gave a corresponding solution; meanwhile we analyzed composite service matching methods.
Community Partitioning Algorithm Based on Spectral Bisection Method
XIE Fu-ding,ZHANG Lei,JI Min,HUANG Dan
Computer Science. 2009, 36 (11): 185-188. 
Based on an improved SNN similarity matrix and the spectral bisection method, this paper proposed a new algorithm for detecting community structure in complex networks. The improved SNN similarity matrix is first computed and normalized, and its eigenvalues and eigenvectors are obtained. Then different numbers of the first non-trivial eigenvectors are chosen as clustering samples, the FCM algorithm is run, and the corresponding modularity is computed. The best community structure is the one with the largest modularity. Experiments show the validity of the presented method; compared with other popular approaches, its accuracy is much better.
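A minimal sketch of the spectral pipeline the abstract describes, with plain k-means standing in for FCM and an adjacency matrix standing in for the improved SNN similarity matrix (both substitutions are assumptions):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_communities(adj, k):
    """Spectral community detection (sketch): normalize the similarity
    matrix, take the first k non-trivial eigenvectors as features, and
    cluster them."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt  # normalized Laplacian
    vals, vecs = np.linalg.eigh(lap)
    features = vecs[:, 1:k + 1]              # skip the trivial eigenvector
    _, labels = kmeans2(features, k, seed=0, minit='++')
    return labels

# Two obvious 4-node cliques joined by one edge.
adj = np.zeros((8, 8))
for grp in ([0, 1, 2, 3], [4, 5, 6, 7]):
    for i in grp:
        for j in grp:
            if i != j:
                adj[i, j] = 1.0
adj[3, 4] = adj[4, 3] = 1.0
print(spectral_communities(adj, k=2))
```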
Research on Event-oriented Ontology Model
LIU Zong-tian,HUANG Mei-li,ZHOU Wen,ZHONG Zhao-man,FU Jian-feng,SHAH Jian-fang,ZHI Hui-lai
Computer Science. 2009, 36 (11): 189-192. 
An event-oriented knowledge representation method is reasonable according to the philosophical viewpoints that the world is material, material is in motion, motion is absolute and stillness is relative. Nowadays the academic community pays more and more attention to the event as a unit of knowledge. Studying event-oriented knowledge representation, the definition of an event was given and a 6-tuple formalized representation of events was presented. Then an event ontology model was proposed. Taking text understanding as an application domain, the representation, building and application of event networks for texts and sentences were briefly introduced. In comparison with conventional methods, the event ontology model represents knowledge at a higher granularity and is more suitable for computers to simulate the human brain.
Self-adaptive Particle Swarm Optimization Algorithm Based on Tentative Adjusting Step Factor
ZHENG Chun-ying,ZHENG Quan-di,WANG Xiao-dan,WANG Yu-bing
Computer Science. 2009, 36 (11): 193-195. 
Aiming at the premature convergence and poor results of the Particle Swarm Optimization algorithm, a new self-adaptive inertia factor was designed according to population diversity and generation number, after analyzing the inertia factor's effect on the algorithm. Through tentative adjustment of step factors, the particles' local search ability was enhanced. Tests on three typical functions, compared with APSO, indicate the effectiveness of the improvement.
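A sketch of the general idea of a diversity-adaptive inertia weight in PSO (the paper's exact adaptation rule and step-factor adjustment are not given here; the mapping from diversity to inertia is an assumption):

```python
import numpy as np

def adaptive_pso(f, dim=10, n=30, iters=300, lo=-5.0, hi=5.0):
    """PSO with an inertia weight adapted from swarm diversity: low
    diversity -> raise inertia to explore, high diversity -> lower it
    to exploit (one plausible mapping, used here for illustration)."""
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        diversity = np.mean(np.linalg.norm(x - x.mean(axis=0), axis=1))
        w = 0.4 + 0.5 * np.exp(-diversity)       # adaptive inertia factor
        r1, r2 = rng.random((2, n, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

rastrigin = lambda z: float(10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z)))
print(adaptive_pso(rastrigin)[1])
```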
Feature Selection Method Based on Optimized Document Frequency and Beam Search
ZHU Hao-dong,ZHONG Yong
Computer Science. 2009, 36 (11): 196-199. 
In text categorization one is usually confronted with feature spaces of 10,000 dimensions and more, often exceeding the number of available training samples. To increase speed, reduce memory use and filter out irrelevant or weak features, feature selection algorithms must be used. To obtain a more representative feature subset, this paper first presented a document frequency method based on minimum word frequency, then introduced rough sets and presented an attribute reduction algorithm based on beam search, and finally combined the two into a comprehensive feature selection algorithm. The comprehensive algorithm first selects features by document frequency with a minimum word frequency, then eliminates redundancy with attribute reduction, so it acquires a more representative feature subset. Experimental results show that the comprehensive algorithm is effective.
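A minimal sketch of the first stage, document frequency selection with a minimum word frequency (the rough-set attribute reduction stage is omitted; the threshold values are made up):

```python
from collections import Counter

def df_select(docs, min_tf=2, top_k=100):
    """Document-frequency feature selection with a minimum word
    frequency: a term counts toward DF in a document only if it occurs
    at least min_tf times there; keep the top_k terms by DF."""
    df = Counter()
    for doc in docs:
        tf = Counter(doc.split())
        df.update(t for t, c in tf.items() if c >= min_tf)
    return [t for t, _ in df.most_common(top_k)]

docs = [
    "grid grid service discovery service",
    "service matching service grid",
    "stream stream clustering density",
]
print(df_select(docs, min_tf=2, top_k=5))  # -> ['service', 'grid', 'stream']
```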
Research on the Dynamic Extending of Story in Story Link Detection
ZHANG Xiao-yan,WANG Ting
Computer Science. 2009, 36 (11): 200-203. 
Story Link Detection determines whether two stories are about the same topic. To overcome the limitations of story length, sparse data and topic drifting in story content, this paper provided a dynamic information extending technique to improve the story representation model: the current story is extended with its latest preceding topic-related story. Refinement of the extending information was also studied; it reduces the influence of noise introduced by extension by increasing the weights of important features in the extending story. The method was applied to Story Link Detection on the TDT4 Chinese corpus. Experimental results indicate that dynamic extending and refinement of the extending information both evidently affect the performance of story link detection systems.
Efficient Distributed Mining Algorithm for Alarm Correlation in Communication Networks
WU Jian,LI Xing-ming
Computer Science. 2009, 36 (11): 204-207. 
This paper described alarm correlation in communication networks based on data mining. Directly applying sequential algorithms to distributed databases is not effective because it requires a large amount of communication overhead. An efficient algorithm, EDMA, was proposed. It minimizes the number of candidate sets and exchanged messages by local and global pruning. At local sites it runs on the improved CMatrix algorithm, which is used to calculate local support counts. The solution also reduces the size of average transactions and datasets, which reduces scan time. The performance study shows that EDMA has superior running efficiency, lower communication cost and stronger scalability than directly applying a sequential algorithm to distributed databases.
Approximate Approach to Train SVM on Very Large Data Sets
ZENG Zhi-qiang,LIAO Bei-shui,GAO Ji
Computer Science. 2009, 36 (11): 208-212. 
Standard Support Vector Machine (SVM) training has O(L³) time and O(L²) space complexity, where L is the training set size, so it is computationally infeasible on very large data sets. A novel SVM training method based on approximate solutions, the Approximate Vector Machine (AVM), was presented to scale kernel methods to very large data sets. The approach obtains an approximately optimal hyperplane by incremental learning, and uses probabilistic speedup and hot-start tricks to accelerate training during each iterative stage. Theoretical analysis indicates that AVM has time and space complexities independent of training set size. Experiments on very large data sets show that the method preserves the generalization performance of the original SVM classifiers while outperforming existing scale-up methods in training time and number of support vectors.
Improved Sort-based KNN Text Categorization Method
LIU Hai-feng,ZHANG Xue-ren,YAO Ze-qing,LIU Shou-sheng
Computer Science. 2009, 36 (11): 213-216. 
Large feature dimensionality and poor scalability degrade KNN performance. This paper brought forward an improved KNN method to address the problem that, after feature reduction, large categories with many texts easily dominate the k nearest neighbors. Firstly, an improved odds ratio method is used to select features. Secondly, the possible categories of a text are estimated using category vectors. Lastly, an improved KNN method is applied over the reduced set of texts to realize text categorization. Experiments show that this method improves precision.
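For orientation, a minimal cosine-similarity KNN text categorizer that scores categories by summed similarity, one common way to damp the dominance of large categories over a plain majority vote (not the paper's exact weighting):

```python
import numpy as np

def knn_classify(doc_vec, train_vecs, train_labels, k=5):
    """Cosine-similarity KNN (sketch): score each category by the summed
    similarity of its members among the k nearest training texts."""
    sims = train_vecs @ doc_vec / (
        np.linalg.norm(train_vecs, axis=1) * np.linalg.norm(doc_vec) + 1e-12)
    top = np.argsort(sims)[-k:]
    scores = {}
    for i in top:
        scores[train_labels[i]] = scores.get(train_labels[i], 0.0) + sims[i]
    return max(scores, key=scores.get)

# Toy term-frequency vectors over a 4-term vocabulary.
train = np.array([[3, 0, 1, 0], [2, 1, 0, 0], [0, 2, 0, 3], [0, 3, 1, 2]], float)
labels = ["sports", "sports", "finance", "finance"]
print(knn_classify(np.array([0, 2, 0, 2.0]), train, labels, k=3))  # finance
```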
Dependency Parsing Based Event Recognition
FU Jian-feng, LIU Zong-tian,FU Xue-feng, ZHOU Wen,ZHONG Zhao-man
Computer Science. 2009, 36 (11): 217-219. 
Event extraction is an important part of information extraction. As the basis of event extraction, event recognition directly affects its results. Machine-learning-based event recognition needs more features from words. Addressing the deficiency of present event recognition methods, this paper presented a novel method of Dependency Parsing based Event Recognition (DPER). Dependency parsing is used to find the syntactic relations between triggers and other words; as one of the features, these relations are used for event classification with SVM and thus for event recognition. Experiments show DPER performs better than traditional methods, and event recognition integrating multiple features improves the F-measure to 69.3%.
Vague Partitions
LIANG Jia-rong,LIU Li,WU Hua-jian
Computer Science. 2009, 36 (11): 220-223. 
Because a vague set is characterized by a truth-membership function and a false-membership function, the concepts of degree of truth-compatibility, degree of false-compatibility, degree of truth-equality and degree of false-equality based on t-norms and t-conorms were introduced. Furthermore, we presented the concepts of semi-vague partitions and vague partitions using these degrees, and investigated the properties of semi-vague partitions and vague partitions.
Novel Algorithm GEP Based on Niche Applied to Association Rules Mining
CHEN Yun-liang,LI Xin,YANG Jie,XIE Chang-sheng
Computer Science. 2009, 36 (11): 224-227. 
To improve the performance of association rule mining on big datasets, a novel GEP based on niches (NGEP) was presented. The procedure of NGEP begins with niche evolution and then fuses some of the sub-niches according to the similarity of their best individuals. A Cartesian product is then nested in the kernel set of niches to generate better outcomes. Experimental results show that the algorithm performs better than similar evolutionary algorithms in terms of population diversity and precision, and it can discover more association rules.
Novel Algorithm to Optimize the Capacity of the Restoration of ASON
XU Jun,CHANG Hui-you,YI Yang
Computer Science. 2009, 36 (11): 228-229. 
To solve the NP-complete nonlinear programming problem of ASON restoration capacity assignment, the corresponding mathematical model was established and a new optimization method based on particle swarm optimization was presented. Compared with linear programming optimization, this algorithm reduces the computation significantly, which facilitates its use in engineering projects. The algorithm can handle selection of the best restoration routing in various failure cases and effectively addresses the "debris" problem of restoration capacity. Simulation experiments show that the algorithm is highly practical.
Weighted Minimum Mean Square Kalman Filter
CHEN Peng,QIAN Hui,ZHU Miao-liang
Computer Science. 2009, 36 (11): 230-231. 
To use the Kalman Filter (KF) in nonlinear systems, a new method was proposed. Using the principle that a set of discretely sampled points can be used to form a linear system, the estimator yields performance equivalent to the Extended Kalman Filter (EKF) for nonlinear systems and can be applied to nonlinear systems without the differentiation steps required by the EKF. The ease of implementation and more accurate estimation of the new filter recommend its use in applications.
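The description, deterministically sampled points replacing the EKF's differentiation, matches the unscented transform; a sketch of that transform on a classic polar-to-Cartesian example (the scaling parameters are conventional defaults, an assumption):

```python
import numpy as np

def unscented_transform(mean, cov, fn, alpha=1e-1, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear fn using 2n+1 sigma
    points instead of the EKF's Jacobian linearization."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + 1 - alpha**2 + beta
    y = np.array([fn(s) for s in sigma])
    y_mean = wm @ y
    y_cov = (wc[:, None] * (y - y_mean)).T @ (y - y_mean)
    return y_mean, y_cov

# Polar-to-Cartesian measurement, a classic nonlinear test case.
mean = np.array([1.0, np.pi / 4])              # (range, bearing)
cov = np.diag([0.01, 0.01])
to_xy = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
m, c = unscented_transform(mean, cov, to_xy)
print(m, "\n", c)
```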
Hybrid Particle Swarm Algorithm for Optimal Motion Planning of Robot Arms
HUANG Gang,LI De-hua,YANG Jie
Computer Science. 2009, 36 (11): 232-234. 
The Trajectory Planning Problem (TPP) of a robot manipulator is a highly constrained nonlinear optimization problem that aims to minimize total path motion while avoiding obstacles. A hybrid optimization algorithm integrating SA and PSO was presented to address the motion planning problem of robot arms, and SA-PSO was implemented on a test example. In addition, a conventional algorithm, the A* Algorithm (AA), was introduced for comparison with SA-PSO. The computational results show that the developed algorithm is computationally better than the other method.
Extraction of Information from Web Pages Based on Extended DOM Tree
GU Yun-hua,TIAN Wei
Computer Science. 2009, 36 (11): 235-237. 
Abstract PDF(346KB) ( 249 )   
RelatedCitation | Metrics
A method of information extraction from Web pages based on an extended DOM tree was presented. Web pages are first transformed into a DOM tree; the DOM tree is then extended by attaching semantic information to its nodes, and an influence degree is calculated for each node. According to the influence degree of the nodes, the DOM tree is pruned, so the useful, relevant content of a Web page can be extracted automatically. This is a universal approach that does not require the structure of the Web page to be known in advance. The results of the information extraction can be used not only for browsing but also for further Web information processing, such as Internet data mining and topic-based search engines.
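A rough sketch of the build-score-prune pipeline (an assumption, not the paper's algorithm: plain text mass stands in for the semantic "influence degree", and the threshold is arbitrary):

```python
# Build a DOM-like tree, score each node, and keep only high-scoring subtrees.
from html.parser import HTMLParser

class Node:
    def __init__(self, tag, parent=None):
        self.tag, self.parent, self.children, self.text = tag, parent, [], ""

class TreeBuilder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.root = self.cur = Node("root")
    def handle_starttag(self, tag, attrs):
        node = Node(tag, self.cur)
        self.cur.children.append(node)
        self.cur = node
    def handle_endtag(self, tag):
        if self.cur.parent:
            self.cur = self.cur.parent
    def handle_data(self, data):
        self.cur.text += data.strip()

def influence(node):                     # assumption: text mass below the node
    return len(node.text) + sum(influence(c) for c in node.children)

def prune(node, threshold):
    node.children = [c for c in node.children if influence(c) >= threshold]
    for c in node.children:
        prune(c, threshold)

builder = TreeBuilder()
builder.feed("<div><p>Main article body text goes here.</p><span>ad</span></div>")
prune(builder.root, threshold=10)        # the <span>ad</span> branch is dropped

def dump(node, depth=0):
    print("  " * depth + node.tag, node.text)
    for c in node.children:
        dump(c, depth + 1)
dump(builder.root)
```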
Adaptive Adjusting Algorithm of Firewall Based on Endocrine System
ZHU Si-feng,WANG Hua-dong,WEI Rong-hua
Computer Science. 2009, 36 (11): 238-241. 
Abstract PDF(348KB) ( 178 )   
RelatedCitation | Metrics
A firewall is a useful tool for protecting a network, and adjusting its parameters adaptively is important to network performance. At present, there are no adjusting algorithms available for firewalls. The mechanism of the endocrine system was analyzed, a kind of artificial endocrine system (AES) was designed, and an adaptive adjusting algorithm based on the AES was constructed. Simulation tests show that the adjusting algorithm given in this paper performs well and has promising applications. With modification, the algorithm can also be used in intrusion detection systems.
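Purely as an illustration of the endocrine metaphor (all names and constants below are hypothetical, not the paper's AES), a hormone-style negative-feedback loop can adapt a firewall parameter to observed load:

```python
# Illustrative sketch: a "gland" secretes hormone in proportion to the
# deviation of traffic load from a setpoint; a "receptor" maps the hormone
# level to a hypothetical inspection-depth parameter.
def regulate(load, hormone, setpoint=0.5, gain=0.8, decay=0.9):
    hormone = decay * hormone + gain * (load - setpoint)
    depth = max(1.0, min(10.0, 5.0 - 4.0 * hormone))   # clamp to [1, 10]
    return hormone, depth

hormone = 0.0
for load in [0.2, 0.4, 0.8, 0.9, 0.6, 0.3]:   # observed traffic load samples
    hormone, depth = regulate(load, hormone)
    print(f"load={load:.1f} hormone={hormone:+.2f} inspection depth={depth:.1f}")
```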
Pedestrian Detection Method Based on Part Detector and Substructure Assemble
HU Bin,WANG Sheng-jin,DING Xiao-qing
Computer Science. 2009, 36 (11): 242-246. 
Abstract PDF(471KB) ( 177 )   
RelatedCitation | Metrics
We presented a pedestrian detection method based on part detectors and substructure assembly, which can be used in driver assistance systems or video surveillance systems. Firstly, a head classifier is used to search the whole image in order to obtain the ROI (Region of Interest). Then, within each ROI, five part detectors, covering the head, torso, legs, left arm and right arm, are applied individually. Finally, the detection result is verified through a substructure-based part detector assembly method. Experiments on different databases show that our method performs well in detecting pedestrians with varied poses and partial occlusion against cluttered backgrounds.
Adaptive 3D Facial Feature Tracking Combining Robust Feature with Online Learning
WANG Xiao-yan,WANG Yang-sheng,ZHOU Ming-cai,FENG Xue-tao,ZHOU Xiao-xu
Computer Science. 2009, 36 (11): 247-250. 
Abstract PDF(387KB) ( 168 )   
RelatedCitation | Metrics
An algorithm based on robust features, combining edge strength and raw intensity with online appearance model fitting, was proposed to track head pose and facial actions in video. A 3D parameterized model, the CANDIDE model, was used to model the face and facial expression; a weak perspective projection was used to model the head pose; an adaptive appearance model was built on shape-free intensity and edge texture; and a gradient descent model fitting algorithm was then applied to track the parameters of head pose and facial actions. Experiments demonstrate that the algorithm is more robust than using intensity alone, especially when the lighting conditions and facial expressions are complicated.
Four Collaborated Cameras Multi-touch Locating Method Based on Look-up Table
WANG De-xin,XIONG Zhi-hui,ZHANG Mao-jun
Computer Science. 2009, 36 (11): 251-253. 
Abstract PDF(355KB) ( 180 )   
RelatedCitation | Metrics
To solve the occlusion problem of multi-touch, this paper presented a multi-touch locating method based on the collaboration of four cameras. Four cameras fixed at the corners of the frame capture images synchronously; objects are detected in each camera view; lines pointing toward each object are constructed according to the principles of geometric optics and the physical dimension constraints of the frame; and finally the four cameras are jointly calibrated to locate each object. Experimental results show that this method can locate multiple touches precisely even under occlusion, and that it is low-cost, easy to install and easy to port.
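The geometric core of such a method is ray intersection in the touch plane; the sketch below is an assumed illustration (unit-frame coordinates and ray directions are made up), with four cameras providing redundant ray pairs when one view is occluded:

```python
# Each corner camera contributes a 2D ray toward a touch point; intersecting
# rays from two corners locates the point.
def intersect(p1, d1, p2, d2):
    """Intersect rays p1 + t*d1 and p2 + s*d2 in the touch plane."""
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        return None                      # parallel rays: no intersection
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# cameras at two corners of a unit frame, each "seeing" a touch at (0.4, 0.3)
cam_a, cam_b = (0.0, 0.0), (1.0, 0.0)
ray_a = (0.4, 0.3)                       # direction from cam_a toward the touch
ray_b = (0.4 - 1.0, 0.3)                 # direction from cam_b toward the touch
print(intersect(cam_a, ray_a, cam_b, ray_b))   # -> (0.4, 0.3)
```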
Real Color Image Enhanced and Denoised by Stationary Wavelet Transform
XIONG Jie,FENG Xiao-qiang,GENG Guo-hua,ZHOU Ming-quan
Computer Science. 2009, 36 (11): 254-257. 
Abstract PDF(324KB) ( 325 )   
RelatedCitation | Metrics
Because all real color images contain noise to some degree, and the high dynamic range of some real color images with dark shadows and bright light sources exceeds the range of most electronic capture devices and of human vision, real color images should be enhanced and denoised at the same time. Based on an analysis of the noise in the H, S and V channels and a new illumination-reflectance model described by the stationary wavelet transform (SWT), a new method which can enhance and denoise real color images simultaneously was put forward. Experiments show that the color images are enhanced noticeably while the noise is efficiently restrained at the same time.
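As a rough illustration of SWT-based denoising on a single channel (assuming the PyWavelets library; the paper's illumination-reflectance model and HSV analysis are not reproduced here):

```python
# Soft-threshold the detail coefficients of a stationary wavelet transform
# of a noisy V channel; pywt and numpy are assumed to be installed.
import numpy as np
import pywt

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 64), (64, 1))          # synthetic V channel
v = np.clip(clean + 0.1 * rng.standard_normal((64, 64)), 0, 1)

coeffs = pywt.swt2(v, "db2", level=2)      # list of (cA, (cH, cV, cD))
thr = 0.15
den = [(cA, tuple(pywt.threshold(d, thr, mode="soft") for d in details))
       for cA, details in coeffs]
v_den = pywt.iswt2(den, "db2")

print("residual std before/after:",
      round(float(np.std(v - clean)), 3),
      round(float(np.std(v_den - clean)), 3))
```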
Dense Matching Algorithm Based on Region Boundary Restriction via Graph Cuts Optimization
CHEN Wang,ZHANG Mao-jun,XIONG Zhi-hui
Computer Science. 2009, 36 (11): 258-261. 
Abstract PDF(466KB) ( 267 )   
RelatedCitation | Metrics
Two main challenges for stereo matching algorithms via graph cuts global optimization are the discontinuity and occlusion problems. An energy function with a convex smoothness term has a globally optimal solution but over-smooths the boundaries of scene objects, while an energy function with a non-convex smoothness term preserves discontinuities but only admits an approximate solution obtained via iterative optimization, and the problems of occlusion, ordering and uniqueness are not dealt with appropriately either. Therefore, based on the observation that disparity jumps almost always occur at color discontinuities, this paper proposed an energy function formulation for dense stereo matching based on restrictions between region boundary pixels, which guarantees not only a globally optimal solution but also discontinuity preservation, while handling the occlusion, ordering and uniqueness issues and greatly improving computational efficiency.
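The standard energy minimized by such graph-cut stereo methods (the general form, not the paper's boundary-restricted variant) is

$$E(f)=\sum_{p\in\mathcal{P}} D_p(f_p)+\sum_{(p,q)\in\mathcal{N}} V_{p,q}(f_p,f_q),$$

where $D_p$ is the matching cost of assigning disparity $f_p$ to pixel $p$ and $V$ penalizes label differences between neighboring pixels: a convex choice such as $V=\lambda\,|f_p-f_q|$ admits an exact minimum but over-smooths, while the non-convex Potts model $V=\lambda\,[f_p\neq f_q]$ preserves discontinuities but is only approximately minimized by iterative moves such as $\alpha$-expansion.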
Terrain Matching for Three-dimensional Visualization of Two-dimensional GIS Vector Data
Computer Science. 2009, 36 (11): 262-265. 
Abstract PDF(473KB) ( 225 )   
RelatedCitation | Metrics
A fast terrain matching method for 3D visualization of 2D GIS vector data was proposed. 3D terrain features based on vector information are first classified by their characteristics; then either a feature-adjusting or a terrain-adjusting method is chosen for matching the features to the terrain. The feature-adjusting method adjusts the position and orientation of features in real time, while the terrain-adjusting method applies terrain deformation to a vector-constrained field based on different distance measurements. The algorithms also deal with some special cases, such as low-resolution terrain datasets and multiple vector restrictions applied to the same area. Experimental results show that our method effectively meets the requirements of terrain matching for 3D terrain features based on vector data, including points, lines and surfaces, with high visual quality.
Research on Binocular Stereo Matching in the Presence of Specular Reflections
LU Si-jun,TANG Zhen-min,GUO Long-yuan,LU A-li
Computer Science. 2009, 36 (11): 266-268. 
Abstract PDF(335KB) ( 154 )   
RelatedCitation | Metrics
Traditional stereo correspondence algorithms rely heavily on the Lambertian model of diffuse reflectance. While this diffuse assumption is generally valid for much of an image, processing regions that contain specular reflections can result in severe matching errors. We addressed the problem of binocular dense stereo matching in the presence of specular reflections by introducing a novel correspondence measure which separates reflection components based on chromaticity. Accurate depth can be estimated for both diffuse and specular regions. Unlike previous works, which seek to eliminate or avoid specular reflections using image preprocessing or multi-baseline stereo, our approach works in their presence. Experiments on both synthetic and real images demonstrate the effectiveness and robustness of our approach.
Needle Segmentation in US Images Based on Real-time Gray-scale Hough Transformation
QIU Wu,DING Ming-yue,ZHOU Hua
Computer Science. 2009, 36 (11): 269-272. 
Abstract PDF(330KB) ( 174 )   
RelatedCitation | Metrics
Real-time needle segmentation and tracking are very important techniques in image-guided surgery, biopsy and therapy. We proposed a needle segmentation technique based on a Real-Time Gray-Scale Hough Transform (RTGHT), which is composed of an improved gray-scale Hough transform using a phase-grouping algorithm with a coarse-to-fine search strategy. Furthermore, the RTGHT technique was evaluated on patient breast biopsy images. Experiments with patient breast biopsy ultrasound (US) image sequences showed that our approach can segment the biopsy needle in real time, with an angular RMS error of about 10 and a position RMS error of about 0.5, on an affordable PC without the help of specially designed hardware. It can be applied in image-guided surgery and therapy.
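A minimal gray-scale Hough sketch (not the paper's RTGHT; the phase-grouping and coarse-to-fine stages are omitted): each pixel votes into $(\theta,\rho)$ bins with a weight equal to its intensity, so a bright needle-like line accumulates the strongest peak.

```python
# Gray-weighted Hough voting for the dominant line in a synthetic image.
import math
import numpy as np

img = np.zeros((64, 64))
for i in range(64):                      # synthetic bright "needle" at 45 deg
    img[i, i] = 1.0

thetas = np.deg2rad(np.arange(0, 180))
diag = int(math.hypot(*img.shape))
acc = np.zeros((2 * diag + 1, len(thetas)))

ys, xs = np.nonzero(img > 0)
for x, y in zip(xs, ys):
    for t_idx, t in enumerate(thetas):
        rho = int(round(x * math.cos(t) + y * math.sin(t))) + diag
        acc[rho, t_idx] += img[y, x]     # vote weighted by gray level

rho_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
print("rho =", rho_idx - diag, " theta =", round(math.degrees(thetas[t_idx]), 1))
```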
Effective Method for the Segmentation of Fingerprint Images Based on New Feature
MEI Yuan,CAO Guo,SUN Huai-jiang,SUN Quan-sen,XIA De-shen
Computer Science. 2009, 36 (11): 273-278. 
Abstract PDF(584KB) ( 161 )   
RelatedCitation | Metrics
The segmentation of fingerprint images plays a very important role in Automatic Fingerprint Identification Systems (AFIS). Effective segmentation can not only reduce the time needed for subsequent processing, but also considerably improve the reliability of feature extraction. This paper makes two main contributions: it proposes a new segmentation feature for fingerprint images, called the Effective Point Cluster Degree (EPCD), and it proposes an effective segmentation method based on the new feature and the CluD feature mentioned in reference [1]. In this new method, we first segment fingerprint images with the EPCD, then process the first segmentation results with an iteration-based method, perform a second segmentation with CluD based on the processed results, and finally apply morphology as post-processing to reduce misclassification. The experimental results show that, compared with other commonly used features, the EPCD offers good discriminability and robustness, and the segmented foreground and background are more concentrated; at the same time, the segmentation method based on the EPCD and CluD possesses high accuracy and adaptability.
Edges Preserving Method for Medical Image Denoising
QIN Xu-jia,ZHANG Su-qiong,LIU Shi-shang,XU Xiao-gang
Computer Science. 2009, 36 (11): 279-282. 
Abstract PDF(354KB) ( 143 )   
RelatedCitation | Metrics
Image denoising plays an important part in medical image processing; denoising medical images is the basis for further analysis and calculation. To extend the one-dimensional EMD method to two dimensions, a novel edge-preserving image denoising method for medical images based on 2D EMD was proposed. Firstly, the image is decomposed by EMD, yielding the intrinsic mode function (IMF) and the residual component of the image; the image noise and edge information are concentrated in the IMF. Secondly, the IMF is decomposed by EMD into a high-frequency component and a residual component. Finally, the two residual components are added together to obtain the edge-preserving denoised image. Experiments show that the proposed method can remove noise from the image while keeping the image edge information.
Mechanism of Naming Topological Entities in Semantic Feature Modeling
GAO Xue-yao,SUN Li-quan,SUN Da-song
Computer Science. 2009, 36 (11): 283-285. 
Abstract PDF(362KB) ( 183 )   
RelatedCitation | Metrics
Feature modeling is a key technique in CAD systems, and topological entity naming is one of the essential issues in feature modeling. A feature-based method was proposed to name first the surfaces and then the edges and vertices of feature entities. Because feature names are invariant during the modeling process, topological entities can be correctly recorded and maintained during model editing. The concept of a reference line and its definition principle were proposed to handle topological edge splitting and the multiple intersection curves generated by two surfaces, so that split edges and multiple intersections can be identified and referenced efficiently.
Image Denoising Using Nonsubsampled Contourlet Transform and Multi-scale Thresholds
FU Zhong-kai,WANG Xiang-yang,ZHENG Hong-liang
Computer Science. 2009, 36 (11): 286-289. 
Abstract PDF(373KB) ( 220 )   
RelatedCitation | Metrics
The nonsubsampled contourlet transform is a fully shift-invariant, multi-scale and multi-directional expansion that has better directional frequency localization and a fast implementation. We proposed a novel image denoising method by incorporating the nonsubsampled contourlet transform. The full shift invariance and the high directional sensitivity of the nonsubsampled contourlet transform make the new method a very good choice for image denoising. Firstly, the image is decomposed into subbands of different frequency and orientation responses using the nonsubsampled contourlet transform. Then multi-scale thresholds are computed according to the noise distribution and used to shrink the nonsubsampled contourlet coefficients. Finally, the modified coefficients are transformed back into the original domain to obtain the denoised image. Simulation results show that the method obtains a higher peak signal-to-noise ratio than other recent image denoising methods, such as wavelet denoising and contourlet denoising.
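A sketch of the multi-scale threshold idea in isolation (the NSCT itself is not implemented here; the random arrays stand in for its subbands, and the BayesShrink-style rule is a common choice rather than the paper's):

```python
# Estimate the noise level from the finest subband via the median absolute
# deviation, then soft-shrink each subband with its own adaptive threshold.
import numpy as np

rng = np.random.default_rng(1)
subbands = [s * rng.standard_normal((32, 32)) + 0.1 * rng.standard_normal((32, 32))
            for s in (0.0, 0.5, 1.0)]    # finest first: mostly noise

sigma = np.median(np.abs(subbands[0])) / 0.6745     # robust noise estimate

def shrink(band, sigma):
    sig2 = max(np.mean(band ** 2) - sigma ** 2, 1e-12)  # signal variance
    t = sigma ** 2 / np.sqrt(sig2)                      # per-subband threshold
    return np.sign(band) * np.maximum(np.abs(band) - t, 0.0)

denoised = [shrink(b, sigma) for b in subbands]
print([round(float(np.std(b)), 3) for b in denoised])
```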
Research of Interactive Medical Image Segmentation Algorithm Based on Pairing Heap
DANG Jian-wu,DU Xiao-gang,WANG Yang-ping
Computer Science. 2009, 36 (11): 290-292. 
Abstract PDF(289KB) ( 186 )   
RelatedCitation | Metrics
Segmentation speed is a bottleneck in the application of interactive algorithms to the segmentation of serial medical images. This paper provided an interactive medical image segmentation algorithm based on a pairing heap. The time complexity of Live-Wire's dynamic shortest-path search is reduced by using a pairing heap to implement a priority queue supporting decrease-key. Algorithm analysis and experiments in a radiation therapy planning system indicate that the algorithm improves the segmentation efficiency of serial medical images.
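For reference, a minimal pairing heap (insert/meld/extract-min only; decrease-key and the Live-Wire graph are omitted), the data structure used to speed up the Dijkstra-style search:

```python
# Pairing heap: meld keeps the smaller root; extract-min does the classic
# two-pass pairing of the children, then melds the pairs right-to-left.
class PNode:
    def __init__(self, key, value):
        self.key, self.value, self.children = key, value, []

def meld(a, b):
    if a is None: return b
    if b is None: return a
    if b.key < a.key:
        a, b = b, a
    a.children.append(b)                 # larger root becomes a child
    return a

def insert(heap, key, value):
    return meld(heap, PNode(key, value))

def extract_min(heap):
    kids = heap.children
    pairs = [meld(kids[i], kids[i + 1] if i + 1 < len(kids) else None)
             for i in range(0, len(kids), 2)]
    new = None
    for p in reversed(pairs):
        new = meld(p, new)
    return (heap.key, heap.value), new

heap = None
for k in [7, 2, 9, 4]:
    heap = insert(heap, k, f"pixel-{k}")
while heap:
    item, heap = extract_min(heap)
    print(item)                          # keys come out in sorted order
```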
Face Recognition Based on Multi-scale Block Local Binary Pattern
LIU Zhong-hua,SHI Heng-liang,ZHANG Lan-ping,JIN Zhong
Computer Science. 2009, 36 (11): 293-295. 
Abstract PDF(335KB) ( 290 )   
RelatedCitation | Metrics
We proposed a face representation method for recognition, called Multi-scale Block Local Binary Pattern (MS-BLBP). The Local Binary Pattern (LBP) has proved effective for image representation, but it is too local to be robust. In MS-BLBP, the computation is based on the average values of block subregions instead of individual pixels. The face area is first divided into small regions, from which Block Local Binary Pattern (BLBP) histograms with different weights are extracted and concatenated into a single, spatially enhanced feature histogram that efficiently represents the face image. Classification is performed with a nearest-neighbour classifier in the computed feature space, using the Chi-square statistic as the dissimilarity measure. Experiments on face databases show that the proposed MS-BLBP method outperforms other LBP-based face recognition algorithms.
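A basic 3x3 LBP sketch, plus the block-average variant the abstract describes (the block size and synthetic data are illustrative; the weighting and histogram stages are omitted):

```python
# In block LBP the 3x3 comparison runs over mean values of pixel blocks
# instead of single pixels, which is less sensitive to local noise.
import numpy as np

def lbp_code(patch):
    """8-bit LBP code of a 3x3 patch (neighbors compared to the center)."""
    c = patch[1, 1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    return sum(int(patch[y, x] >= c) << i for i, (y, x) in enumerate(order))

def block_means(img, s):
    """Downsample by averaging s x s blocks; LBP on this gives block LBP."""
    h, w = img.shape[0] // s, img.shape[1] // s
    return img[:h * s, :w * s].reshape(h, s, w, s).mean(axis=(1, 3))

rng = np.random.default_rng(0)
face = rng.integers(0, 256, (12, 12)).astype(float)
coarse = block_means(face, 3)            # 4x4 grid of block averages
print("pixel LBP:", lbp_code(face[:3, :3]))
print("block LBP:", lbp_code(coarse[:3, :3]))
```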
New Method of Two-direction and Two-dimension Feature Extraction for Face Recognition
GUO Zhi-qiang,YANG Jie
Computer Science. 2009, 36 (11): 296-299. 
Abstract PDF(308KB) ( 152 )   
RelatedCitation | Metrics
A two-way compression projection subspace method combining Two-dimensional Principal Component Analysis (2DPCA) and Two-dimensional Linear Discriminant Analysis (2DLDA) was proposed for face recognition. The method first performs 2DPCA, transposes the resulting feature matrix, and then performs 2DLDA. Compared with (2D)²PCA and (2D)²LDA, this method makes full use of the advantages of 2DPCA and 2DLDA: it not only retains the sample category information, but also eliminates the correlation between the rows and columns of the image matrix, so that it effectively extracts the row and column discriminant information while the dimension of the recognition features decreases dramatically. Experiments on the ORL and FERET face databases show that the recognition rate of this method is better than that of existing two-dimensional feature extraction methods, without affecting recognition speed.
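A sketch of plain 2DPCA, the first stage of the proposed method (random data and the number of retained axes are assumptions; the transpose-then-2DLDA stage is only indicated in a comment):

```python
# 2DPCA: project image rows onto the leading eigenvectors of the image
# covariance matrix G, computed directly on 2D images without vectorizing.
import numpy as np

rng = np.random.default_rng(0)
images = rng.standard_normal((20, 28, 24))        # 20 training images, 28x24

mean_img = images.mean(axis=0)
G = np.zeros((24, 24))                            # column-direction covariance
for A in images:
    D = A - mean_img
    G += D.T @ D
G /= len(images)

vals, vecs = np.linalg.eigh(G)                    # eigenvalues ascending
X = vecs[:, ::-1][:, :5]                          # top-5 projection axes
features = images[0] @ X                          # 28x5 feature matrix
print(features.shape)
# The proposed method then transposes the feature matrix and applies 2DLDA
# in the other direction, compressing both rows and columns.
```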
Study on Algorithm of Texture Image Retrieval by Gray Level Co-occurrence Matrix
YUAN Li-hong,SUN Shuang-zi,FU Li
Computer Science. 2009, 36 (11): 300-303. 
Abstract PDF(323KB) ( 150 )   
RelatedCitation | Metrics
Feature extraction and matching are essential for content-based image retrieval. To retrieve texture images better, this paper discussed a rational method for extracting texture image features with the gray level co-occurrence matrix (GLCM). Combined with similarity measures, a basic retrieval system based on the GLCM was developed. Different measure functions and different combinations of statistics were used to analyze their effect on retrieval. The results indicate that better retrieval performance is achieved by extracting four parameters from the GLCM and measuring them with a weighted block distance.
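A minimal GLCM sketch for one offset, together with four classic Haralick statistics commonly used for retrieval (the abstract's exact parameter set is not specified, so the choice below is an assumption):

```python
# Build a normalized co-occurrence matrix for offset (dx, dy) and compute
# energy, contrast, entropy and homogeneity from it.
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    M = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            M[q[y, x], q[y + dy, x + dx]] += 1
    return M / M.sum()                   # normalize to a joint distribution

def features(P):
    i, j = np.indices(P.shape)
    return {
        "energy":      float((P ** 2).sum()),
        "contrast":    float(((i - j) ** 2 * P).sum()),
        "entropy":     float(-(P[P > 0] * np.log(P[P > 0])).sum()),
        "homogeneity": float((P / (1.0 + np.abs(i - j))).sum()),
    }

rng = np.random.default_rng(0)
tex = rng.integers(0, 256, (32, 32))
print(features(glcm(tex)))
```

Feature vectors of this kind, compared with a weighted block (city-block) distance, are what the retrieval experiment above evaluates.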