Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 38 Issue 3, 16 November 2018
Survey on P2P Network Pollutions
WANG Yong,YUN Xiao-chun,QIN Zhi-guang,GUO Li,CHENG Hong-rong
Computer Science. 2011, 38 (3): 1-4. 
Abstract | PDF (478KB)
In recent years, the security issues of Peer-to-Peer (P2P) networking applications have become a hot spot of network security research. The pollution and poisoning of shared content in P2P networks present new challenges for P2P security: how to find and locate polluted data, how to model the pollution and poisoning process, and how to prevent the spread of polluted content efficiently. In this paper, the up-to-date research results were discussed in detail from three main angles: polluted-data measurement, pollution-process modeling, and pollution-constraining strategies; some key issues related to P2P network pollution were also analyzed. Finally, the paper pointed out possible future research directions on P2P network pollution.
Survey of CPU/GPU Synergetic Parallel Computing
LU Feng-shun,SONG Jun-qiang,YIN Fu-kang,ZHANG Li-lun
Computer Science. 2011, 38 (3): 5-9. 
Abstract | PDF (571KB)
With the features of tremendous computing capability, high performance/price ratio and low power consumption, heterogeneous hybrid CPU/GPU parallel systems have become the new high performance computing platforms. However, the architectural complexity of such hybrid systems poses many challenges for parallel algorithm design on this infrastructure. According to the scale of computational resources involved in the synergetic parallel computing, we classified the recent research into three categories, detailed the motivations, methodologies and applications of several projects, and finally discussed some ongoing research issues in this direction. We hope domain experts can gain useful information about synergetic parallel computing from this work.
Research and Progress in Computational Thinking
MOU Qin,TAN Liang
Computer Science. 2011, 38 (3): 10-15. 
Abstract | PDF (742KB)
Computational thinking is a widely discussed and important concept in the current international computer community, and an important issue for computer education research. It has been widely researched and discussed by many domestic and overseas scholars in the computer, academic and philosophical communities; although there are some positive results, wide differences remain. Firstly, following chronological clues and the characteristics of the formation of computational thinking, this article proposed a division into phases based on its infancy, foundation and chaos periods, broadly analyzed the formation and development process of computational thinking, and gave a panoramic view of its evolution. Then, it illuminated the key content of computational thinking, including its conception, principles, and application in education, and summarized computational thinking. Based on the existing research results, we surveyed the future research directions of computational thinking and its challenges.
Research on Virtual Sample Generation Technology
YU Xu,YANG Jing,XIE Zhi-qiang
Computer Science. 2011, 38 (3): 16-19. 
Abstract | PDF (388KB)
Virtual sample generation technology mainly studies how to combine the prior knowledge of a domain with the existing training samples to create additional training samples, enlarging the training dataset and improving the generalization ability of the learning machine. As a method of bringing prior knowledge into machine learning, virtual sample generation has become a primary means of improving generalization ability on small-sample learning problems, and has been studied widely by scholars at home and abroad. In this paper, firstly the concept of the virtual sample was introduced, two indexes for evaluating the capability of virtual sample generation technology were given, and the influence of virtual sample generation on the generalization ability of the learning machine was discussed. Then the current virtual sample generation technologies were divided into three classes by their essence, and for each class several typical technologies were discussed. Moreover, the deficiencies of current virtual sample generation technology were pointed out. Finally, we made a summary and proposed our own viewpoint on the further development of virtual sample generation technology.
Invasive Weed Optimization and its Advances
HAN Yi,CAI Jian-hu,LI Yan-lai,ZHOU Gen-gui
Computer Science. 2011, 38 (3): 20-23. 
Abstract | PDF (395KB)
Invasive Weed Optimization (IWO) is a recently proposed, simple and effective population-based numerical optimization algorithm. It has been receiving increasing attention from the academic and engineering optimization fields. IWO is inspired by the invasive and colonizing characteristics of weeds, and tries to imitate the robustness, adaptation and randomness embodied by weeds during the colonizing process. Here, the fundamental principles and framework of IWO were described in detail. Then, the advances of IWO in current optimization and engineering applications were summed up.
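A minimal sketch of the IWO loop the abstract describes — seed counts proportional to relative fitness, a dispersion standard deviation that decays over iterations, and competitive exclusion once the colony exceeds a maximum size. All parameter values below are illustrative, not taken from any particular paper:

```python
import numpy as np

def iwo(fitness, dim, bounds, pop0=5, pop_max=20, seeds=(1, 5),
        sigma=(1e-3, 1.0), n_mod=3, iters=100, rng=None):
    """Minimal Invasive Weed Optimization (minimization)."""
    rng = np.random.default_rng(rng)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop0, dim))
    for t in range(iters):
        f = np.array([fitness(w) for w in pop])
        best, worst = f.min(), f.max()
        # dispersion std-dev decreases nonlinearly over iterations
        sd = sigma[1] * ((iters - t) / iters) ** n_mod + sigma[0]
        offspring = []
        for w, fw in zip(pop, f):
            # fitter weeds produce more seeds (linear in relative fitness)
            ratio = (worst - fw) / (worst - best + 1e-12)
            k = int(seeds[0] + ratio * (seeds[1] - seeds[0]))
            for _ in range(k):
                offspring.append(np.clip(w + rng.normal(0, sd, dim), lo, hi))
        if offspring:
            pop = np.vstack([pop] + offspring)
        # competitive exclusion: keep only the fittest weeds
        f = np.array([fitness(w) for w in pop])
        pop = pop[np.argsort(f)[:pop_max]]
    return pop[0], float(fitness(pop[0]))
```

On a 2-D sphere function, `iwo(lambda z: float((z**2).sum()), 2, (-5.0, 5.0))` converges close to the origin.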
Research on Security Evaluation Model of DRM System
LI Run-feng,MA Zhao-feng,YANG Yi-xian,NIU Xin-xin
Computer Science. 2011, 38 (3): 24-27. 
Abstract | PDF (480KB)
According to the design patterns and implementations of many existing digital rights management systems, a minimal DRM system and a typical DRM system were designed. Against the usual attacks, some evaluation indices were proposed in this article, one of which is the defense intensity after attack. AHP is used in the evaluation model of the security of a DRM system. To evaluate the security, the encryption algorithm, commercial secrets and the anti-cracking ability were proposed as the foundations of security. In the last part, two existing DRM systems were evaluated using the proposed indices and quantitative indicators. The result of the contrastive experiment indicates that the security evaluation model of DRM systems is rational.
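The abstract names AHP as the aggregation method. A minimal sketch of how AHP turns pairwise judgements over the three criteria into priority weights plus a consistency check; the 1-9 comparison values below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix over three DRM criteria:
# encryption algorithm, commercial-secret protection, anti-cracking ability.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: principal eigenvector, normalized to sum to 1
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
RI = 0.58                      # Saaty's random index for n = 3
CR = CI / RI
print(w, CR)                   # CR < 0.1 means acceptably consistent
```

The resulting `w` ranks the criteria in the order the judgement matrix implies (here, encryption algorithm heaviest).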
Detecting Primary Emulation Attacks Based on Energy Fingerprint Matching in Cognitive Radio Networks
PANG De-ming,HU Gang,XU Ming
Computer Science. 2011, 38 (3): 28-33. 
Abstract | PDF (668KB)
Spectrum sensing is an essential mechanism in cognitive radio networks. While much work has been done on improving the efficiency of the sensing process, there is no appropriate scheme for secure sensing in untrustworthy environments. We proposed a mechanism based on Energy Fingerprint (EF) matching to address a typical attack in CR networks, the Primary User Emulation Attack (PUEA). Based on the distribution of their positions and through energy detection, the secondary users can create the PU's EF, which is used as a node identification to analyze the spectrum accessing mode of different users and finally detect the PUEA. Analysis and simulation demonstrate that, while keeping the probability of missing the primary's return low, it is possible to detect the emulation attack effectively.
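As an illustration of the idea only (not the paper's algorithm): an energy fingerprint can be a vector of received-energy levels measured by cooperating secondary users at different positions, and a new observation is matched against the stored PU fingerprint by distance thresholding. The function names and the threshold value are hypothetical:

```python
import numpy as np

def energy_fingerprint(readings_db):
    """Fingerprint = mean received-energy vector (dB) across repeated
    observations by the cooperating secondary users."""
    return np.mean(readings_db, axis=0)

def is_primary(obs_db, pu_fp, threshold=2.0):
    """Declare the primary user if the observed per-SU energy vector is
    close (Euclidean distance) to the stored PU fingerprint; otherwise
    flag a possible emulation attack from a different position."""
    return float(np.linalg.norm(obs_db - pu_fp)) < threshold

# Fingerprint built from two observations of the genuine PU
pu_fp = energy_fingerprint(np.array([[-60.0, -70.0, -80.0],
                                     [-61.0, -69.0, -79.0]]))
```

An attacker transmitting from a different location produces a different per-SU energy profile and fails the match.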
Research on Access Control Based on Source of Information Flow
LI Wei-nan,LI Han-chao,SHI Wen-chang
Computer Science. 2011, 38 (3): 34-39. 
Abstract | PDF (638KB)
Due to the impact of information flow on system integrity, this paper explored ways to enforce access control with consideration of information flow. It proposed a model for Access Control based on Source of Information Flow (ACSIE), in which the source of an information flow refers to the entity that is the starting point of the flow. The source of an information flow was expressed via a set of users. Integrity levels were described through sources of information flows. The dominance relationship of integrity levels was defined according to the inclusion relationship of sets. Access control rules were constructed based on information flows. Integrity-constrained subjects and constrained sets were introduced to improve the adaptability of the model. Existing system information may be utilized effectively in implementing the model; hence, the complexity of system configuration can be reduced and system usability improved.
ID-based Proactive Threshold Proxy Signature in the Standard Model
YU Yi-ke,ZHENG Xue-feng,LIU Xing-bing,HAN Xiao-guang
Computer Science. 2011, 38 (3): 40-46. 
Abstract | PDF (559KB)
At present, the security of identity-based proactive threshold proxy signature schemes is almost always proven in the random oracle model, and the threshold value of these schemes is almost always fixed. Compared with the existing ones, it is more practical to design a threshold proxy signature scheme in the standard model. Aiming at the two problems pointed out above, this paper presented a proactive threshold proxy signature scheme with a changeable threshold value, based on the hardness of the computational Diffie-Hellman problem and on a modification of Paterson's identity-based signature scheme. At last, the scheme's correctness was proven in terms of the bilinear pairing technique, and its security analysis and proof were given in detail under the computational Diffie-Hellman assumption; therefore, this scheme was proven secure and reliable.
Unstructured Peer-to-Peer Search with Routing Orientation of Query Rate
FENG Guo-fu,ZHANG Jin-cheng,LI Wen-zhong,LU Sang-lu,CHEN Dao-xu
Computer Science. 2011, 38 (3): 47-50. 
Abstract | PDF (404KB)
In decentralized unstructured Peer-to-Peer (P2P) networks, the peers are usually organized into an ad hoc overlay network, and queries are propagated over the overlay to search blindly. Since the storage location is independent of the data content, a peer has no idea which peer is more likely to satisfy a request. Therefore, it is vital to find a routing orientation and improve routing effectiveness. Semantic methods, such as interest and ontology, are commonly used in related work to cluster the peers and decrease the search range. However, these approaches have not been adopted widely because they are generally limited by the current immature semantic acquisition and description techniques. This paper proposed a novel search method, QRRO (Routing Orientation of Query Rate). In QRRO, each peer is allocated a weight identifier; a peer only indexes the files whose query rate is close to its weight. Therefore, a coupling relation is built up and the routing orientation is formed through the query rate. Our simulations show that QRRO is effective in improving the success rate and decreasing the search path length. What's more, QRRO is a pervasive method because the query rate is a non-semantic property of each file.
Self-healing Intrusion Tolerant Method Based on J2EE Application Server
ZHOU Rui-peng,GUO Yuan-bo,LIU Wei,HAN Lei-lei
Computer Science. 2011, 38 (3): 51-56. 
Abstract | PDF (575KB)
Aiming at the limitations of intrusion tolerance and self-healing, this paper presented a self-healing intrusion-tolerant model based on a J2EE application server, and presented a self-healing intrusion-tolerant method for this model. Compared with the traditional method, this method not only addresses the limitations of intrusion tolerance, for example hidden intrusions, software aging, and the vulnerable prerequisites of intrusion tolerance, but also solves the problem of intrusion during self-healing. Finally, comparison tests among the self-healing intrusion tolerance clusters, JANTM clusters and JBoss 4.0 clusters verify that the self-healing intrusion tolerance method gives the self-healing intrusion tolerance clusters higher reliability and survivability.
Anti-worm Based Defensive Scheme for P2P Worm
ZHOU Shi-jie,QIN Zhi-guang,LIU Le-yuan,DENG Yi-yi
Computer Science. 2011, 38 (3): 57-64. 
Abstract | PDF (934KB)
P2P worms employ the distinctive features of P2P networks, such as the local routing table and application-level routing mechanism, to distribute themselves quickly into the network while remaining covert. By contrast, common Internet worms generally rely on detecting the victims' IP addresses to spread. Therefore, the lack of hiding features and feasible propagation paths makes ordinary Internet worms easier to detect and defend against than P2P worms. Consequently, a P2P worm can do more damage to the network in the absence of an effective defensive scheme. In this paper, the P2P worm, especially its transmission mechanism, was analyzed comprehensively. Then, an anti-worm based scheme for the defense against P2P worms was presented. The principle and functional modules of this new scheme were addressed as well. By using the PeerSim P2P simulator, the performance of our novel scheme was evaluated experimentally under various system parameters. The primary experimental results indicated that our anti-worm based defensive scheme for P2P worms has the features of fast convergence, low overhead on networking resources (including communication traffic and computing power), and high adaptability.
Optimal Power Allocation for Wireless Multicast Networks with Regenerative Relaying Based on Cooperative Communication Technology
ZHANG Kai,QIAN Huan-yan
Computer Science. 2011, 38 (3): 65-69. 
Abstract | PDF (451KB)
Aiming at the relationship between the system performance of wireless regenerative relaying networks and the system frame error probability and outage probability, and based on the Multi-user Diversity-cooperative Protocol (MDP) of cooperative communication technology, a new transmission scheme (MDPTS) was designed, which optimizes the signal transmission process and enhances both signal transmission efficiency and network throughput. On this basis, with the minimum SFEP (System Frame Error Probability) as the target, an optimal power allocation scheme between the source and the relay was proposed, and the optimal solution of system power allocation was obtained by the Lagrange multiplier method. Simulation results show that, compared with a network using the traditional transmission scheme, a network adopting MDPTS possesses higher diversity gain and multiplexing gain and therefore better system performance.
Research of a MAC Protocol Based on Multiple-step Channel Reservation Mechanism
CHEN Yi,LI Bo
Computer Science. 2011, 38 (3): 70-72. 
Abstract | PDF (385KB)
Real-time traffic transmission over wireless ad hoc networks requires authentic and reliable QoS assurance from wireless network protocols, which is not available from most current MAC protocols. Considering the periodicity of real-time traffic, a MAC protocol based on a multiple-step channel reservation mechanism was proposed, which guarantees more reliable transmission of real-time frames. Simulation results and analysis indicate that the protocol clearly raises network performance to a higher level.
Efficient Authenticated Key Agreement Protocol in MANET
LI Xiao-qing,LI Hui,MA Jian-feng
Computer Science. 2011, 38 (3): 73-75. 
Abstract | PDF (277KB)
Based on IBE and the Diffie-Hellman key agreement algorithm, an efficient authenticated key agreement protocol for MANETs was proposed. PKG distribution and the creation of nodes' public/private keys without a trusted third party were achieved by using a distributed polynomial secret sharing scheme. The authenticated key agreement was carried out by IBS and the DH key agreement algorithm. As the protocol combines IBE and DH, it has the low storage and traffic advantages of IBE; moreover, the communication after key agreement can adopt a symmetric cryptographic algorithm, which effectively lowers computation and saves the resources of the MANET. The protocol was proved secure in theory.
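The pairwise key itself comes from a standard Diffie-Hellman exchange; a minimal self-contained sketch of that exchange (toy code, not the paper's protocol — the RFC 3526 group-14 prime is used here purely for illustration):

```python
import secrets

# RFC 3526 MODP group 14 prime; generator g = 2.
p = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
g = 2

a = secrets.randbelow(p - 2) + 1           # node A's ephemeral secret
b = secrets.randbelow(p - 2) + 1           # node B's ephemeral secret
A_pub, B_pub = pow(g, a, p), pow(g, b, p)  # exchanged in the clear
k_a = pow(B_pub, a, p)                     # A's view of the shared key
k_b = pow(A_pub, b, p)                     # B's view of the shared key
assert k_a == k_b                          # both derive the same key
```

The shared value would then seed the symmetric cipher the abstract mentions; in the paper the exchanged parameters are additionally authenticated with an IBS signature.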
Differential Fault Analysis on RSA-CRT Based on Fault in Error Checking Operation
CHEN Cai-sen,WANG Tao,TIAN Jun-jian,ZHANG Jin-zhong
Computer Science. 2011, 38 (3): 76-79. 
Abstract | PDF (345KB)
Former fault analyses cannot attack RSA-CRT implementations protected by the corresponding countermeasures. In order to find a new vulnerability to fault analysis, this paper took the BOS countermeasure as the analyzed object. An attack model based on a fault in the error-checking operation was advanced, and a differential fault analysis algorithm was given that can completely recover the RSA key. The fact that the previous countermeasures cannot effectively resist differential fault analysis was demonstrated, and the complexity of our attack was estimated both by theoretical analysis and by software simulation. Experiment results show that the new fault analysis algorithm is quite feasible: it requires fewer faulty signature samples than Wagner's attack algorithm, needing about 256 samples for a single-byte fault. Finally, corresponding advice on countermeasures against differential fault analysis was given by analyzing the problems of previous countermeasures.
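For background on why RSA-CRT faults are so dangerous (the paper's differential attack on the BOS countermeasure is more involved than this): the classic Bellcore-style attack shows that a single fault confined to one CRT half leaks a prime factor via a gcd. A toy sketch with tiny, illustration-only parameters:

```python
from math import gcd

# Toy RSA-CRT key (tiny primes for illustration only)
p, q = 61, 53
N, e, d = p * q, 17, 413           # d = e^{-1} mod lcm(p-1, q-1)
dp, dq = d % (p - 1), d % (q - 1)
q_inv = pow(q, -1, p)

def sign_crt(m, fault=False):
    sp = pow(m, dp, p)
    sq = pow(m, dq, q)
    if fault:
        sq ^= 1                    # flip a bit in the half computed mod q
    h = (q_inv * (sp - sq)) % p    # Garner recombination
    return sq + q * h

m = 1234 % N
s_bad = sign_crt(m, fault=True)
# The fault leaves the signature correct mod p but wrong mod q,
# so gcd(s'^e - m, N) reveals the factor p.
factor = gcd(pow(s_bad, e, N) - m, N)
print(factor)                      # -> 61
```

Countermeasures such as BOS add an error check precisely to block this; the paper's point is that a fault injected into the error-checking step itself reopens a differential attack.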
Internet Traffic Classification Based on Detecting Community Structure in Complex Network
CAI Jun,YU Shun-zheng
Computer Science. 2011, 38 (3): 80-82. 
Abstract | PDF (360KB)
In recent years, Internet traffic classification using port-based or payload-based methods has become increasingly difficult, with peer-to-peer (P2P) applications using dynamic port numbers, masquerading techniques, and encryption to avoid detection. Because supervised clustering algorithms need accurate training sets and cannot classify unknown applications, we introduced a community detection algorithm from complex networks, a new unsupervised classification algorithm which had previously not been used for network traffic classification. We evaluated this algorithm and compared it with the previously used unsupervised K-means and DBSCAN algorithms, using empirical Internet traces. The experiment results show that the community detection algorithm works very well in accuracy and produces better clusters; besides, it need not know the number of traffic applications beforehand.
Construction of Odd-variable Boolean Function with Optimum Algebraic Immunity
TANG Yang,ZHANG Hong,ZHANG Kun,LI Qian-mu
Computer Science. 2011, 38 (3): 83-86. 
Abstract | PDF (297KB)
Algebraic immunity is a new cryptographic criterion proposed against algebraic attacks. In order to resist algebraic attacks, Boolean functions used in many stream ciphers should have optimum algebraic immunity. This paper presented a recursive construction of Boolean functions in an odd number of variables with optimum algebraic immunity. Given any odd number, we can construct a Boolean function with optimum algebraic immunity in that number of variables.
Real-time Scheduling Algorithm of Hybrid with Fault-tolerant in Heterogeneous Distributed Systems
DENG Jian-bo,ZHANG Li-chen,DENG Hui-min
Computer Science. 2011, 38 (3): 87-92. 
Abstract | PDF (661KB)
The primary/backup approach is commonly used in heterogeneous distributed systems for fault tolerance. This paper proposed a heterogeneous distributed hybrid model with fault tolerance. Compared with traditional heterogeneous distributed scheduling models, this model can simultaneously schedule both periodic and aperiodic tasks. Three fault-tolerant scheduling algorithms based on this model were presented: SSA (Schedulability Scheduling Algorithm), aimed at schedulability; RSA (Reliability Scheduling Algorithm), aimed at reliability; and BSA (Balanced Scheduling Algorithm), aimed at load balancing. These algorithms can simultaneously process real-time tasks demanding periodic or aperiodic fault tolerance in heterogeneous systems, and they can guarantee that real-time tasks complete before their deadlines even if some node of the system fails. Finally, this paper analyzed the algorithms in five respects: schedulability, reliability cost, load balancing, number of periodic and aperiodic tasks, and cycle and granularity. Experiment results show that the algorithms have their respective advantages and disadvantages, so they should be chosen according to the characteristics of a specific heterogeneous system.
Connectivity of Wireless Multi-hop Networks under Fading Channel
LI Yan-jun,ZHU Yi-hua
Computer Science. 2011, 38 (3): 93-96. 
Abstract | PDF (415KB)
By assuming that the deployment of nodes follows a homogeneous Poisson point process, closed-form expressions for the network non-isolation probability were derived under different fading channel models as an upper bound of the 1-connectivity probability, with particular emphasis on the effects of lognormal shadowing and Rayleigh fading. The impact of cooperative communication was also addressed, showing how it can be used to enhance network connectivity. Finally, numerical simulations in synthetic wireless network scenarios show a satisfying consistency between the theoretical findings and simulation results.
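As a baseline for the bound the abstract mentions: under a homogeneous Poisson point process of intensity $\lambda$ and a unit-disk transmission radius $r_0$, the standard expressions are the ones below; shadowing and fading enter by changing the effective coverage area, and the exact expressions for those cases are derived in the paper.

```latex
% Probability that a given node has no neighbor (unit-disk model)
P_{\mathrm{iso}} = e^{-\lambda \pi r_0^2}
% Non-isolation probability of an n-node network (independence
% approximation), an upper bound on the 1-connectivity probability
P(\text{no isolated node}) \approx
    \left(1 - e^{-\lambda \pi r_0^2}\right)^{n}
    \;\ge\; P(\text{connected})
```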
Algorithmic Verification of Forward Correctability
WU Hai-ling,ZHOU Cong-hua,JU Shi-guang,WANG Ji
Computer Science. 2011, 38 (3): 97-102. 
Abstract | PDF (534KB)
Due to the incompleteness of the "Unwinding Theorem", a system cannot be judged to fail forward correctability when some local conditions of the "Unwinding Theorem" are not satisfied. This paper proposed an algorithmic verification technique to check forward correctability based on the state transition system. The technique reduces forward correctability checking to the reachability problem, and the reduction enables us to use reachability checking techniques to perform forward correctability checking. Our method is complete, and it can give a counter-example to control and eliminate the illegal information flow when a system fails to satisfy forward correctability. Finally, a disk-arm covert channel illustrates the effectiveness and practicality.
Novel Admission Control Algorithm Based on Differential Services
SHI Sheng-lin,ZHU Guang-xi,SU Gang
Computer Science. 2011, 38 (3): 103-106. 
Abstract | PDF (468KB)
EDCA in IEEE 802.11e cannot ensure the QoS of high-priority service flows when there is a lot of background traffic in the network. We proposed a novel admission control algorithm based on adaptive, consecutive, variable RTS requests, which can increase the admission probability of high-priority service flows and reduce the collisions between high-priority data flows. The theoretical analysis and simulation results show that the algorithm can significantly increase the admission probability of high-priority service flows, improve the end-to-end throughput, and reduce the retry times and delay, which ensures the QoS of high-priority service flows.
Construction of Fingerprint Key Extraction with High Performance
LI Xi-ming,YANG Bo
Computer Science. 2011, 38 (3): 107-110. 
Abstract | PDF (373KB)
In this paper, a method for evaluating fingerprint image quality was proposed and used in the construction of basic and improved fingerprint key extraction schemes. The method measures the concentration ratio of energy in the spectrum image, and the quality of a fingerprint image is denoted by the sum of all points with high energy. Key extraction experiments were done on the URU4000B database of the Institute of Automation, Chinese Academy of Sciences, and on FVC2004 DB3, and the performance of the basic and improved key extraction schemes was tested. The results show that the improved scheme has a better key extraction success rate. For example, on URU4000B, when the key extraction failure rate between pictures from two different fingers is 99.8%, the key extraction success rate for the same finger is 43.7% in the basic scheme and 46.3% in the improved scheme.
Study on Evaluation Method of MAC Protocol for Tactical Data Links
WANG Yi,WANG Wen-zheng,ZHOU Jing-lun
Computer Science. 2011, 38 (3): 111-114. 
Abstract | PDF (598KB)
The Medium Access Control (MAC) protocol has a decisive influence on the overall performance of tactical data links, and how to evaluate the performance of a MAC protocol is an important part of the life cycle of tactical data links. TDMA is the most widely used access method in tactical data links, so evaluation indexes for slot assignment protocols were proposed, and a simulation evaluation method for slot assignment protocols in tactical data links was put forward. Then, the simulation evaluation framework was designed, and the simulation models and the process of the framework were discussed. Finally, the validity of the method was verified by a simulation evaluation example.
Research on Hierarchical Optimization Strategy for Large-scale Mobile Ad Hoc Networks
YANG Juan,LI Ying,LIU Hong-fei
Computer Science. 2011, 38 (3): 115-119. 
Abstract | PDF (467KB)
For large-scale mobile ad hoc networks, hierarchical protocols can enhance the stability of the network topological structure and reduce communication relay. This paper proposed a hybrid method to construct a large-scale MANET hierarchical structure, so as to enhance the stability of the network topological structure. This method combines dynamic multi-feature fusion of nodes with Gibbs Random Field-Maximum a Posteriori estimation. Simulation experiments show that our algorithm performs better than existing algorithms in the stability of the hierarchical structure.
Secure Computation Protocol for Private Matching and Inclusion Relation against Outsourced Database System
JIANG Ya-jun,YANG Bo,ZHANG Ming-wu,CHEN Xu-ri
Computer Science. 2011, 38 (3): 120-122. 
Abstract | PDF (333KB)
A secure computation protocol based on a distributed environment, namely Protocol 1, was proposed for private matching against an outsourced database system. The data owner adopts Mignotte's secret sharing scheme to outsource a dataset. The user interacts with some third-party service providers to determine whether some elements of the user's dataset belong to the data owner's dataset: a discriminant is constructed by means of additive homomorphic encryption and secret reconstruction, and whether its value is zero ultimately realizes the private matching. In addition, another protocol, namely Protocol 2, was proposed to determine whether the user's dataset is included in the owner's dataset. In the semi-honest model, the security of the two protocols was proved by a simulator.
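Protocol 1 builds on Mignotte's (k, n) threshold secret sharing, which works over the Chinese Remainder Theorem. A toy sketch of share generation and reconstruction (parameters chosen for illustration; a real deployment would use large moduli):

```python
from math import prod

# Toy (k, n) = (2, 3) Mignotte scheme. The moduli are pairwise coprime
# and sorted so that the product of the k smallest (alpha) exceeds the
# product of the k-1 largest (beta); the secret must lie in (beta, alpha).
moduli = [11, 13, 17]
k = 2
beta = 17                      # product of the k-1 largest moduli
alpha = 11 * 13                # product of the k smallest moduli
S = 100                        # secret, with beta < S < alpha
shares = [(m, S % m) for m in moduli]

def reconstruct(subset):
    """CRT over any k shares recovers S uniquely below their product."""
    M = prod(m for m, _ in subset)
    x = 0
    for m, s in subset:
        Mi = M // m
        x = (x + s * Mi * pow(Mi, -1, m)) % M
    return x

print(reconstruct(shares[:2]))   # any 2 of the 3 shares -> 100
```

With only k-1 shares the CRT modulus drops below beta, so many candidate secrets remain consistent, which is what makes the threshold binding.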
Watermarking Algorithm for Color Images Based on Quaternion Frequency Modulation
SUN Jing,YANG Jing-yu,FU De-sheng
Computer Science. 2011, 38 (3): 123-126. 
Abstract | PDF (467KB)
Separate-channel techniques performed on color images have the disadvantage of a reduced ability to show the connections between channels during image processing. A new method to process color images with the quaternion Fourier transform was put forward in this paper. Firstly, the non-uniformity of each unit quaternion block in the host image is calculated to pick out potential blocks adaptively. Secondly, the frequency matrix of the host color image is obtained by using quaternion Fourier transforms. After that, the watermarks are embedded into the amplitudes of the AC coefficients. Experimental results show that the image distortions from the watermark are spread over the three channels of the color host image, and the proposed watermarking algorithm realizes a better trade-off between imperceptibility and robustness.
New Identity-based Key Agreement Scheme for WSN
GUO Jiang-hong,LI Xing-hua,WU Jian-qiang
Computer Science. 2011, 38 (3): 127-130. 
Abstract | PDF (335KB)
Because of the openness of the wireless channel, establishing pairwise keys between sensor nodes is the basis of secure communication in wireless sensor networks. In most existing IBE-based key agreement schemes, the pairwise key is established using the Tate pairing, which is highly energy- and time-consuming. In this paper, the authors proposed a key agreement scheme based on the BNN-IBS signature: sensors broadcast the parameters required for establishing the pairwise key and compute the key using Diffie-Hellman key exchange technology. Compared with IBE-based key agreement schemes, the proposed scheme has evident advantages in energy and time consumption with the same security and scalability.
RepTrust:Reputation Based Trust Model in P2P Environment
YANG Chao,LIU Nian-zu
Computer Science. 2011, 38 (3): 131-135. 
Abstract | PDF (460KB)
Reputation is a new method to build trust. In Peer-to-Peer networks, trust mechanisms based on reputation can isolate inferior peers and recommend excellent peers, in order to evade potential risk and provide better service. Some shortcomings were found in related research, such as over-dependency on the network structure, neglect of collusion cheating, and disregard of dynamic reputation change. Aiming at these shortcomings, RepTrust, a novel trust model based on reputation, was presented. In this model, reputation independent of the raters and of the routes through which it was obtained is used to compute trust. Mendacious reputation is filtered out, and an attenuation factor is introduced to amend the dynamic reputation. According to the simulation experiments, this model is effective in trust computing, in dealing with collusion cheating, and in handling dynamic reputation.
Short Public Key Provable Security Identity-based Signature Scheme
WANG Zhi-yi,LIU Tie,KANG Li,XIE Jing,LEI Gang
Computer Science. 2011, 38 (3): 136-139. 
Abstract | PDF (332KB)
In the standard model, a UCMA-secure IBS scheme was proposed by Paterson and Schuldt, based on the computational Diffie-Hellman problem in bilinear pairing groups. Two independent Waters identity hash functions were directly employed to treat the user's identity and the signature message, respectively, so the PS IBS scheme had a great number of public keys. An improved parameter selection method was proposed in the new scheme, which only needs a small number of public keys, and the new scheme can be proved secure in the standard model.
Data Delivery Scheme of Delay Tolerant Mobile Sensor Network Based on Node Priority
LIU Tang,PENG Jian,WANG Jian-zhong,LIU Liu
Computer Science. 2011, 38 (3): 140-143. 
Abstract | PDF (356KB)
To better solve the problem of data collection in DTMSN (delay tolerant mobile sensor networks), this paper proposed a new data delivery scheme, NPD (Node Priority Data Delivery Scheme). NPD calculates the node delivery priority, according to which nodes make decisions when delivering messages. To optimize the management of message copies, NPD adopts a dynamic message queue and decides on the principle of abandoning messages based on their survival time. Simulation results show that, compared with existing DTMSN data transmission algorithms, NPD has a higher delivery ratio in data transmission and a smaller delivery delay, and it has a relatively longer network lifetime.
Research on Network Security Situation Awareness System Architecture Based on Multi-source Heterogeneous Sensors
LAI Ji-bao,WANG Ying,WANG Hui-qiang,ZHENG Feng-bing,ZHOU Bing
Computer Science. 2011, 38 (3): 144-149. 
Abstract PDF(749KB) ( 930 )   
RelatedCitation | Metrics
Combined with the application requirements of large-scale network security monitoring, a network security situation awareness system (NSSAS) architecture based on multiple sensors was studied using the idea of 'distributed acquisition, multi-domain processing', and the corresponding ring physical architecture and hierarchical conceptual model of the NSSAS were put forward. The architecture of the NSSAS is composed of three levels: the information acquisition level, the element extraction level and the situation decision-making level, from bottom to top. The modules of every level were designed in detail, and a solution for formatting multi-source heterogeneous security information in XML was given. The NSSAS architecture based on multiple sensors is an open and extensible ring architecture that can reduce system implementation complexity and avoid the single-point failure problem. At the same time, it clearly describes the relationships among levels and components, and guides the development of engineering practice and key technologies.
Threshold Group Signature Scheme with Privilege Subsets Based on Multipartite Secret Sharing
WANG Tian-qin
Computer Science. 2011, 38 (3): 150-152. 
Abstract PDF(316KB) ( 331 )   
RelatedCitation | Metrics
A (t,n) threshold group signature scheme enables t out of n signers to sign messages on behalf of the group. A multipartite access structure can be realized by multipartite secret sharing schemes. Based on multipartite secret sharing, this paper presented a threshold group signature scheme with privilege subsets, where each participant holds a single share of the secret. A valid signature can only be generated under the cooperation of an authorized set, and every unauthorized set can learn nothing about the secret from its shares. Moreover, the scheme has the properties of anonymity and traceability.
Approach to Dynamic Services Composition Based on Knowledge Reuse
LIU Ying,HU Hai-tao
Computer Science. 2011, 38 (3): 153-158. 
Abstract PDF(611KB) ( 349 )   
RelatedCitation | Metrics
How to make full use of Web services and construct applications on demand by composing existing services in a dynamic and open environment is a challenging issue. This paper presented an approach to larger-granularity service composition for business users, which encapsulates domain knowledge into prefabricated and modifiable templates. With this approach, business users can construct applications by composing reusable, larger-granularity modules, much like assembling hardware, which facilitates automated service composition and raises the level of abstraction to some extent.
Research on Service-oriented Network Simulation and Runtime Support Platform
SUN Li-yang,MAO Shao-jie,LIN Jian-ning
Computer Science. 2011, 38 (3): 159-161. 
Abstract PDF(394KB) ( 399 )   
RelatedCitation | Metrics
With the development of distributed simulation technology, and combined with the demands of simulation system construction and operation in the military field, we presented a service-oriented network simulation construction and operation model, based on service-oriented technology, to meet the new demands of future military system simulation. First, the concepts and features of the network simulation machine were introduced. Secondly, we presented relevant research on the network simulation supporting platform and its core services, focusing on the architecture of the runtime support platform and on the construction and operation of simulation application services under that architecture. Finally, the paper gave a system analysis as a basis for future work on the runtime support platform.
Generation of a Consensus Sequence and its Applications in the Fault Location
YE Jun-min,XIE Qian,JIANG Li,LI Song-song,XU Lei
Computer Science. 2011, 38 (3): 162-165. 
Abstract PDF(367KB) ( 366 )   
RelatedCitation | Metrics
In order to improve software reliability, rapidly and accurately locating the point of failure when software fails has become a very meaningful study. Previous methods locate faults by comparing the failing run sequence with its nearest successful run sequences; to avoid the blind spots caused by selecting the nearest successful path, and the resulting growth of the search space, this paper introduced the sequence alignment principle from genetics: the nearest successful run sequence is not selected by comparison, but is produced by processing a consensus sequence. Experiments show that the method locates faults better.
Research on Flexible Constraint of Business Process Based on Reinforcement Learning
HAN Dao-jun,XIA Lan-ting,ZHUO Han-kui,LI Lei
Computer Science. 2011, 38 (3): 166-171. 
Abstract PDF(592KB) ( 366 )   
RelatedCitation | Metrics
Software business processes need higher flexibility and adaptability. Traditionally, various methods are developed to improve the flexibility of a process during the modeling period, while the running period is omitted. The environment model is hard to describe because of its dynamics, so the environment cannot be controlled during the modeling period. Because changes of the environment are reflected in the data of the running period, analyzing the processed running-period data is necessary. We built a learning algorithm based on reinforcement learning, a well-known learning technique, to learn flexible constraints from the processed running-period data, so that the adaptability of the process is improved. We evaluated our learning algorithm in a user-centred complex information system, and showed that our algorithm is effective and efficient.
Study on an Improved Approach of Web Services Trust Evaluation Based on Requirement-contexts
WANG Ling-ling,ZHANG Wei-qun,LIU Zhe
Computer Science. 2011, 38 (3): 172-174. 
Abstract PDF(330KB) ( 403 )   
RelatedCitation | Metrics
Web services of different quality exist in the Web environment, and how to choose a reliable Web service that satisfies users' requirements has become a serious problem. We proposed a trust mechanism based on requirement contexts and recommendation dishonesty. It filters out low-value historical data by calculating requirement-context similarity, and finds appropriate recommenders by fuzzy clustering of recommenders' different preferences. Then we proposed an approach based on recommendation dishonesty and honesty to distinguish recommenders' capacity, so as to weaken malicious recommenders' influence. Finally, experimental results demonstrate the feasibility and validity of the approach.
Research on Battlefield Environment Metamodel Hierarchy
SUN Guo-bing,HUANG Jin-jie,GAO Ya-juan
Computer Science. 2011, 38 (3): 175-178. 
Abstract PDF(396KB) ( 397 )   
RelatedCitation | Metrics
Currently, potency and level values are used to present the metamodel hierarchy, but the problem that the same concepts can be defined at different levels is not taken into account. In order to solve this problem, potency and domain field values were proposed to present the metamodel elements in the domain of Battlefield Environment (BE) simulation. The potency values are the instantiation times of level factors, and the domain field values indicate whether the factors can be instantiated across different levels.
Pervasive Computing Environment for Embedded System
QI Zhi-yong,CHANG Pai-pai
Computer Science. 2011, 38 (3): 179-181. 
Abstract PDF(383KB) ( 315 )   
RelatedCitation | Metrics
Adaptation between service providers and user interfaces is an important research topic in ubiquitous computing. Embedded systems play an important supporting role in providing services for ubiquitous computing; users of pervasive computing tasks also need adaptive user interfaces to access and display services, and an adaptive interface on an embedded system is an appropriate choice. However, traditional embedded system service software cannot achieve these goals well. Addressing the deficiencies of the traditional service model for embedded systems, this paper summarized the hardware of embedded systems for ubiquitous computing and named this structure the computing-unit. Subsequently, the paper proposed a new service delivery model for embedded systems, unified the standard message format of the user interaction context, and finally studied the implementation of the service model.
Processing Continuous K Nearest Neighbor Queries on Highly Dynamic Moving Objects
NIU Jian-guang,CHEN Luo,ZHAO Liang,TAN Jie
Computer Science. 2011, 38 (3): 182-186. 
Abstract PDF(919KB) ( 402 )   
RelatedCitation | Metrics
To evaluate multiple continuous k nearest neighbor queries on highly dynamic moving objects, we proposed an algorithm for processing multiple continuous k-nearest-neighbor queries based on a query index (QI-MCKNN), discussed the concept of the query index and how to construct the grid index, analyzed the effect of grid size on query processing, and provided the corresponding procedure. Finally, experimental results verify that, when dealing with highly dynamic moving objects, the proposed algorithm is much more effective than the traditional SEA-CNN algorithm, which is based on an object index.
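The query-index idea can be illustrated with a minimal sketch (the class and method names are assumptions for illustration, not the paper's code): queries, rather than objects, are registered in a uniform grid, so an object's location update only needs to be checked against the queries registered in its cell.

```python
import math
from collections import defaultdict

class GridQueryIndex:
    """Minimal sketch of a grid-based query index for continuous kNN
    monitoring (illustrative; structure and names are assumptions)."""
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cell_to_queries = defaultdict(set)   # grid cell -> query ids

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def register(self, qid, x, y, radius):
        # register the query in every cell its current search region overlaps
        c = int(math.ceil(radius / self.cell_size))
        cx, cy = self._cell(x, y)
        for i in range(cx - c, cx + c + 1):
            for j in range(cy - c, cy + c + 1):
                self.cell_to_queries[(i, j)].add(qid)

    def affected_queries(self, x, y):
        # an object update only touches queries registered in its cell
        return self.cell_to_queries[self._cell(x, y)]
```

Since most object updates fall in cells with no registered query, the cost per update is close to a single hash lookup, which is what makes such an index attractive for highly dynamic workloads.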
Associative Web Document Classification Based on Word Mixed Weight
LAN Jun,SHI Hua-ji,LI Xing-yi,XU Min
Computer Science. 2011, 38 (3): 187-190. 
Abstract PDF(351KB) ( 423 )   
RelatedCitation | Metrics
There are two shortcomings when classification based on association rules is applied to Web documents. One is that the method processes a Web document as plain text, ignoring the HTML tag information of the Web page. The other is that the items of the association rules are either only the words in the Web page, without considering their weights, or weights quantified by word frequency alone, ignoring the importance of the words' locations in the Web document. Therefore, a new efficient method was proposed in this paper. It calculates each word's mixed weight from the HTML tag features, and then mines classification rules based on the mixed weight to classify Web pages. Experimental results show that the performance of this approach is better than that of traditional associative classification methods.
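The mixed-weight idea, weighting a word by where it occurs in the HTML structure rather than by raw frequency alone, can be sketched as follows; the tag weights here are hypothetical, since the paper's exact values are not given:

```python
# Hypothetical tag weights: words in more prominent HTML tags count more.
TAG_WEIGHT = {"title": 5.0, "h1": 4.0, "b": 2.0, "a": 2.0, "body": 1.0}

def mixed_weight(occurrences):
    """occurrences: list of (tag, count) pairs for one word in a page.
    The mixed weight sums each occurrence count scaled by its tag's weight,
    so frequency and location both contribute."""
    return sum(TAG_WEIGHT.get(tag, 1.0) * count for tag, count in occurrences)
```

A word appearing once in the title then outweighs one appearing several times in the body, which is the behavior the abstract argues plain frequency weighting misses.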
XML Parsing Schema Based on Parallel Sub-tree Construction
CHEN Rong-xin,LIAO Hu-sheng,CHEN Wei-bin
Computer Science. 2011, 38 (3): 191-194. 
Abstract PDF(456KB) ( 397 )   
RelatedCitation | Metrics
Since XML parsing is a time-consuming operation which greatly affects the performance of XML applications, parallelization is an important optimization measure. Existing parsing methods need a pre-parsing stage to ensure proper data partition so that XML segments can be parsed in parallel; however, pre-parsing tends to be long-running and difficult to optimize. This paper presented a schema which supports parallel sub-tree construction upon arbitrary XML segments. Sub-trees are merged to form the whole XML tree in the final stage. Experimental results indicate that our schema can efficiently realize parallel XML parsing in a multi-core environment.
Locating Table in P2P Data Management System
SHEN Xin-peng,LI Zhan-huai,ZHAO Xiao-nan,ZENG Lei-jie
Computer Science. 2011, 38 (3): 195-198. 
Abstract PDF(353KB) ( 337 )   
RelatedCitation | Metrics
The data management system has become a focus of peer-to-peer computing, and semantic heterogeneity is the first problem for a P2P data management system. In order to solve this problem, in each data source node, some keywords of table names and attribute names were defined as the medium of semantic mapping. The semantic mapping was automatically created between heterogeneous data source nodes that had the same keywords. The keywords became the external schema of the shared tables, but were not actually materialized into views. All the keywords were distributed over the P2P network using external schema description files. When a query was executed, once the external schema description files were found, all the aliases were immediately changed into real table names or attribute names. In this way, the amount of data backup was reduced, the search algorithm was simplified and system efficiency was improved.
Finding Intentional Knowledge of Outliers Based on Attribute Subspace
WANG Yue,LIU Ya-hui,TAN Shu-qiu
Computer Science. 2011, 38 (3): 199-202. 
Abstract PDF(348KB) ( 363 )   
RelatedCitation | Metrics
Outliers usually contain important information, which can help improve users' understanding of the data. New definitions of the cause attribute subspace of outliers, the degree of the cause attribute subspace and the similarity of outliers were given, and then an algorithm for finding intentional knowledge of outliers based on attribute subspaces was proposed; the approach can obtain the cause attribute set of every outlier. The outliers were then classified by their similarity, combined with the idea of clustering, so that all the outliers of every class have the same cause attribute set under a certain precision. The experimental results show that the algorithm is effective, practical and easy to use.
Topic Extracting with User Information Protection on Web
XIE Ming,WU Chan-le
Computer Science. 2011, 38 (3): 203-205. 
Abstract PDF(406KB) ( 344 )   
RelatedCitation | Metrics
This paper proposed a method of topic extraction with user information protection for Web learning resources. We added an HMM model of user information protection states into the user information protection layer. The model can exit the topic extraction procedure automatically once it judges the user information protection state to be invalid, preventing violations of user information. This paper evaluated the HMM model of user information protection with real data sets of 227 users' query browsing behavior, user links and user configuration information on four study sites. The experimental data show that, for a test of 500 random messages, the correct rate of the model's judgment of user information protection states is 94%, and the error rate on false security messages is 0.04. According to the user rating data of the four learning sites over 120 days, the average rate of increase of user ratings reaches 23.23% after the system is used.
Algorithm Based on Sign Transformation for Paraconsistent Reasoning in Description Logic ALC
ZHANG Xiao-wang,XIAO Guo-hui
Computer Science. 2011, 38 (3): 206-212. 
Abstract PDF(623KB) ( 360 )   
RelatedCitation | Metrics
In an open, constantly changing and collaborative environment like the forthcoming Semantic Web, it is reasonable to expect that knowledge sources will contain noise and inaccuracies. It is well known that description logic, as the logical foundation of the Semantic Web, lacks the ability to tolerate inconsistent or incomplete data. Recently, some paraconsistent approaches have been used to avoid trivial inference so that inconsistencies occurring in ontologies can be tolerated. Their inference power is always weaker than that of classical description logics, even when handling consistent ontologies, since they sacrifice some of the inference power of description logics. This paper proposed a tableau algorithm based on sign transformation which has stronger reasoning ability. We proved that the new paraconsistent tableau algorithm is decidable and has the same reasoning ability as classical description logic over consistent ontologies.
Semantic Graph Discovery of TCM Documents
TAO Jin-huo,CHEN Hua-jun,HU Xue-qin
Computer Science. 2011, 38 (3): 213-217. 
Abstract PDF(566KB) ( 401 )   
RelatedCitation | Metrics
This paper proposed an ontology-based semantic graph discovery method for TCM documents. The core method includes three procedures. Firstly, extract keywords from the TCM documents, using the TCM ontology concept names as a dictionary. Secondly, calculate the frequency of the keywords. Thirdly, identify the semantic relations between the keywords with the TCM ontology knowledge base, and furthermore predict the semantic relations that cannot be identified directly. Thus every group of keywords can generate a semantic graph that expresses the possible semantics of the original sentence. In the experiment section, the TCM ontology knowledge base was used to identify semantic graphs from TCM documents and to verify the feasibility of the method.
Dynamic Threshold Rough C-Means Algorithm
WANG Dan,WU Meng-da
Computer Science. 2011, 38 (3): 218-221. 
Abstract PDF(424KB) ( 404 )   
RelatedCitation | Metrics
The selection of the parameters wl, wu and ε plays an important role in the rough C-means algorithm. In this paper, a dynamic threshold rough C-means algorithm was proposed to self-adaptively adjust the threshold ε that reflects the overlap between classes. This algorithm computes a threshold for every object on the basis of the class interval and the distance between the class and the object. The better effect is verified by experiments on two synthetic data sets and on image data.
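A minimal sketch of the per-object assignment step may clarify the idea; the threshold formula below is an assumption (the paper derives ε from class intervals), but the structure follows the abstract: each object gets its own threshold and falls either into one lower approximation or into the boundary of its two nearest classes.

```python
import numpy as np

def rough_assign(X, centers, base_eps=0.5):
    """Assign each object to a lower approximation or to boundary regions.
    eps is computed per object (hypothetical formula, for illustration)."""
    lower = [[] for _ in centers]
    boundary = [[] for _ in centers]
    for idx, x in enumerate(X):
        d = np.linalg.norm(np.asarray(centers) - x, axis=1)
        order = np.argsort(d)
        nearest, second = order[0], order[1]
        # per-object threshold, growing with distance to the nearest center
        eps = base_eps * (1.0 + d[nearest] / (d[second] + 1e-12))
        if d[second] - d[nearest] > eps:
            lower[nearest].append(idx)        # clearly in one class
        else:
            boundary[nearest].append(idx)     # ambiguous: in both boundaries
            boundary[second].append(idx)
    return lower, boundary
```

Objects near the midpoint of two centers get a threshold they cannot clear and land in both boundary regions, which is exactly the overlap the adaptive ε is meant to capture.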
Relations between Variable Precision Covering Approximation Operators and Covering Approximation Operators
LIANG Jun-qi,ZHANG Wen-jun
Computer Science. 2011, 38 (3): 222-223. 
Abstract PDF(139KB) ( 345 )   
RelatedCitation | Metrics
The model of the variable precision covering rough set was proposed under certain conditions that relax the covering standard, which yields a diversification of the approximation operators, and the changes follow certain laws. After introducing the concepts of the covering rough set model and the variable precision covering rough set model, we proved the relationships between their approximation operators; that is, we obtained the results in Theorem 1, Theorem 2 and the related corollaries.
Sequence Fuzzy Concept Lattice Model and Incremental Construction
LI Yun,YUAN Yun-hao,SHENG Yan,CHEN Ling
Computer Science. 2011, 38 (3): 224-230. 
Abstract PDF(588KB) ( 366 )   
RelatedCitation | Metrics
Traditional algorithms for mining sequence patterns only discover the frequent sequences satisfying the minimum support threshold minsup; these methods do not consider the importance of the actual sequences. In order to mine the important sequence patterns satisfying users' various demands, this paper presented a new sequence fuzzy concept lattice model. The model first introduces a weight for each item in the sequences. On the basis of these weights, we defined the weight of every sequence and a self-adaptive coefficient that may dynamically adjust the minimum support threshold minsup. The fuzzy formal context was then extended to express sequences concisely. Using the sequence fuzzy formal context, the Galois connection, sequence fuzzy concepts and the sequence fuzzy concept lattice were defined. Finally, this article presented SeqFuzCL, an incremental construction algorithm for the sequence fuzzy concept lattice. The experimental results show that SeqFuzCL can effectively express self-adaptive sequence patterns in the lattice and has excellent time-space performance. The model also provides theoretical support for mining self-adaptive sequence patterns.
Efficient Semi-supervised Active Relevance Feedback Scheme for Image Retrieval
WANG Wen-sheng,CHEN Li-ke,WU Jun
Computer Science. 2011, 38 (3): 231-235. 
Abstract PDF(513KB) ( 321 )   
RelatedCitation | Metrics
Active learning plays an important role in boosting interactive image retrieval. Among various methods, support vector machine (SVM) based active learning approaches have drawn substantial attention. However, most SVM-based active learning methods are challenged by the small-example problem, the asymmetric distribution problem, and redundancy among examples. This paper proposed two mechanisms to tackle these problems: (1) designing an asymmetric semi-supervised learning (ASL) framework that exploits unlabeled data for the semantically relevant and irrelevant classes in different ways; under ASL, the efficiency of the SVM is significantly improved; and (2) developing an active selection criterion based on a representativeness measure to identify the most informative images from the unlabeled data while augmenting the diversity among them. Experimental results validate the superiority of our scheme over several existing methods.
Soundness and Completeness Proof of Agent Intention Production in AgentSpeak
YANG Bo,SHAO Li-ping,QIN Zheng
Computer Science. 2011, 38 (3): 236-242. 
Abstract PDF(641KB) ( 504 )   
RelatedCitation | Metrics
Intention production is the procedure by which a BDI Agent drafts an action sequence to realize a goal. However, it is also a big obstacle to verify the validity of the Agent intention produced by the specification of an Agent Programming Language (APL). In this paper, to prove the validity of intention execution in AgentSpeak, we constructed a model-theoretic semantics and a formal interpretation of Agent program intentions. Then, on the basis of this model-theoretic semantics and the operational semantics presented by Moreira and Bordini, we proved the equivalence theorem: the intention in the model-theoretic semantics of the AgentSpeak language is equivalent to the one in the operational semantics of an AgentSpeak program. According to the equivalence theorem, we conclude that intention execution in AgentSpeak is sound and complete.
New Polynomial Smooth Support Vector Machine
YUAN Hua-qiang,TU Wen-gen,XIONG Jin-zhi,LIU Ting-ting
Computer Science. 2011, 38 (3): 243-247. 
Abstract PDF(365KB) ( 395 )   
RelatedCitation | Metrics
The Smooth Support Vector Machine (SSVM), which has advantages over the Support Vector Machine (SVM), is a model of the SVM for quick solving. We obtained a new smooth function for approximating the plus function by interpolation based on smooth piecewise polynomial functions, and built a new smooth support vector machine model by improving the Polynomial Smooth Support Vector Machine (PSSVM) model with this function. The approximation performance and approximation error of the new smooth function were given, along with a study of the convergence and the approximation limit of the optimum for the new model. Numerical experiments show that the new model performs better than PSSVM.
Improved Glowworm Swarm Optimization Based on the Behavior of Follow
LI Yong-mei,ZHOU Yong-quan,YAO Xiang-guang
Computer Science. 2011, 38 (3): 248-251. 
Abstract PDF(391KB) ( 319 )   
RelatedCitation | Metrics
To improve the glowworm swarm optimization algorithm, the follow behavior of the artificial fish school algorithm and a swarm degree were introduced. The improved algorithm was used to optimize multimodal functions. The experimental results show that the algorithm, with smaller populations and fewer iterations, can simultaneously capture multiple optima of several standard multimodal test functions.
Adaptive Semi-supervised Marginal Fisher Analysis
JIANG Wei,YANG Bing-ru,SUI Hai-feng
Computer Science. 2011, 38 (3): 252-253. 
Abstract PDF(260KB) ( 613 )   
RelatedCitation | Metrics
Graph-based semi-supervised methods have been successfully used in face recognition. These algorithms not only consider the label information, but also utilize a consistency assumption. Conventional algorithms assume that the consistency constraint is defined on the original feature space; however, the original feature space is not the best one for defining consistency. We proposed adaptive semi-supervised marginal Fisher analysis (ASMFA), in which the consistency constraint is defined on both the original feature space and the expected low-dimensional feature space. Experimental results on the CMU PIE and YALE-B databases demonstrate that ASMFA brings significant improvement in face recognition accuracy.
Hybrid Algorithm for Tool-path Airtime Optimization during Multi-contour Processing in Leather Cutting
YANG Wei-bo,WANG Wan-liang,JIE Jing,ZHAO Yan-wei
Computer Science. 2011, 38 (3): 254-256. 
Abstract PDF(371KB) ( 544 )   
RelatedCitation | Metrics
Tool-path airtime optimization during multi-contour processing in leather cutting is regarded as a generalized traveling salesman problem, and a hybrid intelligent algorithm was proposed. An improved genetic simulated annealing algorithm was applied to optimize the multi-contour sequence; then, combining machining characteristics, the problem was transformed into a multi-segment map problem solved with a dynamic programming algorithm. The traditional Boltzmann update mechanism was augmented with a memory function and dual thresholds to reduce the amount of computation while maintaining optimality. An individual fitness function based on the optimal substructure of the multi-segment map was designed. Practical application and standard tests show that the algorithm has satisfactory solution quality and convergence.
Neighbor Feature Extended kd-tree and Searching Algorithm for Game
XU Jian-min,LI Huan,LIU Bo-ning
Computer Science. 2011, 38 (3): 257-262. 
Abstract PDF(526KB) ( 909 )   
RelatedCitation | Metrics
Processing the interactions among large numbers of objects is the main computation task in a game system, and using a kd-tree to organize the game scene improves such computation. There is an obvious performance degradation in situations of node-crossing, as the traditional algorithm searches in a hierarchically recursive way. The concept of the neighbor feature was proposed to extend the traditional kd-tree structure, adding the planar adjacency relationships of hierarchical nodes. A new algorithm was devised that searches the tree by expanding on four sides outward from the standing node as the center. Analysis and simulation showed that the new algorithm improves performance by about 40% and is more stable than the traditional one.
MATLAB Realization on Subalgebras of Direct Product of Lattice Implication Algebras
WU Ming-hu,LIU Xu,XU Yang,MA Jun
Computer Science. 2011, 38 (3): 263-265. 
Abstract PDF(310KB) ( 337 )   
RelatedCitation | Metrics
This paper obtains the concrete forms of the subalgebras of the direct product of two finite chain-type lattice implication algebras by means of MATLAB, which is necessary for understanding the structure of lattice implication algebras more intuitively. An example was given to show the result. The paper investigated the relations of some filters of lattice implication algebras; furthermore, it discussed the filters of the direct product of two finite chain-type lattice implication subalgebras, and showed that this direct product has only trivial filters and prime filters.
Emotion Recognition of Physiological Signals Based on Intelligent Algorithm
XIONG Xie,LIU Guang-yuan,WEN Wan-hui
Computer Science. 2011, 38 (3): 266-268. 
Abstract PDF(364KB) ( 509 )   
RelatedCitation | Metrics
For the problem of emotion recognition, a genetic algorithm based on simulated annealing, a max-min ant colony algorithm and a particle swarm algorithm were used for feature selection, combined with a Fisher linear classifier, to recognize six emotions: joy, surprise, disgust, grief, anger and fear, obtaining a higher recognition rate. An effective feature subset that gives the emotion recognition model better performance was found, and a recognition system with the ability to predict the six emotions was established.
Design of a Hybrid Real-time Intelligent Information Fusion System
HUANG Kun
Computer Science. 2011, 38 (3): 269-270. 
Abstract PDF(248KB) ( 315 )   
RelatedCitation | Metrics
Confronted with complex real-time working environments, designing a high-performance, reliable information fusion system is of utmost importance. Aiming at this problem, a complex fusion system was studied on the basis of expert system technology, comprehensively using database technology, blackboard theory and neural network theory. On this basis, a hybrid real-time intelligent information fusion system was designed, and the fusion function system, knowledge base, knowledge acquisition module, database and blackboard of the fusion system were each described in detail.
Template Updating Based Adaptive Tracking Algorithm Using Mean-shift
LI Yong-yong,TAN Yi-hua,TIAN Jin-wen
Computer Science. 2011, 38 (3): 271-274. 
Abstract PDF(363KB) ( 324 )   
RelatedCitation | Metrics
This paper proposed an adaptive tracking algorithm in which the Mean-shift algorithm is used to obtain the target position. Within this framework, the target model is updated by analyzing the results of SIFT point matching using a least squares model. The algorithm can track targets fast, with the advantage that template updating and the computation of scale coefficients are performed at the same time. The experimental results show that the algorithm is very robust in cases where the scale of the target varies largely.
Robust Bilateral Filter in Consistent Sub-neighborhoods
XIAO Xiu-chun,WANG Zhang-ye,ZHANG Yu-nong,JIANG Xiao-hua,PENG Qun-sheng
Computer Science. 2011, 38 (3): 275-278. 
Abstract PDF(398KB) ( 456 )   
RelatedCitation | Metrics
A robust bilateral filtering algorithm based on consistent sub-neighborhoods was presented. Firstly, an adaptive region growing method is employed to split the local neighborhood of the seed pixel into consistent sub-neighborhoods. Then, within one of the sub-neighborhoods, the value of the seed pixel is smoothed using an improved bilateral filter. To enhance robustness, following the definition of non-local means filtering, the kernel function of this filter is defined on geometric closeness and local neighborhood window similarity. Because this algorithm combines the advantages of the bilateral filter and the non-local means filter, and also denoises within consistent sub-neighborhoods, it gains a more reasonable image effect. Simulation experiments demonstrate that the presented algorithm can remove noise effectively while preserving the detailed features of the image.
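The kernel combining geometric closeness with local-window similarity can be sketched as follows (an illustration only; the region-growing sub-neighborhood step is omitted and the function name is an assumption):

```python
import numpy as np

def combined_weight(img, p, q, sigma_s=2.0, sigma_p=10.0, win=1):
    """Weight of pixel q when filtering pixel p: geometric closeness
    (bilateral) multiplied by local neighborhood window similarity
    (non-local means). img is a 2-D grayscale array; p, q are (row, col)."""
    # geometric closeness between the two pixel coordinates
    spatial = np.exp(-np.sum((np.array(p) - np.array(q)) ** 2)
                     / (2.0 * sigma_s ** 2))
    # similarity of the (2*win+1)^2 windows centered at p and q
    patch = lambda c: img[c[0]-win:c[0]+win+1, c[1]-win:c[1]+win+1].astype(float)
    photometric = np.exp(-np.sum((patch(p) - patch(q)) ** 2)
                         / (2.0 * sigma_p ** 2))
    return spatial * photometric
```

Replacing the single-pixel intensity difference of the classic bilateral filter with a window comparison is what makes the weight robust to isolated noisy pixels.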
Cortical Surface Parcellation Using STAPLE Algorithm
HU Xin-tao,LI Gang,GUO Lei
Computer Science. 2011, 38 (3): 279-282. 
Abstract PDF(369KB) ( 481 )   
RelatedCitation | Metrics
An algorithm for simultaneous truth and performance estimation of various approaches to human cortical surface parcellation was proposed. The probabilistic true segmentation is estimated as a weighted combination of the segmentations produced by multiple methods. An Expectation-Maximization algorithm is then used to optimize the weights according to the estimated performance level of each method. Furthermore, a spatial homogeneity constraint modeled by Hidden Markov Random Field theory is incorporated to refine the estimated true segmentation into a spatially homogeneous decision. The proposed method was evaluated using both synthetic and real data, and was also used to generate reference sulcal regions to perform a comparison study of three methods for cortical surface parcellation. The experimental results demonstrate the validity of the proposed method.
Tidal Level Measurement in Ruler Image Based on Hough Transform and Harris Detection
ZHANG Chao-liang,JIANG Han-hong,ZHANG Bo,JIANG Chun-liang
Computer Science. 2011, 38 (3): 283-285. 
Abstract PDF(264KB) ( 389 )   
RelatedCitation | Metrics
Tidal level information is important in the hydrographic surveying and water area measurement domains. This paper developed a ruler reticle detection method using image processing. By reading a video sequence of a ruler pegged in the water, an algorithm based on the Hough transform and Harris corner detection can calculate the tidal level. Water experiments indicate that the method has good practicability.
Research on Family of Object Model in Semantic Feature Modeling
LIU Xian-guo,SUN Li-juan,WANG Qi-hua,ZHANG Hui
Computer Science. 2011, 38 (3): 286-289. 
Abstract PDF(354KB) ( 311 )   
RelatedCitation | Metrics
A new family-of-objects model, i.e. the declarative family-of-objects model, was proposed in this paper. The model was defined, its geometric and topological structure was presented, the semantics of features were specified by constraints, a constraint graph was used to represent the features in the model, and a grammar specification of the model was provided. The validity of the model was verified on an instance; the shortcomings of traditional procedure-based modeling were overcome, the design efficiency of CAD modeling was improved, and the design cost was reduced.
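The constraint-graph representation of features can be illustrated with a hypothetical minimal data structure (the class and relation names below are invented for illustration and do not come from the paper):

```python
class Feature:
    """A node in the constraint graph: one semantic feature of the model."""
    def __init__(self, name):
        self.name = name
        self.constraints = []  # edges: (other_feature, relation) pairs

def add_constraint(f1, f2, relation):
    """Record a semantic constraint (e.g. 'coaxial') as an undirected edge
    between two features in the constraint graph."""
    f1.constraints.append((f2, relation))
    f2.constraints.append((f1, relation))
```

In a declarative model of this kind, the solver derives a valid geometry from such constraints rather than replaying a fixed procedural history.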
Image Zooming by Combination of the Curvature-driven and Edge-stopping Nonlinear Diffusion
ZHU Xuan,LEI Wen-juan,ZHANG Shen-hua,WANG Lei
Computer Science. 2011, 38 (3): 290-291. 
Abstract PDF(247KB) ( 385 )   
RelatedCitation | Metrics
Based on the good performance of nonlinear diffusion in preserving important image features, the curvature-driven, edge-stopping model (C&E model) was proposed and applied to image zooming. The C&E model emphasizes the protection of regions with small curvature and large gradient, and at the same time has the two effects of connecting broken isophotes and strengthening edges. The initial and boundary value conditions for image zooming, as well as the numerical schemes, were further discussed. For image zooming, the C&E model exhibits better performance than the plain nonlinear diffusion model.
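One explicit iteration of edge-stopping nonlinear diffusion can be sketched as follows; this is a generic Perona-Malik-style step, not the paper's C&E model (the curvature-driven term is omitted), and it uses periodic borders for brevity:

```python
import numpy as np

def diffuse_step(u, dt=0.1, k=0.1):
    """One explicit step of edge-stopping diffusion on image u.
    The conductance c(g) shrinks where the gradient g is large, so
    strong edges diffuse less than flat regions."""
    un = np.roll(u, -1, 0); us = np.roll(u, 1, 0)   # vertical neighbors
    ue = np.roll(u, -1, 1); uw = np.roll(u, 1, 1)   # horizontal neighbors
    gn, gs, ge, gw = un - u, us - u, ue - u, uw - u  # directional differences
    c = lambda g: 1.0 / (1.0 + (g / k) ** 2)         # edge-stopping function
    return u + dt * (c(gn) * gn + c(gs) * gs + c(ge) * ge + c(gw) * gw)
```

For zooming, such a step would be iterated on the interpolated image with the known low-resolution pixels held fixed as the constraint.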
Image Denoising Using Double-density Complex Wavelet Transform
YUAN Bo,SHANG Zhao-wei,LANG Fang-nian
Computer Science. 2011, 38 (3): 292-294. 
Abstract PDF(346KB) ( 383 )   
RelatedCitation | Metrics
This paper proposed a new image denoising method based on the Double-Density Complex Wavelet Transform. The proposed algorithm uses the Double-Density Complex Wavelet Transform to decompose the noisy image and permutes the decomposition coefficients to emphasize the edge information. A bivariate model under Bayesian maximum a posteriori estimation can exploit the intra-scale and inter-scale correlations of the coefficients. Simulation results and analysis show that, compared with several current outstanding denoising methods, the proposed algorithm clearly performs better in both Peak Signal-to-Noise Ratio (PSNR) and visual quality, and effectively preserves the detail and texture information of the original images.
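The bivariate shrinkage rule that exploits the parent-child (inter-scale) dependency of wavelet coefficients has a well-known closed form (Sendur and Selesnick); a sketch, with sigma_n the noise standard deviation and sigma the signal standard deviation as assumed inputs:

```python
import numpy as np

def bivariate_shrink(w, w_parent, sigma_n, sigma):
    """Shrink a wavelet coefficient w using its parent-scale coefficient:
    w_hat = w * max(mag - sqrt(3)*sigma_n^2/sigma, 0) / mag,
    where mag = sqrt(w^2 + w_parent^2)."""
    mag = np.sqrt(w ** 2 + w_parent ** 2)
    factor = np.maximum(mag - np.sqrt(3) * sigma_n ** 2 / sigma, 0) / np.maximum(mag, 1e-12)
    return w * factor
```

Small coefficient pairs fall below the threshold and are zeroed, while a large coefficient with a large parent is kept almost intact, which is what preserves edges and texture.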
Adaptive Threshold-based Detection Algorithm for Image Copy-move Forgery
KANG Xiao-bing,WEI Sheng-min
Computer Science. 2011, 38 (3): 295-299. 
Abstract PDF(495KB) ( 343 )   
RelatedCitation | Metrics
Image forgery detection is a burgeoning research field in digital forensics. As one of the most common forms of image tampering, copy-move forgery is used to conceal objects or clone regions to produce a non-existent image scene. The target of copy-move forgery detection is to identify larger duplicated image regions that are identical or extremely similar. We reviewed several methods proposed to achieve this goal; these methods fail on digital images with homogeneous textures or uniform areas and in selecting appropriate thresholds. We presented a novel adaptive threshold-based detection algorithm for image copy-move forgery, which can be applied to color images with homogeneous or smooth regions and identifies and locates forged image regions automatically, provided that reasonable thresholds are estimated. Experimental results on several forged images with various homogeneous or uniform regions demonstrate the effectiveness of the proposed algorithm.
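The block-matching core that most copy-move detectors build on can be sketched as follows; this naive version compares raw blocks for exact equality, whereas real detectors (including the adaptive-threshold approach above) compare robust block features under a similarity threshold:

```python
import numpy as np

def find_duplicated_blocks(img, b=4):
    """Report pairs of distinct positions whose b-by-b blocks match exactly.
    Each block is hashed by its raw bytes; a second occurrence of the same
    hash marks a candidate copy-move pair."""
    h, w = img.shape
    seen, pairs = {}, []
    for y in range(h - b + 1):
        for x in range(w - b + 1):
            key = img[y:y + b, x:x + b].tobytes()
            if key in seen:
                pairs.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return pairs
```

On a forged image where one region was pasted elsewhere, the source and destination block positions show up as a matched pair.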
GPU-based Real Time Image Registration with Variant SIFT
YUAN Xiu-guo,PENG Guo-hua,WANG Lin
Computer Science. 2011, 38 (3): 300-303. 
Abstract PDF(345KB) ( 358 )   
RelatedCitation | Metrics
To address the problem that the dimension of descriptors from the SIFT variant GLOH (Gradient Location and Orientation Histogram) is too high to meet real-time requirements, improvements and parallelization were made in building the difference-of-Gaussian pyramid and locating sub-pixel keypoints. The algorithm was implemented with multiple threads on GPU hardware using CUDA (Compute Unified Device Architecture). On one hand, the keypoint information loss caused by PCA was avoided; on the other hand, the registration speed satisfied the real-time requirements of engineering applications. The programs were compiled with C and CUDA on the VS2005 platform. The results show that both the ratio of correctly matched point pairs and the registration speed are greatly improved.
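The difference-of-Gaussian pyramid that the paper parallelizes on the GPU can be illustrated with a serial CPU sketch (parameter values such as sigma0 = 1.6 follow common SIFT practice and are assumptions, not the paper's settings):

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur via 1-D convolutions with a sampled kernel."""
    r = int(3 * sigma) + 1
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(img, r, mode="edge")
    tmp = np.apply_along_axis(lambda row: np.convolve(row, k, "valid"), 1, pad)
    return np.apply_along_axis(lambda col: np.convolve(col, k, "valid"), 0, tmp)

def dog_pyramid(img, sigma0=1.6, levels=4):
    """Stack of difference-of-Gaussian images; SIFT keypoints are the
    local extrema across space and scale in this stack."""
    sigmas = [sigma0 * (2 ** (i / 2)) for i in range(levels)]
    blurred = [gaussian_blur(img, s) for s in sigmas]
    return [b2 - b1 for b1, b2 in zip(blurred, blurred[1:])]
```

On a GPU, each blurred level and each per-pixel extremum test is independent, which is exactly what makes this stage map well onto CUDA thread blocks.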
P-code Analyzing Based on BDM Model and Extension Research
DU Jian-jun,LI Hua,CHEN Ming
Computer Science. 2011, 38 (3): 304-307. 
Abstract PDF(367KB) ( 326 )   
RelatedCitation | Metrics
P-code is a new member of the Redundant Array of Inexpensive Disks (RAID) coding family; its structure, encoding method and decoding algorithm were analyzed in this paper. Furthermore, the Binary Distribution Matrix (BDM) was applied to the study of P-code for the first time. Based on this work, an approach for extending RAID-6 to tolerate more than three disk failures was concluded. The research on such vertical RAID-6 extensions of Maximum Distance Separable (MDS) codes, including P-code, should be a new study direction and a difficult research problem in this domain.
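The basic erasure-recovery mechanism underlying parity-based RAID codes can be shown with plain XOR parity; this is the single-failure building block only, not P-code's vertical parity layout or its double-failure decoding:

```python
def xor_parity(stripes):
    """Byte-wise XOR parity across data stripes of equal length."""
    parity = bytes(len(stripes[0]))
    for s in stripes:
        parity = bytes(a ^ b for a, b in zip(parity, s))
    return parity

def recover(surviving_stripes, parity):
    """Rebuild one lost stripe: XOR of all survivors with the parity,
    since x ^ x = 0 cancels every surviving stripe out of the parity."""
    return xor_parity(surviving_stripes + [parity])
```

Codes such as P-code arrange several such parity equations over the disks so that any two (and, in the proposed extensions, more) simultaneous failures remain solvable.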