Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37, Issue 8, 2010
  
P-sets and its Applied Characteristics
SHI Kai-quan
Computer Science. 2010, 37 (8): 1-8. 
Abstract PDF(666KB) ( 651 )   
P-sets are a set pair composed of an internal P-set X̄F and an outer P-set XF; that is, (X̄F, XF) is a P-set. P-sets have dynamic characteristics. By using P-sets, the concepts of information inheritance and the information inheritance metric, as well as information inheritance theorems, were given. Based on the characteristics of information inheritance, a method for information state identification and its application were proposed. Information image generation and applications of information image hidden potential were provided. P-sets are a new theory and method for studying dynamic information systems. Finally, the reality of P-sets and the differences between P-sets and Z. Pawlak's rough sets were discussed.
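The dynamic set pair described above can be illustrated with a minimal sketch (the function name and element values are hypothetical, not from the paper): starting from a base set X, deleting elements yields an internal P-set, while adding supplementary elements yields an outer P-set, and the static set X always lies between the two.

```python
# Illustrative sketch of the P-set idea: a dynamic set pair obtained from a
# finite base set X by deleting elements (internal P-set) and adding
# supplementary elements (outer P-set). All names and values are hypothetical.

def p_sets(X, deleted, added):
    """Return the pair (internal, outer) derived from base set X."""
    internal = X - deleted      # internal P-set: X with some elements removed
    outer = X | added           # outer P-set: X enlarged with new elements
    return internal, outer

X = {1, 2, 3, 4, 5}
internal, outer = p_sets(X, deleted={2, 5}, added={6, 7})
assert internal <= X <= outer   # the dynamic pair brackets the static set X
print(sorted(internal), sorted(outer))
```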
Code Similarity Detection:A Survey
XIONG Hao,YAN Hai-hua,GUO Tao,HUANG Yong-gang,HAO Yong-lei,LI Zhou-jun
Computer Science. 2010, 37 (8): 9-14. 
Abstract PDF(627KB) ( 1190 )   
Identifying program code similarity means measuring the degree of similarity between pieces of code with a detection method, which is very important in computer science education and intellectual property protection. We introduced the research purpose, the history and some conceptual models of code similarity detection. By analyzing several approaches to code similarity detection, the characteristics of each method were summarized. Then we discussed some researches based on code similarity. Finally, the problems in current technologies were concluded before discussing some promising trends in code similarity detection.
Advances in Gait Representation and Gait Fusion Methods
CHEN Chang-hong,LIANG Ji-min,ZHAO Heng,JIAO Li-cheng
Computer Science. 2010, 37 (8): 15-20. 
Abstract PDF(609KB) ( 369 )   
As one of the biometrics that can be perceived unobtrusively at a distance, gait recognition is attracting more and more attention. Efficient gait representation is critical for gait recognition, and information fusion methods are an important means to improve recognition performance. We summarized the recent advances in gait recognition from the viewpoints of gait representation and gait information fusion. We also reviewed the latest gait representation methods in detail and summarized gait information fusion methods in three categories: multi-feature fusion, multi-view fusion and multi-biometric fusion. Furthermore, the development trend of gait recognition was analyzed.
Survey of Nonlinear Filters in the Framework of Recursive Bayesian Estimation
WANG Jian-wen,LI Xun,ZHANG Hui,MA Hong-xu
Computer Science. 2010, 37 (8): 21-25. 
Abstract PDF(400KB) ( 621 )   
Nonlinear filters in the framework of recursive Bayesian estimation were classified. These filters were divided into three categories based on their design ideas: nonlinear filters based on function approximation or transformation, nonlinear filters based on moment approximation, and nonlinear filters based on approximation of the conditional posterior probability density function. At the same time, the common and special properties among the linear regression Kalman filter, the divided difference Kalman filter, the unscented Kalman filter and the Gauss-Hermite filter were discussed in detail. Deficiencies of filter synthesis, which is used to design new nonlinear filters, were indicated. The sufficiency of the design ideas behind nonlinear filters was analyzed, deficiencies in some design ideas were detected, and potential breakthrough directions for nonlinear filters were specified.
Survey on Identity-based Key Establishment Protocols for Wireless Sensor Networks
FU Xiao-jing,ZHANG Guo-yin,MA Chun-guang
Computer Science. 2010, 37 (8): 26-31. 
Abstract PDF(539KB) ( 365 )   
Wireless sensor network (WSN) security faces more challenges than traditional ad hoc networks due to WSNs' high self-organization and limited energy. Key management is one of the fundamental security issues in WSN, and as a vital part of key management, session key establishment needs to achieve a trade-off between security and performance. In this paper, a survey on identity-based key establishment in WSN was provided. We first described identity-based cryptography in terms of its three fundamental primitives, then presented some security properties and security models of identity-based key agreement protocols, analyzed the existing identity-based key establishment protocols for WSN to point out their shortcomings, and finally discussed future research directions of identity-based key establishment for WSN.
Research on Security Problems of Web Service
HE Zheng-qiu,WU Li-fa,HONG Zheng,WANG Rui,LI Hua-bo
Computer Science. 2010, 37 (8): 32-39. 
Abstract PDF(898KB) ( 397 )   
Web service is characterized by its platform independence, dynamism, openness, and loose coupling. These characteristics greatly facilitate application-to-application integration across heterogeneous platforms, but they also lead to many security problems. The security of Web service deeply influences its development and is also one of the main reasons why Web service has not been adopted widely. In this paper, we first summarized the main security problems of Web service and outlined the existing security specifications for Web service, and then analyzed the representative solutions to Web service security in detail, including message security, security policy, security in Web service composition, identity and trust management, access control, attacks and defenses, as well as development of secure Web services. On the basis of current research achievements, this paper also presented a discussion on the future research directions and challenges of Web service security.
Research on the Modeling of Wireless Network System
FU Tao,HUANG Ben-xiong
Computer Science. 2010, 37 (8): 40-46. 
Abstract PDF(787KB) ( 672 )   
Modeling is one of the most important tools in wireless system research. It is the foundation of simulation experiments and relates to many fields, such as modeling of statistical characteristics, parameter estimation, algorithm optimization and organization of all the models. Firstly, this paper summarized the common wireless systems and introduced the metrics. Secondly, channel, topology, traffic and protocol behavior models were researched in detail, and mainstream tools were evaluated. Finally, existing problems and research trends were analyzed.
Bioinformatics Approach for Molecular Evolution Research
ZHANG Shu-bo,LAI Jian-huang
Computer Science. 2010, 37 (8): 47-51. 
Abstract PDF(539KB) ( 634 )   
Phylogeny inference among different species aims to reveal the evolutionary history of certain species as well as the evolutionary relationships among them; it is an essential branch of today's life-science research. Progress in molecular biology supplies a large volume of data to the field of phylogeny inference, and research on the evolutionary relationships among different species at the molecular level is receiving considerable attention. In the past decade, much progress in this area has been achieved by means of bioinformatics, and a variety of algorithms have been devised to reconstruct the phylogenetic tree of species. In this paper, the development of evolutionary distance measures and phylogeny reconstruction algorithms was reviewed and commented on, the challenges researchers in this field face were pointed out, and perspectives in this realm were also proposed.
Incremental Search for Unstructured P2P Network
ZHU Gui-ming,GUO De-ke,JIN Shi-yao
Computer Science. 2010, 37 (8): 52-55. 
Abstract PDF(338KB) ( 325 )   
Nowadays it is quite easy for common users to share and exchange resources on the Internet through application software based on the peer-to-peer computing mode, and therefore more and more people join peer-to-peer networks to share and exchange resources. As a result, the number of peers becomes extremely large, and resources are extremely abundant and scattered. In this case, incremental search, which incrementally retrieves newly related resources for any query, is an important and challenging job. This paper presented a general incremental search model for unstructured P2P networks. The model makes incremental search try to access the semantically most related nodes, and therefore it supports Google-like searches [1] in a totally distributed P2P network.
New Approach for Analyzing E-commerce Protocols
GUO Hua,LI Zhou-jun,ZHUANG Lei,JI Hong-lin
Computer Science. 2010, 37 (8): 56-60. 
Abstract PDF(390KB) ( 346 )   
This paper presented an extended CSFM that combines the communication finite state machine (CSFM) with some new logic rules based on Qing-Zhou logic to analyze security properties of e-commerce protocols. It can not only describe the knowledge and behavior of participants, but also analyze the security properties without initial state assumptions. In addition, this method enables us to verify other security properties after abstracting and modifying the model. Using this method, accountability, fairness and atomicity were shown to be satisfied in the anonymous and failure-resilient fair-exchange e-commerce protocol. Then UPPAAL was used to verify the properties of fairness, liveness and timeliness.
ABC Supported Handover Decision Scheme Based on Population Migration
WANG Xing-wei,QIN Pei-yu,HUANG Min
Computer Science. 2010, 37 (8): 61-66. 
Abstract PDF(456KB) ( 347 )   
In this paper, an ABC-supported handover decision scheme was proposed. Comprehensively considering access network conditions, application requirements, user preferences for access network coding schemes, user preferences for access network providers, terminal velocities and terminal battery capacities, an optimal handover solution was found based on PMA (Population Migration Algorithm). With the help of gaming analysis, Pareto optimality under Nash equilibrium of user utility and network provider utility was achieved or approached for the found solution. Simulation results showed that the proposed scheme was effective.
Attacks and Defenses in Automated Trust Negotiation
LI Kai,LU Zheng-ding,LI Rui-xuan,LIU Bai-ling
Computer Science. 2010, 37 (8): 67-71. 
Abstract PDF(544KB) ( 409 )   
The purpose of Automated Trust Negotiation (ATN) is mainly to establish trust among different security domains. ATN is an approach to establishing mutual trust between strangers wishing to share resources or conduct business by gradually requesting and disclosing access control policies and digital credentials. Special attacks can be launched against ATN according to the characteristics of the way trust is established, and they cannot be effectively tackled by the measures preventing normal network attacks. Therefore, it is essential to analyze the various attacks existing in ATN. A comprehensive survey of research on attacks in ATN was presented based on the classification and introduction of different attack manners and corresponding defenses; the shortcomings of current related research were pointed out and the development trend was also discussed.
On Robust Trust Mechanisms in Ad hoc Networks
SUN Yu-xing,DU Jing-lin,XIE Li, FENG Guo-fu
Computer Science. 2010, 37 (8): 72-76. 
Abstract PDF(408KB) ( 371 )   
Packet delivery in Ad hoc networks depends on collaboration among distributed entities. The different demands that various routing protocols place on trust models were analyzed separately. Robust trust mechanisms (RTM) were proposed to meet the new demands of source routing protocols for trust models. A new scheme for gathering first-hand trust information from non-neighbor nodes using acknowledgements was presented in this paper. The RTM expands the OTMF by providing recommendation trust revision support, which is based on Bayesian decision-making theory and the minimum-loss principle and improves the accuracy of trust values. Simulations show that, compared with the OTMF, the RTM helps reduce the impact of some threats on trust management and has better convergence.
Statistic IP Network Quality's Fuzzy Evaluation Method
LUO Yun-qian,XIA Jing-bo,ZHI Ying-jian,QIAN Yuan
Computer Science. 2010, 37 (8): 77-79. 
Abstract PDF(318KB) ( 1063 )   
The definition of network quality was presented and its connotation was elaborated for scientifically evaluating network running status. Metrics were designed to reflect network quality, and the membership vector was used to compute their statistical values, which definitely reflect the metrics' ranks over the statistical period. A network quality fuzzy evaluation model based on statistics was designed. The case study evaluated the quality of four local IP networks based on their statistical values and obtained quantitative evaluation results.
Verifiable Secret Sharing Scheme Based on ElGamal Cryptosystem
LIU Yi,HAO Yan-jun,PANG Liao-jun
Computer Science. 2010, 37 (8): 80-82. 
Abstract PDF(248KB) ( 769 )   
Based on the ElGamal cryptosystem, a new verifiable secret sharing scheme was proposed. In this scheme, each participant's secret shadow is selected by the participant himself, and even the secret dealer knows nothing about it. All these shadows are as short as the secret to be shared. In the recovery phase, any participant computes only once in order to detect whether cheaters exist, and the probability of successful cheating is negligible. The secret dealer can point out the identity of cheaters if they exist. In this scheme, the secret information is fully used and the computational complexity of verification can be largely reduced. The shadows do not need to be changed when the shared secret is renewed. Moreover, each participant can share many secrets with other participants by holding only one shadow. The security of this scheme is the same as that of the ElGamal cryptosystem and Shamir's (t,n) threshold secret sharing scheme.
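The core idea of verifiability, namely that anyone can check a share against public information without learning the secret, can be illustrated by a generic Feldman-style sketch in a small discrete-log group. This is not the authors' ElGamal-based construction; the group parameters, polynomial coefficients and function names are illustrative only, and the toy parameters are far too small for real use.

```python
# Feldman-style verifiable secret sharing sketch (not the paper's scheme):
# shares of a Shamir polynomial are checked against public commitments g^c_k.
# Toy parameters: p prime, q = 233 divides p-1 = 466, g = 4 has order q.
p, q, g = 467, 233, 4
secret = 123
coeffs = [secret, 45, 67]      # degree-2 polynomial -> threshold t = 3

def share(i):
    """Evaluate the sharing polynomial at x = i, reduced mod q."""
    return sum(c * pow(i, k, q) for k, c in enumerate(coeffs)) % q

commitments = [pow(g, c, p) for c in coeffs]   # public values g^c_k mod p

def verify(i, s):
    """Check g^s == prod_k C_k^(i^k) mod p without revealing the polynomial."""
    lhs = pow(g, s, p)
    rhs = 1
    for k, C in enumerate(commitments):
        rhs = rhs * pow(C, pow(i, k, q), p) % p
    return lhs == rhs

assert all(verify(i, share(i)) for i in range(1, 6))   # honest shares pass
assert not verify(1, (share(1) + 1) % q)               # a forged share fails
```

The check works because g^share(i) equals the product of the commitments raised to the powers i^k, so cheating is detected publicly; the paper replaces this commitment machinery with ElGamal-based verification and participant-chosen shadows.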
LHL-cube Interconnection Networks and their Properties
LI Yong,FAN Jian-xi,WANG Xi,ZHOU Wu-jun
Computer Science. 2010, 37 (8): 83-87. 
Abstract PDF(367KB) ( 497 )   
The parallel processing system is one of the research focuses in computer science. The properties of an interconnection network are crucial because they determine the performance of the whole network. Many interconnection network topologies have been proposed. The hypercube topology has enjoyed popularity due to many of its attractive properties, including small diameter, strong connectivity and symmetry. But the hypercube is not the best topology in all aspects, and some of its variants have better properties. Among these variants, the locally twisted cube has drawn a great deal of attention from researchers; its superiority over the hypercube in diameter, Hamiltonian connectivity and some other properties has been proved. This paper gave a kind of connection, the hyper connection, between the nodes of the hypercube and the nodes of the locally twisted cube, thus obtaining a new interconnection network called the LHL-cube. The following properties were studied in this paper: vertex connectivity, link connectivity, Hamiltonian connectivity and diameter. The results show that the vertex connectivity and the link connectivity of the n-dimensional LHL-cube are both n. It was then proved that the n-dimensional LHL-cube is Hamiltonian-connected and that its diameter is bounded above by [n/2]+3.
P2P Traffic Identification Using Variance Analysis and Support Vector Machine Algorithm
WU Min,WANG Ru-chuan
Computer Science. 2010, 37 (8): 88-91. 
Abstract PDF(460KB) ( 353 )   
P2P traffic takes a great portion of Internet traffic. While having a significant impact on the Internet, it brings serious problems such as network congestion and traffic hindrance caused by excessive occupation of bandwidth. The paper first introduced existing methods of identifying P2P traffic and their characteristics, then put forward a P2P traffic feature selection method based on analysis of variance. Meanwhile, a model based on the Support Vector Machine (SVM) algorithm was set up to fulfill quasi-real-time identification of P2P traffic. Experimental results show that the method is efficient for P2P traffic identification and achieves high precision.
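As a hedged illustration of the variance-analysis step mentioned above, a candidate flow feature can be ranked by its one-way ANOVA F-ratio between the P2P and non-P2P classes before the selected features are fed to the SVM. The feature values below are made-up toy numbers, not real traffic measurements, and the mapping from features to classes is purely illustrative.

```python
# Rank a flow feature by its one-way ANOVA F-ratio between traffic classes:
# a high ratio of between-group to within-group variance marks a feature
# that separates P2P from non-P2P traffic well. Toy data, pure Python.

def f_ratio(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# toy samples of one feature (e.g. mean packet size) for two traffic classes
p2p     = [620, 640, 610, 655, 630]
non_p2p = [180, 210, 190, 205, 200]
discriminative = f_ratio([p2p, non_p2p])

# a feature whose distributions overlap heavily scores a much lower F-ratio
overlap_a = [10, 30, 20, 25, 15]
overlap_b = [12, 28, 22, 24, 16]
weak = f_ratio([overlap_a, overlap_b])

assert discriminative > weak   # well-separated feature ranks far higher
```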
Dynamic Decision Domain Based Admission Control Algorithm for Multi-service Network
QIU Gong-an,ZHANG Shi-bing,ZHOU Li-heng
Computer Science. 2010, 37 (8): 92-94. 
Abstract PDF(357KB) ( 332 )   
Flow-aware admission control based on no-signaling decisions provides diversified performance for different services through distinguishable access in multi-service networks. Link state spaces constructed with different service metrics can respond to state changes, and the flows in progress are modulated to a rational distribution in real time. The link state space was mapped to a dynamic decision domain by fuzzifying the thresholds of the service state metrics, and the link decision table was built at the same time. The node assures the requested quality of service for services in progress according to the service-aware category and the decision table. The stability condition of the link state transfer was deduced, namely that elastic flows are under-loaded. Simulation results show that the link fair rate changes steadily.
Elliptic Hyperbola Cryptography System
ZHANG Xin-yan,YAN De-qin
Computer Science. 2010, 37 (8): 95-98. 
Abstract PDF(286KB) ( 349 )   
In recent years, the elliptic curve cryptography system (ECCS) has played a more and more important role in cryptography; in many application areas it has replaced the traditional RSA public-key cryptosystem. Consequently, there are more and more attacks on the ECCS. In this paper, we proposed a technique that improves the safety of the ECCS while keeping its good properties, called the Elliptic Hyperbola Cryptography System (EHCS). The EHCS has the same computational complexity as the ECCS, but it has excellent randomness, so the EHCS can improve cryptographic safety in theory. The EHCS also has a more flexible operation mode than the ECCS. Therefore, the EHCS can improve the safety of information technology.
System Level Analysis of Cluster-tree Based Wireless Sensor Network Design
WAN Ya-dong,WANG Qin,ZHANG Xiao-tong
Computer Science. 2010, 37 (8): 99-103. 
Abstract PDF(396KB) ( 337 )   
Cluster-tree based topologies are widely used in wireless sensor network applications. In order to target different applications using this kind of topology, a system level analysis approach is needed for flexible and efficient protocol design and deployment. Based on a TDMA industrial MAC protocol, a system level model was proposed to reveal the relationship between network parameters and application requirements on reliability, energy efficiency and delay. Large scale deployment and experiments in a real factory field showed that the approach is useful for adjusting protocol parameters to satisfy the application requirements.
Modeling and Performance Simulating for Network Mobility Based on Satellite Communication
XIE Zhi-dong,HE Chao,ZHANG Geng-xin
Computer Science. 2010, 37 (8): 104-106. 
Abstract PDF(373KB) ( 342 )   
Network mobility (NEMO) addresses the problem of mobile users on large mobile platforms changing their access to the IP core network without disrupting communication. Satellites have the characteristics of broadcasting and large-scale, seamless coverage of the earth; thus, satellite communication is an effective and feasible way to tackle the NEMO problem. First, a model of a network mobility communication system based on mobile satellite communication was built and implemented in NS2. Then, under this model, the performance of TCP was simulated. Finally, according to the traits of satellite communication, an IP-layer handoff strategy was proposed for NEMO based on satellite communication. The strategy can reduce interruption delay and improve TCP performance.
Research and Realization of Enhancing Bluetooth Baseband Based on Security and Error Correction Algorithm
LI Zhen-rong,ZHUANG Yi-qi,ZHANG Bo,NIU Yu-feng
Computer Science. 2010, 37 (8): 107-110. 
Abstract PDF(435KB) ( 491 )   
The frequency hopping (FH) and error correction algorithms of Bluetooth were analyzed and improved to address their defects in security and anti-jamming. The performance of FH sequences based on the advanced encryption standard (AES) iterated block cipher was analyzed, and the results showed the sequences had good uniformity, correlation and security. An enhanced error correction mechanism (EECM) combining forward error correction with interleaving was adopted for Bluetooth DM packets, and simulation was carried out based on the Gilbert-Elliott channel. The results showed the EECM could largely improve the anti-interference ability. The ASIC structures of the FH sequence generator and the EECM were proposed, and these enhancing IPs were realized using low-power, low-cost VLSI design techniques. The performance of the enhanced Bluetooth baseband was analyzed and compared with the standard Bluetooth baseband. Finally, the Bluetooth system-on-chip was realized and tested using a platform-based design method.
Research of the FCSP-RTF Based on Reducing the Token's Frequency
LIU Jun-rui,FAN Xiao-ya,KANG Ji-chang
Computer Science. 2010, 37 (8): 111-113. 
Abstract PDF(275KB) ( 334 )   
In order to achieve high communication performance in high-speed fibre channel switch networks, a high-speed fibre channel switch protocol based on reducing the token's frequency, named FCSP-RTF, was put forward. On the basis of token-routing technology, FCSP-RTF codes the token frames with a reduced-frequency code, so the switch can extract the token information without a serial-to-parallel converter and complete the switching work quickly and correctly; FCSP-RTF also simplifies its frames, so that higher communication efficiency is achieved. The results show that the structure of the FCSP-RTF frame is simple and its implementation is easy, and that the communication performance and reliability of the fibre channel switch network are greatly improved while its costs are significantly lowered.
Adaptive Link-layer Hybrid FEC/ARQ Mechanism for Multi-hop Wireless Sensor Networks
JIN Yong,LE De-guang,BAI Guang-wei,CHANG Jin-yi
Computer Science. 2010, 37 (8): 114-119. 
Abstract PDF(615KB) ( 721 )   
This paper proposed an adaptive hybrid forward error correction (FEC)/automatic repeat request (ARQ) algorithm at the data link layer for wireless sensor networks, in the hope that the reliability of data transmission could be greatly improved. The algorithm uses a Kalman filter to predict the packet loss rate and adjusts the FEC parameter n accordingly. In addition, we determined that the energy consumption of ARQ depends on the wireless network condition but is independent of the retransmission strategy. Both the mathematical analyses and the simulation results demonstrate that the proposed mechanism can significantly improve the quality of communication while reducing energy consumption.
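The predict-then-adapt loop described above can be sketched as follows. This is not the authors' exact filter: the random-walk loss model, the noise variances Q and R, and the loss-to-n mapping `fec_n` are all hypothetical choices made only to illustrate how a one-dimensional Kalman estimate of the loss rate can drive the FEC parameter n.

```python
# Illustrative sketch: a scalar Kalman filter tracks the link's packet-loss
# rate, and the estimate is mapped to the number n of coded packets per
# FEC block. All constants and names here are hypothetical.

def kalman_step(x, P, z, Q=1e-4, R=1e-2):
    """One predict/update cycle for a random-walk loss-rate model."""
    P = P + Q                    # predict: variance grows by process noise Q
    K = P / (P + R)              # Kalman gain against measurement noise R
    x = x + K * (z - x)         # update estimate with measured loss z
    P = (1 - K) * P
    return x, P

def fec_n(loss, k=10, n_max=20):
    """Choose block size n >= k so redundancy covers the expected losses."""
    n = k
    while n * (1 - loss) < k and n < n_max:
        n += 1
    return n

x, P = 0.05, 1.0                 # initial loss-rate estimate and variance
for z in [0.05, 0.08, 0.12, 0.11, 0.13]:   # measured per-window loss rates
    x, P = kalman_step(x, P, z)
print(fec_n(x))                  # n grows as the estimated loss rate rises
```

The filter smooths noisy per-window loss measurements, so n changes gradually instead of chasing every fluctuation, which is the point of prediction-driven FEC adaptation.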
Learning Mechanism of Trust Evaluation of MAS Based on Experience and Reputation
LIU Zhe-yuan,MU De-jun,WANG Xiao
Computer Science. 2010, 37 (8): 120-123. 
Abstract PDF(329KB) ( 361 )   
To accurately evaluate cooperation agents, it is necessary to use the experience-based and reputation-based trust models existing in MAS. Therefore, a learning mechanism for MAS based on the learning result of an active parameter was proposed. A MAS using this mechanism gives each trust model a corresponding weight, dynamically calculates reliability and evaluates cooperation objects to obtain the optimal rewards. Simulation shows the efficiency of the learning mechanism.
FAST TCP Preference Method Based on Invariant Manifolds
CHEN Xiao-long,ZHANG Yun,LIU Zhi,YANG Ling-ling
Computer Science. 2010, 37 (8): 124-128. 
Abstract PDF(396KB) ( 352 )   
Assuming that FAST TCP connections on each route arrive as a Poisson process with exponentially distributed document sizes, a Lyapunov function including the FAST TCP connections' lingering time was constructed to prove that each route connection's lingering time is shortest on the invariant manifolds under heavy-traffic service intensity, and that the invariant manifolds are reached in finite time. To address the weakness of FAST TCP's static protocol parameter mapping method, and according to the relation between connection numbers and the protocol parameter on the reduced-order invariant manifolds, a slow-timescale iterative search method for the protocol parameter, based on the expected connection numbers and the invariant manifolds, was proposed. NS-2 simulation results were presented to verify the effectiveness of this method.
Research on Semantic Web Service Discovery with Cache
XU De-zhi,CHEN Xi-wei,CHEN Jian-er
Computer Science. 2010, 37 (8): 129-132. 
Abstract PDF(335KB) ( 341 )   
In view of the low efficiency of queries in existing semantic Web service discovery models, this paper proposed a new semantic Web service discovery model called SWSDM_Cache. It extends the traditional UDDI to a distributed framework, divides semantic Web services by domain, and stores them in the respective domain UDDI databases. In the discovery process, domain matchmaking is executed first, and cache retrieval second. Experimental results show that the model significantly improves efficiency compared with the traditional discovery strategy.
QoS-aware Web Services Discovery Scheme Based on Semantic
WANG Xiao-jun,MAO Ying-chi,QIAN Guo-feng
Computer Science. 2010, 37 (8): 133-138. 
Abstract PDF(546KB) ( 282 )   
With the increase of Web services with similar or identical functionality, QoS (Quality of Service) has become an important criterion for users to select the appropriate Web services. At present, most approaches to QoS-based Web service selection suffer from purely syntactic matchmaking, which cannot meet the requirements of complicated QoS matchmaking at the semantic level. In this paper, we proposed a semantics-based QoS-aware Web service discovery scheme. Initially, a QoS ontology was presented to incorporate QoS data into service descriptions. Then, ontology reasoning was adopted to change the previous syntactic matchmaking into a semantic one. By confirming the compatibility of concepts, complicated QoS conditions were solved as constraints via constraint programming. An optimization algorithm based on QoS was proposed to obtain the appropriate service for users. Finally, a case study was given to illustrate the effectiveness of the discovery scheme.
Model Checking Web Services Based on Timed Automata
LUO Xiang-yu,XUAN Ai-cheng,SHA Zong-lu
Computer Science. 2010, 37 (8): 139-142,197. 
Abstract PDF(444KB) ( 1132 )   
Traditional model checking techniques based on finite state automata cannot guarantee the correctness of Web service compositions with timed constraints. We regarded a Web service composition as a multi-agent system. Each atomic Web service was modeled as a timed automaton, and by parallel composition of them a network of timed automata was generated and input into the model checker UPPAAL. Using the proposed method and UPPAAL, we were able to simulate the execution process and verify the liveness, safety and deadlock properties of a Web service composition. We took the atomic services of an employee evection arrangement service as a case study of the proposed method and verified some related liveness and safety properties. A deadlock problem in the case study was found by simulation. By analyzing the execution path leading to the deadlock state, we found the reason and finally eliminated the deadlock by revising the communication protocol of the Web service composition.
Research on Collaborative E-government Model of Access Control Based on the Distribution of Role and Authority
ZHAO Zai-jun
Computer Science. 2010, 37 (8): 143-145,250. 
Abstract PDF(371KB) ( 340 )   
This paper first analyzed the collaborative characteristics of e-government and described the security problems e-government systems face in a multi-domain collaborative work environment. Drawing on the idea of role-based access control, we advanced an access control model for e-government based on the distribution of user, role and authority. By introducing the concepts of authority period and effect scope, we established the distribution mechanism of role and authority, described the operating mechanism of the model, and gave the idea of an algorithm for reconciling access violations when they occur. Finally, we gave a typical administrative activity, document circulation countersigning, as an example, concretely specified the application method of the model, and effectively solved the problem of system security and information confidentiality for e-government in a multi-domain environment.
Practical Framework of Object Persistence
GU Si-shan,ZHAO Li-yang,LI Shi-xian
Computer Science. 2010, 37 (8): 146-151. 
Abstract PDF(532KB) ( 307 )   
By separating cross-cutting concerns from core concerns and supplying additional mechanisms to modularize cross-cutting concerns, aspect-oriented programming gracefully solves the problems of code scattering and code tangling that arise when object-oriented techniques deal with cross-cutting concerns. Just like logging and secure authentication, persistence is considered a classic cross-cutting concern and is suited to being dealt with using AOP. After analyzing state-of-the-art persistence frameworks and implementations, we found most of them cared too much about the obliviousness property, which is inherent in AOP, and hardly meet the needs of actual applications in either functionality or performance. A practical framework of object persistence was proposed after probing into the characteristics of persistence and the mechanisms of aspectizing persistence. The framework not only preserves the functionality and performance achieved by the object-oriented solution, but also gains higher reusability, maintainability and portability by aspectizing persistence.
Design and Implementation of the Software-sensor of Self-adaptive System
WU Bin,MAO Xin-jun,DONG Meng-gao,LI Xue-si
Computer Science. 2010, 37 (8): 152-155,293. 
Abstract PDF(439KB) ( 292 )   
Along with the popularization of Internet applications, more and more software systems operate in an open environment and need to perceive and adapt to changes in that environment. How to support the development of such complex software systems has become a major challenge to current software engineering. We discussed the problem between the adaptive system and its environment, and abstracted the software entities of an adaptive system as autonomous agents. We put forward the idea of dynamically associating software sensors with the environment, and gave the design and implementation of the software sensor. Unlike existing research, we regarded the software sensor as a special kind of software agent. Finally, the paper demonstrated the feasibility and validity of these ideas and techniques through case studies.
Tool Implementation of Non-functional Verification for Component-based Embedded Software Designs
XU Bing-feng,HU Jun,CAO Dong,HUANG Zhi-qiu,GUO Li-juan,ZHANG Jian
Computer Science. 2010, 37 (8): 156-163. 
Abstract PDF(769KB) ( 308 )   
RelatedCitation | Metrics
Non-functional properties of an embedded software system are considered one of the important features for the high-reliability assurance of the whole system. Traditional reliability methods in the embedded computing domain mostly concern the functional implementation and testing phases, without effective tools supporting the analysis and verification of system designs, especially for non-functional properties. In this paper, a prototype tool TCBESD (Tool for Component-Based Embedded Software Designs) was extended with analysis and verification capabilities considering both resource utilization and energy consumption properties, including the input/output mechanisms for resource interface automata and energy automata respectively, the pre-translation from a UML sequence diagram to a set of message event sequences, the state space data structure designs with non-functional semantics, the implementation issues of several analysis and verification algorithms for resource and energy consumption properties, and an example of a component-based system design analysis.
Research of Integration of Middleware Architecture
WANG Qiong,DU Cheng-lie,CAI Xiao-bin,LI Gang
Computer Science. 2010, 37 (8): 164-167. 
Abstract PDF(357KB) ( 354 )   
RelatedCitation | Metrics
Building large-scale complex systems in the military field needs the support of middleware technologies. The crucial techniques and the integration mechanism of middleware architecture were researched. This paper introduced a formal description method for software architecture based on TDB-calculus and proposed a performance-bound component interface model. We then studied the integration compatibility between component and system based on this model. This study has already been applied in research on a virtual-test real-time soft bus, with good results.
Optimizing Sparse Matrix-vector Multiplication Based on GPU
BAI Hong-tao,OUYANG Dan-tong,LI Xi-ming,LI Ting,HE Li-li
Computer Science. 2010, 37 (8): 168-171. 
Abstract PDF(422KB) ( 1703 )   
RelatedCitation | Metrics
Sparse matrix computations present additional challenges for harnessing the potential of modern graphics processing units (GPUs) for general-purpose computing. We investigated various optimizations on thread mapping, data reuse, etc., and a parallel sparse matrix-vector multiplication (SpMV) on the GPU with the compute unified device architecture (CUDA) was then proposed under the compressed sparse row (CSR) structure. The optimizations include: (1) exploiting each element using half-warp threads, which are synchronization-free within one warp; (2) constructing aligned addresses to achieve coalesced accesses; (3) data reuse by reading the vector from the texture memory it resides in; (4) data transfer using page-locked memory; (5) collecting results in shared memory. We compared the performance of our approach with that of efficient parallel SpMV implementations from NVIDIA's CUDPP library and NVIDIA's SpMV library. Our approach outperforms both in memory bandwidth and GFLOPS. In addition, the total performance of our approach is three times greater than that of a CPU counterpart.
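As a rough serial sketch of the CSR storage scheme the kernel operates on (hypothetical Python, not the authors' CUDA code; on the GPU each row's dot product is shared among threads):

```python
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    """Multiply a CSR-format sparse matrix by a dense vector x."""
    n_rows = len(row_ptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):
        start, end = row_ptr[i], row_ptr[i + 1]
        # Nonzeros of row i are values[start:end] in columns col_idx[start:end].
        y[i] = np.dot(values[start:end], x[col_idx[start:end]])
    return y

# 3x3 example matrix [[1,0,2],[0,3,0],[4,0,5]] in CSR form:
values  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
col_idx = np.array([0, 2, 1, 0, 2])
row_ptr = np.array([0, 2, 3, 5])
x = np.array([1.0, 1.0, 1.0])
print(spmv_csr(values, col_idx, row_ptr, x))  # [3. 3. 9.]
```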
Recursive Semantic Consistency of Sequence Diagram and State Diagram
ZHOU Xiang,SHAO Zhi-qing
Computer Science. 2010, 37 (8): 172-174. 
Abstract PDF(329KB) ( 788 )   
RelatedCitation | Metrics
The dynamic diagrams in UML are used extensively to model object-oriented software systems; sequence diagrams describe message transfer while state diagrams emphasize behavior. However, the lack of formal semantics may lead to confusion between these diagrams, which are often used together in the development of large systems. In particular, this confusion can lead to deadlock of state diagrams during recursive message transfer. This paper proposed a solution to this problem using real-time multi-agent ASM combined with formal rules. Specifically, we improved reliability by using multi-level agents to control transitions, so that the state diagram stays consistent with the sequence diagram during complex message transfer.
Research and Design of Fault Monitoring Mechanism Based on Autonomic Computing
LIU Wen-jie,LI Zhan-huai,ZHOU Yun-tao
Computer Science. 2010, 37 (8): 175-177. 
Abstract PDF(267KB) ( 329 )   
RelatedCitation | Metrics
Autonomic computing is an effective technique for achieving system self-management in distributed heterogeneous computing environments. Its aim is to automatically discover hardware and software faults and recover from them with policy technology, thereby realizing system self-management. Fault monitoring is therefore an important research direction. This paper proposed an event classification method to design the fault monitoring of an autonomic computing system. This method can monitor all the resources in a heterogeneous environment, report faults to the AC system, and then activate the policy to recover from system faults, which provides the basis for system self-recovery.
Multi-fractal Based Methodology for Software Aging
XU Jian,XU Man-wu,YAN Han,LI Qian-mu
Computer Science. 2010, 37 (8): 178-181. 
Abstract PDF(356KB) ( 422 )   
RelatedCitation | Metrics
This paper discussed a multi-fractal based method to analyze the fluctuation of system resource parameters, and proposed a new methodology combining qualitative and quantitative analysis to predict resource consumption and the trend of software aging. Firstly, this study used fractal theory to discuss the fractal structure of the system resource parameters that influence the performance of a software system. The results show that the variations of the parameters are not a stochastic process, but have characteristic fractal properties. In addition, the characteristics of the spectra can be used to qualitatively analyze the changes of system parameters at run time. Secondly, this paper put forward a new methodology for calculating the multifractal exponent, which was applied to system resource usage data. Thirdly, the Auto-Regressive Moving-Average (ARMA) model was adopted to analyze the multifractal exponent and build the corresponding forecast model. Finally, an experiment was carried out to calculate the multifractal exponent of parameter series related to several memory resources, using parameter data collected from a real software system. The results of the experiment indicate an effective ability to predict software aging.
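A minimal autoregressive forecast in the spirit of the ARMA step can be sketched as follows (pure NumPy, AR-only; the model order and the perfectly linear demo series are hypothetical, not the paper's data):

```python
import numpy as np

def fit_ar(series, p=2):
    """Fit an AR(p) model y_t = c + a1*y_{t-1} + ... + ap*y_{t-p} by least squares."""
    y = series[p:]
    lags = [series[p - k - 1:len(series) - k - 1] for k in range(p)]
    X = np.column_stack([np.ones(len(y))] + lags)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [c, a1, ..., ap]

def forecast(series, coef, steps=5):
    """Roll the fitted recurrence forward to predict future resource usage."""
    hist, p, out = list(series), len(coef) - 1, []
    for _ in range(steps):
        nxt = coef[0] + sum(coef[k + 1] * hist[-1 - k] for k in range(p))
        hist.append(nxt)
        out.append(nxt)
    return out

series = np.arange(1.0, 21.0)          # toy linearly growing usage series
coef = fit_ar(series, p=1)
print(forecast(series, coef, steps=3))  # ≈ [21.0, 22.0, 23.0]
```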
Dynamic-stochastic Influence Diagrams for Decision Analysis
ZHAO Xin,LI Qun,ZHU Yi-fan
Computer Science. 2010, 37 (8): 182-185. 
Abstract PDF(431KB) ( 379 )   
RelatedCitation | Metrics
As an extension of Influence Diagrams, Time-Sliced IDs (TSIDs, or Dynamic IDs) effectively increase the capability of describing cause-effect relationships for decision analysis. However, they still cannot suitably describe the concurrent, alternating, or coordinated processes of a complex system. Based on an analysis of discrete-event system models, the paper proposed Dynamic-Stochastic IDs (DSIDs) by introducing a time node to extend the existing model criteria, explicated their graphical syntax, and gave their evaluation algorithm. DSIDs can serve as an effective approach to improving existing Time-Sliced IDs for describing complex processes.
Performance Test and Analysis of Alltoall Collective Communication on Domestic
RAO Li,ZHANG Yun-quan,LI Yu-cheng
Computer Science. 2010, 37 (8): 186-188. 
Abstract PDF(328KB) ( 885 )   
RelatedCitation | Metrics
With the rapid development of high performance computers, more and more cores are used, leading to more and more communication, which greatly degrades the performance of parallel applications. When testing the performance of Dawning 5000A with a kind of Fast Fourier Transform (HFFT), we found that the huge overhead of MPI_Alltoall is the bottleneck of HFFT. This paper therefore tested and analyzed the leading Alltoall algorithms on Dawning 5000A and Deepcomp 7000, hoping to inform further collective communication optimization. The leading Alltoall algorithms, such as 2D_Mesh, 3D_Mesh, Bruck, MPICH native, Pair, recursive doubling, Ring, and LAM/MPI simple, were recounted and tested with different message sizes and core numbers. The conclusion is that for short messages MPICH native and Bruck perform well on Dawning 5000A, while the lowest time-consuming algorithms on Deepcomp 7000 are LAM/MPI simple and Bruck; for medium and large messages, the best choice for Dawning 5000A is Ring, while the optimal algorithm on Deepcomp 7000 is Pair.
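To illustrate one of the compared algorithms, the exchange schedule of the Pair algorithm for a power-of-two process count can be sketched as follows (a hypothetical illustration of the standard pairwise-exchange pattern, not the tested MPI implementations):

```python
def pair_schedule(nprocs):
    """Per-step exchange partners for the Pair all-to-all algorithm:
    in step s (1 <= s < nprocs), rank r trades its block with rank r ^ s,
    so every step is a perfect pairing of the ranks."""
    assert nprocs & (nprocs - 1) == 0, "power-of-two process counts only"
    return [[rank ^ step for rank in range(nprocs)]
            for step in range(1, nprocs)]

for step, partners in enumerate(pair_schedule(8), start=1):
    print(step, partners)
```

Because XOR is an involution, each step pairs the ranks symmetrically: if r's partner is q, then q's partner is r, so both sides can post a single send/receive exchange.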
Prediction of Trajectory Based on Markov Chains
PENG Qu,DING Zhi-ming,GUO Li-min
Computer Science. 2010, 37 (8): 189-193. 
Abstract PDF(406KB) ( 3261 )   
RelatedCitation | Metrics
In this paper, a prediction method based on Markov chains was proposed, which supports moving-object trajectory prediction on traffic networks. The method builds on the characteristics of traffic networks, relies on statistics, and effectively uses historical trajectories. Finally, the paper discussed some optimizations of the data structures and algorithm, and analyzed the time and space complexity. Experimental studies indicate that the Markov-chain based prediction gives satisfying results.
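The statistical core of such a method can be sketched as a first-order transition matrix estimated from historical trips (hypothetical Python; segment names and trips are illustrative):

```python
from collections import defaultdict

def build_transition_matrix(trajectories):
    """Estimate P(next road segment | current segment) from historical trips."""
    counts = defaultdict(lambda: defaultdict(int))
    for trip in trajectories:
        for a, b in zip(trip, trip[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def predict_next(matrix, segment):
    """Most probable next segment under the first-order Markov model."""
    return max(matrix[segment], key=matrix[segment].get)

trips = [["A", "B", "C"], ["A", "B", "D"], ["E", "B", "C"]]
m = build_transition_matrix(trips)
print(predict_next(m, "B"))  # C (follows B in 2 of 3 historical trips)
```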
Data Stream Fuzzy Clustering Algorithm Based on Relative Density
LIU Qing-bao,WANG Wen-xi,MA De-liang
Computer Science. 2010, 37 (8): 194-197. 
Abstract PDF(362KB) ( 281 )   
RelatedCitation | Metrics
This paper provided a relative-density based data stream fuzzy clustering algorithm, which inherits the advantages of relative-density based clustering and fuzzy clustering and can therefore discover arbitrarily shaped and multi-resolution clusters. With a subtraction operator on the set of micro-clusters, defined according to the spatial overlapping relations among micro-clusters, the algorithm can cluster over any user-specified data stream window. Compared with the CluStream algorithm in terms of clustering quality and processing time, this algorithm demonstrates a clear advantage.
New Spatial Data Partition Approach for Spatial Data Query
JIA Ting,WEI Zu-kuan,TANG Shu-guang,KIM Jae-hong
Computer Science. 2010, 37 (8): 198-200. 
Abstract PDF(247KB) ( 381 )   
RelatedCitation | Metrics
In a parallel spatial database, it is necessary to cluster the spatial data set at each node, because doing so improves the efficiency of parallel database queries. The partition approach of Oracle Spatial is grid-based: it only considers a balanced division of the data across nodes, without taking into account the topological characteristics of the data. To address this problem, this paper presented a new spatial data partition approach based on the K-means clustering algorithm. Experiments show that the method greatly improves spatial data retrieval and query efficiency in parallel.
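The K-means based assignment of spatial objects to nodes can be sketched as follows (a minimal NumPy Lloyd's iteration, assuming 2-D point data; not the paper's implementation):

```python
import numpy as np

def kmeans_partition(points, k, iters=20, seed=0):
    """Assign spatial objects to k nodes so that nearby objects co-locate."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # distance of every point to every center, then nearest-center labels
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels

pts = np.array([[0.0, 0], [0, 1], [1, 0], [9, 9], [9, 10], [10, 9]])
print(kmeans_partition(pts, 2))  # two well-separated groups -> two nodes
```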
On Web Source Quality Pattern Mining Approaches
ZHU Yan
Computer Science. 2010, 37 (8): 201-207. 
Abstract PDF(667KB) ( 320 )   
RelatedCitation | Metrics
The key to the success of Web-based information management, business intelligence and decision making systems is high quality information from the Web. However, Web source quality is very problematic due to the peculiar characteristics of the Web, such as the dynamics and autonomy of Web sources, the enormous amount and various types of Web data, and the multifarious quality requirements of Web applications. There has been some work on Web source quality management. In this paper, the quality requirements of advanced Web-based applications (e.g. business intelligence) and the quality challenges of Web sources were analyzed. The state of the art in Web source quality pattern discovery and evaluation was surveyed. Data mining and related approaches for dealing with Web quality issues were investigated to reveal many still unsolved problems and to suggest several important research directions.
XML United Signature
ZHENG Xiao-mei,ZHANG Tian
Computer Science. 2010, 37 (8): 208-213. 
Abstract PDF(574KB) ( 353 )   
RelatedCitation | Metrics
This paper first analyzed the typical application of XML multi-party communication chain business in current Web server architectures and built the corresponding research model. Based on the model, this paper proposed a new technology named united signature, which can solve the problems of reduplicated signatures and loose information association that arise with traditional signature technology. This paper also analyzed the limitations of using the current XML specifications to fulfill this technology, and then presented an implementation scheme for XML united signature. Finally, this paper presented the syntax definition of XML united signature.
Clustering Ensemble Algorithm Based on Mathematical Morphology
LUO Hui-lan,WEI Hui
Computer Science. 2010, 37 (8): 214-218. 
Abstract PDF(467KB) ( 345 )   
RelatedCitation | Metrics
In this paper, a clustering ensemble algorithm named CEOMM was proposed, which combines multiple clustering cores explored by different structure elements to obtain a desirable and correct clustering core of a data set. CEOMM then obtains the clustering of the data set based on the ensemble clustering core. Experimental results demonstrate that CEOMM can cluster data with complex cluster shapes better than the classical clustering algorithms, and that it can also find the optimal number of clusters. Moreover, because it uses different structure elements, CEOMM can discover overlapping clusters with arbitrary shapes.
Study of Temporal Difference Learning in Computer Games
XU Chang-ming,MA Zong-min,XU Xin-he,LI Xin-xing
Computer Science. 2010, 37 (8): 219-223. 
Abstract PDF(423KB) ( 621 )   
RelatedCitation | Metrics
The Temporal Difference (TD) learning algorithm was used to adjust the weights of an evaluation function, with the game Connect6 as a testbed, so that the weight adjustment process can be done automatically. A new evaluation scheme was proposed, which solves the difficulty of organically combining prior knowledge with a multi-layer neural network. For this specific application, a method of selecting part of the whole TD sequence to learn from was proposed, by which the interference of useless states is prevented to a certain extent. After 10020 self-learning training games, the winning rate against the same Connect6-playing program increased by around 8%, which is a good result.
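The TD(0) weight-adjustment rule for a linear evaluation function can be sketched as follows (hypothetical Python with toy features; the paper combines TD with a neural-network evaluator for Connect6):

```python
import numpy as np

def td0_update(w, features, reward, next_features, alpha=0.01, gamma=1.0):
    """One TD(0) step for a linear evaluation function V(s) = w . phi(s)."""
    v, v_next = w @ features, w @ next_features
    delta = reward + gamma * v_next - v   # temporal-difference error
    return w + alpha * delta * features   # move V(s) toward the TD target

w = np.zeros(3)
phi_s    = np.array([1.0, 0.0, 1.0])      # features of the current position
phi_next = np.array([0.0, 1.0, 1.0])      # features after the move
w = td0_update(w, phi_s, reward=1.0, next_features=phi_next)
print(w)  # delta = 1.0, so w moves to [0.01 0.   0.01]
```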
Study on the Fast Training Algorithm of Iteratively Re-weighted Least Squares Support Vector Machine
WEN Wen,HAO Zhi-feng,SHAO Zhuang-feng
Computer Science. 2010, 37 (8): 224-228. 
Abstract PDF(474KB) ( 937 )   
RelatedCitation | Metrics
The iteratively re-weighted method is an important approach to improving the robustness of the least squares support vector machine (LS-SVM). However, the re-weighting and re-training procedure demands a lot of computational time, which makes it impractical for real applications. In this paper, the iteratively re-weighted least squares support vector machine (IRLS-SVM) was studied and an improved training algorithm for it was proposed. The algorithm is based on a novel numerical method and can effectively reduce the computational complexity of IRLS-SVM. Three different weight functions were implemented in the IRLS-SVM. Experiments on simulated instances and real-world datasets demonstrate the validity of this algorithm. Meanwhile, the results reveal that different weight functions may require different computational time in the fast training algorithm of IRLS-SVM.
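The re-weight-and-retrain loop can be sketched as follows (a naive NumPy version that re-solves the weighted LS-SVM KKT system each pass with one Huber-style weight function; the paper's contribution is precisely a faster numerical scheme than this):

```python
import numpy as np

def irls_lssvm(K, y, gamma=10.0, iters=5, c=1.345):
    """Iteratively re-weighted LS-SVM: solve the KKT linear system, then
    down-weight samples with large residuals (Huber-style) and re-solve."""
    n = len(y)
    v = np.ones(n)                              # per-sample weights
    for _ in range(iters):
        A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                      [np.ones((n, 1)), K + np.diag(1.0 / (gamma * v))]])
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        b, alpha = sol[0], sol[1:]
        e = alpha / (gamma * v)                 # LS-SVM residuals: alpha_i = gamma*v_i*e_i
        s = max(np.std(e), 1e-12)
        r = np.abs(e / s)
        v = np.where(r <= c, 1.0, c / r)        # Huber weight function
    return alpha, b

# tiny demo: RBF kernel on three points; large gamma gives near-interpolation
X = np.array([0.0, 1.0, 2.0])
K = np.exp(-(X[:, None] - X[None, :]) ** 2)
alpha, b = irls_lssvm(K, np.array([0.0, 1.0, 0.0]), gamma=1e6, iters=1)
print(K @ alpha + b)  # ≈ [0, 1, 0]
```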
Evaluation of Attention-based Model of Artificial Consciousness
ZHU Chang-sheng,WANG Zhi-liang
Computer Science. 2010, 37 (8): 229-231. 
Abstract PDF(336KB) ( 474 )   
RelatedCitation | Metrics
Emotional robots have become a focus of study. However, establishing a universal affective model and equipping a robot with an emotional logic system so that it can have real affective thinking still faces many difficulties. Building on current emotional models, psychological theories of consciousness were studied deeply; based on Maslow's hierarchy of needs, this paper built an artificial consciousness model which is sensitive to outside stimulus and based on attensity evaluation. Experiments have shown that this model provides a new way to establish robot emotional models.
Adaptive De-noising Method of Ballistocardiogram Based on Orthogonal Wavelet Transform
JIN Jing-jing,WANG Xu,YU Yan-bo,JIANG Fang-fang
Computer Science. 2010, 37 (8): 232-235. 
Abstract PDF(307KB) ( 660 )   
RelatedCitation | Metrics
The least mean square (LMS) adaptive de-noising method for the ballistocardiogram based on orthogonal wavelet transform was researched, and its principle was analyzed. The decomposition scale was determined by the central frequency of the ballistocardiogram, obtained from joint time-frequency analysis based on an adaptive radial Gauss kernel function, and a level-determination approach for the adaptive filter was proposed by choosing the wavelet base and the length of the input signal. The realization was described from the matrix point of view, and the reason why the orthogonal wavelet transform can improve the convergence speed of the LMS algorithm was explained. The experimental results show that, with this method, the adaptively de-noised ballistocardiogram becomes steady more quickly and its waves along the cardiac cycle are clearer. Comparing the power spectral density of the signal before and after de-noising, the LMS adaptive de-noising method based on orthogonal wavelet transform preserves the characteristics of the ballistocardiogram while removing time-varying noise, and obtains better de-noising results.
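The underlying LMS update can be sketched in the time domain (a minimal NumPy version; the paper applies LMS to wavelet-transformed coefficients, which this sketch omits):

```python
import numpy as np

def lms_filter(d, x, taps=8, mu=0.01):
    """Plain LMS: adapt weights w so that w . [x[n-1], ..., x[n-taps]]
    tracks the desired signal d[n]."""
    w = np.zeros(taps)
    err = np.zeros(len(d))
    for n in range(taps, len(d)):
        u = x[n - taps:n][::-1]          # most recent samples first
        err[n] = d[n] - w @ u            # instantaneous error
        w += 2 * mu * err[n] * u         # steepest-descent weight update
    return w, err

# demo: learn a pure one-sample delay; the optimal weights are [1, 0, 0, 0]
rng = np.random.default_rng(0)
x = rng.standard_normal(3000)
w, err = lms_filter(np.roll(x, 1), x, taps=4, mu=0.02)
print(round(float(w[0]), 2))  # 1.0
```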
Construct Ensembles of Bayes-based Classifiers Using PCA and AdaBoost
CHEN Song-feng,FAN Ming
Computer Science. 2010, 37 (8): 236-239256. 
Abstract PDF(399KB) ( 520 )   
RelatedCitation | Metrics
We presented a novel method for constructing ensembles of Bayes-based classifiers called PCABoost. To create a training set, our method splits the feature set into K subsets randomly, and applies principal component analysis to each feature subset to get its corresponding principal components. All the principal components are then put together to form a new feature space, into which the whole original dataset is mapped to create a new training set. Different runs of this process generate different feature spaces and different training sets. On each new training set we generated a group of classifiers which were boosted one by one using AdaBoost, so several different classifier groups were generated in several different feature spaces. In the classification phase we first obtained a prediction from each classifier group by weighted voting inside the group, and then voted over these group predictions to get the final ensemble result. Experiments were carried out on 30 benchmark datasets picked randomly from the UCI Machine Learning Repository; the results indicate that our method not only improves the performance of Bayes-based classifiers significantly, but also achieves higher accuracy on most datasets than other ensemble methods such as Rotation Forest and AdaBoost.
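The pipeline can be sketched with scikit-learn (a hypothetical simplification: K = 2 feature subsets, three feature-space variants, Gaussian naive Bayes as the base learner, plain majority voting across groups; not the authors' exact configuration):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

groups = []
for _ in range(3):                        # three random feature-space variants
    perm = rng.permutation(X.shape[1])
    halves = np.array_split(perm, 2)      # K = 2 random feature subsets
    pcas = [PCA().fit(Xtr[:, idx]) for idx in halves]
    proj = np.hstack([p.transform(Xtr[:, idx]) for p, idx in zip(pcas, halves)])
    clf = AdaBoostClassifier(GaussianNB(), n_estimators=10,
                             random_state=0).fit(proj, ytr)
    groups.append((halves, pcas, clf))

votes = np.vstack([clf.predict(np.hstack([p.transform(Xte[:, idx])
                                          for p, idx in zip(pcas, halves)]))
                   for halves, pcas, clf in groups])
pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("accuracy:", (pred == yte).mean())
```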
Performance Evaluation with Optimization Strategy for Support Vector Machine Based on ROC Curve
WANG Xu-hui,SHU Ping,CAO Li
Computer Science. 2010, 37 (8): 240-242. 
Abstract PDF(255KB) ( 1072 )   
RelatedCitation | Metrics
The support vector machine (SVM) has become a popular tool in the area of pattern recognition, and parameter selection for SVM is an important issue in making it practically useful. In this paper, we introduced the Receiver Operating Characteristic (ROC) curve into the performance evaluation and model optimization of SVM with respect to the kernel parameter s and penalty factor c. The area under the ROC curve was applied to model evaluation, and model optimization was performed by seeking the optimal operating point of the ROC. A pattern recognition experiment with UCI datasets shows that the ROC curve is an effective approach for the performance evaluation and optimization of SVM.
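AUC-driven parameter selection can be sketched with scikit-learn (synthetic data and a small hypothetical grid over the RBF kernel parameter and penalty factor; not the paper's experimental setup):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# score every (gamma, C) pair by area under the ROC curve on held-out data
best = max(
    ((g, c, roc_auc_score(yte, SVC(gamma=g, C=c).fit(Xtr, ytr)
                                   .decision_function(Xte)))
     for g in (0.01, 0.1, 1.0) for c in (0.1, 1.0, 10.0)),
    key=lambda t: t[2])
print("gamma=%g  C=%g  AUC=%.3f" % best)
```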
Improvement of Entity Resolution Based on Markov Logic Networks
LOU Jun-jie,XU Cong-fu,HAO Chun-liang
Computer Science. 2010, 37 (8): 243-247. 
Abstract PDF(423KB) ( 400 )   
RelatedCitation | Metrics
Entity resolution is a crucial and expensive step in the data mining process. Domingos and Singla of the University of Washington proposed a well-founded, integrated solution to the entity resolution problem based on Markov logic. This paper tried to improve Domingos and Singla's solution by adding a formula with a changeable weight to it, to handle the ambiguity of entities that the original system cannot distinguish. The new algorithm can effectively handle ambiguity of entities and improves accuracy compared with the original algorithm, as proved by experiments.
Experiment of Double Inverted Pendulum Based on Chaos Optimal Control
YAO Rong-bin,LI Sheng-quan
Computer Science. 2010, 37 (8): 248-250. 
Abstract PDF(233KB) ( 297 )   
RelatedCitation | Metrics
A method of controlling a double inverted pendulum was presented in this paper. A chaos-optimized linear quadratic controller based on a dynamic performance index was proposed, in order to solve the problem of choosing the weighting matrices in multi-variable LQ control. The controller designed by this approach can satisfy the linear quadratic performance index. The real-time control curves of the double inverted pendulum show that this method not only responds quickly, but also handles the strong nonlinearity and inherent instability of the plant.
Classification Algorithm Based on a Chaotic Neural Network
ZHANG Jian-hong
Computer Science. 2010, 37 (8): 251-252261. 
Abstract PDF(271KB) ( 350 )   
RelatedCitation | Metrics
In this paper, a classification algorithm based on Chaotic Neural Networks (CNN) was presented, which builds classifiers from a group of three-layer feed-forward CNNs. The chaotic neural networks were trained with an improved algorithm. The class label of the data to be identified is first evaluated by each chaotic neural network, and the final classification result is then obtained. Experimental results show that the CNN algorithm is effective for classification, and has better performance in classification precision and stability compared with traditional neural network algorithms and decision tree algorithms.
Fuzzy Edge Detection Algorithm of Image Based on Cloud Space and Fuzzy Entropy
WANG Zuo-cheng,ZHANG Fei-zhou,XUE Li-xia
Computer Science. 2010, 37 (8): 253-256. 
Abstract PDF(344KB) ( 333 )   
RelatedCitation | Metrics
Based on fuzzy set theory and cloud theory, a fuzzy edge detection algorithm based on object cloud, OCFD, was proposed. Considering the fuzzy and random characteristics of images, OCFD constructs a mapping model between image space and cloud space using the representation of uncertain object clouds in an image. According to the mapping model, the object cloud and edge cloud can be generated, and the mapping from image space to cloud space can be accomplished based on them. By logical cloud calculation in cloud space, an algorithm for transition region detection was proposed. Based on the maximum fuzzy entropy principle, edge detection in the transition region can then be accomplished. Experiments demonstrate that OCFD exhibits a considerable improvement in performance compared with both the fuzzy C-means and Pal-King methods. The algorithm offers a new idea for image comprehension and analysis, and enriches and extends cloud theory.
Analysis of Sensitivity Field Characteristics and Image Reconstruction for Electrical Resistance Tomography System
ZHANG Yan-jun,CHEN Yu,CHEN De-yun,YU Xiao-yang
Computer Science. 2010, 37 (8): 257-261. 
Abstract PDF(427KB) ( 1008 )   
RelatedCitation | Metrics
In an electrical resistance tomography (ERT) system, the sensitivity field is influenced by the distribution of multiphase flows. As prior knowledge for image reconstruction, the sensitivity field distribution data must be computed theoretically, and to improve the quality of reconstructed images it is necessary to analyze the sensitivity field distribution. In this paper, based on the basic principles of electrical resistance tomography, the finite element method was adopted to build a mathematical model of the sensitivity field. Based on this analysis, the factors and rules that affect the sensitivity field distribution were studied, and the computation of the distribution and a visual simulation were completed. Building on this computation, an image reconstruction algorithm for electrical resistance tomography accelerated by a polynomial, together with the corresponding mathematical model, was put forward, and the convergence of the algorithm was proved using spectral analysis. The simulation and experimental results show that the finite element model of the sensitivity field is effective. The image reconstruction algorithm has advantages such as high quality and fast convergence, which provides a new method for research on ERT image reconstruction.
Adaptive Image Fusion Algorithm Based on Nonsubsampled Contourlet Transform
ZHANG Hai-chao,ZHANG Fang-fang,SUN Shi-bao,WANG Ya-tao
Computer Science. 2010, 37 (8): 262-265. 
Abstract PDF(347KB) ( 448 )   
RelatedCitation | Metrics
To overcome the blurring of fused images, an adaptive image fusion algorithm based on the nonsubsampled contourlet transform (NSCT) was proposed. The good properties of the contourlet transform and the nonsubsampled contourlet transform were discussed. After the original images are decomposed by NSCT, the low-frequency images are fused using the low-frequency changing rate and uniformity: when the changing-rate difference between the low-frequency images is lower than a threshold, the changing rate makes the decision; otherwise the uniformity is used. The high-frequency images are fused using the high-frequency contrast. After the image is fused, the entropy, relative error and average gradient are used to evaluate fusion performance. The results show that the adaptive fusion algorithm based on NSCT obtains better fusion performance.
Object Recognition Based on a New Method of Edge Crawling
WANG Yan-qing,CHEN De-yun,SHI Chao-xia,LIU Bo,FANG Guo-zhi
Computer Science. 2010, 37 (8): 266-269272. 
Abstract PDF(421KB) ( 356 )   
RelatedCitation | Metrics
Motivated by the requirements of real-time performance, rapidness and robustness in object recognition, and by the fact that most edge directions change slowly, a new memory-based edge crawling method was proposed in this paper. An obvious flaw of edge crawling is that the "bug" is easily trapped, crawling around a certain local region; a memory-based searching algorithm can remedy this shortcoming effectively. The method segments different objects of similar colours by marking the extracted edge during edge crawling. The experimental results demonstrate that, compared with traditional approaches, the adopted approach obtains more complete and clear contours, shortens image processing time, and therefore improves the accuracy and robustness of object recognition.
Strategy of Feature Interaction Based on the Sufficiency Principle
SUN Li-juan,JIN Ying-hao
Computer Science. 2010, 37 (8): 270-272. 
Abstract PDF(247KB) ( 357 )   
RelatedCitation | Metrics
To maintain the validity of models, every modeling operation in a feature modeling system must be tracked and checked. The commonly used approaches to feature interaction checking cannot handle product models with complicated topology and many features, so improving the efficiency of feature interaction checking is one of the most challenging problems in research on feature modeling technology. Through deep research on feature interaction, cellular model theory and the feature interaction principle were introduced into feature interaction checking, a new feature interaction strategy was put forward, and this method effectively solves the capability limitation problem.
MeanShift Tracking Algorithm with the Adaptive Bandwidth of the Target in Occlusions
LIN Qing,CHEN Yuan-xiang,WANG Shi-tong,ZHAN Yong-zhao
Computer Science. 2010, 37 (8): 273-275289. 
Abstract PDF(299KB) ( 424 )   
RelatedCitation | Metrics
An improved Mean-Shift based tracking algorithm was proposed to solve the poor tracking ability in occlusions. Combining multi-scale space theory, Kalman filtering and occlusion handling, it can track the target scale well using the Kalman filter when the target is occluded. The experimental results show that the improved algorithm can select the proper size of the tracking window in scenarios where the target is occluded.
Minorities Features Extraction and Recognition of Human Faces
DUAN Xiao-dong,WANG Cun-rui,LIU Xiang-dong,LIU Hui
Computer Science. 2010, 37 (8): 276-279301. 
Abstract PDF(461KB) ( 544 )   
RelatedCitation | Metrics
The ethnic feature of a face is one of the most important facial features. We created a face database of ethnic minorities and extracted facial features using face recognition technology. For feature extraction, we adopted both algebraic and geometric features from the face database: the LDA algorithm was used to extract the algebraic features of face images, a new face template was constructed to extract the geometric features, and Gabor wavelets were used to locate the template points. KNN and C5.0 classifiers were used to learn from the training dataset. The results indicate that the average recognition accuracy rates for Tibetan, Uygur and Zhuang reach 79% with algebraic features and 90.95% with geometric features.
Face Detection in Coal Mine Surveillance Image Based on Skin Color and Face Template
CHEN Wei,DING Shi-fei,XIA Shi-xiong
Computer Science. 2010, 37 (8): 280-282. 
Abstract PDF(245KB) ( 285 )   
RelatedCitation | Metrics
Miner surveillance images are of low contrast and uneven gray level, which makes processing and recognition very hard. Miner surveillance images from many angles were sampled under laneway conditions. The miners' face areas at all angles can be segmented in HSV color space with upper and lower threshold values according to the statistical characteristics of face color. Mis-segmented pixels can be wiped off with morphological opening and closing using a circular structure element of 2-pixel radius. An average face template based on the distribution of face gray levels was constructed, and a similarity function was used as the discriminant function to determine the locations of faces. The results show that the locations of the miners' faces can be pinpointed quickly with this method of face color segmentation and template matching.
Design and Optimization of Synonymous Image Cloning System
WANG Yu-xin,LU Guo-ji,GUO He,HE Chang-qin,YANG Yuan-sheng
Computer Science. 2010, 37 (8): 283-286. 
Abstract PDF(400KB) ( 356 )   
RelatedCitation | Metrics
Seamless image cloning is an important field of image editing. In previous work, researchers separated the target object from the source image and then embedded it into the target image without considering whether the source and target images are semantically matched. A novel and complete image cloning system was presented, in which Gist descriptors are used to describe the scene semantics of images and a semantic matching technique is applied to an arbitrary cloning method to give it realistic significance; we call this a synonymous image cloning system. In view of the large time complexity of the semantic matching process, the CUDA parallel programming model and OpenMP multithreaded programming in a heterogeneous multicore environment were used to accelerate the process and effectively improve overall system performance. Experiments show that this system guarantees high user involvement and strong practicability, and ensures user satisfaction with cloning speed and effect.
Fire Recognition Algorithm Based on L-M in YCbCr Color Space
HAN Dian-yuan
Computer Science. 2010, 37 (8): 287-289. 
Abstract PDF(222KB) ( 374 )   
RelatedCitation | Metrics
As conventional fire detection technology has many limitations, this paper proposed an image-based fire recognition method. Firstly, the image was converted from the RGB color space to YCbCr, a coordinate system was set up, and the Cb and Cr values of the fire sample pixels were plotted. Secondly, an ellipse was fitted to enclose most of the points, yielding an elliptic equation and a two-dimensional normal distribution function whose value outside the ellipse is zero. Lastly, the L-M (Levenberg-Marquardt) algorithm was used to optimize the parameters of the normal distribution function. Fire recognition is thus reduced to judging whether the normal distribution function value for a given pair of Cb and Cr values is greater than zero. The method has good real-time performance and good recognition accuracy.
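The pixel classification described above can be sketched as follows. The BT.601 full-range RGB-to-YCbCr conversion is standard; the ellipse center and semi-axes below are hypothetical placeholders for the values the paper fits to fire samples with the Levenberg-Marquardt algorithm, which is not reproduced here:

```python
import math

def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion (JPEG convention)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# Hypothetical ellipse in the (Cb, Cr) plane enclosing fire-sample pixels.
CB0, CR0 = 60.0, 190.0   # ellipse centre (placeholder, not from the paper)
A, B = 30.0, 30.0        # semi-axes along Cb and Cr (placeholders)

def fire_score(cb, cr):
    """2-D normal-style score that is zero outside the ellipse, as in the
    abstract: recognition reduces to testing whether this value exceeds 0."""
    d = ((cb - CB0) / A) ** 2 + ((cr - CR0) / B) ** 2
    if d > 1.0:
        return 0.0
    return math.exp(-0.5 * d)

def is_fire_pixel(r, g, b):
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return fire_score(cb, cr) > 0.0
```

With fitted parameters in place of the placeholders, a bright orange pixel falls inside the ellipse and a neutral gray pixel falls outside it.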
Research on Face Recognition with Robustness to Illumination Change Based on Phase-frequency Characteristic
TENG Yun,HE Chun-lin,TANG Yong-bin
Computer Science. 2010, 37 (8): 290-293. 
Abstract PDF(328KB) ( 449 )   
RelatedCitation | Metrics
Traditional face recognition methods place high requirements on the image to be recognized, demanding that there be little illumination difference between the acquired face image and the images in the training database. This restricts the environmental conditions under which a face recognition system can operate, and thus limits the application of face recognition. To lessen these environmental requirements and overcome the effect of illumination on recognition, this paper analyzed the amplitude-frequency and phase-frequency characteristics of face images and put forward an illumination-normalized face recognition method in the frequency domain. After normalization, the illumination of the acquired image and of the images in the training database is identical, while the distinguishing properties of the face image are preserved. Since the information distinguishing face images is generally small, this paper took the minimum non-zero eigenvector as the face feature. Simulation experiments show that, compared with traditional face recognition methods, the proposed method is robust to illumination change.
Research on Information System Architecture Description Method Based on Multi-views
LUO Ai-min
Computer Science. 2010, 37 (8): 294-297. 
Abstract PDF(326KB) ( 579 )   
RelatedCitation | Metrics
Architecture design is a key process in information system development, and architecture assessment can improve the efficiency of that development. Assessment based on an executable model is an effective way to verify and evaluate an architecture. The paper analyzed the traits of executable-model-based architecture assessment and provided the content and process of such assessment. The executable model is based on the Object Petri Net model, and the feasibility of the method was shown by an example.
New Cache Replacement Algorithm for Solid-state Drive
LI Bo,XIE Chang-sheng,WANG Fen,ZHAO Xiao-gang
Computer Science. 2010, 37 (8): 298-301. 
Abstract PDF(390KB) ( 731 )   
RelatedCitation | Metrics
The appearance of the solid-state drive (SSD) has brought exciting changes to the architecture of computer storage subsystems, and the SSD is gradually becoming the main storage device for embedded applications. However, the "erase-before-write" bottleneck of NAND-based SSDs and their limited lifetime are the most important issues in today's SSD design. This paper presented LRU-AB (access-based), a cache replacement algorithm that takes access frequency into account to improve the performance of write operations. We also discussed some existing algorithms in this area.
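The abstract does not specify LRU-AB's internals, so the following is only a hedged sketch of one plausible access-frequency-aware LRU policy: among the pages with the minimum access count, the least recently used one is evicted, so frequently accessed pages stay cached and fewer NAND writes (and hence erases) reach the SSD.

```python
from collections import OrderedDict

class LRUAB:
    """Sketch of a frequency-aware LRU cache (hypothetical LRU-AB variant):
    eviction picks the least recently used page among those whose access
    count is at the minimum, protecting hot pages from eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page -> access count, kept in LRU order

    def access(self, page):
        """Touch a page; return the evicted page, or None if nothing evicted."""
        evicted = None
        if page in self.pages:
            self.pages[page] += 1
            self.pages.move_to_end(page)  # mark as most recently used
        else:
            if len(self.pages) >= self.capacity:
                min_count = min(self.pages.values())
                # oldest (least recently used) page holding the minimum count
                evicted = next(p for p, c in self.pages.items()
                               if c == min_count)
                del self.pages[evicted]
            self.pages[page] = 1
        return evicted
```

For example, with capacity 2, accessing A, B, A and then C evicts B rather than A, because A's higher access count keeps it resident.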
Testing Jitter on Clock Signals Based on Analysis of Instantaneous Phase
ZHU Yan-qing,HE Yi-gang
Computer Science. 2010, 37 (8): 302-304. 
Abstract PDF(242KB) ( 330 )   
RelatedCitation | Metrics
A novel method based on the analysis of instantaneous phase was proposed to extract the jitter on clock signals. The method uses the Hilbert transform to extend the real clock signal into an analytic signal, with the Hilbert transform implemented via the Fourier transform windowed with two window functions. The jitter of the clock is then extracted from the instantaneous phase of the analytic signal. Experimental results on a jittered signal show that the proposed method can effectively extract the jitter on clock signals and achieves better precision than other methods.
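The analytic-signal/instantaneous-phase approach can be sketched in pure Python. This sketch uses a naive DFT instead of a windowed FFT and omits the two window functions the paper applies; jitter is the deviation of the unwrapped instantaneous phase from the ideal linear phase of the clock:

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform (for small demo signals)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def analytic_signal(x):
    """Hilbert-transform based analytic signal: zero the negative-frequency
    half of the spectrum and double the positive-frequency half."""
    N = len(x)
    X = dft(x)
    H = [0.0] * N
    H[0] = 1.0
    if N % 2 == 0:
        H[N // 2] = 1.0
        for k in range(1, N // 2):
            H[k] = 2.0
    else:
        for k in range(1, (N + 1) // 2):
            H[k] = 2.0
    return idft([X[k] * H[k] for k in range(N)])

def phase_jitter(x, f0, fs):
    """Deviation of the instantaneous phase from the ideal linear phase
    2*pi*f0*n/fs of a clock at frequency f0 sampled at rate fs."""
    z = analytic_signal(x)
    phase = [cmath.phase(v) for v in z]
    unwrapped = [phase[0]]                      # unwrap 2*pi phase steps
    for p in phase[1:]:
        d = p - unwrapped[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))
        unwrapped.append(unwrapped[-1] + d)
    ideal = [2 * math.pi * f0 * n / fs + unwrapped[0] for n in range(len(x))]
    return [u - i for u, i in zip(unwrapped, ideal)]
```

For a jitter-free cosine clock whose frequency falls exactly on a DFT bin, the extracted jitter is zero to numerical precision, which is the sanity check one would run before feeding in a measured clock.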