Started in January, 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Editors
Current Issue
Volume 37 Issue 9, 01 December 2018
  
Survey on Power-saving Technologies for Disk-based Storage Systems
TIAN Lei,FENG Dan,YUE Yin-liang,WU Su-zhen,MAO Bo
Computer Science. 2010, 37 (9): 1-5. 
Abstract PDF(568KB) ( 572 )   
RelatedCitation | Metrics
Hard disk drives are among the most important components of modern storage systems, and their power consumption accounts for the major part of the total power consumption of such systems. Therefore, the high power dissipation of disk-based storage systems has attracted increasing attention from researchers. The research progress and status of power-consumption issues, from individual disks up to whole storage systems, were extensively surveyed in this paper. Representative power-saving technologies, their implementation mechanisms and the corresponding evaluation methodologies were then presented and discussed in detail, and their respective characteristics and applicability were analyzed and summarized. Finally, taking the workload characteristics of mass storage systems and the complexity of application scenarios into consideration, directions for future work on power-saving technologies for disk-based storage systems were pointed out.
Survey of Trust Management on Peer-to-Peer Network
ZHANG Guang-hua,ZHANG Yu-qing
Computer Science. 2010, 37 (9): 6-12. 
Abstract PDF(665KB) ( 409 )   
RelatedCitation | Metrics
Constructing a trust mechanism is very significant for benefiting users in P2P networks and for verifying the validity of resources and services. Basic notions of P2P trust management were clarified in this paper, and the system constitution together with its key issues was analyzed in detail, followed by an introduction to typical P2P trust management systems and a discussion of the combination of P2P trust management with other disciplines. In the end, future research directions such as a uniform framework and evaluation standards were outlined.
Software Reliability Models:A Survey
LOUJun-gang,JIANG Jian-hui,SHUAI Chun-yan,JIN Ang
Computer Science. 2010, 37 (9): 13-19. 
Abstract PDF(728KB) ( 689 )   
RelatedCitation | Metrics
Software reliability models are among the best approaches for predicting, analyzing and evaluating software reliability quantitatively. It is very important to infer software reliability by reasonably modeling and incorporating software failure data as well as other prior information. This paper presented the basic concepts of software reliability models and analyzed more than thirty different models proposed in recent years. Their predictive validity and applicability were discussed in detail. Finally, future research directions and potential applications of software reliability models were pointed out.
Advances in Non-photorealistic Rendering
WANG Xiang-hai, QIN Xiao-bin, XIN Ling
Computer Science. 2010, 37 (9): 20-27. 
Abstract PDF(856KB) ( 1345 )   
RelatedCitation | Metrics
As a branch of computer graphics complementary to photorealistic rendering, non-photorealistic rendering (NPR) has become a research hotspot in computer graphics in recent years and has received more and more attention. Its results have been applied to computer animation, computer art, scientific information visualization and other fields. This paper surveyed non-photorealistic rendering technology. It first introduced the development of non-photorealistic rendering; then, taking the simulation of different artistic effects as a clue, it classified the non-photorealistic rendering techniques and analyzed and discussed the characteristics, development and leading algorithms of each category. Finally, the future development of NPR was discussed.
Research and Development of Trust for the Security in Internet Applications
WANG Yong-hao,ZENG Guang-ping,XIAO Chao-en,ZHANG Qing-chuan
Computer Science. 2010, 37 (9): 28-31. 
Abstract PDF(399KB) ( 344 )   
RelatedCitation | Metrics
The Internet will be the main platform on which computer applications run. Because of the openness and complexity of the Internet, traditional technologies such as ALL and PKI cannot completely solve the security problems of computer applications. Trust management technology based on security credentials, and trust evaluation technology based on direct historical experience and recommendation experience, which solve the trust problems of open and collaborative environments by using the trust relations among entities, are currently hot spots in the security research of Internet applications. Against this background, this paper gave an overview in which the current research on trust management and trust evaluation technology at home and abroad was analyzed and summarized, and their advantages and weaknesses were pointed out. Finally, promising directions for further research were proposed.
New Security Protocol Verification Approach Based on Attack Sequence Solving
HAN Jin,XIE Jun-yuan
Computer Science. 2010, 37 (9): 32-35. 
Abstract PDF(442KB) ( 465 )   
RelatedCitation | Metrics
Under the premises of a perfect encryption mechanism and the DY attacker model, it can be concluded that injection attacks are the necessary means by which attackers achieve their aims. In this paper, the attributes of injection attacks and the attack sequences derived from them were analyzed. Based on these conclusions, an algorithm was presented to determine whether an attack sequence exists in a security protocol, and a new automatic security-protocol verification approach was built on this algorithm. It was also proved that the algorithm terminates during the verification of a regular security protocol. The NSPK protocol was verified with the algorithm. The experimental results show that, compared with other security protocol verification tools such as OFMC, the algorithm not only realizes automatic security-protocol verification but is also more practical, because it is guaranteed to terminate when verifying regular protocols.
Two-stage IPv6 Address Lookup Scheme Based on Hash Tables and Tree Bitmaps
WANG Ya-gang,DU Hui-min,YANG Kang-ping
Computer Science. 2010, 37 (9): 36-39. 
Abstract PDF(455KB) ( 1959 )   
RelatedCitation | Metrics
IP address lookup is a key issue in modern high-performance router design, especially with the evolution to IPv6. In order to improve the efficiency of IP address lookup, a novel IPv6 address lookup scheme based on hash tables and tree bitmaps was proposed, together with an analysis of the prefix-length distribution of state-of-the-art routing tables. In this scheme, four hash tables store the prefixes with lengths of 16, 32, 48 and 64 bits respectively; the 16-, 32- and 48-bit sub-prefixes of the remaining prefixes are also stored in these hash tables, and their remaining parts, shorter than 16 bits, are encoded into tree bitmaps indexed by the corresponding hash-table entries, thereby forming a two-stage address-lookup architecture. The results show that the scheme achieves an average of 1-2 memory accesses per IPv6 address lookup and 7 in the worst case, and can be applied to high-performance IPv6 address lookup implementations.
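To make the two-stage idea concrete, here is a minimal Python sketch, assuming prefixes of length 16-64 given as bit strings; the tree-bitmap stage is stood in for by a plain dictionary of short suffixes, and all names (`TwoStageTable`, `ANCHORS`) are hypothetical rather than the paper's implementation:

```python
# Sketch of the two-stage lookup idea: exact-match hash tables for prefix
# lengths 16/32/48/64, plus a per-entry table of sub-16-bit suffixes that
# stands in for the paper's tree bitmaps. Assumes 16 <= prefix length <= 64.
ANCHORS = (64, 48, 32, 16)

class TwoStageTable:
    def __init__(self):
        self.exact = {n: {} for n in ANCHORS}    # anchor length -> {prefix bits -> next hop}
        self.suffix = {n: {} for n in ANCHORS}   # anchor length -> {anchor bits -> {suffix bits -> next hop}}

    def add(self, prefix_bits, next_hop):
        n = len(prefix_bits)
        if n in self.exact:                      # prefix length is exactly an anchor length
            self.exact[n][prefix_bits] = next_hop
        else:                                    # split into anchor part + short suffix
            anchor = max(a for a in ANCHORS if a < n)
            head, tail = prefix_bits[:anchor], prefix_bits[anchor:]
            self.suffix[anchor].setdefault(head, {})[tail] = next_hop

    def lookup(self, addr_bits):
        for anchor in ANCHORS:                   # longest anchor first, for longest-prefix match
            head = addr_bits[:anchor]
            tails = self.suffix[anchor].get(head)
            if tails:                            # stage 2: tree-bitmap stand-in, longest suffix first
                for k in range(15, 0, -1):
                    hop = tails.get(addr_bits[anchor:anchor + k])
                    if hop is not None:
                        return hop
            hop = self.exact[anchor].get(head)   # stage 1: exact match at the anchor length
            if hop is not None:
                return hop
        return None

t = TwoStageTable()
t.add("0" * 20, "hop-A")      # /20 prefix: anchor 16 + 4-bit suffix
t.add("0" * 16, "hop-B")      # /16 prefix: exact table
print(t.lookup("0" * 64))     # longest match wins: hop-A
```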
Dynamic Multi-secret Sharing Scheme for P2P Environment
BAO Yang,LU Zheng-ding,HUANG Bao-hua,LI Rui-xuan,HU He-ping,LU Song-feng
Computer Science. 2010, 37 (9): 40-43. 
Abstract PDF(460KB) ( 345 )   
RelatedCitation | Metrics
The lack of a trusted third party, relatively low node availability, and constantly changing membership and network size make existing multi-secret sharing schemes unsuitable for P2P environments. A dynamic multi-secret sharing scheme was proposed for P2P networks. Neither trusted dealers nor secure communication channels are necessary for the proposed scheme, which allows dynamic changes of participants and of the system threshold while keeping the ciphertext untouched. In the meantime, the participants' identities and public commitments, together with the system parameters, are managed by Byzantine quorums, which makes it possible to reconstruct a shared secret, add a participant or change the system threshold with only a threshold number of participants online. Additionally, an ID-based public key cryptosystem and bivariate polynomials are used to reduce message traffic and deal with participant cheating. Altogether, the proposed scheme overcomes the drawbacks of previous schemes in P2P environments.
Simulation Analysis of Message Transmit Delay and Influencing Factors in Datalink Systems
ZHOU Zhong-bao,REN Pei, MA Chao-qun, ZHOU Jing-lun
Computer Science. 2010, 37 (9): 44-47. 
Abstract PDF(596KB) ( 463 )   
RelatedCitation | Metrics
The transmit delay of a message has great influence on its tactical value in a datalink system. This paper analyzed the message transmission process in datalink systems. A datalink system simulation platform was established, in which the multi-net datalink topology, communication protocol, communication routes and message generation were modeled. The transformation of message formats and the different transmission modes were analyzed in detail. The message delay, response time and delay variation under different transmission modes were compared using the above models. Simulation results show that these indexes are significantly influenced by factors such as the number of nodes, the message transmission mode and the service strategy of the nodes.
Research on C2 Capability Package Service Dynamic Composition Method Based on OPN
HUANGFU Xian-peng,CHEN Hong-hui,LUO Xue-shan
Computer Science. 2010, 37 (9): 48-53. 
Abstract PDF(500KB) ( 406 )   
RelatedCitation | Metrics
Service composition methods are of great importance for SOA-based military information systems in distributed environments. This paper proposed a service dynamic composition method. Firstly, the paper defined a command and control capability package service description and composition model based on object Petri nets, and analyzed and proved the composition operators of services. It then presented the service dynamic composition flow. Finally, the proposed service composition model was simulated using the object Petri net modeling and simulation environment of the National University of Defense Technology, and the experimental data were analyzed.
Optimal Cooperative Spectrum Sensing Algorithm of the Resource Constrained Cognitive Radio Networks
XUE Feng,QU Dai-ming,ZHU Guang-xi,LIU Li
Computer Science. 2010, 37 (9): 54-56. 
Abstract PDF(321KB) ( 373 )   
RelatedCitation | Metrics
The primary task of cognitive radio is to sense the spectrum dynamically, and cooperative spectrum sensing increases the reliability of spectrum sensing. Most existing cooperative sensing algorithms let all cognitive radios participate in sensing and assume that the measured SNR of each cognitive radio is a constant. However, because the characteristics and environments of the wireless channels differ, the SNR of each cognitive radio is different, and system resources are also limited. Based on this, we proposed an optimized spectrum sensing algorithm built on cooperative spectrum sensing with hard decisions. It lets only a subset of the cognitive radios participate in spectrum sensing and minimizes the sensing overhead. The analytical results were verified by computer simulation.
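As a hedged illustration of the hard-decision cooperative sensing that such optimizations start from (not the authors' exact algorithm), the sketch below lets only the radios with the best measured SNR vote and fuses their one-bit decisions with a k-out-of-n rule; all names and threshold values are made up:

```python
# Illustrative hard-decision cooperative sensing: select a subset of radios
# by SNR and fuse their 1-bit local decisions with a k-out-of-n rule.
def local_decision(energy, threshold):
    """One-bit decision of a single cognitive radio (energy detector)."""
    return 1 if energy >= threshold else 0

def cooperative_decision(radios, n_selected, k, threshold):
    """radios: list of (snr, measured_energy) tuples. Only the n_selected
    radios with the best SNR participate; the primary user is declared
    present if at least k of them report 1."""
    chosen = sorted(radios, key=lambda r: r[0], reverse=True)[:n_selected]
    votes = sum(local_decision(energy, threshold) for _, energy in chosen)
    return votes >= k

radios = [(12.0, 1.8), (3.5, 0.9), (9.1, 1.4), (1.2, 0.7)]
print(cooperative_decision(radios, n_selected=2, k=1, threshold=1.0))  # OR rule over the 2 best radios
```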
Distance Estimating Algorithm Based on Fine-grain Gradient in Wireless Sensor Networks
ZHENG Ming-cai,ZHANG Da-fang,ZHAO Xiao-chao
Computer Science. 2010, 37 (9): 57-62. 
Abstract PDF(545KB) ( 311 )   
RelatedCitation | Metrics
Localization plays an important role in wireless sensor networks, and distance measurement is usually the precondition of tracking or positioning. Finding a distance-measuring algorithm with low cost, low overhead and high precision is the main aim of this paper. Derived from the distribution characteristics of a node's fine-grain gradient value in the minimum-hop gradient field of a wireless sensor network, a fine-grain-gradient-based way of estimating the distance between nodes, namely DV-FGI, was presented. Compared with the DV-Hop algorithm, the measuring precision of DV-FGI is improved greatly at nearly the same cost and overhead, and the resolution of the measurement is improved from the communication radio range to the distance interval between neighboring nodes. Theoretical analysis and simulation results validate that the method is quite effective in wireless sensor networks deployed with dense nodes.
New Fuzzy Role-based Access Control Model for Ubiquitous Computing
DOU Wen-yang,WANG Xiao-ming,ZHANG Li-chen
Computer Science. 2010, 37 (9): 63-67. 
Abstract PDF(476KB) ( 430 )   
RelatedCitation | Metrics
In ubiquitous computing environments, the degree to which a user satisfies contextual conditions, the user's level of trust and the security risk level of a permission are all fuzzy, and many existing access control models do not support reasoning over such fuzzy information. This paper presented a fuzzy role-based access control model (FRBAC). In FRBAC, the assignment of roles to users (UA) and of permissions to roles (PA) is divided into two parts; a user can activate a role according to the degree of satisfaction of the contextual conditions, the user's level of trust, and the possible security risk of activating the role, and this authorization process is completed through fuzzy reasoning. The FRBAC model achieves dynamic fuzzy authorization and automatic assignment of user roles, which simplifies the security management of the RBAC model. Finally, the paper gave an architecture for implementing the model and the related fuzzy authorization reasoning algorithm.
Cooperative Time Synchronization Based on Industrial Wireless Networks
XU Na,HU Guo-lin,ZHANG Xiao-tong,SONG Hong-ling
Computer Science. 2010, 37 (9): 68-71. 
Abstract PDF(465KB) ( 337 )   
RelatedCitation | Metrics
Based on spatial averaging rather than traditional time averaging, cooperative time synchronization provides a new approach to time synchronization in wireless sensor networks. In this work, we presented a cooperative time synchronization protocol that uses broadcasting and spatial averaging, tailored to the characteristics of industrial wireless networks. The protocol can trade the rate of time synchronization messages between the time and spatial dimensions in order to achieve high precision and strong robustness. Simulations show that the protocol has much better synchronization performance than traditional protocols that use only time averaging.
Provably Secure Subliminal-free Protocol in EDL Digital Signature
ZHANG Ying-hui,MA Hua,WANG Bao-cang
Computer Science. 2010, 37 (9): 72-74. 
Abstract PDF(319KB) ( 492 )   
RelatedCitation | Metrics
Subliminal channels in the EDL signature were first constructed, and then an interactive subliminal-free protocol was designed. It is shown that the proposed protocol can completely close the subliminal channels existing in the random parameters of the EDL signature. The proposed protocol is proved to be secure in the RO (random oracle) model, assuming the CDH (computational Diffie-Hellman) problem is hard. In the new protocol, the warden participates in the generation of the signature but cannot sign messages, so the signing authority of the signer is guaranteed. To generate a signature, the signer and the warden only need to perform two and three modular exponentiations, respectively.
Research on Automatic Classification of RFID Modulation Signal
ZHANG Song-hua,HE Yi-gang
Computer Science. 2010, 37 (9): 75-76. 
Abstract PDF(256KB) ( 371 )   
RelatedCitation | Metrics
This paper proposed a classification method for RFID (Radio Frequency Identification) modulation signals based on software radio. A global optimization of the topology of a BP (back propagation) network fused with a GA was proposed, and a GA-BP neural network recognition classifier was designed. Compared with traditional BP algorithms, the method improves the convergence rate and convergence precision, and the recognition rate is very high. Experimental results show that it achieves high classification efficiency under low SNR.
Security Analysis and Improvement of a Group Signature Scheme Based on the Braid Groups
WEI Yun,XIONG Guo-hua,ZHANG Xing-kai, BAO Wan-su
Computer Science. 2010, 37 (9): 77-80. 
Abstract PDF(307KB) ( 460 )   
RelatedCitation | Metrics
The rapid development of quantum computing has made public key cryptosystems based on non-commutative algebraic systems a hot topic. Because of their non-commutativity, braid groups with braid index greater than two have become a new candidate for constructing cryptographic protocols. The security vulnerabilities of a group signature scheme based on braid groups were pointed out: it does not satisfy unlinkability, which means that signatures generated by the same group member can be linked, and the publication of several signatures leaks information about the group's private key. An improved scheme using a random factor was proposed, which not only ensures the unlinkability of the scheme but also protects the group's private key. Security analysis shows that the improved scheme satisfies the security requirements of a group signature.
Approach on Promoting Survivability for Information System Based on Game-theory
WANG Zhi-wen,LU Ke,WANG Xiao-fei
Computer Science. 2010, 37 (9): 81-84. 
Abstract PDF(352KB) ( 352 )   
RelatedCitation | Metrics
Current studies on the survivability of information systems focus on the technical realization of quantitative analysis and guarantee of survivability in static environments. In dynamic working conditions, survivability differs with the techniques used and can be classified into multiple grades according to the corresponding capability. Customers need to pay different service fees for different survivability grades of an information system, while the provider must invest considerable money to maintain a particular survivability grade. An approach is therefore urgently needed with which the provider can decide, based on its income, whether to promote the survivability grade. A game-theory based model was constructed in this paper by analyzing the actions and incomes of the customer and the provider, who act as the two players. The mixed-strategy Nash equilibrium was derived from the model, and a strategy for promoting the survivability grade that maximizes the provider's income was devised. An experiment was carried out in a simulated information system with five survivability grades, and the results show that the game-theory based approach presented in the paper is correct and reasonable.
Identity-based Ring Signcryption Scheme in the Standard Model
SUN Hua,ZHENG Xue-feng,YAO Xuan-xia,LIU Xing-bing
Computer Science. 2010, 37 (9): 85-89. 
Abstract PDF(389KB) ( 313 )   
RelatedCitation | Metrics
Signcryption is a cryptographic primitive which provides authentication and confidentiality simultaneously with a computational cost lower than that of signing and encrypting separately. Ring signcryption additionally provides anonymity: any user can choose a set of users that includes himself and signcrypt messages without revealing who in the set actually produced the ciphertext. This paper presented an efficient identity-based ring signcryption scheme in the standard model, and proved its indistinguishability against adaptive chosen ciphertext attacks and its existential unforgeability against adaptive chosen message and identity attacks under the hardness of the CDH and DBDH problems.
Generalized Frequency Division Multiplex for Power Line Communication System
LI Qi-lin,HU Su,WU Gang,ZHOU Ming-tian
Computer Science. 2010, 37 (9): 90-93. 
Abstract PDF(336KB) ( 490 )   
RelatedCitation | Metrics
Orthogonal frequency division multiplexing (OFDM) is a good candidate for power line communication systems, because it is effective against impulse noise, multipath delay and group delay. However, its high peak-to-average power ratio and the cyclic prefix lower the efficiency of OFDM systems. Generalized frequency division multiplexing (GFDM), with good time-frequency localization, can mitigate inter-symbol interference (ISI) and inter-carrier interference (ICI). Meanwhile, GFDM has higher spectral efficiency because it requires no cyclic prefix.
Risk Assessment Method for Network Security Based on Intrusion Detection System
CHEN Tian-ping,XU Shi-jun,ZHANG Chuan-rong,ZHENG Lian-qing
Computer Science. 2010, 37 (9): 94-96. 
Abstract PDF(234KB) ( 473 )   
RelatedCitation | Metrics
To evaluate the real-time security risk of a network, a Hidden Markov Model (HMM) describing host security states was established, whose input is the alerts of an Intrusion Detection System. The probability of a host being attacked was calculated with this model. For the attack alerts, a new method of calculating the attack success probability was presented, and the attack threat level was used to compute the risk index of each host node. Finally, the importance weights and risk indexes of all the host nodes were used to calculate the risk of the network quantitatively. The case study demonstrates that this method can provide real-time risk curves of the host systems, which security managers can use to adjust security policies.
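A minimal sketch of this style of computation, with made-up transition and observation matrices (standard HMM forward filtering for each host, then an importance-weighted aggregation over hosts), is shown below; it is illustrative only and not the paper's exact model:

```python
import numpy as np

# Illustrative 2-state host model (0 = normal, 1 = compromised); observations
# are discretized IDS alert levels. All matrices below are made up.
A = np.array([[0.9, 0.1],                 # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05],          # P(alert level | state)
              [0.2, 0.3,  0.5]])
pi = np.array([0.95, 0.05])               # initial state distribution

def filtered_state(alerts):
    """Standard HMM forward filtering: P(state | alerts observed so far)."""
    belief = pi * B[:, alerts[0]]
    belief /= belief.sum()
    for o in alerts[1:]:
        belief = (belief @ A) * B[:, o]
        belief /= belief.sum()
    return belief

def network_risk(hosts):
    """hosts: list of (importance_weight, alert_sequence, threat_level).
    Host risk index = P(compromised) * threat level; the network risk is the
    importance-weighted average, mirroring the aggregation step above."""
    total_w = sum(w for w, _, _ in hosts)
    return sum(w * filtered_state(seq)[1] * threat
               for w, seq, threat in hosts) / total_w

print(network_risk([(0.6, [0, 2, 2, 1], 3.0), (0.4, [0, 0, 1, 0], 2.0)]))
```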
Research of Differential Attack Algorithms to Hash
ZHOU Lin,HAN Wen-bao,WANG Zheng
Computer Science. 2010, 37 (9): 97-100. 
Abstract PDF(256KB) ( 1014 )   
RelatedCitation | Metrics
Hash functions are widely used in business, the military and other fields, so attacks on hash functions have important meaning both in theory and in practical applications. Since Professor Wang proposed the differential attack algorithm and succeeded in breaking SHA-1, MD5, RIPEMD and MD4, this algorithm has received more and more attention. However, Professor Wang did not describe the method for obtaining the differences and the differential paths; experts at home and abroad have guessed that she found them by hand with her outstanding intuition. Therefore, finding a method to derive differences and differential paths has become a research hotspot. When constructing a differential path, one must handle the cyclic-shift differences and select sufficient conditions with high probability. This paper verified that there are four such conditions, gave their probabilities and compared them with each other.
Improved LSB Matching Steganographic Method Based on Complementary Embedding of Adjacent Intensity Pixels
XI Ling,PING Xi-jian,ZHANG Tao
Computer Science. 2010, 37 (9): 101-104. 
Abstract PDF(342KB) ( 579 )   
RelatedCitation | Metrics
The LSB matching steganographic method has the advantages of large capacity and high invisibility, but the histogram of the stego image produced by this method is smoothed, so attackers can detect whether the image has been changed. In this paper, an improved method was proposed. Exploiting the complementary property of the random ±1 modifications, the improved method embeds most of the secret bits in pixels with adjacent intensities. With the proposed method the cover image's histogram can be maintained to a considerable extent, making the stego image more difficult to detect than with LSB matching steganography.
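For readers unfamiliar with the baseline being improved, here is a minimal sketch of plain LSB matching (±1 embedding); it deliberately omits the paper's complementary pairing of adjacent-intensity pixels:

```python
import random

def lsb_match_embed(pixels, bits, seed=0):
    """Plain LSB matching: if a pixel's LSB already equals the secret bit,
    keep it; otherwise randomly add or subtract 1 (clamped to [0, 255])."""
    rng = random.Random(seed)
    stego = list(pixels)
    for i, bit in enumerate(bits):
        p = stego[i]
        if p & 1 != bit:
            step = rng.choice((-1, 1))
            if p == 0:
                step = 1
            elif p == 255:
                step = -1
            stego[i] = p + step
    return stego

cover = [100, 101, 254, 0, 57]
print(lsb_match_embed(cover, [1, 1, 0, 1, 0]))
```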
Model of Network Device Resource Management Based on ForCES
JIA Feng-gen,GUO Yun-fei,DING Xin-yi
Computer Science. 2010, 37 (9): 109-112. 
Abstract PDF(346KB) ( 356 )   
RelatedCitation | Metrics
In this paper, the concept of ForCES (Forwarding and Control Element Separation) resources was proposed based on the ForCES architecture and the FE model. The operations of the CE on the FE were abstracted as operations on ForCES resources, and a model of ForCES resource management was proposed. The key technologies of the model were realized, including the mechanisms of FE resource library management and CE resource storage. The implementation of the model was discussed and illustrated with an example of using it to build a Web-based management system for IPv6 routers. Finally, functional and performance tests were carried out and a comparative analysis was given.
Designing and Implementation of Reliable Multicast Based on FEC in Distributed Switch
LUO Ting,YU Shao-hua, WANG Xue-shun
Computer Science. 2010, 37 (9): 113-116. 
Abstract PDF(449KB) ( 434 )   
RelatedCitation | Metrics
We presented the general architecture of a distributed Ethernet switch, focusing on the analysis of the model of its internal communication subsystem. According to its characteristics, a novel reliable multicast communication mechanism based on an FEC recovery algorithm was applied and evaluated in our experiment.
Fast Re-registration Procedure for User in IMS
ZHANG Qi-zh,HUANG Xing-ping,FAN Bing-bing
Computer Science. 2010, 37 (9): 117-120. 
Abstract PDF(335KB) ( 725 )   
RelatedCitation | Metrics
Aimed at the re-registration procedure of a user who has already registered in IMS (IP Multimedia Subsystem), an improved fast re-registration method was proposed. Since the UE (User Equipment) of a registering user already knows the address of the S-CSCF (Serving-Call Session Control Function) in the home network, the delay of re-registration can be decreased effectively by carrying the routing information of the S-CSCF and transferring the REGISTER message from the P-CSCF (Proxy-Call Session Control Function) in the visited network directly to the S-CSCF in the home network. The analysis shows that the improved fast re-registration mechanism is better than the standard re-registration procedure and some improved procedures proposed by other researchers. Furthermore, the fast re-registration procedure requires only minor changes to the IMS network and can be realized easily.
Web Single Sign-on Scheme Based on Trusted Computing
QIU Gang, ZHANG Chong,ZHOU Li-hua
Computer Science. 2010, 37 (9): 121-123. 
Abstract PDF(336KB) ( 352 )   
RelatedCitation | Metrics
To enhance the security of the user domain in single sign-on systems, a Trusted Platform Module (TPM) was introduced to ensure the trustworthiness of the terminal. Meanwhile, a user authentication scheme combining password, fingerprint and smartcard was adopted, which achieves mutual identification among the user, the user terminal and the smartcard, and ensures the secure usage of the information provided by the application service. The security and performance analysis shows that the user authentication can distinguish the owner of the user terminal from other genuine operators without any pre-negotiation, and that the hash-based computation in user authentication and the push-style attestation of user platform integrity are highly efficient.
NACK-oriented Reliable Multicast Transport Protocol for Live Streaming over Wireless Channel
HAN Li, QIAN Huan-yan
Computer Science. 2010, 37 (9): 124-126. 
Abstract PDF(365KB) ( 530 )   
RelatedCitation | Metrics
Because wireless channels are inherently lossy, it is challenging to maintain the Quality of Service (QoS) of live streaming. In this paper, we proposed a well-designed, lightweight Negative-ACKnowledgment (NACK) oriented reliable multicast protocol that incorporates FEC-based repair in its design. The protocol can provide end-to-end reliable transport of streams over generic IP multicast routing and forwarding services. For a less complicated and more efficient implementation, a feedback-round mechanism is applied for NACK suppression. Experimental results show that our protocol achieves much higher performance than NORM.
Description of Software Architecture Evolution Based on Delta-Grammar
CHENG Xiao-yu,ZENG Guo-sun,XU Hong-zhen
Computer Science. 2010, 37 (9): 127-130. 
Abstract PDF(409KB) ( 318 )   
RelatedCitation | Metrics
Software requires continuing evolution to adapt to complex environments and meet variable requirements. In order to analyze the process and rules of software evolution, we proposed a special graph grammar, the delta-grammar, to describe the evolution of software architecture (SA). In particular, we provided production rules for insertion, removal, replacement, recombination, split and concurrency, which depict the evolution process more conveniently, intuitively and graphically. Finally, we showed the process and effect of applying the delta-grammar to describe software architecture evolution, taking an e-commerce information system as an example.
Software Fault Diagnosis Framework Combining Bayesian Networks with SFMEA
WANG Xue-cheng,LI Hai-feng,LU Min-yan,YANG Shun-kun
Computer Science. 2010, 37 (9): 131-134. 
Abstract PDF(390KB) ( 346 )   
RelatedCitation | Metrics
Software faults are the underlying and important roots of mistakes, failures and even breakdowns of a system, so fault diagnosis technology is very significant to software quality assurance. Recently, fault diagnosis based on artificial intelligence theory has attracted more and more attention. Because Bayesian networks have significant advantages such as easy expression and precise reasoning, a four-layer software fault diagnosis model (reason, mode, fault and watch layers, WCMF for short) was first proposed by combining Bayesian networks with SFMEA (Software Failure Modes and Effects Analysis). Secondly, a software fault diagnosis framework based on a fault information database and a fault diagnosis database was presented. Finally, a case study on navigation software was given. The results show that the fault diagnosis model and the presented framework are feasible and effective: they make fault diagnosis timely and convenient, and they improve diagnosis efficiency by utilizing the historical information of faults and their diagnoses.
U2TP Test Model Profiling for Web Services
HUANG Long,YANG Yu-hang
Computer Science. 2010, 37 (9): 135-136. 
Abstract PDF(207KB) ( 319 )   
RelatedCitation | Metrics
This paper extended the meta-model of UML by the profile method and constructed a UML model description group for Web services. Based on this, it extended the U2TP test model by the stereotype array technique and generated the U2WSTP test model for Web services, thereby implementing a U2TP-based test model for Web services. This establishes a basis for model-driven Web service testing.
Research on Bigraph-based Aspect-oriented Dynamic Software Architecture Evolution
WANG Ling,RONG Mei,ZHANG Guang-quan,WANG Sheng
Computer Science. 2010, 37 (9): 137-140. 
Abstract PDF(345KB) ( 354 )   
RelatedCitation | Metrics
With the development of network technology, the runtime environment of software is becoming more and more complicated, and the requirements of software users are growing and diversifying. These variations place higher demands on the ability of software to evolve dynamically. The concept of separation of concerns in aspect-oriented software development can support the dynamic evolution of software well. Existing formal methods cannot represent the dynamicity of software architecture intuitively and, more importantly, cannot efficiently verify the validity of the system before and after evolution. Bigraphs not only provide an intuitive graphical representation but also possess a solid mathematical foundation. Therefore, we proposed a new model, Aspect-Oriented Dynamic Software Architecture (AODSA), to solve these problems. First, Bigraphs were extended in order to describe the structure of AODSA. Then, the Bigraph reactive system (BRS) was used to represent the dynamic evolution of AODSA. At last, a simple ATM deposit system model was used as an example to illustrate the usage of AODSA.
Crosscutting Feature Analysis-based Automatic Software Architecture Refactoring Method
LI Bing-xiang,SHEN Li-wei,PENG Xin,ZHAO Wen-yun
Computer Science. 2010, 37 (9): 141-146. 
Abstract PDF(544KB) ( 347 )   
RelatedCitation | Metrics
Crosscutting concerns in a software architecture increase its complexity and the difficulty of evolution and maintenance. This design problem can be mitigated by refactoring at the architectural level. This paper offered an automatic software architecture refactoring method based on the analysis of crosscutting features. First, the method analyzes crosscutting features based on the traceability between features and components. Then the components that have direct trace relations to these crosscutting features are extracted from the initial architecture as aspectual components, completing the architecture refactoring. We developed an architecture refactoring tool based on the aspect-oriented architecture description language AC}ADI, and carried out architecture refactoring experiments on a business system. The results show that this method can refactor crosscutting concerns in software architecture effectively and automatically.
Security Policy of Attribute-based Access Control in SOA
WEN Jun-hao,ZENG Jun,ZHANG Zhi-hong
Computer Science. 2010, 37 (9): 147-150. 
Abstract PDF(311KB) ( 579 )   
RelatedCitation | Metrics
In order to improve the security of SOA-based systems, it is essential to use access control in SOA. However, traditional access control models cannot be used in heterogeneous SOA environments. To adapt access control to such heterogeneous environments, an attribute-based access control (ABAC) model was proposed, which takes entity attributes as the basic units of evaluation. According to pre-defined policies, the model provides dynamic access control by evaluating the attributes of the subject, the resource and the environment. The model was implemented with XACML and SAML. Analysis shows that the access control model based on the XACML and SAML standards provides more flexibility and portability, and therefore suits distributed environments that use SOA.
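As a hedged, minimal illustration of attribute-based evaluation (the paper expresses policies in XACML and conveys assertions with SAML; the dictionary-based rule format and the names below are invented for this sketch):

```python
# Toy ABAC evaluation: a policy is a list of rules, each rule holding
# equality predicates over subject, resource and environment attributes.
def matches(predicates, attributes):
    return all(attributes.get(k) == v for k, v in predicates.items())

def evaluate(policy, subject, resource, environment):
    for rule in policy:
        if (matches(rule.get("subject", {}), subject)
                and matches(rule.get("resource", {}), resource)
                and matches(rule.get("environment", {}), environment)):
            return rule["effect"]          # "Permit" or "Deny"
    return "Deny"                          # default deny

policy = [{"subject": {"role": "clerk"},
           "resource": {"type": "invoice"},
           "environment": {"network": "intranet"},
           "effect": "Permit"}]
print(evaluate(policy, {"role": "clerk"}, {"type": "invoice"}, {"network": "intranet"}))
```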
Formal Model of Component Real-time Interaction Behavior Based on Automata Theory
JIA Yang-li,ZHANG Zhen-ling,LI Zhou-jun
Computer Science. 2010, 37 (9): 151-156. 
Abstract PDF(491KB) ( 357 )   
RelatedCitation | Metrics
Formal specification and verification of the interaction behavior of complex real-time component systems are of great significance for improving trustworthy properties of component systems such as correctness and reliability. The advantages and disadvantages of using process algebras and automata to model component interaction behavior were analyzed, and on this basis a modeling method based on timed component interaction automata (TCIA) was presented. The related definitions, composition and verification algorithms of TCIA were given. Models based on TCIA can clearly and thoroughly specify components' interaction behavior, architecture and real-time information, and are convenient to verify. Finally, an application example was introduced.
Study of Consistency for Reflective Software Architecture
LUO Ju-bo,YING Shi
Computer Science. 2010, 37 (9): 157-160. 
Abstract PDF(401KB) ( 330 )   
RelatedCitation | Metrics
This paper proposed a reflective software architecture supporting the reuse of architectural-level designs and described the reuse operations with the formal specification language Object-Z. Moreover, it defined the characteristics of the meta-level and the base-level of the reflective software architecture. Finally, it provided a proof method and process for the consistency between the base-level and the meta-level of the reflective software architecture after the reuse operations.
Immune-based Method for Malware Detection
ZHANG Fu-yong,QI De-yu
Computer Science. 2010, 37 (9): 161-163. 
Abstract PDF(325KB) ( 310 )   
RelatedCitation | Metrics
In order to solve the problems in current malware detection, a new immune-based malware detection method was proposed. In this method, the IRP request sequences created by running programs are regarded as antigens; the normal programs in the operating system are self, and malwares are nonself. The nonself is detected by antibodies using artificial immunology. Experimental results reveal that this model has a high true positive rate and low false positive and false negative rates.
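A small sketch in this spirit, using fixed-length IRP n-grams as antigens and a negative-selection style detector set; the traces, names and threshold are hypothetical and not taken from the paper:

```python
# Programs are represented by the sequence of IRP request names they issue;
# fixed-length n-grams are the antigens. Detectors are n-grams never seen in
# the self (benign) traces; a program is flagged when enough of its antigens
# match detectors. Threshold and traces are made up for illustration.
def ngrams(seq, n=3):
    return {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}

def build_detectors(self_traces, candidate_traces, n=3):
    """Negative selection: keep candidate n-grams that never occur in self."""
    self_set = set().union(*(ngrams(t, n) for t in self_traces))
    return set().union(*(ngrams(t, n) for t in candidate_traces)) - self_set

def is_malware(trace, detectors, n=3, threshold=0.2):
    antigens = ngrams(trace, n)
    return bool(antigens) and len(antigens & detectors) / len(antigens) >= threshold

benign = [["IRP_MJ_CREATE", "IRP_MJ_READ", "IRP_MJ_CLOSE"]]
suspect = ["IRP_MJ_CREATE", "IRP_MJ_WRITE", "IRP_MJ_SET_INFORMATION", "IRP_MJ_CLOSE"]
detectors = build_detectors(benign, [suspect])
print(is_malware(suspect, detectors))
```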
Framework of Semantic Web Service Discovery Based on Multi-phase Matching
YANG Yong-qi,FU Yun-qing,YU Wei
Computer Science. 2010, 37 (9): 164-167. 
Abstract PDF(342KB) ( 321 )   
RelatedCitation | Metrics
With the rapid development and extensive application of Web services, finding the Web service that a user needs among many services has become a key issue. Based on research into semantic Web services, we proposed a framework for semantic Web service discovery based on multi-phase matching. We divided the entire discovery process into four matching stages: service type, service function, service name and service text description. In the service function matching stage, considering the density of concepts in the domain ontology, we proposed an improved GCSM algorithm based on information content. In the service name and text description matching stage, to handle polysemy, we proposed a disambiguation strategy based on instances and basic senses. Finally, the discovery framework was shown to be feasible and effective.
Selection Oriented Database Data Distribution Strategy for Cloud Computing
WEN Ming-bo,DING Zhi-ming
Computer Science. 2010, 37 (9): 168-172. 
Abstract PDF(410KB) ( 384 )   
RelatedCitation | Metrics
Many methods have been proposed to satisfy the needs of massive data processing, among which cloud computing is an outstanding one. The main idea of cloud computing is to use a large number of PCs to compose a huge cluster acting as a server. With the development of cloud computing technology, more and more applications will move into the cloud, including the DBMS. Since database systems require ACID properties, the performance of some operations, such as joins, may decline when data are distributed. In this paper we proposed a Selection Oriented data Distribution strategy (SOD) to improve the performance of a DBMS in cloud computing, and experiments show that it works well.
Improved Text Retrieval Algorithm Based on Subject-verb-object Structure
HUANG Cheng-hui,YIN Jian,HOU Fang
Computer Science. 2010, 37 (9): 173-176. 
Abstract PDF(372KB) ( 360 )   
RelatedCitation | Metrics
In the area of text retrieval, popular methods consider either word frequency or the semantic information between query terms and the text corpus. These methods ignore the semantic structure information of the query terms and the corpus, so good results are limited to certain domains. This paper analyzed the subject-verb-object structure information of text, computed the similarity of words according to their positions in the subject-verb-object structure, and finally implemented semantic text retrieval. The experiments show that the approach can improve precision effectively.
Research on Scene Dispatch Strategy Based on DBTNN Algorithm
YANG Dong-mei,YIN Gui-sheng,LAI Chu-rong
Computer Science. 2010, 37 (9): 177-179. 
Abstract PDF(266KB) ( 375 )   
RelatedCitation | Metrics
A new scene graph that can effectively describe large-scale virtual environments and a scene-graph-based multi space partition tree (MBSP) were proposed. A scene dispatch strategy based on a Dynamical Binary-tree Based Neural Network (DBTNN) was then proposed, and the neural network was used to mine the rule of viewpoint changes during the assembly process. At the same time, the viewpoint state of the next step or the next several steps is forecast from the network output, so that scene scheduling becomes fault-tolerant, more real-time and more intelligent. Finally, test results on different scenes show that the algorithm achieves very good results for the optimization of large-scale complex scenes.
New Algorithm of Generating Concept Lattice Based on Concept-matrix
CHEN Zhen,ZHANG Na,WANG Su-jing
Computer Science. 2010, 37 (9): 180-183. 
Abstract PDF(304KB) ( 485 )   
RelatedCitation | Metrics
The concept lattice, the core data structure in FCA (Formal Concept Analysis), has been widely used in machine learning and data mining. In these applications, building the concept lattice is very important, and an efficient algorithm, CMCG, based on the concept matrix was put forward for this purpose. The algorithm starts from the top node of the lattice, generates all sub-nodes of each node using the ranking of the concept matrix's attributes, completes the links between the sub-nodes and their parents, and generates the Hasse diagram. The validity of the algorithm was proved in theory. In the end, the pseudo-code of the CMCG algorithm was given, and experiments showed that CMCG is superior in running time to the Lattice algorithm.
Similarity Algorithm Based on User's Common Neighbors and Grade Information
HE Yin-hui,CHEN Duan-bing,CHEN Yong,FU Yan
Computer Science. 2010, 37 (9): 184-186. 
Abstract PDF(328KB) ( 509 )   
RelatedCitation | Metrics
With the rapid growth of the Internet, recommender systems have been used in many fields, and collaborative filtering (CF) is one of the earliest and most successful techniques. A CF method usually identifies the neighborhood of each user based on the similarity between users, then predicts item ratings by integrating the ratings of the target user's neighbors, and finally recommends to the target user the items with the highest predicted scores. Similarity therefore plays an important role and affects the accuracy of the prediction. Up to now, various similarity measures have been proposed from different perspectives, and the common-neighbor algorithm is a simple and efficient one. However, the common-neighbor algorithm only considers the number of objects rated by both users and ignores the users' grade (rating) information. In this paper, an improved algorithm based on common neighbors and the users' grade information was proposed. Experimental results indicate that the improved common-neighbor algorithm can obtain rather good prediction results.
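A small sketch of the general idea of augmenting a common-neighbor count with rating agreement; the discounting rule below is invented for illustration and is not the paper's exact formula:

```python
# Illustrative similarity: start from the items both users rated (their
# "common neighbors") and discount each item by how far apart the two
# ratings are on a 1-5 scale. The discount rule is invented.
def similarity(ratings_u, ratings_v, max_diff=4):
    common = set(ratings_u) & set(ratings_v)
    if not common:
        return 0.0
    agreement = sum(1.0 - abs(ratings_u[i] - ratings_v[i]) / max_diff for i in common)
    return agreement / len(common)   # plain common-neighbor similarity would just use len(common)

u = {"item1": 5, "item2": 3, "item3": 1}
v = {"item1": 4, "item2": 3, "item4": 2}
print(similarity(u, v))
```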
Self-adaptable Granularity Road Network Moving Objects' Clustering Algorithm
SHI Heng-liang,LIU Chuan-ling,BAI Guang-yi, TANG Zhen-min
Computer Science. 2010, 37 (9): 187-189. 
Abstract PDF(333KB) ( 332 )   
RelatedCitation | Metrics
Although previous clustering algorithms can reduce the communication cost between moving objects and the central database in a road traffic network, the clustering granularity is set by experience. This paper analyzed the factors that influence the clustering distance granularity, and introduced a method that trains a BP network with historical data to obtain the clustering distance granularity and the clustering time granularity dynamically. As new historical data arrive, these granularity values can be used to train the BP network further. The network can thus adapt to the influencing factors dynamically and produce efficient clustering granularity values that reduce communication cost and support traffic jam forecasting as an observation for optimal route planning.
Research and Application of Lightweight Data Persistence Technology Based on ORM
LI Jie
Computer Science. 2010, 37 (9): 190-193. 
Abstract PDF(442KB) ( 438 )   
RelatedCitation | Metrics
In order to set up an efficient mapping relation between objects and the database, the paper proposed a data persistence method in which a data persistence layer is created between the logic layer and the database layer, based on the key functions of a data persistence layer. The typical data persistence methods were compared and their merits and demerits analyzed; inspired by the functionality of Hibernate, a universal data persistence layer framework was then realized.
Set Algebra as a Semantic Interpretation for the Classical Formal System of Propositional Calculus
LIU Hong-lan,GAO Qing-sh,YANG Bing-ru
Computer Science. 2010, 37 (9): 194-197. 
Abstract PDF(333KB) ( 709 )   
RelatedCitation | Metrics
The well-formed formulas (wffs) in the classical formal system of propositional calculus (CPC) are only formal symbols, whose meanings are given by an interpretation. Both logic algebras and set algebras are Boolean algebras, and both are interpretations for CPC. A set algebra is a set semantics for CPC, in which set operations are the interpretation of the connectives, set functions are the interpretation of wffs, set inclusion is the interpretation of logical implication, and set equality = is the interpretation of logical equivalence. Standard probabilistic logic is based on a standard probability space; a proposition describes a random event, which is a set, and the event domain of a probability space is a set algebra, so probabilistic logic is just a practical application of the set semantics for CPC. We can perform event calculus instead of probability calculus in CPC, and CPC is fully applicable to probabilistic propositional calculus.
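With an interpretation map I that assigns a set A_p to each propositional variable p (standard notation, not quoted from the paper), the set semantics described above can be written as:

```latex
I(p) = A_p, \qquad
I(\neg\varphi) = \overline{I(\varphi)}, \qquad
I(\varphi \wedge \psi) = I(\varphi) \cap I(\psi), \qquad
I(\varphi \vee \psi) = I(\varphi) \cup I(\psi);
\qquad
\varphi \Rightarrow \psi \ \text{is interpreted as}\ I(\varphi) \subseteq I(\psi),
\qquad
\varphi \Leftrightarrow \psi \ \text{as}\ I(\varphi) = I(\psi).
```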
Cognition Evolutionary Algorithm
WANG Lei,WANG Wei-ping,YANG Feng,ZHU Yi-fan
Computer Science. 2010, 37 (9): 198-204. 
Abstract PDF(621KB) ( 370 )   
RelatedCitation | Metrics
Inspired by the process of human creative thinking, the cognition evolutionary algorithm, a novel intelligent algorithm based on cognitive science and computational creativity, was proposed; it simulates the problem-solving process and behaviors of creative thinking. The algorithm is composed of six components: divergent thinking, convergent thinking, memory, execution, learning and value measures. Taking problem solving as a knowledge-based creative thinking process, knowledge evolution and knowledge-based creative thinking skills play important roles in the algorithm. The impact of the parameters of the cognition evolutionary algorithm on its performance was analyzed through an extended path optimization problem. The results show that, for knowledge-intensive optimization problems, the novel algorithm requires fewer objective evaluations than other classic intelligent algorithms.
Attributes Reduction Based on the Variable Precision Rough Set in Decision Tables Containing Continuous-valued Attributes
FENG Lin,LI Tian-rui,YU Zhi-qiang
Computer Science. 2010, 37 (9): 205-208. 
Abstract PDF(342KB) ( 302 )   
RelatedCitation | Metrics
Attribute reduction is one of the key problems of the rough set theory. In order to effectively use the rough set theory to deal with the problem of attribute reduction in Decision Tables containing Continuous-Valued Attributes (DTCVA) directly,a new variable precision rough set model and a heuristic algorithm for attributes reduction in DTCVA were developed. Simulation results show that the proposed approach is effective for reduction of continuous-valued attributes, and more efficient than the classical rough set approaches in processing attribute reduction in decision information systems containing continuous-valued attributes.
Research of the Real Adaboost Algorithm
YAN Chao,WANG Yuan-qing
Computer Science. 2010, 37 (9): 209-211. 
Abstract PDF(344KB) ( 655 )   
RelatedCitation | Metrics
In current artificial intelligence and pattern recognition, the Real AdaBoost algorithm has been used more and more widely thanks to its high accuracy and very fast speed. We therefore studied the theoretical basis of the Real AdaBoost algorithm carefully and analyzed in detail the training procedure of classifiers based on it. In the course of this work, we probed into the relationships among the mathematical variables involved in the algorithm, derived the mathematical process of the algorithm quantitatively, and analyzed qualitatively the causes of problems that appear in the training procedure. Finally, we put forward several suggestions for improving the Real AdaBoost algorithm.
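For reference, one standard statement of Real AdaBoost (Schapire and Singer's formulation with domain-partitioning weak learners) is as follows: each weak hypothesis outputs a real-valued confidence per bin j, the sample weight distribution D_t is re-weighted exponentially, and the final classifier is the sign of the summed confidences; here W+ and W- are the weights of positive and negative samples falling into bin j, and epsilon is a small smoothing constant:

```latex
h_t(x) = \frac{1}{2}\,\ln\frac{W_{+}^{j(x)} + \varepsilon}{W_{-}^{j(x)} + \varepsilon}, \qquad
D_{t+1}(i) = \frac{D_t(i)\,\exp\!\bigl(-y_i\,h_t(x_i)\bigr)}{Z_t}, \qquad
H(x) = \operatorname{sign}\!\Bigl(\sum_{t=1}^{T} h_t(x)\Bigr),
```

where Z_t is the factor normalizing D_{t+1} to a distribution.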
Kernel k-means Clustering Algorithm for Detecting Communities in Complex Networks
FU Li-dong
Computer Science. 2010, 37 (9): 212-213. 
Abstract PDF(248KB) ( 411 )   
RelatedCitation | Metrics
Discovering community structure is fundamental for uncovering the links between structure and function in complex networks. In this context, Li et al. recently proposed the modularity density objective function for community detection, called the D function, and showed its equivalence to kernel k-means through a kernel matrix. In this paper, based on this equivalence, we used the kernel matrix to optimize the modularity density and developed a new kernel k-means algorithm. Experimental results indicate that the new algorithm is efficient at finding community structures in complex networks.
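A minimal kernel k-means sketch of the kind referred to above is given below; note that the kernel matrix here is just a placeholder built from the adjacency matrix, whereas the paper derives the kernel from the modularity density (D) function:

```python
import numpy as np

def kernel_kmeans(K, k, n_iter=50, seed=0):
    """Kernel k-means: assign each point i to the cluster c that minimizes
    K[i,i] - 2*mean_{j in c} K[i,j] + mean_{j,l in c} K[j,l]."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=n)
    for _ in range(n_iter):
        dist = np.zeros((n, k))
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size == 0:
                dist[:, c] = np.inf        # empty cluster: never the closest
                continue
            dist[:, c] = (np.diag(K) - 2 * K[:, members].mean(axis=1)
                          + K[np.ix_(members, members)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# toy graph: two triangles joined by a single edge; K is a placeholder kernel
A = np.array([[0,1,1,0,0,0], [1,0,1,0,0,0], [1,1,0,1,0,0],
              [0,0,1,0,1,1], [0,0,0,1,0,1], [0,0,0,1,1,0]], dtype=float)
K = A + np.eye(6)          # NOT the paper's D-function kernel
print(kernel_kmeans(K, 2))
```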
Study of the Embedded Basic Motion Control System for an Unmanned Surface Vehicle
LIAO Yu-lei,PANG Yong-jie,ZHUANG Jia-yuan
Computer Science. 2010, 37 (9): 214-217. 
Abstract PDF(353KB) ( 600 )   
RelatedCitation | Metrics
This paper studied the embedded basic motion control system of an unmanned surface vehicle from both the hardware and the software viewpoints. The motion control system adopts a combined mode of bottom-level IPC and top-level control, which is convenient for debugging and monitoring. The paper also expatiated in detail on the control flow of the unmanned surface vehicle's basic motion control system, including the partition of control tasks, motion control and control algorithms. Finally, the reliability and feasibility of the whole embedded basic motion control system were verified by simulation trials.
P(ρ,σ)-sets and its Random Characteristics
YU Xiu-qing
Computer Science. 2010, 37 (9): 218-221. 
Abstract PDF(289KB) ( 299 )   
RelatedCitation | Metrics
P-sets (packet sets) are a pair of sets composed of an internal P-set and an outer P-set, and they have dynamic characteristics. These dynamic characteristics are obtained through element transfer, which has random characteristics. Based on this, the paper presented the concept of P(ρ,σ)-sets, gave their structure, and proved that P(ρ,σ)-sets are the general case of P-sets while P-sets are a special case of P(ρ,σ)-sets. The paper discussed the random and dynamic characteristics of P(ρ,σ)-sets, gave random-relation theorems among the internal P(ρ,σ)-set, the outer P(ρ,σ)-set and the element transfer probability, and presented their applications.
Study on Text Clustering Algorithm Based on Similarity Measurement of Ontology
WANG Gang,ZHONG Guo-xiang
Computer Science. 2010, 37 (9): 222-224. 
Abstract PDF(316KB) ( 364 )   
RelatedCitation | Metrics
To improve the quality of text clustering and obtain satisfactory clustering results, we proposed a text clustering method based on ontology similarity. By organizing text as an ontology, it is easy to represent the meanings of and relations between concepts. We designed and improved the similarity measurement, measured text similarity by the similarity of the text ontologies, and designed a text clustering algorithm based on this similarity. Experiments show that our method avoids the problems of term isolation and high dimensionality, and improves clustering quality in terms of correctness and association.
Uncertainty Measures of Rough Sets Based on Knowledge Granularities
XIE Bin,LI Lei-jun,MI Ju-sheng
Computer Science. 2010, 37 (9): 225-228. 
Abstract PDF(299KB) ( 412 )   
RelatedCitation | Metrics
The uncertainty of rough sets is closely related to the knowledge granularity of the approximation space. A concept of relative knowledge granularity was proposed. The roughness of a rough set based on relative knowledge granularity not only reflects the effect of the approximation space but also removes the influence of the negative region of the rough set. A new kind of fuzziness of rough sets based on boundary entropy was designed. Both the roughness and the fuzziness decrease monotonically as the knowledge granularity of the approximation space is refined.
Dynamic Fuzzy Description Logic for Deep Web Uncertain Knowledge Representation
FANG Wei,CUI Zhi-ming
Computer Science. 2010, 37 (9): 229-233. 
Abstract PDF(413KB) ( 311 )   
RelatedCitation | Metrics
A large amount of the information in the Deep Web is valuable and topic-oriented, and it is difficult to use existing knowledge representation formalisms to encode and reason about the uncertain knowledge in the Deep Web. The paper presented a method to represent uncertain Deep Web knowledge using dynamic fuzzy description logics (DFDLs). The syntax and semantics of DFDLs were given, and then a tableau-based decision algorithm for this logic was provided. DFDLs provide a more reasonable logical foundation for uncertain Deep Web knowledge and can represent more of its fuzzy and dynamic information.
Ability Assessment Model in Learning Community Based on a Membership Classification Quantitative
CHENG Yan, XU Wei-sheng,HE Yi-wen
Computer Science. 2010, 37 (9): 234-238. 
Abstract PDF(469KB) ( 351 )   
RelatedCitation | Metrics
The assessment standard of learner-centered teaching evaluation shifts from knowledge to ability. Using virtual learning communities as an E-learning platform, a fuzzy assessment model of community learners' ability was set up, based on the characteristics of virtual learning communities and E-learning and combining personal ability, online cooperation ability and test results. The assessment corresponds to the membership vector of the comments. To narrow the fuzzy disparity, this paper used the information from the comprehensive evaluation to build a classification quantitative model, further quantifying the assessment so as to classify the students' ability more accurately.
Study on Service Scheduling Based on PSO, Ontology and Market Mechanism
ZHONG Sheng-hai,WANG Gang,QIU Yu-hui
Computer Science. 2010, 37 (9): 239-241. 
Abstract PDF(323KB) ( 302 )   
RelatedCitation | Metrics
Semantic-based service scheduling is important in service computing, and service scheduling is an optimization process. As current service scheduling methods cannot take both efficiency and semantics into account well, we studied how to combine PSO, ontology and market mechanisms, and proposed the SBPOM method to schedule services based on them. This method can improve service scheduling and provides an efficient way for users to obtain satisfactory services. Experiments show that our method takes full advantage of the capability of PSO, and that service scheduling and resource usage are also efficient.
Inequality Relation of Knowledge Attribute Disturbance Law
ZHANG Guan-yu,LIU Xia-fei
Computer Science. 2010, 37 (9): 242-244. 
Abstract PDF(283KB) ( 327 )   
RelatedCitation | Metrics
The attribute disturbance of knowledge causes the knowledge law to change, and the changed laws are called disturbance laws. By using one-direction S-rough sets and the dual of one-direction S-rough sets, the concepts of the knowledge upper law and the knowledge lower law under attribute disturbance were given, the knowledge law and the knowledge law under attribute disturbance were discussed, and inequality theorems for the attribute-disturbance knowledge upper law and lower law were proposed.
Counter-example Generation in Generalized Symbolic Trajectory Evaluation
LI Yi-nian,CAO Zhan-tao,ZHENG De-sheng,YANG Guo-wu
Computer Science. 2010, 37 (9): 245-248. 
Abstract PDF(300KB) ( 420 )   
RelatedCitation | Metrics
Generalized symbolic trajectory evaluation is a powerful model checking technique which introduces symbolic quaternary values and symbolic variables into symbolic quaternary assignments, but finding counter-examples in it is difficult. In this paper, we presented a solution for searching for counter-examples: it uses set intersection and backward simulation to generate the counter-example. We also extended the solution to handle the problems arising from symbolic variables.
Steel Frame Model Updating Based on Self-adaptive Quadratic Particle Swarm Optimization Algorithm
QIN Yu-ling,KONG Xian-ren, LUO Wen-bo
Computer Science. 2010, 37 (9): 249-251. 
Abstract PDF(231KB) ( 271 )   
RelatedCitation | Metrics
The particle swarm optimization (PSO) algorithm, which has few parameters, is widely used in optimization for its good global search ability and computational efficiency, but some parameters of its formula must be changed to improve its search ability and avoid getting trapped in local optima. The inertia factor and the optimal positions in the velocity formula of PSO were updated, and a self-adaptive quadratic particle swarm optimization (SAQPSO) algorithm with a simple form and high search efficiency was proposed. Model updating of a five-layer steel frame structure confirms the validity and superiority of SAQPSO.
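For context, the standard PSO velocity and position updates that such modifications start from are (with inertia weight w, acceleration coefficients c_1, c_2 and uniform random numbers r_1, r_2 in [0,1]):

```latex
v_i^{t+1} = w\,v_i^{t} + c_1 r_1\bigl(p_i^{\mathrm{best}} - x_i^{t}\bigr) + c_2 r_2\bigl(g^{\mathrm{best}} - x_i^{t}\bigr),
\qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1}.
```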
Research on Hamiltonian Cycle Based on Path with Interface
LIU Chao,WANG Wen-jie
Computer Science. 2010, 37 (9): 252-256. 
Abstract PDF(405KB) ( 885 )   
RelatedCitation | Metrics
In order to find all Hamiltonian cycles in a digraph, an encoding method for the power set was first presented, which converts the problem of finding Hamiltonian circuits into the computation of a hierarchical matrix. Secondly, the complexity of the algorithm was estimated together with a proof of the Xiaerci conjecture. Finally, an exact algorithm for the CTSP was given.
Multi-information for Visual Object Categorization
JIANG Ai-wen,WANG Chun-heng,XIAO Bai-hua,CHENG Gang
Computer Science. 2010, 37 (9): 257-260. 
Abstract PDF(350KB) ( 310 )   
RelatedCitation | Metrics
Visual object categorization (VOC) is one of the most difficult challenges in computer vision. The spatial pyramid histogram has been proposed in recent years as an effective way to deal with feature sets, but there remains large room for improvement. We exploited the respective advantages of the spatial pyramid histogram and the Fisher score representation and proposed to use multiple sources of information for recognition, from the viewpoint of information complementarity. The experimental results confirm our strategy: the proposed algorithm consistently boosts the performance of all classes compared with their respective baselines.
FPGA Design of Wavelet Transform in Spatial Aircraft Image Compression
TANG Yao,CAO Jian-zhong,LIU Bo,ZHOU Zuo-feng
Computer Science. 2010, 37 (9): 261-263. 
Abstract PDF(244KB) ( 430 )   
RelatedCitation | Metrics
A novel architecture based on the lifting wavelet transform was proposed to implement the CCSDS image compression algorithm on an FPGA. The line-based parallel architecture, which consists of two row processors, performs a 3-level 2-D 9/7 integer-to-integer forward discrete wavelet transform and can process two rows of image data simultaneously. The row and column data are processed in parallel by storing the intermediate data in a 10-row buffer. The whole 3-level wavelet transform architecture is optimized with a pipelined design and achieves lower resource utilization and shorter storage time. The architecture, which has been demonstrated on an Altera Stratix II FPGA, performs a decomposition in approximately N²/2 clock cycles for an N×N gray image. According to the experimental results, the new architecture can implement the wavelet transform for a 1024×1024 gray image at 100 frames per second while working at 86.5 MHz.
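The hardware architecture itself is not reproduced here; as a software reference for the underlying transform, the following is a floating-point sketch of one level of the CDF 9/7 lifting scheme applied to rows and then columns. The FPGA design uses an integer-to-integer variant with line-based buffering, and this sketch uses periodic boundary handling for brevity.

```python
import numpy as np

# Standard CDF 9/7 lifting coefficients (floating-point version).
ALPHA, BETA = -1.586134342, -0.05298011854
GAMMA, DELTA, K = 0.8829110762, 0.4435068522, 1.149604398

def lift97_1d(x):
    """One level of the 9/7 forward lifting transform on a 1-D signal of even length.

    Periodic extension (np.roll) is used at the boundaries for simplicity."""
    s, d = x[0::2].astype(float), x[1::2].astype(float)
    d = d + ALPHA * (s + np.roll(s, -1))    # predict 1
    s = s + BETA  * (d + np.roll(d, 1))     # update 1
    d = d + GAMMA * (s + np.roll(s, -1))    # predict 2
    s = s + DELTA * (d + np.roll(d, 1))     # update 2
    return s * K, d / K                     # low-pass, high-pass subbands

def dwt97_2d(img):
    """One 2-D decomposition level: rows first, then columns.

    Output layout: top-left LL, top-right HL, bottom-left LH, bottom-right HH."""
    rows = np.array([np.concatenate(lift97_1d(r)) for r in img])     # [L | H] per row
    cols = np.array([np.concatenate(lift97_1d(c)) for c in rows.T])  # [L | H] per column
    return cols.T

coeffs = dwt97_2d(np.random.default_rng(0).random((8, 8)))
```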
Investigation of the Algorithm for Iris Localization
SHI Chun-lei,JIN Long-xu
Computer Science. 2010, 37 (9): 264-266. 
Abstract PDF(269KB) ( 429 )   
RelatedCitation | Metrics
The accuracy and speed of iris boundary localization affect the performance of an iris recognition system. Based on an analysis of some prevailing iris recognition algorithms, the edge information of the iris image was extracted by an edge detection operator based on Canny's idea, and the inner and outer circles of the iris were localized by a Hough transform within small block images by incorporating prior knowledge; experimental results show that this localization method improves the boundary localization speed while ensuring localization accuracy. The noise in the iris region includes the eyelid, eyelashes, eyelid shadow and specular reflections. A segmental secondary linear localization method adopting edge detection and the Radon transform was proposed to remove the interference of the eyelid on eyelid localization, and the eyelash noise and eyelid shadow were removed by a threshold method; experimental results show that the algorithm is efficient and accurate.
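A minimal OpenCV sketch of the coarse localization step (Canny-style edges plus a circular Hough transform) is given below; the radius ranges and thresholds are illustrative, the input file name is hypothetical, and the block-wise prior-knowledge constraints and the Radon-transform eyelid step of the paper are not reproduced.

```python
import cv2

def localize_iris(gray):
    """Rough pupil/iris circle localization: smooth, detect edges, then Hough circles."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 2)
    edges = cv2.Canny(blurred, 40, 80)        # explicit edge map (thresholds illustrative)

    # Inner (pupil) boundary: small radii; outer (iris) boundary: larger radii.
    # HoughCircles applies its own Canny stage internally (param1 is its high threshold).
    pupil = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                             param1=80, param2=20, minRadius=20, maxRadius=60)
    iris = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                            param1=80, param2=30, minRadius=70, maxRadius=150)
    return edges, pupil, iris

img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
if img is not None:
    edge_map, pupil_circles, iris_circles = localize_iris(img)
```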
Sparse Representation Based Face Recognition Algorithm
YANG Rong-gen,REN Ming-wu,YANG Jing-yu
Computer Science. 2010, 37 (9): 267-269. 
Abstract PDF(317KB) ( 733 )   
RelatedCitation | Metrics
We analyzed the mathematical essence of sparse representation, namely sparsity-regularized signal decomposition, and studied the orthogonal matching pursuit (OMP) sparse representation algorithm. Using the matrix Cholesky decomposition, we realized a fast version of the OMP algorithm. We cast the recognition problem as one of classifying among multiple linear regression models and developed a new framework based on sparse signal representation, viewing a test sample as a linear combination of the training samples. We conducted face recognition experiments to verify the efficacy of the proposed algorithm.
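The sketch below illustrates the general sparse-representation-classification idea with a small NumPy OMP that refits by least squares on the selected atoms (the Cholesky-based fast update mentioned in the abstract is not reproduced); the toy data and parameters are illustrative.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily pick k columns (atoms) of dictionary D."""
    residual, support = y.astype(float).copy(), []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))    # most correlated atom
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # refit on the support
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

def classify_src(D, labels, y, k=10):
    """Assign y to the class whose training columns best reconstruct it."""
    x = omp(D, y, k)
    residuals = {c: np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
                 for c in np.unique(labels)}
    return min(residuals, key=residuals.get)

# Toy example: 3 classes, 5 training samples each, 64-dimensional features.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 15)); D /= np.linalg.norm(D, axis=0)
labels = np.repeat(np.arange(3), 5)
test = D[:, 7] + 0.05 * rng.normal(size=64)     # noisy copy of a class-1 sample
print(classify_src(D, labels, test))            # most likely prints 1
```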
Medical Image Registration Based on Moving Least Squares
WANG Wei,SU Zhi-xun
Computer Science. 2010, 37 (9): 270-271. 
Abstract PDF(231KB) ( 589 )   
RelatedCitation | Metrics
The purpose of medical image registration is to match points that correspond in spatial and anatomical location across two or more images. A novel medical image registration algorithm based on moving least squares was proposed. The algorithm segments the regions of interest in the images, then uses a semi-automatic method to extract landmark points and deforms the image using moving least squares, thereby achieving medical image registration. The experimental results demonstrate that the proposed algorithm is an accurate and effective method of medical image registration.
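A minimal sketch of affine moving-least-squares deformation driven by landmark pairs is shown below, assuming the standard closed-form weighted solution; the segmentation and semi-automatic landmark extraction steps of the paper are not included, and the weight exponent and epsilon are illustrative.

```python
import numpy as np

def mls_affine_deform(points, p_src, q_dst, alpha=1.0, eps=1e-8):
    """Map each point with the affine moving-least-squares deformation defined by
    landmark pairs (p_src[i] -> q_dst[i]).  points: (N, 2); p_src, q_dst: (K, 2)."""
    out = np.empty_like(points, dtype=float)
    for n, v in enumerate(points.astype(float)):
        w = 1.0 / (np.sum((p_src - v) ** 2, axis=1) ** alpha + eps)  # landmark weights
        p_star = (w[:, None] * p_src).sum(0) / w.sum()
        q_star = (w[:, None] * q_dst).sum(0) / w.sum()
        p_hat, q_hat = p_src - p_star, q_dst - q_star
        A = (w[:, None] * p_hat).T @ p_hat                           # 2x2 weighted moment
        B = (w[:, None] * p_hat).T @ q_hat
        M = np.linalg.solve(A, B)                                    # local affine matrix
        out[n] = (v - p_star) @ M + q_star
    return out

# Toy example: three fixed landmarks and one landmark that moves to the right.
p = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
q = np.array([[0, 0], [10, 0], [0, 10], [12, 10]], float)
grid = np.array([[5, 5], [9, 9]], float)
print(mls_affine_deform(grid, p, q))
```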
Vectorial Scale-based Image Filtering
LI Yue-yang,WANG Shi-tong
Computer Science. 2010, 37 (9): 272-275. 
Abstract PDF(350KB) ( 369 )   
RelatedCitation | Metrics
In the filtering of medical images, it is important to preserve edges and details; a major drawback of filtering is that it often blurs important structures along with the noise. Scale-based filtering methods for scalar images have been studied in recent years. In this paper, we generalized the scale-based filtering method from scalar images to vectorial images and introduced three vectorial scale-based image filtering methods on the basis of the conventional VMF, BVDF and DDF. These new methods use local structure size, or "object scale", information to arrest smoothing around fine structures. The object scale allows us to better control the filtering process by constraining smoothing in regions with fine details while permitting effective smoothing in the interior of homogeneous regions. Qualitative and quantitative experiments based on Visible Human Project data sets demonstrate that the proposed methods outperform the corresponding conventional filtering methods in preserving edges and details.
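For reference, the sketch below implements the conventional vector median filter (VMF) that the proposed methods extend; the scale-adaptive control described in the abstract is not reproduced, and the window radius is illustrative.

```python
import numpy as np

def vector_median_filter(img, radius=1):
    """Vector median filter for a color image of shape (H, W, 3).

    Each pixel is replaced by the window vector minimizing the sum of L2 distances
    to all other vectors in its (2r+1) x (2r+1) neighborhood."""
    h, w, c = img.shape
    padded = np.pad(img.astype(float),
                    ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1].reshape(-1, c)
            dists = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
            out[i, j] = win[np.argmin(dists)]      # vector median of the window
    return out

noisy = np.random.default_rng(0).integers(0, 256, (32, 32, 3)).astype(float)
smoothed = vector_median_filter(noisy)
```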
Feature Extraction from Cryo-EM Image Based on DoG
WU Xiao-rong,WU Xiao-ming
Computer Science. 2010, 37 (9): 276-278. 
Abstract PDF(243KB) ( 486 )   
RelatedCitation | Metrics
It may be difficult to extract distinctive features pertinent to a specimen when dealing with very low-contrast, low-SNR cryo-electron microscopy (cryo-EM) images. A DoG (difference of Gaussians) based method to extract shape features was provided in this paper, which aims to obtain a binary image by using DoG. This binary image was used as an object mask, and shape features were calculated for the largest object in the mask. The experiments showed that this method achieves good results.
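A minimal sketch of the DoG-mask idea is given below using SciPy: band-pass the image with a difference of Gaussians, threshold it into a binary mask, and keep the largest connected object; the sigmas and threshold are illustrative and the input is a stand-in array.

```python
import numpy as np
from scipy import ndimage

def dog_particle_mask(img, sigma_small=2.0, sigma_large=6.0):
    """Binary mask of the largest blob found by a difference-of-Gaussians band-pass."""
    dog = ndimage.gaussian_filter(img, sigma_small) - ndimage.gaussian_filter(img, sigma_large)
    mask = dog > dog.mean() + dog.std()                  # simple global threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return labels == (np.argmax(sizes) + 1)              # keep only the largest object

img = np.random.default_rng(0).random((128, 128))        # stand-in for a cryo-EM micrograph
mask = dog_particle_mask(img)
```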
SMT Solder Joint Image Denoising Based on Wavelet Packet Transform and Wiener Filter
ZHAO Hui-huang,ZHOU De-jian,WU Zhao-hua
Computer Science. 2010, 37 (9): 279-281. 
Abstract PDF(350KB) ( 336 )   
RelatedCitation | Metrics
A novel image processing approach was proposed to reduce the noise in surface mount technology (SMT) solder joint images, based on the wavelet packet transform and the Wiener filter. First, by using the wavelet packet transform, the approach decomposes the image into both low-frequency and high-frequency parts at several scales, so that the noise can be eliminated while the useful image information is retained. Then, after analyzing the wavelet packet tree coefficients, the detail coefficients were processed with the Wiener filter, leaving the low-frequency wavelet packet tree coefficients unchanged. Finally, the inverse wavelet packet transform was applied to reconstruct the denoised SMT solder joint image. The experimental results illustrate that the proposed approach achieves better image denoising than conventional methods and retains most image edges.
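A sketch of this pipeline using PyWavelets' 2-D wavelet packet decomposition and SciPy's Wiener filter is shown below; the wavelet, depth and window size are illustrative, and only the choice of filtering every detail node while leaving the pure approximation node unchanged follows the abstract.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def wp_wiener_denoise(img, wavelet="db2", level=2):
    """Wavelet-packet + Wiener denoising sketch: filter every detail node's
    coefficients, keep the all-approximation node unchanged, then reconstruct."""
    wp = pywt.WaveletPacket2D(data=img.astype(float), wavelet=wavelet,
                              mode="symmetric", maxlevel=level)
    for node in wp.get_level(level):
        if node.path != "a" * level:                 # skip the low-frequency node
            node.data = wiener(node.data, mysize=3)  # 3x3 adaptive Wiener filter
    # Crop to the original shape in case the transform padded the borders.
    return wp.reconstruct(update=False)[:img.shape[0], :img.shape[1]]

noisy = np.random.default_rng(0).random((64, 64))
clean = wp_wiener_denoise(noisy)
```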
Shot Segmentation Based on Motion Compensation and Self-adaptive Dual Thresholds
ZHANG Yu-zhen,YANG Ming,WANG Jian-yu,DAI Yue-wei
Computer Science. 2010, 37 (9): 282-286. 
Abstract PDF(461KB) ( 571 )   
RelatedCitation | Metrics
Shot segmentation is the structural basis of video retrieval. An efficient shot segmentation algorithm was proposed in this paper. Firstly, to handle the motion between video frames, a motion vector field adaptive search technique was used to estimate block motion vectors between two frames, and motion compensation was then performed. On this basis, the number of pixels whose values do not change much between two frames was computed, and local self-adaptive dual thresholds were derived from a sliding window. Finally, cuts and gradual transitions were detected by comparison against the dual thresholds. Experimental results show that the algorithm not only detects cuts and gradual transitions efficiently, but is also robust to motion.
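The sketch below shows only the sliding-window dual-threshold stage applied to a precomputed per-frame difference signal (the motion-vector estimation and compensation steps are omitted); the window size and threshold multipliers are illustrative.

```python
import numpy as np

def detect_shot_changes(diffs, window=15, k_high=3.0, k_low=1.5):
    """Classify each frame difference as a cut, a gradual-transition candidate, or normal,
    using thresholds adapted from a sliding window of neighboring differences."""
    diffs = np.asarray(diffs, float)
    cuts, graduals = [], []
    for i in range(len(diffs)):
        lo, hi = max(0, i - window), min(len(diffs), i + window + 1)
        local = np.concatenate([diffs[lo:i], diffs[i + 1:hi]])  # exclude current frame
        t_high = k_high * local.mean() + 1e-12
        t_low = k_low * local.mean() + 1e-12
        if diffs[i] > t_high:
            cuts.append(i)                                      # abrupt cut
        elif diffs[i] > t_low:
            graduals.append(i)                                  # candidate gradual transition
    return cuts, graduals

# Example: a flat difference signal with one cut-like spike and a slow ramp.
sig = np.ones(100); sig[40] = 20; sig[70:75] = 3
print(detect_shot_changes(sig))
```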
Novel Video Watermarking Algorithm Based on MPEG7 Contour Description
MA Jie,LI Jian-fu
Computer Science. 2010, 37 (9): 287-289. 
Abstract PDF(340KB) ( 338 )   
RelatedCitation | Metrics
For the copyright protection of digital video, a novel video watermarking algorithm based on the MPEG-7 contour descriptor was proposed. The original video is split into a series of frame groups; for each frame in a group, the points of the contour shape are described by MPEG-7, and a hash function is used to calculate the keys between those points in the transform domain and the watermark signals. Then all the contour points of each frame in a group are decomposed through a two-dimensional transform to obtain a domain that is invariant along the time axis and the contours, and the watermark signals are embedded into this domain to ensure the reliability and robustness of the watermark. In the watermark extraction process, multiple watermark verifications are performed to maintain accuracy. Experimental results show that the watermarked frames are subjectively indistinguishable from the original frames and that the proposed video watermarking algorithm is robust against additive Gaussian noise, frame dropping, frame averaging and lossy compression attacks.
Authentication Enhanced Object-based Storage Security Mechanism
YAO Di,FENG Dan
Computer Science. 2010, 37 (9): 290-293. 
Abstract PDF(354KB) ( 331 )   
RelatedCitation | Metrics
To improve access performance, the client communicates directly with the object-based storage device (OSD) in the object-based storage architecture, but this introduces security issues. In this paper, a new set of security mechanisms for the object-based storage system was proposed: by using a secure key agreement and mutual authentication protocol in the main communications between the devices of the object-based storage system, it can prevent most network attacks and thus improves the overall security of the object-based storage system.
Efficient Method of Energy Saving on Variable Voltage Multi-core Processor
WANG Ying-feng,LIU Zhi-jing
Computer Science. 2010, 37 (9): 294-296. 
Abstract PDF(254KB) ( 299 )   
RelatedCitation | Metrics
For applications consisting of dependent tasks with timing constraints, and taking transition overhead and inter-core communication overhead on variable-voltage multi-core processors into account, an overhead-conscious synthesis method for energy optimization was proposed for real-time multi-core embedded systems. The method effectively combines dynamic power management with adaptive body biasing and dynamic voltage scaling, based on the independent tasks produced by the RDAG algorithm. Simulation experiments were conducted with several random task graphs and task graphs representing real applications on 2 and 3 cores, respectively. The experimental results show that the proposed method outperforms the original method.
Research on Storage Security of Dynamic Distributed Diskless Network Based on PXE
HUANG Guan-li,JIN Yan,GOU Chuan-jing,WANG Ping
Computer Science. 2010, 37 (9): 297-301. 
Abstract PDF(492KB) ( 400 )   
RelatedCitation | Metrics
PXE (Preboot Execution Environment) technology has played a great role thanks to its strong compatibility and ease of maintenance. Network requirements for data safety are becoming more and more important, while PXE-based methods are still lacking in this respect. The applications of diskless technology and a dynamic distributed security system in data storage were analyzed. By designing and studying the PXE technology, the risk of data leakage was reduced and the dependability of network data was improved under strict data management. With the popularization of such applications and their low maintenance cost, this approach may bring considerable economic benefit.
Online Clustering and Detective Cost Based Anomaly Detection Scheme for MANET
WANG Lei-chun,MA Chuan-xiang
Computer Science. 2010, 37 (9): 405-108. 
Abstract PDF(407KB) ( 332 )   
RelatedCitation | Metrics
Mobile ad hoc networks (MANET) are highly vulnerable to attacks and make it difficult to deploy complicated security protocols and algorithms, due to the open medium, dynamically changing network topology, lack of a centralized monitoring and management point, and limited resources. To detect anomalous behaviors in MANET efficiently, this paper proposed an online clustering and detective cost based anomaly detection scheme for MANET, TCDC. In this scheme, TCDC first analyzes and handles access behaviors at a single node using online clustering of access behaviors, and then validates further access behaviors by cooperative detection based on the detective cost among different nodes. Simulation results show that TCDC can efficiently detect anomalous behaviors in MANET with less resource consumption.
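TCDC itself is not specified in enough detail in the abstract to reproduce; as a generic illustration of the single-node online-clustering stage, the sketch below folds each observation into its nearest cluster or flags it when it is far from every existing centroid, with an illustrative radius parameter (the cooperative, detection-cost stage is not shown).

```python
import numpy as np

class OnlineClusterDetector:
    """Generic online clustering sketch: an observation far from every existing
    cluster centroid is flagged as anomalous, otherwise it updates the nearest cluster."""

    def __init__(self, radius=1.0):
        self.radius = radius
        self.centroids = []        # running cluster centers
        self.counts = []           # observations absorbed by each cluster

    def observe(self, x):
        x = np.asarray(x, float)
        if not self.centroids:
            self.centroids.append(x.copy()); self.counts.append(1)
            return False
        dists = [np.linalg.norm(x - c) for c in self.centroids]
        k = int(np.argmin(dists))
        if dists[k] <= self.radius:                  # normal: fold into nearest cluster
            self.counts[k] += 1
            self.centroids[k] += (x - self.centroids[k]) / self.counts[k]
            return False
        self.centroids.append(x.copy()); self.counts.append(1)
        return True                                  # anomalous: starts a new cluster

det = OnlineClusterDetector(radius=0.5)
normal = np.random.default_rng(0).normal(0, 0.1, (50, 3))
flags = [det.observe(v) for v in normal] + [det.observe(np.array([5.0, 5.0, 5.0]))]
```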