Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 38 Issue 10, 16 November 2018
  
Survey on the Networks-on-Chip Interconnection Topologies
WANG Wei,QIAO Lin,TANG Zhi-zhong
Computer Science. 2011, 38 (10): 1-5. 
With advances in devices, processes, and application technology, the chip multiprocessor is becoming the mainstream architecture. As the scale of chip multiprocessors and the number of integrated on-chip cores grow, the network-on-chip, which is dedicated to interconnection and communication among on-chip cores and other components, is becoming one of the performance bottlenecks of the chip multiprocessor. The topology of a network-on-chip defines the physical layout and interconnection pattern of network nodes; it determines the cost, latency, throughput, area, fault tolerance, and power of the network-on-chip, and it affects the network routing policies and the placement-and-routing design of network chips. The topology is therefore one of the key technologies of networks-on-chip. This paper briefly compared various network-on-chip topologies, analysed their performance, and proposed recommendations for future research on network-on-chip topology.
Survey of the P2P Traffic Identification Techniques
LIU San-min, SUN Zhi-xin
Computer Science. 2011, 38 (10): 6-12. 
This paper analysed the importance of P2P traffic identification. By surveying the literature, the existing methods were divided into four kinds according to their identification mechanisms: port-based classification, application-layer signatures, transport-layer features, and machine learning. The study analysed in detail the principles of traffic identification and its critical problems, surveyed recent research at home and abroad together with the measures taken to solve these problems and their advantages and disadvantages, and finally discussed future research trends.
Survey on Access Control Technology of Web Services Composition
SHANG Chao-wang,ZHAO Cheng-ling,LIU Qing-tang,WANG Yan-feng
Computer Science. 2011, 38 (10): 13-15. 
Access control is one of the key technologies for secure and reliable value-added applications of Web services composition. This paper briefly reviewed the state of research on access control in Web services composition environments. We first discussed the challenges of secure Web services composition. Subsequently, we analysed the security problems of Web services composition from a hierarchical perspective. Then we discussed research progress on the key access control technologies from three respects: access control architecture for Web services composition, consistent coordination of atomic security policies, and business process authorization. Finally, conclusions were drawn and the problems to be resolved in future research were pointed out.
Survey of Steiner Tree Problem
ZHENG Ying,WANG Jian-xin,CHEN Jian-er
Computer Science. 2011, 38 (10): 16-22. 
Steiner tree problems are classical NP-hard problems with wide applications in many fields, such as computer network layout, circuit design, and biological network analysis. With the development of parameterized computation theory, the parameterized Steiner tree problem has been proved fixed-parameter tractable not only in undirected graphs but also in the directed case. This paper first introduced the approximation algorithms and parameterized algorithms for the Steiner tree problem in general graphs, then analysed the state of research on some special Steiner tree problems. Moreover, the vertex-weighted Steiner tree problem was also discussed. Finally, some further research directions were proposed.
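The best-known approximation algorithm alluded to above is the classical metric-closure construction: take shortest-path distances between every pair of terminals and build a minimum spanning tree on that complete graph, whose weight is at most twice that of an optimal Steiner tree. A minimal sketch (the graph encoding and function name are illustrative, not from the paper):

```python
def steiner_2approx_weight(n, edges, terminals):
    # Classical MST-based 2-approximation for the Steiner tree problem:
    # 1) compute all-pairs shortest paths (Floyd-Warshall, for simplicity);
    # 2) run Prim's algorithm on the metric closure restricted to the
    #    terminals.  The resulting weight is at most twice the optimum.
    INF = float("inf")
    d = [[INF] * n for _ in range(n)]
    for v in range(n):
        d[v][v] = 0
    for u, v, w in edges:              # undirected weighted edges (u, v, w)
        d[u][v] = min(d[u][v], w)
        d[v][u] = min(d[v][u], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    in_tree = {terminals[0]}           # Prim's MST over the metric closure
    total = 0
    while len(in_tree) < len(terminals):
        w, t = min((d[a][b], b) for a in in_tree
                   for b in terminals if b not in in_tree)
        in_tree.add(t)
        total += w
    return total
```

On a star graph with unit edges from a non-terminal hub to three terminals, the sketch returns 4 while the optimal Steiner tree (which uses the hub) has weight 3, within the factor-2 guarantee.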
Research on Progress of Fuzzy Cognitive Map
MA Nan,YANG Bing-ru,BAO Hong,GUO Jian-wei
Computer Science. 2011, 38 (10): 23-28. 
As an effective tool for knowledge representation and reasoning and a soft computing method, the fuzzy cognitive map (FCM) has become a hot research topic at home and abroad in recent years. An FCM quantifies the causal relationships between concepts. From the perspective of its research development, we summarized the basic framework and reasoning mechanism of FCM, gave a classification of the mainstream FCM research, analysed the main features of its learning algorithms, and put forward possible directions for future research.
Retrospect and Prospect of Decomposition Technology
LI Zhan-shan,HAN Wen-cheng,GUO Ting
Computer Science. 2011, 38 (10): 29-33. 
The constraint satisfaction problem formalism offers a powerful framework of knowledge representation that can model many problems. But constraint satisfaction problems are often NP-hard, so it is very important to use decomposition to reduce the cost of computation. This paper described the importance of decomposition in constraint satisfaction problems, several classic decomposition technologies, and the history of decomposition, and then analysed these technologies. We also introduced and analysed several new decomposition technologies and gave a summary. Finally, we proposed our next research ideas and directions according to the open problems in these technologies.
Organizational Evolution-based ABC Supported Unicast Routing Scheme
WANG Xing-wei,SUN Yong-jian,JIANG Ding-de,HUANG Min
Computer Science. 2011, 38 (10): 34-38. 
An ABC (Always Best Connected) supported QoS (Quality of Service) unicast routing scheme was proposed. In the proposed scheme, intervals were used to describe user QoS requirements and network link parameters; preference sequences were introduced to reflect users' preferences for different types of networks; probability density functions, satisfaction functions, and evaluation functions were adopted to overcome the difficulties of accurately measuring network link parameter values and exactly expressing user QoS requirements; cost, price, billing, and gaming were used to balance the profits of both the user and the network provider; finally, an OEA (Organizational Evolutionary Algorithm) was used to find the QoS unicast path that achieves or approaches a Pareto optimum under Nash equilibrium among all parties' utilities. Simulation results show that the proposed scheme is both feasible and effective.
Secure E-commerce Payment Protocol Based on Four Parties
GAN Zao-bin,XIAO Shi-cheng,LI Kai,XIAO Guo-qiang
Computer Science. 2011, 38 (10): 39-44. 
With the rapid development and popularization of electronic commerce, secure payment has become more and more important and is a key technology affecting the development of e-commerce. Aiming at the standard Secure Electronic Transaction (SET) protocol, some improvements were made in order to ensure goods atomicity and certified delivery. This paper then proposed a secure e-commerce payment protocol based on four parties that can not only support goods atomicity and certified delivery, but also collect key electronic evidence automatically and handle transaction disputes. In the meantime, a formal description of the protocol was presented. Finally, the security of the proposed protocol was compared and analyzed.
Dynamic Deployment of Nodes in Wireless Sensor Networks
LIAO Zhuo-fan,WANG Jian-xin,LIANG Jun-bin
Computer Science. 2011, 38 (10): 45-50. 
A Wireless Sensor Network (WSN) is a self-organized network consisting of many sensors with limited resources. One fundamental issue in WSNs is the deployment of nodes, which affects the network's construction cost, coverage quality, topology, routing, etc. As a new technology, dynamic deployment of nodes makes WSNs more applicable in many scenarios. This paper first introduced self-deployment of nodes, covering recent valuable theories and algorithms that focus on uniform distribution of nodes. Then typical robot-assisted deployment strategies, which aim at robot movement control and network maintenance, were described according to the different phases of a WSN. More specifically, the advantages and disadvantages of these algorithms were summarized. Finally, key design points and future research directions were put forward.
Geographic Routing Algorithm Based on Link Quality in Mobile Ad-hoc Networks
HONG Lei,HUANG Bo,ZHAO Chun-xia
Computer Science. 2011, 38 (10): 51-54. 
How to design a simple routing mechanism that enables nodes to transfer packets efficiently within a short time is a basic problem in research on mobile Ad-hoc networks. To address the high bit error rates and weak anti-interference capability of wireless links, link quality was proposed as a new metric for route selection, and a geographic routing algorithm based on link quality, called LQPR, was designed and implemented; it solves the problem that the packet delivery ratio of the traditional greedy algorithm declines on non-ideal wireless links. The LQPR algorithm, which combines an LQ mode and a Perimeter mode, guides data forwarding using geographic information obtained by localization techniques, and has such advantages as low control overhead, optimal path selection, and efficient transmission. The proposed routing protocol was simulated in NS-2. By evaluating and comparing the results in terms of average end-to-end delay, aggregate throughput, and delivery success rate, LQPR was validated with the simulation data.
Security Analysis of Access Control Policy Based on Predicate Abstract and Verification Space Division
WANG Chang-da,HUA Ming-hui,ZHOU Cong-hua,SONG Xiang-mei,JU Shi-guang
Computer Science. 2011, 38 (10): 55-59. 
In order to perform security analysis of access control policies rapidly, predicate abstraction with verification space division was presented, i.e., the analysis of the original state machine model is transferred to an abstract state machine model that contains fewer states. Furthermore, verification space division was introduced to decrease the dimensions of model checking. Supported by both theoretical analysis and experiments, the time and space requirements are effectively reduced. Compared with known methods, our methodology is more efficient and requires less human interaction.
P2P Network Virus Detection Model Based on Immune Collaboration
CHENG Chun-ling,CHAI Qian,XU Xiao-long,XIONG Jing-yi
Computer Science. 2011, 38 (10): 60-63. 
P2P provides the advantages of convenient resource sharing and direct communication; meanwhile, viruses obtain more convenient channels for spreading and infection. A peer-to-peer network virus detection model based on immune collaboration was proposed. The collaborative nature of peers is used to share memory detectors, and a suspected-virus database is established on central peers to reduce the false positive rate and the miss rate. In the detector generation phase, an improved negative selection algorithm (NSA) based on double maturation was proposed to reduce the redundancy of the detector set. In the virus detection phase, a multi-immune P2P virus detection algorithm can detect unknown viruses. Simulation results show that the improved algorithm can reduce the number of detectors and improve detection performance. Meanwhile, memory detectors can be quickly shared among peers, so the security of the P2P network is improved.
Security Protocol for Protecting the P2P Trust Information Management
HU Jian-li,ZHOU Bin,WU Quan-yuan
Computer Science. 2011, 38 (10): 64-67. 
Most previous work concentrates on how to rate peers' behaviors with trust mechanisms and how to establish reputation-based trust management systems (TMSs), but gives less consideration to the security problems within the TMS itself. To address this issue, this paper proposed a security protocol for protecting P2P trust information management (SPRI). Theoretical analysis and simulation experiments show that SPRI can effectively suppress Sybil attackers and peers that tamper with trust information in transit, with little time overhead. Moreover, it can easily be integrated into various reputation-based TMSs.
Research on Fault Analysis against RSA Based on Fault in CRT Combination Operation
CHEN Cai-sen,WANG Tao,KOU Ying-zhan,ZHANG Jin-zhong
Computer Science. 2011, 38 (10): 68-71. 
Former fault analyses cannot attack RSA-CRT implementations protected by corresponding countermeasures. In order to find a new vulnerability to fault analysis, this paper took Shamir's countermeasure as the analyzed object. An attack model based on a fault in the CRT combination operation was advanced, and a differential fault analysis algorithm that can completely recover the RSA key was given. It was demonstrated that the previous countermeasures cannot effectively resist this differential fault analysis, and the complexity of the attack was estimated both by theoretical analysis and by software simulation. Experimental results show that the new fault analysis algorithm is highly feasible: it requires only two fault injections for a permanent fault, and an improved key-searching scheme for random faults was advanced. Finally, by analyzing the problems of previous countermeasures, corresponding advice on countermeasures against differential fault analysis was given.
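For background, the classical RSA-CRT fault attack that countermeasures such as Shamir's were introduced to block works by corrupting one half-signature, so that the CRT-combined result leaks a prime factor through a gcd. This toy sketch (artificially small primes; Python 3.8+ for modular inverses via `pow`) illustrates only that underlying principle, not the paper's attack on the combination step itself:

```python
from math import gcd

# Toy RSA-CRT key (tiny primes for illustration; real keys are 1024+ bits).
p, q = 1009, 1013
N = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))
dp, dq = d % (p - 1), d % (q - 1)

def crt_combine(sp, sq):
    # Combine the two half-signatures into a signature mod N via CRT.
    return (sq + q * (((sp - sq) * pow(q, -1, p)) % p)) % N

m = 42
sp, sq = pow(m, dp, p), pow(m, dq, q)   # half-signatures mod p and mod q
s = crt_combine(sp, sq)
assert pow(s, e, N) == m                # the fault-free signature verifies

# A fault in the mod-p half leaves the result correct mod q but wrong
# mod p, so gcd(s'^e - m, N) exposes the secret factor q.
s_bad = crt_combine(sp ^ 1, sq)
assert gcd(pow(s_bad, e, N) - m, N) == q
```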
Improvement of Hierarchical Failure Detection Algorithm in Distributed Systems
XU Guang-xia,CHEN Shu-yu
Computer Science. 2011, 38 (10): 72-74. 
Aiming at the accuracy and efficiency problems of hierarchical failure detection in distributed systems, this paper proposed an improved hierarchical failure detection algorithm based on Chen's prediction algorithm, guided by a hierarchical failure detection mechanism at the object level, process level, and host level. In distributed systems, traditional hierarchical failure detection methods often encounter problems such as single points of failure and detection delay. The authors proposed that, when layering, detection messages in a local area network be limited within the group, and that different nodes of a group take on different inter-group detection duties. A trust variable and a correction scale factor were added to the improved algorithm. To simulate the complexity of large-scale networks, the network load was increased so as to increase network delay, and the experimental verification of the algorithm was completed. The experimental results demonstrate that the improved algorithm raises the accuracy and efficiency of failure detection and reduces the misdiagnosis rate, offering a research basis for further optimization of failure detection methods.
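Chen's prediction algorithm mentioned above estimates the next heartbeat arrival as the average drift of recent arrivals plus one sending interval, and adds a safety margin to obtain the timeout. A minimal sketch under the simplifying assumption of a known, fixed sending interval (function and parameter names are illustrative):

```python
def chen_next_timeout(arrivals, interval, alpha):
    # Chen's estimation: the expected arrival of heartbeat n is the mean of
    # (A_i - i*interval) over the n observed arrivals A_0..A_{n-1}, shifted
    # forward by n*interval; alpha is a fixed safety margin added on top.
    n = len(arrivals)
    ea = sum(t - i * interval for i, t in enumerate(arrivals)) / n + n * interval
    return ea + alpha
```

With perfectly periodic arrivals at 0, 1, 2, 3 s, a 1 s interval, and a 0.5 s margin, the next timeout is 4.5 s; jittery arrivals shift the estimate by their mean drift.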
Distributed Key Management for Ad-hoc Network Based on CPK
ZHANG Yu-chen,WANG Ya-di,HAN Ji-hong,FAN Yu-dan
Computer Science. 2011, 38 (10): 75-77. 
A distributed key management scheme for Mobile Ad-hoc Networks (MANETs) was proposed, which combines Combined Public Key (CPK) cryptography with threshold secret sharing. Four aspects were depicted in detail: system initialization, design of the management platform, acquisition of nodes' private/public keys, and updating of the private-matrix shares. Analysis shows that the scheme is secure, extensible, concise, and practical, so it is especially suitable for the characteristics of MANETs.
Research on Dynamic Service Composition Based on QoS
WU Hai-bo,DENG Mu-sheng,CHEN Xin-xi
Computer Science. 2011, 38 (10): 78-80. 
Web Services technology has been developing rapidly in recent years, but its adoption rate has remained low because of the QoS problems of Web Services. Therefore, this paper analysed the QoS problems of Web Services in dynamic Web Services composition, and presented a QoS computation model and a QoS optimization method. The scheme supports dynamic binding of Web Services, automatic and semi-automatic execution, QoS-based optimal selection of Web Services, and so on. Lastly, a prototype and its implementation were introduced.
Pseudo Random Generator Based on Chaotic Maps
QIU Jing,WANG Ping,XIAO Di,LIAO Xiao-feng
Computer Science. 2011, 38 (10): 81-83. 
A new pseudorandom number generator based on the piecewise linear chaotic map (PWLCM) was proposed. The proposed scheme can overcome the defect of piecewise linearity when using a PWLCM to generate pseudorandom sequences. Theoretical analysis and computer simulation indicate that the proposed pseudorandom generator has good cryptographic properties.
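A generator of the general kind described can be sketched as follows. The map is the standard piecewise linear chaotic map; the seed, control parameter, burn-in length, and threshold bit extraction are illustrative assumptions rather than the authors' actual construction:

```python
def pwlcm(x, p):
    # Standard piecewise linear chaotic map on [0, 1) with 0 < p < 0.5.
    if x < p:
        return x / p
    if x < 0.5:
        return (x - p) / (0.5 - p)
    if x < 1.0 - p:
        return (1.0 - p - x) / (0.5 - p)   # mirror of the middle segment
    return (1.0 - x) / p                   # mirror of the first segment

def prng_bits(seed, p, n, burn_in=100):
    # Discard a transient, then threshold the state at 0.5 to extract
    # one bit per iteration (a deliberately simple extraction rule).
    x = seed
    for _ in range(burn_in):
        x = pwlcm(x, p)
    bits = []
    for _ in range(n):
        x = pwlcm(x, p)
        bits.append(1 if x >= 0.5 else 0)
    return bits
```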
Improved Mixed Video Watermarking Scheme
WU Geng-rui,YANG Xiao-yuan, NIU Ke
Computer Science. 2011, 38 (10): 84-86. 
An improved mixed video watermarking scheme was proposed. By embedding a robust watermark and a fragile watermark into the mid-frequency AC coefficients of DCT macroblocks and the horizontal components of motion vectors, respectively, to achieve the goal of hiding secret and verification information in the video, we can protect the video copyright while providing integrity certification. The results show that, compared with the mixed video watermarking scheme proposed by Qiu et al., when embedding the robust watermark the code rate decreases by 6% on average and the scheme has better resistance to attacks, while the fragile watermark is more sensitive to attacks.
Parameters Optimization and Sensitivity Analysis Based on Particle Swarm Optimization Algorithm in Cognitive Radios
FENG Wen-jiang,LI Jun-jian,WANG Pin
Computer Science. 2011, 38 (10): 87-90. 
Cognitive radio can adaptively adjust its operating parameters according to users' needs and changes in the environment. Most existing cognitive engines use genetic algorithms to optimize parameters; however, as the number of cognitive users grows, the increased number of chromosomes results in long convergence times for the genetic algorithm, which cannot meet the needs of real-time communication. An improved inertia-factor particle swarm optimization was used for parameter optimization in cognitive radio, and a sensitivity analysis of the transmission parameters was performed in different communication modes, so as to selectively remove low-sensitivity parameters from the objective function and reduce the processing complexity. Simulation results show that parameter optimization based on particle swarm optimization has better convergence, efficiency, and stability than the genetic algorithm, and can find the optimal parameter solution in fewer evolution generations, reducing the optimization time and meeting the real-time processing requirements of cognitive radio.
Towards a Logical Framework of Composing Attribute-based Access Control Policies
KE Ke,LI Ou,XU Chang-zhen
Computer Science. 2011, 38 (10): 91-95. 
In a multi-domain environment, the composition of access control policies is the key to managing aggregated resources when several domains are organized to form a new one. To formally express such composition and guarantee its correctness, a logical framework for composing policies was proposed. The framework is described at the attribute level. It not only enriches the existing algebraic models but can also express dynamic composition scenarios that they do not support. Several examples were introduced to demonstrate its expressive power. The framework includes a logic deduction system that is sound; based on this system, a compound policy can be formally verified to check whether it meets each party's protection needs. Finally, how to evaluate a compound policy for an access request to an aggregated resource was discussed.
Id-based Private-key Management Scheme in Distributed Satellite Network
WU Yang,JIAO Wen-cheng,PAN Yan-hui,LI Hua
Computer Science. 2011, 38 (10): 96-99. 
In order to prevent data modification attacks during the private-key component update of satellite network nodes, we presented a double-encryption scheme that ensures the integrity of data during the private-key component update. At the same time, to prevent floods of private-key component update requests from malicious nodes with normal identities, a mechanism was provided to judge the validity of the update request time, which can resist denial-of-service attacks from such nodes. Finally, we compared the security and computational complexity of the scheme, showing that at a low computational cost the proposed scheme can resist both data modification attacks and denial-of-service attacks.
Formal Approaches for Analyzing Timing Attacks
WANG Yin-long,ZHAO Qiang,LIN Ke-cheng,LI Zhi-xiang,WANG Xi-wu ,DENG Gao-ming
Computer Science. 2011, 38 (10): 100-102. 
Side-channel attacks take advantage of physical characteristics leaking from the side channels of a cipher device's implementation to recover the key or other secret parameters involved in the computation running in the device, blazing a path distinct from conventional cryptanalysis. Equivalence relations and equivalence classes were adopted for formal qualitative analysis of timing attacks, one type of side-channel attack, and a measurement method based on information entropy was adopted for quantitative evaluation of timing-attack capability. Formal analysis was conducted on timing attacks against the RSA binary modular exponentiation algorithm, indicating that formal analysis of timing attacks can make the attack procedure intuitive and accurate, thus providing a valuable reference for the formal description of other side-channel attack approaches.
Multi-watermarks Technology in Software Copyright Management
LUO Yang-xia,FANG Ding-yi
Computer Science. 2011, 38 (10): 103-109. 
To enhance the robustness of software watermarking, the notion of the image "multi-watermark" was introduced into software; the multi-watermark of software was defined, and its combination model and optimization method were analysed. On this basis, a multi-watermarking-based model of software copyright protection was proposed, and the original algorithms were improved: meaningful preprocessing of the software watermark, fingerprint embedding through dynamic obfuscation, and interactive tamper detection improve the robustness of the software watermark. Experiments show that the model performs well in preventing static analysis and dynamic tracking, in protecting against reverse engineering, and in preserving the integrity of the software watermark.
Failure Rate Function of Heuristic-based Active Queue Management
FAN Xun-li,WANG Jie,GUAN Lin,ZHAO Jian,GAO Li
Computer Science. 2011, 38 (10): 110-112. 
This paper studied the relationship between the changing rate of the dropping probability and queue stability, and specifically examined the dropping-probability functions of the Adaptive Random Early Detection (ARED) algorithm and the Random Exponent Marking (REM) algorithm. Based on a heuristic packet-loss approach, the paper proposed Heuristic-based Failure-rate ARED (HFA), which applies a heuristic failure rate function in ARED to estimate the packet dropping function. With the proposed failure-rate dropping function, the packet-dropping performance of HFA is similar to that of ARED and REM under light traffic load. Under heavy load, however, the proposed algorithm not only keeps the packet dropping rate and its variance lower, but also stabilizes the instantaneous queue length around the target length and distinctly reduces the jitter of the queue length. Simulation results demonstrate that the HFA algorithm outperforms ARED and REM in three respects: instantaneous queue length, packet dropping rate, and jitter.
Delay Aware Power Management Scheme for WLANs
XU Lin,ZHU Yi-hua,HU Hua
Computer Science. 2011, 38 (10): 113-116. 
Most nodes in a wireless local area network (WLAN) are powered by batteries due to mobility requirements, so power conservation is significant for prolonging their run-time. Power management (PM) is specified in the IEEE 802.11 standard, which allows nodes to enter doze mode to save power when they are not engaged in data delivery. The PM was modeled and analyzed so that the packet delay, the number of successive dozes, the active time, and the number of switches between active and doze modes were all derived. A delay-aware power management (DAPM) scheme, suitable for IEEE 802.11 infrastructure WLANs, was then proposed and analyzed. DAPM helps nodes seek the optimal dozing period so that power consumption is minimized while the given delay requirement is met.
Security Vulnerabilities Analysis Technology Based on Multi-core Architecture
JIAO Wan-ni,WU Kai-gui
Computer Science. 2011, 38 (10): 117-120. 
When conducting vulnerability analysis and testing in military enterprise networks, it is difficult to obtain vulnerability information, and the analysis and verification process takes a long time with relatively low precision. As traditional vulnerability analysis and testing techniques cannot solve this problem well, this paper proposed a vulnerability analysis platform based on a multi-core architecture. Schemes for vulnerability detection and analysis, heuristic penetration testing, and dynamic patch creation and installation were combined with a software architecture based on multi-core processors to implement parallel processing of vulnerability analysis, testing, and control. The implementation of and experiments on a prototype system prove that the platform can discover, verify, and test hidden vulnerabilities and produce patches in time for the digital manufacturing networks of military enterprises. It also lays a solid foundation for trustworthy computing networks and security level estimation.
Public-key Encryption Based on Extending Discrete Chebyshev Polynomials' Definition Domain to Real Number
CHEN Yu,WEI Peng-cheng
Computer Science. 2011, 38 (10): 121-122. 
By combining Chebyshev polynomials with modular arithmetic and extending the Chebyshev polynomials' definition domain to the real numbers, some conclusions were drawn through theoretical verification and data analysis. Making use of the framework of the traditional public-key algorithms RSA and ElGamal, a chaotic public-key encryption algorithm based on extending discrete Chebyshev polynomials' definition domain to the real numbers was proposed. Its security is based on the intractability of the integer factorization problem, as with RSA, and it is able to resist the chosen-ciphertext attack that works against RSA while remaining easy to implement.
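The algebraic fact underlying Chebyshev-polynomial public-key schemes is the semigroup property T_r(T_s(x)) = T_{rs}(x), which survives reduction modulo N and plays the role that exponentiation plays in RSA and ElGamal. A sketch of evaluating T_n(x) mod N in O(log n) steps via the recurrence T_k = 2x·T_{k-1} − T_{k-2} (this shows the primitive only, not the paper's full encryption algorithm):

```python
def cheb(n, x, N):
    # T_n(x) mod N via fast exponentiation of the 2x2 companion matrix of
    # the recurrence T_k = 2x*T_{k-1} - T_{k-2}, with T_0 = 1, T_1 = x.
    if n == 0:
        return 1 % N
    def mul(A, B):
        return [[(A[0][0] * B[0][0] + A[0][1] * B[1][0]) % N,
                 (A[0][0] * B[0][1] + A[0][1] * B[1][1]) % N],
                [(A[1][0] * B[0][0] + A[1][1] * B[1][0]) % N,
                 (A[1][0] * B[0][1] + A[1][1] * B[1][1]) % N]]
    R = [[1, 0], [0, 1]]                  # identity
    M = [[2 * x % N, (-1) % N], [1, 0]]   # advances (T_k, T_{k-1})
    k = n - 1
    while k:                              # square-and-multiply: R = M^(n-1)
        if k & 1:
            R = mul(R, M)
        M = mul(M, M)
        k >>= 1
    return (R[0][0] * x + R[0][1]) % N    # apply M^(n-1) to (T_1, T_0)
```

The semigroup property `cheb(r, cheb(s, x, N), N) == cheb(r * s, x, N)` is what lets one party apply T_r and another T_s in either order, mirroring commutative exponentiation.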
Threshold Signature Scheme with Tracking Identity
ZOU Xiu-bin,HAN Lan-sheng,FU Cai
Computer Science. 2011, 38 (10): 123-126. 
The author presented a threshold signature scheme with identity tracking. In the scheme there are a trusted center named PKU, a secret dealer, a signature combiner, and n members (any t of whom can take part in a (t, n) threshold signature). The traceable identity information includes not only the dealer's identity but also the identities of the members who take part in the threshold signature together. The dealer can control the production of partial signatures and easily revoke a member's signing power. It can be verified whether a partial signature was made by some member substituting for the dealer. Moreover, the identities of the members who participate in a threshold signature can also be tracked.
UbiCloud : A Cloud Computing System for Ubiquitous Terminals
CHEN Yuan,CUI Li, ZHU Zhen-min,ZENG Yi,WU Yuan-kun
Computer Science. 2011, 38 (10): 127-132. 
A cloud computing system, UbiCloud, was developed to enable ubiquitous terminals to access powerful and reliable computing resources anywhere and anytime by building a virtual computing environment between front-end ubiquitous terminals and back-end servers (the cloud). The experimental results show that the system's performance is good enough to support the deployment of most applications on resource-poor terminals.
From the Definition of Aspect-oriented Programming to Aspect-oriented Programming Languages
GU Si-shan,CAI Shu-bin,LI Shi-xian
Computer Science. 2011, 38 (10): 133-139. 
Today many people, from both industry and academia, simply take Aspect-Oriented Programming (AOP) to be the modularization of crosscutting concerns and, in a narrow-minded way, believe that AOP is just an extension of or an effective supplement to Object-Oriented Programming (OOP). Based on the definition of AOP, the nature that distinguishes it from other programming paradigms was dug out. The quantified statements and aspects in the definition were formalized and their semantics defined. We then argued that quantification and obliviousness in the definition are the real nature of AOP, and that modularizing crosscutting concerns is just a benefit derived from them: AOP is a new programming paradigm that is independent of all other programming languages. Based on the definition, the minimum set of conditions which Aspect-Oriented Programming Languages (AOPLs) need to satisfy was proposed, and the differences between the mainstream AOPLs were probed from the viewpoint of the definition.
Access Control Rule Description Based on Logic Unify
HAN Dao-jun,HUANG Ze-long,ZHAI Hao-liang,LI Lei
Computer Science. 2011, 38 (10): 140-144. 
Traditional methods are hard-pressed to describe the subsumption relationships of subjects and objects among access control rules. In this paper, we built an algorithm based on logic unification to resolve this problem. First, we converted an access control request into a logic question and obtained the access control result by means of the logic answer. Next, we used facts to describe each component of access control, and we realized flexible access control by dynamically instantiating the variables in non-ground facts while the system is running. Finally, our experimental results show that the algorithm is effective.
Improved Design and Implementation of T-CBESD Based on On-the-Fly Verification Methods
GUO Li-juan,HU Jun,ZHANG Jian
Computer Science. 2011, 38 (10): 145-151. 
Model-based techniques for system design and analysis can effectively satisfy the high reliability requirements of modern embedded software systems. In this paper, an improved version of the prototype T-CBESD was designed and implemented based on an on-the-fly verification mechanism. Specifically, a graphical modeling environment was provided by integrating Topcased and JFLAP into the T-CBESD framework, and pre-translation transformation algorithms were also designed. The data structures of the state space were redesigned, and several kinds of on-the-fly consistency verification algorithms were designed and implemented, including analysis and verification frameworks for functional and non-functional system behaviors. Moreover, an example using the improved version of T-CBESD was shown.
MXDR, Distributed Information Retrieval for Multi-XML Document Based on Keywords
LI Xia,LI Zhan-huai,ZHANG Li-jun,CHEN Qun,LI Ning
Computer Science. 2011, 38 (10): 152-156. 
Abstract PDF(428KB) ( 530 )   
RelatedCitation | Metrics
The emergence of the Web has increased interest in XML data. Keyword search has attracted a great deal of attention for retrieving XML data because it is a user-friendly mechanism. But keyword search finds it hard to directly improve search quality because many keyword-matched nodes may not contribute to the results. A more important issue is that current studies focus on single-XML retrieval and lack practicability. To address this challenge, this article proposed a new approach for automatically correcting queries over multiple XML documents, called MXDR (Multi-XML Distributed Retrieval). We first classified multiple XML documents by a clustering method and elicited their common structure information, then generated certifiable structured queries by analyzing the given keyword query and the common structure information of the XML datasets. The generated structured queries can be evaluated over the XML data sources with any existing structure search engine. We conducted an experimental study on real-life multi-XML datasets. The experimental results show that MXDR is effective and efficient in supporting structural queries compared with existing proposals.
△-tree Based Similarity Join Algorithm for High-dimensional Data
LIU Yan,HAO Zhong-xiao
Computer Science. 2011, 38 (10): 157-160. 
Abstract PDF(308KB) ( 622 )   
RelatedCitation | Metrics
Similarity joins are used in a variety of fields, such as clustering, text mining, and multimedia databases. In order to solve the problem of high-dimensional similarity joins in a main-memory environment, a novel similarity join algorithm called △-tree-join*, which can efficiently join two different datasets based on the △-tree, was presented. The △-tree has been proven to be an efficient index method in main memory. △-tree-join* adopts a top-down join scheme and makes full use of the properties of the △-tree to compute the distances between clusters and between a point and a cluster with a smaller number of dimensions, so as to filter unnecessary nodes or points, reduce computation and improve join efficiency. Experiments on both a synthetic clustered dataset and real datasets were conducted, and the results demonstrate that △-tree-join* is well suited for main-memory similarity joins and performs well compared with the two state-of-the-art similarity join methods EGO and EGO*.
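For readers unfamiliar with the underlying problem, a similarity join pairs every point of one set with every sufficiently close point of another. The index-free brute-force version (a minimal sketch with a hypothetical function name, not the paper's △-tree-join* algorithm) can be written as:

```python
from itertools import product

def similarity_join(r, s, eps):
    """Brute-force epsilon-join: all pairs (x, y), x in r, y in s,
    whose Euclidean distance is at most eps."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    return [(x, y) for x, y in product(r, s) if dist(x, y) <= eps]

pairs = similarity_join([(0, 0), (5, 5)], [(0, 1), (9, 9)], eps=1.5)
# only (0,0)-(0,1) lies within eps
```

Index structures such as the △-tree exist precisely to prune most of the pairs this O(|r|·|s|) loop examines.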
Prefetching T-Tree: A Cache-optimized Main Memory Database Index Structure
YANG Zhao-hui,WANG Li-song
Computer Science. 2011, 38 (10): 161-165. 
Abstract PDF(414KB) ( 683 )   
RelatedCitation | Metrics
As the speed gap between main memory and modern processors continues to widen, memory access has become the main bottleneck of processing, so cache behavior becomes more important for main memory database systems (MMDBs). Indexing is a key component of MMDBs. We proposed a cache-optimized index, the Prefetching T-tree (pT-tree), based on the CST-tree index, which applies prefetching to the CST-tree to accelerate search operations. The pT-tree uses prefetching to effectively create wide nodes that are larger than the natural data transfer size. These wider nodes reduce the height of the CST-tree, thereby decreasing the number of expensive cache misses incurred when going from parent to child. The experimental performance study shows that our pT-tree provides better search performance than B+-trees, T-trees, CST-trees and Cache Sensitive B+-trees.
Research on Parallel k-means Algorithm Design Based on Hadoop Platform
ZHAO Wei-zhong,MA Hui-fang,FU Yan-xiang,SHI Zhong-zhi
Computer Science. 2011, 38 (10): 166-168. 
Abstract PDF(328KB) ( 918 )   
RelatedCitation | Metrics
In the past decades, data clustering has been studied extensively, and a mass of methods and theories has been developed. However, with the development of databases and the popularity of the Internet, research on data clustering faces new challenges such as massive data and new computing environments. We conducted an in-depth study of a parallel k-means algorithm based on Hadoop, a new cloud computing platform, and showed how to design parallel k-means algorithms on Hadoop. Experiments on datasets of different sizes demonstrate that the proposed algorithm shows good speedup, scaleup and sizeup performance, and thus fits data clustering on huge datasets.
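The map/reduce decomposition of one k-means iteration can be sketched in-memory as follows (hypothetical function names; a minimal sketch of the idea, not the paper's Hadoop implementation): the map phase assigns each point to its nearest centroid, and the reduce phase recomputes each centroid as the mean of its assigned points.

```python
from collections import defaultdict

def kmeans_map(points, centroids):
    """Map step: emit (nearest-centroid-index, point) pairs."""
    out = []
    for p in points:
        idx = min(range(len(centroids)),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
        out.append((idx, p))
    return out

def kmeans_reduce(pairs):
    """Reduce step: recompute each centroid as the mean of its points."""
    groups = defaultdict(list)
    for idx, p in pairs:
        groups[idx].append(p)
    return {idx: tuple(sum(c) / len(pts) for c in zip(*pts))
            for idx, pts in groups.items()}

pairs = kmeans_map([(0.0, 0.0), (1.0, 1.0), (10.0, 10.0)], [(0.0, 0.0), (10.0, 10.0)])
new_centroids = kmeans_reduce(pairs)   # {0: (0.5, 0.5), 1: (10.0, 10.0)}
```

On Hadoop, the map and reduce functions run in parallel over data splits; iterating them until the centroids stabilize yields the full clustering.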
Protein Secondary Structure Prediction Algorithm Based on Mixed-SVM Method
SUI Hai-feng,QU Wu,QIAN Wen-bin ,YANG Bing-ru
Computer Science. 2011, 38 (10): 169-173. 
Abstract PDF(499KB) ( 926 )   
RelatedCitation | Metrics
Protein secondary structure prediction is one of the most important problems in bioinformatics, and its accuracy plays an important role in protein structure research. In this paper, using Knowledge Discovery Theory based on the Inner Cognitive Mechanism (KDTICM), an efficient protein secondary structure prediction algorithm based on a mixed-SVM (support vector machine) approach was proposed. The algorithm makes full use of the evolutionary information contained in the physicochemical properties of each amino acid and a position-specific scoring matrix generated by a PSI-SEARCH multiple sequence alignment, so secondary structure can be predicted with significantly increased accuracy. Finally, experiments were used to show the superior accuracy and generality of the new algorithm over other classical algorithms.
Research on the Missing Attribute Value Data-oriented Decision Tree
QIU Yun-fei,LI Xue,WANG Jian-kun,SHAO Liang-shan
Computer Science. 2011, 38 (10): 174-176. 
Abstract PDF(240KB) ( 779 )   
RelatedCitation | Metrics
Among the existing methods for selecting a decision tree's test attributes, no report such as "let missing-data processing be integrated into the selection process of test attributes" can be found; moreover, the existing methods for processing data with missing attribute values introduce bias to different degrees. Based on this, an information gain rate based on combination entropy was proposed as the selection criterion for the decision tree's test attributes, which can eliminate the influence of missing-value attributes on test attribute selection, and contrast experiments were carried out on WEKA. Experimental results indicate that the improvement can significantly increase the overall efficiency and classification accuracy of the algorithm.
Algorithm for Outlier Detection in Large Dataset Based on Weighted KNN
WANG Qian,YANG Zheng-kuan
Computer Science. 2011, 38 (10): 177-180. 
Abstract PDF(345KB) ( 831 )   
RelatedCitation | Metrics
Traditional KNN is an advanced distance-based outlier detection algorithm for large datasets. However, it only uses the k-th nearest neighbor as the criterion for an outlier, which is inaccurate under certain conditions. This paper presented a weighted KNN outlier detection algorithm for large datasets. In this algorithm, a weight factor is introduced that represents the average distance of a point's k nearest neighbors. The outliers are those having the largest distance to their k-th neighbor and, under the same condition, the biggest weight. The algorithm improves the accuracy of outlier detection. Experimental results show that the algorithm is feasible compared with the traditional KNN.
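The scoring idea above can be sketched as follows (a hypothetical, brute-force illustration of k-th-NN distance plus average-distance weight, not the paper's large-dataset algorithm): rank points by their k-th nearest-neighbor distance, breaking ties with the mean distance to all k neighbors.

```python
def knn_outlier_scores(data, k):
    """Score each point by (k-th NN distance, mean k-NN distance, index),
    sorted so that the strongest outlier candidate comes first."""
    scores = []
    for i, p in enumerate(data):
        dists = sorted(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                       for j, q in enumerate(data) if j != i)
        knn = dists[:k]
        scores.append((knn[-1], sum(knn) / k, i))
    return sorted(scores, reverse=True)

data = [(0, 0), (0, 1), (1, 0), (10, 10)]
top = knn_outlier_scores(data, k=2)[0]
# the isolated point (10, 10) at index 3 ranks first
```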
Analysis about Parameters Selection of PSO Based on Cluster-degree
DOU Quan-sheng,SHI Zhong-zhi,JIANG Ping,LI Guo-jiang
Computer Science. 2011, 38 (10): 181-183. 
Abstract PDF(232KB) ( 627 )   
RelatedCitation | Metrics
The particle trajectory of PSO was fully analyzed in this paper, the influence of random parameters on the particle trajectory was discussed, the concept of cluster-degree was put forward, and the distribution status of particles with different cluster-degrees was studied. A reasonable parameter setting range based on cluster-degree was proposed; at the same time, a reinforcement strategy for particle velocity was proposed to improve the performance of PSO under certain conditions. This paper is therefore helpful for choosing and adjusting PSO parameters in practical applications.
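For context, the parameters whose settings such analyses concern are the inertia weight w and the acceleration coefficients c1, c2 in the standard PSO velocity update. A minimal sketch (hypothetical function name and parameter values; not the paper's reinforced variant):

```python
import random

def pso(f, dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO minimizing f: w is the inertia weight, c1/c2 the
    cognitive/social acceleration coefficients."""
    rnd = random.Random(seed)
    xs = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x), dim=2)  # converges near the origin
```

Whether the swarm contracts or diverges depends directly on (w, c1, c2), which is why principled ranges for them, such as the cluster-degree analysis above, matter in practice.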
Parallel Text Categorization of Massive Text Based on Hadoop
XIANG Xiao-jun,GAO Yang,SHANG Lin,YANG Yu-bin
Computer Science. 2011, 38 (10): 184-188. 
Abstract PDF(402KB) ( 798 )   
RelatedCitation | Metrics
In recent years there have been extensive studies and rapid progress in automatic text categorization, one of the hotspots and key techniques in information retrieval and data mining. As text data grows exponentially, effectively managing such large data stores requires efficient algorithms that process them in a distributed environment. In this paper, we implemented a simple and effective text categorization algorithm on Hadoop: a TF-IDF classifier based on the vector space model, with cosine similarity as the metric. Experiments on two datasets show that the parallel algorithm is effective on large data stores and can be applied in practical application fields.
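The two building blocks named above, TF-IDF weighting and cosine similarity, can be sketched serially as follows (hypothetical function names; the parallel Hadoop version distributes the same computation over document splits):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF vectors (as sparse dicts) for a list of tokenized documents."""
    df = Counter(t for d in docs for t in set(d))   # document frequency
    n = len(docs)
    vecs = []
    for d in docs:
        tf = Counter(d)
        vecs.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = [["grey", "model", "model"], ["grey", "theory"], ["neural", "network"]]
vecs = tfidf_vectors(docs)
# doc 0 is more similar to doc 1 (shared "grey") than to doc 2 (no overlap)
```

A TF-IDF classifier then assigns a test document to the category of its most cosine-similar training documents.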
Backward P-reasoning and Attribution Residual Discovery-Application
LIN Hong-kang,FAN Cheng-xian,SHI Kai-quan
Computer Science. 2011, 38 (10): 189-193. 
Abstract PDF(462KB) ( 574 )   
RelatedCitation | Metrics
Novel Interval Valued Vague Sets and its Application in Pattern Recognition
ZHANG Zhen-hua,YANG Jing-yu,YE You-pei,ZHANG Qian-sheng
Computer Science. 2011, 38 (10): 194-198. 
Abstract PDF(388KB) ( 534 )   
RelatedCitation | Metrics
First, it was proved that interval-valued vague sets are not a generalization of vague sets under the conventional definition, and a novel interval-valued vague set was presented. Then we proved that vague sets and interval-valued vague sets are both special cases of the novel interval-valued vague sets, and proposed the concept of interval-valued vague sets with parameters (IVVSP). Based on membership and non-membership, this paper focused on the construction of IVVSP. Finally, a pattern recognition example and a medical diagnosis decision-making example were given to demonstrate the application of IVVSP. Simulation results show that the IVVSP method is more comprehensive and flexible than that of traditional vague sets.
Effective Approach to Deep Web Entries Identification
WU Chun-ming,XIE De-ti
Computer Science. 2011, 38 (10): 199-201. 
Abstract PDF(341KB) ( 753 )   
RelatedCitation | Metrics
Automatic identification of deep Web entries is the basis of deep Web data integration. Owing to the subjectivity of form design, deep Web entries lack a unified standard, and it is difficult to judge by definite rules whether a form is a deep Web entry. Based on statistics, this paper first chose several form attributes as defining features that can distinguish searchable forms from non-searchable forms. Then an entry identification algorithm using a neural network was proposed. Unlike previous approaches, a neural network can be trained, which is very suitable for entry identification of the deep Web. The experimental results show that the proposed algorithm is an effective way to automatically identify deep Web entries.
Region Matching Algorithm Based on Historical Information Sorting in DDM
WANG Zhuo,FENG Xiao-ning,LIU Ting-bao
Computer Science. 2011, 38 (10): 202-204. 
Abstract PDF(344KB) ( 577 )   
RelatedCitation | Metrics
The key point of DDM implementation is to match the update region sets with the subscription region sets; the efficiency and performance of a distributed simulation system are determined by the design of the matching algorithm. The matching algorithm must maintain the index table and the region-intersection information table, and the problem is that these tables are constantly extended. Firstly, the idea and realization of region aggregation were provided. Secondly, historical information sorting was added to the region matching algorithm, so the running efficiency of region matching was improved by using historical information. The basic idea and detailed process of the algorithm were provided in the paper, its implementation was amply explained with an application example, and finally the advantages and disadvantages of the algorithm were analyzed using simulation data.
Study of Decision Support Technology Based on Value-based Argumentation Framework
FU Yi-xing,QI Xue-tian,YAO Li ,WANG Yan-juan
Computer Science. 2011, 38 (10): 205-208. 
Abstract PDF(421KB) ( 627 )   
RelatedCitation | Metrics
In complex, open and uncertain environments, all decision-makers have to confront the problem of how to evaluate and select action plans. Decision support technology based on argumentation, which uses arguments to help make decisions and explain reasons, is a new method that differs from traditional decision support theory. Firstly, we analysed the drawbacks of current decision support models based on the value-based argumentation framework and put forward the theory of argument categories. Secondly, in light of it, we designed the argumentation-decision controlling algorithm ArguDecision, and implemented a medical-care decision support system called Smart Doctor to validate the feasibility and effectiveness of this algorithm. We thereby explored a decision support technique which integrates the human thinking style of argumentation with logical laws, and carried out the whole argumentation-decision process by computer.
Shadowed Sets Based Threshold Selection in Rough Clustering
GUO Jin-hua,MIAO Duo-qian,ZHOU Jie
Computer Science. 2011, 38 (10): 209-210. 
Abstract PDF(194KB) ( 867 )   
RelatedCitation | Metrics
Rough-set-based clustering has been applied widely in soft clustering since it was proposed, but the threshold is often given subjectively, failing to consider the characteristics of the dataset itself. Based on shadowed-set optimization theory, this study gave an objective threshold selection method and applied it to the rough fuzzy C-Means clustering algorithm. Experimental results on artificial data and UCI data show the effectiveness of the proposed method.
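A standard shadowed-set threshold criterion (per Pedrycz's formulation; the function name and step size here are hypothetical, and the paper's exact objective may differ) balances the membership mass lost by reduction and gained by elevation against the size of the shadow region:

```python
def shadowed_threshold(mu, step=0.01):
    """Pick alpha in (0, 0.5) minimizing |reduced + elevated - shadow|,
    where memberships <= alpha are reduced to 0, those >= 1 - alpha are
    elevated to 1, and the rest form the shadow."""
    best_a, best_v = 0.0, float("inf")
    a = step
    while a < 0.5:
        reduced = sum(m for m in mu if m <= a)
        elevated = sum(1 - m for m in mu if m >= 1 - a)
        shadow = sum(1 for m in mu if a < m < 1 - a)
        v = abs(reduced + elevated - shadow)
        if v < best_v:
            best_a, best_v = a, v
        a += step
    return best_a

alpha = shadowed_threshold([0.1, 0.2, 0.5, 0.8, 0.95])
```

In rough clustering, the resulting alpha separates objects that certainly belong to a cluster, certainly do not, or fall into the boundary (shadow) region.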
V Orthonormal Basis Neural Network
XIONG Gang-qiang,QI Dong-xu
Computer Science. 2011, 38 (10): 211-214. 
Abstract PDF(299KB) ( 591 )   
RelatedCitation | Metrics
In order to address the problems that the convergence rate of BP networks is not fast and that neural networks with continuous orthogonal bases cannot approximate discontinuous functions, this paper constructed a class of feed-forward neural networks with a V orthonormal basis (referred to as V orthogonal networks) and investigated their convergence condition and pseudo-inverse rule. Since the V system is a class of complete orthonormal systems in L²([0,1]) and the convergence rate of Fourier-V series is comparatively fast, the convergence rate of the V orthogonal network is also fast, and it can effectively approximate a class of discontinuous functions of one variable. The simulation results also show that the convergence rate of the V orthogonal network is obviously faster than that of BP networks, wavelet networks and Legendre networks; when a V orthogonal network is used to approximate functions whose breakpoints appear only at dyadic rationals, its function approximation performance becomes much better.
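The "pseudo-inverse rule" means the output weights of such a network are obtained in closed form by least squares rather than by iterative BP training. As a stand-in for the V system (whose construction is specialized), the sketch below uses the Haar system, another complete orthonormal basis on [0,1) that handles dyadic discontinuities; names and parameters are illustrative, not the paper's:

```python
import numpy as np

def haar_basis(n_levels, x):
    """Evaluate a Haar-type orthonormal basis on [0,1) at sample points x."""
    cols = [np.ones_like(x)]
    for j in range(n_levels):
        for k in range(2 ** j):
            lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
            h = np.where((x >= lo) & (x < mid), 1.0,
                         np.where((x >= mid) & (x < hi), -1.0, 0.0))
            cols.append(2 ** (j / 2) * h)
    return np.stack(cols, axis=1)

# Pseudo-inverse rule: output weights solve min ||Phi w - y|| in closed form.
x = np.linspace(0, 1, 256, endpoint=False)
y = np.where(x < 0.5, 0.0, 1.0)        # step function with a dyadic breakpoint
Phi = haar_basis(4, x)
w = np.linalg.pinv(Phi) @ y
err = np.max(np.abs(Phi @ w - y))      # essentially zero for this target
```

Because the breakpoint sits at a dyadic rational, the discontinuous target is represented exactly, mirroring the advantage claimed for the V orthogonal network.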
Research of Unequal Weighted Universal Average Operation Model
JIA Peng-tao,HE Hua-can
Computer Science. 2011, 38 (10): 215-219. 
Abstract PDF(390KB) ( 561 )   
RelatedCitation | Metrics
The universal average operation model can satisfy the requirement of logical compromise in continuous-valued logic, but the existing universal average operation only discusses the ideal state in which every factor has equal weight. Two kinds of weighted operators were given, and the unequal weighted universal average operation model and its dual model were proposed. It is pointed out that the weighted arithmetic average operator, weighted geometric average operator, weighted harmonic average operator, and general average operator with weight are special cases of the dual model of the unequal weighted universal average operation model. Finally, we compared the similarities and differences between the general average with weight and the unequal weighted universal average operation model.
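The three classical weighted operators named as special cases can be stated concretely (hypothetical function names; weights are assumed to sum to 1 and inputs to be positive):

```python
def weighted_arithmetic(xs, ws):
    """Weighted arithmetic mean: sum of w_i * x_i."""
    return sum(w * x for x, w in zip(xs, ws))

def weighted_geometric(xs, ws):
    """Weighted geometric mean: product of x_i ** w_i."""
    p = 1.0
    for x, w in zip(xs, ws):
        p *= x ** w
    return p

def weighted_harmonic(xs, ws):
    """Weighted harmonic mean: 1 / sum of w_i / x_i."""
    return 1.0 / sum(w / x for x, w in zip(xs, ws))

xs, ws = [1.0, 4.0], [0.5, 0.5]
a = weighted_arithmetic(xs, ws)   # 2.5
g = weighted_geometric(xs, ws)    # 2.0
h = weighted_harmonic(xs, ws)     # 1.6
```

The familiar ordering harmonic ≤ geometric ≤ arithmetic holds for positive inputs, which is the kind of relation a unifying weighted-average model generalizes.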
Variable Granulation Rough Set
ZHANG Ming,TANG Zhen-min,XU Wei-yan,YANG Xi-bei
Computer Science. 2011, 38 (10): 220-222. 
Abstract PDF(278KB) ( 637 )   
RelatedCitation | Metrics
By analyzing the weaknesses of the optimistic multi-granulation rough set and the pessimistic multi-granulation rough set, the variable granulation rough set was proposed, and its lower and upper approximations were defined. Furthermore, the properties of these three kinds of rough sets were discussed, and it was proved that the variable granulation rough set is a generalization of the multi-granulation rough set. Finally, some measures of those rough sets were defined, and the relations among their measures were discussed.
Evaluation Rules Acquisition of Performance Audit for IT Projects in China Based on Dominance Intuitionistic Fuzzy Rough Set Model
HUANG Bing,LI Hua-xiong
Computer Science. 2011, 38 (10): 223-227. 
Abstract PDF(370KB) ( 519 )   
RelatedCitation | Metrics
Rule acquisition was studied in a type of decision system where the values of the condition attributes are dominance crisp values and those of the decision attribute are intuitionistic fuzzy numbers. Firstly, the dominating and dominated classes of objects in the universe of discourse were constructed from the dominance crisp values of the condition attributes. Secondly, the lower/upper approximation set of an object was ascertained by comparing the intuitionistic fuzzy numbers of the decision attributes among objects. Thirdly, using a discernibility matrix, the lower approximation reduction and a rule extraction algorithm based on discernibility relations among objects were devised. Finally, the presented model and algorithm were applied to performance audit for IT projects, and some logical rules of performance audit for IT projects were obtained.
Topological Structure of Rough Sets in Infinite Universes
QIAO Quan-xi,QIN Ke-yun
Computer Science. 2011, 38 (10): 228-230. 
Abstract PDF(232KB) ( 641 )   
RelatedCitation | Metrics
We investigated the topological structure of approximation operators satisfying reflexive and transitive relations on a universe that is not restricted to be finite. It was proved that there is a one-to-one correspondence between the set of all reflexive and transitive relations and the set of all topologies, and the base of the topological space was given.
Dynamic Negotiation Model under the Influence of Uncertain Competition and Cooperation Environment
CHENG Zhao-qian, WANG Hong-guo,SHAO Zeng-zhen,YANG Yi
Computer Science. 2011, 38 (10): 231-235. 
Abstract PDF(479KB) ( 593 )   
RelatedCitation | Metrics
This paper presented a dynamic multi-issue negotiation model (DMNM) composed of two parts: an environment model (PCCM) and a decision-making model (NDM). PCCM uses population growth theory from ecology, combined with changes in population density, to effectively analyse the competitive and collaborative environment. By introducing the characteristics of individual decision-making, NDM improves flexibility, and it applies a hybrid optimisation algorithm (CE-HOA) based on co-evolution to achieve a balance of concessions on the multiple issues, so as to ensure the interests of decision makers. The experimental results show that the negotiation model (DMNM) not only can effectively balance the selection of short-term and long-term interests, but also ensures that decision makers obtain maximum benefits. In addition, the CE-HOA algorithm improves search efficiency while ensuring result quality, making the decision-making process more efficient.
GEP Classification Based on Clonal Selection and Quantum Evolution
WANG Wei-hong,DU Yan-ye,LI Qu
Computer Science. 2011, 38 (10): 236-239. 
Abstract PDF(378KB) ( 550 )   
RelatedCitation | Metrics
Gene Expression Programming (GEP) based classification algorithms have shown good classification accuracy; however, they often fall into local optima and need a long search time. In order to further improve the classification power of GEP, clonal selection and quantum evolution were introduced into GEP, and a novel approach called ClonalQuantum-GEP was proposed. By affecting the search direction and evolution ability of the antibody population through the updating and exploration of the quantum population, and by keeping the best results in a memory pool, this approach obtains more population diversity, better global search ability, and much faster convergence. Experiments on several benchmark datasets demonstrate the effectiveness and efficiency of this approach. Compared with basic GEP, ClonalQuantum-GEP can achieve better classification results with a much smaller population and far fewer evolutionary generations.
Capturing Human Interaction Semantics in Meetings
FAN Xiang-chao,YU Zhi-wen,MA Hui
Computer Science. 2011, 38 (10): 240-242. 
Abstract PDF(365KB) ( 599 )   
RelatedCitation | Metrics
Meetings are important events in our daily life for solving questions, exchanging information, and sharing and creating knowledge, so smart meeting systems are a research hotspot in academia and industry. Current smart meeting systems mainly focus on recognition and visualization of physical interaction, and less on human semantic interaction in meetings. Human semantic interaction consists of interaction activities with semantics performed by participants with regard to the current topic. We designed and implemented a method to capture human interaction semantics in meetings with a Naïve Bayes model by processing in-session attributes, including head gesture, attention from others, speech tone, speaking time, interaction occasion, type of previous interaction and keywords. Experiments show that the recognition accuracy of human interaction semantics reaches up to 80.1% using this method, which is effective to some degree.
Attribute Reduction Based on Ordered Discernibility Set and Significance of Attribute
ZHANG Ying-chun,WANG Yu-xin,GUO He
Computer Science. 2011, 38 (10): 243-247. 
Abstract PDF(420KB) ( 595 )   
RelatedCitation | Metrics
A new improved algorithm for the simplified discernibility matrix was proposed for attribute reduction in rough set theory. The discernibility matrix is simplified without being sorted and at a lower traversal cost, which notably raises the speed of simplifying the discernibility matrix and ultimately yields an ordered, simplified discernibility set. Comparative experiments on computational efficiency show that this new algorithm is more efficient than homogeneous ones. A new criterion for the significance of an attribute was put forward based on three aspects: the weight of the elements containing the attribute, and the frequency and absorptive ability of the attribute in the discernibility set. A new method for attribute reduction was then introduced on the basis of these two points; theoretical analysis proves that the worst-case time complexity of the new method is lower than that of other discernibility-matrix-based methods. In addition, extensive comparative experiments in attribute reduction show that this new method is effective and can largely find a minimal attribute reduction.
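The discernibility set underlying such methods can be built directly from a decision table (a textbook-style sketch with hypothetical names, not the paper's optimized construction): for each pair of objects with different decisions, record the condition attributes on which they differ.

```python
def discernibility_set(table, decision):
    """Simplified discernibility set of a decision table: one attribute set
    per pair of objects with different decision values, duplicates removed."""
    entries = []
    for i in range(len(table)):
        for j in range(i + 1, len(table)):
            if decision[i] != decision[j]:
                diff = frozenset(a for a in table[i] if table[i][a] != table[j][a])
                if diff and diff not in entries:
                    entries.append(diff)
    return entries

table = [{"a": 0, "b": 0}, {"a": 0, "b": 1}, {"a": 1, "b": 1}]
decision = [0, 1, 1]
ds = discernibility_set(table, decision)
# objects 0/1 differ only on b; objects 0/2 differ on a and b
```

A reduct must intersect every entry of this set; significance criteria like the one above rank attributes by how they cover and absorb these entries.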
Large Ontology Partition and Mapping Based on Module Extraction
WANG Run-mei,XU De-zhi,LAI Ya,YAO Xue-cong
Computer Science. 2011, 38 (10): 248-251. 
Abstract PDF(318KB) ( 563 )   
RelatedCitation | Metrics
Ontology mapping is complex because of the large size of ontologies. Aiming at the defects of existing approaches, this paper presented a new way to realize large ontology partition and mapping based on module extraction. We extracted the ontology modules via the Laplace matrix of the dependency graph, computed the module similarity, and then obtained the mapped modules. The empirical results indicate that our method can achieve ontology partition and enhance mapping efficiency.
Fading Memory Discrete GM(1,1)Model and its Recursive Algorithm
ZHAO Min, SUN Di-hua,FAN Wan-mei,LIU Wei-ning
Computer Science. 2011, 38 (10): 252-255. 
Abstract PDF(276KB) ( 604 )   
RelatedCitation | Metrics
Considering the different effects of old and new data on prediction results, an improved moving-average pretreatment method was established for the original data. On this basis, a forgetting factor was introduced to set different weights for old and new data, and the fading memory discrete GM(1,1) model was proposed. To handle the heavy computational burden of the GM(1,1) model, a new online real-time recursive prediction algorithm for the fading memory discrete grey model was presented. The proposed model and recursive algorithm were employed to predict traffic accidents and regional freight ton-kilometers. The results show that real-time tracking ability is enhanced and prediction precision is improved, while solving an inverse matrix is avoided.
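The fading-memory idea, discounting old samples by a forgetting factor while updating recursively without a matrix inverse, is shared with recursive least squares. The scalar RLS sketch below illustrates that mechanism only (hypothetical names; it is not the grey GM(1,1) recursion itself):

```python
def rls_forgetting(data, lam=0.9):
    """Scalar recursive least squares fitting y = theta * x with forgetting
    factor lam: the newest sample gets weight 1, older ones lam, lam**2, ..."""
    theta, p = 0.0, 1e6           # large initial p: weak prior on theta
    for x, y in data:
        k = p * x / (lam + x * p * x)     # gain
        theta += k * (y - x * theta)      # correct by prediction error
        p = (p - k * x * p) / lam         # inflate covariance: fade memory
    return theta

# slope drifts from 1.0 to 2.0; fading memory tracks the newer value
data = [(x, 1.0 * x) for x in range(1, 6)] + [(x, 2.0 * x) for x in range(1, 6)]
theta = rls_forgetting(data, lam=0.5)   # close to 2.0
```

Smaller lam forgets faster and tracks changes more aggressively, at the cost of noisier estimates; the same trade-off governs the forgetting factor in the fading memory grey model.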
Relevance Feedback Algorithm Based on Memory Support Vector Machines
SUN Shu-liang,LIN Xue-yun
Computer Science. 2011, 38 (10): 256-258. 
Abstract PDF(263KB) ( 700 )   
RelatedCitation | Metrics
Support vector machines (SVM), based on structural risk minimization, are used for machine learning on small samples. Memory support vector machine (MSVM) feedback is based on SVM and, through memory, uses accumulated samples in place of feedback samples, which reduces the risk of recall vibration. MSVM feedback also proposes a memory label to lighten the user's burden. The superiority of MSVM feedback is demonstrated by relevant experiments.
Wave Simulation Based on Geometric Modeling
WANG Xiang-hai,LI Ting-ting
Computer Science. 2011, 38 (10): 259-262. 
Abstract PDF(332KB) ( 1289 )   
RelatedCitation | Metrics
This paper presented a fast wave simulation method based on geometric modeling. The method integrates human visual perception of waves into the subdivision of the sea grid: regions to which the eye is not sensitive use a coarse grid, while sensitive regions use a fine mesh; furthermore, different regions use different mathematical functions or different parameters to control the simulated ocean waves, and corresponding random processes are added to increase the detail of wave motion in sensitive areas. The proposed method is simple, computationally cheap and easy to implement, and it can meet the real-time requirements for waves in games, animation and other scenarios. Simulation results verify the validity of the model.
Collision Detection and Response in Crop Visualization
WU Yan-lian,TANG Liang,CAO Wei-xing,ZHU Yan
Computer Science. 2011, 38 (10): 263-266. 
Abstract PDF(406KB) ( 992 )   
RelatedCitation | Metrics
Adopting collision detection and response methodology in the field of crop growth visualization, an algorithm for collision detection between leaves defined by NURBS (Non-Uniform Rational B-Spline) surfaces was proposed based on surface subdivision and hybrid hierarchical bounding volumes (HHBV). Firstly, the leaf surface was subdivided using knot insertion. Secondly, an HHBV tree for the subdivided leaf was built based on AABBs (Axis-Aligned Bounding Boxes) and FDHs (Fixed Direction Hulls). The HHBV tree uses an AABB as its root node for fast overlap tests, and FDHs as the other nodes to accurately determine the contact status between leaves in close proximity. Finally, based on leaf morphological architecture, reasonable and efficient solutions for collision response were presented. The results show that the algorithm is efficient for realizing leaf collision simulation.
Automatic Face Detection in Video Sequences in Complex Lighting Environments
XIE Qian-ru,GENG Guo-hua
Computer Science. 2011, 38 (10): 267-269. 
Abstract PDF(251KB) ( 632 )   
RelatedCitation | Metrics
Automatic human face detection from video sequences is the basis of research on face recognition and tracking. This paper proposed an efficient and robust method to detect faces in video sequences. The key step is to use image enhancement to alleviate the impact on face detection of illumination variation such as local shadow and highlight. The approach first strengthens the edge and detail information of images by high-frequency emphasis filtering and adjusts image brightness with a histogram-based technique, then applies the Gabor wavelet to extract image features, and finally trains samples using the AdaBoost algorithm to complete face detection. The experimental results show that the approach can detect human faces accurately under different lighting conditions.
Binary Image Hiding Algorithm Based on Multi-secret Sharing and DCT
CAO Ru-bing,Askar
Computer Science. 2011, 38 (10): 270-272. 
Abstract PDF(333KB) ( 656 )   
RelatedCitation | Metrics
A new binary image hiding scheme was proposed, which hides a binary image in several cover images. Based on the characteristic that binary image values are 0 or 1, this scheme combines multi-secret sharing with the RLE and DCT algorithms for the first time. According to the largest size of the cover images, the scheme divides the RLE-encoded shadow image into n blocks, and then embeds the n blocks of secret information in the cover images under the control of n keys. Experiments demonstrate that this scheme substantially increases the security of the secret information, improves robustness, and achieves a better recovery effect.
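The RLE step exploits the fact that binary images consist of long runs of identical bits. A minimal sketch of such an encoder (hypothetical function name; the paper's scheme then splits this output into blocks for sharing):

```python
def rle_encode(bits):
    """Run-length encode a binary sequence as [value, run-length] pairs."""
    out = []
    for b in bits:
        if out and out[-1][0] == b:
            out[-1][1] += 1          # extend the current run
        else:
            out.append([b, 1])       # start a new run
    return out

runs = rle_encode([0, 0, 1, 1, 1, 0])   # [[0, 2], [1, 3], [0, 1]]
```

Compressing the shadow image first shrinks the payload that must be hidden in the DCT coefficients of the cover images.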
Applications of Graphical Models in Color Texture Classification
YANG Guan,ZHANG Xiang-dong,FENG Guo-can,ZOU Xiao-lin,LIU Zhi-yong
Computer Science. 2011, 38 (10): 273-277. 
Abstract PDF(705KB) ( 598 )   
RelatedCitation | Metrics
Texture is one of the important visual features in image analysis. For convenience, color texture images are often converted to gray images, but color information is then lost. In order to keep both texture and color information, principal component analysis (PCA) was utilized to reduce the dimension of color textures. Gaussian graphical models (GGM) have good prospects due to their advantages and were applied to construct the texture model. The structure of a GGM is explored through the connection between the local Markov property and conditional regression of Gaussian random variables; thus model selection can be converted into variable selection in the GGM. The development of penalty regularization techniques provides many methods for variable selection and parameter estimation, and these methods perform neighborhood selection and parameter estimation simultaneously. The texture feature is then extracted and applied to color texture classification. The experiments show good results; therefore, texture models based on the connection of GGM and PCA have an attractive prospect.
Image Retrieval Based on Improved PSO Algorithm and Relevance Feedback
TANG Zhao-xia,ZHANG Hui,XU Dong-mei
Computer Science. 2011, 38 (10): 278-280. 
Abstract PDF(245KB) ( 568 )   
RelatedCitation | Metrics
Because of the semantic gap between low-level image features and high-level semantics, and the subjectivity and variability of users' understanding of images, image retrieval results often cannot satisfy users' needs. To solve this problem, the PSO algorithm and relevance feedback were introduced into the image retrieval process: based on user feedback, automatic adjustment of the inertia weight W and Beta adaptive mutation in PSO adjust the weights of the image features dynamically, improving search accuracy and better meeting users' needs.
Research of MPI Programs Optimization Technology on Multi-core Clusters
WANG Jie,ZHONG Lu-jie,ZENG Yu
Computer Science. 2011, 38 (10): 281-284. 
Abstract PDF(327KB) ( 1102 )   
RelatedCitation | Metrics
The new features of multi-core processors make the memory hierarchy of multi-core clusters more complex, and also enlarge the optimization space for MPI programs. We tested the communication performance of three different multi-core clusters, and evaluated some general optimization technologies, such as hybrid MPI/OpenMP, tuning MPI runtime parameters, and optimizing MPI process placement, on Intel and AMD multi-core clusters. The experimental results and optimization performance were also analyzed.
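The intuition behind process-placement optimization can be shown with a small mapping computation (a simplified sketch, not the paper's method): ranks that communicate heavily, here nearest neighbors in a 1-D decomposition, are packed onto the same socket so their messages stay on shared cache rather than crossing the inter-socket link.

```python
def block_placement(n_ranks, cores_per_socket):
    """Map MPI ranks to (socket, core) pairs in blocks, so consecutive
    ranks (nearest neighbors in a 1-D decomposition) share a socket."""
    return {r: (r // cores_per_socket, r % cores_per_socket)
            for r in range(n_ranks)}

def cross_socket_messages(placement, n_ranks):
    """Count nearest-neighbor pairs whose messages cross sockets."""
    return sum(1 for r in range(n_ranks - 1)
               if placement[r][0] != placement[r + 1][0])

block = block_placement(8, cores_per_socket=4)
# Round-robin (cyclic) placement over two sockets, for comparison
cyclic = {r: (r % 2, r // 2) for r in range(8)}
print(cross_socket_messages(block, 8), cross_socket_messages(cyclic, 8))
# → 1 7  (block keeps most neighbor traffic on-socket; cyclic never does)
```

In practice the same idea is expressed through the MPI launcher's binding options rather than computed by hand; the right choice depends on the application's communication pattern.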
Executing Method of Time and Energy Optimization in Heterogeneous Computing
YU Li-hua,ZENG Guo-sun
Computer Science. 2011, 38 (10): 285-290. 
Abstract PDF(471KB) ( 896 )   
RelatedCitation | Metrics
Both the heterogeneity of the computing environment and the complexity of various application tasks lead to heterogeneous computing. The purpose of heterogeneous computing is to obtain the best execution of a parallel task running on the processing system by putting emphasis on the differences between the parallel system and the task and exploring the optimal match between the two. Currently, in heterogeneous computing, scheduling methods for time optimization alone are quite mature, but research on execution methods that optimize both time and energy is scarce. Aiming at high-performance computing and green computing, this paper paid attention to the scheduling problem of parallel tasks in a heterogeneous computing environment. We proposed the heterogeneous task model, the heterogeneous computing velocity matrix, and the heterogeneous computing power matrix. Making use of the idea that energy can be unified with time, this paper presented heuristic execution algorithms that optimize both time and energy for parallel tasks on heterogeneous systems. Finally, a case study shows the feasibility and efficiency of the proposed algorithms.
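The role of the velocity and power matrices can be sketched with a hypothetical greedy heuristic (an illustration of the time-energy trade-off, not the paper's algorithm): each task goes to the processor minimizing a weighted sum of execution time and energy, where time = work/velocity and energy = power x time.

```python
def greedy_schedule(work, velocity, power, alpha=0.5):
    """Assign each task to the processor minimizing a weighted cost.
    work[i] is task i's amount of computation; velocity[i][j] and
    power[i][j] follow the matrix notation in the abstract.
    alpha=1 optimizes time only; alpha=0 optimizes energy only."""
    schedule = []
    for i, w in enumerate(work):
        best_j, best_cost = None, float("inf")
        for j in range(len(velocity[i])):
            t = w / velocity[i][j]          # execution time on processor j
            e = power[i][j] * t             # energy consumed on processor j
            cost = alpha * t + (1 - alpha) * e
            if cost < best_cost:
                best_j, best_cost = j, cost
        schedule.append(best_j)
    return schedule

# Two tasks, two processors: processor 0 is fast but power-hungry
velocity = [[4.0, 1.0], [4.0, 1.0]]
power = [[10.0, 1.0], [10.0, 1.0]]
print(greedy_schedule([8.0, 8.0], velocity, power, alpha=1.0))  # → [0, 0]
print(greedy_schedule([8.0, 8.0], velocity, power, alpha=0.0))  # → [1, 1]
```

Sweeping alpha between 0 and 1 traces the trade-off curve between makespan and total energy that such heuristics navigate.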
Complexity Analysis Method on Information Systems Architecture
LUO Ai-min
Computer Science. 2011, 38 (10): 291-294. 
Abstract PDF(229KB) ( 669 )   
RelatedCitation | Metrics
Architecture design is an important phase in information system development, and the complexity of the architecture is a key influence factor for information systems. According to the characteristics of information systems architecture development, measure models of system cohesion and system coupling were defined in this paper. Based on these measure models, a complexity analysis method for information systems architecture was provided. Furthermore, an application example was given to illustrate that the method is effective.
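A common way to define such cohesion and coupling measures, shown here as a hypothetical sketch (the paper's actual measure models are not reproduced in the abstract), is as link densities: cohesion as the fraction of possible links realized inside a module, coupling as the fraction realized across modules.

```python
def cohesion(internal_links, n_components):
    """Intra-module cohesion: realized fraction of the n*(n-1)/2
    possible links among a module's components."""
    possible = n_components * (n_components - 1) / 2
    return internal_links / possible if possible else 1.0

def coupling(external_links, n_components, n_other):
    """Inter-module coupling: realized fraction of possible links
    between this module and an n_other-component neighbor module."""
    possible = n_components * n_other
    return external_links / possible if possible else 0.0

# A module of 4 components with 5 internal links and 2 links
# to a 6-component neighbor module
print(round(cohesion(5, 4), 3), round(coupling(2, 4, 6), 3))
# → 0.833 0.083
```

Under such measures, an architecture with high cohesion and low coupling scores as less complex, which is the property a complexity analysis method would check.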