Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37 Issue 5, 2010
  
S-rough Sets and Characteristics of Data Mining Unit Circle
SHI Kai-quan
Computer Science. 2010, 37 (5): 1-8. 
One direction singular rough sets and the structure of their dual were given. Both are derived from improvements to Z. Pawlak rough sets and possess dynamic characteristics. The relations between one direction singular rough sets, their dual, and Z. Pawlak rough sets were also proposed. S-rough sets take three forms: one direction singular rough sets, the dual of one direction singular rough sets, and two directions singular rough sets. Based on the first two forms, the concepts of data internal-mining and data outer-mining, together with the outer concentric circle theorem and the internal concentric circle theorem, were proposed, and their applications were given. S-rough sets are a new branch of rough set theory and its applied research.
Security Alert Correlation: A Survey
FU Xiao,XIE Li
Computer Science. 2010, 37 (5): 9-14. 
Alert correlation is a promising new technology that has drawn more and more attention in recent years. It can efficiently solve many problems currently bothering security managers, such as high false positives (i.e. alerts mistakenly triggered by benign events), high false negatives (i.e. intrusions mistakenly missed by security mechanisms), and the large number of alerts created by security products every day. In the past several years a great deal of valuable research was done in this field, but most of it focused on only a few issues, and there are still many challenging problems that have not been addressed well, or even touched. Researchers in this field need to put more effort into them in the future. This paper gave an overview of the research progress in this area. Firstly, we introduced the common process and the popular architectures of current alert correlation systems. Then we summarized and compared the main algorithms of the three key phases in the common process (i.e. alert aggregation and fusion, attack scenario construction, and attack plan recognition). After that, the main applications of this technology were introduced, and the difficulties and corresponding methods were summarized. At the end of this paper, we analyzed the shortcomings of current work and possible new directions in this field.
Recent Advances of Data Distribution Mechanisms for Live P2P Streaming
PENG Xue-na,LI Jia,WEN Ying-you,ZHAO Hong
Computer Science. 2010, 37 (5): 15-20. 
Data distribution is one of the key technologies in P2P live streaming systems. Considering the driving element in data distribution, the data distribution mechanisms for P2P live streaming systems can be classified into three categories, namely path-driven, data-driven and hybrid-element-driven. This paper thoroughly investigated the classical technologies of the above data distribution mechanisms. Finally, some issues to be further studied were discussed.
Review of Semantic Web Service Composition
CUI Hua,YING Shi,YUAN Wen-jie,HU Luo-kai
Computer Science. 2010, 37 (5): 21-25. 
Semantic Web Services are designed to achieve effective automation of Web Service discovery, composition and execution. As Semantic Web technology matures and Web Services proliferate on the Internet, Semantic Web Service composition (SWSC) becomes a feasible and practical way for software developers to create applications and systems rapidly. We gave some basic notions and recent research on SWSC, and then classified SWSC approaches according to the methodology they use. Furthermore, we analysed each approach's motivation and shortcomings, and outlined the essential problems of SWSC. Finally, we concluded and discussed the development trends.
Research on Domain-independent Data Cleaning: A Survey
CAO Jian-jun,DIAO Xing-chun,WANG Ting,WANG Fang-xiao
Computer Science. 2010, 37 (5): 26-29. 
Research on domain-independent data cleaning was surveyed. First, the relationships among total data quality management, data integration and data cleaning were clarified, and the characteristics of domain-independent data cleaning were emphasized. Then, domain-independent data cleaning methods were classified into feature-based similarity methods, context-based methods and relationship-based methods, which were introduced respectively. At last, future research directions of domain-independent data cleaning were discussed.
Research on Clock Synchronization of Ad Hoc Networks:A Survey
WANG Bo,YE Xiao-hui,ZHAO Yu-ting,YAN Xue-li
Computer Science. 2010, 37 (5): 30-33. 
Clock synchronization, one of the key technologies in ad hoc networks, is the basis for implementing several network functions. This survey gave an overview of popular clock synchronization methods for ad hoc networks. These methods were classified according to several different principles; then the scalability of the approaches was compared and analyzed, as were their network overhead and synchronization error. Because of the similarity between clock synchronization in wireless sensor networks (WSN) and in ad hoc networks, some WSN clock synchronization methods were proposed for use in ad hoc networks. At last, some developments were introduced which offer great potential for newly emerging clock synchronization problems.
Overviews on Internet Emergent Behaviour Research
TANG Hong,HUANG Ding,WU Yu
Computer Science. 2010, 37 (5): 34-39. 
As a constantly evolving open complex giant system, the Internet poses a big challenge to network management, as well as to the development of new technologies and new business. One of the most important behavioural characteristics of the Internet is the emergent phenomenon, which attracts many researchers' attention. Investigation of Internet behaviour can help to understand the laws of the Internet, and plays a significant role in guiding the theory and practice of effective network and user management and the development of new network protocols and services. In this paper, the emergent behaviour and its major features in the different layers of the Internet were analyzed. Then current research on network emergent behaviour and related computing models was summarized. Finally the directions of investigation in this area were discussed.
Cost-driven Service Composition
ZHU Rui,WANG Huai-min,TANG Yang-bin
Computer Science. 2010, 37 (5): 40-44. 
The characteristics of the Internet, such as openness, autonomy and dynamism, impose new challenges on the sharing and composition of Internet resources. Research on runtime QoS guarantees for Web service composition, service redundancy techniques and replanning are important fault-tolerance methods, but they face severe problems of high cost, real-time requirements and infeasibility of replanning strategies. Fault tolerance is a tradeoff between QoS guarantee and extra overhead. In order to reduce the fault-tolerance cost, several decomposition principles were presented based on an analysis of the high cost brought by fault-tolerance technology. These principles help the designer to select the proper service granularity and reduce the degree of coupling between services. Simulation results indicate that the mechanism is effective in increasing the reliability of composite services and reducing the fault-tolerance cost.
Research on Quantitative Security Risk Assessment Method of an Enterprise Information System Based on Information Entropy
LIU Yong,LIN Qi,MENG Kun
Computer Science. 2010, 37 (5): 45-48. 
In security risk assessment, the result often relies directly on the values assigned by some queries. In order to assess an enterprise information system objectively, we proposed a security risk assessment method based on information entropy. By constructing a Threat-Vulnerability matrix and a Threat-Loss matrix and processing the data of these matrices with the information entropy method, we can enhance the accuracy of the result. At the end of the paper, we gave an example explaining the efficiency of the proposed method by analyzing a specific enterprise information system.
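As a rough illustration of how an entropy-based weighting step over such a matrix might look (a minimal sketch only; the matrix values and the exact weighting formula are assumptions, not taken from the paper):

```python
import math

def entropy_weights(matrix):
    """Compute entropy-based column weights for a score matrix.

    Rows are threats, columns are e.g. vulnerabilities; entries are
    non-negative scores. A column whose scores are spread evenly has
    high entropy and thus low discriminating weight.
    """
    n_rows = len(matrix)
    weights = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col) or 1.0
        probs = [v / total for v in col]
        # Shannon entropy, normalized to [0, 1] by log(n_rows)
        h = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n_rows)
        weights.append(1.0 - h)            # low entropy -> high weight
    s = sum(weights) or 1.0
    return [w / s for w in weights]

# Hypothetical 3-threat x 3-vulnerability score matrix
tv = [[0.8, 0.1, 0.3],
      [0.2, 0.7, 0.3],
      [0.1, 0.2, 0.4]]
print(entropy_weights(tv))
```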
Extended CS Logic for Analyzing Non-repudiation Protocols
WANG Juan,LIU Jun,ZHANG Huan-guo
Computer Science. 2010, 37 (5): 49-52. 
This paper presented an extension of CS logic, a type of knowledge logic that can be used to analyze the properties of non-repudiation protocols with timeliness. Using the extended logic, we analyzed the improved ZG protocol and found a replay attack on the signature message of the protocol; as a result, the protocol does not satisfy the non-repudiation property. The example shows that non-repudiation protocols with timeliness can be effectively analyzed by the extended CS logic.
Novel Detection Scheme Based on IBE in WSNs Selective Forwarding Attacks
CHEN Dan-wei,HOU Nan,SUN Guo-zi
Computer Science. 2010, 37 (5): 53-56. 
Since nodes in wireless sensor networks have limited capabilities, messages are generally transmitted between nodes through multiple hops. Multi-hop routing protocols provide a lot of convenience for the selective forwarding attack. Therefore, a random-checkpoint-based multi-hop acknowledgement scheme to detect selective forwarding attacks in WSNs was proposed, and IBE and LEACH were adopted to improve the basic detection scheme. Firstly, the overall framework of the defense scheme and its working method were given; then the scheme was simulated in the NS2 environment, taking into account many factors such as the number of checkpoints, computing speed and consumption, storage requirements, and robustness. The results show that the security of the WSN is improved.
Research on the Key Technologies of Media Streaming for 3G Networks
CAO Lei,SHEN Hang,LUO Bin,BAI Guang-wei
Computer Science. 2010, 37 (5): 57-61. 
As one of the novel mobile value-added services, real-time media streaming has seen increasing demand on 3G networks in recent years and has drawn tremendous attention from both academia and industry, since it is creating a new era of integration of wireless communication, the Internet and video. However, communication over wireless networks is characterized by limited bandwidth, high error rates, unstable and dynamically changing wireless channel conditions, user mobility, channel competition, etc. These characteristics and the stringent QoS requirements of multimedia applications pose significant challenges for providing QoS guarantees for media streaming applications in 3G networks. This work began with a comparison of the technical characteristics of the 3G standards. On this basis, we explored key technologies of media streaming for 3G networks in terms of QoS systems, transport protocols, video coding, and modeling of real-time video traffic. Finally, we outlined key open problems in this area.
Secure and Efficient Identity-based Aggregate Signature Scheme
SUN Hua,ZHENG Xue-feng,YU Yi-ke,ZHOU Fang
Computer Science. 2010, 37 (5): 62-65. 
An aggregate signature scheme is a digital signature scheme in which, given n signatures on n distinct messages from n distinct users, all these signatures can be aggregated into a single signature. We proposed an identity-based aggregate signature scheme based on bilinear pairings, which has a lower verification cost than existing identity-based aggregate signatures. We proved that the proposed scheme is secure against existential forgery under adaptively chosen message and ID attacks in the random oracle model, assuming that the Computational Diffie-Hellman problem is hard to solve.
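For readers unfamiliar with pairing-based aggregation, the generic idea (in the style of BLS-type schemes; the paper's ID-based construction differs in detail) is that individual signatures are summed and verified with one pairing equation:

```latex
% Generic BLS-style aggregation (illustrative only, not the paper's scheme).
% Secret keys x_i, public keys pk_i = x_i g, hash-to-group H:
\sigma_i = x_i\,H(m_i), \qquad \sigma = \sum_{i=1}^{n} \sigma_i,
\qquad
e(\sigma, g) \;\stackrel{?}{=}\; \prod_{i=1}^{n} e\big(H(m_i),\,\mathrm{pk}_i\big),
% which holds by bilinearity: e(x_i H(m_i), g) = e(H(m_i), x_i g).
```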
Self-certified Proxy Signcryption Based on Discrete Logarithm Problem
YU Hui-fang,ZHAO Hai-xing,WANG Zhi-cang,WANG Xiao-hong
Computer Science. 2010, 37 (5): 66-67. 
A self-certified cryptosystem achieves the properties of requiring no public key certificate and avoiding key escrow; proxy signcryption is a scheme that combines proxy signature with signcryption. By merging the ideas of self-certified cryptosystems and proxy signcryption, a new self-certified proxy signcryption scheme based on the discrete logarithm problem was proposed on the basis of the existing literature. Under the hardness of the discrete logarithm problem in finite fields, the new scheme was proved to be correct and secure.
ElGamal-like Public-key Cryptosystem and Digital Signature Scheme Based on 5F-L Sequence
DUANMU Qing-feng,ZHANG Xiong-wei,WANG Yan-bo,LI Bing-bing,LEI Feng-yu
Computer Science. 2010, 37 (5): 68-71. 
The characteristics of the fifth-order Fibonacci-Lucas (5F-L) sequence were studied in depth, and based on them an ElGamal-like public-key cryptosystem and digital signature scheme were presented, which replace the Lucas sequence and the 3F-L sequence with the fifth-order Fibonacci-Lucas sequence. Their correctness and validity were studied, and a fast algorithm for evaluating terms of the fifth-order Fibonacci-Lucas sequence was given. At last, an efficiency and security analysis of the scheme was provided.
Security P2P Sharing Model Based on Bloom Filter
YAN Hua-yun,GUAN Ji-hong
Computer Science. 2010, 37 (5): 72-76. 
To resolve the problem of inauthentic files, this paper proposed a security P2P sharing model (SPSM). To give all nodes an incentive to share their resources, virtual currencies were introduced in SPSM, together with a prosecution rule: nodes can prosecute a node that uploads an inauthentic file, and the CA (certification authority) puts the malicious node into the malicious set after validating its identity. To save storage space, and considering the characteristics of SPSM, this paper used DBF, a variant of the Bloom filter, to store the malicious sets. To validate the effectiveness of SPSM, this paper compared the performance of SPSM with that of a trust model; the experimental results show that SPSM performs more effectively.
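As background, a plain Bloom filter (the DBF variant used in the paper adds dynamics not sketched here) stores a set compactly at the cost of occasional false positives; the class and parameters below are illustrative:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash functions over an m-bit array.

    Membership tests may return false positives but never false
    negatives, which is acceptable for blacklisting malicious nodes.
    """
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits // 8 + 1)

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

blacklist = BloomFilter()
blacklist.add("node-17")
print("node-17" in blacklist, "node-42" in blacklist)  # True False (w.h.p.)
```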
MintRoute-HNLB:A Novel Wireless Sensor Network Routing Protocol with Load Balancing
YIN An,WANG Bing-wen,HU Xiao-ya,YANG Wen-jun
Computer Science. 2010, 37 (5): 77-80. 
To address the load balancing issue in wireless sensor networks, this paper presented and implemented the MintRoute-HNLB protocol on TinyOS. "Hot notifying" and "hot node avoidance" were introduced in MintRoute-HNLB to distribute the traffic load in the network by choosing suboptimal nodes to share the forwarding task of a hot node. A criterion for load-balancing performance named SLN-LBEI (same-level nodes load-balancing evaluation indicator) was proposed and used for result analysis. Statistical analysis and simulation using TOSSIM show that MintRoute-HNLB is more effective at load balancing than MintRoute and balances the energy consumption of nodes at the same hop count.
Macroblock Importance Estimation Based Resynchronization Approach
QIU Jin-bo,FENG Bin,YU Li,ZHU Guang-xi
Computer Science. 2010, 37 (5): 81-83. 
Resynchronization is an important error-resilience approach for video transmission. Video transmission distortion was analyzed and a new resynchronization approach based on macroblock importance estimation was proposed. Using the defined macroblock importance estimation, the approach combines macroblock importance with bit-stream length to packetize compressed video of the same importance. The proposed approach was compared with the macroblock-count-based method and the bit-stream-length-based method on different test sequences under the same video packet loss pattern. The experimental results show that the new approach can decrease video quality fluctuation over error-prone channels and greatly increase subjective quality.
Secure Communication Scheme of Mobile VPN Based on IKE Protocol
SHU Ming-lei,TAN Cheng-xiang,TAN Bo
Computer Science. 2010, 37 (5): 84-86. 
The secure access control of mobile terminals and the secure transmission of mobile data are important for the widespread use of mobile intelligent terminals and the extension of mobile services. Addressing the security problems that arise when mobile terminals access an intranet, this paper put forward a secure communication scheme for mobile VPN aimed at the secure exchange of mobile data. The scheme improves the negotiation process of the IKE protocol, one of the important protocols in the IPsec protocol suite, and supports multi-factor authentication and role-based access control. The results of theoretical analysis and experiments demonstrate the practicability and security of our scheme.
Key Agreement Scheme Based on Bilinear Pairing for Wireless Sensor Network
CHEN Hao,GUO Ya-jun
Computer Science. 2010, 37 (5): 87-90. 
Considering the problem of inadequate security in wireless sensor networks, the paper proposed a key agreement scheme based on bilinear pairing for wireless sensor networks. Firstly, the proposed scheme pre-distributes network system parameters using an ID-based encryption algorithm and computes node parameters on bilinear pairings. These parameters are then broadcast and exchanged between nodes, and node keys are computed using Diffie-Hellman key exchange. Analysis shows that the proposed scheme is not only more efficient than the previous LZC and Shim-Woo schemes, but also satisfies all the required security attributes: implicit key authentication, known-key security, perfect forward secrecy, key-compromise impersonation resilience and unknown key-share resilience.
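The Diffie-Hellman step the abstract refers to can be illustrated in its classic exponential form (a toy sketch with an unrealistically small prime; the paper runs the same idea in a bilinear pairing group):

```python
import secrets

# Toy multiplicative-group Diffie-Hellman (illustrative parameters only;
# NOT secure — the paper's scheme works in a pairing group instead).
p = 0xFFFFFFFB  # small prime for demonstration
g = 5

a = secrets.randbelow(p - 2) + 1      # node A's secret
b = secrets.randbelow(p - 2) + 1      # node B's secret
A = pow(g, a, p)                      # A broadcasts g^a
B = pow(g, b, p)                      # B broadcasts g^b

assert pow(B, a, p) == pow(A, b, p)   # both derive the same key g^(ab)
```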
Research on Quorum System-based Framework for Distributed Access Control
XIONG Ting-gang,LU Zheng-ding,ZHANG Jia-hong,MA Zhong
Computer Science. 2010, 37 (5): 91-94. 
By researching generalized access control systems, the paper put forward an access control framework adopting a quorum system, and presented a mutual-quorum distributed access control system with high efficiency, stability and security. Furthermore, the relation between system capability and system availability when system nodes are unreliable was analyzed, and it was shown how to optimize the entire system with respect to both capability and availability.
One Method of Requirement Mapping Based on Service-oriented QoS Model
LIANG Quan,WANG Yuan-zhuo
Computer Science. 2010, 37 (5): 95-98. 
The paper conducted a deep study of how to control Quality of Service (QoS) in a service-oriented grid environment, and presented a QoS model with a detailed description. The model is closely connected with the management structure and strategies of the grid system and highlights their characteristics. In this model, the QoS requirements of applications are submitted in the form of an order; afterwards, a QoS constraint mapping model is established. The paper also analyzed the relationships between parameters and the method for determining parameters in the mapping model. Actual examples gave the concrete process of QoS mapping and showed its theoretical and applied value for QoS control in grid systems.
On the Condition and the Proof of the Existence of Perfect Secrecy Cryptosystem
LEI Feng-yu,CUI Guo-hua,XU Peng,ZHANG Sha-sha,CHEN Jing
Computer Science. 2010, 37 (5): 99-102. 
Perfect secrecy is one of the important measures of the security of a secrecy system. Based on a deep analysis of the relationships among plaintext size, ciphertext size, key size and the key probabilities of a perfect secrecy cryptosystem, two necessary conditions for a special perfect secrecy cryptosystem were presented and proved. This paper suggested an approach to building a perfect secrecy cryptosystem and summarized four correlated restrictions. By studying a previously unsolved question about the existence of a sort of special perfect secrecy cryptosystem, this paper gave the conclusion and the corresponding proof; furthermore, it derived a group of relationships among plaintext size, ciphertext size and key size, and proved that a perfect secrecy cryptosystem cannot be built in this way. The results tighten the conditions for building perfect secrecy cryptosystems, develop Shannon's communication theory of secrecy systems, and are helpful for designing secure cryptosystems.
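For reference, Shannon's definition of perfect secrecy and the standard size bound that such existence arguments build on can be stated as:

```latex
% Perfect secrecy (Shannon): the ciphertext reveals nothing about the plaintext
\Pr[M = m \mid C = c] \;=\; \Pr[M = m] \quad \text{for all } m,\, c,
% which forces the classic bound on key, ciphertext and message space sizes:
|\mathcal{K}| \;\ge\; |\mathcal{C}| \;\ge\; |\mathcal{M}|.
```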
Efficient Certificateless On-line/Off-line Signcryption Scheme
LUO Ming,WEN Ying-you,ZHAO Hong
Computer Science. 2010, 37 (5): 103-106. 
On-line/off-line signcryption has the advantage of low computation cost. However, the existing on-line/off-line signcryption schemes are based on certificate-based public key cryptography or identity-based cryptography, which suffer from certificate management or key escrow problems. Drawing on the merits of certificateless public key cryptography, an efficient certificateless on-line/off-line signcryption scheme was proposed. The results show that the new scheme avoids both certificate management and key escrow, and also satisfies all the required characteristics of signcryption. It is more efficient than the existing ID-based and certificateless on-line/off-line signcryption schemes.
Event-driven Middleware Platform
HE Jian-li,CHEN Rong,GU Wei-nan
Computer Science. 2010, 37 (5): 107-111. 
Event-driven middleware has become a research focus due to its asynchronous, one-to-many communication properties. This paper proposed a self-adaptive middleware architecture consisting of a base level and a meta level. The meta level was partitioned into three independent models, namely an interface meta-model, an assembly meta-model and an event-driven perception meta-model. The paper focused on the design and implementation of the perception model, which serves the data exchange between objects and provides an environment for running applications. A formal specification of the system based on finite state machines and linear temporal logic was proposed. The design combines threads and events to manage system-level and application-level concurrency. The application example of a GUI system implementation shows that the platform is well suited to developing complicated concurrent applications.
Abnormal Behavior Model Based on Environment Constraint
HE Jia-lang,XU Jian,ZHANG Hong
Computer Science. 2010, 37 (5): 112-114,142. 
By including environmental factors of program operation in the control-flow model and combining the advantages of static analysis methods, this paper established a model for analyzing program behaviors. It marks function call instructions and uses consistency constraints on return values to overcome the indirect-call problems that general methods encounter when function pointers are resolved only at run time. At the same time, the locality principle of programs is used to limit the scope of the analysis within functions. The experimental results show that the model has good accuracy and low performance impact.
Flexible Online Evolution Mechanism for System Software
CHENG Hong-long,LI Ren-fa
Computer Science. 2010, 37 (5): 115-117. 
With the Internet becoming the main runtime environment, computing increasingly takes place in more open, dynamic and variable environments, so software must dynamically adjust its organization or behavior to satisfy changing requirements. Based on the command pattern, exploiting the separation of method invocation from method execution and using centralized scheduling, we designed a flexible online evolution mechanism. Finally the mechanism was shown to be convenient for changing system functions and logic flow.
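A minimal command-pattern sketch of the invocation/execution separation the abstract relies on (names and structure are illustrative, not the paper's implementation):

```python
# Command pattern: invocation is decoupled from execution, so a central
# scheduler can swap or reorder behavior while the system is online.
from collections import deque

class Command:
    def execute(self):
        raise NotImplementedError

class LogCommand(Command):
    def __init__(self, msg):
        self.msg = msg
    def execute(self):
        print("log:", self.msg)

class Scheduler:
    """Centralized scheduler: callers enqueue commands; execution happens
    later, which is the hook an online-evolution mechanism can exploit."""
    def __init__(self):
        self.queue = deque()
    def submit(self, cmd):
        self.queue.append(cmd)       # invocation recorded, not yet run
    def run_all(self):
        while self.queue:
            self.queue.popleft().execute()

sched = Scheduler()
sched.submit(LogCommand("hello"))
sched.submit(LogCommand("world"))
sched.run_all()
```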
Research on a Front-end Tool for Program Analysis Based on Model Checking
YE Jun-min,XIE Qian,JIN Cong,LI Ming,ZHANG Zhen-fang
Computer Science. 2010, 37 (5): 118-122,174. 
A model-checking-based program analysis method was proposed. The main steps include translating C/C++ source code into a Kripke structure equivalent to the control flow graph, describing properties of the source code as CTL formulas, and verifying the program using the model checker NuSMV. Based on this idea, a tool that automatically translates C/C++ source code into NuSMV input was designed and implemented. The experiments show that this approach can analyze programs effectively.
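To give a flavour of what such verification computes, the sketch below evaluates the CTL operator EF p ("p is reachable") on a toy Kripke structure by a least-fixpoint computation; the structure and labelling are made up for illustration:

```python
# Toy Kripke structure: states, transitions, and the set of states where
# atomic proposition p holds. EF p is the least fixpoint of
# Z = p_states ∪ pre(Z), i.e. all states from which p is reachable.
transitions = {0: [1], 1: [2, 3], 2: [2], 3: [0]}
p_states = {2}

def ef(p_states, transitions):
    reach = set(p_states)
    changed = True
    while changed:
        changed = False
        for s, succs in transitions.items():
            if s not in reach and any(t in reach for t in succs):
                reach.add(s)
                changed = True
    return reach

print(ef(p_states, transitions))  # {0, 1, 2, 3}: every state satisfies EF p
```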
Study on Component-based Software Safety Analysis
WAN Yong-chao,ZHOU Xing-she,DONG Yun-wei
Computer Science. 2010, 37 (5): 123-126,161. 
Many obscure expressions and uncertainties exist in the process of safety analysis for complicated safety-critical software, and the theory of fuzzy sets and subjective evaluation is an effective methodology for dealing with these problems. We presented fuzzy expressions of software safety factors, then analyzed the safety score of a single component. After that, we synthesised the safety scores of subsystems and the system quantitatively using fuzzy operations and the evidential reasoning approach. Finally, an example was presented to demonstrate the proposed software safety analysis and synthesis method.
Research of Web Service QoS Evaluation Method with Extended Owl-S Ontology
ZHOU Min,ZHANG Wei-qun,LIN Yi-jie,SHI Ying
Computer Science. 2010, 37 (5): 127-129,183. 
With the wide use of Web service technology, how to find the most suitable services among Web services with similar functions is an urgent problem. This article extended the Web service ontology OWL-S with service quality indicators such as cost, time and reliability, and acquired standard measurement indexes from many aspects. These indexes were used to deduce a ranking of integrated Web services with similar functions, and an evaluation model was built whose results support service selection.
Research of Compiler Optimization Technology Based on Predicated Code
TIAN Zu-wei,SUN Guang
Computer Science. 2010, 37 (5): 130-133. 
The many branch instructions in a program severely restrict the exploitation of parallelism by the architecture and the compiler. One of the major challenges in exploiting instruction-level parallelism effectively is overcoming the limitations imposed by branch instructions. Predicated execution can effectively delete branch instructions by converting them into predicated code, which enlarges the instruction scheduling scope and removes branch misprediction penalties. This paper described compiler optimization technologies based on predicated code, such as instruction scheduling, software pipelining, register allocation and instruction merging. An instruction scheduling algorithm based on predicated code was designed and implemented. The experimental results show that compiler optimization based on predicated code can improve the degree of instruction parallelism, effectively shorten code execution time, and greatly improve program performance.
Spreadsheet-like Construct for Information Convergence
WEI Yong-shan,HAN Yan-bo,SUN Zhong-lin,ZHANG Feng,CHEN Xin
Computer Science. 2010, 37 (5): 134-138. 
Spreadsheets are intuitive for many non-technical users, but there are two difficulties in operating on XML data in a spreadsheet: one is how to represent XML data in the spreadsheet, and the other is how to express complex queries with simple operations such as copy, paste and move. A spreadsheet-like construct was proposed to represent XML data and to express complex queries using a group of simple operations. An XML Schema is represented as a nested table in the spreadsheet-like construct, and users' operations on the spreadsheet are transformed into XQuery statements over the XML schema. The spreadsheet-like construct was implemented in an information convergence system to query complex data distributed in heterogeneous data sources. Compared with popular XQuery construction tools, the spreadsheet is suitable for users with little or no programming experience to query complex data.
Data Quality Model and Metrics Research at Attribute Granularity
CHEN Wei-dong,ZHANG Wei-ming
Computer Science. 2010, 37 (5): 139-142. 
Based on the authors' previous research on data quality propagation for relational databases, the paper summed up and presented a data quality model at attribute granularity which considers the influence of both accuracy and completeness. According to the model, tuples are classified into five categories. The definitions of accuracy and completeness both include the influence of the data item and the tuple. After analysing null data, which is neither correct nor complete, the mutual relationship between accuracy and completeness was formed. Then the attribute error rates before and after quantification were introduced to further define the metrics.
Cloud Database Dynamic Route Scheduling Based on Ant Colony Optimization Algorithm
SHI Heng-liang,BAI Guang-yi,TANG Zhen-min, LIU Chuan-ling
Computer Science. 2010, 37 (5): 143-145. 
Cloud computing is the development trend of the next-generation network computing model, and how to effectively route storage resources in the cloud is a recognized difficulty in industry. The ant colony algorithm is a bionic optimization algorithm based on ant colonies, with many strengths such as intelligent routing, global optimization, robustness, distributed computation, and the ability to hybridize with other algorithms. Based on these two factors, this paper proposed an algorithm that can find the required database rapidly and effectively, reduce the dynamic routing burden of cloud database routing, and enhance the efficiency of cloud computing at large.
Real-time Ordered Query Processing in On-Demand Broadcast Environments
WANG Hong-ya,LIU Xiao-qiang,HE Hao-yuan,SONG Hui,XIAO Ying-yuan,LE Jia-jin
Computer Science. 2010, 37 (5): 146-150. 
Existing research on real-time data dissemination in on-demand data broadcast environments is concerned only with scheduling single data requests with deadline constraints. The issue of processing real-time ordered queries in on-demand broadcast systems was investigated here. In particular, we first formally defined a new kind of scheduling problem, called ROBS, by formulating the real-time ordered query processing problem, and showed that ROBS is NP-hard. Secondly, a novel scheduling algorithm called OL-ROBS was proposed to address the on-line version of ROBS. To tackle the performance issues of OL-ROBS, a delicate data structure for managing data requests was constructed and an efficient pruning algorithm was proposed. Finally, extensive simulations were conducted, and the empirical results show that OL-ROBS offers significant performance gains over the well-known Sin-8 algorithm.
Automatic Abstracting System Based on Improved LexRank Algorithm
JI Wen-qian,LI Zhou-jun,CHAO Wen-han,CHEN Xiao-ming
Computer Science. 2010, 37 (5): 151-154. 
Automatic abstracting has been a priority research topic in computational linguistics, and the study and application of automatic summarization have widely attracted the attention of related disciplines such as computer science, linguistics and informatics. This article first described how the LexRank algorithm works in automatic summarization, then improved the method in three aspects: sentence similarity computation, sentence weight computation and redundancy resolution. The influence factors can be adjusted dynamically according to the document content. The system described in this article can handle single- or multi-document summarization in both English and Chinese. In evaluations on two corpora, our method produced better summaries than the original LexRank algorithm to a certain degree. We also show that our system is quite insensitive to noise in the data that may result from an imperfect topical clustering of documents. In the end, existing problems and the development trends of automatic summarization technology were discussed.
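The core of plain LexRank, before the paper's three improvements (which are not reproduced here), is a power iteration over a cosine-similarity graph of sentences; a minimal sketch with made-up tf-idf vectors:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def lexrank(tfidf_vectors, threshold=0.1, damping=0.85, iters=50):
    """Score sentences by power iteration on the similarity graph."""
    n = len(tfidf_vectors)
    # Connect sentences whose cosine similarity passes the threshold
    adj = [[1.0 if i != j and cosine(tfidf_vectors[i], tfidf_vectors[j]) > threshold
            else 0.0 for j in range(n)] for i in range(n)]
    deg = [sum(row) or 1.0 for row in adj]
    scores = [1.0 / n] * n
    for _ in range(iters):
        scores = [(1 - damping) / n +
                  damping * sum(adj[j][i] / deg[j] * scores[j] for j in range(n))
                  for i in range(n)]
    return scores

# Hypothetical tf-idf vectors for four sentences
vecs = [[1, 0, 1], [1, 1, 0], [0, 1, 1], [1, 1, 1]]
print(lexrank(vecs))
```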
Researches on Similarity Measurement of High Dimensional Data
HE Ling,CAI Yi-chao,YANG Zheng
Computer Science. 2010, 37 (5): 155-156. 
Similarity measurement among data is important for further analysis of a data set. Aiming at similarity measurement for high-dimensional data, the paper put forward a new method based on subspaces: after dividing the high-dimensional space into grids and computing the similarity among data in proper subspaces, the disturbance from the curse of dimensionality can be abated efficiently under proper dividing parameters.
Outlier Detection Based on the Damped Model in Mixed Data Streams
SU Xiao-ke,LAN Yang,QIN Yu-ming,CHENG Yao-dong
Computer Science. 2010, 37 (5): 157-162. 
Outlier detection in data streams poses great challenges due to limited memory and real-time detection requirements. A fast outlier detection algorithm for mixed data streams was introduced: the data streams are clustered incrementally based on the damped model, cluster features representing the data distribution are generated, and the radius threshold changes dynamically. When a detection request is received, the outlier factor of the specified clusters is calculated and clusters with high outlier factors are taken as abnormal clusters. A method is also proposed to distinguish between abnormal clusters and the initial stage of data evolution. The time and space complexity are nearly linear in the size of the data streams. Experimental results on the KDDCUP99 dataset demonstrate that the method can effectively detect outliers in mixed data streams.
Application of Data Warehouse and OLAP in Preference Data Analysis on National College Entrance Examination and Admission
YIN Yuan-fen,ZHANG Zi-li,CAI Hai-min,ZENG Zheng
Computer Science. 2010, 37 (5): 163-164. 
How to choose preferences so as to increase the probability of matriculation at the dream college is a close concern of examinees and their families. By applying OLAP technology to historical preference data, some interesting and referential results were mined. This paper focused on how to build the related data warehouse as well as on the related technical issues.
Self-adaptive Genetic Algorithm with Classification
HUANG Li,DING Li-xin,DU Wei-wei
Computer Science. 2010, 37 (5): 165-167. 
Aiming at balancing the search quality and search speed of evolutionary algorithms, we proposed a search strategy that classifies individuals by the similarity of their fitness, differentiating the roles of individuals in the search process. Premature convergence is one of the known difficulties of GA, so an improved selection mechanism was used to deal with this drawback. On the one hand, a new parameter named success ratio was introduced: a higher setting causes higher selection pressure and keeps the algorithm from premature convergence. On the other hand, another parameter T, analogous to the simulated annealing temperature, was introduced: while mutation and crossover happen, a virtual population is generated according to this parameter, enlarging the search space and keeping the diversity of the generation. Finally, experimental results on benchmark TSP problems show that the new method is capable of producing high-quality solutions and preventing premature convergence efficiently.
Research on Relevance Feedback Algorithm Based on Combining Classifiers
LU Xiao-yan,ZHOU Liang,DING Qiu-lin
Computer Science. 2010, 37 (5): 168-170. 
High retrieval performance in a content-based vector graphics retrieval system can be attained by adopting relevance feedback algorithms. A new relevance feedback approach based on combining classifiers was proposed. It combines the expected results from independent nearest-neighbor classifiers, each with only one training sample formed by a positive or negative feedback sample, computes the relevance score of every vector graphic, and optimizes the relevance score with the technique called "Bayesian Query Shifting". The experimental results show that the algorithm not only further improves the precision of the vector graphics retrieval system but also ensures the recall of the system.
Research and Simulating of Global Optimal Path Planning of Mobile Robot Based on Ant Colony System
ZHOU Jing,DAI Guan-zhong,CAI Xiao-yan
Computer Science. 2010, 37 (5): 171-174. 
The purpose of this paper is to construct active, adaptive ant agents following the bottom-up modeling ideas of CAS theory, and to rely on the cooperation between them to complete global path planning for an autonomous mobile robot based on an ant colony system, using a visibility graph as the route map. Each ant agent evolving in the environment has its own internal structure and behavior rules. Multiple ant agents gather into an artificial ant colony and communicate by stigmergy through pheromone, and consequently the swarm intelligence of path planning emerges.
Techniques for Threat Assessment Based on Intuitionistic Fuzzy Theory and Plan Recognition
WANG Xiao-fan,WANG Bao-shu
Computer Science. 2010, 37 (5): 175-177. 
By analyzing the nondeterminacy of the information in situation assessment and the insufficiency of plan recognition, a technique for multiple-attribute plan recognition based on intuitionistic fuzzy theory was proposed, combining intuitionistic fuzzy theory with plan recognition. Multiple-attribute plan recognition models based on intuitionistic fuzzy sets were built and calculation methods were given. Concrete examples demonstrated the validity and correctness of the models. The experimental results show that the models can improve the efficiency and reliability of threat assessment and give a vivid description of trend anticipation.
Discrete Particle Swarm Optimization Based on Chaotic Ant Behavior and its Application
XU Qing-he,LIU Shi-rong,LV Qiang
Computer Science. 2010, 37 (5): 178-180. 
Considering the respective characteristics of the ant colony algorithm and particle swarm optimization, the update equations for particle speed and position were redefined on the basis of the PSO algorithm, and a discrete particle swarm optimization algorithm based on chaotic ant behavior was proposed, borrowing the pheromone refresh mechanism of the ant colony algorithm. The knapsack problem was used to test the performance of the algorithm. Compared with other algorithms, the experimental results show that the proposed algorithm yields better profits.
Research on CF Algorithm Based on Probabilistic Analysis of Discrete Rating Vector
TIAN Wei,XU Jing,PENG Yu-qing
Computer Science. 2010, 37 (5): 181-183. 
User similarity computation is a key step of the Collaborative Filtering (CF) algorithm. An All-Average method and a classified-recommendation improved algorithm, both based on probabilistic analysis of users' discrete explicit rating vectors, were proposed to solve the CF sparsity problem and other practical problems. Experimental results show that the improved method enhances the precision and quality of CF prediction.
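For context, the user-similarity step that the paper improves is conventionally a cosine (or Pearson) comparison of sparse rating vectors; a minimal sketch with made-up ratings:

```python
import math

def cosine_similarity(ratings_a, ratings_b):
    """Cosine similarity over the items both users rated.

    ratings_* are dicts mapping item id -> rating; sparsity (few shared
    items) is exactly what degrades this step in practice.
    """
    shared = set(ratings_a) & set(ratings_b)
    if not shared:
        return 0.0
    dot = sum(ratings_a[i] * ratings_b[i] for i in shared)
    na = math.sqrt(sum(ratings_a[i] ** 2 for i in shared))
    nb = math.sqrt(sum(ratings_b[i] ** 2 for i in shared))
    return dot / (na * nb)

alice = {"item1": 5, "item2": 3, "item3": 4}
bob   = {"item2": 4, "item3": 2, "item4": 5}
print(cosine_similarity(alice, bob))
```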
Solving Einstein's Puzzle with SAT
TIAN Cong,DUAN Zhen-hua,WANG Xiao-bing
Computer Science. 2010, 37 (5): 184-186. 
The Einstein Puzzle, or Zebra Puzzle, is a widely known riddle attributed to Einstein in the early 20th century; he is said to have claimed that 98% of the people in the world cannot solve it. The question is a typical logical problem which can be formalized as a SAT problem. We investigated how to solve the riddle with SAT, and then a currently popular SAT solver, MiniSat, was employed to solve the riddle automatically.
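A typical encoding introduces one Boolean variable per (house, attribute-value) pair; the sketch below (hypothetical variable numbering, standard exactly-one clauses) emits CNF for one attribute category in DIMACS form, consumable by MiniSat-style solvers:

```python
from itertools import combinations

HOUSES = range(5)
COLORS = ["red", "green", "ivory", "yellow", "blue"]

def var(house, color_index):
    # Hypothetical numbering: variable ids 1..25 for the color category
    return house * len(COLORS) + color_index + 1

clauses = []
# Each house has at least one color ...
for h in HOUSES:
    clauses.append([var(h, c) for c in range(len(COLORS))])
# ... and at most one color (pairwise exclusion)
for h in HOUSES:
    for c1, c2 in combinations(range(len(COLORS)), 2):
        clauses.append([-var(h, c1), -var(h, c2)])
# Each color appears in at most one house
for c in range(len(COLORS)):
    for h1, h2 in combinations(HOUSES, 2):
        clauses.append([-var(h1, c), -var(h2, c)])

# DIMACS output; the puzzle's fifteen hint clauses would be appended similarly
print(f"p cnf 25 {len(clauses)}")
for cl in clauses:
    print(" ".join(map(str, cl)), 0)
```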
Uncertain Temporal Logic Model Based on Intuitionistic Fuzzy Sets
SHEN Xiao-yong,LEI Ying-jie,ZHOU Chuang-ming,YANG Shao-chun
Computer Science. 2010, 37 (5): 187-189. 
To address the limitations in describing complicated and uncertain temporal knowledge, an uncertain temporal logic based on intuitionistic fuzzy sets was proposed. The model defines the determinant formulas of point temporal logic, point-interval temporal logic and interval temporal logic in both continuous and discrete domains. By adding a hesitancy degree parameter, the reasoning results become more believable and precise. Finally, two kinds of uncertain temporal knowledge were described and the possibility of every temporal relationship was measured in two instances, indicating that the model is superior.
Definition of N-norm and its Generation Theorem on the [0,∞] Interval
FAN Yan-feng,HE Hua-can
Computer Science. 2010, 37 (5): 190-193. 
The research aim of flexible logics is to explore the general laws of logic; there, propositional truth-value error is described by the continuously changeable generalized self-correlation coefficient k ∈ [0,∞]. The N-norm is the mathematical model of the 1-level operation in uncertain reasoning. In the real world, many logical reasoning controls must be accomplished in the original truth-value domain. This paper used triangular norm theory as the main mathematical tool to study flexible logics on the interval [0,∞]. The N-norm and N-generator on [0,∞] were defined and their main properties were studied; the generation theorem of the N-norm was proven, and a method for calculating the generalized self-correlation coefficient k was given. The paper also proved that exponential N-generators form the complete cluster of N-generators on [0,∞]. These results provide an important theoretical base for the 1-level operation model on the interval [0,∞] in flexible logics.
New Approach to Analyzing Meaning of Natural Language on Modifying Relations
TIAN Wei-xin,ZHU Fu-xi,DAN Zhi-ping
Computer Science. 2010, 37 (5): 197-202. 
Acquiring the meaning of natural language is a bottleneck for deeper applications of natural language processing (NLP). There are two main approaches to analyzing the meaning of natural language at the concept-relation level: one extracts characteristic vectors based on statistics, and the other computes semantic similarities according to a semantic dictionary like WordNet or HowNet. Both methods have weaknesses in application. The former is only applicable to analyzing materials of large granularity such as paragraphs, documents or multi-document sets, and is not fit for applications at the level of sentences or words. The latter can easily handle all sorts of relations between concepts, but for complicated modifying relations between concepts and events, concepts and concepts, or events and events, the semantic dictionary and the computing method must be extended. This paper presented a new method of structuring a semantic knowledge base (SKB) according to the modifying relations of real contexts; an algorithm for computing unknown relations on the knowledge base was presented; we pointed out how to design the rules for constructing natural language sentences under modifying relations and presented the algorithm; in the end we experimented on a platform developed in light of the theory above, and the result shows the theory is feasible.
Policy of Pheromone Update with Important Solution Components
BI Ying-zhou,ZHONG Zhi,DING Li-xin,YUAN Chang-an
Computer Science. 2010, 37 (5): 203-205. 
The pheromone trails in ACO reflect the ants' search experience, and the ants exploit them to probabilistically construct solutions to the problem, so the quality of the pheromone is crucial to the success of ACO. The main factors affecting the quality of the pheromone are the pheromone update policy and the quality of the constructed solutions. In order to improve the constructed solutions, this paper presented a method to detect the invalid components of a constructed solution and then repair them with an immunity operator. When the pheromone density on the components is updated according to the improved solution, it more exactly reflects the character of high-quality solutions, which speeds up the positive feedback procedure. The results show that the use of immunity repairing helps to find competitive solutions in a relatively short time.
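The standard pheromone update that such a policy builds on (evaporation plus deposits on the components of good, possibly repaired, solutions; parameter names are the usual textbook ones, not the paper's) looks like:

```python
def update_pheromone(tau, solutions, rho=0.1, q=1.0):
    """Classic ACO update: evaporate all trails, then deposit pheromone
    on the components of each solution in proportion to its quality.
    tau maps component -> pheromone level; a solution is (components, cost)."""
    for c in tau:
        tau[c] *= (1.0 - rho)                     # evaporation
    for components, cost in solutions:
        for c in components:
            tau[c] = tau.get(c, 0.0) + q / cost   # deposit on good components
    return tau

tau = {("a", "b"): 1.0, ("b", "c"): 1.0, ("a", "c"): 1.0}
tau = update_pheromone(tau, [((("a", "b"), ("b", "c")), 5.0)])
print(tau)
```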
UIMMS of Maneuverable Target Tracking for Multistatic Radar System
ZHAO Hui-bo,PAN Quan,LIANG Yan,WANG Zeng-fu
Computer Science. 2010, 37 (5): 206-209. 
This paper considered the problem of tracking maneuvering targets from radar measurements. It discussed the Kalman smoother, introduced smoothing methods, and compared the Unscented Kalman filter and Unscented Kalman smoother with the Interacting Multiple Model Filter (IMMF) and Interacting Multiple Model Smoother (IMMS) based on the multiple-model idea. Meanwhile, for the nonlinearity arising from the radar measurements and the target characteristics, models of the maneuvering target and several radar measurements were designed, the classified methods were compared and the new methods were tested; experimental results show their effectiveness.
Efficient Fixed-parameter Enumeration Algorithm for the 3-D Matching Problem
LIU Yun-long,WANG Jian-xin
Computer Science. 2010, 37 (5): 210-213. 
Enumerating a number of good solutions to a problem is in increasing demand in recent research in computational science. In this paper, we presented a fixed-parameter enumeration algorithm for the 3-D Matching problem. More precisely, we developed an algorithm that, on a given set S of n weighted triples and two integers k and z, produces z "best" k-matchings in time O(5.483^k kn^2 z). Our algorithm is based on recently improved color-coding techniques and the fixed-parameter enumeration framework. This result shows that the 3-D Matching problem is fixed-parameter linearly enumerable.
Emotional Modeling in an E-learning System Based on OCC Theory
QIAO Xiang-jie,WANG Zhi-liang,WANG Wan-sen
Computer Science. 2010, 37 (5): 214-218. 
According to the OCC model, an emotion recognition model based on cognition and appraisal theory was proposed. The expectation probability of events in the educational environment is determined by fuzzy reasoning. A dynamic Bayesian network was also built to evaluate and simulate the learning process, and the results show the model's validity and rationality. This work thus provides a novel emotion recognition model and framework for making e-learning systems more emotionally intelligent.
Dimensionality Reduction Symmetrical PSO Algorithm Characterized by Heuristic Detection and Self-learning
SHAO Zeng-zhen,WANG Hong-guo,LIU Hong
Computer Science. 2010, 37 (5): 219-222. 
A novel symmetrical PSO algorithm (SymPSO_HD) was proposed to improve the search ability of the PSO algorithm. To initialize the swarm effectively, a population scatter entropy strategy was introduced. In order to improve particles' position vectors, a special kind of particle characterized by detection was also proposed. And to enhance particles' learning ability in both local and global domains, we put forward a clone-mutation-selection strategy in the neighborhood and a dimensionality reduction symmetrical strategy in the global area. Simulation and analysis show that the SymPSO_HD algorithm has stable search ability and strong adaptability, and can converge to the global optimum with large probability.
Logic Study of Epistemic and Rational Conditions of Some Game-theoretic Solutions
JIANG Jun-li,TANG Xiao-jia
Computer Science. 2010, 37 (5): 223-227. 
In the process of seeking game-theoretic solutions, players are supposed to be rational, which means that they all tend to maximize their expected payoffs. However, there is no definitive definition of rationality in game theory, and the usual assumption that mutual knowledge of individual rationality is common knowledge is coarse and maybe too strong, especially when the game is finite; in fact, the nesting degree of interactive knowledge of individual rationality has a maximal value, which we show. In this paper we formalized the special rationality conditions for each solution algorithm, which we modified for the sake of the program. On the other hand, we proposed an algorithm-solution corresponding to strong rationality. Moreover, we showed that iterated announcement of the rationality we defined arrives exactly at the corresponding algorithm-solution.
Tone Recognition Based on Support Vector Machine in Continuous Mandarin Chinese
FU De-sheng,LI Shi-qiang,WANG Shui-ping
Computer Science. 2010, 37 (5): 228-230. 
Tone is an essential component of word formation in Chinese languages and plays a very important role in the transmission of information in speech communication. We investigated using support vector machines (SVMs) for automatic tone recognition in continuously spoken Mandarin. The voiced segments were detected based on the Teager Energy Operator and ZCR. Compared with a BP neural network, considerable improvement was achieved by adopting a 6-binary-SVM scheme in a speaker-independent Mandarin tone recognition system.
Modeling Traffic Flow Based on Autonomous Agents and Cellular Automata
DU Xiao-dan,ZHAO Shi-bo,YAN Tao,ZHANG Feng-li
Computer Science. 2010, 37 (5): 231-233,239. 
This paper presented a new cellular automaton model to simulate traffic flow on a single-lane highway. A driver-vehicle unit was abstracted as an autonomous agent. The model focuses on the interactions among multiple autonomous agents ahead and on agents' physical restrictions (particularly, acceleration and deceleration capabilities), realized by dual-regime acceleration and deceleration. The fundamental diagram, speed-location plot, space-time plot and phase diagram were presented, and the cross-correlation function was studied to identify free flow and synchronized flow. The simulation results show that the model can reproduce most empirical findings.
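For orientation, the classic single-lane cellular automaton that such agent models extend is the Nagel-Schreckenberg model (the paper's model adds multi-agent interactions and dual-regime dynamics not shown here):

```python
import random

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3):
    """One parallel update of the Nagel-Schreckenberg single-lane CA:
    accelerate, brake to the gap ahead, randomize, then move."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    for idx, i in enumerate(order):
        ahead = pos[order[(idx + 1) % n]]
        gap = (ahead - pos[i] - 1) % road_len
        vel[i] = min(vel[i] + 1, v_max, gap)      # accelerate, then brake
        if vel[i] > 0 and random.random() < p_slow:
            vel[i] -= 1                           # random slowdown
    for i in range(n):
        pos[i] = (pos[i] + vel[i]) % road_len     # advance all cars

road_len, cars = 100, 20
pos = random.sample(range(road_len), cars)
vel = [0] * cars
for _ in range(100):
    nasch_step(pos, vel, road_len)
print(sum(vel) / cars)   # mean speed after relaxation
```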
Rapid PC Hardware Based Visualization Method for Large-scale Datasets
CHEN Shi-hao,HAO Chong-yang
Computer Science. 2010, 37 (5): 234-236. 
As 3D datasets are usually large in scale, the rendering capability of a single CPU is not sufficient to achieve interactivity. Direct volume rendering via GPU has positioned itself as an efficient tool for the display and visual analysis of volumetric scalar fields. A rapid PC-hardware-based visualization method for large-scale datasets was proposed, and its effectiveness was demonstrated on several data sets. It was shown that the proposed method can generate high-quality visual representations on an ordinary PC.
CT Image Segmentation Based on Support Vector Machine and Regional Growth
LIU Lu,CHU Chun-yu,MA Jian-wei,LIU Wan-yu
Computer Science. 2010, 37 (5): 237-239. 
In order to overcome the difficulty of determining the growth rules in the conventional region growing algorithm and the slowness of the support vector machine segmentation algorithm, an image segmentation method combining support vector machines and region growing was proposed. Firstly, a certain number of sample points are selected from the target and non-target areas to train the support vector machine classifier; then the trained classifier searches for seed points and performs region growing, with the classifier serving as the growth rule; finally, necessary post-processing is applied to edges and noise. The experimental results show that this algorithm is feasible, performs better than the conventional region growing segmentation algorithm, and is faster than the conventional support vector machine segmentation algorithm.
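A bare-bones region growing loop, here with a fixed intensity predicate standing in for the trained SVM classifier that the paper uses as the growth rule:

```python
from collections import deque

def region_grow(img, seed, predicate):
    """Grow a region from seed over 4-connected pixels accepted by
    predicate(value); in the paper's method the predicate would be a
    trained SVM classifier rather than a threshold."""
    h, w = len(img), len(img[0])
    region, frontier = set(), deque([seed])
    while frontier:
        y, x = frontier.popleft()
        if (y, x) in region or not (0 <= y < h and 0 <= x < w):
            continue
        if predicate(img[y][x]):
            region.add((y, x))
            frontier.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return region

img = [[0, 0, 9, 9],
       [0, 8, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
print(sorted(region_grow(img, (0, 2), lambda v: v >= 8)))
```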
New Compression Approach to Hyper-spectral Images Based on Second Order Difference Predictive
ZHANG Wei,DAI Ming,YIN Chuan-li,FENG Yu-ping
Computer Science. 2010, 37 (5): 240-242,246. 
Since hyper-spectral images have strong correlation in both the spectral and the spatial dimension, a novel compression scheme based on second-order difference prediction combined with spatial prediction was presented. The MED predictor is used to remove spatial correlation, and a second-order difference predictor is used to remove spectral correlation; a unified predictor is then designed based on the weights of the prediction errors. Finally, near-lossless compression is completed with context-based coding. The results show that the compression ratio can reach 12.7 when the PSNR is about 39.4 dB, so the algorithm is efficient.
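The MED (median edge detector) predictor mentioned here is the standard one from LOCO-I/JPEG-LS: with left, above and above-left neighbours a, b, c, the prediction is

```latex
% MED predictor (LOCO-I/JPEG-LS): a = left, b = above, c = above-left
\hat{x} =
\begin{cases}
\min(a,b) & \text{if } c \ge \max(a,b),\\
\max(a,b) & \text{if } c \le \min(a,b),\\
a + b - c & \text{otherwise.}
\end{cases}
```

A second-order difference predictor along the spectral axis would take a form such as $\hat{x}_\lambda = 2x_{\lambda-1} - x_{\lambda-2}$; this is a plausible textbook form, and the paper's exact spectral predictor and weighting are not reproduced here.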
3D Information Extraction Algorithm Based on a Single Image
LIU Ri-chen,LIU Yi-guang,YIN Xin,ZHAO Yi-ming
Computer Science. 2010, 37 (5): 243-246. 
A 3D information extraction algorithm based on a single image was designed for the difficulties of this research field. In this algorithm, the camera imaging matrix M(3×4) is obtained by camera calibration first; its pseudo-inverse M(4×3) is then calculated, the essence of this computation being the solution of inconsistent equations by least-squares approximation. 3D information is extracted from the single image through the pseudo-inverse, and the result is shown in an image by a projection operator. Owing to the limited information in a single image, the result can currently be used only for qualitative analysis (qualitative vision). The experiment indicates that the extracted object should be coplanar. Moreover, the essential theory of the 3D-to-2D projection was deduced, the conclusion being that the imaging process is irreversible.
Method of Constructing Intermediate Space for Print-oriented Spectral Color Management
WANG Ying,ZENG Ping
Computer Science. 2010, 37 (5): 247-250. 
Abstract PDF(309KB) ( 358 )   
RelatedCitation | Metrics
The high dimensionality of the spectral space in spectral color management results in long run times and large memory requirements when color-processing multi-spectral images. To overcome this shortcoming, a method to construct an intermediate space was proposed. By analyzing the color management process, an intermediate space was introduced and the flow of spectral color management based on the intermediate space was established first. Then, for the printing application of multi-spectral images, the principal component analysis method was employed to construct an eigenvector space based on the printer characteristic samples, and the new space derived from this method was appointed as the intermediate space. Finally the eigenvector matrix was utilized to achieve the transformation between the intermediate space and the spectral space. Experiments show that the transformation efficiency is increased when the printer characteristic samples are used during dimension reduction. The colorimetric and spectral precision of the conversion is high, and low-dimensional image data in the intermediate space can keep the main information of the source spectral data.
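As a sketch of the construction, the following Python fragment builds the intermediate space by PCA over printer-characteristic samples and converts spectra in both directions through the eigenvector matrix. The 31-band sampling, the dimension k=6 and the random stand-in samples are all assumptions.

    import numpy as np

    def build_intermediate_space(samples, k=6):
        """samples: (n, 31) printer-characteristic spectra -> (mean, basis)."""
        mean = samples.mean(axis=0)
        # principal components of the centred samples
        _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
        return mean, vt[:k]                    # k eigenvectors span the space

    def to_intermediate(spectrum, mean, basis):
        return basis @ (spectrum - mean)       # spectral -> intermediate

    def to_spectral(coords, mean, basis):
        return mean + basis.T @ coords         # intermediate -> spectral

    rng = np.random.default_rng(0)
    samples = rng.random((100, 31))
    mean, basis = build_intermediate_space(samples)
    c = to_intermediate(samples[0], mean, basis)
    recon = to_spectral(c, mean, basis)        # approximate reconstruction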
Face Recognition Based on Two-dimensional Maximum Difference Marginal Fisher Analysis
LU Gui-fu,LIN Zhong,JIN Zhong
Computer Science. 2010, 37 (5): 251-253. 
Abstract PDF(313KB) ( 319 )   
RelatedCitation | Metrics
A novel two-dimensional maximum difference marginal Fisher discriminant analysis (2DMDMFA) was proposed for face recognition. The algorithm adopts as its discriminant criterion the difference between the similarity matrix Sp, which characterizes the inter-class separability, and the similarity matrix Sc, which characterizes the intra-class compactness. In this way, the small-sample-size problem that occurs in marginal Fisher analysis (MFA) is avoided. In addition, the construction of Sp and Sc is based directly on the original training image matrices rather than on vectors; it is not necessary to convert each image matrix into a high-dimensional image vector as in previous methods, so the recognition rate is raised. Besides, the relations between the maximum difference marginal Fisher analysis discriminant criterion and the marginal Fisher analysis discriminant criterion for feature extraction were revealed. Experimental results on the ORL and Yale face databases show that the algorithm outperforms the traditional methods in recognition performance.
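A hedged sketch of the difference-style criterion: instead of the ratio trace used by MFA, take the leading eigenvectors of Sp - Sc, which requires no matrix inversion and hence no small-sample-size workaround. How Sp and Sc are actually assembled from neighbourhood graphs of 2D image matrices is simplified away here; the toy scatter matrices below are placeholders.

    import numpy as np

    def max_difference_projection(Sp, Sc, d):
        """Return the d directions maximising w^T (Sp - Sc) w."""
        vals, vecs = np.linalg.eigh(Sp - Sc)   # symmetric eigen-decomposition
        return vecs[:, np.argsort(vals)[::-1][:d]]

    # toy 2D setting: Sp/Sc accumulated as (cols x cols) scatter-like matrices
    rng = np.random.default_rng(1)
    A, B = rng.random((50, 20)), rng.random((50, 20))
    Sp, Sc = A.T @ A, B.T @ B
    W = max_difference_projection(Sp, Sc, d=5)
    Y = rng.random((32, 20)) @ W               # project a 32x20 image to 32x5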
Adaptive Parameters Conditional Random Field Video Segmentation Algorithm
ZHENG He-rong,CHU Yi-ping,PAN Xiang
Computer Science. 2010, 37 (5): 254-256. 
Abstract PDF(369KB) ( 448 )   
RelatedCitation | Metrics
Aiming at the problem that video segmentation methods based on conditional random fields need empirically set parameter values to obtain optimal segmentation results, a video segmentation model was proposed that adaptively computes the neighboring-relationship feature functions from video features and constructs an adaptive-parameter conditional random field. The core idea of the algorithm is to calculate the various model feature functions from pixel neighboring relationships in the video; these feature energy functions are constrained by the conditional random field model, which is solved via a Gibbs sampling algorithm to obtain globally optimal segmentation results. The experiments show that the results of the adaptive-parameter algorithm match those obtained with optimal empirical parameters.
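For concreteness, the following Python sketch shows one Gibbs-sampling scheme for a binary (foreground/background) random-field segmentation with a Gaussian data term and a Potts smoothness term. The fixed class means and coupling weight beta below stand in for the adaptive, feature-derived parameters constructed by the model above.

    import numpy as np

    def gibbs_segment(img, mu=(0.2, 0.8), sigma=0.15, beta=1.5, sweeps=10, seed=0):
        rng = np.random.default_rng(seed)
        lab = (np.abs(img - mu[1]) < np.abs(img - mu[0])).astype(np.int64)
        H, W = img.shape
        for _ in range(sweeps):
            for y in range(H):
                for x in range(W):
                    # energy of assigning label k at (y, x): data + smoothness
                    e = [(img[y, x] - mu[k]) ** 2 / (2 * sigma ** 2) for k in (0, 1)]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < H and 0 <= nx < W:
                            for k in (0, 1):
                                e[k] += beta * (lab[ny, nx] != k)  # Potts penalty
                    p1 = 1.0 / (1.0 + np.exp(e[1] - e[0]))         # P(label = 1)
                    lab[y, x] = int(rng.random() < p1)
        return lab

    img = np.clip(np.random.default_rng(1).normal(0.5, 0.3, (32, 32)), 0, 1)
    mask = gibbs_segment(img)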
Fast Image Reconstruction for Cone-beam CT Based on Minimal Cylindrical Reconstruction Region
ZHANG Shun-li,ZHANG Ding-hua,LI Ming-jun,GUO Xin-ming
Computer Science. 2010, 37 (5): 257-260. 
Abstract PDF(315KB) ( 350 )   
RelatedCitation | Metrics
Conventional image reconstruction for cone-beam CT usually assumes a cube or its inscribed cylinder as the reconstruction region. Considering that the reconstruction objects of industrial CT differ greatly in size, a fast image reconstruction method based on a minimal cylindrical reconstruction region was proposed. Firstly, the envelope diagram of the minimal region of the reconstruction object is obtained by a line scan conversion algorithm from the cone-beam projection data of different views. Then, the envelope of the minimal region is determined by a region filling algorithm, and on this basis the radius of the minimal cylinder is obtained by the midpoint circle algorithm. This method can adaptively determine the minimal cylindrical reconstruction region according to the size of the reconstruction object, so unnecessary computation is reduced. Experimental results show that the proposed method can effectively improve the reconstruction speed of the algebraic reconstruction technique while obtaining better reconstruction quality.
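A minimal sketch of the final geometric step, assuming the object envelope is already available as a binary mask per slice (the paper derives it from the projections by scan conversion and region filling): the cylinder radius is just the largest distance from the rotation axis to any envelope pixel, and only voxels inside that radius need algebraic-reconstruction updates.

    import numpy as np

    def minimal_cylinder_radius(mask, axis_yx=None):
        """Smallest radius about the rotation axis covering all envelope pixels."""
        ys, xs = np.nonzero(mask)
        if axis_yx is None:                    # rotation axis through the centre
            axis_yx = ((mask.shape[0] - 1) / 2.0, (mask.shape[1] - 1) / 2.0)
        r = np.hypot(ys - axis_yx[0], xs - axis_yx[1])
        return float(np.ceil(r.max()))

    mask = np.zeros((256, 256), bool)
    mask[100:140, 90:200] = True               # toy envelope
    print(minimal_cylinder_radius(mask))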
Face Recognition Method Based on Multi-channel Gabor Filtering and Center-Symmetric Local Binary Pattern
HE Zhong-shi,LU Jian-yun,YU Lei
Computer Science. 2010, 37 (5): 261-264. 
Abstract PDF(329KB) ( 533 )   
RelatedCitation | Metrics
In recent years, the Local Binary Pattern (LBP) has been successfully applied to face recognition. However, the dimensionality of the feature vector extracted by LBP is usually very high. In contrast, the Center-Symmetric Local Binary Pattern (CS-LBP) encodes the image using center symmetry, so CS-LBP can greatly reduce the extracted feature dimension. Therefore, in this paper, CS-LBP was employed to extract features from facial images, and a new face recognition algorithm based on multi-channel Gabor filtering (MCGF) and CS-LBP was brought forward. Experimental results on the Yale, ORL and FERET face databases demonstrate that, compared with LBP, CS-LBP achieves comparable recognition rates with lower feature dimensionality. Additionally, CS-LBP based on MCGF increases the accuracy markedly.
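The CS-LBP operator itself is compact enough to sketch: only the four centre-symmetric neighbour pairs of a 3x3 neighbourhood are compared, giving a 4-bit code (16 histogram bins) instead of LBP's 8-bit code. The threshold value and the wrap-around border handling below are assumptions.

    import numpy as np

    def cs_lbp(img, T=0.01):
        # the 8 neighbours in circular order; pair i is (n[i], n[i+4])
        offs = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
        n = [np.roll(np.roll(img, -dy, axis=0), -dx, axis=1) for dy, dx in offs]
        code = np.zeros(img.shape, np.uint8)
        for i in range(4):                     # 4 centre-symmetric pairs
            code |= (n[i] - n[i + 4] > T).astype(np.uint8) << i
        return code                            # values in [0, 15]

    img = np.random.rand(64, 64)
    hist = np.bincount(cs_lbp(img).ravel(), minlength=16)  # 16-bin histogram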
Improved Destriping Algorithm of Hyperspectral Images
ZHENG Feng-bin,ZHI Jing-jing,GAO Hai-liang,LAI Ji-bao,PAN Wei
Computer Science. 2010, 37 (5): 265-267. 
Abstract PDF(254KB) ( 538 )   
RelatedCitation | Metrics
The traditional moment matching algorithm can change the line or column averages of an image and thus alter the original image information. After analyzing HJ-1-A satellite hyperspectral images, an improved moment matching algorithm was put forward. The proposed algorithm uses the column averages and variances processed with a smoothing filter instead of the average and standard deviation of a reference image as in traditional moment matching. Compared with the traditional algorithm, it loses less image information and effectively removes the stripes while maintaining the original image features. The method is also broadly applicable to destriping remote sensing images from other multi-sensor platforms.
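A minimal sketch of the improved matching, assuming vertical stripes and a moving-average smoother of assumed window size: each column is standardized by its own statistics and re-expressed against the smoothed column statistics, rather than against a single reference image as in classic moment matching.

    import numpy as np

    def destripe(img, win=15):
        mu = img.mean(axis=0)                  # per-column mean
        sd = img.std(axis=0) + 1e-12           # per-column std
        kernel = np.ones(win) / win
        mu_s = np.convolve(mu, kernel, mode='same')   # smoothed targets
        sd_s = np.convolve(sd, kernel, mode='same')
        # moment matching, but against the smoothed column statistics
        return (img - mu) / sd * sd_s + mu_s

    img = np.random.rand(128, 128)
    img[:, ::8] += 0.3                         # synthetic vertical stripes
    clean = destripe(img)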
Algorithm for Recognition Methods of Paper-cutting's Patterns Based on Multiresolution Fourier-Mellin Transform
WANG Xiao-yun,WEI Yue-qiong,QIN Fang-yuan,LI Guo-xiang,ZHANG Xian-quan
Computer Science. 2010, 37 (5): 268-270. 
Abstract PDF(255KB) ( 499 )   
RelatedCitation | Metrics
A paper-cutting recognition algorithm based on the multiresolution Fourier-Mellin transform was proposed. The algorithm applies the Fourier-Mellin transform to the paper-cutting and then, exploiting the invariance of the Fourier-Mellin transform, calculates the energies of the subbands obtained by a wavelet transform, each energy consisting of the mean value and mean-square deviation of the corresponding subband. These energies are taken as the invariant feature vector. Experiments indicate that this feature vector not only possesses translation, rotation and scale invariance but also handles the recognition of patterns with large geometric deformations satisfactorily.
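As an illustrative sketch, the fragment below computes subband mean/standard-deviation energies over a wavelet decomposition of the Fourier magnitude (the translation-invariant part). The log-polar (Mellin) resampling stage is elided, and the wavelet family and decomposition depth are assumptions.

    import numpy as np
    import pywt

    def subband_features(img, level=2, wavelet='db2'):
        mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))   # translation-invariant
        coeffs = pywt.wavedec2(mag, wavelet, level=level)
        feats = [np.mean(coeffs[0]), np.std(coeffs[0])]   # approximation band
        for (ch, cv, cd) in coeffs[1:]:                   # detail subbands
            for band in (ch, cv, cd):
                feats += [np.mean(band), np.std(band)]
        return np.asarray(feats)

    f = subband_features(np.random.rand(128, 128))
    # recognition would then compare such vectors, e.g. by nearest neighbour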
DEW Video Watermarking Algorithm Based on Energy Difference Ratio
SUN Wen-jing
Computer Science. 2010, 37 (5): 271-273. 
Abstract PDF(252KB) ( 333 )   
RelatedCitation | Metrics
A DEW video watermarking algorithm based on the energy difference ratio was proposed. This algorithm embeds the watermark in areas of I-frame video image blocks where the energy is evenly distributed, and the same watermark is embedded in every I-frame. Experimental results show that the proposed watermarking algorithm not only performs well in terms of robustness but also exhibits stable rates and high reliability in the tests.
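The differential-energy idea behind DEW can be sketched on a single block pair: the bit is encoded in the sign of the energy difference between the high-frequency DCT coefficients of two regions, enforced by discarding coefficients on the side whose energy must drop. The 8x8 blocks, the frequency mask and the single-pair setting below are simplifications; full DEW schemes work over zig-zag-ordered coefficient sets of many blocks with an adjustable cut-off.

    import numpy as np
    from scipy.fft import dctn, idctn

    HIGH = np.add.outer(np.arange(8), np.arange(8)) >= 6   # high-frequency mask

    def energy(block):
        return float((dctn(block, norm='ortho')[HIGH] ** 2).sum())

    def embed_bit(a, b, bit):
        """Encode the bit in sign(E(a) - E(b)) by zeroing high-frequency
        coefficients of whichever block must end up with less energy."""
        ca, cb = dctn(a, norm='ortho'), dctn(b, norm='ortho')
        (cb if bit else ca)[HIGH] = 0.0
        return idctn(ca, norm='ortho'), idctn(cb, norm='ortho')

    def extract_bit(a, b):
        return int(energy(a) > energy(b))

    a, b = np.random.rand(8, 8), np.random.rand(8, 8)
    a2, b2 = embed_bit(a, b, 1)
    assert extract_bit(a2, b2) == 1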
Study and Design of Reconfigurable Embedded Computing System Based on Xilinx SoPC
ZHANG Yu,FENG Dan
Computer Science. 2010, 37 (5): 274-277. 
Abstract PDF(423KB) ( 535 )   
RelatedCitation | Metrics
High-performance embedded computing systems need considerable computational power and flexibility to meet various application requirements. A reconfigurable SoPC design based on a Xilinx FPGA was presented. The system uses a dedicated hardware accelerator to process computationally intensive tasks, and the accelerator can be dynamically reconfigured at run time. The hardware processing engine can be coupled to the host system as a CPU coprocessor, a PLB accelerator or an MPMC accelerator. Based on the experimental result that the MPMC accelerator had the highest performance, a reconfigurable MPMC accelerator was designed and integrated into the SoPC system. The experiments used 128-bit AES encryption and decryption as case studies. Features of the reconfigurable system such as hardware resource utilization and reconfiguration latency were also studied.
Instruction Replication Scheme of Fault-tolerant Processor
LI Hong-bing,SHANG Li-hong,ZHOU Mi,JIN Hui-hua
Computer Science. 2010, 37 (5): 278-281. 
Abstract PDF(337KB) ( 521 )   
RelatedCitation | Metrics
An instruction replication scheme used in a fault-tolerant processor called RSED was introduced. The processor's fault-tolerance mechanism, based mainly on a temporal redundancy technique, was implemented by modifying a superscalar processor architecture. As an important function of the fault-tolerance mechanism, the instruction replication scheme was described in detail, and the control-flow error checking implemented by means of instruction replication was also presented.
Vector Timestamp Based Software Transactional Memory Algorithm
PENG Lin,XIE Lun-guo,ZHANG Xiao-qiang
Computer Science. 2010, 37 (5): 282-286. 
Abstract PDF(467KB) ( 464 )   
RelatedCitation | Metrics
Transactional Memory (TM) is perceived as an alternative to locks for multi-core processors. Software transactional memory can run on commercial multi-core processors without additional hardware support and make full use of them. We proposed VectorSTM, a software transactional memory algorithm that does not employ any atomic instructions. VectorSTM employs distributed vector timestamps to track the progress of transactions and provides more concurrency. We evaluated VectorSTM with the STAMP benchmarks, and the results show that the design offers superior performance or stronger semantics compared with TL2 and RingSTM.
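A sketch of the vector-timestamp bookkeeping such a design rests on (the data structure only; VectorSTM's commit protocol is not reproduced here): each thread owns one slot, bumps it when it commits, and elementwise comparison tells a transaction whether its snapshot is stale.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class VectorClock:
        ticks: List[int]

        def advance(self, tid: int) -> None:
            self.ticks[tid] += 1               # local commit: bump own slot

        def merge(self, other: "VectorClock") -> None:
            self.ticks = [max(a, b) for a, b in zip(self.ticks, other.ticks)]

        def dominates(self, other: "VectorClock") -> bool:
            """True if this clock has seen everything `other` has."""
            return all(a >= b for a, b in zip(self.ticks, other.ticks))

    n_threads = 4
    snapshot = VectorClock([0] * n_threads)     # taken at transaction start
    global_vc = VectorClock([0] * n_threads)
    global_vc.advance(2)                        # thread 2 commits
    stale = not snapshot.dominates(global_vc)   # reader must re-validate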
Research on Novel Test Vector Ordering Approach Based on Markov Decision Processes
WANG Guan-jun,WANG Mao-li,ZHAO Ying
Computer Science. 2010, 37 (5): 287-290. 
Abstract PDF(331KB) ( 415 )   
RelatedCitation | Metrics
Delay test vector ordering is an efficient technique for reducing test power. A new delay test vector ordering approach based on Markov decision processes was proposed. To reorder the delay test vectors, a transfer probability is defined with induced activity functions based on transition probability and Hamming distance, and the test vector sequence is determined according to this transfer probability. This reduces the switching activity of the CUT (Circuit Under Test), so both peak power and average power can be reduced. The TVO-MDP algorithm was proposed, and its optimization and complexity were analyzed. The experimental results show the method's effectiveness.
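The power intuition is easy to make concrete: consecutive vectors that differ in few bits cause little switching. The sketch below orders vectors by a plain greedy nearest-neighbour tour under Hamming distance, a simplified stand-in for the MDP-derived transfer probabilities described above, which it does not reproduce.

    import numpy as np

    def hamming(a, b):
        return int(np.count_nonzero(a != b))

    def greedy_order(vectors):
        """Order test vectors so each step flips as few bits as possible."""
        remaining = list(range(len(vectors)))
        order = [remaining.pop(0)]
        while remaining:
            last = vectors[order[-1]]
            nxt = min(remaining, key=lambda i: hamming(last, vectors[i]))
            remaining.remove(nxt)
            order.append(nxt)
        return order

    rng = np.random.default_rng(7)
    tests = rng.integers(0, 2, size=(16, 32))   # 16 random 32-bit vectors
    order = greedy_order(tests)
    flips = sum(hamming(tests[a], tests[b]) for a, b in zip(order, order[1:]))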
Optimization Model of Loan Portfolio with Fuzzy Random Return Rates under Semivariance Constraint
PAN Dong-jing
Computer Science. 2010, 37 (5): 291-294. 
Abstract PDF(303KB) ( 525 )   
RelatedCitation | Metrics
The return rates of bank loans often have fuzzy random characteristics. This paper described the return rates as fuzzy random variables, used semivariance as the risk measure, and constructed an optimization model of a loan portfolio with fuzzy random return rates under semivariance constraints. The objective of the model is to maximize the primitive chance measure that the total return rate is no less than a preset value at a given confidence level, subject to the semivariance constraints. A hybrid intelligent algorithm integrating fuzzy random simulation, a neural network and a genetic algorithm was employed to solve the model. Finally, numerical examples illustrate the feasibility and effectiveness of the model and the algorithm.
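The semivariance risk measure is simple to state for a crisp sample of returns, as the sketch below shows: only shortfalls below the expected return are penalised. The fuzzy random simulation, neural network and genetic algorithm layers of the hybrid algorithm are elided, and the simulated scenarios are placeholders.

    import numpy as np

    def semivariance(returns):
        mu = returns.mean()
        downside = np.minimum(returns - mu, 0.0)   # keep only shortfalls
        return float((downside ** 2).mean())

    rng = np.random.default_rng(3)
    scenario_returns = rng.normal(0.06, 0.02, size=1000)  # simulated returns
    risk = semivariance(scenario_returns)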