Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Editors
Current Issue
Volume 37 Issue 11, 01 December 2018
  
Research Development of Multimedia Semantic Model
LUAN Xi-dao,XIE Yu-xiang,TAN Yi-hong,CHEN Zhi-ping,ZHAO Bi-hai,HU Sai
Computer Science. 2010, 37 (11): 1-6. 
Abstract PDF(688KB) ( 502 )   
Multimedia semantic research is one of the most important issues in the fields of multimedia analysis and multimedia information services. The issue arises because capturing the semantics of multimedia data has become an important bottleneck in multimedia applications. A multimedia semantic model is a summary and abstraction of multimedia processing, and focuses on the semantic problems existing throughout the lifecycle of multimedia data. This paper introduced the research development of multimedia semantic models in content description, semantic representation and retrieval models.
Problems in Future Network Science and Engineering: A Survey
LI Bing,LI Qi-feng
Computer Science. 2010, 37 (11): 7-11. 
Abstract PDF(593KB) ( 502 )   
This paper described the challenges and open problems in the field of future networks: the theory of networked computation, network science, network design and engineering, and network design and societal values. The resolution of any one of these problems will bring great progress in computer networking.
Research on Delay/Disruption Tolerant Networks
GUO Hang,WANG Xing-wei,HUANG Min,JIANG Ding-de
Computer Science. 2010, 37 (11): 12-18. 
Abstract PDF(685KB) ( 548 )   
Delay/Disruption Tolerant Networks (DTN) are an abstract network model, different from traditional networks, that was originally proposed for interplanetary networks. This paper provided an overview of recent developments in DTN, including network protocols, routing algorithms, multicast schemes and security mechanisms. It also analyzed and compared in detail various DTN protocols, algorithms and mechanisms with respect to performance and features, and pointed out some open research issues and directions for further development.
Studies on Community Question Answering-A Survey
ZHANG Zhong-feng,LI Qiu-dan
Computer Science. 2010, 37 (11): 19-23. 
Abstract PDF(551KB) ( 9949 )   
As a burgeoning platform for knowledge sharing, community question answering (CQA) is capable of satisfying individuals' personalized information needs through its distinct type of user interaction and its openness. This paper reviewed the research on and applications of CQA. We systematically described the research topics in CQA, such as user behavior analysis, content quality detection and question search, and the application of CQA in other forms of social media. Finally, we discussed some issues for further study. The discussion in this paper is beneficial to enriching and expanding research on CQA.
Overview of Gossip Algorithm in Distributed Systems
LIU De-hui,YIN Gang,WANG Huai-min,ZOU Peng
Computer Science. 2010, 37 (11): 24-28. 
Abstract PDF(444KB) ( 1259 )   
The Gossip algorithm is simple, effective and scalable, achieves fault-tolerant information dissemination, and is well suited to decentralized, large-scale and dynamic distributed networks. We first introduced the history of Gossip; then we proposed an evaluation basis for Gossip algorithms and the factors that affect their execution; the applications of the Gossip algorithm in distributed networks were discussed in detail; lastly, the challenges in Gossip algorithm research were discussed.
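The push-style dissemination that Gossip protocols rely on can be sketched in a few lines of Python (an illustrative simulation, not any specific protocol from the surveyed literature; the fanout parameter and the fixed seed are assumptions for reproducibility):

```python
import random

def push_gossip_rounds(n, fanout=1, seed=42):
    """Simulate push-style gossip: each round, every informed node
    forwards the update to `fanout` peers chosen uniformly at random
    (possibly itself, as in the classic simplified analysis).
    Returns the number of rounds until all n nodes are informed."""
    rng = random.Random(seed)
    informed = {0}          # node 0 starts with the update
    rounds = 0
    while len(informed) < n:
        for node in list(informed):
            for _ in range(fanout):
                informed.add(rng.randrange(n))
        rounds += 1
    return rounds

# Dissemination typically completes in O(log n) rounds.
rounds = push_gossip_rounds(1000)
```

The informed set roughly doubles each early round, which is why gossip scales to the large, dynamic networks the survey discusses.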
Research Development of Access Control Model
HAN Dao-jun,GAO Jie,QU Hao-liang,LI Lei
Computer Science. 2010, 37 (11): 29-33. 
Abstract PDF(524KB) ( 1823 )   
Access control plays an important role in many fields, and is a necessary part of an information system for managing and controlling the safety of all kinds of resources. In this paper, we presented some popular access control models, explained the characteristics and realization method of each access control model, and analyzed and compared their advantages and shortcomings. We also introduced a standard access control policy language, XACML, with stronger semantics. Furthermore, we pointed out some topics currently under research.
Strategy-independent Trust Negotiation Protocol
LI Kai,LI Rui-xuan,LU Jian-feng,LU Zheng-ding
Computer Science. 2010, 37 (11): 34-37. 
Abstract PDF(323KB) ( 436 )   
Automated trust negotiation is a flexible approach to establishing mutual trust between strangers who wish to share resources or conduct business transactions in open environments. However, existing automated trust negotiation systems cannot interoperate with each other, mainly for lack of a unified trust negotiation protocol. A strategy-independent trust negotiation protocol was presented. In the protocol, messages are classified into three categories, resource request messages, information disclosure messages and negotiation-ending messages, and their forms are defined. Three states of the negotiation process and the transitions between them were illustrated, and the protocol algorithm was expressed in pseudocode. The analysis indicates that the protocol supports the disclosure of digital assertions, including credentials in various formats and access policies specified in different policy languages; allows adopting manifold strategies in one negotiation process; suits negotiation in various application scenarios; and thus has distinct generality.
Analysis of Internet AS-level Topology under Skitter and Ark Measurement Infrastructure
ZHANG Jun,ZHAO Hai,KANG Min
Computer Science. 2010, 37 (11): 38-40. 
Abstract PDF(239KB) ( 461 )   
This paper selected the Internet AS-level Skitter dataset from Jan. 2003 to Dec. 2007 and the Ark dataset from Jan. 2008 to Dec. 2008, authorized by CAIDA (The Cooperative Association for Internet Data Analysis), for in-depth analysis to explain the effect of the change of measurement infrastructure on Internet topology measurement results. The research began with detailed statistics of various characteristics of the Internet AS-level topology. The similarities and differences in several characteristics of the Internet AS-level topology under Skitter and Ark were analyzed. Then the power-law distribution and the evolution of the connectivity and topology core of the network were analyzed. The results show that the power-law property and high-clustering feature of the Internet have not disappeared with the transition of infrastructure.
Research on the Intelligent Schedule Measure of Grid Network
ZHANG Huan-jiong,ZHONG Yi-xin
Computer Science. 2010, 37 (11): 41-43. 
Abstract PDF(287KB) ( 449 )   
Asset management and schedule measures in a Grid computing system are the key technologies for describing network resources, network management and task assignment. In this paper, the intelligent schedule measure in a Grid computing system was researched, the essential conditions for realizing an intelligent schedule measure were discussed, and furthermore one intelligent schedule measure was proposed.
Analysis of Wireless Sensor Network Characteristics Measurement Based on Complex Network Theory
ZHANG Cheng-cai,QI Xiao-gang
Computer Science. 2010, 37 (11): 44-46. 
Abstract PDF(333KB) ( 423 )   
Drawing on complex network theory, this paper introduced several metrics of the main features of a complex network and analyzed the characteristics of wireless sensor networks. A wireless sensor network's degree distribution, clustering coefficient, characteristic path length, network connectivity and other features were listed. This paper studied the relationship between the number of nodes, the communication radius and the rate of connectivity. The data obtained from the simulation show that increasing the number of nodes cannot fully guarantee connectivity of the network but only increases the probability of connection, while increasing the emission radius can quickly make the network connected. Therefore, if conditions allow, selecting sensors with a larger emission radius is better than laying out more sensors. Finally, the article described a method suitable for assessing betweenness in wireless sensor networks.
New Network Topology Optimization Approach Based on Vertex Separator Set
WANG Hong,ZHAO Feng,PENG Wei
Computer Science. 2010, 37 (11): 47-49. 
Abstract PDF(238KB) ( 641 )   
Optimizing network topology design is one of the goals of network management. A network topology optimization algorithm, BTop, was proposed, which combines network traffic engineering and graph theory to optimize network topology. The efficiency of the algorithm was verified with real traffic and topology data sampled from the Abilene network.
Candidate-based Cluster Routing Protocol in WSN
YANG Bing,LI Guo-hui,XU Hua-Jie,DU Jian-qiang
Computer Science. 2010, 37 (11): 50-54. 
Abstract PDF(475KB) ( 403 )   
LEACH is an important hierarchical routing protocol. It changes the cluster head periodically with the intention of saving energy, but it has a bug in its clustering policy that brings about inappropriate cluster structures. CCRP introduces a candidate-based clustering policy and a forecast-based multi-frame data transfer policy to resolve the problem. The simulation results prove that CCRP is effective.
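For context, the randomized cluster-head rotation at the heart of LEACH (the baseline CCRP improves on) follows a well-known threshold formula; a minimal Python sketch, with the head fraction p and round number r as inputs (eligibility tracking across an epoch is omitted here):

```python
import random

def leach_threshold(p, r):
    """LEACH cluster-head election threshold T(n) for round r,
    where p is the desired fraction of cluster heads:
    T(n) = p / (1 - p * (r mod round(1/p)))."""
    return p / (1 - p * (r % round(1 / p)))

def elect_heads(node_ids, p, r, seed=1):
    """Each eligible node draws a uniform random number; those below
    T(n) announce themselves as cluster heads for this round."""
    rng = random.Random(seed)
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

heads = elect_heads(range(100), p=0.05, r=0)
```

By the last round of an epoch the threshold reaches 1, so every node that has not yet served becomes a head, which is the rotation property CCRP's candidate-based policy refines.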
Enhanced Secure Routing Protocol Based on TPM
WANG Bo,HUANG Chuan-he,YANG Wen-zhong,WANG Tong
Computer Science. 2010, 37 (11): 55-58. 
Abstract PDF(429KB) ( 462 )   
The design of secure routing protocols is an important part of research on network security for Ad hoc networks. At present, the research mainly focuses on classical cryptographic means of guaranteeing routing security. Integrating the TPM of trusted computing with a typical secure routing protocol, ARAN, this paper proposed a new secure protocol called TEARAN. The protocol does not adopt a centralized public key certificate issuing center (PKI); instead, it uses the DAA technique of the TPM to authenticate the identity of each node, and employs a soft-security trust threshold to monitor the behavior of neighbor nodes, thereby attaining trusted distributed public keys and preventing malicious nodes from joining the network. It also ensures end-to-end confidentiality, integrity and non-repudiation. Theoretical analysis shows that the proposed TEARAN satisfies the demand of anonymous security, resists conventional malicious attacks and possesses better security in effect.
Comparison Study of Declarative Networking Programming Languages
QI Xin,QU Wen-wu
Computer Science. 2010, 37 (11): 59-63. 
Abstract PDF(454KB) ( 417 )   
The development of network technology and the increasing number of types of heterogeneous computational devices have brought many challenges to network protocol design. The fundamental problem facing the network protocol designer is how to get rid of tedious protocol implementation details and pay attention mainly to the design of protocol function. In recent years, to solve this problem, the declarative networking programming language has been proposed. Declarative networking programming languages absorb the successful experience of database management systems and divide the network into a logical level and a physical level. Network protocol designers only need to use the high-level programming abstraction provided by the declarative networking programming language to design the function of the network protocol, and do not need to take care of the tedious physical implementation. This paper surveyed the development of declarative networking programming languages by analyzing and comparing different declarative networking programming languages. Besides, we pointed out the problems that should be paid attention to in further research work.
Network Security Situation Element Extraction Method Based on DSimC and EWDS
LAI Ji-bao,WANG Hui-qiang,ZHENG Feng-bin,FENG Guang-sheng
Computer Science. 2010, 37 (11): 64-69. 
Abstract PDF(607KB) ( 468 )   
In order to fuse multi-source heterogeneous security information and extract security element information about the whole network, a network security situation element extraction method based on Dissimilarity Computing (DSimC) and Exponentially Weighted DS Evidence Theory (EWDS) was proposed. The method is divided into two phases: multi-source alert clustering and alert fusion. First of all, a multi-source alert clustering method was put forward that judges the dissimilarity among alerts by computing the dissimilarity of different alert characteristics. Then a multi-source alert fusion method based on EWDS was proposed that identifies intrusion attack behaviors by fusing different sources. Experimental results indicate that the proposed method does well in True Positive Rate (TPR), False Positive Rate (FPR) and Data to Information Rate (DIR), remarkably reduces the number of alerts, enhances detection performance, and supplies data sources for network security situation evaluation and situation prediction.
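The DS evidence theory underlying EWDS combines mass functions with Dempster's rule; a minimal Python sketch of the classical (unweighted) rule, with the two-hypothesis frame and mass values chosen purely for illustration (the paper's exponential weighting is not shown):

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozenset
    hypotheses to mass) with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict                # normalization factor
    return {h: m / k for h, m in combined.items()}

A, B = frozenset({"attack"}), frozenset({"normal"})
both = A | B                          # ignorance: could be either
m1 = {A: 0.6, both: 0.4}              # e.g. evidence from one alert source
m2 = {A: 0.5, B: 0.2, both: 0.3}      # evidence from another source
fused = dempster_combine(m1, m2)
```

Here two partially agreeing sources reinforce the "attack" hypothesis, which is how alert fusion sharpens a decision from individually uncertain sensors.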
Fast Scalar Multiplication on Edwards Curves Based on w-NNAF
HE Yi-chao,KOU Ying-zhan,QU Wen-long
Computer Science. 2010, 37 (11): 70-74. 
Abstract PDF(378KB) ( 479 )   
Based on analyzing and using the tripling formula of Edwards curves to calculate 3^nP (n=1,2,…), and on the fact that the coordinates of every 3^nP have a unified representation, we put forward a new algorithm, Tripling_Algorithm, which can quickly compute 3^nP by reducing inversion operations. Combining this algorithm with the w-NNAF representation of the scalar k, we gave a highly efficient algorithm, ImprovedSM-3-NNAF, to calculate the scalar multiplication kP. Analysis of the complexity and security of ImprovedSM-3-NNAF shows that calculating kP with this algorithm is not only secure but also saves at least 20.78% of the amount of calculation, greatly improving the computational efficiency of scalar multiplication on Edwards curves.
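The w-NNAF recoding is a relative of the standard width-w non-adjacent form (NAF), which trades signed digits for fewer nonzero positions in the scalar; a sketch of standard w-NAF recoding in Python (the paper's Edwards-curve tripling formulas are not reproduced here):

```python
def wnaf(k, w=2):
    """Width-w non-adjacent form of a positive integer k: a list of
    digits (least significant first), each nonzero digit odd and in
    (-2**(w-1), 2**(w-1)), with at most one nonzero digit in any w
    consecutive positions."""
    digits = []
    while k > 0:
        if k & 1:
            d = k % (1 << w)
            if d >= (1 << (w - 1)):
                d -= 1 << w          # choose the negative representative
            k -= d                   # k - d is now divisible by 2**w
        else:
            d = 0
        digits.append(d)
        k >>= 1
    return digits

def naf_value(digits):
    """Reassemble the integer from its signed-digit expansion."""
    return sum(d << i for i, d in enumerate(digits))

digits = wnaf(239, w=3)              # [-1, 0, 0, 0, -1, 0, 0, 0, 1]
```

Fewer nonzero digits mean fewer point additions in a left-to-right scalar multiplication loop, which is the same lever the paper's w-NNAF plus tripling combination pulls.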
Multiple Attribute Evaluation Method Based on Multidimensional Cloud Model
GUO Rong-xiao,XIA Jing-bo,DONG Shu-fu,LONG Men
Computer Science. 2010, 37 (11): 75-77. 
Abstract PDF(250KB) ( 604 )   
Considering that a multidimensional cloud can describe complex fuzzy concepts, a multiple attribute evaluation method based on the multidimensional cloud model was proposed. The concept of "attribute conceptualizing" was introduced, and each attribute was described using a one-dimensional cloud. Then, the multidimensional judging clouds of comments and the multidimensional attribute cloud of the system were established. The evaluation result was achieved by comparing the similarity degree between the two kinds of cloud models. The experimental results show that this method can realize multi-classification and ranking and clearly reflect the influence of each attribute on the synthetic evaluation result.
Mobility Models for Wireless Organization Networks
LIU Xing-bing,ZHENG Xue-feng,HAN Xiao-guang,YU Yi-ke
Computer Science. 2010, 37 (11): 78-80. 
Abstract PDF(366KB) ( 412 )   
The group mobility model is one of the basic research issues for designing network protocols and evaluating algorithm performance in wireless networks. This paper reviewed the characteristics and applications of existing group mobility models, which cannot effectively simulate the behavior of wireless organization networks (WON). Based on an analysis of the characteristics of WON, it proposed a group mobility model for WON that takes the central node as the reference point (CWONM). By setting different parameters to compare the actual network with simulated networks, it confirmed that the new group mobility model is valid.
Real-time MAC Protocol for Wireless Multimedia Sensor Networks
HUANG Zhi-jie,LI Feng,GAO Qiang
Computer Science. 2010, 37 (11): 81-85. 
Abstract PDF(537KB) ( 363 )   
Timeliness is one of the most important considerations in wireless multimedia sensor networks (WMSN), and whether the MAC protocol can efficiently use the radio channel plays a decisive role in guaranteeing the timeliness of WMSN. According to the data features of wireless multimedia sensor networks, we proposed a multi-channel real-time MAC protocol based on time-slot reservation. The protocol establishes a time-slot-reservation-based flow path from the source node to the convergent node before responding to streaming media inquiries, so it can minimize each channel access delay while transmitting data packets. The simulation results show that the protocol can significantly reduce the end-to-end delay and delay jitter of streaming media data, and it also has better energy efficiency.
Balance Control Algorithm Based on Node Load Degree over Peer-to-Peer Network
CHEN Li-long,LIU Yu-hua,XU Kai-hua,WEI Yu-ying
Computer Science. 2010, 37 (11): 86-88. 
Abstract PDF(356KB) ( 469 )   
In unstructured peer-to-peer networks, we need to find hubs to restrain "free-riding" behavior. In this paper, building on previous work on the connection numbers of hubs, we took nodes' power difference into account and introduced the concept of load degree; nodes with a high load degree are called overloaded nodes. Then a load balance control algorithm was proposed, in which load from overloaded nodes is shifted to lightly loaded nodes. Simulation results show that the algorithm can effectively balance nodes' load, making the load distribution more balanced; thus the algorithm can effectively restrain "free-riding" behavior and helps to maintain and improve network performance.
Differentiated Services of Multi-tier Web Applications
HU Yan-su,DAI Guan-zhong,GAO Ang,PAN Wen-ping
Computer Science. 2010, 37 (11): 89-91. 
Abstract PDF(327KB) ( 354 )   
Based on an analysis of three-tier Web applications, a state-space model of a MIMO system supporting differentiated services was proposed, which breaks through the commonly used transfer-function approach. Then a controller was designed by pole placement and state feedback from control theory to adjust the resource quotas assigned to different classes in every tier and achieve proportional delay guarantees. The experiments demonstrate that it is feasible to approximate the system by a group of low-order linear equations and that the proposed controller can hold the relationship between different classes.
Study for the Streaming Media Application Based on Live Mesh of Cloud Computing
LAO Bin,CHENG Jiu-jun,YAN Chun-gang
Computer Science. 2010, 37 (11): 92-96. 
Abstract PDF(449KB) ( 395 )   
With the development of streaming media technology and cloud computing, and the emergence of more and more streaming application platforms, the transmission and sharing of streaming information have been facilitated. But many problems remain, for example, playing-state handover and resuming playback from a breakpoint. A streaming media application system based on Live Mesh of cloud computing was proposed in this paper. The system adopts URI and XML to accomplish centralized management of media sources, and uses LAN UDP broadcast and a multi-thread mechanism to realize real-time handover of the streaming media playing state between terminals. At the same time, resuming playback from a breakpoint is carried out with a message management mechanism based on Live Mesh. The experiments show that sharing of the same streaming media resource and seamless handover of playing state are well implemented between terminals, and resuming playback from a breakpoint is also intelligently achieved.
Cryptanalysis and Modification of Chen et al.'s E-voting Scheme
DOU Ben-nian,ZHANG Hong,XU Chun-gen,HAN Mu
Computer Science. 2010, 37 (11): 97-98. 
Abstract PDF(252KB) ( 464 )   
In 2004, Chen et al. proposed a secure internet e-voting scheme. In this paper, we pointed out the security weaknesses of Chen et al.'s e-voting scheme. We gave a modification which satisfies the security requirements of an e-voting scheme.
Energy-efficient Top-K Query Approach in Wireless Sensor Networks
CHENG Jie,LIU Wen-yu,ZHANG Sheng-kai,JIANG Hong-bo
Computer Science. 2010, 37 (11): 99-102. 
Abstract PDF(341KB) ( 387 )   
Data query is a basic application field in wireless sensor networks, and querying the greatest (or least) K values, i.e. Top-K query, is one of the most important scenarios in query applications. As energy efficiency is the key problem in wireless sensor network applications, this paper presented an energy-efficient Top-K query approach, ETQA. The method reduces the forwarding of data messages, saving message packets, based on data aggregation with data filtering. The proposed system employs a data stream model to support near-real-time Top-K query. To save energy, the sink node modifies the filter of every node in good time. To ensure query correctness, the sink probes data from nodes when a filter is set too high (or too low). The performance of the proposed ETQA approach was evaluated using real data traces. The results show that ETQA substantially outperforms the existing NAIVE, FILA and TAG approaches in overall energy consumption.
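The filter idea behind such approaches can be illustrated with a toy sketch (this is not the paper's ETQA protocol; the threshold rule and the readings are invented for illustration): a node reports only when its new reading crosses the current Top-K threshold, so unchanged or irrelevant readings cost no transmissions:

```python
def topk_with_filters(readings, k):
    """readings: list of epochs, each a list of per-node values.
    After an initial full collection, the sink keeps the k-th largest
    value as a threshold; in later epochs a node transmits only if its
    reading crosses that threshold (i.e. could change the Top-K set)."""
    first, *rest = readings
    view = list(first)                 # sink's (possibly stale) view
    transmissions = len(first)         # initial epoch: everyone reports
    for epoch in rest:
        threshold = sorted(view, reverse=True)[k - 1]
        for i, v in enumerate(epoch):
            # report only on crossing the filter boundary
            if (v > threshold) != (view[i] > threshold):
                view[i] = v
                transmissions += 1
    return sorted(view, reverse=True)[:k], transmissions

readings = [[5, 9, 3, 7], [5, 9, 4, 7], [8, 2, 4, 7]]
topk, sent = topk_with_filters(readings, k=2)
```

In this toy run only 6 messages are sent instead of the naive 12, while the final Top-2 answer is still exact; real schemes add probing to repair filters that were set too aggressively.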
New Method of Underdetermined Blind Source Separation Based on Pseudo Extraction Vectors
BAI Lin,CHEN Hao
Computer Science. 2010, 37 (11): 103-106. 
Abstract PDF(327KB) ( 344 )   
Through theoretical analysis, a new method based on pseudo extraction vectors was put forward for accomplishing underdetermined blind source separation (UBSS). The method accomplishes UBSS by judging which signal dominates each sample and choosing the corresponding pseudo extraction vector to recover the sampled data of the source signals. Separately simulating the method based on pseudo extraction vectors and the method based on linear programming leads to some conclusions: being free of an optimizing process, the method based on pseudo extraction vectors increases the speed of separating source signals; its separation speed is tens of times that of the method based on linear programming.
Secure RFID System for Aviation Goods Management
DENG Miao-lei,MA Yu-jun,SHI Jin-e,ZHOU Li-hua
Computer Science. 2010, 37 (11): 107-110. 
Abstract PDF(339KB) ( 357 )   
Radio frequency identification (RFID) technology will greatly facilitate aviation goods management. An RFID system model for aviation goods management was proposed and the main security solutions were presented. An authentication protocol for RFID was also designed in this context. The proposed protocol is invulnerable to a number of malicious attacks. The implementation of the protocol is based on random permutation functions and XOR operations, and has high efficiency.
Performance Assessment Method of Online Auditing Based on Rank-centroid
CHEN Wei
Computer Science. 2010, 37 (11): 111-116. 
Abstract PDF(491KB) ( 402 )   
Continuous auditing is a frontier in the audit area, and the online auditing researched and implemented in China is one realization mode of continuous auditing. According to the present condition, characteristics and requirements of online auditing researched and implemented in China nowadays, a performance assessment method for online auditing was proposed in this paper. Firstly, based on an analysis of the costs, benefits and audit risk in the process of implementing online auditing, assessment criteria fitting the characteristics of online auditing were constructed. Then, rank-centroid was used to compute the weights of the assessment criteria, and the performance assessment model of online auditing was constructed. Finally, a case was given to analyze the application of this performance assessment method. The research in this paper can lay a basis for studying performance assessment methods for online auditing.
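Rank-centroid (rank-order centroid, ROC) weighting turns a pure ranking of n criteria into numeric weights w_i = (1/n) * sum over j = i..n of 1/j; a minimal Python sketch using exact rational arithmetic:

```python
from fractions import Fraction

def rank_centroid_weights(n):
    """Rank-order centroid (ROC) weights for n criteria ranked from
    most to least important: w_i = (1/n) * sum_{j=i..n} 1/j."""
    return [sum(Fraction(1, j) for j in range(i, n + 1)) / n
            for i in range(1, n + 1)]

# Four assessment criteria ranked by importance:
w = rank_centroid_weights(4)   # 25/48, 13/48, 7/48, 3/48
```

The weights are guaranteed to sum to 1 and to decrease with rank, which is why ROC is convenient when assessors can order criteria but cannot score them directly.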
Reversible Network Synthesis for Positive/Negative Control Gates Based on Reversible Function's Complexity
NI Li-hui,GUAN Zhi-jin,NIE Zhi-lang
Computer Science. 2010, 37 (11): 117-121. 
Abstract PDF(381KB) ( 372 )   
A methodology for synthesizing reversible networks of positive/negative control (PNC) gates based on the reversible function's complexity was proposed in this paper. According to the reversible function's output permutation, we exchange the positions of two output vectors step by step to reduce the function's complexity until the complexity is zero. Each output switch corresponds to an individual PNC gate. By synthesizing part of the reversible functions and comparing the results with the representative international cases used in other literature, the reversible networks created by the proposed method show an improvement in the number of gates.
Model Checking of Web Service Composition Based on UPPAAL
HE Ya-li,RONG Mei,ZHANG Guang-quan
Computer Science. 2010, 37 (11): 122-125. 
Abstract PDF(339KB) ( 383 )   
Correctness verification of Web service composition plays an important role in improving software development efficiency and realizing service value-added. To study the correctness of Web service composition and its formal verification method from a high, abstract view, and considering the real-time features of Web service composition, the Web service composition was described in the software architecture description language XYZ/ADL; the real-time description XYZ/RE was transformed into timed automata; the properties that the composite system should satisfy were expressed as CTL formulas; and finally, the correctness of the Web service composition was automatically verified with the model checking tool UPPAAL.
Research on Reflective Architecture Formalism Based on Object-Z
LUO Ju-bo,YING Shi
Computer Science. 2010, 37 (11): 126-130. 
Abstract PDF(372KB) ( 387 )   
Combining meta information, meta modeling, reflection and software architecture, we presented a method of reusing software architecture based on a reflection mechanism. It is a more versatile and convenient method. This method defines and constructs a reflection mechanism, RMRSA, for software architecture reuse at the software design phase. This paper described the meta-level architecture model of the reflective software architecture based on RMRSA. Moreover, it completely formalized the meta-level architecture model using the formal specification language Object-Z. Taking the Link schema as an example, this paper also gave the initial theorem and its proof process, so as to verify the correctness of the formalized reflective software architecture.
Selecting Regression Test Subset by Multidimensional Scaling
WANG Xiao-hua,ZHANG Tao,SHANG Jing-liang,WANG Jin-bo
Computer Science. 2010, 37 (11): 131-134. 
Abstract PDF(420KB) ( 414 )   
To resolve the problem that there are no appropriate techniques for regression testing when the software changes greatly and testing resources are not enough, this paper proposed a method to select a regression subset by multidimensional scaling. According to the efficacy of the test suite, this method classifies the test suite visually by computing the execution profile data which denote the suite. Because the method takes both software changes and test case representation into consideration, it is adaptable for obtaining a typical regression test subset of large software which has changed greatly. The experiment indicates that this method can reflect the change in the efficacy of the original test suite and obtain a typical test subset, so it can satisfy the strict requirements of regression testing and guide practice.
Research on Composite Services Selection Based on C-MMAS Algorithm
LIU Zhi-zhong,WANG Zhi-jian,ZHOU Xiao-feng,LOU Yuan-sheng
Computer Science. 2010, 37 (11): 135-140. 
Abstract PDF(493KB) ( 365 )   
The problem of composite Web service selection with multiple composite paths was transformed into a constrained optimal path selection problem. A new optimization algorithm, C-MMAS, was proposed by integrating the Max-Min Ant System into the Culture algorithm framework, and was applied to solve the optimal path selection problem. This computing model consists of a MMAS-based population space, an excellent-solution-based belief space and communication protocols between the two spaces. After completing MMAS-based evolution, the population space carries out variation-based evolution, and contributes excellent solutions as knowledge to the belief space after evolutions. The belief space updates knowledge according to a certain optimization principle. When the knowledge in the belief space has been accumulated and precipitated for some generations, it is used to guide the MMAS-based evolution. By implementing two evolutionary mechanisms on population and knowledge, making the best use of the population's evolutionary mechanism and the guiding effect of knowledge, this computing model largely improves the population's diversity and convergence speed, avoiding premature convergence and reducing computing expense. Theoretical analysis and experimental results indicate the feasibility and efficiency of this algorithm.
Similarity Determination Method of Workflow Process Oriented to Reuse
HU Jun,SUN Rui-zhi,XIANG Yong
Computer Science. 2010, 37 (11): 141-144. 
Abstract PDF(437KB) ( 425 )   
Business process reuse makes use of existing business processes or business knowledge to create a new process model. It is a hot research topic in workflow technology: business process reuse can reduce the complexity of process definition and improve the quality and efficiency of process definition. This paper presented the basis and method of determining the similarity of processes, and detailed the principles and steps of the reuse-oriented similarity determination method for workflow processes. Using this method, one can determine whether processes are similar and produce a workflow template automatically according to the similar processes, achieving reuse of the process definition.
Coordination Research for Service Composition Based on SPN
ZHANG Jing-le,YANG Yang,GAO Ang,WANG Yuan-zhuo,ZHAO Xiao-yong
Computer Science. 2010, 37 (11): 145-147. 
Abstract PDF(331KB) ( 383 )   
In this paper, we proposed a research approach to the coordination of service composition based on SPN. First of all, the definitions of services and service composition coordination were proposed. Then, analysis methods were given based on the combination of the definitions of services and service coordination. At last, taking e-commerce systems in the e-logistics system as an example, coordination research methods for e-logistics system services were given.
Study and Implementation of OpenMP Multi-thread Load Balance Scheduling Scheme
REN Xiao-xi,TANG Ling,LI Ren-fa
Computer Science. 2010, 37 (11): 148-151. 
Abstract PDF(454KB) ( 666 )   
The loop is one of the most important structures that can be parallelized effectively with OpenMP. However, the performance depends on the number of threads, the size of the loop body and the scheduling scheme. In order to get better performance and reach a better trade-off between scheduling overhead and load balance, this paper implemented the trapezoid scheduling scheme using OMPi, an open-source compiler that supports the OpenMP API. The evaluation results show that, with a normal number of threads, trapezoid self-scheduling yields better performance than guided self-scheduling in the case of decreasing and irregular loop structures.
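Trapezoid self-scheduling hands out chunks whose sizes decrease linearly from a first size (commonly N/2p for N iterations and p threads) down to a final size; a sketch of the chunk-size sequence in Python (the parameter choices are illustrative, not OMPi's exact implementation):

```python
import math

def trapezoid_chunks(n_iters, first, last=1):
    """Trapezoid self-scheduling: chunk sizes decrease linearly from
    `first` to `last`. Returns the list of chunk sizes covering n_iters."""
    steps = math.ceil(2 * n_iters / (first + last))
    delta = (first - last) / (steps - 1) if steps > 1 else 0
    chunks, size, remaining = [], float(first), n_iters
    while remaining > 0:
        c = min(max(round(size), last), remaining)  # clamp to [last, remaining]
        chunks.append(c)
        remaining -= c
        size -= delta
    return chunks

# e.g. 1000 iterations on 4 threads: first chunk N/(2p) = 125
chunks = trapezoid_chunks(1000, first=125)
```

Large early chunks keep scheduling overhead low while the shrinking tail lets late-arriving threads balance the load, the trade-off the evaluation above measures against guided self-scheduling.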
Genetic Selection Algorithm for OLAP Data Cubes
DONG Hong-bin,CHEN Jia
Computer Science. 2010, 37 (11): 152-155. 
Abstract PDF(370KB) ( 437 )   
RelatedCitation | Metrics
The data cube selection problem is known to be NP-hard. In this study, we examined the application of genetic algorithms to the cube selection problem and proposed a genetic local search algorithm. The core idea of the algorithm is as follows. First, a pre-processing algorithm based on maximum benefit per unit space was used to generate initial solutions. Then, the initial solutions were improved by a genetic algorithm with local search strategies. The experimental results show that the proposed algorithm outperforms the heuristic algorithm and the canonical genetic algorithm.
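The pre-processing idea of ranking by benefit per unit space can be illustrated with a small greedy sketch. All names here are hypothetical, and this is a simplification of the kind of routine that could seed the initial population, not the paper's algorithm:

```python
def greedy_seed(views, capacity):
    """Greedy seeding for cube/view selection: repeatedly pick the
    candidate with the highest benefit per unit of space that still
    fits the space budget. `views` maps a view name to (benefit, size)."""
    chosen, used = [], 0
    # rank candidates by benefit density (benefit / size), best first
    ranked = sorted(views.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    for name, (benefit, size) in ranked:
        if used + size <= capacity:
            chosen.append(name)
            used += size
    return chosen
```

A genetic local search would then mutate and recombine such seed solutions rather than starting from random ones.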
LDA-based Model for Online Topic Evolution Mining
CUI Kai,ZHOU Bin,JIA Yan,LIANG Zheng
Computer Science. 2010, 37 (11): 156-159. 
Abstract PDF(465KB) ( 725 )   
RelatedCitation | Metrics
A computational model for online topic evolution mining was established through latent semantic analysis of textual data. Topic evolution analysis was achieved by tracking topic trends across time slices. In this paper, Latent Dirichlet Allocation (LDA) was extended to the context of online text streams, and an online LDA model was proposed and implemented. The main idea is to use the posterior of the topic-word distribution of each time slice to influence the inference of the next time slice, which also maintains the relevance between topics. The topic-word and document-topic distributions are inferred by an incremental Gibbs algorithm. Kullback-Leibler (KL) relative entropy is used to measure the similarity between topics in order to identify topic inheritance and topic mutation. Experiments show that the proposed model can discover meaningful topic evolution trends on both English and Chinese corpora.
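The KL comparison between topics of adjacent time slices can be sketched in a few lines. This is a generic illustration with a hypothetical function name, not the paper's incremental Gibbs implementation:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler relative entropy KL(p || q) between two topic-word
    distributions (lists of probabilities over the same vocabulary).
    A small value means one topic closely continues the other."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))
```

A topic in slice t whose nearest predecessor (smallest KL) is very close would be treated as inherited; one far from every predecessor would be flagged as a mutation.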
Research on Interval Differential Skyline Based on Wavelet Synopsis
CHENG Wen cong,ZOU Peng,JIA Yan
Computer Science. 2010, 37 (11): 160-165. 
Abstract PDF(587KB) ( 409 )   
RelatedCitation | Metrics
In many applications we need to analyze a large number of time series, and segments of time series demonstrating dominating advantages over others are often of particular interest. Based on a volume measure, the current interval skyline query returns the time series that are not dominated by any other time series in the interval. Sometimes this kind of query cannot satisfy application requirements, and a "submerge" phenomenon may exist. We therefore proposed the concept of the interval differential skyline, which focuses on the rate of increase of the data, to fix this shortcoming of the existing interval skyline query. Most time series are currently generated as data streams, and due to resource limitations only synopses describing the main data characteristics are maintained. In this setting we proposed an algorithm that implements the interval differential skyline query at different granularities based on the commonly used wavelet synopsis, and then improved the efficiency of the naive algorithm while keeping the accuracy of the results. Extensive experiments on a real stock price data set demonstrate the effectiveness of the proposed methods.
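For concreteness, the dominance test behind a (non-differential) interval skyline can be sketched as follows; the differential variant would apply the same test to per-step increase rates rather than raw values. Names are illustrative, and this naive all-pairs scan ignores the wavelet-synopsis machinery of the paper:

```python
def dominates(a, b):
    """Series a dominates series b over the interval if a is at least b
    at every point and strictly greater at least once."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def interval_skyline(series):
    """Return the series not dominated by any other series."""
    return [s for s in series
            if not any(dominates(t, s) for t in series if t is not s)]
```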
Inconsistency Checking and Resolving of CWM Metadata Based on Description Logics
ZHAO Xiao-fei,HUANG Zhi-qiu
Computer Science. 2010, 37 (11): 166-171. 
Abstract PDF(513KB) ( 341 )   
RelatedCitation | Metrics
The inconsistencies in metadata have a remarkable influence on the stability and reliability of a data warehouse system. During metadata creation based on the Common Warehouse Metamodel (CWM), the different experiences and viewpoints of the organizations involved inevitably introduce metadata inconsistencies. However, detecting and resolving these inconsistencies in CWM metadata automatically is difficult because the CWM metamodel and metadata are rendered to users as graphs, which lack precise semantics. In this paper, we researched how to check and resolve CWM metadata inconsistencies in terms of a Description Logic. First, we studied how to formalize the CWM metamodel and metadata by means of the presented Description Logic DLid, which supports identification constraints on concepts. Then the approach for checking metadata inconsistencies via the query and reasoning mechanisms of DLid was researched. Finally, an approach for resolving inconsistencies by defining inconsistency resolution rules in the DLid knowledge base was proposed. The results of experiments with the reasoning engine RACER are encouraging.
Research on the Decision Problem of XML Path and Type
SHEN Jie,YIN Gui-sheng,WANG Xiang-hui
Computer Science. 2010, 37 (11): 172-174. 
Abstract PDF(307KB) ( 417 )   
RelatedCitation | Metrics
In this paper, a new algorithm was proposed to analyze the decision problem for XPath over XML data represented as regular trees. In this way we can also statically check the types of XPath expressions. Based on the decidability of a logic with converse for finite ordered trees, we proved that its time complexity is a simple exponential in the size of a formula. Then a practical and effective mathematical model was built to solve the satisfiability problem of a formula. Through examples of decision problems such as XPath emptiness, containment, overlap, and coverage, with or without type constraints, the experiments confirmed that the system can be effectively used in static analyzers for both XPath expressions and XML type annotations.
Research and Realization of Temporal Data Integrity Constraints
LIU Hai,TANG Yong,GUO Huan,YE Xiao-ping
Computer Science. 2010, 37 (11): 175-179. 
Abstract PDF(418KB) ( 342 )   
RelatedCitation | Metrics
Temporal integrity constraints are used to ensure the correctness of the data in a temporal database. To avoid the appearance of data inconsistent with the temporal semantic model, relevant theories of temporal integrity should be studied, which also provide theoretical support for the implementation of temporal updating operations. Based on the temporal relational model (TRM), this paper gave an integral definition of temporal integrity constraints as temporal extensions of traditional integrity constraints, and studied in detail the concrete processing mechanisms for manipulation operations that violate temporal integrity constraints. Some of the proposed processing mechanisms for temporal integrity constraints were realized in TempDB. This study not only provides the theoretical and implementation basis for temporal data integrity constraints in TempDB, but also has important reference value for further refinement of the theories and implementation techniques of temporal databases.
Processing of k Nearest Neighbor Queries Based on Shortest Path in Road Networks
LIAO Wei,WU Xiao-ping,HU Wei,ZHONG Zhi-nong
Computer Science. 2010, 37 (11): 180-183. 
Abstract PDF(383KB) ( 470 )   
RelatedCitation | Metrics
To efficiently process k nearest neighbor queries in spatial road networks, this paper presented a distributed moving-object updating strategy to reduce the computing cost of the server. An in-memory spatial network adjacency matrix, a shortest path matrix and hash table structures were introduced to describe the road network topology and store the moving objects. A shortest path based network expansion (SPNE) algorithm was proposed to decrease the processing cost of k nearest neighbor queries by reducing the searched network space. Experimental results show that the SPNE algorithm outperforms existing algorithms, including the NE and MKNN algorithms.
Selectivity Estimation Based on Zipf Distribution and Attribute Correlation
JIANG Fang-jiao
Computer Science. 2010, 37 (11): 184-189. 
Abstract PDF(498KB) ( 532 )   
RelatedCitation | Metrics
In Deep Web data integration, some Web database interfaces expose exclusive predicates, which permit only one predicate to be selected. Accurately and efficiently estimating the selectivity of each exclusive query is of critical importance to optimal query translation. In this paper, we proposed a novel selectivity estimation method. Firstly, we computed the attribute correlation and obtained an approximately random attribute-level sample by submitting queries on the least correlated attribute to the real Web database. Then we fitted a Zipf equation using the word-rank information from the sample and the actual selectivity of several words obtained from the real Web database. Finally, the selectivity of any word on an infinite-value attribute was derived from the Zipf equation. An experimental evaluation of the proposed selectivity estimation method shows that the estimates are highly accurate.
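The Zipf extrapolation step can be sketched as follows. This is a minimal illustration under the simplest one-parameter fit (the paper would fit from several probed words); function names and the sample numbers are hypothetical:

```python
def fit_zipf_c(observed_rank, observed_selectivity, s=1.0):
    """Fit the constant c of a Zipf law sel(rank) = c / rank**s from one
    observed (rank, selectivity) pair probed on the real Web database."""
    return observed_selectivity * observed_rank ** s

def zipf_selectivity(rank, c, s=1.0):
    """Extrapolate the selectivity of the word at a given sample rank."""
    return c / rank ** s

c = fit_zipf_c(2, 0.05)              # rank-2 word observed with selectivity 0.05
estimate = zipf_selectivity(10, c)   # estimated selectivity of the rank-10 word
```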
Application of UCON Model on Electronic Medical Record
WANG Ying,CHEN Wei-he,JU Shi-guang
Computer Science. 2010, 37 (11): 190-193. 
Abstract PDF(351KB) ( 372 )   
RelatedCitation | Metrics
By analyzing the disadvantages of existing EMR (Electronic Medical Record) systems, this paper applied the usage control (UCON) model, a next-generation access control model, to the EMR system, and gave access control rules together with their formal description.
Good Point Set Genetic Algorithm with Zooming Factor
PENG Yong,LIN Hu,PU Xiao-fei
Computer Science. 2010, 37 (11): 194-198. 
Abstract PDF(394KB) ( 668 )   
RelatedCitation | Metrics
The good point set genetic algorithm has superiority in convergence speed and accuracy, and overcomes prematurity effectively by using the good point operator, which is based on the good point set principle in number theory. However, when the chromosome length is fixed, discretization error is inevitable. Aiming at the domino phenomenon of convergence from the highest to the lowest bit of binary coding in the good point set genetic algorithm, a zooming factor was proposed to lengthen the chromosome indirectly and minimize the discretization error, thereby improving search efficiency and solution accuracy. Simulation results on benchmark test functions of different dimensions verify that the proposed good point set algorithm with zooming factor has global convergence, high solution precision and high search efficiency.
Study on the Approach of Fuzzy Multiattribute Cloud Decision Based on Natural Language
WU Ai yan,YU Chong-chong,ZENG Guang-ping,TU Xu-yan
Computer Science. 2010, 37 (11): 199-202. 
Abstract PDF(296KB) ( 384 )   
RelatedCitation | Metrics
This paper used the cloud model and proposed a natural-language-based fuzzy multiattribute cloud decision method. It mainly includes the following parts. First, the described rank cloud and the evaluated rank cloud were presented to measure attribute rank and rank certainty degree, respectively. On this basis, a cloud normalizing algorithm was presented to fuse the several values of an attribute given by experts. Then, a cloud aggregation algorithm was presented to integrate different attribute information and obtain the value of every alternative, and the most desirable alternative was selected according to these values. The cloud model can express the relationship between randomness and fuzziness, so the method is more effective for measuring linguistic assessment information. Finally, an example with linguistic assessment information shows that the method is simple and feasible.
Training Algorithm of Process Neural Networks Based on Numerical Integration
XU Shao-hua,WANG Ying,WANG Hao,HE Xin-gui
Computer Science. 2010, 37 (11): 203-205. 
Abstract PDF(229KB) ( 415 )   
RelatedCitation | Metrics
Aiming at the training problem of process neural networks, a training algorithm based on numerical integration was proposed. In the proposed algorithm, numerical integration is applied directly to the weighted aggregation of dynamic samples and weight functions in the time domain, and the gradient descent method is used to adjust the weight function characteristic parameters and network property parameters. Three numerical integration methods, the Trapezoidal, Simpson and Cotes rules, were designed. Taking the prediction of sunspot data as an example, the simulation results show that the training algorithms based on numerical integration are efficient, and the approximation performance of Simpson integration is optimal.
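Two of the three quadrature rules mentioned can be written down directly; this is a textbook sketch (generic function names), not the paper's network-training code:

```python
def trapezoidal(f, a, b, n):
    """Composite trapezoidal rule over n subintervals of [a, b]."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def simpson(f, a, b, n):
    """Composite Simpson rule over n (even) subintervals of [a, b]."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3
```

Simpson's rule has O(h^4) error versus the trapezoidal rule's O(h^2), which is consistent with the abstract's finding that Simpson integration approximates best.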
Ontology Matching Approach Based on Virtual Path
HUANG Tao,CUI Hong-yang,LIU Qing-tang,YANG Zong-kai
Computer Science. 2010, 37 (11): 206-211. 
Abstract PDF(536KB) ( 366 )   
RelatedCitation | Metrics
Ontology matching measures the relation between entities of ontologies. It relies not only on the relations between concepts, but also on concept neighbors and semantic associations. A new ontology matching approach based on virtual paths was proposed. Through the definition of virtual paths around two concepts, the approach utilized the semantic similarity of concepts and computed the combined similarity of the two concepts' virtual paths with graph matching. After computing the similarity of the two virtual paths, it decided the matching relation of the ontology elements. Experiments showed that the approach can effectively increase ontology matching quality and performance.
Improved Spectral Subtraction Based on Real-time Noise Estimation
CHENG Gong,GUO Lei,HE Sheng,ZHAO Tian-yun
Computer Science. 2010, 37 (11): 212-213. 
Abstract PDF(243KB) ( 367 )   
RelatedCitation | Metrics
Aiming at speech enhancement under non-stationary noise and low SNR, an improved spectral subtraction method based on real-time noise estimation was proposed. First, voice activity detection is carried out based on the vector distance of selected sub-bands, and then a real-time adjustment coefficient is defined from the properties of noisy speech in the low-frequency and high-frequency regions. The combination of this coefficient with voice activity detection realizes continuous updating of the noise estimate, allowing the method to track the time variation of the noise environment. Experiments showed that the method is more effective in reducing background noise, improving SNR and decreasing speech distortion than traditional speech enhancement methods.
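The classical spectral subtraction that this paper improves on can be sketched per frequency bin as follows. This is the standard over-subtraction scheme with a spectral floor, shown only as background; it does not include the paper's sub-band VAD or real-time adjustment coefficient, and the parameter values are illustrative:

```python
import math

def spectral_subtract(noisy_mag, noise_mag, alpha=2.0, beta=0.01):
    """Power spectral subtraction with a spectral floor, per frequency bin:
    |S|^2 = max(|Y|^2 - alpha * |N|^2, beta * |N|^2), where Y is the noisy
    spectrum magnitude and N the current noise estimate."""
    out = []
    for y, n in zip(noisy_mag, noise_mag):
        power = y * y - alpha * n * n
        out.append(math.sqrt(max(power, beta * n * n)))
    return out
```

The real-time noise estimate produced by the paper's VAD-plus-coefficient scheme would feed the `noise_mag` argument frame by frame.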
Method of Grey Relational Analysis for MADM Problem with Intuitionistic Trapezoidal Fuzzy Numbers
ZHANG Shi-fang,LIU San-yang,ZHAI Ren-he
Computer Science. 2010, 37 (11): 214-216. 
Abstract PDF(229KB) ( 429 )   
RelatedCitation | Metrics
With respect to the multiple attribute decision making (MADM) problem in which the attribute weights are completely known and the attribute values are intuitionistic trapezoidal fuzzy numbers, a method of grey relational analysis was proposed. Firstly, the definition, distance and properties of intuitionistic trapezoidal fuzzy numbers were given. Then, based on the basic idea of the traditional grey relational analysis method, decision making steps for the MADM problem with intuitionistic trapezoidal fuzzy numbers were proposed. Finally, an example was given to show the practicality and effectiveness of the developed approach.
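The core of traditional grey relational analysis on crisp values can be sketched as follows (the paper replaces the absolute differences with distances between intuitionistic trapezoidal fuzzy numbers; names here are illustrative):

```python
def grey_relational_grades(reference, alternatives, rho=0.5):
    """Grey relational grade of each alternative sequence against the
    reference (ideal) sequence; the largest grade identifies the most
    desirable alternative. rho is the distinguishing coefficient."""
    # absolute deviations of each alternative from the reference
    deltas = [[abs(r - x) for r, x in zip(reference, alt)] for alt in alternatives]
    flat = [d for row in deltas for d in row]
    dmin, dmax = min(flat), max(flat)
    grades = []
    for row in deltas:
        # grey relational coefficient per attribute, then equal-weight average
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

With known attribute weights, the final average would be a weighted sum instead of the equal-weight mean used here.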
Split-Merge Based Clustering Algorithm Oriented to Structure Stability of Clusters
LEI Xiao-feng,HE Tao,LI Kui-ru,XIE Kun-qing,DING Shi-fei
Computer Science. 2010, 37 (11): 217-222. 
Abstract PDF(536KB) ( 469 )   
RelatedCitation | Metrics
Clustering aims to find the best partition of unlabeled observations under a certain group structure hypothesis. Given the group structure hypothesis, most clustering algorithms iteratively optimize the fitness of the data distribution (called algorithm validity). In fact, clustering validity is determined by three factors: hypothesis, algorithm and a priori validity. Therefore, a variation of the Gaussian mixture model was proposed in this paper, and measurement and estimation methods for cluster structure stability were defined. Based on them, the SMClus algorithm was designed to achieve a stable clustering structure by means of split-merge operations. Experiments show SMClus' performance in clustering quality.
Counting Interlacing Sequence for C/E Systems
WU Zhen-huan,GAO Ying,WU Zhe-hui
Computer Science. 2010, 37 (11): 223-226. 
Abstract PDF(352KB) ( 381 )   
RelatedCitation | Metrics
A pair of events in a system is said to be concurrent if, following R. Milner's definition in CCS (Calculus of Communicating Systems), "they can occur in any order"; this notion is named "interlacing concurrency". However, concurrency is defined as "disorder" by C. A. Petri in net theory, which is recognized as "true concurrency". In order to investigate the relationship and the difference between these two concepts, we used C/E systems as models to discuss the appearance and essence of concurrency under both definitions. As a result, a set of formulas counting the number of interlacing sequences (under the concept of interlacing concurrency) in various situations was given.
On the Decidability and Expressive Power of Timed Interval Temporal Logic
ZHU Wei-jun,ZHOU Qing-lei
Computer Science. 2010, 37 (11): 227-229. 
Abstract PDF(221KB) ( 506 )   
RelatedCitation | Metrics
Model checking is widely used in the verification of real-time systems. Satisfiability of discrete Timed Interval Temporal Logic is decidable, and so is its model checking problem; but in the dense-time domain, the status of model checking Timed Interval Temporal Logic was unclear. We proved that satisfiability of dense-time Timed Interval Temporal Logic is undecidable, and we identified a decidable subset of Timed Interval Temporal Logic; model checking for this subset is therefore decidable.
Topological Space Based on Rough Sets
QIAO Quan-xi,QIN Ke-yun,HONG Zhi-yong
Computer Science. 2010, 37 (11): 230-231. 
Abstract PDF(151KB) ( 407 )   
RelatedCitation | Metrics
This paper is devoted to the discussion of the topological structure of rough sets in Pawlak approximation spaces. Topological spaces based on rough sets over a universe not restricted to be finite were investigated, and it was proved that the rough topological space is a normal space with respect to the separability of topological spaces.
Rough Entropy of Set-valued Information Sytems
MA Jian-min,ZHANG Wen-xiu
Computer Science. 2010, 37 (11): 232-233. 
Abstract PDF(223KB) ( 444 )   
RelatedCitation | Metrics
Entropy theory is an effective tool for studying uncertainty. Based on a pre-order relation defined on a set-valued information system, a rough entropy was proposed. The maximum and minimum values of the rough entropy were shown, and the monotonicity of the rough entropy was also proved.
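As background, a common form of rough entropy for a partition of the universe can be sketched as follows; this is one standard definition for classical (single-valued) information systems, not the paper's pre-order-based variant for set-valued systems:

```python
import math

def rough_entropy(partition):
    """Rough entropy of a partition {X_i} of a universe U under one common
    definition: sum_i (|X_i|/|U|) * log2 |X_i|. It attains its minimum 0
    when every block is a singleton (finest knowledge) and its maximum
    log2 |U| for the single-block partition (coarsest knowledge)."""
    u = sum(len(block) for block in partition)
    return sum(len(b) / u * math.log2(len(b)) for b in partition)
```

The extremal values illustrate the kind of maximum/minimum and monotonicity results the abstract refers to: refining the partition never increases this entropy.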
Clustering Algorithm for Mixed Data Based on Clustering Ensemble Technique
LUO Hui-lan,WEI Hui
Computer Science. 2010, 37 (11): 234-238. 
Abstract PDF(527KB) ( 437 )   
RelatedCitation | Metrics
A clustering algorithm based on ensemble and spectral techniques, named CBEST, which works well for data with mixed numeric and categorical features, was presented. A similarity measure based on clustering ensembles was adopted to define the similarity between pairs of objects, which makes no assumptions about the underlying distributions of the feature values. A spectral clustering algorithm was employed on the similarity matrix to extract a partition of the data. The performance of CBEST was studied on artificial and real data sets. Results demonstrate the effectiveness of this algorithm on mixed-data clustering tasks and its robustness to noise. Comparisons with other related clustering schemes illustrate the superior performance of this approach. Moreover, CBEST can effectively incorporate prior knowledge to set the weights of different features in clustering.
Modified Linear Discriminant Analysis Method MLDA
LIU Zhong-bao,WANG Shi-tong
Computer Science. 2010, 37 (11): 239-242. 
Abstract PDF(318KB) ( 400 )   
RelatedCitation | Metrics
Linear Discriminant Analysis (LDA) is a classical method widely used in pattern recognition and data analysis; it seeks an effective classification direction. When the sample dimension is much larger than the number of samples, LDA has difficulty with this problem. In order to effectively solve the small sample size problem in LDA, this paper presented a modified LDA algorithm, MLDA. The new algorithm scalarizes the within-class scatter matrix in order to avoid computing its inverse. A series of experiments verify that MLDA solves the small sample size problem to some extent.
Design of Speech Recognition Classifier Based on Genetic Wavelet Neural Network
HAN Zhi-yan,WANG Jian,LUN Shu-xian
Computer Science. 2010, 37 (11): 243-246. 
Abstract PDF(337KB) ( 390 )   
RelatedCitation | Metrics
Classification is an important problem in speech recognition. The learning performance of a wavelet neural network depends strongly on the number of hidden nodes, the initial weights (including thresholds), the scale and displacement factors, the learning rate and the momentum factor, which leads to weak global search capability, easy trapping in local minima, slow convergence, and even non-convergence. The Genetic Algorithm (GA) has highly parallel, random and adaptive search performance, and has obvious advantages in solving complex nonlinear problems. Therefore, we combine the neural network and the genetic algorithm by using GA to select the initial values and the wavelet neural network to complete the learning. The simulation results show that the new model effectively improves the speech recognition rate and shortens the recognition time, achieving a double win in efficiency and time, and laying a foundation for practical use of the algorithm.
New Algorithm of Zernike Moments Features for Shape-based Image Retrieval
GUO Dan,YAN De-qin,WU Xiao-ting,LIU Sheng-lan
Computer Science. 2010, 37 (11): 247-251. 
Abstract PDF(544KB) ( 456 )   
RelatedCitation | Metrics
As shape feature descriptors, high-dimensional Zernike moments can describe the detailed information of an image region, but they suffer from the "curse of dimensionality". This increases the complexity of the algorithm and introduces unnecessary information that obscures the major information and affects the description of the image content. A new manifold-based algorithm was proposed to realize dimension reduction of image data. While the Laplacian graph preserves the local structure of the sample data, a global algorithm was introduced to ensure the integrity of the sample. Considering the influence of correlations within the information on projection accuracy, Schur eigenvalue decomposition was applied to obtain orthogonal vectors. This makes data reconstruction relatively easier while the rotation invariance of Zernike moments is preserved, making image retrieval accord with human visual perception. The method is superior to LPP in retrieval performance, and the retrieval results are significantly improved.
Conditional Planning Encodings Based on Quantified Boolean Formulas
GAO Bing-bing,ZHANG Chang-hai,LU Shuai
Computer Science. 2010, 37 (11): 252-256. 
Abstract PDF(518KB) ( 685 )   
RelatedCitation | Metrics
This paper introduced conditional planning problems and their associated planners, and analyzed logic-based encoding methods. By analyzing translation-based planning methods whose targets are quantified boolean formulas, it introduced three different forms of quantified boolean formula encodings. Finally, it compared the above encodings, analyzed the respective advantages and disadvantages of the two translation modes based on propositional logic formulas and quantified boolean formulas, and then discussed future research directions and trends for planning methods based on quantified boolean formulas.
Classification on Full Symmetric Function Sets and Decision on the Minimal Covering Members in Partial Four-valued Logic
LIU Ren-ren,WANG Ting,TAN Hao-xun
Computer Science. 2010, 37 (11): 257-260. 
Abstract PDF(236KB) ( 339 )   
RelatedCitation | Metrics
According to the completeness theory and the concept of similarity relations in partial k-valued logic, the full symmetric function sets were classified by means of the similarity relation, and some sets were proved to be minimal covering components of the precomplete classes.
Improved Watermarking Based on Significant Difference of Wavelet Coefficient Quantization
HU Qing,LONG Dong-yang,LU Wei
Computer Science. 2010, 37 (11): 261-264. 
Abstract PDF(380KB) ( 389 )   
RelatedCitation | Metrics
In this paper an improved SDQ watermarking scheme was developed based on "Significant Difference of Wavelet coefficient Quantization" (SDQ) by Lin et al. Firstly, two blocks of the middle frequency band in the DWT were divided into groups of 8 wavelet coefficients in a pseudorandom manner using a key. Then, 1 bit of watermark information was embedded in each group. The significant coefficient difference was enhanced through slight quantization of the maximum coefficient, improving robustness. The watermark is extracted with a secret key generated from the coefficient differences and the watermark. The experimental results show that the proposed scheme is quite effective against JPEG compression, low-pass filtering, Gaussian noise and slight geometric attacks, and the PSNR of the watermarked image is about 55dB.
Fatigue Recognition Based on Facial Motion and Improved Locality Preserving Projections
ZHANG Wei,XIA Li-min,LUO Da-yong
Computer Science. 2010, 37 (11): 265-267. 
Abstract PDF(225KB) ( 362 )   
RelatedCitation | Metrics
In this paper, we presented a new method for fatigue recognition based on facial motion and improved locality preserving projections. Facial velocity information, determined using optical flow techniques, was used to characterize fatigue. The improved locality preserving projections, which preserve both the local and global structure of the data manifold, were proposed to extract effective fatigue features. A weighted K-nearest neighbor (WKNN) classifier was constructed to recognize fatigue. A set of fatigue recognition experiments was presented, and the results show that the proposed method is effective and attains a satisfactory effect.
Reduced Set Based Support Vector Machine for Hyperspectral Imagery Classification
YU Xu-chu,YANG Guo-peng,FENG Wu-fa,ZHOU Xin
Computer Science. 2010, 37 (11): 268-270. 
Abstract PDF(261KB) ( 423 )   
RelatedCitation | Metrics
Aiming at the heavy computational cost of hyperspectral imagery classification based on support vector machines, a reduced set method was put forward to improve hyperspectral imagery classification efficiency. The radial basis kernel function was adopted, a one-against-one decomposition algorithm was used to construct the multi-class Support Vector Machine classifier, and cross-validation grid search was applied to select the model parameters. The reduced set algorithm was used to reduce the computational complexity of prediction. Hyperspectral imagery classification experiments show that not all support vectors are needed to maintain the generalization ability of the Support Vector Machine: the reduced set algorithm greatly improves predictive efficiency for hyperspectral imagery classification while keeping classification accuracy.
Shape Image Retrieval Based on Integrating Weighted Skeleton Segments Features
SHU Xin,PAN Lei,WU Xiao-jun
Computer Science. 2010, 37 (11): 271-274. 
Abstract PDF(330KB) ( 366 )   
RelatedCitation | Metrics
This paper presented an approach to shape image retrieval based on integrating weighted skeleton segment features. Firstly, the skeleton of the object in the image is extracted. Secondly, the skeleton is partitioned into several segments according to the feature points (such as end points and junction points), and each segment is characterized using moments. Finally, the distance between two images is obtained based on the MSHP (Most Similar Highest Priority) principle, which measures the similarity between pairs of segments from the two images. Experiments indicate that this approach achieves better retrieval results than the traditional approach based on whole-skeleton features.
Object Invariant Feature Extraction in Contourlet Field
MEI Xue,XIA Liang-zheng
Computer Science. 2010, 37 (11): 275-277. 
Abstract PDF(244KB) ( 384 )   
RelatedCitation | Metrics
Extracting features that are invariant and can discriminate targets with similar shapes is one of the key problems in shape-based target recognition. Multiscale geometric analysis (MGA) offers a high degree of directionality and anisotropy, and can express the local features of objects more effectively. However, it is greatly restricted when used for object recognition because most multiscale geometric transforms are not invariant. A new feature descriptor invariant to translation, scaling and rotation was constructed in the Contourlet domain in this paper, using the idea of generalized image moments. This method is specialized for extracting local target characteristics. Experimental results demonstrate the potential of the Contourlet transform for feature extraction, and the features do not vary with translation, scaling or rotation. Furthermore, the influence of using different Contourlet decomposition scales was studied.
Study on Construction Methods Based on the Euclid Algorithm for Generalized Cat Map and its Application in Image Scrambling
LI Yong-jiang,LI Chang-li,GE Jian-hua,SUN Zhi-lin
Computer Science. 2010, 37 (11): 278-281. 
Abstract PDF(340KB) ( 338 )   
RelatedCitation | Metrics
Based on the idea of the multiplicative inverse in the Euclid algorithm, two simple construction methods for generalized cat maps were presented: one based on the Fibonacci series and the other on the Dirichlet series. Moreover, a construction method combining the two series was presented. Simulation experiments show that the period of the generalized cat map is adjustable and greater than that of the standard cat map, so the constructed maps have a better scrambling effect and are more secure than the cat map and the Fibonacci transform. In practice they are of great value in image information hiding for storage and transmission, and provide a more solid theoretical foundation for image scrambling.
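For background, the classical cat map that the generalized constructions extend can be sketched as follows; this is the standard two-parameter Arnold form (illustrative names), not the paper's Fibonacci- or Dirichlet-based constructions:

```python
def cat_map(x, y, n, a=1, b=1):
    """One step of a generalized Arnold cat map on an n x n image grid:
    (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod n. The matrix has
    determinant 1, so pixel positions are merely permuted."""
    return (x + a * y) % n, (b * x + (a * b + 1) * y) % n

def scramble_period(n, a=1, b=1):
    """Number of iterations after which every pixel position returns to
    its start, checked exhaustively; the period depends strongly on n."""
    pts = [(x, y) for x in range(n) for y in range(n)]
    cur, t = pts, 0
    while True:
        cur = [cat_map(x, y, n, a, b) for x, y in cur]
        t += 1
        if cur == pts:
            return t
```

An image scrambled fewer times than the period cannot be recovered without knowing the map parameters, which is the security property the abstract appeals to.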
Flower Image Retrieval Based on Multi-features Fusion
KE Xiao,CHEN Xiao-fen,LI Shao-zi
Computer Science. 2010, 37 (11): 282-286. 
Abstract PDF(466KB) ( 481 )   
RelatedCitation | Metrics
This paper presented a systematic and comprehensive study of flower images, including region segmentation, feature extraction, content-based duplicate image filtering and SVM-based image retrieval. Firstly, in order to improve retrieval results, we proposed a duplicate image filtering algorithm based on Canny edges to detect duplicate images. Then we proposed an adaptive threshold segmentation algorithm based on a 2RGB mixed color model to segment flower images. Using a multi-feature fusion strategy for feature extraction, we proposed weighted invariant moment shape features based on the HSV color model, as well as an edge LBP feature that combines texture and shape features. Finally, we conducted experiments on a flower image library; comparative results show that the above algorithms are effective.
Traffic Sign Recognition Based on Two-dimensional Principal Component Analysis
TANG Jin,LIU Bo,CAI Zi-xing,XIE Bin
Computer Science. 2010, 37 (11): 287-288. 
Abstract PDF(267KB) ( 402 )   
RelatedCitation | Metrics
This paper proposed a feature extraction method for traffic sign recognition based on Two-Dimensional Principal Component Analysis (2DPCA). A series of experiments was performed on two traffic sign databases with the nearest neighbor classifier and Euclidean distance. One database is an image library in which images are obtained through a series of simulated transformations after image binarization, while the other consists of images shot in real scenes at many different locations. The method achieves good recognition on both image databases.
Statistical Analysis-based Approach for Storage System Performance Tuning
LU Cheng-tao,FENG Dan,WANG Fang,GE Xiong-zi
Computer Science. 2010, 37 (11): 289-293. 
Abstract PDF(557KB) ( 477 )   
RelatedCitation | Metrics
The reasonable configuration of computer systems can dramatically improve application performance. Taking an NFS storage system as a case study, this paper proposed a statistical analysis-based performance tuning approach for storage systems that proceeds in two phases. In the first phase, we leveraged the analysis-of-variance method (ANOVA) to model performance sensitivity and thus identified the critical system parameters that have a significant effect on application performance; in the second phase, building on the first, we employed the response surface methodology (RSM) to carry out tuning analysis of the critical parameters. Combining these two steps, we presented a performance tuning algorithm that figures out the optimal combination of critical system parameters. Finally, experimental results demonstrated the effectiveness and feasibility of our tuning approach through an extensive evaluation under various representative scenarios, including Web, E-mail, Fileserver, Linux utilities, and micro-benchmarks.
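The ANOVA screening phase can be illustrated with a one-way F statistic: group throughput measurements by the settings of one parameter and compare between-group to within-group variance; parameters with large F values are the critical tuning candidates. A minimal sketch with hypothetical data (not the paper's NFS measurements):

```python
import numpy as np

def f_statistic(groups):
    # One-way ANOVA F statistic: ratio of between-group mean square
    # to within-group mean square. Each group holds the performance
    # measurements observed at one setting of the parameter under study.
    all_obs = np.concatenate(groups)
    grand = all_obs.mean()
    k = len(groups)                     # number of parameter settings
    n = len(all_obs)                    # total number of measurements
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

Ranking parameters by their F statistics (or the corresponding p-values) selects the few that significantly affect performance, which the second phase then optimizes with a response surface model.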
MT2RAID: A High Reliable Architecture for Large Scale Disk Arrays
WANG Zhi-kun,FENG Dan
Computer Science. 2010, 37 (11): 295-299. 
Abstract PDF(408KB) ( 375 )   
RelatedCitation | Metrics
Traditional disk arrays have a centralized control architecture in which the number of connected disks is constrained by the system bus. A centralized RAID architecture easily becomes a performance bottleneck and cannot tolerate more than two disk failures. This paper proposed MT2RAID, a modular tree-connected multi-tier RAID architecture for large-scale disk arrays. MT2RAID is built from a collection of commodity components; storage units are connected through fat-tree-based interconnection channels. The performance and reliability of different MT2RAID levels were also analyzed and discussed. Prototype experimental results show that MT2RAID also has performance advantages over the centralized RAID architecture.
Optimization of Audio and Video Encoder on ARM Platform
JIANG Chun-lin,JIA Wei-jia,ZHANG Li-zhuo,GU Ke
Computer Science. 2010, 37 (11): 300-301. 
Abstract PDF(201KB) ( 388 )   
RelatedCitation | Metrics
ARM processors are well suited to audio and video encoding. For audio encoding, ARM supports digital signal processing and can complete a DSP operation in one cycle. For video encoding, running time can be reduced substantially by making full use of the 16 registers and by combining shift operations with other operations in a single instruction. Experiments show that ARM can dramatically improve the performance of audio and video encoders.
Solution and Design of N-body Algorithm on FPGA
FU Li-li,ZENG Guo-sun
Computer Science. 2010, 37 (11): 302-306. 
Abstract PDF(400KB) ( 555 )   
RelatedCitation | Metrics
The N-body algorithm is a classic problem in dynamics and has been widely used in many fields. In recent years, however, as problem sizes have grown far larger than before, the demand for high performance has become a major obstacle to its research. FPGA reconfigurable technology, with its reconfigurable hardware architecture and highly parallel processing capability, has become a focus of the high-performance computing industry. In this paper, taking FPGA acceleration of the N-body algorithm as an example, we introduced a new method for solving computation-intensive tasks.
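The computation-intensive kernel that such an accelerator targets is the all-pairs force evaluation. A minimal software reference (direct O(N²) summation with G = 1 and a hypothetical softening parameter; the FPGA design would pipeline exactly these pairwise interactions) might look like:

```python
import numpy as np

def accelerations(pos, mass, softening=1e-3):
    # Direct O(N^2) gravitational accelerations (G = 1).
    # pos: (N, 3) positions, mass: (N,) masses.
    diff = pos[None, :, :] - pos[:, None, :]        # diff[i, j] = r_j - r_i
    dist2 = (diff ** 2).sum(-1) + softening ** 2    # softened squared distance
    np.fill_diagonal(dist2, 1.0)                    # avoid divide-by-zero below
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                   # no self-interaction
    # a_i = sum_j m_j * (r_j - r_i) / |r_j - r_i|^3
    return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)
```

Because every pair (i, j) is independent, this kernel maps naturally onto the deep pipelines and parallel arithmetic units of an FPGA, which is what makes it a good showcase for reconfigurable acceleration.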