Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Editors
Current Issue
Volume 38 Issue 5, 16 November 2018
  
Research of Phase Transition in Artificial Intelligence
GU Wen-xiang,HUANG Ping,ZHU Lei,YIN Ming-hao
Computer Science. 2011, 38 (5): 1-7. 
Abstract PDF(602KB) ( 737 )   
RelatedCitation | Metrics
Phase transition exists in many problems of Artificial Intelligence. A phase transition means that a small change in a parameter produces a sharp transition in the behavior of the system, and phase transitions are closely related to problem structure. This paper gives a comprehensive introduction to the development of phase transition research, its related concepts, and the relationship between phase transitions and problem structure. It also summarizes some unresolved questions and concludes with a summary and outlook for research on phase transitions.
Surveys of Software Safety
FEN Xiao-guang,CHU Wen-kui,ZHANG Feng-min
Computer Science. 2011, 38 (5): 8-13. 
Abstract PDF(726KB) ( 1521 )   
RelatedCitation | Metrics
As software is one of the important safety factors in software-intensive, safety-critical systems, e.g., integrated modular avionics (IMA) systems, software safety has become a mainstream research direction at the intersection of software engineering and safety engineering. The paper first analyses the meaning and scope of software safety and gives a definition of it. Measurement models of software safety are then discussed. The paper focuses on the state of the art of software safety from a software engineering perspective, covering development processes, design alternatives, assessment techniques and certification methods. Potential research directions for software safety are pointed out at the end.
Research on Eliciting Security Requirement Methods
JIN Ying,LIU Xin,ZHANG Jing
Computer Science. 2011, 38 (5): 14-19. 
Abstract PDF(585KB) ( 1008 )   
RelatedCitation | Metrics
Recently more and more attention has been paid to active defense in software security, because it provides a positive way to guarantee software security and to construct highly trustworthy software effectively. Security requirements are critical to software security assurance, and eliciting them is one of the major and most difficult tasks in the assurance process. Several typical methods for eliciting security requirements are studied, compared and analyzed with respect to their research approach, application scope, etc. The current status of different approaches to security requirements elicitation is summarized, and future trends are explored at the end. This work provides a useful reference for research and application in security requirements engineering.
Two-phase Retrieval Strategy in Content Aware Network Storage System
LIU Ke,QIN Lei-hua,ZHOU Jing-li,NIE Xue-jun,ZENG Dong
Computer Science. 2011, 38 (5): 20-23. 
Abstract PDF(511KB) ( 660 )   
RelatedCitation | Metrics
As storage capacity approaches the exabyte scale, efficiently organizing, finding and managing data becomes increasingly difficult. Query requests in a storage system come from two sources: metadata retrieval issued by administrators and common keyword queries issued by users. However, the de-duplication and block-similarity detection functions of a content-aware storage system are not exploited to enhance such query processing. In order to take advantage of both upper-level semantic information and lower-level duplicate-block information to deliver efficient query service for users, a two-phase retrieval strategy was introduced. It combines metadata/keyword queries with block-similarity queries and uses a ranking coefficient to evaluate the similarity among query results. The experiments indicate that the retrieval strategy effectively improves retrieval recall.
Study of Adaptive RED Algorithm Based on 2nd-order Difference Equation
FAN Xun-li,ZHENG Feng,GUAN Lin,GAO Li
Computer Science. 2011, 38 (5): 24-27. 
Abstract PDF(378KB) ( 682 )   
RelatedCitation | Metrics
This paper analyzed the internal structure of the adaptive RED algorithm for AQM through time-domain analysis based on classical control theory. According to the analysis results, a new discrete controller based on a 2nd-order difference equation was designed and applied to the AQM algorithm. To illustrate the performance of the proposed algorithm, it was simulated on the NS-2 platform. Simulation results show that the 2nd-order difference ARED algorithm keeps the queue length much more stable and has better control capability over the queue length.
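As an illustration of the kind of controller the abstract describes (not the authors' exact design), a second-order difference AQM controller can update the packet-drop probability from the last three queue-length errors; the gains and the reference queue length below are hypothetical placeholders. A minimal sketch, assuming periodic queue sampling:

# Illustrative 2nd-order difference AQM controller:
# p(k) = p(k-1) + k0*e(k) + k1*e(k-1) + k2*e(k-2), with e = queue_len - q_ref.
class SecondOrderAQM:
    def __init__(self, q_ref, k0=0.00005, k1=-0.00008, k2=0.00004):
        self.q_ref = q_ref          # target queue length (packets); assumed value
        self.k = (k0, k1, k2)       # controller gains; hypothetical, not from the paper
        self.p = 0.0                # current drop probability
        self.e1 = 0.0               # e(k-1)
        self.e2 = 0.0               # e(k-2)

    def update(self, queue_len):
        e = queue_len - self.q_ref
        self.p += self.k[0] * e + self.k[1] * self.e1 + self.k[2] * self.e2
        self.p = min(max(self.p, 0.0), 1.0)   # keep the probability in [0, 1]
        self.e2, self.e1 = self.e1, e
        return self.p

# Example: sample the queue once per control interval and apply the drop probability.
aqm = SecondOrderAQM(q_ref=100)
for q in (80, 120, 150, 130, 110):
    print(aqm.update(q))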
New Mobile Agents Secure Itinerary Protection Based on Merkle Trees
LIU Yi,HAO Yan-jun,PANG Liao-jun
Computer Science. 2011, 38 (5): 28-30. 
Abstract PDF(235KB) ( 619 )   
RelatedCitation | Metrics
Mobile agents are software programs that are expected to play an important role in future e-commerce systems, but security problems have been an obstacle to their practical use. Using a special one-way function and Merkle trees, a route (itinerary) protection scheme for mobile agents was presented in this paper, which not only offers better security and convenience but also reduces computational cost.
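To make the underlying data structure concrete, here is a minimal sketch of computing a Merkle tree root over itinerary entries with SHA-256; it illustrates only the generic hash-tree construction, not the paper's protocol or its special one-way function, and the host names are made up.

import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Generic Merkle root over a list of byte strings (illustrative sketch)."""
    level = [_h(leaf) for leaf in leaves]
    if not level:
        return _h(b"")
    while len(level) > 1:
        if len(level) % 2 == 1:            # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Example: hash the hosts of a mobile-agent itinerary (hypothetical host names).
itinerary = [b"hostA", b"hostB", b"hostC", b"hostD"]
print(merkle_root(itinerary).hex())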
Grid-based K Nearest Neighbor Query Processing Algorithm in Wireless Sensor Networks
LIU Yu-lei,QIN Xiao-lin,SHEN Jia-jia
Computer Science. 2011, 38 (5): 31-36. 
Abstract PDF(571KB) ( 559 )   
RelatedCitation | Metrics
A grid-based KNN query processing algorithm called GKNN was proposed in this paper, which takes energy consumption, query latency and query result correctness into consideration in an integrated way. It optimizes existing query-area estimation methods in order to reduce the energy consumption of the algorithm. GKNN uses grids to manage the nodes and divides the query region into several grid zones; each grid zone processes the query in parallel to reduce query latency. Furthermore, GKNN takes advantage of node redundancy to reduce the influence of node failures on query result correctness, which improves the accuracy of the query result. Experimental results show that GKNN outperforms existing algorithms.
Multicollinearity-based 3D DV-Hop Localization Algorithm
YAN Xiao-yong,QIAN Huan-yan,GAO De-min,YU Ji-min
Computer Science. 2011, 38 (5): 37-40. 
Abstract PDF(468KB) ( 825 )   
RelatedCitation | Metrics
To overcome the disadvantages of multilateral ranging methods and to make the localization algorithm applicable in realistic environments, an improved three-dimensional localization algorithm based on the characteristics of DV-Hop was proposed. The algorithm not only transplants DV-Hop localization from two-dimensional to three-dimensional space, but also takes into account the influence of beacon-node topology and hop counts on localization accuracy; on this basis a Multicollinearity-Based 3D DV-Hop algorithm (MCB3D DV-Hop) was proposed. The algorithm sets two threshold parameters, on hop count and multicollinearity, and selects good beacon-node groups to carry out the position estimate. Theoretical analysis and simulation results show that the proposed method works well.
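The basic DV-Hop estimate that the 3D variant builds on can be sketched as follows: an unknown node converts hop counts to distances using an average hop size learned from the beacons, then solves a linearized least-squares system. The beacon coordinates, hop counts and hop size are made-up illustrative values, and the sketch is 2D for brevity.

import numpy as np

def dvhop_estimate(beacons, hops, hop_size):
    """Classic DV-Hop position estimate via linearized least squares (sketch)."""
    B = np.asarray(beacons, dtype=float)
    d = hop_size * np.asarray(hops, dtype=float)       # hop counts -> distance estimates
    # Subtract the last beacon's circle equation from the others to linearize.
    A = 2.0 * (B[-1] - B[:-1])
    b = (d[:-1] ** 2 - d[-1] ** 2
         - np.sum(B[:-1] ** 2, axis=1) + np.sum(B[-1] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Toy 2-D example: four beacons and hop counts from an unknown node to each of them.
beacons = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0), (50.0, 50.0)]
hops = [3, 2, 4, 3]
print(dvhop_estimate(beacons, hops, hop_size=12.5))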
Research of Trust Evaluation Model Based on Reputation and Cooperation
LI Li-miao,CHEN Zhi-gang,DENG Xiao-heng,GUI Jing-song
Computer Science. 2011, 38 (5): 41-44. 
Abstract PDF(366KB) ( 728 )   
RelatedCitation | Metrics
Some existing trust evaluation models use a linear weighted sum to measure trust; it may then happen that one dimension of index information is very high while another is relatively low, so there is no comprehensive information for computing trust, which seriously affects the accuracy and effectiveness of the model in computing the trust evaluation of nodes. To address this shortcoming, the paper proposed a mutual trust matrix evaluation model based on reputation and cooperation. The reputation of nodes is evaluated using two-tuple linguistic information, and the cooperation of nodes is evaluated using data measurement; from these the mutual trust evaluation of the nodes is derived. Finally, the paper compared the model with the linear weighted model through simulation experiments; the results show that the proposed model can suppress attacks by malicious nodes and improve the efficiency with which nodes execute a plan.
Joint Multi-path Routing and Channel Assignment Strategy for Cognitive Wireless Mesh Networks
GU Jin-yuan,ZHANG Guo-an,BAO Zhi-hua
Computer Science. 2011, 38 (5): 45-48. 
Abstract PDF(382KB) ( 845 )   
RelatedCitation | Metrics
In this paper, a novel joint multi-path routing and channel assignment strategy based on path crossing was proposed for cognitive wireless mesh networks. The strategy builds on the basic process of on-demand routing and sets up the relay function of cross-nodes according to the chosen routes, then selects channels based on the minimum number of channels occupied by primary users, and finally gives a solution to channel conflict. Simulation results show that, compared with link-based and interference-based strategies, the new strategy has an obvious advantage in delay and throughput performance.
Perceived Speech Quality Estimation Model for VoIP-based Networks
HU Zhi guo,ZHANG Da-lu,ZHANG Jun-sheng
Computer Science. 2011, 38 (5): 49-53. 
Abstract PDF(559KB) ( 1124 )   
RelatedCitation | Metrics
In a VoIP system, the performance of network transmission has a fundamental impact on the quality of experience. However, traditional QoS-based research cannot be mapped directly to quality of experience. We combined the ITU PESQ and E-model algorithms and made a comprehensive analysis of the impact on user experience caused by packet loss, end-to-end delay, jitter, voice coding, and so on. Our contributions are twofold: first, we offered a detailed analysis of the impact of these parameters and their interactions on perceived conversational quality; second, we adopted a regression model for single-ended (non-intrusive) objective speech quality estimation under different conditions. The experimental results show that the objective assessments correlate well with subjective assessments, which indicates that the proposed method is effective.
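For reference, the E-model side of such an estimator typically ends in the standard R-factor to MOS mapping; the simplified delay impairment and the equipment impairment value below are common approximations from the literature, not the regression model fitted in the paper.

def delay_impairment(delay_ms):
    """Simplified one-way-delay impairment Id (common approximation)."""
    return 0.024 * delay_ms + 0.11 * max(delay_ms - 177.3, 0.0)

def r_to_mos(r):
    """Standard E-model mapping from R-factor to MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

# Example: R = R0 - Id - Ie_eff with R0 = 93.2; Ie_eff here is a made-up value
# standing in for the codec and packet-loss impairment.
delay_ms, ie_eff = 150.0, 11.0
r = 93.2 - delay_impairment(delay_ms) - ie_eff
print(round(r_to_mos(r), 2))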
PSO-based k-means Algorithm and its Application in Network Intrusion Detection System
FU Tao,SUN Ya-min
Computer Science. 2011, 38 (5): 54-55. 
Abstract PDF(273KB) ( 708 )   
RelatedCitation | Metrics
In the traditional k-means algorithm, the initial cluster centers are chosen randomly, so the clustering result varies with the initial centers and is unstable. A PSO-based k-means algorithm was proposed in this paper: the PSO optimization algorithm generates the initial cluster centers, so the clustering result is globally optimal and does not fall into a local optimal solution. Experimental results show that, when the PSO-based k-means algorithm is applied to the rule-mining module of an intrusion detection system, the intrusion detection rate is significantly higher than with the traditional k-means algorithm and the false positive rate is much lower. The PSO-based k-means algorithm can therefore effectively improve the performance of a network intrusion detection system.
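A minimal sketch of the general idea (a small PSO searches for good initial centers, which then seed ordinary k-means) could look like the following; the swarm size, iteration counts and inertia/acceleration constants are arbitrary illustrative choices, not the paper's settings.

import numpy as np
from sklearn.cluster import KMeans

def pso_init_kmeans(X, k, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO searches for initial centers (flattened k x d vectors); k-means refines them."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    lo, hi = np.tile(X.min(axis=0), k), np.tile(X.max(axis=0), k)

    def sse(flat):                               # within-cluster sum of squared errors
        centers = flat.reshape(k, d)
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return dist.min(axis=1).sum()

    pos = rng.uniform(lo, hi, size=(n_particles, k * d))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([sse(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([sse(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    return KMeans(n_clusters=k, init=gbest.reshape(k, d), n_init=1).fit(X)

# Example on random 2-D data (illustrative only).
X = np.random.default_rng(1).normal(size=(300, 2))
print(pso_init_kmeans(X, k=3).cluster_centers_)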
Cross-layer Adaptive Round Distributed Clustering Routing Model for Wireless Sensor Networks
XU Nan,SUN Ya-min,HUANG Bo,YU Ji-ming
Computer Science. 2011, 38 (5): 56-59. 
Abstract PDF(462KB) ( 572 )   
RelatedCitation | Metrics
The location-related character of transmitted information and the energy consumption model were analyzed. A cross-layer adaptive-round inter-cluster routing model was proposed, based on a cluster election method that uses an energy index and a distance index. The model can balance the traffic load and energy consumption of nodes. Inter-cluster communication is based on an energy and energy-consumption factor together with the RSSI and energy index of the next hop. Compared with other clustering and cross-layer routing protocols, the simulation results show that the model prolongs the network lifetime and improves throughput; it is energy-efficient, has small control overhead and balances the energy load well.
Research of Multiple Dimensional Reliability Model for MANET
ZHAO Zhi-feng,ZHAO Xi-bin,CHEN Dan-ning
Computer Science. 2011, 38 (5): 60-63. 
Abstract PDF(473KB) ( 600 )   
RelatedCitation | Metrics
Without infrastructure, a Mobile Ad Hoc Network (MANET) is a self-organized, dynamic wireless communication network that lacks any centralized control. These characteristics make it vulnerable with respect to communication continuity and security attacks. Compared with traditional networks, MANETs have great limitations in network reliability. Many studies have focused on two main factors of reliability, or on reliability models, which influence the reliability of MANETs. Therefore, this paper proposed a multi-dimensional reliability model for MANETs which considers both movement and security attacks. Furthermore, an experimental analysis of the model was given, and according to its results the key factor of MANET reliability was identified.
Performance Evaluation Algorithm of Digital Modulation Signals Recognition
LIU Ming-qian,LI Bing-bing,LIU Han
Computer Science. 2011, 38 (5): 64-66. 
Abstract PDF(254KB) ( 656 )   
RelatedCitation | Metrics
Because the accuracy rate alone cannot evaluate digital modulation recognition objectively and comprehensively, the area under the receiver operating characteristic curve (AUC) was proposed for assessing the performance of LS-SVM and NN classifiers. Using five characteristic parameters, the classification of digital modulations was realized with the LS-SVM classifier and the NN classifier, and the AUC values were used to evaluate the merits of the classifiers. The simulation results show that the average performance of the LS-SVM classifier is better than that of the NN classifier.
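As a reminder of how AUC is computed from classifier scores (independently of the specific LS-SVM or NN classifiers used in the paper), the rank-based formula can be sketched as follows; the scores and labels are toy values.

import numpy as np
from scipy.stats import rankdata

def auc(labels, scores):
    """Rank-based AUC: the probability that a positive scores higher than a negative."""
    labels = np.asarray(labels)
    ranks = rankdata(scores)                 # average ranks handle ties
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy example: 1 = target modulation class, 0 = other classes.
print(auc([1, 1, 0, 1, 0, 0], [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]))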
Nearest Neighbor Method Data Association Algorithm to Support Multi-target Tracking in WSN
ZHU Xiao-gang,YANG Bing,XU Hua-jie
Computer Science. 2011, 38 (5): 67-70. 
Abstract PDF(369KB) ( 879 )   
RelatedCitation | Metrics
Multi-source data association is one of the key technologies of multi-sensor data fusion in wireless sensor networks. The joint probabilistic data association algorithm is a data association algorithm for multi-target tracking; it does not need any prior information about targets and clutter, but its computational expense is very large compared with other data association algorithms. The joint probabilistic data association algorithm based on the nearest neighbor method integrates the idea of nearest-neighbor data association into the construction of the validation matrix: it selects the three valid measurements with the smallest statistical distance to construct the validation matrix. This simplifies the validation matrix and thus reduces the computation of the original algorithm.
Improved Algorithm of Congestion Control Based on Bandwidth Estimation
SHI Shi-jie,XUE Kai,SHAO Yan,ZHANG Guo-yong
Computer Science. 2011, 38 (5): 71-73. 
Abstract PDF(264KB) ( 678 )   
RelatedCitation | Metrics
In a wireless environment, the bandwidth estimation method for TCP based on a channel noise model (Ntcp) has two disadvantages: the estimated bandwidth is low, and it fluctuates strongly. We therefore proposed an improved algorithm (Ntcp') based on the amount of data sent by the sender. The analysis and simulation results show that Ntcp' works well.
Research of Security Authentication Protocol of the RFID Based on the Multi-prover Model
LU Yao,LIAO Ming-hong,LI Gui-lin
Computer Science. 2011, 38 (5): 74-78. 
Abstract PDF(435KB) ( 695 )   
RelatedCitation | Metrics
RFID technology is widely used in today's society. However, due to the shortcomings of its security model, RFID technology is still not applied well in some areas. We proposed a new RFID security model, the RHZD multi-prover model, which can compensate for deficiencies in the original model, and a security protocol based on this model. It can solve many RFID security problems, such as cloning attacks, replay attacks, tag tracking, etc.
Background Traffic Simulation and its Application in Satellite Network Performance Analysis
PAN Yan-hui,WANG Tao,LI Hua
Computer Science. 2011, 38 (5): 76-88. 
Abstract PDF(259KB) ( 1054 )   
RelatedCitation | Metrics
It is generally accepted that network traffic can be characterized by self-similarity, so it is important to generate network traffic according to this property for network performance analysis. A scheme for generating network background traffic was given, and it was validated by a program implemented in VC++. Furthermore, this procedure was used to simulate a satellite network and to help analyze end-to-end delay. The results suggest that bursty traffic influences satellite network performance; this factor should be taken into account in satellite network routing optimization and traffic allocation.
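One standard way to synthesize self-similar background traffic, which the generator described here presumably resembles in spirit, is to aggregate many ON/OFF sources whose ON and OFF periods are heavy-tailed (Pareto-distributed); all parameters below are illustrative, not taken from the paper.

import numpy as np

def self_similar_traffic(n_sources=50, n_slots=10_000, alpha=1.4,
                         min_period=1.0, rate=1.0, seed=0):
    """Aggregate Pareto ON/OFF sources; the sum is approximately self-similar."""
    rng = np.random.default_rng(seed)
    traffic = np.zeros(n_slots)
    for _ in range(n_sources):
        t, on = 0, rng.random() < 0.5
        while t < n_slots:
            period = int(np.ceil(min_period * (1.0 + rng.pareto(alpha))))
            if on:
                traffic[t:t + period] += rate    # the source emits 'rate' per slot while ON
            t += period
            on = not on
    return traffic

series = self_similar_traffic()
print(series[:10])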
Regular Region Algorithm Optimal Coverage and Connectivity in WSN
SUN Ze-yu,XING Xiao-fei
Computer Science. 2011, 38 (5): 79-82. 
Abstract PDF(352KB) ( 1235 )   
RelatedCitation | Metrics
Achieving coverage and connectivity with the fewest possible sensor nodes under given coverage conditions is one of the challenging core issues in WSN research. An algorithm for optimal coverage and connectivity in a regular region was put forward. Target nodes are distributed within a square range using a double-square construction, and an association model is then built between sensor nodes and target nodes. On this basis, for full coverage of the region, the minimum number of sensor nodes needed for coverage is derived using the expectation of the coverage probability. Finally, the connectivity probability model of fringe nodes within the outer square and its derivation were analyzed. Experiments show that the error between the theoretical value and the simulated result is below 5%, which verifies the validity of the algorithm; the consumption of network resources is decreased, so network coverage and connectivity can be better guaranteed.
QoS Research of H. 264 Video Transmission in Embedded Wireless LAN
LI Wen-xin,LI Yu-guang,HU Yan-su,MU De-jun
Computer Science. 2011, 38 (5): 83-85. 
Abstract PDF(303KB) ( 735 )   
RelatedCitation | Metrics
H.264 transmission over wireless LAN was studied and the existing video transmission strategy was improved according to the characteristics of embedded systems. First, a dynamic transmission algorithm based on feedback and buffer-driven control was proposed on the server; then a zero-buffer, slow-start algorithm for dynamic real-time playback was applied on the client. The improved strategy, which takes the network, the buffer and real-time constraints into account, not only ensures real-time video transmission but also provides network QoS. Experiments show that the two algorithms achieve satisfactory results.
Two Dimensional Dynamic Priority-based FCFS Token-Queuing Algorithm
LIU Jun-rui,CHEN Ying-tu,FAN Xiao-ya
Computer Science. 2011, 38 (5): 89-92. 
Abstract PDF(367KB) ( 700 )   
RelatedCitation | Metrics
According to the characteristics of token-routing switches, the authors put forward a token-queuing algorithm based on FCFS, fixed token priority and dynamic switch-port priority. Requests received by the switch at different times are queued by FCFS, while requests received at the same time are queued by the fixed token priority and the dynamic switch-port priority. This algorithm is therefore called the Two-Dimensional Dynamic Priority-based First Come First Serve algorithm, abbreviated TDDP-FCFS. The authors then used a preemptive M/M/1/∞ queue to model the TDDP-FCFS system, discussed its performance indexes, and gave the calculation method and actual results for these indexes. The results show that TDDP-FCFS takes both the token priority and the switch-port priority into account, meets the scheduling requirements of a token-routing switch, and has high scheduling efficiency.
Time-dependent Chinese Postman Problem Solved by Two Layers SA/GA Algorithm
SUN Jing-hao,WU Xiong,TAN Guo-zhen,YAN Chao
Computer Science. 2011, 38 (5): 93-95. 
Abstract PDF(318KB) ( 990 )   
RelatedCitation | Metrics
The Chinese postman problem is one of the classical problems in graph theory and has been widely and deeply studied since it was proposed; it is applicable in a wide range of fields. With the rapid development of computer networks, computer communication and intelligent transportation systems, problems on time-dependent networks have become more realistic than the classical ones. First, we presented the definition and properties of the time-dependent Chinese postman problem (TDCPP). Then we showed that the classical algorithms for the Chinese postman problem (CPP) do not work in the time-dependent setting. Finally, a two-layer SA/GA algorithm (simulated annealing / genetic algorithm) was proposed; the approach was tested on randomly generated data, and the computational results were analyzed by comparison with a lower bound of the problem.
CCNeter: An Automatic Modeling Tool Based on Petri Nets for C Programs
ZHOU Guo-fu,SUN Yun-qiu,CAI Yu
Computer Science. 2011, 38 (5): 96-101. 
Abstract PDF(502KB) ( 743 )   
RelatedCitation | Metrics
CCNeter is an automatic modeling tool based on CNet, an extension of Petri nets. CCNeter separately describes the data, operations and control in source code, so that the relationships among data, operations and control can be discovered in terms of the Petri net specification. By capturing the dependency relations among the source files, functions and variables of a C project, CCNeter automatically creates a CNet specification for the C program, then draws and lays out the specification. CCNeter is an important prerequisite for static analysis of program code.
Reliable Service Composition Mechanism in Pervasive Environments Based on Graph
ZHANG Jian,RAO Ruo-nan
Computer Science. 2011, 38 (5): 102-106. 
Abstract PDF(426KB) ( 591 )   
RelatedCitation | Metrics
In pervasive environments, devices and network environments are extremely heterogeneous and highly dynamic, so service providers are likely to disconnect from the current computing environment, leading to the failure of service composition. Considering the features of pervasive computing and the requirements of reliable service composition, this paper gave a reliable service composition mechanism, RScMPG. RScMPG ensures the reliability of service composition by generating a more reliable service composition plan and by monitoring the execution of the service composition.
Domain Modeling and Mapping Technique Based on Goal-tree
JIAO Feng
Computer Science. 2011, 38 (5): 107-112. 
Abstract PDF(477KB) ( 573 )   
RelatedCitation | Metrics
Domain-specific software is characterized by heterogeneous data and resources, complicated business processes and changing business requirements from users. It is of practical significance and great value to find a technique for integrating such software, which may be developed by different people using different languages and technologies and may provide very different functionalities. Considering the characteristics of oil-drilling engineering, a goal-tree based domain-specific modeling technique and an SCA-based platform-independent modeling technique were proposed, drawing on SOA and MDA. Finally, a real case, a simulation system for oil-drilling engineering, shows that the modeling techniques are practical for integrating domain-specific software and can meet the requirements of flexible extension and configuration of business modules.
Method Based on Fuzzy Evaluation Model of Layered Component Testability Evaluation
LIU Zhe,ZHANG Wei-qun,XIAO Wei-na
Computer Science. 2011, 38 (5): 113-115. 
Abstract PDF(216KB) ( 645 )   
RelatedCitation | Metrics
This paper analyzed the state of research on component testability and proposed a layered component testability assessment method based on a fuzzy evaluation model. Following the principle of fuzzy comprehensive evaluation, the method sets up the index set and the evaluation sets, and puts forward the AHP, SCTFCE and MCTFCE algorithms to produce more accurate evaluation results for component testability. Simulation results showed that it is an effective evaluation method.
New Heuristic Algorithm Based on Web Service Composition
SUN Zhan zhi,ZHU Yi-an, CHE Ming
Computer Science. 2011, 38 (5): 116-118. 
Abstract PDF(256KB) ( 581 )   
RelatedCitation | Metrics
Web service composition is often considered one of the most important building blocks of Service Oriented Architecture. To this end, we presented a new heuristic algorithm named HASC. The algorithm obtains the solution in two steps, traversal search and regression, and both steps use a heuristic to select optimal Web services. In the traversal search step, the number of input parameters a Web service needs is used as the heuristic function; in the regression step, the heuristic function is the cardinality of the intersection between the output parameter set and the goal ontology set. We evaluated the efficiency and effectiveness of HASC with two publicly available test sets, EEE05 and ICEBE05. Compared with other similar algorithms, HASC provides higher efficiency and shorter solution paths for the requests.
COSMIC-FFP Function Size Measurement for Object-oriented Methods
JI Chun-lei,YAN Shun-cheng,SONG Guo-xin
Computer Science. 2011, 38 (5): 119-122. 
Abstract PDF(365KB) ( 646 )   
RelatedCitation | Metrics
The four existing Functional Size Measurement (FSM) methods that meet the ISO standards cannot take the interactions and behaviors of objects into consideration and cannot correctly measure the functional size of object-oriented systems. Based on an analysis of the software development process of object-oriented methods and the characteristics of object-oriented systems, this paper presented an object-oriented full function point method based on COSMIC-FFP, and gave the mapping rules, measurement rules and an example of its application process. This method provides an effective way to measure the functional size of object-oriented systems correctly.
Procedural Semantics and its Implementation of Pruning Operators in Logic Programming Language
LI Hui-qi,ZHAO Zhi-zhuo
Computer Science. 2011, 38 (5): 123-126. 
Abstract PDF(456KB) ( 683 )   
RelatedCitation | Metrics
Pruning operators are used in logic programming to reduce the search space of computations. The importance of pruning operators in logic programming was discussed. However, the implementation of the traditional pruning operator may cause semantic problems. We discussed the Gödel pruning operator, called commit, which can be used to prune away parts of a search tree and can affect the completeness of the search procedure. From this perspective, we proposed a method to realize this control facility in a logic programming language that can support the full implementation of the Gödel language.
Design of Linear System Solver on CCA
ZHOU Jing-jing,SHENG Xin-ya
Computer Science. 2011, 38 (5): 127-128. 
Abstract PDF(286KB) ( 602 )   
RelatedCitation | Metrics
Component-based programming is an important way to decrease complexity and program development time in high-performance computing. In this paper we introduced CCA, a component architecture for HPC, and discussed programming in the CCA environment as well as the design and implementation of CCA components. Through the design and implementation of a linear system solver, the advantages of component-based HPC software can be seen. Experimental results show the performance difference between running the MPI application and the component application.
Software Cascading Failures Based on Coupled Map Lattice
ZHOU Kuan-jiu,LAN Wen-hu,FENG Jin-jin
Computer Science. 2011, 38 (5): 129-131. 
Abstract PDF(341KB) ( 606 )   
RelatedCitation | Metrics
Faults in software systems are unavoidable, and how to provide continuing error-free service when a software system breaks down or is disturbed by external perturbation is an urgent theoretical problem. Static function-call networks and weighted networks derived from dynamic software execution are small-world and scale-free. Based on the coupled map lattice model, the formation mechanism and propagation behavior of cascading failures in software systems were studied, in order to improve the credibility of software testing based on critical nodes.
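A coupled map lattice model commonly used for cascading failures (the family of models the abstract refers to) can be sketched as below: each node's state follows a chaotic logistic map coupled with its neighbors, and a node fails once its state reaches 1. The ring topology, coupling strength and perturbation are illustrative stand-ins, not the software networks studied in the paper.

import numpy as np

def cml_cascade(adj, eps=0.4, steps=50, shock_node=0, shock=1.5, seed=0):
    """Cascading failures on a coupled map lattice with logistic dynamics f(x)=4x(1-x)."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    x = rng.uniform(0, 1, n)
    alive = np.ones(n, dtype=bool)
    f = lambda v: 4.0 * v * (1.0 - v)
    x[shock_node] += shock                       # external perturbation triggers the cascade
    for _ in range(steps):
        just_failed = alive & (x >= 1.0)
        deg = np.maximum(adj.sum(axis=1), 1.0)
        # nodes that just failed keep their (large) state for one step so the shock
        # spreads to neighbors, then they are removed from the lattice
        coupling = (adj @ (f(x) * alive)) / deg
        x_new = np.abs((1.0 - eps) * f(x) + eps * coupling)
        alive &= ~just_failed
        x = np.where(alive, x_new, 0.0)
    return n - alive.sum()                       # number of failed nodes

# Example: a small ring network of 20 nodes (hypothetical topology).
n = 20
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
print(cml_cascade(A))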
Multiple Data Threshold Detecting Method Based on Autonomic Computing
LIU Wen-jie, LI Zhan-huai
Computer Science. 2011, 38 (5): 132-134. 
Abstract PDF(377KB) ( 599 )   
RelatedCitation | Metrics
Autonomic computing systems often use thresholds to mark performance faults. A threshold is a boundary value of a performance counter that is detected automatically by the system: when the performance counter of a device reaches the threshold, a performance fault is considered to have occurred on that device, and the autonomic manager then actively selects policies to recover from the fault so that the whole system maintains a normal state. At present there is little research on system performance faults in the autonomic computing field. Based on a study of the self-monitoring of autonomic computing systems, this paper proposed a multiple-data threshold detecting method: by detecting threshold crossings and recoveries over multiple samples, the system can effectively judge whether a performance fault has occurred. This method assures detection accuracy and provides an effective basis for fault recovery in autonomic computing systems.
Local Cluster Based Biased Sampling of Trajectory Stream
WANG Kao-jie,ZHENG Xue-feng,SONG Yi-ding,AN Feng-liang
Computer Science. 2011, 38 (5): 135-137. 
Abstract PDF(335KB) ( 561 )   
RelatedCitation | Metrics
Managing the trajectories of moving objects is a research focus in mobile computing, and building data synopses by sampling is one of the widely used methods. However, traditional uniform sampling usually discards significant points that reveal relative spatiotemporal changes. A novel biased sampling approach based on the sliding window model was proposed, which exploits the property of local continuity. First, through local clustering, the sliding window is divided into basic windows of various sizes, and the data elements of each basic window are sampled with a biased sampling rate, forming trajectory stream synopses. The algorithm takes advantage of the intrinsic characteristics of trajectory streams and achieves superior approximation quality. Extensive experiments verified the effectiveness of the algorithm.
Mining Frequent Subtrees from Dynamic Database
GUO Xin,DONG Jian-feng,ZHOU Qing-ping
Computer Science. 2011, 38 (5): 138-141. 
Abstract PDF(381KB) ( 624 )   
RelatedCitation | Metrics
Because a dynamic database changes over time, a new algorithm for mining frequent subtrees from dynamic databases was proposed. It puts forward a support algorithm and a subtree search space involving concepts such as tree change probability, subtree expectation support and subtree dynamic support, and investigates the problem of mining frequent subtrees from a dynamic database. During the subtree search, pruning expressions defined by the algorithm and a mixed data structure reduce the subtree search space and efficiently improve the speed of frequent subtree isomorphism testing. The experimental results showed that the new algorithm is effective, workable and has good operating efficiency.
Adaptive Query Optimum Scheduling Strategy on Data Streams
DONG Nan-nan, LU Yan,SONG Bao-yan
Computer Science. 2011, 38 (5): 142-144. 
Abstract PDF(350KB) ( 581 )   
RelatedCitation | Metrics
Query processing mechanisms for data streams were investigated in this paper, and we proposed a dynamic optimization strategy based on multiple factors, named MultiFactor. The strategy decides the scheduling sequence by the number of tuples an operator consumes per unit time, while scheduling time slices are decided by the deadline of a query. The paper also proposed a scheduling method for multi-stream join queries, and gave the timing of adaptive optimization and the adjustment strategy of MultiFactor.
DataStreams Clustering Algorithm Based on Density and Sliding Window
HU Ru,LIN Zhao-wen,KE Hong-li,MA Yan
Computer Science. 2011, 38 (5): 145-148. 
Abstract PDF(356KB) ( 900 )   
RelatedCitation | Metrics
Summarizing the advantages and disadvantages of current data stream clustering algorithms, this paper presented a new data stream clustering algorithm, DsStream. The algorithm uses a double-layer clustering framework and sliding window technology, and clusters the data streams dynamically based on density. The algorithm can mine data streams with clusters of arbitrary shape and track the distribution of the data streams dynamically.
Missing Data Imputation Based on Generalized Mahalanobis Distance
CHEN Huan,HUANG Der-cai
Computer Science. 2011, 38 (5): 149-153. 
Abstract PDF(456KB) ( 1291 )   
RelatedCitation | Metrics
Missing data are inevitable in data collection, and how to restore these data has become one of the hottest issues in data mining. Like most such algorithms, missing data imputation algorithms based on the Mahalanobis distance make full use of the relationships between data. Although the results are acceptable, the covariance matrices are not always invertible, which greatly limits these algorithms. This paper improved the traditional principal component analysis (PCA) method and proposed a new distance, the Generalized Mahalanobis Distance, based on SVD and the Moore-Penrose pseudoinverse. Combining it with a SOFM neural network and entropy, we designed the GS missing data imputation algorithm. Theoretical analysis and simulation show that the Generalized Mahalanobis Distance inherits the advantages of the Mahalanobis distance in dealing with correlated data; the new algorithm not only has good accuracy and stability, but also suits any dataset.
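The core of a pseudoinverse-based generalized Mahalanobis distance (as opposed to the classical form, which needs an invertible covariance matrix) can be sketched with NumPy as follows; this is a generic illustration, not the paper's GS imputation algorithm, and the data are toy values with deliberately singular covariance.

import numpy as np

def generalized_mahalanobis(x, y, data):
    """Mahalanobis-style distance using the Moore-Penrose pseudoinverse of the covariance."""
    cov = np.cov(np.asarray(data), rowvar=False)
    cov_pinv = np.linalg.pinv(cov)          # defined even when cov is singular
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(diff @ cov_pinv @ diff))

# Toy example with linearly dependent columns (singular covariance matrix).
data = np.array([[1.0, 2.0, 3.0],
                 [2.0, 4.0, 6.0],
                 [3.0, 6.0, 9.0],
                 [4.0, 8.0, 12.0]])
print(generalized_mahalanobis(data[0], data[3], data))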
Customer Segmentation Modeling on Factor Analysis and K-MEANS Clustering
PENG Kai,QING Yong-bin,XU Dao-yun
Computer Science. 2011, 38 (5): 154-158. 
Abstract PDF(545KB) ( 1548 )   
RelatedCitation | Metrics
To exploit customers' potential demand for data services, customer segmentation has become a preliminary task for telecommunications operators in order to run differentiated marketing. Using a clustering algorithm, this paper presented a segmentation model for differentiating customers of short messaging services in telecommunications operators. Firstly, based on factor analysis, redundant properties are reduced in the complex, multi-variable data mining in order to improve the quality and efficiency of the modeling; then the customer segmentation model is constructed through the unsupervised K-MEANS clustering algorithm. It was verified with the cluster model that SMS users have clearly differentiated characteristics. In 2009, a western communications enterprise achieved significant benefits by applying the model to differentiated data service marketing.
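Assuming a generic tabular usage dataset, the two modeling stages described above (factor analysis to reduce redundant attributes, then k-means on the factor scores) might be sketched with scikit-learn as follows; the attribute matrix, factor count and segment count are hypothetical, not the paper's settings.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

def segment_customers(usage_matrix, n_factors=4, n_segments=5, seed=0):
    """Factor analysis for dimension reduction, then k-means segmentation (sketch)."""
    X = StandardScaler().fit_transform(usage_matrix)
    scores = FactorAnalysis(n_components=n_factors, random_state=seed).fit_transform(X)
    km = KMeans(n_clusters=n_segments, random_state=seed, n_init=10).fit(scores)
    return km.labels_, km.cluster_centers_

# Toy example: 200 customers x 10 hypothetical SMS-usage attributes.
rng = np.random.default_rng(0)
usage = rng.poisson(lam=5.0, size=(200, 10)).astype(float)
labels, centers = segment_customers(usage)
print(np.bincount(labels))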
Neighboring Point Data Recovery for CDP Based on Data Gap
HOU Li-man,LI Zhan-hua,HU Na
Computer Science. 2011, 38 (5): 159-163. 
Abstract PDF(473KB) ( 543 )   
RelatedCitation | Metrics
Block-level CDP systems cannot provide recovery points with clear semantic information, so users have to execute data recovery several times to get correct data. With traditional algorithms, each data recovery is time consuming and imposes a performance overhead. The target data of two neighboring recovery points differ only by a small data gap. This paper divided neighboring-point data recovery into four types and presented a gap algorithm which eliminates redundant data and writes missing data using a bit table. Prototype experiments prove that the algorithm is correct and that the smaller the gap data, the higher the efficiency.
Chernoff Bound Based Approximate Frequent Itemset Mining Method over Streams
LI Hai-feng,ZHANG Ning
Computer Science. 2011, 38 (5): 164-168. 
Abstract PDF(446KB) ( 723 )   
RelatedCitation | Metrics
A data stream is fast, unbounded and dynamic; these characteristics constrain the computational and storage resources available for mining frequent itemsets. This paper addressed this problem and proposed a simple and effective algorithm, AFIoDS. AFIoDS is an approximate algorithm based on the sliding window model which splits the stream into batches and maintains them with 2-tuple lists; a false-negative result can thus be obtained using a probabilistic parameter based on the Chernoff bound. The approximation is adjusted dynamically to guarantee that the error of the mined frequent itemsets is controllable. In addition, a compressed representation of the frequent itemsets, the closed frequent itemsets, is used to represent the results of each batch for further memory saving. Experimental results on three real-world datasets show that, without loss of precision, AFIoDS is faster and uses much less memory than state-of-the-art algorithms.
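The kind of probabilistic error bound mentioned here, a Chernoff/Hoeffding-style bound on the support estimated from n observed transactions, is commonly written as in the sketch below; the values of delta and n are illustrative, and this is not necessarily the exact parameterization AFIoDS uses.

import math

def support_error_bound(n, delta=0.01):
    """With probability >= 1 - delta, the observed support of an itemset over n
    transactions deviates from its true support by at most the returned eps."""
    return math.sqrt(math.log(2.0 / delta) / (2.0 * n))

# Example: raising the reporting threshold to min_support + eps makes every
# reported itemset truly frequent with high probability (false negatives allowed).
n, min_support = 20_000, 0.05
eps = support_error_bound(n)
print(eps, min_support + eps)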
Study on Indexing Method for Korean Information Search
JIN Guang-he,WANG Xing-wei,JIANG Ding-de
Computer Science. 2011, 38 (5): 169-174. 
Abstract PDF(525KB) ( 599 )   
RelatedCitation | Metrics
Based on a thorough analysis of Korean information search systems, this paper investigated indexing methods to improve search performance. The advantages and shortcomings of typical indexing methods, such as noun-unit indexing, morphological-analysis indexing, n-gram-unit indexing and word-segmentation-unit indexing, were analyzed in detail, and the key factor with a significant impact on search performance was found by trial and error. At the same time, thirty Korean stop words, the indexing scheme used for search and its characteristics were described. Finally, a new indexing method for Korean information search was proposed by taking advantage of each indexing method. Simulation results show that the proposed method achieves significant performance improvement and is promising.
Research of User Interest Drift Pattern Based on Web Access Information
MA Li,TAN Wei, LI Pei
Computer Science. 2011, 38 (5): 175-177. 
Abstract PDF(374KB) ( 669 )   
RelatedCitation | Metrics
Focusing on the phenomenon that a user's visiting interests change with time, a model for mining user interest drift patterns was proposed. The user's visiting interests are abstracted into a time sequence with a hidden Markov method, which reflects the sequential features of the user's interests. The GSP algorithm is then used to mine the drift pattern of the user's visiting interests from the interest sequence. Finally, the feasibility of the model was verified by simulation experiments; the model can give a deeper description of the drift of visiting interests over time.
Predicate Formal System VULh[0.75,1] and its Soundness
MA Ying-cang,HE Hua-can
Computer Science. 2011, 38 (5): 178-180. 
Abstract PDF(310KB) ( 647 )   
RelatedCitation | Metrics
The aim of this paper is the axiomatization of the first-order predicate calculus formal system VULh[0.75,1] based on the first-level universal AND operator. By introducing the universal quantifier and the existential quantifier, the predicate calculus formal deductive system VULh[0.75,1] based on the 1-level universal AND operator was built up from the propositional calculus formal deductive system VULh[0.75,1] of universal logic; moreover, the soundness and deduction theorems of the system VULh[0.75,1] were proved.
Subject Sentence Extraction Based on Undirected Graph Construction
GE Bin,LI Fang-fang,LI Fu,XIAO Wei-dong
Computer Science. 2011, 38 (5): 181-185. 
Abstract PDF(444KB) ( 662 )   
RelatedCitation | Metrics
An undirected graph based on sentences was proposed, and the problem of sentence extraction was transformed into computing the node weights of the undirected graph. This paper first proposed a sliding-window-based keyword extraction algorithm, followed by the construction of the undirected graph; the edge weights of the graph were then modeled by the Vector Space Model (VSM). Finally, the node weights were computed by a weighting model based on the similarity matrix, and the subject sentences were obtained according to the compression ratio. Experiments show that the proposed automatic summarization technique effectively improves recall and accuracy.
Self-adaptive Optimization for Traffic Flow Model Based on Evolvable Hardware
NIE Xin,LI Yuan-xiang,WANG Long,LIU Lin
Computer Science. 2011, 38 (5): 186-189. 
Abstract PDF(362KB) ( 632 )   
RelatedCitation | Metrics
The BML model is a cellular automata model used to simulate and analyze the traffic system in a road-network structure. Simulation and evolutionary optimization of the model implemented in software are slow and inefficient, which greatly limits the use of the traffic flow model in high-real-time, high-speed situations. In view of this problem, we presented the architecture of an EHW-based cellular automata model, i.e., a cellular automata model implemented on an evolvable hardware platform and intended for on-line evolution of the traffic flow model, so that it can adjust the traffic light rules according to the real-time state of the traffic flow. A careful analysis of the comparison results shows that the self-adaptive optimization of the traffic flow model based on evolvable hardware is useful and can meet the needs of research and design of intelligent traffic systems.
Classifier Algorithm Based on Orthogonal Projection
WANG Wei-dong,MIAO Shuai,YANG Jing-yu
Computer Science. 2011, 38 (5): 190-193. 
Abstract PDF(356KB) ( 667 )   
RelatedCitation | Metrics
In this paper, a novel classifier algorithm based on orthogonal projection was presented. This classifier projects a testing sample onto the orthogonal subspaces spanned by the training samples of each class, calculates the distance between the testing sample and each subspace, and then classifies the testing sample according to the distance. The characteristic of this classifier is that it does not need to calculate the inverses of covariance matrices, so it is very suitable for small sample size problems. Experimental results on the ORL database show that this classifier outperforms traditional classifiers in recognition rate.
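A minimal sketch of such a subspace classifier, using a QR factorization to get an orthonormal basis for each class's training samples and the projection residual as the distance, might look like this; it illustrates the generic technique rather than the paper's exact formulation, and the toy data are random.

import numpy as np

class OrthogonalProjectionClassifier:
    """Assign a sample to the class whose training-sample subspace is nearest."""
    def fit(self, X, y):
        self.bases_ = {}
        for c in np.unique(y):
            A = X[y == c].T                       # columns are training samples of class c
            Q, _ = np.linalg.qr(A)                # orthonormal basis of the class subspace
            self.bases_[c] = Q
        return self

    def predict(self, X):
        labels = []
        for x in X:
            # residual norm ||x - Q Q^T x|| = distance from x to the class subspace
            residuals = {c: np.linalg.norm(x - Q @ (Q.T @ x))
                         for c, Q in self.bases_.items()}
            labels.append(min(residuals, key=residuals.get))
        return np.array(labels)

# Toy small-sample-size example: 5 training samples per class in a 20-D space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (5, 20)), rng.normal(3, 1, (5, 20))])
y = np.array([0] * 5 + [1] * 5)
print(OrthogonalProjectionClassifier().fit(X, y).predict(X[:3]))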
P-Fuzzy Sets (AF ,AF) and its Applications
YAN Li-mei,XU Feng-sheng,SHI Kai-quan
Computer Science. 2011, 38 (5): 194-198. 
Abstract PDF(411KB) ( 563 )   
RelatedCitation | Metrics
By introducing dynamic characteristics into a finite ordinary set X, the ordinary set X was improved and P-sets (packet sets) were proposed. A P-set is a pair of sets composed of an internal P-set XF (internal packet set) and an outer P-set XF (outer packet set); that is, (XF, XF) is a P-set. P-sets have dynamic characteristics: the internal P-set has internal dynamic characteristics and the outer P-set has outer dynamic characteristics. Introducing P-sets (XF, XF) into the fuzzy set A of L. A. Zadeh, the fuzzy set A was improved and P-fuzzy sets (packet fuzzy sets) were given. A P-fuzzy set is a pair of fuzzy sets composed of an internal P-fuzzy set AF (internal packet fuzzy set) and an outer P-fuzzy set AF (outer packet fuzzy set); that is, (AF, AF) is a P-fuzzy set. P-fuzzy sets have dynamic characteristics, and some properties and applications of P-fuzzy sets were given. Under certain conditions, the P-fuzzy set (AF, AF) returns to the original ordinary fuzzy set A of L. A. Zadeh. P-fuzzy sets have more extensive applications than Zadeh's fuzzy sets and are a new direction of fuzzy set theory and applied research.
Research on New Optimal Portfolio Selection Model with Uncertain Returns
LIU Jian-jun
Computer Science. 2011, 38 (5): 199-202. 
Abstract PDF(312KB) ( 751 )   
RelatedCitation | Metrics
This paper solved the portfolio selection problem in which security returns are uncertain. From a new perspective, the paper gave a new definition of risk for uncertain portfolio selection and proposed a new optimal portfolio selection model based on this definition. A new hybrid intelligent algorithm was designed for solving the new optimization problem: in the proposed algorithm, the 99-method is employed to calculate the expected value and the chance value, which greatly reduces the computational work and speeds up the solution process compared with the hybrid simulation used in our previous algorithm. A numerical example was presented to illustrate the feasibility and validity of the new modeling idea and the proposed algorithm.
Data Dependence and Separation-application of Abnormal Data
LIN Hong-kang,LI Yu-yingu,RUAN Qun-Sheng
Computer Science. 2011, 38 (5): 203-207. 
Abstract PDF(413KB) ( 670 )   
RelatedCitation | Metrics
Two kinds of phenomenon often occur in data transfer: some data elements are lost, and some unknown data elements enter into the original data; such phenomena turn the data into abnormal data. Using a new mathematical model, theoretical research on and applications of these two phenomena were given. The new model is called P-sets (packet sets). A P-set is a set pair composed of an internal P-set XF (internal packet set) and an outer P-set XF (outer packet set); (XF, XF) is called a P-set. In this paper, the concepts and characteristics of F-dependence and F̄-dependence were presented, and the dependence theorem and its application to abnormal data separation were given. Data dependence is one of the application characteristics of P-sets and provides a new theory and method for dealing with dynamic data systems.
F-outer Embedding Information and F-heredity Identification-application
YU Xiu-qing
Computer Science. 2011, 38 (5): 208-211. 
Abstract PDF(377KB) ( 583 )   
RelatedCitation | Metrics
Based on the dynamic characteristics of the outer P-set, this paper gave the concepts of F-outer embedding information, its core and its measurement. Using these concepts, the paper obtained a sequence of theorems such as the iterative F-outer embedding information theorem, the F-outer embedding information existence theorem, the F-heredity theorem of F-outer embedding information, and the recovery-restore theorem of F-heredity. Finally, an application to F-heredity identification was presented. One of the important characteristics of the outer P-set is to discover information existing outside some given information, and this discovery is accomplished through F-heredity.
P-sets and F-memory Information Characteristic-application
WANG Yang,SHI Jin-chang,SHI Kai-quan
Computer Science. 2011, 38 (5): 212-215. 
Abstract PDF(401KB) ( 566 )   
RelatedCitation | Metrics
P-sets (packet sets) are composed of an internal P-set XF (internal packet set) and an outer P-set XF (outer packet set); (XF, XF) is a P-set. P-sets possess dynamic characteristics: they are obtained by deleting some attributes from, and adding some attributes to, the attribute set α of a Cantor set X. Using the structure and dynamic characteristics of P-sets, the concepts of F-memory information generation, F-memory information measurement and the F-memory circle were given, and the F-memory information existence theorem, the F-memory information recovery theorem and the F-memory information characteristic theorem were proposed. Using these results, an application of F-memory information was given. P-sets are a new mathematical model and method.
Effect of User-Item Degree Correlations on Bipartite Network Personalized Recommendations
CHENG Ting-ting,WANG Heng-shan,LIU Jian-guo
Computer Science. 2011, 38 (5): 216-219. 
Abstract PDF(372KB) ( 575 )   
RelatedCitation | Metrics
In this paper the bipartite graph was first projected based on mass diffusion, and a random walk method was then used to obtain collaborative filtering results. The degree correlation between users and objects was embedded into the similarity index to improve the algorithm. Numerical simulation shows that the accuracy of the presented algorithm is improved by 18.19% in the optimal case and the diversity is improved by 21.90%. Statistical analysis of the distribution of the product of user and object degrees indicates that, in the optimal case, the distribution obeys a power law with exponent equal to -2.33.
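The baseline mass-diffusion (probabilistic spreading) step on a user-item bipartite graph, before any degree-correlation correction such as the one studied here, can be sketched as follows; the small adjacency matrix is a toy example.

import numpy as np

def mass_diffusion_scores(A):
    """A is the user-item adjacency matrix (users x items, entries 0/1).
    Resource spreads items -> users -> items; returns recommendation scores."""
    A = np.asarray(A, dtype=float)
    k_item = np.maximum(A.sum(axis=0), 1.0)        # item degrees
    k_user = np.maximum(A.sum(axis=1), 1.0)        # user degrees
    # transfer[j, i]: fraction of item j's initial resource that ends up on item i
    transfer = (A / k_item).T @ (A / k_user[:, None])
    scores = A @ transfer                          # scores[u, i] for each user/item pair
    scores[A > 0] = -np.inf                        # do not re-recommend collected items
    return scores

# Toy example: 3 users x 4 items.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 0, 1, 1]])
print(mass_diffusion_scores(A))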
Feature Selection Model and Generalization Performance of Two-class Emotion Recognition Systems Based on Physiological Signals
WEN Wan-hui,LIU Guang-yuan,XIONG Xie
Computer Science. 2011, 38 (5): 220-223. 
Abstract PDF(362KB) ( 747 )   
RelatedCitation | Metrics
The feature selection process in an emotion recognition system is an NP-hard problem: the scale of the problem increases exponentially with the number of initial features. The goal of establishing good two-class emotion recognition systems is to find a subset of the initial features that minimizes the missing rate and the false rate of the system. This task was treated as a combinatorial optimization problem and solved by a Tabu search algorithm combined with a Fisher classifier. Two kinds of physiological signals (galvanic skin response and heart rate), recorded under four discrete emotion states (joy, anger, grief and fear) from 66 college students, were used to build the systems. It was found that the feature selection problem can be properly solved by Tabu search and that the user-independent emotion recognition systems have good generalization performance. Furthermore, individual differences in affective physiological responses have different influences on the recognition of joy, anger, grief and fear.
AP Clustering Based Biomimetic Pattern Recognition
DING Jie,YANG Jing-yu
Computer Science. 2011, 38 (5): 224-226. 
Abstract PDF(374KB) ( 610 )   
RelatedCitation | Metrics
A classifier based on AP clustering and biomimetic pattern recognition was proposed. It classifies samples by calculating the distance to the relevant subspace. The training sample space is constructed by the AP algorithm and biomimetic pattern recognition theory. The posterior probabilities based on the class conditions are estimated to reduce the rejection rate caused by subspace overlapping while keeping the misclassification rate low. Experiments were performed with Concordia University CENPARMI's handwritten digit database and Nanjing University of Science and Technology's handwritten amount database. The results indicate that the proposed classifier has a higher recognition rate than traditional classifiers.
Particle Swarm Optimizer with Simulated Binary Crossover and Polynomial Mutation and its Application
LIU Yan-min,NIU Ben,ZHAO Qing-zhen
Computer Science. 2011, 38 (5): 227-230. 
Abstract PDF(352KB) ( 914 )   
RelatedCitation | Metrics
PSO easily gets trapped in a local optimum when solving multimodal problems. In view of this drawback, we presented a variant of particle swarm optimizer (PSO) with simulated binary crossover and polynomial mutation (SPDPSO for short). In SPDPSO, an external archive is additionally introduced to store the personal best particles (pbest), and simulated binary crossover and polynomial mutation are used to produce new particles. On benchmark functions, the results demonstrate the good performance of the SPDPSO algorithm in solving complex multimodal problems compared with other algorithms. In a practical application, the experimental results show that SPDPSO achieves better solutions than other PSOs.
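The two variation operators named here follow standard formulas from the evolutionary computation literature; a compact sketch (with distribution indices and mutation probability chosen arbitrarily, not the paper's settings) is given below.

import numpy as np

def sbx_crossover(p1, p2, eta=15, rng=None):
    """Simulated binary crossover on real-valued vectors (standard formulation)."""
    rng = rng or np.random.default_rng()
    u = rng.random(p1.shape)
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0)))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

def polynomial_mutation(x, lower, upper, eta=20, prob=0.1, rng=None):
    """Polynomial mutation within box bounds (standard formulation)."""
    rng = rng or np.random.default_rng()
    x = x.copy()
    for i in range(len(x)):
        if rng.random() < prob:
            u = rng.random()
            delta = ((2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0 if u < 0.5
                     else 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0)))
            x[i] = np.clip(x[i] + delta * (upper[i] - lower[i]), lower[i], upper[i])
    return x

# Example: vary two archived personal-best particles (toy 3-D vectors).
rng = np.random.default_rng(0)
a, b = np.array([0.1, 0.5, 0.9]), np.array([0.4, 0.2, 0.8])
c1, c2 = sbx_crossover(a, b, rng=rng)
print(polynomial_mutation(c1, np.zeros(3), np.ones(3), rng=rng))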
Research on Adaptive Modeling and Scenario-driven Online Simulation for Product Lifecycle Management
BAI Yong,CHEN Yang,ZHAO Yong
Computer Science. 2011, 38 (5): 231-235. 
Abstract PDF(595KB) ( 627 )   
RelatedCitation | Metrics
Modeling and Simulation (M&S), applied to product lifecycle management (PLM), can improve the efficiency and quality of product development, usage and maintenance. But with the growing complexity of products, M&S needs high adaptability to environmental change. Against the background of weapon acquisition, an adaptive simulation framework adopting hierarchical multi-agent technology was presented. Under this framework, scenario models, which drive the reorganization of the simulation systems, are generated and mapped level by level according to the evolution of the real systems and their environment. The framework reflects the parallelism, dynamics and integrativity of the simulation systems, and can enhance the adaptability and applicability of M&S throughout PLM.
Design and Implement of Embedded Web Server toward Video Surveillance System
ZHOU Ruo-gu,DING
Computer Science. 2011, 38 (5): 236-239. 
Abstract PDF(338KB) ( 756 )   
RelatedCitation | Metrics
Based on the ARM2410 platform, this paper proposed hardware and software solutions for an embedded Web server in a video surveillance security system. The program model was constructed using a multi-process HTTP server engine to improve response speed; SESSIONs were used to save client-side and server-side connection state to improve system security and efficiency; and an improved page-prefetching method based on Wcol with an intelligent predictive algorithm was used to improve the efficiency of the server. Experimental results show that the scheme is correct and is safe, fast in response and highly stable.
RMLT DR Algorithm's Research and Implementation in Pedestrian Navigation System
ZHENG Wei,WANG Wei-xing,LIANG Shun-long
Computer Science. 2011, 38 (5): 240-243. 
Abstract PDF(329KB) ( 880 )   
RelatedCitation | Metrics
Dead reckoning is the most common position-projection algorithm in pedestrian navigation. This paper analyzed traditional dead reckoning: with a fixed threshold, dead reckoning for pedestrian navigation cannot adapt to different pedestrian environments or adjust the threshold automatically, which leads to lower accuracy. This paper therefore proposed a radar-based multi-level threshold dead reckoning algorithm, RMLT DR. Simulation experiments compared the RMLT DR algorithm with ordinary dead reckoning positioning; while a pedestrian is walking, the RMLT DR algorithm can automatically select the threshold according to the surrounding environment and achieves higher accuracy.
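For context, the fixed-threshold pedestrian dead reckoning baseline that RMLT DR improves upon essentially detects steps from accelerometer peaks against a single threshold and advances the position by a step length along the current heading; the sketch below uses made-up sensor values, a hypothetical fixed threshold and a fixed step length.

import math

def dead_reckon(accel_mag, headings_rad, step_len=0.7, threshold=11.0,
                start=(0.0, 0.0)):
    """Fixed-threshold step detection plus position projection (baseline PDR sketch)."""
    x, y = start
    above = False
    track = [(x, y)]
    for a, theta in zip(accel_mag, headings_rad):
        if a > threshold and not above:          # rising edge over the threshold = one step
            x += step_len * math.sin(theta)      # east component
            y += step_len * math.cos(theta)      # north component
            track.append((x, y))
        above = a > threshold
    return track

# Toy accelerometer magnitudes (m/s^2) and headings (radians).
accel = [9.8, 12.0, 9.5, 12.5, 9.6, 12.2]
head = [0.0, 0.0, 0.1, 0.1, 1.57, 1.57]
print(dead_reckon(accel, head))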
DCSP-based Resource Allocation Approach for Emergency Rescue in Coal Mine
LI Wei,ZHANG Zi-li,WU Hua-jun
Computer Science. 2011, 38 (5): 244-248. 
Abstract PDF(486KB) ( 690 )   
RelatedCitation | Metrics
How to allocate resources efficiently is pivotal for emergency rescue after a large-scale incident. This paper discussed an appropriate resource allocation approach for emergency rescue in coal mines. The distributed constraint satisfaction problem (DCSP), an effective approach for resource allocation, is suitable for representing and solving collaborative problems in distributed situations; its features, such as distributed information and demands that change with a dynamic environment, are also characteristics of emergency rescue in coal mines. This paper adopted the DCSP approach to solve resource allocation for emergency rescue in coal mines: it formulated the DCSP model, defined the Agent and Constraint models, and improved the Multiple Asynchronous Weak-commitment Search (MAWS) algorithm used to solve the DCSP. The experimental results testify that the DCSP approach is effective and feasible for solving resource allocation for emergency rescue in coal mines.
Feature Preserved Mesh Simplification Algorithm Based on Stochastic Sampling
ZHAO Ye,ZHOU Chang,WANG Chang
Computer Science. 2011, 38 (5): 249-251. 
Abstract PDF(261KB) ( 686 )   
RelatedCitation | Metrics
This paper presented a new mesh simplification algorithm based on stochastic sampling driven by local geometric features. First, the local geometric feature value of each triangle is computed and the selection probability of the triangle is obtained from the probability distribution function. Then the selected triangles are collapsed and new vertices are generated by minimizing the volume change between the original mesh and the simplified mesh. The experimental results show that mesh models are simplified while the volume is kept and the detail features are preserved.
Emotion Mechanism of Uncertain Behavior Selection
ZHANG Guo-feng,LI Zu-shu
Computer Science. 2011, 38 (5): 252-257. 
Abstract PDF(611KB) ( 604 )   
RelatedCitation | Metrics
Through a thorough analysis of emotion-related theories in psychology and microeconomics, the attributes of emotion were clarified: its essence is energy, its function is to drive behavior, and its sort reflects the sort of survival resources; the emotional essence of motivation, (inner) drive and utility (subjective value) was also identified. According to the common law obeyed by these kinds of sense, and drawing on Prospect Theory, an emotion schema function was built to obtain the functional relationship between emotion and living resources. Based on this work, and adopting an optimistic selection principle, an emotion-driven behavior selection mechanism was established. To validate the mechanism, emotion-driven competitive and cooperative behavior selection mechanisms and a fight behavior selection mechanism were constructed, and simulations were carried out on the Swarm platform. The simulation results confirm the proposed mechanism, thereby providing a theoretical basis for autonomous behavior selection under uncertain conditions.
Robust Watermarking Algorithm Based on the Feature of Digital Image
CHEN Hai-peng,QIN Jun,SHEN Xuan-jing,WANG You wei
Computer Science. 2011, 38 (5): 258-260. 
Abstract PDF(356KB) ( 948 )   
RelatedCitation | Metrics
In order to achieve image authentication and copyright protection simultaneously, we proposed a feature-based robust digital image watermarking algorithm. The Hessian-Affine feature detector was first used to find the characteristic regions of an image, and the copyright watermark was then embedded into these regions according to the local orientation of each pixel. The remaining regions were used for image authentication with a fragile watermarking method. The watermark extraction scheme mirrors the embedding scheme and is executed blindly, since the host image is not needed. The experimental results show that the proposed watermarking algorithm can resist most removal and geometric attacks, and that changes to an image are reflected in the hidden watermarks.
Lip Feature Extraction Based on DCT and ONPP
LIANG Ya-ling,DU Ming-hui
Computer Science. 2011, 38 (5): 261-264. 
Abstract PDF(389KB) ( 699 )   
RelatedCitation | Metrics
For the visual feature extraction of a visual-only lipreading system, this paper proposed a DCT+ONPP method to extract visual lip features. Compared with PCA, which preserves the global structure, ONPP preserves the intrinsic geometry of local neighborhoods, while the overlap of the neighborhoods keeps the global structure of the data. The experimental results show that the proposed method performs better than the PCA and DCT+PCA methods. The influence of the number of neighbors on the system was also studied; the experimental results show that 3 neighbors give better performance.
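As a sketch of the DCT stage only (the ONPP projection that follows in the paper is not reproduced here), a 2-D DCT of a grayscale lip region of interest can be truncated to its low-frequency block to form a compact feature vector; the ROI size and the number of retained coefficients are assumptions:

```python
# Sketch: low-frequency 2-D DCT coefficients as a compact lip feature vector.
import numpy as np
from scipy.fftpack import dct

def dct_lip_features(lip_roi, keep=8):
    """lip_roi: 2-D numpy array (grayscale mouth region)."""
    coeffs = dct(dct(lip_roi, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:keep, :keep].ravel()   # keep x keep low-frequency block

# e.g. dct_lip_features(np.random.rand(32, 32), keep=8) -> 64-dim vector
```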
Image Watershed Segmentation with Evolutionary Programming
ZHAO Shan,WANG Shui
Computer Science. 2011, 38 (5): 265-267. 
Abstract PDF(385KB) ( 521 )   
RelatedCitation | Metrics
In order to solve over-segmentation in the watershed algorithm, this paper proposed a watershed image segmentation method that combines evolutionary programming with control markers in the image preprocessing stage. Based on the birth-death process of evolutionary programming, the method restructures the structural elements of the control markers, reorganizes the minimum value of each marked point, modifies the control markers accordingly, and then applies the watershed transform to obtain the region segmentation. The experimental results show that the method can effectively solve the over-segmentation problem of the watershed algorithm while retaining the various important target regions, and that the segmentation process can be adjusted through the selected parameters to match image characteristics and specific requirements.
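For context, marker-controlled watershed is the general pattern being modified here; the sketch below uses ordinary distance-transform peaks as markers purely for illustration, in place of the paper's evolutionary-programming marker construction:

```python
# Generic marker-controlled watershed sketch (scikit-image), not the
# paper's evolutionary-programming variant.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

def marker_watershed(binary_image, min_distance=10):
    """binary_image: boolean foreground mask."""
    distance = ndi.distance_transform_edt(binary_image)
    peaks = peak_local_max(distance, min_distance=min_distance,
                           labels=binary_image)
    markers = np.zeros(binary_image.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Flood from the markers on the inverted distance map.
    return watershed(-distance, markers, mask=binary_image)
```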
Design and Implementation of MDSE for Optical Molecular Imaging Simulation
REN Nu-nu,CHEN Duo-fang,CHEN Xue-li,PENG Kuan,MAO Jing-jing,TIAN Jie
Computer Science. 2011, 38 (5): 268-271. 
Abstract PDF(376KB) ( 834 )   
RelatedCitation | Metrics
With the development of research on optical molecular imaging and the improvement of its experiments, a simulation platform that can model light propagation in tissues and free space is urgently needed by researchers. This paper introduced MOSE (Molecular Optical Simulation Environment), a platform that integrates the forward simulation of three modalities of optical molecular imaging. Based on a uniform framework, the platform provides simulation of light propagation in turbid media and free space together with graphic visualization and data analysis functions. The structural design of the platform, the simulation algorithms, the graphic visualization and some simulation examples were described in detail. The experiments show that researchers can efficiently carry out simulation experiments and data analysis for optical molecular imaging studies with the platform.
Gradient Field Optical Flow Estimation for Color Image Sequences Based on Automatic Growcut
LIAO Bin,DU Ming-hui,HU Jin-long
Computer Science. 2011, 38 (5): 272-274. 
Abstract PDF(366KB) ( 626 )   
RelatedCitation | Metrics
Detecting the boundaries of deforming objects is one of the difficulties of optical flow estimation and is hard to solve using improved optical flow algorithms alone. A gradient field optical flow estimation method for color image sequences based on automatic GrowCut was presented. It combines segmentation information with gradient field optical flow estimation to improve the detection accuracy for deforming objects.
Automatic Video Object Segmentation Based on Spatio-temporal Information
ZHANG Xiao-yan,MA Zhi-qiang,ZHAO Yu-bo,SHAN Yong
Computer Science. 2011, 38 (5): 275-278. 
Abstract PDF(460KB) ( 571 )   
RelatedCitation | Metrics
A novel video moving object segmentation algorithm based on spatio-temporal information was proposed in this paper. The algorithm can automatically extract the moving object from video sequences with a static or globally moving background. First, an efficient and accurate global motion compensation method is used to turn the moving background into a static one. In the temporal motion information extraction, the background noise variance is estimated by histogram fitting to overcome the shortcoming of setting the value by experience; a significance test and the symmetrical difference method are then applied to obtain an accurate moving object mask. In the spatial image information extraction, an improved multi-scale watershed algorithm based on viscous morphological gradient correction and edge value merging is employed to segment moving regions, which greatly alleviates over-segmentation problems. Finally, the video object is extracted by performing a double-threshold ratio operation on the spatial and temporal results. Experimental results validate the proposed algorithm.
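A minimal sketch of the temporal part alone is given below: a symmetrical frame difference that keeps pixels whose change exceeds a noise-dependent threshold. The variance estimation by histogram fitting and the later watershed fusion are not reproduced, and the factor k is an assumption:

```python
# Sketch: symmetrical frame difference with a significance-test style threshold.
import numpy as np

def symmetric_difference_mask(prev_f, cur_f, next_f, noise_std, k=3.0):
    """Return a boolean moving-object candidate mask for the current frame."""
    d1 = np.abs(cur_f.astype(float) - prev_f.astype(float))
    d2 = np.abs(next_f.astype(float) - cur_f.astype(float))
    thresh = k * noise_std                 # threshold tied to background noise
    return (d1 > thresh) & (d2 > thresh)
```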
Extracting Methods of Primitive Shapes and Boundary Curves from Scattered Point Set
LIU Guang-shuai,LI Bai-lin,HE Chao-ming
Computer Science. 2011, 38 (5): 279-282. 
Abstract PDF(374KB) ( 1444 )   
RelatedCitation | Metrics
Huge scattered point datasets contain all kinds of scanning artifacts, including noise, outliers, holes and irregular/anisotropic sampling. Most common surface reconstruction methods fail because of these shortcomings, which poses challenges to recovering the dataset's topology and retrieving its features. To solve this problem, a robust and efficient reconstruction method was proposed. First, local properties are computed for each data point; this information is then used to detect simple primitive shapes in the data; finally, a novel method extracts and optimizes boundary curves on the primitive shapes, and the reconstructed boundary curves are employed to extract a piecewise smooth surface mesh. The experimental results show the effectiveness of our method with reconstructions of synthetic and real-scene datasets.
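To illustrate what detecting a simple primitive shape in noisy scattered data can look like, the sketch below fits a single plane with a RANSAC-style loop; the paper's primitive detection and boundary-curve extraction are more elaborate, so treat this only as a sketch of the general idea under assumed tolerances:

```python
# Illustrative RANSAC-style plane detection on a scattered point set.
import numpy as np

def ransac_plane(points, iters=200, tol=0.01, rng=None):
    """points: (N, 3) array; returns ((point, normal), inlier indices)."""
    rng = rng or np.random.default_rng()
    best_inliers = np.array([], dtype=int)
    best_model = None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                       # degenerate sample, skip
        normal /= norm
        dist = np.abs((points - p0) @ normal)
        inliers = np.nonzero(dist < tol)[0]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (p0, normal)
    return best_model, best_inliers
```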
Research on Workload Balancing Strategy Based on Flexible Layout
LIU Qun,FENG Dan,LI Jian
Computer Science. 2011, 38 (5): 283-286. 
Abstract PDF(360KB) ( 547 )   
RelatedCitation | Metrics
Workload balancing has always been a key research issue in the scalable object-based mass storage system (BSO-MSS), where the central question is how to choose the Storage Objects (SOs) and how many of them to use. This paper presented a workload balancing strategy that considers not only the effect of the network on BSO-MSS but also the SOs themselves. To cope with the different storage abilities of the SOs, the system adaptively chooses the number of SOs and adopts different striping sizes for data storage. While the number of SOs has not yet reached the optimal value, adding more SOs reduces the response time, and the overall system performance is thus enhanced.
Cache-style Parallel Checkpointing for Large-scale Computing System
LIU Yong-yan,LIU Yong-peng,FENG Hua,CHI Wan-qing
Computer Science. 2011, 38 (5): 287-289. 
Abstract PDF(385KB) ( 814 )   
RelatedCitation | Metrics
Checkpointing is a typical fault-tolerance technique, but its scalability is limited by the overhead of file access. Based on the multi-level file system architecture, cache-style parallel checkpointing was introduced, which translates globally coordinated checkpointing into local file operations by out-of-order pipelining of the checkpoint flushing opportunity. The overhead of write-back is hidden effectively, which increases the performance and scalability of parallel checkpointing.
Automatic Data Permutation Generation and Optimization for SIMD Devices
CHEN Xiang,SHEN Li,LI Jia-wen
Computer Science. 2011, 38 (5): 290-294. 
Abstract PDF(457KB) ( 739 )   
RelatedCitation | Metrics
Nowadays, more and more general-purpose microprocessors provide enhanced SIMD instruction-set extensions to exploit data-level parallelism. However, some inherent characteristics of applications and algorithms, such as unaligned memory addresses, non-consecutive memory accesses and control flow, force compilers or programmers to use permutation instructions to reorganize the elements of vectors and obtain correct operands for SIMD instructions. These redundant permutation instructions have become a performance bottleneck in exploiting data-level parallelism. This paper proposed an automatic data permutation generation and optimization algorithm that can effectively reduce the performance loss caused by permutation instructions. The algorithm is based on a new intermediate representation that carries enough address information about the operands, with which the data permutation generation and optimization problem can be solved by identifying and eliminating all conflict edges in the data flow graph at minimal cost. Test results on a group of typical multimedia programs show that the algorithm achieves performance acceleration of up to 7% on average.
Cyber-physical System Architecture Design
CHEN Li-na,WANG Xiao-le,DENG Su
Computer Science. 2011, 38 (5): 295-300. 
Abstract PDF(592KB) ( 1520 )   
RelatedCitation | Metrics
Cyber-Physical System (CPS) is a next-generation intelligent system based on network and embedded system technology, and its architecture framework is a key CPS technology. We analyzed the concept and features of CPS and proposed a three-layer architecture framework consisting of a physical layer, a network layer and an application layer. We discussed each layer and described its role. The three-layer architecture framework was shown to be consistent with the concept and features of CPS by an experiment in an intelligent traffic system, and it can guide future research directions in CPS.
Instructing Low-power TLB Design by the Analysis of Program Behavior
SHI Li-wen,FAN Xiao-ya,CHEN Jie,HUANG Xiao-ping,ZHENG Qiao-shi
Computer Science. 2011, 38 (5): 301-305. 
Abstract PDF(484KB) ( 606 )   
RelatedCitation | Metrics
The Translation Look-aside Buffer (TLB) is a dedicated hardware component that the Memory Management Unit (MMU) uses to speed up page address translation. However, some researchers indicate that active TLBs may account for as much as 17% of a processor's total power consumption. The objective of this paper is to effectively reduce on-chip TLB power consumption by examining program behavior with respect to page access traits. Through careful analysis of the memory access patterns of the SPEC CPU benchmarks, we demonstrated that the page interval heavily exhibited by non-sequential page accesses can be used to largely reduce TLB power. Based on this observation, we proposed a novel low-power TLB design methodology. Experimental results show that with our design the on-chip power consumption can be further reduced.
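One plausible way to gather such a profile from an address trace is sketched below: it counts the number of accesses between consecutive page changes (one reading of "page interval"). The SPEC traces and the low-power TLB design itself are outside this snippet, and the 4 KB page size is an assumption:

```python
# Sketch: histogram of intervals between non-sequential page accesses
# in a memory-address trace.
from collections import Counter

def page_interval_histogram(addresses, page_size=4096):
    pages = [a // page_size for a in addresses]
    hist = Counter()
    last_change = 0
    for i in range(1, len(pages)):
        if pages[i] != pages[i - 1]:       # access leaves the current page
            hist[i - last_change] += 1     # accesses since the previous change
            last_change = i
    return hist
```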