Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 39, Issue 6 (2012)
Disk Storage Testing Technology
Computer Science. 2012, 39 (6): 1-5. 
Abstract PDF(433KB) ( 630 )   
We studied techniques for testing the IOPS and data transfer speed of disk arrays in mass storage systems, proposed a parallel testing technique for high-performance disk arrays, and carried out the testing on a newly developed disk array. By carefully examining the many factors that affect disk performance and quantifying them through experiments, we evaluated the disk array as a whole. Finally, we studied the bottlenecks of the disk arrays and verified that the results are correct.
Recent Development of Application of Swarm Intelligence in Multi-agent System
Computer Science. 2012, 39 (6): 6-9. 
Abstract PDF(389KB) ( 1207 )   
Swarm intelligence algorithms are a kind of distributed method inspired by the collective behaviors of social insects. Their application in multi-agent systems aims to improve robustness, flexibility and adaptability. Taking the application of swarm intelligence in multi-agent systems as its main thread, this paper discussed the critical mechanisms of swarm intelligence, and then summarized current research on swarm intelligence applied to multi-agent communication, multi-agent cooperation, multi-agent learning and multi-agent architecture. Finally, this paper analyzed some problems in current research and offered some preliminary views.
State-of-the-Art on Gait Recognition
Computer Science. 2012, 39 (6): 10-15. 
Abstract PDF(623KB) ( 972 )   
As a new biometric technology, gait recognition has attracted a great deal of interest in the computer vision community due to its advantage of unobtrusive recognition at a distance. In this paper, the general process of gait recognition was described first. Then the paper reviewed most existing typical methods for gait recognition and analyzed their respective pros and cons. In addition, the challenges that constrain practical application of gait recognition systems were discussed, and more than 10 publicly available datasets and the experimental results of some typical methods were reported. Finally, future trends were given to guide further research in this field.
Research on Self Service Terminal Interaction Design
Computer Science. 2012, 39 (6): 16-20. 
Abstract PDF(474KB) ( 958 )   
Self-service terminals in China are developing rapidly, but many users still cannot enjoy their convenience. The main reason is the lack of effective interaction design methods in the interface design process. What kind of interaction design methods can satisfy public users' requirements to the maximum extent has been the main research question for self-service terminal designers and researchers. Recent self-service terminal research in interface interaction design mainly focuses on building user perception models, proposing corresponding interface design methods, and evaluating self-service terminal interfaces; this work is summarized in this paper to assist the interface interaction design and research of self-service terminals. The research trends of self-service terminal interface interaction design were also discussed.
Group Extension Analysis Network Process Decision Method Based on Interval Scales
Computer Science. 2012, 39 (6): 21-24. 
Abstract PDF(432KB) ( 608 )   
The Analytic Network Process (ANP) is an effective method for solving complex decision problems. However, the traditional ANP method has obvious limitations, namely how the relative importance of elements is expressed and how group decisions are handled. To solve these problems, this paper used interval scales in place of point estimates, adopted the GOWA operator to aggregate group preferences, and established a decision method named group extension ANP based on interval scales. Finally, an application example demonstrates the method.
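As a rough illustration of the aggregation step, the sketch below applies a GOWA (generalized ordered weighted averaging) operator endpoint-wise to interval judgments; the endpoint-wise treatment, the weights and the power parameter lam are illustrative assumptions, not the paper's settings.

```python
# Hypothetical sketch of GOWA aggregation over interval judgments.
def gowa(values, weights, lam=1.0):
    """Generalized OWA: reorder descending, then take a weighted power mean."""
    b = sorted(values, reverse=True)
    return sum(w * (x ** lam) for w, x in zip(weights, b)) ** (1.0 / lam)

def gowa_interval(intervals, weights, lam=1.0):
    """Aggregate interval scales endpoint-wise (an assumption of this sketch)."""
    lows = [lo for lo, hi in intervals]
    highs = [hi for lo, hi in intervals]
    return (gowa(lows, weights, lam), gowa(highs, weights, lam))

# Three experts judge the relative importance of an element pair on
# interval scales; position weights sum to 1.
experts = [(2.0, 3.0), (2.5, 4.0), (3.0, 5.0)]
weights = [0.4, 0.4, 0.2]
print(gowa_interval(experts, weights, lam=2.0))
```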
Target Localization Based on Double-level Grid Division in Wireless Sensor Networks
Computer Science. 2012, 39 (6): 25-29. 
Abstract PDF(515KB) ( 405 )   
Target localization and tracking have become an important fundamental technology in wireless sensor networks. Because of their special topology, grid-deployed wireless sensor networks have a great advantage in target localization. Based on grid-deployed networks, this paper proposed a double-level grid localization method that is both theoretically grounded and practical. Simulation results reveal the regular patterns between various influencing factors and localization error, which is meaningful for real applications of our method. The results from a real system prove that the double-level grid localization method not only maintains localization precision to a certain degree, but also offers good real-time performance and practicality.
Data Selection Strategy for Data-intensive Applications in Cloud
Computer Science. 2012, 39 (6): 30-34. 
Abstract PDF(533KB) ( 374 )   
Bag-of-tasks data-intensive applications in the cloud have appeared in many fields. Given decentralized data centers and the pay-on-demand model of resource usage, these applications now face new challenges in data selection. One of the problems is how to choose the appropriate data resources from multiple datasets that have the same content but different locations and access costs. First, the cloud environment and the data selection problem were modeled. Based on the model, the cost-minimized data selection process was abstracted as a weighted set covering problem, and a new data selection strategy was proposed to trade off execution efficiency against economic cost. Experimental results show that the strategy takes both cost optimization and execution efficiency into consideration and achieves good overall performance.
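To make the reduction concrete, here is a minimal greedy weighted set-cover sketch in a hypothetical data-selection framing (the replica names, costs and dataset ids are invented); the paper's actual strategy may differ.

```python
# Greedy weighted set cover: each replica covers a set of required datasets
# at some access cost; repeatedly pick the replica with the lowest cost per
# newly covered dataset until everything is covered.
def select_replicas(required, replicas):
    """replicas: {name: (cost, set_of_datasets)} -> list of chosen names."""
    uncovered, chosen = set(required), []
    while uncovered:
        name, (cost, covers) = min(
            ((n, rc) for n, rc in replicas.items() if rc[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered),
        )
        chosen.append(name)
        uncovered -= covers
    return chosen

replicas = {
    "dc-east": (3.0, {"d1", "d2"}),
    "dc-west": (4.0, {"d2", "d3", "d4"}),
    "dc-asia": (2.0, {"d4"}),
}
print(select_replicas({"d1", "d2", "d3", "d4"}, replicas))
```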
Dynamic Access Control Model Based on Situation Calculus
Computer Science. 2012, 39 (6): 35-39. 
Abstract PDF(442KB) ( 400 )   
An access control model defines the overall framework of a security system's access control. Existing access control models are mostly static authorization models. Although these models can be extended to achieve local dynamic behavior (such as temporarily activating roles by defining conditions on them), they are restricted by the extended elements in real applications, and most of them cannot describe the dynamic changes of the authorization process. To solve this problem, this paper proposed a dynamic access control model based on situation calculus (SCDAC). SCDAC describes the attributes and policies of access control with logic facts and rules, and treats the authorization state at an instant of time as a situation. It achieves changes of situation through actions, and characterizes the preconditions of actions and the successor state axioms. Finally, an example was adopted to illustrate that describing the dynamic changes of authorization states with SCDAC is feasible.
Improved Structure for Mesh Topology Based on NoC and its Routing Algorithm
Computer Science. 2012, 39 (6): 40-43. 
Abstract PDF(321KB) ( 531 )   
The mesh has become a widely used NoC topology due to its simplicity, regularity, and ease of implementation and expansion. This paper improved the 2D-Mesh structure and presented the VMesh structure, which connects every vertex to each other. A deadlock-free routing algorithm based on this structure was also proposed. Finally, we proved through detailed calculations that the VMesh structure decreases the network diameter and the ideal average communication delay and increases the ideal throughput. We also simulated the topology and the algorithm with the gpNoCsim simulator, and our results demonstrate a certain reduction in average packet delay and routing hops.
Encryption Algorithm for Compressed Image Based on Chaotic Maps
Computer Science. 2012, 39 (6): 44-46. 
Abstract PDF(258KB) ( 452 )   
We proposed an encryption scheme for compressed images. In the scheme, the image data is first encrypted in the spatial domain and then in the frequency domain. The proposed scheme not only achieves high security but also preserves the efficiency of the compression algorithm.
Novel Energy Dissipation Rate Model Based Clustering Routing Protocol
Computer Science. 2012, 39 (6): 47-50. 
Abstract PDF(329KB) ( 455 )   
Using an energy-efficient routing protocol, wireless sensor networks can save much energy, but the problems of rapid and uneven energy consumption remain. LEACH, the low-energy adaptive clustering hierarchy routing protocol, is one of the most classical routing protocols, but it ignores both the rate of energy dissipation and the distance between normal nodes and the sink node during cluster-head election. Focusing on these limitations, a novel clustering routing protocol was proposed. Experimental results show that the new protocol balances energy consumption, decreases energy dissipation, and extends the lifetime of the sensor network.
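A minimal sketch of the election idea, assuming the classic LEACH round threshold is simply scaled by a residual-energy ratio and a distance ratio; the paper's exact weighting is not given here, so both scaling factors are assumptions.

```python
import random

# LEACH-style cluster-head election whose threshold is scaled by residual
# energy and by distance to the sink (both scalings are assumptions).
P = 0.05  # desired cluster-head fraction

def threshold(r):
    """Classic LEACH threshold for round r (for nodes not yet CH this epoch)."""
    return P / (1 - P * (r % int(1 / P)))

def is_cluster_head(node, r, avg_dist):
    t = threshold(r)
    t *= node["energy"] / node["energy_init"]   # penalize drained nodes
    t *= avg_dist / node["dist_to_sink"]        # favor nodes nearer the sink
    return random.random() < t

node = {"energy": 1.2, "energy_init": 2.0, "dist_to_sink": 40.0}
print(is_cluster_head(node, r=3, avg_dist=50.0))
```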
Utility Allocation Strategy for Virtualized Resource Based on Cooperative Game
Computer Science. 2012, 39 (6): 51-53. 
Abstract PDF(319KB) ( 647 )   
Utility allocation is a key problem when grid virtualized resource providers form a coalition to complete grid tasks. Aimed at the situation where grid resource providers form coalitions to increase overall utility, cooperative game theory was applied to study grid resource allocation. The basis for forming a resource coalition was provided, and an optimal resource allocation was presented by a MIN-COST algorithm based on minimum cost. For utility allocation, we analyzed two approaches, average allocation and Shapley-value allocation of the coalition utility, and proposed an allocation strategy for coalition utility based on the Shapley value. The numerical results show that the grid resource coalition can improve the execution efficiency of tasks and the overall resource revenue, and that the Shapley value is feasible for balancing utility allocation among coalition members.
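For reference, the Shapley value can be computed directly by averaging marginal contributions over all player orderings; the sketch below does this for a toy two-provider coalition with an invented characteristic function.

```python
import math
from itertools import permutations

# Shapley value by enumeration: average each player's marginal contribution
# over every ordering (fine for small coalitions of resource providers).
def shapley(players, v):
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    n_fact = math.factorial(len(players))
    return {p: x / n_fact for p, x in phi.items()}

# Hypothetical utilities of provider coalitions (superadditive example).
utilities = {frozenset(): 0, frozenset("A"): 2, frozenset("B"): 3,
             frozenset("AB"): 7}
print(shapley(["A", "B"], lambda s: utilities[frozenset(s)]))
```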
Service Discovery Algorithm Using Semantic Technology in Grid Environment
Computer Science. 2012, 39 (6): 54-57. 
Abstract PDF(0KB) ( 251 )   
A grid is a large-scale resource sharing technology that increases resource utilization through distributed collaboration. Grid service discovery is the basis and prerequisite for using the grid, but the recall and precision of grid service discovery have not reached a satisfactory level. This paper proposed the concept of service behavior and used semantic technology to annotate it. On this basis, a grid service discovery algorithm based on semantically annotated service behavior was proposed. Experiments show that this algorithm is better than the traditional method: it has higher precision, higher recall, and better stability.
Research on Vulnerability Analysis Framework for Large-scale Distributed System
Computer Science. 2012, 39 (6): 58-60. 
Abstract PDF(254KB) ( 457 )   
As large-scale distributed systems play an increasingly important role in such fields as national security, critical infrastructure and social life, their vulnerability analysis has become a growing focus. Taking such systems as the vulnerability analysis object, a multi-layer model for large-scale distributed systems was put forward first, and then a multi-dimensional vulnerability analysis framework was proposed, which surveys the vulnerability analysis research area and its methods in three aspects: vulnerability analysis phase, lifecycle process, and taxonomy of vulnerabilities.
Prediction Method for Network Security Situation Based on Elman Neural Network
Computer Science. 2012, 39 (6): 61-63. 
Abstract PDF(326KB) ( 575 )   
Accurately grasping the security situation of a network system provides effective information and helps network managers make security decisions. Based on the assessment of the current network security situation, and exploiting its nonlinear, time-series nature, this paper presented a network security situation prediction method based on the Elman neural network, which uses the Elman network's dynamic memory and sensitivity to historical data to predict the network security situation. Finally, experiments show that the network security situation can be predicted effectively and accurately with this method.
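As a rough illustration, here is a minimal Elman (simple recurrent) forward pass for one-step-ahead prediction over a toy situation-value series; the layer sizes, the random weights and the absence of training are all assumptions of the sketch.

```python
import numpy as np

# Elman network forward pass: context units hold a copy of the previous
# hidden state, giving the network its dynamic memory. Training (e.g. BPTT)
# is omitted; weights here are random placeholders.
rng = np.random.default_rng(0)
n_in, n_hid = 1, 8
W_in = rng.normal(scale=0.5, size=(n_hid, n_in))    # input  -> hidden
W_ctx = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden
W_out = rng.normal(scale=0.5, size=(1, n_hid))      # hidden -> output

def predict_next(series):
    h = np.zeros(n_hid)                 # context units start at zero
    for x in series:
        h = np.tanh(W_in @ np.array([x]) + W_ctx @ h)  # context = prev hidden
    return (W_out @ h).item()           # predicted next value

history = [0.2, 0.4, 0.5, 0.55, 0.6]   # past security-situation scores
print(predict_next(history))
```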
HIBE Scheme Based on Composite Order Bilinear Groups
Computer Science. 2012, 39 (6): 64-67. 
Abstract PDF(283KB) ( 463 )   
Presently, most HIBE schemes are based on prime-order bilinear groups. The author constructed a HIBE scheme based on composite-order bilinear groups. The component parameters of the secret-key tuple are elements of a group of prime order, while those of the ciphertext tuple are products of elements in two groups of different prime orders, with the elements of one of these groups acting as blinding factors. The blinded ciphertext enhances the security of the new HIBE scheme, while the blinding factors have no effect on decryption. The new HIBE scheme is selective-ID secure in the standard model.
Study on Improved Algorithm Based on Concentric Circles Localization
Computer Science. 2012, 39 (6): 68-71. 
Abstract PDF(323KB) ( 740 )   
This paper presented a circular localization algorithm, derived from the principles of the concentric circles localization algorithm, after analyzing several common localization algorithms for wireless sensor nodes. The circular localization algorithm uses constructions made by the anchor nodes according to certain rules to continuously shrink the estimated area of the unknown node until the smallest region containing it is obtained; the centroid of this smallest area is then taken as the estimated coordinates of the unknown node. Comparative simulation experiments among the concentric circles localization algorithm, the circular localization algorithm and the improved schemes show that in a 20 m × 20 m simulation scenario with 1000 deployed sensor nodes and an anchor node density of 5%, the error of the concentric circles localization algorithm is 34.86% and that of the circular localization algorithm is 26.64%. The improved scheme uses multiple ring-partitioning methods to improve positioning accuracy. The experimental results show that at the same 5% anchor node density, the localization error of the improved algorithm is reduced to 15.76%.
Attacks and Improvements on a Strong-password Authentication Scheme
Computer Science. 2012, 39 (6): 72-76. 
Abstract PDF(388KB) ( 443 )   
Recently Yu Jiang et al. proposed a USB-Key based strong-password authentication scheme (USPA) and claimed that it resists DoS attacks, replay attacks, stolen-verifier attacks and server impersonation attacks. However, we found that USPA cannot achieve these goals. An improved scheme was proposed and analyzed. The analysis shows that our new scheme removes the defects of USPA, keeps its merit of high performance, and is suitable for mobile application scenarios where resources are constrained and security is a concern.
SOAP Message Security Transport Mechanism Based on SOA
Computer Science. 2012, 39 (6): 77-80. 
Abstract PDF(317KB) ( 463 )   
With the development of SOA technology and the popularization of SOA applications, the security issues of SOA-based Web services have become increasingly prominent. The security of SOAP messages is of great importance to Web service security. Currently, SOAP message transport mainly depends on the WS-Security standards. However, the WS-Security standards have some drawbacks: SOAP messages in transit can be compromised by XML injection and other Web attacks. Therefore, this paper designed a new SOAP message security transport mechanism that adds a SOAP validation node to the existing WS-standards-based Web services security transport framework. Finally, experiments demonstrate that this security transport mechanism can indeed detect some XML attacks and improve the security of SOAP messages.
Distributed Constraint Satisfaction of an Improved Channel Assignment Approach in Cellular Network
Computer Science. 2012, 39 (6): 81. 
Abstract PDF(316KB) ( 383 )   
As demand for wireless communication systems grows and channel resources remain limited, the channel assignment problem (CAP) becomes increasingly important. The goal of channel assignment is to reduce interference and enhance capacity. CAP is a well-known NP-hard problem. In this paper, we modeled CAP as a distributed constraint satisfaction problem (DCSP), fully considering all interference constraints, and then proposed an improved channel assignment approach to minimize the required number of channels in a cellular mobile system and minimize the number of blocked hosts. We also provided a complete search algorithm, which outperforms others by providing quasi-optimal solutions at relatively lower cost and time. We evaluated the performance of our approach in solving CAP on existing benchmarks. The simulation results show that our approach provides optimal solutions and minimizes call failures, and is feasible and easy to apply in practical engineering.
DLD-MAC:A Diffserv-based Low-delay MAC Protocol for WSNs
Computer Science. 2012, 39 (6): 84-88. 
Abstract PDF(405KB) ( 357 )   
Considering that most existing MAC protocols in wireless sensor networks do not support any priority scheme, this paper proposed a novel Diffserv-based low-delay MAC mechanism (DLD-MAC) on the basis of the existing DW-MAC protocol. The main idea is that, by introducing differentiated services, high-priority data chooses a smaller back-off window, so it gets a better chance of being delivered, and its packet transmission latency and energy consumption are reduced significantly compared with low-priority data. A Markov chain model was also designed in this work to analyze and evaluate the mechanism's performance. Our analyses show that the proposed DLD-MAC effectively gives high-priority data higher QoS than low-priority data, and achieves much lower transmission latency than traditional MAC protocols, which makes it suitable for delay-sensitive data flows.
Research on Clustering Algorithm of Wireless Sensor Networks Based on Predictable Mobile Sink
Computer Science. 2012, 39 (6): 89-92. 
Abstract PDF(316KB) ( 355 )   
A clustering algorithm for wireless sensor networks was presented under the condition of predictable sink node mobility. The proposed algorithm introduces sub-sink nodes into the HEED clustering algorithm to achieve faster sensing of mobile track changes and quicker formation of clustered topologies, and uses sink node registration to exchange information while the sink node moves. Case analysis shows that the algorithm can quickly form a rational network topology and prolong the lifetime of wireless sensor networks.
Kripke Structure Generating with Control Flow Information
Computer Science. 2012, 39 (6): 93-97. 
Abstract PDF(408KB) ( 498 )   
Malware detection is an important part of information security technology. Detection based on program behavior characteristics can remedy the limits of binary signature detection. Model checking technology can verify a program's specific behavior properties, which requires modeling the target program to obtain a transition system consistent with a Kripke structure. Current model checking technology and Kripke structures were thoroughly analyzed, and then a method of generating Kripke structures was proposed, based on full control flow information and a greedy strategy. The generated transition system can fully represent the control flow information and describe the changes of the target system's status.
Platform Resource Scheduling Method Based on DLS and ACO
Computer Science. 2012, 39 (6): 98-103. 
Abstract PDF(467KB) ( 488 )   
The platform resource scheduling method is an important part of operational mission planning and provides the operational resource allocation scheme for a campaign. Operational tasks, platforms and the relationship between them were described, and a mathematical model was set up for platform resource scheduling. The objectives are to minimize the mission finish time and maximize the platform resource utilization rate. An algorithm composed of dynamic list scheduling (DLS) and ant colony optimization (ACO) was designed to solve this model. The task selection method, the binary coding scheme, and the candidate solution formation strategy were described. A repair strategy for infeasible candidate solutions and a pheromone updating method were designed. The fitness function was designed with three factors: the time priority coefficient, the platform function capability priority coefficient, and the requirement degree of follow-up tasks. Simulation results based on an operational scenario indicate that the platform resource scheduling method based on DLS and ACO behaves well: compared with other algorithms, the proposed algorithm yields a shorter mission finish time and a higher platform resource utilization rate.
Methods for Evaluating Reliability of Mobile Ad hoc Networks in the Perception Layer of IOT
Computer Science. 2012, 39 (6): 104-106. 
Abstract PDF(325KB) ( 460 )   
The reliability of mobile ad hoc networks is a key factor affecting data acquisition and device control in IoT applications. To address the uncertainty in reliability analysis and quantitative assessment, a method for computing the reliability of ad hoc networks based on node mobility was presented, taking into account both the dynamic connections and the network component failures brought about by node mobility. When the speed and direction of two mobile nodes are stable, the duration for which a valid link between them is maintained is forecast; when the speed and direction of a mobile node change over some period, the validity of the link is evaluated. The reliability of the ad hoc network is thereby computed indirectly. The experimental results show that the reliability of an ad hoc network depends not only on the reliability of nodes and links but also on the redundancy of the topology and the distribution of nodes.
Semantic Retrieval Based on Shallow Semantic Analysis Technology
Computer Science. 2012, 39 (6): 107-110. 
Abstract PDF(433KB) ( 507 )   
In the field of information retrieval, semantic retrieval outperforms traditional keyword retrieval in many respects, including effectiveness and user experience. Semantic retrieval, which has become an important technique in specific domains, integrates several methods such as information retrieval, semantic parsing and information integration. The semantic retrieval method presented here, based on Lucene, analyzes sentences semantically and subsequently acquires a formal representation describing each sentence's simple semantic content. Afterwards, an index of this formal representation is established. Since multi-level similarities representing semantic relationships are merged by information fusion technology and mapped to similarities between query sentences and index data, the goal of semantic retrieval is achieved.
Research on Influence Maximization Problem Based on Dynamic Networks
Computer Science. 2012, 39 (6): 111-115. 
Abstract PDF(428KB) ( 440 )   
Influence spread is one of the key dynamic-process problems on complex networks, and there are few research results on the influence maximization problem in dynamic networks. We discussed the dynamic independent cascade model and the dynamic linear threshold model, and proposed a dynamic influence maximization problem based on these two models. Then we presented an improved greedy algorithm, which eliminates the uncertainty of the stochastic models and improves performance by using a connected-graph approach. The algorithm was validated on four datasets of different sizes: AS, EMAIL, DELICIOUS and DBLP. The results show that the influence spread achieved by our algorithm has an obvious advantage and that its time efficiency is better than that of the compared algorithm.
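For orientation, the sketch below shows the standard greedy seed selection under the (static) independent cascade model with Monte Carlo spread estimation; the paper's dynamic models and connected-graph speedups are not reproduced, and the graph and parameters are invented.

```python
import random

# Greedy influence maximization under the independent cascade (IC) model:
# estimate the spread of a seed set by simulation, then add the node with
# the largest marginal gain, k times.
def ic_spread(graph, seeds, p=0.1, runs=200):
    total = 0
    for _ in range(runs):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            u = frontier.pop()
            for v in graph.get(u, ()):
                if v not in active and random.random() < p:
                    active.add(v)
                    frontier.append(v)
        total += len(active)
    return total / runs

def greedy_seeds(graph, k):
    seeds = set()
    for _ in range(k):
        best = max((n for n in graph if n not in seeds),
                   key=lambda n: ic_spread(graph, seeds | {n}))
        seeds.add(best)
    return seeds

g = {1: [2, 3], 2: [4], 3: [4, 5], 4: [6], 5: [6], 6: []}
print(greedy_seeds(g, k=2))
```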
Study on the Formalization of Model Evolution with Model Driven Architecture
Computer Science. 2012, 39 (6): 116-119. 
Abstract PDF(406KB) ( 358 )   
Model evolution involves a series of complex change activities. Based on model-driven architecture, the relevant model evolution concepts were formally defined to describe models and model changes accurately. Model changes can be classified into primitive operations and composite operations. The primitive operations are addition, deletion and modification, and can be applied to a model element individually. Merge and union operations are composite operations built from the primitive ones.
Research of Web Services Composition Transaction Coordination Framework Based on BPEL and WS-TX
Computer Science. 2012, 39 (6): 120-124. 
Abstract PDF(425KB) ( 369 )   
Because composite service processes must be manually defined when using existing Web service transaction coordination frameworks, which lack process definition support, a transaction coordination framework that automatically extracts process and transaction semantic information was proposed. It coordinates the composite service by using the transaction coordinator defined by Web Services Transaction (WS-TX) and extracting coordination information from an extended Business Process Execution Language (BPEL). The proposed framework combines the advantages of WS-TX in transaction coordination and of BPEL in business process design to separate transaction logic from business logic effectively. Finally, the effectiveness of the framework was validated through a case study.
Protocol Based Real-time Component Behavior Consistency Verification
Computer Science. 2012, 39 (6): 125-128. 
Abstract PDF(389KB) ( 354 )   
Formal specification and consistency verification of the behavior of complex real-time component systems can effectively improve their reusability, correctness and reliability. This paper analyzed the timed behavior protocol and the other mainstream formal methods used in academia and industry for specifying real-time behavior. Based on this analysis, we gave a substitution theory and a consistency verification algorithm based on the timed behavior protocol, which can support the development of complex real-time component-based systems.
Detecting Behavioral Mismatch of Web Services Based on Bounded Model Checking
Computer Science. 2012, 39 (6): 129-132. 
Abstract PDF(325KB) ( 325 )   
Due to inconsistencies in interface types or interaction protocols, services cannot always be combined correctly. Web service mismatch detection can accurately locate the mismatch points, which lays a foundation for realizing correct interaction and avoiding invalid compositions. This paper presented a method for detecting Web service mismatches based on Satisfiability Modulo Theories (SMT). The problem of detecting Web service mismatches can be transformed into an existence model checking problem of whether a deadlock is reachable in the interaction of the services, and the existence model checking problem can in turn be transformed into the satisfiability of a logic formula. Finally, an example was given to explain the process of Web service mismatch detection.
Research on the Compatibility Testing of the Domestic Foundational Software Application Platform
Computer Science. 2012, 39 (6): 133-137. 
Abstract PDF(420KB) ( 420 )   
In recent years, with the continuous advance of domestic foundational software, operating systems, databases, service middleware, office software and other domestic foundational software have been widely used in many fields. However, it is hard to test the compatibility of domestic foundational software application platforms due to the complexity of the combination relations among domestic foundational software and the uncertainty about what causes platform compatibility problems. The paper presented a dependency-based compatibility testing method for domestic foundational software application platforms, DRBA (dependency relationship based approach), and its policies. The method was defined and encoded following the execution tree policy, and a DRBA-based compatibility testing tool was implemented in a C/S architecture to verify the method's availability and effectiveness.
Personalized Active Service Model Based on Ontology and Multi-Agents
Computer Science. 2012, 39 (6): 138-142. 
Abstract PDF(407KB) ( 346 )   
In order to break through the performance bottlenecks of traditional active service models and offer better service that satisfies users' personalized requests, an ontological user personalization model was first established based on domain ontology, and a personalized active service model based on ontology and multi-agents was proposed. Afterwards, we analyzed the running process of the model and described the function of each part in detail. Based on analysis of the key technologies of the model, we elaborated the collaborative operating mechanism of Push and Pull. Finally, experimental results from a prototype system based on our model indicate that service performance can be improved effectively.
Line Segment Closest Pair Queries Based on Voronoi Diagram
Computer Science. 2012, 39 (6): 143-146. 
Abstract PDF(300KB) ( 532 )   
The closest pair (CP) query is one of the important spatial queries in spatial databases, but existing research on CP queries mainly focuses on point objects and can rarely handle cases where a spatial object cannot be abstracted as a point. The problem of the closest pair query between line segments was first put forward: finding the pair with the shortest distance among all pairs formed by two sets of line segments. A line segment closest pair query algorithm based on the Voronoi diagram was proposed, and the relevant theorems and proofs were given. In the method, the Voronoi diagram of each of the two line segment sets is constructed. By exploiting the nearest-neighbor and local dynamic properties, the line segment pairs that are mutual nearest neighbors are found, from which the result is obtained, greatly reducing the computational cost. The two cases of inserting and deleting line segments were also handled. Experiments demonstrate that the proposed algorithms have high query efficiency.
ESSK: A New Approach to Compute Clickstream Similarity
Computer Science. 2012, 39 (6): 147-150. 
Abstract PDF(301KB) ( 532 )   
Clickstreams are widely used in Web usage mining, and clickstream similarity is usually used to classify or cluster Web user sessions. SSK (string subsequence kernel) was originally an approach for computing string similarity; it was later introduced to compute clickstream similarity and has become one of the most popular methods. It selects all subsequences of length k of two strings to generate the feature space. A single value of k may leave too few features to obtain an accurate clickstream similarity. Therefore, a new approach to computing clickstream similarity, ESSK (extended string subsequence kernel), was proposed. ESSK generates the feature space from subsequences of all lengths to solve this problem of SSK. To reduce the computational complexity, an effective algorithm for computing ESSK was proposed. An experiment indicates that ESSK is more accurate than SSK and discriminates better than other approaches, so it is more suitable for computing clickstream similarity.
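As a rough sketch of the "all subsequences" idea, the code below counts the subsequences two clickstreams share at every length with a classic dynamic program and normalizes the count into a similarity; ESSK's exact feature weighting (for example, decay factors) is not reproduced.

```python
# Count the subsequences two clickstreams share at *all* lengths (the spirit
# of ESSK, without SSK's fixed length k), then normalize to [0, 1].
def common_subsequences(s, t):
    """Number of common subsequences of s and t, the empty one included."""
    n, m = len(s), len(t)
    N = [[1] * (m + 1) for _ in range(n + 1)]  # N[i][j]: prefixes s[:i], t[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if s[i - 1] == t[j - 1]:
                N[i][j] = N[i - 1][j] + N[i][j - 1]
            else:
                N[i][j] = N[i - 1][j] + N[i][j - 1] - N[i - 1][j - 1]
    return N[n][m]

def similarity(s, t):
    k = common_subsequences
    return k(s, t) / (k(s, s) * k(t, t)) ** 0.5

# Sessions as sequences of page ids.
print(similarity(["home", "list", "item", "cart"],
                 ["home", "item", "cart", "pay"]))
```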
Algorithm of Improved Top-k Query on Uncertain Data for Requirement Extension
Computer Science. 2012, 39 (6): 151-154. 
Abstract PDF(443KB) ( 357 )   
The semantics of existing top-k queries on uncertain data return only one answer, the one with the largest aggregated probability over all possible worlds, and therefore cannot accommodate users' individual requirements. To address this issue, the concept of requirement extension degree was introduced and a top-k query semantics on uncertain data for requirement extension was defined. Furthermore, an algorithm named RU-Topk was presented to process this semantics. Experimental results show the superiority of the RU-Topk algorithm over existing algorithms in run time per query answer and in query efficiency, while meeting users' requirements.
GC-BES:A Novel Graph Classification Approach Based on Embedding Sets
Computer Science. 2012, 39 (6): 155-158. 
Abstract PDF(321KB) ( 389 )   
Many graph classification approaches have been proposed. These approaches only look at the structural information of a pattern, and do not take advantage of the embedding information available during frequent subgraph mining. In fact, in some efficient subgraph mining algorithms, the embedding information of a pattern can be maintained. A graph classification approach was presented that, based on L-CLAM coding, uses a label-information-based feature subgraph selection strategy to select feature subgraphs, while making full use of embedding sets to generate feature subgraphs directly during frequent subgraph mining. Experimental results show that it is effective and feasible.
Opinion Topic Mining and Orientation Identification
Computer Science. 2012, 39 (6): 159-162. 
Abstract PDF(349KB) ( 460 )   
The paper focused on how to mine opinion topics from online subjective text sets and how to identify the orientation of these opinion topics. It combined heuristic rules and co-occurrence probability to identify domain noun phrases, and adopted Latent Dirichlet Allocation (LDA) to obtain local opinion topics. It then computed sentence orientation with a multi-feature fusion method, and obtained the opinion topic orientation by counting the orientations of the sentences belonging to the specified topic. Experimental results show that the method can identify topics and their orientations effectively.
Design of Emergency Preplan Expression and Optimization Method Based on Case-based Reasoning
Computer Science. 2012, 39 (6): 163-165. 
Abstract PDF(350KB) ( 546 )   
The structured technology of frame expression can express static preplans. The structure of the preplan library and its index were designed using the relational model. At the same time, the case-based reasoning method and the nearest neighbor method were combined, and the similarity between emergent events and preplans was calculated, so that optimal similarity search was achieved.
Mining Method of Key Time Interval Based on Maximum Clique
Computer Science. 2012, 39 (6): 166-169. 
Abstract PDF(329KB) ( 322 )   
To efficiently process masses of transactions with temporal attributes, the key time interval (KTI), the minimum time interval reflecting the correlation of items, was introduced. A maximum-clique-based KTI mining algorithm was proposed, which reduces the complexity of information mining and decision making. Assuming the items of the target transaction are uniformly distributed, we stated an approach to find the KTI of the transaction and analyzed the correctness and complexity of the method. Experiments show that by considering the KTI of a clique, the candidate set affecting decision accuracy is reduced and resource consumption is significantly lowered. Finally, we evaluated the feasibility of the proposed method in real-world use.
Multi-granularity Temporal and Spatial Approximate Aggregate Query on RFID Data Warehouse
Computer Science. 2012, 39 (6): 170-174. 
Abstract PDF(416KB) ( 324 )   
With the development of the Internet of Things (IoT), research on the storage, querying and processing of sensor data such as RFID is becoming more and more popular. In this paper, drawing on the temporal and spatial dimensions of data warehouses and on column-style storage, we built a column-style warehouse for RFID data. Moreover, according to the temporal and spatial characteristics of RFID data, we designed a data structure for continuous aggregate queries at multiple temporal and spatial granularities, together with a fast updating algorithm, which removes part of the redundant operations in traditional aggregate queries. The structure and algorithm can be applied to processing large-scale continuous aggregate queries in an RFID data warehouse. The experimental results show that the model and algorithm achieve high efficiency in some typical IoT applications and can be widely applied to massive-data OLAP analysis in RFID data warehouses.
Data Analysis on National College Entrance Examination and Admission Using OLAP and Data Mining
Computer Science. 2012, 39 (6): 175-178. 
Abstract PDF(477KB) ( 398 )   
How to find useful information in the massive data of the national college entrance examination and admission (NCEEA) is one of the key issues for provincial and municipal admission offices (PCAO), and it also attracts the attention of parents, examinees and all sectors of society. Around this issue, a data warehouse and a data cube were built from the admission data accumulated over the years in one province, and some potentially useful information was found through OLAP analysis and data mining. The research shows that this information can provide decision-making support for the PCAO and serve as an important basis for guiding examinees to fill in their preferences reasonably. This article focused on the process of building the data warehouse and cube, the OLAP analysis of admission-related data and the interpretation of its results, and the data mining process using decision tree and association rule algorithms.
Applying Temporal Features of Social Tags to Tag Prediction
Computer Science. 2012, 39 (6): 179-183. 
Abstract PDF(425KB) ( 355 )   
A tag is a kind of user-generated description of a Web resource; it represents the semantics of the resource and the interests of the user. Because Web resources are dynamic, tags show temporal features, yet little research has concentrated on them. The temporal features exhibited by a tag dataset were analyzed in this paper, and the semantic relations between tags based on temporal features were discussed. A principle of time segmentation for discovering temporal features was proposed, and the effect of tag temporality on topics was analyzed with a statistical topic model. The discovered temporal features were then used in tag prediction. Experiments on different datasets show that applying the temporal features of tags to tag prediction can improve prediction performance.
Adaptive Greedy GA Algorithm for TSP
Computer Science. 2012, 39 (6): 184-187. 
Abstract PDF(330KB) ( 342 )   
TSP is a typical combinatorial optimization problem, and many real-life problems can be reduced to it; GA is a typical optimization algorithm. After analyzing the key points of GA, an adaptive greedy GA was proposed to solve TSP. Definitions and theorems on the adaptive fitness function ensure the correctness of the algorithm. The algorithm does not prematurely fall into local optima, thanks to the average replication method used for selection, and it converges efficiently thanks to a bidirectional-ring greedy insertion operator, based on the Hamiltonian two-way loop, used for crossover. Finally, the computation and analysis of an example, and comparison with the traditional GA, show that the proposed GA performs better on TSP.
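To illustrate the greedy-insertion idea in isolation, here is a plain cheapest-insertion tour builder; the paper embeds this kind of operator inside a GA crossover, and the coordinates below are invented.

```python
import math

# Cheapest insertion: grow a closed tour by always inserting the next city
# where it lengthens the ring the least.
def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_insertion_tour(cities):
    tour = [0, 1]                       # start with an arbitrary 2-city ring
    for c in range(2, len(cities)):
        best_pos, best_cost = 0, float("inf")
        for i in range(len(tour)):
            a, b = tour[i], tour[(i + 1) % len(tour)]
            delta = (dist(cities[a], cities[c]) + dist(cities[c], cities[b])
                     - dist(cities[a], cities[b]))
            if delta < best_cost:
                best_pos, best_cost = i + 1, delta
        tour.insert(best_pos, c)
    return tour

cities = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 3)]
print(greedy_insertion_tour(cities))
```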
Balanced Fuzzy Support Vector Machine Based on Imbalanced Data Set
Computer Science. 2012, 39 (6): 188-190. 
Abstract PDF(293KB) ( 391 )   
For the classification of imbalanced datasets with large class-imbalance ratios, a balanced fuzzy support vector machine (BFSVM) was proposed, which makes use of an imbalance adjustment factor and fuzzy memberships based on the features of the sample points. First, it computes the sample covariance matrix and obtains the imbalance adjustment factor; then it computes the fuzzy membership of every sample and obtains each sample's contribution rate. The fuzzy membership and the imbalance adjustment jointly affect the sample error of the classifier. The experimental results show that the algorithm works well at large imbalance ratios.
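A minimal sketch of the weighting idea, assuming per-sample fuzzy memberships can be emulated through sklearn's sample_weight, with a simple distance-to-class-center membership standing in for the paper's covariance-based construction; the data are synthetic.

```python
import numpy as np
from sklearn.svm import SVC

# Emulate a balanced fuzzy SVM: fuzzy membership (closer to the class
# center => higher weight) times a per-class imbalance adjustment factor.
rng = np.random.default_rng(1)
X_maj = rng.normal(loc=0.0, size=(95, 2))
X_min = rng.normal(loc=2.5, size=(5, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 95 + [1] * 5)

def fuzzy_memberships(X, y):
    w = np.empty(len(y))
    for cls in np.unique(y):
        pts = X[y == cls]
        d = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
        w[y == cls] = 1.0 - d / (d.max() + 1e-9)
    return w

imbalance = len(y) / (2.0 * np.bincount(y))      # up-weights the minority class
weights = fuzzy_memberships(X, y) * imbalance[y]

clf = SVC(kernel="rbf").fit(X, y, sample_weight=weights)
print(clf.predict([[2.4, 2.6], [0.1, -0.2]]))
```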
Research on Properties of Approximations in Rough Sets Based on Characteristic Relation When Object Varies with Time
Computer Science. 2012, 39 (6): 191-193. 
Abstract PDF(294KB) ( 337 )   
The information system based on the characteristic relation is an extension of the general information system model. The relation satisfies only reflexivity, and it can handle both "lost" and "do not care" values in incomplete information systems simultaneously. In real-life applications, the information system may change dynamically as objects vary. This paper discussed the properties of approximations in rough sets based on the characteristic relation when an object is added to or deleted from the information system. An incremental algorithm for updating approximations was proposed, and an experimental evaluation was conducted to validate the proposed method.
Noun Sense Disambiguation Based on Semantic Density
Computer Science. 2012, 39 (6): 194-197. 
Abstract PDF(312KB) ( 418 )   
This paper proposed a novel approach to noun sense disambiguation based on concept correlation. Unlike existing algorithms, we extended the notion of semantic distance on WordNet by defining a semantic density for a group of word senses, thereby quantifying the correlation among them; nouns are disambiguated by converting this correlation into semantic density. Besides, we proposed an LSH-like semantic hashing on WordNet, which greatly reduces the time complexity of computing semantic density and of the whole disambiguation algorithm. Experiments and an evaluation of the approach on SemCor were carried out.
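A rough sketch of density-based noun WSD over WordNet via NLTK, taking the density of a sense group to be its mean pairwise path similarity; both that choice of density and the two-senses-per-context-word cutoff are assumptions, and the paper's LSH-style hashing is not reproduced. Requires the NLTK WordNet corpus (nltk.download('wordnet')).

```python
from itertools import product
from nltk.corpus import wordnet as wn

def density(senses):
    """Mean pairwise path similarity of a sense group (assumed density)."""
    pairs = [(a, b) for a, b in product(senses, senses) if a != b]
    sims = [a.path_similarity(b) or 0.0 for a, b in pairs]
    return sum(sims) / len(sims) if sims else 0.0

def disambiguate(target, context_nouns):
    """Pick the target sense whose group with the context is densest."""
    context = [s for w in context_nouns for s in wn.synsets(w, pos=wn.NOUN)[:2]]
    best, best_d = None, -1.0
    for sense in wn.synsets(target, pos=wn.NOUN):
        d = density([sense] + context)
        if d > best_d:
            best, best_d = sense, d
    return best

print(disambiguate("bank", ["money", "loan", "deposit"]))
```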
Random PSO Algorithm Based on Cultural Framework
Computer Science. 2012, 39 (6): 198-200. 
Abstract PDF(249KB) ( 368 )   
By bringing the standard PSO and the random particle swarm optimization (rPSO) proposed in this paper into the framework of the cultural algorithm (CA), a novel optimization method named random particle swarm optimization based on the cultural algorithm (CA-rPSO) was established. In CA-rPSO, the belief space and the population space evolve with rPSO and PSO respectively, forming an independent, parallel "dual evolution, dual promotion" mechanism. Five test functions were selected for simulation and analysis of CA-rPSO. The results show that the optimization performance of CA-rPSO is obviously improved, and the algorithm is simple and easy to implement.
Algorithm for Finding the Critical Paths Based on Petri Net
Computer Science. 2012, 39 (6): 201-203. 
Abstract PDF(340KB) ( 634 )   
Using an extended timed Petri net, the directed network of a project plan is converted into its Petri net model. Potential defects are first analyzed and corrected with Petri net methods. The critical path of the net is then obtained automatically after running the network with pruning optimization. This algorithm is more effective than existing methods and is easy to implement.
Optimal Model of Portfolio Selection Based on VaR and CVaR under Uncertain Environment
Computer Science. 2012, 39 (6): 204-206. 
Abstract PDF(203KB) ( 644 )   
This paper discussed the portfolio selection problem in an uncertain environment, using an uncertain measure to define VaR and CVaR, which serve as the risk measures. An optimal portfolio selection model based on VaR and CVaR was established, and a hybrid intelligent algorithm integrating a genetic algorithm with the 99-method was designed to solve the model. Finally, a numerical example was given to illustrate the feasibility and effectiveness of the model and the algorithm.
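For concreteness, the sketch below computes sample-based VaR and CVaR for a toy portfolio; the paper's uncertain-measure definitions and its GA/99-method solver are not reproduced, and the simulated returns and weights are invented.

```python
import numpy as np

# Sample-based VaR and CVaR of portfolio losses at confidence level alpha.
def var_cvar(losses, alpha=0.95):
    losses = np.sort(losses)
    var = np.quantile(losses, alpha)        # loss threshold at level alpha
    cvar = losses[losses >= var].mean()     # expected loss beyond VaR
    return var, cvar

rng = np.random.default_rng(7)
returns = rng.normal(0.001, 0.02, size=(10_000, 3))  # simulated asset returns
weights = np.array([0.5, 0.3, 0.2])
losses = -(returns @ weights)                         # portfolio losses
print(var_cvar(losses, alpha=0.95))
```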
Adaptive Central Force Optimization Algorithm
Computer Science. 2012, 39 (6): 207-209. 
Abstract PDF(209KB) ( 406 )   
An adaptive central force optimization (ACFO) algorithm was proposed for global optimization problems, in order to balance global exploration and local search. A particle fitness function was defined; each particle's movement time is updated according to how its fitness compares with the average fitness, and its current position is updated by a crossover operation, which improves the algorithm's convergence speed. Eight classic benchmark functions were used for testing. Simulation results show that ACFO is accurate and strongly robust compared with several particle swarm optimization algorithms and CFO variants.
Research of Text Categorization Based on Improved Maximum Entropy Algorithm
Computer Science. 2012, 39 (6): 210-212. 
Abstract PDF(227KB) ( 397 )   
This paper discussed the problem of text categorization accuracy. In traditional text classification algorithms, different feature words have the same effect on the classification result, classification accuracy is low, and the time complexity of the algorithm increases. Because the maximum entropy model can integrate the various relevant or irrelevant probabilistic knowledge observed, it achieves good results on many problems. To solve the above problems, this paper proposed an improved maximum entropy text classification method that fully combines the advantages of the c-means and maximum entropy algorithms. The algorithm first takes Shannon entropy as the objective function of the maximum entropy model, simplifying the classifier's expression, and then uses the c-means algorithm to select the optimal features. The simulation results show that, compared with traditional text classification, the proposed method quickly obtains the optimal feature subset for classification and greatly improves text classification accuracy.
Research on Mechanism of Automatic Construction of Ontologies for Web Information Resources
Computer Science. 2012, 39 (6): 213-216. 
Abstract PDF(351KB) ( 529 )   
Building domain ontologies is an important part of ontology research and application. This paper proposed a process and algorithms for automatically constructing ontologies from Web information resources using data mining and machine learning technologies. Within the construction procedure, the paper first presented a process and algorithm for concept classification with multiple classifiers based on the Bayes classification principle, then discussed concept-association analysis and concept self-learning algorithms for building the ontology prototype, and then presented a mechanism for transforming the ontology prototype into an OWL ontology. The paper also proposed a whole-system solution spanning Web page mining and domain ontology construction through to ontology application.
Research on MI Migrating Strategy in Migrating Workflow
Computer Science. 2012, 39 (6): 217-221. 
Abstract PDF(448KB) ( 345 )   
Migrating workflow is a new technology that applies mobile agents to workflow management. The migrating instance (MI), the work place and the workflow engine are the three main elements of a migrating workflow system. Among them, the MI is a business process execution agent built on the mobile agent computing paradigm; it can move between work places and, following its own workflow instructions, use local services to carry out one or more tasks. The migration strategy of the MI is one of the most important issues in a migrating workflow system. This paper presented a hierarchically structured migrating workflow service model to address how an MI finds services, and also proposed a way of finding a suitable destination host that considers static and dynamic factors such as host hardware availability and host resources. Finally, we described experiments and discussed and analyzed the results. The experiments show that the hierarchical organization of workflow services and the destination host assessment method can effectively improve the efficiency of a migrating workflow system.
Design of Embedded Remote Monitoring System Based on ZigBee and GPRS
Computer Science. 2012, 39 (6): 222-225. 
Abstract PDF(447KB) ( 348 )   
In order to realize remote monitoring over wireless sensors, a design scheme combining a ZigBee network and a GPRS network was proposed, exploiting the respective advantages and features of ZigBee and GPRS. Data collection, aggregation, transfer and remote monitoring are accomplished through cooperation between the wireless sensor network built with CC2530 chips and the widely available mobile communication network. Experimental results show that the scheme combines the advantages of each network and strengthens the real-time monitoring and processing abilities of remote nodes in the network.
Expressive Temporal Planning Algorithm under Dynamic Constraint Satisfaction Framework
Computer Science. 2012, 39 (6): 226-230. 
Abstract PDF(411KB) ( 487 )   
AI planning has been a key research topic in the artificial intelligence (AI) community. In recent years, AI planning technology has been applied to many real-world problems, which challenges AI planners in both expressive power and efficiency. This paper studied an expressive temporal planning paradigm, constraint-based interval (CBI) planning. Within the dynamic constraint satisfaction problem framework, the author designed a new CBI algorithm named LP-TPOP. The paper gave proofs of soundness and completeness for LP-TPOP and demonstrated the algorithm on a CBI planning example.
Research on Particle Filter with Adaptive Resampling Based on Diversity Measure
Computer Science. 2012, 39 (6): 231-234. 
Abstract PDF(310KB) ( 499 )   
Owing to its advantages in nonlinear, non-Gaussian systems and multi-modal processing, the particle filter (PF) has been widely applied in many fields in recent years. After analyzing the deficiencies of existing algorithms, a particle filter with adaptive resampling based on diversity measures was presented. First, it adaptively tunes the resampling threshold by diversity guidance: in addition to the adaptive resampling technique based on effective sample size, another diversity measure, the population factor, is used to adjust the resampling threshold. Moreover, particle mutation after resampling is integrated into the PF to preserve the diversity of the particle set. Then an improved partial stratified resampling (PSR) scheme was proposed, which draws on PSR's advantage in implementation speed and time and combines it with a weight-optimization idea to improve PF performance. Simulation experiments verified the validity of the proposed method.
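A minimal sketch of the effective-sample-size trigger that this line of work builds on, using a 1-D random-walk state, a Gaussian likelihood and an N/2 threshold as stand-ins; the paper's population-factor measure, mutation step and improved PSR are not reproduced.

```python
import numpy as np

# Particle filter step with ESS-triggered (adaptive) multinomial resampling.
rng = np.random.default_rng(3)
N = 500
particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)

def step(particles, weights, observation, sigma_proc=0.3, sigma_obs=0.5):
    particles = particles + rng.normal(0.0, sigma_proc, N)   # propagate
    weights = weights * np.exp(-0.5 * ((observation - particles) / sigma_obs) ** 2)
    weights /= weights.sum()
    ess = 1.0 / np.sum(weights ** 2)                         # effective size
    if ess < N / 2:                                          # adaptive trigger
        idx = rng.choice(N, size=N, p=weights)               # resample
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

for z in [0.1, 0.3, 0.2, 0.5]:
    particles, weights = step(particles, weights, z)
print(float(np.sum(particles * weights)))   # posterior mean estimate
```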
Analysis of Preference Aggregation Based on Judgment Aggregation Logic
Computer Science. 2012, 39 (6): 235-239. 
Abstract PDF(497KB) ( 389 )   
How to aggregate individual rational preferences into a collective rational preference is the main problem faced by the cognitive study of social rationality. Research on judgment aggregation brings new ideas for analyzing preference aggregation. The close connection between judgment aggregation and preference aggregation was investigated from the logic perspective. After an analysis of the judgment aggregation logic JAL, it was proved that the modal logic JAL(LK) of preference aggregation can be constructed on the basis of JAL in the language of first-order logic; thus preference aggregation questions can be converted into judgment aggregation questions, which illustrates the generality of the judgment aggregation model.
Research of Semantic Virtual Environment Ontology Visualization Model
Computer Science. 2012, 39 (6): 240-243. 
Abstract PDF(420KB) ( 345 )   
Currently, information visualization faces a bottleneck: a model is needed to integrate scene graphics content with the semantic information of specific domains effectively, so that users can interpret and personalize the visualized information. Based on the X3D standard and ontology, the Web Ontology Language (OWL) was used to describe the X3D standard; an X3D standard ontology was then built, together with a mapping ontology that implements the mapping of classes and attributes between the X3D standard ontology and specific domain ontologies, in order to enrich the semantics of virtual scenes. On this basis, a semantic virtual environment ontology visualization model was designed to provide a method for visualizing specific ontologies. The feasibility of the model was demonstrated by experimental results.
Speech Endpoints Detection Algorithm Based on Support Vector Machine and Wavelet Analysis
Computer Science. 2012, 39 (6): 244-246. 
Abstract PDF(333KB) ( 333 )   
In order to improve the adaptability and robustness of speech endpoint detection, this paper presented a speech endpoint detection algorithm based on wavelet analysis and support vector machines. First, characteristic quantities of the speech signal are obtained by the wavelet transform; then the input to the support vector machine is computed from these characteristic quantities; finally the signal's type is determined. Simulation results show that, compared with traditional detection algorithms, the proposed algorithm improves the detection rate, has better adaptability and robustness, and can detect signals at different SNRs.
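A minimal sketch of that pipeline, assuming log subband energies from a discrete wavelet decomposition (PyWavelets) as the per-frame features and a synthetic noise-vs-tone training set; real speech frames and labels would replace these stand-ins.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

# Per-frame wavelet-subband energies feed an SVM that labels frames
# speech / non-speech.
def frame_features(frame, wavelet="db4", level=3):
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.log1p([np.sum(c ** 2) for c in coeffs])   # subband energies

rng = np.random.default_rng(5)
noise = [rng.normal(0, 0.1, 256) for _ in range(40)]
speech = [np.sin(0.2 * np.arange(256)) + rng.normal(0, 0.1, 256)
          for _ in range(40)]
X = np.array([frame_features(f) for f in noise + speech])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf").fit(X, y)
test = np.sin(0.2 * np.arange(256)) + rng.normal(0, 0.1, 256)
print(clf.predict([frame_features(test)]))   # 1 => frame classified as speech
```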
Research of the Mobile CSCL Model Based on Extended Granott Peer Interaction Mode
Computer Science. 2012, 39 (6): 247-250. 
Abstract PDF(463KB) ( 392 )   
Traditional mobile CSCL models are researched and structured from a technical point of view, ignoring the social peer interaction between learners during collaboration. This paper studied an extended Granott peer interaction mode based on the classic Granott model, presented a mobile CSCL model based on this extended mode, and then gave a formalization algorithm for the Granott logic object, which was validated and implemented in a prototype system. This model supports mobile learners in collaborating with partners according to the extended Granott peer interaction mode, thus efficiently helping learners accomplish mobile cooperative learning tasks.
Fast and Robust LOGFAST Corner Algorithm
Computer Science. 2012, 39 (6): 251-254. 
Abstract PDF(594KB) ( 397 )   
Based on the time-efficient FAST algorithm, the paper described a fast and robust LOGFAST corner algorithm. Histogram equalization, a kind of image enhancement, is first applied to the original image to sharpen useful image information and improve illumination invariance. Then a Laplacian-of-Gaussian operator is convolved with the image to achieve Gaussian smoothing and edge enhancement while maximally suppressing noise. Finally, the FAST algorithm is applied to produce LOGFAST corners. The new corner detector is not only as time-efficient as FAST, but also illumination invariant, noise tolerant and robust. Experiments show that LOGFAST achieves a detection time of 0.05 s on noisy 640×480 images, delivers similar detection performance on images with varying illumination, and attains 98% repeatability. Owing to this performance, LOGFAST can be used in real-time video processing applications such as intelligent vehicle warning systems.
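A hedged reconstruction of that pipeline with OpenCV primitives; the kernel size, sigma and FAST threshold are illustrative guesses, and frame.png is a hypothetical input image.

```python
import cv2

# LOGFAST-style pipeline: histogram equalization, then a
# Laplacian-of-Gaussian pass, then FAST corner detection.
img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

eq = cv2.equalizeHist(img)                         # illumination normalization
log = cv2.Laplacian(cv2.GaussianBlur(eq, (5, 5), 1.0), cv2.CV_16S, ksize=3)
log = cv2.convertScaleAbs(log)                     # back to 8-bit for FAST

fast = cv2.FastFeatureDetector_create(threshold=20, nonmaxSuppression=True)
keypoints = fast.detect(log, None)
print(len(keypoints), "corners")
```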
Image Retrieval Based on Uniform Region Segmentation
Computer Science. 2012, 39 (6): 255-257. 
Abstract PDF(586KB) ( 293 )   
RelatedCitation | Metrics
In order to reduce the effect of accurate image segmentation on content-based image retrieval, a new image retrieval algorithm based on uniform region segmentation was proposed. At first, the image was segmented into uniform regions, and then the histogram color feature and the Gabor wavelet texture feature were extracted. Finally, a reasonable similarity measure was designed to improve the retrieval efficiency. The experimental results show that the algorithm in this paper is more accurate than the SIMPLIcity system: the average retrieval performance is increased by 3.6%, and a better average precision ratio is obtained.
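A simplified sketch of the uniform-region idea: the image is split into a fixed grid and per-region color histograms are compared by histogram intersection. The paper's Gabor texture feature and its specific similarity measure are omitted here:

```python
import numpy as np

def block_histograms(img, grid=4, bins=8):
    # img: HxWx3 uint8 array; split into grid x grid uniform regions.
    h, w = img.shape[:2]
    feats = []
    for i in range(grid):
        for j in range(grid):
            block = img[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            hist, _ = np.histogramdd(block.reshape(-1, 3),
                                     bins=(bins,) * 3, range=((0, 256),) * 3)
            feats.append(hist.ravel() / hist.sum())   # normalized region histogram
    return np.concatenate(feats)

def similarity(f1, f2):
    # Histogram intersection, summed over all uniform regions.
    return np.minimum(f1, f2).sum()

rng = np.random.default_rng(0)
query, candidate = rng.integers(0, 256, (2, 64, 64, 3), dtype=np.uint8)
print(similarity(block_histograms(query), block_histograms(candidate)))
```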
Shot Boundary Detection Algorithm Based on Self-adaptive Dual Thresholds of Accumulative Frame
Computer Science. 2012, 39 (6): 258-260. 
Abstract PDF(323KB) ( 436 )   
RelatedCitation | Metrics
Improving the precision of shot boundary detection is very important. This paper presented an algorithm for shot boundary detection based on self-adaptive dual thresholds of an accumulative frame. It used an accumulative frame to memorize the accumulative differences among sequential frames of the video and magnify gradual feature changes. A block-matching algorithm was designed to compensate for object motion and reduce its effects. Self-adaptive dual thresholds were adopted to improve the accuracy of shot boundary detection. Experiments on various video clips show that the proposed method detects gradual shots well with less computation.
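A minimal sketch of the accumulative dual-threshold logic, assuming a fixed low/high threshold pair (the paper's thresholds are self-adaptive) and omitting the block-matching motion compensation:

```python
import numpy as np

def detect_boundaries(frames, t_low, t_high):
    boundaries, acc = [], 0.0
    prev = frames[0].astype(float)
    for k, f in enumerate(frames[1:], start=1):
        cur = f.astype(float)
        d = np.abs(cur - prev).mean()      # inter-frame difference
        if d > t_high:                     # abrupt cut
            boundaries.append(k); acc = 0.0
        elif d > t_low:                    # possible gradual transition
            acc += d                       # accumulate small changes
            if acc > t_high:               # accumulated change crosses T_high
                boundaries.append(k); acc = 0.0
        else:
            acc = 0.0
        prev = cur
    return boundaries

video = [np.full((32, 32), 10.0)] * 20 + [np.full((32, 32), 200.0)] * 20
print(detect_boundaries(video, t_low=2.0, t_high=50.0))   # -> [20]
```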
Hyperspectral Remote Sensing Image Classification Based on MFA and kNNS
Computer Science. 2012, 39 (6): 261-265. 
Abstract PDF(429KB) ( 380 )   
RelatedCitation | Metrics
In order to explore dimensionality reduction and classification in hyperspectral remote sensing images, an algorithm based on marginal Fisher analysis (MFA) and the k-nearest neighbor simplex (kNNS) was proposed in this paper. First, the data were projected from a high-dimensional space onto a low-dimensional space by MFA, combined with the information of the different classes. Then, classification was performed by the kNNS classifier using a few neighbors from each class. The experimental results on the Urban data set, Washington DC Mall data set and Indian Pine hyperspectral data set show the effectiveness of the proposed algorithm. When i (i=4,6,8) samples of each class are randomly selected for training and 100 samples of each class for testing, the overall accuracy of the proposed algorithm is improved by 3.7%-8.5% compared with other methods.
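A hedged sketch of the kNNS decision rule as commonly formulated: the distance from a test point to the hull spanned by the k nearest neighbors of each class decides the label. The least-squares affine projection below is an approximation, and the MFA projection is assumed to have been applied already:

```python
import numpy as np

def knns_predict(x, X_train, y_train, k=3):
    best_cls, best_dist = None, np.inf
    for cls in np.unique(y_train):
        Xc = X_train[y_train == cls]
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
        N = Xc[idx]                                   # k neighbors of this class
        # Best affine combination of the neighbors (soft sum(w)=1 constraint):
        A = np.vstack([N.T, np.ones(len(N))])
        b = np.append(x, 1.0)
        w, *_ = np.linalg.lstsq(A, b, rcond=None)
        dist = np.linalg.norm(N.T @ w - x)            # residual to the hull
        if dist < best_dist:
            best_cls, best_dist = cls, dist
    return best_cls

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(4, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
print(knns_predict(rng.normal(4, 1, 5), X, y))        # -> 1
```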
Face Recognition Based on MB-LBP and Improved LFDA
Computer Science. 2012, 39 (6): 266-269. 
Abstract PDF(342KB) ( 300 )   
RelatedCitation | Metrics
An algorithm for face recognition based on multi-scale block local binary patterns (MB-LBP) and improved local Fisher discriminant analysis (LFDA) was proposed, which strengthens local analysis of labeled samples and global analysis of training samples using the ability of MB-LBP for local and global description. The algorithm uses the average Euclidean distance from every sample to the other samples of the same class as a parameter to overcome the limits of computing the within-class scatter, and preserves the global structure by incorporating the total scatter of the training samples in parameter form. Experimental results show that MB-LBP provides a good basis for local- and global-preserving analysis, and that the improved LFDA has more obvious adaptability and a higher recognition rate than LFDA when the number of labeled samples is small.
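For illustration, one MB-LBP code can be computed by averaging intensities over blocks and applying the usual 8-neighbour LBP comparison to the block means; the block size and coordinates below are arbitrary, not the paper's settings:

```python
import numpy as np

def mb_lbp_code(img, x, y, s):
    # img: 2-D grayscale array; (x, y): top-left corner of a 3x3 grid of s x s blocks.
    means = [[img[x+i*s:x+(i+1)*s, y+j*s:y+(j+1)*s].mean()
              for j in range(3)] for i in range(3)]
    center = means[1][1]
    # Clockwise order of the 8 surrounding block means:
    ring = [means[0][0], means[0][1], means[0][2], means[1][2],
            means[2][2], means[2][1], means[2][0], means[1][0]]
    return sum(int(m >= center) << k for k, m in enumerate(ring))

rng = np.random.default_rng(0)
face = rng.integers(0, 256, (64, 64)).astype(float)
print(mb_lbp_code(face, 0, 0, s=4))    # one MB-LBP code for a 12x12 patch
```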
Fast Face Recognition of Sparse Representation Based Fusion of Visible and Near Infrared Images
Computer Science. 2012, 39 (6): 270-273. 
Abstract PDF(314KB) ( 378 )   
RelatedCitation | Metrics
Recently, face recognition based on the fusion of visible (VIS) and near infrared (NIR) images has attracted more attention. Here we studied fast face recognition in this field and gave a detailed applied scheme, which includes three main techniques: down-sampling of the original samples, sparse-representation-based selection of M-neighbors to substitute for the original training samples, and a weighted decision fusion strategy. Compared with classic algorithms, the experiments on the CSIST face databases show that our scheme can achieve higher classification accuracy with a lower computation load.
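The M-neighbor selection step might be sketched as follows, assuming orthogonal matching pursuit as the sparse coder (the abstract does not name one); the down-sampling and VIS/NIR weighted decision fusion steps are omitted:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
D = rng.normal(size=(100, 200))            # columns = 200 training face samples
x = D[:, 7] + 0.01 * rng.normal(size=100)  # test sample close to training sample 7

# Sparsely code the test sample over the full training set, then keep the
# M samples with the largest coefficients as the reduced dictionary.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10).fit(D, x)
m_neighbors = np.argsort(-np.abs(omp.coef_))[:10]
print(m_neighbors)   # sample 7 should appear among the selected M-neighbors
```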
Kernel Direct LDA Subspace Hyperspectral Image Terrain Classification
Computer Science. 2012, 39 (6): 274-277. 
Abstract PDF(312KB) ( 345 )   
RelatedCitation | Metrics
In order to reduce the data dimensionality of hyperspectral images and improve recognition efficiency, a new terrain classification method, i.e., the KDLDA subspace method, was presented. Firstly, kernel direct linear discriminant analysis (KDLDA) was used to extract nonlinear discriminant features, and then a shortest-distance classifier was used to perform terrain classification in the KDLDA feature subspace. The solution of KDLDA under the general form of class prior probability was also deduced. Recognition results on airborne visible/infrared imaging spectrometer (AVIRIS) hyperspectral images show that, compared with the original space method, the LDA subspace method, the direct linear discriminant analysis (DLDA) subspace method, and the kernel linear discriminant analysis (KLDA) subspace method, the presented KDLDA subspace method can remarkably improve recognition efficiency.
Dynamic Combinational Corner Detection Algorithm Based on Brain Magnetic Resonance Image Registration
Computer Science. 2012, 39 (6): 278-282. 
Abstract PDF(484KB) ( 326 )   
RelatedCitation | Metrics
The corner detection algorithm is key to image registration algorithms based on corner feature points. The Harris and SUSAN algorithms are two important detection algorithms among them because of their satisfying detection capability, but neither describes the information of corner points comprehensively. Therefore, combining them to enhance their capability is a good solution, for which the weights assigned to the two algorithms are important. This paper proposed a combinational method and improved its capability greatly by introducing two weighting factors and deciding their proportion based on statistical experiments. In the end, the method was applied to brain MR image registration. The experimental results show that this algorithm can be used for brain MR image registration and can obtain higher registration precision and stability compared with existing corner detection algorithms.
Insulator Image Denoising Based on Pixel Peer Groups and Neighbor Groups
Computer Science. 2012, 39 (6): 283-284. 
Abstract PDF(249KB) ( 335 )   
RelatedCitation | Metrics
A new denoising method was proposed in this paper according to the characteristics of insulator infrared images with impulse noise and Gaussian noise. First of all, according to the number of pixels contained in a pixel's peer group, impulse noise and signal areas were determined, and then target edges were judged using neighbor groups. The denoising method replaces each pixel with the average of its peer group members, and keeps the useful signal areas and edge areas at the same time. The experimental results show that the method achieves a better signal-to-noise ratio (SNR) than other methods, without blurring edges and details.
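A simplified sketch of the peer-group test, assuming a 3x3 window, an illustrative similarity tolerance, and median replacement for detected impulses (a simplification of the paper's rule); the neighbor-group edge test is omitted:

```python
import numpy as np

def peer_group_filter(img, tol=20.0, min_peers=3):
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = img[i-1:i+2, j-1:j+2].astype(float)
            # Peer group: window pixels whose value is close to the center's.
            peers = window[np.abs(window - img[i, j]) <= tol]
            if peers.size - 1 < min_peers:       # tiny peer group -> impulse noise
                out[i, j] = np.median(window)    # simplified replacement
            else:
                out[i, j] = peers.mean()         # peer-group mean smooths Gaussian noise
    return out

img = np.full((16, 16), 128.0)
img[8, 8] = 255.0                        # isolated impulse
print(peer_group_filter(img)[8, 8])      # -> 128.0
```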
Parallelization of Label-free Protein Quantification Software QuantWiz Based on GPU
Computer Science. 2012, 39 (6): 285-288. 
Abstract PDF(320KB) ( 376 )   
RelatedCitation | Metrics
QuantWiz is a label-free quantification software package based on mass spectrometry, widely used in quantitative proteomics. Growing experimental data causes an enormous workload. With hundreds of GFlops or even TFlops of performance, GPUs can speed up such compute-intensive quantitative proteomics applications. This article analyzed the QuantWiz software to find its hotspot module. We then presented a GPU-accelerated version of this software, called GPU-QuantWiz, and implemented it under the CUDA framework. Statistical performance results show that the accelerated program achieves good performance, with a 9.66× speedup. Moreover, our algorithm can be extended to two or more GPUs with good scalability.
Estimation and Analysis of Network Modules Power Consumption Based on Android System
Computer Science. 2012, 39 (6): 289-292. 
Abstract PDF(327KB) ( 615 )   
RelatedCitation | Metrics
Embedded system devices, especially portable mobile devices, are powered by batteries that are limited in size and weight; therefore power becomes an important challenge in embedded system design. To investigate this problem, we used two strategies, WIFI and GPRS, to access the network and measured the changes in system power consumption. The conclusions are as follows: under both strategies the battery lifetime decreases by about 50% in the small-flow mode (10 Kb/min) and more rapidly, by about 85%, in the big-flow mode (2 Mbit/min). Under the same conditions, the devices consume less power in WIFI mode than in GPRS mode. These results were validated on two Android devices. The experimental results demonstrate that it is very important to find techniques to reduce the power consumption of mobile network modules.
Mixed Precision Finite Element Algorithm on Heterogeneous Architecture
Computer Science. 2012, 39 (6): 293-296. 
Abstract PDF(300KB) ( 650 )   
RelatedCitation | Metrics
For a long time, single precision has been giving way to double precision in scientific computing. However, on modern computer architectures, mixed-precision computing can take full advantage of the excellent computing capabilities of vector components and GPGPUs, offering merits such as reduced communication bandwidth requirements and improved data movement efficiency. A mixed-precision explicit finite element algorithm was proposed and implemented on an nVidia GPU for strongly nonlinear multi-scale material simulation. The developed mixed-precision finite element method gives the same results as a fully double-precision calculation, while keeping 90% of the finite element calculations in single-precision floating point. As a result, on devices that do not support a native double-precision floating-point format, the mixed-precision algorithm makes it possible to fulfill double-precision finite element simulation, while on devices that support native double precision, the mixed-precision algorithm is 1.6-1.7 times faster than the full double-precision calculation.
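The underlying principle can be illustrated with mixed-precision iterative refinement, a standard technique rather than the paper's finite element code: the expensive solves run in single precision, while residuals and corrections are accumulated in double precision:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 200)) + 200 * np.eye(200)   # well-conditioned system
b = rng.normal(size=200)

A32 = A.astype(np.float32)                            # single-precision copy
x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
for _ in range(3):
    r = b - A @ x                                     # residual in float64
    dx = np.linalg.solve(A32, r.astype(np.float32))   # cheap correction in float32
    x += dx.astype(np.float64)                        # accumulate in float64

print(np.linalg.norm(b - A @ x))   # near double-precision accuracy
```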
Loop-carried Anti-dependence MPI Auto-parallelization Research
Computer Science. 2012, 39 (6): 297-300. 
Abstract PDF(335KB) ( 435 )   
RelatedCitation | Metrics
Traditional MPI auto-parallelization dependence testing methods can only detect whether there are loop-carried dependences, not their types. It was proved that auto-parallelization of loops with loop-carried anti-dependence can be achieved under certain conditions. By creating reasonable copies of the dependence data, an MPI auto-parallelization method for loops with loop-carried anti-dependence was proposed, based on the dependence testing methods and data flow information. The experimental results show that the proposed method can effectively recognize parallel loops with loop-carried anti-dependence, and that using its results to generate MPI code can efficiently improve the efficiency of MPI programs.
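The transformation the method relies on can be shown on a toy loop: iteration i reads a[i+1], which a later iteration overwrites (an anti-dependence), so the loop cannot run in parallel as written; copying the dependence data first makes every iteration independent:

```python
import numpy as np

a = np.arange(10.0)
b = np.ones(10)

# Original serial loop with a loop-carried anti-dependence: a[i] = a[i+1] + b[i]
old_a = a.copy()                 # reasonable copy of the dependence data
for i in range(9):               # iterations are now independent:
    a[i] = old_a[i + 1] + b[i]   # every read goes to the untouched copy
print(a)
```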
Message-passing Code Generation Algorithm in the MPI Automatic Parallelizing Compilation System
Computer Science. 2012, 39 (6): 301-304. 
Abstract PDF(323KB) ( 437 )   
RelatedCitation | Metrics
From the perspective of data redistribution, traditional MPI automatic parallelizing compilation systems generate message-passing programs for distributed-memory systems, but a large amount of data redistribution communication overhead results in low speedups. Aiming at this problem, this paper proposed a message-passing code generation algorithm for the back-end of an MPI automatic parallelizing compilation system based on Open64. Centered on uniform data distribution, the algorithm generates more accurate message-passing code, according to the given sets of parallel loops and communication arrays, by modifying the WHIRL syntax trees of the serial code. Experimental results show that the algorithm can reduce the communication overheads of message-passing programs to a large extent and improve their speedups significantly.
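The kind of message-passing code such a back-end emits for a parallel loop under uniform block distribution might look like the following, written here with mpi4py rather than generated C/Fortran (run with, e.g., mpiexec -n 4 python demo.py):

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 1000
lo = rank * n // size                              # uniform block of iterations
hi = (rank + 1) * n // size

local = np.arange(lo, hi, dtype=np.float64) ** 2   # local part of the loop
partial = local.sum()

total = comm.reduce(partial, op=MPI.SUM, root=0)   # only the communication needed
if rank == 0:
    print(total)
```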
Design and Implementation of a 3D FDTD Parallel Algorithm on Many-core Architecture
Computer Science. 2012, 39 (6): 305-308. 
Abstract PDF(239KB) ( 361 )   
RelatedCitation | Metrics
In electromagnetics, FDTD, which has been widely used in the field of dielectric device design, can simulate the changes of the electromagnetic field in space accurately. With abundant computing resources, many-core processors are well suited to such compute-intensive tasks. Based on an analysis of Maxwell's equations and the FDTD simulation algorithm, we implemented a parallel FDTD algorithm. According to the experimental results, the FDTD algorithm achieves high computational efficiency on the many-core processor platform and can make full use of the structural characteristics of the many-core processor.
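For reference, the core FDTD stencil that a many-core implementation parallelizes can be sketched in one dimension; each array element of the leap-frog update maps naturally onto one hardware thread (normalized units, illustrative Courant factor):

```python
import numpy as np

n, steps = 200, 500
ez = np.zeros(n)        # electric field
hy = np.zeros(n - 1)    # magnetic field, staggered half a cell

for t in range(steps):
    hy += 0.5 * (ez[1:] - ez[:-1])                # update H from the curl of E
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])          # update E from the curl of H
    ez[n // 2] += np.exp(-((t - 30) / 10) ** 2)   # soft Gaussian source

print(ez.max())
```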