Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 39, Issue 10, October 2012
Survey of Resource Scheduling in Cloud Computing
Computer Science. 2012, 39 (10): 1-6. 
Abstract PDF(549KB) ( 3978 )   
Resource scheduling is a fundamental issue in cloud computing. The resource scheduling methods for reducing energy consumption and improving resource utilization in cloud computing data centers, as well as economics-based cloud resource management models, were discussed. Then a cloud computing resource scheduling model minimizing energy consumption and the number of servers was proposed. Finally, important directions for future research in resource scheduling of cloud computing were presented, including prediction-based resource scheduling, power and performance tradeoff scheduling, resource management policies and mechanisms for different application workload types, comprehensive multi-resource allocation combining computing power (CPU, memory) and network bandwidth, and multi-objective optimization of resource scheduling.
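The abstract does not give the proposed scheduling model in closed form. As an illustrative sketch only (the variables, weights and constraints below are assumptions, not the paper's notation), a joint objective that minimizes both the number of active servers and energy consumption can be written as a consolidation problem:

```latex
% Assumed illustrative form: x_{ij}=1 places VM i on server j, y_j=1 marks
% server j active, r_i is VM i's resource demand, C_j is server j's capacity.
\begin{aligned}
\min\quad & \alpha \sum_{j=1}^{m} y_j \;+\; \beta \sum_{j=1}^{m}
  \Bigl( P_j^{\mathrm{idle}}\, y_j + \bigl(P_j^{\mathrm{peak}} - P_j^{\mathrm{idle}}\bigr)\, u_j \Bigr) \\
\text{s.t.}\quad & u_j = \frac{1}{C_j}\sum_i x_{ij}\, r_i \;\le\; y_j, \qquad
  \sum_j x_{ij} = 1 \;\;\forall i, \qquad x_{ij},\, y_j \in \{0,1\}
\end{aligned}
```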
Research Development on Human Group Activities Analysis
Computer Science. 2012, 39 (10): 7-11. 
Abstract PDF(704KB) ( 506 )   
Human group activity analysis has become a new research interest in computer vision. It has broad application prospects and great economic value in many aspects, such as intelligent video surveillance, virtual reality and video retrieval. This paper surveyed human group activity analysis in terms of group databases and analysis algorithms. Firstly, the group databases mainly consist of behavior databases and surveillance databases; this paper summarized the typical databases of these two classes. Secondly, we summed up the human group activity analysis algorithms in terms of decomposition methods and recognition methods for group activities. The decomposition methods were divided into three kinds, and their strengths and weaknesses were pointed out respectively. The recognition algorithms for group activities were divided into statistical approaches and description-based ones; we compared them in detail and pointed out the development tendency. At last, we summarized the open problems of human group activity analysis and looked forward to its future.
Survey on Multi-biometric Fusion Based on Match Score
Computer Science. 2012, 39 (10): 12. 
Abstract PDF(381KB) ( 588 )   
Influenced by data noise and the limitations of the recognition system itself, the accuracy of identification systems based on a single biometric trait has proved to be quite limited. Therefore, using multi-biometric recognition to improve recognition accuracy has become one of the hotspots in the biometric recognition field. This paper first introduced the importance of multi-biometrics and the classification of multi-biometrics, and then mainly discussed the current studies on score-based multi-biometric recognition. At the end, this paper summarized the open problems and future work in multi-biometrics.
Energy Optimization and Modeling in Wireless Sensor Networks: A Survey
Computer Science. 2012, 39 (10): 15-20. 
Abstract PDF(703KB) ( 1009 )   
Due to the limited energy supply of wireless sensor nodes, the primary challenge in wireless sensor networks (WSNs) is to optimize the networks' energy consumption and estimate network lifetime. By analyzing the energy consumption features of WSNs, this paper reviewed energy optimization strategies from the views of sensor nodes and networks, and then surveyed WSN energy modeling studies from different research perspectives: wireless communication based, state transition based, protocol stack based and others. Finally, this paper pointed out that future research should focus on cross-layer energy optimization and integrated hardware/software energy modeling techniques.
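No concrete energy model is reproduced in the abstract. For orientation, the first-order radio model that much of the WSN energy literature builds on (standard background, not necessarily this survey's own formulation) charges a k-bit transmission over distance d and a k-bit reception as:

```latex
% First-order radio model (standard background, not this paper's notation):
E_{tx}(k, d) = E_{\mathrm{elec}}\, k + \varepsilon_{\mathrm{amp}}\, k\, d^{2},
\qquad
E_{rx}(k) = E_{\mathrm{elec}}\, k
```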
Overview of Wireless Sensor and Actor Networks
Computer Science. 2012, 39 (10): 21-25. 
Abstract PDF(470KB) ( 681 )   
Wireless sensor and actor networks (WSANs), derived from wireless sensor networks, not only can sense changes in the environment, but also interact with it and perform appropriate actions. This paper reviewed the architecture, hardware structure, coordination mechanisms between different nodes, and communication protocols of WSANs. A survey of recent research on WSANs was given, and future research directions were raised at the end of the paper.
Research on Witkey Credit Ability Evaluation Mechanism
Computer Science. 2012, 39 (10): 26. 
Abstract PDF(557KB) ( 508 )   
Witkey's idea aims at encouraging people to transform their specialist knowledge, skills, experience and ability into valuable products with the help of the online community and share them with other people. But nowadays, many Witkey community websites' credit ability evaluation systems are not only too simple to reflect their users' real credit level, but also make it hard for project publishers to choose their partners. After analyzing developers' technical and credit merits and demerits in the software development field, we raised an objective estimation and subjective evaluation model with its specifications. The experiments use classic scenes to simulate Witkey users' growing-up experience, then analyze the guiding effect of applying such an evaluation system on users' development, and finally prove its feasibility and usability.
Stochastic Model Checking Composed Services in Cloud Computing Environment
Computer Science. 2012, 39 (10): 31-34. 
Abstract PDF(411KB) ( 371 )   
The existing works cannot verify whether a composed service's function, response time and cost satisfy the requester's requirements simultaneously, and treat them separately. We extended basic workflow patterns in order to depict probabilistic choice, stochastic time and nondeterministic choice of composed cloud services. An approach to mapping the extended patterns to a continuous-time Markov reward process was proposed. The existing temporal logic CSRL was extended for specifying properties that can depict function, response time and cost constraints, and the corresponding stochastic model checking approach was proposed. This paper shows that we can depict cloud services' runtime dynamic behaviors effectively and verify their correctness and availability.
LISP-HIMS:A Hierarchical Identifier Mapping System for LISP
Computer Science. 2012, 39 (10): 35-39. 
Abstract PDF(447KB) ( 490 )   
Aiming at the problem of the currently rapid growth of the global routing table size in the DFZ, the IETF has proposed to reconsider the Internet addressing architecture by splitting the host identifier from its routing locator. LISP, proposed by Cisco, is considered one of the most promising solutions based on this idea. This paper presented a novel mapping system for the LISP proposal, and elaborated the model and modes of this mapping system as well as the allocation strategy of the hierarchical host identifier. Compared with other LISP-based mapping systems, the proposed mapping system has better scalability and lower mapping query latency.
Cognitive Radio Based Routing Protocol for Multi-channel Wireless Mesh Networks
Computer Science. 2012, 39 (10): 40-44. 
Abstract PDF(444KB) ( 429 )   
In multi-channel wireless mesh networks, there is a tight inter-dependence between route selection and spectrum availability, so channel selection must be fully considered when designing a routing protocol. Traditional routing protocols cannot be applied well to cognitive radio networks. The proposed routing protocol, in which each node is equipped with dual network interfaces, achieves route selection and channel selection simultaneously, and takes advantage of multiple channels with a collision avoidance design. Simulation shows that, compared to traditional routing protocols, the proposed protocol can significantly improve network throughput.
P2P Streaming Media Recognition Method Based on SVM Probabilistic Output
Computer Science. 2012, 39 (10): 45-49. 
Abstract PDF(419KB) ( 411 )   
P2P streaming media takes up a lot of bandwidth and is prone to spreading viruses, so it needs to be identified accurately. This paper analyzed the shortcomings of the Abacus method, and proposed a P2P streaming media recognition method, P-Abacus, based on SVM probabilistic output. P-Abacus expresses the SVM output as probabilities, which reflect the extent to which a sample belongs to the known applications. We ordered the outputs and, according to the maximum probability, judged whether the sample belongs to the class of maximum probability, is unknown, or needs a further judgement. If a further judgement is needed, we calculated the probabilistic output difference of the SVM built between the two largest classes, and determined whether the sample belongs to one of the two largest classes or is unknown. Thus P-Abacus has a better recognition effect, because the probabilistic output contains more information that can be utilized. Experiments show that P-Abacus has a higher recognition rate and a lower false positive rate than Abacus, with only a limited increase in time overhead.
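The two-stage decision rule can be sketched as follows. This is a minimal illustration assuming scikit-learn's Platt-scaled SVM probabilities; the thresholds t_accept and t_margin and the exact decision logic are assumptions, not P-Abacus's published rules.

```python
# Sketch of SVM-based recognition with probabilistic output and an
# "unknown" rejection rule, in the spirit of P-Abacus (thresholds assumed).
import numpy as np
from sklearn.svm import SVC

def train(X, y):
    # Platt scaling gives per-class probability estimates.
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(X, y)
    return clf

def classify(clf, x, t_accept=0.8, t_margin=0.2):
    proba = clf.predict_proba(x.reshape(1, -1))[0]
    order = np.argsort(proba)[::-1]        # classes sorted by probability
    p1, p2 = proba[order[0]], proba[order[1]]
    if p1 >= t_accept:                     # confident: a known application
        return clf.classes_[order[0]]
    if p1 - p2 >= t_margin:                # clear winner among the top two
        return clf.classes_[order[0]]
    return "unknown"                       # ambiguous: treat as unknown
```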
Adaptive Space Orthogonal Matching Pursuit Algorithm for Signal Reconstruction Based on Compressive Sensing
Computer Science. 2012, 39 (10): 50-53. 
Abstract PDF(865KB) ( 481 )   
In order to ensure reconstruction of the original signal, the traditional Nyquist sampling theorem requires that the sampling rate be at least twice the highest frequency of the original signal, which causes a tremendous amount of computation and a waste of resources. Compressive sensing theory, however, shows that we can reconstruct the original signal from a small number of random samples as long as the signal is sparse or compressible. Based on the study and summarization of traditional matching pursuit algorithms, this paper presented a new adaptive space orthogonal matching pursuit algorithm (ASOMP) for the reconstruction of sparse signals. The algorithm introduces a regularized adaptive and spatial matching principle for the choice of matching atoms with reverse thinking, which accelerates atom matching and improves its accuracy, ultimately leading to exact reconstruction of the original signal. Finally, we compared the ASOMP algorithm with the traditional MP and OMP algorithms in software simulation. Experimental results show that the ASOMP algorithm is superior to the traditional MP and OMP algorithms in reconstruction quality and speed.
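For context, the baseline OMP that ASOMP is compared against can be sketched as below; the ASOMP-specific regularized adaptive and spatial atom selection is not reproduced here.

```python
# Baseline orthogonal matching pursuit (OMP) for y = Phi x with k-sparse x.
# This is the standard comparison algorithm, not the paper's ASOMP variant.
import numpy as np

def omp(Phi, y, k, tol=1e-9):
    m, n = Phi.shape
    residual = y.copy()
    support = []
    x_hat = np.zeros(n)
    for _ in range(k):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(Phi.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares fit on the chosen support, then update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x_hat[support] = coef
    return x_hat
```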
Distributed Protocol for Multipath Relay Routing Based on Node's Security Degree
Computer Science. 2012, 39 (10): 54. 
Abstract PDF(667KB) ( 448 )   
Based on the assessment of relay nodes' security degree, a distributed protocol for multipath relay routing based on node security degree (NSD-DPMRR) was proposed for P2P networks. The best send rate of the source node and the best forward rates of the relay nodes can be computed in a distributed manner. The simulation shows that the proposed protocol reduces the damage to data transmission caused by malicious relay nodes, maximizes the correct data received by the destination node, and ensures the security and effectiveness of relay routing with low complexity.
Cooperative ARQ Mechanism Based on Communication Distance for Wireless Sensor Networks
Computer Science. 2012, 39 (10): 60-64. 
Abstract PDF(449KB) ( 402 )   
This paper proposed a cooperative automatic repeat request (ARQ) mechanism with multiple relays based on communication distance, in order to achieve high energy efficiency, saturation throughput and reliable transmission simultaneously. Considering the characteristics of the traditional ARQ mechanism and the division of network-layer packets into link-layer frames, a Markov chain model was built for analyzing the saturation throughput, packet dropping probability, average delay and energy efficiency at different communication distances. On the other hand, a cooperative ARQ model based on the Markov chain was established for performance analysis and evaluation of the proposed mechanism. In particular, a linear WSN model was considered. The mathematical analyses show that, compared with the traditional ARQ, the proposed cooperative ARQ performs better in terms of saturation throughput, reliability and energy efficiency.
Routing Algorithm of Hierarchical Wireless Sensor Network
Computer Science. 2012, 39 (10): 65-68. 
Abstract PDF(348KB) ( 372 )   
Cluster-heads closer to the sink are burdened with heavy relay traffic and tend to die early, because cluster-heads transmit their data to the sink via multi-hop communication; this phenomenon is known as the "energy hole". It was proved that a hierarchical network architecture can effectively delay the energy hole problem. Following the approach of the main routing algorithms, the existing routing algorithms were improved in computing the optimal number of cluster-heads and the probability of each node becoming a cluster-head in every annular region of the network. Incorporating the idea of hierarchy, a cluster-head routing quota (CRQ) algorithm was proposed, which can be used to control the number of connections each router accepts in the route-discovery phase. Thus, it meets the demand of evenly consuming the energy of the cluster-heads located in the same ring. Simulation results demonstrate that the new algorithm is better than existing routing algorithms in network lifetime and energy consumption.
Performance Analysis of Relay Selection Schemes for Cooperative Communications
Computer Science. 2012, 39 (10): 69-72. 
Abstract PDF(307KB) ( 397 )   
The outage probabilities of the all-relays selection scheme and the opportunistic relaying selection scheme based on the decode-and-forward (DF) protocol were analyzed. In particular, exact closed-form results for the outage probability of the two schemes over Rayleigh fading channels were derived. The outage performance was investigated further, and asymptotic expressions for the outage probability were obtained in the high signal-to-noise ratio (SNR) regime. Theoretical analysis and numerical simulation results show that the outage results of the two methods are equal at high SNR, or when the performance of the relay-destination channels is better than that of the source-relay channels. The outage performance of the all-relays selection scheme is better than that of the opportunistic relaying selection scheme when the performance of the channels from the source to the relays is better than that of the channels from the relays to the destination.
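For reference, the single-hop building block behind such derivations is standard (shown here for context; the paper's exact closed forms for the two selection schemes combine several such terms and are not reproduced): over a Rayleigh fading link with average SNR \(\bar{\gamma}\) and target rate R,

```latex
% Single Rayleigh hop outage (standard result, shown for context only):
P_{\mathrm{out}} \;=\; \Pr\bigl[\log_2(1+\gamma) < R\bigr]
\;=\; 1 - \exp\!\Bigl(-\tfrac{2^{R}-1}{\bar{\gamma}}\Bigr)
\;\approx\; \tfrac{2^{R}-1}{\bar{\gamma}} \quad (\bar{\gamma} \to \infty)
```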
Forgery Attacks on a Series of ID-based Threshold Proxy Signature Schemes
Computer Science. 2012, 39 (10): 73-77. 
Abstract PDF(381KB) ( 403 )   
Recently, YU Y K et al. proposed a series of ID-based threshold proxy signature schemes in the standard model. This paper constructed three attack algorithms against the latest scheme of YU-ZHENG, with which an attacker can forge both valid regular signatures on behalf of the original signer and proxy signatures of any proxy signer on any message, without knowing the signing keys of these signers. Our attack algorithms work on the YU-ZHENG scheme as well as the earlier schemes in the series. The attacks show that this series of schemes is insecure. We analyzed the root cause of the attacks and gave some suggestions for modification at the end.
Botnet Propagation Model with Two-factor on Scale-free Network
Computer Science. 2012, 39 (10): 78-81. 
Abstract PDF(476KB) ( 409 )   
With the development of networks, botnets have become a major attack platform threatening the Internet. The current network is a complex network consisting of random networks and scale-free networks. Combining the features of scale-free networks, immunity and network traffic congestion, this paper proposed a new botnet propagation model with two factors on scale-free networks. This model carefully considers the real situation of the Internet, especially the scale-free network topology, immunization of hosts removed from the susceptible population in advance, and network traffic congestion. Simulation results show that the proposed botnet propagation model more exactly matches the practical propagation laws and infection characteristics of bots on the Internet.
Study of Information Flow Behavior on Conditioned Energy Networks
Computer Science. 2012, 39 (10): 82-85. 
Abstract PDF(478KB) ( 417 )   
The increasing popularity of network applications has touched various aspects of human daily activities. In order to better study network interactive behavior, network information flow was abstracted as a form driven by the relation of information resource supply and demand. The drive of supply and demand was described, with a concept from physics, as the gravity between a pair of nodes. A conditioned energy network model that is not based on the details of network behavior modes was built, and divided into an energy network status and a chaotic network status according to the concept of "relative energy". Finally, the feasibility of studying information flow behavior through network rules carrying the meaning of energy was verified. It was demonstrated that the model can express the information flow behavior of real networks concisely and accurately, which has profound and lasting significance for the universal application of network strategies and the consistency of network studies.
Adaptive Duty Cycle Adjustment Mechanism for the SMAC Protocol in Wireless Sensor Networks
Computer Science. 2012, 39 (10): 86-89. 
Abstract PDF(426KB) ( 703 )   
The SMAC protocol for wireless sensor networks cannot adapt to dynamic changes in load, leading to network packet loss and delay jitter, so the TSMAC protocol was designed to solve these problems. Based on the average waiting time of packets in the queue, the protocol predicts the current network load and dynamically adjusts the duty cycle of nodes, thus achieving efficient use of energy while ensuring timely and reliable data transmission. The simulation results show that the improved protocol can effectively use node energy, ensure network throughput, and lower end-to-end transmission latency.
Research on Efficient Routing in Three-dimensional Underwater Acoustic Sensor Networks
Computer Science. 2012, 39 (10): 90-93. 
Abstract PDF(453KB) ( 425 )   
Three-dimensional (3D) underwater acoustic sensor networks (UW-ASNs) have recently attracted significant attention from both academia and industry, due to their applications in detecting and observing phenomena that cannot be adequately observed by two-dimensional ocean-bottom underwater acoustic sensor nodes. Many problems arise with 3D UW-ASNs that need to be solved in order to enable underwater monitoring in the new environment. Among them, providing efficient routing is very challenging due to the unique characteristics of 3D UW-ASNs. We proposed a geographic routing protocol and a self-adaptive routing protocol that adapts to 3D UW-ASNs with incomplete sets of failed nodes. The simulations demonstrate that our self-adaptive routing protocol can achieve a good tradeoff among packet delivery ratio, end-to-end delay and network throughput, while our basic geographic routing protocol achieves good performance in average end-to-end delay.
Fine-grained Disclosure Scheme for Attribute Certificate
Computer Science. 2012, 39 (10): 94-98. 
Abstract PDF(568KB) ( 481 )   
In order to effectively verify an X.509v4 attribute certificate after some of its attributes have been removed, a fine-grained disclosure scheme was proposed. In this scheme, every attribute in the certificate is pretreated, and a digital signature of the pretreated results is generated by the attribute authority. In different scenarios, unrelated attributes are removed from the certificate and the essential validation information is calculated by the certificate owner. The validation information and the digital signature in the certificate can be used to validate the legitimacy of the disclosed attributes. The scheme has the following characteristics: strong compatibility, good flexibility, high security and little additional cost.
Spatio-temporal Event Detection Using Pub/Sub Middleware
Computer Science. 2012, 39 (10): 99-103. 
Abstract PDF(574KB) ( 343 )   
Many applications in the Internet of Things (IoT) decide their next actions according to the occurrence of events with temporal and spatial constraints. In order to help IoT applications detect spatio-temporal events, this paper built a Pub/Sub middleware, OPS4ST. OPS4ST permits users to express diverse temporal, spatial and logical relationships of events in their subscriptions, implements the detection of spatio-temporal events in a distributed way, and can efficiently detect whether the spatio-temporal events that users subscribe to have occurred. The paper also evaluated OPS4ST's matching performance and costs by simulation experiments. The experimental results show that OPS4ST achieves satisfying performance with acceptable overheads.
New Copyright Protection Method Based on Text Feature Extraction
Computer Science. 2012, 39 (10): 104-107. 
Abstract PDF(338KB) ( 377 )   
On the Internet, illegal copying and piracy of text are becoming increasingly serious, so there is an urgent need for an effective text copyright protection method. This paper proposed a new text copyright protection method based on text feature extraction and text classification techniques. The experiments show that the proposed algorithm can distinguish different authors, effectively classify a controversial literary work, and identify its real author. Therefore the method can assist in resolving copyright disputes over contested works (especially famous works) and help maintain their integrity.
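A minimal sketch of the classification step, assuming TF-IDF character n-grams and a linear SVM as stand-ins (the paper does not specify its exact features or classifier):

```python
# Authorship-attribution sketch in the spirit of the paper: represent texts
# by extracted features and classify the disputed work. TF-IDF + LinearSVC
# are illustrative choices, not the paper's stated method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def attribute_author(known_texts, known_authors, disputed_texts):
    model = make_pipeline(
        TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # style-ish features
        LinearSVC(),
    )
    model.fit(known_texts, known_authors)
    return model.predict(disputed_texts)
```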
Research on Web Service Behavior Adaptation Based on Regular Flow Nets
Computer Science. 2012, 39 (10): 108-114. 
Abstract PDF(625KB) ( 377 )   
Web service adaptation is an important research focus in the field of service-oriented computing. Aiming at the problems of service behavior formalization and adaptation with cyclic service behavior, data flow modeling and state space explosion, we proposed a new approach for service behavior formalization and adaptation. The whole process, from formalizing service behavior based on regular flow nets, constructing the symbolic coverability tree of service behavior, and building data dependence and action dependence relationships, to generating symbolic execution trace adapters and finally accomplishing service behavior adaptation, was illustrated through examples in the paper.
Meta-analysis Technique and its Application in Software Engineering
Computer Science. 2012, 39 (10): 115-118. 
Abstract PDF(372KB) ( 331 )   
It is difficult to draw reliable results from a single experimental study. Therefore, there is a need to use a method to combine the results from multiple experimental studies. This paper introduced a meta-analysis technique for this purpose, described its application in empirical software engineering, and pointed out the problems that need to be addressed in the future.
Mobile Robot Middleware Supporting Self-adaptive Programming
Computer Science. 2012, 39 (10): 119-124. 
Abstract PDF(569KB) ( 375 )   
Robot middleware systems facilitate the development of self-adaptive robot applications by providing a set of high-level sensor/actuator APIs that abstract and hide the heterogeneity of different hardware platforms. We presented a middleware infrastructure supporting self-adaptive programming on mobile robot systems. It aims to cope with cross-platform issues and guarantee service quality through a set of abstracted, quality-guaranteed APIs. We used this middleware infrastructure to support the development of self-adaptive applications on mobile robot systems, so that it can provide a consistent level of QoS guarantee despite the varying physical differences among robot car systems. Our experimental evaluation reports promising results: the middleware can effectively support quality-guaranteed self-adaptive programming.
OSF:A Component-based Framework Supporting Offline SaaS Applications
Computer Science. 2012, 39 (10): 125-130. 
Abstract PDF(587KB) ( 337 )   
An offline SaaS application is environment-aware smart software that can work with intermittent connectivity. Currently there are only a few research works on offline Web applications; they cannot meet the requirements of SaaS and do not take a componentized framework into account. To solve the above problems, OSF (Offline SaaS Framework) was proposed. Firstly, the structures and mechanisms of an operation-oriented, component-based offline SaaS framework were proposed. Then a case study was introduced to verify the proposed framework. Theory and practice show that the offline SaaS application framework improves the user experience, enforces the usability of SaaS services, and enhances the applicable area of SaaS applications.
Test Case Generation for Web Services Based on OWL-S Documents
Computer Science. 2012, 39 (10): 131-135. 
Abstract PDF(440KB) ( 373 )   
When testing Web services from the view of service requesters, only the specification can be obtained. So far, methods that generate test cases by analyzing the input or output parameters have been introduced in some papers. This paper used the input and output parameters in the OWL-S specification to generate test data, and constructed the Web service control-flow graph by analyzing the service model information in the OWL-S specification. Then, test data reduction was performed based on the Web service control-flow graph, which improves the efficiency of the test cases and reduces the test cost.
Simulation Analysis Method for Importance of Exception Handling Module in Service-oriented Software System
Computer Science. 2012, 39 (10): 136-138. 
Abstract PDF(387KB) ( 338 )   
Based on the Monte Carlo method, impact factors for the importance of an exception handling module were proposed, including the importance of its protected areas and its steady-state failure rate. The former is calculated based on Bayesian theory, while the latter uses the ergodicity of Markov chains. This quantitative analysis aims to offer an effective solution: according to the result, designers can pay more attention to the specific module.
Semantic Web Services Composition Based on Concurrent Transaction Logic
Computer Science. 2012, 39 (10): 139-142. 
Abstract PDF(429KB) ( 336 )   
Concurrent Transaction Logic (CTR) is an extension of predicate logic which supports reasoning about the automatic composition of semantic Web services. This paper used CTR as the description and reasoning tool, and proposed a composition method covering the two aspects of OWL-S Web services: function and behavior. The executional semantics of CTR and the procedural semantics of its Horn clauses reduce the reasoning complexity, and a polynomial-time algorithm was constructed. This paper provides a new method for handling the problem that current mainstream semantic Web service composition methods are not able to model concurrent behaviors.
Specification and Compatibility Verification of Timing Behavior in Components
Computer Science. 2012, 39 (10): 143-147. 
Abstract PDF(725KB) ( 348 )   
Specifying and verifying the timing behavior of complex real-time components can efficiently improve a system's correctness and reliability. This paper presented timed-behavior-protocol-based formal modeling methods for timing behavior and the corresponding verification method for component-based systems. The architecture of the specification and verification tool, named TCBV, was given. An application example was introduced, and the experimental results show that the timed-behavior-protocol-based specification and verification methods can accurately model and conveniently verify timing behavior errors in complex real-time component systems. Finally, TCBV was compared with other related tools and the differences between them were analyzed.
Cold Data Concentration:Energy Saving Method for Streaming Media Storage Systems
Computer Science. 2012, 39 (10): 148-151. 
Abstract PDF(457KB) ( 396 )   
Streaming media storage systems face serious challenges due to the rapid accumulation of data volume. However, research on energy saving specifically for streaming media applications is scarce. First of all, based on a detailed model of the energy conservation problem in streaming media storage systems, we quantitatively analyzed the impact of data layout on energy saving and quality of service (QoS). It was found that cold data contributes the most to energy saving and has the least negative impact on QoS. Secondly, a new energy saving algorithm called cold data concentration (CDC) was proposed, which places the coldest data on part of the disks to optimize the energy saving effect. Finally, simulations validate that CDC saves much more energy than traditional energy saving methods while guaranteeing QoS.
Pruning-based Outlier Mining from Large Dataset
Computer Science. 2012, 39 (10): 152-156. 
Abstract PDF(424KB) ( 439 )   
Distance-based outlier detection typically requires quadratic time for distance computation and comparison. This quadratic scaling restricts the ability to apply the approach to large datasets. To overcome this limitation, a novel distance-based outlier mining approach with pruning rules was proposed. The approach consists of two phases. During the first phase, the original input data are scanned and the majority of non-outliers are pruned. During the second phase, an improved nested-loops approach is applied to compute the average k-nearest-neighbor distance, which measures the degree of being an outlier, and finally the top-n outliers are reported. Experiments on both synthetic and real-life data show that the proposed approach achieves a high hit rate with a low false alarm rate. Compared with related approaches, the proposed approach has lower time complexity.
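A sketch of the second phase, assuming a Bay-Schwabacher-style cutoff (the paper's exact pruning rules are not reproduced): a point is abandoned as soon as the mean distance to the k nearest points seen so far already falls below the weakest score in the current top-n, since that mean can only shrink as more points are scanned.

```python
# Nested-loop top-n distance outliers with a pruning cutoff (illustrative).
import numpy as np

def top_n_outliers(X, k=5, n=10):
    N = len(X)
    top = []                  # (avg kNN distance, index), kept as the top-n
    cutoff = 0.0              # weakest score in the current top-n
    for i in range(N):
        knn = np.full(k, np.inf)              # k smallest distances so far
        pruned = False
        for j in np.random.permutation(N):    # random scan order helps pruning
            if j == i:
                continue
            d = np.linalg.norm(X[i] - X[j])
            if d < knn.max():
                knn[np.argmax(knn)] = d
            # once k neighbours are bounded, prune if i cannot beat the cutoff
            if len(top) == n and np.isfinite(knn).all() and knn.mean() <= cutoff:
                pruned = True
                break
        if not pruned:
            top.append((knn.mean(), i))
            top = sorted(top, reverse=True)[:n]
            cutoff = top[-1][0]
    return top
```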
Research for Multidimensional Linear Hashing on Heterogeneous Platforms
Computer Science. 2012, 39 (10): 157-159. 
Abstract PDF(356KB) ( 371 )   
Nowadays, multidimensional data is used in plenty of areas, but owing to the complexity of multidimensional data, computing efficiency is far from perfect. In order to speed up the query and manipulation of multidimensional data, a multidimensional linear hashing parallel computing solution on CPU/GPU heterogeneous platforms was proposed. Based on an extension of the traditional hash table data structure, it achieves fast creation and query. Experiments on different platforms were conducted. The experimental results show that the proposed solution is about 25 times and 38 times faster than the traditional solution on hash table creation and data query respectively, which reflects the advantage of the proposed solution.
Efficient FP-STREAM Algorithm Based on Vertical Compression Data Format
Computer Science. 2012, 39 (10): 160-163. 
Abstract PDF(630KB) ( 346 )   
Along with the sharp increase of information, mining frequent itemsets has gradually become a hot topic in recent years. FP-Stream is a classic algorithm for mining frequent itemsets at multiple time granularities. But its weakness is the contradiction between massive data and limited memory at a given time, which prevents the algorithm from being used in high-speed data stream mining. This article proposed an improved FP-Stream algorithm based on a vertical, Dif-bits compression data format.
Approximate Query and Results Ranking Approach Based on XML Twig Query Fragment Relaxation
Computer Science. 2012, 39 (10): 164-169. 
Abstract PDF(554KB) ( 331 )   
Based on XML twig query fragment relaxation, this paper proposed an approximate querying and results ranking approach to obtain approximate query results against XML documents. Our method gathers the query history to infer the user's preferences, which are used to calculate the importance of each query fragment of the twig query, and relaxes the original query according to the sequence of the fragments' importance. Based on the number of query fragments, we adopt different relaxation strategies: if the number is greater than 2, the original query is relaxed at the granularity of fragments; if the number is less than 2, it is relaxed at the granularity of query nodes, with different relaxations for numerical and non-numerical queries, to obtain the most relevant query results. Finally, the relevant query results are ranked by their degree of satisfaction of the original query and the user preferences. Our experiments show that the approximate querying and results ranking approach can efficiently meet the user's needs and preferences, with high recall and precision.
Integration Method for Named Entities in Dataspace
Computer Science. 2012, 39 (10): 170-173. 
Abstract PDF(443KB) ( 402 )   
A named entity integration model (NEIM) was proposed for Dataspace, together with integration methods for named entities from heterogeneous data sources. The named entity integration model describes the relations among data sources, named entities and entity descriptions, and supports queries from any of them to the other relevant information. The framework of named entity integration points out that the main tasks of integration are recognition of named entities and their information, entity integration and mapping, and entity resolution. The integration algorithm represents the integration methods for named entities and their information in heterogeneous data sources. In particular, for structured and semi-structured data, it constructs mapping rules, enabling the system to integrate continuously. The experiments validate the mapping rules.
Research on Petri Net Based OWL-S Semantic Matching Mechanism
Computer Science. 2012, 39 (10): 174-176. 
Abstract PDF(360KB) ( 331 )   
Traditional UDDI has no capability to process semantics or match on service attributes; as a result, it has a low recall rate and matching rate. Based on a multi-agent service matching framework, a Petri net based OWL-S semantic matching mechanism was proposed to deal with this shortcoming, in which the PNSDL description language is used to publish and request services. By virtue of the ability of Petri nets to describe Web services, the matching mechanism uses possibility and necessity measures to quantify the confidence level that a service can satisfy a request, so as to complete service matching. The matching mechanism has the advantages of powerful processing ability, multi-service modeling, high matching fidelity and a high matching rate.
Indoor Moving Objects Index Research Based on DR-tree
Computer Science. 2012, 39 (10): 177-181. 
Abstract PDF(425KB) ( 504 )   
For indexing the historical trajectories of moving objects, most schemes are based on outdoor space and are hard to apply directly to indoor space. Moreover, the object itself is not indexed as an independent dimension, so the efficiency of object-based queries is quite low. Thus, this paper proposed an index structure, DR-tree (Dual R-tree), which can index three dimensions: location, object and time. The scheme converts the three-dimensional index into two two-dimensional indexes by decoupling the location and object dimensions, and provides a query optimization method. The experimental results show that, compared with the RTR-tree, DR-tree can not only support efficient spatio-temporal queries, but also provide object-based trajectory queries.
Fast kNN Text Classification Algorithm Based on Area Division
Computer Science. 2012, 39 (10): 182-186. 
Abstract PDF(461KB) ( 430 )   
As a simple, effective and non-parametric classification algorithm, the kNN method has been widely used in text classification. In order to improve classification efficiency, we proposed a fast kNN text classification algorithm based on area division. We divided the training set into several parts based on their area distribution, and then, according to the relative positions between test patterns and those parts, easily found the k nearest neighbours of the test patterns in the training set. This sharply cuts down the amount of computation of the kNN algorithm. Mathematical reasoning and the experimental results both show that this algorithm significantly improves classification efficiency while keeping the same accuracy as the kNN classifier.
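An illustrative sketch of area division using a uniform 2-D grid and ring-by-ring candidate search; the grid layout and search order are assumptions, and without scanning one extra ring the result is approximate rather than exact kNN.

```python
# Grid-partitioned kNN sketch: index training points by cell, then search
# only the cells near the query. Assumes 2-D points and >= k training points.
import numpy as np
from collections import defaultdict

class GridKNN:
    def __init__(self, X, y, cell=1.0):
        self.X, self.y, self.cell = X, y, cell
        self.cells = defaultdict(list)               # cell coords -> point ids
        for i, x in enumerate(X):
            self.cells[tuple((x // cell).astype(int))].append(i)

    def query(self, q, k=5):
        c = tuple((q // self.cell).astype(int))
        ids, r = [], 0
        while len(ids) < k:                          # widen the search ring
            for dx in range(-r, r + 1):
                for dy in range(-r, r + 1):
                    if max(abs(dx), abs(dy)) == r:   # only the new ring
                        ids += self.cells.get((c[0] + dx, c[1] + dy), [])
            r += 1
        d = np.linalg.norm(self.X[ids] - q, axis=1)
        return [ids[i] for i in np.argsort(d)[:k]]
```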
Study on Stability of Clonal Feedback Optimization Algorithm Based on Feedback Mechanism
Computer Science. 2012, 39 (10): 187-189. 
Abstract PDF(340KB) ( 387 )   
The clonal selection algorithm is an evolutionary optimization algorithm based on the clonal selection theory of the immune system, but it is subject to antibody concentration effects, leading to poor stability. Based on the traditional clonal selection algorithm and consideration of antibody concentration and population diversity, we proposed a novel clonal feedback optimization algorithm. The proposed algorithm integrates an evolution feedback depth model and the idea of population survivability, to effectively improve the stability of the algorithm. Finally, the proposed algorithm was applied to task scheduling in grid computing, and achieved satisfactory results.
Fuzzy Neural Networks Based on IPSO for Traffic Flow Prediction
Computer Science. 2012, 39 (10): 190-192. 
Abstract PDF(350KB) ( 371 )   
In fuzzy-neural-network-based traffic flow prediction, the optimization of node parameters is critical. An improved particle swarm optimization (IPSO) method was used to optimize the fuzzy neural network parameters to improve the precision and speed of traffic prediction. Simulation results show that the method converges faster and predicts more accurately than the PSO and BP algorithms, and that the fuzzy neural network prediction model based on IPSO is an effective method for traffic flow prediction.
Auto-selection of Informative Gene for Multi-class Tumor Gene Expression Profiles
Computer Science. 2012, 39 (10): 193-197. 
Abstract PDF(454KB) ( 347 )   
In microarray analysis, the selection of informative genes is an essential issue for tissue classification and successful treatment, owing to its ability to improve accuracy and decrease computational complexity. The ability to successfully distinguish tumor tissue from normal tissue using gene expression data is an important aspect of this novel approach to cancer classification. In this paper, a non-parametric method for autonomous selection of informative genes was proposed for processing multi-class tumor gene expression profiles, containing 218 tumor samples spanning 14 common tumor types as well as 90 normal tissue samples, to find a small subset of genes for distinguishing tumor from normal tissues. First, the randomness of a decision sequence was defined to measure gene importance based on the non-parametric method and a filter algorithm. Then correlation information entropy was used to eliminate redundant genes and select informative feature genes. As a result, 30 informative genes were selected as markers for distinguishing different tumor tissues from their normal counterparts. Simulation experiment results show that the selected genes are very efficient for distinguishing tumor from normal tissues. In the end, several methods for informative gene selection were analyzed and compared to validate the feasibility and efficiency of the proposed method.
MB-SinglePass:Microblog Topic Detection Based on Combined Similarity
Computer Science. 2012, 39 (10): 198-202. 
Abstract PDF(441KB) ( 482 )   
Topic detection achieves quite good results in traditional media research. This paper discussed the refinement and performance evaluation of topic detection techniques for new media such as microblogs, and proposed the MB-SinglePass topic detection algorithm on the basis of structured information such as the attention and fan relationships between contacts and the inner connections, such as forwarding and commenting, between posts. Besides considering the above microblog characteristics, MB-SinglePass introduces a feature extension technique to enrich the feature information. At the same time, the paper used a combined similarity, aiming at the shortage of singly using the Jaccard similarity coefficient, cosine similarity or semantic similarity. Compared with traditional algorithms, MB-SinglePass shows better performance on an actual Sina microblog dataset. Additionally, experiments on the similarity strategy reveal better results using the combined similarity than any single similarity.
Influence of Chinese and Western Thinking Modes on Sentiment Analysis
Computer Science. 2012, 39 (10): 203-208. 
Abstract PDF(633KB) ( 391 )   
Sentiment analysis is an important research domain of affective computing, and researchers focus on context in this field. Because the thinking mode influences the formation of language, this paper studied and compared Chinese and Western thinking modes. The former features a "spiral graphic mode", "concreteness" and a "scattered view", while the latter features a "straight line mode", "abstractness" and a "focus view". A position-related method was proposed to quantify the "spiral graphic mode" and "straight line mode"; then, according to "concreteness" and "abstractness", an approach was implemented based on part of speech and grammar. Finally a view-window method was applied to analyze the "scattered view" and "focus view". The experimental results show that sentiment analysis incorporating thinking-mode factors performs better than that without.
Method for Modeling the Distortions of Transmembrane Helices in G-protein Coupled Receptors
Computer Science. 2012, 39 (10): 209-213. 
Abstract PDF(467KB) ( 440 )   
The transmembrane helix is the main feature of a GPCR, and the accuracy of single-helix prediction directly affects the prediction of the entire GPCR structure. Predicting the distortion of a GPCR helix is a challenging problem. The distortion is represented by the kink residue position and the angle between the two helix fragments around the distortion position. Based on all currently known GPCR helix structures, the helices were clustered according to helix sequence similarity, and the bend angles in each cluster were modeled by a continuous von Mises probability distribution. The modeled GPCR TM kink angles were tested using regression and forecast testing methods. Based on this article's model, only fifteen samplings are needed to obtain a result close to the native TM distortion angle, which will help improve TM-helix structure prediction.
Improved Genetic Algorithm with Adaptive Convergence Populations
Computer Science. 2012, 39 (10): 214-217. 
Abstract PDF(362KB) ( 879 )   
Premature convergence seriously affects the performance of genetic algorithms. At present, most improved algorithms focus on improving convergence accuracy and speed at the expense of time complexity, which limits the application of genetic algorithms in industrial control systems. For this situation, this paper presented a new improved genetic algorithm with adaptive convergence populations. The algorithm optimizes performance by increasing the genetic quality of populations while strictly controlling algorithm complexity. Simulation results show that the new algorithm can significantly improve convergence accuracy and speed without increasing time complexity.
Global Optimization Problem of Continuous Function Based on Distribution Estimation Algorithms
Computer Science. 2012, 39 (10): 218-219. 
Abstract PDF(271KB) ( 918 )   
Distribution estimation algorithms use statistical learning to build a probability model from a macro point of view, use the probability model to describe the distribution of solutions in the solution space, and obtain superior individuals by evolutionary computation. At present, discrete distribution estimation algorithms are already quite mature, but research progress on continuous distribution estimation algorithms is slow. This article used the idea of uniform distribution to narrow the sampling field for continuous optimization problems, and designed a new distribution estimation algorithm. Experimental data show that this kind of distribution estimation algorithm is effective.
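A minimal sketch of the idea, assuming the uniform model is re-fit to the bounding box of the selected elite individuals each generation (the paper's exact narrowing rule is not given in the abstract):

```python
# Continuous EDA sketch: sample from a uniform model, select elites, and
# narrow the sampling field to their bounding box. Shrink rule, population
# and elite sizes are illustrative assumptions.
import numpy as np

def uniform_eda(f, lo, hi, pop=100, elite=20, iters=50):
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    for _ in range(iters):
        X = np.random.uniform(lo, hi, size=(pop, len(lo)))  # sample the model
        best = X[np.argsort([f(x) for x in X])[:elite]]     # select elites
        # re-estimate the uniform model as the elites' bounding box
        lo, hi = best.min(axis=0), best.max(axis=0)
    return best[0], f(best[0])

# usage: minimize the sphere function on [-5, 5]^2
x_star, f_star = uniform_eda(lambda x: float(np.sum(x**2)), [-5, -5], [5, 5])
```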
Automatic Analysis of Comments in Commentary Literatures
Computer Science. 2012, 39 (10): 220-223. 
Abstract PDF(363KB) ( 499 )   
Commentary literature contains a wealth of knowledge and has semi-structured characteristics. This paper studied sentence alignment between classical originals and their commentaries, and the automatic analysis of comments in commentary literature. The study can provide a convenient way to build ancient-text corpora, and a more intelligent retrieval mode for language researchers.
Number of Simply Separable and Full Symmetric Relations in Partial Multiple-valued Logic
Computer Science. 2012, 39 (10): 224-226. 
Abstract PDF(202KB) ( 374 )   
According to the completeness theory of partial multiple-valued logic, some properties of simply separable function sets and full symmetric function sets were discussed, and formulas for the number of simply separable relations and full symmetric relations were established.
Characterizing Hierarchies of Neighborhood Systems via Rough Sets Approach
Computer Science. 2012, 39 (10): 227-230. 
Abstract PDF(332KB) ( 321 )   
In neighborhood systems, by analyzing the inclusion and intersection relations between neighborhoods and the target, two different types of neighborhood-system-based rough sets were presented, and two sets of properties describing the monotonic varieties of levels of neighborhood systems were proposed. Moreover, two different quasi-orderings were proposed to express the coarser-finer relationships between neighborhood systems. It is proven that there are corresponding relationships between the two quasi-orderings and the two sets of properties, respectively.
Research of Chinese Noun Phrase Anaphora Resolution:A SVM-based Approach
Computer Science. 2012, 39 (10): 231-234. 
Abstract PDF(360KB) ( 354 )   
Coreference resolution is an important subtask in natural language processing systems. In natural language, to keep expression clear and concise, it is common that two or more words, sentences or events with the same meaning are replaced by different words. Compared to people, it is difficult for computers to understand these phenomena, so more and more researchers focus on noun phrase coreference resolution. A great deal of research has been done on this task for English; for Chinese, because research on coreference resolution started late, much less work has been done. We presented a Chinese noun phrase coreference resolution system based on an SVM approach and gave the details of the platform in the paper. We adopted three tools to evaluate the performance of the platform. Experiments on the Chinese portion of OntoNotes 3.0 show that the platform achieves good performance.
Complete Algorithm for Covering Reduction Based on Information Quantity
Computer Science. 2012, 39 (10): 235-239. 
Abstract PDF(509KB) ( 325 )   
Covering generalized rough sets are an important extension of Pawlak's rough sets. As with Pawlak's rough sets, reduction is one of the key issues in covering generalized rough sets. The information quantity of a family of coverings was introduced to characterize the consistent set, reduct and core of a family of coverings. A novel approach for measuring the significance of coverings was presented. Then, a complete heuristic algorithm was proposed to reduce a family of coverings. In the algorithm, unimportant coverings are gradually removed from the search space to effectively avoid repeated computation of their significance. Finally, a real example of evaluating houses was used to illustrate the feasibility and effectiveness of the proposed algorithm.
Geo-serviceChain Model Expression by Directed Graph and Verification
Computer Science. 2012, 39 (10): 240-244. 
Abstract PDF(514KB) ( 365 )   
Because the generic languages for Web service composition are unsuitable for the visual expression of geographic information processing and for geo-spatial domain users, a Geo-serviceChain model based on directed graphs was established, and it was defined and designed in detail in terms of model elements, constraints and control modes. We studied the Geo-serviceChain model from the three aspects of grammatical correctness, structural correctness and semantic correctness, and designed the verification algorithm and implementation flow combined with graph reduction rules by analyzing the structure of the Geo-serviceChain model. A practical case analysis proves the effectiveness and feasibility of the proposed algorithm and implementation flow.
Research on Shortest Path Search of Improved Dijkstra Algorithm in GIS Navigation Application
Computer Science. 2012, 39 (10): 245-247. 
Abstract PDF(378KB) ( 495 )   
A key issue in GIS navigation applications is shortest-path search efficiency, on which electronic navigation systems place high demands. With urban development, traffic lines increase, and GIS navigation systems based on the traditional Dijkstra algorithm cannot adapt to increasingly complex traffic networks: the shortest-path search efficiency is too low. Considering the spatial distribution characteristics of GIS, this paper proposed an improved Dijkstra algorithm to solve the shortest-path search problem in GIS navigation. The improved algorithm not only avoids the traditional Dijkstra algorithm's node-by-node traversal, but also narrows the search range according to direction priority, greatly reducing the search workload, and improves shortest-path search efficiency by changing the data structure used to store search nodes. Experiments indicate that, compared with the traditional method, the improved algorithm can effectively improve shortest-path search efficiency, satisfies the efficiency requirements of electronic navigation systems, and obtains satisfactory results.
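A sketch of priority-queue shortest-path search; the optional heuristic term h biases expansion toward the target to illustrate the "narrowed search direction" idea, while h = 0 recovers plain Dijkstra. The paper's exact narrowing rule and storage changes are not reproduced.

```python
import heapq

def shortest_path(graph, src, dst, h=lambda v: 0.0):
    """graph: {node: [(neighbour, weight), ...]}. With h = 0 this is plain
    Dijkstra; an admissible h biases the search toward dst (A*-style)."""
    dist = {src: 0.0}
    parent = {}
    heap = [(h(src), src)]
    while heap:
        _, u = heapq.heappop(heap)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):    # relax edge (u, v)
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd + h(v), v))
    if dst not in dist:
        return None, float("inf")                 # unreachable
    path, v = [dst], dst                          # reconstruct the path
    while v != src:
        v = parent[v]
        path.append(v)
    return list(reversed(path)), dist[dst]
```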
Study on the Results of Triple I Method
Computer Science. 2012, 39 (10): 248-250. 
Abstract PDF(245KB) ( 341 )   
Zadeh's CRI method is widely applied in fuzzy reasoning, but it has been criticized as too complex and not consistent. To alleviate the CRI method's drawbacks, the Triple I method was proposed by Professor Wang Guojun. Now, the Triple I method is widely recognized as one of the most important fuzzy reasoning methods, and many existing fuzzy inference methods are based on it. Despite the Triple I method's success in various research fields, it still has no effect in fuzzy control. This paper analyzed the Triple I method. The results show that the Triple I method has some drawbacks and cannot be applied in control.
Research of Simulation Reasoning Algorithm in Causality Diagram Based on Sampling
Computer Science. 2012, 39 (10): 251-253. 
Abstract PDF(318KB) ( 475 )   
Because reasoning in causality diagrams is an NP problem, it is inconvenient to popularize and apply. This paper presented two simulation reasoning algorithms for causality diagrams: acceptance-rejection (A-R) sampling and importance sampling. Their effective application in fault diagnosis shows that the method is feasible.
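A minimal acceptance-rejection sampling sketch; the target and proposal densities below are illustrative stand-ins, not the causality-diagram model itself.

```python
# Acceptance-rejection (A-R) sampling: draw from target density p using a
# proposal q with p(x) <= M * q(x); accept x with probability p(x)/(M q(x)).
import math, random

def ar_sample(p, q_sample, q_pdf, M):
    while True:
        x = q_sample()                       # propose from q
        if random.random() <= p(x) / (M * q_pdf(x)):
            return x                         # accept

# usage: sample a truncated standard normal on [0, 1) with a uniform proposal
p = lambda x: math.exp(-x * x / 2)           # unnormalized target, max 1 at x=0
samples = [ar_sample(p, random.random, lambda x: 1.0, M=1.0) for _ in range(1000)]
```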
Multi-factor Performance Evaluation Model of Computing System Based on User Feeling
Computer Science. 2012, 39 (10): 254-257. 
Abstract PDF(389KB) ( 351 )   
Traditional performance monitoring systems monitor system resources to reflect and evaluate the system's running state indirectly. But this differs considerably from the system performance perceived by users. For end-users, the direct experience of system performance is the response time of a request, which is affected by many factors. Based on gene expression programming theory, a gene expression programming (GEP) algorithm was proposed in this paper. Using the GEP algorithm, a mathematical model (GEP model) relating response time to a variety of system resources was established to evaluate computing system performance and predict performance changes. Finally, for a specific simulation environment and sampling data, the algorithm was used to obtain a multivariate nonlinear model of response time, and the results show that the model can predict system performance well.
Computation Tree Logic CTL* Based on Possibility Measure and Possibilistic Bisimulation
Computer Science. 2012, 39 (10): 258-263. 
Abstract PDF(477KB) ( 593 )   
The notion of computation tree logic CTL* based on possibility measures (PoCTL* for short) was proposed in this paper. Then possibilistic bisimulation on possibilistic Kripke structures was defined and its properties were discussed. Finally, the quotient possibilistic Kripke structure and the related construction were studied in detail.
ABox Knowledge Update Based on ALCO@
Computer Science. 2012, 39 (10): 264-267. 
Abstract PDF(360KB) ( 335 )   
This paper explored knowledge update based on the description logic ALCO@. First, it introduced the syntax and semantics of ALCO@; secondly, it proposed the concepts of determined set and conflict set, and then gave an algorithm that obtains the determined set and conflict set from the original knowledge set according to the Tableau algorithm of description logic. In the update algorithm, we deleted the assertions of the conflict set, added new assertions to the original knowledge set, and modified other assertions affected by the new assertions, so that the result satisfies the open world assumption.
Quick Reduction Algorithm for High-dimensional Data Sets Based on Neighborhood Rough Set Model
Computer Science. 2012, 39 (10): 268-271. 
Abstract PDF(423KB) ( 428 )   
Following the idea of the particle swarm optimization algorithm, a new algorithm (SPRA) for finding an optimal attribute reduction of a high-dimensional neighborhood decision table was proposed. Using an intrinsic dimension analysis method, and taking the estimated intrinsic dimensionality as the SPRA algorithm's initialization parameter, a quick reduction algorithm (QSPRA) was proposed to deal with high-dimensional data sets. The algorithm's validity was verified on five high-dimensional data sets from UCI. In the experimental analysis section, the influence of population size and the number of iterations on the reduction result was also discussed. Moreover, the experiments also show that it is impractical to handle high-dimensional data sets with kernel-based heuristic algorithms.
Research on Parsing Algorithm of EGG Graph Grammar
Computer Science. 2012, 39 (10): 272-277. 
Abstract PDF(604KB) ( 352 )   
RelatedCitation | Metrics
EGG is an edge-based context-sensitive graph grammar formalism, in which the parsing (reduction) algorithm is a very important part. After a brief introduction to EGG, this paper presented the design of a parsing algorithm, including a subgraph matching algorithm, a subgraph substitution algorithm, and their computational complexity analyses. To show how EGG is applied to define graph languages and, especially, how graphs are parsed with the designed algorithms, the paper took the program flow diagram as an example, presenting the formal definition of its productions and the parsing procedure for a given flow diagram. Possible ways to reduce the complexity of the algorithms were also discussed as future research directions.
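The overall shape of such a reduction-based parse can be caricatured as follows (a runnable Python toy, far simpler than EGG: a flow graph is an edge set, and a single "sequence" production rewrites a -> b -> c, with b of in/out degree 1, into a -> c; EGG's productions, matching, and edge contexts are much richer):

```python
# Grammar-driven reduction loop: repeatedly find a redex and rewrite it,
# until the axiom graph remains (or no production applies).

def find_seq_redex(edges):
    nodes = {n for e in edges for n in e}
    for b in nodes:
        ins = [e for e in edges if e[1] == b]
        outs = [e for e in edges if e[0] == b]
        if len(ins) == 1 and len(outs) == 1 and b not in ("entry", "exit"):
            return ins[0], outs[0]
    return None

def reduce_flowgraph(edges):
    edges = set(edges)
    while (redex := find_seq_redex(edges)) is not None:
        (a, b), (_, c) = redex
        edges -= {(a, b), (b, c)}
        edges.add((a, c))
    return edges

# A straight-line program flow diagram reduces to the axiom entry -> exit.
print(reduce_flowgraph({("entry", "s1"), ("s1", "s2"), ("s2", "exit")}))
```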
Rule-based Identification of Chinese Zero Anaphora
Computer Science. 2012, 39 (10): 278-281. 
Abstract PDF(341KB) ( 442 )   
RelatedCitation | Metrics
A rule-based approach for Chinese zero anaphor detection was proposed. Given a parse tree, the smallest IP subtree covering the current predicate is captured, and rules based on this IP subtree detect whether a Chinese zero anaphor exists. This paper also systematically evaluated the rule-based method on the OntoNotes corpus. Using golden parse trees, our method achieves an F-measure of 82.45; with an automatic parser, the F-measure is 63.84. The experimental results show that our method is very effective for Chinese zero anaphor detection.
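The "smallest IP subtree covering the predicate" step can be sketched as follows (a Python toy on a nested-list parse tree; the labels and example sentence are our own, not from the OntoNotes data):

```python
# Parse trees encoded as nested lists [label, child, ...]; leaves are strings.

def leaves(tree):
    if isinstance(tree, str):
        return [tree]
    return [leaf for child in tree[1:] for leaf in leaves(child)]

def smallest_cover(tree, label, target):
    """Smallest subtree labeled `label` whose leaves contain `target`."""
    if isinstance(tree, str):
        return None
    for child in tree[1:]:
        found = smallest_cover(child, label, target)
        if found is not None:
            return found          # a deeper (smaller) covering subtree wins
    if tree[0] == label and target in leaves(tree):
        return tree
    return None

# "Zhangsan shuo [IP (zero subject) chifan]" -- the inner IP hosts the zero anaphor.
tree = ["IP", ["NP", "Zhangsan"], ["VP", "shuo", ["IP", ["VP", "chifan"]]]]
print(smallest_cover(tree, "IP", "chifan"))   # -> ['IP', ['VP', 'chifan']]
```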
Ridge Based 3D Fingerprint Reconstruction Method
Computer Science. 2012, 39 (10): 282-285. 
Abstract PDF(389KB) ( 498 )   
RelatedCitation | Metrics
Three-dimensional (3D) fingerprint recognition based on a 3D fingerprint model can technically solve the problems suffered by traditional touch-based fingerprint systems, such as filth, skin deformation, and latent fingerprints left on a dirty sensor. Reconstructing the 3D fingerprint model from multiple views is the key procedure in such a recognition system. This paper presented a new ridge-based method for 3D fingerprint reconstruction. Unlike existing algorithms, which focus on reconstructing the finger surface, the presented method directly reconstructs the ridge pattern and minutia pattern in 3D space as the 3D fingerprint model, and it is expected to be more suitable for subsequent recognition steps such as feature extraction.
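The geometric core of any such multi-view reconstruction is triangulating a corresponding ridge or minutia point from calibrated views; a minimal linear (DLT) triangulation sketch, with made-up camera matrices and correspondences, is shown below.

```python
import numpy as np

# Linear (DLT) triangulation of one point seen in two calibrated views.

def triangulate(P1, P2, x1, x2):
    # Each view contributes two rows of A from the relation x x (P X) = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean 3D point

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])              # reference camera
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])  # translated camera
X = np.array([0.2, 0.1, 5.0, 1.0])                         # ground-truth point
x1 = (P1 @ X)[:2] / (P1 @ X)[2]
x2 = (P2 @ X)[:2] / (P2 @ X)[2]
print(triangulate(P1, P2, x1, x2))   # ~ [0.2, 0.1, 5.0]
```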
Contour Detection by Thresholding the Gradient Image in Spatial-frequency Domain
Computer Science. 2012, 39 (10): 286-289. 
Abstract PDF(873KB) ( 361 )   
RelatedCitation | Metrics
Standard edge detectors compute binary edge maps by thresholding the gradient image only in the spatial domain. When applied to natural images, their results contain many spurious edges, because the gradient magnitudes of spurious edges are often stronger than those of object contours. To improve contour detection in natural scenes, we proposed a novel contour detection method that thresholds the gradient image in the spatial-frequency domain. Taking the Canny edge detector as an example, we used natural images with associated subjectively defined desired contour maps to evaluate the proposed method. Experimental results show that the proposed method can effectively improve the contour detection performance of the Canny edge detector.
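One plausible reading of "thresholding in the spatial-frequency domain" is sketched below (our own interpretation, not necessarily the paper's scheme; the cutoff and threshold values are arbitrary): the gradient magnitude is low-pass filtered in the frequency domain to attenuate high-frequency texture responses before the usual magnitude threshold.

```python
import numpy as np

# Frequency-domain smoothing of the gradient magnitude before thresholding.

def contour_map(img, cutoff=0.15, thresh=0.2):
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    # Low-pass the gradient magnitude with an ideal circular filter.
    F = np.fft.fftshift(np.fft.fft2(mag))
    h, w = mag.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    mask = np.hypot(yy / (h / 2), xx / (w / 2)) <= cutoff
    smooth = np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
    return smooth > thresh * smooth.max()

img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0   # a square "object"
print(contour_map(img).sum(), "contour pixels")
```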
Moving Objects Segmentation Method Combining DWT and Background Reconstruction
Computer Science. 2012, 39 (10): 290-293. 
Abstract PDF(847KB) ( 376 )   
RelatedCitation | Metrics
To address the time consumption and large data volume of video processing, and based on the hypothesis that the background intensity of a pixel appears in an image sequence with maximum probability, a novel method of moving object segmentation was presented. The approach first extracts the approximate components of the video image sequence using the discrete wavelet transform (DWT), then reconstructs the background by selecting the most frequently occurring pixel values within an interval, according to a pixel clustering scheme that combines dual thresholds with merging of similar classes, and finally verifies the accuracy of the background reconstruction using image-matching evaluation criteria. Experimental results demonstrate that the proposed method extracts a good background and achieves efficient and complete moving object segmentation.
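The two key steps can be sketched as follows (our own simplification: a Haar approximation via 2x2 block averaging stands in for the DWT, and a per-pixel mode over time stands in for the dual-threshold clustering with class merging):

```python
import numpy as np

def haar_approx(frame):
    # Haar approximation subband: average each 2x2 block.
    h, w = frame.shape
    return frame[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def reconstruct_background(frames, n_bins=16):
    approx = np.stack([haar_approx(f) for f in frames])        # (T, H/2, W/2)
    binned = (approx / approx.max() * (n_bins - 1)).astype(int)
    background = np.zeros(approx.shape[1:])
    for i in range(background.shape[0]):
        for j in range(background.shape[1]):
            values, counts = np.unique(binned[:, i, j], return_counts=True)
            mode_bin = values[counts.argmax()]                 # most probable bin
            background[i, j] = approx[binned[:, i, j] == mode_bin, i, j].mean()
    return background

rng = np.random.default_rng(0)
frames = [np.full((32, 32), 100.0) + rng.normal(0, 2, (32, 32)) for _ in range(20)]
frames[3][8:16, 8:16] = 200.0              # a transient "moving object"
print(reconstruct_background(frames)[2:4, 2:4].round())   # ~100 everywhere
```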
Novel Spectral Similarity Measurement Based Spectral Clustering Algorithm in Hyperspectral Imagery
Computer Science. 2012, 39 (10): 294-299. 
Abstract PDF(1061KB) ( 451 )   
RelatedCitation | Metrics
Because the Gaussian radial basis function (RBF) is based on the Euclidean distance between two spectral vectors, it is insensitive to variations in the spectral curves of a material, which degrades the performance of RBF-based spectral clustering for hyperspectral imagery. To solve this problem, starting from the description of spectral curve similarity, a novel spectral similarity measurement based on the spectral angle cosine was proposed, and the measurement was used to build the affinity matrix used by spectral clustering algorithms. Finally, experiments were carried out on several hyperspectral data sets, and the results demonstrate the validity of the proposed method.
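The named similarity is straightforward to sketch: the affinity between two spectra is the cosine of their spectral angle, which is invariant to the per-pixel brightness scaling that defeats a Euclidean RBF. The synthetic data below is a stand-in for real hyperspectral pixels, and the clustering call uses scikit-learn's precomputed-affinity mode.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def spectral_angle_cosine_affinity(X):
    # Cosine of the angle between each pair of spectra, clipped to [0, 1].
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return np.clip(Xn @ Xn.T, 0.0, 1.0)

rng = np.random.default_rng(0)
base = rng.random((3, 50))                                 # three "materials"
X = np.repeat(base, 30, axis=0) * rng.uniform(0.5, 1.5, (90, 1))  # brightness variation
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(
    spectral_angle_cosine_affinity(X))
print(labels)   # should group the 90 pixels back into three materials
```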
Research on Maize Disease Diagnosis Method Based on Image Processing Technique and BP Neural Network Algorithm
Computer Science. 2012, 39 (10): 300-302. 
Abstract PDF(232KB) ( 349 )   
RelatedCitation | Metrics
To diagnose and identify maize diseases quickly, take preventive measures in time and improve the diagnostic level for maize diseases, this article introduced image processing techniques and the BP neural network algorithm into the identification and diagnosis of maize diseases. Experiments show that the conclusions obtained by using the image-processing-based disease recognition model on collected disease samples match the real conclusions in the field and meet the practical needs of agricultural production. The technology provides an effective method for ensuring the yield and quality of maize.
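The classifier component can be sketched as a minimal one-hidden-layer BP network trained by gradient descent; the features and labels below are synthetic stand-ins for the paper's lesion features.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 4))                       # e.g. lesion color/shape features
y = (X[:, 0] + X[:, 1] > 1.0).astype(float).reshape(-1, 1)   # two classes

W1, b1 = rng.normal(0, 0.5, (4, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                   # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)        # backpropagated output error
    d_h = (d_out @ W2.T) * h * (1 - h)         # backpropagated hidden error
    W2 -= 0.5 * h.T @ d_out / len(X); b2 -= 0.5 * d_out.mean(axis=0)
    W1 -= 0.5 * X.T @ d_h / len(X);   b1 -= 0.5 * d_h.mean(axis=0)

print("train accuracy:", ((out > 0.5) == y).mean())   # should be near 1.0
```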
Quick Data Collecting Algorithm Based on Ring-tree with Auto-reset
Computer Science. 2012, 39 (10): 303-307. 
Abstract PDF(477KB) ( 355 )   
RelatedCitation | Metrics
Acquiring the internal state-transition functions of a sequential PLD is the main task of PLD reverse analysis and the foundation of security vulnerability analysis. After analyzing existing data collecting algorithms, an auto-reset mechanism was proposed for collecting PLD functional corpus data on the basis of the ring-tree data collecting algorithm. For the dynamically changing driver paths of the ring-tree algorithm, a novel dynamic shortest path algorithm was proposed, which reduces the number of state transitions and improves the efficiency of data collection. Experimental results show that the quick data collecting algorithm based on the ring-tree with auto-reset not only collects functional corpus data correctly, but also improves data collection efficiency by about 9% over existing algorithms.
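The driver-path idea can be sketched as a shortest-path search over the state-transition graph (a static Dijkstra toy under our own assumptions; the paper's dynamic variant additionally updates these paths as newly discovered transitions are added, and the machine below is invented):

```python
import heapq

# transitions: (state, input) -> next_state. Find the input sequence that
# drives the device from `start` to `target` with the fewest transitions.

def shortest_drive(transitions, start, target):
    dist, frontier = {start: 0}, [(0, start, [])]
    while frontier:
        cost, state, path = heapq.heappop(frontier)
        if state == target:
            return path
        for (s, inp), nxt in transitions.items():
            if s == state and (nxt not in dist or cost + 1 < dist[nxt]):
                dist[nxt] = cost + 1
                heapq.heappush(frontier, (cost + 1, nxt, path + [inp]))
    return None

fsm = {("S0", 0): "S0", ("S0", 1): "S1", ("S1", 0): "S2", ("S2", 1): "S0"}
print(shortest_drive(fsm, "S0", "S2"))   # -> [1, 0]
```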
Analysis and Detection of UEFI Bootkit
Computer Science. 2012, 39 (10): 308-312. 
Abstract PDF(400KB) ( 976 )   
RelatedCitation | Metrics
This paper analyzed the working mechanism and key technology of the UEFI Bootkit, expanded the definition of the Trojan accordingly, illustrated the differences in hiding technology between the UEFI Bootkit and the Trojan, built a formal model of UEFI Bootkit cooperative concealment, showed an application of the model, and argued that detecting a Bootkit before the operating system kernel starts is more effective than detecting it afterwards. We designed and implemented a UEFI Bootkit detection system that works before the operating system kernel starts. Practical tests show that the detection system is effective and accurate.
Research of Job Scheduling Strategy of High-performance Computer Based on Adaptive Power Management
Computer Science. 2012, 39 (10): 313-317. 
Abstract PDF(929KB) ( 344 )   
RelatedCitation | Metrics
The job scheduling system is a core component of a high-performance computer, and the goal of job scheduling is to obtain the lowest total power consumption of all tasks under the premise of meeting performance requirements. We presented a job scheduling strategy based on adaptive power management. The strategy is based on a genetic algorithm and takes the performance/power ratio of the job queue as the scheduling factor. Compared with traditional job scheduling algorithms, it largely increases the system's energy efficiency while maintaining resource utilization and job throughput, and it decreases resource fragmentation and the number of pending jobs. The experiments show that this strategy can effectively improve productivity and reduce energy consumption by about 9% compared with the traditional strategy.
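A toy version of the scheduling factor is sketched below (the jobs, ratios, and GA parameters are illustrative assumptions, not from the paper): fitness sums each job's performance/power ratio weighted by queue position, so the genetic algorithm pushes energy-efficient jobs to the front of the queue.

```python
import random

jobs = {"j1": 2.0, "j2": 0.5, "j3": 1.5, "j4": 0.8}   # perf/power ratios

def fitness(order):
    # Earlier slots get higher weight, favoring efficient jobs up front.
    return sum(jobs[j] / (i + 1) for i, j in enumerate(order))

def evolve(generations=100, pop_size=20):
    pop = [random.sample(list(jobs), len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = random.sample(range(len(child)), 2)   # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())   # typically converges to ['j1', 'j3', 'j4', 'j2']
```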