Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 39 Issue 7, 2012
  
Multi-core Parallel Computational Model Based on Horizontal Locality
Computer Science. 2012, 39 (7): 1-6. 
Almost all modern CPUs are multi-core processors with a shared on-chip cache. A number of models have been proposed for predicting shared cache contention, but few of them consider the influence of cache sharing among tasks. This paper introduced horizontal locality and the task shared ratio, and then proposed a parallel computational model for multi-core architectures, which can be used in parallel algorithm design, compilers, parallel programming models and runtime systems.
Code Execution Security Mechanism for Open Cloud & Client Computing
Computer Science. 2012, 39 (7): 7-10. 
Cloud & client computing can fully aggregate the computing resources of network servers and edge nodes of the Internet to gain greater benefits. However, deploying tasks to terminal nodes brings corresponding security risks. The behavior of terminal nodes belonging to different users is clearly not reliable, which means that computing security is difficult to guarantee. One of these risks is that a terminal node working as a task executor may tamper with the program or data of the task and return a fake result, or pry into code and data with privacy requirements. This paper presented a new code protection mechanism based on an encryption function with a verification code, meeting both integrity and privacy requirements, which makes it possible to effectively verify the correctness of returned results and to keep the code from being spied on. To further improve the success rate of task execution and reduce job cycle time, tasks ought to be distributed for execution to nodes with good reputations and high task success rates. This paper proposed a credibility evaluation of nodes, described the working procedure of the code protection mechanism, and gave a detailed analysis and verification of the security of the system.
Network Security Situation Awareness System Based on Knowledge Discovery
Computer Science. 2012, 39 (7): 11-17. 
Network security administrators need to obtain and analyze the network security situation for management, maintenance, and planning purposes. The complexity and diversity of security alert data on modern networks, however, make precise analysis and evaluation of the network security situation extremely difficult. We summarized the research progress and open problems of network security situation awareness, and proposed a network security situation modeling and generation framework based on knowledge discovery. Then, we designed and implemented a network security situation awareness system (NetSSA) based on this framework. NetSSA consists of the modeling and the generation of the network security situation. The purpose of modeling is to construct a formal model of network security situation measurement based on D-S evidence theory, and to support the general process of fusing and analyzing security alert events collected from security situation sensors. The network security situation is generated by extracting frequent patterns and sequential patterns from the network security situation dataset with knowledge discovery methods, transforming these patterns into correlation rules of the network security situation, and finally automatically constructing the network security situation graph. The experimental results show that the system supports accurate modeling and effective generation of the network security situation.
Overview of Statistical Clustering Models
Computer Science. 2012, 39 (7): 18-24. 
Clustering analysis is widely applied in engineering fields such as biological sequence analysis, image segmentation and text analysis. Many clustering methods now exist, and statistical-learning-based methods constitute one class of them. This paper started from FCM and introduced classical methods such as the potential and mountain functions and the entropy method, then analyzed their properties and applicability. Moreover, we also introduced state-of-the-art clustering techniques such as kernel clustering, spectral clustering and Gaussian-mixture-model-based clustering, described their solving processes and analyzed their properties and computational complexity. Finally, this paper presented several research directions.
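As a concrete reference point for the classical methods surveyed above, the following is a minimal sketch of fuzzy c-means (FCM), the algorithm the survey starts from; the toy data, the number of clusters and the fuzzifier m=2 are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, eps=1e-5, seed=0):
    """Minimal fuzzy c-means: alternate membership and centroid updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships, rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]        # weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))      # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return centers, U

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centers, U = fcm(X, c=2)
print(centers)
```

The other methods the survey covers (mountain/potential functions, entropy, kernel and spectral clustering, GMM) essentially replace the distance computation or the membership update in this loop.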
Electronic Warfare Ontology and its Application in Simulation Systems
Computer Science. 2012, 39 (7): 25-28. 
Based on an analysis of electronic warfare systems and their engagement processes, a method for organizing objects, classes and types into a hierarchy was introduced, and the elements of an electronic warfare system, such as entities, states, attributes and relations, were generalized. An ontology for electronic warfare was then brought forward. Finally, the ontology was applied to an example of a ship confronting an approaching missile, in which a general formalization of electronic warfare simulation models was studied. The ontology helps to ensure the basic attributes in electronic warfare and to construct electronic warfare models in advanced simulation systems, which are the basis for building combined distributed electronic warfare simulation systems.
Performance Analysis of GSM Based on Labeled Stochastic Petri Net
Computer Science. 2012, 39 (7): 29-31. 
The GSM mobile communication network is currently the public wireless digital transmission system with the most extensive coverage, the highest reliability, the largest capacity and strong confidentiality. A GSM communication system realizes data transmission between the vehicle unit and the monitoring center, where the performance of the vehicle positioning system plays a very important role. Labeled stochastic Petri nets (LSPN) are capable of supporting theoretical analysis. To study the working mode of the GSM communication module and analyze the performance of the mobile communication network, a preliminary exploration with examples using real-time UML state machines and LSPN was presented.
Research and Design for the Modeling of Simulation of CPS
Computer Science. 2012, 39 (7): 32-35. 
The simulation and modeling of CPS are of great significance in the development of CPS systems: they not only help testing and validation in the process of building a system, but are also an important part of model-driven development. A cyber-physical system can, in essence, be divided into a physical entity and a computation entity. The physical entity can be constructed with a dynamic, continuous, time-based motion-state behavioral model, and a finite state machine behavioral model can be used to construct the computation entity. This paper extended these two behavioral models through time-oriented state refinement and used the extended models to complete the simulation and modeling of CPS. In addition, it discussed the ability to build the computation entity with the Unified Modeling Language and to construct the physical entity with Simulink/RTW tools, and proposed an approach for integrating the heterogeneous models within a UML framework.
Research on Implementation of Dynamic Adaptive Real-time Middleware Based on DDS
Computer Science. 2012, 39 (7): 36-38. 
An adaptive DDS middleware was implemented by improving a DCPS DDS middleware in data service discovery and fault tolerance. It achieves adaptability by replacing the centralized information store model with a peer-to-peer model and applying the Chord protocol for data publication and subscription. Based on the traditional heartbeat model, an accelerated push-pull heartbeat model was proposed to enhance the efficiency of node failure detection. Finally, the middleware was implemented. The test results show that the implemented middleware is characterized by real-time ability, fault tolerance and adaptability.
Threshold-based Electronic Voting Scheme
Computer Science. 2012, 39 (7): 39-43. 
In past electronic voting schemes, a single ballot checker is responsible for checking ballots. A dishonest ballot checker may refuse to tally legitimate ballots or may count illegal ones. To solve this problem, the author presented an electronic voting scheme based on a threshold, in which checking a ballot requires the cooperation of several ballot checkers. This makes the result of ballot checking acceptable. Voters publish the hash or ciphertext of the ballot content, and nobody can learn the authentic content of these ballots. This prevents voter coercion, vote trading and the like.
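To make the commitment step concrete (voters publishing a hash of the ballot content so that the content stays hidden until revealed), here is a minimal sketch using a salted SHA-256 commitment; the salt and the specific hash function are assumptions for illustration, and the threshold cooperation of the checkers is not modeled.

```python
import hashlib, os

def commit(ballot: str) -> tuple[bytes, bytes]:
    """Return (commitment, salt). Publishing the commitment hides the ballot."""
    salt = os.urandom(16)                       # random salt blocks dictionary attacks
    digest = hashlib.sha256(salt + ballot.encode()).digest()
    return digest, salt

def verify(commitment: bytes, salt: bytes, ballot: str) -> bool:
    """Anyone can check a revealed ballot against the published commitment."""
    return hashlib.sha256(salt + ballot.encode()).digest() == commitment

commitment, salt = commit("candidate A")
assert verify(commitment, salt, "candidate A")
assert not verify(commitment, salt, "candidate B")
```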
Combat Super Network Multi Agent Model
Computer Science. 2012, 39 (7): 44-47. 
After analyzing the properties of military networks under the network-centric warfare concept, a combat super-network modeling method was naturally adopted to express the military network. By combining this method with CAS (complex adaptive system) theory and the MAS (multi-agent system) modeling method, a super-network agent model was proposed. In this model, at the micro level, network properties are extended into entity properties; at the macro level, the evolution behavior of the overall network is taken into account by the super-network. The model regulates and reflects the interaction and dynamics of the network, combining the behavior that emerges from entity relationships up to the network level with the driving effect of overall network characteristics down to the entities. A simulation platform based on this model is well suited to command-and-control test experiments.
Link-quality Aware Beacon-less Geographic Routing Protocol for Mobile Ad hoc Networks
Computer Science. 2012, 39 (7): 48-51. 
Geographic routing algorithms require nodes to periodically transmit HELLO messages to allow neighbors to obtain their positions (the beaconing mechanism). Periodically transmitting these messages, however, results in high bandwidth overhead. For this reason, beacon-less routing algorithms have recently been proposed to reduce the control overhead. However, existing beacon-less algorithms do not consider the unreliability of realistic physical channels. In this paper, a novel beacon-less routing protocol called LQBGR was presented. Based on cross-layer routing technology, its design takes link quality into account to reduce retransmissions and enhance performance. Simulation results show that LQBGR achieves a higher packet delivery ratio, shorter end-to-end delay and lower traffic.
Time Synchronization Algorithm in Medium- and High-Rate WSN Based on Local Routing
Computer Science. 2012, 39 (7): 52-54. 
Traditional time synchronization in WSN does not fit the special needs of medium- and high-rate WSN (MHWSN), in which data fusion and data transmission occur on a large scale. With a fixed topology, the synchronization errors between neighboring nodes are relatively large. At the same time, data processing in MHWSN takes more time than in ordinary WSN, so a global synchronization mechanism is not suitable. An event-triggered synchronization mechanism was presented in this paper, and a local routing algorithm for synchronization was designed to solve the neighbor synchronization problem. The simulation results show that the proposed mechanism reduces the cost of synchronization in the event area and lowers the synchronization error between the nodes of interest.
Novel Three-dimensional Localization Algorithm in Wireless Sensor Networks
Computer Science. 2012, 39 (7): 55-57. 
Node localization is crucial to wireless sensor networks, and both academia and industry have long paid much attention to it. Existing localization methods, designed for planar applications, are not suitable for three-dimensional terrain; but in realistic applications sensor nodes are usually distributed in three dimensions, so studying localization in three dimensions is much more in line with actual applications. Addressing the shortcomings of some existing algorithms proposed for three-dimensional space, a novel three-dimensional localization algorithm for wireless sensor networks was proposed. The algorithm needs no additional hardware support. The scheme establishes a vector space model based on the anchor nodes within the communication range of an unknown node, and when estimating the unknown node's coordinates, the anchors in its communication region constrain the estimation range. The simulation results demonstrate that the algorithm has great advantages in low communication overhead and in improving localization coverage and accuracy.
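The abstract does not spell out the vector space model, so the following is only a generic sketch of the underlying idea of estimating a 3D position from anchors within communication range; for concreteness it uses linearized least-squares trilateration from anchor distances, which is an assumed stand-in rather than the authors' algorithm.

```python
import numpy as np

def trilaterate_3d(anchors, dists):
    """Linearized least-squares 3D position estimate from >= 4 anchors.
    Subtracting the first sphere equation from the rest yields a linear system."""
    anchors = np.asarray(anchors, float)
    dists = np.asarray(dists, float)
    p0, d0 = anchors[0], dists[0]
    A = 2 * (anchors[1:] - p0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
true = np.array([3.0, 4.0, 5.0])
dists = [np.linalg.norm(true - np.array(a)) for a in anchors]
print(trilaterate_3d(anchors, dists))   # ~[3. 4. 5.]
```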
Analytic Hierarchy Process (AHP)-based Vulnerability Quantitative Assessment Method for Information Systems
Computer Science. 2012, 39 (7): 58-63. 
This paper proposed a practical quantitative vulnerability assessment method for information systems based on the Analytic Hierarchy Process (AHP). Following the hierarchical approach, the model of system vulnerability severity was decomposed into four layers: the factor layer, the evaluation-factor layer, the characteristic layer and the target layer. Vulnerability risk factors were evaluated by experts to determine their weights from several aspects, such as risk probability, risk influence and uncontrollability. By calculating the value of each layer, we finally obtained the overall severity assessment of the information system's vulnerability. The experimental results show that the AHP-based vulnerability assessment method can quantify and assess the seriousness of system vulnerabilities effectively.
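A minimal sketch of the AHP weighting step described above: expert pairwise comparisons form a matrix, the principal eigenvector gives the weights, and Saaty's consistency ratio checks the judgments. The example matrix (comparing risk probability, impact and uncontrollability) is invented for illustration.

```python
import numpy as np

# Saaty's Random Index for consistency checking, n = 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(M):
    """Principal-eigenvector weights plus the consistency ratio (want CR < 0.1)."""
    M = np.asarray(M, float)
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                # normalized weight vector
    n = len(M)
    ci = (vals[k].real - n) / (n - 1)           # consistency index
    cr = ci / RI[n] if RI[n] else 0.0
    return w, cr

# Illustrative 3x3 comparison: probability vs. impact vs. uncontrollability
M = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(M)
print(w, cr)   # the overall score is then the weighted sum across layers
```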
Lightweight On-demand QoS Source Routing for LEO Satellite Network
Computer Science. 2012, 39 (7): 64-68. 
QoS routing for LEO (Low Earth Orbit) satellite networks faces great challenges due to several issues, such as ingress satellite handover, unbalanced traffic distribution and limited on-board processing ability. A lightweight on-demand QoS source routing algorithm using mobile agents was proposed. Firstly, an ISL (inter-satellite link) availability index was designed considering ISL utility. Secondly, a satellite availability index was put forward based on the traffic distribution carried by the satellite network. Thirdly, a migration strategy for the mobile agent was given taking both indexes into account. Then, ISL routing and rerouting were presented in detail. Simulation results demonstrate that the proposed approach has lower signaling overhead, better delay jitter, and lower new-call blocking and handover blocking probabilities than traditional methods.
Dynamic Computing Network: Architecture of Pervasive Computing for the Internet of Things
Computer Science. 2012, 39 (7): 69-73. 
With the development of the Internet of Things, many different devices connect to it in a variety of ways. This trend makes the pervasive computing environment more complex and requires the architecture of pervasive computing to change to adapt to the new environment of the Internet of Things. This paper provided an architecture of pervasive computing for the Internet of Things, which addresses how to make pervasive computing systems automatically adapt to the complex hardware and software operating environment of the Internet of Things. This architecture can make software run everywhere without modification, and easily implements service discovery, context-aware services and service migration. The paper used the computing area network as the base infrastructure and used dynamic device matching as the solution for software self-adaptation. This research enhances the adaptability of pervasive computing software in the Internet of Things environment and provides an effective solution for designing pervasive computing software systems for the Internet of Things.
Research on General Wireless Authentication Protocol Based on PKI
Computer Science. 2012, 39 (7): 74-77. 
The WTLS protocol based on PKI requires complicated certificate processing, which imposes excessive communication and computation overhead; moreover, it does not verify the server certificate. To solve these issues, a trusted certificate verification proxy (TCVP) and a certificate validity ticket (CVT) were introduced. A CVT, generated by the TCVP for wireless communication nodes (WN), has a short lifetime; WNs exchange it and use it to verify certificates and share public keys within its lifetime. On this basis, a general wireless authentication protocol (GWAP) was proposed, and under its guidance a specific wireless security authentication protocol was designed using Elliptic Curve Cryptography (ECC). Performance analysis shows that the new protocol improves the efficiency of wireless communication without loss of security.
Equivalent Keys in NTRU Public Key Cryptosystem
Computer Science. 2012, 39 (7): 78-81. 
The NTRU public key cryptosystem has the problem that multiple private keys correspond to a common public key. Firstly, the decryption condition was discussed and the concept of equivalent keys in NTRU was proposed. Secondly, the invertibility of polynomials in the truncated polynomial ring and the semi-norm were discussed, and four schemes to construct equivalent keys were presented. Finally, the security effect of equivalent keys was analyzed, which indicates that if the NTRU parameters are not chosen properly, some special equivalent keys will pose a serious security threat to NTRU.
Immunodominance-based Clonal Network Clustering Algorithm for Intrusion Detection
Computer Science. 2012, 39 (7): 82-86. 
Following the idea of intelligent complementary fusion, a combination of immunodominance, inverse operation, clonal selection, non-uniform mutation and forbidden cloning was employed in a novel clustering method with a network structure for intrusion detection. The clustering process was adjusted in accordance with an affinity function and evolution strategies, so that an intelligent, self-adaptive and self-learning network was 'evolved' to reflect the distribution of the original data. A minimal spanning tree was then employed to perform clustering analysis and obtain the classification of normal and abnormal data. Simulations on the KDD CUP99 dataset show that the method can deal with massive unlabeled data, distinguish normal cases from anomalies, and even detect unknown intrusions effectively.
Research on Critical Nodes Detection Algorithm Based on Node Stability Prediction in Ad hoc Network
Computer Science. 2012, 39 (7): 87-91. 
To better adapt to the dynamic topology of Ad hoc networks, an NS-PMRC algorithm with node stability prediction was proposed based on the grey prediction model. Combined with the node location information provided by GPS and using an equal-dimension, gap-filling grey prediction model, the next geographical position of a node can be predicted in the route maintenance stage, the distance between nodes can be computed, and node stability at the next moment can be predicted. Comparing the predicted stability between nodes determines the conditions for the existence of critical nodes. Compared with PMRC, NS-PMRC improves detection accuracy and thus significantly improves network performance.
Network Traffic Prediction Based on Phase Space Reconstruction and Least Square Support Vector Machine
Computer Science. 2012, 39 (7): 92-95. 
To improve network traffic prediction accuracy, this paper proposed a prediction method based on a least squares support vector machine (LSSVM) optimized by a genetic algorithm, which exploits the relation between phase space reconstruction and the parameters of the prediction model. Firstly, the phase space reconstruction parameters and the LSSVM parameters were encoded together as an individual of the genetic algorithm, with the model's prediction accuracy as the fitness function; the globally optimal parameters were then obtained by the genetic algorithm, and finally simulation tests were carried out on network traffic data. The results show that, compared with traditional forecasting methods, the proposed model improves the prediction accuracy of network traffic and provides a new line of research for network traffic prediction.
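To make the phase space reconstruction step concrete, here is a minimal delay-embedding sketch that turns a scalar traffic series into (input, target) pairs for a predictor; the delay tau and embedding dimension d are illustrative (in the paper they are tuned jointly with the LSSVM parameters by the genetic algorithm, which is not reproduced here).

```python
import numpy as np

def delay_embed(x, d, tau):
    """Phase space reconstruction: X[i] = (x[i], x[i+tau], ..., x[i+(d-1)tau]),
    with the sample one step after the last coordinate as the target."""
    n = len(x) - (d - 1) * tau - 1
    X = np.column_stack([x[i * tau: i * tau + n] for i in range(d)])
    y = x[(d - 1) * tau + 1: (d - 1) * tau + 1 + n]
    return X, y

x = np.sin(np.linspace(0, 40, 500)) + 0.05 * np.random.randn(500)  # toy "traffic"
X, y = delay_embed(x, d=4, tau=3)
print(X.shape, y.shape)   # (490, 4) (490,) -- ready for any regressor
```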
Feature Selection Algorithm Based on IMGA and MKSVM to Intrusion Detection
Computer Science. 2012, 39 (7): 96-99. 
To improve the performance of intrusion detection systems in terms of detection speed and detection rate, it is necessary to apply feature selection. Firstly, an efficient search procedure based on immune memory and a genetic algorithm (IMGA) was proposed. Then, a support vector machine (SVM) in a wrapper feature evaluation setting was studied; to improve feature selection performance on unbalanced datasets, we used a conformal transformation and the Riemannian metric to modify the kernel function and constructed a new Modified Kernel SVM (MKSVM). Finally, simulation results show that this approach improves the selection of important features and has better feature selection ability on unbalanced data. Furthermore, the experiments indicate that an intrusion detection system with this feature selection algorithm performs better than one without it.
Research on Process of File Diffusion and Influence Factors of Propagation in P2P Networks
Computer Science. 2012, 39 (7): 100-103. 
In P2P networks, the downloading behavior of popular files resembles the spreading process of epidemic diseases, and can therefore be described by the dynamics of infectious disease spread. The diffusion process of a P2P file was modeled on an epidemic dynamic model, and the basic reproductive number, which determines the file's ability to propagate, was derived. Further experiments prove that the derived basic reproductive number describes the behavior of file propagation and reflects the differential impact of different parameters on file diffusion fairly well.
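As a hedged illustration of the epidemic analogy, here is a discrete-time SIR-style file diffusion simulation in which the basic reproductive number R0 = beta/gamma decides whether the file spreads widely or dies out; the parameter values are illustrative, not the paper's model.

```python
def simulate_diffusion(beta=0.3, gamma=0.1, n=10000, seeds=10, steps=200):
    """SIR-style spread: S = peers without the file, I = active sharers,
    R = peers that stopped sharing. R0 = beta/gamma > 1 => wide diffusion."""
    S, I, R = n - seeds, seeds, 0
    for _ in range(steps):
        new_inf = beta * S * I / n      # downloads triggered by active sharers
        new_rec = gamma * I             # sharers leaving the swarm
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    return R / n                        # fraction of peers the file reached

print("R0=3.0:", round(simulate_diffusion(0.3, 0.1), 2))   # spreads widely
print("R0=0.5:", round(simulate_diffusion(0.05, 0.1), 2))  # dies out
```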
Self-adaptive Epidemic Routing Algorithm
Computer Science. 2012, 39 (7): 104-107. 
In some scenarios the Epidemic algorithm has a high delivery ratio and small delivery delay, but its adaptability is poor, and in other scenarios its performance degrades significantly. Based on an analysis of the factors affecting performance, the crowding-out effect is identified as the main cause of the degradation. A self-adaptive mechanism was put forward so that nodes can adjust the number of packets injected into the network according to the buffers of nearby nodes, actively inhibiting the crowding-out effect and thus improving the performance of the Epidemic algorithm. The simulation results show that the proposed algorithm greatly improves the delivery ratio and considerably reduces the routing overhead under various scenarios.
Research on QoS-reliability Fusion Based Selection Mechanism for Web Service
Computer Science. 2012, 39 (7): 108-111. 
With the rapid development of Web services, an abundance of functionally similar Web services has been published on the Internet. To filter out the Web services a requester needs from this set, the paper explored a selection mechanism based on the fusion of QoS and reliability. Considering the functional similarity and complexity of Web services, it analyzed the limitations of current selection mechanisms. By integrating an estimation of service reliability with third-party monitoring data of service quality, it evaluated Web service selection from both subjective and objective aspects, and, based on models of QoS monitoring, target consumption groups, service quality estimation and evaluation feedback, it finally proposed a Web service selection model. A case study was simulated, validating that the mechanism not only adapts to dynamically varying environments but also ensures actual service quality and overcomes individual differences in evaluation. The results show that the proposed service selection method is feasible and practical.
Research on Modeling and Test Case Generation for UAV Flight Control Software System
Computer Science. 2012, 39 (7): 112-118. 
The rapid growth of software size and complexity has become an important challenge in designing and verifying modern high-quality UAV flight control software (FCS) systems. Based on the Model Driven Engineering (MDE) architecture, a UAV flight control software model was established using the embedded real-time system modeling language MARTE, and an example of a formal model of system dynamic behaviors based on timed automata was given. Considering the application background of UAV FCS systems, we proposed a test case generation method based on timed automata, including the establishment of a testing architecture, coverage rules, and test case generation strategies. Lastly, a case study of timed automata modeling and test case generation for the main control module of a UAV FCS system was provided.
Design and Implementation of an MVC-based Framework for Developing Portlet
Computer Science. 2012, 39 (7): 119-122. 
To resolve the incompatibility between different portlet application providers and portal providers, the JCP issued the Java Portlet Specification to specify how portlets should be developed so that they can be integrated into portals. The complex programming interface and the portlet context defined in the JSR make it complicated to develop a portlet application. We proposed an MVC-based framework, OPDS, with a simpler API and easier deployment, to make the process more efficient and to enhance the reusability of resources in portlet applications.
Study on Syntax and Semantics Properties of Model Evolution in Model Driven Development
Computer Science. 2012, 39 (7): 123-126. 
Model evolution involves a series of complex change activities and should follow certain constraints to preserve certain properties of models. An illustration of model evolution was presented. Based on set-valued mappings, the mapping from model elements to the semantics domain was defined. Syntax and semantics properties of model evolution, such as property preservation, consistency, absorption and equivalence, were studied by defining the semantic function.
SMC/ADL:An Architecture Description Language for Hierarchical Component-based System
Computer Science. 2012, 39 (7): 127-131. 
To solve the problem of abstractly modeling the static, running and dynamic architecture of systems constructed with the SMC component model, an architecture description language named SMC/ADL, expressed in XML, was put forward. From the three perspectives of type, instance and instance behavior, SMC/ADL uses a set of XML schemas to define an integrated architecture convention framework that supports software modeling from design through running to the evolution phase, extending support for high-level architectural abstraction to the whole software lifecycle. Relevant tools have testified to its validity and practical applicability.
Research on Communication Protocol Fault-oriented Reliability Testing of Distributed Software
Computer Science. 2012, 39 (7): 132-134. 
We analyzed classic communication protocol faults, studied the reliability testing methodology of large distributed software, and designed a reliability testing tool based on communication protocol fault injection.
Recovering Traceability Links among Multi-level Software Evolution Information
Computer Science. 2012, 39 (7): 135-139. 
Recovery of software artifact traceability is an important part of research in software maintenance and reverse engineering. However, most existing work on traceability recovery focuses on traceability within a single product version. Different from these works, this paper concentrated on recovering traceability among evolution information on different levels, i.e., the change document level, the configuration management level and the implementation code level. This kind of evolution traceability is essential for understanding software evolution and maintenance. We proposed a method for evolution traceability recovery that combines keyword-based retrieval and heuristic rules. We also reported an experimental study on the evolution process of an open-source software project.
Information Retrieval Model Based on Relative Feedback
Computer Science. 2012, 39 (7): 140-143. 
Since existing information retrieval systems have difficulty retrieving documents according to the query demand, this paper proposed a retrieval model based on relative feedback. It analyzed the decomposition of query words, derived the relevance feedback mechanism and normalization process, and further expounded the document extraction method. Through relative feedback and query expansion, the model overcomes the problem that traditional methods cannot calculate the similarity between a document and a query word, and can deal with document retrieval effectively. The simulation results proved the validity and feasibility of the model.
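The abstract does not give the feedback formula, so the sketch below shows the same idea in the classical Rocchio style of relevance feedback: the query vector is moved toward relevant documents and away from non-relevant ones before re-ranking. The alpha/beta/gamma weights are conventional defaults, not the paper's.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classical Rocchio update of a term-weight query vector."""
    q = alpha * query
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)     # pull toward relevant docs
    if len(nonrelevant):
        q -= gamma * np.mean(nonrelevant, axis=0) # push away from non-relevant
    return np.clip(q, 0, None)                    # keep term weights non-negative

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Toy 5-term vocabulary; documents as term-weight vectors
docs = np.array([[1, 1, 0, 0, 0], [1, 0.5, 0, 0, 0], [0, 0, 1, 1, 0.5]])
q = np.array([1.0, 0, 0, 0, 0])
q2 = rocchio(q, relevant=docs[:2], nonrelevant=docs[2:])
print([round(cosine(q2, d), 3) for d in docs])    # relevant docs rank higher
```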
Solving Multi-instance Learning Problem with Evaluating the Importance of Concept in Instances
Computer Science. 2012, 39 (7): 144-147. 
In multi-instance learning, the training set is composed of labeled bags, each consisting of many unlabeled instances, and the goal is to learn a classifier from the training set that correctly labels unseen bags. In the past, some research on multi-instance learning aimed at upgrading single-instance learning algorithms to the multi-instance representation, while other work proposed new methods to find the relationship between instances and bags and used the result to solve the problem. This paper started from adapting the representation of the bag and proposed a new algorithm, the concept evaluating algorithm. First, the algorithm clusters all instances into d groups, where each group can be treated as a concept in the instances. Then, it uses the TF-IDF (term frequency-inverse document frequency) algorithm to measure the importance of each concept in each bag. Finally, each bag is re-represented as a d-dimensional vector, the concept evaluating vector, whose ith value is the importance of the ith group in the bag. Because the re-represented data set is no longer multi-instance, propositional single-instance learning algorithms can be used to solve the multi-instance learning problem effectively.
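A minimal sketch of the re-representation pipeline the abstract describes: cluster all instances into d concepts, weight each concept per bag with a TF-IDF-style score, then train an ordinary single-instance learner on the d-dimensional bag vectors. The choice of k-means as the clusterer and logistic regression as the learner, and the synthetic bags, are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def bags_to_vectors(bags, d, seed=0):
    """Re-represent each bag as a d-dimensional concept-evaluating vector."""
    km = KMeans(n_clusters=d, n_init=10, random_state=seed).fit(np.vstack(bags))
    assign = [km.predict(b) for b in bags]            # concept id of each instance
    df = np.array([sum((a == j).any() for a in assign) for j in range(d)])
    idf = np.log(len(bags) / np.maximum(df, 1))       # rarity of each concept
    vecs = [np.bincount(a, minlength=d) / len(a) * idf for a in assign]
    return np.array(vecs)                             # TF-IDF weight per concept

rng = np.random.default_rng(0)
pos = [rng.normal(3, 1, (5, 2)) for _ in range(20)]   # bags drawn around concept A
neg = [rng.normal(-3, 1, (5, 2)) for _ in range(20)]  # bags drawn around concept B
X = bags_to_vectors(pos + neg, d=4)
y = np.array([1] * 20 + [0] * 20)
print(LogisticRegression().fit(X, y).score(X, y))     # single-instance learner
```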
Tracing the Evolution Lineage of Complex Event
Computer Science. 2012, 39 (7): 148-153. 
Novel applications such as network security events and lineage tracing in the IoT present many challenges for studying the lineage of complex events. Because of factors such as fuzzy time and uncertain state transfer, tracing the lineage of complex events often encounters inaccurate derived times and hard reverse-derivation problems, so the evolution lineage of a complex event cannot be traced or queried effectively. A reverse derivation model with provenance semantics, called BREFTN (backward reasoning extended fuzzy time Petri net), was proposed for such problems, and based on this model a backward reasoning algorithm following timed automata theory was designed. Given the goal place(s) and other conditions, it can not only obtain all information evolution paths and analyze the possibility distribution over paths, but also efficiently compute the fuzzy time function values of the states and transitions of complex events. Finally, the completeness of the BREFTN model and the properties of evolution paths were analyzed, and the performance of the algorithm was verified by experiments.
Framework of Vita Event Extraction and Retrieval
Computer Science. 2012, 39 (7): 154-160. 
A curriculum vitae (henceforth a vita) usually contains a wealth of data such as personal information, educational background, publications and work experience. Searching, extracting and exploring the data in vita documents can provide a more comprehensive and integral personal profile, which can be viewed as a series of events. Moreover, events from different individuals' vitae can be used to explore and establish relationships between these events and the people involved. In this paper, we presented a framework for extracting and exploring vita events, which can retrieve vita documents from the Internet, extract events from these documents, and save the events to a database for further exploration. More concretely, the work introduced in this paper includes: (1) an event presentation model which characterizes the basic attributes of events and is used for event exploration; (2) a probabilistic model for extracting events from vita documents automatically; (3) an event exploration approach exploiting the co-occurrence of event attributes on the basis of the event presentation model and the event extraction approach.
Uncertain Data Preconditioning Method in Frequent Itemset Mining
Computer Science. 2012, 39 (7): 161-164. 
Traditional studies of frequent itemset mining cannot obtain information from uncertain data efficiently. We studied the frequent pattern tree and proposed an effective uncertain data preconditioning method, PCAFP-Growth, which reduces the itemset dimensionality with principal component analysis and prunes data with fuzzy association analysis. Our experimental results on real-world datasets show that the method is effective and efficient.
Research of "Set Attribute Code" and Related Algorithms
Computer Science. 2012, 39 (7): 165-169. 
In the traditional process of set operations, the elements of a set are usually represented in natural language rather than in a formalized representation, which affects the efficiency of set operations. To solve this problem and improve that efficiency, this paper introduced binary representation into set operations, proposed the concept of the "Set Attribute Code", and defined a series of operation rules for it, forming a relatively complete and formal computing system of "Set Attribute Code". Based on this theory, the paper proposed a series of related algorithms that demonstrate its correctness. Experimental results show that set operations can be carried out through 0/1 operations under the "Set Attribute Code" theory and its related algorithms, and that query operations, which are very important in databases, were successfully achieved through the "Set Attribute Code" mechanism.
Web-based Term Translation Extraction and Verification Method
Computer Science. 2012, 39 (7): 170-174. 
Extracting translations of Chinese terms and out-of-vocabulary (OOV) words is an important problem in applications such as machine translation and cross-language information retrieval, yet such translations are difficult to obtain from traditional dictionaries. This paper proposed a method to automatically extract English translations of Chinese terms from the Web via a search engine. It constructs three query modes using partial translations of terms and proposes a multi-featured extraction method. To address low accuracy and unrelated translations, it verifies the candidate translations with three verification modes: side analog alignment verification, bilingual alignment degree verification and word-building verification. Experimental results show that the TOP1 accuracy reaches 97.4% and the TOP3 accuracy 98.3%.
SVM Active Learning via Dynamic Version Space Division
Computer Science. 2012, 39 (7): 175-177. 
This paper presented a dynamic version space division algorithm for SVM active learning, in view of the drawbacks of traditional batch sampling methods. Based on the duality between feature space and parameter space, we discussed SVM active learning in the dual space and concluded that labeling an example in feature space corresponds to dividing the version space in parameter space. Taking both the existing classification model and the previously labeled examples into consideration, we optimized the version space division process and maximized the value of the selected examples for model refinement, achieving more effective selective sampling. Experimental results demonstrate the effectiveness of the dynamic version space division algorithm, which remarkably improves classification performance at the cost of limited labeling effort.
Interval-valued Decision-theoretic Rough Sets
Computer Science. 2012, 39 (7): 178-181. 
Considering the "multiple-valued" character of practical decision procedures, interval-valued loss functions were introduced into decision-theoretic rough set theory (DTRS) based on Bayesian decision theory. With respect to the minimum Bayesian expected risk, a model of interval-valued decision-theoretic rough sets was built, and the corresponding propositions and criteria were analyzed. An example of oil investment was given to illustrate the application of the proposed model.
Research on Parallel Genetic Algorithms Based on MapReduce
Computer Science. 2012, 39 (7): 182-184. 
The MapReduce programming model enjoys widespread application and has become a popular parallel programming paradigm. This paper used the MapReduce model to parallelize a coarse-grained genetic algorithm, taking a multi-objective optimization problem as the benchmark. All experiments were run under Hadoop on a cluster of commodity servers. When the number of variables reaches 10^7, the efficiency of the parallel algorithm can be multiplied several times without introducing any memory-related bottlenecks. Finally, we studied how the degree of parallelism affects performance.
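A toy sketch of the coarse-grained island scheme: each 'map' task evolves one sub-population independently and the 'reduce' step gathers and compares the island elites. Python's multiprocessing stands in for Hadoop here, and the OneMax fitness is an illustrative benchmark rather than the paper's multi-objective problem.

```python
import random
from multiprocessing import Pool

def fitness(ind):          # OneMax: maximize the number of 1-bits
    return sum(ind)

def evolve_island(args):
    """'map' task: run a small GA on one island, return its elite individuals."""
    pop, gens, seed = args
    rng = random.Random(seed)
    for _ in range(gens):
        a, b = rng.sample(pop, 2)                      # two random parents
        cut = rng.randrange(1, len(a))
        child = a[:cut] + b[cut:]                      # one-point crossover
        child[rng.randrange(len(child))] ^= 1          # bit-flip mutation
        pop.sort(key=fitness)
        pop[0] = child                                 # replace the worst
    return sorted(pop, key=fitness, reverse=True)[:2]  # island elite

if __name__ == "__main__":
    rng = random.Random(0)
    islands = [([[rng.randint(0, 1) for _ in range(32)] for _ in range(20)],
                500, s) for s in range(4)]
    with Pool(4) as p:                                 # the "map" phase
        elites = p.map(evolve_island, islands)
    best = max((ind for e in elites for ind in e), key=fitness)  # "reduce"
    print(fitness(best), "/ 32")
```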
Dynamic Assessment Method of Online Auditing Performance:A Combined Use between AHP and GM(1,1)
Computer Science. 2012, 39 (7): 185-189. 
A dynamic assessment method for online auditing performance based on AHP (analytic hierarchy process) and GM(1,1) was proposed, in accordance with the dynamic assessment requirements of online auditing performance. In this method, an AHP hierarchy fitting the characteristics of online auditing as used in China was structured; pairwise comparisons of criteria were made, yielding comparison matrices that were translated into weights, from which the performance of online auditing was computed. Based on the performance assessment results computed by AHP, GM(1,1) was used to forecast future performance, so that online auditing performance can be assessed dynamically from both present and forecast performance. An effective dynamic assessment method for online auditing performance was thus provided.
Research on Rule-based Auto-identifying of Adjoining Relation Markers
Computer Science. 2012, 39 (7): 190-194. 
Relation words are of great significance for analyzing the semantic relations between clauses in multiple compound sentences. In rule-based automatic identification of relation words, however, we found that not all relation markers in a multiple compound sentence are relation words, and identifying the relation markers that serve as relation words is the key point. An algorithm was proposed to identify typical adjoining relation words. By combining a relation-annotated corpus with the extraction of relation markers, the algorithm uses features to identify these markers. Experiments show that this approach achieves an accuracy of 72.9% in identifying these relation markers.
Cover Problem of Linear Separable Structures to Binary Neurons
Computer Science. 2012, 39 (7): 195-199. 
In binary neural networks, every neuron is equivalent to a linearly separable function; however, the logical meaning of the linearly separable function expressed by each binary neuron is still not clear. This paper first analyzed the known linearly separable structures, then discussed whether these known structures cover all binary neurons, and finally pointed out that when the threshold value lies in a certain range, the logical meaning of the linearly separable function remains unclear. This result provides a way into the cover problem of binary neurons.
Using Cross-event Inference to Fill Missing Event Argument
Computer Science. 2012, 39 (7): 200-204. 
Event extraction is an important research direction in information extraction. In response to the phenomenon that a large number of arguments are missing in ACE event extraction because only the current single sentence is considered, we proposed an approach for filling missing event arguments based on cross-event inference and implemented a prototype system. The system is divided into two parts: identification and classification of missing roles. The identification part decides whether a missing event argument can be filled, while the classification part decides which argument in another event mention can be used to fill it. We annotated the ACE 2005 corpus to reveal the filling relationships. The experimental results show that the F-measure reaches 72.97 and 74.68, respectively.
PPI Networks Clustering Model and Algorithm Combining with the Principle of Artificial Fish School
Computer Science. 2012, 39 (7): 205-209. 
Predicting the function of unknown proteins in protein-protein interaction (PPI) networks is a hot topic in bioinformatics. The functional flow method has effectively addressed PPI network clustering, but its accuracy is relatively low and its time complexity is high. A PPI network clustering algorithm combining the principle of the artificial fish school was therefore proposed, in which an artificial fish is regarded as a set of cluster centers. The foraging behavior is treated as searching the neighbor nodes of the initial cluster centers and adding them to cluster modules. The set of cluster modules with the highest fitness value is selected as the initial clustering result, corresponding to the following behavior of the artificial fish school. Other artificial fish then execute the swarming behavior and judge the similarity between their corresponding cluster modules and the initial result; if the similarity is lower than a given threshold, the cluster module is added to the initial clustering result. Simulation experiments on PPI datasets show that the algorithm can automatically determine the number of clusters, and that both the accuracy of the clustering result and the efficiency of the algorithm are superior to the functional flow algorithm.
Efficient Dynamic Updating Algorithm of the Computation of Core in Decision Table
Computer Science. 2012, 39 (7): 210-214. 
The dynamic updating of core computation in decision tables was discussed. The concept of the simplified decision table was first introduced, effectively removing the large number of repeated objects in a decision table; moreover, dynamic updating mechanisms were analyzed for the case where objects are deleted from the original decision table. To avoid needless repeated computation, a multi-level hierarchical model was applied to the dynamic updating process. On this basis, an efficient dynamic updating algorithm for computing the core was proposed that does not store the discernibility matrix: when objects are dynamically deleted, the algorithm only scans the updated decision table to compute the core. Theoretical analysis and experimental results show that the algorithm is feasible and effective.
New Regularization Method of Co-training
Computer Science. 2012, 39 (7): 215-218. 
Semi-supervised learning is a hot research topic in machine learning, and co-training is a multi-view semi-supervised learning method. We studied co-training from the regularization point of view. By exploiting the metric structure of the hypothesis space, the smoothness and consistency of a hypothesis were defined, and a two-level regularization algorithm was presented that uses smoothness to regularize the within-view learning process and consistency to regularize the between-view learning process. Experimental results were presented on both synthetic and real-world datasets.
Multi-measures Similarity Measures between Vague Sets
Computer Science. 2012, 39 (7): 219-221. 
Aiming at the drawbacks of measuring the similarity between Vague sets with a single measure, two multi-measure similarity measures between Vague sets were proposed. A simplex geometrical expression of Vague sets was presented, which represents the degrees of true membership, false membership and unknown of a Vague set as three triangles partitioning the same plane within a simplex. An area measure suitable for measuring similarity between Vague sets was then proposed. Combining the area measure with measures of distance and unknown degree, multi-measure similarity measures between Vague sets were constructed, which embody a pattern of measuring similarity by the multiple spatial features of point, line and region. Examples verify the validity and advantages of the proposed multi-measure similarity measures.
Legendre Kernel Function for Support Vector Classification
Computer Science. 2012, 39 (7): 222-224. 
This paper presented a new kernel function, the Legendre kernel, based on Legendre polynomials. The performance and robustness of the kernel were investigated on the bi-spiral benchmark data set as well as five data sets from the UCI benchmark repository. The experimental results demonstrate that the presented kernel has competitive robustness and generalization performance compared with commonly used kernel functions (the polynomial kernel, the radial basis function kernel, etc.). Moreover, the Legendre kernel has a single parameter chosen only from the natural numbers, which greatly facilitates parameter optimization.
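The abstract does not reproduce the kernel's exact definition; a common form of the Legendre kernel, assumed in the sketch below, is K(x,z) = prod_i sum_{k=0..n} P_k(x_i)P_k(z_i) with inputs scaled to [-1,1]. Its single parameter n is a natural number, matching the abstract's remark that parameter optimization is easy.

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

def legendre_kernel(X, Z, n=3):
    """K(x,z) = prod_i sum_{k=0..n} P_k(x_i) P_k(z_i): a valid PSD kernel,
    since sum_k P_k(x)P_k(z) is an inner product of Legendre feature maps."""
    # evaluate P_0..P_n at every entry: shape (n+1, rows, features)
    PX = np.stack([legendre.legval(X, np.eye(n + 1)[k]) for k in range(n + 1)])
    PZ = np.stack([legendre.legval(Z, np.eye(n + 1)[k]) for k in range(n + 1)])
    # sum over polynomial order k, then product over features
    return np.prod(np.einsum('kif,kjf->ijf', PX, PZ), axis=2)

X, y = datasets.load_iris(return_X_y=True)
X = MinMaxScaler((-1, 1)).fit_transform(X)       # Legendre domain is [-1, 1]
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = svm.SVC(kernel=lambda A, B: legendre_kernel(A, B, n=3)).fit(Xtr, ytr)
print(clf.score(Xte, yte))
```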
Function P-sets and Attribute Control of Information Laws
Computer Science. 2012, 39 (7): 225-228. 
Function P-sets (function packet sets) are a new mathematical model and structure, and a new theory and methodology for studying the laws of dynamic information systems. A function P-set is a pair of function sets composed of a function internal P-set (function internal packet set) and a function outer P-set (function outer packet set). Function P-sets have functional and dynamic characteristics. By virtue of these characteristics, this paper explored the attribute control of information laws, presented the generation of function packet sets and information laws, obtained theorems on the attribute control of information laws, and applied attribute control of information laws to the stabilization of image border information.
Comparison on Covering-based Rough Set Models
Computer Science. 2012, 39 (7): 229-231. 
By comparing the upper approximation sets, lower approximation sets and accuracy measures of six covering-based rough set models, the six models were systematically studied and three conclusions were obtained. Firstly, among the six covering upper approximation sets, the second is the largest, and the first five all stand in inclusion relationships, except between the third and fourth. Secondly, there is an inclusion relationship between two of the lower approximation sets. Thirdly, among the accuracy measures of the six models, the second has the lowest accuracy and the fifth the highest, though the fifth bears no fixed relationship to the sixth. All conclusions are demonstrated with illustrative examples. This comparative study provides a better understanding of these models and some guidance for model selection in different applications.
Randomized Reverse Greedy Algorithm for k-median Problem
Computer Science. 2012, 39 (7): 232-236. 
Research on approximation algorithms for the k-median problem has been a focus of computer scientists. Based on a balance parameter, this paper presented a randomized algorithm for the k-median problem by means of reverse greedy. The new algorithm's expected approximation ratio was proved to be 3+O(ln(ln(k)/α)) with high probability, and its running time is O((ln(k)/α)(m+n)), where m and n represent the number of clients and facilities in the given instance. Finally, computational experiments were used to study the practical effectiveness of the algorithm.
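The randomized, balance-parameterized variant cannot be reconstructed from the abstract alone, so the following is a deterministic sketch of plain reverse greedy for k-median: open all facilities, then repeatedly close the facility whose removal increases the total connection cost the least until k remain.

```python
import numpy as np

def reverse_greedy_kmedian(dist, k):
    """dist[i, j] = cost of serving client i from facility j.
    Close the cheapest-to-lose facility until only k remain open."""
    m, n = dist.shape
    open_f = set(range(n))
    while len(open_f) > k:
        best_j, best_cost = None, float("inf")
        for j in open_f:
            rest = sorted(open_f - {j})
            cost = dist[:, rest].min(axis=1).sum()   # total cost if j closed
            if cost < best_cost:
                best_j, best_cost = j, cost
        open_f.remove(best_j)
    return sorted(open_f)

rng = np.random.default_rng(1)
clients = rng.random((30, 2))
facilities = rng.random((8, 2))
dist = np.linalg.norm(clients[:, None] - facilities[None, :], axis=2)
print(reverse_greedy_kmedian(dist, k=3))
```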
Solving Dynamic 0-1 Knapsack Problems Based on Dynamic Programming Algorithm
Computer Science. 2012, 39 (7): 237-241. 
The random time-varying knapsack problem (RTVKP) is a dynamic combinatorial optimization problem and a typical NP-hard problem. Because the values and sizes of items and the capacity of the knapsack change over time, the problem is more difficult to solve. We proposed an efficient algorithm, based on dynamic programming, for solving the RTVKP with a dynamically changing knapsack capacity, and analyzed the complexity of the new algorithm and the conditions for its successful execution. Simulation results show that the exact algorithm is efficient for solving the RTVKP.
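For reference, here is the classical 0-1 knapsack dynamic program that the RTVKP algorithm builds on, in its minimal static form; handling the time-varying values, sizes and capacity (re-running or repairing the table as parameters change) is the paper's contribution and is not reproduced here.

```python
def knapsack_01(values, sizes, capacity):
    """Classic DP: dp[c] = best value achievable with capacity c; O(n*capacity)."""
    dp = [0] * (capacity + 1)
    for v, s in zip(values, sizes):
        for c in range(capacity, s - 1, -1):   # descend so each item is used once
            dp[c] = max(dp[c], dp[c - s] + v)
    return dp[capacity]

print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # 220
```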
Formalizing Geo-ontology Alignment and Integration Based on Category Theory
Computer Science. 2012, 39 (7): 242-244. 
Ontology alignment can resolve the heterogeneity between ontologies, while category theory can shield the heterogeneity between ontologies and provide a uniform methodological framework for ontology integration. This paper applied category theory to ontology alignment. Based on the features of geo-ontologies, it redefined morphisms in order to construct a richer category whose objects are geo-ontologies, and whose morphisms can express not only equivalence relationships but also multiple semantic relationships between geo-ontologies. Based on the definition and features of morphisms, the paper also defined ontology alignment and alignment composition, and elaborated ontology integration.
Backward P-reasoning and the Characteristics of Internal P-attribute Class
Computer Science. 2012, 39 (7): 245-249. 
P-reasoning (packet reasoning) is obtained from the structure and dynamic characteristics of P-sets. P-reasoning is composed of internal P-reasoning (internal packet reasoning) and outer P-reasoning (outer packet reasoning), and has dynamic and "sequence" characteristics. Backward P-reasoning (backward packet reasoning) is presented by studying the "opposite sequence" of P-reasoning; it is composed of backward internal P-reasoning and backward outer P-reasoning. The characteristic of backward P-reasoning is to find the attribute set changed in the reasoning conclusion from the change of the element set in the reasoning condition. Using backward P-reasoning, the paper gave the P-attribute class, the characteristics and discovery-identification of the P-attribute class, and the application of the P-attribute class in information systems.
Improved Text Feature Selection Method Based on Text Feature Weight
Computer Science. 2012, 39 (7): 250-252. 
This paper compared several feature selection methods for text categorization and proposed a new method, TFIDF-Ci, based on the weighted frequency of distinctions between texts. It improves the TF-IDF function with weighted frequency so that feature items better discriminate between document categories. In the experiments, we compared this feature selection method with others using KNN classifiers. The experiments show that the new method has good performance and stability under different numbers of training sets.
Relative Entropy Threshold Segmentation Method Based on the Minimum Variance Filtering
Computer Science. 2012, 39 (7): 253-256. 
The relative entropy thresholding segmentation algorithm based on the co-occurrence matrix is a commonly used image segmentation method. An asymmetric co-occurrence matrix was constructed by an adaptive filtering method to improve the relative entropy thresholding segmentation method, making it better adapted to segmenting noisy images. The segmentation experiments show that this method reduces noise interference more effectively and yields more complete objects and more distinct edges.
Real Time Image Mosaic Based on GPU
Computer Science. 2012, 39 (7): 257-261. 
Telepresence with a wide field of vision is one of the key technologies for teleoperated unmanned vehicles. To meet the real-time requirement, a fast image mosaic method was proposed, which selects evenly distributed points and calculates their similarity with a rotation-invariant NCC (normalized cross correlation) operator, computes the projective transformation model with the RANSAC (random sample consensus) method, and fuses images by linear fade-in/fade-out blending. Finally, the algorithm was accelerated with a parallelized self-adaptive image matching method on a GPU (graphics processing unit). Experimental results show that the frame rate of the parallelized image mosaicing method improves by more than 60 times compared with the serial scheme on a CPU.
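A minimal sketch of the (non-rotation-invariant) NCC score used to match candidate points, with a brute-force search for the best match position; the rotation-invariant operator and the GPU parallelization described above are beyond this sketch.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation of two equal-size patches, in [-1, 1]."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def best_match(patch, image):
    """Exhaustively slide the patch over the image, keep the best NCC score."""
    ph, pw = patch.shape
    ih, iw = image.shape
    best = (-2.0, (0, 0))
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            s = ncc(patch, image[y:y + ph, x:x + pw])
            if s > best[0]:
                best = (s, (y, x))
    return best

rng = np.random.default_rng(0)
img = rng.random((40, 40))
tpl = img[10:18, 20:28].copy()
print(best_match(tpl, img))   # score ~1.0 at position (10, 20)
```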
Method for Spine Segmentation and Feature Extraction
Computer Science. 2012, 39 (7): 262-266. 
Abstract PDF(425KB) ( 352 )   
RelatedCitation | Metrics
A method for spine segmentation based on adaptive mathematical morphology was proposed in order to increase the correct rate of Pyrrophyta recognition. First, the pixel-width was introduced, and the optimal structuring element was computed automatically from the pixel-width histogram and the area distribution; the spine was then extracted by mixed mathematical morphology operations. Finally, two kinds of local biological morphology feature parameters were constructed and their visual invariance was proved. The experimental results show that the optimal structuring element can be computed for different Pyrrophyta cells and that the method achieves high precision and fast speed.
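A hedged sketch of adaptive-morphology spine extraction: the paper's pixel-width histogram and area-distribution rule are approximated here by the median of the distance transform, which is an assumption of this sketch:

```python
import numpy as np
import cv2

def spine_extract(binary):
    """Estimate the dominant stroke width of a binary (0/255 uint8) cell image
    from the distance transform, size the structuring element from it, then
    take a morphological top-hat so that structures thinner than the element
    (the spines) survive."""
    dist = cv2.distanceTransform(binary, cv2.DIST_L2, 3)
    widths = dist[dist > 0] * 2                  # approximate local stroke widths
    size = max(3, int(np.median(widths)) | 1)    # odd-sized adaptive element
    se = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (size, size))
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, se)
    return cv2.subtract(binary, opened)          # top-hat: thin spines remain
```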
Restoration Method for Video Sequences Based on Distinguishing Damaged Objects
Computer Science. 2012, 39 (7): 267-269. 
Abstract PDF(361KB) ( 388 )   
RelatedCitation | Metrics
A restoration method for video sequences based on distinguishing damaged objects was proposed. The first type of damage is restored by a total variation algorithm, while the second type is restored by texture synthesis from samples with spatial-temporal correlation. For the former, the eight neighborhoods of the target point in scratches and small speckles are repositioned. For the latter, the search for matching blocks is limited to a 21×21 range, centered on the block being repaired, in the preceding and following frames. The experimental results show that the method effectively improves both the efficiency and the quality of video sequence restoration. It is applicable to the automated restoration of old film material.
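A sketch of the texture-synthesis branch (the spatial-temporal block search); blocks are assumed to lie fully inside grayscale frames, and `mask` marks damaged pixels with nonzero values. The diffusion branch could analogously be approximated by cv2.inpaint:

```python
import numpy as np

def fill_block(frame, mask, ref, cy, cx, b=8, half=10):
    """Fill the b*b damaged block centred at (cy, cx) in `frame` with the
    best-matching block from a neighbouring frame `ref`, searched in a
    (2*half+1)^2 window (the paper uses a 21x21 search range). Matching cost
    is the SSD over the block's undamaged pixels only."""
    y0, x0 = cy - b // 2, cx - b // 2
    blk = frame[y0:y0 + b, x0:x0 + b]
    valid = mask[y0:y0 + b, x0:x0 + b] == 0          # undamaged pixels
    best, best_cost = None, np.inf
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            cand = ref[y0 + dy:y0 + dy + b, x0 + dx:x0 + dx + b]
            cost = np.sum((cand[valid].astype(float) - blk[valid]) ** 2)
            if cost < best_cost:
                best, best_cost = cand, cost
    out = blk.copy()
    out[~valid] = best[~valid]                       # copy texture into the damage
    return out
```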
Multi-object Visual Tracking Based on Reversible Jump Markov Chain Monte Carlo
Computer Science. 2012, 39 (7): 270-275. 
Abstract PDF(4449KB) ( 742 )   
RelatedCitation | Metrics
MCMC-based multi-object visual tracking was investigated here. To improve the confidence of sampling and perform the iteration effectively, a new approach to multi-object visual tracking was proposed based on reversible jump Markov chain Monte Carlo (RJMCMC) sampling. Given the image observations, the tracking problem was formulated as computing the MAP (maximum a posteriori) estimate. The prior proposal distribution of objects was developed with the aid of an association match matrix, and four types of reversible jump moves were designed for the Markov chain dynamics. The likelihood was measured via a position-weighted colour histogram match between reference objects and candidate objects. The state update was generated from mean-shift (MS) iteration rather than from a random walk in the MCMC sampling. Experimental results and quantitative evaluation demonstrate that the proposed approach is effective in challenging situations.
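The abstract names the ingredients but not the move set, so the skeleton below uses four hypothetical move labels and abstract callables; the paper additionally drives its update moves with mean-shift iterations rather than random walks:

```python
import random, math

MOVES = ["add", "delete", "stay", "update"]   # hypothetical labels for 4 move types

def rjmcmc_track(state, log_post, propose, n_iter=500):
    """Skeleton of RJMCMC sampling over multi-object configurations.
    `state` is the current object configuration, `log_post(state)` evaluates
    the unnormalised log posterior, and `propose[move](state)` returns a
    proposed configuration plus the log proposal (dimension-matching) ratio."""
    cur_lp = log_post(state)
    for _ in range(n_iter):
        move = random.choice(MOVES)
        cand, log_q_ratio = propose[move](state)
        cand_lp = log_post(cand)
        # Metropolis-Hastings acceptance, including the jump proposal ratio
        if math.log(random.random()) < cand_lp - cur_lp + log_q_ratio:
            state, cur_lp = cand, cand_lp
    return state
```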
Research on Boolean Operation in Semantic Feature Modeling System
Computer Science. 2012, 39 (7): 276-279. 
Abstract PDF(403KB) ( 326 )   
RelatedCitation | Metrics
To improve the efficiency of Boolean operations in semantic feature modeling systems, a semantic-representation based method was proposed. It represents feature models with a semantic representation, manages feature elements with a cellular model, improves the efficiency of detecting feature interactions, and builds the new feature entity by splitting intersecting cells and semantic faces. This method can not only build the Boolean entity rapidly and exactly, but also avoid errors such as holes and lost geometric faces. Experiments show that the new method is more adaptable and practicable.
Vector Quantization Method for Image Compression Based on GA and LBG Clustering Algorithm
Computer Science. 2012, 39 (7): 280-281. 
Abstract PDF(273KB) ( 448 )   
RelatedCitation | Metrics
Vector quantization is a very important image compression technique whose key is the design of the codebook. The classic LBG clustering algorithm yields varying performance because it is sensitive to the selection of the initial cluster centers. This paper embedded the LBG clustering algorithm into a genetic algorithm to optimize the cluster centers, combining the strong local search ability of LBG with the global optimization ability of the GA. The hybrid method not only improves the quality of the codebook but also speeds up convergence.
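A hedged sketch of the hybrid: each member of a small GA population is a codebook, refined by a few LBG steps; since the abstract does not spell out the crossover/mutation operators, elite perturbation stands in for them here:

```python
import numpy as np

def lbg_step(data, codebook):
    """One LBG (Linde-Buzo-Gray) iteration: assign each training vector to its
    nearest codeword, then move every codeword to its cluster centroid."""
    d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    assign = d.argmin(1)
    for k in range(len(codebook)):
        members = data[assign == k]
        if len(members):
            codebook[k] = members.mean(0)
    return codebook

def ga_lbg(data, n_codes=64, pop=10, gens=20, lbg_iters=3):
    """GA over codebooks with LBG local search; fitness = mean quantization
    distortion (lower is better)."""
    rng = np.random.default_rng(0)
    popn = [data[rng.choice(len(data), n_codes, replace=False)].copy()
            for _ in range(pop)]
    for _ in range(gens):
        for cb in popn:
            for _ in range(lbg_iters):
                lbg_step(data, cb)
        dist = [((data[:, None] - cb[None]) ** 2).sum(-1).min(1).mean()
                for cb in popn]
        elite = [popn[i] for i in np.argsort(dist)[:pop // 2]]
        # stand-in "mutation": perturb elites to refill the population
        popn = elite + [e + rng.normal(0, 1, e.shape) for e in elite]
    return popn[0]
```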
Color Image Retrieval Algorithm Based on the Maximum Similarity Submatrix
Computer Science. 2012, 39 (7): 282-286. 
Abstract PDF(449KB) ( 408 )   
RelatedCitation | Metrics
Color is one of the important characteristics of an image and is widely used in image retrieval. The conventional color histogram retrieval algorithm discards the spatial location information of colors, which degrades retrieval precision. Aiming at this problem, this paper presented a color image retrieval algorithm based on the maximum similarity submatrix. The algorithm uses an odd square matrix in image segmentation to capture the color spatial distribution pattern and obtain a color matrix, and then finds the maximum similarity submatrix of the two color matrices to compute the similarity. The results show that the algorithm accurately describes the image color characteristics and that the similarity calculation is very effective. Compared with the global histogram, it achieves better retrieval precision.
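A simplified sketch under stated assumptions: the color matrix records each grid cell's dominant quantized color, and the submatrix search below scores equally placed square submatrices, a stand-in for the paper's exact maximum-similarity-submatrix definition:

```python
import numpy as np

def color_matrix(img, n=9, levels=8):
    """Divide an HxWx3 uint8 image into an n*n grid (n odd, per the paper's
    odd square matrix) and record each cell's dominant quantized colour code."""
    h, w = img.shape[:2]
    q = (img // (256 // levels)).astype(np.int32)
    code = q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]
    m = np.zeros((n, n), dtype=np.int32)
    for i in range(n):
        for j in range(n):
            cell = code[i*h//n:(i+1)*h//n, j*w//n:(j+1)*w//n]
            m[i, j] = np.bincount(cell.ravel()).argmax()
    return m

def max_similar_submatrix(a, b, min_size=3):
    """Best fraction of matching entries over all square submatrix positions,
    scaled to favour larger matching regions (simplified similarity score)."""
    n, best = a.shape[0], 0.0
    for s in range(min_size, n + 1):
        for dy in range(n - s + 1):
            for dx in range(n - s + 1):
                sim = np.mean(a[dy:dy+s, dx:dx+s] == b[dy:dy+s, dx:dx+s])
                best = max(best, sim * s / n)
    return best
```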
Application of Trajectory Estimation Algorithm in the Automatic Capture of Key Figures
Computer Science. 2012, 39 (7): 287-289. 
Abstract PDF(258KB) ( 317 )   
RelatedCitation | Metrics
When the colors of the key figures and the background area are too close, the color difference between them is not obvious. The traditional algorithm captures key figures from the gray-scale differences of adjacent images and cannot cope with the case where the color difference between the key figures and the background is too small, which reduces the precision of automatic key-figure capture. To solve this problem, a method based on a trajectory estimation algorithm for the automatic capture of key figures was proposed in this paper. The method extracts the characteristic parameters of the key figures and predicts their movement trajectory to complete the automatic capture. The experiments show that the algorithm improves the accuracy of automatic key-figure capture and obtains satisfactory results.
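The abstract does not specify the trajectory model, so the sketch below assumes simple constant-velocity extrapolation of the tracked centroids as a stand-in for the paper's estimator:

```python
import numpy as np

def predict_track(positions, n_ahead=5):
    """Constant-velocity trajectory estimation: fit linear motion to the
    observed key-figure centroids [(x, y), ...] and extrapolate n_ahead
    frames, so the capture window can be placed where the figure will be
    even when colour contrast with the background is poor."""
    t = np.arange(len(positions))
    px = np.polyfit(t, [p[0] for p in positions], 1)   # x(t) = vx*t + x0
    py = np.polyfit(t, [p[1] for p in positions], 1)   # y(t) = vy*t + y0
    future = np.arange(len(positions), len(positions) + n_ahead)
    return [(np.polyval(px, f), np.polyval(py, f)) for f in future]
```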
Remote Sensing Image Watermarking Based on PCA and Data Fusion
Computer Science. 2012, 39 (7): 290-292. 
Abstract PDF(4251KB) ( 415 )   
RelatedCitation | Metrics
An algorithm that embeds a watermark into remote sensing images to protect their copyright was proposed. The algorithm combines data fusion technology with digital watermarking technology. First, the panchromatic image was decomposed by wavelets and the edge feature of the third-level approximation coefficients was extracted. A PCA transform was applied to the edge feature, taking the first principal component as the watermark, which was then embedded into the detail coefficients. The approximation coefficients and the modified detail coefficients were reconstructed to obtain the watermarked image. Finally, the panchromatic image containing the watermark was fused with the multispectral image using the wavelet-transform and PCA fusion method. For extraction, an independent component analysis (ICA) method was used. The algorithm protects the copyright of remote sensing images and certifies their authenticity without destroying the original information and characteristics of the images. Experimental results show that the algorithm is feasible and effective.
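A sketch of the embedding path under stated assumptions: PyWavelets for the 3-level decomposition, a gradient-magnitude edge map, SVD-based PCA, and additive embedding into one detail subband; the fusion and ICA extraction steps are omitted:

```python
import numpy as np
import pywt

def embed_watermark(pan, alpha=0.05):
    """3-level wavelet decomposition of the panchromatic image, edge map of
    the level-3 approximation, first principal component of the edge map as
    the watermark, additive embedding into the level-3 horizontal detail
    subband (illustrative choices; subband and alpha are assumptions)."""
    coeffs = pywt.wavedec2(pan.astype(float), 'haar', level=3)
    cA3 = coeffs[0]
    gy, gx = np.gradient(cA3)              # simple gradient-magnitude edges
    edges = np.hypot(gx, gy)
    centred = edges - edges.mean(0)        # PCA via SVD of centred rows
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    watermark = centred @ vt[0]            # projection on 1st principal component
    cH3 = coeffs[1][0]                     # level-3 horizontal detail subband
    cH3 += alpha * watermark[:, None]      # additive, row-wise embedding
    return pywt.waverec2(coeffs, 'haar')
```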
Layered Near-lossless Compression Scheme of Hyper-spectral Image in Airborne Remote Sensing System
Computer Science. 2012, 39 (7): 293-296. 
Abstract PDF(460KB) ( 322 )   
RelatedCitation | Metrics
For real-time compression and transmission of hyper-spectral images in airborne remote sensing systems, a new layered near-lossless compression scheme was proposed based on an analysis of the correlations among bit planes. An error-prevention DPCM was used for spectral decorrelation of the higher bit plane, and then a compression ratio control strategy was applied according to the complexity of the residual image. The lower bit plane was uniformly quantized after quadtree splitting and mean removal. The algorithm is easy to implement and can effectively control the compression ratio. The difference between the actual and the target compression ratio is less than 5%. The PSNR is above 33dB and the XSD of the higher-bit reconstructed image is above 0.98 when the compression ratio is about 10.
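A minimal sketch of the spectral DPCM step: each band is predicted from the previous band and only the residual is coded, with the first band kept intact as a restart point (a simple reading of the error-prevention idea; the paper's exact mechanism is not given in the abstract):

```python
import numpy as np

def spectral_dpcm(cube):
    """cube: (bands, rows, cols) array. Band-to-band prediction residuals;
    the first band is stored as-is and serves as the decoding reference."""
    cube = cube.astype(np.int32)
    residual = np.empty_like(cube)
    residual[0] = cube[0]                  # reference band
    residual[1:] = cube[1:] - cube[:-1]    # prediction error per band
    return residual

def spectral_dpcm_inverse(residual):
    """Exact inverse: cumulative sum along the spectral axis."""
    return np.cumsum(residual, axis=0)
```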
Xen Virtual Machine Scheduling Enhancement by Improving Cache Efficiency
Computer Science. 2012, 39 (7): 297-301. 
Abstract PDF(469KB) ( 346 )   
RelatedCitation | Metrics
Neither of the two scheduling algorithms shipped with Xen 4.1 achieves good performance for multitask virtualization on servers. After analyzing the characteristics and requirements of the three types of tasks running on virtual machines, optimization methods were proposed: sorting I/O tasks by consumed time, preferentially scheduling the I/O tasks with the least consumed time, and recording computing-mode tasks associated with the cache to maintain cache consistency. This improves cache efficiency while preserving I/O response and bandwidth performance.
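A concept sketch of the I/O-task ordering idea only; the actual change lives inside Xen's scheduler in C, and the queue interface below is an illustrative assumption:

```python
import heapq

class IOQueue:
    """Runnable I/O-bound vCPUs kept in a min-heap keyed by recently consumed
    CPU time, so short I/O handlers are dispatched first and response latency
    stays low."""
    def __init__(self):
        self._heap = []

    def push(self, vcpu_id, consumed_ns):
        heapq.heappush(self._heap, (consumed_ns, vcpu_id))

    def pop_next(self):
        """Dispatch the I/O task that has consumed the least time so far."""
        _consumed_ns, vcpu_id = heapq.heappop(self._heap)
        return vcpu_id
```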
Firm-code Disassembly Technology Based on IVT Reconstruction
Computer Science. 2012, 39 (7): 302-304. 
Abstract PDF(351KB) ( 575 )   
RelatedCitation | Metrics
Disassembly is an important part of firmware reverse engineering analysis (FREA), and its correctness directly influences the precision of FREA. At present, most disassembly methods target ordinary application programs and cannot be applied directly to firm-code because of its particularities. The IVT (Interrupt Vector Table) is the core of firm-code, and effective interrupt vectors can be obtained by reconstructing the IVT; the more interrupt vectors we obtain, the more precise the disassembly result is. The structural characteristics of firm-code were studied, an IVT reconstruction method was introduced, and a disassembly technology based on IVT reconstruction was proposed. The experimental results show that, compared with traditional static disassembly methods, the proposed technology effectively improves the precision of firm-code disassembly, recovering both the main function and the interrupt subprograms. The disassembly precision is increased by 8.72% on average.
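A sketch of extracting candidate interrupt vectors from a raw firmware image. The target layout is an assumption (an ARM Cortex-M style table at offset 0, with the flash base and size given as illustrative defaults), not the paper's specific platform:

```python
import struct

def recover_ivt(firmware: bytes, n_vectors=64, base=0x0800_0000, size=0x100000):
    """Read 32-bit little-endian table entries from the head of the image
    (slot 0 is the initial stack pointer on Cortex-M, so it is skipped) and
    keep addresses that fall inside the flash range as disassembly entry
    points for the interrupt subprograms."""
    vectors = []
    for i in range(1, n_vectors):
        off = i * 4
        if off + 4 > len(firmware):
            break
        (addr,) = struct.unpack_from("<I", firmware, off)
        addr &= ~1                             # clear the Thumb mode bit
        if base <= addr < base + size:         # plausible code address
            vectors.append(addr)
    return sorted(set(vectors))
```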
Robust H∞ Guaranteed Cost Fault-tolerant Control for Uncertain NCS Based on Dynamic Output Feedback
Computer Science. 2012, 39 (7): 305-312. 
Abstract PDF(533KB) ( 309 )   
RelatedCitation | Metrics
For a class of networked control systems with delay and data-packet dropout, considering both parameter uncertainties and energy-bounded external disturbances, the problem of robust H∞ guaranteed cost fault-tolerant control against actuator failures was discussed using a dynamic output feedback control strategy. A Lyapunov-Krasovskii functional containing triple-integral terms was proposed, and a delay-dependent sufficient condition for robust H∞ guaranteed cost integrity against actuator failures was derived; meanwhile, an optimal controller design method was given via the solution of several linear matrix inequalities. Finally, an example was used to illustrate the effectiveness and feasibility of the proposed approach. Since no model transformation is used in the proof and the delay information is fully exploited through piecewise processing, the results are less conservative.
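For readers unfamiliar with the construction, a representative Lyapunov-Krasovskii functional with a triple-integral term has the following form (illustrative only; the paper's exact functional is not reproduced in the abstract):

```latex
V(t) = x^{T}(t)Px(t)
     + \int_{t-\tau}^{t} x^{T}(s)Qx(s)\,ds
     + \int_{-\tau}^{0}\!\int_{t+\theta}^{t} \dot{x}^{T}(s)R\,\dot{x}(s)\,ds\,d\theta
     + \int_{-\tau}^{0}\!\int_{\theta}^{0}\!\int_{t+\lambda}^{t}
         \dot{x}^{T}(s)Z\,\dot{x}(s)\,ds\,d\lambda\,d\theta,
\qquad P,Q,R,Z \succ 0,
```

where \(\tau\) is the delay bound; the triple-integral term is what yields the extra delay-dependent information that reduces conservativeness.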
Research on Software Development Method for the Parallel Modeling of System-on-chip
Computer Science. 2012, 39 (7): 313-316. 
Abstract PDF(345KB) ( 637 )   
RelatedCitation | Metrics
As a description-language library for system-on-chip, SystemC can define an executable virtual prototype of a hardware platform and has become the de facto industrial standard, but it cannot make full use of the computational capacity of SMP (Symmetric Multi-Processing) machines because of its inherently sequential simulation kernel. To solve this problem, this paper proposed a software development method spanning the kernel engine to high-level modeling. Using parallel programming, the method improves the kernel scheduling algorithm and implements a truly parallel simulation kernel; on this basis, combining the interface-method-call mechanism and the hierarchical channel concept of SystemC, it proposes a rapid software development hierarchy. The experiments show that modeling a system-on-chip with this method can efficiently improve simulation speed and offers high development efficiency in practice.
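A language-neutral concept sketch of the parallel-kernel idea (the paper's kernel is a modified SystemC scheduler in C++; the `evaluate`/`update` methods below are hypothetical stand-ins for SystemC's evaluate/update phases):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_delta_cycle(runnable_processes, workers=4):
    """Within one delta cycle, runnable processes are independent under
    evaluate/update semantics, so they can be evaluated concurrently; the
    resulting signal updates are then committed sequentially to keep the
    simulation deterministic."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # evaluate phase: run every runnable process in parallel
        results = list(pool.map(lambda p: p.evaluate(), runnable_processes))
    # update phase: commit pending signal values one at a time
    for p, pending in zip(runnable_processes, results):
        p.update(pending)
```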