Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 38 Issue 11, 2011
  
Survey of Decision Table Research of Attribute Reduction
ZHANG Ren-wei,BAI Xiao-ying,YU Lian,LU Hao
Computer Science. 2011, 38 (11): 1-6. 
The paper reviewed the basic concepts of decision tables. It analyzed the problem of condition conflicts and the necessity of attribute reduction when a decision table with a large number of condition attributes suffers from combinatorial explosion. The paper surveyed the state-of-the-art research on these two problems and compared the algorithms from different aspects, including motivations, approaches and efficiency. It then discussed research challenges and future research focuses, including decision table construction, algorithm scalability and efficiency.
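The combinatorial-explosion point above can be illustrated with a toy sketch (not an algorithm from any surveyed paper): a reduct is a minimal subset of condition attributes that still classifies every object consistently, and the naive search below enumerates all subsets, which is exactly what scalable reduction algorithms must avoid.

```python
from itertools import combinations

def partitions(rows, attrs):
    """Group row indices by their values on the given attribute columns."""
    blocks = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        blocks.setdefault(key, []).append(i)
    return list(blocks.values())

def is_consistent(rows, decisions, attrs):
    """True if every block induced by `attrs` has a single decision value."""
    return all(len({decisions[i] for i in block}) == 1
               for block in partitions(rows, attrs))

def brute_force_reduct(rows, decisions):
    """Smallest attribute subset preserving the decision (exponential search)."""
    n = len(rows[0])
    for k in range(1, n + 1):
        for attrs in combinations(range(n), k):
            if is_consistent(rows, decisions, attrs):
                return attrs
    return tuple(range(n))

# Toy decision table: 3 condition attributes per row, one decision per row.
rows      = [(0, 1, 0), (0, 0, 1), (1, 1, 0), (1, 0, 1)]
decisions = [0, 1, 0, 1]
print(brute_force_reduct(rows, decisions))  # (1,): attribute 1 alone decides
```

With n attributes the search visits up to 2^n subsets, which is the scalability problem the surveyed heuristics address.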
Survey on Network Flow Watermarking Technologies
ZHANG Lian-cheng,WANG Zhen-xing,LIU Hui-sheng
Computer Science. 2011, 38 (11): 7-11. 
As active traffic analysis approaches, network flow watermarking technologies can effectively trace anonymous abusers and network attackers back through a connection chain of stepping stones. Because they achieve high detection rates and low false positive rates within a short observation time, flow watermarking technologies have significant applications in many fields, such as attack traceback, network supervision and attack forensics. This paper first presented the basic framework and major characteristics of flow watermarking technologies, then briefly introduced typical packet-payload-based, traffic-rate-based and packet-timing-based flow watermarking schemes. After that, the main attack technologies, such as the timing analysis attack, multi-flow attack and mean-square autocorrelation attack, and the countermeasures of flow watermarking schemes were described. Finally, near-future research directions were discussed.
Survey of Attack Graph Technique
CHEN Feng,MAO Han-dong,ZHANG Wei-ming,LEI Chang-hai
Computer Science. 2011, 38 (11): 12-18. 
Network attack techniques are becoming increasingly diversified and intelligent; an attacker can often infiltrate a seemingly well-guarded network system using multi-step attacks that exploit sequences of related vulnerabilities. As a novel vulnerability assessment technique, the attack graph technique analyzes the interaction between the target network and the attacker by modeling these two agents, and generates an attack graph to show possible attack paths. Because this technology can automatically discover unknown system vulnerabilities and the relationships between vulnerabilities, it is currently a hot research subject. The attack graph technique has passed through the stage of manual analysis and the stage of automatic analysis of small-scale networks, and is currently moving toward automatic analysis of large-scale networks. In this paper, the development of the attack graph technique was summarized, challenges arising from current research were discussed, and some suggestions for future research work were put forward.
Recent Advances in Wireless Multimedia Technologies with Network Coding
XU Jin,FU Zhi-zhong,LI Xiao-feng,XIAN Hai-ying
Computer Science. 2011, 38 (11): 19-25. 
In recent years, researchers around the world have actively studied wireless multimedia technologies using network coding and have achieved promising results. This paper first analyzed the opportunities and challenges for networked multimedia streaming and introduced the frequently-used mechanisms for multimedia streaming with network coding, then summarized the latest research advances in typical wireless multimedia application scenarios, i.e., wireless multimedia multicasting/broadcasting/unicasting and wireless video conferencing. Finally, the open issues of this research field were discussed.
Research of P2P Traffic Identification Based on Decision Tree Ensemble
LIU San-min,SUN Zhi-xin,LIU Yu-xia
Computer Science. 2011, 38 (11): 26-29. 
A novel P2P traffic identification method based on a decision tree ensemble was proposed to improve model stability. First, the optimal feature set was extracted using the fast correlation-based filter (FCBF); then a decision model consisting of five sub-classifiers formed by Bagging was built, with classification by majority vote. Test results on an open data set under two distinct experiment schemes, comparing the proposed model with naive Bayes and naive Bayes based on kernel density estimation, show that the proposed model has better stability, high classification accuracy and high P2P traffic identification accuracy; an explanation of this phenomenon is also given.
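As an illustration of the Bagging-plus-majority-vote structure described above, here is a minimal sketch; the 1-NN base learner, the feature values and the flow labels are stand-ins invented for the example (the paper uses decision trees over FCBF-selected features).

```python
import random
from collections import Counter

def one_nn(train_X, train_y, x):
    """1-nearest-neighbour base classifier (a stand-in for a decision tree)."""
    i = min(range(len(train_X)),
            key=lambda j: sum((a - b) ** 2 for a, b in zip(train_X[j], x)))
    return train_y[i]

def bagging_fit(X, y, n_models=5, seed=0):
    """Draw bootstrap samples of the training set; each trains one sub-classifier."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_models):
        idx = [rng.randrange(len(X)) for _ in X]
        samples.append(([X[i] for i in idx], [y[i] for i in idx]))
    return samples

def bagging_predict(samples, x):
    """Classify by majority vote over the sub-classifiers."""
    votes = [one_nn(sx, sy, x) for sx, sy in samples]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical flow features after feature selection: (mean packet size, duration).
X = [(1500, 300), (1400, 280), (100, 5), (120, 8)]
y = [1, 1, 0, 0]          # 1 = P2P-like flow (labels invented for the sketch)
model = bagging_fit(X, y)
print(bagging_predict(model, (1450, 290)))
```

The vote over bootstrap-trained sub-classifiers is what gives the ensemble its stability relative to any single classifier.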
Singular Vector Quantization Watermarking Scheme Based on DWT-SVD
HU Qing,LONG Dong-yang
Computer Science. 2011, 38 (11): 30-33. 
A novel watermarking scheme was proposed for image authentication by applying a simple quantization process to the singular value decomposition in the wavelet domain. Unlike traditional wavelet-based watermarking schemes, where the watermark bits are embedded directly in the wavelet coefficients, the singular vectors of the blocks within the wavelet low-frequency sub-band of the original image are used for embedding the watermark. Watermark detection is efficient and blind, requiring only the key and a threshold rather than the original image. Experimental results show that the quality of the watermarked image is good, and that the scheme is quite effective against general image processing and extremely robust against JPEG compression.
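A hedged sketch of the general idea of quantizing a singular value of a low-frequency block to carry a watermark bit (quantization index modulation); the one-level Haar LL computation, the block size and the step size `delta` are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def haar_ll(img):
    """One-level Haar low-frequency (LL) sub-band: 2x2 block averages."""
    return (img[0::2, 0::2] + img[0::2, 1::2]
          + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def embed_bit(block, bit, delta=8.0):
    """Quantize the block's largest singular value to carry one bit (QIM)."""
    u, s, vt = np.linalg.svd(block)
    q = int(np.floor(s[0] / delta))
    if q % 2 != bit:          # even multiples of delta encode 0, odd encode 1
        q += 1
    s[0] = q * delta + delta / 2
    return u @ np.diag(s) @ vt

def extract_bit(block, delta=8.0):
    """Blind extraction: only the quantization step is needed, not the original."""
    s = np.linalg.svd(block, compute_uv=False)
    return int(np.floor(s[0] / delta)) % 2

rng = np.random.default_rng(1)
img = rng.uniform(0, 255, (8, 8))   # toy "image"
ll = haar_ll(img)                   # 4x4 LL sub-band used as one block
print(extract_bit(embed_bit(ll, 1)))  # 1
```

Robustness to JPEG-like perturbations comes from the quantization margin `delta/2` around each encoded level.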
Markov Game Theory Based Routing Countering Eavesdropping
MA Zheng-xian,DONG Rong-sheng,WANG Yu-bin,LIU Jian-ming
Computer Science. 2011, 38 (11): 34-36. 
To reduce the probability of eavesdropping in wireless sensor networks, this paper proposed a Markov game theory based routing (MGBR) to counter tapping under stochastic routing. The sender and the eavesdropper are considered the two players of the game. Data is transmitted by senders probabilistically, so determining which transmissions to tap effectively is difficult for the eavesdropper. Simulation results obtained with the PRISM tool demonstrate that there is a Nash equilibrium point in MGBR at which the probability of being eavesdropped is minimized. Furthermore, we presented the variation tendency of the probability of information being eavesdropped at the Nash equilibrium point. Compared with a protocol based on the minimal-hop algorithm, MGBR can effectively reduce the probability of information being eavesdropped.
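The sender/eavesdropper game can be illustrated on the smallest possible case: a 2x2 zero-sum matrix game in which the sender mixes over two paths and the eavesdropper taps one of them. The interception probabilities in the matrix are invented for the example; the equilibrium mix follows from the usual indifference condition.

```python
def mixed_equilibrium_2x2(A):
    """Sender's equilibrium probability of choosing path 0 in a 2x2 zero-sum game.
    A[i][j] = interception probability if the sender uses path i and path j is tapped.
    The mix p makes the eavesdropper indifferent between the two taps."""
    a, b = A[0]
    c, d = A[1]
    # indifference: p*a + (1-p)*c == p*b + (1-p)*d
    return (d - c) / ((a - b) + (d - c))

# Hypothetical numbers: path 0 is easier to tap (0.8) than path 1 (0.4).
A = [[0.8, 0.0],
     [0.0, 0.4]]
p = mixed_equilibrium_2x2(A)
print(round(p, 3))  # 0.333: the sender favours the harder-to-tap path
```

At this mix the interception probability is 0.8/3 ≈ 0.267 regardless of which path the eavesdropper watches, which is the game value being minimized.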
Algorithm of QoS Routing Based on Underwater Acoustic Delay and Disruption Tolerant Network
LIU Yun-luo,HU Jia-hui,FENG Yan-juan,HUANG Qiu-xiang
Computer Science. 2011, 38 (11): 37-39. 
Taking whales as network nodes, the QoS routing problem was studied for the particular environment of an underwater acoustic DTN. Under energy and bandwidth constraints, a novel QoS routing algorithm based on the nodes' mobility model was proposed. Theoretical analysis and simulations show that the algorithm achieves the intended design objectives in terms of packet loss rate, energy consumption and latency. This work has important significance for the protection of endangered whales, ocean observation, resource exploration and so on.
Research on Correlation Power Analysis Attack against PRESENT
LIU Hui-ying,WANG Tao,ZHAO Xin-jie,ZHOU Lin
Computer Science. 2011, 38 (11): 40-42. 
The correlation power analysis attack against PRESENT was discussed in this paper. A correlation power attack method targeting the S-box of the PRESENT cipher was presented, based on the Hamming distance power leakage model. Experimental results indicate that a hardware implementation of PRESENT without protection measures is vulnerable to correlation power analysis attacks. The 64-bit first-round expanded key can be recovered with 5 power traces, and the search space for the 80-bit PRESENT master key can be reduced to 2^16, so cryptographic devices should be protected against this kind of attack.
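A minimal, idealized sketch of correlation power analysis on one PRESENT S-box nibble: the traces are simulated as the noiseless Hamming weight of the S-box output, and the key guess maximizing the correlation is selected. The trace model is a simplification; the paper attacks a real hardware implementation with a Hamming-distance model.

```python
import numpy as np

# The PRESENT 4-bit S-box.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def hw(x):
    """Hamming weight, the simplest leakage model."""
    return bin(x).count("1")

def cpa_recover_nibble(plaintexts, traces):
    """Return the 4-bit key whose predicted leakage correlates best with traces."""
    def corr(k):
        model = [hw(SBOX[p ^ k]) for p in plaintexts]
        return abs(np.corrcoef(model, traces)[0, 1])
    return max(range(16), key=corr)

true_key = 0x9
plaintexts = list(range(16))
# Idealized noiseless traces: leakage = HW of the S-box output.
traces = np.array([hw(SBOX[p ^ true_key]) for p in plaintexts], dtype=float)
print(hex(cpa_recover_nibble(plaintexts, traces)))  # 0x9
```

Attacking each S-box nibble independently is what collapses the first-round key search, matching the paper's observation that very few traces suffice without countermeasures.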
Research on Trust Degree of Authentication
ZHANG Ming-de,ZHENG Xue-feng,LV Shu-wang,ZHANG Qing-guo
Computer Science. 2011, 38 (11): 43-47. 
There are several uncertain factors in the authentication procedure, so authentication services are not absolutely credible. However, existing research has focused on particular aspects of authentication rather than the whole. Based on a formalized description of authentication and a logical analysis of its security, the uncertain factors in the authentication procedure were divided into four categories. Meanwhile, a general approach to the trust degree of authentication was proposed through the introduction of trust classification, non-trust factors and security degree. Two examples were given to demonstrate how to use this approach.
Low Computational Load Anonymous Routing Protocol for Ad-hoc Networks
LIU Fang-bin,ZHANG Kun,ZHANG Hong
Computer Science. 2011, 38 (11): 48-53. 
Nodes in Ad-hoc networks have limited energy, poor computational ability and high mobility, while public key encryption has a heavy computational load, consumes a lot of energy and takes a long time to compute, so public key encryption is poorly suited to Ad-hoc networks. Yet existing anonymous routing protocols involve many public key operations. To reduce them, we applied bilinear pairings and zero-knowledge proofs to anonymous routing and proposed a new anonymous routing protocol, LCAR (low computational load anonymous routing protocol for Ad-hoc networks), which greatly reduces the number of public key operations. Our analysis and simulation study verify that the protocol is much better than existing anonymous routing protocols in terms of energy efficiency and end-to-end delay.
Novel Method for Anomaly Detection of User Behavior Based on Shell Commands and DTMC Models
XIAO Xi,ZHAI Qi-bin,TIAN Xin-guang,CHEN Xiao-juan
Computer Science. 2011, 38 (11): 54-58. 
This paper presented a novel method for anomaly detection of user behavior based on the discrete-time Markov chain model, which is applicable to intrusion detection systems using shell commands as audit data. In the training period, the uncertainty of the user's behavior and the short-term correlation of shell command operations were fully considered. The method takes sequences of shell commands as the basic processing units, merges the sequences into sets according to their ordered frequencies, and then constructs the states of the Markov chain on the merged results. It thus increases the accuracy of the normal behavior profile and the adaptability to variations in the user's behavior, while sharply reducing the number of states and the required storage space. In the detection stage, considering the real-time performance and accuracy requirements of the detection system, it analyzes the anomaly degree of the user's behavior by computing the occurrence probabilities of state sequences, and then provides two schemes, based on the probability stream filtered with a single window or multiple windows, to classify the user's behavior. The results of our experiments show that this method achieves higher detection performance and practicability than others.
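The transition-probability machinery behind such a detector can be sketched as follows; this simplified version works on single commands rather than the paper's merged command-sequence states, and the probability floor is an assumed smoothing constant.

```python
from collections import defaultdict

def fit_chain(commands):
    """Estimate first-order Markov transition probabilities from training commands."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(commands, commands[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def sequence_score(chain, commands, floor=1e-4):
    """Average transition probability of a session; a low score flags an anomaly."""
    probs = [chain.get(a, {}).get(b, floor) for a, b in zip(commands, commands[1:])]
    return sum(probs) / len(probs)

train = ["ls", "cd", "ls", "cat", "ls", "cd", "ls", "cat"]
chain = fit_chain(train)
print(sequence_score(chain, ["ls", "cd", "ls", "cat"]))      # familiar pattern
print(sequence_score(chain, ["rm", "wget", "chmod", "sh"]))  # unseen pattern
```

Thresholding the score over a sliding window of recent commands corresponds to the single-window scheme mentioned above; smoothing the score stream over several windows corresponds to the multi-window one.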
Clustering Algorithm for Cognitive Radio Ad-hoc Networks Based on Affinity Propagation
ZHANG Jian-zhao,YAO Fu-qiang,ZHAO Hang-sheng,LIU Yong-xiang,WANG Fan
Computer Science. 2011, 38 (11): 59-61. 
Considering the dynamic heterogeneity of available channels and the scarcity of global common channels in cognitive radio Ad-hoc networks (CRAHNs), a distributed clustering algorithm, CBAP-CMI (Clustering Based on Affinity Propagation with Controlled Message Interchanges), was proposed. Based on the interchange and update of messages among neighboring nodes, a cluster structure is constructed on the channels shared by most local neighbors, with the nodes holding the most available channels as cluster heads. To reduce the clustering overhead and adapt to variations in CRAHNs, CBAP-CMI controls the number of message interchanges and achieves rapid distributed clustering. Simulation results show that the proposed algorithm reduces the number of clusters and increases the average number of available channels per link and the number of common channels in each cluster, providing an efficient topology for distributed spectrum cooperation in CRAHNs.
Hop-counts-based Fairness Enhancement RFD and its Classifier Implementation
WU Hang-xing,ZHENG Xue-feng,PAN Wen-ping
Computer Science. 2011, 38 (11): 62-66. 
To solve the unfair bandwidth allocation induced by different RTTs and multiple congested links, we proposed a fairness-enhanced RED (FERED) based on measurement results from the real Internet. FERED utilizes hop counts to improve fairness and constructs a two-category classifier to modify the RED algorithm. Simulation results in NS2 show that FERED can efficiently improve fairness. Moreover, FERED can be deployed easily.
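For reference, the classical RED drop-probability curve that FERED modifies looks like this; the hop-count scaling shown is a hypothetical illustration of "using hop counts to improve fairness", not the paper's actual classifier-based rule.

```python
def red_drop_prob(avg_q, min_th=5.0, max_th=15.0, max_p=0.1):
    """Classical RED: drop probability grows linearly between the two thresholds."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

def fered_like_drop_prob(avg_q, hop_count, ref_hops=8, **kw):
    """Hypothetical hop-count scaling: flows that have crossed more (potentially
    congested) links are dropped more gently. Illustration only."""
    return red_drop_prob(avg_q, **kw) * ref_hops / (ref_hops + hop_count)

print(red_drop_prob(10.0))            # 0.05: halfway between the thresholds
print(fered_like_drop_prob(10.0, 8))  # 0.025: long-path flow dropped less
```

The thresholds and `max_p` are the usual RED parameters; any hop-count-aware variant has to choose how strongly path length discounts the drop probability.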
New Improvement of the Hadoop Relevant Data Locality Scheduling Algorithm Based on LATE
LI Li-ying,TANG Zhuo,LI Ren-fa
Computer Science. 2011, 38 (11): 67-70. 
Scheduling is currently a hot research issue in cloud computing; its purpose is to coordinate cloud computing resources so that they are used fully and rationally. Data locality is one of the main properties of Hadoop as a particular cloud platform. The paper discussed this property and proposed a new improvement of the Hadoop data-locality-aware scheduling algorithm based on LATE. The algorithm mainly solves the performance problem of backing up slow tasks, which arises during data reads, takes most of the execution time and eventually limits processing speed. Finally, experiments were carried out on the algorithm and its behavior was analyzed, verifying that the algorithm improves the response time and the throughput of the whole system.
On a Family of Pseudorandom Binary Sequences from Elliptic Curve
ZHAO Long,HAN Wen-bao,JI Hui-fang
Computer Science. 2011, 38 (11): 71-74. 
A family of pseudorandom binary sequences was constructed from elliptic curves over binary finite fields. With the help of exponential sums on elliptic curves, the well-distribution measure and the correlation measure of order k were computed, and a lower bound on the linear complexity was derived from the relation between linear complexity and the correlation measure of order k. The results show that these sequences have good randomness and strong potential for applications in communication systems and cryptography.
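Linear complexity, the measure bounded in this paper, is computable for any concrete binary sequence with the Berlekamp-Massey algorithm; a compact GF(2) version is sketched below (a general-purpose tool, not part of the paper's construction).

```python
def berlekamp_massey(bits):
    """Linear complexity of a binary sequence: length of the shortest LFSR
    over GF(2) that generates it."""
    b, c = [1], [1]          # previous and current connection polynomials
    L, m = 0, -1             # current complexity; index of the last length change
    for n in range(len(bits)):
        # discrepancy between the LFSR prediction and the actual bit
        d = bits[n]
        for i in range(1, L + 1):
            d ^= c[i] & bits[n - i]
        if d:
            t = c[:]
            shift = n - m
            if len(b) + shift > len(c):
                c = c + [0] * (len(b) + shift - len(c))
            for i, bi in enumerate(b):
                c[i + shift] ^= bi
            if 2 * L <= n:
                L, m, b = n + 1 - L, n, t
    return L

print(berlekamp_massey([0, 1, 0, 1, 0, 1]))  # 2: period-2 sequence
```

A lower bound on linear complexity, as derived in the paper via the order-k correlation measure, certifies that no short LFSR can reproduce the sequence.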
Efficient Fault-tolerant Mechanism in Super-peer Network
TAN Yi-hong,LUAN Xi-dao,LI Bin
Computer Science. 2011, 38 (11): 75-78. 
In super-peer networks, a super-peer acts as a server for its client peers, managing them and executing queries, which improves search efficiency. However, the failure of a super-peer seriously affects the stability and search efficiency of the network. A novel, efficient fault-tolerance mechanism was proposed. First, improving on the traditional undirected double-loop structure, a k-undirected double-loop structure was proposed, and the topology of the super-peer layer was built with it. Meanwhile, methods for super-peer selection and super-peer load balancing were proposed to reduce the possibility of super-peer failure caused by overload, and a resuming algorithm and a fault-tolerant routing algorithm were used to solve the fault-tolerance problem of the super-peer network when a super-peer fails. The experimental results show that the network achieves high fault tolerance and that its dynamic maintenance is simple.
Feature Optimization for Digital Modulation Signals Recognition
LIU Ming-qian,LI Bing-bing,ZHAO Lei
Computer Science. 2011, 38 (11): 79-82. 
Aiming at the problems of numerous feature parameters and feature redundancy in the recognition of digital modulation signals, a feature optimization method for digital modulation signal recognition was proposed in this paper. First, the method optimized the feature parameters of twenty selected features by orthogonal experiment design. It then recognized nine kinds of digital modulation signals using an RBF neural network. Finally, the method was compared with the PCA and KPCA methods. The simulation results show that the method can optimize feature parameters effectively in Gaussian and multipath channels, and has much better optimization ability than the PCA and KPCA methods.
Complex Networks Immune Strategy Based on Acquaintance Immunization
GE Xin,ZHAO Hai,ZHANG Jun
Computer Science. 2011, 38 (11): 83-86. 
We proposed a synthesized immunization algorithm that addresses the drawbacks of acquaintance immunization and targeted immunization and improves on both. The strategy randomly chooses nodes in the network and takes different actions according to their characteristics. It retains the advantage of acquaintance immunization of relying only on local information, without knowledge of the global structure or of high-degree nodes, and it is as effective as targeted immunization when the number of nodes to be immunized is the same. Its better effectiveness and wider applicability were verified through simulations on scale-free, random and some real-world networks.
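The acquaintance step that this strategy builds on can be sketched in a few lines: immunize a random neighbour of a randomly chosen node, which tends to find hubs using only local information. The synthesis with targeted immunization described above is not reproduced here.

```python
import random

def acquaintance_immunize(adj, fraction, seed=0):
    """Immunize a random neighbour of each randomly chosen node. High-degree
    nodes are neighbours of many others, so they are reached with high
    probability without any global degree information."""
    rng = random.Random(seed)
    nodes = sorted(adj)
    immune, target = set(), int(fraction * len(nodes))
    while len(immune) < target:
        v = rng.choice(nodes)
        if adj[v]:                                   # skip isolated nodes
            immune.add(rng.choice(sorted(adj[v])))   # immunize an acquaintance
    return immune

# Star graph: hub 0 with 9 leaves; any leaf's acquaintance is the hub.
adj = {0: set(range(1, 10)), **{i: {0} for i in range(1, 10)}}
print(sorted(acquaintance_immunize(adj, 0.3)))
```

On the star, most random picks land on a leaf, whose only acquaintance is the hub, illustrating why the method targets hubs implicitly.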
Access Control Policy Optimization Model Based on Neural Network
LI Ken-li,KANG Qiang,TANG Zhuo,SHA Xing-mian,YANG Liu
Computer Science. 2011, 38 (11): 87-91. 
Access control is a core policy of network security and protection; its main task is to ensure that network resources are not illegally used or accessed. Introducing the concept of risk into access control, this paper analyzed risk-based delegation of permissions, the basic nature of permission redistribution and the calculation based on the MUS collection, and proposed a risk assessment method based on a neural network. Since neural networks are suited to processing quantitative data while risk factors carry great uncertainty, the information security risk factors were quantized by the fuzzy evaluation method and the neural network input was preprocessed. The simulation results show that the trained neural network can estimate the degree of a risk factor in real time.
Adaptive Stochastic Resonance System Based on Standard Genetic Algorithm
WU Li-ping,LI Zan,LI Jian-dong,CHEN Chen
Computer Science. 2011, 38 (11): 92-95. 
To meet the demand for weak signal detection in practical engineering, an adaptive stochastic resonance system was designed using the standard genetic algorithm (SGA), based on the principle of bistable resonance systems and the relationships among the signal, the noise and the nonlinear system. Using the output signal-to-noise ratio (SNR) as the objective function and a binary joint coding of the system parameters, the genetic algorithm identifies the optimal parameters of bistable stochastic resonance. The received signal is then processed using these optimal stochastic resonance parameters. Simulation results show that the adaptive system can always maintain an optimal state of stochastic resonance, quickly achieving the maximum output SNR and a processing gain of 15~20dB. It therefore ensures effective detection and processing of weak signals at low SNR.
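A minimal standard GA of the kind described (binary parameter coding, an output metric as fitness) can be sketched as follows; the objective below is a toy stand-in for the output SNR of the bistable system, and the operator choices (truncation selection, one-point crossover, elitism) are assumptions for the sketch.

```python
import random

def decode(bits, lo, hi):
    """Map a binary chromosome to a real-valued system parameter in [lo, hi]."""
    x = int("".join(map(str, bits)), 2)
    return lo + (hi - lo) * x / (2 ** len(bits) - 1)

def sga(fitness, n_bits=12, pop=30, gens=60, pc=0.8, pm=0.02, seed=3):
    """Minimal standard GA: truncation selection, one-point crossover, mutation."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=fitness, reverse=True)
        nxt = [ranked[0], ranked[1]]                 # elitism: keep the best two
        while len(nxt) < pop:
            a, b = rng.sample(ranked[:pop // 2], 2)  # parents from the top half
            if rng.random() < pc:
                cut = rng.randrange(1, n_bits)       # one-point crossover
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [g ^ (rng.random() < pm) for g in child]  # bit-flip mutation
            nxt.append(child)
        population = nxt
    return max(population, key=fitness)

# Toy objective standing in for "output SNR vs. system parameter": peak at 0.7.
f = lambda bits: -(decode(bits, 0, 1) - 0.7) ** 2
print(round(decode(sga(f), 0, 1), 2))
```

In the paper's setting, evaluating `fitness` means running the bistable system on the received signal and measuring the output SNR for the decoded parameters.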
Spreading Dynamics Model Research on P2P Specific Information
DING Jun-ping,CAI Wan-dong
Computer Science. 2011, 38 (11): 96-99. 
The spreading of P2P-specific information and of epidemics are similar, so spreading dynamics has become a new research direction for P2P-specific information spreading. Since existing spreading dynamics models cannot accurately simulate the P2P-specific information spreading process, we improved the SEIR spreading dynamics model and established a new SEInR spreading model. The main features of this model are: dividing the traditional infected group (I) into n sub-groups, each with different parameters, and establishing the transformation relationship between Exposed (E) and Removed (R). Matrix theory was applied to derive the formula for the basic reproductive number of the SEInR model, and the basic reproductive number was analyzed. The simulation results indicate that the improved SEInR model simulates the P2P-specific information spreading process much more accurately than the traditional SEIR model, and that the basic reproductive number accurately reflects the P2P-specific information spreading threshold.
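The classical SEIR dynamics that SEInR extends can be sketched with a simple Euler integration; the parameter values are arbitrary, and the n infected sub-groups and the E-to-R transition of the paper's SEInR model are not reproduced here.

```python
def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    """One Euler step of the classical SEIR equations (normalized population)."""
    new_exposed   = beta * s * i * dt    # S -> E: contact with infected
    new_infected  = sigma * e * dt       # E -> I: end of latency
    new_recovered = gamma * i * dt       # I -> R: removal
    return (s - new_exposed,
            e + new_exposed - new_infected,
            i + new_infected - new_recovered,
            r + new_recovered)

def simulate(beta=0.5, sigma=0.2, gamma=0.1, days=200, dt=0.1):
    s, e, i, r = 0.99, 0.0, 0.01, 0.0
    for _ in range(int(days / dt)):
        s, e, i, r = seir_step(s, e, i, r, beta, sigma, gamma, dt)
    return s, e, i, r

s, e, i, r = simulate()
# basic reproductive number beta/gamma = 5 > 1: the "information" spreads widely
print(round(r, 3))
```

The threshold behavior mentioned above corresponds to the basic reproductive number crossing 1: below it the outbreak dies out, above it a large fraction of the population is eventually reached.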
Model of Information Terminal's Kernel Based on RBAC
LI Hong-xin,GUAN Ke-qing
Computer Science. 2011, 38 (11): 100-103. 
With the development of technologies such as 3G and wireless networks, desktop systems and embedded systems are used as information terminals by individuals and enterprise users, and the confidentiality of user data faces more serious security threats than before. To improve the security of information terminals, this paper analyzed the threats to a system's information security that come from abnormal operations by processes in the operating system, and applied the RBAC model to the safety management mechanism of the operating system kernel. An access control model for system processes based on RBAC was built, and an implementation framework was proposed. Finally, this paper illustrated methods of implementing the model for open-source and non-open-source systems.
Anti-key Logger Based on Hardware-assisted Virtualization
MA Jian-kun,HUANG Hao
Computer Science. 2011, 38 (11): 104-108. 
The message flow of the Windows operating system was introduced and the potential threats were analyzed. A solution based on hardware-assisted virtualization was developed to defend against software key loggers. A virtual machine monitor was implemented using Intel virtualization technology. When the protected thread is reading keyboard input, the keyboard interrupt is handled in the virtual machine monitor. By reading the keyboard scan code in the virtual machine monitor, the keyboard input can be safely sent to the protected thread.
Features of the Predicate Dynamic Logic for Web Services
WU Xiao-qing,MA Yue,CAO Cun-gen,SUI Yue-fei
Computer Science. 2011, 38 (11): 109-113. 
There are several formalizations of Web services, such as WSMO (Web Service Modeling Ontology) and OWL-S (Web Ontology Language for Services). To analyze the logical properties of WSML, we represented WSML in a predicate dynamic logic. This predicate dynamic logic differs from traditional PDL, which is designed to represent dynamic properties of programs that change the assignments of variables; the predicate dynamic logic for Web services is designed to represent dynamic properties of services that change relations. We gave the syntax and semantics of the PDL for Web services, and an example showing how to represent the static and dynamic properties of Web services.
Coinductive Data Types and their Applications in Programming Languages
SU Jin-dian,YU Shan-shan
Computer Science. 2011, 38 (11): 114-118. 
Inductive data types mainly describe finite syntactic structures inductively in terms of algebras, from the construction perspective, but have some disadvantages in describing dynamic behaviors. As their categorical duals, coinductive data types aim to coinductively describe the observable behaviors of data types in terms of coalgebras, from the observation perspective. We first gave the definitions of inductive data types in programming languages from the categorical and algebraic viewpoints. We then presented the definition of coinductive data types via coalgebras and analyzed the corresponding corecursion operations according to the finality of coinductive data types. Finally, we pointed out how to use λ-bialgebras and distributive laws to combine inductive and coinductive data types, and discussed the relations between the syntactic constructions and dynamic behaviors of data types.
Low-cost Protection Strategy Based on the Code Compression
CHEN Yong,HE Yan-xiang,SHI Qiang,WU Wei,LI Qing-an
Computer Science. 2011, 38 (11): 119-122. 
Taking advantage of a compression algorithm and C-language compiler analysis support, we proposed a low-cost protection strategy. To reduce the cost of protection, we designed a security model, based on the security problems of the C language itself, that protects the object code at different security levels. Moreover, we devised a new block-based binary compression code algorithm (BCC) to compress the target code before applying the protection strategies, further reducing the protection overhead. Experiments show that the cost is about 80%~90% of that of the same strategy without the compression algorithm.
Model Checking of Data and Time Aware Web Service Composition
Computer Science. 2011, 38 (11): 123-126. 
To validate the data and time properties of Web service composition, we presented a model checking method based on a data and time aware service model (DTSM). In this approach, we first translated a Web service composition described in BPEL into a formal model containing data and time information, then translated this model into an UPPAAL specification; finally, the correctness of the Web service composition was verified with the model checking tool UPPAAL.
Middleware-based Framework for the Quality Management of Context-aware Pervasive Applications
Computer Science. 2011, 38 (11): 127-130. 
With the rapid development of information technology, distributed mobile computing will inevitably evolve gradually into pervasive computing, whose final goal is to fuse the information space composed of computers with the physical space in which people work and live. To achieve this goal, one problem is how to continuously monitor, capture and interpret environment-related information efficiently so as to assure high context awareness. Much attention has been paid to research on context-aware pervasive applications; however, most work uses the raw context directly or takes only some aspects of the Quality of Context (QoC) into account. Therefore, we proposed a middleware-based context-aware framework that supports QoC management in multiple layers. With this framework we can refine raw context and discard duplicate and inconsistent context, so as to protect users' context information and provide QoC-enriched context to context-aware applications and services.
Exception Handling Analysis for Business Process
XUE Gang,YE Xiao-hu,ZHANG Nan,YAO Shao-wen
Computer Science. 2011, 38 (11): 131-136. 
Exceptions arise when the built-in attributes required for executing a well-designed business process are not supported. Process exception handling technology involves detecting and processing runtime exceptions in the process execution environment. This paper first introduced the features and classification of business process exceptions. Based on exception handling patterns, bigraphs for CCS were applied to investigating process exception handling, and some results regarding modeling and analysis were discussed. Finally, exception handling strategies were applied to analyzing exception handling patterns, and a basic model structure and dynamic behavior analysis for process exception handling were proposed.
Micro-blogging Information Recommendation System for Mobile Client
SONG Shuang-yong,LI Qiu-dan
Computer Science. 2011, 38 (11): 137-139. 
As a new type of online community, micro-blogging has gained more and more attention from mobile users. Following favorite people and discovering interesting topics have become the main reasons why users visit micro-blogs. In this paper, we proposed a correlated topic model-based approach for information recommendation in micro-blogging systems. The approach automatically finds the relationships among users, topics and words, from which we can obtain the hot topics of the most recent period, the most influential users on each topic, and the incidence relations among those topics. Experimental results show that the method can provide mobile users with useful information related to their interests.
Multi-resolution Modeling and Consistency Maintenance in Distributed Interactive Simulation
YUAN Ling,ZHANG Xiao-fang,LI Guo-hui,PANG Yong-jie
Computer Science. 2011, 38 (11): 140-143. 
In distributed interactive simulation, interactions among the multi-resolution models of the same entity and of different entities may cause data inconsistency. Based on an analysis of existing multi-resolution modeling methods, a tree-configuration modeling method composed of aggregated entity models and platform entity models was proposed for multi-resolution modeling in distributed interactive simulation. To solve the data inconsistency problem in the multi-resolution models of the tree-configuration modeling method, the interaction among the multi-resolution models was treated as a transaction, as in distributed database management. A nested two-phase commitment protocol algorithm was then proposed to maintain the consistency of the multi-resolution models.
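The nested two-phase commitment idea rests on plain 2PC: a coordinator applies an update to all resolution models only if every one of them votes yes in the prepare phase. A minimal single-level sketch, where the `Model` class is an invented toy participant:

```python
def two_phase_commit(participants):
    """Phase 1: ask every participant to prepare; phase 2: commit only if
    all voted yes, otherwise abort everywhere."""
    votes = [p.prepare() for p in participants]          # voting phase
    decision = "commit" if all(votes) else "abort"
    for p in participants:                               # completion phase
        p.commit() if decision == "commit" else p.abort()
    return decision

class Model:
    """Toy resolution model acting as a 2PC participant."""
    def __init__(self, can_apply):
        self.can_apply, self.state = can_apply, "idle"
    def prepare(self):
        self.state = "prepared" if self.can_apply else "aborted"
        return self.can_apply
    def commit(self): self.state = "committed"
    def abort(self):  self.state = "aborted"

models = [Model(True), Model(True), Model(False)]
print(two_phase_commit(models))  # abort: one model could not apply the update
```

In the nested version described above, a participant may itself coordinate a subtree of lower-resolution models and vote yes only when its whole subtree is prepared.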
Effect of Initial Infected Individuals on the Spread of Epidemics in Complex Networks
Computer Science. 2011, 38 (11): 144-147. 
Infectious diseases have incessantly threatened human beings for a long time. Many researchers pay close attention to the spread of epidemics, including spread models, threshold spread rates and factors affecting spread. This paper studied how different initial infected individuals in a network affect disease spreading. Five methods of selecting the initial infected individuals were analyzed, and two large-scale undirected networks were used to simulate disease spreading under the different initial selections. Simulation results demonstrate that the spread rate and the affected population size are related to the initial infected individuals as well as to the network structure. The results can offer valuable suggestions for disease control.
Improved Search Results Clustering Algorithm Based on Suffix Tree Model
Computer Science. 2011, 38 (11): 148-152. 
Abstract PDF(445KB) ( 393 )   
RelatedCitation | Metrics
To make up for the deficiencies in cluster label selection, cluster quality evaluation and the control of overlapping clusters in existing search-results clustering algorithms, this paper proposed an improved search-results clustering algorithm based on the vector space model and the suffix tree model. We modified the LINGO algorithm's clustering function and cluster label scoring function, added a base-cluster merging step, and improved the treatment of Chinese text. Finally, we analyzed the algorithm's clustering results and the quality of the generated labels according to the experimental results. Furthermore, a platform for clustering recommended Web search results based on the Carrot2 framework was established, and the CQIG algorithm's clustering accuracy and its cluster labels' discriminability and readability were confirmed on this platform.
Method of Time Series Piecewise Linear Representation Based on the Function
Computer Science. 2011, 38 (11): 153-155. 
Abstract PDF(321KB) ( 659 )   
RelatedCitation | Metrics
Considering that different segments of a time series have different influence in terms of the time property, and that time series data grows dynamically, a new piecewise linear representation method, FPAA (Function Piecewise Aggregate Approximation), was proposed based on RPAA (Reverse Piecewise Aggregate Approximation) and PAA (Piecewise Aggregate Approximation). The proposed method overcomes the disadvantages of RPAA and PAA by defining an influence factor function. FPAA has linear complexity, satisfies the lower bounding lemma and supports online segmentation of time series. Compared with PAA and RPAA, the FPAA method can effectively query time series online.
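As a point of reference for the representations discussed above, classical PAA replaces each equal-length segment of a series with its mean. A minimal sketch (illustrative only; the influence-factor weighting that distinguishes FPAA is not reproduced here):

```python
def paa(series, n_segments):
    """Piecewise Aggregate Approximation: represent a time series
    by the mean of each of n_segments (roughly) equal-length segments."""
    n = len(series)
    seg_len = n / n_segments
    means = []
    for i in range(n_segments):
        start = int(round(i * seg_len))
        end = int(round((i + 1) * seg_len))
        segment = series[start:end]
        means.append(sum(segment) / len(segment))
    return means

# Example: 8 points reduced to 4 segment means
print(paa([1, 3, 2, 4, 6, 8, 5, 7], 4))  # [2.0, 3.0, 7.0, 6.0]
```

FPAA and RPAA differ from this baseline in how segments are weighted and in supporting online (incremental) segmentation, but the segment-averaging core is the same.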
Social Networks Data Publication Based on k-anonymity
Computer Science. 2011, 38 (11): 156-160. 
Abstract PDF(439KB) ( 481 )   
RelatedCitation | Metrics
For scientific research and data sharing, social network data needs to be released. However, individual privacy will be breached if social network data is published directly; therefore, privacy protection should be applied when releasing it. A privacy protection method based on k-anonymity was proposed. The method suits the scenario in which an attacker with background knowledge of neighborhood information tries to re-identify target nodes in the published social network. According to their individual privacy protection requirements, entities set different levels of privacy protection so as to share data while preserving data utility as much as possible. The KNP algorithm was designed and implemented to publish data anonymously, and experiments on a dataset validate the algorithm. Experimental results show that the algorithm can effectively resist the neighborhood attack.
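A toy illustration of the k-anonymity idea above, using node degree as a deliberately simplified stand-in for the 1-neighborhood structure the paper considers (this is not the KNP algorithm itself):

```python
from collections import Counter

def is_k_anonymous(adj, k):
    """Check whether every node's signature (here simply its degree,
    a simplified proxy for neighborhood structure) is shared by at
    least k nodes, so no node can be singled out by that signature."""
    degree_counts = Counter(len(neigh) for neigh in adj.values())
    return all(count >= k for count in degree_counts.values())

# A 4-cycle: every node has degree 2, so no degree singles a node out
cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(is_k_anonymous(cycle, 4))                         # True
# A 3-node path: the middle node is unique (degree 2), so not 2-anonymous
print(is_k_anonymous({0: {1}, 1: {0, 2}, 2: {1}}, 2))   # False
```

An anonymization algorithm such as KNP would add or modify edges until a check like this (over full neighborhood subgraphs, not just degrees) passes.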
Privacy Preserving Distributed Data Mining Based on Game Theory
Computer Science. 2011, 38 (11): 161-166. 
Abstract PDF(542KB) ( 569 )   
RelatedCitation | Metrics
Privacy preserving distributed data mining has become an important issue in data mining. From an economic perspective, game theory has been applied to privacy preserving data mining, which is a relatively new area of research. This paper studied the strategies of parties (two-party or multi-party) in privacy preserving distributed data mining using a complete-information static game framework, where each party tries to maximize its own utility. The results show that, under certain conditions, the semi-honest adversary strategy of the parties is Pareto dominant and a Nash equilibrium in distributed data mining, while the non-collusion strategy of the parties (multi-party) is not a Nash equilibrium under the assumption of semi-honest adversary behavior; the mixed-strategy Nash equilibrium was then given. This paper thus has theoretical and practical implications for the strategies of parties in privacy preserving distributed data mining.
Rough Set Approach to Data Completion Based on Weighted Similarity
Computer Science. 2011, 38 (11): 167-170. 
Abstract PDF(441KB) ( 364 )   
RelatedCitation | Metrics
In recent years, much attention has been given to the treatment of incomplete data, and many completion methods for incomplete data have been proposed in rough set theory. These methods usually compute the similarities between an object containing missing values and the objects without missing values, and use the values of the most similar object to replace the missing values. However, these methods share a common problem: they assume that the dependency of the decision attribute on every condition attribute is the same and that all condition attributes are equally significant, ignoring the differences between condition attributes in a decision table. To solve this problem, we introduced a new notion of weighted similarity, which employs the dependencies of the decision attribute on the condition attributes and the significances of the condition attributes as weights when computing similarity. Based on the weighted similarity, we proposed a novel rough set data completion algorithm, WSDCA. We compared WSDCA with current data completion algorithms on UCI data sets, and experimental results demonstrate the effectiveness of our method.
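The weighted similarity notion can be sketched as follows; the attribute weights here are hypothetical placeholders for the dependency/significance values the paper derives from the decision table:

```python
def weighted_similarity(x, y, weights):
    """Similarity between two objects over condition attributes,
    weighted by per-attribute significance (hypothetical weights
    standing in for WSDCA's dependency/significance values).
    Missing values are marked None and do not contribute."""
    total = sum(weights)
    score = 0.0
    for xi, yi, w in zip(x, y, weights):
        if xi is not None and yi is not None and xi == yi:
            score += w
    return score / total

# Object with a missing second attribute vs. a complete object:
# matches carry their attribute weights, so attribute 1 (w=0.5)
# counts more than attribute 3 (w=0.3).
target = ['a', None, 'c']
print(weighted_similarity(target, ['a', 'b', 'c'], [0.5, 0.2, 0.3]))  # 0.8
```

Unweighted methods correspond to setting all weights equal; the completion step would then copy values from the object with the highest weighted similarity.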
Vectorizing Process and Operation of the Temporal Spans
Computer Science. 2011, 38 (11): 171-175. 
Abstract PDF(472KB) ( 437 )   
RelatedCitation | Metrics
The temporal span is one of the most important objects in temporal assertion, and accurately computing the result of operations between temporal spans is the key issue. Since temporal spans are uncanonical and affected by flexible temporal granularity, granularity conversion is not always effective. Through a vectorizing process, the temporal granularity system can be made isomorphic to an n-dimensional vector space. Furthermore, a temporal span can be mapped to a free vector in the vector space after being completed and smoothed. Thus, operations on temporal spans in arbitrarily complex situations can be simplified using the operation rules of the vector space.
Research and Implementation of a Plate Parameter Measurement Method Based on IST
Computer Science. 2011, 38 (11): 176-178. 
Abstract PDF(347KB) ( 374 )   
RelatedCitation | Metrics
The plate parameter measurement method can be applied to industrial intelligent control: it not only reduces the cost of plate parameter measurement, but also improves accuracy and efficiency. Based on practical application and research on object extraction from plate images, the paper proposed a plate parameter measurement method based on the IST (Image Segmentation and Thinning) algorithm, which combines image threshold segmentation with image region thinning. Under the premise of ensuring accuracy, the method can effectively eliminate noise and movement interference and meet real-time demands.
Union Argumentation Framework Based on Subject Trustworthiness
Computer Science. 2011, 38 (11): 179-186. 
Abstract PDF(688KB) ( 514 )   
RelatedCitation | Metrics
The argumentation framework is the foundation for computers to use debate mechanisms to solve practical problems such as business negotiations, legal disputes and labor disputes. Traditional frameworks provide a formal description of the debate mechanism and reasoning methods, but ignore the debate subjects and their influence on the result of a debate; moreover, in the course of a debate an argument usually requires the joint demonstration of multiple arguments. Focusing on these observations, we first introduced the concept of the debate subject and provided a formal description of joint demonstration among arguments. Then a union argumentation framework based on subject trustworthiness (STUAF) was proposed on the basis of traditional frameworks; the fundamental lemma and basic results established for Dung's argumentation framework also hold for STUAF. Finally, an algorithm for semantic calculation was presented by combining the dispute tree. Our experimental analysis and results show that STUAF and the corresponding algorithm are effective.
Database Semantics and Natural Language Communication
Computer Science. 2011, 38 (11): 187-190. 
Abstract PDF(348KB) ( 355 )   
RelatedCitation | Metrics
Database Semantics (DBS), initiated by Prof. Hausser in the 1990s as a linguistic framework for the computational modeling of natural language communication, treats the agent's switching back and forth between speaker and hearer mode as a well-defined and well-motivated computational problem. As a procedural approach aiming at a systematic general model of cognition, and differing from metalanguage-based theories, DBS has two innovative bases, namely the time-linear motor algorithm of LA-grammar (Left-Associative grammar) and the data structure of the word bank. Taking its application to ancient Chinese as an example, we attempted to provide an introduction to DBS understanding and production of natural language in terms of the functioning of three LA-grammar variants, i.e., LA-hear, LA-think and LA-speak.
New Approach of Model Checking Based on Management for Dynamic Memory and State
Computer Science. 2011, 38 (11): 191-195. 
Abstract PDF(391KB) ( 367 )   
RelatedCitation | Metrics
Model checking is one of the main formal methods. However, state explosion and insufficient memory remain the bottleneck in verifying large-scale systems. Although many researchers have done a great amount of work, the problem has not yet been well settled. Based on an investigation of the management of fixed memory and states, the paper presented a new approach to model checking, which avoids the situation where model checking cannot proceed because of insufficient memory.
Training Algorithm of Fuzzy Neural Network Based on Improved T-S Fuzzy Reasoning
Computer Science. 2011, 38 (11): 196-199. 
Abstract PDF(376KB) ( 695 )   
RelatedCitation | Metrics
A training algorithm for fuzzy neural networks based on improved T-S fuzzy reasoning was proposed for prediction model design, in order to reduce the complexity of the algorithm. The main work is as follows. Firstly, an improved T-S fuzzy reasoning method based on the moving rate is defined. Then, compared with the existing fuzzy reasoning method based on composed rules and the distance-type fuzzy reasoning method, the new fuzzy reasoning algorithm has lower computational complexity and is more effective. Finally, the training algorithm of the fuzzy neural network is improved so that it can be applied to weather forecasting and security situation prediction. Test results show that this method significantly improves the effectiveness of training and reduces the training order, time complexity and training error.
Fault Diagnosis Research by Rough Set Theory and the PSO-BP Neural Network
Computer Science. 2011, 38 (11): 200-203. 
Abstract PDF(303KB) ( 368 )   
RelatedCitation | Metrics
To address the imperfections of the BP network fault diagnosis model, including the complexity of the network structure, the long training time and the low precision, this article introduced rough sets (RS), particle swarm optimization (PSO) and the genetic algorithm (GA) into diesel engine fault diagnosis, and proposed a new algorithm based on rough set theory and an improved BP neural network. The algorithm uses a self-organizing map (SOM) to discretize continuous attributes, rough set theory to reduce the characteristic parameter attributes, and PSO to optimize the BP network structure, so that it can shorten training time and effectively improve the accuracy of fault diagnosis. Finally, the diesel engine diagnosis results prove the feasibility, rapidity and veracity of the algorithm.
New Multi-label Text Classification Algorithm
Computer Science. 2011, 38 (11): 204-205. 
Abstract PDF(246KB) ( 382 )   
RelatedCitation | Metrics
A new multi-label text classification algorithm based on hyper-ellipsoids was proposed in this paper. For every class, the smallest hyper-ellipsoid containing the samples of the class is constructed, which separates the class samples from the others. For a sample to be classified, its class is determined by the hyper-ellipsoid that surrounds it; if the sample is not surrounded by any hyper-ellipsoid, a membership function is used to determine its class. Experiments on Reuters-21578 show that the algorithm achieves higher classification speed and precision compared with the hyper-sphere algorithm.
New and Better Algorithm for Evaluation of Overall Performance of Embedded Computer through Combining Grey Entropy with Absolute Correlation Degree
Computer Science. 2011, 38 (11): 206-207. 
Abstract PDF(219KB) ( 550 )   
RelatedCitation | Metrics
The conventional grey entropy correlation degree suffers from non-uniqueness and asymmetry. We proposed a novel evaluation model and applied it to the performance evaluation of embedded computers. The algorithm combines grey entropy theory with the grey absolute correlation degree to guarantee that the grey entropy absolute correlation degree is unique and symmetric, which can effectively avoid misjudging the performance of an embedded computer. A numerical example shows that the algorithm improves the effectiveness and accuracy of the performance evaluation of embedded computers, and provides a way to evaluate their comprehensive performance.
Locality Preserving Data Domain Description One-class Classifier
Computer Science. 2011, 38 (11): 208-212. 
Abstract PDF(444KB) ( 396 )   
RelatedCitation | Metrics
In support vector data description (SVDD), a compact description of the target data is given by a hyper-spherical model determined by a small portion of the data called support vectors. Despite its usefulness, the conventional SVDD may not identify the optimal target description because it neglects the structure of the given data. To mitigate this problem, a novel one-class classifier named locality preserving data domain description (LPDD) was proposed, which takes the data density into account by using an affine factor. Besides, sequential minimal optimization was adopted to adjust the model parameters for application to large-sample settings. Experiments with various real data sets show promising results.
Common Algorithm on Computing Networks' Clustering Coefficient and Cycles
Computer Science. 2011, 38 (11): 213-215. 
Abstract PDF(325KB) ( 781 )   
RelatedCitation | Metrics
Since triangles are absent in bipartite networks, this paper proposed a clustering coefficient definition for bipartite networks based on the fraction of quadrilaterals. Numerical results show that both coefficients have the same properties. Furthermore, this paper applied both coefficients to computing the number of large cycles, and deduced a common expression for estimating the number of large-size cycles.
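One common quadrilateral-based clustering coefficient (a sketch; the paper's exact definition may differ) counts, for each node, the fraction of its neighbor pairs that close a 4-cycle through some other node:

```python
from itertools import combinations

def c4_coefficient(adj, v):
    """Quadrilateral clustering coefficient for bipartite graphs:
    the fraction of neighbor pairs of v that share at least one
    common neighbor other than v, i.e. that close a 4-cycle with v."""
    pairs = list(combinations(adj[v], 2))
    if not pairs:
        return 0.0
    closed = sum(1 for u, w in pairs if (adj[u] & adj[w]) - {v})
    return closed / len(pairs)

# Bipartite graph: {a, b} x {1, 2} forms a quadrilateral; node 3 dangles off a.
# Of a's neighbor pairs (1,2), (1,3), (2,3), only (1,2) closes a 4-cycle.
adj = {'a': {1, 2, 3}, 'b': {1, 2}, 1: {'a', 'b'}, 2: {'a', 'b'}, 3: {'a'}}
print(c4_coefficient(adj, 'a'))  # ≈ 1/3
```

This plays the role that the triangle-based coefficient plays in one-mode networks, which is why the two can be compared as the abstract describes.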
Study on Multimodal Interaction in Smart Home to Support Senior User
Computer Science. 2011, 38 (11): 216-219. 
Abstract PDF(382KB) ( 690 )   
RelatedCitation | Metrics
A new multimodal Human-Computer Interaction (HCI) model was proposed to support senior users in the smart home. A prototype system was developed that integrates Chinese speech information processing and eye gaze tracking in the smart home network environment. By using avatar technologies, visual-auditory dual-channel interaction was achieved. The user's intention was perceived by rule-based task reasoning. The experimental results show that the user experience is improved.
Reductions Based on Dominance-Equivalence Relations and Rule Extraction Methods
Computer Science. 2011, 38 (11): 220-224. 
Abstract PDF(428KB) ( 370 )   
RelatedCitation | Metrics
Considering inconsistent target information systems that induce a dominance relation on the condition attribute set and an equivalence relation on the decision attribute set, the relationships among three different reductions under this dominance-equivalence relation were analyzed: compatible reduction, maximum distribution reduction and positive domain reduction. The rule extraction method based on positive domain reduction (PDRIS) was improved by introducing inferior relations together with dominance relations to extract rules with larger coverage. Finally, an example was given to illustrate our rule extraction method, and experimental results on 13 UCI data sets compare the proposed method with the PDRIS method.
Analysis of the Gravitational Search Algorithm Based on Triangular Norms
Computer Science. 2011, 38 (11): 225-230. 
Abstract PDF(439KB) ( 423 )   
RelatedCitation | Metrics
After analyzing the Gravitational Search Algorithm (GSA) proposed by Esmat Rashedi, the universal gravitation formula was transformed: the product operator between the inertial masses of two particles was substituted by other triangular norm operators. Five triangular norm operators were chosen for experiments after analyzing the two-dimensional image features of the different operators. Results show that, for a test function with certain three-dimensional image features, the GSA with the corresponding triangular norm operator finds the global optimum better than the GSA with the other triangular norm operators.
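The substitution described above can be sketched by replacing the mass product in the gravitational attraction with an arbitrary t-norm; here the minimum (one standard triangular norm) is shown, and the product recovers ordinary GSA. The force expression is a simplified one-dimensional illustration, not the paper's full update rule:

```python
def gravity_force(m_i, m_j, dist, g=1.0, eps=1e-9, tnorm=min):
    """Attraction magnitude between two GSA particles, with the mass
    product generalized to a triangular norm. tnorm=min is one of
    several t-norms; tnorm=lambda a, b: a * b is standard GSA."""
    return g * tnorm(m_i, m_j) / (dist + eps)

product = lambda a, b: a * b
# Standard GSA (product t-norm) vs. the minimum t-norm variant
print(gravity_force(0.5, 0.8, 2.0, tnorm=product))  # ≈ 0.2
print(gravity_force(0.5, 0.8, 2.0, tnorm=min))      # ≈ 0.25
```

Because every t-norm agrees with the product on the boundary cases (T(a, 1) = a, T(a, 0) = 0) but differs in between, swapping the t-norm changes how strongly mid-mass particles attract each other, which is the behavior the experiments compare.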
Explosion Search Algorithm and its Convergence
Computer Science. 2011, 38 (11): 231-233. 
Abstract PDF(281KB) ( 425 )   
RelatedCitation | Metrics
Inspired by the explosion of fireworks (bombs), a new intelligent search algorithm called the Explosion Search Algorithm (ESA) was proposed. A notion of neighborhood search was defined in ESA, and the steepest descent search algorithm was introduced into it, giving the new algorithm stronger global as well as local search ability. The convergence of the algorithm was also proved in this article. Simulation on standard benchmark functions and comparison with other algorithms proved the efficiency of the new algorithm.
Algorithm of Mining Association Rules Based on Rough Sets and Transaction Itemsets Combination
Computer Science. 2011, 38 (11): 234-238. 
Abstract PDF(370KB) ( 358 )   
RelatedCitation | Metrics
The Apriori algorithm has weaknesses such as requiring a large number of repeated passes over the database to generate the frequent itemsets and not supporting incremental updating. To solve these problems, a novel mining algorithm was proposed based on rough sets, single-transaction combination itemsets and set operations. It first uses rough sets to reduce attributes, then combines the data items of each itemset from the new decision table and marks its tags, and finally calculates the support and confidence using set operations. The algorithm needs to scan the decision table only once, while effectively supporting the updating of mined association rules. The results of application and experiments show that this algorithm is better than the Apriori algorithm and is an effective, fast algorithm for mining association rules.
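Single-scan support counting via set operations can be illustrated with a vertical (tid-set) layout; this is a generic sketch of the idea, not the paper's exact procedure:

```python
def tid_sets(transactions):
    """One scan: map each item to the set of transaction ids containing it."""
    tids = {}
    for tid, items in enumerate(transactions):
        for item in items:
            tids.setdefault(item, set()).add(tid)
    return tids

def support(itemset, tids, n_transactions):
    """Support of an itemset by intersecting per-item tid sets --
    no further scans of the database are needed."""
    common = set.intersection(*(tids[i] for i in itemset))
    return len(common) / n_transactions

transactions = [{'a', 'b', 'c'}, {'a', 'c'}, {'b', 'c'}, {'a', 'b', 'c'}]
tids = tid_sets(transactions)
print(support({'a', 'c'}, tids, len(transactions)))  # 0.75
```

Incremental updating also falls out naturally: new transactions only add ids to the existing sets, so previously computed structures need not be rebuilt.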
Text Classification Algorithm Study Based on Rough Set Theory
Computer Science. 2011, 38 (11): 239-240. 
Abstract PDF(247KB) ( 362 )   
RelatedCitation | Metrics
The text dataset is transformed into an information system without decision attributes, and attribute reduction, a core technique of rough set theory, is applied to text classification. Experiments show that the precision rate and recall rate are enhanced with this method; furthermore, it does not require any a priori information.
Bounded Clustering on Bounded Tree-width Graphs
Computer Science. 2011, 38 (11): 241-244. 
Abstract PDF(325KB) ( 546 )   
RelatedCitation | Metrics
The bounded clustering problem is motivated by System S, a distributed stream processing system developed at IBM Research. Input to the problem is an undirected graph with vertex and edge weights and a subset of vertices called terminals. A cluster is a subset of the vertices. The cost of a cluster is defined as the total vertex weight in the cluster plus the total edge weight at the boundary of the cluster. The goal of the problem is to partition the vertices into clusters of cost at most a given budget B such that each cluster contains at most one terminal and the total cost of the clusters is minimized. For the problem on graphs of bounded tree-width, a pseudo-polynomial time exact algorithm was presented, which can be modified via rounding to yield a (1+ε)-approximation in polynomial time that violates the given budget by a 1+ε factor, where ε>0 can be made arbitrarily small. For a variant of the problem on graphs of bounded tree-width where each cluster contains exactly one terminal, a polynomial time algorithm was presented that yields a (1+ε)-approximation violating the budget by a 2+ε factor.
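The cluster cost defined above (total vertex weight inside the cluster plus total weight of edges crossing its boundary) can be computed directly:

```python
def cluster_cost(cluster, vertex_weight, edges):
    """Cost of a cluster: the sum of vertex weights inside it plus
    the weights of edges with exactly one endpoint in the cluster."""
    inside = sum(vertex_weight[v] for v in cluster)
    boundary = sum(w for u, v, w in edges
                   if (u in cluster) != (v in cluster))
    return inside + boundary

# Three vertices in a path a-b-c; cluster {a, b} pays for its two
# vertices (2 + 3) plus the one boundary edge b-c (weight 4).
vertex_weight = {'a': 2, 'b': 3, 'c': 1}
edges = [('a', 'b', 5), ('b', 'c', 4)]
print(cluster_cost({'a', 'b'}, vertex_weight, edges))  # 9
```

The optimization problem then asks for a partition into such clusters, each of cost at most the budget B and containing at most one terminal, minimizing the sum of these costs.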
Research on Remote Health Monitoring System Based on Motion Sensor
Computer Science. 2011, 38 (11): 245-247. 
Abstract PDF(249KB) ( 478 )   
RelatedCitation | Metrics
A remote health monitoring system based on an IMU was presented. Acceleration, magnetic and angular rate data are collected by the IMU attached to the human body. The raw measurement data is then processed by a Kalman filter algorithm to produce a quaternion representation of orientation. Finally, a BVH file containing the captured data is transmitted to the doctor's remote PC. The results show that the system is able to capture motion data accurately and reproduce the movement of the human body in real time.
3D Mesh Model Retrieval Using Incremental Clustering
Computer Science. 2011, 38 (11): 248-251. 
Abstract PDF(340KB) ( 486 )   
RelatedCitation | Metrics
To address the inefficiency of large-scale three-dimensional model retrieval, this paper presented a three-dimensional model retrieval method based on incremental clustering. Firstly, a retrieval-word codebook is constructed for the models in the model base. Then feature points are extracted to obtain the feature vectors of the models according to the feature histogram, followed by an incremental clustering step that updates the retrieval-word codebook. Finally, a feature vector matching method is used to determine whether the model base contains models related to the target model. Experimental results show that our implementation can obtain retrieval results rapidly and precisely.
Adaptive Weighted Variational 2DPCA for Face Recognition
Computer Science. 2011, 38 (11): 252-256. 
Abstract PDF(400KB) ( 348 )   
RelatedCitation | Metrics
To make full use of the local information and discriminative information of the image matrix so as to improve the recognition rate of 2DPCA, an adaptive weighted variational 2DPCA for face recognition was presented. It partitions the original whole image into sub-patterns, extracts features from them with variational 2DPCA, adaptively computes the weight of each sub-pattern in classification, and finally classifies according to these weights. Experiments on the ORL face database show that this algorithm is superior to traditional 2DPCA and Modular 2DPCA in terms of recognition accuracy and recognition time.
Medical Image Registration with Mixed Programming
Computer Science. 2011, 38 (11): 257-263. 
Abstract PDF(1586KB) ( 384 )   
RelatedCitation | Metrics
How to improve the precision and efficiency of image registration while keeping a low error rate is an important topic of study. To meet clinical demand, a mixed-programming registration strategy was proposed. In this method, pre-registration is accomplished by combining the extraction of the image's center of gravity with detail enhancement of the image generated by wavelet decomposition. On this basis, the registration process operates on the gray-scale of the images, using the Powell optimization algorithm with traditional maximum mutual information as the similarity measure between images. In addition, we proposed an improvement to the one-dimensional search within the Powell algorithm that differs from the Brent algorithm, making it more suitable for image registration without losing accuracy or efficiency. Experiments show that our registration strategy performs very well in avoiding false registration, its accuracy reaches the sub-pixel level, and its efficiency meets clinical demand.
Shape Description and Recognition Approach Based on Distance Ratio Context
Computer Science. 2011, 38 (11): 264-266. 
Abstract PDF(307KB) ( 489 )   
RelatedCitation | Metrics
We proposed a shape descriptor named Distance Ratio Context (DRC), based on the distances between sampled points and the centroid of the object. This descriptor is inherently invariant to scaling and translation and can be computed easily. It is also invariant to deformation and distortion to some extent and can discriminate different shapes effectively. Dynamic programming was used to measure the distance between DRCs, which solves the problem of choosing a start point on the object contour. Experiments on the Kimia-99 shape dataset show that this approach, applied to image retrieval of shapes with a single closed contour, achieves favorable results.
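The scaling and translation invariance of centroid-distance descriptors can be illustrated with a simplified sketch: distances from sampled contour points to the centroid, normalized by their mean. The full DRC construction is richer (it uses distance ratios as a context and dynamic programming for matching), so this only demonstrates the invariance property:

```python
import math

def centroid_distance_descriptor(points):
    """Simplified centroid-distance descriptor: distances from sampled
    contour points to the centroid, divided by their mean so the result
    is unchanged by uniform scaling and by translation."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    mean = sum(dists) / len(dists)
    return [d / mean for d in dists]

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
big_square = [(3, 3), (7, 3), (7, 7), (3, 7)]  # scaled 2x and translated
# Scaling and translating the contour leaves the descriptor unchanged
print(centroid_distance_descriptor(square) ==
      centroid_distance_descriptor(big_square))  # True
```

Rotation shifts the starting point of the sampled sequence rather than its values, which is why a dynamic programming alignment between descriptors removes the start-point dependence, as the abstract notes.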
License Plate Character Recognition Based on Structural Features and Grayscale Pixel Features Algorithm
Computer Science. 2011, 38 (11): 267-270. 
Abstract PDF(408KB) ( 533 )   
RelatedCitation | Metrics
This paper proposed a new method based on structural features and gray-scale features. To achieve higher performance, structural features and binary pixel features are extracted from the binary images respectively. These features are mapped to a high-dimensional space through an SVM to obtain the category. Gray pixel features are used as an additional SVM input when the category is "8" or "B"; if the category is "0" or "D", the OD-classifier is used to classify it again. The algorithm was used to classify license plate character images. Experiments show that it can effectively improve the recognition rate of confusable characters with a good recognition rate and high performance, and it has broad application prospects.
Steganographic Research for Batch Binary Images
Computer Science. 2011, 38 (11): 271-274. 
Abstract PDF(308KB) ( 481 )   
RelatedCitation | Metrics
The core issues of steganographic research are the capacity and security of hidden information. According to Cachin's definition of security and the properties of the binary image blocks of the host image, a batch steganographic model for binary images was presented. The batch steganographic capacity of binary images was derived under the assumptions of a stationary Markov chain and mutually independent embedding across covers. The security of the model was proved under certain conditions. The experimental results and analysis show that the model satisfies the steganographic security requirement. The research results are helpful for both theory and application.
Improved Side Information Generation Algorithm in Distributed Video Coding
Computer Science. 2011, 38 (11): 275-277. 
Abstract PDF(364KB) ( 450 )   
RelatedCitation | Metrics
Side information generation is the most critical factor in the performance of a distributed video coding (DVC) scheme. An improved macroblock partition technique based on the image activity measure (IAM) was proposed for the motion-compensated frame interpolation process, which otherwise neglects the inequality of motion intensity across regions. Before forward motion estimation, the IAM of the residual image between adjacent key frames is calculated to estimate motion activity, from which different macroblock sizes are chosen for different regions. Experimental results show that for different video sequences, a 0.3-1.3 dB gain in rate-distortion performance is achieved by applying the proposed algorithm in the DVC system, effectively improving system performance.
Method of Statistical Property Extraction Based on Rotational Projection in Handwritten Digit Recognition
Computer Science. 2011, 38 (11): 278-281. 
Abstract PDF(444KB) ( 393 )   
RelatedCitation | Metrics
Extracting geometrical properties by abstracting the contour and skeleton of digits can capture details efficiently, but irregularity leads to a low recognition rate; statistical analysis theory can overcome this weakness. First, gap-rate and gap-variation extraction methods were investigated, in which the rate and change of each gap obtained by projection form the eigenvector of the digit. Then rotational projection was investigated, based on the idea of enhancing the orthogonality of the eigenvectors and reducing redundant information by rotating the datum line. Theoretical analysis and experiments demonstrated the high recognition rate of rotational projection with the same statistical properties, and recommended parameters are given at the end. Furthermore, the method also recognizes oblique digits.
Edge Detection Algorithm of Oil Spills Remote Sensing Image Based on DBT Denoising and Improved GDNI Edge Linking
Computer Science. 2011, 38 (11): 282-285. 
Abstract PDF(610KB) ( 396 )   
RelatedCitation | Metrics
Edge detection for oil spill images at sea is one of the key technologies for monitoring marine oil spills. In view of the characteristics of oil spill remote sensing images, this paper presented a novel method for detecting the edges of oil spill images at sea. The algorithm is composed of three parts: non-maximal suppression performs candidate edge detection on the oil spill image; DBT denoising eliminates the noise and pseudo edges among the candidate edges, making edge localization more accurate and continuous; and an improved GDNI edge linking algorithm links discrete edge points into closed edge contours. Experimental results show that the proposed algorithm can obtain continuous and closed edges of oil slick remote sensing images, and can extract edges from oil slick images with low contrast and strong noise.
Research on Technological Evaluation Standard of General-purpose Operating Systems
Computer Science. 2011, 38 (11): 286-290. 
Abstract PDF(424KB) ( 872 )   
RelatedCitation | Metrics
In the field of operating systems, researchers and engineers focus both on how to enhance OS technology in different aspects and on how to promote their system software products. However, there is rarely any work on standards for evaluating different general-purpose operating systems in a rational and measurable way. In this paper, we first made a comprehensive survey of different methods of comparing general-purpose operating systems and pointed out their advantages and disadvantages. Based on the basic ideas of these methods, we proposed a technological evaluation standard for general-purpose operating systems comprising both quantitative and qualitative comparison items, classified into seven measurement dimensions. The weights of the comparison items and measurement dimensions are configurable, and default weights are preset. We also gave guidance and steps for evaluating different operating systems based on the proposed standard, and described how to express the evaluation results at different levels. Through comparison based on the proposed standard, researchers and OS producers can evaluate general-purpose operating systems in a rational and measurable way. We compared and evaluated RHEL 5.5 and Windows Server 2008 based on this standard.
Algorithms of Placing and Routing Hardware Task in Reconfigurable System
Computer Science. 2011, 38 (11): 291-295. 
Abstract PDF(431KB) ( 411 )   
RelatedCitation | Metrics
The placement and routing of 2D reconfigurable hardware tasks are important factors in the system resource utilization of a reconfigurable computing system. Reconfigurable hardware tasks based on heterogeneous reconfigurable devices and task models were properly classified, and a new algorithm named DRSTCW was proposed, which can place and route multiple reconfigurable hardware tasks simultaneously. Experimental results indicate that the algorithm can improve the resource utilization of reconfigurable devices and the routing ratio of tasks.
Parallel Gene Expression Programming Based on General Multi-core Processor
Computer Science. 2011, 38 (11): 296-302. 
Abstract PDF(594KB) ( 395 )   
RelatedCitation | Metrics
Gene Expression Programming (GEP) is a versatile new evolutionary algorithm with heavy computational demands, and conventional GEP cannot take advantage of the currently popular multi-core processors. To improve the efficiency of GEP, parallel Gene Expression Programming based on general multi-core processors (PGEP-MP) was proposed. The main contributions are: (1) the mechanism of parallel GEP on general multi-core processors is analyzed; (2) a parallel model of GEP on general multi-core processors combining coarse-grained and fine-grained levels is designed using MPI together with OpenMP; (3) evolution strategies to improve PGEP-MP are proposed; (4) experiments on function mining and classification show that PGEP-MP improves the efficiency of both. Compared with conventional GEP, the mean parallel speedup of PGEP-MP reaches 4.22 on function mining and 4.02 on classification when 4 dual-core processors are used in parallel.