Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40 Issue 4, 16 November 2018
  
Overview on Exception Handling in Service Oriented Software
GUAN Hua,YING Shi,JIA Xiang-yang,JIANG Cao-qing and WANG Yi-bing
Computer Science. 2013, 40 (4): 1-8. 
Abstract PDF(913KB) ( 600 )   
As service oriented software is becoming a hot research field of software engineering, research on exception handling in service oriented software (EHSOS) is receiving more and more attention and its importance is becoming more prominent. We first defined some EHSOS-related concepts, described several typical classifications, introduced several methods used in EHSOS and pointed out their limitations. Through analysis and summary of the current research fields and status quo of EHSOS, we proposed the key issues to be addressed, discussed the inadequacies of current research, and predicted the main research tendencies of EHSOS.
Survey of P2P Network Security and Defense Mechanism
LIU Yue,LI Qiang and LI Zhou-jun
Computer Science. 2013, 40 (4): 9-13. 
Abstract PDF(444KB) ( 1462 )   
Applications based on P2P networks have been playing an important role in the Internet. P2P networks with a distributed architecture are scalable and flexible, but they face enormous security challenges. This paper began with an overview of the concepts and features of P2P networks and explained the differences between them and the traditional C/S structure, then detailed the three popular methods of attack against P2P networks: the Sybil attack, the Eclipse attack and the DDoS attack, and pointed out the relations and differences among these three kinds of attack. Finally, it gave an overview of research on defensive measures against these attacks.
Multi-label Data Mining:A Survey
LI Si-nan,LI Ning and LI Zhan-huai
Computer Science. 2013, 40 (4): 14-21. 
Abstract PDF(712KB) ( 1915 )   
In traditional single-label data mining, each sample belongs to only one label. However, a real-world object is usually associated with more than one attribute. Multi-label data mining is motivated by the increasing requirements of modern applications and is widely applied in many fields, such as semantic annotation of images and video, functional genomics, music categorization into emotions, and directed marketing, and it has attracted significant attention from many researchers. This paper systematically introduced the technology of multi-label data mining from two aspects: methods and evaluation metrics. Finally, we summarized some problems and challenges in current study and gave prospects for the tendencies in this area.
Study on Energy Optimization of Servers Based on States Management
XIAO Zhi-jiao,MING Zhong and CAI Shu-bin
Computer Science. 2013, 40 (4): 22-25. 
Abstract PDF(417KB) ( 387 )   
With the development of cloud computing, the problem of huge energy consumption has become more and more critical. State management has always been an effective way of saving energy, and state management of servers can bring an impressive amount of energy saving. A strategy based on state management was proposed to optimize the energy consumption of servers in data centers, which can guarantee performance and bring down energy consumption at the same time. Petri nets and state analysis were used to analyze the strategy. An instance analysis and some experiments were done to show the validity and superiority of the strategy.
Reconfigurable Architecture Model Based on Layered Hypergraph
SHEN Lai-xin,ZENG Guo-sun and WANG Wei
Computer Science. 2013, 40 (4): 26-30. 
Abstract PDF(452KB) ( 448 )   
The breadth of computer applications has led to the diversity of application tasks, and the same kind of architecture can hardly adapt to vastly different application tasks. The computing, storage and communication resource requirements of the task were analyzed. By analyzing the program structure theoretically and extracting the key granularity of the real application, we obtained sub-algorithm clusters with independent functions. A hierarchical hypergraph was used to visually describe the application structure and the architecture, and a perception algorithm was utilized to obtain system states such as application features and component busy information; then the architecture reconfiguration and decision-making algorithms were utilized to achieve a reasonable matching of sub-algorithm clusters and substructures. A computing instance analysis verifies the validity of the reconfigurable architecture. Graph isomorphism theory shows that this variable architecture is reasonable, highly efficient and low in power consumption when solving different complex applications.
Design and Implementation of Leading-One Prediction
LI Xing,HU Chun-mei,LI Yong and LI Zhen-tao
Computer Science. 2013, 40 (4): 31-34. 
Abstract PDF(389KB) ( 1286 )   
Leading-one prediction (LOP), which is often used in floating-point addition/subtraction, can operate in parallel with the adder and reduce the delay of the normalization shift. However, this prediction might generate a one-bit error. Three different LOP architectures were classified by their methods of handling the one-bit error. Among them, the LOP architecture with serial correction was described in detail, and the key components of the serial-correction algorithms were optimized. Through synthesis experiments on the LOP architectures with concurrent correction and serial correction and the traditional leading-one detector (LOD) method, we found that the serial-correction method has the best balance of area, power and delay. It has been successfully used in a two-path floating-point adder that operates in a 3-cycle pipeline at a 1 GHz clock frequency.
Design and Implementation of Software Pipelined Loop Buffer
CHEN Ji-xiao and LI Yong
Computer Science. 2013, 40 (4): 35-37. 
Abstract PDF(275KB) ( 404 )   
A software pipelined loop buffer was designed. It is used to store and dispatch the instructions of a loop body and to reduce the number of memory accesses when executing loop programs, thereby reducing the influence of memory access latency on performance. Based on the study of software pipelining and loop unrolling, the design of the software pipelined loop buffer was finished. The loop buffer has storage for up to 11232-bit instructions. Special loop-buffer instructions are used to control the operation of loop programs. Numerical simulation was performed for the design, and analysis was also conducted using Design Compiler.
Self-adaptive Memory Dependence Predictor
BAN Dong-song,YAN Shi-yun,LI Li,YANG Jian-xin and LU Dong-dong
Computer Science. 2013, 40 (4): 38-40. 
Abstract PDF(329KB) ( 487 )   
The out-of-order execution of store and load instructions always results in memory dependence hazards. Memory dependence prediction (MDP) can reduce these hazards and improve processor performance. Most academic research works are complex and incur high hardware cost. Although MDP is implemented simply in commercial processors, it still has some shortcomings, such as no self-adaptive ability or blocking of ILP (Instruction Level Parallelism). This paper proposed a simple and effective memory dependence predictor, SMDP, which possesses self-adaptive ability, low hardware cost and high ILP. The simulation shows that SMDP improves processor performance effectively, by 0.7991% on average over blind prediction, and by 4.9225% at best.
Evaluation and Analysis of Effects of Auto-vectorization in Typical Compilers
LI Chun-jiang,HUANG Juan-juan,XU Ying,DU Yun-fei and CHEN Juan
Computer Science. 2013, 40 (4): 41-46. 
Abstract PDF(467KB) ( 936 )   
SIMD (Single-Instruction-Multiple-Data) architecture plays an important role in the architecture of modern processors. It is also supported in multiple types of domestically produced high performance general purpose processors. For exploiting the data-parallel performance potential of the short-vector processing ability presented by the SIMD architecture, auto-vectorization in compilers is one of the main means for improving the performance of applications. Evaluating the effects of auto-vectorization in typical compilers on a mature commercial general purpose processor with SIMD support is beneficial both to processor architecture design and to compiler analysis and design. We evaluated the effects of auto-vectorization in typical compilers (including the Intel compiler, the PGI compiler and GCC) with the standard benchmarks SPEC CPU2006 and SPEC OMPM2001. Then, taking the open-source product-level GCC compiler as the target, we thoroughly evaluated the effects of auto-vectorization in GCC with hand-coded program segments (mainly multiple types of loops), and we analyzed the ability and limitations of the current implementation of auto-vectorization in GCC. Our work provides a valuable contribution to the research and development of auto-vectorization in compilers.
Research on Observation Mechanism for Congestion Control in InfiniBand Network Based on ibdump
CAO Guang-quan,ZHANG Zi-wen,SUN Zhi-gang,CHEN Hong-yi and XU Qing-jie
Computer Science. 2013, 40 (4): 47-50. 
Abstract PDF(359KB) ( 1766 )   
Congestion control in an InfiniBand network can guarantee high performance and resource utilization. It avoids the performance damage of congestion spreading to victim flows. This paper first introduced the InfiniBand congestion control mechanism based on ECN (Explicit Congestion Notification). Then, a new tool named CTBG (Central Traffic Behavior Generator) was proposed to generate congestion traffic from multiple nodes and analyze the result in a central-controller mode. In order to further investigate the congestion control behavior of InfiniBand, we proposed a new observation method based on ibdump and Wireshark. The experiment indicates that this observation method can observe InfiniBand congestion control behavior with fine granularity and low cost. The method has important significance for research on the InfiniBand congestion control mechanism.
Research and Design of Multiple Thread Mechanism in Matrix DSP
DENG Yu,SUN Yong-jie and WAN Jiang-hua
Computer Science. 2013, 40 (4): 51-54. 
Abstract PDF(336KB) ( 429 )   
This paper presented the study of a multiple thread mechanism in the YHFT_Matrix high performance DSP. It emphatically introduced the read and write mechanism of the loop instruction buffer and the mode-switch mechanism between single-threaded and multi-threaded operation. After synthesis based on a 65nm process, both the code area and the power consumption decreased, and the critical path improved by 0.07ns. The implementation was analyzed with a program assessment test, and the outcome shows that, compared with the single-threaded operation mode, the processor's IPC (Instructions Per Cycle) performance increases by 9.64% on average.
Implementation of Matrix Compiler's If-conversion Algorithm
LIU Fei,CHEN Yue-yue,SUN Hai-yan and YANG Liu
Computer Science. 2013, 40 (4): 55-58. 
Abstract PDF(419KB) ( 424 )   
ILP (instruction-level parallelism) becomes more and more important for improving microprocessor speed. If-conversion is an effective compiler optimization that facilitates exploiting ILP on microprocessors that support conditional execution. After introducing the if-conversion technology and algorithm of GCC and transplanting it into the Matrix compiler successfully, some improvements to the if-conversion of the Matrix compiler were made. Experimental results indicate that the if-conversion achieved in this paper can remove branches, reduce the number of basic blocks and broaden the bounds of basic blocks. It also helps the compiler generate more efficient code.
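The transformation can be illustrated with a small sketch (a generic, hypothetical example, not the Matrix compiler's actual code): if-conversion replaces a data-dependent branch with straight-line code that computes both arms and selects a result by predicate, the software analogue of conditional (predicated) execution.

```python
def clamp_branchy(x, lo, hi):
    # Original control flow: two conditional branches, three basic blocks.
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def clamp_if_converted(x, lo, hi):
    # After if-conversion: both arms are evaluated and predicates select
    # the result, leaving a single straight-line basic block.
    below = x < lo            # predicate 1
    above = x > hi            # predicate 2
    return below * lo + above * hi + (not below and not above) * x
```

Eliminating the branches enlarges the basic block, which is exactly what gives the scheduler more instructions to overlap.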
Power Management of Idle Nodes in Clusters
LIU Yong-peng,LU Kai and CHI Wan-qing
Computer Science. 2013, 40 (4): 59-63. 
Abstract PDF(529KB) ( 406 )   
The existence of massive numbers of active idle nodes causes huge energy waste in large scale systems. Cache-style power management for idle nodes was proposed to schedule the power states of idle nodes. According to their different sleep states, idle nodes are placed into multiple groups with corresponding sleep states. The goal is to achieve a system response speed similar to the active state and a power saving similar to the deepest sleep state. Idle nodes are dynamically moved between different sleep groups: while assuring the response speed of the system, an idle node is put into as deep a sleep state as possible. In our experiments, CPMI conserves the power consumption of idle nodes by 69.51% at the cost of a relative slowdown of only 0.99%.
Design of Content-based Data Forwarding Network and Algorithm
ZHU Zhao-meng,ZHANG Gong-xuan,ZHANG Yong-ping,GUO Jian and ZHANG Wei
Computer Science. 2013, 40 (4): 64-68. 
Abstract PDF(527KB) ( 420 )   
The data in the Internet of Things needs to be aggregated and disseminated effectively. There is usually central massive storage to manage the numeric data produced by sensors, so a proper data forwarding mechanism is necessary for such a system. We proposed a design of a content-based data forwarding network. This network can be combined with the node network of the massive storage, that is, it utilizes the scattered computing resources of each node in a massive storage network to take up the challenging task of disseminating a great amount of data. Inspired by system virtualization, we introduced "workers" and "function bricks" to dynamically balance the loads of different nodes and to move tasks between nodes in the network automatically. We also proposed a distributed content-based publish/subscribe algorithm using Bloom filters. By efficiently using a Bloom filter to represent an object as well as the set of constraints satisfied by the object, a large amount of redundant computing can be eliminated to ensure the algorithm's efficiency.
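The Bloom-filter matching idea can be sketched as follows; the filter parameters (`m`, `k`), the hash construction and the attribute-string encoding are illustrative assumptions, not the paper's actual design.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter over strings (illustrative parameters)."""
    def __init__(self, m=256, k=4):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        # k hash positions derived from one cryptographic hash (a common trick).
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:4], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # No false negatives; false positives possible with small probability.
        return all(self.bits >> p & 1 for p in self._positions(item))

def event_satisfies(event_bf, constraints):
    # A subscription matches if every one of its constraints *might* be
    # present in the event's Bloom filter; only matches then need an
    # exact (expensive) re-check, which removes redundant computation.
    return all(event_bf.might_contain(c) for c in constraints)
```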
Research of Parallel SVM Algorithm Based on CUDA
ZHANG Wei,ZHANG Gong-xuan,WANG Yong-li,ZHANG Yong-ping and ZHU Zhao-meng
Computer Science. 2013, 40 (4): 69-72. 
Abstract PDF(653KB) ( 640 )   
SVM has been widely used in statistical classification and regression analysis. With the rapid development of the Internet of Things, SVM algorithms in various applications often need to address the challenge of rapidly processing large amounts of data. Firstly, this paper studied SVM algorithm parallelization and proposed a parallel CUDA-based SVM algorithm scheme; it then further researched massive data processing and proposed a massively parallel data processing scheme. Finally, the performance of the parallel algorithm was compared via experimental analysis.
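The parallelization opportunity comes from the structure of the SVM kernel matrix: every entry depends on one pair of samples only, so all entries can be computed independently. A pure-Python sketch of that structure (a thread pool standing in for the CUDA thread grid; function names and parameters are illustrative):

```python
import math
from concurrent.futures import ThreadPoolExecutor

def rbf(x, y, gamma=0.5):
    # One RBF kernel entry: depends only on the pair (x, y), so every
    # entry of the kernel matrix is independent work -- exactly the
    # structure a CUDA kernel maps onto its thread grid.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_matrix(X, gamma=0.5):
    n = len(X)
    def row(i):                      # one "thread block" per row
        return [rbf(X[i], X[j], gamma) for j in range(n)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(row, range(n)))
```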
Hotspot-aware Data Storage Strategy for Wireless Sensor Networks
LI Qiao-qin,WU Lei and WANG Yan
Computer Science. 2013, 40 (4): 73-77. 
Abstract PDF(417KB) ( 398 )   
Considering the hotspot problem in GHT (Geographic Hash Table)-based data-centric storage (DCS) sensor networks, this paper proposed a hotspot-aware data storage scheme (SASS), which improves the routing scheme of GHT to reduce the energy consumption caused by perimeter walk, and exploits neighboring nodes to extend storage space dynamically. Simulation results demonstrate that, compared with existing schemes, SASS can effectively alleviate data loss caused by the limitation of storage space and reduce the communication load of nodes in hotspot areas.
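GHT's core mechanism, hashing an event name to a geographic point whose nearest node becomes the home node, can be sketched as follows (field size and hash choice are illustrative assumptions; SASS's routing improvements and storage extension are not shown):

```python
import hashlib

def hash_to_location(event_name, width=100.0, height=100.0):
    # GHT-style mapping: hash the event name to a point in the sensor
    # field; the node geographically nearest this point stores the data.
    d = hashlib.sha256(event_name.encode()).digest()
    x = int.from_bytes(d[:4], "big") / 2**32 * width
    y = int.from_bytes(d[4:8], "big") / 2**32 * height
    return x, y

def home_node(event_name, nodes):
    # nodes: list of (x, y) positions; pick the one closest to the hash point.
    hx, hy = hash_to_location(event_name)
    return min(nodes, key=lambda n: (n[0] - hx) ** 2 + (n[1] - hy) ** 2)
```

Because popular event names all hash to fixed points, the nodes around those points become hotspots, which is the problem SASS targets.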
P2P Organization Model for Service Clustering Based on Semantic Tree
LAN Ming-jing
Computer Science. 2013, 40 (4): 78-82. 
Abstract PDF(522KB) ( 391 )   
Aiming at solving the problems existing in centralized and traditional distributed service discovery mechanisms, a new service organization model was proposed. It identifies a service by a semantic string that comes from a semantic tree based on service functions in the service system. With an improved algorithm based on Kademlia, the model organizes all the services into a P2P overlay network in which nodes are gathered together according to the semantic tree and can be found using semantic strings. This model solves the single-point-of-failure and bottleneck problems and can find and invoke services without a service registry. It is highly scalable and supports higher-level applications such as dynamic scheduling and fuzzy search. The approach has been successfully applied in a service computing platform, where it has been verified and has been operating well for nearly a year.
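The Kademlia-style lookup the model builds on is organized around the XOR distance between identifiers; a minimal sketch, with IDs derived from semantic strings (the hash function and ID width are illustrative assumptions):

```python
import hashlib

def node_id(semantic_string, bits=32):
    # Derive a numeric ID from a semantic string (illustrative hash).
    d = hashlib.sha256(semantic_string.encode()).digest()
    return int.from_bytes(d, "big") % (1 << bits)

def xor_distance(a, b):
    # Kademlia's metric: symmetric, and d(a, b) == 0 iff a == b.
    return a ^ b

def closest_nodes(target_id, ids, k=3):
    # A Kademlia lookup converges on the k nodes closest to the target.
    return sorted(ids, key=lambda i: xor_distance(i, target_id))[:k]
```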
Improved Minimum Hop Count Routing Protocol in Wireless Sensor Network
CHEN Zhi-bo and XU Xiao-cheng
Computer Science. 2013, 40 (4): 83-85. 
Abstract PDF(364KB) ( 487 )   
Routing protocol design is an important research area in wireless sensor networks, and reliability, low cost and easy maintenance are goals of WSN routing protocol design. Hop-based routing protocols have been receiving extensive attention for their simple and effective design ideas. This paper analyzed hop-based routing protocols in detail and then proposed an improved routing policy. Compared with MHC and DD in the OMNeT++ simulation tool, the improved policy proves its strength in reliability, load balancing, extending network lifetime and low routing overhead.
Multi-topology Link Weight Optimizing Algorithm Based on Dynamic Traffic
CHEN Duo-long,MENG Xiang-ru,LIANG Xiao and WEN Xiang-xi
Computer Science. 2013, 40 (4): 86-90. 
Abstract PDF(408KB) ( 553 )   
Considering that network traffic changes dynamically and that different traffic has different demands for congestion control and transmission cost, a link weight optimizing algorithm for multi-topology routing sub-layers based on a niche particle swarm optimizer (NPSO) was proposed. A traffic matrix differentiated by periods and weight genes following different traffic demands were set to adapt to the dynamic changes of network traffic. Congestion and transmission cost were both considered in the optimization objective function, and NPSO was used to solve the optimization problem. The experimental results show that the new algorithm can balance the load in the network.
Hierarchical Scheduling of Large Scale Distributed Computation
QIN Gao-de and WEN Gao-jin
Computer Science. 2013, 40 (4): 91-95. 
Abstract PDF(393KB) ( 344 )   
With the rapid development of cloud computing, large scale distributed computation is used widely. However, the energy consumption of such distributed systems has recently become a problem for further application. Existing methods for saving energy mainly work by decreasing the number of running servers; however, these approaches do not consider the energy cost of network devices. This paper proposed a hierarchical scheduling algorithm (HSA). The algorithm employs a dynamic maximum node sorting (DMNS) method to optimize the assignment of applications on servers that are connected to a switch at the same level. Secondly, we transferred the applications on nodes with low load to nodes that can handle more applications, in order to reduce the number of nodes. In addition, we chose transfer paths that bear less data-exchange capacity and are shorter, which helps to reduce the energy consumption of the network. As a result, both the number of running servers and the data transfer can be greatly reduced. The time complexity of HSA is satisfactory, and its stability is verified through simulations. Experimental results show that HSA outperforms existing methods.
Cooperative Channel Allocation Strategy Based on Balancing Algorithm
LUO Qing-yun,CHEN Min and ZHAO Jin-guo
Computer Science. 2013, 40 (4): 96-101. 
Abstract PDF(514KB) ( 366 )   
The physical and MAC layers of IEEE 802.11 support multiple channels and multiple rates. In the multi-rate case, an IEEE 802.11 network suffers from the performance anomaly problem: low-speed links seriously degrade the performance of high-speed links, which leads to the degradation of system performance. For this problem, a cooperative channel allocation (CCA) protocol was designed to solve the performance anomaly in wireless networks. The idea of CCA is to solve the channel allocation problem via the criterion of estimated transmission time (ETT) and a balancing algorithm. Under the ETT criterion, CCA separates links of different speeds onto different channels. Based on the balancing algorithm, CCA can also increase throughput fairness. Simulation results indicate that CCA can improve network performance in wireless mesh networks.
Research on RFID MAC Protocol under Dense Mobile Tag Environment
YANG Jian,WANG Yong-hua and CAI Qing-ling
Computer Science. 2013, 40 (4): 102-106. 
Abstract PDF(676KB) ( 393 )   
This paper first presented an RFID MAC protocol based on ALOHA and query trees for static tag environments (S-TMAC), then analyzed its fundamentals, process and performance. Secondly, S-TMAC was improved into M-TMAC, which is suitable for mobile tag environments. The protocol structure was defined, a system model was built, and optional reader deployment strategies were put forward, followed by a discussion of tag identifying time, incidence angle and speed limits. Lastly, simulations indicate that the tag identifying times of M-TMAC and S-TMAC are similar, about half that of ALOHA-based protocols and one third that of tree-based protocols. Furthermore, M-TMAC has low extra system cost and a high tag speed limit and tag incidence angle limit while satisfying the default identifying rate and interference rate.
Heterogeneity Based Algorithm for Scheduling Out-Tree Task Graphs
ZHANG Jian-jun,SONG Ye-xin and KUANG Wen
Computer Science. 2013, 40 (4): 107-110. 
Abstract PDF(378KB) ( 393 )   
Effective scheduling of a distributed application is one of the critical issues in heterogeneous computing systems. Many previous algorithms were developed for homogeneous systems and neglect the heterogeneity of processors, which results in poor schedule efficiency. This paper proposed a heterogeneity, list and task-duplication based static greedy heuristic for scheduling Out-Tree task graphs, whose time complexity is O(hv²p), where h, v and p are the height of the DAG, the number of tasks in the task graph and the number of processors required, respectively. The experimental results show that the proposed algorithm produces an efficient schedule with a shorter schedule length and fewer processors in comparison with other related algorithms, and so is more practical.
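The flavor of list scheduling on heterogeneous processors can be sketched with a generic earliest-finish-time heuristic (a simplified stand-in, not the paper's exact algorithm, which also uses task duplication): tasks are visited parents-first down the out-tree, and each is placed on the processor that finishes it earliest given per-processor execution times and communication delays.

```python
def schedule_out_tree(children, cost, comm, root=0, procs=2):
    """Greedy earliest-finish-time scheduling of an out-tree task graph.

    children[t] -> child tasks of t (out-tree: each task has one parent)
    cost[t][p]  -> execution time of task t on processor p (heterogeneous)
    comm[t]     -> delay for t's input if placed off its parent's processor
    Returns (placement, finish_times, makespan).
    """
    free = [0.0] * procs                 # when each processor becomes idle
    place, finish = {}, {}
    order = [root]                       # BFS order: parents before children
    for t in order:
        order.extend(children.get(t, []))
    parent = {c: t for t in children for c in children[t]}
    for t in order:
        best = None
        for p in range(procs):
            ready = 0.0
            if t in parent:
                pt = parent[t]
                # data is available immediately on the parent's processor,
                # and only after comm[t] on any other processor
                ready = finish[pt] + (0 if place[pt] == p else comm[t])
            f = max(ready, free[p]) + cost[t][p]
            if best is None or f < best[0]:
                best = (f, p)
        finish[t], place[t] = best[0], best[1]
        free[best[1]] = best[0]
    return place, finish, max(finish.values())
```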
Mobility Prediction and Energy-balance Topology-control Algorithm for Ad hoc Networks
CHEN Hui and JU Yong-feng
Computer Science. 2013, 40 (4): 111-114. 
Abstract PDF(345KB) ( 384 )   
In order to reduce the power consumption of nodes in transit and extend the lifetime of the network, this paper proposed the dynamic topology control algorithm EMTCA (Energy balance and Mobility prediction Topology Control Algorithm). The link weight is determined by both the energy consumption and the distance between two nodes moving at a certain speed, and the network topology is dynamically optimized according to the remaining energy of nodes. The simulation results show that the EMTCA algorithm ensures network topology connectivity and extends the network lifetime, so that reliable operation of the network is ensured.
Ad hoc Network Node Routing Algorithm Based on the Optimal Energy Consumption Multicast Tree Structure
LI Yuan and YANG Li-bo
Computer Science. 2013, 40 (4): 115-118. 
Abstract PDF(336KB) ( 359 )   
In order to solve the generation and optimization problems of the least-energy-consumption multicast tree in Ad hoc networks, this paper put forward an Ad hoc network node routing algorithm based on an optimal energy consumption multicast tree structure. In this algorithm, the minimum-cost multicast tree generation problem is first transformed into a dynamic optimization problem over the power space of different relay node sets, and a solving model based on the optimal energy consumption multicast tree is built. An improved particle swarm optimization (PSO) algorithm is used to map and correct the weights of the particles representing relay-point links in different dimension spaces; then, according to the particle fitness values, each particle's local extremum and the global extremum are updated. Through iterative calculation with the particle position and velocity update mechanism, the final global extreme point and extreme value are taken as the optimal multicast tree node positions and energy consumption value. The simulation results show that this algorithm has good particle diversity, good global and local search ability, and strong optimization ability.
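The PSO machinery described above, position and velocity updates pulled toward each particle's local extremum (pbest) and the swarm's global extremum (gbest), can be sketched generically; this minimizes an arbitrary cost function, and the paper's specific mapping of particles to relay-node link weights is omitted. All parameter values are conventional defaults, not the paper's.

```python
import random

def pso_minimize(cost, dim, iters=200, swarm=20, w=0.7, c1=1.5, c2=1.5, seed=1):
    # Generic particle swarm optimization: each particle keeps its local
    # best (pbest), the swarm keeps a global best (gbest), and velocities
    # are pulled toward both with random strengths c1*r1 and c2*r2.
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(swarm), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:                  # update local extremum
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:                 # update global extremum
                    gbest, gcost = pos[i][:], c
    return gbest, gcost
```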
Research on Internet Users Number Prediction Based on Equal Dimension and New Information Grey Markov Model
ZHAO Ling and XU Hong-ke
Computer Science. 2013, 40 (4): 119-121. 
Abstract PDF(322KB) ( 492 )   
The scientific prediction of the number of Internet users can provide a basis for decision-making in the construction and management of the network. An equal dimension and new information grey Markov forecasting model was established based on the traditional grey forecasting model, with a Markov chain forecasting model used to predict the fluctuation range of the forecasting results. Then, the number of Internet users from December 2007 to June 2012 was taken as the original data to establish a forecasting model and predict the number of Internet users from December 2012 to June 2014. The results show that the equal dimension and new information grey Markov forecasting model has fewer errors and better forecasting precision in comparison with the grey model, and it can provide the fluctuation range and the probability of the predicted results.
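The GM(1,1) core that the equal-dimension, new-information variant builds on can be sketched as follows; the rolling window (dropping the oldest point as each new one arrives) and the Markov correction of fluctuation ranges are omitted.

```python
def gm11(x0, horizon=1):
    """GM(1,1) grey forecasting (the core of the grey Markov model).

    x0: observed series; returns fitted values plus `horizon` forecasts.
    """
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]              # accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    # Least squares for the grey equation x0(k) = -a*z(k) + b,
    # solved via the 2x2 normal equations.
    szz = sum(v * v for v in z); sz = sum(z)
    sy = sum(x0[1:]); szy = sum(v * y for v, y in zip(z, x0[1:]))
    m = n - 1
    det = szz * m - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    from math import exp
    def x1_hat(k):   # response of the whitened equation, 0-based index
        return (x0[0] - b / a) * exp(-a * k) + b / a
    fit = [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + horizon)]
    return fit
```

On a roughly exponential series (the regime grey models suit best), the one-step forecast tracks the trend closely.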
RABOLC—A New Methodology Used in SCNEO Building
YUAN Wei-wei,DUAN Yi-feng,HU Gu-yu,WANG Zhi-ming and ZHU Bao-shan
Computer Science. 2013, 40 (4): 122-126. 
Abstract PDF(538KB) ( 383 )   
Satellite communication network evaluation ontology (SCNEO) has characteristics such as moderate complexity, a clear goal and high demands on scalability, and current ontology construction methodologies cannot be directly adopted to build it. A new ontology building methodology, namely Rule and Activity Based Ontology Life Cycle (RABOLC), based on several very successful methodologies, was proposed. The procedure of building SCNEO under RABOLC was designed and discussed in detail. At last, SCNEO was built, which showed that RABOLC can be used effectively and efficiently to build SCNEO.
Analysis of Topic Models on Modeling MicroBlog User Interestingness
CHEN Wen-tao,ZHANG Xiao-ming and LI Zhou-jun
Computer Science. 2013, 40 (4): 127-130. 
Abstract PDF(885KB) ( 408 )   
This paper analysed different topic models and compared three extended topic models' performance in modeling microblog user interestingness via three experiments. Experimental results show that TwitterLDA can be applied to predict words on new unseen documents and users, that the topics generated by AuthorLDA have a higher degree of differentiation, and that UserLDA and AuthorLDA can better reflect users' relationships in a real social network. The work in this paper lays the foundation for further study of how topic models can be applied to microblog text mining applications such as personalized recommendation, sentiment analysis, and topic detection and tracking.
Imbalanced Data Classification Method and its Application Research for Intrusion Detection
JIANG Jie,WANG Zhuo-fang,GONG Rong-sheng and CHEN Tie-ming
Computer Science. 2013, 40 (4): 131-135. 
Abstract PDF(653KB) ( 329 )   
Traditional classification algorithms always have a low classification accuracy rate, especially for the minority class, when they are directly employed to classify imbalanced datasets. A new K-S statistic based classification method for imbalanced data was proposed to enhance the performance of minority class recognition. At first, the K-S statistic was employed as a correlation measure to remove redundant variables. Then a K-S based decision tree was built to segment the training data into several subsets. Finally, two-way resampling methods, forward and backward, were used to rebuild the segmented datasets so as to implement more reasonable classification learning. The proposed K-S based method, under a realistic assumption, is highly efficient and widely applicable. Experimental analysis on the KDD99 intrusion detection dataset proves that the method achieves a high classification accuracy rate for both the minority and majority classes of imbalanced datasets.
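The two-sample K-S statistic used for variable screening is the maximum gap between two empirical CDFs; a pure-Python sketch (the `0.3` threshold is an illustrative assumption, and the paper's decision-tree segmentation and two-way resampling steps are omitted):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum vertical gap
    # between the two empirical CDFs, evaluated at every observed point.
    a, b = sorted(sample_a), sorted(sample_b)
    def ecdf(s, x):
        return bisect.bisect_right(s, x) / len(s)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

def screen_features(majority, minority, threshold=0.3):
    # Keep only features whose K-S separation between the two classes
    # exceeds the threshold; uninformative features are dropped.
    keep = []
    for f in range(len(majority[0])):
        ks = ks_statistic([row[f] for row in majority],
                          [row[f] for row in minority])
        if ks >= threshold:
            keep.append(f)
    return keep
```

A K-S score of 1 means the two class distributions are fully separable on that feature; 0 means the feature carries no class information.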
Research on Formal Design of Multi-thread Mechanism Based on Microkernel Architecture
QIAN Zhen-jiang,LU Liang and HUANG Hao
Computer Science. 2013, 40 (4): 136-141. 
Abstract PDF(823KB) ( 439 )   
Microkernel architecture has become a hot topic in operating systems research because of its effective isolation of modules. The multi-thread mechanism is a critical issue for the performance of a microkernel architecture. Many works have researched the multi-threading of microkernel operating systems, but problems remain, such as frequent switching of the system address space and a high degree of implementation complexity. We used formal methods to describe and design the multi-thread and security mechanisms, and proposed a hierarchical object semantics model. With this object semantics model, we formally designed the mechanisms of inter-thread communication, thread scheduling, mutual exclusion and synchronization. Meanwhile, we used our self-implemented and verified microkernel operating system, VTOS, as an example for testing, and the results show that VTOS achieves multi-threading effectively and has good system performance.
ID-based Generalized Signcryption without Trusted Party
ZHOU Cai-xue
Computer Science. 2013, 40 (4): 142-146. 
Abstract PDF(418KB) ( 377 )   
GSC (generalized signcryption) can realize encryption, signature and signcryption with only one algorithm. Formal definitions of identity-based generalized signcryption without a trusted party and a complete security model were proposed. A concrete scheme was presented using bilinear pairings. Its confidentiality and unforgeability were proved in the random oracle model under the BDH and CDH assumptions. Compared with other ID-based GSC schemes, the new scheme is also efficient.
Algebraic Immune Order of Correlation Immune Functions Satisfying Strict Avalanche Criterion
HUANG Jing-lian,WANG Zhuo and ZHANG Zhi-jie
Computer Science. 2013, 40 (4): 147-151. 
Abstract PDF(338KB) ( 307 )   
Using the derivative of Boolean functions and the custom e-derivative as research tools, we discussed the algebraic immunity of H Boolean functions with correlation immunity and weight 2^(n-1)+2^(n-2) which meet the strict avalanche criterion. We obtained the optimal algebraic immunity functions and their construction method for these functions with odd (n≥17) and even (n≥16) numbers of variables, gave a construction method for n-variable algebraic immunity functions whose algebraic immunity order is AI(f)≥8, and also gave methods for solving the annihilators and the minimum-algebraic-degree annihilators, together with the derivative relations between annihilators and the Boolean function. As research tools, the derivative of the Boolean function and the e-derivative defined from it can directly and explicitly depict the weight of Boolean functions and probe deep into the internal structure of the Boolean function's values.
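The basic tool of this analysis, the derivative of a Boolean function, is easy to compute from a truth table; a minimal sketch (the e-derivative and the annihilator constructions of the paper build on this, and are not shown):

```python
def derivative(f, a, n):
    """Derivative of an n-variable Boolean function in direction a:
    (D_a f)(x) = f(x) XOR f(x XOR a).  f is a truth table of length 2**n."""
    return [f[x] ^ f[x ^ a] for x in range(1 << n)]

def weight(f):
    # Hamming weight of a Boolean function: number of inputs mapped to 1.
    return sum(f)
```

For example, with f(x1, x0) = x1 AND x0, differentiating in the direction that flips x0 leaves the truth table of x1, since flipping x0 changes the output exactly when x1 = 1.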
Approach on Network Security Enhancement Strategies Based on Optimal Attack Path
LI Qing-peng,WANG Bu-hong,WANG Xiao-dong and ZHANG Chun-ming
Computer Science. 2013, 40 (4): 152-154. 
Abstract PDF(259KB) ( 579 )   
References | RelatedCitation | Metrics
To achieve a level of security appropriate to the target network, an approach to network security enhancement strategies based on the optimal attack path was proposed. The approach assesses the risk of the attack target and forms network security enhancement strategies on the basis of the optimal attack path. To obtain the optimal attack path, a generation approach based on the ant colony optimization algorithm was proposed, with an improved pheromone updating rule. Simulation results show that the proposed approach generates strategies effectively, and that the improved ant colony optimization algorithm has strong global search ability and rapid convergence.
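The ant-colony search for an optimal attack path can be sketched roughly as follows (illustrative Python: the toy attack graph, its edge costs, and the elitist pheromone update are our assumptions; the paper's improved update rule is not reproduced here):

```python
import random

# A toy attack graph: edge weights approximate attack cost (lower = easier).
# Node names and costs are illustrative, not from the paper.
GRAPH = {
    "entry": {"web": 2.0, "mail": 3.0},
    "web":   {"db": 4.0, "app": 1.5},
    "mail":  {"app": 2.5},
    "app":   {"db": 1.0},
    "db":    {},
}

def ant_walk(pheromone, alpha=1.0, beta=2.0, rng=random):
    """One ant builds a path from 'entry' to 'db' by roulette selection."""
    path, node = ["entry"], "entry"
    while node != "db":
        nbrs = GRAPH[node]
        weights = [pheromone[(node, n)] ** alpha * (1.0 / c) ** beta
                   for n, c in nbrs.items()]
        node = rng.choices(list(nbrs), weights=weights)[0]
        path.append(node)
    return path

def path_cost(path):
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def aco_optimal_attack_path(n_ants=20, n_iter=30, rho=0.5, seed=1):
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}
    best = None
    for _ in range(n_iter):
        for p in [ant_walk(pheromone, rng=rng) for _ in range(n_ants)]:
            if best is None or path_cost(p) < path_cost(best):
                best = p
        # Evaporate, then let only the best-so-far path deposit pheromone
        # (a common elitist variant; the paper's rule may differ).
        for e in pheromone:
            pheromone[e] *= (1 - rho)
        for a, b in zip(best, best[1:]):
            pheromone[(a, b)] += 1.0 / path_cost(best)
    return best
```

Here a lower edge cost stands in for an easier attack step, so the "optimal" attack path is the minimum-cost walk from entry point to target.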
Discovering Critical Nodes in Social Networks Based on Cooperative Games
WANG Xue-guang
Computer Science. 2013, 40 (4): 155-159. 
Abstract PDF(413KB) ( 490 )   
References | RelatedCitation | Metrics
Discovering critical nodes in social networks has many important applications and has attracted increasing attention from institutions and scholars. To find the top-K critical nodes in a social network, this paper presented a method based on cooperative games that obtains each node's marginal contribution using the Owen value, taking into account the community structure widespread in social networks; the critical nodes problem is then solved from these contributions. The feasibility and effectiveness of the method were verified on two synthetic datasets and four real datasets.
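The marginal-contribution idea behind this game-theoretic ranking can be sketched with the plain Shapley value (illustrative Python: the toy graph and the one-hop coverage value function are our assumptions; the paper uses the Owen value, which additionally respects a community partition):

```python
import itertools

# Toy undirected network, adjacency as sets. Illustrative only.
ADJ = {
    1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4},
    4: {3, 5}, 5: {4},
}

def coverage(coalition):
    """Value of a node set: how many nodes it reaches within one hop."""
    covered = set(coalition)
    for v in coalition:
        covered |= ADJ[v]
    return len(covered)

def shapley_values():
    """Exact Shapley value over all permutations (feasible only for tiny
    graphs): average each node's marginal contribution to the coalition
    of nodes placed before it."""
    nodes = list(ADJ)
    phi = {v: 0.0 for v in nodes}
    perms = list(itertools.permutations(nodes))
    for perm in perms:
        seen = set()
        for v in perm:
            phi[v] += coverage(seen | {v}) - coverage(seen)
            seen.add(v)
    return {v: phi[v] / len(perms) for v in nodes}

def top_k(k=2):
    phi = shapley_values()
    return sorted(phi, key=phi.get, reverse=True)[:k]
```

On this toy graph the hub node 3 receives the largest value, matching the intuition that removing it disconnects the most coverage.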
New Near Space Security Handoff Scheme Based on Context Transfer
XU Guo-yu,CHEN Xing-yuan and DU Xue-hui
Computer Science. 2013, 40 (4): 160-163. 
Abstract PDF(319KB) ( 431 )   
References | RelatedCitation | Metrics
To solve the problem of security handoff in near space, a new security handoff scheme based on context transfer was proposed. Firstly, a handoff destination estimation algorithm was designed based on the Doppler shift mechanism, which can estimate the handoff time and location. Secondly, the previous base station sends the authentication message to the next base station in advance based on context transfer, which increases handoff efficiency. Performance analysis and simulation results show that the communication and computation overhead of the scheme is small and the forced termination probability is low. The scheme is suitable for near space applications.
Software Reliability Forecasting Model Based on Empirical Mode Decomposition and Gene Expression Programming
ZHANG De-ping,WANG Shuai and ZHOU Wu-jie
Computer Science. 2013, 40 (4): 164-168. 
Abstract PDF(496KB) ( 490 )   
References | RelatedCitation | Metrics
A forecasting method based on empirical mode decomposition (EMD) and gene expression programming (GEP) was presented and applied to software reliability forecasting. Firstly, the software failure samples were pre-processed to eliminate pseudo-data, and the intrinsic mode functions (IMFs) and the residue of different frequency bands were obtained by EMD. The corresponding failure data series in the IMFs and the residue were then chosen as training samples and, exploiting the flexible expressive capacity of GEP, a forecasting model was built for each IMF and for the residue. Finally, the ultimate forecast was obtained by reconstructing the forecasting results of each IMF and the residue. EMD avoids the difficulty of selecting a proper wavelet function in wavelet transforms, and the results indicate that the IMFs reflect the characteristics of software failure. Comparison with a combination of SVR and GEP shows that the EMD&GEP method forecasts software reliability better.
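The decompose-forecast-reconstruct pipeline described here can be sketched as follows (illustrative Python with heavy substitutions: a moving-average split stands in for EMD's sifting of IMFs, and naive linear extrapolation stands in for the evolved GEP model of each component):

```python
def moving_average(x, w=5):
    """Crude trend extraction; a stand-in for the EMD residue (real EMD
    sifts out IMFs, which is considerably more involved)."""
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1])
            for i in range(len(x))]

def decompose(x):
    trend = moving_average(x)
    detail = [a - b for a, b in zip(x, trend)]
    return detail, trend          # stand-ins for an IMF and the residue

def forecast_next(series):
    """Placeholder per-component forecaster (naive linear extrapolation);
    the paper evolves a GEP model per component instead."""
    return 2 * series[-1] - series[-2]

def emd_gep_forecast(failure_times):
    detail, trend = decompose(failure_times)
    # Forecast each component separately, then reconstruct by summing,
    # mirroring the paper's per-IMF forecasting and reconstruction step.
    return forecast_next(detail) + forecast_next(trend)
```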
Relevance Feedback-based Search of Topic Time Series Similarity in Micro-blogging
BAO Hong-yun,LI Qiu-dan,SONG Shuang-yong and GAO Heng
Computer Science. 2013, 40 (4): 169-171. 
Abstract PDF(454KB) ( 413 )   
References | RelatedCitation | Metrics
A new approach based on relevance feedback was proposed for topic time series similarity search in micro-blogging. By considering whether the user is satisfied with the returned time series, we established an objective function for learning the coefficients of the metric, which reflect the user's accurate interest. The approach can therefore provide the user with more satisfying topic time series in micro-blogging. We also developed a topic time series similarity search system for micro-blogging based on the new approach. Experimental results on Twitter data show the effectiveness of the proposed approach.
All Pairs Label-constraint Path Query in Large Graph
BAO Jia-jia and TIAN Wei
Computer Science. 2013, 40 (4): 172-176. 
Abstract PDF(472KB) ( 370 )   
References | RelatedCitation | Metrics
Graph data is used to model open and heterogeneous data such as social networks, biological networks and the semantic Web. Edge-labeled graphs are drawing researchers' attention for their power to describe path reachability. Their fundamental problem is the label-constraint path query, which returns true or false for a given vertex pair. On this basis, we put forward the all-pairs label-constraint path query problem. Two difficulties arise in solving it: 1) answering it with single-pair label-constraint path queries requires exhaustively enumerating all pairs of vertices; 2) the spanning tree method cannot support the all-pairs path query even though it answers single path queries efficiently. In this work, we compressed the label path transitive closure through a spanning tree and accelerated queries with an inverted index. We also gave two optimized algorithms for searching answers on the spanning tree. Extensive experiments validate the effectiveness and efficiency of our approach in both computing time and storage space.
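The basic label-constraint path query that the all-pairs problem generalizes can be sketched as a BFS restricted to allowed edge labels (illustrative Python: the toy edge-labeled graph is our assumption, and the paper answers the all-pairs version with a compressed spanning-tree index rather than this per-pair search):

```python
from collections import deque

# Edge-labeled digraph: node -> list of (neighbor, label). Illustrative data.
EDGES = {
    "a": [("b", "knows"), ("c", "works_with")],
    "b": [("d", "knows")],
    "c": [("d", "manages")],
    "d": [],
}

def label_constrained_reachable(src, dst, allowed):
    """True iff dst is reachable from src using only edges whose label
    is in `allowed`."""
    seen, q = {src}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            return True
        for v, lab in EDGES[u]:
            if lab in allowed and v not in seen:
                seen.add(v)
                q.append(v)
    return False

def all_pairs(allowed):
    """The all-pairs variant by brute force: exactly the exhaustive
    enumeration the paper's index is designed to avoid."""
    nodes = list(EDGES)
    return {(u, v) for u in nodes for v in nodes
            if u != v and label_constrained_reachable(u, v, allowed)}
```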
Method of Inconsistent Database Opening Branches Repairs Based on Tableau Node Closed Value
GAO Long,LIU Quan,FU Qi-ming and LI Jiao
Computer Science. 2013, 40 (4): 177-180. 
Abstract PDF(395KB) ( 320 )   
References | RelatedCitation | Metrics
Extending the tableau method to inconsistent database repair, a new method that uses the value of branch closure to repair the database was proposed. The method combines analysis of the tableau open and closed reasoning criteria: based on the open formula tree TP(IC∪r), it introduces a closed value for every node of TP(IC∪r). According to the definition of the node closed value, branch repairs can be selected by computing node closed values, directly determining the database instances that need repair. The method also considers repair with I-closure and extends open branches to repair, with logical proof. Finally, the logical feature of consistent query answering is proved.
Improved K-means Clustering Algorithm Based on DKC in Uncertain Region Environment
REN Pei-hua and WANG Li-zhen
Computer Science. 2013, 40 (4): 181-184. 
Abstract PDF(322KB) ( 386 )   
References | RelatedCitation | Metrics
This paper presented an improved K-means clustering algorithm based on DKC for uncertain region environments, namely U2d-Kmeans. The algorithm first takes uncertainty factors into account in the description of data objects, then applies a new data-set pre-treatment (removing isolated points) and the cumulative-distance method of determining initial cluster centers introduced in the 2d-Kmeans algorithm. These steps avoid the clustering instability caused by random selection of initial cluster centers. Comparison experiments prove that the improved U2d-Kmeans is more objective and effective than the other two algorithms.
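The two pre-treatment steps mentioned here, isolated-point removal and deterministic initial-center selection, can be sketched as follows (illustrative Python: the neighbor-count test and this particular cumulative-distance rule are our reading, not necessarily the 2d-Kmeans formulas):

```python
def dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def remove_isolated(points, radius=2.0, min_neighbors=1):
    """Pre-treatment: drop points with too few neighbors within `radius`
    (a simple stand-in for the paper's isolated-point removal)."""
    return [p for p in points
            if sum(1 for q in points if q != p and dist(p, q) <= radius)
            >= min_neighbors]

def initial_centers(points, k):
    """Deterministic seeding: start from the most central point, then
    repeatedly add the point with the largest cumulative distance to the
    centers chosen so far. This replaces random seeding, the source of
    the clustering instability the abstract mentions."""
    first = min(points, key=lambda p: sum(dist(p, q) for q in points))
    centers = [first]
    while len(centers) < k:
        centers.append(max((p for p in points if p not in centers),
                           key=lambda p: sum(dist(p, c) for c in centers)))
    return centers
```

Because both steps are deterministic, repeated runs on the same data produce the same initial centers, unlike random K-means seeding.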
Commonsense Knowledge Analysis Approach Based on Event Preconditions and Effects
LI Shan-shan and CAO Cun-gen
Computer Science. 2013, 40 (4): 185-192. 
Abstract PDF(694KB) ( 468 )   
References | RelatedCitation | Metrics
Event reasoning requires the support of commonsense event knowledge. We proposed a method of event knowledge analysis in order to analyze the preconditions and effects of events from multiple perspectives. The analysis follows three phases: fundamental analysis, knowledge supplement, and knowledge optimization. To address comprehensiveness and accuracy, we summarized four commonsense knowledge dimensions: physiological, psychological, social and physical. A method of role specification was proposed for the knowledge supplement phase. Experiments show that this method is promising for acquiring comprehensive and accurate commonsense event knowledge.
Integrated Scheduling Algorithm of Two Workshops Based on ACPM
XIE Zhi-qiang,ZHOU Han-xiao,GUI Zhong-yan and ZHENG Fu-ping
Computer Science. 2013, 40 (4): 193-198. 
Abstract PDF(527KB) ( 470 )   
References | RelatedCitation | Metrics
Aiming at the problem of effectively allocating the processes of complex products to two workshops with the same equipment resources, this paper proposed an integrated scheduling algorithm for two workshops based on ACPM. To balance the load of the two workshops, process fully in parallel and complete the product as soon as possible, the algorithm follows the ACPM and adopts a pre-scheduling strategy that keeps the finishing times of the two workshops close. To reduce the number of relocations between the two workshops, the algorithm assigns each fork-point process with in-degree no less than two to the workshop holding more of its assigned predecessors; assigns processes with in-degree less than two whose successor has in-degree no less than two to the workshop in which they finish earliest; and applies the pre-scheduling strategy to the remaining processes, each with a single predecessor and successor, so that the strings formed by their leaf nodes are finished as whole strings. Examples show that the algorithm better achieves distributed scheduling of two workshops with the same equipment resources.
Context-aware Knowledge Acquisition and Reasoning Based on Description Logic
HU Bo,WANG Zhi-xue,DONG Qing-chao and NIU Yan-jie
Computer Science. 2013, 40 (4): 199-203. 
Abstract PDF(425KB) ( 753 )   
References | RelatedCitation | Metrics
Aiming at the problems of uncertain models and lack of automatic reasoning support in context-aware techniques, this paper proposed an approach to modeling and reasoning about context-aware knowledge based on description logic. It suggests a context model framework under the semantic restrictions of the ontology, which divides the context model into a two-level structure of meta ontology and domain-specific ontology according to the abstraction hierarchy. The objectives of the framework are to model a set of high-level entities and to provide flexible extensibility for adding specific concepts in different application domains. The paper then proposed an algorithm that converts the context model into knowledge described in the description logic SHOIN(D). Finally, a case study illustrates the practicality of the method.
Gear Fault Diagnosis Based on Margin Distribution Ensemble Optimization
HU Qing-hua,ZHU Peng-fei and ZUO Ming
Computer Science. 2013, 40 (4): 204-208. 
Abstract PDF(415KB) ( 334 )   
References | RelatedCitation | Metrics
Gear crack level identification is of great significance for gearbox fault diagnosis. To address the instability and limited performance of current identification, we generated a set of neighborhood separable subspaces by randomized attribute reduction and trained a set of base classifiers on them. The weight vector of the base classifiers was learned by optimizing an ensemble margin loss with regularization so as to change the margin distribution. Base classifiers were ranked by weight, and the subset of classifiers that maximizes ensemble classification accuracy on the training set was selected. Experimental analysis shows that the proposed method is much better than other state-of-the-art methods in crack level identification.
Internal P-reasoning Information Recovery and Attribute Hiding Reasoning Discovery
ZHAO Shu-li,WU Song-li and SHI Kai-quan
Computer Science. 2013, 40 (4): 209-213. 
Abstract PDF(380KB) ( 335 )   
References | RelatedCitation | Metrics
P-sets (packet sets) are a dynamic model obtained by embedding dynamic characteristics into the finite Cantor set X and improving it. P-sets are a set pair composed of the internal P-set X (internal packet set X) and the outer P-set X^F (outer packet set X^F); (X^F, X) is a P-set. P-reasoning consists of internal P-reasoning (internal packet reasoning) and outer P-reasoning (outer packet reasoning). Using internal P-sets and internal P-reasoning, the paper gave the concept and characteristics of internal P-information recovery, its generation by internal P-reasoning and the attribute hiding of internal P-information recovery, the information element supplement theorem and dependence theorem of internal P-information recovery, and the attribute hiding theorem and attribute hiding discovery theorem of internal P-reasoning information. Based on these results, an application of internal P-reasoning information recovery in an information system was given.
Layered Construction Algorithm and Application of Rough Concept Lattice
LIU Bao-xiang,CHEN Huan-huan and LIU Jie-bing
Computer Science. 2013, 40 (4): 214-216. 
Abstract PDF(210KB) ( 337 )   
References | RelatedCitation | Metrics
A rough concept lattice can reflect the certain and uncertain relationships between objects and attributes and can deal with uncertain knowledge. Constructing concept lattices efficiently is an important task in applications. Analyzing the concept and structure of the rough concept lattice, and drawing on ideas from general concept lattice construction, this paper proposed a layered construction algorithm that takes the decision-attribute values in the decision context as its starting point, enriching construction theory. Examples show that the algorithm is simple and intuitive and achieves good results.
New Cooperative Multi-robot Path Planning Algorithm
XIAO Guo-bao and YAN Xuan-hui
Computer Science. 2013, 40 (4): 217-220. 
Abstract PDF(397KB) ( 580 )   
References | RelatedCitation | Metrics
This paper presented a new approach to multi-robot path planning in dynamic environments. The multi-robot architecture, combining centralized and distributed control, mitigates the poor global properties of a purely distributed setting and the poor real-time performance of a purely centralized one. On this basis, an immune cooperative co-evolution algorithm and the artificial potential field (APF) algorithm were combined to solve the global and local path planning problems, giving the robots better global coordination and self-adaptive ability. Dynamic simulation results show the feasibility and efficiency of the algorithm.
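The local planning side can be sketched with the classical artificial potential field force (illustrative Python: the gain constants and influence radius are arbitrary, and the coupling with an immune cooperative co-evolution global planner is not shown):

```python
def apf_force(robot, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0):
    """Classical APF force: attraction toward the goal plus repulsion
    from obstacles inside influence radius d0. Constants illustrative."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = (dx * dx + dy * dy) ** 0.5
        if 0 < d < d0:
            # Standard repulsive gradient magnitude, vanishing at d0.
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

def step(robot, goal, obstacles, dt=0.05):
    """Move the robot one small step along the resultant force."""
    fx, fy = apf_force(robot, goal, obstacles)
    return robot[0] + dt * fx, robot[1] + dt * fy
```

APF alone can stall in local minima, which is one reason to pair it with a global planner as the paper does.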
Rough Approximations of Intuitionistic Fuzzy Sets in Fuzzy Approximation Space
XUE Zhan-ao,CHENG Hui-ru,HUANG Hai-song and XIAO Yun-hua
Computer Science. 2013, 40 (4): 221-226. 
Abstract PDF(433KB) ( 334 )   
References | RelatedCitation | Metrics
The combination of rough sets and intuitionistic fuzzy sets is a new research hotspot. Under a fuzzy equivalence relation, the intuitionistic fuzzy rough approximation operators were reconstructed in fuzzy approximation space, and their properties were proved based on the γ operator and its complementary operator in the fuzzy approximation space. The λ upper (lower) approximation and the λ upper (lower) approximation of the αβ-cut set were presented, and their properties were proved in fuzzy approximation space. Furthermore, the rough degree ρ_αβ(A) of an intuitionistic fuzzy set A and the rough degree of its αβ-cut set were introduced in fuzzy approximation space, and their properties were discussed.
Similarity Measure for Time Series Based on Incremental Dynamic Time Warping
LI Hai-lin and YANG Li-bin
Computer Science. 2013, 40 (4): 227-230. 
Abstract PDF(350KB) ( 584 )   
References | RelatedCitation | Metrics
To address the expensive time cost of similarity measurement, an incremental dynamic time warping (IDTW) method to measure the similarity between two time series was proposed. First, dynamic time warping (DTW) is used to measure the similarity of the past time sequences, retrieving the best warping path and the cumulative distance cost of each element on it. Next, after computing the similarity of the two current time series by a backward warping method, a new warping path that intersects the past one and has minimal warping distance is obtained. Finally, incremental dynamic warping realizes the similarity measurement. The new method measures similarity well and is efficient to compute. Numerical experiments demonstrate that the classification accuracy and computing performance of IDTW are better than those of DTW.
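The baseline being improved on, standard dynamic time warping, can be sketched as follows (illustrative Python; IDTW's incremental reuse of the previous cumulative table and warping path is not reproduced here):

```python
def dtw(x, y):
    """Standard DTW distance with an O(len(x)*len(y)) cumulative-cost
    table, recomputed from scratch on every update. This full
    recomputation is exactly the cost IDTW avoids."""
    inf = float("inf")
    n, m = len(x), len(y)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Extend the cheapest of the three admissible predecessors.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

For example, `dtw([1, 2, 3], [2, 3, 4])` evaluates to 2.0: the warping aligns each element with its nearest counterpart, leaving unit cost only at the two ends.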
Neighborhood Based Rough Sets in Incomplete Interval-valued Information System
WANG Tian-qing and XIE Jun
Computer Science. 2013, 40 (4): 231-235. 
Abstract PDF(349KB) ( 301 )   
References | RelatedCitation | Metrics
The study and use of unknown interval values are still in their infancy. The complex incomplete interval-valued information system, in which all unknown values are treated as lost, was investigated in depth; using grey lattice operations and the Hausdorff distance, we provided a new neighborhood relationship in an interval-valued information system. Furthermore, three forms of rough set models were proposed, based on the neighborhood relationship, maximal consistent blocks and neighborhood systems, to improve the accuracy of approximations. Three numerical examples were employed to substantiate the conceptual arguments.
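The neighborhood relation built from the Hausdorff distance can be sketched for a single interval-valued attribute (illustrative Python: treating unknown '*' values as lost, i.e. matching nothing but themselves, is our reading of the lost-value semantics):

```python
def hausdorff(a, b):
    """Hausdorff distance between closed intervals [a0,a1] and [b0,b1];
    for intervals on the real line it reduces to the larger endpoint
    displacement."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def neighborhood(objects, attr_values, target, delta):
    """Objects whose interval value lies within delta of the target's,
    shown here for one attribute; unknown ('*') values never match."""
    t = attr_values[target]
    if t == "*":
        return {target}
    return {o for o in objects
            if attr_values[o] != "*" and hausdorff(attr_values[o], t) <= delta}
```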
Study on Intelligent Test Paper Generation Strategy through Improved Quantum-behaved Particle Swarm Optimization
LI Xin-ran and FAN Yong-sheng
Computer Science. 2013, 40 (4): 236-239. 
Abstract PDF(366KB) ( 560 )   
References | RelatedCitation | Metrics
An improved quantum-behaved particle swarm optimization algorithm for intelligent test paper generation was put forward. First, the inertia weight is expressed as a function of particle evolution velocity and particle aggregation, both explicitly defined. Second, a slowly varying function is introduced into the traditional position updating formula to effectively overcome trapping in local optima. Finally, the test paper generation problem is modeled mathematically based on item response theory. Simulation results show that, compared with the standard particle swarm optimization algorithm and the quantum-behaved particle swarm algorithm, the proposed algorithm performs better in the success rate and efficiency of test paper generation.
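The adaptive inertia weight can be sketched as a function of evolution speed and swarm aggregation, plugged into the usual velocity update (illustrative Python: the functional form and constants are our assumptions, not the paper's formulas, and the slowly varying position term is omitted):

```python
import random

def inertia_weight(evo_speed, aggregation, w_min=0.4, w_max=0.9):
    """Illustrative form: w shrinks as evolution speeds up (exploit) and
    grows as the swarm aggregates (re-diversify), to counter premature
    convergence. Inputs are assumed normalized to [0, 1]."""
    w = w_max - (w_max - w_min) * evo_speed + 0.1 * aggregation
    return max(w_min, min(w_max, w))

def pso_step(x, v, pbest, gbest, evo_speed, aggregation,
             c1=2.0, c2=2.0, rng=random):
    """One standard PSO velocity/position update using the adaptive w."""
    w = inertia_weight(evo_speed, aggregation)
    v = [w * vi + c1 * rng.random() * (pi - xi) + c2 * rng.random() * (gi - xi)
         for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
    x = [xi + vi for xi, vi in zip(x, v)]
    return x, v
```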
Analysis of Cooperative Game in Repeated Prisoners’ Dilemma Based on Reputation Mechanisms
LI Dong,JIANG Jun-li and TANG Xiao-jia
Computer Science. 2013, 40 (4): 240-243. 
Abstract PDF(345KB) ( 421 )   
References | RelatedCitation | Metrics
Players in the repeated Prisoner's Dilemma have the possibility of mutual cooperation. On this basis, we examined a class of two-state reputation mechanisms and found that only three reputation mechanisms are efficient robust perfect Nash equilibria in Markov strategies. The research shows that the strategy of cooperating with good opponents and defecting against bad opponents is a global attractor, and hence cooperation is successfully achieved and sustained in the long run.
Reduction in Incomplete Hybrid Decision System Based on Generalized Neighborhood Relationship
XU Jiu-cheng,ZHANG Ling-jun,SUN Lin and LI Shuang-qun
Computer Science. 2013, 40 (4): 244-248. 
Abstract PDF(385KB) ( 313 )   
References | RelatedCitation | Metrics
To deal directly with incomplete, symbolic and numeric hybrid data, a new generalized neighborhood relationship was constructed by combining the relative neighborhood relationship with the tolerance relationship. Under the generalized neighborhood relationship, a conditional entropy for incomplete hybrid decision systems was defined on the basis of information entropy, and the attribute significance under this conditional entropy was proved to contain that of the positive regions. A reduction algorithm for incomplete hybrid decision systems based on the conditional entropy was then constructed. Experiments on six hybrid-attribute UCI datasets compared the proposed method with similar methods in terms of the number of selected features, classification accuracy and run time. The results show that feature selection based on the proposed extended rough set model is effective.
Cellular Automata Algorithm for Solving Optimization Problems Based on Memory Principles and its Global Convergence Proof
LU Qiu-qin,NIU Qian-qian and HUANG Guang-qiu
Computer Science. 2013, 40 (4): 249-255. 
Abstract PDF(583KB) ( 654 )   
References | RelatedCitation | Metrics
To solve large-scale optimization problems (OPs), an algorithm with global convergence was constructed based on the characteristics of memory principles (MP) and cellular automata (CA). In the algorithm, the theoretical search space of the OP is discretized, and the discrete space is defined as the cellular space, where each cell is an alternative solution of the OP. The memorizing and forgetting rules of MP control the state transitions of each cell. A cellular state consists of position, position increment and residual memory, the latter divided into three kinds of memory state (instantaneous, short-term and long-term), each strengthened or weakened by the strength of accepted stimuli. A cell is forgotten and discarded when its residual memory falls below a threshold. During evolution, a cell's transfer from one state to another realizes the search of the cellular space over the theoretical search space. The stability condition of a reducible stochastic matrix was applied to prove the global convergence of the algorithm. A case study shows that the algorithm is efficient.
Incremental Maintenance of Concept Lattice and Association Rules under Granularity of Relation
ZHI Hui-lai and ZHI Dong-jie
Computer Science. 2013, 40 (4): 256-258. 
Abstract PDF(239KB) ( 319 )   
References | RelatedCitation | Metrics
To meet the needs of concept lattice applications in dynamic environments, a way to maintain the concept lattice under the granularity of the relation is needed. Firstly, we put forward and defined the term father-son concept pair and presented an incremental maintenance algorithm. Secondly, we showed that association rules can be calculated from the intent reduction of each concept, which is determined by the intersection of its father concept's intent with its own intent. Finally, we put forward a method for intent updating.
Research of Multi-objective and Multi-mode Project Scheduling Problem Based on Chaos Particle Swarm Optimization
ZHOU Rong,YE Chun-ming,XIE Yang and CHEN Jun-lan
Computer Science. 2013, 40 (4): 259-262. 
Abstract PDF(425KB) ( 350 )   
References | RelatedCitation | Metrics
Keeping time, cost, quality and resources in balance is the key factor in building the general objective of engineering project scheduling and bears on the success or failure of the whole project. Basic particle swarm optimization easily traps in local optima. With this in mind, this paper presented a chaos particle swarm optimization algorithm, built a comprehensive optimization model by establishing time, expense, resource and quality objective functions, and used chaos particle swarm optimization based on priority rules to solve the model. Through an application example, the article also showed that, compared with the basic particle swarm algorithm, the chaos particle swarm optimization algorithm solves the multi-objective optimization problems of this model more accurately and rapidly.
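A common chaos-PSO ingredient is to seed particle positions with a logistic-map sequence instead of uniform random numbers (illustrative Python: the exact chaotic scheme used in the paper and its priority-rule encoding for project activities are not shown):

```python
def logistic_map_sequence(x0, n, mu=4.0):
    """Chaotic sequence in (0,1); mu=4 is the fully chaotic regime of
    the logistic map x <- mu*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x * (1 - x)
        xs.append(x)
    return xs

def chaotic_init(n_particles, dim, lo, hi, x0=0.357):
    """Initialize PSO positions by scaling logistic-map samples into
    [lo, hi], spreading particles ergodically over the range. x0 is an
    arbitrary seed avoiding the map's fixed points."""
    seq = iter(logistic_map_sequence(x0, n_particles * dim))
    return [[lo + (hi - lo) * next(seq) for _ in range(dim)]
            for _ in range(n_particles)]
```

The ergodicity of the chaotic sequence is what helps the swarm escape the premature convergence noted for basic PSO.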
Property Patterns of Markov Decision Process Nondeterministic Choice Scheduler
HUANG Zhen-jin,LU Yang,YANG Juan and FANG Huan
Computer Science. 2013, 40 (4): 263-266. 
Abstract PDF(299KB) ( 375 )   
References | RelatedCitation | Metrics
Markov decision processes can model complex systems with nondeterminism. Schedulers are required to resolve the nondeterministic choices during model analysis. This paper studied the time- and space-bounded reachability probabilities of Markov decision processes under different schedulers. Firstly, the formal definition and a classification of schedulers for nondeterminism were proposed; we then proved that the reachability probabilities coincide for deterministic and randomized schedulers in the time-abstract setting. It was also proved that time-dependent schedulers generally induce probability bounds that exceed those of the corresponding time-abstract schedulers. Finally, two cases illustrate the correctness of the conclusions.
SAR Image Matching Pretreatment Algorithm Based on Line Feature Histogram
ZHANG Hui-hui,LIN Wei and LV Quan-yi
Computer Science. 2013, 40 (4): 267-270. 
Abstract PDF(1077KB) ( 301 )   
References | RelatedCitation | Metrics
Due to speckle noise and imaging conditions, the gray-level statistics of synthetic aperture radar (SAR) images of the same scene differ greatly, which makes high-precision registration based on the direct application of image features difficult. To solve the problem, we proposed a matching pre-treatment algorithm that determines the angle and scale changes between images by statistical analysis of the main line feature information, realizing matching pre-treatment of the images. Experimental results indicate that the algorithm effectively and accurately finds the angle and scale changes between the images before and after change. Compared to traditional image registration based on the direct application of image features, the accuracy and efficiency of registering SAR images processed by our matching pre-treatment algorithm are greatly improved.
Method of Human Facial Feature Points Positioning Based on Improved ASM
HAN Yu-feng and WANG Xiao-lin
Computer Science. 2013, 40 (4): 271-274. 
Abstract PDF(544KB) ( 370 )   
References | RelatedCitation | Metrics
In the search of the ASM algorithm for target points, only the local gray-level information surrounding pixel points in the profile neighborhood of the training images is adopted, the pixel points are treated equally, and the search space is limited to several pixel points along the normal direction on both sides of the fixed points of the target image; this search space is clearly too simple. Considering that color facial images are easily obtained and rich in information, the paper suggested, firstly, that the images be processed on the three channels R, G and B; secondly, that pixel points in the profile neighborhood be given different weights when constructing the local gray-level model; and thirdly, that the search space be extended to pixel points on the traditional normal line and two adjacent parallel normal lines. Experimental results show that the new algorithm improves positioning accuracy by 16.5% over the traditional algorithm, so the improvement is feasible.
Semantic-based Automatic Extraction of Reusable Regions in 3D CAD Models
BAI Jing
Computer Science. 2013, 40 (4): 275-281. 
Abstract PDF(1188KB) ( 326 )   
References | RelatedCitation | Metrics
Aiming at partial retrieval and reuse, a semantic-based automatic extraction approach for reusable regions was proposed. Given an engineering part, the defined design feature model accurately describes both its features, of different scales and forms, and the relationships between them using an extended feature tree, effectively capturing the design semantics of the model. Based on an analysis of reusable regions in 3D CAD model design, a reusability metric function for a given local region was defined, and an algorithm that automatically extracts candidate reusable regions from the input extended feature tree was put forward. Finally, the reuse rate of each extracted candidate region in a given library was calculated, and candidates with high reuse rates were identified as final reusable regions. The proposed approach was implemented and tested. Preliminary results show that the reusable regions extracted by this method meet design reuse requirements and effectively support partial retrieval for design reuse.
Research on Dynamics Particles Modeling of Water into Vessel
SONG Chuan-ming,LI Ting-ting,WANG Xiang-hai and CHENG Chen
Computer Science. 2013, 40 (4): 282-286. 
Abstract PDF(631KB) ( 311 )   
References | RelatedCitation | Metrics
Modeling approaches based on physical processes can simulate the real state of fluid movement and obtain good simulation results. However, different forms of fluid, and the same fluid at different levels of detail, usually differ greatly in the property parameters involved in modeling. This article first analyzed the principles of dynamic fluid simulation. Then, based on the smoothed particle hydrodynamics model, it studied water poured into containers of four different shapes, proposed a block-based neighborhood particle search method to improve neighborhood particle search speed over a uniform grid space, and gave methods for determining collision responses between fluids and containers. Finally, simulations of water poured into the different containers verify the effectiveness of the established model.
Adaptive Threshold Background Modeling Algorithm Based on Chebyshev Inequality
ZHANG Kun,WANG Cui-rong and WAN Cong
Computer Science. 2013, 40 (4): 287-291. 
Abstract PDF(1248KB) ( 627 )   
References | RelatedCitation | Metrics
Background modeling is a critical element of detecting and tracking moving objects. An adaptive threshold background modeling algorithm based on the Chebyshev inequality was proposed to meet the real-time requirements of background modeling in real-time video surveillance systems. Firstly, the Chebyshev inequality and an adaptive threshold are used to calculate the probability of each pixel belonging to the foreground or background, classifying pixels as foreground points, background points or suspicious points. Secondly, kernel density estimation is used to estimate the probability of the suspicious points for further discrimination. Finally, the background image is extracted through a real-time background update algorithm. Experimental results show that the Chebyshev inequality quickly distinguishes foreground and background points, the further kernel density estimation reduces background segmentation error, and the real-time background image results are satisfactory. The algorithm significantly improves computing speed and is suitable for real-time video monitoring systems.
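The first-stage pixel classification can be sketched directly from Chebyshev's inequality, which bounds P(|X − μ| ≥ kσ) ≤ 1/k² for any distribution (illustrative Python: the thresholds and the running update are our assumptions, not the paper's):

```python
def classify_pixel(value, mean, var, k=3.0, eps=1e-6):
    """A pixel far outside k standard deviations is confidently
    foreground; one well inside is confidently background; the band in
    between is left 'suspicious' for the slower kernel-density test.
    Thresholds are illustrative."""
    std = max(var, eps) ** 0.5
    dev = abs(value - mean) / std
    if dev >= k:
        return "foreground"
    if dev <= k / 2:
        return "background"
    return "suspicious"

def update_background(mean, var, value, alpha=0.05):
    """Running per-pixel background model update (illustrative)."""
    mean = (1 - alpha) * mean + alpha * value
    var = (1 - alpha) * var + alpha * (value - mean) ** 2
    return mean, var
```

Because the Chebyshev bound is distribution-free, the fast test needs no assumption that pixel intensities are Gaussian.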
Articulation Points Extraction without Manual Intervention and 3D Reconstruction
WEI Wei,WANG Dan-dan,LIU Jing and LIU Ming
Computer Science. 2013, 40 (4): 292-294. 
Abstract PDF(731KB) ( 316 )   
References | RelatedCitation | Metrics
With the development of research on human motion capture and behavior understanding, demand is increasing. Previously, articulation points were extracted manually as feature points; making the extraction of feature points more automated is therefore meaningful for later research on motion capture and behavior understanding. This paper proposed a method of extracting a moving human's articulation points and obtaining their positions on the first frame automatically from monocular video sequences, overcoming the disadvantage of the traditional method of manually extracting points from a marked human body. The points are then tracked with the sparse Lucas-Kanade optical flow algorithm to acquire two-dimensional coordinate information of the moving human. Finally, combining the camera model, the relative depth of the points is obtained through geometric calculation.
Wavelet Frame Based Blind Image Inpainting
LIU Chun-li and ZHANG Gong
Computer Science. 2013, 40 (4): 295-297. 
Abstract PDF(985KB) ( 393 )   
References | RelatedCitation | Metrics
Image inpainting has been widely used in practice to repair damaged/missing pixels of given images.Most existing inpainting techniques require a priori information about which pixels are damaged.However,in certain applications,such information is neither available nor reliably pre-detectable.This paper introduced a blind inpainting model to solve this type of problem.A tight-frame-based regularization approach was developed for such blind inpainting problems,and the resulting minimization problem is solved by the split Bregman algorithm.The experiments show that the method is efficient in image inpainting.
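Two building blocks of such a scheme can be sketched briefly: the soft-thresholding (shrinkage) operator that solves the l1 subproblem in each split Bregman iteration, and a one-level Haar analysis/synthesis pair standing in for the tight frame (the paper's actual frame and parameters may differ):

```python
import numpy as np

def shrink(x, lam):
    """Soft-thresholding operator used in each split Bregman iteration:
    shrink(x, lam) = sign(x) * max(|x| - lam, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def haar1d(x):
    """One-level orthonormal Haar analysis: approximation and detail."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar1d(a, d):
    """Haar synthesis; perfectly reconstructs the input of haar1d."""
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

A blind-inpainting iteration would alternate a data-fitting step in image space with `shrink` applied to the frame coefficients.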
Transmission Lines Fitting and Towers Positioning in LiDAR Point Cloud
YOU An-qing,HAN Xiao-yan,LI Shi-ping and YAN Zhao-jin
Computer Science. 2013, 40 (4): 298-300. 
Abstract PDF(981KB) ( 242 )   
References | RelatedCitation | Metrics
For a 3D point cloud generated by LiDAR scanning of an electric power system,vertical projection and a histogram over an even grid were used to convert the cloud into a 2D grey image.The highest histogram bins were taken as the horizontal positions of the electric towers.The towers were separated,and meanwhile the lines were divided into several segments along their stretching direction.For each segment,projection was made along the normal direction of its pendent plane,and iterative robust quadratic fits were made on the projected points to divide them into different layers from the top down.Extraction of towers and fitting of lines are important for spatial distance checking,reconstruction of the line point cloud and scene roaming.
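The tower-localization step (project to the ground plane, bin into an even grid, pick the densest bins) can be sketched as follows; the grid size and number of towers are illustrative parameters, not the paper's values:

```python
import numpy as np

def tower_candidates(xy, grid=1.0, n_towers=2):
    """Bin ground-plane point coordinates xy (shape (n, 2)) into an
    even grid and return the n_towers cells with the most returns.

    Towers are tall vertical structures, so their cells accumulate far
    more LiDAR returns than the cells under the spans, making them the
    highest histogram bins.
    """
    cells = np.floor(xy / grid).astype(int)
    cells -= cells.min(axis=0)          # shift to non-negative indices
    hist = np.zeros(cells.max(axis=0) + 1, dtype=int)
    np.add.at(hist, (cells[:, 0], cells[:, 1]), 1)
    top = np.argsort(hist, axis=None)[::-1][:n_towers]
    return np.array(np.unravel_index(top, hist.shape)).T, hist
```

With two dense clusters and a sparse line of points between them, the two cluster cells come out on top.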
Remote Sensing Images Fusion Method Based on Morphology and Regional Feature of Contourlet Coefficients
ZHU Kang and HE Xin-guang
Computer Science. 2013, 40 (4): 301-305. 
Abstract PDF(1422KB) ( 321 )   
References | RelatedCitation | Metrics
In the fusion of remote sensing images,retaining the spatial information of panchromatic images conflicts with preserving the spectral information of multi-spectral images,so achieving the best fusion result under this trade-off has long been a research hotspot in the image fusion field.A new selective fusion method based on morphology and the contourlet transform was proposed in this paper,aimed at the fusion of images that contain several kinds of surface features.In this algorithm,the edge information of the image is first separated from the non-edge regions by morphology;the contourlet transform is then applied to obtain the approximate components and a series of detail components;next,the approximate and detail images are fused selectively in the contourlet coefficient domain by applying different fusion rules based on appropriate criteria;finally,the resultant image is obtained by the inverse contourlet transform and the inverse IHS transform.The results of the fusion experiment show that the proposed algorithm is feasible and effective for the fusion of remote sensing images that have different surface features.
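The morphological edge/non-edge separation can be illustrated with a morphological gradient (dilation minus erosion with a 3x3 structuring element), which is large exactly where grey levels change quickly. A padded NumPy sketch, not necessarily the paper's exact operator:

```python
import numpy as np

def morph_gradient(img):
    """Morphological gradient with a 3x3 structuring element:
    dilation (local max) minus erosion (local min), computed by
    stacking the nine shifted views of an edge-padded image."""
    p = np.pad(img, 1, mode='edge').astype(np.float64)
    views = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return views.max(axis=0) - views.min(axis=0)
```

Thresholding the gradient would then yield the edge mask that selects which fusion rule applies in the contourlet coefficient domain.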
2D-3D Medical Image Registration Based on GPU
DANG Jian-wu,HANG Li-hua,WANG Yang-ping and DU Xiao-gang
Computer Science. 2013, 40 (4): 306-309. 
Abstract PDF(815KB) ( 781 )   
References | RelatedCitation | Metrics
In 2D-3D medical image registration,DRR generation and the computation of the similarity measure between the DRR and the X-ray image are the most important and most time-consuming registration procedures.To address the large amount of calculation in the registration procedure,this paper combined pattern intensity with gradient information to simplify the calculation,used GPU multithreaded parallel computing to generate the DRR and compute the similarity on the GPU,and introduced gradient descent and multi-resolution strategies in the registration optimization procedure to complete the registration process.Compared with other similarity measures and with registration on the CPU,this registration method maintains registration precision while improving registration speed.
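A simplified, single-scale CPU sketch of the pattern-intensity measure: for the difference image d between the DRR and the X-ray, it sums sigma^2 / (sigma^2 + (d(x) - d(y))^2) over a 4-neighbourhood, so identical images give the maximum score. The value of `sigma` and the use of wrap-around `np.roll` neighbours are simplifying assumptions, not the paper's exact formulation:

```python
import numpy as np

def pattern_intensity(drr, xray, sigma=10.0):
    """Pattern-intensity similarity: high when the difference image is
    locally flat.  Each pixel contributes one term per 4-neighbour;
    np.roll wraps at the borders, which a production version would
    mask out."""
    d = drr.astype(np.float64) - xray.astype(np.float64)
    score = 0.0
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        diff = d - np.roll(d, shift, axis=axis)
        score += np.sum(sigma**2 / (sigma**2 + diff**2))
    return score
```

For an H x W image the maximum is 4*H*W, attained when the two images differ only by a constant offset.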
Study on Image Segmentation Complexity Measure
WEI Qing,LU Zhao-gan and SHAO Chao
Computer Science. 2013, 40 (4): 310-313. 
Abstract PDF(810KB) ( 313 )   
References | RelatedCitation | Metrics
The basic foundation of image processing and analysis technology is identifying the content of an image,namely its semantic objects.Existing image segmentation algorithms delimit regions only according to color and texture,and therefore carry no semantic information.Aiming at this issue,the necessity of the image segmentation problem was analyzed,the definition of a new image segmentation necessity evaluation index was proposed,and the judgment procedure and realization method based on this index were provided.The experimental results show that,compared with existing image segmentation methods,the proposed image segmentation necessity index adapts well to target regions of different sizes,and it can serve as an important measure of,and effective method for judging,the necessity of image segmentation.
Optimization Technique of 2D Texture Synthesis
DU Chang-qing and QIAN Wen-hua
Computer Science. 2013, 40 (4): 314-323. 
Abstract PDF(490KB) ( 394 )   
References | RelatedCitation | Metrics
The paper proposed a framework for exemplar-based texture synthesis.The main idea is to reduce synthesis time and improve synthesis quality through several optimization methods.During the synthesis process,optimizations such as searching along a spiral path,edge expansion and simplified L2 distance computation accelerate synthesis and improve the final results.Experiments show that the synthesis results are satisfactory and the method is effective.
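The L2-distance simplification mentioned above usually rests on the expansion ||q - p||^2 = ||q||^2 - 2 q.p + ||p||^2: the candidate-patch norms can be precomputed once, leaving a single matrix product per query instead of an explicit difference per candidate. A sketch of this idea (the function name and flattened-patch layout are illustrative, not the paper's interface):

```python
import numpy as np

def best_match(query, patches):
    """Return the index of the candidate patch closest to `query` in
    squared L2 distance, plus all distances.

    `patches` is (n, d) with each row a flattened candidate patch and
    `query` is (d,).  Distances come from the norm expansion, so only
    one matvec and two precomputable norm terms are needed.
    """
    q2 = query @ query
    p2 = np.einsum('nd,nd->n', patches, patches)   # per-patch ||p||^2
    d2 = q2 - 2.0 * (patches @ query) + p2
    return int(np.argmin(d2)), d2
```

In a synthesizer, `p2` would be computed once per exemplar and reused for every query neighbourhood along the spiral scan.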