Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41, Issue 9 (September 2014)
Representation Theory of Probabilistic Graphical Models
LIU Jian-wei, LI Hai-en and LUO Xiong-lin
Computer Science. 2014, 41 (9): 1-17.  doi:10.11896/j.issn.1002-137X.2014.09.001
Probabilistic graphical models bring together graph theory and probability theory in a single formalism, so that the joint probability distribution of the variables in a model can be represented using a graph. In recent years, probabilistic graphical models have become a focus of research in uncertainty inference because of their bright prospects for application in artificial intelligence, machine learning, computer vision and other fields. This work introduced the representations of probabilistic graphical models and discussed how to represent a joint probability distribution compactly using the independences in the network. First, models of the conditional probability distribution of a single node and their independences were introduced, including tabular CPDs, deterministic CPDs, context-specific CPDs, CPDs of causal influence, Gaussian models and hybrid models, and a general framework called the exponential family, which encompasses a broad range of distributions, was defined. Then the Bayesian network representation and its independence properties were described in detail, as well as Bayesian networks over Gaussian and exponential-family distributions. This work also introduced the Markov network representation, the independence properties of Markov networks, and Markov networks over Gaussian and exponential-family distributions. We also presented two partially directed models, conditional random fields and chain graph models. In addition, this work discussed template-based representations, including temporal models (dynamic Bayesian networks and state-observation models), directed probabilistic models for object-relational domains (plate models and probabilistic relational models), and undirected representations for object-relational domains. Finally, this work raised some open questions that probabilistic graphical models face and discussed their future development.
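For reference, the two core factorizations underlying these representations can be stated compactly. This is standard textbook notation, not reproduced from the paper:

```latex
% Bayesian network: the joint distribution factorizes over parent sets Pa(X_i)
P(X_1,\dots,X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr)

% Markov network: a Gibbs distribution over clique potentials \phi_c
P(x) = \frac{1}{Z}\prod_{c \in \mathcal{C}} \phi_c(x_c),
\qquad Z = \sum_{x}\prod_{c \in \mathcal{C}} \phi_c(x_c)
```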
Bidirectional Reversible Logic Circuit Synthesis Algorithm Based on Matrix Elementary Transformation
WANG Dong, LIU Qiang and LI Shan-zhi
Computer Science. 2014, 41 (9): 18-23.  doi:10.11896/j.issn.1002-137X.2014.09.002
Quantum reversible logic circuit synthesis is one of the key techniques for constructing a quantum computer. A bidirectional reversible logic circuit synthesis algorithm based on elementary matrix transformations was proposed in this paper. The algorithm can construct a reversible logic circuit for any given permutation by using the circuit transformation rules of the adjacency matrix and by exchanging either the row numbers or the elements of the matrix according to the Hamming distance between two numbers. Compared with other algorithms, the time-space complexity of our algorithm is greatly reduced because it avoids exhaustive search. Moreover, the quantum reversible logic circuits synthesized by our algorithm cover both even and odd permutations, because n-qubit extended general Toffoli gates are used, and the number of quantum gates in the circuit is reduced.
Research on Semantic Validation Methods of Equipment Support Simulation Conceptual Model
GU Chuang, LIU Bin, ZHANG Xing, TIAN Shu-chao and WANG Gui-qi
Computer Science. 2014, 41 (9): 24-27.  doi:10.11896/j.issn.1002-137X.2014.09.003
Conceptual model validation is an important measure to guarantee a model's correctness and credibility. To address the complexity of formal validation methods and the subjectivity of informal validation methods, a conceptual model semantic validation method based on ontology and rule reasoning was proposed using ontology and semantic Web technology. The method contains four steps. First, conceptual models described in UML are translated into models described in OWL. Second, validation rules are constructed from domain knowledge and described in the Semantic Web Rule Language. Third, the model and rules are transformed into the data format recognized by the Jess rule engine. Fourth, the model is checked against the rules to determine whether it satisfies them. Finally, an example was used to show that the method reduces the complexity of formal validation methods, decreases the subjectivity and uncertainty of informal validation methods, and improves the efficiency of conceptual model validation by replacing domain experts with validation rules and a semantic reasoner that validates the semantic content of conceptual models automatically.
Unified Vectorization Framework for SIMD Extensions
LIU Peng, ZHAO Rong-cai, ZHAO Bo and GAO Wei
Computer Science. 2014, 41 (9): 28-31.  doi:10.11896/j.issn.1002-137X.2014.09.004
With the popularization of multimedia applications and the demand for high-performance computing, more and more processors are integrated with SIMD extensions. A virtual SIMD instruction set was designed for generating efficient SIMD code for different SIMD extension units, and a unified framework targeting heterogeneous SIMD extension units was built. An input program can be transformed into an intermediate representation of virtual SIMD instructions and then translated into the SIMD instruction set of a specific SIMD extension through vector-length and instruction-set devirtualization. Experimental results on SW1600, DSP and Alpha show that efficient SIMD code for all three platforms is generated automatically by this framework, and that the DSP achieves a better speedup than the other two platforms.
Whole System Realtime Power Profiling and Modeling
YANG Liang-huai and ZHU Hong-yan
Computer Science. 2014, 41 (9): 32-37.  doi:10.11896/j.issn.1002-137X.2014.09.005
Power profiling and modeling are fundamental to power-aware DBMSs. This paper proposed a component-level power modeling method that exploits the utilizations of the main components (CPU and disk) as indicators of system power. To cope with multi-core architectures, we investigated the individual cores instead of the monolithic CPU, which may cover up essential details, and we fitted the relationship between utilization and power using multiple linear regression. Experimental results show that the average relative error of the power model is less than 12%, and that the model is lightweight, without incurring much additional cost on other applications.
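As an illustration of the regression step, a minimal Python sketch follows. The utilization columns, sample values and fitted coefficients are hypothetical, not the paper's measurements:

```python
import numpy as np

# Hypothetical utilization samples: one column per core plus disk utilization,
# paired with measured whole-system power (watts).
U = np.array([
    [0.10, 0.05, 0.02],   # [core0_util, core1_util, disk_util]
    [0.55, 0.40, 0.10],
    [0.90, 0.85, 0.30],
    [0.30, 0.70, 0.60],
])
P = np.array([48.0, 71.0, 95.0, 78.0])  # measured power for each sample

# Fit P ~ c0 + c1*core0 + c2*core1 + c3*disk with ordinary least squares.
X = np.column_stack([np.ones(len(U)), U])
coef, *_ = np.linalg.lstsq(X, P, rcond=None)

def predict_power(core0, core1, disk):
    """Estimate system power from component utilizations."""
    return coef @ np.array([1.0, core0, core1, disk])

print(coef)                       # fitted model coefficients
print(predict_power(0.5, 0.5, 0.2))
```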
POP-PHP:Online Integrated Development Environment for PHP Applications
YANG Nan, WU Ling and WANG Qian-xiang
Computer Science. 2014, 41 (9): 38-44.  doi:10.11896/j.issn.1002-137X.2014.09.006
With the development of cloud computing, developing programs anytime and anywhere has become a new vision for many people. As a result, online integrated development environments have received wide attention from software developers. POP-PHP (Peking University Online Programming - PHP version) is an online integrated development environment for PHP applications; it provides the basic functions of an integrated development environment and good support for a large number of simultaneous users. It provides a lightweight debugging method, uses service composition to achieve more complete syntax checking, and implements programming-behavior playback for monitoring.
Introduction of the i* Framework Auto-modeling and Editing Software Tool “Lerisk”
LI Tian-ying, LIU Lin, KOU Xiao-xi and ZHAO De-wang
Computer Science. 2014, 41 (9): 45-51.  doi:10.11896/j.issn.1002-137X.2014.09.007
In the field of requirements engineering, modeling based on the i* framework, including the strategic dependency model and the strategic rationale model, has become one of the most commonly used tools in early requirements modeling and analysis. At the same time, many research efforts have focused on developing editing tools for the i* modeling framework. However, the existing tools often provide only basic functions, such as model diagram editing and storage, which do not satisfy our requirements engineering group's need to visualize the modeling results of requirements text while also providing editing, storage and automatic layout. Therefore, a new tool based on the i* framework is needed. First, this paper surveyed well-known modeling tools and analyzed their basic functions as well as their shortcomings. On this basis, we proposed and designed the supplementary functions of our new tool, introduced the layout problems of the i* framework, and described the implementation of its automatic layout algorithm in detail. Finally, we presented the detailed design of the visualization tool. Based on the proposed tool, we carried out experiments on requirements-text modeling and the model editing functions, and compared them with the corresponding functions of well-known modeling tools to demonstrate the features of our tool.
Code Function Mining Tool Based on Topic Modeling Technology
HUA Zhe-bang, LI Meng, ZHAO Jun-feng, ZOU Yan-zhen, XIE Bing and LI Yang
Computer Science. 2014, 41 (9): 52-59.  doi:10.11896/j.issn.1002-137X.2014.09.008
Code reuse is an important form of software reuse. Software engineers need to understand code functions before they reuse the code. A Code Function Mining (CFM) method based on static code analysis and LDA was proposed. CFM is a code-oriented method for mining, filtering, organizing and describing topics. The output of CFM is a hierarchy of functional topics with descriptions. The topic descriptions help engineers learn code functions, and the hierarchy helps them understand code functions at different abstraction levels. CFM can be used as a supplement to traditional methods based on topic modeling, making up for their lack of semantic analysis of topics. CFM includes four parts: topic mining, topic filtering, topic organizing and topic describing. A CFM tool based on this method can automatically analyze code and show the functional topic hierarchy to users through a Web page. To verify the validity of the CFM method, an experimental analysis of several of its key algorithms was also presented.
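A minimal sketch of the LDA step in gensim terms follows. It is our illustration only: the identifier "documents", corpus and parameters are invented, and the CFM tool's actual filtering and organizing pipeline is not shown:

```python
# Identifiers extracted from code are treated as documents,
# and LDA infers functional topics over them.
from gensim import corpora, models

code_docs = [
    ["parse", "xml", "node", "attribute", "read"],
    ["socket", "connect", "send", "receive", "close"],
    ["xml", "element", "parse", "write", "file"],
    ["thread", "socket", "listen", "accept", "connect"],
]
dictionary = corpora.Dictionary(code_docs)
corpus = [dictionary.doc2bow(doc) for doc in code_docs]
lda = models.LdaModel(corpus, id2word=dictionary, num_topics=2,
                      random_state=0, passes=10)
for topic_id, words in lda.print_topics():
    print(topic_id, words)        # each topic ~ one candidate code function
```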
ConUp:SCA Middleware with Dynamic Component Updating Support
REN Guo-chao, WANG Jiang and MA Xiao-xing
Computer Science. 2014, 41 (9): 60-62.  doi:10.11896/j.issn.1002-137X.2014.09.009
Middleware systems provide essential support for modern business applications. However, the development of the Internet makes application environments open and dynamic, which requires applications to be dynamically adaptable. Mainstream middleware systems support hot deployment only, and cannot ensure system consistency during and after updates. ConUp is an SCA middleware that supports safe and efficient dynamic component updating. This tool demo exhibits the process of dynamic component updating with ConUp, the capability to use different algorithms, and ConUp's advantages in ensuring system consistency, update timeliness, and low disruption to application execution.
Large Scale Performance Test Service Platform Based on Cloud
CHEN Tie-nan, TANG Zhen, WANG Xiao-ran, REN Kai and ZHI Meng-xuan
Computer Science. 2014, 41 (9): 63-66.  doi:10.11896/j.issn.1002-137X.2014.09.010
In software engineering, performance testing is, in general, testing performed to determine how a system performs in terms of responsiveness and stability under a particular workload. The objects of performance testing fall into two groups: benchmark and non-benchmark. As performance testing needs significant investment in software resources, hardware resources and the corresponding management and maintenance, traditional performance testing resorts to small-scale simulation. But this kind of small-scale simulation causes inadequate testing and brings serious consequences. Based on cloud computing, with its promise of on-demand, low-cost software and hardware resources, we built a large-scale performance test service platform to provide on-demand, customized test services.
Services Network Prototype System
FENG Zhi-yong, CHEN Shi-zhan, WANG Hui and LIANG Qi-xuan
Computer Science. 2014, 41 (9): 67-70.  doi:10.11896/j.issn.1002-137X.2014.09.011
Web services technology is the best practice of Software as a Service (SaaS), and it is also key to solving inter-organizational business integration. However, Web services are scattered across the Internet and are not easily organized and managed. A Service Network (SN) is a management platform for Web service organization, management and utilization throughout the life cycle of Web services. The main idea is to take advantage of semantic Web and social networking technology to explicitly identify and mine the semantic associations and interactions between Web services (referred to as service relations, SR), so as to organize the Web services available on the Internet into a service ecosystem. The SN prototype system consists of several modules, such as service collection, service annotation, service relationship mining, service discovery and service composition. By organizing available Web services into an SN with rich semantic information, business context and interactions, we firmly believe it will be a new infrastructure for service-oriented computing, and it will also help improve the degree of automation and efficiency of Web service composition to meet more complex requirements.
Program Slicing-guided Test Case Generation System
WANG Zhi-wen, HUANG Xiao-long, WANG Hai-jun, LIU Ting and YU Le-chen
Computer Science. 2014, 41 (9): 71-74.  doi:10.11896/j.issn.1002-137X.2014.09.012
A program slicing-guided test case generation system was introduced in this paper, which can generate a set of test cases covering all program behaviors without scanning all paths of the program. It consists of three modules: static analysis, dynamic symbolic execution and test case generation. In the static analysis module, the control flow and information flow of the input program are analyzed to extract control dependences and data dependences, and potential dependences are also computed. The dynamic symbolic execution module is applied to solve the constraints, generate test cases and monitor the execution path. In the test case generation module, the covered and uncovered program slices of the executed test cases are computed to guide the generation of new test cases. In the experiments, the test case set generated by our system covered all program behaviors and significantly reduced the number of test cases.
Module Based Big Data Analysis Platform
ZHAO Wei, LIU Jie and YE Dan
Computer Science. 2014, 41 (9): 75-79.  doi:10.11896/j.issn.1002-137X.2014.09.013
As data sizes expand, data analysis tools that run only on stand-alone computers are no longer sufficient. We designed and implemented a module-based big data analysis platform named Haflow to solve this problem. In Haflow, we defined an analysis business model and an extensible module interface, which support the integration of heterogeneous tools. Users submit their analysis flows; the system interprets them and then commits them to Hadoop. Haflow is an extensible, distributed, service-oriented big data platform that supports heterogeneous tools. The goal of the platform is twofold. First, it encapsulates the boilerplate jobs that have nothing to do with the business itself, improving the development speed of analysis applications. Second, by submitting jobs to Hadoop, which can execute jobs concurrently, the platform decreases the mean execution time of analysis applications.
MFIE:A Web-based Multi-faceted Interactive Feature Localization Tool
PENG Xin, WANG Jin-shui, FU Kun and ZHAO Wen-yun
Computer Science. 2014, 41 (9): 80-83.  doi:10.11896/j.issn.1002-137X.2014.09.014
In performing software maintenance tasks, developers often need to find and understand the program elements (e.g. classes or methods) that are relevant to a given feature (called feature location or concept location). Empirical studies have shown that feature location is a human-centric and information-intensive process involving interactive exploration, feedback and strategy adjustment. Based on this idea, we proposed a multi-faceted, interactive feature location approach and developed a Web-based supporting tool, MFIE (Multi-Faceted Interactive Explorer). This paper introduced the feature location approach supported by MFIE, its multi-faceted interface design and other characteristics. Furthermore, the paper described the usage of MFIE with a feature location use case.
PSC2GS:A Tool for Generating Monitors Based on Property Sequence Charts
YU Jun, ZHANG Peng-cheng, ZHOU Yu-peng and LIU Zong-lei
Computer Science. 2014, 41 (9): 84-87.  doi:10.11896/j.issn.1002-137X.2014.09.015
In open and dynamic environments, unsafe run-time changes of the system or its environment may compromise the correct execution of the entire system and may eventually lead to software failures. Run-time verification techniques based on monitors have become the basic means of detecting software failures in open environments. We propose an automated approach to generate monitors from property sequence charts based on game theory. The monitors are interpreted in a multi-valued semantics: satisfied, infinitely controllable, system finitely controllable, system urgently controllable, environment finitely controllable, environment urgently controllable, and violated. The monitors can provide enough information to help the system take measures for failure prediction or prevention. This paper described a novel tool called ‘PSC2GS’, which is able to design property sequence charts, generate a game structure from property sequence charts, and generate aspect-oriented programming code (the monitor) from the game structure. The tool chain provides a completely graphical front-end, so software designers do not have to deal with any particular textual or logical formalism.
Tool for Evaluating Effectiveness of Fault Location Techniques Based on Automated Software Repair: GenProg-FL
JI Tao, QI Yu-hua and MAO Xiao-guang
Computer Science. 2014, 41 (9): 88-90.  doi:10.11896/j.issn.1002-137X.2014.09.016
Although fault localization and automated program repair techniques have developed in recent years, programmers still need to spend a lot of time and effort on repair. Most developers still work with traditional debugging techniques such as breakpoints, and the research results of fault localization have not been applied in practical repair work. In recent years, automated program repair has received attention and developed quickly. In automated software repair, using a fault localization technique to locate the bugs is a necessary activity, and the accuracy of localization affects the generation of patches, which has a great impact on the effect of repair. The GenProg-FL tool can use different fault localization techniques to repair programs. GenProg-FL can also be used to evaluate the effectiveness of fault localization techniques.
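For context, spectrum-based fault localization, the family of techniques such a tool can plug in, ranks statements by how strongly their execution correlates with failing tests. A minimal sketch with the common Ochiai metric, on toy coverage data not taken from the paper:

```python
import math

def ochiai(failed_cov, passed_cov, total_failed):
    """Ochiai suspiciousness for one statement.

    failed_cov: number of failing tests that execute the statement
    passed_cov: number of passing tests that execute the statement
    total_failed: total number of failing tests
    """
    denom = math.sqrt(total_failed * (failed_cov + passed_cov))
    return failed_cov / denom if denom else 0.0

# Toy coverage spectra: statement id -> (covered by #failing, #passing)
spectra = {"s1": (2, 8), "s2": (2, 1), "s3": (0, 9)}
total_failed = 2

ranking = sorted(spectra, key=lambda s: ochiai(*spectra[s], total_failed),
                 reverse=True)
print(ranking)   # statements ordered by suspiciousness: ['s2', 's1', 's3']
```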
Design and Implementation of Natural Language Based Software Information Retrieval Tool
YE Ting, CHEN Xiu-zhao, ZOU Yan-zhen, ZHAO Jun-feng and XIE Bing
Computer Science. 2014, 41 (9): 91-95.  doi:10.11896/j.issn.1002-137X.2014.09.017
Open source software projects have become an important resource for software reuse. When the source code and code-related documents in a project are very large, it is time-consuming for end users to find the right information (code or document) segments. We proposed a natural-language-based software information retrieval approach, which helps developers get the software information they need more quickly and conveniently. Based on this approach, we designed and implemented a natural-language-based software information search engine (NaLSiSe). We took Lucene as an example to illustrate how NaLSiSe automatically analyzes and organizes the related software information and constructs natural language queries so as to improve precision.
User-steered Application Building Environment for Situational Data Integration
WANG Gui-ling, CAO Bo, ZHANG Sai, GENG Mei-zhen and ZHANG Feng
Computer Science. 2014, 41 (9): 96-100.  doi:10.11896/j.issn.1002-137X.2014.09.018
As the Internet is used ever more widely and deeply, it has become a vast repository of online information resources. It is therefore essential to enable end users to develop situational data integration applications by themselves, and thus to profit from the potential value of online information resources. This paper presented a user-steered situational data integration environment called the data service space (DSS), which supports the most commonly used network resources, implements interactive, personalized wrapping of HTML Web pages, combines the spreadsheet programming style with a nested relational model, provides a set of visual nested-table operators and a formula language, and offers the agility and expressive power required to support situational data integration by non-professional users. A use case and an analysis of related work reveal the potential of DSS in situational data integration.
High-speed Nested CRC Code Generation Method and Implementation
DUAN Bin-bin, SUN Song-song, JIAO Li and ZHOU Wen-li
Computer Science. 2014, 41 (9): 101-103.  doi:10.11896/j.issn.1002-137X.2014.09.019
To achieve data error control in high-speed converged network data transmission, a nested CRC code generation method was proposed to improve on the situation that it is difficult to further increase computing speed with currently available cyclic redundancy check (CRC) code calculation techniques. It was implemented on a Xilinx Field Programmable Gate Array (FPGA) chip. The nested CRC code is obtained by calculating the traditional CRC code concurrently in multiple channels; the speed of error-control code generation is thus greatly increased while multiple types of data flows are processed by different kinds of calculating channels. Finally, the computing performance and error control capability were analyzed, and guidance was given on setting the nesting level, the number of computing channels and the parallel computing width of a single channel.
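To make the multi-channel idea concrete, here is a software sketch: a reference bit-serial CRC-32 plus an illustrative striping scheme in which per-channel CRCs are computed independently (hence parallelizable in hardware) and then combined. This is our own toy construction, not the paper's FPGA design:

```python
def crc32_bitwise(data: bytes, poly: int = 0xEDB88320) -> int:
    """Reference bit-serial CRC-32 (reflected polynomial)."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (poly if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

def nested_crc(data: bytes, channels: int = 4):
    """Illustrative 'nested' scheme: stripe the message across several
    channels, CRC each stripe independently, then CRC the concatenated
    per-channel checksums."""
    stripes = [data[i::channels] for i in range(channels)]
    inner = [crc32_bitwise(s) for s in stripes]
    outer = crc32_bitwise(b"".join(c.to_bytes(4, "big") for c in inner))
    return inner, outer

msg = b"high-speed converged network frame"
print(nested_crc(msg))
```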
Optimization of Cluster Resource Fuzzy Clustering Partition Algorithm for Cloud Computing
DONG Shi-long, CHEN Ning-jiang, TAN Ying, HE Zi-long and ZHU Li-rong
Computer Science. 2014, 41 (9): 104-109.  doi:10.11896/j.issn.1002-137X.2014.09.020
The classic serial fuzzy clustering algorithm suffers from heavy computation and low efficiency when dealing with high-dimensional matrix operations, so it cannot be applied effectively to the fuzzy clustering partition model in a cloud computing environment, and it is hard pressed to meet the time-efficiency requirements of resource scheduling. Therefore, the transitive closure method of the equivalence-relation-based fuzzy clustering algorithm was optimized. Furthermore, a multi-threaded concurrent fuzzy clustering algorithm for cloud resources was applied to improvement strategies for the Hadoop scheduler. The experimental results indicate that the optimization strategy reduces the computation needed to solve the squaring-based fuzzy equivalent matrix problem. Moreover, the concurrent algorithm effectively removes the computation bottleneck of resource clustering on small and medium-sized clusters and achieves a good speedup. For the heterogeneity problem in existing Hadoop schedulers, theoretical analyses of the concurrent optimization algorithm show that it can help solve the scheduling problems caused by heterogeneity.
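The "squaring" transitive-closure step being optimized can be sketched as repeated max-min composition of the fuzzy similarity matrix. This is a generic serial illustration, not the paper's optimized or multi-threaded version:

```python
import numpy as np

def maxmin_compose(r):
    """Max-min composition R o R of a fuzzy relation matrix."""
    return np.max(np.minimum(r[:, :, None], r[None, :, :]), axis=1)

def transitive_closure(r, eps=1e-12):
    """Repeatedly square the fuzzy similarity matrix until it stabilizes,
    yielding a fuzzy equivalence relation (the 'squaring' method)."""
    while True:
        r2 = maxmin_compose(r)
        if np.max(np.abs(r2 - r)) < eps:
            return r2
        r = r2

# Toy fuzzy similarity matrix for 4 cluster resources (symmetric, reflexive).
R = np.array([
    [1.0, 0.8, 0.4, 0.5],
    [0.8, 1.0, 0.4, 0.5],
    [0.4, 0.4, 1.0, 0.4],
    [0.5, 0.5, 0.4, 1.0],
])
T = transitive_closure(R)
print((T >= 0.5).astype(int))   # lambda-cut at 0.5 gives the cluster partition
```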
Application of Ternary Spread Code in Spread Spectrum Communication
WANG Hui and WU Cheng-mao
Computer Science. 2014, 41 (9): 110-114.  doi:10.11896/j.issn.1002-137X.2014.09.021
In order to produce a set of signals with statistical characteristics approximating white Gaussian noise, and to use them as spreading codes to reduce the error rate and improve the reliability of spread spectrum communication systems, a novel spreading code based on a ternary random sequence was presented. It takes a random dynamic element as the initial value, and a multi-round reconstruction method is designed to pretreat the background before methods such as periodic orbit changing, control-space mapping and constraint judgment are used to realize the discrete trajectory transform; the sequence is then produced by uniform mapping. Under different SNRs and sinusoidal interference, the BER of the ternary spreading sequence used as the spreading code was simulated and analyzed with Monte Carlo methods, alongside the m-sequence, the piecewise logistic chaotic sequence and the linear congruential pseudorandom sequence. Experimental results demonstrate that the ternary spreading sequence has a lower BER and keeps the system stable under low signal-to-noise ratio, severe channel attenuation and strong interference. It can improve anti-interception and anti-jamming ability, and thus guarantee reliability in spread spectrum communication systems.
Friends Prediction Based on Fusion of Topology and Location in LBSN
PAN Guo and XU Yu-ming
Computer Science. 2014, 41 (9): 115-118.  doi:10.11896/j.issn.1002-137X.2014.09.022
In location-based social networks, friend prediction usually predicts friends with some similarity metric and recommends the most similar users to a given user. Traditional feature selection methods do not consider the differences between features, so they cannot represent the overall features of users. This paper proposed a friend prediction method based on the fusion of location information and social topology. First, we used information gain to select three relative features that can represent the overall user profile; we then fused the selected features and finally predicted friends with a classification method. The experiments show that the proposed method does not depend on the concrete classification method, and performs better than the multi-layer friend model.
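The information-gain step can be sketched directly from its definition IG(Y;X) = H(Y) - H(Y|X). The feature name and data below are invented for illustration:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X) for a discrete feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# Toy data: one discretized feature (e.g. binned co-location count)
# against the friend/non-friend label.
feature = ["high", "high", "low", "low", "high"]
friend  = [1, 1, 0, 0, 1]
print(information_gain(feature, friend))  # ~0.97: feature fully splits labels
```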
Handoff Mechanism Based on Context-aware and Mobile Location Prediction
WANG Yu-xiang, WANG Jin and XIE Sheng-dong
Computer Science. 2014, 41 (9): 119-124.  doi:10.11896/j.issn.1002-137X.2014.09.023
The handoff mechanism is a key technology for providing users with seamless mobile application services in wireless communication systems, and it has a very important impact on user-oriented service experience and applications. In this paper, a handoff mechanism based on context awareness and mobile location prediction was proposed, which uses a modified Dempster-Shafer theory together with a variety of context information to predict the location of mobile users. The proposed method can not only efficiently avoid "ping-pong" handoff, but also shorten handoff delay, provide a basis for fast, accurate and reliable decisions, and reserve resources in advance. Thus, it can significantly improve the QoS (Quality of Service) of the network. Theoretical analysis and simulation results show that the predicted location is closer to the user's real trajectory and the RMSE of position is smaller, which means the success rate of handoff is greatly improved.
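The evidence-fusion core of Dempster-Shafer theory is Dempster's rule of combination; a minimal sketch follows. The context sources and mass values are hypothetical, and the paper's modification of the rule is not reproduced:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions over frozenset
    focal elements; returns the normalized combined mass function."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

# Toy evidence about which cell the user moves to next, from two context
# sources (e.g. trajectory history and signal strength); names illustrative.
A, B = frozenset({"cellA"}), frozenset({"cellB"})
AB = A | B
m_traj   = {A: 0.6, B: 0.1, AB: 0.3}
m_signal = {A: 0.5, B: 0.2, AB: 0.3}
print(dempster_combine(m_traj, m_signal))   # belief concentrates on cellA
```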
Fast Algorithm for Detecting Multi-scale Overlapping Community Structure Based on Information Spreading
LI Hui-jia
Computer Science. 2014, 41 (9): 125-131.  doi:10.11896/j.issn.1002-137X.2014.09.024
Most existing community detection methods require complete graph information and are thus impractical for large-scale technological and social networks. This paper proposed a novel algorithm for the fast detection of multi-scale overlapping community structure. Instead of embracing a universal approach, it focuses on local ties and models multi-scale interactions in these networks. It identifies leaders, and the modules around these leaders, using local information. It naturally supports overlap by associating each node with a membership vector that describes its involvement in each community. Our method for the first time optimizes the topological entropy of a network and uncovers communities through a novel dynamic system converging to a local minimum, simply by updating the membership vectors with very low computational complexity. Both theoretical analysis and experiments show that the algorithm can detect communities in social and biological networks quickly and accurately.
Energy Consumption in Ad hoc Networks Based on Quantum-behaved Particle Swarm Optimization with Elitist Learning
ZHANG Feng, JIA Zhi-ping, CAI Xiao-jun and ZHANG Lan-hua
Computer Science. 2014, 41 (9): 132-136.  doi:10.11896/j.issn.1002-137X.2014.09.025
In Ad hoc networks, with the emergence of multicast applications, how to construct a multicast tree with minimum energy consumption is an important problem. Considering the effect that different choices of relay nodes have on the construction of the minimum-energy multicast tree, a quantum-behaved particle swarm optimization algorithm with elitist learning (QPELSO) was proposed to optimize the construction of the minimum-energy multicast tree. In order to avoid the premature convergence of particle swarm optimization, the method applies a dynamic-approximation search strategy to the elitist particles to keep them out of local optima and provide good guidance for the population. When the algorithm is found to be in a dead state according to the premature-convergence judgment mechanism, a variable-scale chaotic perturbation is used to explore a wide range and keep the balance between exploration and exploitation. The results of simulation experiments show that the modified discrete particle swarm optimization algorithm has strong optimization ability and can effectively optimize the construction of the minimum-energy multicast tree.
Research on Routing Algorithm of Semantic-based Publish/Subscribe System over P2P Networks
ZHANG Qiang, LI Jian-hua and SHEN Di
Computer Science. 2014, 41 (9): 137-140.  doi:10.11896/j.issn.1002-137X.2014.09.026
Constructing semantic publish/subscribe systems on structured P2P networks has been a research focus in recent years. The paper proposed a new semantic event routing algorithm based on Chord, which uses a routing strategy based on rendezvous nodes. First, the mapping from subscriptions to event-broker rendezvous nodes is achieved using a semantics-preserving hash function. Second, the algorithm publishes events only to the subscription rendezvous nodes that may match, based on the semantic information of subscriptions and events, and delivers notifications using the subscription spanning tree over the Chord routing protocol. Finally, load balancing is realized by transferring subscriptions away from overloaded rendezvous nodes. Simulation shows that, to some extent, the algorithm reduces resource consumption, improves efficiency, and achieves load balance.
Context-aware Mobile P2P Social Network Multicast Routing Algorithm
CAO Huai-hu, ZHANG Yan-mei, ZHU Jian-ming and GUO Shu-hang
Computer Science. 2014, 41 (9): 141-145.  doi:10.11896/j.issn.1002-137X.2014.09.027
Mobile social network multicast communication is one of the hot issues that researchers have focused on in recent years. Due to the dynamic changes and social features of nodes, traditional multicast routing algorithms cannot be directly applied to mobile social networks. According to the environmental characteristics of mobile social networks, a multicast model for mobile social networks was established. Using environment-awareness information, a context-aware mobile P2P social network multicast routing (CMR) algorithm was proposed by combining a minimum spanning tree with a grid multicast algorithm. Theoretical analysis and simulation tests of the algorithm show that the CMR algorithm improves data transmission performance and has high scalability and robustness.
Study on User Permissions Management Based on Attribute for Cloud Environment
LI Shuan-bao, FAN Nai-ying, FU Jian-ming, QI Hui-min and LIU Qian
Computer Science. 2014, 41 (9): 146-151.  doi:10.11896/j.issn.1002-137X.2014.09.028
User permission assignment is one of the important challenges of cloud computing services. We proposed an attribute-based user permission management scheme. The scheme takes the key distribution of new users in cloud services as its object of study and discusses a multi-party collaborative signature verification and decryption management mechanism. Data owners and the authority jointly decide on the attribute set, and the data owner defines a ciphertext access structure based on the attribute set, so that only authorized users who have been certified can get the decryption key, and user permissions can be upgraded and downgraded synchronously. In addition, we designed an update-centric CP-ABE group signature verification and decryption mechanism for group attribute sets, in which data owners, users and the authority constitute the group. Users can sign messages, with public verifiability, by combining group and individual attributes, so that fine-grained access control of ciphertext data is protected. Finally, the validity and unforgeability of the signature are proved.
Efficient Access Control Scheme Combining CP-ABE and SD in Cloud Computing
CHEN Yan-li, SONG Ling-ling and YANG Geng
Computer Science. 2014, 41 (9): 152-157.  doi:10.11896/j.issn.1002-137X.2014.09.029
The privacy and secure access of sensitive data stored on cloud servers are important topics in cloud computing security research. A secure, effective, fine-grained access control scheme for cloud computing was proposed. The ciphertext encryption employs CP-ABE with a linear secret-sharing matrix, and most of the re-encryption work is transferred to the cloud service provider, so the scheme reduces the data owner's computational cost without compromising security. When user attribute revocation occurs, the scheme introduces SD broadcast encryption technology, effectively reducing the computation and communication overheads. The analysis shows that the scheme provides data confidentiality, collusion resistance, and backward and forward secrecy. Finally, the experimental results validate the high revocation efficiency of the scheme.
Substitution-Permutation Network Structured Image Encryption Algorithm Based on Chaotic Map
CAI Jun, CHEN Xin and XIANG Xu-dong
Computer Science. 2014, 41 (9): 158-164.  doi:10.11896/j.issn.1002-137X.2014.09.030
Various image encryption algorithms based on the permutation-diffusion structure have been proposed in the past few years. However, permutation and diffusion are treated as two separate stages, making such image encryption vulnerable to attack. Moreover, these algorithms in general have low diffusion efficiency and high computational complexity in key scheduling. To solve these problems, this paper proposed an SP (Substitution-Permutation) network image encryption algorithm based on a chaotic map. The proposed algorithm adopts the following three strategies: (1) to further enhance the security of the cryptosystem, XOR and S-box operations are introduced at the beginning of each encryption round; (2) a bidirectional permutation-diffusion strategy is proposed to accelerate the spreading process; (3) simple circular bit shifts and XOR operations are used to improve the efficiency of key scheduling. We conducted a rich set of cryptanalyses on the proposed algorithm, e.g., key space analysis, key sensitivity analysis, various statistical analyses and differential analysis. Analytical results demonstrate that the proposed algorithm reaches the security level of previously proposed counterparts in merely three iterations with high efficiency, and is thus applicable for secure image encryption.
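A toy round in this spirit, logistic-map keystream for XOR diffusion plus a keyed pixel permutation, can be sketched as follows. This is our simplified illustration; the paper's S-box, bidirectional diffusion and key schedule are not reproduced:

```python
import numpy as np

def logistic_keystream(x0, mu, n):
    """Keystream bytes from the logistic map x <- mu*x*(1-x)."""
    xs, x = np.empty(n), x0
    for i in range(n):
        x = mu * x * (1.0 - x)
        xs[i] = x
    return (xs * 256).astype(np.uint8)

def encrypt_round(img, key):
    """One toy round: XOR diffusion with a chaotic keystream, then a
    keyed pixel permutation (the 'P' layer)."""
    flat = img.flatten()
    ks = logistic_keystream(key["x0"], key["mu"], flat.size)
    diffused = flat ^ ks
    perm = np.argsort(logistic_keystream(key["x0"] / 2, key["mu"], flat.size),
                      kind="stable")          # keyed permutation order
    return diffused[perm].reshape(img.shape), perm

key = {"x0": 0.3456, "mu": 3.99}              # chaotic parameters as the key
img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
cipher, perm = encrypt_round(img, key)

# Decryption inverts the permutation, then XORs the same keystream.
inv = np.empty_like(perm)
inv[perm] = np.arange(perm.size)
plain = (cipher.flatten()[inv]
         ^ logistic_keystream(key["x0"], key["mu"], img.size)).reshape(4, 4)
assert (plain == img).all()
```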
Non-trapdoors Lattice Signature Scheme with Message Recovery
ZHANG Xiang-song and LIU Zhen-hua
Computer Science. 2014, 41 (9): 165-168.  doi:10.11896/j.issn.1002-137X.2014.09.031
Based on Lyubashevsky's rejection sampling approach (without trapdoors), a lattice-based signature scheme with message recovery was proposed. The scheme can be regarded as a lattice-based cryptographic version of the Abe-Okamoto signature with message recovery. In the random oracle model, using the General Forking Lemma, we proved that the new scheme's existential unforgeability under chosen message attacks relies on the Small Integer Solution hardness assumption. The proposed scheme does not use Gaussian pre-image sampling for signing, requires just simple matrix-vector multiplications, and has a short message-signature size.
Network Risk Analysis Method Based on Node-Game Vulnerability Attack Graph
ZHANG Jian, WANG Jin-dong, ZHANG Heng-wei and WANG Na
Computer Science. 2014, 41 (9): 169-173.  doi:10.11896/j.issn.1002-137X.2014.09.032
Because current risk analysis methods lack consideration of the mutual constraints between attack and defense, this paper introduced game theory into the node analysis process, and a risk analysis model based on a node-game vulnerability attack graph was proposed. On this basis, a vulnerability risk analysis algorithm based on connection matrices was proposed. The algorithm builds the connection matrices of the attack graph and evaluates the overall risk based on an analysis of the inherent risk and the propagated risk of information system vulnerabilities. The evaluation result can help the manager identify critical vulnerabilities. An example analysis demonstrates the effectiveness of the model and algorithm.
Random Network Coding Based on Anti-eavesdropping and Byzantine Adversaries
ZHAO Jia-jia and REN Ping-an
Computer Science. 2014, 41 (9): 174-177.  doi:10.11896/j.issn.1002-137X.2014.09.033
Random network coding has been widely applied in real network environments and can effectively improve network throughput. However, network coding is vulnerable to eavesdropping and Byzantine attacks. This paper presented a random network coding algorithm that can prevent eavesdropping adversaries and effectively detect Byzantine modification. A secret channel is shared between the source and the destination, and the algorithm adds a small number of computations and redundancies to resist eavesdropping and Byzantine attacks through a hash function. Relative to standard random network coding, the algorithm only changes the encoding method of the source node and the decoding method of the destination node; the intermediate nodes remain unchanged. It therefore has high coding efficiency.
Improved Flow-insensitive Type Qualifier Inference
LI Hui-song, XU Zhi-wu and CHEN Hai-ming
Computer Science. 2014, 41 (9): 178-184.  doi:10.11896/j.issn.1002-137X.2014.09.034
Type qualifiers can refine standard types and improve the expressivity of type systems. Flow-insensitive type qualifier inference has been used in the CQual framework to improve the quality of C programs. Type casts, however, affect the effectiveness of type qualifier inference. First, a language allowing type casts and its flow-insensitive qualifier inference system were presented. Then this paper proposed a variable-involved inference system, introduced union types, and gave a constraint-solving algorithm. Finally, soundness was proved and some case studies were presented.
Semantic-based Approach for Business Activities Recommendation
ZHANG Long, YING Shi, JIA Xiang-yang, GONG Zhi-yuan and LI Lin
Computer Science. 2014, 41 (9): 185-189.  doi:10.11896/j.issn.1002-137X.2014.09.035
In dynamically constructed business workflows, against the background of complex enterprise business logic, users often leave out activities they should perform or select wrong activities they should not. To solve this problem, a semantic-based approach for business activity recommendation was proposed. First we defined a foundation ontology model for business activities and, on the basis of the model, built some basic reasoning rules. Then we described the recommendation strategy in detail and designed a recommendation algorithm based on the strategy. The mechanism of the approach is to receive and analyze business activity events to invoke the recommendation algorithm, which then accesses the knowledge base to generate the final recommendation results for the end user. Finally, a case of a state comprehensive disaster reduction system was studied to validate the proposed approach.
Location Range-based Skyline Query in Road Networks
SHI Chang-yue, QIN Xiao-lin, XU Jian-qiu and HU Cai-ping
Computer Science. 2014, 41 (9): 190-195.  doi:10.11896/j.issn.1002-137X.2014.09.036
With the development of wireless communication and positioning technology, skyline queries in road networks have recently become important in location-based services. Out of consideration for user privacy and the limited accuracy of positioning devices, a user's location is often represented as a spatial range. However, existing research focuses on point-based skyline queries. This paper studied the new problem of location-range-based skyline queries in road networks (RNS) and proposed an efficient query processing algorithm. In addition, because complex distance calculations decrease query efficiency, this paper also proposed an index-based query algorithm: by pre-computing the effective skyline road segments of POIs and creating a new road network model, an index was designed to support RNS efficiently. Extensive experiments on real road network datasets verify the performance and accuracy of the proposed algorithms.
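The dominance test at the heart of any skyline query can be sketched with a block-nested-loop filter. The POI attributes below are toy data; real RNS algorithms use network distances to the location range rather than fixed values:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every dimension and strictly
    better in at least one (smaller is better, e.g. distance and price)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Block-nested-loop skyline: keep points not dominated by any other."""
    result = []
    for p in points:
        if any(dominates(q, p) for q in points if q is not p):
            continue
        result.append(p)
    return result

# Toy POIs described by (network distance to the query range, price).
pois = [(3.0, 50), (1.5, 80), (4.0, 40), (2.0, 90), (3.5, 55)]
print(skyline(pois))   # [(3.0, 50), (1.5, 80), (4.0, 40)]
```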
Systematic Review of Test Suite Minimization for Regression Testing
CHEN Xiang, GU Qing, CHEN Dao-xu and JIANG Zheng-zheng
Computer Science. 2014, 41 (9): 196-204.  doi:10.11896/j.issn.1002-137X.2014.09.037
Test suite minimization (TSM) is a hot and difficult issue in regression testing research. It aims to identify and remove redundant test cases, reducing the cost of regression testing, such that the reduced test suite satisfies the same test requirements as the original. In this paper, we reviewed the existing research on TSM. We first classified existing TSM approaches into two categories: source-code-based and model-based. Among source-code-based approaches, we analyzed and summarized traditional TSM approaches and TSM approaches concerned with fault detection ability. Among model-based approaches, we mainly analyzed and summarized EFSM-based approaches. We then summarized the subjects, evaluation metrics, and results of previous empirical studies. We further summarized successful applications of TSM in specific testing domains, such as GUI application testing, Web application testing, and fault localization. We finally outlined some future work on this hot research topic.
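The classic greedy heuristic that many traditional TSM approaches build on can be sketched as a set-cover loop. The requirement sets below are toy data:

```python
def greedy_minimize(test_requirements):
    """Classic greedy heuristic for TSM: repeatedly pick the test case
    covering the most still-unsatisfied requirements (set-cover style)."""
    uncovered = set().union(*test_requirements.values())
    selected = []
    while uncovered:
        best = max(test_requirements,
                   key=lambda t: len(test_requirements[t] & uncovered))
        if not test_requirements[best] & uncovered:
            break                      # remaining requirements are unreachable
        selected.append(best)
        uncovered -= test_requirements[best]
    return selected

# Toy mapping: test case -> set of requirements (e.g. covered branches).
suite = {
    "t1": {"r1", "r2", "r3"},
    "t2": {"r2", "r4"},
    "t3": {"r3", "r4", "r5"},
    "t4": {"r5"},
}
print(greedy_minimize(suite))   # ['t1', 't3'] already covers r1..r5
```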
Interrupt Modeling and Verification for Embedded Systems Based on Time Petri Nets
ZHOU Kuan-jiu, CHANG Jun-wang, HOU Gang, REN Long-tao and WANG Xiao-long
Computer Science. 2014, 41 (9): 205-209.  doi:10.11896/j.issn.1002-137X.2014.09.038
Embedded systems are interrupt-driven systems, but interrupts are triggered with randomness and uncertainty. Interrupt behavior can be quite difficult to understand fully, and many catastrophic system failures are caused by unexpected behaviors. Interrupt-driven systems therefore need high-quality testing, but effective interrupt system verification methods are currently lacking. In this paper, a modeling method for interrupt systems was first proposed based on time Petri nets, which can describe concurrency and time series. The time Petri nets were then transformed into timed automata for model checking. Subsequently, a symbolic encoding approach was investigated for the formalized timed automata, through which the timed automata can be bounded model checked (BMC) with regard to invariant properties using Satisfiability Modulo Theories (SMT) solving. Finally, Z3 was used in experiments to evaluate the effectiveness of our approach.
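The SMT-based BMC step can be illustrated with the z3py API: unroll a toy transition system k steps and assert the negation of an invariant. This encoding is our simplification, not the paper's timed-automata encoding:

```python
from z3 import Int, Solver, Or

k = 3
xs = [Int(f"x_{i}") for i in range(k + 1)]   # one copy of the state per step
s = Solver()
s.add(xs[0] == 0)                            # initial state
for i in range(k):
    s.add(Or(xs[i + 1] == xs[i] + 1,         # transition: normal increment,
             xs[i + 1] == xs[i] + 3))        # or an interrupt adds 3
s.add(Or(*[x > 10 for x in xs]))             # negation of invariant "x <= 10"
print(s.check())                             # unsat: invariant holds within k
```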
Action Theory Based on Dynamic Linear Temporal Description Logic
SUN Yong-xin and ZHAO Xi-shun
Computer Science. 2014, 41 (9): 210-214.  doi:10.11896/j.issn.1002-137X.2014.09.039
Dynamic linear temporal description logics (DLTLDLs) are a family of dynamic and temporal extensions of description logics. Based on DLTLALCIO, a method for modeling dynamic domains was presented. By applying the modeling method, a DLTLALCIO theory describing a given domain can be constructed, and the frame problem and the ramification problem can be solved. Action reasoning problems, such as the executability problem and the projection problem, can be reduced to reasoning problems in the DLTLALCIO theory, and eventually to the satisfiability problem of DLTLALCIO formulas. Since constraints involving action and time can be expressed with DLTLALCIO formulas, the action formalism based on DLTLALCIO, compared with other action formalisms based on description logics, is more applicable when complex queries, especially queries containing time or actions, are to be executed.
Test Configuration Method Based on Dynamic Programming under Cloud Environment
ZHOU Huan-huan and JIANG Ying
Computer Science. 2014, 41 (9): 215-219.  doi:10.11896/j.issn.1002-137X.2014.09.040
In order to obtain the resource configuration combination with the lowest test cost, this paper briefly described the influence of test configuration on test quality and efficiency. By analyzing the loose coupling and the dynamically changing available resources of cloud platforms, this paper set out the process of test configuration and the structure of test requirements and test configurations. To meet users' test requirements, a method based on dynamic programming was proposed to select available test resources. Finally, the feasibility of the method was verified by experiments in CloudSim.
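One natural dynamic-programming formulation (ours, for illustration; not necessarily the paper's) treats each test requirement as a bit and computes the cheapest resource combination that covers all of them:

```python
def min_cost_config(resources, n_reqs):
    """Bitmask DP: dp[mask] = minimum total cost of any resource combination
    whose satisfied-requirements set covers `mask`."""
    FULL = (1 << n_reqs) - 1
    INF = float("inf")
    dp = [INF] * (FULL + 1)
    dp[0] = 0.0
    for mask in range(FULL + 1):
        if dp[mask] == INF:
            continue
        for cost, covers in resources:
            new = mask | covers
            if dp[mask] + cost < dp[new]:
                dp[new] = dp[mask] + cost
    return dp[FULL]

# Toy cloud resources: (cost, bitmask of test requirements satisfied).
# Requirement bits: 0=OS image, 1=load generator, 2=monitor.
resources = [(5.0, 0b011), (4.0, 0b100), (7.0, 0b110), (3.0, 0b001)]
print(min_cost_config(resources, 3))   # 9.0: pick 0b011 (5.0) + 0b100 (4.0)
```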
Research on Methods of Construction of Voronoi Diagram and Nearest Neighbor Query in Constrained Regions
ZHANG Li-ping, ZHAO Ji-qiao, LI Song, JING Hai-dong and CUI Huan-yu
Computer Science. 2014, 41 (9): 220-224.  doi:10.11896/j.issn.1002-137X.2014.09.041
The Voronoi diagram plays an important role in spatial data query, data mining, image processing, pattern recognition, intelligent traffic management, etc. To reduce complexity and improve efficiency, a method to construct the Voronoi diagram using convex hulls was proposed, based on divide and conquer, local optimization and dynamic scan-line updating, and the Create_Voronoi() algorithm was given. Furthermore, to remedy the defects of existing near-neighbor query methods, the identical-quality nearest neighbor query and the heterogeneous nearest neighbor query in constrained regions were studied, and the TVor_NN() and YVor_NN() algorithms were presented. Theoretical study and experimental results show that the proposed algorithms reduce redundant calculation and hold a large advantage in the construction of Voronoi diagrams and in nearest neighbor queries in constrained regions.
New Strategy Based on Selection of Mutation Operator
WANG Shuai-qun, Ao-ri-ge-le, GAO Shang-ce, TANG Zheng and MA Hai-ying
Computer Science. 2014, 41 (9): 225-228.  doi:10.11896/j.issn.1002-137X.2014.09.042
In immune optimization algorithms with multiple learning mechanisms, the selection probability of mutation operators plays a vital role in the effectiveness of the algorithm. If the choice is not appropriate, the algorithm easily falls into local optima, which to a certain extent affects the quality of the solution and reduces the rate of convergence. According to the characteristics of the multi-learning mechanism, the dependence and correlation between mutation operators were discussed. A new mutation operator selection strategy was proposed, and four benchmark functions were selected as test functions to verify the performance of the strategy. The results show that both the quality of the solution and the convergence speed improve markedly.
Study on Matrix Games with Lattice-valued Payoffs
ZHANG Lin and XU Yang
Computer Science. 2014, 41 (9): 229-231.  doi:10.11896/j.issn.1002-137X.2014.09.043
Game theory has been applied widely to interpret and solve complex, interrelated decision problems. There are few results on games over non-real-valued domains. This paper investigated two-person zero-sum matrix games with lattice-valued payoffs. New equilibrium solutions, i.e. pure-strategy Nash equilibrium and quasi-equilibrium solutions and mixed-strategy Nash equilibrium and quasi-equilibrium solutions, were defined based on the specifics of this kind of game. The properties of the equilibrium solutions were studied, approaches to obtaining them were proposed, and sufficient and necessary conditions for strategies to be equilibrium solutions were given. Finally, an example was shown to verify the feasibility and effectiveness of the new method for two-person zero-sum matrix games with lattice-valued payoffs.
Batch Least-squares Policy Iteration
ZHOU Xin, LIU Quan, FU Qi-ming and XIAO Fei
Computer Science. 2014, 41 (9): 232-238.  doi:10.11896/j.issn.1002-137X.2014.09.044
Policy iteration is a reinforcement learning method that evaluates and improves the control policy iteratively. Policy evaluation with the least-squares method can extract more useful information from the empirical data and improve data validity. To address the low sample utilization of online least-squares policy iteration, which uses each sample only once, a batch least-squares policy iteration (BLSPI) method was proposed and its convergence was proved in theory. The BLSPI method combines online least-squares policy iteration with batch updating: it stores the generated samples online and reuses these samples with least-squares methods to update the control policy. We applied the BLSPI method to the inverted pendulum system, and the experimental results show that the method can effectively utilize previous experience and knowledge, improve the sample utilization rate, and accelerate the convergence speed.
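The batch policy-evaluation core can be sketched as LSTD-Q over a stored sample batch. This is a generic sketch under our own notation; phi, policy, the toy transitions and the regularizer reg are placeholders, not the paper's setup:

```python
import numpy as np

def lstdq(samples, phi, policy, gamma=0.95, reg=1e-3):
    """Batch least-squares policy evaluation (LSTD-Q): solve A w = b from a
    stored batch of transitions, so every sample is reused each iteration.

    samples: list of (s, a, r, s_next) transitions
    phi:     feature function phi(s, a) -> np.ndarray of length d
    policy:  current greedy policy, policy(s_next) -> a_next
    """
    d = len(phi(*samples[0][:2]))
    A = reg * np.eye(d)
    b = np.zeros(d)
    for s, a, r, s_next in samples:
        f = phi(s, a)
        f_next = phi(s_next, policy(s_next))
        A += np.outer(f, f - gamma * f_next)
        b += r * f
    return np.linalg.solve(A, b)   # weights of the action-value function

# Tiny usage: 2 states, 2 actions, one-hot features over (s, a).
phi = lambda s, a: np.eye(4)[2 * s + a]
policy = lambda s: 0
batch = [(0, 0, 1.0, 1), (1, 0, 0.0, 0), (0, 1, 0.0, 1), (1, 1, 1.0, 0)]
print(lstdq(batch, phi, policy))
# In BLSPI, this evaluation alternates with greedy policy improvement.
```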
Big Data Oriented Online Feature Extraction
XU Shuo-na, ZENG Bi-qing and XIONG Fang-min
Computer Science. 2014, 41 (9): 239-242.  doi:10.11896/j.issn.1002-137X.2014.09.045
In big data, the high dimensionality of training samples makes it difficult to classify them during data mining. Exploiting the sparsity induced by the L1 norm, this paper proposed an online feature selection algorithm and used it to classify the training samples. Experiments on public datasets show that the proposed online feature selection algorithm has better prediction accuracy than related work, and thus is applicable to data mining for big data.
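A minimal sketch of L1-driven online feature selection, logistic SGD followed by soft thresholding, which shrinks uninformative weights to exactly zero. The data is synthetic and the paper's exact update rule is not reproduced:

```python
import numpy as np

def online_l1_logistic(stream, dim, lr=0.1, lam=0.01):
    """Online feature selection sketch: logistic-loss SGD with an L1
    soft-thresholding step after each update."""
    w = np.zeros(dim)
    for x, y in stream:                      # y in {-1, +1}
        margin = y * (w @ x)
        grad = -y * x / (1.0 + np.exp(margin))
        w -= lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # L1 shrinkage
    return w

rng = np.random.default_rng(0)
# Synthetic stream: only the first 3 of 20 features carry signal.
X = rng.normal(size=(500, 20))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + 0.1 * rng.normal(size=500))
w = online_l1_logistic(zip(X, y), dim=20)
print(np.nonzero(w)[0])   # selected features, expected to include 0, 1, 2
```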
Model-free Gene Selection Method Based on Maximum Mutual Information
WEI Sha-sha, LU Hui-juan, AN Chun-Lin, ZHENG En-hui and JIN Wei
Computer Science. 2014, 41 (9): 243-247.  doi:10.11896/j.issn.1002-137X.2014.09.046
The large number of irrelevant and redundant features in high-dimensional, large-scale gene chip expression data may reduce the performance of classifiers. We proposed a model-free gene selection method based on maximum mutual information (MMI), which transforms feature selection into a global optimization problem. The fitness function was defined in terms of the ratio of between-class to within-class distance. To evaluate the performance of the algorithm, experiments were done on three data sets. The experimental results show that MMIGA-Selection obtains better 5-fold cross-validation accuracy on every data set. MMIGA-Selection has two main advantages. First, it can effectively remove redundant genes. Second, being model-free, it produces a feature subset that can be applied directly to other types of classifier, obtaining higher classification accuracy.
SVM Sentiment Classifier Based on Semantic Distance for Web Comments
XIAO Zheng, LIU Hui and LI Bing
Computer Science. 2014, 41 (9): 248-252.  doi:10.11896/j.issn.1002-137X.2014.09.047
The analysis of sentiment orientation can be regarded as a classification problem over emotional polarity. Against the background of mass data processing, we proposed an approach to classifying the sentiment orientation of texts based on LSA (Latent Semantic Analysis) and SVM (Support Vector Machine), in order to improve the accuracy of textual emotion judgment. At the semantic level, we established a "word-document" semantic-distance vector space model by latent semantic analysis, and then, drawing on the accuracy and generalization ability of support vector machines, designed an SVM classifier with semantic distances as the input feature vectors. Experimental results validate that our method effectively improves classification accuracy compared with the traditional SVM method; the classification accuracy rises to nearly 88% on a test set of short Web comments with explicit sentiment orientation.
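In scikit-learn terms, an LSA+SVM pipeline looks roughly like this. It is our illustration with an invented mini-corpus; the paper works on Chinese Web comments at much larger scale:

```python
# TF-IDF -> truncated SVD (the LSA step, giving low-dimensional semantic
# vectors) -> linear SVM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = ["great phone, love the screen", "terrible battery, very slow",
        "works perfectly, recommended", "broke after a week, awful"]
labels = [1, 0, 1, 0]                     # 1 = positive, 0 = negative

clf = make_pipeline(TfidfVectorizer(),
                    TruncatedSVD(n_components=2, random_state=0),
                    LinearSVC())
clf.fit(docs, labels)
print(clf.predict(["love it, works great"]))
```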
Emotion Analysis of Chinese Microblogs Using Lexicon-based Approach
NIU Yun, PAN Ming-hui, WEI Ou and CAI Xin-ye
Computer Science. 2014, 41 (9): 253-258.  doi:10.11896/j.issn.1002-137X.2014.09.048
The proliferation of microblogs has created a digital platform where people can express themselves through a variety of means. Automatic analysis of the emotional content of microblogs plays an important role in capturing popular feelings and adjusting personal mood. In this paper, a lexicon-based approach was proposed to automatically determine whether a microblog expresses one of four basic emotions: joy, sadness, anger, and fear. We performed an extensive analysis of current Chinese emotion lexicons to understand their role in analyzing social media text. The experimental results show that the lexicon is a crucial resource in emotion analysis. The results also reveal the limitations of current Chinese emotion lexicons. The characteristics of emotion in microblogs are identified for building advanced emotion analysis systems.
Distributed Path Generation Algorithm Based on Real-time Traffic Information in Dynamic Road Network
LONG Qi,YE Chen and ZHANG Ya-ying
Computer Science. 2014, 41 (9): 259-262.  doi:10.11896/j.issn.1002-137X.2014.09.049
Abstract PDF(428KB) ( 933 )   
References | Related Articles | Metrics
The path-finding problem in dynamic road networks is of great significance for traffic guidance and traffic-flow simulation. This paper presented a distributed path planning algorithm based on real-time traffic information. We modeled the unimpeded degree of each intersection and estimated the travel time between adjacent intersections from the traffic parameters collected by smart cameras installed at road intersections. When a vehicle requests route guidance, the shortest path is found through the network of smart cameras, analogous to network routing. We proposed a delayed-delivery mechanism for the path-finding process: according to the unimpeded degree of the intersection and the distance to the neighboring intersection, each smart camera sets a delay before broadcasting the path-finding packet. In this way the propagation of the packet reflects current road conditions, and the path-finding result can be obtained quickly and effectively.
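The delayed flooding can be simulated as an event-driven search (a sketch under the assumption that each link's forwarding delay grows with its length and congestion; `graph[u]` is a list of (neighbor, delay) pairs):

import heapq

def broadcast_delay(distance_m, unimpeded_degree, scale=0.001):
    # Long or congested links forward later; unimpeded_degree in (0, 1].
    return scale * distance_m / max(unimpeded_degree, 1e-6)

def delayed_flood(graph, src, dst):
    # First arrival at dst traces the currently fastest route
    # (equivalent to Dijkstra on the delay-weighted graph).
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        t, u, path = heapq.heappop(pq)
        if u == dst:
            return t, path
        if u in seen:
            continue
        seen.add(u)
        for v, d in graph[u]:
            if v not in seen:
                heapq.heappush(pq, (t + d, v, path + [v]))
    return None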
Research and Design of Parallel Particle Swarm Optimization Algorithm Based on CUDA
CHEN Feng,TIAN Yu-bo and YANG Min
Computer Science. 2014, 41 (9): 263-268.  doi:10.11896/j.issn.1002-137X.2014.09.050
Abstract PDF(478KB) ( 1484 )   
References | Related Articles | Metrics
When graphics processing units (GPUs) are applied to accelerate the particle swarm optimization (PSO) algorithm, many published comparisons deliberately weaken the CPU-side PSO implementation in order to highlight the acceleration. The concept of "Effective Speedup" was proposed in this paper to compare GPU-PSO and CPU-PSO fairly: the goal is to reach the target precision as fast as possible, and the GPU parallel algorithm is compared with the best CPU serial algorithm, without requiring the same number of particles on the CPU and GPU sides. Experiments on several benchmark test functions using the compute unified device architecture (CUDA) show that substantially increasing the number of particles on the GPU side significantly accelerates reaching the target precision; compared with CPU-PSO, an "Effective Speedup" of more than 10 is achieved.
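The measurement protocol can be sketched as time-to-target timing (assuming a hypothetical optimizer object whose step() runs one PSO iteration and returns the best fitness so far):

import time

def time_to_target(optimizer, target, max_iter=10**6):
    # Wall-clock time until the optimizer first reaches the target precision.
    start = time.perf_counter()
    for _ in range(max_iter):
        if optimizer.step() <= target:
            return time.perf_counter() - start
    return float("inf")

# Effective Speedup compares the best CPU configuration against the best GPU
# configuration, with particle counts allowed to differ:
# speedup = time_to_target(best_cpu_pso, eps) / time_to_target(best_gpu_pso, eps)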
Characteristic Analysis of Pattern Matching with Wildcards Problem and its Solution Space
XIANG Tai-ning,GUO Dan,WANG Hai-ping and HU Xue-gang
Computer Science. 2014, 41 (9): 269-273.  doi:10.11896/j.issn.1002-137X.2014.09.051
Abstract PDF(464KB) ( 476 )   
References | Related Articles | Metrics
With developments in bioinformatics and information retrieval, pattern matching with wildcards and length constraints has attracted wide attention. This problem extends exact pattern matching, bringing more flexibility as well as more complexity, and the time complexity of non-linear algorithms increases greatly. The solution space strongly affects the efficiency of matching algorithms, but its characteristics still lack systematic study. Therefore, the solution space of the problem was formalized and its divisibility was analyzed. Afterwards, a solution-space partitioning algorithm named SPLIT was proposed, and its time complexity was analyzed. In our experiments, three pattern matching algorithms were used as baselines, running on real DNA data sets with 5109 sets of patterns. Our empirical results verify that SPLIT can effectively reduce the time consumption of non-linear matching algorithms without affecting the matching solution.
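To make the problem setting concrete, here is a brute-force enumerator of the solution space, assuming each pair of adjacent pattern characters may be separated by min_gap..max_gap wildcard positions (this illustrates the space SPLIT partitions, not the SPLIT algorithm itself):

def match_with_gaps(text, pattern, min_gap, max_gap):
    # Return all position tuples where pattern occurs under the gap constraints.
    results = []
    def extend(positions, i):
        if i == len(pattern):
            results.append(tuple(positions))
            return
        last = positions[-1]
        lo, hi = last + 1 + min_gap, min(last + 1 + max_gap, len(text) - 1)
        for j in range(lo, hi + 1):
            if text[j] == pattern[i]:
                extend(positions + [j], i + 1)
    for s, ch in enumerate(text):
        if ch == pattern[0]:
            extend([s], 1)
    return results

# match_with_gaps("AGGCA", "AGA", 0, 2) -> [(0, 1, 4), (0, 2, 4)]

The combinatorial growth of such occurrence sets is exactly what makes partitioning the space worthwhile for non-linear matchers.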
α-Semantic Resolution Method Based on Lattice-valued First-order Logic LF(X)
ZHANG Jia-feng and XU Yang
Computer Science. 2014, 41 (9): 274-278.  doi:10.11896/j.issn.1002-137X.2014.09.052
Abstract PDF(390KB) ( 577 )   
References | Related Articles | Metrics
Automated reasoning is one of the most important research directions in artificial intelligence. Resolution-based automated reasoning has been extensively studied because it is easy to implement on computers. The semantic resolution method is one of the most important refinements of the resolution principle: it restricts the types of clauses and the order of literals participating in the resolution procedure so as to reduce redundant clauses, and can thus improve the efficiency of reasoning. To improve the efficiency of the α-resolution principle in lattice-valued logic based on lattice implication algebra, we applied the semantic resolution strategy to the α-resolution principle. Firstly, this paper gave the notions of α-semantic resolution and α-semantic resolution deduction in LF(X). Subsequently, the semantic resolution method on LF(X) was investigated, and the soundness theorem and the conditional completeness theorem of this semantic resolution method were proved. Finally, the effectiveness of the α-semantic resolution method was illustrated with an example.
Q-gram Index for Approximate String Matching with Multi-seeds
SUN De-cai and WANG Xiao-xia
Computer Science. 2014, 41 (9): 279-284.  doi:10.11896/j.issn.1002-137X.2014.09.053
Abstract PDF(473KB) ( 387 )   
References | Related Articles | Metrics
How to quickly find all approximate matches of a given string in a big text database is a key issue in the age of big data. Multi-seed approximate string matching algorithms are studied for their fast search speed, but their huge memory consumption makes large text databases difficult to process. Here, a new q-gram index was proposed to solve this problem for multi-seed approximate string matching algorithms. In the proposed index, the addresses of consecutive seeds of arbitrary length can be computed quickly. The experimental results demonstrate that the space consumption is greatly reduced. As a result, the proposed index is practical for dealing with large databases in the age of big data.
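The underlying structure is simple to state (a basic q-gram index sketch; the paper's contribution is the variant that derives addresses of longer consecutive seeds from it without storing them):

from collections import defaultdict

def build_qgram_index(text, q):
    # Map each q-gram to the sorted list of its start positions in text.
    index = defaultdict(list)
    for i in range(len(text) - q + 1):
        index[text[i:i + q]].append(i)
    return index

# Positions of a seed longer than q can be recovered by intersecting the
# lists of its overlapping q-grams: seed[0:q] at i must meet seed[1:q+1] at i+1.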
Solving HW/SW Partitioning Problem by Improved Estimation of Distribution Algorithm
YU Juan,HE Yu-yao and FENG Xiao-hua
Computer Science. 2014, 41 (9): 285-289.  doi:10.11896/j.issn.1002-137X.2014.09.054
Abstract PDF(371KB) ( 421 )   
References | Related Articles | Metrics
Hardware/software (HW/SW) partitioning is a crucial step in embedded system co-design, and an NP-hard problem. The estimation of distribution algorithm is good at global search, but poor at local search, and is prone to premature convergence because of diversity loss. An improved estimation of distribution algorithm was proposed to solve the HW/SW partitioning problem. The improved algorithm clones and locally searches the promising solutions to strengthen its local search ability, and corrects the probability model to mitigate diversity loss. A method of repairing infeasible solutions was also proposed. Simulations and comparisons with an existing algorithm demonstrate the effectiveness of the improved estimation of distribution algorithm for HW/SW partitioning.
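A baseline binary EDA for this problem fits in a few lines (a PBIL-style sketch with 1 = hardware, 0 = software and a caller-supplied cost function; the probability clamping stands in for the paper's probability-model correction, and the cloning/local-search step is omitted):

import numpy as np

def eda_partition(cost, n_tasks, pop=50, elite=10, iters=200, lr=0.1, p_min=0.05):
    rng = np.random.default_rng(0)
    p = np.full(n_tasks, 0.5)                            # probability model
    best, best_cost = None, float("inf")
    for _ in range(iters):
        X = (rng.random((pop, n_tasks)) < p).astype(int) # sample population
        costs = np.array([cost(x) for x in X])
        order = np.argsort(costs)[:elite]                # promising solutions
        p = (1 - lr) * p + lr * X[order].mean(axis=0)    # update model
        p = np.clip(p, p_min, 1 - p_min)                 # preserve diversity
        if costs[order[0]] < best_cost:
            best, best_cost = X[order[0]].copy(), costs[order[0]]
    return best, best_cost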
Design and Implementation of Citrus Orchard Soil Environmental Decision-making Support System Based on Semantic Web
ZHU Ying and ZHANG Zi-li
Computer Science. 2014, 41 (9): 290-293.  doi:10.11896/j.issn.1002-137X.2014.09.055
Abstract PDF(1415KB) ( 465 )   
References | Related Articles | Metrics
Agricultural production management decision-making systems play an increasingly important role in improving the yield and quality of agricultural products. In this paper, addressing the soil environment problems in citrus production, we proposed a Semantic Web based approach to building a citrus orchard soil environmental decision-making support system. We focused on the structure of the decision support system, the establishment of the soil semantic database, and the definition of reasoning rules. Finally, we applied the semantic database software AllegroGraph to realize the citrus soil semantic database.
Research on Extraction about Ethnic Features of Face Images Based on Integrated Feature and BKPCA
LIU Wen-hui,XU Rui,LIU Hua-yong and MA Guang-chun
Computer Science. 2014, 41 (9): 294-296.  doi:10.11896/j.issn.1002-137X.2014.09.056
Abstract PDF(870KB) ( 379 )   
References | Related Articles | Metrics
A method based on extracting block-integrated KPCA features was presented for extracting the ethnic features of face images. Because the complementarity of global and local features can better reveal the essence of the information, KPCA was first used to extract the global features of the face image and the local features of each sub-block; these features were then combined into ethnic features; finally, the Boosting-RBF classifier designed in this paper was used to identify the ethnicity of face image samples. The experiments were conducted on a minority face image database constructed in this paper. The experimental results show that the extracted ethnic features can accurately identify the ethnic group to which a face image belongs.
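The global-plus-block feature construction can be sketched as follows (a scikit-learn sketch; the 2x2 grid, RBF kernel, and component counts are illustrative, and the Boosting-RBF classifier is not reproduced here):

import numpy as np
from sklearn.decomposition import KernelPCA

def block_kpca_features(images, grid=(2, 2), n_components=20):
    # images: array of shape (n, h, w); returns concatenated global + block features.
    n, h, w = images.shape
    feats = [KernelPCA(n_components=n_components, kernel="rbf")
             .fit_transform(images.reshape(n, -1))]       # global features
    bh, bw = h // grid[0], w // grid[1]
    for r in range(grid[0]):
        for c in range(grid[1]):
            block = images[:, r*bh:(r+1)*bh, c*bw:(c+1)*bw].reshape(n, -1)
            feats.append(KernelPCA(n_components=n_components, kernel="rbf")
                         .fit_transform(block))           # local block features
    return np.hstack(feats)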
Framework of Brain MRI Images Segmentation Based on MultiLayer Level Set
ZHU Xiao-shu,SUN Quan-sen,XIA De-shen and SUN Huai-jiang
Computer Science. 2014, 41 (9): 297-300.  doi:10.11896/j.issn.1002-137X.2014.09.057
Abstract PDF(844KB) ( 414 )   
References | Related Articles | Metrics
This paper proposed a new automated method to initialize the level set function, together with a region-based active contour model based on a multilayer level set formulation. By jointly performing segmentation and bias correction, the proposed model can handle intensity inhomogeneity. Finally, taking the distance information of the cerebral cortex into account, a thickness constraint term was added to the segmentation framework. Experimental results show that our framework segments images more precisely and much faster than the well-known LBF model.
3D Palmprint Recognition Based on Guided Filter and Cross-correlation of Binary Image Groups
LIU Ming,LI Li-hua and LI Zhe
Computer Science. 2014, 41 (9): 301-305.  doi:10.11896/j.issn.1002-137X.2014.09.058
Abstract PDF(2981KB) ( 602 )   
References | Related Articles | Metrics
A robust personal recognition method based on 3D palmprints was proposed in this paper. In the feature extraction stage, a guided filter is used to remove noise from the image; robust orientation features, namely a group of binary images, are then extracted based on the Gabor transform. In the matching stage, the matching score is calculated from the cross-correlation of the binary image groups. The algorithm makes full use of the information in the image groups to obtain an accurate matching score. Experimental results on the HK-Poly 2D+3D palmprint database demonstrate that the proposed approach improves the performance of the 3D palmprint verification system.
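A zero-shift version of the group score looks like this (a sketch: mean normalized cross-correlation over corresponding binary images; the paper may additionally search over small translations, which is not modeled here):

import numpy as np

def group_matching_score(group_a, group_b):
    # Each group: list of equally sized 0/1 orientation images.
    scores = []
    for A, B in zip(group_a, group_b):
        a, b = A - A.mean(), B - B.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
        scores.append((a * b).sum() / denom)  # normalized correlation at zero shift
    return float(np.mean(scores))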
Research on Multi-object Tracking Using Hierarchical Data Association
YANG Guo-liang and ZHANG Jin-hui
Computer Science. 2014, 41 (9): 306-310.  doi:10.11896/j.issn.1002-137X.2014.09.059
Abstract PDF(2653KB) ( 490 )   
References | Related Articles | Metrics
Tracking by detection has been a main research direction in multi-target tracking in recent years. We proposed a global multi-object tracking algorithm using hierarchical data association within the tracking-by-detection framework. We first obtained detection responses over the whole video using an object detector, then solved the data association problem on the detection responses within each video clip to obtain tracklets. Finally, we obtained the object tracks by solving the association problem on the tracklets over the whole video in a hierarchical manner. Experiments on public datasets show that the proposed method solves data association and handles occlusion effectively.
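A standard building block for either level of association is optimal one-to-one assignment (a sketch using the Hungarian algorithm via SciPy; the paper does not specify the solver here, and `affinity` is an assumed similarity in [0, 1]):

import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(tracklets, detections, affinity, threshold=0.3):
    # Minimize total cost = 1 - affinity; drop pairs below the threshold.
    cost = np.array([[1.0 - affinity(t, d) for d in detections] for t in tracklets])
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= 1.0 - threshold]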
Image Defogging Algorithm Based on Constrained Evolutionary Time-frequency Weighted Filtering
LIU Wei,CUI Yong-feng and WU Xiang-lin
Computer Science. 2014, 41 (9): 311-314.  doi:10.11896/j.issn.1002-137X.2014.09.060
Abstract PDF(1209KB) ( 477 )   
References | Related Articles | Metrics
Traditional defogging algorithms based on physical texture features cannot remove fog at image edges effectively, and their performance is especially poor in thick fog. An improved image defogging algorithm was proposed based on constrained evolutionary time-frequency weighted filtering. A block-processing scheme was adopted: the dark color was computed over 3×3 blocks and used as a new feature, and the constrained evolution condition was set. Smoothed pseudo Wigner-Ville and RSPWVD-Hough time-frequency analyses were carried out, and the RSPWVD-Hough time-frequency feature was extracted as the weighting vector to guide an image box filter, realizing the defogging process. A constrained evolutionary algorithm was used to optimize the result. Because the RSPWVD-Hough time-frequency feature effectively reflects the fogged edge feature points of the image, edges are defogged well. Simulation results show that the improved algorithm optimizes defogging quality and improves running speed, and the penetration rate is greatly improved. It has good application value for image target recognition under thick fog conditions.
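The 3×3 dark-color feature mentioned above can be computed directly (a sketch following the dark-channel idea; whether the paper's "dark color" matches this exactly is an assumption):

import numpy as np

def dark_channel(img, block=3):
    # img: float array (h, w, 3); per-pixel min over channels, then over a block.
    h, w, _ = img.shape
    mins = img.min(axis=2)                 # darkest color channel per pixel
    pad = block // 2
    padded = np.pad(mins, pad, mode="edge")
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + block, j:j + block].min()
    return out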
Improved HOG Algorithm of Pedestrian Detection
TIAN Xian-xian,BAO Hong and XU Cheng
Computer Science. 2014, 41 (9): 320-324.  doi:10.11896/j.issn.1002-137X.2014.09.062
Abstract PDF(405KB) ( 694 )   
References | Related Articles | Metrics
In view of HOG's large computational cost alongside its high detection accuracy, this paper adjusted the structure of the HOG features and proposed using the Fisher feature selection criterion to pick out the feature blocks that are most discriminative for pedestrians, finally producing MultiHOG features. Combined with the Lib-SVM classifier, the algorithm detects pedestrians with sliding windows, and achieves higher accuracy and better real-time performance than HOG.
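The Fisher criterion used for block selection can be sketched per feature dimension (squared mean difference over summed variance between pedestrian and non-pedestrian samples; how scores are pooled per block is an assumption):

import numpy as np

def fisher_scores(pos, neg):
    # pos, neg: (n_samples, n_features) HOG responses for the two classes.
    mp, mn = pos.mean(axis=0), neg.mean(axis=0)
    vp, vn = pos.var(axis=0), neg.var(axis=0)
    return (mp - mn) ** 2 / (vp + vn + 1e-12)  # larger = more discriminative

# Keep the top-k scoring blocks to form MultiHOG:
# keep = np.argsort(fisher_scores(P, N))[::-1][:k]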