Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41, Issue 6, June 2014 (posted online 14 November 2018)
A kind of Big Data Placement Method
ZHANG Gui-gang
Computer Science. 2014, 41 (6): 1-4.  doi:10.11896/j.issn.1002-137X.2014.06.001
More and more data-intensive applications are coming into being, and the efficient placement of big data in the data center is becoming increasingly important. This paper proposed a big data placement model. Three major factors influence big data placement: energy consumption, the service capability of heterogeneous nodes, and data sets involved in associated computing. Based on these three factors, our placement model takes into account energy saving, the service capability of heterogeneous nodes, and complex join-query MapReduce computing. The model enables efficient placement management of big data and lays a foundation for the software-customized data center of the future.
TVBRT: A Time-varying Multivariate Data Visualization Method Based on Radial Tree
SUN Ning-wei,ZHAO Yu,LIU Yong,LIU Hai-feng,XIAO Wei-dong and ZHANG Chong
Computer Science. 2014, 41 (6): 5-11.  doi:10.11896/j.issn.1002-137X.2014.06.002
The areas of social science, environmental monitoring, finance and economics, health, and geographic information generate large amounts of time-varying multivariate data. Through an in-depth analysis of time-varying multivariate datasets, we proposed a layout algorithm for Radial Tree called LAMPRT, which integrates and demonstrates metric properties, and a second layout algorithm called LOVEBRT, which exhibits the time-varying features. These two algorithms, together with several human-computer interaction techniques, form the visualization method TVBRT. Case studies demonstrate the effectiveness of the method, showing that it exhibits more details of the datasets and outperforms other methods in presenting data hierarchically and more accurately.
Access Optimization Technique for Mathematical Library of Slave Processors on Heterogeneous Many-core Architectures
XU Jin-chen,GUO Shao-zhong,HUANG Yong-zhong and WANG Lei
Computer Science. 2014, 41 (6): 12-17.  doi:10.11896/j.issn.1002-137X.2014.06.003
Due to the nature of the algorithms underlying mathematical functions, a great number of memory-access operations remain in practice. On heterogeneous many-core architectures, which are becoming ubiquitous, the slave processors access data through shared memory, which heavily limits the access rate. As a result, the performance of the mathematical library's functions cannot meet the requirements of high-performance computing. To solve this problem, this study proposed a novel scheduling strategy based on access instructions that hides access latency behind the necessary computation. With the help of the dynamic calling mode, an algorithm called ldm_call was introduced based on the LDM (local data memory) of the slave processors, which speeds up the access rate significantly. Both optimizations are generally applicable in shared-memory settings, and they effectively reduce access frequency and raise the access rate. The experimental results show that they improve the functions' performance by 16.08% and 37.32% on average, respectively.
Adaptive Simulated Annealing Algorithm for Task Assignment on Homogeneous Multi/Many-core Processors
YAN Qiao,QIN Zhi-dong,WANG Shao-yu and YAN Hong-man
Computer Science. 2014, 41 (6): 18-21.  doi:10.11896/j.issn.1002-137X.2014.06.004
With the rapid increase in the number of cores in multi/many-core processors, the task-assignment solution space grows sharply, so reducing the relative deviation of an approximate solution becomes more and more difficult. An adaptive simulated annealing algorithm was put forward by relating the algorithm parameters to the number of tasks and cores in the optimized environment. The algorithm not only effectively reduces the relative deviation of the approximate solution as the core count increases, but also adapts well to the environment. Experiments on a 16-core platform reveal that the adaptive simulated annealing algorithm uses 41% more iterations, but decreases the relative deviation by 86% compared with recent research results.
Multi-job Assignment Optimization Approach Based on Closed Minimum Graph-partitioning Model
ZHANG Yong-jun and LIN Yu-fei
Computer Science. 2014, 41 (6): 22-26.  doi:10.11896/j.issn.1002-137X.2014.06.005
As parallel computing systems grow in size and complexity, the existing multi-job assignment approaches can cause severe communication latency and contention. To solve this problem, a new multi-job assignment approach based on a closed minimum graph-partitioning model was proposed. To minimize communication latency and eliminate communication contention, the approach translates the multi-job assignment optimization problem into the closed minimum graph-partitioning problem by building a closed minimum graph-partitioning model, and designs a closed minimum graph-partitioning algorithm to obtain an optimized multi-job assignment scheme.
Spectrum Decision Making in Clustering Cognitive Radio Subnet
ZHAO Jun,LIAO Ming-xue,HE Xiao-xin and ZHENG Chang-wen
Computer Science. 2014, 41 (6): 27-30.  doi:10.11896/j.issn.1002-137X.2014.06.006
Tree-structured cognitive radio subnets employ a multi-cluster parallel operation mode. Their spectrum decision involves subnet capacity, subnet throughput and subnet stability, leading to high computational complexity. To solve the spectrum decision-making problem, a triple-layer priority decision model was established and a corresponding heuristic decision algorithm was proposed. The algorithm creates a search space without generating duplicated nodes via the cluster structure and cluster growth degree, greedily searching for subnet structures with higher growth rates, using the number of search steps since the optimal solution was last updated as the heuristic condition. The solution space is rigorously pruned with two thresholds: the subnet capacity limit and the subnet transmission rate limit imposed by the available spectrum. Simulation results show that the proposed algorithm obtains the optimal solution and meets real-time requirements under the given subnet-scale and spectrum-space constraints.
Weight Aware Wireless Sensor Network Deployment in 3-D Indoor Space with Obstacles
PANG Bo,QIN Xiao-lin,JIANG Guo-hua and LIU Liang
Computer Science. 2014, 41 (6): 31-36.  doi:10.11896/j.issn.1002-137X.2014.06.007
None of the existing research on the wireless sensor network deployment problem in restricted indoor spaces considers the requirements of various applications or the interference of obstacles with wireless sensors' signals, which leads to a waste of sensor perception and communication capabilities. To solve these problems, we proposed a heuristic wireless sensor network deployment algorithm that deploys sensors with a greedy strategy and then optimizes the deployment with a weight-aware genetic algorithm to approach the globally optimal solution. The algorithm distinguishes the importance of different areas by weight division and models signal attenuation with a log-normal shadowing model with dynamic variance (LNSM-DV), so as to maximize coverage effectiveness and minimize deployment cost while guaranteeing k-coverage and network connectivity. The experimental results show that our algorithm is more efficient than the cost-efficient k-coverage algorithm of Kouakou et al. when obstacles noticeably affect the wireless sensor nodes.
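For reference, the log-normal shadowing model that LNSM-DV builds on has a standard closed form. The sketch below is a minimal Python rendering of that textbook model; the parameter values are illustrative, and the paper's dynamic-variance rule is not reproduced here.

```python
import math
import random

def lnsm_path_loss(d, pl_d0=40.0, n=3.0, d0=1.0, sigma=4.0):
    """Textbook log-normal shadowing model: path loss in dB at distance d.

    pl_d0 -- path loss at the reference distance d0 (dB)
    n     -- path-loss exponent (indoor environments: roughly 2-6)
    sigma -- std-dev of the zero-mean Gaussian shadowing term (dB);
             LNSM-DV would vary this variance dynamically.
    """
    return pl_d0 + 10.0 * n * math.log10(d / d0) + random.gauss(0.0, sigma)
```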
Dynamic Load Balance Mechanism Based on Mobile Terminal Service-aware in Integrated Heterogeneous Wireless Networks
LUO Jun-hui,BAI Guang-wei,SHEN Hang and CAO Lei
Computer Science. 2014, 41 (6): 37-42.  doi:10.11896/j.issn.1002-137X.2014.06.008
Load balancing in heterogeneous wireless networks is one of the key technologies for enhancing the service quality of networks. Most existing load-balancing mechanisms neither take into account users' personalized business demands on network resources nor provide a satisfaction-guarantee mechanism. To address this problem, a novel dynamic load-balancing mechanism based on mobile-terminal service awareness was proposed. First, a fuzzy mathematics model is introduced to derive the terminal's access expectation for each candidate network. On this basis, the services of a mobile end-user in the current network can be selectively transferred to candidate networks with low resource-congestion rates and high access expectations, in full consideration of the congestion status of those networks' resources, so that local hot-spots are alleviated, resource utilization is enhanced, and user satisfaction is improved. Simulations demonstrate that the proposed mechanism achieves significant performance improvements in terms of access-point load, congestion avoidance, and users' network experience.
Friendship Prediction Based on Fusion of Network Topology and Geographical Features
LUO Hui,GUO Bin,YU Zhi-wen,WANG Zhu and FENG Yun
Computer Science. 2014, 41 (6): 43-47.  doi:10.11896/j.issn.1002-137X.2014.06.009
Friendship prediction has become one of the major research topics in location-based social networks (LBSN). This paper proposed an approach for predicting friendship that fuses the network topology and geographical features of an LBSN. We first adopted information gain to measure the contribution of different features to human friendship and chose three key features: the user's social topology, the categories of the locations where people check in, and the check-in points themselves. We then presented a friendship prediction method based on the fusion of the selected features. Three classification models, Random Forests, Support Vector Machine (SVM) and Naive Bayes, were used to predict friendship. Experimental results on real data collected from Foursquare and JiePang verify the efficacy of the selected features and the accuracy of the friendship prediction.
Research of Cloud Computing Virtual Machine Allocated Strategy on Multi-objective Evolutionary Algorithm
AI Hao-jun,GONG Su-wen and YUAN Yuan-ming
Computer Science. 2014, 41 (6): 48-53.  doi:10.11896/j.issn.1002-137X.2014.06.010
This paper analyzed the cloud computing virtual machine resource model. Considering the mapping between virtual machines and physical machines in the model, and the multi-resource, multi-objective nature of virtual machine allocation, it translated the allocation problem into a multi-dimensional bin-packing problem and introduced a multi-objective evolutionary algorithm to solve it. The algorithm designs a growing-chain encoding and a chromosome evaluation function for the virtual machine allocation problem; based on this encoding, it designs two crossover operators and a smart mutation operator; finally, it introduces the hypervolume measure as the population selection criterion and designs a cloud virtual machine allocation algorithm based on SMS-EMOA. To verify the performance of SMS-EMOA, simulation experiments were conducted with PMH, CGA and SMS-EMOA, and the results show that SMS-EMOA allocates virtual machines better.
Node Sensitivity-based Autonomous Decision-making Target Tracking Algorithm
ZHENG Jin,LV Peng-peng and GUO Shao-ming
Computer Science. 2014, 41 (6): 54-58.  doi:10.11896/j.issn.1002-137X.2014.06.011
Target tracking is one of the basic applications of wireless sensor networks (WSNs). Because node energy is limited, how to reduce node energy consumption while ensuring tracking accuracy is a major research issue. In this paper, the wireless sensor network is divided into several polygons using RNG planarization, and a weighted centroid algorithm based on the polygon structure is used to estimate the target's position. Since a node's sensing quality is inversely proportional to its distance from the target, we gave a method for calculating node sensitivity and proposed a node sensitivity-based autonomous decision-making target tracking algorithm (NS-ADTT). According to its sensitivity and the local network, each node decides autonomously whether to participate in the current tracking. The simulation results show that our algorithm reduces the number of tracking nodes and the energy consumption of the network, and extends the network lifetime efficiently.
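The weighted centroid step the abstract describes admits a compact sketch: with sensing quality inversely proportional to distance, each polygon vertex is weighted by 1/d. This minimal Python version assumes positive distance estimates; the paper's own sensitivity computation is not reproduced.

```python
def weighted_centroid(vertices, distances):
    """Estimate the target position inside a polygon.

    vertices  -- list of (x, y) sensor positions (polygon vertices)
    distances -- positive node-to-target distance estimates; sensing
                 quality is inversely proportional to distance, so each
                 vertex is weighted by 1/d.
    """
    weights = [1.0 / d for d in distances]
    total = sum(weights)
    x = sum(w * v[0] for w, v in zip(weights, vertices)) / total
    y = sum(w * v[1] for w, v in zip(weights, vertices)) / total
    return x, y
```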
Resource Allocation and Pricing Mechanism for Multi-type Resources of Cloud Market Based on Mechanism Theory
SHEN Zhang-guo,LOU Jun-gang,MA Xiao-long and MA Wang-yong
Computer Science. 2014, 41 (6): 59-62.  doi:10.11896/j.issn.1002-137X.2014.06.012
In order to resolve the conflict between individual utility and social welfare caused by participants' selfishness in cloud resource allocation, a cloud market model for task requests was proposed under the assumption that all participants are rational. A mechanism-theoretic resource allocation and pricing mechanism for the multi-type resources of the cloud market was then presented, which achieves effective allocation and reasonable pricing for users' complex tasks and maximizes both individual utility and social welfare. Finally, the mechanism was proved to satisfy individual rationality, budget balance and incentive compatibility, and an algorithm implementing the mechanism with cost minimization and utility maximization was given.
GPU-based Parallel SVD Least Squares Estimation Algorithm
LI Fan,JIN Ming-lu and LIU Ji
Computer Science. 2014, 41 (6): 63-68.  doi:10.11896/j.issn.1002-137X.2014.06.013
Singular value decomposition (SVD) for solving the least-squares estimation problem was studied. SVD-based parameter estimation is very time- and memory-consuming, so an iterative divide-and-merge SVD algorithm (IDMSVD) that aims to improve it was proposed. Based on IDMSVD, a parallel IDMSVD algorithm was also proposed and realized on Nvidia GPUs. The experimental results show that IDMSVD effectively mitigates the running time and memory consumption of the SVD least-squares solution, and that the parallel IDMSVD algorithm further reduces the running time of IDMSVD.
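For context, the SVD route to least squares that IDMSVD accelerates is the standard x = VΣ⁺Uᵀb solve; a minimal NumPy sketch (not the paper's IDMSVD) follows.

```python
import numpy as np

def svd_least_squares(A, b, rcond=1e-12):
    """Solve min ||Ax - b||_2 via the SVD: x = V @ pinv(S) @ U.T @ b.

    Singular values that are tiny relative to the largest one are
    truncated, which is what makes the SVD solution numerically stable.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.array([1.0 / x if x > rcond * s[0] else 0.0 for x in s])
    return Vt.T @ (s_inv * (U.T @ b))
```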
Randomized Algorithm for Virtual Network Mapping Problem Based on Load Balancing
YU Jian-jun and WU Chun-ming
Computer Science. 2014, 41 (6): 69-74.  doi:10.11896/j.issn.1002-137X.2014.06.014
This paper analyzed problems with existing virtual network mapping (VNM) algorithms based on best-effort service and pointed out their shortcomings in resource load balancing. It then defined a load-balancing cost indicator on the physical network and proposed a randomized virtual network mapping algorithm based on load balancing. Experiments show that the proposed algorithm improves the load-balancing metric and resource utilization of the physical network, and therefore improves the acceptance ratio of virtual network construction requests and the revenue of the physical network service provider.
Data Collection and Analysis of Traffic Flow between Peers in BitTorrent Networks
YAN Yu and TANG Hong
Computer Science. 2014, 41 (6): 75-78.  doi:10.11896/j.issn.1002-137X.2014.06.015
The fact that BitTorrent networks generate a large portion of Internet traffic has attracted many researchers. Unfortunately, because existing tools cannot measure the dynamic volume of traffic between two individual peers, there has been little research on inter-peer traffic flow in real BT networks. The authors designed an inter-peer traffic data collection system, deployed it on the PlanetLab platform to obtain real data, verified the data's correctness, and analyzed the measurements. Two findings emerged: (1) the list of neighbor peers that the Tracker server returns to a downloading peer is affected by the time at which peers joined the network; (2) for 50% or more of the peers, the volume downloaded from the seed is zero, and most peers seldom depend on the seed for downloading, which demonstrates the advantage of the P2P mechanism.
Assembling Algorithm of Optical Burst Switching Network under Multiple Performances Restriction
NIU Da-wei,LI Qi,YU Wei-bo and MI Zhi-chao
Computer Science. 2014, 41 (6): 79-83.  doi:10.11896/j.issn.1002-137X.2014.06.016
The paper proposed an assembling algorithm for the edge nodes of optical burst switching networks that can satisfy multiple network performance requirements at the same time. The algorithm is founded on an analysis of the impact of the assembling algorithm on the control plane's end-to-end delay, the resource utilization of the data plane, and the burst drop ratio. It adjusts the time threshold and the queue-length threshold according to the network performance constraints, satisfying the desired resource utilization and burst drop ratio simultaneously.
JPEG2000 Digital Image Encryption Algorithm Supporting Ciphertext Transcoding
FU Yong,YI Xiao-wei and MA Heng-tai
Computer Science. 2014, 41 (6): 84-88.  doi:10.11896/j.issn.1002-137X.2014.06.017
Aiming at the growing demand for transcoding capacity caused by the increasing heterogeneity of communication networks, a digital image hierarchical encryption algorithm supporting ciphertext-domain transcoding, named CT-HEA (Ciphertext Transcoding-Hierarchical Encryption Algorithm), was proposed. In contrast to traditional JPEG2000 image encryption algorithms, and exploiting the rate-distortion-optimized truncation model, CT-HEA truncates and recombines the compressed codestream according to image quality layer and resolution, and applies hierarchical encryption to the reorganized codestream using cryptographic algorithms. It supports transparent transcoding operations directly on the encrypted, compressed bitstream. Simulation results demonstrate that CT-HEA features low complexity, high security, flexible secure transcoding and low transcoding overhead.
Data Trust Model for Road Information in Vehicular Ad hoc Networks
WANG Guang-hao and WU Yue
Computer Science. 2014, 41 (6): 89-93.  doi:10.11896/j.issn.1002-137X.2014.06.018
Dynamic routing is one of the solutions to the urban traffic congestion problem. In a dynamic routing plan, some vehicles are willing to produce and forward road information so that other vehicles can analyze the received information to avoid congested roads. Because road information is not validated, malicious vehicles can tamper with it to mislead other vehicles into choosing wrong routes. A new trust model to verify road information was proposed. Validation of road information is based on data trust rather than entity trust, and Dempster-Shafer theory is applied to the voting algorithm to increase robustness under uncertainty. Simulation shows that, with no additional data exchange, the model effectively detects and avoids malicious data. When looking for the right route, vehicles disregard malicious route information in the database, so the model helps improve vehicles' travel time.
Steganalysis Based on Multi-domain Features for JPEG Images
WANG Lei,ZENG Xian-ting and SU Jin-yang
Computer Science. 2014, 41 (6): 94-98.  doi:10.11896/j.issn.1002-137X.2014.06.019
A universal steganalysis method for JPEG images was proposed. The algorithm achieves a high detection rate by combining various statistical features, including DCT-domain, spatial-domain and DWT-domain features. The combined statistical features better reveal the differences between an original cover image and its stego-image. Experiments were conducted to show the effectiveness of the proposed method.
Improved RSSI-based Localization Method Using Bounding-box Algorithm in WLAN
SUN Shan-wu,WANG Nan and CHEN Jian
Computer Science. 2014, 41 (6): 99-103.  doi:10.11896/j.issn.1002-137X.2014.06.020
Compared with outdoor localization technologies such as GPS, WLAN-based localization is more applicable to indoor environments. Two RSSI-based methods are widely used and researched for indoor localization: the location-fingerprint method and the signal-propagation-modeling method. We combined the two into an improved RSSI-based localization method for WLAN using the bounding-box algorithm and an improved binary range search algorithm. The proposed method pre-sorts the samples of the fingerprint database by their X-coordinates (or Y-coordinates) and significantly reduces the number of candidate samples with a linear binary range search, so that real-time localization efficiency is greatly increased. We also enriched the dimensions of the location fingerprint, adding a time-period dimension, and presented experimental results demonstrating that the proposed methods estimate user location with a high degree of accuracy.
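The pruning idea — pre-sort the fingerprints by one coordinate, then keep only samples inside the bounding box via a binary range search — can be sketched as follows; the data layout and box bounds here are illustrative assumptions.

```python
from bisect import bisect_left, bisect_right

def candidates_in_box(fingerprints, xs, x_lo, x_hi):
    """Keep only fingerprint samples whose X-coordinate lies in the box.

    fingerprints -- list of (x, y, rssi_vector) tuples sorted by x
    xs           -- precomputed sorted list of the X-coordinates
    The bounding box [x_lo, x_hi] is assumed to come from the signal
    propagation model; the binary range search trims the database from
    O(N) candidates down to the box contents in O(log N) time.
    """
    return fingerprints[bisect_left(xs, x_lo):bisect_right(xs, x_hi)]
```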
Algorithm of Matching to XACML-Policy Based on Component of Multi-valued Attribute
LI Dong-hui,ZHANG Bin,FEI Xiao-fei and LIU Yang
Computer Science. 2014, 41 (6): 104-107.  doi:10.11896/j.issn.1002-137X.2014.06.021
Aiming at the demand for matching between XACML policies and requests with multi-valued attributes, this paper analyzed the attribute relationships between policy rules and requests. Three related theorems were proposed and proved based on the relationship between attribute implication and permission implication. According to these three policy-matching theorems, a matching algorithm was put forward. Experiments show that the algorithm enhances matching efficiency.
Data Assured Deletion Scheme Based on Trust Value for Cloud Storage
FENG Gui-lan and TAN Liang
Computer Science. 2014, 41 (6): 108-112.  doi:10.11896/j.issn.1002-137X.2014.06.022
Assured data deletion is a hot topic in cloud storage security. The existing distributed assured-deletion approach does not account for node trust in the DHT, which may prevent users from accessing data before the deadline. To solve this problem, a data assured deletion scheme based on trust values for cloud storage (DDTV) was proposed. The core of DDTV is evaluating DHT nodes and choosing nodes with high trust values to store key shares. The dynamic nature of the DHT network causes key shares to be deleted periodically, so sensitive data in the cloud is automatically destroyed after its expiration time. The difference from the existing distributed approach is that, after the key is partitioned by a secret-sharing scheme, the shares are pushed to the DHT network according to trust values, with high-trust nodes preferred. This improves the probability of decrypting the data before the deadline. The experimental results show that the method not only identifies malicious nodes in the DHT, but also improves the success rate of key-share extraction, which in turn increases the rate at which users can access sensitive data within the authorized time.
Optimizing Algebraic Multigrid on NUMA-based Cluster System
GU Jian and LIU Wei
Computer Science. 2014, 41 (6): 113-118.  doi:10.11896/j.issn.1002-137X.2014.06.023
Algebraic Multigrid (AMG) is a key method in many numerical simulation programs, but it encounters a serious scalability problem on multi-core NUMA-based clusters: data tend to be allocated near the CPU core on which the main thread runs, so the data lack locality. By applying our NUMA-aware memory allocator, each partition of a data block can be bound to the NUMA memory node on which its owner thread runs, which preserves much more data locality when parallelizing AMG with OpenMP and makes BoomerAMG scale more easily and efficiently. In experiments on a single node and on a small 16-node cluster, the NAAlloc allocator improves peak performance by 16% and 60%, respectively.
Quality of Recommendation Based Trust-aware Recommender System
WANG Hai-yan and ZHOU Yang
Computer Science. 2014, 41 (6): 119-124.  doi:10.11896/j.issn.1002-137X.2014.06.024
Recommender systems have achieved great success in dealing with information overload, but still suffer from problems such as data sparsity and cold start. Obtaining satisfactory recommendations under sparse data is an urgent need, and introducing trust into recommender systems is an efficient way to address these problems. Most existing trust-aware recommender systems are based on Boolean trust relationships and do not take the domain correlation of trust into account. In service selection, a service requestor selects services based on QoS (quality of service); by analogy, a recommendation requestor can find recommenders based on QoR (quality of recommendation). We therefore put forward the concept of QoR and a QoR-based trust-aware recommender system. The attributes of QoR include user rating similarity, domain trust, domain relative degree and social intimacy degree, whose weights are determined by the information-entropy method. Empirical evaluation shows that our method improves the precision and rating coverage of the recommender system under sparse data and, above all, effectively improves the recall rate for cold-start users, resolving the cold-start problem to some extent.
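The abstract determines the QoR attribute weights by the information-entropy method; the standard entropy-weight computation it presumably refers to looks like this minimal NumPy sketch (the four QoR attributes would form the columns of X).

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (m samples x n attributes) matrix X.

    Attributes whose values vary more across samples have lower entropy
    and therefore receive larger weights.
    """
    m, n = X.shape
    P = X / X.sum(axis=0)              # column-wise normalization
    e = np.empty(n)
    for j in range(n):                 # entropy of each attribute column
        p = P[:, j][P[:, j] > 0]
        e[j] = -(p * np.log(p)).sum() / np.log(m)
    d = 1.0 - e                        # degree of diversification
    return d / d.sum()
```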
Approximation of Multi-valued Models via Reduction
CHEN Juan-juan and WEI Ou
Computer Science. 2014, 41 (6): 125-130.  doi:10.11896/j.issn.1002-137X.2014.06.025
Multi-valued models can be used for the modeling and verification of software systems with uncertain and inconsistent information. In this paper, the approximation of multi-valued models was described via reduction, which provides a theoretical basis for addressing the state-explosion problem. A new approach that decomposes a multi-valued model into several three-valued ones was proposed, and it was strictly proved that the model-checking result of any μ-calculus formula on the multi-valued model equals the union of the results on the associated three-valued models. Furthermore, the approximation of multi-valued models was defined based on the mixed simulation of the corresponding three-valued models, which preserves μ-calculus model-checking results with respect to the information order.
New Methods of Software Requirements Risk Assessment Using UML
LIU Jin-hang and XIA Hong-xia
Computer Science. 2014, 41 (6): 131-135.  doi:10.11896/j.issn.1002-137X.2014.06.026
Evaluating software risks in the early stages of software development effectively reduces development costs and development risk. Observing that current research on software risk assessment mainly focuses on the late stages of development, and following the principle of identifying and controlling software risk early, this paper presented a method for assessing software risk at the requirements-analysis phase using UML. The method focuses on risk prevention at the requirements-analysis phase and thereby provides a reference for reducing the risk impact of later phases.
Simple Continuous Near Neighbor Chain Query in Dynamic Constrained Regions
LI Song,ZHANG Li-ping,ZHU De-long and HAO Xiao-hong
Computer Science. 2014, 41 (6): 136-141.  doi:10.11896/j.issn.1002-137X.2014.06.027
The simple continuous near neighbor chain query in constrained regions (CRSCNNC query) is important in spatial data mining, similarity analysis and reasoning over data, spatial databases, etc. To remedy deficiencies in the existing work, the simple continuous near neighbor chain query in dynamic constrained regions was studied. The VOR_IN_CRSCNNC, VOR_EX_CRSCNNC and VOR_DE_CRSCNNC algorithms were presented based on the Voronoi diagram, and the performance of the methods was analyzed and compared experimentally. The theoretical study and the experimental results show that redundant calculation is reduced and that the algorithms hold a clear advantage on large data sets and on regions with complex shapes.
PBPP: Pipelined Parallel Processing Based on Passing Buffer in Column-store System
DING Xiang-wu and ZHANG Guang-hui
Computer Science. 2014, 41 (6): 142-147.  doi:10.11896/j.issn.1002-137X.2014.06.028
Chip multiprocessors (CMP), with their low power dissipation and low cost, have rapidly become dominant in the market and provide hardware support for multithreading. Column stores have significant advantages in analytical applications, and query optimization is one of the key issues in column stores, where multi-core resources can improve query-processing performance. To improve query performance, this paper established a passing-block buffer so that the main thread and a worker thread read and write different passing blocks, letting the parent and child nodes of the physical execution tree execute in parallel. We used the classic producer-consumer pattern to solve the synchronization problem between the threads. In DWMS, the column-store system developed by our laboratory, experimental results on the SSB benchmark data set show the effectiveness of this design, which improves execution performance by 50% for some typical complex queries.
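The synchronization pattern named in the abstract — a bounded passing-block buffer between the child operator (producer) and the parent operator (consumer) — is the classic one sketched below. The actual DWMS system is presumably native code; this Python version only illustrates the idea.

```python
import queue
import threading

buf = queue.Queue(maxsize=8)            # the "passing block" buffer
SENTINEL = None

def producer(blocks):                   # child operator (worker thread)
    for block in blocks:
        buf.put(block)                  # blocks when the buffer is full
    buf.put(SENTINEL)                   # signal end of stream

def consumer(process):                  # parent operator (main thread)
    while True:
        block = buf.get()
        if block is SENTINEL:
            break
        process(block)

t = threading.Thread(target=producer, args=([[1, 2], [3, 4], [5, 6]],))
t.start()
consumer(print)
t.join()
```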
Research of Extraction Method of Web Mathematical Formula Based on LaTex
CHEN Li-hui,SU Wei,CAI Chuan and CHEN Xiao-yun
Computer Science. 2014, 41 (6): 148-154.  doi:10.11896/j.issn.1002-137X.2014.06.029
The influence of wikis, mathematics forums and other social networking sites on mathematics education is growing, and mathematical formulas are widespread on these websites. Searching the formulas on such websites is important for study and research, and extracting the formulas is the premise and foundation of an indexing system. This paper studied the format of LaTeX mathematical formulas and presented an automatic extraction method for Web mathematical formulas based on LaTeX, described with a BNF grammar. According to the features the formulas contain, the paper proposed a method for extracting and filtering LaTeX mathematical formulas. Experiments show that with this method the recall reaches 75% and the precision reaches 99%.
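A hedged sketch of the extraction step: the delimiters below are common LaTeX math markers found on Web pages, not the paper's full BNF-based grammar, which would accept and filter far more forms.

```python
import re

# Common LaTeX math delimiters seen on Web pages; the paper's BNF-based
# grammar would recognize more forms than these patterns do.
MATH_PATTERNS = [
    re.compile(r"\$\$(.+?)\$\$", re.S),                        # display math
    re.compile(r"(?<!\$)\$([^$]+?)\$(?!\$)"),                  # inline math
    re.compile(r"\\begin\{equation\}(.+?)\\end\{equation\}", re.S),
]

def extract_latex_formulas(text):
    """Return candidate LaTeX formula strings found in a page's text."""
    found = []
    for pat in MATH_PATTERNS:
        found.extend(m.strip() for m in pat.findall(text))
    return found

print(extract_latex_formulas(r"Euler: $e^{i\pi}+1=0$; area: $$\int_0^1 x\,dx$$"))
```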
Large Scale Induced Subgraphs Mining Algorithm on Self Adaptive Cloud
GUO Xin,DONG Jian-feng and ZHOU Qing-ping
Computer Science. 2014, 41 (6): 155-160.  doi:10.11896/j.issn.1002-137X.2014.06.030
Aiming at the random resource allocation of current cloud computing platforms and the low mining efficiency of traditional induced-subgraph algorithms, and in order to improve resource integration on cloud platforms and large-scale induced-subgraph mining, this paper put forward a large-scale induced-subgraph mining algorithm for a self-adaptive cloud, addressing both optimal resource utilization and massive graph mining. The paper first introduces the relevant concepts and the problem statement for cloud computing and induced-subgraph mining. It then designs SAC_TA, a self-adaptive dynamic task-allocation algorithm based on the MapReduce parallel processing model, which allocates system resources to computing tasks adaptively to minimize cost, and designs the self-adaptive cloud framework. On the basis of this framework, the paper puts forward the massive induced-subgraph mining algorithm SFGFF, which comprises four mining stages. Applying all the algorithms to the self-adaptive cloud yields the complete induced-subgraph mining system. Experimental results on simulated and real-environment data show that the self-adaptive cloud runs well and that the algorithms are efficient and feasible, with high speed-up ratios and operating efficiency that satisfy the demands of massive frequent induced-subgraph mining.
Different Linear Discriminant Analysis Based on Laplacian Orientations
LI Zhao-kui,DING Li-xin,WANG Yan,HE Jin-rong and ZHOU Ling-yun
Computer Science. 2014, 41 (6): 161-165.  doi:10.11896/j.issn.1002-137X.2014.06.031
Traditional linear discriminant analysis methods usually face three problems: 1) to ensure that the within-class scatter matrix is nonsingular, principal component analysis must first be performed, which limits the benefit of the high-dimensional space; 2) if there is only a single training sample per person, the within-class scatter matrix is generally singular and the method does not work; 3) the local correlation between pixels is not considered. To address these problems, this paper proposed a difference-based linear discriminant analysis built on Laplacian orientations. Using Laplacian orientations yields a more robust dissimilarity measure between images, and introducing the difference scatter matrix avoids the singularity of the within-class scatter matrix. Experiments show that the proposed method is more robust to facial expressions, illumination changes and different occlusions, and achieves a higher recognition rate; for illumination changes in particular, the effect is better.
Incremental Learning with Support Vector Regression
ZHANG Yi-fan,FENG Ai-min and ZHANG Zheng-lin
Computer Science. 2014, 41 (6): 166-170.  doi:10.11896/j.issn.1002-137X.2014.06.032
In view of the costly time and space complexity of support vector regression on large-scale data, this paper presented a novel incremental-learning algorithm named LISVR (incremental υ-support vector regression). In each iteration, LISVR eliminates the non-support vectors and then takes the support vectors, with weight factors, as the training samples, which reduces the time and space complexity and improves the regression results at the same time. We proved theoretically that the algorithm converges to the globally optimal solution. Experiments on artificial data sets, a UCI data set and airport-noise data show the effectiveness of LISVR.
Research on Discovery Method in Literature Implicit Knowledge Using Semantic Complementary Reasoning
WEN Hao and WEN You-kui
Computer Science. 2014, 41 (6): 171-175.  doi:10.11896/j.issn.1002-137X.2014.06.033
Literature knowledge discovery methods have become breakthrough technologies for the difficult problem of massive information retrieval. However, current methods are vector-space methods based on bags of words, which have an inherent deficiency: the vocabulary elements are semantically independent, so a large amount of potential knowledge that exists between texts cannot be discovered. Therefore, a new Chinese literature knowledge discovery method is put forward in this paper, in which the minimum knowledge unit is represented and semantic reasoning is performed over subject-predicate-object (S, P, O) structures. This avoids the deficiency of traditional knowledge discovery methods, and a reasoning algorithm based on this structure is proposed to discover the potential knowledge between texts. The experiments in this paper show that the method discovers potential knowledge more efficiently than traditional methods.
Observation Reduction in Multi-agent Domain
WU Xuan,WEN Zhong-hua,WANG Quan and CHANG Qing
Computer Science. 2014, 41 (6): 176-179.  doi:10.11896/j.issn.1002-137X.2014.06.034
Observation information reduction has been a hot area of uncertainty planning research in recent years, but existing work concentrates on single-agent environments, and planning with observation reduction in multi-agent domains has received little attention. For planning problems in multi-agent domains, this paper designed the ORMAP algorithm, which finds a collaborative plan in a nondeterministic multi-agent domain. First, ORMAP layers all states in the problem domain according to a model-based hierarchical-states approach, in order to avoid conflicts between agents. It then searches for a collaborative plan by backtracking with minimum-cost priority while reducing the observation information. The resulting cooperative plan is the one that needs the least observation information among all cooperative plans for the problem domain. Finally, experiments show that the algorithm is more efficient once the constraints of observation reduction are taken into account.
Analysis and Improvement of SPFA Algorithm
XIA Zheng-dong,BU Tian-ming and ZHANG Ju-yang
Computer Science. 2014, 41 (6): 180-184.  doi:10.11896/j.issn.1002-137X.2014.06.035
SPFA (Shortest Path Faster Algorithm) is a single-source shortest path algorithm for digraphs. Because of its easy implementation and good practical performance, SPFA is widely used in China, but it has lacked a correct theoretical analysis. This paper analyzed the algorithm both theoretically and experimentally, and pointed out that the worst-case running time of SPFA is Θ(|V||E|) on digraphs with no negative-weight cycle reachable from the source; on digraphs with such a cycle, the algorithm runs indefinitely. The paper then revised the algorithm so that it runs in O(|V||E|) on any digraph. Finally, SPFA was compared experimentally with other shortest path algorithms based on the same idea.
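For illustration, here is a plain SPFA with one standard way of enforcing an O(|V||E|) bound on arbitrary digraphs — aborting once any vertex has been enqueued |V| times, which certifies a reachable negative-weight cycle. The paper's own revision may differ in detail.

```python
from collections import deque

def spfa(graph, source):
    """Queue-based Bellman-Ford (SPFA). graph: {u: [(v, w), ...]}.

    Counting enqueues per vertex bounds the running time by O(|V||E|):
    a vertex entering the queue |V| times certifies a negative-weight
    cycle reachable from the source.
    """
    dist = {u: float("inf") for u in graph}
    in_queue = {u: False for u in graph}
    count = {u: 0 for u in graph}
    dist[source] = 0
    q = deque([source])
    in_queue[source] = True
    while q:
        u = q.popleft()
        in_queue[u] = False
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if not in_queue[v]:
                    count[v] += 1
                    if count[v] >= len(graph):
                        raise ValueError("negative-weight cycle reachable")
                    q.append(v)
                    in_queue[v] = True
    return dist
```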
Temporal Shortest Path Algorithm
DENG Dong-mei,WANG Guan-nan,ZHU Jian,GAO Hui and CHEN Duan-bing
Computer Science. 2014, 41 (6): 185-187.  doi:10.11896/j.issn.1002-137X.2014.06.036
The shortest path is the path with the least hindrance between two nodes in a given network. Traditionally it is studied on static networks, but many real-life networks are dynamic and temporal, so traditional algorithms cannot solve all shortest path problems. This paper presented an exact algorithm for finding the temporal shortest path based on Dijkstra's algorithm, proved its correctness by mathematical derivation, and verified its feasibility on a constructed network.
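As one concrete reading of a temporal shortest path, the earliest-arrival variant below runs a Dijkstra-style scan over time-stamped edges. This is a textbook formulation chosen purely for illustration, since the paper's "hindrance" cost is not spelled out in the abstract.

```python
import heapq

def earliest_arrival(events, source, t_start):
    """Earliest-arrival times in a temporal network.

    events -- {u: [(t_dep, t_arr, v), ...]}: time-stamped edges of u.
    A Dijkstra-style scan is exact here because arrival times are
    monotone: once a node is settled, no edge departing from a
    later-settled node can improve it.
    """
    best = {source: t_start}
    heap = [(t_start, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if t > best.get(u, float("inf")):
            continue                       # stale heap entry
        for t_dep, t_arr, v in events.get(u, ()):
            if t_dep >= t and t_arr < best.get(v, float("inf")):
                best[v] = t_arr
                heapq.heappush(heap, (t_arr, v))
    return best
```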
Multi-epoch Analysis to Evolution Strategy of Enterprise Cloud Computing Application
YIN Xing,ZHOU Jian-xiong and WANG Ming-zhe
Computer Science. 2014, 41 (6): 188-192.  doi:10.11896/j.issn.1002-137X.2014.06.037
With the development of cloud computing, many companies urgently need an effective way to migrate their application systems from an isolated state to a cloud platform. By applying the analytic network process (ANP) and the convergence of the supermatrix to formulate enterprise cloud computing application evolution, a multi-epoch analysis approach for evolution strategies was put forward. An ANP model for multi-attribute decision making is built by modeling and simulating the architecture of the enterprise's cloud computing application systems, and the crucial multi-epoch influence factors of each system are identified by analyzing the convergence of the supermatrix from its initial state to the limit. These factors support the evolution of the enterprise's strategy. Finally, a case study illustrates the method and extracts the corresponding key factors, providing a theoretical basis for the dynamic programming of the enterprise's strategy.
Safety Verification for VCCR Subsystem of Space Life Support System
LI Qian and YU Wen-sheng
Computer Science. 2014, 41 (6): 193-198.  doi:10.11896/j.issn.1002-137X.2014.06.038
This paper applied differential dynamic logic to analyze and verify the safety properties of a hybrid system named Variable Configuration Carbon Dioxide Removal (VCCR), a subsystem of the space life support system. A model of VCCR and its safety properties was built as a hybrid program, and using the hybrid-system verification tool KeYmaera, it was formally proved that the system satisfies the safety properties under all possible scenarios.
Cost Reference Marginalized Particle Filter Based on Adaptive Particle Swarm Optimization
HU Zhen-tao,WEI Dan,JIN Yong and HU Yu-mei
Computer Science. 2014, 41 (6): 199-203.  doi:10.11896/j.issn.1002-137X.2014.06.039
Aiming at the precise measurement of importance weights and the effective sampling of particles under measurement uncertainty, a novel cost-reference marginalized particle filter based on adaptive particle swarm optimization was proposed. In the new algorithm, a cost function and a risk function are first introduced to make reasonable use of the latest observation, reducing the dependence on prior knowledge of the observation noise in the classical measurement of importance weights. Second, by extracting and using information about the particle distribution, an adaptive selection strategy for the velocity limit is obtained and a new adaptive particle swarm optimization method is given. Finally, by incorporating the colony-optimization mechanism of particle swarm optimization, the sampled particles approximate the estimated state more effectively and the diversity of particles after resampling is improved. Theoretical analysis and experimental results show the efficiency of the proposed algorithm.
Research and Improvement of TFIDF Text Feature Weighting Method
HUANG Lei,WU Yan-peng and ZHU Qun-feng
Computer Science. 2014, 41 (6): 204-207.  doi:10.11896/j.issn.1002-137X.2014.06.040
Keyword extraction plays a very important role in text classification and information retrieval. This paper first analyzed a shortcoming of the original TFIDF algorithm: the IDF (Inverse Document Frequency) component does not consider the distribution of a feature term across categories. As a result, some low-frequency terms receive high IDF weights while some high-frequency words receive low IDF weights, making keyword extraction imprecise. After analyzing these problems, we added a new weight DI (Distribution Information) and obtained a new DI-TFIDF algorithm. For the experiment, we downloaded a corpus from Sogou and selected 1000 documents from the sports, education and military categories, extracting keywords with both the traditional TFIDF method and the DI-TFIDF method. Experimental results show that the proposed DI-TFIDF method extracts keywords with higher accuracy than the traditional TFIDF algorithm.
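A hedged sketch of the idea: multiply the classic TF-IDF score by a distribution factor that rewards terms concentrated in one category. The DI formula below is illustrative only; the paper defines its own DI weight.

```python
import math

def di_tfidf(tf, df, n_docs, class_df, class_sizes):
    """Illustrative TF-IDF score with a distribution-information factor.

    tf, df, n_docs -- classic term/document frequency inputs
    class_df       -- {category: docs of that category containing the term}
    class_sizes    -- {category: total docs in that category}
    The DI factor below (peak in-class document rate over the overall
    rate) is a stand-in; the paper defines its own DI weight.
    """
    idf = math.log(n_docs / (1.0 + df))
    overall = df / n_docs
    if overall == 0:
        return 0.0
    peak = max(class_df.get(c, 0) / size for c, size in class_sizes.items())
    return tf * idf * (peak / overall)
```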
Dynamic Memory Risk Identification Model Based on Residual Antigen
TAO Yuan,HU Min and WANG Ping
Computer Science. 2014, 41 (6): 208-213.  doi:10.11896/j.issn.1002-137X.2014.06.041
The paper focused on the risk identification of small-probability events and proposed a residual-antigen-based dynamic memory risk identification model, DMRIM. As small-probability events follow no fixed rules, DMRIM intuitively and dynamically maps the intensity and frequency of risk to the concentration of residual antigen. Using the residual antigen, DMRIM stimulates immune memory, guides antibody evolution and controls the life cycle of the detector. It breaks through the traditional memory-cell life cycle and realizes the distribution and self-generation of detectors, thereby improving the recognition of small-probability events. Simulation experiments prove that DMRIM achieves dynamic memory and recognizes small-probability events effectively, and its feasibility is verified in a practical application.
Improvement of Information Gain in Spam Filtering
ZHAI Jun-chang,QIN Yu-ping and CHE Wei-wei
Computer Science. 2014, 41 (6): 214-216.  doi:10.11896/j.issn.1002-137X.2014.06.042
The paper put forward an improved information gain for feature-word selection in spam filtering. First, a gain ratio is defined according to the probability of the feature words, and the amount of classification information carried by each feature word is amplified or weakened accordingly, thereby improving the calculation of the category conditional entropy. Finally, combined with the naive Bayes decision method of the maximum a posteriori hypothesis, experiments were carried out on an English corpus, and the algorithm was analyzed through recall, correct rate, accuracy and error rate. The experimental results show that the improved algorithm enhances classification precision and reduces user loss.
Multi-adjacent-vertexes and Multi-shortest-paths Problem of Dijkstra Algorithm
WANG Shu-xi and LI An-yu
Computer Science. 2014, 41 (6): 217-224.  doi:10.11896/j.issn.1002-137X.2014.06.043
Dijkstra's algorithm is one of the most classical algorithms for the shortest path problem. This paper presented and analyzed Dijkstra's algorithm and its pseudo-code and, to deepen understanding, listed several mistaken views about it and rectified them. The analysis shows that a vertex may have multiple predecessor vertices on a shortest path, and that there may be multiple shortest paths with the same weight; regrettably, Dijkstra's algorithm does not handle these cases. To solve these problems and improve Dijkstra's algorithm, we analyzed their causes, proposed an algorithm, gave its pseudo-code, implemented it in C, and analyzed its time complexity. Experimental results show that the improved Dijkstra algorithm effectively solves the multi-adjacent-vertex and multi-shortest-path problems.
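The fix the abstract describes — keep every predecessor that attains the same shortest distance, then enumerate all equal-weight paths — can be sketched as follows (Python here rather than the paper's C).

```python
import heapq

def dijkstra_all_preds(graph, s):
    """Dijkstra keeping *every* predecessor on an equal-weight shortest
    path. graph: {u: [(v, w), ...]} with non-negative weights w."""
    dist = {u: float("inf") for u in graph}
    preds = {u: [] for u in graph}
    dist[s] = 0
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                        # stale entry
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                preds[v] = [u]              # strictly better: reset list
                heapq.heappush(heap, (d + w, v))
            elif d + w == dist[v] and u not in preds[v]:
                preds[v].append(u)          # tie: remember this one too
    return dist, preds

def all_shortest_paths(preds, s, t):
    """Enumerate every shortest s-t path by walking the predecessor lists."""
    if t == s:
        return [[s]]
    return [p + [t] for u in preds[t] for p in all_shortest_paths(preds, s, u)]
```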
Diagnosis of Web Service Based on Bayes and Multi-faults Reasoning
JIA Zhi-chun and XING Xing
Computer Science. 2014, 41 (6): 225-230.  doi:10.11896/j.issn.1002-137X.2014.06.044
The increasing size and complexity of web service composition processes has multiplied the number of potential failures and makes it harder to ensure service reliability. To localize the faults behind undefined service behaviors in a service process, we proposed a diagnosis method based on Bayes and multi-fault logic reasoning. After modeling the service execution matrix from historical data, we combined multi-fault logic reasoning with statistical techniques to obtain diagnosis candidates. By applying Bayes' formula to compute the fault probability of each candidate, the method obtains an asymptotically optimal diagnosis as historical data accumulate. Experiments were conducted to evaluate the effectiveness of the method in diagnosing web services with multiple faults.
Improved Weighted-network Based Algorithm for Predicting Protein Complexes
ZHAO Bi-hai,XIONG Hui-jun,NI Wen-yin,LIU Zhi-bing and HU Sai
Computer Science. 2014, 41 (6): 231-234.  doi:10.11896/j.issn.1002-137X.2014.06.045
The increasing amount of protein-protein interaction (PPI) data has enabled us to predict protein complexes, but owing to the limitations of experimental conditions and techniques, PPI networks contain much noise. To reduce the negative effect of this noise on protein-complex prediction, an improved method named WPC (Weighted-network based method for Predicting protein Complexes) was proposed. Given a selected node, the candidate set consists of all its neighbors, and the neighbor set consists of the neighbors of all nodes in the candidate set. If the weighted ratio of a node between the candidate set and the neighbor set falls below a threshold, the node is removed from the candidate set; after repeating this for all nodes in the candidate set, the candidate set is reported as a complex. For a node not included in any complex, if its average weighted degree within a complex exceeds a self-adjusting threshold, WPC adds the node to that complex. A comprehensive comparison between competitive algorithms and WPC shows that WPC outperforms the state-of-the-art methods.
Cicada Sing Optimization: A New Evolutionary Algorithm Based on Bionics
HE Yi-chao,LI Ning and LI Wen-bin
Computer Science. 2014, 41 (6): 235-238.  doi:10.11896/j.issn.1002-137X.2014.06.046
Inspired by the synchronized singing of cicadas and their life habits, this paper proposed a novel bionic optimization algorithm: Cicada Sing Optimization (CSO). The paper analyzed the algorithm and showed that, besides the characteristics of general evolutionary algorithms, it possesses two additional special characteristics, and proved its asymptotic convergence using Markov-chain theory. Comparing simulation results of CSO, PSO and DE on 9 high-dimensional benchmark functions shows that CSO is an evolutionary algorithm well suited to numerical optimization problems.
Actor-Critic Algorithm Based on Tile Coding and Model Learning
JIN Yu-jing,ZHU Wen-wen,FU Yu-chen and LIU Quan
Computer Science. 2014, 41 (6): 239-242.  doi:10.11896/j.issn.1002-137X.2014.06.047
The Actor-Critic (AC) approach is a class of reinforcement learning methods with good performance and guaranteed convergence, but the agent does not learn the dynamics of the environment while learning and improving its policy, which restricts the performance of the AC method to a certain extent. In addition, the AC method needs approximate representations of the policy and the value function, so the encoding of states, actions and parameters has an important influence on it. Tile coding is simple and has low time complexity, so we combined tile coding with a model-based Actor-Critic method and applied the algorithm to reinforcement learning simulation experiments; the results show that the algorithm performs well.
Approximate String Matching Using Tail Matched q-gram
SUN De-cai and WANG Xiao-xia
Computer Science. 2014, 41 (6): 243-249.  doi:10.11896/j.issn.1002-137X.2014.06.048
Approximate string matching is a basic problem in computational biology, text retrieval, signal processing, etc., and improving matching speed remains a key issue. Here, a new lossless q-gram filter was proposed to speed up finding the true matches of a given query in a large text database. A q-gram index is employed as the index structure of the database for its fast search speed. To raise the filter's filtration efficiency, new features based on tail-matched q-grams are extracted from text that contains true matches. The filter eliminates more irrelevant text in the filtration phase through a filtration criterion strengthened by these features, and the unfiltered text is verified with the Smith-Waterman algorithm to locate the positions of all true matches. The experimental results demonstrate that the proposed filter improves both filtration efficiency and performance; the new filter algorithm is general and suitable for approximate string matching under various matching error ratios.
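For reference, the q-grams a filter of this kind indexes are simply all overlapping length-q substrings; the paper's tail-matched features build on top of this basic decomposition.

```python
def qgrams(s, q=3):
    """All overlapping q-grams of s -- the index keys of a q-gram filter."""
    return [s[i:i + q] for i in range(len(s) - q + 1)]

# A lossless filter only discards windows that cannot meet the q-gram
# count threshold; the survivors are verified (e.g. with Smith-Waterman).
print(qgrams("ACGTACGT"))   # ['ACG', 'CGT', 'GTA', 'TAC', 'ACG', 'CGT']
```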
Research on Database Log Based on Petri Nets
JING Bo,LIU Ying and CHEN Geng
Computer Science. 2014, 41 (6): 250-253.  doi:10.11896/j.issn.1002-137X.2014.06.049
In order to improve audit efficiency in complex ERP systems, a method was proposed for quickly discovering non-compliant business processes in the system through the database log. The method utilizes the structural relationships already present in the database to determine the order of operations in the logged workflow, converts the database log with the α-algorithm of Petri-net-based process mining, and compares the differences between the reconstructed real business flow and the compliant business flow, so that auditors can judge anomalous business processes more intuitively and accurately.
Improved Artificial Bee Colony Algorithms for Multi-objective Continuous Optimization Problem
GE Yu,LIANG Jing,WANG Xue-ping and XIE Xiao-chuan
Computer Science. 2014, 41 (6): 254-259.  doi:10.11896/j.issn.1002-137X.2014.06.050
To solve multi-objective continuous optimization problems, this paper described the solving process according to artificial bee colony (ABC) algorithm theory and pointed out that the updating strategy of the algorithm searches blindly and loses good individuals; it therefore proposed an improved strategy with two parts. First, a self-adapting search operator is designed so that the algorithm adjusts its search range automatically according to individual quality during the iterations, making the search more accurate and efficient. Second, newly produced individuals are recorded in an external archive, which is merged back to reconstruct the colony after each iteration, preserving good individuals found during the process. Experiments compare the improved artificial bee colony algorithm with the NSGA-II algorithm, the basic artificial bee colony algorithm, and similar state-of-the-art algorithms in the literature; the results indicate that the improved algorithm has good convergence and uniformity in solving multi-objective continuous optimization problems.
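For orientation, the basic ABC neighbour-search update that the improved strategy replaces is v_ij = x_ij + φ·(x_ij − x_kj) with φ uniform in [−1, 1]; a minimal sketch follows (the paper's self-adapting operator is not reproduced).

```python
import random

def abc_neighbor(x, colony):
    """Basic ABC neighbour search: v_ij = x_ij + phi * (x_ij - x_kj).

    x      -- the current food source (a list of floats)
    colony -- all food sources; a random other source k is picked.
    The paper's self-adapting operator would shrink or widen this step
    according to individual quality instead of using a plain uniform phi.
    """
    k = random.choice([c for c in colony if c is not x])
    j = random.randrange(len(x))
    v = list(x)
    v[j] = x[j] + random.uniform(-1.0, 1.0) * (x[j] - k[j])
    return v
```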
Design and Realization of RBF Neural Network Classifier Based on Advanced Self-adaptive Clustering Algorithm
HAO Xiao-li and ZHANG Jing
Computer Science. 2014, 41 (6): 260-263.  doi:10.11896/j.issn.1002-137X.2014.06.051
Abstract PDF(341KB) ( 552 )   
References | Related Articles | Metrics
Owing to the lower classification precision and longer training time of the Radial Basis Function (RBF) neural network classifier,a new self-adaptive clustering algorithm was first proposed,which can be applied to the construction of nodes in the hidden layer.The new algorithm optimizes the initial cluster centers by choosing good samples based on silhouette coefficients.It not only avoids the sensitivity to initial centers in traditional k-means,but also avoids classification deviation.Secondly,the new algorithm was introduced into the design of an RBF classifier,and can determine the centers of the radial basis functions and their widths efficiently.Finally,through extensive tests and simulations,the new clustering algorithm was shown to be superior in clustering time,silhouette coefficient and accuracy rate,and the RBF classifier based on the improved algorithm was proved to have higher precision.
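A minimal sketch of silhouette-based seeding, assuming a provisional k-means pass and scikit-learn's silhouette_samples; the chosen samples would then serve as initial RBF hidden-layer centers. The paper's own seeding rule may differ in detail.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_samples

def silhouette_seeded_centers(X, k, seed=0):
    """Pick one high-silhouette sample per provisional cluster as an initial
    RBF hidden-layer center. X: (n_samples, n_features) ndarray."""
    provisional = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X)
    sil = silhouette_samples(X, provisional.labels_)
    centers = []
    for c in range(k):
        members = provisional.labels_ == c
        centers.append(X[members][np.argmax(sil[members])])
    return np.array(centers)
```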
Further Research on Collaborative Filtering Algorithm for Sparse Data
LUO Qi,MIAO Xin-jie and WEI Qian
Computer Science. 2014, 41 (6): 264-268.  doi:10.11896/j.issn.1002-137X.2014.06.052
Abstract PDF(415KB) ( 402 )   
References | Related Articles | Metrics
In electronic commerce and information systems,collaborative filtering is a very important technique,and a sound user similarity measure is crucial.In order to obtain better accuracy,the number of common ratings between users was used to dynamically adjust the original similarity so that it reflects the true similarity between users more accurately.On this basis,following the ideas of the social network FTL (follow-the-leader) model,for new users or users who cannot find nearest neighbors,a prediction algorithm based on expert trust degree was used instead of similarity to predict the user's score,making up for the deficiency of the traditional algorithm.Experiments show that the algorithm can improve the predicted score,the accuracy and the quality of recommendation,and alleviate the cold-start problem for new users.
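The dynamic adjustment can be illustrated with a standard significance-weighting form: Pearson similarity shrunk by the co-rating count, with gamma an assumed threshold (the paper's exact adjustment function is not given in the abstract).

```python
import math

def adjusted_similarity(ratings_u, ratings_v, gamma=30):
    """Pearson similarity damped by the number of co-rated items.

    ratings_u, ratings_v: dicts mapping item -> rating. With fewer than
    gamma common ratings, the raw similarity is shrunk proportionally."""
    common = set(ratings_u) & set(ratings_v)
    if len(common) < 2:
        return 0.0
    mu_u = sum(ratings_u[i] for i in common) / len(common)
    mu_v = sum(ratings_v[i] for i in common) / len(common)
    num = sum((ratings_u[i] - mu_u) * (ratings_v[i] - mu_v) for i in common)
    den = math.sqrt(sum((ratings_u[i] - mu_u) ** 2 for i in common) *
                    sum((ratings_v[i] - mu_v) ** 2 for i in common))
    sim = num / den if den else 0.0
    return sim * min(len(common), gamma) / gamma
```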
Vehicle Matching Based on Quaternion Visual Attention Model
XU Hang,ZANG Di,CHENG Cheng and ZHANG Ya-ying
Computer Science. 2014, 41 (6): 269-274.  doi:10.11896/j.issn.1002-137X.2014.06.053
Abstract PDF(1286KB) ( 380 )   
References | Related Articles | Metrics
Searching for and matching the vehicle that caused a traffic accident is a challenging problem of great importance in Intelligent Transport Systems (ITS).Compared with traditional methods,the visual attention model provides a new perspective by merging several fundamental features.This paper presented a new method to extract color information from an image for the vehicle matching problem.By combining it with intensity and orientations under the framework of quaternions,a quaternion visual attention model was proposed.Experiments use the quaternion saliency as a new feature to match the vehicle that caused the accident,and the results demonstrate that the proposed approach is able to narrow the searching range effectively with good accuracy and robustness.
Feature Extraction Based on 2D Local Discriminative Gaussians
ZHANG Zhi-bin,ZHU Jun-yong,ZHENG Wei-shi,WANG Qian and LAI Jian-huang
Computer Science. 2014, 41 (6): 275-277.  doi:10.11896/j.issn.1002-137X.2014.06.054
Abstract PDF(265KB) ( 431 )   
References | Related Articles | Metrics
Feature extraction plays an important role in face recognition.In general,feature extraction methods need to transform 2D images into 1D vectors.As a result,it is hard to calculate the covariance matrix and its eigenvectors efficiently and exactly due to the high dimensionality.This paper proposed a new feature extraction method named 2D local discriminative Gaussian (2D-LDG).It inherits the properties of LDG,and the objective function of the proposed method is likewise an approximation to the leave-one-out training error of a local quadratic discriminant analysis classifier.It also applies local Gaussians to estimate the probability at each point,relaxing the assumption on the class probability density function.Meanwhile,2D-LDG operates on 2D images directly,which avoids turning the image matrices into high-dimensional vectors and allows calculating the covariance matrix and eigenvectors in a more efficient and accurate way.Experiments on ORL and YaleB-Extended show that the proposed 2D-LDG feature extraction method achieves better performance in face recognition.
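The 2D trick that keeps the covariance tractable is the same one used by 2DPCA: accumulate an image-level covariance directly from the image matrices. A sketch follows; the eigenvectors of the resulting w×w matrix give the projection directions, while the LDG objective itself is not reproduced here.

```python
import numpy as np

def image_covariance_2d(images):
    """2DPCA-style image covariance computed directly on 2D matrices.

    images: ndarray of shape (N, h, w). Returns the w x w matrix
    G = (1/N) * sum_i (A_i - mean)^T (A_i - mean)."""
    A = np.asarray(images, dtype=float)
    centered = A - A.mean(axis=0)
    return np.einsum('nhw,nhv->wv', centered, centered) / len(A)

# usage sketch: the leading eigenvectors give the 2D projection axes
# G = image_covariance_2d(train_images)
# eigvals, eigvecs = np.linalg.eigh(G)   # last columns = largest eigenvalues
```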
Image Quality Assessment of Panoramic Image
CHANG Jia-yi,QIN Rui,LI Qing and CHEN Da-peng
Computer Science. 2014, 41 (6): 278-281.  doi:10.11896/j.issn.1002-137X.2014.06.055
Abstract PDF(1247KB) ( 443 )   
References | Related Articles | Metrics
Because there is no effective quality evaluation method for bird's-eye panoramic image stitching,a novel metric was proposed,based on the sampling distribution from the final image back to the original images.Around this metric,a framework for image quality assessment integrated with the sampling distribution was elaborated.Firstly,a general algorithm for bird's-eye panoramic imaging is briefly described.Then the metric based on the sampling distribution is proposed.Finally,it is compared against other metrics such as image variance,entropy and average gradient.Comprehensive experiments demonstrate that our method is effective and robust.
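One way to make the per-pixel sampling distribution concrete, assuming the stitching pipeline exposes the destination-to-source mapping: estimate the source-image area covered by each output pixel from a numerical Jacobian. The inverse_map function is hypothetical, and the statistic the paper aggregates over these densities is not reproduced.

```python
import numpy as np

def sampling_density(inverse_map, x, y, eps=0.5):
    """Local sampling density of a stitched pixel: the source-image area
    covered by one destination pixel, via a numerical Jacobian of the
    destination->source mapping inverse_map(x, y) -> (xs, ys)."""
    x0, y0 = inverse_map(x, y)
    x1, y1 = inverse_map(x + eps, y)
    x2, y2 = inverse_map(x, y + eps)
    J = np.array([[(x1 - x0) / eps, (x2 - x0) / eps],
                  [(y1 - y0) / eps, (y2 - y0) / eps]])
    return abs(np.linalg.det(J))
```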
Twofold Adjusted Threshold SIFT
WEI Bao-guo and ZHANG Hai-xi
Computer Science. 2014, 41 (6): 282-286.  doi:10.11896/j.issn.1002-137X.2014.06.056
Abstract PDF(878KB) ( 371 )   
References | Related Articles | Metrics
SIFT has been widely used for its good performance in feature extraction and matching.However,its results under insufficient illumination and blur are not satisfactory.We proposed an adaptive threshold selection method based on global and local information.First,an initial threshold is obtained according to image contrast,so it adapts to insufficient illumination and image blur.Second,in order to control the number and distribution of feature points,the threshold is adjusted a second time according to the feature point distribution.Finally,the mismatch removal method is also improved.Experimental results show that the improved SIFT algorithm not only adapts well to low light and image blur,but also can adjust the number of feature points and reduce clustering effects.
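A hedged sketch of the first, global stage: scale SIFT's default DoG contrast threshold by the image's RMS contrast so that dark or blurred images keep enough keypoints. Both constants are illustrative assumptions, not the paper's formula.

```python
import numpy as np

def initial_contrast_threshold(gray, base=0.04, typical_contrast=0.15):
    """Global stage of the twofold adjustment: scale SIFT's usual DoG
    contrast threshold (base ~ 0.04) by the image's RMS contrast relative
    to an assumed 'typical' value, lowering it for dark/blurred images."""
    g = np.asarray(gray, dtype=float) / 255.0
    return base * g.std() / typical_contrast
```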
Gradient Structural Similarity Image Assessment Index Based on Dilation
SANG Qing-bing,LIANG Di-lin,WU Xiao-jun and LI Chao-feng
Computer Science. 2014, 41 (6): 287-290.  doi:10.11896/j.issn.1002-137X.2014.06.057
Abstract PDF(869KB) ( 387 )   
References | Related Articles | Metrics
The traditional gradient structural similarity algorithm (GSSIM) simply takes the average of each sub-block's GSSIM index as the quality score of the whole image,ignoring the fact that human visual sensitivity differs across image regions.An approach of weighted gradient structural similarity based on dilation and image block classification was therefore proposed for image quality assessment.In the new method,the distorted image is first divided into two regions:an edge-dilation region and a smooth region.Then the distorted image is divided into 8×8 image blocks,which are classified into edge-dilation blocks and smooth blocks according to the region they fall in.The GSSIM index is given different weights for the different block types,and the whole-image quality is computed as the weighted GSSIM index.Experimental results on three simulated databases show that the proposed metric is more reasonable and stable than other methods.It obtains high correlation with subjective quality evaluations at low computational cost,and is more consistent with the human visual system.
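The block classification step could look like the following, assuming a Canny edge map dilated with a 3×3 element and illustrative weight values; the whole-image score is then sum(w_b * GSSIM_b) / sum(w_b).

```python
import cv2
import numpy as np

def block_weights(reference, block=8, edge_weight=0.8, smooth_weight=0.2):
    """Classify block x block tiles as edge-dilation or smooth and assign
    weights. reference: uint8 grayscale image; thresholds and weights are
    assumed values, not the paper's."""
    edges = cv2.Canny(reference, 100, 200)
    dilated = cv2.dilate(edges, np.ones((3, 3), np.uint8))
    h, w = dilated.shape
    weights = np.empty((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            tile = dilated[i * block:(i + 1) * block,
                           j * block:(j + 1) * block]
            weights[i, j] = edge_weight if tile.any() else smooth_weight
    return weights
```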
CamShift Moving Object Tracking Algorithm Based on SIFT Feature Points Matching
MA Zheng-hua,GU Su-hang and RONG Hai-long
Computer Science. 2014, 41 (6): 291-294.  doi:10.11896/j.issn.1002-137X.2014.06.058
Abstract PDF(652KB) ( 506 )   
References | Related Articles | Metrics
A new algorithm integrating SIFT feature point matching into the CamShift algorithm was proposed,aiming at object tracking that is prone to failure with the general CamShift algorithm under complex backgrounds.The algorithm uses SIFT features to realize precise matching across the continuous image sequence independently of scale and direction,and it is partially invariant to object scaling,translation and illumination change.Not only does it compensate for the general CamShift's reliance on color as the key information,but it also keeps the displacement between the centroid and the center of the object tracking window stable within a threshold.Finally,the effectiveness and stability of the algorithm were verified via comparative experiments.The experimental results show that the new algorithm can track objects stably against illumination mutations,scaling and rotation under complex backgrounds.
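With OpenCV, the hybrid can be sketched as a CamShift step on a hue back-projection plus a SIFT re-localization invoked when tracking drifts; the ratio-test threshold and the drift test that triggers re-localization are assumptions (OpenCV >= 4.4 for SIFT_create).

```python
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def camshift_step(hsv_frame, hue_hist, window):
    """One CamShift iteration on the hue back-projection of the target
    histogram (hue_hist from cv2.calcHist on the initial target region)."""
    back = cv2.calcBackProject([hsv_frame], [0], hue_hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    rotated_box, window = cv2.CamShift(back, window, criteria)
    return rotated_box, window

def relocate_with_sift(template, frame, ratio=0.75):
    """Re-find the target by SIFT ratio-test matching when tracking drifts;
    returns a bounding rect around the matched keypoints, or None."""
    k1, d1 = sift.detectAndCompute(template, None)
    k2, d2 = sift.detectAndCompute(frame, None)
    if d1 is None or d2 is None:
        return None
    good = []
    for pair in matcher.knnMatch(d1, d2, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    if len(good) < 4:
        return None
    pts = np.float32([k2[m.trainIdx].pt for m in good])
    return cv2.boundingRect(pts)
```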
Extraction of Nighttime Images with Time Frequency Weighted and Constrained Optimization Evolutionary Algorithm
LIU Shu-qin and PENG Jin-ye
Computer Science. 2014, 41 (6): 295-298.  doi:10.11896/j.issn.1002-137X.2014.06.059
Abstract PDF(776KB) ( 395 )   
References | Related Articles | Metrics
The extraction of nighttime images with a time-frequency weighted and constrained optimization evolutionary algorithm was studied.Processing multi-frame nighttime images in the time domain and frequency domain is an important approach to information extraction.Traditional processing treats low-quality nighttime images with a separate time-domain retrieval method,so the information of the whole domain cannot be used.An extraction method combining time-frequency weighting with a constrained optimization evolutionary algorithm was therefore proposed.Firstly,the nighttime images are processed in the time domain and frequency domain simultaneously.Then the multi-frame images are weighted and a new feature is formed.On this basis,the image quality is improved iteratively through the constrained optimization evolutionary algorithm until a better result is achieved.A set of nighttime images was used to test the method.The results show that the information of both the time domain and the frequency domain is exploited,and the image information is extracted with good performance.The algorithm has good value for image information extraction.
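A minimal sketch of joint time/frequency weighting over a frame stack: time-domain weights from frame brightness, frequency-domain weights from high-frequency spectral energy, blended with an assumed factor alpha. The paper's constrained evolutionary refinement loop is not reproduced.

```python
import numpy as np

def time_frequency_fusion(frames, alpha=0.5):
    """Fuse a stack of nighttime frames with joint time/frequency weights.

    Time-domain weight: mean frame brightness; frequency-domain weight:
    high-frequency spectral energy. Both definitions and the blend factor
    alpha are illustrative assumptions."""
    frames = np.asarray(frames, dtype=float)            # shape (N, h, w)
    t_w = frames.mean(axis=(1, 2))
    spectra = np.fft.fftshift(np.fft.fft2(frames, axes=(1, 2)), axes=(1, 2))
    h, w = frames.shape[1:]
    mask = np.ones((h, w), dtype=bool)
    mask[h // 4:3 * h // 4, w // 4:3 * w // 4] = False  # drop low-freq center
    f_w = np.abs(spectra)[:, mask].sum(axis=1)
    weights = alpha * t_w / t_w.sum() + (1 - alpha) * f_w / f_w.sum()
    return np.tensordot(weights, frames, axes=(0, 0))   # weighted average
```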
Enhancement Algorithm for Structured-light Stripe Image Based on Orientation and Frequency Fields Constraint
ZHENG Hong-bo,XU Ling-ling,DU Yi-cheng,QIN Xu-jia and CHEN Song-hui
Computer Science. 2014, 41 (6): 299-303.  doi:10.11896/j.issn.1002-137X.2014.06.060
Abstract PDF(878KB) ( 511 )   
References | Related Articles | Metrics
Stripe image enhancement has important applications in fingerprint image processing and recognition and in structured-light 3D reconstruction.This paper focused on stripe image processing for structured-light 3D reconstruction,presented an enhancement algorithm for stripe images based on orientation and frequency field constraints,designed calculation methods for the image orientation and frequency fields,and constructed a Gabor filter constrained by the orientation and frequency fields.The enhancement algorithm proceeds as follows:first a Gaussian filter removes the effect of uneven illumination,then the orientation and frequency fields of the stripe image are calculated,and finally the image is filtered with the Gabor filter constrained by the orientation and frequency fields.The experimental results show that the algorithm can eliminate the effects of uneven illumination effectively and enhance the stripe information of structured-light images well.
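The orientation field and the constrained Gabor filtering can be sketched as follows, using the standard structure-tensor estimate of ridge orientation familiar from fingerprint enhancement; the block size and kernel parameters are assumed values.

```python
import cv2
import numpy as np

def orientation_field(gray, block=16):
    """Per-block stripe orientation from gradient moments: theta is
    perpendicular to the dominant gradient direction in each block."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    h, w = gray.shape
    theta = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            sx = gx[i * block:(i + 1) * block, j * block:(j + 1) * block]
            sy = gy[i * block:(i + 1) * block, j * block:(j + 1) * block]
            theta[i, j] = 0.5 * np.arctan2(2 * (sx * sy).sum(),
                                           (sx ** 2 - sy ** 2).sum()) + np.pi / 2
    return theta

def gabor_enhance_block(block_img, theta, freq):
    """Filter one block with a Gabor kernel tuned to the local orientation
    theta (radians) and frequency freq (cycles per pixel)."""
    kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                lambd=1.0 / freq, gamma=0.5, psi=0)
    return cv2.filter2D(block_img, cv2.CV_64F, kernel)
```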
Approach for 3D Medical Image Registration Based on Clifford Algebra Geometrical Invariance
HUA Liang,DING Li-jun,HUANG Yu,FENG Hao and GU Ju-ping
Computer Science. 2014, 41 (6): 304-308.  doi:10.11896/j.issn.1002-137X.2014.06.061
Abstract PDF(842KB) ( 461 )   
References | Related Articles | Metrics
Considering that 3D medical image registration suffers from huge data volumes,high computational complexity and low registration precision,this paper proposed a registration method based on Clifford algebra geometric invariants to register 3D medical images of the human skull.It derived the Clifford algebra geometric invariants and Clifford algebra equations needed for registration,constructed a Clifford geometric rotation operator suited to rotation about a geometric reference axis,and established a composite rotation operator using the corresponding Clifford geometric invariants obtained from the maximum and minimum values.To realize registration,a geometric transformation is applied to the floating image data.Two well-known 3D medical datasets were used in the registration experiments.The experimental results indicate that this method has the advantages of simple calculation,intuitive geometric meaning,high registration precision and high execution efficiency,and it does not easily fall into local extrema during registration.
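The rotation operator itself is easy to sketch: a Clifford rotor R = cos(θ/2) − sin(θ/2)B acting by the sandwich product, implemented here through the isomorphic unit quaternion; the paper's invariant extraction and composite-operator construction are not reproduced.

```python
import numpy as np

def rotor_rotate(points, axis, angle):
    """Rotate 3D points (rows) about a unit axis by angle radians, via the
    rotation matrix of the unit quaternion (w, x, y, z) that represents the
    Clifford rotor's sandwich product p' = R p R^{-1}."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    half = angle / 2.0
    w, (x, y, z) = np.cos(half), np.sin(half) * axis
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)]])
    return points @ R.T
```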
Hyperspectral Band Selection Method Based on Conjugate Class Separability and Grey Decision
ZHANG Hai-tao,WANG He-qiao,MENG Xiang-yu and WU Wen-bo
Computer Science. 2014, 41 (6): 309-313.  doi:10.11896/j.issn.1002-137X.2014.06.062
Abstract PDF(878KB) ( 428 )   
References | Related Articles | Metrics
As researchers' demands on the quality of spectral information in hyperspectral images gradually increase,the characteristics of hyperspectral images impede further information extraction.Existing single-index band selection methods cannot jointly consider the criteria of information content,correlation and class separability,and their results are inevitably restricted by the particular index measured.Grey system theory,which takes small-sample,poor-information and uncertain systems as its research subjects,can perform grey incidence decision on the basis of subspace partitioning,overcoming the independence and incompatibility of single-index measures.Therefore,aiming at the growing demand for ensuring the separability of conjugate classes,this paper put forward a band selection method that synthesizes the results of single-index band selection methods through grey incidence decision.Finally,an experiment was carried out and the method was compared with common fusion methods.
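Deng's grey incidence degree, the core quantity in such a grey incidence decision, has a compact form; the sketch below computes it between one band's single-index score sequence and an ideal reference sequence (normalization to a common scale is assumed to have been done, and Δmin/Δmax are taken over this one pair for brevity).

```python
import numpy as np

def grey_incidence(reference, candidate, rho=0.5):
    """Deng's grey incidence degree between a reference index sequence and
    one band's score sequence; rho is the usual distinguishing coefficient."""
    diff = np.abs(np.asarray(reference, float) - np.asarray(candidate, float))
    if diff.max() == 0:                      # identical sequences
        return 1.0
    coeff = (diff.min() + rho * diff.max()) / (diff + rho * diff.max())
    return float(coeff.mean())
```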
Face Automatic Recognition Algorithm for Small Sample High-dimensional Features
LI Ling and LI Gui-juan
Computer Science. 2014, 41 (6): 314-316.  doi:10.11896/j.issn.1002-137X.2014.06.063
Abstract PDF(275KB) ( 360 )   
References | Related Articles | Metrics
In face recognition,an efficient feature extraction method is the key.Canonical correlation analysis (CCA) is a classic feature extraction method,but traditional CCA fails when the covariance matrices of its two groups of features become singular under the small-sample,high-dimensional setting.Moreover,because it is globally linear in nature,it cannot capture local variations in the face image well.It therefore suffers from poor prediction accuracy and stability.To improve the prediction accuracy of the face recognition model,a novel face recognition method based on sub-pattern CCA (SpCCA) was proposed.By exploiting the correlation between global features and local features,redundant information between the features is eliminated,while the global and local information are fused effectively.Finally,SpCCA was applied to the AR and Yale datasets and proved to have significantly better recognition accuracy and higher stability than the reference models.The results show that SpCCA can avoid the small-sample and nonlinearity problems with the assistance of sub-patterns.
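The sub-pattern idea can be made concrete with a simple partition: each face image is cut into non-overlapping blocks whose vectorized forms feed CCA, so each covariance matrix stays small and non-singular under small-sample conditions. The 4×4 partition below is an assumed choice.

```python
import numpy as np

def sub_patterns(image, rows=4, cols=4):
    """Split a face image (2D ndarray) into non-overlapping sub-patterns
    and vectorize each; CCA is then applied per sub-pattern (e.g. between
    a sub-pattern and the global feature) on these short vectors."""
    h, w = image.shape
    bh, bw = h // rows, w // cols
    return [image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw].ravel()
            for i in range(rows) for j in range(cols)]
```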
Method of Real Time Recognition of Ground Traffic Signs Based on Image Matching
WANG Peng-fei,LIU Hong-zhe,YUAN Jia-zheng and CHEN Li
Computer Science. 2014, 41 (6): 317-325.  doi:10.11896/j.issn.1002-137X.2014.06.064
Abstract PDF(1722KB) ( 544 )   
References | Related Articles | Metrics
Ground traffic sign detection and recognition are a hotspot of intelligent driving,with timeliness and accuracy as the key requirements,and image matching is a commonly used technique for detection and identification.This article introduced a method for ground traffic sign detection and recognition that combines image matching with prior knowledge.The algorithm consists of two stages:preprocessing,and detection and identification.The preprocessing stage includes image compression,region-of-interest extraction,morphological processing,median filtering and inverse perspective transformation,which reduce image noise and rectify the image in preparation for detection and identification.The detection and identification stage includes contour extraction,area filtering and image matching,in order to determine whether the test image contains ground traffic signs and,if so,of which types.Experimental results show that the algorithm performs well.
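An OpenCV sketch of the two stages as the abstract lists them; the ROI choice (lower half of the frame), Otsu binarization, the 3×3 opening, the homography H and the area threshold are all illustrative assumptions.

```python
import cv2
import numpy as np

def preprocess_road_image(bgr, H):
    """Preprocessing stage: ROI crop, grayscale, median filtering,
    morphological opening, and inverse perspective rectification.
    H is an assumed precomputed 3x3 road-to-image homography; its inverse
    warps the image to a bird's-eye view."""
    roi = bgr[bgr.shape[0] // 2:, :]          # lower half: road surface
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    denoised = cv2.medianBlur(gray, 5)
    binary = cv2.threshold(denoised, 0, 255,
                           cv2.THRESH_BINARY | cv2.THRESH_OTSU)[1]
    cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN,
                               np.ones((3, 3), np.uint8))
    return cv2.warpPerspective(cleaned, np.linalg.inv(H),
                               (roi.shape[1], roi.shape[0]))

def candidate_sign_contours(birdseye, min_area=500):
    """Detection stage: contours filtered by area before template matching
    decides the sign type."""
    contours, _ = cv2.findContours(birdseye, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```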