Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41, Issue 1 (January 2014)
Evolution of Ada Programming Language
WU Di and XU Bao-wen
Computer Science. 2014, 41 (1): 1-15. 
The Ada programming language, born in 1979, was designated as a US military standard in 1980 and officially established as an ISO standard and put into use in 1983. Ada's original design goals were reliability, maintainability, readability and efficiency. With its robust features, good reliability and the excellent software engineering ideas it embodies, Ada exerted great influence on the development of programming languages from the 1980s to the 1990s. Ada is widely applied to develop high-integrity, long-lived large software systems, and it plays a dominant role in key software for areas such as the military, commerce, public transportation and finance. Many software systems in Europe and America, including national defense and air traffic control systems, transportation systems and bank security systems, are developed in Ada. In the past 30 years, four standards of the language (Ada 83, Ada 95, Ada 2005 and Ada 2012) have been published as international standards by ISO, and each standard maintains good compatibility with its predecessor. This paper presents a comprehensive introduction and analysis of Ada's evolution from the perspectives of language mechanisms, applications and influence.
Survey of Thread Level Speculation on Multi-core Platform
GUO Hui,WANG Qiong,SHEN Li and WANG Zhi-ying
Computer Science. 2014, 41 (1): 16-21. 
The development of multi-core architectures enables researchers to exploit coarse-grained parallelism in a speculative mode. Thread level speculation (TLS) is the most representative of current speculative parallelization techniques. The most attractive advantage of TLS is its simple programming model, in which programmers only need to mark the code that can be executed speculatively while the runtime system or hardware is responsible for correctness. This paper analyzed the existing TLS techniques and summarized the challenges TLS confronts and its future development trend. The main contributions are: 1) we presented a novel taxonomy of TLS techniques based on the life cycle of speculative variables and compared their advantages and disadvantages; 2) we summarized the design space of multi-core platforms supporting TLS on the basis of the life cycle of speculative variables and proposed several ways to explore it; 3) we pointed out the challenges TLS confronts and its future development trend.
Survey on Data Deduplication Techniques for Storage Systems
XIE Ping
Computer Science. 2014, 41 (1): 22-30. 
With the ever-increasing data volume in enterprises, the need for massive storage capacity has become a grand challenge in data centers, and research shows that about 60% of the data in storage systems is redundant. The problem of high redundancy in data storage systems has therefore received much attention from researchers. By exploiting CPU resources to compare unique fingerprints of data blocks, data deduplication techniques can efficiently accomplish data reduction in storage systems, and they have thus become a hot topic in both industry and academia. Based on a thorough analysis and summary of the literature on data deduplication published in the past ten years, this paper first presented the principles, implementation mechanisms and evaluation methodologies of representative data deduplication systems after analyzing the volume-level deduplication system architecture. Second, we focused on existing deduplication optimization techniques, considering both the characteristics of the data and the scale of the deduplication system. Finally, by comparatively analyzing various application scenarios of deduplication systems, four new research directions were given: primary-storage-level deduplication approaches, distributed deduplication schemes for clustered storage systems, highly efficient fingerprint searching techniques, and intelligent data detection techniques.
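To make the fingerprint-based data reduction idea above concrete, here is a minimal sketch (fixed-size blocks and SHA-1 fingerprints; the surveyed systems use more elaborate chunking and indexing):

```python
import hashlib

def deduplicate(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and keep each unique block once,
    keyed by its SHA-1 fingerprint; the recipe lists the fingerprints
    needed to reconstruct the original byte stream."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        fp = hashlib.sha1(block).hexdigest()
        store.setdefault(fp, block)   # a duplicate block is stored only once
        recipe.append(fp)
    return store, recipe

def reconstruct(store: dict, recipe: list) -> bytes:
    return b"".join(store[fp] for fp in recipe)

data = b"abcd" * 4096                 # highly redundant input
store, recipe = deduplicate(data)
assert reconstruct(store, recipe) == data
print(len(store), "unique blocks for", len(recipe), "logical blocks")
```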
Parallel Computation Performance Analysis Model Based on GPU
WANG Zhuo-wei,CHENG Liang-lun and ZHAO Wu-qing
Computer Science. 2014, 41 (1): 31-38. 
To address the lack of an accurate performance analysis model in the field of GPU-based parallel computation, we proposed a quantitative performance model which simulates three major components of GPU performance: the instruction pipeline, shared memory access time, and global memory access time. The model is designed to help programmers find performance bottlenecks and improve system performance efficiently. To demonstrate the usefulness of the model and to optimize algorithm performance, we analyzed three representative real-world programs: dense matrix multiplication, a tridiagonal systems solver, and sparse matrix-vector multiplication.
Research of Method for Virtualizing 64-bit Windows Application Binary Interface
HUANG Cong-hui,CHEN Jing,GONG Shui-qing and CHEN Ming-hua
Computer Science. 2014, 41 (1): 39-42. 
Aiming at the problem of virtualizing the 64-bit Windows ABI on the Linux platform, the differences between the Windows and Linux x86-64 ABIs were analyzed, and three key issues in achieving 64-bit Windows ABI virtualization were identified and studied: how to load and link Windows programs, how to emulate the library interface, and how to emulate system calls. On this basis, two solutions for 64-bit Windows ABI virtualization, in the user space and the kernel space of the Linux platform respectively, were analyzed, and an operating system named KgdLinux, compatible with Win64 applications and based on the Linux user-space solution, was implemented. Experimental results show that the method for virtualizing the 64-bit Windows ABI is feasible.
Survey on 3D Reconstruction from Single Line Drawings
ZHENG Jin-xin,TANG Zhi and WANG Yong-tao
Computer Science. 2014, 41 (1): 43-47. 
Reconstructing the three-dimensional structure of an object from a single line drawing is an important issue in the field of machine vision. Its applications include three-dimensional design and hand-drawing-based creation, conversion of existing engineering drawings into solid models, 3D reconstruction from single natural images, and image understanding and retrieval. This paper presented a taxonomy of the existing methods and reviewed them according to the types of algorithms they use. Finally, we summarized the achievements and drawbacks of the existing methods for 3D reconstruction from single line drawings and pointed out some future research directions.
Survey of Influence Analysis for Social Networks
DING Zhao-yun,JIA Yan,ZHOU Bin and TANG Fu
Computer Science. 2014, 41 (1): 48-53. 
The Internet has gradually evolved into a ubiquitous computing and information dissemination platform. The emergence and rapid development of online social networking sites, microblogs, blogs, forums, wikis and social networking applications have profoundly changed the way humans use the Internet: from simple information search and Web browsing to the construction and maintenance of online social relations, and to information creation, exchange and sharing based on those relations. Influence arises from the interaction between individuals in a social network; it mainly depends on the strength of the relationships between individuals, the network distance between them, temporal factors, and the characteristics of the network and of the individuals. Research related to influence analysis techniques includes the measurement of individual influence strength, individual influence, and the diffusion mechanisms of influence.
Iris Image Deblurring Method Based on Non-local Regularization and Reliable Region Detection
LIU Jing,SUN Zhe-nan and TAN Tie-niu
Computer Science. 2014, 41 (1): 54-58. 
In practical applications, it is inevitable that some captured iris images are out of focus or motion-blurred. Blurred iris images without clear iris texture details lead to a high false reject rate in iris recognition. In this paper, a novel image deblurring method was proposed to automatically enhance iris image quality. The proposed method makes full use of the distinct properties of iris images and applies a novel coarse-to-fine framework: the point spread function (PSF) is first initialized from a parametric model and then refined with a pixel-level model. In the optimization, non-local regularization is applied to emphasize texture patterns, and only detected reliable regions are used to guide the image restoration. Experimental results on various iris image databases illustrate that the proposed method is both effective and efficient, and outperforms state-of-the-art image deblurring methods in improving iris recognition accuracy.
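As a generic illustration of this kind of restoration objective (an assumed formulation for exposition, not the paper's exact energy), a non-locally regularized deconvolution restricted to reliable regions can be written as

\[
\min_{I}\ \big\lVert\, M \odot (K \ast I - B) \,\big\rVert_2^2 \;+\; \lambda \sum_{p} \sum_{q \in N(p)} w(p,q)\,\big(I(p) - I(q)\big)^2 ,
\]

where B is the blurred input, K the estimated PSF, M the mask of detected reliable regions, and w(p,q) non-local weights computed from patch similarity.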
Heuristic Algorithm Based on 3-layer Centrality for Influence Maximization in Social Networks
WANG Jun,YU Wei,HU Ya-hui and LI Shi-jun
Computer Science. 2014, 41 (1): 59-63. 
Influence maximization in a social network is the problem of finding a limited set of initial nodes that maximize the spread of influence. Greedy algorithms can achieve a wide spread of influence, but their cost is too high for large-scale social networks. The heuristic method based on degree centrality is simple but of low accuracy. Heuristic algorithms based on global metrics such as betweenness centrality and closeness centrality can better identify the most influential nodes, but their computational complexity is also very high. As a tradeoff between the low-accuracy degree centrality and other time-consuming measures, we defined a 3-layer local centrality for computing the potential influence of nodes. Based on the linear threshold model, we selected some seed nodes with a heuristic algorithm that picks the node with the maximal potential influence at each step, and chose the remaining seed nodes with a greedy algorithm that picks the node with the largest influence increment each time. The experimental results show that our hybrid algorithm achieves a good spread of influence with low time complexity.
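A minimal sketch of one plausible reading of the 3-layer local centrality (counting the nodes within three hops; the paper's exact per-layer weighting may differ):

```python
import networkx as nx

def three_layer_centrality(G: nx.Graph, v) -> int:
    """Number of nodes reachable from v within three hops (excluding v)."""
    return len(nx.single_source_shortest_path_length(G, v, cutoff=3)) - 1

G = nx.karate_club_graph()
ranked = sorted(G.nodes, key=lambda v: three_layer_centrality(G, v), reverse=True)
print(ranked[:5])   # heuristic seed candidates by potential influence
```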
Realization of Graph Cuts Based on Improved Push-relabel Algorithm on GPU
LI Ye,YU Shuang-yuan and LUO Si-wei
Computer Science. 2014, 41 (1): 64-68. 
Graph cuts have become an important method in computer vision and can be applied to the field of image processing. In recent years, since Nvidia launched the Compute Unified Device Architecture (CUDA), the graphics processing unit (GPU) has rapidly become a highly parallel computing processor. A parallel implementation of the push-relabel algorithm on such a high-end parallel computing platform has a high level of parallelism and improves computational efficiency; this technology has both theoretical value and a wide practical application background, and expands the range of applications of graph cuts in computer vision. This paper first presents a basic parallel implementation of the push-relabel algorithm for graph cuts on the GPU, and then introduces the GPU's texture memory to improve the speedup. Finally, experiments measure foreground-background segmentation times and compare the speedups of different configurations, showing that the optimizations enhance the performance of the implementation.
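For reference, a compact sequential (CPU) version of the push-relabel max-flow algorithm that the paper parallelizes on the GPU (a textbook sketch on an adjacency-matrix graph, not the paper's CUDA kernels):

```python
def max_flow_push_relabel(cap, s, t):
    """Textbook push-relabel max flow: repeatedly push excess along
    admissible edges (height difference exactly 1) and relabel nodes
    that still hold excess but have no admissible edge."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    height, excess = [0] * n, [0] * n
    height[s] = n
    for v in range(n):                      # saturate all source edges
        flow[s][v], flow[v][s] = cap[s][v], -cap[s][v]
        excess[v] = cap[s][v]
    active = [v for v in range(n) if v not in (s, t) and excess[v] > 0]
    while active:
        u = active[0]
        for v in range(n):                  # push while admissible edges exist
            if excess[u] == 0:
                break
            if cap[u][v] - flow[u][v] > 0 and height[u] == height[v] + 1:
                d = min(excess[u], cap[u][v] - flow[u][v])
                flow[u][v] += d; flow[v][u] -= d
                excess[u] -= d; excess[v] += d
                if v not in (s, t) and v not in active:
                    active.append(v)
        if excess[u] > 0:                   # stuck: lift the node
            height[u] = 1 + min(height[v] for v in range(n)
                                if cap[u][v] - flow[u][v] > 0)
        else:
            active.pop(0)
    return sum(flow[s])

cap = [[0, 3, 2, 0], [0, 0, 1, 2], [0, 0, 0, 3], [0, 0, 0, 0]]
print(max_flow_push_relabel(cap, 0, 3))     # 5
```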
Personalized Resource Recommendation Based on Tags and Collaborative Filtering
CAI Qiang,HAN Dong-mei,LI Hai-sheng,HU Yao-guang and CHEN Yi
Computer Science. 2014, 41 (1): 69-71. 
The traditional collaborative filtering algorithm reflects users' interest preferences and item similarity through user ratings. It ignores the characteristics of users and items, and does not perform well for sparse data and new items. In the Web 2.0 era, social tagging allows users to freely annotate resources according to their personal preferences. To solve these problems, a hybrid recommendation algorithm based on tags and collaborative filtering was proposed. The method uses tags as both user interest information and item characteristics. Using the multidimensional relationships among users, resources and tags, the algorithm generates user feature vectors and item feature vectors, and calculates user preferences for items as well as item similarity. Then, based on the user's historical behavior, the user's preference for other items is predicted. Finally, the predicted preferences are sorted and the recommendation results are generated. Experimental results show that our algorithm can effectively alleviate data sparsity, solve the cold-start problem, and enhance recommendation accuracy.
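A toy sketch of the tag-vector matching idea (cosine affinity between a user profile and an item profile built over a shared tag vocabulary; the tag names are invented for illustration):

```python
import numpy as np

def tag_profile(tag_counts: dict, vocab: list) -> np.ndarray:
    """Feature vector over the tag vocabulary from tag usage counts."""
    return np.array([tag_counts.get(t, 0) for t in vocab], dtype=float)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

vocab = ["rock", "jazz", "live", "acoustic"]
user = tag_profile({"rock": 5, "live": 2}, vocab)      # tags the user applied
item = tag_profile({"rock": 3, "acoustic": 1}, vocab)  # tags applied to the item
print(cosine(user, item))   # tag-based user-item affinity in [0, 1]
```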
Evaluating Heterosexual HIV Transmission and Interventions Based on Agent Bipartite Dynamic Weighting Scale-free Network
HE Xiao-li,BI Gui-hong and WANG Hai-rui
Computer Science. 2014, 41 (1): 72-79. 
The spread of HIV is the result of the coevolution of individual behaviors, intervention measures and social networks. This paper presented a method based on a bipartite dynamic weighting scale-free agent network for investigating heterosexual HIV transmission and the effectiveness of HIV prevention interventions. According to different high-risk behaviors, females are divided into the general population and female sex workers (FSW), and males into the general population and clients of female sex workers (CSW) in the bipartite network. A bipartite scale-free network configuration generation algorithm was constructed to describe heterosexual contact relationships. A new weighting algorithm was developed to construct the weighted network by explicitly adding sex acts to the heterosexual contact network. The dynamics of the heterosexual network are supported through the dissolution and formation of stable, casual and temporary partnerships. In the model, agents represent males and females who interact with each other within risk networks, engaging in sexual behavior over time. An agent describes individual high-risk behavior, the course of HIV disease and social structure, and also defines behavior rules for adapting to HIV prevention interventions. Exploratory simulation results are consistent with empirical studies and demonstrate the effectiveness of a combination of interventions, including voluntary counseling and HIV testing, condom promotion for safe sexual behavior, and initiating antiretroviral treatment.
Research on HMM-based Mongolian Speech Synthesis
ZHAO Jian-dong,GAO Guang-lai and BAO Fei-long
Computer Science. 2014, 41 (1): 80-82. 
HMM-based speech synthesis, as a mainstream method nowadays, has been widely applied to English, Chinese, Japanese and other languages. However, research on HMM-based Mongolian speech synthesis has remained blank. We applied the HMM-based speech synthesis method to Mongolian for the first time and carried out experiments. The evaluation of the final Mongolian speech synthesis system shows that the synthesized Mongolian speech is stable, fluent and rhythmic, and has high intelligibility; its mean opinion score is 3.80. This lays the foundation for further research on HMM-based Mongolian speech synthesis technology.
Application of Unscented Kalman Filter in Rotary Table Tennis Trajectory Prediction
ZHANG Kang-jie and WANG Qi-zhi
Computer Science. 2014, 41 (1): 83-87. 
Inaccurate ball trajectory prediction prevents a ping-pong robot from playing table tennis intelligently. To reduce the trajectory prediction error, the proposed method first analyzes the kinematic model of a flying, spinning ball, and then constructs the motion equation and observation equation of the ball's flight trajectory based on the Unscented Kalman Filter (UKF). Finally, the ball's three-dimensional position, velocity and angular velocity are estimated from the three-dimensional position information obtained by a visual observation system. Matlab simulations and real experiments show that, compared with the EKF algorithm, the UKF algorithm saves 99% of the trajectory prediction time while keeping the tracking error small.
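A typical nonlinear flight model behind such a UKF formulation (a generic form with gravity, air drag and the Magnus force; the paper's exact equations and coefficients are not reproduced here):

\[
\dot{\mathbf p} = \mathbf v, \qquad
m\,\dot{\mathbf v} = m\,\mathbf g \;-\; k_d\,\lVert\mathbf v\rVert\,\mathbf v \;+\; k_m\,\boldsymbol{\omega} \times \mathbf v, \qquad
\mathbf z_k = \mathbf p_k + \mathbf n_k ,
\]

where the state (position p, velocity v, spin ω) is propagated through the nonlinear dynamics by the UKF's sigma points, and z_k is the noisy 3-D position measured by the vision system.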
Multi-label Scene Classification Based on I2C Distance and Label Dependency
HAO Hong,JI Hua,ZHANG Hua-xiang and LIU Li
Computer Science. 2014, 41 (1): 88-90. 
Combining an improved ML-I2C with the correlation between labels, we proposed a modified multi-label scene classification method. First, SURF features are extracted from all images, and each class is represented by a feature set. Second, the improved I2C method is adopted to calculate the distance between a query image and each class, yielding a label ranking based on these distances. Finally, label correlation is used to predict labels from the ranking. Experiments show that this method achieves a higher accuracy rate in multi-label scene classification.
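A minimal sketch of the plain image-to-class (I2C) distance underlying the method (nearest-neighbor distances from each query descriptor to a class's pooled descriptor set; the paper's improved ML-I2C adds refinements not shown):

```python
import numpy as np

def i2c_distance(query_desc: np.ndarray, class_desc: np.ndarray) -> float:
    """Sum over query descriptors of the squared distance to the nearest
    descriptor in the class's pooled descriptor set."""
    d2 = ((query_desc[:, None, :] - class_desc[None, :, :]) ** 2).sum(-1)
    return float(d2.min(axis=1).sum())

rng = np.random.default_rng(0)
query = rng.normal(size=(50, 64))                     # e.g. 50 SURF descriptors
classes = {c: rng.normal(size=(500, 64)) for c in ["beach", "urban"]}
ranking = sorted(classes, key=lambda c: i2c_distance(query, classes[c]))
print(ranking)   # label ranking by I2C distance (smaller = closer)
```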
Fault Diagnosis of High-speed Rail Based on Approximate Entropy and Empirical Mode Decomposition
ZHAO Jing-jing,YANG Yan,LI Tian-rui,ZENG Jing and WEI Lai
Computer Science. 2014, 41 (1): 91-94. 
Faults of the anti-yaw damper, the lateral damper and the air spring are three common faults of high-speed rail. In view of the non-stationary and nonlinear characteristics of these faults, approximate entropy and empirical mode decomposition were introduced to extract features of high-speed rail faults, and a BP neural network was used as the fault diagnosis model. The experimental results show that the proposed method is effective. In addition, a comparison experiment indicates that fault diagnosis based on the combination of approximate entropy and empirical mode decomposition obtains better results than diagnosis based on either feature alone.
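Approximate entropy itself is standard; a compact implementation of ApEn(m, r) as used for such regularity features:

```python
import numpy as np

def approximate_entropy(x, m: int = 2, r: float = None) -> float:
    """ApEn(m, r) = phi(m) - phi(m+1): how much predictability is lost when
    the pattern length grows from m to m+1; lower means a more regular signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                    # common default tolerance
    def phi(m):
        n = len(x) - m + 1
        pat = np.array([x[i:i + m] for i in range(n)])
        dist = np.abs(pat[:, None, :] - pat[None, :, :]).max(-1)  # Chebyshev
        return np.log((dist <= r).mean(axis=1)).mean()
    return phi(m) - phi(m + 1)

sig = np.sin(np.linspace(0, 20 * np.pi, 1000))   # regular signal -> small ApEn
print(approximate_entropy(sig))
```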
Threshold Image Segmentation Based on Min-max Cut Algorithm
LIU Ya-kun,YU Shuang-yuan and LUO Si-wei
Computer Science. 2014, 41 (1): 95-99. 
In recent years, spectral clustering algorithms based on graph theory have become a new tool for image segmentation. Essentially, image segmentation is converted into an optimization problem, and the min-max cut criterion fully satisfies the requirements of a clustering criterion. In implementation, the optimization criterion is solved as an eigensystem problem, which is computationally complex: the required storage space and computing time grow as the image size increases. In this paper, when the min-max cut algorithm is implemented, the weight matrices used in evaluating the graph cuts are based on the gray levels of the image, rather than on the commonly used image pixels, to determine the segmentation threshold. Experimental results show that the resulting min-max cut segmentation algorithm is simple, real-time, and achieves automatic segmentation with good performance.
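For reference, the standard min-max cut criterion that such a segmentation optimizes (here with w_ij taken as the gray-level-based weights described above):

\[
\min_{A,B}\ \mathrm{Mcut}(A,B) = \frac{\mathrm{cut}(A,B)}{W(A,A)} + \frac{\mathrm{cut}(A,B)}{W(B,B)},
\qquad
\mathrm{cut}(A,B) = \sum_{i \in A,\, j \in B} w_{ij},
\qquad
W(A,A) = \sum_{i,j \in A} w_{ij},
\]

which minimizes the similarity between the two segments while maximizing the similarity within each of them.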
Bi-level Codebook Based Speech-driven Visual-speech Synthesis System
JIA Xi-bin,YIN Bao-cai and SUN Yan-fen
Computer Science. 2014, 41 (1): 100-104. 
The paper proposed a bi-level codebook based speech-driven visual-speech synthesis system. The system uses the vector quantization principle to establish a coarse coupling mapping from the acoustic speech feature space to the visual speech feature space. To strengthen the relationship between the speech and the visual speech, the system performs unsupervised clustering on the sample data according to the similarity of both the acoustic and the visual speech, and constructs a bi-level mapping codebook reflecting that similarity. At the preprocessing stage, the paper proposed a joint feature model that reflects the geometric character and the visibility of the teeth. The paper also proposed an approach, based on a genetic algorithm, to extract the visual-speech-correlated speech features from the LPCC and MFCC speech features. Comparison of the synthesized image sequences with the original ones shows that the synthesized sequences approximate the originals well. In future research, the constraints between visual speech contexts should be considered to improve the smoothness of the synthesis results.
Cell Optimization Algorithm for Cache Resource Allocation of CDN
FENG Xiang,MA Mei-yi and YU Hui-qun
Computer Science. 2014, 41 (1): 105-110. 
Internet bandwidth capacity expansion is lagging behind demand, making the Web a major performance bottleneck. To relieve network congestion and improve the response rate of users accessing web pages, a new cache resource distribution policy is needed. This paper investigated and developed a new bio-inspired parallel Cell Optimization Algorithm (COA) for parallel cache resource allocation in Content Delivery Networks (CDN). To simulate the functions of a cell system, the models of COA, including the nucleus, cytoplasm consistency, cell affinity, a hybrid energy function and the dynamical evolution of cells, were built biologically and mathematically. Furthermore, the parallel computing architecture and steps of COA were designed. Numerous simulations and comparisons with other classical algorithms illustrate its high efficiency, parallel distribution and effectiveness for CDN, which are especially crucial for large-scale distribution problems.
Image Local Geometric Registration Based on Improved Scale Invariant Features
SUN Tong-feng and DING Shi-fei
Computer Science. 2014, 41 (1): 111-115. 
Aiming at the problems that existing image registration methods may produce inaccurate registrations or miss some registrations, an image local geometric registration method based on improved scale invariant features was proposed. The approach improves scale invariant features by designing SI(E)FT (scale invariant edge feature transform), which constructs an edge scale space and combines scale invariant feature points with scale invariant edges. Based on the improved scale invariant features, the local transforms between two images are searched to implement local geometric registration. Experiments show that the complementary combination of SIFT points and edges provides more registration information and reduces registration errors, and that the approach is insensitive to scale, noise, deformation, illumination, etc., can register moving objects, and truly reflects the image registration status.
Formalized Representation and Reasoning of Event Action Based on Extended Description Logic and Logic Program
LIU Wei,XU Wen-jie,TANG Ying-ying,FU Jian-feng,ZHANG Xu-jie and LIU Zong-tian
Computer Science. 2014, 41 (1): 116-125. 
An event is a human knowledge unit with a larger grain than a concept, and it is closer to the human cognitive process. As an important element of an event, an action describes the process of status change of the objects in the event. Adding time to this process, and describing an action as the change of object status over time, leads to a more specific description of actions. This paper constructed an action formalism from the actions, objects and time in an event, and studied the syntax and semantics of both certain and uncertain actions. The formalism combines a description logic extended with time (T-ALC) with logic programs, thus enhancing the expressive power for actions. In this formalism, certain action forms are transformed into Datalog-rule-based forms to realize dynamic reasoning, and uncertain action forms are transformed into Datalog-rule-based forms to realize uncertainty reasoning. Finally, a case study of a bank service was described to verify the feasibility of the action formalism and its reasoning method.
Movement Drive and Control Constraints of Virtual Hand Based on Multi-curve Spectrum
XIN Qin-qin,CHEN Zhi-xiang,FENG Xiao-xia and ZHU Yue-xiu
Computer Science. 2014, 41 (1): 126-129. 
In view of the complexity of existing hand motion simulation and control, a movement and control mechanism for a virtual hand based on a multi-curve spectrum was proposed through analysis of the anatomical structure and movement characteristics of the human hand. For the movement of a single finger, flexible movement of a single knuckle can be driven by controlling the normalized contraction of a muscle. In the cooperative movement of several fingers, however, because movement promotion and restriction relationships exist between adjacent fingers, the effect of the other fingers must be considered; according to these constraint relationships, coordinated movement can be realized by controlling the corresponding muscle contractions. Experiments show that the established control mechanism can simulate well not only the movement of a single finger but also the coordinated movement of several fingers. Under the control of the multi-curve spectrum, a variety of gestures and sign language can be simulated conveniently and quickly; moreover, the movements are vivid, and the hand motions are natural and flexible.
Rail Surface Defect Detection Algorithm Based on Spatial Filtering
ZHAO Hong-wei,HUANG Ya-ping,WANG Sheng-chun and LI Qing-yong
Computer Science. 2014, 41 (1): 130-133. 
Detection of rail surface defects is important to the safety of railway transportation. Leveraging image processing and pattern recognition techniques to detect and locate defects is a viable and rapidly developing research direction. In previous work, our research group proposed a Rail Surface Defect Detection (RSDD) method. RSDD first enhances the contrast of the rail image and, on this basis, locates and detects suspicious defects. It performs well on conventional rail images, but misses some defects when a rail image contains multiple defects with large gray-value differences among them. This paper put forward an Improved Rail Surface Defect Detection (I-RSDD) method, which fills the detected defect areas with the mean gray value of the original rail image and then detects defects again in the intermediate result image. In the rebuilt contrast image, the contrast of defect areas that are not obvious in the original image is thereby enhanced, increasing the possibility that they are detected. Our experimental results demonstrate that I-RSDD has high detection performance: with no noticeable increase in total time consumption, the average detection accuracy is 90.8% and the average detection error rate is 4.0%, a substantial improvement over RSDD.
Knowledge Reasoning Based on Linguistic Truth-valued Intuitionistic Fuzzy Logic
ZOU Li,TAN Xue-wei and ZHANG Yun-xia
Computer Science. 2014, 41 (1): 134-137. 
In view of lattice implication algebra, intuitionistic fuzzy sets and knowledge representation, and based on the related properties and operations of linguistic truth-valued intuitionistic fuzzy algebra, we studied the logical properties of six-element linguistic truth-valued intuitionistic fuzzy algebra. On the basis of six-element linguistic truth-valued intuitionistic fuzzy knowledge representation, we extended the CRI method of fuzzy reasoning and discussed a six-element linguistic truth-valued intuitionistic fuzzy reasoning method, the 6LTV-CRI algorithm. The method was then validated by comparison with intuitionistic fuzzy reasoning; we established the rationality of the 6LTV-CRI algorithm and analyzed its advantages and disadvantages.
Algorithm of Mining Maximal Co-location Patterns for Fuzzy Objects
WEN Fo-sheng,XIAO Qing,WANG Li-zhen and KONG Bing
Computer Science. 2014, 41 (1): 138-145. 
A spatial co-location pattern is a group of spatial features whose instances are frequently located in the same region. There has been much work on co-location pattern mining for certain and uncertain data, but fewer maximal co-location pattern mining algorithms, especially for fuzzy objects. This paper proposed the Mevent-tree algorithm for mining maximal co-location patterns over fuzzy objects. First, it builds an event tree to obtain the candidate patterns for each object and constructs the HUT tree of the candidate patterns; it then searches the HUT tree depth-first for maximal co-location patterns, from the maximal-size candidate patterns down to size-2 candidates, pruning candidate patterns once maximal co-location patterns have been found. We further put forward two improvement strategies: pruning fuzzy objects during preprocessing and pruning co-location candidates before creating HUT trees. Finally, extensive experiments show the effectiveness and efficiency of the Mevent-tree algorithm and its improved variants.
Research of Visual Simulation Platform for Consensus Protocol of Multiagent System
XIE Guang-qiang,ZHANG Yun,LI Yang,TIAN Jian-feng and ZENG An
Computer Science. 2014, 41 (1): 146-151. 
In this study, a general visual simulation platform was designed for large-scale multiagent systems. The platform is introduced in terms of its architecture, workflow, detailed design and implementation, and a simulation and analysis on the platform are demonstrated through the evolution of a Krause consensus model. Definitions of system performance according to different properties were designed, and the platform can carry out the analysis and evaluation of system performance. By setting system parameters and loading different consensus algorithms, the platform quickly shows the evolution of the system. The main design methods, including the designs of agents, topology, tasks and consensus algorithms, are introduced. The evolution of the system can be shown graphically, and statistics on the resources consumed can be obtained.
Grid Services Composition Based on Colored Petri Nets
ZHAI Zheng-li
Computer Science. 2014, 41 (1): 152-155. 
Under the new service-oriented grid computing architecture, grid services offer a prominent paradigm for distributed computing on the Internet. Assembling new, value-added grid services out of existing ones is an emerging opportunity for both service providers and service consumers, and grid service composition has become an important research issue in the grid field. Many of the available grid services are complex and exhibit intricate session protocols, requiring that the offered operations be invoked according to specific rules. This paper addressed the following problems: (1) how to specify the complex session protocols of grid services, (2) how to construct composition rules to compose component services, and (3) how to verify a grid service composition, especially its conformance to the component services' session protocols. Petri nets provide constructions for specifying the synchronization of concurrent processes, and programming languages provide constructions for specifying and manipulating data values; colored Petri nets (CPN for short) combine the strengths of Petri nets with the expressive power of a high-level programming language. To address the above problems, we proposed a CPN-based model for specifying both the session protocols and the composition of grid services. By using the colored tokens of a CPN to model the different message and event types of a business process, we can transform service session protocols and the grid service composition process into a CPN, and then analyze and evaluate the behavioral properties and performance of the system with CPN Tools, an available tool specific to CPNs.
Gradient Based Approach to Secure Localization Algorithm in Constrained Space
WANG Ning,QIN Xiao-lin and SHEN Yao
Computer Science. 2014, 41 (1): 156-162. 
Existing wireless sensor network localization techniques cannot satisfy practical location demands: they require a great deal of calculation time and processing resources, they are interfered with by environmental factors and attacks from other users, and their positional accuracy is low. Therefore, in order to improve localization accuracy in constrained spaces, a gradient-based secure localization algorithm was proposed in this paper, in which gradients are introduced into a hyperbolic localization model. The algorithm lowers the computational complexity so that it can satisfy the environmental demands of constrained spaces and achieve high-accuracy localization. Meanwhile, to improve the security of the algorithm, its resistance to attacks was enhanced and a selective pruning process for inconsistency measurements was adopted. Finally, guided by simulation results from indoor corridor experiments, we further improved the localization accuracy by using a main-frequency strategy. The results show that the proposed algorithm outperforms existing ones not only in localization accuracy and energy consumption but also in anti-interference ability.
TBLFM: Trust Based Latent Factor Model for Social Recommendation
XING Xing,ZHANG Wei-shi and JIA Zhi-chun
Computer Science. 2014, 41 (1): 163-167. 
Online social networks have recently become major platforms with millions of registered users on the Web. The amount of information is increasing so quickly that users cannot handle the information overload without the support of recommendation methods. Traditional recommendation methods have limited performance in the context of social recommendation because they do not consider social network information such as trust. Trust-based methods attempt to introduce a trust metric into social recommendation; however, most of them rely on explicit trust statements expressed by users, which are not available in social networks such as Facebook, Twitter and Sina Weibo. This paper presented a trust metric that quantitatively measures the recommendation trust between pairs of users by aggregating implicit trust and trust propagation values. We proposed a trust-based latent factor model which incorporates the pairwise recommendation trust values into a probabilistic model for top-k item recommendation. Experiments on Sina Weibo demonstrate that our method outperforms traditional recommendation methods and trust-based methods.
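A rough SocialMF-style sketch of a trust-regularized latent factor model (a generic illustration of the idea, not necessarily the paper's exact TBLFM objective or training procedure):

```python
import numpy as np

def tblfm_step(R, T, P, Q, lr=0.01, lam=0.1, beta=0.5):
    """One training pass: SGD on observed ratings plus a trust term that
    pulls each user's factors toward the trust-weighted mean of trustees.
    R: {(user, item): rating}; T: row-normalized user-user trust matrix."""
    for (u, i), r in R.items():
        e = r - P[u] @ Q[i]                       # rating residual
        P[u] += lr * (e * Q[i] - lam * P[u])
        Q[i] += lr * (e * P[u] - lam * Q[i])
    P -= lr * beta * (P - T @ P)                  # trust regularization
    return P, Q

rng = np.random.default_rng(0)
P, Q = 0.1 * rng.normal(size=(4, 3)), 0.1 * rng.normal(size=(5, 3))
T = np.eye(4)                                     # trivial trust for the demo
R = {(0, 1): 5.0, (2, 3): 3.0}
for _ in range(200):
    P, Q = tblfm_step(R, T, P, Q)
print(P[0] @ Q[1])   # approaches the observed rating 5.0
```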
Service Overlay Networks Construction Algorithm Based on Node Potential Oriented Multi-nexthop Routing Protocol
YU Jing,ZHANG Jian-hui and WANG Bin-qiang
Computer Science. 2014, 41 (1): 168-171. 
Service overlay networks are overlay networks constructed to satisfy end-to-end QoS; they are the most efficient network architecture for supplying traffic demands in a reconfigurable flexible network. Based on the architecture of the reconfigurable flexible network, the dominant problems of service overlay network construction were analyzed and construction principles were proposed. A construction algorithm based on a node-potential-oriented multi-nexthop routing protocol was also proposed. Simulation results indicate that the algorithm can achieve a higher construction success rate.
CBSD: A Fuzzy Service Discovery Algorithm Based on Chord
ZHAO Wen-dong,TIAN Chang and PENG Lai-xian
Computer Science. 2014, 41 (1): 172-177. 
The main drawback of structured service discovery methods based on DHTs is that they do not support fuzzy search in distributed computing environments. A Bloom filter based structured service discovery method was proposed that combines service clustering with structured service discovery technology. The method uses Bloom filters to represent service semantics. Clustering feature vectors are obtained from a service training queue. Before services are published into the Chord ring, they are clustered by the relevance among their feature vectors. Without redundant advertisement, the method guarantees that services with similar semantics are published to the same node, and it supports fuzzy service discovery. Based on this method, a distributed service composition algorithm was also proposed. Finally, the feasibility of the proposed method was demonstrated by simulation.
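A minimal Bloom filter, the set-summary structure the method uses to encode service semantics (sizes and hash scheme here are arbitrary choices for illustration):

```python
import hashlib

class BloomFilter:
    def __init__(self, m: int = 1024, k: int = 3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item: str):
        for i in range(self.k):               # k hash positions per item
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item: str) -> bool:
        # no false negatives; false positives occur with small probability
        return all(self.bits >> p & 1 for p in self._positions(item))

bf = BloomFilter()
bf.add("weather-forecast-service")            # hypothetical service keyword
print(bf.might_contain("weather-forecast-service"))   # True
print(bf.might_contain("stock-quote-service"))        # False (w.h.p.)
```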
Power-efficient Message Routing Algorithm for Community-based Opportunistic Network
ZHOU Jun-hai,LIN Ya-ping and ZHOU Si-wang
Computer Science. 2014, 41 (1): 178-182. 
In a community-based opportunistic network, the movement of nodes shows certain correlations, and different nodes do not share the same movement characteristics. However, the prevalent multi-copy message routing algorithms for opportunistic networks do not adequately consider these characteristics, which leads to high resource consumption and a low transmission success rate when they are deployed directly in community-based opportunistic networks. To handle these problems, this paper proposed a community-based power-efficient message routing algorithm. The algorithm adaptively controls the number of message copies, calculates the activity degree of nodes with respect to the target community and their local community according to their meeting history with nodes of both communities, and completes message transmission through the nodes with higher activity degrees. Simulation results show that, in community-based opportunistic networks with loose delay requirements, the algorithm clearly outperforms the spray-and-focus algorithm in the number of message forwardings, which dominates energy consumption, as well as in message transmission success rate.
RF Signal Estimation Algorithm for Mobile Sensor Networks in Urban Environment
LI Zhen-jie,YIN Li-xin,ZHANG Xuan and MI Li-hong
Computer Science. 2014, 41 (1): 183-186. 
RF signals between mobile sensor network nodes in an urban environment are greatly affected by multipath fading and reflection-diffraction transmission modes, which makes practical applications such as mobile sensor network positioning and tracking difficult. In this paper, a robust threshold-based RF signal estimation algorithm suited to complex urban environments was presented. The algorithm possesses low complexity, can detect abrupt changes of the RF signal in complex environments, and fully utilizes the advantages of the Kalman algorithm for optimal RF signal estimation. Finally, simulations show that the algorithm performs better than the traditional Kalman and sliding-window algorithms.
Algorithm of Network Traffic Feature Selection Based on PCA and Tabu Search
YE Xiao-long,LAN Ju-long and GUO Tong
Computer Science. 2014, 41 (1): 187-191. 
A network traffic feature selection method using principal component analysis and tabu search (PCA-TS) was proposed to improve the efficiency and quality of feature selection. The approach reduces high-dimensional features with PCA and obtains the optimal feature subset by tabu search. Experiments show that, compared with GA and PSO-SVM, the PCA-TS method has better efficiency and selection accuracy.
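A rough sketch of the PCA-then-tabu-search pipeline (the fitness function, tabu tenure and data are assumptions for illustration, not the paper's setup):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=40, random_state=0)
Z = PCA(n_components=10, random_state=0).fit_transform(X)   # dimension reduction

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, Z[:, mask], y, cv=3).mean()

rng = np.random.default_rng(0)
current = rng.random(10) < 0.5                 # random initial feature subset
best, best_score, tabu = current.copy(), fitness(current), []
for _ in range(30):                            # tabu search over bit flips
    scored = []
    for i in (j for j in range(10) if j not in tabu):
        cand = current.copy()
        cand[i] = not cand[i]                  # flip one feature in/out
        scored.append((fitness(cand), i, cand))
    score, i, current = max(scored, key=lambda t: t[0])
    tabu = (tabu + [i])[-3:]                   # short tabu list (tenure 3)
    if score > best_score:
        best, best_score = current.copy(), score
print(best.astype(int), round(best_score, 3))
```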
Method of Constructing Routing Based on Normal Sequence in Omega Networks
ZHANG Yi-hao,SHEN Yue-hong and JIANG Rong
Computer Science. 2014, 41 (1): 192-195. 
Constructing conflict-free routings in shuffle-exchange networks is seriously hindered by the complex window restrictions of window detection. To overcome this obstacle, the concept of a normal sequence was proposed, and based on it a new method called sequence detection for constructing conflict-free routings is offered. This method translates the construction of conflict-free routings into the construction of a single sequence in (2n-1)-stage Omega networks, which reduces the complexity and the size of the construction search space relative to window detection, making the construction of conflict-free routings simpler.
Methods of Network Topology Evaluation and Optimization for Resilient Routing Layers Generation
WU Wen,MENG Xiang-ru,KANG Qiao-yan and YANG Ting
Computer Science. 2014, 41 (1): 196-201. 
In order to solve the problems of resilient routing layers' adaptability under different requirements and of sub-optimal routes, methods for network topology evaluation and original topology optimization for resilient routing layers were presented. A formal matrix expression of resilient routing layers was given on the basis of their theoretical background. Three indices for evaluating the performance of resilient routing layers were put forward from different points of view, and indices for evaluating the potential of the original full topology to generate resilient routing layers were proposed. To realize optimal design of the original topology, an optimization method for generating resilient routing layers was given. Simulation results show that the indices can objectively assess the adaptability of resilient routing layers under different requirements, that the optimized original topology can satisfy application requirements with fewer resources, and that the problem of sub-optimal routes is greatly alleviated.
Secure Private Cloud Storage System Based on Virtual Isolation Mechanism
BAO Ai-hua,YUAN Xiao-ping,CHEN Feng and MIAO Jia-jia
Computer Science. 2014, 41 (1): 202-207. 
Cloud storage technology is an important research area of cloud computing. Because of privacy loss and security concerns, public cloud storage services are often difficult to adopt widely in organizations that keep core data, such as innovative enterprises or the army. VI-PCS, a secure private cloud storage system based on a virtual isolation mechanism, was proposed. In VI-PCS, physical storage media and public cloud storage services are virtualized as storage capabilities which are managed through a centralized life cycle and provide storage services for applications. The file storage procedure is divided into three levels (metadata management, virtual storage and physical storage), in which secure, reliable data storage is achieved by file renaming and transparent encryption and decryption. A secure network disk based on an isolated sandbox is provided as the access method, achieving data security, controllability and availability in this isolated environment. A bidirectional file synchronization method based on an ordered hash tree is proposed, and its offline mode also helps improve system availability and adaptability. The results show that VI-PCS has clear advantages in reliability, security, scalability and adaptability.
Selection of Virtual Links in Mobile Grid
DU Li-juan and JU Hong-jun
Computer Science. 2014, 41 (1): 208-211. 
In order to manage resources effectively, overlay network technology is applied in mobile grids. All nodes are divided into ordinary nodes and super nodes, and some virtual links are selected between super nodes to form an overlay network for resource management. The selection of virtual links has a great impact on network performance. This paper considered factors such as connectivity, the bandwidth of the underlying physical links, and maintenance costs, described the selection of virtual links as a multi-objective constrained optimization problem, and used an immune clone intelligent algorithm to solve it. During problem solving, the constraints were first transformed into optimization objectives, and then the concept of Pareto dominance was introduced to solve the multi-objective optimization problem. Complexity analysis and experimental analysis of the proposed algorithm were conducted, and simulation results show its effectiveness.
Multi-topology Routing Generation Algorithm Based on Immune Mechanism
CHEN Duo-long,MENG Xiang-ru,LIANG Xiao and YUAN Rong-kun
Computer Science. 2014, 41 (1): 212-216. 
A multi-topology routing generation algorithm based on an immune mechanism was proposed, because existing algorithms use storage resources unreasonably and adapt poorly to multi-fault recovery. To adapt to high-probability faults, and following the immune principle that an epitope and a paratope need only match in key parts rather than correspond completely, sub-topology generation is treated as antibody generation in an immune system, which enhances network invulnerability under multi-fault conditions. The optimization problem is solved with an artificial immune algorithm. Experimental results show that network invulnerability under multiple faults is improved.
Accurate Prediction Method of QoS Based on Global Subspace Decomposition Mining
ZHANG Bo-ya and HU Xiao-hui
Computer Science. 2014, 41 (1): 217-219. 
Accurate QoS prediction is an important criterion for Web services. Traditional QoS prediction methods use time-weighted averages and simple weightings of various parameters, so a large number of Web services cannot be predicted accurately and the outcome is vague. An accurate QoS prediction method based on global subspace decomposition mining was proposed. All data are first preprocessed under the idea of global analysis; on this basis, subspace decomposition analysis of the data is carried out and deep features of the data are extracted; then the global analysis results and the subspace decomposition results are fuzzily fused to achieve effective data fusion, valid mining and prediction of the analyzed data. A set of Web nodes and metrics was used for a prediction experiment, and the results show that with the global subspace decomposition mining method the gradual process can be predicted in detail. The prediction is accurate, and the method has wide application value in QoS prediction.
CIL Static Analysis Method for C# Program Defect Detection
BIAN Pan,LIANG Bin and SHI Wen-chang
Computer Science. 2014, 41 (1): 220-224. 
Finding potential defects by statically analyzing source code can help programmers find and fix defects before the software is released, and thus improves software security. This paper provided a CIL static analysis method for detecting defects in C# programs. We adopted an improved depth-first search algorithm to traverse the control flow graph of the target program, and by combining it with a strategy of caching history states, the performance of detection can be greatly improved. In addition, to facilitate alias analysis, we proposed a memory-region-based method for representing variables. Based on the described analysis method, we developed a defect detection system for C# programs. We applied the system to real C# projects, and the results show that it can detect common kinds of defects in C# programs efficiently and accurately.
On CSP Improvements to David’s Digital Library Protocol and Formal Analysis in Internet of Things
WU Ming-huan and CHENG Xiao-hui
Computer Science. 2014, 41 (1): 225-229. 
This paper analyzed David's digital library protocol as used in the Internet of Things and pointed out its security risks. Aiming at these risks, an improvement to the protocol was put forward. Using the formal analysis method of communicating sequential processes (CSP), the protocol roles and the attacker were modeled. In the experiments, the new protocol was attacked by an attacker under the Dolev-Yao model, and no successful attack was found. The experimental results show that the improvement effectively resolves the security risks of David's digital library protocol; mutual authentication between the protocol roles, the security of the session key, and the feasibility of the model are proved.
Research on Security Technology of Workflow Customization for Collaborative SaaS Platform of Industrial Chains
CAO Shuai and WANG Shu-ying
Computer Science. 2014, 41 (1): 230-234. 
A workflow control security model was proposed, based on digital signatures and service instance association, for the security risks that may exist in workflow rule choreography, storage/access and execution control during workflow customization on a collaborative SaaS platform for industrial chains with a dominant enterprise at its core. Based on research on XML digital signatures, their validation procedure, and strategies for storage and execution models, a digital signature scheme was proposed that validates user identity, is related to business operations, and specifies storage methods and algorithms. The security model and algorithms were validated in the customization and execution of vehicle after-sales service on the collaborative SaaS platform of industrial chains, showing that they can prevent illegal tampering with and transfer of workflows during choreography, storage and execution.
Average Scale Stochastic TBFL Approach
WANG Zhen-zhen
Computer Science. 2014, 41 (1): 235-241. 
Approaches to fault localization based on test suites are now collectively called TBFL (testing based fault localization). However, current algorithms do not take advantage of prior knowledge about the test cases and the program, wasting these valuable "resources". [12] introduced a new kind of stochastic TBFL approach, whose spirit is to combine prior knowledge with actual testing activities under stochastic theory so as to locate program faults. The algorithm presented in [12] may be regarded as a general pattern of this kind of approach, from which various algorithms can be developed. Based on the idea of [13], we improved the algorithm in [12]. We mainly constructed two tools from the testing results: the executive matrix E and the efficient matrix F. Combined with the prior knowledge of the test suite and the program, the probability of each statement being faulty is rated on two scales, and the two scales are then "averaged". In this way we obtain an average rank of program statements by their probability of being faulty, which may help programmers correct faults. Moreover, this paper presented two standards for comparing different TBFL approaches; investigation of these standards on some specific instances shows that the results of the proposed approach are satisfactory.
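To make the role of an executive matrix concrete, here is a generic coverage-based ranking sketch (Tarantula-style suspiciousness; the paper's stochastic construction of E and F and the two-scale averaging are not reproduced):

```python
import numpy as np

E = np.array([[1, 1, 0, 1],     # executive matrix: tests x statements
              [1, 0, 1, 1],     # 1 = the statement was executed by the test
              [1, 1, 1, 0]])
failed = np.array([1, 0, 1])    # per-test outcome, 1 = failed

ef = (E * failed[:, None]).sum(0)           # failing tests covering each stmt
ep = (E * (1 - failed)[:, None]).sum(0)     # passing tests covering each stmt
f_rate, p_rate = ef / failed.sum(), ep / (1 - failed).sum()
susp = f_rate / (f_rate + p_rate + 1e-9)    # suspiciousness per statement
print(np.argsort(-susp))                    # statements, most suspicious first
```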
Capturing Causal Behavioral Profile Based on Structural Decomposition Technique
CAI Min and WANG Shi-yi
Computer Science. 2014, 41 (1): 242-245. 
The causal behavioral profile can be applied to measure the consistency between two given process models as well as to monitor process execution. To overcome the limitations of existing approaches to computing the causal behavioral profile, a novel approach based on a T-invariant decomposition technique was presented. A workflow system is first decomposed into a set of complete subsystems, and the global relations between transitions are then deduced from their local relations in each complete subsystem. The approach can be used for arbitrary sound free-choice workflow systems.
Web Service Selection Method Based on Runtime Verification
ZHANG Ya-hong,ZHANG Lin-lin,ZHAO Kai,CHEN Jia-li and FENG Zai-wen
Computer Science. 2014, 41 (1): 246-249. 
To verify the consistency between the run-time behavior of a Web service and user requirements, a Web service selection method based on runtime verification was proposed. In this method, Web services are verified at run time on the basis of automata theory, and three kinds of behavior matching relationships are defined. Based on the runtime verification results, the matching degree between the run-time behavior of a Web service and the user requirements is quantified. In addition, user preferences are taken into account with AHP. The method can handle Web service selection in terms of both behavior matching degree and user preferences, and the experimental results indicate that it is reasonable.
Dynamic Program Slicing Based on Forward Computation
WANG Xing-ya,JIANG Shu-juan,JU Xiao-lin and SHAO Hao-ran
Computer Science. 2014, 41 (1): 250-253. 
Dynamic program slicing is widely used in software analysis, testing and debugging. This paper proposed a novel forward computation approach to computing dynamic slices. First, the approach computes the influenced set of the variable defined in the currently executing statement, based on the defined and used variables. Second, it computes the direct dynamic dependences of the current statement. Finally, it computes the dynamic slices of the variables. Applying this method, we designed and implemented a dynamic program slicing prototype for Java and performed an experimental study on several open-source programs. The experimental results show that the slices produced by our approach are smaller than those of other methods.
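A minimal sketch of forward dynamic slice computation over an execution trace (a generic forward algorithm; the statements and variables are invented for illustration):

```python
# Each trace entry: (statement id, variables defined, variables used).
trace = [
    (1, {"a"}, set()),          # a = input()
    (2, {"b"}, {"a"}),          # b = a + 1
    (3, {"c"}, set()),          # c = 0
    (4, {"c"}, {"b", "c"}),     # c = b + c
]

slices = {}                     # variable -> statements in its dynamic slice
for stmt, defs, uses in trace:
    # slice of a newly defined variable = this statement plus the current
    # slices of every variable the statement uses (forward propagation)
    dep = set().union(*(slices.get(v, set()) for v in uses)) if uses else set()
    for v in defs:
        slices[v] = {stmt} | dep

print(sorted(slices["c"]))      # -> [1, 2, 3, 4]
```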
Method of Design Patterns Definition Based on XML Schema Technology
GU Hui and ZHANG Wei-xing
Computer Science. 2014, 41 (1): 254-257. 
The identification of software design patterns can help software engineers understand a system's design intent and function from its structure during program comprehension and reverse engineering. Generally, software design information represented in the form of UML class diagrams makes it hard to identify design patterns accurately from pattern features. This paper proposed a design pattern definition language based on XML Schema, DPDLXS. The representation of specific design pattern instances in DPDLXS shows that the language can portray the features of design patterns accurately and provide technical support for design pattern identification.
Skyline Based Link Prediction on Social Networks
XU Shuo-na and ZENG Bi-qing
Computer Science. 2014, 41 (1): 258-261. 
In social media, users update their statuses in real time and the links between them change quickly, which poses a great challenge for link prediction in such social networks. Traditional link prediction algorithms are usually effective in specific situations but not in others. To overcome the limitations of any single link prediction algorithm, a Skyline-query-based link prediction approach for social networks was proposed. The approach combines multiple link prediction algorithms, treats their values as a score vector for each predicted link, and returns the users corresponding to the Skyline points of the computed vectors. Experiments show that the Skyline-based approach outperforms related work and can be used in real applications of link prediction and recommendation in social media.
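A minimal Python sketch of the idea (the two predictors and the data set are our choices, not the paper's): score each candidate link with several predictors and keep only the Pareto-optimal, i.e. Skyline, candidates that no other link dominates on every score.

import itertools
import networkx as nx

G = nx.karate_club_graph()
candidates = [(u, v) for u, v in itertools.combinations(G, 2)
              if not G.has_edge(u, v)]

def scores(u, v):
    cn = len(list(nx.common_neighbors(G, u, v)))        # common neighbors
    jac = next(nx.jaccard_coefficient(G, [(u, v)]))[2]  # Jaccard
    return (cn, jac)

vecs = {e: scores(*e) for e in candidates}

def dominated(a, b):   # b dominates a: >= everywhere, > somewhere
    return all(y >= x for x, y in zip(a, b)) and any(y > x for x, y in zip(a, b))

skyline = [e for e in candidates
           if not any(dominated(vecs[e], vecs[f]) for f in candidates if f != e)]
print(len(skyline), "skyline links, e.g.", skyline[:3])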
Internal Inverse P-information Intelligent Fusion and its Attribute Disjunctive Character Application
WU Song-li,CHEN Gui-you and SHI Kai-quan
Computer Science. 2014, 41 (1): 262-266. 
Abstract PDF(408KB) ( 365 )   
References | RelatedCitation | Metrics
Inverse P-sets (inverse packet sets) are a pair of element sets composed of an internal inverse P-set (internal inverse packet set) and an outer inverse P-set (outer inverse packet set); the pair of the two constitutes an inverse P-set. Inverse P-sets have dynamic characteristics. Using internal inverse P-sets and internal inverse P-reasoning (internal inverse packet reasoning), this paper proposed the generation, supplemented generation and measurement of internal inverse P-information intelligent fusion, together with the internal inverse P-information intelligent fusion theorem, its dependence theorem and its recovery theorem. The attribute disjunctive character of internal inverse P-information intelligent fusion, the attribute disjunctive expansion theorem, and the discovery and recovery theorems for unknown internal inverse P-information intelligent fusion were put forward, and applications of these theoretical results were presented. Inverse P-sets provide a new theory and method for studying dynamic information applications in which the information has an attribute disjunctive character.
Prediction Model for Airport-noise Time Series Based on SSA
WEN Dong-qin and WANG Jian-dong
Computer Science. 2014, 41 (1): 267-270. 
Abstract PDF(298KB) ( 1169 )   
References | RelatedCitation | Metrics
With the development of civil aviation in China, airport noise is becoming an increasingly serious problem. Aiming at the airport-noise time series prediction problem, this paper presented a prediction model based on singular spectrum analysis (SSA). The airport-noise time series is decomposed by singular value decomposition to obtain the principal components and empirical orthogonal functions, the trend and oscillation characteristics are analyzed, the appropriate feature vectors are selected for series reconstruction, and the prediction model is constructed from a linear recurrence formula. The forecast values are then revised based on the state-transition matrix and the contribution ratios. An experiment on measured data from an airport shows that the accuracy of this model is better than that of the original prediction models.
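A compact SSA sketch under our own assumptions (the window length and number of retained components are arbitrary; the paper's recurrence-based forecasting and state-transition revision are omitted): embed the series in a trajectory matrix, take its SVD, keep the leading components, and average back along the anti-diagonals.

import numpy as np

def ssa_reconstruct(x, L, k):
    """Reconstruct series x from its k leading SSA components."""
    N = len(x); K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)     # SVD of embedding
    Xk = (U[:, :k] * s[:k]) @ Vt[:k]                     # rank-k approximation
    rec = np.zeros(N); cnt = np.zeros(N)                 # Hankel averaging
    for j in range(K):
        rec[j:j + L] += Xk[:, j]; cnt[j:j + L] += 1
    return rec / cnt

t = np.arange(200)
noisy = np.sin(0.1 * t) + 0.3 * np.random.randn(200)
trend = ssa_reconstruct(noisy, L=40, k=2)   # keep the 2 leading components
print(np.round(trend[:5], 3))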
Ideal of R-Generalized Fuzzy Sublattice
WANG De-jiang,LIANG Jiu-zhen and LIAO Zu-hua
Computer Science. 2014, 41 (1): 271-273. 
Abstract PDF(293KB) ( 350 )   
References | RelatedCitation | Metrics
This paper introduced the definition of an ideal of an R-generalized fuzzy sublattice and discussed some of its properties. It was proved that a finite intersection (union) of ideals of R-generalized fuzzy sublattices is still such an ideal when the function R(x,y) is decreasing with respect to y, and that the homomorphic image (preimage) of an ideal of an R-generalized fuzzy sublattice is still an ideal of an R-fuzzy sublattice. Finally, eight equivalent forms of the ideal of an R-generalized fuzzy sublattice, based on eight kinds of implication operators, were discussed.
Parallel Mining of Densest Subgraph Based on Twitter Storm
WANG Jin-ming and WANG Yuan-fang
Computer Science. 2014, 41 (1): 274-278. 
Abstract PDF(427KB) ( 812 )   
References | RelatedCitation | Metrics
Finding the densest subgraph in a large-scale graph has a wide range of applications, such as community discovery, spam detection and citation relation extraction. Based on labeled undirected graphs, we introduced the concept of a QLS and designed an approximation algorithm, DSFLC, which quickly finds the densest subgraph: a user submits a QLS and the algorithm returns the densest subgraph under that QLS within an acceptable time. For any ε>0, DSFLC needs to scan the large-scale data set only O(log_{1+ε} n) times and is guaranteed to be a 2(1+ε)-approximation algorithm. Since DSFLC is easy to parallelize, we chose the Twitter Storm platform to parallelize it. Finally, test data sets extracted from the DBLP database verify DSFLC's practicality and scalability.
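The 2(1+ε) bound and the O(log_{1+ε} n) pass count match the well-known pass-efficient peeling scheme of Bahmani et al.; the Python sketch below implements that generic scheme (DSFLC's QLS filtering is not reproduced): each pass removes every node whose degree is at most 2(1+ε) times the current density, so only logarithmically many passes are needed.

import networkx as nx

def densest_subgraph(G, eps=0.1):
    H = G.copy()
    best, best_density = H.copy(), H.number_of_edges() / H.number_of_nodes()
    while H.number_of_nodes() > 0:
        density = H.number_of_edges() / H.number_of_nodes()
        if density > best_density:
            best, best_density = H.copy(), density
        low = [v for v, d in H.degree() if d <= 2 * (1 + eps) * density]
        if not low:      # min degree <= avg degree, so this is just a guard
            break
        H.remove_nodes_from(low)
    return best, best_density

sub, d = densest_subgraph(nx.karate_club_graph())
print(sub.number_of_nodes(), "nodes, density", round(d, 3))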
Improved Discrete Differential Evolution with Parameter Adaptive Mechanism
WANG Cong-jiao,WANG Xi-huai and XIAO Jian-mei
Computer Science. 2014, 41 (1): 279-282. 
Abstract PDF(360KB) ( 410 )   
References | RelatedCitation | Metrics
An improved discrete differential evolution algorithm with a parameter-adaptive mechanism (PA-DDE) was proposed, based on analysis of existing discrete differential evolution algorithms. First, the parameters over the continuous domain are adaptively adjusted during evolution to balance global and local search and to reconcile population diversity with convergence speed. Second, the co-evolution process is guided on the corresponding discrete domain by feedback information from the discrete encodings of successful individuals. Simulation results on the knapsack problem show that the proposed algorithm has good convergence efficiency and stability.
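A hedged Python sketch of one plausible instantiation (jDE-style parameter self-adaptation plus a sigmoid mapping from the continuous vectors to binary solutions; PA-DDE's own feedback mechanism is not reproduced), on a hypothetical knapsack instance:

import numpy as np

rng = np.random.default_rng(0)
w = rng.integers(1, 20, 30)          # item weights (hypothetical data)
v = rng.integers(1, 30, 30)          # item values
cap = int(w.sum()) // 3

def binarize(z):
    p = 1.0 / (1.0 + np.exp(-np.clip(z, -10, 10)))   # sigmoid mapping
    return (p > rng.random(z.shape)).astype(int)

def fitness(bits):
    return int((bits * v).sum()) if (bits * w).sum() <= cap else 0

NP, D = 40, 30
pop = rng.uniform(-1, 1, (NP, D))
F, CR = np.full(NP, 0.5), np.full(NP, 0.9)
bits = binarize(pop)
fit = np.array([fitness(b) for b in bits])

for _ in range(200):
    for i in range(NP):
        # jDE rule: occasionally resample F and CR for this individual
        Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
        CRi = rng.random() if rng.random() < 0.1 else CR[i]
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutant = pop[a] + Fi * (pop[b] - pop[c])
        cross = rng.random(D) < CRi
        cross[rng.integers(D)] = True                 # force one gene over
        trial = np.where(cross, mutant, pop[i])
        tb = binarize(trial)
        tf = fitness(tb)
        if tf >= fit[i]:                  # greedy selection
            pop[i], bits[i], fit[i] = trial, tb, tf
            F[i], CR[i] = Fi, CRi         # successful parameters survive
print("best knapsack value:", int(fit.max()))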
P-Concept Lattice and its Basic Properties
YANG Ya-feng and LIU Bao-xiang
Computer Science. 2014, 41 (1): 283-285. 
Abstract PDF(290KB) ( 393 )   
References | RelatedCitation | Metrics
Packet sets (P-sets) are dynamic set pairs composed of an internal packet set and an outer packet set. Using the core methods of P-sets, the P-evolution features of formal contexts were analyzed, and a new kind of dynamic concept lattice structure, the P-concept lattice, was constructed. Finally, translation theorems between the P-concept lattice and the classic concept lattice were given, the Galois connection and some basic properties were proved, and a case study was presented.
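For reference, a minimal Python sketch of the classic (static) concept computation that the P-concept lattice extends, on a toy formal context of our own: a concept is a pair (extent, intent) closed under the two derivation operators.

from itertools import combinations

objects = ["o1", "o2", "o3"]
attrs = {"o1": {"a", "b"}, "o2": {"b", "c"}, "o3": {"b"}}

def intent(objs):          # attributes shared by all given objects
    return set.intersection(*(attrs[o] for o in objs)) if objs else {"a", "b", "c"}

def extent(ats):           # objects possessing all given attributes
    return {o for o in objects if ats <= attrs[o]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        A = intent(set(objs))                      # objs' (derivation)
        B = extent(A)                              # objs'' (closure)
        concepts.add((frozenset(B), frozenset(A))) # always a concept
for B, A in sorted(concepts, key=lambda c: len(c[0])):
    print(set(B) or "{}", set(A) or "{}")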
Optimized Query Method of Location-based Service with Polymorphic Association Mining
CAI Zhao-hui,ZHANG Jian-pei and YANG Jing
Computer Science. 2014, 41 (1): 286-289. 
Abstract PDF(355KB) ( 304 )   
References | RelatedCitation | Metrics
Highly efficient and reliable location-based services (LBS) are key to their wide adoption. Traditional methods use neighbor queries for location-based service queries, but the neighborhoods are often unstable, so highly efficient querying cannot be achieved. An optimized query method for location-based services with polymorphic association mining was therefore proposed. According to the search criteria, the data is divided into multiple states, association mining is then carried out across these states, and deep features of the data are extracted; using these deep-seated characteristics of the available location services, the efficiency and reliability of system queries are improved. An analysis carried out on 200 groups of real data shows that with the polymorphic association mining method, the recall and precision rates increase by 9.5% and 8.8% respectively, so the method has good application value in LBS and information mining.
Bilateral Multi-protocol Negotiation Strategies Based on Reinforcement Learning
ZHANG Ke,LUO Jun and DENG Jun-kun
Computer Science. 2014, 41 (1): 290-292. 
Abstract PDF(255KB) ( 335 )   
References | RelatedCitation | Metrics
Traditional reinforcement learning negotiation strategies have the shortcoming of compromising too quickly, which reduces the agent's utility. Aiming at this problem, an improved reinforcement learning bilateral multi-issue negotiation strategy was proposed, which introduces an expectation reduction rate to regulate the agent's expectation and thereby improve the quality of the negotiation outcome. This paper analyzed the influence of different expectation reduction rates on negotiation and compared the traditional reinforcement learning negotiation strategy, a time-based negotiation strategy and the proposed strategy. The results show that the proposed negotiation strategy achieves higher bilateral utility within the allowed number of negotiation rounds.
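A deliberately simple Python sketch of the expectation mechanism only (the exponential decay form is our assumption; the reinforcement learning component the paper couples it with is not reproduced):

def expectation(u0, r, t):
    """Agent's expected utility after t rounds, lowered at rate r."""
    return u0 * (1 - r) ** t

def accept(offer_utility, u0, r, t):
    return offer_utility >= expectation(u0, r, t)

for t in range(0, 10, 3):
    print(t, round(expectation(1.0, 0.08, t), 3), accept(0.7, 1.0, 0.08, t))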
Application of Improved Experimental Model Based on MC Method in Estimation of pi
ZHANG Bing and ZHAO Yue-long
Computer Science. 2014, 41 (1): 293-296. 
Abstract PDF(318KB) ( 396 )   
References | RelatedCitation | Metrics
The Monte Carlo method uses statistical sampling theory to solve engineering problems approximately, but model accuracy and time complexity conflict with each other. An experimental model of the Monte Carlo method was therefore established to estimate pi, and its accuracy and time complexity were analyzed theoretically. This paper put forward two improvement schemes based on the experimental model: through shifting and preprocessing, the division operations in the Mersenne random number algorithm are converted into multiplications, reducing the computational complexity and improving efficiency. The simulation results show that, while maintaining the same accuracy, the improved experimental model and algorithms significantly reduce the simulation time and improve the simulation speed, and thus have a certain engineering value.
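The baseline experiment itself is easy to reproduce; a minimal Python sketch of the Monte Carlo estimate of pi: sample points uniformly in the unit square, and the fraction landing inside the quarter circle estimates pi/4.

import random

def mc_pi(n):
    hits = sum(1 for _ in range(n)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * hits / n

for n in (10**3, 10**5, 10**7):
    print(n, mc_pi(n))   # the error shrinks roughly as 1/sqrt(n)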
Extraction and Correction of Feature Parameter for Online Freehand Conic Section
WANG Guan-feng,WANG Shu-xia,YU Sui-huai and GAO Man-tun
Computer Science. 2014, 41 (1): 297-299. 
Abstract PDF(218KB) ( 428 )   
References | RelatedCitation | Metrics
Freehand sketching is an efficient way to express, communicate and record ideas at the conceptual design stage of products. This paper presented a novel method for feature extraction and correction of online freehand conic sections and developed a human-computer interface prototype system (FSR) to assist designers during conceptual design, which makes the system interface easy and friendly to use. First, the paper classifies conic sections into central and non-central curves and introduces the corresponding parameter feature extraction methods. Second, it proposes parameter correction methods for the circle (arc) and the parabola. The method thus provides a parameter interface for CAD systems that support online freehand sketching. The system was tested on a number of sketched 2D geometry inputs, and the experiments preliminarily demonstrated the effectiveness of the algorithm. The proposed feature extraction and correction method classifies the recognition results of online freehand sketches well.
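A standard least-squares conic fit illustrates the parameter extraction step (this generic method stands in for the paper's central/non-central treatment and correction, which are not reproduced): find the coefficients [a,b,c,d,e,f] minimizing ||M w|| for the conic a x^2 + b xy + c y^2 + d x + e y + f = 0 over the sketched points.

import numpy as np

def fit_conic(pts):
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x*x, x*y, y*y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]          # right singular vector of the smallest value

theta = np.linspace(0, 2*np.pi, 50)
pts = np.column_stack([3*np.cos(theta), 2*np.sin(theta)])  # ideal ellipse
pts += 0.02 * np.random.randn(*pts.shape)                  # freehand jitter
w = fit_conic(pts)
print(np.round(w / w[0], 3))   # ~ [1, 0, 2.25, 0, 0, -9] for x^2/9 + y^2/4 = 1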
3D Digital Watermarking Technology Based on Genetic Algorithm in Wavelet Domain
LIU Jing and WEN Xian-bin
Computer Science. 2014, 41 (1): 300-302. 
Abstract PDF(542KB) ( 399 )   
References | RelatedCitation | Metrics
In this paper, a new 3D watermarking algorithm based on genetic algorithms (GA) in the wavelet domain was proposed to resolve the serious local distortion or heavy alteration of the model caused by embedding a watermark into 3D models. First, the feature points of the 3D model are extracted. Then, the best embedding coefficients are found via a GA whose fitness function is defined as the minimal change of the significant region after watermark embedding. Finally, the watermarked 3D content is obtained through the inverse discrete wavelet transform of the modified coefficients. Extensive experimental results show that the proposed 3D watermarking algorithm is robust against various attacks and introduces less distortion into the 3D polygonal models.
Colorful Medical Image Compression Based on Contourlet Transform and SPIHT Algorithm
TANG Min,CHEN Xiu-mei and CHEN Feng
Computer Science. 2014, 41 (1): 303-306. 
Abstract PDF(590KB) ( 396 )   
References | RelatedCitation | Metrics
Two-dimensional wavelets are good at isolating discontinuities at edge points, but not smoothness along contours. In addition, separable wavelets capture only limited directional information, which restricts their use in image processing applications. In comparison, the contourlet transform combines the Laplacian pyramid (LP) with a directional filter bank (DFB) to achieve a flexible multi-resolution, local and directional image expansion based on contour segments. Because the wavelet-based SPIHT algorithm cannot express texture and contours effectively, a novel image compression method based on the contourlet transform and the set partitioning in hierarchical trees (SPIHT) algorithm was proposed for color medical images. First, the original RGB image is converted to the YIQ color space according to the characteristics of the human visual system. Second, the contourlet transform is applied to the region of interest (ROI) to capture its main characteristics, and the SPIHT algorithm is then used to guarantee the quality and detail of the compressed image. For the background image, the wavelet transform with coefficient truncation is used to greatly improve the compression ratio. Experimental results demonstrate that our algorithm is practical and effective for color medical images and strikes a good balance between compressed image quality and compression ratio.
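The RGB-to-YIQ step mentioned in the abstract uses the standard conversion matrix; a minimal Python sketch (luminance Y carries most of what the human visual system notices, so it can be coded more carefully than the chrominance channels I and Q):

import numpy as np

RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def rgb_to_yiq(img):               # img: H x W x 3, values in [0, 1]
    return img @ RGB2YIQ.T

img = np.random.rand(4, 4, 3)      # stand-in for a medical image
yiq = rgb_to_yiq(img)
print(yiq[..., 0].min(), yiq[..., 0].max())  # Y stays within [0, 1]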
Optimal Exemplar Matching Algorithm Based on Matrix Similarity and its Application in Image Inpainting
ZHAI Dong-hai,LI Tong-liang,DUAN Wei-xia,YU Jiang and XIAO Jie
Computer Science. 2014, 41 (1): 307-310. 
Abstract PDF(806KB) ( 614 )   
References | RelatedCitation | Metrics
In image inpainting algorithms based on texture synthesis, the matching accuracy of optimal exemplar matching is low while its time complexity is high, which eventually leads to errors in the inpainted image. Focusing on these two issues, a block matching algorithm was first constructed in which the matching degree between the template and the exemplars is measured by matrix similarity, so that the candidate set of optimal exemplars is preliminarily determined at a relatively coarse granularity. Second, a pixel matching algorithm was constructed in which the matching degree between corresponding pixels is measured by the inner product of the error matrix between the template and a candidate exemplar, so that the final optimal exemplar is determined at a fine granularity. The block matching algorithm reduces the time complexity while the pixel matching algorithm improves the matching accuracy; therefore, the optimal exemplar matching algorithm based on matrix similarity improves matching accuracy without raising time complexity. The experimental results demonstrate that, compared with current texture-based inpainting algorithms, the proposed algorithm improves matching accuracy and reduces time complexity.
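A Python sketch of the coarse block-matching stage only, with normalized correlation as our stand-in matrix-similarity measure (the paper's error-matrix inner product for the fine stage is not reproduced): rank candidate exemplar blocks by similarity to the template and keep the top-k as the candidate set.

import numpy as np

def matrix_similarity(A, B):
    """Cosine of the angle between two blocks viewed as vectors."""
    a, b = A.ravel(), B.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(1)
template = rng.random((9, 9))
exemplars = [rng.random((9, 9)) for _ in range(200)]
exemplars[42] = template + 0.05 * rng.random((9, 9))   # a near-match

sims = [matrix_similarity(template, E) for E in exemplars]
topk = np.argsort(sims)[-5:][::-1]    # candidate set, coarse granularity
print(topk)                            # index 42 should rank first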
Faster Discriminative Common Vectors Classification Approach
HAN Shan-shan,HUANG Kai,WANG Wan-liang,ZHENG Jian-wei and JIANG Yi-bo
Computer Science. 2014, 41 (1): 311. 
Abstract PDF(560KB) ( 362 )   
References | RelatedCitation | Metrics
This paper proposed a Fast Discriminative Common Vectors (FDCV) classification algorithm that improves on traditional Discriminative Common Vectors (DCV). Compared with traditional DCV, FDCV not only classifies faster but also guarantees the same recognition performance. FDCV performs classification by computing distances between scalars rather than between the vectors used in traditional DCV. Theoretical analysis and complexity calculations show that FDCV classifies twice as fast as traditional DCV. Simulation experiments on three face databases (Yale Face Database B, ORL and PIE) further verify the effectiveness of the algorithm.
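For reference, a minimal Python sketch of classic DCV classification (the paper's scalar-distance speed-up itself is not reproduced): project samples onto the null space of the within-class scatter, where every training sample of a class collapses to one common vector, and classify by the nearest common vector.

import numpy as np

def dcv_fit(X_by_class):
    # Within-class-centered samples span the range of the scatter S_w.
    diffs = np.vstack([X - X.mean(axis=0) for X in X_by_class])
    _, s, Vt = np.linalg.svd(diffs, full_matrices=True)
    rank = int(np.count_nonzero(s > 1e-10))
    null = Vt[rank:]                       # orthonormal basis of null(S_w)
    P = null.T @ null                      # projector onto the null space
    # Any training sample of a class projects to the same common vector.
    commons = [P @ X[0] for X in X_by_class]
    return P, commons

def dcv_predict(x, P, commons):
    px = P @ x
    return int(np.argmin([np.linalg.norm(px - c) for c in commons]))

rng = np.random.default_rng(2)
classes = [rng.normal(m, 0.1, (5, 20)) for m in (0.0, 1.0)]   # toy data
P, commons = dcv_fit(classes)
print(dcv_predict(rng.normal(1.0, 0.1, 20), P, commons))      # -> 1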