Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41 Issue Z11, 14 November 2018
  
Priority Assignment Strategy for Real-time System under Fault Bursts
ZHOU Zheng-yong,YANG Fu-min,LI Jun,HU Guan-rong,TU Gang and ZHANG Jie
Computer Science. 2014, 41 (Z11): 1-6. 
Abstract PDF(833KB) ( 457 )   
References | RelatedCitation | Metrics
In real-time systems, schedulers must be fault tolerant to guarantee that no deadline is missed. Based on a worst-case response time schedulability analysis for real-time systems under fault bursts, we found that a fault-tolerant priority assignment strategy can improve system fault resilience more effectively than the traditional fault-tolerant strategy. We also presented a fault-tolerant priority configuration search algorithm for the proposed analysis.
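The worst-case response time analysis the abstract builds on can be sketched as the standard fixed-point iteration; the task parameters and the single additive `F * C_rec` recovery term below are our own illustrative assumptions, not the paper's exact fault-burst model:

```python
import math

# Classical fixed-priority response-time iteration R = C + recovery +
# interference, extended with a crude fault term: up to F faults, each
# triggering a recovery routine of cost C_rec (illustrative assumption).
def wcrt(task, higher_prio, F=2, C_rec=1.0, deadline=100.0):
    R = task["C"]
    while True:
        interference = sum(math.ceil(R / t["T"]) * t["C"] for t in higher_prio)
        R_next = task["C"] + F * C_rec + interference
        if R_next == R:           # fixed point reached: R is the WCRT
            return R
        if R_next > deadline:     # iteration diverged past the deadline
            return None
        R = R_next

high = [{"C": 1.0, "T": 4.0}, {"C": 2.0, "T": 8.0}]   # higher-priority tasks
low = {"C": 2.0, "T": 20.0}
print(wcrt(low, high))   # worst-case response time of the low-priority task
```

A priority-assignment search would repeat this test over candidate priority orderings and keep the ones for which every task's WCRT stays below its deadline.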
Application of Generalized Molecular Computation Model in 0-1 Knapsack Problem
YANG Zhen,MA Tian-bao,YU Wen and LI Yan-mei
Computer Science. 2014, 41 (Z11): 7-9. 
Abstract PDF(253KB) ( 517 )   
Biomolecular computing faces many limitations in implementation. In the literature, the molecular-computing sticker model and the Turing machine have been combined into a generalized Turing model (GTM). The model has been proved able to obtain the complete feasible solution sets of multiple NP-complete problems such as 0-1 integer programming and set covering. On this basis, this paper applied the GTM to solve the 0-1 knapsack problem. Simulations show the advantages of the model and further validate its broad applicability.
Mixed RFID Anti-collision Algorithm Based on Dimensional Code Number
HUANG Qing-huan,ZHENG Jia-li,WEI Dong-xue and DENG Lin
Computer Science. 2014, 41 (Z11): 10-14. 
Abstract PDF(425KB) ( 417 )   
Combining the ideas of the binary search algorithm and the slotted Aloha algorithm, an adaptive hybrid anti-collision algorithm based on the code number of each dimension was proposed. To solve the collision problem when many tags are in the range of the same RFID (Radio Frequency Identification) reader, the new algorithm first divides the tags into groups adaptively according to their ID bits. By detecting the collision bits, different tags adopt different tactics to calculate the ID numbers. The new algorithm also introduces a stack to save the dimensional code numbers, reducing unnecessary idle time slots. Simulation results show that, compared with the traditional algorithm, the new algorithm improves system performance, reducing search times by 75% and search depth by 50%.
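Two ingredients the hybrid combines with slotted Aloha — collision-bit detection and binary-search splitting — can be sketched as follows (the 4-bit tag IDs are made-up examples):

```python
# Collision-bit detection and binary-search prefix splitting.
def collision_bits(ids):
    """Bit positions where the responding tag IDs disagree (what a
    Manchester-coded reader observes as collided bits)."""
    return [i for i in range(len(ids[0])) if len({tag[i] for tag in ids}) > 1]

def split_query(ids, prefix=""):
    """Recursively narrow the query prefix until exactly one tag responds."""
    group = [t for t in ids if t.startswith(prefix)]
    if len(group) <= 1:
        return group                          # zero or one tag: done
    return split_query(ids, prefix + "0") + split_query(ids, prefix + "1")

tags = ["1010", "1001", "1100"]
print(collision_bits(tags))   # which bit positions collide
print(split_query(tags))      # every tag identified exactly once
```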
Research Method Combining Both Truth Table and Statistics
WANG Shu-xi and XIA Zeng-yan
Computer Science. 2014, 41 (Z11): 15-20. 
Abstract PDF(1606KB) ( 417 )   
Starting from a practical problem in human resource management, this paper innovatively proposed a research method combining truth tables and statistics. By calculating the correctness probability of an employee incentive model using a truth table, we judged the logical correctness of the model. On this basis, we proposed a generic algorithm and conducted experiments. The experimental results show that this method has broad research significance and practical value: it can judge the logical correctness of a hypothetical model and pioneers a new research idea, "truth table + statistics".
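One way to read the "truth table + statistics" idea can be sketched as follows; the rule and the probabilities are hypothetical, not the paper's incentive model:

```python
from itertools import product

# Score a hypothesized rule "incentive and training -> performance" by the
# total probability of the truth-table rows on which it holds, assuming
# independent atomic propositions (all names and numbers are made up).
def rule(incentive, training, performance):
    return (not (incentive and training)) or performance

def correctness_probability(rule, probs):
    total = 0.0
    for row in product([True, False], repeat=len(probs)):
        p = 1.0
        for value, prob in zip(row, probs):
            p *= prob if value else 1.0 - prob
        if rule(*row):
            total += p             # accumulate mass of satisfying rows
    return total

# observed frequencies of incentive, training, performance (made up)
print(round(correctness_probability(rule, [0.6, 0.5, 0.8]), 2))
```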
Representing and Reasoning Pitches in Algorithmic Composition Based on Bayesian Networks
WENG Shi-jie,LI Wei-hua and DING Hai-yan
Computer Science. 2014, 41 (Z11): 21-24. 
Abstract PDF(402KB) ( 580 )   
Algorithmic composition is the partial or total automation of music composition by a computer. One of its challenges is creating pitches, and uncertainty is an intrinsic feature of music. A Bayesian network (BN) is an effective and popular framework for representing and reasoning about knowledge under uncertainty, and BNs have been successfully applied to a variety of problems. To create pitches in algorithmic composition based on the MIDI format, we first built a BN model of pitches and a knowledge base based on the model. Moreover, by Bayesian inference, the pitch of every note at each beat can be created. A preliminary experiment demonstrates empirically that this method of pitch inference is feasible.
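The simplest slice of such a pitch model, conditioning each note only on its predecessor, can be sketched as follows (the training melody is an invented MIDI pitch sequence, not data from the paper):

```python
import random
from collections import defaultdict

# Learn a conditional probability table P(next pitch | previous pitch)
# from a melody, then sample new pitches from it — a two-node slice of
# the Bayesian-network idea.
def learn_cpt(melody):
    counts = defaultdict(lambda: defaultdict(int))
    for prev, cur in zip(melody, melody[1:]):
        counts[prev][cur] += 1
    return {p: {c: n / sum(nxt.values()) for c, n in nxt.items()}
            for p, nxt in counts.items()}

def sample_melody(cpt, start, length, rng=random.Random(0)):
    notes = [start]
    for _ in range(length - 1):
        nxt = cpt.get(notes[-1])
        if not nxt:                       # unseen pitch: stop early
            break
        pitches, probs = zip(*nxt.items())
        notes.append(rng.choices(pitches, weights=probs)[0])
    return notes

training = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60]   # invented melody
cpt = learn_cpt(training)
print(sample_melody(cpt, start=60, length=8))
```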
Research on Rule Engine for Automatic Identification of Relational Words in Chinese Complex Sentences
YANG Jin-cai,XIE Fang and HU Jin-zhu
Computer Science. 2014, 41 (Z11): 25-28. 
Abstract PDF(340KB) ( 425 )   
In recent years, the study of rule engines has achieved fruitful results, bringing new methods and ideas to the processing of Chinese complex sentences. In this paper, we designed a rule engine for the automatic identification of relational words in Chinese complex sentences. A pattern-matching strategy to select the relational word set, a "maximal containment elimination" strategy to resolve conflicting rules, and a "front cover" strategy to select the final result were designed. Using these three strategies in the rule engine improves the efficiency and accuracy of identifying relational words in Chinese complex sentences.
Research on Application of Automated Storage and Retrieval System Slotting Optimization Based on Virus Coevolutionary Genetic Algorithm
WANG Ting-zhang,QIU Jian-dong,SHANG Qing-jian and LIU Ya-li
Computer Science. 2014, 41 (Z11): 29-34. 
Abstract PDF(311KB) ( 555 )   
The access efficiency of an automated storage and retrieval system (AS/RS) directly affects the overall efficiency of modern logistics, and the key to access efficiency is slotting optimization. For the slotting problem of AS/RS in practical applications, a virus coevolutionary genetic algorithm (VEGA) was proposed in this paper to study the slot optimization problem, and a comparison between this algorithm and the traditional genetic algorithm was made. Taking picking efficiency and shelf stability directly as the optimization goals, a multi-objective mathematical model of slotting optimization was established. Finally, simulations programmed with MATLAB show that VEGA has better convergence and search efficiency than the traditional genetic algorithm. Thus, by using VEGA for AS/RS slotting optimization, loading and unloading efficiency and shelf stability can be largely improved, and the utilization of shelves is enhanced.
Vehicle Routing Problem with Time Window and its Application Based on Rich Road Network Weights
ZHANG Bei-jin,ZHOU Xiao-gen,MING Jie,YAO Chun-long and ZHANG Gui-jun
Computer Science. 2014, 41 (Z11): 35-38. 
Abstract PDF(1018KB) ( 395 )   
To solve the vehicle routing problem with an indeterminate number of vehicles, large-scale outlets and a multi-level transportation network, we established a road-property model based on a rich road network and GIS, and combined an N-order nearest-neighbor adaptive clustering algorithm with a genetic algorithm. Firstly, to overcome the small scale (fewer than 20 outlets) of the traditional vehicle routing problem and the defect of abstracting outlets into graph vertices during modeling, we built a network dataset from actual road data, used GIS to calculate the exact distances between outlets, and built an origin-destination (OD) distance matrix. Secondly, to reduce the complexity of designing an optimization algorithm for large-scale outlets, we used the N-order nearest-neighbor adaptive clustering algorithm to determine the number of clusters and divided the distribution outlets accordingly. Subsequently, under the constraints on vehicle type and number and on time windows, we used a genetic algorithm to optimize the distribution paths. Finally, two examples verified the effectiveness of the proposed method.
Solving QoS Multicast Routing Problem Based on Improved Quantum-behaved Particle Swarm Optimization Algorithm
WAN Zhen-kai and ZENG Lei
Computer Science. 2014, 41 (Z11): 39-42. 
Abstract PDF(364KB) ( 439 )   
For the QoS multicast routing problem, an improved quantum-behaved particle swarm optimization algorithm was proposed. To solve the problem better, the algorithm uses a preprocessing mechanism: the graph network topology is first converted into a tree topology, on which the encoding and decoding of particles can easily be established. This is conducive to eliminating bad particles and loops, and also reduces duplicate particles. Quantum-behaved particle swarm optimization is then used to update the particle positions. After each update, crossover and selection operators are applied to the population to enhance its diversity and accelerate convergence. Finally, the algorithm and the traditional particle swarm optimization algorithm were implemented for comparison. Simulation results show that the improved quantum-behaved particle swarm optimization algorithm not only obtains better solutions than the traditional particle swarm optimization algorithm, but also has faster convergence and stronger global optimization capability.
Comprehensive Transportation Cyber-physical System
GONG Yan,LI Su-jian and XING En-hui
Computer Science. 2014, 41 (Z11): 43-46. 
Abstract PDF(447KB) ( 544 )   
In order to provide references for the development of comprehensive transportation, the theory, methods and technology for designing a comprehensive transportation system were determined. Considering the application of cyber-physical systems to the structure, modes and technology of transportation, a comprehensive transportation cyber-physical system was proposed. We analyzed the main problems facing the development of comprehensive transportation and, to address these traffic problems, analyzed the characteristics of cyber-physical systems. At the same time, the feasibility of applying such an information system to comprehensive transportation was studied theoretically. To provide a reference for the design and future research directions of comprehensive transportation cyber-physical systems, their connotation, framework and four technical problems were set out. The results show that cyber-physical systems have wide application prospects for comprehensive transportation problems.
Ant Colony Algorithm Based on Optimization of Potential Field Method for Path Planning
WANG Fang,LI Kun-peng and YUAN Ming-xin
Computer Science. 2014, 41 (Z11): 47-50. 
Abstract PDF(581KB) ( 495 )   
To realize path planning in complicated environments, a new potential-field-optimized ant colony algorithm for path planning was presented. To further quicken the convergence of the ant colony algorithm, the path planning results of the potential field method were taken as prior knowledge, and the grids originally reached were initialized with neighborhood pheromone. A potential-field-guided weight was also constructed to modify the transition probability, so that the guidance remains active over the entire path search and avoids blind searching. Simulation results indicate that the proposed algorithm (APF-AC) is characterized by fast convergence, short planned paths and self-adaptivity.
Natural Languages Are Regular Languages
SHI Yue and SHI Hai-zhong
Computer Science. 2014, 41 (Z11): 51-54. 
Abstract PDF(293KB) ( 921 )   
A natural language consists of an alphabet set, a word set, a sentence set, a paragraph set and an essay set; moreover, the alphabet set is included in the word set, the word set in the sentence set, the sentence set in the paragraph set, and the paragraph set in the essay set. On this view, natural languages are regular languages. As a specific example, English consists of the English alphabet set, word set, sentence set, paragraph set and essay set, and is therefore a regular language. We also introduced ten concepts including the English alphabet empty-graph and the English alphabet empty-graph language. Likewise, Chinese consists of the Chinese character set, word set, sentence set, paragraph set and essay set, and is therefore a regular language; we introduced ten corresponding concepts including the Chinese character empty-graph and the Chinese character empty-graph language. These concepts and theories not only offer a new solution for natural language processing but also establish a new research field for linguistics.
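The flavor of the claim — languages built from finite sets of units are regular — can be illustrated with an ordinary regular expression: a finite lexicon gives a finite alternation, and any sentence over it is matched by a single pattern (the toy lexicon is ours, not the paper's):

```python
import re

# A finite word set yields the regular expression (w1|w2|...)( (w1|w2|...))*;
# every "sentence" over the lexicon is matched by this single pattern.
lexicon = ["the", "cat", "dog", "sees"]
word = "(" + "|".join(map(re.escape, lexicon)) + ")"
sentence = re.compile(word + r"( " + word + r")*")

print(bool(sentence.fullmatch("the cat sees the dog")))  # in the language
print(bool(sentence.fullmatch("the cat flies")))         # not in the language
```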
Discussion on Application of Information Technology in New Generation of Intelligent Transport System
LIU Dao-qun
Computer Science. 2014, 41 (Z11): 55-56. 
Abstract PDF(1013KB) ( 410 )   
Internet of Things, cloud computing and mobile Internet technologies are used in intelligent transport systems. Integrating these technologies achieves automated and intelligent traffic management, relieves traffic congestion, prevents traffic accidents and reduces pollution.
EMD-FSVM Prediction for Nonstationary Time Series
GONG Bang-ming,WANG Wen-bo and ZHAO Pan
Computer Science. 2014, 41 (Z11): 57-60. 
Abstract PDF(335KB) ( 439 )   
This paper proposed a novel method of predicting non-stationary time series based on empirical mode decomposition (EMD) and the fuzzy support vector machine (FSVM). Firstly, using EMD, the non-stationary time series is decomposed into single-modal components, reducing the nonlinear complexity of the signal to be predicted. Then each intrinsic mode function is predicted with the fuzzy support vector machine. Finally, the predictions of the intrinsic mode functions are superimposed to obtain the final forecast. Using the Lorenz series and the smoothed monthly sunspot series with noise as experimental data, our method was compared with BP neural network and SVM prediction methods. It has stronger adaptability to signals with outliers and noise, and better prediction accuracy.
Extraction Method of ECG Signal Characteristics Based on Tunable-Q Wavelet Transform
LI Nan,YANG Zhao-chun,SUN Le-jun and WEI Rong-guo
Computer Science. 2014, 41 (Z11): 61-64. 
Abstract PDF(355KB) ( 472 )   
Compared with traditional signal decomposition methods based on frequency bands, this paper proposed an adaptive signal decomposition method based on the quality factor. Using the tunable-Q wavelet transform to adaptively generate wavelet functions with different quality factors as the basis functions of signal decomposition, we decomposed the compound signal, with the Mallat algorithm, into a high-resonance component with sustained oscillation properties and a low-resonance component with transient impact properties, and used them to extract ECG signal characteristics. Compared with wavelet analysis, empirical mode decomposition and similar methods, this approach can effectively remove noise and interference from the signal and separate spectrally aliased signals with different oscillation behaviors. Numerical simulation and example analysis prove the superiority of the algorithm.
Study on Grade of Wine Quality Based on GA Toolbox of MATLAB
WU Zheng-zhi and LI Jin
Computer Science. 2014, 41 (Z11): 65-68. 
Abstract PDF(559KB) ( 569 )   
Wine quality grade evaluation is very important work. Because the quality level of wine is a categorical variable, traditional regression models cannot be used, but the ordered logistic regression model can. This paper used ordered logistic regression to build a prediction model of wine quality grade based on actual data from a complete investigation of Portuguese wine. It then applied a genetic algorithm with a penalty function to optimize the model and find the group of factors that yields the best quality level.
Study on TSP Solving Based on IPSO
GAO Feng and ZHENG Bo
Computer Science. 2014, 41 (Z11): 69-71. 
Abstract PDF(314KB) ( 855 )   
In order to obtain the optimal solution of the TSP, an improved particle swarm optimization (IPSO) algorithm was proposed. By using an adaptive updating mechanism and an inheritance judgment mechanism, the IPSO overcomes the traditional algorithm's tendency to fall into local optima, as well as the uncertainty of optimization results caused by the adjustable parameters and randomly set initial positions, ensuring consistent global optimal solutions in the solution space. Solving different TSP instances verified the effectiveness and stability of the IPSO. Comparative experiments show that the IPSO has outstanding global optimization ability on large-scale optimization problems.
Magic Cards Recommendation Algorithm Based on Bayesian Theory
YANG Yao-fei and LI Ye-li
Computer Science. 2014, 41 (Z11): 72-74. 
Abstract PDF(508KB) ( 558 )   
Magic is a table game with a long history; because of its logical complexity and numerous cards, countless decks can be composed. Recommending cards on the basis of logic is difficult to achieve, and the time complexity of a logic-based algorithm is non-deterministic polynomial. The Magic recommendation algorithm based on Bayesian theory mainly uses users' decks as raw data to calculate a recommendation matrix that replaces the logic portion of a logic-based recommendation algorithm. It avoids the NP problem of logic-based recommendation, and its recommendation accuracy increases as the accuracy of the users' decks increases.
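A minimal sketch of the recommendation-matrix idea (decks and card names are invented): score each candidate card c by averaging the co-occurrence-based conditional probabilities P(c | d) over the cards d already in the partial deck:

```python
from collections import defaultdict
from itertools import combinations

# Build co-occurrence counts from existing decks, then recommend the
# candidate card with the highest average conditional probability.
decks = [                      # invented example decks
    {"bolt", "goblin", "mountain"},
    {"bolt", "goblin", "shock"},
    {"island", "counterspell", "bolt"},
]

count = defaultdict(int)       # how many decks contain each card
cooc = defaultdict(int)        # how many decks contain each pair
for deck in decks:
    for card in deck:
        count[card] += 1
    for a, b in combinations(deck, 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

def recommend(partial, k=1):
    candidates = set(count) - partial
    score = {c: sum(cooc[(c, d)] / count[d] for d in partial) / len(partial)
             for c in candidates}
    return sorted(candidates, key=score.get, reverse=True)[:k]

print(recommend({"bolt", "goblin"}))
```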
Real Estate Appraisal Model and Empirical Research Based on Genetic Algorithm to Optimize Neural Network
LV Ji
Computer Science. 2014, 41 (Z11): 75-77. 
Abstract PDF(316KB) ( 734 )   
The prediction accuracy of traditional assessment methods is insufficient because of the complicated nonlinear relationship between real estate prices and their influencing factors. This paper proposed a real estate appraisal prediction model based on genetic algorithms (GA) and the BP neural network (BPNN). It uses the BPNN to determine the mapping between the impact factors and the appraised price, and uses the GA to optimize the weights and thresholds of the BPNN. The method improves the convergence rate of the BPNN and avoids getting stuck at local extreme points. Finally, computer simulations on 100 sample data show that the proposed method is valid and accurate for real estate appraisal.
Research of Obstacle Detection Based on Aerial Panorama Image
CHANG Jia-yi,QIN Rui,LI Qing,CHEN Xi and XU Jian-jun
Computer Science. 2014, 41 (Z11): 78-82. 
Abstract PDF(898KB) ( 622 )   
At present, feature-based methods are used to detect vehicles and pedestrians in panorama images, but they cannot avert all the threats obstacles pose to driving. Unlike those methods, we detected all obstacles taller than the ground plane using a motion-based method. First, we built the vehicle motion model and deduced the relationship among vehicle motion, obstacle height and pixel optical flow. Taking advantage of the fact that the vertical views of adjacent cameras overlap and that obstacle points in the overlapping region have two different optical flows, we detected obstacles in the intersection areas, which speeds up the solving of the optimal motion parameters. At last, we detected all obstacles using the motion-compensated image. Road experiments demonstrate that the proposed method can effectively detect all obstacles taller than the ground when the vehicle is on a flat road.
Multi-level Line Matching Method Based on Multiple Constraints
LI Jun-yao,GU Hong-bin,SUN Jin and WANG De-zhi
Computer Science. 2014, 41 (Z11): 83-87. 
Abstract PDF(902KB) ( 555 )   
To match images with large parallax changes, occlusion or broken lines, this paper proposed a multi-level line matching method based on multiple constraints. Firstly, guided by reliably matched seed points, line segments in the neighborhood of the seed points are constrained to accomplish point-line matching. Next, line-line matching is performed under geometric feature constraints based on the homography matrix and the epipolar constraint. Finally, line matching is completed with line-surface matching using an adaptive linear similarity constraint. Experiments show that this method can accurately match images with large parallax, occlusion or broken lines, and overcomes the low accuracy of existing line matching algorithms and their inability to handle large parallax changes.
Optimization and Research on Ellipse Fitting and Application Based on Algebraic Distance
CUI Jia-li,GONG He,WANG Yi-ding,JIA Rui-ming and XIAO Ke
Computer Science. 2014, 41 (Z11): 88-90. 
Abstract PDF(814KB) ( 814 )   
Considering that sample points are easily affected by noise in traditional ellipse fitting algorithms, a least-median-of-squares ellipse fitting approach was proposed in this paper on the foundation of algebraic distance, geometric distance and the RANSAC (random sample consensus) algorithm. A linear transformation of the original data is adopted, and then many tests of different points selected from the painted pentagon are performed with the help of the minimum Euclidean distance between the fitted ellipse and the boundary points. In the end, the five parameters of the ellipse are obtained and the final ellipse is fitted. Experimental results on simulated and real images show that the algorithm has good accuracy, robustness and low time complexity; at the same time, it can correctly fit and recognize the planets.
A New Class of Image Segmentation Iterative Algorithm Based on One-dimensional Renyi Entropy
RAN Qing-hua,GONG Qu and WANG Ke
Computer Science. 2014, 41 (Z11): 91-94. 
Abstract PDF(1154KB) ( 388 )   
Aiming at the limitations of the one-dimensional Renyi entropy algorithm, this paper proposed a new iterative image segmentation method based on the one-dimensional Renyi entropy threshold. The method iteratively searches sub-regions of the image for segmentation to obtain the final threshold: the process stops when the difference between the Renyi thresholds calculated in two successive iterations is less than a preset constant, and the last threshold calculated is the final one. The paper not only gave the segmentation results intuitively but also gave quantitative results using the uniformity measure, an image segmentation evaluation criterion. The experimental results show that the iterative method obtains the desired segmentation, that the uniformity measure calculated at each iteration forms a monotonically increasing sequence, and that the proposed method is not sensitive to the parameter α.
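The underlying one-dimensional Renyi criterion, which the iteration repeatedly applies to sub-regions, picks the threshold maximizing the summed Renyi entropies of background and foreground; a direct, single-pass sketch on a toy histogram:

```python
import math

# One-dimensional Renyi-entropy thresholding: choose the gray level t that
# maximizes H_alpha(background) + H_alpha(foreground), where
# H_alpha = ln(sum p_i^alpha) / (1 - alpha) over each normalized
# class histogram.
def renyi_threshold(hist, alpha=2.0):
    best_t, best_h = None, -math.inf
    for t in range(1, len(hist)):
        w0, w1 = sum(hist[:t]), sum(hist[t:])
        if w0 == 0 or w1 == 0:
            continue                      # empty class: skip this split
        h0 = math.log(sum((h / w0) ** alpha for h in hist[:t] if h)) / (1 - alpha)
        h1 = math.log(sum((h / w1) ** alpha for h in hist[t:] if h)) / (1 - alpha)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# bimodal toy histogram: dark peak around level 2, bright peak around 6
hist = [1, 8, 20, 8, 1, 9, 22, 9, 2]
print(renyi_threshold(hist))   # threshold falls between the two peaks
```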
Adaptive Module Localization Method of Local Feature for Data Matrix Code
HUANG Chong,ZHENG He-rong and PAN Xiang
Computer Science. 2014, 41 (Z11): 95-99. 
Abstract PDF(1427KB) ( 411 )   
In order to solve the low decoding rate of Data Matrix images with irregular deformation, this paper proposed an adaptive localization method based on edge gradient features, which can significantly improve localization accuracy. The algorithm consists of three steps. Firstly, according to the contour feature of the DM code, it locates the "L" shape to get the position of the code. Secondly, it estimates the number of code blocks by detecting the dashed border and locates the mapping points by affine transformation. Finally, local adjustment is performed to correct irregular deformation using edge gradient features, so that the original code can be reconstructed. In experiments on a variety of Data Matrix images, the results show that the algorithm can be applied to highly distorted images, and the decoding accuracy is greatly improved, from 93% to 98%.
Similar Bézier Curves with Two Shape Parameters
XI Hai-ying and ZHANG Gui-cang
Computer Science. 2014, 41 (Z11): 100-102. 
Abstract PDF(518KB) ( 394 )   
The paper generalized the Bernstein basis by replacing the variable u with a function f(u). The structure and properties of the new basis were analyzed. The curve built on this basis not only possesses all the characteristics of the Bézier curve but also brings some advantages: for example, the shape of the curve can easily be changed by adjusting the shape parameters without changing u, and the points on the proposed curve correspond to different parameter positions than those on the ordinary Bézier curve. Finally, the continuity conditions of the curve were discussed. The basis and curve have study value.
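The construction can be sketched by evaluating an ordinary Bézier curve at f(u) instead of u, which is exactly substituting f(u) into the Bernstein basis; f(u) = u**k below is only one illustrative choice of shape function, and any f with f(0) = 0, f(1) = 1 preserves endpoint interpolation:

```python
# Evaluate a Bézier curve on the generalized basis B_i^n(f(u)) via
# de Casteljau at t = f(u). The control points are an arbitrary example.
def bezier(points, u, f=lambda u: u):
    t = f(u)
    pts = [tuple(p) for p in points]
    while len(pts) > 1:           # one de Casteljau reduction per level
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(bezier(ctrl, 0.5))                       # ordinary Bézier at u = 0.5
print(bezier(ctrl, 0.5, f=lambda u: u ** 2))   # same curve, reparameterized
```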
Lossless Color Image Compression Method Based on Fuzzy Logic
LI Qing and LI Dong-hui
Computer Science. 2014, 41 (Z11): 103-106. 
Abstract PDF(563KB) ( 496 )   
In digital image processing, lossless color image compression has an increasingly wide range of applications. To take full advantage of the correlation between color components and improve the lossless compression rate of color images, this paper proposed a lossless color image compression algorithm based on fuzzy logic. The algorithm combines fuzzy logic with H.264 intra prediction and proposes an improved intra prediction algorithm used to predict the G component. To reduce the texture complexity of the R and B components, the similarity of image texture between color components is used. Then the R and B components are predicted by combining the optimal prediction mode of the G component with an adaptive compensation algorithm based on context statistics. At last, the prediction residuals of the three color components are encoded by Golomb coding, whose optimal parameter is calculated from the prediction results. The experimental results show that when encoding color images with clear textures, the proposed method significantly improves coding efficiency compared with JPEG-LS.
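The Golomb coding step can be sketched in its power-of-two (Rice) special case with parameter m = 2**k: quotient in unary, remainder in k binary bits. The residual values are invented:

```python
# Golomb-Rice coding of non-negative prediction residuals.
def rice_encode(values, k):
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.append("1" * q + "0" + format(r, f"0{k}b"))  # unary q, binary r
    return "".join(bits)

def rice_decode(bitstream, k, count):
    values, i = [], 0
    for _ in range(count):
        q = 0
        while bitstream[i] == "1":       # read the unary quotient
            q, i = q + 1, i + 1
        i += 1                           # skip the terminating '0'
        r = int(bitstream[i:i + k], 2) if k else 0
        i += k
        values.append((q << k) | r)
    return values

residuals = [0, 3, 7, 1, 12]             # invented residual values
code = rice_encode(residuals, k=2)
print(code)
print(rice_decode(code, k=2, count=len(residuals)))
```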
Image Registration Method Based on Straight-line in Hough Parameter Space
QU Zhi-guo,TAN Xian-si,LIN Qiang,WANG Hong and GAO Ying-hui
Computer Science. 2014, 41 (Z11): 107-109. 
Abstract PDF(830KB) ( 606 )   
In order to solve the registration problem for images obtained by different sensors, from different views and at different times, a novel image registration algorithm based on straight lines was presented in this paper. Firstly, we transformed the straight lines in image space into points in the Hough parameter space using the Hough transform. Then we proposed an approach that simultaneously determines the correspondences between the points and solves for the parameters of the registration transformation function in Hough space. Experiments conducted on real-world images reveal that the algorithm is robust and efficient for most images with rich line features.
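The line-to-point mapping the method relies on is rho = x*cos(theta) + y*sin(theta); a minimal accumulator sketch showing collinear pixels voting for a single parameter-space point (the coarse grid and pixel coordinates are our toy example):

```python
import math

# Hough mapping: each image line becomes one point (rho, theta) in
# parameter space. Collinear pixels all vote for the same (rho, theta)
# cell, so the accumulator peak recovers the line.
def hough_peak(points, n_theta=180):
    acc = {}
    for x, y in points:
        for it in range(n_theta):
            theta = math.pi * it / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, it)] = acc.get((rho, it), 0) + 1
    return max(acc, key=acc.get)          # most-voted parameter cell

# pixels on the line y = x, i.e. rho = 0 at theta = 135 degrees
pts = [(i, i) for i in range(10)]
rho, it = hough_peak(pts, n_theta=4)      # coarse grid: 0, 45, 90, 135 deg
print((rho, it * 45))
```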
Improved Iris Recognition Algorithm Based on PCNN
JIN Xin,NIE Ren-can and ZHOU Dong-ming
Computer Science. 2014, 41 (Z11): 110-115. 
Abstract PDF(990KB) ( 470 )   
We proposed an improved iris recognition algorithm based on the pulse coupled neural network (PCNN). Because iris localization is not accurate enough, morphological filtering was applied for image denoising, which improves recognition accuracy. By statistical analysis of the neurons' oscillation time sequences, we concluded that different iris textures have unique oscillation time sequences (OTS). Finally, we achieved the improved iris recognition algorithm by calculating and classifying the Euclidean distances between OTS. Experimental results on the CASIA-Iris-Interval database show the effectiveness of the proposed method and reveal that it outperforms traditional methods in recognition accuracy and recognition rate.
3D Face Registration Based on Surface Deformation
GE Yun
Computer Science. 2014, 41 (Z11): 116-118. 
Abstract PDF(1094KB) ( 465 )   
A 3D face database is an important data platform for model training and algorithm design. To improve the matching quality and efficiency of 3D face samples, we proposed a new registration method based on surface deformation. First, a series of deformation operations is performed on the template sample to obtain the correspondence between the template sample and the raw sample, and the registration is then performed on the raw sample. During the matching process, a statistical method is used to handle noise points and holes. The experimental results show that the proposed method performs well on face registration.
Image Annotation by Similarity Content-based Image Retrieval
DENG Li-qiong,HAO Xiang-ning,XIA Ming and LI Zhong-ning
Computer Science. 2014, 41 (Z11): 119-122. 
Abstract PDF(609KB) ( 647 )   
Image annotation has been an active research topic in recent years. In this paper, we targeted the automatic image annotation problem with a novel search-and-refinement framework. In the search stage, we performed content-based image retrieval (CBIR) based on the MSF global feature to find similar images in the image database. Then, in the refinement stage, an algorithm using random walk with restarts re-ranks the annotations, and the refined keywords are used to annotate the uncaptioned image. This framework does not impose a training stage, efficiently utilizes well-annotated images, and is potentially capable of dealing with an unlimited vocabulary. The experimental results show the effectiveness and efficiency of the proposed approach.
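The random-walk-with-restarts re-ranking can be sketched on a toy keyword co-occurrence graph (graph weights and seed scores are invented; the seed plays the role of keywords gathered from the retrieved similar images):

```python
# Random walk with restarts (RWR): repeatedly spread relevance mass along
# edge weights, teleporting back to the seed keywords with probability
# `restart`; the stationary scores give the re-ranked annotations.
def rwr(adj, seed, restart=0.15, iters=100):
    """adj: {node: {neighbor: weight}}; seed: restart distribution."""
    out = {n: sum(adj[n].values()) for n in adj}   # out-weight per node
    r = dict(seed)
    for n in adj:
        r.setdefault(n, 0.0)
    for _ in range(iters):
        nxt = {n: restart * seed.get(n, 0.0) for n in r}
        for m, nbrs in adj.items():
            for n, w in nbrs.items():
                nxt[n] += (1 - restart) * r[m] * w / out[m]   # walk step
        r = nxt
    return r

adj = {                                  # invented co-occurrence graph
    "sky":   {"cloud": 3.0, "sea": 1.0},
    "cloud": {"sky": 3.0, "sea": 1.0},
    "sea":   {"sky": 1.0, "cloud": 1.0, "car": 1.0},
    "car":   {"sea": 1.0},
}
seed = {"sky": 0.5, "cloud": 0.5}        # keywords of the similar images
scores = rwr(adj, seed)
print(sorted(scores, key=scores.get, reverse=True))   # most relevant first
```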
Vocabulary Tree Image Retrieval Method Based on ISODATA
ZHANG Ting,DAI Fang and GUO Wen-yan
Computer Science. 2014, 41 (Z11): 123-127. 
Abstract PDF(941KB) ( 416 )   
Vocabulary tree image retrieval is an efficient image retrieval algorithm based on the structure of visual words. It employs the SIFT algorithm and the K-means algorithm for feature extraction and clustering respectively. K-means, however, depends heavily on the initial values, and its result tends toward forced clusters when the number of classes is unknown; SIFT is also prone to data overflow and increases retrieval time. Two novel feature extraction methods, called SIFT_CRONE and Color_HU, were therefore proposed, and the ISODATA algorithm was introduced in this paper. The SIFT_CRONE method determines the key points of the image using SIFT, calculates the pixel gradients around the key points with the CRONE operator, and describes the key points by vectors; its advantage is that it keeps the strengths of SIFT features while reducing retrieval time. In the Color_HU method, we determined the key points and the effective area by SIFT, and calculated the color histogram and Hu moments of the effective area to reduce the feature dimension and the retrieval time. Meanwhile, we presented an adaptive parameter estimation algorithm for ISODATA. The experimental results show that ISODATA avoids K-means' dependence on initial values and obtains ideal results when the number of clusters is unknown. The two proposed feature extraction methods have their own advantages, and both shorten image retrieval time and improve retrieval efficiency.
3D Face Expression Recognition Based on Differential Operator
GE Yun
Computer Science. 2014, 41 (Z11): 128-132. 
Abstract PDF(955KB) ( 372 )   
References | RelatedCitation | Metrics
Based on the Laplace differential operator, a 3D facial expression recognition method was proposed. First, the raw samples are registered using a surface deformation method. Then the expression features are calculated by the differential operator, and a facial expression dictionary is established from the feature vectors derived from the training samples. Finally, sparse representation is used to perform recognition. The experimental results show that the proposed method can effectively improve the accuracy of 3D facial expression recognition.
Adaptive Denoising Method Based on Anisotropic Diffusion Equation
ZHAO Chuan,MA Xi-rong,MA Ling and ZHANG Tong
Computer Science. 2014, 41 (Z11): 133-135. 
Abstract PDF(511KB) ( 465 )   
References | RelatedCitation | Metrics
An adaptive filtering method based on the anisotropic diffusion equation was proposed. The denoising principle of the anisotropic diffusion equation was studied, and adaptive filtering of images was realized by combining an improved image structural similarity algorithm with the anisotropic diffusion equation. Experimental results show that the improved structural similarity algorithm is robust and advantageous in adaptive filtering applications.
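The diffusion part of such a filter can be sketched with the classic Perona-Malik scheme, of which the anisotropic diffusion equation used here is the standard form; the paper's structural-similarity-driven parameter adaptation is not reproduced.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.2):
    """Perona-Malik diffusion: smooth flat regions, preserve strong edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four compass neighbors
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping conduction: small where the gradient is large
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

rng = np.random.default_rng(0)
step = np.zeros((32, 32)); step[:, 16:] = 100.0        # a sharp vertical edge
noisy = step + rng.normal(0, 5, step.shape)
denoised = anisotropic_diffusion(noisy)
```

Noise in the flat regions is smoothed while the 100-level step survives, because the conduction coefficient collapses across the edge.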
Image Forgery Detection Using Characteristics of Background Noise
LIU Li-juan and LIN Xiao-zhu
Computer Science. 2014, 41 (Z11): 136-138. 
Abstract PDF(835KB) ( 528 )   
References | RelatedCitation | Metrics
Digital images contain a component of background noise that arises from the imaging process. When images with different noise levels are spliced together, the noise characteristics of the forged area differ from those of the rest of the image. This paper proposed a background noise estimation algorithm based on the statistical properties of skewness. We detect the forged parts by dividing the image into sub-blocks and computing the noise variance of each one. The algorithm removes the original image details with a DCT transform, estimates the noise using the statistical properties of skewness, and estimates the standard deviation of the noise by a conditional minimum method. It improves the iterative conditional minimum value method by using a differential method to calculate the minimum, which avoids the problem of setting an initial value and improves accuracy. The experimental results show that the proposed noise estimation algorithm detects forged parts in spliced images accurately and effectively.
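The block-wise detection pipeline can be sketched as follows; the per-block noise estimator here is a simple median-absolute-deviation stand-in for the paper's DCT-plus-skewness estimator, and all thresholds are illustrative.

```python
import numpy as np

def block_noise_map(img, block=16):
    """Estimate per-block noise std from a high-frequency residual.
    (Stand-in for the paper's DCT + skewness estimator.)"""
    # Laplacian-like residual suppresses image content, keeps noise
    resid = img - (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    h, w = img.shape
    rows, cols = h // block, w // block
    sigma = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            b = resid[i*block:(i+1)*block, j*block:(j+1)*block]
            # robust std via median absolute deviation
            sigma[i, j] = 1.4826 * np.median(np.abs(b - np.median(b)))
    return sigma

def flag_spliced_blocks(sigma, factor=2.0):
    """Flag blocks whose noise level deviates strongly from the image median."""
    return sigma > factor * np.median(sigma)

rng = np.random.default_rng(1)
img = rng.normal(0, 2.0, (64, 64))                    # host image: low noise
img[16:32, 16:32] += rng.normal(0, 12.0, (16, 16))    # spliced patch: high noise
mask = flag_spliced_blocks(block_noise_map(img))
```

The spliced block stands out because its residual noise level is several times the image-wide median, which is exactly the inconsistency splice detection exploits.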
Research Advance and Prospect of Membrane Computing Applied in Image Processing
KOU Guang-jie,MA Yun-yan,YUE Jun and ZOU Hai-lin
Computer Science. 2014, 41 (Z11): 139-143. 
Abstract PDF(775KB) ( 775 )   
References | RelatedCitation | Metrics
As a new branch of bio-inspired natural computing, membrane computing has great potential. The concept, classification and definition of membrane computing are discussed first. Then its applications in image segmentation, image enhancement, image thinning and related fields are classified and reviewed. The newest techniques for realizing membrane computing are analyzed and discussed. Finally, a summary and an outlook on the applications of membrane computing in image processing are given.
Algorithm of Image Matching Based on Color SIFT and Shape Context
XU Yan-lu,MA Yan,LI Shun-bao and ZHANG Xiang-fen
Computer Science. 2014, 41 (Z11): 144-146. 
Abstract PDF(604KB) ( 502 )   
References | RelatedCitation | Metrics
This paper presented a new image matching algorithm based on improved SIFT and shape context, aiming to overcome the disadvantages of the conventional SIFT and shape context algorithms. We take color invariants into consideration and construct color SIFT descriptors. In the shape context algorithm, a histogram based on central points is used instead of the traditional histogram based on contour points. The new joint descriptors, combining SIFT and shape context, then guide feature point matching according to a newly defined joint distance to obtain the initial matching pairs. Finally, the partial least squares method is used to eliminate mismatched points. The experimental results show that the proposed algorithm improves image matching accuracy effectively.
Study of Super-resolution Image Restoration Algorithm Based on Wavelet Transform
TANG Jia-lin,WU Ze-feng,JIANG Cai-gao and SUN Hui-fang
Computer Science. 2014, 41 (Z11): 147-149. 
Abstract PDF(758KB) ( 693 )   
References | RelatedCitation | Metrics
Without changing the existing hardware, and taking advantage of the rapid development of wavelet theory in recent years, this paper presented an image super-resolution algorithm based on the wavelet transform. After direct neighborhood interpolation, the low-resolution image is decomposed into four sub-bands with the DWT and, at the same time, processed directly by the SWT. The high-frequency sub-bands are corrected using the SWT coefficients so as to refine the estimated DWT coefficients. Finally, a high-resolution output image is obtained by applying the inverse discrete wavelet transform (IDWT) to the corrected high-frequency sub-bands and the input image. Experiments show that, compared with traditional bilinear and bicubic interpolation, the proposed algorithm improves the peak signal-to-noise ratio (PSNR).
Design and Implementation of Image Classification Software Based on OpenCV
JIA Ning,GAO Nan,LI Gui-cai and WANG Zhao-yun
Computer Science. 2014, 41 (Z11): 150-153. 
Abstract PDF(593KB) ( 543 )   
References | RelatedCitation | Metrics
For an image classification system, the main problems to be solved are extracting facial similarity and comparing two face images, so as to automatically organize the pictures in an album. Using OpenCV's face detection and face recognition functions, with Haar features and AdaBoost, the face region in an image can be located; LBPH is then used to predict the face and obtain the similarity between two faces. The pictures can then be classified by face into directories. With the completed user interface, the software is simple and convenient to operate. Test results show that face detection and face matching achieve a high success rate, and the software is a practical convenience for everyday use.
Coverage Blind Restoration Algorithm Based on AUV Movement in UWSNs
ZHANG Ning-shen,HUANG Chen-cheng and LIU Lin-feng
Computer Science. 2014, 41 (Z11): 154-157. 
Abstract PDF(950KB) ( 406 )   
References | RelatedCitation | Metrics
This paper proposed a coverage blind-zone restoration algorithm based on AUV movement in underwater wireless sensor networks (UWSNs). First, the area to be covered is mapped into hexagonal cells; then, to restore the coverage blind zones, the AUV traverses each cell according to a proper strategy. The algorithm overcomes the difficulty of reliably covering a complex and unknown underwater environment, and it helps the AUV minimize the length of its traversal paths. We also analyzed and discussed 3D scenes and collaboration among multiple AUVs. Simulation results show that the algorithm restores coverage blind zones well, whether the scenes are regular, irregular or discontinuous.
Novel Algorithm of Satellite Covert Communication Based on Chaotic Spread Spectrum Modulation
LIAN Chen,DA Xin-yu and ZHANG Ya-pu
Computer Science. 2014, 41 (Z11): 158-161. 
Abstract PDF(588KB) ( 488 )   
References | RelatedCitation | Metrics
In view of the fact that the traditional Logistic map, as well as its modified versions, has only one surjective parameter a, a new chaotic map function was designed, which broadens the range of the surjective parameter so as to improve the ergodic property of the chaotic sequence. Based on coherent chaos shift-keying modulation, a new satellite covert communication system with chaotic spread spectrum was constructed. The randomness, correlation and balance properties of the proposed chaotic sequence, and the error rate of the covert communication system, were analyzed. The simulation results show that the proposed chaotic map outperforms the traditional Logistic map in ergodicity and balance. The bit error rate of the original service receiver is unchanged when the power ratio of the original service signal to the spread-spectrum-modulated signal is greater than 20dB, and the bit error rate of the covert communication receiver is below 10^-3 when the spreading factor is 80 and the SNR (Signal to Noise Ratio) is greater than 6dB. The proposed system can meet covert communication requirements.
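For reference, the traditional Logistic map that the paper improves upon can be turned into a binary spreading code as below; the extreme sensitivity to the initial condition is what makes such codes hard to intercept.

```python
import numpy as np

def logistic_sequence(x0, mu=4.0, n=1000):
    """Classic Logistic map x_{k+1} = mu * x_k * (1 - x_k), chaotic at mu = 4."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = mu * x[k] * (1 - x[k])
    return x

def spreading_code(x0, n=80):
    """Binarize the chaotic sequence into a +/-1 spreading code."""
    x = logistic_sequence(x0, n=n)
    return np.where(x > 0.5, 1, -1)

c1 = spreading_code(0.3)
c2 = spreading_code(0.3000001)   # tiny change in the initial condition
```

After a few dozen iterations the two trajectories decorrelate completely, so the two codes differ even though the initial conditions agree to seven digits.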
Reliable Data Aggregation Strategy of Multi-optimized Factors Based on LEACH Algorithm
WANG Zhen-fei,YU Li and ZHENG Zhi-yun
Computer Science. 2014, 41 (Z11): 162-167. 
Abstract PDF(473KB) ( 473 )   
References | RelatedCitation | Metrics
Owing to high-density node deployment and limited energy, wireless sensor nodes suffer from high data redundancy and are vulnerable to attack. We proposed a reliable data aggregation strategy with multiple optimization factors based on the LEACH algorithm in wireless sensor networks. The strategy optimizes LEACH in three aspects: two reliability factors, MN-LEACH and LF-LEACH, are adopted in the similarity calculation for data fusion, and a multi-path transmission factor, MT-LEACH, is used for data transmission. First, in a hostile environment, a Laplace function is used instead of a Gauss function to filter emergent and nonlinear noisy data, preventing a large number of malicious nodes from interfering with the real data and improving the accuracy of data fusion; the MN-LEACH factor is then used to calculate similarity and compute weighted averages for data aggregation. Finally, the LF-LEACH factor is used to monitor link transmission, and the MT-LEACH factor is adopted during transmission so that the link load is balanced. The experimental results show that the strategy outperforms traditional LEACH in aggregation accuracy, signal-to-noise ratio, energy consumption and other metrics.
MIMO Broadcast Transmission Scheme Based on BD Precoding and TDM
CHEN Pei-lei and LIU Ping
Computer Science. 2014, 41 (Z11): 168-169. 
Abstract PDF(221KB) ( 614 )   
References | RelatedCitation | Metrics
In MIMO (Multi-Input Multi-Output) systems, the block diagonalization (BD) precoding algorithm is applied to eliminate co-channel interference (CCI). BD requires that the total number of all users' receiving antennas not exceed the number of system transmitting antennas. As the number of users increases, the number of receiving antennas grows and the complexity of the BD algorithm rises sharply. To solve this problem and maximize the number of users the system can support, this paper presented a transmission scheme based on block diagonalization precoding and time division multiplexing (TDM). In this scheme, the users are divided into several groups that are assigned to different slots. The scheme reduces the requirements on the number of system transmitting antennas and the complexity of the algorithm. Simulation results show that the scheme can support more users and increase system throughput.
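The null-space construction at the heart of BD precoding can be sketched as follows: each user's precoder is built from the right singular vectors spanning the null space of the other users' stacked channels, so the co-channel interference vanishes (a minimal two-user sketch with random channels, not the paper's grouped TDM scheme).

```python
import numpy as np

def bd_precoders(H_list):
    """Block diagonalization: each user's precoder lies in the null space of
    all other users' stacked channel matrices, zeroing co-channel interference."""
    precoders = []
    for k in range(len(H_list)):
        H_other = np.vstack([H for j, H in enumerate(H_list) if j != k])
        # right singular vectors beyond the rank span the null space of H_other
        _, s, Vh = np.linalg.svd(H_other)
        rank = int(np.sum(s > 1e-10))
        null_basis = Vh[rank:].conj().T          # shape: Nt x (Nt - rank)
        precoders.append(null_basis)
    return precoders

rng = np.random.default_rng(0)
Nt = 4                                            # transmit antennas
H1 = rng.normal(size=(2, Nt)) + 1j * rng.normal(size=(2, Nt))
H2 = rng.normal(size=(2, Nt)) + 1j * rng.normal(size=(2, Nt))
W1, W2 = bd_precoders([H1, H2])
```

The products H2 @ W1 and H1 @ W2 are numerically zero, which is the defining property of BD; the antenna constraint appears here as the requirement that the null space be non-empty.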
Algorithm of Selecting Optimal Dynamic Cooperation Tree in Opportunistic Networks
WU Jia,YI Xi and CHEN Zhi-gang
Computer Science. 2014, 41 (Z11): 170-173. 
Abstract PDF(276KB) ( 404 )   
References | RelatedCitation | Metrics
Randomness, mobility and intermittent connectivity are characteristic of opportunistic networks, and message transmission in them resembles the way people pass messages in human social networks. Traditional social-network algorithms, however, do not perform well when the application environment changes. This paper therefore designed an optimal dynamic cooperation tree algorithm for environments with randomness, mobility and intermittent connectivity. By finding the dynamic topology structure and establishing dependability, usability and decline factors, we compute the weight of the topology and obtain the optimal objects and paths. In simulation, the algorithm was compared with classical algorithms and achieved good results.
Research of Representative Group Mobility Models
HOU Yan-shun,SUN Jia-qi and WANG Xiao-bo
Computer Science. 2014, 41 (Z11): 174-177. 
Abstract PDF(335KB) ( 812 )   
References | RelatedCitation | Metrics
Mobility models are the basis of mobility modeling and simulation. Individual mobility models have been studied thoroughly, and defects such as imperfect node distribution and velocity attenuation have been demonstrated. By comparison, less research has been done on group mobility models. This paper studied the two most representative group mobility models, RPGM and ECM, analyzed their node distribution, velocity distribution, time correlation, node correlation and parameter controllability, and provided a reference for model choice and parameter setting.
Research of Direct Memory Communication and NIC Prototype
CHEN Ying-tu,WANG Ai-lin,ZHANG Yan and LIU Jun-rui
Computer Science. 2014, 41 (Z11): 178-181. 
Abstract PDF(850KB) ( 638 )   
References | RelatedCitation | Metrics
At present, communication over PCI is limited by the PCI bus, while the number of memory slots in computers is increasing and memory management is becoming more advanced. This paper therefore put forward a direct memory communication method, abbreviated DMC. The NIC (network interface card) based on DMC is inserted into a memory slot, and the memory on the DMC-NIC is reserved as a communication area. To send data, the user writes it into the communication space as ordinary memory; to receive data, the user reads it from the communication space. Thus the user can accomplish direct point-to-point communication between computers by accessing memory: the communication speed is not limited by the I/O bus, and the copy between memory and NIC is eliminated. We also applied DMC to a high-speed fibre channel switch network and designed a FIFO-DMC NIC to validate the DMC approach.
Survey on Mobile Data Offloading
YAO Hong,BAI Chang-min,HU Cheng-yu,ZENG De-ze and LIANG Qing-zhong
Computer Science. 2014, 41 (Z11): 182-186. 
Abstract PDF(922KB) ( 394 )   
References | RelatedCitation | Metrics
Mobile data offloading is a relatively new research hotspot. The growing mobile data demand of users brings cellular network operators problems such as traffic load and network congestion; mobile data offloading was proposed to shift cellular network data onto ubiquitous local opportunistic communication. The basic idea is to distribute a data object to only part of the subscribers (called seed sources) via the cellular network, and then let the seed sources propagate the object to other subscribers through opportunistic local communications (e.g., Bluetooth, Wi-Fi Direct, DSRC, Device-to-Device in LTE) or via Wi-Fi APs. This paper first summarized the research background, significance and general progress of data offloading. Then, focusing on the research content and trends of academic circles at home and abroad, we classified data offloading schemes by their form and technical route and summarized each class. Finally, we concluded the survey in the light of real-world environments.
Handover Algorithm Based on Location Prediction in Cellular Network
WANG Meng-ran,QIAO Shao-jie and YU Shan-shan
Computer Science. 2014, 41 (Z11): 187-190. 
Abstract PDF(673KB) ( 506 )   
References | RelatedCitation | Metrics
To meet the demands of future mobile cellular networks, with small cells, frequent handovers, large numbers of users and multimedia applications, this article proposed a new prediction-based handover scheme built on an analysis of position prediction and existing handover schemes. The basic idea is: (1) mine frequent trajectories from a large number of mobile users' historical trajectories; (2) generate movement rules from the frequent trajectories; (3) apply these movement rules to decide whether to hand the communication over to another station. We simulated the scheme and compared it with the traditional handover scheme: it reduces unnecessary handovers and the error rate, and improves the handover accuracy rate. To a certain extent, the scheme reduces communication cost and improves the capacity and QoS of the communication system.
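Steps (1)-(3) can be sketched with a toy rule miner: count cell-to-cell transitions in historical trajectories, keep the frequent ones as movement rules, and prepare a handover only when a rule predicts the target cell (cell identifiers and the support threshold are illustrative, not from the paper).

```python
def mine_movement_rules(trajectories, min_support=2):
    """Count cell-to-cell transitions; for each cell keep the most frequent
    successor with support >= min_support as a movement rule."""
    counts = {}
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    rules = {}
    for (a, b), c in counts.items():
        if c >= min_support and c > counts.get((a, rules.get(a)), 0):
            rules[a] = b                       # rule: "after cell a, expect cell b"
    return rules

def should_handover(current_cell, target_cell, rules):
    """Hand over only if the rules predict the user will enter target_cell,
    suppressing unnecessary (ping-pong) handovers."""
    return rules.get(current_cell) == target_cell

trajectories = [["A", "B", "C"], ["A", "B", "D"], ["A", "B", "C"], ["B", "C"]]
rules = mine_movement_rules(trajectories)
```

With these histories the miner learns A→B and B→C, so a handover toward D from B would be rejected.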
Research on Minimum Redundancy Storage Regeneration Code MSRRC Based on Matrix Operations
WANG Yu,ZHAO Yue-long and HOU Fang
Computer Science. 2014, 41 (Z11): 191-194. 
Abstract PDF(366KB) ( 405 )   
References | RelatedCitation | Metrics
Distributed storage systems often use erasure-code redundancy to improve the safety and reliability of data, giving the system the ability to repair failed data by itself; however, traditional erasure codes must transfer a large amount of data to repair a failed node. Regeneration codes are an improved form of erasure codes whose main characteristic is that restoring a single node's data does not require downloading the entire file, which effectively reduces the network bandwidth used for repair. The literature proves that data repair has a minimum storage regeneration (MSR) point, so this paper presents a minimum redundancy storage regeneration code, MSRRC. The research mainly uses a data matrix and a repair matrix to realize MSRRC, introduces the realization process of the regenerating code in detail through examples, and proves the correctness of the theory. The simulation results verify the validity of MSRRC.
Reversible Network Simplification with Similar Function and Similar Network
XU Ming-qiang,GUAN Zhi-jin,HE Jin-feng and LU Yu
Computer Science. 2014, 41 (Z11): 195-198. 
Abstract PDF(338KB) ( 352 )   
References | RelatedCitation | Metrics
This paper presented the similar function of a reversible function and the similar network of a reversible network, and on that basis constructed a simplification method for reversible networks. Given a reversible function, all its similar functions can be searched. For every similar function, the reversible network, which converts to its corresponding similar network, can be constructed by a reversible logic synthesis algorithm, and the optimum can be chosen. The network simplification algorithm realizes the reversible networks of all 3-variable functions and some multi-variable functions. Compared with the pertinent literature and the Benchmark examples, it has advantages in constructing reversible networks with fewer gates.
Energy-efficient Routing Algorithm on Mobile Sink in Wireless Sensor Network
LIN Zhi-gui,WANG Xi,ZHAO Ke,LIU Ying-ping,YANG Zi-yuan and ZHANG Hui-qi
Computer Science. 2014, 41 (Z11): 199-203. 
Abstract PDF(424KB) ( 466 )   
References | RelatedCitation | Metrics
To address disadvantages such as high and unbalanced energy consumption in wireless sensor networks, a mobile sink node was introduced and an energy-efficient routing algorithm on a mobile sink (MSEERP) was proposed in this paper. In MSEERP, the network is divided into several square virtual grids, each called a cluster. The cluster head is selected according to the residual energy of the nodes and the weighted sum of the distance between a node and the cluster's center of gravity, which prevents a node with low residual energy from being selected as cluster head. To save network energy, the sink node collects data from the cluster heads under a controllable movement scheduling strategy. The influence of three parameters, the movement speed of the sink node, the number of mobile sink nodes and the weighting coefficient α, on the performance of MSEERP was analyzed in detail by simulation. The results show that MSEERP performs best when the movement speed of the sink node is 5, the weighting coefficient α is 0.6 and the number of mobile sink nodes is 1. The network lifetime, total energy consumption and total amount of data received by the sink node under MSEERP are all better than those of the GAF and TTDD algorithms.
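The cluster-head criterion can be sketched as a weighted score of normalized residual energy and closeness to the cluster's center of gravity; the exact weighting in MSEERP may differ, so treat this as an illustration.

```python
def select_cluster_head(nodes, alpha=0.6):
    """Pick the node maximizing alpha * (energy share) + (1 - alpha) * (closeness).
    nodes: list of (x, y, residual_energy). Illustrative scoring only."""
    cx = sum(n[0] for n in nodes) / len(nodes)      # cluster center of gravity
    cy = sum(n[1] for n in nodes) / len(nodes)
    e_max = max(n[2] for n in nodes)
    d_max = max(((n[0]-cx)**2 + (n[1]-cy)**2) ** 0.5 for n in nodes) or 1.0
    def score(n):
        d = ((n[0]-cx)**2 + (n[1]-cy)**2) ** 0.5
        return alpha * n[2] / e_max + (1 - alpha) * (1 - d / d_max)
    return max(range(len(nodes)), key=lambda i: score(nodes[i]))

nodes = [(0, 0, 0.9),    # corner node, high energy
         (5, 5, 0.9),    # near the centre, high energy
         (5, 4, 0.2),    # near the centre, depleted
         (10, 10, 0.9)]
head = select_cluster_head(nodes)
```

With α = 0.6 the well-charged node near the center of gravity wins, and a depleted node is never chosen even if it sits closest to the center.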
Research on Network Structure of Microbloggers Based on Social Network Analysis
SONG Yang,TIAN Ai-kui and ZHANG Yi
Computer Science. 2014, 41 (Z11): 204-207. 
Abstract PDF(353KB) ( 1255 )   
References | RelatedCitation | Metrics
This paper took the users under the entertainment tag of Sina microblog as a sample. Based on the "following" and "followed" relationships between users, it built a "mutual-concern" network using social network analysis, then analyzed the network in terms of degree centrality, betweenness centrality and cohesive subgroup analysis. Comparing with real data by the given core-user analysis method, it revealed the core microbloggers of the tag and the relationships between network members. Experimental results indicate that the method is feasible for this kind of network. Finally, corresponding insights are put forward.
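Degree centrality, the first of the measures used, is straightforward to compute on a mutual-concern network; a toy sketch (user indices and edges are invented):

```python
def degree_centrality(edges, n):
    """Degree centrality of an undirected mutual-concern network: deg(v) / (n - 1)."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return [d / (n - 1) for d in deg]

# Toy network: user 0 is mutually concerned with everyone (a 'core' microblogger)
edges = [(0, 1), (0, 2), (0, 3), (1, 2)]
c = degree_centrality(edges, 4)
```

The node with centrality 1.0 is connected to every other user, which is exactly the signature the core-user analysis looks for.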
Outlier Detection Methods Based on Neural Network in Wireless Sensor Networks
HU Shi,LI Guang-hui,LU Wen-wei and FENG Hai-lin
Computer Science. 2014, 41 (Z11): 208-211. 
Abstract PDF(358KB) ( 592 )   
References | RelatedCitation | Metrics
Outlier detection in wireless sensor networks (WSNs) is of great significance for environmental monitoring. Two outlier detection methods for WSNs were proposed in this paper, based on a BP neural network and a linear neural network respectively. The latest historical data in a fixed-length data window are used to train a neural network model, which then predicts the sensor reading at the next time step. A confidence interval with probability p is calculated from the model residuals; a new measurement is identified as normal if it falls inside the prediction interval, and classified as abnormal otherwise. To compare and demonstrate the performance of the proposed methods, we carried out simulation experiments in the Matlab environment. The results show that the detection rate of the linear-neural-network method reaches 97.9% with a false positive rate below 0.76%, while the BP-neural-network method reaches a 96.7% detection rate with a false positive rate below 0.84%.
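The window-train/predict/threshold loop can be sketched with a plain least-squares autoregressive predictor standing in for the paper's neural networks; the residual standard deviation supplies the confidence interval.

```python
import numpy as np

def detect_outlier(history, new_value, order=3, z=3.0):
    """Fit a linear autoregressive predictor on a sliding window of history,
    predict the next reading, and flag new_value if it falls outside the
    residual-based interval (stand-in for the paper's neural networks)."""
    h = np.asarray(history, dtype=float)
    # training pairs: (order lagged readings) -> next reading
    X = np.array([h[i:i + order] for i in range(len(h) - order)])
    y = h[order:]
    X1 = np.hstack([X, np.ones((len(X), 1))])        # bias column
    w, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ w
    sigma = resid.std() + 1e-9                       # guard against zero residual
    pred = np.append(h[-order:], 1.0) @ w            # one-step-ahead prediction
    return abs(new_value - pred) > z * sigma, pred

history = [20.0 + 0.1 * i for i in range(30)]        # slowly rising temperature
is_out1, _ = detect_outlier(history, 23.0)           # follows the trend: normal
is_out2, _ = detect_outlier(history, 35.0)           # a spike: outlier
```

A reading continuing the trend stays inside the interval; a spike far from the prediction is flagged, mirroring the paper's normal/abnormal decision rule.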
Adaptive PULL & PUSH Heartbeat Detection Mechanism for Disaster Recovery
WANG Hao-ming,MU Dao-sheng and GAO Li-juan
Computer Science. 2014, 41 (Z11): 212-214. 
Abstract PDF(266KB) ( 646 )   
References | RelatedCitation | Metrics
As an important part of disaster-emergency technical capability, heartbeat detection is the foundation of a disaster recovery system. This paper studied an adaptive PULL & PUSH heartbeat detection mechanism for disaster recovery that combines the complementary advantages of the two heartbeat models and judges failures on the basis of both the network environment and the application's QoS requirements, so as to improve the reliability and real-time performance of the detection mechanism.
Fast Construction of Block Jacket Transform over Finite Field
HUANG Cheng-rong
Computer Science. 2014, 41 (Z11): 215-220. 
Abstract PDF(395KB) ( 378 )   
References | RelatedCitation | Metrics
We constructed a novel cocyclic block-wise inverse Jacket transform (CBIJT) with a fast algorithm. By factorizing the large-size cocyclic block-wise inverse Jacket matrix (CBIJM) into several low-order identity matrices and sparse matrices, we achieved a successive architecture that leads to a fast transform while reducing the computational load. Two kinds of CBIJTs, one-dimensional and two-dimensional, were designed in a similar recursive fashion, referring to the above-mentioned multi-fold product of identity matrices and CBIJTs.
Infectious Diseases Transmission Simulation and Modeling Based on CAS
LV Ji,XU Jie,MA Lu-lu,SI Dan and ZHANG Peng
Computer Science. 2014, 41 (Z11): 221-223. 
Abstract PDF(327KB) ( 1640 )   
References | RelatedCitation | Metrics
With the rise of complexity science, the perspectives and methods of epidemiological research are gradually changing. From the viewpoint of complex adaptive systems, we constructed a deterministic agent-based model of infectious disease transmission and defined the agents' state-transition rules. Using MATLAB, we implemented a simple simulation that models the transmission process of infectious diseases. By adjusting the parameters, we ran simulation experiments on five factors that may affect transmission, such as interpersonal contact and public health awareness. Based on the experimental results, we put forward measures and strategies for epidemic prevention and control. The results show that vaccination and quarantine are the most effective means of suppressing an outbreak.
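The agent-based transmission process can be sketched with a minimal SIR model (random mixing, fixed contact and recovery probabilities; all parameters are illustrative and much simpler than the paper's MATLAB model):

```python
import random

def simulate_sir(n=200, contacts=8, p_infect=0.05, p_recover=0.1,
                 steps=100, seed=42):
    """Minimal agent-based SIR model: each step, every infected agent meets a
    few random agents and may infect susceptible ones, then may recover."""
    rng = random.Random(seed)
    state = ["S"] * n
    state[0] = "I"                              # patient zero
    history = []
    for _ in range(steps):
        infected = [i for i, s in enumerate(state) if s == "I"]
        for i in infected:
            for _ in range(contacts):
                j = rng.randrange(n)
                if state[j] == "S" and rng.random() < p_infect:
                    state[j] = "I"              # transmission on contact
            if rng.random() < p_recover:
                state[i] = "R"                  # recovery with immunity
        history.append(sum(s == "I" for s in state))
    return state, history

state, history = simulate_sir()
```

Lowering `p_infect` (quarantine) or pre-setting agents to "R" (vaccination) are the kinds of interventions such a model lets one compare.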
Virtual Network Mapping Optimization Algorithm Based on Virtual Network Node Migration
GUO He-bin
Computer Science. 2014, 41 (Z11): 224-227. 
Abstract PDF(292KB) ( 427 )   
References | RelatedCitation | Metrics
In order to improve the virtual network request acceptance rate and substrate network resource utilization, this paper presented a partition-based virtual network node migration algorithm. The algorithm groups virtual nodes that compete for resources so as to achieve globally optimized virtual network mapping. Time complexity analysis and simulation show that the proposed algorithm significantly reduces the running time of virtual network node migration. In experiments it was compared with the No-Migration and Long Duration algorithms; the results show that it achieves a higher request acceptance rate and higher average revenue, and saves substrate network resources.
Design of Low-power Node and Interface Protocol in WSN
WANG Yao-xing and LIU Jian-jun
Computer Science. 2014, 41 (Z11): 228-231. 
Abstract PDF(346KB) ( 441 )   
References | RelatedCitation | Metrics
This paper proposed a design method for low-power wireless sensor network (WSN) nodes based on the MSP430 and CC2530. It elaborated the design and implementation of both hardware and software, and measured and analyzed the node's current consumption through experiments. Application results show that nodes designed by this method work stably and reliably under low power consumption.
An Indoor Positioning Algorithm Based on Wireless Sensor Networks
ZHANG Wei and SUN Qiang
Computer Science. 2014, 41 (Z11): 232-234. 
Abstract PDF(252KB) ( 501 )   
References | RelatedCitation | Metrics
Wireless sensor networks are one of the key technologies of indoor positioning. Two factors affect positioning accuracy: ranging error and position calculation error. At the ranging stage, to address the large signal propagation loss in complex environments and its effect on measurement accuracy, this paper proposed a method for estimating environmental factors based on the log-normal propagation path loss model, which dynamically corrects measurements and reduces environment-induced errors. At the positioning stage, to overcome the inability of the triangle-and-centroid algorithm to make full use of the data, this paper presented a weighted triangle-and-centroid algorithm to achieve high-precision indoor positioning. Experimental results show that the algorithm is feasible and achieves high indoor positioning accuracy.
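The two stages can be sketched together: the log-normal model converts RSSI readings to distances (with an assumed environmental factor n, which the paper estimates on-line), and inverse-distance weights then bias the centroid toward the nearest anchors; all anchor positions and RSSI values below are invented.

```python
def rssi_to_distance(rssi, rssi_d0=-40.0, d0=1.0, n=2.5):
    """Log-normal shadowing model: RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0).
    n is the environmental factor the paper estimates dynamically."""
    return d0 * 10 ** ((rssi_d0 - rssi) / (10 * n))

def weighted_centroid(anchors, rssis):
    """Weight each anchor inversely by its estimated distance, so nearby
    anchors pull the estimate harder than a plain centroid would."""
    ds = [rssi_to_distance(r) for r in rssis]
    ws = [1.0 / d for d in ds]
    total = sum(ws)
    x = sum(w * a[0] for w, a in zip(ws, anchors)) / total
    y = sum(w * a[1] for w, a in zip(ws, anchors)) / total
    return x, y

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
# node at roughly (2, 2): strongest signal from the nearest anchor
rssis = [-51.2, -63.2, -63.2]
x, y = weighted_centroid(anchors, rssis)
```

A plain centroid of the three anchors would sit at (3.3, 3.3) regardless of signal strength; the weighted version lands near the true position because the strong reading dominates.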
Research on Detection Schemes of Sybil Attack in VANETs
LI Chun-yan and WANG Liang-min
Computer Science. 2014, 41 (Z11): 235-240. 
Abstract PDF(648KB) ( 1083 )   
References | RelatedCitation | Metrics
In vehicular ad hoc networks (VANETs), a Sybil attack is an attack in which a malicious vehicle obtains multiple false identities through forgery, theft or collusion. The attacker uses these false identities for misbehavior that can threaten the lives and property of other drivers and passengers. The causes and hazards of Sybil attacks in VANETs are introduced first, followed by a survey of existing Sybil attack detection schemes. According to whether the detection process relies on position information, the schemes are classified into two categories, non-position-based and position-based detection schemes, and the schemes within each category are compared in detail. Finally, the problems of existing detection methods and some possible directions for future research are discussed.
Technique for Discovering Stored XSS Vulnerability Based on Tracing Risky Data
LI Ya-wei,LIU Zi-xi and DING Shi-jun
Computer Science. 2014, 41 (Z11): 241-244. 
Abstract PDF(306KB) ( 398 )   
References | RelatedCitation | Metrics
To discover stored XSS vulnerabilities with black-box testing, we put forward a new technique based on tracing risky data. The technique can automatically, quickly and deeply discover stored XSS vulnerabilities in Web applications. This paper briefly introduced how to design the assisting software for the technique and demonstrated its effectiveness.
Research on Optimization Technology of Reconfigurable Security Protocols Based on Reconfigurable Component
LI Ling,DU Xue-hui and BAO Yi-bao
Computer Science. 2014, 41 (Z11): 245-249. 
Abstract PDF(426KB) ( 423 )   
References | RelatedCitation | Metrics
Reconfigurable implementation of security protocols is an effective way to enhance their computing and safety performance. Based on an analysis of a large number of existing security protocol architectures, we presented a high-performance implementation architecture for security protocols based on reconfigurable components, together with a method for optimizing the reconfigurable component library, a critical issue in this architecture. By improving the weighted set-cover optimization algorithm and combining it with heuristic search ideas, the method reduces the resources and time needed for the reconfigurable implementation of security protocols.
Research on Biometric Based Access Control for Cloud Storage
CHEN Zhi-jie,HUANG Kun and XIAN Ming
Computer Science. 2014, 41 (Z11): 250-251. 
Abstract PDF(268KB) ( 331 )   
References | RelatedCitation | Metrics
Cloud computing is an attractive emerging computing paradigm that offers users on-demand network access to a large shared pool of computing resources. How to strengthen access control over cloud computing resources and keep sensitive data and private keys confidential against malicious servers or other external attackers has become an important security problem. Biometrics possess notable advantages in this field, so this paper focused on leveraging biometric identity to achieve access control in the cloud. We exploited and combined the techniques of fuzzy identity based encryption (FIBE), biometric measurement, and key-insulated encryption, which strengthens the security of private key management. Specifically, the scheme is based on the idea that every time a legitimate or malicious user requests access to data of interest, the cloud servers update the corresponding header file, which only the legitimate user is able to decrypt.
Technical Study of Reducing Redundant Data for Intrusion Detection and Intrusion Forensics
QIAN Qin,ZHANG Jian,ZHANG Kun,FU Xiao and MAO Bing
Computer Science. 2014, 41 (Z11): 252-258. 
Abstract PDF(713KB) ( 634 )   
For the past few years,the amount of computer crime has been increasing year by year,threatening various aspects of human society such as national politics,economy,and culture.Research on intrusion forensics and intrusion detection plays a significant role in fighting computer crime,tracing intrusions,patching vulnerabilities,and improving the security of computer networks.However,with the popularity of the Internet and the growing storage capacity of computers,intrusion forensics and intrusion detection often have to handle mass data on the order of gigabytes or even terabytes.Useful information inevitably gets submerged in redundant events,which poses a huge challenge and lowers the accuracy of analysis results.Designing techniques that reduce redundant data while improving accuracy and efficiency is therefore a key breakthrough.This paper summarized several such methods for intrusion forensics and intrusion detection.It first reviewed the development of redundancy-reduction techniques and their application in traditional fields such as the medical domain,then systematically introduced the various redundancy-reduction methods used in intrusion forensics and intrusion detection,and finally pointed out existing problems and future research directions,drawing conclusions from a comparison of current redundancy-reduction techniques.
Network Security Situation Prediction Based on Improved Adaptive Grey Model
CHEN Lei,SI Zhi-gang,HE Rong-yu and ZHOU Fei
Computer Science. 2014, 41 (Z11): 259-262. 
Abstract PDF(329KB) ( 464 )   
Network security situation awareness is a hot research topic in the field of network security.Existing research is mostly concerned with assessing the current situation and pays little attention to predicting future trends.An improved grey Verhulst model was put forward to predict the network security situation accurately.Aiming at the shortcomings of prediction based on the traditional Verhulst model,adaptive grey parameters and an equal-dimension grey-filling method were proposed to improve precision.The results show that the model is valid.
Geometric Attack Resisting Double-watermarking Algorithm Based on CS-SIFT
LI Hao and LI Hong-chang
Computer Science. 2014, 41 (Z11): 263-267. 
Abstract PDF(928KB) ( 408 )   
A geometric-attack-resistant double-watermarking algorithm with blind extraction is proposed based on compressive sampling(CS) techniques and SIFT features.The first,copyright watermark is spread-spectrum modulated and embedded in the DWT domain of the NSCT's low-frequency band.The second watermark is an authentication watermark:it is generated by compressive sampling of the first watermark and stored in an IPR database as a zero-watermark.In watermark extraction,a SIFT feature template is first recovered from the zero-watermark;it verifies the integrity of the watermarked image,and locates and restores alterations.The image is then calibrated using the scale features and coordinate relationships of the SIFT feature points so as to update the watermark extraction locations.Simulations show that the proposed algorithm has a large watermark capacity and good transparency,and is robust to common attacks as well as several geometric attacks.
Dynamic Chaotic Encryption and its Application in VoIP
SHI Jie,ZHONG Wei-bo and GE Xiu-mei
Computer Science. 2014, 41 (Z11): 268-271. 
Abstract PDF(610KB) ( 361 )   
VoIP has become reality with the spread of the network and its increasing bandwidth,but speech is exposed to danger due to the openness of the network.Data encryption is usually used to ensure the safety of speech communication,and chaotic sequences are very suitable as encryption ciphers for their wide-spectrum randomness and sensitivity to initial parameters.To avoid the danger of using a fixed chaotic sequence as the cipher for a long time,a VoIP dynamic chaotic encryption scheme was designed and implemented,in which a Henon map,an improved Logistic map,and a nonlinear map are combined to update the chaotic cipher randomly,and the receiver decrypts data using the dynamic chaotic cipher obtained through a dynamic cipher exchange system.Experimental results show that this dynamic chaotic encryption system has high security and can be used for secure speech transmission.
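The idea behind chaotic stream encryption can be sketched with a single logistic map (a simplification: the paper combines a Henon map, an improved Logistic map, and a nonlinear map and exchanges the cipher dynamically; the parameters below are illustrative assumptions, not the paper's values):

```python
# Minimal sketch of chaotic stream encryption: a logistic map generates a
# keystream whose bytes are XORed with the speech samples.

def logistic_keystream(x0, r, n):
    """Iterate x = r*x*(1-x) and quantize each state to one byte."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_cipher(data, x0=0.3456, r=3.99):
    """Encrypt (or decrypt: XOR is its own inverse) a byte sequence."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = b"secure VoIP frame"
cipher = xor_cipher(plain)
assert xor_cipher(cipher) == plain   # same key parameters recover the speech
```

Because the keystream depends sensitively on x0 and r, even a tiny change in the initial parameters yields a completely different cipher stream, which is why the scheme can rotate ciphers by exchanging new parameters.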
Certificate-based Multi-proxy Multi-signature Scheme
ZHOU Cai-xue and TAN Xu-jie
Computer Science. 2014, 41 (Z11): 272-276. 
Abstract PDF(410KB) ( 487 )   
This paper gave a formal definition and security notions for certificate-based multi-proxy multi-signature schemes,and proposed a concrete scheme that avoids bilinear pairings.The scheme was proved existentially unforgeable in the random oracle model under the elliptic curve discrete logarithm assumption.Performance analysis shows that the scheme is highly efficient.
Efficient Key Management Scheme for Data-centric Storage Wireless Sensor Networks
PAN Zhong-qiang and CHANG Xin-feng
Computer Science. 2014, 41 (Z11): 277-281. 
Abstract PDF(720KB) ( 344 )   
Based on Exclusion Basis Systems(EBS),we proposed an efficient distributed key management scheme,termed ERP-DCS,to remedy deficiencies identified in the pDCS scheme.ERP-DCS distributes the key management tasks,including key distribution,rekeying,and key revocation,to each cluster(i.e.,grid cell) to reduce the number of rekeying messages.The results show that,compared with pDCS,ERP-DCS is superior in terms of the number of messages needed in the rekeying process,at a small cost in key storage.
Random Generation Algorithm of Optimal Binary Signed Digit Representation of Integer
LI Zhong and ZHANG Yong-hua
Computer Science. 2014, 41 (Z11): 282-283. 
Abstract PDF(212KB) ( 445 )   
Binary signed digit(BSD) representation of integers is widely used in computer arithmetic,cryptography,and digital signal processing.A given integer can have several optimal BSD representations.This paper studied the properties of the optimal BSD representations of an integer,and presented a generation algorithm that can rapidly produce a random optimal BSD representation of a given integer.
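One well-known optimal (minimal-weight) BSD representation is the non-adjacent form (NAF). A sketch of its computation illustrates the representation, though it yields only one of the several optimal representations that the paper's randomized algorithm samples from:

```python
def naf(n):
    """Return the non-adjacent form of n, least significant digit first.
    Digits are in {-1, 0, 1}; no two adjacent digits are nonzero."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)      # 1 if n = 1 (mod 4), else -1
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def bsd_value(digits):
    """Evaluate a BSD digit string back to the integer it represents."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

assert naf(7) == [-1, 0, 0, 1]       # 7 = -1 + 8, i.e. 8 - 1
assert bsd_value(naf(7)) == 7
```

The NAF of 7 has only two nonzero digits, whereas the plain binary form 111 has three; this reduced weight is what makes optimal BSD forms attractive in cryptographic exponentiation.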
On Demand Security Framework for Cloud Computing
DING Xian-hua,ZHAO Wei-dong,JU Ying,LI Jian-ping,WANG Xiao-ming and LIU Guo-ying
Computer Science. 2014, 41 (Z11): 284-287. 
Abstract PDF(444KB) ( 386 )   
Security has become an important factor restricting the development of cloud computing.This paper analyzed cloud security objectives from six aspects:service constancy,service authenticity,data integrity,data confidentiality,data availability,and non-repudiation.It summed up cloud computing risks in seven categories:physical security risk,computing security risk,trusted computing security risk,network security risk,management security risk,storage security risk,and application security risk,and elaborated the corresponding security strategies for each.The stronger the security is,the greater the consumption of computing,memory,and bandwidth resources is.This paper provided an on-demand security framework for cloud computing that applies different safety protection measures according to service type,security level,and access network risk.The advantages of the framework were analyzed and an application method was provided.
Review of Clustering Method
JIN Jian-guo
Computer Science. 2014, 41 (Z11): 288-293. 
Abstract PDF(581KB) ( 3732 )   
The paper reviewed clustering methods and results.Four key problems were discussed:distance and similarity measures,the number of clusters,clustering algorithms,and validity evaluation methods.The advantages and disadvantages of the clustering algorithms were analyzed,and the development trend of clustering analysis techniques was pointed out.
Personalized Recommendation Technology
ZHU Bao and XU Ling-yu
Computer Science. 2014, 41 (Z11): 294-297. 
Abstract PDF(317KB) ( 433 )   
This paper proposed a new personalized recommendation method,derived from research on the essence of personalized recommendation technology.The solution includes several methods:one calculates offline similarity using the convolution of normal distributions;another calculates offline similarity by counting the number of similar operations between each pair of items;the last integrates the similarity results of the different methods using a formula similar to Bayes' rule.We also covered other methods and techniques used in the engineering implementation.The proposed method has been successfully applied in the field of data mining.
Automatic Text Summarization Research Based on Topic Model and Information Entropy
LI Ran,ZHANG Hua-ping,ZHAO Yan-ping and SHANG Jian-yun
Computer Science. 2014, 41 (Z11): 298-300. 
Abstract PDF(330KB) ( 906 )   
This paper presented an automatic summarization method for Chinese documents based on the LDA model and information entropy.It uses the LDA model to perform shallow semantic analysis and obtain the topic distribution of each document;by analyzing these topics,we obtain the topic that best expresses the document's central idea.Meanwhile,this paper proposed a new method to compute sentence weights and extract the most important sentence by measuring the information entropy of each sentence:each sentence is treated as a random variable whose information entropy is calculated.Experimental results show that this method can pick out the most important sentence in a document.
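The entropy-based sentence scoring step can be sketched as follows (a minimal illustration: the sentence is treated as a random variable over its own words and its Shannon entropy is computed; the paper's exact weighting formula is not reproduced here):

```python
import math
from collections import Counter

def sentence_entropy(sentence):
    """Shannon entropy of the word distribution within one sentence,
    treating the sentence as a random variable over its words."""
    words = sentence.split()
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def most_informative(sentences):
    """Extract the sentence with the highest information entropy."""
    return max(sentences, key=sentence_entropy)

doc = [
    "the cat sat on the mat",
    "topic models uncover latent semantic structure in text collections",
]
print(most_informative(doc))  # the second sentence: more distinct words, higher entropy
```

A sentence with many distinct, non-repeated words has a flatter word distribution and hence higher entropy, which is why it scores as more informative under this measure.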
Network Traffic Classification Method Research Based on Subspace Clustering Algorithm
XU Xue-yan,WANG Su-nan and WU Chun-ming
Computer Science. 2014, 41 (Z11): 301-306. 
Abstract PDF(873KB) ( 355 )   
Currently,the service types and features of network traffic are changing constantly,and existing classification methods cannot cope with such a traffic environment,because they lack the ability to update the feature library efficiently and have high misjudgement rates.A subspace clustering algorithm was therefore designed and its classification properties were tested.Experiments show that it can classify many business types,its classification precision exceeds 95%,and its demand for training samples is low.It can help a DPI classifier adapt to a changing network environment.
Classification Mathematical Model and Proof to Distinguish Index and Information Web Page
WANG Shu-xi and XIA Zeng-yan
Computer Science. 2014, 41 (Z11): 307-312. 
Abstract PDF(516KB) ( 405 )   
This paper surveyed domestic and international research on Web page classification and analyzed its core technologies,including ideas,algorithms,formulas,and evaluation criteria.To combat Internet pyramid selling,pyramid-selling Web pages must be accurately identified and classified.Based on the "maximum content length" of a Web page,its probability of being an "information page" is calculated,and a Web page classification mathematical model is derived through rigorous formulas.The model has been applied in practice:the "National MLM Monitor Center" effectively classifies Internet pyramid-selling Web pages using it.
Improved Algorithm of Attribute Reduction Based on Granular Computing
TANG Xiao and SHU Lan
Computer Science. 2014, 41 (Z11): 313-315. 
Abstract PDF(288KB) ( 377 )   
Granular computing is a method for problem solving,pattern classification,and information processing based on multilayer granular structures.It is a new interdisciplinary field connecting rough sets,fuzzy sets,data mining,and artificial intelligence.Some important properties of granular computing were discussed,along with its reduction algorithm.The traditional reduction algorithm based on granular computing computes the reduction incrementally from the reduction core Core(A),but some information systems have no reduction core.For this case,an improved reduction algorithm based on the attribute significance of granular computing was proposed;it can be applied to systems both with and without a reduction core.Finally,experiments show the feasibility of the algorithm.
Research of FCM Algorithm Based on Canopy Clustering Algorithm under Cloud Environment
YU Chang-jun and ZHANG Ran
Computer Science. 2014, 41 (Z11): 316-319. 
Abstract PDF(341KB) ( 655 )   
FCM is one of the most widely used clustering algorithms,but its quality and convergence speed depend on the quality of the initial cluster centers.Because the Canopy algorithm can quickly cluster a data set and produce cluster centers,we proposed an FCM algorithm combined with Canopy clustering:it accelerates convergence by using the cluster centers obtained by the Canopy algorithm as the input of FCM.We then designed its MapReduce implementation for a cloud environment.Experimental results show that the MapReduce version of Canopy-based FCM achieves better clustering quality and speed than the MapReduce version of plain FCM.
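The seeding idea can be sketched with a one-dimensional Canopy pass (an illustration with an assumed tight threshold T2; the full Canopy algorithm also uses a loose threshold T1 for overlapping canopy assignment, omitted in this center-extraction sketch). The returned centers would replace FCM's random initialization:

```python
def canopy_centers(points, t2=1.0):
    """One cheap pass over the data: each unprocessed point founds a canopy,
    and points within the tight threshold t2 of a center are removed so they
    can never found a new canopy."""
    remaining = list(points)
    centers = []
    while remaining:
        c = remaining.pop(0)           # pick the next unprocessed point
        centers.append(c)
        remaining = [p for p in remaining if abs(p - c) > t2]
    return centers

data = [0.1, 0.3, 0.2, 5.0, 5.2, 4.9, 10.0, 10.1]
print(canopy_centers(data))  # [0.1, 5.0, 10.0]: roughly one center per group
```

Feeding these rough centers to FCM gives it a starting point already near the natural groups, which is the source of the faster convergence the abstract reports.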
Collaborative Recommendation Algorithm Combining User’s Judging Power and Similarity
ZHANG Li and XUE Yu-qing
Computer Science. 2014, 41 (Z11): 320-322. 
Abstract PDF(275KB) ( 382 )   
As an effective way to alleviate information overload,collaborative filtering(CF) has been used successfully in recommendation systems.To improve the performance of CF,this paper first evaluated each user's judging power based on historical ratings.Then,combining judging power with similarity,an improved collaborative recommendation algorithm was proposed.Experimental results show that judging power is positively correlated with users' recommendation ability,verifying that it extracts deeper information from historical ratings and from the factors influencing whether a user adopts recommendation results.It can therefore characterize the similarity between users better and improve the accuracy of the recommendation algorithm.
HMSST:An Efficient Algorithm for SPARQL Query
DONG Shu-jian and WANG Jing-bin
Computer Science. 2014, 41 (Z11): 323-326. 
Abstract PDF(425KB) ( 1107 )   
The paper proposed a novel efficient algorithm named HMSST(HashMap Selectivity Strategy Tree) to optimize SPARQL queries,combining a hash map with a selectivity strategy tree to narrow the range of massive data queries.The HMSST algorithm was evaluated with the LUBM benchmark and performs well when the number of universities reaches 1000.The experimental results show that the HMSST algorithm and its storage strategy outperform existing query schemes:storage cost is smaller,query performance is higher,and it works effectively on large data sets,especially when a SPARQL query contains many triple patterns and complex semantics.
Mining Spatial Co-location Pattern with Multiresolution Pruning and Local Clustering Algorithm
LV Cheng
Computer Science. 2014, 41 (Z11): 327-332. 
Abstract PDF(742KB) ( 363 )   
Traditional co-location pattern mining algorithms join feature instances one by one.As a result,they often consume large amounts of time and space,and may even fail to produce final results because memory is exhausted,especially on large data sets.Therefore,an efficient multiresolution pruning and local clustering algorithm(MP_LC) was proposed.MP_LC first divides the data region into grids,then clusters the instances of each feature within each grid cell,calculates the centroid of the instances in each cluster,replaces the instance set by the centroid,and finally continues with the subsequent mining work.Extensive experimental results indicate that MP_LC has high efficiency,high accuracy,and good practical value.
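The local clustering step can be sketched as follows (a simplification that forms one cluster per feature per grid cell; the actual within-cell clustering in MP_LC may differ):

```python
from collections import defaultdict

def centroid_summaries(instances, cell=10.0):
    """Replace all instances of a feature falling in one grid cell by their
    centroid, shrinking the instance set before the join-heavy mining phase."""
    groups = defaultdict(list)
    for feature, x, y in instances:
        groups[(feature, int(x // cell), int(y // cell))].append((x, y))
    summaries = []
    for (feature, _, _), pts in groups.items():
        cx = sum(p[0] for p in pts) / len(pts)
        cy = sum(p[1] for p in pts) / len(pts)
        summaries.append((feature, cx, cy))
    return summaries

data = [("A", 1, 1), ("A", 3, 3), ("B", 2, 2), ("A", 15, 15)]
print(centroid_summaries(data))
# [('A', 2.0, 2.0), ('B', 2.0, 2.0), ('A', 15.0, 15.0)]
```

The two nearby "A" instances collapse into one centroid, so the subsequent instance joins run over far fewer points, which is where the time and memory savings come from.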
MapReduce Designed to Optimize Computing Model Based on Hadoop Framework
SUN Yan-chao and WANG Xing-fen
Computer Science. 2014, 41 (Z11): 333-336. 
Abstract PDF(612KB) ( 572 )   
Aiming at massive log analysis for a university teaching-resource platform,analysis and processing are moved from the traditional stand-alone mode to distributed processing with the Hadoop MapReduce framework.MapReduce uses the idea of divide and conquer,which is a good solution to the bottleneck of processing massive data on a single machine.Based on analysis of the Hadoop source code and a careful study of the MapReduce job flow for massive data processing,this paper presented optimization strategies for distributed MapReduce jobs to further improve the efficiency of massive data processing.
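The divide-and-conquer flow of MapReduce can be sketched on a single machine (the log format and the counting job are hypothetical, chosen only to illustrate the map-shuffle-reduce pipeline that Hadoop distributes across nodes):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit one (action, 1) pair per log record."""
    for line in lines:
        action = line.split()[1]        # hypothetical "user action" log format
        yield action, 1

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: aggregate each key's values into a final count."""
    return {key: sum(values) for key, values in grouped.items()}

logs = ["u1 login", "u2 download", "u1 download", "u3 login"]
print(reduce_phase(shuffle(map_phase(logs))))  # {'login': 2, 'download': 2}
```

In Hadoop each phase runs in parallel over data splits; the sketch keeps only the data flow, which is what the paper's job-flow optimizations target.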
Quick Attribute Reduct Algorithm on Neighborhood Rough Set Based on Block Set
LOU Chang,LIU Zun-ren and GUO Gong-zhen
Computer Science. 2014, 41 (Z11): 337-339. 
Abstract PDF(290KB) ( 367 )   
Calculating each record's δ-neighborhood is the most frequent and complex step in the neighborhood rough set model.In this paper,we proposed the concept of block sets according to the distribution of records in space,and proved that each record's δ-neighborhood can only be contained in its own block set and the adjacent block sets.Based on this block-set neighborhood theory,we presented a quick attribute reduction algorithm for neighborhood rough sets,which reduces the complexity of calculating each record's δ-neighborhood.The algorithm's validity was verified on several UCI data sets;experimental results show that it is effective and feasible.
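The block-set idea can be sketched as follows (assuming blocks of side δ and the Chebyshev distance, both illustrative choices; under these assumptions a record's δ-neighborhood lies entirely within its own block and the adjacent blocks, so only those blocks need scanning):

```python
from collections import defaultdict
from itertools import product

def build_blocks(points, delta):
    """Hash each point into the grid block of side delta that contains it."""
    blocks = defaultdict(list)
    for p in points:
        blocks[tuple(int(c // delta) for c in p)].append(p)
    return blocks

def neighborhood(p, blocks, delta):
    """delta-neighborhood of p, scanning only p's block and adjacent blocks."""
    cell = tuple(int(c // delta) for c in p)
    result = []
    for offset in product((-1, 0, 1), repeat=len(cell)):
        for q in blocks.get(tuple(c + o for c, o in zip(cell, offset)), ()):
            if q != p and max(abs(a - b) for a, b in zip(p, q)) <= delta:
                result.append(q)
    return result

pts = [(0.5, 0.5), (0.9, 0.6), (3.0, 3.0)]
blocks = build_blocks(pts, delta=1.0)
print(neighborhood((0.5, 0.5), blocks, 1.0))  # [(0.9, 0.6)]
```

Instead of comparing each record against all n records, only the 3^d surrounding blocks are visited, which is the source of the speed-up the abstract claims.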
Location-aware Recommendation Based on Collaborative Filtering
LI Gui,CHEN Sheng-hong,HAN Zi-yang,LI Zheng-yu,SUN Ping and SUN Huan-liang
Computer Science. 2014, 41 (Z11): 340-346. 
Abstract PDF(638KB) ( 463 )   
Users have different interests in different regions,and when the recommended items are spatial,users tend to travel only a limited distance to visit them.Accurately capturing user preferences according to the locations of users and items can improve precision in recommender systems.To handle this location information effectively,this paper introduced the Pyramid Model(PM) into recommender systems to partition users and compute a travel penalty,and presented a collaborative filtering recommendation algorithm based on the pyramid model(PMCF) to generate top-N recommendations.The MovieLens,Foursquare,and synthetic data sets were used to evaluate the algorithm.Experimental results show that it achieves significant improvements in precision.
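The travel-penalty idea can be sketched as follows (the penalty form and the parameter alpha are assumptions for illustration only; the paper derives its penalty from the pyramid spatial index rather than from raw distance):

```python
import math

def penalized_score(rating, user_xy, item_xy, alpha=0.1):
    """Discount a venue's predicted rating by its distance from the user."""
    dist = math.dist(user_xy, item_xy)
    return rating / (1.0 + alpha * dist)

near = penalized_score(4.0, (0, 0), (1, 0))    # 4.0 / 1.1
far = penalized_score(4.5, (0, 0), (30, 0))    # 4.5 / 4.0
print(near > far)  # True: a slightly lower-rated nearby venue ranks higher
```

This captures the abstract's premise that users travel only a limited distance: a distant venue must be much better rated to outrank a nearby one.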
Study on Concept Change in Data Streams Classification
HAN Fa-wang and LIU Yao-zong
Computer Science. 2014, 41 (Z11): 347-350. 
Abstract PDF(501KB) ( 589 )   
Data stream classification must cope with concept change.This paper introduced the definition and types of concept change in data stream classification,its meaning and applications,and methods for handling it.Real data streams often contain much noise,and noise must be distinguished from genuine concept change.For data streams with periodically recurring concepts,when a historical concept reappears,predicting it with a previously learned model can reduce the cost of model updates.
Research and Application on Reduction of Weighted Variable Precision Model Based on Tolerance Theory
SHE Kan-kan,HU Kong-fa and WANG Zhen
Computer Science. 2014, 41 (Z11): 351-353. 
Abstract PDF(251KB) ( 347 )   
The variable precision rough set model was improved in this paper.Combined with tolerance relation theory,a weighted variable precision model based on tolerance theory was proposed.Furthermore,we suggested a new heuristic algorithm using attribute sensitivity,based on attribute significance and entropy,which overcomes the shortcomings of the classical model and ensures the completeness of the reduction.Finally,traditional Chinese medicine prescriptions were used to demonstrate the effectiveness of the model and algorithm.
Collaborative Filtering Recommendation Algorithm Based on Spectral Clustering Subgroups Discovering
LI Gui,CHEN Zhao-xin,LI Zheng-yu,HAN Zi-yang,SUN Ping and SUN Huan-liang
Computer Science. 2014, 41 (Z11): 354-358. 
Abstract PDF(681KB) ( 650 )   
In many recommendation systems,clustering-based collaborative filtering algorithms use specific algorithms such as K-means to cluster users and items,but a user or item can then belong to only one category in the clustering result.In practice,a user may have a variety of interests and an item may belong to multiple categories.To solve this problem,this paper put forward a novel algorithm based on spectral clustering subgroup discovery and C-means clustering,which yields user-item subgroups with a high degree of similarity and a membership matrix that allows users and items to belong to multiple subgroups.The algorithm predicts a user's final preference for an item by computing the preference within each subgroup and combining it with the corresponding memberships of the user and item,and then generates the user's top-N recommendations.Experimental results show that our method reduces data sparseness and improves recommendation precision and recall compared with previous algorithms.
Research Advances in Runtime Verification
ZHANG Shuo and HE Fei
Computer Science. 2014, 41 (Z11): 359-363. 
Abstract PDF(426KB) ( 1395 )   
Runtime verification is a lightweight verification technique which monitors the behaviors of the target systems and checks whether their behaviors satisfy the desired properties.Once a violation is observed,the monitor informs the system to react in time.Runtime verification has been applied to various areas to ensure the correctness of systems.In this paper,the concepts,principles and categories of runtime verification were first introduced.Then several solutions and research hotspots in this area were analyzed.Finally,we discussed the current challenges,and outlined the future research directions of runtime verification.
Implementation of Auto-vectorization Based on Directives in GCC
XU Ying,LI Chun-jiang,DONG Yu-shan and ZHOU Si-qi
Computer Science. 2014, 41 (Z11): 364-367. 
Abstract PDF(418KB) ( 1424 )   
Auto-vectorization based on directives has become an inevitable choice for compilers to exploit performance on SIMD architectures.The latest OpenMP 4.0 specification added SIMD directives,and GCC 4.9,then in development,began supporting OpenMP 4.0.We analyzed the implementation of the SIMD directives in GCC 4.9 in detail,focusing on how they affect loop vectorization.Our work provides valuable references for improving existing auto-vectorization.
Research and Application of Local Search Engine Based on Lucene
QIN Jie,SONG Jin-yu and ZHANG Guang-xing
Computer Science. 2014, 41 (Z11): 368-370. 
Abstract PDF(589KB) ( 682 )   
To improve on the efficiency and result quality of traditional search,and building on Lucene's efficient and accurate full-text capabilities,we adopted the idea of structuring unstructured documents,proposed a method to segment document content naturally and index it,and realized a personalized search engine for local resources.
Design and Implementation of Windows Credential Provider Logon System Based on USB Key
YANG Hai,ZHAO Wen-tao,ZHANG Nai-qian and FAN Si-jiang
Computer Science. 2014, 41 (Z11): 371-374. 
Abstract PDF(446KB) ( 1386 )   
The Windows operating system is widely used,and the security of its login system has become an issue of concern.We first analyzed and compared the security of the GINA login module and the Credential Provider login module,then proposed a design for a Windows Credential Provider login system based on a USB key,which combines the advantages of the Credential Provider login module and the USB key and improves on the safety of traditional username-and-password login.We developed the system on Windows 7,chose the USB key3000D as the development platform,and built it on the COM library of the Windows Credential Provider framework.
Test Suite Generation Based on Interaction Testing and Fault Tree Analysis
ZHANG Wei-xiang and LIU Wen-hong
Computer Science. 2014, 41 (Z11): 375-378. 
Abstract PDF(333KB) ( 487 )   
With increasing software complexity,how to select a few representative test cases to test software effectively has become an outstanding problem.This paper gave an integrated method for test suite generation in three parts:first,an amended Fussell-Vesely algorithm obtains the minimal cut sets of the software;second,black-box testing methods yield typical discrete values for each element of the minimal cut sets;finally,an interaction testing algorithm generates the test suite,taking into account the pairwise interaction between elements.Practice shows that the method can significantly reduce the number of test cases while preserving testing effectiveness.
General NHPP Software Reliability Research and Application
FEI Qi and LIU Chun-yu
Computer Science. 2014, 41 (Z11): 379-381. 
Abstract PDF(214KB) ( 474 )   
This paper analyzed existing non-homogeneous Poisson process(NHPP) software reliability models and pointed out their problems and defects.Considering the defect detection rate,coverage,and the software error introduction rate,the paper proposed a general NHPP software reliability model.Finally,the general model was applied to a set of published failure data,proving its effectiveness.
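As background, the classic Goel-Okumoto model is a simple instance of an NHPP reliability model (the paper's general model additionally incorporates detection rate, coverage, and error introduction, which are not reproduced here; parameter values below are illustrative):

```python
import math

def mean_failures(t, a=100.0, b=0.1):
    """Goel-Okumoto mean value function m(t) = a(1 - e^{-bt}):
    a = expected total number of faults, b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def intensity(t, a=100.0, b=0.1):
    """Failure intensity lambda(t) = m'(t) = a * b * e^{-bt}."""
    return a * b * math.exp(-b * t)

print(round(mean_failures(10), 2))  # about 63.21 failures expected by t = 10
```

Generalized NHPP models of the kind the paper proposes replace the constant detection rate b with functions of coverage and error introduction, changing the shape of m(t) while keeping the same Poisson-process framework.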
Optimization of Control Flow Checking Algorithm
LI Jian-ming,TAN Qing-ping,XU Jian-jun and YIN Sheng
Computer Science. 2014, 41 (Z11): 382-386. 
Abstract PDF(440KB) ( 362 )   
In the space environment,the electrical circuits of computers are often subject to hardware transient faults caused by high-energy neutrons from cosmic rays,so appropriate fault tolerance techniques are necessary to improve the reliability of space applications.This paper proposed an enhanced algorithm based on RSCFC(Relationship Signatures for Control Flow Checking),a control flow checking approach for hardware transient faults.In RSCFC,the number of basic blocks is limited by the machine word length.Through segmented encoding of signatures,the optimized method solves this problem effectively.Analytical results indicate that the maximal number of basic blocks is extended to 218 when the machine word length is 64 bits.Compared with RSCFC,our algorithm evidently decreases the performance and memory overhead while maintaining the fault detection capability.
Overview of Agriculture Big Data Research
ZHANG Hao-ran,LI Zhong-liang,ZOU Teng-fei,WEI Xu-yang and YANG Guo-cai
Computer Science. 2014, 41 (Z11): 387-392. 
Abstract PDF(816KB) ( 1389 )   
Data types and volumes are growing at an amazing speed,driven by the emergence of new services such as cloud computing,the Internet of Things,and social networks:the era of big data has come.Agricultural informatization is an important part of the construction of modern agriculture,and the use of the agricultural Internet of Things is pushing agricultural applications ever deeper into big data territory;big data analytics in turn provides technical support for agricultural informatization.This paper elaborated the concepts related to agricultural big data,introduced the progress of big data analytics,described the key techniques of agricultural big data,and finally summarized new challenges for the future.
Design and Realization of Distributed Big Data Management System
CHEN Hai-yan
Computer Science. 2014, 41 (Z11): 393-395. 
Abstract PDF(515KB) ( 484 )   
With the rapid development of cloud computing,the Internet of Things,mobile Internet,and other technologies,mass data in these areas is growing explosively.Big data,as a disruptive technology,makes handling such mass data possible.Meanwhile,traditional relational databases are no longer effective for mass data,which has led to the appearance and evolution of distributed NoSQL databases.Facing these practical difficulties,we designed and realized a distributed big data management system(DBDMS) based on Hadoop and NoSQL techniques,which provides real-time big data collection,search,and permanent storage.Experiments prove that DBDMS enhances the processing capacity for mass data and is well suited to mass log backup and retrieval,mass network packet capture and analysis,and other application areas.
Data Synchronization of Data-sharing Center Based on ESB and Agent
LI Ying-hong,HE Jing,SHEN Li,HE Li-bo and FAN Bo-wen
Computer Science. 2014, 41 (Z11): 396-398. 
Abstract PDF(260KB) ( 463 )   
The radio monitoring center of Yunnan Province is undergoing informatization.To keep data consistent among the application systems involved in this process,research on data synchronization and related technologies was carried out.After analyzing and summarizing the research directions and content of data synchronization,a method based on ESB and Agent was implemented.Finally,a new data synchronization architecture was realized using a data-sharing center.
Link Prediction in Uncertain Protein-Protein Interaction Network
ZHANG Yue-yang and LIU Wei
Computer Science. 2014, 41 (Z11): 399-402. 
Abstract PDF(682KB) ( 787 )   
Prediction of protein-protein interactions is an important research topic in the post-genomic era.So far,link prediction for PPI networks has assumed that interactions are deterministic.However,because of the limitations of experimental tests,PPI networks and other biological data exhibit uncertainty.We put forward a link prediction algorithm for uncertain PPI networks based on information propagation:the existence probability of each edge defines its link information,and the algorithm propagates this link information over the graph with the corresponding probabilities.Tests on standard data sets show that the proposed algorithm has good accuracy and captures biological features well.
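A minimal flavor of link prediction on an uncertain graph can be sketched with a common-neighbor score weighted by edge probabilities (a simplification for illustration, not the paper's propagation algorithm; the protein names and probabilities are made up):

```python
def prob_common_neighbors(edges, u, v):
    """Score a candidate link (u, v) by summing, over common neighbors w,
    the probability that both edges (u, w) and (w, v) exist.
    edges: {(a, b): existence probability}, undirected."""
    prob = {}
    for (a, b), p in edges.items():
        prob[(a, b)] = prob[(b, a)] = p
    neighbors = lambda x: {y for (a, y) in prob if a == x}
    score = 0.0
    for w in neighbors(u) & neighbors(v):
        score += prob[(u, w)] * prob[(w, v)]
    return score

ppi = {("P1", "P2"): 0.9, ("P2", "P3"): 0.8, ("P1", "P4"): 0.3, ("P4", "P3"): 0.5}
print(round(prob_common_neighbors(ppi, "P1", "P3"), 2))  # 0.9*0.8 + 0.3*0.5 = 0.87
```

Edges observed with low confidence contribute little to the score, which is the basic way uncertainty changes classical common-neighbor prediction.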
Recognition of Computational Thinking in Visual Programming
LIU Xiao-yan and CHEN Yan-li
Computer Science. 2014, 41 (Z11): 403-407. 
Abstract PDF(959KB) ( 529 )   
References | RelatedCitation | Metrics
Computational thinking has become a popular topic in education research, and visual programming allows more students to learn it. Previous studies center on the motivation that visual programming produces, but not on what kinds of computational thinking students actually learn from it. This paper performs semantic analysis on the games and simulations that students created with a visual language, based on program-behavior similarity, and proposes an automatic, visual assessment method: the computational-thinking pattern graph. The method evaluates the computational-thinking patterns used in student-created games and scientific simulations, and at the same time indicates students' possible transfer of computational thinking from game design to scientific simulation.
Program of Blocks Combining with LDLT Method for Finite Element Analysis
LIU Yue-jin and XUE Meng-jun
Computer Science. 2014, 41 (Z11): 408-409. 
Abstract PDF(159KB) ( 764 )   
References | RelatedCitation | Metrics
The blocked LDLT method is applied to solve the large systems of linear equations that arise in finite element analysis. With the block method, the coefficient matrix is triangularized and stored in a one-dimensional variable-bandwidth array, and data are exchanged between main memory and external storage. The method saves main memory, improves computational efficiency and overcomes memory-capacity limits. Numerical results show that the method is more efficient than the alternatives.
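For readers unfamiliar with the factorization itself, a minimal dense (unblocked, in-memory) LDLT sketch follows; the paper's blocked, variable-bandwidth, out-of-core version differs in storage layout but performs the same arithmetic.

```python
def ldlt(A):
    """LDL^T factorisation of a symmetric matrix A (lists of lists).
    Returns (L, d) with A = L * diag(d) * L^T. No pivoting, so A is
    assumed positive definite, as FEM stiffness matrices typically are."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    d = [0.0] * n
    for j in range(n):
        d[j] = A[j][j] - sum(L[j][k] ** 2 * d[k] for k in range(j))
        L[j][j] = 1.0
        for i in range(j + 1, n):
            L[i][j] = (A[i][j]
                       - sum(L[i][k] * L[j][k] * d[k] for k in range(j))) / d[j]
    return L, d

def solve(L, d, b):
    """Solve L D L^T x = b: forward, diagonal, then backward substitution."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):                    # L y = b
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    for i in range(n):                    # D z = y
        y[i] /= d[i]
    x = [0.0] * n
    for i in reversed(range(n)):          # L^T x = z
        x[i] = y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))
    return x

A = [[4.0, 2.0], [2.0, 3.0]]
L, d = ldlt(A)
print(solve(L, d, [8.0, 7.0]))  # -> [1.25, 1.5]
```

Only the lower triangle within the bandwidth need be stored, which is what makes the variable-bandwidth blocked variant memory-efficient.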
Page Replacement Algorithm with Pre-paging
TU Xiao-qin,SHANG Wei and ZHOU Fan-fan
Computer Science. 2014, 41 (Z11): 410-410. 
Abstract PDF(170KB) ( 596 )   
References | RelatedCitation | Metrics
Main memory is an important resource that must be carefully managed. Modern operating systems introduce virtual memory into memory management, and paging is the main way to realize virtual storage, so page replacement algorithms play an important role in the operating system. However, many page replacement algorithms require special hardware support and are therefore not widely applied. We analyzed the problems of several widely used page replacement algorithms and proposed a page replacement algorithm with pre-paging. In theory, this algorithm reduces the page-fault rate and improves the hit rate.
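The pre-paging idea can be sketched in a few lines. This is a toy simulation under assumptions of my own: on a fault, besides loading the faulting page, the next sequential page(s) are pre-fetched, with plain FIFO eviction; the paper's actual policy may differ.

```python
# Toy page-fault simulator comparing demand paging with pre-paging.
from collections import deque

def simulate(refs, frames, prefetch=1):
    """Return the number of page faults for a reference string `refs`."""
    memory = deque()          # FIFO order of resident pages
    faults = 0
    for page in refs:
        if page in memory:
            continue          # hit
        faults += 1
        # load the faulting page plus `prefetch` sequential successors
        for p in [page] + [page + i for i in range(1, prefetch + 1)]:
            if p in memory:
                continue
            if len(memory) >= frames:
                memory.popleft()   # evict the oldest page
            memory.append(p)
    return faults

refs = [0, 1, 2, 3, 4, 5, 6, 7]   # purely sequential access
print(simulate(refs, frames=4, prefetch=0),
      simulate(refs, frames=4, prefetch=1))  # -> 8 4
```

On sequential workloads pre-fetching halves the fault count here; on random workloads it can instead waste frames, which is the usual trade-off of pre-paging.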
Entities Expansion and Attribute Values Discovery Method Based on Web
LI Gui,CHEN Shao-gang,HAN Zi-yang,LI Zheng-yu,SUN Ping and SUN Huan-liang
Computer Science. 2014, 41 (Z11): 411-418. 
Abstract PDF(662KB) ( 403 )   
References | RelatedCitation | Metrics
Entity expansion and attribute-value discovery is an important research topic in Web data extraction and integration. In this paper, Web tables and domain entities are modeled as a bipartite graph. Guided by a quality score, the expanded entity set is updated iteratively until its quality score reaches a local maximum and the set no longer changes. To collect structured numerical or discrete attributes of the entities, we present an ILP-based method that completes attribute-value discovery. Experimental results show that the proposed approach outperforms previous techniques in both precision and recall.
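The iterate-until-local-maximum loop can be sketched generically. The quality and scoring functions below are toy stand-ins, not the paper's bipartite-graph score; only the control flow (grow the set while quality improves, stop at a local maximum) mirrors the abstract.

```python
# Schematic greedy expansion: add candidates only while quality improves.
def expand(seed, candidates, quality, score):
    """Greedily grow `seed` until no addition raises `quality`."""
    current = set(seed)
    best = quality(current)
    improved = True
    while improved:
        improved = False
        for c in sorted(candidates - current, key=score, reverse=True):
            trial = current | {c}
            q = quality(trial)
            if q > best:          # accept only quality-increasing additions
                current, best = trial, q
                improved = True
                break             # re-rank remaining candidates
    return current

# toy example: quality rewards even numbers, penalises odd ones
quality = lambda s: sum(1 if x % 2 == 0 else -1 for x in s)
result = expand({2}, {1, 2, 3, 4, 5, 6}, quality, score=lambda c: c)
print(sorted(result))  # -> [2, 4, 6]
```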
Study on Evaluation and Analysis Framework for Service-oriented Software Development Method
SU Hong-jun,YOU Zhen-hua and WANG Guo-hua
Computer Science. 2014, 41 (Z11): 419-425. 
Abstract PDF(858KB) ( 531 )   
References | RelatedCitation | Metrics
Service-oriented software has become the guiding criterion for implementing enterprise-level systems because of its core concepts of reuse and interoperability. However, there is still no well-defined way to evaluate service-oriented software development methods, so it is difficult to evaluate and compare the mechanisms adopted in different project situations. The proposed analysis framework, which uses a group of qualitative and quantitative features, evaluates software projects comprehensively from three aspects: structure, process and product. An example demonstrates the framework's flexibility, elasticity and comprehensiveness.
Exploratory on Virtualization-based Application Disaster Recovery Platform
XU Guan-jun
Computer Science. 2014, 41 (Z11): 426-429. 
Abstract PDF(366KB) ( 564 )   
References | RelatedCitation | Metrics
Data and information systems are basic elements of modern business operations, so ensuring data integrity and the high availability of information systems has become a focus of IT departments. Addressing the requirements and characteristics of application disaster recovery in small and medium-sized enterprises, this paper proposes a vSphere-virtualization-based application disaster recovery platform. The solution's RTO, RPO and reliability meet disaster-recovery needs well. The platform has been deployed in our data center, which demonstrated its technical feasibility.
Optimization Model of Logging Lithological Identification
WEI Zhi-hua and ZHANG Jun-ru
Computer Science. 2014, 41 (Z11): 430-431. 
Abstract PDF(756KB) ( 346 )   
References | RelatedCitation | Metrics
Rock and soil are extremely complex materials subject to many influences, including slip, fracture, rain erosion, corrosion and other natural and man-made factors. These influences introduce a large amount of erroneous data into lithological identification. Among optimization algorithms for large volumes of data, the support vector machine (SVM) has received wide attention, but the traditional SVM method is time-consuming. We therefore optimized the traditional SVM by replacing leave-one-out cross validation with K-fold cross validation, and used the optimized SVM for large-scale data search. Comparative tests show that, compared with the traditional SVM optimization algorithm, the method achieves higher recognition accuracy and faster convergence.
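The source of the claimed speed-up is easy to see: leave-one-out trains n models while K-fold trains only K. A minimal index-splitting sketch (the classifier itself is omitted; this is not the paper's SVM code):

```python
# K-fold cross-validation index generator, illustrating why K-fold is
# cheaper than leave-one-out: K model fits instead of n.
def k_fold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for K-fold cross validation."""
    fold = n // k
    idx = list(range(n))
    for i in range(k):
        start = i * fold
        stop = n if i == k - 1 else start + fold
        test = idx[start:stop]
        train = idx[:start] + idx[stop:]
        yield train, test

n = 100
folds = list(k_fold_indices(n, 5))
# leave-one-out would need n = 100 model fits; 5-fold needs only 5
print(len(folds), len(folds[0][1]))  # -> 5 20
```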
Research on Prediction Algorithm Based on In-memory Computing for Steel Prices
ZHU Jing-xiang,ZHANG Bin and LE Jia-jin
Computer Science. 2014, 41 (Z11): 432-435. 
Abstract PDF(348KB) ( 682 )   
References | RelatedCitation | Metrics
Because steel prices are nonlinear and their driving factors are hard to determine, traditional forecasting methods can analyze steel prices only on small data sets, leading to low prediction accuracy and slow speed. In the big data era, in-memory computing has become a research hotspot, and the demand for timely data processing keeps growing. Based on in-memory computing, steel price, production, inventory and GDP data from 2002 to 2010 were used to build four prediction models (a Bayesian forecasting model, an ARMA model, a support vector machine model and a BP neural network model) to forecast steel prices. Simulation results show that the in-memory prediction models are not only fast and accurate but also able to present prices in real time, providing a solid basis for enterprises' market decisions.
Research and Design of Temperature and Humidity Intelligent Control System in Library
CHANG Kai,XUE Dong-liang,SUN Qiang,CHEN Nian-sheng,GAO Yun-wei and CHENG Jia-lin
Computer Science. 2014, 41 (Z11): 436-439. 
Abstract PDF(623KB) ( 433 )   
References | RelatedCitation | Metrics
In view of the fact that temperature and humidity cannot be controlled jointly in the current intelligent construction of university libraries, a ZigBee-based temperature and humidity acquisition system was designed. With it, the correlation between temperature and humidity in a university library was studied and an empirical formula for their relationship was derived, establishing a theoretical basis for the joint control of temperature and humidity within the library. To remedy the currently poor automatic control of library temperature and humidity, an intelligent temperature and humidity control system was then designed on top of the acquisition system, with hardware and software working together. As a subsystem of the school library's intelligent-building construction, the system has achieved good practical results.
Study of Extenics Neural Networks Model and Stock Index Futures Analysis
LI Xiu-zhi and MENG Zhi-qing
Computer Science. 2014, 41 (Z11): 440-446. 
Abstract PDF(1558KB) ( 389 )   
References | RelatedCitation | Metrics
In recent years extenics neural networks (ENNs) have developed rapidly in the field of artificial intelligence, producing rich research results, one of which is the ENN-type 2. The topic is so new that practical applications of ENN-type 2 have been confined narrowly to various kinds of diagnosis, so its application needs to be extended to other fields to enrich this emerging topic. This paper is the first to extend the ENN-type 2 model to the prediction and analysis of stock index futures. It describes the structure and algorithms of two kinds of ENN-type 2 in detail, and then verifies the model's feasibility and effectiveness in predicting and analyzing stock index futures through experiments.
Charge-controlled Memristor-based Chaotic Circuit
FANG Ying and XU Bing-ji
Computer Science. 2014, 41 (Z11): 447-450. 
Abstract PDF(541KB) ( 519 )   
References | RelatedCitation | Metrics
Based on the definition of the memristor, a simple charge-controlled memristor model was adopted in which the relation between the memristance M and the charge q is a quadratic nonlinearity. Its v-i curve exhibits a figure-of-eight pinched hysteresis loop whose shape varies with the frequency and amplitude of the periodic input. A charge-controlled memristor-based chaotic circuit was then obtained from Chua's oscillator by replacing Chua's diode with this memristor model, and its dynamical characteristics were examined, including phase portraits, time-domain waveforms, and Lyapunov exponents and dimension. By varying the initial value, we showed that a tiny change in the initial state makes a huge difference to the system's dynamical trajectory, and the Lyapunov exponent spectrum indicates that the system becomes hyper-chaotic as the initial state changes. Furthermore, the stability condition at the equilibrium points, eliminating the influence of the zero characteristic root, is determined by the Routh criterion.
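The pinched hysteresis property follows directly from v = M(q)·i: whenever the driving current is zero, the voltage is zero too, so the loop is pinched at the origin. A small numerical sketch with made-up parameter values (a, b, amplitude and frequency are illustrative, not the paper's circuit values), using the quadratic q-M relation the abstract mentions:

```python
# Forward-Euler simulation of a charge-controlled memristor
# with M(q) = a + b*q**2 driven by a sinusoidal current source.
import math

def simulate(a=1.0, b=1.0, amp=1.0, freq=1.0, steps=10000, dt=1e-3):
    """Integrate dq/dt = i(t); return (i, v) samples of the v-i loop."""
    q = 0.0
    w = 2 * math.pi * freq
    samples = []
    for n in range(steps):
        i = amp * math.sin(w * n * dt)
        v = (a + b * q * q) * i        # v = M(q) * i, pinched at i = 0
        samples.append((i, v))
        q += i * dt                    # charge is the integral of current
    return samples

samples = simulate()
print(samples[0])  # loop passes through the origin -> (0.0, 0.0)
```

Plotting v against i over one drive period traces the figure-of-eight loop, whose lobes shrink as the drive frequency increases, as expected of a memristor.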
Research and Implementation of Mobile Phone Vertical Search Engine
SU Yong-hong and ZHANG Yu-rong
Computer Science. 2014, 41 (Z11): 455-460. 
Abstract PDF(546KB) ( 785 )   
References | RelatedCitation | Metrics
With the rapid development of network technology, general-purpose search engines often cannot meet users' demands, especially when users need to search for information within a single field, where vertical search engines fit user demands better. This paper discusses mobile-phone resource search and builds a vertical search engine with fairly precise results by customizing and extending Heritrix and Lucene. The main research work divides into four parts. First, by customizing and extending Heritrix, information was crawled from the Internet. Second, the crawled pages were analyzed and the relevant content extracted, partly with the HtmlParser tool. Third, Lucene was used to build full-text indexing and retrieval services for the system. Finally, an MVC connector was designed for the system. Tests of response time, recall and precision show that the system achieves its design goals.
Study on Campus Network Faults for Investigation Technology of User Self-service
ZHOU Fan-fan
Computer Science. 2014, 41 (Z11): 461-462. 
Abstract PDF(524KB) ( 405 )   
References | RelatedCitation | Metrics
In order to manage the campus network better, and aiming at the common faults and problems on campus networks, we propose letting users participate in campus network fault management and take user self-service troubleshooting as the focus of this study. Faults and their solutions are discussed from the perspectives of hardware, software and wireless networking, which further improves the application level of the campus network and lays a foundation for strengthening fine-grained management.
Mobile Printing System Technology Scheme
ZHOU Tong
Computer Science. 2014, 41 (Z11): 463-465. 
Abstract PDF(251KB) ( 365 )   
References | RelatedCitation | Metrics
Mobile printing is something users are eager for, but the vast majority of users have ordinary printers that cannot move freely, and they will not abandon their current equipment to purchase expensive wireless network printers. Through research on and experiments with wireless network technology, we implemented a technical method that turns ordinary printing into a mobile printing system: the printer is connected to a print server over a USB cable, the print server is connected to an AP's network interface over a network cable, and the AP provides access to the wireless network, so the printer effectively becomes mobile.
Self-organizing IOV Technology Oriented Vehicle Driving and Parking Service Using Vehicle Terminal
XI Jian-zhong
Computer Science. 2014, 41 (Z11): 466-470. 
Abstract PDF(1192KB) ( 404 )   
References | RelatedCitation | Metrics
Aiming at the problems that driving and parking on city roads are difficult and that complex real-time navigation easily produces a "maze" phenomenon, we propose a self-organizing Internet of Vehicles (IOV) technology, with independent intellectual property rights, that provides driving and parking services through a vehicle terminal. It realizes functions such as best-path navigation to a reserved parking space, parking-space booking and online payment. All parking information is transmitted through the network interface to the server and then sent to the clients. Customers install the vehicle terminal in the car, and with GPS positioning the parking information is displayed on the terminal. The key navigation breakthrough is direction selection on special multilayer overpasses, in tunnels and at forks, where normal navigation easily suffers blind areas. In addition, by placing special tag signals on such roads and installing the corresponding signal receiver on the vehicle terminal, the navigation effect can be further improved.
Experimental Design of Operating System in Independent College
TU Xiao-qin,SHANG Wei and ZHOU Fan-fan
Computer Science. 2014, 41 (Z11): 471-472. 
Abstract PDF(505KB) ( 520 )   
References | RelatedCitation | Metrics
"Computer Operating System" is a highly theoretical course. To help independent-college students deepen their understanding of operating system principles, stimulate their interest and improve their programming skills, this paper first discusses the necessity of learning the Linux operating system, then analyzes the relevant operating system principles and their implementation in the Linux kernel. Finally, the theoretical knowledge guides the design of several experiments, with tips for completing them. Students must write their code in C in a Linux environment. The difficulty of each experiment is tuned individually to the actual level of independent-college students.
Supervision Collaboration Platform Based on SOA and Cloud Computing Technologies
ZHANG Xiao-yuan,LIU Li-ren and HAN Hai-wen
Computer Science. 2014, 41 (Z11): 473-477. 
Abstract PDF(474KB) ( 380 )   
References | RelatedCitation | Metrics
Through analysis of the needs and characteristics of current electronic supervision operations, a collaborative supervision platform was designed based on SOA and cloud computing technologies. In the platform's bottom layer, virtualization and parallel distributed processing technologies manage the massive heterogeneous resources effectively and provide the upper layers with virtual resource services, a development environment for core supervision functions and feasible massive-data processing. In the middle layer, SOA technology encapsulates the core business functions, virtual resources and data as Web services. In the higher layer, BPMS technology encapsulates the collaboration among supervision businesses as Web services. Through the service center, the platform's top layer of Web services is opened to terminal users in supervision organizations distributed across different areas.
New Research and Design of Intelligent Information Collector of Traffic Violation
ZHU Er-xi and XU Min
Computer Science. 2014, 41 (Z11): 478-481. 
Abstract PDF(857KB) ( 339 )   
References | RelatedCitation | Metrics
Using Internet of Things technology, the collector connects video equipment, a microprocessor, a WiFi network module and a signal acquisition module. It recognizes and tracks vehicle targets in video images using the AdaBoost and Mean Shift algorithms, records each vehicle's trajectory, judges many types of violation from the trajectory, and uploads the violation information over the WiFi network. The collector needs no additional auxiliary equipment, and offers high reliability and convenient installation.
Comprehensive Evaluation Research of Information Security Based on Power System
XU Hui,LIANG Cheng-dong and CHENG Jun-chun
Computer Science. 2014, 41 (Z11): 482-484. 
Abstract PDF(237KB) ( 372 )   
References | RelatedCitation | Metrics
In order to effectively address the suboptimal state of comprehensive information security in power systems, this paper, considering the specific environment of power systems, designs a comprehensive evaluation architecture for information security. The architecture links the variable factors of technology, management, strategy, system and monitoring, and evaluates the degree of synergy in their development. Analysis shows that only by ensuring the synergetic development of these factors can the information security level of a power system be optimized efficiently.