Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
Volume 42 Issue Z11, 14 November 2018
Customer Classification Model of Employers by Using BP Neural Networks
QIAO Fei and GE Yan-hao
Computer Science. 2015, 42 (Z11): 1-4. 
Customer classification of employers makes it possible to evaluate the category of each employer correctly, helping decision-makers appraise cooperation efficiency when working with each customer. To provide a quantitative model for this problem, we extracted raw data series from the educational management information system and built a classification model based on a BP neural network. We then obtained the model parameters by training on historical datasets and compared the model's predictions with other methods widely used for this problem. We conclude that customer classification of employers for career guidance services using the BP neural network model performs better than existing solutions and, by making comparably precise predictions, gives more effective support to university management and government.
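The paper gives no implementation details, but the one-hidden-layer BP network it describes can be sketched with plain backpropagation. The employer features and labels below are hypothetical stand-ins for the educational-management data used in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, hidden=4, lr=1.0, epochs=5000, seed=0):
    """Train a one-hidden-layer BP network for binary classification."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)                 # hidden-layer activations
        out = sigmoid(h @ W2 + b2).ravel()       # output probability
        d_out = (out - y) * out * (1 - out)      # squared-error gradient at output
        d_h = (d_out[:, None] @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out[:, None] / n
        b2 -= lr * d_out.mean()
        W1 -= lr * X.T @ d_h / n
        b1 -= lr * d_h.mean(axis=0)
    def predict(Xq):
        return (sigmoid(sigmoid(Xq @ W1 + b1) @ W2 + b2).ravel() > 0.5).astype(int)
    return predict

# hypothetical employer features: [hiring volume, retention rate]
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])
y = np.array([1, 1, 0, 0])
predict = train_bp(X, y)
```

This is only a sketch of the general technique; the paper's actual network size, features, and training regime are not given in the abstract.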
Improved Decision Method of Borda Score
QIN Jie, HE Yi-hui and LAI Jun
Computer Science. 2015, 42 (Z11): 5-6. 
To address the traditional Borda method's poor reflection of plans and susceptibility to manipulation, an algorithm improving the Borda score was proposed. The algorithm considers both the group and the individual, adopts the score-improvement idea of the fuzzy Borda method and the member-interaction idea of the SPAN method, and improves the Borda score matrix. It calculates an integrated value with weighted formulas and sorts the plans by that value. Finally, the accuracy and reasonableness of the algorithm were verified.
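For reference, the classic Borda count that the paper improves on can be sketched as follows; the plans and expert rankings are invented for illustration.

```python
def borda_scores(rankings):
    """Classic Borda count: each expert ranks the plans best-first;
    a plan in position pos among m plans earns m - 1 - pos points."""
    m = len(rankings[0])
    scores = {plan: 0 for plan in rankings[0]}
    for ranking in rankings:
        for pos, plan in enumerate(ranking):
            scores[plan] += m - 1 - pos
    return scores

# three hypothetical experts ranking plans A, B, C best-first
rankings = [["A", "B", "C"], ["A", "C", "B"], ["B", "A", "C"]]
scores = borda_scores(rankings)
winner = max(scores, key=scores.get)
```

The paper's contribution (weighted integration and the improved score matrix) replaces the simple point sum above; that part is not specified in the abstract.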
Disambiguation Algorithm Design and Implementation of Food Safety Issues in Network
LIU Jin-shuo, DENG Ying-ying and DENG Juan
Computer Science. 2015, 42 (Z11): 7-9. 
This article proposes a disambiguation algorithm that correctly classifies unknown terms in online food-safety information. The algorithm combines a hidden Markov model (HMM) with an SVM classifier to disambiguate terminology, building on an improved TF-IDF feature selection algorithm: a new feature extraction algorithm, LN-TF-IDF, adds two weighting factors to traditional TF-IDF. Experiments on 202,831 texts show that the improved TF-IDF disambiguation algorithm, designed for the food-safety domain, improves disambiguation by 7.31% on average over traditional TF-IDF feature selection, with the F-measure as the evaluation criterion. The algorithm's performance is also relatively stable across experimental datasets collected at different times.
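The LN-TF-IDF formula is not given in the abstract, but the traditional TF-IDF baseline it extends can be sketched directly; the toy documents below are invented.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Standard TF-IDF: term frequency times inverse document frequency.
    The paper's LN-TF-IDF adds two extra weighting factors on top of this."""
    n = len(docs)
    df = Counter()                       # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["food", "safety", "standard"],
        ["food", "additive", "risk"],
        ["network", "safety", "risk"]]
w = tf_idf(docs)
```

Rarer terms ("standard", document frequency 1) receive larger weights than common ones ("food", document frequency 2), which is the property feature selection exploits.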
Research on Temporal and Spatial Distribution of Literatures and Knowledge Dissemination Based on Heat Transfer Science
ZHAO Zhi-yuan, ZHENG Yan-ning, ZHAO Xiao-yuan and JIA Ya-min
Computer Science. 2015, 42 (Z11): 10-15. 
This paper introduces the theory and research methods of heat transfer science and establishes a model of the temporal and spatial distribution of literature and knowledge dissemination. Based on heat transfer theory, we study publication year, heat degree, transmission model, attenuation (aging), influence, research field, research content, and industry division, and propose a series of new concepts: literature heat degree, accumulated literature heat, literature heat flow, literature thermal diffusion coefficient, and so on. Furthermore, we use heat diffusion, energy conservation, and other viewpoints from heat transfer science to study literature distribution, knowledge dissemination capability, and the analysis and judgment of the life cycle. Finally, we carry out a CTI empirical study.
Multi-objective Siting and Sizing of Distributed Generation Planning Based on Improved Particle Swarm Optimization Algorithm
ZHOU Yang, XU Wei-sheng, WANG Ning and SHAO Wei-hui
Computer Science. 2015, 42 (Z11): 16-18. 
By analyzing the impact of distributed generations (DGs) on the distribution network, a method was built for siting and sizing DGs in the distribution network. Comprehensively considering three indices (active power losses, voltage quality, and DG capacity), the paper presents an improved particle swarm optimization algorithm for multi-objective optimization based on fuzzy theory. The proposed method was tested on the IEEE 14-node system in MATLAB. Simulation results show that the algorithm has strong global search capability and a good convergence rate. The feasibility and effectiveness of the method and the algorithm were evaluated through comparative analysis.
Differential Evolution Algorithm Based on Clonal Selection and its Application in SVM
SHENG Ming-ming, HUANG Hai-yan and ZHAO Yu
Computer Science. 2015, 42 (Z11): 19-21. 
The parameters of a support vector machine (SVM) are important factors affecting its performance, but the absence of a mature theory for SVM kernel parameter selection hinders its wide application. This paper introduces the clonal selection algorithm into differential evolution, improves the strategies of both basic algorithms, and combines them into a differential evolution algorithm based on clonal selection, which is applied to optimize SVM kernel parameters. Test results show that the algorithm not only avoids the premature convergence of differential evolution but also significantly improves its optimization ability. Results on the UCI wine database show that the algorithm accelerates the parameter search and improves the prediction accuracy and generalization ability of the SVM. The high classification accuracy and better generalization confirm that clonal-selection differential evolution is a good way to optimize SVM kernel parameters.
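The clonal-selection hybridisation is not specified in the abstract; the plain DE/rand/1/bin loop being hybridised, here minimising a quadratic stand-in for SVM cross-validation error over (C, gamma) on a log scale, might look like:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.5, CR=0.9, gens=100, seed=1):
    """Plain DE/rand/1/bin; the paper hybridises this with clonal selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # mutation: three distinct individuals other than the target
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # binomial crossover between mutant and target
            trial = [a[d] + F * (b[d] - c[d]) if rng.random() < CR else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(v, bounds[d][0]), bounds[d][1])
                     for d, v in enumerate(trial)]
            if f(trial) < f(pop[i]):      # greedy selection
                pop[i] = trial
    return min(pop, key=f)

# hypothetical stand-in for cross-validation error over log2(C), log2(gamma)
err = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best = differential_evolution(err, [(-5, 5), (-5, 5)])
```

In the real setting `err` would train and cross-validate an SVM at each candidate parameter pair, which is exactly why the paper works to reduce the number of evaluations.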
Local Enhancement Differential Evolution Searching Method for Protein Conformational Space
DONG Hui, HAO Xiao-hu and ZHANG Gui-jun
Computer Science. 2015, 42 (Z11): 22-26. 
A local enhancement differential evolution method was proposed to address the problem of searching protein conformational space. Within the differential evolution framework, the Rosetta Score3 coarse-grained energy model is employed to reduce the dimension of the search space and improve the convergence rate, and knowledge-based fragment assembly is introduced to improve prediction accuracy. To obtain better local near-native conformations, a local enhancement operation exploits the strong local search of the Monte Carlo algorithm, while the strong global search capacity of differential evolution samples the whole conformational space effectively. Experimental results on five test proteins verify the superior search performance and prediction accuracy of the proposed method.
Supply Chain Competitiveness Evaluation Method Based on Optimized Support Vector Machine
ZHONG Fu, GUO Jian-sheng, ZHANG Si-jia and WANG Zu-tong
Computer Science. 2015, 42 (Z11): 27-31. 
Supply chain competitiveness is difficult to evaluate accurately because of its many variable factors, limited information, and difficult data collection. This paper builds a new evaluation index system for the supply chain and proposes a new competitiveness evaluation method. It uses the global optimization ability of the artificial bee colony algorithm to optimize the control parameters of a support vector machine effectively, and on this basis constructs an ABC-SVM evaluation model. Experimental results show that the proposed method effectively improves the evaluation precision of supply chain competitiveness and helps improve the effectiveness of business decision-making.
Premature Beat Signal Recognition Algorithm Based on Wavelet Transform and Rough Set
TANG Xiao, SHU Lan and ZHENG Wei
Computer Science. 2015, 42 (Z11): 32-35. 
The selection and extraction of ECG feature parameters are the basis of electrocardiogram (ECG) analysis, and improving detection and classification accuracy is the key to automatic analysis. Thus, a hybrid algorithm based on the wavelet transform (WT) and attribute reduction from granular computing (GC) was presented to detect premature beat signals in the ECG. First, 12 ECG feature parameters are chosen according to cardiovascular experts' diagnostic criteria. Then a wavelet-transform-based detection algorithm extracts the features, and a granular-computing-based algorithm reduces the attributes. Finally, the data are classified and the result is verified on the MIT-BIH database. Experiments show that classification accuracy after reduction is much higher than before reduction, justifying that reasonable selection of feature parameters is an important factor in improving recognition efficiency.
Recommending Books Based on Reading Time and Frequency
CAO Bin, GONG Jiao-rong, PENG Hong-jie, ZHAO Li-wei and FAN Jing
Computer Science. 2015, 42 (Z11): 36-41. 
With the rise of electronic reading in recent years, using collaborative filtering (CF) to recommend personalized books has practical value and has become an important topic in recommender system research. However, many current e-reading systems lack users' rating data, which hinders the application of CF. To address this, we analyzed massive user reading behavior and propose a reading time-frequency (T-F) model to profile users' interest in books. An implicit rating matrix can be derived from this model, after which classical CF algorithms apply naturally. Experimental results show that user-based CF with the proposed T-F rating model improves recommendation effectiveness and is feasible for real scenarios.
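The abstract does not give the T-F rating formula; one plausible sketch, with entirely hypothetical normalisation constants and weighting, maps reading time and opening frequency to a 1-5 implicit rating:

```python
import math

def tf_rating(read_seconds, open_count, max_seconds=3600.0, max_opens=20):
    """Hypothetical T-F rating: combine normalised reading time and
    log-scaled opening frequency into an implicit 1-5 score."""
    t = min(read_seconds / max_seconds, 1.0)                    # time component
    f = min(math.log1p(open_count) / math.log1p(max_opens), 1.0)  # frequency component
    return round(1 + 4 * (0.5 * t + 0.5 * f), 2)

r_heavy = tf_rating(3000, 15)   # long, frequent reading
r_light = tf_rating(120, 1)     # brief, one-off reading
```

The resulting matrix of such scores can feed any classical CF algorithm in place of explicit ratings, which is the role the paper's T-F model plays.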
Accelerating the Recovery of Markov Blanket Using Topology Information
FU Shun-kai, SU Zhi-zhen, Sein Minn and LV Tian-yi
Computer Science. 2015, 42 (Z11): 42-48. 
The Markov blanket (MB) is known to be the optimal feature subset for prediction, and there has been fertile work on inducing the MB by local search since 1996. A novel algorithm, FSMB, was proposed; it relies heavily on conditional independence (CI) tests to determine connections between nodes, so it is also constraint-based. It differs from previous work by treating candidate CI tests unequally: FSMB extracts critical d-separation topology information from completed CI tests and uses it to prioritize the tests more likely to uncover independence relations, so the search space shrinks more quickly and efficiently. Experimental studies indicate that FSMB achieves tremendous improvements in time efficiency over the state-of-the-art PCMB and IPC-MB without sacrificing learning quality. On large networks (e.g., 100 and 200 nodes), FSMB runs even more efficiently than IAMB, currently recognized as the fastest algorithm, requiring up to 40% fewer CI tests while producing much higher-quality results. Experiments with UCI datasets and four classical classification models indicate that models trained on FSMB's output are close to or exceed the performance of models trained on all features, so FSMB is an effective feature subset selector.
Bi-direction Maximum Matching Method Based on Hash Structural Dictionary
CHEN Zhi-yan, LI Xiao-jie, ZHU Shu-hua, FU Dan-long and XING Yi-hai
Computer Science. 2015, 42 (Z11): 49-54. 
In Chinese natural language processing, an ordinary dictionary cannot be used for the backward maximum matching method, and a separate reverse dictionary is difficult to maintain. We put forward a new dictionary structure with a corresponding bi-direction maximum matching method, and add a mutual-information ambiguity processing module to the algorithm. Compared with previous maximum matching methods, this algorithm increases segmentation accuracy significantly and is applicable to Chinese natural language processing systems with high segmentation accuracy requirements.
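A minimal sketch of bi-direction maximum matching against a plain hash-set dictionary follows; the paper's special dictionary structure and mutual-information disambiguation are omitted, and the tiny vocabulary is illustrative only.

```python
def fmm(text, vocab, max_len=4):
    """Forward maximum matching: greedily take the longest dictionary word."""
    out, i = [], 0
    while i < len(text):
        for L in range(min(max_len, len(text) - i), 0, -1):
            if L == 1 or text[i:i + L] in vocab:
                out.append(text[i:i + L])
                i += L
                break
    return out

def bmm(text, vocab, max_len=4):
    """Backward maximum matching: scan from the end of the sentence."""
    out, j = [], len(text)
    while j > 0:
        for L in range(min(max_len, j), 0, -1):
            if L == 1 or text[j - L:j] in vocab:
                out.insert(0, text[j - L:j])
                j -= L
                break
    return out

def bi_mm(text, vocab):
    """Keep whichever direction yields fewer (hence longer) words;
    on a tie, fall back to the backward result."""
    f, b = fmm(text, vocab), bmm(text, vocab)
    return f if len(f) < len(b) else b

vocab = {"研究生", "生命", "命", "研究", "起源", "生命起源"}
segs = bi_mm("研究生命起源", vocab)
```

The classic example shows why both directions matter: forward matching yields 研究生 / 命 / 起源, while backward matching finds the better 研究 / 生命起源.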
Mixed Data Affinity Propagation Clustering Algorithm Based on Dimensional Attribute Distance
HUANG De-cai and QIAN Chao-kai
Computer Science. 2015, 42 (Z11): 55-57. 
A new distance measure was proposed because affinity propagation cannot cluster mixed data sets, and it was successfully applied to the affinity propagation clustering algorithm. The new algorithm does not need to calculate virtual cluster centers, and it also considers the diversity of the whole data set. The algorithm was validated on two UCI data sets, and its clustering performance is better than K-Prototypes and K-Modes in both clustering entropy and execution efficiency.
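The paper's exact dimensional-attribute distance is not given in the abstract; a common sketch of a mixed numeric/categorical distance of this general kind, with a hypothetical weighting factor gamma, is:

```python
def mixed_distance(a, b, numeric_idx, gamma=0.5):
    """Per-dimension distance for mixed data: squared difference on numeric
    attributes, a gamma-weighted 0/1 mismatch on categorical ones."""
    d = 0.0
    for i, (x, y) in enumerate(zip(a, b)):
        if i in numeric_idx:
            d += (x - y) ** 2           # numeric attribute
        else:
            d += gamma * (x != y)       # categorical attribute
    return d

a = (1.0, 0.5, "red", "large")
b = (1.2, 0.5, "blue", "large")
dist = mixed_distance(a, b, numeric_idx={0, 1})
```

The negated pairwise distances can then be used directly as the similarity matrix that affinity propagation takes as input, which is how a measure like this plugs into the algorithm.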
Study of Concurrent Control Algorithm in Collaborative Editing
SUN Min and WANG Rui-hua
Computer Science. 2015, 42 (Z11): 58-62. 
Aiming at the various inconsistency problems in collaborative editing, this paper proposes a concurrency control algorithm named ICOT (Improved Context-based Operation Transformation) based on operational transformation. The algorithm improves on the COT algorithm and reduces the number of operation transformations; it solves the problem of repeated operation transformation by choosing a reasonable operation version in the middle of the transformation. A concrete example analysis verifies the correctness and effectiveness of the improved algorithm. The results illustrate that ICOT gives each copy editor effective consistency maintenance.
Unbalanced Data Classification Algorithm Based on Clustering Ensemble Under-sampling
ZHANG Xiao-shan and LUO Qiang
Computer Science. 2015, 42 (Z11): 63-66. 
Imbalanced data is widespread in the real world, yet most traditional classification algorithms assume a balanced data distribution, which biases the classification outcome toward the majority class and yields poor results. An enhanced AdaBoost based on clustering ensemble under-sampling was proposed. The algorithm first clusters the samples by clustering ensemble according to sample weight, randomly selects majority-class samples from each cluster in a certain proportion, and merges them with all minority-class samples to generate a balanced training set. Within the AdaBoost framework, it adjusts the weights of the majority and minority classes differently and selects several well-performing base classifiers for the final ensemble. Experimental results show that the algorithm has a clear advantage in unbalanced data classification.
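A rough sketch of the under-sampling step follows, using a single k-means in place of the paper's clustering ensemble and ignoring sample weights; the synthetic data is invented for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means, standing in for the paper's clustering ensemble."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_undersample(X_maj, X_min, k=3, seed=0):
    """Draw majority samples from each cluster in proportion to its size
    until the two classes are balanced."""
    rng = np.random.default_rng(seed)
    labels = kmeans(X_maj, k, seed=seed)
    need = len(X_min)
    picked = []
    for j in range(k):
        idx = np.where(labels == j)[0]
        take = int(np.ceil(need * len(idx) / len(X_maj)))
        picked.extend(rng.choice(idx, min(take, len(idx)), replace=False))
    return X_maj[np.array(picked[:need])]

rng = np.random.default_rng(1)
X_maj = np.vstack([rng.normal(c, 0.1, size=(30, 2)) for c in (0.0, 1.0, 2.0)])
X_min = rng.normal(0.5, 0.1, size=(9, 2))
balanced = cluster_undersample(X_maj, X_min)
```

Sampling per cluster rather than globally preserves the majority class's structure, which is the point of cluster-based under-sampling; the balanced set would then feed the modified AdaBoost.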
Financial Decision-making Reasoning Method of 6-element Linguistic Truth-valued Intuitionistic Fuzzy System
ZOU Li, WANG Ying and TAN Xue-wei
Computer Science. 2015, 42 (Z11): 67-71. 
To process uncertain information in financial decision-making, this work uses an intuitionistic lattice implication algebra that can handle linguistic truth values. Based on linguistic truth-valued intuitionistic fuzzy logic, a new model of a personal financial decision auxiliary system was established. The system's reasoning mechanism extends intuitionistic fuzzy logic and approximate reasoning with a similarity method, yielding an uncertainty reasoning method for fuzzy problems. An example illustrates that the proposed method is flexible and effective in handling financial decision-making problems.
Relative Density-based Clustering Algorithm over Uncertain Data
PAN Dong-ming and HUANG De-cai
Computer Science. 2015, 42 (Z11): 72-74. 
Traditional relative density-based clustering has the advantage of overcoming sensitivity to user-defined parameters and of distinguishing different density hierarchies. This paper provides a new relative density-based clustering algorithm for uncertain data, which defines a distance formula, density ratio, core points, and density-reachability, and can efficiently handle uncertain data. Simulation results illustrate the validity and usability of the algorithm.
Theory of Probability Truth Degree in Gödel 4-valued Propositional Logic System
HE Jin-rui, HUI Xiao-jing and SHUANG Jing-ning
Computer Science. 2015, 42 (Z11): 75-79. 
This paper proposes a theory of probability truth degree in Gödel 4-valued propositional logic. It is proved that the set of probability truth degrees of all formulas has no isolated point in [0,1]. The concept of probability similarity degree between two formulas is defined, the probability logic metric space is built, and it is proved that this space has no isolated point. This provides a foundation for approximate reasoning theory.
Who Can Collaborate New Users in Recommendation System?
ZHANG Li and YU Lei
Computer Science. 2015, 42 (Z11): 80-82. 
As a successful technology in recommender systems, collaborative filtering has been widely studied by scholars in various fields. However, with the growth of new users and items, collaborative recommendation faces the serious challenge of cold start. This study measures users' recommending ability based on popularity and the long-tailed distribution, and constructs a global core user set for recommendation using user popularity, which can be used to solve cold-start problems in recommendation systems. In addition, experimental results show that using the core user set for collaborative recommendation reduces the complexity of finding similar users without lowering recommendation performance, so it can also improve real-time recommendation.
Parameter Estimation for Piecewise-linear Chen System Based on Genetic Algorithm
TANG Wen and WU Lei
Computer Science. 2015, 42 (Z11): 83-85. 
In this paper, single-population and multi-population genetic algorithms were studied for parameter estimation of the piecewise-linear Chen system. The estimation problem was transformed into a multi-parameter optimization problem by constructing an appropriate fitness function, and numerical calculation was carried out with the global optimization of the genetic algorithm. The results show that the multi-population genetic algorithm has clear advantages in accuracy and robustness over the single-population genetic algorithm when estimating parameters of the piecewise-linear Chen system.
Improved Simple Particle Swarm Optimization Algorithm
SUN Zhen-long, LI Xiao-ye and WANG Ying
Computer Science. 2015, 42 (Z11): 86-88. 
Aiming at demerits of the particle swarm optimization algorithm (PSO), such as easily falling into local extrema, slow convergence, and low precision late in the evolution, an improved simple particle swarm optimization algorithm (YSPSO) was proposed. It employs the golden section method to balance the mutual influence between inertia and experience. Meanwhile, to avoid missing the global optimum, it adds reverse random inertia weights so that particles can search in the reverse direction to a certain extent. Experimental results on several classic benchmark functions show that YSPSO improves the practicability of PSO by improving convergence speed and precision and reducing the chance of falling into local extrema.
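The YSPSO modifications are only outlined in the abstract; a basic PSO loop with the inertia weight fixed at the golden ratio 0.618, as a rough stand-in for the golden-section balancing (the reverse random inertia weights are omitted), might look like:

```python
import random

def pso(f, bounds, n=30, iters=200, seed=3):
    """Basic particle swarm optimization on a box-constrained function."""
    rng = random.Random(seed)
    dim = len(bounds)
    w, c1, c2 = 0.618, 1.5, 1.5          # inertia, cognitive, social weights
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]           # personal bests
    gbest = min(pbest, key=f)            # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]),
                               bounds[d][1])
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
        gbest = min(pbest, key=f)
    return gbest

sphere = lambda p: sum(v * v for v in p)   # classic benchmark function
best = pso(sphere, [(-10, 10)] * 2)
```

On the 2-D sphere function this converges to near zero; the paper's benchmarks compare exactly this kind of run under the standard and improved weighting schemes.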
Path Planning for Unmanned Air Vehicles Using Improved Artificial Bee Colony Algorithm
LI Ren-xing and DING Li
Computer Science. 2015, 42 (Z11): 89-92. 
Aiming at the survival problem of unmanned aerial vehicles (UAVs) in complex combat fields, a novel artificial bee colony (ABC) algorithm based on the cloud model was proposed. Considering the randomness and stability of the cloud model, we use the one-dimensional normal cloud model to improve the robustness of the ABC algorithm and avoid local optima, and introduce a new selection strategy to maintain diversity. When the proposed algorithm is applied, the UAV path planning problem is first transformed into a multi-dimensional optimization problem through environmental modeling; then the advantages of the ABC algorithm and the cloud model are combined; finally, the algorithm is tested on the path planning task. Experimental results show that the improved algorithm is feasible and superior for solving UAV path planning.
[α12]1-Probabilistic Quasi-Hoare Logic and its Reliability
WU Xin-xing, HU Guo-sheng and CHEN Yi-xiang
Computer Science. 2015, 42 (Z11): 93-99. 
A Hoare-logic-based [α12]1-probabilistic quasi-Hoare logic was presented, and its reliability was proved.
Fuzzy Reasoning and its Application Based on Fuzzy Propositional Logic
WU Xiao-gang and PAN Zheng-hua
Computer Science. 2015, 42 (Z11): 100-103. 
FLcom is a fuzzy propositional logic system with contradictory negation, opposite negation, and medium negation, based on the fuzzy set FScom. Most existing studies of negation in fuzzy inference and processing are based on classical logic. This paper studies the semantic interpretation of fuzzy inference rules that distinguish the three kinds of negation: contradictory negation, opposite negation, and medium negation. The new compositional rule of inference generalizes the implication operator in the CRI algorithm. Finally, we compared the FLMP algorithm with the CRI algorithm on an example; the analysis shows that the new algorithm is reasonable and feasible.
Survey of Human Body Geometry Modeling Methods
WANG Xin, YANG Yan-hong and CHEN Sheng-yong
Computer Science. 2015, 42 (Z11): 104-108. 
Human body modeling is an important research topic in computer graphics and computer vision, and geometry modeling is its foundation. Body geometry modeling now has a large number of implementation methods. This paper reviews and summarizes the latest research achievements in human body geometry modeling, dividing them into four categories: direct modeling, 2D image recognition, template matching, and statistics-based model synthesis. Future development trends of human body geometry modeling methods are also discussed.
Tolerance Nearness Measure Based on HSV and Texture Feature
LIU Wen-ying, WANG Yong-jun and YANG Yi-chuan
Computer Science. 2015, 42 (Z11): 109-112. 
Content-based image retrieval is an important issue in image processing, and similarity measurement is its core problem. The tolerance nearness measure (tNM) based on near sets outperforms IRM (integrated region matching) when only the grey feature is extracted. Considering that tNM is close to human vision, we replaced the grey feature with the HSV color space, extracting grey+texture and HSV+texture features respectively. Retrieval results were then obtained by IRM and tNM on images from 10 categories. Analysis and comparison of these results show that the HSV+texture feature performs better than the grey+texture feature.
On-line Handwritten Flowchart Recognition Based on Grammar Description Language
CHEN Quan, SHI Da-peng, FENG Gui-huan, ZHAO Xiao-yan and LUO Bin
Computer Science. 2015, 42 (Z11): 113-118. 
This paper proposes a sketch recognition approach based on a grammar description language. The recognition process has three steps. First, dynamic programming groups strokes into stroke combinations under spatial and temporal constraints. Second, a neural network classifies the candidate symbols. Last, the candidate symbols are filtered with the constituent grammar rules, and a grammar parser produces the recognition results. The method has been applied to the freely available FCinkML database; the results demonstrate the effectiveness and efficiency of our approach.
Image Threshold Segmentation Algorithm Based on SUSAN Edge Information
WU Cong-zhong and LI Jun
Computer Science. 2015, 42 (Z11): 119-122. 
Threshold segmentation based on edge information performs well in maintaining target profiles and separating objects from low-contrast pictures, so it is widely used and especially suitable for industrial production images. However, traditional methods are sensitive to noise and the threshold is hard to select. To solve these problems, an adaptive image threshold segmentation algorithm based on edge information was presented. The algorithm uses the SUSAN characteristic response to describe the edge information of pixels, suppressing the effect of noise and weak boundaries. Using a min-max cut threshold segmentation algorithm based on graph spectral theory greatly reduces time and space complexity compared with other segmentation algorithms, and the obtained threshold is globally optimal. Experimental results show that the algorithm segments the target accurately while retaining rich detail, performs well on low-contrast and noisy images, and selects a better threshold than traditional algorithms.
Segmentation of Bright Speckles in SD-OCT Diabetic Retinal Images Based on Self-adaption Threshold and Region Growing
YU Chen-chen, CHEN Qiang, FAN Wen, YUAN Song-tao and LIU Qing-huai
Computer Science. 2015, 42 (Z11): 123-125. 
Hard exudation is an obvious symptom of diabetic retinopathy, and the bright speckles in SD-OCT (Spectral Domain Optical Coherence Tomography) images are closely related to exudation. To research the relation between retinopathy and exudation, the bright speckles must be extracted, yet there are few studies on their segmentation. In this paper, we first limit the target regions with layer segmentation methods, then determine the seed sets by self-adaptive thresholding, and finally extract the bright speckles by region growing based on human vision features. Experiments demonstrate that our method accurately segments the bright speckles in diabetic retinal images.
Thyroid Nodule Ultrasound Image Feature Extraction Technique Based on TI-RADS
HAN Xiao-tao, YANG Yan, PENG Bo and CHEN Qin
Computer Science. 2015, 42 (Z11): 126-130. 
Ultrasound is the first-choice imaging modality for thyroid examination. Clinical analysis of thyroid ultrasonography is based on quantitatively evaluating the image features in the Thyroid Imaging Reporting and Data System (TI-RADS). However, the quantified features are influenced by doctors' experience, status, and other related factors. Computer-aided analysis can evaluate ultrasound imaging features objectively and reduce the influence of subjective factors on diagnostic results, but most existing systems rely on classic image texture features, which are abstract and lack explicit meaning, making clinical use difficult. We extracted and quantified the sonographic features of thyroid nodules covered by TI-RADS and, based on doctors' clinical experience, designed quantization methods for the corresponding visual characteristics, providing a basis for standardized description of thyroid ultrasound images. Statistical learning methods were adopted to build a model identifying benign and malignant thyroid nodules from these characteristics, providing reference recommendations for clinical diagnosis. The recognition accuracy of the model reaches 100%.
Fault Diagnosis Method of Rolling Bearing Based on Dual-tree Rational-dilation Complex Wavelet Packet Transform and SVM
SUN Shan-shan, HE Guang-hui and CUI Jian
Computer Science. 2015, 42 (Z11): 131-134. 
To improve the recognition accuracy of SVM classification, a fault diagnosis method based on the dual-tree rational-dilation complex wavelet transform and the support vector machine (SVM) was proposed, according to the characteristics of rolling bearing fault vibration signals. First, the fault signal is decomposed into several frequency band components by the dual-tree rational-dilation complex wavelet transform. Second, the energy of each component is normalized. Finally, the energy characteristic parameters of each frequency band component are taken as SVM input to identify the fault type of the rolling bearing. Experimental results show that the proposed method identifies the fault type accurately and effectively.
Research on Intelligent Control Method for Moving Object Tracking Based on PTZ Camera
CHEN Shuang-ye and WANG Shan-xi
Computer Science. 2015, 42 (Z11): 135-139. 
PTZ camera tracking of moving objects traditionally relies on manual operation and is neither real-time nor continuous, often causing tracking failure. A new intelligent control system was therefore designed using the HSV color histogram as the model feature. The Camshift algorithm is adopted to locate the tracked object, and a Kalman filter model estimates the moving object's location at the next time step. A closed-loop design controls the PTZ camera platform's movement and the lens's automatic zoom, improving real-time performance. Cooperating with the intelligent control system, tracking becomes more accurate when the PTZ camera is controlled from an Android smartphone. The results show that the method is feasible, simple to control, and maintains correct localization, and that it also improves the real-time performance and reliability of object tracking.
Image Compression Based on Discrete Hermite Polynomials
XIAO Bin, LU Gang, WANG Guo-yin and MA Jian-feng
Computer Science. 2015, 42 (Z11): 140-141. 
Abstract PDF(497KB) ( 539 )   
References | RelatedCitation | Metrics
Image compression effectively reduces the information redundancy among image pixels while ensuring reconstruction quality and low computational complexity.Transform-domain coding is the most commonly used and among the most effective compression techniques,but image compression methods based on discrete orthogonal polynomial transforms have not yet been studied in depth.After studying the encoding and decoding procedures of JPEG,we proposed an image compression algorithm based on discrete Hermite polynomials.The quantization table is determined by the ratio of the information entropy of the transform kernel to that of the DCT,and the quantized results are encoded with Huffman entropy coding.We implemented the whole process of image compression and reconstruction based on discrete Hermite polynomials.The experimental results show that,compared with the mainstream JPEG image compression standard,the algorithm achieves a similar compression ratio and comparable performance,with only a small difference in peak signal-to-noise ratio (PSNR).
Modular Two-dimensional Locality Preserving Discriminant Analysis and its Application in Human Face Recognition
ZHAO Chun-hui and CHEN Cai-kou
Computer Science. 2015, 42 (Z11): 142-145. 
Abstract PDF(369KB) ( 450 )   
References | RelatedCitation | Metrics
Locality preserving discriminant analysis plays an important role in face recognition research.Building on it,the 2DLPDA method operates directly in the two-dimensional space,which improves performance to some extent.However,2DLPDA is sensitive to variations such as lighting,expression and occlusion,which strongly affects its recognition rate.We proposed an improved algorithm,the modular two-dimensional locality preserving discriminant analysis method,which divides each sample into blocks so that the local neighborhood of the sample can be extracted better.Because each sample is divided into blocks,the different blocks of one sample may have different nearest neighbors,allowing the local features of the sample to be captured more effectively.Finally,all the local features are integrated as the basis for identification.Experimental results on the AR,YALE and ORL face databases show that the proposed method outperforms the 2DLPDA method.
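The block-division step the abstract describes is simple to sketch: partition each 2-D sample into non-overlapping sub-blocks that are then treated as independent sub-samples. A minimal numpy version (block sizes are assumed to divide the image evenly, which the paper may handle more generally):

```python
import numpy as np

def split_into_blocks(img, bh, bw):
    """Split an H x W sample into non-overlapping bh x bw blocks.

    Returns an array of shape (H//bh * W//bw, bh, bw); each block can
    then be matched against its own nearest neighbors.
    """
    H, W = img.shape
    assert H % bh == 0 and W % bw == 0, "block size must divide the image"
    blocks = img.reshape(H // bh, bh, W // bw, bw).swapaxes(1, 2)
    return blocks.reshape(-1, bh, bw)

face = np.arange(64).reshape(8, 8)   # toy 8x8 "face" sample
blocks = split_into_blocks(face, 4, 4)
```

Blocks are emitted in row-major order (top-left, top-right, bottom-left, bottom-right for the 8x8 example), so local occlusions or lighting changes affect only some blocks' neighbor relations.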
Palmprint Recognition Based on Improved PCA and SVM
LI Kun-lun, ZHANG Ya-xin, LIU Li-li and GENG Xue-fei
Computer Science. 2015, 42 (Z11): 146-150. 
Abstract PDF(946KB) ( 559 )   
References | RelatedCitation | Metrics
Palmprint recognition is an important branch of biometric recognition,and feature extraction and feature matching are its main components.This paper improved the feature extraction stage based on principal component analysis (PCA).In the first method,the palmprint image is processed by the Fourier transform before PCA is applied;in the second,block-wise PCA is applied to the palmprint image directly.Experiments verify that the improved feature extraction methods raise the recognition accuracy.For feature matching,although template matching has low computational cost and reasonable accuracy to a certain extent,it easily suffers from the small-sample-size problem,so this paper performed palmprint recognition by training an SVM classifier.Experimental results show that the method is feasible.
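The PCA stage common to both variants can be sketched via the SVD of the centered data matrix. This is a generic PCA sketch, with random vectors standing in for the (Fourier-transformed or block-wise) palmprint features:

```python
import numpy as np

def pca_project(X, k):
    """Project row-vector samples X (n x d) onto the top-k principal axes."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Right singular vectors of the centered data are the principal axes.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k], mean

rng = np.random.default_rng(1)
# Synthetic stand-ins for palmprint feature vectors (e.g. Fourier magnitudes).
X = rng.standard_normal((40, 100))
Y, axes, mu = pca_project(X, 5)
```

The projected vectors `Y` (here 40 samples reduced from 100 to 5 dimensions) would then be fed to the SVM classifier.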
Improved Parzen Window Based Ship Detection Algorithm in SAR Images
ZHANG Hao, MENG Xiang-wei, LIU Lei and LI De-sheng
Computer Science. 2015, 42 (Z11): 151-154. 
Abstract PDF(842KB) ( 587 )   
References | RelatedCitation | Metrics
The classical Parzen algorithm assumes that targets occupy only a small part of the SAR image and uses all pixels of the image to estimate the probability density function of the clutter background.This elevates the detection threshold,so less obvious targets may be missed.To overcome this problem,we proposed an improved Parzen detection algorithm.The proposed algorithm adaptively sets the target windows according to the size of the target and removes the potential targets from the background,then estimates the clutter distribution with the Parzen window method and determines the detection threshold for target detection.Compared with the traditional Parzen detector,the proposed algorithm reduces the number of missed targets and improves the detection performance.Detection results on real SAR images verify the effectiveness of the algorithm.
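The core density-estimation step is the standard Parzen (kernel) estimate; the paper's improvement is in which samples are fed to it (target pixels excluded). A minimal 1-D Gaussian-kernel sketch with synthetic clutter:

```python
import numpy as np

def parzen_pdf(samples, x, h):
    """Parzen-window estimate of p(x) using a Gaussian kernel of width h."""
    u = (x[:, None] - samples[None, :]) / h
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

rng = np.random.default_rng(2)
clutter = rng.normal(0.0, 1.0, 5000)   # synthetic background-clutter samples
xs = np.array([0.0, 3.0])
p = parzen_pdf(clutter, xs, h=0.3)     # density near the mode vs. the tail
```

A detection threshold is then set at a quantile of this estimated clutter density; excluding bright target pixels from `clutter` before estimation (as the paper proposes) keeps the tail estimate, and hence the threshold, from being inflated.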
Fuzzy Clustering Level Set Based Medical Image Segmentation Method
WU Jie, ZHU Jia-ming and CHEN Jing
Computer Science. 2015, 42 (Z11): 155-159. 
Abstract PDF(1179KB) ( 561 )   
References | RelatedCitation | Metrics
Medical image segmentation is an important application of image segmentation.Medical images commonly exhibit high noise,artifacts,low contrast,uneven gray levels,and fuzzy boundaries between different soft tissues and lesions.This paper combined a clustering algorithm,fuzzy c-means (FCM),with the two-phase level set (CV) model:an appropriate filter is first chosen to denoise the medical image,the fuzzy c-means algorithm then produces a prior model of the image,and an improved traditional CV model refines the segmentation.Experiments show that the model copes with high image noise and weak boundaries,effectively avoids re-initialization,is more sensitive to edges,improves segmentation accuracy,suppresses noise effectively,and significantly reduces the number of iterations and the running time,so it has practical application value.
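The fuzzy c-means step that produces the prior model can be sketched with the standard FCM update equations (alternating center and membership updates). This is the textbook algorithm on toy 2-D points, not the paper's image-specific variant:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means on row-vector samples X (n x d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]          # center update
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)                # membership update
    return U, centers

rng = np.random.default_rng(6)
# Two well-separated toy "tissue" clusters.
X = np.vstack([rng.normal(0, 0.2, (30, 2)), rng.normal(4, 0.2, (30, 2))])
U, centers = fcm(X, c=2)
labels = U.argmax(axis=1)
```

In the paper's pipeline, the membership map `U` computed on pixel intensities plays the role of the prior that initializes the level set evolution.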
Research on Image Mosaic Technology
CHEN Zhi-ang, XU Xiao-gang and XU Guan-lei
Computer Science. 2015, 42 (Z11): 160-161. 
Abstract PDF(272KB) ( 719 )   
References | RelatedCitation | Metrics
With growing demands for visual information,image mosaic technology has become a research hotspot in the graphics domain.Image mosaicking stitches multiple images with overlapping regions into a single panorama.Compared with a single image,it provides images of larger size that can show more content at the same time.
Improved Band Selection Method for Hyperspectral Imagery
REN Xiao-dong, LEI Wu-hu, GU Yu and ZHAO Qing-song
Computer Science. 2015, 42 (Z11): 162-165. 
Abstract PDF(1189KB) ( 598 )   
References | RelatedCitation | Metrics
Based on the principles of band selection,this paper proposed a new band selection method,named ABO,which combines subspace partitioning,the band selection method based on matrix mode (BSMM),and the optimum index factor (OIF).In ABO,all bands are first divided into different subspaces according to the correlation between bands.Then the index "W" is obtained with BSMM,and the band with the maximal "W" is selected in each subspace.After that,the OIF is calculated for each three-band combination of the selected bands.Finally,the method was tested on real hyperspectral remote sensing imagery,and the three selected bands were used for RGB synthesis,HSV transformation and RX anomaly detection.The results show that the proposed method achieves the best effect.
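The OIF used for scoring three-band combinations has a standard definition: the sum of the three bands' standard deviations divided by the sum of the absolute correlations of the three band pairs. A minimal sketch on synthetic bands (this is the common OIF formula; the paper's exact variant may differ):

```python
import numpy as np
from itertools import combinations

def oif(bands):
    """Optimum Index Factor of a 3-band combination.

    OIF = (sum of band standard deviations) /
          (sum of |correlation| over the three band pairs).
    """
    stds = sum(b.std() for b in bands)
    corr = sum(abs(np.corrcoef(a.ravel(), b.ravel())[0, 1])
               for a, b in combinations(bands, 2))
    return stds / corr

rng = np.random.default_rng(3)
b1 = rng.standard_normal(1000)
b2 = b1 + 0.1 * rng.standard_normal(1000)   # highly correlated with b1
b3 = rng.standard_normal(1000)              # nearly independent of b1
b4 = rng.standard_normal(1000)              # nearly independent of both
v_decorrelated = oif([b1, b3, b4])
v_correlated = oif([b1, b2, b1 + 0.1 * rng.standard_normal(1000)])
```

Decorrelated triples score higher, which is why maximizing OIF favors informative, non-redundant band combinations.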
Region Growing in Color Image Based on Quaternion Multiplication Vector Product Properties
WANG Jian-wei and LI Xing-min
Computer Science. 2015, 42 (Z11): 166-168. 
Abstract PDF(494KB) ( 651 )   
References | RelatedCitation | Metrics
Traditional region growing methods are designed for gray-scale images:they are insensitive to color information,and their growth criteria are limited.To address these problems,a new region growing algorithm was proposed that works in the LUV color space and bases its growth criterion on the vector product properties of quaternion multiplication,so that it can be applied to color images with complicated backgrounds.The experimental results demonstrate that the proposed method can accurately segment regions and objects.
Analysis of Grounding Grid Corrosion Grade Based on Extraction of Color and Texture
DU Jing-yi and LIU Wen-hui
Computer Science. 2015, 42 (Z11): 169-172. 
Abstract PDF(619KB) ( 462 )   
References | RelatedCitation | Metrics
Carbon steel,which is widely used as the grounding grid material of substations in China,corrodes easily,and human judgment of the corrosion grade carries large errors.This paper therefore proposed a method that uses image processing to extract color and texture features.Firstly,nonlinear recombination of luminance and chrominance enhances the corrosion image in the luminance/chrominance color space.Secondly,the main features of the image,including color,texture and gradient magnitude,are measured using local homogeneity,Gabor filters and color spaces.Then the artificial bee colony algorithm finds the best seeds and the best similarity values of the image,and seeded region growing segments the image into small corroded areas.Finally,the similarity of these small corroded areas is measured,completing the analysis of corrosion grades.
Improved Moving Target Detection Algorithm Based on Gaussian Mixture Model
WANG Si-si and REN Shi-qing
Computer Science. 2015, 42 (Z11): 173-174. 
Abstract PDF(511KB) ( 427 )   
References | RelatedCitation | Metrics
Moving object detection is the basis of tracking and behavior analysis.Its central difficulty is eliminating background and noise interference while separating moving targets from the image.The Gaussian mixture model method is widely used in object detection:it tolerates small-amplitude background motion well and extracts relatively complete moving targets,but it handles noise poorly and suppresses shadows ineffectively.To make up for these deficiencies,an improved moving target detection algorithm based on the Gaussian mixture model was proposed:the original Gaussian mixture model is combined with four-frame differencing,exploiting the frame-difference method's adaptability to illumination change and its algorithmic simplicity.The experimental results indicate that the proposed method can effectively eliminate noise and shadow in complex environments and improve the accuracy and completeness of moving target detection.
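The frame-differencing half of the combination can be illustrated with the simpler three-frame variant: a pixel is flagged as moving only if it differs from both the previous and the next frame. This is a simplified relative of the paper's four-frame scheme, shown on a synthetic moving square:

```python
import numpy as np

def motion_mask(f_prev, f_cur, f_next, thresh=30):
    """Three-frame differencing: a pixel is 'moving' when it differs from
    both the previous and the next frame by more than thresh."""
    d1 = np.abs(f_cur.astype(int) - f_prev.astype(int)) > thresh
    d2 = np.abs(f_next.astype(int) - f_cur.astype(int)) > thresh
    return d1 & d2

def frame_with_square(col):
    f = np.full((16, 16), 50, dtype=np.uint8)   # flat background
    f[6:10, col:col + 4] = 200                  # bright 4x4 moving square
    return f

# Square at columns 0, 6 and 12 in three consecutive frames.
mask = motion_mask(frame_with_square(0), frame_with_square(6),
                   frame_with_square(12))
```

When the displacements do not overlap, the AND of the two differences isolates exactly the object's position in the middle frame; in the paper this mask is fused with the GMM foreground to reject noise and shadows.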
Fast Kernel Subspace Face Recognition Algorithm Based on Neural Network
WANG Jian, ZHANG Yuan-yuan and CHAI Yan-mei
Computer Science. 2015, 42 (Z11): 175-178. 
Abstract PDF(350KB) ( 545 )   
References | RelatedCitation | Metrics
Existing kernel subspace face recognition algorithms involve a large amount of computation and are slow.This paper presented a fast kernel subspace face recognition algorithm based on a neural network model,which uses the neurons of the hidden layer to reduce the basis representation of the kernel feature subspace and thereby speeds up recognition.We built neural network approximation models for the KPCA and KFDA kernel subspace face recognition algorithms and evaluated them on the ORL,UMIST and YALE databases.The experimental results show that when the number of hidden-layer neurons is set to half of the training samples or fewer,the fast kernel subspace algorithm based on the neural network achieves a recognition rate similar or even equal to that of the kernel subspace algorithm,while the recognition time is reduced to 50% or less at a comparable recognition rate.
UAV Remote Sensing Image Registration Based on Improved SIFT Algorithm
REN Wei-jian, WANG Zi-wei and KANG Chao-hai
Computer Science. 2015, 42 (Z11): 179-182. 
Abstract PDF(878KB) ( 544 )   
References | RelatedCitation | Metrics
This study replaces the convolution of the image with second-order Gaussian derivative templates by integral-image subtraction with box filters.At the same time,the dimension of the feature descriptors is reduced by introducing the SURF operator,which lowers the computational complexity of the SIFT algorithm and the image matching time,addressing the strict real-time requirement of UAV remote sensing image matching.The simulation shows that the improved SIFT algorithm accelerates the computation while keeping the same robustness and matching rate as the original algorithm.
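The integral-image trick that replaces the convolution is standard: after one pass to build a summed-area table, the sum of any rectangle costs four look-ups, so a box-filter response is O(1) per pixel regardless of filter size. A minimal sketch:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero top row/left column for easy indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    return ii

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] via four table look-ups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16).reshape(4, 4)
ii = integral_image(img)
total = box_sum(ii, 0, 0, 4, 4)      # whole image
inner = box_sum(ii, 1, 1, 3, 3)      # central 2x2 patch
```

SURF-style detectors approximate each Gaussian second-derivative lobe with a few such box sums, which is where the speed-up over per-pixel Gaussian convolution comes from.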
Foggy and Hazy Image Enhancement Algorithm Based on Retinex in Fuzzy Field
JIA Wei, LIU Yan-bin, LIU Wei, GE Geng-yu, SU Wen-li and FAN Li-lue
Computer Science. 2015, 42 (Z11): 183-188. 
Abstract PDF(1277KB) ( 482 )   
References | RelatedCitation | Metrics
The traditional Retinex algorithm tends to produce a "halo" effect when enhancing foggy and hazy images,and it performs poorly in terms of image exposure,sharpness,detail and fidelity.To overcome these shortcomings,we proposed a foggy and hazy image enhancement algorithm based on Retinex in the fuzzy field.Firstly,the original image is partitioned into several blocks with the proposed adaptive multi-threshold algorithm,and the optimal crossover points of the blocked regions are computed.Secondly,a novel linear membership function maps the image pixel values into the fuzzy domain;the correlation parameter of the fuzzy hyperbolic tangent function is computed from the crossover points,the Retinex algorithm performs nonlinear image enhancement,and the fuzzy hyperbolic tangent functions adjust the enhancement result.Finally,linear superposition maps the enhanced result back into the original image domain.The experimental results show that the "halo" effect is suppressed,that the proposed method performs better in image sharpness,detail and fidelity,and that it is more widely applicable.
Color Image Signature Algorithm Based on Visual Cortex Neuron Model
KOU Guang-jie, MA Yun-yan, YUE Jun and ZOU Hai-lin
Computer Science. 2015, 42 (Z11): 189-191. 
Abstract PDF(616KB) ( 481 )   
References | RelatedCitation | Metrics
After studying the color image signature problem,a triple-channel spiking cortical model (TSCM) was proposed based on the working principles of the mammalian visual cortex neuron model.TSCM can effectively extract the invariant characteristics of a color image.On the one hand,TSCM has the properties of an ordinary pulse-coupled neural network,such as invariance to translation,rotation and scale;on the other hand,it is more robust in the presence of noise,and the algorithm is more compact and efficient.Finally,the effectiveness of TSCM was demonstrated by experimental and simulation results.
Three Dimensional Airway Trees Segmentation Algorithm Based on Region Growing Method
LI Yan-bo and YU Xiang
Computer Science. 2015, 42 (Z11): 192-194. 
Abstract PDF(237KB) ( 730 )   
References | RelatedCitation | Metrics
Virtual bronchoscopy (VB) plays an important role in the evaluation of chest diseases,but airway tree segmentation,its key enabling technology,suffers from inaccuracy and leakage.Therefore,an airway tree segmentation algorithm based on region growing was proposed.Firstly,region growing extracts the main branch;secondly,the sub-branches are extracted and pseudo-branches are deleted according to a quality evaluation function.The experimental results show that this robust airway segmentation method can extract a complete lung airway tree,down to fifth-order bronchi,simply,robustly and effectively,solving the bronchial rupture and segmentation leakage problems.
Research on Segmentation Methods in Breast Computer-aided Detection
SHEN Kun-xiao, LAN Yi-hua, LU Yu-ling, SHANG Nai-li and MA Xiao-pu
Computer Science. 2015, 42 (Z11): 195-198. 
Abstract PDF(732KB) ( 630 )   
References | RelatedCitation | Metrics
Breast cancer,one of the common malignant tumors,remains a leading cause of cancer death among women.Early diagnosis and treatment is an efficient way to reduce its morbidity,and computer-aided diagnosis (CAD) can improve the efficiency and accuracy of diagnosis.This paper provided a brief review of breast mass segmentation,further analyzed and compared the advantages and performance of the reviewed methods,and finally summarized some techniques used to improve segmentation accuracy.
Research on Moving Objects Detection in Video Sequences Based on Grabcut-Gaussian Mixture Model
SHENG Jia-chuan and YANG Wei
Computer Science. 2015, 42 (Z11): 199-202. 
Abstract PDF(600KB) ( 646 )   
References | RelatedCitation | Metrics
To detect moving objects accurately and rapidly in video sequences,this paper proposed a novel G-GMM method for automatic detection that combines GMM and Grabcut techniques from image processing.Firstly,the algorithm uses GMM (Gaussian mixture model) based background subtraction to produce a binary image for every moving object and constructs its minimum bounding rectangle.It then initializes the image information of each bounding rectangle via Grabcut.Finally,an iterative algorithm with foreground parameters optimizes the object segmentation,yielding the moving object contour.Experimental results indicate that the proposed method achieves good accuracy and robustness in still-camera outdoor video surveillance,providing promising detection results for both rigid and non-rigid objects.
Recognition of Impurities Based on their Distinguishing Feature in Mushrooms
XU Zhen-chi, JI Lei, LIU Xiao-rong and ZHOU Xiao-jia
Computer Science. 2015, 42 (Z11): 203-205. 
Abstract PDF(599KB) ( 499 )   
References | RelatedCitation | Metrics
In order to achieve automatic recognition of hair impurities in the edible mushroom industry,an image segmentation method based on the distinguishing features of impurities in mushrooms was proposed in this paper.The algorithm segments the impurity image through image normalization,Hessian-matrix back projection and thresholding,combining the Hessian gray-scale characteristics with the Lab color space.Experimental results show that it performs well even under non-uniform lighting conditions,with a recognition rate of up to 99.6%,so it can be used in industrial production.
Spectral Clustering Algorithm and its Application in Image Segmentation
XIAO Xiao, SHI Hui and KONG Fan-zhi
Computer Science. 2015, 42 (Z11): 206-208. 
Abstract PDF(564KB) ( 695 )   
References | RelatedCitation | Metrics
An improved spectral clustering algorithm was proposed in this paper.Firstly,spectral clustering based on the idea of paths is introduced.Then,to reduce the sensitivity of traditional spectral clustering to the scale parameter of the Gaussian function,a new algorithm is put forward and extended to the semi-supervised setting.Finally,the algorithm was applied to image segmentation experiments,and its effectiveness was demonstrated.
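The spectral step that all such variants share can be sketched in numpy: build a Gaussian affinity matrix, normalize it, and embed each point by the top eigenvectors. This is only the textbook embedding (the paper's path-based affinity and semi-supervised constraints are not reproduced); the final grouping would normally be k-means on the embedding rows.

```python
import numpy as np

def spectral_embed(X, sigma, k):
    """Embed points via the top-k eigenvectors of the normalized affinity."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))        # Gaussian affinity
    np.fill_diagonal(W, 0.0)
    D = W.sum(1)
    L = W / np.sqrt(np.outer(D, D))           # D^{-1/2} W D^{-1/2}
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, -k:]                          # top-k eigenvectors
    return U / np.linalg.norm(U, axis=1, keepdims=True)

# Two well-separated blobs: embedding rows coincide within a cluster.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
U = spectral_embed(X, sigma=1.0, k=2)
```

For well-separated data the row-normalized embedding collapses each cluster to (nearly) a single point, which is what makes the subsequent clustering trivial; the sensitivity the paper targets appears when `sigma` is badly chosen relative to the data scale.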
Application of SURF Feature and Preprocessing RANSAC Algorithm in Face Recognition
JIANG Ling-zhi
Computer Science. 2015, 42 (Z11): 209-212. 
Abstract PDF(589KB) ( 748 )   
References | RelatedCitation | Metrics
For the face recognition problem,a fast recognition method based on SURF features was proposed.Firstly,SURF feature points are extracted from the preprocessed face image,and nearest-neighbor matching is used for coarse matching of feature points.Secondly,the coarsely matched feature points are processed by the K-means clustering algorithm to filter out obviously inappropriate matches.Then the RANSAC algorithm precisely matches the filtered feature points,achieving accurate matching of face feature points for recognition.The experimental results show that the proposed method is suitable for fast matching of face images on mobile phone terminals and has strong robustness and practical value.
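The RANSAC idea in the final matching step can be shown on the simplest model, a 2-D line: repeatedly fit a minimal sample, count inliers, keep the best hypothesis. This is a generic RANSAC sketch (the paper estimates a geometric transform between point matches rather than a line):

```python
import numpy as np

def ransac_line(pts, n_iter=200, tol=0.1, seed=0):
    """Fit y = a*x + b by RANSAC: fit 2 random points per iteration,
    keep the model with the most inliers."""
    rng = np.random.default_rng(seed)
    best, best_inliers = None, -1
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), 2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = np.sum(np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < tol)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best, best_inliers

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 50)
good = np.column_stack([x, 2 * x + 1])    # 50 points exactly on y = 2x + 1
bad = rng.uniform(0, 3, (20, 2))          # 20 gross outliers
(a, b), n_in = ransac_line(np.vstack([good, bad]))
```

Even with 20 of 70 points being outliers, the consensus model recovers the underlying line; in the face-matching pipeline the same consensus test discards mismatched feature-point pairs.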
Coverage Control Algorithm Based on Sequential Game in UWSNs
QIAN Ling and ZHAI Yu-qing
Computer Science. 2015, 42 (Z11): 213-217. 
Abstract PDF(460KB) ( 548 )   
References | RelatedCitation | Metrics
Recently,much attention has been paid to marine resources,and applications of underwater wireless sensor networks (UWSNs) in marine monitoring,underwater military defense,navigation and other areas have aroused wide concern.Good coverage control of a UWSN reduces coverage redundancy and optimizes the use of network resources;it also reduces node energy consumption and prolongs node lifetime,so that the network can better accomplish environment perception.This paper proposed a coverage control method based on sequential games to optimize the coverage control of underwater wireless sensor networks,which is expected to reduce the energy consumption of sensor nodes,balance node energy,and prolong the network lifetime.Finally,we verified the effectiveness of the algorithm through simulation experiments on optimizing network coverage and extending the network life cycle.
PAMM:An Optimized Module for Inter-domain Communication Based on Shared Memory
SUN Rui-chen and SUN Lei
Computer Science. 2015, 42 (Z11): 218-221. 
Abstract PDF(413KB) ( 629 )   
References | RelatedCitation | Metrics
The combination of cloud computing platforms and virtualization technologies has brought new communication requirements.Inter-domain communication based on shared memory improves the efficiency of communication between virtual machines running on the same physical machine,but the status switching involved in the memory sharing process limits how far it can be optimized.The new memory sharing model PAMM reduces the number of status switches by adding a management module that aggregates memory pages during memory sharing and reduces the number of hypercall invocations.The experiments show that PAMM can improve the efficiency of shared-memory inter-domain communication.
Study on IEEE 802.15.4 Scheduling Algorithms for Real-time Communication
HU Xian-jun, CHEN Jian-xin, ZHOU Sheng-qiang and CHENG Yi
Computer Science. 2015, 42 (Z11): 222-226. 
Abstract PDF(795KB) ( 412 )   
References | RelatedCitation | Metrics
IEEE 802.15.4 is a media access control protocol for low-power wireless communication that has been widely used in Internet of Things fields such as medical health,industrial control and building automation.To satisfy the real-time requirements of different applications,researchers have proposed many IEEE 802.15.4 scheduling algorithms for real-time communication.This paper classifies,compares and analyzes these algorithms according to bandwidth utilization,delay constraints,energy efficiency and other performance indices,to assist their deployment.In addition,future work is discussed.
Clustering Routing Algorithm for Wireless Sensor Network along River
CHEN Tian-tian, JIANG Bing, XUE Xiao-qing and SHA Ting-ting
Computer Science. 2015, 42 (Z11): 227-230. 
Abstract PDF(620KB) ( 459 )   
References | RelatedCitation | Metrics
Aimed at the long-distance,bilinear topology of wireless sensor networks deployed along rivers,a new algorithm called RECR was presented that effectively alleviates uneven energy consumption.The algorithm constructs clusters according to the node distribution and the distance to the sink node,and selects the cluster head within each cluster according to the energy consumption ratio.Inter-cluster routing is designed around the multi-sink,bilinear topology.Simulation results show that the RECR algorithm significantly balances the energy consumption of the network and prolongs the network lifetime.
Development of Embedded RFID Middleware System for Multilayer Data Processing
LIU Bo-yang, MA Lian-bo, ZHU Yun-long and SHAO Wei-ping
Computer Science. 2015, 42 (Z11): 231-235. 
Abstract PDF(702KB) ( 634 )   
References | RelatedCitation | Metrics
For the mobile application environment of large-scale RFID,this paper studied a middleware architecture and the key information processing technologies for stream data processing and semantic analysis.Building on massive data processing at RFID nodes,the functions of data sensing,event processing and embedded Web services were integrated.To shield heterogeneous hardware resources and resolve the difficulties of multilayer data processing,a resource and task scheduling optimization strategy oriented to embedded application environments was presented,and uniform standards for services and the operating environment were provided for upper-layer applications,which will facilitate the deployment and update of RFID services in the future.
Clustering Algorithm of Sensor Network Based on Node Priority and Interest Data Screening
LI Sheng, LIU Lin-feng and CHEN Hang
Computer Science. 2015, 42 (Z11): 236-241. 
Abstract PDF(748KB) ( 402 )   
References | RelatedCitation | Metrics
A working wireless sensor node is powered by its built-in battery,so effective control of energy consumption can greatly prolong the network lifetime and improve utilization.This paper proposed a clustering algorithm based on node priority and interest data screening,applicable to environmental monitoring and similar scenarios.The priority of a node for becoming cluster head is calculated from the spatial distribution of nodes combined with their remaining energy.In each round,the chosen cluster head applies the corresponding data screening method to the data obtained from its child nodes,and the screened data is sent to the base station.Simulation results show that the proposed algorithm outperforms others in energy control and network lifetime.
Throughput-maximized Routing Scheme for Energy Harvesting Wireless Sensor Networks
CHI Kai-kai, DU Wen-jie, LI Yan-jun and CHENG Zhen
Computer Science. 2015, 42 (Z11): 242-244. 
Abstract PDF(323KB) ( 572 )   
References | RelatedCitation | Metrics
Energy harvesting wireless sensor networks (EH-WSNs) can harvest energy from the environment and therefore work indefinitely,which gives them many promising applications.Most available EH-WSN routing schemes focus on selecting energy-efficient routes;few consider route throughput,which is clearly one of the most important performance metrics.This paper first formulated the throughput-maximized routing problem for EH-WSNs,to understand it deeply and theoretically,and then presented a throughput-maximized routing scheme.Simulation results demonstrate that,compared with minimum-hop routing,the proposed scheme greatly improves throughput.
Variety of Conjectures on Group-theoretic Model for Interconnection Networks
SHI Hai-zhong and SHI Yue
Computer Science. 2015, 42 (Z11): 245-246. 
Abstract PDF(221KB) ( 455 )   
References | RelatedCitation | Metrics
Cayley graphs generated by a connected graph were proposed for designing interconnection networks for supercomputers,on-chip interconnects and data center networks.The conjecture is:let G=(V,E) be a connected graph with node set {1,2,…,n}(n>2) and m edges,and write m=2r or m=2r+1;then the Cayley graph generated by G is the union of k(0≤k≤r) edge-disjoint Hamiltonian cycles and m-2k perfect matchings.In particular,for k=r and the star graph,the conjecture was proposed by Hai-zhong Shi in 1998.
Less-conservative Criterion and Robust Fault-tolerant Design for NNCS under Saturation Constraint
CAO Hui-chao and LI Wei
Computer Science. 2015, 42 (Z11): 247-252. 
Abstract PDF(499KB) ( 466 )   
References | RelatedCitation | Metrics
The problem of robust H∞ fault-tolerant design was investigated for uncertain nonlinear networked control systems (NNCS) under actuator saturation constraints and faults.To obtain a less conservative criterion,an appropriate delay-dependent Lyapunov-Krasovskii functional was constructed,and the improved Wirtinger integral inequality,which has a tighter lower bound than the Jensen integral inequality,was used.A less conservative criterion was then derived to guarantee robust H∞ fault-tolerant performance for the closed-loop faulty NNCS with actuator saturation,and a design method for the robust fault-tolerant controller was given by solving LMIs.Finally,a numerical example verified the effectiveness of the proposed method,and its reduced conservativeness was demonstrated by comparing the maximum allowable delay and the minimum disturbance attenuation with some existing results.In addition,no new decision variables are introduced apart from the matrix variables in the L-K functional,so the conclusion has low computational complexity.
Research on Reliability-aware Relay Strategy in Data Link
YANG Guang and ZENG Bin
Computer Science. 2015, 42 (Z11): 253-257. 
Abstract PDF(770KB) ( 436 )   
References | RelatedCitation | Metrics
The flood relay strategy used in tactical data links (TDL) to broadcast information is pure flooding,which is inefficient:it improves delivery reliability through redundancy,but incurs superfluous flooding cost.The restricting conditions of the TDL broadcast relay algorithm were analyzed,and a new relay strategy,RA-MPR,was proposed that optimizes the choice of relay nodes based on node reliability and adjacency while meeting the constraints on line of sight,capacity,delay and equipment.Through algorithm analysis and simulation of RA-MPR,its performance under different network scales and delivery radii was discussed.
Spectrum Allocation and Power Control Based on Harmony Search Algorithm in Cognitive Radio Network
YANG Jin-song, ZENG Bi-qing and HU Pian-pian
Computer Science. 2015, 42 (Z11): 258-262. 
Abstract PDF(415KB) ( 465 )   
References | RelatedCitation | Metrics
In cognitive radio networks,the power control and spectrum allocation processes interfere with each other.To address this problem,we proposed a joint power control and spectrum allocation algorithm based on harmony search.After comprehensively analyzing the constraints and interactions in the spectrum allocation and power control processes,a system model that jointly handles power control and spectrum allocation is established.A coding scheme for the harmony search algorithm was designed according to the characteristics of the system model,and the algorithm uses multi-objective optimization,assigning appropriate priority weights to feasible solutions,to handle the model's complex constraints.Simulation experiments show that the algorithm solves the spectrum allocation and power control problems in cognitive radio networks very well.
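The harmony search metaheuristic at the core of the paper can be sketched in its standard form: keep a memory of candidate solutions, compose a new "harmony" coordinate-by-coordinate from memory (with occasional pitch adjustment) or at random, and replace the worst member when the new one is better. This is the textbook algorithm minimizing a toy sphere function, not the paper's multi-objective, constrained variant.

```python
import numpy as np

def harmony_search(f, lo, hi, dim, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                   iters=2000, seed=0):
    """Minimal harmony search minimizing f over the box [lo, hi]^dim.

    hms: harmony memory size; hmcr: memory-considering rate;
    par: pitch-adjusting rate; bw: pitch-adjust bandwidth (fraction of range).
    """
    rng = np.random.default_rng(seed)
    mem = rng.uniform(lo, hi, (hms, dim))
    cost = np.array([f(x) for x in mem])
    for _ in range(iters):
        new = np.empty(dim)
        for d in range(dim):
            if rng.random() < hmcr:                    # draw from memory
                new[d] = mem[rng.integers(hms), d]
                if rng.random() < par:                 # pitch adjustment
                    new[d] += bw * (hi - lo) * rng.uniform(-1, 1)
            else:                                      # random exploration
                new[d] = rng.uniform(lo, hi)
        new = np.clip(new, lo, hi)
        c = f(new)
        worst = cost.argmax()
        if c < cost[worst]:                            # replace worst harmony
            mem[worst], cost[worst] = new, c
    return mem[cost.argmin()], float(cost.min())

sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = harmony_search(sphere, -5.0, 5.0, dim=3)
```

In the paper's setting, each harmony would encode a joint channel-assignment and power vector, with constraint violations penalized through the priority weights mentioned in the abstract.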
Link-quality and Energy Aware High-rate Multicast Scheme for Energy Harvesting Wireless Sensor Networks
CHI Kai-kai, DAI Zhi-quan, LI Yan-jun and CHENG Zhen
Computer Science. 2015, 42 (Z11): 263-267. 
Since energy harvesting wireless sensor networks (EH-WSNs) can in principle work forever, they have many promising applications. The available one-hop multicast schemes for EH-WSNs do not take the dynamic nature of link quality into account and waste energy through energy-storage overflow, etc. This paper presented an energy-efficient one-hop multicast scheme with a high packet delivery rate. The scheme integrates erasure coding and considers node energy, energy harvesting rate and current link quality to estimate the expected number of correctly received packets of the following data block. A sensor only receives a block whose expected number of correctly received packets is at least a given threshold, or a block whose non-reception would cause energy overflow, which effectively reduces both the probability of energy overflow and that of decoding failures due to imperfect link quality. Simulation results demonstrate that, compared with the available scheme, the proposed scheme greatly improves the one-hop multicast packet delivery rate of EH-WSNs.
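The quantity driving the threshold rule above can be illustrated with a simple model. Assuming independent per-packet losses with delivery probability equal to the link quality (an assumption of this sketch, not a claim about the paper's channel model), the expected received count and the erasure-decoding success probability for an (n, k) coded block are:

```python
from math import comb

def expected_received(n, p):
    """Expected number of correctly received packets when n coded packets
    are sent over a link with per-packet delivery probability p."""
    return n * p

def decode_probability(n, k, p):
    """Probability that at least k of n erasure-coded packets arrive,
    i.e. that the original k-packet block can be decoded (binomial model)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Toy usage: 8 source packets expanded to 12 coded packets, link quality 0.8.
e = expected_received(12, 0.8)
q = decode_probability(12, 8, 0.8)
```

A receiver comparing `e` against a threshold captures the paper's "expected number of correctly received packets" criterion in this simplified model.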
Analysis of Genre Intertextuality on Social Network User Behavior
WAN Ya-ping, YANG Xiao-hua, LIU Zhi-ming, LI Zhi and ZHANG Juan
Computer Science. 2015, 42 (Z11): 268-272. 
A social network is a social-relationship network service built on users' common interests and hobbies. Social networks contain a large amount of user behavior, and analyzing this behavior is very significant for enhancing user experience, increasing user stickiness, and improving resource-sharing rates and the effectiveness of other services. Generic intertextuality is a basic concept in linguistics that shares some common characteristics with user behavior. The results show that social network user behavior is similar to generic intertextuality, and that studying user behavior through generic intertextuality is more conducive to information sharing, information dissemination and knowledge communication.
High-throughput and Collision-free Medium Access Control for Wireless Nanosensor Networks
CHI Kai-kai, LIN Yi-min and LI Yan-jun
Computer Science. 2015, 42 (Z11): 273-276. 
Wireless nanosensor networks (WNSNs) are a new type of sensor network with very important and promising applications. Considering the severely limited computation capability of nanosensors, a TS-OOK-based medium access control (MAC) mechanism with very low computational complexity was previously proposed. To eliminate the defects of TS-OOK-based MAC, namely possible continuous bit collisions and low throughput, this paper presented three improved TS-OOK-based MAC mechanisms: period-fixed and bandwidth-fair MAC, period-doubling and bandwidth-fair MAC, and priority-aware MAC. All three mechanisms avoid collisions through a few exchanges of control frames between access nodes and the relay node. Performance evaluation shows that the enhanced MAC mechanisms achieve higher throughput and lower transmission delay.
Improved DV-Hop Algorithm Based on Mobile Anchor Nodes
FENG You-bing, MA Yan and WEI Yu-ting
Computer Science. 2015, 42 (Z11): 277-279. 
DV-Hop is a typical range-free positioning algorithm. To address its low positioning accuracy, an improved algorithm based on a mobile anchor node was proposed. The mobile anchor node is used to form multiple virtual anchor nodes, which effectively reduces the number of physical anchor nodes required. Meanwhile, the average hop distance of the original algorithm is modified to bring it closer to its real value. Simulation results show that the improved algorithm outperforms the traditional DV-Hop algorithm: the average positioning error is reduced by about 30% and the location accuracy is greatly improved.
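For context, the average-hop-distance step of classic DV-Hop (the quantity the improved algorithm corrects) can be sketched as follows. This is the textbook formulation, not the paper's mobile-anchor refinement, and the coordinates are illustrative.

```python
from math import dist

def hop_size(anchor, others, hops):
    """Classic DV-Hop average hop size at `anchor`: total straight-line
    distance to the other anchors divided by the total hop count to them."""
    total_d = sum(dist(anchor, o) for o in others)
    return total_d / sum(hops)

def estimate_distance(hsize, hops_to_anchor):
    """An unknown node estimates its distance to an anchor as
    hop size multiplied by hop count."""
    return hsize * hops_to_anchor

# Toy usage: anchors on a line, roughly 10 m per hop.
a = (0.0, 0.0)
others = [(30.0, 0.0), (50.0, 0.0)]
hops = [3, 5]                      # hops from `a` to each other anchor
hs = hop_size(a, others, hops)     # (30 + 50) / (3 + 5) = 10.0
d = estimate_distance(hs, 4)       # estimated distance for a node 4 hops away
```

The paper's mobile anchor generates several such virtual anchor positions over time, so the same estimate can be formed with fewer physical anchors.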
Saving Energy Strategy by Combining Node Transmission Range and Network Coding (TRNC) for Wireless Sensor Network
TIAN Xian-zhong, YANG Sheng and XU Wei
Computer Science. 2015, 42 (Z11): 280-284. 
In a wireless sensor network, the sink's neighbor nodes must withstand an enormous traffic load when forwarding data and run out of battery power very quickly. This zone is prone to the energy-hole problem and becomes a communication bottleneck for the whole network. To save energy in the bottleneck zone, this paper proposed the TRNC strategy. It properly adjusts the transmission range of nodes in the bottleneck zone, based on an analysis of the impact of transmission range on energy consumption, and at the same time combines this with network coding. Theoretical analysis and numerical simulation results show that the strategy effectively improves energy utilization efficiency and reduces energy consumption.
Master and Slave Mechanism Based Route for Reliable Communications in Cognitive Radio Ad Hoc Networks
ZHAO Qian, FENG Guang-sheng and ZHENG Chen
Computer Science. 2015, 42 (Z11): 285-288. 
To handle node and link failures in cognitive radio ad hoc networks, a novel master-and-slave-mechanism-based route for reliable communication between cognitive users, named MSMRC, was presented in this paper. The route takes primary-user activity patterns into account, and both link reliable time and channel reliable time are introduced in the design of the routing mechanism. When a route fails, existing communications between cognitive users can be restored quickly through pre-configured slave routes. Experiments show that the proposed MSMRC method significantly reduces the average overhead and improves the packet delivery ratio and link repair rate, ensuring the quality of end-to-end network communication services.
Research on Wireless Networks Mobility Management and OPNET Simulation Based on SIP
CHEN Bin, MA Da-wei, YIN Cai-hua and JIANG Xue-yin
Computer Science. 2015, 42 (Z11): 289-291. 
To address the problem of mobility management, this paper briefly introduced the characteristics of SIP and its support for terminal mobility. We discussed in detail how to use OPNET to establish a model of the SIP handoff process, and verified SIP's support for mobility by analyzing the simulation results.
Localization Scheme for Mobile Tags Based on RFID in Internet of Things
GUO Ping and XIE Lei
Computer Science. 2015, 42 (Z11): 292-295. 
Among existing localization systems using RFID technology, there is a lack of systems that localize mobile tags with heterogeneous mobile readers in a distributed manner, and positioning accuracy is low. We proposed the LSMT-RFID system for localizing mobile RFID tags with a group of ad hoc heterogeneous mobile RFID readers. Mobile readers cooperate with each other through time-constrained interleaving processes; readers in a neighborhood share interrogation information, estimate tag locations accordingly, and employ both proactive and reactive protocols to ensure timely dissemination of location information. Simulation experiments based on ns-3 were set up to evaluate the positioning capability of the scheme, especially the average localization error and positioning delay. The results show that the proposed scheme achieves effective positioning.
Detecting Most Influential Nodes in Complex Networks by KSN Algorithm
TIAN Yan and LIU Zu-gen
Computer Science. 2015, 42 (Z11): 296-300. 
Detecting influential spreaders in networks accurately and efficiently is very important in both theory and practice, and scholars from various fields have recently paid attention to the problem of ranking nodes. The K-shell index is a relatively powerful indicator for estimating the spreading ability of nodes; however, because it considers only the attributes of the node itself, K-shell decomposition is limited in accuracy and universal applicability. To solve this problem, this paper proposed a novel algorithm called KSN (K-shell and neighborhood centrality), which estimates the spreading influence of a node from its own K-shell value and the K-shell indexes of its nearest and next-nearest neighbors. Experimental results demonstrate that the proposed algorithm detects the most influential nodes more precisely than degree centrality, betweenness centrality, K-shell decomposition, mixed degree decomposition, etc.
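The ingredients of such a score can be sketched in a few lines: a standard K-shell decomposition, plus a neighborhood-aware score that sums the shell indexes of a node's first- and second-hop neighbors. The weighting below is one plausible reading of "K-shell and neighborhood centrality"; the paper's exact formula may differ.

```python
def k_shell(adj):
    """K-shell decomposition of an undirected graph given as an adjacency
    dict {node: set(neighbors)}. Returns {node: shell index}."""
    adj = {u: set(vs) for u, vs in adj.items()}   # work on a copy
    shell, k = {}, 0
    while adj:
        k += 1
        while True:                               # peel nodes of degree <= k
            peel = [u for u, vs in adj.items() if len(vs) <= k]
            if not peel:
                break
            for u in peel:
                shell[u] = k
                for v in adj[u]:
                    adj[v].discard(u)
                del adj[u]
    return shell

def ksn_score(adj, shell, v):
    """Influence estimate for v: its own shell index plus the shell indexes
    of its nearest and next-nearest neighbors (illustrative weighting)."""
    near = set(adj[v])
    nxt = set().union(*(adj[u] for u in near)) - near - {v}
    return shell[v] + sum(shell[u] for u in near) + sum(shell[u] for u in nxt)

# Toy graph: a triangle a-b-c with a chain a-d-e hanging off it.
adj = {'a': {'b', 'c', 'd'}, 'b': {'a', 'c'}, 'c': {'a', 'b'},
       'd': {'a', 'e'}, 'e': {'d'}}
shell = k_shell(adj)
```

On this toy graph the core node `a` scores higher than the peripheral node `e`, matching the intuition that neighborhood information refines the raw shell index.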
Signal Boundary Characteristic Matching Extension Based on Bilateral Filtering
CAO Xiao-chu, JIN Di, LU Yin-tao, WANG Zong-ren and WANG Qi-di
Computer Science. 2015, 42 (Z11): 301-304. 
The end effects of empirical mode decomposition degrade the quality of signal decomposition. To overcome this, a method of signal boundary characteristic matching extension based on bilateral filtering was proposed. It effectively restrains the influence of the end effect and improves the accuracy of the signal components, since both the inherent laws of the source signal and the differences between local data are considered in the boundary waveform extension. Simulation and seismic data analysis demonstrate that it is an effective method for signal decomposition.
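The bilateral filter at the heart of the method weights neighbors by both sample distance and amplitude difference, so it smooths noise while preserving sharp transitions. A minimal 1-D sketch (parameter values illustrative, not taken from the paper):

```python
from math import exp

def bilateral_filter_1d(signal, radius=3, sigma_d=2.0, sigma_r=0.5):
    """1-D bilateral filter: each sample becomes a weighted average of its
    neighbors, with weights falling off with both index distance (sigma_d)
    and amplitude difference (sigma_r), preserving edges while smoothing."""
    out, n = [], len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = exp(-((i - j) ** 2) / (2 * sigma_d ** 2)
                    - ((signal[i] - signal[j]) ** 2) / (2 * sigma_r ** 2))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

# Toy usage: a noisy step keeps its sharp jump after filtering.
step = [0.0, 0.1, -0.1, 0.05, 1.0, 0.95, 1.1, 1.0]
smooth = bilateral_filter_1d(step)
```

This edge-preserving behavior is what makes the filter suitable for matching boundary characteristics without smearing the signal ends.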
Location Scheme of Mobile Users Based on Imprecise Anchors in RFID Systems
REN Li and HUANG Qing
Computer Science. 2015, 42 (Z11): 305-309. 
To accurately position a mobile user, most current RFID-based localization algorithms need anchor assistance, yet it is usually difficult to find or deploy enough anchor nodes for accurate localization. To solve this problem, a location scheme for mobile users based on imprecise anchors was proposed. Firstly, a large number of tags with approximate locations are used as anchor nodes to compute users' locations; this requires no additional precise anchor nodes and thus avoids significant deployment cost. Then, two time-efficient algorithms were proposed to accurately locate the moving user: a category-cardinality-based protocol and an RSSI-based protocol. Experimental results indicate that these solutions can accurately locate mobile users in real time, and the accuracy of the improved method is more than 30% better than that of the base solution.
On Packet Detection Algorithm of G3-PLC Specification with Narrow-band Powerline Noise Interference
ZHANG Yan-yu
Computer Science. 2015, 42 (Z11): 310-312. 
When exploiting orthogonal frequency division multiplexing technology based on the G3 specification in the smart grid, synchronization must be solved for the G3-PLC system in the harsh powerline environment. With background noise and impulsive interference in the powerline channel, the classic delay-and-correlate packet detection algorithm suffers serious errors. According to the characteristics of powerline noise, the classic packet detection algorithm was improved to increase detection accuracy. Evaluation of the classic and new algorithms shows that the new algorithm is more accurate than the classic one under the powerline noise model.
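The classic delay-and-correlate detector referred to above exploits a repeated preamble: the received signal is correlated against a delayed copy of itself, and the normalized metric peaks when the window sits on the repetition. A minimal real-valued sketch (the sample values are illustrative; the paper's improvement for impulsive noise is not reproduced):

```python
from math import sqrt

def delay_and_correlate(samples, delay, window):
    """Delay-and-correlate preamble detection: correlate the samples with
    themselves shifted by `delay` (the preamble repetition period) over a
    sliding window, normalized so a perfect repetition scores 1.0."""
    metrics = []
    for n in range(len(samples) - delay - window + 1):
        c = sum(samples[n + k] * samples[n + k + delay] for k in range(window))
        e1 = sum(samples[n + k] ** 2 for k in range(window))
        e2 = sum(samples[n + k + delay] ** 2 for k in range(window))
        metrics.append(c / sqrt(e1 * e2) if e1 and e2 else 0.0)
    return metrics

# A 4-sample preamble symbol repeated twice, followed by other data.
rx = [1, -1, 1, 1, 1, -1, 1, 1, 0.3, -0.7, 0.2, 0.1]
m = delay_and_correlate(rx, delay=4, window=4)
```

Impulsive powerline noise corrupts individual products in the correlation sum, which is why the plain metric degrades and motivates the paper's noise-aware modification.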
Survey on Distinction between Flash Crowd and DDoS Attacks
LUO Kai, LUO Jun-yong, YIN Mei-juan, LIU Yan and GAO Li-zheng
Computer Science. 2015, 42 (Z11): 313-316. 
DDoS attacks on Web servers closely resemble flash crowds, and distinguishing between the two has become a new research focus in network security. This paper began with an overview of the concept and taxonomy of flash crowds and compared the similarities and differences between flash crowds and DDoS attacks. We then detailed the three popular families of methods for distinguishing them: methods based on traffic characteristics, methods based on user behavior, and methods based on host testing. In addition, this paper introduced several popular datasets and finally predicted future research trends.
Construction of Heterogeneous Computing Platforms and Sensitive Data Protection Based on Domestic X86 Processors
ZENG Zhiping, XIAO Haidong and ZHANG Xinpeng
Computer Science. 2015, 42 (Z11): 317-322. 
With the growing need for sensitive data protection in the era of big data, how to handle big data sets in a safe and controllable hardware and software environment has become a hot research topic. This paper designed a safe and controlled big data platform based on domestic X86 processors, which secures massive sensitive data using the AES (Advanced Encryption Standard) algorithm. In addition, we constructed a GPU heterogeneous computing environment to fully improve the computational efficiency of the domestic big data platform, providing a new solution for the safe handling of massive data. Experimental results show that GPU heterogeneous computing platforms based on the domestic X86 processor (ZHAOXIN) can effectively meet the needs of big data processing. The AES algorithm adapted to the heterogeneous computing environment enhances encryption efficiency, gaining a speedup of 22 to 23 times. When dealing with massive data (GB scale and above), the parallel processing capability and acceleration effect of the domestic heterogeneous computing platform are very clear. These results on massive sensitive data processing and information security have important application value.
Method of Topology Transformation for Mimic Network
ZHAO Liang, ZHANG Xiao-hui, ZOU Hong and ZHANG Peng
Computer Science. 2015, 42 (Z11): 323-328. 
According to mimic network construction requirements, we proposed a method of topological equivalent transformation. To ensure the equivalence of the transformation, we first proposed a method for describing networks abstractly. Based on this description, the basic idea of topology transformation for mimic networks, built on local-network equivalent transformation, was proposed, and mathematical modeling and theoretical analysis of mimic network topological transformation were carried out. To make the method concrete, this paper designed an algorithm for constructing simple equivalent subnets from the available resources and presented its implementation. Finally, the characteristics of the method were analyzed and future research directions were given.
Research on Privacy Protection Based on SEAndroid
WEN Han-xiang, LI Yu-jun and HOU Meng-shu
Computer Science. 2015, 42 (Z11): 329-332. 
With the rapid development of mobile applications, the number of Android phone users has increased sharply, and the growing volume of user data has made the Android system a main target of malicious attackers. We analyzed and researched the SELinux mechanism added in Android 4.4 and pointed out the possibility of refining the restrictions on root permissions. Based on this mechanism, we put forward a design that strengthens privacy so as to protect private data even when the data are stored on a phone on which root permissions have been obtained.
Identity-based Hierarchy Group Key Management of Space Network
JIANG Zi-hui and LEI Feng-yu
Computer Science. 2015, 42 (Z11): 333-340. 
With the gradual deepening of information construction and the rapid development of space technology, mobile communication technology and network technology, spatial information systems are accelerating toward networking. Potential applications of the space information network have attracted growing concern, so its safety requirements are getting higher and higher. This paper proposed an identity-based group key management program (ID-GKM) for the entire space network, which uses a hierarchical, grouped group key management scheme. In addition to common group key generation, distribution and updating, it also considers the updating of partial private keys: using the identity-based encryption mechanism proposed by Boneh and Franklin, we designed a private key update mechanism for the space network. The program adapts to the hierarchical structure of the space network and meets its requirements of strong scalability and high reliability. In addition, we considered the difference between ground terminal nodes and nodes in space. The scheme uses batch updating, a combination of regular updating and queue-based updating, and a proxy re-encryption group key management scheme solves the issue that users must be online when the group key is updated.
Research on DoS Attacks Against Control Level in OpenFlow-based SDN
LOU Heng-yue and DOU Jun
Computer Science. 2015, 42 (Z11): 341-344. 
Under the OpenFlow protocol's message exchange mechanism, packets that do not match a flow entry must be uploaded to the controller in PACKET_IN messages. Based on this, a new DoS attack on the control plane was described: continuously forwarding packets with unknown addresses to deplete control-plane resources. A solution strategy was then proposed to detect such attacks and reduce network latency using the programmability of SDN. First, through the SDN controller's northbound application interface, the Defense4All application is used to detect malicious traffic by the characteristics of DoS attacks. Then, using the controller's ability to reconfigure dynamically, switch configuration files are updated in real time and the network forwarding policy is changed, reducing the damage the attack causes to the entire network. Simulation shows that the detection success rate of this method is close to 100%, although for slow, low-resource attacks it falls below 80%; overall network latency is reduced by 10 ms or more. The proposed strategy can effectively reduce the interference of DoS attacks against the control level for the entire network.
High-security Architecture on Independent Core Component
SHAO Jing, YIN Hong-wu, CHEN Zuo-ning and YU Ting
Computer Science. 2015, 42 (Z11): 345-347. 
Building a high-security architecture is an important precondition for a high-security information system. The core components of trusted computing architectures and virtualization architectures may be modified or bypassed. To address this risk, a high-security architecture on an independent core component (HAICC) was proposed. The architecture realizes strong isolation of security and computing functions in hardware: the system is divided into a secure server sub-system and a targeted computing sub-system, which occupy different physical resources. The former implements active measurement, runtime monitoring and key data recovery for the whole computing sub-system. An attack instance and security analysis show that HAICC reduces the risk of modification and bypass of the core security component and enhances the integrity of the security mechanisms.
Digital Forensic Investigation in Cloud Storage
DONG Zhen-xing, ZHANG Qing and CHEN Long
Computer Science. 2015, 42 (Z11): 348-351. 
Nowadays, many users utilize cloud storage services to store or share their data. At the same time, there are an increasing number of illegal cases involving the preservation of illegal information or the theft of a company's confidential data through cloud storage services, so collecting crucial evidence from cloud storage reliably and completely has become an urgent problem. Taking the 360 cloud storage service as an example, this paper analyzed the pattern of residual data left after accessing cloud storage through the browser and/or client software, and then presented a forensic analysis method to identify user behaviors. The timeline of a user's actions is reconstructed by combining logs and historical data remnants, so that the user behaviors related to the cloud storage service are profiled clearly. These ideas and methods can be applied to other cloud storage services in current use.
Differential Fault Attack and Analysis of Improvement on LEX
LI Jia-yu, SHI Hui, DENG Yuan-qing, GONG Jing and GUAN Yu
Computer Science. 2015, 42 (Z11): 352-356. 
A method of differential fault attack on LEX was analyzed. To enhance the ability of the stream cipher LEX to resist differential fault attacks, a new version of LEX was proposed based on the idea of XORing every RoundKey with a 128-bit random stream. The safety and speed of the improved LEX algorithm were then analyzed, and an example was performed to test the randomness of the improved algorithm's key stream. The results show that the improved algorithm resists differential fault attack while retaining the computing speed and randomness of the original LEX, and is thus an improvement of LEX.
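The masking idea itself, XORing each 128-bit round key with a random stream, is simple to illustrate. The sketch below shows only this building block, not the paper's full modified LEX cipher; the function name is illustrative.

```python
import secrets

def mask_round_key(round_key: bytes, random_stream: bytes) -> bytes:
    """XOR a 128-bit round key with a 128-bit random stream (the masking
    idea described above; not the full modified LEX algorithm)."""
    assert len(round_key) == 16 and len(random_stream) == 16
    return bytes(a ^ b for a, b in zip(round_key, random_stream))

# XOR masking is an involution: applying the same stream twice
# recovers the original round key.
rk = bytes(range(16))
stream = secrets.token_bytes(16)
masked = mask_round_key(rk, stream)
```

Because the mask randomizes each round key, a fault injected into one round no longer yields a usable differential against the fixed key schedule.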
Risk-controllable Common Elastic Mobile Cloud Computing Framework
LI Xin-guo, LI Peng-wei, FU Jian-ming and DING Xiao-yi
Computer Science. 2015, 42 (Z11): 357-363. 
Elastic mobile cloud computing (EMCC) enables mobile devices to seamlessly and transparently use cloud resources to augment their capability by moving part of their execution tasks to the cloud on demand. Based on a summary of existing EMCC programs, a common EMCC implementation framework was first built. We then pointed out that executing EMCC applications may lead to privacy leakage and information-flow hijacking, and proposed an EMCC framework in which security risks are treated as costs of EMCC, so that the framework can ensure that using EMCC benefits the mobile device user. Since the major difficulties in implementing this framework are risk quantification and the annotation of security-sensitive modules, a risk quantification module was designed and a tool that automatically annotates security-sensitive methods was implemented. The validity of this tool was proved by experiments.
BlindLock: An Effective Pattern Lock System Against Smudge Attack
WU Ji-jie, CAO Tian-jie and ZHAI Jing-xuan
Computer Science. 2015, 42 (Z11): 364-367. 
Recently, a growing number of mobile devices use a pattern lock as the identity authentication mechanism. To unlock a smartphone, a user must draw a memorized graphical pattern with a finger on the touchscreen, where the finger leaves oily residues, also called smudges. Adversaries can exploit the smudges to reproduce the secret pattern, so the user's privacy is easily revealed. Based on research into existing pattern locks, we presented BlindLock as our main result. BlindLock not only allows unlocking inside a pocket, but also uses the cover principle to resist smudge attacks and the theory of visual occlusion to resist shoulder-surfing attacks. Our user study shows that BlindLock significantly improves the security, usability and password space of the pattern lock system while incurring minimal extra unlocking time and keeping the original graphical memorability.
Analytic Hierarchy Process-based Assessment Method on Mobile Payment Security
SHAN Mei-jing
Computer Science. 2015, 42 (Z11): 368-371. 
In the big data age, security is the greatest concern in mobile payment. Compared with Internet payment, mobile payment faces many particular and difficult problems. A security assessment index system was built up through the Delphi method, the weight of each index was calculated with the analytic hierarchy process, and the top risk node and the security value of the whole system were obtained. Based on the designed index system, the fuzzy comprehensive analysis method was introduced for the security evaluation of mobile payment. The experiment illustrates that this index system can effectively quantify and assess the security level of mobile payment.
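The AHP weighting step can be sketched generically: experts fill a reciprocal pairwise-comparison matrix, and index weights are derived from it, here via the geometric-mean approximation of the principal eigenvector. The matrix below is a made-up consistent example, not the paper's index system.

```python
from math import prod

def ahp_weights(pairwise):
    """Derive index weights from an AHP pairwise-comparison matrix using
    the geometric-mean (approximate eigenvector) method. pairwise[i][j]
    states how much more important index i is than index j."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]   # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Toy usage: index A is twice as important as B and four times C; B twice C.
pairwise = [[1, 2, 4],
            [0.5, 1, 2],
            [0.25, 0.5, 1]]
w = ahp_weights(pairwise)   # weights proportional to 4 : 2 : 1
```

In practice a consistency ratio check would follow before the weights feed the fuzzy comprehensive evaluation.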
Study on Safety Status and Coping Strategies for Big Data
SANG Yun-chang
Computer Science. 2015, 42 (Z11): 372-373. 
Data contain valuable information, but data security is facing serious challenges. Based on an analysis of the basic characteristics of big data, we outlined the current security challenges big data faces, and described data safety countermeasures from four aspects: monitoring mechanisms, prevention and detection, response level, and processing capability.
Method for Text Watermarking Based on Subject-verb Encoding
LI Gui-sen, CHEN Jian-ping, MA Hai-ying and YANG Fang-xing
Computer Science. 2015, 42 (Z11): 374-377. 
Text watermarking protects the copyright of text works by embedding copyright information (the watermark) into a text. This paper proposed a text watermarking technique in which the watermark is embedded by encoding the subject-verbs of the sentences in a text. A watermark message is converted into a string of hexadecimal Unicode codes. With the help of the Language Technology Platform (LTP) of Harbin Institute of Technology, a series of processes is applied to the text to obtain its subject-verbs, and each subject-verb is encoded with one piece of the Unicode string, which achieves the embedding of the watermark. When extracting the watermark, the subject-verbs are obtained from the detected text and decoded according to the codebook generated during embedding; the corresponding pieces of the Unicode string are taken out of the codebook, put together in the correct order, and converted back into the original characters to recover the embedded watermark message. The proposed algorithm has good concealment and can resist various watermark attacks.
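The conversion between the watermark message and its hexadecimal Unicode representation is the only step of the scheme that is fully specified by the abstract, and can be sketched as below (assuming 4-hex-digit codes, i.e. characters in the Basic Multilingual Plane; the subject-verb codebook itself is specific to the paper and not reproduced).

```python
def message_to_unicode_hex(message: str) -> str:
    """Convert a watermark message into a string of 4-digit hexadecimal
    Unicode code points (assumes BMP characters only)."""
    return "".join(f"{ord(ch):04x}" for ch in message)

def unicode_hex_to_message(hex_string: str) -> str:
    """Inverse conversion used at extraction time: read the string back
    in 4-digit pieces and restore the original characters."""
    return "".join(chr(int(hex_string[i:i + 4], 16))
                   for i in range(0, len(hex_string), 4))

# Round trip of a short copyright notice.
mark = "©2015"
code = message_to_unicode_hex(mark)
```

Pieces of `code` would then be assigned to successive subject-verbs via the codebook during embedding.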
Detection Method and Performance Analysis of Network Attacks Based on Cloud Model
XIE Li-chun and ZHANG Chun-qin
Computer Science. 2015, 42 (Z11): 378-380. 
To effectively determine whether network packets are under attack, a new detection algorithm, DMCM (Detection Method based on Cloud Model), was proposed. Firstly, a state indicator for each packet is defined in terms of the discreteness and deviation of the packet's properties. Then, the process of computing the distribution of the standard deviation is presented based on the cloud model, and this distribution is used to determine the anomalous states of packets. Finally, comprehensive experiments were conducted to study the performance of the DMCM algorithm through simulation using OPNET and MATLAB. Experimental results show that the proposed algorithm outperforms other algorithms in terms of adaptability.
Method for Data Storing Based on Coding Vector Encryption in WSNs
ZHANG Wei, SHI Xing-yan and CUI Mao-qi
Computer Science. 2015, 42 (Z11): 381-385. 
Wireless sensor networks (WSNs) provide an effective solution for environmental monitoring in precision agriculture. Considering the shortcomings of WSNs in agricultural production, especially in data transmission security and storage efficiency, this paper first summarized the main features of agricultural WSNs and analyzed the benefits of introducing cloud storage techniques. Then, analysis of data aggregation in the sensing-data transmission process showed the importance of an optimized network coding technique under limited bandwidth. Finally, a new method for the secure storage of agricultural WSN sensor data based on coding vector encryption was proposed. Field testing results show that the method is rational and practical: it achieves high encryption security based on network coding and a significant enhancement of the data transmission process.
Improved Design for Trust Model on Domain Entity of Computing Grid
YANG Zhang-wei, WANG Li-ping and LAI Wen-ping
Computer Science. 2015, 42 (Z11): 386-389. 
The traditional trust model of computing grids based on trusted domains is less secure, since it uses the number of entities as the only parameter in the trust value calculation. This paper proposed an improved autonomous entity trust model based on the existing computing grid model, introduced time decay and a penalty factor, and established direct and recommendation trust models of the GSP for the user. Simulation results show that the proposed model has a higher defense capability against whitewashing attacks.
Advances in Biological Sequence Alignment Parallel Processing Based on Heterogeneous Systems
ZHU Xiang-yuan, LI Ren-fa, LI Ken-li and HU Zhong-wang
Computer Science. 2015, 42 (Z11): 390-395. 
Sequence alignment is a fundamental operation in bioinformatics. In recent years there have been extensive studies and rapid progress in parallel processing technologies for biological sequence alignment, owing to its wide use, high computational complexity and large-scale data. This paper first analyzed new developments in high performance computing for sequence alignment, then classified up-to-date research by the architectures applied, and compared and analyzed their implementations and performance in detail. It was pointed out that problems such as memory access control, synchronization, data transfer and algorithm scalability are the key techniques in the study of parallel biological sequence alignment. Finally, some future research directions were given.
Improved DBSCAN Algorithm Based on MapReduce
LAI Li-ping, NIE Rui-hua, WANG Jiang-ping and HUANG Jia-hong
Computer Science. 2015, 42 (Z11): 396-399. 
To address DBSCAN's problems with the Eps parameter and with the efficiency of processing massive data sets, this article put forward a new algorithm called OPDBSCAN. It uses overlapping partitions to obtain a local Eps, reducing the effect of a global Eps, and then uses MapReduce to cluster in parallel to improve efficiency. Experiments show that OPDBSCAN clusters faster and better.
Scheduling Data Sensitive Workflow in Hybrid Cloud
FAN Jing, SHEN Jie and XIONG Li-rong
Computer Science. 2015, 42 (Z11): 400-405. 
Using public resources to extend the capacity of a private cloud is an effective way for enterprises to achieve efficiency and elasticity in data storage and computing. Scheduling a workflow with sensitive data in a hybrid cloud must satisfy requirements on data security and execution deadline. To minimize monetary cost, the scheduler must decide which tasks should run on the public cloud and on which computing resource each workflow task should be allocated. Integer linear programming (ILP) was used to formulate the workflow scheduling problem with three objectives: data sensitivity, deadline and cost. To reduce the solve time of the ILP model, a task assignment filter strategy based on Pareto optimality theory was designed. The filter strategy decreases the scale of task assignments and reduces the number of task-resource mappings in the ILP model. Experiments show that removing unreasonable task assignments before resource allocation decreases the ILP model scale and reduces scheduling time, while still obtaining a good solution.
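The Pareto-based filtering idea can be sketched generically: among the candidate assignments for a task, drop every candidate that another candidate dominates on all objectives. The two-objective (cost, completion time) version below is an illustration of the pruning idea, not the paper's full three-objective ILP formulation.

```python
def pareto_filter(assignments):
    """Keep only Pareto-optimal (cost, completion_time) candidates: drop any
    candidate for which some other candidate is no worse on both objectives
    and strictly better on at least one."""
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b
    return [a for a in assignments
            if not any(dominates(b, a) for b in assignments)]

# Toy usage: candidate (cost, time) assignments for one workflow task.
cands = [(10, 5), (8, 7), (12, 4), (11, 6), (9, 9)]
front = pareto_filter(cands)   # the dominated candidates are pruned
```

Only the surviving candidates need binary variables in the ILP, which is what shrinks the model.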
Virtual Machine Placement Algorithm Based on Improved Genetic Algorithm
HUANG Zhao-nian, LI Hai-shan and ZHAO Jun
Computer Science. 2015, 42 (Z11): 406-407. 
Abstract PDF(499KB) ( 478 )   
References | RelatedCitation | Metrics
Reducing network delay and optimizing energy consumption and resource waste in data centers have become increasingly important worldwide. This paper focused on resource waste and network delay in data centers, and modeled virtual machine placement as a multi-objective optimization problem, minimizing both physical machine resource usage and total network delay. Through a double-fitness genetic algorithm (CGA), we optimized the two objectives at the same time. CGA was compared with FFD through simulation experiments, and the results show that CGA performs better and is an efficient virtual machine placement algorithm in the cloud environment.
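A toy sketch of evaluating one placement on the two objectives (all data structures here are hypothetical illustrations, not the paper's representation): resource waste is the unused capacity on occupied physical machines, and network delay is summed over communicating VM pairs.

```python
def double_fitness(placement, pm_capacity, vm_demand, delay):
    """Two-objective fitness sketch for VM placement.

    placement: {vm: pm index}; pm_capacity: {pm: capacity};
    vm_demand: {vm: demand}; delay: matrix of PM-to-PM delays.
    All structures are hypothetical illustrations.
    """
    used = {}
    for vm, pm in placement.items():
        used[pm] = used.get(pm, 0) + vm_demand[vm]
    waste = sum(pm_capacity[pm] - u for pm, u in used.items())
    total_delay = sum(delay[placement[a]][placement[b]]
                      for a in placement for b in placement if a < b)
    return waste, total_delay

# Both VMs on the same machine: 3 units of capacity wasted, zero delay.
waste, delay_sum = double_fitness({"v1": 0, "v2": 0}, {0: 10},
                                  {"v1": 3, "v2": 4}, [[0]])
```

A genetic algorithm would then select chromosomes (placements) on both fitness values at once, e.g. via Pareto ranking or a weighted sum.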
Study on Strategy of Replica Selection in Cloud Storage Environment
ZHANG Cui-ping, GUO Zhen-zhou and GONG Chang-qing
Computer Science. 2015, 42 (Z11): 408-412. 
Abstract PDF(431KB) ( 641 )   
References | RelatedCitation | Metrics
In order to meet the needs of various users for cloud storage, cloud storage service providers generally divide data into fixed-size blocks and use redundancy backup technology to store them. Research on the storage mechanisms of block placement, best replica selection and replica size has therefore always been a hot spot in improving the transmission speed of large files. According to the heterogeneity of storage nodes in a cloud storage system, an improved strategy was proposed which uses AHP to weight the indexes of node performance, and uses the weighted indexes to improve the particle swarm optimization algorithm (AHPPSO). By introducing a weighted evaluation matrix associated with the performance of the storage nodes, the PSO evolves toward nodes of high comprehensive performance, improving the data transmission speed without increasing the cost of storage space. The strategy was realized in a self-built cloud storage system, and the experimental results show that this strategy can adapt to the various needs of users, and achieves system load balancing to a certain extent.
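The AHP weighting step can be sketched as follows (the criteria and the pairwise judgments are hypothetical, not the paper's): weights are approximated from a pairwise comparison matrix by averaging its normalised columns, a standard stand-in for the principal eigenvector.

```python
def ahp_weights(pairwise):
    """AHP sketch: approximate the principal eigenvector of a pairwise
    comparison matrix by averaging its normalised columns."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    return [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical node-performance criteria: bandwidth vs. load vs. latency,
# with bandwidth judged moderately more important than the others.
w = ahp_weights([[1, 3, 5],
                 [1 / 3, 1, 3],
                 [1 / 5, 1 / 3, 1]])
```

The resulting weight vector would then score each storage node's performance indexes and bias the PSO toward high-scoring nodes.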
Optimal Task-level Scheduling Based on Multimedia Applications in Cloud
GUO Ya-qiong and SONG Jian-xin
Computer Science. 2015, 42 (Z11): 413-416. 
Abstract PDF(352KB) ( 526 )   
References | RelatedCitation | Metrics
As an emerging computing paradigm, cloud computing has been increasingly used in multimedia applications. Because of the diversity and heterogeneity of multimedia services, how to effectively schedule multimedia tasks onto multiple virtual machines for processing has become a fundamental challenge for application providers. We therefore studied the task-level scheduling problem for cloud-based multimedia applications. Firstly, we introduced a directed acyclic graph to model the precedence constraints and dependencies among tasks in the hybrid structure. Based on the model, we studied the optimal task scheduling problem for the sequential, the parallel, and the mixed structures. Moreover, we combined the task nodes on the critical path according to the cost of limited resources. Lastly, we proposed a heuristic method to perform near-optimal task scheduling in a practical way. Experimental results demonstrate that the proposed scheduling scheme can optimally assign tasks to virtual machines to minimize the execution time.
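The critical path mentioned above is the longest chain through the task DAG; a minimal sketch (task encoding hypothetical) shows why it lower-bounds the makespan of any schedule:

```python
def critical_path(durations, preds):
    """Longest-path sketch over a task DAG.

    durations: {task: duration}; preds: {task: [predecessor tasks]}.
    The critical-path length lower-bounds the makespan of any schedule,
    no matter how many VMs run tasks in parallel.
    """
    memo = {}
    def finish(t):
        if t not in memo:
            memo[t] = durations[t] + max((finish(p) for p in preds.get(t, [])),
                                         default=0)
        return memo[t]
    return max(finish(t) for t in durations)

tasks = {"a": 2, "b": 3, "c": 1, "d": 4}
deps = {"c": ["a", "b"], "d": ["c"]}
print(critical_path(tasks, deps))  # b -> c -> d = 3 + 1 + 4 = 8
```

Merging nodes along this path, as the paper does, shortens the chain that dominates the execution time.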
Research on Dynamic Web Service Integration System in Cloud Computing
LIU Fei and HAO Feng-jie
Computer Science. 2015, 42 (Z11): 417-420. 
Abstract PDF(693KB) ( 506 )   
References | RelatedCitation | Metrics
As an emerging business computing model, cloud computing distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to their demand. In this paper, we described what cloud computing is, summed up key techniques used in cloud computing such as virtual databases, Web services, programming models and ESB technology, and then illustrated an example of a dynamic Web service integration system. Experimental results show that the proposed system architecture can reduce the building cost significantly as well as improve the system performance remarkably.
Analysis and Comparison of Open Source IaaS Platforms
LEI Qing
Computer Science. 2015, 42 (Z11): 421-424. 
Abstract PDF(665KB) ( 1260 )   
References | RelatedCitation | Metrics
As a new computing model, cloud computing has gradually evolved from the initial theoretical stage into a common development of theories, techniques and applications, in which research and development on open-source cloud projects play an important role. So far, many academic institutions and commercial enterprises have considered open-source cloud platforms an economical and effective solution for building public or private clouds for information management, information services and technology research. In this paper, we presented a comparative study of four open-source IaaS platforms. We discussed in detail their architecture, resource abstraction and control layer, cloud service management, security, the scale of their communities and so on. We hope that this paper can provide a clear guide for evaluating the possible adoption of cloud technology.
Method of Workflow Bi-directional Scheduling in Cloud Computing Environment
ZHANG Pei-yun and FENG Qi
Computer Science. 2015, 42 (Z11): 425-430. 
Abstract PDF(502KB) ( 455 )   
References | RelatedCitation | Metrics
To reduce the time and cost of workflow scheduling in cloud computing, we proposed a bi-directional scheduling algorithm including two sub-algorithms, Backward and Forward. Firstly, the Backward algorithm produces a schedule according to the deadline-derived latest start time of each task. Then, to reduce the cost of scheduling, the Forward algorithm schedules each task as early as possible. In the process of forward scheduling, the algorithm takes the deadline, the maximum cost and the transmission time into consideration and achieves dynamic scheduling. The experimental results show that our algorithm outperforms the BDA and ICPCP algorithms, with lower rental cost and higher scheduling flexibility.
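The Backward pass can be pictured with a small sketch (the task and graph encodings are hypothetical): latest start times are propagated back from the deadline through each task's successors.

```python
def latest_starts(durations, succs, deadline):
    """Backward-pass sketch: the latest time each task may start so that
    every successor chain still finishes by the deadline.

    durations: {task: duration}; succs: {task: [successor tasks]}.
    """
    memo = {}
    def ls(t):
        if t not in memo:
            latest_finish = min((ls(s) for s in succs.get(t, [])),
                                default=deadline)
            memo[t] = latest_finish - durations[t]
        return memo[t]
    return {t: ls(t) for t in durations}

# a must finish before b starts; with deadline 10, b may start at 7, a at 5.
starts = latest_starts({"a": 2, "b": 3}, {"a": ["b"]}, 10)
```

A forward pass would then pull tasks earlier than these latest starts wherever a cheaper resource slot is available.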
Review of Algorithm Visualization Systems:A Learner Perspective
LI Xiao-hong, LIU Cong and LUO Jia-wei
Computer Science. 2015, 42 (Z11): 431-437. 
Abstract PDF(1683KB) ( 546 )   
References | RelatedCitation | Metrics
Assisting students to understand algorithms is a challenging task in computer science education. Since algorithms are often complex topics, algorithm visualization is a useful aid for understanding how algorithms work. It visualizes the behavior of an algorithm by producing an abstraction of both the data and the operations of the algorithm. Many algorithm visualization systems have been developed over the years, and the widely accepted taxonomies of Price and Karavirta classify the features and characteristics of different visualization systems. However, Price's taxonomy is complex, and covers many irrelevant visualization systems which may confuse readers. This paper gave a new, easy-to-understand taxonomy of traditional algorithm visualization systems from the learners' view. We also summarized the history and current situation of algorithm visualization systems and discussed their future.
XML Schema Features Extraction Algorithm
LIU Ke, YANG Hong-li, ZHAO Rui-fang, LIAO Hu-sheng, CHEN Yao and QIN Sheng-chao
Computer Science. 2015, 42 (Z11): 438-443. 
Abstract PDF(451KB) ( 539 )   
References | RelatedCitation | Metrics
Twig pattern minimization, an important aspect of XML query optimization, usually needs to take advantage of the constraints of an XML Schema during the minimization process; these constraints are called Schema features. To simplify the traditional methods of extracting Schema features, and to ensure the accuracy of the extraction process, we proposed a model-checking algorithm to extract XML Schema features automatically. Based on a formal model of an XML Schema, we used an extended CTL formula to express Schema features, and proposed an algorithm to check whether an XML Schema model satisfies a CTL formula. Thanks to the extension of CTL formulas, the proposed algorithm can not only express the prior Schema features, but also represent parent, ancestor and other backward features. Finally, we implemented an aid tool for our method.
Automatic Detection Method of Cross-browser Web Application
WANG Huan-huan, WU Yi-jian and ZHAO Wen-yun
Computer Science. 2015, 42 (Z11): 444-449. 
Abstract PDF(1281KB) ( 892 )   
References | RelatedCitation | Metrics
With Web applications being used more and more widely, their stability has received increasing attention. One of the most important issues is compatibility across different browsers. To make sure that an application can be used in all browsers, it is very important to detect browser compatibility in the development stage. This paper proposed a new automated technique to detect browser compatibility problems in the development stage, which can automatically browse all the pages of a Web application. After analyzing the extracted structure code and related properties, we produce a report of cross-browser compatibility problems for the developers, helping them solve these issues faster. This paper implemented the method and applied it to a specific development project to collect the relevant data and demonstrate the feasibility of the method. In the end, this paper analyzed common Web application compatibility issues.
Automated Test Case Generation Based on SPEA2+SDE
TAN Xin, PENG Yao-peng, YANG Shuai and ZHENG Wei
Computer Science. 2015, 42 (Z11): 450-453. 
Abstract PDF(314KB) ( 730 )   
References | RelatedCitation | Metrics
Software testing is crucial to ensure software quality. However, its complexity and cost increase greatly with the growing variety of software structures and functionality. Automated test case generation aims at reducing this high cost as well as improving the reliability of the test results. This paper mainly discussed the technology of automated test case generation based on evolutionary algorithms. Comparing the testing efficiency of different algorithms on several classic programs shows that SPEA2+SDE performs best among all the algorithms in generating test cases automatically. Finally, we used the Kruskal-Wallis test to analyze the results, showing that the conclusion above is general and reliable.
T-Minicore:A Time Predictable Embedded Operating System
LI Xiao-fei, CHEN Xiang-lan, LIU Jie and LI Xi
Computer Science. 2015, 42 (Z11): 454-459. 
Abstract PDF(1056KB) ( 658 )   
References | RelatedCitation | Metrics
Evidence to date indicates that there is no settled definition of time predictability in academia. Most research on time predictability focuses on architecture and programming languages. In this paper, referring to the more widely recognized definition of time predictability, granularity division was applied to time predictable systems, and T-Minicore, a service-grained operating system based on the servant/exe-flow model, was proposed. T-Minicore meets the demand for time predictability in the LET (Logical Execution Time) model. Its time predictability in the communication module was justified by theory, and experiments illustrate that applications running on the T-Minicore operating system are time predictable.
Research on Verification and Validation of Agent-based Simulation Models
YI Wen-ying and LI Bo
Computer Science. 2015, 42 (Z11): 460-463. 
Abstract PDF(468KB) ( 1367 )   
References | RelatedCitation | Metrics
We first reviewed the development of verification and validation of simulation models using the agent-based paradigm. Then a process for verification and validation of agent-based simulation models was proposed, which combines face validation, sensitivity analysis, calibration and operational validation. Finally, the verification and validation of EEPSS (economic and environmental policy simulation system) was taken as an example to briefly illustrate the suggested process.
Method of Modeling Software Evolution Confirmation Based on LDA
HAN Jun-ming and WANG Wei
Computer Science. 2015, 42 (Z11): 464-466. 
Abstract PDF(600KB) ( 532 )   
References | RelatedCitation | Metrics
Evolution is an important part of the software life cycle. Much software has by now evolved through several versions; however, how to confirm that evolved software coincides with the aim of its evolution has become a problem that calls for an immediate solution. Because there is no systematic method so far, we adopted LDA topic modeling to analyze models for evolution confirmation. LDA can model features in the software source code, and through the model the latent topics in the source code can be analyzed. We compared the extracted topics with the published reports of software evolution to find the distinctions between them, and according to these distinctions it can be confirmed whether the software evolution satisfies its purpose.
Improvement of General ABMS Model Representation Based on FLAME
YAN Yi-shi and LI Bo
Computer Science. 2015, 42 (Z11): 467-472. 
Abstract PDF(471KB) ( 647 )   
References | RelatedCitation | Metrics
Model representation is one of the most important issues in agent-based modeling. Research on a general ABMS model representation can lower the threshold of using ABMS tools, which plays a significant role in interdisciplinary research. As a leader among existing ABMS platforms, FLAME not only offers high completeness, but also has outstanding advantages in representation, code generation, visualization, etc. In this context, FLAME and XML were selected as the basis of a general model representation. While inheriting their advantages of conciseness, completeness and parallelization, a deep study was carried out to find potential weaknesses and improvement possibilities, together with targeted solutions. Finally, an agent model example was tested to verify the feasibility of the solutions.
Modeling and Validation of Capability Requirement Process Based on Activity Diagram
LIU Da-wei, WANG Zhi-xue and YU Ming-gang
Computer Science. 2015, 42 (Z11): 473-478. 
Abstract PDF(850KB) ( 468 )   
References | RelatedCitation | Metrics
Current descriptions of C4ISR system capability requirements are mostly based on static models such as documents and static diagrams, and lack definitions of the specific operations on information and data, which leads to a lack of detailed explanation of the behavior between objects. Since a capability model that lacks executable dynamic semantics is unexecutable, a modeling approach to the capability requirement process based on activity diagrams was proposed to support the modeling and simulation of executable architectures. Firstly, the definition of the system process model was presented, and under the guidance of the C4ISR capability metamodel, a capability requirement process metamodel was built by extending the UML activity diagram. Then ontology was used to describe the semantics of the capability requirement process metamodel, and the validation of the C4ISR capability requirement process metamodel was done through ontology reasoning.
MFI Based Interoperability Measurement of Business Models in Enterprises
LI Zhao, ZHAO Yi, LIANG Peng and HE Ke-qing
Computer Science. 2015, 42 (Z11): 479-485. 
Abstract PDF(1354KB) ( 487 )   
References | RelatedCitation | Metrics
Various enterprise business models exist and are currently used in industry, whilst their definitions, structures, functions and supporting tools differ considerably from each other. For interoperability, even partial semantic interoperability between heterogeneous business models is challenging to achieve. Almost all enterprise business models can be described along four major dimensions: Role, Goal, Process, Service (RGPS); consequently, in this paper an enterprise business model is treated as a specific RGPS model. An approach for measuring the interoperability of business models in enterprises was proposed. At first, the RGPS interoperability features framework was constructed based on the Meta-model Framework of Interoperability (MFI), and it was specialized into the interoperability features set of RGPS models. Secondly, the interoperability features set and a mathematical method were proposed to identify and quantify an RGPS model and its interoperability features, producing the model instance of the RGPS model. Next, we calculated the similarity between two model instances, and obtained the measured interoperability between the corresponding RGPS models, which is used to build the interoperability measurement matrix of an RGPS model set. At last, the approach was applied to quantify the interoperability of typical business models in enterprises belonging to different domains, which facilitates and guides the collaborative interoperability of business models across domains.
Adaptive Software Architecture Framework Model
SU Shi-xiong and QI Jin-ping
Computer Science. 2015, 42 (Z11): 486-489. 
Abstract PDF(359KB) ( 496 )   
References | RelatedCitation | Metrics
To cope with dynamic changes of the network environment and user requirements, this paper proposed a dynamic adaptive software architecture model, and on the basis of the adaptation process, an adaptive system was given. The system obtains a certain self-adaptive ability by adjusting its own behavior. Finally, the model was validated by a simple example, and the results show that the model is able to adapt to a complex network environment.
Effective Power Capping Scheme for Database Server
YANG Liang-huai, RUAN Zhong-xiao, ZHU Hong-yan and WANG Zhou-xin
Computer Science. 2015, 42 (Z11): 490-496. 
Abstract PDF(899KB) ( 673 )   
References | RelatedCitation | Metrics
Power control is a critical issue in data centers, and power capping is the technique that keeps a system within a fixed power constraint. This paper focused on the dynamic power control scheme of a data center node machine. We constructed a process-level power model based on our previous system-level power model, and integrated these two models into a gadget called a soft power-meter to control system power and process power. To achieve power capping, the soft power-meter is integrated into a closed-loop control system, and a power control algorithm is devised, which keeps the system within the fixed power budget with good performance. The experimental results demonstrate that the proposed power capping scheme can effectively control the system's power with small performance degradation, and improves energy efficiency. It can be applied to a power-aware DBMS server.
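A closed-loop capping step can be sketched in a few lines (the proportional gain, frequency limits and units are illustrative assumptions, not the paper's tuned controller): when measured power exceeds the budget the CPU frequency is nudged down, otherwise up.

```python
def power_cap_step(freq_ghz, measured_w, budget_w,
                   gain=0.01, f_min=0.8, f_max=2.4):
    """One step of a proportional power-capping loop (all constants are
    illustrative): adjust frequency toward the power budget and clamp it
    to the hardware's supported range."""
    freq_ghz += gain * (budget_w - measured_w)
    return min(f_max, max(f_min, freq_ghz))

# 50 W over budget: frequency drops from 2.0 GHz toward the cap.
next_freq = power_cap_step(2.0, measured_w=150.0, budget_w=100.0)
```

In the paper's setting the "measured" power would come from the soft power-meter rather than a physical sensor, and the actuator could equally be DVFS states or process throttling.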
Software Maintainability Evaluation Based on Fractal Theory
HAO Xue-liang, ZHU Xiao-dong and LIU Li
Computer Science. 2015, 42 (Z11): 497-499. 
Abstract PDF(344KB) ( 424 )   
References | RelatedCitation | Metrics
Aimed at the software maintainability problem, qualitative and quantitative evaluation were studied, considering the two aspects of structural complexity and software process. Spatial-domain self-similarity among software modules and temporal self-similarity between software development and software maintenance were analyzed. Based on fractal dimension, a qualitative evaluation method for software maintainability was presented to ascertain qualitative maintainability requirements and management. Combined with a top-down maintenance method, a temporal quantitative evaluation method for software maintainability was put forward, and the maintenance workload was validated by a real example of a software system.
Research Based on Observe-Model-Exercise* Paradigm for GUI Testing
SHEN Yi-jun and GAO Jian-hua
Computer Science. 2015, 42 (Z11): 500-503. 
Abstract PDF(564KB) ( 430 )   
References | RelatedCitation | Metrics
Generally, it is hard to determine the input space when testing a graphical user interface. It is also a challenge for automatic testing tools to identify those events which can only be executed after certain conditions are satisfied. To address these problems, one effective solution is to execute the test with the event-flow graph model and the observe-model-exercise* paradigm. In this paradigm, a table is used to maintain the mapping between the model elements, i.e. the nodes and edges of the model, and the event sequences used to reach them, so that the required conditions are known before the execution of the events. The algorithm to maintain the mapping presented by Memon is suitable only for the edges of the model, so we proposed a new algorithm which is suitable for the nodes of the model. The result of the experiment indicates that the conditions required before the execution of the events are successfully recorded with our algorithm.
Detecting Software Error by Using State Transition Model of Variable
ZHANG Guang-mei and LI Jing-xia
Computer Science. 2015, 42 (Z11): 504-507. 
Abstract PDF(332KB) ( 404 )   
References | RelatedCitation | Metrics
Variables are used in a program to implement its function. There are different operations on a variable in a program, and an operation on a variable can change its state. According to the different usages of a variable, its different states were analyzed. First, the safe and unsafe states of normal variables and pointer variables were defined in this paper. Then the rules governing the transitions between different states were defined, after which the state transition model of a variable was provided. By using the state transition model of a variable and the theory of program slicing, a variable's unsafe states can be traced.
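The idea can be sketched as a tiny state machine (the states, operations and transition table are hypothetical simplifications of the paper's model): operations drive a variable through states, and any operation with no legal transition, such as a use before any definition, is flagged as unsafe.

```python
# Simplified transition table: (current state, operation) -> next state.
# A real model would also cover pointer states (null, dangling, freed, ...).
TRANSITIONS = {
    ("declared", "define"): "defined",
    ("defined", "use"): "defined",
    ("defined", "define"): "defined",
}

def check(ops):
    """Replay a variable's operation sequence and collect unsafe steps."""
    state, errors = "declared", []
    for i, op in enumerate(ops):
        nxt = TRANSITIONS.get((state, op))
        if nxt is None:
            errors.append((i, state, op))  # e.g. use-before-define
        else:
            state = nxt
    return errors

print(check(["use", "define", "use"]))  # [(0, 'declared', 'use')]
```

Program slicing would supply the operation sequence for each variable along a given execution path.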
Method of Chinese Characters Retrieval According to Pinyin Initials in Database
Computer Science. 2015, 42 (Z11): 508-509. 
Abstract PDF(137KB) ( 930 )   
References | RelatedCitation | Metrics
Relying on SQL Server collations, and without changing the structure of the queried table, this paper discussed and studied fast methods for retrieving Chinese characters by their pinyin initials in a SQL Server database.
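The classic trick behind such queries relies on GB2312 level-1 characters being ordered by pinyin, so a character's initial can be found by comparing its GBK encoding against boundary characters. A Python sketch of the same idea (the boundary table below is the commonly circulated one and is an assumption here; it only covers GB2312 level-1 characters):

```python
# First character whose pinyin starts with each initial, in GB2312
# level-1 ordering (assumed table; level-2 and rare characters are not
# handled by this method).
INITIALS = [
    ("a", "啊"), ("b", "芭"), ("c", "擦"), ("d", "搭"), ("e", "蛾"),
    ("f", "发"), ("g", "噶"), ("h", "哈"), ("j", "击"), ("k", "喀"),
    ("l", "垃"), ("m", "妈"), ("n", "拿"), ("o", "哦"), ("p", "啪"),
    ("q", "期"), ("r", "然"), ("s", "撒"), ("t", "塌"), ("w", "挖"),
    ("x", "昔"), ("y", "压"), ("z", "匝"),
]

def pinyin_initial(ch):
    """Return the pinyin initial of a GB2312 level-1 character by
    comparing its GBK bytes against the boundary characters."""
    code = ch.encode("gbk")
    result = None
    for letter, boundary in INITIALS:
        if code >= boundary.encode("gbk"):
            result = letter
    return result
```

In T-SQL the same comparisons would be written with a `COLLATE Chinese_PRC_CS_AS_KS_WS`-style clause, which is what lets the query run without altering the table.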
Research of Augmented Reality Based on Mobile Platform
WANG Wei, WANG Zhi-qiang, ZHAO Ji-jun and SHEN Yan-guang
Computer Science. 2015, 42 (Z11): 510-519. 
Abstract PDF(1782KB) ( 492 )   
References | RelatedCitation | Metrics
The current state and progress of mobile augmented reality technology were surveyed. Firstly, the research background was summarized and analyzed, including domestic and international organizations, distinguishing features, and related resources. Then, theories and technologies related to this area were expounded in detail, and the algorithms were summarized, categorized and compared. Finally, the main difficulties and possible future research directions were proposed, together with a forecast of the field's development.
Machine Vision-based Lightweight Driver Assistance System
XU Bang-zhen, TANG Yi-ping and CAI Guo-ning
Computer Science. 2015, 42 (Z11): 520-524. 
Abstract PDF(1431KB) ( 560 )   
References | RelatedCitation | Metrics
This paper proposed a machine vision-based lightweight driver assistance system. Firstly, an adjusted edge-extraction algorithm and a lane line detection algorithm are used to calibrate the intrinsic and extrinsic parameters of the cameras offline. Secondly, a multi-window division method identifying actual distance is applied to the two-dimensional image according to the calibration results, and the windows are divided into regions of different safety factors according to distance, in order to provide prior geometric knowledge for visual detection of the road. Thirdly, when there is an obstacle in a region, the corresponding warning message is displayed to assist the driver, providing a lightweight visual detection platform for intelligent driver assistance systems. In on-board experiments, the proposed system can quickly extract the lane lines on both sides of the vehicle and take advantage of the off-line calibration results to quickly generate alert regions of different safety factors, and both the false positive and false negative detection rates during normal driving in the lane are small and negligible. Compared with conventional driver assistance systems, our proposed method reduces the amount of computation by simplifying the detection process to achieve lightweight lane and vehicle detection, and lays the foundation for implementing the system on embedded platforms.
Improved Mathematical Model for Cooperative Navigation of Multi-AUVs
WANG Wei-ping, YANG Miao and ZHAO Yu-xin
Computer Science. 2015, 42 (Z11): 525-528. 
Abstract PDF(324KB) ( 1007 )   
References | RelatedCitation | Metrics
AUVs represent the future development direction of underwater vehicles. Through information sharing, a cooperative navigation system of multiple AUVs has more advantages than a single-AUV navigation system. This paper analyzed the basic principles of cooperative navigation of multi-AUVs and the advantages and disadvantages of its two kinds of network structure, then proposed an improved structure for cooperative navigation of multi-AUVs, and derived the motion model and measurement model, creating the conditions for subsequent research on cooperative navigation algorithms.
Optimization Design of PID Controller for Spring Damper System Based on Particle Swarm Algorithm
WANG Bo, YAN Jun, HOU Qian-qian, XU Ming-ming and GUO Chun-hui
Computer Science. 2015, 42 (Z11): 529-531. 
Abstract PDF(204KB) ( 493 )   
References | RelatedCitation | Metrics
Spring-damper systems have been widely applied in engineering, and their stability has an important influence on projects. A design method for a PID controller based on the particle swarm algorithm was proposed in this article to solve the difficult problem of PID parameter tuning. MATLAB simulation was used to demonstrate the feasibility and advantages of this approach. Comparing the simulation results with those of the prediction method and the Z-N tuning method, it was shown that tuning the PID parameters with particle swarm optimization can eliminate the impact on the system, making the system more stable and reliable.
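The PSO core used for such tuning can be sketched briefly (the inertia and acceleration coefficients are conventional textbook values, not the paper's, and the quadratic stand-in cost replaces the closed-loop simulation): each particle holds candidate (Kp, Ki, Kd) values and is pulled toward its personal best and the swarm's global best.

```python
import random

def pso(fitness, dim=3, n=20, iters=100, lo=0.0, hi=10.0):
    """Minimal particle swarm sketch (inertia 0.7, c1 = c2 = 1.5):
    positions are candidate parameter vectors, e.g. (Kp, Ki, Kd)."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    gbest_f = min(pbest_f)
    gbest = pbest[pbest_f.index(gbest_f)][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest_f[i], pbest[i] = f, pos[i][:]
                if f < gbest_f:
                    gbest_f, gbest = f, pos[i][:]
    return gbest

# Stand-in cost: in the paper's setting the fitness would be an error
# integral (e.g. ISE/ITAE) of the spring-damper closed-loop step response.
best = pso(lambda k: sum((x - 2.0) ** 2 for x in k))
```

Swapping the stand-in cost for a simulated step response turns this into a PID tuner: lower error integral means better tracking.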
Internet of Things Technology in Application of ETC System
HOU Li-hong and LI Wei-dong
Computer Science. 2015, 42 (Z11): 532-535. 
Abstract PDF(367KB) ( 1817 )   
References | RelatedCitation | Metrics
To illustrate that the Internet of Things has a profound influence on the development of intelligent transportation, we mainly introduced Internet of Things technology in electronic toll collection (ETC) applications. The basic concepts, structure and key technologies of the Internet of Things were explained. The composition, advantages, working principle and process of an electronic toll collection system based on radio frequency identification (RFID) technology were analyzed, and we put forward improvement measures for the problems existing in the system.
Design Doppler Signal Emulator Test Software Platform
YANG Yi-dong, YAO Jin-jie and SU Xin-yan
Computer Science. 2015, 42 (Z11): 536-538. 
Abstract PDF(499KB) ( 513 )   
References | RelatedCitation | Metrics
Under field trial conditions, the traditional projectile speed test has high cost and a long test cycle, and at the same time the measurement system cannot be verified under trial conditions. To address these problems, a Doppler signal simulator based on an intelligent tablet was designed. After introducing the working principle of the hardware, this article described in detail the process and implementation of Doppler signal generation based on a PXI6711 board and high-speed data acquisition based on a PXI5122 board. Test results show that the Doppler signal emulator test software platform has many advantages, such as simple construction and easy operation and upgrading, and can be widely applied to velocity radar measurement and related test platforms.
Soft Switching Control for Uncertain Delta Operator Systems Based on Sliding Mode
LIU Yun-long, WANG Yu-mei, KAO Yong-gui and WANG Wen-cheng
Computer Science. 2015, 42 (Z11): 539-541. 
Abstract PDF(358KB) ( 560 )   
References | RelatedCitation | Metrics
Aiming at high-speed sampling control systems with internal parameter perturbations and external disturbances, a novel Delta operator reaching law, called the S-type variable rate reaching law, was developed based on the sigmoid function, and a soft switching control for uncertain Delta operator systems was proposed based on sliding mode. Soft switching controllers for continuous-time systems and discrete-time systems can be unified into those of Delta operator systems via the principle of Delta operator sampling. The closed-loop high-speed sampling soft switching controller designed via the Delta operator S-type variable rate reaching law can reach the vicinity of the switching plane in finite time and effectively reduce system chattering. The simulation results demonstrate that the stability and steady-state behavior of the proposed controller are better than those of the conventional exponential reaching law method, and the controller has good robustness over the entire dynamic process.
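A plausible form of the soft-switching idea, assuming the conventional exponential reaching law as the baseline (the paper's exact S-type law may differ): the discontinuous sign function, which causes chattering, is replaced by a smooth sigmoid.

```latex
% Conventional Delta-operator exponential reaching law (hard switching):
\delta s(t_k) = -q\, s(t_k) - \varepsilon\, \mathrm{sgn}\big(s(t_k)\big),
\qquad q > 0,\ \varepsilon > 0
% Soft switching: replace sgn(.) with a sigmoid, giving an S-type term
% whose switching rate varies smoothly near the sliding surface s = 0:
\delta s(t_k) = -q\, s(t_k)
 - \varepsilon \left( \frac{2}{1 + e^{-a\, s(t_k)}} - 1 \right),
\qquad a > 0
```

As the slope parameter $a \to \infty$ the sigmoid approaches $\mathrm{sgn}(\cdot)$, recovering the hard-switching law; a finite $a$ trades a small boundary layer around $s=0$ for reduced chattering.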
Workload Scheduling for Minimizing Electricity Cost of Data Center
ZHENG Jian, CAI Ting and DU Xing
Computer Science. 2015, 42 (Z11): 542-543. 
Abstract PDF(233KB) ( 558 )   
References | RelatedCitation | Metrics
In order to reduce both electricity cost and carbon emissions, some data centers have begun to use green energy supplies. However, the fluctuating workload and temporally diverse electricity prices pose challenges to controlling data center electricity cost. To deal with these challenges, this paper presented a workload scheduling algorithm which can minimize the total electricity cost of a data center. First, a model of the total electricity cost was introduced. Then a multi-objective constrained optimization problem for the electricity cost was formulated. Finally, the corresponding workload scheduling policy was obtained by solving the optimization problem. Experimental results show that the proposed algorithm can effectively reduce the total electricity cost.
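The price-following intuition behind such policies can be shown with a greedy toy (the capacity model and units are hypothetical, and a real policy must also respect deadlines and green-energy availability): deferrable load is packed into the cheapest hours first.

```python
def schedule(deferrable_load, prices, capacity):
    """Greedy sketch: place deferrable load into the cheapest hours
    first, up to a per-hour capacity. Returns load allocated per hour."""
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    alloc = [0.0] * len(prices)
    remaining = deferrable_load
    for h in order:
        take = min(capacity, remaining)
        alloc[h] = take
        remaining -= take
        if remaining <= 0:
            break
    return alloc

# 3 units of load, hourly prices [5, 1, 3], at most 2 units per hour:
# the cheapest hour fills first, the overflow goes to the next cheapest.
plan = schedule(3.0, [5.0, 1.0, 3.0], 2.0)
```

The paper's constrained optimization generalizes this by weighing multiple objectives instead of price alone.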
Mobile Subjects Moving Track Capture Mechanism Based on Multi-sensor Data Fusion
BI Chao-guo and XU Li-min
Computer Science. 2015, 42 (Z11): 544-549. 
Abstract PDF(765KB) ( 1251 )   
References | RelatedCitation | Metrics
At present, many fields, such as the military, medical care, science and technology, movies and games, need to capture the moving trajectories of moving objects. Existing mobile trajectory recognition and mapping methods generally have high equipment requirements and complex algorithms, and their real-time performance is not ideal. This paper presented a mobile trajectory capture mechanism based on multi-sensor data fusion, with an intelligent mobile terminal as the carrier, using an acceleration sensor and an attitude sensor to collect data. Through the fusion of the data and the application of the relationship between acceleration and displacement and the relationship between curves and straight lines, accurate identification of the moving body's movement can be acquired. After that, the mobile trajectory can be projected onto a two-dimensional space using the principle of optical perspective projection and mapped onto the screen of the intelligent terminal. Experimental results show that the mechanism has high accuracy and real-time performance with ideal time and space complexity.
USDZQ Optimization Based on Ant Colony Algorithm and Application in ECG Compression
WANG Wei-ping and YANG Miao
Computer Science. 2015, 42 (Z11): 550-553. 
Abstract PDF(313KB) ( 432 )   
References | RelatedCitation | Metrics
This paper adopted an improved uniform quantizer, the uniform scalar dead zone quantizer (USDZQ), to quantize the transformed coefficients. The selection of quantizer parameters directly affects the ECG data compression performance, so the objective of this study was the optimization of the USDZQ parameters, for which we used the ant colony optimization (ACO) algorithm. Experiments on several records from the MIT-BIH arrhythmia database show that, as long as the USDZQ parameters are optimized reasonably, USDZQ can achieve better performance than the uniform quantizer and may successfully be applied to ECG data compression.
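One common form of a uniform scalar dead-zone quantizer can be sketched as follows (the paper's exact parameterisation may differ): coefficients inside the dead zone map to 0, while the rest fall into uniform steps, which is why widening the dead zone zeroes out more small transform coefficients and raises compression.

```python
import math

def usdzq(x, step, dead_zone):
    """Uniform scalar dead-zone quantizer sketch (one common form):
    |x| <= dead_zone maps to level 0; beyond that, levels advance in
    uniform steps of width `step`, with the sign preserved."""
    if abs(x) <= dead_zone:
        return 0
    sign = 1 if x > 0 else -1
    return sign * (math.floor((abs(x) - dead_zone) / step) + 1)

# A small transform coefficient is zeroed out; larger ones keep coarse levels.
levels = [usdzq(x, step=1.0, dead_zone=0.5) for x in (0.3, 1.2, -2.6)]
```

The ACO search in the paper would then tune `step` and `dead_zone` to trade reconstruction error against compressed size on the ECG records.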
Evaluation of Port-logistics Capability Based on Entropy Weight and BP Neural Network
DOU Zhi-wu, LI Hong-wei and XIONG Qi
Computer Science. 2015, 42 (Z11): 554-556. 
Abstract PDF(856KB) ( 790 )   
References | RelatedCitation | Metrics
Improving the comprehensive capability of port logistics is an important and difficult problem, so a method combining entropy weight theory and a BP neural network was applied to its study. Entropy weight theory was used to determine the training samples and expected outputs, and the comprehensive capability of port logistics was then obtained from the trained BP neural network. An instance covering 7 years of logistics data from Hekou port confirms the effectiveness and practical value of the method.
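The entropy weight step assigns each evaluation indicator a weight from how much its values vary across samples: indicators with more dispersion carry more information and get larger weights. A minimal sketch of that calculation (the function name and the assumption of non-negative indicator values are mine, not from the paper):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method. matrix[i][j] is the non-negative value of
    indicator j for sample i. Returns one weight per indicator, summing
    to 1; indicators with greater dispersion receive larger weights."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes entropy into [0, 1]
    entropies = []
    for j in range(n):
        col_sum = sum(row[j] for row in matrix)
        e = 0.0
        for row in matrix:
            p = row[j] / col_sum  # proportion of indicator j in sample i
            if p > 0:
                e -= k * p * math.log(p)
        entropies.append(e)
    # Divergence (1 - entropy) measures information content.
    divergence = [1.0 - e for e in entropies]
    total = sum(divergence)
    return [d / total for d in divergence]
```

The resulting weights would scale the indicators before they are fed to the BP network as training inputs and expected outputs.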
Evaluation Model of Textile Yarn Quality Based on Rough Set Theory
LI Xia, GUO Hao-long, ZHANG Bao-wei and WANG Yong-hua
Computer Science. 2015, 42 (Z11): 557-559. 
Abstract PDF(273KB) ( 518 )   
References | RelatedCitation | Metrics
It is of important practical significance for textile enterprises to evaluate yarn quality objectively. To address the drawback that existing evaluation systems depend excessively on subjective experience, this paper applied rough set theory to yarn quality evaluation, proposed a discretization algorithm for the evaluation indexes, and constructed an evaluation metric function. On this basis, a yarn quality evaluation model based on rough sets was established. Experimental results show that the model is simple and efficient, and provides more objective data to support quality managers' decision-making.
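Rough set analysis requires discrete attribute values, so a continuous evaluation index must first be mapped to ordinal grades. A minimal sketch of one common preprocessing choice, equal-width discretization (the paper's actual discretization algorithm is not specified in the abstract; this function and its name are illustrative assumptions):

```python
def discretize(values, n_bins):
    """Equal-width discretization of a continuous evaluation index into
    n_bins ordinal grades (0 .. n_bins-1), as a rough-set
    preprocessing step."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    grades = []
    for v in values:
        g = int((v - lo) / width) if width > 0 else 0
        grades.append(min(g, n_bins - 1))  # clamp the maximum value
    return grades
```

Once every index is discretized, standard rough-set machinery (indiscernibility classes, attribute reduction) can be applied to the resulting decision table.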
Further Study on Running-car Network
YANG Ying-jie
Computer Science. 2015, 42 (Z11): 560-562. 
Abstract PDF(266KB) ( 415 )   
References | RelatedCitation | Metrics
Scientists put forward the idea of a running-car network as part of computer-managed cities. Through the study of networks, real-time databases and autonomous-car theory, the road network was realized. The implementation language is generally Lisp, and the database system is a real-time database system.
Windows Based Software Integration Technology and its Application on Motor Design Platform
YAO Yan-fei
Computer Science. 2015, 42 (Z11): 563-566. 
Abstract PDF(626KB) ( 553 )   
References | RelatedCitation | Metrics
To solve the problem of inconveniently using multiple software packages within one project, which has arisen with the development of the software industry, an application to a motor design platform was introduced. This paper first discussed several key technologies, such as obtaining window handles and controlling Windows messages. It then analyzed how to control the start and exit of software, send messages to software menus, and exchange data between programs. Based on this method, SolidWorks and Ansoft were integrated in a Visual C++ environment. Finally, an instance was given to verify the feasibility of the method.