Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 44 Issue Z11, 01 December 2018
Review for Deep Learning Based on Medical Imaging Diagnosis
ZHANG Qiao-li, ZHAO Di and CHI Xue-bin
Computer Science. 2017, 44 (Z11): 1-7.  doi:10.11896/j.issn.1002-137X.2017.11A.001
At present, medical image data of various modalities are accumulating rapidly, posing great challenges to doctors who diagnose disease through traditional medical image analysis methods. Deep learning has achieved great success and become increasingly popular in the computer vision field, providing new opportunities for automatic medical image analysis and making high-precision computer-aided disease diagnosis possible. In this paper, we reviewed the state-of-the-art research progress of deep learning in the medical imaging field. Firstly, deep learning methods and their applications in medical imaging are introduced. Then attention is focused on the specific research progress of deep learning methods on several typical and common diseases. Finally, the trends of this research field are summarized, and the existing problems and recommendations are put forward.
Text Extraction in Video and Images:A Review
JIANG Meng-di, CHENG Jiang-hua, CHEN Ming-hui and KU Xi-shu
Computer Science. 2017, 44 (Z11): 8-18.  doi:10.11896/j.issn.1002-137X.2017.11A.002
Text extraction from video and images has important application value. The big data era has brought urgent demands for retrieving huge amounts of information, and many text extraction methods have been proposed in recent years. In this paper, we reviewed text extraction methods for video and images. First, we divided the text extraction process into two steps: text region detection and localization, and text segmentation. Then, text region detection and localization algorithms and text segmentation algorithms were discussed with regard to their application fields and their advantages and disadvantages. Finally, we discussed benchmark data and performance evaluation, and pointed out promising directions for future research.
Review of Implicit Surface Reconstruction from Point Cloud Dataset
XU Li-min and WU Gang
Computer Science. 2017, 44 (Z11): 19-23.  doi:10.11896/j.issn.1002-137X.2017.11A.003
Surface reconstruction from point cloud data reconstructs the surface of 3D objects from the scattered data points produced by scanning equipment. It is widely used in computer animation, target recognition, data visualization and geographic information systems. Surface reconstruction based on implicit functions is an important approach for point cloud datasets owing to its ability to reconstruct surfaces with holes and cracks without requiring splicing and smoothing. This paper summarized the main implicit surface reconstruction methods, and analyzed and compared the advantages and disadvantages of implicit models and the corresponding surface reconstruction algorithms. Finally, the existing problems and future development directions of implicit surface reconstruction were analyzed and discussed.
Survey on Monitoring Techniques for Data Abnormalities
WU Jing-feng, JIN Wei-dong and TANG Peng
Computer Science. 2017, 44 (Z11): 24-28.  doi:10.11896/j.issn.1002-137X.2017.11A.004
In the current big data environment, anomalous data is more difficult to obtain than normal data, and is also more important. The purpose of anomaly detection is to detect abnormal activity data among that of normal subjects. Anomaly detection is widely applied in many fields, such as machine fault detection, data mining, disease detection and intrusion detection. Surveying the large number of existing anomaly detection methods, this paper discussed the forms in which anomalous data occurs, classified the major anomaly detection methods within this framework, and set out the advantages and disadvantages of these methods. Finally, we focused on big data anomaly detection methods based on deep learning, and introduced the different methods, their related applications, and future research hotspots.
Application Survey of Artificial Intelligence in Neurology
LI Shi-yu, WANG Feng, CAO Bin and MEI Qi
Computer Science. 2017, 44 (Z11): 29-32.  doi:10.11896/j.issn.1002-137X.2017.11A.005
Artificial intelligence affects all aspects of people's lives, and medical treatment has become one of its most popular application areas. More and more artificial intelligence equipment is used to assist doctors in diagnosis and treatment. This paper reviewed the application of artificial intelligence in neurology, especially for the diagnosis of Parkinson's disease and Alzheimer's disease. Firstly, the development history, classification and application status of artificial intelligence were expounded. Secondly, the research status of artificial intelligence in diagnosing Parkinson's disease and Alzheimer's disease was summarized, and the key technologies used in their diagnosis were analyzed. Finally, the application of these technologies in medicine was summarized, the importance of artificial intelligence in the medical field was clarified, and future research directions for artificial intelligence in neurology were discussed.
Survey on High-level Reliability Estimation Methods for Sequential Circuits
OUYANG Cheng-tian, CHEN Li-li and WANG Xi
Computer Science. 2017, 44 (Z11): 33-38.  doi:10.11896/j.issn.1002-137X.2017.11A.006
Reliability of sequential circuits is emerging as an important concern in scaled electronic technologies. This paper surveyed research progress on high-level reliability analysis of sequential circuits, focusing on Bayesian reliability analysis, multiple-pass reliability analysis, and reliability estimation based on the probabilistic transfer matrix (PTM). These analysis methods were evaluated experimentally on the ISCAS'89 benchmark circuits. The research and experimental results show that the higher the abstraction level of the circuit, the lower the accuracy of the results and the smaller the time overhead. At the same abstraction level, simulation methods have high accuracy but longer runtime, while analytical methods have low time overhead but are less accurate.
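To make the probabilistic-transfer-matrix idea concrete, the minimal Python sketch below builds the PTM of a single NAND gate with an assumed output-flip probability ε and composes two stages serially by matrix product; the gate choice, ε, and the uniform-input reliability measure are illustrative assumptions, not details from the surveyed papers.

```python
import numpy as np

# Ideal transfer matrix (ITM) of a NAND gate:
# rows index inputs (00, 01, 10, 11), columns index outputs (0, 1).
itm = np.array([[0., 1.],
                [0., 1.],
                [0., 1.],
                [1., 0.]])

eps = 0.05  # assumed probability that the gate output flips
ptm = (1 - eps) * itm + eps * (1 - itm)  # PTM of the faulty NAND gate

# Reliability of a single gate under uniformly distributed inputs:
# the probability mass landing on the ideal output, averaged over inputs.
reliability = (ptm * itm).sum(axis=1).mean()
print(f"single-gate reliability: {reliability:.4f}")  # 0.9500

# Serial composition of stages corresponds to a matrix product of PTMs.
# Feeding the NAND output into a faulty inverter illustrates the idea.
inv_itm = np.array([[0., 1.],
                    [1., 0.]])
inv_ptm = (1 - eps) * inv_itm + eps * (1 - inv_itm)
serial_ptm = ptm @ inv_ptm          # PTM of NAND followed by NOT
serial_itm = itm @ inv_itm          # ideal behaviour of the composed circuit
print(f"two-gate reliability:    {(serial_ptm * serial_itm).sum(axis=1).mean():.4f}")
```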
Repetitive Pattern Recognition Algorithms and Applications in Web Information Extraction and Clustering Analysis
Munina YUSUFU and Gulina YUSUFU
Computer Science. 2017, 44 (Z11): 39-45.  doi:10.11896/j.issn.1002-137X.2017.11A.007
Detection of repetitive patterns in sequences and its applications have become fundamental research areas in data mining, and it is a very important means of extracting useful information from sequences. In recent years, much work has focused on the definitions of repetitive patterns, efficient recognition algorithm designs, and applications in relevant areas. In this paper, the classification and characteristics of repetitive patterns in sequences were briefly described, and the data structures commonly used in the algorithms were discussed. Recent studies on the applications of detection algorithms in relevant fields and their main design ideas were reviewed by discussing and evaluating aspects including domain knowledge and constraints, identification results, scalability of the algorithms, and the main open problems. Application areas include Web information extraction, Web document feature extraction and clustering, and related Uyghur language information processing. Finally, challenges for detection algorithms of repetitive patterns in sequences and their applications in various fields were discussed, and future research trends were explored.
Ordering Recommender Algorithm Based on Consumers’ Behavior
DING Dang, ZHANG Zhi-fei, MIAO Duo-qian and CHEN Yue-feng
Computer Science. 2017, 44 (Z11): 46-50.  doi:10.11896/j.issn.1002-137X.2017.11A.008
With the development of e-commerce, most existing catering management systems lag behind the needs of consumers and managers. An effective approach is to apply recommendation systems to catering management and to provide ordering recommendations according to consumers' behavior data. To address the cold-start problems that may arise in the recommendation process, an ordering recommender system based on consumers' behavior was proposed, containing three recommendation engines: frequency statistics, association rules and Markov chains. Experiments on ordering data from real restaurants achieve a satisfactory result, yielding a weight combination for the three recommendation engines of (0.2167, 0.5167, 0.2666) and a best recommendation length of 3 under those weights.
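A minimal sketch of how three engine scores could be blended with the reported weights (0.2167, 0.5167, 0.2666); the dish names and per-engine scores are invented placeholders, not data from the paper.

```python
# Hypothetical per-dish scores from the three engines (higher = stronger
# recommendation); in the paper these would come from frequency statistics,
# association rules and a Markov chain over a consumer's ordering history.
freq_scores   = {"kung_pao_chicken": 0.9, "mapo_tofu": 0.4, "fried_rice": 0.7}
assoc_scores  = {"kung_pao_chicken": 0.2, "mapo_tofu": 0.8, "fried_rice": 0.5}
markov_scores = {"kung_pao_chicken": 0.6, "mapo_tofu": 0.3, "fried_rice": 0.9}

WEIGHTS = (0.2167, 0.5167, 0.2666)   # engine weights reported in the abstract
TOP_K = 3                            # best recommendation length reported

def blend(dish):
    """Weighted sum of the three engine scores for one dish."""
    return (WEIGHTS[0] * freq_scores.get(dish, 0.0)
            + WEIGHTS[1] * assoc_scores.get(dish, 0.0)
            + WEIGHTS[2] * markov_scores.get(dish, 0.0))

dishes = set(freq_scores) | set(assoc_scores) | set(markov_scores)
ranked = sorted(dishes, key=blend, reverse=True)
print(ranked[:TOP_K])
```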
Application of BP Neural Network Based on Newly Improved Particle Swarm Optimization Algorithm in Fitting Nonlinear Function
LIN Yu-feng, DENG Hong-min and SHI Xing-yu
Computer Science. 2017, 44 (Z11): 51-54.  doi:10.11896/j.issn.1002-137X.2017.11A.009
A newly improved particle swarm optimization (PSO) algorithm for optimizing BP neural networks was introduced to solve the problem of large error in fitting nonlinear functions. In this algorithm, a new model based on the improved PSO is established by varying the inertia weight nonlinearly and the learning factors linearly; it is then applied to nonlinear function fitting in combination with a BP neural network. The results show that the newly improved PSO algorithm can more rationally and effectively boost the fitting ability of the BP neural network and improve the fitting accuracy.
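The abstract states that the inertia weight varies nonlinearly and the learning factors linearly, but not their exact forms; the sketch below uses one plausible pair of schedules (exponential weight decay, linearly interpolated c1/c2) inside a standard PSO update, purely as an assumed stand-in. In the PSO-BP setting each particle position would encode the flattened BP weights, with fitness given by the training error.

```python
import numpy as np

def pso_schedules(t, t_max, w_max=0.9, w_min=0.4,
                  c1_start=2.5, c1_end=0.5, c2_start=0.5, c2_end=2.5):
    """Assumed schedules: nonlinear (exponential) inertia weight,
    linear learning factors -- one plausible reading of the abstract."""
    w = w_min + (w_max - w_min) * np.exp(-4.0 * t / t_max)   # nonlinear decay
    c1 = c1_start + (c1_end - c1_start) * t / t_max          # linear
    c2 = c2_start + (c2_end - c2_start) * t / t_max          # linear
    return w, c1, c2

def pso_step(x, v, pbest, gbest, t, t_max, rng):
    """One standard PSO velocity/position update using the schedules above."""
    w, c1, c2 = pso_schedules(t, t_max)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Tiny demo: minimise sum(x^2) with 20 particles in 5 dimensions.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, (20, 5)); v = np.zeros_like(x)
pbest, gbest = x.copy(), x[np.argmin((x**2).sum(1))].copy()
for t in range(100):
    x, v = pso_step(x, v, pbest, gbest, t, 100, rng)
    better = (x**2).sum(1) < (pbest**2).sum(1)
    pbest[better] = x[better]
    gbest = pbest[np.argmin((pbest**2).sum(1))]
print((gbest**2).sum())  # should be close to 0
```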
Research on Fuzzy Matching Duplicate Checking Algorithm Based on Matrix Model of Word Segmentation
LI Cheng-long, YANG Dong-ju and HAN Yan-bo
Computer Science. 2017, 44 (Z11): 55-60.  doi:10.11896/j.issn.1002-137X.2017.11A.010
To meet the need for Chinese text duplicate checking, the target text and sample text are converted, based on word segmentation results, into a word-segmentation matrix model, which is then scanned and analyzed to obtain the duplicate-checking result. A duplicate checking algorithm was developed on this basis, and its usefulness was demonstrated by practical examples.
Analysis and Prediction on Rebar Price Based on Multiple Linear Regression Model
CHEN Hai-peng, LU Xu-wang, SHEN Xuan-jing and YANG Ying-zhuo
Computer Science. 2017, 44 (Z11): 61-64.  doi:10.11896/j.issn.1002-137X.2017.11A.011
A rebar price analysis and prediction model based on multiple linear regression was proposed by analyzing the upstream and downstream relationships of the rebar industrial chain among black-line futures varieties. Firstly, data on the major factors influencing the rebar price are collected, including the coke futures settlement price, coking coal futures settlement price, iron ore futures settlement price, hot-rolled coil futures settlement price, and the central parity rate of RMB against USD. These candidate factors are then analyzed through scatter diagrams and trend lines to determine the influencing factors. A multiple linear regression model based on the least squares method is constructed with SPSS and NCSS using the collected data. Meanwhile, the collinearity among independent variables is removed through ridge regression to obtain a revised model. Finally, the model is applied to predict the rebar price on trading days in the following month. The experiments indicate that the model has a high goodness of fit and practical value.
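A hedged sketch of the regression workflow; the paper used SPSS and NCSS, so scikit-learn stands in here, and the factor data is synthetic placeholder data rather than actual futures prices.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Placeholder data: rows are trading days, columns are the five factors
# named in the abstract (coke, coking coal, iron ore, hot-rolled coil
# settlement prices, and the RMB/USD central parity rate).
rng = np.random.default_rng(42)
X = rng.normal(size=(250, 5))
true_coef = np.array([0.8, 0.5, 1.2, 0.9, -0.3])   # invented for the demo
y = X @ true_coef + rng.normal(scale=0.1, size=250)  # synthetic rebar price

# Ordinary least squares fit (what SPSS/NCSS would produce).
ols = LinearRegression().fit(X, y)

# Ridge regression to suppress collinearity among the factors; the penalty
# strength alpha would be tuned on a ridge trace in practice.
ridge = Ridge(alpha=1.0).fit(X, y)

print("OLS coefficients:  ", ols.coef_.round(3))
print("Ridge coefficients:", ridge.coef_.round(3))
print(f"next-day prediction: {ridge.predict(X[-1:])[0]:.3f}")
```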
New Intelligent Prediction of Chronic Liver Disease Based on Principal Component Machine Learning Algorithm
CHANG Bing-guo, LI Yu-qin, FENG Zhi-chao and YAO Shan-hu
Computer Science. 2017, 44 (Z11): 65-67.  doi:10.11896/j.issn.1002-137X.2017.11A.012
Using new information technology to predict the mechanism and characteristics of chronic liver disease is an effective way to improve its diagnosis. In this paper, we used principal component analysis (PCA), combined with neural network learning, to reduce the dimensionality of chronic liver disease indicators and to build a new intelligent prediction model for chronic liver disease (IPCLD). The experiments studied 125 records with 20-dimensional chronic liver disease indicators, used the receiver operating characteristic (ROC) curve to select the 13 most sensitive indicators, and further reduced the dimensionality to 5 by PCA. The neural network was trained with 115 records, and the remaining 10 records were used as the test set. Compared with training on the original data, the IPCLD improves prediction accuracy by 15.07% and reduces complexity.
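A minimal sketch of the screened-indicators → PCA → neural network pipeline using scikit-learn; the data below is random placeholder data with the dimensions quoted in the abstract (13 ROC-selected indicators reduced to 5, 115 training and 10 test records), not the study's clinical data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data standing in for the paper's 125 records.
rng = np.random.default_rng(0)
X = rng.normal(size=(125, 13))        # 13 ROC-selected indicators
y = rng.integers(0, 2, size=125)      # 1 = chronic liver disease

model = make_pipeline(
    StandardScaler(),                 # scale indicators first
    PCA(n_components=5),              # 13 -> 5 dimensions, as in the paper
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)

# The paper trains on 115 records and tests on the remaining 10.
model.fit(X[:115], y[:115])
print("test predictions:", model.predict(X[115:]))
```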
Study on Abnormal Diagnosis of Mobile ECG Signals Based on Unsupervised Learning
LI Feng and XIE Si-hong
Computer Science. 2017, 44 (Z11): 68-71.  doi:10.11896/j.issn.1002-137X.2017.11A.013
To diagnose abnormalities in mobile ECG signals, a method based on unsupervised learning was proposed. The ECG data is classified by hierarchical clustering, and by combining it with a priority diagnosis method based on feature quantities, the time and space complexity are effectively reduced. Finally, the analysis of examples verifies the method proposed in this paper.
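A minimal sketch of the hierarchical-clustering step with SciPy; the beat feature vectors are synthetic, and flagging the minority cluster as abnormal is an illustrative reading of the unsupervised diagnosis idea, not the paper's full procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Placeholder feature vectors, one per heartbeat, standing in for the
# feature quantities extracted from mobile ECG recordings.
rng = np.random.default_rng(1)
normal_beats = rng.normal(loc=0.0, scale=0.3, size=(40, 4))
abnormal_beats = rng.normal(loc=2.0, scale=0.3, size=(5, 4))
beats = np.vstack([normal_beats, abnormal_beats])

# Agglomerative (hierarchical) clustering with Ward linkage; beats that
# end up in the small cluster are flagged as candidate abnormalities.
labels = fcluster(linkage(beats, method="ward"), t=2, criterion="maxclust")
minority = np.argmin(np.bincount(labels)[1:]) + 1   # label of smaller cluster
print("suspected abnormal beat indices:", np.where(labels == minority)[0])
```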
UAV Navigation Algorithm Research Based on EB-RRT*
CHEN Jin-yin, LI Yu-wei and DU Wen-yao
Computer Science. 2017, 44 (Z11): 72-79.  doi:10.11896/j.issn.1002-137X.2017.11A.014
With the wide application of UAVs, their automatic navigation capability becomes more important. The UAV navigation problem is defined as planning a collision-free and smooth path from a start position to a destination in a known map. Aiming at three problems of current navigation algorithms, namely low convergence rate, long search time, and navigation paths that cannot be applied to real UAVs, the EB-RRT* (Efficient B-RRT*) algorithm was proposed. A self-adaptive obstacle avoidance strategy was designed to speed up the convergence of the traditional navigation algorithm by reducing memory cost. A grid-based segmentation mechanism was put forward to reduce the search time of path planning. Appropriate downsampling and cubic Bezier interpolation were adopted to smooth the final path for practical UAV applications. Several simulated maps were used to test the performance of the proposed algorithm against other classic algorithms.
Algorithm of Importance Ranking for Influencing Factors of Website Service Quality Based on PageRank
QI Yu-dong, HE Cheng and YUAN Wei
Computer Science. 2017, 44 (Z11): 80-83.  doi:10.11896/j.issn.1002-137X.2017.11A.015
In this paper, the influencing factors were first filtered through the Delphi method. Then, to analyze the relationships among the factors, the PageRank voting idea is applied: the interactions among factors affecting web service quality are treated like links between web pages, each interaction is counted as a vote, and the influence value of each factor is calculated from the internal relationships among the factors. The factors affecting web service quality are eventually ranked by importance, which offers an example and a reference for evaluating web service quality.
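The voting idea maps directly onto the standard PageRank power iteration; in the sketch below the factor names and the 0/1 influence matrix are invented placeholders, while the damping factor d = 0.85 is the usual PageRank default.

```python
import numpy as np

# Hypothetical influence matrix among five service-quality factors:
# A[i, j] = 1 means factor i "votes for" (influences) factor j, in analogy
# with a hyperlink from page i to page j. The factor names are invented.
factors = ["availability", "response time", "usability", "security", "content"]
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 1],
              [0, 1, 0, 1, 1],
              [1, 0, 0, 0, 1],
              [0, 1, 1, 1, 0]], dtype=float)

# Row-normalize so each factor splits its vote among the factors it affects.
M = A / A.sum(axis=1, keepdims=True)

# Standard PageRank power iteration with damping factor d = 0.85.
d, n = 0.85, len(factors)
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * M.T @ rank

for name, r in sorted(zip(factors, rank), key=lambda p: -p[1]):
    print(f"{name:15s} {r:.4f}")
```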
Research of Path Planning in Multi-AGV System
TAI Ying-peng, XING Ke-xin, LIN Ye-gui and ZHANG Wen-an
Computer Science. 2017, 44 (Z11): 84-87.  doi:10.11896/j.issn.1002-137X.2017.11A.016
In this paper, a dynamic routing method based on a time-window model was presented for the multi-AGV path planning problem in warehouses. Firstly, the A* algorithm is used to compute routes for multiple AGVs. Secondly, the time at which each AGV passes through each path node is calculated, and multi-AGV conflicts are resolved by inserting time windows into the paths and updating them. Furthermore, dynamically allocating priorities to the AGVs enhances system efficiency. Finally, when obstacles appear in a path, real-time obstacle avoidance is achieved by dynamically changing path weights. The simulation results show that the proposed algorithm can effectively avoid collisions while maintaining optimal paths, improving system efficiency and exhibiting good adaptability and robustness in dynamic environments.
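A toy fragment of the time-window mechanism: each node keeps reserved occupancy intervals, and a requested window is accepted only if it overlaps none of them. The data structures and numbers are illustrative assumptions, not the paper's implementation.

```python
from typing import Dict, List, Tuple

# Each node keeps a list of (start, end, agv_id) time windows during which
# some AGV occupies it; a new reservation is feasible only if its window
# overlaps none of the existing ones.
TimeWindow = Tuple[float, float, int]
reservations: Dict[str, List[TimeWindow]] = {}

def conflicts(node: str, start: float, end: float) -> bool:
    """True if [start, end) overlaps an existing window at this node."""
    return any(start < e and s < end for s, e, _ in reservations.get(node, []))

def reserve(node: str, start: float, end: float, agv: int) -> bool:
    """Insert a time window for `agv` at `node` if it is conflict-free."""
    if conflicts(node, start, end):
        return False
    reservations.setdefault(node, []).append((start, end, agv))
    return True

# AGV 1 books node "B" for t in [4, 6); AGV 2's overlapping request is
# rejected and would trigger re-timing or re-routing along its A* path.
print(reserve("B", 4.0, 6.0, agv=1))   # True
print(reserve("B", 5.0, 7.0, agv=2))   # False -> conflict, replan
print(reserve("B", 6.0, 8.0, agv=2))   # True  -> shifted window fits
```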
Fast Incremental Learning Algorithm of SVM with Locality Sensitive Hashing
YAO Ming-hai, LIN Xuan-min and WANG Xian-bao
Computer Science. 2017, 44 (Z11): 88-91.  doi:10.11896/j.issn.1002-137X.2017.11A.017
To improve training speed and classification accuracy on large-scale, high-dimensional data, a new incremental SVM learning algorithm with locality sensitive hashing (LSH) was proposed. It uses LSH, which can quickly find similar data in large-scale, high-dimensional datasets, to filter out the incremental samples that may become support vectors (SVs) during SVM training; the selected samples and the existing SVs then serve as the basis for subsequent training. We validated the algorithm on multiple datasets. Experiments show that the new algorithm can improve the speed of incremental training on large-scale data while maintaining accuracy.
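A generic sketch of the LSH filtering step using sign-of-random-projection hashing: incremental samples whose hash is close to that of an existing support vector are kept as candidate SVs for retraining. The hash family, bit count and threshold are assumptions, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, N_BITS = 128, 16
planes = rng.normal(size=(N_BITS, DIM))     # shared random hyperplanes

def lsh_hash(x: np.ndarray) -> np.ndarray:
    """Binary hash: the side of each random hyperplane that x falls on."""
    return planes @ x > 0

def hamming_sim(h1: np.ndarray, h2: np.ndarray) -> float:
    """Fraction of agreeing hash bits."""
    return (h1 == h2).mean()

support_vectors = rng.normal(size=(10, DIM))     # SVs from the current model
sv_hashes = [lsh_hash(sv) for sv in support_vectors]

# Keep only incremental samples whose hash is close to some SV's hash;
# these are the candidates passed on to the next SVM training round.
incremental_batch = rng.normal(size=(1000, DIM))
candidates = [x for x in incremental_batch
              if max(hamming_sim(lsh_hash(x), h) for h in sv_hashes) > 0.8]
print(f"kept {len(candidates)} of {len(incremental_batch)} samples for retraining")
```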
Research of Text Sentiment Classification Based on Improved Semantic Comprehension
WANG Ri-hong, CUI Xing-mei, ZHOU Wei, WANG Cheng-long and LI Yong-jun
Computer Science. 2017, 44 (Z11): 92-97.  doi:10.11896/j.issn.1002-137X.2017.11A.018
Text classification has a wide range of applications in information retrieval, automatic Web document classification, digital libraries, automatic abstracting, and document organization and management. An improved text sentiment classification method based on semantic understanding was put forward. Emotional sememes are incorporated to revise definitions in the emotional similarity calculation, combined with the sentiment of emotional phrases. Focusing on emotional words, negative words and degree adverbs, their forms of combination are analyzed, and modules for complex negative-word and adverb constructions are proposed. Conjunctions are used as markers for classifying the emotional tendency of sentences, and a text-polarity algorithm is given to judge the text sentiment and classify the text. Experimental results show that the classification results of this method improve on the previous algorithm.
Cost-sensitive Random Forest Classifier with New Impurity Measurement
SHI Yan-wen and WANG Hong-jie
Computer Science. 2017, 44 (Z11): 98-101.  doi:10.11896/j.issn.1002-137X.2017.11A.019
To classify imbalanced data sets effectively, a classifier combining cost-sensitive learning with the random forest algorithm was proposed. Firstly, a new impurity measure is proposed, which takes into account not only the total cost of the decision tree but also the cost difference of the same node for different samples. Then the random forest algorithm is executed: the data set is sampled K times and K base classifiers are built, where each decision tree is constructed by the classification and regression tree (CART) algorithm using the proposed impurity measure, forming the decision tree forest. Finally, the random forest makes classification decisions by a voting mechanism. On UCI datasets, compared with the traditional random forest and an existing cost-sensitive random forest classifier, this classifier performs well in classification accuracy, AUC and Kappa coefficient.
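The paper's exact impurity formula is not reproduced in the abstract; the sketch below shows one plausible cost-sensitive variant of Gini impurity, with an invented cost matrix, purely to make the idea concrete.

```python
import numpy as np

# COST[i, j]: cost of predicting class j when the true class is i.
# The 5x penalty on missing the minority class is an invented example.
COST = np.array([[0.0, 1.0],
                 [5.0, 0.0]])

def cost_sensitive_impurity(labels: np.ndarray) -> float:
    """Gini-style impurity over cost-weighted class proportions."""
    counts = np.bincount(labels, minlength=COST.shape[0]).astype(float)
    p = counts / counts.sum()
    class_cost = COST.sum(axis=1)        # total misclassification cost per class
    w = p * class_cost
    w = w / w.sum() if w.sum() > 0 else p
    return 1.0 - (w ** 2).sum()

balanced   = np.array([0] * 50 + [1] * 50)
imbalanced = np.array([0] * 95 + [1] * 5)
print(cost_sensitive_impurity(balanced))    # ~0.28
print(cost_sensitive_impurity(imbalanced))  # ~0.33: still impure, since the
                                            # rare class carries high cost
# (plain Gini of the 95/5 split would be only ~0.10, so a cost-blind
#  splitter would happily stop here)
```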
Automated Scoring of Chinese Subjective Responses Based on Improved LDA
LUO Hai-jiao and KE Xiao-hua
Computer Science. 2017, 44 (Z11): 102-105.  doi:10.11896/j.issn.1002-137X.2017.11A.020
Automated scoring of subjective responses (ASSR) holds great promise for providing diagnostic information and reliability to aid language learning and testing. In the present study, we introduced latent Dirichlet allocation (LDA) into an automated scoring task for Chinese subjective responses, and an improved LDA model incorporating experts' knowledge was proposed. In the new model, we proposed a text feature representation that integrates the document-latent topic probability vector and the latent topic-core term probability vector. Experimental results show that the improved LDA outperforms LSA in autoscoring performance. The findings of this study highlight the importance of model selection when applying automated scoring to Chinese responses in language testing.
CSI 300 Index Prediction Research Based on Multi-output Learning
TANG Yan-qin, PAN Zhi-song and ZHANG Yan-yan
Computer Science. 2017, 44 (Z11): 106-109.  doi:10.11896/j.issn.1002-137X.2017.11A.021
In the stock market, people usually rely on historical trading data to predict future trends. SVM methods are common in stock prediction, but they are complex and time-consuming, and usually predict the trend of only one day. This article adopted a regularization-based multi-output learning method to predict the trend over multiple days: we improved the multi-task learning method and put forward a method based on multi-output learning. Experiments on the CSI 300 index show that the mean square error (MSE) of this method is about one tenth of that of the support vector machine (SVM) method, and the time consumed is also reduced by nearly three quarters.
QoE Evaluation Model of Mobile Streaming Media
XIONG Li-rong and JIN Xin
Computer Science. 2017, 44 (Z11): 110-114.  doi:10.11896/j.issn.1002-137X.2017.11A.022
HTTP adaptive streaming (HAS) can balance smooth playback against video quality. Most HAS-based streaming media QoE models focus on current system or network conditions, but pay less attention to the status of the user's media device and the user's psychological experience. In this paper, a QoE evaluation model was designed that classifies the influencing factors into two categories: objective perception parameters and psychological effect parameters. We proposed a mobile device jitter detection method and a user viewing position detection method. Based on the mobile device status, the user's viewing position, and the streaming service quality, the serial-position effect from psychology is used to compute the QoE score. Finally, it is shown that the QoE model provides accurate and effective QoE evaluation results consistent with actual user experience.
Reduction Algorithm for Total Dominating Set
LUO Wei-zhong, CAI Zhao-quan, LAN Yuan-dong and LIU Yun-long
Computer Science. 2017, 44 (Z11): 115-118.  doi:10.11896/j.issn.1002-137X.2017.11A.023
Total dominating set is a well-known NP-hard problem with important applications in wireless sensor networks. In this paper, we studied the design of reduction algorithms that shrink the size of the original problem. By analyzing the problem structure and coloring the vertices of the given graph, we observed many new combinatorial properties of the vertices and presented a set of efficient polynomial-time reduction rules that exploit local structures of the graph. The rules are proved correct theoretically, and their effectiveness is verified by simulations.
Chaotic Gray Wolf Optimization Algorithm with Adaptive Adjustment Strategy
ZHANG Yue, SUN Hui-xiang, WEI Zheng-lei and HAN Bo
Computer Science. 2017, 44 (Z11): 119-122.  doi:10.11896/j.issn.1002-137X.2017.11A.024
Grey wolf optimization (GWO) is a new optimization algorithm. Compared with other swarm-based algorithms, it suffers from slow convergence, instability, and a tendency to fall into local optima. According to the structural characteristics of the GWO algorithm, a chaotic grey wolf optimization algorithm based on an adaptive adjustment strategy was proposed. The adaptive adjustment strategy is used to improve the convergence rate of the GWO algorithm, and a chaotic local search strategy is used to increase population diversity and keep the search from falling into local optima. Finally, six test functions are used to evaluate the algorithm against four other algorithms. The experimental results show that the proposed algorithm has obvious advantages in convergence speed, accuracy and stability.
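A compact sketch of GWO with the two modifications the abstract names: a nonlinear decay of the control parameter a in place of the standard linear one (the quadratic schedule here is an assumption), and a logistic-map chaotic local search around the alpha wolf.

```python
import numpy as np

def sphere(x):                      # benchmark objective to minimise
    return (x ** 2).sum()

rng = np.random.default_rng(0)
N, DIM, T = 20, 10, 200
wolves = rng.uniform(-10, 10, (N, DIM))

for t in range(T):
    fitness = np.apply_along_axis(sphere, 1, wolves)
    order = np.argsort(fitness)
    alpha, beta, delta = wolves[order[:3]]      # three leading wolves

    # Adaptive adjustment: nonlinear decay of `a` (standard GWO decays it
    # linearly from 2 to 0); the exact schedule in the paper may differ.
    a = 2.0 * (1 - (t / T) ** 2)

    # Standard GWO position update toward alpha, beta and delta.
    for i in range(N):
        new = np.zeros(DIM)
        for leader in (alpha, beta, delta):
            A = a * (2 * rng.random(DIM) - 1)
            C = 2 * rng.random(DIM)
            new += leader - A * np.abs(C * leader - wolves[i])
        wolves[i] = new / 3.0

    # Chaotic local search around the alpha wolf via a logistic map.
    z = rng.random()
    for _ in range(5):
        z = 4.0 * z * (1.0 - z)                 # logistic chaos in (0, 1)
        trial = alpha + 0.1 * (2 * z - 1) * np.abs(alpha)
        if sphere(trial) < sphere(alpha):
            wolves[order[0]] = alpha = trial

print("best fitness:", sphere(alpha))
```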
Improved Quantum Behaved Particle Swarm Optimization Algorithm for Mobile Robot Path Planning
LIU Jie, ZHAO Hai-fang and ZHOU De-lian
Computer Science. 2017, 44 (Z11): 123-128.  doi:10.11896/j.issn.1002-137X.2017.11A.025
To realize optimal path planning for mobile robots, an improved quantum-behaved particle swarm optimization (LTQPSO) algorithm was proposed. To address the premature convergence problem of particle swarm algorithms, the evolution speed of individual particles and the population dispersion are used to dynamically adjust the inertia weight, making it adaptive and controllable and thereby avoiding premature convergence. Meanwhile, natural selection is introduced into the traditional position update formula to maintain population diversity, strengthen the global search ability of the QPSO algorithm, and improve its convergence speed. The improved QPSO algorithm was applied to mobile robot path planning, and the effectiveness and feasibility of the proposed method were verified both by simulation and by experimental results on a mobile robot platform.
Decision Tree Algorithm Based on Attribute Significance
WANG Rong, LIU Zun-ren and JI Jun
Computer Science. 2017, 44 (Z11): 129-132.  doi:10.11896/j.issn.1002-137X.2017.11A.026
The traditional ID3 decision tree algorithm has difficulty selecting attributes, low classification efficiency, weak noise resistance, and difficulty adapting to large-scale data sets. To address these issues, a decision tree algorithm based on attribute significance and variable precision rough sets was proposed, which removes noisy data while keeping the tree size moderate. The algorithm was validated on multiple UCI standard data sets. The experimental results show that the algorithm is superior to ID3 in both the scale of the decision tree and classification accuracy.
Short-term Load Forecasting of Power System Based on Alternating Particle Swarm BP Network
TANG Cheng-e
Computer Science. 2017, 44 (Z11): 133-135.  doi:10.11896/j.issn.1002-137X.2017.11A.027
Short-term load forecasting is a key link in the normal operation of power systems. To achieve accurate load forecasting, an alternating particle swarm optimization algorithm was proposed in this paper to optimize a BP network model for short-term load prediction. The weights of the BP neural network were optimized by the alternating particle swarm algorithm to reduce the error of the neural network prediction model in short-term load forecasting. The experimental results show that the optimized BP neural network prediction model is more accurate than the traditional BP neural network prediction model and closer to the actual power load.
Conservative Extension in Description Logic εL with Cyclic Terminologies
WANG Yong-hong, SHEN Yu-ming, NIE Deng-guo and WANG Ju
Computer Science. 2017, 44 (Z11): 136-140.  doi:10.11896/j.issn.1002-137X.2017.11A.028
In computer science, ontologies are dynamic entities. To adapt them to new and evolving applications, it is necessary to constantly perform modifications such as extension with new axioms and merging with other ontologies. Based on different requirements and application domains, users choose an appropriate ontology, import another ontology, and thereby extend the existing ontology. We argue that it is very important to know whether the resulting ontology remains a conservative extension of the original one after such modifications; if not, there may be unexpected consequences when the modified ontology is used in place of the existing one in applications. Lutz et al. studied the conservative extension problem for description logics and proved that deciding conservative extensions is ExpTime-complete. Based on Lutz's work, this paper analyzed conservative extensions in the description logic εL with cyclic terminologies. On the one hand, a sufficient condition is given for conservative extensions with respect to greatest fixpoint semantics in εL with cyclic terminologies sharing the same primitive concepts, and its complexity is proved to be polynomial. On the other hand, conservative extensions of terminological cycles with respect to greatest fixpoint semantics are presented, and their complexity is proved to be exponential.
Interval Type 2 Fuzzy System Identification Using NT Type Reduction Algorithm
WANG Zhe
Computer Science. 2017, 44 (Z11): 141-143.  doi:10.11896/j.issn.1002-137X.2017.11A.029
The Karnik-Mendel (KM) algorithm is commonly used for type reduction of interval type-2 fuzzy sets, but its low efficiency makes it difficult to use for online identification and control. A simplified interval type-2 fuzzy system identification method was proposed in this article. The method uses a type-2 T-S fuzzy model whose antecedent fuzzy sets are interval type-2 fuzzy sets and whose consequent parameters are the same as those of a type-1 T-S fuzzy model. A simplified (NT) type reduction algorithm is used instead of the KM algorithm. The simulations show that the simplified method improves identification efficiency without reducing identification accuracy and can be used for real-time identification and control.
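The Nie-Tan (NT) type reduction replaces the iterative KM procedure with a closed form: the defuzzified output is the consequent centroid weighted by the midpoints of the interval firing strengths. A minimal sketch, with invented rule consequents and firing intervals:

```python
import numpy as np

def nt_type_reduction(y, f_lower, f_upper):
    """Nie-Tan (NT) closed-form type reduction: defuzzify an interval
    type-2 fuzzy system from the lower/upper rule firing strengths,
    avoiding the iterative KM procedure."""
    f_mid = (np.asarray(f_lower) + np.asarray(f_upper)) / 2.0
    return (f_mid * np.asarray(y)).sum() / f_mid.sum()

# Toy system: four rules with crisp consequents y_i (type-1 T-S style,
# as in the abstract) and interval firing strengths [f_lower, f_upper].
y       = np.array([1.0, 2.0, 3.0, 4.0])
f_lower = np.array([0.1, 0.4, 0.3, 0.0])
f_upper = np.array([0.3, 0.7, 0.6, 0.2])
print(nt_type_reduction(y, f_lower, f_upper))   # ~2.346
```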
Double Quantitative Multi-granulation Rough Set Model Based on “Logical Conjunction” Operator
CHEN Hua-feng, SHEN Yu-ling, LONG Jian-wu and QU Xian-ping
Computer Science. 2017, 44 (Z11): 144-147.  doi:10.11896/j.issn.1002-137X.2017.11A.030
In this study, a double-quantitative multi-granulation rough set model based on the logical conjunction operator was proposed in multi-granulation approximation space. The variable precision rough set, which is characterized by relative quantitative information, and the graded rough set, which describes absolute quantitative information, were combined to build the model. Some mathematical properties of the proposed rough set model were studied from both optimistic and pessimistic viewpoints. The constructed model describes relative and absolute quantitative information simultaneously in an approximation space, which is useful for processing noisy data and provides a richer theoretical basis for knowledge discovery based on rough set theory.
Improvement of Matrix Triangular Decomposition for Pre-order Principal Sub-determinants
SU Er
Computer Science. 2017, 44 (Z11): 148-153.  doi:10.11896/j.issn.1002-137X.2017.11A.031
Gaussian elimination with partial pivoting generally does not yield all of the leading principal minors (pre-order principal sub-determinants) of a matrix. This article discusses a stepwise reduction in which, after a number of row replacements, a final triangular decomposition of the matrix A0 is performed via row permutation, and every leading principal minor of A0 is obtained in order. The main purpose of the article is to improve the usual triangular reduction with row permutation, using a recursive algebraic representation tied to matrix products; it is proved by induction that the final reduction result realizes an LU triangular decomposition of the matrix, and at the same time the stepwise reduction yields all leading principal minors of the original matrix A0.
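As a baseline check of the classical fact the paper builds on (not its improved row-permutation algorithm): when A admits an LU factorization without pivoting, the k-th leading principal minor equals the product of the first k diagonal entries of U.

```python
import numpy as np

def lu_no_pivot(a):
    """Doolittle LU decomposition without pivoting (assumes all leading
    principal minors are nonzero, as the stepwise reduction requires)."""
    a = a.astype(float).copy()
    n = a.shape[0]
    for k in range(n - 1):
        a[k + 1:, k] /= a[k, k]                       # multipliers -> L
        a[k + 1:, k + 1:] -= np.outer(a[k + 1:, k], a[k, k + 1:])
    return a                                          # L (strict lower) + U

A = np.array([[4., 3., 2.],
              [6., 3., 1.],
              [8., 5., 7.]])
U_diag = np.diag(lu_no_pivot(A))
minors_from_lu = np.cumprod(U_diag)                   # [4, -6, -26]
minors_direct = [np.linalg.det(A[:k, :k]) for k in range(1, 4)]
print(minors_from_lu)                  # matches the directly computed minors
print(np.round(minors_direct, 6))
```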
Multiple Object Tracking Algorithm via Collaborative Motion Status Estimation
YUAN Da-long and JI Qing-ge
Computer Science. 2017, 44 (Z11): 154-159.  doi:10.11896/j.issn.1002-137X.2017.11A.032
Multiple object tracking (MOT) is widely applied in video analysis scenarios such as human interaction, virtual reality, autonomous driving, visual surveillance and robot navigation. MOT can be formulated as associating tracklets over existing detection results, so the accuracy of the detection algorithm plays an essential role in tracking performance. We proposed a multiple object tracking algorithm based on collaborative motion status estimation, built on the tracking-by-detection framework. The algorithm focuses on data association between adjacent video frames, tackling the challenges of MOT from three aspects: object detection, object motion status estimation and data association. Firstly, for object detection, the multi-scale convolutional neural network (MS-CNN) is adopted as the detector, since deep learning outperforms classical machine learning methods for detection. Secondly, to better predict object motion status and handle occlusion among targets, different motion estimation methods are used according to the motion status: in the tracking status a kernelized correlation filter is employed, while in the occlusion status a Kalman filter is preferred. Lastly, the Kuhn-Munkres algorithm is adopted to solve the data association between detections and tracklets. Extensive experiments were carried out, and the results are quite positive, demonstrating high accuracy.
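The final association step is a standard minimum-cost bipartite matching; a minimal sketch with SciPy's Kuhn-Munkres implementation, using invented positions and an assumed gating threshold:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy association step: cost is the distance between predicted tracklet
# positions (from KCF or Kalman prediction) and current-frame detections.
tracklet_pred = np.array([[10.0, 12.0], [40.0, 42.0], [70.0, 75.0]])
detections    = np.array([[41.0, 43.0], [ 9.0, 11.0]])

cost = np.linalg.norm(tracklet_pred[:, None, :] - detections[None, :, :], axis=2)
rows, cols = linear_sum_assignment(cost)   # Kuhn-Munkres / Hungarian matching

GATE = 5.0   # assumed gating threshold: reject implausible matches
for r, c in zip(rows, cols):
    if cost[r, c] < GATE:
        print(f"tracklet {r} <- detection {c} (cost {cost[r, c]:.2f})")
# Unmatched tracklets keep predicting (possible occlusion); unmatched
# detections can spawn new tracklets.
```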
Identifying Users’ Gender via Social Representations
ZHU Pei-song, QIAN Tie-yun and WU Min-quan
Computer Science. 2017, 44 (Z11): 160-165.  doi:10.11896/j.issn.1002-137X.2017.11A.033
Gender prediction has attracted great research interest due to its potential applications, such as targeted advertising and personalized search. Most existing studies rely on text content; however, text information can be hard to access, which makes it difficult to extract text features. In this paper, we proposed a novel framework that only uses users' IDs for gender prediction. The key idea is to represent users in an embedded connection space. We presented two strategies for adapting the word embedding technique to user embedding: the first sequentializes users' IDs to obtain an ordered social context, and the second embeds users using a large sliding window of contexts. We conducted extensive experiments on two real data sets from Sina Weibo. Results show that our method is significantly better than state-of-the-art graph embedding baselines, and its accuracy also outperforms that of content-based approaches.
3D Point Cloud Segmentation Method Based on Adaptive Angle
LU Yong-huang and HUANG Shan
Computer Science. 2017, 44 (Z11): 166-168.  doi:10.11896/j.issn.1002-137X.2017.11A.034
3D point cloud segmentation, which partitions point cloud data in space based on geometric information, is an important task and the basis of point cloud feature extraction and analysis. Point cloud data is usually discrete and unstructured, so its segmentation is not a simple data processing task, and the segmentation efficiency and accuracy determine the quality of subsequent processing; research on point cloud segmentation is therefore of great significance. This paper presented a 3D point cloud segmentation algorithm based on adaptive angles, in which the PCA algorithm is used to find the optimal projection direction to reduce the dimensionality of the original point cloud data, and the concept of projection clusters is used to segment the original target point cloud.
Self-learning Single Image Super-resolution Reconstruction Based on Compressive Sensing and SVR
QIN Xu-jia, SHAN Yang-yang, XIAO Jia-ji, ZHENG Hong-bo and ZHANG Mei-yu
Computer Science. 2017, 44 (Z11): 169-174.  doi:10.11896/j.issn.1002-137X.2017.11A.035
Super-resolution (SR) reconstruction algorithms that rely on external image databases suffer from long learning times and are prone to producing incorrect high-frequency details. This paper presented a single-image SR reconstruction method based on compressive sensing (CS) and support vector regression (SVR), in which the SVR model is trained on the degraded image itself to make full use of the image's self-similarity. In the training stage, we first detect image edges and classify image patches into low- and high-frequency blocks; we then sparsely code the image patches and train an SVR model using the image's label vector and sparse representation matrix. Finally, in the testing stage, we predict the high-resolution (HR) image with the trained SVR model. Experiments show that the proposed method produces more realistic results than methods based on external databases, and sharper edges than bicubic interpolation.
Research on Detecting Algorithm of Moving Target in Aerial Video
TANG Jia-lin, ZHENG Jie-feng, LI Xi-ying and SU Bing-hua
Computer Science. 2017, 44 (Z11): 175-177.  doi:10.11896/j.issn.1002-137X.2017.11A.036
For moving target detection in aerial video under complex backgrounds, a detection algorithm was proposed that combines multi-frame energy accumulation with a stabilization method based on improved feature matching and global motion compensation. Firstly, local-area matching is used to increase the processing speed of the algorithm and avoid the influence of moving targets on background compensation. Secondly, matching points are obtained using the scale-invariant SURF algorithm combined with a fast approximate nearest neighbor search, and the best matches are screened with bilateral matching and the k-nearest neighbor algorithm. Thirdly, an affine transformation model is built, the motion parameters are solved, and motion compensation is applied. Finally, moving targets are detected through accumulated multi-frame differences. The simulation results show that the method detects moving targets well.
Research on Continuous Sign Language Sentence Recognition Algorithm Based on Key Frame
GUO Xin-peng, HUANG Yuan-yuan and HU Zuo-jin
Computer Science. 2017, 44 (Z11): 178-183.  doi:10.11896/j.issn.1002-137X.2017.11A.037
At present, most dynamic sign language recognition targets only isolated sign words; research on continuous sign language sentence recognition and the corresponding results are scarce, because segmenting such sentences is very difficult. In this paper, a sign language sentence recognition algorithm based on weighted key frames was proposed. Key frames can be regarded as the basic units of sign words; therefore, from the key frames we can obtain the related vocabulary and further organize it into a meaningful sentence, avoiding the hard problem of segmenting sign language sentences directly. With the help of Kinect, a motion-sensing device, a self-adaptive key frame extraction algorithm based on the trajectory of sign language was developed. Each key frame is then weighted according to its semantic contribution, and the recognition algorithm is designed on these weighted key frames to recognize continuous sign language sentences. Experiments show that the proposed algorithm can recognize continuous sign language sentences in real time.
Judging and Fitting Method for Fractured Freehand Multi-stroke Based on Tolerance Zone
ZHOU Jing and FANG Gui-sheng
Computer Science. 2017, 44 (Z11): 184-188.  doi:10.11896/j.issn.1002-137X.2017.11A.038
In freehand sketches, a single primitive is often drawn as several fractured strokes. To obtain standard sketches and turn fractured strokes into standard primitives, this paper presented a judging and fitting method for fractured freehand multi-strokes based on tolerance zones. Preprocessing transforms all strokes into approximate polylines, categorizes them by type, and stores the turning points of the polylines for later use. The method uses the trend of freehand strokes to construct imaginary strokes, and tolerance zones are established from the stored turning points of the freehand and imaginary strokes. The number of sampling points of the stroke with the smaller minimum bounding rectangle that fall into the tolerance zone of the other stroke and the imaginary one is counted and used to judge whether two strokes form a fractured multi-stroke. Multi-strokes are clustered together, dividing the sketch into a set of stroke groups, and each stroke group is fitted with geometric primitives according to its stroke classification. The examples shown in this paper validate the feasibility of the method and lay the foundation for further sketch recognition.
Segmentation Algorithm of Medical Images Based on Multi-resolution Double Level Set
TANG Wen-jie, ZHU Jia-ming and ZHANG Hui
Computer Science. 2017, 44 (Z11): 189-192.  doi:10.11896/j.issn.1002-137X.2017.11A.039
This paper proposed a novel multiresolution double level set algorithm for medical images that exhibit large intensity inhomogeneities and complicated backgrounds and cannot be segmented completely by traditional level set methods. First, the algorithm obtains a coarse-scale image through wavelet multiscale decomposition. Then it identifies multiple targets by segmenting the decomposition results with an improved double level set model. To handle the effect of intensity inhomogeneities on medical images, a bias fitting term is introduced into the improved double level set model to refine the coarse-scale segmentation result. The experimental results show that the algorithm mitigates the problems of intensity inhomogeneity and complicated background, completely segments medical images containing intensity inhomogeneities and multiple objects, and achieves the expected segmentation results.
Visual Object Tracking Method with Motion Estimation and Scale Estimation
ZHU Hang-jiang, ZHU Fan, PAN Zhen-fu and ZHU Yong-li
Computer Science. 2017, 44 (Z11): 193-198.  doi:10.11896/j.issn.1002-137X.2017.11A.040
Visual tracking has a wide range of applications in fields such as intelligent video surveillance and robot vision. Based on discriminative correlation filters, a visual target tracking method with motion estimation and scale estimation was proposed. First, the method combines a translation kernelized correlation filter with a particle filter to estimate the position of the moving target in each frame; it then applies a scale correlation filter to better adapt to scale changes of the moving object. On top of the traditional KCF tracking algorithm, the method introduces a probabilistic motion state estimation that obtains a more stable target signal and reduces the introduction of background interference, giving stronger anti-jamming ability in complex scenarios. We evaluated the method on a benchmark data set and compared it with current state-of-the-art trackers to verify its efficiency. It adapts well to complex conditions including scale change, illumination change, pose change, partial occlusion, rotation and rapid movement.
HOG Pedestrian Detection Algorithm of Multiple Convolution Feature Fusion
GAO Qi-yu and FANG Hu-sheng
Computer Science. 2017, 44 (Z11): 199-201.  doi:10.11896/j.issn.1002-137X.2017.11A.041
Pedestrian detection is fundamental to various computer vision applications. A typical and effective solution combines the histogram of oriented gradients (HOG) with a support vector machine (SVM). In this paper, we proposed a novel pedestrian detection method that introduces a convolutional neural network (CNN) into the HOG+SVM pipeline to obtain a more comprehensive feature description through the variety of convolution kernels in the CNN. Firstly, shallow features are extracted from the data set by the CNN, exploiting the CNN's invariance to displacement, scale and deformation. Then, strongly correlated features are merged by analyzing the correlation coefficients of the shallow features, and HOG features are extracted from the weakly correlated CNN shallow features. Finally, an SVM completes training and classification. The experimental results show that the proposed method achieves higher accuracy in pedestrian detection than the existing method.
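A minimal HOG+SVM sketch with scikit-image and scikit-learn on random placeholder windows; in the paper the HOG descriptor would instead be computed over the weakly correlated CNN shallow features.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

# Placeholder 128x64 grayscale windows standing in for pedestrian /
# background crops; real data would come from a pedestrian database.
rng = np.random.default_rng(0)
windows = rng.random((40, 128, 64))
labels = np.array([1] * 20 + [0] * 20)        # 1 = pedestrian

# Classical HOG parameters: 9 orientation bins, 8x8-pixel cells,
# 2x2-cell blocks.
features = np.array([
    hog(w, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    for w in windows
])

clf = LinearSVC(C=0.01, max_iter=10000).fit(features, labels)
print("train accuracy:", clf.score(features, labels))
```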
Relationship between PCA and 2DPCA
YAN Rong-hua, PENG Jin-ye and WEN De-sheng
Computer Science. 2017, 44 (Z11): 202-206.  doi:10.11896/j.issn.1002-137X.2017.11A.042
Principal component analysis (PCA) and two-dimensional principal component analysis (2DPCA) are two classical data transformation techniques. Although many scholars have studied them, the relationship between the two methods has not been given. In this paper, we established their relationship: they share the same optimal objective value in their optimization problems. This conclusion was demonstrated through theoretical derivation and extensive experiments on the CMU-PIE and CK+ face databases.
Automatic Recognition Method of Tunnel Disease Based on Convolutional Neural Network for Panoramic Images
TANG Yi-ping, HU Ke-gang and YUAN Gong-ping
Computer Science. 2017, 44 (Z11): 207-211.  doi:10.11896/j.issn.1002-137X.2017.11A.043
Given the current difficulty of obtaining panoramic images of tunnel linings quickly and conveniently, and of detecting various tunnel diseases automatically, this paper put forward a disease recognition method based on convolutional neural networks (CNNs) to address these problems. First, a panoramic vision sensor that can quickly capture a panoramic image of the whole tunnel section was designed. Then, digital image processing, including panoramic image unrolling, image enhancement and binarization, was applied to extract suspected disease regions. Finally, the diseases were classified automatically by a convolutional neural network. The experiments demonstrate that the method effectively simplifies the acquisition of panoramic tunnel images and achieves end-to-end automatic feature extraction, detection and recognition; the recognition rate for diseases in panoramic images exceeds 88%, providing effective technical support for tunnel maintenance and completion acceptance.
Plant Leaf Image Set Classification Approach Based on Non-linear Reconstruction Models
LIU Meng-nan and DU Ji-xiang
Computer Science. 2017, 44 (Z11): 212-216.  doi:10.11896/j.issn.1002-137X.2017.11A.044
In this paper, a plant leaf image set classification approach based on non-linear reconstruction models was proposed. The approach initializes the model parameters by unsupervised pre-training with Gaussian restricted Boltzmann machines (GRBMs); the pre-initialized model is then trained separately on the images of each plant set to learn class-specific models. Finally, classification uses a majority voting strategy based on the minimum reconstruction error under the learnt class-specific models. In addition, to avoid deformation when images are scaled, plant images were normalized in preprocessing, and a k-means-based feature extraction method was used. The experimental results show that the approach can accurately classify plant image sets.
Meanshift Target Tracking Algorithm of Adaptive HLBP Texture Feature
DU Jing-wen, HUANG Shan and YANG Shuang-xiang
Computer Science. 2017, 44 (Z11): 217-220.  doi:10.11896/j.issn.1002-137X.2017.11A.045
Combining an image texture feature extraction method based on the Haar local binary pattern (HLBP), a new target tracking algorithm was proposed and applied within the Meanshift tracking framework. With Visual Studio 2010 and OpenCV 2.4.9 as the experimental platform, we compared the new algorithm with two others: the traditional Meanshift tracking algorithm and a tracking algorithm based on local binary pattern (LBP) texture features. Experimental results show that, against both simple and complicated backgrounds, the proposed approach tracks steadily and accurately, and it can correctly track the target under partial occlusion.
Pedestrian Detection Based on DCT of Multi-channel Feature
LIU Chun-yang, WU Ze-min, HU Lei and LIU Xi
Computer Science. 2017, 44 (Z11): 221-224.  doi:10.11896/j.issn.1002-137X.2017.11A.046
In pedestrian detection, multi-channel feature detectors do not exploit features completely. In this paper, a multi-channel feature detection algorithm based on the discrete cosine transform (DCT) was proposed. We used a two-layer convolution network to arrange the DCT-transformed image information into a new frequency-domain channel that can describe complex pedestrian textures. Combining the histogram of gradients and color space features with the DCT frequency-domain channel, a low-cost multi-channel pedestrian detector was trained with the Adaboost algorithm. Experiments on popular pedestrian databases show that the proposed method improves detection accuracy, and the effect is remarkable at low false positives per image.
Disparity Estimation Algorithm Based on Frequency Sensitive Three-dimensional Self-organizing Map
REN Yun, CHENG Fu-lin and LI Hong-song
Computer Science. 2017, 44 (Z11): 225-227.  doi:10.11896/j.issn.1002-137X.2017.11A.047
A disparity estimation algorithm based on a frequency-sensitive three-dimensional self-organizing map was presented in this paper. It uses a disparity pattern recognition (DPR) algorithm based on a classified frequency-sensitive three-dimensional self-organizing map (CFS-3DSOM). The main idea is to predict disparity separately for the low-light and high-light areas of disparity images and to add a frequency-sensitive mechanism when training the pattern library. Experimental results show that the average peak signal-to-noise ratio (PSNR) of the disparity prediction images of the CFS-3DSOM-DPR algorithm increases by 1.78 dB compared with block-based disparity estimation.
Unsupervised Complex-scene Clothing Image Segmentation Algorithm Based on Color and Texture Features
GUO Xin-peng, HUANG Yuan-yuan and HU Zuo-jin
Computer Science. 2017, 44 (Z11): 228-232.  doi:10.11896/j.issn.1002-137X.2017.11A.048
In this paper, an unsupervised segmentation algorithm for clothing images with complex backgrounds was proposed. Based on prior knowledge, it combines color and texture features. First, the idea of block truncation coding is used to expand the traditional 3-dimensional color space into a 6-dimensional space, so that finer color features can be obtained. Then a texture feature based on an improved local binary pattern (LBP) algorithm is designed and used, together with the color feature, to describe the image. After that, according to the statistical distribution of the target region and background in the image, i.e., the prior knowledge, a bisection method is proposed for segmentation. Since the image is divided into several sub-image blocks, this bisection segmentation is accomplished efficiently. Experiments show that the algorithm can quickly and effectively extract the clothing region from complex scenes without any manual parameters, which is important for image understanding and retrieval.
Video-based Nighttime Vehicle Detection and Tracking Algorithm
DONG Tian-yang, ZHU Hao-nan and WANG Hao
Computer Science. 2017, 44 (Z11): 233-237.  doi:10.11896/j.issn.1002-137X.2017.11A.049
At night, vehicles on highways are difficult to detect because of factors such as poor highway lighting and different types of lights. To solve this problem, a video-based nighttime vehicle detection and tracking algorithm was proposed. Firstly, the algorithm combines OTSU and one-dimensional maximum entropy thresholding to extract vehicle lights and eliminate non-vehicle lights. It then uses the temporal and spatial characteristics of lights to distinguish a car with multiple lights from side-by-side vehicles. Finally, a Kalman filter is used to predict and track the vehicle lights. This paper analyzed the results of the algorithm in three application scenarios: weak light with smooth traffic, normal light with smooth traffic, and normal light with congestion. The experimental results prove that the proposed method has high accuracy and good real-time performance.
Research of Workpiece Defect Detection Method Based on CNN
QIAO Li, ZHAO Er-dun, LIU Jun-jie and CHENG Bin
Computer Science. 2017, 44 (Z11): 238-243.  doi:10.11896/j.issn.1002-137X.2017.11A.050
The application of convolutional neural networks (CNNs) to workpiece defect detection was proposed to detect surface defects and improve product quality. Although a CNN can classify different objects very well, it cannot directly recognize the small defects of a product. This paper presented a method that uses a CNN for defect detection based on the results of a recognition process. Firstly, the defective samples are augmented to overcome the lack of training samples. Then, by observing the output of the recognition CNN, a measure called the "defect distinguish ratio" is defined to quantify the degree of defectiveness; a sample is considered defect-free only when the defect distinguish ratio reaches a certain level. Finally, experiments demonstrate the validity and feasibility of the method, with a defect detection rate of 93.3%.
Drawing Method of Cultural Relics Line Drawing Based on Visual Curvature Estimation
LIU Jun, ZHOU Ming-quan, GENG Guo-hua and SHEN Ying-quan
Computer Science. 2017, 44 (Z11): 244-250.  doi:10.11896/j.issn.1002-137X.2017.11A.051
Abstract PDF(1727KB) ( 896 )   
References | Related Articles | Metrics
In order to solve the problem of effectively extracting the feature lines of a three-dimensional model whose surface contains noise, the reasons for the loss of texture details in the preprocessing stage were discussed, and a new method of cultural relic line drawing based on visual curvature estimation was proposed. First, the discrete curvature at each triangular mesh vertex is estimated, and the cultural relic model is divided into flat and non-flat regions according to the curvature distribution. Then a sharpening filter is applied to the vertices of the non-flat region to calculate new mesh vertex coordinates. Finally, based on the visual curvature, the curved contour lines are extracted to achieve automatic line drawing of the triangular mesh model of cultural relics. The experimental results show that the non-flat contour lines based on visual curvature retain the details of the surface texture of the 3D model and avoid the over-sharpened artifacts of line drawings based directly on ridge and valley lines.
Simplifying 3D Models and Collision Detection on Smartphones
SHEN Ying, WANG Hui, WANG Li-hui and WU Qing-qing
Computer Science. 2017, 44 (Z11): 251-256.  doi:10.11896/j.issn.1002-137X.2017.11A.052
Abstract PDF(1581KB) ( 790 )   
References | Related Articles | Metrics
The enormous model data and complex morphology involved in 3D rendering and roaming on mobile platforms make it very difficult to render three-dimensional scenes clearly and quickly. In order to speed up model rendering, an optimized rendering algorithm was proposed. 3D models are simplified by quadric error metrics half-edge collapse, and an octree is used to read, organize and quickly draw 3D models by rapidly culling the parts of the scene that are unnecessary for rendering. Considering the limitations of screen size and computing power on mobile devices, a collision detection algorithm for mobile platforms was also put forward. Experimental results show that this method can effectively simplify models and reduce the amount of computation. Thus, it can be applied to realistic fast rendering of 3D scenes.
Method of Vehicle Type Recognition Based on Improved Harris Corner Detection
ZHANG Tong and ZHANG Ping
Computer Science. 2017, 44 (Z11): 257-259.  doi:10.11896/j.issn.1002-137X.2017.11A.053
Abstract PDF(1369KB) ( 802 )   
References | Related Articles | Metrics
This paper used corner information as feature points in the front image of a vehicle to recognize the type to which the vehicle belongs. We improved the Harris corner detection method by adding the ratio of the eigenvalues, which represent different directions, to the corner response, so that more corners are detected. Because the position of the vehicle in the video is changeable, the center of the license plate is defined as a stable reference point, and the vehicle type is recognized by the ratio of matching corners. The test results verify the effectiveness of this method.
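A hedged sketch of the improved response: the standard Harris measure computed from the structure tensor, plus an eigenvalue-ratio term as the abstract describes. The weighting of the ratio term and the detection threshold are assumptions.

```python
# Sketch of a Harris-style response augmented with the eigenvalue ratio;
# the alpha weighting and threshold below are assumptions.
import numpy as np
import cv2

def harris_with_ratio(gray, k=0.04, alpha=0.1, sigma=1.5):
    Ix = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    Iy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    # Structure tensor entries, smoothed over a local window.
    Sxx = cv2.GaussianBlur(Ix * Ix, (0, 0), sigma)
    Syy = cv2.GaussianBlur(Iy * Iy, (0, 0), sigma)
    Sxy = cv2.GaussianBlur(Ix * Iy, (0, 0), sigma)
    det = Sxx * Syy - Sxy ** 2
    tr = Sxx + Syy
    # Closed-form eigenvalues of the 2x2 structure tensor.
    disc = np.sqrt(np.maximum((Sxx - Syy) ** 2 + 4 * Sxy ** 2, 0))
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    ratio = np.minimum(lam1, lam2) / (np.maximum(lam1, lam2) + 1e-12)
    return det - k * tr ** 2 + alpha * ratio  # ratio term added to the response

gray = np.random.rand(240, 320)               # stand-in front-of-vehicle image
corners = harris_with_ratio(gray) > 0.01      # threshold is an assumption
```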
Color Space Selection of Non-contact Heart Rate Measurement
CAO Jian-jian, FENG Jun, TANG Wen-ming and YU Ying
Computer Science. 2017, 44 (Z11): 260-262.  doi:10.11896/j.issn.1002-137X.2017.11A.054
Abstract PDF(1210KB) ( 852 )   
References | Related Articles | Metrics
Compared with traditional contact measurement methods, non-contact heart rate detection is more convenient and comfortable. Based on the non-contact heart rate measurement method of image photoplethysmography (iPPG), this paper briefly analyzed the mechanism of the method and the measurement error associated with the choice of color space. The experiment compares the measurement error across different color spaces, which proves the necessity of selecting the color space during non-contact heart rate measurement; an optimal color space can be used to improve the accuracy of heart rate measurement. The experimental results show that heart rate measurement in the RGB color space has the smallest error compared with the heart rate obtained simultaneously by an electrocardiograph (ECG). The average error of the experiment is 1.68 bpm. Thus, choosing the RGB color space in non-contact heart rate measurement yields higher accuracy.
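The iPPG measurement itself reduces to a small amount of signal processing; a minimal sketch, assuming the heart rate is read off the dominant FFT peak of the mean ROI color signal within a plausible frequency band:

```python
# Minimal iPPG sketch: average a facial ROI per frame in each RGB channel, then
# read the heart rate off the dominant FFT peak; all parameters are assumptions.
import numpy as np

def heart_rate_bpm(frames, fps):
    """frames: list of HxWx3 ROI arrays sampled at `fps` frames per second."""
    signal = np.array([f.reshape(-1, 3).mean(axis=0) for f in frames])  # T x 3
    signal -= signal.mean(axis=0)                   # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal, axis=0))
    freqs = np.fft.rfftfreq(len(frames), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)          # 42-240 bpm plausibility band
    peak = spectrum[band].sum(axis=1).argmax()      # combine the three channels
    return 60.0 * freqs[band][peak]

frames = [np.random.rand(40, 40, 3) for _ in range(256)]  # placeholder ROI clip
bpm = heart_rate_bpm(frames, fps=30)
```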
Two-step Face Recognition Based on Factorization Models
CHENG Zai-he
Computer Science. 2017, 44 (Z11): 263-266.  doi:10.11896/j.issn.1002-137X.2017.11A.055
Abstract PDF(1402KB) ( 510 )   
References | Related Articles | Metrics
To overcome the effect of facial expression and pose on face recognition, Xu proposed using the original and “symmetrical face” training samples to perform representation-based two-step face recognition. However, with changes of facial pose, the symmetrical face samples generated by mirror imaging are usually poor, which may affect the accuracy of recognition. A two-step face recognition algorithm based on a factorization model was proposed. Firstly, to reduce the effect of illumination, pose and other factors, the factorization model is used to separate the controllable “expression factor” and “identity factor” from the face images to construct new image samples. Meanwhile, the two-step face recognition method is adopted in the recognition stage, which makes full use of the advantage of score-level fusion and further improves the recognition rate of the proposed algorithm.
Application of Improved 2DPCA Algorithm in Face Recognition
FENG Fei, JIANG Bao-hua, LIU Pei-xue and CHEN Yu-jie
Computer Science. 2017, 44 (Z11): 267-268.  doi:10.11896/j.issn.1002-137X.2017.11A.056
Abstract PDF(1181KB) ( 571 )   
References | Related Articles | Metrics
With the application of two-dimensional principal component analysis (2DPCA) in face recognition, many analysis methods based on 2D features are becoming popular. Compared with the PCA algorithm, which extracts features from vectors, the 2DPCA algorithm extracts features directly from matrices. Unlike methods that depend on individual columns or the whole eigenmatrix, we proposed an algorithm based on a distance measure between characteristic matrices, combined with the KNN algorithm. This method alleviates the shortcoming of the 2DPCA algorithm, which requires more coefficients than PCA-based algorithms. Experimental results on face databases show that the recognition accuracy of the proposed method increases, and its performance is better than other methods in terms of accuracy and storage requirements.
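For reference, a minimal sketch of the plain 2DPCA projection that the paper builds on (the proposed matrix distance and KNN combination would then operate on the resulting feature matrices):

```python
# Sketch of baseline 2DPCA feature extraction: project image matrices onto the
# top eigenvectors of the image scatter (covariance) matrix.
import numpy as np

def twodpca_projection(images, d):
    """images: N x h x w array; returns N projected feature matrices (h x d)."""
    mean = images.mean(axis=0)
    G = np.zeros((images.shape[2], images.shape[2]))
    for A in images:
        D = A - mean
        G += D.T @ D                 # accumulate the image scatter matrix
    G /= len(images)
    vals, vecs = np.linalg.eigh(G)
    W = vecs[:, np.argsort(vals)[::-1][:d]]   # top-d eigenvectors
    return images @ W                # each image A is mapped to A @ W

images = np.random.rand(20, 32, 32)  # placeholder face images
feats = twodpca_projection(images, d=8)
# A matrix-distance KNN (as in the paper) could then compare feature matrices,
# e.g. with np.linalg.norm(F1 - F2) as one simple distance choice.
```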
Interactive Image Co-segmentation with Saliency Constraint
WANG Yi, XU Wen-di, YU Hui-bin, ZHENG He-rong and PAN Xiang
Computer Science. 2017, 44 (Z11): 269-272.  doi:10.11896/j.issn.1002-137X.2017.11A.057
Abstract PDF(1412KB) ( 583 )   
References | Related Articles | Metrics
Aiming at the problem that the similarity of common objects cannot be calculated accurately because of interference from the background region, we proposed an image co-segmentation algorithm based on image saliency and the SIFT flow image alignment algorithm. Firstly, the algorithm calculates the image saliency features. Secondly, images are aligned with the user-specified image by SIFT flow, so the label probabilities given by image saliency and matching can be obtained. Finally, the segmentation boundary is refined by the minimum cut algorithm. Experimental analysis shows that the algorithm can obtain better segmentation quality.
Research on Measurement of Moving Objects’ Velocity Based on Single Hand-eye Vision
BA Quan-ke, FU Cheng-hua, AI Xi-ju and LI Yun
Computer Science. 2017, 44 (Z11): 273-275.  doi:10.11896/j.issn.1002-137X.2017.11A.058
Abstract PDF(1316KB) ( 2853 )   
References | Related Articles | Metrics
In order to accurately measure the velocity of a moving object in real time, this paper presented a method for measuring the velocity of moving objects based on single hand-eye vision. In this method, a portable camera is used to capture the moving ball, OpenCV on the VS2010 platform is used as the development environment, and moving object detection based on the frame difference method is realized. The velocity measurement algorithm in this paper can detect both the horizontal and the vertical velocity. Experiments show that this method can effectively measure the velocity of moving objects with high accuracy.
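A minimal sketch of the detection-and-speed chain, assuming a centroid-based displacement estimate and a known pixel-to-metre scale (both assumptions; the paper's calibration is not reproduced here):

```python
# Sketch of frame-difference detection plus a centroid-based speed estimate.
import cv2
import numpy as np

def centroid(prev_gray, cur_gray, thresh=25):
    """Centroid of the moving region in the difference of two grayscale frames."""
    diff = cv2.absdiff(cur_gray, prev_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m['m00'] == 0:
        return None
    return np.array([m['m10'] / m['m00'], m['m01'] / m['m00']])

def speed_mps(f0, f1, f2, fps, metres_per_pixel):
    """Horizontal/vertical speed from three consecutive grayscale frames."""
    c01, c12 = centroid(f0, f1), centroid(f1, f2)
    if c01 is None or c12 is None:
        return None
    return (c12 - c01) * metres_per_pixel * fps  # (vx, vy) in metres per second

f0, f1, f2 = (np.random.randint(0, 256, (240, 320), dtype=np.uint8) for _ in range(3))
v = speed_mps(f0, f1, f2, fps=30, metres_per_pixel=0.005)  # scale is an assumption
```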
Analysis and Optimization of Boot-up Performance for Railway Real-time Ethernet Switch
SHE Lei, ZHAO Xi-bin, CHEN Yu, SHI He-yuan and WEI Kang
Computer Science. 2017, 44 (Z11): 276-280.  doi:10.11896/j.issn.1002-137X.2017.11A.059
Abstract PDF(1331KB) ( 604 )   
References | Related Articles | Metrics
As an important component of the next generation of train communication networks, the railway real-time Ethernet switch ensures the exchange and transmission of status and control information. The boot-up of the switch, based on the embedded Linux system, includes BootLoader starting, kernel image loading, Linux kernel boot and user space initialization. The boot-up time directly affects the performance of the switch or even the entire train communication network. The boot-up process of the embedded Linux system and the mounting cost of various file systems, e.g. JFFS2 and UBIFS, were studied. A strategy for each process in boot-up was proposed to reduce the time cost. By reducing the kernel, replacing the file system and optimizing boot-up parameters, the performance of system boot-up was significantly improved. Experiments show that the boot-up time decreases from 26.69 seconds to 7.15 seconds, a reduction of 73.2%.
Layer-step Data Collection Scheme for RF Harvesting Wireless Sensor Network
TIAN Xian-zhong and LIN Chu-chao
Computer Science. 2017, 44 (Z11): 281-285.  doi:10.11896/j.issn.1002-137X.2017.11A.060
Abstract PDF(1414KB) ( 403 )   
References | Related Articles | Metrics
In RF harvesting wireless sensor networks, adopting multi-hop transmission to send data to the Sink can reduce the communication distance between nodes and thus reduce energy consumption, but it increases the amount of data sent and received by relay nodes. In order to reduce the network load of relay nodes and prolong the network lifetime, a layer-step data collection (LSDC) approach was proposed. By modeling the processes of energy harvesting and energy consumption, this paper obtained the residual energy of the nodes. Through the use of multiple “hop” commands, sensor nodes are notified to transfer the collected data to the Sink step by step through the network. The node with the best energy status is chosen as the relay node, and ultimately all the data in the network are collected and processed by the Sink efficiently. Theoretical analysis and simulation tests show that our algorithm uses energy more efficiently, prolongs the network lifetime, and balances energy consumption better than the layer-by-layer data collection (LDC) approach.
Research on Deployment Decision Method of Virtual Application Network in Multi-cloud Environment
ZHU Hua-min, WU Li-fa and ZHAO Peng
Computer Science. 2017, 44 (Z11): 286-292.  doi:10.11896/j.issn.1002-137X.2017.11A.061
Abstract PDF(1605KB) ( 448 )   
References | Related Articles | Metrics
In a multi-cloud environment, users can freely combine infrastructure resources to deploy a virtual application network based on virtual application technology and infrastructure virtualization technology, and thus quickly build a distributed application system with certain business functions. Regarding the shortcomings of existing multi-cloud deployment decision-making methods in deployment description and in addressing users' multi-objective requirements, this paper first presented a deployment description method for the virtual application network based on the Open Virtualization Format document. Secondly, common service quality metrics of infrastructure resource combinations were studied, and a multi-objective optimization model of the combination was defined. Then, the second-generation non-dominated sorting genetic algorithm (NSGA-II) and the multi-objective particle swarm optimization (MOPSO) algorithm were used to solve the model. Finally, a fuzzy decision-making method was designed to select the final satisfactory solution from the obtained results. Statistics from multiple experiments show that both algorithms can achieve good convergence within a reasonable time, that NSGA-II is only suitable for scenarios with 2 to 3 objectives while MOPSO can be used for scenarios with more objectives and performs better than NSGA-II, and that the fuzzy decision-making result matches the user's objective preference best.
Data Filtration Method for RFID Based Indoor RTLS
GUAN Yang, YAN Guo-yu, WANG Ying and JIANG Sui-ping
Computer Science. 2017, 44 (Z11): 293-296.  doi:10.11896/j.issn.1002-137X.2017.11A.062
Abstract PDF(1294KB) ( 618 )   
References | Related Articles | Metrics
In some RFID-based indoor real-time locating applications, because of the complexity of the scene and the limited accuracy of the RTLS, there are abnormalities in the object trails reported by the RTLS, such as objects going through walls. An efficient data filtration method was presented to avoid such abnormalities. First, the scene is mapped into a discrete model represented by a “gray image”. Each pixel of the image represents a point of the scene, and the grayscale of each pixel encodes the different regions/rooms, walls and doors. During real-time locating, two consecutive scene coordinates of the object are mapped to two pixels. The grayscales of the pixels on the segment connecting these two pixels are used to judge whether there is an abnormality, and the object's coordinates in the image are rectified accordingly. Finally, the rectified coordinates are mapped back to scene coordinates. Experimental results show that our method is fast and efficient, and can avoid such abnormalities.
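A minimal sketch of the segment check, assuming a specific grayscale code for walls; the rectification policy (e.g., snapping back to the last free pixel) is only indicated in a comment:

```python
# Sketch of the trail check: sample the grayscale map along the segment between
# two consecutive positions and flag wall pixels; the grayscale code is assumed.
import numpy as np

WALL = 0  # assumed grayscale code for wall pixels in the scene image

def crosses_wall(gray_map, p0, p1):
    """p0, p1: (row, col) pixel coordinates of consecutive reported positions."""
    n = int(max(abs(p1[0] - p0[0]), abs(p1[1] - p0[1]))) + 1
    rows = np.linspace(p0[0], p1[0], n).round().astype(int)
    cols = np.linspace(p0[1], p1[1], n).round().astype(int)
    return np.any(gray_map[rows, cols] == WALL)

gray_map = np.full((100, 100), 255, dtype=np.uint8)  # one free room...
gray_map[:, 50] = WALL                               # ...split by a wall
print(crosses_wall(gray_map, (10, 10), (10, 90)))    # True: trail goes through it
# On a hit, the new coordinate would be rejected or rectified, e.g. snapped
# back to the last pixel before the wall.
```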
Indoor Localization Based on Map Information and Particle Filter with Position Adaptive Correction
HUAN Ruo-hong and CHEN Yue
Computer Science. 2017, 44 (Z11): 297-301.  doi:10.11896/j.issn.1002-137X.2017.11A.063
Abstract PDF(1466KB) ( 521 )   
References | Related Articles | Metrics
As existing indoor localization algorithms based on dead reckoning have the disadvantages of high cumulative error and low localization accuracy, an indoor localization approach based on map information and a particle filter with position-adaptive correction was proposed in this paper. The approach uses the known map information to control the birth and death of the particles during the localization process, and adaptively adjusts the positions of the compensating particles in the resampling stage according to the degree of particle degeneracy, thereby correcting the object position. The experimental results show that the proposed approach overcomes the cumulative error of the dead reckoning algorithm and improves the localization accuracy.
Improved Vegas Algorithm over LEO Satellite Network
WEI De-bin, TAO Shun-li, SHI Huai-feng and LIAO De-lin
Computer Science. 2017, 44 (Z11): 302-307.  doi:10.11896/j.issn.1002-137X.2017.11A.064
Abstract PDF(1577KB) ( 594 )   
References | Related Articles | Metrics
Aiming at the throughput degradation caused by the Vegas algorithm of SCPS-TP (Space Communications Protocol Standards Transport Protocol) in LEO (Low Earth Orbit) satellite networks, an adaptive congestion control algorithm named Vegas-AD (Adaptive) was proposed. Based on an analysis of Vegas, the algorithm improves the calculation of RTT (Round-Trip Time) so that the congestion window can be adjusted more accurately, and optimizes the congestion window growth strategy, which improves the bandwidth competitiveness of the congestion avoidance phase. At the same time, a factor which adaptively adjusts the value of the congestion window according to the degree of network congestion is put forward. The results show that the bandwidth competitiveness of the Vegas-AD algorithm is significantly higher than that of Vegas, and the new algorithm can greatly improve the network throughput.
M-layers Binary Graph Model for Interconnection Networks
SHI Hai-zhong and SHI Yue
Computer Science. 2017, 44 (Z11): 308-311.  doi:10.11896/j.issn.1002-137X.2017.11A.065
Abstract PDF(1246KB) ( 452 )   
References | Related Articles | Metrics
The hypercube, crossed cube, Möbius cube and folded cube are all famous interconnection networks. They have a common weakness: the node degree in the hypercube (crossed cube, Möbius cube or folded cube) increases with the network scale (number of nodes). This means that the scalability of a supercomputer whose interconnection network is a hypercube is weak. Can we design interconnection networks that have the features of the hypercube but a fixed node degree? In this paper, we proposed an m-layers binary graph model for interconnection networks. From the model, we designed new interconnection networks: the m-layers hypercube, m-layers crossed cube, m-layers Möbius cube and m-layers folded cube. In particular, the m-layers hypercube has the feature that its node degree does not increase with the network scale, while it retains almost all features of the hypercube. In addition, we proposed the concept of the m-layers graph generated by a given graph.
Research on Wear Leveling Algorithm of NAND FLASH Memory Based on Greedy Strategy
JIA Xin and ZHANG Shao-ping
Computer Science. 2017, 44 (Z11): 312-316.  doi:10.11896/j.issn.1002-137X.2017.11A.066
Abstract PDF(1420KB) ( 1205 )   
References | Related Articles | Metrics
NAND FLASH memory is the storage device of sensor nodes in wireless sensor networks. Sensor nodes continuously acquire data from the monitoring area and exchange data between nodes. Frequent write operations on NAND FLASH memory make the erase counts of the physical blocks unbalanced and reduce the memory's lifetime; accordingly, this affects the lifetime of the whole sensor network. In this paper, a partition address mapping wear leveling algorithm was proposed to solve this problem. In this algorithm, when data is written into memory, a new physical block for the write operation is selected according to a wear leveling measurement computed with a greedy method. The configuration and experimental data are exchanged between the old physical block and the new physical block, and the old physical block then enters a wait state. Software testing proves that the algorithm proposed in this paper can effectively optimize the wear leveling of NAND FLASH memory.
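A minimal sketch of the greedy selection step, using the raw erase count as a stand-in for the paper's wear-leveling measurement:

```python
# Greedy selection sketch: pick the free block whose (assumed) wear measure is
# lowest before each write.
def pick_block(free_blocks, erase_counts):
    """Return the free block id minimizing the wear measure (erase count here)."""
    return min(free_blocks, key=lambda b: erase_counts[b])

def write_page(data, free_blocks, erase_counts, flash):
    new = pick_block(free_blocks, erase_counts)
    flash[new] = data          # copy configuration/data into the new block
    erase_counts[new] += 1     # account for the erase before writing
    free_blocks.remove(new)    # the old block would now enter the wait state
    return new

flash, counts, free = {}, {0: 5, 1: 2, 2: 9, 3: 2}, [0, 1, 2, 3]
blk = write_page(b'sensor-data', free, counts, flash)  # picks block 1 (count 2)
```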
Android Static Analysis System Based on Signature and Data Flow Pattern Mining
NING Zhuo, SHAO Da-cheng, CHEN Yong and SUN Zhi-xin
Computer Science. 2017, 44 (Z11): 317-321.  doi:10.11896/j.issn.1002-137X.2017.11A.067
Abstract PDF(1290KB) ( 675 )   
References | Related Articles | Metrics
With the improvement of Android malware's resistance to detection, traditional static analysis faces several problems: signature analysis has a high analysis speed but suffers from repackaging and code obfuscation, while data flow analysis is preferred for its high accuracy but criticized for its high resource cost. To deal with these problems, a new static analysis system was proposed that combines an improved multi-signature analysis with a data flow mining method to find a balance between accuracy and efficiency. Not only is the multi-signature analysis improved by using class-level and method-level signatures, but frequent data flow patterns are also mined from malware to avoid manual detection. The results show that the system has better capability in handling repackaging and code obfuscation, and the overall detection accuracy approaches 88%.
Data Leakage Oriented Testing Method for Web Sandbox
SUN Ya-jing, ZHAO Xu, YAN Xue-xiong and WANG Qing-xian
Computer Science. 2017, 44 (Z11): 322-328.  doi:10.11896/j.issn.1002-137X.2017.11A.068
Abstract PDF(1500KB) ( 596 )   
References | Related Articles | Metrics
Data leakage is an important cause of Web sandbox escape, namely, unauthorized programs accessing sensitive data of the system. Existing security analysis methods for Web applications are not applicable to detecting data leakage of Web sandboxes. In this paper, a Web sandbox test method was proposed to detect Web sandbox data leakage. Based on the JavaScript object model, the method first uses a depth-first strategy to traverse the native objects of the browser and obtains the collection of directly accessible objects. Then, a sensitive-point-oriented test algorithm for encapsulated objects is designed to obtain the collection of indirectly accessible objects. Next, a data leakage test algorithm for multiple applications is designed to obtain the possible communication paths between programs. Finally, the test results are compared with the specification of the tested Web sandbox to detect data leakage. This paper designed and implemented a Web sandbox test system (WSTS), and tested different versions of ADsafe. The experimental results show that the method has good ability to detect data leakage of Web sandboxes.
Obfuscation-based Broadcasting Multi-signature Scheme
LI Lei, JIA Hui-wen, BAN Xue-hua and HE Yu-fan
Computer Science. 2017, 44 (Z11): 329-333.  doi:10.11896/j.issn.1002-137X.2017.11A.069
Abstract PDF(1463KB) ( 837 )   
References | Related Articles | Metrics
Based on indistinguishability obfuscation, this paper proposed a new broadcast multi-signature scheme. By using the obfuscated verification circuit as the verification key, both the partial signatures and the multi-signature can be verified. In this scheme, after each signer generates a partial signature, the multi-signature is obtained by modulating the partial signatures collected by the signature collector. The length of the partial signatures, the complexity of the signing algorithm and the complexity of the verification algorithm are independent of the number of signers. The scheme achieves resistance to external attack, unforgeability and non-repudiation, and its security is proved in the random oracle model.
System Design of Firewall Based on Deep Packet Inspection
LU Qi, HUANG Zhi-ping and LU Jia-qi
Computer Science. 2017, 44 (Z11): 334-337.  doi:10.11896/j.issn.1002-137X.2017.11A.070
Abstract PDF(1336KB) ( 1470 )   
References | Related Articles | Metrics
With the rapid development of the Internet, the firewall, as an important means of network security, has become a focus of research. In order to effectively filter irrelevant data packets, resist malicious attacks, and ensure the safe and stable operation of the network, a firewall system based on a field programmable gate array (FPGA) and ternary content addressable memory (TCAM) was presented on the basis of research on deep packet inspection (DPI) technology. The test results show that the designed firewall system based on deep packet inspection technology can meet the actual requirements.
Improved Method of Computer Virus Signature Automatic Extraction Based on N-Gram
YANG Yan and JIANG Guo-ping
Computer Science. 2017, 44 (Z11): 338-341.  doi:10.11896/j.issn.1002-137X.2017.11A.071
Abstract PDF(1393KB) ( 1070 )   
References | Related Articles | Metrics
With the rapid development of computer technology, the security threats brought by computer viruses have become more and more serious. The traditional N-Gram algorithm has difficulty capturing byte sequences of different lengths, leading to a lack of effective signatures, the generation of huge signature sets, and a waste of storage space. Instead of using the fixed-length N-Gram features of the traditional approach, an improved computer virus signature automatic extraction algorithm based on variable-length N-Grams was proposed to solve these problems. It extracts effective signatures to generate variable-length virus signatures. Taking the correlation of signature frequency into account, the algorithm uses signature concentration to extract the N-Gram features of malware samples and generates a data dictionary to save storage space. In the experimental results, compared with the traditional algorithm using fixed-length N-Gram features, the proposed method effectively decreases the false rate of signature extraction.
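A rough sketch of variable-length N-Gram candidate extraction with a frequency filter, using a simple maximality rule as a stand-in for the paper's signature-concentration criterion (lengths and thresholds are assumptions):

```python
# Sketch: extract variable-length byte N-Grams, keep frequent ones, and drop
# grams subsumed by a longer frequent gram, forming a small data dictionary.
from collections import Counter

def variable_ngrams(byte_seq, n_min=2, n_max=6, min_count=3):
    counts = Counter()
    for n in range(n_min, n_max + 1):
        for i in range(len(byte_seq) - n + 1):
            counts[bytes(byte_seq[i:i + n])] += 1
    frequent = {g: c for g, c in counts.items() if c >= min_count}
    # Keep only maximal grams: discard any gram contained in a longer one.
    return {g: c for g, c in frequent.items()
            if not any(g != h and g in h for h in frequent)}

dictionary = variable_ngrams(b'\x90\x90\xeb\xfe' * 3)  # toy sample bytes
```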
Attribute-based Searchable Encryption of Electronic Medical Records in Cloud Computing
LI Xiao-rong, SONG Zi-ye, REN Jing-yi, XU Lei and XU Chun-gen
Computer Science. 2017, 44 (Z11): 342-347.  doi:10.11896/j.issn.1002-137X.2017.11A.072
Abstract PDF(1551KB) ( 943 )   
References | Related Articles | Metrics
The electronic medical record is an important product of medical reporting under the rapid development of cloud computing technology, and it makes it convenient for hospitals and patients to manage medical reports. However, if the private data of patients are stored in the cloud, they inevitably face potential hazards such as privacy disclosure and illegal access. To protect the privacy of electronic medical records stored in the cloud, an attribute-based searchable encryption scheme was proposed, and its important application in the electronic medical record system was given. Compared with traditional searchable encryption schemes, this scheme does not need a secure channel for trapdoor transmission, and the difficulty of key management for multiple users is reduced. Moreover, the scheme can hide the access structure and has fine-grained access control to add and revoke access rights of users according to the requests of data owners. Security analysis shows that the scheme can not only protect the privacy of keywords, but also effectively resist keyword guessing attacks and prevent the private data from leaking out. The keyword-based trapdoor matching algorithm requires only one bilinear pairing computation, which greatly improves the search efficiency.
MacDroid:A Lightweight Kernel-level Mandatory Access Control Framework for Android
LI Ni-ge, MA Yuan-yuan, CHEN Mu, CHEN Lu and XU Min
Computer Science. 2017, 44 (Z11): 353-356.  doi:10.11896/j.issn.1002-137X.2017.11A.074
Abstract PDF(1368KB) ( 572 )   
References | Related Articles | Metrics
The smart terminal has become an important information processing platform in the mobile Internet era, and its security threats are becoming more and more serious. The security protection architecture of traditional computers cannot meet the special needs of smart terminal security protection. By analyzing the characteristics and layers of the smart terminal operating system, a lightweight kernel-level mandatory access control framework (MacDroid) was designed. The key issues of MacDroid security policy definition, security policy compilation and security policy enforcement were studied in depth in this paper. The MacDroid security policy description language (PSL) was proposed, and its formal lexical and grammar definitions were given. Finally, the effect of the MacDroid access control framework on the behavior of different layers of intelligent mobile terminals was evaluated. The experimental results show that the MacDroid framework has a good control effect on application-layer, native-layer and kernel-layer malware behavior on Android smart terminals.
Evaluation Model of Vector Map Watermarking Algorithms Based on Set Pair Analysis
YANG Meng
Computer Science. 2017, 44 (Z11): 357-361.  doi:10.11896/j.issn.1002-137X.2017.11A.075
Abstract PDF(1303KB) ( 480 )   
References | Related Articles | Metrics
With the development of geographic information systems (GIS), vector maps have been widely used in all areas of life. As a kind of digital data, vector maps run the risk of illegal copying, tampering, dissemination, etc., which makes the copyright protection of vector maps an increasingly serious problem. Currently, a number of scholars have proposed various watermarking algorithms. However, without a uniform standard, it is very difficult to give a fair assessment of these watermarking algorithms. This paper proposed a more general evaluation model for vector map watermarking by analyzing the existing vector map watermarking algorithms, summarizing the classification of attacks on vector map watermarking, and combining these with the theory of set pair analysis.
Group Secret Key Generation and Analysis in Wireless Network Based on Pairwise-generating Strategies
DAI Dong-ming and WU Xiao-fu
Computer Science. 2017, 44 (Z11): 362-365.  doi:10.11896/j.issn.1002-137X.2017.11A.076
Abstract PDF(1266KB) ( 523 )   
References | Related Articles | Metrics
Secret key establishment between wireless terminals, which utilizes physical layer information of the wireless channel to ensure the security of mobile environments, has received wide attention. However, generating a group secret key among multiple wireless devices for secure communication in real environments remains a challenge. This paper proposed a group secret key establishment scheme based on central-node pairwise key generation under a star topology in wireless networks. Compared with the existing group secret key extraction method using the difference of received signal strength (RSS) among the wireless devices in the group, we gave an explicit analysis and specific numerical results on the achievable group secret key rate of each scheme. The results show that our scheme outperforms RSS-difference-based group key generation in terms of key rate.
Improved Service Scheduling Algorithm for SWIM System
WU Zhi-jun, LIU Zhong and HU Tao-tao
Computer Science. 2017, 44 (Z11): 366-371.  doi:10.11896/j.issn.1002-137X.2017.11A.077
Abstract PDF(1443KB) ( 504 )   
References | Related Articles | Metrics
System wide information management (SWIM) is the infrastructure of the aeronautical cloud, and it is used for the transmission and sharing of information related to the air transportation system (ATS). The function of SWIM is to realize aeronautical cloud storage, and the reliability and survivability of SWIM have a significant impact on the safe operation of the ATS. This paper presented a resilient disaster recovery scheme for the purpose of enhancing the survivability of the SWIM system. The scheme adopts the organizational structure of the Linux virtual server and improves the weighted least-connection (WLC) algorithm applied to Linux virtual service. Related experiments were carried out in a simulation environment, and the experimental results show that the proposed scheme meets the requirements of resilient disaster recovery for the SWIM system and performs better in guaranteeing the reliability and survivability of the SWIM network.
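For reference, the classical WLC rule that the scheme improves dispatches each request to the server minimizing active connections divided by weight; a minimal sketch:

```python
# Sketch of weighted least-connection (WLC) scheduling: pick the server with
# the smallest active_connections / weight ratio.
def wlc_pick(servers):
    """servers: list of dicts with 'name', 'conns', 'weight' (weight > 0)."""
    return min(servers, key=lambda s: s['conns'] / s['weight'])

pool = [{'name': 'swim-a', 'conns': 12, 'weight': 4},
        {'name': 'swim-b', 'conns': 5,  'weight': 1},
        {'name': 'swim-c', 'conns': 7,  'weight': 2}]
target = wlc_pick(pool)  # swim-a: 3.0 beats swim-b: 5.0 and swim-c: 3.5
```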
Method of Safety Evaluation for System Group Based on Grey Clustering
DING Li-tong, FAN Jiu-lun and LIU Yi-xian
Computer Science. 2017, 44 (Z11): 372-376.  doi:10.11896/j.issn.1002-137X.2017.11A.078
Abstract PDF(1384KB) ( 752 )   
References | Related Articles | Metrics
At present, most security assessment work on information systems is based on a single information system, whose security status is evaluated according to each attribute hierarchy; there are few methods to evaluate the security level of information systems at the system group level. According to the security requirements of information systems, a security level evaluation method based on system groups was proposed. The grey clustering analysis method of grey system theory, which divides evaluation objects into several grades by using whitening weight functions, is suitable for the analysis of systems with complex structure, high uncertainty and a lack of effective information. In this paper, we used the whitening weight function to determine the security evaluation level of information systems. Firstly, from the point of view of the information system, a fast and efficient K-means clustering algorithm is used to determine the whitening weight function. Secondly, the grey clustering coefficient of each security grade of each information system is obtained by using the whitening weight function. Finally, the security level of each information system is obtained. The results show that this method can effectively evaluate the security level of an information system group.
Improved Scheme of CP-ABE with Hidden Access Structure
WENG An-xiang and LING Jie
Computer Science. 2017, 44 (Z11): 377-380.  doi:10.11896/j.issn.1002-137X.2017.11A.079
Abstract PDF(1262KB) ( 1234 )   
References | Related Articles | Metrics
In CP-ABE schemes with hidden access structures, the access structure is implicitly embedded in the ciphertext. Users cannot infer the access structure even by trying different attribute combinations, so the information of the encryptor will not be revealed. However, this leads to complicated calculations and low access efficiency. This article proposed an improved scheme, which improves efficiency by reducing the number of bilinear pairing operations by nearly half. Besides, based on the DBDH assumption, we proved the scheme selectively secure against chosen-plaintext attack in the standard model.
Safety-aware Transfer Learning Support Vector Machine
ZHOU Guo-hua, CHAO Hai-jing and SHEN Yan-ping
Computer Science. 2017, 44 (Z11): 381-384.  doi:10.11896/j.issn.1002-137X.2017.11A.080
Abstract PDF(1431KB) ( 477 )   
References | Related Articles | Metrics
Transfer learning is a new machine learning framework which aims at realizing effective learning for a target domain by efficiently transferring existing knowledge from a source domain to the target domain, thus reducing the need and effort to recollect training data. However, a classifier trained on the data of both the source and target domains often performs worse than a classifier trained only on the data of the target domain, a phenomenon known as “negative transfer”. To address this problem, a safety-control mechanism using the knowledge of the labeled data of the target domain was proposed for safe transfer learning. Furthermore, in implementation, based on a recent transfer learning method (TL-SVM), a safety-aware transfer learning support vector machine (STL-SVM) was proposed, which theoretically avoids the “negative transfer” of TL-SVM. Experimental results on artificial datasets and real-world datasets show the effectiveness of the proposed method.
Clustering and Visualization of Social Network Based on User Interests
TANG Ying, ZHONG Nan-jiang, SUN Kang-gao, QIN Da-kang and ZHOU Wei-hua
Computer Science. 2017, 44 (Z11): 385-390.  doi:10.11896/j.issn.1002-137X.2017.11A.081
Abstract PDF(1626KB) ( 1241 )   
References | Related Articles | Metrics
With the development of social networks, it becomes more and more important to extract useful information from them and present valuable knowledge to users intuitively in an interactive visual interface. Clustering, as a crucial method in data mining, offers global data analysis results. Traditional clustering methods for social network data mainly consider the network topological structure, but they do not take user interests into account. In this paper, users are clustered by computing user-interest similarity based on a Bayesian probabilistic model, and an interactive visualization method is designed to present the clustering results. Specifically, we computed feature vectors representing users' interests based on a latent semantic model. Then clusters with different interest characteristics were built from these feature vectors. A suitable number of clusters is determined from heat map visualization results. Finally, we presented an interactive visualization method based on hierarchical bubble charts to support users in exploring the clustering results from the global overview down to local details. We performed experiments and analysis with data crawled from the Douban website. The results validate the effectiveness of our method.
Collaborative Filtering Recommendation Algorithm Combining Clustering and User Preferences
HE Ming, SUN Wang, XIAO Run and LIU Wei-shi
Computer Science. 2017, 44 (Z11): 391-396.  doi:10.11896/j.issn.1002-137X.2017.11A.082
Abstract PDF(1465KB) ( 786 )   
References | Related Articles | Metrics
Collaborative filtering recommendation algorithms use known user preferences to predict items of possible interest, and collaborative filtering is now the most successful and widely used recommendation technique. However, traditional collaborative filtering recommendation algorithms suffer from data sparsity, which results in poor recommendation accuracy. Most current collaborative filtering algorithms use only the user-item rating matrix for data analysis, while the characteristics of item attributes and the users' preferences for them are not considered. To address this issue, a collaborative filtering recommendation algorithm combining clustering and user preferences was proposed in this paper. Firstly, the user preference matrix for item categories is constructed according to the user rating matrix and the item category information. Then the K-Means algorithm is used to cluster the item set, and the nearest-neighbor users corresponding to an unrated item are found based on the user preference matrix. Next, the sparse matrix in each item cluster is filled by the weighted Slope One algorithm combined with item similarity, to alleviate the problem of data sparsity. In addition, user clusters are built based on the user interest matrix. Finally, the user-based collaborative filtering algorithm is employed in each user cluster to predict item ratings on the filled rating matrix. The experimental results show that our algorithm effectively alleviates the sparsity problem of the raw rating matrix and achieves better recommendation quality than other algorithms.
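A minimal sketch of the Slope One filling step used inside each item cluster, in its basic count-weighted form; the paper's version additionally weights deviations by item similarity:

```python
# Minimal weighted Slope One sketch over a plain user -> item -> rating dict.
from collections import defaultdict

def slope_one_fit(ratings):
    dev, cnt = defaultdict(float), defaultdict(int)
    for user_ratings in ratings.values():
        items = list(user_ratings)
        for j in items:
            for i in items:
                if i != j:
                    dev[(j, i)] += user_ratings[j] - user_ratings[i]
                    cnt[(j, i)] += 1
    for k in dev:
        dev[k] /= cnt[k]              # average rating deviation of j over i
    return dev, cnt

def slope_one_predict(user_ratings, j, dev, cnt):
    num = sum((dev[(j, i)] + r) * cnt[(j, i)]
              for i, r in user_ratings.items() if (j, i) in cnt)
    den = sum(cnt[(j, i)] for i in user_ratings if (j, i) in cnt)
    return num / den if den else None

ratings = {'u1': {'a': 5, 'b': 3}, 'u2': {'a': 4, 'b': 2, 'c': 4}}
dev, cnt = slope_one_fit(ratings)
print(slope_one_predict({'a': 5, 'b': 3}, 'c', dev, cnt))  # -> 5.0
```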
Improvement of HDFS Balanced Placement Strategy
YUAN Li-na
Computer Science. 2017, 44 (Z11): 397-399.  doi:10.11896/j.issn.1002-137X.2017.11A.083
Abstract PDF(1269KB) ( 996 )   
References | Related Articles | Metrics
The default data replica placement policy of HDFS is measured using only a single metric based on disk space, so true load balancing of each node cannot be realized. This paper proposed an improved load balancing strategy based on performance across five aspects: disk space, CPU processing power, memory processing power, disk read/write speed, and bandwidth, and a load capacity model was defined. Experimental results indicate that the improved strategy performs better than the default policy.
Resource Recommendation Method Based on Interactive Perception in Exploratory Search
SUN Hai-chun and LI Xin
Computer Science. 2017, 44 (Z11): 400-402.  doi:10.11896/j.issn.1002-137X.2017.11A.084
Abstract PDF(1353KB) ( 572 )   
References | Related Articles | Metrics
Information search technology based on keyword matching is becoming more and more mature, and search engines are expected to become more intelligent. Exploratory search has emerged as a kind of search that satisfies information needs with fuzzy requirements. Some existing methods which recommend information based on semantic relations among keywords can inspire users to find other information related to the current query. However, this kind of random recommendation only suits application scenarios such as users exploring unfamiliar areas. For users who have a concrete search intention, the key issue is how to make computers understand their information requirements based on prior information. This paper proposed a method of information recommendation based on user interaction perception, which includes two basic interaction rules that allow the system to understand users' fuzzy requirements quickly. The example shows that it allows quantitative analysis of effective interactions and can reduce the number of interaction steps between users and systems.
Automatically Selecting Clustering Centers Algorithm Based on Density Peak and Grid
XIA Qing-ya
Computer Science. 2017, 44 (Z11): 403-406.  doi:10.11896/j.issn.1002-137X.2017.11A.085
Abstract PDF(1347KB) ( 1077 )   
References | Related Articles | Metrics
Aiming at the shortcomings of the clustering by fast search and find of density peaks algorithm (DPC), which computes a massive number of distances between point objects, has high computational complexity in the clustering process, and requires the final cluster centers to be selected manually, an improved algorithm that chooses clustering centers automatically based on density peaks and grids (GADPC) was proposed. Firstly, following the idea of the CLIQUE algorithm, all data points are mapped to a grid, and clustering is performed on grid objects rather than point objects, in order to reduce the distance computation and clustering complexity of the DPC algorithm. Secondly, the decision criterion for the number of cluster centers is improved so that cluster centers can be selected automatically and more precisely. Finally, the relative similarity between points inside a grid cell and points in adjacent grid cells is used to handle edge points and noise points well. Comparisons on UEF machine learning synthetic data sets and UCI natural data sets show, by the Rand index, that the clustering quality of the improved algorithm is not lower than that of the DPC and K-means algorithms on large data sets, while it improves the processing efficiency of the DPC algorithm.
Multiple Correlation Analysis and Application of Granular Matrix Based on Big Data
WU Jun and WANG Chun-zhi
Computer Science. 2017, 44 (Z11): 407-410.  doi:10.11896/j.issn.1002-137X.2017.11A.086
Abstract PDF(1283KB) ( 899 )   
References | Related Articles | Metrics
Big data keeps growing, and the key point of big data is data analysis. However, for big data with dynamic and multi-dimensional characteristics, it is difficult for traditional data analysis methods to obtain reliable and accurate analytical results, which presents both an important opportunity and a great challenge for the development of data analysis methods. This work investigates multiple correlation analysis for dynamic big data. The research object is dynamic big data, and the research is based on the theory of granular computing (GrC). We developed the new theoretical idea of the granular matrix in this paper, with the expectation of revealing multiple correlations in dynamic big data. On one hand, the achievements of this research provide a scientific basis for multiple correlation analysis and the revelation of objective laws in the big data area; on the other hand, they also have important implications for the sustainable development of big data.
Research on Text Data Topic Mining and Association Search
ZHU Wei-xing, XU Wei-guang, HE Hong-yue and LI Wen
Computer Science. 2017, 44 (Z11): 411-413.  doi:10.11896/j.issn.1002-137X.2017.11A.087
Abstract PDF(1393KB) ( 1149 )   
References | Related Articles | Metrics
Text data is the most natural way of storing and exchanging information, and text mining technology can discover knowledge patterns hidden in massive text data. This paper studied text data mining and association search technology. Firstly, text information is extracted through text parsing and extraction, word preprocessing and indexing. Then a topic information model based on latent semantic relations is used to mine the hidden topic information in large amounts of text data. Finally, the topic model is used to calculate the relevance of keywords in order to achieve association search, and a prototype system for text data mining and association search was implemented. Topic discovery and association search were performed on the Tancorp dataset, and the process of association search was displayed synchronously with visualizations and Web pages.
Method for Unstructured Data Transformation Based on XML Technology
YANG Jing and ZHOU Shuang-e
Computer Science. 2017, 44 (Z11): 414-417.  doi:10.11896/j.issn.1002-137X.2017.11A.088
Abstract PDF(1238KB) ( 812 )   
References | Related Articles | Metrics
XML, as a semi-structured language, is widely used in converting unstructured information to structured information because of its special characteristic of pre-defined markup. In this work, complicated unstructured data on the network was converted to XML semi-structured data through POI technology, and the semi-structured data was then converted to structured data by parsing the XML file through SAX, which provides convenience for users searching for information. In addition, the efficiencies of parsing XML files through SAX and DOM were compared in this work for the first time. It demonstrates that the parsing efficiency of SAX is higher than that of DOM when parsing the same file, and this gap increases with the size of the XML file.
Research on Evolution of Network Public Opinion Introducing Time Mechanism
ZHENG Bu-qing, ZOU Hong-xia, HU Xin-jie and WANG Zhen
Computer Science. 2017, 44 (Z11): 418-421.  doi:10.11896/j.issn.1002-137X.2017.11A.089
Abstract PDF(1278KB) ( 689 )   
References | Related Articles | Metrics
The rapid development of network public opinion makes the evolution of public opinion a research hotspot, which is of great significance for forecasting public opinion. In this paper, we started from text clustering: for the analysis of public opinion evolution, K-means clustering was carried out over time series to obtain the clustering centers. The word frequency statistics within each cluster were weighted by time, which makes the statistical keywords more representative. Through the analysis of the keywords obtained by time-based clustering and time-series-weighted statistics, the trend of public opinion evolution was obtained. The results show that the method reduces the clustering dimension and noise, improves the accuracy of clustering, and enhances the reliability of the evolution analysis.
Text Similarity Calculation Based on Semantic Dictionary and Word Frequency Information
DONG Yuan and QIAN Li-ping
Computer Science. 2017, 44 (Z11): 422-427.  doi:10.11896/j.issn.1002-137X.2017.11A.090
Abstract PDF(1469KB) ( 749 )   
References | Related Articles | Metrics
Considering the drawbacks of existing methods in semantic understanding and in handling frequent word occurrences, this paper proposed a text similarity algorithm based on a semantic dictionary and word frequency information, referred to as TSSDWFI. The proposed algorithm evaluates the similarity between two texts by calculating the expanded similarity between any two words in the texts and the maximum similarity matching between text words. It adopts a semantic dictionary to calculate similarity between texts, taking into account both the similarity relationship between different words and the frequency of word occurrence in the text. Simulation results show that, compared with existing algorithms, the proposed TSSDWFI algorithm has higher accuracy.
Weighted Least Squares Support Vector Machine Based on Entropy Evaluation
LIU Chang and FAN Bin
Computer Science. 2017, 44 (Z11): 428-431.  doi:10.11896/j.issn.1002-137X.2017.11A.091
Abstract PDF(1303KB) ( 644 )   
References | Related Articles | Metrics
The support vector machine is a machine learning algorithm based on statistical learning theory, which has desirable modeling performance for nonlinear and high-dimensional data, even in the case of small samples. Typically, data with multiple features are normalized because of their different dimensions; however, this ignores the dissimilarity among features. A weighted least squares support vector machine was proposed. According to the entropy evaluation method, feature weights are determined so that the data can be normalized and weighted, and the system model is then established through the least squares support vector machine. The experimental results demonstrate the effectiveness and superiority of the proposed method for systems with multiple features.
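A minimal sketch of the entropy evaluation step, computing per-feature weights from the normalized data before the LS-SVM fit (the sqrt-weighting convention at the end is an assumption):

```python
# Sketch of the entropy weight method: low-entropy (more informative) features
# receive larger weights.
import numpy as np

def entropy_weights(X):
    """X: n_samples x n_features; returns feature weights summing to 1."""
    rng = X.max(axis=0) - X.min(axis=0)
    Xn = (X - X.min(axis=0)) / (rng + 1e-12)       # min-max normalization
    P = Xn / (Xn.sum(axis=0) + 1e-12)              # per-feature distribution
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(len(X))   # entropy of each feature
    d = 1.0 - e                                    # degree of divergence
    return d / d.sum()

X = np.random.rand(50, 4)
w = entropy_weights(X)
Xw = X * np.sqrt(w)  # weight the features before the least squares SVM fit
```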
Technology of Extracting Topical Keyphrases from Chinese Corpora
YANG Yue and ZHANG De-sheng
Computer Science. 2017, 44 (Z11): 432-436.  doi:10.11896/j.issn.1002-137X.2017.11A.092
Abstract PDF(1371KB) ( 1113 )   
References | Related Articles | Metrics
In the big data era, information is exploding, and text is the most popular medium through which people communicate. On the Internet, countless text messages are uploaded and downloaded every day, and extracting keywords is an important way to quickly grasp the content of so much text. However, traditional work on extracting keywords from text corpora ignores two problems: the length of keywords and the topic of the corpus. In this paper, a new algorithm taking both aspects into consideration was proposed. It combines the LDA topic model with a frequent phrase discovery algorithm to generate frequent candidate phrases of different lengths; at the same time, a completeness filter and a ranking function are proposed to filter and rank the candidates. Finally, the real keyphrases are chosen according to the ranked list.
KDE-CGA Algorithm of Structure Learning for Small Sample Data Bayesian Network
XU Jian-rui, LI Zhan-wu and XU An
Computer Science. 2017, 44 (Z11): 437-441.  doi:10.11896/j.issn.1002-137X.2017.11A.093
Abstract PDF(1451KB) ( 971 )   
References | Related Articles | Metrics
In view of learning Bayesian networks under the condition of small sample data, this paper first makes use of kernel density estimation to expand the small-scale sample data, and then adopts a cloud theory-based genetic algorithm to learn the structure of the Bayesian network. In order to improve the effect of data expansion, the paper discussed ways of improving the density function and its window breadth. At the same time, cloud theory was combined with the genetic algorithm: the crossover rate and mutation rate are changed adaptively, avoiding the problem of falling into a local optimum. Simulation results show that the algorithm is effective and practical.
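A minimal sketch of the expansion step using SciPy's Gaussian KDE; the paper's improved density function and window breadth would correspond to custom kernel and bandwidth choices (the `bw_method` argument here):

```python
# Sketch: fit a kernel density estimate on the small sample and resample from
# it before structure learning; bandwidth choice stands in for "window breadth".
import numpy as np
from scipy.stats import gaussian_kde

small = np.random.randn(2, 30)          # 2 variables x 30 cases (placeholder)
kde = gaussian_kde(small)               # default bandwidth rule; tune bw_method
expanded = np.hstack([small, kde.resample(300)])  # 30 real + 300 synthetic cases
```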
UID-DBSCAN Clustering Algorithm of Multi-dimensional Uncertain Data Based on Interval Number
WEI Fang-yuan and HUANG De-cai
Computer Science. 2017, 44 (Z11): 442-447.  doi:10.11896/j.issn.1002-137X.2017.11A.094
Abstract PDF(1519KB) ( 578 )   
References | Related Articles | Metrics
Research on clustering methods for uncertain data has received more and more attention. Among existing methods, the UIDK-means and U-PAM algorithms inherit the defects of partition-based algorithms, which cannot identify clusters of arbitrary shape and are sensitive to noise, while the FDBSCAN algorithm assumes that the probability distribution function or probability density function of the uncertain data is known, although this information is hard to acquire. To address the shortcomings of the above algorithms, a new multi-dimensional uncertain data clustering algorithm based on interval numbers, namely UID-DBSCAN, was proposed. It uses interval numbers combined with statistical information to describe uncertain data reasonably, and utilizes an interval distance function of low computational complexity to measure the similarity between uncertain data objects. The concepts of interval density, interval density-reachability and interval density-connectivity are proposed and applied to expand clusters. Meanwhile, to realize automatic clustering, the density parameters are adaptively selected by combining the statistical features of the data. Experimental results show that the UID-DBSCAN algorithm can identify noise effectively, process clusters of arbitrary shape and obtain better clustering precision with low computational complexity.
Research on Template Extraction Based on Large-scale Network Log
CUI Yuan and ZHANG Zhuo
Computer Science. 2017, 44 (Z11): 448-452.  doi:10.11896/j.issn.1002-137X.2017.11A.095
Abstract PDF(1402KB) ( 1415 )   
References | Related Articles | Metrics
Aiming at the problem of extracting network events directly from large-scale network logs, a template extraction method based on large-scale network logs was proposed. The method automatically converts massive raw network logs into log templates, so as to provide important preparation for understanding the root causes of network events and preventing network failures. Firstly, the structure of the log is analyzed, and the words in a log are divided into two types: template words and parameter words. Then, log template extraction is studied from three different angles. Finally, using actual production data from an Internet company and four different types of messages collected from the service cluster, the Rand index method is used to evaluate the accuracy and validity of the three extraction methods. The results show that the average accuracy of the log templates based on the tag recognition tree model reaches 99.57%, higher than that of the other extraction methods.
Analysis of Astronomical Spectral Data Based on Grid Clustering
CHEN Shu-xin, SUN Wei-min and WANG Li-li
Computer Science. 2017, 44 (Z11): 453-456.  doi:10.11896/j.issn.1002-137X.2017.11A.096
Abstract PDF(1245KB) ( 880 )   
References | Related Articles | Metrics
The efficient analysis and processing of astronomical data is supported by Internet-plus integrated information technology: massive observation data and secondary data have achieved efficient storage, retrieval, data analysis and information mining. Working with the fourth data release (DR4) of flux-calibrated stellar spectra from the survey of the LAMOST telescope, a large scientific facility with China's independent intellectual property rights, and taking the astronomical data released by LAMOST as an example, the RFITSIO package of the R programming platform is used to read and write the spectrum files in the special FITS format. With statistical and data mining methods, a verification scheme of supervised grid clustering was designed by processing and identifying the spectral data. Spectral characteristics were extracted by dimensionality reduction, the characteristics of the absorption spectra were retained by normalizing the continuous spectrum, the center grid of the clustering wavelength scale was divided, and a similarity measure function was used to describe the observed spectra.
K-Means Clustering Algorithm Based on Initial Center Optimization and Feature Weighted
WANG Hong-jie and SHI Yan-wen
Computer Science. 2017, 44 (Z11): 457-459.  doi:10.11896/j.issn.1002-137X.2017.11A.097
Abstract PDF(1299KB) ( 2120 )   
References | Related Articles | Metrics
In order to improve the clustering accuracy of the traditional K-Means clustering algorithm, an improved K-Means clustering algorithm based on initial center optimization and feature weighting was proposed. Firstly, initial feature weights are obtained from the contribution factor of each sample feature to clustering, and a weighted distance metric is constructed. Next, the k initial clustering centers are obtained by the proposed initial clustering center selection method, and initial clustering is performed with the initial feature weights. Then, the feature weights are adjusted according to the clustering accuracy, and the clustering process is performed again. The above process is repeated until the clustering accuracy no longer changes, resulting in the final clustering result. Experimental results on the UCI database show that the algorithm has higher clustering accuracy compared with existing K-Means clustering algorithms.
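A minimal sketch of the weighted assignment step at the core of the algorithm; the initialization and the weight-adjustment loop are only indicated in comments:

```python
# Sketch of the weighted distance metric: a per-feature weight inside the
# squared Euclidean distance used for cluster assignment.
import numpy as np

def weighted_assign(X, centers, w):
    """Assign each sample to the nearest center under feature weights w."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2 * w).sum(axis=2)
    return d2.argmin(axis=1)

X = np.random.rand(100, 3)
w = np.array([0.6, 0.3, 0.1])  # initial weights from contribution factors (assumed)
centers = X[np.random.choice(len(X), 4, replace=False)]  # stand-in initialization
labels = weighted_assign(X, centers, w)
# The full algorithm would re-estimate the centers, adjust w according to the
# clustering accuracy, and repeat until the accuracy stops changing.
```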
Research on Crawler Algorithm for Theme of Books
ZHANG Li-jing, ZENG Qing-tao, LI Ye-li, SUN Hua-yan and ZI Yun-fei
Computer Science. 2017, 44 (Z11): 460-463.  doi:10.11896/j.issn.1002-137X.2017.11A.098
Abstract PDF(1377KB) ( 549 )   
References | Related Articles | Metrics
Aiming at the problem that the results of crawling book information contain a lot of useless data, a crawler algorithm based on book topics was proposed. The algorithm mainly consists of two parts: one part is a dynamic keyword expansion method based on the ODP (Open Directory Project) to describe the topic, and the other part is a topic correlation algorithm based on semantic extension of entries in the VSM (Vector Space Model). The new algorithm, the keyword-based VSM algorithm and the ODP-based VSM algorithm were analyzed through experiments. The results indicate that the precision and recall rate of the new algorithm are higher than those of the other two algorithms.
SMART:A Graph-based Recommendation Algorithm for Fast Moving Consumer Goods in E-commerce Platform
QING Yong, LIU Meng-juan, YIN Ying and LI Yang-xi
Computer Science. 2017, 44 (Z11): 464-469.  doi:10.11896/j.issn.1002-137X.2017.11A.099
Abstract PDF(1491KB) ( 870 )   
References | Related Articles | Metrics
This paper proposed a new graph-based recommendation algorithm for fast moving consumer goods in e-commerce platforms, called SMART. Different from the traditional recommendation based on the user-item bigraph, the new graph contains users, items and categories as nodes, together with their corresponding edges. In this user-item-category graph, the weight of each undirected edge is set according to each user's interest in the items and their categories, so biased random walks can be run on the weighted graph. After many iterations, the probability of each user node walking to every other node converges to a stable value, and these convergence probabilities are believed to reflect the probabilities with which users purchase the goods. At last, we explored the user's preference for shops to adjust the convergence probabilities and compute the TOP-N recommendation list. We evaluated the performance of the proposed algorithm on a dataset of comment records for JD fresh goods, and the results show that our algorithm provides high-quality recommendations. Compared with the basic bigraph-based recommendation, the accuracy and recall of our algorithm increase by 1.32% and 1.48% respectively.
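A hedged sketch of the walk, using personalized PageRank as a stand-in for the paper's biased random walk on the weighted user-item-category graph; the node names and weights are illustrative only:

```python
# Sketch: biased random walks with restart at the target user, approximated by
# personalized PageRank on a weighted user-item-category graph.
import networkx as nx

G = nx.Graph()
G.add_edge('user:u1', 'item:milk', weight=3.0)     # purchase strength (assumed)
G.add_edge('user:u1', 'cat:dairy', weight=1.0)     # category interest (assumed)
G.add_edge('item:milk', 'cat:dairy', weight=1.0)
G.add_edge('item:yogurt', 'cat:dairy', weight=1.0)

# The stationary probabilities play the role of the converged walk-to-purchase
# probabilities described in the abstract.
scores = nx.pagerank(G, alpha=0.85, personalization={'user:u1': 1.0},
                     weight='weight')
top_items = sorted((n for n in scores if n.startswith('item:')),
                   key=scores.get, reverse=True)   # TOP-N candidate list
```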
Study on Multi-collaborative Filtering Algorithm of Command Information Based on Cloud Models
DU Bo, YU Yan and DAI Gang
Computer Science. 2017, 44 (Z11): 470-475.  doi:10.11896/j.issn.1002-137X.2017.11A.100
Abstract PDF(1596KB) ( 516 )   
References | Related Articles | Metrics
To overcome the data-sparsity and cold-start problems of traditional collaborative filtering when a filtering relationship is established between commanders and command elements, a mission-oriented multi-collaborative filtering algorithm for command information was proposed. Firstly, the algorithm performs preliminary cloud-model-based collaborative filtering on the command elements by operation type, and then integrates cohesion-subset analysis into user-based collaborative filtering to mine the similarities between commanders and command elements under specific operation types, thus achieving accurate recommendations. Experimental results show that the proposed algorithm can be applied effectively to the command information system of an operational mission and improves the recommendation efficiency and accuracy of the system.
Hardware Implementation of Fast Huffman Coding Based on Different Sorting Methods
LI Yi-ke and WANG Zhan
Computer Science. 2017, 44 (Z11): 476-479.  doi:10.11896/j.issn.1002-137X.2017.11A.101
Abstract PDF(1404KB) ( 1156 )   
References | Related Articles | Metrics
Aiming at the large computational complexity of static software Huffman coding and the decoder complications caused by dynamic Huffman coding, a quasi-dynamic hardware Huffman encoder was designed. The encoder encodes an array of data statically and then outputs the result in parallel, which guarantees a high encoding speed; its delay is only the total time of one encoding pass. First, to exploit the parallelism of hardware, both static and dynamic sorting networks are implemented to meet different encoding demands. Then, a hardware binary-tree building unit and a data-flow-driven resolving unit are implemented to obtain the final Huffman codes. Finally, the Huffman codes are output based on data stored in a FIFO. The design results show that this quasi-dynamic encoder, implemented on the Nexys4 platform, offers a high working frequency (over 100MHz), high throughput and low latency, and simplifies the design of the decoder.
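For reference, a software model of the static tree construction that the sorting network and tree-building unit realize in hardware (this Python version only illustrates the algorithm, not the RTL):

    import heapq

    def huffman_codes(freqs):
        # Heap entries: (frequency, unique tie-breaker, symbol or subtree).
        heap = [(f, i, s) for i, (s, f) in enumerate(freqs.items())]
        heapq.heapify(heap)
        n = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            n += 1
            heapq.heappush(heap, (f1 + f2, n, (left, right)))
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                codes[node] = prefix or "0"
        walk(heap[0][2], "")
        return codes

    print(huffman_codes({"a": 5, "b": 2, "c": 1, "d": 1}))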
Research on Traceability of Functional Requirements to Test Case
ZHAI Yu-peng, HONG Mei and YANG Qiu-hui
Computer Science. 2017, 44 (Z11): 480-484.  doi:10.11896/j.issn.1002-137X.2017.11A.102
Abstract PDF(1355KB) ( 815 )   
References | Related Articles | Metrics
Software development mainly comprises requirements gathering, design, development, testing and maintenance. Maintenance is the main cost in the lifetime of software, and during maintenance developers have to understand the program in order to locate defects. Traceability links among requirements, source code and test cases can effectively help developers understand the program. In this paper, the existing feature-location and traceability methods were analyzed, and based on these methods an improved method integrating dynamic execution information with information retrieval was proposed. The method assists developers during maintenance by establishing traceability links between requirements and test cases.
Method of Metamorphic Relations Constructing for Object-oriented Software Testing
ZHANG Xing-long, YU Lei, HOU Xue-mei and HOU Shao-fan
Computer Science. 2017, 44 (Z11): 485-489.  doi:10.11896/j.issn.1002-137X.2017.11A.103
Abstract PDF(1596KB) ( 576 )   
References | Related Articles | Metrics
To address the insufficiency of metamorphic relation construction for object-oriented software, a construction method based on error types was proposed. Firstly, the basic operations in every method are analyzed, and faults are divided into three categories according to where errors occur and what effect they have. Then the methods are divided into two types according to the effect of their implementation, and metamorphic relations are constructed for each method and each error type. Finally, the method was compared with other methods in an experiment on a Rectangle program. Experimental results show that the new error-type-guided construction method improves the error detection rate and helps locate faults.
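The idea of a metamorphic relation can be made concrete with the Rectangle example: instead of knowing the expected output, the test checks a relation between two executions (the relation below is a standard textbook one, not necessarily the paper's):

    # Scaling both sides by k must scale the area by k*k.
    class Rectangle:
        def __init__(self, w, h):
            self.w, self.h = w, h
        def area(self):
            return self.w * self.h

    def check_scaling_relation(w, h, k):
        source = Rectangle(w, h).area()
        follow_up = Rectangle(k * w, k * h).area()
        return abs(follow_up - k * k * source) < 1e-9

    assert check_scaling_relation(3.0, 4.0, 2.0)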
Temporal Predictability in Safety Critical Cyber Physical System
LI Xi, SUN Bei-lei, WAN Bo, CHEN Xiang-lan and ZHOU Xue-hai
Computer Science. 2017, 44 (Z11): 490-493.  doi:10.11896/j.issn.1002-137X.2017.11A.104
Abstract PDF(1228KB) ( 631 )   
References | Related Articles | Metrics
A safety-critical cyber physical system (CPS) consists of two concurrent subsystems: the cyber system (CS) and the physical system (PS). The CS is constructed as a distributed real-time system to meet the requirements of timeliness and safety, both of which rest on a key system property: temporal predictability. However, there is no clear definition of temporal predictability in CPS. This paper summarized the state of the art of predictability in CPS, identified two key attributes for improving a system's predictability, namely timing predictability and ordering predictability, and finally proposed some advice for building a predictable CPS.
CS-Chord:Distributed High Dimensional Vector Index Based on Clustering Separation
YUAN Xin-pan, WANG Can-fei, LONG Jun and PENG Cheng
Computer Science. 2017, 44 (Z11): 494-497.  doi:10.11896/j.issn.1002-137X.2017.11A.105
Abstract PDF(1498KB) ( 517 )   
References | Related Articles | Metrics
M-Chord is a high dimensional vector index based on a P2P network; sparse vectors at the clustering edges easily intersect the query circle, which enlarges the query region and reduces M-Chord's efficiency. Based on the idea of M-Chord, this paper proposed a clustering-separation-based distributed high dimensional vector index (CS-Chord) that separates the edge vectors from the dense vectors. The vectors of the central region are still stored in the Chord ring, which saves much resource-locating time and improves retrieval efficiency. Experimental results show that with a query radius of 0.2, CS-Chord performs about 2000 distance computations, roughly 2500 fewer than M-Chord, and its number of message forwardings decreases by about 150, to only 50% of M-Chord's.
Offline Efficient Compression Algorithm for AIS Data Retains Time Elapsing Dimension
XU Kai, QIU Jia-yu and LI Yan
Computer Science. 2017, 44 (Z11): 498-502.  doi:10.11896/j.issn.1002-137X.2017.11A.106
Abstract PDF(1469KB) ( 886 )   
References | Related Articles | Metrics
Ship track compression is an important step in processing global ship-trace big data. This paper first reviewed existing offline compression algorithms for ship trajectories and exhibited their problems. For example, the classic Douglas-Peucker algorithm compresses the trajectory in only two dimensions and loses the time element, so the compressed result cannot keep information about the ships' speed and status; the dynamic Douglas-Peucker algorithm must divide the trajectory into groups before compression, which causes efficiency problems. A compression algorithm that retains the time dimension was therefore built on spatial vectors, and inner and outer products were used to improve its effect and efficiency. Experiments show that the algorithm's efficiency increases by about 30% and that it produces better results in most cases. The innovation lies in introducing the inner and outer products into the compression algorithm.
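A sketch of how the classic recursion extends to (x, y, t) points, with the outer (cross) product giving the point-to-line distance; the time-axis scaling and threshold are assumptions rather than the paper's exact formulation:

    import numpy as np

    def dp(points, eps):
        # Douglas-Peucker over 3D (x, y, t) points.
        if len(points) < 3:
            return points
        a, b = points[0], points[-1]
        ab = b - a
        # distance of each interior point p to line ab: |ab x ap| / |ab|
        d = [np.linalg.norm(np.cross(ab, p - a)) / np.linalg.norm(ab)
             for p in points[1:-1]]
        i = int(np.argmax(d)) + 1
        if d[i - 1] > eps:
            return dp(points[:i + 1], eps)[:-1] + dp(points[i:], eps)
        return [a, b]

    track = [np.array([x, 0.5 * x, 10.0 * x]) for x in range(10)]  # (x, y, t)
    print(len(dp(track, eps=0.1)))  # a straight, uniform track collapses to 2 points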
Multi-objective Moth-flame Optimization Algorithm Based Optimal Reactive Power Dispatch for Power System
LI Wei-kun, QUE Bo, WANG Wan-liang and NI Li-zhou
Computer Science. 2017, 44 (Z11): 503-509.  doi:10.11896/j.issn.1002-137X.2017.11A.107
Abstract PDF(1562KB) ( 590 )   
References | Related Articles | Metrics
In view of the increasing demand for electric power and the drawbacks of conventional reactive power optimization methods, effectively solving reactive power optimization has become a hot spot in power research. This paper proposed a multi-objective model of the reactive power optimization problem in power systems and, for the first time, a multi-objective moth-flame optimization algorithm (MOMFA) for problems with multiple objectives. A fixed-size external archive, a grid and a selection mechanism are integrated into MOMFA to maintain and improve the Pareto optimal solutions. The proposed algorithm was first compared with two well-known algorithms on the CEC multi-objective optimization test problems, and was then simulated on real power system data against multi-objective particle swarm optimization (MOPSO) and the non-dominated sorting genetic algorithm II (NSGA-II). The results demonstrate that the proposed algorithm outperforms the other algorithms in reactive power optimization.
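The archive bookkeeping at the heart of such multi-objective methods reduces to a Pareto-dominance test; the sketch below is generic (MOMFA additionally prunes by grid density, which is omitted here):

    def dominates(u, v):
        # u dominates v iff u is no worse in every objective, better in one.
        return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

    def update_archive(archive, candidate, max_size=100):
        if any(dominates(s, candidate) for s in archive):
            return archive                    # candidate is dominated: discard
        archive = [s for s in archive if not dominates(candidate, s)]
        archive.append(candidate)
        return archive[:max_size]             # placeholder for grid-based pruning

    print(update_archive([(1.0, 5.0), (4.0, 1.0)], (2.0, 2.0)))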
Data Block Density Scheduling Strategy Based on HDFS in Shared Cluster
DU Hong-guang, LEI Zhou and CHEN Sheng-bo
Computer Science. 2017, 44 (Z11): 510-515.  doi:10.11896/j.issn.1002-137X.2017.11A.108
Abstract PDF(1523KB) ( 476 )   
References | Related Articles | Metrics
With the development of cloud computing and mass data processing technology, shared clusters use HDFS as a distributed file system and manage computing resources through virtualization to provide operational resources for computing frameworks and applications. Data locality is a key factor in the performance of mass data processing applications. At present, research on the schedulers of shared cluster management frameworks mainly focuses on improving system throughput and resource utilization by increasing dispatching parallelism, while quality aspects of scheduling such as data locality show some defects. In this paper, a scheduling strategy based on data block density was proposed to improve the data locality of applications: by reducing cross-host I/O during execution, application performance is improved. Experiments show that the proposed strategy effectively reduces the running time of data-intensive jobs; in WordCount and TeraSort test cases with 2.5GB of data, it achieved 90% data localization and shortened the run time by 20%.
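As a toy illustration of the density idea (the real strategy may weigh more factors than replica counts):

    # Prefer the host holding the most blocks of the job's input file.
    from collections import Counter

    def pick_host(block_locations):
        # block_locations: one replica-host list per data block
        density = Counter(h for replicas in block_locations for h in replicas)
        return density.most_common(1)[0][0]

    blocks = [["host1", "host2"], ["host1", "host3"], ["host1", "host2"]]
    print(pick_host(blocks))   # -> host1, the densest host for this input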
Research on Optimization Method of Merging and Prefetching for Massive Small Files in HDFS
ZHENG Tong, GUO Wei-bin and FAN Gui-sheng
Computer Science. 2017, 44 (Z11): 516-519.  doi:10.11896/j.issn.1002-137X.2017.11A.109
Abstract PDF(1333KB) ( 1534 )   
References | Related Articles | Metrics
HDFS has a significant advantage in storing massive files; however, its architecture with only one NameNode degrades in performance when HDFS stores a massive file set mainly composed of small files. A solution based on merging small files into large files was proposed. Meanwhile, the mapping from small files to merged files is established and stored in HBase. Finally, an LRU-based prefetching mechanism was provided to improve reading speed. Experiments show that the proposed method improves the overall performance of HDFS with large numbers of small files.
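A minimal sketch of an LRU-managed prefetch cache of the kind described, where reading one small file pulls its neighbours from the same merged block (the neighbour heuristic and names are assumptions):

    from collections import OrderedDict

    class LRUPrefetchCache:
        def __init__(self, capacity=64):
            self.capacity, self.cache = capacity, OrderedDict()
        def get(self, key, load, neighbours):
            if key not in self.cache:
                # one read of the merged block also caches the sibling files
                for k in neighbours + [key]:
                    if k not in self.cache:
                        self.cache[k] = load(k)
                    if len(self.cache) > self.capacity:
                        self.cache.popitem(last=False)  # evict least recently used
            self.cache.move_to_end(key)                 # refresh recency
            return self.cache[key]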
Research on Autonomous Landing of Quad-rotor UAV
JIA Pei-yang, PENG Xiao-dong and ZHOU Wu-gen
Computer Science. 2017, 44 (Z11): 520-523.  doi:10.11896/j.issn.1002-137X.2017.11A.110
Abstract PDF(1388KB) ( 1382 )   
References | Related Articles | Metrics
In recent years, quad-rotor UAVs have played an increasingly important role. Autonomous landing is a key technology in a UAV's intelligent processing system, and it has three main steps: target detection, target tracking, and position prediction with landing. In this paper, an improved AprilTags algorithm was used to obtain the target's 3D position and attitude in real time; a PID algorithm and Kalman filtering were used to make the UAV fly more stably and respond more quickly; and a fitting function was used to predict the flight path. The method realizes intelligent recognition, stable tracking and autonomous landing on a moving target.
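The tracking loop rests on a standard discrete PID controller; the sketch below is generic, with illustrative gains rather than the paper's tuned values:

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_err = 0.0, 0.0
        def step(self, setpoint, measured):
            err = setpoint - measured
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=1.2, ki=0.05, kd=0.3, dt=0.02)   # 50 Hz loop (assumed)
    cmd = pid.step(setpoint=0.0, measured=0.4)    # drive lateral offset to zero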
Design and Implementation of Parallel DBSCAN Algorithm Based on Spark
HUANG Ming-ji and ZHANG Qian
Computer Science. 2017, 44 (Z11): 524-529.  doi:10.11896/j.issn.1002-137X.2017.11A.111
Abstract PDF(1682KB) ( 1204 )   
References | Related Articles | Metrics
As cloud applications demand shorter running time and higher performance, and as memory prices continue to decline, memory-based distributed computing frameworks such as Spark have received unprecedented attention and are widely applied. We designed and implemented a parallelized DBSCAN algorithm based on Spark that minimizes the shuffle frequency and the amount of data shuffled. To optimize the parallelization strategy, we located the performance bottleneck through analysis and described the DAG of the Spark-based parallel DBSCAN algorithm. Finally, we compared the performance of the parallel algorithm with the original DBSCAN algorithm and analyzed the influence of different parameters on the clustering results. Experimental results show that, on a data set of three million lines and without obvious loss of clustering accuracy, the Spark-based parallel DBSCAN algorithm reduces running time by 37.2% and reaches a speedup of 1.6.
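For reference, the sequential core loop that the Spark version partitions and parallelizes looks roughly like this (an illustration only, not the paper's implementation):

    import numpy as np

    def dbscan(X, eps, min_pts):
        labels, cluster = [-1] * len(X), 0        # -1 = noise or unvisited
        def region(i):
            return [j for j in range(len(X))
                    if np.linalg.norm(X[i] - X[j]) <= eps]
        for i in range(len(X)):
            if labels[i] != -1:
                continue
            seeds = region(i)
            if len(seeds) < min_pts:
                continue                          # not a core point
            cluster += 1
            queue = list(seeds)
            while queue:
                j = queue.pop()
                if labels[j] == -1:
                    labels[j] = cluster
                    nbrs = region(j)
                    if len(nbrs) >= min_pts:
                        queue.extend(nbrs)        # keep expanding from cores
        return labels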
Design and Implementation of Image Corner and Edge Detection System Based on Zynq
PAN Qing-song, ZHANG Yi, YANG Zong-ming and QIN Jian-xiu
Computer Science. 2017, 44 (Z11): 530-533.  doi:10.11896/j.issn.1002-137X.2017.11A.112
Abstract PDF(1613KB) ( 888 )   
References | Related Articles | Metrics
The whole system is implemented on a Zynq chip using a hardware/software co-design method. The Zynq's internal ARM+FPGA architecture combines the flexibility of the ARM with the parallel processing capability of the FPGA, and both advantages are fully exploited in this system: image acquisition is implemented in ARM software, while corner and edge detection is implemented in FPGA hardware. The ARM processor and the FPGA exchange data over the AXI4 bus. The system has three main functions: image acquisition, image feature extraction and image display. The results show that the Zynq implementation with hardware acceleration is 6 to 8 times faster than the same algorithm implemented on the ARM processor alone.
Simulation Software Realisation of Active Power Output of Wind Farm Based on VC++ and SQL Server Database
MA Wan-cheng, YUAN Tie-jiang, ZHANG Heng and LIU Zhao-ting
Computer Science. 2017, 44 (Z11): 534-537.  doi:10.11896/j.issn.1002-137X.2017.11A.113
Abstract PDF(1434KB) ( 385 )   
References | Related Articles | Metrics
In view of the fact that current research on the active power output of wind farms focuses on theory and lacks a software platform, a simulation software platform for wind farm active power output based on VC++ and the SQL Server database was developed. Firstly, the active power output computation was realized using VC++'s rich function library, and the output data were displayed graphically using the powerful data processing and visualization functions of MFC. Secondly, the storage and management functions of the SQL Server database were used to realize information browsing, querying and remote data calls, and a software system architecture method was proposed on the basis of this platform. Finally, wind farm data from Xinjiang were simulated and verified against measured wind speed data. The results show that the software can present a wind farm's active power as a curve graph, which proves that the platform has good engineering applicability and is of significance for the improvement of wind power systems.
Design and Implementation of Service Management Platform Based on MEAN and SpringMVC
ZHANG Chuan-guo and WANG Ling-li
Computer Science. 2017, 44 (Z11): 538-541.  doi:10.11896/j.issn.1002-137X.2017.11A.114
Abstract PDF(1269KB) ( 566 )   
References | Related Articles | Metrics
With the rapid development of the Internet, efficient, convenient management and safe, reliable access to APIs have become difficult problems. To solve them, we designed and developed a service management platform based on the MEAN and SpringMVC frameworks, and put forward the concept of a "service", an alias for a group of APIs. With the platform we implemented service management, voucher management, user management and a unified access entrance. The platform is a good solution to the traditional problems of API management: it improves the security of API access and reduces the workload of aggregating and distributing APIs.
Formal Verification of TAPE Strategy for Dynamic Temperature Management in Multi-core System
QU Yuan-yuan, HONG Mei and SUN Ning
Computer Science. 2017, 44 (Z11): 542-546.  doi:10.11896/j.issn.1002-137X.2017.11A.115
Abstract PDF(1480KB) ( 521 )   
References | Related Articles | Metrics
Distributed DTM (dynamic thermal management) strategies are widely used in multi-core systems because of their scalability, but before a distributed DTM policy is deployed its reliability must be verified. To overcome the limitations of traditional analytical methods, model checking is applied to the analysis of distributed DTM strategies. This paper analyzed TAPE, a distributed DTM policy for multi-core systems, using statistical model checking. Verifying the TAPE strategy with UPPAAL SMC proves its security, validity, liveness and stability, and thereby the reliability of the DTM scheme.
Diversity-guided FPSO Algorithm for Solving Air Refueling Region Deploying Problem
HE Xu, JING Xiao-ning, FENG Chao and CHENG Yue
Computer Science. 2017, 44 (Z11): 547-551.  doi:10.11896/j.issn.1002-137X.2017.11A.116
Abstract PDF(1438KB) ( 535 )   
References | Related Articles | Metrics
Region deployment plays an important role in air refueling tasks. To select the transport aircraft's air refueling point, a model was established on the basis of requirements for total oil consumption, transportation time and threat cost. The FPSO algorithm was used in simulation, its superiority was verified, and the best air refueling point was obtained.
Research of Molecular Similarity Algorithm Based on Counting Bloom Filter
WANG Shan, SUN Li, WU Jie, FENG Feng and WANG Hong-wei
Computer Science. 2017, 44 (Z11): 552-556.  doi:10.11896/j.issn.1002-137X.2017.11A.117
Abstract PDF(1403KB) ( 577 )   
References | Related Articles | Metrics
Molecular similarity is an important part of virtual screening technology and plays a key role in computer-aided drug design. In 2D fingerprint similarity assessment, typical methods use a hash function for molecular fingerprint mapping; however, the inherent collisions of hash functions easily reduce the precision of the mapping. In this paper, a fingerprint mapping method based on a counting Bloom filter was adopted to effectively reduce the probability of fingerprint-space mapping collisions and to improve the similarity assessment process. To evaluate molecular similarity effectively, the improved method was validated on a tailored version of DUD (the DUD LIB VS 1.0 sets) by comparison with experimental results, using ROCE (Receiver Operating Characteristic Enrichment), AUC (Area Under Curve), awROCE and awAUC as evaluation standards. Compared with the original molecular similarity methods, the experimental results show that the improved method remains competitive in precision and in scaffold-hopping potential.
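A minimal counting Bloom filter, the structure substituted here for plain hash mapping; the sizes and hash construction below are illustrative, not the paper's parameters:

    import hashlib

    class CountingBloomFilter:
        def __init__(self, m=1024, k=3):
            self.m, self.k, self.counters = m, k, [0] * m
        def _indexes(self, item):
            for i in range(self.k):
                h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                yield int(h, 16) % self.m
        def add(self, item):
            for idx in self._indexes(item):
                self.counters[idx] += 1
        def remove(self, item):          # counters make deletion possible
            for idx in self._indexes(item):
                self.counters[idx] -= 1
        def __contains__(self, item):
            return all(self.counters[idx] > 0 for idx in self._indexes(item))

    cbf = CountingBloomFilter()
    cbf.add("C1=CC=CC=C1")               # e.g. a substructure fingerprint key
    print("C1=CC=CC=C1" in cbf)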
Optimizing Circuit Breaker Temperature Control Technique Based on GA Toolbox of MATLAB
LI Hong, XU Li-li and LI Jin
Computer Science. 2017, 44 (Z11): 557-560.  doi:10.11896/j.issn.1002-137X.2017.11A.118
Abstract PDF(1405KB) ( 674 )   
References | Related Articles | Metrics
Controlling temperature change is very important in the study of circuit breaker temperature behavior. Because the analysis of the temperature change process is influenced by various factors, design-of-experiment methods are needed to screen out the key influence factors. Based on actual data obtained from a comprehensive survey of the circuit breaker temperature change process, this paper used uniform design, which is more uniform and representative than orthogonal design when the number of experiments is similar, to perform the experiments. Response surface methodology was then applied to build a regression model of circuit breaker temperature change, and a genetic algorithm with a penalty function was used for optimization, finding the group of factors that gives the best temperature control value.
Design and Implementation of Faults Querying App for Information System Based on Android
LI Meng-wei, DONG Zheng-hong and YANG Fan
Computer Science. 2017, 44 (Z11): 561-564.  doi:10.11896/j.issn.1002-137X.2017.11A.119
Abstract PDF(1372KB) ( 517 )   
References | Related Articles | Metrics
Information systems are an important tool of informatization, so finding and resolving the faults of information system connections to achieve interconnection is very important. This paper developed an Android app for querying information system faults, built with Android Studio and combining a relational database with an expert system. The "information system faults database" established with SQLite not only supports users in diagnosing faults but also serves as a platform for collecting fault records. The app is highly portable, has a friendly interface, greatly improves the efficiency of resolving faults, and is of great significance for maintaining the stability of information systems.
Research on Lower Redundancy of Two-dimension Barcode Encoding in Chinese Characters
YANG Kang, YUAN Hai-dong and GUO Yuan-bo
Computer Science. 2017, 44 (Z11): 565-569.  doi:10.11896/j.issn.1002-137X.2017.11A.120
Abstract PDF(1360KB) ( 799 )   
References | Related Articles | Metrics
With the expansion of two-dimensional barcode applications, optimization and improvement of two-dimensional barcodes are imperative. A fixed-length coding algorithm has been used for Chinese character encoding; it introduces large redundancy because it ignores character frequency, which affects encoding efficiency. A variable-length coding algorithm can shorten the codes of commonly used Chinese characters, reduce the average encoded length of Chinese text in a two-dimensional barcode, and correspondingly increase the maximum number of Chinese characters a barcode can carry. First, according to the frequency of commonly used Chinese characters, the characters were divided into segments that are encoded independently, and a code table for the variable-length algorithm was created. Then all Chinese characters were encoded with the variable-length algorithm without breaking the standard two-dimensional barcode encoding rules. Finally, the time and space performance of the fixed-length and variable-length algorithms were analyzed. Experiments show that the variable-length coding algorithm cuts redundancy in Chinese character encoding by 18.4%.
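A toy version of frequency-segmented variable-length coding: the hottest characters get short codewords and rarer ones longer codewords, with segment prefixes keeping the code prefix-free (the segment sizes and prefixes below are invented for illustration, not the paper's table):

    def build_table(chars_by_frequency):
        table = {}
        for i, ch in enumerate(chars_by_frequency):
            if i < 64:                   # hottest segment: "00" + 6 bits
                table[ch] = "00" + format(i, "06b")
            else:                        # colder segment: "01" + 12 bits
                table[ch] = "01" + format(i - 64, "012b")
        return table

    table = build_table(["的", "一", "是", "了"])   # ordered by frequency
    print(table["的"])                   # most frequent character, shortest code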
Design of MEMS Electric-field Sensor Test System Based on PXI
HU Xin-yu and CHEN Bo
Computer Science. 2017, 44 (Z11): 570-572.  doi:10.11896/j.issn.1002-137X.2017.11A.121
Abstract PDF(1302KB) ( 610 )   
References | Related Articles | Metrics
Traditional electric-field sensor test systems suffer from complexity, inefficiency and poor compatibility. This paper designed a new electric-field sensor test system. Built on NI PXI and LabVIEW, the system adopts a virtual instrument architecture. Following the DCS approach, the controller and the operation station are separated, realizing standard electric-field control, data acquisition and processing. Using correlation detection, the system can calibrate electric-field sensors in both static and alternating electric fields. Experiments prove that the test system is highly stable, with a measurement error of less than 1%.
Research and Implementation of Campus Education Interconnection System for Intelligent Terminal
JIA Ning
Computer Science. 2017, 44 (Z11): 573-576.  doi:10.11896/j.issn.1002-137X.2017.11A.122
Abstract PDF(1458KB) ( 465 )   
References | Related Articles | Metrics
Based on the Internet of Things and mobile Internet applications, the campus education interconnection system is a new kind of communication platform. It uses radio frequency identification, wireless communication, network communication and cloud storage technologies to connect students, families, schools and teachers, realizing genuinely intelligent campus information and home-school communication. The system makes full use of mobile phone text messages, positioning, camera services and RFID recognition to collect student information at school and store it on the cloud platform. Through the relevant hardware devices, parents can log on to the terminal application and check their children's information at school at any time, improving the connection and communication between the school and parents.
Research and Application of Multi-objective Optimization Algorithm Based on Interval Reliability Lower Bound
YAN Hong
Computer Science. 2017, 44 (Z11): 577-579.  doi:10.11896/j.issn.1002-137X.2017.11A.123
Abstract PDF(1322KB) ( 544 )   
References | Related Articles | Metrics
To fill the gap in research on multi-objective optimization under interval uncertainty, this paper took as its research object the multi-objective optimization problem with multiple factors that arises in manufacturing. Firstly, concepts such as interval credibility and dominance relations were proposed. Secondly, a multi-objective optimization algorithm based on the lower bound of the credibility interval was established from these credibility and dominance relations. Finally, the algorithm was studied on multi-objective numerical optimization problems. The results show that, for the same value of γ, the H measure is positively correlated with the number of individuals as evolution proceeds, which indicates that the Pareto front obtained by the algorithm based on the lower bound of dominance credibility reflects the true Pareto front better as the evolution generations increase. Compared with the IP-MOEA and SPGA algorithms, the multi-objective optimization algorithm based on the lower bound of the credibility interval is more suitable for practical situations and can fill the identified gap.
Exact Epsilon-constraint Algorithm for Bi-objective Optimization of Flight Arrival Scheduling Problem
WANG Lu, ZHANG Xiao-ning, SUN Zhi-hui and WU Hui
Computer Science. 2017, 44 (Z11): 580-582.  doi:10.11896/j.issn.1002-137X.2017.11A.124
Abstract PDF(1256KB) ( 970 )   
References | Related Articles | Metrics
With the rapid growth of airport passenger traffic, more and more flights are delayed. Meanwhile, in managing aircraft landings at the airport, security is paramount: low friction on the runway caused by snow or ice can lead to airplane accidents, so periodic runway maintenance is extremely important. This paper studied the scheduling of aircraft on a runway with periodic maintenance. To guarantee good service for airlines and to increase the efficiency of runway utilization, two objective functions were set: the first minimizing the total tardiness of all airplanes and the second minimizing the makespan. A bi-objective mixed integer linear programming model was established, and an epsilon-constraint method was developed to obtain the exact Pareto front. Finally, an example demonstrates a possible application of the model and the algorithm. The purpose of this work is to obtain the exact solution set of the bi-objective optimization problem, which can serve as a reference for practitioners in airport management.
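The epsilon-constraint idea can be shown on a toy instance: minimize one objective while constraining the other, then sweep the bound to trace the exact front (the brute-force search below stands in for solving the MILP at each step):

    def epsilon_constraint(solutions, f1, f2):
        front = []
        for eps in sorted({f2(s) for s in solutions}, reverse=True):
            feasible = [s for s in solutions if f2(s) <= eps]
            best = min(feasible, key=f1)
            if (f1(best), f2(best)) not in front:
                front.append((f1(best), f2(best)))
        return front

    # toy "schedules" scored as (total tardiness, makespan)
    sols = [(3, 9), (4, 7), (6, 5), (8, 5), (2, 12)]
    print(epsilon_constraint(sols, f1=lambda s: s[0], f2=lambda s: s[1]))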
Design and Implementation of Electric Bicycle Rental Client Based on Android Platform
GUAN Xiao-han and LIU Zheng
Computer Science. 2017, 44 (Z11): 583-585.  doi:10.11896/j.issn.1002-137X.2017.11A.125
Abstract PDF(1216KB) ( 428 )   
References | Related Articles | Metrics
The system implements both the client and the server program of an electric bicycle rental service. Compared with typical car rental software, the system adds a speed-limiting function besides basic functions such as user login, vehicle reservation, store lookup and order review. In this paper, an asynchronous loading mechanism was adopted for displaying data, interfaces were defined for communication between client and server, and a second-level cache was employed to avoid memory overflow when loading images. The software runs smoothly and offers higher security than current car rental applications.
Research on Parallel Algorithm of Petri Net Based on Three-layer Mixed Programming Model
ZHOU Jie and LI Wen-jing
Computer Science. 2017, 44 (Z11): 586-591.  doi:10.11896/j.issn.1002-137X.2017.11A.126
Abstract PDF(1587KB) ( 556 )   
References | Related Articles | Metrics
To eliminate the deadlocks that arise in the synchronization of MPI+OpenMP mixed programming when parallelizing Petri nets on a multi-core cluster, this paper proposed a Petri net parallel algorithm based on a three-layer mixed programming model. Firstly, exploiting the synchronization advantages of transactional memory, a three-layer MPI+OpenMP+STM programming model is built for the multi-core cluster environment. Then the parallelization of the geometric and algebraic models of the Petri net is analyzed, a three-layer MPI+OpenMP+STM parallel model of the Petri net is built, and the corresponding parallel algorithm is designed and analyzed. Finally, the programming is validated through examples, which show that the algorithm's operating efficiency is much better than that of other programming modes, and that the larger the Petri net, the better the effect of the parallel computing. The algorithm is therefore efficient and applicable for the simulated parallel operation of Petri nets in a multi-core cluster environment.
Robot System for GIS Foreign Body Clean and Cavity Detection
MA Fei-yue, YOU Hong, DIAN Song-yi, YANG Jia-yong, PENG Xin-zhi, WANG Bo and DING Pei
Computer Science. 2017, 44 (Z11): 592-595.  doi:10.11896/j.issn.1002-137X.2017.11A.127
Abstract PDF(1389KB) ( 477 )   
References | Related Articles | Metrics
In view of the fact that existing GIS pipeline inspection methods cannot also clean foreign matter, a robot system for detection and cleaning based on the Intel Edison platform was presented. The design of the robot system is introduced in detail in terms of hardware, kinematic model and human-computer interaction platform. In hardware, the highly integrated Intel Edison processor is the center that connects the modular designs developed around it. The human-computer interaction platform, built on Android, implements 3D dynamic display, motion process control and robot attitude control in real time. Tests show that the robot runs smoothly in the GIS pipeline, can achieve a displacement of about 20 degrees, the camera has no dead corners, and the human-computer interaction video is clear. The whole system satisfies the requirements of detection and cleaning.
Design and Implementation of Light Application Customizations Platform Based on NodeJS and Express Framework
WANG Ling-li and ZHANG Chuan-guo
Computer Science. 2017, 44 (Z11): 596-599.  doi:10.11896/j.issn.1002-137X.2017.11A.128
Abstract PDF(1293KB) ( 763 )   
References | Related Articles | Metrics
In the "Internet+" wave, enterprise-class mobile applications must be realized through technical means; the current challenges are working efficiently, responding in a timely manner, minimizing cost and maximizing revenue. For these questions, this article analyzed the basic concept of light applications and summarized the characteristics that distinguish them from native applications. We then built a platform for customizing and releasing light applications based on the NodeJS and Express frameworks. Using the platform, we can reduce the difficulty of developing applications, cut research costs, shorten the development cycle and improve development efficiency. In future work, the module library needs to be enriched and the drag-and-drop page editing needs to be perfected.