Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Volume 42, Issue 4 (April 2015)
  
Research and Development on Inference Technique in Probabilistic Graphical Models
LIU Jian-wei, CUI Li-peng, LI Hai-en and LUO Xiong-lin
Computer Science. 2015, 42 (4): 1-18.  doi:10.11896/j.issn.1002-137X.2015.04.001
In recent years, probabilistic graphical models have become the focus of research in uncertainty inference, because of their bright prospects for application in artificial intelligence, machine learning, computer vision and so forth. According to different network structures and query questions, the inference algorithms of probabilistic graphical models were summarized in a systematic way. First, exact and approximate inference algorithms for solving probability queries in Bayesian networks and Markov networks were discussed, including variable elimination algorithms, conditioning algorithms, clique tree algorithms, variational inference algorithms and sampling algorithms. The common algorithms for solving MAP queries were also introduced. Then the inference algorithms in hybrid networks were described for the continuous and hybrid cases respectively. In addition, this work analyzed exact and approximate inference in temporal networks, and described inference in the continuous and hybrid cases for temporal networks. Finally, this work raised some questions that the inference algorithms of probabilistic graphical models are facing, and discussed their future development.
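As a minimal illustration of one of the surveyed techniques, the probability-query step of variable elimination can be sketched as follows. The toy network (Rain → WetGrass), its factor values and the function name are invented for this example; they are not taken from the paper.

```python
# Minimal sketch of variable elimination for a probability query.
# Tiny Bayesian network: Rain -> WetGrass; query P(WetGrass).

# Factors as dicts mapping variable assignments (tuples) to probabilities.
p_rain = {("r",): 0.2, ("nr",): 0.8}                 # P(Rain)
p_wet_given_rain = {                                  # P(Wet | Rain)
    ("r", "w"): 0.9, ("r", "nw"): 0.1,
    ("nr", "w"): 0.3, ("nr", "nw"): 0.7,
}

def eliminate_rain():
    """Sum out Rain: P(Wet) = sum_r P(r) * P(Wet | r)."""
    p_wet = {}
    for (r,), pr in p_rain.items():
        for (r2, w), pw in p_wet_given_rain.items():
            if r2 == r:
                p_wet[w] = p_wet.get(w, 0.0) + pr * pw
    return p_wet

print(eliminate_rain())  # ≈ {'w': 0.42, 'nw': 0.58}
```

Real variable elimination orders many such sum-outs over a factor graph; this sketch shows only a single elimination.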
Overview on Glowworm Swarm Optimization or Firefly Algorithm
CHENG Mei-ying, NI Zhi-wei and ZHU Xu-hui
Computer Science. 2015, 42 (4): 19-24.  doi:10.11896/j.issn.1002-137X.2015.04.002
The glowworm swarm optimization algorithm (GSO) or firefly algorithm (FA) is an intelligence algorithm inspired by the biological behavior of glowworms attracting mates or prey. It performs well on discrete combinatorial optimization problems and continuous optimization problems. However, it still has some drawbacks, such as being easily trapped in local optimal solutions. Starting with the improvement and fusion of the algorithm as well as its discretization mechanism, this paper presented a series of schemes for improving GSO or FA. Finally, some meaningful remarks on future research were presented.
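The attraction-based update at the heart of FA can be sketched in a few lines. This is a generic one-dimensional version with illustrative parameter values, not any of the improved schemes the paper surveys.

```python
import math
import random

def firefly_minimize(f, n=15, iters=100, alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    """1-D firefly algorithm sketch: each firefly moves toward every brighter
    (lower-cost) firefly, with attractiveness decaying in distance, plus a
    shrinking random walk.  All parameter values are illustrative."""
    rng = random.Random(seed)
    xs = [rng.uniform(-5, 5) for _ in range(n)]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f(xs[j]) < f(xs[i]):           # j is brighter: move i toward j
                    r2 = (xs[i] - xs[j]) ** 2
                    beta = beta0 * math.exp(-gamma * r2)
                    xs[i] += beta * (xs[j] - xs[i]) + alpha * (rng.random() - 0.5)
        alpha *= 0.97                             # cool the random walk
    return min(xs, key=f)

best = firefly_minimize(lambda x: x * x)          # minimum of x^2 is at 0
```

The decaying `alpha` is one common remedy for the premature-convergence drawback the abstract mentions: early iterations explore, later ones exploit.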
Validated Evaluation and Error Analysis of Elementary Functions
LIU Jian, TANG Min, ZENG Xia and ZENG Zhen-bing
Computer Science. 2015, 42 (4): 25-30.  doi:10.11896/j.issn.1002-137X.2015.04.003
This paper discussed the evaluation principle and implementation of elementary functions according to the GNU standard. Based on the IEEE 754-2008 standard, the theoretical error of the elementary function programs in the C language standard library was analyzed with fundamental error-analysis theory. Firstly, each floating-point C program was transformed into a corresponding interval-computing one using the interval class provided by the Boost library. Secondly, validated evaluation of these elementary functions was performed with interval arithmetic. Finally, an interval enclosing the real value was obtained. Based on this interval, numerical error bounds for these elementary functions were presented.
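The core idea of validated evaluation — return an interval guaranteed to contain the true value — can be sketched with Python's `math.nextafter` for outward rounding, rather than the Boost interval class the paper uses. This is a rough illustration of the principle, not an IEEE-754-rigorous implementation.

```python
import math

def interval_exp(lo, hi):
    """Enclosure-style evaluation of exp on [lo, hi]: exp is monotonic, so
    evaluate at the endpoints and widen by one ulp in each direction to
    absorb rounding error (a sketch of the idea only)."""
    assert lo <= hi
    return (math.nextafter(math.exp(lo), -math.inf),
            math.nextafter(math.exp(hi), math.inf))

lo, hi = interval_exp(1.0, 1.0)
print(lo <= math.e <= hi)  # the enclosure contains the exact value e
```

A validated library would propagate such enclosures through every intermediate operation; the width of the final interval is then a proven error bound of the kind the paper reports.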
Higher-order Logic Formalization of Basic Theory of Continuous Fourier Transform
LV Xing-li, SHI Zhi-ping, LI Xiao-juan, GUAN Yong, YE Shi-wei and ZHANG Jie
Computer Science. 2015, 42 (4): 31-36.  doi:10.11896/j.issn.1002-137X.2015.04.004
Continuous Fourier transform (CFT) is widely used in the fields of mathematics and engineering. The definition of the CFT and its operational properties were formalized in the higher-order logic theorem prover HOL4, including linearity, frequency shifting, reversion, integration, first-order differentiation and higher-order differentiation, which lays the foundation for analyzing related systems by formal methods. Finally, the frequency response of a resistance-inductance-capacitance (RLC) series resonant circuit was verified by theorem proving, which illustrates a preliminary application of the formalized CFT.
Accelerated-growth HK Network Evolution Model
CUI Ai-xiang and FU Yan
Computer Science. 2015, 42 (4): 37-39.  doi:10.11896/j.issn.1002-137X.2015.04.005
In recent years, with the further study of evolution models of complex networks, the research focus has shifted from global structure to local structure. Empirical results show that many real networks exhibit power-law clique-degree distributions, and the distribution exponents decrease as the order of the clique increases. This general regularity cannot be produced by the acquaintance-recommendation mechanism of the HK model proposed by Holme and Kim. This work considered the accelerated-growth property of networks and proposed an improved HK model. Numerical simulations indicate that the accelerated-growth HK model has a large clustering coefficient and a small average shortest path. It not only has the small-world effect and scale-free property, but also reproduces the observed power-law clique-degree distribution. This study helps to better understand motifs in networks.
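For reference, the original Holme-Kim growth step (preferential attachment plus triad formation) can be sketched in plain Python. The accelerated-growth variant of the paper would additionally let the number of edges per new node grow with network size; parameter values and the seed-clique construction below are illustrative choices, not the paper's.

```python
import random

def hk_network(n=200, m=2, p_triad=0.9, seed=1):
    """Holme-Kim model sketch: each new node attaches one edge preferentially,
    then, with probability p_triad, closes a triangle with a neighbour of the
    previous target (otherwise takes another preferential step)."""
    rng = random.Random(seed)
    adj = {i: set(range(m + 1)) - {i} for i in range(m + 1)}  # small seed clique
    stubs = [i for i in adj for _ in adj[i]]                  # degree-weighted list
    for new in range(m + 1, n):
        adj[new] = set()
        target = rng.choice(stubs)                            # preferential attachment
        adj[new].add(target); adj[target].add(new)
        for _ in range(m - 1):
            candidates = list(adj[target] - adj[new] - {new})
            if candidates and rng.random() < p_triad:
                other = rng.choice(candidates)                # triad formation
            else:
                other = rng.choice(stubs)                     # fresh preferential step
            adj[new].add(other); adj[other].add(new)
        stubs.extend([new] * len(adj[new]))                   # keep stub list in sync
        stubs.extend(adj[new])
    return adj

g = hk_network()
```

The triad-formation step is what gives the HK model its high clustering coefficient relative to the plain Barabási-Albert model.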
Research on Load Balancing Mechanism of Adaptive Mobile Application Layer Multicast Terminal Based on Feedback
CUI Jian-qun, JIANG Bo and WU Li-bing
Computer Science. 2015, 42 (4): 40-43.  doi:10.11896/j.issn.1002-137X.2015.04.006
In mobile application-layer multicast communication, streaming media service satisfaction may be reduced by too many users in hot spots, while non-hot spots may waste resources, which degrades the performance of the whole system. This paper proposed an adaptive load balancing mechanism based on mobile terminals' active feedback (ALBM-MTAF). ALBM-MTAF uses several network-related performance metrics, obtained by the mobile terminal, to model streaming media service satisfaction (SMSS). Then, adjusting adaptively through the mobile users' active feedback, the system switches child nodes from non-hot spots to places with better SMSS, so as to balance the load of the whole system. Simulation experiments show that the mechanism achieves a good load balancing effect and can guarantee the quality of communication.
Research on Data Distribution Strategy in Cloud Storage System Based on Multi-objective Optimization
ZHANG Hua-wei and LI Zhi-hua
Computer Science. 2015, 42 (4): 44-50.  doi:10.11896/j.issn.1002-137X.2015.04.007
Existing data distribution strategies in cloud storage systems consider only load balancing among physical storage nodes, ignoring heat load balancing, waiting time and other factors. Aiming at this problem, a rational and efficient data distribution strategy, local optimum distribution (LODS), was proposed. By introducing a series of new definitions to describe the correlations in LODS, employing the hashing method as the core idea and presenting a set of multi-objective optimization rules, LODS combines these innovations with the AHP method to give the final optimized storage scheme. The proposed scheme searches for the target storage node in the cloud system within a local range determined by an optimized decision radius. The decision radius remains stable as the tested storage volume grows large. When the decision radius is 0, LODS degenerates to a strategy almost equivalent to that of Amazon S3. Experiments show that, on average, the proposed cloud storage strategy outperforms those used in HDFS and Amazon S3 in terms of storage load balancing, heat load balancing and waiting time.
Multidimensional QoS Weight Model for Trusted Web Service Optimal Selection Based on Hadoop
HE Xiao-xia and TAN Liang
Computer Science. 2015, 42 (4): 51-55.  doi:10.11896/j.issn.1002-137X.2015.04.008
With the rapid growth of Web service applications, a major problem is how to select, from many Web services with similar functions, the one that best meets users' non-functional requirements. To solve this problem, we designed a multidimensional QoS weight model for trusted Web service optimal selection based on Hadoop, called the HL pattern. In this pattern, we took Hadoop and the HBase platform as the basic framework. Firstly, a subjective-objective weighting mode for multi-dimensional QoS attributes improves the objectivity and accuracy of the QoS attribute weights. Then, credibility parameters are added to improve the credibility of the Web service QoS attributes, and a QoS-tree is used to store the QoS attribute values. Experimental results show that this model not only improves the credibility of the QoS attributes and gives a rational recommended sequence of Web services, with an objective means to meet the user's QoS preference, but also improves Web service search accuracy on Hadoop.
Optimal Power Allocation in Broadband Cognitive Radio Networks Based on SC-FDMA
WANG Zhen-chao, MA Ming-lei and LI Yan
Computer Science. 2015, 42 (4): 56-59.  doi:10.11896/j.issn.1002-137X.2015.04.009
This paper studied the optimal power allocation problem for non-authorized users (NUs) in broadband cognitive radio networks based on single-carrier frequency division multiple access (SC-FDMA). Firstly, we introduced the NU-to-authorized-user (AU) interference power models of broadband cognitive radio networks based on SC-FDMA. On this basis, using convex optimization theory under two kinds of constraints, we derived two optimal power allocation (OPA) algorithms that maximize the sum rate of the NUs. One constraint is the per-subcarrier interference power constraint (IPC) of any NU on the AU; the other is the interference power constraint of all subcarriers of all NUs on the AU. Simulation results show that, compared with equal power allocation (EPA) and the traditional optimal power allocation (C-OPA) algorithm, the proposed algorithms significantly improve the achievable sum rate of the NUs.
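The convex-optimization machinery behind such rate-maximizing allocations is the classic water-filling solution, sketched below under a simple total-power constraint. The paper derives analogous closed forms under interference-power constraints instead; gains and powers here are arbitrary illustrative numbers.

```python
def water_fill(gains, total_power):
    """Water-filling sketch: allocate p_k = max(0, mu - 1/g_k), with the
    water level mu found by bisection so the powers sum to total_power."""
    lo, hi = 0.0, total_power + max(1.0 / g for g in gains)
    for _ in range(100):                     # bisection on the water level
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    return [max(0.0, mu - 1.0 / g) for g in gains]

p = water_fill([1.0, 0.5, 2.0], 3.0)         # better channels get more power
```

The characteristic behaviour — strong subcarriers receive more power, weak ones may receive none — carries over to the interference-constrained variants the paper studies.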
Diagnosis Method of Behavior Inference in Web Service Composition
JIA Zhi-chun and XING Xing
Computer Science. 2015, 42 (4): 60-64.  doi:10.11896/j.issn.1002-137X.2015.04.010
With the wide application of Web services and composite services in distributed networks, the size and complexity of Web services are increasing continuously, which can cause various faults in a service system at run time. In building highly reliable service applications, one of the critical challenges is how to localize the faulty service quickly and exactly and help the service engine restore the normal process as soon as possible. To perfect the diagnosis model and minimize the impact of noisy data on diagnosis accuracy, we presented a service-behavior-model-based diagnosis method using behavior inference. The method builds a hidden Markov model by combining historical data with the service process definition. Using the decoding algorithms of the HMM, the method infers a correct execution trace that has the maximum likelihood given the exceptional execution trace, and localizes the service faults by comparing the differences between them. The experimental results show that the method is effective and robust to various noises in diagnosing the faults of Web services.
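The HMM decoding step referred to here is typically the Viterbi algorithm, sketched below on an invented two-state normal/faulty model (all probabilities are made up for illustration; the paper's model is learned from service histories).

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Standard Viterbi decoding: recover the most likely hidden-state
    sequence for an observation sequence."""
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        layer = {}
        for s in states:
            prob, path = max(
                (V[-1][q][0] * trans_p[q][s] * emit_p[s][o], V[-1][q][1])
                for q in states)
            layer[s] = (prob, path + [s])
        V.append(layer)
    return max(V[-1].values())[1]

states = ("N", "F")                                    # normal / faulty
start = {"N": 0.9, "F": 0.1}
trans = {"N": {"N": 0.8, "F": 0.2}, "F": {"N": 0.3, "F": 0.7}}
emit = {"N": {"ok": 0.9, "err": 0.1}, "F": {"ok": 0.2, "err": 0.8}}
trace = viterbi(["ok", "ok", "err"], states, start, trans, emit)
print(trace)  # ['N', 'N', 'F'] — the fault is localized at the last step
```

Comparing such a decoded trace against the observed faulty trace is what lets the method point at the step where behavior diverged.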
Energy Efficiency Protocol Based on Adaptive Sleeping in Wireless Sensor Network
LIANG Fang and SHEN Ji-nan
Computer Science. 2015, 42 (4): 65-67.  doi:10.11896/j.issn.1002-137X.2015.04.011
Due to the energy limitation of wireless sensor network nodes, energy-saving strategies have become one of the hottest research topics. An energy efficiency protocol based on adaptive sleeping was proposed after analyzing the energy efficiency schemes in all layers of wireless sensor networks. According to the residual energy reserves, geographical location and traffic load of nearby nodes, data packets are forwarded with the adaptive sleeping mechanism. Compared with the geographical random forwarding protocol, this scheme can significantly reduce and balance the energy consumption of the network, and thus noticeably prolong the network's lifetime.
Traffic Prediction Algorithm in Buffer Based on Recurrence Quantification Union Entropy Feature Reconstruction
LU Xing-hua and CHEN Ping-hua
Computer Science. 2015, 42 (4): 68-71.  doi:10.11896/j.issn.1002-137X.2015.04.012
Accurate prediction of short-time network traffic in the base station buffer is the key to alleviating and controlling congestion. Short-time network traffic has nonlinear chaotic characteristics and weak autocorrelation. Traditional methods use linear time series models and do not exploit the nonlinear feature information, so their prediction performance is poor. An improved traffic prediction algorithm based on nonlinear time series analysis and recurrence quantification union entropy feature reconstruction was proposed. The union entropy feature is extracted, and the phase space reconstruction of the characteristic sequences is obtained. The signal is mapped into the high-dimensional phase space, and recurrence quantification analysis is applied to the traffic series. Autocorrelation characteristic singular decomposition is used for linear superposition after aggregation of the traffic series. The average mutual information method and the false nearest neighbor algorithm are used for parameter optimization. Interpolation is applied for time-frequency analysis and traffic flow control, and the prediction of network traffic is completed. Simulation results show that this algorithm has high prediction accuracy and good stability, and its prediction error is lower than that of the traditional method, which shows good application value.
Conditional Access Mechanism of Video-on-demand System Based on Peer-to-Peer Network
ZHOU Xuan, HUAN Guo-qiang and SONG Zhan-jie
Computer Science. 2015, 42 (4): 72-75.  doi:10.11896/j.issn.1002-137X.2015.04.013
Video-on-demand (VoD) based on peer-to-peer (P2P) networks, as a new trend in the pay-IPTV service, has attracted widespread attention in recent years. However, the instability and heterogeneity of P2P networks cause information security risks that impede the realization of such systems. On the basis of the conditional access mechanism of VoD systems based on P2P networks, we analyzed the insecurity factors of traditional systems and proposed a dynamic, two-way conditional access mechanism suitable for P2P-based VoD systems. A mutual authentication protocol, combined with key agreement, ensures the reliability of the identities of the communicating parties and generates the service key for transmitting the control word. This mechanism simplifies the design of the system and further improves its security.
Wireless Sensor Networks Data Storage Strategy Based on RCFile
MIN Lin, FAN Wei-bei, GUO Zheng-wei and FAN Gao-juan
Computer Science. 2015, 42 (4): 76-80.  doi:10.11896/j.issn.1002-137X.2015.04.014
With the development of wireless sensor network technology, it is widely used in many fields, such as military and national defense, industry and agriculture, urban management, biomedicine, environmental monitoring and disaster relief. Due to the characteristics of wireless sensor network nodes, data storage and query strategies have become a hot research topic. In this paper, the existing data storage strategies were described in detail, and their strengths and weaknesses were analyzed. Then, in combination with big data techniques, an efficient data storage structure, RCFile, was applied to the data stored in the sensor network. The data structure was changed by combining row-store and column-store. Based on RCFile, a data storage algorithm named wireless sensor network data storage based on RCFile (WDSR) was proposed, and the corresponding analysis results were given. The simulation results show that the proposed algorithm has certain advantages in low power consumption and high efficiency. Finally, trends in future research on wireless sensor network data storage and query strategies were given.
RSSI Assisted Coordinate-tetrahedron Centroid Localization Algorithm in Three-dimensional Space
GE Bin, ZHENG Jian-bao and HAN Jiang-hong
Computer Science. 2015, 42 (4): 81-84.  doi:10.11896/j.issn.1002-137X.2015.04.015
Three-dimensional localization is one of the important technologies of WSNs. An RSSI-assisted coordinate-tetrahedron centroid localization algorithm in three-dimensional space was proposed. Due to the complexity of real environments, an unknown node may not lie inside the tetrahedron formed by the anchor nodes. Good-quality RSSI values are screened and converted into distances between the unknown node and the anchor nodes. Then the tetrahedron volumes are calculated and compared for exclusion. A centroid iteration method is used to find the tetrahedron that contains the unknown node. In addition, a weighted centroid localization algorithm based on the average RSSI value is used to handle the cases that do not meet the conditions. Simulation results show that this algorithm's positioning error is smaller than that of the coordinate-tetrahedron centroid algorithm, and the weighted centroid localization algorithm based on the average RSSI value increases the node coverage rate.
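Two building blocks of such an algorithm — converting an RSSI reading into a distance, and testing whether a point lies inside an anchor tetrahedron by comparing volumes — can be sketched as follows. The path-loss parameters are illustrative placeholders (real deployments calibrate them), and this is not the paper's full iterative procedure.

```python
import math

def rssi_to_distance(rssi, rssi_at_1m=-40.0, n=2.5):
    """Invert the log-distance path-loss model (parameters are illustrative)."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * n))

def tetra_volume(a, b, c, d):
    """Unsigned volume of tetrahedron abcd via the scalar triple product."""
    u = tuple(b[i] - a[i] for i in range(3))
    v = tuple(c[i] - a[i] for i in range(3))
    w = tuple(d[i] - a[i] for i in range(3))
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return abs(det) / 6.0

def inside(p, tetra):
    """p is inside the anchor tetrahedron iff the four sub-tetrahedra built
    from p sum to the full volume (up to floating-point tolerance)."""
    a, b, c, d = tetra
    parts = (tetra_volume(p, b, c, d) + tetra_volume(a, p, c, d)
             + tetra_volume(a, b, p, d) + tetra_volume(a, b, c, p))
    return abs(parts - tetra_volume(a, b, c, d)) < 1e-9

T = ((0, 0, 0), (4, 0, 0), (0, 4, 0), (0, 0, 4))
print(inside((1, 1, 1), T), inside((4, 4, 4), T))
```

The volume-comparison test is the "calculated and compared for exclusion" step: candidate tetrahedra that fail it cannot contain the unknown node.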
Research on Stability and Robustness of Complex Network Structure
MAO Kai
Computer Science. 2015, 42 (4): 85-88.  doi:10.11896/j.issn.1002-137X.2015.04.016
In the study of complex networks, we divided network structures into three types according to node connection tendency: disassortative networks, assortative networks and neutral networks. We adopted variable gradient analysis to judge and analyze the stability of each type. Theoretical analysis shows that a disassortative network is stable within a wide range, an assortative network's state is unstable, and the stability of a neutral network is uncertain. We can determine whether a network is in a steady state based on the tendency of node connection. Meanwhile, in the research on complex network robustness, related simulation results show that stability and robustness are positively correlated: the robustness of the disassortative network is the best, followed by the neutral network, with the assortative network being the most fragile.
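The node-connection-tendency classification used above is conventionally measured by Newman's degree assortativity coefficient (negative for disassortative, positive for assortative, near zero for neutral). A sketch of that measurement — not of the paper's variable gradient analysis — follows.

```python
from collections import Counter

def degree_assortativity(edges):
    """Pearson correlation of endpoint degrees over all edges (each edge
    counted in both directions): r < 0 disassortative, r > 0 assortative,
    r ~ 0 neutral."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    cov = sum((x - mean) * (y - mean) for x, y in zip(xs, ys)) / n
    return cov / var if var else 0.0

print(degree_assortativity([(0, 1), (0, 2), (0, 3)]))  # -1.0: a star is disassortative
```

A star graph is the extreme disassortative case (hubs connect only to leaves), while a regular ring has zero variance in degree and is classified neutral.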
Research on Improved Anomaly Detection and Localization Algorithm in Wireless Sensor Networks
LAI Kai and WANG Xin-bing
Computer Science. 2015, 42 (4): 89-93.  doi:10.11896/j.issn.1002-137X.2015.04.017
Fast anomaly detection and localization are critical to ensure the effective functioning of wireless sensor networks. This paper presented an improved anomaly detection and localization algorithm for wireless sensor networks, in which network heterogeneity is exploited for better bandwidth and energy efficiency. End-to-end measurements are collected through two-phase probing. The goal of the first probing phase is to select probes that cover as many anomalous links as possible and narrow down the suspicious areas to be examined in the second phase. The probe selection problem in this phase is formulated as a budgeted maximum coverage problem, and we proposed an efficient approximation algorithm to solve it based on linear programming duality. The second probing phase aims to locate the individual links responsible for the observed end-to-end anomalies with minimum communication cost. The prediction of diagnosis quality is carried out using the loopy belief propagation (LBP) algorithm. Experimental results show that our algorithm is much faster than the exact solution at the cost of slight performance degradation.
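To make the budgeted maximum coverage formulation concrete, here is the textbook greedy cost-benefit baseline for that problem. It is not the paper's LP-duality-based algorithm, and the probe/link names are invented.

```python
def greedy_budgeted_coverage(probes, budget):
    """Greedy sketch for budgeted maximum coverage: repeatedly pick the probe
    with the best (newly covered links) / cost ratio that still fits the
    budget.  probes maps name -> (set of links covered, cost)."""
    covered, chosen, spent = set(), [], 0.0
    while True:
        best, best_ratio = None, 0.0
        for name, (links, cost) in probes.items():
            if name in chosen or spent + cost > budget:
                continue
            gain = len(links - covered)
            if cost > 0 and gain / cost > best_ratio:
                best, best_ratio = name, gain / cost
        if best is None:
            return chosen, covered
        chosen.append(best)
        links, cost = probes[best]
        covered |= links
        spent += cost

probes = {
    "p1": ({"l1", "l2", "l3"}, 2.0),
    "p2": ({"l3", "l4"}, 1.0),
    "p3": ({"l5"}, 3.0),
}
chosen, covered = greedy_budgeted_coverage(probes, budget=3.0)
```

Within the budget of 3, the greedy picks the cheap high-ratio probe first, then the larger one, covering four of the five links.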
Data Rate Adaptive Routing Algorithm in Energy Harvesting Wireless Sensor Networks
QIU Shu-wei, YUAN Li-yong and LI Yan-yan
Computer Science. 2015, 42 (4): 94-100.  doi:10.11896/j.issn.1002-137X.2015.04.018
Energy harvesting wireless sensor networks are a very important kind of passive sensing technology, which can effectively solve the problem of limited node energy and maintain continuous operation. Existing routing algorithms do not take advantage of the energy harvesting characteristics of nodes, nor do they take the packet delivery ratio and the data rate into account. To improve network performance, we proposed a data rate adaptive routing algorithm that incorporates the packet delivery ratio of the wireless link. By modeling the residual energy of nodes and the packet delivery ratio of the wireless link, two constraints that a relay node on a packet delivery route must meet were derived. Through an optimization equation, the data rate that minimizes the packet delivery delay over each hop on the route is configured adaptively, and a route discovery procedure was proposed to find the packet delivery path that reduces the end-to-end delay. Experimental results show that the proposed algorithm obtains lower packet delivery delay and higher throughput than fixed-rate routing algorithms in energy harvesting wireless sensor networks.
Privacy-preserving for Location-based Service over Encrypted Data Search
LIU Shu-bo, LI Yan-min and LIU Meng-jun
Computer Science. 2015, 42 (4): 101-105.  doi:10.11896/j.issn.1002-137X.2015.04.019
In location-based service (LBS) systems, it is a vital problem to protect users' privacy, including identity privacy, location privacy and preference privacy, while providing high-quality service at the same time. This paper proposed a privacy-preserving scheme based on encrypted data search. The location-based service provider outsources its encrypted data and index to a cloud server, which executes users' LBS queries and returns the top-K results to the user according to matching scores. The scheme relies neither on users' cooperation nor on any trusted third party. Theoretical and experimental analysis shows that the proposed scheme can effectively protect users' identity, location and preference privacy with low computation and communication overhead.
SAML Cross-domain Single Sign-on Authentication Protocol Based on Convertible Proxy Signcryption
WANG Guan-zhong, ZHANG Bin, FEI Xiao-fei and XIONG Hou-ren
Computer Science. 2015, 42 (4): 106-110.  doi:10.11896/j.issn.1002-137X.2015.04.020
Convertible proxy signcryption algorithms have the advantages of protecting user privacy, resisting replay attacks, preventing repudiation, etc. A SAML cross-domain single sign-on authentication protocol (SSPCPS) was proposed based on this algorithm. By letting the user and the heterogeneous domain server interact and authenticate directly, the protocol simplifies the process of SSO authentication. The user token is generated by combining selected random parameters with the public key, and is transferred in secret form, improving the security of the protocol: an attacker cannot use the service even if the token is stolen. A proxy signature key is used to signcrypt the digest, reducing the amount of computation and ensuring the privacy of the user as well. The session key is negotiated with the DH algorithm, simplifying the distribution process and reducing management cost. The security of the protocol was proved in the CK security model, and a performance analysis was presented. The results indicate that the protocol provides forward secrecy, message integrity, etc., and that its amount of computation and token generation time are better than those of the SSPPS protocol, Juang's scheme and the Kerberos scheme.
Verification Methods Based on Petri Networks for Web Services Composition
SHEN Hua, HE Yan-xiang and ZHANG Ming-wu
Computer Science. 2015, 42 (4): 111-115.  doi:10.11896/j.issn.1002-137X.2015.04.021
The purpose of structural verification of service compositions is to find the inherent weaknesses in the structure of a service composition, ensuring that the structure of the Web service composition behaves well in the running phase. Boundedness verification of a Web service composition is used to determine whether there are Web services or sub-compositions that impede the execution of the composition. Deadlock verification is used to find whether service blind areas exist, and trap verification is used to find whether abnormal service areas exist. This paper gave algorithms to complete the above verifications, and their correctness was validated by testing experiments.
Research of Android Abnormal Intrusion Detection Based on Feature-weighted K-nearest-neighbor SVM
SUN Min, XU Cai-xia and GAO Yang
Computer Science. 2015, 42 (4): 116-118.  doi:10.11896/j.issn.1002-137X.2015.04.022
In this paper, an abnormal intrusion detection method based on FWKN-SVM (feature-weighted K-nearest-neighbor support vector machine) for the Android platform was proposed. Firstly, we analyzed the limitations of the traditional SVM in practical applications and proposed the feature-weighted K-nearest-neighbor method to reduce the training set. Then, the system behavior was defined according to the impact of mobile malware on the system, and the test set and the training set were built using a data acquisition module implemented on an Android phone. Lastly, we used the feature-weighted K-nearest-neighbor method to reduce the training set and construct the SVM classifier, and then predicted the test set. Simulation results show that the FWKN-SVM classification method performs well in Android abnormal intrusion detection.
Symptoms Privacy-preserving Matching Protocol for m-Healthcare Social Network
YANG Zhao-huan, LIU Shu-bo, LI Yong-kai and CAI Chao-hui
Computer Science. 2015, 42 (4): 119-122.  doi:10.11896/j.issn.1002-137X.2015.04.023
With the rapid development of wireless body area networks, the m-healthcare social network has emerged as a promising mobile healthcare system. However, m-healthcare social networks currently raise privacy and security issues, e.g., how to ensure the privacy of patients when matching patients with the same symptoms, and how to ensure communication privacy between parties. We first proposed a matching metric function calculation protocol, through which a patient can secretly select a matching patient with the same symptoms according to the patient's requirements. Besides, we proposed a symptoms-matching-based key agreement protocol built on the matching metric function calculation protocol. To meet the need for privacy preservation when patients communicate, this protocol can authenticate the parties and compute a shared session key under the condition of successful symptom matching.
Access Control Subject Similarity and Constraints
Abdugheni ABDUXUKUR, Kaysar RAHMAN and Nurmamat HELIL
Computer Science. 2015, 42 (4): 123-126.  doi:10.11896/j.issn.1002-137X.2015.04.024
Constraints are an important factor in access control; they restrict sensitive combinations of objects from being accumulated by similar subjects. However, conventional access control constraints lack flexibility. In order to improve the flexibility of constraints, we first analyzed the potential inner relationships among subjects and among objects, and the relationships between them, in access control. Then we proposed the concept of similar subject groups, and on this basis proposed a revised access control constraint. Secondly, we conducted an experiment of subjects accessing objects. Experimental results show the presented constraint is feasible and flexible. This revised constraint not only has the capability of conventional access control constraints, but also effectively prevents collusive attacks on the system by similar subjects.
Spectrum Detection of Time-varying DoS Attack Signal Based on Envelope Extension and Intrinsic Wave Matching
TANG Zan-yu and LIU Hong
Computer Science. 2015, 42 (4): 127-131.  doi:10.11896/j.issn.1002-137X.2015.04.025
DoS attack signals are non-stationary and time-varying. They are buried in the complex network environment with a colored noise background and are difficult to detect. Traditional methods use the Hough transform impulse response method to detect the non-stationary signal; due to the edge effect of the frequency distribution, detection performance is poor. A new spectrum detection method for DoS attack signals was proposed based on envelope extension and intrinsic wave matched filtering. The DoS attack signal is processed with hyperbolic frequency-modulated signal decomposition, and a mathematical evolution model is constructed. The intrinsic wave features of the signal envelope are extracted. The bilinear Hough transform method is used to analyze the spectrum distortion, the instantaneous frequency estimate is obtained, and the single pulse amplitude-frequency response is calculated. In the time-frequency feature space, the envelope extension path search is optimized. The intrinsic wave matched filter is designed based on the minimum mean square error criterion. The DoS frequency shift is controlled, and spectrum detection is achieved. Simulation results show that the algorithm improves detection performance and suppresses the interference of strong colored noise; its detection probability is higher than that of traditional methods. It can accurately estimate the parameters, improving the active defense capability of network security.
Delegation Authorization Protocol Based on Remote Attestation Applied in Multimedia DRM
FENG Wei-ning, ZHANG Zhi-yong and ZHAO Chang-wei
Computer Science. 2015, 42 (4): 132-135.  doi:10.11896/j.issn.1002-137X.2015.04.026
Existing delegation authorization models mainly concern whether the delegatee can execute the delegated assignment (privilege); the trustworthiness of the delegator's platform is not addressed. In view of this, the paper presented a delegation authorization security protocol based on remote attestation in a multimedia environment, which guarantees trusted delegation authorization for multimedia contents. This protocol not only lets the delegator and the multimedia server trust the delegatee's authentication and platform integrity, but also achieves trusted access to the multimedia. Delegation verification and the message interactions between entities were described, and the potential attacks were enumerated and analysed. The delegation authorization protocol based on remote attestation applied in DRM realizes trusted delegation, and its functions are more complete than those of existing protocols.
Algorithm of Digital Image Encrypting Based on Multi-chaotic Sequence Generated by Rational Bézier Surface
ZHANG Yong-hong
Computer Science. 2015, 42 (4): 136-140.  doi:10.11896/j.issn.1002-137X.2015.04.027
Abstract PDF(1808KB) ( 423 )   
This paper proposed a digital image encryption scheme based on a multi-chaotic system generated by a rational Bézier surface. Firstly, multiple initial conditions of the Logistic chaotic system are generated from an external 8t-bit secret key, yielding multiple Logistic chaotic sequences, from which the multi-chaotic matrix is generated via the rational Bézier surface. Secondly, an XOR operation between the plain image and the chaotic matrix is applied. Finally, a chaotic address set is generated from the multi-chaotic matrix, and the permutation is performed using this address set. The algorithm has the advantage that both diffusion and scrambling are driven by the multi-chaotic sequence. Results of several experiments, statistical analyses and key-sensitivity tests show that the proposed scheme provides an efficient and secure way for real-time image encryption and transmission.
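As a rough illustration of the diffusion step the abstract describes (not the paper's multi-chaotic Bézier scheme), the sketch below uses a single Logistic map to generate a keystream and XORs it with the pixel bytes; the parameters `x0` and `r` are illustrative placeholders for the key-derived initial conditions.

```python
def logistic_keystream(x0, r, n):
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)   # quantize the chaotic value to a byte
    return out

def xor_diffuse(pixels, x0=0.3141, r=3.9999):
    """XOR each pixel byte with the chaotic keystream; the same call decrypts."""
    ks = logistic_keystream(x0, r, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]

plain = [12, 200, 45, 45, 45, 99]
cipher = xor_diffuse(plain)
assert xor_diffuse(cipher) == plain      # XOR diffusion is an involution
```

Because XOR with the same keystream is its own inverse, decryption is the same operation as encryption; the full scheme additionally permutes pixel positions using a chaotic address set.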
Load Balancing Strategy Based on Pressure Feedback on MapReduce
LI Hang-chen, QIN Xiao-lin and SHEN Yao
Computer Science. 2015, 42 (4): 141-146.  doi:10.11896/j.issn.1002-137X.2015.04.028
Abstract PDF(1554KB) ( 425 )   
Data skew is one of the factors that seriously affect the performance of MapReduce. Existing solutions to the data skew problem burden users with providing a partition function for the specific application, or with writing additional sampling processes for MapReduce. To solve this problem, we presented a load balancing strategy based on pressure statistics. To obtain the global data distribution, we compute the statistics while the data are being prepared, making full use of the shuffle stage in MapReduce. To balance the entire cluster, the strategy schedules the heavily loaded nodes according to the data distribution, without requiring the user to provide additional input. In addition, because of the complexity of applications, we introduced a pressure feedback mechanism that further improves the performance of the scheduling policy. The experimental results show that our strategy is far more efficient than the default strategy.
Multi-layer Temporal Data Model Supporting Multi-modality Fusion Entity Search in Heterogeneous Information Spaces
YANG Dan, CHEN Mo, SUN Liang-xu and WANG Gang
Computer Science. 2015, 42 (4): 147-150.  doi:10.11896/j.issn.1002-137X.2015.04.029
Abstract PDF(591KB) ( 404 )   
Faced with large numbers of associated heterogeneous entities with temporal information in heterogeneous information spaces, such as projects, authors, papers, products and movies, an entity- and association-centered multi-layer temporal data model, named the multi-layer temporal entity network (MTE-Network), was proposed, which captures the temporal information of entities and associations effectively. Moreover, a multi-modality query model for entity search was proposed. It supports searching entities and the related entities of any entity class in heterogeneous information spaces, supports entity search at the entity level, at the entity aggregation level and along the time line, and can satisfy users' multi-modality entity search needs. Experimental results on a real data set demonstrate the feasibility and effectiveness of the proposed temporal data model and query model.
Consistency Detection Method between UML Model and Java Source Code
ZENG Yi, LI Han-yu, LIU Hui-jun, YU Shuang-shuang and ZHOU Bo
Computer Science. 2015, 42 (4): 151-155.  doi:10.11896/j.issn.1002-137X.2015.04.030
Abstract PDF(932KB) ( 1138 )   
Aiming at the problem of inconsistencies between code and model, this paper presented a consistency detection method for UML models and Java code. Firstly, the UML class diagram and sequence diagram are formalized, and the concept of a sequence diagram call graph (SD-CG) is put forward. On this basis, the associated relationships between classes are transformed into associated attributes of classes, and the sequence diagram is transformed into the SD-CG. Secondly, to analyze the dynamic behavior of the source code by means of a call graph whose nodes are class methods and whose edges are the call relations between them, this paper obtained the information of Java classes and the method call graph (CG) through lexical and syntax analysis of the Java source code. Then a consistency detection algorithm between UML model and Java source code was designed, covering both static information consistency and interaction information consistency. At last, we developed a prototype tool that performs the consistency detection, verifying the feasibility and effectiveness of the proposed method.
Friend Recommendation Algorithm Based on PH-Tree Multi-attribute Index Tree
LIANG Jun-jie and SUN Yang-zheng
Computer Science. 2015, 42 (4): 156-159.  doi:10.11896/j.issn.1002-137X.2015.04.031
Abstract PDF(405KB) ( 602 )   
Nowadays more and more people make new friends on online social networks (OSN), and recommending friends to users rapidly and exactly has become an important feature of OSNs. An online friend recommendation algorithm was proposed that uses a sorted index tree based on the local network structure. Multi-attribute intersection values between different users are converted into binary vectors and organized in a PH-Tree, so a user's best friend recommendation set can be determined simply by traversing the index tree. The experiments show that our method performs efficiently and precisely.
Intelligent Discovery of Dynamic Knowledge and Logic Dynamic Relation between its Attributes
XU Feng-sheng, YU Xiu-qing and SHI Kai-quan
Computer Science. 2015, 42 (4): 160-165.  doi:10.11896/j.issn.1002-137X.2015.04.032
Abstract PDF(448KB) ( 461 )   
One-direction singular rough sets and the dual of one-direction singular rough sets are two dynamic structures of S-rough sets. Under certain conditions they can be reverted to Z. Pawlak rough sets, and they are also regarded as a fundamental form of S-rough sets. Using these two structures, the attribute conjunctive normal form of dynamic knowledge and its shrinking-expansion characteristics were given, together with the corresponding knowledge reasoning structure and reasoning model. By intersecting and fusing such knowledge, dynamic knowledge generation with the shrinking-expansion characteristics of the attribute conjunctive normal form and its generation theorem were obtained, as well as the logic relation between the intelligent discovery of dynamic knowledge and its attributes when the knowledge reasoning condition is satisfied. Finally, the intelligent filter, the filter criterion, the filter theorem and applications were presented.
Research on Stock Price Real-time Fluctuation Influence Based on Web Big Data Mining
YANG Sha, YU Wei, LI Shi-jun, CAO Jing-jing and LIU Jing
Computer Science. 2015, 42 (4): 166-171.  doi:10.11896/j.issn.1002-137X.2015.04.033
Abstract PDF(502KB) ( 761 )   
With the growth of the Internet, big data on the Web has become an important information hub for the financial sector. These data bring valuable opportunities and severe challenges to stock analysis and prediction. The development of Web 2.0 allows investors to actively participate in every aspect of creating, exchanging and accessing network information. This paper mined the information disclosed on the Internet to build a complete and effective fundamental picture, and identified 10 factors that may affect the stock market: the economic cycle, fiscal policy, changes in interest rates, changes in exchange rates, price changes, inflation, political policies, changes in the industry, operating conditions and downstream effects. We proposed an algorithm for stock market prediction based on these 10 factors. Experiments show that the method is feasible and can bring significant academic and commercial value.
On Density Based Outlier Detection for Uncertain Data
JIANG Yuan-kai, ZHENG Hong-yuan and DING Qiu-lin
Computer Science. 2015, 42 (4): 172-176.  doi:10.11896/j.issn.1002-137X.2015.04.034
Abstract PDF(410KB) ( 426 )   
Uncertain data are ubiquitous in many applications, such as mobile computing, sensor networks and RFID technology, and outlier detection algorithms can improve the quality of these services. A density-based outlier detection algorithm for uncertain data, RLOF, was proposed. The algorithm introduces an R2-tree structure, which effectively reduces the time complexity of computing the local outlier factor. It also reduces the cost of data updates in the uncertain data set and the maintenance cost over massive data. Theoretical analysis and experimental results show that the algorithm is effective and feasible.
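For reference, a minimal sketch of the plain local outlier factor (LOF) that RLOF builds on, with no uncertainty handling and no R2-tree index (a brute-force neighbor search stands in for it); points whose score is well above 1 are density-based outliers.

```python
import math

def lof(points, k=2):
    """Local outlier factor of each point (brute-force k-NN, no index)."""
    def dist(a, b):
        return math.dist(a, b)
    n = len(points)
    knn, kdist = [], []                    # k nearest neighbours and k-distance
    for i in range(n):
        ds = sorted((dist(points[i], points[j]), j) for j in range(n) if j != i)
        knn.append([j for _, j in ds[:k]])
        kdist.append(ds[k - 1][0])
    def reach(i, j):                       # reachability distance of i from j
        return max(kdist[j], dist(points[i], points[j]))
    # local reachability density: inverse of the mean reachability distance
    lrd = [k / sum(reach(i, j) for j in knn[i]) for i in range(n)]
    # LOF: average ratio of the neighbours' density to the point's own density
    return [sum(lrd[j] for j in knn[i]) / (k * lrd[i]) for i in range(n)]

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
scores = lof(pts)
assert scores[-1] == max(scores)           # the far point gets the largest LOF
```

The paper's contribution sits on top of this computation: the R2-tree replaces the O(n²) neighbor search and cheapens updates when the uncertain data change.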
Predicting Potential Drug Targets from Ion Channel Proteins Based on Ensemble Learning
XIE Qian-qian, LI Ding-fang and ZHANG Wen
Computer Science. 2015, 42 (4): 177-180.  doi:10.11896/j.issn.1002-137X.2015.04.035
Abstract PDF(331KB) ( 743 )   
The identification of molecular targets is a critical step in the discovery and development of new drugs. Among known drug targets, ion channel proteins are among the most attractive, being closely linked to diseases of the cardiovascular and central nervous systems. Traditional biological methods for mining drug targets are costly and time-consuming. Our work mined potential ion channel drug targets with random forests, aiming to speed up the discovery of drug targets and save money. Since the sequences related to drug targets have diverse lengths, thirteen types of protein encoding features were considered, which transform protein sequences of distinct lengths into fixed-length representations. A feature subset with better performance in separating drug targets from non-targets was chosen through numerical experiments, and ensemble learning was introduced to obtain the prediction models. Our method attains higher accuracy than existing methods, which is critical for mining new drug targets.
Tracking Initiation Algorithm Based on Grid Clustering and Modified Logic Algorithm
YU Sha, CHEN Ming-yan and CAO Jian-shu
Computer Science. 2015, 42 (4): 181-184.  doi:10.11896/j.issn.1002-137X.2015.04.036
Abstract PDF(369KB) ( 635 )   
A new track initiation algorithm based on grid clustering and a modified logic method (TI-GCL) was proposed to solve the problems of numerous false tracks and slow initiation in high-density clutter environments. To obtain more accurate clusters, the algorithm uses grid-core-based clustering with boundary optimization to handle echo points from high-density and low-density grids, then clusters the data in each cell by similarity, and finally applies the modified logic method to initiate the target tracks. Simulation results show that the algorithm initiates tracks accurately and quickly in dense clutter environments and is suitable for engineering practice.
Classifying Interests of Social Media Users Based on Information Content and Social Graph
WU Hai-tao and YING Shi
Computer Science. 2015, 42 (4): 185-189.  doi:10.11896/j.issn.1002-137X.2015.04.037
Abstract PDF(768KB) ( 420 )   
With the development of society, audience segmentation has become an ever more obvious characteristic of information spreading in human activity, and user classification has become an important research topic. This article therefore studied online social network users from multiple perspectives, mainly covering user classification based on interested topics and preferences, classifying the interests of social media users based on information content, on topological relations, and on both combined. For content-based classification, we adopted LDA to extract topic distributions from the content posted by users, and fed the distributions into support vector machines, decision trees, Bayes and other models to classify users' interests. For classification based on topological relations, we found that users with the same interests tend to have more common fans, and built classification models on this finding. We then proposed methods combining information content and topological relations. In experiments on Twitter data, the combined method outperformed either the content-based or the topology-based method alone.
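The topological finding, that users with the same interests share more common fans, can be turned into a classifier in its simplest form as a nearest-neighbor rule over fan-set overlap. The sketch below is illustrative only (the `labeled` data and the 1-NN rule are assumptions, not the paper's trained model):

```python
def fan_overlap(a, b):
    """Jaccard similarity of two users' fan sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def classify_by_fans(user_fans, labeled):
    """Assign the interest label of the labeled user whose fan set is most
    similar -- a 1-NN use of the 'common fans' observation."""
    best = max(labeled, key=lambda u: fan_overlap(user_fans, labeled[u][0]))
    return labeled[best][1]

# Hypothetical labeled users: name -> (fan list, interest label)
labeled = {
    "alice": (["f1", "f2", "f3"], "sports"),
    "bob":   (["f7", "f8", "f9"], "music"),
}
assert classify_by_fans(["f2", "f3", "f4"], labeled) == "sports"
```

The combined method described in the abstract would mix a score like this with the content-based (LDA-feature) classifier's output.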
Transfer Learning Algorithm between Keepaway Tasks Based on Policy Reuse
LI Xue-jun, CHEN Shi-yang, ZHANG Yi-wen and LI Long-shu
Computer Science. 2015, 42 (4): 190-193.  doi:10.11896/j.issn.1002-137X.2015.04.038
Abstract PDF(439KB) ( 574 )   
In the RoboCup Keepaway task, players can learn good high-level strategies with reinforcement learning. However, since Keepaway tasks have a very large state space, plain reinforcement learning requires a great many search steps to converge and takes a very long time. To solve this problem, for the 5v4 Keepaway task, the policy reuse technique is applied to the reinforcement learning procedure of the takers' high-level decisions to achieve transfer learning. A transfer plan, along with the mapping of state and action spaces between the 4v3 and 5v4 tasks, was designed, and a policy-reuse-based algorithm was stated. Experiments show that after the same training time on the 5v4 task, takers achieve shorter task-completion times and a higher stealing success rate with transfer learning than with plain reinforcement learning, so transfer learning yields better policies. Transfer learning also needs much less training time than plain reinforcement learning to reach the same policy level.
Research on Efficient Privacy Preserving Frequent Pattern Mining Algorithm
CHENG Shu-tong, XU Cong-fu and DAN Hong-wei
Computer Science. 2015, 42 (4): 194-198.  doi:10.11896/j.issn.1002-137X.2015.04.039
Abstract PDF(415KB) ( 479 )   
This paper elaborated the goal of privacy-preserving data mining: to satisfy users' demand for privacy protection while still obtaining effective mining results. For the privacy protection of individual users and of group users, the paper discussed different methods and summarized the two main classes of privacy-preserving data mining algorithms. Since EMASK (efficient mining of associations with secrecy konstraints) needs full database scans and many comparison operations, the authors proposed BEMASK, a variant of EMASK based on Bitmap computation. It transforms the relational data into a machine-oriented relational model, converts data processing into a granular computing method, and turns the counting of frequent itemsets into computing intersections of basic granules. In particular, with the vertical Bitmap representation, and without reducing accuracy, it both reduces the number of I/O operations and greatly improves efficiency.
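The core of the vertical Bitmap idea can be sketched as follows (a generic illustration, not BEMASK itself): each item gets a bit vector over transactions, and the support of an itemset is the popcount of the bitwise AND of its items' vectors, replacing repeated database scans.

```python
from functools import reduce

def to_bitmap(transactions, item):
    """Vertical bitmap of an item: bit t is set iff transaction t contains it."""
    bits = 0
    for t, items in enumerate(transactions):
        if item in items:
            bits |= 1 << t
    return bits

def support(transactions, itemset):
    """Itemset support = popcount of the AND of its items' bitmaps."""
    bm = reduce(lambda x, y: x & y,
                (to_bitmap(transactions, i) for i in itemset))
    return bin(bm).count("1")

db = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b"}]
assert support(db, ["a"]) == 3
assert support(db, ["a", "b"]) == 2
```

Intersecting bit vectors is a single pass over machine words, which is where the I/O and CPU savings over scan-based counting come from.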
Statistical Testing Based Research on Kernel Evaluation Measures
WANG Pei-yan and CAI Dong-feng
Computer Science. 2015, 42 (4): 199-205.  doi:10.11896/j.issn.1002-137X.2015.04.040
Abstract PDF(782KB) ( 651 )   
This paper explored evaluating kernel functions by means of statistical testing. Employing the kernel, the normalized kernel, the centered kernel and the kernel distance as geometric measures among samples in feature space, and applying seven statistical tests such as the t-test and the F-test, the paper evaluated the distributional difference between the geometric measures among samples of the same class and those among samples of different classes. Kernel-selection experiments on 11 UCI datasets show that the statistical-testing-based kernel evaluation measures reach or exceed the performance of KTA, FSM and similar measures. We also found that the difference between the two data distributions is mainly reflected in the variance, and that transformations of the kernel function such as normalization or centering can change the feature space and distort the evaluation.
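As one concrete instance of the idea, the t-test variant can be sketched like this: collect within-class and between-class kernel values as two samples and compute a Welch t statistic (the sample values below are made up; the paper's exact test battery is not reproduced).

```python
import math

def welch_t(xs, ys):
    """Welch's t statistic for two samples with possibly unequal variances."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)   # unbiased variances
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# For a kernel well matched to the task, within-class kernel values sit
# above between-class ones, so the statistic is large and positive.
within  = [0.9, 0.8, 0.85, 0.95, 0.88]
between = [0.2, 0.3, 0.25, 0.15, 0.28]
assert welch_t(within, between) > 5
```

A larger separation between the two distributions then ranks that kernel higher, analogously to how KTA scores kernel-target alignment.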
Resource Scheduling Algorithm Based on Social Force Swarm Optimization Algorithm in Cloud Computing
YUAN Hao and LI Chang-bing
Computer Science. 2015, 42 (4): 206-208.  doi:10.11896/j.issn.1002-137X.2015.04.041
Abstract PDF(299KB) ( 560 )   
To improve the efficiency of resource scheduling in cloud computing, this paper put forward a cloud computing resource scheduling method based on a social-force swarm optimization algorithm. Firstly, the task completion time of resource scheduling is taken as the objective function of the swarm optimization algorithm based on the social force model. Then the optimal scheduling scheme is found by simulating the self-organization and congestion-avoidance behavior of crowd evacuation. Finally, simulation experiments were carried out to test the performance of the proposed method. The results show that, compared with other resource scheduling methods in cloud computing, the proposed method can quickly find the optimal resource scheduling scheme, balance the resource load better, and improve the utilization of cloud computing resources.
Research of Microblog Sentiment Classification Based on Emotional Polarity Certainty for Vocabulary
GU Yi-jun and LIU Xiao-ming
Computer Science. 2015, 42 (4): 209-212.  doi:10.11896/j.issn.1002-137X.2015.04.042
Abstract PDF(412KB) ( 492 )   
To improve the precision of sentiment classification, the paper proposed a method for determining the certainty of sentiment. The method takes account of multiple sentiment features of the vocabulary in the sentiment lexicon and identifies opinion-bearing clauses, then classifies the clauses into different emotional polarities through machine learning. The experiments show that the proposed method, by employing sentiment certainty, unifies the measurement of emotional polarities and performs better at identifying opinion-bearing clauses and at emotional polarity classification.
Dynamic Algorithm for Computing Attribute Reduction Based on Information Granularity
WANG Yong-sheng, ZHENG Xue-feng and SUO Yan-feng
Computer Science. 2015, 42 (4): 213-216.  doi:10.11896/j.issn.1002-137X.2015.04.043
Abstract PDF(351KB) ( 556 )   
Dynamic attribute reduction is one of the important issues in rough set theory. A dynamic attribute reduction model based on information granularity was constructed for dynamic decision tables, and an incremental approach to computing information granularity when a new attribute set is added to the decision table was discussed in detail. On this basis, a dynamic attribute reduction algorithm using information granularity as the heuristic was proposed. The algorithm can reuse the attribute reduction and information granularity of the original decision table, which effectively reduces the computational complexity and gives the attribute reduction better inheritance. Finally, an example and experimental comparisons indicate the feasibility and validity of the proposed algorithm.
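For orientation, one common definition of information granularity over a partition of the universe is G(B) = Σ|Xi|² / |U|² (the definition assumed here; the paper may use a variant). Finer partitions, i.e. more discriminating attribute sets, give smaller granularity, which is what makes it usable as a reduction heuristic.

```python
def granularity(universe, blocks):
    """Information granularity of a partition: sum(|Xi|^2) / |U|^2.
    Finer partitions (more, smaller blocks) give smaller values."""
    n = len(universe)
    return sum(len(b) ** 2 for b in blocks) / (n ** 2)

U = list(range(8))
coarse = [U[:4], U[4:]]                  # two blocks of 4
fine = [U[:2], U[2:4], U[4:6], U[6:]]    # four blocks of 2
assert granularity(U, fine) < granularity(U, coarse)
```

The incremental step in the paper avoids recomputing this from scratch: when attributes are added, the new partition only refines the old blocks, so the sum can be updated block by block.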
Solving Strong Cyclic Planning with Minimal Expectation Weight
LI Yang, WEN Zhong-hua, WU Xiao-hui and LAO Jia-qi
Computer Science. 2015, 42 (4): 217-220.  doi:10.11896/j.issn.1002-137X.2015.04.044
Abstract PDF(655KB) ( 421 )   
In the real world, executing an action often incurs a cost, and because of interference from the external environment, the result of an action's execution is uncertain. To address this, we added weights to the actions of the system and used a probability distribution to model the stochastic state transitions. Based on the proposed concept of the expectation weight of a strong cyclic plan, a method for solving strong cyclic planning with minimal expectation weight was designed. The method applies depth-first search to obtain all strong cyclic plans, converts each plan into a system of linear equations whose variables are expectation weights, and then solves the system by Gaussian elimination.
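The linear-system step can be illustrated on a toy plan (the states, weights and probabilities below are invented for illustration): each non-goal state s satisfies E(s) = w(aₛ) + Σ p(s′|s,aₛ)·E(s′) with E(goal) = 0, and a cycle makes E(s) appear on both sides, hence the need for elimination rather than simple backward substitution.

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))   # pivot row
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Toy plan: from s0 an action of weight 1 reaches s1 or the goal with
# probability 0.5 each; from s1 an action of weight 2 reaches the goal.
#   E(s0) = 1 + 0.5*E(s1)   =>   E(s0) - 0.5*E(s1) = 1
#   E(s1) = 2               =>            E(s1)    = 2
E = solve([[1.0, -0.5], [0.0, 1.0]], [1.0, 2.0])
assert E == [2.0, 2.0]
```

Solving one such system per candidate strong cyclic plan and keeping the plan with the smallest expectation weight of the initial state gives the minimal-expectation-weight solution.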
Spatiotemporal Neighborhood Search for Solving Mixed-load School Bus Routing Problem
DANG Lan-xue, HOU Yan-e and KONG Yun-feng
Computer Science. 2015, 42 (4): 221-225.  doi:10.11896/j.issn.1002-137X.2015.04.045
Abstract PDF(412KB) ( 418 )   
Neighborhood search is one of the key issues in algorithm design for the mixed-load school bus routing problem (SBRP). To save the execution time of neighborhood search, this paper introduced the concepts of spatiotemporal distance and spatiotemporal connectivity index to reduce the size of the search space. The spatiotemporal distance between stop nodes is defined as the weighted sum of the normalized spatial distance and temporal distance. The spatiotemporal connectivity index, a number indicating the feasible connectivity between two SBRP nodes, is determined from the spatiotemporal distance and constraints such as school bus capacity, school time windows and maximum riding time. The spatiotemporal connectivity indexes are sorted in descending order for each node, and the resulting neighborhood lists are prepared for the neighborhood search operators. In the subsequent node moves, neighborhood nodes with higher index values are examined preferentially, so effective moves are accepted as early as possible and the size of the neighborhood lists can be reduced substantially. Tests on a set of large-scale mixed-load SBRP instances show that the spatiotemporal neighborhood search reduces the neighborhood search space and saves solution time dramatically.
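The spatiotemporal distance definition reduces to a one-liner; the weight `w` and the normalization by the network's maximum spatial and temporal gaps are the assumptions here (the paper's exact normalization may differ).

```python
def spatiotemporal_distance(spatial, temporal, max_spatial, max_temporal, w=0.5):
    """Weighted sum of normalized spatial and temporal distance between two
    stops; w trades spatial proximity against time-window closeness."""
    return w * spatial / max_spatial + (1 - w) * temporal / max_temporal

# Stops 300 m / 4 min apart in a network whose largest gaps are 1 km / 10 min:
d = spatiotemporal_distance(300, 4, 1000, 10, w=0.6)
assert abs(d - (0.6 * 0.3 + 0.4 * 0.4)) < 1e-12
```

Ranking each node's neighbors by this value (after filtering on capacity, time-window and riding-time feasibility) yields the sorted neighborhood lists the search operators consume.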
Automatic Text Summarization Based on Comprehensive Characteristics of Sentence
CHENG Yuan, Wushouer SILAMU and Maimaitiyiming HASIMUA
Computer Science. 2015, 42 (4): 226-229.  doi:10.11896/j.issn.1002-137X.2015.04.046
Abstract PDF(336KB) ( 856 )   
To extract abstracts with little redundant information and wide coverage that reflect the main idea of a text, this paper proposed a comprehensive text summarization method. The method takes into consideration word frequency, the title, the position of the sentence in the text, cue phrases, sentence similarity and other textual features, constructs a comprehensive feature weighting function, trains on the corpus with a mathematical regression model, removes redundant information, and then produces the abstract. The experiments show that the method is effective and feasible and produces extracts of superior quality.
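The comprehensive feature weighting function amounts to a linear combination of per-sentence features; the feature names and weights below are illustrative placeholders, not the paper's regression-trained coefficients.

```python
def sentence_score(sent, weights):
    """Linear combination of per-sentence features (dict of feature -> value)."""
    return sum(weights[f] * sent.get(f, 0.0) for f in weights)

# Hypothetical feature weights: term frequency, title overlap, position, cues.
weights = {"tf": 0.4, "title_overlap": 0.3, "position": 0.2, "cue": 0.1}
sents = [
    {"tf": 0.9, "title_overlap": 0.8, "position": 1.0, "cue": 1.0},  # lead sentence
    {"tf": 0.2, "title_overlap": 0.0, "position": 0.1, "cue": 0.0},
]
ranked = sorted(range(len(sents)), key=lambda i: -sentence_score(sents[i], weights))
assert ranked[0] == 0   # the feature-rich lead sentence ranks first
```

The extract is then the top-ranked sentences, with near-duplicate sentences dropped to remove redundancy.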
Hybrid Tabu Search Algorithm for Solving Incident Vehicle Routing Problem
CAI Yan-guang, TANG Ya-lian and ZHU Jun
Computer Science. 2015, 42 (4): 230-234.  doi:10.11896/j.issn.1002-137X.2015.04.047
Abstract PDF(485KB) ( 521 )   
Considering that vehicles have departure time limits and that road conditions affect transportation cost, we established a mathematical model of the incident vehicle routing problem (IVRP) with constraints including clients' soft time windows, depots' time windows, heterogeneous vehicles and road conditions. Making full use of the advantages of the genetic algorithm (GA) and tabu search (TS), a hybrid tabu search algorithm (HTSA) was constructed. The search space is enlarged by constructing multiple initial solutions, and two kinds of tabu lists, a local one and a global one, were designed, which not only speed up the optimization but also remove the reliance on the initial solution. Adjusting the tabu list length adaptively avoids premature convergence, extracting core routes aids late-stage optimization, and the relocate operator reduces the number of routes. Experiments show that the IVRP model is superior to the general VRP and can save cost greatly. Moreover, HTSA is better than GA and TS in convergence speed and solution quality, and comparison of the standard deviations of total cost, total distance and convergence time shows that HTSA is also more stable.
Dependency Conflict in Schema Matching
DU Xiao-kun, LI Yan-hong and TU Tao
Computer Science. 2015, 42 (4): 235-239.  doi:10.11896/j.issn.1002-137X.2015.04.048
Abstract PDF(461KB) ( 413 )   
Through analyzing the drawbacks of existing matching methods, a new method was proposed for selecting matching relations between elements based on dependency conflict. First, the candidate matches of the elements in the target schema are selected; then the conflict value of each overall matching solution is calculated; finally, the solution with the minimum conflict value is selected as the result. Experimental results show that with this strategy the precision of the matching result is increased and the time for optimizing the mapping data is reduced.
Parameter Optimization and Application of SVM Based on Chaos Gravitational Search Algorithm
GONG An, LV Qian, HU Chang-jun, KANG Zhong-jian and LI Hua-yu
Computer Science. 2015, 42 (4): 240-243.  doi:10.11896/j.issn.1002-137X.2015.04.049
Abstract PDF(273KB) ( 412 )   
Chaotic series and the crossover operator of the genetic algorithm were introduced into the gravitational search algorithm to overcome its weak local optimization ability. The improved algorithm was used to optimize the SVM parameters, and simulation experiments show that the resulting SVM model has higher accuracy. Finally the model was applied to the condition monitoring of the primary air fan in a power plant, and the experimental results indicate that the model is valid.
Dynamic Pruning Algorithm for Pattern Matching with Wildcards and Length Constraints
WANG Hai-ping, DAI Wei and GUO Dan
Computer Science. 2015, 42 (4): 244-248.  doi:10.11896/j.issn.1002-137X.2015.04.050
Abstract PDF(384KB) ( 526 )   
Recently, with the development of bioinformatics, network security and information retrieval, wildcards are playing an increasingly important role in pattern matching problems. Among these, the challenging problem of pattern matching with wildcards and length constraints (PMWL) further extends the flexibility of pattern matching, bringing convenience to the user at the expense of higher computational complexity. How to obtain a better matching solution in polynomial time has therefore been well studied. Based on previous work, we presented a heuristic algorithm for PMWL, named Pawn. After transforming the PMWL problem into a path search problem, a dynamic strategy with a pruning procedure is adopted in an alternating iterative manner. Experiments demonstrate that, compared with the state-of-the-art algorithm, Pawn improves the number of matching occurrences in most of 196 artificial patterns with certain characteristics as well as on DNA sequences.
Notes on Multi-ary α-Resolution Principle Based on Lattice-valued Logical System LP(X)
LIU Yi, XU Yang and JIA Hai-rui
Computer Science. 2015, 42 (4): 249-252.  doi:10.11896/j.issn.1002-137X.2015.04.051
Abstract PDF(665KB) ( 463 )   
The basic theory of the multi-ary α-resolution principle based on the lattice-valued propositional logical system LP(X), with truth values in a lattice implication algebra, was further investigated, and the basic principle that the number of generalized literals taking part in multi-ary α-resolution varies with the resolution deduction in LP(X) was given. The validity of the multi-ary α-resolution principle in LP(X) was analyzed, which lays the theoretical foundation for building the multi-ary α-semantic resolution method and constructing the multi-ary α-semantic resolution algorithm in LP(X).
Parameters Learning of Bayesian Networks for Multistate System with Small Sample
XIAO Meng and ZHANG You-peng
Computer Science. 2015, 42 (4): 253-257.  doi:10.11896/j.issn.1002-137X.2015.04.052
Abstract PDF(407KB) ( 831 )   
To learn the parameters of the conditional probability distributions of nodes with multiple parents, a method was proposed that applies to multistate nodes in inaccurate models under the condition of insufficient sample information. Using the assumption of independence of causal interaction, the conditional probability distribution is decomposed so that the size of the conditional probability table is linear in the number of parent nodes and their states. Using the Leaky Noisy-MAX model, the influence on parameter learning of factors not included in the multistate system model can be quantified, and the model parameters extracted from a small sample can populate the conditional probability tables. The results show that the method improves the efficiency and precision of parameter learning.
Logistics Frequent Path Sequence Mining Algorithm Based on Topological Information
YANG Jun-yao, MENG Zu-qiang and JIANG Liang
Computer Science. 2015, 42 (4): 258-262.  doi:10.11896/j.issn.1002-137X.2015.04.053
Abstract PDF(443KB) ( 519 )   
To extract frequent paths from massive logistics data, and according to the features of logistics networks and logistics itself, this paper provided a logistics data model and a frequent path sequence mining algorithm, PMWTI (Path Mining With Topology Information), which takes the topological information of logistics networks into consideration. In PMWTI, a cost-tolerable-degree pruning method is designed for the deep pruning of candidate path sequences. It discards candidate path sequences that survive Apriori pruning but cannot be frequent path sequences, downscaling the candidate set so that the algorithm scans the dataset less. Experimental results show that, compared with the same algorithm without this pruning method, PMWTI has better mining efficiency.
Weight Vector Based Multi-scale Clustering Algorithm
SU Dong-hai, ZHAO Shu-liang, LIU Meng-meng, SU Jia-geng and LI Yan
Computer Science. 2015, 42 (4): 263-267.  doi:10.11896/j.issn.1002-137X.2015.04.054
Abstract PDF(416KB) ( 397 )   
Multi-scale clustering plays an irreplaceable role in multi-scale decision making, yet almost all traditional multi-scale clustering algorithms share the same costly problem: a clustering algorithm must be run on every scale a user is interested in. To overcome this, the paper gave a method for vectorizing the multi-scale characteristics of data and, borrowing the scale conversion mechanism of earth science, proposed a new multi-scale knowledge conversion mechanism named WVB-MSCA (Weight Vector Based Multi-Scale Clustering Algorithm). WVB-MSCA applies a clustering algorithm only on a pre-selected basic scale and converts the knowledge gained at the basic scale to the other scales users are interested in. Experimental results show that WVB-MSCA is feasible and effective.
Vehicle Logo Recognition Based on Deep Learning
PENG Bo and ZANG Di
Computer Science. 2015, 42 (4): 268-273.  doi:10.11896/j.issn.1002-137X.2015.04.055
Identifying vehicles that have caused traffic accidents is an important issue in intelligent transportation systems (ITS).However,when license plates are missing or stained,it is difficult to locate such vehicles.Logos,as key features of vehicles,can also help to identify vehicles of interest.This paper proposed a method to detect and recognize vehicle logos based on deep learning.In contrast to traditional approaches that extract features manually,this method uses original images as inputs and learns features automatically.Experimental results demonstrate that the proposed method achieves high accuracy,and produces stable,accurate results even under illumination changes and noise contamination.
Supervised Band Selection Based on Class Separability for Hyperspectral Image
XU Ming-ming, ZHANG Liang-pei, DU Bo and ZHANG Le-fei
Computer Science. 2015, 42 (4): 274-275.  doi:10.11896/j.issn.1002-137X.2015.04.056
Hyperspectral remote sensing data contain abundant spectral information and are widely used,but the redundant spectral information can limit classification accuracy and increase computational complexity.To improve interpretation efficiency,dimension reduction of hyperspectral images is necessary,and it is one of the highlights of hyperspectral image processing.This paper presented a hyperspectral image band selection method based on endmember separability (EBSS).The method maximizes class separability,measured by the Mahalanobis distance,to determine the optimal band combination.Compared with other supervised band selection algorithms,the proposed method needs no training samples and performs no classification during band selection.Experimental results show that the proposed method is effective and achieves better classification accuracy.
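The selection criterion can be illustrated with a small sketch: score every candidate band subset by the Mahalanobis distance between class means under a pooled covariance, and keep the most separable subset. This is a simplified two-class version under assumed details; the paper's exact separability measure and search strategy may differ.

```python
# Illustrative band selection by maximizing Mahalanobis class separability.
import itertools
import numpy as np

def mahalanobis_separability(x1, x2):
    """Mahalanobis distance between two class means under a pooled covariance."""
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    pooled = np.atleast_2d(0.5 * (np.cov(x1, rowvar=False) + np.cov(x2, rowvar=False)))
    diff = mu1 - mu2
    return float(diff @ np.linalg.pinv(pooled) @ diff)

def select_bands(class1, class2, n_bands):
    """Exhaustively score all band combinations and keep the most separable one."""
    total = class1.shape[1]
    return max(itertools.combinations(range(total), n_bands),
               key=lambda b: mahalanobis_separability(class1[:, b], class2[:, b]))

rng = np.random.default_rng(0)
# Synthetic data: 4 bands; the two classes differ only in band 2.
c1 = rng.normal(0.0, 1.0, size=(50, 4))
c2 = rng.normal(0.0, 1.0, size=(50, 4))
c2[:, 2] += 5.0
print(select_bands(c1, c2, 1))
```

The exhaustive search is exponential in the number of bands, so a practical implementation would replace it with a greedy or sequential search over band subsets.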
Rendering Method for 3D Military Symbols Based on Triangle Recursive Segmentation
LUO Li-ji, LONG Tao, LI Jun-lin and ZHAO Heng
Computer Science. 2015, 42 (4): 276-280.  doi:10.11896/j.issn.1002-137X.2015.04.057
To solve the problem of real-time rendering of 3D military symbols in battlefield simulation environments,we proposed a new universal rendering method for 3D military symbols based on recursive triangle subdivision,exploiting the arbitrary divisibility of Bezier curves.Firstly,2D military symbols are generated by setting the control points.Secondly,the subdivided graphics are obtained by recursive triangle subdivision,with the recursion depth controlled dynamically.Lastly,each feature point is assigned an elevation value,so that the 3D military symbols are generated.Experimental results show that the algorithm maintains good rendering speed and visual effect when used on a variety of basic 3D military symbols under different terrain conditions.The method meets the needs of real-time military plotting systems while remaining highly general.
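The core recursion can be sketched in a few lines: each triangle splits into four at its edge midpoints, and a depth cap provides the dynamic control over recursion. This is a minimal 2D illustration; the paper's Bezier evaluation and elevation assignment are not shown.

```python
# Minimal sketch of recursive triangle subdivision with a depth cap.

def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def subdivide(tri, depth, out):
    """Split a triangle into four at its edge midpoints until depth reaches 0."""
    if depth == 0:
        out.append(tri)
        return
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    for child in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        subdivide(child, depth - 1, out)

tris = []
subdivide(((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)), 2, tris)
print(len(tris))  # 4 children per level: 4**2 = 16 triangles
```

Each extra level quadruples the triangle count, which is why the recursion depth must be tuned against the real-time budget.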
Infrared Image De-noising Method Based on Genetic Wavelet of TLS
WU Ying-chang, LUO Dian-sheng and HE Hong-ying
Computer Science. 2015, 42 (4): 281-284.  doi:10.11896/j.issn.1002-137X.2015.04.058
To remove noise from infrared images more effectively,an infrared image de-noising method based on a genetic wavelet with TLS (Total Least Squares) was presented.The method takes the image de-noised by TLS as the male parent and the image de-noised by a Wiener filter as the female parent,and applies selection,crossover and mutation.The dominant genes extracted from TLS wavelet de-noising and Wiener filtering form the optimal offspring,which is decoded into an image.Experimental results show that,compared with conventional methods,the method removes noise effectively,achieving a higher SNR (signal-to-noise ratio) and a smaller MSE (mean squared error).
Localization of Causing-traffic-trouble Vehicle with Multi-level Cascaded Visual Attention Model
CHAI Zhen-liang and ZANG Di
Computer Science. 2015, 42 (4): 285-291.  doi:10.11896/j.issn.1002-137X.2015.04.059
The localization of the causing-traffic-trouble vehicle is one of the key problems in intelligent transportation systems (ITS).This paper proposed a multi-level cascaded visual attention model to localize such vehicles.At each level of the proposed model,one significant feature of the vehicle,such as its color or logo,is extracted and compared with that of the vehicle which caused the accident;vehicles without similar features are then filtered out.By performing feature extraction and comparison several times,only the causing-traffic-trouble vehicle is left.Experimental results demonstrate that the proposed approach locates the causing-traffic-trouble vehicle accurately and is robust to luminance changes and noise.
Background Self-learning Framework for Bio Information Extraction from Hyperspectral Images
ZHANG Yu-xiang, GAO Xu-yang, WANG Ting, ZHANG Le-fei and DU Bo
Computer Science. 2015, 42 (4): 292-296.  doi:10.11896/j.issn.1002-137X.2015.04.060
Physical or chemical methods are commonly used to extract certain bio information,such as fingerprint extraction and tumor region detection.These methods are not only time-consuming but may also damage the entire bio information carrier;moreover,the process cannot be repeated and may not reach satisfactory accuracy.A newer technique,hyperspectral imaging,can be adopted for such information extraction:the original information is not contaminated and can be acquired from the image repeatedly.We proposed an information extraction method for hyperspectral images based on a background self-learning framework.In conventional unstructured background models,it may be difficult to accurately estimate the background statistics,either globally or locally;the proposed method avoids this problem,and its performance can be further improved by considering spatial-spectral information.It is designed to extract fingerprints and tumor regions from hyperspectral bio images.Experimental results show the validity and superiority of our method for bio information extraction from hyperspectral images.
Image Retrieval Method Research Based on BoC-BoF Feature
FENG Jin-li and YANG Hong-ju
Computer Science. 2015, 42 (4): 297-301.  doi:10.11896/j.issn.1002-137X.2015.04.061
A fusion feature for representing image content was investigated in order to optimize content-based image retrieval.Firstly,RootSIFT-based Bag-of-Features (BoF) vectors,which capture shape and edge information,are extracted.Then,Bag-of-Colors (BoC) vectors based on the HSV color space,which capture color information,are adopted in place of the traditional color histogram quantization method.Lastly,a BoC-BoF algorithm integrating the BoC and BoF vectors was proposed.The BoC-BoF algorithm effectively integrates global and local features.The results obtained show that this algorithm is more effective than other methods on the two datasets used in this paper.
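The fusion step can be sketched as normalizing each histogram, weighting, and concatenating into one descriptor. The equal weighting `alpha` and the toy vocabularies are assumptions for illustration, not the paper's settings.

```python
# Sketch of fusing a color histogram (BoC-like) with a local-feature
# histogram (BoF-like) into a single retrieval descriptor.
import numpy as np

def l2_normalize(v):
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def fuse(boc, bof, alpha=0.5):
    """Normalize each histogram, weight, and concatenate into one vector."""
    return np.concatenate([alpha * l2_normalize(boc),
                           (1.0 - alpha) * l2_normalize(bof)])

boc = np.array([3.0, 1.0, 0.0])        # toy bag-of-colors counts
bof = np.array([0.0, 2.0, 2.0, 1.0])   # toy bag-of-features counts
desc = fuse(boc, bof)
print(desc.shape)  # (7,): the two vocabularies are simply concatenated
```

Normalizing before concatenation keeps either modality from dominating the distance computation, while `alpha` tunes the color-versus-shape balance.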
Improved SPIHT Image Coding Algorithm Based on Hybrid Domain
WANG Xue-chun, LIU Shen-xiao and CHANG Chao-wen
Computer Science. 2015, 42 (4): 302-305.  doi:10.11896/j.issn.1002-137X.2015.04.062
To give image compression algorithms a higher compression ratio and better visual effect,this paper proposed a hybrid-domain image coding scheme based on the wavelet and Contourlet transforms.Building on an analysis of the SPIHT algorithm,we removed the classification of LIS tables in SPIHT and encoded the image uniformly,following the wavelet spatial tree coding order from the offspring to the grandchild generation.Comparative results show that the new image compression scheme is superior to SPIHT in recovered image quality and coding time.
Variational CV Model Based on Total Bregman Divergence and its Segmentation Algorithm
WANG Ji-ce and WU Cheng-mao
Computer Science. 2015, 42 (4): 306-310.  doi:10.11896/j.issn.1002-137X.2015.04.063
The classical CV model is not well suited to segmenting gray images with intensity inhomogeneity,or images disturbed by Gaussian noise of some variance.A variational CV model based on the total Bregman divergence was therefore proposed,together with an iterative segmentation algorithm.Firstly,the problems and disadvantages of the variational CV segmentation model built on the Euclidean metric are analyzed.Secondly,a figure illustrates the advantage of the total Bregman divergence over the Euclidean metric:the distance computation is independent of the coordinate system.Then,to reduce noise sensitivity and enhance the robustness of image segmentation,the data fidelity term of the CV model is built with the total Bregman divergence.Finally,the Euler-Lagrange equation of the proposed variational model is derived by the variational method,and the segmentation algorithm is obtained by numerical computation.In addition,to accelerate convergence,the weighting parameters of the fitting terms should be chosen appropriately large,which increases the importance of the fitting terms in the variational model.Experimental results show that the proposed method has low sensitivity to the initial contour curve,as well as good anti-noise performance and robustness.
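For reference, the total Bregman divergence in its general form from the literature can be written as below; the paper's exact instantiation of $f$ and of the modified CV energy may differ, so this is a sketch of the construction rather than the authors' formula.

```latex
% Bregman divergence for a differentiable convex f, and its "total" variant,
% which normalizes by the gradient magnitude and is thus rotation-invariant:
d_f(x,y) = f(x) - f(y) - \langle x - y,\ \nabla f(y)\rangle, \qquad
\delta_f(x,y) = \frac{d_f(x,y)}{\sqrt{1 + \lVert \nabla f(y)\rVert^2}}.
% Replacing the Euclidean fitting terms of the CV energy with \delta_f gives
E(c_1,c_2,C) = \mu\,\mathrm{Len}(C)
 + \lambda_1 \int_{\mathrm{in}(C)} \delta_f\bigl(I(x),c_1\bigr)\,dx
 + \lambda_2 \int_{\mathrm{out}(C)} \delta_f\bigl(I(x),c_2\bigr)\,dx.
```

With $f(x)=x^2$ the plain Bregman divergence reduces to the squared Euclidean distance of the classical CV model, which makes clear what the total variant changes: the denominator damps the penalty where the fitted value $y$ is large, reducing sensitivity to outliers and noise.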
SIFT Algorithm Based on Block Matching
ZOU Cheng-ming, XU Ze-qian and XUE Dong
Computer Science. 2015, 42 (4): 311-315.  doi:10.11896/j.issn.1002-137X.2015.04.064
The SIFT algorithm has distinctive advantages in the field of image processing.However,it still has some disadvantages,such as the large amount of data processing and slow computing speed.To address these issues,a SIFT algorithm based on block matching was proposed.It reduces the time of feature extraction and matching by extracting only the overlapping areas.For the rigid transformation of an image,the core of the algorithm is to compute the image block segmentation and the overlapping areas.In the first step,a small number of seed points are selected to estimate the transformation matrix relating the two images.The original image is then cut into blocks,and the corresponding block is found via the transformation matrix.In the second step,all matching feature points are detected within each block.Finally,the RANSAC algorithm is used to remove mismatched points and improve matching accuracy.Experimental results show that the improved block-matching SIFT algorithm offers better real-time performance and robustness than the standard SIFT algorithm,and has application value in practical image matching.
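The first stage, estimating a global transform from a few seeds and then mapping each block to its counterpart, can be sketched as follows. For brevity a pure translation model replaces the full rigid transform, and the seed correspondences are given directly rather than detected by SIFT; both are illustrative assumptions.

```python
# Sketch of the seed-then-map stage of block matching: estimate a global
# transform from a few seed correspondences, then map each block of the
# first image onto the second before dense per-block matching.
import numpy as np

def estimate_translation(src, dst):
    """Least-squares translation from seed point correspondences."""
    return (dst - src).mean(axis=0)

def map_block_origin(origin, t):
    """Corresponding block origin in the second image under translation t."""
    return origin + t

seeds_a = np.array([[10.0, 12.0], [40.0, 55.0], [80.0, 20.0]])
seeds_b = seeds_a + np.array([5.0, -3.0])   # second image shifted by (5, -3)
t = estimate_translation(seeds_a, seeds_b)
print(map_block_origin(np.array([0.0, 0.0]), t))  # where block (0,0) lands in image B
```

In the full method this mapping restricts dense SIFT matching to corresponding block pairs, and a final RANSAC pass over the per-block matches removes the remaining outliers.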
Multi-label Learning for Improved RBF Neural Networks
LI Shu-ling, LIU Rong and LIU Hong
Computer Science. 2015, 42 (4): 316-320.  doi:10.11896/j.issn.1002-137X.2015.04.065
A modified multi-label radial basis function (RBF) neural network algorithm that fully considers the correlations among sample labels was presented,motivated by the fact that ignoring label relevance may cause a loss of performance.The modified algorithm first optimizes the computation of the RBF basis function centers in the hidden layer,i.e. k-means clustering:AP (affinity propagation) clustering is used to automatically determine k,which gives the number of hidden-layer nodes,and a Huffman tree is constructed to select the initial cluster centers,preventing the k-means results from falling into local optima.Then a label counting vector C that reflects the correlations between labels is constructed and multiplied linearly with the cluster centers obtained through the optimized k-means clustering,refining the RBF basis function centers and establishing the RBF neural network.Experiments on public multi-label emotion datasets demonstrate the effectiveness of the proposed algorithm.
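The label counting vector can be illustrated with a toy construction: count how often each label occurs in the binary label matrix and normalize. How exactly the paper combines C with the cluster centers is not specified in the abstract, so the elementwise reweighting below is only one plausible reading.

```python
# Toy sketch of a label-counting vector for a multi-label training set.
import numpy as np

def label_count_vector(Y):
    """Column sums of the binary label matrix, normalized to sum to 1."""
    counts = Y.sum(axis=0).astype(float)
    return counts / counts.sum()

# Binary label matrix: 4 samples x 3 labels.
Y = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])
C = label_count_vector(Y)
print(C)  # frequent labels receive larger weights
```

Intuitively, weighting by C lets frequently co-occurring labels exert more influence on the basis function centers than rare ones, which is how label correlation enters the network construction.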