Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 43 Issue 4, 2016
Survey on Indoor Localization
XI Rui, LI Yu-jun and HOU Meng-shu
Computer Science. 2016, 43 (4): 1-6.  doi:10.11896/j.issn.1002-137X.2016.04.001
In recent years, with the increasing demand for context-aware and ambient intelligence applications, the accuracy requirements on user location information have been rising. Owing to complex indoor environments and the lack of a unified infrastructure, indoor positioning technology has gradually become a focus of user localization research. Taking into account infrastructure, location accuracy, generality and other factors, we divided the existing methods into three categories: specific-device-based localization, Wi-Fi-based localization and mobile-sensor-based localization. We discussed and analyzed the specific localization technologies and the different methods in terms of performance, features, advantages and disadvantages. Finally, we summarized the current state of the indoor localization field and briefly described future research directions and development trends.
Research Status and Future Trends of Recommender Systems for Implicit Feedback
LU Yi and CAO Jian
Computer Science. 2016, 43 (4): 7-15.  doi:10.11896/j.issn.1002-137X.2016.04.002
As an effective approach to the information overload problem, recommender systems have become a hotspot in both industry and academia: they infer users' potential requirements and interests from their explicit or implicit feedback and then recommend preferable information or products. Methods based on explicit feedback are the mainstream in this area; however, because implicit feedback is far more prevalent, methods designed for implicit feedback can be applied more widely. Since implicit feedback does not reflect users' preferences directly, recommending products from it is a more challenging task. The characteristics of implicit feedback, and the necessity of and problems in recommendation based on it, were illustrated first. Then a systematic taxonomy of recommendation methods for implicit feedback was proposed. On this basis, the strengths and weaknesses of these approaches and the evaluation metrics for implicit-feedback-oriented recommendation were analyzed. Lastly, possible future directions of implicit-feedback-oriented recommendation were discussed.
Survey of Trajectory Privacy Preserving Techniques
HU Zhao-wei and YANG Jing
Computer Science. 2016, 43 (4): 16-23.  doi:10.11896/j.issn.1002-137X.2016.04.003
With the development of mobile social networks and information technology, location-based service applications are ever more widely used. Trajectory privacy is a special kind of location privacy that is receiving more and more attention. Collected trajectory data contain extensive information about a mobile user along the dimensions of time and space. Attackers may extract much private information by mining and analyzing background information on a mobile user, which could even threaten personal safety. Therefore, how to ensure that users receive high-quality services while their trajectory privacy is protected is a central concern of trajectory privacy preserving technology. Firstly, this paper discussed the concepts, types, classification, metrics and architecture of trajectory privacy preservation. Secondly, it described recent major research on trajectory privacy preserving techniques and methods. Finally, it summarized the current research focus in the field and outlined future research.
New Quantum Algorithm for Breaking RSA
WANG Ya-hui and YAN Song-yuan
Computer Science. 2016, 43 (4): 24-27.  doi:10.11896/j.issn.1002-137X.2016.04.004
It is well known that the security of RSA, the most famous and widely used public-key cryptosystem, relies on the computational intractability of the integer factorization problem. In this paper, by a combined use of the fixed-point property of RSA, multiple Fourier transforms and variable substitution, a new quantum algorithm for directly recovering the RSA plaintext M from the ciphertext C was presented, without explicitly factoring the modulus n. Compared with Shor's quantum algorithm, the new algorithm requires fewer quantum bits, and its probability of success is greater than 1/2. Moreover, a comparison of the resources required by our algorithm and by Shor's was presented.
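The fixed-point property the paper builds on can be illustrated classically: because the public exponent e has a finite multiplicative order r modulo λ(n), re-encrypting a ciphertext repeatedly cycles back to the plaintext after r steps. A minimal sketch with toy parameters (this only demonstrates the fixed-point property, not the paper's quantum algorithm):

```python
# Classical illustration of the RSA fixed-point property: repeated
# encryption of a ciphertext eventually returns the plaintext, because
# e has finite multiplicative order modulo lambda(n).
# Toy parameters only; real moduli make this cycle astronomically long.

p, q = 61, 53
n = p * q              # modulus
e = 17                 # public exponent
M = 1234               # plaintext

C = pow(M, e, n)       # ciphertext
x, steps = C, 0
while True:
    steps += 1
    x = pow(x, e, n)   # re-encrypt
    if x == M:         # cycled back to the plaintext
        break
print(f"recovered M = {x} after {steps} re-encryptions")
```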
Spectrum Allocation Algorithm Based on Hybrid Multigroup Evolution and Particle Swarm Optimization
WANG Jun-ming, LIU Jia-qi, CHEN Zhi-gang and GUO Lin
Computer Science. 2016, 43 (4): 28-32.  doi:10.11896/j.issn.1002-137X.2016.04.005
To solve the spectrum allocation problem in cognitive wireless networks, a spectrum allocation algorithm based on a hybrid of multi-population evolution and particle swarm optimization was proposed in this paper. It uses the graph coloring model, and first makes multiple populations evolve independently by a genetic algorithm in order to improve global search capability. Then it selects the optimal individual of each population as a particle for particle swarm optimization, and controls the direction of each particle's initial velocity to speed up convergence. Finally, the maximum benefit and the fairness among users were taken as the optimization goals to compare against the genetic algorithm and particle swarm optimization in simulation experiments. The experimental results show that the algorithm is better than the genetic algorithm and particle swarm optimization in convergence speed, cognitive-user access fairness and total system benefit.
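For reference, the standard PSO update into which such a hybrid feeds the per-population best individuals looks as follows; this is a generic sketch, not the paper's exact operator or parameter settings:

```python
import random

def pso_step(positions, velocities, pbest, gbest,
             w=0.7, c1=1.5, c2=1.5):
    """One generic PSO iteration: velocity and position update."""
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
```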
Phase Transition Phenomenon of Regular 3-SAT Problem
ZHANG Ming-ming and XU Dao-yun
Computer Science. 2016, 43 (4): 33-36.  doi:10.11896/j.issn.1002-137X.2016.04.006
We considered a restriction of 3-CNF formulas over n Boolean variables {x1, x2, …, xn} in which each of the 2n literals {x1, ¬x1, …, xn, ¬xn} occurs precisely s times. We call such formulas regular (3,s)-CNF formulas. Through two kinds of clause-generating mechanisms for (3,s)-CNF formulas, we observed that regular (3,s)-CNF formulas are easier to satisfy than non-regular 3-CNF formulas when the input size is small. Thus we inferred that, compared with non-regular 3-SAT, the transition point of the regular 3-SAT problem is shifted. Finally we explained this phenomenon from the perspective of the number of degrees of freedom of the variables.
Study on Non Proliferation Stage Transfer Algorithm Based on Opportunistic Network
GUAN Pei-yuan, CHEN Zhi-gang, WU Jia and GUO Lin
Computer Science. 2016, 43 (4): 37-40.  doi:10.11896/j.issn.1002-137X.2016.04.007
At present, the vast majority of algorithms in opportunistic networks adopt a "packet proliferation" strategy, i.e., they use packet replication to improve the success rate of packet transmission in the network; proliferation is the core idea of the improved algorithms based on the Epidemic and Spray-and-Wait algorithms. In this paper, the transmission process was divided into periods of time, and a non-proliferation stage transfer algorithm, NPST (Non Proliferation Stage Transfer Algorithm), was proposed. Its core idea is that in the early stage of system operation the system runs in accordance with other classical algorithms, and once the data packets in node caches meet a certain condition, the system switches to a non-proliferative strategy: the transmission of packets between nodes no longer generates new copies; instead, packets are transmitted by "exchange". In the middle and later stages of system operation, the proposed algorithm can effectively reduce the routing overhead of the whole system and improve network performance.
Convergence Analysis of Water Wave Optimization Algorithm
ZHANG Bei and ZHENG Yu-jun
Computer Science. 2016, 43 (4): 41-44.  doi:10.11896/j.issn.1002-137X.2016.04.008
Taking inspiration from water wave phenomena for global optimization, water wave optimization (WWO) is a novel evolutionary algorithm that mimics wave motions, including propagation, refraction and breaking, to search a high-dimensional solution space effectively; it has shown promising performance advantages over a variety of state-of-the-art metaheuristics on well-known benchmark problems and real-world engineering problems. The paper theoretically analyzed the convergence conditions of the WWO algorithm. By simplifying the target problem and the algorithm's parameter settings, we demonstrated that any individual in WWO is guaranteed to converge in two special cases: 1) when only the propagation operation is performed, and 2) when only the refraction operation is performed, which occur under two special states of fitness change. The paper also conducted numerical simulations for the two special cases to validate these convergence conditions.
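For context, the propagation operator whose convergence is analyzed is usually given in the WWO literature in roughly the following form (notation follows the original WWO proposal and may differ in detail from this paper):

\[ x'(d) = x(d) + \mathrm{rand}(-1,1)\cdot \lambda \cdot L(d), \qquad \lambda \leftarrow \lambda \cdot \alpha^{-\left(f(x)-f_{\min}+\varepsilon\right)/\left(f_{\max}-f_{\min}+\varepsilon\right)}, \]

where L(d) is the length of the d-th dimension of the search space, λ the wavelength, α the wavelength reduction coefficient, and ε a small constant avoiding division by zero.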
Semantic Similarity Computing Based on Community Mining of Wikipedia
PENG Li-zhen and WU Yang-yang
Computer Science. 2016, 43 (4): 45-49.  doi:10.11896/j.issn.1002-137X.2016.04.009
Word semantic similarity computation is widely used in natural language processing tasks such as word sense disambiguation, information retrieval and automatic text categorization. Unlike traditional methods, we presented an algorithm that computes word semantic similarity based on community mining of Wikipedia. Our method makes use of the huge Wikipedia page network with category labels rather than its textual content. To obtain the community of a word's page, we applied HITS, a topic-based community discovery algorithm, to the page network. Based on the resulting communities, we measured the semantic similarity between two words from three aspects: (1) semantic relations between the two word pages; (2) semantic relations between the two communities of the word pages; (3) semantic relations between the categories to which the two communities belong. Finally, tests on the standard data set WordSimilarity-353 show that the proposed method is feasible and slightly better than some classic algorithms. In the best case, the Spearman correlation coefficient reaches 0.58.
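The HITS procedure used for community discovery here is the standard hub/authority mutual-reinforcement iteration; a minimal sketch on an adjacency-list page graph (illustrative, not the paper's exact weighting):

```python
def hits(out_links, n_iter=50):
    """Standard HITS: out_links maps page -> list of pages it links to."""
    pages = set(out_links) | {q for ps in out_links.values() for q in ps}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(n_iter):
        # authority score: sum of hub scores of pages linking in
        auth = {p: 0.0 for p in pages}
        for p, targets in out_links.items():
            for q in targets:
                auth[q] += hub[p]
        # hub score: sum of authority scores of pages linked to
        for p in pages:
            hub[p] = sum(auth[q] for q in out_links.get(p, []))
        # normalize to keep values bounded
        for vec in (hub, auth):
            norm = sum(v * v for v in vec.values()) ** 0.5 or 1.0
            for k in vec:
                vec[k] /= norm
    return hub, auth
```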
Secure Identity-based Strong Designated Verifier Signature Scheme
XU Dan-hui and KANG Bao-yuan
Computer Science. 2016, 43 (4): 50-52.  doi:10.11896/j.issn.1002-137X.2016.04.010
Compared with an ordinary digital signature, where anyone who owns the signer's public key can verify the validity of the signature, a designated verifier signature scheme makes it possible for a signer to convince a designated verifier that he has signed a message, in such a way that the designated verifier cannot transfer the signature to a third party. In a strong designated verifier signature scheme, no third party can even verify the validity of a signature, since the designated verifier's private key is required in the verifying phase; hence strong designated verifier signatures are widely used in electronic commerce, online bidding and electronic elections. Based on bilinear pairings, we proposed a novel identity-based strong designated verifier signature scheme. Under the GBDH assumption, we proved that our scheme is existentially unforgeable against adaptive chosen-message attacks. Analysis of the computational cost shows that the scheme is highly efficient.
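The bilinearity that such pairing-based schemes rely on is the standard property of a map e: G1 x G1 -> G2 on groups of prime order q,

\[ e(aP, bQ) = e(P, Q)^{ab} \quad \text{for all } P, Q \in G_1,\; a, b \in \mathbb{Z}_q^{*}, \]

together with non-degeneracy and efficient computability. The GBDH (Gap Bilinear Diffie-Hellman) assumption states that computing e(P,P)^{abc} from (P, aP, bP, cP) remains hard even given access to a decision-BDH oracle.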
Effect of Channel Noise on Quantum Information Splitting
BAI Chen-ming and LI Yong-ming
Computer Science. 2016, 43 (4): 53-57.  doi:10.11896/j.issn.1002-137X.2016.04.011
In Ref. [7], Muralidharan et al. proposed quantum information splitting of a single-qubit state using a four-qubit cluster state. Building on this scheme, we analyzed the profound influence of classical channel noise on quantum information splitting. For the binary symmetric channel and the binary erasure channel, we obtained the relationship between the success probability of information splitting and the classical channel noise coefficient. In addition, we investigated the effect of quantum noise channels on the scheme. When quantum information splitting is carried out over an amplitude damping or depolarizing channel, decoherence of the entangled quantum channel occurs, degrading the quality of the information splitting. In this paper, we characterized the relationship among the fidelity, the noise coefficient and the original quantum state.
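For reference, the single-qubit amplitude damping channel with damping coefficient γ discussed here is standardly described by the Kraus operators

\[ E_0 = \begin{pmatrix} 1 & 0 \\ 0 & \sqrt{1-\gamma} \end{pmatrix}, \qquad E_1 = \begin{pmatrix} 0 & \sqrt{\gamma} \\ 0 & 0 \end{pmatrix}, \]

acting as \( \rho \mapsto E_0 \rho E_0^{\dagger} + E_1 \rho E_1^{\dagger} \); the depolarizing channel likewise replaces the state by the maximally mixed state with some probability p.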
Cloudlet-based Mobile Video Prefetching System
JI Chuan-wei, BAI Guang-wei, SHEN Hang and ZOU Lu-ning
Computer Science. 2016, 43 (4): 58-63.  doi:10.11896/j.issn.1002-137X.2016.04.012
Most research on mobile video communication (MVC) focuses on audio-video synchronization, bit-rate adjustment and energy saving; however, handover between APs, which interrupts the video communication service, is usually not considered. We proposed a cloudlet-based mobile video prefetching system (CMVP) to address this challenge. CMVP utilizes cloudlets to prefetch video streams for mobile phones. We designed a greedy prefetching algorithm to reduce the number of service interruptions as far as possible. To balance the cloudlets' resource consumption, a low-probability K-to-1 Markov prefetching progress schedule algorithm (LPMPPS) and a first-AP selection algorithm were designed. Our simulation results demonstrate that CMVP achieves a hit rate of more than 90% and improves QoE remarkably.
Dynamic Threshold Window Algorithm for Virtual Machines Migration Decision
QU Xiao-ya and LIU Zhen
Computer Science. 2016, 43 (4): 64-69.  doi:10.11896/j.issn.1002-137X.2016.04.013
A cloud data center (DC) is the center of data operation, exchange and storage. Built on virtualization technology, virtual machine (VM) placement has become an important technique for power management and elastic resource provisioning. In the dynamic VM management stage, as resource utilization changes, the migration trigger mechanism determines when to migrate VMs from one host to another. Accurate judgment of the trigger time can effectively balance hot spots and switch off cold spots in the DC. However, current migration trigger mechanisms do not respond to changes in DC workloads, and static threshold settings easily cause frequent migration with unnecessary migration and transmission cost. To solve these problems, a dynamic threshold setting algorithm, iWnd, was proposed, which adjusts the size of the threshold window according to the workload of the whole DC. In addition, iWnd reduces the number of VMs that need to be migrated, avoiding unnecessary migration and transmission cost and saving power. We conducted experiments in a simulated cloud environment using the CloudSim toolkit. The results show that iWnd can effectively reduce the number of VM migrations and the migration failure rate without incurring additional power consumption.
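The iWnd algorithm itself is not specified in the abstract; as a purely hypothetical illustration of a workload-driven threshold window, one might widen the window when DC-wide utilization is volatile and narrow it when the load is stable (the function and parameter names below are invented for this sketch):

```python
def threshold_window(dc_utilizations, base_low=0.3, base_high=0.8, k=0.5):
    """Hypothetical sketch: widen the migration-trigger window when
    DC-wide utilization is volatile, narrow it when the load is stable."""
    mean = sum(dc_utilizations) / len(dc_utilizations)
    var = sum((u - mean) ** 2 for u in dc_utilizations) / len(dc_utilizations)
    spread = k * var ** 0.5           # half-width grows with volatility
    low = max(0.0, base_low - spread)
    high = min(1.0, base_high + spread)
    return low, high                  # migrate only if a host leaves [low, high]
```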
Dissemination of Negative Word-of-mouth in Online Community Based on Agent-based Simulation
CAI Shu-qin, WANG Wei, ZHOU Peng and CUI Xiao-lan
Computer Science. 2016, 43 (4): 70-75.  doi:10.11896/j.issn.1002-137X.2016.04.014
With the development of Web 2.0, online communities have become a convenient medium for the dissemination of negative word-of-mouth, which can directly or indirectly affect both the economic interests of an enterprise and customer loyalty. From the perspective of information dissemination characteristics, and combining multi-agent modeling with virus propagation models, this paper used the NetLogo platform to run simulation experiments on negative word-of-mouth dissemination in an online community. The simulation mainly examines how online community participation and the information value of negative word-of-mouth affect the dissemination process. The results suggest that higher community participation and higher information value both markedly promote the dissemination of negative word-of-mouth. We also provided countermeasures for enterprises, e.g., focusing on the role of the online community and on high-information-value negative word-of-mouth, which allows massive amounts of negative word-of-mouth to be handled in less time and at lower cost, thereby reducing economic losses and improving customer loyalty.
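Virus-propagation models of the kind combined with the agent simulation here typically follow SIR-style state updates; a minimal sketch on a contact network (beta, gamma and the state encoding are illustrative assumptions, not the paper's parameters):

```python
import random

def sir_step(neighbors, state, beta=0.1, gamma=0.05):
    """One SIR step: 'S'usceptible agents may be infected by 'I' neighbors;
    'I' agents recover to 'R' with probability gamma."""
    new_state = dict(state)
    for agent, st in state.items():
        if st == 'S':
            infected = sum(1 for nb in neighbors[agent] if state[nb] == 'I')
            if random.random() < 1 - (1 - beta) ** infected:
                new_state[agent] = 'I'
        elif st == 'I' and random.random() < gamma:
            new_state[agent] = 'R'
    return new_state
```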
Multi-domain Optimal Resource Management in Heterogeneous Wireless Networks
ZHANG Yuan-yuan and WANG Jian
Computer Science. 2016, 43 (4): 76-80.  doi:10.11896/j.issn.1002-137X.2016.04.015
Based on the cooperation of multiple networks, we studied multi-domain optimal resource management in heterogeneous wireless networks. We analyzed the state of optimal resource management in heterogeneous wireless networks and proposed an optimization model to allocate the resources of the whole network. The model improves both the overall reward of the system and the quality of service (QoS) by allocating heterogeneous wireless network resources across network domains. We proposed a new Semi-Markov Decision Process (SMDP) based model for optimal resource management, achieving optimal decision-making strategies across multiple network domains in a heterogeneous wireless network. Theoretical analysis and experimental results indicate that the optimal strategies not only increase the overall long-term reward of the system but also reduce the call blocking probability, thereby increasing the QoS of heterogeneous wireless networks; compared with the complete sharing algorithm, our method reduces the new-call blocking probability.
Collaborative Caching Algorithm Based on Node Similarity in Content Centric Networking
FANG Xin-wei, CHEN Shu-qiao, REN Ze-rong and JIANG Yi-ming
Computer Science. 2016, 43 (4): 81-85.  doi:10.11896/j.issn.1002-137X.2016.04.016
The default ALWAYS caching scheme in CCN (caching everything everywhere) results in low storage utilization, large content access latency and inefficient caching overall. This paper proposed a collaborative caching algorithm based on node similarity (CCANS). Interest packets are forwarded to the most similar node to increase the hit rate for correlated requests. By curbing duplicate copies among collaborating nodes, the algorithm reduces redundancy while increasing content diversity. Simulation results show that, compared with existing algorithms, the proposed algorithm decreases routing hops, reduces request latency and achieves a higher cache hit ratio.
Similarity Propagation Based Link Prediction in Bipartite Networks
YAO Fei-ya and CHEN Ling
Computer Science. 2016, 43 (4): 86-91.  doi:10.11896/j.issn.1002-137X.2016.04.017
Link prediction is an important issue in the study of complex network analysis. We presented a similarity-propagation-based link prediction algorithm for bipartite networks. The algorithm propagates and updates link similarity scores by a random walk on the network: each link is assigned a similarity based on its transmission probability, and the link similarity scores between nodes of the two parts are propagated via the links according to these transmission probabilities. Experimental results on real social networks of varying sizes show that our algorithm achieves higher-quality predictions than other methods.
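A random-walk-style propagation of this kind can be sketched as repeated multiplication by row-normalized transition matrices; the following minimal version (illustrative, not the paper's exact update) propagates scores across the two sides of a bipartite graph given as a NumPy biadjacency matrix:

```python
import numpy as np

def propagate_similarity(B, n_iter=10, alpha=0.85):
    """B: biadjacency matrix (users x items), as a NumPy array.
    Returns an item-item similarity matrix after random-walk propagation."""
    # row-normalized transition probabilities in both directions
    P_ui = B / np.maximum(B.sum(axis=1, keepdims=True), 1e-12)
    P_iu = B.T / np.maximum(B.T.sum(axis=1, keepdims=True), 1e-12)
    S = np.eye(B.shape[1])            # start: each item similar to itself
    for _ in range(n_iter):
        # walk item -> user -> item, with restart weight (1 - alpha)
        S = alpha * (P_iu @ P_ui @ S) + (1 - alpha) * np.eye(B.shape[1])
    return S
```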
Analyzing Effects of Antenna Polarization on Indoor ZigBee Channel Transmission Characteristics
XU Guang-hui, SU Guo-jie, WANG Hua-li, WANG Qing-guo and LIU Yang
Computer Science. 2016, 43 (4): 92-96.  doi:10.11896/j.issn.1002-137X.2016.04.018
To analyze the effects of different antenna polarizations on indoor ZigBee channel transmission characteristics, the paper constructed a deterministic model of a specific apartment based on the ray tracing method. It then used Wireless InSite to simulate four kinds of antenna polarization (horizontal, vertical, left-hand circular and right-hand circular). Finally, by comparatively analyzing the path loss, received power and delay spread channel parameters, we drew several valuable conclusions: vertical polarization yields the best received power but suffers serious delay spread; left-hand and right-hand circular polarization have lower path loss than linear polarization, but their received power is not ideal; wooden doors and partition walls have little effect on ZigBee propagation, whereas a metal door causes serious transmission loss; and so on.
Optimization of Communication Performance of RPC Client in HBase Architecture
HU Bo and TAN Liang
Computer Science. 2016, 43 (4): 97-101.  doi:10.11896/j.issn.1002-137X.2016.04.019
HBase has become a key component of big data storage, analysis and processing, and its performance optimization is a hot research topic in industry and academia today. The HBase architecture consists of multiple subsystems that communicate via a remote procedure call (RPC) mechanism. However, the RPC client of each subsystem adopts a blocking communication pattern, which causes threads to block under intensive client data requests, hurting communication efficiency among subsystems and reducing HBase performance. We first analyzed the communication mechanism of the HBase RPC client and server sides. Then we proposed a non-blocking communication pattern for the HBase RPC client based on Java NIO. Experimental results show that this pattern minimizes the effect of blocking on communication performance and improves the communication performance of the HBase RPC client.
Research for ETX Routing Metric of RPL
CAO Xiang-yi, ZENG Bi and HE Cui-hong
Computer Science. 2016, 43 (4): 102-105.  doi:10.11896/j.issn.1002-137X.2016.04.020
ETX (Expected Transmission Count) is the routing metric of the existing RPL (Routing Protocol for Low-power and Lossy Networks). ETX estimates the expected total number of transmissions required to deliver a packet successfully between two nodes, so routing on ETX minimizes that number; however, it does not take the real-time character of data transmission into account. This paper minimized the latency of data transmission on top of ETX while guaranteeing a high success rate and low latency. Finally, the improved algorithm was implemented in Contiki OS and compared with the standard ETX.
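For reference, the classical definition of the metric (De Couto et al.), with d_f the measured forward delivery ratio of a link and d_r the reverse delivery ratio, is

\[ \mathrm{ETX} = \frac{1}{d_f \times d_r}, \]

and a route's ETX is the sum of the ETX values of its links, so minimizing route ETX minimizes the expected number of (re)transmissions end to end.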
QoS and Multilevel Index Based Web Service Publish/Subscribe
HE Qian, LI Jia, HU Qi-wei and QIANG Bao-hua
Computer Science. 2016, 43 (4): 106-110.  doi:10.11896/j.issn.1002-137X.2016.04.021
Publish/subscribe (P/S) can help manage large-scale web services actively. In this paper, a QoS-based web service P/S model and system architecture, together with a matching algorithm based on QoS and a multilevel index, were proposed. The matching relationship between the QoS of web services and subscription constraints is the key of the P/S model. The published web services and service subscriptions are used to generate a filter matrix that reduces redundant matches. The multilevel index is constructed according to the QoS attribute types of the published web services, and a map from attributes to services is generated so that services can be subscribed to rapidly. Experiments show that the proposed web service P/S system outperforms traditional methods and is suitable for large-scale distributed web service management.
Efficient Method of Live Migration on Virtual Machine Memory
CHENG Hong-xi and TAN Liang
Computer Science. 2016, 43 (4): 111-114.  doi:10.11896/j.issn.1002-137X.2016.04.022
Pre-copy is the main strategy for virtual machine live migration, but the traditional pre-copy algorithm takes the whole dirty page as the unit of the working set, even when only a small amount of data in a page has been modified. For a virtual machine whose memory writes are widely distributed, the overall number of iterations is then generally too high and the migration takes too long. To solve these problems, this paper proposed a pre-copy algorithm that takes the dirty data within a page as the unit of the working set, using a new data structure to record the dirty data inside each page. Compared with dirty pages as the working-set unit, this unit not only reflects the actual migration environment of a virtual machine more accurately but also has a significantly finer granularity, which makes the total working set more likely to fall below the preset threshold, reducing the number of iterations, shortening the migration time and reducing migration bandwidth. Experiments show that the algorithm achieves more efficient virtual machine live migration.
Cooperative Electromagnetic Compatibility Control Complexity Optimization Algorithm Based on Cluster Filter for Mobile Crowd Sensing
LI Yu-ling, WANG Xiu-ling and ZHOU Jian-ming
Computer Science. 2016, 43 (4): 115-117.  doi:10.11896/j.issn.1002-137X.2016.04.023
The high complexity and extra resource consumption of cooperative electromagnetic compatibility control during mobile communication are key problems affecting communication performance and data transmission efficiency in mobile networks. The proposed algorithm is based on a mobile network composed of mobile sensing terminals, a perception group, a data service group and a data communication group. First, the model was built; then filter clusters of multiple types, including IIR and FIR filters, were formed. The experimental results show that the proposed algorithm not only optimizes the power consumption of the mobile relay and the CPU occupancy rate, but also significantly improves communication reliability.
Security Access to Cloud Disk via Virtual Isolation Mechanism
CHEN Feng, BAO Ai-hua and ZHANG Wei-ming
Computer Science. 2016, 43 (4): 118-121.  doi:10.11896/j.issn.1002-137X.2016.04.024
Cloud storage technology is an important research area of cloud computing. Because of the hidden danger of data leakage, cloud storage services are often difficult to adopt widely in organizations holding core data, such as innovative enterprises or the army. For this issue, a novel secure access model for cloud disks based on a virtual isolation mechanism was proposed. Theoretical analysis shows that the model can prevent sensitive data leakage from an enterprise cloud disk. Furthermore, CFS, an enterprise-private-cloud-oriented system for centralized management and control of electronic documents, was presented to test read/write performance under the model. The system has already been successfully deployed at a number of important user sites and has very good development prospects.
Secure Control Mechanism of Internal and External Data-flow Oriented to Virtual-desktop
DENG Xiao-xiao, LU Chuan and MA Wei
Computer Science. 2016, 43 (4): 122-126.  doi:10.11896/j.issn.1002-137X.2016.04.025
In desktop virtualization, data interaction between internal application data and the external user operation platform is realized by a virtual desktop protocol. Because this kind of protocol lacks a data-flow control mechanism, illegal interaction may occur. To resolve this problem, this paper proposed a gateway-based secure control mechanism for internal and external data flows oriented to virtual desktops. It provides overall control of the virtual channel without modifying numerous transport protocols or terminals, and offers high compatibility, extensibility and usability. Deploying it at the gateway to guard against boundary attacks significantly reduces server load and safety concerns. Experiments prove that this mechanism can control the direction of data flow effectively while having little impact on existing desktop sessions.
Secure Multi-party Computation Based on Symbolic Edge-valued Binary Decision Diagram
XU Zhou-bo, YU Qiang-sheng, GU Tian-long and NING Li-hua
Computer Science. 2016, 43 (4): 127-133.  doi:10.11896/j.issn.1002-137X.2016.04.026
Efficient representation of the decision function is a hot problem in the study of secure multi-party computation, and symbolic description techniques are a novel approach to it. Aiming at the leaf-node explosion that arises when the decision function is represented by an algebraic decision diagram (ADD), an edge-valued binary decision diagram (EVBDD) technique was introduced, and a new EVBDD-based method for representing the decision function was proposed. Firstly, the decision function is transformed into symbolic EVBDD form, which avoids the leaf-node explosion. Secondly, the privacy issues arising on the computation path are solved by adding virtual nodes. Furthermore, an EVBDD encryption algorithm was proposed and a new EVBDD-based secure two-party computation protocol was designed. Finally, the correctness, security and efficiency of the new protocol were analyzed, demonstrating a considerable efficiency improvement over the ADD-based method.
New Bit-level Self-adaptive Color Image Encryption Algorithm Based on Hyperchaotic System
CHAI Xiu-li and GAN Zhi-hua
Computer Science. 2016, 43 (4): 134-139.  doi:10.11896/j.issn.1002-137X.2016.04.027
In this paper, a self-adaptive color image encryption algorithm based on a hyperchaotic system and operating at the bit level was introduced. Firstly, chaotic sequences produced by the Chen hyperchaotic system are used to confuse and diffuse the R, G and B components of the plain image. Moreover, a self-adaptive scheme is adopted: the higher four bit-planes are used to encrypt the lower four bit-planes, and the encrypted lower four bit-planes are in turn used to encrypt the higher four. Then the three encrypted base-color components are combined horizontally and encrypted simultaneously, which reduces the correlations among them. The algorithm makes the relationship between the ciphered image, the plain image and the keys more complex; some keys depend on the plain image, making the algorithm more sensitive to it. Tests and analyses of key space, key sensitivity, image histograms, correlation, information entropy and plain-image sensitivity were carried out. The results demonstrate the superior security and high efficiency of the proposed scheme, which has broad application in secure image communication.
Cloud Services Cross-domain Authentication Protocol Formal Analysis and Verification Based on Fragment Model Check
CHEN Hong-song, WANG Gang and FU Zhong-chuan
Computer Science. 2016, 43 (4): 140-144.  doi:10.11896/j.issn.1002-137X.2016.04.028
Aiming at the cross-domain authentication problem in multi-cloud services, a cloud service security authentication scheme based on the SAML (Security Assertion Markup Language) protocol was proposed. The key techniques and mechanisms were clarified, and an abstract model of the cloud service security authentication protocol was built. Using Casper in combination with FDR, the protocol was formally analyzed and verified by model checking. By dividing the protocol into fragments that are model-checked separately, the state-space explosion problem in the formal analysis and verification of security protocols was resolved. The model checking results prove that the proposed cross-domain authentication scheme is effective and secure.
Mobile Data Storage Solution Based on Secret Sharing Protocol
RAN Juan and LI Xiao-yu
Computer Science. 2016, 43 (4): 145-149.  doi:10.11896/j.issn.1002-137X.2016.04.029
Taking into account the limited resources and capabilities of mobile databases, a mobile data storage solution based on a secret sharing protocol was proposed. A lightweight main-memory database is used on the client to store a small amount of data, while most of the mobile client's data are stored on the database server. Sensitive data are stored on a data encryption server using AES encryption, and the secret key is split by secret sharing and stored on different data storage servers, which guarantees that only the mobile client can obtain both the key and the ciphertext while reducing the client's storage burden. The solution separates data control permissions, ensuring that the mobile client accesses data with the highest authority and improving data security. Experimental results show that the solution is feasible, performs well and has good application prospects.
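The key-splitting step can be realized with Shamir's (t, n) threshold scheme, in which the key is the constant term of a random degree-(t-1) polynomial over a prime field; a compact sketch with illustrative parameters (not necessarily the paper's exact protocol):

```python
import random

P = 2**127 - 1  # a Mersenne prime defining the field

def split(secret, n=5, t=3):
    """Shamir (t, n): shares are points on a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789)
assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
```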
Trojans Keep-alive Behavior Detection Approach Based on Wavelet Transform
BAI Hong, PANG Jian-min, DAI Chao and YUE Feng
Computer Science. 2016, 43 (4): 150-154.  doi:10.11896/j.issn.1002-137X.2016.04.030
Trojan keep-alive behavior detection algorithms are generally based on clustering, which can hardly avoid interference from other packets in the network, leading to false positives. Therefore, this paper proposed a Trojan keep-alive behavior detection approach based on the wavelet transform. In this approach, the TCP packet stream is first described as a packet-length signal; the signal is then denoised by a threshold method based on Mallat's theory; finally, detection results are obtained through a detail-information decision algorithm based on packet rate. Experiments show that this approach can detect Trojan keep-alive behavior effectively and has better interference resistance.
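Wavelet threshold denoising of a packet-length signal of this kind can be sketched with the PyWavelets library; this is a generic hard-threshold pipeline with a conventional universal threshold, not the paper's exact parameters:

```python
import numpy as np
import pywt

def denoise(signal, wavelet='db4', level=4):
    """Hard-threshold wavelet denoising of a 1-D packet-length signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # noise level estimated from the finest detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    # threshold all detail bands, keep the approximation band
    coeffs[1:] = [pywt.threshold(c, thr, mode='hard') for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```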
Trust Evaluation Model with Eliminating Random Recommendation
ZHOU Guo-qiang, LIU Hong-fang and WANG Zi-yuan
Computer Science. 2016, 43 (4): 155-159.  doi:10.11896/j.issn.1002-137X.2016.04.031
To deal with the problem of random recommendations among network entities, we proposed a trust evaluation model that removes them. Former models always incorporate random recommendations by assigning weights to them; in essence, however, random recommendations have nothing to do with the behavior of the recommended entity, so weighting them is baseless and may lead to distortion. To prevent random recommendations from affecting the correct evaluation of other entities, this paper first integrates users' transaction histories to calculate their recommendation credibility. Next, based on the dispersion characteristics of the evaluations the system assigns to random recommenders, recommendations that cannot accumulate consistent evaluations are recognized and filtered out. Experiments with various proportions of random recommendations show that, because the random recommendations are recognized and filtered out, the proposed method achieves a higher success rate than traditional ones.
Low-cost Ultralightweight RFID Mutual-authentication Protocol
YANG Xin and LING Jie
Computer Science. 2016, 43 (4): 160-162.  doi:10.11896/j.issn.1002-137X.2016.04.032
Aiming at the security and tag-cost problems of RFID, a low-cost ultralightweight RFID mutual-authentication protocol was proposed and proved safe by BAN-logic formal analysis. Security analysis further shows that the protocol is robust and can defend against malicious attacks such as disclosure, desynchronization and impersonation attacks. It offers better security, low cost and low consumption of computing and storage resources.
Time-bandwidth Allocation Scheme for Physical Layer Security in Cooperative Cognitive Radio Networks
GAO Rui-feng, NI Dan-yan, BAO Zhi-hua and HU Ying-dong
Computer Science. 2016, 43 (4): 163-166.  doi:10.11896/j.issn.1002-137X.2016.04.033
Cognitive radio (CR) is one of the strongest candidate technologies for solving spectrum scarcity. We proposed a time-bandwidth allocation scheme for physical layer security in cognitive radio networks, in which the primary users (PUs) communicate with the help of the secondary users (SUs); in return, the PUs share bandwidth with the SUs for the secondary network's communication. In particular, we selected two SUs, a relay SU and a jammer SU, to maximize the physical-layer secrecy capacity of the PUs subject to the SUs' minimum transmission-rate requirement. The optimal bandwidth allocation factor, time allocation factor and cooperative power allocation of the SUs were derived. Numerical results show that the proposed scheme improves the secrecy capacity of the PUs significantly.
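The quantity being maximized is the standard physical-layer secrecy capacity, the positive gap between the main channel's rate and the eavesdropper channel's rate:

\[ C_s = \left[\log_2(1+\gamma_m) - \log_2(1+\gamma_e)\right]^{+}, \]

where γ_m and γ_e are the SNRs at the legitimate receiver and the eavesdropper and [x]^+ = max(x, 0); intuitively, the relay SU raises γ_m while the jammer SU lowers γ_e.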
Model Checking Software Product Line Based on Feature Slicing
LIU Yu-mei, WEI Ou and HUANG Ming-yu
Computer Science. 2016, 43 (4): 167-172.  doi:10.11896/j.issn.1002-137X.2016.04.034
Feature models are a popular formalism for describing the commonality and variability of a software product line in terms of features. They represent the space of possible application configurations and can be customized to specific domain requirements and stakeholder goals. As feature models become increasingly complex, automatic support for customized analysis and verification based on stakeholders' strategic goals and requirements is desirable. This paper first presented feature model slicing based on user requirements. It then introduced three-valued abstraction of behavior models based on the slicing unit. Finally, a case study on a multi-valued model checker was conducted to illustrate the effectiveness of our approach.
Real-time Kernel Design Based on μC/OS-II and Meeting OSEK Standard
ZHU Yi-an, WEI Run-zhi, SU Shi-you and HUANG Shu-juan
Computer Science. 2016, 43 (4): 173-176.  doi:10.11896/j.issn.1002-137X.2016.04.035
Safety-critical systems place high requirements on the timeliness and reliability of embedded systems. This paper designed a new kernel with a mixed time-triggered and event-triggered mechanism that meets the OSEK standard. The kernel features good timeliness, high flexibility and determinism. The paper also put forward a static scheduling strategy for mixed task sets and an algorithm to check the schedulability of time-triggered tasks. Flexible switching between tasks at both the interrupt and task levels helps guarantee these properties while improving system utilization. Experiments show that the real-time kernel is effective and efficient, with good timeliness and high reliability.
Probabilistic Quasi-Hoare Logic
WU Xin-xing, HU Guo-sheng and CHEN Yi-xiang
Computer Science. 2016, 43 (4): 177-181.  doi:10.11896/j.issn.1002-137X.2016.04.036
A Hoare-logic-based probabilistic quasi-Hoare logic for assessing program correctness was presented. Probabilistic quasi-Hoare logic describes the difference between a theoretical idea and its implementation, reflecting the degree to which the idea is realized by the program in practice. It can thus formally explain why theoretically correct programs err, and why the sequential composition of two programs (components) that each have a high degree of correctness may have a low one, among other phenomena.
Keyword Search for Relational Databases Based on Offline Index
GE Wei-yi, ZONG Shi-qiang and YIN Wen-ke
Computer Science. 2016, 43 (4): 182-187.  doi:10.11896/j.issn.1002-137X.2016.04.037
The biggest challenge for keyword search over relational databases is that results are often assembled from tuples in several tables. Dominant approaches find the tuples hit by keywords and identify their joins on the fly to form results, which is rather inefficient for databases with large scale or complex schemas; moreover, these approaches support only simple keyword queries, limiting their practical use. We therefore proposed an alternative that indexes joinable tuples offline to speed up online search, and strove to improve its indexing efficiency and query capability for practical usage. Firstly, to improve search and indexing efficiency, we proposed an approach that uses schema-graph information to enumerate joinable tuples: it discovers all suitable joinable tables and translates them into SQL queries, which are sent to the database interface to obtain candidate joinable tuples. Secondly, to ensure the compactness of results, we index joinable tuples and search them in order of size: small relevant joinable tuples are selected first, and joinable tuples containing them are excluded from the next round of selection. Finally, to support complex queries, we proposed a field-based index structure that uses different fields for different search types; at search time these fields are queried and their results combined by logical operations to obtain the final results. Experiments show that, compared with existing approaches, ours stands out in indexing and search efficiency and provides query capability close to SQL.
UCMLib:A Multi-core Multithread Programming Library
YANG Ji-xiang
Computer Science. 2016, 43 (4): 188-191.  doi:10.11896/j.issn.1002-137X.2016.04.038
The radical shift of microprocessor architectures to multi-core designs, and their continued successful development, demand high productivity and good speedup from multi-core programming. Aiming at these two goals, we designed and implemented UCMLib, a lightweight multi-core parallel programming library supporting high-level programming in C# for .NET. The library is built on the task primitive and supports two models for expressing logical parallelism: data parallelism and task parallelism. It simplifies parallel programming by encapsulating complexity, freeing the programmer from worrying explicitly about locking and contention and thus increasing productivity. Its speedup techniques center on efficient construction and management of task queues. Measured performance shows parallel speedups comparable to those of Microsoft TPL. Lastly, we suggested possible improvements and issues for further study.
Recommendation Model of Microblog User Tags Based on Hybrid Grain
ZHANG Rui, JIN Zhi-gang and WANG Ying
Computer Science. 2016, 43 (4): 192-196.  doi:10.11896/j.issn.1002-137X.2016.04.039
To avoid the shortcomings of existing tag-recommendation systems, we proposed a recommendation model based on hybrid granularity. The model divides the resources into multiple grains, including user information, tags and blog content. Filtering personal information, analyzing personality tags, computing user tag entropy and inline degree, classifying words and extracting microblog topics then all proceed on the divided hybrid grains. Through these steps, the method recommends associated personalized tags to users. Comparative experimental results show that the proposed model can effectively handle the cold-start problem, improve the accuracy of tag recommendation and ensure the diversity of recommended tags.
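The user tag entropy computed above is, in its standard Shannon form (the paper's exact weighting may differ), taken over a user's tag distribution: with p(t_i) the fraction of user u's tagging actions that use tag t_i,

\[ H(u) = -\sum_{i=1}^{k} p(t_i)\,\log_2 p(t_i), \]

so a user whose tags spread over many topics scores high, while a single-topic user scores low.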
Large Data Clustering Algorithm Introducing Time and Frequency Clustering Interference Suppression
HU Xian-bing and ZHAO Guo-qing
Computer Science. 2016, 43 (4): 197-201.  doi:10.11896/j.issn.1002-137X.2016.04.040
Clustering methods for large data sets are of great research significance in pattern recognition, fault diagnosis, data mining and other areas. Traditional approaches use a hybrid differential evolution particle swarm optimization algorithm, but the cross-term interference caused by interactions between components of the data flow affects correct judgment, so the clustering effect is poor. A large-data clustering algorithm based on time-frequency aggregation interference suppression was therefore proposed. From the communication perspective of a large Internet-of-Things database, big-data clustering feature vectors are generated; the membership grades of any two cluster vectors with respect to neighboring samples are trained, and the information is scheduled in a time-sliding-window model. A high-frequency-component suppression method is used to suppress the interference of time-frequency cluster interactions. Through similarity fusion of frequency-domain convolutions and particle swarm optimization of the clustering fitness, the data clustering algorithm is improved. Simulation results show that the algorithm has good interference resistance and adaptivity, as well as high accuracy.
Solving Reachability Relationship by Property of Directed Cycle
CHEN Qiu-ru, WEN Zhong-hua, YUAN Run and DAI Liang-wei
Computer Science. 2016, 43 (4): 202-205.  doi:10.11896/j.issn.1002-137X.2016.04.041
The goal of non-deterministic planning is to obtain a planning solution. For lack of helpful information, searching the problem space directly performs a large amount of useless work, which can be reduced dramatically by capturing the reachability relationships between states. Current methods, however, perform poorly in efficiency, so we designed a new algorithm. We build a graph from the non-deterministic system and check whether paths between states form directed cycles; if such a cycle exists, every two states on it are mutually reachable. Treating one vertex as the representative, we tag it with the reachability relationships, and by using these relationships to update the reachability information of all states on the cycle, we prevent many useless states from being searched. Experimental results show that the designed algorithm not only obtains more complete reachability relationships but also outperforms current algorithms in efficiency.
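The cycle property exploited here, that all states on a directed cycle are mutually reachable, is exactly what strongly connected components capture; a compact Kosaraju-style sketch that groups mutually reachable states (illustrative, not the paper's algorithm):

```python
def sccs(graph):
    """Kosaraju: states in the same component are mutually reachable.
    graph: dict mapping node -> list of successor nodes."""
    order, seen = [], set()

    def dfs(g, v, out):
        # iterative DFS that records vertices in postorder
        stack = [(v, iter(g.get(v, [])))]
        seen.add(v)
        while stack:
            node, it = stack[-1]
            for nxt in it:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(g.get(nxt, []))))
                    break
            else:
                stack.pop()
                out.append(node)

    for v in graph:
        if v not in seen:
            dfs(graph, v, order)

    rev = {}                      # reversed graph
    for u, vs in graph.items():
        for v in vs:
            rev.setdefault(v, []).append(u)

    seen, comps = set(), []
    for v in reversed(order):     # second pass in reverse finishing order
        if v not in seen:
            comp = []
            dfs(rev, v, comp)
            comps.append(comp)
    return comps
```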
Fuzzy Rough Set Algebra of Multi-granulation
KONG Qing-zhao and WEI Zeng-xin
Computer Science. 2016, 43 (4): 206-209.  doi:10.11896/j.issn.1002-137X.2016.04.042
It is well known that a rough set algebra is a set algebra with an added dual pair of rough approximation operators. In this paper, on the one hand, we examined the classical multi-granulation fuzzy rough set algebra by the axiomatic approach, showing that it does not possess good properties. On the other hand, we defined the concept of equivalence relations with a minimum (maximum) element, and defined multi-granulation fuzzy approximation operators based on such equivalence relations. We discussed the properties of the fuzzy rough set algebra based on equivalence relations with a minimum (maximum) element and obtained many good results.
Collaborative Filtering Recommendation Algorithm Based on Context Similarity and Twice Clustering
CAI Hai-ni, QIN Meng-qiu, WEN Jun-hao, XIONG Qing-yu and LI Mao-liang
Computer Science. 2016, 43 (4): 210-213.  doi:10.11896/j.issn.1002-137X.2016.04.043
Aiming at personalized service recommendation in mobile telecommunication networks, this paper introduced context information into the personalized recommendation process and proposed a collaborative filtering algorithm based on context similarity and twice clustering. Firstly, users are clustered according to context similarity, and each user's rating confidence is calculated from the rating matrix to distinguish core from non-core users. Secondly, the cluster centers formed by the initial clustering are adjusted according to the ratings of core users, and non-core users are clustered again to form new clusters. Finally, user preferences are predicted according to context similarity. To some extent, this algorithm reduces the influence of noisy rating data on the clustering results and reduces recommendation deviation. Experiments on a simulated data set show that the algorithm effectively improves the accuracy of user-preference prediction and increases the accuracy of collaborative filtering recommendation.
Process Mining Approach Based on Tabu Search Algorithm
BAI Xue-cong and ZHU Yan
Computer Science. 2016, 43 (4): 214-218.  doi:10.11896/j.issn.1002-137X.2016.04.044
Workflow management systems supporting process control are widely used to meet the needs of highly efficient automated production. Process mining can use historical data such as event logs to generate abstract process models, providing favorable conditions for deploying workflow systems. This paper presented a general process mining framework based on heuristic optimization algorithms, and then applied the tabu search algorithm to the process mining task. Key problems such as initialization and the creation of the tabu list and neighborhood were discussed in detail. Finally, the algorithm was implemented as a ProM plug-in. Experiments verify the correctness of the process mining framework; the tabu-search process mining approach can handle different flow structures, is robust to noise and consumes little time.
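A generic tabu search loop of the kind instantiated here keeps a short-term memory of recent moves to escape local optima; a minimal sketch in which the neighborhood and scoring functions are placeholders to be supplied by the mining task:

```python
from collections import deque

def tabu_search(initial, neighbors, score, max_iter=1000, tenure=20):
    """Generic tabu search: 'neighbors' yields (move, candidate) pairs,
    'score' evaluates a candidate (higher is better)."""
    current, best = initial, initial
    tabu = deque(maxlen=tenure)        # short-term memory of recent moves
    for _ in range(max_iter):
        candidates = [(m, c) for m, c in neighbors(current)
                      if m not in tabu or score(c) > score(best)]  # aspiration
        if not candidates:
            break
        move, current = max(candidates, key=lambda mc: score(mc[1]))
        tabu.append(move)
        if score(current) > score(best):
            best = current
    return best
```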
Detecting Concept Drift of Data Stream Based on Fuzzy Clustering
CHEN Xiao-dong, SUN Li-juan, HAN Chong and GUO Jian
Computer Science. 2016, 43 (4): 219-223.  doi:10.11896/j.issn.1002-137X.2016.04.045
Concept drift may occur in data streams, and detecting it is important in many applications. We used an improved FCM algorithm to cluster the data in a variable sliding window and measured the difference between adjacent windows to determine whether concept drift has occurred. The results show that our algorithm detects concept drift in data streams effectively and performs well in both clustering quality and running time.
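For reference, standard FCM alternates between the membership and center updates

\[ u_{ik} = \left(\sum_{j=1}^{c}\left(\frac{\lVert x_k - v_i\rVert}{\lVert x_k - v_j\rVert}\right)^{\frac{2}{m-1}}\right)^{-1}, \qquad v_i = \frac{\sum_{k} u_{ik}^{m}\, x_k}{\sum_{k} u_{ik}^{m}}, \]

with fuzzifier m > 1; drift can then be flagged when the centers or memberships of adjacent windows differ by more than a threshold.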
Named Entity Recognition Based on Deep Belief Net
FENG Yun-tian, ZHANG Hong-jun, HAO Wen-ning and CHEN Gang
Computer Science. 2016, 43 (4): 224-230.  doi:10.11896/j.issn.1002-137X.2016.04.046
Traditional named entity recognition methods, which tag words by feeding a great number of handcrafted features into statistical learning models, have achieved good results, but manually defining features makes model building difficult. To decrease this manual workload, this paper first obtains distributed representations of word features by training a neural network language model without supervision, then discovers deep word features by feeding the distributed features into a deep belief net, and finally performs named entity recognition. The method uses a deep belief net to extend the neural network language model on the basis of prior research, presenting a deep architecture suited to named entity recognition. Experiments show that, when both use only the term and POS features, the method outperforms the traditional conditional random field model and has practical value.
Relaxed Optimal Balanced Streaming Graph Partitioning Algorithm
YIN Xiao-bo and LUO En
Computer Science. 2016, 43 (4): 231-234.  doi:10.11896/j.issn.1002-137X.2016.04.047
In distributed processing of large-scale graph data, the whole graph usually needs to be partitioned across different nodes; if the partition is unbalanced, some nodes may become bottlenecks of the distributed system. To achieve a balanced partition and handle rapid graph updates efficiently, this paper proposed a relaxed optimal balanced streaming graph partitioning algorithm. Firstly, we defined an objective function covering both inter-partition and intra-partition costs as a general graph partitioning framework. Secondly, we analyzed the graph partitioning problem through a maximization and a minimization objective within the proposed framework and gave their relationship. Finally, we proposed a greedy optimal k-partitioning algorithm over the streaming graph. Experiments show that the proposed algorithm achieves better balance and lower communication cost than related work, and that applications built upon it perform better.
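Greedy streaming partitioners of this kind typically place each arriving vertex in the partition holding most of its already-placed neighbors, discounted by a capacity penalty (the well-known linear deterministic greedy heuristic); a minimal sketch, not the paper's exact objective:

```python
def stream_partition(vertices, neighbors, k, capacity):
    """Place each arriving vertex greedily: favor partitions holding its
    neighbors, penalized by fullness (linear deterministic greedy)."""
    assignment, sizes = {}, [0] * k
    for v in vertices:                      # vertices arrive as a stream
        placed = [assignment[u] for u in neighbors(v) if u in assignment]
        def gain(p):
            return placed.count(p) * (1 - sizes[p] / capacity)
        # highest gain wins; ties go to the least loaded partition
        best = min(range(k), key=lambda p: (-gain(p), sizes[p]))
        assignment[v] = best
        sizes[best] += 1
    return assignment
```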
Collaborative Filtering Recommendation Algorithm Based on Overlap Degrees and Double Attributes
ZHANG Bo, LIU Xue-jun and LI Bin
Computer Science. 2016, 43 (4): 235-240.  doi:10.11896/j.issn.1002-137X.2016.04.048
Collaborative filtering is currently one of the most widely used and successful recommendation technologies in recommender systems. However, the traditional collaborative filtering algorithm suffers from one-sided neighbor selection and low recommendation precision. To solve these problems, this paper proposed a collaborative filtering recommendation algorithm based on overlap degrees and double attributes. Firstly, the aggregated results of similarity and overlap degree are used to select the set of recommendation objects. Then the concept of double attributes is introduced: the reliability of the target user and the popularity of the target item are calculated respectively. Finally, taking both into account, user and item rating information are jointly used to generate recommendations for the target user. Experimental results show that the proposed algorithm significantly improves neighbor selection and recommendation quality compared with the traditional collaborative filtering algorithm.
On Countable Core Models of RCC
ZHAO Xiao-rong, YU Quan and WANG Ju
Computer Science. 2016, 43 (4): 241-246.  doi:10.11896/j.issn.1002-137X.2016.04.049
Spatial logic is an important branch of knowledge representation and reasoning, and RCC (GRCC) is among its most studied formal systems. Starting from the redundancy and non-redundancy of the connection relation, we proposed the concept of core models and proved an existence theorem for them. We studied the internal connectivity of RCC models and proved that it is first-order definable. Based on internal connectivity, we proved an extension theorem for core models.
Coupling Similarity-based Matrix Factorization Technique for Recommendation
GUO Meng-jiao, SUN Jing-guang and MENG Xiang-fu
Computer Science. 2016, 43 (4): 247-251.  doi:10.11896/j.issn.1002-137X.2016.04.050
Abstract PDF(415KB) ( 504 )   
References | Related Articles | Metrics
With the rapid development of the Internet and information technology, information overload is becoming increasingly serious. Recommender systems can provide personalized recommendations to individual users and to businesses (such as e-commerce and retail enterprises). Data sparsity and prediction quality are recognized as the key challenges for existing recommender systems. Most existing systems rely on collaborative filtering (CF), which mainly uses the user-item rating matrix to represent the relationship between users and items. Several studies consider exploiting extra information to improve accuracy. However, most existing methods assume that the attributes of items are independent and identically distributed and thus fail to provide accurate information for prediction, whereas in real applications various coupling relationships or connections exist among items and users. This paper incorporated coupling relationship analysis to capture under-discovered relationships among items and to make the predicted ratings more reasonable, and proposed a coupled attribute-based matrix factorization model that captures the coupling correlations between items effectively. Experimental evaluations demonstrate that the proposed algorithms outperform state-of-the-art algorithms in both warm-start and cold-start settings.
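A minimal sketch of how a coupling term can enter matrix factorization, assuming a precomputed item-coupling matrix S with zero diagonal standing in for the paper's coupled attribute similarity; the pull-toward-coupled-item regularizer is our illustrative choice.

# SGD matrix factorization with an extra "coupling" pull: each item's
# latent vector is nudged toward that of its most-coupled item.
import numpy as np

def coupled_mf(R, S, k=8, lr=0.01, reg=0.05, beta=0.1, epochs=50):
    m, n = R.shape
    rng = np.random.default_rng(0)
    P, Q = rng.normal(0, 0.1, (m, k)), rng.normal(0, 0.1, (n, k))
    users, items = np.nonzero(R)
    for _ in range(epochs):
        for u, i in zip(users, items):
            e = R[u, i] - P[u] @ Q[i]
            P[u] += lr * (e * Q[i] - reg * P[u])
            j = int(np.argmax(S[i]))             # most-coupled other item
            Q[i] += lr * (e * P[u] - reg * Q[i] - beta * (Q[i] - Q[j]))
    return P, Q

R = np.array([[5, 0, 3], [4, 1, 0], [0, 2, 5]], dtype=float)
S = np.array([[0, .9, .1], [.9, 0, .2], [.1, .2, 0]])   # zero diagonal
P, Q = coupled_mf(R, S)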
Analysis of Social Media Networks Based on Deep Neural Networks
ZHANG Yan-hong and WANG Bao-hui
Computer Science. 2016, 43 (4): 252-255.  doi:10.11896/j.issn.1002-137X.2016.04.051
Abstract PDF(410KB) ( 516 )   
References | Related Articles | Metrics
Social media networks contain not only multi-modal data, such as users, text, images and video, but also collective effects between data of different modalities. To describe social media networks better and provide better services for applications, this paper proposed a social media network model based on deep neural networks. The model represents the data of each modality with a deep neural network, obtaining a latent feature representation per modality. For each pair of modalities, a Gaussian-prior matrix and the two posterior distributions of the modalities were used to build a generative model describing the collective effect between them. Experiments show that the proposed model achieves better prediction performance in network link analysis than related work and can describe the whole underlying social media network effectively.
Bayesian Chinese Spam Filtering Method Based on Phrases
WANG Qing-song and WEI Ru-yu
Computer Science. 2016, 43 (4): 256-259.  doi:10.11896/j.issn.1002-137X.2016.04.052
Abstract PDF(412KB) ( 823 )   
References | Related Articles | Metrics
Naive Bayes has been widely used in spam filtering, where feature extraction is one of the essential steps of the algorithm. Previous Chinese spam filtering methods used only words as text features. Faced with large-scale e-mail training samples, the time efficiency of such an algorithm becomes a bottleneck of spam filtering technology. A Bayesian spam filtering algorithm based on phrases was proposed here, incorporating a new phrase analysis method put forward in the text classification field. Phrases are extracted as units according to rules for base noun phrases and verb phrases and through semantic analysis. Comparative spam-filtering experiments with words and with phrases as the unit confirm the effectiveness of the proposed method.
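The filtering shape can be shown in a few lines; note the paper extracts base noun and verb phrases by rule, whereas this hedged stand-in approximates "phrases" with word bigrams, and the toy mails are invented.

# Phrase-flavoured Naive Bayes spam filter: bigrams stand in for the
# rule-extracted phrases of the paper.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

mails = ["win free prize now", "meeting notes attached",
         "free prize inside", "project meeting tomorrow"]
labels = [1, 0, 1, 0]                       # 1 = spam

vec = CountVectorizer(ngram_range=(1, 2))   # unigrams + bigram "phrases"
clf = MultinomialNB().fit(vec.fit_transform(mails), labels)
print(clf.predict(vec.transform(["free prize meeting"])))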
Research on Classification of Oral Bioavailability Based on Deep Learning
SHI Xin-yu, YU Long, TIAN Sheng-wei, YE Fei-yue, QIAN Jin and GAO Shuang-yin
Computer Science. 2016, 43 (4): 260-263.  doi:10.11896/j.issn.1002-137X.2016.04.053
Abstract PDF(310KB) ( 1066 )   
References | Related Articles | Metrics
Measuring oral bioavailability with traditional methods is expensive and time-consuming, and existing machine learning methods show low accuracy. To address these problems, a classification method for human oral bioavailability based on stacked autoencoders (SAE) was presented, combining filtered molecular features with the SAE model. Experimental results show that, compared with shallow learning models such as support vector machines and artificial neural networks, the deep network learns more essential molecular features, and that combining the screened 2D and 3D molecular features achieves a better classification effect for oral bioavailability, with an average accuracy of 83%, a sensitivity (SE) of 94% and a specificity (SP) of 49%.
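A sketch of the SAE recipe, greedy layer-wise pretraining followed by supervised fine-tuning, assuming placeholder descriptor and layer sizes rather than the paper's configuration; PyTorch is our choice of framework here.

# Greedy layer-wise stacked-autoencoder pretraining, then a softmax head.
import torch, torch.nn as nn

X = torch.rand(256, 100)                  # placeholder molecular descriptors
y = torch.randint(0, 2, (256,))           # 1 = high oral bioavailability

def pretrain(inputs, hidden, epochs=100):
    enc = nn.Linear(inputs.shape[1], hidden)
    dec = nn.Linear(hidden, inputs.shape[1])
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
    for _ in range(epochs):               # reconstruct the layer's input
        loss = nn.functional.mse_loss(dec(torch.sigmoid(enc(inputs))), inputs)
        opt.zero_grad(); loss.backward(); opt.step()
    return enc, torch.sigmoid(enc(inputs)).detach()

enc1, H1 = pretrain(X, 64)                # first autoencoder layer
enc2, H2 = pretrain(H1, 32)               # second, trained on layer-1 codes

model = nn.Sequential(enc1, nn.Sigmoid(), enc2, nn.Sigmoid(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                      # supervised fine-tuning
    loss = nn.functional.cross_entropy(model(X), y)
    opt.zero_grad(); loss.backward(); opt.step()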
Nonlinear Normalization for Non-uniformly Distributed Data
LIANG Lu, LI Jian, HUO Ying-xiang and TENG Shao-hua
Computer Science. 2016, 43 (4): 264-269.  doi:10.11896/j.issn.1002-137X.2016.04.054
Abstract PDF(481KB) ( 1072 )   
References | Related Articles | Metrics
Traditional normalization of continuous attributes is usually a linear transformation. When linear normalization is applied to non-uniformly distributed datasets, the intervals between data points in dense local regions become too small, which easily makes subsequent data mining (particularly distance-based methods) insufficiently accurate. This paper proposed a nonlinear normalization based on data fitting: a corresponding nonlinear transformation function is found without changing the distribution pattern of the data, and the data interval is zoomed nonlinearly according to this function, expanding intervals in dense regions while shrinking them in sparse regions, which makes data mining more accurate. We tested with neural networks, SVM and KNN on different datasets. The results show that the error rate decreases while the F1 measure increases.
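One concrete transformation in this spirit, assuming the paper fits its own function by data fitting rather than using this exact form, is mapping each value through the empirical CDF, which stretches dense regions apart and squeezes sparse ones instead of applying a single linear min-max rescaling.

# Rank-based nonlinear normalization via the empirical CDF.
import numpy as np

def cdf_normalize(x):
    xs = np.sort(x)
    cdf = np.arange(1, len(xs) + 1) / len(xs)   # fitted monotone transform
    return np.interp(x, xs, cdf)

x = np.array([1.0, 1.1, 1.2, 1.3, 50.0])        # dense cluster + outlier
print(cdf_normalize(x))   # cluster spreads evenly over [0.2, 0.8]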
Improved Maximal Lyapunov Exponent Chaotic Forecasting Method Based on Markov Chain Theory
LI Xiu-yun and CHEN Shuai
Computer Science. 2016, 43 (4): 270-273.  doi:10.11896/j.issn.1002-137X.2016.04.055
Abstract PDF(292KB) ( 459 )   
References | Related Articles | Metrics
Forecasting chaotic time series with the maximal Lyapunov exponent may yield two candidate results, and few studies have addressed this. This paper introduced Markov chains to improve the method. The improved method takes the gradient of the time series as the state variable, builds a state transition matrix based on the Markov chain, uses the matrix to verify the evolution direction of the forecast, and then chooses the best prediction according to the evolution of the chaotic dynamical system. Finally, the improved forecasting model was verified on traffic flow data from the Yuwu Highway. The results show that the improved maximal Lyapunov exponent forecasting method is valid and feasible.
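A toy illustration of the disambiguation step, under the assumption that the two Lyapunov candidates are given and gradients are discretized into three states; the tolerance and state coding are ours.

# The transition matrix over gradient states (down/flat/up) votes for the
# more probable of the two candidate forecasts.
import numpy as np

def gradient_states(series, tol=0.05):
    return np.digitize(np.diff(series), [-tol, tol])   # 0=down, 1=flat, 2=up

def pick_candidate(series, cand_a, cand_b, tol=0.05):
    s = gradient_states(series, tol)
    T = np.zeros((3, 3))
    for a, b in zip(s[:-1], s[1:]):
        T[a, b] += 1                                   # empirical transitions
    state = lambda c: int(np.digitize(c - series[-1], [-tol, tol]))
    return cand_a if T[s[-1], state(cand_a)] >= T[s[-1], state(cand_b)] else cand_b

flow = np.array([10., 12., 11., 13., 12., 14.])        # alternating pattern
print(pick_candidate(flow, 15.5, 12.5))                # -> 12.5 (down step)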
Research of Packing Method Based on Adaptive Genetic Algorithm and Multi-strip Strategy
XU Hua-jie, TAN Hong-sen and HU Xiao-ming
Computer Science. 2016, 43 (4): 274-278.  doi:10.11896/j.issn.1002-137X.2016.04.056
Abstract PDF(1557KB) ( 459 )   
References | Related Articles | Metrics
Aiming at the rectangular packing optimization problems common in the craft production of modern industries, genetic algorithm operators with better performance were applied to search for rectangle sequences, and adaptively adjusted crossover and mutation probabilities were used to improve the genetic algorithm's convergence speed and stability. A multi-strip strategy under the constraints of craft production was proposed, with the optimized insertion strategy of the lowest-horizontal-line algorithm as the decoding method. Experimental results show that the proposed packing method achieves higher and more stable sheet utilization than the hierarchical strategy, improves the efficiency of craft production and reduces time cost, which is of practical significance.
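For the adaptive adjustment, one widely used scheme of the kind the paper applies (the Srinivas-Patnaik rule; the constants here are illustrative, not the paper's) lowers the crossover probability for above-average individuals and keeps it high for poor ones, preserving good packings while still exploring.

# Adaptive crossover probability: strong parent pairs cross over less often.
def adaptive_pc(f_better, f_avg, f_max, k1=1.0, k3=1.0):
    # f_better: larger fitness of the two parents to be crossed
    if f_better < f_avg:
        return k3                                    # weak parents: always cross
    return k1 * (f_max - f_better) / (f_max - f_avg + 1e-12)

print(adaptive_pc(f_better=0.9, f_avg=0.6, f_max=1.0))   # strong pair -> 0.25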
Models for Pattern Matching with Wildcards and Length Constraints
WANG Hao, WANG Hai-ping and WU Xin-dong
Computer Science. 2016, 43 (4): 279-283.  doi:10.11896/j.issn.1002-137X.2016.04.057
Abstract PDF(465KB) ( 469 )   
References | Related Articles | Metrics
In pattern matching with wildcards and length constraints (PMWL), a pattern is composed of a sequence of sub-patterns, and the flexible gap between any two adjacent sub-patterns in the text must lie within a specified range. Existing work includes a heuristic strategy and its completeness analysis under constraints, but the PMWL problem still needs systematic study. Drawing on constraint satisfaction problems (CSPs), we set up a 3-tuple model consisting of variables, domains and constraints, and derived formal descriptions of the basic concepts and properties. A tree-based matching algorithm was also presented that solves the PMWL problem under certain conditions.
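A minimal backtracking matcher for one reading of PMWL, assuming the gap is counted in characters between consecutive occurrences; this sketch illustrates the constraint structure, not the paper's tree-based algorithm.

# Sub-patterns must occur in order, with each inter-occurrence gap in [lo, hi].
def pmwl_match(text, subs, lo, hi, start=0, first=True):
    """Return occurrence positions, or None if no constrained match exists."""
    if not subs:
        return []
    begin = start if first else start + lo
    end = len(text) if first else min(len(text), start + hi + 1)
    for i in range(begin, end):
        if text.startswith(subs[0], i):
            rest = pmwl_match(text, subs[1:], lo, hi, i + len(subs[0]), False)
            if rest is not None:
                return [i] + rest
    return None

print(pmwl_match("xxabyycdzz", ["ab", "cd"], lo=1, hi=3))   # -> [2, 6]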
Detection and Location on Circular Valve Handle Based on Feature Decomposition and Combination
HE Li-xin, KONG Bin, YANG Jing, XU Yuan-yuan and WANG Bin
Computer Science. 2016, 43 (4): 284-289.  doi:10.11896/j.issn.1002-137X.2016.04.058
Abstract PDF(975KB) ( 423 )   
References | Related Articles | Metrics
To avoid the sample collection, manual labelling and local extrema involved in target detection methods based on support vector machines or neural networks, this paper transformed the detection of the geometrical features of common industrial circular valve handles into the detection of two sub-features: circles and line segments. First, a Hough transform detects circles and lines in an image captured by the robot, and the designed algorithm retains only the three lines that best correspond to the geometrical features of a circular valve handle. Then, combining the detected circle, the lines and the angle features, the algorithm judges whether the detected circle is a handle and obtains the positions where the robot's three fingers should be inserted to operate it. Experimental results indicate that the method imposes no rigorous requirements on shooting angle or brightness, detects circular valve handles effectively, and calculates the finger insertion positions accurately, with a correct detection and location rate of 90.7%.
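A rough OpenCV rendering of the first stage under assumed parameter values (the file name, thresholds and radii are placeholders); the subsequent handle test, checking that three retained segments pass near the circle centre at consistent angles, is left as the geometric step the paper describes.

# Hough circle plus probabilistic Hough line detection on one frame.
import cv2
import numpy as np

img = cv2.imread("valve.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame
blur = cv2.GaussianBlur(img, (9, 9), 2)
circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=100, param2=40, minRadius=20, maxRadius=200)
edges = cv2.Canny(blur, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                        minLineLength=30, maxLineGap=5)
# Next: keep the three segments whose extensions pass nearest the circle
# centre, then check their pairwise angles (about 120 degrees for a
# three-spoke handle) before accepting the circle as a valve handle.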
Video Denoising Algorithm Based on Wavelet Threshold and PCA
HU Ran, GUO Cheng-cheng and YANG Jian-feng
Computer Science. 2016, 43 (4): 290-293.  doi:10.11896/j.issn.1002-137X.2016.04.059
Abstract PDF(849KB) ( 439 )   
References | Related Articles | Metrics
This paper introduced local pixel grouping principal component analysis (LPG-PCA) into video denoising, exploiting the correlation of the video sequence to suppress noise together with a 3D block-matching wavelet thresholding algorithm. The first-stage image of LPG-PCA is replaced by the wavelet-thresholding result of the 3D block-matching video denoising algorithm, which avoids the local effect of LPG-PCA on video; in turn, LPG-PCA suppresses the block effect of the 3D block-matching wavelet thresholding. Experimental results show that our algorithm brings principal component analysis into video denoising and better resolves the block effect of the 3D block-matching video denoising algorithm. Subjective and objective comparisons between algorithms also show that our algorithm achieves a better denoising effect.
Fast Graphical Rendering of Seismic Profile Algorithm Based on GPU
DENG Bo-wen, LIU Chun-song, WU Fan-xian and YAO Xing-miao
Computer Science. 2016, 43 (4): 294-298.  doi:10.11896/j.issn.1002-137X.2016.04.060
Abstract PDF(943KB) ( 491 )   
References | Related Articles | Metrics
Drawing seismic profiles is the foundation of 2D seismic data visualization. Current methods based on common rendering engines run on the CPU, but with the growing scale of seismic data, conventional visualization can no longer meet the efficiency requirements of interaction. This paper proposed a method to accelerate the rendering of seismic data graphics that integrates graphical rendering with GPGPU technology, implementing rasterization on the GPU's powerful parallel computing capability. Experimental results indicate that, while preserving imaging quality, large seismic data can be imaged in real time on an ordinary PC platform.
GPU-based Texture Synthesis with Preserved Structures
TANG Ying, LIN Qi-feng, XIAO Ting-zhe and FAN Jing
Computer Science. 2016, 43 (4): 299-302.  doi:10.11896/j.issn.1002-137X.2016.04.061
Abstract PDF(1448KB) ( 404 )   
References | Related Articles | Metrics
We proposed a texture synthesis method based on the Chamfer distance which preserves texture structures. We measure the similarity between texture structures by the Chamfer distance and compute the distances between textures in both color space and structure-feature space when searching for matching texture patches. In this way we resolve the discontinuity of synthesized textures with obvious structural patterns. However, computing the Chamfer distance is expensive and becomes unaffordable as the resolution of the synthesized texture increases. To address this, we further presented a GPU-based algorithm that accelerates the Chamfer distance computation during synthesis: matching patches are searched in parallel on the GPU, greatly improving computational efficiency. Experimental results show that our method improves the quality of structured texture synthesis and accelerates synthesis at a speed that is essentially independent of the resolution of the synthesized textures.
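The structural term can be sketched with a distance transform, a CPU numpy/scipy stand-in for the paper's GPU kernel; the two binary edge maps here are invented examples.

# Chamfer distance between two binary structure maps: mean distance from
# edge pixels of a to the nearest edge pixel of b.
import numpy as np
from scipy.ndimage import distance_transform_edt

def chamfer(a, b):
    dist_to_b = distance_transform_edt(~b)   # 0 on b's edge pixels
    return dist_to_b[a].mean()

a = np.zeros((32, 32), bool); a[10, 5:25] = True   # horizontal stroke
b = np.zeros((32, 32), bool); b[13, 5:25] = True   # same stroke, shifted
print(chamfer(a, b))                               # 3.0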
Medical Image Segmentation Based on Region-based Hybrid Active Contour Model
LIN Xi-lan, CHEN Xiu-hong and XIAO Lin-yun
Computer Science. 2016, 43 (4): 303-307.  doi:10.11896/j.issn.1002-137X.2016.04.062
Abstract PDF(929KB) ( 418 )   
References | Related Articles | Metrics
Given that variational level set algorithms involve heavy computation and low speed in image segmentation, this paper proposed a new region-based hybrid nonconvex-regularization active contour model built on existing region-based active contour models. The model constructs a new energy functional incorporating the LBF model, which exploits the local clustering property of an image, and the geodesic active contour model. The added nonconvex regularization term speeds up the convergence of the contour curve, preserves region shape well and protects edges from over-smoothing. The minimum of the energy functional is obtained by the standard finite difference method. Simulation experiments on synthetic and medical images show that the proposed algorithm has a fast convergence rate, accurate segmentation results and better robustness.
Decision Tree Based Coding Unit Splitting Algorithm for HEVC
CEN Yue-feng, WANG Wan-liang, YAO Xin-wei, WANG Chao-chao and PAN Tie-qiang
Computer Science. 2016, 43 (4): 308-312.  doi:10.11896/j.issn.1002-137X.2016.04.063
Abstract PDF(424KB) ( 476 )   
References | Related Articles | Metrics
To reduce the computational complexity of High Efficiency Video Coding (HEVC), a decision tree based coding unit (CU) splitting algorithm was proposed. CU splitting is treated as a classification problem: splitting information extracted from CUs is fed into a decision tree model for training, yielding a decision tree based classifier. The classifier then determines the splitting of a CU whenever the classification condition is satisfied, so the computation of the rate-distortion (RD) cost is skipped and the encoding computational complexity is reduced. Experimental results demonstrate that the proposed mechanism significantly reduces encoding computational complexity while maintaining high video coding quality.
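The shape of the fast split decision looks roughly as follows, with hypothetical CU features (texture variance, gradient strength, neighbour depth) and synthetic labels standing in for the splitting information the paper extracts.

# A decision tree predicts split / no-split; the encoder skips the full
# RD-cost evaluation only when the tree is confident.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 3))             # [texture variance, gradient, nbr depth]
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)   # placeholder split labels

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
cu = [[0.9, 0.7, 0.3]]
if tree.predict_proba(cu)[0].max() > 0.95:   # confident: take the decision
    split = bool(tree.predict(cu)[0])
else:
    pass                             # fall back to full RD-cost comparison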
Saliency Text Detection Combining Graph-based Manifold Ranking with Central Segmentation
XU Xiao and GU Lei
Computer Science. 2016, 43 (4): 313-317.  doi:10.11896/j.issn.1002-137X.2016.04.064
Abstract PDF(1182KB) ( 428 )   
References | Related Articles | Metrics
To detect text in images with varied backgrounds, a new algorithm combining saliency detection via graph-based manifold ranking with central segmentation was proposed. The image elements (pixels or regions) of the input image are first ranked by their similarity to foreground or background cues: the boundary prior is exploited by taking the nodes on each side of the image as labelled background queries, binary segmentation is then applied to the resulting saliency map, and the labelled foreground nodes are taken as salient queries, yielding accurate candidate text areas. The central segmentation algorithm is then used to obtain a precise edge map. Since the saliency map cannot locate boundaries and the edge map cannot locate text areas, we integrate both to obtain the final result. Experiments show that the proposed method detects text in images effectively.
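The core of graph-based manifold ranking admits a closed form: with an affinity matrix W over image regions, rank scores are f = (I - alpha*S)^(-1) y, where S is the symmetrically normalized W and y marks the query nodes. A small sketch, with an invented four-node graph and the common choice alpha = 0.99 as assumptions:

# Closed-form manifold ranking over a region adjacency graph.
import numpy as np

def manifold_rank(W, y, alpha=0.99):
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))              # D^{-1/2} W D^{-1/2}
    return np.linalg.solve(np.eye(len(W)) - alpha * S, y)

W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
y = np.array([1.0, 0, 0, 0])                     # node 0 as boundary query
print(manifold_rank(W, y))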
Specific Target Tracking and Recognition Based on Adaboost-CSHG
PI Jia-li, WU Zheng-zhong and CHEN Zhuo
Computer Science. 2016, 43 (4): 318-321.  doi:10.11896/j.issn.1002-137X.2016.04.065
Abstract PDF(862KB) ( 436 )   
References | Related Articles | Metrics
Target tracking and recognition is a research hotspot in computer vision. First, an Adaboost-based target detection algorithm was adopted to train a classifier for the specific target, giving only a rough detection of the tank model. By building a CSHG model and cascading it with Adaboost, many false alarms can be eliminated, the tank model can be detected accurately, and the target can be tracked at the same time. Finally, a CSHG-based recognition method was used to recognize the target. Experimental results show that the algorithm works well under both simple and complicated background conditions.