Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41 Issue 12, 2014
Advances on Human Action Recognition in Realistic Scenes
LEI Qing,CHEN Duan-sheng and LI Shao-zi
Computer Science. 2014, 41 (12): 1-7.  doi:10.11896/j.issn.1002-137X.2014.12.001
Human action recognition is currently a popular and challenging topic in the domain of computer vision. The framework of mainstream methods includes visual feature detection, action representation and action classification. Action recognition in simple scenes has already been achieved. This paper introduced in detail the research on human action recognition in realistic scenes from the perspectives of research scope, feature detection and action modeling. Unlike several recently published surveys, we analyzed the state of the art and advances of this field, such as pose estimation and sparse coding based or deep learning based human action representation. Finally, the problems, difficulties as well as possible solutions were discussed.
Heuristic Routing Mechanism in ICN
SUN Xin-xin,WANG Xing-wei,LI Jie and HUANG Min
Computer Science. 2014, 41 (12): 8-10.  doi:10.11896/j.issn.1002-137X.2014.12.002
The Internet has become a social infrastructure. The current Internet architecture based on TCP/IP is faced with many challenges. This fact makes the clean-slate design of future Internet architectures, represented by Information-Centric Networking (ICN), a hot research topic. In this paper, a heuristic routing mechanism in ICN was proposed. On the basis of name-based routing, a procedure was devised to look for another available interface through which an interest packet meeting the backtracking condition will be forwarded, which can reduce the network blocking rate. Moreover, the Forwarding Information Base (FIB) of neighbor nodes will be modified when data packets go through a router, which can realize efficient use of the cache. In addition, the concept of "popularity" was introduced to improve the Content Store (CS) hit rate. The proposed routing mechanism was implemented on INTERNET2 by simulation. The experimental results show that it is feasible and effective.
Fault Waveform Morphology Analysis about High Speed Interface
HU Xing,KUANG Ji-shun,YUAN Heng-zhou and LI Shao-qing
Computer Science. 2014, 41 (12): 11-12.  doi:10.11896/j.issn.1002-137X.2014.12.003
With the continuing increase of data transfer rates, high speed interfaces are more and more widely used. Testing of high speed interfaces is difficult, and the engineer's experience plays a critical role in analyzing and diagnosing the fault type and location. Through waveform morphology analysis of high-speed interfaces in the fault state, the correspondence between faults and fault waveforms was determined and the cost of fault analysis was reduced. The experimental system constructs the logical structure between transmitters on a real PCIE IP core. Using a SPICE simulator, we simulated various fault waveform morphologies and formed a fault dictionary.
Self-similarity Analysis and Modeling for On-chip Traffic
CHEN Yi-ou,HU Jian-hao and LING Xiang
Computer Science. 2014, 41 (12): 13-18.  doi:10.11896/j.issn.1002-137X.2014.12.004
An accurate traffic analysis model is needed for latency prediction and verification in on-chip design. Unfortunately, the state-of-the-art Markov-based short-range-dependent models cannot characterize the burstiness and self-similarity of on-chip traffic, and therefore are not applicable to communication and signal processing SoCs. This paper proposed a self-similar NoC traffic model based on multiple parameters to provide accurate benchmarks for the design and verification of NoCs. Using theoretical derivation and experimental methods, this paper established an MPSoC information relevance model, provided an empirical fitting function between the parameters of the relevance model and the Hurst parameter, and established a method to estimate the Hurst parameter of NoC traffic. The experimental results prove that this traffic model can achieve an approximate and effective Hurst parameter.
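The Hurst parameter mentioned in this abstract is a standard measure of self-similarity. It can be estimated from a traffic trace by, for example, the classical aggregated-variance method; the sketch below is a generic illustration of that method, not the authors' fitting function (the block sizes and the white-noise sanity check are our own assumptions):

```python
import math
import random

def hurst_aggregated_variance(trace, block_sizes=(1, 2, 4, 8, 16)):
    """Estimate the Hurst parameter H of a traffic trace.

    For self-similar traffic the variance of the m-aggregated series
    decays as m**(2H - 2), so H is read off a log-log regression."""
    points = []
    for m in block_sizes:
        n = len(trace) // m
        # m-aggregated series: average of consecutive blocks of size m
        agg = [sum(trace[i * m:(i + 1) * m]) / m for i in range(n)]
        mean = sum(agg) / n
        var = sum((x - mean) ** 2 for x in agg) / n
        if var > 0:
            points.append((math.log(m), math.log(var)))
    # Least-squares slope of log(var) against log(m); H = 1 + slope / 2.
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 1.0 + slope / 2.0

# Sanity check: white noise is short-range dependent, so H should be ~0.5;
# bursty self-similar traffic would give H closer to 1.
random.seed(0)
h = hurst_aggregated_variance([random.gauss(0, 1) for _ in range(4096)])
```

For genuinely self-similar NoC traces the same regression would return an H noticeably above 0.5, which is what a short-range-dependent Markov model cannot reproduce.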
Locating Vulnerable Point for Integer Overflow Based on Flag Bits Differences
HUANG Ke-zhen,LIAN Yi-feng,CHEN Kai,ZHANG Ying-jun and KANG Kai
Computer Science. 2014, 41 (12): 19-23.  doi:10.11896/j.issn.1002-137X.2014.12.005
In recent years, the number of integer overflow vulnerabilities has remained high, and they pose a great threat to security. However, in previous studies, methods of locating vulnerable code are only used when patches or vulnerabilities' proof of concept (POC) are automatically generated. Besides, when locating vulnerable code, most previous methods tend to target buffer overflows, which cause adjacent memory data to be overwritten. Integer overflow vulnerabilities, however, cannot directly overwrite important data; therefore, existing methods cannot locate integer overflow vulnerable code effectively. Currently, existing analysis of integer overflow vulnerabilities is inefficient and time-consuming, as it is mostly conducted manually. In the present study, consequently, a novel method was proposed to locate vulnerable code of integer overflows. With a view to enhancing the efficiency of analysts, this method combines dynamic taint analysis and EFLAGS register comparison to decrease the number of instructions that must be inspected to locate the overflow point. On this basis, a system was implemented and several experiments were conducted to verify the proposed method. The results show that our method is effective and efficient.
Reasoning Decision Method Based on Improved Theory of Evidence
WANG Yong-wei,ZHAO Rong-cai,CHANG De-xian,LIU Yu-nan and SI Cheng
Computer Science. 2014, 41 (12): 24-29.  doi:10.11896/j.issn.1002-137X.2014.12.006
To address the Zadeh paradox problem in current reasoning methods based on evidence theory, a combination method based on consistency-conflict measurement and dynamic adjustment between intersection and union was proposed. First, considering both conflict and consistency, the concept of uncertainty was introduced, which can be used to discount multi-source evidence. Then, the new method combines multi-source evidence based on dynamically adjusted weights for the intersection and union. Thus, a decision can be obtained by maximum belief. Finally, experiments in MATLAB were made to compare the proposed method with typical combination methods. The experiments show that the proposed method is effective and can avoid generation of the paradox, and that it achieves better results in reasoning discrimination than typical methods.
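The Zadeh paradox that this paper targets arises from Dempster's classical combination rule, which renormalizes away conflicting mass. A minimal illustration of the standard rule (not the proposed improved method) shows the paradox directly:

```python
def dempster_combine(m1, m2):
    """Dempster's rule for mass functions over singleton hypotheses:
    agreeing mass is kept, conflicting mass is discarded, and the
    remainder is renormalized by 1 - conflict."""
    combined, conflict = {}, 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            if a == b:
                combined[a] = combined.get(a, 0.0) + pa * pb
            else:
                conflict += pa * pb
    k = 1.0 - conflict  # normalization constant
    return {h: v / k for h, v in combined.items()}, conflict

# Zadeh's example: two sources almost totally disagree, yet the rule
# assigns full belief to C, which both considered nearly impossible.
m1 = {"A": 0.99, "C": 0.01}
m2 = {"B": 0.99, "C": 0.01}
fused, conflict = dempster_combine(m1, m2)
```

Here the conflict is 0.9999, and after renormalization the fused belief in C is 1.0, the counter-intuitive outcome that discounting and weight-adjustment schemes like the one proposed above are designed to avoid.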
Trustworthy Architecture for Web Services
LIU Ling-xia,WANG Dong-xia and HUANG Min-huan
Computer Science. 2014, 41 (12): 30-32.  doi:10.11896/j.issn.1002-137X.2014.12.007
The security and trustworthiness issues of Web services are important factors that influence their development. Most of the existing solutions are from the point of view of security, and lack consideration of the fact that services need to keep working as expected when facing attacks or threats. In this paper, the notion of security was expanded from the requirements of Web services. The goal and the content of trustworthiness were proposed to meet these requirements, and a trustworthy architecture for Web services was proposed. The architecture is based on secure interaction, federated identity and distributed policies, and is supported by operating maintenance and shared mechanisms.
Early Warning Method for Microblog
LIU Gong-shen,MENG Kui and XIE Jing
Computer Science. 2014, 41 (12): 33-37.  doi:10.11896/j.issn.1002-137X.2014.12.008
This paper used users' characteristics on Sina Weibo to measure a user's influence on the propagation of microblogs, and then proposed an early warning method for microblogs. Firstly, we studied the basic characteristics of users whose microblogs lead to a large or a small number of reposts. We then found the characteristics that best discriminate between critical and non-critical users, and used a feature selection method based on information gain to quantify the discriminative power of each user characteristic. Secondly, based on a feature weighting model, a quantization method for a user's influence on the spread of microblogs was proposed. Thirdly, a warning method for microblogs was proposed: for a given newly released microblog, it sums up the influence values of the author and all other users who have already reposted the microblog, and when the value exceeds a certain threshold, it outputs a warning. The microblog warning method can effectively control the propagation and spread of sensitive microblogs.
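The information-gain criterion used above to rank user features is standard; a minimal sketch follows (the toy "verified" feature and labels are illustrative assumptions, not the Weibo dataset):

```python
import math

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_gain(feature_values, labels):
    """IG(feature) = H(label) - sum_v P(v) * H(label | feature = v)."""
    base = entropy(labels)
    n = len(labels)
    by_value = {}
    for v, y in zip(feature_values, labels):
        by_value.setdefault(v, []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in by_value.values())
    return base - cond

# Toy example: a feature that perfectly predicts "critical user" gains
# the full label entropy; a constant feature gains nothing.
labels   = [1, 1, 0, 0]   # 1 = critical user, 0 = non-critical
verified = [1, 1, 0, 0]
constant = [1, 1, 1, 1]
ig_v = information_gain(verified, labels)
ig_c = information_gain(constant, labels)
```

Ranking features by this gain and keeping the top-scoring ones is the selection step the abstract describes.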
Research on Service Trust Evaluation Approach under Cloud Computing Environment
WANG Jin-dong,WEI Bo,ZHANG Heng-wei and HE Jia-jing
Computer Science. 2014, 41 (12): 38-42.  doi:10.11896/j.issn.1002-137X.2014.12.009
In cloud computing, service resources are widely distributed, complex and changeable. The trust relationships among service entities are hard to establish and maintain, with high uncertainty. The randomness and fuzziness of services cannot be captured comprehensively by traditional trust evaluation approaches, so a service trust evaluation method based on a weighted multi-attribute cloud was proposed. A time decay factor was introduced to reflect the timeliness of trust, and a multi-attribute trust cloud was used to refine the evaluation granularity. In order to prevent collusive fraud and malicious attacks, the reliability and weight of each recommender were determined by the similarity of evaluations, and the trust rating was determined by cloud similarity calculation to provide security decisions for the user's service selection. Simulation results show that this method can obviously improve the success rate of service interactions, and that it can be applied to service trust evaluation under cloud computing environments.
Intrusion Detection System Based on Hybrid Immune Algorithm
FENG Xiang,MA Mei-yi,ZHAO Tian-ling and YU Hui-qun
Computer Science. 2014, 41 (12): 43-47.  doi:10.11896/j.issn.1002-137X.2014.12.010
Computer security systems and biological immune systems have much in common, so artificial immune algorithms can be applied in intrusion detection systems to solve various problems in the field of computer security. After studying the classical negative selection algorithm, it was discovered that its matching algorithm causes detection black holes. A novel hybrid immune algorithm was proposed to solve the intrusion detection problem, and the effectiveness and feasibility of the improved algorithm were verified. This paper partitioned the match string and set a different coefficient for each section, thus eliminating the problem that the r-contiguous-bits match algorithm has a constant match probability in the negative selection algorithm, and reducing the miss rate of the intrusion detection system. This paper also combined the negative selection algorithm with the clonal selection algorithm, which introduces reproduction, selection and crossover into the production of detectors and further reduces the miss rate. Finally, we compared and analyzed different parameters, including the section number, the threshold value and the r-contiguity parameter.
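The classical negative selection scheme this paper improves on can be sketched with the standard r-contiguous-bits rule (this shows the baseline rule, not the authors' segmented-coefficient variant; the self strings and r value are toy assumptions):

```python
def r_contiguous_match(detector, string, r):
    """True if detector and string agree in at least r contiguous positions."""
    assert len(detector) == len(string)
    run = 0
    for d, s in zip(detector, string):
        run = run + 1 if d == s else 0
        if run >= r:
            return True
    return False

def censor_detectors(candidates, self_set, r):
    """Negative selection: keep only detectors that match no self string,
    so surviving detectors fire only on non-self (anomalous) strings."""
    return [d for d in candidates
            if not any(r_contiguous_match(d, s, r) for s in self_set)]

self_set   = ["0000", "0011"]          # normal ("self") patterns
candidates = ["1111", "0001", "1100"]  # randomly generated detectors
detectors  = censor_detectors(candidates, self_set, r=3)
```

Here "0001" is censored because it shares three contiguous bits with "0000"; the surviving detectors are then used to flag traffic that matches none of the self set.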
Digital Image Watermarking Algorithm Based on Dispersed Chaotic Mapping System
CHEN He-shan,LV Zhen-zhen and LUO Wei
Computer Science. 2014, 41 (12): 48-52.  doi:10.11896/j.issn.1002-137X.2014.12.011
This paper designed a digital image watermarking algorithm based on a dispersed chaotic mapping system and the wavelet transform. In the watermark preprocessing step, the Logistic map produces a key to encrypt the grey values of the pixels to ensure the security of the watermark. In the embedding step, a three-level wavelet decomposition of the image is conducted, and the watermark is embedded in the high frequency wavelet coefficients according to the size of the wavelet coefficients. Analysis and experimental results show that the algorithm possesses relatively strong robustness, and better security and transparency than ordinary algorithms.
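The Logistic-map keystream used in the preprocessing step can be sketched as follows (the map parameter mu, the byte quantization and the XOR scheme are illustrative assumptions, not necessarily the paper's exact construction):

```python
def logistic_keystream(x0, n, mu=3.99):
    """Generate n chaotic bytes from the Logistic map x -> mu*x*(1-x).
    With mu near 4 the orbit is chaotic and highly sensitive to x0."""
    x, out = x0, []
    for _ in range(n):
        x = mu * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_encrypt(pixels, x0):
    """XOR grey values with the chaotic keystream; because XOR is an
    involution, decryption uses the same key x0."""
    ks = logistic_keystream(x0, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]

watermark = [12, 200, 37, 255, 0, 91]   # toy grey values
cipher    = xor_encrypt(watermark, x0=0.31)
restored  = xor_encrypt(cipher, x0=0.31)
```

The initial value x0 acts as the secret key: a receiver with the right x0 recovers the watermark exactly, while even a tiny perturbation of x0 yields a different keystream.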
Controlled Dense Coding Using Non-symmetric Channel between Multi-parties
ZHANG Cheng-xian,GUO Bang-hong,CHENG Guang-ming,GUO Jian-jun and LIU Song-hao
Computer Science. 2014, 41 (12): 53-56.  doi:10.11896/j.issn.1002-137X.2014.12.012
A scheme of controlled dense coding was proposed using a non-symmetric, high-dimensional quantum channel among multiple parties. The quantum channel and the amount of classical information in dense coding are controlled with the method of quantum measurement. By performing unitary operations and orthogonal quantum measurements on the quantum state, the quantum channel is purified. Thus, controlled dense coding is realized with some probability. This solves the extraction of the maximally entangled quantum state in the case of decoherence in a practical quantum channel. Dense coding between the N senders and the receiver is controlled successfully, and the efficiency of controlled dense coding is improved.
Efficient Lookup Method Based on Highly Available Peers in KAD
YAN He,LIU Wei,ZHANG Ge and CHENG Wen-qing
Computer Science. 2014, 41 (12): 57-59.  doi:10.11896/j.issn.1002-137X.2014.12.013
Lookup performance in KAD is affected by the dynamics of peer participation. By studying the KAD routing mechanism, we found that the number of times an ID appears in routing tables can serve as a measurement of peer availability. Based on highly available peers, we proposed a new lookup method for KAD peers. At first, the KAD routing information is collected by a crawler. Then the highly available peers are selected as lookup candidates. Experimental results show that, compared with the existing lookup scheme, our method can reduce the average lookup time by 60% and obtain 18% more files in lookup results.
Distributed Collision-resolvable MAC Protocol for Wireless LANs with Interference Cancellation Support
SHEN Hu,LV Shao-he,WANG Xiao-dong and ZHOU Xing-ming
Computer Science. 2014, 41 (12): 60-66.  doi:10.11896/j.issn.1002-137X.2014.12.014
Medium access control is critical to wireless network performance. We introduced a novel collision resolution method based on known-interference cancellation, and proposed a new MAC protocol named CR-MAC. Basically, the AP tries to decode all the data packets in a collision by combining partial retransmissions and known-interference cancellation. Hence, the collided transmissions can be fully utilized, and less retransmission is required. The simulation results show that CR-MAC performs much better than IEEE 802.11 DCF in terms of both aggregate throughput and expected packet delay under various network settings.
Framed Slotted ALOHA Anti-collision Algorithm Using Hybrid Spill-tree
XIA Jing-man,XIAO Guo-qiang,CHEN Kai and ZHAN Chun-mei
Computer Science. 2014, 41 (12): 67-69.  doi:10.11896/j.issn.1002-137X.2014.12.015
To improve the efficiency of electronic tag anti-collision algorithms in radio frequency identification systems, a new framed slotted ALOHA algorithm was proposed which takes advantage of accurate tag estimation and a hybrid spill tree. The algorithm includes two stages: tag estimation and tag recognition. In the tag estimation stage, the initial frame slot size is optimized by accurately estimating the number of tags. In the tag recognition stage, the collided tags in a slot are recognized rapidly by using the improved hybrid spill tree search algorithm. Experimental results show that the algorithm can effectively improve the performance of RFID anti-collision and increase the efficiency of tag recognition in an RFID tag identification system.
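The reason accurate tag estimation matters in framed slotted ALOHA is the standard result that slot efficiency peaks when the frame size equals the tag count; a quick check of that result (this is the textbook analysis, not the authors' estimator):

```python
def aloha_efficiency(n_tags, frame_size):
    """Expected fraction of slots holding exactly one tag reply when
    each of n_tags picks one of frame_size slots uniformly at random:
    n * (1/L) * (1 - 1/L)**(n - 1)."""
    p = 1.0 / frame_size
    return n_tags * p * (1.0 - p) ** (n_tags - 1)

def best_frame_size(n_tags, sizes):
    """Pick the candidate frame size with the highest slot efficiency."""
    return max(sizes, key=lambda size: aloha_efficiency(n_tags, size))

# With 64 tags, a 64-slot frame beats both smaller frames (too many
# collisions) and larger frames (too many empty slots).
best = best_frame_size(64, [16, 32, 64, 128, 256])
```

An under- or over-estimate of the tag population therefore directly wastes slots, which is why the estimation stage precedes recognition in the proposed algorithm.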
Uncertain Data PT-Top k Query Processing in Wireless Sensor Network
MAO Ying-chi,WANG Kang,REN Dao-ning and WANG Jiu-long
Computer Science. 2014, 41 (12): 70-77.  doi:10.11896/j.issn.1002-137X.2014.12.016
In widespread wireless sensor network applications, due to the quality of sensors and environmental factors, sensor readings are inherently uncertain. With the introduction of the probability dimension in uncertain data, query processing technologies for uncertain data become more and more difficult, and the types of uncertain data queries have become richer. The uncertain data Top-k query is one of the typical query tasks for uncertain data. Considering the energy consumption and query response time in wireless sensor networks, an uncertain data PT-Top k query processing scheme was studied in a hierarchically structured wireless sensor network. Based on the x-tuple rule of uncertain data, using two-phase intra-cluster and inter-cluster query processing, a distributed Two-Phase PT-Top k Query Processing approximation algorithm (TPQP) was proposed. Finally, extensive experimental results show that the proposed TPQP can reduce the transmission consumption and query response time in terms of the probability p, the rank k and the data volume.
Novel Algorithm of Blind Source Separation with Temporal Structure Based on Givens Transformation Matrix
ZHAO Li-xiang and LIU Guo-qing
Computer Science. 2014, 41 (12): 78-81.  doi:10.11896/j.issn.1002-137X.2014.12.017
Independent Component Analysis (ICA) is an efficient method to solve the Blind Source Separation (BSS) problem with temporal structure. The key of ICA for whitened observation signals is to find an orthogonal matrix that removes high-order redundancy between components. For this problem, we proposed a parametric representation of orthogonal matrices in arbitrary dimension using Givens transformation matrices. Based on this, a new separation algorithm was proposed. Firstly, we decreased the number of parameters to be estimated by parameterizing the orthogonal matrix with Givens transformation matrices. Secondly, we converted the BSS problem into an unconstrained optimization problem, where the objective function is the joint approximate diagonalization of multistep delayed covariance matrices. In order to estimate the parameters of the orthogonal matrix, the BFGS quasi-Newton algorithm was applied to solve the unconstrained optimization problem. Finally, separation of real mixed voice signals shows the effectiveness of our algorithm.
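The Givens parameterization referred to above builds an n-dimensional rotation from n(n-1)/2 plane-rotation angles, turning the orthogonality constraint into an unconstrained angle estimation problem; a minimal sketch of the construction (the angle values and ordering of planes are illustrative assumptions):

```python
import math

def givens(n, i, j, theta):
    """n x n Givens rotation acting in the (i, j) coordinate plane."""
    g = [[float(r == c) for c in range(n)] for r in range(n)]
    c, s = math.cos(theta), math.sin(theta)
    g[i][i], g[j][j] = c, c
    g[i][j], g[j][i] = -s, s
    return g

def matmul(a, b):
    n = len(a)
    return [[sum(a[r][k] * b[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

def orthogonal_from_angles(n, angles):
    """Build an orthogonal matrix as the product of n*(n-1)/2 Givens
    rotations, one angle per coordinate plane (i, j), i < j."""
    q = [[float(r == c) for c in range(n)] for r in range(n)]
    it = iter(angles)
    for i in range(n):
        for j in range(i + 1, n):
            q = matmul(q, givens(n, i, j, next(it)))
    return q

q = orthogonal_from_angles(3, [0.3, -1.1, 2.0])
# Orthogonality holds by construction: Q^T Q is the identity.
qtq = matmul([list(col) for col in zip(*q)], q)
```

Because any angle vector yields an exactly orthogonal Q, a gradient method such as BFGS can optimize the diagonalization objective over the angles with no explicit constraint.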
Study on Micro Blog Reposting Model Based on Characteristics of Information Obsolescence
YANG Zi-long,HUANG Shu-guang,WANG Zhen,LI Yong-cheng and XIAO Jia
Computer Science. 2014, 41 (12): 82-85.  doi:10.11896/j.issn.1002-137X.2014.12.018
With the rapid development of micro blogs, extracting the characteristics of message propagation and constructing propagation models have become a hot topic. Focusing on users' reposting behavior, we first analyzed the structure of message reposting and extracted the characteristics of information obsolescence. Then we proposed an improved SIR model based on the law of diminishing reposting probability, combined with the time-effectiveness of the message. At last, we used real reposting data to prove the validity of our model. The results show that taking the time-effectiveness and information obsolescence of messages into account can fit the progress of message propagation well. Furthermore, we took advantage of this model to analyze the scale-free property of the vertex influence distribution.
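An SIR model with a reposting probability that diminishes over time, as described above, can be sketched in discrete form (the hyperbolic decay law and all parameter values here are illustrative assumptions, not the paper's fitted model):

```python
def sir_with_decay(n, i0, beta0, gamma, decay, steps):
    """Discrete SIR dynamics where the infection (reposting) rate decays
    as beta(t) = beta0 / (1 + decay * t), modeling information
    obsolescence: old messages are reposted less and less."""
    s, i, r = float(n - i0), float(i0), 0.0
    history = [(s, i, r)]
    for t in range(steps):
        beta = beta0 / (1.0 + decay * t)
        new_infected = beta * s * i / n   # expected new reposters
        recovered = gamma * i             # users who stop spreading
        s -= new_infected
        i += new_infected - recovered
        r += recovered
        history.append((s, i, r))
    return history

hist = sir_with_decay(n=10000, i0=10, beta0=0.8, gamma=0.2,
                      decay=0.3, steps=60)
```

With the decaying rate, the spreading population rises, peaks early and then dies out even while many users remain susceptible, which is the qualitative behavior obsolescence-aware models are meant to capture.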
Research on Ad hoc Network Hybrid Flow Congestion Control Based on Rate-limiting
CHEN Liang and XU Yang
Computer Science. 2014, 41 (12): 86-90.  doi:10.11896/j.issn.1002-137X.2014.12.019
AQM (Active Queue Management) is based on the TCP feedback mechanism, so it fails to control UDP flows when managing hybrid TCP and UDP traffic; as a result, non-video flows degrade video UDP transmission quality. An Ad hoc network TCP/UDP AQM model was derived based on the TCP window principle and the hybrid-flow queuing mechanism. Then a PI (Proportional Integral) AQM algorithm based on UDP rate-limiting was proposed. The algorithm marks the priority of non-video UDP packets according to the difference between the actual rate and the committed rate, and drops these packets in order from low to high priority. NS simulations show that, compared with PI control, the new algorithm restrains non-video UDP flows and improves video transmission quality by 0.98 dB in Peak Signal to Noise Ratio (PSNR).
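The PI controller at the core of such AQM schemes updates the drop probability from the deviation of the queue length from a target; a generic sketch (the gains and queue setpoint are illustrative values from the common PI-AQM literature, not the paper's parameters):

```python
class PIAqm:
    """PI active queue management: the packet drop probability is driven
    by the deviation of the instantaneous queue length from a target."""
    def __init__(self, q_ref, a=0.00182, b=0.00176):
        self.q_ref, self.a, self.b = q_ref, a, b
        self.p = 0.0
        self.q_prev = float(q_ref)

    def update(self, q):
        # p(k) = p(k-1) + a*(q(k) - q_ref) - b*(q(k-1) - q_ref),
        # clamped to a valid probability.
        self.p += (self.a * (q - self.q_ref)
                   - self.b * (self.q_prev - self.q_ref))
        self.p = min(1.0, max(0.0, self.p))
        self.q_prev = q
        return self.p

aqm = PIAqm(q_ref=100)
# A persistently overfull queue drives the drop probability upward,
# which in turn throttles responsive (TCP) flows.
probs = [aqm.update(300) for _ in range(50)]
```

The paper's contribution layers UDP rate-limiting on top of this loop, since unresponsive UDP flows ignore the drop signal that the PI controller relies on.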
Big Data Reliable Transmission Control Mechanism Based on Power Function Curve and Network Coding for Wireless Multimedia Sensor Networks
YU Hong and ZHU Li-li
Computer Science. 2014, 41 (12): 91-94.  doi:10.11896/j.issn.1002-137X.2014.12.020
This paper proposed a reliable big-data transmission control mechanism based on collaborative forecasting with power function regression curves and random linear network coding, to provide high reliability, resource utilization and quality of service for multimedia data transmission in wireless multimedia sensor networks. Firstly, according to the data characteristics and the dynamic multimedia network topology, a prediction model was presented based on the power function regression curve. Then video frames were used as the unit of network coding at the network layer and the physical layer. Finally, a reliable big-data transmission control mechanism based on collaborative prediction and network coding was established. Through mathematical analysis and simulation, the proposed mechanism was evaluated by comparing its system performance with traditional mechanisms. The results prove the superiority of the proposed mechanism.
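A power-function regression curve y = a * x^b, of the kind used for the prediction model above, is typically fitted by linear least squares in log-log space; a minimal sketch (the sample data are synthetic, not the paper's measurements):

```python
import math

def fit_power_curve(xs, ys):
    """Fit y = a * x**b by least squares on log(y) = log(a) + b*log(x).
    Requires strictly positive xs and ys."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Exact power-law data y = 2.5 * x**0.8 is recovered up to rounding.
xs = [1, 2, 4, 8, 16]
ys = [2.5 * x ** 0.8 for x in xs]
a, b = fit_power_curve(xs, ys)
```

Once a and b are estimated from recent observations, the curve extrapolates the next values, which is the forecasting role the regression plays in the proposed mechanism.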
Efficient Random Modulation Privacy-preserving MAX/MIN Query Protocol in Two-tiered Wireless Sensor Networks
LIU Hong-hui,LIU Shu-bo,LIU Meng-jun and CAI Zhao-hui
Computer Science. 2014, 41 (12): 95-100.  doi:10.11896/j.issn.1002-137X.2014.12.021
Privacy preservation has always been a hot research area in wireless sensor networks (WSNs), and it includes privacy-preserving MAX/MIN query protocols. To address the problem of privacy-preserving MAX/MIN queries, this paper first proposed a numeric comparison method, based on random numbers and a numerical map, that does not leak the raw values. With this comparison method and cryptography, we proposed an efficient random modulation privacy-preserving MAX/MIN query protocol (ERM-MQP) in two-tiered wireless sensor networks. In ERM-MQP, sensors modulate the sampled data with random numbers to compute privacy-preserving data, and the storage nodes search for the privacy-preserving MAX/MIN value. The Sink recovers the privacy-preserving MAX/MIN data and obtains the MAX/MIN of the sampled data in the end. All data is encrypted before transmission during the query process. Finally, according to the results of security and energy analysis, and an experimental comparison of energy consumption with existing methods, ERM-MQP is secure and needs less energy.
Non-interactive Key Exchange Protocol Based on Certificateless Public Key Cryptography
WEI Yun,WEI Fu-shan and MA Chuan-gui
Computer Science. 2014, 41 (12): 101-106.  doi:10.11896/j.issn.1002-137X.2014.12.022
A non-interactive key exchange (NIKE) allows two parties to establish a shared key without further communication. In ID-based non-interactive key exchange (ID-NIKE), the PKG (private key generator) knows users' private keys, so it can calculate the shared key between two participants; this is the key escrow problem. In this paper, the first security model for certificateless non-interactive key exchange was proposed, and then a certificateless non-interactive key exchange scheme was given. The new scheme is proven secure in the Random Oracle Model based on the hardness of the bilinear Diffie-Hellman (BDH) assumption. It is the first non-interactive key exchange scheme based on certificateless public key cryptography (CL-PKC), combining the advantages of CL-PKC and NIKE. Thus the center cannot calculate the shared key, which solves the key escrow problem in ID-NIKE. In particular, our scheme tolerates partial private key leakage, so it is more secure than other related schemes.
Network Intrusion Detection Algorithm Based on HHT with Shift Hierarchical Control
ZHANG Wu-mei and CHEN Qing-zhang
Computer Science. 2014, 41 (12): 107-111.  doi:10.11896/j.issn.1002-137X.2014.12.023
Against a strong interference background and at low signal-to-noise ratio, accurate detection of potential network intrusion signals is the key to network security. The traditional Hilbert-Huang transform (HHT) intrusion signal detection algorithm suffers from boundary control errors resulting from envelope distortion, and spectrum leakage occurs, which leads to poor detection performance. An improved detection algorithm was proposed based on time-frequency distribution features and HHT matching with shift hierarchical control. A mathematical evolution model of potential network intrusion is constructed, and the complex signals are decomposed into single-frequency IMF components. The state transfer equation of the intrusion detection system is obtained. Discrete analytical processing of the intrusion signal is performed based on the Hilbert transform, and the signal model is obtained. The intrusion signal is decomposed by empirical mode decomposition, and the IMF components are analyzed via the Hilbert spectrum. The HHT frequency shift is adjusted by a hierarchical control mechanism, and the residual projection and the Hilbert marginal spectrum of the intrusion signal are matched. Envelope distortion is reduced and spectral leakage is suppressed, so that accurate detection and parameter estimation of intrusion signals are achieved. Experiments show that this algorithm has strong anti-interference performance in intrusion signal detection, can effectively detect intrusion signals with low SNR, and improves detection performance.
Automatic Verification for Multi-protocol Attacks by Improving Athena
LIU Wei,GUO Yuan-bo,LEI Xin-feng and LI Jun-feng
Computer Science. 2014, 41 (12): 112-117.  doi:10.11896/j.issn.1002-137X.2014.12.024
Protocol security in multi-protocol environments is an open issue in the formal analysis of security protocols. Aiming at this problem, an automatic verification method for multi-protocol attacks was proposed based on the Athena algorithm. The state representation and successor state generation algorithm of Athena are extended so that the attacker can intercept messages from one protocol and insert messages it generates into another protocol. Some state reduction rules are introduced. The method can verify whether a multi-protocol attack exists. The experimental results show that the method can implement automatic verification of multi-protocol attacks.
Image Region Cloning Authentication Algorithm Based on Local Invariant Feature and Outlier Detection
LE De-guang,JIANG Nan,ZHENG Li-xin and LI Xiao-chao
Computer Science. 2014, 41 (12): 118-124.  doi:10.11896/j.issn.1002-137X.2014.12.025
In order to deal with the security problem of image region cloning, this paper proposed an image region cloning authentication algorithm based on local invariant features and outlier detection. Firstly, the algorithm detects points with local invariant features in the scale space of the image using the DoG function. Secondly, it detects image region cloning by local invariant feature matching, using the Euclidean distance as the similarity measurement and the nearest-neighbor distance ratio as the match policy. Finally, the algorithm verifies the match results by means of outlier detection. The test results show that the algorithm not only can detect the cloned region of an image correctly, but also can withstand different attacks on the cloned image efficiently. Besides, the detection speed for cloned images is higher compared with other methods.
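The nearest-neighbor distance-ratio matching step can be sketched independently of the DoG detector (the descriptors below are toy 2-D vectors rather than real local-feature descriptors, and the 0.8 threshold follows Lowe's common choice; both are assumptions, not the paper's settings):

```python
import math

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def ratio_test_matches(descs_a, descs_b, ratio=0.8):
    """Match each descriptor in A to its nearest neighbor in B, keeping
    the match only if nearest / second-nearest distance < ratio. The
    ratio test rejects ambiguous matches in cluttered regions."""
    matches = []
    for i, d in enumerate(descs_a):
        dists = sorted((euclidean(d, e), j) for j, e in enumerate(descs_b))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d2 > 0 and d1 / d2 < ratio:
            matches.append((i, j1))
    return matches

# Each descriptor in A has one clearly closest partner in B, so both
# survive the ratio test.
a = [(0.0, 0.0), (5.0, 5.0)]
b = [(0.1, 0.0), (5.0, 5.1), (9.0, 9.0)]
matches = ratio_test_matches(a, b)
```

In copy-move detection, pairs that pass this test inside the same image are candidate clone correspondences, which the outlier-detection stage then filters.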
Improved Ultra-lightweight Authentication of Ownership Transfer Protocol for RFID Tag
SHEN Jin-wei and LING Jie
Computer Science. 2014, 41 (12): 125-128.  doi:10.11896/j.issn.1002-137X.2014.12.026
Aiming at the security holes of current ownership transfer protocols, an ultra-lightweight RFID mutual authentication ownership transfer protocol was proposed. A formal proof of the proposed authentication protocol was given based on GNY logic. It improves the confidentiality of communications between the reader and the tag in an open environment, which not only solves the Denial of Service (DoS) problems produced when an attacker replays messages, but also achieves mutual authentication between the reader and the tag by changing the protocol interaction. This paper analyzed the security and compared the performance of the protocol. The results show that the proposed protocol not only satisfies the security requirements of ownership transfer, but also has ultra-lightweight characteristics, which indicates that it is suitable for real mobile authentication environments.
Millionaires’ Protocol Based on Game Theory
FENG Yun-zhi and ZHANG En
Computer Science. 2014, 41 (12): 129-132.  doi:10.11896/j.issn.1002-137X.2014.12.027
In the setting of the classical millionaires' problem, one party may tell the other party a wrong value, and has no incentive to tell the true comparative result. Combining game theory and cryptography, this paper proposed a millionaires' protocol in which a participant's payoff for following the protocol is greater than the payoff for deviating. Abiding by the protocol is thus a best strategy for each participant, and any cheating by a millionaire can be detected, so a rational party has an incentive to abide by the protocol. Finally, every party can obtain the comparative result of wealth.
Convolution Tree Kernel Based Sentiment Element Recognition Approach for Chinese Microblog
CHEN Feng,CHAO Wen-han,ZHOU Qing and LI Zhou-jun
Computer Science. 2014, 41 (12): 133-137.  doi:10.11896/j.issn.1002-137X.2014.12.028
Sentiment element recognition is one of the key sub-tasks of sentiment analysis; its goal is to identify the sentiment targets in text. Sentiment target recognition is regarded as the most fine-grained sentiment analysis task, and many researchers have conducted extensive research on it. Since Chinese Microblog text is short and very flexible, often non-standard and containing a lot of noisy information, it brings new challenges to Chinese Microblog sentiment analysis research. At present, most sentiment target recognition methods are based on rules or on statistical learning with flat features, which cannot distinguish noisy information from sentiment targets very well, resulting in low recognition performance. According to the characteristics of Chinese Microblogs, a novel sentiment element recognition approach based on the convolution tree kernel was proposed. Firstly, the approach analyzes the parts of speech (POS) and dependency relationships of Microblog sentences, and takes the nouns in the sentences as candidate sentiment elements. Secondly, it adopts two different pruning strategies to obtain every candidate's structured information. Finally, the convolution tree kernel method is used to calculate the similarity of dependency trees, which is the foundation of sentiment element recognition. Experiments on the NLP&CC 2012 and NLP&CC 2013 Chinese Microblog sentiment target analysis tasks show that the performance of this approach improves significantly over the baseline.
Prediction of Latent Trust Relationships in E-commerce
MA Xiao,GAN Zao-bin,LU Hong-wei and MA Yao
Computer Science. 2014, 41 (12): 138-142.  doi:10.11896/j.issn.1002-137X.2014.12.029
Abstract PDF(407KB) ( 650 )   
References | Related Articles | Metrics
Trust relationships play an important role in helping users collect reliable information. Existing trust relationship prediction methods are mainly based on trust transitivity and user similarity. However, in e-commerce systems, reviews from users with different reputations may have different impacts on other users' purchasing behavior, and hence on trust relationship prediction. Therefore, this paper formally described a user-trust relationship sub-network and a user-product-review relationship sub-network in e-commerce systems. Following sociological theory, a trust relationship prediction method based on user similarity and global reputation was proposed, which aims at discovering latent trust relationships between unfamiliar users and helping users judge the reliability of rating information in order to choose reliable products. Comparative experiments on trust relationship prediction were performed on the Epinions dataset. The experimental results show that the proposed method has better trust prediction accuracy.
Research on Collaborative Mining Algorithm on Homologous Data
WANG Yong,LV Ke and PAN Wei-guo
Computer Science. 2014, 41 (12): 143-147.  doi:10.11896/j.issn.1002-137X.2014.12.030
Abstract PDF(920KB) ( 599 )   
References | Related Articles | Metrics
This article explored the issues of knowledge management and of improving the interpretability of data mining models, and proposed the collaborative mining algorithm (CMA), which performs pattern evaluation and knowledge management based on collaborative mining of homologous data. In contrast to ensemble learning, which combines learning models of the same type, collaborative mining builds learning models of different types on homologous data, each model owning a different form of knowledge rules. Through comparative study, consistent knowledge rules are formed. Experiments show that collaborative mining can efficiently find the latent information in data and improve the performance of knowledge management.
Attribute Reduction with Principle of Minimum Correlation and Maximum Dependency
ZHAI Jun-hai,WAN Li-yan and WANG Xi-zhao
Computer Science. 2014, 41 (12): 148-150.  doi:10.11896/j.issn.1002-137X.2014.12.031
Abstract PDF(323KB) ( 886 )   
References | Related Articles | Metrics
In classical rough set theory, the significance-based reduction algorithm for decision tables only considers the dependency between decision attributes and condition attributes, and ignores the correlation among the condition attributes in the reduct. A reduct calculated with this kind of algorithm may therefore include redundant attributes. To deal with this problem, an improved algorithm was proposed in this paper, which calculates the reduct under the principle of minimum correlation and maximum dependency. Compared with the significance-based reduction algorithm, fewer attributes remain in the reducts calculated by the proposed algorithm, and the redundancy of the reduct is smaller. The experimental results show that the proposed algorithm outperforms the significance-based reduction algorithm for decision tables.
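The selection principle above can be illustrated as a greedy search: at each step, add the condition attribute with the greatest dependency gain, breaking ties in favor of the candidate least correlated with the attributes already chosen. The following sketch is our own minimal illustration, not the paper's implementation; the purity-based `correlation` measure and the toy decision table are assumptions:

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into equivalence classes over the given attributes."""
    blocks = defaultdict(list)
    for i, r in enumerate(rows):
        blocks[tuple(r[a] for a in attrs)].append(i)
    return blocks.values()

def dependency(rows, cond, dec):
    """Rough-set dependency degree: fraction of objects in the positive region."""
    pos = sum(len(b) for b in partition(rows, cond)
              if len({rows[i][dec] for i in b}) == 1)
    return pos / len(rows)

def correlation(rows, a, b):
    """Symmetric purity-based correlation between two condition attributes (our assumption)."""
    return (dependency(rows, [a], b) + dependency(rows, [b], a)) / 2

def reduct(rows, conds, dec):
    """Greedy reduct: maximum dependency gain, ties broken by minimum correlation."""
    red, target = [], dependency(rows, conds, dec)
    while dependency(rows, red, dec) < target:
        def score(c):
            gain = dependency(rows, red + [c], dec)
            corr = sum(correlation(rows, c, r) for r in red) / len(red) if red else 0.0
            return (-gain, corr)
        red.append(min((c for c in conds if c not in red), key=score))
    return red
```

On a toy table where attribute `b` duplicates `a`, the redundant copy never enters the reduct, which is the behavior the minimum-correlation tie-break is meant to encourage.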
Perturbation Guided Ant Colony Optimization
DUAN Xi,YANG Qun,CHEN Bing and LI Yuan-zhen
Computer Science. 2014, 41 (12): 151-154.  doi:10.11896/j.issn.1002-137X.2014.12.032
Abstract PDF(334KB) ( 707 )   
References | Related Articles | Metrics
Hybridizing Ant Colony Optimization (ACO) with Guided Local Search (GLS) can address the problem that ACO is easily trapped in local optima. However, such hybrids still tend to converge prematurely to suboptimal solutions. This paper presented a Perturbation Guided Ant Colony Optimization (PGACO) algorithm to avoid this problem. When the algorithm is prone to premature convergence, the proposed perturbation method moves the current solution into a neighboring solution space to build new global optimal solutions. The experimental results show that PGACO effectively avoids premature convergence to suboptimal solutions, generates better solutions, and has stronger global search capability.
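The perturbation step can be pictured as a simple kick move: when stagnation is detected, the incumbent tour is displaced into a neighboring solution space. The segment-reversal move and the `strength` parameter below are our illustrative assumptions, not the paper's exact operator:

```python
import random

def perturb(tour, strength=3, seed=None):
    """Kick the incumbent tour into a neighboring solution space
    by reversing `strength` randomly chosen segments."""
    rnd = random.Random(seed)
    t = list(tour)
    for _ in range(strength):
        i, j = sorted(rnd.sample(range(len(t)), 2))
        t[i:j + 1] = reversed(t[i:j + 1])  # 2-opt-style segment reversal
    return t
```

A single reversal of distinct cities always changes the visiting order, so the search is guaranteed to leave the incumbent solution while staying in a nearby region of the solution space.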
Research on Optimization of Sorting Algorithm under MapReduce
JIN Jing
Computer Science. 2014, 41 (12): 155-159.  doi:10.11896/j.issn.1002-137X.2014.12.033
Abstract PDF(379KB) ( 691 )   
References | Related Articles | Metrics
MapReduce is the standard parallel computing model for big data analysis. Ideally, a MapReduce system should keep all nodes involved in the computation well load-balanced, and minimize space usage as well as CPU, disk and network overhead. These principles are, in effect, the design guidelines for MapReduce algorithms, yet few optimization studies achieve all these metrics simultaneously. To solve this problem, this paper first presented criteria for optimal MapReduce algorithms, which should retain excellent parallelism while optimizing all metrics simultaneously. We then studied sorting, the most important algorithm in data processing, and proposed an optimal sorting algorithm under the MapReduce paradigm. Finally, we demonstrated its effectiveness and efficiency by theory and experiments.
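The load-balancing idea behind MapReduce sorting can be illustrated by a range-partitioned sort in the TeraSort style: sample the input to choose splitters, route each record to the reducer owning its key range, sort each range locally, and concatenate. This single-process simulation is our own sketch, not the paper's algorithm; the function and parameter names are illustrative:

```python
import random

def mr_sort(data, reducers=4, sample_size=64):
    """Range-partitioned sort in the TeraSort style, simulated in one process."""
    # "Map" side: sample the input to choose splitters that balance the reducers.
    sample = sorted(random.sample(data, min(sample_size, len(data))))
    splitters = [sample[(i + 1) * len(sample) // reducers - 1]
                 for i in range(reducers - 1)]
    # Shuffle: route each record to the reducer that owns its key range.
    buckets = [[] for _ in range(reducers)]
    for x in data:
        buckets[sum(x > s for s in splitters)].append(x)
    # "Reduce" side: each bucket sorts independently; the ranges are disjoint
    # and ordered, so concatenating the sorted buckets is globally sorted.
    return [x for bucket in buckets for x in sorted(bucket)]
```

Sampling keeps the per-reducer load roughly even, which is exactly the balance criterion the abstract names.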
Multi-label Learning Algorithm Based on Granular Computing
ZHAO Hai-feng,YU Qiang and CAO Yu-dan
Computer Science. 2014, 41 (12): 160-163.  doi:10.11896/j.issn.1002-137X.2014.12.034
Abstract PDF(322KB) ( 677 )   
References | Related Articles | Metrics
Multi-label learning deals with the problem that each instance is associated with multiple labels. The existing lazy-learning-based multi-label algorithm IMLLA does not fully consider the distribution of instances: when building the nearest neighbor set of each instance, the number of neighbors is a fixed constant k. This may rule instances with higher similarity out of the nearest neighbor set, or capture instances with lower similarity into it, which affects classification performance. In this article, an improved multi-label lazy learning algorithm combined with the idea of granular computing was proposed. The nearest neighbor set of each instance is built by controlling the granularity, so that the instances in each nearest neighbor set exhibit high similarity. Experimental results show that the performance of our algorithm is superior to IMLLA.
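The core change — a similarity-threshold neighbor set instead of a fixed k — can be sketched as follows. This is our own minimal illustration under that reading; the cosine similarity, the `grain` threshold and the toy data are assumptions, not the paper's method:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def predict_labels(x, instances, labels, grain=0.9):
    """Granularity-controlled lazy prediction: every instance at least
    `grain`-similar to x is a neighbor, however many there are; each
    label is predicted by majority vote over that neighbor set."""
    nbrs = [i for i, inst in enumerate(instances) if cosine(x, inst) >= grain]
    if not nbrs:
        return set()
    return {lab for lab in set().union(*labels)
            if sum(lab in labels[i] for i in nbrs) * 2 > len(nbrs)}
```

Raising `grain` shrinks the neighbor set to only highly similar instances, which is the effect the granular-computing formulation aims for.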
Research on Fuzzy Rough Sets Based Rule Induction Methods for Healthcare Data
LIU Yang,ZHANG Zhuo and ZHOU Qing-lei
Computer Science. 2014, 41 (12): 164-167.  doi:10.11896/j.issn.1002-137X.2014.12.035
Abstract PDF(335KB) ( 725 )   
References | Related Articles | Metrics
Healthcare databases typically contain numerous attributes, with both continuous and discrete attributes in hybrid data, which greatly limits the mining efficiency of knowledge discovery on healthcare data. Based on fuzzy rough set theory, we studied classification rule mining methods for hybrid data. By introducing generalization thresholds into the rule induction algorithm, the proposed method can reduce the size of the extracted rule set and the complexity of the rules, which may improve the classification efficiency of rough-set-based knowledge discovery on healthcare data. Finally, we conducted comparative experiments on medical decision tables to verify the effectiveness of the rules mined by the proposed algorithm.
Query Expansion Method Based on Hidden Markov Model
JIAO Jian and ZHANG Yang-sen
Computer Science. 2014, 41 (12): 168-171.  doi:10.11896/j.issn.1002-137X.2014.12.036
Abstract PDF(428KB) ( 753 )   
References | Related Articles | Metrics
Automatic query expansion is a main technique for improving retrieval performance by identifying the potential intentions of users. In this paper, a method for identifying users' potential intentions based on hidden Markov models was proposed. The model is trained on a large volume of query logs provided by the Sogou laboratory. Experiments show that the proposed method achieves significant improvements in retrieval accuracy over other query expansion methods.
Integration of Dual-layer Fuzzy System with Center-constrained Minimal Enclosing Ball
XU Hua
Computer Science. 2014, 41 (12): 172-175.  doi:10.11896/j.issn.1002-137X.2014.12.037
Abstract PDF(276KB) ( 629 )   
References | Related Articles | Metrics
This paper used the CTSK (Centralized TSK) fuzzy system, an improved dual-layer TSK fuzzy system. Compared with the traditional TSK fuzzy system, CTSK has better interpretability, stronger robustness and better approximation capability. However, its time and space complexity greatly limits its application to large or very large data sets. In light of this limitation, a new algorithm, CCMEB-CTSK (CCMEB-based CTSK), was proposed for large data sets. The algorithm preserves the advantages of the CTSK fuzzy system while handling large and very large sample data sets efficiently and quickly. Through simulation experiments, we analyzed the performance indices and running time of CCMEB-CTSK under different numbers of fuzzy rules, as well as its generalization capability and performance robustness on noiseless and noisy training samples.
Collaborative Filtering Recommendation Algorithm Based on Improved User Clustering
ZHANG Jun-wei and YANG Zhou
Computer Science. 2014, 41 (12): 176-178.  doi:10.11896/j.issn.1002-137X.2014.12.038
Abstract PDF(244KB) ( 712 )   
References | Related Articles | Metrics
To reduce the computation time of group user recommendation, this paper proposed a collaborative filtering recommendation algorithm based on improved k-means clustering. Because the rating data is sparse, traditional clustering methods perform poorly when dividing users into groups. Considering that the correlation between users around the cluster centers of the traditional k-means algorithm is not high, this paper first clusters the users, then computes the recommendation results for each user within its cluster, making full use of information transfer between users to enhance sharing within the group, and finally aggregates the recommendation results of all users in the group. Simulation results show that the proposed method can effectively improve recommendation accuracy and is more effective than the traditional collaborative filtering algorithm.
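The pipeline — cluster the users first, then recommend within each cluster — can be sketched as below. This is our own minimal illustration; the toy rating matrix (0 meaning "unrated") and the peer-average predictor are assumptions, not the paper's exact method:

```python
import random

def kmeans(users, k, iters=20, seed=1):
    """Cluster user rating vectors with plain k-means; returns cluster assignments."""
    rnd = random.Random(seed)
    centers = [list(c) for c in rnd.sample(users, k)]
    assign = [0] * len(users)
    for _ in range(iters):
        for i, u in enumerate(users):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2 for a, b in zip(u, centers[c])))
        for c in range(k):
            members = [users[i] for i in range(len(users)) if assign[i] == c]
            if members:  # recompute the centroid of each non-empty cluster
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

def predict(users, assign, user, item):
    """Predict a rating as the mean rating given by the user's cluster peers."""
    peers = [users[i] for i in range(len(users))
             if assign[i] == assign[user] and i != user and users[i][item] > 0]
    return sum(p[item] for p in peers) / len(peers) if peers else 0.0
```

Restricting the prediction to one cluster is what cuts the computation time: only peers in the same cluster are scanned, not the whole user base.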
Improved PageRank Algorithm Based on Links and User Feedback
CAO Shan-shan and WANG Chong
Computer Science. 2014, 41 (12): 179-182.  doi:10.11896/j.issn.1002-137X.2014.12.039
Abstract PDF(605KB) ( 818 )   
References | Related Articles | Metrics
Based on the PageRank algorithm, this paper proposed an improved algorithm named Bias PageRank (BPR), which takes into consideration not only the link structure between pages but also users' feedback information, such as click frequency and the interval since the most recent click. By comprehensively analyzing this information, the BPR algorithm can, to a certain extent, promote pages of high quality and demote pages of poor quality. Experiments indicate that the BPR algorithm improves the ranking result and users' satisfaction.
Adjustable Fuzzy Rough Set:Model and Attribute Reduction
SONG Jing-jing,YANG Xi-bei,QI Yong and QI Yun-song
Computer Science. 2014, 41 (12): 183-188.  doi:10.11896/j.issn.1002-137X.2014.12.040
Abstract PDF(433KB) ( 642 )   
References | Related Articles | Metrics
Fuzzy rough sets extend classical rough sets to meet the requirements of practical applications. However, many existing fuzzy rough set models only use simple, non-adjustable fusions of a set of binary relations. To solve this problem, an adjustable fuzzy rough set was proposed by using a parameterized binary operator. Moreover, taking the approximation quality as a measurement, a heuristic algorithm was used to calculate the reduct of the adjustable fuzzy rough set. Finally, the approximation quality and the reduct of the adjustable fuzzy rough set were compared with those of the strong and weak fuzzy rough sets, respectively. The experimental results show that the adjustable fuzzy rough set is a generalization of both the strong and the weak fuzzy rough sets.
Close-degree of Ordered Information Systems and Attribute Reduction Algorithm
MENG Hui-li,ZHAO Xiao-yan and XU Jiu-cheng
Computer Science. 2014, 41 (12): 189-191.  doi:10.11896/j.issn.1002-137X.2014.12.041
Abstract PDF(223KB) ( 645 )   
References | Related Articles | Metrics
In ordered information systems based on dominance relations,the close-degree of dominance classes under different attribute sets was defined,and then the close-degree of different attribute sets was also defined.The heuristic attribute reduction algorithm based on the close-degree of attribute sets was designed.The validity of the algorithm was tested by an example,and results show that the algorithm is efficient for attribute reduction of ordered information systems,and provides a theoretical basis for knowledge discovery in ordered information systems.
Melodies Elastic Matching for Humming Retrieval on Web
LI Peng,WANG Xiao-ming,WANG Xiao-feng and WANG Ya-wen
Computer Science. 2014, 41 (12): 192-196.  doi:10.11896/j.issn.1002-137X.2014.12.042
Abstract PDF(706KB) ( 795 )   
References | Related Articles | Metrics
With the growing number of musical works and the variety of users' retrieval needs, we proposed, using content-based music retrieval methods, a definition and representation style for music melodies, and computed similarity based on the geometric similarity of melody contours. Specifically, we analyzed a series of factors affecting the accuracy of the retrieval algorithms, determined appropriate thresholds experimentally, and implemented the final retrieval system in both stand-alone and Web versions. The experimental results indicate that the solution is correct and effective, and achieves high retrieval accuracy.
MapReduce-based SOINN Clustering Algorithm for Web Tag
WANG Jie,YU Yan-shuo,ZHOU Kuan-jiu and HOU Gang
Computer Science. 2014, 41 (12): 197-201.  doi:10.11896/j.issn.1002-137X.2014.12.043
Abstract PDF(949KB) ( 631 )   
References | Related Articles | Metrics
Web tags help users classify, organize and search Internet resources according to their interests. Tag clustering can help to solve problems caused by the openness and freedom of Web tag systems, such as inaccurate information description, disorganized tags and ambiguity. Three tag feature vector representation (FVR) methods were presented, namely resource-based FVR, other-tag co-occurrence FVR and total-tag co-occurrence FVR, all of which can be applied to the SOINN clustering algorithm, and SOINN clustering can be parallelized with the MapReduce model. Experiments show that the accuracy and recall of the three tag FVRs are superior to the original tag co-occurrence FVR, and that MapReduce-based tag SOINN clustering achieves the best performance when the number of class centers exceeds 2000. The experimental results prove that the distributed clustering algorithms proposed in this paper have good scalability and can be applied to more massive Web tag analysis systems.
Characteristic Conditions and Timed Properties of Time Petri Nets with Mixed Semantics
PAN Li,ZHENG Hong,YANG Bo and ZHOU Xin-min
Computer Science. 2014, 41 (12): 202-205.  doi:10.11896/j.issn.1002-137X.2014.12.044
Abstract PDF(373KB) ( 651 )   
References | Related Articles | Metrics
Two time semantics, a strong one and a weak one, are usually adopted by time Petri nets in different application contexts, but both are limited in schedulability analysis because of the scheduling consistency problem and the scheduling timeliness problem. This paper defined two characteristic conditions for consistency and timeliness, presented a time Petri net model with mixed semantics, and proved that the mixed-semantics model is more suitable for the schedulability analysis of real-time systems than the existing time semantics models. We further compared the timed bisimulation ability of the mixed-semantics model with that of the strong and weak semantics models.
Characterizing Expressive Power for Concept Descriptions and Terminological Axioms Boxes in Description Logic FL0
SHEN Yu-ming,WEN Xi-ming and WANG Ju
Computer Science. 2014, 41 (12): 206-210.  doi:10.11896/j.issn.1002-137X.2014.12.045
Abstract PDF(484KB) ( 650 )   
References | Related Articles | Metrics
The two most important properties of a logic are its expressive power and the complexity of its reasoning problems, which pull in opposite directions. Bisimulations between interpretations are an effective way to characterize expressive power; a classical result is the van Benthem characterization theorem, which gives an exact condition for a first-order formula with one free variable to be equivalent to a modal logic formula. In this paper, a simulation for FL0 (including atomic concepts, the top concept, concept conjunction and universal quantification) was given. Based on this simulation, characterization theorems of the expressive power for concept descriptions and TBoxes were established, giving necessary and sufficient conditions for a first-order formula to be equivalent to a concept description or a TBox. These results provide effective support for the tradeoff between expressive power and the complexity of reasoning problems.
Dissimilarity Based Ensemble of Extreme Learning Machine with Cost-sensitive for Gene Expression Data Classification
AN Chun-lin,LU Hui-juan,WEI Sha-sha and YANG Xiao-bing
Computer Science. 2014, 41 (12): 211-215.  doi:10.11896/j.issn.1002-137X.2014.12.046
Abstract PDF(369KB) ( 667 )   
References | Related Articles | Metrics
Dissimilarity-based ensembles of Extreme Learning Machines (D-ELM) obtain stable results on gene expression data classification. However, because this algorithm is driven by classification accuracy, it cannot minimize the total misclassification cost required by cost-sensitive classification when the misclassification costs are unequal. This paper used probability estimates and misclassification costs to reconstruct the classification results, and proposed the Cost-sensitive Dissimilarity-based ensemble of Extreme Learning Machines (CS-D-ELM). The algorithm was applied to gene expression data, and the experiments demonstrate that it obtains better results.
Venation Extraction Algorithm for Insect Vein Based on Tensor Voting
DUAN Da-gao,GONG Le,WAN Yue-liang and HAN Zhong-ming
Computer Science. 2014, 41 (12): 216-219.  doi:10.11896/j.issn.1002-137X.2014.12.047
Abstract PDF(861KB) ( 664 )   
References | Related Articles | Metrics
Venation extraction is of great significance for the automatic classification of insects. This paper discussed the problems of traditional vein extraction algorithms and proposed a new method based on tensor voting. First, a sparse binary point map is obtained by preprocessing steps such as denoising, binarization and morphological operations, and the tensor of each pixel is calculated. After performing tensor voting between neighboring pixels and applying a threshold, the final vein edge is obtained. The tensor voting process incorporates the Gestalt laws of proximity and similarity. The experimental results show that, by means of Gestalt principles and tensor voting, structures corresponding to visual perception can be extracted and a smoother, more complete edge contour image can be obtained. Moreover, even if the vein image is slightly broken, the proposed algorithm can still obtain a complete outline. The resulting vein contour image can greatly improve the performance of later automatic insect classification.
Hybrid Differential Evolution Algorithm for Vehicle Routing Problem with Time Windows
SONG Xiao-yu,ZHU Jia-yuan and SUN Huan-liang
Computer Science. 2014, 41 (12): 220-225.  doi:10.11896/j.issn.1002-137X.2014.12.048
Abstract PDF(486KB) ( 765 )   
References | Related Articles | Metrics
Aiming at the vehicle routing problem with time windows, a multi-objective mathematical model was built with the goals of minimizing the number of vehicles and the travel distance, and it was solved by a hybrid differential evolution algorithm based on Pareto dominance that combines an improved differential evolution algorithm with variable neighborhood descent. First, a redefined method for producing new individuals was employed. Second, global exploration and local exploitation were balanced by combining a dual-population strategy with variable neighborhood descent, and population diversity was maintained by replacing repeated individuals with random ones. Third, Pareto dominance was applied to compare solutions, and Arena's principle was adopted to construct the non-dominated solution set. Finally, computational results on 18 Solomon problems of different sizes show that the solution quality of the proposed algorithm improves on the artificial bee colony algorithm by 2.04% on average in travel distance and 14.95% in vehicle number, and improves on the best known solutions by 14.53% on average in vehicle number, verifying the effectiveness of the proposed algorithm.
Improved RELM Based on Fish Swarm Optimization Algorithm and Cholesky Decomposition for Gene Expression Data Classification
LU Hui-juan,WEI Sha-sha,GUAN Wei and MIAO Yan-zi
Computer Science. 2014, 41 (12): 226-230.  doi:10.11896/j.issn.1002-137X.2014.12.049
Abstract PDF(361KB) ( 800 )   
References | Related Articles | Metrics
This paper proposed an improved regularized extreme learning machine algorithm (FSC-RELM), based on the fish swarm optimization algorithm and Cholesky decomposition, for the classification of gene expression data. First, the fish swarm optimization algorithm is used to optimize the input-layer weights, with the objective function defined as the reciprocal of the error function. Then, to speed up the algorithm and reduce training time, Cholesky decomposition is applied to the RELM output-layer weight matrix. Experiments on standard gene expression data sets show that FSC-RELM obtains high classification accuracy and good performance in a relatively short time.
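The Cholesky step exploits the fact that the regularized ELM output weights solve the symmetric positive-definite system beta = (I/C + H^T H)^-1 H^T T, so one factorization and two triangular solves replace a matrix inversion. A minimal stdlib sketch under that standard RELM formulation (the tiny H and T below are made up for illustration):

```python
import math

def cholesky(A):
    """Lower-triangular L with A = L*L^T (A must be symmetric positive definite)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def solve_spd(A, b):
    """Solve A x = b by Cholesky factorization and two triangular solves."""
    L, n = cholesky(A), len(b)
    y = [0.0] * n
    for i in range(n):                      # forward substitution: L y = b
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):            # backward substitution: L^T x = y
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

def relm_output_weights(H, T, C=100.0):
    """Regularized ELM output weights: beta = (I/C + H^T H)^-1 H^T T."""
    n, m = len(H), len(H[0])
    A = [[sum(H[r][i] * H[r][j] for r in range(n)) + (1.0 / C if i == j else 0.0)
          for j in range(m)] for i in range(m)]
    b = [sum(H[r][i] * T[r] for r in range(n)) for i in range(m)]
    return solve_spd(A, b)
```

The regularizer I/C keeps the system positive definite even when H^T H is singular, which is what makes the Cholesky route safe.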
Study on Set Pair Cloud Multi-attribute Group Decision Method
WU Ai-yan,ZENG Guang-ping and TU Xu-yan
Computer Science. 2014, 41 (12): 231-233.  doi:10.11896/j.issn.1002-137X.2014.12.050
Abstract PDF(290KB) ( 756 )   
References | Related Articles | Metrics
The paper studied the state of research on fuzzy multi-attribute decision making for complex phenomena, in which the fuzziness and randomness of linguistic information, and their impact on the alternatives, have become the focus of current research. A set pair cloud multi-attribute group decision method was therefore presented, combining the cloud model with set pair analysis (SPA) to solve these problems. The method was then applied to a study of new rural energy development and utilization to establish a renewable energy development model, and the corresponding solutions were given. Finally, the result was compared with that of fuzzy two-tuple linguistic analysis, which shows that the method is rational, effective and practical.
TCRA:A Data Reasoning Algorithm Combined with Ontology Concept
LU Gui-fang and WANG Jing-bin
Computer Science. 2014, 41 (12): 234-237.  doi:10.11896/j.issn.1002-137X.2014.12.051
Abstract PDF(690KB) ( 739 )   
References | Related Articles | Metrics
In OWL DL, class descriptions are used to represent concepts, and set operations can in theory lead to arbitrarily complex concepts. The current set operations of OWL DL are realized with intersection, union and complement, so OWL cannot represent set expressions with cardinality constraints. This paper extended the cardinality constraint on the basis of the OWL DL set restrictions, achieving a description logic with cardinality constraints on sets, and proposed a tree classification reasoning algorithm based on set cardinality. An application to human resources screening proves that this method is flexible in expression and very effective for individual classification.
Low Entropy Image Sequences Lossless Compression
TANG Ying,LIU Xiao-zhe and ZHANG Hong-xin
Computer Science. 2014, 41 (12): 238-244.  doi:10.11896/j.issn.1002-137X.2014.12.052
Abstract PDF(1493KB) ( 678 )   
References | Related Articles | Metrics
Cloud rendering systems produce a large amount of 3D rendered image data. In order to reduce the I/O transmission and storage cost of cluster-rendered image sequences, this paper presented a lossless compression scheme based on dictionary techniques, which achieves higher compression by decreasing the local complexity of the data. A data rearrangement technique is applied to increase the degree of local redundancy, yielding denser compression. To further improve compression performance on large image sequences, the paper proposed a distributed image compression scheme based on cloud computing infrastructure, realized with the Map/Reduce computing model. The experimental results show that the proposed approach makes the coding process more efficient.
Image Specification Algorithm Based on Multi-peaks Gaussian Function
ZHAO Tong,WANG Guo-yin and XIAO Bin
Computer Science. 2014, 41 (12): 245-250.  doi:10.11896/j.issn.1002-137X.2014.12.053
Abstract PDF(1027KB) ( 934 )   
References | Related Articles | Metrics
Histogram equalization, as a special case of histogram specification, is an effective algorithm for image contrast enhancement, but it stretches the dynamic range of the image's histogram, which often saturates uniform regions of the output image with highlights. An image specification algorithm based on a Gaussian PDF has been proposed recently; however, it is unsatisfactory for image enhancement because of its poor sense of gradation. Accordingly, an image specification algorithm based on a multi-peak Gaussian function was proposed in this paper, in which local means and local variances are estimated with a derivative-based method. A key feature of the algorithm is that varying the local-variance parameters can enhance image contrast selectively and locally, broadening the range of image contrast. For color image enhancement, the proposed algorithm can be applied separately to the R, G and B sub-images and the results then effectively recombined with a color recovery factor suited to human visual habits. Both experimental results and theoretical analysis demonstrate that the proposed algorithm is effective.
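The multi-peak target can be illustrated by standard histogram specification against a mixture-of-Gaussians histogram: build the target as a sum of Gaussian peaks, then map each gray level to the target level with the nearest cumulative probability. This is our own sketch of that classical CDF-matching procedure, not the paper's derivative-based estimation; the peak parameters and the toy two-level image are assumptions:

```python
import math

def gaussian_mixture_hist(peaks, levels=256):
    """Target histogram built as a sum of Gaussian peaks, each (mean, sigma, weight)."""
    h = [sum(w * math.exp(-0.5 * ((g - m) / s) ** 2) for m, s, w in peaks)
         for g in range(levels)]
    total = sum(h)
    return [v / total for v in h]

def specify(pixels, target, levels=256):
    """Histogram specification: remap gray levels so the image histogram
    approximates the target, by matching cumulative distributions."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    cdf_src, acc = [], 0.0
    for v in hist:
        acc += v / n
        cdf_src.append(acc)
    cdf_tgt, acc = [], 0.0
    for v in target:
        acc += v
        cdf_tgt.append(acc)
    # Map each source level to the target level with the closest CDF value.
    lut = [min(range(levels), key=lambda g: abs(cdf_tgt[g] - c)) for c in cdf_src]
    return [lut[p] for p in pixels]
```

With two peaks at levels 64 and 192, dark pixels are spread toward the two peak regions instead of being pushed to a single saturated band, illustrating how the multi-peak target preserves gradation.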
Smoke Detection in Video Based on Color Histogram and Wavelets
SUN Jian-kun and YANG Ruo-yu
Computer Science. 2014, 41 (12): 251-254.  doi:10.11896/j.issn.1002-137X.2014.12.054
Abstract PDF(1199KB) ( 755 )   
References | Related Articles | Metrics
Automatic smoke detection in video plays a major role in the early detection of unexpected fire hazards and in the recognition of vehicle exhaust. One of the difficulties in smoke detection is effectively eliminating interference from moving objects with colors similar to smoke. To ensure both effectiveness and timeliness, the static statistical characteristics of wavelet transforms are used, which can exclude interference such as cars or people. First, motion regions are obtained by background subtraction, and candidate smoke regions are then extracted by histogram backprojection. Finally, the wavelet transform is applied to the background and the corresponding video frame, and the difference between the two transformed images is computed. According to the different statistical characteristics of smoke and non-smoke, non-smoke regions are removed from the frame. The experimental results are impressive, with few false alarms, high accuracy and real-time capability.
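The first two stages — background subtraction for motion, then histogram backprojection for smoke-colored pixels — can be sketched on grayscale frames. The bin count, thresholds and toy frames below are our own illustrative choices, not the paper's parameters:

```python
def motion_mask(frame, background, thresh=25):
    """Background subtraction: mark pixels that differ strongly from the background."""
    return [[abs(f - b) > thresh for f, b in zip(fr, br)]
            for fr, br in zip(frame, background)]

def backproject(frame, smoke_hist, bins=8, levels=256):
    """Histogram backprojection: score each pixel by its gray level's
    probability under a reference smoke histogram."""
    scale = levels // bins
    return [[smoke_hist[p // scale] for p in row] for row in frame]

def smoke_candidates(frame, background, smoke_hist, thresh=25, score=0.5):
    """Candidate smoke pixels: moving AND smoke-colored."""
    mask = motion_mask(frame, background, thresh)
    bp = backproject(frame, smoke_hist)
    return [[m and s > score for m, s in zip(mr, sr)]
            for mr, sr in zip(mask, bp)]
```

Only pixels that are both moving and match the smoke color model survive to the wavelet stage, which is what filters out similarly colored but static surfaces.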
Pedestrian Detection with Block Feature Shrink
ZHANG Deng-yi,WANG Qian,GUO Lei and WU Xiao-ping
Computer Science. 2014, 41 (12): 255-259.  doi:10.11896/j.issn.1002-137X.2014.12.055
Abstract PDF(929KB) ( 852 )   
References | Related Articles | Metrics
To improve the detection rate and reduce the high dimensionality of histogram of oriented gradients (HOG) and local binary pattern (LBP) features in pedestrian detection, this paper proposed a pedestrian detection method based on block feature shrinking. First, the sample image is divided into many overlapping blocks of the same size. Then the HOG and LBP features are extracted from these blocks and fused together as the block features. Next, block classifiers are trained on the block features, the blocks are sorted by the detection rate of their classifiers, and the features of the blocks with higher rates are chosen for shrinking. Finally, the shrunken block features are concatenated as the final feature used to detect pedestrians. Experimental results on the INRIA test set show that the proposed method has a higher detection rate and lower dimensionality.
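The LBP half of the block feature can be sketched as follows: each interior pixel is encoded by thresholding its 8 neighbors against it, and a block is described by the normalized histogram of those codes. This is our own minimal sketch; the 16-bin quantization is an illustrative choice, and the paper fuses these with HOG features:

```python
def lbp_codes(block):
    """8-neighbour local binary pattern code for every interior pixel of a block."""
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = len(block), len(block[0])
    codes = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = block[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if block[y + dy][x + dx] >= c:   # neighbour at least as bright: set bit
                    code |= 1 << bit
            codes.append(code)
    return codes

def block_histogram(codes, bins=16):
    """Normalized histogram of LBP codes: the block's LBP feature vector."""
    hist = [0.0] * bins
    for c in codes:
        hist[c * bins // 256] += 1
    total = sum(hist)
    return [v / total for v in hist] if total else hist
```

Per-block histograms like this are what get ranked by classifier detection rate and then shrunk before concatenation.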
Novel Transfer Learning Algorithm Based on Bag-of-visual Words Model
WU Li-na,HUANG Ya-ping and ZHENG Xiang
Computer Science. 2014, 41 (12): 260-263.  doi:10.11896/j.issn.1002-137X.2014.12.056
Abstract PDF(675KB) ( 722 )   
References | Related Articles | Metrics
The bag-of-visual-words model must learn its visual vocabulary and classifier from scratch when it learns a novel image category, and cannot make use of previously learned visual vocabularies. This paper proposed a transfer learning algorithm based on visual phrases. Visual phrases contain not only local invariant features but also local spatial information, and can describe characteristics shared among different image categories. By transferring visual phrases from the source visual vocabulary, our algorithm enables the bag-of-visual-words model to perform well on a novel image category. The experimental results validate that the algorithm effectively utilizes learned knowledge and achieves better performance on novel image categories, even with few training images.
Automatic Pedestrian Detection Based on Video Surveillance
LI Xin-jiang,GONG Xun,LI Tian-rui,ZHAO Tao and XIONG Wei
Computer Science. 2014, 41 (12): 264-268.  doi:10.11896/j.issn.1002-137X.2014.12.057
Abstract PDF(1209KB) ( 776 )   
References | Related Articles | Metrics
To address the problem that pedestrian detection technologies cannot balance detection speed and accuracy, this paper studied pedestrian detection under video surveillance. An automatic video pedestrian detection method (denoted LUVC4) was proposed by combining LUV color space information with the C4 pedestrian detection algorithm. First, the C4 algorithm is used to rapidly traverse each frame of the video. When the confidence score of a detection window falls in the suspicious interval, the LUV color space is used to examine the window further; if the weighted sum of the two detection scores exceeds the threshold, the window is classified as a pedestrian. Extensive experiments show that the detection speed of the proposed method nearly matches that of C4, while decreasing the miss rate by about 9% at 0.1 false positives per image.
Improved Method of Microaneurysm Detection Algorithm Based on Digital Fundus Images
DING Shan and SONG Li-xiao
Computer Science. 2014, 41 (12): 269-274.  doi:10.11896/j.issn.1002-137X.2014.12.058
Abstract PDF(1566KB) ( 716 )   
This paper presented a new approach to detecting microaneurysms (MAs) in digital fundus images. The contributions are mainly twofold. First, a dynamic multi-parameter template matching scheme was proposed, which is more realistic than conventional schemes; a dual-constraint scheme measures the matching degree by combining the sum of errors with correlation coefficients. Second, an adaptive weighted scoring algorithm with a distribution-based scoring scheme was proposed for feature extraction in MA detection, which not only reduces false positives (FP) but also effectively maintains true positives (TP).
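A dual-constraint matching degree of the kind described could be sketched like this, assuming patch intensities normalized to [0, 1]; the blending weight `alpha` and the exact combination are illustrative, not the paper's formula:

```python
import numpy as np

def match_degree(patch, template, alpha=0.5):
    """Dual-constraint matching: combine a (low) normalized sum of
    absolute errors with a (high) Pearson correlation coefficient."""
    p = np.asarray(patch, float).ravel()
    t = np.asarray(template, float).ravel()
    sae = np.abs(p - t).sum() / p.size          # normalized error term
    corr = np.corrcoef(p, t)[0, 1]              # correlation coefficient
    return alpha * (1.0 - sae) + (1.0 - alpha) * corr
```

Using both terms guards against the failure modes of each: correlation ignores absolute intensity offsets, while the error sum ignores the shape of the intensity profile.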
Band Grouping Based Hyperspectral Image Classification Using Mathematical Morphology and Support Vector Machines
ZHANG Fan,DU Bo,ZHANG Liang-pei and ZHANG Le-fei
Computer Science. 2014, 41 (12): 275-279.  doi:10.11896/j.issn.1002-137X.2014.12.059
Abstract PDF(1199KB) ( 711 )   
Accurately analyzing and recognizing images is an important issue in computer vision and pattern recognition. Remote sensing images, especially hyperspectral images, combine spatial and spectral information in one data cube. In this paper, we proposed a band grouping feature selection method and then extracted morphological features. A feature selection algorithm, recursive feature elimination, was applied to reduce the dimensionality of the input morphological feature data, and a support vector machine was used for the final classification. Experiments on real hyperspectral images confirm that combining band grouping with mathematical morphology is efficient.
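One simple way to realize band grouping, sketched here under the assumption that adjacent bands with near-perfect correlation belong to one group (the function name, threshold, and representative-band rule are illustrative):

```python
import numpy as np

def group_bands(cube, corr_thresh=0.95):
    """Group adjacent, highly correlated spectral bands of an (H, W, B)
    data cube and keep one representative band per group."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)                  # pixels x bands
    groups, current = [], [0]
    for i in range(1, b):
        c = np.corrcoef(flat[:, i - 1], flat[:, i])[0, 1]
        if c >= corr_thresh:
            current.append(i)                   # still the same group
        else:
            groups.append(current)              # correlation dropped
            current = [i]
    groups.append(current)
    reps = [g[len(g) // 2] for g in groups]     # middle band of each group
    return groups, reps
```

The representative bands (or per-group morphological features) would then feed recursive feature elimination and the SVM, as the abstract outlines.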
Scale Adaptive Target Tracking Algorithm for Robot
CHENG Xin-tian and TANG Zhen-min
Computer Science. 2014, 41 (12): 280-282.  doi:10.11896/j.issn.1002-137X.2014.12.060
Abstract PDF(584KB) ( 656 )   
The Mean-Shift algorithm is a simple and efficient target tracking algorithm, but it cannot recognize occluded targets or targets whose scale changes. This paper proposed a scale-adaptive target tracking algorithm for robots based on affine transformation. We defined corner points, recognized the target according to them, and recognized scale changes of the target using an affine transformation. Compared with related algorithms, the proposed algorithm can recognize occluded targets effectively, and it can also recognize the target accurately when its scale changes. Analysis shows that for images of fewer than 2×10⁴ pixels and video streams of at most 25 frames per second, the proposed algorithm can be used for real-time target tracking.
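A minimal sketch of scale estimation from matched corner points, which is one piece of what an affine-transformation-based scheme provides (the function name and the centroid-distance heuristic are assumptions, not the paper's exact estimator):

```python
import numpy as np

def estimate_scale(prev_pts, curr_pts):
    """Estimate target scale change from matched corner points as the
    ratio of mean distances to each point set's centroid."""
    p = np.asarray(prev_pts, float)
    c = np.asarray(curr_pts, float)
    d_prev = np.linalg.norm(p - p.mean(axis=0), axis=1).mean()
    d_curr = np.linalg.norm(c - c.mean(axis=0), axis=1).mean()
    return d_curr / d_prev
```

The estimated ratio can then be used to rescale the Mean-Shift kernel bandwidth so the tracking window follows the target's size.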
Band Selection and Classification for Hyperspectral Image Based on Multiple Particle Swarm Cooperative Optimization
REN Yue-mei,LI Lei,ZHANG Yan-ning,WEI Wei and LI Ying
Computer Science. 2014, 41 (12): 283-287.  doi:10.11896/j.issn.1002-137X.2014.12.061
Abstract PDF(694KB) ( 646 )   
The huge increase in hyperspectral data dimensionality and information redundancy brings high computational cost as well as a risk of over-fitting during classification. We presented an automatic band selection and SVM classification method based on a novel wrapper model, multiple particle swarm cooperative optimization-SVM (MPSO-SVM), which uses a multi-particle-swarm algorithm to search the feature subset and improves PSO with a new position and velocity update strategy. In the cooperative optimization process, we mitigated the premature convergence of PSO by introducing a genetic algorithm. The MPSO-SVM model optimizes the band subset and the SVM kernel parameters simultaneously. Experimental results on hyperspectral images demonstrate that MPSO-SVM can select the best band combination and the optimal SVM parameters, improving classification accuracy significantly.
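The band-subset search can be illustrated with a plain binary PSO over 0/1 band masks; this is a generic single-swarm sketch with standard sigmoid velocity mapping, not the paper's multi-swarm cooperative variant, and all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_pso(fitness, n_bits, n_particles=10, iters=30,
               w=0.7, c1=1.5, c2=1.5):
    """Minimal binary PSO: each particle is a 0/1 band mask; velocities
    pass through a sigmoid to give per-bit selection probabilities."""
    x = rng.integers(0, 2, (n_particles, n_bits))
    v = rng.normal(0, 1, (n_particles, n_bits))
    pbest = x.copy()
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random(x.shape) < 1 / (1 + np.exp(-v))).astype(int)
        f = np.array([fitness(p) for p in x])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest, pbest_f.max()
```

In a wrapper setting the fitness would be SVM cross-validation accuracy minus a penalty on the number of selected bands; the toy fitness below stands in for that.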
Image Resizing Based on Shape and Structure-preserving of Salient Objects
LIN Xiao,SHEN Yang,MA Li-zhuang and ZOU Pan-pan
Computer Science. 2014, 41 (12): 288-292.  doi:10.11896/j.issn.1002-137X.2014.12.062
Abstract PDF(942KB) ( 730 )   
Since traditional seam-carving-based image resizing methods may destroy the shape and structure of salient objects in an image, we presented a new image resizing method that preserves the content, shape, and structure of salient objects. The proposed method first produces a significance map with clear shape and structure by combining the classic saliency map with gradient histogram information, and then divides the image into blocks using this significance map. Finally, the image is resized by combining classic seam carving with a conformal-energy-based deformation method according to the significance of each block. Experiments show that the proposed method preserves the content, shape, and structure of salient objects better than previous works.
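The seam-carving component referred to above rests on a standard dynamic program over an energy (here, significance) map; the sketch below shows that classic step only, with the significance map supplied as input:

```python
import numpy as np

def min_vertical_seam(energy):
    """Classic seam-carving DP: cumulative cost of the cheapest
    8-connected vertical seam through an energy map, plus backtracking."""
    e = np.asarray(energy, float)
    cost = e.copy()
    for i in range(1, e.shape[0]):
        for j in range(e.shape[1]):
            lo, hi = max(j - 1, 0), min(j + 2, e.shape[1])
            cost[i, j] += cost[i - 1, lo:hi].min()
    seam = [int(cost[-1].argmin())]          # cheapest bottom-row cell
    for i in range(e.shape[0] - 2, -1, -1):  # backtrack upward
        j = seam[-1]
        lo, hi = max(j - 1, 0), min(j + 2, e.shape[1])
        seam.append(lo + int(cost[i, lo:hi].argmin()))
    return seam[::-1], float(cost[-1].min())
```

Assigning high significance to salient-object pixels keeps seams out of those regions, which is why the quality of the significance map matters so much in this method.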
Medical Image Segmentation Based on Non-parametric B-spline Density Model with Spatial Information
LIU Zhe,SONG Yu-qing and BAO Xiang
Computer Science. 2014, 41 (12): 293-296.  doi:10.11896/j.issn.1002-137X.2014.12.063
Abstract PDF(925KB) ( 701 )   
Because parameter estimation in finite mixture models partly depends on prior assumptions and is sensitive to noise in image segmentation, a segmentation method for medical images based on a non-parametric B-spline density model with spatial information was proposed in this paper. First, a non-parametric B-spline density model of the image was designed, and a spatial information function was defined to endow the model with spatial neighborhood information. Second, a non-parametric B-spline expectation maximization (NNBEM) algorithm was used to estimate the unknown parameters of the density model. Finally, the image was clustered according to the Bayesian criterion. This method effectively overcomes the model mismatch problem; it not only deals with noise effectively but also preserves edges well. Experimental results on simulated image segmentation show the effectiveness of the method.
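The final clustering step can be illustrated with a toy stand-in for the spatial information function: average each pixel's class posteriors over a small neighborhood before applying the Bayesian (maximum-posterior) criterion. The mean-filter smoothing here is an assumption for illustration, not the paper's actual function:

```python
import numpy as np

def spatial_bayes_labels(likelihood, prior, win=3):
    """Label each pixel by the Bayesian criterion after averaging class
    posteriors over a win x win spatial neighborhood (toy spatial term)."""
    post = np.asarray(likelihood, float) * np.asarray(prior, float)
    post /= post.sum(axis=-1, keepdims=True)       # normalize posteriors
    h, w, k = post.shape
    pad = win // 2
    padded = np.pad(post, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    smooth = np.zeros_like(post)
    for i in range(h):
        for j in range(w):
            smooth[i, j] = padded[i:i + win, j:j + win].mean(axis=(0, 1))
    return smooth.argmax(axis=-1)
```

Averaging posteriors over a neighborhood is what lets an isolated noisy pixel be outvoted by its surroundings, matching the abstract's claim of noise robustness.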
Fast Face Alignment Method Based on Hierarchical Model and its Application on Mobile Devices
DENG Jian-kang,WANG Can-tian and LIU Qing-shan
Computer Science. 2014, 41 (12): 297-302.  doi:10.11896/j.issn.1002-137X.2014.12.064
Abstract PDF(1288KB) ( 690 )   
This paper studied fast localization of facial landmarks on a mobile smart phone. A fast face alignment method based on a hierarchical model, derived from the active shape model, was proposed. First, the landmarks on the canthi and angulus oris are quickly located using binary features based on the result of face detection, after which these landmarks are calibrated and corrected. Second, combined with edge constraints on the eyes, mouth, and face outline, local alignments are performed independently based on the locations of the canthus and angulus oris landmarks. Finally, the shape of the whole face is aligned by weighted projection. Experimental results show that the proposed method converges after 8 to 10 iterations and takes 40 ms or less to align a single face image on a smartphone (Samsung I9300), satisfying real-time requirements.
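An active-shape-model-style weighted projection step, of the kind the final stage describes, can be sketched as follows; the weighting scheme and coefficient clipping shown are generic ASM practice assumed for illustration, not the paper's exact formulation:

```python
import numpy as np

def constrain_shape(shape, mean, basis, weights, limit=3.0):
    """ASM-style step: project a candidate shape onto the shape basis
    with per-landmark weights, clip coefficients, and reconstruct."""
    w = np.repeat(weights, 2)                 # one weight per (x, y) pair
    b = basis.T @ (w * (shape - mean))        # weighted projection
    b = np.clip(b, -limit, limit)             # keep the shape plausible
    return mean + basis @ b
```

Clipping the projection coefficients is what keeps independently aligned local parts (eyes, mouth, outline) consistent with a globally plausible face shape.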
Fake Fingerprint Detection Algorithm Based on Curvelet Texture Analysis and SVM-KNN Classification
ZHANG Yong-liang,LIU Chao-fan,XIAO Gang and FANG Shan-shan
Computer Science. 2014, 41 (12): 303-308.  doi:10.11896/j.issn.1002-137X.2014.12.065
Abstract PDF(745KB) ( 684 )   
Fake fingerprint attacks, a simple and practical way of defeating fingerprint identification, are exploited by criminals. The current mainstream method of fake fingerprint detection is texture analysis, but conventional texture analysis does not cover the coarseness differences caused by the differing materials of fake and real fingerprints. In this paper, a novel method was proposed based on the curvelet transform and image texture features with support vector machine and K-nearest neighbor (SVM-KNN) classification. First, curvelet coefficient features at different scales and directions are extracted. Second, texture features are extracted from first-order statistics, the gray level co-occurrence matrix (GLCM), and a Markov random field (MRF) in the curvelet-reconstructed image, and fingerprint images are trained with an SVM to obtain the classification criterion. Lastly, SVM-KNN classification is used for fake fingerprint detection. Experimental results on the databases of the Liveness Detection Competition 2011 (LivDet2011) show that the proposed method is effective and superior to existing approaches.
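The usual SVM-KNN combination rule is: accept the SVM decision when the sample lies well outside the margin, and fall back to a k-NN vote near the boundary. A minimal sketch under that assumption (the margin width, `k`, and `svm_decision` interface are illustrative):

```python
import numpy as np

def svm_knn_predict(x, svm_decision, train_X, train_y, margin=1.0, k=5):
    """Hybrid SVM-KNN rule: trust the SVM outside the margin band,
    fall back to a k-NN vote for samples close to the boundary."""
    d = svm_decision(x)
    if abs(d) >= margin:                 # confident SVM decision
        return 1 if d > 0 else -1
    dist = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dist)[:k]       # ambiguous: k nearest neighbors
    vote = train_y[nearest].sum()
    return 1 if vote > 0 else -1
```

Near-boundary fingerprints (where fake and live textures are hardest to separate) thus get the locally adaptive k-NN decision instead of the global hyperplane.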