Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41 Issue 5, 14 November 2018
  
Survey of Appearance Models and Rendering
JIANG Hai-yan,ZHOU Ming-quan,WU Zhong-ke,GUO Di and ZHANG Xu
Computer Science. 2014, 41 (5): 1-7.  doi:10.11896/j.issn.1002-137X.2014.05.001
Abstract PDF(1892KB) ( 942 )   
References | Related Articles | Metrics
This paper summarized one of the main research topics of photorealistic graphics: appearance models and rendering. It presented the two key factors that influence the appearance of different materials: the optical properties and the geometric details of material surfaces. According to the interactions of light with material surfaces (reflection, subsurface scattering, transmission and so on), it emphatically described appearance models and rendering, analyzed in detail the specific models BRDF, SVBRDF, BSSRDF, TSVBRDF and BTF, summarized the characteristics of these models, and gave rendering results obtained with the different models. It then exhibited the main application areas of appearance rendering, such as film and television production, the games industry, virtual design and cultural heritage protection. In the end, it concluded and gave prospects for research on appearance modeling and rendering.
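As a rough illustration of the reflection models the survey covers, the sketch below evaluates an ideal diffuse (Lambertian) BRDF and a Phong-style specular lobe. This is a textbook toy, not any model from the paper; the function names and constants are illustrative.

```python
import math

def lambertian_brdf(albedo):
    # Ideal diffuse BRDF: constant albedo / pi, independent of view direction.
    return albedo / math.pi

def phong_specular(normal, light, view, shininess, ks):
    # Phong-style specular lobe: reflect the light direction about the
    # surface normal (R = 2(N.L)N - L) and raise the view alignment to
    # `shininess`. All vectors are assumed unit-length 3-tuples.
    dot_nl = sum(n * l for n, l in zip(normal, light))
    reflect = tuple(2.0 * dot_nl * n - l for n, l in zip(normal, light))
    dot_rv = max(0.0, sum(r * v for r, v in zip(reflect, view)))
    return ks * dot_rv ** shininess

# Head-on configuration: normal, light and view all along +z.
n = l = v = (0.0, 0.0, 1.0)
print(lambertian_brdf(0.8))              # diffuse term, ~0.2546
print(phong_specular(n, l, v, 32, 0.5))  # specular term, maximal here: 0.5
```

Spatially varying models such as SVBRDF extend this idea by letting the parameters (albedo, shininess) vary per surface point.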
Survey towards Cognitive Ability Guarantee in Cognitive Radio Networks
FENG Guang-sheng,ZHENG Chen,WANG Hui-qiang,ZHAO Qian and LV Hong-wu
Computer Science. 2014, 41 (5): 8-13.  doi:10.11896/j.issn.1002-137X.2014.05.002
Abstract PDF(937KB) ( 747 )   
References | Related Articles | Metrics
Cognitive radio networks can sense the wireless environment and then use spectrum white holes opportunistically through dynamic spectrum access technology. Cognitive ability is an essential attribute of a cognitive radio network, so the security guarantee of cognitive ability is directly related to the practical deployment of cognitive radio networks. Cognitive ability faces various security threats due to the complexity of the cognitive radio network architecture and node mobility. Cognitive ability security issues and corresponding solutions were introduced from the aspects of cognitive communication, resource sensing, inference and decision, and service fit, and the shortcomings and weaknesses of the existing solutions were then summarized. Development tendencies in cognitive ability guarantee based on trust mechanisms were proposed afterwards. Problems and challenges faced by cognitive ability guarantee, as well as a discussion of future research topics, were addressed at last.
Accelerating ASIFT Based on CPU/GPU Synergetic Parallel Computing
HE Ting-ting,RUI Jian-wu and WEN La
Computer Science. 2014, 41 (5): 14-19.  doi:10.11896/j.issn.1002-137X.2014.05.003
Abstract PDF(495KB) ( 1276 )   
References | Related Articles | Metrics
ASIFT (affine-SIFT) is a fully affine-invariant and scale-invariant image local feature extraction algorithm. It gives good results in image matching, but because of its high computational complexity it cannot be applied to real-time processing. Thus the GPU is used to accelerate ASIFT. Based on an analysis of the running time of ASIFT, SIFT was first adapted to the GPU, and then the other parts of ASIFT. A memory pool is used in GASIFT to avoid frequently allocating and freeing memory at runtime. Different ways of CPU/GPU synergetic parallel computing were studied to make GASIFT more efficient. Experiments show that the model in which the CPU handles the logical work and the GPU performs the parallel computing is the most suitable. Based on this model, GASIFT achieves a good speed-up ratio over other methods: 16 times compared with traditional ASIFT, and 7 times compared with OpenMP-optimized ASIFT.
Improved Priority List Task Scheduling Algorithm
LI Jing-mei,WANG Xue and WU Yan-xia
Computer Science. 2014, 41 (5): 20-23.  doi:10.11896/j.issn.1002-137X.2014.05.004
Abstract PDF(684KB) ( 1523 )   
References | Related Articles | Metrics
Task scheduling on heterogeneous multiprocessors is an important problem in the field of high performance computing. This paper proposed an improved priority-list task scheduling algorithm to address the fact that ill-suited priority methods in existing task scheduling algorithms lead to unsatisfactory scheduling results. The algorithm improves the traditional priority-list scheduling method, which uses average execution time as the parameter to calculate task priority, and proposes a weighted priority method to order tasks based on the performance differences of heterogeneous multi-core processors and the characteristics of dependent tasks. The paper then proposed a new way to select a processor core for each task, which takes the current situation and the backward critical-path execution time as the weight and overcomes the local-optimum problem introduced by greedy strategies. Furthermore, at the task allocation stage, the algorithm uses task duplication and interval insertion to optimize the schedule, shorten the earliest start time of tasks and improve processor utilization. Instance analysis and simulation experiments prove that the algorithm can effectively reduce the execution time of tasks and give full play to the advantages of heterogeneous multi-core processors.
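For context, the classical average-execution-time priority that the paper improves upon is the HEFT-style upward rank: a task's priority is its mean execution time over all cores plus the most expensive communication-plus-rank path to an exit task. The sketch below (a generic textbook formulation, not the paper's weighted variant) computes it over a tiny hypothetical DAG.

```python
def upward_rank(task, succ, exec_times, comm, memo=None):
    # HEFT-style upward rank: average execution time over all cores plus
    # the costliest path to an exit task. `succ` maps task -> successors,
    # `exec_times` maps task -> per-core times, `comm` maps edge -> cost.
    if memo is None:
        memo = {}
    if task in memo:
        return memo[task]
    avg = sum(exec_times[task]) / len(exec_times[task])
    tail = max((comm.get((task, s), 0) + upward_rank(s, succ, exec_times, comm, memo)
                for s in succ.get(task, [])), default=0)
    memo[task] = avg + tail
    return memo[task]

# Hypothetical 3-task DAG on a 2-core heterogeneous machine.
succ = {"t1": ["t2", "t3"]}
exec_times = {"t1": (2, 4), "t2": (1, 3), "t3": (4, 4)}
comm = {("t1", "t2"): 1, ("t1", "t3"): 2}
print(upward_rank("t1", succ, exec_times, comm))   # 3 + max(1+2, 2+4) = 9.0
```

Tasks are then scheduled in decreasing rank order; the paper's contribution replaces the plain average with weights reflecting core performance differences.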
Aging Path Reduction Algorithm for Type of Logic Gates
XING Lu,LIANG Hua-guo,YAN Lu-ming,ZHANG Li-na and YU Tian-song
Computer Science. 2014, 41 (5): 24-26.  doi:10.11896/j.issn.1002-137X.2014.05.005
Abstract PDF(337KB) ( 711 )   
References | Related Articles | Metrics
As CMOS technology scaling of integrated circuits shrinks feature sizes, circuit reliability problems become more serious, and circuit aging caused by NBTI is especially conspicuous. In practice, most circuits are complicated and contain many paths, so the workload would be large if we predicted the aging of all paths. In this work, we proposed an iterative algorithm based on the types and numbers of logic gates on a path, which is used to reduce the set of circuit paths to protect. The algorithm classifies all paths by the number of each kind of logic gate and the different influences of logic gate types on circuit aging, then removes the secure paths on which aging will not occur, lessening the workload of circuit aging prediction and improving its efficiency.
Research on SIMD-oriented Loop Optimizations
HOU Yong-sheng,ZHAO Rong-cai,HUANG Lei and HAN Lin
Computer Science. 2014, 41 (5): 27-32.  doi:10.11896/j.issn.1002-137X.2014.05.006
Abstract PDF(790KB) ( 762 )   
References | Related Articles | Metrics
Accelerating program performance via SIMD vectorization is very common on modern processors. Based on SIMD data dependency constraints, we presented the design and implementation of an improved Tarjan algorithm with a reverse dependency graph to increase the recognition rate of SIMD parallelization. In combination with traditional vectorization methods, a SIMD-oriented loop optimization technology was proposed that avoids unnecessary overhead in data reorganization. Our experimental results demonstrate that the performance of SIMD code generated using this technology is increased by 12% compared with ICC.
Rotation-based Test Pattern Clustering Compression Method for BIST
TU Ji, WANG Zi-long and LI Li-jian
Computer Science. 2014, 41 (5): 33-36.  doi:10.11896/j.issn.1002-137X.2014.05.007
Abstract PDF(326KB) ( 786 )   
References | Related Articles | Metrics
This paper presented a novel test pattern compression scheme for deterministic BIST. To reduce the storage requirements for the deterministic patterns, it relies on a two-dimensional compression scheme which combines clustering compression and rotation-based compression. Clustering compression divides the remaining random-pattern-resistant faults (RPRF) in a circuit into several clusters. The test patterns in each cluster differ from each other by no more than one bit, and only one seed selected from each cluster needs to be stored in ROM. To save further storage cells, a rotation-based compression method is added to compress the seeds of the clustering compression. Experimental results show that the proposed scheme requires less test data storage than some previously published schemes, and it supports at-speed test.
Study on Chain-based BIST Architecture of LUTs in FPGA
ZHANG Shuang-yue, LI Shuo, WANG Hong and YANG Shi-yuan
Computer Science. 2014, 41 (5): 37-40.  doi:10.11896/j.issn.1002-137X.2014.05.008
Abstract PDF(339KB) ( 811 )   
References | Related Articles | Metrics
BIST-based FPGA test uses the resources in the chip under test to build the TPG and ORA, reducing the usage of I/O pins and ATE. In traditional BIST methods for FPGA, only fault detection of the CUT of the BIST architecture is considered, and the chip needs to be reconfigured several times to change different parts into the CUT. In order to improve the efficiency of a chain-based BIST method for LUTs in FPGA, this article tried to find more suitable architectures and testing configurations of the TPG which can reach a higher fault coverage of the TPG part.
Multiway Tree-based Group Key Management Scheme for Multi-privileged Group Communications
XU Yang, ZHOU Wei, DU Qiu-shuang and WANG Guo-jun
Computer Science. 2014, 41 (5): 41-45.  doi:10.11896/j.issn.1002-137X.2014.05.009
Abstract PDF(437KB) ( 758 )   
References | Related Articles | Metrics
In multi-privileged group communications, since users can access multiple data resources according to their different privileges, security issues become more difficult to solve than in traditional group communications. Therefore, this paper proposed a novel centralized group key management scheme for multi-privileged environments. The proposed scheme employs a multiway tree to construct a key graph and assigns a unique ID to every node in the key graph, so that the relationship between keys can be deduced from an ID, which helps locate the affected keys efficiently. As a result, when membership changes dynamically, the related users can update the affected keys through previous keys or with rekeying material by using a one-way function. Theoretical analysis and experimental simulation results show that the proposed scheme can reduce the storage and rekeying overhead efficiently, and it outperforms some previous schemes. Meanwhile, forward and backward security is also guaranteed.
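The one-way-function rekeying step mentioned above can be sketched as follows: the server broadcasts public rekeying material, and only members who hold the current key can derive the next one. This is a generic illustration using SHA-256, not the paper's concrete construction; the key and material values are made up.

```python
import hashlib

def one_way(key: bytes, material: bytes) -> bytes:
    # One-way rekeying: derive the new group key from the previous key and
    # public rekeying material. Going backwards would require inverting
    # SHA-256, so an evicted member (without `key`) learns nothing.
    return hashlib.sha256(key + material).digest()

old_key = b"group-key-v1"          # held only by current members
material = b"rekey-round-7"        # broadcast in the clear on a join/leave
new_key = one_way(old_key, material)
print(new_key.hex())
```

Locating which keys in the graph to rekey is what the paper's node-ID scheme accelerates; the derivation itself stays this cheap.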
Rollback Recovery Algorithm for Virtual Machine Based on Quasi-synchronous Checkpointing
ZHANG Zhan, ZUO De-cheng, HUANG You-fu and HE Hui
Computer Science. 2014, 41 (5): 46-49.  doi:10.11896/j.issn.1002-137X.2014.05.010
Abstract PDF(377KB) ( 708 )   
References | Related Articles | Metrics
Considering the characteristics of virtual machines on cloud platforms, a quasi-synchronous checkpointing algorithm with selective message logging for virtual machines (VM_QSC for short) was presented. The algorithm keeps the inherent optimized checkpoint interval of the VM nodes, selectively stores stable optimistic message logs, and maintains the consistency of the whole VM system via the hypervisor on the physical machines. Performance evaluation and simulation results show that, in contrast with traditional communication-induced checkpointing and coordinated checkpointing, VM_QSC maintains checkpointing autonomy and saves communication and storage cost. It is well suited to cloud platforms for managing virtual resources and migrating virtual machines.
Study on Quasi-perfect Maximum Distance Pseudo Random Testing
WU Sheng-feng, WU Yue and XU Shi-yi
Computer Science. 2014, 41 (5): 50-54.  doi:10.11896/j.issn.1002-137X.2014.05.011
Abstract PDF(507KB) ( 748 )   
References | Related Articles | Metrics
This paper improved maximum distance random testing, essentially based on a quantitative analysis of the maximum distance between two test patterns in pseudorandom testing for VLSI. The test sequence generated by the proposed algorithm, called the quasi-perfect maximum distance testing algorithm, can reach both maximum Hamming distance and quasi-maximum Cartesian distance, so that each test pattern may detect as many different faults as possible. The idea of this algorithm for generating the new test sequence was described in detail. Experimental results on the ISCAS'85 benchmarks indicate that this approach can greatly increase the efficiency of pseudorandom testing.
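The Hamming-distance half of the idea is easy to see in code: the bitwise complement of an n-bit pattern is at the maximum possible Hamming distance n, so alternating a pattern with its complement maximizes per-step Hamming distance. The sketch below shows only that baseline; the quasi-maximum Cartesian-distance part of the paper's algorithm is not reproduced here.

```python
def max_hamming_sequence(seed: int, width: int, count: int):
    # Alternate each pattern with its bitwise complement: every consecutive
    # pair is at Hamming distance `width`, the maximum for `width`-bit words.
    mask = (1 << width) - 1
    pats = [seed & mask]
    for _ in range(count - 1):
        pats.append(pats[-1] ^ mask)   # complement
    return pats

def hamming(a: int, b: int) -> int:
    # Number of differing bit positions.
    return bin(a ^ b).count("1")

seq = max_hamming_sequence(0b1010, 4, 4)
print([f"{p:04b}" for p in seq])   # ['1010', '0101', '1010', '0101']
print(hamming(seq[0], seq[1]))     # 4
```

The paper's contribution is to break the degeneracy visible above (the sequence repeats after two patterns) by also driving the Cartesian distance toward its quasi-maximum.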
Improving Fault Coverage by Adopting Different Sensitization Criterion for Faster Than At-Speed Testing
WEI Jian-long and KUANG Ji-shun
Computer Science. 2014, 41 (5): 55-58.  doi:10.11896/j.issn.1002-137X.2014.05.012
Abstract PDF(955KB) ( 1016 )   
References | Related Articles | Metrics
Test generation methods for small delay defects (SDDs) not only require low algorithmic complexity, but also a greater possibility of detecting small delays. Faster-than-at-speed testing avoids detecting along the longest sensitization paths, which is inefficient. It requires test patterns to be carefully classified into groups according to the delay of their sensitization paths, and each group is applied at a certain clock frequency. It then adopts a path selection method to identify paths of a certain length quickly and accurately, which can achieve high test quality. At the same time, this paper first proposed that choosing the single-path sensitization criterion for short paths and the non-robust sensitization criterion for critical paths can improve the transition delay fault (TDF) coverage of the nodes that contain small delay defects, at the cost of a little extra time compared with adopting a single sensitization criterion. Experimental results on ISCAS'89 benchmark circuits show that the proposed method can achieve higher transition delay fault coverage of SDDs with low CPU time.
Reconfigurable Tolerance Method for Multi-processor System
ZHANG Shao-lin, YANG Meng-fei, LIU Hong-jin, JIANG Hong and WANG Ruo-chuan
Computer Science. 2014, 41 (5): 59-63.  doi:10.11896/j.issn.1002-137X.2014.05.013
Abstract PDF(699KB) ( 769 )   
References | Related Articles | Metrics
Because of advancing space applications, such as the second-generation navigation system, manned spaceflight and deep space missions, a requirement for higher-performance On-Board Computers (OBCs) has arisen, and ordinary OBCs face an upgrade of power consumption and radiation-hardening capability. Based on adaptive reconfiguration technology and System-on-Chip design methods, this paper presented a reconfigurable fault tolerance method for MPSoC. High reliability and expansibility can be achieved at the hardware level, which makes great sense for future space applications and military missions. After introducing the basic concepts of MPSoC and reconfigurable techniques, several representative cases were presented to show the development of reconfigurable processors. Then the basic infrastructure of our dynamic reconfigurable tolerance system was discussed. Moreover, a tolerance strategy based on system degrading was presented in detail. A contrast analysis among single-core designs, normal designs and our method was provided. Besides, using LEON3 to implement the processing elements, a simulation validation of the tolerance module's efficiency was given, and the test results show that the control mechanism works well. Finally, future related work was discussed and the proposed reconfiguration tolerance method was concluded.
Optimal Multi-objective Monitoring Resources Allocation in Distributed Systems
HE Pan, YUAN Yue and WU Kai-gui
Computer Science. 2014, 41 (5): 64-67.  doi:10.11896/j.issn.1002-137X.2014.05.014
Abstract PDF(431KB) ( 765 )   
References | Related Articles | Metrics
Given the loosely coupled and dynamically configured nature of distributed systems, multi-objective monitoring resource allocation oriented at optimal monitoring interval selection was studied to achieve a balance between resource optimization and reliability maintenance. To propose the allocation model, first of all, Markov chain theory was employed to analyze the reliability of systems under a monitoring mechanism. Secondly, two different kinds of monitoring resource cost models were analyzed. On top of that, a multi-objective monitoring resource allocation model was built under the system reliability constraint. This model chooses the appropriate monitoring rate for each component by minimizing the monitoring cost. Finally, a genetic algorithm was used to solve the allocation and optimization model. Experimental results prove the necessity and impact of monitoring resource allocation in reliability optimization. The results also show that monitoring resource allocation can help to achieve resource optimization and reliability maintenance. Compared with single-objective resource allocation, multi-objective resource allocation approaches can obtain better optimization results.
Improved Collaborative Filtering Recommendation Algorithm of Similarity Measure
WEN Jun-hao and SHU Shan
Computer Science. 2014, 41 (5): 68-71.  doi:10.11896/j.issn.1002-137X.2014.05.015
Abstract PDF(330KB) ( 785 )   
References | Related Articles | Metrics
The collaborative filtering algorithm is one of the most important technologies in electronic commerce recommendation systems. The accuracy of a recommendation system directly depends on the effectiveness of its similarity measure. Traditional similarity measures mainly focus on the similarity of the items two users have both rated, but ignore the relationship between those common items and all the items each user rates. This relationship can be calculated with the Tanimoto coefficient. However, the Tanimoto coefficient is based on binary data, and directly applying it in a recommendation system does not give satisfactory results. Aiming at these problems, an improved Tanimoto coefficient was proposed, and the relationship between the users' common rated items and all the items each user rates was blended into the traditional similarity measures. Experiments show that, to a certain extent, the proposed collaborative filtering algorithm is more effective and accurate.
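The base Tanimoto (Jaccard) coefficient mentioned above is simply the ratio of the items two users have both rated to all distinct items either has rated. The sketch below shows only this base coefficient, not the paper's improved form; the item IDs are made up.

```python
def tanimoto(items_u: set, items_v: set) -> float:
    # Tanimoto (Jaccard) coefficient over the item sets two users rated:
    # |intersection| / |union|. Binary: it ignores the rating values.
    inter = len(items_u & items_v)
    union = len(items_u | items_v)
    return inter / union if union else 0.0

u = {"i1", "i2", "i3", "i4"}
v = {"i3", "i4", "i5"}
print(tanimoto(u, v))   # 2 common items out of 5 distinct -> 0.4
```

The paper's improvement blends a coefficient of this kind into a rating-based similarity (such as Pearson correlation) so that two users with few co-rated items are not treated as highly similar by accident.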
Applying PSO to Multi-objective Test Cases Prioritization
CHEN Yun-fei, LI Zheng and ZHAO Rui-lian
Computer Science. 2014, 41 (5): 72-77.  doi:10.11896/j.issn.1002-137X.2014.05.016
Abstract PDF(524KB) ( 718 )   
References | Related Articles | Metrics
It may be impossible to re-execute the whole test suite in regression testing as software grows in size. In such a case, prioritizing test cases is vital to software regression testing. Test case prioritization is a technique for searching for the best execution sequence of test cases. In regression testing for real projects, single-objective test case prioritization has gradually been replaced by multi-objective prioritization, where the application of evolutionary algorithms (EAs) to multi-objective optimization problems is a hot spot in current research. However, EAs are based on population genetic iterations in which the mechanism of exchanging information among populations is relatively complex, so the efficiency of EA-based test suite prioritization declines sharply as the population scale increases. To address this problem, the paper presented a test suite prioritization technique based on Particle Swarm Optimization, and proposed the corresponding particle representation and position and velocity update methods. Empirical studies were conducted to study the effects of different particle update methods and swarm sizes. Compared to NSGA-II based test case prioritization, the proposed technique is more efficient at prioritizing test suites for real programs.
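The canonical PSO velocity and position update that such a technique builds on is shown below. The paper defines its own particle encoding for test-case sequences; here positions are plain real vectors purely for illustration, and the inertia and learning coefficients are typical textbook values.

```python
import random

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random.random):
    # Canonical per-dimension PSO update:
    #   v' = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);   x' = x + v'
    # pbest is this particle's best known position, gbest the swarm's best.
    new_vel = [w * v + c1 * rng() * (pb - x) + c2 * rng() * (gb - x)
               for x, v, pb, gb in zip(pos, vel, pbest, gbest)]
    new_pos = [x + v for x, v in zip(pos, new_vel)]
    return new_pos, new_vel

# Deterministic demo with r1 = r2 = 1:
pos, vel = pso_step([0.0], [1.0], [2.0], [4.0], rng=lambda: 1.0)
print(pos, vel)   # both ~ [9.7]: 0.7*1 + 1.5*(2-0) + 1.5*(4-0)
```

Unlike a genetic algorithm, each particle only reads its own best and the global best, which is why the information-exchange overhead stays low as the swarm grows.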
k-Hop Backtracking Trusted QoS Rerouting Mechanism
YANG Lei,WANG Xing-wei and HUANG Min
Computer Science. 2014, 41 (5): 78-81.  doi:10.11896/j.issn.1002-137X.2014.05.017
Abstract PDF(670KB) ( 696 )   
References | Related Articles | Metrics
Because of link or node failures, rerouting is necessary in trusted networks. Considering users' QoS (Quality of Service) requirements, trust requirements and the reduction of algorithm cost, a k-hop backtracking trusted QoS rerouting mechanism was proposed. The mechanism backtracks hop by hop from the node preceding the failed node (or link), with k as the maximum hop count for rerouting, and reuses the original links of the path as much as possible. The network model and a user trust evaluation model were constructed. Besides, the description of users' requirements, the calculation methods of user satisfaction and the criterion for path judgment were given. The simulation results show that the proposed mechanism is both feasible and effective. In contrast to existing mechanisms, the rerouting success rate and user satisfaction degree are higher and the rerouting time is shorter, with users' demands satisfied.
Image Retrieval Based on Multi-probe Locality Sensitive Hashing and Word Map Chain Voting
XU Zhe,CHEN Fu-cai,LI Shao-mei and LI Xing
Computer Science. 2014, 41 (5): 82-85.  doi:10.11896/j.issn.1002-137X.2014.05.018
Abstract PDF(430KB) ( 864 )   
References | Related Articles | Metrics
The BoVW (bag of visual words) method based on Euclidean locality sensitive hashing (E2LSH) suffers from high memory cost, low retrieval accuracy when the background changes markedly, and reduced efficiency as the database grows. To solve these problems, a fast retrieval method based on word-map-chain voting with Hamming embedding was presented, built on clustering the feature points through multi-probe locality sensitive hashing. The method constructs a single-table visual dictionary with multi-word mapping and soft assignment, reducing the size of the dictionary to cut memory consumption. Then a word map chain is constructed with Hamming embedding, and a weighting function is proposed to increase retrieval accuracy. A weighted vote over the word map chains of the matching results completes the image retrieval. Experimental results show that this method can effectively reduce memory consumption, improve retrieval accuracy, and is applicable to large-scale datasets.
Server Push Technology Based on Fuzzy Fusion Decision-making
JIANG Qian-yue and ZHANG Ya-ying
Computer Science. 2014, 41 (5): 86-90.  doi:10.11896/j.issn.1002-137X.2014.05.019
Abstract PDF(415KB) ( 708 )   
References | Related Articles | Metrics
Traditional server push technology cannot change the push method for different user situations. A new server push technology called compound polling was introduced, which combines polling with long polling based on a fuzzy multiple-criteria alignment program. Applying this technology to a traffic monitoring platform satisfies multi-user real-time requirements as well as full usage of server resources when announcing real-time information.
3-layer Cooperative Detection Algorithm for Multi-dimensional Events in WSN
WANG Hao-yun,LIU Jiao-jiao,FANG He-he,REN Shou-gang and XU Huan-liang
Computer Science. 2014, 41 (5): 91-96.  doi:10.11896/j.issn.1002-137X.2014.05.020
Abstract PDF(525KB) ( 622 )   
References | Related Articles | Metrics
A 3-layer cooperative detection algorithm for multi-dimensional events in wireless sensor networks was proposed. Sensors detect abnormality through the similarity of mean vector sequences and confirm the occurrence of events with a voting mechanism. On the basis of the similarity of boundary vector sequences, cluster heads analyze multi-dimensional event data using a modified K-means algorithm. Sinks match the types of known events using the distribution probability of event data. The results of theoretical analysis and simulation experiments indicate that, compared with traditional centralized algorithms, this 3-layer cooperative detection algorithm for multi-dimensional events can improve detection precision under noise interference, reduce data traffic and computational complexity, and prolong the network lifetime of wireless sensor networks.
Design and Realization of High-available Controlling Integrated Double-redundant Ethernet Scheme
WANG Yi-ming,ZHANG Ke and CHEN Long
Computer Science. 2014, 41 (5): 97-101.  doi:10.11896/j.issn.1002-137X.2014.05.021
Abstract PDF(973KB) ( 1258 )   
References | Related Articles | Metrics
This paper presented a scheme for a highly available, controllable redundant Ethernet, which sends the same data packets through two network cards to avoid losing information when packets are lost in the network. A frame redundancy label is added before the Ethernet user data, used for marking redundant and non-redundant data and distinguishing different frames of redundant data, which achieves zero recovery time when network problems happen. Network diagnosis is introduced in the scheme, which monitors the network's state through software in the nodes that sends, receives and processes UDP messages. Winsock SPI and API are used respectively to achieve redundant communication and network diagnosis, in order to remain transparent to the application layer as well as guarantee the network's high availability and usability. Experiments show that the LAN can tolerate multiple faults from nodes to switches and increase the data sent in the network, consequently achieving higher fault tolerance.
Research on QoE-based Bandwidth Allocation Mechanism for VoIP
YANG Qiu-ling,JIN Zhi-gang and HUANG Xiang-dang
Computer Science. 2014, 41 (5): 102-106.  doi:10.11896/j.issn.1002-137X.2014.05.022
Abstract PDF(407KB) ( 658 )   
References | Related Articles | Metrics
This paper used Proportional-Integral-Derivative (PID) control theory to allow a controlled trade-off between the bandwidth requirement of a VoIP data stream and its Quality of user Experience (QoE). Quantifying this perception in a bandwidth reallocation model, it proposed a QoE-based bandwidth allocation mechanism for VoIP (QBAV), which maximizes fairness between users while targeting the expected QoE. The paper proved the global optimality of the algorithm. The simulation shows that, compared with the NRG algorithm and the latest FC-MDI-S algorithm, the proposed algorithm can satisfy about 90% of users while cutting bandwidth by 9% and 15% respectively. Existing algorithms rely on a few high-priority data streams; by avoiding this problem, the proposed algorithm enhances the overall performance of VoIP and network efficiency.
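The discrete PID law underlying such a controller is standard: the control output is a weighted sum of the current error, its accumulated integral and its rate of change. The sketch below is a generic textbook controller, not QBAV itself; the gains and the MOS-style target are made-up illustrations.

```python
class PID:
    # Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt.
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt          # accumulate for the I term
        deriv = (err - self.prev_err) / self.dt # slope for the D term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hypothetical use: nudge allocated VoIP bandwidth toward a target quality score.
pid = PID(kp=0.5, ki=0.1, kd=0.05)
print(pid.update(4.0, 3.0))   # first correction for an error of 1.0: 0.65
```

In a QoE-driven allocator, the measured value would come from a perceptual quality estimate and the output would adjust each stream's bandwidth share.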
Vehicular Ad hoc Networks Routing Protocol Research Based on Geographic Position
MA Zhi-xin,LIU Hai-ying and XIE Xian-zhong
Computer Science. 2014, 41 (5): 107-110.  doi:10.11896/j.issn.1002-137X.2014.05.023
Abstract PDF(937KB) ( 968 )   
References | Related Articles | Metrics
Since vehicular ad hoc networks have special node types and channel characteristics, traditional ad hoc network routing protocols cannot achieve satisfactory performance in them. In order to realize high-speed and reliable data transmission, emerging routing algorithms need to be studied. Geographic-location-aided routing based on greedy algorithms is the mainstream of VANET routing. This paper mainly studied routing protocols based on geographic location and improved the GPSR (Greedy Perimeter Stateless Routing) protocol. It introduced the concept of vectors to improve GPSR's greedy forwarding mode, i.e., the choice of the next-hop node considers not only the distance from the destination node but also intersection nodes in the urban environment. Moreover, it added predictive models to predict the movement of vehicles at intersections and improve the efficiency of the routing protocol.
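For reference, plain GPSR greedy forwarding (the baseline the paper improves) simply picks, among neighbors strictly closer to the destination than the current node, the one closest to it. The sketch below shows only that baseline; the paper's direction-vector and intersection weighting is not reproduced, and all coordinates are made up.

```python
import math

def greedy_next_hop(current, neighbors, dest):
    # GPSR greedy mode: forward to the neighbor geographically closest to
    # the destination, provided it makes progress (is closer than we are).
    # Returns None on a local maximum, where GPSR falls back to perimeter mode.
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    closer = [n for n in neighbors if dist(n, dest) < dist(current, dest)]
    return min(closer, key=lambda n: dist(n, dest)) if closer else None

cur = (0.0, 0.0)
dst = (10.0, 0.0)
nbrs = [(1.0, 1.0), (3.0, 0.5), (-2.0, 0.0)]
print(greedy_next_hop(cur, nbrs, dst))   # (3.0, 0.5): the neighbor nearest dst
```

The paper's refinement would bias this choice toward intersection nodes and the motion vector of candidate vehicles rather than using raw distance alone.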
Chinese Analogy Retrieval Using SVM
LIANG Chao,LV Zhao and GU Jun-zhong
Computer Science. 2014, 41 (5): 111-115.  doi:10.11896/j.issn.1002-137X.2014.05.024
Abstract PDF(486KB) ( 633 )   
References | Related Articles | Metrics
With the development of the Internet, the problem of being unable to acquire information about unknown domains because exact keywords cannot be entered has become more common. As a new retrieval method that acquires knowledge of unknown domains using knowledge of known domains, analogy retrieval has gradually become a hot topic. Analogy retrieval first analyzes the potential relationships between pairs of words and then accurately returns target information using these relationships. For example, given an analogy query Q={A:B,C:?}, it is assumed that there are some potential relationships between A and B. The aim of analogy retrieval is to determine the target(s) D of ?, such that the relationships between the two pairs of words, A and B, and C and D, are similar. Two key difficulties of analogy retrieval are: (1) mining relationships between two words and (2) extracting target words. Both are more challenging in Chinese. This paper proposed an SVM-based Chinese Analogy Retrieval method (SVMbCAR) with two main components, SVM-based relation-word extraction and SVM-based target-word determination. Experiments on a real-life data set (600 person entity pairs from Ren Li Fang) show that the accuracy of extracting relationships between two words is 82.3%, and the accuracy of extracting target words is 90.5%.
Algorithm of Network Coding Based Multipath Opportunistic Routing for Wireless Networks
HAN Li and QIAN Huan-yan
Computer Science. 2014, 41 (5): 116-119.  doi:10.11896/j.issn.1002-137X.2014.05.025
Abstract PDF(334KB) ( 687 )   
References | Related Articles | Metrics
We considered wireless mesh networks and exploited the inherent broadcast nature of the wireless medium by making use of opportunistic multipath routing. We presented an optimization framework that enables us to derive optimal flow control, routing and scheduling schemes, where network coding is used to ease the routing problem. The simulation shows that in a dense network with multiple flows, the algorithm can achieve a higher throughput improvement and a fairer allocation of bandwidth compared with other protocols of the same genre.
Tongyici Cilin-based Mapping Approach for Large-scale Chinese Ontology
WANG Ting,DI Rui-hua and LI Wei-ming
Computer Science. 2014, 41 (5): 120-123.  doi:10.11896/j.issn.1002-137X.2014.05.026
Abstract PDF(353KB) ( 922 )   
References | Related Articles | Metrics
Ontology mapping is an important way to solve the problem of ontology heterogeneity. Chinese knowledge is an important part of online knowledge bases, but there is still a lack of systems for large-scale Chinese ontology mapping. When facing large-scale ontology mapping tasks, current Chinese ontology mapping systems have low efficiency and limited availability. To solve the Chinese ontology mapping problem, this paper put forward a Tongyici Cilin-based ontology mapping architecture: it first presented a data-field-based method to compress large-scale Chinese ontologies, secondly proposed a Tongyici Cilin-based algorithm for Chinese equivalent-class discovery, thirdly implemented a prototype system for large-scale Chinese ontology mapping, and finally compared the system with other typical similarity computing algorithms. The results prove that it has higher overall performance and usability.
Study on the Control Optimization of Delay and Rate in Multiple-priority Rate-adjustable Queues
YANG Tian-ming and LIU Jing-ning
Computer Science. 2014, 41 (5): 124-128.  doi:10.11896/j.issn.1002-137X.2014.05.027
Abstract PDF(435KB) ( 733 )   
References | Related Articles | Metrics
Optimally controlling delay and service rate is a complicated problem in queueing networks. For multi-class priority queues and rate-adjustable M/G/1 queues, two convex optimization problems were studied, i.e., minimizing convex functions of the average delay vector, and minimizing average service cost, both under per-class delay constraints, and an optimization algorithm was consequently proposed for each of them. These algorithms use virtual queue techniques to solve the two problems with variants of dynamic cμ rules. The algorithms then adaptively choose a strict priority policy in every busy period in response to past observed delays in all job classes. Lyapunov drift analysis and simulation results validate the optimal performance of the two algorithms, and show that the proposed policies require limited or no statistics of the queue.
Multimedia DRM System for Android Platforms
WANG Zhen,ZHANG Zhi-yong and CHANG Ya-nan
Computer Science. 2014, 41 (5): 129-132.  doi:10.11896/j.issn.1002-137X.2014.05.028
Abstract PDF(949KB) ( 622 )   
References | Related Articles | Metrics
For digital rights management of mobile multimedia on mobile terminals,a Digital Rights Management (DRM) approach for Android platforms was proposed.The solution adopts the 3DES encryption algorithm and binds the digital license to the device hardware to implement usage control and secure display of multimedia contents.Meanwhile,sharing digital rights between devices is supported in this approach.A prototype confirms that the solution offers high security and fast encryption.Moreover,its usage-control functions can meet the practical requirements for digital rights management on Android platforms.
Provably Secure Multi-proxy Multi-signature Scheme with Different Proxy Groups
LIU Dan-ni,WANG Xing-wei and HUANG Min
Computer Science. 2014, 41 (5): 133-136.  doi:10.11896/j.issn.1002-137X.2014.05.029
Abstract PDF(375KB) ( 631 )   
References | Related Articles | Metrics
In most of the existing multi-proxy multi-signature (MPMS) schemes,the same proxy group is delegated the right to sign by all the original signers.Nevertheless,in many practical applications,an original signer often needs to designate a proxy group in his own organization that differs from the others',which is seldom considered in MPMS schemes.In this paper,we proposed an MPMS scheme with different proxy groups.Furthermore,in our scheme,only a group of specified verifiers can authenticate the final proxy signature.To prove the security of the new scheme,we improved a security model and showed that the scheme is secure under the computational Diffie-Hellman assumption.Compared with the previous scheme,the new one offers tighter security and better computational efficiency.
High-performance Data Privacy Protection for Cloud
SUN Xin-wei,ZHANG Wei and XU Tao
Computer Science. 2014, 41 (5): 137-142.  doi:10.11896/j.issn.1002-137X.2014.05.030
Abstract PDF(736KB) ( 685 )   
References | Related Articles | Metrics
With the rapid development of cloud computing and cloud storage technology,more and more businesses and individuals use cloud storage to store or back up data.When uploading private data to the cloud,users lose absolute control over the data,so data privacy protection becomes a problem that cloud storage has to solve.To solve this problem,BSBC (Bit Split Bit Combine),a new data privacy protection method,was presented.Before uploading,BSBC splits the data bit by bit and reassembles the bits into a number of data files,then uploads them to cloud storage servers;when downloading,BSBC fetches all the data files and,through bit combination,reverts them to the original file.Experiments show that this method can protect the privacy of users' data and obtain a 17~35 times performance improvement over traditional encryption.Assembly language is then used to optimize the core bit-split and bit-combine code,with instruction scheduling to reduce data conflicts and pipeline stalls.Eventually,compared with traditional encryption,BSBC can get a 25~35 times performance improvement.
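The bit-split/bit-combine step can be illustrated with a toy Python sketch (our own simplification, not the paper's BSBC implementation: each byte's eight bit positions are split into eight separate streams and later recombined):

```python
def bit_split(data: bytes) -> list:
    """Split a byte stream into 8 streams: stream i holds bit i of every
    input byte, packed 8 bits per output byte (MSB first)."""
    streams = []
    for i in range(8):
        out, acc, count = bytearray(), 0, 0
        for b in data:
            acc, count = (acc << 1) | ((b >> i) & 1), count + 1
            if count == 8:
                out.append(acc)
                acc, count = 0, 0
        if count:                       # pad the final partial byte with zeros
            out.append(acc << (8 - count))
        streams.append(bytes(out))
    return streams

def bit_combine(streams: list, length: int) -> bytes:
    """Reassemble the original `length` bytes from the 8 bit streams."""
    out = bytearray()
    for k in range(length):
        byte = 0
        for i in range(8):
            bit = (streams[i][k // 8] >> (7 - (k % 8))) & 1
            byte |= bit << i
        out.append(byte)
    return bytes(out)
```

A single stream reveals only one bit plane of every byte, which is why all files must be fetched to reconstruct the data; the claimed speedup comes from replacing cipher arithmetic with such cheap bit moves.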
Heuristic Attack Strategy Against Improved LMAP+ Protocol
WANG Chao,QIN Xiao-lin and LIU Ya-li
Computer Science. 2014, 41 (5): 143-149.  doi:10.11896/j.issn.1002-137X.2014.05.031
Abstract PDF(597KB) ( 622 )   
References | Related Articles | Metrics
With the extensive usage of radio frequency identification (RFID) systems,security challenges are emerging.To reduce the computational cost of tags,a series of ultralightweight RFID authentication protocols that only involve simple bit-wise operations like AND,OR,XOR and rotation have attracted much attention.But the existing ultralightweight RFID authentication protocols are unable to guarantee strong security properties.A heuristic attack strategy based on the simulated annealing algorithm was proposed to recover the secret data of the improved LMAP+ protocol presented by Gurubani in 2012.Within the untraceability model proposed by Juels and Weis,a traceability attack was undertaken based on our heuristic attack strategy,and the secret values of improved LMAP+ could be disclosed successfully after repeated runs of the attack.The experiments show that,through a passive attack that merely eavesdrops on the messages exchanged between the reader and the tag,the recovered secret values are approximately equal to the genuine ones.In a full-disclosure attack experiment,the secret data were cracked completely with a 70 percent probability.At the same time,the attack process converges fast with the desired effectiveness.
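The heuristic search itself can be sketched with a toy simulated annealing loop (illustrative only: the 16-bit "secret", the nonces and the rotation-XOR message function below are our stand-ins for the real LMAP+ messages, not the protocol itself):

```python
import math
import random

W = 16                                  # toy secret width in bits
MASK = (1 << W) - 1

def rotl(x, r):                         # left rotation, a typical
    return ((x << r) | (x >> (W - r))) & MASK   # ultralightweight operation

def observe(secret, nonce):             # stand-in for an eavesdropped message
    return secret ^ rotl(nonce, 3)

def anneal(messages, nonces, t0=5.0, cooling=0.999, steps=30000, seed=7):
    """Minimize the Hamming disagreement between predicted and observed
    messages by flipping one candidate-secret bit per step."""
    rng = random.Random(seed)
    cost = lambda k: sum(bin(observe(k, n) ^ m).count("1")
                         for n, m in zip(nonces, messages))
    cand = rng.getrandbits(W)
    c, t = cost(cand), t0
    best, best_c = cand, c
    for _ in range(steps):
        nb = cand ^ (1 << rng.randrange(W))        # flip one bit
        nc = cost(nb)
        if nc < c or rng.random() < math.exp((c - nc) / max(t, 1e-9)):
            cand, c = nb, nc
            if c < best_c:
                best, best_c = cand, c
                if best_c == 0:
                    break
        t *= cooling
    return best
```

The cost function only needs passively collected message pairs, which mirrors the eavesdropping-only attack setting described in the abstract.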
Grid Resource Allocation Based on Mixed Combinatorial Double Auction
XIAO Ying-chun,WANG Han-wu and LI Meng-xiong
Computer Science. 2014, 41 (5): 150-154.  doi:10.11896/j.issn.1002-137X.2014.05.032
Abstract PDF(530KB) ( 807 )   
References | Related Articles | Metrics
In combinatorial double auction sessions,a node acting as a resource provider or customer plays a different role in each single auction.Regarding this,we proposed a mixed combinatorial double auction scheme which enables a node to be not only a resource provider but also a customer.Under such a context we further adopted a credit mechanism in our resource allocation scheme,due to its positive effect on resource allocation security and quality of service.Specifically,we used trust as a parameter to determine and adjust the price of a resource,and further derived the optimal resource allocation scheme.The simulation results show that the proposed algorithm outperforms previous methods in preventing malicious nodes,as well as in improving the transaction rate and auction efficiency under positive incentives.
New Trust Based Access Control Model in Hadoop
LIU Sha and TAN Liang
Computer Science. 2014, 41 (5): 155-163.  doi:10.11896/j.issn.1002-137X.2014.05.033
Abstract   
References | Related Articles | Metrics
Hadoop is one of the most popular cloud computing platforms.On this platform,the existing access control model adopts Kerberos for identity verification,combines it with an ACL-based authorization mechanism,and uses the Delegation Token and Block Access Token,realizing a simple access control mechanism.This model has an obvious shortcoming:it considers only the identity authenticity of a user while authorizing,not the credibility of the user's subsequent behaviors.Once access is granted,there is no further supervision.This paper proposed a new trust-based access control model for Hadoop,called LT,which builds on the existing Hadoop access control model.LT sets a trust value for each user,updates this value according to the user's behavior records,and uses the trust value to control the user's access to the Hadoop cluster dynamically.Compared with the existing Hadoop access control model,access and authorization in LT are not a one-time grant but a thoroughly real-time and dynamic process,so LT is more secure,more flexible and finer-grained.Experiments show that this model is correct and effective,and overcomes the security weakness of the existing Hadoop access control model:it can control a user's access to and use of the resources supplied by a Hadoop cluster dynamically and effectively.
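The trust-update idea can be sketched in a few lines of Python (a minimal sketch with made-up parameter names and values; the abstract does not give LT's actual update formula):

```python
class TrustController:
    """Each user starts at an initial trust value; good behaviour slowly
    raises it, bad behaviour sharply lowers it, and every request is
    re-checked against a threshold instead of being granted once for all."""
    def __init__(self, initial=0.5, threshold=0.4, reward=0.05, penalty=0.2):
        self.initial, self.threshold = initial, threshold
        self.reward, self.penalty = reward, penalty
        self.trust = {}

    def record(self, user, good):
        t = self.trust.get(user, self.initial)
        t = min(1.0, t + self.reward) if good else max(0.0, t - self.penalty)
        self.trust[user] = t

    def allowed(self, user):
        return self.trust.get(user, self.initial) >= self.threshold
```

Raising trust slowly and lowering it quickly is a common choice in trust models: a user cannot whitewash a misbehaviour record with a few good actions.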
Packet Matching Using Self-adaptive Chemical-reaction-inspired Metaheuristic for Optimization with Probability Distribution
WANG Ze-lin,WU Zhi-jian,YIN Lan and DENG Chang-shou
Computer Science. 2014, 41 (5): 164-167.  doi:10.11896/j.issn.1002-137X.2014.05.034
Abstract PDF(377KB) ( 690 )   
References | Related Articles | Metrics
Packet matching is a research focus of firewall and router devices,and its speed directly influences device performance.Based on sample information from the current population,this paper introduced information entropy and histograms into the population statistics,and used these statistics to dynamically adjust the relevant parameters of the chemical-reaction-inspired metaheuristic optimization algorithm.This paper,for the first time,analyzed the problem from samples of the current population,instead of assuming a distribution for all samples.The experimental results show that the proposed algorithm achieves the desired effects.With dynamic parameter adjustment of the chemical-reaction-inspired metaheuristic,the dependence of performance on the scale of packet matching becomes looser,and the proposed intelligent algorithm is more suitable for packet matching.
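The statistic driving the parameter adaptation can be sketched as follows (a minimal sketch; the bin count and the linear adaptation rule are our own illustrative choices, not the paper's):

```python
import math
from collections import Counter

def population_entropy(fitnesses, bins=10):
    """Histogram the current population's fitness values and return the
    normalized Shannon entropy in [0, 1]: near 1 means a diverse
    population, near 0 means the population has converged."""
    lo, hi = min(fitnesses), max(fitnesses)
    if hi == lo:
        return 0.0
    width = (hi - lo) / bins
    counts = Counter(min(int((f - lo) / width), bins - 1) for f in fitnesses)
    n = len(fitnesses)
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(bins)

def adapt_parameter(base, entropy, low=0.5, high=2.0):
    """Low entropy (converged) -> enlarge an exploration parameter;
    high entropy (diverse) -> shrink it toward exploitation."""
    return base * (high - (high - low) * entropy)
```

Using only the observed sample, rather than an assumed distribution, is exactly the point the abstract emphasizes.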
Trust Model Based on Groups Recommendation in Social Network
ZHANG Feng,WANG Jian,ZHAO Yan-fei and DU He
Computer Science. 2014, 41 (5): 168-172.  doi:10.11896/j.issn.1002-137X.2014.05.035
Abstract PDF(418KB) ( 679 )   
References | Related Articles | Metrics
In social networks,trust computation,especially between two unknown nodes,has attracted more and more attention.At present most trust models are not accurate enough due to incomplete recommendation evidence.With the growth of communities,the community has become a development trend of today's social networks.In this paper,we first proposed a community recommendation model in substitution for the existing node recommendation model,to improve the integrity and reliability of the recommendation evidence and consequently the accuracy of trust computation between unknown nodes.Secondly,the trust influence of friend groups was considered and a community correlation factor was presented to counter possible community collusion attacks.Finally,simulation results verified the rationality and effectiveness of the proposed method.
Physical Topology Discovery Algorithm for Ethernet Based on Intersection of Multi-subnet
ZENG Guang,CHEN Xing-yuan,DU Xue-hui and WANG Chao
Computer Science. 2014, 41 (5): 173-177.  doi:10.11896/j.issn.1002-137X.2014.05.036
Abstract PDF(1296KB) ( 692 )   
References | Related Articles | Metrics
The main achievements in physical topology discovery for Ethernet were introduced and the shortcomings of those methods were pointed out;then a new algorithm was proposed based on the intersection of multiple subnets.The algorithm uses the minimum requirements on Address Forwarding Tables (AFTs) around the intersection of multiple subnets to infer and establish the connections between switches,and it was derived on a typical multi-subnet topology.Theoretical analysis proves that our method correctly infers the network topology from incomplete AFTs,where other methods fail,and has low communication and computational overheads.The algorithm is appropriate for discovering the physical topology of large,heterogeneous Ethernets that may include multiple subnets as well as uncooperative network elements such as hubs.
Anomaly Detection of Industrial Control System Based on Outlier Mining
CHEN Zhuang,HUANG Yong and ZOU Hang
Computer Science. 2014, 41 (5): 178-181.  doi:10.11896/j.issn.1002-137X.2014.05.037
Abstract PDF(425KB) ( 832 )   
References | Related Articles | Metrics
At present,industrial control systems (ICS) are widely used in electric power,transportation,water conservancy,large-scale manufacturing and national critical infrastructure,and ICS has become an important part of the national security strategy.Attacks against industrial control systems are more and more frequent,yet there are few security products specifically for them.Although most configuration software has variable alarm functions,these only watch single variables and rarely consider overall security.To effectively improve ICS information security protection,considering the specific data and protocols and the highly real-time requirements,this paper proposed ACBOD (Adaptive Clustering-Based Outlier Detection) to analyze the variable data from the OPC server.This method has 4 parts:data acquisition,clustering,cluster identification,and cluster-based outlier detection.The testing results show that this method can effectively find abnormal data in industrial control systems,can also find unknown exceptions,and can greatly improve the safety protection ability of industrial control systems.
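The clustering-then-outlier pipeline can be illustrated with a single-pass sketch (our own simplification of the adaptive clustering idea; the radius and minimum cluster size are hypothetical parameters, not ACBOD's):

```python
import math

def leader_cluster(points, radius):
    """Single-pass clustering: a point joins the first cluster whose leader
    is within `radius`, otherwise it becomes the leader of a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if math.dist(p, c["leader"]) <= radius:
                c["members"].append(p)
                break
        else:
            clusters.append({"leader": p, "members": [p]})
    return clusters

def detect_outliers(points, radius, min_size):
    """Flag every point that ends up in a cluster smaller than `min_size`."""
    return [p for c in leader_cluster(points, radius)
            if len(c["members"]) < min_size
            for p in c["members"]]
```

A single pass over the data is what makes this family of methods compatible with the real-time requirement the abstract mentions.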
Database Watermarking Algorithm Research Based on (t,n) Threshold
WANG Ju-long and CHEN Ji-hong
Computer Science. 2014, 41 (5): 182-185.  doi:10.11896/j.issn.1002-137X.2014.05.038
Abstract PDF(331KB) ( 747 )   
References | Related Articles | Metrics
A new watermarking technology for databases based on a (t,n) threshold was presented in this paper.Firstly,the (t,n) threshold scheme is used to divide the watermark into shadow shares.Secondly,the tuples of the relational database are marked by their hash values.The position for watermark embedding is based on the parity difference between the mark and the most significant bits of the attributes.Finally,the watermark information is embedded into the numerical attributes of the database based on a new matching rule.Experimental results show that this algorithm is safe and effective against various attacks.
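The (t,n) threshold splitting can be illustrated with a textbook Shamir secret-sharing sketch over a small prime field (the paper does not give its exact construction; the prime and the encoding of the watermark as an integer are our illustrative choices):

```python
import random

P = 2**31 - 1          # a Mersenne prime; the field for the toy example

def split_watermark(secret, t, n, seed=42):
    """Evaluate a random degree-(t-1) polynomial with constant term `secret`
    at x = 1..n; any t of the n (x, y) shares reconstruct the secret."""
    rng = random.Random(seed)
    coeffs = [secret % P] + [rng.randrange(1, P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover_watermark(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret
```

Because any t shares suffice, the watermark survives even if an attacker destroys the tuples carrying up to n − t shares, which is the robustness argument behind threshold-based embedding.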
Situation Monitoring Model and Implementation Technique for Autonomous Web Services
CHEN Chao,ZHANG Hong-jun,MAO Xin-jun,YIN Jun-wen and HOU Fu
Computer Science. 2014, 41 (5): 186-189.  doi:10.11896/j.issn.1002-137X.2014.05.039
Abstract PDF(359KB) ( 688 )   
References | Related Articles | Metrics
In order to deal with the open and dynamic Internet environment,deployed services should be enriched with autonomy,and their situation should be monitored and managed effectively,so as to enable flexible application and dynamic adaptation of autonomous Web services.We proposed and realized a situation model and monitoring architecture for autonomous Web services by extending traditional SOA with autonomy.We also designed and implemented a corresponding software platform,and conducted a case study to validate the feasibility of our approach.
Studies on Measurements of Component Approximate Matching
WU Xin-xing,HU Guo-sheng and CHEN Yi-xiang
Computer Science. 2014, 41 (5): 190-195.  doi:10.11896/j.issn.1002-137X.2014.05.040
Abstract PDF(997KB) ( 667 )   
References | Related Articles | Metrics
In practice,components usually satisfy requirements only partially.By comparing the pre- and post-conditions of components with those of requirements,this paper proposed an approach to measure the matching degree of components,in order to quantitatively describe the approximate relationship between components and users' requirements.Furthermore,through three kinds of component composition,it gave the matching degree rules for component-based software.
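As a toy illustration of a matching degree (not the paper's actual measure, which is defined over pre- and post-condition logic), conditions can be modeled as sets and the degree as the satisfied fraction:

```python
def matching_degree(component, requirement):
    """Fraction of the requirement's pre- and post-conditions that the
    component also lists; 1.0 means exact matching, values in (0, 1)
    quantify approximate matching."""
    pre_hit = len(requirement["pre"] & component["pre"])
    post_hit = len(requirement["post"] & component["post"])
    total = len(requirement["pre"]) + len(requirement["post"])
    return (pre_hit + post_hit) / total if total else 1.0
```

A composition rule would then combine the degrees of the composed components, e.g. taking the minimum along a sequential chain, to score a whole component-based design.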
Approach of Transformation from Requirements Models to Software Architecture Models
XIE Zhong-wen,LI Xiao-yan,LI Tong,DAI Fei,YU Qian and ZHANG Xuan
Computer Science. 2014, 41 (5): 196-203.  doi:10.11896/j.issn.1002-137X.2014.05.041
Abstract PDF(702KB) ( 745 )   
References | Related Articles | Metrics
The transformation from requirements models to software architecture (SA) models is a hot topic in software engineering.Based on the ACP (Algebra of Communicating Processes) style requirements models set up by DERM (Dynamic Evolution oriented Requirements Meta-model),an approach for transforming requirements models into software architecture models was proposed.The approach takes Petri-net style SA models as the target of transformation,and behavior mapping as its foundation.Firstly,the framework of the transformation was discussed.Secondly,the nodes of the behavior feature models were transformed into components and connectors of the SA models,and the corresponding transformation rules were brought forward.Thirdly,the transformation of active property features in property feature models was discussed,and a strategy for sub-system partition was put forward.Finally,the feasibility and effectiveness of the proposed method were exhibited through a case study.
Dynamic Slicing Technique of Statechart Specifications
MIAO Chun-yu and CHEN Li-na
Computer Science. 2014, 41 (5): 204-207.  doi:10.11896/j.issn.1002-137X.2014.05.042
Abstract PDF(397KB) ( 759 )   
References | Related Articles | Metrics
As is well known,dynamic slicing is very useful for understanding,analysis and verification of sequential transformational programs,but the classical definition of dynamic slicing is not suitable for Statechart specifications.We first formally defined a semantics model,observable semantics,which describes only the externally observable behavior and conceals the unobservable behavior of Statechart specifications,so it is well suited to dynamic slicing.Then we proposed a new notion of dynamic slicing that is more natural for Statechart specifications,and formally defined the dynamic slicing criterion,a dynamic slice generation algorithm,and the minimal dynamic slice.We also explained how to produce a valid dynamic slicing criterion and proposed a simple and practical approximation algorithm for minimal dynamic slice generation using observable semantics as an intermediate representation.
Technology and Implementation of Extracting Control Flow Model of C Program
YANG Chang-kun and XU Qing-guo
Computer Science. 2014, 41 (5): 208-214.  doi:10.11896/j.issn.1002-137X.2014.05.043
Abstract PDF(520KB) ( 843 )   
References | Related Articles | Metrics
A CFG (Control Flow Graph) represents the execution paths that may be taken in a program,and the majority of static analysis tools obtain their results by extracting the control flow graph from the AST (Abstract Syntax Tree) and then analyzing the CFG.Extracting a correct CFG is the key to building the system model in model checking.This paper presented algorithms for extracting the CFG of functions based on semantic and structural analysis of the AST.With the CFG,some properties of C programs can be analyzed via model checking tools.
Determinant Algorithm for Rough XML Functional Dependency
YIN Li-feng and QIU Zhan-zhi
Computer Science. 2014, 41 (5): 215-218.  doi:10.11896/j.issn.1002-137X.2014.05.044
Abstract PDF(321KB) ( 635 )   
References | Related Articles | Metrics
In order to describe and deal with uncertain XML data,the granular computing method was used to study the determination problem of rough XML functional dependencies.Based on rough set theory,the definitions of the upper and lower approximations of a rough XML tree information system were given,and with the help of a rough similarity relationship,rough XML functional dependency was defined.Next,how to use bit patterns to represent the information values in the rough XML tree information system was studied.A determinant algorithm for the dependencies among paths was presented and its time complexity was analyzed.Finally,instance analysis shows that information values in bit-pattern format are closer to the machine's internal representation;the method can quickly determine rough XML functional dependencies,and the efficiency and speed of the algorithm are improved.
Graph-based Web Entity Ranking Method
XU Yao,ZHAO Zheng-wen,CHEN Qun,LIU Hai-long,DU Jing,HU Jia-qi and LI Zhan-huai
Computer Science. 2014, 41 (5): 219-222.  doi:10.11896/j.issn.1002-137X.2014.05.045
Abstract PDF(617KB) ( 652 )   
References | Related Articles | Metrics
In recent decades,users have tended to get the expected entities directly.Unfortunately,a traditional search engine can only return documents related to the keywords instead of the entities users expect.What's worse,most state-of-the-art entity ranking methods stack weights over several entity-related factors and need much prior knowledge to train the weights.This paper extracted candidate entities from the snippets returned by a search engine and exploited the idea of random walk to propose a graph-based algorithm,PERA (Probabilistic Entity Ranking Algorithm),to rank the candidates without much prior knowledge.The experimental results show that the target entity gets a high ranking score.
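The random-walk ranking can be sketched as power iteration over a weighted entity graph (a minimal sketch; the co-occurrence weights standing in for whatever snippet-derived evidence PERA actually uses are hypothetical):

```python
def rank_entities(graph, damping=0.85, iters=50):
    """PageRank-style scores: a walker follows co-occurrence edges with
    probability `damping` and restarts uniformly otherwise.
    graph[u][v] is the edge weight from entity u to entity v."""
    nodes = list(graph)
    score = {u: 1.0 / len(nodes) for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - damping) / len(nodes) for u in nodes}
        for v in nodes:
            total = sum(graph[v].values())
            if not total:
                continue            # dangling entity: keeps only restart mass
            for u, w in graph[v].items():
                new[u] += damping * score[v] * w / total
        score = new
    return score
```

Entities that many other candidates co-occur with accumulate score, so no supervised weight training is needed, matching the "without much prior knowledge" claim.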
Sorting Based XML Data Exchange Algorithm
REN Ke and YANG Xia
Computer Science. 2014, 41 (5): 223-226.  doi:10.11896/j.issn.1002-137X.2014.05.046
Abstract PDF(407KB) ( 629 )   
References | Related Articles | Metrics
In XML data exchange,XQuery and XSLT transform the XML document in memory in tree form,so they are not very efficient and can only handle small documents.In order to transform large-scale XML documents efficiently,this paper defined the table of a schema and proposed a sorting-based three-phase XML data exchange algorithm.First,the algorithm transforms the XML document into a table;then it sorts the table according to the target schema;finally,it constructs the target XML document from the sorted table.The experiments show that the proposed algorithm can not only transform XML documents efficiently,but also scale to large XML documents.
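The three phases can be illustrated with a small Python sketch (a toy on flat <record> elements; the source layout and sort key below are our assumptions, not the paper's schema tables):

```python
import xml.etree.ElementTree as ET

def exchange(src_xml, sort_key):
    """Phase 1: shred each <record> into a flat row dict (the 'table').
    Phase 2: sort the rows by the target schema's key field.
    Phase 3: rebuild the target document from the sorted rows."""
    rows = [{field.tag: field.text for field in rec}
            for rec in ET.fromstring(src_xml)]
    rows.sort(key=lambda r: r[sort_key])
    root = ET.Element("records")
    for row in rows:
        rec = ET.SubElement(root, "record")
        for tag, text in row.items():
            ET.SubElement(rec, tag).text = text
    return ET.tostring(root, encoding="unicode")
```

On real data phase 2 would be an external (disk-based) sort of the row table, which is what lets the approach scale past memory, unlike a whole-tree XSLT transform.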
Weighted Bayes Based Data Streaming Online Classification Algorithm
LU Hui-lin
Computer Science. 2014, 41 (5): 227-229.  doi:10.11896/j.issn.1002-137X.2014.05.047
Abstract PDF(333KB) ( 717 )   
References | Related Articles | Metrics
Traditional classification algorithms need the whole training dataset before training the model.However,for big data,data stream into the system sequentially,so it is impossible to obtain the whole training dataset beforehand.This paper studied the online classification problem for big data streams.It first formulated online classification as an optimization problem,then proposed a weighted naïve Bayes classifier and an error-adaptive classifier,and at last validated the efficiency of the proposed algorithms on two real datasets.The experiments show that the prediction accuracy of the proposed algorithms is higher than that of related work on noise-free data streams;moreover,on noisy data streams they still achieve better prediction accuracy,so they can be used in real online classification applications over data streams.
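An online naïve Bayes with recency weighting can be sketched as below (the exponential-decay weighting is our stand-in; the abstract does not specify the paper's actual weighting or error-adaptation scheme):

```python
import math
from collections import defaultdict

class OnlineWeightedNB:
    """Streaming multinomial naive Bayes: before each update, all counts
    are multiplied by `fade`, so recent stream examples carry more weight
    and the model adapts as the stream drifts."""
    def __init__(self, fade=0.99):
        self.fade = fade
        self.class_w = defaultdict(float)
        self.feat_w = defaultdict(lambda: defaultdict(float))
        self.vocab = set()

    def learn(self, features, label):
        for c in list(self.class_w):
            self.class_w[c] *= self.fade
            for f in self.feat_w[c]:
                self.feat_w[c][f] *= self.fade
        self.class_w[label] += 1.0
        for f in features:
            self.feat_w[label][f] += 1.0
            self.vocab.add(f)

    def predict(self, features):
        total = sum(self.class_w.values())
        best, best_lp = None, float("-inf")
        for c, cw in self.class_w.items():
            lp = math.log(cw / total)
            denom = sum(self.feat_w[c].values()) + len(self.vocab)
            for f in features:
                lp += math.log((self.feat_w[c][f] + 1.0) / denom)  # Laplace
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

Each example is seen once and then discarded, which is the defining constraint of the streaming setting described in the abstract.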
Explosion Search Algorithm with Conjugate Gradient Operator
CAO Ju,LI Yan-jiao and CHEN Gang
Computer Science. 2014, 41 (5): 230-234.  doi:10.11896/j.issn.1002-137X.2014.05.048
Abstract PDF(399KB) ( 721 )   
References | Related Articles | Metrics
As a global optimization algorithm,the Explosion Search Algorithm (ESA) suffers from low convergence speed and low optimization precision in the later period of optimization.Fortunately,some deterministic optimization algorithms can overcome these shortcomings.Therefore,a derivative-free deterministic algorithm,an approximate conjugate gradient algorithm using difference quotients,was added to ESA.Based on the above,an improved Explosion Search Algorithm with Conjugate Gradient Operator (CGESA) was proposed.In CGESA,a new mutation operator is introduced to enhance the global search ability.Meanwhile,a conjugate gradient operator is introduced to improve the local search ability around the optimal burst point,so that the convergence speed and optimization precision of CGESA are improved.Experimental results on six well-known benchmark functions indicate that CGESA achieves better performance than ESA.
Multi-agent System Coalition Utility Allocation Strategy Based on Loyalty
CAO Yi-qin,ZHANG Zhen and HUANG Xiao-sheng
Computer Science. 2014, 41 (5): 235-238.  doi:10.11896/j.issn.1002-137X.2014.05.049
Abstract PDF(333KB) ( 651 )   
References | Related Articles | Metrics
In order to enhance the rationality of coalition utility allocation among agents and the stability of forming the globally optimal coalition in multi-agent systems,this paper proposed a coalition formation strategy based on loyalty.The new strategy introduces the concept of an agent's loyalty:each time an agent takes part in a coalition,whether the agent leaves the coalition before finishing the task is recorded,and the agent's loyalty is evaluated accordingly.At the same time,the new strategy decides the coalition utility allocation by combining each agent's loyalty with its ability to finish the task.Theoretical analysis and experimental results show that the novel strategy can improve the fairness of utility allocation and achieve a globally optimal solution in a stable,fast and simple way.
A Set of Improved Hermite Kernel Function
TIAN Meng and WANG Wen-jian
Computer Science. 2014, 41 (5): 239-242.  doi:10.11896/j.issn.1002-137X.2014.05.050
Abstract PDF(695KB) ( 804 )   
References | Related Articles | Metrics
The selection of the kernel function and its parameters plays a significant role in support vector machine (SVM) classification algorithms.Based on Hermite polynomials and the triangular kernel function,a new set of kernel functions,the triangular Hermite kernel,was proposed.The triangular Hermite kernel has two parameters:one is determined by the distance between sample points and the sample mean,and the other is chosen only from non-negative integers,so the parameters can be optimized easily.The experimental results on bi-spiral data,checkerboard data and seven UCI data sets indicate that the new kernel achieves competitive classification performance compared with the polynomial kernel,the Gaussian kernel,and the Hermite kernel previously proposed in reference [6].
Method of Multi-attribute Decision Making with Intuitionistic Fuzzy Number Based on Quadratic Programming
ZHANG Shi-fang
Computer Science. 2014, 41 (5): 243-244.  doi:10.11896/j.issn.1002-137X.2014.05.051
Abstract PDF(231KB) ( 740 )   
References | Related Articles | Metrics
A new method was proposed for multi-attribute decision making problems in which the attribute weights are completely unknown and the attribute values are intuitionistic fuzzy numbers.First,some operational laws,the score function and the accuracy function of intuitionistic fuzzy numbers were briefly introduced.Then a quadratic programming model was established and the attribute weights were obtained by solving this model.Moreover,the overall attribute value of every alternative was aggregated by utilizing the intuitionistic fuzzy weighted averaging (IFWA) operator.Finally,the alternatives were ranked and the most desirable one was selected according to the score and accuracy functions.In addition,an example was given to show the practicality and feasibility of the developed approach.
Fast Model of Ensembling Linear Support Vector Machines Suitable for Large Datasets
HU Wen-jun,WANG Juan,WANG Pei-liang and WANG Shi-tong
Computer Science. 2014, 41 (5): 245-249.  doi:10.11896/j.issn.1002-137X.2014.05.052
Abstract PDF(425KB) ( 669 )   
References | Related Articles | Metrics
Although the linear support vector machine (LSVM) algorithm is simple and efficient in training and testing,it cannot be applied to nonlinear datasets.To overcome this drawback,the original training data are split into several subsets and an LSVM is constructed for each of them.Then a nonlinear decision function is fit to handle linear inseparability through a combination of nonlinear radial basis functions (RBFs).Based on this motivation,we developed a new model,called the fast model of ensembling LSVMs (FMELSVM),which is suitable for the classification of large datasets.This model improves the nonlinear capability of LSVMs using RBFs,while the ensembling effect is enhanced by introducing an optimized weight vector.Experimental results on UCI datasets demonstrate that FMELSVM obtains competitive effectiveness on large datasets.
Research on Layered Validation Methods for Military Conceptual Models
GU Chuang,DU Xiao-ming,LIU Bin and WANG Gui-qi
Computer Science. 2014, 41 (5): 250-253.  doi:10.11896/j.issn.1002-137X.2014.05.053
Abstract PDF(364KB) ( 1394 )   
References | Related Articles | Metrics
The validation of conceptual models is still an important and difficult part of verification,validation and accreditation in modeling and simulation.It is hard to validate conceptual models at a single level because of their abundant,various and multi-level contents.A layered validation method for military conceptual models was advanced:the validation is divided into three layers,the system layer,the model layer and the design layer.The validation contents,indexes,processes and methods of each layer were analyzed.The advantages of the advanced method are that it clarifies the validation content and methods of each stage of conceptual modeling,decomposes the complexity of the validation work,and enhances the staging,hierarchy,pertinence and maneuverability of the validation work.
Model Checking for Real-time Systems with Data Constraints
NI Shui-mei,CAO Zi-ning and LI Xin-lei
Computer Science. 2014, 41 (5): 254-262.  doi:10.11896/j.issn.1002-137X.2014.05.054
Abstract PDF(1610KB) ( 635 )   
References | Related Articles | Metrics
Real-time systems with data constraints refer to computing systems with both time-bound and data variable constraints.There are few studies on the specification and verification of a unified model that connects discrete data constraints and continuous time.This article presented a continuous-time ZIA specification,which has both continuous-time constraints and discrete data constraints,and then gave its temporal logic.MARTE is a UML profile for modeling embedded real-time systems;it is widely used in industry,but studies on its model checking are few.The article proposed Z-MARTE by extending MARTE with Z,and converted Z-MARTE into the continuous-time ZIA model,so that model checking of Z-MARTE is realized through model checking of continuous-time ZIA.Finally,an example was given to indicate that this method is feasible and effective.
Outlier Detection of Multivariate Time Series Based on Weighted Euclid Norm
GUO Xiao-fang,LI Feng,SONG Xiao-ning and LIU Qing-hua
Computer Science. 2014, 41 (5): 263-265.  doi:10.11896/j.issn.1002-137X.2014.05.055
Abstract PDF(312KB) ( 776 )   
References | Related Articles | Metrics
In order to improve the precision of anomaly detection for multivariate time series,the series and its principal components are selected based on the cumulative contribution rate of the principal components,and in the k-nearest-neighbor local outlier detection algorithm,the weighted Euclidean norm is used as the k-nearest-neighbor distance,so as to realize anomaly detection of multivariate time series.To verify the effectiveness of the algorithm,anomaly detection was carried out on test data.The experimental results show that the algorithm has advantages over traditional methods in precision and recall.
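The scoring step can be sketched directly (a minimal sketch: the component weights, which the paper derives from the principal components' contribution rates, are hard-coded here for illustration):

```python
import math

def weighted_knn_scores(data, weights, k):
    """Outlier score of each point = its distance to its k-th nearest
    neighbour under the weighted Euclidean norm
    d(a, b) = sqrt(sum_i w_i * (a_i - b_i)^2)."""
    def wdist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(weights, a, b)))
    scores = []
    for i, p in enumerate(data):
        dists = sorted(wdist(p, q) for j, q in enumerate(data) if j != i)
        scores.append(dists[k - 1])   # distance to the k-th nearest neighbour
    return scores
```

Weighting each component by its contribution rate lets the dominant principal directions drive the distance, so noise in minor components inflates the scores less.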
Emotion Recognition Based on Unsupervised Extraction of Facial Expression Spatio-temporal Features
WANG Jin-wei,MA Xi-rong and SUN Ji-zhou
Computer Science. 2014, 41 (5): 266-269.  doi:10.11896/j.issn.1002-137X.2014.05.056
Abstract PDF(622KB) ( 650 )   
References | Related Articles | Metrics
Emotion recognition is the key to solving the absence of emotional communication in intelligent tutoring systems.For the problem of effectively extracting facial expression spatio-temporal features from videos for emotion recognition,a recognition method based on unsupervised feature extraction with a stacked convolutional independent subspace analysis (ISA) model was proposed to recognize the three emotions that most often appear in learning:puzzlement,delight and boredom.This method first detects the face in the video and normalizes it,then adopts the stacked convolutional ISA model to learn,without supervision,facial expression spatio-temporal features from video blocks,and finally uses a linear SVM classifier to recognize the different emotions.Experimental results indicate that this method extracts spatio-temporal expression features more effectively than hand-designed features,achieves a better recognition rate,and meets real-time requirements.
Rapid Algorithm of Chinese High-frequency Repeat Extraction Based on Hierarchical Pruning
ZHANG Hai-jun,LIU Zhan-dong and Munina
Computer Science. 2014, 41 (5): 270-274.  doi:10.11896/j.issn.1002-137X.2014.05.057
Abstract PDF(480KB) ( 842 )   
References | Related Articles | Metrics
To extract high-frequency repeats from large-scale corpora,this paper put forward a hierarchical pruning algorithm that uses a hash table structure,based on low-frequency character filtration and cascade pruning,to filter out low-frequency strings and reduce the number of I/O read and write operations.On this basis,this paper employed an improved string sorting algorithm,which sorts strings in O(n) time,to improve the efficiency of repeat extraction.The experimental data show that this algorithm is an effective method of repeat extraction,in which the number of I/O read and write operations grows linearly with the scale of the corpus.The algorithm can effectively extract repeats from text corpora much larger than the computer's memory and can better support repeat-based applications such as new word identification and term extraction.
Critical Components Identification of Power Infrastructure Systems Based on Complex Network Theory
ZHANG Ying-chun and WANG Shu-liang
Computer Science. 2014, 41 (5): 275-279.  doi:10.11896/j.issn.1002-137X.2014.05.058
Abstract PDF(981KB) ( 737 )   
References | Related Articles | Metrics
As the power network is a typical critical infrastructure system,identifying its key components is of great importance for analyzing and understanding its vulnerability.By utilizing methods of complex network theory,this paper studied the key components from the aspects of complex-network statistical indices and performance variation under disturbance.Key components of the power network were acquired and identified so that the system can be better protected.The results help provide decision support for cascading-failure control and system improvement.
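One of the simplest complex-network statistical indices used for this kind of screening is degree centrality; a pure-Python sketch (node names assumed for illustration) is:

```python
# Sketch of degree centrality over a power-grid graph: buses are nodes,
# transmission lines are edges; highly connected buses are candidate
# key components.
from collections import defaultdict

def degree_centrality(edges):
    """Degree of each node, normalized by n-1 (the maximum possible degree)."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    return {v: d / (n - 1) for v, d in deg.items()}

# A tiny 4-bus grid: bus2 touches every other bus, so it scores highest.
grid = [("bus1", "bus2"), ("bus2", "bus3"), ("bus2", "bus4"), ("bus3", "bus4")]
centrality = degree_centrality(grid)
```

In practice such structural indices are combined with performance-variation analysis (removing a component and measuring the drop in network efficiency), as the abstract indicates.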
Item Correlation Graph Based Collaborative Filtering Algorithm
WANG Li-ping
Computer Science. 2014, 41 (5): 280-282.  doi:10.11896/j.issn.1002-137X.2014.05.059
Abstract PDF(334KB) ( 643 )   
References | Related Articles | Metrics
In e-commerce,accurate recommendation can increase trading volume and thus bring more profit to enterprises.To improve the accuracy of recommender systems,this paper proposed an item correlation graph based collaborative filtering algorithm.Items are modeled as nodes,the number of users who bought both of two commodities is taken as the weight of the edge between them,and an item correlation graph is thereby constructed.Based on this graph,the paper proposed a recommendation algorithm that takes both graph-based similarity and average similarity into consideration.Experiments show that the proposed algorithm has better prediction accuracy and is more effective than related algorithms.
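The graph construction described above can be sketched directly from transaction baskets; the item names below are illustrative:

```python
# Sketch of the item correlation graph: nodes are items, and the weight of an
# edge is the number of users who bought both of the two items it connects.
from collections import defaultdict
from itertools import combinations

def build_item_graph(baskets):
    """Co-purchase graph as a dict mapping sorted item pairs to edge weights."""
    graph = defaultdict(int)
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            graph[(a, b)] += 1  # one more user bought both a and b
    return graph

def co_purchase_weight(graph, a, b):
    return graph[tuple(sorted((a, b)))]

baskets = [["milk", "bread"], ["milk", "bread", "eggs"], ["bread", "eggs"]]
g = build_item_graph(baskets)
```

A graph-based item similarity can then be derived from these edge weights (e.g. normalized by item popularity) and blended with an average similarity, as the abstract describes.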
Classification Oriented Criterion Based Dimensionality Reduction and its Application in Face Recognition
YIN Fei and JIAO Li-cheng
Computer Science. 2014, 41 (5): 283-287.  doi:10.11896/j.issn.1002-137X.2014.05.060
Abstract PDF(941KB) ( 607 )   
References | Related Articles | Metrics
To tackle the curse of dimensionality caused by high-dimensional data,a classification oriented criterion based dimensionality reduction method was presented.The proposed criterion aims at making each training sample as close as possible,in the feature space,to samples of the same class,and as distant as possible from samples of different classes.First,for each training sample,the weighted average distance to samples of the same class and the weighted average distance to samples of different classes were defined.Then,based on these two concepts,the total same-class distance and the total different-class distance were defined.After that,the Classification Oriented Criterion (COC) was proposed,which minimizes the total same-class distance while maximizing the total different-class distance.Finally,a novel dimensionality reduction method based on COC was presented.Experiments on the publicly available face databases ORL and Yale demonstrate that the proposed method outperforms representative dimensionality reduction methods.
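The two quantities the criterion trades off can be sketched as follows; for clarity the weights are uniform here, which is a simplification of the paper's weighted averages:

```python
# Sketch of the same-class and different-class total distances that the COC
# respectively minimizes and maximizes (uniform weights assumed).
def avg_distance(x, others):
    """Average squared Euclidean distance from x to a set of samples."""
    return sum(sum((a - b) ** 2 for a, b in zip(x, y)) for y in others) / len(others)

def class_distances(samples, labels):
    """Total same-class and total different-class distance over all samples."""
    within = between = 0.0
    for i, (x, lx) in enumerate(zip(samples, labels)):
        same = [y for j, (y, ly) in enumerate(zip(samples, labels)) if ly == lx and j != i]
        diff = [y for y, ly in zip(samples, labels) if ly != lx]
        within += avg_distance(x, same)
        between += avg_distance(x, diff)
    return within, between

# Two tight, well-separated classes: within-class total is far below between-class.
X = [(0.0, 0.0), (0.2, 0.1), (3.0, 3.0), (3.1, 2.9)]
y = [0, 0, 1, 1]
within, between = class_distances(X, y)
```

The dimensionality reduction step then seeks a projection under which this gap (small within, large between) is preserved, in the spirit of discriminant-analysis methods.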
Research on Image Classification Based on Attribute Learning
LIN Wu-xu,CHENG Ke-yang and ZHANG Jian-ming
Computer Science. 2014, 41 (5): 288-291.  doi:10.11896/j.issn.1002-137X.2014.05.061
Abstract PDF(1121KB) ( 732 )   
References | Related Articles | Metrics
Inherent attributes of an image play an important role in image recognition,but traditional classification methods tend to overlook these characteristics.This paper presented an image classification method that combines sparse representation with attribute learning.First the sparse representation codes of the image features are calculated,then these codes are used to rebuild the image features,and finally the new features are used to train attribute classifiers that perform the image classification.Experiments on plant datasets show the efficiency of our method and its improvement over the classical method.
Research of Wood Cell Sampling Method Based on Image Moments
WANG Jian-hua,GAO Wei-wei and ZHAO Lei
Computer Science. 2014, 41 (5): 292-295.  doi:10.11896/j.issn.1002-137X.2014.05.062
Abstract PDF(1110KB) ( 573 )   
References | Related Articles | Metrics
Accurate cell identification depends on accurate sampling of cell images.This paper proposed sampling cell images using image moments and extracting mathematical features of wood cells based on rotational and translational invariance.We first converted the color images to grayscale and preliminarily segmented the cell images with a dynamic-threshold segmentation method,then separated adherent cells with a segmentation method based on image thinning,and finally extracted the sample cells using image moments to complete the sampling.The experiments show that this algorithm has good stability and robustness.
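The translation invariance the method relies on comes from central image moments; a minimal sketch on a binary mask (the wood-cell-specific features are not reproduced here) is:

```python
# Sketch of raw and central image moments: central moments are computed about
# the centroid, so they are unchanged when the blob is translated.
def raw_moment(img, p, q):
    return sum(v * (x ** p) * (y ** q)
               for y, row in enumerate(img) for x, v in enumerate(row))

def central_moment(img, p, q):
    m00 = raw_moment(img, 0, 0)
    cx, cy = raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00
    return sum(v * ((x - cx) ** p) * ((y - cy) ** q)
               for y, row in enumerate(img) for x, v in enumerate(row))

# The same 2x2 blob at two positions: central moments match, raw moments do not.
blob    = [[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
shifted = [[0, 0, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

Rotation invariance requires going one step further, to the standard Hu moment invariants built from normalized central moments.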
Algorithm of Estimating Inner Lip Opening Distance Based on Classification of Mouth States
HUANG Xiu-qing,HUANG Wei,GAO Qiang,LU Yun and CHEN Chuan-bo
Computer Science. 2014, 41 (5): 296-298.  doi:10.11896/j.issn.1002-137X.2014.05.063
Abstract PDF(605KB) ( 726 )   
References | Related Articles | Metrics
To recognize the degree of mouth opening,this article proposed an algorithm for estimating the inner lip opening distance based on the classification of mouth states.Firstly,we converted mouth images from RGB space to YIQ space.Then,making use of the number and positions of the extreme points of the Q value over teeth,lips and tongue,we classified mouth states into three categories and determined the range of the first difference for each category.Thus,we could localize the boundary points of the inner lip more precisely and compute the inner lip opening distance even when teeth and tongue are visible.The experimental results show that the correlation coefficient between actual and estimated values computed by our method is 0.95,while that of the method based on contrast enhancement with multi-threshold binarization is 0.90,an improvement of 5 percentage points.
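The RGB-to-YIQ step uses the standard NTSC transform (the abstract does not give the paper's exact coefficients, so the standard matrix is assumed here); the Q channel is what the method inspects for extreme points around teeth, lips and tongue:

```python
# Standard NTSC RGB -> YIQ conversion for a single pixel with components in [0, 1].
# Y is luminance; I and Q carry chrominance, with Q sensitive to the
# purple-green axis, which helps separate lips from teeth and tongue.
def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q
```

Applied column-wise down the mouth region, the first difference of the Q profile then locates the inner-lip boundary points.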
Improved Algorithm for Image Inpainting in Wavelet Domains
HU Wen-jin,LIU Zhong-min and LI Zhan-ming
Computer Science. 2014, 41 (5): 299-303.  doi:10.11896/j.issn.1002-137X.2014.05.064
Abstract PDF(925KB) ( 710 )   
References | Related Articles | Metrics
In view of the staircase effect and insufficient noise suppression that occur in total variation minimization methods,a new wavelet-based image inpainting algorithm was presented in this paper.The inpainting is performed by optimizing an energy function combining the total variation norm and the l2 norm,and the finite difference method is adopted for the resulting diffusion equation.The scheme has a strong capability of preserving sharp edges,and the staircase effect is weakened in smooth areas.Experimental results on different images show that the proposed method achieves a better inpainting effect,especially when the wavelet coefficients are relatively large.
Research on MR Image Segmentation Based on Fast FCM Algorithm Combined with Non-local Means
ZHANG Fei,FAN Hong and HAO Yan-rong
Computer Science. 2014, 41 (5): 304-307.  doi:10.11896/j.issn.1002-137X.2014.05.065
Abstract PDF(1219KB) ( 691 )   
References | Related Articles | Metrics
Aiming at the drawbacks of the FCM algorithm,namely its slow operation,its sensitivity to initial values and its difficulty in dealing with the Rician noise inherent in MR images,this paper presented a fast FCM algorithm combined with non-local means.The core of the algorithm is as follows.Firstly,the non-local means algorithm is used to remove the Rician noise present in the MR image,eliminating the impact of noise on the segmentation result.Secondly,the initial cluster centers are obtained automatically according to the proposed initialization rules.Finally,these cluster centers are used as the initial cluster centers of the fast FCM that segments the denoised image,which addresses the slow search speed and the tendency to fall into local minima caused by random selection of the initial clusters.Experimental results show that the proposed algorithm can segment images quickly and efficiently and is more robust to noise.
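The FCM core being accelerated can be sketched on 1-D intensity data; this is the textbook update step only, and the paper's non-local means denoising and automatic centre initialization are not reproduced here:

```python
# Sketch of fuzzy c-means on 1-D data. Membership update:
#   u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1)),
# then each centre becomes the mean of the data weighted by u^m.
def fcm(data, centers, m=2.0, iters=20):
    eps = 1e-12  # guards against division by zero when a point hits a centre
    for _ in range(iters):
        u = [[1.0 / sum(((abs(x - ci) + eps) / (abs(x - ck) + eps)) ** (2.0 / (m - 1))
                        for ck in centers)
              for ci in centers]
             for x in data]
        centers = [sum((u[j][i] ** m) * x for j, x in enumerate(data)) /
                   sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(len(centers))]
    return centers

# Two well-separated intensity clusters; poor initial centres still converge.
centers = fcm([0.0, 0.1, 0.2, 4.0, 4.1, 4.2], [0.5, 3.5])
```

In the paper's pipeline the denoised image is clustered this way, but with the initial centres supplied by the proposed rules instead of random choices, so the iteration starts near the solution.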
Research on Traffic Holographic State Detection Based on Machine Vision in Lightweight
TANG Yi-ping,HUANG Lei-lei,YAN Hang-chen and MA Bao-qing
Computer Science. 2014, 41 (5): 308-314.  doi:10.11896/j.issn.1002-137X.2014.05.066
Abstract PDF(941KB) ( 907 )   
References | Related Articles | Metrics
Aiming at the difficulty of detecting road congestion states and the high computational complexity of extracting basic traffic parameters,a lightweight machine-vision-based holographic road traffic state detection method was presented in this paper.To detect the road traffic congestion state and a variety of basic traffic parameters simultaneously with machine vision on an embedded system,firstly,following the design idea of "points stand for a plane",road areas are specified and uniformly distributed sampling points within them are generated automatically,which helps reduce computing and storage resources.Secondly,by combining the background subtraction algorithm with the frame differencing algorithm,important data reflecting the road congestion state are obtained,namely the non-existence,existence,motion-existence and motionless-existence sampling points.Then the road scene background is updated with the gray values of the non-existence sampling points,important basic traffic parameters are obtained by calculating the spatial distribution of the existence sampling points,and the traffic congestion state is determined by analyzing the spatial arrangement of the motionless-existence sampling points.The experimental results show that the proposed algorithms have the advantages of high computational efficiency,low resource consumption,large detection range and strong robustness,and can quickly and accurately detect various basic traffic parameters and road congestion states.
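The per-point classification can be sketched as a combination of the two tests the abstract names; the thresholds and category names below are assumptions for illustration, not the paper's values:

```python
# Sketch of classifying one sampling point from its gray value in the
# background model and in two consecutive frames, combining background
# subtraction (occupied or not) with frame differencing (moving or not).
def classify_point(bg, prev, curr, t_bg=20, t_fd=10):
    occupied = abs(curr - bg) > t_bg   # background subtraction
    moving = abs(curr - prev) > t_fd   # frame differencing
    if not occupied:
        return "non-existence"         # free road surface
    return "motion existence" if moving else "motionless existence"

states = [classify_point(100, 100, 100),   # empty road
          classify_point(100, 60, 130),    # moving vehicle
          classify_point(100, 130, 130)]   # stopped vehicle: congestion cue
```

Aggregating these labels over all sampling points in a road area then yields the traffic parameters and congestion state, and the gray values of the "non-existence" points feed the background update.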
Insect Image Segmentation Algorithm Based on Multiple Linear Regression and Transition Region
LAN Hong and WANG Xuan
Computer Science. 2014, 41 (5): 315-319.  doi:10.11896/j.issn.1002-137X.2014.05.067
Abstract PDF(940KB) ( 737 )   
References | Related Articles | Metrics
Aiming at the drawback that the multiple linear regression model is not ideal for boundary segmentation of insect images with shadows,a new insect image segmentation algorithm based on multiple linear regression and transition regions was proposed in this paper.The method first optimizes the multiple linear regression model with norms:it builds the multiple linear regression model on the three RGB color channels of the image,then uses the regression model optimized by the cosine norm to segment the insect images.The optimized algorithm improves the segmentation results,but it still retains the image shadow,so the transition region algorithm is introduced to separate the boundary from the shadow and achieve boundary optimization.Compared with the algorithm that uses only multiple linear regression,the proposed method improves the accuracy of insect image segmentation and has strong robustness.