Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40, Issue 6 (2013)
Research of Data Management on Semantic Sensor Web
LI Qi and WU Gang
Computer Science. 2013, 40 (6): 1-7. 
The semantic sensor Web is an integration of sensor network, distributed computing, database management and semantic Web techniques. Together they can be used for sensing, collecting and integrating information, deriving additional knowledge and providing enhanced meaning for sensor observations so as to enable situation awareness. Users can conveniently access this information through a Web interface. The semantic sensor Web is a new research area of computer science and technology with wide application prospects, and both academia and industry are very interested in it. This paper introduced the concepts and characteristics of the semantic sensor Web and discussed the issues of its data management. The advances in research on the semantic sensor Web, especially its data management, were also presented.
Research Advances in Test Suite Augmentation for Regression Testing
CHEN Xiang,GU Qing and CHEN Dao-xu
Computer Science. 2013, 40 (6): 8-15. 
Test suite augmentation is a hot research issue in regression testing. After code change impact analysis, the adequacy of the existing test suite is evaluated; if it is not adequate, new test cases are designed to adequately test the code changes. Until now, no survey of this research topic has been given. We first introduced the research background and the problem description of this topic. Secondly, we summarized a framework for this topic and made a systematic classification and comparison of existing research work. Thirdly, we introduced the commonly used experimental subjects and evaluation metrics. Lastly, we gave some suggestions on potential future work in this research area.
Chemical Service Composition
DENG Guang-hong,CAO Wan-hua,LI Jun,HUANG You-peng and CHEN Xiong
Computer Science. 2013, 40 (6): 16-20. 
To solve the problem of dynamic service composition and re-composition in service-oriented embedded system integration, a chemical service composition model was proposed. By imitating chemical synthesis, molecular decomposition, chemical replacement and molecular reproduction in chemical reactions, service behaviors in the composition process, such as composition, decomposition, replacement and replication, together with their operating and evolution mechanisms, were formally described. Based on these service behaviors and the Petri-net method, a service evolution model was proposed to describe the compound composition process with service reliance and parallel situations. Then, to meet the requirements of self-adaptive service composition in command and control systems, an experimental system called the chemical service composition system (CSCS) was designed and implemented to verify the availability and analyze the performance of the models and algorithms. The experimental results show that the chemical service composition model satisfies the application requirements of self-adaptive service composition in command and control systems, and effectively enhances the flexibility and reconfigurability of embedded systems.
Estimation Process Model for RUP Project
DU Yun-mei and LI Shi-xian
Computer Science. 2013, 40 (6): 21-28. 
The track record of software industry estimates shows that the failure rate of software projects is still high, and poor estimation is one of the basic reasons. Innovation in estimation methods alone cannot be expected to yield a breakthrough; through a controlled process, however, the desired results can be obtained. This paper proposed a process model to guide software projects in launching a series of estimation-related activities. The process model consists of two parts. The first is the RUP estimation process model, which gives a detailed description of how to estimate in each development and management stage. The second is a graphical step-by-step process model created with a Bayesian network inference model, which can be effectively used for estimate analysis, communication, trade-off and risk prediction. The step guide solves the problem of defining estimation activities, but by itself it does not easily form a clear view of the estimate. The features of software estimation are suitable for modeling with Bayesian networks: the BN workload estimation model is an abstraction of the step guide, and the ESFQ model is a detailed model of the trade-off relationship among the critical factors of software projects. Case studies prove the applicability of the process model.
CUDA Parallel Implementation of Higher-order Smooth Surface Extraction
YUAN Hong-xing,WU Shao-qun,GUO Li and ZHU Ren-xiang
Computer Science. 2013, 40 (6): 29-31. 
The higher-order smooth surface extraction method can overcome the aliasing artifacts of the marching cubes algorithm. However, it introduces an extra computational burden, especially in the optimal embedding function calculation. To resolve this problem, a parallel implementation based on the graphics processing unit was presented. The original higher-order smooth surface extraction algorithm was divided into five parts: margin region, narrow band region, embedding function margin values, optimal embedding function, and triangular mesh extraction. These parts were then parallelized with a task-assignment method. The most complex part, the embedding function calculation, is approximated by a projected Jacobian method. The experimental results show that the speedup reaches more than 9 after parallelization on a GeForce GT 240M GPU.
Formal Description of Design Space of SIMD Instruction Sets
LI Chun-jiang,XU Ying,HUANG Juan-juan and YANG Can-qun
Computer Science. 2013, 40 (6): 32-36. 
SIMD (Single-Instruction-Multiple-Data) parallel architecture plays an important role in modern processors, and the SIMD instruction set has become one of the most important subsets of a processor's whole instruction set. The SIMD architecture and instruction set provide parallel processing ability for short vectors, supporting multiple data types and multiple operations. This paper described the design space of SIMD instruction sets using a formal method, which depicts a SIMD instruction set along multiple orthogonal dimensions. A discussion of SIMD instruction set design was then presented. The formal method is beneficial for analyzing and designing SIMD instruction set architectures.
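To make the idea of orthogonal design dimensions concrete, the following minimal Python sketch enumerates a SIMD design space as the Cartesian product of a few dimensions. The dimension names and values are hypothetical illustrations, not the paper's formal notation.

    from itertools import product

    # Hypothetical orthogonal dimensions of a SIMD instruction-set design space
    DATA_TYPES = ["int8", "int16", "int32", "fp32", "fp64"]
    VECTOR_BITS = [128, 256, 512]
    OPERATIONS = ["add", "sub", "mul", "fma", "shuffle"]

    def design_space():
        """Enumerate every point (instruction variant) in the design space."""
        for dtype, width, op in product(DATA_TYPES, VECTOR_BITS, OPERATIONS):
            yield {"op": op, "dtype": dtype, "vector_bits": width}

    if __name__ == "__main__":
        space = list(design_space())
        print(len(space), "candidate instruction variants")  # 5 * 3 * 5 = 75
        print(space[0])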
Modeling Web Service Composition Using Alternative Petri Net Considering QoS
LIU Li and FANG Jin-yun
Computer Science. 2013, 40 (6): 37-40. 
Petri nets are a proper method for modeling Web services, but they do not support QoS modeling and analysis. This paper proposed a Petri net called PTCPN (Probability Time Cost Petri Net) that extends an existing Petri net with QoS modeling capability, and presented its syntax and semantics to show that it can model not only functionality but also non-functional properties such as reliability, time and cost. An example illustrates the effectiveness of the approach and shows that PTCPN can support formal modeling and analysis of the QoS of Web service composition.
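As a rough illustration of attaching QoS to net elements, the sketch below tags each transition with a success probability, a time and a cost, and aggregates them along a sequential firing path: reliability multiplies while time and cost add. The attribute names are assumptions; the paper's actual PTCPN syntax is not reproduced here.

    from dataclasses import dataclass

    @dataclass
    class Transition:
        """A Web service step with QoS attributes, loosely in the spirit of PTCPN."""
        name: str
        probability: float  # success probability (reliability)
        time: float         # execution time
        cost: float         # monetary cost

    def sequential_qos(transitions):
        """Aggregate QoS along a sequential firing path:
        reliability multiplies, time and cost accumulate."""
        prob, time, cost = 1.0, 0.0, 0.0
        for t in transitions:
            prob *= t.probability
            time += t.time
            cost += t.cost
        return prob, time, cost

    path = [Transition("search", 0.99, 0.2, 1.0), Transition("book", 0.95, 0.5, 3.0)]
    print(sequential_qos(path))  # (0.9405, 0.7, 4.0)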
Analysis of MQX Interrupt Mechanism and Design of Interrupt Program Frame Based on ARM Cortex-M4
SHI Jing,WANG Yi-huai,SU Yong and SHEN Chen
Computer Science. 2013, 40 (6): 41-44. 
The interrupt mechanism is the core mechanism that decides the real-time performance of an RTOS. MQX is an open-source, multitasking, preemptive RTOS maintained by Freescale, and it will be widely used in applications of ARM Cortex-M microprocessors. The MQX interrupt mechanism has the characteristics of real-time response and dynamic management. This paper analyzed the top-half and bottom-half interrupt operation mechanism of MQX on the ARM Cortex-M4 Kinetis series microcontroller, and put forward an evaluation algorithm to describe the real-time performance of the RTOS, making the controllability of program running time clear. On this basis, following basic embedded software engineering principles and without being constrained by traditional program structure design methods, this paper proposed a basic principle describing the interrupt program structure under MQX and the distribution of programming elements, which satisfies the requirements of program reusability and portability.
Improved Multivariate Hash Function
ZOU You-jiao,MA Wen-ping,RAN Zhan-jun and CHEN He-feng
Computer Science. 2013, 40 (6): 45-48. 
A multivariate hash function based on the multivariate public key cryptographic algorithm MI was studied and its security was analyzed, revealing a method to break the hash function. For this reason, an improved hash function was proposed. The improved hash function keeps all the advantages of the original one and, furthermore, is collision resistant. Its resistance to preimage attacks, second preimage attacks, differential attacks and algebraic attacks was also analyzed in this paper. The avalanche effect of the improved hash function and its stability were tested on the basis of a mathematical model. The experimental data show that the improved hash function meets the strict avalanche criterion and that its avalanche effect is strong and stable.
Chaos-based Hash Function for Wireless Sensor Networks
HUANG Jin-wang,HU Zhi-hui and FENG Jiu-chao
Computer Science. 2013, 40 (6): 49-51. 
Wireless sensor networks (WSNs) are a hot research topic because of their applications in military, industrial, ecological and health-care fields. These applications often involve monitoring sensitive information, so security is important. WSNs are subject to many constraints, including low computation capability, small memory and limited energy resources. Hash functions that run on PCs cannot be used in WSNs directly, so we proposed a chaos-based hash function that can run on WSNs. Theoretical analysis and simulation results show that it offers the same security level as hash functions that run on PCs.
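A minimal sketch of the general idea, assuming a plain Logistic map as the chaotic source (the paper's actual construction is not specified in the abstract): each message byte perturbs the map state, and the trajectory is quantized into a digest. This is illustrative only and not a secure design.

    def chaos_hash(message: bytes, rounds: int = 4) -> bytes:
        """Toy chaos-based hash: iterate the Logistic map x = 4x(1-x),
        perturbing the state with each message byte, then quantize the
        trajectory into a 16-byte digest. Illustrative only, not secure."""
        x = 0.5
        digest = bytearray(16)
        for _ in range(rounds):
            for i, b in enumerate(message):
                # inject the byte, keeping x strictly inside (0, 1)
                x = (x + (b + 1) / 257.0) % 0.999999 + 1e-7
                x = 4.0 * x * (1.0 - x)  # Logistic map in its chaotic regime
                digest[i % 16] ^= int(x * 255)
        return bytes(digest)

    print(chaos_hash(b"wireless sensor node").hex())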
Reduction Method for Complex Net Structure Based on Characteristic Circle
HU Fei-hu,JING Juan-juan,LIU Lu-lu and MA Bei-long
Computer Science. 2013, 40 (6): 52-56. 
Nets with complex structure are difficult to analyze and control. A reduction method based on characteristic circles may assist in the analysis, since the reduced net may be significantly smaller while still retaining the original net's essential properties. The study of the net is based on a single directed graph. On the basis of the concepts of connection and path, the circle path and the composite circle path were defined; the pure characteristic circle and the composite characteristic circle were then defined based on the concepts of characteristic node and common node. The reduction rule and algorithm based on characteristic circles were discussed with examples. The results show that this method can effectively reduce a complex net into small parts; however, different orders of characteristic circle selection may produce different reduction results.
Big Group-oriented Fuzzy Web Service Selection
ZHANG Long-chang
Computer Science. 2013, 40 (6): 57-62. 
Group-oriented Web service selection is widely applied in everyday life, but it remains challenging for big groups with personalized QoS requirements and fuzzy QoS. Based on multi-attribute group decision making (MAGDM) theory, big group-oriented fuzzy Web service selection (BGFWSS) was presented. It includes five main steps: calculating group preference, calculating weighted cluster weights, constructing a cluster-weighted normalized decision-making matrix, determining the positive-ideal and negative-ideal solutions, and synthetically evaluating alternatives for the group. Another contribution is a novel Web service QoS model that describes QoS values with real numbers, interval numbers and intuitionistic fuzzy numbers. Experimental results show that the proposed algorithm solves the problem of big group-oriented Web service selection with fuzzy QoS very well.
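The ideal-solution step follows the familiar TOPSIS pattern. The sketch below uses crisp scores only, omitting the paper's interval and intuitionistic fuzzy numbers, and assumes all criteria are benefit criteria; the sample QoS values and weights are made up.

    import numpy as np

    def topsis(scores: np.ndarray, weights: np.ndarray) -> np.ndarray:
        """Rank alternatives (rows) over benefit criteria (columns) by
        relative closeness to the positive-ideal solution."""
        norm = scores / np.linalg.norm(scores, axis=0)   # vector normalization
        v = norm * weights                               # weighted matrix
        pos_ideal, neg_ideal = v.max(axis=0), v.min(axis=0)
        d_pos = np.linalg.norm(v - pos_ideal, axis=1)
        d_neg = np.linalg.norm(v - neg_ideal, axis=1)
        return d_neg / (d_pos + d_neg)                   # closeness in [0, 1]

    qos = np.array([[0.90, 0.2, 5.0],   # three candidate services (made-up scores)
                    [0.80, 0.4, 4.0],
                    [0.95, 0.3, 6.0]])
    print(topsis(qos, np.array([0.5, 0.2, 0.3])))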
Digital Home Service Scheduling Algorithm Based on Fuzzy Logic
DU Jian,CHEN Hong-bin and ZHAO Feng
Computer Science. 2013, 40 (6): 63-66. 
Digital home technologies have developed rapidly in recent years, but service scheduling is still not very smart. An intelligent digital home service scheduling algorithm based on user preference and fuzzy logic was proposed. First, fuzzy logic is applied to determine the scenario according to the user's current behavior and time duration. Then, user preference, psychological state and time duration are considered to predict the user's next service. The results show that, compared with an algorithm that predicts the next service directly, the algorithm that first determines the scenario and then predicts the next service achieves a higher successful matching rate. The algorithm considering user preference and psychological state adapts better to the personalized digital home environment. Compared with classical algorithms such as minimax fair scheduling, the proposed algorithm attains a higher successful matching rate.
Clustering Algorithm which Enhances Clusters’ Stability in Ad hoc Networks
WU Jing,JU Hong-jun,TIAN Li-qin and ZHAO Yun-long
Computer Science. 2013, 40 (6): 67-70. 
In Ad hoc networks, MSWCA, which gives the most comprehensive consideration to cluster stability, is a typical motion-correlation-aware clustering algorithm. Aiming at MSWCA's problem of considering only intra-cluster stability while neglecting inter-cluster stability, a clustering algorithm which enhances cluster stability (CAECS) was proposed. Based on the idea of mobility prediction, CAECS comprehensively considers intra-cluster stability, inter-cluster stability and cluster optimization, and it adapts to different scenarios by adjusting weights. Simulations show that CAECS outperforms MSWCA in cluster stability and cluster maintenance overhead.
Overlay Network Based IPv6 Network Architecture Protection Model
LIU Hui-sheng,WANG Zhen-xing,ZHANG Lian-cheng and HOU Yi
Computer Science. 2013, 40 (6): 71-75. 
Different from IPv4, IPv6 has some new properties, such as end-to-end communication and a hierarchical address structure. Traditional network architecture protection schemes based on network address translation (NAT) are therefore no longer applicable. However, existing schemes for IPv6 network architecture protection have shortcomings, such as destroying the end-to-end property of the Internet, which prevents the use of IPsec. Motivated by the military tactics of "showing falsity" and "hiding truth", an overlay network based network architecture protection model for IPv6 (ON-NTPM6) was proposed. First, an "overlay masking network" design was presented, which hides the true network architecture by deploying a virtual subnet with a truly allocated network prefix in the site. Then a dynamic topology generation algorithm was proposed, which dynamically and randomly generates the topology of the overlay masking network. Theoretical and empirical analysis results demonstrate that ON-NTPM6 can effectively conceal the true architecture of the protected network, and can further deceive adversaries and consume their attack resources with the virtual overlay network topology.
Novel RFID Anti-collision Algorithm Based on Tags Movement
SHI Feng-cha,CUI Chen and YU Jian
Computer Science. 2013, 40 (6): 76-79. 
A novel frame slotted ALOHA anti-collision algorithm based on code division multiple access (CD-FSA) was presented, and the expression for the tag identification ratio was derived. A mathematical model for the relative motion of tags and a reader in practical applications was then built, and the formula for the identification ratio of grouped tags leaving the region when the system reaches the equilibrium state was given. Finally, computer simulation and analysis were carried out. The results show that whether the tags are stationary or moving, the tag identification ratio of the CD-FSA algorithm is higher than that of the frame slotted ALOHA algorithm.
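For intuition, the sketch below simulates only the plain frame slotted ALOHA baseline against which CD-FSA is compared; the paper's code-division extension and tag-motion model are not reproduced. Tags pick random slots, and only singleton slots yield identifications.

    import random
    from collections import Counter

    def fsa_identification_ratio(num_tags: int, frame_size: int, trials: int = 1000) -> float:
        """Fraction of tags identified per frame in plain frame slotted ALOHA:
        a tag is read only if it is alone in its chosen slot."""
        identified = 0
        for _ in range(trials):
            slots = Counter(random.randrange(frame_size) for _ in range(num_tags))
            identified += sum(1 for count in slots.values() if count == 1)
        return identified / (num_tags * trials)

    print(fsa_identification_ratio(num_tags=64, frame_size=64))  # about 1/e = 0.37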
Per-Flow Traffic Measurement Algorithm with Restrained Relative Error
ZHANG Jin,ZHAO Wen-dong,PENG Lai-xian and WU Ze-min
Computer Science. 2013, 40 (6): 80-83. 
The accuracy of per-flow traffic measurement is evaluated using the metrics of both error probability and relative error. Current research on flow traffic measurement mainly focuses on reducing error probability, while little attention has been paid to reducing relative error. Motivated by the importance of reducing relative error in certain applications such as usage accounting, this study presented a new algorithm named MT-dlCBF (Multi-Tier d-left Counting Bloom Filter) with restrained relative error. MT-dlCBF consists of several tiers of dlCBF (d-left Counting Bloom Filter), where a dlCBF at a higher tier has longer flow fingerprints and wider flow traffic counters than a dlCBF at a lower tier. In this way, the interference of long flows with short ones can be alleviated significantly, resulting in restrained relative error. Analytical and experimental results show that MT-dlCBF provides a significant decrease in relative error and only a trivial increase in error probability compared with dlCBF. Moreover, MT-dlCBF has higher space efficiency than dlCBF under typical parameter settings.
Research and Implementation of Large-scale Context Management Framework Based on Terminal and Cloud
SHI Dian-xi,WU Zhen-dong and DING Bo
Computer Science. 2013, 40 (6): 84-89. 
The context situation refers to the global view extracted from massive context information collected over a wide area. With the popularity of various mobile terminals able to sense their context, obtaining this kind of situation and providing better services based on it is a significant problem. On the basis of the "terminal + cloud" computing paradigm, this paper proposed a unified abstract model for mobile terminals to realize context collection. We then proposed an algorithm, based on the MapReduce computing pattern, to aggregate massive context information on the cloud side. We validated this work with a large-scale context management framework as well as a traffic situation application built on it.
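A minimal in-process sketch of the MapReduce aggregation pattern: map emits (road, speed) pairs from raw context records, and reduce averages them into a traffic situation view. The record fields are hypothetical, and the paper's actual algorithm and Hadoop-side APIs are not shown.

    from collections import defaultdict

    def map_phase(record):
        """Emit (key, value) pairs from one raw context record."""
        yield record["road"], record["speed"]

    def reduce_phase(key, values):
        """Aggregate all values for one key into a situation summary."""
        return key, sum(values) / len(values)

    records = [{"road": "A", "speed": 42}, {"road": "A", "speed": 38},
               {"road": "B", "speed": 15}]

    groups = defaultdict(list)
    for rec in records:                     # shuffle: group values by key
        for key, value in map_phase(rec):
            groups[key].append(value)

    situation = dict(reduce_phase(k, v) for k, v in groups.items())
    print(situation)  # {'A': 40.0, 'B': 15.0}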
Damage-tolerant Data Query Degraded Service Mechanism
LI Ling,QIN Xiao-lin and DAI Hua
Computer Science. 2013, 40 (6): 90-93. 
Traditional database security mechanisms, which focus on data confidentiality, often overlook the requirements of data integrity and availability. This paper presented a Damage-Tolerant Data Query Degraded Service (DT-DQDS) to improve system availability and user satisfaction with query operations. First, the degraded service model was given, including the definitions of the data model and the data integrity concept based on existing survivability studies. Second, according to the DT-DQDS model, a data query mechanism and a concrete query algorithm were provided, and the mechanism was validated theoretically. Finally, experimental results further demonstrate its good performance in terms of query execution efficiency.
New Delegatable Private Mutual Authentication Protocol
WEN Ya-min and GONG Zheng
Computer Science. 2013, 40 (6): 94-99. 
Private mutual authentication (also called a secret handshake scheme) was proposed for anonymous bi-directional authentication among group members from the same organization. However, the delegation functionality, which allows a group member to delegate his authentication rights to temporary proxies, has not been deeply studied. To solve this problem more efficiently, a new delegatable private mutual authentication protocol was presented. A temporary proxy can act on behalf of his delegator and accomplish a successful secret handshake with another member. Based on the hardness assumptions of the k+1 square roots and discrete logarithm representation problems, our proposal is proven secure in the random oracle model. Compared with related schemes, the performance of the new scheme is competitive.
Sybil Attack Defense Based on Ant Colony Algorithm
WANG Feng,LI Ya,ZHU Hai and WANG Yi-ran
Computer Science. 2013, 40 (6): 100-102. 
In structured peer-to-peer networks, decentralized management and the freedom with which participants join the system make the Sybil attack a particular security threat. A review of current Sybil defenses shows that social-network-based defense against Sybil attacks has good prospects. Using social networks and combining the ant colony algorithm for solving NP-hard problems, an ant colony algorithm based Sybil attack defense model, ASDM, and its related algorithm were presented. The experimental results show that ASDM can effectively identify Sybil nodes.
Trusted Computing-enabled DRM System and its Security Protocols
WANG Jian,ZHANG Zhi-yong,YU Wei-hua and YANG Li-jun
Computer Science. 2013, 40 (6): 103-107. 
Digital rights management is designed to protect digital content usage from end to end. However, hidden security problems in the client system threaten the reasonable usage of digital contents. Through research on trusted computing technology, a common architecture of DRM combined with trusted computing was presented. In particular, the application of trusted computing in license distribution and digital content usage was introduced. Then, an identity authentication and key agreement protocol for trusted DRM was designed and described together with its security analysis. Through the protocol, the license server can authenticate the DRM client and validate its integrity; moreover, the peers can obtain a shared key to protect digital license distribution.
Worm Detection Method Based on Fuzzy Pattern Recognition to Network Behaviors
YAN Fen,CHEN Shuang-shuang and YIN Xin-chun
Computer Science. 2013, 40 (6): 108-110. 
Worms have been one of the most serious threats to Internet security due to their significant damage, large range of victims and fast spread. How to detect network worm attacks is an important aspect of network security research. This article proposed a method that detects worms by analyzing and studying typical network behaviors during worm outbreaks. The algorithm studies the network behaviors of normal and abnormal computers separately, establishes standard fuzzy subsets for classification, and judges whether an observed computer is infected by worms using fuzzy pattern recognition. Finally, experiments with real-world worm applications prove that this method can detect unknown scanning worms well.
Liquidity Risk Measurement and Analysis of Large-value Payment System
SHUAI Qing-hong,FANG Ling,KUANG Yuan-jing and LUO Yang
Computer Science. 2013, 40 (6): 111-115. 
The payment system is one of the most important pieces of financial infrastructure of a nation. The LVPS (large-value payment system) is an important part of the payment system, and it is quicker, more accurate and safer than other settlement methods. Therefore, it is important to improve the liquidity of the LVPS to reduce the liquidity risk of the system. Based on modeling, this paper quantified the indicators of liquidity demand and settlement delay, and used panel data to set up a random effects model to measure the probability of potential liquidity risk of the LVPS.
Quantitative Method Based on Entropy Rate to Measure Capability of Cipher Chip to Defend against Power Attacks
SHAO Qi-feng,TANG Xiao-wei,FANG Ming and YANG Tian-chi
Computer Science. 2013, 40 (6): 116-118. 
Two key indicators describing the randomness of cipher chip power leakage were obtained through a large number of engineering experiments: one is the distribution law of the gate-level flip number, and the other is the transition matrix of the gate-level flip number. Based on these two indicators, we introduced the concept of entropy rate from information theory. Through the entropy rate, we can dynamically measure the speed at which the entropy of the power consumption waveform increases during the encryption process, and effectively measure the defensive performance of the cipher chip under SPA attack.
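The entropy rate of a Markov source with transition matrix P and stationary distribution pi is H = -sum_i pi_i * sum_j P_ij * log2(P_ij). The sketch below computes it for a hypothetical gate-flip transition matrix; the matrix values are made up, not measured chip data.

    import numpy as np

    def entropy_rate(P: np.ndarray) -> float:
        """Entropy rate (bits/step) of a Markov chain with transition matrix P:
        H = -sum_i pi_i * sum_j P_ij * log2(P_ij), pi the stationary distribution."""
        eigvals, eigvecs = np.linalg.eig(P.T)
        pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        pi = pi / pi.sum()                       # stationary distribution of P
        logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
        return float(-np.sum(pi[:, None] * P * logP))

    # Hypothetical transition matrix of the gate-level flip number (3 states)
    P = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.1, 0.3, 0.6]])
    print(entropy_rate(P))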
Multi-objective AC-BM Algorithm Based on Automata Union Operation
WANG Zheng-cai,XU Dao-yun and WANG Xiao-feng
Computer Science. 2013, 40 (6): 119-123. 
The AC-BM algorithm has the advantages that multiple pattern strings are searched simultaneously and that the number of characters skipped in the text string is optimized. However, the patterns are searched in only one text string at a time. To search multiple text strings simultaneously, this paper designed a multi-objective AC-BM algorithm. By the union operation of two automata, a multi-objective multi-pattern tree automaton was constructed, and using the bad-character shift technique of the BM algorithm, a function for moving over a set of text strings was designed. The 2-goal and 3-goal AC-BM algorithms were implemented in Snort. Under the condition that the algorithm stops as soon as a pattern string is found in any of the text strings, the results show that the new algorithm is clearly superior to the AC-BM algorithm in time.
Identity-based Key Management Scheme in Pervasive Computing Environments
SUN Ling and TIAN Yuan
Computer Science. 2013, 40 (6): 124-127. 
Considering key management in pervasive computing environments, this paper proposed a novel identity-based key management scheme over an additive elliptic curve group. It employs the secret-sharing technique to construct distributed private key generators, and designs methods for updating private keys, updating host-key shares and negotiating session keys. It achieves higher security requirements and has higher efficiency compared with available identity-based schemes.
Quantitative Estimation of Parameters in Quantization Index Modulation Watermarking Method of Spatial Domain Based on PSNR
JING Li and LI Shu-hong
Computer Science. 2013, 40 (6): 128-131. 
The Quantization Index Modulation (QIM) method is widely used in watermarking and information hiding. The quantization step is a key parameter of quantization modulation, and its value relates to the watermark embedding strength. At present there is no theoretical method to determine its value except through experiments, which reduces watermark embedding efficiency. To overcome this difficulty, a quantitative estimation method for the quantization error of dither quantization modulation was first derived based on its distribution. Then, based on this estimation method and selecting spatial pixels as quantization coefficients, the quantitative relationship among the quantization step, the watermark sequence length and PSNR was deduced. Experimental results show that PSNR values calculated through the quantitative equation agree well with those obtained from experiments under equal conditions, which demonstrates that the deduced quantitative relationship is accurate.
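Under the textbook assumption that dither quantization error is uniform on (-delta/2, delta/2], the per-pixel mean squared error is delta^2/12; if L of the N image pixels carry watermark bits, MSE = (L/N) * delta^2 / 12 and PSNR = 10 * log10(255^2 / MSE). The sketch below applies this standard approximation; the paper's exact derived equation is not reproduced.

    import math

    def predicted_psnr(delta: float, wm_length: int, num_pixels: int) -> float:
        """PSNR predicted for dither quantization watermarking, assuming
        uniform quantization error on (-delta/2, delta/2]:
        MSE = (L/N) * delta^2 / 12."""
        mse = (wm_length / num_pixels) * delta ** 2 / 12.0
        return 10.0 * math.log10(255.0 ** 2 / mse)

    # 1024-bit watermark embedded in a 512x512 grayscale image, step 16
    print(predicted_psnr(delta=16.0, wm_length=1024, num_pixels=512 * 512))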
Research on Secure Distributed Access Control System for Ubiquitous Computing
DOU Wen-yang,WANG Xiao-ming and ZHANG Li-chen
Computer Science. 2013, 40 (6): 132-137. 
This paper designed a secure distributed access control system for the complex security requirements of ubiquitous computing environments, and presented the system architecture. An authorization query algorithm was proposed to improve the efficiency of authorization queries. A fuzzy reasoning engine was proposed to achieve fuzzy authorization. Finally, an encryption mechanism was presented to ensure the confidentiality and integrity of information during the authorization query process.
Design and Implementation of Multi-domain Visual Modeling System Based on Web
ZHAO Shun-hua,WU Yi-zhong and SHEN Bo
Computer Science. 2013, 40 (6): 138-141. 
Modelica-based multi-domain modeling and simulation software running on the desktop has many disadvantages compared with Web-based software: it works against the sharing and accumulation of knowledge, and its maintenance and updating are difficult. Multi-domain modeling and simulation technology based on the B/S architecture was researched, and a Web-based multi-domain modeling and simulation system, WebMWorks, was designed and implemented. WebMWorks adopts Silverlight and WCF technologies to implement model visualization on the browser side and communication between the browser and the Web server. Finally, a modeling example on the WebMWorks system was given.
Feature Composition Failures and their Solution in FOP
CHEN Zhi-dan,SHEN Li-wei and ZHAO Wen-yun
Computer Science. 2013, 40 (6): 142-147. 
Dependencies exist between software product line features, so the feature modules in feature-oriented programming (FOP) are closely related at the code or structure level. On the other hand, whether variable features are bound in an application has a destructive impact on the implementation of feature dependencies, causing the potential problem of feature composition failures during the FOP process. This paper analyzed the problem, identified three main dependency scenarios, and proposed a vertical decomposition method for feature modules to solve the problem. Its key mechanism is to introduce variability into the inner part of feature modules, so that the problem can be avoided by composing the code according to the specific requirements. Furthermore, the method was applied to a software product line of publishing-house profit evaluation systems to validate its effectiveness.
Event-Action Behavior Model for Cyber System
SONG Cui-ye,WANG Qing and DU Cheng-lie
Computer Science. 2013, 40 (6): 148-151. 
The cyber part of a cyber-physical system is designed to control the behavior of the physical part precisely and in a timely manner according to the requirements of the users, employing cyber abilities such as computing, communication and control technologies. Physical behavior is time-continuous and concurrent, while cyber behavior is discrete. Such heterogeneity brings big challenges to the design of the cyber system: an accurate model is needed to capture the interaction requirements between the cyber part and the physical part. First, an architecture of the abstract behavior entities of a CPS was given and explained. Then an event-action behavior model was proposed and defined in detail. Finally, the effect of the behavior model on the development of the cyber system was analyzed against the background of a smart detecting vehicle system, and future research work was pointed out.
Research of Distributed ETL Architecture Based on MapReduce
SONG Jie,HAO Wen-ning,CHEN Gang,JIN Da-wei and ZHAO Shui-ning
Computer Science. 2013, 40 (6): 152-154. 
Aiming at the deficiency of the centralized execution mode of traditional extraction-transformation-loading (ETL) tools, this paper put forward a distributed ETL architecture based on MapReduce, MDETL (MapReduce Distributed ETL). The architecture, which applies a parallel programming model for massive data processing to the cluster computing of distributed ETL, achieves distributed ETL processing on a cluster. It improves the flexibility and throughput of the whole ETL system, has better scalability and load balancing, and raises performance efficiency.
Research and Application of Chaos Opaque Predicate in Code Obfuscation
SU Qing,WU Wei-min,LI Zhong-liang,LI Jing-liang and CHEN Wei-de
Computer Science. 2013, 40 (6): 155-159. 
Constructing safe opaque predicates is a major issue in code obfuscation, and the key lies in how the opaque predicates are used. In order to form chaotic opaque predicates, En_Logistic, an improvement of the Logistic chaotic map, was proposed and then applied to the construction of opaque predicate clusters. The chaotic opaque predicates were applied in the code obfuscation process. In the experiments, different methods of inserting chaotic opaque predicates into program branches and sequential blocks were given, and program complexity and control-flow complexity were evaluated during the application of code obfuscation. Additionally, security analysis shows that the chaotic opaque predicate has high security against static as well as dynamic attacks, and the experiments verify its effectiveness in code obfuscation.
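A minimal sketch of a chaotic opaque predicate, assuming the plain Logistic map rather than the paper's En_Logistic variant: because every iterate of x = 4x(1-x) started in (0, 1) stays in [0, 1], the branch condition below is always true at run time yet hard to resolve statically.

    def opaque_true(seed: float = 0.37, iterations: int = 50) -> bool:
        """Chaotic opaque predicate: iterates of the Logistic map x = 4x(1-x)
        started inside (0, 1) always remain in [0, 1], so this returns True
        for any valid seed, but a static analyzer cannot easily prove it."""
        x = seed
        for _ in range(iterations):
            x = 4.0 * x * (1.0 - x)
        return 0.0 <= x <= 1.0  # invariant of the map: always True

    if opaque_true():
        real_branch = "executed"     # the genuine code path
    else:
        decoy_branch = "never runs"  # dead code inserted to confuse analysis
    print(real_branch)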
Software Fault Location Method Based on Fault Detection Model of Bipartite Graphs
WANG Yao-xuan,YE Jun-min,CHEN Jing-ru and OU Zhong-hong
Computer Science. 2013, 40 (6): 160-163. 
In the software fault diagnosis process, the most costly and time-consuming part is software fault location. To help testers locate software faults, guided by layered design thought and based on the complex relationships among software, its modules and its code, this paper proposed a software failure propagation model based on topological graphs, derived from analysis of historical data on the correspondence between software faults and their symptoms, making it possible to describe software fault phenomena with the topological graph model. Through the topological graph model, the software fault propagation model can be converted into a simpler fault detection model based on bipartite graphs. An algorithm based on a greedy strategy was then designed according to this model. The algorithm solves the minimum cover problem on the bipartite graph; the resulting cover describes the set of assumed causes of the software faults, and the corresponding faulty module is found by analyzing the relationship between the faults and the software modules, thus achieving fault location. Experiments show that this method of software fault location is effective.
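The greedy step can be sketched as classic greedy set cover on a fault-symptom bipartite graph. The module and symptom names below are illustrative, not the paper's exact model: at each step, pick the candidate fault that explains the most still-unexplained symptoms.

    def greedy_fault_cover(fault_to_symptoms: dict, observed: set) -> list:
        """Greedy minimum cover on a bipartite fault detection model:
        pick, at each step, the fault hypothesis covering the most
        still-unexplained observed symptoms."""
        uncovered, chosen = set(observed), []
        while uncovered:
            best = max(fault_to_symptoms,
                       key=lambda f: len(fault_to_symptoms[f] & uncovered))
            if not fault_to_symptoms[best] & uncovered:
                break                      # remaining symptoms are unexplainable
            chosen.append(best)
            uncovered -= fault_to_symptoms[best]
        return chosen

    # Hypothetical edges: candidate faulty modules -> symptoms they can cause
    model = {"parser": {"s1", "s2"}, "cache": {"s2", "s3", "s4"}, "io": {"s4"}}
    print(greedy_fault_cover(model, observed={"s1", "s2", "s3", "s4"}))
    # ['cache', 'parser']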
Distributed Skyline Processing Based on Hypersphere Projection Partitioning on Cloud Environments
LEI Ting,WANG Tao,QU Wu and HAN Xiao-guang
Computer Science. 2013, 40 (6): 164-171. 
Recently, skyline processing has received considerable attention due to its potential applications in many fields, including traditional databases, distributed databases, data streams and even categorical databases; both academia and industry have paid much attention to it. As an important data mining technique, skyline processing is of great significance for multi-objective optimization, urban navigation, multi-criteria decision making, preference queries, trip planning, defense and intelligence systems, and geographic information systems. In addition, the amount of data collected and used is growing at an astonishing speed, so how to process skyline queries over massive data is an urgent problem. Aiming at cloud computing applications, this paper designed and implemented HSPD-Skyline, a distributed skyline processing method based on hypersphere projection partitioning under the MapReduce framework. It is shown that partitioning the data according to hyperspherical coordinates can increase the average pruning power of points within a partition and reduce the cost of skyline processing. The HSPD-Skyline algorithm also uses a heuristic strategy based on a space partitioning tree, HA-SPT, to further improve its processing efficiency. Finally, theoretical analysis and experimental results illustrate that the HSPD-Skyline algorithm consistently outperforms similar approaches for distributed skyline computation, regardless of data distribution and further optimization strategies.
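For reference, the dominance test at the heart of any skyline method can be sketched as follows (a naive centralized version; the hypersphere partitioning and MapReduce layers of HSPD-Skyline are not shown). A point is in the skyline if no other point is at least as good on every dimension and strictly better on one; smaller is better here, and the sample data is made up.

    def dominates(p, q) -> bool:
        """p dominates q if p is no worse on every dimension (minimization)
        and strictly better on at least one."""
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    def skyline(points):
        """Naive O(n^2) skyline: keep the points dominated by no other point."""
        return [p for p in points if not any(dominates(q, p) for q in points)]

    hotels = [(3, 100), (1, 200), (2, 150), (4, 120)]  # (distance, price)
    print(skyline(hotels))  # [(3, 100), (1, 200), (2, 150)]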
RM-LCDF: A Recovery Method for Block-level Continuous Data Protection
WANG Chao,LI Zhan-huai,LIU Hai-long and ZHANG Xiao-fang
Computer Science. 2013, 40 (6): 172-177. 
Block-level continuous data protection has become an important data protection technology for modern data storage systems. It can restore data to any point in time and supports reliable storage. The high availability demanded of computerized data raises the requirements on data recovery efficiency, but the basic data recovery method for block-level continuous data protection is slow. This paper presented a recovery method based on the localized and continuous distribution features (RM-LCDF) of block-level continuous data protection. RM-LCDF improves the basic data recovery method in three aspects: (1) invalid write request elimination, (2) multi-buffering, and (3) logical block address sorting. Both mathematical analysis and experiments show preliminarily that RM-LCDF can significantly reduce the amount of recovery data and improve I/O parallelism and I/O throughput, thus improving recovery efficiency.
Results Ranking Approach of XML Keyword Search Based on Keyword’s Structural Relationships
REN Jian-hua,ZHOU Jian,MENG Xiang-fu and WEI Ke
Computer Science. 2013, 40 (6): 178-182. 
If the answer to an XML multi-keyword search is not empty, there are specific relationships between the keywords, and such relationships can be inferred from the SLCA (smallest lowest common ancestor). This paper proposed an XML keyword query result ranking approach based on these relationships: the approach obtains the SLCAs by the LISA II algorithm, and leverages the structures of the SLCAs to infer the interior structural relationships of the keywords and to obtain the relationship tree. Then, the importance of each SLCA can be estimated by the strictness degree of the keywords with respect to the query node in the relationship tree. The SLCAs are ranked according to their importance, and the ordered SLCAs are treated as the ranked XML keyword query results. The experimental results demonstrate that the approach has high precision and can efficiently meet users' needs.
An Efficient Search Method for Big Data
YOU Chuan-chuan and ZHANG Gui-gang
Computer Science. 2013, 40 (6): 183-186. 
This paper proposed an efficient search method for the problem of low efficiency in big data queries. Shared historical query results are used as a set of intermediate results: when a new query request arrives, it is first matched against historical queries, and if a match is achieved, the matching portion of the historical results is directly used as part of the new query result. This eliminates a large amount of repeated computation over the query history, saves search time and improves query efficiency. Experimental comparison and analysis show that the proposed query method can improve query efficiency.
Clustering Models and Algorithms for Distributed Data Streams Based on Data Synopsis
MAO Guo-jun and CAO Yong-cun
Computer Science. 2013, 40 (6): 187-191. 
Mining data streams aims at discovering knowledge from large amounts of streaming data, and much effort has been devoted to it in recent years. As a typical example, the data collected by a sensor forms a data stream. However, in a sensor network, multiple sensors are usually deployed and collect data in a distributed way, so mining data streams in a distributed way is a challenging issue. Most ongoing studies on mining distributed data streams suffer from problems of accuracy or efficiency. In this paper, a model for clustering distributed data streams was discussed, including a new synopsis data structure for summarizing data streams and effective algorithms for the key mining phases. The rationale of the presented algorithms was also discussed. Experimental results demonstrate that the presented models and algorithms achieve lower transmission cost and higher clustering quality when mining global patterns from distributed data streams.
Real Time Analytics Study of Big Data Based on Tree-lib
SHEN Lai-xin and WANG Wei
Computer Science. 2013, 40 (6): 192-195. 
In order to improve the storage and parallel processing capabilities for big data, a real-time concurrent analytics and management mode for big data was built centered on column-stored Infobright and distributed MySQL Cluster, completing secondary development of the open-source Brighthouse engine. The managing procedure Tree-lib was used to visually monitor, maintain and manage the distributed big data. The experimental results show that the combination of Infobright and MySQL Cluster provides highly compressed storage, multiple concurrent queries and efficient real-time analysis of big data. Tree-lib accomplishes generation, detection, update, backup and disaster recovery of trees and libraries. Finally, the purpose of visual bidirectional management and maintenance is achieved.
Named Entity Recognition on Chinese Microblog
QIU Quan-qing,MIAO Duo-qian and ZHANG Zhi-fei
Computer Science. 2013, 40 (6): 196-198. 
The rapid development of microblogging brings a new carrier for named entity recognition. This paper proposed an approach for named entity recognition on Chinese microblogs according to the features of microblogs. First of all, the paper normalized the text of the microblogs and eliminated the interference caused by non-standard expressions, then constructed several knowledge bases, such as Chinese person names, common place names and organization names, and devised feature templates for a recognition method based on conditional random fields. Meanwhile, correct recognition results were added to the knowledge bases to improve recognition performance. The experimental results show that our approach is effective in recognizing named entities on Chinese microblogs.
Word Similarity Measurement Based on BaiduBaike
ZHAN Zhi-jian,LIANG Li-na and YANG Xiao-ping
Computer Science. 2013, 40 (6): 199-202. 
Research on word similarity measurement has been popular not only in natural language processing but also in other basic research. Traditional word similarity measurements use semantic lexicons or large-scale corpora. We first discussed the application background of word similarity measurement, such as information retrieval, information extraction, text classification and example-based machine translation. Two strategies of word similarity measurement were then summarized: one is based on an ontology or a semantic taxonomy; the other is based on word collocations in large corpora. BaiduBaike, an online open encyclopedia, can be used not only as a corpus but also as a knowledge resource with rich semantic information. Based on BaiduBaike's rich semantic information and category graph, we proposed a new method to analyze and compute Chinese word similarity along four dimensions: the baike card, the content of the word, the open classification of the word, and the correlated words. We used a language network to choose the top key terms from the content of a word. Based on the vector space model (VSM), we calculated the similarity between parts of words. We presented a new "multi-path searching" algorithm on the BaiduBaike category graph. A comprehensive similarity measure based on the four parts was proposed. Experimental results show that the method performs well.
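The VSM part of the measure reduces to cosine similarity between term vectors. A minimal sketch follows, with toy term weights rather than values extracted from BaiduBaike:

    import math

    def cosine_similarity(u: dict, v: dict) -> float:
        """Cosine similarity between two sparse term-weight vectors."""
        dot = sum(w * v.get(term, 0.0) for term, w in u.items())
        norm_u = math.sqrt(sum(w * w for w in u.values()))
        norm_v = math.sqrt(sum(w * w for w in v.values()))
        return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

    # Toy term-weight vectors for two word-content descriptions
    apple = {"fruit": 0.8, "tree": 0.4, "red": 0.3}
    pear = {"fruit": 0.7, "tree": 0.5, "juicy": 0.2}
    print(round(cosine_similarity(apple, pear), 3))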
Research on Complex System Modeling and Reasoning Based on Large Fuzzy Cognitive Map
PENG Zhen,TIAN Li-qin,WU Jing,GAO Xiao-yan and YANG Bing-ru
Computer Science. 2013, 40 (6): 203-205. 
As an intelligent computational tool, the Fuzzy Cognitive Map has the advantages of intuitive knowledge representation and fast numerical reasoning, and is applied to system modeling and reasoning. In order to achieve integrated mining of associative cognition and clustering, and to achieve effective analysis and decision making for complex systems, the large Fuzzy Cognitive Map of complex systems was researched, and ideas and methods for modeling and reasoning over associative cognition and state with large Fuzzy Cognitive Maps were proposed for Three Rivers ecological decision making. This not only advances research on large Fuzzy Cognitive Maps for complex system modeling and reasoning, but also expands their application area.
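One FCM reasoning step updates the concept state vector through the causal weight matrix and a squashing function, A(t+1) = f(A(t) W). A minimal sketch with a hypothetical 3-concept map follows; the weights are made up for illustration:

    import numpy as np

    def fcm_step(state: np.ndarray, W: np.ndarray) -> np.ndarray:
        """One Fuzzy Cognitive Map inference step: A(t+1) = sigmoid(A(t) @ W)."""
        return 1.0 / (1.0 + np.exp(-(state @ W)))

    # Hypothetical causal weight matrix for 3 concepts (rows influence columns)
    W = np.array([[0.0, 0.6, -0.4],
                  [0.5, 0.0, 0.3],
                  [-0.2, 0.4, 0.0]])
    state = np.array([0.8, 0.2, 0.5])
    for _ in range(10):                  # iterate until the state settles
        state = fcm_step(state, W)
    print(state.round(3))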
Research on Parallelized Sentiment Classification Algorithms
YU Yong-hong,XIANG Xiao-jun and SHANG Lin
Computer Science. 2013, 40 (6): 206-210. 
The scalability problem has become a bottleneck for traditional stand-alone sentiment classification algorithms due to massive data. We implemented the feature extraction, feature weighting and classification algorithms involved in the sentiment classification task using the MapReduce technique on the Hadoop platform. We evaluated the proposed parallelized sentiment classification algorithms on real data sets in terms of precision and time cost. Experimental results show the effectiveness of these parallelized sentiment classification algorithms and also provide valuable references for users to select suitable sentiment classification algorithms according to their requirements.
Dynamics Behavior and Immune Control Strategies of SIRS Model with Immunization on Scale-free Complex Networks
CHEN Qian-guo and ZHANG Zi-li
Computer Science. 2013, 40 (6): 211-214. 
An infectious SIRS model with artificial immunization was proposed based on the SIRS model and the scale-free nature of complex networks. The dynamic behavior of the model was studied through mean-field theory. We studied the spread of disease in scale-free networks under two different artificial immunization strategies, and simulated the impact of each strategy on disease spreading. The results show that artificial immunization can effectively reduce infection rates and raise the epidemic threshold of the system, so as to effectively control disease spreading on complex networks.
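A homogeneous mean-field version of SIRS dynamics with vaccination can be simulated directly, as a simplification of the paper's degree-dependent scale-free network equations. Here beta is the infection rate, gamma the recovery rate, delta the immunity loss rate and v the artificial immunization rate; all parameter values are hypothetical.

    def sirs_mean_field(beta=0.3, gamma=0.1, delta=0.05, v=0.02,
                        s=0.99, i=0.01, r=0.0, dt=0.1, steps=2000):
        """Euler integration of a homogeneous mean-field SIRS model with
        artificial immunization: S' = -beta*S*I + delta*R - v*S,
        I' = beta*S*I - gamma*I, R' = gamma*I - delta*R + v*S."""
        for _ in range(steps):
            ds = -beta * s * i + delta * r - v * s
            di = beta * s * i - gamma * i
            dr = gamma * i - delta * r + v * s
            s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        return s, i, r

    print(sirs_mean_field())  # (S, I, R) levels after immunization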
Rule Extraction Algorithm Based on Discernibility Matrix in Inconsistent Decision Table
QIAN Wen-bin,YANG Bing-ru,XU Zhang-yan and XIE Yong-hong
Computer Science. 2013, 40 (6): 215-218. 
Since the efficiency of traditional rule extraction algorithms based on the discernibility matrix in inconsistent decision tables is often poor, a quick rule extraction algorithm based on the discernibility matrix was proposed to deal with this problem. The definition of the simplified decision table is first introduced, and duplicate objects are deleted from the decision table. Then the subsets of the discernibility matrix are constructed with respect to different decision classes, which effectively avoids the imbalance of objects and compresses the storage space of the algorithm; a backward greedy heuristic search strategy is adopted to calculate a relatively minimal attribute reduction. Useful decision rules are then extracted based on reliability; moreover, the reliability is given dynamically, so the algorithm has good adaptability. Finally, example analysis and experimental results show that the proposed algorithm can extract effective decision rules from inconsistent decision tables.
Mining Algorithm for Temporal Text Association Rules in Text Mining
ZHANG Chun-yan,MENG Zhi-qing and YUAN Pei
Computer Science. 2013, 40 (6): 219-224. 
Due to frequent updates of the database, a temporal database hides a lot of unknown information; thus, temporal association rules should be generated for the updated database. Although association rule algorithms have been intensively studied, temporal association rule algorithms for text data are still unusual. In this paper, we studied temporal association rule algorithms for text and established a temporal model for text. We then presented SPFM, a mining algorithm for temporal text association rules. Finally, an experiment was carried out to check the effectiveness of the algorithm.
Ensemble Feature Selection Based on Normalized Mutual Information and Diversity
YAO Xu,WANG Xiao-dan,ZHANG Yu-xi and XUE Ai-jun
Computer Science. 2013, 40 (6): 225-228. 
How to generate classifiers with higher diversity is an important problem in ensemble learning. Consequently, an iterative algorithm was proposed: a base classifier is trained using the optimal feature subset selected by maximum normalized mutual information, and the obtained base classifier is then measured by a diversity criterion based on the number of misclassified samples; the algorithm stops if the criterion is satisfied and iterates otherwise. Finally, the weighted voting method is used to fuse the recognition results of the base classifiers. To verify its validity, we conducted experiments on UCI data sets with the support vector machine as the base classifier and compared it with Single-SVM, Bagging-SVM and AB-SVM. Experimental results suggest that our algorithm achieves higher classification accuracy.
Protein-protein Interaction Identification Based on Relational Similarity
FENG Er-ying,NIU Yun,WEI Ou and CAI Xin-ye
Computer Science. 2013, 40 (6): 229-232. 
Current protein-protein interaction (PPI) identification systems use single sentences as evidence and often suffer from the heavy burden of manual annotation. To address these problems, a new relational similarity-based approach using large-scale text as evidence was proposed. First, descriptions of PPIs are obtained by automatically searching the whole PubMed database. Then, three types of features, including lexical features, phrases and dependency relations, are extracted to build a vector space model of PPIs. Finally, the similarity between vectors is measured to classify the relationship between two proteins. In this method, training data is taken from existing PPI databases and no extra annotation work is needed. Experimental results show that this approach achieves a high F-score (74.2%).
Research on Music Emotion Retrieval Technology Employing Fuzzy Mathematics
GAO You-ping,TONG Ming-wen,ZHANG Kai,YE Ju-ping and CHEN Lin-lin
Computer Science. 2013, 40 (6): 233-237. 
How to match retrieval results accurately when music emotion is used as the retrieval keyword is one of the hot issues of current research. Based on the fuzzy characteristics of music emotion, fuzzy mathematics and the AV emotion vector space model were used to propose a novel music emotion fuzzy retrieval technology (MEFRT). Its most significant feature is the quantitative treatment of music emotion: this retrieval method shows that a piece of music contains different types of emotion, and music of the same emotional type can be intelligently ranked according to membership values, so people can find music resources more accurately. In order to prove the effectiveness of MEFRT, a series of experiments was designed, and metrics such as recall, precision, F value, relevance, C value, V value, coverage and novelty rate were calculated.
Study on Optimal Combination of Model of Information Needs Based on Particle Swarm Optimization Algorithm
GUO Shu-hang,DING Xian and WANG Jian
Computer Science. 2013, 40 (6): 238-241. 
To meet the requirements of quantitative portfolio management of information needs, a model based on an improved Particle Swarm Optimization algorithm was proposed to solve the portfolio optimization problem of information needs. First, the application status of PSO in the investment field is discussed. Second, the element model of information needs is defined and two coefficients are set; a new PSO algorithm is then proposed that adds the expected utility coefficient among the information needs and the preference coefficient of decision makers, and it is compared with the traditional PSO algorithm.
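For reference, a standard global-best PSO loop on a toy objective is sketched below; the paper's expected-utility and preference coefficients are not included, and all hyperparameters are conventional defaults.

    import numpy as np

    def pso(objective, dim=2, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Standard global-best PSO minimizing `objective` over [-5, 5]^dim."""
        rng = np.random.default_rng(0)
        x = rng.uniform(-5, 5, (swarm, dim))          # particle positions
        vel = np.zeros((swarm, dim))
        pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((swarm, dim)), rng.random((swarm, dim))
            vel = w * vel + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + vel
            vals = np.apply_along_axis(objective, 1, x)
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    print(pso(lambda p: float(np.sum(p ** 2))))  # sphere function, optimum at 0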
Uncertain Attribute Graph Sub-graph Isomorphism and its Determination Algorithm
ZHANG Chun-ying and ZHANG Xue
Computer Science. 2013, 40 (6): 242-246. 
Expectative sub-graph isomorphism of uncertain attribute graphs is based on the analysis of complex network structures and the characteristics of uncertain attribute graphs. Expectative sub-graph isomorphism uses only one threshold value as its constraint condition; the method is simple, but the computation cost is large. Therefore, the definition of α-β sub-graph isomorphism of uncertain attribute graphs was introduced, its semantics was explained, and the α-β sub-graph isomorphism algorithm was designed and implemented. Experiments proved that α-β sub-graph isomorphism performs better than expectative sub-graph isomorphism, and its variation under different threshold settings was analyzed. The research on the α-β sub-graph isomorphism algorithm lays the foundation for sub-graph queries and community mining on uncertain attribute graphs.
Improved Artificial Bee Colony Algorithms Based on Extremal Optimization Strategy
GE Yu,LIANG Jing and WANG Xue-ping
Computer Science. 2013, 40 (6): 247-251. 
In order to enhance the performance of the artificial bee colony algorithm in solving optimization problems, this paper proposed an improved artificial bee colony algorithm. The improved algorithm redesigns the local search scheme of onlooker bees based on the evolution method of the extremal optimization strategy, implements component mutation operators, and formulates rules for worst-component judgment. Simulation results on eight typical benchmark functions show that, compared with the basic artificial bee colony algorithm and a known improved algorithm, the proposed algorithm attains significant improvements in accuracy and convergence speed and has better solution capability.
Structural Optimization Algorithm for RBF Neural Network Based on Mutual Information
GUO Wei
Computer Science. 2013, 40 (6): 252-255. 
Aiming at designing the simplest RBF neural network architecture, an RBF neural network structure design algorithm based on mutual information was proposed in this paper. The relevance between each hidden unit and the output unit is measured by estimating the mutual information between the hidden unit's output matrix and the output unit using k-nearest-neighbor statistics. The simplest RBF neural network architecture is then achieved by removing the least relevant hidden units from the trained neural network one by one according to the relevance measure. The algorithm has a self-recovery mechanism, so the information processing capacity of the neural network is ensured during the simplification of the network architecture. Simulation results on artificial data sets and real-world benchmark data sets show the effectiveness and stability of the algorithm.
Implementation of Bayesian Inference on MCDB Distributed System
ZHOU Zhi-min and GAO Shen-yong
Computer Science. 2013, 40 (6): 256-259. 
Abstract PDF(380KB) ( 316 )   
References | RelatedCitation | Metrics
This paper describes how the Monte Carlo database system (MCDB) can be used to implement Bayesian inference via Markov chain Monte Carlo (MCMC) over very large datasets. Bayesian linear regression, LDA and Dirichlet clustering are used as examples. To implement an MCMC simulation in MCDB, a programmer specifies, in SQL, the dependencies among variables and how they parameterize one another. The paper devises a simple scheme for developing large-scale machine learning systems in SQL which, with the help of MCDB, deals with parallelization and optimization automatically and achieves high computational efficiency.
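MCDB expresses the dependency structure in SQL; as a plain-NumPy illustration of what one of the example models computes, here is a Gibbs sampler for Bayesian linear regression under standard conjugate priors (this shows the statistical updates only, not MCDB's interface):

    import numpy as np

    def gibbs_linreg(X, y, iters=1000, tau2=10.0, a0=1.0, b0=1.0, seed=0):
        """Gibbs sampling for y ~ N(Xw, s2), w ~ N(0, tau2*I), s2 ~ InvGamma(a0, b0)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, s2 = np.zeros(d), 1.0
        XtX, Xty = X.T @ X, X.T @ y
        samples = []
        for _ in range(iters):
            # w | s2, y  ~  N(mu, V)
            V = np.linalg.inv(XtX / s2 + np.eye(d) / tau2)
            mu = V @ (Xty / s2)
            w = rng.multivariate_normal(mu, V)
            # s2 | w, y  ~  InvGamma(a0 + n/2, b0 + ||y - Xw||^2 / 2)
            resid = y - X @ w
            s2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + resid @ resid / 2.0))
            samples.append(w.copy())
        return np.array(samples)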
Chaotic Time Series Prediction of Sunspots Number with Wavelet Packet-wavelet Neural Network
PAN Yu-min,ZHANG Xiao-yu and ZHANG Quan-zhu
Computer Science. 2013, 40 (6): 260-264. 
Abstract PDF(525KB) ( 414 )   
References | RelatedCitation | Metrics
Sunspots are an important phenomenon of solar activity, influencing the Earth, human beings and the living environment. Because the factors affecting sunspots are hard to identify, wavelet packet decomposition and chaotic phase-space reconstruction of the subsequences are introduced to uncover the dynamics and physical laws of the sunspot time series. The method decomposes the original series by wavelet packet, reconstructs and recovers the impact factors, predicts each subsequence with a wavelet neural network, and obtains the final sunspot prediction by wavelet packet reconstruction. The wavelet neural network toolbox was developed by the authors; it is convenient, converges quickly, has strong data-processing capacity, and gives accurate, practical predictions. The method promotes the use of wavelet neural networks and provides a new way to predict the number of sunspots.
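A much-simplified sketch of the decompose-predict-reconstruct pipeline using PyWavelets. It substitutes an ordinary wavelet decomposition for the paper's wavelet packet transform and a naive persistence forecast for the wavelet neural network, so it only shows the data flow:

    import numpy as np
    import pywt

    def predict_sunspots(series, wavelet="db4", level=3):
        """Decompose, 'predict' each subband one step ahead (naive
        persistence standing in for the wavelet neural network), reconstruct."""
        coeffs = pywt.wavedec(series, wavelet, level=level)
        predicted = [np.append(c[1:], c[-1]) for c in coeffs]  # shift + hold
        return pywt.waverec(predicted, wavelet)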
Modeling Research in Complex Network of Traditional Economic System
BAI Yong and LU Yi-nan
Computer Science. 2013, 40 (6): 265-267. 
Abstract PDF(263KB) ( 542 )   
References | RelatedCitation | Metrics
In contemporary society, modeling economic systems has become a hot topic among scientists, and many complex economic systems exist in nature. When studying the characteristics of such systems with complex mathematical models, the targeted structural characteristics of the complex network model must be taken into account. This paper studies the complex network of the traditional economic and social system and offers new research ideas for treating socio-economic systems as complex networks.
Multi-target Tracking Statistical Techniques in Complex Case
JIN Xin,LIANG Xue-chun and YUAN Xiao-long
Computer Science. 2013, 40 (6): 268-271. 
Abstract PDF(583KB) ( 348 )   
References | RelatedCitation | Metrics
To meet the application requirements of video-based traffic statistics in video surveillance, this paper proposes an improved background-detection and tracking-count method. In traditional Gaussian background modeling, every pixel of each frame is updated and the number of Gaussian distributions is fixed, which increases resource consumption. This paper proposes first finding the region that needs updating and then updating only that region, using a dynamically adjusted number of Gaussian distributions; considering the different characteristics of the mean and variance, their update rates are set separately. Connected-component analysis is applied to create human nodes and obtain their centroids. Finally, by forward-first search over the node list, the target pixels are located in the next frame to determine the new position of each object in the video. Experiments show that the algorithm is simple and feasible, tracks multiple targets accurately, and yields statistics with high accuracy.
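The pipeline can be approximated with OpenCV: the MOG2 subtractor already adapts the number of Gaussians per pixel, which is close in spirit to the selective-update idea, and connected components supply the node centroids. Parameter values below are illustrative:

    import cv2

    def count_targets(video_path, min_area=400):
        """Per frame: foreground mask -> connected components -> centroids."""
        cap = cv2.VideoCapture(video_path)
        bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                                detectShadows=False)
        tracks = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = bg.apply(frame)                    # adaptive per-pixel GMM
            mask = cv2.medianBlur(mask, 5)
            n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
            # label 0 is the background; keep sufficiently large blobs
            nodes = [tuple(centroids[i]) for i in range(1, n)
                     if stats[i, cv2.CC_STAT_AREA] >= min_area]
            tracks.append(nodes)   # match to previous frame by nearest centroid
        cap.release()
        return tracks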
Improved Probabilistic Patch-based SAR Image Despeckling Based on Cluster Analysis and Rotation
HU Kai-yang and GENG Bo-ying
Computer Science. 2013, 40 (6): 272-275. 
Abstract PDF(1208KB) ( 397 )   
References | RelatedCitation | Metrics
Thin details in filtered images are suppressed by the probabilistic patch-based (PPB) filter, which is attributed to the lack of effective selection of pixel patches and an unsuitable weight computation. To address these problems, the cluster-tree data structure is introduced first; the same distance measure as in the PPB filter is used to build the cluster tree, which allows efficient and precise selection of similar patches. Since the original PPB filter cannot handle rotated or mirrored repetitive regions properly, the weight between two patches is redefined after rotating the patches. Finally, the non-iterative PPB filter is used for denoising. Experimental results show that the improved filter preserves texture and details better than the original non-iterative PPB filter, especially thin details.
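The redefined weight can be sketched as taking the best match over the eight rotations and mirrorings of one patch (the exponential kernel and bandwidth h are illustrative; the PPB weight is derived from speckle statistics):

    import numpy as np

    def patch_weight(p, q, h=10.0):
        """Weight between two square patches, invariant to the 8
        rotations/mirrorings of q (sketch of the redefined weight)."""
        candidates = [np.rot90(q, k) for k in range(4)]
        candidates += [np.rot90(np.fliplr(q), k) for k in range(4)]
        d = min(np.sum((p - c) ** 2) for c in candidates)
        return np.exp(-d / (h * h))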
Recommendation Research Based on Improved URP Model and K Nearest Neighbors
XIA Li-min,ZHAO Ye-dong,PENG Dong-liang and ZHANG Wei
Computer Science. 2013, 40 (6): 276-278. 
Abstract PDF(308KB) ( 324 )   
References | RelatedCitation | Metrics
Existing product recommendation methods suffer from problems such as cold start and low accuracy. To address these problems, a new recommendation method based on an improved URP model and K nearest neighbors was proposed. Users and items are modeled by the improved URP model, which handles the new-user problem effectively, and the predicted ratings are optimized by K nearest neighbors to solve the new-item problem. Experimental results show that the new method achieves good recommendation quality.
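The URP model itself is omitted below; the sketch shows only the K-nearest-neighbor correction applied to model-predicted ratings, with item-item cosine similarity and a fixed blend weight as hypothetical choices:

    import numpy as np

    def knn_refine(pred, ratings, k=10, blend=0.5):
        """pred: (n_users, n_items) model (e.g. URP) predictions;
        ratings: observed matrix with 0 for missing entries.
        Blends each prediction with the mean rating of the k most similar items."""
        norms = np.linalg.norm(ratings, axis=0) + 1e-9
        sim = (ratings.T @ ratings) / np.outer(norms, norms)   # item-item cosine
        np.fill_diagonal(sim, 0.0)
        out = pred.copy()
        for j in range(ratings.shape[1]):
            nbrs = np.argsort(sim[:, j])[-k:]                  # k nearest items
            mask = ratings[:, nbrs] > 0
            nbr_mean = np.where(mask.any(axis=1),
                                (ratings[:, nbrs] * mask).sum(axis=1)
                                / np.maximum(mask.sum(axis=1), 1),
                                pred[:, j])
            out[:, j] = blend * pred[:, j] + (1 - blend) * nbr_mean
        return out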
Canny Operator Study Based on GCV Criteria and Otsu
ZHANG Zhi-shun,XI Jian-qing and LIU Yong
Computer Science. 2013, 40 (6): 279-282. 
Abstract PDF(876KB) ( 332 )   
References | RelatedCitation | Metrics
In order to enhance image quality, building on previous studies, we improve the Canny operator to address image blur and unclear edges caused by factors such as lighting and noise. First, by automatically optimizing the GCV-criterion-based threshold function with a genetic algorithm, we reduce image noise and enhance image quality simultaneously. Second, using an adaptive evaluation function to set the thresholds for Otsu's method, we reduce false edges regardless of the threshold ratio. Edge-extraction experiments on different images and data analysis show that the method improves detection accuracy, noise immunity, and operational efficiency significantly. Finally, we point out directions for further research on the remaining shortcomings.
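A common way to couple Otsu's threshold with Canny's hysteresis thresholds is sketched below; the paper's GCV-based denoising step and genetic-algorithm tuning are omitted, and the 0.5 low/high ratio is an illustrative choice:

    import cv2

    def otsu_canny(gray, low_ratio=0.5):
        """Use Otsu's threshold as Canny's high threshold and a fixed
        fraction of it as the low threshold."""
        high, _ = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return cv2.Canny(gray, low_ratio * high, high)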
Cognitive Neural Mechanisms and Saliency Computational Model of Auditory Selective Attention
LIU Yang,ZHANG Miao-hui and ZHENG Feng-bin
Computer Science. 2013, 40 (6): 283-287. 
Abstract PDF(927KB) ( 651 )   
References | RelatedCitation | Metrics
Based on the structure and function of auditory cognitive neural information processing, this paper proposes a new auditory saliency computational model built on the cognitive neural mechanisms of selective attention and, drawing on principles of image processing, presents auditory saliency algorithms. The model simulates both the bottom-up and top-down human auditory attention mechanisms. In extracting selective-attention saliency and suppressing background noise, the model achieves satisfactory results in simulations and in experiments on natural audio.
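One common image-processing-style realization of the bottom-up pathway is center-surround saliency on the spectrogram; the sketch below uses that approach (the paper's actual feature channels and the top-down pathway are not reproduced):

    import numpy as np
    from scipy.signal import spectrogram
    from scipy.ndimage import gaussian_filter

    def auditory_saliency(audio, fs, fine=1.0, coarse=8.0):
        """Bottom-up saliency: center-surround difference between fine and
        coarse Gaussian smoothings of the log spectrogram."""
        _, _, S = spectrogram(audio, fs=fs, nperseg=512, noverlap=384)
        log_s = np.log1p(S)
        center = gaussian_filter(log_s, fine)
        surround = gaussian_filter(log_s, coarse)
        sal = np.maximum(center - surround, 0.0)   # on-center excitation
        return sal / (sal.max() + 1e-9)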
Algorithm for Pavement Distress Image Denoising Based on Gradient Enhanced Diffusion
ZHANG Yong-qiang
Computer Science. 2013, 40 (6): 288-290. 
Abstract PDF(518KB) ( 314 )   
References | RelatedCitation | Metrics
Automatic pavement distress detection with a fast CCD camera suffers from the complicated pavement background, which contains greasy dirt and inclusions. Traditional image-denoising algorithms lose edge and texture features, which seriously undermines the reliability of the detection system. To overcome this problem, a denoising algorithm for images with line-type texture based on gradient-enhanced diffusion was proposed. Taking into account the character of images containing line-type texture, the texture structure is incorporated into the denoising, and the diffusion factor is redefined according to the local change of the gradient. Simulation results demonstrate the superiority and stability of the method for denoising line-type texture images.
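A Perona-Malik-style loop in which the conductance is additionally attenuated where the local gradient energy is high is one plausible reading of "redefining the diffusion factor according to the local change of gradient"; the specific attenuation below is an assumption:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def gradient_enhanced_diffusion(img, iters=20, kappa=15.0, dt=0.2):
        """Diffuse while damping diffusion along line-type textures."""
        u = img.astype(float).copy()
        for _ in range(iters):
            gy, gx = np.gradient(u)
            mag2 = gx * gx + gy * gy
            local = uniform_filter(mag2, size=5)   # local gradient energy
            g = np.exp(-mag2 / (kappa * kappa)) * np.exp(-local / (kappa * kappa))
            # explicit step: u += dt * div(g * grad u)
            div = np.gradient(g * gx, axis=1) + np.gradient(g * gy, axis=0)
            u += dt * div
        return u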
Adaptive Parameters Settings Method of PCNN Based on Visual Information and its Modified Model
ZHAO Yan-ming
Computer Science. 2013, 40 (6): 291-294. 
Abstract PDF(601KB) ( 408 )   
References | RelatedCitation | Metrics
The parameters of the pulse-coupled neural network (PCNN) determine the applicability of the model in digital image processing, but existing adaptive parameter-setting methods rely on image statistics or network structure. Accordingly, an adaptive parameter-setting method for PCNN based on visual information was proposed and the model was improved. By analyzing biological visual perception theory and the PCNN network, the method reveals the correspondence between visual perception theory and the PCNN parameters M, W and β, and gives adaptive settings for M, W and β on the basis of a visual perception model. A PCNN model improved with biological visual features was designed. Experiments verify the geometric invariance of the model and show that it achieves good results in content-based image retrieval.
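A standard simplified PCNN iteration, showing where the linking kernel (here one kernel stands in for both M and W) and the linking strength β enter; the paper's visual-information-based settings of these parameters are not reproduced:

    import numpy as np
    from scipy.ndimage import convolve

    def pcnn(img, beta=0.3, alpha_theta=0.2, v_theta=20.0, iters=10):
        """Simplified pulse-coupled neural network; returns firing maps."""
        S = img.astype(float) / (img.max() + 1e-9)     # stimulus
        K = np.array([[0.5, 1.0, 0.5],
                      [1.0, 0.0, 1.0],
                      [0.5, 1.0, 0.5]])                # linking kernel
        Y = np.zeros_like(S)
        theta = np.ones_like(S)
        fires = []
        for _ in range(iters):
            F = S                                      # feeding input
            L = convolve(Y, K, mode="constant")        # linking input
            U = F * (1.0 + beta * L)                   # internal activity
            Y = (U > theta).astype(float)              # pulse output
            theta = np.exp(-alpha_theta) * theta + v_theta * Y
            fires.append(Y.copy())
        return fires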
Noise Reduction Algorithm for Texture Images Using Anisotropic Diffusion with Double-regularing Terms
LI Xiao-ning,GONG Jia-qiang and XING Hao-yang
Computer Science. 2013, 40 (6): 295-299. 
Abstract PDF(937KB) ( 327 )   
References | RelatedCitation | Metrics
Important information such as textures and edges is often lost or blurred by smoothing algorithms based on P-M (Perona-Malik) anisotropic diffusion. To overcome this defect, we define a new diffusivity operator that controls the velocity of the diffusion and propose a reaction-diffusion equation with double regularizing terms; we also analyze and verify the model's convergence. The diffusivity operator relies on the variation of the gradient within a neighborhood to weaken the diffusion intensity in texture and edge regions. Two forcing terms are used to regulate the diffusion and maintain edges, boundaries and texture. Experimental results and comparisons show the good performance of the proposed method in preserving texture and edges during denoising, and demonstrate that the algorithm is feasible and applicable.
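The abstract does not give the equations; one plausible form of a diffusion step with two forcing terms is a fidelity term pulling toward the noisy input plus a term pulling toward a pre-smoothed version of it. The sketch below assumes that form:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def double_reg_diffusion(f, iters=30, kappa=10.0, dt=0.15,
                             lam=0.05, mu=0.05):
        """u_t = div(g(|grad u|) grad u) + lam*(f - u) + mu*(G*f - u)
        (hypothetical form of the two forcing terms)."""
        u = f.astype(float).copy()
        smooth_f = gaussian_filter(f.astype(float), 2.0)
        for _ in range(iters):
            gy, gx = np.gradient(u)
            g = 1.0 / (1.0 + (gx * gx + gy * gy) / (kappa * kappa))
            div = np.gradient(g * gx, axis=1) + np.gradient(g * gy, axis=0)
            u += dt * (div + lam * (f - u) + mu * (smooth_f - u))
        return u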
Auto Exposure Algorithm for Perception System of Intelligent Vehicle
GU Ming-qin,CAI Zi-xing and YI Liang
Computer Science. 2013, 40 (6): 300-302. 
Abstract PDF(582KB) ( 690 )   
References | RelatedCitation | Metrics
In order to obtain well-exposed, stable images, this paper proposes an automatic exposure algorithm for the camera. First, traffic signs are detected in the first frames, and the region of interest (RoI) is determined among 9 fixed partitions according to the detection results. Then the exposure situation is judged accurately from the histogram of the V component in HSV color space, and a weight matrix is selected according to the location of the RoI and the exposure situation. Finally, the exposure time of the next frame is computed with a gray-value method, completing the camera's automatic exposure. Experimental results show that the method completes automatic exposure quickly and effectively and adapts well to backlit scenes.
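A sketch of the judge-and-update step: measure mean brightness of the V channel under an RoI-emphasizing weight map and scale the next exposure time toward a target level. The 9-partition weight matrices are reduced here to a single RoI weight, and the target and gain values are illustrative:

    import cv2
    import numpy as np

    def next_exposure(frame_bgr, roi, exposure, target=120.0, gain=0.5):
        """roi: (x, y, w, h) from the traffic-sign detector.
        Returns the exposure time for the next frame (gray-value method)."""
        v = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)[:, :, 2].astype(float)
        weights = np.ones_like(v)
        x, y, w, h = roi
        weights[y:y + h, x:x + w] = 3.0            # emphasize the RoI
        mean_v = (v * weights).sum() / weights.sum()
        # multiplicative update toward the target brightness
        return exposure * (1.0 + gain * (target - mean_v) / target)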
Image Thresholding Segmentation Based on Visual Perception and Isoperimetric Cut
ZOU Xiao-lin and FENG Guo-can
Computer Science. 2013, 40 (6): 303-307. 
Abstract PDF(1194KB) ( 376 )   
References | RelatedCitation | Metrics
Because two-dimensional threshold segmentation methods do not consider the characteristics of human visual perception, their search space is the whole two-dimensional gray-level region. Meanwhile, image segmentation based on the isoperimetric cut does not consider gray intensity, and its iteration termination condition is difficult to determine, so its segmentation results are not ideal. This paper presents a novel two-dimensional thresholding method based on visual perception and the isoperimetric cut. The proposed method first uses characteristics of visual perception to determine a set of candidate threshold vectors, then uses the isoperimetric cut as a criterion, selecting the candidate threshold vector with the minimum isoperimetric ratio as the optimal threshold vector. Experimental results on a series of images show that the proposed method outperforms some classic two-dimensional thresholding methods in segmentation quality.
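The selection criterion reduces to computing, for each candidate segmentation, a discrete isoperimetric ratio (boundary length over the smaller region's area) and keeping the minimum. The sketch below shows it for scalar thresholds; the paper searches over two-dimensional threshold vectors:

    import numpy as np

    def isoperimetric_ratio(mask):
        """mask: boolean foreground. Boundary = 4-neighbour label changes;
        ratio = boundary length / min(|foreground|, |background|)."""
        m = mask.astype(bool)
        boundary = (np.count_nonzero(m[1:, :] != m[:-1, :]) +
                    np.count_nonzero(m[:, 1:] != m[:, :-1]))
        area = min(m.sum(), (~m).sum())
        return boundary / max(area, 1)

    def best_threshold(gray, candidates):
        """Pick the candidate threshold with minimum isoperimetric ratio."""
        return min(candidates, key=lambda t: isoperimetric_ratio(gray > t))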
Image Scrambling Algorithm Based on SOMA CUBE Square Matching
FAN Tie-sheng,ZHANG Zhong-qing and ZHANG Pu
Computer Science. 2013, 40 (6): 308-310. 
Abstract PDF(873KB) ( 348 )   
References | RelatedCitation | Metrics
To address common deficiencies of existing scrambling algorithms, an image scrambling algorithm based on SOMA CUBE square matching was proposed. To scramble an image, the bit-planes of the original image are first exchanged to change the pixel gray levels; the image is then divided into blocks according to the SOMA CUBE; finally, two matching solutions of the SOMA CUBE are chosen, one taken as the arrangement of the original image blocks and the other as that of the scrambled image, and converting between the two realizes the scrambling. The scrambled image resembles white noise, avoids the security problem of cyclical recovery, is relatively stable, achieves the desired scrambling effect quickly, and imposes no requirements on image size. Experimental results show that the algorithm effectively scrambles gray images, gives good visual and quantitative evaluation results, and resists certain geometric attacks.
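The first stage is easy to make concrete: swapping a chosen pair of bit-planes of an 8-bit image, as sketched below. The SOMA CUBE block-matching stage depends on the two chosen packing solutions and is omitted:

    import numpy as np

    def swap_bit_planes(img, i, j):
        """Swap bit-planes i and j of an 8-bit grayscale image (0 <= i, j <= 7)."""
        a = (img >> i) & 1
        b = (img >> j) & 1
        keep = img & np.uint8(0xFF ^ ((1 << i) | (1 << j)))  # clear planes i, j
        return keep | (a << j).astype(img.dtype) | (b << i).astype(img.dtype)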
Research on 3D Modeling Algorithm for Complex Plane Pattern
CHEN Yu-tuo,FEI Yong-chao,YAN Jun-ping and HAN Xu-li
Computer Science. 2013, 40 (6): 311-314. 
Abstract PDF(1113KB) ( 345 )   
References | RelatedCitation | Metrics
The algorithm presented in this paper first obtains the data matrix of the clear outline of a planar complex pattern through vectorization and data conversion of the pattern image or graphic; second, it finds the intersection points of scan lines and contours using a scan-analysis method, matches the intersection points and calculates midpoints; finally, it builds the 3D model of the planar complex pattern from a 3D model data matrix created with quadratic Bezier curves and post-optimization of the model data. This modeling approach can conceive and build a virtual model of a plane pattern. It generates smooth, delicate curved surfaces and vivid models that retain the basic structural characteristics of the pattern, and it flexibly controls the form of the generated model. The modeling algorithm has low complexity, fast modeling speed and strong practicality.
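The cross-sections are built from quadratic Bezier curves; evaluating one such curve, for example from a matched pair of contour intersection points with a raised midpoint as the control point, is sketched below:

    import numpy as np

    def quad_bezier(p0, p1, p2, n=32):
        """Sample B(t) = (1-t)^2 p0 + 2t(1-t) p1 + t^2 p2 at n points.
        p0, p2: matched contour points; p1: control point (e.g. the raised midpoint)."""
        t = np.linspace(0.0, 1.0, n)[:, None]
        p0, p1, p2 = map(np.asarray, (p0, p1, p2))
        return (1 - t) ** 2 * p0 + 2 * t * (1 - t) * p1 + t ** 2 * p2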