Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40 Issue 11, 16 November 2013
Research on Permutation Flow-shop Scheduling Problem
LIU Ying,GU Wen-xiang and LI Xiang-tao
Computer Science. 2013, 40 (11): 1-7. 
Abstract PDF(702KB) ( 1391 )   
References | RelatedCitation | Metrics
With the continuous development of science and technology and the constant expansion of production scale, the permutation flow-shop scheduling problem has received more and more attention from scholars. Many optimization algorithms have been proposed in this field, and they greatly increase production efficiency. Some surveys have compared and reviewed these algorithms, but they do not cover the latest solving methods and results. From a new perspective on problem classification, we gave a detailed comparison of the new algorithms, giving scholars an up-to-date and comprehensive understanding.
Cloud Computing Resource Scheduling:Policy and Algorithm
CHU Ya,MA Ting-huai and ZHAO Li-cheng
Computer Science. 2013, 40 (11): 8-13. 
Abstract PDF(522KB) ( 2403 )   
Resource allocation and scheduling (RAS) is a key issue in cloud computing, and its policies and algorithms have a direct effect on cloud performance and cost. Firstly, four hot topics of cloud RAS were presented: locality-aware task scheduling, reliability-aware scheduling, energy-aware RAS, and workflow scheduling. These four topics were then classified into three parts according to their different optimization objectives (performance and cost), and various existing RAS policies and algorithms were discussed in detail. In addition, a comparative analysis of the four problems and their representative algorithms was made. Finally, some future research directions for cloud RAS were pointed out.
Research on WMD Model in Military Simulation and Analysis System
YANG Shan-liang,ZHAO Xin-ye,YANG Mei,FU Yue-wen and ZHOU Yun
Computer Science. 2013, 40 (11): 14-17. 
Abstract PDF(711KB) ( 741 )   
Weapons of mass destruction (WMD) are of great significance as an important part of military simulation and analysis systems. Establishing a proper and correct WMD model makes it possible to simulate and predict the WMD impact process effectively, which provides significant reference information for biochemical-protection decision support operations. Firstly, the status and role of WMD in military simulation and analysis systems were introduced in brief. Secondly, the modeling ideas of the WMD model were described in detail from three aspects: the HPAC model engine, the data-driven mode, and the data acquisition interface. Finally, an illustrative example using a prototype military simulation system was given to verify the correctness and validity of the WMD model.
Research on Top-level Design of Architecture for Cyber-physical Systems
HE Ming,LIANG Wen-hui,CHEN Xi-liang and CHEN Qiu-li
Computer Science. 2013, 40 (11): 18-22. 
Abstract PDF(921KB) ( 480 )   
After the Cyber-Physical System (CPS) was proposed by the National Science Foundation, it occupied first place among the eight important information technology fields in America. By analyzing the current research situation of CPS at home and abroad, three kinds of views of CPS structure were proposed to let CPS coordinate physical processes and meet both functional requirements, such as information processing and physical control, and non-functional requirements, such as real-time performance, reliability, security, and adaptability. The design theory and methods of CPS architecture were studied. Moreover, directions for future study of CPS were put forward.
GPU Parallel Algorithm Based on Power Evaluation Methods
WANG Zhuo-wei,CHENG Liang-lun and ZHAO Wu-qing
Computer Science. 2013, 40 (11): 23-28. 
Abstract PDF(473KB) ( 753 )   
With the continuous development of hardware and software, Graphics Processing Units (GPUs) have been used in the general-purpose computation field. They have emerged as computational accelerators that dramatically reduce application execution time compared with CPUs. To achieve high computing performance, a GPU typically includes hundreds of computing units. This high density of computing resources on a chip brings high power consumption, so power consumption has become one of the most important problems for the development of GPUs. This paper analyzed the energy consumption of parallel algorithms executed on GPUs and provided a method to evaluate the energy scalability of parallel algorithms. The parallel prefix sum was then analyzed to illustrate the method for energy conservation, and the energy scalability was experimentally evaluated using Sparse Matrix-Vector Multiply (SpMV). The results show that the optimal number of blocks, memory choice, and task scheduling are the keys to balancing the performance and energy consumption of GPUs.
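The parallel prefix sum the abstract mentions can be illustrated with a minimal sketch (not the authors' code, and in plain Python rather than a GPU kernel): the scan is split across blocks the way a GPU splits it across thread blocks, so the block count becomes the tunable knob the paper treats as a performance/energy parameter.

```python
# Illustrative sketch: block-wise exclusive prefix sum, mirroring how a GPU
# distributes a scan over thread blocks. `num_blocks` is the tunable parameter.

def blockwise_prefix_sum(data, num_blocks):
    """Exclusive prefix sum computed block by block, then stitched together."""
    n = len(data)
    block_size = (n + num_blocks - 1) // num_blocks
    blocks = [data[i:i + block_size] for i in range(0, n, block_size)]

    # Phase 1: local exclusive scan inside each block (parallel on a GPU).
    local_scans, block_sums = [], []
    for block in blocks:
        scan, acc = [], 0
        for x in block:
            scan.append(acc)
            acc += x
        local_scans.append(scan)
        block_sums.append(acc)

    # Phase 2: scan of the per-block sums gives each block's global offset.
    offsets, acc = [], 0
    for s in block_sums:
        offsets.append(acc)
        acc += s

    # Phase 3: add the block offset to every local result.
    return [v + offsets[b] for b, scan in enumerate(local_scans) for v in scan]
```

On a GPU, phases 1 and 3 run fully in parallel while phase 2 is a short serial step, which is why the choice of block count trades off parallelism against per-block overhead.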
Link-quality-aware Opportunistic Network Coding Mechanism in Wireless Networks
GE Qing,BAI Guang-wei,SHEN Hang,ZHANG Peng and CAO Lei
Computer Science. 2013, 40 (11): 29-34. 
Abstract PDF(508KB) ( 396 )   
Wireless link quality has a significant impact on communication performance, but most existing wireless network coding mechanisms do not take it into account, resulting in degradation of network throughput. To address this problem, this paper proposed a link-quality-aware opportunistic network coding mechanism (LONC) for wireless networks. Taking advantage of the broadcast nature of wireless communication, this mechanism combines network coding with opportunistic forwarding techniques. The expected number of transmissions, a metric for measuring transmission efficiency, is used to compute the utility value of data transmission. On this basis, we assigned a dynamic priority to each packet to be sent according to its utility value. The objective is to ensure that high-priority packets have a better chance of being forwarded, thus achieving much higher throughput. Our simulation results demonstrate that LONC not only improves network throughput but also enhances data transmission quality to a certain extent.
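The priority assignment can be sketched with the commonly used estimator of the expected transmission count, ETX = 1/(df · dr), where df and dr are the forward and reverse delivery ratios. The exact utility formula of LONC is not given in the abstract, so the sketch below is a hypothetical example, not the paper's definition.

```python
# Hypothetical sketch (the paper's exact utility formula is not reproduced
# here): estimate the expected transmission count of each packet's link and
# forward packets on better links (lower expected cost) first.

def etx(forward_ratio, reverse_ratio):
    """Expected number of transmissions for one successful delivery + ACK."""
    return 1.0 / (forward_ratio * reverse_ratio)

def prioritize(packets):
    """Sort packets so those on better links (lower ETX) are forwarded first."""
    return sorted(packets, key=lambda p: etx(p["df"], p["dr"]))

queue = prioritize([
    {"id": "a", "df": 0.5, "dr": 0.8},   # ETX = 2.5
    {"id": "b", "df": 0.9, "dr": 0.9},   # ETX ~ 1.23
])
```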
Crossover Based Ultra-lightweight RFID Authentication Protocol
DU Zong-yin,ZHANG Guo-an and YUAN Hong-lin
Computer Science. 2013, 40 (11): 35-37. 
Abstract PDF(311KB) ( 384 )   
Aiming at the security holes of RFID systems and the cost of tags, we proposed a crossover-based ultra-lightweight RFID authentication protocol (CURAP), and proved the correctness and security of this protocol with the BAN logic formal analysis method. CURAP defines a crossover operation involving bitwise XOR and left rotation. The data updating operation occurs only in the reader; the tag merely performs simple bitwise XOR extraction on the transmitted message. Security analysis and performance evaluation show that CURAP not only provides strong mutual authentication and resists various attacks, but also lessens the computation requirements and storage space on tags, making it fit for low-cost RFID systems.
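The abstract describes the crossover operation only as a combination of bitwise XOR and left rotation; its exact definition is not given here. The sketch below is therefore a hypothetical composition (rotate one operand by the Hamming weight of the other, then XOR) illustrating the kind of ultra-lightweight mixing such protocols use, with an assumed 16-bit tag word width.

```python
# Hedged sketch: a hypothetical XOR-and-rotate mixing step, NOT CURAP's
# actual crossover definition. WIDTH is an assumed tag word width.

WIDTH = 16

def rotl(x, r, width=WIDTH):
    """Left-rotate a width-bit word by r positions."""
    r %= width
    return ((x << r) | (x >> (width - r))) & ((1 << width) - 1)

def crossover(a, b, width=WIDTH):
    """Mix two words using only XOR and rotation (no heavy crypto)."""
    return rotl(a, bin(b).count("1"), width) ^ b
```

Operations of this kind cost a handful of gate-level steps per word, which is what keeps the protocol within the ultra-lightweight class.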
Indoor Positioning Algorithm for WLAN Based on Distribution Overlap and Feature Weighting
XIE Dai-jun,HU Han-ying and KONG Fan-zeng
Computer Science. 2013, 40 (11): 38-42. 
Abstract PDF(412KB) ( 392 )   
Affected by the time-varying and random characteristics of indoor wireless signals, traditional positioning methods that use the mean received signal strength (RSS) as the fingerprint information have poor localization accuracy. This paper proposed a fingerprint matching positioning algorithm based on distribution overlap and feature weighting to resolve this problem. The probability distribution of each access point signal's envelope is used as the location fingerprint feature. Firstly, each fingerprint feature's weight is set by utilizing the connectivity between the terminal and the AP, and the similarity of fingerprint features is indicated by the overlap of the signal envelopes' distributions. Secondly, the sum of the weighted feature similarities is taken as the fingerprint similarity. Finally, the target's position is estimated according to the principle of maximum fingerprint similarity. The experimental results show that the proposed algorithm obtains a significant accuracy improvement and has higher practical value.
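The matching pipeline above (per-AP distribution overlap, weighted sum, maximum-similarity decision) can be sketched as follows. The weights and distributions are illustrative; the paper derives weights from terminal-AP connectivity, which is not modeled here.

```python
# Illustrative sketch: each reference point stores, per access point, a
# discrete RSS distribution. Similarity = weighted sum of per-AP overlaps;
# the estimate is the reference point with maximum similarity.

def overlap(p, q):
    """Overlap of two discrete probability distributions over the same bins."""
    return sum(min(pi, qi) for pi, qi in zip(p, q))

def similarity(observed, fingerprint, weights):
    """Weighted sum of per-AP overlaps between observed and stored distributions."""
    return sum(w * overlap(observed[ap], fingerprint[ap])
               for ap, w in weights.items())

def locate(observed, radio_map, weights):
    """Pick the reference point whose fingerprint is most similar."""
    return max(radio_map, key=lambda pt: similarity(observed, radio_map[pt], weights))
```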
PMIPv6-based Fast Handover Scheme for Network Mobility
TANG Wei,TANG Hong-bo and CHEN Lu
Computer Science. 2013, 40 (11): 43-47. 
Abstract PDF(1009KB) ( 509 )   
A PMIPv6-based fast handover scheme for NEtwork MObility (NEMO) was proposed to solve the problems of long handover delay and the lack of an effective inter-domain handover scheme supporting network mobility in Proxy Mobile IPv6 networks. With the help of an L2 scheme that performs pre-handover before the mobile node gets registered, information is acknowledged and a tunnel for buffering packets is established in advance. Inter-domain handover is supported by extending signaling so that entities in different domains can communicate, and the packet loss problem is eliminated by introducing a double buffering scheme. Analysis and simulation show that this proposal reduces the handover delay compared with existing mechanisms.
Testing of Network Traffic Series in Reconstructed Phase Space Based on Recurrence Rate Feature
XIE Sheng-jun,YIN Feng and ZHOU Xu-chuan
Computer Science. 2013, 40 (11): 48-51. 
Abstract PDF(356KB) ( 599 )   
Traditional linear analysis methods for network traffic time series do not fully use the existing non-linear feature information, so their data analysis performance is limited. An improved testing model of network traffic with phase space reconstruction based on recurrence quantification analysis (RQA) was proposed. On the basis of phase space reconstruction, the recurrence rate (REC) feature was designed as the sequence data support. The average mutual information algorithm and the false nearest neighbors algorithm were used to calculate the key parameters for phase space reconstruction. The determinism and predictability of network traffic were tested based on the recurrence points and lines, and abnormal flow features were detected based on the REC recurrence feature. Simulation results show that the series has strong stability and self-similarity, the feature detection performance is stable and precise, the monitoring precision is improved by 19% with the REC feature, and the abnormal traffic detection precision reaches 99.7%, an improvement of 13.2%. The model shows good performance in the analysis of network traffic and non-stationary data sequences.
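The REC feature has a standard RQA definition: delay-embed the series into phase space and count the fraction of point pairs closer than a threshold ε. A minimal sketch follows; the embedding dimension and delay are fixed by hand here, whereas the paper selects them with the mutual-information and false-nearest-neighbor procedures.

```python
# Sketch of the REC (recurrence rate) feature under stated assumptions:
# dim, delay, and eps are illustrative defaults, not fitted parameters.

def embed(series, dim, delay):
    """Delay-embed a scalar series into dim-dimensional phase-space vectors."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + j * delay] for j in range(dim)) for i in range(n)]

def recurrence_rate(series, dim=2, delay=1, eps=0.5):
    """Fraction of phase-space point pairs closer than eps (Chebyshev norm)."""
    pts = embed(series, dim, delay)
    n = len(pts)
    hits = sum(1 for i in range(n) for j in range(n)
               if max(abs(a - b) for a, b in zip(pts[i], pts[j])) < eps)
    return hits / (n * n)
```

A higher REC indicates that the trajectory revisits the same phase-space regions often, which is the regularity the detection step exploits.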
Parallel Network Initialization Method for WIA-PA Network
XIAO Jin-chao,ZENG Peng,ZHANG Qiong and YU Hai-bin
Computer Science. 2013, 40 (11): 52-57. 
Abstract PDF(575KB) ( 463 )   
Industrial wireless networks based on TDMA technology are widely utilized in factories. This paper proposed a novel fast parallel networking method to ameliorate the funnel effect, which slows down networking efficiency during WIA-PA industrial wireless network initialization. By transforming the superframe structure, selecting the initial time based on residual energy, link status, and starting time, and improving the resource allocation algorithm in the joining process, the method not only achieves parallel networking of the routing devices but also reduces communication collisions and avoids resource conflicts. The experimental results illustrate that this algorithm greatly enhances networking efficiency and significantly shortens the networking period.
Real Time Force Feedback Data Transmission for Remote Haptic Collaboration System Based on TCP/IP Protocol
JIANG Zheng-zheng,GAO Zhan,GU Xiang and CHEN Xiang
Computer Science. 2013, 40 (11): 58-60. 
Abstract PDF(327KB) ( 449 )   
Network time delay and packet loss are important factors affecting the performance of a remote haptic collaboration system. For the purpose of real-time transmission of force feedback data, a new method for real-time force feedback data transmission based on the TCP/IP protocol was proposed and developed. By means of delay processing and interpolation preprocessing, the negative impact on remote haptic perception caused by network communication is reduced to a minimum. The experimental results have justified the feasibility of the method. The proposed method can be applied to telemedicine, tele-education, and virtual reality applications.
New BoD Bandwidth Request Allocation Algorithm
LIU Lei,XIE Jun,HU Gu-yu and TANG Bin
Computer Science. 2013, 40 (11): 61-64. 
Abstract PDF(319KB) ( 433 )   
In the BoD (Bandwidth on Demand) protocol, full utilization of the bandwidth remaining after on-demand allocation can reduce service delay. Firstly, considering how to make efficient use of the remaining bandwidth and its effect on feedback control, this paper proposed a new BoD bandwidth request allocation algorithm based on prediction and on-board control. Secondly, in view of both the long- and short-range dependence of network flows, we proposed the ARFIMA(p,d,q) model for traffic modeling and prediction, which can represent both long- and short-range dependence according to different values of the parameter d. Finally, we employed OPNET to build the simulation system. The simulation results indicate that, for both long- and short-range dependent service flows, the BoD request allocation scheme based on ARFIMA(p,d,q) prediction and on-board feedback control reduces service delay to the greatest degree with the help of the remaining bandwidth.
Multi-population Firefly Based Routing Protocol in Underwater Acoustic Sensor Networks
XU Ming and LIU Guang-zhong
Computer Science. 2013, 40 (11): 65-69. 
Abstract PDF(422KB) ( 356 )   
Considering the unique characteristics of underwater acoustic sensor networks (UWASNs), this paper proposed a Multi-population Firefly based Routing (MFR) protocol to ensure that data packets can be correctly and efficiently forwarded in UWASNs. Firstly, we presented the network model of our UWASNs. We then designed three kinds of fireflies and their coordination rules in order to improve the self-adaptability of building, selecting, and optimizing routing paths. We demonstrated through simulations that our routing protocol outperforms traditional protocols in packet delivery ratio, end-to-end delay, and network throughput under the same circumstances.
New Clustering Algorithm Based on Data Field in Complex Networks
LIU Yu-hua,ZHENG Yi,XU Cui and JIN Jian-zhi
Computer Science. 2013, 40 (11): 70-73. 
Abstract PDF(394KB) ( 456 )   
Focusing on current hot issues in clustering, this article presented a new clustering algorithm based on data fields in complex networks. It calculates each node's importance using a mutual-information method and divides the network into cluster structures according to the nodes' potentials. Experiments show that the algorithm has certain advantages in accuracy and computational complexity.
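In data-field theory a node's potential is typically the sum of Gaussian-decayed contributions from the other nodes, scaled by each node's "mass" (importance). The sketch below is illustrative only: the mass values, distance function, and influence factor sigma are assumptions, and the paper's mutual-information importance measure is not modeled.

```python
# Hedged sketch of a data-field potential: mass-weighted Gaussian kernels.
# `mass`, `dist`, and `sigma` are illustrative placeholders.

import math

def potential(node, nodes, mass, dist, sigma=1.0):
    """Data-field potential of `node`: sum of mass-weighted Gaussian kernels."""
    return sum(mass[other] * math.exp(-(dist(node, other) / sigma) ** 2)
               for other in nodes if other != node)
```

Nodes with high potential sit at the centers of dense regions, so cluster structures can be grown around the local potential maxima.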
Methods and Mechanisms on Transmission Control for M2M Communications
WANG Qun and QIAN Huan-yan
Computer Science. 2013, 40 (11): 74-80. 
Abstract PDF(641KB) ( 377 )   
M2M communications are an important part of the Internet of Things (IoT). Among them, IP-based M2M communication, which achieves unified addressing and coordinated, unified management of nodes, has been a hot research topic for many years. In the traditional TCP error control mechanism, data loss is simply treated as network congestion, and congestion control mechanisms are used to handle it. Traditional TCP ignores wireless link imperfections such as high bit error rates, frequent topology changes, channel asymmetry, and the unfairness of MAC protocols. Therefore, traditional TCP is not suitable for reliable M2M communications. This paper analyzed the TCP deficiencies that impact M2M communications and the corresponding improvement methods, and discussed appropriate solutions for end-to-end connections, split connections, and mixed connections respectively. Finally, the paper proposed ideas for improving TCP and research directions for M2M communications.
Security Protocol Based on IEC 60870-5-104 for Communication in Distribution Automation
MA Jun and ZHANG Yi-bin
Computer Science. 2013, 40 (11): 81-84. 
Abstract PDF(347KB) ( 368 )   
There are cyber security risks in the communication process of distribution automation systems (DAS) based on the IEC 60870-5-104 protocol. In order to realize mutual authentication and shared key establishment between the DAS Front-End Processor (FEP) and any terminal, this article presented a scheme based on a unidirectional digital signature and the unidirectional Keyed-Hashing for Message Authentication (HMAC) algorithm. It analyzed the features of the EPON-based communication network architecture in DAS and the corresponding cyber security risks and security requirements, and showed the implementation procedure of the scheme. The scheme does not require changing the original software and hardware of legacy terminals, and it accommodates resource-constrained terminals by using dedicated security devices. Security analysis proves that the scheme can resist outsider attacks, replay attacks, and impersonation attacks. Compared with related works, the proposed scheme is more secure and practical, and can satisfy the application requirements.
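The HMAC leg of such a scheme can be sketched with the Python standard library. This is a partial illustration only: the scheme's unidirectional digital signature and its key establishment are omitted, and the shared key below is simply assumed to exist. A fresh nonce in each challenge is what defeats replay attacks.

```python
# Partial sketch (signature and key establishment omitted): once the FEP and
# a terminal share a key, each reply is authenticated with an HMAC bound to
# the FEP's fresh challenge nonce.

import hashlib
import hmac
import os

def tag_message(key, nonce, payload):
    """HMAC-SHA256 over nonce || payload, binding the reply to the challenge."""
    return hmac.new(key, nonce + payload, hashlib.sha256).digest()

def verify(key, nonce, payload, tag):
    """Constant-time comparison to avoid timing side channels."""
    return hmac.compare_digest(tag_message(key, nonce, payload), tag)

key = os.urandom(32)      # shared key; establishment not shown
nonce = os.urandom(16)    # FEP's fresh challenge
tag = tag_message(key, nonce, b"telemetry frame")
```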
Research on Application of Network Protocol Parsing Class System Based on Multi-core Optimization
LI Chang-rong and WU Di
Computer Science. 2013, 40 (11): 85-88. 
Abstract PDF(386KB) ( 388 )   
When the data transmission speed handled by a network traffic monitoring system is too fast, packet loss, transmission stops, and response errors occur. To resolve this problem, this paper proposed an evaluation index for such systems that uses system throughput as the core index for evaluating optimization performance; selected a network protocol parsing system for multi-core optimization research, taking the GTP-AS system as the specific optimization objective; and, according to the system's performance bottlenecks, put forward a set of multi-core platform optimization strategies. The experiments prove that when the number of computing cores is increased to 7, the throughput of the multi-core-optimized network protocol parsing system reaches 391.73% of that before optimization, effectively improving system performance.
Identifying User’s ID from Anonymous Mobility Trace Set via Asynchronous Side Information
ZHANG Hong-ji,LI Wen-zhong and LU Sang-lu
Computer Science. 2013, 40 (11): 89-93. 
Abstract PDF(406KB) ( 373 )   
With the development of social network applications, and to meet the demands of mobile system design and scientific research, plenty of location trace information has been collected and published. Most public traces use anonymous IDs and added noise to protect users' privacy. However, these measures are vulnerable to asynchronous attacks: even if only partial information is exposed to the adversary, and the collection of side information and the public trace set do not cover the same duration, the adversary can still identify a user's ID with high accuracy. Our experiments show that the existing method is not applicable when facing asynchronous side information. A novel method applicable in the asynchronous condition, called the hot-matrix method, was proposed. To verify this method, we conducted experiments on three different mobility trace sets, whose subjects are taxis, buses, and human beings respectively. The experiments show that the hot-matrix method performs much better than the existing approach.
Construction of MAI Function Based on T-D Conjecture
ZHANG Zhe-lin and ZHOU Meng
Computer Science. 2013, 40 (11): 94-97. 
Abstract PDF(319KB) ( 329 )   
An improvement was made on the construction method and the relevant conclusions of the combinatorial conjecture proposed by Ziran Tu. Under the premise that the more general combinatorial conjecture still holds, a class of Boolean functions f with maximum algebraic immunity on an even number of variables was presented, and using the functions f, a new class of balanced Boolean functions F with maximum algebraic immunity on an even number of variables was obtained. These functions not only have high algebraic degree and nonlinearity, but also have strong resistance against algebraic attacks.
Formal Security Model Resisting Session Exponential Reveal for Key Agreement Protocol
TAO Wen-jun and HU Bin
Computer Science. 2013, 40 (11): 98-102. 
Abstract PDF(443KB) ( 551 )   
Based on the assumption of ephemeral exponent leakage in the eCK model, this paper analyzed the effect of this hidden danger and built a new formal security model in which a much stronger adversary can be resisted and a new security attribute can be satisfied. Furthermore, a provably secure key agreement protocol, HCMQV, was designed in this new model. The protocol modifies the generation function of e in a natural way and keeps it secret. This method reduces the number of hash operations and also avoids the reflection attack. To prove the security of HCMQV, we did not prove the unforgeability of the signature scheme as in HMQV; instead, a distinguisher was constructed to tightly reduce the security of the protocol to the DDH assumption. In fact, designing a secure key agreement protocol in which the ephemeral exponent can be leaked is possible.
Lightweight Authentication Protocol for RFID Based on Model of Untraceability
CHEN Xiu-qing,CAO Tian-jie and GUO Yu
Computer Science. 2013, 40 (11): 103-107. 
Abstract PDF(381KB) ( 378 )   
We analyzed the security of RFID authentication protocols recently proposed by Ha et al. Our security analysis clearly highlights important security weaknesses in their work. More precisely, an adversary can analyze the messages between reader and tag and mount traceability attacks. The adversary can then perform passive full-disclosure attacks and disclose the tags' secrets. In order to evaluate the performance of the traceability attacks, we obtained toy experimental results by running several tests on an implementation of the protocol. Finally, we used the formal model of untraceability to prove the robustness against tracing attacks and the untraceability of the enhanced scheme.
Method of Network Security Situation Prediction Based on IHS_RELM
CHEN Hong,WANG Fei and XIAO Zhen-jiu
Computer Science. 2013, 40 (11): 108-111. 
Abstract PDF(308KB) ( 421 )   
To address the situation prediction problem in network security situation awareness, this paper presented a prediction method for the network security situation based on the IHS_RELM algorithm. We proposed an improved harmony search (IHS) algorithm after studying the principle of the harmony search (HS) algorithm. The method embeds the regularized extreme learning machine (RELM) in the objective function calculation of the improved harmony search algorithm, and takes advantage of the global searching ability of the IHS algorithm to optimize the input weights and hidden layer thresholds of the RELM. To some extent, this enhances the learning and generalization abilities of the RELM. Simulation experiments show that this method achieves better prediction performance than other existing prediction methods.
Provably Secure and Efficient Certificateless Signcryption Scheme
SUN Hua and ZHENG Xue-feng
Computer Science. 2013, 40 (11): 112-116. 
Abstract PDF(470KB) ( 618 )   
Certificateless cryptography eliminates the key escrow problem inherent in identity-based cryptosystems and avoids the complex certificate management problem in traditional certificate-based public-key cryptosystems, so it combines the best advantages of both. Signcryption is a cryptographic primitive that achieves authentication and confidentiality simultaneously by combining digital signature and public key encryption, at a lower computational cost than signing and encrypting separately. In this paper, a provably secure certificateless signcryption scheme was proposed, which requires only two bilinear pairing operations in the unsigncryption phase and is much more efficient than existing schemes. Finally, we proved that it satisfies indistinguishability against adaptive chosen ciphertext attack and existential unforgeability against adaptive chosen message and identity attack, using complexity assumptions in the random oracle model.
Multiple Pattern Algorithm for Chinese
HOU Zheng-feng,YANG Bo and ZHU Xiao-ling
Computer Science. 2013, 40 (11): 117-121. 
Abstract PDF(637KB) ( 391 )   
Because of the independence of Chinese characters and the very large character set, the space and time performance of the AC algorithm declines sharply for Chinese text. To address this problem, the storage structure of the AC algorithm was improved, and a multi-pattern algorithm for Chinese named AC_SC was proposed. The algorithm uses an adjacency list to store the finite state automaton, which solves the problem of rapid expansion of the storage space. Besides, the long linked list of state 0 is changed into a hash linked table to improve matching efficiency. Experimental results show that AC_SC has better time and space performance.
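The sparse-storage idea can be illustrated with a standard Aho-Corasick automaton whose per-state transitions live in hash maps rather than dense arrays, which is what keeps memory bounded when the alphabet is as large as the Chinese character set. This is an illustrative textbook implementation, not the AC_SC code.

```python
# Illustrative sketch (not AC_SC): Aho-Corasick with per-state hash (dict)
# transitions, so states only store the characters they actually use.

from collections import deque

def build_ac(patterns):
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                      # build the trie
        s = 0
        for ch in pat:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(pat)
    q = deque(goto[0].values())               # BFS to set failure links
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]            # inherit matches via failure link
            q.append(t)
    return goto, fail, out

def search(text, ac):
    """Return (end_index, pattern) for every pattern occurrence in text."""
    goto, fail, out = ac
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        hits.extend((i, p) for p in out[s])
    return hits
```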
Efficient Fair Pseudonym Management Model
ZHU Xiao-ling,LU Yang,ZHANG Ben-hong and HOU Zheng-feng
Computer Science. 2013, 40 (11): 122-125. 
Abstract PDF(325KB) ( 473 )   
Anonymity is an effective approach to privacy protection. Because of the illegal operations of malicious users, anonymity needs to be disclosable in some applications. However, there are two problems in existing traceable anonymous schemes. The first is that the rights of the administrator are too extensive. The second is that the overhead of storing and searching the relation between an ID and its secret grows as the number of users increases. This paper proposed an efficient fair pseudonym management model composed of pseudonym issuance, pseudonym application, and joint tracking. A new partially blind signature protocol was given to ensure that the CA takes part in pseudonym issuance but is unable to track users. A secret sharing method was proposed to ensure that tracking authorities jointly disclose a pseudonym. The above two problems are solved effectively by separating issuance from tracking, without storing or searching the relation between ID and secret. The analysis shows that the model has the characteristics of anonymity, traceability, unforgeability, robustness, and fairness, so it can be applied to anonymous communication with tracking requirements on the Internet. Moreover, it can link up well with traditional PKI techniques.
Analysis of News Diffusion in Recommender Systems Based on Multidimensional Tastes
WANG Guan-nan,CHEN Duan-bing and FU Yan
Computer Science. 2013, 40 (11): 126-130. 
Abstract PDF(420KB) ( 366 )   
How to deliver the right information to the right person to meet users' individual needs is a basic problem in recommender systems. The emerging social recommender systems are personalized systems that share information among users with similar interests. Using multidimensional vectors to characterize users' interests, simulating with a multi-agent model, and factoring the quality of users and news into recommendation, this paper analyzed the impact of the leader-follower network structure and quality factors on the recommendation and diffusion of news. The results indicate that different communities have different themes, and that the core users of a community do not concentrate on only one category but share the same interests as the community theme. Additionally, introducing the quality of users and news into the system not only speeds up convergence to a higher recommendation success rate, but also distinguishes the followers of different users and the propagation behaviors of different news, while raising the influence of excellent users and news and improving the professional level of recommendation.
Multi-attribute Ranked Keyword Search over Encrypted Cloud Data
FENG Gui-lan and TAN Liang
Computer Science. 2013, 40 (11): 131-136. 
Abstract PDF(589KB) ( 551 )   
Searching over encrypted data is an important auxiliary function for the cloud. The confidentiality-preserving rank-ordered search algorithm computes document relevance scores with only one local keyword attribute, so its precision is low. To solve this problem, multi-attribute ranked keyword search over encrypted cloud data was introduced. Firstly, the cloud service provider (CSP) builds a secure index of multi-attribute feature vectors based on the keywords' local and global attributes. Secondly, the local and global attribute weights are determined by the user's sorting preferences, and the relevance score is calculated by a multi-attribute ranking formula. Finally, the CSP returns the results of interest to the user. Experiments show that the method can effectively improve the retrieval speed and the accuracy of search results.
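The local/global weighting can be sketched in plaintext terms (the paper's encrypted index is not modeled). Here the local attribute is taken as a keyword's frequency within one document and the global attribute as its rarity across the collection; the formula and default weights are hypothetical examples, not the paper's ranking formula.

```python
# Hypothetical scoring sketch: combine a keyword's local attribute (frequency
# in one document) and global attribute (rarity across the corpus) with
# user-chosen weights, then rank documents by descending score.

import math

def relevance(doc_words, query, corpus, w_local=0.6, w_global=0.4):
    """Weighted multi-attribute score for one document against one query term."""
    local = doc_words.count(query) / len(doc_words)   # term frequency
    df = sum(1 for d in corpus if query in d)         # document frequency
    glob = math.log((1 + len(corpus)) / (1 + df))     # smoothed IDF
    return w_local * local + w_global * glob

def ranked_search(query, corpus, **weights):
    """Return document indices ordered by descending relevance."""
    return sorted(range(len(corpus)),
                  key=lambda i: relevance(corpus[i], query, corpus, **weights),
                  reverse=True)
```

Adjusting `w_local` and `w_global` plays the role of the user's sorting preferences in the abstract.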
PSO-based K-means Algorithm and its Application in Network Intrusion Detection
FU Tao and SUN Wen-jing
Computer Science. 2013, 40 (11): 137-139. 
Abstract PDF(238KB) ( 448 )   
PSO is a swarm-intelligence-based optimization and search algorithm with high efficiency and fast convergence. In this paper, it is combined with the K-means algorithm for network intrusion detection. Experiments show that the PSO-based K-means algorithm overcomes the shortcomings of K-means, which is sensitive to the initial cluster centers, outliers, and noise, and easily falls into local optima. The resulting algorithm converges quickly and has higher detection accuracy.
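The combination can be sketched as PSO searching for cluster centers that minimize the sum of squared errors, sidestepping K-means' sensitivity to initialization; a regular K-means pass could then refine the result. This is a 1-D toy with illustrative PSO parameters, not the paper's implementation.

```python
# Toy sketch (parameters illustrative): PSO optimizes cluster centers by
# minimizing SSE; each particle's position is a full set of k centers.

import random

def sse(centers, points):
    """Sum of squared distances from each point to its nearest center."""
    return sum(min((p - c) ** 2 for c in centers) for p in points)

def pso_centers(points, k=2, swarm=12, iters=60, seed=7):
    rng = random.Random(seed)
    lo, hi = min(points), max(points)
    pos = [[rng.uniform(lo, hi) for _ in range(k)] for _ in range(swarm)]
    vel = [[0.0] * k for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda c: sse(c, points))[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(k):    # standard inertia + cognitive/social update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.4 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sse(pos[i], points) < sse(pbest[i], points):
                pbest[i] = pos[i][:]
                if sse(pbest[i], points) < sse(gbest, points):
                    gbest = pbest[i][:]
    return sorted(gbest)
```

Because the global best only ever improves, the returned centers are at least as good as the best random initialization, which is precisely the robustness to initial centers that motivates the hybrid.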
Database as Service System for Business Database Application Hosting and its Privacy Preservation Mechanism
CHEN Ping,ZHANG Tao,ZHAO Min,YUAN Zhi-jian and YANG Lan-juan
Computer Science. 2013, 40 (11): 140-142. 
Abstract PDF(355KB) ( 406 )   
Database as a Service (DBaaS) is becoming a research hotspot in cloud computing. As a main application domain, business database application hosting puts forward requirements of isolation and privacy preservation for the hosted data. To satisfy these requirements, this paper proposed a virtual-machine-based database hosting method and a corresponding DBaaS system based on CryptDB. The system stores hosted data in encrypted form and can execute SQL queries over the encrypted data. The experiments show that, compared with a fully homomorphic encryption system, this system suffers lower performance loss and better balances privacy protection and practicality.
Research of Buffer Overflow Vulnerability Discovering Analysis and Exploiting
SHI Fei-yue and FU De-sheng
Computer Science. 2013, 40 (11): 143-146. 
Abstract PDF(591KB) ( 420 )   
Currently, the problem of software security vulnerabilities is becoming more serious, and buffer overflow vulnerabilities still affect the security of networks and distributed systems. It is therefore very important to research buffer overflow vulnerability discovery, analysis, and exploitation for the security of system software. In this paper, first of all, the principle of buffer overflow and the techniques of vulnerability discovery, analysis, and exploitation were discussed. Then a vulnerability discovery and analysis method combining static analysis with dynamic analysis was proposed, a complete vulnerability discovery and analysis process was presented, and the availability and effectiveness of the method were verified with an actual Microsoft Office vulnerability. Finally, on the basis of this theory and technology, a vulnerability discovery and analysis system, VulAs, was designed and realized on the Windows platform to assist the discovery and analysis of vulnerabilities, and the effectiveness of the tool was verified.
Performance Monitoring and Analysis of Heterogeneous Cloud Platforms
HUANG Xiao-fei,BAI Xiao-ying and YUAN Li-jie
Computer Science. 2013, 40 (11): 147-151. 
Abstract PDF(988KB) ( 388 )   
Cloud computing is a new computing and service model that has risen in recent years. Through multi-tenancy and a pay-as-you-go model, it elastically provides users with seemingly unlimited computing and storage resources, and it has become a focus of new computing models in the Internet environment. How to compare and evaluate the performance of heterogeneous cloud platforms from different vendors, so as to give users a basis for platform selection, is one of the open problems in cloud performance evaluation. To address it, a performance monitoring system spanning multiple cloud platforms was designed and implemented. It provides real-time monitoring of user-defined performance data, comparative performance analysis across cloud platforms, and graphical presentation of the results.
Entropy Analysis of Testing-based Fault Localization and Similarity-aware Fault Localization
WANG Zhen-zhen,XU Bao-wen,ZHOU Yu-ming and CHEN Lin
Computer Science. 2013, 40 (11): 152-157. 
Abstract PDF(443KB) ( 342 )   
We presented an entropy model for testing-based fault localization (TBFL) approaches and the similarity-aware fault localization (SAFL) approach, and used the model to compare and analyze the Dicing, TARANTULA and SAFL approaches on an example. The analysis demonstrates that the entropy model not only yields a new type of TBFL but also provides a principled framework for constructing and analyzing various TBFL approaches and the SAFL approach.
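The entropy view can be illustrated with a small sketch (hypothetical coverage counts; `tarantula` is the standard TARANTULA suspiciousness formula, and the entropy is taken over the normalized suspiciousness distribution, so lower entropy means suspicion is concentrated on fewer statements):

```python
import math

def tarantula(passed, failed, total_passed, total_failed):
    """TARANTULA suspiciousness of one statement from the number of
    passing/failing tests that execute it."""
    fail_ratio = failed / total_failed if total_failed else 0.0
    pass_ratio = passed / total_passed if total_passed else 0.0
    if fail_ratio + pass_ratio == 0:
        return 0.0
    return fail_ratio / (fail_ratio + pass_ratio)

def localization_entropy(scores):
    """Shannon entropy of the normalized suspiciousness distribution:
    lower entropy = suspicion concentrated on fewer statements."""
    total = sum(scores)
    probs = [s / total for s in scores if s > 0]
    return -sum(p * math.log2(p) for p in probs)

# 4 statements; (executed-by-passing, executed-by-failing) counts over
# 3 passing and 2 failing tests -- hypothetical numbers
counts = [(3, 0), (1, 2), (2, 2), (0, 2)]
scores = [tarantula(p, f, 3, 2) for p, f in counts]
print([round(s, 2) for s in scores], round(localization_entropy(scores), 3))
```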
Generic Framework for Synchronizing Domain Feature Model and Application Feature Models in Software Product Line Evolution
HUANG Yang,SHEN Li-wei and PENG Xin
Computer Science. 2013, 40 (11): 158-163. 
Abstract PDF(424KB) ( 395 )   
In a software product line, the domain feature model and the application feature models need to stay consistent during evolution, yet they usually evolve separately. Building a synchronization facility for each kind of feature model is time-consuming and error-prone, so we proposed a generic framework for synchronizing the domain feature model and the application feature models during evolution, consisting of a generic feature meta-model and synchronization rules defined over it. With the framework, a software organization only needs to define the transformation between its specific feature model and the generic one. An example was used to verify the usability of the framework.
Ripple-effect Analysis of Software Evolution Based on Component
YU Yong,WANG Li-xia and ZHAO Na
Computer Science. 2013, 40 (11): 164-168. 
Abstract PDF(378KB) ( 285 )   
With the adoption of new technology and changes in the system environment, the evolution of components and component-based software systems is inevitable, and such evolution affects the overall behavior of the system. This paper analyzed the impact of coupling on the ripple effect of evolution in component-based software systems, and gave matrix representations of the dependency relationships within a component and of the various coupling relationships between the components and connectors of the system. Based on matrix transformation and calculation, the ripple effect of software evolution was analyzed, and an approach to ripple-effect analysis of the dynamic evolution of a software system was presented that can prevent the ripple effect from magnifying. During the dynamic evolution of a component-based system, the affected components and connectors can be obtained from the ripple-effect analysis, which ensures the consistency and continuity of the dynamic evolution.
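The matrix view of ripple effects can be sketched as a transitive closure over a component dependency matrix (a minimal illustration under invented data, not the paper's exact matrix calculus):

```python
def ripple_set(dep, changed):
    """dep[i][j] == 1 means component j directly depends on component i,
    so a change to i ripples to j.  A Warshall-style transitive closure
    yields every component a change can reach."""
    n = len(dep)
    reach = [row[:] for row in dep]
    for _ in range(n):
        for i in range(n):
            for j in range(n):
                if any(reach[i][k] and dep[k][j] for k in range(n)):
                    reach[i][j] = 1
    affected = set(changed)
    for i in changed:
        affected |= {j for j in range(n) if reach[i][j]}
    return affected

# chain 0 -> 1 -> 2 plus an isolated component 3:
# changing 0 ripples to 1 and 2 but never to 3
dep = [[0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
print(ripple_set(dep, [0]))
```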
Research of Software Complexity Metric Attributes Feature Selection Based on LASSO-LARS
ZHOU Yan-zhou,QIAO Hui,WU Xiao-ping,SHAO Nan and HUI Wen-tao
Computer Science. 2013, 40 (11): 169-173. 
Abstract PDF(434KB) ( 509 )   
To cope with the dimension disaster of software complexity metric attributes in early software reliability prediction, this paper put forward a feature selection method for software complexity metric attributes based on the Least Absolute Shrinkage and Selection Operator (LASSO) and the Least Angle Regression (LARS) algorithm. The method filters out metric attributes that have little influence on the early prediction results and obtains the subset of key attributes most closely associated with them. The paper first analyzed the characteristics of LASSO regression and its application to feature selection, then modified the LARS algorithm so that it can solve the LASSO problem and produce the relevant subsets of complexity metric attributes, and finally combined it with a Learning Vector Quantization (LVQ) neural network to carry out early software reliability prediction experiments, using 10-fold cross-validation. The experimental results indicate that the method improves the accuracy of early software reliability prediction.
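As a rough illustration of the selection effect LASSO provides (the paper solves it with LARS; this sketch uses plain cyclic coordinate descent instead, on synthetic data where only two of five attributes matter):

```python
import numpy as np

def lasso_cd(X, y, lam, iters=200):
    """LASSO by cyclic coordinate descent with soft thresholding;
    coefficients driven exactly to zero mark the attributes to discard."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]        # residual without feature j
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j])
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                     # 5 candidate metric attributes
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=100)
w = lasso_cd(X, y, lam=10.0)
selected = [j for j in range(5) if abs(w[j]) > 1e-6]
print(selected)
```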
Process-driven Method of Semi-automatic Semantic Web Service Composition and its Application
XIONG Li-rong,YU Hui,FAN Jing and DONG Tian-yang
Computer Science. 2013, 40 (11): 174-180. 
Abstract PDF(1397KB) ( 344 )   
As the number of Web services grows rapidly, how to compose Web services into loosely coupled systems that fulfill flexible business requirements has become a popular research topic. Traditional process-driven service composition methods and frameworks predefine the composite process; when the local business process changes, the whole composite process must be redefined, which lacks flexibility. This paper introduced a semi-automatic semantic service composition method in which an abstract service describes the local dynamic process and is automatically replaced by a composite service at execution time when no candidate concrete service suits it. A backward-tree-based service composition method was proposed for this replacement, semantic information was applied to improve the recall rate of Web service discovery, and a QoS evaluation method was introduced for selecting among candidate composite processes. On this basis, the paper proposed a semi-automatic semantic service composition framework in which OWL-S describes the dynamic process. Finally, a credit evaluation application is presented that demonstrates the availability and flexibility of the framework.
Framework for Model Checking CSP with Traces of Processes
ZHAO Ling-zhong,ZHAI Zhong-yi and QIAN Jun-yan
Computer Science. 2013, 40 (11): 181-186. 
Abstract PDF(552KB) ( 455 )   
Communicating Sequential Processes (CSP) is a classic formalism for describing concurrent systems and network security protocols. Verifying CSP involves a complex conversion that translates processes into a labelled transition system and describes the properties to be verified as traces; the main weakness of this method lies in describing liveness properties. This paper proposed a framework for model checking CSP with traces in which properties are specified in LTL, a universal property specification language. A verification system for CSP was then implemented with answer set programming. It is shown that the system describes CSP processes more expressively and verifies them more accurately than a similar system developed previously, and when a property does not hold, the system returns counterexamples for it.
Trustworthiness Evaluation Method for CPS Software Based on Multi-attributes
RONG Mei
Computer Science. 2013, 40 (11): 187-190. 
Abstract PDF(314KB) ( 409 )   
A Cyber-Physical System (CPS) is a new kind of system that couples the physical world with computation. A CPS contains many kinds of software, each running in a dynamic environment, which makes properties such as correctness, safety and reliability hard to ensure, while trustworthiness evaluation can provide evidence for software quality management. We therefore proposed a multi-attribute method for evaluating the trustworthiness of CPS software. We first proposed a multi-attribute trustworthiness evaluation indicator system, then put forward an evaluation method that takes the aging of trustworthiness evidence into account, and designed a set of decision rules to interpret the evaluation result. Based on the interaction logic of the software, the trustworthiness level of a CPS software system was then evaluated. Finally, an example was used to show the effectiveness of the framework.
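One plausible reading of "aging of trustworthiness evidence" is to discount older evidence exponentially before taking a weighted multi-attribute average; the function, the half-life and the sample data below are illustrative assumptions, not the paper's actual formulas:

```python
def trust_score(evidences, weights, now, half_life=30.0):
    """Hypothetical multi-attribute aggregation: each attribute carries
    timestamped evidence scores in [0,1]; older evidence is discounted
    exponentially (the aging idea), then the per-attribute scores are
    combined by a weighted average."""
    attr_scores = []
    for samples in evidences:                 # one (time, score) list per attribute
        num = den = 0.0
        for t, s in samples:
            w = 0.5 ** ((now - t) / half_life)   # exponential decay with age
            num += w * s
            den += w
        attr_scores.append(num / den if den else 0.0)
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, attr_scores)) / total

# correctness, safety, reliability evidence as (day, score) pairs
ev = [[(0, 0.9), (60, 0.5)], [(60, 1.0)], [(30, 0.8), (60, 0.8)]]
print(trust_score(ev, [0.5, 0.3, 0.2], now=60))
```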
Research on Runtime Monitoring for Self-adaptive and Reconfigurable Software Systems
TANG Shan,LI Li-ping and TAN Wen-an
Computer Science. 2013, 40 (11): 191-196. 
Abstract PDF(546KB) ( 400 )   
Runtime monitoring is an important part of research on self-adaptive software systems and an important design principle for building dependable software systems. However, most existing work on runtime monitoring mixes the monitoring logic with the business logic. In contrast, this paper proposed a requirement-model-driven monitoring approach for self-adaptive software systems. Based on the goal model and constraint specifications, we illustrated how to define the monitoring model, derive the monitor specification from the goal model, generate and weave the monitoring code automatically, and diagnose and reconfigure the target system at runtime. The approach separates the application code from the monitoring code, which facilitates software maintenance and promotes software reuse.
Privacy-preserving Data Aggregation Algorithm with Integrity Verification
SHI Lu-sheng and QIN Xiao-lin
Computer Science. 2013, 40 (11): 197-202. 
Abstract PDF(509KB) ( 320 )   
To meet the needs of large-scale wireless sensor network applications, this paper proposed an aggregation algorithm that both protects data privacy and verifies integrity. The algorithm first constructs disjoint aggregation trees; each node then divides its own data into several slices according to its degree, and the slices are encrypted and transmitted within the aggregation trees in the corresponding time slots, which provides both privacy preservation and data integrity. Finally, using in-network aggregation over the routing trees, the final result of each tree is sent to the base station, which verifies its integrity. Simulation results show that the algorithm achieves high aggregation accuracy with low communication overhead and offers good privacy preservation and integrity of the aggregation results.
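The slice-and-aggregate idea can be sketched as follows (encryption and the aggregation-tree routing are omitted; `slice_value` is an illustrative helper, not the paper's protocol):

```python
import random

def slice_value(value, n_slices, rng):
    """Split a sensor reading into random non-negative slices that sum to
    the value; each slice would be sent (encrypted) to a different
    neighbour, so no single node ever sees the whole reading, yet the
    aggregate sum recovered at the base station is exact."""
    cuts = [rng.uniform(0, value) for _ in range(n_slices - 1)]
    cuts = [0.0] + sorted(cuts) + [value]
    return [cuts[i + 1] - cuts[i] for i in range(n_slices)]

rng = random.Random(42)
readings = [10.0, 20.0, 30.0]
# every node slices its reading into 3 shares
shares = [slice_value(v, 3, rng) for v in readings]
# neighbours aggregate whatever shares they receive; the base station sums all
aggregate = sum(sum(s) for s in shares)
print(aggregate)
```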
GML Parallel Query Based on MapReduce
XU Bin and GUAN Ji-hong
Computer Science. 2013, 40 (11): 203-207. 
Abstract PDF(374KB) ( 298 )   
A novel distributed GML query method based on the MapReduce model was proposed to handle queries over large amounts of GML data. The original GML files are preprocessed to generate a GML feature set before querying, and queries are converted to run against the feature set instead of the GML files, so that they can execute in parallel and more efficiently.
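In spirit, the converted queries behave like a map/reduce job over the extracted feature set; a toy stand-in in plain Python (the feature tuples and the total-area query are invented for illustration, and a real job would run the map phase on distributed splits):

```python
from collections import defaultdict

# stand-in for pre-extracted GML features: (feature_type, area)
features = [("road", 3.0), ("lake", 7.5), ("road", 1.2), ("lake", 2.5)]

def map_phase(records):
    """Map: emit (feature_type, area) pairs -- each mapper would handle
    one split of the feature set in a real MapReduce job."""
    for ftype, area in records:
        yield ftype, area

def reduce_phase(pairs):
    """Reduce: total area per feature type."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return {k: sum(v) for k, v in groups.items()}

result = reduce_phase(map_phase(features))
print(result)
```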
Mining Algorithm for Database Constraints Based on Feature Fuzzy Closeness
WANG Yong and ZOU Sheng-rong
Computer Science. 2013, 40 (11): 208-210. 
Abstract PDF(306KB) ( 330 )   
Traditional association rule algorithms consider only the closeness of contact between classes, ignore the similarity features within a class, and suffer from a costly classification process and time-consuming association. This paper proposed a mining algorithm for database constraints based on feature fuzzy closeness, which describes the consistency between data through the closeness between fuzzy sets of the data and introduces data fusion techniques into traditional neural-network mining: after classifying and processing the data, it analyzes the dynamic characteristics of the original mining data and derives a new mining model, so that target data can be queried accurately in a large-scale database. Simulation results show that the algorithm mines both sparse and dense data sets more efficiently than traditional association rule algorithms and greatly improves the efficiency of database mining.
Semantic Web Service Discovery Based on Text Clustering and Similarity of Concepts
LIU Yi-song and YANG Yu-cheng
Computer Science. 2013, 40 (11): 211-214. 
Abstract PDF(338KB) ( 350 )   
In service discovery, semantic Web services must be matched one by one against the services in the registry, which wastes much time on irrelevant services and reduces discovery efficiency. Thus a new semantic Web service discovery method based on text clustering and concept similarity was proposed, which works in two phases. In the first phase, services of the same category are clustered according to the descriptive texts in their source files, with the texts represented and processed by the vector space model (VSM), and a multiple hybrid clustering algorithm (MHC) was proposed. In the second phase, the functions and properties of services are matched, and the semantic similarity between concepts is calculated by combining factors such as the depth, strength and inheritance of directed edges in the hierarchical tree of ontology concepts. Experimental results show that the proposed method greatly improves matching efficiency while maintaining accuracy.
Research on Parallel Shared Decision Tree Algorithm Based on Hadoop
CHEN Xiang-tao,ZHANG Chao and HAN Qian
Computer Science. 2013, 40 (11): 215-221. 
Abstract PDF(542KB) ( 395 )   
Shared knowledge mining focuses on discovering knowledge shared by two (or more) applications or datasets; it helps users investigate a poorly understood dataset by analogy with, and knowledge transfer from, a well understood one. With the advent of the big-data era, however, the existing serial algorithms cannot keep up with rapid data growth and are inefficient on large datasets. A Parallel Shared Decision Tree algorithm (PSDT) based on Hadoop was therefore introduced. PSDT uses the traditional attribute-list structure, but this incurs excessive I/O during the parallel phases, which hurts performance. A Hybrid Parallel Shared Decision Tree algorithm (HPSDT) with a hybrid data structure was then proposed: it uses attribute lists to compute the split indicators in parallel and a record-based structure in the splitting phase. Data analysis indicates that HPSDT simplifies the splitting process and performs only 0.34 times the I/O of PSDT. The experimental results show that PSDT and HPSDT both have good parallelism and scalability, and that HPSDT outperforms PSDT, with the advantage growing as the dataset size increases.
Function Semantic-based Web Service Description and Pre-filter Method
ZHAO Wen-dong,TAO Xiao-zhen,PENG Lai-xian and TIAN Chang
Computer Science. 2013, 40 (11): 222-227. 
Abstract PDF(490KB) ( 362 )   
Filtering Web services by functional semantics is a useful way to reduce the computation of service discovery. Since current semantics-based service description languages cannot express service functionality, this paper first defined a function-oriented semantic description model for Web services and created a domain-oriented functional ontology. On this basis, a service description method was proposed and its realization in OWL-S was set out, and finally a pre-filtering method was presented. The pre-filter supports service discovery based on functional semantics: it filters out unrelated services, reduces the computation of exact service matching, and increases matching efficiency.
Research on Deep Web Query Interface Clustering Based on Latent Semantic Analysis
QIANG Bao-hua,LI Wei,ZOU Xian-chun,WANG Tian-tian and WU Chun-ming
Computer Science. 2013, 40 (11): 228-230. 
Abstract PDF(318KB) ( 387 )   
Generating integrated query interfaces is an important issue in Deep Web data integration, and clustering different query interfaces effectively is one of its core problems. Because the traditional vector space model cannot overcome the limitations of keyword matching in Deep Web query interface clustering, the Latent Semantic Analysis (LSA) method was introduced and an LSA-based clustering algorithm for Deep Web query interfaces was proposed. Experimental results on the UIUC Web integration repository show that LSA significantly improves the performance of Deep Web query interface clustering.
Sentiment Analysis of Chinese Microblog Using Topic Self-adaptation
REN Yuan,CHAO Wen-han,ZHOU Qing and LI Zhou-jun
Computer Science. 2013, 40 (11): 231-235. 
Abstract PDF(483KB) ( 658 )   
With the rapid development of social networks, sentiment analysis over social media has become a new research hotspot, especially in data mining. The typical features of Chinese microblogs, such as their shortness and flexibility, pose new challenges for sentiment analysis. This paper therefore carried out a systematic study of sentiment analysis for Chinese microblogs, covering data preprocessing, sentiment lexicon construction and topic adaptation. To improve the precision of sentiment analysis, a novel approach to classifying emotional words was proposed, together with a topic-oriented adaptive method that improves the identification of emotional words. The experimental results demonstrate the feasibility and effectiveness of the approach.
Forest Thinning via Contribution Gain
GUO Hua-ping and FAN Ming
Computer Science. 2013, 40 (11): 236-241. 
Abstract PDF(464KB) ( 489 )   
An ensemble of decision trees can be treated as a forest. We proposed a new strategy, forest thinning, to reduce ensemble size while improving accuracy. Unlike traditional decision tree pruning, forest thinning treats all the trees in a forest as a whole and evaluates how pruning a given branch affects the performance of the ensemble. To decide which branches to prune, we proposed a new metric, contribution gain. The contribution gain of a subtree depends not only on the accuracy of its host tree but also on the diversity of the trees in the ensemble, so it gives a reasonable measure of how much ensemble accuracy can improve when the subtree is pruned. Using this metric we designed a forest thinning algorithm, FTCG (Forest Thinning via Contribution Gain). Our experiments show that forest thinning significantly reduces the structural complexity of forests and improves their accuracy on most datasets, whether the ensembles are built directly by an algorithm such as bagging or obtained by an ensemble selection algorithm such as EPIC [1], and whether or not the individual trees are pruned.
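A tree-level analogue of the idea, greedily dropping the ensemble member whose removal helps (or least hurts) validation accuracy, can be sketched as follows; the paper itself prunes at the finer branch level using contribution gain, and the stump ensemble here is invented:

```python
def ensemble_acc(members, X, y):
    """Majority-vote accuracy of a list of classifiers (callables)."""
    correct = 0
    for xi, yi in zip(X, y):
        votes = [m(xi) for m in members]
        if max(set(votes), key=votes.count) == yi:
            correct += 1
    return correct / len(y)

def thin(members, X, y):
    """Greedy thinning: repeatedly drop the member whose removal raises
    (or at least preserves) validation accuracy -- a coarse, tree-level
    stand-in for branch-level contribution gain."""
    members = list(members)
    while len(members) > 1:
        base = ensemble_acc(members, X, y)
        gains = [(ensemble_acc(members[:i] + members[i + 1:], X, y), i)
                 for i in range(len(members))]
        best_acc, best_i = max(gains)
        if best_acc < base:
            break
        members.pop(best_i)
    return members

# three decision stumps on 1-d data; the third one is pure noise
stumps = [lambda x: int(x > 0.5), lambda x: int(x > 0.4), lambda x: 1]
X = [0.1, 0.2, 0.6, 0.9]
y = [0, 0, 1, 1]
pruned = thin(stumps, X, y)
print(len(pruned), ensemble_acc(pruned, X, y))
```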
Term Importance Identification Method Based on Classification
QIU Yun-fei,BAO Li and SHAO Liang-shan
Computer Science. 2013, 40 (11): 242-247. 
Abstract PDF(507KB) ( 1056 )   
In traditional search engines and information retrieval, term weights for an input query are typically derived in a context-independent fashion. Most retrieval techniques employ bag-of-words approaches, such as Boolean models, vector space models and probabilistic ranking, to obtain the weight of a term in a query; all of these treat terms independently and ignore the relationships among them. This paper employed supervised machine learning based on classification, together with syntactic parsing, to derive a context-sensitive, query-dependent weight for each word in a search query. Taking the result of syntactic parsing as a major feature of the classifier avoids information loss and enriches the features of the short text, while the classifier's soft output gives a more accurate quantitative value for term importance.
Hybrid Algorithm Based on Monkey Algorithm and Simplex Method
CHEN Xin and ZHOU Yong-quan
Computer Science. 2013, 40 (11): 248-254. 
Abstract PDF(428KB) ( 559 )   
To address the facts that the monkey algorithm cannot obtain precise solutions to global optimization problems and that its computation is time-consuming, this paper designed a hybrid algorithm that combines the monkey algorithm with the search idea of the classical simplex method. The hybrid improves calculation accuracy and speeds up the convergence of the monkey algorithm to a certain degree. Simulation results show that the monkey-simplex hybrid algorithm has a clear advantage on the test functions, with results closer to the theoretical optimum.
GA-based Subspace Classification Algorithm for Support Vector Machines
JIANG Hua-rong and YU Xue
Computer Science. 2013, 40 (11): 255-260. 
Abstract PDF(587KB) ( 449 )   
This paper presented a new GA-based subspace classification algorithm for SVM (GS-SVM). A modified sample selection method first selects a subset of the training data based on confidence and the convex hull; representative samples are then chosen by considering the distances between classes and the sample distribution. The algorithm adopts a matrix-form mixed encoding, and a genetic algorithm simultaneously optimizes the feature subspace of the representative samples and the classification parameters of the SVM. The SVM classification model is then trained on the representative samples in the optimized feature subspace. Experimental results on eleven UCI datasets show that the proposed algorithm selects both a smaller sample subset and a smaller feature set, and achieves higher classification accuracy than traditional classification algorithms.
Reduction Algorithm of Positive Region for Decision Tables Based on Relation Matrix
JING Yun-ge and LI Tian-rui
Computer Science. 2013, 40 (11): 261-264. 
Abstract PDF(326KB) ( 629 )   
This paper discussed attribute reduction in rough set theory. We first introduced the induced matrix and the λ-cut matrix of the equivalence relation matrix to calculate the upper and lower approximations of a decision table, and then proposed a positive-region reduction algorithm for decision tables based on the relation matrix, with a proof of its correctness. Furthermore, a heuristic reduction algorithm based on the attribute core was proposed to compute a minimal reduct. When attributes are updated dynamically, the attribute equivalence relation matrix is updated incrementally, so the positive region after the update can be calculated rapidly. An example confirms the feasibility and effectiveness of the proposed operations and reduction method.
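The positive region that drives such reductions can be computed directly from equivalence classes; a minimal sketch using the standard rough-set definitions on an invented list-of-rows decision table (the paper's matrix machinery is a different route to the same set):

```python
def partition(rows, cols):
    """Equivalence classes of the row indices under the given attribute columns."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[c] for c in cols), set()).add(i)
    return list(classes.values())

def positive_region(rows, cond_cols, dec_col):
    """Union of the condition classes that are consistent on the decision:
    the positive region POS_C(D) used to judge attribute reducts."""
    pos = set()
    for cls in partition(rows, cond_cols):
        decisions = {rows[i][dec_col] for i in cls}
        if len(decisions) == 1:          # class decides its label unambiguously
            pos |= cls
    return pos

# decision table: condition attributes in cols 0-1, decision in col 2
table = [
    [0, 0, "no"],
    [0, 1, "yes"],
    [1, 1, "yes"],
    [1, 1, "no"],    # inconsistent with the row above
]
print(positive_region(table, [0, 1], 2))
```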
Improved Differential Evolution Algorithm Based on Dynamic Adaptive Strategies
WANG Cong-jiao,WANG Xi-huai and XIAO Jian-mei
Computer Science. 2013, 40 (11): 265-270. 
Abstract PDF(466KB) ( 660 )   
To improve the performance of differential evolution (DE) on complex optimization functions, an improved DE algorithm based on dynamic adaptive strategies (dn-DADE) was proposed. First, a new mutation strategy, DE/current-to-dnbest/1, uses the elite solutions of the current population to guide the search direction. Second, adaptive update strategies for the scaling factor and the crossover factor let the parameter values adapt to different search stages, improving the stability and robustness of the algorithm. A set of 14 benchmark functions was adopted to test the proposed algorithm. The results show that dn-DADE has remarkable optimizing ability, higher search precision and faster convergence, and outperforms several state-of-the-art improved differential evolution algorithms on the main performance indexes.
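DE/current-to-dnbest/1 builds on the classic current-to-best/1 mutation; one generation of standard DE with that mutation (fixed F and CR rather than the paper's adaptive updates, on the sphere function as a stand-in benchmark) can be sketched as:

```python
import random

def de_step(pop, fitness, F=0.5, CR=0.9, rng=random):
    """One generation of differential evolution with current-to-best/1
    mutation, binomial crossover and greedy one-to-one selection:
    v = x_i + F*(x_best - x_i) + F*(x_r1 - x_r2)."""
    n, d = len(pop), len(pop[0])
    best = min(pop, key=fitness)
    new_pop = []
    for i, x in enumerate(pop):
        r1, r2 = rng.sample([j for j in range(n) if j != i], 2)
        v = [x[k] + F * (best[k] - x[k]) + F * (pop[r1][k] - pop[r2][k])
             for k in range(d)]
        jrand = rng.randrange(d)                 # force at least one mutated gene
        u = [v[k] if (rng.random() < CR or k == jrand) else x[k]
             for k in range(d)]
        new_pop.append(u if fitness(u) <= fitness(x) else x)
    return new_pop

sphere = lambda x: sum(t * t for t in x)         # toy objective
rng = random.Random(1)
pop = [[rng.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for _ in range(100):
    pop = de_step(pop, sphere, rng=rng)
print(min(sphere(x) for x in pop))
```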
Two-tier Clustering for Mining Imbalanced Datasets
HU Xiao-sheng,ZHANG Run-jing and ZHONG Yong
Computer Science. 2013, 40 (11): 271-275. 
Abstract PDF(423KB) ( 687 )   
Classification of class-imbalanced data has become a hot research topic in machine learning and data mining. Most classification algorithms tend to predict the majority class for most incoming data, resulting in poor performance on minority-class instances, which are usually of much greater interest. This paper proposed a two-tier clustering cascade mining algorithm. First, a balanced training set is constructed by cluster-based under-sampling: K-means clusters the majority class, and the extracted cluster centroids are merged with all minority-class instances. When the minority class is too small to provide enough training instances, SMOTE over-sampling is combined with the cluster-based under-sampling. Next, a cascade of K-means and the C4.5 decision tree algorithm ("K-means+C4.5") classifies the balanced training set: K-means first partitions the training instances into k clusters, and C4.5 then builds a decision tree on each cluster, refining the decision boundaries by learning the subgroups within that cluster. Experimental results show that the proposed method classifies both minority and majority classes better than other approaches and is an effective and feasible way to handle imbalanced datasets.
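The first tier, cluster-based under-sampling, can be sketched as follows; the SMOTE fallback and the K-means+C4.5 second tier are omitted, and the tiny k-means and synthetic data are only illustrative:

```python
import random

def kmeans(points, k, iters=20, rng=random):
    """Minimal k-means returning k centroids."""
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:
                centroids[j] = [sum(a) / len(cl) for a in zip(*cl)]
    return centroids

def balance(majority, minority, rng=random):
    """Cluster-based under-sampling: replace the majority class by the
    centroids of len(minority) k-means clusters, then merge with all
    minority instances to obtain a balanced training set."""
    centroids = kmeans(majority, k=len(minority), rng=rng)
    return centroids + minority

rng = random.Random(0)
majority = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(100)]
minority = [[rng.gauss(3, 1), rng.gauss(3, 1)] for _ in range(5)]
train = balance(majority, minority, rng=rng)
print(len(train))
```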
Research on Simplification Methods of Seismic Database for CTBT Based on Granular Membership Reduction
SUN Yu-wei,ZHENG Xue-feng,PAN Chang-zhou and JIN Ping
Computer Science. 2013, 40 (11): 276-279. 
Abstract PDF(292KB) ( 355 )   
The sharing of seismic data greatly promotes the progress of seismology, and knowledge discovery over massive seismic data is advancing seismic monitoring technology. Concept lattices are a powerful tool in knowledge discovery, and knowledge reduction is one of its important aspects. This paper first used the defined membership relationships to classify the formal context and simplified the membership classes by granular membership reduction. The reduced concept lattice was then expanded by selecting objects and attributes dynamically, eventually recovering the concept lattice of the original formal context. Finally, typical examples from the UCI database were analyzed. The results show that the method is effective and simplifies the objects and attributes to the greatest extent. The method was also applied to the seismic database for the CTBT.
Population Dynamics-based Optimization
HUANG Guang-qiu,LI Tao and LU Qiu-qin
Computer Science. 2013, 40 (11): 280-286. 
Abstract PDF(517KB) ( 575 )   
To solve large-scale optimization problems (OPs), a globally convergent population dynamics-based optimization algorithm was constructed on the basis of population dynamics theory. In the algorithm, each population is an alternative solution of the OP, and each feature attribute of a population corresponds to one variable of that solution. Orthogonal Latin squares are used to produce the initial populations so that the search space is covered with uniform dispersion and neat comparability. Competition, mutualism, predator-prey, mergence, mutation and selection behaviors between pairs of populations form the evolution policies, which ensure that the population suitability index (PSI) of each population either stays unchanged or moves toward better states, thereby guaranteeing global convergence. During evolution, each population's transitions from state to state realize the search for the global optimum. The stability condition of a reducible stochastic matrix was applied to prove the global convergence of the algorithm, and a case study shows that the algorithm is efficient.
Research on Evolution of Online Public Opinion Based on Opinion Leaders' Guiding Role
ZHOU Er-zhong,ZHONG Ning and HUANG Jia-jin
Computer Science. 2013, 40 (11): 287-290. 
Abstract PDF(360KB) ( 550 )   
Constructing an effective opinion interaction model is the key to predicting the development trend of online public opinion, but traditional models are built over closed social networks. To model opinion interaction in a dynamic virtual network, an approach to analyzing the evolution of online public opinion based on the guiding role of opinion leaders was proposed, which carefully takes the network context and the characteristics of opinion disseminators into account. Experimental results show that the proposed approach can effectively simulate the evolutionary process of online public opinion and predict its future trend.
Hierarchical Algorithm for Solving Strong Cyclic Planning
WANG Quan,WEN Zhong-hua,WU Xuan and TANG Jie
Computer Science. 2013, 40 (11): 291-294. 
Abstract PDF(287KB) ( 320 )   
A hierarchical algorithm was designed for solving strong cyclic planning. Starting from the target states, the algorithm first builds layers using strong planning, then layers the remaining states using weak planning while recording the appropriate information, and finally uses that information as a heuristic to search the weak-planning layers for a strong cyclic plan. Once the states are layered, the recorded information can be used to obtain a strong cyclic planning solution directly. The algorithm is efficient when the set of state-action pairs is large; when a strong planning solution exists, it is more efficient still and guarantees the better strong cyclic solution, namely the strong planning solution itself. Experiments show that the algorithm obtains strong cyclic planning solutions with fewer repeated searches and is more efficient than backward search.
Study on Financial Crisis Prediction Model with Web Financial Information for Listed Companies
BIAN Hai-rong,WAN Chang-xuan,LIU De-xi and JIANG Teng-jiao
Computer Science. 2013, 40 (11): 295-298. 
Abstract PDF(428KB) ( 361 )   
Previous studies on corporate financial crisis prediction are mainly based on financial measures. As research has progressed, the limitations of financial indicators have become increasingly prominent: characteristics such as their hysteresis and their susceptibility to manipulation affect the performance of financial crisis prediction models. In view of this, this paper transforms the text of Web financial information into numerical form by computing a sentiment tendency value, and then uses the sentiment tendency value as an indicator variable in the financial crisis prediction model. Two prediction models were constructed: one with pure financial indicators and one with mixed indicators incorporating the sentiment tendency value of Web financial information. The prediction results of the two models were examined. The model with pure financial indicators outperforms the model with mixed indicators in the validity, stability, and lead time of prediction.
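As a toy illustration of turning Web financial text into a numeric indicator, a lexicon-based sentiment tendency value blended with financial ratios might look as follows. The lexicon, weights, and scoring formula are hypothetical, not the paper's model:

```python
POSITIVE = {"profit", "growth", "improve"}            # hypothetical lexicon
NEGATIVE = {"loss", "debt", "lawsuit", "decline"}

def sentiment_tendency(text):
    """(pos - neg) / (pos + neg) over the lexicon; 0.0 when neither occurs."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

def crisis_score(financial_ratios, news_text, w_fin=0.7, w_sent=0.3):
    """Mixed-indicator score: blend of an averaged financial-risk ratio
    and the negated news sentiment. Higher means riskier."""
    fin_risk = sum(financial_ratios) / len(financial_ratios)
    return w_fin * fin_risk + w_sent * (-sentiment_tendency(news_text))
```

The sentiment tendency value enters the model as just another indicator variable alongside the financial ratios, which is the structural point the abstract makes.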
Dynamic Behavior Evolution for First-order Hybrid Petri Nets Based on Conflict Checking
LIAO Wei-zhi,LI Wen-jing and LU Jian-bo
Computer Science. 2013, 40 (11): 299-303. 
Abstract PDF(457KB) ( 348 )   
References | RelatedCitation | Metrics
The problem of dynamic behavior evolution for First-Order Hybrid Petri Nets (FOHPN) was discussed. First, a theorem for determining conflicts in an FOHPN was proposed. Second, a dynamic behavior evolution method for FOHPN based on conflict checking and resolution was presented. Finally, the effectiveness of the developed approach was illustrated through a case study.
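For the discrete part of a Petri net, a conflict check of the kind such a theorem formalizes can be sketched as follows. This is a generic enabled-transition test over ordinary place/transition nets, not the paper's FOHPN formulation, which also covers continuous transitions:

```python
def conflicting_transitions(marking, pre):
    """Two enabled transitions conflict when some shared input place
    lacks tokens for their combined demand (they cannot both fire)."""
    enabled = [t for t, need in pre.items()
               if all(marking.get(p, 0) >= n for p, n in need.items())]
    conflicts = []
    for i, t1 in enumerate(enabled):
        for t2 in enabled[i + 1:]:
            shared = set(pre[t1]) & set(pre[t2])
            if any(marking[p] < pre[t1][p] + pre[t2][p] for p in shared):
                conflicts.append((t1, t2))
    return conflicts

# One token in p1, and both t1 and t2 want it; t3 is not enabled at all.
marking = {"p1": 1}
pre = {"t1": {"p1": 1}, "t2": {"p1": 1}, "t3": {"p2": 1}}
found = conflicting_transitions(marking, pre)
```

Once such a conflict is detected, an evolution procedure must apply some resolution policy (e.g. a priority or a firing-rate split) before advancing the marking.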
Multidimensional Data Recommender Algorithm Based on Random Walk
LI Fang and LI Yong-jin
Computer Science. 2013, 40 (11): 304-307. 
Abstract PDF(298KB) ( 348 )   
References | RelatedCitation | Metrics
In a recommender system, both accuracy and flexibility are important for the recommender algorithm. In order to provide high flexibility while keeping high accuracy, this paper proposed a random-walk-based multidimensional recommender algorithm. First, a multidimensional recommender system model is built using users' context; second, the user query is divided into several sub-queries and a bipartite graph is built; finally, candidate items are ranked according to the random walk model and the top-k results are returned. Experiments show that the proposed algorithm satisfies flexible recommendation requests while keeping high prediction accuracy, and is more effective than related algorithms.
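The ranking step can be sketched as a random walk with restart over a user-item bipartite graph. The graph, node names, and parameter values below are hypothetical; the paper's multidimensional model and sub-query handling are not reproduced:

```python
def random_walk_rank(edges, seed, restart=0.15, steps=50):
    """Random walk with restart on a user-item bipartite graph; the
    resulting scores rank candidate items for the seed user."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    score = {n: 0.0 for n in adj}
    score[seed] = 1.0
    for _ in range(steps):
        # Restart mass returns to the seed; the rest diffuses along edges.
        nxt = {n: (restart if n == seed else 0.0) for n in adj}
        for n, s in score.items():
            share = (1.0 - restart) * s / len(adj[n])
            for nb in adj[n]:
                nxt[nb] += share
        score = nxt
    return score

edges = [("alice", "item1"), ("alice", "item2"), ("bob", "item2")]
ranks = random_walk_rank(edges, seed="alice")
```

Sorting the item nodes by score and cutting at k yields the top-k recommendation list.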
Variable Endmember Unmixing Algorithm Based on Correction MCMC Method
HU Xia,SONG Xian-feng and NIU Hai-shan
Computer Science. 2013, 40 (11): 308-311. 
Abstract PDF(812KB) ( 366 )   
References | RelatedCitation | Metrics
Traditional unmixing methods are based on fixed endmembers and need to assume that the remote sensing image contains pure pixels. In fact, this assumption does not necessarily hold, and not all pixels are composed of the same endmembers. This paper merges endmember extraction and unmixing into one step and abstracts it as a random process based on a standard spectral library. Under the premise of a variable number of endmembers, a reversible-jump MCMC method is used to estimate the parameters. The accumulated knowledge of endmembers is used during the state-transition process to improve algorithm efficiency. The algorithm requires no human intervention, achieves automated unmixing, and has high accuracy. Experiments show that the MCMC-based algorithm is superior to traditional unmixing methods in both accuracy and stability.
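A heavily simplified toy version of the dimension-jumping idea might look as follows. The three-entry spectral library, equal-abundance mixing, and acceptance rule are all deliberate simplifications; a real reversible-jump sampler would also propose abundance moves and use proper dimension-matching acceptance ratios:

```python
import math
import random

LIBRARY = {                      # hypothetical 3-band spectral library
    "soil":  (0.40, 0.50, 0.60),
    "grass": (0.10, 0.80, 0.20),
    "water": (0.05, 0.10, 0.02),
}

def recon_error(pixel, members):
    """Squared error against the mean spectrum of the selected
    endmembers (equal abundances -- a deliberate toy simplification)."""
    if not members:
        return sum(b * b for b in pixel)
    mix = [sum(LIBRARY[m][i] for m in members) / len(members)
           for i in range(len(pixel))]
    return sum((p - q) ** 2 for p, q in zip(pixel, mix))

def jump_mcmc(pixel, iters=2000, temp=0.01, seed=7):
    """Dimension-jumping Metropolis: propose adding or removing one
    endmember (a birth/death move) and accept with probability
    exp(-(err_new - err_old) / temp); the best set seen is returned."""
    rng = random.Random(seed)
    current, err = set(), recon_error(pixel, set())
    best, best_err = set(current), err
    for _ in range(iters):
        prop = current ^ {rng.choice(list(LIBRARY))}   # toggle membership
        perr = recon_error(pixel, prop)
        if perr < err or rng.random() < math.exp((err - perr) / temp):
            current, err = prop, perr
            if perr < best_err:
                best, best_err = set(prop), perr
    return best

pixel = (0.25, 0.65, 0.40)       # an exact 50/50 soil+grass mixture
found = jump_mcmc(pixel)
```

Because the number of endmembers changes with each birth/death move, the sampler explores models of different dimension, which is the essence of the variable-endmember premise.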
Acquisition of Fingerprints’ Minutiae Pairs Based on Multi-layers Validation
MEI Yuan
Computer Science. 2013, 40 (11): 312-315. 
Abstract PDF(577KB) ( 471 )   
References | RelatedCitation | Metrics
Acquiring correct minutiae pairs is very important for fingerprint matching. In order to obtain more correct minutiae pairs, previous work constructed a local topological structure based on the obtained initial set of minutiae pairs to perform single-layer validation; however, this is not robust to the negative effect caused by false minutiae pairs. In this paper, a fingerprint minutiae-pair acquisition method based on multi-layer validation was proposed. The main improvements are dividing the single-layer validation into two layers of validation and adding a supplementary validation. Experiments show that the improved method strengthens robustness and improves correctness, although its time complexity is about twice that of the previous work.
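A generic two-layer validation of candidate minutiae pairs, first by descriptor distance and then by pairwise geometric consistency, can be sketched as follows. The thresholds, pair format, and support counting are hypothetical, not the paper's construction:

```python
import math

def validate_pairs(pairs, tol=5.0, min_support=2):
    """Layer 1 keeps pairs with a small descriptor distance; layer 2 keeps
    pairs whose inter-minutiae distances agree with enough other survivors.
    Each pair is ((x1, y1), (x2, y2), descriptor_distance)."""
    layer1 = [p for p in pairs if p[2] <= tol]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    result = []
    for i, (a1, b1, _) in enumerate(layer1):
        support = sum(
            1 for j, (a2, b2, _) in enumerate(layer1)
            if i != j and abs(dist(a1, a2) - dist(b1, b2)) <= tol)
        if support >= min_support:
            result.append(layer1[i])
    return result

pairs = [
    ((0, 0), (10, 10), 1.0),
    ((5, 0), (15, 10), 1.2),
    ((0, 5), (10, 15), 0.8),
    ((2, 2), (50, 50), 1.0),   # geometrically inconsistent (false pair)
    ((1, 1), (11, 11), 9.0),   # weak descriptor match, dropped in layer 1
]
kept = validate_pairs(pairs)
```

The second layer rejects a pair that survives the descriptor check but disagrees geometrically with the others, which is exactly the failure mode a single-layer check misses.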
Routing Algorithm for Android Game Agents Based on Non-uniform Partition of Map
LI Hong-bo,ZHAO Kuan and WU Yu
Computer Science. 2013, 40 (11): 316-319. 
Abstract PDF(325KB) ( 324 )   
References | RelatedCitation | Metrics
Android phones suffer from limited resources, such as lower CPU frequency and smaller memory. To address these constraints, a routing algorithm for Android game agents based on non-uniform partition of the map was presented, building on the hierarchical routing algorithm. In this algorithm, the game map is non-uniformly partitioned to generate an abstract map in the preprocessing phase. In the online searching phase, it first finds paths among the key points of the abstract map, then searches for paths in each sub-map until the destination node is found. The experimental results indicate that, compared with HPA* Enhancements and KM-A*, the algorithm has the advantages of shorter search time, fewer traversed points, and better results. Finally, a topographical factor is taken into consideration in order to adapt the algorithm to the needs of Android phone games.
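The first step of the online phase, searching among the key points of the abstract map, can be sketched with a plain Dijkstra over a hypothetical key-point graph (the sub-map refinement step is omitted):

```python
import heapq

def abstract_route(key_graph, start, goal):
    """Dijkstra over the abstract key-point graph built in preprocessing;
    each hop of the returned route would then be refined by a local
    search inside the corresponding sub-map (refinement omitted here)."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nb, w in key_graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nb, float("inf")):
                dist[nb], prev[nb] = nd, node
                heapq.heappush(heap, (nd, nb))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Because the abstract graph has far fewer nodes than the raw grid, this coarse search is where the memory and CPU savings on a phone come from; refinement only touches the sub-maps along the chosen route.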