Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 36 Issue 8, 2009
Research and Prospect of Dynamic Routing in Wireless Sensor Networks
QI Xiao-gang , LIU San-yang
Computer Science. 2009, 36 (8): 1-2. 
Wireless sensor networks are a kind of integrated intelligent information system consisting of small nodes capable of information collection, information transformation, and information processing. They can be used in many fields and form a new domain of information networks. The limited power supply, the extensive dissemination of sensor nodes, and the dynamics of the network topology pose enormous challenges to routing in wireless sensor networks. In this paper, the restricting factors were analyzed first and some challenges of routing in wireless sensor networks were presented; then various routing approaches were reviewed. Finally, some development trends were pointed out, and the dynamic topology model may be a proper selection to solve the routing problem in wireless sensor networks.
Survey on Modelling and Verification of Time Sensitive Security Protocol
ZHOU Ti,LI Zhon-jun,WANG Zhi-yong,WANG Jin-ying
Computer Science. 2009, 36 (8): 3-7. 
Security protocols are used to provide secure communication over open networks. A timestamp can ensure the freshness of a message in a protocol, but timestamps alone are not enough for studying time sensitive protocols, and there are no effective methods for verifying such protocols, so it is very difficult to verify all aspects of these large and complex protocols formally. This paper described why timestamps are used and studied, and discussed several popular methods for the verification of time sensitive security protocols: the MSR method, the induction method, the CSP method and the BAN logic method. The paper gave an assessment of each. Finally, we also stated possible new directions for the verification of time sensitive security protocols.
Survey on Security Issues in Delay/Disruption Tolerant Networks
HU Qiao-lin, SU Jin-shu, ZHAO Bao-kang, HUANG Qing-yuan, SUN Yi-pin
Computer Science. 2009, 36 (8): 8-11. 
As an emerging network architecture, Delay/Disruption Tolerant Networks (DTN) have been widely investigated in the last few years. Since the application scenarios of DTN are extremely different from those of existing networks, DTN should address several unique security issues, including resource exhaustion, data security, fragment authentication, privacy preservation, etc. We focused on these unique security issues in DTN and provided a brief introduction to each. Further, we reviewed and performed a detailed classification and analysis of existing research efforts and approaches on these issues. Finally, we identified open research issues, intending to motivate new research efforts.
Survey on Router Buffer Sizing Strategy
WANG Jian-xin, LI Chun-quan, HUANG Jia-wei
Computer Science. 2009, 36 (8): 12-16. 
This paper reviewed the research on buffer sizing in routers, focusing on five typical buffer sizing methods based on models of the TCP protocol, and concluded from comparison and analysis that buffer sizing methods based on different assumptions should be adapted to different network environments. Meanwhile, we studied several primary factors that influence the buffer requirement, and analyzed how TCP protocols and queue management mechanisms interact with buffer sizing methods, pointing out that in complicated and dynamic network environments an adaptive buffer sizing method should be adopted.
State of the Art and Future Challenge on Database Algorithm Optimization Based on Modern Processor
DENG Ya-dan,DING Ning,XIONG Wei
Computer Science. 2009, 36 (8): 17-20. 
With the continuous development of hardware technology, computer performance has been strengthened continuously, and so have databases. However, some new problems have emerged, such as cache access delay and cache conflict. Following a classification of the various optimization technologies, we analyzed the research results in database algorithm optimization based on modern processors over the past 10 years, and looked ahead to the future trend of database optimization based on new hardware.
Overview of Software Defect Data Processing Research
LI Ning, LI Zhan-huai
Computer Science. 2009, 36 (8): 21-25. 
Software defects are important basic data for software quality analysis and improvement. The problems faced by software defect research are how to preprocess noisy defect data effectively before analysis, how to classify defect data according to their characteristics, and how to mine and analyze them. This paper described the contents, methods and technologies of the three problems mentioned above, then compared and analyzed them, and finally proposed several problems of defect data that are worth further study.
Survey of High-performance Web Crawler
ZHOU De-mao, LI Zhou-jun
Computer Science. 2009, 36 (8): 26-29. 
Web crawlers, one of the basic components of a search engine, are programs that download resources from the Internet. We described the working principles of Web crawlers and their development, and discussed how to design a high-performance, scalable, distributed Web crawler, including the key problems faced.
Research of Image Mining
FANG Ling-ling,WANG Xiang-hai
Computer Science. 2009, 36 (8): 30-34. 
In the recent decade, image mining has been an emerging and challenging research field, and one of the foremost research directions in the database and information decision-making fields internationally. This paper first studied the characteristics of image mining and identified those that differentiate it from traditional data mining. It then analyzed the general process and the main models, and discussed the main technologies of image mining. On this basis, some applications in the above areas of study were analyzed and discussed. Finally, the disadvantages and development trends of image mining were presented.
Brief Report of Research on Cognizing the Subarea of Evolutionary Computation(II)
LIU Kun-qi,KANG Li-shan,ZHAO Zhi-zhuo
Computer Science. 2009, 36 (8): 35-39. 
It is very important for studying evolutionary computation and predicting its future developing directions that the subarea of evolutionary computation is scientifically cognized. The development mainlines, characteristics and internal inherent laws were reviewed and summarized, and the cognitive process for evolutionary computation was explained from the perspective of philosophy of science. A series of new concepts and ideas, for example, the essential problem, typical methods and typical instances, were proposed and discussed. The research progress of the methodology for evolutionary computation was presented in outline, and its influence on the research and education of computer science and technology was discussed.
Cross-layer Analysis and Modeling for Multihop CDMA Ad hoc Networks with Adaptive Modulation and Coding
LIAO Xiao-ding,WU You-jun,NIE Jing-nan
Computer Science. 2009, 36 (8): 40-44. 
A novel cross-layer analytical model for multihop CDMA Ad hoc networks was proposed, which considers the effect of the physical layer, the MAC layer and the routing layer on the performance of the networks. Firstly, the probability distribution of the link distance between two randomly positioned nodes was derived, and the probability distribution of the number of hops between the source node and the destination node based on the MFR routing strategy was also considered. Then, combined with adaptive modulation and coding, the packet-level QoS performance of the network was analyzed. Moreover, the end-to-end delay and throughput with a retransmission mechanism were analyzed, and the effects of transmission range, queue length and spreading gain on the network performance were discussed. The proposed model considers a specific MAC protocol and routing strategy; however, the approach can be extended to other MAC protocols and routing strategies as well.
Scheme for Primary User Emulation in Cognitive Radio Networks
XUE Nan, ZHOU Xian-wei, XIN Xiao-yu, LI Dan, YANG Chen
Computer Science. 2009, 36 (8): 45-48. 
To address the threat of primary user emulation (PUE) in cognitive radio networks, this paper presented a security scheme based on a hash matching technique. In this scheme, while the primary user network is working, hash original data are added to the data sent by the primary user base station; secondary users first use a hash function to compute over the received original data, and then compare the results with pre-distributed hash values. A successful match proves that the primary user emergency message is true. This scheme makes the best of the fast computation of hash functions, which guarantees timely spectrum handoff of secondary users and avoids interference with the primary user. The security of the scheme is guaranteed by the irreversibility of the one-way hash function. The details of the scheme were presented in this paper. Compared with related schemes, it provides safety, efficiency and feasibility, and can effectively solve the primary user emulation problem in cognitive radio networks.
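A minimal sketch of the matching step, with SHA-256 and the shape of the pre-distribution chosen here for illustration only (the abstract does not specify them): secondary users hold pre-distributed hash values and accept a primary-user message only when the hash of the appended original data matches one of them.

```python
# Illustrative sketch of hash matching for PUE defense (not the paper's
# exact protocol): SHA-256 and the pre-distribution shape are assumptions.
import hashlib
import os

def make_tag(original_data: bytes) -> str:
    # One-way hash of the original data appended by the primary base station.
    return hashlib.sha256(original_data).hexdigest()

# Pre-distribution phase (hypothetical): secondary users receive the hash
# values of the secret original data, not the data itself.
secrets = [os.urandom(16) for _ in range(4)]
predistributed_hashes = {make_tag(s) for s in secrets}

def verify_primary(received_original: bytes) -> bool:
    # A secondary user hashes the received original data and matches it
    # against the pre-distributed values; success => genuine primary signal.
    return make_tag(received_original) in predistributed_hashes

print(verify_primary(secrets[0]))      # True: genuine primary user
print(verify_primary(os.urandom(16)))  # False: possible PUE attacker
```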
Fine-grained Non-blocking Join Algorithm Based on XJoin
CHEN Gang,LI Guo-hui,GU Jin-guang,YANG Bing,CHEN Hui,TANG Xiang-hong
Computer Science. 2009, 36 (8): 49-53. 
Wide-area distribution raises significant performance problems for traditional query processing techniques, as data access becomes less predictable due to link congestion, load imbalances, and temporary outages. Non-blocking join query execution is a promising approach to coping with unpredictability in unreliable networks and hiding intermittent delays in data arrival by reactively scheduling background processing. Classical non-blocking two-way join techniques such as XJoin fail to deliver acceptable performance in scenarios where a gradually growing partition cannot be processed within one relatively short intermittent delay. We developed a novel reactively scheduled non-blocking join, called XJoin-FG, which splits one coarse-grained transaction into several parts according to the length of the idle interval. XJoin-FG employs a fine-grained timestamp mechanism to avoid duplicate results. Using an optimized implementation along with simulated data obtained by monitoring Internet data delivery, we show that XJoin-FG is an effective solution for providing fast query responses to users even in the presence of longer-term unavailability of data sources.
Module Based Self-adaptive Contention Resolution Scheme for WiMAX Network with Heavy Traffic
DU Wen-feng, WANG Zhi-qiang,TAO Lan, FU Xiang-hua
Computer Science. 2009, 36 (8): 54-58. 
The latest wireless network access technology, IEEE 802.16, provides broadband Internet connections to users. However, it was found that the mandatory contention resolution scheme in IEEE 802.16, which is based on the truncated binary exponential backoff algorithm, cannot perform well in most cases, especially when the traffic is heavy. A novel self-adaptive contention resolution scheme was proposed, which divides all Subscriber Stations into several groups according to the available transmission opportunities in each frame and requires each Subscriber Station to send bandwidth requests during its group time. Meanwhile, the number of available transmission opportunities and the number of Subscriber Stations sending bandwidth requests are self-adaptively updated by the Base Station and Subscriber Stations respectively, according to the average collision probability and the transmission opportunity utilization observed during the last frame. The results of analysis and simulation show that our scheme can improve the performance of the whole network remarkably.
SCSRM: A Splitting-connection-based Satellite Reliable Multicast Scheme
LIU Gong-liang, KANG Wen-jing,GU Xue-mai,GUO Qing
Computer Science. 2009, 36 (8): 59-63. 
In order to overcome the shortcomings of end-to-end transport protocols in satellite networks, a novel splitting-connection-based satellite reliable multicast (SCSRM) scheme was proposed in this paper, and the transport protocol operating in the space segment, SatMTCP, was designed in detail from the aspects of error control, congestion control and feedback suppression, respectively. Simulation results show that this scheme achieves satisfying goodput performance even when the channel condition worsens; owing to the efficient feedback suppression and error recovery policies, the goodput performance remains smooth as the number of receivers increases; and fairness can be achieved by suitable bandwidth allocation.
Research on Component-based Protocol for Wireless Networks
WANG Lu, LIU Li-xiang, WANG Da-peng
Computer Science. 2009, 36 (8): 64-66. 
There are some drawbacks in the Layered Network Architecture, such as intra-layer redundancy and the lack of inter-layer interaction, and these drawbacks have resulted in problems that are difficult to solve within the contemporary architecture, such as QoS and network security. When the Layered Network Architecture is applied to wireless networks, additional problems such as resource scarcity and intermittent connections arise on top of those mentioned above. The paper proposed a Component-based Network Architecture in order to resolve these problems. The Component-based Network Architecture, a non-layered architecture composed of loosely coupled components, partitions the network by function into different components and provides high quality of service by combining these components in an efficient way. This architecture is a solution to eliminate the problems encountered in the Layered Network Architecture. The paper demonstrates the architecture thoroughly.
Vehicle Synthetic Mobility Model for VANET Simulation
ZHANG Guo-qing, CHEN Wu, XU Zhong, HONG Liang, MU De-jun
Computer Science. 2009, 36 (8): 67-70. 
As an application of Mobile Ad hoc Networks (MANET) in the transportation realm, the Vehicular Ad hoc Network (VANET) is a promising technology in Intelligent Transportation Systems (ITS). Since an accurate node mobility model is important for simulation, we proposed a Synthetic Mobility Model (SMM) for modeling vehicle mobility patterns based on a real city satellite map and real road conditions. The SMM, the Random Waypoint model and the Manhattan model were compared by analysis and by simulation experiments based on a real satellite map of Xi'an under QualNet. The results show that mobility models have a great impact on routing protocol performance and that our SMM is much better suited to VANET simulation.
Self-adaptive and Prediction-based Smooth Handoff Algorithm in 802. 11 Networks
ZENG Zhi,LI Ren-fa,WEN Ji-gang,XIE Kun
Computer Science. 2009, 36 (8): 71-74. 
To overcome the smooth scan algorithm's weakness of not adapting to varying network environments, a self-adaptive and prediction-based smooth handoff (APBSH) algorithm was proposed. The key idea of the algorithm is to use the recent variations of the network environment to predict the next period's state (two parameters, the urgency of scanning and the urgency of data transmission, represent different network environments quantitatively), and then self-adaptively choose the suitable handoff strategy according to the prediction results, so as to accomplish the handoff before losing the connection to the current AP and improve the quality of the data service. The experiments show that the APBSH algorithm can adapt to the network environment, ensures optimized handoff performance, and outperforms the standard 802.11 handoff algorithm and the smooth handoff algorithm.
Communication Energy Model for Large Scale Fields Sensor Networks
XIAO De-qin,WANG Jing-li,LUO Xi-wen
Computer Science. 2009, 36 (8): 75-78. 
Using wireless sensor networks as monitoring networks in agriculture is an important technique for achieving agricultural informationization, automation and intelligence. Farmland wireless sensor networks have the characteristics of ultra large scale, ultra low cost and complex topology changes, which impose rigorous requirements on the network protocol. In this paper, a communication energy model for large-scale field sensor networks was proposed based on Heinzelman's energy model, which takes into account the interactions of deployment space distance, transmission radius and packet size, as well as the impact of components. Furthermore, a mathematical method to calculate the best transmission radius for a given deployment space distance was presented. Then, the effects of the important factors of deployment space distance and network scale on query energy consumption were obtained by fitting the simulation results.
Real-valued Negative Selection Algorithm with Boundary Detectors
WANG Da-wei , ZHANG Feng-bin
Computer Science. 2009, 36 (8): 79-81. 
A large quantity of holes is generated on the boundary between the self and nonself regions because of the boundary dilemma of the real-valued negative selection (RNS) algorithm. A real-valued negative selection algorithm with boundary detectors was presented. In the new approach, the detectors were interpreted using an aggressive interpretation, and during the training phase boundary detectors, whose aggressiveness is controlled by a boundary threshold, were generated and deployed on the boundary. The boundary detectors can reduce the holes on the boundary efficiently, and also implicitly probe the boundary between the self and nonself regions. The algorithm was tested using both a synthetic 2-D data set and the MIT DARPA 1998 offline data set. Results demonstrate that the new approach has a much higher detection rate than the RNS algorithm at the same false alarm rate, although it has a slightly higher minimum false alarm rate.
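The following toy sketch conveys the general flavour of boundary-style detectors under our own simplifying assumptions (Euclidean matching, a uniformly sampled 2-D space, detector radii that reach to the nearest self sample); it is an illustration, not the paper's algorithm.

```python
# Toy real-valued negative selection with boundary-hugging detectors.
# All constants and the radius rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
self_radius = 0.05          # radius of each self sample
boundary_threshold = 0.01   # controls how "aggressive" boundary detectors are

self_set = rng.uniform(0.4, 0.6, size=(50, 2))   # toy self region in [0,1]^2

detectors = []  # list of (center, radius)
while len(detectors) < 100:
    c = rng.uniform(0.0, 1.0, size=2)
    r = np.min(np.linalg.norm(self_set - c, axis=1)) - self_radius
    if r > boundary_threshold:        # keep detectors outside the self region;
        detectors.append((c, r))      # radius touches the nearest self sample

def is_nonself(x):
    # x is detected (nonself) if any detector covers it.
    return any(np.linalg.norm(x - c) < r for c, r in detectors)

print(is_nonself(np.array([0.9, 0.9])))  # likely True (far from self)
print(is_nonself(np.array([0.5, 0.5])))  # False (inside the self region)
```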
Mechanism for Packet Reordering in MANET
SUN Wei,WEN Tao,GUO Quan
Computer Science. 2009, 36 (8): 82-85. 
The standard implementation of TCP experiences heavy throughput degradation when packets are reordered. This paper presented a new version of the TCP protocol, called TCP_D, which improves performance under mobility-induced packet reordering in mobile ad hoc networks (MANET). In TCP_D, the response to the receipt of duplicate acknowledgements is delayed by a short period to allow the receiver to receive packets that travel along different paths. We analyzed TCP_D to show that its steady-state throughput is similar to that of native implementations of TCP when packet reordering does not occur. The simulation results show that TCP_D performs consistently better than standard implementations of TCP under persistent packet reordering. When packets are not reordered, TCP_D maintains the same throughput as a typical implementation of TCP (TCP-NewReno) and shares network resources fairly.
Research on Implementation of OGC Web Processing Service
SUN Yu,LI Guo-qing,HUANG Zhen-chun
Computer Science. 2009, 36 (8): 86-88. 
Web services provide a network-based solution for achieving the interoperability of geospatial information processing, but there is a lack of definitions of geospatial information metadata. Against this problem, the Web Processing Service (WPS) was proposed by the Open Geospatial Consortium (OGC), which is dedicated to researching GIS resource sharing and process interoperability. This paper proposed an extensible WPS implementation architecture based on the three main operations of WPS to solve the problem of geospatial information interoperability. Furthermore, by developing a demo WPS platform with a merge process according to this architecture, we showed that the design is reasonable, feasible and extensible, and that it can solve the interoperability problem well.
Research on Logic Consistency of Junk Code Transformation within Sub-function
SUN Guo-zi , CHEN Dan-wei , CAI Qiang
Computer Science. 2009, 36 (8): 89-91. 
Junk code transformation is an effective approach to code obfuscation. Based on an analysis of current junk code strategies, the paper proposed a novel junk code encryption algorithm within sub-functions and described the algorithm in a formal language. Using formalization, the paper studied how to prove the logic consistency of the junk code transformation within sub-functions. We deduced some important lemmas after studying the formal definition of the junk code transformation within sub-functions. With these lemmas, the paper proved from three aspects ("XOR and CMP Expand", "Pseudo Embranchment Construction" and "Junk Code after JNE") that a program transformed by the junk code algorithm within sub-functions has the same logic as the original one. Finally, using standard criteria for judging code obfuscation, the paper gave a detailed analysis of the algorithm's results.
Novel Design of Perfect Reconstructed Quadrature Mirror IIR Filter Banks
CHEN Hua-li,CHENG Geng-guo
Computer Science. 2009, 36 (8): 92-93. 
A novel design for a two-channel IIR quadrature mirror filter (QMF) bank with near-perfect reconstruction was presented in this paper. The analysis filter bank was given an efficient poly-phase network implementation based on all-pass filters. The arising phase distortions were almost completely compensated by the synthesis filter bank. Based on the theory of perfect reconstruction of QMF banks, the synthesis filter bank was also built from stable all-pass filters designed via analytical closed-form expressions, so the design of the IIR QMF bank was transformed into the design of a phase equalizer. The simulation results show that the aliasing and amplitude distortions are completely cancelled, and phase distortions are minimized at the expense of an additional signal delay. The proposed IIR QMF banks have a lower algorithmic complexity and are always stable.
New Improvement on a Fair Non-repudiation Protocol
LEI Xin-feng, LIU Jun, XIAO Jun-mo
Computer Science. 2009, 36 (8): 94-97. 
The Zhou-Gollmann protocol is a fair non-repudiation protocol that has been widely discussed in recent years. Kim found a flaw of this protocol in time-limited fairness. Aiming at this flaw, Kim gave an improved protocol, but the improved scheme highly depends on the synchronization of network clocks. We analyzed Kim's protocol in detail and show that, without clock synchronization, Kim's improved protocol is also unfair. We then proposed a new scheme to improve it further. Without reducing the efficiency of the original protocol, our new method avoids depending on clock synchronization, and keeps the fairness and non-repudiation of the protocol.
Digital Content Lending (Renting) Model and its Implementation Framework
ZHONG Yong,LIN Dong-mei, LIU Feng-yu
Computer Science. 2009, 36 (8): 98-104. 
A lending (renting) model of digital content and its implementation framework were presented. The model not only implements traditional lending (renting) models, but also realizes many new usage models, which gives it strong flexibility. The main protocols and functions of the model were implemented by license rules, which make updating lending policies easier, keep the code shorter, and can satisfy high-assurance requirements. The protocols, implementation mechanism and methods of the model were explained with examples. Finally, a comparative analysis with current methods was given.
Verifiably Encrypted Signature Scheme Based on Certificateless
ZHOU Min,YANG Bo,FU Gui,WU Li-li
Computer Science. 2009, 36 (8): 105-108. 
The certificateless cryptosystem realizes the properties of being certificate-free and of key-unescrow. A Certificateless Verifiably Encrypted Signature Scheme (CVES) was therefore proposed, which is composed of a certificateless encryption scheme and a verifiably encrypted signature scheme. Finally, the correctness and unforgeability of CVES were proved. This scheme can effectively resist malicious signature and collusion attacks.
Comparison on the Performance of Media Access Control Mechanisms for IEEE 802.11 Wireless LAN
SONG Jun,JIN Yan-hua,LI Yuan-yuan,LI Lin
Computer Science. 2009, 36 (8): 109-111. 
This paper analyzed and compared by simulation the performance of three media access control mechanisms for the IEEE 802.11 wireless LAN: the Distributed Coordination Function (DCF), Fast Collision Resolution (FCR) and New Self-Adaptive DCF (NSAD). Simulation results indicate that, compared with DCF, FCR greatly enhances the throughput but deteriorates the jitter and fairness of the WLAN, while NSAD improves the jitter and fairness of the WLAN notably and enhances the network throughput slightly.
Web Service Oriented Workflow Model Based on ECA Rules
HE Chun-lin,TENG Yun,PENG Ren-ming
Computer Science. 2009, 36 (8): 112-115. 
While workflow technology is increasingly used to manage complex processes in scientific and business fields, enhancements to Web service capabilities are required. The Petri-net-based workflow model has the limitation that it cannot depict the course of state transitions or the events that happen at the moment of a state transition. An ECA-rule-based Web service workflow model was proposed. The process instances of the model are driven by events, which not only ensures that workflows run and are monitored correctly but also sustains dynamic modification at runtime.
A Projection Approach for Web Service Description Based on Environment Ontology
CAI Guang-jun,JIN Zhi
Computer Science. 2009, 36 (8): 116-120. 
Service description is essential in service-oriented computing (SOC). The major problem with most Web service descriptions is a lack of information for using the services. Based on environment ontology, a projection method was proposed that generates service descriptions for a specific domain. The contents needed by this description approach and its consequences were illustrated, and then the description process was presented. This method can provide more precise description results together with knowledge of the service environment.
Performance Predication of Distributed Component Systems Based on Aspect Templates
HUANG Xiang,ZHANG Wen-bo,ZHANG Bo,WEI Jun
Computer Science. 2009, 36 (8): 121-125. 
Nonfunctional features of distributed component systems are nowadays usually supported by the crosscutting concerns provided by the middleware. Although this separation of concerns helps make system design easy, it increases the difficulty of performance prediction at design time. This paper proposed a performance prediction method for distributed component-based systems based on aspect templates. The method automatically weaves reusable aspect templates and builds a complete performance model based on the performance factors of the middleware. The results of the model can help designers find the defects of their designs or make a good choice among alternative solutions.
Aspect-oriented Architecture Description Language AC2-ADL
WEN Jing, YING Shi, ZHANG Lin-lin, NI You-cong
Computer Science. 2009, 36 (8): 126-132. 
The Architecture Description Language (ADL) is a foundation of software development based on software architectures. Traditional ADLs lack the ability to describe the crosscutting concerns and crosscutting interactions in a software architecture, leading to difficulties in the comprehension, evolution and reusability of software architectural design decisions. We proposed a new aspect-oriented ADL, AC2-ADL, aiming to offer an effective systematic solution for the representation of aspect-oriented software systems. AC2-ADL provides aspectual components to describe the crosscutting concerns. In addition, by introducing aspectual connectors and abstracting the join points of the architecture, it describes the complicated interactions among software architecture elements. Finally, the whole description process of AC2-ADL was discussed systematically through a case study in the e-business domain.
Windows Rootkit Detection Method Based on Cross-view
BAI Guang-dong,GUO Yao, CHEN Xiang-qun
Computer Science. 2009, 36 (8): 133-137. 
Rootkits are often used by attackers to hide their trails in an infected system, so that attackers can lurk in an invaded system for a longer time. Rootkits have become a new threat to OS security. This paper first presented a summary of rootkits and rootkit detection technologies on Windows. Building on existing techniques, this paper proposed a new rootkit detection method based on hidden processes. We designed a detection method based on cross-view, which detects hidden processes by comparing process lists obtained from the OS high level and low level respectively. The low-level process list is obtained by scanning memory for kernel objects in the Windows kernel, while the high-level list is obtained using Windows APIs. We implemented a Windows rootkit detection tool named VITAL based on the proposed method, and used some representative rootkits in experiments to verify its effectiveness.
Research on Integration Process of Viewpoints in Viewpoint-oriented Requirements Engineering
LIANG Zheng-ping, MING Zhong,WU Guo-qing
Computer Science. 2009, 36 (8): 138-144. 
The requirements information of all kinds of stakeholders is acquired and expressed independently and dispersedly in the form of viewpoints in Viewpoint-Oriented Requirements Engineering. All of these viewpoints must be integrated in order to yield a uniform specification. In this paper, the common development of the viewpoints' specifications was treated as the style of integration. The process of integration was modeled using category theory. At the same time, this paper proposed two kinds of optimizing strategies for viewpoint integration, and their validity was proved. In addition, the relationship between the result of integration and the sequence of integration was discussed.
Reflection Mechanism for Software Architecture Reuse and its Formalization
LUO Ju-bo,YING Shi,YE Peng
Computer Science. 2009, 36 (8): 145-148. 
Reusing software resources at the early stages of software development is still insufficient. Reflection mechanisms have been successfully applied in the reuse of code components, but scarcely applied in the reuse of architectures and their constituents. This paper proposed a reflection mechanism supporting the reuse of architectural-level designs, and described the base-level element model and meta-level element model of the reflective software architecture in detail. Moreover, it completely formalized the base-level architecture model using the formal specification language Object-Z. Choosing the Connections schema as an example, this paper also gave the initial theorem and its proof process.
Method of Evolutionary Testing Based on Unstructured Control Flow
JIANG Sheng, LU Yan-sheng
Computer Science. 2009, 36 (8): 149-152. 
Evolutionary testing is a highly effective technique for automatically generating high-quality test data for structural testing. However, under the node-oriented criterion, testing of unstructured programs is inefficient, causing the technique to degenerate into random testing. In this paper, for unstructured programs that contain arbitrary jump statements inside a loop body, a method of fitness calculation based on the traditional approach was proposed, in which the impact of the number of iterations on the evolutionary search is adequately considered. Experiments were then presented, and the results show that the fitness function can effectively guide the evolutionary search to find the required test data at low cost.
Model Checking Web Services Based on Temporal Logic of Knowledge
LUO Xiang-yu, CHEN Yan, GU Tian-long, DONG Rong-sheng
Computer Science. 2009, 36 (8): 153-157. 
Model checking has mainly been used to check whether a system satisfies specifications expressed in temporal logic. People have paid little attention to the model checking problem for multi-agent logics of knowledge, although in the distributed systems community the desirable specifications of systems and protocols have been widely expressed in logics of knowledge. A Web service is an abstract notion that must be implemented by a concrete agent; the agent is the concrete piece of software or hardware that sends and receives messages. Thus a Web services composition can be viewed as a multi-agent system. We applied our model checker MCTK for temporal logic of knowledge to verify the Web services example of the SAS protocol. We also verified the example with three model checkers: WSAT, WS-Engineer and SPIN. The experimental results show that the performance of our model checking method for verifying Web services is higher than that of these three model checkers.
Optimal Preventive Software Rejuvenation Policy with Periodic Testing
ZHAO Xu-feng, QIAN Cun-hua, NAKAGAWA Toshio
Computer Science. 2009, 36 (8): 158-160. 
The software aging process and a preventive software rejuvenation policy were studied in this paper. The consumption of physical memory that is never released, caused by aging-related bugs, was treated as damage suffered from shocks; it can be observed by periodic tests, and preventive software rejuvenation is performed at the next test point when the consumption of physical memory reaches a preventive rejuvenation value. An expected cost rate model and the optimal preventive rejuvenation value that minimizes the expected cost rate were analytically derived. A simulation experiment shows the validity of the rejuvenation conditions.
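A toy simulation of such a policy is easy to write down. All numbers below (exponential shock sizes, one shock per time unit, the test interval and the threshold) are illustrative assumptions, not the paper's model parameters.

```python
# Toy simulation of periodic-test rejuvenation: memory damage accumulates
# by shocks, is observed only at test points, and the software is
# rejuvenated at the first test where damage exceeds the threshold.
import random

random.seed(1)

def run_cycle(test_interval=5, threshold=100.0, mean_shock=4.0):
    damage, t = 0.0, 0
    while True:
        t += 1
        damage += random.expovariate(1.0 / mean_shock)  # one shock per step
        if t % test_interval == 0 and damage >= threshold:
            return t, damage  # rejuvenate at this test point

times = [run_cycle()[0] for _ in range(10000)]
print(sum(times) / len(times))  # average time between rejuvenations
```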
Modeling Tool for Software Architecture Design Decisions
XIAO Sai,CUI Xiao-feng,SUN Yan-chun, HUANG Gang
Computer Science. 2009, 36 (8): 161-164. 
Architectural design plays a crucial role in the whole lifecycle of software. The vaporization of design knowledge causes many problems: for instance, the cost of evolution becomes huge, communication among stakeholders becomes difficult, and the reuse of software architectural artifacts is limited. Therefore there is a growing demand to explicitly model architectural design decisions, and this paper presents a tool supporting a decision-centric software architecture design method. The tool helps architects model the core notions of issue, solution, decision and rationale in architecture design, accomplish the design process from requirements to architectures, and implement the automated synthesis of candidate architecture solutions together with the capture of partial design rationale. Furthermore, the tool provides tracing between architecture models and design decisions, and helps implement the reuse of design decision knowledge during architecture design processes.
Enhancement of Code Search Results Using Syntax and Semantic Analysis
LIU Shi,LI He,WANG Xiao-yin,ZHANG Lu,XIE Bing
Computer Science. 2009, 36 (8): 165-168. 
Learning simple algorithms and specific API usages from code examples is an efficient way of software reuse and is the main purpose of using a code search engine. Code search engines, which provide search services for source code on the Internet, developed from Web search engines. They are able to locate source code related to the search input and bring great assistance to software development. However, state-of-the-art code search engines do not distinguish between API implementations and usages; the results are redundant and not easy to recognize, and it is difficult for the user to obtain useful code segments from the search result items. To address this problem, we proposed applying syntax and semantic analysis techniques to organize the search results, clustering similar code and acquiring better code digests. We implemented our method in a code search engine and evaluated its effectiveness in this paper; the experimental results demonstrate that our approach works efficiently.
Automatic Behavior Protocol Recovery Tool for Object-oriented Programs Based on Static Code Analysis
HUANG Zhou,PENG Xin,ZHAO Wen-yun
Computer Science. 2009, 36 (8): 169-173. 
Object behavior protocols are important for understanding object interfaces, correct module composition and the reuse of classes. In previous work, we proposed an automatic method for extracting object behavior protocols based on static source code analysis. The method obtains direct and indirect dependencies between interface methods from source code, then constructs interface state diagrams based on the intra-class dependency relations. In this paper, we further presented the supporting tool for automatic behavior protocol recovery, including the main tool modules and the implementation techniques of each part.
Component Composition Evaluation Method Based on Grey Correlation
YANG Xiao-ping,ZHANG Wei-qun,ZHOU Xiang-bing
Computer Science. 2009, 36 (8): 174-176. 
To improve the performance and efficiency of component composition, and to evaluate the port representation and protocol definition, this paper proposed a hierarchical correlation-oriented evaluation model; it implements the evaluation of component composition by the grey correlation grade of integrated evaluation. Firstly, this paper set up a guideline system including the performance and efficiency of component composition and the integrality, usability and consistency of the port representation and protocol definition; secondly, it designed an integrated grey correlation evaluation method and analyzed the characteristics of component composition; finally, it analyzed a component library from an e-commerce case. The results show that the method can serve applications efficiently and helps improve component composition efficiency.
Formal Modeling and Evolution of Design Pattern Based on Role
SUN Jun-mei,MIAO Huai-kou
Computer Science. 2009, 36 (8): 177-181. 
Design reuse has become important in improving the software development process. The concept of object-oriented design patterns opened up software design reuse. There are barriers when instantiating design patterns, such as implementation, documentation and composition. This paper presented a formal modeling approach based on roles. Classes, attributes of classes, and the relations between classes are all treated as roles, and all roles are modeled with Object-Z. This effectively resolves the barriers to instantiating design patterns. The formal model of a design pattern is also evolved based on roles. The evolution is divided into role-layer evolution and pattern-layer evolution, and pattern-layer evolution is composed of role-layer evolutions. The consistency of the model can be verified with a formal theorem prover.
Incremental Maintenance of Approximate Equal-depth Histograms Based on Merge-split Strategy
ZHANG Long-bo, LI Zhan-huai, WANG Yong
Computer Science. 2009, 36 (8): 182-184. 
The histogram is one of the effective methods for constructing synopsis data structures on landmark windows over data streams. This paper presented a new framework for the incremental maintenance of approximate equal-depth histograms by merging and splitting the buckets, and compared three different merge-and-split strategies. The experimental results show that the algorithms are effective and efficient for continuous streaming data processing over the landmark window model.
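As an illustration of the merge-split idea (with our own split trigger and merge rule, not one of the paper's three strategies), the sketch below keeps the bucket count constant by splitting any over-deep bucket and merging the lightest adjacent pair.

```python
# Illustrative merge-split maintenance of an approximate equal-depth
# histogram over a stream; thresholds and rules are assumptions.
import bisect

class EqualDepthHistogram:
    def __init__(self, lo, hi, k=4):
        step = (hi - lo) / k
        self.lo, self.hi = lo, hi
        self.bounds = [lo + step * i for i in range(1, k)]  # k buckets
        self.counts = [0] * k
        self.total = 0

    def insert(self, x):
        i = bisect.bisect_right(self.bounds, x)
        self.counts[i] += 1
        self.total += 1
        if self.counts[i] > 2 * self.total / len(self.counts):  # too deep
            self._split(i)
            self._merge_lightest_pair()   # keep the bucket count constant

    def _split(self, i):
        # Split bucket i at its midpoint; values are assumed uniform inside
        # a bucket, so each half receives half of the count.
        left = self.bounds[i - 1] if i > 0 else self.lo
        right = self.bounds[i] if i < len(self.bounds) else self.hi
        self.bounds.insert(i, (left + right) / 2)
        half = self.counts[i] // 2
        self.counts[i:i + 1] = [half, self.counts[i] - half]

    def _merge_lightest_pair(self):
        # Merge the adjacent bucket pair with the smallest combined count.
        j = min(range(len(self.counts) - 1),
                key=lambda m: self.counts[m] + self.counts[m + 1])
        self.counts[j:j + 2] = [self.counts[j] + self.counts[j + 1]]
        del self.bounds[j]

h = EqualDepthHistogram(0.0, 100.0, k=4)
for v in range(10000):
    h.insert(v % 40)          # skewed stream concentrated in [0, 40)
print([round(b, 1) for b in h.bounds], h.counts)
```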
Optimization Technology of Domain-special Data Transmission Based on XML
ZHAO Chong-chong, WANG Xin, HU Chang-jun
Computer Science. 2009, 36 (8): 185-188. 
XML is widely used in various areas due to its usability, comprehensibility and cross-platform nature. Although XML techniques can handle the cross-platform transmission of domain data, the performance is limited by the fact that a great deal of additional data may be transmitted to keep files complete. This transmission problem not only increases the pressure on the network, but also delays the response of the data service. This paper proposed an optimized transmission technique for domain-specific data based on XML. The idea of this technique is to keep the properties of XML while organizing the elements of an XML file according to certain logical relationships and domain knowledge; that is, based on certain rules, the XML data is divided into a series of classes to be transmitted without the entire XML file. This approach can effectively resolve the performance degradation arising from partial modifications of XML data files. Rules and methods for partitions of various granularities were given, together with the application conditions and the efficiency of the method. The proposed method, which has been applied in the petroleum field, can improve transmission efficiency.
Multi-source Web Service QoS Collection Approach and Implementation
CHENG Zhi-wen, ZHAO Jun-feng, LI Tian, SHAO Ling-shuang, SUN Jia-su
Computer Science. 2009, 36 (8): 189-192. 
With the vigorous development of Web services technologies, Web services' QoS (Quality of Service) has received much more attention. Recent research mainly focused on QoS-based applications, such as QoS-based dynamic composition and negotiation. Most of these studies rely on large amounts of reliable QoS data. However, there are many limitations in current QoS collection methods; for example, the accuracy of the collected QoS is not very high and the amount of QoS data is small. This paper focused on the collection of QoS and advances a QoS collection and management framework (QoSCollectionFrame). Within this framework, the paper studied QoS collection methods from multiple sources, including the service-client end, the service-provider end, and online testing of Web services. Based on this research, the paper gave implementations of these QoS collection methods.
Study on the Generalized Function-effect Chain in Information Retrieval
MIAO Jian-min,ZHANG Quan
Computer Science. 2009, 36 (8): 193-195. 
The division into generalized function sentences and generalized effect sentences embodies the function-effect chain idea in the sentence category system, and is the most basic classification of natural language sentences. If we combine the statistical method and the rule method in information retrieval and apply a second filter to the results, the final results will conform better to the actual demands of the querying user and effectively enhance the accuracy of information retrieval. Statistics on manually prepared data proved that the method effectively combines statistical means with rule-based means and improves the accuracy of information retrieval.
Complete Algorithm of Quick Heuristic Attribute Reduction Based on Indiscernibility Degree
TENG Shu-hua,WEI Rong-hua,SUN Ji-xiang,TAN Zhi-guo,HU Qing-hua
Computer Science. 2009, 36 (8): 196-200. 
After analyzing attribute reduction algorithms based on rough sets, a new definition of indiscernibility degree was given for measuring the importance of attributes, and the properties of the indiscernibility degree were analyzed. Then, based on the indiscernibility degree, a new heuristic reduction algorithm was proposed, which is useful for dealing with noise and cuts the worst-case time complexity down to max(O(∣A∣∣U∣), O(∣A∣²∣U/A∣)). The theoretical analysis and experimental results show that the new method is not only useful in handling data noise but also robust and efficient.
Fast Chinese Text Search Based on LSH
CAI Heng, LI Zhou-jun, SUN Jian, LI Yang
Computer Science. 2009, 36 (8): 201-204. 
Querying high-dimensional data attracts more and more attention. When the dimension of a vector space is higher than 10, R-trees, Kd-trees, SR-trees and Quadtrees perform worse than a linear scan, whereas the Locality Sensitive Hashing (LSH) algorithm deals with this problem successfully. Nowadays LSH is playing a more and more important role in high-dimensional querying. In this paper, the basic algorithm and principle of LSH were introduced first, and then the binary-vector LSH search algorithm was improved by means of multi-probe querying. Finally, we implemented the two kinds of LSH algorithms. The experiments we designed verified that the revised algorithm performs better than the original one in two respects: on the one hand, as the offset increases, the retrieval recall proportion grows; on the other hand, the space complexity decreases without a change in time complexity.
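A minimal sketch of LSH for binary vectors with multi-probe querying (bit sampling is used here for simplicity; the parameters and data are made up, not the paper's design): besides the query's own bucket, all buckets whose keys differ in one sampled bit are probed, trading a little extra work for higher recall.

```python
# Illustrative bit-sampling LSH with multi-probe querying over binary
# vectors; DIM, K and the data are arbitrary choices for the sketch.
import random
from collections import defaultdict

random.seed(0)
DIM, K = 64, 8                       # vector length, sampled bits per key
sample_idx = random.sample(range(DIM), K)

def key(v):
    return tuple(v[i] for i in sample_idx)

table = defaultdict(list)

def index(vid, v):
    table[key(v)].append(vid)

def query(v, multi_probe=True):
    k = key(v)
    keys = [k]
    if multi_probe:                   # probe all keys at Hamming distance 1
        keys += [k[:i] + (1 - k[i],) + k[i + 1:] for i in range(K)]
    return {vid for kk in keys for vid in table.get(kk, [])}

data = {i: [random.randint(0, 1) for _ in range(DIM)] for i in range(1000)}
for vid, v in data.items():
    index(vid, v)
print(len(query(data[0])))           # candidate set for an indexed vector
```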
Application of an Improved Dijkstra Algorithm in Multicast Routing Problem
ZHANG Yi, ZHANG Meng,LIANG Yan-chun
Computer Science. 2009, 36 (8): 205-207. 
Three improvements to the Dijkstra algorithm were presented, as follows: (1) a novel optimized implementation approach based on ACO is designed to reduce the processing costs involved in the routing of ants in the conventional Dijkstra algorithm; (2) based on the network routing model, the candidate set is limited to the nearest c points in order to reduce the computation over other points; (3) blocked points are flagged in order to prevent selecting them. Simulations show that the convergence speed of the improved algorithm is greatly enhanced compared with the traditional algorithm.
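Improvements (2) and (3) are easy to illustrate on top of a textbook Dijkstra implementation. The sketch below uses a made-up graph; limiting each node to its c nearest neighbours makes the search approximate, which is the trade-off being described.

```python
# Dijkstra with a limited candidate set (2) and blocked-node flags (3);
# the graph and parameters are illustrative assumptions.
import heapq

def dijkstra_limited(adj, src, c=3, blocked=frozenset()):
    # adj: {node: [(neighbor, weight), ...]}
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        # improvement (2): consider only the c nearest neighbours of u
        for v, w in sorted(adj.get(u, []), key=lambda e: e[1])[:c]:
            if v in blocked:          # improvement (3): skip flagged nodes
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {"a": [("b", 1), ("c", 4), ("d", 9)],
         "b": [("c", 2), ("d", 6)],
         "c": [("d", 3)],
         "d": []}
print(dijkstra_limited(graph, "a", c=2, blocked={"d"}))
```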
Entity Relation Extraction for Complex Chinese Text
WANG Yuan, XU De-zhi, CHEN Jian-er
Computer Science. 2009, 36 (8): 208-211. 
Entity relation extraction is one of the important research fields in information extraction. Aiming at the inefficiency of existing approaches to entity relation extraction, this paper presented a novel approach. The new approach proposes seven heuristic rules, combined with the grammatical features of Chinese text, to extract relation feature sequences, and applies a semantic sequence kernel function with the KNN learning algorithm to fulfill the entity relation extraction task. Experiments were carried out on two kinds of relation types defined in the ACE guidelines; the results show that the new approach achieves an average F-score of up to 76%, significantly higher than traditional feature-based approaches and traditional shortest-path dependency kernel approaches.
Non-linear Transfer Learning Model
YANG Pei,TAN Qi, DING Yue-hua
Computer Science. 2009, 36 (8): 212-214. 
Multi-task learning utilizes labeled data from other "similar" tasks and can achieve efficient knowledge sharing between tasks. Previous research mainly focused on multi-task learning for linear regression. A novel Bayesian multi-task learning model for non-linear regression, HiRBF, was proposed. HiRBF is constructed under a hierarchical Bayesian framework. According to whether the input-to-hidden layer is shared by all tasks or not, there are two options for building the HiRBF model; they are compared in the experiment section. The HiRBF algorithm is also compared with two transfer-unaware approaches. The experiments demonstrate that HiRBF significantly outperforms the others.
Particle Filter Algorithm with Previous-frame Weighted Sampling for Object Tracking
WU Gang,TANG Zhen-min,GENG Feng,CHENG Yong
Computer Science. 2009, 36 (8): 215-216. 
Based on the strong correlation between nearby frames, the authors proposed a particle filter algorithm that introduces weighted sampling from the previous frame for object tracking. By introducing a proposal distribution, the algorithm resolves the difficulty of the traditional SIR algorithm, which depends acutely on the state model, and it can track objects whose movement is irregular. The idea of weighted sampling from the previous frame can be applied not only to object tracking in image sequences but can also be extended to other fields. The method is universal and practical.
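The core idea, drawing the proposal from the previous frame's weighted particle set rather than from the state model alone, can be sketched in one dimension as follows; the motion and observation models and all noise levels are illustrative assumptions, not the paper's tracker.

```python
# Toy 1-D particle filter whose proposal is seeded by weighted sampling
# from the previous frame's particles.
import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)

def step(observation, proposal_std=0.5, obs_std=1.0):
    global particles, weights
    # Weighted sampling from the previous frame: particles with high weight
    # in the last frame are more likely to seed new particles.
    seeds = rng.choice(particles, size=N, p=weights)
    particles = seeds + rng.normal(0.0, proposal_std, N)  # diffuse
    # Re-weight by the observation likelihood.
    weights = np.exp(-0.5 * ((particles - observation) / obs_std) ** 2)
    weights /= weights.sum()
    return np.sum(particles * weights)  # posterior-mean state estimate

truth = 0.0
for t in range(20):
    truth += 0.3 + rng.normal(0, 0.1)     # irregular motion
    est = step(truth + rng.normal(0, 1.0))
print(round(truth, 2), round(est, 2))
```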
Spam Filtering Based on Covering Algorithm
DUAN Zhen,WANG Qian-qian,ZHANG Yan-ping,ZHANG Ling
Computer Science. 2009, 36 (8): 217-219. 
The classification accuracy and risk rate are important factors in evaluating an e-mail system's performance, and spam filtering is a particular application of text categorization. This paper introduced the covering algorithm (CA) of neural networks into spam filtering, and used several feature reduction methods to classify e-mail. Compared with SVM, the experimental results indicated that combining the covering algorithm with appropriate feature selection and reduction methods is an effective way to realize a spam filter. To meet the need for minimum risk in spam filtering, we proposed an improvement to the handling of rejected samples by employing the cross-cover algorithm, based on the analysis results. The results show that this method can reduce the risk by changing the region affecting normal mail.
Privacy Preserving Association Rule Mining Method Based on Web Logs
BAO Yu, HUANG Guo-xing
Computer Science. 2009, 36 (8): 220-223. 
Each visitor's shopping session on an e-business Web site is recorded in the Web server log files. Analyzing the log files and exploring the strong regularities among the commodities in shopping carts can provide recommended goods for Web users and improve the performance of the Web service. In order to improve the privacy preservation of the original visitors' shopping information and of the mining results, an effective method for privacy-preserving association rule mining was presented. First, a new data preprocessing approach, Fake Column's Randomized Response with Column Replacement (FCRRCR), was proposed to transform and hide the original data. Then, an effective privacy-preserving association rule mining algorithm based on the bitwise AND operation was presented. As shown in the experimental results, the algorithm achieves significant improvements in terms of privacy, accuracy, efficiency and applicability.
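The FCRRCR scheme itself is specific to the paper, but the underlying randomized-response principle can be sketched as follows (plain Warner-style bit flipping with made-up data, not the paper's method): each 0/1 cart entry is reported truthfully only with probability p, and the true support is estimated back from the perturbed data.

```python
# Illustrative randomized response over 0/1 shopping-cart data.
import random

random.seed(0)
p = 0.8  # probability of reporting the true bit

def perturb(row):
    return [b if random.random() < p else 1 - b for b in row]

# Toy cart data: 1 = item is in the shopping cart.
carts = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 0]] * 250
released = [perturb(row) for row in carts]

def estimated_support(item):
    observed = sum(r[item] for r in released) / len(released)
    # Unbias: observed = p*true + (1-p)*(1-true)  =>  solve for true.
    return (observed - (1 - p)) / (2 * p - 1)

true = sum(r[0] for r in carts) / len(carts)
print(round(true, 3), round(estimated_support(0), 3))
```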
Method of Calculating the Size of Dynamic Reduct Sub-table Family F
CHEN Hao, YANG Jun-an, WU Yan-hua
Computer Science. 2009, 36 (8): 224-226. 
This paper formulated the dynamic reduct model and analyzed the extraction of the sub-table family F. We calculated the sub-table family according to estimation based on the normal distribution, and proposed a method of calculating the size of the dynamic reduct sub-table family F based on the reduct accuracy coefficient, which develops and completes the idea of sampling the sub-table family F.
Algorithm of Attribute Reduction Based on Database Technology
WANG Xiong-bin,ZHENG Xue-feng,XU Zhang-yan
Computer Science. 2009, 36 (8): 227-230. 
To remedy the shortcomings of the traditional attribute reduction model, some researchers proposed definitions of attribute reduction based on the system entropy and on the database model. The main merit of attribute reduction based on the database model is that efficient database technology can be used to design the attribute reduction algorithm, so the corresponding algorithm is efficient. In order to use efficient database technology to design an attribute reduction algorithm based on the system entropy, it was proved that attribute reduction based on the system entropy is equivalent to that based on the database model. Then an algorithm for attribute reduction based on the system entropy using operations of database technology was provided, and an example was used to illustrate the new algorithm.
Microcalcification Detection Based on K-means Cluster and Multiple Kernel Support Vector Machine
CHANG Tian-tian, LIU Hong-wei,FENG Jun
Computer Science. 2009, 36 (8): 231-233. 
Considering the unbalanced distribution of the training samples and the diversity of the features, a multiple-kernel SVM based on the K-means clustering algorithm was proposed. Firstly, the training samples were clustered into K classes, and different penalty factors were used for each class in order to balance the contributions of the classes. Secondly, a multiple-kernel support vector machine was proposed to handle the diversity of the features. A stabilized training sample set was obtained via active feedback learning. The results show that the detection rate can be improved by at least 2 percent by the proposed method, compared with the single-kernel SVM and the multiple-kernel SVM.
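A rough sketch of the two ingredients, per-cluster penalties and a combined kernel, using scikit-learn on synthetic data; the clustering granularity, kernel mix and weighting rule are our own illustrative choices, not the paper's settings.

```python
# Illustrative K-means clustering + per-cluster sample weights + a
# precomputed multiple kernel (convex mix of RBF and linear) for an SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, weights=[0.8, 0.2],
                           random_state=0)

# Step 1: cluster the training samples; small clusters get larger penalty
# weights so their contribution is not swamped by the large ones.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
cluster_size = np.bincount(labels)
sample_weight = len(X) / cluster_size[labels]

# Step 2: a simple multiple kernel, a convex mix of RBF and linear kernels.
alpha = 0.7
gram = alpha * rbf_kernel(X, gamma=0.1) + (1 - alpha) * linear_kernel(X)

clf = SVC(kernel="precomputed")
clf.fit(gram, y, sample_weight=sample_weight)
print(clf.score(gram, y))  # training accuracy of the sketch
```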
System Dynamics Model for a Mixed-strategy Game of Environmental Pollution
CAI Ling-ru,WANG Hong-wei,ZENG Wei
Computer Science. 2009, 36 (8): 234-238. 
A system dynamics (SD) model was built for studying a mixed-strategy evolutionary game between the governments that regulate environmental pollution and the enterprises whose production generates contamination. The stability analysis and SD simulation results show that no evolutionary equilibrium exists. A dynamic penalty was introduced into the SD model to stabilize the equilibrium and improve environmental pollution control. Finally, the stability analysis of the evolutionary game with the dynamic penalty proves that the Nash equilibrium is the evolutionary equilibrium. SD provides a simulation and experiment platform for the development and application of evolutionary game theory.
Research and Application of Multi-dimensional Association Rules Mining Based on Artificial Immune System
ZHU Yu,ZHANG Hong,KONG Ling-dong
Computer Science. 2009, 36 (8): 239-242. 
Association rule mining is very important in data mining applications. At present, single-dimensional association rule mining has matured, but the prominent combinatorial explosion problem of multi-dimensional association rules has not yet been solved satisfactorily. A method for mining multi-dimensional association rules based on an artificial immune algorithm was proposed. The algorithm makes use of immune memory, stores the association rules in memory, and mines multi-dimensional association rules faster. The results show that the algorithm, applied to coal and gas outburst prediction, has good robustness and can search the whole global space more quickly and efficiently. The algorithm is feasible and effective for multi-dimensional rule mining.
Method of Multi-granularity Community Structure Mining Based on Local Information Detection
ZHU Tao,CHANG Guo-cen,GUO Rong-xiao, LI Xiang-jun
Computer Science. 2009, 36 (8): 243-246. 
Abstract PDF(298KB) ( 554 )   
RelatedCitation | Metrics
Taking into consideration the complexity and dynamics of complex networks, a definition of local modularity was proposed, and an algorithm for community structure mining based on local information detection was given, with the criterion that the best community is the node group whose local modularity is largest. A method of multi-granularity community structure mining was then proposed, which offers new ways to observe the structural characteristics of complex networks from various angles. Experiments verify its efficiency and feasibility.
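For concreteness, a sketch of greedy community growth under one common local modularity measure (the internal-to-total edge ratio); the paper's own definition and its multi-granularity procedure may differ.

```python
import networkx as nx

def local_modularity(G, C):
    """Internal-to-total edge ratio of node set C: one common local
    modularity measure (the paper's exact definition may differ)."""
    inside = G.subgraph(C).number_of_edges()
    boundary = sum(1 for u in C for v in G[u] if v not in C)
    return inside / (inside + boundary) if inside + boundary else 0.0

def grow_community(G, seed):
    """Greedily add the neighbour that most improves local modularity."""
    C = {seed}
    while True:
        frontier = {v for u in C for v in G[u]} - C
        if not frontier:
            break
        best = max(frontier, key=lambda v: local_modularity(G, C | {v}))
        if local_modularity(G, C | {best}) <= local_modularity(G, C):
            break            # no neighbour improves the score: stop
        C.add(best)
    return C

G = nx.karate_club_graph()
print(sorted(grow_community(G, 0)))
```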
Gaussian Processes Reinforcement Learning Method in Large Discrete States Space
ZHOU Wen-yun,LIU Quan,LI Zhi-tao
Computer Science. 2009, 36 (8): 247-249. 
Abstract PDF(330KB) ( 767 )   
RelatedCitation | Metrics
To address the "curse of dimensionality" in reinforcement learning, where the state space grows exponentially with the number of features, a reinforcement learning method based on Gaussian processes was proposed for large discrete state spaces. A Gaussian process models a distribution over functions, so it yields a distribution over the expected return rather than a point estimate. Experiments show that convergence speed and final performance improve markedly when reinforcement learning is combined with Gaussian processes, and that the Gaussian process regression model alleviates the "curse of dimensionality" in large discrete state spaces to a certain extent.
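A minimal sketch of the underlying idea, using scikit-learn's Gaussian process regressor on a toy one-dimensional state space: the GP posterior supplies both an expected value and an uncertainty, which an exploration policy can exploit. The paper's full RL algorithm is not reproduced; the states and returns below are made up.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# toy 1-D "state space": fit a value function on a few visited states,
# then query the GP posterior over all states
states = np.array([[0], [3], [7], [12], [20]], dtype=float)
returns = np.array([0.0, 1.0, 2.5, 1.5, 0.2])   # observed returns

gp = GaussianProcessRegressor(kernel=RBF(length_scale=3.0), alpha=1e-2)
gp.fit(states, returns)

query = np.arange(21, dtype=float).reshape(-1, 1)
mean, std = gp.predict(query, return_std=True)
# an exploration policy can prefer states with high posterior variance
print(int(np.argmax(mean + std)))
```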
Evaluation of the Implementation Process of IS Based on FI-SEM
YU Xiu-yan,HU Ke-jin,LUAN Dong-qing
Computer Science. 2009, 36 (8): 250-253. 
Abstract PDF(292KB) ( 662 )   
RelatedCitation | Metrics
Implementation-process indexes of IS were built based on system theory. The meta-synthesis method FI-SEM was put forward to evaluate the implementation process of IS. FI-SEM handles expert and weight information reflecting both subjective and objective aspects. A demonstration study of 30 firms shows that FI-SEM is effective.
New Solving Algorithm of TSP Based on Return-to-zero Matrix
GUO Wen-qiang,QIN Zhi-guang,FENG Hao
Computer Science. 2009, 36 (8): 254-257. 
Abstract PDF(286KB) ( 745 )   
RelatedCitation | Metrics
Drawing on the basic idea of traditional greedy algorithms, this paper proposed a new algorithm for solving traveling salesman problems based on a return-to-zero matrix. The method takes the return-to-zero matrix as input to avoid matrix traps, and uses a fully greedy strategy to seek the shortest Hamiltonian circuit. Applied to problems from TSPLIB, the proposed method obtains satisfactory solutions within a short time.
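A rough sketch of the reduction-plus-greedy idea (the paper's exact return-to-zero procedure and trap-avoidance safeguards are not reproduced): subtract row and column minima so every row and column contains a zero, then greedily follow the smallest reduced costs.

```python
import numpy as np

def reduce_matrix(d):
    """Return-to-zero reduction: subtract each row's minimum, then each
    column's minimum, so every row and column contains a zero."""
    d = d.astype(float).copy()
    np.fill_diagonal(d, np.inf)
    d -= d.min(axis=1, keepdims=True)
    d -= d.min(axis=0, keepdims=True)
    return d

def greedy_tour(dist):
    """From the current city, always move to the unvisited city with the
    smallest reduced cost (zeros first); tour length uses original costs."""
    r = reduce_matrix(dist)
    n = len(dist)
    tour, cur = [0], 0
    unvisited = set(range(1, n))
    while unvisited:
        nxt = min(unvisited, key=lambda j: r[cur][j])
        tour.append(nxt); unvisited.remove(nxt); cur = nxt
    return tour, sum(dist[a][b] for a, b in zip(tour, tour[1:] + [0]))

d = np.array([[0, 2, 9, 10], [1, 0, 6, 4], [15, 7, 0, 8], [6, 3, 12, 0]])
print(greedy_tour(d))
```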
Study of Improved Particle Swarm Optimization
WANG Yong,ZHANG Wei,CHEN Jun,WEI Peng-cheng
Computer Science. 2009, 36 (8): 258-259. 
Abstract PDF(227KB) ( 1065 )   
RelatedCitation | Metrics
This paper develops an improved particle swarm optimization (PSO) algorithm. The proposed method introduces "fine-tuning" into PSO, which strengthens local search and mitigates the defect that individual particles become highly similar in the late period of the search. The performance of the improved PSO and standard PSO was compared by optimizing five massively multimodal functions of varying complexity. The results show that the improved PSO outperforms standard PSO in search success rate, average convergence time, and average number of convergence generations.
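A sketch of one plausible reading of the scheme, with assumed constants and a Rastrigin test function: standard PSO plus shrinking random probes around the global best as the "fine-tuning" local search.

```python
import numpy as np

def pso_fine_tuning(f, dim, n=30, iters=200, seed=0):
    """Basic PSO plus a 'fine-tuning' local search around the global
    best: small random perturbations of gbest counteract particles
    becoming too similar late in the search (our reading of the idea)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    w, c1, c2 = 0.72, 1.49, 1.49            # common PSO constants
    for t in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
        # fine-tuning: shrinking random probes around the global best
        probe = g + rng.normal(0, 0.1 * (1 - t / iters), (5, dim))
        pv = np.apply_along_axis(f, 1, probe)
        if pv.min() < f(g):
            g = probe[pv.argmin()].copy()
    return g, f(g)

rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
print(pso_fine_tuning(rastrigin, dim=5))
```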
Model for Dependent Evidences in DSmT Framework
WANG Jin,SUN Huai-jiang
Computer Science. 2009, 36 (8): 260-263. 
Abstract PDF(341KB) ( 608 )   
RelatedCitation | Metrics
DSmT (Dezert-Smarandache Theory), an extension of Dempster-Shafer theory, assumes, just like Dempster-Shafer evidence theory, that sources of evidence are independent. This is a very strict condition that is difficult to satisfy in practice. A new model for representing interrelated evidences in the DSmT framework was proposed. In this model, two dependent evidences are each regarded as the orthogonal sum of one shared dependent original evidence and one of two independent original evidences. Combining the two dependent evidences then reduces to an orthogonal sum of the two independent original evidences and the dependent original evidence. To do this, the original evidences must first be identified. A condition was proven to guarantee that the identified dependent original evidence is unique in the free DSm model; if this condition is not satisfied, an approximate solution was proposed.
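For reference, a minimal implementation of the classical orthogonal sum (Dempster's rule with conflict renormalization) over a frame of exclusive hypotheses; DSmT's combination over the hyper-power set generalizes this and is not shown.

```python
from itertools import product

def orthogonal_sum(m1, m2):
    """Dempster's orthogonal sum of two mass functions over the same
    frame; focal elements are frozensets, conflict is renormalised."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidences")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, B: 0.1, A | B: 0.3}
m2 = {A: 0.5, B: 0.3, A | B: 0.2}
print(orthogonal_sum(m1, m2))
```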
Construction Method of Lattice-valued Fuzzy Concept Lattice Based on Matrix Implication Operation
YANG Li,XU Yang
Computer Science. 2009, 36 (8): 264-267. 
Abstract PDF(248KB) ( 700 )   
RelatedCitation | Metrics
From the viewpoint of matrices, the construction of a kind of fuzzy concept lattice was studied. A lattice-valued fuzzy concept lattice was established on the structure of a lattice implication algebra, which serves as the truth-value range describing the uncertain relation between objects and attributes. For the convenience of non-numerical computation, matrix conjunction, disjunction, and implication operations were defined, and based on the matrix implication operation a construction method for the lattice-valued fuzzy concept lattice was presented; furthermore, an example was provided to illustrate the correctness of the method.
Rough Set Analyzing Method Based on Tolerance Relation and PCA in Incomplete Information Systems
WANG Jun-hui
Computer Science. 2009, 36 (8): 268-269. 
Abstract PDF(222KB) ( 569 )   
RelatedCitation | Metrics
How to acquire the optimal attribute reduction and the weights of attributes in incomplete information systems is currently a hot research topic in rough set theory. A rough set analysis method for incomplete information systems based on the tolerance matrix and principal component analysis was provided; it is a heuristic data-analysis method. The corresponding algorithm and an example are also given in this article. Compared with other methods, the proposed method efficiently finds the core of an incomplete information system and acquires attribute weights through quantitative analysis.
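The tolerance relation itself is easy to make concrete; a minimal sketch for an incomplete information system with '*' as the missing value follows (the PCA weighting stage is not reproduced).

```python
MISSING = '*'

def tolerant(x, y):
    """Objects are tolerant if on every attribute they agree or at
    least one value is missing (the usual tolerance relation for
    incomplete information systems)."""
    return all(a == b or MISSING in (a, b) for a, b in zip(x, y))

def tolerance_classes(objects):
    """For each object, the set of objects tolerant with it."""
    return {i: {j for j, y in enumerate(objects) if tolerant(x, y)}
            for i, x in enumerate(objects)}

# toy incomplete information system: rows are objects, '*' is unknown
U = [('1', '0', '*'),
     ('1', '*', '2'),
     ('0', '0', '2'),
     ('1', '0', '2')]
print(tolerance_classes(U))
```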
Mean Value Coordinates on Tangent Plane for Laplacian Mesh Deformation
ZHAO Jian,HU Jian-guang,WU Ling-da
Computer Science. 2009, 36 (8): 270-272. 
Abstract PDF(227KB) ( 812 )   
RelatedCitation | Metrics
A requirement for mesh deformation is to preserve local geometric detail. Laplacian mesh editing can meet this requirement, but Laplacian coordinates are not an accurate description of local detail. This paper defined Laplacian coordinates through mean value coordinates: the 1-ring of a vertex is first projected onto the tangent plane, and the Laplacian coordinates are then constructed using the mean value coordinates of the vertex. The results show that this definition describes local detail accurately.
Image Encryption Communication Scheme Based on Clifford Map and Additive Modular Arithmetic
PAN Bo,FENG Jin-fu,TAO Qian,LI Qian
Computer Science. 2009, 36 (8): 273-275. 
Abstract PDF(237KB) ( 644 )   
RelatedCitation | Metrics
Based on the Clifford hyper-chaotic map, a novel image secure-communication scheme was proposed. In this scheme, the confusion process permutes the plain image with the Clifford map as a pretreatment, and the diffusion process is based on additive modular arithmetic, adopting a different key in every iteration. Once synchronization between sender and receiver is achieved, the encrypted image can be correctly recovered into the source image. Computer simulation results indicate that the proposed scheme is feasible. Finally, cryptographic properties of the scheme such as the key space and the sensitivity of the encryption with respect to the secret key were analyzed.
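A toy sketch of the confusion-diffusion pattern described above, assuming one common form of the Clifford map and made-up parameters; the paper's exact map, key schedule, and synchronization mechanism are not reproduced.

```python
import numpy as np

def clifford_sequence(n, a=-1.4, b=1.6, c=1.0, d=0.7, x=0.1, y=0.1):
    """Iterate a Clifford-type map (one common form; not necessarily
    the paper's) and return n successive x-values."""
    xs = np.empty(n)
    for i in range(n):
        x, y = (np.sin(a * y) + c * np.cos(a * x),
                np.sin(b * x) + d * np.cos(b * y))
        xs[i] = x
    return xs

def encrypt(img, key=0.1):
    """Confusion: permute pixels by ranking a chaotic sequence.
    Diffusion: additive modular arithmetic chained over the pixels."""
    flat = img.flatten().astype(np.int64)
    seq = clifford_sequence(flat.size, x=key)
    perm = np.argsort(seq)                            # key-dependent permutation
    shuffled = flat[perm]
    ks = (np.abs(seq) * 1e6).astype(np.int64) % 256   # keystream from the map
    out, prev = np.empty_like(shuffled), 0
    for i, p in enumerate(shuffled):
        prev = (p + prev + ks[i]) % 256               # additive modular diffusion
        out[i] = prev
    return out.reshape(img.shape).astype(np.uint8), perm

img = (np.arange(64).reshape(8, 8) % 256).astype(np.uint8)
cipher, perm = encrypt(img)
print(cipher[0])
```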
Research on Face Recognition Technique Based on Improved LBP Features
ZHAO Jian-ming,ZHU Xin-zhong,JIANG Xiao-hui
Computer Science. 2009, 36 (8): 276-280. 
Abstract PDF(412KB) ( 754 )   
RelatedCitation | Metrics
Recognition in uncontrolled situations is one of the main bottlenecks for practical face recognition, so a valid and powerful discriminant representation of facial appearance is crucial. Based on local binary patterns for extracting local texture, an improved method called local ternary patterns was introduced, which is more robust and more discriminative under illumination variations and noise. PCA and Fisher linear discriminant analysis are then used to reduce dimensionality and optimize discriminative classification, respectively. Combined with a simple and efficient image preprocessing chain, the method was tested and evaluated on both the JDL and AR datasets; the promising results show that the method is valid and feasible.
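The local ternary pattern itself can be sketched compactly; the following assumes the usual formulation, in which each 3x3 neighbourhood splits into an upper and a lower binary code around a tolerance zone of width 2t.

```python
import numpy as np

def local_ternary_pattern(img, t=5):
    """Neighbour > centre + t -> 1 in the upper code; neighbour <
    centre - t -> 1 in the lower code; otherwise 0 in both. The
    tolerance zone [-t, t] is what makes LTP less noise-sensitive
    than LBP."""
    h, w = img.shape
    img = img.astype(np.int32)
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    upper = np.zeros((h - 2, w - 2), np.int32)
    lower = np.zeros((h - 2, w - 2), np.int32)
    centre = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        upper |= (nb > centre + t).astype(np.int32) << bit
        lower |= (nb < centre - t).astype(np.int32) << bit
    return upper, lower

img = np.random.default_rng(1).integers(0, 256, (8, 8))
up, lo = local_ternary_pattern(img)
print(up[0], lo[0])
```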
Improved Adaptive FCM Color Image Segmentation Algorithm
YANG Hong-ying,WANG Xiang-yang,WANG Chun-hua
Computer Science. 2009, 36 (8): 281-284. 
Abstract PDF(322KB) ( 603 )   
RelatedCitation | Metrics
Fuzzy C-means (FCM) clustering is a well-known unsupervised clustering technique that has been widely used in automated image segmentation. However, the classical FCM algorithm has some problems when used for image segmentation, such as requiring the number of clusters to be set in advance and not considering the effects of various image features. An improved adaptive FCM image segmentation algorithm based on ReliefF was proposed, which considers the effects of various image features, incorporates a cluster validity index to determine the initial number of clusters automatically, and extracts image features according to Laws texture measures. Experimental results show that the proposed method is simple, works well for most images, and segments better than existing FCM image segmentation.
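For reference, a plain FCM iteration (the baseline the paper improves on; the ReliefF weighting and automatic cluster-number selection are not reproduced):

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, eps=1e-6, seed=0):
    """Plain fuzzy C-means: alternate the membership update
    u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)) and the weighted
    centre update until the centres stop moving."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                    # avoid division by zero
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        new = (u.T ** m @ X) / np.sum(u.T ** m, axis=1, keepdims=True)
        if np.linalg.norm(new - centres) < eps:
            centres = new
            break
        centres = new
    return centres, u

X = np.vstack([np.random.default_rng(2).normal(0, 0.3, (50, 2)),
               np.random.default_rng(3).normal(2, 0.3, (50, 2))])
centres, u = fcm(X, c=2)
print(centres)
```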
Feature Extraction Algorithm Based on Locality and Globality and its Application in Face Recognition
GHANG Guo-yin, LOU Song-jiang, CHENG Hui-jie, WANG Qing-jun
Computer Science. 2009, 36 (8): 285-287. 
Abstract PDF(231KB) ( 608 )   
RelatedCitation | Metrics
A feature extraction algorithm based on locality and globality was proposed, which takes both the locality and the globality of the data into consideration. Consequently, the data after dimension reduction not only preserve the locality relationship but also reconstruct and represent the original data well. PCA (Principal Component Analysis) represents the data well and LPP (Locality Preserving Projection) preserves the locality relationship, so the algorithm hybridizes the two. Because LPP does not utilize class information, the LPP-based part is first improved into a supervised version, making the algorithm more suitable for classification tasks. Experiments in face recognition validate the correctness and effectiveness of the proposed algorithm.
Application Research of SVM Integrating Fuzzy Theory in Image Affective Recognition
CHEN Jun-jie,ZHANG Da-wei,LI Hai-fang
Computer Science. 2009, 36 (8): 288-290. 
Abstract PDF(333KB) ( 840 )   
RelatedCitation | Metrics
This paper introduced FSVM, which integrates fuzzy theory into SVM, built a classification system that classifies images layer by layer up to the affective semantic level with FSVM, and proposed an image affective-semantics classification method. The difficulties are to establish a mapping from image features to image affective semantics and to select a fitting membership function for testing an image's semantic class. The experimental results show that the system is simple, fast, and effective, and succeeds in lifting image semantic classification to the affective semantic level.
Design of Adaptive Watermarking Algorithms for Video Signals Using Subjective Tests
SONG Xing-guang, ZHANG Chun-tian, SU Yu-ting
Computer Science. 2009, 36 (8): 291-295. 
Abstract PDF(418KB) ( 655 )   
RelatedCitation | Metrics
By introducing subjective tests, a subjectively optimized adaptive video watermarking algorithm was presented. The adaptive algorithm is based on measured visibility thresholds of the impairments introduced by watermarking, where the thresholds were determined by subjective tests. The watermarking scheme was constructed so that the embedding strength is maximized without exceeding the measured visibility thresholds. Experimental results show that the presented scheme achieves a higher detection rate than both a spatial non-adaptive watermarking scheme and a simple adaptive scheme based on a human visual system model.
License Plate Location Algorithm Based on Multi-phases
PIAN Zhao-yu, MENG Xiang-ping, ZHANG Hong
Computer Science. 2009, 36 (8): 296-299. 
Abstract PDF(332KB) ( 698 )   
RelatedCitation | Metrics
The location of the license plate plays a crucial role in a license plate recognition system. Based on the rich edge information and unique features of the license plate region, a practical and effective method of locating the plate was proposed. Firstly, the original image is transferred to HSI space, and rough locations are detected using the edge information of the characters and mathematical morphology operations; then the accurate license plate region is decided by an SOM using four features proposed in this paper. Experiments on 80 color images show that the locating success rate of the proposed algorithm is 96.25%.
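A sketch of the rough-location stage only, using OpenCV 4 and HSV as a stand-in for HSI; the four plate features and the SOM decision stage of the paper are not reproduced, and the file name below is hypothetical.

```python
import cv2
import numpy as np

def rough_plate_regions(bgr):
    """Rough location: vertical character edges + morphological closing,
    then candidate boxes filtered by a plate-like aspect ratio."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    v = hsv[:, :, 2]                                  # intensity channel
    edges = cv2.Sobel(v, cv2.CV_8U, 1, 0, ksize=3)    # vertical strokes
    _, binary = cv2.threshold(edges, 0, 255,
                              cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (17, 3))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    # OpenCV 4.x: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours]
    return [(x, y, w, h) for x, y, w, h in boxes
            if h > 0 and 2.0 < w / h < 6.0]           # plate-like shape

img = cv2.imread("car.jpg")          # hypothetical input image
if img is not None:
    print(rough_plate_regions(img))
```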
Research on the Characteristic of the Model of Self-servowriting in Hard Disk
ZHAO Xiao-gang, WANG Hai-wei, XIE Chang-sheng, LI Bo
Computer Science. 2009, 36 (8): 300-302. 
Abstract PDF(251KB) ( 800 )   
RelatedCitation | Metrics
Self-servowriting is a good method for writing servo information. This paper analyzed the characteristics of the servo model and shows that position precision is the most important criterion for the validity of self-servowriting. Three different controllers were designed to control the self-servowriting process. The experimental data show the importance of position precision in self-servowriting, and indicate that the robust controller can reject external disturbance and provide the required position precision during the self-servowriting process.