Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Volume 36 Issue 7, 16 November 2018
Review of Security Techniques for Mobile Ad Hoc Networks
ZHANG Peng,SUN Lei,CUI Yong,HAN Xiu-feng
Computer Science. 2009, 36 (7): 1-7.  doi:10.11896/j.issn.1002-137X.2009.07.001
Mobile Ad Hoc networks are particularly vulnerable due to their dynamically changing topology and wireless communication. Current security techniques for mobile Ad Hoc networks are divided into five categories: intrusion detection, securing routing protocols, architecture and models, key management techniques, and others. Using intrusion detection, nodes' behaviors were analyzed and detected based on particular mathematical models. Securing routing protocols include novel secure routing protocols and security enhancements of current routing protocols. These security techniques were introduced and analyzed.
Survey of Static Analysis Methods for Binary Code Vulnerability
TIAN Shuo, LIANG Hong-liang
Computer Science. 2009, 36 (7): 8-14.  doi:10.11896/j.issn.1002-137X.2009.07.002
A survey of static analysis methods for binary code vulnerabilities was provided. Based on a summary of existing static vulnerability detection methods, the general procedure of binary static analysis was modeled, and transforming program information into an expressive and generic intermediate representation was identified as the key component of the analysis and as an important research direction.
Survey on the Opinion Mining, Summarization and Retrieval
HOU Feng,WANG Chuan-ting,LI Guo-hui
Computer Science. 2009, 36 (7): 15-19.  doi:10.11896/j.issn.1002-137X.2009.07.003
The review texts with subjective sentiments on the Web are valuable for many applications, since they express the opinions, attitudes and standpoints of users. Opinion mining techniques process these subjective texts to automatically generate useful opinion summaries and knowledge. The background and applications of opinion mining were introduced first, and then the state of the art of opinion mining was presented in five subtasks: sentiment polarity identification of words, coarse-grained sentiment classification, fine-grained opinion mining and summarization, opinion retrieval, and language resources and application systems. The difficulties and trends of opinion mining were concluded finally.
Survey of Applying Support Vector Machines to Handle Large-scale Problems
WEN Yi-min,WANG Yao-nan, LU Bao-liang,CHEN Yi-ming
Computer Science. 2009, 36 (7): 20-25.  doi:10.11896/j.issn.1002-137X.2009.07.004
When applied to large-scale problems, support vector machines (SVMs) need longer training time and larger memory. The paper analyzed the limitations of SVMs, classified the algorithms for applying SVMs to large-scale problems into seven types, and made a profound and comprehensive analysis of each kind of algorithm. Moreover, some issues valuable for future exploration in this area were indicated and discussed.
Brief Report of Research on Cognizing the Subarea of Evolutionary Computation
LIU Kun-qi,KANG Li-sha,ZHAO Zhi-zhuo
Computer Science. 2009, 36 (7): 26-31.  doi:10.11896/j.issn.1002-137X.2009.07.005
It is very important for studying evolutionary computation and predicting its future developing directions that the subarea of evolutionary computation is scientifically cognized. The development mainlines, characteristics and inherent laws were reviewed and summarized, and the cognitive process for evolutionary computation was explained from the perspective of philosophy of science. A series of new concepts and ideas, for example the essential problem, typical methods and typical instances, were proposed and discussed. The research progress of methodology for evolutionary computation was presented in outline, and its influence on the research and education of computer science and technology was discussed.
Voxel-coding for Surface Reconstruction from Contours
WANG Ming-fu,ZHOU Yong
Computer Science. 2009, 36 (7): 32-39.  doi:10.11896/j.issn.1002-137X.2009.07.006
A complete contour-based reconstruction method must establish correspondence, solve branching problems, and construct tiles. Most modern reconstruction algorithms typically address only one or two of these problems. Therefore, they do not achieve complete solutions for complicated objects, such as the considerably convoluted and highly branched cortex of the human brain extracted from Magnetic Resonance Imaging (MRI) data. This paper presented an efficient voxel-coding algorithm which can handle complicated many-to-many branching and holes in a fully automatic and systematic way. First, the contours from adjacent slices are projected onto an intermediate plane and then divided into groups on the basis of their difference regions. For each group of contours, skeletons are extracted from the corresponding region. These skeletons are used to measure contour dissimilarity and to decompose dissimilar or complicated branching contours into simple and similar contour-to-skeleton pairs. The reconstructed surfaces are 2D manifold triangle meshes which pass through only the input contours along the slices. The algorithm has been tested on both hand-made data and real complex human cortical MRI data, demonstrating its efficiency.
New Evaluation Algorithm for the Network Two-terminal Reliability
HE Ming,QIU Hang-ping,LIU Yong
Computer Science. 2009, 36 (7): 40-41.  doi:10.11896/j.issn.1002-137X.2009.07.007
A new method for computing the network two-terminal reliability was presented. The basic idea of the algorithm relies on the notion of a frontal description of a graph. The originality of the present work lies in its application to the two-terminal and all-terminal reliability problems, with a close analysis of the complexity depending on the tree-width. Methods to optimize the mean time to repair of the components were also discussed. The method is valuable for optimizing networks and assigning the mean time to repair.
Traffic Matrix Estimation Algorithm Based on Square Root Filtering
YANG Yang,ZHOU Jing-jing,YANG Jia-hai,ZHAO Wei,XIONG Zeng-gang
Computer Science. 2009, 36 (7): 42-45.  doi:10.11896/j.issn.1002-137X.2009.07.008
The traffic matrix is one of the crucial inputs in many network planning and traffic engineering tasks, but it is usually impossible to measure traffic matrices directly. So it is an important research topic to infer the traffic matrix by reasonable modeling and by incorporating the limited empirical information. Of the proposed methods, Kalman Filtering is more efficient and accurate than many others. However, the error covariance calculation components of Kalman Filtering are difficult to implement in realistic systems due to ill-conditioning problems. The authors proposed the Square Root Filtering/Smoothing traffic matrix estimation (SRFsTME) algorithm to improve it, and also proposed a data pre-filtering method to reject "bad" data with considerable noise. Simulation and actual traffic testing results show that the SRFsTME algorithm is more numerically accurate and stable than Kalman Filtering.
Robust Temporal Trust Model Using Ant Colony Algorithm in the Multi-domain Environment
WEN Zhu-mu,LI Rui-xuan,LU Zheng-ding,FENG Ben-ming,TANG Zhuo
Computer Science. 2009, 36 (7): 46-51.  doi:10.11896/j.issn.1002-137X.2009.07.009
This paper introduced a model for time-based temporal trust. Trust in a multi-domain environment is uncertain and varies with various factors. Every domain is endowed with a trust vector, which characterizes the trust intensity between this domain and the others. Since the trust intensity is dynamic due to time and the inter-operation between two domains, a method was proposed to quantify this change based on the idea of the ant colony algorithm, and then an algorithm for the transfer of trust relations was also proposed. Furthermore, this paper analyzed the influence on the trust intensity among all entities caused by the change of trust intensity between two entities, and presented an algorithm to resolve the problem. Finally, the process of trust change caused by the lapse of time and by inter-operation was shown through simulation experiments.
Efficient Secure Routing Scheme for Wireless Sensor Networks
YAO Xuan-xia,GHENG Xue-feng,ZHOU Fang
Computer Science. 2009, 36 (7): 52-55.  doi:10.11896/j.issn.1002-137X.2009.07.010
In order to realize secure and efficient routing in wireless sensor networks, a local trust model was built. In this local trust model, the trust value of a node is determined by its packet delivery ratio, its distance to the target sink node and its residual energy. Using this local trust model and multiple criteria decision making technology, the next hop is chosen on the basis of the residual energies, trust values, direction factors and distances to the target sink node of its neighbors. At the same time, encryption and authentication mechanisms based on symmetric cryptography are used between neighboring nodes to ensure secure packet forwarding. The proposed routing scheme can not only prevent most routing attacks and realize secure routing, but also balance the lifetime of the network and the efficiency of routing well to achieve efficient routing.
Secure Image Steganography Based on Step-varying Quantization
HE Jun-hui,TANG Shao-hua,XING Yi-bo
Computer Science. 2009, 36 (7): 56-59.  doi:10.11896/j.issn.1002-137X.2009.07.011
Image steganography transfers messages secretly by embedding them into a cover image to implement covert communication. We presented an image steganographic algorithm called VSQS. A secret-key-dependent Gaussian sequence is generated and then rounded, which is used for the step-varying quantization of the pseudo-randomly permuted image pixels. Depending on the relationship between the quantized pixels and the secret message, the image pixels are modified to embed the message. Furthermore, an extension of this algorithm (called TLQS) was presented. Experimental results show that both the VSQS and TLQS image steganographic techniques can provide higher capacity and resist several well-known steganalytic methods.
Design and Implement of Wireless Sensor Network Medium Access Control Protocol
SHI Wei-ren, FENG Hui-wei,TANG Yun-jian
Computer Science. 2009, 36 (7): 60-62.  doi:10.11896/j.issn.1002-137X.2009.07.012
For applications in which the number of nodes is not massive and real-time communication is not critical, a highly effective, low-energy-consumption wireless sensor network MAC protocol based on the IEEE 802.15.4 standard was put forward. The software mainly makes use of interrupt control and function callback. The physical layer provides wireless transceiver management to support low energy consumption. The MAC layer adopts slotted CSMA-CA to control node channel access. The nodes include common nodes and coordinators. Common nodes adopt a poll-and-sleep mechanism and implement indirect data transmission with the coordinator. The experimental results show that the application system achieves the targets of low energy consumption and high effectiveness.
Network Security Situation Assessment Based on WOWA-FAHP
LU Zhen-bang,ZHOU Bo
Computer Science. 2009, 36 (7): 63-67.  doi:10.11896/j.issn.1002-137X.2009.07.013
For the practical purposes of intrusion response decision-making and security management, a Fuzzy Analytic Hierarchy Process approach based on Weighted Ordered Weighted Averaging aggregation (WOWA-FAHP) and a network security situation assessment model based on WOWA-FAHP were proposed. Besides preserving the merits of the FAHP, the WOWA-FAHP approach takes into account both objective and subjective associations among the attributes, and is able to adapt to various decision preferences. The assessment model based on WOWA-FAHP combines static and dynamic assessments; utilizes multiple information sources, such as system security risk evaluation, intrusion alert fusion and correlation, anomaly monitoring and security audit; considers multiple aspects, such as intrusion alerts, anomalies, vulnerabilities and attack effects; and handles the complex relations among the factors with the WOWA-FAHP approach according to different security policies. The effectiveness of the proposed approach and model is illustrated via an actual security situation assessment for a network application service system.
Energy Level and Link State-based AODV Route Request Forwarding Scheme Study
HAO Ju-tao,ZHAO Jing-jing,LI Ming-lu
Computer Science. 2009, 36 (7): 68-70.  doi:10.11896/j.issn.1002-137X.2009.07.014
Ad-hoc On-demand Distance Vector (AODV) provides a scalable and effective solution for packet routing in mobile wireless ad hoc networks. However, the path generated by this protocol may deviate far from optimal because node power, load balance and link state are not considered, so that energy consumption across nodes is unbalanced and the lifetime of the whole network is reduced. Based on the basic AODV protocol, an improved protocol was presented. When selecting a route, the improved protocol considers node power, load balance and the link state between nodes, and can thus improve network performance. Through simulation on NS2, it is confirmed that the improved AODV protocol is more energy-efficient than AODV and has a higher packet delivery rate, lower end-to-end delay and lower routing load.
NS Extension for Network Coding
LI Ling-xiong,HONG Jiang-shou,LONG Dong-yang
Computer Science. 2009, 36 (7): 71-73.  doi:10.11896/j.issn.1002-137X.2009.07.015
Network coding has received extensive research attention, but simulation research on popular network simulators is scarce. Most current network simulators were designed to simulate traditional networks, where nodes can only copy and forward packets. In the paradigm of network coding, by contrast, nodes can perform coding operations on packets. An extension for network coding is therefore fundamental for further simulation study. We first analyzed the structure of NS and identified the parts that should be modified to support network coding. Then we proposed and implemented the extension. At last, simulation experiments on a basic network were performed to verify the availability of the extension.
Algorithms of Multi-sensor Data Fusion for the Mud Pulse Signal
LI Chuan-wei,MU De-jun,LI An-zong
Computer Science. 2009, 36 (7): 74-75.  doi:10.11896/j.issn.1002-137X.2009.07.016
According to the acquisition and transmission conditions while drilling, the idea of detecting the signals by means of multiple sensors was proposed to get more accurate data, and the reliability of sensors and the relation matrix between multiple sensors were also researched based on fuzzy theory. The optimal methods of data fusion based on synthesis were presented to process both example and real data. The processing results imply that the method can reduce the uncertainty and noise encountered in single-sensor detection, which ensures its application in measurement-while-drilling systems in oil fields.
Novel VPN Authentication Scheme Based on Trusted Computing
QIU Gang,WANG Yu-lei,ZHOU Li-hua
Computer Science. 2009, 36 (7): 76-78.  doi:10.11896/j.issn.1002-137X.2009.07.017
Platform security is particularly important in the case of remote access to corporate resources. Today, Virtual Private Network (VPN) client authentication mostly focuses on the identity of the end-user and platform without ensuring the trust properties of the platform the end-user is operating; an attacker could exploit this to gain unauthorized access. A scheme based on the combination of a smart card and a Trusted Platform Module (TPM) can secure the identity authentication of the end-user's platform and assure the security of network connections.
Fractional Autoregressive Prediction for Long Range Bursty Traffic
WEN Yong,ZHU Guang-xi,XIE Chang-sheng
Computer Science. 2009, 36 (7): 79-81.  doi:10.11896/j.issn.1002-137X.2009.07.018
The traffic of data packet transmission in various network conditions exhibits convincing self-similarity, causing long range burstiness which cannot be captured by traditional telecommunication traffic models based on Poisson or Markov processes. Updated explicit high-resolution measurements and research on the traffic reveal that the heavy-tailedness existing extensively in the network brings about the self-similarity of the traffic. Extracting information from the self-similarity and long range dependence is the key factor for the exact prediction of long range bursty traffic. Two distinctive autoregressive predictors based on the α-stable self-similar traffic model were presented. The predictors, FAR (Fractional AutoRegressive) and FNAR (Fractional Nonlinear AutoRegressive), can minimize the dispersion according to criteria with infinite variance. The final predicted values were obtained by combining the two individual predicted values for higher prediction precision.
Secure(t,n) Threshold Proxy Signature Scheme Without a Trusted Party
YAN De-qin, ZHAO Hong-bo
Computer Science. 2009, 36 (7): 82-84.  doi:10.11896/j.issn.1002-137X.2009.07.019
A secure threshold proxy signature scheme was proposed. In a conspiracy attack, any t (t is the threshold value) or more malicious proxy signers may work together to reconstruct the secret polynomial of the proxy group and derive the secret keys of other members in the proxy group; consequently they can impersonate some other proxy signers to generate a valid threshold proxy signature. A large number of previously proposed schemes require a trusted party, so the trusted party becomes the attacked part. The proposed scheme can withstand the conspiracy attack: any t signers still cannot learn the secret keys of other members in the proxy group, so they cannot impersonate other proxy signers to generate a valid threshold proxy signature. Furthermore, against t malicious signers impersonating the original signer, the original signer uses a second authorization, which only needs one secret channel between the original signer and the generation signer. The proxy signers cannot be distinguished by their proxy signatures. In the scheme, each participant's public and private keys and the group public key are negotiated among all participants with no trusted party required. By avoiding attacks on the trusted party and cheating between proxy signers, the scheme is more secure and efficient than other schemes.
Multi-step Network Delay Prediction Model Based on RNN
HU Zhi-guo,ZHANG Da-lu,HOU Cui-ping,SHEN Bin,ZHU An-qi
Computer Science. 2009, 36 (7): 85-87.  doi:10.11896/j.issn.1002-137X.2009.07.020
Network delay is an important performance metric of IP networks which reflects a network path's workload characteristics. Precise prediction of network delay is an important basis for congestion control and route selection. A new multi-step prediction method for network delay based on random neural networks was proposed; this method overcomes the disadvantages of the traditional time-series method and neural network method. Compared with the traditional RBF network and AR model, the experimental results indicate that the proposed model has better accuracy for single-step and multi-step prediction.
Design and Implement on a Live Media Streaming System Based on Peer-to-Peer and CDN
REN Li-yong,WANG Tao, DUAN Han-cong,ZHOU Xu
Computer Science. 2009, 36 (7): 88-91.  doi:10.11896/j.issn.1002-137X.2009.07.021
Current technologies adopted by live media streaming systems were compared and analyzed, and several problems with these technologies were pointed out. To solve the problems of existing live media streaming systems, this paper, combining the merits of CDN and P2P network architectures, proposed and implemented a new live media streaming system which improves the network topology of traditional content distribution networks. Solutions for critical issues were given, such as the user peer management strategy, buffer management strategy and data scheduling scheme, and the merits of this new live media streaming system were analyzed. Simulation studies show that, in large-scale network environments, the proposed system has better performance than current existing P2P live media streaming systems.
Method of Modeling Non-functional Properties in Software Architecture
ZHANG Lin-lin,YING Shi,ZHAO Kai,WEN Jing,NI You-cong
Computer Science. 2009, 36 (7): 92-96.  doi:10.11896/j.issn.1002-137X.2009.07.022
How to address non-functional properties in software systems has afflicted various stakeholders for a long time, and is one of the key points in software engineering. Aiming at the early stage of software architecture design, this paper proposed a new method for modeling non-functional properties, which employs the principle of multi-dimensional separation of concerns (MDSoC), and proposed a model named "1+X" for handling them. Based on this model, multiple dimensions of non-functional properties were classified, as well as the concerns of non-functional properties for each dimension. Finally, both the non-functional property dimensions and the concerns were specified using XMI. The research work in this paper lays the groundwork for aspect-oriented software architecture design, since the handled concerns of non-functional properties can be directly encapsulated in aspectual components. In addition, this method provides support for architects, and its outputs can be directly used in software architecture design for various domains.
Dynamic Substitution of Web Services through Stateful Aspect Extension
DOU Wen-sheng,WU Guo-quan,WEI Jun,LIU Shao-hua
Computer Science. 2009, 36 (7): 97-102.  doi:10.11896/j.issn.1002-137X.2009.07.023
As service-oriented computing technology matures, service composition has become a new model for business cooperation among enterprises in the Internet era. WS-BPEL is the de-facto standard for service composition. However, because Web services are loosely coupled, distributed and autonomous, the reliability of BPEL processes cannot be assured when partner services fail. As a result, dynamic runtime substitution of partner services is needed, but BPEL provides only limited service substitution, and when the partner services are stateful, the problem gets more complex. This paper presented a stateful aspect extension of WS-BPEL. Stateful aspects are used to store the conversational details of partner services. In case of partner service failure, the conversational details can be replayed on another equivalent partner service transparently for the BPEL process. Our approach gives WS-BPEL some self-healing ability and improves the reliability of processes.
Self-adaptive Middleware in Ubiquitous Computing Environments
HE Jian-li,CHEN Rong,KANG Qin-ma
Computer Science. 2009, 36 (7): 103-106.  doi:10.11896/j.issn.1002-137X.2009.07.024
The inherent complexity of ubiquitous computing poses many new challenges for software infrastructure, which urgently needs adaptive middleware. This paper proposed a self-adaptive middleware for ubiquitous computing environments, which is composed of three meta-models, namely the interface, framework and context meta-models. Its design and implementation on the CAR component platform were presented. The interface meta-model provides access to the services and the internal information and state of a component in terms of asynchronous and synchronous interfaces. The context meta-model represents context information in component objects and masks distribution problems in the ubiquitous computing environment based on asynchronous event notification. The framework meta-model classifies and manages components, and makes changes to middleware structures and behaviors according to the changing runtime environment. As a result, the software system meets the dynamic self-adaptive requirements of ubiquitous computing environments.
A Kind of Semantic Service Search Method Driven by Natural-like Language
ZHANG Gui-gang
Computer Science. 2009, 36 (7): 107-112.  doi:10.11896/j.issn.1002-137X.2009.07.025
The traditional method of searching for services requires programmers to program according to a fixed format. Actually, most people, especially non-programmers, have great difficulty mastering it, so the traditional search method cannot satisfy non-programmers' requirements. To overcome these limitations, the paper gave a semantic service search framework driven by natural-like language. A series of natural-like description rules for semantic service requests were defined, and a new semantic service description framework, RDF4S (Resource Description Framework For Service), was given. This paper analyzed the natural-like language and the core technologies of surface search and deep search for semantic services driven by natural-like language.
Transformation from UML Model to FSM Model
GUO Liang,MIAO Huai-kou,WANG Xi,CHEN Sheng-bo
Computer Science. 2009, 36 (7): 113-116.  doi:10.11896/j.issn.1002-137X.2009.07.026
Various UML diagrams can be used for modeling different aspects of a Web application. Testing and verifying the models of a Web application would be complex, since models described by different diagrams have to be considered separately. By transforming each UML diagram into an FSM model, we can apply a uniform method for representing, verifying and testing all the UML models of the Web application. This paper proposed a method for transformation from UML models to FSM models based on a state-transition property preservation rule. As we focused on the transformation from UML state diagrams to FSM models, a mechanism for transforming three basic units of the state diagram into corresponding FSM models was presented. Finally, a prototype tool named UML2FSM was presented.
Method of Conformance Test Case Generation for C/S Model System Based on UML State Chart
YE Xin-min,WANG Pu-xin,BAI Xiang-yu,XIE Hui
Computer Science. 2009, 36 (7): 117-119.  doi:10.11896/j.issn.1002-137X.2009.07.027
Systems based on the C/S model have characteristics such as multi-layer architecture and object-oriented programming technology, which make it difficult to carry out conformance testing for this kind of system. This paper presented a model for a typical C/S system based on UML state charts, and then transformed it into an EFSM. A method combining UIO and the Chinese postman algorithm was applied to the EFSM to generate conformance test cases. Finally, we adopted data-flow-based analysis technology to exclude the test cases that are not executable. The UML state chart is good at modeling software systems, so we utilized this advantage to reduce the difficulty of test case generation, and to decrease the length of test cases and the cost of software development.
Combining Object-oriented Programming and Aspect-oriented Programming for Software Product Line Implementation
ZHU Jia-yi,PENG Xin,ZHAO Wen-yun
Computer Science. 2009, 36 (7): 120-123.  doi:10.11896/j.issn.1002-137X.2009.07.028
As one of the most popular software development technologies, OOP (Object-Oriented Programming) does provide certain mechanisms for the implementation of software product line variabilities. However, OOP does not support crosscutting features and optional feature interactions well. Therefore, some researchers introduced AOP (Aspect-Oriented Programming) into the implementation of software product lines. AOP can not only separate crosscutting concerns, but also provide flexible support for the configuration of optional feature interactions through the separation of dependencies. Therefore, combining OOP and AOP in product line implementation can greatly promote the reusability, adaptability and configurability of product line assets. This paper explored the combined OOP and AOP implementation method for software product lines based on an analysis of related problems, and then presented a case study on a reward-offering software product line for validation, with related analysis and discussion.
Approach to Modeling and Testing Web Applications Based on Functional Components
TANG Yun-ji,MIAO Huai-kou,QIAN Zhong-sheng
Computer Science. 2009, 36 (7): 124-127.  doi:10.11896/j.issn.1002-137X.2009.07.029
There are great differences between Web applications and traditional programs, and the modeling and testing methods of traditional programs cannot be fully applied to Web applications. An approach to generating test cases effectively was presented. A Web application is divided into a set of functional components, each of which corresponds to an actual Web application module. A directed graph is employed to represent the structural relationship among the functional components, an FSM is used to represent their behavioral relationship, and FSM composition is used to represent their interactions. For testing, two test criteria, complete executing sequence coverage and component complete executing sequence coverage, were proposed. To satisfy these two criteria respectively, different test sets were generated. Additionally, a test case generation prototype was designed for the proposed approach.
Service Contract-oriented Design Approach to Developing Adaptable Services
LI Jun-huai,ZHANG Jing,ZHANG Zhuo-bin
Computer Science. 2009, 36 (7): 128-130.  doi:10.11896/j.issn.1002-137X.2009.07.030
Web Services are fit for distributed, loosely-coupled application environments. A service contract-oriented Web service adapter was presented in order to solve problems in enterprise application system development. We also introduced some key issues about the service contract analysis engine, the skeleton code generation engine and the Web-based WSDL editor in the Web service adapter. The Web service adapter designed in this paper was compared with related tools, and the results showed that our method is more interactive and stable.
Rejuvenation Model of Server Cluster with Fluid Stochastic Petri Net
DU Xiao-zhi,QI Yong,HOU Di,LIU Liang
Computer Science. 2009, 36 (7): 131-134.  doi:10.11896/j.issn.1002-137X.2009.07.031
Abstract PDF(285KB) ( 608 )   
Related Articles | Metrics
Server clustering is a common method to improve the QoS and availability of a system. But due to the need for long-term continuous operation to provide services, software aging occurs in cluster systems, which causes the system service rate to decrease and the failure rate to increase over time. In this paper, the effects of both software aging and failure were considered, and a fluid stochastic Petri net was adopted to model the working pattern of a cluster system. Then analysis methods for the probability of each system state were given, and formulas for system availability and throughput were presented. Availability and throughput were taken as the performance metrics of the cluster system, and some numerical experiments were done. The results show that the method introduced in this paper can describe the system well and is extensible. Using this method, the optimal rejuvenation period can be obtained.
Approach to QoS Driven Component Composition of Enterprise Software and Application
MENG Fan-chao,CHU Dian-hui,ZHAN De-chen,XU Xiao-fei
Computer Science. 2009, 36 (7): 135-140.  doi:10.11896/j.issn.1002-137X.2009.07.032
Abstract PDF(507KB) ( 600 )   
Related Articles | Metrics
Aiming at the problems of current component assembly project selection technology, an approach to component composition selection based on QoS optimization, oriented mainly to the configuration management of large and complex enterprise software and application systems, was proposed. The essence of this approach is that the problem of component composition project selection is transformed into a multi-objective optimization problem with constraints. A genetic algorithm based on a vector encoding scheme was presented for component composition project selection. The vector encoding scheme can easily express the connection relationships between interfaces in the component composition model, which cannot be expressed in other encoding schemes. Finally, experimental results show the feasibility and efficiency of the approach.
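The vector-encoding idea described above can be illustrated with a small sketch: each gene selects one candidate component for one interface slot, and fitness combines QoS with a constraint penalty. The QoS table, budget and penalty weight below are entirely hypothetical, and the GA is a generic one, not the paper's algorithm.

```python
import random

# Hypothetical QoS table: CANDIDATES[i][j] = (reliability, cost) of the j-th
# candidate component for the i-th interface slot of the composition.
CANDIDATES = [
    [(0.90, 5.0), (0.95, 8.0), (0.99, 12.0)],
    [(0.85, 3.0), (0.92, 6.0)],
    [(0.88, 4.0), (0.97, 9.0), (0.93, 7.0)],
]
BUDGET = 22.0  # illustrative cost constraint

def fitness(chrom):
    """QoS objective with a penalty term for violating the cost constraint."""
    rel, cost = 1.0, 0.0
    for slot, gene in enumerate(chrom):
        r, c = CANDIDATES[slot][gene]
        rel *= r          # overall reliability of a sequential composition
        cost += c
    penalty = max(0.0, cost - BUDGET)   # constraint violation
    return rel - 0.1 * penalty

def evolve(pop_size=20, generations=50, seed=1):
    """Generic GA over vector chromosomes (one gene per interface slot)."""
    rng = random.Random(seed)
    sizes = [len(c) for c in CANDIDATES]
    pop = [[rng.randrange(s) for s in sizes] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(sizes))    # single-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # point mutation
                i = rng.randrange(len(sizes))
                child[i] = rng.randrange(sizes[i])
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Here the multi-objective aspect is collapsed into a single weighted fitness for brevity; the paper's encoding additionally captures interface connection relationships, which this sketch does not.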
Fuzzy Logic Based Metric in Software Testing
TU Ling,ZHOU Yan-hui,ZHANG Wei-qun,ZHOU Ya-zhou
Computer Science. 2009, 36 (7): 141-144.  doi:10.11896/j.issn.1002-137X.2009.07.033
Abstract PDF(308KB) ( 621 )   
Related Articles | Metrics
How to provide cost-effective strategies for software testing has long been one of the research focuses in software engineering. Many researchers have addressed the effectiveness and quality metrics of software testing, and many interesting results have been obtained. However, one issue of paramount importance in software testing, the intrinsically imprecise and uncertain relationships within testing metrics, has been left unaddressed. To this end, a new quality and effectiveness measurement based on fuzzy logic was proposed. Software quality features and analogy-based reasoning were discussed, which can deal with quality and effectiveness consistency between different test projects. Experimental results were also provided to verify the proposed measurement.
Novel Prefix Encoding Scheme Based on Layered Structure
XU Juan,LI Zhan-lin,KE Xi-lin
Computer Science. 2009, 36 (7): 145-149.  doi:10.11896/j.issn.1002-137X.2009.07.034
Abstract PDF(395KB) ( 650 )   
Related Articles | Metrics
Most XML query strategies are based on prefix schemes [1-4]. By analyzing the current prefix schemes, we proposed a novel prefix encoding scheme with a layered structure. The new scheme has a relatively smaller mean code length, and the code length does not increase with the depth of the XML document. Another advantage of this scheme is that query processing is accelerated, because fewer component code comparisons are needed for XPath query axis computation with the smaller code length. Extensive theoretical analysis and experimental results show that this scheme both accelerates query processing and saves storage space for the codes.
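The abstract does not spell out the layered scheme itself, but the axis computations it speeds up can be illustrated with a classic Dewey-style prefix encoding, where component-wise prefix comparison answers XPath axis tests:

```python
def is_ancestor(code_a, code_b):
    """In a Dewey-style prefix scheme, node a is an ancestor of node b
    iff a's code is a proper component-wise prefix of b's code.
    (Component-wise comparison avoids the string-prefix trap where
    "1.2" would wrongly match "1.20".)"""
    a, b = code_a.split('.'), code_b.split('.')
    return len(a) < len(b) and b[:len(a)] == a

def is_parent(code_a, code_b):
    """Parent axis: ancestor exactly one level up."""
    return (is_ancestor(code_a, code_b)
            and len(code_b.split('.')) == len(code_a.split('.')) + 1)

def is_following_sibling(code_a, code_b):
    """b follows a under the same parent: equal prefix, larger last component."""
    a, b = code_a.split('.'), code_b.split('.')
    return len(a) == len(b) and a[:-1] == b[:-1] and int(b[-1]) > int(a[-1])
```

Dewey codes grow with document depth, which is exactly the weakness the paper's layered encoding targets; the sketch only shows how prefix comparisons implement the axes.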
Research on a Spatial Join Query with Keyword Search
CHEN De-hua,LIU Liang-xu,LE Jia-jin
Computer Science. 2009, 36 (7): 150-152.  doi:10.11896/j.issn.1002-137X.2009.07.035
Abstract PDF(323KB) ( 618 )   
Related Articles | Metrics
In recent years, many real applications have been required to support both spatial joins and keyword-based search. Motivated by such requirements, this paper defined a new kind of Spatial Join query with Keyword Search (SJKS) which combines keyword-based search and spatial join. To answer SJKS efficiently, this paper constructed IR2-Tree (Information Retrieval R-Tree) indexing structures on the datasets, and presented an IR2-Tree-based SJKS processing algorithm named IR2-TreeSJKS. The experimental results show that IR2-TreeSJKS is effective for spatial join queries with keyword search.
Study on the OWL Ontology Construction Approach Based on Relational Databases
LU Yan-hui,MA Zong-min,WANG Yu-xi
Computer Science. 2009, 36 (7): 153-156.  doi:10.11896/j.issn.1002-137X.2009.07.036
Abstract PDF(405KB) ( 902 )   
Related Articles | Metrics
How to make use of existing data resources to construct ontologies (semi-)automatically is one of the tasks of achieving the Semantic Web. By analyzing the current research results and their deficiencies, a systematic approach to OWL ontology construction based on RDB was presented. The processing steps of extracting semantic information from RDB, including entities, relations, inheritance, aggregation and cardinality ratio constraints, were described, and the transformation from RDB semantic information to the corresponding ontology elements was accomplished. A prototype system implementation demonstrates the effectiveness of the approach.
Flight Delay Propagation Analyzing and Predicting System of Civil Aviation of China Based on SOA
XU Tao,RONG Yao,WANG Jian-dong
Computer Science. 2009, 36 (7): 157-160.  doi:10.11896/j.issn.1002-137X.2009.07.037
Abstract PDF(366KB) ( 679 )   
Related Articles | Metrics
To provide a software foundation for predicting flight delay, a flight delay propagation analyzing and predicting system was designed and implemented based on SOA. On the server side, the flight delay propagation analyzing and predicting algorithm was encapsulated in a Web service, and a graphic component package was offered to automatically generate simulation graphs of flight delay propagation: DAGs, advanced Petri nets, Bayesian networks, parallel cellular automata, and statistical graphs such as bar graphs. The client side was implemented with the Magix AJAX framework. XML is used as the data exchange medium among modules to gain good interoperability. The system can be used to accomplish flight delay propagation analysis and prediction for single flight plans, multiple flight plans, airports and airlines. It is shown that the system can effectively aid flight delay propagation analysis and prediction.
Concept Lattice Based on Dominance Relations
WANG Jun-hong,LIANG Ji-ye,QU Kai-she
Computer Science. 2009, 36 (7): 161-163.  doi:10.11896/j.issn.1002-137X.2009.07.038
Abstract PDF(275KB) ( 726 )   
Related Articles | Metrics
Information systems based on dominance relations are a kind of widely used information system in real life. The formal context based on dominance relations was discussed. Three kinds of poset, the objects poset, the attributes poset and the object-attributes poset, were established, and the partial order in these posets was discussed. Then, the definition and building method of the concept lattice based on dominance relations were proposed. These conclusions further enrich concept lattice theory, and provide a new way of thinking about rule extraction for information systems based on dominance relations.
Fast Collision Detection Algorithm for Spherical Blend Reconstruction
ZHAO Wei,LI Wen-hui
Computer Science. 2009, 36 (7): 164-169.  doi:10.11896/j.issn.1002-137X.2009.07.039
Abstract PDF(441KB) ( 707 )   
Related Articles | Metrics
Fast collision detection is necessary in order to resolve interactions between a virtual character and its environment. We presented a novel collision detection algorithm based on spherical blend skinning. A procedure was presented for refitting bounding spheres for spherical blend skinning with sublinear time complexity, in which a rotation bound is constructed by a quaternion. This refitting operation is an extension of the refitting for linear blending, accomplished by decomposing the spherical blending into a linear one via the rotational component. Although it is of course a little more difficult than in the linear case, the resulting algorithm is almost as easy to implement, and its computational complexity is almost the same as that of the linear version.
Video Semantic Content Analysis Based on Ontology
BAI Liang,LIU Hai-tao,LAO Song-yang,Bu Jiang
Computer Science. 2009, 36 (7): 170-174.  doi:10.11896/j.issn.1002-137X.2009.07.040
Abstract PDF(515KB) ( 739 )   
Related Articles | Metrics
The rapid increase in the available amount of video data is creating a growing demand for efficient methods for understanding and managing it at the semantic level. New multimedia standards, such as MPEG-4 and MPEG-7, provide the basic functionalities for manipulating and transmitting objects and metadata. Importantly, however, most of the content of video data at the semantic level is beyond the scope of these standards. A video semantic content analysis framework based on ontology was presented. Domain ontology was used to define high-level semantic concepts and their relations in the context of the examined domain, and low-level features (e.g. visual and aural) and video content analysis algorithms were integrated into the ontology to enrich video semantic analysis. OWL was used for the ontology description. Rules in Description Logic were defined to describe how features and algorithms for video analysis should be applied according to different perception content and low-level features. Temporal Description Logic was used to describe the semantic events, and a reasoning algorithm was proposed for event detection. The proposed framework was demonstrated in the soccer video domain and shows promising results.
Metaheuristic Strategy Based K-Means with the Iterative Self-Learning Framework
LEI Xiao-feng,YANG Yang,ZHANG Ke,XIE Kun-qing,XIA Zheng-yi
Computer Science. 2009, 36 (7): 175-178.  doi:10.11896/j.issn.1002-137X.2009.07.041
Abstract PDF(327KB) ( 749 )   
Related Articles | Metrics
Clustering problems based on minimizing the sum of intra-cluster squared errors are known to be NP-hard. The iterative relocating method used by K-Means is essentially a kind of local hill-climbing algorithm, which eventually finds a locally minimal solution and is therefore very sensitive to the initial representatives. A meta-heuristic strategy was introduced to minimize the squared-error criterion globally. Firstly, an evaluation function was built to approximate the dependency between a series of initial representatives of K-Means and the local minimum of the objective criterion, and then the selection of initial representatives for the next K-Means run was done under the supervision of the evaluation function. This iterative and self-learning process is called the Meta K-Means algorithm. The experimental demonstrations show that Meta K-Means can overcome K-Means's sensitivity to initial representatives to a great extent.
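As a rough illustration of the idea, the sketch below runs plain K-Means from several candidate sets of initial representatives and keeps the set whose run reaches the lowest squared error. The paper's learned evaluation function is replaced here by simple restart selection, so this is an analogy for the outer loop, not the Meta K-Means algorithm itself.

```python
import random

def kmeans(points, seeds, iters=20):
    """Plain K-Means from the given initial representatives.
    Returns (centers, sum of intra-cluster squared error)."""
    centers = [list(s) for s in seeds]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                      for d in range(len(p))))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # empty clusters keep their previous center
                centers[j] = [sum(p[d] for p in cl) / len(cl)
                              for d in range(len(cl[0]))]
    sse = sum(min(sum((p[d] - c[d]) ** 2 for d in range(len(p)))
                  for c in centers)
              for p in points)
    return centers, sse

def meta_kmeans(points, k, restarts=10, seed=0):
    """Outer loop: evaluate several initial representative sets and keep
    the best.  The paper supervises this choice with a learned evaluation
    function; plain restart selection stands in for it here."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        seeds = rng.sample(points, k)
        centers, sse = kmeans(points, seeds)
        if best is None or sse < best[1]:
            best = (centers, sse)
    return best
```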
Polynomial Smooth Classification Algorithm of Semi-supervised Support Vector Machines
LIU Ye-qing,LIU San-yang,GU Ming-tao
Computer Science. 2009, 36 (7): 179-181.  doi:10.11896/j.issn.1002-137X.2009.07.042
Abstract PDF(218KB) ( 631 )   
Related Articles | Metrics
In order to solve the nonconvex and nonsmooth problem of semi-supervised support vector classification, a polynomial smooth function was introduced in this paper to approximate the nonconvex objective function. The introduced polynomial function has high approximation accuracy in high-density regions of samples, while poorer approximation appears in low-density regions. The model was solved by the conjugate gradient method. Experimental results on artificial and real data show that the proposed algorithm can guarantee accuracy when the percentage of labeled samples is very low, that the accuracy does not improve obviously as the number of labeled samples increases, and that the performance of the proposed classifier is stable.
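The paper's particular polynomial is not reproduced in the abstract. As an illustration of what such a smoothing looks like, here is a standard C¹ piecewise-polynomial approximation of the plus function max(x, 0), the nonsmooth building block of SVM objectives; the smoothing parameter k is illustrative.

```python
def plus(x):
    """The nonsmooth plus function max(x, 0)."""
    return max(x, 0.0)

def smooth_plus(x, k=10.0):
    """C^1 piecewise-polynomial approximation of max(x, 0): a quadratic on
    [-1/k, 1/k], exact outside that interval.  Values and first derivatives
    match at the joins, and the maximum error is 1/(4k) at x = 0, so larger
    k gives a tighter fit."""
    if x >= 1.0 / k:
        return x
    if x <= -1.0 / k:
        return 0.0
    return (k / 4.0) * x * x + x / 2.0 + 1.0 / (4.0 * k)
```

Replacing the plus function by such a polynomial makes the objective differentiable, which is what allows a gradient-based solver like conjugate gradient to be applied.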
Fault Classifier of Rotating Machinery Based on Weighted Fuzzy Support Vector Data Description
ZHANG Yong,ZHANG Feng-mei,XIE Fu-ding,CHI Zhong-xian
Computer Science. 2009, 36 (7): 182-184.  doi:10.11896/j.issn.1002-137X.2009.07.043
Abstract PDF(320KB) ( 603 )   
Related Articles | Metrics
Based on the favorable classification performance of support vector data description (SVDD), and aiming at the problem of fault sample acquisition in rotating machinery fault diagnosis, this paper proposed a fuzzy support vector data description classifier based on positive and negative samples, which can be used to deal with the outlier sensitivity problem in traditional multi-class classification. This method considers the effect of negative samples on classification results, as well as that of the positive samples used in the traditional SVDD algorithm. Experimental results show that the proposed method can reduce the effect of outliers and yields a higher classification rate than other existing methods.
Kernel Principal Component Analysis Based on Feature Vector Selection
WU Hong-yan,HUANG Dao-ping
Computer Science. 2009, 36 (7): 185-187.  doi:10.11896/j.issn.1002-137X.2009.07.044
Abstract PDF(307KB) ( 744 )   
Related Articles | Metrics
Kernel principal component analysis (KPCA) is one of the multivariate statistical control methods for nonlinear chemical process fault diagnosis. This paper improves KPCA in two aspects. First, in order to improve the accuracy of KPCA for fault detection, a new method combined with wavelets was developed. Second, a feature vector selection (FVS) scheme was adopted to reduce the computational complexity of KPCA while preserving the geometrical structure of the data. Tennessee Eastman process (TEP) simulations were carried out to show the given approach's effectiveness in process monitoring.
Zernike Moments with Minimum Geometric Error and Numerical Integration Error
ZHANG Gang,MA Zong-ming
Computer Science. 2009, 36 (7): 188-192.  doi:10.11896/j.issn.1002-137X.2009.07.045
Abstract PDF(382KB) ( 685 )   
Related Articles | Metrics
Shape feature extraction and description is one of the important research topics in content-based image retrieval. The paper presented an approach using Zernike moments with minimum geometric error and numerical integration error for shape feature extraction and description. The approach maps the region of interest in an image into a unit disk, and the Zernike moments are formed by computing a projection of the mapped image onto the Zernike polynomials. Psychophysiological research results are also introduced into the computation of the Zernike moments to improve the retrieval performance of a system. Our experimental results show that, from the viewpoint of reconstruction, Zernike moments with minimum geometric error and numerical integration error are better than the traditional Zernike moment approaches. Viewed from retrieval, systems using the Zernike moments with minimum geometric error and numerical integration error have better retrieval performance than systems using the traditional Zernike moments.
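The projection onto Zernike polynomials mentioned above rests on the radial polynomials R_{n,m}. A direct implementation of their standard closed form (independent of the paper's error-minimizing refinements) is:

```python
from math import factorial

def zernike_radial(n, m, rho):
    """Radial polynomial R_{n,m}(rho) of the Zernike basis, defined for
    |m| <= n with n - |m| even (zero otherwise).  Zernike moments project
    the image, mapped onto the unit disk, onto R_{n,m} multiplied by a
    circular harmonic exp(-i*m*theta)."""
    m = abs(m)
    if (n - m) % 2 != 0:
        return 0.0
    total = 0.0
    for s in range((n - m) // 2 + 1):
        coeff = ((-1) ** s * factorial(n - s)
                 / (factorial(s)
                    * factorial((n + m) // 2 - s)
                    * factorial((n - m) // 2 - s)))
        total += coeff * rho ** (n - 2 * s)
    return total
```

The geometric and integration errors the paper minimizes arise when this continuous projection is discretized over square pixels that only approximate the unit disk; the closed form itself is unchanged.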
Multi-strategy Based Single Document Question Answering
DU Yong-ping, HE Ming
Computer Science. 2009, 36 (7): 193-196.  doi:10.11896/j.issn.1002-137X.2009.07.046
Abstract PDF(335KB) ( 668 )   
Related Articles | Metrics
Single document question answering is also called Reading Comprehension (RC): it attempts to understand a document and return an answer sentence when posed a question. We proposed an approach that adopts multiple strategies and utilizes external knowledge to improve RC performance, including pattern matching with Web-based answer patterns, lexical semantic relation inference and context assistance. This approach gives improved RC performance on the Remedia corpus. The effectiveness of the different strategies was analyzed, and pairwise t-tests show that the performance improvements due to the Web-derived answer patterns and the lexical semantic relation inference technique are statistically significant. In addition, the performance impact of co-reference resolution was also discussed. Finally, a comparison between the task of RC and multi-document question answering (QA) was presented.
Weigh Matrix Based Solution Framework for Term Proximity Information Retrieval
QIAO Ya-nan,QI Yong,SHI Yi,HOU Di,WANG Xiao
Computer Science. 2009, 36 (7): 197-201.  doi:10.11896/j.issn.1002-137X.2009.07.047
Abstract PDF(402KB) ( 722 )   
Related Articles | Metrics
Traditional information retrieval models assume that the keywords in queries are parallel, but user requirements should be abstracted into a series of keyword groups, where the semantic relations among keywords inside a group are closer than those outside. This is the "Term Proximity Information Retrieval" (TPIR) defined in this paper, for which we presented a solution framework based on the Weigh Matrix (WMSF). WMSF abstracts documents and queries into the Weigh Matrix Representation of Document and the Query Weigh Matrix, and then implements TPIR based on the calculation of similarity between them. Empirical results show that WMSF is appropriate for TPIR compared with traditional information retrieval models, which actually oversimplify TPIR problems.
Adaptive Web Information Extraction Based on Tree
LI Zhao,PENG Hong,YE Su-nan,ZHANG Huan,YANG Qin-yao
Computer Science. 2009, 36 (7): 202-203.  doi:10.11896/j.issn.1002-137X.2009.07.048
Abstract PDF(227KB) ( 741 )   
Related Articles | Metrics
Many Web information extraction methods are based on wrapper induction, which extracts items by rules learnt from the Web pages used for training. Although it can extract the information accurately, it is hard to maintain when the template of the Web site changes, as the rules need to be learnt again. In our research, we put forward a new adaptive Web information extraction method. Based on the DOM tree structure, it determines the block which contains all the information about the merchandise by using the keywords of a certain topic. Experiments on a large number of Web pages show that our method can not only extract the information efficiently, but is also independent of the site structure, so it can be widely applied to many different Web information extraction tasks.
Classification Mining Using Association Rules Based on Rule Ranking
ZHU Xiao-yan,SONG Qin-bao
Computer Science. 2009, 36 (7): 204-207.  doi:10.11896/j.issn.1002-137X.2009.07.049
Abstract PDF(322KB) ( 638 )   
Related Articles | Metrics
A new associative classification algorithm based on rule ranking was proposed. The proposed method takes advantage of the optimal rule method's preference for high quality rules. At the same time, it takes into consideration as many rules as possible, which remedies the bias of CBA, which builds a classifier from only the several rules covering the training dataset. In the proposed algorithm, after the generation of association rules whose consequents are class labels, rules are ranked according to their length, confidence, support, lift and so on. Rules having no influence on the classification result are deleted during ranking. The set of ranked rules, together with a default class, constitutes the final classifier. Finally, 20 datasets selected from the UCI ML Repository were used to evaluate the performance of the method. The experimental results show that our method has higher average classification accuracy in comparison with CBA.
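One plausible instance of the ranking and default-class construction described above is sketched below; the exact measure order and tie-breaking used by the paper are not given in the abstract, so both are assumptions here.

```python
# Hypothetical rule representation: each association rule carries its class
# label (consequent) and the measures used for ranking.
class Rule:
    def __init__(self, items, label, confidence, support, lift):
        self.items = frozenset(items)   # antecedent items
        self.label = label
        self.confidence = confidence
        self.support = support
        self.lift = lift

def rank_rules(rules):
    """Order rules by confidence, then support, then lift, then shorter
    antecedent first -- one plausible ranking over the measures the
    abstract lists."""
    return sorted(rules,
                  key=lambda r: (-r.confidence, -r.support, -r.lift,
                                 len(r.items)))

def classify(ranked, default_label, instance):
    """The first matching rule in rank order wins; instances matching no
    rule fall back to the default class."""
    for r in ranked:
        if r.items <= instance:
            return r.label
    return default_label
```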
FP-array-based Improved FP-growth Algorithm
TAN Jun,BU Ying-yong,YANG Bo
Computer Science. 2009, 36 (7): 208-210.  doi:10.11896/j.issn.1002-137X.2009.07.050
Abstract PDF(224KB) ( 717 )   
Related Articles | Metrics
In the FP-growth algorithm, two traversals of the FP-tree are needed to construct each new conditional FP-tree. A novel FP-array technique was presented that greatly reduces the need to traverse FP-trees. An improved FP-growth algorithm was presented which efficiently uses the FP-tree data structure in combination with the FP-array technique. Experimental results show that the new algorithm works especially well for sparse datasets.
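The FP-array idea is to accumulate item co-occurrence counts in a single pass, so that the first traversal for each conditional FP-tree (which only gathers counts) can be skipped. A minimal stand-in, counting pairs directly from transactions rather than from tree paths, might look like this:

```python
from itertools import combinations

def build_fp_array(transactions, items):
    """Count co-occurrences of every item pair in one pass.  In the
    FP-array technique these counts let the conditional FP-tree for an
    item be built without a preliminary counting traversal; here they
    are gathered from raw transactions for illustration."""
    index = {it: i for i, it in enumerate(items)}
    n = len(items)
    counts = [[0] * n for _ in range(n)]   # upper triangle is used
    for t in transactions:
        present = sorted(index[i] for i in set(t) if i in index)
        for a, b in combinations(present, 2):
            counts[a][b] += 1
    return counts, index

def pair_support(counts, index, x, y):
    """Support of the 2-itemset {x, y}, read from the triangular array."""
    a, b = sorted((index[x], index[y]))
    return counts[a][b]
```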
Voice Activity Detection Using Wavelets Multiresolution Spectrum and Short-time Adaptive Audio Mixing Algorithm
XUE Wei,DU Si-dan,YE Ying-xian
Computer Science. 2009, 36 (7): 211-214.  doi:10.11896/j.issn.1002-137X.2009.07.051
Abstract PDF(318KB) ( 1084 )   
Related Articles | Metrics
The proposed VAD uses the MFCC of the multiresolution spectrum and two classical audio parameters as audio features, prejudges silence by detecting the multi-gate zero-crossing ratio, and classifies noise and voice with support vector machines. A new speech mixing algorithm used in the Multipoint Control Unit (MCU) of conferences takes the short-time power of each audio stream as the mixing weight vector, and was designed for parallel processing. Various experiments show that the proposed VAD algorithm achieves overall better performance at all SNRs than the VAD of G.729b and other VADs; the output audio of the new speech mixing algorithm has excellent auditory perceptibility, its computational time delay is small enough to satisfy the needs of real-time transmission, and the MCU computation is lower than that based on the G.729b VAD.
Method of Pre-decision on Pear Scab Based on SVR and Dynamical Feature Selection
GU Li-chuan,ZHONG Jin-qin,ZHANG You-hua,LI Shao-wen
Computer Science. 2009, 36 (7): 215-217.  doi:10.11896/j.issn.1002-137X.2009.07.052
Abstract PDF(327KB) ( 683 )   
Related Articles | Metrics
At present, methods for pre-decision on fruit diseases are time-consuming and forecast poorly. A new regression-based forecast method, SVR-D1.1, was proposed in this paper, in which feature reduction is conducted using SVM. The method can repeatedly select closely correlated features and construct its dynamically optimized parameters. Correlation statistical analysis was conducted between the real data and the forecast data of Dangshansu pear scab, which shows that the method is superior to current methods in the efficiency and precision of forecasting the occurrence tendency of Dangshansu pear scab. The experiment showed that the approach has obvious advantages in fitting degree, reasoning efficiency and accuracy.
Approach of Web Services Composition Based on Co-evolutionary Cytokine Network
LIU Zai-qun,DING Yong-sheng,HU Zhi-hua
Computer Science. 2009, 36 (7): 218-221.  doi:10.11896/j.issn.1002-137X.2009.07.053
Abstract PDF(361KB) ( 666 )   
Related Articles | Metrics
Inspired by the co-evolutionary mechanism in the neural-immune-endocrine system, a cytokine network platform was proposed for Web services composition. Under the control of the cytokine network, a bio-entity delegates a Web service to construct a Mealy evolutionary unit with conditions, and Web composition is then transformed into a Mealy evolutionary process. Bio-entities construct the cytokine network through message matching and conditional constraints to support Web services composition. During the co-evolution of Web services, the composed services are dynamically adjusted to accomplish the dynamic composition and management of Web services. The simulation results show that the approach is adaptive in dynamic environments.
Applying a Hybrid Approach Based on Cooperative Co-evolution and Interactive Genetic Algorithm to Complex Product Conceptual Design
SONG Dong-ming,ZHU Yao-qin,WU Hui-zhong
Computer Science. 2009, 36 (7): 222-226.  doi:10.11896/j.issn.1002-137X.2009.07.054
Abstract PDF(410KB) ( 771 )   
Related Articles | Metrics
A scheme-solving process model of complex product conceptual design based on the idea of evolution was constructed, and a new hybrid approach based on cooperative co-evolution and an interactive genetic algorithm was proposed. According to the characteristics of the function design in cell phone conceptual design, a hybrid coding approach based on the Messy coding technique and a two-row chromosome structure was proposed. A case study was carried out to demonstrate its utility and efficiency for the conceptual design of complex products involving multiple objectives and human-computer cooperation.
Grid Service Deployment Technology Based on the Fuzzy-PID Feedback Model
YIN Feng,HE Xian-bo, LIU Tao
Computer Science. 2009, 36 (7): 227-229.  doi:10.11896/j.issn.1002-137X.2009.07.055
Abstract PDF(246KB) ( 647 )   
Related Articles | Metrics
Grid service scheduling algorithms are a kernel technique in the task management system of a computing grid. A new grid service deployment agent technology based on a fuzzy-PID feedback control model was put forward to handle the uncertainty in task scheduling of the computing grid. This method fuses fuzzy theory with PID technology. Theoretical analysis and numerical experiments illustrate that this method can express the dynamics and uncertainty of the expected time to compute of tasks and bring the dummy executing competence into play in the computing grid environment; it is a generalization of traditional grid scheduling algorithms. At the same time, it can afford balanced bandwidth, delay and reliability.
New Group Priority Dynamic Real-time Scheduling Algorithm
BA Wei,ZHANG Da-bo,LI Qi,WANG Wei
Computer Science. 2009, 36 (7): 230-233.  doi:10.11896/j.issn.1002-137X.2009.07.056
Abstract PDF(376KB) ( 975 )   
Related Articles | Metrics
Priority scheduling algorithms place no restriction on the number of priority levels, which limits their practical applicability for achieving good schedulability. Aiming at this problem, and considering the requirement of preemptive scheduling for hard real-time systems, a new group-priority dynamic real-time scheduling algorithm was presented. The influence on system schedulability of changing the execution order of some jobs was studied, and a schedulability test for job grouping was given. In the new algorithm, jobs that satisfy the schedulability test formula are joined together as a group. Jobs outside the group are scheduled earliest-deadline-first, while jobs within the group are scheduled shortest-job-first. The simulation results show that, compared with earliest-deadline-first and other traditional scheduling algorithms, the new algorithm greatly decreases the number of priority levels, increases the success ratio, shortens the average response time and reduces the number of context switches.
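A minimal dispatcher following the abstract's rule, shortest-job-first inside the group and earliest-deadline-first outside it, might look like the sketch below. How ties between the two pools are resolved is not stated in the abstract, so the rule "the pool owning the earliest deadline wins" is an assumption.

```python
def pick_next(ready, in_group):
    """Choose the next job to dispatch from the ready set.

    Each job is a (name, deadline, exec_time) tuple; in_group is the set
    of job names that passed the schedulability (grouping) test.  Assumed
    tie rule: find the earliest deadline among all ready jobs; if it
    belongs to a group job, the group's shortest-job-first order decides,
    otherwise plain EDF applies."""
    if not ready:
        return None
    edf = min(ready, key=lambda j: j[1])        # earliest deadline overall
    if edf[0] in in_group:
        group_jobs = [j for j in ready if j[0] in in_group]
        return min(group_jobs, key=lambda j: j[2])  # shortest job first
    return edf
```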
Hybrid Intelligent Prediction Model of Cobalt Concentration for Purification Process
ZHU Hong-qiu, YANG Chun-hua, GUI Wei-hua
Computer Science. 2009, 36 (7): 234-236.  doi:10.11896/j.issn.1002-137X.2009.07.057
Abstract PDF(339KB) ( 641 )   
Related Articles | Metrics
A hybrid intelligent prediction model combining case-based reasoning (CBR) with adaptive particle swarm optimization (PSO) was proposed for cobalt concentration prediction in the purification process of zinc hydrometallurgy. Owing to the different effects of cases in different periods, a combined weighted similarity function was presented. Considering that the retrieval accuracy of CBR is influenced by the selection of the feature weighting vector and the optimal number of nearest neighbors, an adaptive PSO algorithm was proposed to optimize these parameters. Experimental verification and comparison analysis were carried out using industrial production data from the purification process. The results show that the accuracy of the hybrid intelligent model is higher than that of a BP neural network model, and the prediction results can be used as process data for operational optimization of the purification process.
Online Route Planning Based on Quantum Particle Swarm Optimization
GUO Jin-chao,HUANG Xin-han,WANG Yan-feng,CUI Guang-zhao
Computer Science. 2009, 36 (7): 237-239.  doi:10.11896/j.issn.1002-137X.2009.07.058
Abstract PDF(225KB) ( 798 )   
Related Articles | Metrics
In modern warfare the environmental information is constantly changing and it is difficult to obtain global environmental information in advance, so a real-time flight route planning capability is required for unmanned aircraft. Quantum particle swarm optimization was introduced to solve this optimization problem. By incorporating the constraints into the algorithm, the local trap problem of the simple PSO algorithm was solved effectively. Meanwhile, according to the threat distribution of terrain obstacles, adversarial defense radar sites and unexpected surface-to-air missile (SAM) sites, a surface of minimum risk was introduced and used to form the search space. B-spline curves were used to approximate the horizontal projection of the 3-D route, which simplified the original problem to a two-dimensional optimization problem, decreasing its complexity and improving efficiency. The simulation results show that this method can meet the requirements of online route planning.
New Penalty Model for Constrained Optimization Problems
HU Yi-bo,WANG Yu-ping
Computer Science. 2009, 36 (7): 240-243.  doi:10.11896/j.issn.1002-137X.2009.07.059
Abstract PDF(314KB) ( 1894 )   
Related Articles | Metrics
The penalty function method is one of the most widely used methods for constrained optimization problems in evolutionary algorithms. It makes the search approach the feasible region gradually by punishing infeasible solutions. The penalty functions are usually defined as the sum of the objective function and the penalty terms. Such methods have two main drawbacks: firstly, it is difficult to control the penalty parameters; secondly, when the difference between the objective function value and the constraint function value is great, the algorithm cannot effectively distinguish feasible from infeasible solutions, and thus cannot handle the constraints effectively. To overcome these defects, two satisfaction degree functions, defined by the objective function and the constraint functions respectively, were designed, and a new penalty function was constructed from these two satisfaction degree functions. Moreover, we designed an adaptive penalty factor which varies with the quality of the population and the number of generations, so the penalty factor can be easily controlled. Thus a new penalty function optimization model was proposed. Furthermore, a new crossover operator and a new mutation operator were designed. Based on these, a new evolutionary algorithm for constrained optimization problems was proposed. Simulations on six widely used benchmark problems indicate that the proposed algorithm is very effective.
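A generic sketch of the construction follows; the paper's actual satisfaction degree functions and their combination rule are not given in the abstract, so the linear degrees and the generation-weighted combination below are assumptions for illustration only.

```python
def satisfaction_objective(f, f_min, f_max):
    """Map an objective value into [0, 1]; a smaller objective (for
    minimization) gets a higher satisfaction degree."""
    if f_max == f_min:
        return 1.0
    return max(0.0, min(1.0, (f_max - f) / (f_max - f_min)))

def satisfaction_constraints(violation, v_max):
    """Degree 1 for feasible points, decaying linearly with total
    constraint violation."""
    if violation <= 0.0:
        return 1.0
    return max(0.0, 1.0 - violation / v_max)

def penalized_fitness(f, violation, f_min, f_max, v_max, gen, max_gen):
    """Combine the two degrees with an adaptive factor: the exponent on
    the constraint degree grows with the generation number, so both
    degrees stay on the same [0, 1] scale (avoiding the magnitude-mismatch
    problem of additive penalties) and later search favors feasibility."""
    alpha = gen / max_gen   # adaptive penalty factor in [0, 1]
    return (satisfaction_objective(f, f_min, f_max)
            * satisfaction_constraints(violation, v_max) ** alpha)
```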
Application of Chaos-support Vector Machine Regression in Traffic Prediction
LUO Yun-qian,XIA Jing-bo,WANG Huan-bin
Computer Science. 2009, 36 (7): 244-246.  doi:10.11896/j.issn.1002-137X.2009.07.060
Abstract PDF(244KB) ( 1017 )   
Related Articles | Metrics
A traffic forecasting model based on the support vector machine (SVM) and chaos was developed to improve the accuracy of traffic prediction. Based on phase space reconstruction, it calculates the real-time traffic's delay time, embedding dimension and Lyapunov exponent, and establishes that traffic chaos exists. Then a chaos-SVM model was constructed and training sample pairs were determined to forecast real network traffic. The results show that the chaos-SVM model is able to predict network traffic effectively. In comparison with a BP neural network, it has higher prediction accuracy.
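The phase space reconstruction step can be made concrete: with delay tau and embedding dimension m, each state vector collects m delayed samples, and (state, next value) pairs become the regressor's training data. A minimal sketch (the SVM itself is omitted):

```python
def delay_embed(series, m, tau):
    """Phase-space reconstruction by the method of delays: each state
    vector is (x[t], x[t+tau], ..., x[t+(m-1)*tau])."""
    span = (m - 1) * tau
    return [tuple(series[t + j * tau] for j in range(m))
            for t in range(len(series) - span)]

def training_pairs(series, m, tau):
    """(embedded state, next value) pairs for one-step-ahead prediction;
    these are the samples a chaos-SVM regressor would be trained on."""
    emb = delay_embed(series, m, tau)
    span = (m - 1) * tau
    return [(emb[i], series[i + span + 1]) for i in range(len(emb) - 1)]
```

In practice tau and m are chosen from the measured delay time and embedding dimension of the traffic series, as the abstract describes.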
Efficient Algorithm for Mining Frequent Patterns over Offline Data Streams
HOU Wei,WU Chen-sheng,YANG Bing-ru,FANG Wei-wei
Computer Science. 2009, 36 (7): 247-251.  doi:10.11896/j.issn.1002-137X.2009.07.061
Abstract PDF(510KB) ( 736 )   
Related Articles | Metrics
Mining frequent patterns from data streams is one of the hottest research topics in data mining nowadays. The features of data streams, such as continuity, disorder and real-time arrival, demand high time and space performance from mining algorithms. Vibration of pattern frequency in data streams compels present algorithms to revise their synopsis structures continually, which degrades both time and space efficiency. A more scalable synopsis structure, SP-tree, was designed first, and the concept of a vibration factor X was introduced to maintain vibrational information. Then an efficient algorithm for mining frequent patterns over offline data streams, SPDS, was proposed, which effectively shields performance from the impact of vibration and increases the count accuracy of partial patterns. The algorithm adopts a divide-and-conquer mechanism to mine the current dataset, further improving its performance.
Modified Algorithm for Hash Function Based on the Spatiotemporal Chaotic System
ZHANG Xiang-hua
Computer Science. 2009, 36 (7): 252-255.  doi:10.11896/j.issn.1002-137X.2009.07.062
Abstract PDF(324KB) ( 694 )   
Related Articles | Metrics
A class of hash function schemes was reviewed, and its flaws and security holes were pointed out. Then a modified algorithm for one-way hash function construction based on the spatiotemporal chaotic system was proposed. Analysis shows that the proposed algorithm has better statistical performance and efficiency, and remedies the security holes of the original algorithm.
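The abstract does not give the construction itself. As a purely illustrative, cryptographically insecure sketch of the general chaos-based hashing idea, a logistic map can be perturbed by the message bytes and its later trajectory quantised into a digest:

```python
def chaotic_hash(message: bytes, rounds: int = 4) -> bytes:
    """Toy chaos-based hash: drive a logistic-map trajectory with message
    bytes, then quantise later trajectory points into a 16-byte digest.
    Illustrative only -- NOT the paper's scheme and NOT secure."""
    x = 0.5
    for _ in range(rounds):
        for b in message:
            x = 3.99 * x * (1.0 - x)          # chaotic logistic-map step
            x = (x + b / 255.0) % 1.0 or 0.3  # inject byte; avoid the fixed point 0
    digest = bytearray()
    for _ in range(16):
        x = 3.99 * x * (1.0 - x)
        digest.append(int(x * 256) % 256)
    return bytes(digest)
```

The chaotic map's sensitivity to initial conditions plays the role of diffusion: a one-byte change in the message sends the trajectory, and hence the digest, somewhere completely different.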
Research and Implementation on Migration of Virtual Machine Including Vm-disk
LU Xiao-hu,LI Qin
Computer Science. 2009, 36 (7): 256-261.  doi:10.11896/j.issn.1002-137X.2009.07.063
Abstract PDF(578KB) ( 1164 )   
Related Articles | Metrics
The existing memory-based virtual machine (VM) migration technology requires a network disk shared between the source and destination sides, which causes severe performance degradation under poor network quality; moreover, VM migration cannot be implemented at all without a shared network-disk environment. To solve these problems, in view of the high availability, high performance, security and stability of the VM disk, which acts as the encapsulation of the VM's persistent running state, a new migration strategy based on the VM disk was proposed and implemented. A virtual disk driver named DiskMig was designed to migrate the VM disk and ensure its consistency during the migration process, and a new algorithm, Transfer on Demand with Forward-ahead (TOD&FA), was presented to synchronize disk differences between source and destination quickly. DiskMig adopts a bitmap as an effective storage structure to record and search massive I/O offset information during migration in constant time. Experiments demonstrate that DiskMig degrades the performance of the VM and its applications by only 5 percent; thus VM migration including its disk is implemented efficiently.
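The bitmap idea, constant-time recording and lookup of which disk blocks changed during migration, can be sketched as follows. The block size and all class/method names are illustrative, not taken from the DiskMig implementation.

```python
class DirtyBitmap:
    """Track modified disk blocks during migration: one bit per fixed-size block,
    so both recording a write and checking a block are O(1)."""

    def __init__(self, disk_size: int, block_size: int = 4096):
        self.block_size = block_size
        nbits = (disk_size + block_size - 1) // block_size
        self.bits = bytearray((nbits + 7) // 8)

    def mark(self, offset: int) -> None:
        """O(1): record a write at the given byte offset."""
        b = offset // self.block_size
        self.bits[b >> 3] |= 1 << (b & 7)

    def is_dirty(self, offset: int) -> bool:
        """O(1): check whether the block containing offset must be re-sent."""
        b = offset // self.block_size
        return bool(self.bits[b >> 3] & (1 << (b & 7)))
```

During the synchronization phase, the sender would walk this bitmap and retransmit only the blocks whose bits are set.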
Research on Matrix Reconstruction in RAID Controller
JIANG Guo-song,ZOU Chen,XIE Chang-sheng
Computer Science. 2009, 36 (7): 262-266.  doi:10.11896/j.issn.1002-137X.2009.07.064
Abstract PDF(492KB) ( 694 )   
Related Articles | Metrics
Erasure codes provide a specific coding method for data reconstruction that protects against multiple failures in a disk array. In RAID applications, they model data loss at the level of whole strips and optimize reconstruction algorithms for entire-strip reconstruction; in other words, they apply only to highly fault-correlated sectors, i.e. sequential sectors on the lost disk. We addressed two more general issues: data recovery when scattered, unrelated sectors are erased, and data recovery from a single disk failure combined with additional faults that cause partial loss. The proposed methods for these two issues are completely general and can be used with any erasure code, but are best suited to XOR-based erasure codes. For scattered erasures, we typically provide one of two results for each lost sector of data: either the lost data is declared unrecoverable, or it is declared recoverable and a formula is provided for the reconstruction that depends only on readable sectors. In short, the methodology is both complete and constructive.
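For XOR-based erasure codes, the reconstruction formula for a single lost sector is simply the XOR of the surviving sectors in its parity group; a minimal sketch:

```python
def xor_recover(surviving: list) -> bytes:
    """Recover one lost sector: with XOR parity, the missing sector equals
    the XOR of all surviving sectors in its group (data + parity)."""
    out = bytearray(len(surviving[0]))
    for sector in surviving:
        for i, byte in enumerate(sector):
            out[i] ^= byte
    return bytes(out)

# Build a parity sector over two data sectors, then recover sector `a`:
a, b = b"\x01\x02\x03", b"\x10\x20\x30"
parity = xor_recover([a, b])        # XOR of the whole group acts as parity
recovered = xor_recover([b, parity])
```

The "formula depending only on readable sectors" the abstract mentions generalizes this: for each lost sector, one solves the code's linear system over GF(2) to decide whether such an XOR combination of readable sectors exists.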
On-line Self-adaptive System Based on Evolvable Hardware
ZHU Ji-xiang,LI Yuan-xiang,XIA Xue-wen,ZENG Hui
Computer Science. 2009, 36 (7): 267-269.  doi:10.11896/j.issn.1002-137X.2009.07.065
Abstract PDF(321KB) ( 642 )   
Related Articles | Metrics
With the rise of evolvable hardware, its advantages in evolutionary design of electronic circuits, fault tolerance, self-adaptability and so on may make it a new technology for breaking through the bottleneck of conventional electronic system design. On the basis of previous research, we further explored applying evolvable hardware to self-adaptive systems and proposed a self-adaptive system model based on evolvable hardware. Experiments on a Xilinx Virtex-II Pro (XC2VP20) FPGA demonstrated that evolvable hardware is a feasible scheme for making electronic systems self-adaptive.
Research and Implement of Realtime of Linux Based on RTHAL
SU Shu-guang,LIU Yun-sheng
Computer Science. 2009, 36 (7): 270-272.  doi:10.11896/j.issn.1002-137X.2009.07.066
Abstract PDF(251KB) ( 1070 )   
Related Articles | Metrics
The paper focused on the research and implementation of real-time Linux. It proposed a method based on an RTHAL (Real Time Hardware Abstraction Layer) that runs between the hardware and Linux. The method requires only a few modifications to Linux and achieves a scheduling delay of only about 3 μs. The paper discussed how to modify Linux and the RTHAL for hard real-time operation, and additionally proposed an effective method that uses a buffer between them. Finally, a real-time MPEG-4 streaming system that applies the method was cited as an example, showing that the method can add hard real-time capability to Linux effectively and concisely.
Object Semantic Probabilistic Model and its Application in Category Object Recognition and Scene Analysis
LIU Wei,CHEN Xin-wu,TIAN Jin-wen
Computer Science. 2009, 36 (7): 273-277.  doi:10.11896/j.issn.1002-137X.2009.07.067
Abstract PDF(385KB) ( 668 )   
Related Articles | Metrics
The article seeks to discover a semantic probabilistic model of object categories based on statistical text analysis, and applies this new model to object recognition and scene analysis. First, the image is represented by a set of local feature regions. Then, the probabilities linking image, local regions and semantic category found by the new model are used to calculate the posteriors and recognize the object. The EM algorithm was used to estimate the parameters of the model. Experiments show good object recognition performance against cluttered backgrounds and also show the feasibility of the scene analysis.
Research on Image Segmentation Based on Global Optimization Search Algorithm
YANG Dan,QU Zhong
Computer Science. 2009, 36 (7): 278-280.  doi:10.11896/j.issn.1002-137X.2009.07.068
Abstract PDF(203KB) ( 658 )   
Related Articles | Metrics
In cluster-based image segmentation, the FCM (fuzzy C-means) algorithm requires initialization, and its objective function has many local minima; if the initialization lands near a local minimum, the algorithm converges to it. To solve this problem, a global optimization search algorithm, which has global search capabilities, was introduced into the FCM algorithm. Simulation experiments and theoretical analysis of the algorithm's performance show that the improved FCM is more effective than the traditional FCM clustering algorithm.
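A single iteration of the standard FCM update (membership update followed by center update), around which the abstract's global optimization search would wrap, can be sketched in NumPy as:

```python
import numpy as np

def fcm_step(X, centers, m=2.0):
    """One fuzzy C-means iteration.

    X: (n, d) data points; centers: (c, d) current cluster centers;
    m: fuzziness exponent (> 1). Returns updated memberships and centers.
    """
    # Distances from every point to every center, shape (n, c).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    # Membership of point k in cluster i: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
    u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
    # Centers as membership-weighted means: v_i = sum_k u_ik^m x_k / sum_k u_ik^m.
    um = u ** m
    centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return u, centers
```

The local-minimum problem the abstract describes comes from iterating exactly this step from one initialization; a global search instead restarts or perturbs the `centers` argument across the search space.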
DCT Based Public Watermarking Algorithm with Visual Model
YANG Hong-jun,YANG Sheng,XIA Tai-wu
Computer Science. 2009, 36 (7): 281-283.  doi:10.11896/j.issn.1002-137X.2009.07.069
Abstract PDF(236KB) ( 718 )   
Related Articles | Metrics
A public watermarking technique resistant to lossy compression was presented. A watermark with a visually recognizable pattern is embedded into the host image by modifying the polarity between the DC value and a low-frequency AC value in each 8×8 DCT block, and the watermark is adapted to the image by exploiting the masking characteristics of the human visual system (HVS). Watermark extraction does not need the original image. The algorithm provides very good results in terms of both image transparency and robustness. The experimental results show that the proposed technique successfully survives lossy compression and common image processing operations.
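The exact polarity rule is not given in the abstract. As an illustrative variant of the idea, one bit can be embedded per 8×8 block by forcing the sign of one low-frequency AC coefficient; the coefficient choice and the `strength` parameter are assumptions, not the paper's values.

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II matrix M: forward transform is M @ B @ M.T,
    inverse is M.T @ C @ M."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n)) * np.sqrt(2 / n)
    M[0] /= np.sqrt(2)
    return M

def embed_bit(block: np.ndarray, bit: int, ac=(0, 1), strength: float = 8.0):
    """Force the chosen low-frequency AC coefficient positive for bit 1,
    negative for bit 0, with magnitude at least `strength` so the sign
    survives quantization during lossy compression."""
    M = dct_matrix(block.shape[0])
    C = M @ block @ M.T
    mag = max(abs(C[ac]), strength)
    C[ac] = mag if bit else -mag
    return M.T @ C @ M

def extract_bit(block: np.ndarray, ac=(0, 1)) -> int:
    """Blind extraction: only the sign of the coefficient is needed."""
    M = dct_matrix(block.shape[0])
    return int((M @ block @ M.T)[ac] > 0)
```

An HVS-adaptive version would scale `strength` per block by a masking measure (e.g. local variance) so stronger marks hide in textured regions.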
Optimization Algorithm for Detection and Localization of Pure Face in Color Images
LI Xiang,WANG Jian-guo
Computer Science. 2009, 36 (7): 284-287.  doi:10.11896/j.issn.1002-137X.2009.07.070
Abstract PDF(290KB) ( 707 )   
Related Articles | Metrics
Face recognition is a complex and difficult problem that is important for surveillance and security, telecommunications, digital libraries, and intelligent human-computer interaction. Based on a review of a wide variety of algorithms, this paper presented a multi-way integrated optimization algorithm for detection and localization of the pure face in color images. First, skin color detection and preprocessing rules are applied to the input color image to reduce the face detection search regions. Second, a method that can detect a face arbitrarily rotated within the image plane was proposed, based on computing the direction of the object. Third, to determine the angle of the rotated face region accurately, a method that recomputes the rotation angle was proposed. Finally, the two eyes in the face candidate region are located by a projection function, and the "pure face" position is determined by combining the proportions of the facial features; side-view faces are also considered. The experiments showed that the proposed method provides better detection results for faces arbitrarily rotated within the image plane and for side-view faces with two eyes, and is more robust under different illumination.
SMT Solder Joint 3D Reconstruction Technology Based on Shape from Shading
ZHAO Hui-huang,ZHOU De-jian,HUANG Chun-yue,ZHANG Shao-hua
Computer Science. 2009, 36 (7): 288-291.  doi:10.11896/j.issn.1002-137X.2009.07.071
Abstract PDF(315KB) ( 728 )   
Related Articles | Metrics
The principle of surface-mount solder joint 3D reconstruction based on shape from shading can be described as follows: a practical solder joint image is obtained by image acquisition and processed with image processing algorithms. According to a definite reflection model, a constraint relation between the surface shape and image brightness is built; another constraint relation on the parameters of the object's surface shape is derived from prior knowledge of that shape. Solving these constraint relations then yields the 3D shape of the object's surface. Furthermore, aiming at the poor results obtained when reconstructing unaccepted solder joints, we improved the 3D reconstruction algorithm. After expounding the principles of the technology, its implementation method and steps were introduced with practical examples. Some key technologies, including image acquisition and processing and solder joint 3D reconstruction algorithms, were also discussed, and example results were given to verify the feasibility of the technology.
Text Segmentation Using Color and Stroke Features
HUANG Bai-gang,LI Jun-shan,HU Shuang-yan
Computer Science. 2009, 36 (7): 292-294.  doi:10.11896/j.issn.1002-137X.2009.07.072
Abstract PDF(226KB) ( 692 )   
Related Articles | Metrics
A novel text segmentation approach based on unsupervised clustering using color and stroke features was presented. First, the likely text and background colors are estimated on the enhanced text line image by color reduction and histogram calculation. Then color and stroke features are extracted, and text and background pixels are separated by the K-means clustering algorithm. Finally, the segmentation result is optimized by post-processing. The performance of the approach is demonstrated by experimental results on a set of images containing Chinese and English text.
Algorithm for Extracting Skeleton of 3D Human Body Model Based on Body Characteristic
CHEN Guo-dong,LI Jian-wei,PAN Lin,YU Lun
Computer Science. 2009, 36 (7): 295-297.  doi:10.11896/j.issn.1002-137X.2009.07.073
Abstract PDF(233KB) ( 1177 )   
Related Articles | Metrics
Extracting the skeleton of a 3D human body model is very important for animating 3D characters. Existing algorithms are complicated or require interaction. We presented an algorithm based on the traits of the human body and the golden section. The algorithm consists of four steps: 1) transform the original model into a low-polygon model; 2) estimate the positions of the joints based on body traits and the golden section; 3) segment the human body; 4) refine the joint positions using geodesic distance. The experimental results show that our method is easier to implement and faster in computation than other methods.
Retrieval of 3D Models Based on Area Shape Distribution
ZHAO Peng-fei,JIN Feng
Computer Science. 2009, 36 (7): 298-299.  doi:10.11896/j.issn.1002-137X.2009.07.074
Abstract PDF(239KB) ( 659 )   
Related Articles | Metrics
Reducing retrieval time and increasing retrieval efficiency are key problems in 3D model retrieval. Osada's D3 shape distribution does not describe the model's content sufficiently, and its computational cost is comparatively large. A novel method for 3D model retrieval was proposed: compute the area of the triangle formed by the model's center of mass and the barycenters of two random triangles, use the statistics to build area distribution histograms, and then match models by comparing histograms. Experimental results show that the proposed method achieves better performance.
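The descriptor described, areas of triangles formed by the model centroid and the barycenters of two randomly chosen faces accumulated into a histogram, can be sketched as follows (function and parameter names are illustrative; sample and bin counts are arbitrary choices):

```python
import numpy as np

def area_distribution(vertices, faces, samples=1024, bins=32, rng=None):
    """Area shape distribution: histogram of triangle areas formed by the
    model centroid and the barycenters of two randomly chosen faces.

    vertices: (V, 3) float array; faces: (F, 3) int array of vertex indices.
    """
    rng = np.random.default_rng(rng)
    centroid = vertices.mean(axis=0)
    bary = vertices[faces].mean(axis=1)          # (F, 3) face barycenters
    idx = rng.integers(0, len(bary), size=(samples, 2))
    a = bary[idx[:, 0]] - centroid               # first triangle edge
    b = bary[idx[:, 1]] - centroid               # second triangle edge
    areas = 0.5 * np.linalg.norm(np.cross(a, b), axis=1)
    hist, _ = np.histogram(areas, bins=bins, density=True)
    return hist
```

Two models are then compared by a histogram distance (e.g. L1 or L2) between their descriptors; for scale invariance the areas would first be normalized, e.g. by their mean.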
Gradient Mean Prediction Filtering Algorithm and its Application
SHENG Ming-lan,LAN Zhang-li,ZHOU Jian-ting
Computer Science. 2009, 36 (7): 300-302.  doi:10.11896/j.issn.1002-137X.2009.07.075
Abstract PDF(246KB) ( 713 )   
Related Articles | Metrics
Measurement error is inevitable in a variety of measurement systems. Combining the advantages of traditional filtering methods, a gradient mean prediction filtering algorithm was proposed for the random errors present in slowly changing measurement data. The basic idea and flow of the algorithm were described. A filtering experiment was performed on measurement data from a bridge continuous-deflection measurement system. The results indicate that the algorithm can effectively remove interference signals from slowly changing data, and the filtering effect is obvious.
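The abstract does not give the algorithm's equations. A minimal sketch of the general idea, predicting each sample from the mean gradient of the preceding window and replacing samples that deviate too far from the prediction, might look like this (`window` and `threshold` are assumed parameters, not the paper's):

```python
def gradient_mean_filter(data, window=5, threshold=1.0):
    """Replace outliers in slowly changing data with gradient-based predictions.

    For each new sample, predict it as the last filtered value plus the mean
    gradient over the previous `window` filtered values; keep the measured
    sample only if it lies within `threshold` of the prediction.
    """
    out = list(data[:window])                    # seed with the first window
    for i in range(window, len(data)):
        grad = (out[-1] - out[-window]) / (window - 1)  # mean slope of window
        pred = out[-1] + grad
        out.append(data[i] if abs(data[i] - pred) <= threshold else pred)
    return out
```

For slowly varying signals such as bridge deflection, the mean gradient changes little between samples, so spikes from interference are rejected while genuine trends pass through.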