Started in January, 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Volume 37, Issue 3 (2010)
Research Advances in Interaction Testing
CHEN Xiang,GU Qing,WANG Xin-ping,CHEN Dao-xu
Computer Science. 2010, 37 (3): 1-5. 
Interaction testing is an effective test suite generation technique and an increasingly challenging research problem. It is concerned with the interactions among the parameters of the system under test (SUT) and can test the SUT effectively with a smaller test suite. We described the interaction testing problem based on interaction coverage criteria, explained four well-known combinatorial objects, classified several well-known interaction testing approaches by their characteristics, and gave a brief interpretation of some classical approaches. We also gave a set of evaluation criteria and compared all the approaches. Finally, we concluded and discussed some issues to be further studied.
Survey of Chinese New Words Identification
ZHANG Hai-jun,SHI Shu-min,ZHU Chao-yong,HUANG He-yan
Computer Science. 2010, 37 (3): 6-10. 
New Words Identification (NWI) is a key technology in the field of Chinese information processing. NWI mainly includes two tasks: one is new word candidate extraction and filtering, and the other is new word POS guessing. Since Chinese has no explicit symbol to mark word boundaries, any adjacent characters may compose a word, which creates many obstacles for NWI. Moreover, because prior knowledge and statistical data are not available, new word POS guessing has become the technological bottleneck of Chinese tagging. The state of the field of Chinese NWI was analyzed in detail, and the research techniques and open problems in new word candidate extraction and new word POS guessing were discussed with emphasis. In the end, the paper presented prospects for the study of Chinese NWI.
Research and Development in Backdoor Set
GU Wen-xiang,LI Shu-xia,YIN Ming-hao
Computer Science. 2010, 37 (3): 11-16. 
There is a close relationship between the hidden structure of the propositional satisfiability problem and problem hardness, which has become a focus of study in recent years. The backdoor is one such hidden structure, which allows the remaining problem to be solved in polynomial time. Through the study of backdoors, this paper first gave a comprehensive introduction to the development of the backdoor problem, its related concepts, its parameterized complexity, and the relationship between backbones and backdoors. It then introduced specific methods for solving backdoor sets in the Constraint Satisfaction Problem (CSP), the Propositional Satisfiability Problem, and Quantified Boolean Formulae (QBF). Finally, it summarized some unresolved questions and prospects for backdoor sets.
Linkage Analysis of the Semantic Web: The State of the Art
GE Wei-yi,CHENG Gong,QU Yu-zhong
Computer Science. 2010, 37 (3): 17-21. 
With the development of Semantic Web research, more and more data are emerging on the Semantic Web. To pursue the goal of the Semantic Web, sharing and reusing data, entity search and document search are two essentials. Linkage analysis on the Semantic Web is a tool which can direct Semantic Web search. Besides, the state of the Semantic Web itself always attracts researchers, and linkage analysis is a key to mining the macrostructure of the Semantic Web. This paper surveyed the state of the art of research on linkage analysis of the Semantic Web at two granularities, entity-level and document-level. In particular, we focused on various linkage models as well as the use of various linkage analysis methods.
Survey of Trust Management in Open Distributed Environments
GUAN Shang-yuan,WU Wei-guo,DONG Xiao-she, MEI Yi-duo
Computer Science. 2010, 37 (3): 22-28. 
Trust is an important aspect of decision making for network-mediated principals, which can handle uncertainty, uncontrollability, fuzziness and incomplete information during interactions. Since trust management specifies, establishes, verifies and maintains trust relationships among principals, it has received increasing attention. On the basis of analyzing the existing concepts of trust and trust relationships, this paper presented descriptive and formal definitions of trust management, and discussed the relationship between them. Next, some typical trust management systems were introduced and compared. Based on an analysis of the shortcomings and problems of existing trust management systems, we ended this paper with discussions on the trend of research and application.
Representation for RBAC Model in Description Logic
MA Li,MA Shi-long,SUI Yue-fei,YI Sheng-wei
Computer Science. 2010, 37 (3): 29-35. 
Role-Based Access Control (RBAC) controls users' access to resources indirectly through roles, which greatly simplifies security management. Although RBAC is a mature research area, the lack of formalization of RBAC results in uncertainty and confusion about its concepts and meaning. Description Logic (DL) is an object-based knowledge representation formalism and a decidable fragment of first-order predicate logic, with well-defined semantics and powerful representation capability. To give a formal description of RBAC, this paper took RBAC96 as a reference model and proposed a new formalization of RBAC in description logic, called DLRBAC, which gives formal definitions of the concepts and relations of RBAC. This paper also proved that the formal representation is faithful to the RBAC model. Based on the formalized model, RBAC can be studied further.
Replacement Algorithm Improved on 2Q* for Mail Service Workload and its Application in Storage Cache
MENG Xiao-xuan,SI Cheng-xiang,LIU Zhen-han,XU Lu
Computer Science. 2010, 37 (3): 36-41. 
This paper analyzed the performance characteristics of the classic 2Q algorithm on mail-service workloads, and proposed an improved algorithm, called 2Q*. Simulation results show that 2Q* outperforms other replacement algorithms, including classic 2Q, for all cache sizes and various mail-service workloads. To verify the simulation results in practice, we implemented the algorithm in FlexiCache, a partitioned buffer cache system, and integrated it with a popular adaptive sequential prefetch policy. Experimental results on a real system further confirm the effectiveness of 2Q* in improving the physical I/O performance of mail-service applications. Moreover, its runtime overhead is fairly low.
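The classic 2Q policy that 2Q* improves on can be sketched as follows. This is a simplified illustration of standard 2Q, not the paper's 2Q* variant; the queue-size defaults and eviction order are assumptions made for brevity.

```python
from collections import OrderedDict

class TwoQ:
    """Simplified classic 2Q cache: a FIFO (A1in) for pages seen once,
    a ghost queue (A1out) remembering recently evicted keys, and an
    LRU queue (Am) for pages referenced again after eviction."""

    def __init__(self, size, kout=None):
        self.size = size                     # total resident pages allowed
        self.kout = kout or max(1, size // 2)
        self.a1in = OrderedDict()            # FIFO of once-seen pages
        self.a1out = OrderedDict()           # ghost: keys only, no data
        self.am = OrderedDict()              # LRU of hot pages

    def access(self, key):
        hit = False
        if key in self.am:
            self.am.move_to_end(key)         # LRU touch
            hit = True
        elif key in self.a1in:
            hit = True                       # correlated reference: stay in FIFO
        elif key in self.a1out:
            self.a1out.pop(key)              # second reference: promote to Am
            self._make_room()
            self.am[key] = True              # data must be re-fetched (miss)
        else:
            self._make_room()
            self.a1in[key] = True
        return hit

    def _make_room(self):
        while len(self.a1in) + len(self.am) >= self.size:
            if self.a1in:
                old, _ = self.a1in.popitem(last=False)   # FIFO evict
                self.a1out[old] = True                   # remember in ghost
                if len(self.a1out) > self.kout:
                    self.a1out.popitem(last=False)
            else:
                self.am.popitem(last=False)              # LRU evict
```

A page must prove reuse (a ghost hit) before entering the hot LRU queue, which is what protects the cache from the scan-like access patterns typical of mail workloads.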
Transport Sub-network Selection:Approaching with Bounded Maximum Spanning Sub-graphs
FENG Wang-sen,ZHANG Bei,CHEN Ping,CUI Jian
Computer Science. 2010, 37 (3): 42-45. 
The bounded degree maximum spanning sub-graph problem arising from wireless mesh networks was studied. Given a connected graph G=(V,E) and a positive integer d≥2, the problem is to find a maximum spanning sub-graph H of G subject to the constraint that for each vertex v∈V, the degree of v in H satisfies dH(v)≤d. Here, a spanning sub-graph of G is a connected sub-graph of G which contains all the vertices of G. Polynomial-time approximation algorithms were proposed for the edge-unweighted and edge-weighted cases respectively. When input graphs are edge-unweighted, a 2-approximation algorithm is designed. When input graphs are edge-weighted, the designed algorithm always outputs a spanning sub-graph whose maximum degree is no more than d+1 and whose weight is at least OPT(G)/(d+2), where OPT(G) is the weight of an optimal solution. The bounded degree spanning sub-graph output by the algorithm can be used as a transport sub-network in wireless mesh networks.
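The feasibility constraints in the problem statement (a sub-graph of G that spans all vertices, is connected, and respects the degree bound) can be checked directly. A minimal verifier sketch, assuming vertices are numbered 0..n-1; this is not the paper's approximation algorithm itself:

```python
from collections import defaultdict, deque

def is_feasible(n, edges, sub_edges, d):
    """Check that sub_edges is a connected spanning sub-graph of the
    n-vertex graph given by edges, with every vertex degree at most d."""
    edge_set = {frozenset(e) for e in edges}
    adj = defaultdict(list)
    for u, v in sub_edges:
        if frozenset((u, v)) not in edge_set:
            return False                       # not an edge of G
        adj[u].append(v)
        adj[v].append(u)
    if any(len(adj[v]) > d for v in range(n)):
        return False                           # degree bound violated
    seen = {0}                                 # BFS connectivity check
    q = deque([0])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                q.append(w)
    return len(seen) == n                      # spanning and connected
```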
Data Encryption Algorithm Based on Two Dimension Toggle Cellular Automata
XIA Xue-wen,LI Yuan-xiang,ZENG Hui
Computer Science. 2010, 37 (3): 46-48. 
A cryptography system based on two-dimensional toggle cellular automata was proposed to overcome the slowness of encryption and decryption based on one-dimensional toggle cellular automata. Encryption and decryption are completed by the co-evolution of the cellular automata. The key space can be adjusted by changing the neighborhood radius and the number of encryption rounds. The analysis results show that the cryptosystem can resist brute-force and differential attacks, and also has high security. Because encryption and decryption share the same hardware, the cryptosystem is highly practical.
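The "toggle" construction is what makes such automata invertible: the rule output on the current neighborhood is XORed with the previous configuration, so the same update function run backwards recovers earlier states. A one-dimensional sketch for illustration (the paper's construction is two-dimensional; the rule number and lattice size here are arbitrary assumptions):

```python
def step(prev, cur, rule):
    """One second-order ('toggle') CA step on a ring of bits: the
    elementary rule applied to cur is XORed with prev, so the pair
    (cur, nxt) fully determines prev and the step is reversible."""
    n = len(cur)
    nxt = []
    for i in range(n):
        l, c, r = cur[(i - 1) % n], cur[i], cur[(i + 1) % n]
        f = (rule >> (l << 2 | c << 1 | r)) & 1   # elementary rule lookup
        nxt.append(f ^ prev[i])
    return cur, nxt
```

Running `step` on the swapped pair inverts it: `step(nxt, cur, rule)` yields `(cur, prev)`, which is exactly the property a CA-based cipher exploits for decryption.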
Research on Improved Handoff Scheme between 3G and WLAN Networks
ZHANG Li-zhuo,JIA Wei-jia,ZHOU Shi-fei
Computer Science. 2010, 37 (3): 49-51. 
Traditional vertical handoff technology does not consider handoff symmetry, which leads to high computational complexity in the handoff procedure. Thus, a background-scanning-based Moving Average Forecasting Scheme (MAFS) and a Gradient Prediction Scheme (GPS) were proposed. MAFS first calculates the average of the received signal strength, counts the number of samples smaller than the average, and then triggers the handoff procedure. GPS starts the handoff procedure by computing the first derivative of the received signal strength. The combination of MAFS and GPS improves the efficiency of vertical handoff and shortens the disconnection time. An experiment in which a mobile terminal passes through a 3G network, a WLAN and their overlap area was designed and implemented. The experimental results show that the average handoff delay is reduced by 40% and the average handoff packet loss by 35%.
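The MAFS trigger described above can be sketched in a few lines; the window contents and vote threshold below are illustrative assumptions, not the paper's parameters:

```python
def should_handoff(rss_samples, threshold_count):
    """Moving-average forecasting sketch: trigger handoff when at least
    threshold_count samples in the window fall below the window average,
    indicating a sustained (not momentary) signal decline."""
    avg = sum(rss_samples) / len(rss_samples)
    below = sum(1 for s in rss_samples if s < avg)
    return below >= threshold_count
```

Counting samples below the average, rather than reacting to a single reading, is what filters out short fades and avoids ping-pong handoffs.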
P2P Super-peer Search Techniques Based on Hierarchical Quadrant Space
FENG Jin-xiao,CHEN Gui-hai,XIE Jun-yuan
Computer Science. 2010, 37 (3): 52-56. 
Current super-peer search adopts flooding or random walk for routing, and their efficiency is low. Meanwhile, existing systems do not effectively combine unstructured proximity search with structured data locating. This paper proposed a new super-peer topology based on hierarchical quadrant space, called Quad, and introduced two unstructured super-peer search methods. The first is the backtracking and expanding search (BES) technique, which attains a tradeoff between flooding and random walk; BES considers both network traffic and search length. Second, the Bloom filter technique is adopted to improve BES. Besides, Quad supports structured data locating. Simulations show that Quad's super-peer search methods outperform current super-peer search techniques in search success ratio and search cost. The Bloom filter can effectively increase search efficiency and reduce search length. Meanwhile, Quad is efficient in data locating.
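The Bloom filter adopted to improve BES is a standard structure: each item sets k hashed positions in a bit array, giving membership tests with no false negatives and a tunable false-positive rate. A minimal sketch (the double-hashing scheme via salted SHA-256 is an illustrative choice, not Quad's):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: m bits, k hash functions per item."""

    def __init__(self, m, k):
        self.m, self.k = m, k
        self.bits = bytearray((m + 7) // 8)

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        # All k bits set => "probably present"; any bit clear => definitely absent
        return all(self.bits[p // 8] >> (p % 8) & 1
                   for p in self._positions(item))
```

A peer can advertise a filter summarizing its keys in a few hundred bytes, letting neighbors skip forwarding queries that cannot possibly match.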
Analysis of End to End Congestion Control Mechanism in Narrowband Ad hoc Networks
NIU Da-wei,YU Wei-bo,WANG Hai,GUO Xiao
Computer Science. 2010, 37 (3): 57-60. 
Military wireless networks based on narrowband, half-duplex channels make it hard to use ACKs as an indication of congestion. The one-way end-to-end delay and packet delivery probability therefore become the important parameters for congestion prediction and avoidance. This paper analyzed and revealed the numerical relationship among congestion level, link-layer parameters and application-layer performance through an M/G/1 queueing model, thereby providing a theoretical basis for the design of congestion control algorithms.
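The M/G/1 model referenced above yields closed-form delay predictions; for instance, the Pollaczek-Khinchine formula gives the mean waiting time from only the arrival rate and the first two moments of service time. This is the standard textbook result, not the paper's specific derivation:

```python
def mg1_mean_wait(lam, mean_s, var_s):
    """Pollaczek-Khinchine mean waiting time for an M/G/1 queue:
    W = lam * E[S^2] / (2 * (1 - rho)), with rho = lam * E[S]
    and E[S^2] = Var[S] + E[S]^2. Requires rho < 1 for stability."""
    rho = lam * mean_s
    assert rho < 1, "queue must be stable (rho < 1)"
    es2 = var_s + mean_s ** 2
    return lam * es2 / (2 * (1 - rho))
```

The variance term makes the link-layer insight visible: at the same load, deterministic service (variance 0) halves the wait relative to exponential service.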
Practice-oriented Security Metric for Cryptographic Device under Electromagnetic Analysis
ZHANG Peng,DENG Gao-min, ZOU Cheng,CHEN Kai-yan,ZHAO Qiang
Computer Science. 2010, 37 (3): 61-63. 
To evaluate the security of cryptographic devices in risk environments full of electromagnetic analysis (EMA) adversaries, two novel adversaries, obtained by enhancing the adversary's ability in the classical cryptographic black-box model, were defined within the framework of the physically observable cryptography model: the key recovery adversary and the indistinguishability adversary, both of whom take advantage of electromagnetic emissions. For the former, security is evaluated quantitatively by the adversary's success ratio; for the latter, security is evaluated qualitatively by the adversary's advantage. Using the success-ratio metric, the attack abilities of several EMA distinguishers were compared. These two practice-oriented security metrics lay the foundation for further research and development of EMA-resistant cryptographic systems and devices.
Enhanced Approach to Anomalous Program Behaviors Detection
XIE Feng,XIE Li-xia
Computer Science. 2010, 37 (3): 64-66. 
Anomaly detection is an important method for protecting programs. Traditionally a program is protected by monitoring system calls, but the invocation address is often ignored. This paper presented a new audit event named L-Call to describe program behavior, which is in essence a system call together with its invocation address. A Chebyshev inequality-based method was also presented to evaluate the deviation of program behavior from normal. The deviation degree, which we call the anomaly degree, is based on the likelihood of an L-Call sequence occurring under the unknown distribution. Finally a Markov-based prototype named LC-ADS (L-Call based Anomaly Detection System) was constructed for evaluation. The experimental results show that LC-ADS achieves a better true positive rate and a lower false alarm rate.
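The Chebyshev-based deviation measure rests on the inequality P(|X - μ| ≥ t) ≤ σ²/t², which bounds how likely an observed deviation is under any distribution with the given mean and variance; a small bound marks the observation as anomalous. The mapping below from observation to score is an illustrative assumption, not the paper's exact formula:

```python
def chebyshev_bound(x, mean, var):
    """Upper bound on P(|X - mean| >= |x - mean|) via Chebyshev's
    inequality. A small bound means the observation x is unlikely
    under any distribution with this mean and variance."""
    dev2 = (x - mean) ** 2
    if dev2 == 0:
        return 1.0          # no deviation: nothing anomalous
    if var == 0:
        return 0.0          # any deviation from a constant is anomalous
    return min(1.0, var / dev2)
```

The appeal for intrusion detection is that the bound is distribution-free: it holds even though the true distribution of L-Call sequence likelihoods is unknown.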
Adaptive Proportional Fair Scheduling Scheme for Multi-input Multi-output Systems
TAN Li,SU Gang,ZHU Guang-xi,WANG Lin
Computer Science. 2010, 37 (3): 67-69. 
In MIMO systems, throughput is increased by always scheduling the subscribers with good channel conditions to transmit; this is multi-user diversity, but fairness over all subscribers is its weak point. In this paper, the fairness of MIMO systems was considered, and adaptive proportional fair scheduling schemes based on opportunistic beamforming were proposed to increase the throughput of subscribers in bad channel conditions while guaranteeing fairness over the whole system. In each timeslot, the adaptive parameters monitor the requested data rate and the mean of the requested data rate over the past period, and are updated on a per-user basis. The proposed schemes strike a tradeoff between throughput and fairness, and the simulation results show their validity.
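The proportional fair rule underlying such schemes selects, in each slot, the user maximizing the ratio of instantaneous rate to average throughput, then updates the averages with an exponential window. A minimal sketch of that baseline rule; the paper's adaptive per-user parameters are not modeled:

```python
def pf_schedule(rates, avg_thru, tc=100.0):
    """One slot of proportional fair scheduling. rates[i] is user i's
    instantaneous achievable rate; avg_thru[i] its average throughput.
    Picks argmax rates[i]/avg_thru[i], then updates the exponentially
    weighted averages (tc is the averaging window in slots)."""
    chosen = max(range(len(rates)), key=lambda i: rates[i] / avg_thru[i])
    for i in range(len(rates)):
        served = rates[i] if i == chosen else 0.0
        avg_thru[i] += (served - avg_thru[i]) / tc
    return chosen
```

Dividing by the average throughput is what protects users in bad channel conditions: a user starved for many slots sees its ratio grow until it wins a slot.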
Unified Approach to Construct Quantum Error-correcting Code
QIAN Jian-fa,MA Wen-ping
Computer Science. 2010, 37 (3): 70-72. 
Quantum error-correcting codes play an important role in both quantum communication and quantum computation. All kinds of cyclic codes, for example Hamming codes, BCH codes and Reed-Solomon codes, as well as constacyclic codes and quasi-cyclic codes, have been used to construct quantum error-correcting codes. A unified approach to constructing quantum error-correcting codes was presented using quasi-twisted codes. A necessary and sufficient condition for a quasi-twisted code to contain its dual code was given, together with a new method for constructing quasi-twisted codes. Moreover, new quantum quasi-twisted codes were obtained from quasi-twisted codes.
Provably Secure Multi-identity Single-key Decryption Scheme in the Standard Model
MING Yang,WANG Yu-min,PANG Liao-jun
Computer Science. 2010, 37 (3): 73-75. 
A Multi-Identity Single-Key Decryption (MISKD) scheme is a variant of Identity-Based Encryption (IBE) in which one private decryption key maps to multiple public keys (identities); more exactly, a single private key can be used to decrypt multiple ciphertexts encrypted with different public keys. Based on bilinear pairings, a MISKD scheme was presented and proven secure in the standard model. The proposed scheme was proven indistinguishable under adaptively chosen ciphertext and identity attacks under the decisional q-TBDHE assumption.
Research on New Generation Middleware Technology in the Open and Mobile Network Environment
LI Qi-lin,WANG Min-yi,ZHOU Ming-tian
Computer Science. 2010, 37 (3): 76-78. 
Combining the characteristics of the current network computing environment, research on new-generation middleware technology suitable for open and mobile networking environments was presented. Based on an analysis of the key characteristics of today's network environment, new technical requirements for new-generation middleware were identified. Subsequently, some key challenges for new-generation middleware, such as context awareness, environmental adaptation, open coordination, service discovery and pervasive interoperability, were discussed in detail. Further, some real solutions were shown as well.
Novel QoS-aware Multipath DYMO Routing Protocol in Mobile Ad Hoc Networks
HAN Bing-qing,CHEN Wei,ZHANG Hong
Computer Science. 2010, 37 (3): 79-82. 
DYMO is a dynamic on-demand routing protocol for mobile ad hoc networks, and it is a single-path routing protocol; however, multi-path routing can support QoS better than single-path routing. Firstly, the advantages and problems of the DYMO protocol were analyzed. Then a novel QoS-aware multi-path DYMO routing protocol (QA-DYMO) was proposed to provide QoS and multi-path routing; it establishes and utilizes multiple link-disjoint paths to send data packets concurrently. Finally, a QoS-aware routing algorithm was proposed. QA-DYMO can adapt to dynamic changes of the network and support QoS better. The simulation results show that the proposed protocol outperforms other pertinent protocols.
Data Synchronization of Authentication Protocols
DENG Miao-lei,HUANG Zhao-he,YANG Lu-shan,ZHOU Li-hua
Computer Science. 2010, 37 (3): 83-85. 
Data synchronization is a basic requirement for authentication protocols and authenticated key exchange protocols, but it is much trickier than it appears and is often overlooked. By carefully analyzing an RFID authentication protocol, an RFID authenticated key exchange protocol, and an authenticated key exchange protocol for mobile satellite communication systems, all currently believed to be provably secure, data synchronization attacks on these protocols were found. These attacks destroy the availability of the protocols. Furthermore, improvements to overcome the security vulnerabilities of these protocols were proposed.
Research on Program Behavior Model and Anomaly Detection Based on Multiple Abstraction
CHENG Xia,WANG Xiao-feng
Computer Science. 2010, 37 (3): 86-90. 
Efficient short-sequence models for anomaly analysis of program behaviors are not available in the anomaly detection field, and current models lack the ability to abstract program behaviors. Therefore, a new, highly self-explanatory pattern called the GV pattern (gapped variable frequent pattern) was proposed to cover the three fundamental program structures: sequence, selection and circulation. Subsequently, a GV pattern mining algorithm and a system-call flow chart model based on a GV pattern library were presented in detail. Experiments show that the anomaly detection algorithm based on the new model keeps detection overhead and the false positive rate low while maintaining a high detection rate, which is crucial in a real-time intrusion detection system.
HF Channel Simulation Algorithm Based on Improved Walnut Street Model
CAO Peng,JING Yuan,HUANG Guo-ce
Computer Science. 2010, 37 (3): 91-93. 
The traditional HF channel model, the Watterson model, can only simulate channel variations over short terms. In order to simulate channel variations over various terms, an HF channel simulation algorithm based on SNR reproduction was presented according to the statistical characteristics of the HF channel. With the Walnut Street model as its foundation, observed data were analyzed and processed, SNR variation models for intermediate and long terms were built, and, combined with the traditional Watterson channel model, the specified HF channel was simulated through SNR prediction. Simulation results show that this algorithm can simulate SNR variations in real channels, which provides a good simulation of channel quality for HF network simulation.
Kalman Filter-based Adaptive Hybrid FEC/ARQ Mechanism for Wireless Media Streaming
BAI Guang-wei,JIN Yong,ZHANG Peng
Computer Science. 2010, 37 (3): 94-98. 
This paper proposed an adaptive hybrid forward error correction (FEC)/automatic repeat request (ARQ) algorithm at the data link layer for wireless media streaming, so that receivers can obtain high-quality media streams. The algorithm, based on cross-layer optimization, predicts the packet loss rate using a Kalman filter and adjusts the FEC parameter N and the ARQ parameter Nmax. On the other hand, we used adaptive FEC at the application layer to adjust the sending rate of MPEG video frames adaptively and to allocate network bandwidth between the MPEG video source data and the redundant data dynamically. Both mathematical analysis and simulation demonstrate that the proposed mechanism can significantly improve the quality of media streaming, in terms of playable frame rate, reliability and real-time performance at the receiving side.
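The loss-rate prediction step can be illustrated with a one-dimensional Kalman filter under a random-walk state model; the noise variances q and r below are illustrative assumptions, not the paper's tuning:

```python
class ScalarKalman:
    """1-D Kalman filter tracking a slowly varying packet-loss rate.
    State model: loss rate is a random walk (process noise q);
    each slot yields a noisy measured loss rate z (measurement noise r)."""

    def __init__(self, x0=0.0, p0=1.0, q=1e-4, r=1e-2):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                  # predict: uncertainty grows
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)        # correct toward the measurement
        self.p *= (1 - k)                 # uncertainty shrinks after update
        return self.x
```

The filtered estimate, rather than the raw per-slot loss measurement, is what would drive the choice of the FEC block parameter and the retransmission limit, smoothing out measurement noise.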
Performance Research of Multi-relay Hybrid-forwarding Cooperative Transmission Scheme
YANG Bo,YU Hong-yi,FENG Qiang
Computer Science. 2010, 37 (3): 99-101. 
To overcome the shortcomings of decode-and-forward and amplify-and-forward transmission, a multi-relay hybrid-forwarding cooperative transmission scheme (MRHF-CTS) was proposed to optimize the performance of cooperative communication systems; it selects the forwarding style according to the decoding result at relay nodes and uses the broadcast channels among relay nodes to improve the transmission reliability of multi-relay wireless networks. The performance characterization in terms of outage probability was developed. The results show that, in contrast to the decode-and-forward and amplify-and-forward schemes, MRHF-CTS offers superior performance, and in the high signal-to-noise ratio (SNR) regime it attains the optimal performance of decode-and-forward transmission.
Application of Automated Negotiation Based on Policy in Delegation Authorization of Distributed Environment
WU Xiao-nian,ZHANG Run-lian,MA Chun-bo,ZHOU Sheng-yuan
Computer Science. 2010, 37 (3): 102-105. 
The grid system authorizes through a delegation model to adapt well to the distributed environment, but the dynamic change of the grid can break the global consistency of privileges between different security domains in the delegation model. To address this problem, this paper introduced an automated negotiation mechanism based on policies. In order to detect the problem in time, negotiate the privileges quickly and restore the global consistency of privileges between the corresponding security domains, the mechanism defines a set of policy rules which automate the negotiation process, and presents a state transition diagram that the system should follow. Driven by triggers, the mechanism automatically performs the negotiation state transitions, and enforces privilege negotiation and reauthorization between the negotiating parties. The test results show that, compared with a negotiation process conducted by people, the automated negotiation mechanism improves the efficiency of the solution and system performance, and simplifies the administrators' security administration work.
Structured Multi-Agent Model for P2P MMOG
LUO Jia,CHANG Hui-you,YI Yang
Computer Science. 2010, 37 (3): 106-111. 
To address the contradiction between the increasing requirement of resources and the limited load capacity of servers in Massively Multiplayer Online Games (MMOG), a Structured Multi-Agent (SMA) model with load balancing was proposed. The theory of P2P MMOG was defined for SMA, and three core algorithms, a node joining algorithm, a neighbor discovery algorithm and a cross-domain algorithm, were constructed. These algorithms theoretically guarantee the consistency of the states of all resources. In order to implement load balancing, SMA assigns the processing licenses of resources to all nodes in an Area of Interest (AOI). Analysis of scalability, response speed, load consumption and other metrics shows that SMA has good performance.
Analysis of the Indoor Positioning Systems in Pervasive Environment
LIANG Yun-ji,ZHOU Xing-she,YU Zhi-wen,NI Hong-bo
Computer Science. 2010, 37 (3): 112-116. 
Location-aware applications are a compelling research field in pervasive computing, and the current location of an object is one of the most important contexts. Numerous prototype positioning systems have been built to locate objects in different settings and applications. This paper described the key techniques and methods of indoor positioning, and the pros and cons of different methods were discussed. Based on an analysis of existing indoor positioning systems, we pointed out the hurdles and challenges for indoor positioning systems in pervasive environments.
Active Detecting Method against SYN Flooding Attacks
LI Hai-wei,ZHANG Da-fang,LIU Jun,YANG Xiao-bo
Computer Science. 2010, 37 (3): 117-120. 
SYN Flood attacks pose great danger to normal network operation. Many studies detect the attack by analyzing the self-similarity of network traffic; however, such methods may be ineffective against SYN Flood. By analyzing high-precision traces captured by DAG cards, we proposed a new SYN Flood detection mechanism based on active detection. It brings the packet-pair technique to abnormal traffic detection, detecting SYN Flood according to changes in the background flow length. The method achieves an 88% SYN attack detection rate in our experiments. This method is based on end-to-end technology, which gives it better flexibility and controllability.
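The packet-pair principle that the mechanism borrows is simple: two back-to-back probe packets are spread apart by the bottleneck link, and the observed dispersion gap estimates the path bandwidth, so a change in dispersion reveals a change in the competing background traffic. A one-line sketch of the estimator:

```python
def packet_pair_bandwidth(size_bytes, gap_seconds):
    """Packet-pair bandwidth estimate: the bottleneck needs
    size_bytes*8 bits of transmission time per packet, so the
    inter-arrival gap of two back-to-back probes gives its rate."""
    return size_bytes * 8 / gap_seconds   # bits per second
```

For example, two 1500-byte probes arriving 1 ms apart indicate a bottleneck of about 12 Mbit/s; an attack that inflates the background flow widens this gap.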
Secure Multi-domain Policy Integration Method Based on Ontology Similarity
JIN Li,LU Zheng-ding
Computer Science. 2010, 37 (3): 121-124. 
Extracting the semantic meaning of local access control policies is an effective way to avoid conceptual and logical conflicts in multi-domain policy integration. In view of domain ontology, a secure policy integration method based on ontology similarity was proposed. Using a Bayesian machine learning algorithm, it can self-adaptively construct a secure multi-domain interoperation model that satisfies the autonomy and cooperation of all domains.
Performance Analysis of Spectrum Sensing in Cognitive Radio Network
DING Han-qing,YANG Jia-wei,ZHAO Zhi-yuan
Computer Science. 2010, 37 (3): 125-127. 
We analyzed the performance of spectrum sensing with an energy detector in cognitive radio networks. Simulation results indicate that fading decreases the detection performance of an individual cognitive radio user, and that hard-combination cooperative spectrum sensing can improve the system's detection probability; however, the system's false alarm probability increases as the number of cooperative users grows. In particular, when control channels are not reliable, a lower bound on the false alarm probability appears and results in decreased spectrum utilization.
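The tradeoff described in the abstract is visible even in the simplest hard-combination scheme, the OR fusion rule: with n users sensing independently, both the detection and false-alarm probabilities grow as 1-(1-p)^n. A sketch of that textbook rule (not the paper's specific simulation setup):

```python
def or_rule(p_detect, p_false, n):
    """OR-rule hard combining for n independent cooperative sensors:
    the fusion center declares the channel occupied if ANY user does,
    which raises detection probability and false-alarm probability alike."""
    pd = 1 - (1 - p_detect) ** n
    pf = 1 - (1 - p_false) ** n
    return pd, pf
```

With per-user probabilities 0.5/0.1, two users already push detection to 0.75 at the cost of nearly doubling the false-alarm rate, matching the trend the simulations report.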
Cloud Trust Model for Wireless Sensor Networks
MA Bin,XIE Xian-zhong
Computer Science. 2010, 37 (3): 128-132. 
In wireless sensor networks, it is difficult to reduce the rate of trust risk. This paper used the concepts of cloud theory to estimate the dynamic context and accordingly presented the definition of a risk signal, and a cloud trust model based on risk evaluation was proposed for wireless sensor networks to solve this problem, in which the uncertainty of dynamic context relationships is considered. Risk is evaluated using cloud theory and quantified using risk and trust uncertainty degrees presented in a uniform form. The simulation results show that the proposed trust model based on risk evaluation can efficiently express the uncertainty of risk and trust and decrease the trust risk of nodes; it can thus evidently reduce the rate of trust risk and enhance the successful cooperation ratio of the system.
Immune Detector Generating Algorithm Based on Self-mutation
CHEN Zhe,ZHOU Yan-zhou,LU Zhi-guo
Computer Science. 2010, 37 (3): 133-137. 
As a novel branch of computational intelligence, artificial immune systems have strong information processing and problem-solving capabilities. Detector generation is the key technology in constructing artificial immune systems and has been a research hotspot in computational intelligence. This paper analyzed traditional detector generating algorithms. Based on a self-mutation mechanism and template technology, a self-mutation detector generating algorithm (SMDGA) was proposed. The principle, template definition and implementation steps of SMDGA were described, and its performance and complexity were analyzed. Both mathematical analysis and experiments show that SMDGA has advantages in reducing the size of the detector set and improving detection efficiency.
Segmental Measurement Based Approach to Estimate Internet End-to-end Performance Parameter
MA Zhen-yuan,ZHOU Jie,CHEN Chu,ZHANG Ling
Computer Science. 2010, 37 (3): 138-140. 
The traditional measurement-based method for estimating Internet performance parameters requires measurement points to be at the start and end of an end-to-end path. This condition can make measurement aimless and increase network traffic; meanwhile the estimation may be hard to implement, and the results may be less accurate. This paper proposed a novel approach based on segmental measurement to estimate delay, delay variation, loss rate and error rate, as defined by ITU-T. As each measurement can be used in several end-to-end path performance parameter estimations, aimless measurement is avoided and network traffic is cut down significantly. Three modules, a packet and path generator, a parameter measurement machine and a result collector, were designed and implemented on PlanetLab. Feasibility was demonstrated by deployment on more than 100 worldwide nodes, and the running data collected by the result collector confirmed its accuracy.
Microwave Fire Detection and System Design Based on Low-altitude Aircraft Platform
ZHANG Lei,LU Jian-hua,LIANG Xin-gang
Computer Science. 2010, 37 (3): 141-143. 
Abstract PDF(268KB) ( 726 )   
RelatedCitation | Metrics
Microwave remote sensing enables all-day, all-weather land observation, so microwave technology has a certain advantage in forest fire detection. In this paper, in order to determine the region where a fire occurs, the theories of microwave radiation and reception were analyzed, and a broadband microwave receiver system solution was designed. To meet the needs of emergency and flexible detection, a GPS receiver module and an inertial navigation unit for the low-altitude platform were designed to ensure the reliability and stability of microwave forest fire detection.
Custom Instructions Identification Based on Code Profiling
BO Shi,GE Ning,LIN Xiao-kang
Computer Science. 2010, 37 (3): 144-148. 
Abstract PDF(444KB) ( 746 )   
RelatedCitation | Metrics
Code profiling is a key technique for application behavior analysis and bottleneck discovery. According to the requirements of designing custom instructions for reconfigurable processors, AID-prof, a novel profiling method based on a virtual machine, was presented. The benefits of the presented method are architecture independence and close combination of static and dynamic analysis. Based on AID-prof, an automatic custom instruction identification procedure named CID was proposed. Experimental results show that AID-prof can discover application hot spots effectively, and custom instructions identified by CID can markedly speed up application execution.
Ontology-based Knowledge Extraction of Relational Data
ZHANG Guo-qiang,JIA Su-ling,WANG Qiang
Computer Science. 2010, 37 (3): 149-151,164. 
Abstract PDF(345KB) ( 715 )   
RelatedCitation | Metrics
To convert massive relational data into knowledge, the paper studied how to extract relational data to construct an OWL DL ontology. Based on the formalization of relational data records and knowledge ontology, an algorithm was designed to realize the extraction and conversion. Some key issues such as identity and relationship tables were studied. Finally a case was designed to test and verify the algorithm. The proposed method can help ontology developers extract relational data into knowledge more easily, and provides a rapid way to build ontologies for the realization of knowledge management.
Adaptive Method of Computing Data Stream Aggregation
HOU Dong-feng,LIU Qing-bao,ZHANG Wei-ming, DENG Su
Computer Science. 2010, 37 (3): 152-155,169. 
Abstract PDF(428KB) ( 658 )   
RelatedCitation | Metrics
A method based on an Adaptive Hierarchy Aggregation tree (AHA-Tree) was presented for computing aggregation over data streams. The structure of the AHA-Tree borrows the idea of the multiple-time-granularity hierarchical window model: recent data is kept at fine granularity and older data at rough granularity. In addition, the partition of granularities is determined by the density of time units: sparse time units are kept at rough granularity and dense ones at fine granularity. Moreover, maintenance and aggregate-computing methods were proposed for aggregation queries. Experiments show that the method is efficient in processing non-uniformly distributed data.
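The multi-granularity window idea can be sketched as follows, keeping the newest buckets fine-grained and pairwise-merging older ones; the bucket layout and names are hypothetical (the AHA-Tree additionally adapts the partition to time-unit density).

```python
# Minimal sketch of a multi-granularity aggregation window over a stream.
# Each bucket is (width_in_time_units, aggregated_sum), oldest first.

def merge_old(buckets, keep_fine):
    """Keep the newest `keep_fine` buckets at fine granularity and
    pairwise-merge the older ones into coarser buckets. Aggregated
    sums are preserved exactly; only time resolution is lost."""
    old, recent = buckets[:-keep_fine], buckets[-keep_fine:]
    merged = []
    for i in range(0, len(old) - 1, 2):
        merged.append((old[i][0] + old[i + 1][0], old[i][1] + old[i + 1][1]))
    if len(old) % 2:                    # leftover odd bucket stays as-is
        merged.append(old[-1])
    return merged + recent

buckets = [(1, 5), (1, 3), (1, 7), (1, 2), (1, 4), (1, 6)]
coarse = merge_old(buckets, keep_fine=2)
```

The total aggregate is unchanged after merging, which is why coarse historical buckets still answer aggregation queries correctly, just at lower time resolution.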
WDB Data Extraction Based on Semantic Support
GAO Ming,WANG Ji-cheng,LI Jiang-feng
Computer Science. 2010, 37 (3): 156-158,174. 
Abstract PDF(334KB) ( 637 )   
RelatedCitation | Metrics
The paper presented an algorithm which fills the query interface by using machine learning, based on an analysis of the mechanism of Deep Web queries. The algorithm is able to extract data automatically. Firstly, a 2D table is constructed whose columns are the controls extracted from pages of the Deep Web query interface. Then the table is filled by assigning values to all the controls. Next, a classifier is learned according to whether each data extraction succeeds or not. Finally, the data is extracted by automatically constructing request strings from the results of the learning. The experiment shows that the algorithm runs effectively.
Code Similarity Detection Approach Based on Back-propagation Neural Network
XIONG Hao,YAN Hai-hua,HUANG Yong-gang,GUO Tao,LI Zhou-jun
Computer Science. 2010, 37 (3): 159-164. 
Abstract PDF(515KB) ( 800 )   
RelatedCitation | Metrics
It is very important to find plagiarized programs in the field of computer science education. Traditional methods for program similarity use attribute counting or structure information to detect plagiarism. This paper presented a program similarity detection approach based on back-propagation (BP) multi-layer feed-forward neural networks. We extracted seven compared features of the code as the input of the neural network, and obtained the program similarity through the network calculation. Comparing the result with a threshold value, we can find all groups of similar programs. Experimental results show that our method is effective.
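The forward pass of such a network can be sketched as below, with fixed hand-picked weights; in the paper the weights are trained with back-propagation, and the seven feature values here are placeholders, not the paper's actual metrics.

```python
import math

# Illustrative sketch: a single-hidden-layer feed-forward network mapping
# seven code features to a similarity score in (0, 1). Training is omitted.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def similarity(features, w_hidden, w_out):
    """features: 7 values; w_hidden: one 7-weight row per hidden unit;
    w_out: one weight per hidden unit. Sigmoid activations throughout."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

feats = [0.9, 0.8, 0.7, 0.95, 0.6, 0.85, 0.75]              # 7 compared features
w_h = [[0.1 * (i + j) for j in range(7)] for i in range(3)]  # 3 hidden units
score = similarity(feats, w_h, [0.5, -0.2, 0.8])
```

Because the output is a sigmoid, the score always lies strictly between 0 and 1 and can be compared against a plagiarism threshold directly.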
Monitoring Framework for Software Properties Based on Aspect-oriented Programming
LEI Yan,MAO Xiao-guang,WANG Cheng-song
Computer Science. 2010, 37 (3): 165-169. 
Abstract PDF(435KB) ( 700 )   
RelatedCitation | Metrics
Aiming at improving software dependability at runtime, this paper presented a monitoring framework for software properties based on AOP (Aspect-Oriented Programming). Independently of the target software, this framework automatically generates monitoring aspects from software properties described by OCL (Object Constraint Language) and the UML Profile for SPT (Schedulability, Performance, and Time) specification. By weaving the aspects into the target software, the approach gives the target software the ability to monitor software properties at runtime.
Detecting Invalid Mappings between Relational Database and Ontology
TANG Fu-nian,YAO Li,QI Xue-tian,XIAO Qing-tao
Computer Science. 2010, 37 (3): 170-174. 
Abstract PDF(463KB) ( 649 )   
RelatedCitation | Metrics
The semantic mappings between a relational database and an ontology are critical to the implementation of interoperability between them. However, these mappings become invalid if the schema (or ontology) evolves, and finding the affected mappings when a change occurs is a prerequisite. In this paper, by comparing the new and old versions of the schema (ontology), rational evolution paths between the two versions were discovered, and a set of changed elements was detected. Based on this set, a query testing set can be designed, and queries in the testing set can be reformulated by virtue of the corresponding mappings. Finally, if a reformulated query cannot return a valid result set, the mappings used in reformulating it are proved to be invalid.
Semantic Web Service-oriented Semantic Program Transformation Approach
WANG Quan-yu,YING Shi,LU Guo-bin,ZHAO Kai
Computer Science. 2010, 37 (3): 175-177,181. 
Abstract PDF(372KB) ( 607 )   
RelatedCitation | Metrics
Semantic program transformation is the foundation of Semantic-Web-Service-oriented software design. A semantic program cannot be executed and called by the runtime environment until it passes through program transformation, yet there has been no effective semantic program transformation method. Aiming at this problem, this paper presented a Semantic-Web-Service-oriented semantic program transformation method based on the semantic program language SPL. This method effectively transforms semantic information such as semantic data types, semantic rules, semantic services and semantic flows, which not only enhances the flexibility and robustness of service-oriented program design, but also increases the flexibility and reusability of the business process.
Generic&Semantic-based Data Extraction Approach
ZHANG Jian-ying,SUN Yong-jie,WANG Xiu-kun
Computer Science. 2010, 37 (3): 178-181. 
Abstract PDF(346KB) ( 689 )   
RelatedCitation | Metrics
A relational database can be viewed as a directed graph constructed from tuples and foreign key references. To facilitate data replication and sharing, semantic relativity and logical independence should be satisfied when relational data is extracted. In this paper, multi-tree structures are employed as clusters of such data extracted from a relational database. The corresponding data extraction approach was then proposed and implemented. We evaluated the extraction algorithm on a TPC-C database in Oracle, demonstrating the effectiveness and generality of the approach.
Path-partitioned Encoding Optimizes Twig Queries
XU Xiao-shuang,FENG Yu-cai,WANG Feng,ZHOU Ying-biao,ZHANG Jun
Computer Science. 2010, 37 (3): 182-187,204. 
Abstract PDF(569KB) ( 574 )   
RelatedCitation | Metrics
Effectively storing and querying XML documents has become a hot research topic in the database domain. In light of the path summary, a path-partitioned encoding scheme was proposed to store an XML document; it is useful for eliminating descendant axes and wildcards in twig queries. For twig queries without "//" or "*", a new query algorithm was developed based on structure-constrained nodes, so the number of structural joins decreases dramatically. The experimental results indicate that the algorithm can significantly filter useless elements and improve the performance of twig queries.
Method of Automatic Semantic Web Services Composition Based on AND/OR Graph
LU Jin-yun,ZHANG Wei-qun
Computer Science. 2010, 37 (3): 188-190,261. 
Abstract PDF(329KB) ( 631 )   
RelatedCitation | Metrics
A single Web service provides only limited functionality, so Web services composition has become an important research topic in the Web services application domain. This paper proposed an approach based on AND/OR graphs to compose semantic Web services automatically. It adds semantics to Web services, restricts the search to the smaller space of the service composition AND/OR graph, and finds the best composition graph within it, thereby optimizing services composition. The results of extensive experiments show that this method improves both the success ratio and the efficiency of Web services composition.
Average Computational Time Complexity Optimized Dynamic Particle Swarm Optimization Algorithm
WANG Qin,LI Lei,LU Cheng-yong,SUN Fu-ming
Computer Science. 2010, 37 (3): 191-194,288. 
Abstract PDF(412KB) ( 1779 )   
RelatedCitation | Metrics
The Particle Swarm Optimization algorithm is widely used in many fields, including situations with strict real-time requirements such as wide-band digital signal processing. In the traditional PSO algorithm a great number of particles must be updated in each iteration, so the average computational time complexity is high. This property leads to serious delay, which makes the traditional algorithm unusable in systems with strict real-time requirements; its average computational time complexity should therefore be reduced with negligible performance penalty. We proposed a Dynamic PSO (DPSO) algorithm with an adjustable particle quantity, whose core is a criterion for discarding particles. During iterations, some particles are discarded dynamically to reduce the average computational time complexity. Furthermore, mutation is applied to the local best position in iterations to avoid being trapped in a local optimum. Simulation results and theoretical analysis show that the average computational time complexity of the DPSO algorithm is reduced by about 30% compared with the traditional PSO algorithm, under the condition of a similar optimum solution. In terms of performance, DPSO matches the traditional PSO for single-modal optimization and outperforms it for multimodal optimization.
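The discarding idea can be illustrated with a hedged sketch: a standard PSO on a 2-D sphere function that simply drops the worst particles partway through the run to cut average per-iteration cost. The discard criterion, schedule and parameters below are placeholders; the paper's criterion and its mutation of the local best position are more elaborate.

```python
import random

# Sketch of dynamic particle discarding in PSO (assumed toy parameters).
random.seed(0)

def sphere(x):
    return sum(v * v for v in x)

def dpso(n=30, dim=2, iters=100, drop_at=50, keep=15):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)[:]
    for t in range(iters):
        if t == drop_at:                 # discard the worst particles once
            order = sorted(range(len(pos)), key=lambda i: sphere(pbest[i]))
            idx = order[:keep]
            pos = [pos[i] for i in idx]
            vel = [vel[i] for i in idx]
            pbest = [pbest[i] for i in idx]
        for i in range(len(pos)):
            for d in range(dim):         # standard inertia-weight update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return sphere(gbest)

best = dpso()
```

After the drop, each iteration updates half as many particles, which is where the average time-complexity saving comes from.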
Fuzzification of Lower and Upper Approximations in Incomplete Information Systems
YANG Ji-lin,QIN Ke-yun,PEI Zheng
Computer Science. 2010, 37 (3): 195-198. 
Abstract PDF(320KB) ( 600 )   
RelatedCitation | Metrics
The uncertainty of objects is considered in incomplete information systems. Based on this uncertainty, a method in which the lower and upper approximations are fuzzified, i.e., the lower and upper approximations form a pair of fuzzy sets, was discussed in the incomplete information system. On one hand, the lower and upper approximations are decided by the similarity relation; on the other hand, objects are uncertain because of the unknown value "*". Therefore, the uncertainty degree of an object and the similarity degree among objects were defined, and the membership degree of an object belonging to the lower and upper approximations was given by the similarity degree. Formally, this process can be considered as fuzzification of the lower and upper approximation sets. Finally, an example shows that the membership degrees describe more visually and reasonably how objects belong to the lower and upper approximations of a set.
Text Hierarchical Clustering Based on Several Domain Ontologies
ZHANG Ai-qi,ZUO Wan-li,WANG Ying,LIANG Hao
Computer Science. 2010, 37 (3): 199-204. 
Abstract PDF(498KB) ( 757 )   
RelatedCitation | Metrics
Traditional clustering methods are usually based on the similarity of keywords appearing in documents. Since these methods may lose a lot of semantic information, their clustering results are not accurate enough and they often require a large amount of computation. A new method for hierarchically clustering documents based on several domain ontologies was proposed. This method first transforms keyword-based vectors into corresponding concept-based vectors by making use of several domain ontologies. Then a formula is given for calculating similarities between documents. An algorithm for document clustering based on several domain ontologies was proposed and its corresponding prototype system was designed and implemented. The experimental results show that our method can express and process documents from the perspective of concept semantics. It decreases the amount of computation by reducing the dimension of the space of clustered objects and improves both the accuracy and the efficiency of document clustering.
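The keyword-to-concept transformation step can be sketched as follows, using a made-up toy ontology (a dictionary mapping keywords to concepts) and cosine similarity over the resulting concept vectors; the paper's ontologies and similarity formula are richer than this.

```python
import math

# Toy ontology: maps keywords to concepts (fabricated for illustration).
ONTOLOGY = {"car": "vehicle", "truck": "vehicle", "rose": "flower"}

def to_concept_vector(keyword_weights):
    """Fold keyword weights into concept weights via the ontology;
    keywords without a concept keep their own name."""
    cv = {}
    for kw, w in keyword_weights.items():
        concept = ONTOLOGY.get(kw, kw)
        cv[concept] = cv.get(concept, 0.0) + w
    return cv

def cosine(a, b):
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

d1 = to_concept_vector({"car": 1.0})
d2 = to_concept_vector({"truck": 1.0})
sim = cosine(d1, d2)   # distinct keywords, same concept
```

Two documents sharing no keywords at all ("car" vs "truck") become identical in concept space, which is exactly the semantic information that keyword-only clustering loses.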
Constraint Specification of Weakly Hard Real-time Systems Based on Smooth Scheduling
ZHU Xu-dong,CHANG Hui-you,YI Yang,TAO Qian
Computer Science. 2010, 37 (3): 205-207,291. 
Abstract PDF(340KB) ( 644 )   
RelatedCitation | Metrics
Constraint specification is the foundation stone of research on weakly hard real-time systems. Starting from the definitions of weakly hard real-time systems, this paper put forward a new constraint specification which can achieve smooth scheduling more efficiently. The paper also presented and proved an important theorem on strictness comparison of weakly hard real-time constraint specifications. The results show that the new specification works well and has wide application.
Competitive Hopfield Network Combined with Variable Neighborhood Search for Maximum Diversity Problems
ZHOU Ya-lan,WANG Jia-hai,BI Wei,MO Bin,LI Shu-guang
Computer Science. 2010, 37 (3): 208-211,252. 
Abstract PDF(384KB) ( 624 )   
RelatedCitation | Metrics
A discrete competitive Hopfield neural network (DCHNN) combined with variable neighborhood search (VNS) was proposed for the maximum diversity problem. In order to overcome the local-minima problem of DCHNN, the idea of VNS was introduced into DCHNN. Once the network is trapped in a local minimum, VNS generates a new search neighborhood for DCHNN. Thus, the proposed algorithm can escape from local minima and search further for better results. Finally, simulation results on the maximum diversity problem show that the proposed algorithm has good performance.
Self-similarity Clustering Event Detection Based on Triggers Guidance
ZHANG Xian-fei,GUO Zhi-gang,LIU Song,CHENG Lei,TIAN Yu-xuan
Computer Science. 2010, 37 (3): 212-214,220. 
Abstract PDF(332KB) ( 931 )   
RelatedCitation | Metrics
The traditional method of Event Detection and Characterization (EDC) regards the event detection task as a classification problem. It takes words as samples to train a classifier, which can lead to an imbalance of positive and negative samples; it also suffers from data sparseness when the corpus is small. This paper did not classify events using words as samples, but clustered events when judging event types. It adopted self-similarity to converge the value of K in the K-means algorithm under the guidance of event triggers, and optimized the clustering algorithm. Then, combined with named entities and their relative position information, the new method further pinpoints the event type. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can be used in automatic text summarization, text retrieval, and topic detection and tracking.
Particle Swarm Optimization with Chaotic Mutation
ZHU Hong-qiu,YANG Chun-hua,GUI Wei-hua,LI Yong-gang
Computer Science. 2010, 37 (3): 215-217. 
Abstract PDF(260KB) ( 1275 )   
RelatedCitation | Metrics
To overcome the low convergence speed and premature convergence of particle swarm optimization during the later stage of computation, a chaotic particle swarm optimization (CPSO) was proposed. To improve the ability to break away from local optima and find the global optimum, the non-winner particles are mutated by chaotic search, and the global best position is mutated with a small disturbance according to the variance ratio of the population's fitness. Numerical simulations comparing against the standard PSO were performed using complex high-dimensional benchmark functions. The results show that the proposed algorithm can effectively improve the global searching ability and is much better at avoiding premature convergence.
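Chaotic search is commonly driven by the logistic map; the sketch below generates a chaotic sequence and maps it onto a search interval to mutate a particle. The map choice and the mutation scheme are an assumption for illustration, since the abstract does not specify which chaotic system the paper uses.

```python
# Sketch of chaotic mutation via the fully chaotic logistic map
# x_{k+1} = mu * x_k * (1 - x_k), which stays in (0, 1) for mu = 4.

def logistic_sequence(x0, n, mu=4.0):
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def chaotic_mutate(position, bounds, chaos):
    """Replace each coordinate by a chaotic value scaled into its bounds."""
    return [lo + c * (hi - lo) for (lo, hi), c in zip(bounds, chaos)]

chaos = logistic_sequence(0.3, 3)
mutant = chaotic_mutate([0.0, 0.0, 0.0], [(-5, 5)] * 3, chaos)
```

Because the logistic sequence is ergodic over (0, 1), repeated mutations eventually probe the whole interval, which is what lets trapped particles escape local optima.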
Affected Batches Rescheduling Algorithm for Multipurpose Batch Process
SHU Sheng,YU Hai-jie
Computer Science. 2010, 37 (3): 218-220. 
Abstract PDF(250KB) ( 541 )   
RelatedCitation | Metrics
An Affected Batch Rescheduling (ABR) algorithm for multipurpose batch processes was proposed to tackle batch delays. A combination of quality and stability was used as the performance index of the rescheduling solution. The ABR algorithm delays all affected batches based on the processes defined in the state task network (STN). A simulation case illustrated that the ABR algorithm outperforms the Right Shift Rescheduling (RSR) algorithm in terms of makespan and the total delay of the start times of affected batches.
Research on RBFCM-based Heuristic Coordination Algorithm
PENG Zhen,YANG Bing-ru,XIE Yong-hong
Computer Science. 2010, 37 (3): 221-224. 
Abstract PDF(311KB) ( 644 )   
RelatedCitation | Metrics
A heuristic coordinator for knowledge discovery can simulate the "creating intent" feature of cognitive psychology and enhance the system's self-cognition ability. To improve the cognitive features and performance of the heuristic coordinator, the paper proposed a new heuristic coordination algorithm, which uses a rule-based fuzzy cognitive map to represent knowledge and perform effective inference, in order to obtain the non-association state of the knowledge base for directed mining of missing knowledge in massive databases. The experiment demonstrates that, compared with the directed-hypergraph-based method, the approach effectively reduces the search space and increases the intelligence of knowledge discovery.
Algorithm Based on Genetic Algorithm for Sudoku Puzzles
LIU Yan-feng,LIU San-yang
Computer Science. 2010, 37 (3): 225-226,233. 
Abstract PDF(243KB) ( 1347 )   
RelatedCitation | Metrics
To solve Sudoku puzzles, they were first transformed into a combinatorial optimization problem. Then, a genetic algorithm with specialized encoding, initialization and a local search operator was presented to optimize it. The experimental results show the algorithm is effective for Sudoku puzzles of all difficulty levels.
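One common way to cast Sudoku as combinatorial optimization, sketched below under the assumption of a row-permutation encoding (each candidate keeps every row a permutation of 1..9, so only column and box conflicts remain to be minimized); the paper's actual encoding and operators may differ.

```python
# Sketch: fitness of a Sudoku candidate = number of duplicated values
# in columns and 3x3 boxes. A true solution has cost 0.

def conflicts(grid):
    """grid: 9x9 list of lists, each row a permutation of 1..9."""
    cost = 0
    for c in range(9):                         # column duplicates
        col = [grid[r][c] for r in range(9)]
        cost += 9 - len(set(col))
    for br in range(0, 9, 3):                  # box duplicates
        for bc in range(0, 9, 3):
            box = [grid[br + i][bc + j] for i in range(3) for j in range(3)]
            cost += 9 - len(set(box))
    return cost

# A known-valid grid (row i is 1..9 cyclically shifted) has zero cost.
shifts = [0, 3, 6, 1, 4, 7, 2, 5, 8]
solved = [[(s + j) % 9 + 1 for j in range(9)] for s in shifts]
cost = conflicts(solved)
```

A GA then evolves row permutations (respecting the given clues) to drive this cost to zero, with crossover swapping rows and a local search operator repairing column or box conflicts.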
Dominance Relation-based VPRSM and its Application in Fuzzy Objective Information Systems
HUANG Bing,ZHOU Xian-zhong,SHI Ying-chun
Computer Science. 2010, 37 (3): 227-229,241. 
Abstract PDF(335KB) ( 630 )   
RelatedCitation | Metrics
In group decision-making theory, how to acquire reasonable decision rules is an important issue. This paper constructed a dominance relation-based variable precision rough set model (VPRSM) in fuzzy objective information systems, and proposed knowledge reduction definitions such as dominance-based lower/upper distribution reduction. Using a heuristic function, we devised a lower distribution reduction algorithm. Finally, this model was applied to computer audit risk appraisal and some reasonable appraisal rules were acquired.
Statistical Analysis for Chinese-English Verb Subcategorization
HAN Xi-wu,ZHAO Tie-jun
Computer Science. 2010, 37 (3): 230-233. 
Abstract PDF(352KB) ( 861 )   
RelatedCitation | Metrics
Based on a large-scale Chinese-English parallel corpus, this paper described a systematic experiment in statistical analysis of bilingual verb subcategorization. Firstly, with lexical and grammatical compatibility as heuristics, the probabilistic distributions of 654 bilingual subcategorization frames were estimated by means of a two-fold MLE filtering method. Then the linguistic classification of the frames was determined according to Chinese and English syntax. Finally, linguistic classes for each frame were labeled via SVM on the basis of their supporting corpus.
Workflow Mining Optimization Based on Hybrid Adaptive Genetic Algorithm
GU Chun-qin,TAO Qian,WU Jia-pei,CHANG Hui-you,YAO Qing-da,YI Yang
Computer Science. 2010, 37 (3): 234-238. 
Abstract PDF(400KB) ( 594 )   
RelatedCitation | Metrics
Current workflow mining algorithms using local strategies cannot ensure that a globally optimal process model is mined, and they are also sensitive to noise. To solve these problems, a hybrid adaptive genetic algorithm (HA-GA) was proposed. Firstly, the Elementary Workflow net (EW-net) was defined, its enabling and firing rules were given, and the process model was described. Secondly, a converting algorithm was proposed to convert the process model to an EW-net, and an evaluation function of individual fitness was presented to measure the compliance between the event log and the mined process model. Lastly, hybrid adaptive crossover and mutation rates were designed according to the evolution stage and the parents' similarity. The simulation results demonstrate that the new algorithm has noise immunity, is more robust than the α-algorithm, and can find better solutions and converge faster than the simple genetic algorithm (SGA) employing a general genetic strategy.
Feature Weighting Based Event Argument Identification
FU Jian-feng,LIU Zong-tian,LIU Wei,SHAN Jian-fang
Computer Science. 2010, 37 (3): 239-241. 
Abstract PDF(245KB) ( 674 )   
RelatedCitation | Metrics
Event extraction is a task of the Automatic Content Extraction (ACE) program, and event argument identification is a sub-task of event extraction. The state of the art of event extraction and event argument identification was reviewed. An algorithm named Feature Weighting Based Event Argument Identification (FWEAI) was proposed, which improves ReliefF, a feature selection algorithm for classification, and employs it in clustering. The improved ReliefF (FWA) can assign different weights to different features according to their contributions to clustering; event arguments are then clustered using the K-means clustering algorithm. Experimental results demonstrate that the FWEAI algorithm improves the precision of event argument identification.
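The effect of feature weighting on clustering can be sketched with a weighted Euclidean distance in the K-means assignment step; the weights below are hand-picked for illustration, whereas FWEAI would learn them with its improved ReliefF.

```python
# Sketch: cluster assignment under a weighted distance, so features with
# larger (ReliefF-style) weights dominate the grouping.

def weighted_dist2(a, b, w):
    return sum(wi * (ai - bi) ** 2 for ai, bi, wi in zip(a, b, w))

def assign(points, centroids, w):
    """Return the index of the nearest centroid for each point."""
    return [min(range(len(centroids)),
                key=lambda k: weighted_dist2(p, centroids[k], w))
            for p in points]

points = [(0.0, 10.0), (1.0, 0.0)]
cents = [(0.0, 0.0), (1.0, 10.0)]
# Weighting feature 1 heavily makes the second coordinate decide membership.
labels = assign(points, cents, w=[0.0, 1.0])
```

Flipping the weights to `[1.0, 0.0]` reverses the assignment, showing how the learned feature weights steer which arguments end up clustered together.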
Parallel Algorithm for Solving Banded Linear Systems
DUAN Zhi-jian,YANG Yong,MA Xin-rong,LIU San-yang
Computer Science. 2010, 37 (3): 242-244,270. 
Abstract PDF(257KB) ( 625 )   
RelatedCitation | Metrics
The work presented in this paper focused on an alternating-direction parallel iterative method for solving banded linear systems on distributed-memory multi-computers. Firstly, the matrix is split by using the features of the coefficient matrix, so only two communications between adjacent processors are needed per iteration step. Furthermore, sufficient conditions for convergence were given for the cases where the coefficient matrix A is a Hermitian positive definite matrix or an M-matrix. Finally, numerical experiments on an HP rx2600 cluster indicate that the algorithm's parallel speedup and efficiency are higher than those of the multi-splitting method.
Hybrid Method for Outliers Detection Using GPLVM and SVM
TIAN Jian,GU Hong
Computer Science. 2010, 37 (3): 245-247. 
Abstract PDF(252KB) ( 836 )   
RelatedCitation | Metrics
Outliers are objects that do not comply with the general behavior of the data. An SVM (support vector machine) finds the maximal margin hyperplane in feature space to distinguish outliers from normal samples. Based on the high performance of SVMs on small sample sizes and high dimensions, and their good generalization, we proposed a new method for outlier detection which combines a novel unsupervised algorithm, GPLVM (Gaussian process latent variable model), with a standard SVM. GPLVM provides a smooth probabilistic mapping from latent space to data space and embeds the dataset in a low-dimensional space, which is used for cross-validation of the SVM. The proposed approach was applied to the KDD99 benchmark problems, and the simulation results show its validity.
Profile Information and Critical Path Length Based Software Fanout Tree Generation Algorithm
ZENG Bin,AN Hong,WANG Li
Computer Science. 2010, 37 (3): 248-252. 
Abstract PDF(455KB) ( 646 )   
RelatedCitation | Metrics
Exposing and exploiting instruction-level parallelism (ILP) is one of the most critical components of high performance for modern processors. Wide-issue superscalar, very long instruction word (VLIW) and dataflow processors can only achieve high performance when they execute nearby instructions in parallel. In order to expose more instruction-level parallelism to hardware execution substrate, this paper proposed an improved fanout tree generation algorithm. This algorithm first calculates the priority value of every target instruction based on its execution probability and the critical path length from the target instruction to an exit of its block, and then sorts the priority values in ascending order and generates a software fanout tree. Experimental results show that the algorithm proposed in this paper improves the performance over the prior work modestly.
Research of the Reconfigurable Router Structure with Evolutional Function
HUANG Wan-wei,QU Jing,WU Jun-ting,LIU Qiang
Computer Science. 2010, 37 (3): 253-255. 
Abstract PDF(359KB) ( 630 )   
RelatedCitation | Metrics
With the appearance of new protocols and services, traditional routers with rigid, exclusive structures cannot meet the development of upper-layer services. An open, reconfigurable router with functional evolution is needed. This paper put forward a systematic structure for a reconfigurable router based on an assembly mechanism like building blocks. With multi-vector mapping, this structure takes advantage of changing vectors to realize variable functions and to update, add, or delete its original functions. A router with evolutionary functions will effectively support the development of diverse upper-layer services.
Easing the Competition for Key Resource among Threads in SMT
YIN Jie,JIANG Jian-hui
Computer Science. 2010, 37 (3): 256-261. 
Abstract PDF(554KB) ( 690 )   
RelatedCitation | Metrics
Simultaneous multithreading processors boost performance by executing instructions from different threads simultaneously, exploiting both inter-thread and intra-thread parallelism. Sharing critical resources (including the rename register file, the instruction queue and so on) among different threads may also bring vicious competition, which may result in starvation and even degrade performance. This is mainly because some thread encounters long delays and occupies a lot of key resources for a long time, while the demands of the other threads for key resources cannot be met, reducing resource utilization. There are three methods to reduce the negative impact of this competition: thread scheduling decides which threads to fetch instructions from in the fetch stage; instruction scheduling determines which instructions enter the key resources in the dispatch stage; and partitioning of the key resources allocates them among threads. These scheduling strategies were reviewed in the paper.
Algorithm of Skin-tone Detection Based on Parameter Look-up Table
YI Yi-hu,QU Dao-kui,XU Fang
Computer Science. 2010, 37 (3): 262-264. 
Abstract PDF(348KB) ( 879 )   
RelatedCitation | Metrics
Because existing models lack an adequate characterization of the skin-tone distribution, an algorithm was proposed to detect skin tone based on a look-up table storing model parameters. This method treats skin tone and non-skin tone as two classes. By means of statistics in the YCbCr color space, we obtain the probability distribution along the luminance coordinate of different samples with the same chroma, and then classify using Bayesian discriminant rules, with the look-up table storing the model parameters to achieve rapid search and calculation. In the experiments we evaluated the detection performance from the two aspects of pixel sample classification and image segmentation; the results show that our method has a high detection rate and strong robustness.
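A toy version of the look-up-table idea is sketched below: per-chroma-bin counts of skin and non-skin samples act as likelihoods, and a pixel is labeled skin when the posterior ratio exceeds 1. The bins, counts and prior are fabricated for illustration, not measured YCbCr statistics.

```python
from collections import defaultdict

# Hypothetical per-(Cb, Cr)-bin sample counts standing in for trained LUTs.
skin_counts = defaultdict(int, {(14, 15): 80, (14, 16): 60})
nonskin_counts = defaultdict(int, {(14, 15): 10, (14, 16): 90, (3, 4): 70})

def is_skin(cb, cr, bin_size=8, prior_skin=0.5):
    """Bayesian decision: skin iff P(skin) * P(bin|skin) exceeds
    P(non-skin) * P(bin|non-skin) for the pixel's chroma bin."""
    key = (cb // bin_size, cr // bin_size)
    s = skin_counts[key] * prior_skin
    n = nonskin_counts[key] * (1.0 - prior_skin)
    return s > n

# Chroma (115, 125) falls in bin (14, 15), where skin evidence dominates.
label = is_skin(115, 125)
```

Because classification reduces to one table lookup and a comparison, the per-pixel cost is constant, which is what makes the LUT approach fast enough for image segmentation.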
Volume Clipping Simulation Based on Clipping Distance Field
XIE Kai,SUN Gang,YANG Sheng,YU Hou-quan,YANG Jie
Computer Science. 2010, 37 (3): 265-267. 
Abstract PDF(256KB) ( 747 )   
RelatedCitation | Metrics
In order to clip volume data at interactive speed, we proposed the concept of a Clipping Distance Field (CDF) and a CDF-based clipping simulation algorithm. By combining volume rendering and clipping in the GPU shader, the algorithm also achieves the high rendering/clipping speed required by interactive simulation. The results show that the algorithm can process clipped data of different shapes.
Incrementally Learning Human Pose Mapping Model
LIU Chang-hong,YANG Yang,CHEN Yong
Computer Science. 2010, 37 (3): 268-270. 
Abstract PDF(251KB) ( 853 )   
RelatedCitation | Metrics
Discriminative approaches to 3D human pose estimation directly learn a mapping from image observations to pose, which requires large training sets. Gaussian process regression (GPR) for learning this mapping has been limited by its high computational complexity, so we proposed an incremental mapping-learning approach based on GPR and Locally Weighted Projection Regression (LWPR). The approach uses GPR to learn individual local models and LWPR to update existing models or learn a new local model for pose estimation. The experiments showed that the approach can greatly decrease computational complexity and accurately estimate poses.
Moving Targets Tracking Based on Adaptive Level Set Algorithm
WANG Wei,CHEN Yi-wen,WANG Run-sheng
Computer Science. 2010, 37 (3): 271-274. 
Abstract PDF(346KB) ( 717 )   
RelatedCitation | Metrics
The level set algorithm is an effective algorithm for tracking non-rigid moving targets under complicated conditions. An adaptive algorithm was proposed to solve the initialization problem of the level set. Firstly, the particle filter algorithm is used to track the targets and obtain a rough external contour. Then, starting from the centroid of the external contour, level set curve evolution is performed to get the precise external contour of the targets, and the extraction results are fed back to the tracking frame. Lastly, the likelihood function is improved by updating the reference template dynamically. The experimental results show that the proposed algorithm can adapt to changes in the motion of non-rigid targets, and the targets can be tracked accurately.
X-ray Image Enhancement Based on Curvelet Transform
YANG Zhen-ya,CHEN Yu-quan,PAN Min
Computer Science. 2010, 37 (3): 275-277. 
Abstract PDF(228KB) ( 871 )   
RelatedCitation | Metrics
Traditional X-ray image enhancement methods either fail to make image details sufficiently clear or introduce too much noise. Based on the Curvelet transform, a rigorously defined multi-scale geometric analysis method, a flexible non-linear enhancement function was presented. First, the Curvelet transform is applied to the X-ray image; then the fine-scale Curvelet coefficients are mapped by the presented enhancement function; finally, the enhanced X-ray image is reconstructed from the amended Curvelet coefficients. The experimental results show that the Curvelet transform with the presented enhancement function can effectively enhance edge contrast in X-ray images. Compared with traditional methods, the enhanced image has better visual quality, with clear edge details and less noise.
Graphical Simulation Algorithm for Chinese Ink and Wash Painting Based on Perlin Noise
LUO Yan,WU Zhong-fu,GUO Xuan-chang,ZHOU Shang-bo
Computer Science. 2010, 37 (3): 278-281. 
Abstract PDF(410KB) ( 1531 )   
RelatedCitation | Metrics
Based on an analysis of the artistic effects of Chinese ink and wash painting, a novel graphical simulation algorithm based on Perlin noise was proposed. Drawing on the theory of Perlin noise, we constructed random noise functions, interpolation functions, and Perlin noise functions of different frequencies to reproduce the artistic effects of Chinese ink and wash painting. The experimental results indicate that the proposed algorithm can successfully simulate the typical artistic effects of Chinese ink and wash painting.
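The three ingredients named in the abstract, a random noise function, a smooth interpolation function, and a sum over frequencies, can be sketched in 1-D as follows (a minimal value-noise variant of Perlin's construction; the specific octave count and persistence are illustrative):

```python
import numpy as np

def fade(t):
    """Perlin's interpolant 6t^5 - 15t^4 + 10t^3: zero first and second
    derivatives at t = 0 and t = 1, so lattice joins are seamless."""
    return t * t * t * (t * (t * 6 - 15) + 10)

def value_noise(x, seed=0):
    """Smooth 1-D value noise: random values on the integer lattice,
    faded in between."""
    rng = np.random.default_rng(seed)
    lattice = rng.uniform(-1.0, 1.0, size=256)
    i = np.floor(x).astype(int)
    t = x - i
    a, b = lattice[i % 256], lattice[(i + 1) % 256]
    return a + fade(t) * (b - a)

def octave_noise(x, octaves=4, persistence=0.5):
    """Sum noise at doubling frequencies and halving amplitudes -- the
    multi-frequency construction used to build ink-diffusion textures."""
    total, amp, freq = np.zeros_like(x, dtype=float), 1.0, 1.0
    for o in range(octaves):
        total += amp * value_noise(x * freq, seed=o)
        amp *= persistence
        freq *= 2.0
    return total

x = np.linspace(0.0, 10.0, 200)
texture = octave_noise(x)   # 1-D slice of an ink-wash brush texture
```

Low frequencies give the broad ink-wash gradients, while the higher octaves add the grainy diffusion at brushstroke edges; the 2-D case extends this by interpolating over a lattice of random values in both axes.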
Change Detection of Multi-spectral Remote Sensed Images Based on PCA and EM Algorithm
WU Ke,NIU Rui-qing,WANG Yi,DU Bo
Computer Science. 2010, 37 (3): 282-284. 
Abstract PDF(357KB) ( 1064 )   
RelatedCitation | Metrics
Change detection extracts useful change information from observed data about the earth. In this paper, a new technique for the direct-comparison approach, based on pixel characteristics in multi-spectral and multi-temporal remote sensed images, addresses its two key steps: building the difference image and determining the threshold. In our method, an accurate difference image is built by PCA, and a threshold obtained from the EM algorithm is then used to extract the changed area as the final result. Compared with the traditional method, the proposed method has higher accuracy and is more effective.
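The thresholding step can be sketched as fitting a two-component Gaussian mixture (unchanged vs. changed pixels) to the difference image with EM and labeling pixels by posterior probability. The sketch below uses a plain difference image in place of the paper's PCA-built one, and all parameter choices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def em_two_gaussians(x, iters=50):
    """EM for a 1-D two-component Gaussian mixture."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component per pixel.
        pdf = pi / np.sqrt(2 * np.pi * var) \
            * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate means, variances and mixing weights.
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
        pi = n / len(x)
    return mu, var, pi, r

# Toy difference image: mostly unchanged pixels plus one changed block.
diff = rng.normal(0.0, 0.1, size=(32, 32))
diff[8:16, 8:16] += 1.0                      # simulated change region
mu, var, pi, r = em_two_gaussians(diff.ravel())
changed = (r[:, np.argmax(mu)] > 0.5).reshape(diff.shape)
```

The posterior-0.5 decision boundary plays the role of the threshold; unlike a hand-picked value, it adapts to the statistics of each scene.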
Infrared Object Tracking Method Using Human Eye Non-uniform Sampling Characteristic and Curves Evolving
CHEN Yi,SUN Xiao-wei,LI Yan-jun
Computer Science. 2010, 37 (3): 285-288. 
Abstract PDF(310KB) ( 629 )   
RelatedCitation | Metrics
A large quantity of information must be processed during image guidance. In order to reduce the amount of computation, a novel method for tracking infrared targets based on curve evolution theory and the non-uniform sampling characteristic of the human eye was proposed. First, a log-polar coordinate transform model is used to compress the computation and increase processing speed. Then, a level set curve evolution method based on target intensity and edge features is presented to suppress local deformation, thus achieving tracking of non-rigidly deforming targets. Compared with the traditional Mean Shift tracking method, this method is stable and efficient. Experimental results show that the proposed method handles non-rigid deformation effectively.
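The compression step comes from the log-polar transform itself: sampling is dense near the fixation point and exponentially sparse toward the periphery, so far fewer samples than the full Cartesian frame are processed. A minimal nearest-neighbor sketch (grid sizes and names are illustrative assumptions):

```python
import numpy as np

def logpolar_sample(img, center, n_rho=32, n_theta=64):
    """Resample an image onto a log-polar grid centered on the target.

    Mimics foveal, non-uniform sampling: radii grow exponentially, so
    resolution is high near the center and low at the periphery."""
    h, w = img.shape
    cy, cx = center
    rho = np.exp(np.linspace(0.0, np.log(min(h, w) / 2.0), n_rho))
    theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    ys = (cy + rho[:, None] * np.sin(theta)).round().astype(int)
    xs = (cx + rho[:, None] * np.cos(theta)).round().astype(int)
    ys = np.clip(ys, 0, h - 1)
    xs = np.clip(xs, 0, w - 1)
    return img[ys, xs]                     # (n_rho, n_theta) samples

img = np.arange(128 * 128, dtype=float).reshape(128, 128)
lp = logpolar_sample(img, center=(64, 64))
```

Here a 128x128 frame (16384 pixels) is reduced to a 32x64 sample grid (2048 values), roughly an eightfold reduction, before the level set evolution runs.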
Triangular Mesh Smoothing Algorithm Based on Neighborhood Similarity
HE Qiang,ZHANG Shu-sheng,BAI Xiao-liang,LI Liang
Computer Science. 2010, 37 (3): 289-291. 
Abstract PDF(258KB) ( 974 )   
RelatedCitation | Metrics
In order to enhance the quality of triangular mesh models and meet the requirements of follow-up processing, this paper presents a mesh smoothing algorithm based on neighborhood similarity. Firstly, by analogy with the gray values of image pixels, a bilateral filtering differential operator is constructed as the geometric gray value of each vertex. Then the neighborhood similarity between vertices is calculated, and the result is used as the weight of each vertex's geometric gray value; the final geometric gray value of a vertex is the weighted average of the geometric gray values in its neighborhood. Finally, each vertex moves along its normal direction by a distance equal to its final geometric gray value, yielding the smoothed model. Experiments demonstrate that this algorithm produces smooth models while effectively maintaining their geometric features at the same time.
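A heavily simplified sketch of one such pass: the mesh is reduced to a noisy height field with constant normals, the bilateral differential is approximated by the mean neighbor offset along the normal, and the vertex itself is included in the similarity-weighted average as its own most similar member. All of these simplifications are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def smooth_step(V, normals, neighbors, h=0.05):
    """One smoothing pass driven by neighborhood similarity.

    Each vertex gets a scalar "geometric gray value" (mean offset of its
    neighbors along the vertex normal); vertices with similar gray
    values weight each other strongly, and each vertex then moves along
    its normal by the similarity-weighted average gray value."""
    n = len(V)
    gray = np.zeros(n)
    for i in range(n):
        gray[i] = np.mean([(V[j] - V[i]) @ normals[i] for j in neighbors[i]])
    moved = V.copy()
    for i in range(n):
        nbhd = [i] + list(neighbors[i])        # vertex anchors its own average
        w = np.exp(-(gray[i] - gray[nbhd]) ** 2 / h ** 2)
        moved[i] = V[i] + normals[i] * (w @ gray[nbhd]) / w.sum()
    return moved

# Noisy height-field "mesh": grid vertices, 4-connected, normals all +z.
k = 8
gx, gy = np.meshgrid(np.arange(k), np.arange(k), indexing="ij")
V = np.stack([gx.ravel(), gy.ravel(),
              rng.normal(0.0, 0.1, k * k)], axis=1).astype(float)
normals = np.tile([0.0, 0.0, 1.0], (k * k, 1))
neighbors = {i: [j for j in (i - 1, i + 1, i - k, i + k)
                 if 0 <= j < k * k and (abs(j - i) != 1 or j // k == i // k)]
             for i in range(k * k)}
smoothed = smooth_step(V, normals, neighbors)
```

The similarity weights are what preserve features: across a sharp crease the gray values differ strongly, so the two sides barely influence each other, unlike uniform Laplacian smoothing.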
Standard Part Library Resource Sharing Based on Cellular Ontology
DING Bo,SUN Li-juan,LIU Xian-guo
Computer Science. 2010, 37 (3): 292-296. 
Abstract PDF(444KB) ( 574 )   
RelatedCitation | Metrics
To address the problems that standard part libraries do not support heterogeneous CAD systems and that their data information is incomplete, a standard part library resource sharing framework based on a cellular ontology was proposed. The cellular ontology model was developed using the Web Ontology Language (OWL). Semantic mapping between the legacy ontologies and the cellular ontology builds a uniform representation of product model data, exchanges data information at the cell level, and shields the heterogeneity of information formats, thereby implementing sharing of the standard part library and real-time exchange of product model data among heterogeneous CAD systems. Applications in synchronized collaborative design between Pro/E, UG and CATIA are also introduced to demonstrate the feasibility of the above approach.
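The mapping idea can be illustrated without any OWL tooling: each system's parameter names are lifted into a shared, neutral cell vocabulary and projected back down into the receiving system's vocabulary. All vocabularies and part data below are hypothetical examples, not the paper's actual ontologies:

```python
# Hypothetical vocabularies: each legacy CAD system names the same
# standard-part attributes differently; the "cellular ontology" is the
# shared, neutral cell vocabulary both are mapped onto.
CELL_ONTOLOGY = {"nominal_diameter", "thread_pitch", "length"}

PROE_TO_CELL = {"d_nom": "nominal_diameter", "pitch": "thread_pitch",
                "len": "length"}
CATIA_TO_CELL = {"Diameter": "nominal_diameter", "Pitch": "thread_pitch",
                 "Length": "length"}

def to_cells(part, mapping):
    """Lift a system-specific parameter dict into the shared cells."""
    return {mapping[k]: v for k, v in part.items() if k in mapping}

def from_cells(cells, mapping):
    """Project shared cells back into one system's vocabulary."""
    inverse = {cell: k for k, cell in mapping.items()}
    return {inverse[c]: v for c, v in cells.items() if c in inverse}

# Exchange an M8 bolt from Pro/E to CATIA via the neutral cells.
proe_bolt = {"d_nom": 8.0, "pitch": 1.25, "len": 40.0}
cells = to_cells(proe_bolt, PROE_TO_CELL)
catia_bolt = from_cells(cells, CATIA_TO_CELL)
```

Because every system maps only to and from the shared cells, adding an n-th CAD system requires one new mapping rather than n-1 pairwise converters, which is the scalability argument for the cellular approach.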