Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 36 Issue 12, 16 November 2018
  
Algorithms for Longest Path: A Survey
WANG Jian-xin,YANG Zhi-biao,CHEN Jian-er
Computer Science. 2009, 36 (12): 1-4. 
Abstract PDF(432KB) ( 1934 )   
The longest path problem is a well-known NP-hard problem with significant applications in many fields such as bioinformatics. With the emergence of parameterized computation theory, the parameterized k-Path problem has become one of the most intensively studied research problems. We introduced several algorithms for the longest path problem, including approximation algorithms, parameterized algorithms, and polynomial-time algorithms on special graph classes. We put emphasis on the analysis and comparison of the latest parameterized algorithms, which use color-coding, divide-and-conquer and algebraic techniques to solve the k-Path problem. Finally, we presented some further research directions for this problem.
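As an illustration of the color-coding technique mentioned above, the following sketch (our own minimal Python rendering, not code from any of the surveyed papers) tests whether a graph contains a simple path on k vertices: each trial colors the vertices uniformly with k colors and then searches for a "colorful" path (all colors distinct) by dynamic programming over color sets.

```python
import random

def has_k_path(adj, k, trials=200):
    """Randomized color-coding test for a simple path on k vertices.

    adj: dict mapping vertex -> iterable of neighbours (undirected graph).
    Each trial colors vertices uniformly with k colors; a path whose
    vertices all receive distinct colors ("colorful") is necessarily
    simple, and is found by DP over color sets in O(2^k * m) per trial.
    """
    vertices = list(adj)
    for _ in range(trials):
        color = {v: random.randrange(k) for v in vertices}
        # paths[v] = set of color-sets of colorful paths ending at v
        paths = {v: {frozenset([color[v]])} for v in vertices}
        for _ in range(k - 1):          # extend paths one vertex at a time
            nxt = {v: set() for v in vertices}
            for v in vertices:
                for u in adj[v]:
                    for s in paths[u]:
                        if color[v] not in s:
                            nxt[v].add(s | {color[v]})
            paths = nxt
        if any(len(s) == k for sets in paths.values() for s in sets):
            return True
    return False
```

Any fixed k-path is colorful with probability k!/k^k per trial, so on the order of e^k trials drive the failure probability down; the divide-and-conquer and algebraic methods compared in the survey improve on this running time.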
Network Protocol Conformance Testing: An Overview
ZHU Xue-feng,XU Jian-jun,ZOU Biao,ZHANG Zhe,SUN Lei
Computer Science. 2009, 36 (12): 5-7. 
Abstract PDF(389KB) ( 1085 )   
Conformance testing is fundamental to the analysis of network protocols. While there is a large body of research and practice on this problem, we still lack a systematic, efficient and usable method for conformance testing of network protocols. Beginning with the formal description methods for describing network protocols, we surveyed the major techniques for testing architectures, testing methods and test sequence generation in conformance testing. Finally, we gave our approach to the remaining problems.
Survey of Digital Image Super Resolution Reconstruction Technology
XIAO Su,HAN Guo-qiang,WO Yan
Computer Science. 2009, 36 (12): 8-13. 
Abstract PDF(638KB) ( 1516 )   
The purpose of image reconstruction is to estimate a high-resolution image with a better visual effect from one image or a sequence of images. Super resolution, which grew out of earlier image restoration and reconstruction technology, can take advantage of redundant information among images. Super resolution reconstruction includes two kinds of technology: reconstruction-based technology and learning-based technology. Reconstruction-based technology estimates a high-resolution image from input images according to a specific degradation model. Learning-based technology supplies input images with prior knowledge from training examples to get a better result. The paper systematically introduced the algorithms of super resolution technology. Finally, we pointed out that image registration, construction of degradation and learning models, blind estimation, and learning algorithms are the main problems and future directions in image super resolution technology.
Research and Development of Sequential Pattern Mining (SPM)
WANG Hu,DING Shi-fei
Computer Science. 2009, 36 (12): 14-17. 
Abstract PDF(390KB) ( 1942 )   
Sequential pattern mining (SPM) is an important research subject of data mining and is used widely in many application fields. This paper firstly discussed the background of sequential pattern mining, then classified it, and introduced and compared the features of sequential pattern mining algorithms based on the classification. Finally, it discussed future research in this field so that researchers can carry out further study.
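To make the notion of sequential pattern mining concrete, here is a minimal Apriori-style (GSP-flavoured) frequent-subsequence miner in Python. It is an illustrative sketch, not an algorithm from the surveyed literature; `min_support` (assumed to be at least 1) counts the number of database sequences that contain a pattern as an ordered, not necessarily contiguous, subsequence.

```python
def is_subsequence(pattern, seq):
    """True if pattern occurs in seq as an ordered (not necessarily
    contiguous) subsequence; `in` on the iterator consumes it."""
    it = iter(seq)
    return all(item in it for item in pattern)

def mine_sequences(db, min_support):
    """Apriori-style frequent-subsequence mining.

    db: list of item sequences; min_support >= 1.  Length-(l+1)
    candidates are grown only from frequent length-l patterns,
    pruning the search by support (the Apriori property).
    """
    items = sorted({x for seq in db for x in seq})
    frequent = {}
    level = [(x,) for x in items]          # length-1 candidates
    while level:
        counted = {}
        for pat in level:
            sup = sum(is_subsequence(pat, seq) for seq in db)
            if sup >= min_support:
                counted[pat] = sup
        frequent.update(counted)
        # grow: append any single item to each surviving pattern
        level = [pat + (x,) for pat in counted for x in items]
    return frequent
```

On a toy database such as `[['a','b','c'], ['a','c'], ['b','c'], ['a','b','c','c']]` with `min_support=3`, the miner keeps ('a',), ('b',), ('c',), ('a','c') and ('b','c') and prunes everything longer.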
Design of High Availability in Carrier Grade Operating System CGEL
WANG Ji-gang,ZHENG Wei-min,XIE Shi-bo,ZHONG Wei-dong
Computer Science. 2009, 36 (12): 18-21. 
Abstract PDF(386KB) ( 681 )   
High availability today plays an important role in protecting the continuous running of critical telecommunication equipment. When the hardware or the software goes wrong, high availability technologies can help the system maintain normal operation. In order to improve the robustness of telecommunication equipment in the presence of errors, this paper analysed the high availability requirements of telecommunication equipment. Based on the carrier grade operating system CGEL (Carrier Grade Embedded Linux), this paper proposed and implemented a series of high availability designs, including condition monitoring, error control, failover and on-line operations. The functions of these high availability designs are relatively independent but closely related to each other. The experimental results show that these designs can help telecom equipment effectively resist hardware and software failures and ensure stable, non-stop operation of the telecom system.
P2P Networks Proactive Immune Mechanism Based on Multi-mobile Agent
XU Xiao-long,WANG Ru-chuan,XIAO Pu,CHEN Dan-wei
Computer Science. 2009, 36 (12): 22-25. 
Abstract PDF(472KB) ( 604 )   
The Peer-to-Peer (P2P) computing environment is suffering increasingly serious propagation of and attacks by malicious code. This paper adopted multiple mobile Agents to construct a novel proactive immune mechanism suitable for P2P network systems. The biological immune principle was taken as a reference, and multi-Agent technology was utilized to realize Agents with different functions, which build a cooperative relationship between the central immune peer and ordinary peers to defend against malicious code jointly. A mobile Agent can roam throughout the P2P network to detect malicious code actively, or to carry and install remote immune modules that act like vaccines, making rapid response to, analysis of, and handling of malicious code convenient. In order to distribute immune modules efficiently, the paper also proposed a new Exponential Tree Plus (ET+) and a vaccine distribution algorithm based on ET+. This paper first analyzed the propagation of malicious code in P2P networks, then introduced the architecture and components of the P2P network proactive immune mechanism based on multiple mobile Agents, as well as the ET+-based vaccine distribution algorithm, whose performance was evaluated at the end of the paper.
Secure Access Authentication Scheme in Mobile IPv6 Networks
ZHANG Zhi,CUI Guo-hua
Computer Science. 2009, 36 (12): 26-31. 
Abstract PDF(508KB) ( 617 )   
In Mobile IPv6 networks, identity authentication is a crucial issue of network security. This paper proposed a secure identity authentication scheme, which considers the inter-domain reputation relationship between the mobile node's home domain and the access domain in the pre-handoff procedure and realizes effective mutual authentication between the mobile node (MN) and the access domain. Authentication is accomplished by double private keys, with the HA and the MN signing the home registration messages respectively. The access authentication can be accomplished in the visited network instead of the home network, and the handover procedure integrating authentication needs only one round trip. Theoretical analysis and numerical results show that the proposed scheme is more effective in reducing the total authentication and handoff delay and the signaling overhead than related schemes. Based on the security of the CPK algorithm and IBS, we prove that the handover latency of our access authentication and home registration process is lower than that of existing solutions, and that our solution satisfies mutual authentication security and efficiently resolves the key escrow problem.
Research of DNS Worm in IPv6 Networks
XU Yan-gui,QIAN Huan-yan,ZHANG Kai
Computer Science. 2009, 36 (12): 32-36. 
Abstract PDF(438KB) ( 653 )   
In an IPv6 network environment, the scanning strategies of Internet worms were analyzed and a new type of worm, DNSWorm-V6, was built. The worm applies a two-layer scanning strategy: subnet scanning within the local subnet and DNS scanning between subnets. Based on this two-layer scanning strategy, a two-level worm propagation model, TLM, was presented. The results of simulation experiments show that DNSWorm-V6 is a worm that can propagate quickly and on a large scale in IPv6 networks; at the same time, we can predict the threat probably posed by this new kind of worm in IPv6 networks.
Negative Operator Embedded Clonal Selection Algorithm for Grid Intrusion Detection
YANG Min-hui, WANG Run-chuan
Computer Science. 2009, 36 (12): 37-40. 
Abstract PDF(484KB) ( 610 )   
Grid intrusion detection is a method to solve the bottleneck of grid security. This paper proposed the Negative Selection Operator Embedded Clonal Selection Algorithm (NCSA), based on the Clonal Selection Algorithm, as a new detector algorithm. The negative selection operator, as a component, avoids the self-tolerance phenomenon of detectors and assists memory detectors in completing dynamic updating, while affinity maturation decreases the number of co-stimulations, so detectors can cover the non-self space better. In order to obtain satisfactory TP and FP ratios, we set, by experiment, two parameters affecting NCSA's behavior, the immature detectors' toleration period (T) and the mature detectors' lifespan (L), to appropriate values. With the same parameters and training conditions, comparison with CSA shows that NCSA gains a higher TP ratio and a lower FP ratio and improves the whole detection performance. The higher TP ratios and lower FP ratios also mean that NCSA can recognize unknown intrusions and fit dynamic grid environments better.
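For readers unfamiliar with the negative selection operator embedded in NCSA, its censoring idea can be sketched as follows. This is a hypothetical binary-string instance with the classic r-contiguous matching rule, not the paper's actual detector representation: candidate detectors that match any self string are discarded, so surviving detectors cover only non-self space.

```python
import random

def r_contiguous(a, b, r):
    """True if equal-length bit tuples a and b agree on at least
    r contiguous positions (the r-contiguous matching rule)."""
    run = best = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        best = max(best, run)
    return best >= r

def negative_selection(self_set, n_detectors, length, r, seed=0):
    """Censoring phase: keep only random detectors matching no self string."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randrange(2) for _ in range(length))
        if not any(r_contiguous(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_nonself(sample, detectors, r):
    """Monitoring phase: a sample matching any detector is non-self."""
    return any(r_contiguous(sample, d, r) for d in detectors)
```

By construction, no generated detector can raise an alarm on a self string, which is the self-tolerance property the operator enforces before clonal selection refines the detector set.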
Saturated Assignment Algorithm with Ordered Static Priority
WU Wei,NI Shao-jie,LIU Xiao-hui
Computer Science. 2009, 36 (12): 41-45. 
Abstract PDF(420KB) ( 734 )   
In the areas of communication, radar, navigation and various electronic products, embedded real-time scheduling has become the control kernel of electronic and electrical systems, where cost and the performance-price ratio are major concerns for system designers. In practical applications, such systems support only a limited number of priority levels; when the number of tasks is greater than the number of priority levels, the well-known optimal algorithms, such as DM (deadline monotonic) and RM (rate monotonic), are impractical. However, they can still provide a natural priority ordering to assist system design. A saturated assignment algorithm with ordered static priority was proposed based on prior knowledge of the natural priority ordering. It was proved to be the optimal ordered assignment. Further research shows that saturated assignment with DM-ordered priority leads to a minimal number of priority levels, as long as every task has a deadline less than or equal to its period. Our method has low time complexity: the number of schedulability determinations is equal to the total task number, which is much less than in the well-known AGP (assignment of priority group) and NPA (least number priority assignment) algorithms.
Density-based Clustering Protocol for Wireless Sensor Networks
QIAO Jun-feng,LIU San-yang,CAO Xiang-yu
Computer Science. 2009, 36 (12): 46-49. 
Abstract PDF(334KB) ( 854 )   
Wireless sensor networks require energy-efficient routing protocols to prolong the system lifetime. Routing algorithms for WSN are firstly introduced, and then the LEACH algorithm is analyzed. Based on LEACH, a density-based clustering protocol was proposed. While selecting cluster heads, the protocol can set the scale of each cluster based on the density of nodes, balancing the network load and energy distribution. Simulations show that the proposed protocol is more effective in prolonging the network lifetime and reducing energy dissipation than LEACH.
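The LEACH baseline that the density-based protocol modifies elects cluster heads with a rotating probabilistic threshold: a node that has not served as head in the current epoch of 1/P rounds becomes a head with probability T(r) = P / (1 - P * (r mod 1/P)). The following is a minimal sketch of that baseline election round only; the paper's density-based cluster scaling is not reproduced here.

```python
import random

def leach_threshold(p, r):
    """LEACH cluster-head threshold for round r (0-based), for a node
    that has not served as head in the current epoch of 1/p rounds."""
    return p / (1 - p * (r % int(round(1 / p))))

def elect_heads(nodes, p, r, last_head_round, rng):
    """One election round.  last_head_round maps node -> round it last
    served as cluster head (absent if it never served); it is updated
    in place so the rotation carries over to later rounds."""
    epoch = int(round(1 / p))
    heads = []
    for n in nodes:
        served_recently = (last_head_round.get(n) is not None
                           and r - last_head_round[n] < epoch)
        if not served_recently and rng.random() < leach_threshold(p, r):
            heads.append(n)
            last_head_round[n] = r
    return heads
```

Note that the threshold grows toward 1 at the end of each epoch, so every node is guaranteed a turn as head once per epoch; a density-based variant would additionally weight election by local node density.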
Research and Simulation of the Multi-mode Signals Synchronization Strategy
PAN Cheng-sheng,LIU Fang,FENG Yong-xin
Computer Science. 2009, 36 (12): 50-54. 
Abstract PDF(391KB) ( 714 )   
Considering new spread spectrum technology, new signal modulation systems and the frequency-sharing characteristic, single-purpose synchronization methods can no longer meet the need. Based on a study of the signal mechanisms and characteristics of the L1 and L5 frequency signals in the GPS and Galileo systems, the control technology and data processing methods were studied in depth, and the channels, accumulation time and processing method were rationally chosen. In this way, multi-mode signal receiving and synchronization was achieved; furthermore, simulation and performance tests were implemented, demonstrating the feasibility, rationality and validity of the method.
3D Positioning Model and Algorithm Based on Wireless Sensor Networks
NIE Wen-hui,JU Shi-guang,XUE An-rong
Computer Science. 2009, 36 (12): 55-59. 
Abstract PDF(399KB) ( 638 )   
Positioning of distributed wireless sensor nodes is an important and fundamental problem in a wide range of applications, such as search and rescue, target tracking, supply chain management, disaster relief and smart environments. Deployed in physical environments, the nodes are envisioned to form ad hoc communication networks and provide sensed data without a special communications infrastructure. In construction projects, with materials and components equipped with RFIDs, their locations become of immediate use if such construction resources are to be tracked. Tracking the location of construction resources enables effortless progress monitoring and supports real-time construction state sensing, which greatly improves productivity and reduces cost for the construction enterprise. For these reasons, we gave a node positioning model in 3-dimensional space and specially discussed its localization error. This model has been successfully used in construction project management.
Research on Semantic Query in Hybrid P2P Networks
LIU Zhen,DENG Su,HUANG Hong-bin
Computer Science. 2009, 36 (12): 60-64. 
Abstract PDF(559KB) ( 654 )   
Supporting semantic query is one of the key techniques that broaden P2P systems' applications. A semantics-supported hybrid P2P network model, M-Chord, was proposed. It adopts a metadata-specification-template-based semantic description model and combines the technical characteristics of Chord and semantic overlays. An MST-based semantic overlay construction approach was designed. In M-Chord, the concept of semantic query routing was proposed, and based on the M-Chord network model, a semantic query method was proposed. The experiments show that scalability and semantic search efficiency are improved greatly in M-Chord.
Layered Multi-description Coding Based on P2P Video Streaming
HUANG Xiao-tao,GHANG Wei,ZHU Hua,LU Zheng-ding
Computer Science. 2009, 36 (12): 65-69. 
Abstract PDF(452KB) ( 670 )   
Heterogeneity and transmission reliability are two of the major problems in network video communications. In video coding, layered coding is used to solve the problem of network heterogeneity, multi-description coding is an effective way to address the transmission reliability issue, and layered multi-description coding is a combination of both. Based on layered coding technology, we proposed a layered multi-description coding method based on pixel space decomposition. We presented a corresponding sub-layer combination algorithm and a pixel evaluation algorithm which can enhance the video quality for users. Moreover, by analyzing the relationship between quantization values and pixel evaluation errors, we established a mathematical model for layered multi-description decoding. Experimental results show that the proposed scheme performs better in heterogeneous P2P networks and gives a better solution to transmission reliability problems.
Application of Utility Theory in Investment Optimizing of Information Security
CHEN Tian-ping,ZHANG Chuan-rong,GUO Wei-wu,ZHENG Lian-qing
Computer Science. 2009, 36 (12): 70-72. 
Abstract PDF(322KB) ( 872 )   
A relation model between security investment and risk control was introduced to solve the problem of optimal information security investment under a corporate budget. Security investment efficiency was studied, and the new concepts of the efficiency of reducing event probability and of reducing loss were presented. Utility theory was used to model the system in terms of corporate wealth, risk loss and security investment; the exponential utility function was used to model the corporation's yield, and the maximum security investment bound was analyzed. The extremum of the utility function was obtained by differentiation, and the optimal investment was derived. The case study demonstrates that the utility-based risk measurement method is scientific and that security events producing a greater loss effect need more security investment.
Towards a Dynamic-attribute-based Multi-domain Usage Control Model
XU Chang-zheng,WANG Qing-xian
Computer Science. 2009, 36 (12): 73-75. 
Abstract PDF(355KB) ( 667 )   
On the basis of analyzing multiple-domain interaction, we proposed a dynamic-attribute-based multi-domain usage control model. The model, DAIS-UCON, is based on the next-generation access control model UCONnac and extends the dynamic characteristics of the UCONnac components of authorizations, obligations and conditions. We then classified dynamic attributes according to the time of definition and the scope of application, which facilitates modeling each component as a dynamic entity. Finally, we discussed the extended model formally and introduced new predicates to accommodate the requirements of multi-domain dynamic interaction, which will be useful for dynamic policy construction and authorization in access control.
Risk Function-based SVM Algorithm and its Application in P2P-traffic Detection
WU Min,WANG Ru-chuan
Computer Science. 2009, 36 (12): 76-80. 
Abstract PDF(451KB) ( 575 )   
P2P traffic takes up a great portion of network traffic. While having a significant impact on the Internet, it brings serious problems such as network congestion and traffic hindrance caused by excessive occupation of bandwidth. The paper firstly introduced existing methods for identifying P2P traffic and their characteristics, then put forward a Support Vector Machine (SVM) algorithm based on a risk function which can be used to identify P2P traffic. Finally, a model was set up to fulfill the identification of P2P traffic. Experimental results show that the method is well suited to the practical requirements of traffic identification and achieves higher precision.
Research of the Reducing Token's Frequency Method in Token-routing High-speed Fibre Channel Switch Network
LIU Jun-rui,CHEN Ying-tu,FAN Xiao-ya,KANG Ji-chang
Computer Science. 2009, 36 (12): 81-84. 
Abstract PDF(367KB) ( 661 )   
In order to achieve high-speed switching in a high-speed fibre channel switch network, a method for the token-routing fibre channel switch that reduces the token's frequency was put forward. In this scheme, the token is coded with a "frequency-reducing code", a special sequence of '0's and '1's, so that its frequency can be reduced several-fold and the switch can extract the information in the token without a serial-to-parallel converter. The results show that the structure of the switch is simplified while its reliability is greatly improved and its cost is significantly decreased.
Research on Link Utilization Inference Technology Based on the Multiple Source Network Tomography
DUAN Qi,CAI Wan-dong,TIAN Guang-li
Computer Science. 2009, 36 (12): 85-88. 
Abstract PDF(433KB) ( 633 )   
Link utilization is an important parameter for describing the running state of a network. At present, research on link parameter inference in network tomography is based on single-source measurements, while multiple-source network tomography has many advantages. Link utilization estimation based on multiple-source tomography was therefore researched. A joint measurement method was proposed, under which multiple-source link utilization is identifiable. Furthermore, the necessary and sufficient condition on the selection of measurement sub-networks that makes the links identifiable was proposed. Finally, the maximum likelihood estimate of link utilization, computed by the EM algorithm, was derived, and its effectiveness was validated by model simulation and network simulation results.
Security Model for Mobile Agent-based Network Management
CHEN Zhi,WANG Ru-chuan
Computer Science. 2009, 36 (12): 89-92. 
Abstract PDF(476KB) ( 598 )   
A mobile agent-based network management model can manage a network flexibly based on mobile agents, but the security problems that exist in the network manager, the managed nodes and the mobile agents of this model have hindered its further development and application. This paper studied these security problems, built an integrated security model based on Java Card and encryption technology, and gave the secure management process. The case analysis shows that the proposed model can protect both hardware and software in the network management process.
MPR Election Frequency Based Extended OLSR Protocol in Wireless Mesh Networks
SHEN Cheng,LU Yi-fei,XIA Qin,WANG Cui-han
Computer Science. 2009, 36 (12): 93-96. 
Abstract PDF(434KB) ( 691 )   
Based on a study of the characteristics of the topology and business model of wireless mesh networks, an in-depth analysis of the routing protocol types suitable for WMN was given, which shows that proactive hop-by-hop routing is most suitable for WMN. On the basis of Optimized Link State Routing, a typical proactive hop-by-hop routing protocol, an MPR election frequency based routing protocol called EOLSR was proposed for wireless mesh networks. EOLSR takes in the new concept of "MPR election frequency" and overcomes OLSR's inadequate resource utilization in wireless mesh network environments by expanding the neighbor table structure and improving the MPR selection algorithm. The simulation results show that the proposed EOLSR protocol improves network throughput and reduces end-to-end delay without increasing routing overhead.
Algorithm for Predicting the BGP Routes in iBGP Routing of Route Reflection Graph
XU Xin,WU Jing,GAO Yuan
Computer Science. 2009, 36 (12): 97-99. 
Abstract PDF(365KB) ( 779 )   
Predicting BGP routes is particularly difficult when the BGP selection process does not form a deterministic ranking of the routes, due to route reflection. This paper presented provably correct algorithms for computing the outcome of the BGP route-selection process for each router in a network, without simulating the complex details of BGP message passing. The algorithms require only static inputs. The prediction algorithm was verified on a simulated large-scale ISP. The results show that configuration adjustments can cause routing table changes.
Coordinated Multi-point Concurrent Testing for Routers and Subnetworks
GENG Hua-xin,LUO Hao
Computer Science. 2009, 36 (12): 100-103. 
Abstract PDF(411KB) ( 641 )   
The next generation Internet is facing challenges such as high-speed switching and QoS provisioning. This paper addressed performance testing and evaluation issues relevant to future high-speed switching devices and subnetworks. A generic test method called the Coordinated Multi-Point Concurrent Transverse Test Method (CMPC-TTM), introduced by the SGNetcom Lab, was discussed. A prototype of the Distributed Coordinated Multi-Point Concurrent Test System (DCMC-TS), under development based on CMPC-TTM, was used as an example of multi-point testing techniques. Preliminary test results with DCMC-TS show that CMPC-TTM is a versatile test method for future network testing and that DCMC-TS is superior to existing approaches, as multiple implementations of the TTM (DTC) are hosted in a single machine.
Delay-oriented Routing Approach for Backbones with the Interference
ZHANG Li-dong,QIN Guang-cheng,YIN Hao,CHEN Qian
Computer Science. 2009, 36 (12): 104-107. 
Abstract PDF(368KB) ( 643 )   
The weapon cooperation data link generally adopts a hierarchical methodology, as its nodes are numerous and move rapidly. On the battlefield, interference often occurs, and it seriously impacts the delay, which is an important parameter of the network. Moreover, general approaches that only consider the delay of the next hop cannot always obtain the minimum end-to-end delay. A delay-oriented routing approach with an implementation model was proposed for backbones. The SINR to the neighboring nodes and the end-to-end delay to the destination are obtained and used for deciding on the next hop, adopting the ideas of feedback and cross-layer design. The results provide important insights that the approach achieves better performance than several other typical routing protocols.
Research of Network Intrusion Detection Model Based on Artificial Immune
ZHANG Yu-fang,XIONG Zhong-yang,SUN Gui-hua,LAI Su,ZHAO Yin
Computer Science. 2009, 36 (12): 108-110. 
Abstract PDF(267KB) ( 596 )   
Aiming at the limitations of existing network intrusion detection models based on artificial immune ideas, an improved network intrusion detection model based on a dynamic clonal selection algorithm was presented. To accelerate access for normal IP packets, the self-pattern class was proposed, with which most self-antigens are filtered, and the self-antigen set is amended dynamically during the detection process. Constraint-based detectors were adopted as antibodies, the any-r intervals matching rule was used for matching antibodies and antigens, and the split detector method was applied to self-antigen matching. The experimental results show that the proposed model achieves a faster running speed and better detection rates, and adapts to dynamically changing environments.
On Primitive Subscription Management and Matching in Publish/Subscribe Systems
QI Feng-liang,JIN Bei-hong,CHEN Hai-biao,LONG Zhen-yue
Computer Science. 2009, 36 (12): 111-114. 
Abstract PDF(345KB) ( 564 )   
Managing primitive subscriptions and matching events against them are fundamental issues in a publish/subscribe system. This paper organized primitive subscriptions into a covering forest and then executed primitive subscription matching on the basis of multi-level predicate indexes. This approach was implemented in our content-based publish/subscribe system OncePubSub. This paper also described the experiments conducted to evaluate performance and overhead. Experimental results prove that our approach has efficient matching performance and good scalability.
Theory of Conditional Functional Dependencies and its Application for Improving Data Quality
HU Yan-li,ZHANG Wei-ming
Computer Science. 2009, 36 (12): 115-118. 
Abstract PDF(459KB) ( 1041 )   
The theory of conditional functional dependencies was introduced and its application to improving data quality was discussed. Firstly, conditional functional dependencies (CFDs) were defined and a set of sound and complete inference rules for CFDs was proposed; dependency propagation was studied to find CFDs holding on views. Secondly, the consistency and implication problems for CFDs were analyzed. Then SQL techniques were proposed to detect data inconsistencies in relational databases. Finally, extensions of CFDs were discussed.
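As a concrete illustration of CFD-based inconsistency detection, the following sketch (our own illustrative Python over in-memory tuples, standing in for the SQL techniques the paper proposes) flags pairs of rows that violate a CFD (X → A, tp): among rows matching the pattern tuple tp, equal X-values must imply equal A-values.

```python
def cfd_violations(rows, lhs, rhs, pattern):
    """Find pairs of row indices violating a CFD (lhs -> rhs, pattern).

    rows:    list of dicts (attribute -> value).
    lhs:     list of left-hand-side attributes X.
    rhs:     single right-hand-side attribute A.
    pattern: dict attribute -> constant, with '_' as wildcard;
             the CFD constrains only rows matching this pattern.
    """
    def matches(row):
        return all(v == '_' or row[a] == v for a, v in pattern.items())

    groups = {}   # lhs value tuple -> list of (index, row) seen so far
    bad = []
    for i, row in enumerate(rows):
        if not matches(row):
            continue
        key = tuple(row[a] for a in lhs)
        for j, other in groups.get(key, []):
            if other[rhs] != row[rhs]:
                bad.append((j, i))
        groups.setdefault(key, []).append((i, row))
    return bad
```

For example, with the CFD ([country, zip] → city, country = 'UK'), two UK rows sharing a zip code but naming different cities are reported as a violating pair, while a US row with the same zip is ignored because it does not match the pattern.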
Worm Detection Immune Model Integrating Innate and Adaptive Immunity
ZHANG Jun-min,LIANG Yi-wen
Computer Science. 2009, 36 (12): 119-123. 
Abstract PDF(498KB) ( 711 )   
Most existing worm detection methods have a number of significant hurdles to overcome before they can employ such actions as blocking insecure ports and breaking communication between infected and non-infected hosts to slow down worm propagation and minimize potential damage; the most noteworthy obstacle is the high false positive rate. A recently developed hypothesis in immunology, the Danger Theory, states that our immune system responds to the presence of intruders through sensing molecules belonging to those invaders, plus signals generated by the host indicating danger and damage. Inspired by the theory, the paper proposed an artificial immune model for worm detection. The model considers the cooperation of Dendritic Cells (DCs) in the innate immune system and T cells in the adaptive immune system, in which the system calls generated by a process can be viewed as antigens and the corresponding behavioral information of the system and network can be viewed as signals. The theoretical analysis shows that the dual detection method, with DCs detecting the behavioral information caused by antigens and T cells detecting antigens, can decrease the false positive rate, and the model also has a fast secondary response to reinfection by the same or a similar worm.
Personalized Curriculum Organization Method Based on Knowledge Topic Ontology
ZHU Zheng-zhou,WU Kai-gui,WU Zhong-fu,CHEN Yi-xiong,GAO Min
Computer Science. 2009, 36 (12): 124-128. 
Abstract PDF(451KB) ( 757 )   
Based on ontology and knowledge management theories, a 5-level knowledge organization architecture and a 4-level knowledge usage architecture were proposed. A knowledge topic ontology, a student ontology and an e-Learning policy ontology were designed, and personalized curricula were recommended to students according to these three ontologies. The test results show that this method can recommend knowledge topics and improve students' motivation, with user satisfaction reaching 84%.
Data Integration Method of Collaborative Development for Complex Product
ZHOU Jian,ZHU Yao-qin,PANG Wei-qing
Computer Science. 2009, 36 (12): 129-131. 
Abstract PDF(399KB) ( 636 )   
At present, the needs of collaborative development for complex products cannot be met by general data integration methods. On the basis of analyzing the characteristics of collaborative development for complex products, a new data integration method based on Grid and RDF was proposed. Grid technology was used to solve the distributed and heterogeneous problems of data access, RDF technology was used to solve semantic heterogeneity between data sources, and eligible data sources were obtained by a QoS-based selection approach. Finally, an example was given to demonstrate the validity and rationality of the proposed method.
Conformance Test Selection Based on Availability
XING Yi,YE Xin-ming,XIE Gao-gang
Computer Science. 2009, 36 (12): 132-137. 
Abstract PDF(513KB) ( 565 )   
Conformance test selection is a very important stage in conformance testing. The test suite produced by test generation may not be suitable for practical testing because of test cost, coverage, availability, etc., so test selection is necessary to generate a subset of the test suite. We gave a method to carry out test selection based on statistical testing. It includes three stages. The first is to construct an acceptance region using formal methods. The second is to sample test executions and count the number of successes. The last is to judge the availability of a test case based on whether the number of successes lies in the acceptance region. The method has been used in our test selection application.
Indexing of Moving Objects in a Constrained Network
SONG Guang-jun,HAO Zhong-xiao,WANG Li-jie
Computer Science. 2009, 36 (12): 138-141. 
Abstract PDF(421KB) ( 586 )   
Advances in wireless sensor networks and positioning technologies enable new data management applications that monitor continuous streaming data. An efficient indexing structure for moving objects is necessary to support query processing over such dynamic data. This paper proposed a new index technique based on a simulation prediction model, which supports querying the past, present and future positions of moving objects in urban traffic networks. First, making full use of the features of urban traffic networks, we used a cellular automata model with crossings to simulate the movements of the objects. Then, by linear regression and circular-arc fragmented curve-fitting, the predicted trajectory equations of the objects on regular road segments and at crossings could be obtained. Moreover, we presented a dynamic structure named AU (adaptive units), which groups neighboring objects moving in similar patterns, and developed from it a two-level index named AUC (Adaptive Unit Compounding), consisting of an R-tree and a linked list. Finally, experimental studies indicated that the AUC index outperformed the TPR-tree and the TB-tree.
Recommender System Model Based on Isomorphic Integrated to Content-based and Collaborative Filtering
LI Zhong-jun,ZHOU Qi-hai,SHUAI Qing-hong
Computer Science. 2009, 36 (12): 142-145. 
Abstract PDF(389KB) ( 963 )   
RelatedCitation | Metrics
The two most popular kinds of recommender systems are respectively based on content and on collaborative filtering. Both types of filtering have advantages and disadvantages. This paper proposed a new isomorphic integrated model and algorithm which combine the merits of the traditional recommender systems based on the above two methods and avoid their shortcomings to some extent. The experimental results show that the presented isomorphic integrated model and algorithm can improve the predictive accuracy of traditional recommender systems.
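A common way to combine the two approaches (a minimal sketch of a generic weighted hybrid, not the paper's isomorphic integrated model; the blending weight `alpha` is a hypothetical parameter) is to mix content-based and collaborative scores linearly:

```python
import numpy as np

def content_scores(user_profile, item_features):
    # cosine similarity between the user's feature profile and each item
    num = item_features @ user_profile
    den = np.linalg.norm(item_features, axis=1) * np.linalg.norm(user_profile) + 1e-12
    return num / den

def cf_scores(ratings, user_idx):
    # user-based CF: similarity-weighted average of other users' ratings
    sims = ratings @ ratings[user_idx]
    norms = np.linalg.norm(ratings, axis=1) * np.linalg.norm(ratings[user_idx]) + 1e-12
    sims = sims / norms
    sims[user_idx] = 0.0                    # exclude the target user itself
    return sims @ ratings / (np.abs(sims).sum() + 1e-12)

def hybrid_scores(user_profile, item_features, ratings, user_idx, alpha=0.5):
    # linear blend of the two score vectors
    return (alpha * content_scores(user_profile, item_features)
            + (1 - alpha) * cf_scores(ratings, user_idx))
```

An integrated model like the paper's would fuse the two sources inside one structure rather than blending their outputs, but the linear blend shows the basic trade-off that `alpha` controls.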
Analysis and Comparison of the Macro-topology between Large-scale Software and AS-level Internet
LI Hui,ZHAO Hai,AI Jun,LI Bo
Computer Science. 2009, 36 (12): 146-150. 
Abstract PDF(453KB) ( 572 )   
RelatedCitation | Metrics
Research on the macro-topology of large-scale software and the AS-level Internet is significant for the further comprehension and application of their structures. According to the complex network characteristics reflected by the macro-topology of large-scale software and the AS-level Internet, their structures were converted to network topologies, then analyzed and compared in terms of connectivity, degree distribution, small-world characteristics and hierarchy, using complex network metrics and analysis methods; some macro-topological similarities and differences between large-scale software and the AS-level Internet were thereby discovered. Additionally, the reasons for them were discussed.
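Two of the metrics named above, degree distribution and the (small-world related) clustering coefficient, can be computed from a plain edge list; this is a generic sketch, not the paper's measurement pipeline:

```python
from collections import defaultdict

def topology_metrics(edges):
    """Return (degree dict, average local clustering coefficient)
    for an undirected graph given as a list of (u, v) edges."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    degree = {u: len(nbrs) for u, nbrs in adj.items()}

    def local_cc(u):
        # fraction of a node's neighbor pairs that are themselves linked
        nbrs = list(adj[u])
        k = len(nbrs)
        if k < 2:
            return 0.0
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        return 2 * links / (k * (k - 1))

    avg_cc = sum(local_cc(u) for u in adj) / len(adj)
    return degree, avg_cc
```

A triangle graph gives clustering 1.0; a chain gives 0.0, which is the kind of contrast such comparisons rest on.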
Design of Model Service System for Digital City
ZHANG Zi-min,LI Qi
Computer Science. 2009, 36 (12): 151-153. 
Abstract PDF(266KB) ( 587 )   
RelatedCitation | Metrics
First, the view that the construction of a digital city should turn from providing "data service" to offering "application service" was proposed, and two aspects of "application service" were given: information services oriented to the public and to professional fields, respectively. Then, for the second aspect, which has been studied less than the former, a model service system that is part of the application service platform of a digital city was designed. The role and functions of the model service system in the platform were discussed, and some critical characteristics of the system, including its structure, the model interface specifications, and the workflow to use it, were analyzed. Lastly, the system prototype was presented and its development status was given.
Analysis of Automated Trust Negotiation Policy Based on Label Tree
XIA Dong-mei,ZEN Guo-sun,CHEN Bo,BA0 Yu
Computer Science. 2009, 36 (12): 154-157. 
Abstract PDF(425KB) ( 567 )   
RelatedCitation | Metrics
In a virtual computing environment, secure co-operation is based on trust between strangers, and automated trust negotiation provides a means to establish trust between strangers in distributed settings. However, current negotiation takes it for granted that the access control policy of the negotiation is correct, which may cause problems that lead the negotiation to fail. This paper focused on analyzing the characteristics of negotiation policies. Firstly, aiming at inconsistency problems such as inconsistent policies and trivial policies, it established a logic proving method based on a labeled binary tree to test policy consistency, and proved the soundness and completeness of this method. Secondly, it obtained the minimal credential set by simplifying the policy tree; a successful negotiation is then achieved through a one-off discovery of the minimal credential set, which avoids policy cycles and improves the efficiency and success probability of negotiation.
Design and Implement on a Backward Reasoning Algorithm Based on Fuzzy Petri Net
YANG Jin-song,LING Pei-liang
Computer Science. 2009, 36 (12): 158-160. 
Abstract PDF(226KB) ( 730 )   
RelatedCitation | Metrics
Fuzzy Petri net (FPN) is an ideal tool for fuzzy production knowledge representation and reasoning. This paper designed the FPN model for a fuzzy production rule-based knowledge base and implemented a backward fuzzy reasoning algorithm based on FPN through recursion. The algorithm was verified by examples in this paper. Using the algorithm, one can calculate the tokens of any appointed places, which correspond to the true fuzzy values of the relevant propositions. The algorithm is strong in logic expression and easy to implement on a computer system. The proposed algorithm can effectively reduce the computing space by transforming a large and complex FPN-based system into a small subsystem related to the problem at hand, and improves computational efficiency.
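The recursive backward idea can be sketched without the Petri net machinery as backward chaining over fuzzy production rules with max-min composition (a common FPN reading, not the paper's exact algorithm; it assumes an acyclic rule base, and the rule format is hypothetical):

```python
def backward_token(goal, rules, facts):
    """Recursively compute the token (fuzzy truth value) of a proposition.
    rules: list of (premises_tuple, conclusion, certainty_factor mu).
    facts: dict mapping known propositions to their tokens.
    Token(goal) = max over matching rules of mu * min(premise tokens).
    Assumes the rule base has no cycles (otherwise recursion would not end)."""
    if goal in facts:
        return facts[goal]
    best = 0.0
    for premises, conclusion, mu in rules:
        if conclusion == goal:
            vals = [backward_token(p, rules, facts) for p in premises]
            if vals:
                best = max(best, mu * min(vals))
    return best
```

Only places on the recursion path from the goal are ever visited, which mirrors the paper's point about shrinking a large system to a problem-relevant subsystem.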
Process Model for Multimedia Task Management and its Formal Descriptions
RUAN Feng,DONG Biao,CHEN Jin-hui
Computer Science. 2009, 36 (12): 161-163. 
Abstract PDF(346KB) ( 589 )   
RelatedCitation | Metrics
Multimedia task management (MTM) is the automation of a business process supporting multimedia content in whole or in part. To achieve the function of MTM, the multimedia business process must be abstracted from the real world and described by a formal method; the result is a process model. This paper presented a novel approach for modeling and implementing the process model using a graph-oriented multimedia model (GOMM). The GOMM constitutes a self-contained piece of multimedia content that indivisibly unites three of the content's aspects, namely the media aspect, the semantic aspect, and the functional aspect, as well as versioning information. The model provides means for integrating ontological knowledge within a GOMM's graph structure. A formal definition was proposed for GOMM graphs.
Accident Information Interaction in the Virtual Team Based on Multi-agent
YANG Chun,LIU Jian-gang
Computer Science. 2009, 36 (12): 164-166. 
Abstract PDF(284KB) ( 579 )   
RelatedCitation | Metrics
During collaborative product development, the members of a virtual team may interact with accident information. The modes and classes of information interaction in a virtual team were discussed. Then, the occurrence and importance of accident information interaction in a virtual team were introduced. On this basis, a multi-agent based model of accident information interaction in a virtual team was developed. Finally, a prototype of this model was built.
EPN-based Approach for E-learning Resource Composition
GAO Min, LI Hua,WU Zhong-fu
Computer Science. 2009, 36 (12): 167-170. 
Abstract PDF(323KB) ( 627 )   
RelatedCitation | Metrics
In the e-learning domain, learning resource composition is essential to improve the efficiency of courseware development. This paper presented an EPN-based approach to modeling and analyzing e-learning resource composition flows. Until now, most methods have been based on high-level Petri nets (HPN), but the resources HPN describes are not suitable for learning resource composition, which leads to inefficiency in modeling and analyzing composition flows. We proposed an EPN-based approach to this issue and described techniques for representing transition rules, the algorithm, and the workflow, so that resource composition can be carried out automatically. Finally, an example shows the method is suitable for resource composition.
Region Relations of the Vague Region with Kernel and the Vague-hole Region and the Implication Theorem
LI Song,HAO Zhong-xiao
Computer Science. 2009, 36 (12): 171-175. 
Abstract PDF(437KB) ( 578 )   
RelatedCitation | Metrics
Representing the spatial information of Vague regions and handling Vague region relations are of great significance in spatial databases, geographical information systems and artificial intelligence. The region relations of the Vague region with kernel and the Vague-hole region were discussed in detail based on Vague sets. The concepts of the Vague region with kernel, the Vague-hole region and the region partition were given based on Vague sets. The region relations of the Vague region with kernel and the Vague-hole region in different planes were discussed respectively. The implication theorem and the implication algorithm for the relations of the Vague sub-regions were also given, together with the analysis of an instance. The results of this work can deal with the indeterminate membership information of the Vague points in Vague regions with kernel and Vague-hole regions, and the region relations of the Vague region with kernel and the Vague-hole region can be determined well.
Knowledge Reduction of Set-valued Information Systems Based on Dominance Relation
CHEN Zi-chun,LIU Peng-hui,QIN Ke-yun
Computer Science. 2009, 36 (12): 176-178. 
Abstract PDF(322KB) ( 611 )   
RelatedCitation | Metrics
We proposed two different types of dominance relations in set-valued information systems based on the distribution of set values when all attributes are considered as criteria, generalizing the dominance relations in classical information systems. The judgment of knowledge reduction and the relative reduction of each object were studied for the two types of dominance relations in set-valued information systems, from which we defined a discernibility function based on the discernibility matrix and obtained approaches for computing knowledge reduction and the relative reduction of each object.
Research on Interest-driven Iceberg Cube Construction and Incremental Update Method
GAO Ya-zhuo,NI Zhi-wei,GUO Jun-feng,HU Tang-lei
Computer Science. 2009, 36 (12): 179-182. 
Abstract PDF(334KB) ( 644 )   
RelatedCitation | Metrics
This paper proposed a matrix-based iceberg cube construction method called MICC, in order to reduce the space consumption of full materialization of a data cube. Users' interests are taken as the standard for constructing the iceberg cube. On the basis of the result of MICC, the paper proposed an incremental update method, ICIU, to dynamically update the iceberg cube as users' interests change. Experiments demonstrate the efficiency and accuracy of the two methods, MICC and ICIU.
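The defining property of an iceberg cube, materializing only group-by cells whose aggregate clears a threshold, can be shown with a brute-force sketch (a generic illustration of the concept, not the matrix-based MICC method; the column names are made up):

```python
from collections import Counter
from itertools import combinations

def iceberg_cube(rows, dims, min_support):
    """Materialize only the cells of the data cube whose count >= min_support.
    rows: list of dicts; dims: ordered dimension names.
    '*' marks an aggregated ("ALL") dimension in a cell key."""
    cells = Counter()
    for row in rows:
        # every row contributes to one cell per subset of dimensions
        for r in range(len(dims) + 1):
            for subset in combinations(dims, r):
                key = tuple(row[d] if d in subset else "*" for d in dims)
                cells[key] += 1
    return {cell: c for cell, c in cells.items() if c >= min_support}
```

Real iceberg algorithms prune low-count cells during construction instead of filtering afterwards; this version only illustrates which cells survive.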
Adaptive Hierarchical Fuzzy Inference-based Modulation Recognition Algorithm
CHEN Xiao-qian,WANG Hong-yuan
Computer Science. 2009, 36 (12): 183-186. 
Abstract PDF(340KB) ( 683 )   
RelatedCitation | Metrics
For non-stationary digitally modulated signals, a novel feature, the High Order Cross Cumulant (HOCC), was proposed. The non-linear dynamic modeling of an adaptive fuzzy modulation classifier, based on the training mechanism of a neural network, was first presented. The model adopts a hierarchical decision-based structure, which makes the features match the classifier and reduces the redundancy of the membership functions and fuzzy rules. According to the distribution of the feature samples, we established a Hierarchical Fuzzy Neural Network System (HFNNS) with initial experience to guarantee the controllability of the knowledge inference structure. Using the training data, the algorithm adaptively adjusts and optimizes the structure parameters and completes the approximation process. The simulation results verified the better robustness of the system under various environment parameters (SNR etc.), as well as the improvement in average probability of correct classification and in algorithm efficiency, compared with neural network and fuzzy classifiers.
Approach for Discovering Association Rules from Log Ontologies Based on Hybrid System
SUN Ming,CHEN Bo,ZHOU Ming-tian
Computer Science. 2009, 36 (12): 187-190. 
Abstract PDF(362KB) ( 626 )   
RelatedCitation | Metrics
Building access pattern association rules on top of log ontologies is one of the main tasks of semantic Web usage mining. With the restrictions of DL-safe rules, we combined log ontologies with first-order application rules to build a hybrid log knowledge base, which improves the knowledge representation and reasoning capability of the Web log system. After mining frequent user-access patterns from the hybrid system through ILP theory, the access pattern association rules can be constructed to discover potential associations between user-access behaviors. This method improves the results of semantic Web usage mining and provides more decision-making support for optimizing the structure of Web sites. The experimental results show that this method is effective and quite feasible for solving practical problems.
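The final step, turning frequent access patterns into association rules, follows the classic support/confidence scheme; this brute-force sketch illustrates that scheme on plain item sets (the ontology and DL-safe machinery of the paper are omitted, and the data is invented):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """transactions: list of sets of items. Returns {itemset tuple: support}."""
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for r in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, r):
            support = sum(1 for t in transactions if set(cand) <= t) / len(transactions)
            if support >= min_support:
                freq[cand] = support
                found = True
        if not found:       # no frequent set of size r => none of size r+1 (Apriori)
            break
    return freq

def association_rules(freq, min_conf):
    """Emit (lhs, rhs, confidence) for every split of each frequent itemset."""
    out = []
    for itemset, support in freq.items():
        for r in range(1, len(itemset)):
            for lhs in combinations(itemset, r):
                if lhs in freq and support / freq[lhs] >= min_conf:
                    rhs = tuple(i for i in itemset if i not in lhs)
                    out.append((lhs, rhs, support / freq[lhs]))
    return out
```

In the paper's setting the "items" would be semantically annotated access events rather than raw page hits, which is what the hybrid knowledge base contributes.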
Credal Network Inference Reduct Algorithm Based on d-separate
QU Ying,WU Qi-zong,CUI Chun-sheng
Computer Science. 2009, 36 (12): 191-193. 
Abstract PDF(252KB) ( 624 )   
RelatedCitation | Metrics
To address problems such as the combinatorial explosion of Credal set vertices in Credal network inference, inference network reduction was proposed. Using d-separation among the variables, an algorithm to compute the inference network reduction of a large-scale network was designed. An application case indicated that the algorithm can simplify the inference network and improve the efficiency of inference, and that it is useful and feasible for certain kinds of inference.
Intuitionistic Fuzzy S-rough Decision Models and Application
HU Jun-hong,LEI Ying-jie
Computer Science. 2009, 36 (12): 194-196. 
Abstract PDF(308KB) ( 549 )   
RelatedCitation | Metrics
Based on the theory of intuitionistic fuzzy S-rough sets, a two-direction intuitionistic fuzzy S-rough decision model was proposed. First, approaches to measures for evaluating goals and for normalizing goal matrix values were described. Then, the above-decision and under-decision models of intuitionistic fuzzy S-rough decision were constructed, and the algorithm and detailed process of intuitionistic fuzzy S-rough decision were given. Finally, threat evaluation instances for air attack targets were detailed. The results show that the intuitionistic fuzzy S-rough model can treat decision ingredients comprehensively, and its results reflect the truth well.
Clustering Based on Evolutionary Algorithm in the Presence of Obstacles
WANG Yuan-ni,BIAN Fu-ling
Computer Science. 2009, 36 (12): 197-198. 
Abstract PDF(261KB) ( 593 )   
RelatedCitation | Metrics
In the real world, constraints require spatial clustering to take restrictive conditions into account; this paper studied spatial clustering in the presence of obstacles. It mainly used the K-medoid algorithm for clustering, and introduced an improved Guo Tao algorithm to compute distances between spatial objects in the presence of obstacles. The approach is efficient for small and medium-sized data. Theoretical analysis and experiments show that the algorithm is feasible.
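A minimal K-medoids sketch with a pluggable distance function shows where the obstacle handling slots in: for clustering with obstacles, `dist` would be replaced by an obstructed distance (shortest path around obstacles) instead of the Euclidean distance used here. This is a generic textbook K-medoids, not the paper's improved algorithm:

```python
import random

def k_medoids(points, k, dist, iters=100, seed=0):
    """Plain K-medoids. `dist(p, q)` is pluggable: pass an obstacle-aware
    distance to cluster in the presence of obstacles."""
    rng = random.Random(seed)
    medoids = rng.sample(range(len(points)), k)
    clusters = {}
    for _ in range(iters):
        # assignment step: each point joins its nearest medoid
        clusters = {m: [] for m in medoids}
        for i, p in enumerate(points):
            nearest = min(medoids, key=lambda m: dist(p, points[m]))
            clusters[nearest].append(i)
        # update step: each cluster's new medoid minimizes total in-cluster distance
        new_medoids = []
        for members in clusters.values():
            best = min(members, key=lambda i: sum(dist(points[i], points[j])
                                                  for j in members))
            new_medoids.append(best)
        if sorted(new_medoids) == sorted(medoids):
            break
        medoids = new_medoids
    return medoids, clusters
```

On two well-separated point groups the partition is recovered regardless of the random initialization.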
Immune Cloning Particle Swarm Optimization for Wave Impedance Inversion
NIE Ru,YUE Jian-hua,DENG Shuai-qi,LIU Yang-guang
Computer Science. 2009, 36 (12): 199-202. 
Abstract PDF(474KB) ( 678 )   
RelatedCitation | Metrics
In standard particle swarm optimization (PSO), the premature convergence of particles and slow convergence in the late stage decrease the searching ability of the algorithm. By introducing a hybrid mutation mechanism, an immune cloning PSO (ICPSO) algorithm was proposed and applied to the wave impedance inversion problem. When a local extremum is close to the global extremum, the proposed cloning selection operator can accelerate the best particle away from the local extremum. On the other hand, when the local extremum is far from the global extremum, a Tent sequence is adopted to extend the search scope and further mutate the best particle. Simulations on wave impedance inversion indicate that ICPSO has better efficiency and higher accuracy.
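The overall shape of such a scheme can be sketched as standard PSO plus a Tent-map mutation of a clone of the best particle (a simplified illustration under assumed coefficients w=0.7, c1=c2=1.5 and a mutation radius of 10% of the range; the paper's cloning selection operator is more elaborate):

```python
import random

def tent(x):
    # Tent chaotic map on (0, 1)
    return 2 * x if x < 0.5 else 2 * (1 - x)

def icpso_sketch(f, dim, lo, hi, n=20, iters=200, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                      # personal bests
    Pf = [f(x) for x in X]
    g = min(range(n), key=lambda i: Pf[i])
    G, Gf = P[g][:], Pf[g]                     # global best
    u = rng.random()                           # seed of the Tent sequence
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (P[i][d] - X[i][d])
                           + 1.5 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < Pf[i]:
                P[i], Pf[i] = X[i][:], fx
                if fx < Gf:
                    G, Gf = X[i][:], fx
        # clone the best particle and perturb it with the Tent sequence
        u = tent(u)
        clone = [min(hi, max(lo, G[d] + (u - 0.5) * 0.1 * (hi - lo)))
                 for d in range(dim)]
        fc = f(clone)
        if fc < Gf:
            G, Gf = clone, fc
    return G, Gf
```

On a 2-D sphere function the sketch converges close to the optimum within a couple of hundred iterations.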
Off-line Handwritten Character Recognition Based on Hierarchical Classification
WANG Yun-peng,MIAO Duo-qian,YUE Xiao-dong
Computer Science. 2009, 36 (12): 203-209. 
Abstract PDF(600KB) ( 597 )   
RelatedCitation | Metrics
The paper proposed a method of off-line handwritten character recognition based on hierarchical classification. The method simulates the human process of character recognition: when a person recognizes a character, different strategies are used in different situations. If the character has a simple structure, global features are used; if it looks similar to another character, local features are used. We divided the classifier into a macro layer and a micro layer. The macro layer uses gradient features to represent global features and simulates the recognition of simple targets; the micro layer uses principal curve features to represent local features and simulates the recognition of similar-form characters. We used confidence values to measure the indeterminacy of the process and result, gave a definition of similar-form characters, and gave rules for telling them apart. The experimental results indicate that the method can effectively improve the recognition rate of off-line handwritten characters, and performs especially well in distinguishing similar-form characters.
Structural Risk Minimization for Controlling Generalization Performance of Rough Set Learning Machine
LIU Jin-fu,YU Da-ren
Computer Science. 2009, 36 (12): 210-213. 
Abstract PDF(377KB) ( 578 )   
RelatedCitation | Metrics
The factors influencing the generalization performance of rough set learning machine were analyzed. Through introducing the principle of structural risk minimization into rough set learning process, structural risk minimization on rough set learning was proposed. Experiments on 12 UCI data sets show that the proposed method is effective for improving the generalization performance of rough set learning machine.
Correlation Properties of T-intuitionistic Fuzzy Subgroups
CHENG Tao, MI Ju-sheng
Computer Science. 2009, 36 (12): 214-215. 
Abstract PDF(219KB) ( 637 )   
RelatedCitation | Metrics
We discussed the relation between intuitionistic fuzzy rough sets and group structure. By drawing on the T-intuitionistic fuzzy normal subgroup, we obtained the T-intuitionistic fuzzy similarity relation on a group, and set up intuitionistic fuzzy rough sets on the group. We studied the product of the intuitionistic fuzzy rough sets and the properties of intuitionistic fuzzy rough upper and lower approximation operators under group homomorphism.
Acquisitions to Decision Rules and Algorithms to Inferences Based on Crisp-fuzzy Variable Threshold Concept Lattices
QIU Guo-fang,ZHU Zhao-hui
Computer Science. 2009, 36 (12): 216-218. 
Abstract PDF(214KB) ( 673 )   
RelatedCitation | Metrics
Four kinds of crisp-fuzzy variable threshold concepts are introduced in a fuzzy formal context, and they form four kinds of variable threshold concept lattices respectively. Four kinds of decision rule sets are then obtained based on these lattices. Algorithms for inference under the different decision rule sets are established via an inclusion degree, and the total decision rules of all combinations among objects are acquired. The decisions from the total decision rules are proved to be the lower- and upper-approximated decision sets, and the algorithms are accordant and consistent.
Problem of Rigid Assignment to Variables in Predicate Modal Logic
JIANG Feng
Computer Science. 2009, 36 (12): 219-222. 
Abstract PDF(388KB) ( 621 )   
RelatedCitation | Metrics
Propositional modal logic is now an effective tool in artificial intelligence and other areas of computer science, but predicate modal logic is not. There exist many controversies and problems in predicate modal logic. It would seem to be a simple matter to obtain predicate modal logic by adding quantifiers to propositional modal logic. The addition of quantifiers, however, opens the door to a labyrinth full of twists and problems, such as the problem of rigid assignment to variables, the problem of constant versus varying domains, and the problem of transworld identity. We mainly discussed the problem of rigid assignment to variables in predicate modal logic. First, we gave an introduction to the problem. Second, we discussed the shortcomings of existing methods for it. Finally, we proposed our method by analyzing the basic reasons behind the problem.
Weighted Truth-valued-flow Inference Algorithm
HE Ying-si,DENG Hui-wen
Computer Science. 2009, 36 (12): 223-226. 
Abstract PDF(305KB) ( 555 )   
RelatedCitation | Metrics
The consistency property of the Truth-valued-flow inference algorithm was investigated. First, the consistency of the Truth-valued-flow inference algorithm for a single rule was proved; then an example was given to show that the Truth-valued-flow inference algorithm is not consistent for multiple rules. Finally, a weighted Truth-valued-flow inference algorithm was proposed and proved to be consistent for multiple rules.
Logical Paragraph Division Based on Semantic Characteristics and its Application
ZHU Zhen-fang,LIU Pei-yu,WANG Jin-long
Computer Science. 2009, 36 (12): 227-230. 
Abstract PDF(462KB) ( 892 )   
RelatedCitation | Metrics
A new matching method based on logic-centered paragraphs was introduced. Built on a concept dictionary, the method performs cluster analysis of paragraphs with the same meaning in the text to be classified by analyzing the logical concepts of the text, thereby obtaining a logical level; it establishes the logical paragraph concept on the basis of this logical-level division, and then measures the contribution of different paragraphs to the text theme according to the logical paragraphs. Meanwhile, to solve the problems of synonyms and polysemy in the matching process, expansion by synonym concepts and related words was introduced. Experimental results show that this method can obtain a higher accuracy rate in content filtering, effectively improving classification.
Performance Evaluation Method for Asynchronous Circuit Based on Static Data Flow Structure
JIN Gang,WANG Lei,WANG Zhi-ying
Computer Science. 2009, 36 (12): 231-234. 
Abstract PDF(357KB) ( 807 )   
RelatedCitation | Metrics
The Static Data Flow Structure (SDFS) is an abstract model of asynchronous circuits that is very flexible and understandable. A Petri net model used for performance analysis was introduced, along with a fast performance evaluation method that is well suited for very large scale asynchronous circuit design. The size of this model is just half of the traditional Petri net model for asynchronous circuits, and the read-arc introduced in the previous Petri net model is eliminated in this model. Finally, several experiments were presented to demonstrate the validity of this model.
Performance Analysis of Webit Quad-core Processor
WANG Jia-liang,ZHAO Hai,LI Peng,LIU Zheng
Computer Science. 2009, 36 (12): 235-237. 
Abstract PDF(372KB) ( 589 )   
RelatedCitation | Metrics
Using a real-time operating system kernel to manage multi-tasking is a trend in current embedded applications, but embedded devices are severely limited by their scarce resources, so a single-core processor is often unable to meet practical application demands. This paper therefore designed and implemented a real-time, multi-tasking, preemptive Webit quad-core processor system based on 8-bit AVR single-chip microcontrollers, generally retaining the main features of the Webit 2.0 kernel. The Webit quad-core processor system uses four AT90S8515 single-chip microcontrollers connected through the ISA bus to process multiple tasks in parallel, and experimental results show that its performance is improved greatly.
Fire Simulation with GPU-based Particle System
QIU Yu-feng,ZENG Guo-sun
Computer Science. 2009, 36 (12): 238-242. 
Abstract PDF(412KB) ( 608 )   
RelatedCitation | Metrics
This approach aims at improving the performance and reality of fire simulation by virtue of the powerful computing ability of the GPU. An emitter composed of yaw and pitch was designed to control the particle stream precisely. The formula of the laminar flame was modified to outline the shape of a turbulent fire. Besides, Lagrange interpolation was used to smooth and obtain an accurate fire skeleton around which particles move, even when it becomes twisted. To improve performance, global memory was adopted to store particles to avoid the cost of repeatedly binding texture memory; the parallelization of Lagrange interpolation and of particle property updates was realized with CUDA. All these measures contribute to a desirable real-time simulation speed and an improvement of performance.
Performance Comparison and Analysis of Fundamental Matrix Estimating Methods for Computer Vision Applications
CAI Tao,DUAN Shan-xu,LI De-hua
Computer Science. 2009, 36 (12): 243-247. 
Abstract PDF(546KB) ( 808 )   
RelatedCitation | Metrics
The fundamental matrix (F matrix) relates corresponding points across two different viewpoints and defines the basic relationship between any two images of the same scene. Therefore, the F matrix plays an important role in most computer vision applications. Some important computing methods for the F matrix were introduced and analyzed after describing the epipolar geometry in computer vision. Finally, these methods were implemented and their performances were evaluated systematically on simulated data and practical images. The test results showed that 1) the linear methods work well on precisely located point pairs without mismatches; 2) the iterative nonlinear methods can overcome Gaussian noise in the positions of point pairs, but perform poorly with mismatched points; 3) the robust methods can resolve the problems brought by both noise and mismatching. Furthermore, the results also showed that the eigen-analysis based orthogonal regression methods outperform the conventional least squares methods.
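As an illustration of the linear family of methods, the standard normalized 8-point algorithm (a textbook formulation, not necessarily the exact variant evaluated in the paper) estimates F from noiseless correspondences satisfying x2'ᵀ F x1 = 0:

```python
import numpy as np

def normalize(pts):
    # translate to the centroid and scale so the mean distance is sqrt(2)
    c = pts.mean(axis=0)
    s = np.sqrt(2) / (np.linalg.norm(pts - c, axis=1).mean() + 1e-12)
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
    return (pts - c) * s, T

def eight_point(p1, p2):
    """Normalized 8-point estimate of F, with x2^T F x1 = 0. p1, p2: Nx2 arrays."""
    q1, T1 = normalize(p1)
    q2, T2 = normalize(p2)
    # each correspondence gives one row of the homogeneous system A f = 0
    A = np.column_stack([q2[:, 0] * q1[:, 0], q2[:, 0] * q1[:, 1], q2[:, 0],
                         q2[:, 1] * q1[:, 0], q2[:, 1] * q1[:, 1], q2[:, 1],
                         q1[:, 0], q1[:, 1], np.ones(len(q1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)              # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    return T2.T @ F @ T1                     # undo the normalization
```

On exact synthetic correspondences the epipolar residuals are near machine precision, which matches the survey's point 1); with noise or mismatches the nonlinear and robust methods take over.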
New Algorithm of Image Rotation Matching Based on Feature Points
WU Ding-xue, GONG Jun-bin,XU Hong-bo,TIAN Jin-wen
Computer Science. 2009, 36 (12): 248-250. 
Abstract PDF(310KB) ( 1204 )   
RelatedCitation | Metrics
Template matching has many applications in signal processing, image processing, pattern recognition, and video compression. It finds a desired template in a large reference image by sliding the template window on a pixel-by-pixel basis, computing the degree of similarity between them, and searching for the position with the largest similarity measurement. It is computationally expensive to search every possible position of the template window within the larger reference image, and when there is a rotation between the template and the reference image, the conventional template matching algorithm described above is not practical for real-time processing. In this paper, a point matching algorithm was proposed to match the rotated template: firstly, the feature points are detected by the Harris detector; then the facet model is used to locally approximate the image intensity function, and the rotation-invariant features of the feature points are extracted; finally, the transformation (translation and rotation) is obtained by matching the feature points. Results have shown the efficacy of the proposed method.
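The first stage, the Harris detector, can be sketched in a few lines (a minimal version with central-difference gradients and a box window instead of the usual Gaussian; `np.roll` wraps at the borders, which is harmless away from the image edge):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Corner response R = det(M) - k * trace(M)^2 for the structure tensor M."""
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2     # central differences
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2

    def box(a, r=1):
        # sum over a (2r+1)x(2r+1) window (box filter in place of a Gaussian)
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, 0), dx, 1)
        return out

    A, B, C = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return A * B - C * C - k * (A + B) ** 2
```

On a white square against black, the response is high at the square's corners, negative along its edges, and zero in flat regions, which is exactly the discrimination the detector is built for. A full pipeline would add thresholding and non-maximum suppression, and the paper further refines corners to sub-pixel accuracy via the facet model.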
Image Watermarking Scheme in Wavelet Domain against Geometrical Attacks
LOU Ou-jun,WANG Xiang-hai,WANG Zheng-xuan
Computer Science. 2009, 36 (12): 251-256. 
Abstract PDF(504KB) ( 688 )   
RelatedCitation | Metrics
The toughest challenge facing robust watermarking is geometrical attacks. A slight, even imperceptible geometrical attack can defeat a watermark in the wavelet domain due to the shift-variance of the wavelet transform. Based on feature points, this paper proposed an image watermarking scheme resistant to geometrical attacks. First, according to the tree structure of the wavelet coefficients, the proposed scheme selects the root of the directional subtree with the highest texture as the embedding point in each wavelet tree. Second, this paper proposed an adaptive embedding strategy according to the energy of the low frequency coefficient corresponding to the embedding point and the texture characteristics of the highest frequency coefficients of the subtree. Finally, the proposed scheme uses the Harris-Laplace operator to extract feature points robust to geometrical attacks and forms a template of feature points. During detection, the scheme restores the attacked image by a linear transformation using the feature template, and then verifies the watermark through statistical correlation. The detection process does not need the original host image. Experimental results show that the proposed algorithm has good transparency and is very robust to common image processing and geometric attacks.
Combining ODCS and SIFT for Fast Color Object Identification
LI Hai-tao,WU Pei-liang,KONG Ling-fu
Computer Science. 2009, 36 (12): 257-258. 
Abstract PDF(268KB) ( 582 )   
RelatedCitation | Metrics
To improve the speed of object identification based on the scale invariant feature transform (SIFT), a new fast approach called ODCS-SIFT was proposed to locate and identify color objects in a scene using a combination of the object dominant color set (ODCS) and SIFT. The approach comprises two joint stages: an off-line training stage and an on-line identifying stage. At the off-line stage, the ODCS and SIFT libraries are built through human-machine interaction; at the on-line stage, the whole scene image is first searched using ordinal grid scanning and seed filling simultaneously, then the object is located roughly using the ODCS frequency restriction, and lastly, in the smaller grayed location, a more real-time and precise SIFT extraction and matching is executed. Experimental results show that ODCS-SIFT can effectively improve the speed of object identification.
Dual Video Watermarking Technique Based on Energy Difference Ratio and Spread Spectrum
FU De-sheng,WANG Jian-rong,SUN Wen-jing
Computer Science. 2009, 36 (12): 259-262. 
Abstract PDF(351KB) ( 655 )   
RelatedCitation | Metrics
A dual video watermarking technique based on energy difference ratio and spread spectrum was proposed. Firstly, at the intra prediction, transform and quantization stage, it embeds a robust watermark via direct permutation of low-mid frequency DCT coefficients. Furthermore, a semi-fragile watermark is embedded by the energy-difference-ratio algorithm. In addition, watermark extraction in this algorithm enables blind detection. Experimental results show that the proposed method does not affect the size of video files, and has strong robustness under attacks such as requantization, block deletion and collusion.
Study on Compression Algorithm of Cruise Missile Image Based on Wavelet Code
CHEN Sheng-lai,ZHENG Ai-min,LI Tao
Computer Science. 2009, 36 (12): 263-266. 
Abstract PDF(353KB) ( 574 )   
RelatedCitation | Metrics
A wavelet coding algorithm is adopted because the missile-borne compression system requires large-ratio, real-time image compression while guaranteeing image quality, and a DSP (Digital Signal Processor) is selected as the processor for the algorithm. The wavelet coding algorithm is composed of two steps: images are transformed by an integer lifting scheme, and the transform results are coded by the SPIHT (Set Partitioning in Hierarchical Trees) algorithm. The wavelet coding algorithm is improved to suit DSP parallel processing, since it has defects such as many repeated calculations and heavy memory consumption. Experimental results show that the peak signal-to-noise ratio (PSNR) of the improved algorithm is larger than 28dB, as high as the original algorithm, which can meet the image quality requirement; the compression speed is 20 frames per minute, which fully meets the real-time compression requirement.
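The integer lifting step can be illustrated with the reversible 5/3 wavelet used in JPEG 2000 (a one-level, 1-D sketch with simple boundary replication, not the paper's DSP-optimized implementation). Because every lifting step is an integer operation that the inverse undoes exactly, reconstruction is lossless:

```python
def lift_53(x):
    """One level of the reversible 5/3 lifting wavelet on an even-length
    integer list. Returns (low-pass s, high-pass d)."""
    s, d = list(x[0::2]), list(x[1::2])
    # predict step: detail = odd sample minus average of its even neighbors
    d = [d[i] - ((s[i] + s[min(i + 1, len(s) - 1)]) >> 1) for i in range(len(d))]
    # update step: smooth the even samples with the new details
    s = [s[i] + ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(len(s))]
    return s, d

def unlift_53(s, d):
    """Exact inverse: undo update, then undo predict, then interleave."""
    s = [s[i] - ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(len(s))]
    d = [d[i] + ((s[i] + s[min(i + 1, len(s) - 1)]) >> 1) for i in range(len(d))]
    out = []
    for a, b in zip(s, d):
        out += [a, b]
    return out
```

In the full codec, this transform is applied separably to rows and columns over several levels, and SPIHT then codes the coefficient trees; lifting is attractive on a DSP precisely because it is in-place and integer-only.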
Enhancement of Detail Characters within Medical Image
JIAO Feng,BI Shuo-ben,GENG Huan-tong
Computer Science. 2009, 36 (12): 267-269. 
Abstract PDF(272KB) ( 736 )   
RelatedCitation | Metrics
Low contrast and heavy noise are the main shortcomings of X-ray medical images, which make the images vague and uncertain. As a result, some very useful detail characteristics are weakened and are difficult to distinguish even by the naked eye. Based on an analysis of the multi-resolution wavelet transform and the bilateral filtering operator, an image enhancement algorithm for detail characters was presented. The algorithm can enhance the detail characters while suppressing noise amplification and keeping distortion low. The analysis of the results shows that local regions of the image are enhanced by using the 8-neighbor gradient contrast of the wavelet transform coefficients, which makes the detail characters of the image clearer adaptively. Experiments were conducted on real pictures, and the results show that the algorithm is flexible and convenient.
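The bilateral filtering operator at the heart of such schemes can be sketched directly in numpy: each pixel becomes a weighted mean of its neighbours, weighted by both spatial distance and intensity difference, so noise is smoothed while edges (large intensity differences) are preserved. The window radius and the two sigmas below are illustrative values, not the paper's parameters.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Brute-force bilateral filter on a 2-D grayscale array."""
    img = img.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weight: penalize neighbours with a different intensity
            rangew = np.exp(-(win - img[i, j])**2 / (2.0 * sigma_r**2))
            wgt = spatial * rangew
            out[i, j] = np.sum(wgt * win) / np.sum(wgt)
    return out
```

In a wavelet-domain enhancement pipeline this edge-preserving smoothing is what keeps amplified detail coefficients from also amplifying noise.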
Corner Detection Algorithm for Image Mosaic
FENG Yu-ping,DAI Min,ZHANG Wei,WANG Mei-jiao
Computer Science. 2009, 36 (12): 270-271. 
Abstract PDF(290KB) ( 616 )   
RelatedCitation | Metrics
An improved algorithm for corner detection was proposed by analyzing the theory of the Harris operator and its disadvantages. The main improvements are as follows. First, new gradient operators were used to compute the image derivatives. Second, the method adopts an improved corner response function to avoid the arbitrariness of the k value. Then the algorithm automatically determines the corner extraction threshold based on the first image in the sequence. Last, in order to improve the precision of corner extraction, the method obtains sub-pixel corners within the 8-neighborhood of the target pixel. Experimental results show that the computation time of the improved algorithm is improved by about 61.3%, and it can be applied to image mosaic better.
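One widely used way to remove the empirical constant k from the Harris response is the ratio det(M)/(trace(M)+eps) of the local structure tensor M. The sketch below uses that variant on a synthetic corner; it is an illustration of a k-free response function, not the authors' exact operators.

```python
import numpy as np

def corner_response(img, eps=1e-6):
    """k-free Harris-style response det(M)/(trace(M)+eps), with the
    structure tensor M accumulated over a 3x3 window."""
    Iy, Ix = np.gradient(img.astype(np.float64))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):
        # 3x3 box sum via padding and shifted slices
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det / (trace + eps)

img = np.zeros((20, 20))
img[10:, 10:] = 1.0   # bright square => one corner at pixel (10, 10)
R = corner_response(img)
```

The response is large only where the gradient varies in two directions: it vanishes on the straight edge of the square and in flat regions, but peaks at the corner.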
ROI Coding Method Based on Selecting Subband and Bitplane Shift
XIA Chun-yu,WANG Xiang-hai
Computer Science. 2009, 36 (12): 272-277. 
Abstract PDF(510KB) ( 673 )   
RelatedCitation | Metrics
After carefully analyzing the advantages and disadvantages of the two kinds of ROI coding methods in JPEG2000, and statistically analyzing the energy of the wavelet-transform ROI coefficients in each subband, a new region-of-interest (ROI) coding method based on subband selection and bitplane shift was proposed: important subbands are selected and shifted according to the ROI mask and the wavelet-transform energy, while the important subbands are expanded so as to satisfy the "zero-tree" characteristic. This scheme has several novelties: (1) it flexibly adjusts the compression quality of the ROI and the ROB; (2) it codes multiple ROIs at various degrees of interest; (3) it codes arbitrarily shaped ROIs without coding the shape. Experimental results validate the effectiveness of the proposed algorithm.
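The bitplane-shift mechanism underlying ROI coding (as in JPEG2000's Maxshift method) can be shown with plain integers: ROI coefficients are scaled up by 2^s so that their bitplanes are transmitted before any background bitplane; if the embedded stream is truncated below plane s, the ROI still decodes exactly while the background loses precision first. This is a toy illustration of the principle, not a JPEG2000 codec.

```python
import numpy as np

S = 8  # number of planes the ROI is shifted up (illustrative)

def roi_shift_encode(coeffs, roi_mask, s=S):
    """Scale ROI coefficients by 2**s so they occupy the upper bitplanes."""
    out = coeffs.astype(np.int64).copy()
    out[roi_mask] <<= s
    return out

def truncate_bitplanes(coeffs, s=S):
    """Simulate stopping the stream after the top bitplanes: every bit
    below plane s is discarded (non-negative coefficients assumed)."""
    return coeffs & ~np.int64((1 << s) - 1)

def roi_shift_decode(coeffs, roi_mask, s=S):
    out = coeffs.copy()
    out[roi_mask] >>= s
    return out
```

Because the shift alone encodes which coefficients belong to the ROI (they are the ones above the background's top plane), no explicit ROI shape needs to be transmitted, which is the property point (3) of the abstract relies on.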
Color Cerebrovascular Image Skeleton Extraction Algorithm Based on Level Set Model
WU Jian,CUI Zhi-min,XU Jian,CAO Yan-yan
Computer Science. 2009, 36 (12): 278-281. 
Abstract PDF(307KB) ( 636 )   
RelatedCitation | Metrics
Skeleton extraction is a challenging subject in computer vision theory and applications and has important application value. Based on an analysis of the characteristics of color medical images of cerebral vessels, this paper introduced a Level Set speed function using color gradient information and Bayesian classification, proposed an HSV-space high-speed model, and put forward a skeleton extraction algorithm for color cerebrovascular images on the basis of this model. This algorithm uses two new intermediate functions, all of whose parameters are obtained by analysis, thus avoiding manual intervention. Experiments show that this algorithm is not sensitive to gradual color changes or boundary noise and has good validity and robustness.
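A speed function of the kind described, combining an edge-stopping term driven by the color gradient with a Bayesian class posterior, might look like the sketch below. The particular weighting (1/(1+|∇I|²)) and the use of a single posterior value are our assumptions for illustration, not the paper's exact intermediate functions.

```python
import numpy as np

def speed_function(color_grad_mag, p_vessel):
    """Level-set front speed: the front moves fast inside regions that
    look like vessel (high Bayesian posterior p_vessel) and slows to
    ~0 at strong color edges (large gradient magnitude)."""
    edge_stop = 1.0 / (1.0 + np.asarray(color_grad_mag, float)**2)
    return edge_stop * np.asarray(p_vessel, float)
```

Making the speed vanish at edges is what halts the evolving front on the vessel boundary, so the converged front can then be reduced to a centerline.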
New Binarization Algorithm for Given Object Extraction
LI Liang-hua,LUO Bin-jie
Computer Science. 2009, 36 (12): 282-284. 
Abstract PDF(264KB) ( 590 )   
RelatedCitation | Metrics
In order to meet the preprocessing requirements of Chinese cheque image recognition, this paper studied several binarization algorithms. After analyzing the gray-level histograms of 2000 cheque images, we found a clue, the maximum gradient, in the gray-level histogram of a cheque image, which can be used for segmenting the image and extracting the given object, the parallel lines of the amount field. With this clue, we developed a new binarization algorithm which makes the parallel lines of the amount field more obvious and easier to locate in the binarized image. Compared with several other commonly used binarization algorithms, the algorithm proposed in this paper proved more feasible and advanced in simulation tests.
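One plausible reading of the "maximum gradient" clue is to threshold at the gray level where the histogram changes most steeply. The sketch below implements that reading on a synthetic two-level image; the cheque-specific rules of the paper are not reproduced.

```python
import numpy as np

def max_gradient_threshold(gray):
    """Binarize at the gray level with the largest absolute first
    difference of the histogram (the steepest histogram step)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    grad = np.diff(hist.astype(np.int64))   # grad[i] = h[i+1] - h[i]
    t = int(np.argmax(np.abs(grad)))        # steepest step location
    return t, (gray > t).astype(np.uint8)

# synthetic "cheque": dark parallel-line pixels (50) on a light field (200)
gray = np.concatenate([np.full(100, 50), np.full(900, 200)])
t, binary = max_gradient_threshold(gray)
```

On this two-level example the steepest histogram step sits at the edge of the dominant peak, so thresholding there cleanly separates the two pixel populations.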
Bamboo Simulation Based on Improvement Fractal Algorithm and Displacement and Texture Mapping
LUO Yan,WU Zhong-fu,GUO Xuan-chang
Computer Science. 2009, 36 (12): 285-289. 
Abstract PDF(452KB) ( 677 )   
RelatedCitation | Metrics
Plant simulation is an important research topic in computer graphics. We discussed bamboo simulation modeling, which very few researchers have addressed. We improved the L-system algorithm based on the single-axis growth characteristics of bamboo, and explained the different techniques and methods used in bamboo modeling and realistic rendering. We adopted the displacement and texture mapping technique to solve the difficult problem of simulating the node sections of bamboo. With this method, realistic bamboo forms and characteristics found in nature can be simulated. This paper gives a new solution for plant simulation.
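The L-system rewriting at the core of such plant models is easy to sketch: every symbol of the current string is replaced by its production rule each generation. The axiom and production below form a generic monopodial ("single-axis") toy grammar, not the paper's improved rules.

```python
def lsystem(axiom, rules, iterations):
    """Iteratively rewrite every symbol by its production rule;
    symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# F: grow an internode, [ ]: push/pop a side branch, L: leaf, A: apex
rules = {"A": "F[L]A"}   # a single apex keeps extending one axis
```

A turtle interpreter then draws `F` as a segment and `[`/`]` as saved/restored state; because only the apex rewrites, the grammar produces one main axis with lateral leaves, matching bamboo's single-axis habit.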
Infrared Face Recognition Method Using Blood Perfusion and Sub-block DCT+FLD in Wavelet Domain
XIE Zhi-hua,WU Shi-qian,FANG Zhi-jun
Computer Science. 2009, 36 (12): 290-293. 
Abstract PDF(403KB) ( 634 )   
RelatedCitation | Metrics
To obtain good infrared face recognition performance from biological features, and to combine local and global characteristics, a novel infrared face recognition method was developed: an FLD-based feature extraction method combining blood perfusion and block DCT in the wavelet domain. First, thermal images are converted into the blood perfusion domain by a blood perfusion model to obtain consistent facial images free of the effect of ambient variations. Second, the blood perfusion data are decomposed using a two-scale discrete wavelet transform. Then, the low-frequency sub-band component is partitioned into sub-blocks, to which the DCT is applied. FLD is applied to the global features formed by the coefficients extracted from all sub-blocks in the DCT domain. Finally, Euclidean distance and the 3-NN classifier are utilized for recognition. The experiments conducted illustrate that the proposed method performs better than traditional PCA and PCA+FLD on thermal images.
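The block-DCT feature step can be sketched as follows: build an orthonormal DCT-II matrix, transform each sub-block, and keep a few low-frequency coefficients as the local feature vector. The 8x8 block size and the number of kept coefficients are illustrative choices, not the paper's settings.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix: C[k, m] = a_k * cos(pi*(2m+1)*k/(2n))."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0, :] /= np.sqrt(2.0)   # DC row scaled so that C @ C.T = I
    return C

def block_dct_features(block, keep=3):
    """2-D DCT of one sub-block; the top-left keep x keep low-frequency
    coefficients serve as the local feature vector later fed to FLD."""
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T
    return coeffs[:keep, :keep].ravel()
```

Concatenating these per-block vectors over all sub-blocks gives the global feature on which a discriminant projection such as FLD can then be trained.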