Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37, Issue 7, 2010
On Methodology of Modeling the Trust Base in Operating Systems
SHI Wen-chang
Computer Science. 2010, 37 (7): 1-6. 
This paper advocates the view that operating systems are indispensable to making applications trusted. It elaborates the concept of the Trust Base in Operating System (TBOS), with a focus on Web applications, and discusses research methods for modeling the TBOS so as to ensure that the TBOS itself is trusted. It proposes a TBOS architecture consisting of three main parts: the Trust Monitoring Core Engine, the In-Kernel Trust Monitor and the Out-of-Kernel Trust Monitor, together with a research guideline: exploit the potential of hardware and reduce the size of software. It states the key issues and key techniques in modeling the TBOS, and establishes the methodological foundation for modeling the TBOS from the aspects of model construction, trust monitoring, inter-domain collaboration, protection by isolation, hardware feature abstraction and software size minimization.
Research of Indicator for Information Assurance Evaluation
WU Zhi-jun,YANG Yi-xian
Computer Science. 2010, 37 (7): 7-10. 
Both information systems and information assurance are complex systems. To better demonstrate the effectiveness of information assurance, quantitative parameters are needed to serve as indicators for security evaluation. This paper presents indicators for information assurance built around a core security baseline policy, which is extracted from juristic documents covering national strategy, management policy, engineering criteria and technical measurements. Evaluation methods and procedures with double feedback are also given. The indicators help to improve the efficiency and persistence of information assurance and make information systems more secure.
Introduce Sentiment Analysis to Cognitive Science
LI Wei-jie
Computer Science. 2010, 37 (7): 11-15. 
We summarize the process of sentiment analysis, which has three main steps: acquisition and expression of sentiment, classification and estimation of sentiment, and application of sentiment analysis. The results of sentiment analysis can be classified into summarization of textual sentiment and estimation or analysis of sentiment toward persons or events in texts. However, little attention has been paid to the role that emotion in texts plays in cognition. To address this problem, we try to introduce sentiment analysis into cognitive science and to obtain the emotion in texts by means of sentiment analysis. We propose that emotion has two constituent parts: emotion signals and emotion elements. Emotion signals are the carriers of emotion, including phenomena observed in the body, such as facial color and heart rate, and media that express or convey emotion, such as images, words and voices. Emotion elements are those accepted by society, such as love, hate, delight, abashment, envy, guilt, dread and anxiety, and are related to consciousness. We also propose some ideas on introducing emotion into artificial intelligence, which may help cognitive science analyze and utilize emotion.
Advances in Shuffled Frog Leaping Algorithm
HAN Yi,CAI Jian-hu,ZHOU Gen-gui,LI Yan-lai,LIN Hua-zhen,TANG Jia-fu
Computer Science. 2010, 37 (7): 16-19. 
The Shuffled Frog Leaping Algorithm (SFLA) is a novel and effective population-based meta-heuristic that has received increasing attention from the academic and engineering optimization communities in recent years. Since SFLA combines the Memetic Algorithm (MA), with its strong Local Search (LS) ability, and Particle Swarm Optimization (PSO), with its good Global Search (GS) capability, it has strong optimum-searching power and is easy to implement. This paper describes the fundamental principles and framework of SFLA, surveys related research on SFLA in current optimization and engineering fields, and presents future perspectives.
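The sort/partition/leap/reshuffle cycle the abstract refers to can be sketched as follows. This is a minimal illustration on the sphere function, not the authors' implementation; the memeplex partitioning, leap rule and parameter values are illustrative assumptions.

```python
import random

random.seed(0)

def sfla(obj, dim, n_frogs=30, n_plex=5, n_iters=60, lo=-5.0, hi=5.0):
    """Minimal SFLA sketch for minimization: rank the population, split it
    into memeplexes, leap each memeplex's worst frog toward its best frog,
    then reshuffle and repeat."""
    frogs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_frogs)]
    for _ in range(n_iters):
        frogs.sort(key=obj)                 # global ranking (shuffle step)
        g_best = frogs[0]
        for m in range(n_plex):
            plex = frogs[m::n_plex]         # interleaved partition into memeplexes
            p_best, p_worst = plex[0], plex[-1]
            # local-search step: worst frog leaps toward the memeplex best
            cand = [w + random.random() * (b - w) for w, b in zip(p_worst, p_best)]
            if obj(cand) >= obj(p_worst):   # no gain: leap toward the global best
                cand = [w + random.random() * (g - w) for w, g in zip(p_worst, g_best)]
            if obj(cand) >= obj(p_worst):   # still no gain: random restart
                cand = [random.uniform(lo, hi) for _ in range(dim)]
            frogs[frogs.index(p_worst)] = cand
    return min(frogs, key=obj)

best = sfla(lambda x: sum(v * v for v in x), dim=2)   # sphere benchmark
```

The leap toward the local best is the MA-style local search, while the fallback leap toward the global best and the periodic reshuffle supply the PSO-style global exchange of information.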
Knowledge Compilation Survey
GU Wen-xiang,ZHAO Xiao-wei,YIN Ming-hao
Computer Science. 2010, 37 (7): 20-26. 
Knowledge compilation (KC) has been emerging as a technology for dealing with propositional logic databases. It transforms the given knowledge into another form on which reasoning is more tractable. As an effective reasoning method, KC has been applied in various areas of artificial intelligence. We introduce the research on and applications of KC, and offer a perspective on target compilation languages according to their succinctness and the classes of queries and transformations each language supports in polytime.
Development of Medical Image Registration Technology
LI Xiong-fei,ZHANG Cun-li,LI Hong-peng,ZANG Xue-bai
Computer Science. 2010, 37 (7): 27-33. 
Medical image registration plays a positive role in clinical diagnosis and treatment, illness monitoring, surgery and so on. Taking a medical image registration framework as its mainline, this paper presents an overview of classic algorithms and new technologies involved in the various modules of the framework, together with related performance analysis. It also covers development platforms, test databases and evaluation criteria, giving a comprehensive summary of the latest developments in medical image registration technology.
Survey of Biogeography-based Optimization
WANG Cun-rui,WANG Nan-nan,DUAN Xiao-dong,ZHANG Qing-ling
Computer Science. 2010, 37 (7): 34-38. 
Biogeography is the study of the geographical distribution of biological organisms. Prof. Dan Simon borrowed this mechanism to solve engineering problems and proposed a new optimization algorithm named Biogeography-Based Optimization (BBO). The BBO algorithm has attracted wide attention for its unique search mechanism and good performance. We review BBO's natural mechanism, the mathematical model of BBO migration, the progress of BBO, and its migration and mutation operations. We list the results of BBO tests on a set of 14 standard benchmarks and compare it with GA, ACO, PSO, etc. to demonstrate its good performance. This paper also discusses the differences between BBO and traditional optimization algorithms and the open problems of BBO.
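The migration and mutation operations the survey reviews can be sketched as below. This is a minimal illustration, not Simon's reference implementation; the linear rank-based immigration/emigration rates, elitism and bounds are illustrative assumptions.

```python
import random

random.seed(0)

def bbo(obj, dim, n_hab=20, n_gens=60, p_mut=0.05, lo=-5.0, hi=5.0):
    """Minimal BBO sketch for minimization: rank habitats by suitability,
    derive linear immigration/emigration rates from rank, migrate solution
    features from good habitats into poor ones, then mutate."""
    habs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_hab)]
    for _ in range(n_gens):
        habs.sort(key=obj)                                # rank 0 = best habitat
        mu = [(n_hab - i) / n_hab for i in range(n_hab)]  # emigration rate
        lam = [1.0 - m for m in mu]                       # immigration rate
        nxt = [h[:] for h in habs]
        for i in range(1, n_hab):                         # habitat 0 kept as elite
            for d in range(dim):
                if random.random() < lam[i]:              # immigrate this feature
                    # source habitat chosen with probability proportional to mu
                    src = random.choices(range(n_hab), weights=mu)[0]
                    nxt[i][d] = habs[src][d]
                if random.random() < p_mut:               # mutation
                    nxt[i][d] = random.uniform(lo, hi)
        habs = nxt
    return min(habs, key=obj)

best = bbo(lambda x: sum(v * v for v in x), dim=2)        # sphere benchmark
```

The distinguishing feature relative to GA-style crossover is visible here: features flow between existing habitats according to rank-derived rates rather than being recombined pairwise.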
Differential Power Analysis Attacks on SMS4
LI Lang,LI Ren-fa,LI Jing,WU Ke-shou
Computer Science. 2010, 37 (7): 39-41. 
This paper introduces a Differential Power Analysis (DPA) attack on circuits that encrypt with SMS4; DPA is one particularly powerful type of Side Channel Attack (SCA). Its theory rests on the physical characteristics, power consumption models and data-dependent power consumption of the CMOS logic gates that form integrated circuits (ICs). The paper describes the design and realization of DPA attacks on SMS4, and the correct secret key of the encryption algorithm was cracked successfully in experiments, closing in step by step on the ultimate target of the attack. The results indicate that SMS4-based encryption systems without extra protective measures cannot resist DPA attacks, because of the leakage of physical signals and the difference in power consumption while an IC processes different data. The results provide researchers with a useful reference for security design.
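The data-dependent power consumption the attack exploits can be demonstrated with a classic difference-of-means DPA on simulated traces. This is a toy sketch, not the paper's attack on real hardware: a random 8-bit S-box stands in for the SMS4 S-box, and power is modelled as the Hamming weight of the S-box output plus Gaussian noise.

```python
import random

random.seed(1)

SBOX = list(range(256))
random.shuffle(SBOX)        # toy 8-bit S-box standing in for the SMS4 S-box

def hamming(x):
    return bin(x).count("1")

def power_trace(plain, key, noise=0.5):
    """Simulated power sample: CMOS consumption modelled as the Hamming
    weight of the S-box output, plus Gaussian measurement noise."""
    return hamming(SBOX[plain ^ key]) + random.gauss(0.0, noise)

def dpa(plains, traces):
    """Difference-of-means DPA: for every key guess, partition the traces
    by one predicted S-box output bit; only the correct guess partitions
    them consistently with the real consumption, giving the largest bias."""
    best, best_peak = None, -1.0
    for guess in range(256):
        ones = [t for p, t in zip(plains, traces) if SBOX[p ^ guess] & 1]
        zeros = [t for p, t in zip(plains, traces) if not SBOX[p ^ guess] & 1]
        if not ones or not zeros:
            continue
        peak = abs(sum(ones) / len(ones) - sum(zeros) / len(zeros))
        if peak > best_peak:
            best, best_peak = guess, peak
    return best

secret = 0x3C
plains = [random.randrange(256) for _ in range(2000)]
traces = [power_trace(p, secret) for p in plains]
recovered = dpa(plains, traces)
```

With 2000 traces the correct guess shows a mean difference near one Hamming-weight unit, while wrong guesses average out, which is exactly the leakage mechanism the abstract describes.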
New MAC Protocol Based on Frequency Hop-reservation for RFID Reader Networks
WANG Yong-hua,ZHAN Yi-ju,YANG Jian,CAI Qing-ling
Computer Science. 2010, 37 (7): 42-45. 
The RFID reader collision problem emerges in large-scale applications and decreases the read rate of the whole system. A MAC protocol based on slow frequency-hopping spread spectrum (FHSS) is proposed to resolve the reader collision problem. The MAC uses a reader synchronization mechanism: a reader communicates with tags on the corresponding frequency once it has competed with other readers and reserved a time slot successfully, and it informs neighboring readers to avoid communicating with the tags simultaneously, thus preventing tag interference. The MAC places reader transmissions and tag transmissions on separate frequency channels and uses multi-frequency hopping, so tags collide only with tags and readers collide only with readers, and this separation resolves reader frequency interference. Analysis of the algorithm shows that the larger the reader load and the longer the average reader communication time, the higher the system throughput.
Local Community Detecting Method Based on the Clustering Coefficient
LI Kong-wen,GU Qing,ZHANG Yao,CHEN Dao-xu
Computer Science. 2010, 37 (7): 46-49. 
Community detection has been a research topic in the complex network area. The global information about the whole network, which traditional community detection algorithms require, is hard to obtain as the network grows. Moreover, in many cases we only care about the local community of one particular node. To make local community detection faster and more accurate, this paper proposes a local community detection method based on the clustering coefficients of nodes. The method, which leverages connectivity density and the characteristics of the clustering coefficient, starts from the target node and detects the community it belongs to by searching neighbor nodes. It requires only the local network information related to the target node and is faster than traditional community detection algorithms; it is also applicable to detecting global community structure. The method was applied to the Zachary network and JSCG, and the experimental results were analyzed against the actual characteristics of the networks.
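The clustering coefficient the method builds on needs only a node's immediate neighborhood, which is why the search can stay local. A small sketch (the toy graph and adjacency-set representation are illustrative):

```python
from itertools import combinations

def clustering_coefficient(adj, v):
    """Local clustering coefficient of node v: the fraction of pairs of
    v's neighbours that are themselves connected by an edge."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0                        # undefined for degree < 2; use 0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))    # observed links / possible links

# toy graph: triangle {0, 1, 2} with a pendant node 3 attached to node 0
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
```

Node 1's neighborhood is fully connected (coefficient 1), while node 0's contains only one of three possible links (coefficient 1/3), so a local search seeded in the triangle would treat it as a dense community core and leave the pendant node outside.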
Research of TCP-friendly Congestion Control Protocol in Wireless Network
XIAO Fu,WANG Ru-chuan,SUN Li-juan,WANG Hua-shun
Computer Science. 2010, 37 (7): 50-53. 
The widespread application of real-time multimedia services presents a new challenge to traditional transport protocols: lacking a congestion control mechanism, UDP seriously seizes the bandwidth shared with TCP applications, which reduces network fairness and can even cause network congestion. Considering the high error rate characteristic of wireless networks, a novel TCP-friendly congestion control algorithm, TFRC-JI, is presented in this article, which introduces delay jitter to distinguish link congestion from bit errors, so that different rate control actions are fed back to the sender. Simulation results indicate that, compared with the traditional TFRC mechanism, TFRC-JI enhances the throughput of the wireless link while maintaining TCP friendliness, which suits it well for real-time service transmission.
Research on Network Worm Test-bed
KUANG Xiao-hui,HUANG Min-huan,XU Fei
Computer Science. 2010, 37 (7): 54-56. 
Worm validation experiments are very important for worm research. Based on an analysis of experiment environment construction methods, including mathematical modeling, simulation, emulation, test beds and hybrid modeling, a new worm model named the virtual-and-real hybrid network worm emulation model is proposed. By combining simulation with emulation, the model balances fidelity and scalability very well.
Fuzzy Congestion Control Algorithm Based on Accumulative Traffic Delay
WANG Xue-shun,YU Shao-hua,DAI Jin-you,LUO Ting
Computer Science. 2010, 37 (7): 57-61. 
In order to guarantee the quality of streaming media traffic delivered in the network, a congestion control strategy is adopted for streaming media traffic. Exploiting the delay sensitivity of streaming media data, this paper proposes a novel congestion control scheme for multimedia transmission, Accumulative Delay Fuzzy Congestion Control (ADFCC). ADFCC improves the QoS of multimedia streams by detecting and discarding packets whose accumulated delay exceeds the stream's delay tolerance, so as to maintain high bandwidth utilization. Packets are first classified into queues according to priorities calculated from their accumulated delay. The buffer state is then divided into three phases, normal, congestion avoidance and congestion, according to the buffer usage ratio; the three phases cross over each other because of their fuzziness. By combining global congestion control with local congestion control, the fuzzy congestion control algorithm is carried out. Simulation results show that the proposed scheme reduces the average received packet delay more effectively than traditional congestion control algorithms; moreover, it is suitable for ever-changing network environments and can improve the utilization of network resources.
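The discard-and-classify step described above can be sketched as follows. This is a hypothetical illustration of the idea, not the ADFCC algorithm itself: the packet representation, the three-queue mapping and the linear urgency rule are all assumptions.

```python
def adfcc_classify(packets, delay_budget, n_queues=3):
    """Sketch of an ADFCC-style admission step: discard packets whose
    accumulated delay already exceeds the stream's tolerance, and map the
    survivors to priority queues (queue 0 = most urgent).
    packets: list of (packet_id, accumulated_delay)."""
    queues = [[] for _ in range(n_queues)]
    for pid, acc_delay in packets:
        if acc_delay > delay_budget:
            continue                      # too late to be played out: drop
        # urgency grows with the fraction of the budget already consumed
        q = min(n_queues - 1, int(n_queues * (1 - acc_delay / delay_budget)))
        queues[q].append(pid)
    return queues

queues = adfcc_classify([("a", 90.0), ("b", 10.0), ("c", 150.0)],
                        delay_budget=100.0)
```

Dropping packet "c" early frees bandwidth for packets that can still meet their playout deadline, which is the mechanism by which the scheme keeps utilization high while lowering useful-packet delay.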
On the Connectivity Analysis over Dynamic Spectrum Access Networks
HOU Meng-shu,LI Yu-jun,LU Xian-liang,QU Hong,REN Li-yong
Computer Science. 2010, 37 (7): 62-65. 
Unlike traditional wireless networks, in dynamic spectrum access networks primary users have the privilege to use the spectrum, and hence the connectivity of the secondary user network is affected by the distribution and activity of the primary users. Based on continuum percolation theory, this paper proves that secondary users can form a percolated network when there are few primary users or the load of the primary users is light; conversely, the secondary user network cannot percolate when there are many primary users and the load is heavy. The necessary condition for the existence of a percolated secondary user network is derived for the case where primary and secondary users share one channel. All of the above conclusions are examined by extensive simulations.
Opportunity Network Routing Protocol Based on Radio Frequency
JIANG Hai-tao,LI Qian-mu,ZHANG Hong
Computer Science. 2010, 37 (7): 66-69. 
The opportunistic network is a new kind of network that can interconnect highly heterogeneous networks under extreme conditions. Its main feature is that there is no direct end-to-end path, and data transfer is achieved by a store-and-forward procedure. This paper proposes a routing protocol based on radio frequency technology, applying it to data transfer in opportunistic networks. The method exploits features of radio frequency technology in the data transfer process, such as contactless operation, freedom from human intervention and tolerance of harsh environments. Performance analysis and simulation results demonstrate the feasibility and rationality of the routing protocol.
Security Evaluation Model of WSN Based on Routing Attack Effect
ZHAN Yong-zhao,RAO Jing-yi,WANG Liang-min
Computer Science. 2010, 37 (7): 70-73. 
A security evaluation model based on routing attack effect is introduced to improve the security evaluation capability of WSNs. Network security entropy is used to describe the change in security after a routing attack; appropriate indices are chosen and simplified to analyze the calculation of network security entropy. Applying the Monte Carlo method, the degree of security is obtained and the security index is normalized; the security of the system is then evaluated and the security situation forecast by calculating the attack effect. In this way the security evaluation capability of WSNs can be improved, and the model provides a basis for better methods of resisting enemy attacks. Simulation and case-study analysis show that the model is appropriate for security evaluation.
Optimization Based on Traffic Balance over Multipath Network
CAI Ling,WANG Jin-kuan,WANG Cui-rong
Computer Science. 2010, 37 (7): 74-78. 
During the migration to next-generation networks, multipath networking may be used to improve reliability and robustness. The chosen multiple paths are the paths over which the traffic is transferred, and how to balance traffic across them is one of the most important problems in improving the performance of a multipath network. For loss-rate-sensitive traffic such as VoIP (Voice over Internet Protocol), this paper provides an algorithm that balances traffic in a multipath network using prediction and optimization theory. The algorithm first predicts the packet loss rate of each path; then, aiming at minimizing the total packet loss rate and achieving optimal utilization of resources, it converts the traffic balancing problem into an optimization problem. Experimental results demonstrate that the proposed algorithm performs well on QoS metrics such as packet loss rate.
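The conversion into an optimization problem can be illustrated in a simplified setting. Assuming (as a stand-in for the paper's model) that each path has a fixed predicted loss rate and a capacity, minimizing total expected loss is a linear program whose optimum is reached by filling the lowest-loss paths first; the input values are made up for illustration.

```python
def balance(paths, demand):
    """Split demand across paths to minimise expected packet loss, under
    the simplifying assumption of a fixed per-path loss rate and capacity.
    paths: list of (predicted_loss_rate, capacity). Returns allocations."""
    alloc = [0.0] * len(paths)
    # greedy by predicted loss rate is optimal for this linear program
    for i in sorted(range(len(paths)), key=lambda i: paths[i][0]):
        take = min(paths[i][1], demand)   # fill this path up to capacity
        alloc[i] = take
        demand -= take
        if demand <= 0:
            break
    return alloc

# three candidate paths: (loss rate, capacity); total demand 12 units
alloc = balance([(0.05, 10.0), (0.01, 5.0), (0.02, 8.0)], demand=12.0)
```

Here the 1% and 2% paths absorb all 12 units and the 5% path carries nothing, giving an expected loss of 5*0.01 + 7*0.02 = 0.19 units, the minimum achievable under the stated capacities.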
Improvement of iHEED Protocol in Wireless Sensor Network Based on Clustering by Node Level
YIN An,WANG Ring-wen,HU Xiao-ya
Computer Science. 2010, 37 (7): 79-82. 
Clustering is an important method for designing energy-efficient routing protocols in wireless sensor networks. This paper analyzes the principles of the iHEED protocol and points out, through simulation and theoretical analysis, the phenomenon of "parent node loss after clustering". To improve iHEED, "clustering by node level" is introduced and implemented on TinyOS as the iHEED-CHLevel protocol. Simulations using TOSSIM show that iHEED-CHLevel clusters effectively by node level and guarantees the setup of a routing tree among the cluster heads for aggregated data transfer.
Bandwidth Differentiated Service in Web Server Based on Self-tuning Fuzzy Control
GAO Ang,MU De-jun,HU Yan-su,PAN Wen-ping
Computer Science. 2010, 37 (7): 83-86. 
Differentiated service strategies based on process allocation only concern connection delay, but when the bandwidth of a Web server is over-subscribed such methods are ineffective, because processing delay becomes the dominant factor in client-perceived delay. This paper proposes a QoS strategy using two-level fuzzy control: by adjusting the bandwidth of the virtual hosts that serve requests of different priorities, a self-tuning fuzzy controller can affect the processing delay and achieve proportional delay guarantees. Stability analysis and experimental results demonstrate that the proposed approach achieves better performance in terms of the variance of the proportional delay from the target; the self-tuning fuzzy controller outperforms a static fuzzy controller by 40% on average.
(School of Information Science & Engineering, Wuhan University of Science & Technology, Wuhan 430081, China); (School of Electronics Information, Wuhan University, Wuhan 430079, China)
LI Wen-xiang,XIONG Qing-guo,YANG Lin-tao
Computer Science. 2010, 37 (7): 87-90. 
Replication is an effective technique for improving the availability of data and enhancing performance in terms of query latency and load balance, but it also brings significant costs in storage space and traffic. We study how to decrease the redundant traffic cost of replication in structured P2P overlays through topology optimization. We develop a new hierarchical proximity-aware P2P overlay with dominating-set nodes acting as super peers, and design a corresponding replication technique with multiple hash functions for low-cost queries. Our method can efficiently disseminate replicas across the network, increase the query hit ratio, and decrease redundant query messages and the storage space required. We give a theoretical analysis of the performance metrics and verify the superiority of our method by simulation.
Self-certified Proxy Signcryption Scheme Based on Elliptic Curve Cryptography
YU Hui-fang,WANG Cai-fen,WANG Zhi-cang
Computer Science. 2010, 37 (7): 91-92. 
To overcome the certificate management problem and the key escrow problem in proxy signcryption schemes, a new self-certified proxy signcryption scheme based on elliptic curve cryptography (ECC) is proposed; its security relies on the hardness of the elliptic curve discrete logarithm problem (ECDLP). Compared with existing schemes in the literature, the proposed scheme possesses good security together with shorter keys, smaller storage and bandwidth requirements, and lower computational complexity and communication cost.
Pipelined Cooperative Spectrum Sensing Method in Cognitive Radio Networks
GAO Feng,YUAN Wei,LIU Wei,CHENG Wen-qing,WANG Shu
Computer Science. 2010, 37 (7): 93-96. 
It has been shown that cooperation can effectively improve the performance of spectrum sensing; however, cooperation also introduces additional overhead. To deal with this problem, we propose a pipelined spectrum sensing framework in which secondary users conduct spectrum sensing and result reporting in a pipelined way, so that the time consumed by reporting can also be utilized for sensing, resulting in improved time efficiency and a much wider observation window for spectrum sensing. Besides, we also present a multi-threaded sequential probability ratio test (MT-SPRT), which is very suitable as the data fusion technique for the pipelined framework. Simulation results indicate that our pipelined sensing scheme incorporating MT-SPRT can significantly improve the performance of cooperative spectrum sensing.
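The sequential probability ratio test underlying the fusion method can be shown on binary sensing reports. This is Wald's classic single-threaded SPRT, not the paper's multi-threaded variant; the Bernoulli report model and the error-rate parameters are illustrative assumptions.

```python
import math

def sprt(samples, p0=0.2, p1=0.8, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test on binary sensing reports:
    decide 'busy' (P(report=1)=p1) against 'idle' (P(report=1)=p0) as soon
    as the accumulated log-likelihood ratio crosses a threshold derived
    from the target false-alarm rate alpha and miss rate beta."""
    upper = math.log((1 - beta) / alpha)     # accept 'busy' above this
    lower = math.log(beta / (1 - alpha))     # accept 'idle' below this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "busy", n                 # decided after n samples
        if llr <= lower:
            return "idle", n
    return "undecided", len(samples)
```

The appeal for cooperative sensing is that the test stops as soon as the evidence suffices (here, three consistent reports with the default parameters), so consistent channels are decided quickly and only ambiguous ones consume the full observation window.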
Novel High Spectrum Utilization-oriented Cognitive MAC Protocol
SONG Hua,LIN Xiao-la
Computer Science. 2010, 37 (7): 97-101. 
Cognitive radio (CR) is an emerging technology in wireless access, aiming at vastly improving the way radio spectrum is utilized. Its basic idea is that a secondary user (unlicensed user) can be permitted to use licensed spectrum, provided that it does not interfere with any primary user (licensed user). This paper proposes a novel cognitive MAC protocol under the property-rights model, in which secondary users are divided into several non-overlapping groups and each group uses the proposed auction algorithm to bid for leasing the required channels. Simulations indicate that the proposed MAC protocol maximizes the utilization of spectrum resources and guarantees the fairness and dynamics of channel allocation among groups.
New Design Method for Cipher Algorithms for Block Cipher
YANG Hong-zhi,HAN Wen-bao,SI Xue-ming
Computer Science. 2010, 37 (7): 102-104. 
The idea of reconfiguration is introduced into the design of cryptographic algorithms, and the concept of a cipher cluster is creatively proposed. Variation of the cipher architecture is controlled by keys, which not only enhances the flexibility of the algorithms but also suits the security needs of different users at various levels. The security and performance of the cipher cluster are analyzed, and a paradigm is given using the AES cipher.
Improvement of TCP Congestion Control Algorithm for Mobile Ad Hoc Networks through Employment End-to-End Identification
JIANG Dao-xia,PAN Shou-wei,ZHOU Yao,LIU Feng-yu
Computer Science. 2010, 37 (7): 105-109. 
This paper proposes IADTCP (improved Ad Hoc network TCP congestion control), which improves the TCP congestion control algorithm for mobile Ad Hoc networks by employing end-to-end identification. IADTCP improves the traditional slow-start strategy for Ad Hoc networks to resolve the problem of the congestion window growing unsmoothly; it uses two metrics, IDD (inter delay difference) and STT (short-term throughput), for multi-metric joint identification of congestion, and uses PLR (packet loss ratio) and POR (packet out-of-order delivery ratio) to identify the network states CHANNEL ERR and ROUTE CHANGE. The sender then takes advisable measures in light of the network state information carried by the returned ACK packet. Simulation results validate the feasibility and efficiency of the proposal.
Analysis of Network Traffic Based on Distributed Statistical Time Series
MENG Fan-xue,LIU Yan-heng,WU Jing,YANG Shu-qi
Computer Science. 2010, 37 (7): 110-114. 
Studying the relationships within distributed storage data is conducive to holistic intrusion detection learning; such relationships can not only be used in intrusion detection learning algorithms but also guide the optimization of data storage. Focusing on the relationships of network traffic in distributed storage, a method based on distributed statistical time series is proposed. According to the relationships among network protocols, the method groups data packets, analyzes their quantitative relationships and derives alarm thresholds. Experimental results show that the method can be used to detect network attacks.
Secure RFID Authentication Protocol for EPCGen2
DENG Miao-lei,HUANG Zhao-he,Mlu Zhi-bo
Computer Science. 2010, 37 (7): 115-117. 
Many of the existing protocols for radio frequency identification (RFID) either do not conform to the EPC Class-1 Gen-2 (EPCGen2) standard or suffer from security flaws. Security requirements for RFID protocols are discussed, recently proposed EPCGen2-compliant cryptographic protocols are analyzed, and design principles for EPCGen2-compliant authentication protocols are given. A new RFID authentication protocol based on the EPCGen2 standard is also proposed. The new protocol provides mutual authentication, anonymity, untraceability, and resistance to impersonation and replay attacks.
Novel Multi-path Routing Protocol with Load Balancing in Multi-channel Ad Hoc Networks
GUO Rui,GUO Wei,LIU Jun
Computer Science. 2010, 37 (7): 118-121. 
To counter the inherent route coupling in mobile Ad Hoc networks, this paper proposes an OLSR-based multi-channel multi-path routing protocol with load balancing, called MMRP_LB. It regards the available channel bandwidth as the reference for node load. In the route establishment stage, MMRP_LB considers both path load and hop count, and obtains multiple optimal node-disjoint paths by repeatedly running an improved Dijkstra algorithm. It then assigns a channel to each path in turn and distributes the traffic in a weighted round-robin fashion. Simulation results show that, compared with a single-channel protocol, MMRP_LB performs well in network throughput and average end-to-end delay, and can avoid route coupling effectively.
New Multicast Routing Algorithm Based on Network Coding
LI Tao-shen,ZENG Ming-fei,GE Zhi-hui
Computer Science. 2010, 37 (7): 122-124. 
Network coding is a technique first presented in 2000. Its main advantage is allowing the multicast transmission rate to reach the theoretical limit. This paper introduces the limitations of traditional multicast routing algorithms, analyzes the benefits and disadvantages of existing network coding algorithms, and, based on an existing improved mathematical model of network coding, implements a static distributed layered network coding (SDLNC) algorithm. Simulation results show that the algorithm can significantly improve multicast data transmission rates.
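Why coding lets multicast reach the theoretical (max-flow) limit is usually shown on the butterfly network. The sketch below illustrates that classic example with XOR coding; it is a textbook illustration of the principle, not the SDLNC algorithm of the paper.

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Butterfly topology: the source multicasts packets a and b to two sinks,
# but the shared middle link can carry only one packet per time slot.
# Plain routing must send a, then b, over that link; a coding node
# forwards a XOR b instead.
a, b = b"hello", b"world"
coded = xor_bytes(a, b)

# Sink 1 receives a on its direct branch plus the coded packet, and
# recovers b; sink 2 receives b directly and recovers a. Both sinks
# obtain both packets in a single slot, matching the max-flow bound
# that routing alone cannot achieve here.
recovered_b = xor_bytes(a, coded)
recovered_a = xor_bytes(b, coded)
```

The same decoding identity (x XOR y XOR y = x) is what any intermediate node exploits: forwarding combinations of packets, rather than the packets themselves, removes the bottleneck-link serialization.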
Attribute-based Two Level Access Control for Web Service Resources
HUO Yuan-guo,MA Dian-fu,LIU Jian,LI Zhu-qing
Computer Science. 2010, 37 (7): 125-129. 
A Web Services Resource (WS-Resource) consists of a static Web service interface and a dynamic stateful resource. According to the different characteristics of the two components, we propose Attribute-Based Two-Level Access Control (2L-ABAC) for WS-Resources. Attribute retrieval is essential for ABAC systems because they base their decisions on attributes of users, so 2L-ABAC employs an access-control-policy publishing mechanism to inform users of the needed attributes. Access control policies of Web services are static while those of resources are dynamic, so two publishing methods, WSDL attachment and metadata exchange, are adopted for the two levels respectively. 2L-ABAC inherits from the ABAC model the capability of authorizing unknown users from other security domains, in addition to the flexibility of its hierarchical design. Moreover, the architecture can be implemented by extending standard specifications such as XACML and SAML, so it has broad applicability for WS-Resource-based systems.
Study of the Role and Task-based Access Control Technology for CSCW System
ZHU Jun,TANG Yong
Computer Science. 2010, 37 (7): 130-133. 
CSCW systems have many new requirements for access control that cannot be met by existing models. In this paper, a new role- and task-based access control model (RTBAC) is presented to meet these requirements. The model formally describes the relationships between the key elements of access control, such as user, role, task, privilege and workflow. It grants and revokes users' access privileges by assigning and canceling roles, and provides partial inheritance and delegation between roles. The model introduces the concept of task, divides group tasks into workflow tasks and independent tasks, and describes the relationship between roles and tasks, from which it can dynamically manage permissions through tasks and task status. RTBAC is designed to accommodate the characteristics of collaborative systems, such as multiple users, dynamism and collaboration, and can adequately meet the access control requirements of CSCW systems.
Object-oriented Web Application Testing Model
LU Xiao-li,DONG Yun-wei,ZHAO Hong-bin
Computer Science. 2010, 37 (7): 134-136. 
In order to guarantee Web quality and reliability, people attach more and more importance to Web application testing. Based on a good analysis and understanding of Web applications, good testing models and methods can be put forward to test Web applications effectively. An object-oriented Web application testing model and testing methods are offered to support navigation testing and state testing.
Study of Component Interface Model Supported Integration and Expansion
WANG Qiong,DU Cheng-lie
Computer Science. 2010, 37 (7): 137-140. 
The component interface is the only point of interaction between a component and its environment, and the design of the interface directly affects the complexity of component integration and expansion. An architecture-oriented component interface model with performance constraints (AOPCCIM) is proposed. The model complies with the encapsulation characteristic of components while allowing customers to learn, from the component interface, information about the component, such as its topology, for integration and expansion. It also adds performance descriptions and a performance guarantee mechanism to control component performance. The application of the model in foundational projects has proved its effectiveness.
Model Refactoring Conflict Resolution Algorithm
CHEN Jun-bing,WANG Zhi-jian,CHEN Bo,QIAN Si
Computer Science. 2010, 37 (7): 141-143. 
Abstract PDF(317KB) ( 173 )   
RelatedCitation | Metrics
Conflict resolution is a key problem in research on model refactoring, while the majority of researchers focus on conflict detection. Conflict resolution is usually performed manually after known conflicts have been analyzed. Three categories of conflicts can be resolved: conflicts from parallel applications of the same rule, symmetric conflicts, and asymmetric conflicts. This paper concentrated on automating conflict resolution so as to realize automatic model refactoring. The method provides an integrated algorithm based on manual analysis of the three categories of conflicts. The basic automatic resolution algorithm divides the conflicts within refactoring according to their cause (the application of either the same rule or different ones) and then resolves them correspondingly. This algorithm can preliminarily realize the automatic resolution of the aforementioned conflicts caused by the parallel application of refactoring rules.
Collaborative Work System Modeling Method towards Usability Evaluation
LIANG Lu,TENG Shao-hua,SUN Wei-jun
Computer Science. 2010, 37 (7): 144-147. 
Abstract PDF(371KB) ( 213 )   
RelatedCitation | Metrics
In the development of usage-centered collaborative work systems, it is urgently required that usability problems be found and corrected at an early stage, which demands that evaluators have a clear understanding of the usage contexts of the system. This work further needs to be supported by an effective collaborative task model. However, neither the existing interface models nor the single-user-oriented usability inspection techniques work well at the basic level of collaboration units and the use context of team work. Therefore, this paper proposed a new collaborative work system model from the viewpoint of usability evaluation, to which a discount usability inspection (walkthrough) approach is applied. This method achieves low-cost usability evaluation in the early stage of the software development cycle, detecting usability problems by iteratively simulating real-world use situations. Finally, an experiment showed that the approach has advantages over several traditional techniques.
Dynamic Information Flow Analysis for Vulnerability Exploits Detection
TANG He-ping HUANG Shu-guang ZHANG Liang
Computer Science. 2010, 37 (7): 148-151. 
Abstract PDF(354KB) ( 306 )   
RelatedCitation | Metrics
Untrusted data originating from network user input and configuration files causes many software security problems when operated on by security-critical functions without strict data validation. Keeping track of the propagation of untrusted data is the main idea of dynamic taint analysis for vulnerability exploit detection. Data derived from network user input and configuration files were labeled as tainted. A taint propagation algorithm was executed by virtue of data-flow analysis, and several taint detection policies were carried out for each security-critical function. A vulnerability detection prototype system was implemented on an open-source emulator, and many optimization mechanisms were deployed.
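The taint-propagation idea described in this abstract can be sketched in a few lines (a minimal illustration of our own; the class and policy names are ours, not the paper's emulator-based implementation): values carry a taint flag, operations propagate it, and a policy check fires when tainted data reaches a security-critical sink.

```python
# Minimal sketch of dynamic taint propagation (illustrative only):
# values carry a taint flag, and any operation on a tainted operand
# yields a tainted result.

class TaintedValue:
    def __init__(self, value, tainted=False):
        self.value = value
        self.tainted = tainted

    def __add__(self, other):
        # Propagation rule: the result is tainted if either operand is.
        o_val = other.value if isinstance(other, TaintedValue) else other
        o_taint = other.tainted if isinstance(other, TaintedValue) else False
        return TaintedValue(self.value + o_val, self.tainted or o_taint)

def check_sink(val, sink_name):
    """Detection policy: alert when tainted data reaches a
    security-critical function (a 'sink')."""
    if isinstance(val, TaintedValue) and val.tainted:
        return f"ALERT: tainted data reached {sink_name}"
    return "ok"

user_input = TaintedValue(0x41414141, tainted=True)   # from the network
offset = TaintedValue(8)                              # trusted constant
addr = user_input + offset                            # taint propagates
print(check_sink(addr, "memcpy length"))
```

A real system applies the same propagation rule at the instruction level inside the emulator, with one policy per sink class.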
SWS Based Business Requirement Driven Service Composition
ZHAO An-ping,WANG Xiao-yong,QIU Yu-hui
Computer Science. 2010, 37 (7): 152-155. 
Abstract PDF(419KB) ( 145 )   
RelatedCitation | Metrics
Semantic Web Service (SWS) is a step towards automating Web Services in order to face the challenges in Business-to-Business. SWS-based business service composition is one of the most hyped and addressed issues in the field of service-oriented computing. This paper provided basic ideas and methods for SWS-based business service composition. First, a formal model of business service composition was introduced. Second, an approach to service-set mining for service composition was presented, and finally, the paper presented the solution model of requirement-driven service selection and composition.
Software Automation Development Technology by PowerDesigner and CodeSmith in .NET Framework
ZHU Xiao-hui,WANG Jie-hua,SHI Zhen-guo,CHEN Su-rong
Computer Science. 2010, 37 (7): 156-159. 
Abstract PDF(426KB) ( 145 )   
RelatedCitation | Metrics
Concerning the inefficiency of software development and the difficulty of quickly adapting to requirement changes, this paper described a new technology for software automation development. It solves the problem by designing a conceptual model of the database, converting it to a physical model with PowerDesigner under some constraints, and creating code automatically with the support of CodeSmith and custom templates. It addresses development efficiency and software quality for database-based MIS. Application in a real project showed that the solution can improve the efficiency of software development and reduce costs.
Double-authorization Chain Sets Based Access Control Model
TU Jin-de,QIN Xiao-lin,DAI Hua
Computer Science. 2010, 37 (7): 160-164. 
Abstract PDF(0KB) ( 154 )   
RelatedCitation | Metrics
With the rapid development of information technology, databases face an increasingly serious security situation. As the center of storage and processing for important data, databases often become the targets of attacks. Research on access control has been an important part of the field of database security, but traditional access control technologies cannot satisfy the requirements of modern database security. On the basis of research on the traditional DAC mechanism, we proposed a double-authorization chain sets based access control model (DACS), which supports 8 kinds of authorization management functions including normal authorization and denial authorization, and has a denial authorization mechanism and a non-cascade revoking mechanism.
Semantic Retrieval Based on Weighted Domain Ontology
ZHANG Liang,QU Zhen-xin,DING Song,TANG Sheng-qun
Computer Science. 2010, 37 (7): 165-168. 
Abstract PDF(334KB) ( 209 )   
RelatedCitation | Metrics
This paper proposed an approach called WOSR for the semantic retrieval of domain resources that have been previously annotated with ontology concepts. Firstly, WOSR develops a domain ontology and associates weights with concepts through a uniform probability distribution. Secondly, the similarity between concepts is computed using their weights. Finally, semantic similarities between a user request and the domain resources are calculated, and the answer is returned as a ranked list ordered by semantic similarity. Experiments show that WOSR yields better performance than other classical methods.
OBDD Based Symbolic Algorithm for Searching Acyclic AND/OR Graphs
WANG Xue-song,ZHAO Ling-zhong,GU Tian-long
Computer Science. 2010, 37 (7): 169-173. 
Abstract PDF(393KB) ( 183 )   
RelatedCitation | Metrics
Searching AND/OR graphs is an important problem-solving technique in the area of artificial intelligence. The representation of AND/OR graphs based on traditional data structures greatly limits the scale of graph that can be handled by existing AND/OR graph searching algorithms. Based on the symbolic representation of acyclic AND/OR graphs using OBDDs (ordered binary decision diagrams), we proposed a novel symbolic algorithm for searching the minimal-cost solution graph of an acyclic AND/OR graph. It is shown that the symbolic algorithm has lower space complexity and can handle larger-scale acyclic AND/OR graphs.
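The underlying search problem can be illustrated with a small explicit (non-symbolic) sketch, assuming unit edge costs; this shows what a minimal-cost solution graph is, not the OBDD encoding the paper contributes:

```python
# Minimal-cost solution of an acyclic AND/OR graph (explicit sketch).
# OR nodes pick the cheapest child branch; AND nodes must solve all
# children. Leaves (nodes absent from the graph dict) cost 0; each
# edge costs 1 here for simplicity.

def min_cost(graph, node, memo=None):
    """graph: node -> ('AND'|'OR', [children])."""
    if memo is None:
        memo = {}
    if node in memo:
        return memo[node]
    if node not in graph:          # terminal (leaf) node
        return 0
    kind, children = graph[node]
    costs = [1 + min_cost(graph, c, memo) for c in children]
    memo[node] = min(costs) if kind == 'OR' else sum(costs)
    return memo[node]

g = {
    'root': ('OR',  ['a', 'b']),
    'a':    ('AND', ['t1', 't2']),        # solves both leaves: cost 2
    'b':    ('AND', ['t1', 't2', 't3']),  # cost 3
}
print(min_cost(g, 'root'))  # root picks branch 'a': 1 + 2 = 3
```

The symbolic algorithm replaces the explicit node and edge sets here with OBDD-encoded characteristic functions, which is what lets it scale to much larger graphs.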
Research and Application of Dynamical Classification Model for Ensemble Learning Based on Approximation Concept Lattice of Roughness
DING Wei-ping,WANG Jian-dong,ZHU Hao,GUAN Zhi-jin,SHI Quan
Computer Science. 2010, 37 (7): 174-178. 
Abstract PDF(525KB) ( 173 )   
RelatedCitation | Metrics
Concept lattices are an effective tool for data classification, but classification efficiency and precision are affected by their large scale. In this paper, rough set theory was applied to the classification research of concept lattices, and a dynamical classification model for ensemble learning based on approximation concept lattices of roughness (named CACLR) was put forward. This model constructs several independently distributed, high-precision approximation concept lattice classifiers according to the instance spatial configuration within the scope of roughness. It can eliminate independent nodes in time while the approximation concept lattice is being constructed, effectively reducing the scale of the concept lattice. The multi-combination model for ensemble learning is robust in rough classification accuracy and knowledge prediction efficiency. Finally, experiments on the UCI benchmark data sets were carried out and performance results were given, which prove the practical value of the CACLR model.
Research and Demonstration of Cloud Retrieval System Based on Server Clusters
AN Jun-xiu
Computer Science. 2010, 37 (7): 179-182. 
Abstract PDF(367KB) ( 160 )   
RelatedCitation | Metrics
Based on the study of cloud computing and mobile search engines, and in light of current technology, we put forward a cluster-based cloud retrieval system model. The model consists of the cloud information layer, the cloud retrieval cluster system, and the user query box. We studied distributed data storage technology in the cloud retrieval system in depth, and proposed a scheme for cloud storage and the retrieval data structure. In order to improve the efficiency of parallel execution in cloud retrieval, a program flow at the core of the cloud retrieval software implementation of the model was put forward. Testing results show that the system functions correctly, and its performance is good and steady. The demonstration of this model offers a development scheme for massive-information retrieval technology.
Estimation of Distribution Algorithm to Optimize the Assignment of Restoration Capacity for ASON
XU Chang,CHANG Hui-you,XU Jun,LUO Jia
Computer Science. 2010, 37 (7): 183-185. 
Abstract PDF(202KB) ( 151 )   
RelatedCitation | Metrics
In order to solve the problem of ASON restoration capacity assignment, the corresponding mathematical model was established, and a new optimization algorithm based on the estimation of distribution algorithm was presented. Compared with other restoration capacity methods, this algorithm reduces the calculation work significantly, which facilitates its application in engineering projects. Simulation results show that a near global optimal solution can be easily obtained and the solution is satisfactory in engineering terms.
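The estimation-of-distribution idea can be sketched as follows (a PBIL-style toy of our own design on a binary objective; the paper's actual model for restoration capacity is more elaborate): sample candidates from a probability model, then shift the model toward the best samples.

```python
import random

# Estimation of distribution algorithm (PBIL-style sketch) for a binary
# assignment problem. The objective here is just "maximize the number of
# ones"; a restoration-capacity objective would replace `fitness`.

def eda_maximize(fitness, n_bits, pop=40, iters=60, lr=0.2, seed=1):
    rnd = random.Random(seed)
    p = [0.5] * n_bits                       # probability model per bit
    best, best_fit = None, float('-inf')
    for _ in range(iters):
        samples = [[1 if rnd.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        samples.sort(key=fitness, reverse=True)
        if fitness(samples[0]) > best_fit:
            best, best_fit = samples[0], fitness(samples[0])
        elite = samples[:pop // 5]           # re-estimate from best samples
        for i in range(n_bits):
            freq = sum(s[i] for s in elite) / len(elite)
            p[i] = (1 - lr) * p[i] + lr * freq
    return best, best_fit

best, fit = eda_maximize(sum, n_bits=16)
print(fit)
```

Unlike a GA, there is no crossover or mutation; all search information lives in the explicit probability model `p`.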
Extension of Rough Set Model Based on SPA Similarity Degree in Incomplete Information Systems
CHEN Sheng-bing,LI Long-shu,JI Xia,BIAN Shi-hui
Computer Science. 2010, 37 (7): 186-190. 
Abstract PDF(408KB) ( 152 )   
RelatedCitation | Metrics
In view of the limitations of existing extended rough set models for processing incomplete information, the difference between null-value equality and known-value equality was analysed probabilistically. Based on the theory of Set Pair Analysis, the SPA Similarity Degree and Similarity Tolerance Relation were proposed, and a method of extending the rough set model based on the SPA Similarity Degree was also described. It discriminates null-value equality from known-value equality by using the coefficient of difference degree, gets the neighborhood of an object according to the coefficient of difference degree and the similarity tolerance relation, and then gets the upper and lower approximations according to the neighborhood. The difference of null-value equality is ignored when computing the upper approximation, and emphasized for the lower approximation. The experimental results reveal that both the classification capability and the precision of the rough set are better than those of other models.
Improved Bayesian Method for Model-based Diagnosis
JIA Xue-ting,OUYANG Dan-tong,ZHANG Li-ming
Computer Science. 2010, 37 (7): 191-194. 
Abstract PDF(414KB) ( 217 )   
RelatedCitation | Metrics
Model-based diagnosis concerns using a model of the structure and behavior of a system or device in order to establish why the system or device is faulty. In practice, however, determining a diagnosis always involves uncertainty, which is not entirely satisfactory. This paper built upon and extended previous work in model-based diagnosis by supplementing the model-based framework with probabilistically sound ways of dealing with uncertainty. This was done in a mathematically sound way using Bayesian theory, computing the posterior probability that a certain component is not working correctly given some diagnosis. We also proposed a general method to increase efficiency. The complexity and completeness of the method were analyzed; both the time complexity and the space complexity are reduced in the improved method. The experimental results illustrate that the improved method generally has better execution efficiency than the traditional method. In fact, the execution efficiency may be improved by up to two orders of magnitude in some cases.
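The Bayesian step can be made concrete with a one-function worked example (our own illustration of Bayes' rule, not the paper's full algorithm): compute the posterior probability that a component is faulty given an observed symptom.

```python
# Posterior probability that a component is faulty given a symptom,
# via Bayes' rule:
#   P(faulty | s) = P(s|faulty)P(faulty) / (P(s|faulty)P(faulty)
#                                           + P(s|ok)P(ok))

def posterior_faulty(prior, p_symptom_if_faulty, p_symptom_if_ok):
    num = p_symptom_if_faulty * prior
    den = num + p_symptom_if_ok * (1.0 - prior)
    return num / den

# Component fails rarely (prior 2%), but the symptom is 30x more likely
# when it is faulty, so the posterior jumps well above the prior.
p = posterior_faulty(prior=0.02, p_symptom_if_faulty=0.9, p_symptom_if_ok=0.03)
print(round(p, 3))  # → 0.38
```

In a full model-based diagnoser, the likelihoods come from the behavioral model of each component, and the same update is applied over candidate diagnoses rather than a single component.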
Analysis for TSP Problem Based on Artificial Metabolic Algorithm
HU Yang,GUI Wei-hua,CAI Zi-xing
Computer Science. 2010, 37 (7): 195-199. 
Abstract PDF(485KB) ( 171 )   
RelatedCitation | Metrics
An artificial metabolic algorithm (AMA) model was built based on physiological metabolic functions in organisms. A multi-step catalytic reaction kinetics model was obtained by analyzing the concentration difference between substrate and product. A search optimization model for the traveling salesman problem (TSP) based on concentration difference was established through an analogy between the city network and the metabolic network. A real example shows that optimization of the traveling salesman problem (TSP) can be implemented efficiently through the artificial metabolic algorithm.
Modeling and Function Approximation Approach Based on Evolutionary Functional Networks
LUO Qi-fang,ZHOU Yong-quan,XIE Zhu-cheng
Computer Science. 2010, 37 (7): 200-204. 
Abstract PDF(387KB) ( 155 )   
RelatedCitation | Metrics
A new genetic programming method for designing neuron functions, combining genetic programming and evolutionary algorithms, was proposed for hybrid identification of functional network structure and functional parameters, performing a global optimal search in the complex solution space where structures and parameters coexist and interact. The method, using hybrid base functions, differs from traditional methods of approximating the objective function. The computational results show the high precision of the new method.
Novel Semi-supervised Clustering for High Dimensional Data
CUI Peng,ZHANG Ru-bo
Computer Science. 2010, 37 (7): 205-207. 
Abstract PDF(233KB) ( 199 )   
RelatedCitation | Metrics
Semi-supervised clustering is a popular clustering method of recent years, which usually incorporates limited background knowledge to improve clustering performance. However, most existing methods based on neighbors or density cannot process high-dimensional data, so it is critical to merge feature reduction into the semi-supervised clustering process. To solve this problem, we proposed a framework for semi-supervised clustering. The framework first preprocesses instances using the transitivity of constraints, then reduces dimensionality by projecting features into a low-dimensional space, and finally clusters instances with the reduced features. To evaluate the effectiveness of the method, we conducted experiments on several datasets; the results show the method has good clustering performance when handling high-dimensional data.
Quick Algorithm for Computing Core of the Positive Region Based on Order Relation
XU Zhang-yan,SHU Wen-hao,QIAN Wen-bin,YANG Bing-ru
Computer Science. 2010, 37 (7): 208-211. 
Abstract PDF(305KB) ( 144 )   
RelatedCitation | Metrics
At present, the main method of designing algorithms for computing the core based on the positive region is the discernibility matrix. In this method, the core is found by discovering all discernibility elements of the discernibility matrix, so it is very time-consuming. On the foundation of the simplified decision table and simplified discernibility matrix, if the condition-attribute value tuples of the simplified decision table are viewed as numbers, the objects are ordered. Using this order, the discernibility elements containing the core can be confined to a small search space, so the core based on the positive region can be found by searching only a small number of discernibility elements of the simplified discernibility matrix. On this basis, an efficient algorithm for computing the core was designed by integrating the idea of radix sorting. The time complexity and space complexity are O(|C||U|)+O(|C|^2|U/C|) and O(|C||U|) respectively. Since the core can be found by searching a small number of discernibility elements of the simplified discernibility matrix, the efficiency of the new algorithm is improved.
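The radix-sorting idea can be sketched as follows (a simplified illustration of our own, showing only the sorting step, not the core computation itself): group the objects of a decision table into equivalence classes U/C by stable sorting on one condition attribute at a time, which takes O(|C||U|)-style passes instead of pairwise comparisons.

```python
# Partition the universe U into equivalence classes U/C by radix sorting
# on condition-attribute values (stable bucket sort per attribute,
# least-significant attribute first).

def partition_u_c(table):
    """table: list of condition-attribute value tuples, one per object.
    Returns the equivalence classes of U/C as lists of object indices."""
    n_attrs = len(table[0])
    order = list(range(len(table)))
    for a in reversed(range(n_attrs)):
        buckets = {}
        for i in order:                       # stable: keep current order
            buckets.setdefault(table[i][a], []).append(i)
        order = [i for key in sorted(buckets) for i in buckets[key]]
    # Equal neighbours in the sorted order form one equivalence class.
    classes, current = [], [order[0]]
    for prev, nxt in zip(order, order[1:]):
        if table[nxt] == table[prev]:
            current.append(nxt)
        else:
            classes.append(current)
            current = [nxt]
    classes.append(current)
    return classes

U = [(1, 0), (0, 1), (1, 0), (0, 0)]
print(partition_u_c(U))   # objects 0 and 2 fall into the same class
```

With U/C in hand, the algorithm in the paper only needs to inspect discernibility elements between adjacent classes rather than the full matrix.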
SMO Algorithm for Resolving Huber-SVR with Non-positive Kernels
FANG Yi-min,ZHANG Ling,SUN Wei-min,XU Bao-guo
Computer Science. 2010, 37 (7): 212-216. 
Abstract PDF(351KB) ( 303 )   
RelatedCitation | Metrics
A new SMO algorithm for SVR was proposed which can solve SVR with non-positive kernels. In our SMO algorithm, the problem of solving the SVR model is decomposed into a series of sub-problems of seeking the minimum of a parabola within a limited range. Such a minimum can be found because, with respect to non-positive kernels, only the symmetry-axis direction of some parabolas is changed. Hence, we derived the relevant iterative formula of the SMO method for Huber-SVR and designed the corresponding algorithm, with the necessary proofs also given. Based on our SMO algorithm, we carried out both regression and prediction experiments using Huber-SVR with non-positive kernels, and compared the results with those of Huber-SVR with positive kernels. The experimental results showed that some non-positive kernels may have better regression and prediction performance than positive kernels, which confirms the validity and necessity of our algorithm. The method can also be extended to other SVR variants.
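The per-step sub-problem can be sketched directly (our own toy version of the bounded one-dimensional step; the actual SMO update formulas come from the Huber-SVR dual): minimize a parabola f(t) = a·t² + b·t + c over a box constraint [lo, hi]. With a non-positive kernel the quadratic coefficient can be negative (the axis direction flips), and the minimum then sits at an endpoint.

```python
# Minimum of a parabola on a closed interval. Convex case (a > 0):
# candidates are the endpoints plus the vertex if it lies inside the
# box. Concave or degenerate case (a <= 0): endpoints only.

def parabola_min(a, b, c, lo, hi):
    f = lambda t: a * t * t + b * t + c
    candidates = [lo, hi]
    if a > 0:                       # convex: interior stationary point
        t_star = -b / (2 * a)
        if lo <= t_star <= hi:
            candidates.append(t_star)
    return min(candidates, key=f)

print(parabola_min(1.0, -2.0, 0.0, 0.0, 3.0))   # convex: vertex at t = 1.0
print(parabola_min(-1.0, 0.0, 0.0, -1.0, 2.0))  # concave: endpoint t = 2.0
```

This is why non-positive kernels remain tractable in the proposed SMO: each working-set step is still a closed-form one-dimensional minimization.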
Ensemble Learning Based Intrusion Detection Method
XU Chong,WANG Ru-chuan,REN Xun-yi
Computer Science. 2010, 37 (7): 217-219,224. 
Abstract PDF(349KB) ( 240 )   
RelatedCitation | Metrics
In order to solve the problems of low detection rates for novel attacks and the difficulty of detecting unknown intrusions in traditional intrusion detection systems, this paper proposed a model based on ensemble learning over improved BP neural networks and support vector machines. Experiments show that with the ensemble learning method, the detection rate is higher than that of any individual network or SVM, so the model achieves a better detection rate not only for known intrusions but also for unknown ones.
Regularized Regression Local Forecasting Method of Multivariable Chaotic Time Series in Short-term Electrical Load
REN Hai-jun,ZHANG Xiao-xing,SUN Cai-xin,WEN Jun-hao
Computer Science. 2010, 37 (7): 220-224. 
Abstract PDF(403KB) ( 169 )   
RelatedCitation | Metrics
A regularized-regression local forecasting method for multivariate chaotic time series was proposed in order to improve the forecasting accuracy of short-term electrical load. Multivariate time series were constructed by choosing the effective temperature factors with the greatest impact on the load. Firstly, the time delay and embedding dimension were determined with the methods of mutual information and minimum prediction error. Secondly, according to the reconstruction parameters, the phase space of the short-term load multivariate time series was reconstructed. Thirdly, since the few neighboring points in the local prediction method cannot satisfy the least-squares estimation condition, a multivariate chaotic time series local forecasting model based on regularized regression was presented. Moreover, the model was applied to practical power load forecasting (an electric power grid in Chongqing), and the forecasting accuracy was enhanced.
Research on Sparse Bayesian Model and the Relevance Vector Machine
YANG Guo-peng,ZHOU Xin,YU Xu-chu
Computer Science. 2010, 37 (7): 225-228. 
Abstract PDF(320KB) ( 908 )   
RelatedCitation | Metrics
The support vector machine has been successfully applied in many fields of pattern recognition, but it has several limitations. The relevance vector machine is a Bayesian treatment: its mathematical model has no regularization coefficient, and its kernel functions need not satisfy Mercer's condition. The relevance vector machine offers good generalization performance, and its predictions are probabilistic. We introduced the sparse Bayesian models for regression and classification, regarded relevance vector machine learning as the maximization of the marginal likelihood through model parameter inference, then described three kinds of training methods and presented the flow of the fast sequential sparse Bayesian learning algorithm.
Automatic Acquisition of Translation Templates Based on Error-driven
ZHANG Chun-xiang,LIANG Ying-hong,YU Lin-sen
Computer Science. 2010, 37 (7): 229-232. 
Abstract PDF(344KB) ( 219 )   
RelatedCitation | Metrics
Automatic acquisition of translation templates is important for an MT system to improve its translation quality and its ability to adapt to new domains. In this paper, a Tree-to-String method was applied to extract translation equivalents, and an error-driven learning method was used to acquire templates. These templates were applied to MTS2005, and open tests were conducted to evaluate translation quality. The experimental results show that the performance of the new method is better than that of the old method. Combining the newly acquired templates with the original ones improves the assessment score on the open test corpus by 3.41% under the 3-gram NIST assessment metric.
Grid Task Scheduling Based on Improved Genetic Algorithm
YE Chun-xiao,LU Jie
Computer Science. 2010, 37 (7): 233-235. 
Abstract PDF(254KB) ( 183 )   
RelatedCitation | Metrics
Grid task scheduling is an NP-complete problem concerning the scheduling of tasks and resources on a large scale, and thus a highly efficient scheduling algorithm is required. A grid task scheduling algorithm based on GA was proposed. In the population initialization, a new method combining the min-min algorithm and the max-min algorithm was introduced; during the evolution of the population, a new criterion predicting premature convergence was presented and a corresponding improved mutation was designed to avoid it. The simulation results show that this improved algorithm can solve the grid task scheduling problem more effectively.
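The min-min heuristic used to seed the initial population can be sketched as follows (a toy version with an illustrative ETC matrix of our own; the paper combines this with max-min when building the population): repeatedly pick the task whose earliest completion time is smallest and assign it to that machine.

```python
# Min-min scheduling heuristic. etc[t][m] is the execution time of
# task t on machine m (the "expected time to compute" matrix).

def min_min(etc):
    """Returns (assignment task->machine, machine ready times)."""
    n_machines = len(etc[0])
    ready = [0.0] * n_machines
    unassigned = set(range(len(etc)))
    assignment = {}
    while unassigned:
        best = None   # (completion_time, task, machine)
        for t in unassigned:
            for m in range(n_machines):
                ct = ready[m] + etc[t][m]
                if best is None or ct < best[0]:
                    best = (ct, t, m)
        ct, t, m = best
        assignment[t] = m          # commit the cheapest task-machine pair
        ready[m] = ct
        unassigned.remove(t)
    return assignment, ready

etc = [[4, 6], [3, 5], [8, 2]]     # 3 tasks, 2 machines
assignment, ready = min_min(etc)
print(assignment, max(ready))      # assignment and resulting makespan
```

Seeding the GA with such heuristic chromosomes (alongside max-min and random ones) gives the evolution a good starting makespan without sacrificing diversity.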
Real-time Multiobjective Path Planning
WEI Wei,OUYANG Dan-tong,LV Shuai
Computer Science. 2010, 37 (7): 236-239,269. 
Abstract PDF(438KB) ( 518 )   
RelatedCitation | Metrics
A method adopting the idea of real-time search for solving multiobjective path planning problems was proposed. A local path planning algorithm was designed and implemented, which executes a heuristic search within a limited local space to obtain all of the local non-dominated paths. On this basis, a real-time multiobjective path planning method was proposed and the corresponding heuristic search algorithm was designed and implemented. The algorithm executes the path planning process, learning process, and moving process online in turn to obtain the set of locally optimal paths, transferring the current state within the local space and updating the heuristic information of the local states until the goal state is reached. Test results show that the algorithm can solve multiobjective path planning problems efficiently by limiting the local search space, which avoids a lot of unnecessary computation and thus improves search efficiency.
Fast Multi-class Classification Algorithm of Support Vector Machines
QIN Yu-ping,LUO Qian,WANG Xiu-kun,WANG Chun-li
Computer Science. 2010, 37 (7): 240-242. 
Abstract PDF(235KB) ( 181 )   
RelatedCitation | Metrics
A fast multi-class classification algorithm for support vector machines was proposed. Firstly, the number of training samples in each class is used as a weight to construct a Huffman binary tree, and sub-classifiers are then trained for every non-leaf node in the tree. For a sample to be classified, only the sub-classifiers between the root and a certain leaf are used; that leaf is the class of the sample. Classification experiments on the Reuters 21578 database were conducted with this algorithm. The experimental results show that it performs better and partly overcomes the flaw of existing multi-class SVM classification algorithms, namely slow classification. The algorithm can remarkably increase classification speed, especially when there are many classes and the class sizes are uniform.
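The tree-building step can be sketched as follows (our own illustration of standard Huffman construction over class counts; the SVM sub-classifiers at the internal nodes are omitted): classes with more training samples get shorter root-to-leaf paths, so frequent classes need fewer sub-classifier evaluations.

```python
import heapq

# Build a Huffman binary tree from per-class training-sample counts.
# Leaves are class labels; internal nodes are (left, right) tuples.

def build_huffman(class_counts):
    heap = [(count, idx, label) for idx, (label, count)
            in enumerate(class_counts.items())]   # idx breaks count ties
    heapq.heapify(heap)
    idx = len(heap)
    while len(heap) > 1:
        c1, _, n1 = heapq.heappop(heap)           # two rarest subtrees
        c2, _, n2 = heapq.heappop(heap)
        heapq.heappush(heap, (c1 + c2, idx, (n1, n2)))
        idx += 1
    return heap[0][2]

def depth(tree, label, d=0):
    """Root-to-leaf path length = number of sub-classifiers evaluated."""
    if tree == label:
        return d
    if isinstance(tree, tuple):
        for child in tree:
            found = depth(child, label, d + 1)
            if found is not None:
                return found
    return None

tree = build_huffman({'A': 50, 'B': 30, 'C': 10, 'D': 10})
print(depth(tree, 'A') <= depth(tree, 'C'))  # frequent class is shallower
```

Classification then walks from the root, querying one binary SVM per internal node, until it reaches a leaf.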
Novel Local Within-class Features Preservation Kernel Fisher Discriminant Algorithm and its Application in Speaker Identification
ZHENG Jian-wei,WANG Wan-liang
Computer Science. 2010, 37 (7): 243-247. 
Abstract PDF(418KB) ( 235 )   
RelatedCitation | Metrics
Dimensionality reduction without losing the intrinsic information of the original data is an important technique for subsequent tasks such as classification. A novel local within-class features preservation kernel Fisher discriminant algorithm was proposed after deeply analyzing the relationship between the kernel Fisher discriminant and kernel local Fisher projection. The new method keeps the global projection ability of KFD and solves the over-fitting of KLFDA's local preservation, so it works well on overlapped or multimodal labeled data. The training algorithm is improved to resolve the out-of-memory problem in large-sample situations. Simulation and a speaker identification application show that the proposed algorithm has greater adaptability as well as improved recognition rate and speed.
Attitude Control for Large Angle Maneuver of Flexible Satellite Based on a Multi-objective Evolutionary Algorithm
SHEN Xiao-ning,ZHOU Duan,GUO Yu,CHEN Qing-wei,HU Wei-li
Computer Science. 2010, 37 (7): 248-250,263. 
Abstract PDF(334KB) ( 169 )   
RelatedCitation | Metrics
The attitude control for large-angle maneuvers of a flexible satellite requires that the satellite follow the given angle quickly, while the vibration of the solar panel caused by the maneuver is expected to be low. A new multi-objective evolutionary optimization algorithm was applied to the path planning of large-angle satellite maneuvers, optimizing both the rapidity of the maneuver and the low vibration of the solar panel. Simulation results indicate that the proposed approach can efficiently find a diverse set of non-dominated path solutions in one run, from which the decision maker can make a selection under different working requirements. The proposed approach alleviates the conflict between the two criteria.
Shape Description of 3D CAD Models Using FFT
WANG Yan-wei,HUANG Zheng-dong,MA Lu-jie
Computer Science. 2010, 37 (7): 251-254,259. 
Abstract PDF(428KB) ( 185 )   
RelatedCitation | Metrics
A Fast Fourier Transform based shape description method for 3D CAD models was presented. First, based on a uniform representation of geometric facets and according to the facet adjacency relation in the CAD model, a facet sequence was obtained with the TSP (Traveling Salesman Problem) algorithm for complete graphs, and the consistency of the facet sequence was ensured by adopting a reference CAD model. Then, according to the facet sequence, the geometric information was transformed into five 1D discrete signals. After sampling and amplification, the discrete signals were transformed into the frequency domain with FFT, and the description of the CAD model was composed of the magnitudes in the frequency domain. Ignoring the deviation caused by the facet sequence, describing the geometric shapes of CAD models with this method is similar to describing discrete signals with FFT. Experiments on physical interpretability were also elaborated.
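The frequency-domain description step can be sketched as follows (our simplification using a hand-rolled DFT so the example stays dependency-free; the signal values are illustrative, not a real facet sequence): a 1D geometric signal is transformed and its magnitude spectrum serves as the descriptor. A useful property of magnitudes is invariance to circular shifts of the sequence start point.

```python
import cmath
import math

# Magnitude spectrum of a 1D discrete signal via the DFT:
#   X[k] = sum_t x[t] * exp(-2*pi*i*k*t/n),  descriptor = |X[k]|.

def dft_magnitudes(signal):
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

signal = [0.0, 1.0, 0.0, -1.0]          # toy per-facet geometric value
shifted = signal[1:] + signal[:1]       # same sequence, different start
m1 = dft_magnitudes(signal)
m2 = dft_magnitudes(shifted)
print(all(abs(a - b) < 1e-9 for a, b in zip(m1, m2)))  # → True
```

An FFT computes the same X[k] in O(n log n); the descriptor and its shift invariance are unchanged.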
Fingerprint Recognition Method Based on Multi-feature Fusion
HAN Zhi,LIU Chang-ping
Computer Science. 2010, 37 (7): 255-259. 
Abstract PDF(455KB) ( 359 )   
RelatedCitation | Metrics
To overcome the disadvantages and improve the performance of minutiae-based and image-based fingerprint identification methods, a fingerprint recognition algorithm consisting of classification selection and classifier fusion was put forward. Three image-based methods, based on the orientation field, texture features of the gray-level co-occurrence matrix, and histogram features of the local binary pattern operator, were combined with a minutiae-based method for fingerprint recognition. Experimental results indicate that the fusion-based method performs better than any single method used in the fusion and can be applied to automatic fingerprint identification systems.
3D Model Retrieval on Multi-feature Dynamic Integration
ZHENG Ying,ZHOU Ming-quan,GENG Guo-hua,GAO Yuan
Computer Science. 2010, 37 (7): 260-263. 
Abstract PDF(344KB) ( 209 )   
RelatedCitation | Metrics
A method integrating multiple features based on two-dimensional orthogonal projections was presented to improve retrieval accuracy. Firstly, the two-dimensional orthogonal projections of the 3D models are obtained; then two features of these projections are used, each with a weight. The features are fused at the output layer: by summing the two feature results with their weights, the overall result is obtained. According to users' feedback, the system changes the weight of each feature dynamically. Experiments show that retrieval accuracy and efficiency improve considerably compared with single-feature retrieval.
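The output-layer fusion can be sketched as follows (a toy weight-update rule of our own design; the abstract does not specify the paper's feedback formula): two per-feature similarity scores are combined with weights that shift toward the feature agreeing better with user feedback.

```python
# Weighted output-layer fusion of two feature scores, with a simple
# feedback-driven weight update (illustrative rule, not the paper's).

def fuse(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

def update_weights(weights, feature_errors, lr=0.5):
    """Shrink the weight of the feature with larger feedback error,
    then renormalize so the weights sum to 1."""
    raw = [w / (1.0 + lr * e) for w, e in zip(weights, feature_errors)]
    total = sum(raw)
    return [r / total for r in raw]

weights = [0.5, 0.5]
print(fuse([0.9, 0.3], weights))             # equal-weight combination
# Feedback says feature 0 ranked results better (smaller error),
# so its weight grows on the next query.
weights = update_weights(weights, [0.1, 0.6])
print(weights[0] > weights[1])               # → True
```

Any number of features can be fused the same way, as long as their scores are normalized to a common range before weighting.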
Color Image Segmentation Based on Spectral Clustering Using Multi-sampling
ZHU Feng,SONG Yu-qing,ZHU Yu-quan,XU Li-li,JIN Hong-wei
Computer Science. 2010, 37 (7): 264-266. 
Abstract PDF(254KB) ( 156 )   
RelatedCitation | Metrics
Spectral clustering for image segmentation suffers from the difficulty of computing the spectrum of the weight matrix. A multistage-sampling spectral image clustering algorithm was designed to eliminate this difficulty. First, a sampling theorem was given and proved, and the minimum sample size was derived according to the smallest cluster and the cluster number. Secondly, image pixels were sampled based on the minimum sample size, and a penalty function based on repeated sampling was proposed to eliminate the effect of the sampling model on the stability of spectral clustering. Finally, the distance between a pixel and the categories was defined, and the remaining points were assigned to their respective clusters by the nearest-distance principle. The experimental results show the effectiveness of the algorithm.
Color Image Compression Based on Transform Optimization
XIE Kai,YANG Sheng,YU Hou-quan,YANG Jie
Computer Science. 2010, 37 (7): 267-269. 
Abstract PDF(238KB) ( 226 )   
RelatedCitation | Metrics
A transform-optimization based preprocessing algorithm was presented. The method integrates a combination of transforms, color component weighting, and CSF filtering. Experimental results showed that the proposed algorithm outperforms the JPEG2000 lossy and lossless algorithms in both objective and subjective quality over a wide range of compression rates.
Novel Edge-preserving Algorithm for Defocus Blurred Image Restoration
XIAO Quan,DING Xing-hao,LIAO Ying-hao
Computer Science. 2010, 37 (7): 270-272. 
Abstract PDF(273KB) ( 154 )   
RelatedCitation | Metrics
Relevant research on image restoration indicates that an image's subjective visual quality is closely related to its local details. A novel restoration algorithm for defocus-blurred images was proposed. Based on the BTV regularization framework, the proposed algorithm constructs a new cost function for image restoration by introducing a locally adaptive weighting function. This cost function takes into account not only the global data fidelity but also the local statistical properties of the image, i.e., it fully considers the local structural features of the image under global data fidelity, and hence behaves much better in edge preservation. Experimental results confirm the effectiveness of the proposed method: the restored image has better subjective and objective visual quality than that produced by other methods such as the traditional BTV regularization approach.
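For reference, the plain (non-adaptive) bilateral total-variation regularizer that the framework above builds on can be written as a small function; this is a textbook-style sketch of the BTV term only, and the paper's locally adaptive weighting function is not included:

```python
import numpy as np

def btv(img, p=2, alpha=0.7):
    """Bilateral total-variation regularization term: the sum over
    shifts (l, m) of alpha**(|l|+|m|) * ||X - shift(X, l, m)||_1."""
    cost = 0.0
    for l in range(-p, p + 1):
        for m in range(-p, p + 1):
            if l == 0 and m == 0:
                continue
            # Circular shift stands in for the shift operators S_x, S_y
            shifted = np.roll(np.roll(img, l, axis=0), m, axis=1)
            cost += alpha ** (abs(l) + abs(m)) * np.abs(img - shifted).sum()
    return cost
```

In a restoration cost function this term is added to a data-fidelity term, e.g. `||H x - y||**2 + lam * btv(x)`, and penalizes non-smoothness while the L1 norm keeps edges sharp.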
Semantic Analysis for Soccer Video Based on Fusion of Multimodal Features
ZHANG Yu-zhen,WEI Dai-di,WANG Jian-yu,DAI Yue-wei
Computer Science. 2010, 37 (7): 273-276. 
Abstract PDF(383KB) ( 202 )   
RelatedCitation | Metrics
This paper proposed a framework that fuses multimodal features to detect soccer highlights. First, the audio stream was extracted from the video and classified based on CHMM. Then, according to the temporal correspondence, shooting events were detected based on the combination of goal and replay in the shots near those containing the commentator's excited speech and cheering from the audience, where replays were detected based on logos. For shooting events, scoring can be judged from the lengths of the excited speech and cheering, the length of the replay, and the appearance of captions. In the shots close to those containing whistles, fouls can be detected based on the appearance of a replay combined with its length. Experiments prove the high efficiency of the proposed system.
Method for Image Splicing Detection Based on Statistical Features and Markov Feature
LI Zhe,ZHANG Ai-xin,JIN Bo,LI Sheng-hong
Computer Science. 2010, 37 (7): 277-279. 
Abstract PDF(239KB) ( 161 )   
RelatedCitation | Metrics
The authenticity detection of digital images is of great significance in judicial identification. Image splicing is a frequently used falsification method, and it inevitably decreases the correlation between image pixels, which can be reflected by certain statistical features. Combined with Markov features, a new approach that uses moments and phase congruency to extract features was proposed, and an SVM classifier was used to judge the category an image belongs to. The experimental results show that the proposed method can achieve a detection accuracy of 91.75%.
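Markov features of the kind mentioned above are commonly computed as a transition-probability matrix over a thresholded pixel-difference array; a generic sketch (not the authors' exact feature definition) is:

```python
import numpy as np

def markov_features(arr, T=3):
    """Row-normalized transition-probability matrix of the thresholded
    horizontal difference array; its entries serve as splicing features."""
    arr = np.asarray(arr, dtype=int)
    diff = arr[:, :-1] - arr[:, 1:]          # horizontal difference array
    diff = np.clip(diff, -T, T)              # threshold to [-T, T]
    pairs = np.stack([diff[:, :-1], diff[:, 1:]], axis=-1).reshape(-1, 2)
    size = 2 * T + 1
    trans = np.zeros((size, size))
    for a, b in pairs:                       # count value transitions
        trans[a + T, b + T] += 1
    row_sums = trans.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1              # avoid division by zero
    return trans / row_sums                  # transition probabilities
```

In a full detector, the flattened matrix (here (2T+1)**2 = 49 values) would be concatenated with the moment and phase-congruency features and fed to the SVM.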
Semi-supervised Feedback Algorithm Based on Locally Adaptive Approximation
HUANG Chuan-bo,XIANG Li,JIN Zhong
Computer Science. 2010, 37 (7): 280-284. 
Abstract PDF(424KB) ( 155 )   
RelatedCitation | Metrics
In this paper, identification information was incorporated into the distance measure, and this new measure was used instead of the Euclidean distance to construct k-neighborhoods; on this basis we proposed a new local linear nearest-neighborhood propagation method. This provides a semi-supervised feedback algorithm based on locally adaptive approximation for the relevance feedback mechanism of image retrieval, FLANNP (feedback locally adaptive nearest neighbor propagation). The decision function constructed by SVMs was used to determine the most discriminant direction in a neighborhood around the query; such a direction yields a locally adaptive distance, by which the reconstruction weights were computed. Afterwards, all the labels were propagated from the labeled points to the whole dataset using the local linear neighborhoods with sufficient smoothness. The approach makes use of identification information in the distance measure and improves the accuracy of image retrieval.
Wavelet Denoising Based on Phase-matching Signal Noise Estimation
WU Peng,WANG Ai-xia,LI Jing-jiao
Computer Science. 2010, 37 (7): 285-286,303. 
Abstract PDF(246KB) ( 196 )   
RelatedCitation | Metrics
Most wavelet threshold denoising methods need to calculate a corresponding threshold, and the estimation of the noise variance directly affects the performance of threshold denoising. This paper presented a new phase-matching noise variance estimation method, which estimates the current noise in real time, and a new noise threshold was established on this basis. The experiments show that the method can greatly improve the signal-to-noise ratio and achieves a very good denoising effect.
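A minimal sketch of threshold denoising on a one-level Haar decomposition is shown below; note that the standard MAD estimator stands in for the paper's phase-matching noise estimator, which the abstract does not specify:

```python
import numpy as np

def haar_denoise(signal):
    """One-level Haar decomposition, MAD-based noise estimate on the
    detail band, universal soft threshold, then reconstruction."""
    x = np.asarray(signal, dtype=float)
    n = len(x) // 2 * 2                      # use an even number of samples
    a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)   # detail coefficients
    sigma = np.median(np.abs(d)) / 0.6745    # noise std via MAD
    thr = sigma * np.sqrt(2 * np.log(max(n, 2)))   # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0)  # soft thresholding
    out = np.empty(n)
    out[0::2] = (a + d) / np.sqrt(2)         # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```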
Improved DCNON Segmentation Algorithm in Color Image Spherical Coordinates
MIN Yu-tang,HU Hai,WANG Fu-rong,CHEN Tian
Computer Science. 2010, 37 (7): 287-290,295. 
Abstract PDF(468KB) ( 174 )   
RelatedCitation | Metrics
This paper introduced a color image segmentation method based on an advanced dynamically coupled neural oscillator network. The method first converts the RGB values of a color image from three-dimensional Cartesian coordinates to spherical coordinates, aiming to remove background noise due to illumination and texture and to generate a smooth gray-level image of the angles. Then, the advanced dynamically coupled neural oscillator network segments this gray-level angle image, producing robust, fast and reliable segmentation.
Hardware/Software Partitioning VLD Design Based on AVS
LIU Wei,CHEN Yong-en,XU Yuan-feng
Computer Science. 2010, 37 (7): 291-295. 
Abstract PDF(458KB) ( 196 )   
RelatedCitation | Metrics
An architecture with hardware/software co-design for AVS Variable Length Code (VLC) decoding was designed. Fixed Length Codes, k-th order Exp-Golomb codes and context-based adaptive 2D-VLC (CA-2D-VLC) codes were decoded correctly on the platform. A new design method for the VLC tables was presented, which optimizes the nineteen original tables. As a result of the optimized logic design and a large decrease in the number of instructions, a reduction of more than 30% in hardware consumption was achieved. A verifier based on RM52J was proposed as well to validate the rationality and validity of the whole system. Tested with ninety-two streams, the design was proved to meet the requirements of AVS video decoding.
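Of the three code classes mentioned above, the k-th order Exp-Golomb code has a simple standard decoding rule, sketched below as a software reference model; this illustrates the code itself, not the paper's optimized table design:

```python
def decode_exp_golomb(bits, k=0):
    """Decode one k-th order Exp-Golomb codeword from a bit string.
    Returns (value, bits_consumed)."""
    # Count leading zeros: the unary prefix of the codeword
    zeros = 0
    while bits[zeros] == '0':
        zeros += 1
    # Codeword layout: <zeros> '0's, one '1', then (zeros + k) info bits
    length = zeros + 1 + zeros + k
    info = bits[zeros + 1:length]
    value = (1 << (zeros + k)) - (1 << k) + (int(info, 2) if info else 0)
    return value, length
```

For k = 0 this is the familiar ue(v) code: '1' decodes to 0, '010' to 1, '00101' to 4.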
On-chip SRAM-based FTL Design for Solid-state Drive
XIE Chang-sheng,LI Bo,LU Chen,WANG Fen
Computer Science. 2010, 37 (7): 296-300. 
Abstract PDF(484KB) ( 258 )   
RelatedCitation | Metrics
Many issues concerning SSDs are arising in the storage industry. This paper proposed a caching FTL design that improves the efficiency of SSD random writes and reduces the number of erasure operations by using on-chip SRAM. Through simulation experiments with SSDsim, we demonstrated the efficiency of our design. At the end of the paper we give our follow-up plan for this work.
Optimization of Embedded SRAM for Low Power and Testing
WANG Jiang-an,ZHUANG Yi-qi,JIN Zhao,LI Di
Computer Science. 2010, 37 (7): 301-303. 
Abstract PDF(248KB) ( 369 )   
RelatedCitation | Metrics
In order to reduce the power consumption of SRAM, an optimized SRAM was presented. A technique isolating operation data was introduced into the fast inputs; the bus data were divided among multiple bits of the comparator; gating clocks were added to the large clock network; and multiple power-control modes and isolation logic were added. The SRAM 64K×32 was separated into eight sub-blocks of SRAM 8K×32, which were connected to chip-select signals through one-of-eight selection logic so that only one sub-block can be in a read-write operation at a time. The optimized SRAM 64K×32 was used in an SOC, and the power consumption of each part was measured by adding bypass logic. The SOC design was successfully implemented in a 90nm CMOS process. The testing results indicate that the power saving of the optimized SRAM 64K×32 is 29.569%, while the area increased by only 0.836%.
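The one-of-eight block selection described above amounts to decoding the high address bits into a one-hot chip select; a simple behavioral model (our own sketch, with bit widths assumed for a 64K×32 array split into eight 8K×32 blocks) is:

```python
def bank_select(addr, bank_bits=3, word_bits=13):
    """Split a word address of a 64K x 32 SRAM into a one-hot chip
    select for eight 8K x 32 sub-blocks plus the in-block offset.
    Only the selected block is active, which is what saves power."""
    bank = (addr >> word_bits) & ((1 << bank_bits) - 1)   # high 3 bits
    chip_select = 1 << bank                               # one-hot select
    offset = addr & ((1 << word_bits) - 1)                # low 13 bits
    return chip_select, offset
```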