Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37 Issue 10, 2010
Function S-Rough Sets, Function Rough Sets and Separation-Composition of Law for Information Systems
SHI Kai-quan
Computer Science. 2010, 37 (10): 1-10. 
Abstract PDF(770KB) ( 365 )   
Function one direction singular rough sets, the dual of function one direction singular rough sets, and function two direction singular rough sets were given. They are the three forms of function S-rough sets, obtained by improving S-rough sets through the introduction of the function concept. Function rough sets were likewise obtained by introducing the function concept into Z. Pawlak rough sets. This paper gave the relation between function S-rough sets and S-rough sets, the relation between function rough sets and Z. Pawlak rough sets, and the relation between function S-rough sets and function rough sets. Using these results, this paper presented the generation of the interval discretization for functions and finite element sets, the principle of function discretization and element generation, the information law generated by function S-rough sets, the principle of the dynamic properties of function equivalence classes under attribute supplementation and deletion, the separation-composition principle for data, the attribute characteristics of the dynamic separation-composition of information laws, the invariance principle of the dynamic separation-composition of information laws, and so on. Using these concepts and results, the application of the separation-composition of information laws, the application of embedment separation for information images, and the separation-identification of embedded information images were given. In short, function S-rough sets and function rough sets are a new research direction in rough sets.
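As a minimal illustration of the classical Pawlak approximations that function rough sets build on (not code from the paper — the universe, attribute, and target concept below are invented), a set is approximated from below and above by unions of equivalence classes:

```python
from collections import defaultdict

def equivalence_classes(universe, attr):
    """Partition the universe by the value of the given attribute function."""
    classes = defaultdict(set)
    for x in universe:
        classes[attr(x)].add(x)
    return list(classes.values())

def lower_approximation(classes, target):
    """Union of equivalence classes entirely contained in the target set."""
    return set().union(*([c for c in classes if c <= target] or [set()]))

def upper_approximation(classes, target):
    """Union of equivalence classes that intersect the target set."""
    return set().union(*([c for c in classes if c & target] or [set()]))

universe = set(range(8))
attr = lambda x: x % 3           # invented attribute: residue modulo 3
X = {1, 2, 4, 7}                 # invented target concept
classes = equivalence_classes(universe, attr)
lower = lower_approximation(classes, X)
upper = upper_approximation(classes, X)
```

The target concept is "rough" exactly when the two approximations differ, which is the case here.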
Prospects and Current Studies on Video Mining
DAI Ke-xue,LI Qiang,LI Guo-hui
Computer Science. 2010, 37 (10): 11-15. 
Abstract PDF(481KB) ( 623 )   
In recent years, more and more researchers around the world have shown great interest in video mining. However, the research is still in its early stage, and practical applications are few. The concepts, system structures, and approaches of video mining need further study. The research progress overseas and domestic developments in video mining technology were reviewed to summarize the state of the art. Then the concepts and meanings of video mining were discussed, and the relations and differences between video mining and other related technologies were distinguished. Finally, the major open problems and possible solutions in the study of video mining were discussed.
State of the Art on Cancer Classification Problems Based on DNA Microarray Data
YU Hua-long,GU Guo-chang,ZHAO Jing,LIU Hai-bo,SHEN Jing
Computer Science. 2010, 37 (10): 16-22. 
Abstract PDF(724KB) ( 711 )   
Applying DNA microarray data to diagnose cancer and recognize different subtypes of the same tumor has become one of the hot topics in bioinformatics. Firstly, this paper summarized the state of the art and the development trend of cancer classification based on microarray data. Then the basic procedure of DNA microarray experiments, the structure and characteristics of microarray data, and the general process of cancer classification based on DNA microarray data were introduced. After that, a detailed survey and systematic comparative analysis combined with the research results of the last ten years was made from several main aspects: data preprocessing, feature gene selection, classifier design and classification performance evaluation. Finally, some existing difficulties in this research field were summarized, and possible directions for future work were predicted and suggested.
Survey on Weight Information Problem Based on FN-E&DM
LI Ming-hui,XIA Jing-bo
Computer Science. 2010, 37 (10): 23-26. 
Abstract PDF(471KB) ( 398 )   
The acquisition of attribute weights is a complex problem in the process of evaluation and decision making. Research on attribute weight information based on FN-E&DM obtains the weights by solving a model. The main developments of weight determination based on FN-E&DM were surveyed from three aspects: weight information that is completely known, incompletely known, and completely unknown. The solving models were analyzed and the acquisition methods of attribute weights were discussed. Finally, future research on weights based on FN-E&DM was prospected.
Survey of Mining Imbalanced Datasets
ZHAI Yun,YANG Bing-ru,QU Wu
Computer Science. 2010, 37 (10): 27-32. 
Abstract PDF(527KB) ( 603 )   
This paper reviewed the present situation of mining imbalanced data sets at home and abroad in recent years. Firstly, it analyzed in depth the existing problems and their underlying nature. Then it dealt in detail with various state-of-the-art data mining techniques under the imbalanced learning scenario. Moreover, it analyzed and compared them comprehensively at the data level and the algorithm level respectively, according to their essential differences. It also summarized the metrics for evaluating the performance of mining imbalanced data sets, and pointed out recent hot issues in theoretical studies and applications. Finally, perspectives on future work were discussed.
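One of the simplest data-level techniques such surveys cover is random oversampling of the minority class. A hedged sketch (the toy data set below is invented, not from the survey):

```python
import random

def random_oversample(samples, labels, seed=0):
    """Duplicate minority-class samples until all classes reach the majority count."""
    rng = random.Random(seed)
    by_class = {}
    for s, y in zip(samples, labels):
        by_class.setdefault(y, []).append(s)
    majority_size = max(len(g) for g in by_class.values())
    out_s, out_y = list(samples), list(labels)
    for y, group in by_class.items():
        for _ in range(majority_size - len(group)):
            out_s.append(rng.choice(group))   # resample with replacement
            out_y.append(y)
    return out_s, out_y

X = [1, 2, 3, 4, 5, 6]
y = [0, 0, 0, 0, 0, 1]            # imbalanced: five negatives, one positive
Xb, yb = random_oversample(X, y)  # balanced: five of each class
```

Algorithm-level alternatives (cost-sensitive learning, ensemble methods) leave the data untouched and adjust the learner instead.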
On Summarization of Information Modeling Based on Implicit Polynomial Curves
WU Gang
Computer Science. 2010, 37 (10): 33-37. 
Abstract PDF(501KB) ( 467 )   
Implicit polynomial curves have many advantages for describing data set boundaries, most obviously in fitting data points. This paper surveyed the current studies on information modeling based on implicit polynomial curves, focusing on the comparison of various implicit polynomial curve fitting algorithms. Then, taking image object boundaries as an example, it analyzed the effects of the curve fitting algorithms, their advantages and disadvantages, possible improvements, and research trends.
Influence and Analysis of the Incomplete Behavior Evidence on Trust Evaluation in WSNs
TIAN Li-qin,LIN Chuang
Computer Science. 2010, 37 (10): 38-41. 
Abstract PDF(427KB) ( 354 )   
In wireless sensor networks, static identity-based node authentication alone cannot meet dynamic security needs. To provide better security, dynamic behavior-based node trust must be combined with static identity-based node authentication. In node behavior trust evaluation, behavior evidence is the fundamental basis of the evaluation. Because a node's behavior is random and uncertain, the behavior evidence is incomplete and the behavior values differ across interactions; as far as we know, however, this very important phenomenon is rarely considered in behavior trust evaluation. This paper analyzed the significant impact that incomplete behavior evidence obtained in node interactions has on trust evaluation, discussed the relationship between incomplete behavior evidence and behavior values, and gave behavior trust evaluation strategies for interactions with different values, such as Less-Filling and Weight Extension. Finally, we used two theorems and two properties to prove the strategies' effect in behavior trust evaluation. This paper lays a quantitative foundation for enhancing the reliability of node behavior trust evaluation.
Two-way Fast String Matching Algorithm in Ad hoc Networks
ZHANG Ying,XU Jian,CHANG Gui-ran,JIA Jie
Computer Science. 2010, 37 (10): 42-47. 
Abstract PDF(438KB) ( 444 )   
In a network intrusion detection system, the original AC algorithm adopts one-way matching, so the comparison time increases as the number of patterns increases. This paper presented an efficient multi-pattern matching algorithm: a bi-directional fast string matching algorithm. The algorithm uses finite automata and forward-backward two-way matching. Compared with the original one-way matching algorithm, the intrusion detection rate is increased by a factor of three. The performance of the algorithm was analyzed and compared with other algorithms. Simulation results show that this algorithm can improve both efficiency and detection rate.
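The core idea of forward-backward comparison can be sketched without the automaton machinery (a hedged illustration, not the paper's algorithm): compare each window against the pattern from both ends inward, so a mismatch at either end is found in at most half the comparisons of a one-way scan.

```python
def two_way_window_match(text, pattern):
    """Return start offsets where pattern occurs, comparing ends-inward."""
    m, hits = len(pattern), []
    for i in range(len(text) - m + 1):
        lo, hi, ok = 0, m - 1, True
        while lo <= hi:
            # one step compares the window against both pattern ends at once
            if text[i + lo] != pattern[lo] or text[i + hi] != pattern[hi]:
                ok = False
                break
            lo += 1
            hi -= 1
        if ok:
            hits.append(i)
    return hits

positions = two_way_window_match("abcabcab", "abc")   # [0, 3]
```

The paper combines this bidirectional idea with a finite automaton for the multi-pattern case; the sketch above shows only the single-pattern window test.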
RCEA: An Energy Efficient Based Regular Coverage-enhancing Algorithm for Wireless Sensor Networks
TANG Lei,ZHOU Xing-she,ZHANG Da-qing,SUI Yu-lei,MA Jun-yan
Computer Science. 2010, 37 (10): 48-54. 
Abstract PDF(593KB) ( 360 )   
We addressed the issue of maintaining complete sensing coverage while keeping a minimum number of nodes. We investigated the network topology with the greatest coverage based on computational geometry, and presented a regular coverage-enhancing algorithm (RCEA). By introducing the concepts of virtual force and a balanced energy factor, nodes repel each other to eliminate sensing overlap regions and coverage holes along a spiral scanning path. The algorithm enhances the overall coverage performance and realizes the optimal topology. A set of simulations was performed to demonstrate the effectiveness of the proposed algorithm.
New Network Simulation Approach Based on Integration of Fluid Simulation and Package Simulation
HAN Jin,CAI Sheng-wen,XIE Jun-yuan
Computer Science. 2010, 37 (10): 55-58. 
Abstract PDF(347KB) ( 347 )   
With the development of network applications, the structure and size of modern networks have become more complicated. Packet-level simulation alone is no longer adequate for networks of large size and high transit speed. Fluid simulation has higher efficiency but lower precision. In this paper, fluid simulation is used to simulate the network's background traffic, and packet-level simulation is used to simulate the specific network actions of interest. Based on this method, a new network simulation approach was presented that integrates fluid simulation and packet simulation and leverages the strong points of both. Compared with other similar approaches, this approach converts packet flows into fluid flows; in this way it obtains precise simulation results from the interaction of foreground and background fluids, with higher simulation precision and efficiency.
Novel Cloud Computing Intrusion Detection Model Based on Improved Manifold Learning
CHEN Dan-wei,HOU Nan,SUN Guo-zi
Computer Science. 2010, 37 (10): 59-62. 
Abstract PDF(381KB) ( 374 )   
Cloud computing, a new Internet-based super-computing model, has aroused great concern while facing an increasing number of security threats. This paper built an intrusion detection system framework adapted to the cloud computing environment. A nonlinear manifold learning algorithm was introduced as the core algorithm of the data pre-processing module, and an improved LLE algorithm was given to improve classification performance. The simulation results show that the algorithm is effective and efficient.
Real Time Routing Protocol Based on Data Driven Link Estimation in Sensor Networks
YOU Shao-hui,LIU Qiang,SHEN Xue-ping
Computer Science. 2010, 37 (10): 63-67. 
Abstract PDF(431KB) ( 343 )   
In this paper, a real-time routing protocol based on data-driven link estimation for wireless sensor networks was presented. The protocol relies on location information, constructs the next-hop node set using a forwarding-path distance vector, applies the data-driven idea to form a velocity vector from delay, delivery rate and single-hop distance, and provides soft real-time end-to-end data delivery. Real-time performance and energy consumption are the primary evaluation targets for this protocol. On the ns2 simulation platform, at a data rate of 50 packets per second the delay of this protocol is reduced by over 50% compared with AODV and DSR, and energy consumption by under 30%. This protocol can effectively improve real-time performance and reduce network energy consumption.
(Dept. of Electronics and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074, China)
ZHANG Lin,WANG Shu
Computer Science. 2010, 37 (10): 68-70. 
Abstract PDF(292KB) ( 494 )   
Low complexity is an outstanding advantage of impulse radio ultra-wideband (UWB) communication, since it is carrier-free unlike traditional systems. Both better performance and a simple receiver structure are UWB system design goals. A novel demodulation algorithm for UWB signal reception based on amplitude comparison was proposed. The pulse amplitudes are obtained by integrating the positive and negative half cycles of the pulse respectively at specific positions in the pulse position modulation scheme, and the binary symbol is then demodulated by comparing their amplitudes. With this method the receiver can strike a better balance between reception performance and system complexity. The results show that the proposed method obtains remarkably better bit error ratio performance (over 5dB) than square-law detection at a small cost in complexity, and reduces complexity compared with coherent detection with less than 2dB performance degradation.
Analysis of Network Coding Nodes Selection in Wireless Mesh Networks
SHEN Xiao-jian,CHEN Zhi-gang,YE Hui,XIA Zhuo-qun
Computer Science. 2010, 37 (10): 71-73. 
Abstract PDF(319KB) ( 322 )   
Wireless mesh networks can significantly improve the transmission performance of multi-hop links by using network coding techniques. However, network coding has a cost, and how to choose encoding nodes so as to reduce this cost is the focus of study. This paper discussed network coding node selection in wireless mesh networks and proposed a selection algorithm based on super-key nodes. While the Ford-Fulkerson labeling algorithm finds augmenting paths, the algorithm counts the in-degree of each node on the path, saves the information obtained from different links in the nodes, and then determines which nodes are super-key nodes. These super-key nodes are the encoding nodes. Simulation results show that the proposed algorithm obviously reduces the number of encoding nodes while achieving the multicast maximum flow.
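A hedged sketch of the counting idea (the graph, the unit capacities, and the in-degree threshold below are invented for illustration, not taken from the paper): while a BFS-based max-flow search finds augmenting paths, count how often each intermediate node is entered; nodes entered along two or more augmenting paths are candidate coding ("super-key") nodes, since distinct flows merge there.

```python
from collections import defaultdict, deque

def max_flow_with_indegree(cap, s, t):
    """Unit-augmentation max flow (assumes small integer capacities),
    counting entries into intermediate nodes along augmenting paths."""
    flow, indeg = 0, defaultdict(int)
    residual = {u: dict(vs) for u, vs in cap.items()}
    for u, vs in cap.items():           # ensure reverse residual edges exist
        for v in vs:
            residual.setdefault(v, {}).setdefault(u, 0)
    while True:
        parent = {s: None}              # BFS for an augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow, indeg
        path, v = [], t                 # recover the path s -> t
        while v is not None:
            path.append(v)
            v = parent[v]
        path.reverse()
        for u, v in zip(path, path[1:]):
            residual[u][v] -= 1         # push one unit along the path
            residual[v][u] += 1
            if v != t:
                indeg[v] += 1           # count entries into interior nodes
        flow += 1

cap = {"s": {"a": 1, "b": 1}, "a": {"c": 1}, "b": {"c": 1},
       "c": {"t": 2}, "t": {}}
flow, indeg = max_flow_with_indegree(cap, "s", "t")
coding = [v for v, d in indeg.items() if d >= 2]   # node "c" merges two flows
```

In this toy butterfly-like graph, two unit flows from `s` merge at `c`, so `c` is the only coding candidate.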
Research on Wireless Sensor Network Architecture Model in Ocean Scene
LIU Lin-feng,LIU Ye
Computer Science. 2010, 37 (10): 74-77. 
Abstract PDF(503KB) ( 485 )   
The defects of traditional wireless sensor network (WSN) architectures were first studied, then the characteristics of the ocean scene were analyzed. The typical requirement objectives of a WSN in an ocean scene were derived, and a set of design principles for a WSN model in the ocean environment was concluded. In accordance with the objectives of ocean environment adaptability, network self-healing, energy efficiency and dynamic optimization, a wireless sensor network architecture model for the ocean scene (WSNAOS) was proposed. Furthermore, the design ideas, general characteristics and functions of every layer and module were described and discussed. WSNAOS not only has characteristics such as ocean environment adaptability and energy efficiency, but also supplies research on WSN protocols in ocean scenes with a common problem description and a theoretical framework.
Cryptanalysis and Improvement of a New Identity-based Key Exchange Protocol
GUO Hua,ZHANG Fan,LI Zhou-jun,ZHOU Xiao-juan
Computer Science. 2010, 37 (10): 78-81. 
Abstract PDF(313KB) ( 541 )   
This paper extended the definition of the key-compromise impersonation attack for a new kind of identity-based key exchange protocol presented by Wang in 2007, then showed that this protocol cannot resist the extended key-compromise impersonation attack. This paper also conducted a detailed analysis of the flaw. To avoid this shortcoming, an improvement of the identity-based protocol was proposed based on the original scheme.
Adaptive Transmission Scheme for Cooperative OFDM Networks Based on Subcarrier Mapping
LI Fu-nian,ZHU Guang-xi,WANG De-sheng
Computer Science. 2010, 37 (10): 82-84. 
Abstract PDF(356KB) ( 441 )   
In an amplify-and-forward (AF) OFDM cooperative communication system, since the channels of the first hop (source-relay) and the second hop (relay-destination) vary independently, a proper subcarrier mapping (SCM) strategy at the relay node has been shown to improve the average end-to-end capacity evidently. However, SCM performs poorly at low signal-to-noise ratio (SNR), where it is worse than direct transmission. To solve this issue, assuming that perfect channel knowledge is available at the destination node, an adaptive transmission scheme based on subcarrier mapping was proposed for OFDM-based cooperative communications. The adaptive scheme selects, based on instantaneous channel state information, the transmission mode that maximizes the end-to-end system capacity. The simulation results show that the proposed solution can evidently improve end-to-end system capacity and achieve better outage probability and symbol error rate performance.
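The benefit of SCM can be sketched numerically (a hedged illustration with invented per-subcarrier SNRs and a simplified AF end-to-end SNR formula, not the paper's exact model): pairing the strongest first-hop subcarrier with the strongest second-hop subcarrier beats the identity mapping.

```python
import math

def af_capacity(snr1, snr2):
    """Per-subcarrier AF capacity with a harmonic-style end-to-end SNR
    (simplified model; the 1/2 accounts for the two-hop time split)."""
    e2e = (snr1 * snr2) / (snr1 + snr2 + 1.0)
    return 0.5 * math.log2(1.0 + e2e)

def capacity(first_hop, second_hop, mapping):
    return sum(af_capacity(first_hop[i], second_hop[j]) for i, j in mapping)

first = [10.0, 1.0, 4.0]          # invented first-hop subcarrier SNRs
second = [2.0, 8.0, 0.5]          # invented second-hop subcarrier SNRs
identity = [(i, i) for i in range(3)]
order1 = sorted(range(3), key=lambda i: first[i], reverse=True)
order2 = sorted(range(3), key=lambda j: second[j], reverse=True)
scm = list(zip(order1, order2))   # best-to-best pairing
c_id = capacity(first, second, identity)
c_scm = capacity(first, second, scm)
```

An adaptive scheme such as the paper's would additionally compare `c_scm` against the direct-transmission capacity per channel realization and pick the larger.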
Provable Secure Trust Data Sharing Protocol for Hybrid P2P Networks
LIN Huai-qing,WANG Bin,ZHOU Yan
Computer Science. 2010, 37 (10): 85-88. 
Abstract PDF(324KB) ( 368 )   
The open and anonymous nature of P2P networks makes them extremely susceptible to attacks by malicious users. One way to minimize threats is to build a trust model for the P2P network. Trust data is critical to the trust model of a P2P system, whose validity relies on the reliability of the trust data. We presented a CL-PKC-based protocol that enables trust data to be shared securely in hybrid P2P networks. The protocol avoids both the authentication of public keys and the key escrow problem of identity-based cryptosystems. The security of the cryptosystem is based on the Bilinear Diffie-Hellman Problem. The strand space model was used to prove that the protocol achieves agreement properties and keeps the trust data secret.
Time-domain Non-uniform Error Protection Algorithm of Audio Information
MAO Qian,XU Jun-jun,DONG De-cun
Computer Science. 2010, 37 (10): 89-91. 
Abstract PDF(229KB) ( 321 )   
A new method for safe audio transmission based on unequal error protection codes was proposed. Unequal error protection codes provide two levels of error-correcting capability, f1 and f2 (f1 > f2), for different audio information bits within a single codec. If the number of error bits is less than or equal to f2 in the decoding process, a common decoding algorithm is adopted; the higher protection level with error-correcting capability f1 is then provided by computing the syndromes of the sub-codespace of f2. The method is suitable for better audio codec design. Simulation shows that this method improves the secure transmission of the audio signal: under a given noise level, the SNR of the audio signal is obviously improved and the hearing effect is good.
Performance Analysis of CCSDS Sender Packet Multiplexing Processing System
BIE Yu-xia,LIU Hai-yan,PAN Cheng-sheng
Computer Science. 2010, 37 (10): 92-94. 
Abstract PDF(255KB) ( 463 )   
For the characteristics of CCSDS, namely random packet arrival and variable packet length, this paper established the performance model and processing limit of the multiplexing processing system that sends CCSDS packets. On the basis of packet arrivals obeying a Poisson process, the relationship of cache capacity and processing speed with packet length was analyzed, and the overflow threshold of the packet multiplexing processing system was deduced. Simulation results show the impact of packet arrival rate and packet length on processing ability and the required cache capacity, which provides a theoretical basis for the design of CCSDS sender data processing systems.
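A toy discrete-time version of such a model can be sketched as follows (hedged: all rates, packet lengths, and slot counts below are invented, and the paper's analytical model is continuous, not this simulation): Poisson packet arrivals feed a buffer drained at a fixed processing speed, and the peak backlog indicates the cache capacity needed.

```python
import math
import random

def peak_buffer(arrival_rate, mean_len_bits, service_bits, slots, seed=1):
    """Track the peak backlog of a multiplexer draining service_bits per slot."""
    rng = random.Random(seed)
    backlog, peak = 0.0, 0.0
    for _ in range(slots):
        # draw Poisson-distributed packet arrivals for this slot (Knuth's method)
        n, p, threshold = 0, 1.0, math.exp(-arrival_rate)
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            n += 1
        backlog = max(0.0, backlog + n * mean_len_bits - service_bits)
        peak = max(peak, backlog)
    return peak

light = peak_buffer(0.5, 1000, 2000, 2000)   # load 0.25: small buffer suffices
heavy = peak_buffer(3.0, 1000, 2000, 2000)   # load 1.5: backlog keeps growing
```

When the offered load (arrival rate times mean length over service rate) exceeds 1, the backlog grows without bound, which is the overflow regime the paper's threshold characterizes.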
New Scheme of Full Disk Encryption Based on Logical Password Lock
ZHAO Fu-xiang,PANG Liao-jun,WANG Yu-min
Computer Science. 2010, 37 (10): 95-97. 
Abstract PDF(328KB) ( 337 )   
Full disk encryption, which not only keeps encrypted data secret but also conceals the metadata of the disk structure, can give a better defense against attack than file-system-level encryption when a tweakable enciphering mode is introduced. However, current disk encryption increases dependence on the master key as the coverage of confidential data broadens, because the amount of data encrypted by a single key also increases. This causes key distribution problems: whether the master key can be derived conveniently, and how the workload of computation and storage can be reduced. To solve this, a scheme of full disk encryption based on a logical password lock was described, and an analysis of the environment and efficiency of the scheme was given.
SNMP-based Information Management Platform for IPv6 Wireless Sensor Networks
FANG Ran,GAO De-yun,LIANG Lu-lu,ZHANG Si-dong
Computer Science. 2010, 37 (10): 98-101. 
Abstract PDF(395KB) ( 357 )   
Wireless sensor networks (WSNs) are composed of a large number of sensor nodes distributed randomly and organized by self-organizing distributed protocols. Network management of a WSN is an important requisite to guarantee its stability, reliability and high efficiency. In this paper, the existing network management technology for WSNs was discussed and researched. Combined with the characteristics and practical requirements of wireless sensor networks, and based on the existing basic protocols of IPv6-WSN and SNMP, a wireless sensor network information management platform was designed and realized to achieve remote and effective management.
Risk Assessment Approach Based on Trust in P2P Network
LI Kai,LI Rui-xuan,ZHANG Hua-juan,LU Zheng-ding
Computer Science. 2010, 37 (10): 102-104. 
Abstract PDF(349KB) ( 315 )   
As a result of P2P networks being open and dynamic, it is inevitable that peers' information exchange brings security issues to systems. A trust-based approach was introduced into risk assessment, and a risk assessment method was put forward by combining trust estimation with traditional risk calculation to evaluate the risk of information exchange between peers. The risk value is derived from estimates of trust risk and result risk. Trust risk is estimated from the long-term behavior of a node and reflects the relationship between trust and risk; result risk is estimated from the short-term behavior of a node. Simulation results show that the trust-based risk assessment approach works effectively in P2P networks.
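The long-term/short-term combination can be sketched as follows (hedged: the weighting, window size, and interaction histories are invented, not the paper's formulas):

```python
def trust_risk(history):
    """Long-term risk: failure ratio over the full interaction history
    (1 = successful interaction, 0 = failed interaction)."""
    return history.count(0) / len(history)

def result_risk(history, window=5):
    """Short-term risk: failure ratio over the most recent interactions."""
    recent = history[-window:]
    return recent.count(0) / len(recent)

def risk(history, alpha=0.6):
    # alpha weights long-term behaviour against recent behaviour (invented)
    return alpha * trust_risk(history) + (1 - alpha) * result_risk(history)

steady = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]      # mostly successful peer
decaying = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]    # recently misbehaving peer
```

The short-term component makes the risk score react quickly to a peer that was trustworthy for a long time but has just started misbehaving.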
Theoretical Discussion about Topology Control for Wireless Sensor Networks
ZHANG Xue,GONG Hai-gang,LIU Ming
Computer Science. 2010, 37 (10): 105-109. 
Abstract PDF(461KB) ( 424 )   
Topology control is an important energy-saving technique used in wireless sensor networks. It has evolved into two dominant research directions: power control and sleep scheduling. Based on observation of existing problems, this paper did fundamental research on topology control. We gave full consideration to the energy consumption of the network, both in communication and in idle states. Then, under an idealized setting, we proposed a clear definition of the topology control problem with the objective of minimizing energy consumption. We proved that it is NP-hard, and informally discussed the computational complexity of more realistic topology control problems. On that basis, we further put forward three necessary principles for designing energy-efficient topology control protocols. It is hoped that this paper can contribute to the development of better topology control protocols.
New Logic of Analyzing Electronic Commerce Security Protocols
CHEN Li
Computer Science. 2010, 37 (10): 110-115. 
Abstract PDF(416KB) ( 328 )   
The paper examined the typical logic-based analysis methods for electronic commerce security protocols and pointed out their limitations in analyzing security properties: most of them lack formal semantics and the ability to analyze hybrid cryptographic primitives. In response to these problems, the paper proposed a new logic-based analysis method, which can analyze most of the known security properties of electronic commerce protocols, such as authentication, secrecy of keys, non-repudiation, accountability, fairness and atomicity. The validity of the new logic was verified by analyzing the anonymous e-cash payment protocol ISI. The analysis reveals security vulnerabilities and flaws of the protocol: it cannot satisfy non-repudiation for merchants, secrecy of keys, accountability, fairness or atomicity; moreover, customers face malicious cheating by merchants.
Intrusion Detection Method Based on Model Checking of Concurrent Propositional Projection Temporal Logic
CHEN Jian-hui,WANG Wen-yi,ZHU Wei-jun
Computer Science. 2010, 37 (10): 116-117. 
Abstract PDF(240KB) ( 319 )   
The intrusion detection method based on model checking of projection temporal logic can describe piecewise network attacks but not concurrent attacks. We defined a novel concurrency operation and obtained a new intrusion detection method based on model checking of concurrent propositional projection temporal logic. By example, we show that our method can detect concurrent attacks efficiently.
Specification-based Distributed Detection for Mobile Ad Hoc Networks
WANG Fang,YI Ping,WU Yue,WANG Zhi-yang
Computer Science. 2010, 37 (10): 118-122. 
Abstract PDF(460KB) ( 319 )   
Mobile ad hoc networks are highly vulnerable to attacks due to their open medium, dynamically changing topology, cooperative algorithms, and lack of a centralized monitoring and management point. The traditional way of protecting networks with firewalls and encryption software is no longer sufficient or effective given these features. We proposed a distributed intrusion detection approach based on a finite state machine (FSM). A cluster-based detection scheme was presented, in which a node is periodically elected as the monitor node for a cluster. These monitor nodes not only make local intrusion detection decisions but also cooperatively take part in global intrusion detection. We then constructed the FSM by manually abstracting the correct behaviors of a node according to the Dynamic Source Routing (DSR) protocol. The monitor nodes can verify every node's behavior against the FSM and validly detect real-time attacks without intrusion signatures or training data. Compared with an architecture where each node is its own IDS agent, our approach is much more efficient while maintaining the same level of effectiveness. Finally, we evaluated the intrusion detection method through simulation experiments.
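The specification-based idea can be sketched as follows (hedged: the states and events below are invented toy stand-ins, not the paper's actual DSR specification): a monitor checks observed events against an FSM of correct behaviour, and any event with no defined transition is flagged as an anomaly, with no attack signatures needed.

```python
# Toy FSM of "correct" routing behaviour: a node receiving a route request
# must either forward it or reply to it (states/events invented).
TRANSITIONS = {
    ("idle", "recv_rreq"): "processing",
    ("processing", "forward_rreq"): "idle",
    ("processing", "send_rrep"): "idle",
}

def detect(events, start="idle"):
    """Return (index, state, event) alarms for behaviour the spec disallows."""
    state, alarms = start, []
    for i, ev in enumerate(events):
        nxt = TRANSITIONS.get((state, ev))
        if nxt is None:
            alarms.append((i, state, ev))   # behaviour not allowed by the spec
            state = start                   # resynchronise after an alarm
        else:
            state = nxt
    return alarms

normal = ["recv_rreq", "forward_rreq", "recv_rreq", "send_rrep"]
attack = ["recv_rreq", "drop_rreq"]         # dropping instead of forwarding
```

Because only legitimate behaviour is specified, previously unseen attacks are caught as soon as they deviate from the specification.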
Information Dissemination Strategy for Energy and Delay Optimization in Wireless Sensor Networks
SONG Yu-rong,JIANG Guo-ping
Computer Science. 2010, 37 (10): 123-126. 
Abstract PDF(427KB) ( 316 )   
This paper addressed the information dissemination issue in wireless sensor networks. Based on the information propagation model of complex dynamic networks, an energy and delay optimized information dissemination strategy (EDOIDS) was proposed for wireless sensor networks, which needs neither location information nor topology information. Using the dynamic characteristic that propagation spreads outward from the information source over time, together with the received signal strength indication (RSSI), the algorithm estimates the relative distances between senders and receivers and decides the forwarding priorities of nodes and the MAC time delay. The algorithm ensures maximum coverage of new area, effectively avoids node collisions, and optimizes the delay of the whole network. Meanwhile, the MAC listening mechanism is combined with the rumor-spreading idea to control the forwarding priorities of nodes, restrain redundant flooding packets, and minimize energy consumption.
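The RSSI-to-priority step can be sketched as follows (hedged: the log-distance path-loss constants and the linear delay rule are invented, not the paper's parameters): a weaker received signal implies a more distant receiver, which is given a shorter rebroadcast delay so that it forwards first and covers the most new area.

```python
def distance_from_rssi(rssi_dbm, tx_dbm=0.0, n=2.0, d0=1.0, pl0=40.0):
    """Invert the log-distance path-loss model PL = pl0 + 10*n*log10(d/d0)."""
    path_loss = tx_dbm - rssi_dbm
    return d0 * 10 ** ((path_loss - pl0) / (10.0 * n))

def forwarding_delay(rssi_dbm, max_range=100.0, max_delay_ms=50.0):
    """Assign shorter MAC delay to farther (weaker-signal) receivers."""
    d = min(distance_from_rssi(rssi_dbm), max_range)
    return max_delay_ms * (1.0 - d / max_range)   # far nodes wait less

near = forwarding_delay(-50.0)   # strong signal -> close receiver, long wait
far = forwarding_delay(-75.0)    # weak signal -> distant receiver, short wait
```

While a far node waits out its (short) delay, overheard rebroadcasts can still cancel its own transmission, which is where the listening mechanism suppresses redundant flooding.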
Construction and Research of Network Security Qualification Evaluation System
WANG Yuan,QI Shan-ming,YANG Huai
Computer Science. 2010, 37 (10): 127-129. 
Abstract PDF(309KB) ( 424 )   
Quantified analysis of the overall network security situation is now an important means of supporting network security alarming and prevention. Through analysis of the network security quantification process, this paper proposed the functional structure of a network security quantification system, analyzed the related quantification technology, and gave the related function implementations.
Software Development Cost Estimates Based on Fuzzy Theory
REN Yong-chang,XING Tao,LIU Da-cheng
Computer Science. 2010, 37 (10): 130-134. 
Abstract PDF(519KB) ( 353 )   
Aiming at the impact of uncertainty on cost estimates, we presented the use of fuzzy theory to estimate software development cost. By describing the principle of the fuzzy estimation formula and its mathematical derivation, we created a mathematical model that uses fuzzy theory to estimate software development cost, gave methods for determining the membership function and computing the similarity measure, discussed the relationship between database scale and software scale, analyzed the impact factors of database scale, designed a mathematical model for database scale estimation, and set out the steps of software development cost estimation using fuzzy theory. Using the fuzzy theory method to estimate software cost can reduce the time cost of estimation as well as the processing complexity and inaccuracy caused by the variety of uncertain factors.
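A hedged sketch of fuzzy similarity-based estimation (the triangular membership function, the feature encoding, and the historical projects below are all invented, not the paper's model): a new project is scored against historical projects by fuzzy closeness, and their known costs are weighted accordingly.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def similarity(p, q):
    """Mean per-feature fuzzy closeness of two projects (invented measure)."""
    return sum(triangular(pi - qi, -1.0, 0.0, 1.0) for pi, qi in zip(p, q)) / len(p)

def estimate_cost(new_proj, history):
    """Similarity-weighted average of historical costs."""
    weights = [(similarity(new_proj, feats), cost) for feats, cost in history]
    total = sum(w for w, _ in weights)
    return sum(w * c for w, c in weights) / total

# features: (normalised size, complexity); costs in person-months (invented)
history = [((0.2, 0.3), 10.0), ((0.8, 0.9), 40.0)]
cost = estimate_cost((0.25, 0.35), history)
```

The estimate falls between the historical costs, pulled toward the project the new one most resembles.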
GridsimHelper:A Code Generator for GridSim Platform
DENG Rong,CHEN Hong-zhong,LI Can,WANG Xiao-ming,LI Jie,ZHANG Jun-qi
Computer Science. 2010, 37 (10): 135-137. 
Abstract PDF(271KB) ( 337 )   
The grid modeling and simulation toolkit GridSim provides many basic tools to simulate essential components of distributed systems. However, the simulation process is not simple: besides writing Java code that uses the GridSim toolkit packages, new learners must go through a long learning curve to use the toolkit's functionalities for effective simulations. To this end, we designed and implemented a GUI-based code generator for GridSim, called GridsimHelper, with the aim of shortening the learning curve and enabling fast creation of simulation models. The code generator lets users input and evaluate the performance of any scheduling algorithm, as well as the baseline algorithms min-min and max-min (provided by GridsimHelper) for comparison. This paper gave the implementation details of GridsimHelper, and the experimental results illustrate its correctness and usefulness.
Effective Term-Concept Mapping Method Based on Ontology
LI Wen,CHEN Ye-wang,PENG Xin,ZHAO Wen-yun
Computer Science. 2010, 37 (10): 138-142. 
Abstract PDF(433KB) ( 433 )   
Term-concept mapping, as one of the key steps of ontology-based semantic search, has a great impact on precision and recall. Traditional methods based on keyword matching introduce term-concept co-occurrence to compute the association, but they do not consider the attributes of concepts or their values, losing semantic information. To handle this problem, we proposed a term-concept mapping method based on triple-document annotation, which takes into account both the concept-document and term-document relationships, first calculates the correlation and confidence degree of each term-concept pair, and then gives the final mapping results. The experiments show that this method improves precision and recall effectively.
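The contrast with plain co-occurrence can be sketched as follows (hedged: the documents, concepts, attribute values, and the 50/50 weighting are invented toy stand-ins, not the paper's annotation scheme): combining co-occurrence with matches on concept attribute values lets an ambiguous term be mapped to the right concept even when co-occurrence counts are weak.

```python
def cooccurrence(term, concept_label, documents):
    """Fraction of documents mentioning both the term and the concept label."""
    hits = sum(1 for d in documents if term in d and concept_label in d)
    return hits / len(documents)

def attribute_score(term, attributes, documents):
    """Fraction of the concept's attribute values found in the term's documents."""
    term_docs = [d for d in documents if term in d]
    values = list(attributes.values())
    if not term_docs or not values:
        return 0.0
    hits = sum(1 for v in values if any(v in d for d in term_docs))
    return hits / len(values)

def map_term(term, concepts, documents, beta=0.5):
    """Score each concept by weighted co-occurrence plus attribute matching."""
    scores = {
        name: beta * cooccurrence(term, name, documents)
              + (1 - beta) * attribute_score(term, attrs, documents)
        for name, attrs in concepts.items()
    }
    return max(scores, key=scores.get), scores

documents = [
    "jaguar hunts in the jungle",
    "the jaguar is a big cat",
    "a car needs an engine",
]
concepts = {
    "cat": {"habitat": "jungle", "diet": "meat"},
    "car": {"part": "engine", "fuel": "petrol"},
}
best, scores = map_term("jaguar", concepts, documents)   # maps to "cat"
```

Here the attribute value "jungle" appearing in the term's documents breaks the ambiguity that label co-occurrence alone could miss.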
Real-time Extension of Component Behavior Protocol and its Compatibility Verification
JIA Yang-li,ZHANG Zhen-ling,LI Zhou-jun
Computer Science. 2010, 37 (10): 143-147. 
Abstract PDF(444KB) ( 333 )   
RelatedCitation | Metrics
Formal specification and compatibility verification of the behavior of complex real-time component systems can effectively improve the systems' correctness and reliability. This paper analyzed the mainstream component models used in academia and industry and the common formal specification methods for component timed behavior. Based on this analysis we extended the component behavior protocol and presented the timed behavior protocol (TBP) to model components' real-time behavior. Common compatibility error types in component composition were analyzed and a compatibility verification algorithm based on TBP was given. The timed behavior protocol is simple and convenient to apply and verify. An application example was introduced.
Ontology-based Web Services Reliability Model
WANG Xi-feng,WANG Guang-zheng,JIN Ling-ling
Computer Science. 2010, 37 (10): 148-151. 
Abstract PDF(420KB) ( 386 )   
RelatedCitation | Metrics
Reliability plays an important role in the selection and combination of Web services. With the popularization of Web services, research on minimizing discovery duration and improving the precision ratio is becoming more important. This paper proposed a new ontology-based approach, called OntoRel, to evaluate Web services reliability. The OntoRel approach analyzes the measurement criteria and relations of reliability attributes, builds the ontology model and measures the reliability of Web services. The approach is characterized by the management and development of the reliability knowledge domain, the evaluation and prediction of Web services reliability, and the automatic selection and combination of Web services.
Models of Web Services Composition Based on Timed Color Petri Nets
WANG Yu-ying,CHEN Ping
Computer Science. 2010, 37 (10): 152-155. 
Abstract PDF(298KB) ( 366 )   
RelatedCitation | Metrics
BPEL is often used to describe the composition of Web services, but it lacks a sound formal semantics, so composed Web services are prone to errors. Based on Timed Color Petri Nets, transformations from Web services described in BPEL to Timed Color Petri Net models were proposed, taking into account the execution modes and environments of BPEL activities. The resulting models are more exact and can be used to verify and test Web services. An instance of this transformation was given.
Study of Reflective Software Architecture and PMB Protocols for Reusing Software at Design Stage
LUO Ju-bo,YING Shi
Computer Science. 2010, 37 (10): 156-160. 
Abstract PDF(449KB) ( 299 )   
RelatedCitation | Metrics
This paper proposed a reflective software architecture supporting the reuse of architectural-level designs, and described the PMB protocols used to realize interaction and interoperation between the meta level and the base level of the reflective software architecture. Finally, it formalized the PMB protocols using the formal specification language Object-Z.
Heuristic and Dynamic Thread Pooling Mechanism Based on Queuing System in Middleware
CHEN Ning-jang,LIN Pan
Computer Science. 2010, 37 (10): 161-164. 
Abstract PDF(471KB) ( 484 )   
RelatedCitation | Metrics
Middleware in the Internet environment, typically the Web Application Server (WAS), needs to provide effective performance guarantee and optimization services for Internet applications. Thread pooling is a common performance optimization method. Considering the features of Internet applications, thread pool management in middleware needs to adjust dynamically on the basis of the context perceived at run time. However, how to find effective influencing factors that give the adjustment results better adaptability remains an open question. The paper first presented a dynamic thread pool model of the application server based on an M/M/1/K/∞/FCFS queuing system; it then studied a mechanism that introduces heuristic factors and rules reflecting the run-time context. The mechanism realizes dynamic adjustment of the thread pool size more effectively, so that the pool size adapts well to resource changes. The experiments verify the effective influence of the heuristic factors on the adjustment of the thread pool size and show that the presented management mechanism can significantly improve system performance.
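As a hedged illustration of the queuing analysis behind such a model (the abstract gives no formulas, and the function name and parameters here are ours, not the paper's), the steady-state blocking probability of an M/M/1/K queue — the chance that an arriving request finds the pool's buffer full — can be computed as:

```python
def mm1k_blocking(lam, mu, K):
    """Blocking probability of an M/M/1/K queue:
    p_K = (1 - rho) * rho**K / (1 - rho**(K+1)), rho = lam/mu.
    lam: arrival rate, mu: service rate, K: system capacity."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)  # limiting case rho -> 1
    return (1 - rho) * rho ** K / (1 - rho ** (K + 1))
```

A heuristic controller in the spirit of the paper could then grow the pool (or its capacity K) whenever this probability exceeds a target and shrink it when utilization falls.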
Hierarchical Classification Approach of Hierarchical Feature Selection and Error Control
WU Bi-jun,LI Juan-zi,JIN Xin
Computer Science. 2010, 37 (10): 165-168. 
Abstract PDF(446KB) ( 383 )   
RelatedCitation | Metrics
There are thousands of subjects in the Chinese news subject specification. When they are used in news classification, long training time and large model size are two key problems, especially when some of the classes change. Chinese news subject classification has a hierarchical structure, and hierarchical classification can partially solve these problems. We improved Chinese news hierarchical classification from two points of view. 1) Repetitious feature calculation represents news differently at different layers of the hierarchical classification. 2) Error control addresses the problem that one misclassification at an upper layer leads to misclassification among its descendant classes. Our experiments show that hierarchical classification improves precision by 4% compared with flat classification, hierarchical classification with repetitious feature calculation improves it by a further 3%, and hierarchical classification with error control improves it by another 3%.
Algorithm Based on Sliding Window for Similarity Queries over Data Stream
WANG Kao-jie,ZHENG Xue-feng,SONG Yi-ding
Computer Science. 2010, 37 (10): 169-172. 
Abstract PDF(384KB) ( 319 )   
RelatedCitation | Metrics
Similarity queries are a fundamental part of modern data mining applications, but traditional query algorithms cannot be applied to a data stream, which is an unbounded sequence of data elements generated at a rapid rate. We proposed a novel approach for computing similarity over multiple data streams based on a wavelet sliding window model. The basic idea is to divide the sliding window into equally sized basic windows, represent the data elements of each basic window by wavelet coefficients, and thus form a wavelet synopsis window. As a result, queries over the data streams can be converted into queries over these wavelet synopses. The algorithm exploits the linearity of the wavelet decomposition and achieves superior runtime performance. Extensive experiments verified the effectiveness of the algorithm.
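The basic-window/synopsis idea can be sketched as follows. This is an illustrative reconstruction with orthonormal Haar wavelets, not the paper's code; it assumes basic windows whose length is a power of two, and because the transform is orthonormal, Euclidean distances between streams can be computed directly on the (possibly truncated) synopses.

```python
import numpy as np

def haar_coeffs(x):
    """Full orthonormal Haar decomposition of a length-2^k array."""
    x = np.asarray(x, dtype=float)
    out = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2)  # smooth part
        det = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail part
        out.append(det)
        x = avg
    out.append(x)
    return np.concatenate(out[::-1])

def synopsis(window, basic_size, keep):
    """Summarise each basic window by its `keep` largest-magnitude coefficients."""
    syn = []
    for i in range(0, len(window), basic_size):
        c = haar_coeffs(window[i:i + basic_size])
        small = np.argsort(np.abs(c))[:-keep]  # indices of coefficients to drop
        c[small] = 0.0
        syn.append(c)
    return np.concatenate(syn)

def stream_distance(s1, s2):
    """Orthonormality lets Euclidean distance be evaluated on the synopses."""
    return float(np.linalg.norm(s1 - s2))
```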
Consideration about Some Problems in Artificial Intelligence
HU Yang,GUI Wei-hua,CAI Zi-xing,YE Hua-wen
Computer Science. 2010, 37 (10): 173-174. 
Abstract PDF(190KB) ( 646 )   
RelatedCitation | Metrics
Based on an analysis of traditional artificial intelligence, we qualitatively discussed the convergence of intelligence algorithms and the solution of problems by intelligence algorithms. Some hot problems in the field of intelligence science were analyzed against the background of physiological science. The concept of inverse intelligence was proposed, and the future development of artificial intelligence was explored to some degree.
Research of Universal Combination Operation Model
JIA Peng-tao,HE Hua-can
Computer Science. 2010, 37 (10): 175-180. 
Abstract PDF(467KB) ( 354 )   
RelatedCitation | Metrics
Based on fuzzy logic, universal logic analyzes continuous changeability among propositions, puts forward two important concepts, generalized correlativity and generalized self-correlativity, and realizes flexible propositional connective operation models, which are defined as operator clusters controlled by correlativity. The universal combination operation model can satisfy the requirement of integrated decision in continuous-valued logic, but it is hard to apply because only a binary model exists at present; a multimember universal combination operation model is therefore urgently needed, yet difficult to construct because the complexity of the universal combination problem increases sharply with the number of members. This paper proposed a multimember universal combination operation model and a generator-weighted 0-level universal combination operation model. They not only satisfy the requirement of multimember integrated decision, but also perfect the propositional connective theory of universal logics.
Action Space Based Caving Degree Approach for the 3D Rectangular Packing Problem
HE Kun,HUANG Wen-qi,HU Qian
Computer Science. 2010, 37 (10): 181-183. 
Abstract PDF(313KB) ( 462 )   
RelatedCitation | Metrics
This paper solved the three-dimensional rectangular packing problem with a quasi-human approach. By defining the maximal rectangular spaces at the current iteration, called the action space, we improved our caving degree approach so that the computation is greatly sped up while the excellent characteristics of the caving degree are kept; in this way a good solution can be achieved in a shorter time. In the experiments, we tested the improved algorithm on 47 without-orientation-constraint instances from the OR-Library. Computational results show an average space utilization of 95.24%, which improves the best result reported in the literature by 0.32%. In addition, the results also show less running time compared with other algorithms.
Personal Assistant Agents with a Case-based Memory
CHEN Ke-jia BARTHES A.Jean-Paul
Computer Science. 2010, 37 (10): 184-189. 
Abstract PDF(556KB) ( 362 )   
RelatedCitation | Metrics
Personal Assistant (PA) agents are cognitive agents capable of helping users handle tasks at their workplace. The paper concerned the design of a memory mechanism for PA agents, which leads to fairly complex cognitive agents. Inspired by the Case-based Reasoning (CBR) method, a cognitive memory processor was built to monitor and update the memory of a PA. A prototype of the PA agents was developed and implemented on our multi-agent platform named OMAS. Several experiments show the memory makes the PA agent more efficient in handling the tasks requested by the user.
Research on Target Localization Based on Improved Adaptive Velocity Particle Swarm Optimization Algorithm
YAO Jin-jie,HAN Yan
Computer Science. 2010, 37 (10): 190-192. 
Abstract PDF(234KB) ( 700 )   
RelatedCitation | Metrics
An improved adaptive particle swarm optimization algorithm based on velocity adaptation and mutation adaptation was proposed in view of the shortcomings of existing localization algorithms and the standard particle swarm optimizer, namely complex calculation, slow convergence and large computational load. The method selects the particle swarm with the adaptive-velocity particle swarm optimization algorithm and adds an adaptive mutation operation to the iteration process to enhance its ability of quick convergence; the mutation probability is adaptively adjusted by the variance of the population's fitness. The simulation results indicate that the improved adaptive particle swarm optimization algorithm can carry out the localization effectively: when the variance of the random noise interference is 0.5, the localization RMSE is below 1.5 m, with high convergence speed and low computational load.
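The variance-driven mutation probability might look like the following sketch; the normalization used here is our own assumption, not the authors' formula — the idea is only that low fitness variance (a sign of premature convergence) should raise the mutation rate.

```python
import statistics

def mutation_prob(fitnesses, p_low=0.01, p_high=0.3):
    """Adapt the mutation probability to the swarm's fitness variance:
    a nearly uniform population (variance ~ 0) mutates at p_high,
    a well-spread one at close to p_low."""
    var = statistics.pvariance(fitnesses)
    squashed = var / (var + 1.0)  # map [0, inf) into [0, 1)
    return p_high - (p_high - p_low) * squashed
```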
Approach to Recognizing Chinese Metaphorical Phrases Based on Distinction Words
FU Jian-hui,CAO Cun-gen,WANG Shi
Computer Science. 2010, 37 (10): 193-196. 
Abstract PDF(435KB) ( 469 )   
RelatedCitation | Metrics
The research of metaphors is an important branch of natural language processing. People have realized that metaphors take a central position in thought and language. From the view of computational linguistics and natural language processing, language understanding and machine translation would be badly affected if the problem of metaphor were not well handled. Through observing the contexts of metaphorical and non-metaphorical phrases respectively, we found that some words can be used to identify metaphorical phrases effectively among Chinese phrases; we call these words distinction words. In this paper, we extracted a number of distinction words from the Web, and then proposed a distinction-word based method for recognizing metaphorical phrases. Experimental results show the effectiveness of our method.
Discrete Particle Swarm Optimization Algorithm for the Routing of VLSI Circuit
LIU Geng-geng,WANG Xiao-xi,CHEN Guo-long,GUO Wen-zhong,WANG Shao-ling
Computer Science. 2010, 37 (10): 197-201. 
Abstract PDF(407KB) ( 635 )   
RelatedCitation | Metrics
The Rectilinear Steiner Minimal Tree is one of the key problems in the routing of Very Large Scale Integration and a typical NP-complete problem. To solve the rectilinear Steiner minimal tree with rectangular obstacles (RSMTRO) problem effectively, an improved discrete particle swarm optimization (IDPSO) algorithm was proposed. Considering the existence of obstacles, a penalty-based fitness function was designed. The principles of the mutation and crossover operators in genetic algorithms were incorporated into the proposed PSO algorithm to achieve better diversity, and the scope of the particle optimization was appropriately expanded. Simulation results show that the IDPSO algorithm can efficiently provide RSMTRO solutions of good quality and converges more efficiently and rapidly than the genetic algorithm.
Method of Modeling Knowledge for HTN Planning Based on Emergency Plan Template and its Application
TANG Pan,WANG Hong-wei,WANG Zhe
Computer Science. 2010, 37 (10): 202-206. 
Abstract PDF(446KB) ( 612 )   
RelatedCitation | Metrics
Emergency decision-making is the basis and core of emergency management; HTN planning provides an effective means for it and is a hot issue of current research. To address the hard problem of modeling domain knowledge when solving the emergency decision-making problem with HTN planning, and on the foundation of modeling the emergency plan with a Petri-net based workflow model, this paper presented a method of translating the emergency plan template into the HTN planning domain knowledge model. We then modeled the emergency decision-making problem as a mission-planning problem, used the HTN planning system SHOP2 to solve it, and realized making action plans by selecting and organizing emergency plan segments scientifically in the face of complex situations. Finally, we carried out application research on the flood response work of a certain area.
Parameterized Complexity of Probabilistic Inference in Ising Graphical Model
CHEN Ya-rui,LIAO Shi-zhong
Computer Science. 2010, 37 (10): 207-210. 
Abstract PDF(406KB) ( 414 )   
RelatedCitation | Metrics
Probabilistic inference in the Ising graphical model is to compute the partition function and the marginal probability distributions by summing over variables. Traditional computational complexity theory shows that exact probabilistic inference in the Ising graphical model is #P-hard, and approximate probabilistic inference is NP-hard. We analyzed the parameterized complexities of exact probabilistic inference in the Ising graphical model and of Ising mean field approximate inference. First, we proved parameterized complexity theorems for probabilistic inference in the Ising graphical model with different parameters, which show that the parameterized probabilistic inferences are fixed-parameter tractable with the number of variables and with the treewidth of the graphical model as parameters, respectively. Then, we proved parameterized complexity theorems for the Ising mean field, which demonstrate that the parameterized Ising mean field is fixed-parameter tractable with the combination of the free distribution treewidth, the number of iteration steps and the number of variables as parameter; furthermore, when the Ising graphical model parameters satisfy the contraction condition of the Ising mean field iteration formula, the parameterized Ising mean field is fixed-parameter tractable with the combination of the free distribution treewidth and the number of iteration steps as parameter.
Analysis for Travel Route Planning Based on Fusion of Multi-intelligent Algorithm
MA Qing-lu,LIU Wei-ning,SUN Di-hua,DAN Yu-fang
Computer Science. 2010, 37 (10): 211-213. 
Abstract PDF(357KB) ( 327 )   
RelatedCitation | Metrics
In order that the public can plan a travel route before departure based on local traffic conditions, shortening delay time and minimizing energy, this paper improved the evolutionary algorithm for the Traveling Salesman Problem (TSP) based on an artificial neural network, introducing multi-intelligent computation to enhance efficiency. A swarm intelligence analysis model was then proposed to analyze route planning for public travel. The experimental results show that this method is simple and effective to compute. It helps to overcome the blindness of choice and further expands this research direction of computational intelligence, and the results meet the needs of comprehensive travel information services for the public.
Distance Measurement Based Adaptive Particle Swarm Optimization
LI Tai-yong,WU Jiang,ZHU Bo,FANG Bing
Computer Science. 2010, 37 (10): 214-216. 
Abstract PDF(242KB) ( 489 )   
RelatedCitation | Metrics
The inertia weight plays an important role in Particle Swarm Optimization (PSO). The classical PSO uses a fixed inertia weight for all particles in an iteration and ignores the differences among particles. To cope with this issue, a Distance Measurement based Adaptive Particle Swarm Optimization (DMAPSO) was proposed. The Euclidean distance is used to calculate the difference between a particle and the known global best particle, and each particle adaptively tunes its inertia weight according to this difference. Several classical benchmark functions were used to evaluate the strategy. The experimental results show that for continuous optimization problems DMAPSO outperforms the classical PSO: the number of iterations needed to find the best solutions decreases by about 60% on average compared with the classical PSO.
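One plausible reading of this distance-based tuning (our sketch; the exact mapping in the paper may differ) scales the inertia weight linearly with the particle's normalized Euclidean distance to the global best, so distant particles explore and nearby ones exploit:

```python
import numpy as np

def adaptive_inertia(position, gbest, w_min=0.4, w_max=0.9, d_max=1.0):
    """Per-particle inertia weight: w_min at the global best, w_max at
    distance d_max or beyond."""
    d = np.linalg.norm(np.asarray(position) - np.asarray(gbest))
    ratio = min(d / d_max, 1.0) if d_max > 0 else 1.0
    return w_min + (w_max - w_min) * ratio

def velocity_update(pos, vel, pbest, gbest, d_max, c1=2.0, c2=2.0, rng=None):
    """Standard PSO velocity update using the per-particle inertia weight."""
    rng = rng or np.random.default_rng(0)
    pos, vel = np.asarray(pos, float), np.asarray(vel, float)
    w = adaptive_inertia(pos, gbest, d_max=d_max)
    return (w * vel
            + c1 * rng.random(len(pos)) * (np.asarray(pbest) - pos)
            + c2 * rng.random(len(pos)) * (np.asarray(gbest) - pos))
```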
Gene Selection with Tolerance Rough Set Theory from Gene Expression Data
JIAO Na,MIAO Duo-qian
Computer Science. 2010, 37 (10): 217-220. 
Abstract PDF(298KB) ( 321 )   
RelatedCitation | Metrics
Efficient gene selection is a key procedure in the discriminant analysis of microarray data. Rough set theory is an efficient tool for further reducing redundancy, but one limitation is its lack of effective methods for processing real-valued data, whereas gene expression data sets are always continuous and discretization methods can cause information loss. This paper investigated an approach combining feature ranking with feature selection based on tolerance rough set theory. To evaluate the performance of the proposed approach, we applied it to two benchmark gene expression data sets and compared our results with those obtained by a conventional method. Experimental results illustrate that our algorithm is more effective in selecting highly discriminative genes for the cancer classification task.
Relations between Fuzzy Entropy, Similarity Measure and Distance Measure of Vague Sets
WANG Chang
Computer Science. 2010, 37 (10): 221-224. 
Abstract PDF(328KB) ( 387 )   
RelatedCitation | Metrics
Vague set theory has gained great attention from more and more researchers for its applications in various fields; fuzzy entropy, similarity measure and distance measure are among the most important techniques for Vague sets. Several formulas for the fuzzy entropy, similarity measure and distance measure of Vague sets have been proposed, but these studies did not discuss the relations between the measures. Based on the axiomatic definitions of the fuzzy entropy, similarity measure and distance measure of Vague sets, this paper gave the mutual induction relations between these measures, establishing complete relations between fuzzy entropy, similarity measure and distance measure.
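As a concrete example of one such induction — a distance measure inducing a similarity measure — take vague values as (t, f) truth/false membership pairs with t + f ≤ 1. The normalized Hamming construction below is a standard one used for illustration; the specific formulas studied in the paper may differ.

```python
def vague_distance(a, b):
    """Normalised Hamming distance between two vague sets given as
    equal-length lists of (t, f) membership pairs; ranges over [0, 1]."""
    n = len(a)
    return sum(abs(t1 - t2) + abs(f1 - f2)
               for (t1, f1), (t2, f2) in zip(a, b)) / (2.0 * n)

def vague_similarity(a, b):
    """Any distance measure D induces a similarity measure S = 1 - D."""
    return 1.0 - vague_distance(a, b)
```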
Research on Uncertainty Measurement for Covering Rough-Vague Sets
XU Jiu-cheng,ZHANG Qian-qian
Computer Science. 2010, 37 (10): 225-227. 
Abstract PDF(342KB) ( 350 )   
RelatedCitation | Metrics
In research on covering rough set theory, a cover-based Rough-Vague set model and its properties were studied by integrating rough set theory and vague set theory. In order to measure this model's uncertainty well, we first measured the uncertainty of the classification capability of a cover by defining a knowledge capacity measure, which reflects the essential characteristic of knowledge classification. We then defined G-roughness to measure the model's uncertainty effectively by combining knowledge capacity and roughness (the measure of the boundary region). The conciseness and validity of this measurement were demonstrated by an example.
Quality Assessment for Halftone Image Based on Non-ideal Printer Model
XU Guo-liang,TAN Qing-ping
Computer Science. 2010, 37 (10): 228-232. 
Abstract PDF(454KB) ( 449 )   
RelatedCitation | Metrics
Pages to be printed need to be converted into halftone images in the offset printing procedure, and modeling this procedure is the basis of evaluating the quality of halftone images for offset printing. Previous research assumed the ideal printer model, i.e. every pixel in the halftone image corresponds to printing or leaving blank a small rectangle on the printing medium. The ideal printer model ignores the phenomena of dot gain, missing isolated pixels and clustered dots below a certain limit. This research establishes a non-ideal printer model for the offset printing procedure and incorporates it into a halftone image quality assessment algorithm.
Research on Degree Reduction of Parameters Curves and Surfaces
SHI Mao,KANG Bao-sheng,YE Zheng-lin,BAI Hong-wu
Computer Science. 2010, 37 (10): 233-238. 
Abstract PDF(572KB) ( 526 )   
RelatedCitation | Metrics
The degree reduction of parametric curves and surfaces is one of the hottest topics in computer aided geometric design. This paper gave an overview of the methods and problems of degree reduction for curves and surfaces in recent years. We mainly covered the degree reduction of Bézier curves and surfaces; other curves and surfaces such as B-splines were also mentioned.
Method of Catching NPR Hue in Double Light Source Condition
WANG Xiang-hai,QIN Xia-bin,XIN Ling
Computer Science. 2010, 37 (10): 239-241. 
Abstract PDF(342KB) ( 358 )   
RelatedCitation | Metrics
In NPR, reasonable and coordinated light and shade directly affect the final rendered image. This paper discussed methods of capturing the tone of objects in a non-photorealistic rendering scene under a double light source, and presented a method that obtains the hue information of the scene and objects based on small balls. The method supposes that the surfaces of the objects in the scene are uniformly covered with a layer of equally sized balls, with adjacent balls approximately tangent; by computing how much each ball is sheltered by other balls under the double light source, together with the approximate mirror reflection information, the hue information at the ball's position is determined. This hue information can be used to control the brushstroke hue at the corresponding position during painting, thereby ensuring the rationality of the overall hue of the final image. The method can be applied to a variety of double-light-source non-photorealistic renderings for obtaining light and shade information; it has a simple structure and fast computation, and it facilitates adjusting local information in the scene. Simulation results show it is effective.
Image Denoise and Enhancement Based on Structure Self-similarity in Wavelet Domain
JIAO Feng,BI Shuo-ben,ZHAO Ying-nan,GENG Huan-tong
Computer Science. 2010, 37 (10): 242-245. 
Abstract PDF(372KB) ( 444 )   
RelatedCitation | Metrics
Image Spare Decomposition Algorithm Based on MP and 1DFFT
LI Xiao-yan,YIN Zhong-ke
Computer Science. 2010, 37 (10): 246-247. 
Abstract PDF(269KB) ( 404 )   
RelatedCitation | Metrics
There are two main problems in image sparse decomposition: it is time-consuming, and the visual quality of the reconstructed image is not very good. This paper introduced a new image sparse decomposition algorithm based on MP and the 1D FFT. The two-dimensional image to be decomposed is converted into a one-dimensional signal, and likewise the atoms of the over-complete dictionary are converted into one-dimensional atoms. The inner product between the image (or residual image) and an atom is then transformed into a cross-correlation of one-dimensional signals, and the FFT realizes this cross-correlation between the one-dimensional signal and the one-dimensional atom. Experimental results show that, for a 512×512 image, the proposed algorithm is a little more than 2.11 times faster than the two-dimensional FFT method without any loss of reconstructed image quality.
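The core speed-up — computing all shifted inner products at once via the FFT correlation theorem — can be sketched as follows. This is an illustrative reconstruction (circular correlation, function names ours), not the authors' code.

```python
import numpy as np

def xcorr_fft(signal, atom):
    """All inner products of `atom` against circular shifts of `signal`,
    computed at once: ifft(fft(s) * conj(fft(a))) gives
    c[k] = sum_i s[(k + i) mod n] * a[i]."""
    n = len(signal)
    a = np.zeros(n)
    a[:len(atom)] = atom          # zero-pad the atom to the signal length
    return np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(a))).real

def best_shift(signal, atom):
    """Shift at which the atom correlates most strongly with the signal —
    the quantity an MP step maximizes over a shift-invariant dictionary."""
    c = xcorr_fft(signal, atom)
    k = int(np.argmax(np.abs(c)))
    return k, c[k]
```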
Analysis of Facet-braiding Based on 3D Reconstruction of Integral Imaging
WANG Hong-xia
Computer Science. 2010, 37 (10): 248-250. 
Abstract PDF(252KB) ( 395 )   
RelatedCitation | Metrics
Integral Imaging (II) is a technique capable of displaying three-dimensional images with continuous parallax in full natural color. Since a micro-lens sheet is used in recording, only one recording is needed to contain the three-dimensional information. The facet braiding effect is an important visual phenomenon that induces distortions and impoverishes the quality of the integral image. In Ref. [6] Martinez-Cuenca analyzed the facet braiding phenomenon in view of a single elemental image; in contrast, this paper verified the phenomenon from the 3D reconstruction. First the traditional II system was modeled with optical software; then a contrast experiment was conducted in the depth-priority regime, where the distance between the reference image plane and the reconstructed image is infinite. The facet braiding effect was not present in our modeled optical system. The result will play an important role in viewing-angle analysis, precise 3D reconstruction and spatial resolution analysis.
Image Threshold Segmentation Based on Improved Two-dimensional Renyi Entropy
HUANG Jin-jie,GUO Lu-qiang,LU Ren-hu,DING Yan-jun
Computer Science. 2010, 37 (10): 251-253. 
Abstract PDF(246KB) ( 636 )   
RelatedCitation | Metrics
This paper presented a new image threshold segmentation method based on an improved two-dimensional Renyi entropy. First, a two-dimensional histogram is built from the gray value and gradient value of the pixels; on this basis the two-dimensional Renyi entropies of the target and background regions are calculated; finally, the segmentation threshold is obtained by maximizing the Renyi entropy function. Because the pixel gradient information is combined with an adjustable parameter of the Renyi entropy, the method can handle more types of images and obtain a more accurate shape of the image edges.
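A one-dimensional simplification of the criterion can illustrate the idea: pick the threshold that maximizes the sum of the Renyi entropies of the two classes. The paper itself works on a two-dimensional gray/gradient histogram; the function and parameter names below are ours.

```python
import numpy as np

def renyi_threshold(hist, alpha=0.7):
    """1-D sketch of Renyi-entropy thresholding: split the (gray-level)
    histogram at the t that maximizes H_alpha(background) + H_alpha(object),
    where H_alpha(p) = ln(sum p_i^alpha) / (1 - alpha) within each class."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue  # skip degenerate splits
        h0 = np.log(np.sum((p[:t] / w0) ** alpha)) / (1 - alpha)
        h1 = np.log(np.sum((p[t:] / w1) ** alpha)) / (1 - alpha)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```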
Moving Objects Detection of Adaptive Gaussian Mixture Models on HSV
LIN Qing,XU Zhu,WANG Shi-tong,ZHAN Yong-zhao
Computer Science. 2010, 37 (10): 254-256. 
Abstract PDF(346KB) ( 523 )   
RelatedCitation | Metrics
In current computer vision applications, extracting moving objects from video sequences is a hot research topic, and traditional methods cannot detect moving objects well in complex environments. According to the characteristics of the HSV color space, an adaptive mixture-of-Gaussians background model based on HSV was proposed, along with a way to remove shadows. Firstly, to improve the efficiency of modeling, an adaptive strategy for selecting the number of components of the Gaussian mixture model was proposed. Secondly, according to the shadow characteristics in the HSV space, a new method to remove object shadows was proposed. The approach can also build the model quickly and remove shadows accurately under large illumination changes. Compared with traditional shadow suppression methods, this method can suppress the shadows of moving objects without setting a threshold value.
FCM Image Segmentation Based on the Spatial Restrained Fuzzy Membership
PENG Dai-qiang,LI Jia-qiang,LIN You-quan
Computer Science. 2010, 37 (10): 257-259. 
Abstract PDF(254KB) ( 496 )   
RelatedCitation | Metrics
The conventional FCM algorithm is sensitive to noise. To overcome this defect, we proposed a novel regularized fuzzy c-means algorithm for image segmentation based on spatially constrained fuzzy membership. The approach introduces the relation of spatially constrained fuzzy membership into a modified FCM objective function. In the new objective function, the membership generated by the proposed algorithm is a product of two terms: the first is the standard FCM membership responsible for data partitioning, and the second is a robust constraint from the neighborhood membership. Due to the introduction of spatial information and the neighborhood membership relation during clustering, the algorithm has good performance in resisting noise.
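The two-term membership product can be illustrated with a minimal sketch: multiply each pixel's FCM membership by the mean membership of its 3×3 neighborhood, then renormalize over clusters. The wrap-around borders (np.roll) and the plain neighborhood mean are our simplifying assumptions, not the authors' exact regularization.

```python
import numpy as np

def spatial_membership(u):
    """u: (C, H, W) fuzzy memberships (one plane per cluster).
    Returns memberships reweighted by the 3x3 neighbourhood mean and
    renormalised so they again sum to 1 over the clusters at each pixel."""
    h = np.zeros_like(u)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            h += np.roll(np.roll(u, dy, axis=1), dx, axis=2)
    h /= 9.0                       # neighbourhood mean membership
    v = u * h                      # product of FCM term and spatial term
    return v / v.sum(axis=0, keepdims=True)
```

A noisy pixel whose neighbors all favor another cluster is thereby pulled toward that cluster, which is exactly the noise-resistance effect the objective function encodes.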
Uniform Motion Burred Image Restoration Algorithm Based on the Z-transform and Fuzzy Weighted Mean Filter
LI Ming-he,HE Bin,YUE Ji-guang,LU Han-xiong,LI Yong-gang
Computer Science. 2010, 37 (10): 260-262. 
Abstract PDF(377KB) ( 353 )   
RelatedCitation | Metrics
Aiming at the problem of uniformly motion-blurred images, suppose the object moves along the X axis. A restoration and degradation model was established based on the Z-transform by strict mathematical deduction, and the difference equation was changed into an algebraic equation to simplify the solving process. Because noise is easily introduced and amplified during image restoration, a fuzzy weighted mean filter was introduced into the restoration algorithm, and the algorithm implementation was given concretely. Simulation results show that the proposed method can restore uniformly motion-blurred images correctly, effectively and rapidly. The algorithm is not sensitive to the degree of blur and has a certain stability and superiority compared with other algorithms.
Color Image Representation Method Using NAM Based on Gray Code
ZHENG Yun-ping
Computer Science. 2010, 37 (10): 263-266. 
Abstract PDF(447KB) ( 390 )   
RelatedCitation | Metrics
An important theorem was proposed which proves that the total complexity of all bit-plane images under the gray code is less than that under the binary code. A new color image representation method using the Non-symmetry and Anti-packing pattern representation Model (NAM) based on the gray code, called the GNAM representation method, was proposed by applying the gray code to the NAM-based color image representation. A concrete GNAM algorithm for color images was presented, and the storage structure, total data amount, and time and space complexities of the proposed algorithm were analyzed in detail. Comparing the GNAM algorithm with those of the classic linear quadtree (LQT) and the latest NAM without the gray code, the theoretical and experimental results show that the former can greatly reduce the number of subpatterns or nodes and save storage space much more effectively than the latter; it is therefore a better method for representing color images.
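The intuition behind the theorem is that gray-coding makes consecutive intensities differ in exactly one bit, so the bit planes contain fewer 0/1 transitions and pack into fewer NAM subpatterns. A small self-contained check (our illustration; the transition count is only a rough proxy for the paper's complexity measure):

```python
def binary_to_gray(n):
    """Gray code of n: consecutive integers differ in exactly one bit."""
    return n ^ (n >> 1)

def plane_transitions(values, bits=8, gray=True):
    """Total 0/1 transitions summed over all bit planes of a pixel sequence."""
    codes = [binary_to_gray(v) if gray else v for v in values]
    total = 0
    for b in range(bits):
        plane = [(c >> b) & 1 for c in codes]
        total += sum(p != q for p, q in zip(plane, plane[1:]))
    return total
```

On a 0..255 intensity ramp the gray-coded planes show 255 transitions in total (one per step), against 502 for plain binary.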
Unsupervised SAR Image Segmentation Based on Multi-features
WANG Qing-xiang,LI Di,ZHANG Wu-jie
Computer Science. 2010, 37 (10): 267-270. 
Abstract PDF(363KB) ( 348 )   
For synthetic aperture radar (SAR) images, characterized by complex texture, a large brightness range and vague bridge boundaries, a method of unsupervised SAR image segmentation based on multiple features was presented. First, features comprising the local moments and the statistics (contrast, correlation, entropy, homogeneity) of the gray-level co-occurrence matrix were extracted. Second, dimensionality reduction by principal component analysis (PCA) was applied to these features to obtain 2-dimensional features carrying adequate category information. Finally, pixels with 2-D feature information were automatically clustered by the Mean Shift method. As Mean Shift clustering does not require the number of clusters to be specified, the processing is an unsupervised, automatic segmentation. Composite images with Brodatz textures and SAR images were tested in segmentation experiments, and the results demonstrate that the method achieves more accurate segmentation than two other methods that employ only the gray-level co-occurrence matrix or only the moments.
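The final clustering step can be illustrated with a minimal flat-kernel Mean Shift in one dimension (a sketch under assumed parameter names, not the authors' implementation). Note that the number of clusters emerges from the bandwidth rather than being specified in advance:

```python
# Illustrative flat-kernel Mean Shift in 1-D: every point climbs to the mean
# of its neighbours; points that settle on the same mode form one cluster.

def mean_shift_1d(points, bandwidth=1.0, iters=50):
    """Return (labels, centers) found by mode seeking; no cluster count given."""
    modes = list(points)
    for _ in range(iters):
        modes = [
            sum(p for p in points if abs(p - m) <= bandwidth) /
            max(1, sum(1 for p in points if abs(p - m) <= bandwidth))
            for m in modes
        ]
    # Group converged modes that are closer than the bandwidth.
    labels, centers = [], []
    for m in modes:
        for i, c in enumerate(centers):
            if abs(m - c) <= bandwidth:
                labels.append(i)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels, centers
```

In the paper's setting each "point" would be a pixel's 2-D PCA feature vector rather than a scalar, but the mode-seeking mechanism is the same.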
Estimating the Number of Components of Mixture Models for Medical Image
XIE Cong-hua,SONG Yu-qing,CHEN Jian-mei,CHANG Jin-yi
Computer Science. 2010, 37 (10): 271-274. 
Abstract PDF(295KB) ( 370 )   
Estimating the number of components of mixture models is a key part of clustering analysis and density estimation for medical images. To overcome the over-fitting problem of information-criterion methods, we proposed a new estimation method based on a feature function of Gaussian mixture models (GMMs). First, the feature function of the medical image was defined on the GMMs. Second, a new criterion was constructed with the feature function to estimate the number of components of the mixture model. Finally, we proposed an algorithm to compute the new criterion. The new criterion uses a parameter to adjust the value of the log-feature function and to balance the effect of the penalty function. Experiments on simulated data and real CT images show that our criterion determines a more reasonable number of components K than other information-based model selection criteria and avoids over-fitting on medical images.
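The shape of such a penalized criterion can be sketched generically (an illustration only; the paper's feature function and balance parameter are not reproduced here): given a log-score per candidate K, the chosen K maximizes the score minus a weighted penalty, which is what counteracts over-fitting.

```python
# Generic penalized model selection sketch. `log_scores` stands in for any
# per-K log-score (e.g. a log-likelihood or the paper's log-feature function);
# `lam` plays the role of a balance parameter between fit and penalty.

def select_num_components(log_scores, penalty_per_component, lam=1.0):
    """Return the K maximising score(K) - lam * penalty_per_component * K."""
    best_k, best_val = None, float("-inf")
    for k, s in log_scores.items():
        val = s - lam * penalty_per_component * k
        if val > best_val:
            best_k, best_val = k, val
    return best_k
```

With a monotonically improving raw score, the unpenalized choice would always be the largest K; the penalty term stops that growth at the point where extra components no longer pay for themselves.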
Algorithm of Texture Synthesis Based on Chaos Particle Swarm Optimization
QU Zhong,LI Nan
Computer Science. 2010, 37 (10): 275-278. 
Abstract PDF(333KB) ( 308 )   
Because its search space becomes limited in later iterations, the particle swarm optimization algorithm easily falls into local minima and reaches a premature state. To address this, a new patch-based texture synthesis method based on chaos particle swarm optimization was proposed, in which a chaos optimization search technique is incorporated into particle swarm optimization. The experimental results show that, compared with standard particle swarm optimization, chaos particle swarm optimization has better optimization performance, overcomes this disadvantage of particle swarm optimization, and yields texture synthesis images of higher quality.
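A minimal sketch of the idea, under assumed coefficients and using a logistic map as the chaotic sequence (the paper's exact update rule may differ): chaotic values replace the uniform random factors in the velocity update, which helps the swarm keep exploring in late iterations instead of collapsing prematurely.

```python
# Chaos PSO sketch on a 1-D objective. The cognitive/social random factors
# r1, r2 are drawn from the logistic chaotic map x_{k+1} = 4 x_k (1 - x_k).
import random

def chaos_pso(f, lo, hi, n_particles=20, iters=300, seed=0.37):
    """Minimise f on [lo, hi] with PSO driven by a chaotic sequence."""
    rng = random.Random(1)            # fixed seed: deterministic initial swarm
    chaos = seed
    def next_chaos():
        nonlocal chaos
        chaos = 4.0 * chaos * (1.0 - chaos)
        return chaos
    xs = [lo + (hi - lo) * rng.random() for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = list(xs)
    gbest = min(xs, key=f)
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = next_chaos(), next_chaos()
            vs[i] = (0.7 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest
```

In the texture-synthesis setting, `f` would score a candidate patch placement; here a simple quadratic stands in for it.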
WAN Intelligent Storage System for Next Generation Internet
LI Jie-qiong,FENG Dan
Computer Science. 2010, 37 (10): 279-282. 
Abstract PDF(369KB) ( 390 )   
Aimed at the rapid data growth and the difficulties of network resource management and use in the next generation Internet, the WAN intelligent storage system (WISS) adopts a multi-level, scalable distributed storage mode to improve the performance of the network storage system through system architecture optimization. Its storage management follows the SMI-S specification. An effective load balancing strategy and a high-speed secure storage middleware solution were proposed for metadata management and data transmission in complex network environments. WISS both improves the data transfer rate and reduces management cost. It achieves the goal of separating control flow from data flow and accelerating data transfer while expanding storage capacity. As a result, WISS enormously enhances the performance of the whole storage system.
Hybrid Real-time Scheduling Algorithm Based on Partial Reconfigurable FPGA
YIN Jin-yong,GU Guo-chang,WU Yan-xia
Computer Science. 2010, 37 (10): 283-286. 
Abstract PDF(313KB) ( 482 )   
A real-time task running on a CPU/FPGA platform is usually composed of software/hardware subtasks with precedence constraints. A scheduling algorithm was proposed for such software/hardware hybrid real-time tasks. A sufficient schedulability condition for real-time tasks was derived by analyzing what happens when the first deadline is missed. The hardware subtasks of each task were partitioned into several groups, and subtasks within the same group were configured on the FPGA in an overlapped manner. Hardware subtasks can be connected to the system bus dynamically by manually placing and routing the ports of the hardware subtasks and the system bus. The experimental results demonstrate that the scheduling algorithm can meet real-time tasks' deadlines and make full use of the FPGA.
Variable Voltage Tabu Task Scheduling Algorithm for Optimizing Energy Consumption
KANG Yan
Computer Science. 2010, 37 (10): 287-290. 
Abstract PDF(325KB) ( 370 )   
Energy consumption is a critical issue in heterogeneous parallel and distributed systems. Dynamic voltage scaling (DVS) is a powerful technique that achieves energy savings by slowing the processor down through multiple frequency levels, and DVS algorithms typically consist of the assignment of tasks and the allocation of slack. While most research focuses on the allocation phase, we considered the relation between the slack time and the assignment of tasks, and presented a Tabu-based scheduling algorithm for energy minimization on heterogeneous distributed systems. The algorithm finds an initial schedule for the tasks, represented by a directed acyclic graph (DAG), and improves it using a Tabu search strategy. The total energy consumption of the system is further reduced by using the Tabu strategy to forbid assigning tasks to specific processors, which creates more idle time slices for the slack allocation phase. Simulation results indicate that our algorithm achieves substantial energy savings.
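The slack-allocation side of DVS can be illustrated with a toy model. Our assumption here, not necessarily the paper's model, is the common simplification that dynamic energy is proportional to f² per cycle: running at the lowest discrete frequency that still meets the deadline converts idle slack into energy savings.

```python
# Toy DVS slack allocation: pick the lowest available frequency that still
# meets the deadline. Energy model (assumed): E = power_coeff * f^2 * cycles.

def dvs_energy(workload_cycles, deadline, f_levels, power_coeff=1.0):
    """Return energy at the slowest deadline-meeting frequency, or None."""
    feasible = [f for f in f_levels if workload_cycles / f <= deadline]
    if not feasible:
        return None                    # no frequency meets the deadline
    f = min(feasible)                  # exploit all slack: slowest feasible
    return power_coeff * f ** 2 * workload_cycles
```

With normalized frequencies, a unit workload and a deadline of 2 time units, halving the frequency fills the slack and cuts energy to a quarter of the full-speed value; the Tabu assignment phase in the paper aims to create exactly this kind of exploitable slack.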
On Deadlock Prevention of a Subclass of Petri Nets--S4R
ZHU Sen
Computer Science. 2010, 37 (10): 291-294. 
Abstract PDF(316KB) ( 331 )   
As a special subclass of Petri nets, S4R can model resource allocation systems with multiple processes that are more complex than those modeled by S3PR. This paper proposed a deadlock prevention policy for S4R. First, the liveness of a flexible manufacturing system modeled by S4R is checked using mixed integer programming (MIP). Then, if the system is not live, a supervisor is designed for it based on a new concept of siphon control. The liveness of the controlled system is again checked using MIP; if the controlled system is not live, it is controlled further. This policy avoids some unnecessary constraints, so a more permissive liveness-enforcing supervisor can usually be obtained.
Static Decentralized Output Feedback Control for Bilinear System
GUO Gang,NIU Wen-sheng,CUI Xi-ning
Computer Science. 2010, 37 (10): 295-296. 
Abstract PDF(175KB) ( 330 )   
This paper presented the problem of decentralized static output feedback control for a nonlinear interconnected system composed of a number of Takagi-Sugeno (T-S) fuzzy bilinear subsystems with interconnections. Based on decentralized control theory, some sufficient stabilization conditions were derived for the whole closed-loop fuzzy interconnected system. The stabilization conditions were further formulated as linear matrix inequalities (LMIs), so that the corresponding decentralized controllers can be easily obtained using the Matlab LMI toolbox. Finally, a simulation example shows that the approach is effective.
New Improved Pseudo-boolean Satisfiability Algorithm for FPGA Routing
TANG Yu-lan,LIU Zhan,YU Zong-guang,CHEN Jian-hui
Computer Science. 2010, 37 (10): 297-301. 
Abstract PDF(393KB) ( 432 )   
In order to avoid the negative effect of the increasing transformation cost of the pseudo-Boolean Satisfiability algorithm in the routing process, a new routing algorithm was proposed for FPGA which combines the advantages of pseudo-Boolean Satisfiability and geometric routing algorithms. In the routing process, PathFinder, a geometric routing algorithm, is chosen first for FPGA routing; if it is not successful, the pseudo-Boolean Satisfiability algorithm is used. Moreover, a static symmetry-breaking technique was added to preprocess the pseudo-Boolean constraints, detecting and breaking symmetries in the routing flow. The purpose is to prune the search path, and the cost is consequently reduced. Preliminary experimental results show that the hybrid approach can reduce the runtime observably, speed up the solving process, and has no adverse effect on the overall program.
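Routing as pseudo-Boolean satisfiability can be sketched on a toy instance (an illustration, not the paper's encoding): each net's 0/1 track variables must sum to exactly one, and each track's load must respect its capacity. Here a brute-force enumeration stands in for the PB solver.

```python
# Toy pseudo-Boolean routing: every net picks exactly one of its candidate
# tracks (an exactly-one PB constraint per net) and each track's occupancy
# must stay within capacity (an at-most-capacity PB constraint per track).
from itertools import product

def pb_route(nets, capacity=1):
    """Return a net -> track assignment satisfying all constraints, or None."""
    names = list(nets)
    for choice in product(*(nets[n] for n in names)):
        load = {}
        for track in choice:
            load[track] = load.get(track, 0) + 1
        if all(c <= capacity for c in load.values()):
            return dict(zip(names, choice))
    return None
```

Two nets whose candidate lists are symmetric would double the search space for no benefit; the static symmetry breaking described above adds constraints that keep only one of each such symmetric pair of solutions.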