Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 38 Issue 9, 16 November 2018
Web Service Composition Systems Survey
WU Yun-peng, BAO Wei-dong, ZHANG Wei-ming, HUANG Jin-cai
Computer Science. 2011, 38 (9): 1-4. 
Web service composition has become an emerging and promising technology for designing and building complex applications out of single Web-based software components. Currently, there already are several such systems based on different technologies. A question that accompanies these systems is which one should be chosen as a composition tool. This paper gave a functional model of a general Web service composition system, based on which twelve existing systems were analyzed. According to the functions the systems provide, it divided them into different categories and gave an outlook on essential future research work.
Survey of the MAC Protocols on Underwater Acoustic Sensor Network
ZHOU Mi, CUI Yong, XU Xing-fu, YANG Xu-ning
Computer Science. 2011, 38 (9): 5-10. 
Nowadays the ocean plays an increasingly important role in the development of human society. As a basic means of learning about the sea, underwater communication technology has become a research hotspot. Because underwater acoustic channels differ from terrestrial wireless channels in characteristics and performance requirements, conventional wireless MAC (medium access control) protocols do not apply to them, and MAC protocols for various underwater acoustic applications have been continuously proposed. After outlining the characteristics of underwater acoustic sensor networks and the design criteria for MAC, current protocols were divided into contention-based and scheduling-based according to how the channel is accessed. The contention-based protocols were further divided into random access and collision avoidance according to how they deal with collisions, while the scheduling-based ones were divided into dynamic allocation and static allocation according to the dynamics of channel allocation. Under this classification, the design ideas and primary mechanisms of the current main protocols were described, their performance differences, such as energy efficiency, channel utilization and throughput, were discussed, and a new development direction was presented for the improvement of MAC protocols.
Research on Data Publishing of Privacy Preserving
YANG Gao-ming, YANG Jing, ZHANG Jian-pei
Computer Science. 2011, 38 (9): 11-17. 
With the development of information technology, privacy leakage has become a serious problem; it is therefore urgent to prevent personal privacy disclosure in data publishing. For this reason, many researchers have proposed different ways to achieve privacy-preserving data publishing. Summing up the previous work, we introduced the research significance of privacy-preserving data publishing and its development process, described the attack models and privacy models studied in this field, analysed in depth the existing generalization/suppression and clustering methods for anonymous data release, summarized the information metrics of anonymized data quality, discussed incremental data release caused by data updates as well as high-dimensional and mobile data release, and finally looked into further research trends in this field.
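As background for the generalization/suppression methods the survey analyses, a minimal k-anonymity check over an already-generalized table can be sketched as follows (the attribute names and toy records are invented for illustration):

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """Check that every combination of quasi-identifier values
    appears in at least k records (the k-anonymity property)."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

# Toy table: age generalized to a range, ZIP code partly suppressed.
table = [
    {"age": "30-39", "zip": "100**", "disease": "flu"},
    {"age": "30-39", "zip": "100**", "disease": "cold"},
    {"age": "40-49", "zip": "200**", "disease": "flu"},
    {"age": "40-49", "zip": "200**", "disease": "asthma"},
]
print(is_k_anonymous(table, ["age", "zip"], 2))  # True
```

Each quasi-identifier group above contains two records, so the table is 2-anonymous but not 3-anonymous; generalization trades this guarantee against the information-quality metrics the survey discusses.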
Survey on Contextual Information Retrieval
TIAN Xuan, LI Kong-mei
Computer Science. 2011, 38 (9): 18-24. 
With the development of IR techniques, contextual information retrieval (CIR) has been identified as a promising direction for improving search. In CIR, the retrieval of information depends on the time and place of submitting the query, the history of interaction, the task in hand, and many other factors that are not given explicitly but lie implicitly in the interaction and surroundings of searching, namely the context. In this paper, a survey of research work on CIR was given, and the contextual elements of CIR were summarized into user context, document context, and system context, with the related research work introduced for each. At last, the key challenges in CIR were discussed from five aspects. Exploitation of the real need behind a user's query, search based on semantic understanding, and contextual information retrieval models, among others, were pointed out as the main problems to be resolved.
Survey on the Research of Cyber-Physical Systems(CPS)
LI Zuo-peng, ZHANG Tian-chi, ZHANG Jing
Computer Science. 2011, 38 (9): 25-31. 
Cyber-Physical Systems (CPS) refers to the tight conjoining of and coordination between computational and physical resources, and will change the way in which we interact with the physical world. As the evolution of the Internet of Things, CPS has attracted extensive attention from research institutions, government departments and communities in China and abroad. This survey introduced and described the definition, system architecture and features of CPS; in addition, it mainly studied and discussed the theory and technology hierarchy of CPS, the important challenges it poses to computer science and technology, and the state of current research; finally, the future perspective of research trends was presented.
Algorithm of Authenticated Skip List Based on Directed Hash Tree
XU Jian, CHEN Xu, LI Fu-xiang, ZHOU Fu-cai
Computer Science. 2011, 38 (9): 32-35. 
The authenticated skip list is an important authenticated data structure and has been widely used in data authentication. Since the hash scheme has an important influence on the cost of the authenticated skip list, a new hash scheme based on the idea of separating the hash scheme from the data storage scheme was proposed in this paper, together with a new authenticated skip list algorithm based on a directed hash tree (ASL-DHT for short). We applied hierarchical data processing and probability analysis methods to analyze the cost of ASL-DHT, and also ran an algorithm simulation to compare it with the original authenticated skip list. The results show that the ASL-DHT algorithm achieves great improvement in storage cost, communication cost, and time cost.
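The paper's ASL-DHT is not reproduced here, but the kind of hash chaining an authenticated skip list relies on can be illustrated with a minimal Merkle-style membership proof (the tree shape and all names are illustrative assumptions, not the paper's construction):

```python
import hashlib

def h(*parts):
    """Hash the concatenation of byte strings: the basic building block
    of authenticated data structures such as authenticated skip lists."""
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

# Authenticate a sorted set by hashing adjacent pairs bottom-up,
# analogous to hashing along the towers of an authenticated skip list.
leaves = [h(x.encode()) for x in ["a", "b", "c", "d"]]
level = leaves
while len(level) > 1:
    level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
root = level[0]

# A verifier holding only `root` checks membership of "b" from a
# logarithmic-size proof: the sibling hashes along the path.
proof = [leaves[0], h(leaves[2], leaves[3])]
recomputed = h(h(proof[0], h("b".encode())), proof[1])
print(recomputed == root)  # True
```

The cost trade-offs ASL-DHT targets (storage, communication, recomputation time) all hinge on how many such hash evaluations a query and its proof require.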
Data Aggregation Security Solution Based on Key Management Scheme of Dynamic Multi-cluster
LEI Feng-yu, QIN Yu-hua, CHEN Wen-xin, CHEN Jing
Computer Science. 2011, 38 (9): 36-40. 
A dynamic multi-cluster key management model based on identity was designed for wireless sensor networks. Key management in the model is cluster-based. Every node in the network is only required to store its private key and the public key factor matrix of the cluster it belongs to, which consumes little storage space and can resist collusion attacks. The key distribution is secure and effective, and node joining and leaving cost little. The model realizes identity authentication independent of a third party and needs no infrastructure to sustain a key distribution center. A secure data aggregation solution based on the model was proposed. The security of the proposed solution was analyzed and some attacks that it can resist were listed. The energy consumption during the handshaking process was discussed, which indicates that it is feasible to use identity-based cryptography in wireless sensor networks.
Asymmetric Group Key Agreement with Traitor Traceability
ZHAO Xiu-feng, XU Qiu-liang, LIU Wei
Computer Science. 2011, 38 (9): 41-44. 
The notion of asymmetric group key agreement (ASGKA) was first introduced at EuroCrypt 2009: the group members merely negotiate a common encryption key, which is accessible to attackers, corresponding to different decryption keys, each of which is computable by only one group member. One of the open problems is to achieve asymmetric group key agreement with traitor traceability. In this paper, we proposed a provably secure asymmetric group key agreement protocol, ASGKA-TT, in the standard model; the new protocol provides traitor traceability.
Systemic Approach of Evaluating Information Security
MA Lan, YANG Yi-xian
Computer Science. 2011, 38 (9): 45-49. 
Information security evaluation is the guarantee of effective long-term information security protection. Based on the theory of dynamic system control, this paper emphasized the systemic approach to information security evaluation, presented a framework model for information assurance, proposed an evaluation model for the systemic approach to information security, described the procedures of the systemic approach, and discussed the difficulties facing information security protection systems.
Trust Updating Algorithm Using Subject Logic in Wireless Sensor Network
XIE Fu-ding, ZHOU Chen-guang, ZHANG Yong, YANG Dong-wei
Computer Science. 2011, 38 (9): 50-54. 
To solve the distortion of trust updating in the Reputation-based Framework for Sensor Networks (RFSN), an algorithm of trust updating using subjective logic was presented. Firstly, the process of trust updating in RFSN was analyzed and decomposed. It was pointed out that the prime reason for the distortion is that representing trust as the expectation of the reputation random variable reflects the node's current behavior tendency only partially, and a general design model was constructed. Then, trust updating based on the expectation of the reputation random variable was adjusted adaptively by importing the subjective logic opinion, so as to discount trust updates with comparatively weaker support about the current behavior tendency, thereby avoiding the distortion caused by using long-term-oriented trust updating to reflect the node's current behavior. Finally, it was demonstrated with J-Sim that the presented algorithm could not only represent a node's long-term behavior tendency but also reflect its current behavior tendency to some extent.
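RFSN-style reputation systems maintain a Beta distribution per neighbor; the expectation-based trust value whose distortion the paper addresses can be sketched as follows (the parameter names and the aging factor are illustrative assumptions, and this is the baseline update, not the paper's subjective-logic adjustment):

```python
def update_reputation(alpha, beta, cooperative, uncooperative, aging=1.0):
    """Beta-reputation update in the style of RFSN: past counts are
    discounted by `aging` before new observations are added."""
    alpha = aging * alpha + cooperative
    beta = aging * beta + uncooperative
    return alpha, beta

def trust(alpha, beta):
    """Trust as the expectation of the Beta(alpha+1, beta+1) posterior."""
    return (alpha + 1) / (alpha + beta + 2)

# Eight cooperative and two uncooperative observations of a neighbor.
a, b = 0.0, 0.0
a, b = update_reputation(a, b, cooperative=8, uncooperative=2)
print(round(trust(a, b), 3))  # 0.75
```

Because the expectation averages over the whole observation history, a node that recently turned malicious keeps a high trust value for a while; that lag is exactly the distortion the subjective-logic adjustment is meant to reduce.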
Provable Secure Authentication Protocol Based on CPK and Improved ECDH Algorithm
HOU Hui-fang, WANG Yun-xia
Computer Science. 2011, 38 (9): 55-58. 
Since ECDH is easily subject to man-in-the-middle attacks, a provably secure improved ECDH in the Canetti-Krawczyk (CK) model was presented. Combined with CPK, a secure and efficient fast authentication protocol was devised based on the modular approach of the CK model. A message-transmission authenticator in which the key of the symmetric encryption algorithm is dynamically generated was also designed and improved. The protocol realizes session key association and confirmation together with mutual authentication. Analysis shows it has better security and performance.
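As background for the man-in-the-middle weakness, here is a toy textbook Diffie-Hellman exchange over a small prime field (illustrative only: real ECDH works in elliptic-curve groups with large parameters, and the paper's contribution is the CPK-based authentication layered on top):

```python
# Textbook Diffie-Hellman with toy public parameters (p, g).
p, g = 23, 5
a_secret, b_secret = 6, 15        # private keys of parties A and B

A = pow(g, a_secret, p)           # A's public value, sent to B
B = pow(g, b_secret, p)           # B's public value, sent to A

# Both sides derive the same shared key from the other's public value.
k_a = pow(B, a_secret, p)
k_b = pow(A, b_secret, p)
print(k_a == k_b)  # True

# Without authentication, an attacker can replace A and B in transit
# with its own public values and share a key with each side separately,
# which is why the exchange must be bound to identities (here via CPK).
```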
Incentive Compatible Reputation Model for P2P Networks
HU Jian-li, ZHOU Bin, ZHOU Yu, WU Quan-yuan
Computer Science. 2011, 38 (9): 59-63. 
An important challenge regarding peers' trust valuation in peer-to-peer (P2P) networks is how to cope with issues such as fraudulent behaviors and dishonest feedback from malicious peers, as well as inactive recommendation to others. However, these issues cannot be effectively addressed by existing solutions. Thus, an incentive-compatible reputation management model for P2P networks, named ICRM, was proposed to solve them. In ICRM, the metric of time zone is used to describe the time property of transaction experiences and recommendations. Three other metrics based on it, the direct trust value, the recommendation trust value and the recommendation credibility, are applied to express accurately the final trust level of a peer. Furthermore, the participation level is introduced as the metric to identify a peer's degree of activeness. Theoretical analysis and simulation experiments demonstrate that ICRM can effectively suppress malicious behaviors such as providing unreliable services or giving dishonest feedback in P2P networks. What's more, it can also incent peers to offer recommendations to others more actively.
Novel Linear Precoder Design with Oblique Projection for Downlink MU-MIMO Systems
ZENG Yu-hui, SHANG Peng, ZHU Guang-xi, WU Wei-min
Computer Science. 2011, 38 (9): 64-66. 
In current MU-MIMO systems the orthogonal projection method is generally employed on the user channels to maximize the performance of the worst channel by increasing the minimum channel gain. We introduced the research on linear precoding in multi-user MIMO systems and the mathematical preliminaries of oblique projection, then applied the oblique projection technique to the BD-GMD system and proposed a novel linear precoding scheme based on oblique projection to reduce the gain loss caused by the orthogonal projection method and further increase the system capacity. The system model of this precoding algorithm was described in detail, and the closed-form expression of system capacity was derived by applying the water-filling power allocation algorithm to the model. Simulation shows that at low SNR the proposed scheme can obtain higher system capacity than the orthogonal projection scheme.
Intelligent Transportation Flow Detection Technology Based on Internet of Things
LIU Tang, PENG Jian, YANG Jin, WANG Xiao-fen
Computer Science. 2011, 38 (9): 67-70. 
In order to better analyze dynamic transportation flow and thereby gain road traffic information, this paper combined Internet of Things technology with the intelligent transportation system and proposed a new intelligent transportation detection system, ITFDS (Intelligent Transportation Flow Detection System Based on Internet of Things). ITFDS obtains original traffic parameters and performs initial data fusion via vehicle sensor nodes. The sink is the central node for information gathering, distribution and secondary data fusion, and the central machine room is responsible for statistics and management. Simulation results show that ITFDS can effectively and timely obtain road transportation flow values and send them to vehicles, based on which travel routes can be selected.
Test Packets Choice Algorithm Aiming at Filter Conflicts
LI Lin, LU Xian-liang
Computer Science. 2011, 38 (9): 71-75. 
Because of firewall filter conflicts, filters may not accord with administrators' intentions, which leads to security vulnerabilities; therefore a correctness test is needed. Most current test-packet choice algorithms choose packets at random or from the apexes of filters in the correctness test. However, these methods neglect the areas that contain conflicting filters and hence cannot detect all errors produced by filter conflicts. This paper presented a test-packet choice algorithm aimed at filter conflicts to address this problem. The algorithm treats two filters as the basic processing object and computes the area in which they conflict. Test packets are chosen not only from the apexes of filters but also from the areas that contain conflicting filters. Compared with current test-packet choice algorithms, the algorithm proposed in this paper can detect all errors produced by filter conflicts while adding only a few packets. This paper proves the algorithm, and experiments verify its good performance.
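The conflict region of two range-based filters, from which the proposed method draws its extra test packets, can be sketched as an interval intersection per field (the field names and ranges below are invented for illustration):

```python
def overlap(f1, f2):
    """Intersection of two filters, each a dict of field -> (lo, hi)
    ranges. Returns None when no packet can match both filters."""
    region = {}
    for field in f1:
        lo = max(f1[field][0], f2[field][0])
        hi = min(f1[field][1], f2[field][1])
        if lo > hi:
            return None
        region[field] = (lo, hi)
    return region

# Two conflicting filters: one accepts, one denies, and they overlap.
accept = {"src": (10, 20), "dst": (0, 100)}
deny   = {"src": (15, 30), "dst": (50, 200)}
region = overlap(accept, deny)
print(region)  # {'src': (15, 20), 'dst': (50, 100)}

# Test packets drawn from the conflict region's corners exercise the
# ambiguous area that random or apex-based selection can miss.
corners = [(s, d) for s in region["src"] for d in region["dst"]]
```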
Throughput Optimization of Cognitive Radio Based on Sequential Detection
ZHANG Wen, YANG Jia-wei, YAN Qi, XIAO Li-yuan
Computer Science. 2011, 38 (9): 76-78. 
The throughput optimization of cognitive radio based on sequential detection was studied. The form of sequential detection in cognitive radio was given; then the mathematical model of throughput was established under the constraint that the primary users are sufficiently protected. Furthermore, it was proved by optimization theory that there is a unique optimal parameter setting which achieves the maximal throughput. At last, a search method was given to obtain the optimal parameter setting.
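The sensing-time/throughput tradeoff behind such an optimization can be illustrated with an assumed toy model and a grid search for its unique maximizer (the decay constant, frame length and model form are invented; the paper derives the optimum analytically for sequential detection):

```python
import math

def throughput(tau, frame=100.0, capacity=1.0):
    """Toy model: the false-alarm probability decays with sensing time
    tau, while the transmission share (frame - tau)/frame shrinks."""
    p_fa = math.exp(-0.2 * tau)          # assumed decay, illustrative
    return (frame - tau) / frame * capacity * (1.0 - p_fa)

# Grid search for the throughput-maximizing sensing time.
grid = [t / 10 for t in range(1, 1000)]
best_tau = max(grid, key=throughput)
print(round(best_tau, 1))
```

The product of an increasing detection-reliability term and a decreasing transmission-time term yields a single interior maximum, which is the qualitative shape the paper's uniqueness proof formalizes.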
Research of a Trust Chain Transfer Model
SI Li-min, CAI Mian, CHEN Yin-jing, GUO Ying
Computer Science. 2011, 38 (9): 79-81. 
To protect the static trustworthiness of an application, its integrity and that of its dynamic libraries are measured; analyzing the relations between interacting applications, this article established a trust chain transfer model to protect the dynamic trustworthiness of applications as they run and to build a trusted application environment. Based on the intransitive noninterference model, the article abstracted the system into applications, actions, states and outputs, and formally defined what it means for an application to run trusted. The application-trusted theorem was verified formally. Furthermore, by associating applications with system states, the definition and theorem of a trusted application environment were proposed.
Multi-dimensional Complex Query Processing over DHT
XU Qiang, SUN Le-chang, LIU Jing-ju, ZHAO Ting, CAI Ming
Computer Science. 2011, 38 (9): 82-86. 
Advanced query processing is a critical problem for the application of DHT networks and has attracted much attention from both the academic and industrial communities. This paper presented a technique for multi-dimensional complex query processing based on Kademlia. It takes the user's preference into consideration so that homogeneous data is relevantly indexed. Furthermore, the index maintenance brings no extra communication cost by piggybacking on routing table recovery, besides its advantages in resilience and load balance. The analysis and simulation results show that it implements multi-dimensional complex query processing with O(logN) query length and low cost.
Spurious Emission Interference of TD-SCDMA in Multiple Coverage
GAO Di, ZHU Guang-xi, CHEN Yong-hu, WU Wei-min
Computer Science. 2011, 38 (9): 87-90. 
As one of the international 3G standards proposed by China, TD-SCDMA has recently been applied more widely. But when several wireless systems coexist, interference among them increasingly affects system performance, which promotes studies in related areas. Since frequency bands are allocated subtly, spurious emission interference greatly influences the quality of communication. This paper was based on the theory of spurious emission, and interference on TD-SCDMA was quantitatively analyzed with reference to the standard of each system. The simulation produced detailed data on interference from different systems using the Monte Carlo method. Furthermore, the state of interference was described by parameters such as electronic level and spurious emission isolation, which can be referred to in the construction of platforms for a number of coexisting systems.
Research of Coordination Mechanism in Multi-platform Air Defense System
ZHANG Jie-yong, YAO Pei-yang, TENG Pei-jun
Computer Science. 2011, 38 (9): 91-94. 
How to coordinate actions among multiple platforms is one of the critical problems in building a multi-platform air defense system. On the basis of introducing the important status of the coordination mechanism in multi-platform air defense systems, different kinds of coordination mechanisms were summarized and specified, and then their performance and applications were analyzed and compared. At last, combining the different development phases of the operation mode, three kinds of coordination mechanisms based on target allocation were discussed.
Mix-criticality Driven QoS Adaptive Resource Management for SOA-based Critical Systems
ZHANG Yi, CAI Wan-long
Computer Science. 2011, 38 (9): 95-99. 
SOA-based distributed and embedded safety- and mission-critical systems (DESMCS) execute in open environments with different dependability requirements. This paper presented a novel mixed-criticality-driven QoS-adaptive dynamic resource management architecture, which provides middleware services for QoS- and criticality-based resource allocation and adaptation across heterogeneous computing nodes and communication networks. The architecture overcomes the disadvantages of traditional feedback-based static resource management techniques and provides layered, mixed-criticality-based resource allocation services. Finally, experiments demonstrate that it enables DESMCS to react dynamically to changing resource demands or resource availability, with better resource utilization and improved dependability.
WIA-PA Network Oriented Routing Algorithm Based on VCR
YI Xiu-shuang, WANG Xing-wei, WU Wei-xin, LIU Xiao-feng
Computer Science. 2011, 38 (9): 100-102. 
Based on the IEEE 802.15.4 standard and WIA-PA networks containing VCRs, the minimum-hop routing algorithm and the energy-efficient routing algorithm in wireless networks were introduced, and their advantages and disadvantages were analyzed and compared. By combining the advantages of the minimum-hop and energy-saving routing algorithms with the VCR in the WIA-PA network, a routing algorithm based on the VCR was designed and simulated. Simulation results show that the novel VCR-based routing algorithm combining the two traditional routing algorithms is efficient. Comparing the WIA-PA VCR application with different routing algorithms, it achieves lower delay in management data packet forwarding and lower energy consumption in data packet transmission, and is more suitable for WIA-PA network applications.
Research on Wireless Security Protocol Design
GU Xiang, ZHANG Zhen, QIU Jian-lin
Computer Science. 2011, 38 (9): 103-107. 
This paper discussed the general steps of wireless security protocol design: abstracting the application environment, analyzing the weaknesses of the application-specific network, confirming the objectives of the protocol to be designed, analyzing the advantages and disadvantages of existing similar protocols, designing the protocol, and proving the protocol's security. In accordance with these steps, a new wireless network authentication protocol was designed as an example. Practice shows that these steps can better guide the design of wireless security protocols. They can also be used to guide the design of some small application-layer protocols.
Research on Approach to End-to-end Network Link Delay Inference Based on PLE with Definite Solution
LIANG Yong-sheng, ZOU Yue, ZHANG Ji-hong
Computer Science. 2011, 38 (9): 108-111. 
Network delay is one of the important network performance parameters. End-to-end network delay inference can avoid the difficulties faced by other network measurements that rely on internal routers or router cooperation. Under two assumptions, that the network topology is known and stable and that link performance is temporally and spatially independent, a network delay inference model was presented, and a new approach to internal link delay inference based on Pseudo Likelihood Estimation (PLE) with a definite solution was proposed in this paper. Based on PLE solved with the Expectation Maximization (EM) algorithm, inference units with definite solutions were determined via back-to-back packet sending. This approach can solve the problem of indefinite solutions and lower the computational complexity. An experimental study was performed based on model computation. The experimental results show that the approach is accurate and effective.
Stochastic Model Checking Continuous Time Markov Process
NIU Jun, ZENG Guo-sun, LU Xin-rong, XU Chang
Computer Science. 2011, 38 (9): 112-115. 
The trustworthiness of a dynamic system mainly includes the correctness of its function and the satisfiability of its performance. This paper proposed an approach to verify the function and performance of a system under consideration in an integrated way. The continuous-time Markov decision process (CTMDP) is a model that combines probabilistic choice, stochastic timing and nondeterminacy, and it is the model with which we verify functional properties and analyze performance properties uniformly. We can verify functional and performance specifications by computing the reachability probabilities in the product CTMDP. We proved the correctness of our approach, and obtained our verification results using the model checker MRMC (Markov Reward Model Checker). The theoretical results show that model checking the CTMDP model is necessary and the model-checking approach is feasible.
Research on Context Inconsistency Detection and Resolution
ZHANG Yi-nan, WU Gang
Computer Science. 2011, 38 (9): 116-118. 
Context-aware systems require detection and resolution of context inconsistency, which exists in pervasive computing. Detection and resolution of context inconsistency were studied, a feedback mechanism based on a confidence measure database was presented, and the elimination resolution was enriched and extended in these ways. We also implemented these resolutions, and their performance was evaluated through a series of experiments.
Research on Verification of Web Service Composition Based on uMSD
WANG Zhi-jian, LI Wen-rui, YANG Zhong-xue, ZHANG Peng-cheng
Computer Science. 2011, 38 (9): 119-125. 
In order to solve the problem that analyzing composite services manually is rather difficult and time-consuming, an approach was proposed to verify composite services by model checking based on uMSD. How to represent the temporal properties of the composite service easily and intuitively is a critical issue of the approach. Because uMSD finds a balance between simplicity of use and expressiveness, the paper defined the formal syntax and semantics of uMSD. In the paper, uMSD was used to graphically represent the temporal properties of a composite service, On-the-Job Assistant, as a case study, demonstrating the feasibility of uMSD. A series of experiments show the approach can effectively detect the logical errors in the composite service.
Formal Development of Non-recursive Algorithm for Koch Curve
LIU Run-jie, SHEN Jin-yuan, MU Wei-xin
Computer Science. 2011, 38 (9): 126-129. 
Formal method is an important approach to constructing trustworthy software. The Koch curve is one of the typical fractals. A non-recursive algorithmic program for the Koch curve was developed, employing the PAR method and the strategy of developing loop invariants, and the algorithm was verified formally. This paper finally obtained the loop invariant of the Koch curve together with a readable, efficient and reliable non-recursive algorithm. The paper contributes to developing non-recursive algorithms using formal methods and a new strategy of developing loop invariants.
Extracting Model of Web Application Based on XML
CHENG Guang-jin, MIAO Huai-kou, FANG Ming-ke, MEI Jia, GAO Hong-hao
Computer Science. 2011, 38 (9): 130-134. 
For model checking, an approach to extracting TA models of Web applications with time constraints from XML documents was proposed. The extraction process is divided into three phases: time and link extraction, model construction, and display. Firstly, the Web application is analyzed reversely to extract, structure and store the information related to links and time constraints from the XML source code. Then, the elements for model construction, such as links and time constraints, are analyzed, and the obtained information is restructured with mapping and aggregation technology. Finally the TA (Timed Automata) model for formal checking is obtained. A particular case study, a mailbox system, illustrates that the method is feasible.
Test Case Generation Method for Concurrent Programs Based on Petri Net
HUO Min-xia, DING Xiao-ming
Computer Science. 2011, 38 (9): 135-138. 
Petri nets have an incomparable advantage in describing the unpredictable execution paths of concurrent programs. This article modeled the test paths of concurrent programs with Petri nets: it transformed the Petri net model built from the concurrent program code into a graph matrix, found the corresponding independent segments according to certain rules, and obtained the independent segment groups of the Petri net, which constitute the test paths of the concurrent program, by merging independent segments. Experiments show that using Petri nets in concurrent program testing reduces the difficulty and improves the efficiency of testing.
Construction of UDDI Based Collaborative Support Platform for Manufacturing
WU Qian, ZHOU Qing
Computer Science. 2011, 38 (9): 139-141. 
Application software is the soul of modern manufacturing. In order to overcome the problem of heterogeneous application integration, the paper proposed a unified collaborative support software development platform based on UDDI. The paper pointed out the speed of developing application software under a unified platform, explored collaborative resource discovery and its description mechanism based on the unified platform, discussed the service mechanism of software development support for resource integration, and studied key technologies such as Web services and SOA support. Finally it took a typical application case as an example to verify that adopting the unified collaborative support platform to develop application software systems in manufacturing is fast, practical and effective.
Research on Algorithms for Mining Fuzzy Horn Clause Rules
LIU Dong-bo, LU Zheng-ding
Computer Science. 2011, 38 (9): 142-145. 
Fuzzy association rules can be used to represent human knowledge in terms of natural language, and have attracted a growing amount of attention from the Data Mining and Knowledge Discovery communities. However, so far, most approaches to mining fuzzy association rules are based on the support and confidence measures of classical association rules. From the viewpoint of fuzzy implications, fuzzy Horn clause rules, degree of support, implication strength and some related concepts were defined, and an algorithm was proposed for mining fuzzy Horn clause rules. This algorithm can be decomposed into three subprocesses. First of all, a quantitative database is transformed into a fuzzy database. Secondly, all frequent itemsets in the fuzzy database that are contained in a sufficient number of transactions above the minimum support threshold are identified. Once all frequent itemsets are obtained, the desired fuzzy Horn clause rules above the minimum implication strength threshold can be generated in a straightforward manner.
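The degree-of-support computation in the second subprocess can be sketched as follows (the min t-norm and the fuzzy item names are illustrative choices, not necessarily the paper's exact definitions):

```python
def fuzzy_support(db, itemset):
    """Degree of support of a fuzzy itemset: the average over all
    transactions of the minimum membership degree of its items
    (min acting as the t-norm)."""
    return sum(min(t.get(item, 0.0) for item in itemset) for t in db) / len(db)

# Each quantitative record has already been fuzzified into
# membership degrees for fuzzy items such as "age is young".
db = [
    {"age:young": 0.8, "income:low": 0.6},
    {"age:young": 0.4, "income:low": 0.9},
    {"age:old": 1.0, "income:high": 0.7},
]
print(round(fuzzy_support(db, ["age:young", "income:low"]), 3))  # 0.333
```

Itemsets whose degree of support clears the minimum threshold become the frequent itemsets from which the fuzzy Horn clause rules are then generated.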
Core Algorithm of High-dimensional Main Memory kNN-Join Index Structure
LIU Yan,HAO Zhong-xiao
Computer Science. 2011, 38 (9): 146-149. 
Abstract PDF(323KB) ( 370 )   
RelatedCitation | Metrics
kNN-Join is an important but costly primitive operation in high-dimensional databases. As RAM gets cheaper and larger, more and more datasets can fit into main memory, so how to realize kNN-Join efficiently has drawn increasing interest. p-tree-R and p-tree-S were designed specifically for main-memory kNN-Join according to its properties. The core algorithms for building them, together with the relevant proofs, were presented in combination with coding and node-center-coincidence technologies. Experiments show that the p-tree-kNN-Join algorithm based on p-tree-R and p-tree-S is superior to Gorder, the existing kNN-Join algorithm usable in main memory.
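For reference, the operation itself: a kNN-Join pairs each point of R with its k nearest neighbours in S. The brute-force baseline below only defines the semantics; the paper's p-tree structures are index-based accelerations of this, not reproduced here:

```python
import heapq

def knn_join(R, S, k):
    """For each point index i of R, list the indices of its k nearest points in S."""
    def d2(p, q):                      # squared Euclidean distance
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return {i: [j for _, j in heapq.nsmallest(k, ((d2(r, s), j)
                                                  for j, s in enumerate(S)))]
            for i, r in enumerate(R)}

nn = knn_join([(0, 0), (5, 5)], [(0, 1), (4, 4), (10, 10)], k=2)
```

Any correct main-memory kNN-Join algorithm must return the same result set while visiting far fewer candidate pairs.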
Semantic Annotation Method Based on Sparse Coding
CHEN Ye-wang, LI Hai-bo, YU Jin-shan, CHEN Wei-bin
Computer Science. 2011, 38 (9): 150-154. 
Abstract PDF(493KB) ( 353 )   
RelatedCitation | Metrics
Semantic annotation plays a significant role in Semantic Web research. There are many annotation methods for unstructured documents today. However, none of them takes notice of the fact that knowledge is located sparsely in documents, and few of them make effective use of document structure, so they cannot annotate documents well when document quality is poor. In this paper, we proposed a Semantic Annotation Method based on Sparse Coding (SAMSC) for unstructured data. The method starts by identifying some semantics described in documents by ontology; secondly, to determine the correlation between a document and a semantic topic described in the ontology, it resolves the paragraph structure and topics of the document iteratively; finally, it annotates the documents over the global range of all documents by minimizing a loss function. The experimental results demonstrate that this method annotates unstructured Web documents automatically and effectively, and annotates low-quality documents better than other methods.
Frequent Itemsets Mining Algorithm of Succinct Constraint with Adaptive Thresholds
REN Yong-gong, LU Zhen, SUN Yu-qi
Computer Science. 2011, 38 (9): 155-157. 
Abstract PDF(313KB) ( 390 )   
RelatedCitation | Metrics
In recent years, constraint-based association mining has received more and more attention. From existing association mining algorithms it is easy to observe that the traditional thresholds are mostly given by experts or found after repeated tests; user feedback and the support of objective evidence are lacking. To address this problem, a construction method for thresholds catering to user requirements was proposed. To obtain self-adaptive constraint thresholds, the method draws on the theory of the normal distribution. The FGC algorithm is improved by using succinct constraints, and a fast, effective and intuitive frequent itemset mining algorithm was proposed. Experiments show that the proposed methods can increase the system's usability and reduce the running time of the algorithm.
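The abstract does not give the exact construction, but one plausible reading of a normal-distribution-based self-adaptive threshold is sketched below. The mean-minus-z-sigma rule is purely an assumption for illustration, not the paper's formula:

```python
import statistics

def adaptive_threshold(supports, z=1.0):
    """Hypothetical self-adaptive min-support: mean minus z standard deviations
    of the observed item supports, assuming they are roughly normally distributed."""
    mu = statistics.mean(supports)
    sigma = statistics.pstdev(supports)       # population standard deviation
    return max(0.0, mu - z * sigma)

t = adaptive_threshold([0.1, 0.2, 0.3])
```

The point of such a rule is that the threshold moves with the data distribution instead of being fixed by an expert.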
Security Dissemination Methods Based on Probability for Dynamic Views
SONG Jin-ling,LI Fang-ling, LIU Guo-hua, HUANG Li-ming, ZHANG Guang-bin, WANG Dan-li
Computer Science. 2011, 38 (9): 158-163. 
Abstract PDF(551KB) ( 297 )   
RelatedCitation | Metrics
Views can restrict database access to specific attributes and tuples, so publishing views can reduce the possibility of privacy disclosure. However, the data owner will publish various views for different applications at different times, and the dynamic and continuous nature of view dissemination creates mutual contact and mutual influence among views. Considering only the privacy disclosure of the published views themselves, called static privacy disclosure, cannot measure the disclosure accurately and cannot adapt to practical applications, so how to guarantee the security of dynamic view dissemination must be resolved. To solve this problem, the construction method of possible worlds and the calculation method of privacy disclosure probability were first proposed, and calculating formulas of privacy disclosure probability under different conditions of view merging were presented. Then, a security determination formula for dynamic views was proposed from the viewpoint of the relative safety of views. Based on this, a safe dissemination method for dynamic views was presented, which can guarantee the maximum level of view publishing while remaining relatively safe.
Building Multidimensional Analysis Model for Business Reports Based on CWM
PIAO Yong, WANG Xiu-kun, CHEN Zhi-kui, WANG Zheng
Computer Science. 2011, 38 (9): 164-167. 
Abstract PDF(361KB) ( 336 )   
RelatedCitation | Metrics
With the standardization and normalization of financial reports, large numbers of enterprises are introducing the eXtensible Business Reporting Language (XBRL) to disclose and manage their financial data. Because of the complexity and domain-specific features of XBRL, it lacks semantic description for data analysis, so many existing tools capable of multidimensional data analysis cannot understand and deal with XBRL files. Based on an analysis of the XBRL structure, the paper applied the Common Warehouse Metamodel (CWM), which is platform and domain independent, to build a generic multidimensional analysis model from business reports in XBRL format. Transformation rules are separated from the implementation, hence making the model more generic and open.
Survey of Data Broadcast Scheduling in Wireless Environments
YU Ping
Computer Science. 2011, 38 (9): 168-172. 
Abstract PDF(455KB) ( 489 )   
RelatedCitation | Metrics
Data broadcast is an efficient way of data dissemination in wireless environments. In this paper a theoretical analysis was first provided for the three data broadcast modes (periodic broadcast, on-demand broadcast and hybrid broadcast). Then current data scheduling algorithms were classified and compared, and some typical multi-channel scheduling algorithms were analyzed in detail. Finally we pointed out some research areas for data broadcast scheduling.
Spatio-temporal Association Rule Mining Algorithm and its Application in Intelligent Transportation System
XIA Ying, ZHANG Jun, WANG Guo-yin
Computer Science. 2011, 38 (9): 173-176. 
Abstract PDF(318KB) ( 549 )   
RelatedCitation | Metrics
Taking the spatial and temporal constraints into account simultaneously can filter irrelevant data early and improve the efficiency of discovering spatio-temporal association rules. Based on this idea, the Spatio-Temporal Apriori (STApriori) algorithm was proposed. It analyzes time validity and spatial relativity at the same time during the generation of frequent itemsets: it first classifies the time durations of the spatio-temporal data, considers the spatial relationships and generates the transaction table, then performs join operations on spatially related itemsets. Experiments show that the algorithm performs well. The algorithm is applied in an intelligent transportation system to analyze the trend of traffic congestion by identifying spatio-temporal associations between road sections.
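The early-pruning idea (drop events outside the time window and spatial region before counting itemsets) can be sketched as follows. The event schema and the axis-aligned region predicate are illustrative assumptions; the paper's spatial relations may be richer:

```python
from itertools import combinations
from collections import Counter

def st_filter(events, t_range, region):
    """Keep only events inside the time window and an axis-aligned spatial box."""
    t0, t1 = t_range
    x0, y0, x1, y1 = region
    return [e["items"] for e in events
            if t0 <= e["t"] <= t1 and x0 <= e["x"] <= x1 and y0 <= e["y"] <= y1]

def frequent_itemsets(transactions, min_sup, k=2):
    """Count k-itemsets in the filtered transactions and keep the frequent ones."""
    counts = Counter()
    for items in transactions:
        for combo in combinations(sorted(set(items)), k):
            counts[combo] += 1
    n = len(transactions)
    return {c for c, v in counts.items() if v / n >= min_sup}

events = [
    {"t": 1, "x": 0, "y": 0, "items": ["cong_A", "cong_B"]},
    {"t": 2, "x": 1, "y": 1, "items": ["cong_A", "cong_B"]},
    {"t": 9, "x": 0, "y": 0, "items": ["cong_A", "cong_C"]},  # outside the time window
]
txns = st_filter(events, (0, 5), (0, 0, 5, 5))
freq = frequent_itemsets(txns, min_sup=0.5)
```

Because the third event is pruned before counting, the candidate space shrinks exactly as the abstract describes.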
New Ensemble Constructor Based on Locality Preserving Projection for High Dimensional Clustering
ZHOU Jing-bo , YIN Jun, JIN Zhong
Computer Science. 2011, 38 (9): 177-181. 
Abstract PDF(429KB) ( 303 )   
RelatedCitation | Metrics
This paper studied how to construct cluster ensembles for high-dimensional data and proposed a new ensemble constructor. To ameliorate the effect caused by high dimensionality, the proposed method uses Locality Preserving Projections (LPP) to reduce the dimensionality before constructing ensembles, then constructs ensembles based on random projection combined with K-means in the LPP subspace. Finally, we discussed how to choose the dimensionality of the LPP subspace. The experiments show that ensembles generated by the new algorithm perform better than those by Principal Component Analysis with subsampling (PCASS) and the simple Random Projection (RP) proposed before.
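A minimal sketch of constructing one ensemble member: project the high-dimensional points down, then cluster in the subspace. Plain Gaussian random projection stands in for LPP (which requires an eigendecomposition), and K-means uses fixed initial centroids so the run is deterministic; all of this is a simplified proxy, not the paper's implementation:

```python
import random
import math

def random_projection(X, out_dim, seed=0):
    """Project d-dim points to out_dim via a Gaussian random matrix
    (Johnson-Lindenstrauss style), standing in for the paper's LPP step."""
    rng = random.Random(seed)
    d = len(X[0])
    M = [[rng.gauss(0, 1) / math.sqrt(out_dim) for _ in range(d)]
         for _ in range(out_dim)]
    return [[sum(m * x for m, x in zip(row, p)) for row in M] for p in X]

def kmeans(X, init, iters=10):
    """Minimal Lloyd's k-means with caller-supplied initial centroids."""
    cents = [list(c) for c in init]
    labels = [0] * len(X)
    for _ in range(iters):
        labels = [min(range(len(cents)),
                      key=lambda j: sum((a - b) ** 2
                                        for a, b in zip(X[i], cents[j])))
                  for i in range(len(X))]
        for j in range(len(cents)):
            members = [X[i] for i in range(len(X)) if labels[i] == j]
            if members:
                cents[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

X = [[0.0] * 10, [0.1] * 10, [5.0] * 10, [5.1] * 10]  # two tight 10-dim clusters
Y = random_projection(X, 3)
labels = kmeans(Y, init=[Y[0], Y[2]])
```

Running this with several projection seeds yields the diverse partitions an ensemble combiner would then merge.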
Scene Classification Method Based on the Feature Fusion of Regional Semantic Sub-concept Occurrence
WANG Ling-jiang, RUAN Jia-bin , YANG Yu-bin
Computer Science. 2011, 38 (9): 182-185. 
Abstract PDF(410KB) ( 337 )   
RelatedCitation | Metrics
A large quantity of practical demand has made image retrieval a research focus. We proposed a scene classification method based on the fusion of concept occurrence vectors, which are gained by segmenting images uniformly and recognizing the corresponding regional sub-concepts. Experiments show that our method increases the P/R values of scene classification, bringing a better result.
Community Division Model by Semantic Entropy and its Application Research
ZHOU Ming-qiang, ZHU Qing-sheng, LIU Hui-jun, ZHANG Cheng
Computer Science. 2011, 38 (9): 186-189. 
Abstract PDF(337KB) ( 441 )   
RelatedCitation | Metrics
Given that a complex network is composed of a number of communities and that users are usually interested in only a few topics, this paper proposed a community division model based on community semantic entropy and the entropy between semantic communities, divided the network into several communities, and applied the model to the service-center problem. Experimental analyses of community load capacity and other characteristics show that the model's efficiency increases significantly in application because the semantic characteristics of communities are fully accounted for; it also provides a new way to deploy service registries in semantic communities.
Credit Evaluation Model Based on Multiple Evolutionary Neural Networks
YU Min, WU Jiang
Computer Science. 2011, 38 (9): 190-192. 
Abstract PDF(332KB) ( 390 )   
RelatedCitation | Metrics
Credit evaluation plays an important role in banking management. A novel credit evaluation model based on multiple evolutionary neural networks, named MNN-CREDIT, was presented. The model builds classifiers from a group of three-layer feed-forward neural networks with high accuracy and good diversity, trained by a niche genetic algorithm based on clustering. The credit of a client to be identified is first evaluated by each neural network, and the final credit classification is obtained according to a dynamic voting rule. An empirical analysis on the German credit database was given. The results show that the MNN-CREDIT model has higher prediction precision.
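The abstract leaves the dynamic voting rule unspecified; a minimal weighted-vote stand-in, where the weights model per-network confidence, might look like this (the rule and weights are assumptions, not the paper's method):

```python
def dynamic_vote(predictions, weights):
    """Combine ensemble members' class predictions by weighted vote."""
    scores = {}
    for p, w in zip(predictions, weights):
        scores[p] = scores.get(p, 0.0) + w
    return max(scores, key=scores.get)

# Three hypothetical networks vote on one applicant's credit class.
winner = dynamic_vote(["good", "bad", "good"], [0.5, 0.9, 0.3])
```

With equal weights this degenerates to majority voting; confidence weights let a single well-calibrated member overrule two weak ones.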
Application of Augmented Lagrange Optimization Algorithm to the Sparse Signal Reconstruction Problem
YANG Jun-jie, LIU Hai-lin
Computer Science. 2011, 38 (9): 193-196. 
Abstract PDF(336KB) ( 492 )   
RelatedCitation | Metrics
To solve the problem of finding sparse solutions from the Lp optimization model(O
Running-time Analysis of Evolutionary Programming Based on Levy Mutation
CAI Zhao-quan, LUO Wei, ZHANG Yu-shang, HUANG Han, LUO Yong-wei
Computer Science. 2011, 38 (9): 197-200. 
Abstract PDF(308KB) ( 332 )   
RelatedCitation | Metrics
Running-time analysis of continuous evolutionary algorithms is a difficult open problem in the field at home and abroad. To deal with this issue, the paper gave an in-depth study of the running time of evolutionary programming based on Levy mutation (LEP). The procedure is as follows. First, the LEP algorithm was modeled as an absorbing Markov process, which proved the convergence of LEP to the optimal solution. Second, the expected first hitting time was used to evaluate the running time of the LEP algorithm, taking its computational properties into consideration. Finally, based on a similarity transformation of the Levy distribution, an estimation equation for the LEP running time was proposed. The results indicate that the upper bounds on the running time are directly influenced by the Lebesgue measure of the optimal space, the population size and the search range.
Lattice-valued Semantic Resolution Reasoning Method
ZHANG Jia-feng, XU Yang, HE Xing-xing
Computer Science. 2011, 38 (9): 201-203. 
Abstract PDF(302KB) ( 459 )   
RelatedCitation | Metrics
Resolution-based automated reasoning is one of the most important research directions in AI, and the semantic method is one of the most important refinements of the resolution principle. Semantic resolution restricts the types of clauses and the order of literals participating in the resolution procedure to reduce redundant clauses and improve the efficiency of reasoning. The α-resolution principle on lattice-valued logic based on lattice implication algebra provides an alternative tool for handling automated reasoning problems with incomparability and fuzziness information; it can refutably prove the unsatisfiability of logical formulae in lattice-valued logic systems. This paper first discussed the properties of one class of generalized clause sets in the lattice-valued propositional logic LP(X); such a generalized clause set can be divided into two non-empty sets. The semantic resolution method on it was investigated, and the soundness theorem and weak completeness theorem of this semantic resolution method were proved.
Parallel Algorithm of Block Bordered Linear Programming
YANG Lin-feng, LI Tao-sheng, LI Jie, CHEN Yan
Computer Science. 2011, 38 (9): 204-207. 
Abstract PDF(291KB) ( 480 )   
RelatedCitation | Metrics
This paper presented the simpler and simplest correction equations of linear programming (LP) with block bordered coefficient matrix under the framework of the interior point method (IPM), and the diagonal sub-matrix in the simplest correction equation was proved to be symmetric positive definite. A parallel IPM algorithm for LP was presented after a parallel solver for the correction equation was proposed by integrating decoupling and Cholesky factorization of the symmetric positive definite matrix. Simulations on a cluster show that the proposed method is very promising for large-scale LP problems due to its excellent speedup and scalability.
New Construction Method of Fuzzy Entropy of Vague Sets
XU Feng-sheng,SHI Kai-quan
Computer Science. 2011, 38 (9): 208-210. 
Abstract PDF(226KB) ( 332 )   
RelatedCitation | Metrics
The drawbacks of the existing definition of the fuzzy entropy of Vague sets were pointed out, and the causes of the drawbacks were analyzed. We gave an axiomatic definition of the fuzzy entropy of Vague sets, proposed a new fuzzy entropy of Vague sets, and proved its rationality and validity.
Global Path Planning of Unmanned Surface Vehicle Based on Electronic Chart
ZHUANG Jia-yuan, WAN Lei, LIAO Yu-lei, SUN Han-bing
Computer Science. 2011, 38 (9): 211-214. 
Abstract PDF(372KB) ( 617 )   
RelatedCitation | Metrics
To solve the problem of global path planning for USVs (Unmanned Surface Vehicles), a shortcut-searching Dijkstra algorithm based on electronic charts was presented. The algorithm uses a dynamic grid model, overcomes the memory-consumption problem of the traditional Dijkstra algorithm, reduces planning time and improves planning precision. Simulation results show that the environment model and path planning algorithm can generate safe and reasonable routes.
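For orientation, plain Dijkstra search on a static occupancy grid is shown below; it is only the baseline the paper improves on, and the dynamic grid model and shortcut search are not reproduced:

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest path on a 4-connected grid; cells with value 1 are obstacles."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                       # stale queue entry
        r, c = u
        for v in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            vr, vc = v
            if 0 <= vr < rows and 0 <= vc < cols and grid[vr][vc] == 0:
                nd = d + 1
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
    path, node = [], goal                  # walk predecessors back to start
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
path, d = dijkstra_grid(grid, (0, 0), (2, 2))
```

A static grid of an entire chart is what makes the classical algorithm memory-hungry; the paper's dynamic grid refines cells only where needed.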
Interval Valued Intuitionistic Fuzzy Sets with Parameters and its Application in Pattern Recognition
ZHANG Zhen-hua, YANG Jing-yu, YE You-pei, ZHANG Qian-sheng
Computer Science. 2011, 38 (9): 215-219. 
Abstract PDF(395KB) ( 382 )   
RelatedCitation | Metrics
The concept of interval-valued intuitionistic fuzzy sets with parameters (IVIFSP) was first introduced, and a series of IVIFSPs were presented. Based on the degree of membership and the degree of non-membership, this paper focused on the construction of interval-valued intuitionistic fuzzy sets with a single parameter. Finally, a parameter equation with a critical point between any two patterns was constructed, and the critical point was proved to be theoretically effective in analyzing the result of pattern recognition. Simulation results show that the IVIFSP method is more flexible than that of traditional intuitionistic fuzzy sets.
Semi-supervised Multi-instance Kernel
ZHANG Gang, YIN Jian,CHENG Liang-lun, ZHONG Qin-ling
Computer Science. 2011, 38 (9): 220-223. 
Abstract PDF(413KB) ( 390 )   
RelatedCitation | Metrics
In multi-instance learning, mechanisms that make use of unlabeled instances can cut down training cost and increase the generalization ability of the learner. Current algorithms perform semi-supervised multi-instance learning mainly by labeling each instance in bags and transforming the multi-instance learning problem into single-instance semi-supervised ones. In this paper we introduced a bag-level semi-supervised learning framework based on the idea that a bag's label is determined by its instances and structure. With the definition of a multi-instance kernel, all bags (labeled and unlabeled) were used to calculate a bag-level graph Laplacian, which is a penalization term added to the optimization goal. We turned this into an optimization problem in an RKHS and obtained a modified multi-instance kernel function from the unlabeled data that can be used directly in traditional kernel learning frameworks. We performed experiments on the ALOI and Internet image datasets and compared the method with related algorithms. Experimental results show that the proposed method achieves the same accuracy as its supervised counterpart with fewer labeled bags, and that with the same labeled training set it has higher generalization ability.
Research about Tone Recognition of Mandarin Continuous Speech Based on Multi-space Probability Distribution
NI Chong-jia, LIU Wen-ju, XU Bo
Computer Science. 2011, 38 (9): 224-226. 
Abstract PDF(338KB) ( 359 )   
RelatedCitation | Metrics
Chinese Mandarin is a tonal language, and tone is important to Mandarin speech recognition. We proposed a method to recognize the tones of Mandarin continuous speech that combines an embedded tone model and an explicit tone model, and can fuse short-time and long-time fundamental frequency information. Experiments on the "863-Test" and "TestCorpus98" test sets show that the proposed method achieves tone recognition correct rates of 96.12% and 93.78% respectively.
Reinforcement Learning Negotiation Strategy Based on Bayesian Classification
SUN Tian-hao, CHEN Fei, ZHU Qing-sheng, CAO Feng
Computer Science. 2011, 38 (9): 227-229. 
Abstract PDF(313KB) ( 317 )   
RelatedCitation | Metrics
To help a negotiation Agent select its best actions and reach its final goal, a reinforcement learning negotiation strategy based on Bayesian classification was proposed. During the negotiation process, the negotiation Agent makes the best use of the opponent's negotiation history to decide the opponent's type based on Bayesian classification, dynamically adjusts its beliefs about the opponent in time, quickens the convergence of the negotiation and reaches a better negotiation result. Finally, the algorithm was shown to be effective and practical by experiment.
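A minimal categorical naive Bayes classifier for opponent-type estimation from negotiation history is sketched below; the feature encoding, opponent types and Laplace smoothing are illustrative assumptions, not the paper's model:

```python
from collections import defaultdict
import math

def train_nb(history):
    """history: list of (feature_tuple, opponent_type) observations."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(int)
    for feats, label in history:
        class_counts[label] += 1
        for i, f in enumerate(feats):
            feat_counts[(label, i, f)] += 1
    return class_counts, feat_counts

def classify_nb(model, feats):
    """Pick the opponent type with maximum log posterior (Laplace smoothing)."""
    class_counts, feat_counts = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label, c in class_counts.items():
        lp = math.log(c / total)
        for i, f in enumerate(feats):
            lp += math.log((feat_counts[(label, i, f)] + 1) / (c + 2))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

history = ([(("large_concession",), "conceder")] * 3
           + [(("small_concession",), "tough")] * 3)
model = train_nb(history)
guess = classify_nb(model, ("large_concession",))
```

The agent would re-run the classification as each new round of the opponent's offers arrives, updating its belief about the opponent's type.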
Variable Precision Fuzzy Rough Set Based on Inclusion Degree
XU Wei-hua, ZHANG Xian-tao,WANG Qiao-rong
Computer Science. 2011, 38 (9): 230-233. 
Abstract PDF(401KB) ( 318 )   
RelatedCitation | Metrics
We studied a power inclusion degree on fuzzy sets and established a fuzzy rough set model based on it on the foundation of weak fuzzy partitions. Some important properties of this model were investigated and the corresponding method for computing roughness was illustrated. Furthermore, two roughness measures of fuzzy rough sets in this model were defined according to the Hamming distance and the Euclidean distance. The proposed fuzzy rough set model can carry out computations on fuzzy rough sets via fuzzy set operations, and lays a theoretical foundation for research on and applications of variable precision fuzzy rough sets.
Note of Covering Rough Set Model Nature
LIANG Jun-qi, YAN Shu-xia
Computer Science. 2011, 38 (9): 234-236. 
Abstract PDF(199KB) ( 345 )   
RelatedCitation | Metrics
We generalized some properties of the covering rough set model, introduced a pair of new operators, extended the inclusion relations for the union and intersection of the upper and lower approximation sets, and obtained some better results.
Co-evolutionary Genetic Algorithm by Two Stages Based on Potential Field
ZHAO Xue-chen, WANG Hong-guo, SHAO Zeng-zhen, MIAO Jin-feng
Computer Science. 2011, 38 (9): 237-241. 
Abstract PDF(414KB) ( 306 )   
RelatedCitation | Metrics
This paper proposed a two-stage co-evolutionary genetic algorithm based on a potential field. In the first stage, each population evolves mainly by sexual reproduction; when all populations reach evolutionary stagnation, key areas are formed by cluster analysis, and narrowing the search area improves the efficiency of the algorithm. In the second stage, each population evolves mainly by asexual reproduction to enhance local search, and directed evolution based on individuals' fitness is realized to speed up the convergence rate. At the same time it proposed a concept called the environmental potential field, which guides the evolution so that multiple populations evolve cooperatively. The experimental results show that the proposed algorithm has high precision and a rapid convergence rate, and that it overcomes the low efficiency of traditional algorithms to some extent.
Data Dimension Reduction Based on Category Theory
ZHOU Li-li, LI Fan-zhang
Computer Science. 2011, 38 (9): 242-244. 
Abstract PDF(310KB) ( 459 )   
RelatedCitation | Metrics
Category theory is mainly concerned with the summary and abstraction of specific mathematical objects and mappings. The dimensionality reduction problem was discussed based on category theory, which can help solve problems in image analysis and image recognition, and the process of dimensionality reduction based on category theory was described. Two examples, the Principal Component Analysis category and the Isomap category, verify the correctness of applying category theory to the dimensionality reduction problem.
Study on Pattern Similarity of Time Series
LIN Xun, LI Zhi-shin, ZHOU Yong
Computer Science. 2011, 38 (9): 245-247. 
Abstract PDF(223KB) ( 329 )   
RelatedCitation | Metrics
Under normal circumstances, the same distance measurement method can only measure the similarity of sequence patterns of the same length. Based on similarity defined via a meta-model, this article drew on the idea of dynamic time warping distance, gave the definition of a dynamic time warping distance for sequential patterns, and finally conducted simulation experiments with two different time series.
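The dynamic time warping distance the abstract draws on, in its standard textbook form, handles sequences of different lengths by allowing one-to-many alignments:

```python
import math

def dtw(a, b):
    """Dynamic time warping distance between two numeric sequences."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

d_equal = dtw([1, 2, 3], [1, 2, 2, 3])   # shapes match after warping
d_diff = dtw([1, 2], [1, 3])
```

Unlike Euclidean distance, DTW reports zero for the first pair even though the sequences have different lengths, which is exactly the property the pattern-similarity definition needs.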
Chessboard Corner Detection Algorithm Based on SUSAN and Multi-direction Restriction of Symmetry and Uniformity
XU Shu-kui, LI Guo-hui,ZHANG Jun ,TU Dan
Computer Science. 2011, 38 (9): 248-252. 
Abstract PDF(456KB) ( 718 )   
RelatedCitation | Metrics
Corner detection is a key step in camera calibration. Aiming at the problem that the SUSAN algorithm cannot differentiate edges from corners in chessboard corner detection, this paper analyzed the characteristics of chessboard corners. A chessboard corner detection algorithm based on multi-direction restriction of the symmetry and uniformity of the corners was proposed: the real corners of the chessboard are confirmed by computing the symmetry and uniformity restrictions on the result of the SUSAN algorithm. Experiments show that the proposed algorithm has better validity and precision.
Calibration Methods for the Multi-touch System Based on Four Collaborative Cameras
XIE Fei, CAI Shun ,WANG De-xin , CHEN Chao
Computer Science. 2011, 38 (9): 253-256. 
Abstract PDF(441KB) ( 292 )   
RelatedCitation | Metrics
The multi-touch system using four cameras resolves occlusion by fitting the four directing lines of each touch point. This paper presented three methods to build the directing line: a look-up table method, a vanishing point method and a stereo calibration method. The look-up table method places a set of reference points along the rectangular frame, finds the closest reference point by interpolation, and builds the line from the projection center and the reference point; the vanishing point method builds the directing line from the vanishing point of the touch point and the projection center; the stereo calibration method builds the directing line from the epipolar-geometry back-projection line. Results show that the look-up table method is applicable to small platforms, the vanishing point method fits large platforms, and the stereo calibration method needs improved precision.
Finger Vein Recognition Algorithm Based on Relative Distance
WANG Ke-jun, LIU Jing-yu, LI Xue-feng
Computer Science. 2011, 38 (9): 257-259. 
Abstract PDF(330KB) ( 399 )   
RelatedCitation | Metrics
A new finger vein recognition algorithm was proposed that matches finger veins by the essential features of the inner vein structure. First, it extracts the endpoints and crossing points from finger vein images that have been thinned and repaired; then it computes the distances between these points; lastly it accomplishes finger vein image recognition by comparing these distance values. The method integrates the vein's inherent features, makes full use of the essential properties of the topological structure, and needs no localization, so it is simple and easy to execute. The experimental results indicate that the algorithm effectively overcomes the influence of image translation and rotation, and identity recognition can be executed fast and exactly, so the method has practical application value and promising prospects.
Research on the Complexity of Two Dimensional Engineering Graphics for Information Security
LONG Min, PENG Fei
Computer Science. 2011, 38 (9): 260-263. 
Abstract PDF(306KB) ( 315 )   
RelatedCitation | Metrics
Based on the characteristics of two-dimensional CAD engineering graphics, the complexity of two-dimensional CAD engineering graphics was proposed. Definitions of entity complexity, restriction complexity and characteristic complexity were given, and their calculation methods were presented respectively. Analysis and discussion illustrate that the proposed method can effectively evaluate the complexity of two-dimensional CAD engineering graphics. Finally, the application of this complexity to the encryption and watermarking of two-dimensional engineering graphics was discussed.
Surveillance Scene Analysis Model Based on Motion Trajectory
LIANG Hao-zhe , LI Guo-hui, ZHANG Jun
Computer Science. 2011, 38 (9): 264-266. 
Abstract PDF(539KB) ( 320 )   
RelatedCitation | Metrics
The proposed model learns the potential structure of motion trajectories within a hierarchical process composed of classification and clustering. A topology prior is used to classify the spatial similarity of trajectories. Then, by utilizing a mixture model to estimate the distribution of motion features, the potential motion rules of the scene are obtained, based on which abnormal motion can be detected. The model is robust to low-level noise because of the statistical learning of multi-dimensional motion clues, and by combining the prior, the motion rules have obvious semantic structures. Experimental results on real surveillance footage verify the efficiency of the proposed model.
Applications of Fuzzy Logic in the Protection Algorithm of Face Feature
ZHOU Ling-li,LAI Jiang-huang,WU Xian
Computer Science. 2011, 38 (9): 267-270. 
Abstract PDF(334KB) ( 574 )   
RelatedCitation | Metrics
With the growing use of face recognition in security and video surveillance domains, there is growing concern about the security and privacy of biometric data. Recently, technologies for biometric security and privacy have been proposed which typically transform the biometric data into a binary string. These transformations can lead to some information loss and downgrade the performance of a system. This paper applied fuzzy logic to confirm the reliability of each bit in a binary string and to model the intra-class variations. The experimental results show this method reduces the overlap of the impostor and genuine distributions and improves the performance of biometric security technology.
Approach to Video Text Localization Based on Gradient Discrete Cosines Transform
YAN Jun-hua,LI Dan,ZHOU Ya-tong
Computer Science. 2011, 38 (9): 271-275. 
Abstract PDF(856KB) ( 341 )   
RelatedCitation | Metrics
Video text information plays an important role in semantic-based video analysis, indexing and retrieval. The gray levels of video text and background differ, so an approach to video text localization using the gradient discrete cosine transform (DCT) was proposed. Firstly, the video frame is divided into N×N macro-blocks and the DCT coefficients of each macro-block are obtained. Then the gradient operator value, considered as the block intensity, is calculated and processed with smoothing filters and morphological processing. Lastly, the horizontal and vertical projections are obtained, the number of text lines is determined, and the text regions are marked with text boxes. The experimental results show that the proposed method is efficient for localizing both static and rolling video text, especially regarding the accuracy and completeness of localizing characters with fewer strokes.
Parallel Algorithm and Implementation for Molecular Dynamics Simulation Based on GPU
FEI Hui, ZHANG Yun-quan, WANG Ke, XU Ya-wu
Computer Science. 2011, 38 (9): 276-278. 
Abstract PDF(410KB) ( 1052 )   
RelatedCitation | Metrics
Molecular dynamics simulation is an important method for acquiring the properties of liquid and solid atoms, and has been widely used in chemistry, physics, biology, medicine and materials. Its complexity and accuracy demands cause enormous workloads, and parallel computing is a feasible way to speed up large-scale molecular dynamics simulation. With hundreds of GFlops or even TFlops of performance, GPUs can speed up computing-intensive applications. This paper presented a parallel algorithm named oApT-AD and implemented it on GPU under the OpenCL and CUDA frameworks. The experimental results show that the oApT-AD algorithm can achieve a 120x speedup on a GPU Tesla C1060 under the OpenCL framework, compared to the CPU. We also implemented the oApT-AD algorithm on GPU under the CUDA framework; the OpenCL implementation provides almost the same performance as the CUDA implementation. Moreover, our algorithm can be extended to two or more GPUs with good scalability.
Design and Verification of Clock Domain Crossing for SOC
LUO Li, HE Hong-jun , XU Wei-xia , DOU Qiang
Computer Science. 2011, 38 (9): 279-281. 
Abstract PDF(323KB) ( 754 )   
RelatedCitation | Metrics
With the increasing number of clock domains and CDC signals in today's high-performance, low-power SOCs, the design and verification of CDC problems become more and more important. Traditional verification methods cannot find all functional errors of cross-clock-domain designs at the RTL stage. In this paper, we discussed five types of CDC synchronizer circuit templates used in our chip, and proposed a hierarchical verification method: structural analysis, assertion-based verification and formal verification. Tests of the taped-out sample chip show all CDC designs work correctly, and demonstrate that the design and verification methods are effective and complete.
Another Side of the Wall--Deeper Thinking in Turing Model
LEE Yick Kuen , CHENG Lee Lung
Computer Science. 2011, 38 (9): 282-287. 
Abstract PDF(523KB) ( 364 )   
RelatedCitation | Metrics
Computers have entered the era of multi-processor structures; however, multicore puts us on the wrong side of the wall. Memory address parameters introduce extra redundancy and decrease system efficiency. This paper proposed a new kind of model that considers intelligent processes as transformations on information. The model unifies the Von Neumann machine program and the neural network transformation. The paper analysed the similarities and differences of these two processes and proposed a computer system based on a micro-kernel architecture that diverts into Von Neumann or neural transformation according to maturity and speed requirements. Imitating the intellect of biological organisms, this kind of computer serves as a tool for human beings. Professor Eugenio Culurciello of Yale, with Yann LeCun of NYU, presented a high-performance embedded computer, NeuFlow, at the HPEC workshop on 2010-09-15 in Boston; its system architecture concepts are quite similar to the Pseudo-Organic computer proposed in this paper.
Loop Distribution Algorithm Based on Data Dependence Analysis in Auto-vectorization
HUANG Lei, YAO Yuan, HOU Yong-sheng, YANG Min
Computer Science. 2011, 38 (9): 288-293. 
Abstract PDF(473KB) ( 337 )   
RelatedCitation | Metrics
Loop distribution is a useful method for vectorizing programs, but because of data dependences it is very hard to achieve complete loop distribution in auto-vectorization, so current auto-vectorization compilers usually perform only simple loop distribution. This paper discussed a new loop distribution method based on identifying statement vectorizability from the data dependence view, and implemented it in a current auto-vectorization compiler. With this method, we can completely analyse which statements and which dependence cycles can be vectorized; finally, using loop distribution, the vectorizable statements and non-vectorizable statements are distributed into different loops. This method can handle loops that cannot be vectorized by other auto-vectorization compilers, and has good effect on some loops with complex dependences.
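The core idea can be illustrated with a toy example (not the paper's compiler implementation): a loop that mixes a true recurrence with an independent statement cannot be vectorized as a whole, but after distribution the independent statement sits alone in a loop a compiler can turn into SIMD code.

```python
def fused_loop(a, b, c):
    """Original loop: statement 1 carries a dependence cycle
    (a[i] uses a[i-1]), statement 2 is independent, but fusing
    them in one loop blocks vectorization of both."""
    n = len(a)
    for i in range(1, n):
        a[i] = a[i - 1] + b[i]   # loop-carried dependence: must stay scalar
        c[i] = 2.0 * b[i]        # no cross-iteration dependence
    return a, c

def distributed_loops(a, b, c):
    """After loop distribution: same results, but the second loop
    contains only the vectorizable statement."""
    n = len(a)
    for i in range(1, n):        # scalar loop keeps the recurrence
        a[i] = a[i - 1] + b[i]
    for i in range(1, n):        # this loop is now trivially vectorizable
        c[i] = 2.0 * b[i]
    return a, c
```

Distribution is legal here because no dependence runs from the second statement back to the first; the paper's contribution is deciding this automatically for dependence cycles as well as single statements.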
Energy-efficient Task Scheduling Approach for Homogeneous Multi-core Processors
WANG Ying-feng, LIU Zhi-jing
Computer Science. 2011, 38 (9): 294-297. 
Abstract PDF(333KB) ( 449 )   
RelatedCitation | Metrics
For periodic hard real-time tasks running on homogeneous multi-core processors, an energy-efficient approach based on dynamic voltage scaling (DVS) was designed. First, computation tasks are ordered by decreasing cycle counts, and task mapping is arranged on the principle of the shortest scheduling length for computation tasks. Then, for each processor core, the computation task with the minimum communication time is set as the last executed computation task while the order of the other computation tasks is kept unchanged. The optimal execution order of computation tasks on each processor core is determined during execution-time extension, in the case where all computation tasks are arranged at the highest frequency in the initial mapping. Experiments were conducted on several random task sets based on the power model of the Intel PXA270. Results show that the proposed approach can decrease the energy consumption of multi-core processors effectively.
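The first phase described above, ordering tasks by decreasing cycles and greedily keeping the schedule short, can be sketched as follows. The task list and the quadratic energy model are illustrative assumptions, not the paper's exact PXA270 formulation, which is more detailed.

```python
def map_tasks(cycles, num_cores):
    """Order tasks by decreasing cycle count, then assign each to the
    core with the currently shortest schedule (greedy LPT mapping)."""
    cores = [[] for _ in range(num_cores)]
    loads = [0] * num_cores
    for c in sorted(cycles, reverse=True):   # decreasing cycles
        k = loads.index(min(loads))          # shortest schedule so far
        cores[k].append(c)
        loads[k] += c
    return cores, loads

def energy(cycles_on_core, freq):
    """Toy DVS energy model: E ~ cycles * f^2 (dynamic power scales
    roughly with f^3, execution time with 1/f), so slack before the
    deadline lets a lowered frequency save energy."""
    return sum(cycles_on_core) * freq ** 2

cores, loads = map_tasks([9, 7, 5, 4, 3], 2)
# If core 0 has deadline slack, DVS runs it below the top frequency:
full, scaled = energy(cores[0], 1.0), energy(cores[0], 0.8)
```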
NoC Mapping for Heterogeneous Multi-core Cooperative System Based on Chaotic Discrete Particle Swarm Optimization
WANG Lei, LING Xiang, HU Jian-hao
Computer Science. 2011, 38 (9): 298-303. 
Abstract PDF(493KB) ( 296 )   
RelatedCitation | Metrics
Heterogeneous multi-core cooperative network-on-chip (NoC) mapping was split into two stages: assigning the tasks to suitable IP cores, and then mapping the IP cores to appropriate NoC tiles. To deal with the different characteristics of these two successive stages, a coarse model and an accurate model of energy consumption or delay estimation were proposed respectively. A discrete particle swarm optimization with chaotic disturbance was proposed to solve the multi-objective NoC mapping problem, where a chaotic disturbance mechanism was designed to avoid getting trapped in local optimal solutions. Simulation results are significantly better than those obtained by traditional schemes.
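The chaotic-disturbance idea can be sketched on a toy assignment problem. Everything below is an illustrative assumption, not the paper's formulation: the synthetic cost stands in for the NoC energy/delay models, tasks may share tiles (real NoC mapping is typically one core per tile), and the discrete position update simply copies entries from personal and global bests. The logistic map supplies the chaotic variable that occasionally kicks a solution out of a local optimum.

```python
import random

def cost(assign, traffic):
    # Synthetic communication cost: traffic weight times |tile distance|
    # as a stand-in for hop count.
    return sum(traffic[i][j] * abs(assign[i] - assign[j])
               for i in range(len(assign)) for j in range(i))

def chaotic_dpso(traffic, tiles, particles=20, iters=200, seed=1):
    """Discrete PSO with a logistic-map chaotic disturbance."""
    random.seed(seed)
    n = len(traffic)
    swarm = [[random.randrange(tiles) for _ in range(n)]
             for _ in range(particles)]
    pbest = [s[:] for s in swarm]
    gbest = min(pbest, key=lambda s: cost(s, traffic))[:]
    x = 0.37                                  # chaotic variable
    for _ in range(iters):
        for k, s in enumerate(swarm):
            for i in range(n):                # discrete "velocity": copy a
                r = random.random()           # gene from pbest or gbest
                if r < 0.4:
                    s[i] = pbest[k][i]
                elif r < 0.8:
                    s[i] = gbest[i]
            x = 4.0 * x * (1.0 - x)           # logistic map iteration
            if x < 0.1:                       # chaotic disturbance: randomly
                s[random.randrange(n)] = int(x * 10 * tiles) % tiles
            if cost(s, traffic) < cost(pbest[k], traffic):
                pbest[k] = s[:]
                if cost(s, traffic) < cost(gbest, traffic):
                    gbest = s[:]
    return gbest, cost(gbest, traffic)

# Four tasks, two tiles; tasks 0-1 and 2-3 talk heavily to each other.
traffic = [[0, 10, 1, 1],
           [10, 0, 1, 1],
           [1, 1, 0, 10],
           [1, 1, 10, 0]]
best, best_cost = chaotic_dpso(traffic, tiles=2)
```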