Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 39 Issue Z6, 16 November 2018
  
Program Attack and Protection Based on Return-Oriented Programming
Computer Science. 2012, 39 (Z6): 1-5. 
Abstract PDF(558KB) ( 2371 )   
With the adoption of W⊕X technology, traditional code-injection attacks have been almost eliminated, and return-to-libc attacks have been greatly restrained. Under this circumstance, Hovav Shacham proposed Return-Oriented Programming (ROP). Based on stack overflow, ROP chains short instruction sequences ending in ret instructions into gadget collections with Turing-complete expressiveness, allowing an attacker to perform arbitrary computation. In this paper, we present the achievements in the ROP field and ROP's attack capability since its introduction, illustrate the development of automated ROP attacks and their current achievements, and analyze and predict the future development of ROP automation. We also discuss strategies and methods for eliminating this attack based on its characteristics, introduce existing defenses by comparing their merits and demerits, and give our own perspectives on how these defenses can be changed and improved.
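As an illustration of the chaining idea described in this abstract (a toy sketch, not code from the paper), the following Python simulator models gadgets as tiny operations dispatched by addresses popped from an attacker-controlled stack; every "return" transfers control to whatever address sits next on the stack. All addresses and gadget names are invented for the example:

```python
# Toy simulator of return-oriented chaining. Each "gadget" performs one small
# operation and then "returns", i.e. control flows to the next address on the
# attacker-controlled stack. Addresses 0x1000/0x2000/0x3000 are invented.
def gadget_load(state, stack):
    state["acc"] = stack.pop()          # pop an immediate operand off the stack
def gadget_add(state, stack):
    state["acc"] += stack.pop()
def gadget_store(state, stack):
    state["out"] = state["acc"]

GADGETS = {0x1000: gadget_load, 0x2000: gadget_add, 0x3000: gadget_store}

def run(stack):
    state = {"acc": 0, "out": None}
    while stack:
        addr = stack.pop()              # the "ret": jump to next stack address
        GADGETS[addr](state, stack)
    return state["out"]

# Attacker-controlled stack (top of stack at the end of the list): compute 2 + 3
payload = [0x3000, 3, 0x2000, 2, 0x1000]
print(run(payload))  # 5
```

The payload contains no code at all, only addresses and data, which is exactly why W⊕X does not stop this class of attack.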
Note on the Tu-Deng Conjecture
Computer Science. 2012, 39 (Z6): 6-8. 
Abstract PDF(260KB) ( 481 )   
It is a difficult challenge to find Boolean functions for symmetric ciphers that achieve many good cryptographic properties. Recently, two classes of Boolean functions with maximum algebraic immunity were proposed by Tu and Deng, based on the assumed correctness of a combinatorial conjecture about binary strings. One class consists of bent functions with maximum algebraic immunity; the other class is balanced with maximum algebraic immunity, optimal algebraic degree and good nonlinearity. The Tu-Deng conjecture has received much attention from cryptographers. This paper proves the conjecture in the case of wt(t)=k-3, and a further case follows as a corollary.
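For reference, the combinatorial conjecture in question, as commonly stated in the literature (reproduced here from the general literature, not from this abstract), is:

```latex
\textbf{Tu--Deng conjecture.} Let $k \ge 2$ and let $w(x)$ denote the Hamming
weight of the binary expansion of $x$. Then for every
$t \in \mathbb{Z}/(2^k-1)\mathbb{Z}$ with $t \neq 0$,
\[
  \#\bigl\{ (a,b) \in \bigl(\mathbb{Z}/(2^k-1)\mathbb{Z}\bigr)^2 :
    a + b \equiv t \pmod{2^k-1},\; w(a) + w(b) \le k-1 \bigr\}
  \;\le\; 2^{k-1}.
\]
```

Here wt(t), as used in the abstract, is the Hamming weight $w(t)$ of the binary expansion of $t$.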
Post-processing Method in Truly Random Number Generator
Computer Science. 2012, 39 (Z6): 9-11. 
Abstract PDF(338KB) ( 1020 )   
Several post-processing methods are theoretically analyzed in this paper, and a new post-processing method for true random number generators is proposed. The random sequence after post-processing meets the requirements of uniformity, independence and improved entropy per bit. The method applies to arrays of biased digital noise. Moreover, randomness tests were performed on the derived inner random arrays. The evaluation results show that this is a simple and effective post-processing method that meets the requirements of small area and low power, and the conclusion can be applied to smart cards.
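The abstract does not specify the proposed method, so as background here is a minimal sketch of one classic post-processing technique for biased bit sources, the von Neumann corrector (an illustration, not the paper's method):

```python
def von_neumann(bits):
    # Von Neumann corrector: read non-overlapping pairs of input bits; emit the
    # first bit when the pair differs (01 -> 0, 10 -> 1) and discard 00 and 11.
    # The output is unbiased whenever the input bits are i.i.d., at the cost of
    # throughput (at most one output bit per two input bits).
    out = []
    for b1, b2 in zip(bits[0::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

print(von_neumann([0, 1, 1, 1, 1, 0, 0, 0]))  # [0, 1]
```

Because (0,1) and (1,0) are equally likely for independent bits with any fixed bias, the emitted bits are exactly balanced, which is the property a post-processor for biased noise arrays must deliver.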
Malware Detection Model Based on the Sandbox
Computer Science. 2012, 39 (Z6): 12-14. 
Abstract PDF(264KB) ( 558 )   
The increase in computer malware crime has led researchers to pay attention to effective malware detection, and dynamic analysis based on sandbox technology has become a research hotspot. This paper proposes a behavior-analysis algorithm based on an improved attack tree, which uses an improved QEMU process virtual machine to obtain a shorter response time and a complete API sequence flow. Experimental results demonstrate the effectiveness and feasibility of this detection method.
Design of Functions of the Special Description Language for the Symmetric Cryptographic Algorithm
Computer Science. 2012, 39 (Z6): 15-17. 
Abstract PDF(226KB) ( 370 )   
The symmetric cryptographic algorithm description language (SDLSCA) describes the design ideas of an algorithm in a natural way of thinking, similar to a professional language. In the design of cryptographic algorithms, components with safety and good performance, such as substitution and permutation, are used repeatedly. Based on an analysis of the design rules and features of a large number of public cryptographic algorithms, this paper adds a function scheme to SDLSCA. Key techniques of SDLSCA functions, such as definition, calling and implementation, are presented in detail, and AES is described to illustrate the application of functions. Practice indicates that by making proper use of functions in SDLSCA, algorithm description can be simplified and efficiency and utilization improved.
Research on Hash Proof System and its Applications
Computer Science. 2012, 39 (Z6): 18-23. 
Abstract PDF(494KB) ( 1125 )   
The notion of a hash proof system was first proposed by Cramer and Shoup at Eurocrypt 2002. There have been many research results on hash proof systems since then, and several revisited versions have been presented. "Projective" and "smoothness" are two important properties of a hash proof system. Beyond serving as a means to build efficient chosen-ciphertext secure public-key encryption schemes, hash proof systems have been used in several other contexts, such as password-based authenticated key exchange, oblivious transfer, deniable authentication, zero-knowledge proofs and commitments. We address the definitions of hash proof systems, analyze the derivation relations and security levels among the variations, and discuss the applications of hash proof systems in cryptography.
Clustering Data Association Algorithm to Support Multi-target Tracking in WSN
Computer Science. 2012, 39 (Z6): 24-27. 
Abstract PDF(338KB) ( 389 )   
Multi-source data association is one of the key technologies of multi-sensor data fusion in wireless sensor networks. The joint probabilistic data association algorithm is a data association algorithm for multi-target tracking; it does not need any prior information about targets and clutter, but its computational expense is very large compared with other data association algorithms. The joint probabilistic data association algorithm based on clustering applies the idea of clustering from pattern recognition to the measurement data received from sensors. This method reduces the number of measurements and simplifies the validation matrix, thus reducing the computation of the original algorithm.
Analysis on Error and Attack Tolerance of Reply Network BBS
Computer Science. 2012, 39 (Z6): 28-30. 
Abstract PDF(342KB) ( 489 )   
Using the theory of complex networks, this paper studies the error and attack tolerance of the reply network of a Bulletin Board System (BBS). First, five failure and attack methods based on the degree and betweenness of nodes are proposed, and construction models of random networks, scale-free networks and the BBS reply network are described. Then, the definition and measurement of error and attack tolerance are presented. Finally, the networks are attacked with the five methods. The results show that attacks can break down the network quickly, while random failures have little influence, especially on the BBS reply network, meaning the reply network has weaker attack tolerance and stronger error tolerance. To reduce the damage from attacks, the important nodes in the network must be protected, which has strong practical significance.
Buffer Overflow Detection Method Based on Model Checking
Computer Science. 2012, 39 (Z6): 31-34. 
Abstract PDF(342KB) ( 625 )   
Buffer overflow has become one of the major sources of program flaws, and existing solutions for detecting buffer overflows have serious drawbacks that hinder their wider adoption by practitioners. In this paper, we survey those solutions and analyze their advantages and disadvantages. We present a model for buffer-related operations in programs, and then design a buffer overflow detection approach based on model checking. A prototype was also developed to verify the method.
Secure Key Issuing for Boneh-Boyen Identity-based Encryption
Computer Science. 2012, 39 (Z6): 35-37. 
Abstract PDF(336KB) ( 689 )   
By using a user's identity as his public key, identity-based cryptosystems have many advantages over traditional PKI-based cryptosystems. But identity-based cryptosystems also have an inherent drawback of key escrow: the key generation center knows all private keys of users. To improve the security of keys in identity-based encryption, avoiding the key escrow problem has become a hot issue. A secure key issuing scheme for Boneh-Boyen identity-based encryption is proposed, in which multiple key privacy authorities are set up in addition to the key generation center to protect the privacy of users' private keys. A rigorous security proof of the key issuing protocol in the standard model is also given, making identity-based encryption more usable in practice.
Relative Position Determination of Point and Polygon with Privacy Preserving
Computer Science. 2012, 39 (Z6): 38-40. 
Abstract PDF(333KB) ( 438 )   
Privacy-preserving computational geometry is a new research branch of secure multiparty computation. Because of the shortcomings and restrictions of existing protocols, a new protocol for determining the relative position of a point and a polygon in the semi-honest model is proposed, based on the plumb-line algorithm and the oblivious transfer protocol. Its correctness, computational efficiency and security are analyzed and proved. The new protocol can be used not only over the real number field but also with arbitrary polygons, with high efficiency.
Formalized Security Model of Identity Based Multi-Proxy Signature
Computer Science. 2012, 39 (Z6): 41-43. 
Abstract PDF(237KB) ( 369 )   
Multi-proxy signature is an extension of the basic proxy signature primitive. In a multi-proxy signature scheme, an original signer can delegate his signing rights to a group of proxy signers, and only the cooperation of all the proxy signers can generate a proxy signature on behalf of the original signer. Combining multi-proxy signatures with identity-based cryptography, several identity-based multi-proxy signature schemes have been proposed. But to date, no formalized security model has been provided for them. This paper gives a formal definition of identity-based multi-proxy signature schemes and formalizes a notion of security for them.
Study on Security Early Warning Technology of Military Network Countermeasure
Computer Science. 2012, 39 (Z6): 44-46. 
Abstract PDF(295KB) ( 709 )   
With the development of military network countermeasure technology, network security early warning management has become an essential part of constructing a military network defense-in-depth system. Focusing on the real environment of military network countermeasures and the use of security early warning mechanisms in the military network defense-in-depth strategy, this paper proposes the design ideas, goals and functional structure of a military network security early warning system featuring real-time situation awareness, coordination of technology and management, and seamless connection between peacetime and wartime. Finally, the current implementation principles and technologies of early warning systems are summarized.
Design and Implementation of the Hash Algorithm BLAKE Based on Matlab
Computer Science. 2012, 39 (Z6): 47-50. 
Abstract PDF(341KB) ( 834 )   
The hash algorithm BLAKE is one of the five final-round candidates in the SHA-3 (Secure Hash Algorithm 3) competition. This paper describes the design and implementation of BLAKE based on Matlab. Our program, with a GUI (Graphical User Interface), can be used both for practical BLAKE hash value calculation and for teaching or experiments.
Finding XSS Vulnerabilities Based on Static Analysis and Dynamic Testing
Computer Science. 2012, 39 (Z6): 51-53. 
Abstract PDF(338KB) ( 949 )   
Web applications have a variety of security vulnerabilities which can be exploited as Web applications become widely used, and cross-site scripting (XSS) is among the most common of them. Therefore, in order to detect XSS vulnerabilities in Web applications more effectively, this paper presents a method that combines static code analysis based on a tainted-mode model with dynamic testing of sanitizing units, including the definition of the source rules, sanitizing rules and receiving rules of XSS vulnerabilities and a description of the dynamic detection algorithm for sanitizing units. Analysis shows that this method can effectively find XSS vulnerabilities in Web applications.
Fast Scalar Multiplication Based on Sliding Window Technology
Computer Science. 2012, 39 (Z6): 54-56. 
Abstract PDF(283KB) ( 1445 )   
Scalar multiplication is at the heart of elliptic curve cryptosystems, and realizing efficient scalar multiplication has been a research focus of the information security field in recent years. By means of the wMOF representation of the scalar and the direct computation of 2^kQ+P, we modify the scalar multiplication algorithm based on sliding-window technology. The analysis indicates that the efficiency of the algorithm is improved obviously, the storage requirements are reduced, and the efficiency of ECC is enhanced.
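To make the baseline concrete, here is a plain left-to-right sliding-window scalar multiplication written over a generic group (a sketch of the standard technique the paper improves on, not the wMOF variant it proposes; the `add`/`double` callbacks stand in for elliptic curve point operations):

```python
def scalar_mult(k, P, add, double, w=4):
    # Left-to-right sliding-window scalar multiplication: precompute the odd
    # multiples P, 3P, ..., (2^w - 1)P, then scan the bits of k, sliding over
    # windows that begin and end with a 1 bit.
    assert k >= 1
    precomp = {1: P}
    twoP = double(P)
    for i in range(3, 1 << w, 2):
        precomp[i] = add(precomp[i - 2], twoP)
    bits = bin(k)[2:]
    Q, i = None, 0                      # Q is the running result (None = identity)
    while i < len(bits):
        if bits[i] == "0":
            Q = double(Q) if Q is not None else None
            i += 1
        else:
            j = min(i + w, len(bits))
            while bits[j - 1] == "0":   # the window must end in a 1 bit
                j -= 1
            for _ in range(i, j):       # shift the accumulator past the window
                Q = double(Q) if Q is not None else None
            val = precomp[int(bits[i:j], 2)]
            Q = val if Q is None else add(Q, val)
            i = j
    return Q

# Sanity check with integers as a stand-in group (add = +, double = x2),
# where k.P must equal ordinary multiplication:
print(scalar_mult(1234567, 1, lambda a, b: a + b, lambda a: 2 * a))  # 1234567
```

The window width w trades precomputation storage (2^(w-1) points) against the number of additions, which is the storage/speed trade-off the abstract refers to.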
Satellite Network Key Management Based on Pre-distribution According to Stochastic Matrix
Computer Science. 2012, 39 (Z6): 57-59. 
Abstract PDF(252KB) ( 513 )   
For key management schemes based on symmetric cryptography, it is crucial to balance security and performance. Considering the resource constraints of satellite network nodes, a key element distribution method based on a random matrix is introduced. It simplifies the construction of key rings and ensures that any node can establish different session keys with different nodes. A corresponding key update mechanism is also given. Simulation results show that the method guarantees security while significantly reducing the storage and computation cost of keys.
New Approach for SQL-injection Detection
Computer Science. 2012, 39 (Z6): 60-64. 
Abstract PDF(389KB) ( 726 )   
Web application security is a serious issue of information security, and SQL injection is one of the most common attacks. This paper proposes an approach to counter SQL injection which combines static mode matching and dynamic feature filtering. It automatically learns the structural features of all legal SQL statements to construct a knowledge library in a safe environment, and then matches every SQL statement against the knowledge library in the real environment. If the match succeeds, the SQL statement is considered legitimate; if it fails, the statement is not immediately judged illegal. Instead, a depth-feature check based on Value-at-Risk identifies the truly illegal SQL statements. Experimental results show that the proposed approach has good performance and protects effectively against SQL injection.
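The structure-learning idea can be sketched in a few lines: reduce each SQL statement to a "skeleton" with literals replaced by placeholders, learn the skeletons of legal statements, and match incoming statements against that set. This is an illustrative simplification, not the paper's algorithm (which adds the Value-at-Risk depth check for non-matching statements):

```python
import re

def skeleton(sql):
    # Replace literals with placeholders so only the statement structure remains.
    s = re.sub(r"'[^']*'", "?", sql)   # string literals -> ?
    s = re.sub(r"\b\d+\b", "?", s)     # numeric literals -> ?
    return re.sub(r"\s+", " ", s).strip().lower()

# "Safe environment" phase: learn the skeletons of known-legal statements.
learned = {skeleton("SELECT name FROM users WHERE id = 42")}

def is_legal(sql):
    # "Real environment" phase: a statement matches iff its skeleton was learned.
    return skeleton(sql) in learned

print(is_legal("SELECT name FROM users WHERE id = 7"))             # True
print(is_legal("SELECT name FROM users WHERE id = 7 OR '1'='1'"))  # False
```

An injected tautology changes the statement's structure (an extra `OR ?=?` clause), so it fails the match even though every literal has been masked out.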
Markov Chain Based Attack on the CSMA/CA Mechanism
Computer Science. 2012, 39 (Z6): 65-68. 
Abstract PDF(598KB) ( 582 )   
The CSMA/CA mechanism is first formally modeled. Based on Markov chain theory, attacks on CSMA/CA are analyzed from two aspects: a stochastic performance model and a bandwidth share model. After discussing typical attack methods, experiments were performed with knowledge of only partial fields in frames. The attack performance was analyzed in terms of throughput, communication efficiency and collision counts, validating the feasibility and efficiency of the attack method based on Markov chain theory.
Side Channel Risk Evaluation Based on Mutual Information Game
Computer Science. 2012, 39 (Z6): 69-71. 
Abstract PDF(261KB) ( 599 )   
The process of a side channel attack can be regarded as a mutual information game between the cryptographic device designer (the defender) and the attacker. The defender's goal is to formulate a defense strategy that reduces the local and global risk caused by side channel leakage; the attacker's goal is the contrary. From the perspective of making security strategies and reducing security risk, mutual information game theory is introduced into the decision-making processes of both parties to investigate how the choice of attack and design tactics affects security risk. Combined with quantitative methods for mutual information, an optimal tactic selection method for both attacker and designer under Nash equilibrium conditions is given, along with the mutual information payoffs of both sides at the Nash equilibrium.
Access Control Mechanism Based on Behaviour Trust in Wireless Sensor Networks
Computer Science. 2012, 39 (Z6): 72-76. 
Abstract PDF(436KB) ( 383 )   
Access control is an important application of wireless sensor networks. Previous works are implemented by exchanging keys among neighboring sensors. Considering that in many cases a sensor's behaviour is both spatially and temporally correlated, this paper proposes a distributed and localized behaviour trust mechanism based on similarity. The mechanism implements behaviour-trust access control by using statistical hypothesis tests to match the reading sequences of sensors against the statistical characteristics of their behaviour. Simulation results show that the mechanism reduces network traffic and detects as much as 93% of untrustworthy sensors when 10% of the sensors are compromised.
Principles and Improvement of Cube Attack
Computer Science. 2012, 39 (Z6): 77-80. 
Abstract PDF(286KB) ( 739 )   
This paper introduces the principles, detailed steps and algorithms of the Cube attack. By comparing the Cube attack with the AIDA attack, we analyze its principles in detail and prove that the Cube attack is equivalent in principle to higher-order differential attacks. Then, by discussing the linearity test, we conclude that the Cube attack succeeds with high probability. In particular, this paper gives algorithms for the key steps and employs them to attack reduced-round Trivium. Finally, we improve the Cube attack.
Research and Design of Arithmetic Based on Node Trustful Model
Computer Science. 2012, 39 (Z6): 81-85. 
Abstract PDF(396KB) ( 1233 )   
Node trust reflects the behavior and ability of one node with respect to another node in the network. This article first analyzes the current research status of trust models. The algorithm design for the node trust model is based on the study of local trust, recommendation trust and comprehensive trust, and probability theory is introduced to calculate node credibility. Considering the various factors influencing node trust, the identity authentication technology and the trust-degree algorithm were designed and realized.
A Brief Discussion of Penetration Testing Techniques
Computer Science. 2012, 39 (Z6): 86-88. 
Abstract PDF(250KB) ( 1998 )   
Penetration testing is a security check and audit of a network system performed from an attacker's perspective, and is a useful supplement to security assessment. According to the various types of penetration tests, this paper summarizes the objects, methods and basic procedures of penetration testing, designs a penetration testing system model, and introduces the model's architecture design, main nodes and test process.
On Economics Model of Terminal Resource Exchange under Network Environment
Computer Science. 2012, 39 (Z6): 89-92. 
Abstract PDF(423KB) ( 366 )   
The use of customers' resources in a networked environment is a problem that providers such as network providers and content providers are unwilling to discuss, yet they often use these resources without telling their customers. Furthermore, introducing economics models into resource sharing, especially as peer-to-peer networks emerge, has been a research hot topic in recent years. Modeling the exchange of terminal resources under a networked environment with economic concepts, this paper presents a simple model (the bi-party model), and upon it, by introducing a customer contribution factor, puts forward a user-involved model (the tri-party model) that enables the quantization of terminal resources. Finally, the terminal resource exchange model is reduced to a transport model, and a storage resource example illustrates that the method can obtain a solution.
Study and Implementation of Data Transfer System in Experiment of High Energy Physics
Computer Science. 2012, 39 (Z6): 93-95. 
Abstract PDF(604KB) ( 439 )   
High energy physics experiments usually produce a large amount of experimental data every day, and such experiments are cross-regional, so the experimental data needs to be transferred to remote data centers and computing centers for analysis. How to transfer the data reliably and efficiently in real time is therefore an important problem for carrying out a high energy physics experiment successfully. This paper presents a mass data real-time transfer system for the high energy physics environment. The system provides interfaces to the data generation system and the data management system, and can transmit experimental data from the experiment site to remote data centers and computing centers reliably. Its functionality includes multi-path source data scanning, data transmission, automatic release of data buffers, a data management interface, and monitoring and logging of transmission performance. The system provides multi-flow concurrent transmission to overcome the high latency of WANs and improve transmission efficiency. In addition, to address single-point problems caused by network conditions or the failure of a transmission point, the system supports data relaying, making it robust and reliable. Experiments show that the system performs well in data transmission under the high energy physics environment.
RSSI-based Centroid Location in Wireless Sensor Networks
Computer Science. 2012, 39 (Z6): 96-98. 
Abstract PDF(235KB) ( 897 )   
In wireless sensor networks, node localization is one of the core technologies. To overcome the low localization accuracy of the traditional centroid location algorithm, and based on dividing the wireless sensor network into areas, the author puts forward a renewed algorithm combining the RSSI measuring technique with the centroid algorithm. The algorithm combines the received RSSI values with the divided areas of the network to form an estimated area for the unknown node, and then locates the node. Simulation shows that compared with the traditional centroid location algorithm, the location error is significantly reduced.
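One common way to combine RSSI with the centroid idea is an RSSI-weighted centroid, sketched below. This is a generic illustration, not the paper's area-division algorithm; the reference power `p0` and path-loss exponent `n` are assumed values:

```python
def rssi_to_weight(rssi_dbm, p0=-40.0, n=2.5):
    # Invert the log-distance path-loss model: a stronger RSSI implies a
    # shorter estimated distance, which gets a larger weight. p0 (RSSI at
    # 1 m) and the path-loss exponent n are assumed values for illustration.
    d = 10 ** ((p0 - rssi_dbm) / (10 * n))
    return 1.0 / d

def weighted_centroid(anchors):
    # anchors: list of (x, y, rssi_dbm) beacons heard by the unknown node.
    weights = [rssi_to_weight(r) for _, _, r in anchors]
    total = sum(weights)
    x = sum(w * a[0] for w, a in zip(weights, anchors)) / total
    y = sum(w * a[1] for w, a in zip(weights, anchors)) / total
    return x, y

# Four anchors heard equally strongly pull the estimate to their center:
print(weighted_centroid([(0, 0, -50), (10, 0, -50), (0, 10, -50), (10, 10, -50)]))
```

With unequal RSSI values the estimate moves toward the stronger (closer) anchors, which is what reduces the error relative to the plain, unweighted centroid.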
Improved Method of Weighted Complex Networks Clustering
Computer Science. 2012, 39 (Z6): 99-102. 
Abstract PDF(341KB) ( 378 )   
To overcome the shortcomings of local convergence and poor results for multidimensional data in complex network clustering, this paper applies the NJW algorithm and a PSO clustering algorithm to the detection of cluster structure in complex networks, and designs and implements an improved method for weighted complex network clustering. Experiments demonstrate that the proposed method has high efficiency and good results on larger and more structurally complex networks.
Community Structure Detection in Complex Networks
Computer Science. 2012, 39 (Z6): 103-108. 
Abstract PDF(570KB) ( 1285 )   
Many networks of interest in the sciences, including social networks and computer networks, are found to divide naturally into communities or modules. Community structure can reflect the heterogeneity and modularity of real-world networks, and finding the communities within a network is a powerful tool for understanding its structure and functioning. We review some of the most popular methods for detecting communities, including the GN algorithm, modularity-based methods, dynamic algorithms and methods based on statistical inference. We use the standard test network, Zachary's karate club, to test the above methods, analyze their time complexity, and summarize the advantages and disadvantages of each. Finally, we discuss prospects for research on community detection methods.
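The modularity-based methods mentioned above all score a candidate partition with Newman's modularity Q; a direct (inefficient but transparent) implementation of the definition, as an illustration:

```python
def modularity(edges, communities):
    # Newman's modularity Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)
    # for an undirected graph given as an edge list (no self-loops assumed).
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    q = 0.0
    for i in deg:
        for j in deg:
            if communities[i] == communities[j]:
                a_ij = 1.0 if (i, j) in adj else 0.0
                q += a_ij - deg[i] * deg[j] / (2 * m)
    return q / (2 * m)

# Two triangles joined by a single bridge edge, split into their natural parts:
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
parts = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
print(round(modularity(edges, parts), 4))  # 0.3571
```

A positive Q means more intra-community edges than a degree-preserving random graph would give; modularity-based algorithms search for the partition maximizing this score.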
Research on Geospatial Information Service Interoperability Protocol Based on REST
Computer Science. 2012, 39 (Z6): 109-112. 
Abstract PDF(420KB) ( 386 )   
RESTful Web services make full use of the distributed characteristics of the HTTP protocol, making the development of Web services more efficient, simple and maintainable. This research focuses on a geospatial information service interoperability protocol. The resources of the service are defined by separating the operation, intention and representation defined in OGC Web services. Then, using the operations defined by HTTP and combining the "resource" concept with geospatial information "data" sharing, RESTful geospatial information services are designed as RESTWMS and RESTWFS.
Research of Barber Model Based on Semaphore with Complicated Semantics
Computer Science. 2012, 39 (Z6): 113-116. 
Abstract PDF(362KB) ( 459 )   
The sleeping barber problem in computer science is a classic inter-process communication and synchronization problem between multiple operating system processes. However, most researchers have extended the original model by introducing extra functions, ignoring the new complications arising from the semantic changes and giving no innovative solutions. Based on a real-life prototype and the semantics of the new model, this paper redefines the barber model with complicated semantics, proposes a resource-oriented analysis pattern, and provides solutions and algorithms for the inter-process synchronization problems under the complicated semantics based on the Try-P semaphore and P-V primitives.
Fast SACK Scheme for Overall Throughput Enhancement Based on Multi-homed SCTP
Computer Science. 2012, 39 (Z6): 117-119. 
Abstract PDF(316KB) ( 473 )   
This paper analyzes several algorithms for transmitting data in multi-homed scenarios and proposes a fast SACK scheme based on multi-homed SCTP. The scheme eliminates the drawback that only the sender grasps the link status. Simulation results show that the proposed scheme can improve the throughput performance of multi-homed SCTP hosts when the delays of the two paths are not balanced.
Diffusion-based Clock Synchronization in Mobile Wireless Sensor Networks
Computer Science. 2012, 39 (Z6): 120-122. 
Abstract PDF(233KB) ( 305 )   
Diffusion-based clock synchronization in mobile wireless sensor networks (mWSNs) is studied. To improve the synchronization speed, the essential factors that affect synchronization performance are analyzed, and two new schemes are put forward. Simulations illustrate that the number of synchronization operation rounds, the number of nodes in the network, and the communication range and motion speed of nodes all affect the synchronization speed in mWSNs. By giving nodes different operating probabilities, the synchronization performance can be improved within a limited total number of operations. The simulation results show that diffusion-based clock synchronization can exploit node mobility to enhance performance in mWSNs, and that the two new schemes further improve the synchronization speed.
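The core diffusion step, stripped of mobility, can be sketched as repeated pairwise clock averaging; the operating probability `p` below corresponds to the per-node operating probabilities the abstract mentions. This is a generic sketch, not the paper's two schemes:

```python
import random

def diffuse(clocks, neighbours, rounds, p=1.0, seed=1):
    # Each round, every node (independently, with operating probability p)
    # picks a random neighbour, and both set their clocks to the pair's
    # average. Repeated averaging drives all clocks toward a common value.
    rng = random.Random(seed)
    clocks = list(clocks)
    for _ in range(rounds):
        for i in range(len(clocks)):
            if neighbours[i] and rng.random() < p:
                j = rng.choice(neighbours[i])
                clocks[i] = clocks[j] = (clocks[i] + clocks[j]) / 2
    return clocks

# Four fully connected nodes with very different clocks converge to a common
# value; pairwise averaging preserves the network-wide mean (here 15).
adj = [[1, 2, 3], [0, 2, 3], [0, 1, 3], [0, 1, 2]]
synced = diffuse([0.0, 10.0, 20.0, 30.0], adj, rounds=200)
print(max(synced) - min(synced))  # spread shrinks toward 0
```

Mobility helps this process because a moving node carries its averaged clock into new neighbourhoods, mixing the network faster, which is the effect the paper exploits.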
Clustering Subsection Algorithm Based on Fuzzy Equivalence Matrix for Wireless Sensors
Computer Science. 2012, 39 (Z6): 123-124. 
Abstract PDF(209KB) ( 364 )   
The clustering method based on the fuzzy equivalence matrix is one of the classical methods of fuzzy clustering. We apply it to classify wireless sensors for the first time. The algorithm analyzes the correlation of two sensors using the Euclidean distance between them, and then derives the fuzzy equivalence matrix for partitioning. We design the algorithm, analyze its time complexity, and simulate it in Matlab. The simulation results show that the wireless sensors are classified into several classes according to their different distribution densities.
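The classical procedure can be sketched end to end: build a similarity matrix from Euclidean distances, take its transitive closure by repeated max-min composition to obtain a fuzzy equivalence matrix, then cut at a threshold λ. A Python sketch of the textbook method (the paper's simulation is in Matlab; the distance-based similarity measure here is one common choice):

```python
def fuzzy_clusters(points, lam):
    # Classical fuzzy-equivalence-matrix clustering with a lambda-cut.
    n = len(points)
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    dmax = max(dist(p, q) for p in points for q in points) or 1.0
    # Similarity matrix from Euclidean distance (closer -> more similar).
    R = [[1 - dist(points[i], points[j]) / dmax for j in range(n)] for i in range(n)]
    # Transitive closure by repeated max-min composition, R <- R o R, until stable;
    # the fixed point is the fuzzy equivalence matrix.
    while True:
        R2 = [[max(min(R[i][k], R[k][j]) for k in range(n)) for j in range(n)]
              for i in range(n)]
        if R2 == R:
            break
        R = R2
    # lambda-cut: i and j fall in one cluster when R[i][j] >= lam.
    clusters, seen = [], set()
    for i in range(n):
        if i not in seen:
            c = [j for j in range(n) if R[i][j] >= lam]
            seen.update(c)
            clusters.append(c)
    return clusters

# Two tight pairs of sensor positions separate cleanly at lambda = 0.9:
print(fuzzy_clusters([(0, 0), (0.1, 0), (5, 5), (5.1, 5)], 0.9))  # [[0, 1], [2, 3]]
```

Raising λ yields finer partitions and lowering it coarser ones, so a single equivalence matrix supports a whole hierarchy of sensor groupings.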
Study of Network Quality Evaluation Based on IP Application
Computer Science. 2012, 39 (Z6): 125-128. 
Abstract PDF(341KB) ( 561 )   
D-S evidence theory, widely used in the information fusion field, is introduced into the network evaluation area. Taking network applications as a guide, we select the KPIs that mainly carry the application in the evaluated network as indicator-level inputs. An Internet quality evaluation hierarchy is established, using the AHP method to assign basic probabilities. A two-level D-S structure is used to fuse evidence and obtain the network quality evaluation results. The calculated results show that the model using two levels of D-S evidence fusion eliminates uncertainty to a great extent, and it has good practicality.
Improvement of Node Localization in Wireless Sensor Networks Based on Quantum-behaved Particle Swarm Optimization
Computer Science. 2012, 39 (Z6): 129-131. 
Abstract PDF(252KB) ( 322 )   
Focusing on the requirements of low cost and high accuracy in wireless sensor networks (WSNs), an improved weighted centroid localization algorithm (WCLA) based on the received signal strength indicator (RSSI) is introduced, which uses quantum-behaved particle swarm optimization (QPSO) to optimize the WCLA coordinate estimates and decrease the localization error; moreover, the convergence rate is quickened by improving the expansion/contraction coefficient. Simulation shows that the localization accuracy of the new algorithm is significantly superior to that of the weighted centroid localization algorithm and of weighted centroid localization optimized by PSO, and it overcomes PSO's shortcomings of slow convergence and easily falling into local minima.
Study of Self-organizing Network Routing Model
Computer Science. 2012, 39 (Z6): 132-135. 
Abstract PDF(426KB) ( 366 )   
The study of self-organizing networks is a hotspot in next generation network (NGN) research, and P2P is one of the four technologies that will shape the Internet's future. By building a P2P overlay network on top of the Internet's physical topology based on the P2P computing model, we can effectively build a fully decentralized, Internet-based self-organizing network routing model: the Hierarchical Aggregation Self-organizing Network (HASN). The target and architecture of HASN are described in this paper, along with a detailed description of a P2P decentralized naming, route discovery and updating algorithm, HASN_Scale. Simulation results verify the performance of HASN.
Research of Virus Spreading Based on Shortest Path Immunization Strategy in Scale-free Network
Computer Science. 2012, 39 (Z6): 136-138. 
Abstract PDF(262KB) ( 421 )   
RelatedCitation | Metrics
Most traditional virus immunization strategies are based on global network topology information; however, for most real-life complex networks only local topology information is known. In view of the scale-free property of many real-life complex networks, virus spreading with a shortest-path immunization strategy based on local topology information in a scale-free complex evolving network was studied. This article used mean-field theory to build a virus spreading model in a scale-free network, introducing an immunization strategy based on the shortest path and a key factor: individual resistance. After comparing the efficiency of random immunization, targeted immunization and shortest-path immunization against virus spreading in scale-free complex networks, the results indicate that the immunization strategy based on the shortest path is notably effective.
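A degree-based mean-field SIS iteration of the kind the abstract describes can be sketched as below. The parameter values and the way immunization enters (a fraction `g` of every degree class made immune) are illustrative assumptions, not the paper's model.

```python
def sis_mean_field(lam, mu, degrees, pk, g=0.0, steps=500):
    """Degree-based mean-field SIS iteration: rho[k] is the infected density
    of degree class k, theta the probability a random edge points at an
    infected node; a fraction g of every class is immune (assumption)."""
    kavg = sum(k * p for k, p in zip(degrees, pk))
    rho = [0.01] * len(degrees)                   # small initial infection
    for _ in range(steps):
        theta = sum(k * p * r for k, p, r in zip(degrees, pk, rho)) / kavg
        rho = [min(max(r + (1.0 - r - g) * lam * k * theta - mu * r, 0.0), 1.0)
               for k, r in zip(degrees, rho)]
    return sum(p * r for p, r in zip(pk, rho))    # overall prevalence
```

Running it with a truncated power-law degree distribution shows the expected effect: immunizing part of the population lowers the endemic prevalence.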
Studying Reliability of Warship Fleet Complex Networks Based on Repair Tactics
Computer Science. 2012, 39 (Z6): 139-141. 
Abstract PDF(257KB) ( 298 )   
RelatedCitation | Metrics
To study the robustness of complex networks under attack and repair, we introduced a repair model of complex networks. Based on the model, we introduced two new quantities, i.e. the attack fraction f and the maximum degree Ka of the nodes that have never been attacked, to study analytically the critical attack fraction and the relative size of the giant component of complex networks under attack and repair, using the method of generating functions. We showed analytically and numerically that the repair strategy significantly enhances the robustness of scale-free networks, and that the improvement in robustness is greater for scale-free networks with smaller degree exponents. We discussed the application of our theory to understanding the robustness of complex networks with reparability.
Analysis on Scalability of P2P Streaming System Based on Flash Crowd
Computer Science. 2012, 39 (Z6): 142-145. 
Abstract PDF(322KB) ( 343 )   
RelatedCitation | Metrics
Peer-to-Peer (P2P) live streaming systems have recently shown great potential on the Internet. However, large-scale deployment of such systems relies heavily on their efficiency in dealing with highly dynamic changes, especially during the flash crowd period. The main reason is that the expansion of a P2P live streaming system largely depends on the time demand of streaming media applications. Based on the proposed analysis and experiments, we found the inherent relation between the system scale and time, as well as its constraints. We constructed a generic model for P2P streaming systems to analyze the process by which nodes join the system during a flash crowd. The paper first argued that simply using the "supply-demand" concept model to describe the system scale is not enough, and then derived the upper bound of system scale over time under a random partner selection strategy such as a Gossip protocol. Finally, we showed clearly the impact of critical factors on system scalability through a comparative analysis on the Matlab R2010a platform.
Research on IS-IS Interoperability Testing
Computer Science. 2012, 39 (Z6): 146-150. 
Abstract PDF(443KB) ( 586 )   
RelatedCitation | Metrics
The Internet has become an integral part of our lives, and it is important to have routing protocols that are effective and stable. IS-IS is a major interior gateway protocol which is widely applied in large networks. We researched IS-IS interoperability testing and established a layered IS-IS protocol interoperability testing interactive model based on an Extended Petri Net. Then we analyzed the Petri net model using its reachability graph. According to the test purposes and generated test sequences, we programmed test cases and added some necessary test cases manually as a complement to the test set. Finally, we built the test platform for interoperability testing and analyzed the test results.
Analysis Model for the Energy Efficiency of TDMA MAC for Wireless Sensor Networks
Computer Science. 2012, 39 (Z6): 151-153. 
Abstract PDF(581KB) ( 549 )   
RelatedCitation | Metrics
We presented a controllable-threshold polling medium access protocol designed for wireless sensor networks. Although polling, as one kind of TDMA medium access control protocol, has the advantage of energy conservation, its extensibility is not as good as that of contention-based protocols. The main goal of the MAC protocol is to realize self-organization in slot assignment and synchronization. Embedded Markov chain theory and the probability generating function were used to propose an energy-efficiency analysis model for wireless sensor networks. Finally, mathematical analyses and simulation results show that the proposed model is suitable for analyzing the energy efficiency of MAC protocols for wireless sensor networks.
CRC Lookup-table and its Implementation by Parallel Matrix
Computer Science. 2012, 39 (Z6): 154-158. 
Abstract PDF(287KB) ( 1841 )   
RelatedCitation | Metrics
Cyclic redundancy check (CRC) is already widely used in the field of communication. However, a straightforward bit-serial CRC implementation cannot meet the requirements of high-speed links. With a lookup table or a parallel algorithm, this speed bottleneck can be largely resolved. The relation between the lookup table and the parallel matrix was investigated, from which a lookup table for a polynomial of any order and any processing bit width can be obtained, together with the deriving procedure of the block method. Comparative analysis of the performance of the lookup table, the parallel matrix and the block method shows that less time is consumed as the processing width increases, that the parallel matrix performs better in memory-space requirements, and that by reducing the length of the checking sequence, the computational speed of the block method is increased significantly.
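A minimal example of the table-driven approach, using the standard reflected CRC-32 polynomial 0xEDB88320: one table lookup replaces eight bit-serial steps per input byte. This illustrates the general technique only; the paper's own polynomials, matrix derivation and block method are not reproduced here.

```python
def make_crc32_table():
    """Precompute the 256-entry byte-wise table for reflected CRC-32."""
    table = []
    for byte in range(256):
        c = byte
        for _ in range(8):
            c = (c >> 1) ^ 0xEDB88320 if c & 1 else c >> 1
        table.append(c)
    return table

def crc32(data, table):
    """Table-driven CRC-32: one lookup per byte instead of 8 bit shifts."""
    crc = 0xFFFFFFFF
    for b in data:
        crc = (crc >> 8) ^ table[(crc ^ b) & 0xFF]
    return crc ^ 0xFFFFFFFF
```

For the standard test vector "123456789" this yields the well-known check value 0xCBF43926, matching zlib's CRC-32.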
Self-information Algorithm
Computer Science. 2012, 39 (Z6): 159-162. 
Abstract PDF(341KB) ( 340 )   
RelatedCitation | Metrics
In most studies, super-peer selection only considers whether a node's performance is suitable for a super-peer, without taking into account the speed at which the network is rebuilt when a node leaves or joins. In existing algorithm studies, on-line time and capacity have seldom been considered at the same time. We improved on Alberto Montresor's algorithm: starting from the concept of self-information, and combining node capacity and on-line time, a super-peer selection algorithm was designed. The results show that the network-building speed of the self-information algorithm is similar to that of the capacity-only approach, while the selected super-peers are more stable. The self-information algorithm effectively reduces the frequency of network rebuilding, thereby reducing network traffic costs. In some catastrophic situations, for example removing 50% of the nodes from the network, it can be rebuilt quickly. This indicates that the network built by our algorithm is robust.
Research of Probability-based Greedy Spectrum Decision in Cognitive Wireless Mesh Networks
Computer Science. 2012, 39 (Z6): 163-165. 
Abstract PDF(244KB) ( 360 )   
RelatedCitation | Metrics
In traditional spectrum decision methods for cognitive radio networks, secondary users select their communication channels based on various criteria, which may result in channel contention and congestion. Considering probability-based spectrum decision in cognitive wireless mesh networks, a greedy channel selection algorithm was proposed. When spectrum handoff occurs, the proposed algorithm is combined with an improved preemptive resume priority (PRP) M/G/1 queueing model to select optimal target channels for secondary users. Numerical analysis results show that the algorithm can increase and balance channel utilization. With a lower average arrival rate of secondary users, the average overall transmission time of the secondary users is further reduced and network performance is enhanced.
TOA Estimation and Integration Method Based on Multi-antenna System
Computer Science. 2012, 39 (Z6): 166-168. 
Abstract PDF(223KB) ( 550 )   
RelatedCitation | Metrics
In urban environments where direct paths are seriously blocked, the non-line-of-sight (NLOS) propagation effect significantly degrades the accuracy of TOA estimation and hence of location estimation. We developed a new algorithm to mitigate NLOS errors in location estimation based on a multi-antenna system. Utilizing the TOAs obtained from the different antennas of a multi-antenna array, a high-accuracy TOA is obtained by signal processing and integration. Simulation results show that this algorithm can mitigate NLOS errors effectively and significantly improve the performance of TOA-based location algorithms in NLOS environments.
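One simple way to integrate per-antenna TOAs in the presence of NLOS bias can be sketched as below. Since NLOS propagation only adds positive delay, earlier arrivals are more trustworthy; the exponential down-weighting of late arrivals, the `sigma` parameter, and the function name are illustrative assumptions, not the paper's integration method.

```python
import math

def fuse_toa(toas, sigma=1.0):
    """Fuse per-antenna TOA estimates. NLOS only adds positive delay,
    so weight each TOA by its closeness to the earliest arrival
    (exponential weighting is an assumption of this sketch)."""
    t_min = min(toas)
    weights = [math.exp(-(t - t_min) / sigma) for t in toas]
    return sum(w * t for w, t in zip(weights, toas)) / sum(weights)
```

A strongly NLOS-biased antenna then contributes little, so the fused TOA lands near the earliest arrivals rather than at the plain mean.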
Study on the Performance of the Self-healing for ZigBee Routing Fault in OPNET
Computer Science. 2012, 39 (Z6): 169-170. 
Abstract PDF(285KB) ( 568 )   
RelatedCitation | Metrics
ZigBee, a wireless personal area network (WPAN) technical standard based on IEEE 802.15.4, is increasingly popular thanks to its low power consumption, easy extensibility and large network capacity. Its unique mesh network has very strong self-healing ability, re-forming the ad hoc network when a node fails. The powerful self-healing property of the ZigBee network was verified by simulating dynamic routing fault scenarios in OPNET.
Research on the Attenuation Characteristics of the 2. 4GHz Wireless Signal Affected by Plant
Computer Science. 2012, 39 (Z6): 171-173. 
Abstract PDF(246KB) ( 700 )   
RelatedCitation | Metrics
Wireless sensor networks are applied to greenhouse agriculture more and more widely, which places higher requirements on the quality and reliability of network communication. In a greenhouse, the wireless signal decays with the density, height and distance of intervening plants, which degrades network performance. By measuring the received signal power with a CC2430 chip under different conditions, the relationship between the attenuation of the 2.4GHz signal and the distance between the plant obstacles and the signal source was studied. The results can provide a design basis for the efficient distribution of wireless nodes and the optimization of network topology.
Dynamic and Low Redundancy Routing Algorithm Based on Probability Distribution in Mobile Sensor Networks
Computer Science. 2012, 39 (Z6): 174-177. 
Abstract PDF(415KB) ( 322 )   
RelatedCitation | Metrics
Because of node mobility, limited storage space, limited energy and other factors, delay tolerant mobile sensor networks (DTMSN) suffer from high packet loss rates and high data redundancy, so traditional deterministic routing mechanisms can't be directly applied to DTMSNs. A dynamic and distributed routing algorithm was proposed, based on a node's probability distribution of connecting with the sink within a given number of time slots. The algorithm makes full use of node mobility so that the network can maintain low redundancy and achieve a high data delivery rate at the same time. Experimental results show that the proposed algorithm has greater advantages when the network has more nodes with faster speeds and larger transmission ranges.
Recent Advances and Trends of Peer-to-Peer Networking Research
Computer Science. 2012, 39 (Z6): 178-183. 
Abstract PDF(540KB) ( 1015 )   
RelatedCitation | Metrics
After extensive investigation and application in past years, the research efforts of Peer-to-Peer (P2P) networking have turned to the improvement of P2P system design and deployment, application extension, and standardization, rather than the development of P2P protocols and algorithms as before. In this survey, we presented recent advances in P2P networking research, including theoretical modeling and analysis of P2P systems, hybrid systems of content delivery networks (CDN) and P2P, P2P traffic control, application of network coding to P2P systems, P2P application extension, and standardization. Finally, we summarized the challenges for further study and deployment of P2P systems.
Least Square Timing Acquisition Algorithm for Pulse Amplitude Modulation-ultrawide Band Signals
Computer Science. 2012, 39 (Z6): 184-186. 
Abstract PDF(227KB) ( 332 )   
RelatedCitation | Metrics
A rapid timing acquisition algorithm based on least square estimation (LSE) was proposed, developed for UWB (ultra-wideband) communication with specially designed training symbols. Using an almost perfect sequence, a novel training sequence was judiciously designed. Compared to conventional algorithms requiring GHz sample rates, the algorithm can realize timing acquisition at the receiver using simple correlation operations over samples at the symbol rate (several MHz to several hundred MHz), which considerably reduces implementation complexity. The computational effort of the algorithm is only a third of that of the conventional algorithm. The acquisition performance in a multipath channel was also evaluated by simulation, which indicates that with a 12-bit training sequence the scheme achieves high synchronization performance, and with a 24-bit training sequence the bit error rate (BER) is close to that of the case with perfect timing.
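The symbol-rate correlation operation mentioned above can be illustrated with a simple sliding-correlator sketch. The Barker-11 sequence and zero padding in the example are illustrative stand-ins, not the paper's almost perfect training sequence or its LSE refinement.

```python
def acquire_timing(samples, training):
    """Coarse timing acquisition sketch: slide the known training sequence
    over the received samples and return the offset with the largest
    correlation (here a plain inner product over real samples)."""
    n = len(training)
    best_off, best_corr = 0, float("-inf")
    for off in range(len(samples) - n + 1):
        corr = sum(samples[off + i] * training[i] for i in range(n))
        if corr > best_corr:
            best_off, best_corr = off, corr
    return best_off
```

With a sequence whose autocorrelation sidelobes are small (e.g. a Barker code), the correlation peak at the true offset stands well clear of all others.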
Equipment Simulation Model Based on SNMP and Component Assembly Technology
Computer Science. 2012, 39 (Z6): 187-189. 
Abstract PDF(254KB) ( 401 )   
RelatedCitation | Metrics
Based on the SNMP protocol and component assembly technology, a network equipment simulation model was built to reflect the true operational status of equipment. We can get or set the device operating status with the SNMP protocol, such as the running status of the CPU, memory and power supply. We obtain an overall visual simulation model by using component assembly technology to assemble the network device components, including the engine board, business boards, power and fan components, etc.; the operation of these components depends on the SNMP protocol. The simulation model of network equipment can improve network management efficiency: an operation on the simulation model is equivalent to operating the real equipment.
Integrated Management Platform of Nuclear Fuel Storage and Transportation Based on RFID
Computer Science. 2012, 39 (Z6): 190-194. 
Abstract PDF(462KB) ( 463 )   
RelatedCitation | Metrics
This paper described an integrated system model to improve the work efficiency and optimize the control measures of nuclear fuel storage and transportation. RFID and information integration technology were introduced, and traditional management processes were innovated in the data acquisition and monitoring fields as well. System solutions and a design model were given, emphasizing the following key technologies: cascade protection of the information system, the security protocol for RFID information, and the collision algorithm.
Research on Developing Telemedicine Based on Body Area Networks and Cloud Computing
Computer Science. 2012, 39 (Z6): 195-200. 
Abstract PDF(597KB) ( 563 )   
RelatedCitation | Metrics
In the 21st century, network technology and mobile communication technology are gradually entering the health services field. The combination of body area networks, broadband mobile communications and cloud computing makes medical applications for a large population possible. The development of digital medical technology, especially telemedicine, increasingly turns out to be an important method to significantly reduce the costs of health care and medical treatment, change the uneven allocation of medical resources, and improve health care as a whole. There is much research on body area network technology, cloud computing storage technology, health assessment systems, telemedicine and home-care models at home and abroad, which has put forward a number of innovative theories and models. However, there are still many problems and difficulties. In digitalized medical applications that combine information technology and medical care, it has become a vital strategy and development trend of medical reform in many countries to integrate body area networks, cloud computing and other advanced information technologies in order to address the existing problems in applications one by one, and to innovatively establish the collection and transmission of personal health information, the automation of health information processing, and the remote operation of health management. In this way, we can provide telemedicine service "timely", "appropriately" and with "no boundaries", and improve the level of health and medical care and the quality of people's lives. This paper provided a general review and outlook of the development in this area and analyzed typical solutions and application cases.
Main Character and Basic Theory for Internet of Things
Computer Science. 2012, 39 (Z6): 201-203. 
Abstract PDF(248KB) ( 1115 )   
RelatedCitation | Metrics
The "thing" is the object connected to the Internet of Things (IoT), and its attributes determine IoT characteristics and research methods. Analysis of the main attributes indicates that things are man, service, substance and product, represented by agents. IoT's main features are a self-feedback architecture, 3C, security, complex networks and a complex ecosystem, and the IoT complex ecosystem theory model is composed of TSESOI (Time, Space, Energy, Structure, Order, Information). Analysis of the relationship between the basic concepts and network applications gives a theoretical foundation architecture for IoT.
Safe Delivery of Subscribe Information in the Internet of Things
Computer Science. 2012, 39 (Z6): 204-206. 
Abstract PDF(278KB) ( 315 )   
RelatedCitation | Metrics
This paper presented a safe delivery method with low load and self-configuration. The method delivers subscription information effectively in large-scale networking applications. It allows the terminal to complete more work, following the principle of the lowest network overhead. Meanwhile, it provides a network self-configuration mechanism with good fault tolerance to deal with node failures and the addition of new nodes, and thus better adapts to large-scale networking applications. Experiments proved the correctness of the delivery method.
Comparison Research on MDA, Cloud Computing and SOA
Computer Science. 2012, 39 (Z6): 207-209. 
Abstract PDF(292KB) ( 462 )   
RelatedCitation | Metrics
Cloud computing is only effective for specific business enterprises and units that match particular conditions; it isn't a good choice for those enterprises which hope to acquire better competitiveness. Although service-oriented architecture (SOA) helps an enterprise to quickly carry out system integration, finding and utilizing services remains difficult as the number of services grows, so SOA isn't a good choice either. Model driven architecture (MDA) provides a business application platform for the enterprise, with a methodology and standards independent of the technical realization. The application of model-driven components and the management of model databases and component databases will greatly support enterprise informatization.
Design Case of Power and Environment Monitoring System Based on Internet of Things
Computer Science. 2012, 39 (Z6): 210-211. 
Abstract PDF(267KB) ( 455 )   
RelatedCitation | Metrics
Based on the three-layer framework of the Internet of Things, comprising the data perception layer, transport layer and application layer, and combined with the characteristics of the demand for power and environment monitoring, the paper put forward a three-layer model of a power and environment monitoring system for the Internet of Things. A specific design case was also given, analyzing the monitoring requirements and designing the functions of the central processing machine. Finally, the paper provided a design layout for the monitoring room and operation room. The model and design case provide a reference for enterprises implementing feasible power and environment monitoring systems based on the Internet of Things.
Design of Intelligent Logistics System Based on Cloud Computing and Internet of Things
Computer Science. 2012, 39 (Z6): 212-213. 
Abstract PDF(269KB) ( 391 )   
RelatedCitation | Metrics
This paper analyzed the current problems of the military logistics guarantee system, explored the advantages of a military logistics guarantee system based on the combination of the Internet of Things and cloud computing, presented the concept of "intelligent logistics" and the corresponding infrastructure, and designed and realized an oil management system based on Internet of Things and cloud computing technology.
Study of Security Problem Based on RFID System
Computer Science. 2012, 39 (Z6): 214-216. 
Abstract PDF(262KB) ( 1217 )   
RelatedCitation | Metrics
With the rapid development and wide application of RFID technology, the security threats to RFID systems and the malware targeting them are also developing quickly. In order to defend against them, we need to understand their principles better. This paper analyzed the existing security problems and malware of RFID systems in detail and gave some defense measures. Finally, according to the characteristics of RFID systems, the paper proposed an immune model for RFID malware.
Data Mining Service Model in Cloud Computing Environment
Computer Science. 2012, 39 (Z6): 217-219. 
Abstract PDF(363KB) ( 862 )   
RelatedCitation | Metrics
Data mining in the cloud computing environment was proposed as a solution to the task of analyzing distributed massive data in networks and to promote the development, integration and business application of data mining. The solving mechanism of the data mining service was explained through the computing capability and service model of cloud computing. Data mining in the cloud computing environment is a service model for information resources in networks. On this basis, the data mining service architecture was constructed, the data mining service creation procedure was designed, the system architecture of the data mining service model was depicted, and the service process of data mining was defined. The data mining service model in cloud computing was formed consequently.
Research on Improving Hadoop Job Scheduling Based on Learning Approach
Computer Science. 2012, 39 (Z6): 220-222. 
Abstract PDF(345KB) ( 447 )   
RelatedCitation | Metrics
With parallel computing, distributed computing and grid computing technology, cloud computing was proposed as a new model and is developing fast. Hadoop is a widely used open-source cloud computing system, and job scheduling is one of the core problems on the Hadoop platform. Through understanding and analyzing the existing scheduling algorithms for Hadoop, a learning approach was applied: the past history of nodes and job attributes were used to improve job scheduling. A feature-weighting-based naive Bayes classification algorithm was applied to improve task scheduling, and the result was verified through experiments. It improves the efficiency of task allocation scheduling for Hadoop.
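A minimal sketch of feature-weighted naive Bayes classification of the kind mentioned: each feature's log-likelihood is scaled by a per-feature weight. The class labels, feature values and weights below are invented toy examples; the real scheduler's feature set and weighting scheme are not reproduced.

```python
from collections import defaultdict
import math

class WeightedNaiveBayes:
    """Categorical naive Bayes where feature i's log-likelihood is scaled
    by weights[i] (feature-weighting sketch; features here are toy values)."""
    def __init__(self, weights):
        self.weights = weights
        self.class_count = defaultdict(int)
        self.feat_count = defaultdict(int)       # (cls, i, value) -> count

    def fit(self, X, y):
        for xs, c in zip(X, y):
            self.class_count[c] += 1
            for i, v in enumerate(xs):
                self.feat_count[(c, i, v)] += 1

    def predict(self, xs):
        total = sum(self.class_count.values())
        best, best_lp = None, float("-inf")
        for c, n in self.class_count.items():
            lp = math.log(n / total)             # class prior
            for i, v in enumerate(xs):
                p = (self.feat_count[(c, i, v)] + 1) / (n + 2)  # add-one smoothing
                lp += self.weights[i] * math.log(p)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

In a scheduler, such a classifier could, for instance, predict whether a node is likely to finish a task quickly before assigning it work.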
Service-oriented Architecture of Expandable IOT Combined with Cloud Processing and its Application Research
Computer Science. 2012, 39 (Z6): 223-225. 
Abstract PDF(517KB) ( 305 )   
RelatedCitation | Metrics
The Internet of Things (IoT) is a hot spot of current research and will bring about a trillion-level information industry. The architecture of the Internet of Things has been divergent since the emergence of IoT; there are mainly three-layer, four-layer and five-layer architectures, etc. In this paper, various IoT architectures were studied, and the advantages and disadvantages of their performance and the suitable fields of their application were analyzed and compared. Meanwhile, a service-oriented architecture of IoT was put forward to provide a reference model for the various applications of IoT.
Hot Event Sentiment Analysis Method in Micro-blogging
Computer Science. 2012, 39 (Z6): 226-228. 
Abstract PDF(376KB) ( 738 )   
RelatedCitation | Metrics
Micro-blogging, as a new form of social communication, has attracted more and more attention in the last few years. The simplicity of information and the diversity of communication modes in micro-blogging make it an important channel for seeking information about hot events and expressing personal views. By analyzing and monitoring sentiment information in micro-blogging posts, we have opportunities to gain insight into users' emotional trends on hot events, which can help us evaluate and grasp the current situation of hot events. Therefore, we proposed a hot event sentiment analysis method for micro-blogging. This method first automatically detects different aspects of an event from the user perspective, and then performs sentiment analysis and emotion monitoring on each aspect. Additionally, we built a novel hot event sentiment analysis prototype system. The experimental results show that micro-blogging is effective for monitoring hot events on the World Wide Web.
Web-oriented Spatial Information Extraction System
Computer Science. 2012, 39 (Z6): 229-231. 
Abstract PDF(357KB) ( 341 )   
RelatedCitation | Metrics
With the development of network technology, the Internet has become a complex and diverse data source; especially with the rise of Web 2.0 and social networking, each Internet user can be seen as a spatial sensor from the spatial information perspective, and massive spatial information surrounding them has been published on the Internet. Spatial information on the Internet is becoming increasingly rich. This paper presented a Web-oriented spatial information extraction system. In this system, we recognize geographically meaningful addresses in non-structured data in Web pages, extract the attribute information related to each address, and map this attribute information to spatial entities by address geocoding. Finally, we implemented a prototype to prove the feasibility of the system.
Quantitative Analysis and Extraction Algorithm of Collocations Based on Typical Patterns
Computer Science. 2012, 39 (Z6): 232-234. 
Abstract PDF(355KB) ( 1231 )   
RelatedCitation | Metrics
The shortcomings of existing automatic extraction algorithms were analyzed, and a new model was proposed that tries to transform unstructured language knowledge into structured language knowledge. The language knowledge was introduced into an extraction model based on typical patterns, and collocations were classified with nouns, verbs and adjectives as centers and substantives as backbones. Then co-occurrence frequency, mutual information (MI) and other measures were used for screening in a large-scale corpus. Finally, this knowledge was solidified to build a collocation database.
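The mutual-information screening step can be illustrated with pointwise mutual information (PMI) computed from corpus counts; a candidate collocation with a high PMI co-occurs far more often than chance would predict. The counts in the example are toy values, not from the paper's corpus.

```python
import math

def pmi(pair_count, w1_count, w2_count, total):
    """Pointwise mutual information of a candidate collocation:
    log2( P(w1,w2) / (P(w1) * P(w2)) ), with counts over `total` windows."""
    p_xy = pair_count / total
    p_x = w1_count / total
    p_y = w2_count / total
    return math.log2(p_xy / (p_x * p_y))
```

A screening pass would then keep only candidates whose co-occurrence frequency and PMI both exceed chosen thresholds.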
Research on Q-learning Algorithm Based on Multi-standard of Reward
Computer Science. 2012, 39 (Z6): 235-237. 
Abstract PDF(237KB) ( 315 )   
RelatedCitation | Metrics
The traditional Q-learning algorithm is based on a single reward standard; when the environment and state change, a single reward standard may not be able to adapt to the new environment and state in a multi-agent system (MAS), and may instead restrict learning efficiency. This paper proposed a multi-agent Q-learning algorithm with multiple reward standards. It adapts well to changing environments and states and completes the task in stages, with different stages using different standards, so each stage goal can be reached quickly. The simulation platform is the pursuit problem in a three-dimensional world, with increased difficulty of rounding up and increased complexity of the environment and state. Simulation results show that the Q-learning algorithm based on multiple reward standards can flexibly adapt to different environments and states, and efficiently complete learning tasks.
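A tabular Q-learning sketch in which the reward standard is supplied as a function, so a different standard can be plugged in at each training stage as the abstract proposes. The chain environment and all parameter values are illustrative assumptions, not the paper's three-dimensional pursuit setup.

```python
import random

def q_learning(reward_fn, n_states=5, episodes=300, alpha=0.5,
               gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a small chain: action 0 moves left, action 1
    moves right, and the episode ends at the last state. reward_fn(state)
    supplies the current stage's reward standard, so standards can be
    swapped between stages."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else int(q[s][1] >= q[s][0])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = reward_fn(s2)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

# Stage standard for this sketch: reward only on reaching the goal state.
q = q_learning(lambda s: 1.0 if s == 4 else 0.0)
```

Changing stages then just means calling the same learner with a different `reward_fn`, e.g. one that also rewards intermediate waypoints.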
Robust Model and Algorithms for the Uncertain Traveling Salesman Problem
Computer Science. 2012, 39 (Z6): 238-241. 
Abstract PDF(303KB) ( 439 )   
RelatedCitation | Metrics
The traveling salesman problem is an important combinatorial optimization problem; here the uncertain traveling salesman problem was considered. For each edge, interval data were used to describe the uncertain travel time. In the framework of robust optimization, a model was developed whose prominent feature is that the degree of conservativeness is adjustable. An exact algorithm and an ant colony optimization approach were proposed for this model. The experimental results show that, compared with the exact algorithm, the proposed ant colony optimization approach can obtain optimal or near-optimal solutions in much less time. Finally, the properties of the model were studied; the results support that the robust solution is useful in uncertain environments.
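The adjustable-conservativeness idea can be illustrated with a Bertsimas-Sim-style worst-case tour evaluation: a budget `gamma` bounds how many edges the adversary may push to the upper ends of their intervals. This sketches the robust objective only, under that assumed uncertainty model; the paper's exact formulation and its ant colony search are not reproduced.

```python
def robust_tour_time(tour, t_lo, t_hi, gamma):
    """Worst-case travel time of a tour with interval edge times [lo, hi]:
    the adversary raises at most `gamma` edges to their upper bounds.
    gamma = 0 is the optimistic case; gamma = len(tour) is fully
    conservative (all edges at their upper bounds)."""
    edges = list(zip(tour, tour[1:] + tour[:1]))          # close the cycle
    base = sum(t_lo[e] for e in edges)
    deviations = sorted((t_hi[e] - t_lo[e] for e in edges), reverse=True)
    return base + sum(deviations[:gamma])                 # worst gamma deviations
```

A robust search (exact or ant colony) would then minimize this objective over tours for the chosen `gamma`.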
Process Quality Analysis Based on BI
Computer Science. 2012, 39 (Z6): 242-244. 
Abstract PDF(238KB) ( 353 )   
RelatedCitation | Metrics
Process quality is an important part of quality control. The major problems that now exist in process quality control are the lack of methods for quality diagnosis and improvement, and the fact that historical data are not effectively utilized. BI (Business Intelligence) is a recent computer technology; this paper applied it to process quality control, integrating all process quality data, finding hidden quality problems by data mining, and assisting operating staff to control quality and make correct decisions to improve the rate of satisfactory quality.
Trust Value-added Service Selection Algorithm of Workflow Based on Discrete Quantum Particle Swarm Optimization
Computer Science. 2012, 39 (Z6): 245-248. 
Abstract PDF(427KB) ( 369 )   
RelatedCitation | Metrics
With the development of service-oriented computing, users' tasks become more and more complicated. Thus, how to integrate existing component services effectively to form new value-added services, i.e. composite services that can satisfy complex tasks, has become a hot research topic. For this service selection problem of value-added services, this paper designed a discrete quantum particle swarm algorithm for enhancing trust. Compared with traditional service selection algorithms, which are QoS-oriented global optimizations, this algorithm considers the issue of trust in services, solving the trust problem in service workflows. At the same time, the algorithm discretizes the quantum particle swarm according to the features of the workflow, redefining the computation of particle positions and the auto-regulation of the weight coefficient in the quantum particle swarm algorithm for the service selection scenario. Compared with other similar research work, the service selection time is reduced and a better fitness value can be obtained by this method. Simultaneously, the success rate of service selection is raised because the issue of trust is considered.
Method to Identify the Future Stock Investor Sentiment Orientation on Chinese Micro-blog
Computer Science. 2012, 39 (Z6): 249-252. 
Abstract PDF(337KB) ( 428 )   
RelatedCitation | Metrics
Recently, micro-blogs have attracted more and more interest from Internet users; thousands of users share their views and opinions through micro-blogs. There are a large number of texts with sentiment orientation (thinking something is "good" or "bad") on micro-blogs, and these texts reflect the authors' emotions. Investor sentiment is an important indicator for research on economic market trends: in behavioral finance, stock investor sentiment affects investors' decisions and thereby the stock market. The stock investor sentiment orientation toward the future market (thinking the future market will be "good" or "bad") is an indicator reflecting investor emotion. In this paper, we proposed a sentiment classification method and applied it to Sina micro-blog (currently the largest Chinese micro-blog platform). In detail, our approach contains a two-step classifier: first, the first classifier identifies the sentences that contain future sentiment; second, the second classifier classifies the sentences identified by the first classifier as positive or negative. The experimental results show that our method achieves a decent performance in identifying the future stock investor sentiment orientation.
Improved Text-oriented Algorithm for the Domain-specific Concept Sieving
Computer Science. 2012, 39 (Z6): 253-256. 
Abstract PDF(331KB) ( 337 )   
RelatedCitation | Metrics
Ontology learning is a hot research field in semantic technology and its applications, and domain-specific concept sieving is the foundation of ontology learning. Although the method based on domain relevance and domain consensus measures is effective for domain-specific concept sieving, it describes the information one-sidedly. This paper therefore presented an improved sieving algorithm for domain-specific concepts to solve this problem. First, low-frequency synonym and part-of relationship word sets were identified and redundant concepts were filtered out by calculating the semantic similarity between candidate concepts; then the concepts were sieved using improved formulas for domain concept relevance and domain concept consensus degree. The experiments show that this method improves the effectiveness of domain-specific concept sieving.
Algorithm for Prediction of New Topic's Hotness Using the K-nearest Neighbors
Computer Science. 2012, 39 (Z6): 257-260. 
Abstract PDF(351KB) ( 952 )   
RelatedCitation | Metrics
With the rapid development of the Internet, the government, enterprises and the public have paid more and more attention to Internet-mediated public sentiment. How to effectively monitor and correctly guide public sentiment on the Internet has become an issue that must be coped with urgently. As a basis for solving this issue, it is necessary to be able to predict the hotness of topics appearing on the Internet. As traditional algorithms cannot correctly predict a new topic's hotness, a novel algorithm based on K-nearest neighbors (K-NN) was proposed in this paper. The algorithm predicts the hotness of new topics by using the hotness time series of their similar historical topics. The experimental results show that the average accuracies of the hotness prediction during the first 3 days are 47.26%, 61% and 67.7% respectively, with the corresponding relative errors being less than 10%, 20% and 30%, and the average accuracy of the hotness trends within the first 3 days can be up to 73.73%. Meanwhile, the results also demonstrate that similar topics approximately have the same hotness trends in their early developing stages.
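The core idea can be sketched as follows, assuming Euclidean distance over the observed early-stage series and a simple average over the neighbors' next-day values; the paper's actual distance measure and aggregation may differ.

```python
import math

def predict_hotness(new_early, history, k=3):
    """K-NN sketch of new-topic hotness prediction.

    new_early : hotness counts observed so far for the new topic
    history   : full hotness time series of historical topics (longer than new_early)
    Returns the predicted next-day hotness, averaged over the k historical
    topics whose early-stage series are closest to new_early.
    """
    d = len(new_early)

    def dist(series):
        # Euclidean distance between early stages (an assumed similarity measure)
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(new_early, series[:d])))

    # rank historical topics by similarity of their early stage
    neighbors = sorted(history, key=dist)[:k]
    # average the neighbors' hotness on the day after the observed window
    return sum(s[d] for s in neighbors) / k
```

This exploits the paper's observation that similar topics have similar hotness trends in their early developing stages.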
Research on Multi-agent Q Learning Algorithm Based on Meta Equilibrium
Computer Science. 2012, 39 (Z6): 261-264. 
Abstract PDF(342KB) ( 630 )   
RelatedCitation | Metrics
Multi-agent reinforcement learning algorithms aim at cooperation strategies, while NashQ is frequently mentioned as a pivotal algorithm in the study of non-cooperative strategies. In multi-agent systems, a Nash equilibrium cannot ensure that the solutions obtained are Pareto optimal; besides, the algorithm has high computational complexity. A MetaQ algorithm was proposed in this paper. Different from NashQ, MetaQ finds the optimal solution through pretreatment of an agent's own behavior and prediction of the other agents' behavior. In the end, a climate cooperation game was used in this paper, and the results show that the MetaQ algorithm, with impressive performance, is fit for non-cooperative problems.
Probabilistic Data Model Based on Existence of Tuple
Computer Science. 2012, 39 (Z6): 265-270. 
Abstract PDF(435KB) ( 444 )   
RelatedCitation | Metrics
With the development of ICT, data management has met more and more challenges, one of which is the uncertainty of data. This paper proposed a probabilistic data model based on the existence of tuples. Compared with other models, this model was proved to be complete, and we proposed a corresponding probabilistic relational algebra and query algorithm.
Contextual Preferences-based Approach for XML Query Results Ranking
Computer Science. 2012, 39 (Z6): 271-276. 
Abstract PDF(497KB) ( 321 )   
RelatedCitation | Metrics
To deal with the problem of information overload for XML data queries, this paper proposed a contextual-preferences-based XML query results ranking approach. The value-based query predicates specified by the user are treated as the context, and the probabilistic information model is applied to the dataset and query history to speculate the user preferences, and to estimate the correlations between the specified and unspecified attribute unit values as well as the relevance between the unspecified attribute unit values and user preferences. Next, a scoring function for query results is constructed, and the ranking score computed by the scoring function is used to rank the query results. Results of experiments showed that the proposed ranking approach has high ranking accuracy and can meet user needs and preferences well.
Research of Dynamic Community Discovery Based on Diffusion Theory
Computer Science. 2012, 39 (Z6): 277-278. 
Abstract PDF(251KB) ( 417 )   
RelatedCitation | Metrics
For the dynamic community discovery problem, this article proposed constructing a time axis according to the nodes' degree distribution, based on the growth and preferential-attachment mechanisms, to dynamically simulate the formation and evolution of a social network and simultaneously divide its community structures. We used the Zachary karate club and Les Misérables networks as experimental data to test our algorithm. Experimental results show that the communities the algorithm obtains are all strongly connected communities, and that it is able to discover the community structure dynamically and accurately, with high practical value.
Study on the Task Allocation Based on Improved Contract Net in Multi-agent System
Computer Science. 2012, 39 (Z6): 279-282. 
Abstract PDF(331KB) ( 432 )   
RelatedCitation | Metrics
Research on task allocation in multi-agent systems is becoming more and more active. The contract net model is a classical method for task allocation in multi-agent systems, but the traditional contract net model has many deficiencies. On the basis of bidding announcement based on the credit of agents and self-adaptive bidding, a synthetic evaluation strategy based on the load, capability and credit of the bidding agent was also proposed, which can effectively resolve the problems arising in the bid evaluation procedure. The simulation experiment proves that the performance of the extended contract net is effective.
Particle Swarm Optimization for Subspace Clustering to Identify Tag Redundancy in Folksonomy
Computer Science. 2012, 39 (Z6): 283-287. 
Abstract PDF(433KB) ( 410 )   
RelatedCitation | Metrics
Web 2.0 tag recommender systems always contain a lot of redundant special labels. Redundancy among tags may burden users with additional effort in selecting their preferred items, and such redundancy can introduce erroneous features into the user profile and hamper the effort to judge recommendations. High-dimensional data sets usually contain many irrelevant or redundant features, and the feature subsets of different clusters are not the same. Therefore, we should focus on different feature subsets to discover the clusters. This paper proposed subspace PSO clustering to identify tag redundancy. We designed a suitable weighted K-means objective function which is more sensitive to changes in the feature weights. On this basis we developed PSO to optimize the objective function and obtain the global optimum, and finally improved the accuracy of tag redundancy identification. Our experimental results show that the proposed algorithm has greater searching capability and obtains better clustering accuracy.
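A weighted K-means objective of the kind described above can be sketched as follows; the exponent `beta` and the per-cluster weight layout are assumptions for illustration, and the PSO search that the paper runs over this fitness function is omitted.

```python
def weighted_kmeans_objective(data, labels, centers, weights, beta=2.0):
    """Sketch of a subspace-clustering fitness function.

    weights[c][j] is the (assumed) importance of feature j within cluster c;
    raising it to beta makes the objective sensitive to weight changes, which
    is what a PSO optimizer would exploit when searching the weight space.
    """
    total = 0.0
    for x, c in zip(data, labels):
        for j, v in enumerate(x):
            # per-feature, per-cluster weighted squared distance to the center
            total += (weights[c][j] ** beta) * (v - centers[c][j]) ** 2
    return total
```

In a full subspace PSO clustering, each particle would encode a candidate set of centers and weights, and this value would serve as its fitness.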
Attribute Reduction of Consistent Covering Decision System
Computer Science. 2012, 39 (Z6): 288-290. 
Abstract PDF(303KB) ( 299 )   
RelatedCitation | Metrics
In real life, there are always a great many complex massive databases. The notion of homomorphism can be used as a tool to study data compression in consistent covering decision systems. First, we presented the concept of a consistent function related to coverings and the concept of a covering mapping, and studied their properties. Next, we proposed the notion of a homomorphism of consistent covering decision systems, and proved that a consistent covering decision system can be compressed into a relatively small-scale decision system. Meanwhile, their attribute reductions are equivalent to each other under the condition of homomorphism.
Construction and Practice of the Data Warehouse in the Police Comprehensive Information System
Computer Science. 2012, 39 (Z6): 291-292. 
Abstract PDF(498KB) ( 364 )   
RelatedCitation | Metrics
In the public security industry, with informationization work gradually advancing, using only the business database to handle alerts, cases and security business work can no longer meet the demand, so the construction of a data warehouse is a must. According to the characteristics of public security, this paper gave a solution for a data warehouse system suitable for the public security industry. For the overall design and implementation phases of this project, this paper expounded the data warehouse system's data extraction, conversion and loading, the establishment of the multidimensional model, and the data warehouse's front-end application presentation.
Semantic Annotation Method for Web Image Based on Domain Ontology and Image Description Texts
Computer Science. 2012, 39 (Z6): 293-299. 
Abstract PDF(625KB) ( 381 )   
RelatedCitation | Metrics
Semantic auto-annotation of Web images is an important method for managing and retrieving huge amounts of images on the Web, and mining the semantics from the images automatically and effectively is the key. The semantics lie not only in the image itself, but also, and more importantly, in its description texts. Further, the image semantics vary with the change of images or description knowledge. To address this issue, in this paper, based on domain ontology and different image descriptions, we propose an adaptive semantic annotation method for Web images. This method checks the impacts of the description text features on image semantics. It detects the semantics and extends keywords by domain ontology, and then uses a regression model to adaptively model the semantic distribution of a Web image over its different description texts. A group of experiments was carried out on a real-world Web image data set, and the experimental results show that our proposed method outperforms others and has excellent adaptivity.
Model of Data Mail QoS Based on Improved Ant Colony Algorithms
Computer Science. 2012, 39 (Z6): 300-303. 
Abstract PDF(324KB) ( 329 )   
RelatedCitation | Metrics
In a network system for distributed data exchange, the exchange of massive data is a key problem. Only hop count and time delay are considered during routing selection, and there is no QoS guarantee. The data packages are delayed for a long time in the transmission process because of obstruction or interruption, and some of them are discarded because of serious delay. In this paper, a model of data mail QoS based on improved ant colony algorithms realizes data mailing that satisfies QoS routing, load balance and efficiency.
Discussion of Classification for Imbalanced Data Sets
Computer Science. 2012, 39 (Z6): 304-308. 
Abstract PDF(450KB) ( 607 )   
RelatedCitation | Metrics
Because of imbalanced class distribution, most classifiers lose efficiency on imbalanced data. In fact, the rarely occurring class in imbalanced datasets is of statistical significance. The problem of learning from imbalanced datasets has attracted growing attention in recent years. This paper provides a comprehensive review of the classification of imbalanced datasets: the nature of the problem, the factors affecting it, the current assessment metrics used to evaluate learning performance, as well as the opportunities and challenges in learning from imbalanced data.
Semantic Web Service Discovery Method Based on Click Rate Index
Computer Science. 2012, 39 (Z6): 309-311. 
Abstract PDF(336KB) ( 316 )   
RelatedCitation | Metrics
A Web service discovery method based on a click-rate index was presented to deal with the low query efficiency of current semantic Web service discovery methods. By indexing the Web services with a higher click-through rate in the universal description, discovery and integration (UDDI) repository, the services with a higher access rate can be matched first in the query process. Experimental results show that the method guarantees accuracy and obviously improves service discovery efficiency.
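The index-first-then-fall-back idea can be sketched as follows; the service record fields (`ctr`, `keywords`, `name`) and the click-through-rate threshold are assumptions for illustration, not the paper's actual repository schema.

```python
def build_click_index(services, threshold=0.5):
    """Index only services whose click-through rate exceeds a (hypothetical)
    threshold, ordered so the hottest services are tried first."""
    hot = [s for s in services if s["ctr"] >= threshold]
    return sorted(hot, key=lambda s: s["ctr"], reverse=True)

def discover(query, services, index):
    """Match against the high-CTR index first; fall back to a full scan of
    the repository only when the index yields no hit."""
    for s in index:
        if query in s["keywords"]:
            return s["name"]
    for s in services:
        if query in s["keywords"]:
            return s["name"]
    return None
```

Because popular services are checked first, most queries are answered without scanning the whole repository, which is the source of the efficiency gain.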
Computation of Spectral Clustering Algorithm for Large-scale Data Set
Computer Science. 2012, 39 (Z6): 312-314. 
Abstract PDF(338KB) ( 723 )   
RelatedCitation | Metrics
The spectral clustering algorithm is a popular data clustering method. It uses the eigen-decomposition technique to extract the eigenvectors of the affinity matrix. But the method is infeasible for large-scale data sets because of storage and computational problems. Motivated by the properties of symmetric matrices, in this paper each column of the affinity matrix was used as the input sample for an iterative algorithm, so that the eigenvectors of the affinity matrix could be computed iteratively. The space complexity of the proposed method is only O(n), and the time complexity is reduced to O(pkm). The effectiveness of the proposed method was validated by experimental results.
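The column-at-a-time idea can be illustrated with plain power iteration, where the matrix-vector product is accumulated one column at a time so only O(n) numbers are ever held. This sketch recovers only the dominant eigenvector; the paper's method iteratively extracts several eigenvectors, which this example does not attempt.

```python
import math

def dominant_eigenvector(column, n, iters=100):
    """Power iteration touching the affinity matrix one column at a time.

    column(j) returns the j-th column of the (symmetric) n-by-n affinity
    matrix, so the full matrix never needs to be stored at once.
    """
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [0.0] * n
        for j in range(n):
            col = column(j)
            for i in range(n):
                w[i] += col[i] * v[j]   # accumulate A @ v column by column
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]       # re-normalize each iteration
    return v
```

Here `column` could stream columns from disk, which is what makes the O(n) memory footprint possible for large affinity matrices.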
Dynamically Update Materialized View in Order to Improve the Efficiency of OLAP Queries
Computer Science. 2012, 39 (Z6): 315-317. 
Abstract PDF(284KB) ( 302 )   
RelatedCitation | Metrics
In a data warehouse system, an OLAP query usually involves two operations: multi-table joins and grouped aggregation. Improving query performance has become the key to speeding up OLAP response. Time-consuming operations such as table joins and aggregation can be pre-computed and preserved by utilizing materialized views. This investigation is based on a materialized view update algorithm that takes query frequency into account. It makes the materialized views fully utilized and greatly shortens query time, so that query efficiency can be highly elevated.
Equipment Condition Monitoring Algorithm Based on Massive Data Mining
Computer Science. 2012, 39 (Z6): 318-321. 
Abstract PDF(432KB) ( 410 )   
RelatedCitation | Metrics
An equipment condition monitoring algorithm based on massive data mining was proposed. Industrial equipment has massive historical data and a lot of multidimensional real-time running data. The proposed algorithm performs adaptive cluster analysis on massive historical healthy data to establish mathematical models of the equipment. The algorithm combines these models with real-time running data to obtain predicted data. It can automatically determine the number of clusters by fully considering the actual requirements of industrial applications, which solves the problem that traditional clustering algorithms have high overhead and low efficiency, and it also guarantees the efficiency of the regression procedure. Simulation results show that the algorithm can effectively deal with massive data and obtain real-time predicted values, realizing equipment condition monitoring.
Simulated Data Generation for Information System: A Survey
Computer Science. 2012, 39 (Z6): 322-324. 
Abstract PDF(287KB) ( 445 )   
RelatedCitation | Metrics
Simulated data generation for information systems is an important way of providing the data needed in experiments, trials or exercises of an information system. In this paper, simulated data generation for information systems was compared with other related fields, such as software test data generation, sample data expansion and virtual reality, and its research orientation was discussed. Research contents of simulated data generation for information systems were summarized. A typical system architecture framework for simulated data generation was proposed. The framework is divided into three layers: the data layer, the intermediate layer and the generation layer. At last, the future research directions of simulated data generation for information systems were analyzed.
Discussion on the Method of Cluster Analysis
Computer Science. 2012, 39 (Z6): 325-327. 
Abstract PDF(292KB) ( 485 )   
RelatedCitation | Metrics
On the basis of a brief introduction to traditional clustering algorithms, this paper summarized new clustering algorithms and pointed out the future development directions of clustering algorithms.
Web Citation Frequency Analysis Based on Bradford Law
Computer Science. 2012, 39 (Z6): 328-330. 
Abstract PDF(241KB) ( 875 )   
RelatedCitation | Metrics
Taking Bradford's law and the citation frequency of academic papers as its research object, this paper studied the distribution rules of articles' citation frequency and discussed the applicability of informetric laws in network space, concluding that Bradford's law is applicable to networks. By applying the zone analysis method to citation frequencies on database security in the CNKI database, the distribution curve of citation frequency was obtained. Contrasting it with the Bradford curve, this paper carried out curve fitting and image analysis. It concluded that citation frequency in webometrics follows Bradford's law, and that Bradford's law has certain applicability in network space.
R/M Integrated Supply Chain Risk Prediction Based on Improved Apriori Algorithm
Computer Science. 2012, 39 (Z6): 331-334. 
Abstract PDF(461KB) ( 401 )   
RelatedCitation | Metrics
This study mainly predicts the risks of the R/M integrated supply chain, against its characteristics, by the Apriori algorithm. First it builds a risk database by analyzing the component characteristics of supply chain risks, and extracts the relevant information from the risk database. Then it processes this information with the Apriori algorithm, finding the regularity of risk outbreaks through the relevance between risks. An improved candidate-extraction method is used to reduce the space complexity of the Apriori algorithm and improve its operational efficiency. The results of the Apriori algorithm are then examined to explore the association rules between risk events and risk outcomes, and the analysis of these results completes the R/M integrated supply chain risk prediction. The method was proved feasible through a simulation experiment.
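For reference, the standard Apriori frequent-itemset step that the paper builds on can be sketched minimally as follows; transactions here abstractly stand for co-occurring risk events, and this sketch uses plain candidate joining rather than the paper's improved candidate-extraction method.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori sketch: find all itemsets meeting min_support.

    transactions : list of sets of items (here, co-occurring risk events)
    Returns a dict mapping each frequent itemset (frozenset) to its count.
    """
    items = {frozenset([i]) for t in transactions for i in t}
    frequent, current, k = {}, items, 1
    while current:
        # count support of each candidate itemset
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # join frequent k-itemsets into (k+1)-itemset candidates
        keys = list(level)
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent
```

Association rules between risk events and risk outcomes would then be derived from these frequent itemsets by comparing the supports of an itemset and its subsets.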
Extensible Model for SOA Quality of Service
Computer Science. 2012, 39 (Z6): 339-342. 
Abstract PDF(358KB) ( 355 )   
RelatedCitation | Metrics
Service-Oriented Architecture (SOA) is emerging as a prevalent technology for software and enterprise information system development. As more and more services provide similar functionality, people pay increasing attention to Quality of Service (QoS). This paper analyzed the state of the art of QoS models and classified the characteristics found in the existing literature. Their advantages and disadvantages were summarized. This contribution presented a QoS model for SOA which reflects its multi-dimensional characteristics and can be extended and measured dynamically.
Logical Authorization Language Based on Attribute and Subject-Operation-Object Stratification
Computer Science. 2012, 39 (Z6): 343-349. 
Abstract PDF(680KB) ( 450 )   
RelatedCitation | Metrics
Security policy is the key to access control, and the description, authentication and execution of policies cannot be realized without an authorization language. In practice, the existing authorization languages are not well adapted to the complex and dynamic nature of security requirements and cannot provide enough support for access control models. This paper proposed a logical authorization language based on attribute and subject-operation-object stratification (SOOSAL). Based on first-order logic, SOOSAL describes the subject, operation and object by predicates, and the relationships among them by rules. In addition, policies were classified into closed-world and open-world policies from the logical point of view, and a simple solution was given in a security discussion of the two types of policies. Our experimental results show that SOOSAL has strong descriptive power, can accommodate dynamic policy changes, and supports different security requirements and authorization principles well.
Review on Memory Subsystems in High Level Synthesis for FPGA
Computer Science. 2012, 39 (Z6): 350-356. 
Abstract PDF(663KB) ( 576 )   
RelatedCitation | Metrics
In order to accelerate systems, High Level Synthesis (HLS) aims to map the computation of a system onto reconfigurable hardware, based on a behavioral description of the system in a high-level programming language. In HLS, the generation of an efficient memory subsystem is critically important, especially for data-intensive computation. In this paper, the existing HLS technologies for FPGA and their memory subsystems were analyzed. The generated memory subsystem architectures were divided into three categories: DSP-like architectures, CPU-control architectures and architectures based on reconfigurable memory functional units. These architectures were discussed with examples. After that, the front-end and back-end optimization techniques for memory subsystems in HLS were discussed respectively. In addition, the aforementioned architectures and techniques were analyzed and evaluated. Finally, the mapping between off-chip and on-chip memories and the efficient modeling of programs were listed as remaining problems. In HLS, synthesis for multiple modules and automatic generation of the memory organization can be future research directions.
Anatomy of the Implementation of Builtin Functions in GCC
Computer Science. 2012, 39 (Z6): 357-359. 
Abstract PDF(346KB) ( 1568 )   
RelatedCitation | Metrics
GNU Compiler Collection (GCC) supports multiple high-level languages and multiple platforms, with its internal documents and source code opened. It is widely used both in industry and in research. GCC supports lots of builtin functions, and the implementation of builtin functions is one of the important parts of GCC. We analyzed the intentions and operations of the builtin functions in GCC. Then, combining our practical work and taking the vector-extension-instruction-oriented builtin functions as examples, we demonstrated the implementation details of platform-related builtin functions in GCC. This work is a valuable reference both for comprehending the implementation mechanism of builtin functions and for research and development based on GCC.
Study on Evolution Behavior of Service Combination under Unchanged Set
Computer Science. 2012, 39 (Z6): 360-364. 
Abstract PDF(399KB) ( 358 )   
RelatedCitation | Metrics
Starting from the business logic of common update and evolution demands, this paper studied the evolution behavior of service composition under the condition that the service set in the composition cannot change. By proposing the evolution behavior set SEBS for service composition, this paper defined basic evolution operations for adjusting the service workflow architecture and proved that SEBS can maintain the rationality of the service composition after evolution. Thus, the rationality of the inner flow logic does not change during service composition evolution. A case study shows that it can be put into practice.
Formal Description of Simulation Runtime Support Platform Architecture with XYZ/ADL
Computer Science. 2012, 39 (Z6): 365-369. 
Abstract PDF(500KB) ( 336 )   
RelatedCitation | Metrics
The net-centric simulation runtime support platform (NCS-RSP) provides an environment supporting the construction of a community of simulation tasks (CoST). This paper adopted the dual software architecture description framework XYZ/ADL to describe the architecture of NCS-RSP in a graphical language and a formal language respectively. Then we decomposed and refined the core service layer during the construction of a CoST. This description method not only expresses the architectural graphics and behavioral abstraction of NCS-RSP from a visual viewpoint, but also validates the correctness and completeness of the architecture design from a formal view. The research is a new attempt at formal description in the military simulation domain, and it provides a guide for the composition and reuse of NCS-RSP services.
GPU-based Parallel Research on the RRTM Module of the GRAPES Numerical Prediction System
Computer Science. 2012, 39 (Z6): 370-374. 
Abstract PDF(467KB) ( 656 )   
RelatedCitation | Metrics
GRAPES (Global and Regional Assimilation and Prediction System) is a new-generation numerical weather prediction (NWP) system of China. As the system processes a large amount of data and has strict real-time requirements, it has always been a hot research field for parallel computing. This is the first time that GPU (Graphics Processing Unit) general-purpose computing and CUDA technology have been applied to the RRTM (Rapid Radiative Transfer Model) long-wave radiation module of the GRAPES Meso model for parallel processing. Based on an analysis of computing performance, and according to the characteristics of the GPU architecture, the parallel computational efficiency of the RRTM module was optimized in terms of code tuning, memory, compiler options, etc. The optimization results indicate a speedup of 14.3×. Experiments were carried out on the GPU platform. The results show that the parallel computing algorithm is correct, stable and efficient for operational implementation of GRAPES in the near future.
Portability of Dalvik in iOS
Computer Science. 2012, 39 (Z6): 375-379. 
Abstract PDF(443KB) ( 472 )   
RelatedCitation | Metrics
We analyzed the Dalvik virtual machine architecture, and studied several key technologies in the process of porting Dalvik to the iOS platform. We successfully built a Dalvik-based Java runtime environment on the iOS platform, which has great application value in cross-platform mobile application development for iOS and Android. We also analyzed the performance of the migrated Dalvik and gave a conclusion and the future plan of this project.
Non-cooperative Gaming and Bidding Model Based Resource Allocation in Virtual Machine Environment
Computer Science. 2012, 39 (Z6): 380-382. 
Abstract PDF(325KB) ( 361 )   
RelatedCitation | Metrics
We studied the resource allocation strategy of virtualized servers. Based on non-cooperative game theory, we employed a bidding model to solve the resource allocation problem in virtualized servers with multiple instances competing for resources. The optimal response function of the utility function we introduced makes every player's bidding price reasonable. Although the utility function is not well-defined at the point of zero, we show that the bidding game still has a unique equilibrium point. In our model, resources are well allocated to every virtual machine, and the utilization of virtual resources is improved.
Formal Model of Web Component Based on Object-Z Specification
Computer Science. 2012, 39 (Z6): 383-388. 
Abstract PDF(530KB) ( 357 )   
RelatedCitation | Metrics
The Web component is a solution for reusing and extending Web services. Object-Z is the object-oriented complement of the language Z, which is based on set theory and first-order predicate logic. Modeling with the formal specification language Object-Z can ensure the consistency and accuracy of Web components, which work under heterogeneous, loosely coupled and encapsulated conditions. This thesis concentrates on studying Web components based on Object-Z, and poses a related approach to modeling Web components and their compositions. The approach models the static behaviors of a component, including interface and operation; defines the matching methods between interface and message; builds the formal frame of the basic composite structure; applies the frame to component decomposition; and describes the composite method based on the requirements. On this basis, a case study demonstrating our proposed approach, modeling the interactions and compositions, is shown.
EDFUSE : FUSE Framework Based on Asynchronous Event-driven
Computer Science. 2012, 39 (Z6): 389-391. 
Abstract PDF(261KB) ( 934 )   
RelatedCitation | Metrics
FUSE is a classical framework for conveniently developing filesystems in userspace; it is composed of a kernel module and a userspace module. However, the userspace module is developed in a multithreading context: the number of threads is proportional to the number of requests, which means explosive growth when plenty of requests arrive, harming system performance and response time. A novel asynchronous event-driven FUSE (EDFUSE) framework using a pipeline is proposed, which optimizes the caching of data and metadata to increase the cache hit ratio. Finally, the experimental studies show that the EDFUSE framework gains better throughput and efficiency.
Kriging Interpolation Algorithm Based on OpenMP
Computer Science. 2012, 39 (Z6): 392-395. 
Abstract PDF(296KB) ( 575 )   
RelatedCitation | Metrics
With the popularity of multi-core processors, making full use of multi-core PCs' performance has driven computer technology toward multi-core architectures and multi-core computing. To improve the rate of temperature interpolation on small 100m × 100m grids in Hunan, the Kriging interpolation algorithm was implemented with OpenMP, the standard shared-memory parallel programming model. On different multi-core PCs, using small 100m × 100m and 500m × 500m grid terrain data to interpolate average temperature, the approach not only effectively reduces interpolation time and improves the speedup, but also, when integrated into business systems, significantly enhances the systems' response time and performance.
Description of Spectrum Availability in Dynamic Spectrum Allocation
Computer Science. 2012, 39 (Z6): 396-397. 
Abstract PDF(232KB) ( 348 )   
RelatedCitation | Metrics
In the description of spectrum resources for dynamic spectrum allocation, we introduce two characteristic parameters to construct a spectrum description model. This method enriches the description of resource features. Because of differences in the demand for spectrum, different weights are given to spectrum availability and free probability respectively, so that each user is assigned an appropriate spectrum. Simulation results show that, for different types of traffic, combining the spectrum availability and free probability parameters to describe the spectrum can appropriately improve spectrum efficiency. The spectrum efficiency can be increased by about 40% when the weight of free probability is much greater than that of spectrum availability, especially for voice signals.
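The weighted combination of the two characteristic parameters can be sketched as a simple linear score; the field names and the convention that the two weights sum to 1 are illustrative assumptions, not the paper's exact model.

```python
def spectrum_score(availability, free_prob, w_avail, w_free):
    """Combine the two characteristic parameters into one weighted score.

    w_avail and w_free are per-traffic-type weights (assumed to sum to 1);
    per the result above, voice traffic would put most weight on free
    probability.
    """
    return w_avail * availability + w_free * free_prob

def best_channel(channels, w_avail, w_free):
    """Assign the user the channel with the highest weighted score."""
    return max(channels,
               key=lambda c: spectrum_score(c["avail"], c["free"], w_avail, w_free))
```

Shifting the weights changes which channel wins, which is how different traffic types end up on different parts of the spectrum.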
Match Degree-based Method for Grid Service Composition and Optimize Choose
Computer Science. 2012, 39 (Z6): 398-400. 
Abstract PDF(315KB) ( 343 )   
RelatedCitation | Metrics
The match probability of automatic grid service composition is improved by the Low Match Degree Service Composition Method (LMDSCM), based on semantic inference, similarity degree and graph search. With the concept aggregation degree based on domain ontology defined, it is used in the calculation of service compositions and the building of the service composition graph. The optimization of service composition is transformed into a shortest path problem. Finally, comparisons of efficiency and satisfaction show that LMDSCM has a higher satisfaction degree than other methods and easily produces the best service composition plan.
Dynamic Self-organizing Fuzzy Neural Network Combined with SVD_TLS and EKF Algorithm Used for Dynamic System Processing
Computer Science. 2012, 39 (Z6): 401-403. 
Abstract PDF(232KB) ( 409 )   
RelatedCitation | Metrics
This paper proposed an SVD_TLS- and EKF-based dynamic self-organizing fuzzy neural network (STD_DSFNN) for optimizing fuzzy rules and reasonably adjusting the nonlinear and linear parameters. First, the structure and the meaning of each layer are given. Then the nonlinear parameters are learned using the EKF algorithm, and the linear parameters are learned using the SVD_TLS algorithm, which also extracts important rules at the same time. Finally, the STD_DSFNN is verified on the typical Mackey-Glass time series prediction example. The results show that the STD_DSFNN network structure is more compact and has better generalization ability compared with DFNN, ANFIS and UKF_DFNN.
A Variety of Conjectures on the Complete-transposition Network
Computer Science. 2012, 39 (Z6): 404-407. 
Abstract PDF(279KB) ( 342 )   
RelatedCitation | Metrics
Multigranulation Based Variable Precision Rough Set Approach to Incomplete Systems
Computer Science. 2012, 39 (Z6): 408-411. 
Abstract PDF(301KB) ( 370 )   
RelatedCitation | Metrics
In recent years, the multigranulation approach has arisen as a new data-analysis model. To deal with incomplete information systems through the multigranulation approach, multigranulation based variable precision rough sets, in both optimistic and pessimistic forms, were proposed by employing the basic ideas of the non-symmetric similarity relation and variable precision. Not only were the basic properties of these models proved, but the relationships between the variable precision approach and the strict inclusion approach were also discussed under the multigranulation frame. Finally, an example shows that "OR" decision rules can be generated from incomplete information systems by the proposed multigranulation based variable precision rough set models.
System Thinking Based Development Framework for Software Safety Requirements
Computer Science. 2012, 39 (Z6): 412-415. 
Abstract PDF(465KB) ( 376 )   
RelatedCitation | Metrics
Poor software requirements for safety-critical systems (SCSs) are identified as a major root cause of catastrophic accidents. A system-thinking based development framework for software safety requirements was built with system modeling and system analysis. For a particular analysis domain at a particular analysis level, a development method integrated with safety analysis was presented to develop software safety requirements. With this method, safety-critical errors in software requirements are unlikely to propagate either to other analysis domains at the same analysis level or to the subsequent analysis level. New safety requirements are derived as soon as errors are found in the safety analysis process, and safety evidence is generated in the process to support the building of safety arguments.
Research on GIS Services Composition Based on Application Model
Computer Science. 2012, 39 (Z6): 416-418. 
Abstract PDF(249KB) ( 332 )   
RelatedCitation | Metrics
Recently, there have been some problems in integrating application models into GIS systems, such as redundancy and difficulty in reusing the models. We proposed a method to solve these problems. First, the GIS application model is decomposed into sub-models, and the sub-models and GIS functions are defined as services provided to users. Then the QoS (quality of service) of the GIS services is defined. Finally, the paper proposed a service composition method based on a modified adaptive genetic algorithm (MAGA). The method achieves globally optimal GIS service composition.
Research and Implementation on Grid-based Resource Management Model for Collaborative Learning Environments
Computer Science. 2012, 39 (Z6): 419-421. 
Abstract PDF(368KB) ( 294 )   
RelatedCitation | Metrics
Collaborative learning is more effective than other forms of learning under many circumstances, but building a collaborative learning environment is difficult because of its high resource-management requirements, which hardly any existing learning management system meets. In this paper, after analyzing the drawbacks of resource management in the most popular learning management systems, we introduced the grid resource management middleware GRAM (the Grid Resource Allocation Manager) and presented a grid-based resource management model for collaborative learning environments. The model was implemented with open-source, mature technologies including Linux, Apache, MySQL and PHP, with GT4 providing the grid services. The results show that the expected design of the model is realized and that the model supports the resource-management requirements of a collaborative learning environment well.
Test Suite Reduction Methods Based on K-medoids
Computer Science. 2012, 39 (Z6): 422-424. 
Abstract PDF(221KB) ( 538 )   
RelatedCitation | Metrics
Given a set of testing objectives, test suite reduction aims to satisfy all testing requirements with the minimum number of test cases. This paper introduced the K-medoids clustering algorithm to perform test suite reduction. First, every test case is treated as a node and the similarities between nodes are computed. Then, according to the testing requirements, test cases are chosen from the test suite. The results on the simulated programs prove that the method is effective and feasible.
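The K-medoids step the abstract describes can be sketched as follows. The coverage data, the Jaccard similarity, and the rule of keeping one medoid per cluster are illustrative assumptions, not details from the paper:

```python
def jaccard_distance(a, b):
    # distance between two test cases, each a set of covered requirements
    return 1.0 - len(a & b) / len(a | b)

def k_medoids(items, dist, k, iters=20):
    medoids = list(range(k))  # initial medoids: first k items
    for _ in range(iters):
        # assign every item to its nearest medoid
        clusters = {m: [] for m in medoids}
        for i in range(len(items)):
            m = min(medoids, key=lambda m: dist(items[i], items[m]))
            clusters[m].append(i)
        # move each medoid to the member minimizing in-cluster distance
        new_medoids = [
            min(ms, key=lambda c: sum(dist(items[c], items[o]) for o in ms))
            for ms in clusters.values() if ms
        ]
        if sorted(new_medoids) == sorted(medoids):
            break
        medoids = new_medoids
    return sorted(medoids)

# toy data: each test case is the set of requirements it covers
tests = [{"r1", "r2"}, {"r1", "r2", "r3"}, {"r4"}, {"r4", "r5"}]
reduced = k_medoids(tests, jaccard_distance, k=2)
print(reduced)  # indices of the retained representative test cases
```

The retained medoids form the reduced suite; one representative survives per group of similar test cases.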
Design and Implementation of the SHA-3 Candidate Algorithm Keccak Based on Matlab
Computer Science. 2012, 39 (Z6): 425-428. 
Abstract PDF(326KB) ( 863 )   
RelatedCitation | Metrics
The final round of the SHA-3 (Secure Hash Algorithm-3) competition launched by NIST is currently under way. The hash algorithm Keccak is one of the five candidates in the last round. This paper described the design and implementation of Keccak based on Matlab. Our program, with a GUI (Graphical User Interface), can be used both for practical Keccak hash value calculation and for Keccak teaching or experiments.
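The paper's Matlab implementation is not reproduced here. As a historical aside, the competition described above ended with Keccak selected and standardized as SHA-3 in FIPS 202 (with a padding change relative to the original submission), so a digest in the standardized Keccak family can now be computed directly from Python's standard library rather than by reimplementing the permutation:

```python
import hashlib

# SHA3-256 is the FIPS 202 standardization of a Keccak variant
digest = hashlib.sha3_256(b"abc").hexdigest()
print(digest)
```

This is convenient for checking a teaching implementation like the one the paper describes against a known-good reference.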
Application of Rough Set Reduction in Spacecraft Fault Diagnosis
Computer Science. 2012, 39 (Z6): 429-431. 
Abstract PDF(335KB) ( 293 )   
RelatedCitation | Metrics
Damage detection for aerospace structures is a critical issue, in which the data from multiple types of sensors are redundant and inconsistent with each other. Rough set reduction is an effective tool for data mining and classification that can be applied to this problem. Firstly, damage features are extracted from the multi-sensor information (e.g. displacement, acceleration or strain sensors) and modal parameters. Then, the rough set reduction technique is employed to obtain the core set of all features. Finally, the structural damage conditions are identified through a probabilistic neural network. The numerical simulation of a helicopter demonstrates that the rough set reduction technique can not only decrease the spatial dimension of the feature attributes but also improve the identification accuracy.
Algorithm for BP Neural Networks by Identifying Numbers of Hidden Layer Neurons Quickly
Computer Science. 2012, 39 (Z6): 432-436. 
Abstract PDF(418KB) ( 780 )   
RelatedCitation | Metrics
Based on polynomial curve-fitting theory, an orthogonal-basis feed-forward neural network is constructed. The model adopts a three-layer structure in which the hidden-layer neurons are activated by orthogonal polynomial functions. For this network, an algorithm is proposed in which the hidden-layer activation functions are orthogonal polynomials and the number of neurons can be quickly determined. The validity of the algorithm is proved mathematically, and it is verified by computer simulations in comparison with the conventional BP algorithm. The results show that this algorithm not only overcomes traditional BP neural network limitations, such as a slow convergence rate and the difficulty of determining the optimal number of hidden neurons, but also achieves higher precision.
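Because the hidden neurons are orthogonal polynomials, training the linear output weights reduces to a least-squares fit in an orthogonal basis. A minimal sketch with a Chebyshev basis follows; the choice of Chebyshev polynomials and the target function are assumptions for illustration, since the abstract does not name the paper's specific polynomial family:

```python
import numpy as np

# sample points and a target function to approximate
x = np.linspace(-1.0, 1.0, 50)
y = x ** 3 - 0.5 * x

# "output weights": least-squares fit in the Chebyshev basis
coeffs = np.polynomial.chebyshev.chebfit(x, y, deg=3)
approx = np.polynomial.chebyshev.chebval(x, coeffs)

# the target is itself degree 3, so the fit is essentially exact
err = float(np.max(np.abs(approx - y)))
print(err)
```

Because the basis is orthogonal, the number of hidden neurons corresponds directly to the polynomial degree, which is what makes it quick to determine.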
Design and Analysis of an MST Construction Algorithm Based on the Idea of "Pruning Bowstrings and Protecting Branches"
Computer Science. 2012, 39 (Z6): 437-440. 
Abstract PDF(429KB) ( 577 )   
RelatedCitation | Metrics
In order to remedy the limitations of the classical algorithms, the idea of "pruning bowstrings and protecting branches" is advanced for the first time, and a new MST construction algorithm is designed and implemented based on this idea. Mathematical proof shows that the new algorithm is correct, and experimental analysis indicates that it prominently remedies the deficiencies of the classical algorithms in some practical application fields and possesses important theoretical and practical value.
Research of Some Problems about Resource Management on Multi-core Processor Platform
Computer Science. 2012, 39 (Z6): 441-443. 
Abstract PDF(373KB) ( 597 )   
RelatedCitation | Metrics
In a multi-core processor architecture, several cores are integrated on one chip and share some of the hardware resources on that chip. This paper introduced the development of multi-core processor technology, raised some problems about resource management on multi-core processor platforms, and discussed processor management and shared cache management in depth.
Research on the Applicability between Program and Platform in High Performance Simulation
Computer Science. 2012, 39 (Z6): 444-448. 
Abstract PDF(926KB) ( 360 )   
RelatedCitation | Metrics
The development of a high-performance simulation application works on the premise that an appropriate computing platform is selected and the parallel optimization direction of the program is determined. Therefore, this paper researched the applicability between parallel computing platforms and high-performance simulation programs. In particular, three character sets are distilled: the performance character set of the simulation program, the performance indicator system of the parallel computing platform, and the element set of parallel optimization targets, so as to guide the methods of platform selection and program optimization more completely in theory. Moreover, based on these, this paper proposes methods for determining program applicability and platform applicability. The case studies indicate that the research results can guide the selection of an appropriate computing platform and the development of performance-oriented parallel optimization techniques.
Research and Simulation of Algorithm Based on Particle Swarm Optimizing Neural Network
Computer Science. 2012, 39 (Z6): 449-451. 
Abstract PDF(286KB) ( 397 )   
RelatedCitation | Metrics
As a new swarm intelligence algorithm, the Particle Swarm Optimization (PSO) algorithm has attracted wide attention and study. This paper mainly studied the new algorithm formed by combining PSO with a neural network. Taking the chaos phenomena of a permanent magnet synchronous motor as the object, the new algorithm achieves control of them. The results of the experiments indicate that the algorithm is feasible and prove that the PSO-optimized neural network has practical value.
Research on Resource Selection Algorithm Based on Feedback Control of Chaos Theory
Computer Science. 2012, 39 (Z6): 452-456. 
Abstract PDF(426KB) ( 396 )   
RelatedCitation | Metrics
Service discovery technology determines what information reaches the user, so good selection of service resources is one of the key issues in service discovery. Current service resource selection techniques do not handle the dynamics and real-time behavior of service resources well during dynamic service selection; they often rely on methods based on historical resource information. Owing to various sources of uncertainty in the service process, a resource node may terminate its service, and real-time resources may not appear in the historical information list, so a suitable dynamic resource selection method is needed to address these issues. In response to these problems, this article presented a dynamic service resource selection method based on chaos feedback control. With the selection of suitable dynamic service resources as its goal, it improves on the deficiencies of dynamic resource selection methods based on historical information.
Quantitative Description of the OWL-S Model Based on Stochastic Petri Nets
Computer Science. 2012, 39 (Z6): 457-460. 
Abstract PDF(339KB) ( 313 )   
RelatedCitation | Metrics
OWL-S is a semantic-Web-service-composition-oriented ontology description and semantic markup language that can be interpreted automatically by machines; it also achieves interoperability of Web service functions between services and is one of the most important service composition standards. This paper presented a probabilistic model based on non-Markovian stochastic Petri nets (NMSPN), which can model and describe the control flow of Web services described in OWL-S. The paper then established a series of probabilistic analysis methods to achieve a quantitative credibility analysis of the semantic-Web-services-based process model built on NMSPN.
Method of Software Reliability Allocation Based on UML Use Case Model
Computer Science. 2012, 39 (Z6): 461-463. 
Abstract PDF(242KB) ( 436 )   
RelatedCitation | Metrics
We introduced a methodology for software reliability allocation that starts with analysis of the UML use case diagram and the probability of executing each use case. The execution probability of each use case measures its importance and determines its weight factor. According to the weight factor, the reliability index undertaken by each use case is determined, so a system architect can reasonably assign the system reliability index to each use case. This use-case-based methodology helps ensure the software reliability target and provides a reference for follow-up development and software testing.
Methods and Steps of Building a Unified Human Resources Management Information System
Computer Science. 2012, 39 (Z6): 464-465. 
Abstract PDF(281KB) ( 665 )   
RelatedCitation | Metrics
Combining the methods and steps of building a unified human resources management information system at Tsinghua University, we investigate how to realize and improve the theory of unified university information system construction. The work also provides experience for the construction of unified university information systems and of information systems within a functional domain.
Distributed Elevator Group Control System Scheduling Based on Real-time Particle Swarm Optimization Algorithm
Computer Science. 2012, 39 (Z6): 466-473. 
Abstract PDF(714KB) ( 1157 )   
RelatedCitation | Metrics
In order to obtain a globally optimized solution for the elevator group control system (EGCS) scheduling problem, an algorithm with an overall optimization function is needed. This study analyzes different traffic patterns and control mechanisms, and proposes real-time particle swarm optimization (RPSO) as an optimal solution to the EGCS scheduling problem. Simulation results show that a distributed EGCS using RPSO can be scheduled well. Besides, real-time elevator scheduling and re-allocation functions are realized in case an elevator is unavailable or full. The RPSO algorithm can reduce passengers' waiting time by almost 50% and their riding time by almost one third.
Survey of Image-based Pencil Rendering Technique
Computer Science. 2012, 39 (Z6): 474-477. 
Abstract PDF(716KB) ( 677 )   
RelatedCitation | Metrics
Non-photorealistic rendering (NPR) is a branch of computer graphics committed to generating works in artistic styles. Currently, NPR technology can successfully simulate oil painting, watercolor, cartoon, pencil and other styles, and has been widely used in many areas. Pencil sketching is the basis of artistic modeling; being simple and smooth, pencil works are deeply loved by people, yet it is not easy for everyone to master the skills of pencil drawing. Image-based pencil rendering is an important part of NPR and attracts more and more attention from researchers, but the related technologies are still immature and cannot yet meet the needs of practical applications.
UML-based Software Requirement Modeling for Automatic Train Protection
Computer Science. 2012, 39 (Z6): 478-481. 
Abstract PDF(340KB) ( 374 )   
RelatedCitation | Metrics
To meet the high safety requirements of automatic train protection (ATP) systems, this paper proposed and implemented an optimized UML-based requirement modeling method. Drawing on formal methods, it improved the classical state machine model and introduced a super-state machine with a series of accurate pre-defined rules to implement formal modeling of the ATP system. When the method was applied to an urban rail transit signal system, the results showed that the model reduces software failures by avoiding ambiguity, greatly improves safety and reliability, and makes the system easier to develop and maintain.
Piecewise Linear Fitting Based on the Least-squares Method
Computer Science. 2012, 39 (Z6): 482-484. 
Abstract PDF(189KB) ( 4882 )   
RelatedCitation | Metrics
Curve fitting is a very important technique in image analysis, and the most commonly used method is the least-squares method. Ordinary least squares has some limitations, however, and many scholars have studied how to improve it. The authors made a further improvement on the least-squares method and proposed a new piecewise linear fitting algorithm to replace polynomial curve fitting. The new algorithm simplifies the mathematical model, reduces the computation, and fits point sequences better.
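The piecewise idea above can be sketched as follows: split the point sequence into segments and fit each segment with an ordinary least-squares line. The fixed segment boundary and the V-shaped data are illustrative; the abstract does not detail the paper's splitting criterion:

```python
def fit_line(pts):
    # ordinary least-squares line y = a*x + b through a list of (x, y)
    n = len(pts)
    sx = sum(p[0] for p in pts)
    sy = sum(p[1] for p in pts)
    sxx = sum(p[0] * p[0] for p in pts)
    sxy = sum(p[0] * p[1] for p in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b

# V-shaped data that a single line cannot fit well
pts = [(x, abs(x - 5)) for x in range(11)]
left, right = pts[:6], pts[5:]
print(fit_line(left), fit_line(right))
```

Each segment is fitted exactly by its own line, where any single polynomial of low degree would leave a large residual at the corner.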
Comparing and Improvement of Two Carotid Ultrasound Image Vascular Plaque Segmentation Methods
Computer Science. 2012, 39 (Z6): 485-488. 
Abstract PDF(1083KB) ( 489 )   
RelatedCitation | Metrics
This article presented two segmentation methods for vascular plaques in carotid ultrasound images, namely active shape models (ASM) and active appearance models (AAM), and compared their effectiveness in segmenting the internal and external contours after segmenting 38 groups of carotid ultrasound images. Based on a comprehensive analysis of the experimental results and the characteristics of carotid ultrasound images, the ASM method was improved by adding a scale invariant. Experimental results showed that in running time ASM was close to the improved ASM, while AAM took about 16 times longer than either. Evaluating their operating efficiency with the FOM and RAY methods demonstrated that the improved ASM was much better than ASM and was the most effective algorithm for carotid ultrasound image segmentation.
Pareto-based Multi-object Clonal Evolutionary Algorithm
Computer Science. 2012, 39 (Z6): 489-492. 
Abstract PDF(307KB) ( 598 )   
RelatedCitation | Metrics
To overcome the shortcomings of some multi-objective evolutionary algorithms, we combine outstanding ideas from SPEA2 and immune multi-objective optimization algorithms to create a Pareto-based multi-objective clonal evolutionary algorithm, NPCA (non-dominated Pareto clonal algorithm). Testing the algorithm on the well-known multi-objective optimization problems ZDT and DTLZ shows that NPCA clearly outperforms typical multi-objective evolutionary algorithms.
Mobile Augmented Reality Technology Applied on Mobile Phone Platform
Computer Science. 2012, 39 (Z6): 493-498. 
Abstract PDF(1032KB) ( 770 )   
RelatedCitation | Metrics
The mobile phone provides a perfect interface platform that can free augmented reality (AR) systems from the heavy personal computer and bring AR to outdoor and wireless application fields. In this paper, the latest research progress is introduced in detail, and the key technical problems, technical challenges and future research directions are also analyzed.
Method of Video People Counting Based on Double_ellipse Model
Computer Science. 2012, 39 (Z6): 499-502. 
Abstract PDF(861KB) ( 329 )   
RelatedCitation | Metrics
A method of people counting in videos based on a double-ellipse model can calculate the number of people in crowded and partially occluded scenes with complex environments and backgrounds. First, the outline of each moving object is extracted and fitted with an ellipse; then an ellipse is fitted to the head within the outline according to prior knowledge. If both the body outline and the head can be successfully fitted with ellipses, a human body can be simply determined. When the crowd is dense, the moving targets are separated in advance and then recognized. This method is suitable for vertical, tilted, distant-view and near-view camera modes, and can count the number of people while distinguishing human bodies from non-human objects.
Research on Natural-language-oriented Geometry Drawing
Computer Science. 2012, 39 (Z6): 503-506. 
Abstract PDF(327KB) ( 831 )   
RelatedCitation | Metrics
As computer geometry drawing software matures, the remaining challenge is to convert geometry propositions described in natural language into geometric figures. Combining the needs of geometric natural language, this paper established an effective language-understanding model, designed and developed an interface that automatically converts natural language into geometric drawing commands, and generated the corresponding geometry. Test results show that the translation rate exceeds 84.17%.
Non-parameter Kernel Function Discriminant Analysis Method and its Application in Face Recognition
Computer Science. 2012, 39 (Z6): 507-509. 
Abstract PDF(559KB) ( 324 )   
RelatedCitation | Metrics
The kernel discriminant analysis (KDA) algorithm considers only c-1 discriminant features and neglects the boundary structure when computing the between-class scatter matrix. An improved non-parametric nonlinear (kernel) algorithm was therefore proposed, which adds a weight function when computing the between-class scatter matrix and thus overcomes the above two disadvantages of KDA. Simulation results show that the recognition performance of the new method is superior to that of existing methods; moreover, it avoids the singular value decomposition of matrices, so it has practical value.
Improved Method of Global Optical Flow Estimation
Computer Science. 2012, 39 (Z6): 510-512. 
Abstract PDF(250KB) ( 529 )   
RelatedCitation | Metrics
The optical flow estimation algorithm based on the optical flow constraint and the smoothness constraint presented by Horn and Schunck is one of the important algorithms for image motion estimation. The algorithm cannot obtain accurate motion parameter estimates at low-gradient points. Moreover, existing improved methods require manually selected parameters, and when the threshold value is set too high the object area becomes incomplete. Two optical flow estimation methods were presented by modifying the weighted function of the basic optical flow constraint. Experimental results show that the improved methods can reduce the suppression of reliable optical flow when the threshold is set too high and improve the adaptivity, which lays a good foundation for moving object detection and tracking.
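The classical Horn-Schunck estimator that the abstract builds on minimizes the optical flow constraint plus a smoothness term, giving the iterative update u <- ubar - Ix(Ix*ubar + Iy*vbar + It)/(alpha^2 + Ix^2 + Iy^2), and likewise for v. A minimal sketch on a synthetic pair shifted by one pixel follows; the derivative and averaging schemes are simplified relative to the original paper, and the wrap-around boundaries are a shortcut:

```python
import numpy as np

def avg(f):
    # four-neighbour average with wrap-around boundaries (for brevity)
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0

def horn_schunck(I1, I2, alpha=1.0, iters=100):
    Ix = np.gradient(I1, axis=1)   # spatial derivatives of frame 1
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                   # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(iters):
        ub, vb = avg(u), avg(v)
        num = Ix * ub + Iy * vb + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = ub - Ix * num / den
        v = vb - Iy * num / den
    return u, v

# synthetic pair: a horizontal ramp shifted one pixel to the right
I1 = np.tile(np.arange(8.0), (8, 1))
I2 = I1 - 1.0
u, v = horn_schunck(I1, I2)
print(round(float(u.mean()), 3))  # one pixel per frame of horizontal flow
```

The abstract's criticism applies here too: at low-gradient points the denominator is dominated by alpha^2 and the data term contributes little, which is exactly where the paper's modified weighting intervenes.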
Image Fusion Algorithm Based on Second Generation Curvelet Transform and Regional Matching Degree
Computer Science. 2012, 39 (Z6): 513-514. 
Abstract PDF(490KB) ( 385 )   
RelatedCitation | Metrics
This paper puts forward a new multi-sensor image fusion algorithm based on the second-generation Curvelet transform and discusses the fusion rules for coarse-scale and fine-scale coefficients separately. First, the original images are decomposed at multiple scales using the Curvelet transform, and the coarse-scale coefficients are transformed to equalize their strength distribution. Then the weighted average method is used to determine the fused coarse-scale coefficients, and a joint analysis of significance measure and regional matching degree is used to determine the fused fine-scale coefficients. Finally, consistency verification and the inverse transform are carried out to acquire the fused image. The traditional method and the new method are compared from three aspects: independent factors, united factors and comprehensive evaluation. The experiments prove the usefulness of the method, which preserves edges and obtains better performance parameters and visual effects.
Discrete Error Complementary Strategy Based Control Method to Quantify Various Difficulty Levels of Test Questions
Computer Science. 2012, 39 (Z6): 515-518. 
Abstract PDF(314KB) ( 563 )   
RelatedCitation | Metrics
Automatic test paper generation is extremely important for high schools, with the purpose of making examinations normative and scientific. The average score of students can be controlled through the average difficulty level of the examination paper, and quantifying the numbers of questions at various difficulty levels is a key step. A common strategy is to use a normal distribution to specify those numbers. After discretizing the normal distribution function over the difficulty factor, the probability distribution of the difficulty levels was obtained by symmetric integration, and a proportional error complementary method was adopted to reduce rounding errors. Analysis of the generated papers shows that a standard deviation of 0.2 is most suitable. Finally, the computed numbers of questions at each difficulty level are listed for various total numbers of questions. The results show that the control method using the discrete error complementary strategy can control the quantity of questions at each difficulty level well and makes the automatic paper-generation strategy effective.
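The discretization step above can be sketched numerically: integrate a normal distribution over difficulty bins, round to whole question counts, and push the rounding error onto the largest bin. The five bins, the mean of 0.5, and the total of 40 questions are illustrative assumptions; only the standard deviation of 0.2 comes from the abstract:

```python
import math

def normal_cdf(x, mu=0.5, sigma=0.2):
    # cumulative distribution of N(mu, sigma^2)
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# illustrative difficulty bins over the difficulty factor in [0, 1]
levels = [(0.0, 0.3), (0.3, 0.45), (0.45, 0.55), (0.55, 0.7), (0.7, 1.0)]
total = 40  # total number of questions on the paper

probs = [normal_cdf(b) - normal_cdf(a) for a, b in levels]
s = sum(probs)
probs = [p / s for p in probs]                 # renormalize to [0, 1]
counts = [round(p * total) for p in probs]
# error complement: push any rounding residue onto the largest bin
counts[probs.index(max(probs))] += total - sum(counts)
print(counts, sum(counts))
```

The correction guarantees the counts always sum to the requested total, which is the property the abstract's error-complementary strategy is after.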
Adaptive Multiple Windows Stereo Matching Algorithm
Computer Science. 2012, 39 (Z6): 519-521. 
Abstract PDF(595KB) ( 573 )   
RelatedCitation | Metrics
A new adaptive multiple-window stereo matching algorithm was presented to address the mismatches that easily occur in textureless image regions during stereo matching. The algorithm uses eight identical windows in eight directions, chosen according to the local smoothness of the image, to find an appropriate support area, and this scheme is well suited to parallel processing. An adaptive region-growing algorithm is then presented to refine the result of the previous steps, which still leaves something to be desired. Experimental results show that the method improves matching in both textureless and textured regions, and its running time is close to that of the traditional fixed-window local method, making it suitable for real-time processing.
Research on Multi-touch Gesture Analysis and Recognition Algorithm
Computer Science. 2012, 39 (Z6): 522-525. 
Abstract PDF(284KB) ( 781 )   
RelatedCitation | Metrics
Currently, multi-touch gestures on touch systems lack an ideal description and analysis. This research presents a general multi-touch gesture analysis and design framework, rationally analyzes high-performance algorithms, and optimizes multi-touch commands. Thresholds on contact displacement and time serve the dual function of improving the accuracy of contact identification and preventing false judgments caused by sudden noise, while smoothing jitter during fast operation. An RBF neural network model is used to solve the dynamic gesture recognition problem, and Euclidean-distance-based clustering statistics are introduced as the characteristic parameters of the network, greatly improving the efficiency and accuracy of multi-touch gesture recognition.
Noisy Digit Recognition Based on Discrete Hopfield Neural Network
Computer Science. 2012, 39 (Z6): 526-528. 
Abstract PDF(476KB) ( 609 )   
RelatedCitation | Metrics
Based on the Hebb learning rule, noisy and distorted digits 0-9 were identified using the associative memory ability of a discrete Hopfield neural network. The memory samples were improved by orthogonalization, the improved samples were learned with the Hebb rule to obtain the weight matrix, and a noisy digit could then be identified from its pattern information. Identification experiments on noisy digits using the improved Hopfield neural network show that the method improves the memory capacity and the correct identification rate of the traditional network.
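The associative-memory mechanism above can be sketched on a tiny scale: Hebbian weights are built from bipolar (+1/-1) memory patterns, and a corrupted pattern is iterated back to a stored one. The two 6-bit patterns stand in for the paper's digit templates, and its orthogonalization step is omitted here:

```python
import numpy as np

# two bipolar memory patterns (stand-ins for digit templates)
patterns = np.array([
    [1, 1, 1, -1, -1, -1],
    [-1, -1, -1, 1, 1, 1],
])

# Hebb rule: sum of outer products, no self-connections
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(x, steps=5):
    # synchronous threshold updates until (hopefully) a fixed point
    x = x.copy()
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x

noisy = np.array([1, -1, 1, -1, -1, -1])  # pattern 0 with one bit flipped
print(recall(noisy).tolist())
```

The flipped bit is pulled back toward the stored pattern because the Hebbian weights make each memory a stable attractor of the update dynamics.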
Analysis on System Safety of Campus Mobile Card
Computer Science. 2012, 39 (Z6): 529-531. 
Abstract PDF(241KB) ( 381 )   
RelatedCitation | Metrics
With the introduction of mobile campus card services, data security needs to be explored and improved in order to promote management efficiency and enhance the management level. The impacts of insecurity in the mobile card system were analyzed, and the key systems, card encryption, communication security and data storage security were researched and designed.
Construction of Multi-layer Semantic Image Database Based on Mpeg-7
Computer Science. 2012, 39 (Z6): 532-535. 
Abstract PDF(676KB) ( 336 )   
RelatedCitation | Metrics
This study addressed the problems of present image databases: incomplete image semantics, no unified content description standard and no security, with images stored independently of the DBMS. The content architecture of the image database was determined with low-level features such as color, texture and shape, middle-level objects, and high-level emotion. A Mpeg-7 based multi-layer semantic description framework was proposed by modifying and extending Mpeg-7. A multi-layer semantic image database with 188 images was established using the ORDB multimedia database implementation techniques in Oracle9i. Theoretical analysis and experimental results show the distinctive superiority of the image database.
Relevance Feedback Algorithm Based on Fuzzy Semantic Relevance Matrix in Image Retrieval
Computer Science. 2012, 39 (Z6): 540-542. 
Abstract PDF(508KB) ( 345 )   
RelatedCitation | Metrics
The semantic gap between low-level visual features and high-level semantic concepts is an obstacle to the development of image retrieval, and relevance feedback techniques narrow this gap to some extent. A relevance feedback algorithm was presented based on a fuzzy semantic relevance matrix (FSRM). During the retrieval process, the weights in the FSRM are adjusted according to the user's feedback, and the FSRM is refined through repeated learning. Experimental results show the effectiveness of the algorithm.
Method for Image Denoising Based on Multiscale Geometric Analysis
Computer Science. 2012, 39 (Z6): 543-545. 
Abstract PDF(493KB) ( 343 )   
RelatedCitation | Metrics
The edges and outlines of an image are very useful in pattern recognition, but they are difficult to extract when the signals are polluted by noise. We first introduced multiscale geometric analysis theory and then proposed an image denoising method based on the scale factor and the contourlet transform. Experiments show that the performance of the proposed method is clearly superior to other methods both visually and in SNR.
Algorithm for Edge Detection of Image Based on Mathematical Morphology
Computer Science. 2012, 39 (Z6): 546-548. 
Abstract PDF(747KB) ( 410 )   
RelatedCitation | Metrics
In order to improve the efficiency of edge detection and decrease the impact of noise on it, an image edge detection algorithm based on mathematical morphology was proposed. The concept of multivariate structuring elements was introduced, and an improved morphological edge detection operator was proposed, which can effectively detect the edges of a noisy image while keeping them smooth. Experiments demonstrate that, compared with traditional edge detectors, this detector has good noise immunity, better real-time performance, and practical feasibility.
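The morphological principle behind such detectors can be sketched as the basic morphological gradient: the dilation of the image minus its erosion. A single 3x3 structuring element is used here; the paper's multivariate structuring elements are a refinement of this idea, not reproduced from it:

```python
def morph(img, op):
    # apply op (max = dilation, min = erosion) over a 3x3 neighbourhood
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = op(vals)
    return out

# a binary image with a 3x3 block of foreground
img = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]

dil, ero = morph(img, max), morph(img, min)
edge = [[d - e for d, e in zip(dr, er)] for dr, er in zip(dil, ero)]
for row in edge:
    print(row)
```

The gradient responds only where the neighbourhood straddles a boundary: the fully interior pixel stays zero while the transition band lights up.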
Display and Roaming of Water-environments Project
Computer Science. 2012, 39 (Z6): 549-551. 
Abstract PDF(507KB) ( 296 )   
RelatedCitation | Metrics
In order to better display the water environment, several water environment modeling methods were proposed in this paper. In addition, methods for presenting water patterns and for showing changes in water level and water quality were introduced. By using built-in Vega functions such as pathing, the navigator and motion models, both automatic roaming and interactive scene roaming are achieved. Finally, the callback method is used to display multiple windows, so that multiple scenes can be presented simultaneously. These methods and functions have been successfully applied to urban water environment simulations and achieved good results.
Multi-scale Edge Detection in the EMD Domain Using the Adaptive Noise Threshold
Computer Science. 2012, 39 (Z6): 552-554. 
Abstract PDF(222KB) ( 364 )   
RelatedCitation | Metrics
A multi-scale edge detection method in the EMD domain, using the median absolute deviation to estimate the noise threshold, was proposed to detect the edges of noisy signals with different signal-to-noise ratios. The new algorithm first calculates the slope of the EMD residuals at each scale. Second, it eliminates the noise in the slope signal by thresholding, where the noise threshold is estimated by the median absolute deviation. At last, it outputs the edge information through a spatial consistency test. Simulation results show that the multi-scale edge detection in the EMD domain using the adaptive noise threshold can accurately detect the edges of signals while suppressing the noise.
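The median-absolute-deviation (MAD) threshold estimate named in the abstract is standard: for Gaussian noise, sigma ≈ MAD / 0.6745, and the threshold is a multiple of that estimate. A sketch (the multiplier `k` and the hard-thresholding form are assumptions, not the paper's exact choices):

```python
import numpy as np

def mad_noise_threshold(signal, k=3.0):
    """Robust noise threshold: sigma_hat = MAD / 0.6745 estimates the
    Gaussian noise std; k * sigma_hat is used as the threshold."""
    med = np.median(signal)
    mad = np.median(np.abs(signal - med))
    return k * mad / 0.6745

def threshold_slope(slope, thr):
    """Hard-threshold the slope signal: values below thr count as noise."""
    out = slope.copy()
    out[np.abs(out) < thr] = 0.0
    return out
```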
Method of Pyrrophyta Cingulum Segmentation Based on Dominant Gray Levels and Watershed Transform
Computer Science. 2012, 39 (Z6): 555-558. 
Abstract PDF(599KB) ( 310 )   
RelatedCitation | Metrics
To tackle the over-segmentation problem of the watershed algorithm, a method based on dominant gray levels and marker-constrained watershed was proposed to extract the cingulum region. In this algorithm, the original image was reconstructed from dominant gray levels, eliminating local minima and noise disturbance. Then markers of regional minima were extracted from the gradient image using an automatic threshold, and constrained by shape, area and centroid. The watershed transform of the marker-modified gradient image was performed to achieve cingulum region segmentation. Simulation results show that the new algorithm yields fewer over-segmented areas, more accurate segmentation results and higher flexibility compared with traditional algorithms.
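The reconstruction step can be illustrated by snapping each pixel to its nearest dominant gray level, which flattens small fluctuations that would otherwise become spurious regional minima. This is only a sketch; the paper's procedure for selecting the dominant levels is not reproduced.

```python
import numpy as np

def quantize_to_dominant_levels(img, levels):
    """Reconstruct an image by mapping every pixel to the nearest dominant
    gray level, suppressing spurious local minima before the watershed
    step (illustrative; level selection is assumed given)."""
    levels = np.asarray(sorted(levels), dtype=float)
    diff = np.abs(img[..., None] - levels[None, None, :])
    return levels[diff.argmin(axis=-1)]
```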
Modeling and Simulation of Neutral Process in Industry Wastewater Based on VC++
Computer Science. 2012, 39 (Z6): 559-562. 
Abstract PDF(316KB) ( 347 )   
RelatedCitation | Metrics
Aiming at the limitation that the common neutralization treatment process for wastewater is confined to a single reactor, this paper proposed a new model comprising a neutralization pool, a regulating pool and an accident pool. The model was built from the neutralization reaction mechanism and, after setting initial conditions, was solved numerically with the Runge-Kutta method. The software implements this algorithm and presents dynamic effects, real-time data and historical curves through a friendly interface. Simulation tests showed that it can faithfully simulate the neutralization process of sewage treatment plants, can be used for teaching and experiments, and helps students become familiar with the sewage treatment process and complete control experiments.
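The classical fourth-order Runge-Kutta step the abstract refers to is shown below, applied to a simplified well-mixed-pool mass balance dc/dt = (q/v)(c_in − c). The pool model is a stand-in assumption; the paper's actual neutralization kinetics are not reproduced.

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate_pool(c_in, q, v, c0, t_end, h=0.1):
    """Well-mixed pool mass balance dc/dt = (q/v) * (c_in - c), a
    simplified stand-in for one pool of the treatment model."""
    f = lambda t, c: (q / v) * (c_in - c)
    t, c = 0.0, c0
    while t < t_end - 1e-12:
        c = rk4_step(f, t, c, h)
        t += h
    return c
```

For this linear model the exact solution is c(t) = c_in + (c0 − c_in)·e^(−qt/v), which makes the integrator easy to check.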
Research on Three-dimensional Modeling of the Campus Library Based on ArcGIS
Computer Science. 2012, 39 (Z6): 563-565. 
Abstract PDF(742KB) ( 755 )   
RelatedCitation | Metrics
With the expansion of three-dimensional map applications, the technique of three-dimensional modeling has become more sophisticated. To study three-dimensional visualization of a campus, the paper expounded three-dimensional modeling of a campus library based on ArcGIS, covering symbol-based modeling, plug-in-based modeling and DEM-based modeling. Furthermore, the advantages and disadvantages of each modeling method were discussed.
Research on Face Recognition Algorithm Based on SVD and RBF Neural Network
Computer Science. 2012, 39 (Z6): 566-569. 
Abstract PDF(309KB) ( 497 )   
RelatedCitation | Metrics
To solve the problem of high dimensionality and small sample size in face recognition, this paper proposed a facial feature extraction and recognition method based on singular value decomposition (SVD) and a radial basis function (RBF) neural network. Through singular value decomposition, dimension reduction and compression, vector standardization and vector sorting, the singular-value feature vectors used for identification are obtained. An RBF neural network classifier is then used to classify and recognize faces. Experiments and data analysis on the ORL database show that this method performs well in both classification error rate and learning efficiency.
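The feature-extraction chain (SVD, truncation, sorting, normalization) can be sketched in a few lines. The truncation length `k` and unit-norm standardization are assumptions chosen for illustration; they need not match the paper's parameters.

```python
import numpy as np

def svd_features(face, k=20):
    """Extract a sorted, normalized top-k singular-value vector from a
    2-D face image; a sketch of SVD-based feature extraction."""
    s = np.linalg.svd(face, compute_uv=False)  # singular values, descending
    s = s[:k]                                  # dimension reduction / compression
    norm = np.linalg.norm(s)
    return s / norm if norm > 0 else s         # vector standardization
```

The resulting vector would then be fed to a classifier such as the RBF network described in the abstract.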
Color and Gradient Orientation Features Integrated Particle Filter Tracking
Computer Science. 2012, 39 (Z6): 570-572. 
Abstract PDF(816KB) ( 312 )   
RelatedCitation | Metrics
Aiming at the drawbacks of standard particle filters for visual tracking in complex environments and under illumination changes, a new particle filter based on color and gradient orientation features was proposed to overcome the low robustness of particle filter tracking that relies on the color feature alone. By designing a feature fusion model, performance under environmental change was improved over using color alone. In addition, the proposed target model update algorithm improves adaptability to complex scene variation. Experiments indicate that the proposed method is effective and robust for visual tracking against complex backgrounds.
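One common way to fuse two cues into a particle weight is to map each feature distance to a Gaussian likelihood and combine them convexly. This is a generic sketch, not the paper's fusion model; `alpha` and `sigma` are illustrative parameters.

```python
import numpy as np

def fused_weight(color_dist, grad_dist, alpha=0.5, sigma=0.2):
    """Fuse color and gradient-orientation cues into one particle weight.
    Each distance (e.g. Bhattacharyya distance between histograms) is
    mapped to a likelihood, then the two are convexly combined."""
    w_color = np.exp(-color_dist ** 2 / (2 * sigma ** 2))
    w_grad = np.exp(-grad_dist ** 2 / (2 * sigma ** 2))
    return alpha * w_color + (1 - alpha) * w_grad
```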
Designing and Developing the Communication Helper Based on Android
Computer Science. 2012, 39 (Z6): 573-576. 
Abstract PDF(576KB) ( 417 )   
RelatedCitation | Metrics
Android is an open-source smartphone operating system launched by Google, and more and more people are researching and using it. The paper built an Android application development platform and implemented an Android application, working from overall design to detailed design. The communication helper software designed here is based on Android. It can group the contacts stored in the phone, store contact details, and quickly place phone calls and send SMS and email. The interface is built with the Android UI, and the program is debugged on the emulator until the code passes.
New Co-training Algorithm for Multi-relation Data
Computer Science. 2012, 39 (Z6): 535-539. 
Abstract PDF(318KB) ( 412 )   
RelatedCitation | Metrics
Semi-supervised learning is a hot research topic in machine learning, and co-training is a multi-view semi-supervised learning method. A graph representation was introduced into co-training, where multiple graphs are used to represent multi-relation data. Semi-supervised learning is performed on each graph, while co-training between graphs ensures that the predictions of the graphs are consistent. A new co-training algorithm for multi-relation data was proposed and analyzed from the viewpoint of probability. Encouraging experimental results were obtained on real-world multi-relation datasets.
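The overall loop can be sketched as follows: each round, one classifier per view predicts the unlabeled points, and points on which both views agree are added to the labeled set, mirroring the constraint that the graphs' predictions coincide. This is a minimal illustration with nearest-centroid base learners and plain feature views standing in for the paper's graph representation.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Fit one centroid per class (a deliberately simple base learner)."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(model, X):
    classes, centroids = model
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return classes[d.argmin(axis=1)]

def co_train(views, y, labeled_idx, rounds=5):
    """Minimal two-view co-training: points on which both views' predictions
    agree are moved into the labeled set each round."""
    y = y.copy()
    labeled = set(labeled_idx)
    for _ in range(rounds):
        unlabeled = [i for i in range(len(y)) if i not in labeled]
        if not unlabeled:
            break
        idx = sorted(labeled)
        preds = [nearest_centroid_predict(nearest_centroid_fit(X[idx], y[idx]),
                                          X[unlabeled])
                 for X in views]
        agree = preds[0] == preds[1]
        for j, i in enumerate(unlabeled):
            if agree[j]:
                y[i] = preds[0][j]
                labeled.add(i)
    return y, labeled
```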