Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Editors
Current Issue
Volume 39 Issue 4, 16 November 2018
  
P-sets, Inverse P-sets and the Intelligent Fusion-filter Identification of Information
Computer Science. 2012, 39 (4): 1-13. 
Abstract PDF(1042KB) ( 381 )   
RelatedCitation | Metrics
Retrieval of 3D CAD Models: Survey
Computer Science. 2012, 39 (4): 14-22. 
Abstract PDF(998KB) ( 1219 )   
RelatedCitation | Metrics
A comprehensive survey of up-to-date retrieval methods for 3D CAD models was presented. Owing to the great differences in representations and retrieval algorithms between 3D CAD models and other general models, CAD model retrieval is divided into two hierarchies: visual-similarity-based retrieval and semantic-functional-description-based retrieval. Representations and algorithms of each hierarchy were discussed and analyzed systematically. Benchmarks and evaluations of 3D CAD model search methods were summarized. Finally, existing difficulties and future research directions were discussed.
On Multi-granular Labeled Classification for Spatial Remote Sensing Data
Computer Science. 2012, 39 (4): 23-27. 
Abstract PDF(391KB) ( 356 )   
RelatedCitation | Metrics
This paper studied the discovery of classification rules for remote sensing data with multi-granular labels. According to the different physical meanings of land cover, we first obtained a multi-scale decision table, and then discovered the classification rules hidden in this multi-scale decision table. Finally, we analyzed the accuracy of the classification.
Research on Clone Detection for Large-scale Model
Computer Science. 2012, 39 (4): 28-31. 
Abstract PDF(508KB) ( 481 )   
RelatedCitation | Metrics
Model clone detection is a very important technology for software maintenance, software structure optimization, etc. The paper first summarized the definition of model clone. Next, the overall detection process for model clones was divided into stages and discussed in detail. After that, two kinds of typical clone detection technologies for large-scale models were introduced. Finally, the paper reviewed the state of the art of current model clone detection research, identified some open research issues and pointed out possible future research directions in the model clone detection area.
Multi-classification Algorithm for Indoor Positioning Based on Support Vector Machine
Computer Science. 2012, 39 (4): 32-35. 
Abstract PDF(360KB) ( 566 )   
RelatedCitation | Metrics
A multi-classification algorithm for indoor positioning based on SVM was proposed to tackle the problems of low precision and fluttering results faced by many real-time location systems. Traditional matching algorithms based on sampling points are often deficient in dealing with nonlinear problems and produce jumping results over short periods. To handle this limitation, the object location process was treated as a multi-classification problem by introducing a grid concept: K candidate grids were first obtained using SVM, these candidates were then refined by previous location results, and the final accurate result was obtained through a Kalman filter. Temporal information was utilized in the matching process to make the object movement more stable and smooth. Experiments show the superiority of our method over the naive SVM method.
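The following is a minimal illustrative sketch of the two-stage idea described above: an SVM classifies RSS measurements into candidate grid cells, and a simple Kalman filter smooths the resulting coordinates. The grid layout, synthetic fingerprints and noise parameters are assumptions for illustration, not the authors' experimental setup.

```python
# Illustrative sketch only: SVM grid classification + Kalman smoothing
# (grid layout, data and noise parameters are assumed, not from the paper).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Assumed 4x4 grid of 1 m cells; each grid cell has a center coordinate.
grid_centers = np.array([[i + 0.5, j + 0.5] for i in range(4) for j in range(4)])

# Synthetic RSS fingerprints: 5 access points, signal ~ -distance + noise.
aps = np.array([[0, 0], [4, 0], [0, 4], [4, 4], [2, 2]])
def rss(pos):
    return -np.linalg.norm(aps - pos, axis=1) + rng.normal(0, 0.3, len(aps))

X_train = np.vstack([rss(c) for c in grid_centers for _ in range(20)])
y_train = np.repeat(np.arange(len(grid_centers)), 20)

clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)   # multi-class SVM

# Constant-position Kalman filter over the (x, y) grid estimates.
x_est, P = grid_centers[0].copy(), np.eye(2)
Q, R = 0.05 * np.eye(2), 0.5 * np.eye(2)    # process / measurement noise (assumed)
track = []
for true_pos in [np.array([0.5, 0.5]), np.array([1.2, 0.8]), np.array([1.9, 1.4])]:
    z = grid_centers[clf.predict(rss(true_pos).reshape(1, -1))[0]]  # SVM measurement
    P = P + Q
    K = P @ np.linalg.inv(P + R)            # Kalman gain
    x_est = x_est + K @ (z - x_est)         # fuse grid estimate with prediction
    P = (np.eye(2) - K) @ P
    track.append(x_est.copy())
print(np.round(track, 2))
```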
Topology Optimization of Network Coding Based P2P TV
Computer Science. 2012, 39 (4): 36-40. 
Abstract PDF(566KB) ( 392 )   
RelatedCitation | Metrics
With network coding (NC), intermediate nodes can form outgoing packets by coding incoming packets to achieve the theoretical maximum throughput of multicast. Network coding has been applied in P2P TV systems to improve delivery ratio, delay and other performance metrics. Generally, it is simplified in existing P2P TV systems to shorten the waiting time of nodes for incoming data packets and reduce computing overhead. This introduces an impact of topology on the redundancy ratio of data packets and increases system overhead. The cause of the redundancy brought by topology was quantitatively analyzed, and a topology optimization scheme named instant control was proposed to control and optimize topology spontaneously. Experimental results show that instant control effectively reduces the redundancy ratio of data packets caused by topology. Compared with other similar work, it achieves a better tradeoff between the redundancy ratio of data packets and the utilization ratio of uplink bandwidth capacity, and further reaches a higher delivery ratio.
Novel Revocable Short Group Signatures Scheme without Encryption
Computer Science. 2012, 39 (4): 41-45. 
Abstract PDF(546KB) ( 470 )   
RelatedCitation | Metrics
Aiming at the intrinsic problems of revocation in group signatures, such as reducing group members' computational costs and shortening the signature length, a novel revocable short group signature scheme without encryption was proposed based on the XDDH, LRSW and SDLP assumptions, and its security was proven. Member revocation is implemented by encoding the validity time into the group signing key. In particular, our scheme does not use standard encryption and relies on re-randomizable signature schemes that hide the signed message so as to preserve the anonymity of signers. Our solution outperforms all prior solutions for member revocation in terms of communication and computational costs for the members. The group public key remains constant, the computational costs of signing and verifying are independent of the number of revoked members, and the signature is only 1195 bits in size.
Research on P2P Traffic Identification Based on K-means Ensemble and SVM
Computer Science. 2012, 39 (4): 46-48. 
Abstract PDF(352KB) ( 438 )   
RelatedCitation | Metrics
A P2P traffic identification model was constructed by combining a K-means ensemble with a support vector machine. It achieves high accuracy and stability while avoiding the complexity of clustering models. Firstly, three base clusterers were formed from a few labeled samples, and each cluster's label was assigned by MAP estimation. Each unlabeled sample then receives the label of its closest cluster. An SVM-based identification model was built from the resulting sample set. The model makes the best of the stability of ensemble learning and the generalization ability of SVM; theoretical analysis and experimental results demonstrate its feasibility.
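Below is a rough sketch of this pipeline on synthetic data: three K-means base clusterers label their clusters from a few labeled samples (a simple majority vote stands in for the MAP assignment), pseudo-labels are produced by ensemble voting, and an SVM is trained on the result. All data and settings are assumed for illustration.

```python
# Illustrative sketch of the pipeline described above (synthetic data;
# the MAP cluster labeling is simplified to a majority vote of the few
# labeled samples falling in each cluster).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.datasets import make_blobs

X, y = make_blobs(n_samples=600, centers=2, cluster_std=2.0, random_state=1)
labeled = np.zeros(len(X), dtype=bool)
labeled[np.random.default_rng(1).choice(len(X), 20, replace=False)] = True  # few labels

pseudo_votes = np.zeros((len(X), 2))
for seed in (0, 1, 2):                       # three base clusterers
    km = KMeans(n_clusters=2, n_init=10, random_state=seed).fit(X)
    for c in range(2):                       # give each cluster the majority label
        members = km.labels_ == c
        lab_members = members & labeled
        if lab_members.any():
            cluster_label = np.bincount(y[lab_members], minlength=2).argmax()
            pseudo_votes[members, cluster_label] += 1

pseudo_y = pseudo_votes.argmax(axis=1)       # ensemble vote -> pseudo labels
pseudo_y[labeled] = y[labeled]               # keep true labels where known

svm = SVC(kernel="rbf").fit(X, pseudo_y)     # final identification model
print("training accuracy vs. ground truth:", (svm.predict(X) == y).mean())
```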
Design and Implementation of WSNs Middleware Supporting Multiple Application Task
Computer Science. 2012, 39 (4): 49-52. 
Abstract PDF(364KB) ( 378 )   
RelatedCitation | Metrics
As the infrastructure of the Internet of Things, wireless sensor networks must support multiple application tasks to satisfy the requirements of diversified users. Current wireless sensor network middleware solutions focus mainly on a single application domain. An architecture of middleware supporting multiple application tasks was proposed in this paper. Based on this architecture, the entire system and all related components were designed and implemented. According to diverse requirements, the middleware can support different application tasks. Experiments based on a simulated network prove that the middleware has good feasibility and practicability.
Set Pair Analysis Method for Evaluating Denial of Service Attack Resistance Ability
Computer Science. 2012, 39 (4): 53-55. 
Abstract PDF(337KB) ( 589 )   
RelatedCitation | Metrics
Attack resistance test (ART) is an important security evaluation method, and evaluating a system's attack resistance ability has become a new domain of ART. A new evaluation method of DoS attack resistance ability based on set pair analysis was presented. A process for systematically analyzing DoS attack resistance ability was put forward. Identity-discrepancy-contrary connection numbers were determined by a fuzzy method. Using rough set theory, the nonlinear combination weights of the evaluation indices were established. A simulation example shows that the approach is concise and effective.
Delay-constrained Binary Exponential Backoff Algorithm
Computer Science. 2012, 39 (4): 56-59. 
Abstract PDF(435KB) ( 704 )   
RelatedCitation | Metrics
To address the timeliness problem of IEEE 802.11's binary exponential backoff algorithm, which uses a maximum number of transmission attempts, a new delay-constrained binary exponential backoff algorithm was proposed. It uses a timeout threshold instead of the maximum number of transmission attempts as the criterion for packet dropping. It increases the system's unitary effective throughput without dramatically reducing the unitary throughput, which makes it suitable for wireless Ad hoc networks with strict delay requirements. A Markov model was introduced to analyze system performance. The simulations validate the accuracy of the model and the advantage of the proposed delay-constrained BEB over the standard BEB.
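A toy simulation of the core idea may help: the only change from standard binary exponential backoff is the drop rule, which compares the packet's accumulated waiting time with a timeout threshold instead of counting retransmission attempts. Collision probability, slot length and thresholds below are arbitrary assumptions, not values from the paper.

```python
# Toy simulation contrasting the standard retry-limit drop rule with the
# delay-constrained rule (drop when the packet's waiting time exceeds a
# timeout threshold). All parameters are assumptions for illustration only.
import random

SLOT, CW_MIN, CW_MAX = 1.0, 16, 1024

def send_packet(p_collision, max_retries=None, timeout=None):
    """Return (delivered, elapsed_time) for one packet."""
    cw, elapsed, attempt = CW_MIN, 0.0, 0
    while True:
        elapsed += random.randint(0, cw - 1) * SLOT      # backoff wait
        if random.random() > p_collision:
            return True, elapsed                         # success
        attempt += 1
        cw = min(cw * 2, CW_MAX)                         # binary exponential backoff
        if max_retries is not None and attempt > max_retries:
            return False, elapsed                        # dropped: retry limit
        if timeout is not None and elapsed > timeout:
            return False, elapsed                        # dropped: delay constraint

random.seed(0)
for name, kwargs in [("retry-limit", dict(max_retries=6)),
                     ("delay-constrained", dict(timeout=2000.0))]:
    results = [send_packet(0.6, **kwargs) for _ in range(10000)]
    delivered = [t for ok, t in results if ok]
    print(f"{name:17s} delivery={len(delivered)/len(results):.2f} "
          f"mean-delay={sum(delivered)/len(delivered):.0f} slots")
```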
Application of Branch and Bound Algorithm Based on K-MEANS Clustering in Network Anomaly Detection
Computer Science. 2012, 39 (4): 60-62. 
Abstract PDF(339KB) ( 527 )   
RelatedCitation | Metrics
Network anomaly detection has become one of the focal research topics in the field of intrusion detection. However, issues such as accurate selection of key data in the training set, long selection time, and a high rate of detection misstatements are still unresolved. To address these problems, K-MEANS and the branch and bound algorithm are integrated to build a network anomaly detection model, which can significantly improve the accuracy of key data selection and reduce time consumption as well. A series of experiments on the well-known KDD Cup 1999 dataset demonstrates that the model can achieve high detection accuracy and effectively constrain the false alarms caused by detection.
Non-interactive and Non-malleable Commitment Scheme Based on Lattice
Computer Science. 2012, 39 (4): 63-66. 
Abstract PDF(348KB) ( 598 )   
RelatedCitation | Metrics
NTRU is a well-known public-key cryptosystem based on the difficulty of lattice reduction problems, and is mainly applied in public-key encryption and digital signature. This paper constructed a non-interactive and non-malleable commitment scheme whose security relies on the intractability of the closest vector problem (CVP) on lattices, and the binding property for the committer is satisfied. The validity of a commitment is verified by the collision resistance of a hash function. By perturbing the plaintext with a randomized mapping, the plaintext becomes randomly distributed, so the scheme satisfies the hiding property against the verifier and is non-malleable with respect to decommitment. The scheme is as efficient as NTRU, and can resist channel eavesdropping attacks, message replay attacks and commitment copying attacks.
Efficient RFID Location Method Based on Virtual Signal Strength
Computer Science. 2012, 39 (4): 67-70. 
Abstract PDF(348KB) ( 618 )   
RelatedCitation | Metrics
After analyzing the serious interference, low accuracy and other drawbacks of existing typical positioning methods, a location algorithm based on virtual signal strength was proposed. The algorithm introduces the concept of virtual signal strength and constructs a virtual space with a classic radio propagation model in the location area. Besides, a function is introduced into the algorithm to reduce the shadow effect caused by σ, the standard deviation of the normal random variable. The simulation results show that, compared with the classical LANDMARC and VIRE, our algorithm has higher accuracy and effectively solves the interference problem brought by the deployment of high-density reference tags.
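As a hedged illustration of the virtual-signal-strength idea, the sketch below predicts RSS at a dense grid of virtual reference tags with a log-distance path-loss model and locates a target by a weighted nearest-neighbour match in RSS space (LANDMARC/VIRE style). The reader layout, path-loss constants and noise level are invented for the example, and the paper's shadowing correction function is not reproduced.

```python
# Illustrative sketch of virtual-reference-tag positioning; all parameters
# and the layout are assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
readers = np.array([[0, 0], [8, 0], [0, 8], [8, 8]], dtype=float)

def path_loss_rss(pos, noise=0.0):
    d = np.linalg.norm(readers - pos, axis=1) + 1e-6
    return -40 - 20 * np.log10(d) + rng.normal(0, noise, len(readers))

# Virtual reference tags on a dense grid; their RSS comes from the model,
# so no extra physical tags need to be deployed.
xs, ys = np.meshgrid(np.linspace(0, 8, 33), np.linspace(0, 8, 33))
virtual_tags = np.column_stack([xs.ravel(), ys.ravel()])
virtual_rss = np.array([path_loss_rss(p) for p in virtual_tags])

def locate(target_pos, k=4):
    measured = path_loss_rss(target_pos, noise=1.0)
    dist = np.linalg.norm(virtual_rss - measured, axis=1)   # RSS-space distance
    idx = np.argsort(dist)[:k]
    w = 1.0 / (dist[idx] + 1e-6)                            # inverse-distance weights
    return (virtual_tags[idx] * w[:, None]).sum(0) / w.sum()

true = np.array([3.3, 5.1])
est = locate(true)
print("estimate:", np.round(est, 2),
      "error:", round(float(np.linalg.norm(est - true)), 2))
```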
Anonymous and Traceable Copyright Protection Protocol Based on Mobile Devices
Computer Science. 2012, 39 (4): 71-74. 
Abstract PDF(333KB) ( 405 )   
RelatedCitation | Metrics
An anonymous and traceable copyright protection protocol based on mobile devices was put forward in this paper. Firstly, a changing identity replaces the user's true identity in the protocol so that others cannot track the user, which gives the protocol anonymity. Secondly, the protocol applies one fingerprint watermark per password so that it resists fake attacks. Thirdly, while using a one-way hash function to verify the identity of participants and the digital content, the protocol transfers the computation from mobile users to the trusted center RA, which has sufficient computing ability. These two measures reduce the mobile device's computational workload and improve efficiency. In addition, the copyright watermark and fingerprint watermark embedded in the digital product are used to trace the traitor when one or more illegal copies of the digital product are found, so the protocol has the ability of traitor tracing. The analysis indicates that the protocol is secure and practical.
Novel Service Acquisition Approach Oriented to Internet of Things Based on Artificial Potential Energy Field
Computer Science. 2012, 39 (4): 75-78. 
Abstract PDF(391KB) ( 353 )   
RelatedCitation | Metrics
In order to construct a secure and efficient service environment and make it convenient for users to search for resources in the Internet of Things (IoT), the paper proposed a novel service discovery method oriented to IoT based on the artificial potential field. Firstly, the concept of a space service community was presented. Then the service acquisition method SRDBSM (service routing method based on space migration) was proposed. Using the transmission characteristics of space nodes, SRDBSM can search for the optimal node, taken as the next-hop node, within the maximum effective transmission range. Theoretical analysis and experiments indicate that, compared with random walk and modified BFS, SRDBSM shows improvements in the following aspects: reducing node energy consumption, prolonging the survival cycle of the network space community and increasing the efficiency of resource discovery.
Practical Dual WiFi NIC and Multi-channel MAC Protocol
Computer Science. 2012, 39 (4): 79-83. 
Abstract PDF(413KB) ( 460 )   
RelatedCitation | Metrics
Multi-radio multi-channel technology can efficiently improve the bandwidth and throughput of multi-hop wireless networks, so it has become a research hotspot. Among the key research issues of multi-radio multi-channel technology, wireless channel assignment and management are the most important and difficult. This paper proposed a new dual-radio multi-channel MAC protocol using standard 802.11 WiFi NICs, named DIM (Dual-Interface Management MAC protocol). DIM can optimize the channel assignment by applying a channel conflict model, and make full use of the available IEEE 802.11 wireless channels with little demand on hardware configuration. The simulation results show that DIM achieves higher network throughput and lower packet delivery latency.
Systematic and Irregular GLDPC Codes with Zigzag Component Codes
Computer Science. 2012, 39 (4): 84-88. 
Abstract PDF(384KB) ( 418 )   
RelatedCitation | Metrics
A novel systematic and irregular generalized low-density code with zigzag component codes (ZS-IGLDPC) was proposed in this paper. The degree distribution sequences were designed by Gaussian-approximation-based density evolution under the additive white Gaussian noise channel. The performance of ZS-IGLDPC was analyzed by simulation. The simulation results show that the bit error rate of ZS-IGLDPC codes is lower than that of other available codes for short and medium code lengths.
Improved Model of SNMPv3 Supporting Multicast
Computer Science. 2012, 39 (4): 89-93. 
Abstract PDF(474KB) ( 363 )   
RelatedCitation | Metrics
With the development of triple play, the types of business and the number of devices increase dramatically. Problems such as repeated data sending, large bandwidth overhead and low efficiency become more serious in typical SNMP network management systems based on unicast mode. An improved SNMPv3 model supporting multicast was proposed, and the concepts of a multicast group and a group proxy entity were introduced. In the improved SNMPv3 model, the definition of the MIB was extended and the SNMP engine was improved to identify and process multicast messages. The improved model and the main modules on management stations and proxies were given respectively. Then the implementation flows of group management and data collection based on the improved model were described. The test results for the prototype system show that data collection based on multicast polling can save the resources of senders, reduce network traffic and greatly improve the efficiency of data query and data collection.
Research on Secure Distributions for Digital Contents in Multimedia Social Networks
Computer Science. 2012, 39 (4): 94-97. 
Abstract PDF(383KB) ( 391 )   
RelatedCitation | Metrics
The emergence and rapid development of multimedia social networks make information exchange and sharing among users much more convenient, but the phenomenon of unauthorized distribution of copyrighted digital contents becomes more serious. Digital rights management is an open issue and a key challenge in open networks. With regard to the sharing and dissemination of digital contents among social network user nodes, a framework and its secure protocol for digital content distribution were proposed, based on a trusted attestation proxy party enabling remote attestation in MSNs. The proposed DRM scheme was analyzed and compared with typical existing schemes, and the results show that, combined with trusted-computing-supported high-security user terminals, the novel method implements an enhanced secure, trustworthy and controllable mechanism for digital copyright protection, and meets the requirement for platform privacy protection.
Research on P2P-based Anonymous Communication Technology
Computer Science. 2012, 39 (4): 98-100. 
Abstract PDF(345KB) ( 359 )   
RelatedCitation | Metrics
With the development and wide use of P2P networks, user privacy and security are increasingly valued. Existing encryption can protect the content of communication data but cannot protect the user's identity very well; at the same time, anonymity technology must not give outlaws the opportunity to spread illegal information. Therefore, this paper put forward a method based on an acyclic packet routing mechanism to achieve anonymous communication in P2P systems, protecting communication privacy and confidentiality by splitting network addresses, protecting group members, and unified management by a group administrator. Simulation results show that the routing strategy is obviously improved by the mechanism. The mechanism can guarantee communication efficiency while improving the anonymity of users, and the P2P network can be protected effectively in real time.
Provable Secure Identity-based Threshold Proxy Signcryption Scheme
Computer Science. 2012, 39 (4): 101-105. 
Abstract PDF(419KB) ( 348 )   
RelatedCitation | Metrics
A proxy signature allows a designated person, called a proxy signer, to sign on behalf of an original signer. It has the properties of authenticity and non-repudiation of messages, but not confidentiality. Signcryption is a technique that encrypts and signs data together in a single operation, while retaining the merits of both. This paper presented an efficient identity-based threshold proxy signcryption scheme in the standard model and gave its security analysis in terms of the bilinear pairing technique. Finally, we proved its semantic security under the hardness of the Decisional Bilinear Diffie-Hellman problem and its unforgeability under the hardness of the Computational Diffie-Hellman problem.
Research Based on a Method of FRS-FCM Ensemble Intrusion Detection
Computer Science. 2012, 39 (4): 106-109. 
Abstract PDF(306KB) ( 405 )   
RelatedCitation | Metrics
The traditional FCM algorithm is too dependent on its initialization, and Euclidean distance is only suited to numeric datasets whose spatial structure is hyper-spherical. Based on fuzzy rough sets and ReliefF technology, the authors proposed a fuzzy-rough-set-based clustering algorithm (FRS-FCM) and used it for ensemble intrusion detection. Through effective clustering and ensemble learning, the algorithm can improve the detection rate, reduce the false detection rate, and improve the detection rate of low-frequency attacks. Finally, simulation experiments using the KDD Cup 99 data set verify the feasibility and effectiveness of the algorithm.
Congestion Control and Saturation Condition in Multi-hop Ad hoc Based on Quasi-birth-and-death Model
Computer Science. 2012, 39 (4): 110-113. 
Abstract PDF(339KB) ( 363 )   
RelatedCitation | Metrics
Congestion control is an important factor in the performance of IEEE 802.11 in Ad hoc wireless networks. When congestion occurs, the packet rate is controlled, so determining when congestion arises is a primary task. This paper studied multi-hop Ad hoc networks by modeling the packet sending process of an individual node. Considering the maximum packet retry count and the packet buffer queue, and on the basis of the 802.11 DCF basic access mechanism, the paper proposed an infinite-state quasi-birth-and-death model. By solving the model, how to prevent network saturation under a flooding mechanism was studied, and a corresponding formula involving the packet arrival rate at the MAC layer was obtained. The impact of the packet arrival rate on stability was pointed out. This is fundamentally different from earlier work, which usually focuses on the DCF mechanism itself, whereas we proposed, from the network-layer view, a new model that accurately describes the actual packet transmission process and formalizes the mathematical relationship between stability and packet arrival rate, providing a mathematical tool for congestion control research.
Research on Algorithm of Network Topology Discovery Based on Wireless Sensor in Internet of Things
Computer Science. 2012, 39 (4): 118-122. 
Abstract PDF(470KB) ( 442 )   
RelatedCitation | Metrics
Since hardware limitations and network dynamics limit the bandwidth in the wireless Internet of Things (IoT), it is difficult to obtain network topology information accurately and in a timely manner. Comprehensive, accurate and fast topology discovery is important for network control, congestion control and fault location. In this paper, an improved fuzzy algorithm and a strategy based on a multi-mobile-agent cooperative system were proposed. Our model covers the processing of topology information, agent activity degree and migration, etc. We presented the best ratio of agents to nodes and the IoT network topology algorithm. As expected, the model is efficient and can save resources.
Combination Model-based Self-similarity Traffic Prediction
Computer Science. 2012, 39 (4): 123-126. 
Abstract PDF(350KB) ( 360 )   
RelatedCitation | Metrics
In view of the mode mixing caused by EMD (Empirical Mode Decomposition), this paper proposed a self-similarity traffic prediction method based on combined models. Through the EEMD (Ensemble Empirical Mode Decomposition) process, the long-term dependence existing in network traffic was removed effectively. Additionally, according to the different characteristics of each IMF (Intrinsic Mode Function) produced by EEMD, ANN (Artificial Neural Network) and ARMA (Auto Regressive Moving Average) models were adopted for different IMFs. The simulation results demonstrate that the proposed method can effectively predict the traffic with high precision.
Research on Bandwidth Allocation Mechanism of IB QoS
Computer Science. 2012, 39 (4): 127-131. 
Abstract PDF(421KB) ( 1241 )   
RelatedCitation | Metrics
The quality-of-service mechanism supported by the InfiniBand network is an important technique to effectively separate and control the bandwidth allocation among multiple data flows on the network, which provides a guaranteed transportation service to applications running concurrently in HPC systems or datacenters interconnected by IB networks. But the InfiniBand Architecture Specification does not specify in detail how IB device manufacturers should implement IB QoS, especially how the three key components (High-Priority, Low-Priority and Limit of High-Priority) numerically control the bandwidth allocation between Virtual Lanes (VLs) in different priority-ranked queues and with different priority values, and no documents have given a detailed exposition on it so far. In this paper, a quantitative formula to compute the ratio of bandwidth between high- and low-priority VLs according to the three key components was presented. It was derived from abundant results of bandwidth allocation experiments on the most popular IB devices and validated by random sample verification. We also found that the bandwidth ratio changes in a zigzag shape with the increase of the Low-Priority value and periodically reaches a stable value when both High-Priority and Limit of High-Priority are fixed. These characteristics were verified by mathematical reasoning.
Parallel Transmission Model Based on P2P
Computer Science. 2012, 39 (4): 132-134. 
Abstract PDF(359KB) ( 368 )   
RelatedCitation | Metrics
File sharing is one of the important applications of P2P, and fast transmission over P2P is an important technology that guarantees user satisfaction. Based on previous studies, this paper put forward a data fragmentation and fair storage strategy, and then gave a parallel transmission algorithm. This storage strategy has a much greater space advantage than full-copy replication, and the algorithm not only adapts to the dynamic changes of the network but also increases the data transmission rate. The experimental results indicate that it accords with the theory and has stronger suitability and practical application value than previous models.
States Space Reduction Algorithm for Timed Behavior Protocol
Computer Science. 2012, 39 (4): 135-138. 
Abstract PDF(313KB) ( 350 )   
RelatedCitation | Metrics
Relieving the state space explosion problem and improving verification efficiency and practicality through state reduction is of great significance. This paper analyzed the composition pattern of real-time components. Composition based on timed behavior protocols and the state space explosion problem were discussed, the state space reduction algorithm for timed behavior protocols was given, and a sample was presented finally.
Multiple Paths Test Case Generation Based on Complex System Genetic Algorithm
Computer Science. 2012, 39 (4): 139-141. 
Abstract PDF(359KB) ( 542 )   
RelatedCitation | Metrics
In light of the lack of effective methods for generating test cases for multiple paths coverage in complex systems, we proposed a novel evolutionary approach for generating test cases for multiple paths coverage. First, the genetic algorithm was improved: parent evolutionary selection, individual evolution and the adaptive update method of populations were improved in the evolution of the population, which can effectively overcome slow early convergence or prematurity. Then, according to the features and requirements of multiple paths coverage, a fitness function based on path matching was designed so that, in one run, multiple test data covering multiple target paths can be generated. Finally, the proposed approach was applied to several benchmark programs. The experimental results show that the proposed approach can improve the efficiency of test data generation effectively.
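A compact sketch of the path-coverage idea follows: a tiny program under test, a fitness function that scores an input by the longest matched prefix between its executed path and any uncovered target path, and a simple evolutionary loop. The program, encoding and GA settings are all assumptions made for illustration, not the paper's improved operators.

```python
# Minimal GA sketch for multi-path coverage with a path-match fitness;
# everything here is an illustrative assumption, not the paper's method.
import random

def program_path(a, b):                     # tiny program under test
    path = ["A" if a > 0 else "a"]
    path.append("B" if b > a else "b")
    path.append("C" if (a + b) % 2 == 0 else "c")
    return tuple(path)

targets = {("A", "B", "C"), ("a", "b", "c"), ("A", "b", "C")}

def fitness(ind, uncovered):
    path = program_path(*ind)
    best = 0
    for t in uncovered:                     # path-match based fitness
        match = 0
        for p, q in zip(path, t):
            if p != q:
                break
            match += 1
        best = max(best, match)
    return best

random.seed(0)
pop = [(random.randint(-50, 50), random.randint(-50, 50)) for _ in range(30)]
covered, tests = set(), {}
for gen in range(100):
    uncovered = targets - covered
    if not uncovered:
        break
    for ind in pop:                         # record any newly covered paths
        p = program_path(*ind)
        if p in uncovered:
            covered.add(p)
            tests[p] = ind
    scored = sorted(pop, key=lambda i: fitness(i, uncovered), reverse=True)
    parents = scored[:10]
    pop = [(random.choice(parents)[0] + random.randint(-5, 5),
            random.choice(parents)[1] + random.randint(-5, 5))
           for _ in range(30)]              # crossover-like mixing + mutation
print("covered paths and their test inputs:", tests)
```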
Web Service Composition Model Based on Generalized Stochastic Colored Petri Nets
Computer Science. 2012, 39 (4): 142-144. 
Abstract PDF(328KB) ( 366 )   
RelatedCitation | Metrics
A generalized stochastic colored Petri net (GSCPN) and a Web service model based on GSCPN were proposed. QoS and data representations were realized in this model. Web service composition algorithms based on GSCPNs were proposed. A hierarchical method for GSCPN was proposed to reduce the complexity of a Web service model and partly mitigate the state explosion problem. Properties of composition operations were analyzed. Approaches for analyzing properties and QoS of the model were investigated.
Kernel Methods of Software Reliability Prediction
Computer Science. 2012, 39 (4): 145-148. 
Abstract PDF(347KB) ( 356 )   
RelatedCitation | Metrics
The prediction of future failure data from observed data sets can be transformed into a nonlinear regression problem, and kernel function methods are very efficient for solving nonlinear regression problems. A kernel-function-based generic model, adaptive to the characteristics of the given data sets, is used for software failure time prediction; it learns and recognizes the inherent temporal properties of the software failure sequence in order to capture the most current features hidden inside the software failure behavior. The experimental results based on fourteen real data sets show that the proposed model has better prediction accuracy and applicability than some conventional software reliability prediction models.
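To make the regression formulation concrete, here is a small kernel ridge regression sketch (RBF kernel, solved in closed form) that predicts the next inter-failure time from the previous ones. It is a generic stand-in for the kernel model discussed above; the data and hyper-parameters are synthetic assumptions.

```python
# Kernel ridge regression on inter-failure times (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
gaps = rng.exponential(scale=np.linspace(1.0, 5.0, 45))   # inter-failure times

window = 3                                    # predict the next gap from the last 3
X = np.array([gaps[i - window:i] for i in range(window, len(gaps))])
y = gaps[window:]
X_train, y_train, X_test, y_test = X[:-5], y[:-5], X[-5:], y[-5:]

mu, sd = X_train.mean(0), X_train.std(0) + 1e-9
X_train, X_test = (X_train - mu) / sd, (X_test - mu) / sd   # standardise features

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

lam = 1e-2                                    # ridge regularisation
alpha = np.linalg.solve(rbf(X_train, X_train) + lam * np.eye(len(X_train)), y_train)
pred = rbf(X_test, X_train) @ alpha           # dual-form prediction
print("predicted next inter-failure times:", np.round(pred, 2))
print("actual:                            ", np.round(y_test, 2))
```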
Personalized Recommendation Model Based on the Dynamic and Mixed QoS of Semantic Web Service
Computer Science. 2012, 39 (4): 149-153. 
Abstract PDF(441KB) ( 362 )   
RelatedCitation | Metrics
Service recommendation is a main problem in services computing. However, most current Web service recommender systems make recommendations based on functional properties, give little consideration to the QoS attributes of Web services, and do not support dynamic changes of QoS attributes. The personalized Web service recommender model based on semantic Web services takes advantage of dynamically mixed QoS attributes and introduces semantic Web technology into Web services. Under the supervision of a QoS monitor, it effectively monitors changes in the QoS properties of Web services and dynamically updates the QoS attributes of the Web services. According to the established user interest model, the system recommends personalized Web services to the user. In addition, based on the collaborative filtering technology widely used in personalized recommendation systems, this paper carried out a series of preprocessing and filling steps, fully considered the impact of scores given at different times on the recommendation, combined the user interest degree with the user score similarity, expressed their relative importance through different weights, comprehensively calculated the nearest neighbor set of the target user, and ultimately generated recommendations for the target user. To some extent, the system can effectively improve the efficiency and accuracy of service recommendation and meet the needs of user queries.
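A minimal sketch of the neighbor-weighting idea is given below: the weight of each candidate neighbor blends rating similarity with an assumed user-interest score before predicting a missing rating. The matrix, interest values and blending weight are invented for illustration and do not reproduce the paper's model.

```python
# Minimal user-based collaborative filtering sketch with a blended weight
# (rating similarity + assumed interest weight); all numbers are made up.
import numpy as np

ratings = np.array([                      # rows: users, cols: services, 0 = unrated
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 5, 4],
], dtype=float)
interest = np.array([0.9, 0.6, 0.8, 0.5]) # assumed user-interest weights
alpha = 0.7                               # weight of rating similarity vs interest

def cosine(u, v):
    mask = (u > 0) & (v > 0)              # only co-rated services
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] / (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

def predict(user, item, k=2):
    sims = np.array([cosine(ratings[user], ratings[v]) if v != user else -1
                     for v in range(len(ratings))])
    combined = alpha * sims + (1 - alpha) * interest          # blended weight
    neigh = [v for v in np.argsort(-combined) if ratings[v, item] > 0][:k]
    w = combined[neigh]
    return float(w @ ratings[neigh, item] / (w.sum() + 1e-9))

print("predicted rating of service 2 for user 0:", round(predict(0, 2), 2))
```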
Semantic Similarity-based Approach for Answering Imprecise Queries over Web Databases
Computer Science. 2012, 39 (4): 154-158. 
Abstract PDF(447KB) ( 431 )   
RelatedCitation | Metrics
To deal with the problem of answering imprecise queries over Web databases, this paper proposed a semantic-similarity-based imprecise query approach for Web databases. For a given query, one or several similar queries in the query history are found first, where the similarity of each similar query to the original query is greater than a given relaxation threshold. Then, the tuples matching these queries are treated as the imprecise query results for the current query. Finally, the result tuples are ranked according to their satisfaction of the original query. Experimental results demonstrate that the proposed query similarity measure is stable and reasonable, and that the proposed imprecise query method achieves higher recall and ranking accuracy as well.
Multi-relational Frequent Pattern Mining Method Relying on the ER Data Model
Computer Science. 2012, 39 (4): 159-163. 
Abstract PDF(422KB) ( 472 )   
RelatedCitation | Metrics
In order to solve statistical skew and efficiency problems, this paper presented a method based on the ER (Entity-Relationship) conceptual model. The method takes the relationship sets of an ER model as its core and applies expanded SQL statistical primitives of relational databases to produce multi-relational frequent patterns based on data and interestingness constraints provided by users, so that the number of patterns can be greatly reduced. Not only is no physical joining of relational tables needed, but also no statistical skew is produced. A comparison with related research, performed using a relational database management system and the ER model, shows the effectiveness and correctness of the method.
Three Level Privacy Protection in Social Networks
Computer Science. 2012, 39 (4): 164-167. 
Abstract PDF(461KB) ( 861 )   
RelatedCitation | Metrics
Serious concerns about privacy protection in social networks have been raised in recent years. When an adversary uses some type of background knowledge to conduct an attack, an individual's privacy may be threatened. In pioneering work, node disclosure and edge disclosure were studied; however, little effort has been devoted to sensitivity disclosure. To address this, sensitivity disclosure was formally defined and theoretically analyzed. Taking all these disclosures into account, three-level privacy protection was proposed, and the lts-graph was presented as the solution. The lts algorithm for generating the lts-graph is both theoretically correct and experimentally effective.
Research on Anonymity Technique for Personalization Privacy-preserving Data Publishing
Computer Science. 2012, 39 (4): 168-171. 
Abstract PDF(482KB) ( 438 )   
RelatedCitation | Metrics
Privacy-preserving data publishing for personalization has been a hot research topic in controlling privacy disclosure in data publishing in recent years. An overview of this area was given in this survey. First, on the basis of analyzing the types of different personalization services, relevant anonymization models for personalized privacy were built. Second, according to the different techniques adopted, the state of the art of personalized privacy-preserving techniques was summarized, and the fundamental principles and characteristics of the various techniques were described. In addition, existing privacy measurement methods and standards were presented according to the differences in the information measures of the adopted algorithms. Finally, based on a comparison and analysis of existing research, future research directions of privacy-preserving data publishing for personalization were forecast.
Formal Languages Model for Temporal Data
Computer Science. 2012, 39 (4): 172-176. 
Abstract PDF(517KB) ( 478 )   
RelatedCitation | Metrics
The data model is the main thread of trends in database technology, and the temporal data model is the core and basis of temporal database systems. This paper preliminarily discussed some basic elements of data models in accordance with the status quo of temporal data models, formalized a temporal data model, and further built a formal languages model of this temporal data model based on formal language theory and the denotational semantics method of formal semantics. Using the formal languages model, this paper defined formal semantic rules for all kinds of temporal integrity constraints and deeply analyzed the inherent temporal semantic relationships of the temporal data model, which provides an efficient and convenient formal theoretical framework for studying temporal data models.
Research on Deep Web Classification Based on Domain Feature Text
Computer Science. 2012, 39 (4): 177-180. 
Abstract PDF(358KB) ( 307 )   
RelatedCitation | Metrics
Automatic Deep Web classification is the basis of building Deep Web data integration systems. An approach was proposed to classify Deep Web sources based on domain feature text. Using ontology knowledge, concepts that express the same semantics were first extracted from different texts. Then the definition of domain correlation was given as the quantitative criterion for feature text selection, in order to avoid the subjectivity and uncertainty of manual selection. In the construction of the interface vector space model, an improved TF-IDF weighting method was proposed to evaluate the different roles of feature texts. Finally, a KNN algorithm was used to classify these interface vectors. Comparative experiments indicate that the feature text selected by our method is accurate and effective, and the new weighting method improves the classification precision significantly and shows good stability in KNN classification.
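The pipeline can be sketched as follows with standard TF-IDF weighting in place of the paper's improved weighting (not reproduced here) and a KNN classifier over the interface vector space model; the tiny corpus of interface feature texts is invented.

```python
# Sketch: TF-IDF vector space model of interface feature text + KNN classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Feature text extracted from query-interface forms (toy examples).
interfaces = [
    "title author publisher isbn edition",        # books
    "title author journal volume issue pages",    # books / papers
    "make model year mileage price transmission", # autos
    "brand model engine fuel price dealer",       # autos
    "departure arrival date passengers airline",  # flights
    "origin destination depart return class",     # flights
]
domains = ["book", "book", "auto", "auto", "flight", "flight"]

vec = TfidfVectorizer()
X = vec.fit_transform(interfaces)                 # interface vector space model
knn = KNeighborsClassifier(n_neighbors=3).fit(X, domains)

query = vec.transform(["author title price isbn"])
print("predicted domain:", knn.predict(query)[0])
```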
Joint Feature Selection Method Based on Relevance and Redundancy
Computer Science. 2012, 39 (4): 181-184. 
Abstract PDF(376KB) ( 457 )   
RelatedCitation | Metrics
Based on a comparative study of four feature selection methods, including document frequency (DF), which is unrelated to class information, and information gain (IG), mutual information (MI) and the chi-square statistic (CHI), which are related to class information, we analyzed the disadvantages of combining these two kinds of methods directly and proposed a joint feature selection method based on relevance and redundancy that combines DF with one of IG, MI and CHI. This approach aims to eliminate redundant features, find features useful for classification and consequently improve the accuracy of text sentiment classification. The experimental results show that the proposed method can not only improve performance but also reduce the feature dimension.
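A rough sketch of a joint relevance-redundancy selection is shown below: document frequency filters out very rare terms, the chi-square statistic ranks the rest, and a simple feature-feature correlation test removes redundant terms. The thresholds and the toy corpus are assumptions, not the paper's configuration.

```python
# Sketch of DF + CHI joint selection with a simple redundancy filter.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import chi2

docs = ["good great excellent plot acting",
        "great acting wonderful story",
        "terrible boring plot awful acting",
        "awful bad boring waste story"]
labels = [1, 1, 0, 0]                       # 1 = positive, 0 = negative sentiment

vec = CountVectorizer()
X = vec.fit_transform(docs).toarray()
terms = np.array(vec.get_feature_names_out())

df = (X > 0).sum(axis=0)                    # document frequency (class-independent)
chi, _ = chi2(X, labels)                    # chi-square statistic (class-dependent)

candidates = np.argsort(-chi)               # rank by relevance
selected = []
for i in candidates:
    if df[i] < 2:                           # DF filter: drop very rare terms
        continue
    # redundancy filter: skip terms highly correlated with an already chosen one
    if any(abs(np.corrcoef(X[:, i], X[:, j])[0, 1]) > 0.9 for j in selected):
        continue
    selected.append(i)
print("selected features:", terms[selected])
```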
Improved Global Optimization Algorithm Based on Incremental Support Vector Regression Model
Computer Science. 2012, 39 (4): 185-188. 
Abstract PDF(352KB) ( 740 )   
RelatedCitation | Metrics
SVR (Support Vector Regression) is a small-sample learning method with strong robustness. It can effectively avoid the 'curse of dimensionality' and has been introduced into global optimization. However, the existing global optimization algorithms based on SVR have several shortcomings, such as requiring a large number of evaluations and being unable to cope with high-dimensional optimization problems. We proposed a new improved global optimization algorithm, DISVR, based on an incremental SVR model: an incremental SVR method improves the efficiency of rebuilding the response model, a new incremental LHS method (Latin Hypercube Sampling) ensures a uniform distribution of samples, and the DIRECT search algorithm enhances the stability and efficiency of the global search. Finally, results on test functions suggest that the proposed method both reduces the time complexity and effectively reduces the number of evaluations of the source model.
Grammar Formal Method of Battle Management Language Based on Improved-BNF
Computer Science. 2012, 39 (4): 189-192. 
Abstract PDF(483KB) ( 619 )   
RelatedCitation | Metrics
Battle management language (BML) is a critical technology used to realize unambiguous communication between C2 and M&S systems, and it can help solve the interoperation problems of C2 and M&S systems. First the architecture of BML was discussed, and then different formalization methods were compared. A grammar formalization method for BML based on improved Backus-Naur form (IBNF) was proposed. Finally, the grammars of typical orders were formalized with IBNF and specific examples were given.
Improved PSO Based on Evolutionary Process Learning
Computer Science. 2012, 39 (4): 193-195. 
Abstract PDF(392KB) ( 392 )   
RelatedCitation | Metrics
Particle swarm optimization (PSO) easily falls into stagnation in the late evolutionary period because it has no complete knowledge of the characteristics of the objective function. In classic PSO, only the finite information at the previous time t, such as the velocity v(t), the position x(t), the individual extremum of each particle and the global extremum Pg of the swarm, is employed to drive the evolutionary process. But during the evolution of PSO, the distribution characteristics of the solutions of the objective function are hidden in the many function evaluations accumulated as the evolution iterates. The novel PSO based on evolutionary learning (L-PSO) balances the exploration and exploitation processes and controls the re-initialization and crossover selection of particles through the distribution characteristics of solutions extracted statistically from the historical evaluations. The experimental results show that L-PSO can improve the solution precision and reduce the expected number of iterations, although the time and space complexity increases slightly.
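For reference, the baseline update that L-PSO builds on is sketched below: the classic velocity/position rule with inertia, cognitive and social terms on the Rastrigin test function. It illustrates the standard PSO that can stagnate, not the paper's learning mechanism; the coefficients and test function are conventional choices.

```python
# Baseline PSO sketch (standard velocity/position update), illustration only.
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    return 10 * x.shape[-1] + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x), axis=-1)

dim, n_particles, iters = 5, 30, 200
w, c1, c2 = 0.72, 1.49, 1.49                  # inertia, cognitive, social coefficients

x = rng.uniform(-5.12, 5.12, (n_particles, dim))    # positions
v = np.zeros_like(x)                                # velocities
pbest, pbest_val = x.copy(), rastrigin(x)           # individual extrema
gbest = pbest[pbest_val.argmin()].copy()            # global extremum

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, -5.12, 5.12)
    val = rastrigin(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best value found:", round(float(pbest_val.min()), 3))
```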
Combining the Hough Transform and an Improved Least Squares Method for Line Detection
Computer Science. 2012, 39 (4): 196-200. 
Abstract PDF(459KB) ( 707 )   
RelatedCitation | Metrics
A novel line detection method combining the Hough transform and an improved least squares method was proposed. The advantages and drawbacks of the Hough transform and the least squares method in terms of line detection and detection accuracy were analyzed. Robust and heuristics-free coarse detection of lines is realized by the Hough transform to determine the image regions where a line may exist. Least squares regression is then applied to the feature points in these regions to obtain accurate line parameters. To overcome the sensitivity of the conventional least squares method to outliers, a least squares with dual removal algorithm was proposed, which in each iteration deletes the pair of data points with the maximum positive and negative fitting errors, ensuring that normal points are retained in the data set and thus guaranteeing the accuracy of the linear regression. Experimental results show that the novel method gives a higher detection rate and more accurate line parameters than the Hough transform. Besides, a lower Hough space resolution can be adopted without much impact on the detection results, thus reducing the required memory.
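The dual-removal refinement step can be sketched on its own, assuming the coarse Hough stage has already isolated the candidate points: fit a line by least squares, and in each iteration delete the pair of points with the largest positive and largest negative residuals until the residuals are small. The data, outlier injection and stopping threshold below are illustrative assumptions.

```python
# Sketch of the dual-removal least-squares refinement (coarse Hough detection
# is assumed to have already selected these points).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 60)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, x.size)   # true line y = 2x + 1
y[::12] += rng.uniform(3, 6, y[::12].size)       # inject a few outliers

pts = np.column_stack([x, y])
for _ in range(10):                              # iterative refinement
    A = np.column_stack([pts[:, 0], np.ones(len(pts))])
    (k, b), *_ = np.linalg.lstsq(A, pts[:, 1], rcond=None)
    r = pts[:, 1] - (k * pts[:, 0] + b)          # signed fitting errors
    if r.max() < 0.3 and r.min() > -0.3:         # residuals small enough: stop
        break
    # remove the pair with maximum positive and maximum negative error
    pts = np.delete(pts, [int(r.argmax()), int(r.argmin())], axis=0)

print(f"refined line: y = {k:.3f} x + {b:.3f}  ({len(pts)} points kept)")
```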
Attribute Reduction Algorithm Based on Object Matrix in Incomplete Decision Table
Computer Science. 2012, 39 (4): 201-204. 
Abstract PDF(339KB) ( 344 )   
RelatedCitation | Metrics
Attribute reduction based on the discernibility matrix is the most common method in rough set attribute reduction. In contrast to the discernibility matrix, which stores condition attributes, an object matrix was defined in this paper. The object matrix uses the relationship between the decision values of the objects in a tolerance class and the condition attributes, and what it stores are object sets. Then the definition of attribute reduction based on the object matrix was given. It is proved that the attribute reduction acquired by this new method is equivalent to that based on the positive region in incomplete information systems. A heuristic algorithm for attribute reduction was presented and its time and space complexity were analyzed. An example was used to illustrate the feasibility of the new algorithm.
Twin Distance of Minimum and Maximum Support Vector Machine
Computer Science. 2012, 39 (4): 205-209. 
Abstract PDF(425KB) ( 439 )   
RelatedCitation | Metrics
GEPSVM is a binary SVM proposed in recent years. It learns two optimal hyperplanes by solving a generalized eigenvalue problem and determines the category of a pattern based on the distances between the test sample and the two hyperplanes. Compared with traditional SVM, GEPSVM can reduce the time complexity, but it still leaves singularity issues unsolved. This paper introduced a new algorithm, TDMSVM (Twin Distance of Minimum and Maximum Support Vector Machine), which seeks two optimal hyperplanes by solving a standard eigenvalue problem and requires the average distance between samples of the same class and their hyperplane to be minimal. Compared with GEPSVM, TDMSVM has the following advantages: it further reduces the time complexity, and it does not require the introduction of regularization terms, which improves generalization performance and avoids singularity.
Optimal Particle Filter Object Tracking Algorithm Based on Features Fusion and Clustering Kernel Function Smooth Sampling
Computer Science. 2012, 39 (4): 210-213. 
Abstract PDF(385KB) ( 345 )   
RelatedCitation | Metrics
An improved particle filter object tracking algorithm was proposed to solve object tracking problems in complex scenes. This paper uses a joint histogram to describe the target's grayscale and gradient direction features and designs a self-adaptive feature fusion observation model to adapt to the changing scene. To solve the particle degeneracy problem of the basic particle filter, a resampling method based on clustering kernel function smoothing was proposed. The experimental results based on simulations and actual scenes show that this algorithm is more adaptable, possesses higher accuracy, and can track moving objects in complex scenes effectively.
Improved Ant Colony Optimization Approach for the Team Orienteering Problem with Time Windows
Computer Science. 2012, 39 (4): 214-216. 
Abstract PDF(244KB) ( 685 )   
RelatedCitation | Metrics
The team orienteering problem with time windows is one of the most important logistic distribution routing optimization problems. The optimization objective is to plan optimal feasible routes that maximize the total reward while serving a set of customers within their given time windows. An improved ant colony optimization approach was developed to deal with this problem. In order to construct better solutions faster, the proposed algorithm uses a fast method to determine a dynamic candidate list. Moreover, it adopts a sequence method and a greedy method to construct solutions. Compared with iterated local search, the proposed algorithm can obtain better results within 12 seconds.
Incremental Hessian LLE by Preserving Local Adjacent Information between Data Points
Computer Science. 2012, 39 (4): 217-219. 
Abstract PDF(378KB) ( 578 )   
RelatedCitation | Metrics
The Hessian LLE algorithm is a classical manifold learning algorithm. However, Hessian LLE is a batch-mode algorithm: whenever new samples are observed, the whole algorithm must be run again and all former computational results are discarded. Therefore, an incremental Hessian LLE (LIHLLE) algorithm was proposed, which preserves the local neighborhood relationships between the original space and the embedding space. New sample points are linearly reconstructed from the existing embedding results of their local neighborhood samples. The proposed method can learn the manifold incrementally. Simulation results on the Swiss roll with hole and the frey_rawface database testify to the efficiency and accuracy of the proposed algorithm.
Synchronization between Different Hyperchaotic Systems and Dimensions and its Application in Secret Communication
Computer Science. 2012, 39 (4): 220-222. 
Abstract PDF(221KB) ( 449 )   
RelatedCitation | Metrics
A novel four-dimensional hyperchaotic system was constructed, and the basic characteristics of this nonlinear dynamical system were analyzed in detail. An active control synchronization method was used to design an appropriate nonlinear feedback controller, and the synchronization system was applied to chaotic secure communication, implementing synchronization between the four-dimensional hyperchaotic system and the three-dimensional Lorenz system, which have different dimensions and different structures. Numerical simulation results show that this scheme is feasible and applicable.
Relative Decision Entropy Based Decision Tree Algorithm and its Application in Intrusion Detection
Computer Science. 2012, 39 (4): 223-226. 
Abstract PDF(341KB) ( 491 )   
RelatedCitation | Metrics
To overcome the disadvantages of traditional decision tree algorithms, this paper proposed a relative-decision-entropy-based decision tree algorithm, DTRDE. First, we introduced the information entropy proposed by Shannon into rough set theory, defined the concept of relative decision entropy, and used it to measure the significance of attributes. Second, in algorithm DTRDE, we adopted the relative-decision-entropy-based attribute significance together with the attribute dependency in rough sets to select splitting attributes, and used rough set attribute reduction technology to delete redundant attributes, aiming to reduce the computational complexity of the algorithm. Finally, we applied the proposed algorithm to network intrusion detection. The experiments on the KDD Cup 99 dataset demonstrate that the DTRDE algorithm has a higher detection rate than traditional information-entropy-based algorithms, and its computational expense is similar to that of the traditional methods.
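As a simplified illustration of entropy-based attribute selection, the sketch below computes an information-gain style significance on a toy connection table; ordinary conditional entropy is used as a stand-in for the paper's relative decision entropy, which is not reproduced here, and the data are invented.

```python
# Entropy-based attribute significance on a toy table (illustration only).
import numpy as np
from collections import Counter

# columns: protocol, flag, decision (normal / attack)
rows = [("tcp", "SF", "normal"), ("tcp", "S0", "attack"), ("udp", "SF", "normal"),
        ("tcp", "S0", "attack"), ("udp", "SF", "normal"), ("icmp", "S0", "attack"),
        ("tcp", "SF", "normal"), ("icmp", "SF", "attack")]

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def significance(attr_idx):
    """Information gain of an attribute = H(D) - H(D | attribute)."""
    decisions = [r[-1] for r in rows]
    h_d = entropy(decisions)
    h_cond = 0.0
    for value in set(r[attr_idx] for r in rows):
        subset = [r[-1] for r in rows if r[attr_idx] == value]
        h_cond += len(subset) / len(rows) * entropy(subset)
    return h_d - h_cond

for i, name in enumerate(["protocol", "flag"]):
    print(f"{name}: significance = {significance(i):.3f}")
```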
Ant Colony Algorithm Combined with Survey Propagation for Satisfiability Problem
Computer Science. 2012, 39 (4): 227-231. 
Abstract PDF(433KB) ( 495 )   
RelatedCitation | Metrics
The satisfiability problem is a basic problem in logic, and it is NP-hard. Survey propagation (SP) is a very effective algorithm for this problem. However, SP tends not to converge in the hard region, or gives wrong assignments to the variables. An algorithm combining SP with ant colony optimization (ACO) was proposed. The messages calculated in SP are used in ACO as guidance to help ACO find a solution, and local search is conducted in the new algorithm. The new algorithm can quickly find solutions for some instances on which SP fails.
Fast Parallel Molecular Algorithm for Solving the Discrete Logarithm Problem over Group Z Based on DNA Computing
Computer Science. 2012, 39 (4): 232-235. 
Abstract PDF(404KB) ( 348 )   
RelatedCitation | Metrics
The discrete logarithm problem over a group is widely applied in public-key cryptosystems. We proposed a new DNA computing algorithm to solve this problem. The new algorithm consists of an initial solution generator, a parallel multiplier, a parallel invalid-solution detector, a parallel converter and a parallel solution searcher. To reduce the number of DNA sequences required by the new algorithm, the solution space is generated using the three-list algorithm. The proposed algorithm needs O(k²) biological operations, O(1) test tubes and O(2^k) DNA sequences, and the maximum length of a DNA sequence is O(k²). Finally, common test methods were used to verify the new algorithm's feasibility and effectiveness.
Semi-supervised Clustering Algorithm Based on Graph Contraction
Computer Science. 2012, 39 (4): 236-239. 
Abstract PDF(322KB) ( 475 )   
RelatedCitation | Metrics
In order to obtain good clustering performance on data sets with a small number of labeled samples, a semi-supervised clustering algorithm based on graph contraction was proposed in this paper. At first, the whole data set in the sample space is represented as an edge-weighted graph. Then the graph is modified by contraction according to must-link constraints and graph theory. On this basis, we project the sample space into a subspace by combining the graph Laplacian with cannot-link constraints. Data clustering is conducted over the modified graph. Experimental results show that the method indeed reaches its goal for complex datasets, and it performs acceptably when only a small number of pairwise constraints are available.
Heterogeneous Network Communication Framework in Smart Space Environment
Computer Science. 2012, 39 (4): 240-245. 
Abstract PDF(806KB) ( 364 )   
RelatedCitation | Metrics
A heterogeneous network communication framework that meets the general communication needs in smart spaces was proposed and described in detail. In addition to dynamic adaptability, the framework also has other characteristics such as loose coupling, robustness, versatility and flexibility. A series of concurrency tests in a real smart home environment (LookeyHome) indicates that even when communicating in the gateway wsDualHttpBinding transmit mode, which was shown to have the minimum throughput, the framework can withstand concurrent access at least about 70 times per second. This performance is good enough for most smart space communication scenarios.
P-sets and its Dynamic Equivalence Classes Characteristics
Computer Science. 2012, 39 (4): 246-249. 
Abstract PDF(292KB) ( 409 )   
RelatedCitation | Metrics
By embedding dynamic characteristics into the finite Cantor set and improving it, P-sets were proposed. P-sets have dynamic characteristics. A P-set is a pair of sets composed of the internal P-set X^F̄ (internal packet set X^F̄) and the outer P-set X^F (outer packet set X^F). By using P-sets, the concepts of internal P-equivalence classes, outer P-equivalence classes and P-equivalence classes were presented. Theorems such as the P-equivalence class reduction theorem, the internal point theorem of the internal P-equivalence class discrete interval, the external point theorem of the outer P-equivalence class discrete interval, the subinterval theorem of the internal P-equivalence class discrete interval, and the P-equivalence class identification criterion were obtained. Using these results, applications of P-equivalence classes in the searching and identification of unknown information were given. The paper shows that there are overlapping and osmosis spaces between P-sets and Cantor sets, and some new results are hidden in them.
Affective Recognition from Pulse Signal Using Correlation Analysis and Max-Min Ant Colony Algorithm
Computer Science. 2012, 39 (4): 250-253. 
Abstract PDF(460KB) ( 379 )   
RelatedCitation | Metrics
For affective recognition from pulse signals, a new approach combining correlation analysis and a max-min ant colony algorithm was presented, which finds an effective feature subset that can identify an affective recognition model with better performance. Firstly, sequential backward selection (SBS) was used to sort the original features. Secondly, the linear correlation coefficient was used to calculate the correlation between features, and features with greater correlation were removed according to the sorting result. Finally, the max-min ant colony algorithm realized feature selection by searching for an optimal subset based on the compact feature subset, and was combined with a Fisher classifier to classify six emotions: happiness, surprise, disgust, grief, anger and fear. The experiments show that the proposed approach can find a more stable and effective feature subset from the original feature set and establish an effective affective recognition model.
Cardiac MRI Registration Algorithm Based on Radon Transform and Power Spectrum
Computer Science. 2012, 39 (4): 254-257. 
Abstract PDF(316KB) ( 425 )   
RelatedCitation | Metrics
An image registration algorithm based on the Radon transform and the power spectrum was presented for cardiac magnetic resonance (MR) serial section images. In the first step, a morphological edge detection method is used as MR image preprocessing to extract the edge features of the MR images, which serve as the inputs of the registration operation. Then the registration parameters of scale, rotation and translation are obtained by the registration method based on the Radon transform and the power spectrum. Finally, the registration result is obtained from the registration parameters. The algorithm solves the problem that the registration result is sensitive to the spatial noise of MR images, improves the accuracy of registration and reduces the computational cost. With this algorithm, a hundred MR images were used in registration tests. The results show that the algorithm achieves good performance.
Image Filtering Algorithm Based on TIP Model
Computer Science. 2012, 39 (4): 258-260. 
Abstract PDF(352KB) ( 500 )   
RelatedCitation | Metrics
Traditional image filtering algorithms tend to lose image details when filtering noise. This article put forward an image filtering algorithm based on the tangent image process (TIP) model. The algorithm filters the image using the tangent and arctangent functions according to the 3×3 neighborhood pixel values of the pixel to be processed. It is simple and easy to implement. Details of image edges and corners can be preserved and enhanced while noise is restrained effectively. The experiments show that this algorithm is obviously better than other image filtering algorithms.
New Gait Recognition Method Based on Static and Dynamic Features
Computer Science. 2012, 39 (4): 261-264. 
Abstract PDF(315KB) ( 413 )   
RelatedCitation | Metrics
Recently, gait recognition for individual identification has received increased attention from biometrics researchers. The Gait Energy Image (GEI) is an efficient representation method. We divided the GEI into three parts: the body-related GEI (BGEI), the gait-related GEI (GGEI), and the body-gait-related GEI (BGGEI). The Fourier descriptor was used to describe the BGEI and BGGEI, and the Gabor wavelet was applied to the GGEI to obtain the magnitude feature. Their recognition abilities were studied respectively, and the gait-related and body-gait-related parts were fused at the rank level and score level for gait recognition. The algorithm was tested on the CASIA datasets and achieved high correct recognition rates.
Research on the Algorithm of Micro-movement Target Tracking and Extraction in Video by Multi-thread Technology
Computer Science. 2012, 39 (4): 265-268. 
Abstract PDF(366KB) ( 453 )   
RelatedCitation | Metrics
We extracted micro-movement targets from real-time video images under multi-thread conditions. Based on the multi-thread idea, skin colour detection and clustering methods were used to process frames and to resolve the access-violation condition between threads. Edge detection was used to locate the range of the moving object; clustering and skin colour detection were then used to extract the object template and complete it, and, according to the object template and the corresponding pixel coordinates in the original image, the original pixel colours were mapped onto the corresponding positions of a new background model. The simulation results show that the proposed method meets the quality requirements of real-time processing and has a certain robustness, so it satisfies the needs of the project.
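For the skin colour detection step, a commonly used heuristic is thresholding in the YCrCb space; the Cr/Cb ranges below are conventional values, not ones taken from the paper, and the per-frame function could be dispatched to worker threads as the text describes.

    import cv2
    import numpy as np

    def skin_mask(frame_bgr):
        """Rough skin-colour detection in YCrCb; returns a binary mask (255 = skin-like)."""
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        lower = np.array([0, 133, 77], dtype=np.uint8)     # heuristic Y, Cr, Cb lower bounds
        upper = np.array([255, 173, 127], dtype=np.uint8)  # heuristic Y, Cr, Cb upper bounds
        return cv2.inRange(ycrcb, lower, upper)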
Research on Geometric Constraint Solving in Family of Objects Feature Model
Computer Science. 2012, 39 (4): 269-274. 
Abstract PDF(531KB) ( 354 )   
RelatedCitation | Metrics
A novel method for solving the geometric constraints of family-of-objects models was proposed, and two new types of clusters were presented, namely the scalable subset and the radial subset. A small collection of rewriting rules was applied exhaustively to rigid and non-rigid cluster systems until no rewriting rule is applicable, and the final cluster collection represents the solution strategy of the constraint system. An incremental algorithm and a solution selection strategy for the solving method were proposed and implemented. With this method, solutions of the constraint problem are found efficiently and the number of solutions is reduced.
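The flavour of exhaustive rule rewriting can be illustrated with one classic rule for rigid clusters (merge two clusters that share at least two geometric elements), iterated to a fixpoint; this is a conceptual sketch, not the paper's rule set for scalable or radial subsets.

    def merge_rigid_clusters(clusters):
        """Apply the 'shared >= 2 elements => merge' rule until no rule is applicable.

        clusters : iterable of sets/frozensets of geometric element identifiers
        """
        clusters = [set(c) for c in clusters]
        changed = True
        while changed:                       # fixpoint loop: stop when no rewrite applies
            changed = False
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    if len(clusters[i] & clusters[j]) >= 2:
                        clusters[i] |= clusters[j]   # the merged set is again rigid
                        del clusters[j]
                        changed = True
                        break
                if changed:
                    break
        return [frozenset(c) for c in clusters]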
Application of Segmentation Based on Optical Flow in Gait Recognition
Computer Science. 2012, 39 (4): 275-277. 
Abstract PDF(326KB) ( 333 )   
RelatedCitation | Metrics
The quality of human silhouettes has a direct effect on gait recognition performance. This paper proposed a robust gait representation scheme to suppress the influence of silhouette incompleteness. By dividing the human body area in a video sequence into several sub-areas and representing each sub-area by an ellipse whose parameters can be calculated from the corresponding motion information extracted from the optical flow field, a new body structure model called the multi-linked ellipse model was established. In the recognition stage, the model parameters are used to perform gait recognition based on dynamic time warping. Experimental results demonstrate the improved performance of the method.
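A standard dynamic time warping distance, as would be applied to the per-frame ellipse-parameter vectors in the recognition stage; this is the textbook formulation, shown only to make the matching step concrete.

    import numpy as np

    def dtw_distance(seq_a, seq_b):
        """DTW distance between two sequences of feature vectors (one vector per frame)."""
        n, m = len(seq_a), len(seq_b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(np.asarray(seq_a[i - 1]) - np.asarray(seq_b[j - 1]))
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]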
Study of Explosion Simulation Based on Particle System
Computer Science. 2012, 39 (4): 278-281. 
Abstract PDF(321KB) ( 757 )   
RelatedCitation | Metrics
To address the computational complexity of explosion simulation models in the military and railway accident fields, this paper proposed an improved simulation model based on a particle system, which replaces the fluid model with a qualitative and quantitative random-variation model. The simulation model was established by combining particle emission speed parameters with random perturbation speed parameters; it can effectively reduce the computational complexity of explosion simulation by assigning different parameters to form specific simulation models for smoke, fire and debris. Experiments show that this method offers good running speed and realistic simulation effects, and can meet the needs of various explosion simulation scenarios.
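A minimal per-step particle update in the spirit of the description above: an emission velocity perturbed by random noise, simple gravity, and life decay; gravity and jitter values are illustrative, and different parameter sets would stand in for smoke, fire and debris.

    import numpy as np

    def step_particles(pos, vel, life, dt, gravity=(0.0, -9.8, 0.0), jitter=0.5):
        """Advance an explosion particle system by one time step (a sketch)."""
        rng = np.random.default_rng()
        vel = vel + np.asarray(gravity) * dt + rng.normal(0.0, jitter, vel.shape) * dt
        pos = pos + vel * dt
        life = life - dt
        alive = life > 0.0                  # expired particles would be re-emitted by the emitter
        return pos[alive], vel[alive], life[alive]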
Design and Implementation for PLASMA Auto-tuning and Performance Optimizing
Computer Science. 2012, 39 (4): 282-286. 
Abstract PDF(468KB) ( 535 )   
RelatedCitation | Metrics
PLASMA is a high-performance linear algebra package. Its innovative approach, including block data layout with tiling, fine-grained parallelism and an out-of-order execution mechanism, greatly improves program performance. However, there are still some problems; for example, the block size has a significant impact on performance, and the mechanism introduces extra data copying. In this paper, by comparing the traditional LAPACK with PLASMA's mechanism, we analyzed the advantages and disadvantages of PLASMA and proposed two methods to address the disadvantages. For the PLASMA architecture, we introduced the concept of the marginal matrix, analyzed its impact on performance through extensive testing, and then proposed an auto-tuning method. In addition, we found a way to further improve the performance of PLASMA by overlapping data transfer with computation. Finally, we verified the effect of the optimization through a large number of tests.
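Empirical auto-tuning of the block size can be as simple as benchmarking a set of candidate tile sizes and keeping the fastest; the sketch below assumes a caller-supplied tiled routine kernel(A, B, block) and is not PLASMA's own tuning code.

    import time
    import numpy as np

    def autotune_block_size(kernel, matrix_size=2048, candidates=(64, 96, 128, 192, 256)):
        """Pick the tile size that runs fastest on the current machine.

        kernel : callable kernel(A, B, block) implementing the tiled operation (assumed)
        """
        A = np.random.rand(matrix_size, matrix_size)
        B = np.random.rand(matrix_size, matrix_size)
        timings = {}
        for block in candidates:
            start = time.perf_counter()
            kernel(A, B, block)
            timings[block] = time.perf_counter() - start
        best = min(timings, key=timings.get)
        return best, timings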
Efficient I/O Scheduler over Multi-bank Flash Memory File Systems
Computer Science. 2012, 39 (4): 287-292. 
Abstract PDF(493KB) ( 436 )   
RelatedCitation | Metrics
Flash memory has been widely used in storage systems because of its non-volatile nature, small size, shock resistance and fast access speed. The traditional scheduler for flash memory storage systems is the NOOP scheduler, which leaves much room for improving I/O performance, especially in multi-bank flash memory storage systems where several banks can operate simultaneously. We proposed a new scheduler called the Multi-Bank flash memory Scheduler (MBS), based on the native file system YAFFS, to exploit the parallelism of multiple banks while taking advantage of the high read speed. A flexible bank assignment policy was proposed to assign proper banks to write requests according to the attributes (hot or cold) of the requests, which are identified by an AVL-tree-based mechanism. The MBS scheduler reorders read and write requests and gives higher priority to reads. The experimental results show that I/O throughput and average response time are improved significantly compared with the NOOP scheduler, and that an even erase count and capacity utilization are obtained across the different banks of the multi-bank storage system.
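A toy model of the two ideas in the abstract (read-over-write priority and hot/cold bank assignment for writes); the data structures and steering rule are illustrative assumptions, not the YAFFS-based implementation.

    import heapq

    class MultiBankScheduler:
        """Read-priority scheduler with a simple hot/cold bank assignment policy."""
        READ, WRITE = 0, 1

        def __init__(self, n_banks):
            self.n_banks = n_banks
            self.half = max(1, n_banks // 2)   # hot pages -> lower banks, cold -> upper banks
            self.queue = []                    # (priority, arrival_order, request)
            self.order = 0

        def submit(self, op, page, hot=False, bank=None):
            priority = 0 if op == self.READ else 1            # reads are served first
            if op == self.WRITE and bank is None:             # flexible bank assignment for writes
                bank = page % self.half if hot else self.n_banks - 1 - page % self.half
            heapq.heappush(self.queue, (priority, self.order, (op, page, bank)))
            self.order += 1

        def next_request(self):
            """Return the next (op, page, bank) request, or None if the queue is empty."""
            return heapq.heappop(self.queue)[2] if self.queue else None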
Timing Analysis and Design for DDR3 System
Computer Science. 2012, 39 (4): 293-295. 
Abstract PDF(341KB) ( 1115 )   
RelatedCitation | Metrics
DDR3 memory has become the mainstream in current server and computer systems. Although many techniques, such as dual reference voltage, dynamic on-die termination (ODT), fly-by topology and write leveling, have been adopted by DDR3 to improve signal integrity to a certain extent, it is still difficult to design and realize high data rates. Combined with the design of a processor and its corresponding server board, this paper first briefly introduced the basic theory of DDR3 source-synchronous signaling, then quantitatively analyzed the key factors affecting DDR3 system timing using time-domain simulation software, and finally calculated the timing margin for the DDR3 write operation. Simulation results show that the duty cycle of the signal degrades by 3%~5% with changes in the ODT value and in the number of simultaneously switching I/Os, and that the timing skew due to crosstalk can reach 218 ps.
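A rough, back-of-the-envelope setup-side write margin budget for a source-synchronous DDR3 interface; only the 218 ps crosstalk-induced skew comes from the text above, while the data rate and the setup/jitter numbers are hypothetical placeholders and the budget form is a simplification, not the paper's analysis.

    # Illustrative DDR3 write timing budget (all values except the 218 ps skew are hypothetical)
    data_rate_mtps = 1600
    unit_interval_ps = 1e6 / data_rate_mtps            # 625 ps per bit at DDR3-1600
    setup_ps = 50                                      # hypothetical DRAM data setup requirement (tDS)
    crosstalk_skew_ps = 218                            # worst-case skew reported above
    other_skew_jitter_ps = 100                         # hypothetical board skew + strobe jitter budget

    margin_ps = unit_interval_ps / 2 - (setup_ps + crosstalk_skew_ps / 2 + other_skew_jitter_ps / 2)
    print(f"approximate setup-side margin: {margin_ps:.0f} ps")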
Performance Analysis and Implementation of Lattice Boltzmann Methods on Heterogeneous Platform
Computer Science. 2012, 39 (4): 296-298. 
Abstract PDF(318KB) ( 406 )   
RelatedCitation | Metrics
This work investigated the implementation of the lattice Boltzmann method (LBM) with CUDA on a CPU+GPU heterogeneous platform using multiple parallel programming models, MPI+CUDA and MPI+OpenMP+CUDA, which show good speedup. A method for adjusting the computational load ratio was proposed to balance the computation time on the CPU and the GPUs, which provides insight into multi-level parallelization and efficient use of the different computational resources on the CPU+GPU heterogeneous platform.
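One simple way to adjust the CPU/GPU load split is to re-proportion the domain according to measured throughputs of the two sides; the proportional rule and damping factor below are a generic sketch under that assumption, not the paper's exact scheme.

    def rebalance_load(frac_gpu, t_cpu, t_gpu, damping=0.5):
        """Adjust the fraction of lattice sites assigned to the GPU.

        frac_gpu : current fraction of the domain computed on the GPU (0 < frac_gpu < 1)
        t_cpu    : measured time of the CPU part over the last iterations
        t_gpu    : measured time of the GPU part over the last iterations
        """
        frac_cpu = 1.0 - frac_gpu
        speed_gpu = frac_gpu / t_gpu        # work processed per unit time on the GPU side
        speed_cpu = frac_cpu / t_cpu        # work processed per unit time on the CPU side
        target = speed_gpu / (speed_gpu + speed_cpu)   # split proportional to throughput
        return frac_gpu + damping * (target - frac_gpu)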
HHSR: A Prototype of Network-on-chip with Commands and Data Transferred Separately
Computer Science. 2012, 39 (4): 299-303. 
Abstract PDF(435KB) ( 344 )   
RelatedCitation | Metrics
Based on previous research, and considering the different requirements of command transmission and data transmission in chip multiprocessors as well as the characteristics of different topologies at large and very large scales, a novel network-on-chip that transfers commands and data separately was proposed. A prototype of this network-on-chip, named HHSR, was presented. HHSR transfers commands and data over two networks-on-chip with different topologies in the same chip multiprocessor: commands are transferred over a single hierarchical ring, a network-on-chip with higher speed and better general performance, to meet the real-time requirements of command transmission, while data are transferred over a 2-D hexagonal mesh, a network-on-chip with lower speed but lower cost. The prototype guarantees and improves the transmission speed of commands, and thus the performance of the whole chip multiprocessor, at the cost of a small slowdown of data transmission and a little extra overhead.
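At a purely conceptual level, the split can be pictured as a dispatcher that steers packets by type; the sketch below assumes ring and mesh objects exposing a send(src, dst, payload) method and is only an illustration of the idea, not the hardware design.

    class SplitNoC:
        """Route command packets to the fast ring and data packets to the cheaper mesh."""

        def __init__(self, ring_network, mesh_network):
            self.ring = ring_network   # hierarchical ring: low latency, higher cost
            self.mesh = mesh_network   # 2-D hexagonal mesh: higher latency, lower cost

        def send(self, packet):
            # packet is assumed to be a dict with 'kind', 'src', 'dst' and 'payload' keys
            network = self.ring if packet["kind"] == "command" else self.mesh
            network.send(packet["src"], packet["dst"], packet["payload"])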
Research on Parallel Modern Optimization Algorithms Using GPU
Computer Science. 2012, 39 (4): 304-311. 
Abstract PDF(737KB) ( 2107 )   
RelatedCitation | Metrics
To deal with the relatively high time complexity of practical problems, parallel modern optimization algorithms based on the GPU were presented in this paper. Firstly, the CUDA parallel programming architecture and programming model were summarized at a macroscopic level. Then the parallel versions of five typical modern optimization algorithms (simulated annealing, tabu search, genetic algorithms, particle swarm optimization and artificial neural networks) using the CUDA programming model were provided. Experimental results measured in different environments indicate that the parallel methods obtain better performance on average than the CPU. Finally, the parallel optimization strategy was discussed and future directions for parallel optimization algorithms were pointed out.
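To make the data-parallel structure concrete, the following is a fully vectorized particle swarm optimization in NumPy, mirroring the one-particle-per-thread layout one would map onto CUDA; it is a standard PSO formulation, not the paper's kernels.

    import numpy as np

    def particle_swarm_minimize(f, dim, n_particles=256, iters=200,
                                w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
        """Vectorized PSO; f must accept an (n_particles, dim) array and return (n_particles,)."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), f(x)
        g = pbest[np.argmin(pbest_val)].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # inertia + cognitive + social terms
            x = np.clip(x + v, lo, hi)
            val = f(x)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], val[improved]
            g = pbest[np.argmin(pbest_val)].copy()
        return g, pbest_val.min()

    # Example: minimize the sphere function in 10 dimensions
    best_x, best_val = particle_swarm_minimize(lambda X: np.sum(X * X, axis=1), dim=10)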