Started in January, 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Editors
Current Issue
Volume 39 Issue 12, 16 November 2018
  
DISP: Distributed Information Sharing Platform for IoT
Computer Science. 2012, 39 (12): 1-5. 
Abstract PDF(498KB) ( 409 )   
For sharing data and services among humans, machines and the environment in the Internet of Things (IoT), we designed a data service system named the Distributed Information Sharing Platform (DISP). The system collects and processes in real time a variety of data streams that change with time and location, controls devices deployed in the physical environment, and outputs efficient, stable data streams to users all over the world. Every user can either share a data stream under the appropriate access rights or define high-level service components at will. This paper described the implementation mechanisms of DISP, analyzed the key techniques for real-time operation, openness and scalability in detail, and verified the processing flow and efficiency on a prototype system.
Digital Image Encryption:A Survey
Computer Science. 2012, 39 (12): 6-9. 
Abstract PDF(459KB) ( 1725 )   
Starting from the features of digital images, the reasons why traditional cipher algorithms are not directly applicable were analyzed, and the development of digital image encryption was surveyed. Techniques such as pixel permutation in the spatial domain, chaos-based encryption, transform-domain encryption, image secret segmentation and sharing, encryption based on neural networks and cellular automata, and encryption based on blind source separation were illustrated, and their characteristics were analyzed and compared. Finally, a large number of typical encryption algorithms were analyzed in detail to expose their weaknesses, and future research directions were discussed.
Research on Dynamic Web Service Behavior Adaptation
Computer Science. 2012, 39 (12): 10-13. 
Abstract PDF(437KB) ( 406 )   
Service-oriented computing is a research focus of current software engineering and the software industry. With the widespread application of Web service composition, behavioral interactions and collaborations among services become increasingly complicated. Current static mechanisms for service behavior adaptation can hardly support the adaptation of interactions among complex Web services. Against this background, this paper first introduced the fundamental concepts of Web service adaptation, then analyzed the state of the art of, and challenges to, static service behavior adaptation, and finally discussed the basic principles and general methods of dynamic Web service behavior adaptation and its advantages over static adaptation mechanisms.
Study of Measures Problem for Rough Relational Database
Computer Science. 2012, 39 (12): 14-15. 
Abstract PDF(239KB) ( 362 )   
The measures problem for the rough relational database (RRDB) and its development status were discussed. Concretely, some basic concepts related to the rough relational database were given first; then the domestic and international state of research on the measures problem of the rough relational database was reviewed and analyzed.
Survey on Visual Tracking Algorithms Based on Mean Shift
Computer Science. 2012, 39 (12): 16-24. 
Abstract PDF(836KB) ( 816 )   
Mean-shift based visual tracking algorithms have several desirable properties, such as computational efficiency, few tuning parameters, relatively robust performance and straightforward implementation, which make them an appealing topic in visual tracking research. First, the original mean shift tracking algorithm was introduced and its defects were pointed out. Then improvements on the original algorithm were discussed in detail from five aspects: generative and discriminative object appearance models, model update mechanisms, scale and orientation adaptation, anti-occlusion, and fast-moving object tracking. Both classical algorithms and recent advances are included in each aspect. Finally, the prospects of mean-shift based tracking were presented.
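The core mode-seeking iteration behind these trackers can be sketched as follows; this is an illustrative flat-kernel variant on made-up 2-D points, not any of the surveyed implementations:

```python
import numpy as np

def mean_shift(points, start, bandwidth, max_iter=50, tol=1e-6):
    """Seek a density mode: repeatedly move the estimate to the mean of
    the points inside the bandwidth window (flat kernel)."""
    x = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        dists = np.linalg.norm(points - x, axis=1)
        window = points[dists < bandwidth]
        if len(window) == 0:
            break
        new_x = window.mean(axis=0)      # the mean-shift step
        if np.linalg.norm(new_x - x) < tol:
            return new_x
        x = new_x
    return x
```

In a tracker the points are, in effect, pixel locations weighted by how well their colors match the target histogram, and the converged position becomes the new object location.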
Fuzzy Decision-theoretic Rough Sets
Computer Science. 2012, 39 (12): 25-29. 
Abstract PDF(378KB) ( 431 )   
Considering the uncertainty present in practical decision procedures, fuzzy loss functions were introduced into decision-theoretic rough set theory (DTRS) based on Bayesian decision theory. With respect to the minimum Bayesian expected risk, a model of fuzzy decision-theoretic rough sets (FDTRS) was built. The corresponding propositions and criteria of fuzzy decision-theoretic rough set theory were also analyzed. An example of enterprise credit assessment was given to illustrate the proposed model in applications.
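The Bayesian minimum-risk rule underlying DTRS-style models can be illustrated as follows; this is a crisp (non-fuzzy) sketch with hypothetical loss values, not the paper's FDTRS model:

```python
def three_way_decide(p, losses):
    """Pick the action minimizing Bayesian expected risk, given
    p = Pr(object belongs to concept X) and per-action losses
    (loss_if_X_true, loss_if_X_false)."""
    risks = {action: l_pos * p + l_neg * (1 - p)
             for action, (l_pos, l_neg) in losses.items()}
    best = min(risks, key=risks.get)
    return best, risks
```

Sweeping p from 1 down to 0 moves the chosen action from accept through defer to reject, which is exactly the three-region (positive/boundary/negative) structure of decision-theoretic rough sets.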
Static Traffic Grooming Scheme Based on Predatory Search and Gaming
Computer Science. 2012, 39 (12): 30-32. 
Abstract PDF(245KB) ( 363 )   
Using a layered graph, a static traffic grooming scheme based on the predatory search algorithm (PSA) and gaming was proposed for IP over WDM optical Internet, so that the comprehensive user traffic request delay satisfaction degree is maximized and the relative network cost is minimized under the constraints of user traffic bandwidth and delay requests. In the proposed scheme, the transition between local and global search is realized by adjusting the restriction level of the search space, and the optimal traffic grooming scheme is then found. The scheme was simulated over actual network topologies and its performance was compared with existing traffic grooming schemes. Simulation results show that it performs better.
Implementation Method for High-radix Fat-tree Deterministic Source-routing Interconnection Network
Computer Science. 2012, 39 (12): 33-37. 
Abstract PDF(424KB) ( 406 )   
Multicast is an important operation in multicomputer communication systems and can be used to support several other collective communication operations. Compared with unicast-based or path-based multicast approaches, the tree-based multicast approach is more efficient. This paper presented a method based on Distributed Multicast Forwarding-Table and Asynchronous Replication (DMFTAR) to implement the multicast operation. In the DMFTAR method, the implementation is divided into a Multicast Service Layer (MSL), a Multicast Routing Layer (MRL) and a Multicast Forwarding Layer (MFL). Theoretical analysis shows that the DMFTAR method achieves better scalability and less overhead than the traditional implementation method based on Multi-Head Worm-Hole Asynchronous Replication (MHWAR).
Entropy of Characteristics Based Anomaly Traffic Identification Technique
Computer Science. 2012, 39 (12): 38-41. 
Abstract PDF(434KB) ( 477 )   
Existing methods build a model describing normal flow characteristics and use it to identify deviating flows. However, building such a microscopic model is challenging due to the wide variability of flow characteristics. The distributions of packet features (IP addresses and ports) observed in traces, which can be described by entropy, reveal the presence and the structure of a wide range of anomalies. A novel method named Entropy of Characteristics based Anomaly Traffic Identification (ECATI) was proposed. It uses the entropy of these characteristics to detect anomalies and analyzes traffic in anomalous time bins, from which the detector iteratively removes flows that appear normal. We measured the accuracy of the ECATI algorithm using manually labeled anomalies and anomaly injection. The results show that ECATI accurately isolates anomalous traffic, with an average identification rate above 89.5% and few or no missed flows.
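The entropy of a packet-feature distribution that such detectors monitor can be computed as in this sketch (illustrative only; ECATI's actual feature set and thresholds are not specified here):

```python
from collections import Counter
from math import log2

def feature_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of one
    packet feature observed in a time bin, e.g. destination ports."""
    counts = Counter(values)
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())
```

A port scan, for instance, raises the entropy of destination ports (many distinct ports each hit once) while lowering the entropy of destination addresses, and such shifts are what entropy-based detectors flag.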
Research and Implementation of Instant Messenger Standard Monitor Management Technology
Computer Science. 2012, 39 (12): 42-46. 
Abstract PDF(487KB) ( 445 )   
Based on an analysis of the communication mechanisms of mainstream instant messenger software and a summary of the text transmission protocols of various instant messengers, a standard monitoring system for instant messengers, IMSMMS, was designed and implemented. Through session association, IMSMMS can effectively identify the sender and the receiver of a message. IMSMMS not only extracts text messages from multiple versions of mainstream instant messengers (such as MSN, Fetion and Yahoo! Messenger), but also extracts text messages exchanged between two different kinds of software, e.g. MSN and Yahoo! Messenger. By setting sensitive words, the system can filter out messages that contain them. The experimental results show that when fewer than 1000 IM text message packets per second pass through the gateway, IMSMMS has a packet-miss rate below 0.21%. This indicates that IMSMMS can effectively meet the network security requirements of a small-scale Intranet.
P2P Spatial Querying-oriented Routing Recovery Method
Computer Science. 2012, 39 (12): 47-50. 
Abstract PDF(409KB) ( 412 )   
This paper analyzed the problems triggered by peer invalidation and put forward a routing recovery method based on space takeover, so as to maintain the integrity of the whole data space when peers fail. Meanwhile, algorithms for peer joining and space querying based on this routing recovery method were provided. Test results show that this routing recovery method can effectively resolve the problems of backtracking query messages and join failures, and can reinforce the usability of the system.
Improved Particle Swarm Optimization Localization Algorithm for Wireless Sensor Network
Computer Science. 2012, 39 (12): 51-54. 
Abstract PDF(315KB) ( 377 )   
To improve the convergence rate and search ability of the particle swarm optimization (PSO) localization algorithm for wireless sensor networks (WSNs), nonlinear inertia weight and fitness-sorting strategies were applied to improve the localization algorithm for node localization. Finally, through simulation, the localization results of this algorithm were compared with the standard particle swarm optimization algorithm and the least-squares method under different anchor node densities, connectivity and measurement errors. The results show that the improved algorithm can effectively suppress ranging error and improve the convergence rate, and that using this method to optimize the localization of sensor nodes is feasible.
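A nonlinear inertia weight of the general kind mentioned above might look like this; the power-law schedule below is a hypothetical example, since the abstract does not give the paper's exact formula:

```python
def nonlinear_inertia(t, t_max, w_max=0.9, w_min=0.4, power=2):
    """Power-law decreasing inertia weight: large w early (global
    exploration), small w late (local refinement)."""
    return w_min + (w_max - w_min) * (1 - t / t_max) ** power
```

Each particle's velocity update then becomes v = w(t)*v + c1*r1*(pbest - x) + c2*r2*(gbest - x), with w(t) drawn from this schedule, so the swarm explores widely early on and refines node position estimates late.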
Multi-rate Multi-cast Resource Allocation Algorithm for Ad hoc Networks
Computer Science. 2012, 39 (12): 55-59. 
Abstract PDF(409KB) ( 369 )   
Multicast sessions are expected to be an efficient communication scheme, especially for multimedia applications in mobile Ad hoc networks. A new resource allocation algorithm was proposed for wireless Ad hoc networks with multi-rate multicast capability. It uses a price-based approach to solve the multi-rate multicast problem, and it can adaptively allocate network traffic such that the aggregate utility over all sessions is maximized. Simulation results show that the proposed algorithm not only converges well, but its multi-rate multicast capability also allows users with different channel conditions to achieve maximum performance, thereby increasing network throughput.
K-means Clustering Algorithm Based on Artificial Fish Swarm
Computer Science. 2012, 39 (12): 60-64. 
Abstract PDF(429KB) ( 475 )   
Aimed at the lack of global search capability of the K-means algorithm, an optimized K-means clustering algorithm based on an artificial fish swarm (AFS-KM) was presented in this paper, which overcomes the sensitivity of K-means to the selection of initial clustering centers and can obtain a globally optimized clustering partition. During the clustering process, a weighted distance computation method based on information-gain attribute weighting is used, so better clusterings can be obtained for both spherical and ellipsoidal data. A simulation experiment was carried out on the KDD-99 data set, and the results show that a satisfactory detection rate and false acceptance rate can be obtained in network intrusion detection.
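The information-gain-weighted distance idea can be sketched as follows; this is illustrative, with the per-attribute weights assumed to be precomputed information-gain scores:

```python
import math

def weighted_distance(x, y, weights):
    """Euclidean distance with per-attribute weights (here assumed to
    be precomputed information-gain scores)."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, x, y)))
```

Plugging this in for K-means' plain Euclidean distance stretches the space along informative attributes, which is what lets non-spherical (ellipsoidal) clusters be separated.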
Secure Access Scheme Based on TNC for Multi-level Classified Network
Computer Science. 2012, 39 (12): 65-69. 
Abstract PDF(433KB) ( 404 )   
According to the admission control requirements of a multi-level classified network (MLCN), a secure access model based on trusted network connection was proposed. By introducing a security attribute checking rule, the security attributes of the accessing device and its objects are checked to ensure that they will not lead to sensitive information leakage. By introducing an integrity measurement rule, mutual measurement between the network and the device can be achieved. Based on the access security model, an access system framework for MLCN and an accompanying authentication protocol were put forward. The protocol performs integrity measurement before user authentication to achieve reliable mutual authentication. Comparative analysis indicates that the protocol is relatively more efficient.
Investigating the Performance of MIL-STD-188-110C HF Waveforms Transmitting in ITS Wideband HF Channel
Computer Science. 2012, 39 (12): 70-72. 
Abstract PDF(233KB) ( 454 )   
The HF waveforms defined by US MIL-STD-188-110C were introduced, along with the model structure and the channel impulse response calculation method of the ITS wideband HF channel. This paper simulated and compared, for the first time, the bit error rate curves of wideband waveforms transmitted over the Watterson channel and the ITS wideband HF channel, providing a degree of theoretical foundation for HF wideband communication research.
Security Analysis and Improvement of a Strongly Secure Certificateless Key Agreement Protocol
Computer Science. 2012, 39 (12): 73-75. 
Abstract PDF(314KB) ( 585 )   
Yang and Tan proposed a certificateless key agreement protocol without pairing and claimed that their scheme satisfies forward secrecy, meaning that no adversary can derive an established session key unless the full user secret information (including the private key and the ephemeral secret key) of both communicating parties is compromised. However, we pointed out that their protocol is actually not as secure as claimed, by presenting an attack launched by an adversary who has learned the private key of one party and the ephemeral secret key of the other, but not the full user secret keys of both parties. Furthermore, to fix this flaw, we provided a revised protocol in which the private key and the ephemeral secret key are closely intertwined with each other when generating the session key, so the above attack can be efficiently resisted.
Incremental SVM Intrusion Detection Algorithm Based on Distance Weighted Template Reduction and Attribute Information Entropy
Computer Science. 2012, 39 (12): 76-78. 
Abstract PDF(316KB) ( 382 )   
To address the low detection rate, high false alarm rate and slow detection speed of SVM-based intrusion detection methods, an incremental SVM intrusion detection algorithm based on distance-weighted template reduction and attribute information entropy was proposed. In this algorithm, the training sample set is first reduced according to the distance-weighted sum of distances between each sample and its neighbors; then clustered sample points and noisy points are removed through segmentation of the neighboring boundary region based on the attribute information entropy of the samples; next, candidate support vectors are extracted using sample dispersion, and incremental learning based on the KKT conditions is performed to construct the optimal SVM classifier. The simulation results show that the algorithm achieves a good detection rate and detection efficiency with a low false alarm rate.
Arnold Encryption Algorithm Based on PN Sequence
Computer Science. 2012, 39 (12): 79-82. 
Abstract PDF(571KB) ( 354 )   
The Arnold cat transformation is a classical image scrambling algorithm, but its periodicity limits the number of scrambling iterations, so the key space is too small. This paper presented a modified Arnold method, which uses a PN sequence and a secure hash algorithm to generate a random parameter sequence, then divides an image into 4 pieces and applies the Arnold algorithm to the 4 pieces respectively. The modified algorithm effectively enlarges the key space and resists exhaustive-search attacks, which enhances security.
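The underlying Arnold cat map, whose periodicity motivates the modification, can be sketched as follows (a plain textbook version, not the paper's PN-sequence variant):

```python
import numpy as np

def arnold_cat(img, rounds=1):
    """Arnold cat map on a square N x N image:
    (x, y) -> ((x + y) mod N, (x + 2y) mod N), applied `rounds` times."""
    n = img.shape[0]
    out = img
    for _ in range(rounds):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out
```

Because the map is a permutation of pixel positions, iterating it eventually restores the original image; for a 2x2 image the period is only 3. This small effective key space is the weakness the paper's random parameter sequence addresses.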
Research on Collaborative Filtering Recommendation Based on User Fuzzy Clustering
Computer Science. 2012, 39 (12): 83-86. 
Abstract PDF(351KB) ( 579 )   
The traditional collaborative filtering algorithm does not consider the influence of users' contextual information, and issues such as data sparsity and poor scalability directly affect the recommendation quality of recommender systems. To address these issues, a collaborative filtering algorithm based on fuzzy clustering of user context was proposed. First, users are clustered by a fuzzy clustering algorithm according to user context; then the user-item rating matrix is filled by the Slope One algorithm before traditional collaborative filtering is applied. This effectively alleviates the sparsity of user rating data and improves real-time performance. The experimental results indicate that the recommendation accuracy of the proposed approach is largely improved.
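The Slope One pre-filling step can be sketched as follows; this is the weighted Slope One variant on a toy rating dictionary, with data and names made up:

```python
def slope_one_predict(ratings, user, item):
    """Weighted Slope One: predict `user`'s rating of `item` from the
    average rating difference between `item` and each item the user
    has rated. `ratings` maps user -> {item: rating}."""
    num, den = 0.0, 0
    for j, r_uj in ratings[user].items():
        if j == item:
            continue
        # rating differences over users who rated both item and j
        diffs = [r[item] - r[j]
                 for r in ratings.values() if item in r and j in r]
        if diffs:
            num += (sum(diffs) / len(diffs) + r_uj) * len(diffs)
            den += len(diffs)
    return num / den if den else None
```

Running this over every empty cell densifies the user-item matrix before the similarity computation, which is the pre-filling step the algorithm relies on.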
Fast Fault Location Mechanism Based on Multipath Traffic Transmission
Computer Science. 2012, 39 (12): 87-90. 
Abstract PDF(320KB) ( 406 )   
To solve the problem of fast and accurate fault localization, a fast fault location mechanism based on multipath traffic transmission was proposed. By establishing a multipath traffic transmission model, a number of link-disjoint lightpaths are found. Meanwhile, each node independently computes the affected-link vector in a distributed manner so as to rapidly restrict the failure to a small localization area. Theoretical analysis and simulation results demonstrate that the proposed mechanism achieves complete fault location faster and greatly increases the location speed.
Research of PE File Information Hiding Based on Incremental Link
Computer Science. 2012, 39 (12): 91-93. 
Abstract PDF(564KB) ( 543 )   
Incremental linking is adopted to speed up compilation and make debugging convenient. By analyzing the characteristics of PE files built with incremental linking, an information hiding algorithm was designed. Concretely, it hides information in the padding bytes between adjacent function codes, which blends the hidden information into the program's instruction codes to guarantee concealment and attack tolerance. The experimental results show that the algorithm has a large hiding capacity, that the PE file size does not increase after hiding, and that there is no impact on program performance.
Integrity Checking Algorithm Based on Hash Tree for Cloud Storage
Computer Science. 2012, 39 (12): 94-97. 
Abstract PDF(387KB) ( 537 )   
Cloud storage services enable users to enjoy high-capacity and high-quality storage with low overhead, but they also bring many potential threats, for example to data integrity and data availability. This paper proposed a new integrity checking algorithm. Analysis shows that this new algorithm, based on a hash tree and big-integer operations, can check the integrity of massive numbers of files with little storage, computation and network resource. In addition, it also supports some dynamic data updates.
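A minimal hash-tree (Merkle-tree) root computation, of the kind such checking schemes build on, might look like this; this is an illustrative sketch and does not reproduce the paper's big-integer construction:

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Root hash of a binary hash tree over file blocks: any changed
    block changes the root, so integrity is checked by comparing one
    small value."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:              # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

A verifier holding only the root can check a single block against O(log n) sibling hashes, which is why such trees suit low-storage, low-bandwidth cloud auditing.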
Research on Electric Information Network Security Situation Awareness Model Based on Intelligent Agent
Computer Science. 2012, 39 (12): 98-101. 
Abstract PDF(355KB) ( 645 )   
Network security situation awareness (NSSA) is one of the effective ways to implement network security monitoring. It can be utilized to improve operational security management and to enhance the proactive defense ability of the network. On the basis of research on NSSA models, an NSSA model based on intelligent agents was proposed with consideration of the current situation and demands of the electric information network. With the aim of directing the security monitoring and management of the electric information network, the proposed model describes the agent modeling and functional modules of four layers: the data collection and processing layer, the evaluation and analysis layer, the coordination and management layer and the situation decision layer.
Research on Reconfiguration Networked Software System Availability Prediction
Computer Science. 2012, 39 (12): 102-106. 
Abstract PDF(438KB) ( 358 )   
Based on modeling and simulation of complex systems, a new approach to predicting networked software system availability through multi-agent modeling and simulation was proposed. First, the method of multi-agent system modeling and simulation was introduced. Second, the characteristics of reconfigurable networked software systems were analyzed. Then, after studying multi-agent based modeling and simulation of networked software systems and behavioral models of reconfiguration, a new strategy for availability prediction was presented. Finally, to verify its effectiveness, a case study based on the approach was carried out on the NetLogo simulation platform.
Multi-ontology System Based Approach of Access Control for Semantic Web Services
Computer Science. 2012, 39 (12): 114-117. 
Abstract PDF(598KB) ( 379 )   
A multi-ontology-system based access control approach for semantic Web services was proposed. First, a bridge-ontology based cross-domain multi-ontology system (CDMOS), which provides a semantic model for access control of semantic Web services, was presented based on distributed description logic (DDL). Second, on the basis of semantic access control technology, an access control model for semantic Web services was given. Finally, this paper gave the architecture of the multi-ontology-system based access control approach for semantic Web services and a case study of the approach. In access control for semantic Web services, CDMOS not only provides semantic mapping between the semantic models of security domains, but also ensures semantic independence among the security domains.
Research of SCA Service Model for Dynamic Management
Computer Science. 2012, 39 (12): 118-120. 
Abstract PDF(325KB) ( 371 )   
Traditional methods based on the combination of SCA and OSGi are not able to support dynamic management of component modules in a distributed environment at runtime. Based on an analysis of the traditional methods, this paper proposed an OSGi-based SCA service model, DOSGi-SCA. To support dynamic component module management at runtime in a distributed environment, DOSGi-SCA constructs a service registry center to manage local and remote services based on distributed OSGi. A practical example demonstrates that this model takes the advantages of both SCA and OSGi while avoiding their deficiencies.
Numerical Modeling for Software Based on Extended Fuzzy Description Logic
Computer Science. 2012, 39 (12): 121-124. 
Abstract PDF(358KB) ( 320 )   
Representing complex numerical relations is difficult in software modeling, since accurate representation of complex numerical relations always leads to high complexity in software model reasoning. This paper presented an extended fuzzy description logic based framework to approximately represent numerical relations by comparisons over fuzzy functions. It contains three core modules: fuzzification from the software numerical domain to the fuzzy domain, construction of software numerical knowledge bases (SNKBs), and reasoning with SNKBs. This paper gave some common fuzzification functions and two fuzzification principles to guarantee sufficiency, described the construction steps and procedures for cut concepts, assertions and inclusions in SNKBs, and discussed implementation mechanisms for the design, optimization and segmentation of reasoning algorithms.
Study on Complementation of Passive Testing and Active Testing on Simulation Software
Computer Science. 2012, 39 (12): 125-132. 
Abstract PDF(442KB) ( 384 )   
Passive testing and active testing of simulation software both have disadvantages in practice. Details of the complementation between passive and active testing were proposed to address these disadvantages. Historical data can be obtained in advance through certain passive mechanisms from practical data, and the effectiveness of testing against the historical data can be improved. Both the "orderly method" and the "inversion method" can effectively overcome the disadvantages of active and passive testing, and their complementation is more flexible in practice. The data obtained by testing can play a great role in the modification of parameters in the mathematical model of the simulation software. Details of the complementary testing should be associated with the development process of the simulation software.
ASP-based Verification of Concurrent Systems Described by CSP
Computer Science. 2012, 39 (12): 133-136. 
Abstract PDF(632KB) ( 382 )   
Traditionally, the verification of properties of Communicating Sequential Processes (CSP) is carried out on three different levels of models, which increases the complexity and difficulty of developing verification tools. At the same time, mainstream tools for verifying concurrent systems cannot verify multiple properties in one run, which decreases their verification efficiency. To deal with these problems, this paper proposed an ASP-based unified framework for verifying concurrent systems described by CSP. The method first transforms CSP into an Answer Set Program, and then transforms the execution rules for concurrent CSP processes and the properties, given as LTL/CTL formulae, into ASP rules. Finally, the properties can be verified by computing the answer sets of the resulting ASP program. It is shown that the ASP-based method for verifying CSP concurrent systems is easy to implement, is able to verify multiple LTL/CTL formulae in one execution of the verification software, and at the same time achieves acceptable time efficiency.
Software Multi-project Scheduling Genetic Algorithms Based on a Time-line Model
Computer Science. 2012, 39 (12): 139-144. 
Abstract PDF(461KB) ( 555 )   
Reasonable scheduling can greatly improve the utilization of human resources in software project development. Based on research into current task scheduling algorithms, and taking into consideration the separability of software development tasks and employees' skills and project experience, a multi-project concurrent scheduling model based on a time-line, which splits tasks by time unit, was defined to minimize the cost, including employees' salaries and overtime penalties. At the same time, to improve the flexibility of employee assignment, the model also allows employees' skills and experience to be improved through training and through working on tasks. Since the model contains many constraints, a genetic algorithm with some heuristics is used to implement it. The effectiveness of the model and algorithm is verified by simulation results.
Research on a Paxos-based Approach for Memory Data Replication in Stock Trading System
Computer Science. 2012, 39 (12): 145-148. 
Abstract PDF(571KB) ( 792 )   
With the development of high-speed network technology and the rising demand for high-frequency trading, improving trading speed has become the main focus of e-commerce trading system providers in recent years. Primary-backup replication based on shared storage is a classical approach to ensuring the high availability and data durability of trading systems, but it is difficult to further reduce system latency due to its persistence bottleneck. To solve this problem, a Paxos-based approach to memory data replication was proposed and illustrated in the stock trading context. The approach accomplishes primary-backup replication through messaging, ensures strong consistency of data replicas, and tolerates possible benign failures. Experimental results show that, compared with the shared-storage replication approach, this approach reduces the order processing latency of the stock trading system from several milliseconds to several hundred microseconds on 10 Gbit Ethernet, and achieves hot failover correctly in case of primary host failure.
xScraper: Bulk- and Deep-extracting Non-structured Web Information Based on Web-Harvest Techniques
Computer Science. 2012, 39 (12): 149-152. 
Abstract PDF(359KB) ( 362 )   
A system named xScraper was developed based on an investigation of the data extraction rules in Web-Harvest. Its 5 main functions are: (1) flexible specification of extraction rules to meet different application requirements; (2) controllable bulk extraction of non-structured data (including images) from the same Web site; (3) deep extraction of topic-related information across many Web sites; (4) extraction of metadata from Web sites and transformation into XML tags; (5) management of non-structured multimedia information in databases. xScraper is a simple, practical and extendable system. It provides value-added services over Web-Harvest and can meet different requirements of Web information extraction.
Collaborative Filtering Recommendation Algorithm Based on Item Clustering and Global Similarity
Computer Science. 2012, 39 (12): 153-157. 
Abstract PDF(320KB) ( 363 )   
When facing extremely sparse user rating data, traditional similarity measures perform poorly, which results in poor recommendation quality. To address this, a new collaborative filtering recommendation algorithm based on item clustering and a global nearest neighbor set was proposed. A clustering algorithm is applied to cluster items into several classes based on item similarity, the local user similarity is then calculated within each cluster, and finally a global similarity between nearest-neighbor users is used to measure user similarity. In addition, an overlap factor is introduced to improve the accuracy of the local similarity between users. The experimental results show that this algorithm improves prediction accuracy and enhances recommendation quality, performing well even under extremely sparse data.
Research of Data Privacy&Security in Map-Reduce Model
Computer Science. 2012, 39 (12): 158-161. 
Abstract PDF(457KB) ( 366 )   
In the analysis and processing of large-scale databases, protecting the privacy of sensitive data is crucial. In order to analyze the efficiency and security of services over mass data with statistical character, a differential privacy mechanism was implemented in the Map-Reduce computation model. A decision tree generation algorithm was proposed for this computation model and proved to satisfy ε-differential privacy. Experiments indicate that the algorithm has good classification accuracy and computational efficiency.
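The Laplace mechanism commonly used to make counts (e.g. per-class counts at candidate decision-tree splits) ε-differentially private can be sketched as follows; this is illustrative and not necessarily the paper's exact noise placement:

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value plus Laplace(sensitivity/epsilon) noise, the
    standard mechanism for epsilon-differentially private numeric
    queries (noise sampled via the Laplace inverse CDF)."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5              # uniform on [-0.5, 0.5)
    return true_value - scale * math.copysign(math.log(1 - 2 * abs(u)), u)
```

In a Map-Reduce decision-tree setting, each reducer would release its class counts through such a mechanism before split selection, so that the ε budget bounds what any single record can reveal.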
Efficient Multi-keyword Search over Secure Cloud Storage
Computer Science. 2012, 39 (12): 162-166. 
Abstract PDF(431KB) ( 324 )   
RelatedCitation | Metrics
To protect users' privacy, data stored at a cloud service provider (CSP) usually needs to be encrypted before being sent to the CSP. This brings about the problem of how users search files using keywords over encrypted cloud data. In many scenarios, the CSP is considered a potential attacker. According to the characteristics of cloud computing, we proposed an efficient privacy-preserving approach to support multi-keyword search over encrypted data (PPMKS for short), which is based on a binary sort tree for search (BSTS for short). In our PPMKS, authorized users can easily search ciphertext files using multiple keywords, enjoying the service of multi-keyword search over encrypted data anywhere and anytime.
Roles Acquisition Based on Concept Lattice Model
Computer Science. 2012, 39 (12): 167-170. 
Abstract PDF(402KB) ( 414 )   
RelatedCitation | Metrics
Role engineering focuses on role mining and optimization for Role-Based Access Control (RBAC), but it omits the scenario of complex information systems (CIS) among those applications. The popular model for access control in CIS is RBAC, where relations between roles are assumed to have been built by humans beforehand. However, building these relations is time-consuming, even for experts. We introduced role engineering into CIS, exploiting a concept lattice model and a subject-predicate-object method to generate roles and their corresponding hierarchical relations from the requirement information acquired from domains, at lower cost. In the end, our experimental results show that our algorithm is effective.
Classifying Communication Dispatch System Logs of Smart Grid Based on Active Semi-supervised Learning
Computer Science. 2012, 39 (12): 171-176. 
Abstract PDF(442KB) ( 386 )   
RelatedCitation | Metrics
The communication dispatch system is the guarantee for the normal operation of the smart grid. In order to ensure the correct operation of the system, staff on duty need to record the operational status, emergencies, accident faults as well as the corresponding treatment programs of the communication dispatch system of the smart grid. To help managers keep up with the working status of the system and find potential security risks, the logs need to be labeled with certain types to facilitate query and retrieval, so the communication dispatch system needs to be able to automatically classify recorded logs according to various demands of management. However, automatic classification of logs, recorded by attendants in terms of their own understanding and habits, needs to learn from a large number of labeled log data provided by information scheduling experts. Since manually reading to label is a time-consuming and labor-intensive process, only a small number of labels are often provided in practical applications, thus affecting the performance of automatic classification. In terms of this limitation, this paper proposed an automated classification method based on active semi-supervised learning. This method, on one hand, acquires the labels of the logs that can improve the classifier most through active learning; on the other hand, it further enhances learning performance by the use of large numbers of unlabeled logs. The results of application on logs of the communication dispatch system of the national smart grid show that the method based on active semi-supervised learning can achieve better performance than existing methods.
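The query-selection step of active learning can be sketched with margin-based uncertainty sampling; this is a generic illustration, and the paper's actual selection criterion may differ.

```python
def select_queries(class_probs, budget):
    """Return indices of the `budget` samples whose top-two class
    probabilities are closest (least confident -> most worth labeling)."""
    def margin(p):
        top, second = sorted(p, reverse=True)[:2]
        return top - second
    ranked = sorted(range(len(class_probs)), key=lambda i: margin(class_probs[i]))
    return ranked[:budget]
```

The selected logs would then be sent to an expert for labeling, and the remaining unlabeled logs used for the semi-supervised part.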
Hybrid Recommendation Filtering Method Based
Computer Science. 2012, 39 (12): 177-180. 
Abstract PDF(526KB) ( 431 )   
RelatedCitation | Metrics
Combining the user-item rating matrix and the item-category correlation matrix, a new hybrid recommendation model was proposed. First, a new correlation degree measuring algorithm was presented using these two matrices. This algorithm takes into account the feature information and dynamically adjusts the result based on the sparsity of the rating data, truly reflecting the degree of association between items. Then, a new weighted two-layer graph model was constructed using the item-item correlation degree and the user-item correlation degree as the weights. On this basis, starting from the global structure of the two-layer graph, a recommendation algorithm based on the weighted two-layer graph was given using a random walk, to provide users with personalized item recommendations and user recommendations. The experiments show that the algorithm has higher accuracy compared with other recommendation models in the references.
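A random walk with restart over the weighted graph is one standard way to turn edge weights into recommendation scores; the sketch below is an illustrative version, not the paper's exact two-layer formulation.

```python
def random_walk_with_restart(adj, start, alpha=0.15, n_iters=60):
    """Iterate a random walk with restart probability `alpha` on a weighted
    graph (adj: node -> {neighbor: weight}); the resulting scores rank
    candidate recommendations by proximity to `start`."""
    score = {v: 0.0 for v in adj}
    score[start] = 1.0
    for _ in range(n_iters):
        nxt = {v: 0.0 for v in adj}
        for v, neighbors in adj.items():
            total = sum(neighbors.values())
            for u, w in neighbors.items():
                nxt[u] += (1 - alpha) * score[v] * w / total
        nxt[start] += alpha  # restart mass returns to the source node
        score = nxt
    return score
```

Items connected to the user by heavier edges accumulate more probability mass and rank higher.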
Pattern Matching with Wildcards Based on Suffix Tree
Computer Science. 2012, 39 (12): 181-183. 
Abstract PDF(417KB) ( 608 )   
RelatedCitation | Metrics
Pattern matching with wildcards is a hot research problem that can be used in biological sequence analysis, text indexing, network intrusion detection, and so on. Aiming at the problem that wildcards have strong limitations in the existing research work, pattern matching with flexible wildcards was studied. The wildcards can appear between any two substrings and can be specified with flexible length constraints. The nonlinear data structure, the suffix tree, was used to design a complete algorithm, PAST. In the preprocessing phase, an online incremental algorithm was used to build the suffix tree, which holds prior knowledge of the text. In the search phase, the idea of dynamic programming was used to match the characters of the pattern. Experiments on DNA sequences show that our method has better time performance than related matching algorithms.
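The dynamic-programming idea of the search phase can be sketched on plain strings as follows; this is a simplified illustration with uniform gap constraints, whereas PAST itself works over the suffix tree built in the preprocessing phase.

```python
def count_matches(text, pattern, min_gap, max_gap):
    """Count occurrence paths of `pattern` in `text` where between any two
    consecutive pattern characters min_gap..max_gap wildcards may appear.
    dp[j][i] = number of ways to match pattern[:j+1] ending at text pos i."""
    n, m = len(text), len(pattern)
    dp = [[0] * n for _ in range(m)]
    for i, c in enumerate(text):
        if c == pattern[0]:
            dp[0][i] = 1
    for j in range(1, m):
        for i in range(n):
            if text[i] != pattern[j]:
                continue
            lo = i - 1 - max_gap   # earliest position of previous character
            hi = i - 1 - min_gap   # latest position of previous character
            dp[j][i] = sum(dp[j - 1][k] for k in range(max(lo, 0), hi + 1))
    return sum(dp[m - 1])
```

For example, in text "AABA" the pattern "AB" with gaps of 0 to 1 wildcards matches twice (A at 0 and at 1, both reaching the B).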
General Algorithms for Dempster's Combination Rule of Evidence
Computer Science. 2012, 39 (12): 184-187. 
Abstract PDF(306KB) ( 340 )   
RelatedCitation | Metrics
Evidential reasoning is an important method of uncertainty reasoning, while Dempster's combination rule is the core of evidential reasoning. The focal element of synthesis results may be a collection of a number of assumptions or propositions, but previous algorithms for Dempster's combination rule of evidence cannot handle this situation, and their approximation algorithms for the calculation are also not precise enough. Therefore, by implementing subsets of the frame of discernment with bit vectors, three precise general algorithms for Dempster's combination rule were proposed, which use a linear list and a balanced tree. Theoretical analysis and simulation results show that the algorithms are effective.
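Representing focal elements as bit vectors makes subset intersection a single AND; below is a minimal Python sketch of Dempster's rule under this representation (the linear-list and balanced-tree variants of the paper are not reproduced, and total conflict is assumed to be below 1).

```python
def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are bit masks over
    the frame of discernment (bit i set = hypothesis i is included)."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b              # set intersection as bitwise AND
            if inter == 0:
                conflict += ma * mb    # empty intersection -> conflict mass
            else:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
    k = 1.0 - conflict                 # normalization constant
    return {s: v / k for s, v in combined.items()}
```

With frame {A, B} encoded as bits 0b01 and 0b10, compound focal elements such as {A, B} = 0b11 are handled uniformly, which is exactly the case single-hypothesis algorithms miss.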
Research of Music Recognition Based on Ensemble Learning
Computer Science. 2012, 39 (12): 188-191. 
Abstract PDF(441KB) ( 520 )   
RelatedCitation | Metrics
With the development of information
Research on Event-based Method for Text Representation
Computer Science. 2012, 39 (12): 192-194. 
Abstract PDF(328KB) ( 455 )   
RelatedCitation | Metrics
By studying some traditional text representation models, this paper considered the event as a basic semantic unit of narrative texts, and presented a new event-based text representation method (Event Network), which combines the characteristics of the graph structure. This method uses events and the relationships between events to represent the text, and can retain the structural and semantic information of the text to a greater extent. The experimental results show that automatic summarization based on this method has better performance.
Design of the Sign Language Template Library Based on Index Structure
Computer Science. 2012, 39 (12): 195-197. 
Abstract PDF(254KB) ( 317 )   
RelatedCitation | Metrics
With the increasingly widespread applications of HCI, sign language recognition technology has received much attention and development. On the basis of research into current sign language recognition techniques, and considering the disadvantages of existing sign language template libraries and the features of Chinese sign language, a sign language template library based on an index structure was designed. With this method, sign language recognition accuracy and efficiency are improved.
TEO-CFCC Characteristic Parameter Extraction Method for Speaker Recognition in Noisy Environments
Computer Science. 2012, 39 (12): 198-203. 
Abstract PDF(246KB) ( 437 )   
RelatedCitation | Metrics
Considering the sharp decline in the recognition accuracy of the MFCC characteristic parameter for speaker recognition in low SNR environments, this paper proposed a TEO-CFCC characteristic parameter extraction method. Signal phase matching is applied to eliminate speech noise on the basis of the CFCC characteristic parameter, and then the Teager energy operator is added to the acquisition of the CFCC characteristic parameter. In this way the TEO-CFCC characteristic parameter is obtained and the energy of speech becomes one of the characteristic parameters for speaker recognition. Experiment results show that the recognition accuracy can reach 83.2% in a −5dB SNR vehicle interior noise environment by using the TEO-CFCC characteristic parameter.
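The Teager energy operator itself has a standard discrete form, shown below; how it is wired into CFCC extraction is the paper's contribution and is not reproduced here.

```python
def teager_energy(x):
    """Discrete Teager energy operator:
    psi[n] = x[n]^2 - x[n-1] * x[n+1], defined for interior samples.
    For a pure tone A*sin(w*n) it yields the constant A^2 * sin(w)^2,
    which tracks both amplitude and frequency of the signal."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]
```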
F-ladder Knowledge and F-ladder Mining-discovery
Computer Science. 2012, 39 (12): 204-207. 
Abstract PDF(392KB) ( 307 )   
RelatedCitation | Metrics
By employing one direction S-rough sets (one direction singular rough sets) and their dynamic characteristic, this paper presented the concepts of F-ladder knowledge and F-ladder degree. Using these concepts, the discernibility theorem of F-ladder knowledge, the mining-discovery theorem of minimum F-ladder knowledge, the mining-discovery theorem of maximum F-ladder knowledge and the dependence-filter theorem of knowledge discovery were proposed, and the interior hiding principle of F-ladder knowledge, the mining-discovery criterion of F-ladder knowledge and its applications were given. The results given in this paper show the new characteristics of one direction S-rough sets and the new applications of the dynamic characteristic of one direction S-rough sets.
Optimization Method of Workflow Service Subject Based on Dynamic Particle Swarm Algorithm
Computer Science. 2012, 39 (12): 208-210. 
Abstract PDF(305KB) ( 324 )   
RelatedCitation | Metrics
Researching the workflow business time-cost optimization problem, this paper proposed a novel dynamic particle swarm optimization method for workflow service subject selection. Through regional division, when the particles located in a region have fitness values worse than the region's best fitness, the region is reinitialized, so that the algorithm has better global convergence and dynamic adaptability; meanwhile, through the introduction of random disturbance and a reverse operator, the search scope is expanded to the entire solution space in order to greatly improve the probability of finding the optimal solution. Combined with a target model of the grid workflow scheduling problem based on the dynamic particle swarm algorithm, this paper discussed the workflow service subject selection method from three aspects: across time granularities, across time zones, and across working systems. The experimental results show that this method has shorter execution time and cost, higher efficiency and better superiority than other grid workflow scheduling algorithms.
Classifier-similarity-based Ensemble Classification Research for Data Streams
Computer Science. 2012, 39 (12): 211-213. 
Abstract PDF(263KB) ( 509 )   
RelatedCitation | Metrics
Classification of data streams has become a hot research topic, and a new similarity-based dynamic ensemble algorithm was presented to deal with two critical problems, namely concept drift and noise. Because adjacent data in a data stream are more likely to share the same concept, the newest sub-classifier stands for the coming concept. Based on this, the ensemble classifier is obtained by similarity-weighted majority voting, and the sub-classifier with the worst performance is deleted to adapt to concept drift and noise. Experiment results on simulation data sets show the algorithm outperforms other schemes in classification accuracy and noise tolerance.
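The similarity-weighted voting and worst-member pruning can be sketched as follows; this is an illustrative skeleton in which classifiers are plain callables and the weights stand in for the similarity-derived scores.

```python
def weighted_vote(classifiers, weights, x):
    """Weighted majority vote; each classifier is any callable x -> label."""
    votes = {}
    for clf, w in zip(classifiers, weights):
        label = clf(x)
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

def add_and_prune(classifiers, weights, new_clf, new_weight, capacity):
    """Add the classifier trained on the newest data block; if the ensemble
    exceeds capacity, drop the member with the worst (lowest) weight."""
    classifiers.append(new_clf)
    weights.append(new_weight)
    if len(classifiers) > capacity:
        worst = weights.index(min(weights))
        classifiers.pop(worst)
        weights.pop(worst)
```

Pruning by lowest weight is what lets the ensemble forget outdated concepts while down-weighting noisy members.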
Granular Matrix-based Knowledge Representation for Tolerance Relation
Computer Science. 2012, 39 (12): 214-215. 
Abstract PDF(274KB) ( 358 )   
RelatedCitation | Metrics
Based on equivalence relations, knowledge representation systems (KRS) and knowledge reduction algorithms were established by rough set theory (RST). The tolerance relation is an extension of the equivalence relation. The KRS, upper and lower approximations, knowledge dependency and discovery of association rules of a tolerance system were defined and computed by granular matrix.
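For a finite universe, the tolerance-class approximations can be computed directly from a boolean tolerance matrix; below is a small sketch (illustrative only, not the paper's granular-matrix notation).

```python
def approximations(tolerance, concept):
    """Lower/upper approximation of a concept (a set of object indices)
    from a boolean tolerance matrix: row i marks the tolerance class of i."""
    n = len(tolerance)
    classes = [{j for j in range(n) if tolerance[i][j]} for i in range(n)]
    # Lower: objects whose whole tolerance class lies inside the concept.
    lower = {i for i in range(n) if classes[i] <= concept}
    # Upper: objects whose tolerance class intersects the concept.
    upper = {i for i in range(n) if classes[i] & concept}
    return lower, upper
```

Unlike equivalence classes, tolerance classes may overlap, which is why lower and upper approximations are computed per object row rather than per partition block.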
Research of Lattice in Evidence Synthesis
Computer Science. 2012, 39 (12): 216-219. 
Abstract PDF(230KB) ( 305 )   
RelatedCitation | Metrics
On the basis of the architecture of rough set theory and evidence theory, this paper analyzed different methods of mass functions in rough set theory and evidence synthesis, and researched the relationship between the subdivision in partial order stratification and evidence synthesis. According to changes of the subdivision in partial order stratification, this paper demonstrated that the mass function obtained in the subdivision of the supremum, infimum and all upper and lower bounds in the sub-partial order on the lattice does not correspond to the mass function of evidence synthesis in evidence theory. Therefore, it clarifies the relationship between the mass function acquired from the subdivision at different knowledge granularity levels and the mass function obtained from evidence theory.
Farm Machinery on-board GPS Location Accuracy Simulation and Research
Computer Science. 2012, 39 (12): 220-223. 
Abstract PDF(314KB) ( 466 )   
RelatedCitation | Metrics
This paper researched the on-board GPS positioning problem of farm machinery in agricultural production, to improve positioning accuracy. Farm machinery on-board GPS positioning is easily influenced by jitter: the on-board GPS shakes when the long farmland lines are not flat, causing inaccurate GPS positions and resulting in work errors in agricultural machinery operation. Traditional linear filtering methods for on-board GPS positioning cannot effectively remove the influence of jitter, causing low location accuracy. In order to solve this problem, this paper put forward an on-board GPS double-filter localization method. An activation function is used to control the Kalman filter gain and preliminarily weaken the effect of jitter. A self-adaptive attenuation factor is introduced to establish a rapid-response filter, and combined with the motion track changes of the farm machinery, the influence of jitter is removed, completing farm machinery positioning. Simulation results show that this method can effectively remove the shaking influence caused by uneven long lines and accurately finish the orientation of the on-board GPS of agricultural machinery.
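A plain constant-position Kalman filter illustrates the gain-controlled smoothing that underlies such methods; the activation-function gain control and adaptive attenuation factor of the paper are not reproduced, and `q`/`r` are illustrative noise parameters.

```python
def kalman_1d(measurements, q=1e-3, r=0.25):
    """Minimal 1-D constant-position Kalman filter smoothing jittery
    GPS readings; q = process noise, r = measurement noise variance."""
    x, p = measurements[0], 1.0
    estimates = [x]
    for z in measurements[1:]:
        p += q                  # predict: uncertainty grows
        k = p / (p + r)         # Kalman gain: trust in the new measurement
        x += k * (z - x)        # update state toward the measurement
        p *= (1 - k)            # update: uncertainty shrinks
        estimates.append(x)
    return estimates
```

The gain `k` is exactly the quantity the paper's activation function would modulate: a smaller gain means jittery measurements move the estimate less.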
Modeling and Simulating of User Clustering on Network Based on Particle Swarm Optimization
Computer Science. 2012, 39 (12): 224-227. 
Abstract PDF(338KB) ( 370 )   
RelatedCitation | Metrics
Most current research on public opinion on the network focuses on the analysis of emergencies' spreading and early-warning processes, ignoring the user's main role in the procedure of opinion spreading. In terms of this problem, we introduced the concept of "concept space", and modeled and simulated the users' concept clustering process during the transmission of emergencies on the network by using a particle swarm optimization algorithm. On the basis of the users' clustering results, we analyzed the dynamic evolution model of network emergencies. Changing the velocity parameter controls the convergence rate of user clustering, coordinates the evolution process of network emergencies, and realizes the recognition of network hot events and early warning for public opinion crises. At last, we comparatively simulated the users' clustering behavior based on the basic PSO and speciation PSO (SPSO) algorithms. Simulation results show that the SPSO algorithm is able to simulate the concept clustering more effectively. It is able to find multiple clustering centers at the same time, and helps to adaptively set coping strategies for early warning of public opinion.
KL-divergence Based Feature Selection Algorithm with the Separate-class Strategy
Computer Science. 2012, 39 (12): 228-232. 
Abstract PDF(341KB) ( 846 )   
RelatedCitation | Metrics
Feature selection is one of the core issues in designing pattern recognition systems and has attracted considerable attention in the literature. Most feature selection methods in the literature only handle relevance and redundancy analysis from the point of view of the whole class, neglecting the relation between features and the separate class labels. To this end, a novel KL-divergence based feature selection algorithm was proposed to explicitly handle the relevance and redundancy analysis for each class label with a separate-class strategy. A KL-divergence based metric of effective distance was also introduced in the algorithm to conduct the relevance and redundancy analysis. Experimental results show that the proposed algorithm is efficient and outperforms the three representative algorithms CFS, FCBF and ReliefF with respect to the quality of the selected feature subset.
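The underlying divergence is standard; below is a minimal sketch of discrete KL divergence, which such an effective-distance metric would build on (the paper's exact metric is not reproduced).

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for two discrete distributions given as equal-length
    lists; eps guards against zero probabilities in q."""
    return sum(pi * math.log(pi / (qi + eps))
               for pi, qi in zip(p, q) if pi > 0)
```

Here p might be a feature's value distribution within one class and q its distribution overall, so a large divergence flags the feature as relevant to that specific class.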
Testable Properties for Subgraphs in Large Graphs
Computer Science. 2012, 39 (12): 233-236. 
Abstract PDF(381KB) ( 318 )   
RelatedCitation | Metrics
A property testing algorithm is required to distinguish, with high probability, the case that the input has a predetermined property Π from the case that the input is ε-far from having the property, for any given parameter ε. If there is a property testing algorithm for Π whose query complexity is independent of the input size parameter n, Π is called testable. Let H be a graph; H-free is the property of containing no copy of H. For any connected graph H, Goldreich and Ron showed that H-free is testable in the bounded degree model. In the adjacency matrix model, it is proved that H-free is testable for any given H.
Research on Decision Stability of Evolution Model of Granular Decision
Computer Science. 2012, 39 (12): 237-240. 
Abstract PDF(322KB) ( 316 )   
RelatedCitation | Metrics
The evolution model of granular decision is a method based on time series data of rough sets in dynamic prediction, which has good effects in dealing with dynamic data. But after the forecast, the next time step t+1 has an actual decision which may differ from the decision we calculated, and how to deal with this conflict mode is not illustrated. Therefore, this paper introduced game theory into the evolution model of granular decision to solve the conflict. The actual decision and the calculated decision at t+1 compose a game matrix, and calculating the decision benefit can judge whether the evolution of granular decision is stable or not at time t+1.
Internal Public Opinions Monitor System Based on Topic Detection and Clustering
Computer Science. 2012, 39 (12): 241-244. 
Abstract PDF(348KB) ( 647 )   
RelatedCitation | Metrics
In order to deal with the slow response rate and high false alarm rate of external public opinion monitoring systems, a novel internal system was proposed based on topic detection and clustering, and its organization model, data structures and working flows were described. A topic center detection method was utilized to extract the current public hotspots. Then a topic clustering and prediction model could estimate their trends to monitor internal crises and so on. Simulation results show that the system can perform better and faster than the traditional system in public opinion alarm.
Research on Sentiment Classification of Collaborative Learning Based on Emotion Words and Sentiment Words
Computer Science. 2012, 39 (12): 245-248. 
Abstract PDF(352KB) ( 396 )   
RelatedCitation | Metrics
Sentiment classification aims to distinguish the sentiment categories expressed by a text, such as positive vs. negative and agree vs. disagree. We used an opinion lexicon, together with a small set of emotion keywords, to conduct sentiment classification with unlabeled data. Specifically, a document-word bipartite graph was built, and then the opinion words and emotion words served as labeled points while the documents were regarded as unlabeled points in the graph. A label propagation algorithm was used to propagate the label information of the words to the documents. Finally, the high-confidence automatically-labeled samples were used as training data for sentiment classification through a collaborative learning method. Experimental results demonstrate that our approach achieves good performance for sentiment classification across multiple domains.
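The propagation step can be sketched on a word-document bipartite graph as follows; this alternating-averaging scheme with clamped seed words is an illustrative variant, not the paper's exact formulation.

```python
from collections import defaultdict

def propagate(edges, seed_labels, n_iters=20):
    """Propagate sentiment scores on a word-document bipartite graph.
    edges: (word, doc) pairs; seed_labels: word -> +1.0 / -1.0 (clamped).
    Returns doc -> score; the sign gives the predicted polarity."""
    word_score = dict(seed_labels)
    doc_score = {}
    for _ in range(n_iters):
        sums, counts = defaultdict(float), defaultdict(int)
        for w, d in edges:                  # documents average their words
            sums[d] += word_score.get(w, 0.0)
            counts[d] += 1
        doc_score = {d: sums[d] / counts[d] for d in sums}
        sums, counts = defaultdict(float), defaultdict(int)
        for w, d in edges:                  # words average their documents
            sums[w] += doc_score[d]
            counts[w] += 1
        for w in sums:
            if w not in seed_labels:        # seed lexicon words stay clamped
                word_score[w] = sums[w] / counts[w]
    return doc_score
```

Documents whose scores end up far from zero are the high-confidence samples that would feed the collaborative-learning stage.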
Weights and Structure Determination of Multi-input Laguerre-orthogonal-polynomial Feed-forward Neural Network
Computer Science. 2012, 39 (12): 249-251. 
Abstract PDF(318KB) ( 363 )   
RelatedCitation | Metrics
In order to remedy the inherent weaknesses of the back-propagation (BP) neural-network model and its learning algorithm, a multi-input Laguerre-orthogonal-polynomial feed-forward neural network (MILOPNN) was constructed, which is based on the theory of polynomial interpolation and approximation. Then, a new kind of weights-and-structure-determination (WASD) algorithm was proposed to determine the optimal weights and structure of the MILOPNN quickly and automatically. Computer simulation and experiment results further substantiate the efficacy of the WASD algorithm, as well as the relatively good approximation and denoising abilities of the MILOPNN model equipped with the WASD algorithm.
Improved STAM Algorithm Based on Differential Method
Computer Science. 2012, 39 (12): 252-254. 
Abstract PDF(308KB) ( 394 )   
RelatedCitation | Metrics
The STAM algorithm is an efficient approach to implementing non-linear function approximation in hardware, in which lookup tables and addition are involved to approximate a non-linear function accurately, and the table size can be reduced remarkably due to the symmetry of the function. Nevertheless, the first coefficient still uses a relatively big table. An improved method based on differentials was brought up to relieve this problem, and with the algorithm the hyperbolic tangent sigmoid function was implemented on an FPGA. Experimental results show that 17% to 30% of memory was saved after the improvement.
Further Discussion on the Definition of Similarity Measures between Vague Sets
Computer Science. 2012, 39 (12): 255-256. 
Abstract PDF(225KB) ( 299 )   
RelatedCitation | Metrics
Some faults of existing similarity measures between vague sets were analyzed, and a new definition of the similarity measures between vague sets was given to attempt to solve those faults. A new formula was put forward for the similarity measures between vague sets. An example shows that the proposed formula is practicable.
Corpus Callosum Segmentation Algorithm of Diffusion Tensor Images
Computer Science. 2012, 39 (12): 257-260. 
Abstract PDF(872KB) ( 337 )   
RelatedCitation | Metrics
A vector-based active contour model algorithm for corpus callosum segmentation on diffusion tensor images was proposed. It utilized the vector-based Chan-Vese model to construct a vector-based signed pressure force function that controls the direction of the evolution. The form of the vector norm was used to describe anisotropy characteristics of corpus callosum on diffusion tensor images. The vector-based active contour model with both the global and the local segmentation property was also introduced into this algorithm. Segmentation results of 10 real diffusion tensor images showed that the proposed algorithm could segment corpus callosum precisely and stably.
Medical Image Encryption Algorithm Based on Multiple Chaos Systems
Computer Science. 2012, 39 (12): 261-263. 
Abstract PDF(330KB) ( 440 )   
RelatedCitation | Metrics
Due to the characteristics of medical images, namely huge data volume and high continuity of same-color pixels, a medical image encryption algorithm based on chaos theory was proposed. The algorithm generates the encryption array by randomly switching among multiple one-dimensional chaos systems, and introduces duplex feedback to the image data while encrypting, which lengthens the sequence period and improves the security of the algorithm. Experiment results show that the algorithm has good running efficiency and encryption effect.
Generalized Cubic DP Curve
Computer Science. 2012, 39 (12): 264-267. 
Abstract PDF(255KB) ( 400 )   
RelatedCitation | Metrics
A set of cubic polynomial functions with two shape parameters was presented as an extension of the cubic DP basis functions. The properties of the new basis were analyzed. The generalized cubic DP curve with two shape parameters was defined using the new basis. The new curve not only retains many properties of the cubic DP curve, but its shape can also be adjusted by modifying the shape parameters. Under some given conditions, two adjacent generalized cubic DP curves can be G^2 or C^2-continuous, which is useful for free curve and surface design.
Face Recognition Based on Residual Space and SVM with Bidirectional Two Dimensions PCA
Computer Science. 2012, 39 (12): 268-271. 
Abstract PDF(554KB) ( 401 )   
RelatedCitation | Metrics
A novel face recognition algorithm was proposed to save the time of sample training and face recognition based on SVM. The new method recognizes faces based on residual space and SVM with bidirectional two-dimensional PCA. To avoid the influence of expression and lighting on face recognition, wavelet transform was first used to process the face images, then the within-class average was applied to calculate the two-dimensional PCA. Furthermore, SVM is used for classification in order to effectively decrease computation time. Experiments on the Yale face data show that the new method can not only improve the recognition rate, but also save recognition time.
Study on Application of Fusion Theory in Gait Recognition
Computer Science. 2012, 39 (12): 272-277. 
Abstract PDF(506KB) ( 435 )   
RelatedCitation | Metrics
Recently, fusion-based gait recognition has become one of the hottest topics in the domain of biometric recognition. This paper first discussed two different levels of fusion methods, feature-level fusion and decision-level fusion, then summarized the latest fusion-based gait recognition methods in three categories: multi-feature fusion, multi-biometric fusion and multi-view fusion. Furthermore, a gait recognition method fusing shape and kinematics features was proposed to verify the effectiveness of fusion. Experimental results on the CMU database demonstrate the feasibility of the proposed algorithm and show that fusion can be an effective strategy to improve recognition performance.
Research on Medical Image Quality Evaluation Method Based on Gradient Direction Information
Computer Science. 2012, 39 (12): 278-280. 
Abstract PDF(270KB) ( 347 )   
RelatedCitation | Metrics
This paper studied evaluation methods of medical image quality to improve their reliability. Medical images are obtained by computer reconstruction using mathematical methods, and during imaging they are inevitably influenced by noise, making images non-uniform or distorted. The traditional medical image quality evaluation method only uses the signal-to-noise ratio to evaluate image quality, so the reliability of its evaluation for distorted images is not high. To evaluate image quality, this paper put forward a gradient-direction based evaluation method of medical image quality. It not only considers the image signal-to-noise ratio, but also combines the correlation of pixels and human visual perception characteristics. It calculates gradient direction information as the evaluation index, and can avoid the problem that the evaluation reliability of the traditional method for distorted images is low. The experiments show that the method can reflect the true image quality of visual perception and has high evaluation reliability.
Comparison and Analysis of Three Types of FFT Adaptive Libraries on Loongson 3A
Computer Science. 2012, 39 (12): 281-285. 
Abstract PDF(420KB) ( 434 )   
RelatedCitation | Metrics
The FFT algorithm has a wide range of applications in computer science. Adaptive FFT software packages, with their excellent portability, have interested many researchers and users. Loongson 3A is developed by the Institute of Computing Technology, Chinese Academy of Sciences. It is a quad-core CPU compatible with MIPS instructions using a RISC architecture. This article focused on three FFT adaptive libraries: FFTW, UHFFT and SPIRAL. Firstly, we compared the differences between FFTW and UHFFT from the two aspects of search framework and code generator. Then we elaborated SPIRAL's three-layer schema, which is used to produce optimized code automatically. Furthermore, we evaluated these libraries on the Loongson 3A platform and analyzed the results. Finally, we concluded the general method of current FFT adaptive software packages and provided a guideline for further development of adaptive FFT software packages.
Research on Low-latency and Low-consumption Scaling-free Algorithm and Architecture
Computer Science. 2012, 39 (12): 286-289. 
Abstract PDF(345KB) ( 388 )   
RelatedCitation | Metrics
The scaling-free CORDIC algorithm is well known for its ability to calculate special functions. Its limited range of convergence and lower speed are important drawbacks, and there is still progress to make, although some new versions have been proposed. Compared with the modified original scaling-free CORDIC algorithm, our proposed algorithm converges over a larger range, owing to an added correction iteration, and moreover accelerates computation through parallelization. After simulation and synthesis, we obtained new architectures which achieve one clock cycle lower latency, a 25.5% reduction in area and a 35% reduction in power consumption compared to the modified original scaling-free architecture.
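For reference, the classic rotation-mode CORDIC (with the usual scale-factor compensation) is sketched below; the paper's scaling-free variant and its correction iteration are not reproduced.

```python
import math

def cordic_sincos(theta, n_iters=24):
    """Rotation-mode CORDIC computing (cos(theta), sin(theta)) with
    shift-and-add micro-rotations; in hardware the 2**-i factors are
    bit shifts. Converges for |theta| <= sum(atan(2**-i)) ~ 1.74 rad."""
    angles = [math.atan(2.0 ** -i) for i in range(n_iters)]
    gain = 1.0
    for i in range(n_iters):
        gain *= 1.0 / math.sqrt(1 + 2.0 ** (-2 * i))  # scale compensation
    x, y, z = 1.0, 0.0, theta
    for i in range(n_iters):
        d = 1.0 if z >= 0 else -1.0       # rotate toward zero residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * gain, y * gain
```

Eliminating the final multiplication by `gain` is precisely what the scaling-free family of algorithms is about.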
New Label Alterable Access Control Policy
Computer Science. 2012, 39 (12): 290-294. 
Abstract PDF(456KB) ( 578 )   
RelatedCitation | Metrics
The security of the Windows operating system, which only provides discretionary access control (DAC) capability, has attracted widespread attention. As an important information security technology, mandatory access control (MAC) can effectively enhance system security, and the design of the access control policy plays a key role in the successful implementation of MAC. In order to ultimately satisfy the needs of security projects on the Windows operating system, combining the advantages of the classical access control models BLP and Biba, a new access control policy which adjusts the security label of subjects based on their credibility was presented, to solve the poor usability caused by the superposition of BLP and Biba. Finally, a prototype system based on process-to-file access shows that the usability and security of the system are improved effectively.
Input-aware Runtime Scheduling Support for Fast Clustering of Radar Reflectivity Data on GPUs
Computer Science. 2012, 39 (12): 295-299. 
Abstract PDF(423KB) ( 381 )   
RelatedCitation | Metrics
As a classic algorithm in data mining, the clustering algorithm is often adopted in the analysis of radar reflectivity data. However, it is time-consuming when facing datasets of large scale and high dimension. Recently, several studies have made efforts in the parallelization or optimization of the clustering algorithm on GPUs. Although these studies have shown promising results, one important factor, program inputs, is ignored in the optimization. We took program inputs into consideration as a factor for optimization of the clustering algorithm on GPUs. By observing the distribution features of the input radar reflectivity data, we found that the ability to adapt to inputs is important for our application to achieve the best performance on GPUs. The results show that our approach can gain a 20% to 40% performance increment compared to previous parallel code on GPUs, which makes it satisfy the requirement of real-time applications well.