Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Volume 39, Issue 1, 2012
Advances in Study of Distributed Mining of Data Streams
Computer Science. 2012, 39 (1): 1-8. 
With advances in communications and hardware technologies, particularly the wide use of small wireless sensor devices, data collection and generation have become more convenient and automated, and organizations and researchers are faced with the ever-growing problem of how to manage and analyze large dynamic datasets. Environments that produce streaming sources of data are becoming commonplace, such as sensor networks, financial data management, network monitoring, Web log analysis and online analysis of communication data. In many application instances, these environments are also equipped with multiple distributed computing nodes that are often located near the data sources. Analyzing and monitoring data in such environments requires data mining technology that is cognizant of the mining task, the distributed nature of the data, and the data influx rate. We reviewed the current situation of the field and identified potential directions of future research.
Multi-modal Tensor Data Mining Algorithms and Applications
Computer Science. 2012, 39 (1): 9-13. 
Multi-modal data mining technologies have attracted much research interest in recent years, and mining large amounts of multi-modal data efficiently has become a hot problem. Among these technologies, multi-modal data mining over tensor representations, also called multi-modal tensor data mining, is one of the most significant research issues. We reviewed the state-of-the-art algorithms of multi-modal tensor data mining and their applications in computer vision. Firstly, multi-modal tensor data mining algorithms were categorized into different classes according to their label information, task and core technology, and some analyses of these algorithms were given. Secondly, some typical multi-modal tensor mining algorithms in computer vision applications were illustrated. Finally, we presented our own analyses of the research status of multi-modal tensor mining algorithms, and explored some potential future issues of multi-modal tensor mining in computer vision applications.
CODAS: An Extensible Static Code Defect Analysis Service
Computer Science. 2012, 39 (1): 14-18. 
Static defect analysis techniques are very useful for detecting defects at an early stage of the software development process, which can improve software quality effectively. Static code defect analysis tools such as FINDBUGS, JLINT, ESC/JAVA, PMD, and COVERITY have been demonstrated to detect plenty of real defects. However, these tools do not provide sufficient usability and effectiveness, which restricts their further application. The insufficient usability lies in two points. The first is that each standalone tool is only good at detecting certain types of defects, which means that developers need to use several tools to get a more comprehensive defect report. The second is that developers need to manually set up, configure, and execute each standalone tool one by one, which is a very time-consuming process. The insufficient effectiveness lies in that the static analysis warnings provided by these tools usually contain many false positives as well as many trivial warnings that are not very important and will not be fixed by developers. To solve these issues, we proposed and implemented an extensible static defect analysis service, the Code Defect Analysis Service (CODAS). Based on a highly extensible architecture, CODAS encapsulates and integrates multiple defect analysis tools seamlessly and provides an effective warning prioritization algorithm, which synthesizes the advantages of the different tools and largely improves their usability and effectiveness.
New Auto Identification Technology on Paper Currency Using Pseudo Binocular Stereo Imaging
Computer Science. 2012, 39 (1): 19-22. 
Current paper money identification techniques are based on the security line, watermark, magnetic ink, fluorescent ink, etc. Along with the development of counterfeiting technology, anti-counterfeit technology needs new approaches. This paper presented a brand-new anti-counterfeit method based on the intaglio character of paper currency: a common flatbed scanner is used to acquire a pseudo binocular stereo image of the currency, and the relief texture is recognized to decide whether the currency is real or fake. Furthermore, we also built a pseudo binocular stereo scanner prototype, which is a better currency detector with high efficiency. With the benefits of this scanner, our method can be applied automatically and is much faster.
Collaborative Filtering Recommendation Algorithm Based on User's Multi-similarity
Computer Science. 2012, 39 (1): 23-26. 
The conventional user-based collaborative filtering algorithm measures the similarity of two users' preferences for all types of items through a single rating similarity. However, daily experience tells us that people usually like different types of objects to different degrees, and a single rating similarity obviously cannot describe this difference accurately. Aiming at this problem, we analyzed the characteristics of the user-based collaborative filtering recommendation algorithm in depth, and proposed a collaborative filtering recommendation algorithm based on users' multi-similarity, which describes the different similarities of two users' preferences for different types of items by computing multiple independent rating similarities, one per item type. The experimental results show that the proposed algorithm, which computes predicted ratings of unrated items on the basis of users' multi-similarity, can effectively improve the accuracy of predicted ratings and enhance the quality of recommendation.
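As an illustrative sketch of the idea (not the paper's exact formulas), the similarity of two users can be computed independently for each item type rather than once globally; here `ratings` is a hypothetical users-by-items matrix with 0 meaning unrated, and `item_type` labels each column:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation over co-rated items (0 means unrated)."""
    mask = (a > 0) & (b > 0)
    if mask.sum() < 2:
        return 0.0
    x, y = a[mask], b[mask]
    denom = x.std() * y.std()
    return 0.0 if denom == 0 else float(((x - x.mean()) * (y - y.mean())).mean() / denom)

def multi_similarity(ratings, item_type, u, v):
    """One independent rating similarity per item type."""
    return {t: pearson(ratings[u, item_type == t], ratings[v, item_type == t])
            for t in np.unique(item_type)}
```

A predicted rating for an unrated item would then use the similarity associated with that item's type instead of a single global value.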
Content-based Video Forensics
Computer Science. 2012, 39 (1): 27-31. 
Content-based video forensics is an effective technique to detect the legality of video content, and it has become a hot topic in the research field of multimedia security. By taking advantage of video fingerprints, this technique can extract a variety of visual features from illegal videos as evidence. In a typical video forensics model, the process of fingerprint extraction consists of shot segmentation, key frame extraction and fingerprint extraction. Thus, this paper analyzed content-based video forensics techniques from these three aspects, and introduced the research development of the related algorithms.
Telecommunications Network Capability Services Model Based on Mashup Technology in Cloud Computing Environment
Computer Science. 2012, 39 (1): 32-36. 
With the gradual integration of the Internet and mobile communications at the service level, the "closed garden" business model used by domestic telecommunications operators has been facing many challenges, so it is necessary to consider how to enhance the user experience by effectively introducing Internet business and service models. Against the background that operators are gradually opening their telecommunication APIs, and based on Mashup-constructed business models, this paper proposed a telecommunications network capability services model for the cloud computing environment. The model packages telecommunications capabilities in the form of Web elements using Mashup technology and provides telecommunications services to users, which further raises the abstraction level of telecommunications network capability services. This model breaks the tradition that telecommunications services are bound to the mobile telephone, and it is a new telecom service model for the Web 2.0 era. The new service model has proved its feasibility in the project "Open Mobile Internet Platform: Research and Development of Algorithms and Functions for an Application Running and Development Engine".
New AQM Algorithm ISE-GPM-PID with Least Square Error Integral
Computer Science. 2012, 39 (1): 37-43. 
With the increase of streaming media applications on the Internet, streaming media traffic is growing rapidly. Streaming media traffic is transmitted over the UDP protocol since it is delay-sensitive and loss-tolerant. However, existing active queue management algorithms based on TCP long flows lack the ability to resist interference from UDP traffic. In this paper, an active queue management algorithm, ISE-GPM-PID, was designed with a least integral-square-error criterion, based on PID control and the TCP/AQM model. The algorithm keeps the phase margin between 30° and 60° and the gain margin between 2 and 5. ISE-GPM-PID is able to resist UDP traffic interference and adapts to Internet streaming media and Web applications. At the same time, the algorithm also has fast response time, small computing cost and good robustness, and can be used in large-delay network environments.
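As a minimal sketch of the control loop (not the tuned ISE-GPM-PID design itself), a discrete PID controller can map the queue-length error to a packet drop probability; the gains `kp`, `ki`, `kd` are placeholders that would come from the margin-constrained tuning against the TCP/AQM model:

```python
class PIDAQM:
    def __init__(self, kp, ki, kd, q_ref, dt):
        self.kp, self.ki, self.kd = kp, ki, kd   # gains from margin-constrained tuning
        self.q_ref, self.dt = q_ref, dt          # target queue length, sample period
        self.integral = 0.0
        self.prev_err = 0.0

    def drop_probability(self, q_len):
        err = q_len - self.q_ref                 # positive when the queue is too long
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        p = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(max(p, 0.0), 1.0)             # clamp to a valid probability
```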
Efficient and Provably Secure Identity-based Proxy Aggregate Signature Scheme
Computer Science. 2012, 39 (1): 44-47. 
Based on the hardness of the computational Diffie-Hellman problem, this paper proposed an identity-based proxy aggregate signature. The scheme not only retains all the security properties of proxy signatures, but also has the merits of aggregate signatures. Finally, the scheme's correctness was proven exactly using the bilinear pairing technique, and its security analysis was given in detail, indicating that the scheme is provably secure and reliable. Therefore, this scheme is secure and efficient.
Nonnegative Matrix Factorization-based IP Traffic Prediction
Computer Science. 2012, 39 (1): 48-52. 
Facing limited satellite bandwidth resources and users' increasing demands, reasonable and efficient bandwidth allocation among users has become a significant issue for broadband multimedia satellite communication systems, and traffic prediction plays an important role in system resource allocation and management. In the proposed process, the IP traffic of multiple users, regarded as training data, was decomposed into a basis matrix and a coding matrix using NMF (Nonnegative Matrix Factorization). Then each row vector of the coding matrix was predicted along the time dimension with an ARIMA model. Finally, the prediction results were combined with the basis matrix to generate the IP traffic prediction for each user. After NMF decomposition, the number of row vectors is smaller than the number of users; as a result, compared with the traditional single-user prediction method, the proposed method reduces computational complexity. Tests indicate the accuracy of this prediction method.
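A sketch of this decompose-predict-recombine pipeline, assuming scikit-learn's NMF and statsmodels' ARIMA (the factorization rank and ARIMA orders below are illustrative, not the paper's):

```python
import numpy as np
from sklearn.decomposition import NMF
from statsmodels.tsa.arima.model import ARIMA

def predict_traffic(X, rank=4, steps=8):
    """X: nonnegative users-by-time traffic matrix; returns users-by-steps forecast."""
    nmf = NMF(n_components=rank, init="nndsvda", max_iter=500)
    W = nmf.fit_transform(X)            # users x rank basis matrix
    H = nmf.components_                 # rank x time coding matrix
    H_fc = np.vstack([ARIMA(h, order=(2, 1, 1)).fit().forecast(steps)
                      for h in H])      # forecast each coding row over time
    return W @ H_fc                     # recombine into per-user predictions
```

Because only `rank` coding rows are forecast instead of one series per user, the ARIMA work shrinks accordingly, which is the complexity saving the abstract describes.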
Joint Routing and Channel Assignment Strategy Based on CSM Priorities in CogWMN
Computer Science. 2012, 39 (1): 53-56. 
Cognitive Wireless Mesh Networks (CogWMN) can change their communication frequency actively, but when routes are sought and channels scheduled, channel allocation is commonly unbalanced. In order to increase the utilization of unlicensed bands and the multichannel throughput, a joint routing and channel assignment strategy based on a prioritized channel statistics metric (CSM), requiring no common control channel, was proposed. This scheme can handle the unbalanced channel allocation problem, and the mesh nodes in the network can reach the AP in fewer hops. The simulation results show that the proposed strategy can increase throughput performance and reduce network time delay.
Research on Real-time SOA towards Distributed Control Systems
Computer Science. 2012, 39 (1): 57-60. 
SOA (Service Oriented Architecture) has been widely used in traditional information systems, but how to construct a distributed control system, which is widespread in our society, with the novel idea of SOA is a new issue in the service computing research area. Considering the characteristics of distributed control systems and SOA, we proposed a layered real-time SOA model that can improve message transport reliability and real-time service processing, and discussed its detailed mechanisms for the message model, message transportation and service processing. Based on this model, we developed a multi-net redundant real-time message bus prototype system for power distribution automation and reported our results.
Research on Information Threat Assessment Model Based on AHP
Computer Science. 2012, 39 (1): 61-64. 
Measures deployed according to assessment results can reduce the negative impacts of the information system threats that are assessed and predicted during informatization development. This paper proposed an information security threat assessment model based on AHP, which combines AHP with fuzzy comprehensive evaluation, establishes threat assessment indicators, and obtains the assessment result through asset identification, threat identification and threat analysis. The experimental results verify the feasibility and validity of the model.
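As a sketch of the standard AHP step such a model relies on (the paper's concrete indicator matrix is not reproduced here), indicator weights come from the principal eigenvector of a pairwise comparison matrix:

```python
import numpy as np

def ahp_weights(A):
    """A: n x n positive reciprocal pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                      # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                  # normalized weights
    ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
    return w, ci

A = np.array([[1, 3, 5], [1 / 3, 1, 2], [1 / 5, 1 / 2, 1]])
weights, ci = ahp_weights(A)   # ci near 0 indicates acceptable consistency
```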
Pipeline Based Skein Algorithm Design and Implementation
Computer Science. 2012, 39 (1): 65-68. 
The evaluation of speed and performance in hardware is very important in the SHA-3 competition. For one of the final-round candidates, the Skein algorithm, the 4-unrolled structure has a short critical path while the 8-unrolled structure uses fewer multiplexers. Combining the advantages of the two structures, we proposed a two-stage pipeline design and implemented it on a Xilinx Virtex-5. The experimental simulation shows that this approach can greatly increase the throughput of the Skein algorithm.
Hybrid and Lightweight Cryptography for Wireless Sensor Network
Computer Science. 2012, 39 (1): 69-72. 
Combining the identity-based system with the idea of liteCA (lightweight Certificate Authority), an identity-based and lightweight-CA-based hybrid cryptography scheme for wireless sensor networks was proposed. The scheme not only avoids the private key escrow drawback of identity-based systems, but also simplifies the complication of producing and verifying public keys in traditional certificate-based CA systems. Analysis shows that the scheme features easy production and verification of public keys, certificatelessness, and high security against attacks easily mounted in wireless environments. It is applicable to wireless networks for data confidentiality, integrity, and non-repudiation.
Energy Cost Based Energy Optimized Routing Algorithm in WSN
Computer Science. 2012, 39 (1): 73-76. 
Aiming at the energy consumption issue of sensor nodes in routing algorithms for wireless sensor networks, an energy-cost-based energy-optimized routing algorithm was proposed. Based on comprehensive consideration of both the efficiency and the balance of sensor nodes' energy consumption for data transmission, a new energy cost function was designed, which achieves an optimal match between energy efficiency and energy balance. Each sensor node computes the energy cost of its forward neighbors using this function and selects the node with the minimum energy cost as its next hop. In the algorithm, the routing decision is made only from the information of neighbor nodes, with low computational time complexity. Finally, the performance of the proposed routing algorithm was simulated and compared with typical routing algorithms. Simulation results show that the algorithm can extend the network lifetime effectively, and economize and balance the energy consumption of sensor nodes.
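The abstract does not give the cost function itself, so the following is a hypothetical form of the idea: trade transmission cost off against the neighbor's residual energy, then pick the cheapest forward neighbor (the weighting `alpha` is an assumption):

```python
def energy_cost(tx_energy, residual, initial, alpha=0.5):
    """Lower is better: a cheap link to a well-charged neighbor."""
    efficiency = tx_energy                   # energy to reach this neighbor
    balance = 1.0 - residual / initial       # penalize drained nodes
    return alpha * efficiency + (1 - alpha) * balance

def next_hop(forward_neighbors):
    """forward_neighbors: iterable of (node_id, tx_energy, residual, initial)."""
    return min(forward_neighbors,
               key=lambda n: energy_cost(n[1], n[2], n[3]))[0]
```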
Research on Collaborative Selection and Handoff Mechanism for Networks and Terminals
Computer Science. 2012, 39 (1): 77-81. 
This paper proposed a novel intelligent handoff mechanism to achieve cooperative selection and handoff functions of networks and terminals in heterogeneous networks. With the development of various services and applications, heterogeneous network convergence is the inevitable trend for information and communication technologies. Considering the requirements of multi-access and multi-terminal environments, this paper used AHP and URA methods to select the best destination network and the optimal terminal. The paper also put forward a terminal selection and handoff signaling flow to implement individual mobility management toward the ultimate goal of a user-centric smart space, comprising three types of handoff: network vertical handoff, terminal handoff, and joint network and terminal handoff. Simulations illustrate that the intelligent handoff can effectively achieve network/terminal selection and guarantee the QoS performance of services and applications in heterogeneous networks for the modern service industry.
Optimal Window Based Backoff Algorithm for IEEE 802.11 WLANs
Computer Science. 2012, 39 (1): 82-84. 
It was first verified, by formulating resource allocation as a utility-maximization problem, that every node in a WLAN should use the same contention window (CW) value; the relation between the optimal CW value and the number of nodes was then obtained by maximizing the total network utility under the constraint of minimizing collision probability. A new retransmission algorithm using an optimal shared CW was proposed. Because all nodes share the same CW, the proposed algorithm can effectively overcome the unfairness of the BEB algorithm and improve throughput. Simulation results validate our conclusions.
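The key point is that a fixed window shared by all nodes, sized to the number of contenders n, replaces binary exponential doubling. A rough sketch of that rule (the proportionality constant `c` is an illustrative assumption; the paper derives the exact relation from the utility-maximization problem):

```python
import random

def optimal_cw(n, c=6.0):
    """Shared optimal contention window; c depends on slot and collision durations."""
    return max(2, round(c * n))

def backoff_slots(n, rng=None):
    rng = rng or random.Random()
    return rng.randrange(optimal_cw(n))   # same window for every node, no doubling
```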
Benefit-tree Clustering Algorithm for Base Station Cooperation
Computer Science. 2012, 39 (1): 85-88. 
Aiming at maximizing the system sum-rate, a benefit-tree clustering algorithm based on maximizing the Cooperative Degree (CD) was proposed, in which the clustering problem in a base station (BS) cooperation system is modeled as constructing benefit trees of a connected graph with edge costs. The CD between every two benefit trees was defined and, based on it, the algorithm simultaneously generates clusters of dynamic size by combining the two trees with the maximum CD, which solves the capacity-limited problem caused by conventional clustering schemes and enhances system capacity. Results show that this algorithm outperforms existing clustering schemes in system spectrum efficiency, and its computational complexity is linear.
Intrusion Detection System Based on Feature Selection and Feature Weighting
Computer Science. 2012, 39 (1): 89-91. 
Network intrusion means are diversified, and traditional detection systems cannot extract features very well: packets are easily lost, and the rates of missed and false alarms are high. In order to improve the detection rate, this paper proposed an intrusion detection algorithm based on weighted feature selection. Firstly, features were extracted from network packets; then a support vector machine (SVM) was used to select features based on cross-validation and to calculate the feature weights; lastly, the intrusion detection model was built on the weighted retained features. The results of simulation experiments show that the proposed algorithm improves the intrusion detection rate and is an effective network intrusion detection method.
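A sketch of this pipeline using scikit-learn (a linear SVM stands in for whatever kernel the paper uses; the number of retained features is an assumption):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

def select_and_weight(X, y, keep=20):
    svm = LinearSVC(dual=False).fit(X, y)
    w = np.abs(svm.coef_).mean(axis=0)      # per-feature importance weights
    top = np.argsort(w)[::-1][:keep]        # retained features
    Xw = X[:, top] * w[top]                 # weighted retained features
    score = cross_val_score(LinearSVC(dual=False), Xw, y, cv=5).mean()
    return top, Xw, score
```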
Web Service Matching Algorithm Based on Semantic Similarity
Computer Science. 2012, 39 (1): 92-95. 
With the rapid growth of Web applications, how to discover the desired services for users efficiently has become a significant challenge. A novel approach for service matching based on semantic similarity was proposed, which employs a hierarchical ontology to compute the semantic similarity of concepts from two compared services. The maximum-weight matching algorithm was improved to suit the Web service setting, and the method was implemented in a service discovery prototype. The experiments illustrate that our approach not only enhances the recall rate, but also meets the needs of current service discovery.
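To make the matching step concrete, here is a sketch assuming a concept-similarity function `sim` derived from the hierarchical ontology; SciPy's Hungarian solver finds the maximum-weight assignment between query and service parameters:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_score(query_concepts, service_concepts, sim):
    """sim(a, b) -> semantic similarity in [0, 1] from the ontology."""
    S = np.array([[sim(q, s) for s in service_concepts]
                  for q in query_concepts])
    rows, cols = linear_sum_assignment(S, maximize=True)  # max-weight matching
    return S[rows, cols].sum() / len(query_concepts)      # normalized score
```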
On-demand Service Discovery Method of Networked Software
Computer Science. 2012, 39 (1): 96-100. 
Given the diverse, individual and vague user requirements of networked software, it is very important to find the service resources that best meet users' requirements with high quality of service, in order to construct networked software and improve users' quality of experience. This paper proposed an on-demand service discovery method for networked software. First, within a specific domain, specific requirement instances and service resources are annotated with a domain ontology. Then, a semantic matching model consisting of basic, functional and QoS matching phases is built; the three phases proceed step by step under predetermined-threshold judgments. Finally, the service resources that best meet users' requirements are found to construct the networked software. An experiment for performance comparison was carried out in the logistics field, and the results indicated that this method can improve the matching precision between requirements and services of networked software and select the best service resources for construction.
Approach for Trust Analysis of Software Dynamic Behavior Based on Noninterference
Computer Science. 2012, 39 (1): 101-103. 
Software Dynamic Behavior Measurement (SDBM) is one of the core issues that must be solved by trusted computing. Tackling this issue takes two main steps: one is to model the dynamic behavior of software; the other is to deduce the trustworthiness of the modeled behavior. A noninterference-based approach focusing on the second step was presented, and the decision theorems for behavior trust analysis were given as well.
Research on Concurrent and Distributed Mechanism of Apla Language
Computer Science. 2012, 39 (1): 104-108. 
From the viewpoint of concurrent and distributed programming, several concurrent and distributed programming languages were analyzed and compared, among them Orc, a novel structured distributed and concurrent language designed and implemented by Professor Jayadev Misra. After analyzing its fundamental principles and language characteristics, a new concurrent and distributed mechanism for Apla (Abstract Programming LAnguage) was designed in this paper. The mechanism includes the concurrent operator, concurrent statements, and the definition, communication and synchronization of processes. The feasibility and practicability of the mechanism were illustrated with a representative example. Finally, the advantages of the concurrent and distributed mechanism, such as generality, simplicity, abstraction and ease of writing, were presented.
Calculational Bigraphical Model of Context-aware Systems in Ubiquitous Computing Environment
Computer Science. 2012, 39 (1): 109-114. 
In the ubiquitous computing environment, formally modeling context-aware systems is an important research topic. Firstly, we discussed the applicability and deficiencies of an extended bigraphical model of context-aware systems called the Polato Graphical Model. On the basis of this analysis, we presented a calculational bigraphical model for describing context-aware systems and compared it with the Polato Graphical Model through an example.
Researches on Basic Criterion and Strategy of Constructing Metamorphic Relations
Computer Science. 2012, 39 (1): 115-119. 
Metamorphic testing can alleviate the oracle problem in software testing, but constructing metamorphic relations for this method is a difficult problem that directly affects the testing result. This paper analyzed the construction of effective metamorphic relations through typical case studies and summarized several criteria for selecting useful relations; the rationality of these criteria was verified through case studies with mutation analysis. The paper also proposed a testing strategy combining metamorphic testing and equivalence testing, which can be used to test programs whose input spaces are easy to classify.
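A minimal example of the kind of relation being selected, for a sine routine whose exact outputs have no convenient oracle: sin(x) must equal sin(π − x), so a follow-up execution checks the original one without knowing the expected value:

```python
import math

def check_sine_mr(f, x, tol=1e-9):
    """Metamorphic relation for sine: f(x) == f(pi - x)."""
    return abs(f(x) - f(math.pi - x)) < tol

assert check_sine_mr(math.sin, 0.7)   # holds for a correct implementation
```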
Validation of Web Service Composition Based on Probabilistic Model Checking
Computer Science. 2012, 39 (1): 120-123. 
Validation of Web service composition is very important for improving the efficiency of software development. This paper presented a method based on probabilistic model checking to validate service compositions. We used an extended finite automaton to represent the service composition and then converted it into a Markov model; the probabilistic model checker PRISM was used to validate the effectiveness of the composition. Finally, we gave an example to illustrate the feasibility of this method.
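As a sketch of what such a check computes underneath (PRISM's own modeling language is not shown here), the probability of ever reaching a target state in a discrete-time Markov chain solves a linear system over the transient states, assuming the target is the only absorbing state:

```python
import numpy as np

def reach_probability(P, target):
    """P: n x n transition matrix; returns per-state probability of reaching target."""
    n = len(P)
    trans = [s for s in range(n) if s != target]
    A = np.eye(len(trans)) - P[np.ix_(trans, trans)]   # I - Q over transient states
    b = P[np.ix_(trans, [target])].ravel()             # one-step hit probabilities
    probs = np.zeros(n)
    probs[target] = 1.0
    probs[trans] = np.linalg.solve(A, b)
    return probs
```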
Formal Description Approach for Pointcut Designator at Software Architecture Level
Computer Science. 2012, 39 (1): 124-129. 
The pointcut designator (PCD) at the software architecture level is a foundation for realizing the quantification mechanism and describing aspect weaving in aspect-oriented software architecture. Some Aspect-Oriented Architecture Description Languages (AOADLs) introduce syntax elements for PCDs, but no formal description of PCD semantics is given, so it is difficult to describe injection locations accurately at the software architecture level. To address this problem, this paper proposed a first-order Logic Language for PCD (LL4PCD) based on the abstract syntax tree form of AC2-ADL, a kind of AOADL, and further proposed a formal description method for the PCD in the AC2-ADL language on the basis of LL4PCD. This method can precisely define the semantics of PCDs and supports the formal analysis of aspect weaving at the software architecture level.
Approach to Modeling Software Requirements Based on Feature Combination
Computer Science. 2012, 39 (1): 130-133. 
Software requirement modeling has a great effect on software requirements engineering. This paper offered a new theory of software requirements modeling based on the combination of software features, to improve the efficiency and quality of the software requirements modeling process. First, software features were classified into functional and non-functional features. Secondly, we gave a formal definition of all the feature components among the functional features, including atomic and composite functional features. Thirdly, to support the formalization process, the feature combination process of functional features was abstracted into feature operations, and we proposed 23 operation axioms. Then a formal definition of non-functional features was brought up, and a scope analysis was built to integrate functional and non-functional features into the ultimate software requirement model. At the end of this paper, a detailed software requirement modeling process was proposed based on the concept of software feature combination, which is also the innovation of this paper.
Study of Fast Parallel Clustering Partition Algorithm for Large Data Sets
Computer Science. 2012, 39 (1): 134-137. 
With the rapid increase of the data volumes processed by clustering algorithms, the traditional K-Means clustering algorithm faces a huge challenge on large data sets. In order to improve its efficiency, this paper proposed improvement ideas and implementations concerning cluster center initialization and communication mode, for both a parallel clustering algorithm based on MPI and a distributed clustering algorithm based on Hadoop in the cloud. The results show that the studied algorithms can largely reduce communication and computation and achieve higher efficiency. These research results will help us design better and faster parallel clustering partition algorithms for large data sets in the future.
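A sketch of the data-parallel step both variants share, with Python's multiprocessing standing in for MPI ranks or Hadoop tasks; only per-cluster partial sums and counts cross process boundaries, which is exactly the communication the paper tries to reduce:

```python
import numpy as np
from multiprocessing import Pool

def partial_stats(args):
    chunk, centers = args
    labels = np.argmin(((chunk[:, None] - centers) ** 2).sum(-1), axis=1)
    sums = np.zeros_like(centers)
    counts = np.zeros(len(centers))
    for x, l in zip(chunk, labels):
        sums[l] += x
        counts[l] += 1
    return sums, counts

def kmeans_step(X, centers, workers=4):
    """One update step; call under `if __name__ == "__main__":` on some platforms."""
    with Pool(workers) as pool:
        parts = pool.map(partial_stats,
                         [(c, centers) for c in np.array_split(X, workers)])
    sums = sum(p[0] for p in parts)
    counts = sum(p[1] for p in parts)
    return sums / np.maximum(counts, 1)[:, None]   # new centers
```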
News Topic Detection Approach on Chinese Microblog
Computer Science. 2012, 39 (1): 138-141. 
The popularity of microblogging has brought another form of social news media. This paper proposed an approach for mining news topics from microblogs: news topics are formed by finding keywords that emerge in large numbers and clustering them. To extract news keywords, a compound weight combining word frequency and growth was introduced to measure the likelihood of a word being a news keyword; to construct the topic, a contextual relevance model was used to support incremental clustering, which suits the problem better than semantic similarity. Experiments on real-world microblog data show the effectiveness of the approach in detecting news topics out of massive messages.
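The abstract does not give the compound weight's exact form, so the following is a hypothetical version of it: combine a word's current frequency with its growth over the previous time window, then keep the top-scoring words as news-keyword candidates:

```python
def compound_weight(freq_now, freq_prev, lam=0.5, eps=1.0):
    growth = (freq_now - freq_prev) / (freq_prev + eps)   # smoothed growth rate
    return lam * freq_now + (1 - lam) * growth

def news_keywords(counts_now, counts_prev, k=20):
    scored = {w: compound_weight(c, counts_prev.get(w, 0))
              for w, c in counts_now.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]
```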
Research on Object-level Information Retrieval over Relational Databases
Computer Science. 2012, 39 (1): 142-147. 
Traditionally, relational DataBase Information Retrieval (DBIR) treats tuple-level relational data as the retrieval objects and join trees of tuples as the retrieval results; however, its retrieval effect is still not ideal. To improve it, research on Object-level relational DataBase Information Retrieval (DBOIR) was proposed, and its background and the state of the art were described. Furthermore, the main ideas and research directions of DBOIR were discussed. DBOIR provides a promising direction for future DBIR work.
Simulation Study on Improved Index Technology for XML Data
Computer Science. 2012, 39 (1): 148-151. 
The XISS index is currently the typical representative index supporting regular path expressions over XML data. However, XISS produces large intermediate results for long path-expression queries, so the cost of join operations is very high, which increases query complexity and hurts query efficiency. In order to improve XML data query efficiency, this article put forward an improved XISS index. The improved index first introduces DTD schema information to improve the coding method, and then improves the node index structure to reduce intermediate joins, making query time independent of path length. A comparative experiment between the original and improved XISS indexes shows that the improved XISS index decreases indexing time, accelerates query response and improves XML data query results.
Data Stream Concept Drift Detection Method Based on Mixture Ensemble Method
Computer Science. 2012, 39 (1): 152-155. 
Mining data streams with concept drift is a hot topic in data mining. Existing classification approaches include ensemble methods based on a single type of base classifier and ensemble methods based on hybrid base classifiers, both depending on the stationarity and learnability assumptions. However, the former probably causes larger classification deviation and its accuracy suffers on noisy data streams, while the latter performs worse in classification accuracy or time consumption. Motivated by this, an ensemble classification method, WE-DTB, based on hybrid base models with decision trees and Naive Bayes, was proposed as an extended framework of the WE model. Meanwhile, we utilized popular concept drift detection mechanisms based on Hoeffding bounds and the μ test to detect concept drifts. Extensive experiments demonstrate that the proposed WE-DTB can detect concept drift effectively while maintaining good performance in classification accuracy and in time and space consumption.
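A sketch of a Hoeffding-bound drift test in the spirit of the mechanism the paper builds on: drift is flagged when accuracy on a recent window falls below the historical accuracy by more than the bound allows (`delta` is the usual confidence parameter):

```python
import math

def hoeffding_bound(n, delta=0.05):
    """Deviation allowed for a mean of n bounded observations, confidence 1 - delta."""
    return math.sqrt(math.log(1.0 / delta) / (2.0 * n))

def drift_detected(acc_hist, acc_recent, n_recent, delta=0.05):
    return (acc_hist - acc_recent) > hoeffding_bound(n_recent, delta)
```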
Algorithm for Generating Decision Tree Based on Incomplete Information Systems
Computer Science. 2012, 39 (1): 156-158. 
Decision trees are an effective data mining method for case classification. When processing objects with missing values in incomplete information systems, most existing decision tree algorithms resort to guessing technologies. In this paper, we defined a condition attribute's decision support degree with respect to the decision attribute using the concept of a maximal consistent block, which can serve as the heuristic information. Moreover, we proposed an algorithm, called IDTBDS, for generating a decision tree from an incomplete information system. The proposed algorithm not only extracts rule sets quickly, but the resulting rules also achieve higher classification accuracy.
Control Signal Solving Model and Algorithm of Dynamic System Based on Process Neural Network
Computer Science. 2012, 39 (1): 159-161. 
Aiming at the control problem of nonlinear dynamic systems, a control signal solving model and algorithm based on the Process Neural Network (PNN) was proposed. First, a forward identification model of the system was set up using the nonlinear transformation mechanism and the self-adaptive learning ability of PNNs for time-varying input-output signals of dynamic systems. Then, according to the established model, the system control structure and the expected output signals, a control signal solving model and algorithm satisfying the system's dynamic signal transformation mechanism and transfer constraint relations was constructed. The information processing mechanism of the PNN-based control model was analyzed, and a control signal optimization method based on a GA coupled with LMS was given. The experimental results verify the feasibility of the model and algorithm.
Research of Multi-layer Cloud Modeling Method
Computer Science. 2012, 39 (1): 162-166. 
The paper studied many modeling methods for complex large systems and analyzed their advantages and disadvantages. It then set out its research purpose: a new complex large-system modeling method suited to China's administrative hierarchy, able to effectively handle the uncertainty of system information, especially the coexistence of fuzziness and randomness. Toward this target, the paper discussed a multi-layer state space model, described the specification of its latitude model and, combined with the principle of the normal cloud generator, proposed a "decomposition-collection" type of uncertainty intelligent modeling method: the multi-layer cloud modeling method. Finally, the method was used to analyze China's energy system and obtained positive results, which shows that the method is simple and effective.
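The normal cloud generator the method combines is a standard construction: each drop is drawn with expectation Ex, entropy En and hyper-entropy He, so fuzziness and randomness coexist in one sample:

```python
import math
import random

def normal_cloud(Ex, En, He, n=1000):
    """Generate n cloud drops as (value, certainty degree) pairs."""
    drops = []
    for _ in range(n):
        En_i = random.gauss(En, He) or 1e-12   # second-order randomness
        x = random.gauss(Ex, abs(En_i))        # the drop itself
        mu = math.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))
        drops.append((x, mu))
    return drops
```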
A Kind of Pattern Matching Method of Mass Rules
Computer Science. 2012, 39 (1): 167-169. 
Based on the requirements of large-scale rule information processing, a pattern matching method for mass rules was proposed. The steps of mass-rule pattern matching algorithms and the matching processing methods for all kinds of rule nodes were studied in order to improve processing efficiency, and the characteristics of mass-rule pattern matching methods were summarized. The proposed method extends existing mass-rule pattern matching processing patterns with a new processing method, and comparative results show that it performs well.
Survey of Recent Trends in Local Support Vector Machine
Computer Science. 2012, 39 (1): 170-174. 
The Support Vector Machine (SVM) is an important and widely used classifier. To classify a sample, all training data are used to obtain a hyperplane that determines the sample's label; that is, the SVM works in a global manner. However, this global behavior does not imply consistency. The design of the Local SVM (LSVM) follows the result that "consistency implies local behavior". In this paper, we first reviewed the main idea of LSVM and the improvements on it. We then presented an LSVM algorithm based on cooperative clustering, which reduces the time complexity of LSVM on large-scale datasets, and ended the article with conclusions.
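A sketch of the clustering-based LSVM idea with scikit-learn (plain k-means stands in for the paper's cooperative clustering): train one SVM per cluster and classify a query with the SVM of its nearest cluster center. It assumes every cluster contains samples from more than one class:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

class ClusteredLocalSVM:
    def fit(self, X, y, k=8):
        self.km = KMeans(n_clusters=k, n_init=10).fit(X)
        self.svms = {c: SVC(kernel="rbf").fit(X[self.km.labels_ == c],
                                              y[self.km.labels_ == c])
                     for c in range(k)}          # one local SVM per cluster
        return self

    def predict(self, X):
        cells = self.km.predict(X)               # nearest cluster per query
        return np.array([self.svms[c].predict(x.reshape(1, -1))[0]
                         for c, x in zip(cells, X)])
```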
Clustering Feature Selection Method Based on Neighborhood Distance
Computer Science. 2012, 39 (1): 175-177. 
To overcome the poor clustering results and high time consumption of existing clustering algorithms on high-dimensional data, we provided an unsupervised feature selection algorithm based on neighborhood distance and then clustered again on the selected feature subset; using the selected subset can improve clustering accuracy. The experimental results show that the method can find the valid features and also alleviate the time-consumption problem of clustering high-dimensional data.
Scheduling Algorithm of α-Planarization for Solving the Problem of Multiprocessor Scheduling
Computer Science. 2012, 39 (1): 178-181. 
In this paper, the concept of α-flatness was proposed based on an analysis of the multiprocessor scheduling problem, and it was then introduced into that problem. A new algorithm based on α-flatness was proposed to solve the multiprocessor scheduling problem: the job set is flattened first, then the new problem obtained in the first step is solved, and finally an approximate solution to the original scheduling problem is obtained. The experimental results show that the solution obtained by this algorithm is good and, compared with heuristic algorithms, more stable.
Density Weighted Proximal Support Vector Machine
Computer Science. 2012, 39 (1): 182-184. 
In the standard proximal support vector machine (PSVM), a regularized least squares problem replaces the quadratic programming problem, so the PSVM has an analytic solution and the training time is reduced. However, the standard PSVM disregards the imbalance between positive-class and negative-class data and assigns the same penalty factor to all training samples, while in practical problems the distribution of positive and negative classes is unbalanced. Aiming at this problem, a density-weighted proximal support vector machine (DPSVM), a modified PSVM algorithm, was presented. First the density information of the data is calculated; then different penalty factors are assigned to training samples according to their density, so the penalty values in the original PSVM problem become a diagonal matrix. This method was evaluated on UCI datasets and compared with the SVM and PSVM methods. The experimental results indicate that the density-weighted PSVM achieves better classification performance on datasets with unbalanced positive and negative samples.
Bayesian Network Classifier Based on L1 Regularization
Computer Science. 2012, 39 (1): 185-189. 
Variable-order-based Bayesian network classifiers ignore the information between the selected variables in the ordering and the class label, which significantly hurts classification accuracy. To address this problem, we proposed a simple and efficient L1-regularized Bayesian network classifier (L1-BNC). By adjusting the Lasso constraint value and making full use of the information in the regression residuals, L1-BNC takes the relation between the ordering of selected variables and the class label into account, generating an excellent variable ordering (the L1 regularization path) for constructing a good Bayesian network classifier with the K2 algorithm. Experimental results show that L1-BNC outperforms existing state-of-the-art Bayesian network classifiers, and is also superior to the SVM, KNN and J48 classification algorithms on most datasets.
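A sketch of the ordering step with scikit-learn: variables are ranked by how early they enter the Lasso regularization path against the class label, and that ordering would then be handed to the K2 structure-learning algorithm:

```python
import numpy as np
from sklearn.linear_model import lasso_path

def l1_variable_order(X, y):
    alphas, coefs, _ = lasso_path(X, y)                 # coefs: features x alphas
    active = np.abs(coefs) > 1e-12
    first_active = active.argmax(axis=1)                # index where a variable enters
    first_active[~active.any(axis=1)] = coefs.shape[1]  # never-active variables go last
    return np.argsort(first_active)                     # earlier entry, earlier in order
```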
Evaluation Method Based on Human Trust Mechanism for Mobile E-commerce Trust
Computer Science. 2012, 39 (1): 190-192. 
According to the characteristics of mobile e-commerce and an analysis of the factors influencing customers' trust, an improved prediction model for trust in mobile e-commerce was built. Then, considering human cognitive habits, a new trust evaluation index system for mobile e-commerce was constructed. Finally, an improved prediction model for mobile e-commerce transaction trust based on an improved grey model was proposed, and its rationality and effectiveness were verified.
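As a sketch of the grey-prediction core (the paper uses an improved variant, not this plain GM(1,1)): fit a first-order model to the accumulated series, then de-accumulate the fitted curve to forecast:

```python
import numpy as np

def gm11_forecast(series, steps=1):
    x = np.asarray(series, dtype=float)
    x1 = np.cumsum(x)                                 # accumulated series
    z = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]   # develop coefficient, grey input
    k = np.arange(1, len(x) + steps)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a  # fitted accumulated curve
    x_hat = np.diff(np.concatenate([[x[0]], x1_hat])) # de-accumulate
    return x_hat[-steps:]                             # forecast values
```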
Reduction of Attribute Values Based on Interval-valued Formal Decision Contexts
Computer Science. 2012, 39 (1): 193-197. 
The theory of concept lattices is an efficient tool for knowledge discovery and has been applied in many fields. Considering the information uncertainty of the real world, this paper defined the interval-valued formal decision context. On this basis, the consistency of the interval-valued formal decision context was studied through the relation between the condition context and the decision context; furthermore, the reduction of attribute value vectors was studied, which makes the number of attributes and the interval values of attributes simpler than before.
Research on Method of Generalized Decision Rule Acquisition Based on GrC in Inconsistent Decision Systems
Computer Science. 2012, 39 (1): 198-202. 
Because of various subjective and objective factors such as noise in data, inconsistent data has occurred more and more frequently in recent years, which requires methods and techniques that can directly analyze and handle such data. This paper studied generalized decision rule acquisition in inconsistent decision systems. It first analyzed the basic principle of decision rule acquisition based on granular computing (GrC), and then gave a general method for computing all minimal generalized decision rule sets. This method needs no discernibility matrix and can be executed concurrently, so it has relatively low space cost and relatively high efficiency; it can also easily be extended to compute all minimal generalized decision rule sets of other types. This provides a general method of rule acquisition in inconsistent decision systems.
Question Recommendation Technology Integrating Sentence Structure and Semantic Similarity
Computer Science. 2012, 39 (1): 203-206. 
A new application of large-scale Question-Answer (QA) pair resources was proposed in this paper. A large-scale QA pair library based on the BaiDu ZhiDao platform was first constructed and integrated into a QA system; then the question with the highest similarity in the library is recommended to the user via similarity calculation. In the experiments, we downloaded 10500 Web pages and successfully extracted 4687 QA pairs. Results of experimental applications using keyword TF/IDF, tree-kernel syntax matching, and sentence semantic distance together were given to illustrate the proposed technique: the accuracy reached 79.44%, 81.67% and 88.33% when using one, two or three of the above methods, respectively. The experimental results show that synthetically combining more similarity methods achieves better effects.
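A sketch of the first similarity channel (keyword TF/IDF) with scikit-learn; the paper fuses this with tree-kernel syntax matching and sentence semantic distance:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def recommend(question, qa_questions):
    """Return the index of the stored question most similar to `question`."""
    vec = TfidfVectorizer()
    M = vec.fit_transform(qa_questions + [question])
    sims = cosine_similarity(M[-1], M[:-1]).ravel()
    return int(sims.argmax())
```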
Similar Transitive Binary Relation-based Generalized Rough Sets and their Axiomatization
Computer Science. 2012, 39 (1): 207-209. 
This paper presented the definitions of positively similar-transitive and negatively similar-transitive binary relations, considered the generalized rough sets based on these two kinds of binary relations, analyzed their respective properties, and gave the corresponding axiomatic characterizations. Finally, we investigated the correlations between these two kinds of relation-based generalized rough sets and those based on other relations, and reached some important conclusions.
Trust Evaluation Model Based on Set Pair Analysis and its Application in Service Selection
Computer Science. 2012, 39 (1): 210-214. 
The large-scale, heterogeneous, dynamic, distributed and autonomous nature of networks results in uncertainty and deception in resources and services, so an effective trust mechanism should be built to ensure transaction safety. However, current trust evaluation approaches based solely on fuzzy logic and feedback are inaccurate and ineffective. This paper presented a method based on set pair analysis theory to overcome the shortcoming of fuzzy mathematics that a fuzzy concept must be denoted by a precise, strict membership function. The model considers not only feedback information but also the service capacity that resources can achieve. The authors proposed two service selection algorithms based on the trust model, and simulations prove that the algorithms can efficiently increase the success rate of task execution.
Multi-objective Optimization Design of Complex Production Processing Based on Genetic Algorithm and Neural Network
Computer Science. 2012, 39 (1): 215-218. 
In solving complex nonlinear multi-objective optimization problems, it is very difficult to obtain the nonlinear structural model beforehand, and the number of parameters to consider keeps growing. Conventional modeling methods and optimization models have many shortcomings and can hardly solve today's complicated practical engineering problems. The artificial neural network provides a novel approach for modeling complex nonlinear systems: trained neural network response surfaces can serve either as objective functions or as constraint conditions and, together with other conventional constraints, a system model can be set up and optimized by a genetic algorithm. This separates design analysis modeling from optimization searching. Through an example of a production process optimization problem in a chemical enterprise, a model relating process parameters to performance targets was constructed on a Backward Propagation neural network response surface, and the optimal process parameters and sample data were obtained by the genetic algorithm. The experimental results illustrate that the proposed method can obtain multi-objective optimization models with high accuracy, thus greatly raising the efficiency of the optimization process.
Decentralized Multi-Agent Based Cooperative Path Planning for Multi-UAVs
Computer Science. 2012, 39 (1): 219-222. 
Coordination of Multiple Unmanned Aerial Vehicles (UAVs) is a popular topic in distributed artificial intelligence. One of the key challenges is real-time path planning for multiple UAVs according to their dynamic targets, threats and terrain changes in a complex environment, as well as the constraints of the UAVs themselves. This paper proposed a novel approach to this challenge. Our model is based on a multi-agent model with decentralized control over UAV teams. By modeling the environmental constraints, the allocated targets and other dynamic battlefield constraints such as threats as constraint agents, a multi-UAV path planning problem can be converted into a traditional DisCSP problem. We adopted a dynamic programming process to design the interaction between agents, so that multi-agent teams can solve the DisCSP with the ABT algorithm and a feasible planned path can be produced for each UAV according to its dynamic constraints and allocated target. We simulated multi-UAV dynamic path planning in two scenarios, one with dynamic threats and terrains and one with dynamic targets, to demonstrate the feasibility of our design.
Quick Attribute Reduction Based on Rough Boundary Region
Computer Science. 2012, 39 (1): 223-227. 
Attribute reduction is one of the core research topics of rough set theory. Most existing greedy reduction algorithms are based on the positive region and find an algebraic reduct. In fact, for an inconsistent decision table, an algebraic reduct changes the original Pawlak topology and expands the uncertainty degree of the decision table. Therefore, this paper introduced a novel reduction model based on the rough boundary region, which keeps the original Pawlak topology, and proposed an efficient attribute reduction algorithm based on it. Theoretical analysis and experimental results show that the proposed algorithm is effective and feasible.
Gene Selection Method Based on Decomposition
Computer Science. 2012, 39 (1): 228-233. 
Efficient gene selection is a key issue in classifying microarray gene expression data, since such data typically consist of a huge number of genes and a few dozen samples. Rough set theory is an efficient tool for further reducing redundancy; however, when handling numerous genes, most existing methods based on rough set theory perform poorly. A gene selection method based on decomposition was presented. The idea of decomposition is to break a complex task down into a master task and several sub-tasks that are simpler, more manageable and more solvable by existing induction methods, and then join them together to solve the original task. To evaluate the performance of the proposed approach, we applied it to four benchmark gene expression data sets and compared our results with those obtained by conventional methods. Experimental results illustrate that our algorithm improves computational efficiency significantly while keeping classification accuracy.
Fleet Assignment Problem Study Based on Multi-objective Fuzzy Linear Optimization Algorithm
Computer Science. 2012, 39 (1): 234-238. 
A new method based on a multi-objective fuzzy linear optimization algorithm for the fleet assignment problem was proposed. The method applies fuzzy theory to the optimization: the fuzzy multi-objective optimization model, whose objectives are balancing aircraft flight time, balancing the number of aircraft movements, and minimizing waiting time, is turned into a linear programming problem according to the maximum degree of membership. The experimental results show that the method can rapidly obtain the desired results.
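A sketch of the max-membership reformulation with SciPy, under assumed linear membership functions: each fuzzy goal "c_i·x around g_i with tolerance d_i" must hold to degree at least λ, and λ is maximized (all matrices here are illustrative stand-ins for the fleet-assignment data):

```python
import numpy as np
from scipy.optimize import linprog

def max_membership_lp(C, g, d, A_ub, b_ub):
    """Variables are [x, lam]; mu_i = (g_i + d_i - C_i x) / d_i >= lam."""
    m, n = C.shape
    A_fuzzy = np.hstack([C, d.reshape(-1, 1)])        # C_i x + d_i*lam <= g_i + d_i
    A_all = np.vstack([A_fuzzy,
                       np.hstack([A_ub, np.zeros((len(b_ub), 1))])])
    b_all = np.concatenate([g + d, b_ub])
    c = np.zeros(n + 1)
    c[-1] = -1.0                                      # maximize lam
    bounds = [(0, None)] * n + [(0, 1)]
    res = linprog(c, A_ub=A_all, b_ub=b_all, bounds=bounds)
    return res.x[:n], res.x[-1]                       # solution and its membership
```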
Efficient Algorithm for Mining Association Rules with Constraints
Computer Science. 2012, 39 (1): 244-247. 
Association rule mining with constraints is an important association mining method that can mine rules according to users' needs. Most algorithms handle a single constraint, but in real applications there are usually two or more. In this paper, a novel algorithm for mining association rules with constraints was proposed. It can handle two constraints simultaneously, namely an anti-monotone constraint and a monotone constraint. The algorithm consists of three phases: first, frequent 1-itemsets are collected over the dataset; second, pruning techniques are applied to the constraint checks and a conditional database is generated; finally, the frequent itemsets that satisfy the constraints are generated. Experimental results show that the proposed algorithm is efficient in both run time and scalability.
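A sketch of the two-constraint pruning idea on a toy itemset miner (the item attribute `price`, the cap `limit` and the `floor` are illustrative): the anti-monotone constraint prunes candidate generation, while the monotone one only gates output:

```python
def mine(transactions, price, minsup, limit, floor):
    """transactions: list of sets; returns itemsets meeting both constraints."""
    items = sorted({i for t in transactions for i in t})
    frontier = [frozenset([i]) for i in items if price[i] <= limit]  # anti-monotone prune
    result = []
    while frontier:
        nxt = []
        for s in frontier:
            if sum(1 for t in transactions if s <= t) < minsup:
                continue                               # support is anti-monotone too
            if sum(price[i] for i in s) >= floor:
                result.append(s)                       # monotone constraint satisfied
            nxt.extend(s | {i} for i in items
                       if i > max(s)
                       and sum(price[j] for j in s) + price[i] <= limit)
        frontier = nxt
    return result
```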
Positive Region and its Algorithms in Rough Set Model of Variable Precision Upper Approximation and Grade Lower Approximation
Computer Science. 2012, 39 (1): 248-251. 
Based on the combination of variable precision approximations and grade approximations, and given the central position of the positive region, a rough set model with a variable precision upper approximation and a grade lower approximation was constructed, and the positive region in this model was defined. In relation to the precision and grade quantitative indexes, the connotation and significance of this positive region were investigated, and a precise description and some properties were obtained. To calculate the positive region, natural and atomic algorithms were proposed and analyzed, with the conclusion that the two algorithms have the same time complexity while the atomic algorithm has advantages in space complexity. Finally, a medical example was given to analyze and explain the positive region and the algorithms. The positive region in this model completely expands the positive region of the classical rough set model, and has great value for necessity-related knowledge discovery involving precision and grade parameters.
Clouds Search Optimization Algorithm with Difference Quotient Information and its Convergence Analysis
Computer Science. 2012, 39 (1): 252-255. 
Clouds exhibit natural phenomena such as generation, dynamic movement, rainfall and regeneration. A novel intelligent optimization algorithm, the clouds search optimization algorithm (CSO), was proposed by blending these natural phenomena of clouds with the ideas of intelligent optimization algorithms. Droplets inside a cloud can produce difference quotient information to guide the search: the difference quotient approximates the gradient, and its reverse direction guides the decline of the function value. On the basis of these properties, a clouds search optimization algorithm with difference quotient information (DCSO) was also proposed and proven convergent using the relationship between the difference quotient and the gradient; its convergence property is similar to that of classical gradient-based algorithms. Finally, numerical experiments on benchmark functions show the excellent performance of the two algorithms and the fast convergence speed of DCSO.
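The guidance DCSO adds is the forward difference quotient: it approximates the gradient, and stepping against it drives the objective down. A minimal sketch (step sizes are illustrative):

```python
import numpy as np

def diff_quotient(f, x, h=1e-4):
    """Forward difference quotient of f at x (x is a float array)."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def guided_step(f, x, lr=0.1):
    return x - lr * diff_quotient(f, x)   # move against the quotient direction
```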
Covering Vague Sets
Computer Science. 2012, 39 (1): 256-260. 
Both covering-based rough sets and vague sets are mathematical tools to deal with uncertain information; the former extends rough set theory and the latter extends fuzzy sets. When computing the lower and upper approximations with existing covering-based rough set models, some elements that are actually not certain to belong completely to the given set may be put into the lower approximation; similarly, some elements that may belong to the given set are excluded from the upper approximation. By carefully analyzing the relationship between the elements of the universe and the covering elements, a covering vague set was established, which reflects the membership degree of every element of the universe to the given set from a new viewpoint. Furthermore, some relationships between the covering vague set and important concepts of covering-based rough sets were studied. Finally, the properties of a covering vague set when the covering coincides with a partition were discussed.
Text Categorization Algorithm Based on Manifold Learning and Support Vector Machines
Computer Science. 2012, 39 (1): 261-263. 
Abstract PDF(220KB) ( 661 )   
RelatedCitation | Metrics
In order to solve the text classification problem, this paper put forward a text classification algorithm based on manifold learning and support vector machines (LLE-LSSVM). Firstly, high-dimensional text features are reduced by the LLE algorithm, mining the inner regularity and characteristics of the information to obtain a meaningful low-dimensional feature space. Secondly, the features are fed into an LSSVM for learning, while a chaotic particle swarm algorithm optimizes the LSSVM parameters. Lastly, the text classification model is established. Simulation results show that the proposed algorithm improves text classification accuracy and reduces classification time, making it an effective text classification algorithm.
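A minimal sketch of the LLE-then-SVM pipeline, with scikit-learn's SVC standing in for LSSVM and fixed parameters replacing the chaotic-PSO search (both substitutions are assumptions for illustration):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.manifold import LocallyLinearEmbedding
    from sklearn.svm import SVC

    docs = ["cheap pills online", "meeting moved to friday",
            "win money now", "project status update"]
    labels = [1, 0, 1, 0]                # 1 = spam, 0 = ham (toy data)

    X = TfidfVectorizer().fit_transform(docs).toarray()    # high-dimensional features
    X_low = LocallyLinearEmbedding(n_neighbors=3, n_components=2).fit_transform(X)
    clf = SVC(C=1.0, gamma="scale").fit(X_low, labels)     # learn in the reduced space
    print(clf.predict(X_low))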
Web Retrieval Optimization Model Based on User's Query Intention Identification
Computer Science. 2012, 39 (1): 264-267. 
Abstract PDF(350KB) ( 1148 )   
RelatedCitation | Metrics
A Web retrieval optimization model was proposed based on the analysis and classification of the user's query intention. The model focuses on identifying the user's query intention; by means of characteristic words of the query intention, content keywords and the user's query behavior, it can both match the user's query intention accurately and filter useless information automatically. Compared with related work, this paper focused on the user's query intention and satisfaction. Experimental results show that the model significantly improves retrieval accuracy and user satisfaction compared with traditional methods.
Motion Path Planning Method for 3D Virtual Characters with Different Motion Modes
Computer Science. 2012, 39 (1): 268-272. 
Abstract PDF(437KB) ( 761 )   
RelatedCitation | Metrics
A path planning approach for virtual 3D character motion was proposed. According to the specific characteristics of a character, a corresponding model was designed. The A* algorithm was then used to search for a collision-free path, and finally the motion path was optimized. Experimental results show that the paths planned by our method suit characters of different scales and motion modes, including flight, crawling and walking.
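A minimal grid-based A* sketch of the search step; the character-specific modelling and the final path optimization described in the abstract are not reproduced.

    # 4-connected grid A* with a Manhattan heuristic (sketch only).
    import heapq

    def astar(grid, start, goal):
        """grid[r][c] == 0 means free; returns a list of cells or None."""
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        g, parent = {start: 0}, {start: None}
        open_set, closed = [(h(start), start)], set()
        while open_set:
            _, cur = heapq.heappop(open_set)
            if cur in closed:
                continue
            closed.add(cur)
            if cur == goal:                      # reconstruct the path
                path = [cur]
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                return path[::-1]
            r, c = cur
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nxt
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                    ng = g[cur] + 1
                    if ng < g.get(nxt, float("inf")):
                        g[nxt], parent[nxt] = ng, cur
                        heapq.heappush(open_set, (ng + h(nxt), nxt))
        return None

    grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))   # routes around the obstacle row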
Image Retrieval Based on Color Volume Histogram
刘广海,吴璟莉
Computer Science. 2012, 39 (1): 273-275. 
Abstract PDF(361KB) ( 830 )   
RelatedCitation | Metrics
The HSV color space mimics human color perception well. To take advantage of this, a new feature descriptor, the color volume histogram, was proposed in this paper for image representation and image retrieval. First, the color image is converted from RGB to HSV color space and uniformly quantized into 72 colors; color volumes are then used to represent image features. The proposed algorithm was extensively tested on Corel datasets with 15000 natural images. Retrieval experiments show that the color volume histogram has the discriminative power of both color and spatial features, and its performance is significantly better than that of the local binary pattern histogram and the color histogram.
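The 72-color quantization can be sketched with the common 8×3×3 (H×S×V) binning, assumed here since the abstract does not give the exact scheme; the paper's color-volume weighting is not reproduced.

    # 72-bin HSV quantization sketch (8 hue x 3 saturation x 3 value bins assumed).
    import colorsys
    import numpy as np

    def hsv_histogram72(rgb_pixels):
        """rgb_pixels: iterable of (r, g, b) in 0..255; returns a 72-bin histogram."""
        hist = np.zeros(72)
        for r, g, b in rgb_pixels:
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            hb = min(int(h * 8), 7)      # 8 hue bins
            sb = min(int(s * 3), 2)      # 3 saturation bins
            vb = min(int(v * 3), 2)      # 3 value bins
            hist[hb * 9 + sb * 3 + vb] += 1
        return hist / hist.sum()

    print(hsv_histogram72([(255, 0, 0), (0, 255, 0), (0, 0, 255), (128, 128, 128)]).nonzero())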
Preserving-Moment Principle-based 2-D Shannon Entropy Image Thresholding Method and its Fast Recursive Implementation
Computer Science. 2012, 39 (1): 276-280. 
Abstract PDF(424KB) ( 720 )   
RelatedCitation | Metrics
In order to overcome the drawbacks of the 2-D Shannon entropy image thresholding method, a moment-preserving modified Shannon entropy thresholding method based on oblique segmentation of the 2-D histogram was presented. First, two Shannon entropy thresholding methods were formulated using an oblique line perpendicular to the main diagonal; then the optimal threshold was chosen among the thresholds produced by these methods using the moment-preserving principle, and a recursive algorithm for the oblique-segmentation method was derived. Finally, the features of the 2-D histogram were combined with this algorithm to obtain a novel recursive algorithm. Experimental results show that, compared with the current maximum entropy method based on 2-D oblique segmentation, the proposed method segments much better and runs about four times faster.
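For orientation, the sketch below builds the underlying 2-D (gray level, neighborhood mean) histogram and evaluates the standard 2-D Shannon entropy criterion at one candidate threshold; the oblique segmentation, moment preservation and recursion of the paper are not reproduced.

    import numpy as np

    def hist2d(img):
        """Joint histogram of gray level and 3x3 neighborhood mean (256 x 256 bins)."""
        p = np.pad(img, 1, mode='edge').astype(float)
        mean = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
                   for i in range(3) for j in range(3)) / 9
        H, _, _ = np.histogram2d(img.ravel(), mean.ravel(),
                                 bins=256, range=[[0, 256], [0, 256]])
        return H / H.sum()

    def class_entropy(block):
        w = block.sum()
        if w == 0:
            return 0.0
        q = block[block > 0] / w
        return float(-(q * np.log(q)).sum())

    img = (np.random.rand(64, 64) * 256).astype(np.uint8)
    P = hist2d(img)
    s, t = 128, 128                          # one candidate 2-D threshold (s, t)
    print(class_entropy(P[:s, :t]) + class_entropy(P[s:, t:]))   # maximized over (s, t)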
Neighborhood Preserving-based Relevance Feedback Algorithm in Image Retrieval
Computer Science. 2012, 39 (1): 281-284. 
Abstract PDF(241KB) ( 713 )   
RelatedCitation | Metrics
When relevance feedback provides too few feedback samples, supervised learning methods may suffer from over-fitting in image retrieval. This paper proposed a novel neighborhood preserving regression algorithm that makes efficient use of unlabeled images. The algorithm selects the function that minimizes the empirical loss on the labeled images while preserving the neighborhood structure of the whole image database; the function can thus respect both the semantic and the geometrical structure of the database. Experimental results show that the algorithm is effective for image retrieval.
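A graph-regularized least-squares sketch of this idea, assuming (not from the paper) a kNN graph Laplacian over labeled and unlabeled images as the neighborhood-preserving penalty:

    import numpy as np

    def npr_fit(X, y_labeled, n_labeled, k=3, lam=0.1):
        n = len(X)
        # kNN adjacency and graph Laplacian over ALL points, labeled or not
        D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.zeros((n, n))
        for i in range(n):
            for j in np.argsort(D2[i])[1:k + 1]:
                W[i, j] = W[j, i] = 1.0
        L = np.diag(W.sum(1)) - W
        Xl = X[:n_labeled]
        # minimize ||Xl w - y||^2 + lam * w^T X^T L X w
        A = Xl.T @ Xl + lam * X.T @ L @ X + 1e-8 * np.eye(X.shape[1])
        return np.linalg.solve(A, Xl.T @ y_labeled)

    X = np.vstack([np.random.randn(10, 2), np.random.randn(20, 2)])  # 10 labeled, 20 unlabeled
    y = (X[:10, 0] > 0).astype(float)     # relevance scores of the labeled images
    w = npr_fit(X, y, n_labeled=10)
    print(X @ w)                          # predicted relevance for all images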
Graphics Processing Unit Based Fuzzy C-means Clustering Segmentation
Computer Science. 2012, 39 (1): 285-286. 
Abstract PDF(246KB) ( 648 )   
RelatedCitation | Metrics
In order to accelerate FCM (fuzzy c-means) clustering segmentation, an accelerated algorithm based on the GPU (graphics processing unit) was proposed. Firstly, the method analyzes which phases of the FCM algorithm can be parallelized. Then, to fit the GPU's hardware architecture, it maps the computation of membership grades and cluster centers, together with the classification of each pixel by its membership grade, onto CUDA (Compute Unified Device Architecture). Experimental results show that the GPU-accelerated FCM segmentation is markedly more efficient than the serial CPU algorithm. Given the parallel nature of most image processing algorithms, GPU-based acceleration is broadly applicable.
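The two data-parallel steps of FCM that map naturally onto a GPU, shown here as vectorized NumPy to illustrate the structure (a sketch, not the paper's CUDA implementation):

    import numpy as np

    def fcm_step(X, centers, m=2.0, eps=1e-10):
        """X: (n, d) pixel features; centers: (c, d). One FCM iteration."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)       # membership update: one thread per pixel
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None] # center update: a reduction
        return U, centers

    X = np.random.rand(1000, 3)                        # e.g. RGB pixel values
    centers = X[np.random.choice(len(X), 4, replace=False)]
    for _ in range(20):
        U, centers = fcm_step(X, centers)
    labels = U.argmax(axis=1)                          # classify each pixel by membership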
Design and Implementation of Co-Array Fortran Compiler Based on Software Shared Memory
Computer Science. 2012, 39 (1): 287-289. 
Abstract PDF(307KB) ( 1209 )   
RelatedCitation | Metrics
Co-Array Fortran (CAF) has become part of the Fortran programming language standard and has been widely accepted in the scientific computing community. This paper presented the design and implementation of a CAF compiler based on software shared memory. Several techniques were presented to improve the performance of CAF programs, including co-array data communication by direct array assignment and data padding to raise locality and reduce false sharing. Benchmark testing shows that, with our CAF compiler, the performance of CAF programs is similar to that of MPI programs.
Efficient Heuristic and Tabu Search for Hardware/Software Partitioning
Computer Science. 2012, 39 (1): 290-294. 
Abstract PDF(405KB) ( 884 )   
RelatedCitation | Metrics
Hardware/software (HW/SW) partitioning is one of the crucial steps in HW/SW co-design: it determines which components of the system are implemented in hardware and which in software. The HW/SW partitioning problem has been proved NP-hard. This paper presented a heuristic algorithm for the problem, treated as an extended 0-1 knapsack problem. Tabu search was then used to further improve the solution obtained by the heuristic, in order to minimize the hardware cost under software cost and communication cost constraints. Experimental results show that the proposed algorithms produce better solutions than the latest work, with improvements of up to 28%.
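As background, the plain 0-1 knapsack dynamic program that the extended formulation builds on (the paper's heuristic and tabu-search refinement are not reproduced):

    # Classic 0-1 knapsack DP: maximize value under a capacity bound.
    def knapsack(values, weights, capacity):
        best = [0] * (capacity + 1)
        for v, w in zip(values, weights):
            for c in range(capacity, w - 1, -1):   # descending: each item used once
                best[c] = max(best[c], best[c - w] + v)
        return best[capacity]

    # components: value = speedup if moved to hardware, weight = hardware area cost
    print(knapsack(values=[6, 10, 12], weights=[1, 2, 3], capacity=5))   # 22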
Probability-based Ordered Information Systems
Computer Science. 2012, 39 (1): 293-243. 
Abstract PDF(387KB) ( 691 )   
RelatedCitation | Metrics
An information system is called an ordered information system if the domains of all condition attributes are ordered according to preference. Firstly, the defects of interval-valued ordered information systems that do not include the probability distribution were analyzed, and probability-based ordered information systems were proposed. Secondly, approaches for establishing the outranking relation between objects with respect to attributes with monotonic and non-monotonic preference in probability-based ordered information systems were given, a probability-based dominance relation was defined, and the rough set model based on this dominance relation was presented. Finally, probability-based ordered decision tables were defined, and the extraction of decision rules from them was studied.
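For reference, the classical dominance relation on an ordered information system that these constructions generalize (the probability-based extension itself is not reproduced here):

    # x dominates y when x is at least as good as y on every criterion,
    # given each attribute's preference direction.
    def dominates(x, y, increasing):
        """x, y: attribute-value tuples; increasing[i]: True if larger is preferred."""
        return all(a >= b if up else a <= b
                   for a, b, up in zip(x, y, increasing))

    objects = {"o1": (3, 2), "o2": (2, 2), "o3": (3, 1)}
    increasing = (True, True)             # both criteria: larger is preferred
    for a in objects:
        for b in objects:
            if a != b and dominates(objects[a], objects[b], increasing):
                print(a, "dominates", b)  # o1 dominates o2 and o3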
Research on Parallel Algorithm of Edge Extraction Based on Multi-processor
Computer Science. 2012, 39 (1): 295-298. 
Abstract PDF(329KB) ( 983 )   
RelatedCitation | Metrics
With the development of microprocessors from high-frequency uniprocessors to chip multiprocessors (CMP), the parallel processing capability of computers keeps advancing. After analyzing the performance bottlenecks of the serial UIS algorithm, raster data processing was parallelized with multiple threads to exploit the CMP. Parallel programming models were analyzed and compared to build a parallel performance estimating model; based on OpenMP, parallel performance can be maximized by choosing appropriate parameters. The experimental results show that the parallel performance estimating model forecasts parallel performance accurately and that OpenMP has an advantage over MPI in a CMP environment. The OpenMP-based parallel edge extraction algorithm significantly improves the efficiency of image edge extraction.
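A sketch of the row-band decomposition such an OpenMP parallel-for would apply to edge extraction, shown here with Sobel gradients and Python threads (illustrative only; NumPy releases the GIL inside array operations, but this is not the paper's OpenMP code):

    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def sobel_band(p, r0, r1):
        """Gradient magnitude for output rows r0..r1; p is the 1-pixel-padded image."""
        gx = ((p[r0:r1, 2:] + 2 * p[r0+1:r1+1, 2:] + p[r0+2:r1+2, 2:])
            - (p[r0:r1, :-2] + 2 * p[r0+1:r1+1, :-2] + p[r0+2:r1+2, :-2]))
        gy = ((p[r0+2:r1+2, :-2] + 2 * p[r0+2:r1+2, 1:-1] + p[r0+2:r1+2, 2:])
            - (p[r0:r1, :-2] + 2 * p[r0:r1, 1:-1] + p[r0:r1, 2:]))
        return np.hypot(gx, gy)

    img = np.random.rand(512, 512)
    p = np.pad(img, 1, mode='edge')
    bands = [(i, min(i + 128, img.shape[0])) for i in range(0, img.shape[0], 128)]
    with ThreadPoolExecutor() as pool:
        parts = list(pool.map(lambda b: sobel_band(p, *b), bands))
    edges = np.vstack(parts)              # identical to the serial result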
Study on Optimization Methods for Hardware-Enhanced I/O Virtualization of Many-Core Processors
Computer Science. 2012, 39 (1): 299-304. 
Abstract PDF(554KB) ( 891 )   
RelatedCitation | Metrics
The I/O resources of a many-core processor are shared by many cores. I/O virtualization supports efficient sharing and secure isolation of I/O resources, and is adopted by more and more processors. Hardware enhancement for I/O virtualization is taken into account when the processor architecture is designed, providing a general and efficient solution. This paper studied two key techniques of I/O virtualization: DMA remapping and interrupt redirection. First, an IOTLB cache management method based on hints and an invalidation method based on an invalidation queue were put forward to optimize DMA remapping performance; then a flexible and controllable interrupt redirection method was proposed to support reliable and efficient interrupt routing. Experimental results show that the proposed hardware enhancement method for I/O virtualization not only supports efficient sharing of I/O resources at little cost, but also provides almost the same I/O performance as an environment without I/O virtualization.
Performance Analysis and Tuning of Large-scale Finite Element Analysis Program on Multi-core Cluster Platform
Computer Science. 2012, 39 (1): 305-310. 
Abstract PDF(496KB) ( 1131 )   
RelatedCitation | Metrics
Through performance analysis of an open-source finite element software based on the MPI programming model on a multi-core cluster platform, several performance bottlenecks were found. Based on this analysis, two optimization plans using the hybrid MPI/OpenMP parallel programming model were proposed: one resolves the inefficiency in solving linear or nonlinear systems of equations, and the other improves inter-process communication performance. Experimental results show that the hybrid parallel solver efficiently improves the performance of the pure MPI parallel program, by up to 3 times. The multi-thread, multi-process communication plan yields some optimization, but is not the best solution in this case. The overall optimized performance analysis indicates that, on a multi-core cluster computing platform, the MPI/OpenMP parallel programming model utilizes the hardware computation resources more efficiently.
Model Analysis of Image Median Filter Based on NoC
Computer Science. 2012, 39 (1): 311-314. 
Abstract PDF(282KB) ( 788 )   
RelatedCitation | Metrics
The paper proposed a NoC architecture for realizing a special median filter algorithm. In order to improve processing speed, we combined system-level and instruction-level parallel processing mechanisms in a special SoC, which not only satisfies the processing speed requirement but also lowers power consumption.
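For reference, the 3x3 median filter kernel of the kind mapped onto such an architecture, sketched in NumPy (the paper's NoC/SoC design itself is not reproduced):

    import numpy as np

    def median3x3(img):
        p = np.pad(img, 1, mode='edge')
        # nine shifted views, one per neighborhood position; median taken per pixel
        stack = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                          for i in range(3) for j in range(3)])
        return np.median(stack, axis=0)

    noisy = np.random.randint(0, 256, (8, 8))
    print(median3x3(noisy).astype(int))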