Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 37 Issue 12, 01 December 2010
  
Survey and Review on Key Technologies of Column Oriented Database Systems
LI Chao,ZHANG Ming-bo,XING Chun-xiao,HU Jin-song
Computer Science. 2010, 37 (12): 1-7. 
The column-oriented database is a new database storage technology that stores data by column rather than by the traditional row layout. Database pioneers such as Dr. Michael Stonebraker are advocating and exploring the new theory and technology of column-oriented databases. Their main features are good query efficiency, less disk access, less storage, and significant improvement of database performance. The column-oriented database is natively an ideal architecture for data warehouses, and thus shows good potential for supporting highly efficient business intelligence applications. This new technology is promising in both academia and industry, attracting many high-tech corporations and research institutes to devote themselves to it. This paper introduced and analysed the main features, key technologies and current R&D situation of column-oriented databases.
Algorithms for Cluster Editing:A Survey
WANG Jian-xin,WAN Mao-wu,CHEN Jian-er
Computer Science. 2010, 37 (12): 8-11. 
Cluster Editing is known to be a very important NP-hard problem. As a special case of Correlation Clustering, it plays a significant role in many fields such as computational biology. After the theory of parameterized complexity was brought up, its parameterized version drew much attention. We introduced some approximation and parameterized algorithms for this problem and some of its variants, with emphasis on the latest results about its kernelization and FPT algorithms. At the end, we presented some directions for future research.
Survey of Resource Discovery Mechanism in Distributed System
HAI Mo
Computer Science. 2010, 37 (12): 12-17. 
The resource discovery problem is to find the addresses of matching resources from a given description of the resources. How to find the needed resources quickly and accurately among distributed stored resources is a challenge. Traditional grid resource discovery systems adopt registries and indexes, but these methods cannot satisfy the needs of grid systems of increasing scale. A P2P system is a scalable distributed system, and adopting current P2P technology is an efficient way to solve the grid resource discovery problem. This paper introduced resource discovery in grid systems, resource discovery in P2P systems, and P2P-based grid resource discovery systems, and gave comparisons.
Review of Energy Holes in Wireless Sensor Networks
WANG Dong-fang,QI Xiao-gang
Computer Science. 2010, 37 (12): 18-21. 
Owing to the characteristics of wireless sensor networks, the nodes around the sink use up their energy much faster than other nodes, which forms the energy hole problem. How to avoid energy holes and prolong network lifetime is a hot topic in wireless sensor networks. Existing methods for the energy hole problem mostly balance the network's load. We summarized the methods for this problem in five categories: energy and power control, data compression and mixture strategies, non-uniform node distribution, sink dynamics and increasing the number of sinks, and clustering algorithms. What is more, we analysed the advantages and disadvantages of these five categories, and proposed research directions.
New Multi-channel Multimedia Data Stream Filter Model
LI Jun,LIAO Hao,CHEN Jie,TAN Jian-long
Computer Science. 2010, 37 (12): 22-25. 
A multimedia data stream fuses data of different formats, e.g. text, pictures, audio and video. Filtering sensitive information quickly and accurately enough from a high-speed multimedia data stream is a great challenge that has not yet been well tackled. This paper proposed a model for filtering multimedia data streams, and implemented a validation prototype system on a national backbone network. The prototype system uses an intelligent rule base to establish a mapping mechanism from filtering rules to the relational data engine, and uses a decision-tree strategy to build the classification model for similarity calculation, which reduces the complexity of the similarity computation and achieves real-time filtering of high-speed multimedia data streams. Experiments show that the model can effectively perform multi-mode and multi-channel filtering in a multimedia data stream environment.
LTI Model-based Proportional Delay Control for Multiple Service Classes on Apache Web Server
LU Jian-bo,DAI Guan-zhong, PAN Wen-ping
Computer Science. 2010, 37 (12): 26-29. 
The linear time-invariant (LTI) model of the Apache Web server was identified experimentally. This model describes the relationship between the connection delay ratio and the service thread ratio for two classes of Web client connections. Based on this model, a controller was designed to implement proportional delay guarantees by adjusting the number of service threads for each class of connections. An extension to multiple connection classes was implemented by using multiple controllers. Simulation results indicated that the Apache Web server in the closed-loop system guarantees the proportional delay for different connection classes quite well, even if the number of concurrent client connections changes abruptly under heavy load.
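To make the control idea concrete, here is a minimal sketch, not the authors' identified model or controller: a discrete-time integral controller that nudges the thread split between two connection classes toward a target delay ratio. The class name, gain and thread counts are illustrative assumptions.

```python
# Minimal sketch of proportional-delay control by thread reallocation.
# Assumption: giving a class more threads lowers its relative delay.

class DelayRatioController:
    def __init__(self, target_ratio, gain=0.5, total_threads=64):
        self.target = target_ratio          # desired delay(class0)/delay(class1)
        self.gain = gain                    # integral gain, tuned empirically
        self.total = total_threads
        self.threads0 = total_threads / 2   # threads currently serving class 0

    def update(self, delay0, delay1):
        """One sampling period: measure the delay ratio, adjust the split."""
        measured = delay0 / max(delay1, 1e-9)
        error = self.target - measured
        # error > 0 means class 0 is served too well relative to the target,
        # so threads are shifted away from class 0 (integral action).
        self.threads0 -= self.gain * error * self.total
        self.threads0 = min(max(self.threads0, 1), self.total - 1)
        n0 = round(self.threads0)
        return n0, self.total - n0

ctrl = DelayRatioController(target_ratio=2.0)
print(ctrl.update(delay0=120.0, delay1=80.0))   # shifts threads off class 0
```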
DQ-MAC: A Diffserv-based MAC Mechanism in Wireless Sensor Networks
HE Jian,BAI Guang-wei,CAO Lei
Computer Science. 2010, 37 (12): 30-34. 
Since most existing MAC protocols in wireless sensor networks do not support any priority scheme, this paper proposed an improved scheme of the existing S-MAC (sensor MAC) protocol, called DQ-MAC (Diffserv-based QoS-aware MAC). The main idea is that, based on Differentiated Services, an additional channel listening time is introduced for high-priority data. As a result, compared to low-priority data, high-priority data gets a much better chance of being delivered, and its packet transmission latency is reduced significantly as well. Our mathematical analysis demonstrates that, using the proposed DQ-MAC, the transmission quality of high-priority data over wireless sensor networks is effectively improved, in terms of achieved throughput, average service delay, and so on.
Study on Internet Traffic Classification Using Machine Learning
LIU Qiong,LIU Zhen,HUANG Min
Computer Science. 2010, 37 (12): 35-40. 
Internet traffic classification is one of the key foundations of Internet research and traffic engineering. The categories of Internet applications and the number of Internet flows have been increasing fast in recent years, and the development of traffic classification has had to cope with these technical challenges all along. This paper systematically surveyed Internet traffic classification using machine learning. A mathematical description of the traffic classification problem was given first. After surveying traffic classification based on supervised and unsupervised learning, we commented on three aspects, namely data preprocessing, classification models and performance evaluation, and indicated the shortcomings of current studies. Then four common issues were summarized: data skew, the labeling bottleneck, attribute changes and timely classification. At the end, the prospects and our ongoing work in this area were pointed out.
Learning-Task Scheduling Algorithm Based on CSP Model
CHEN Yi-xiong,WU Zhong-fu,FENG Yong,ZHU Zheng-zhou
Computer Science. 2010, 37 (12): 41-46. 
This paper focused on realizing an intelligent guide service for learning activities in an e-learning environment. At present, our research group has adopted a knowledge-topic-based content routing algorithm and a resource combination algorithm to meet the individualized requirements of learners. However, the learning effect still lacks assurance even though the expected content package is generated. In fact, the learning process consists of a series of learning tasks instead of static contents, so the effect of learning can only be guaranteed by a group of well-organized learning tasks designed according to educational methodology. Thus, this paper established a learning-task layer above the former knowledge-topic layer, converted the scheduling problem into a Constraint Satisfaction Problem, and proposed a solution. The experiments showed that the proposed algorithm can generate an optimized schedule which is effective in providing a learning guide service.
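As a toy illustration of casting task scheduling as a CSP, the sketch below assigns learning tasks to time slots under precedence constraints and solves by backtracking; the formulation and names are assumptions, not the paper's algorithm.

```python
# Minimal CSP sketch: assign each learning task a slot so that for every
# (a, b) in `precedes`, slot[a] < slot[b]; solved by plain backtracking.

def csp_schedule(tasks, slots, precedes):
    assignment = {}

    def consistent(task, slot):
        for a, b in precedes:
            if a == task and b in assignment and not slot < assignment[b]:
                return False
            if b == task and a in assignment and not assignment[a] < slot:
                return False
        return True

    def backtrack(i):
        if i == len(tasks):
            return dict(assignment)
        for slot in slots:
            if consistent(tasks[i], slot):
                assignment[tasks[i]] = slot
                result = backtrack(i + 1)
                if result is not None:
                    return result
                del assignment[tasks[i]]
        return None                      # no feasible schedule

    return backtrack(0)

# Three tasks; "intro" must precede both exercises.
print(csp_schedule(["intro", "ex1", "ex2"], [1, 2, 3],
                   [("intro", "ex1"), ("intro", "ex2")]))
```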
Facilitating Role Management in RBAC: Using Multi-parents Tree
FENG Xiao-sheng,LI Xing-yun,SUN Yang,ZHANG Wei-ming
Computer Science. 2010, 37 (12): 47-52. 
Role-Based Access Control (RBAC) has been widely applied to authorize certain users to access certain data or resources within complex information systems. Several problems arise during the application of RBAC models, including representing the role hierarchy well, following the constraints applied in user-role assignments and role-role relations, and revoking redundant roles and assignments. This paper addressed these problems from the perspective of information visualization to facilitate role management in RBAC, particularly leveraging the experience of tree visualization. A detailed problem statement was made first, and the data structure of the multi-parents tree was defined. Then a multi-parents tree normalization process was proposed to construct a refined role hierarchy for elegant representation. Subsequently, a two-layered paradigm, the lower layer displaying the role hierarchy and permissions and the upper layer placing users, was presented for the visualization of role management in RBAC. Additionally, some specific interaction techniques were put forward to visually aid in solving the constraint and redundancy problems.
Frame Structure and its Performance Analysis of Fractional Frequency Reuse
GAO Di,ZHU Guang-xi,LI Yan-chun,Markus Hidell
Computer Science. 2010, 37 (12): 53-56. 
Aiming at the increasing requirement of interference mitigation in wireless communication systems, this paper proposed a new frame structure design, "time-frequency division fractional frequency reuse", which reallocates subcarriers in frames. The new method integrates the time-division and frequency-division modes when assigning resource blocks, after modeling the system capacity. The corresponding frame structure offers a finer granularity of adjustment compared with the traditional fractional frequency reuse pattern. Simulation results validate its advantages in system fairness and edge-user throughput, at the cost of a slight capacity drop for the entire system.
Image Encryption Algorithm Based on Logistic and Standard Map
HU Chun-qiang,DENG Shao-jiang,QIN Ming-fu,HUANG Gui-lin
Computer Science. 2010, 37 (12): 57-59. 
Based on the Logistic and Standard maps, a new image encryption algorithm was proposed. By using the properties of chaotic maps, such as sensitivity to initial values and parameters and the statistical properties of white noise, the confusion system was designed through a discrete chaotic map and the Standard map. This confusion depends on secret keys generated by a different chaotic system. The substitution system then changes the pixel values of the encrypted image and makes the pixels associated with each other, so that every pixel value is diffused to the others. The system contains confusion, substitution and diffusion, which are necessary factors for a good cryptosystem. The experimental simulation and analysis show that the algorithm has the advantages of a large key space, easy implementation and good statistical characteristics.
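For illustration, here is a minimal sketch, not the authors' cipher: it only shows two of the building blocks the abstract names, a logistic-map keystream for pixel substitution and a chained diffusion pass. The parameters x0 and mu stand in for an illustrative secret key.

```python
# Minimal sketch of logistic-map substitution plus diffusion.
import numpy as np

def logistic_keystream(x0, mu, n):
    """Iterate x -> mu*x*(1-x) and quantize each state to a byte."""
    ks, x = np.empty(n, dtype=np.uint8), x0
    for i in range(n):
        x = mu * x * (1.0 - x)
        ks[i] = int(x * 256) % 256
    return ks

def encrypt(image, x0=0.3141, mu=3.9999):
    flat = image.flatten()
    ks = logistic_keystream(x0, mu, flat.size)
    out = np.empty_like(flat)
    prev = 0
    for i in range(flat.size):
        # substitution (XOR with keystream byte), then diffusion:
        # add the previous ciphertext byte so pixels become interdependent
        out[i] = ((int(flat[i]) ^ int(ks[i])) + prev) & 0xFF
        prev = int(out[i])
    return out.reshape(image.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(encrypt(img))
```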
Adjusting Traffic over Multi-paths Based on Feasible Distance of Path
GUAN Li-an,WANG Bin-qiang
Computer Science. 2010, 37 (12): 60-62. 
To eliminate the poor load balancing that results from current multi-path traffic adjustment algorithms, IAH (Improved Adjustment Heuristic) was proposed. IAH utilizes whole-network and per-node information that reflects the load state of the links. Based on the change of the feasible distance of paths in the routing information, and on the principle that a short path has more capability than a long one, IAH adjusts traffic over multiple paths, transferring traffic from heavily loaded links to lightly loaded ones so that each path's traffic matches its feasible distance. Simulation results show that packet loss starts later with IAH than with AH, and that IAH's packet loss rate is better than AH's most of the time. In other words, the load balancing performance of IAH exceeds that of AH.
Optimal Sensor Deployment Scheme in Heterogeneous Sensor Networks Based on Binary Particle Swarm Algorithm
LI Min,SHI Wei-ren
Computer Science. 2010, 37 (12): 63-66. 
As one of the key issues in the application of wireless sensor networks (WSNs), sensor deployment is a significant means of guaranteeing the quality of service of a network. To cope with the high density of heterogeneous nodes in WSNs and the geographical irregularity of sensed events in the monitored area, an optimal differentiated sensor deployment scheme based on the binary particle swarm algorithm was proposed, which computes the network cost and chooses the right type of sensor for each position. While guaranteeing sensor coverage of the zone, the deployment cost was minimized in order to reduce network redundancy and enhance quality of service. Finally, the experimental results demonstrated that the proposed approach is suitable for solving deployment problems of heterogeneous WSNs.
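Below is a minimal binary-PSO sketch under toy assumptions: each bit decides whether a sensor is placed at a candidate position, and the fitness penalizes both cost and uncovered demand points. The penalty weight and PSO parameters are illustrative, not the paper's.

```python
# Minimal binary PSO: bits are sampled through a sigmoid of the velocity.
import numpy as np
rng = np.random.default_rng(0)

def fitness(bits, cost, coverage):
    # coverage[i, j] == 1 if a sensor at position i covers demand point j
    covered = coverage[bits == 1].sum(axis=0) > 0
    return (bits * cost).sum() + 100.0 * (~covered).sum()   # minimize

def bpso(cost, coverage, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    n = cost.size
    x = rng.integers(0, 2, (n_particles, n))
    v = rng.normal(0, 1, (n_particles, n))
    pbest = x.copy()
    pbest_f = np.array([fitness(p, cost, coverage) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, n))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random((n_particles, n)) < 1 / (1 + np.exp(-v))).astype(int)
        f = np.array([fitness(p, cost, coverage) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

cost = np.array([3.0, 1.0, 2.0, 1.5])
coverage = np.array([[1, 1, 0], [1, 0, 0], [0, 1, 1], [0, 0, 1]])
print(bpso(cost, coverage))   # a cheap subset of positions covering all points
```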
Load Balancing Algorithm Using Flow Splitting to Avoid Packet Reordering
BU You-jun,WANG Chao,WANG Bin-qiang
Computer Science. 2010, 37 (12): 67-69. 
Load balancing is an effective technique that keeps a network from congestion when links are overloaded or fail. The performance of a balancing system is determined by the granularity of load splitting: the finer the granularity, the better the performance. But splitting schemes must make a tradeoff between splitting granularity and packet reordering. Splitting traffic at packet granularity, each path of the balancing system obtains an accurate load assignment, but many packets of the same TCP flow may be reordered. Splitting traffic at flow granularity, the packets of the same flow arrive at the destination along the same path, which causes no packet reordering, but each path's load assignment is inaccurate compared with its desired load share. This paper showed that a flow can be split across multiple paths without causing packet reordering, by using the time gap between consecutive packets of the same flow to chop the flow into several segments. The FSLB algorithm splits traffic at the granularity of these segments, and simulation results show that FSLB achieves good performance.
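The sketch below illustrates the gap-based slicing idea under an assumed parameter: if the gap between consecutive packets of a flow exceeds the worst-case delay difference between paths, the flow can switch paths without risking reordering. It is a simplification, not the FSLB algorithm itself.

```python
# Minimal sketch of segment-granularity flow splitting.
import zlib

class SegmentSplitter:
    def __init__(self, n_paths, max_delay_skew):
        self.n_paths = n_paths
        self.skew = max_delay_skew           # assumed max path-delay difference
        self.state = {}                      # flow_id -> (last_ts, path)

    def route(self, flow_id, ts):
        if flow_id not in self.state:
            path = zlib.crc32(flow_id.encode()) % self.n_paths
        else:
            last_ts, path = self.state[flow_id]
            if ts - last_ts > self.skew:
                # the gap closes the current segment; the new segment may
                # safely take another path (here simply round-robin)
                path = (path + 1) % self.n_paths
        self.state[flow_id] = (ts, path)
        return path

s = SegmentSplitter(n_paths=2, max_delay_skew=0.05)
for t in [0.00, 0.01, 0.02, 0.20, 0.21]:     # long gap before t = 0.20
    print(t, s.route("flowA", t))
```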
Novel Framework for Service Discovery in MANET
ZHANG Cheng,ZHU Qing-sheng,CHEN Zi-yu,LIU Hui-jun
Computer Science. 2010, 37 (12): 70-75. 
Mobile ad hoc networks (MANETs) are autonomous, infrastructureless networks. Unlike common service discovery in wired applications, service applications based on MANETs differ greatly in data transmission, service registration and service discovery. All components of service discovery in a MANET need to consider the features of mobile devices and the mobile environment. Considering these characteristics and requirements, this paper presented a new framework for service discovery in MANETs, which supports SOAP-over-UDP and adopts a cluster architecture. To make services work smoothly in this framework, the paper also proposed special service registration and discovery strategies. The experimental results demonstrate that the new framework with these strategies has great superiority in both response time and efficiency in a MANET environment.
Research of Anti-fraud Detection Model for Advanced Payment System Information Security
WU Jing-hua, ZHOU Qi-hai,LIU Jia-fen
Computer Science. 2010, 37 (12): 76-80. 
The rapid development of computer science and communication technology, as well as informational innovation in financial payment methods, makes the national Advanced Payment System (APS) more and more efficient and convenient, but it also brings many threats to financial information security that are very difficult to detect. These threats affect the informatization course of the national APS, and also influence the information security and steady development of the national finance system. Therefore, this paper proposed an anti-fraud detection model for APS information security. Based on link mining, a new technology in computer science, this model examines the mass of APS information and finds fraudulent information dynamically. A simulation of anti-fraud detection for credit cards, one of the main tools of the APS, shows that the model significantly improves the dynamic detection, accuracy and validity of credit card fraud detection, and also reduces the financial risk in the APS.
Design and Implementation of Bench4Q: A QoS-oriented E-commerce Benchmark
DUAN Zhi-quan,ZHANG Wen-bo,WANG Wei
Computer Science. 2010, 37 (12): 81-84. 
The capability of the application server, the infrastructure of E-commerce, has become a key concern. TPC-W, a very popular E-commerce benchmark, mainly focuses on the performance of the application server and lacks consideration of QoS guarantees. We designed a QoS-oriented E-commerce benchmark named Bench4Q. This benchmark makes many extensions to the load simulation and metrics analysis of the TPC-W specification. We showed the characteristics of Bench4Q by running it on a representative application server.
Formal Secure VMM Prototype Towards Level Verified Design
YI Qiu-ping,LIU Jian,WU Shu
Computer Science. 2010, 37 (12): 85-90. 
Operating systems are the base of a computer software system; they have complex control logic, and their security and reliability are critical. Almost all security operating system standards in the world require formal specification and verification. In recent years, several systems meeting the "mandatory access control" and "structural protection" levels of GB 17859-1999 were designed and implemented by domestic research agencies; however, systems conforming to higher levels are still lacking. In this paper, we reported a VMM-based security prototype, CASVisor, a system aiming at the "verified design" level, which was designed and implemented by our group. CASVisor has a formal definition of the system specification, which can be used to guide a high-performance implementation in C programs and supports formal analysis and verification. Moreover, CASVisor can be used as a rapid prototype to simulate the system design.
CILinear: An Automated Generation Tool of Linear Invariant
XING Jian-ying,LI Meng-jun,LI Zhou-jun
Computer Science. 2010, 37 (12): 91-95. 
Constructing program invariants is an important part of program verification, and Interproc is an open-source tool capable of constructing linear invariants for a simple language. This paper designed and implemented CILinear, a tool for the automatic generation of linear invariants among the numeric variables of simplified C programs, based on Interproc and a C compiler tool. It is shown that CILinear can construct linear invariants effectively and deal with more syntax units. The application of CILinear in program verification was also discussed through program code examples.
Research and Implementation of Transition from WF-net to PNML
ZHOU Jian-tao,HAI Xiao-jun
Computer Science. 2010, 37 (12): 96-98. 
Petri Net Markup Language (PNML) is an XML-based interchange format for different kinds of Petri nets. It plays an important role in interoperability among different variants of Petri nets. The workflow net (WF-net) is a formal technique for workflow modeling and for qualitative and quantitative analysis. The work in this paper is the transformation from WF-net models to the PNML format. Firstly, a PNML meta-model for WF-nets was presented, based on the standard meta-model, by extending it with the elements not in the original model. Secondly, ten transformation rules from WF-net to PNML were given based on this meta-model, covering source/sink place transformation, four trigger transformations, and four fork/join construct transformations. Finally, an automatic transformation tool was designed and implemented. The work is exploratory research on a standard, formal interchange format for Petri nets.
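To show what such a transformation emits, here is a minimal sketch assuming the standard PNML vocabulary (pnml, net, page, place, transition, arc); the paper's ten rules and WF-net extensions are not reproduced.

```python
# Minimal sketch: serialize a tiny WF-net into PNML.
import xml.etree.ElementTree as ET

def wfnet_to_pnml(places, transitions, arcs):
    pnml = ET.Element("pnml")
    net = ET.SubElement(pnml, "net", id="wfnet1",
                        type="http://www.pnml.org/version-2009/grammar/ptnet")
    page = ET.SubElement(net, "page", id="page1")
    for p in places:
        ET.SubElement(page, "place", id=p)
    for t in transitions:
        ET.SubElement(page, "transition", id=t)
    for i, (src, dst) in enumerate(arcs):
        ET.SubElement(page, "arc", id=f"a{i}", source=src, target=dst)
    return ET.tostring(pnml, encoding="unicode")

# A WF-net has one source place i and one sink place o.
print(wfnet_to_pnml(places=["i", "o"], transitions=["t1"],
                    arcs=[("i", "t1"), ("t1", "o")]))
```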
Adaptive Software Architecture Design Oriented to Requirements Uncertainty
FU Yun,LI Min-qiang,CHEN Fu-zan
Computer Science. 2010, 37 (12): 99-105. 
Requirements uncertainty is recognized as a major source of risk to the software development process. A new idea about software development, called adaptive software architecture design, was proposed in this paper, which offers a feasible way to guarantee the quality of the software product during its life cycle. Firstly, the concept of requirements uncertainty was defined, and its causes and impacts were discussed. Secondly, the relationship between software requirements and software architecture was analyzed, and the adaptive software architecture approach was put forth to handle variable user requirements and control process risks. Finally, the evolution process of software architecture was characterized with regard to requirements uncertainty, and a collaborative modeling method was presented to achieve adaptive design between requirements and architecture.
Automated Detection Method for Web Services Feature Interaction
LUO Xiang-yu,TAN Zheng,DONG Rong-sheng
Computer Science. 2010, 37 (12): 106-109. 
Model checking techniques can be effectively applied to the verification of exceptions in Web service composition, such as feature interaction problems, but the verification process is not fully automatic. To raise the level of automation of the verification, we need to transform the BPEL language into the input language of MCTK, a symbolic model checker developed by us. Based on a detailed study of the BPEL control flow, we proposed a formal model of activity execution and gave the semantics of BPEL activity execution. We then developed an algorithm for automatically converting a BPEL process into collections of seven-tuples, which capture the state changes of the business process, and an algorithm for converting those seven-tuples into the input language of MCTK. The results show that the proposed algorithms can effectively verify feature interaction in Web services, and support the verification of epistemic logic.
Research on Structural Testing of Web Applications
LU Xiao-li,DONG Yun-wei
Computer Science. 2010, 37 (12): 110-113. 
To guarantee Web application quality and reliability, people attach more and more importance to Web application testing. By analyzing and understanding a Web application and mastering its control flow and data flow information, good structural testing models and methods can be built to carry out coverage-based structural testing. In this paper, a five-level structural testing model, described as the function level, function cluster level, object level, object cluster level and application level, was offered. Testing methods for designing and choosing test cases were also introduced to support structural testing of Web applications.
Research on the Characterization and the Calculus of Temporal Granularities
ZUO Ya-yao,TANG Yong,SHU Zhong-mei,LI Lei,LIU Hai
Computer Science. 2010, 37 (12): 114-119. 
Temporal modeling and calculus are fundamental logic problems in temporal information systems. From the view of granularities, the semantics and properties of temporal granularities were discussed according to the partition, and the temporal primitives were characterized based on temporal granularities. Furthermore, conversion operators and their relationships under different granularities were analyzed, and the calculus system of these relationships was constructed from the point of view of algebraic systems.
Structure Summary for Keyword Search over XML Documents
LOU Ying,LI Zhan-huai,GUO Wen-qi,CHEN Qun,HAN Meng
Computer Science. 2010, 37 (12): 120-124. 
The index of XML data is crucial for the retrieval efficiency of XML documents. After analyzing existing XML structure summaries, this paper proposed LSS, a structural summary for keyword search over XML documents. LSS merges the nodes of the XML tree that have the same label path, so as to determine nodes' homogeneity and heterogeneity efficiently. This paper implemented an LSS construction algorithm called CSCAN, and designed an XML keyword retrieval algorithm called LSSearch based on LSS. The algorithm splits the keywords' inverted lists into subsets of different types, and then retrieves all results quickly on these subsets. Experimental results demonstrated that LSS can dramatically reduce the size of the keyword inverted lists of an XML document and improve retrieval efficiency.
Selectivity Estimation for Spatial Query Based on Histogram
ZHU Yan-lu,CHENG Chang-xiu,CHEN Rong-guo,YAN Xun
Computer Science. 2010, 37 (12): 125-129. 
Spatial query optimization is one of the key topics in spatial databases. Query optimization based on query cost estimation is an important method to improve query efficiency, and the key problem of query cost estimation is to estimate the size of the query result (i.e., the selectivity). This paper focused on the two query operations most commonly used in spatial databases: spatial selection and spatial join. The paper described in detail several histogram algorithms for the selectivity estimation of spatial queries, and analyzed their advantages and disadvantages. At the end of the paper, we discussed future research directions for the selectivity estimation of spatial queries.
Dynamical Query Modification Algorithm for Fine-grained Access Control in Databases
SHI Jie,ZHU Hong,FENG Yu-cai
Computer Science. 2010, 37 (12): 130-133. 
Fine-grained access control (FGAC) has received much attention from the research community due to the requirements of privacy preservation and Web-based applications. Query modification is a promising approach to implementing fine-grained access control. However, existing query modification algorithms consider neither the features of the queries issued by users nor the features of the FGAC policies. Thus, there are redundancies in the final executed queries, which cause unnecessary overhead. We first analyzed two different kinds of redundancy based on the features of user queries and FGAC policies. Then we proposed a technique to detect these redundancies and provided a new algorithm to implement FGAC. A comprehensive set of experiments shows that the proposed algorithm improves performance.
Query Processing for Ontology-based Relational Data Integration
WANG Jin-peng,ZHANG Ya-fei,MIAO Zhuang
Computer Science. 2010, 37 (12): 134-137. 
To resolve the problem of semantic integration of heterogeneous relational databases, this paper focused on query processing in ontology-based relational data integration, building on semantic Web technology. An ontology-based integration infrastructure for heterogeneous relational databases was presented. To model the schemas of different databases, we proposed an ontology-based data description method, with the ontology used as the global schema to describe the semantics of the relational schemas. A query rewriting algorithm was provided, which rewrites a SPARQL query over the global schema into local queries that can be executed on the heterogeneous relational databases. The experiments showed that the proposed approach has good scalability.
Classification Algorithm for Data Stream Based on Mixture Models of C4.5 and NB
LI Yan,ZHANG Yu-hong,HU Xue-gang
Computer Science. 2010, 37 (12): 138-142. 
Classification of noisy data streams with concept drift has recently become one of the most popular topics in stream data mining. A classification algorithm for data streams based on mixture models of C4.5 and NB, called CDSMM, was proposed, in which C4.5 decision trees are the basic classifiers and a Naive Bayes classifier filters noisy data. Meanwhile, it introduces a p-value hypothesis testing method to detect concept drift. Extensive studies demonstrate that CDSMM is superior to several existing algorithms in predictive accuracy when handling noisy data streams with concept drift.
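Here is a minimal sketch in the spirit of the mixture idea, with assumed details: scikit-learn's DecisionTreeClassifier stands in for C4.5, and training examples on which a Naive Bayes filter confidently disagrees with the given label are dropped as noise. It is not the authors' exact procedure.

```python
# Minimal sketch: NB-based noise filtering before training the tree.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def train_chunk(X, y, noise_threshold=0.9):
    nb = GaussianNB().fit(X, y)
    proba = nb.predict_proba(X)
    pred = nb.classes_[proba.argmax(axis=1)]
    # Keep an example unless NB confidently predicts a different label.
    keep = ~((pred != y) & (proba.max(axis=1) > noise_threshold))
    return DecisionTreeClassifier(max_depth=5).fit(X[keep], y[keep])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
y[rng.choice(500, 25, replace=False)] ^= 1       # inject 5% label noise
tree = train_chunk(X, y)
print("training accuracy:", tree.score(X, y))
```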
Non-blocking Join Algorithm Based on Statistics
CHEN Gang,GU Jin-guang,LI Si-chuan
Computer Science. 2010, 37 (12): 143-144. 
Data stream query processing has become a new and popular topic in database research. The key to improving non-blocking join algorithms is to improve the efficiency of the in-memory join stage. If there is no more space for an arriving tuple, some old tuples have to be flushed from memory to disk, so a good flush strategy is very helpful for join performance. The least frequently used tuples, identified from the result streams, are flushed from memory to disk so that the tuples that stay in memory generate more results. The statistics-based join algorithm improves performance noticeably and extends the adaptability of data stream relational join algorithms.
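The sketch below illustrates the flush policy described above under assumed structures: an in-memory hash table of tuples keyed by the join attribute, with a usage counter incremented whenever a key produces join results; when memory is full, the least frequently used key is flushed to disk.

```python
# Minimal sketch of an in-memory join stage with LFU flushing.

class MemoryJoin:
    def __init__(self, capacity):
        self.capacity = capacity
        self.table = {}        # key -> list of tuples in memory
        self.freq = {}         # key -> number of results produced
        self.size = 0
        self.disk = []         # stand-in for the on-disk partition

    def insert(self, key, tup):
        if self.size >= self.capacity:
            self._flush_lfu()
        self.table.setdefault(key, []).append(tup)
        self.freq.setdefault(key, 0)
        self.size += 1

    def probe(self, key):
        matches = self.table.get(key, [])
        if matches:
            self.freq[key] += len(matches)
        return matches

    def _flush_lfu(self):
        victim = min(self.freq, key=self.freq.get)
        evicted = self.table.pop(victim)
        self.disk.append((victim, evicted))   # joined later in the disk stage
        self.size -= len(evicted)
        del self.freq[victim]

j = MemoryJoin(capacity=2)
j.insert("a", ("a", 1)); j.insert("b", ("b", 2))
print(j.probe("a"))        # "a" becomes the hot key
j.insert("c", ("c", 3))    # forces eviction of the cold key "b"
print(sorted(j.table))
```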
Algorithm Combination of Hash and BitTable for Mining Frequent Itemsets
REN Yong-gong,SONG Kui-yong,KOU Xiang-xia
Computer Science. 2010, 37 (12): 145-148. 
In frequent itemset mining, many algorithms are based on Apriori. These algorithms have two common problems: first, a lot of memory is occupied because the entire database must be loaded; second, generating candidate itemsets and computing their supports take a lot of time. To improve efficiency, a BitTable-based frequent itemset mining algorithm, Hash-BFI, was proposed. The database is compressed into a BitTable in both the horizontal and vertical directions, saving considerable space; a hash function is used to compute the frequent two-itemsets; bitwise AND and OR operations are fully utilized to generate candidate itemsets and compute their supports; and pruning is performed. All these measures improve the efficiency of the algorithm.
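A minimal sketch of the BitTable idea follows: each item is stored as a bit vector over transactions, so support counting reduces to a bitwise AND and a popcount. The hash step of Hash-BFI is omitted.

```python
# Minimal sketch: vertical bit vectors and AND-based support counting.

transactions = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]
items = sorted({i for t in transactions for i in t})

# Bit k of an item's vector is set iff transaction k contains the item.
bits = {i: sum(1 << k for k, t in enumerate(transactions) if i in t)
        for i in items}

def support(itemset):
    v = ~0
    for i in itemset:
        v &= bits[i]
    return bin(v & ((1 << len(transactions)) - 1)).count("1")

min_sup = 2
pairs = [(x, y) for idx, x in enumerate(items) for y in items[idx + 1:]]
print([(p, support(p)) for p in pairs if support(p) >= min_sup])
```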
Research on Mining of Associated Events and Discovery of Roles Relationship
PENG Hui-liang,CAO Cun-gen
Computer Science. 2010, 37 (12): 149-155. 
Many researchers hold the opinion that events accord with normal human cognitive rules and are the basic unit of human cognition. It is the event, with a verb as its core, that constitutes the basic unit of dynamic process description, putting entity concepts into organic links and organization and enriching the static connections between them. But knowledge of events cannot be easily and directly acquired from text. A method that takes one event as the core event and mines the events associated with it was proposed. It acquires associated event chunks by extending binary words to multi-word semantic units, making full use of parsing; it then tags the roles of the event chunks and finally discovers the role relationships between the associated events and the core event. The experimental results show that the proposed method acquires a considerable number of associated events and role relationships from a restricted corpus and achieves high precision.
Skeleton Parsing Based on Multi-layer Maximum Entropy Model
GE Bin,FENG Xiao-sheng,TAN Wen-tang,XIAO Wei-dong
Computer Science. 2010, 37 (12): 156-160. 
The main task of skeleton parsing is to identify the skeleton of a sentence automatically, and Chinese skeleton parsing is a key problem in NLP. Because skeletons in the same context are interrelated, a Multi-layer Maximum Entropy Model (MMEM) for skeleton parsing was proposed. The low-layer ME analyzes the skeleton using context features, while the high-layer ME analyzes the skeleton using both the result of the low-layer ME and inter-sentence features. The experiments showed that MMEM is efficient for Chinese skeleton parsing. High precision was achieved with a small corpus, though the precision depends on the corpus scale: as the corpus grows, the precision of MMEM improves slowly.
Rough-set-based K-means Clustering Algorithm in Case Retrieval
CHEN Qian,XIANG Yang,GUO Xin,WANG Dong
Computer Science. 2010, 37 (12): 161-164. 
Retrieval efficiency falls as the number of cases in the database grows in a marketing-ontology-based case retrieval system, so how to effectively improve the efficiency of the case retrieval system is a serious problem. A K-means clustering algorithm based on rough sets was proposed, which clusters the thousands of marketing cases in the case retrieval system in order to find the center of each cluster, so that similarity need not be computed against every case. Experiments show that the method can greatly enhance the performance of the case retrieval system.
Crossbreeding Particle Swarm Optimization Algorithm Based on Dynamic Parameter
HUANG Wei,LUO Shi-bin,WANG Zhen-guo
Computer Science. 2010, 37 (12): 165-166. 
The particle swarm optimization (PSO) algorithm is easily trapped in local extrema, and its convergence speed and precision deteriorate in late evolution; furthermore, parameter selection affects the algorithm. Aiming at these disadvantages of PSO, and borrowing the crossbreeding concept from genetic algorithms, a new algorithm was proposed that introduces dynamic parameters into the velocity update equation, improving both the convergence speed and the convergence rate. Tests on the Levy No. 5 function show that the convergence speed and the average convergence rate of the new method are increased.
Research on Extension Reasoning Methods Based on Rough Set Data Analysis
ZHAO Rui,YU Yong-quan,ZHANG Jing
Computer Science. 2010, 37 (12): 167-170. 
In traditional extension transformation reasoning, extension transformation relies on historical information or personal experience, which restricts its application in intelligent reasoning. To solve this problem, an extension reasoning method based on rough set data analysis was proposed. The method first gains classification and rule knowledge by analyzing data with rough sets, then uses this information to guide extension reasoning, which makes extension transformation reasoning easier to control and more efficient, and lays a foundation for its application in intelligent reasoning.
Cancer Relevant Genes Selection Approach from Integrated Information
ZHANG Shu-bo, LAI Jian-huang
Computer Science. 2010, 37 (12): 171-174. 
With the advent of gene expression data, exploring cancer pathogenesis at the level of molecular biology has shown great promise. Exploring the key genes related to carcinoma is of great importance to cancer diagnosis and treatment. During the past few decades, many approaches were successfully proposed to select the key genes related to carcinoma. However, the gene sets selected by different methods are not always consistent with each other, which causes difficulty for biological interpretation and practical application. In this study, a novel approach based on integrated information was proposed for selecting key genes from gene expression data. Three kinds of information, namely information gain, global discriminant ability and local discriminant ability, are derived first; they are then weighted and integrated into a score characterizing the discriminant ability of each gene, and the gene set with the highest classification accuracy is selected as the key gene set. The experimental results showed that our approach performs better than those based on a single kind of information.
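The sketch below illustrates the score-integration idea with assumed definitions: information gain from a median split, a global discriminant ability from between-class mean separation, and a local one from nearest-neighbor label agreement; the weights are illustrative, not the paper's.

```python
# Minimal sketch: one integrated score per gene from three criteria.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def gene_score(x, y, w=(0.4, 0.4, 0.2)):
    # information gain after splitting samples at the gene's median
    high = x > np.median(x)
    ig = entropy(y) - (high.mean() * entropy(y[high]) +
                       (~high).mean() * entropy(y[~high]))
    # global ability: normalized distance between class means
    glob = abs(x[y == 0].mean() - x[y == 1].mean()) / (x.std() + 1e-9)
    # local ability: fraction of samples whose nearest neighbor shares its label
    d = np.abs(x[:, None] - x[None, :]) + np.eye(len(x)) * 1e9
    local = (y[d.argmin(axis=1)] == y).mean()
    return w[0] * ig + w[1] * glob + w[2] * local

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 20)
informative = rng.normal(y * 2.0, 1.0)    # expression shifts with the class
noise = rng.normal(size=40)
print(gene_score(informative, y), ">", gene_score(noise, y))
```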
Intelligent Three-dimensional Warehouse Bin Location Allocation Optimization Algorithm
ZHANG Yang-sen,LIU An-yu
Computer Science. 2010, 37 (12): 175-177. 
The intelligent warehouse plays an important role in improving storage utilization and production efficiency in modern enterprises. This article described the intelligent bin location allocation algorithm of the three-dimensional Warehouse Management System of Qinhuangdao Port Group Co., Ltd. The algorithm fully takes into account factors such as even weight distribution, nearest bin location, even distribution of spare-part species, spare-part usage ratio and the idle time of a bin; it computes the bin location for storage according to the current state of the whole warehouse and provides decision support for bin location selection. Actual project operation shows that the proposed integrated bin location allocation algorithm is effective, and is of great significance for improving warehouse safety, storage efficiency and so on.
Heuristic Polarity Decision Making Algorithm Based on Warning Propagation and DPLL Algorithm
QIN Yong-bin,XU Dao-yun,WANG Xiao-feng
Computer Science. 2010, 37 (12): 178-181. 
The warning propagation (WP) algorithm is an important foundation of message propagation algorithms; its essence is the iteration of warning messages on the factor graph. When the algorithm converges, it obtains a set of stable warning messages and a partial assignment of the formula's variables from the local cavity fields. An analysis of the basic principle of the WP algorithm was presented, and an improvement of the algorithm was given. Experiments on RB instances show that the improved algorithm needs fewer iterations, less execution time and converges faster than the original WP algorithm. However, on most RB instances the WP algorithm does not converge, and then it cannot solve the formula effectively. Combining WP with the DPLL algorithm can reduce the number of backtracking computations and thus make up for this shortcoming of the WP algorithm. Experimental results on RB instances show that the method is effective.
Text Affective Computing from Cognitive Perspective
XU Lin-hong,LIN Hong-fei
Computer Science. 2010, 37 (12): 182-185. 
This paper applied cognitive pragmatics and emotion psychology to text affective computing. Based on Lazarus's cognitive appraisal theory and the cognitive context of cognitive pragmatics, a new text affective model was presented, which processes negation by acquiring contradictory emotions from an affective corpus and utilizes affective schemata to improve the precision of affective recognition. This paper broadened the dimensions of the research and provided a possible candidate solution for text affective computing. The experimental results show that the text affective cognition model is effective.
Natural Gradient Reinforcement Learning Algorithm with TD(λ)
CHEN Sheng-lei,GU Rui-jun,CHEN Geng,XUE Hui
Computer Science. 2010, 37 (12): 186-189. 
In recent years, policy gradient methods have aroused extensive interest in reinforcement learning owing to their excellent convergence properties. Natural gradient algorithms were investigated in this paper. To resolve the low efficiency of gradient estimation in present algorithms, the TD(λ) method was used to approximate the value functions when estimating the gradient. The eligibility traces in TD(λ) make the propagation of learning experience more efficient; as a result, the variance of the gradient estimate is decreased and the convergence speed is improved. A simulation experiment on the cart-pole balancing system demonstrates the effectiveness of the algorithm.
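To illustrate the TD(λ) component, here is a minimal sketch of tabular value estimation with accumulating eligibility traces on a small random-walk chain; the natural-gradient policy update of the paper is not reproduced, and all parameter values are illustrative.

```python
# Minimal TD(lambda) sketch: 5-state random walk, reward 1 on the right exit.
import numpy as np
rng = np.random.default_rng(0)

n_states, alpha, gamma, lam = 5, 0.1, 1.0, 0.8
w = np.zeros(n_states)                   # value estimate per state

for _ in range(2000):
    s, z = 2, np.zeros(n_states)         # start in the middle; reset traces
    while True:
        s2 = s + rng.choice([-1, 1])
        done = s2 < 0 or s2 >= n_states
        reward = 1.0 if s2 >= n_states else 0.0
        delta = reward + (0.0 if done else gamma * w[s2]) - w[s]
        z = gamma * lam * z              # decay all traces,
        z[s] += 1.0                      # bump the visited state (accumulating)
        w += alpha * delta * z           # propagate the TD error along traces
        if done:
            break
        s = s2

print(np.round(w, 2))   # approximately [1/6, 2/6, 3/6, 4/6, 5/6]
```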
Research on Multi-objective Evolutionary Algorithm Based on Island Model
ZHAO Feng-qiang,XU Yi,LI Guang-qiang
Computer Science. 2010, 37 (12): 190-192. 
Recently, research on multi-objective evolutionary algorithms based on the Pareto optimality concept has become a hotspot, and such algorithms have been widely applied in engineering. This paper presented a parallel non-dominated sorting genetic multi-objective evolutionary algorithm (PNSMEA) based on NSGA-II. PNSMEA adopts the island model: the population is divided into several sub-populations that evolve separately, and the sub-populations exchange good individuals every few generations, which preserves individual diversity and broadens the search domain of each sub-population. PNSMEA adopts an arithmetic crossover operator to overcome the weak search capability of the SBX operator used by NSGA-II. The test results show that PNSMEA not only alleviates the premature convergence of NSGA-II and improves its search capability in isolated regions, but also helps to obtain Pareto solution sets with better distribution.
Permutation Ant Colony System for Heterogeneous DAG Scheduling Problem
DENG Rong,CHEN Hong-zhong,WANG Bo,WANG Xiao-ming,LI Can
Computer Science. 2010, 37 (12): 193-196. 
Reducing the execution time of distributed programs is a major issue for grid scheduling systems. Because the scheduled programs are modeled by DAGs, this problem is also called the heterogeneous DAG scheduling problem. The Permutation Scheduling Ant Colony System (PSACS) proposed in this paper represents a solution as a task permutation list and utilizes the standard ACO search technique to explore the solution space. Experimental results indicate that PSACS outperforms GA and PSO substantially: it finds the global optimum for the majority (65%) of homogeneous DAG scheduling problems and pretty good solutions for heterogeneous DAG scheduling problems.
Construction and Properties of Variable Threshold Object-oriented Concept Lattices
SONG Xiao-xue,ZHANG Wen-xiu,LI Hong
Computer Science. 2010, 37 (12): 197-200. 
In this paper we discussed the conceptual reduction of fuzzy object-oriented concept lattices. Three kinds of variable threshold concept lattices, i.e., crisp-crisp, crisp-fuzzy and fuzzy-crisp variable threshold object-oriented concept lattices, were considered, and their properties and mutual relations were discussed. The results show that the number of concepts in a variable threshold object-oriented concept lattice is smaller than that in the fuzzy object-oriented concept lattice, while the important concepts are preserved.
Study on Calculation of Dynamic Displacement from Time-frequency Integration of Acceleration
WANG Jian-feng,MA Jian,MA Rong-gui,SONG Hong-xun
Computer Science. 2010, 37 (12): 201-202. 
A new method was presented that obtains dynamic displacement from acceleration through mixed integration in the time and frequency domains. The method performs one integration in the frequency domain and one in the time domain. A comparative analysis of the calculated result against a test result measured by an accelerometer shows that obtaining dynamic displacement by means of mixed time-frequency integration can significantly reduce the error.
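For the frequency-domain half of the idea, here is a minimal sketch, assuming a zero-mean acceleration record: divide the FFT of the acceleration by (iω)² to obtain displacement, suppressing the near-DC bins that would otherwise blow up. The paper's mixed time-frequency scheme is more elaborate.

```python
# Minimal sketch: double integration of acceleration in the frequency domain.
import numpy as np

def freq_integrate_twice(acc, fs, f_low=0.5):
    n = len(acc)
    A = np.fft.rfft(acc)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    D = np.zeros_like(A)
    keep = f >= f_low                      # high-pass: drop drifting bins
    D[keep] = A[keep] / (1j * 2 * np.pi * f[keep]) ** 2
    return np.fft.irfft(D, n)

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
disp_true = 0.01 * np.sin(2 * np.pi * 5 * t)          # 5 Hz, 1 cm amplitude
acc = -(2 * np.pi * 5) ** 2 * disp_true               # analytic acceleration
disp = freq_integrate_twice(acc, fs)
print(np.max(np.abs(disp - disp_true)))               # small residual error
```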
Microarray Data Classification Based on Principal Curves
QI Yun-song,SUN Huai-jiang
Computer Science. 2010, 37 (12): 203-205. 
In this paper, a novel classifier using principal curves was proposed to classify microarray data. Principal curves are the non-linear generalization of principal components; intuitively, a principal curve 'passes through the middle of the data cloud'. As a new classification technique, the principal-curve-based classifier (PC) computes a principal curve for each class from the training data, and a test sample is assigned the class label of the principal curve closest to it according to the expected squared error. Experimental results illustrate that the performance of PC is better than other existing approaches when very small microarray sample sizes are concerned.
Learning Algorithm and Properties on a Class of Fuzzy Hopfield Networks
ZENG Shui-ling,YANG Jing-yu,XU Wei-hong
Computer Science. 2010, 37 (12): 206-208. 
In this paper, an efficient learning algorithm was proposed for a class of fuzzy Hopfield networks based on T-norms (Max-T FHNNs). For any given set of patterns, when T is a left-continuous T-norm, the learning algorithm finds the maximum of all connection weight matrices that make the set a set of equilibrium points of the Max-T FHNN. This maximal matrix is idempotent in the sense of Max-T composition, with which the Max-T FHNN converges to a stable state in one iteration for any input vector. It is proved theoretically that an arbitrary set of patterns can become a set of equilibrium points of every Max-T FHNN provided T is a left-continuous T-norm. The Max-TL FHNN has universally good robustness to perturbations of the training patterns.
Design and Implementation of an Audio Classification System Based on SVM
SUN Wen-jing, LI Shi-qiang
Computer Science. 2010, 37 (12): 209-210. 
The time-domain features of audio and their extraction methods were analyzed. The process and architecture of an SVM-based audio classification system were studied, and the SVM audio classifier was designed. The experimental results show that the audio classification system based on SVM designed in this paper can classify audio effectively, and the average identification accuracy reaches more than 90%.
Local Sensitive Nonnegative Matrix Factorization
JIANG Wei, YANG Bing-ru,SUI Hai-feng
Computer Science. 2010, 37 (12): 211-214. 
Non-negative matrix factorization (NMF) is a matrix decomposition method based on parts-based learning, reflecting the human cognition that parts constitute the whole. However, it only finds two non-negative matrices whose product approximates the non-negative data matrix, without considering the geometric structure and the discriminative information in the data. We presented local sensitive non-negative matrix factorization for dimensionality reduction to overcome this disadvantage; it preserves not only non-negativity but also the geometric structure and discriminative information of the data. An efficient multiplicative updating procedure was produced, and its convergence was guaranteed theoretically. Experiments on the ORL and Yale face recognition databases demonstrate that the proposed method outperforms many existing dimensionality reduction methods.
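For reference, here is a minimal sketch of the standard NMF multiplicative updates (Lee and Seung) for the Frobenius objective; the locality-preserving and discriminative terms added by the proposed method are omitted.

```python
# Minimal NMF sketch: V is approximated by W @ H with W, H >= 0.
import numpy as np
rng = np.random.default_rng(0)

def nmf(V, r, iters=500, eps=1e-9):
    n, m = V.shape
    W, H = rng.random((n, r)), rng.random((r, m))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H

V = np.abs(rng.normal(size=(20, 15)))
W, H = nmf(V, r=4)
print("reconstruction error:", np.linalg.norm(V - W @ H))
```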
Three-dimensional Gait Planning for Humanoid Robots Based on Decoupling Synthesis and ZMP Algorithm
WANG Zhi-liang,YU Guo-chen,XIE Lun
Computer Science. 2010, 37 (12): 215-217. 
This paper presented a new gait planning method for humanoid robots. The gait of the humanoid robot was simplified as a 7-link model, and the lateral gait as a 5-link model; then, under the same Z coordinate, the three-dimensional gait was synthesized. Finally the method was validated and simulated through the ZMP equation, and, combined with analysis of the actual system and its operational status, the effectiveness of the proposed planning method was verified.
Lattice Implication Algebras, FI-algebras, and MV-algebras Based on the Interval Sets
XUE Zhan-ao,DU Hao-cui,YIN Hacr-zhe,XIAO Yun-hua
Computer Science. 2010, 37 (12): 218-223. 
Interval sets are a new and important research direction and have been widely applied in approximate reasoning, fuzzy control, and so on. On interval sets, a new interval implication (→) was redefined, the lattice implication algebra was reconstructed, and its properties were discussed. Meanwhile, commutative FI-algebras and MV-algebras were also redefined on interval sets. The three different algebraic systems, lattice implication algebras, commutative FI-algebras and MV-algebras, were proved to be equivalent.
Particle Swarm Optimization Algorithm Based on Extreme Value of Sub-swarm and Sharing Redistribution
GONG Yan,ZHANG Hao
Computer Science. 2010, 37 (12): 224-226. 
To improve the efficiency of particle swarm optimization, this paper proposed a novel PSO algorithm (ESPSO) whose basic ideas are a sub-swarm mechanism and sharing redistribution. The main contributions are: (1) the whole swarm is divided into n sub-swarms, each searching for solutions independently; (2) a sub-swarm extreme value strategy is introduced to enable particle interaction; (3) a sharing function is introduced to redistribute some particles. Experiments on four benchmark functions show that the new algorithm increases the success rate by 64%~93% compared with IPPSO.
Algorithm for Scenes with Complex Indirect Illumination
ZHU Zhen-xing, XU Xiao-yang,PAN Jin-gui
Computer Science. 2010, 37 (12): 227-229. 
A global illumination algorithm called Ercuts was proposed to render scenes with complex indirect illumination. Ercuts adopts bidirectional path tracing to initialize the scene and generates Markov chains of the same length by ER (Energy Redistribution) sampling for each pixel. After that, a VPL (Virtual Point Light) set is developed to simulate the Markov chains. Ercuts also accelerates the computation of the VPLs by building a light tree from the VPL set and traversing it to approximate the illumination from the VPLs with a strongly sublinear cost. The experimental results indicate that, compared with traditional bidirectional path tracing, Ercuts works more effectively for scenes with complex indirect illumination.
Denoising Method with Gradient Fidelity Term on Smoothing Region
WANG Xu-dong,FENG Xiang-chu,CHEN Li-xia
Computer Science. 2010, 37 (12): 230-233. 
Image denoising based on total variation may cause a 'staircase effect' while the noise is removed. Using a coupled gradient fidelity term can effectively restrain the staircase effect, but it blurs edges. This paper discussed how to detect the smooth regions of an image, and proposed three denoising methods with the gradient fidelity term applied on the smooth regions. Our methods not only improve the denoising performance significantly, but also overcome the staircase effect while preserving edges. Numerical experiments demonstrate that our models are efficient.
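As background, here is a minimal sketch of plain total-variation denoising by gradient descent, with an intensity fidelity term; the region-restricted gradient fidelity term of the paper is not modeled, and all parameters are illustrative.

```python
# Minimal TV-denoising sketch: u evolves by curvature flow plus data fidelity.
import numpy as np

def tv_denoise(f, lam=0.1, tau=0.2, iters=200, eps=1e-6):
    u = f.copy()
    for _ in range(iters):
        ux = np.gradient(u, axis=1)
        uy = np.gradient(u, axis=0)
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        # divergence of the normalized gradient = TV curvature term
        div = np.gradient(ux / mag, axis=1) + np.gradient(uy / mag, axis=0)
        u += tau * (div - lam * (u - f))
    return u

rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0   # a white square
noisy = clean + 0.2 * rng.normal(size=clean.shape)
print(np.abs(tv_denoise(noisy) - clean).mean(),
      "<", np.abs(noisy - clean).mean())                # error is reduced
```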
Multichannel Blind Image Restoration
XIAO Su,HAN Guo-qiang,WO Yan
Computer Science. 2010, 37 (12): 234-237. 
To make good use of the information in the observed images and the complementarity between those pieces of information so as to improve image restoration, this paper proposed a blind restoration algorithm for multichannel images. Firstly, the prior models of the original image, the point spread functions and the observed images were constructed, from which their prior distributions were obtained; secondly, the Gamma distribution was used to describe the unknown model parameters; finally, based on maximum a posteriori inference, the optimal original image and point spread functions were estimated using the evidence analysis method. Experiments show that, compared with single-channel algorithms, multichannel blind image restoration obtains better results; meanwhile, the proposed algorithm shows competitive restoration performance compared with some state-of-the-art algorithms.
3D Skeleton Algorithm Using Level-progressive
SUN Xiao-peng,ZHANG Qi
Computer Science. 2010, 37 (12): 238-240. 
In this paper, we introduced a new algorithm for extracting skeletal curves. First, a few prominent feature faces are computed by multi-dimensional scaling (MDS), and the 3D model is preprocessed by clustering around the centers of these feature faces. Second, using the feature faces as seeds, a hierarchical segmentation of the mesh is obtained from Gaussian curvature with a k-ring strip growing algorithm. The center of each level is then calculated and the centers are connected into skeleton lines, yielding the whole skeleton. Experimental results show that the method is correct and effective.
Spectral Reflectance Estimation by Support Vector Regression
ZHANG Wei-feng
Computer Science. 2010, 37 (12): 241-242. 
A spectral reflectance estimation method using support vector regression with a framelet kernel was proposed. Spectral reflectance estimation is an important subject in optical research; the aim is to convert device-dependent RGB values to device- and illuminant-independent reflectance spectra. Regression methods, such as regularized least squares with polynomial models and kernel-based regularized least squares, are widely used to estimate the spectral reflectance of surface colors from their camera responses. In this paper, we introduced a novel estimation approach based on support vector regression. The proposed approach utilizes a framelet-based kernel, which can approximate functions with multiscale structure and reduce the influence of noise in the data. Experimental results show that the technique improves recovery accuracy and stability.
Fast Fuzzy C-means Algorithm Based on Entropy Constraint for Underwater Image Segmentation
WANG Shi-long,XU Yu-ru,WAN Lei,TANG Xu-dong
Computer Science. 2010, 37 (12): 243-246. 
The mission of the vision system of an autonomous underwater vehicle (AUV) is to process information about objects in a complex environment rapidly and exactly, so that the AUV can use the result for its next task. Aiming at fast image segmentation under the precondition of high quality, a fast fuzzy C-means algorithm based on an entropy constraint was proposed for underwater image segmentation, which comprehensively considers the gradient operator, the histogram's statistical characterization, sampling-based computation and the relative information loss; the regularity for choosing the fuzzy weighting exponent m in the new algorithm was studied in detail by means of underwater image segmentation results and validity indexes of fuzzy partitions. Experimental results show that the novel algorithm obtains better segmentation results with improved time efficiency, satisfying the strict real-time requirements of an AUV.
Analysis Method for the Projection Error of Circle Center in 3D Vision Measurement
HAN Jian-dong,YANG Hong-ju
Computer Science. 2010, 37 (12): 247-249. 
Abstract PDF(226KB) ( 1359 )   
RelatedCitation | Metrics
Aiming at the misalignment between the perspective projection of the circle center and the center of the imaged ellipse when the circle plane and the image plane are not parallel, an analysis method for the projection error of the circle center was proposed. The mathematical formulation of the projection error was deduced by building the mapping relation between the image plane and the circle plane, and the factors influencing the error were analyzed by computer simulation. The method provides a valuable theoretical reference for evaluating the measurement error of the circle center and for camera planning, and is valuable in engineering applications of vision measurement.
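The misalignment itself is easy to reproduce numerically. The following self-contained simulation (a hypothetical pinhole setup, not the paper's formulation) projects a tilted 3D circle, fits the image conic, and compares the fitted ellipse center with the projection of the true circle center:

```python
import numpy as np

f = 800.0                                    # focal length in pixels
R, center = 1.0, np.array([0.0, 0.0, 10.0])  # circle radius and 3D center
t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
# circle in a plane tilted 45 degrees about the x-axis
pts = center + np.stack([R * np.cos(t),
                         R * np.sin(t) * np.cos(np.pi / 4),
                         R * np.sin(t) * np.sin(np.pi / 4)], axis=1)
uv = f * pts[:, :2] / pts[:, 2:3]            # perspective projection
# least-squares conic fit: a u^2 + b uv + c v^2 + d u + e v + 1 = 0
u, v = uv[:, 0], uv[:, 1]
A = np.stack([u * u, u * v, v * v, u, v], axis=1)
a, b, c, d, e = np.linalg.lstsq(A, -np.ones_like(u), rcond=None)[0]
ellipse_center = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
projected_center = f * center[:2] / center[2]
print(ellipse_center - projected_center)     # nonzero: the two centers differ
```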
Research on Fast Volume Rendering of Medical Image with Large Volume Data
WANG Yang-ping,DANG Jian-wu,DU Xiao-gang,LI Sha,TIAN Zhong-ze
Computer Science. 2010, 37 (12): 250-251. 
Abstract PDF(281KB) ( 593 )   
RelatedCitation | Metrics
In order to perform fast volume rendering of large medical volume data while assuring rendering quality, a fast volume rendering method was presented. Firstly, the large medical volume data was divided into equal-size blocks, and then a set of visibility tests such as empty block space skipping, early block termination and early ray termination were used to speed up the whole rendering process. Finally, volume rendering pre-integration was utilized to improve the performance of volume rendering. The experimental results show that the proposed method speeds up rendering of large medical volume data without loss of image quality.
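Two of the accelerations mentioned above compose naturally in a front-to-back ray-casting loop; a minimal sketch (our own structure and names) is:

```python
import numpy as np

def cast_ray(samples_rgba, block_empty, samples_per_block=16, t_opaque=0.99):
    # samples_rgba: (n, 4) pre-classified samples along one ray;
    # block_empty[b]: True if block b along the ray contains no visible data
    color, alpha = np.zeros(3), 0.0
    for b, empty in enumerate(block_empty):
        if empty:                            # empty block space skipping
            continue
        s = samples_rgba[b * samples_per_block:(b + 1) * samples_per_block]
        for sample in s:
            rgb, a = sample[:3], sample[3]
            color += (1.0 - alpha) * a * rgb     # front-to-back compositing
            alpha += (1.0 - alpha) * a
            if alpha >= t_opaque:            # early ray termination
                return color, alpha
    return color, alpha
```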
Feature-preserving Surface Reconstruction from Scattered Points——Variational Level-set Method
JING Zhu-cui,LI Ming,XU Guo-liang
Computer Science. 2010, 37 (12): 252-254. 
Abstract PDF(307KB) ( 339 )   
RelatedCitation | Metrics
The aim of this paper is to reconstruct a surface from scattered points while preserving or enhancing sharp features. This goal is achieved by establishing a new energy model based on the difference of the principal curvatures. We derived a nonlinear partial differential equation and solved it for the unknown level-set function using the finite-element method. Numerical experiments show that the proposed method is effective.
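The paper's exact energy is not printed in the abstract; a generic variational level-set energy of the kind described, in our notation, couples a data-fidelity term with a term in the principal-curvature difference:

```latex
E(\phi) = \int_{\Omega} d^{2}(\mathbf{x})\,\delta(\phi)\,|\nabla\phi|\,\mathrm{d}\mathbf{x}
        + \lambda \int_{\Omega} g(\kappa_{1}-\kappa_{2})\,\delta(\phi)\,|\nabla\phi|\,\mathrm{d}\mathbf{x}
```

Here d(x) would be the distance from x to the nearest scattered point, κ1 and κ2 the principal curvatures of the implicit surface, and g a weight on their difference chosen so that sharp features (large |κ1 − κ2|) are preserved; gradient descent on such an E yields a nonlinear PDE in φ that is then discretized by finite elements.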
Research on Tracking Topological Changes in Family of Objects Model
SUN Li-juan,LIU Xian-guo,YU Chun-feng,GAO Shang-min
Computer Science. 2010, 37 (12): 255-258. 
Abstract PDF(345KB) ( 335 )   
RelatedCitation | Metrics
A novel algorithm for tracking topological changes of a family of object models was proposed: relations between parameters and model topology were established, critical values of the parameters were computed, dependent entities of the models were determined, stable intervals and parametric ranges were computed, and finally the topological changes of the family of object models were tracked accurately. The algorithm was used in our HUST-CAD system, improving the intelligence of the system and determining parametric ranges for designers.
Tree Simulation Method Based on Diameter Variety Rate
LIU Xiao-dan,DONG Ge,SUN Hong-yan
Computer Science. 2010, 37 (12): 259-261. 
Abstract PDF(292KB) ( 418 )   
RelatedCitation | Metrics
A tree simulation method based on diameter variation rate was proposed. Based on the L-system, the diameter variation rate of branches and trunks was incorporated into the tropism mechanics model, and parameters were applied to control the local shape and spatial location of branches; different species correspond to different relations between the parameters and the diameter variation rate. The method can control the tree's final shape conveniently and generates more natural-looking graphs.
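A toy parametric L-system conveys the idea: each branch symbol carries a length and a diameter, and the production scales child diameters by a rate q (the names, production and values below are illustrative, not the paper's model):

```python
def grow(string, depth, q=0.7, shrink=0.8):
    # string: list of (symbol, length, diameter); 'F' draws a branch,
    # '[' / ']' push and pop the turtle state in a later drawing pass
    if depth == 0:
        return string
    out = []
    for name, l, d in string:
        if name == 'F':
            out += [('F', l, d),
                    ('[', 0, 0), ('F', shrink * l, q * d), (']', 0, 0),
                    ('[', 0, 0), ('F', shrink * l, q * d), (']', 0, 0)]
        else:
            out.append((name, l, d))
    return grow(out, depth - 1, q, shrink)

tree = grow([('F', 1.0, 0.1)], depth=4)   # symbols ready for a turtle pass
```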
Research on Free Modeling Method Based on Free Hand Sketching
ZHAO Na,GUO Li,YUAN Hong-xing
Computer Science. 2010, 37 (12): 262-265. 
Abstract PDF(378KB) ( 360 )   
RelatedCitation | Metrics
A method of free modeling based on freehand sketching was proposed. Through operations including smoothing, denoising, resampling and curve fitting, the sketch strokes obtained from users were transformed into sequences of evenly spaced points, from which the geometric information of the target model was extracted. According to this geometric information, a three-dimensional mesh model was generated; the drawing of a free body with a particular shape can be completed after adding texture mapping and illumination rendering to the 3D mesh model. Experimental results show that the method can easily and quickly achieve three-dimensional shape modeling in line with the user's sketching intention.
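The resampling stage mentioned above is the part that is easy to pin down; a minimal arc-length resampler (our sketch, ignoring the paper's smoothing and curve-fitting details) is:

```python
import numpy as np

def resample_stroke(points, spacing):
    # points: (n, 2) raw stroke samples; returns evenly spaced points
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])      # cumulative arc length
    targets = np.arange(0.0, s[-1], spacing)
    x = np.interp(targets, s, pts[:, 0])
    y = np.interp(targets, s, pts[:, 1])
    return np.stack([x, y], axis=1)
```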
Legalization Algorithm for Macro Problems
GAO Wen-chao,CHEN Fu-zhen,YAN Hai-xia,LU Yong-qiang,QIAN Xu,ZHOU Qiang
Computer Science. 2010, 37 (12): 266-269. 
Abstract PDF(324KB) ( 496 )   
RelatedCitation | Metrics
Placement legalization removes the overlap among cells after global placement and moves the cells to their final positions. In mixed-mode placement, macros make legalization harder to solve. According to the structural characteristics of macros, and considering the site constraint, an algorithm for macro legalization was designed and implemented. The algorithm was tested on mPL6 global placement results; experimental results show its reasonable quality compared with FastPlace.
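For orientation, a classic Tetris-style greedy baseline for standard-cell legalization on a site grid can be sketched as follows; the paper's macro-aware algorithm is more involved, and everything below is illustrative:

```python
import math

def legalize(cells, rows, site_w):
    # cells: (x, y, width) from global placement; rows: legal row y-coords
    frontier = {y: 0.0 for y in rows}            # first free x per row
    placed = []
    for x, y, w in sorted(cells):                # sweep left to right
        row = min(rows, key=lambda r: abs(r - y) + max(frontier[r] - x, 0))
        lx = site_w * math.ceil(max(frontier[row], x) / site_w)  # site snap
        frontier[row] = lx + w                   # no overlap within a row
        placed.append((lx, row, w))
    return placed
```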
One Management Strategy of Reconfigurable Resource Using Graph Theory
ZHANG Hong-lie,ZHANG Guo-yin,CONG Wan-suo,HU Hai-yan
Computer Science. 2010, 37 (12): 270-274. 
Abstract PDF(406KB) ( 367 )   
RelatedCitation | Metrics
The management of reconfigurable resources is an important task for the operating system of a reconfigurable system. This paper presented the UPFS algorithm for FPGAs based on graph theory. The main idea of UPFS is to map the free space of the FPGA to an undirected graph, then calculate the largest loops and chains using the adjacency matrix and direction vectors, and finally find a set of the largest rectangles satisfying the conditions. Simulation results show that, compared with existing algorithms, UPFS is a feasible management strategy which can reduce the waste of system resources and decrease the time of hardware allocation.
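The end goal, finding the largest free rectangles in the FPGA's occupancy grid, can also be reached with the standard histogram-and-stack method; we sketch that here as a point of comparison with the paper's undirected-graph formulation (names are ours):

```python
import numpy as np

def largest_free_rectangle(free):
    # free: 2D bool array, True where the FPGA cell is unoccupied
    best = (0, None)                         # (area, (row, col, h, w))
    heights = np.zeros(free.shape[1], dtype=int)
    for r, row in enumerate(free):
        heights = np.where(row, heights + 1, 0)   # column histogram
        stack = []                                # (start_col, height)
        for c, h in enumerate(np.append(heights, 0)):  # 0 = sentinel
            start = c
            while stack and stack[-1][1] >= h:
                s, sh = stack.pop()
                if sh * (c - s) > best[0]:
                    best = (sh * (c - s), (r - sh + 1, s, sh, c - s))
                start = s
            stack.append((start, h))
    return best
```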
Effective Multi-objective Genetic Algorithm for Hardware-software Partitioning
LUO Li,XIA Jun,HE Hong-jun,LIU Han
Computer Science. 2010, 37 (12): 275-279. 
Abstract PDF(472KB) ( 411 )   
RelatedCitation | Metrics
Hardware/software partitioning is one of the critical steps in the hardware/software codesign flow and has a very important influence on the final design. In terms of the number of optimization objectives, it can be classified into single-objective and multi-objective partitioning. Multi-objective partitioning is an NP-hard problem, and a multi-objective partitioning algorithm usually yields a set of mutually incomparable Pareto results rather than a single traditional optimum. Genetic algorithms suit multi-objective partitioning well because of their parallel population-based search. The fitness function was investigated and a redefined fitness function was proposed, which adopts self-adaptive parameters and a penalty function to escape from premature convergence and improve evolution speed. Practical experimental results demonstrate that this algorithm is more efficient at balancing all the parameters to optimize multiple system objectives under constraints such as execution time, cost, hardware area and power.
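A penalized, weighted fitness of the general kind described can be sketched in a few lines; the weights, the dict interface and the generation-scaled penalty below are our illustrative choices, not the paper's exact self-adaptive scheme:

```python
def fitness(ind, gen, max_gen, w=(0.5, 0.3, 0.2), area_limit=100.0):
    # ind: dict with estimated 'time', 'cost', 'area' of one partition
    raw = w[0] * ind['time'] + w[1] * ind['cost'] + w[2] * ind['area']
    # penalty grows with the generation, so infeasible individuals may
    # survive early exploration but are driven out near convergence
    penalty = (gen / max_gen) * max(0.0, ind['area'] - area_limit)
    return 1.0 / (raw + penalty + 1e-9)      # higher fitness is better

print(fitness({'time': 40, 'cost': 20, 'area': 120}, gen=30, max_gen=100))
```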
Fuzzy Neural Network Based Analog Circuit Fault Diagnosis Using Genetic Algorithms
ZHU Yan-qing,HE Yi-gang
Computer Science. 2010, 37 (12): 280-282. 
Abstract PDF(240KB) ( 392 )   
RelatedCitation | Metrics
A systematic approach combining a fuzzy neural network, wavelet analysis and a genetic algorithm was proposed for fault diagnosis of analogue circuits. The presented fuzzy neural network was developed with an improved fuzzy weighted reasoning method. The optimal feature sets were extracted to train the network by using wavelet analysis as a preprocessor; this ensures a simple architecture for the neural network and minimizes the size of the training set required for proper training. The adjustment of connection weights and the optimization of membership functions were performed with genetic algorithms. The reliability of this method was verified with active filter examples. The results of experimental tests show that this method can satisfactorily detect and identify the faults: it not only distinguishes the ambiguity sets and some misclassified faults that other methods cannot identify, but also trains the network faster.
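The wavelet preprocessing stage can be illustrated with PyWavelets: compress each sampled circuit response into subband energies used as network inputs (the wavelet, decomposition level and energy features are our illustrative choices):

```python
import numpy as np
import pywt

def wavelet_features(response, wavelet='db4', level=3):
    # response: 1D sampled output of the circuit under test
    coeffs = pywt.wavedec(response, wavelet, level=level)
    return np.array([float(np.sum(np.square(c))) for c in coeffs])

features = wavelet_features(np.sin(np.linspace(0, 20, 256)))  # demo signal
```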
Design and Implementation of Object-based Storage Controller Based on SOC
GUO Yu-feng,LI Qiong,LUO Li,LIU Guang-ming
Computer Science. 2010, 37 (12): 283-286. 
Abstract PDF(377KB) ( 503 )   
RelatedCitation | Metrics
Object-based storage repartitions the traditional file system functionality and offloads the storage management functions to intelligent storage devices. By exploiting the computing power of intelligent storage devices and using an object interface to enhance the performance and security of the storage system as well as data sharing across platforms, object-based storage has been widely studied and applied. The object-based storage controller is the key component of an object-based storage system and has an important effect on its performance. The design and implementation of an object-based storage controller based on an SOC were put forward in this paper; the experimental results show that our controller performs well in terms of system performance, reliability, cost and power. Finally, some parallel optimization methods that we are studying were introduced.
Design and Implementation of Scalability Based on High Performance PCs Cluster
ZHU Yong-zhi,TIAN Tian
Computer Science. 2010, 37 (12): 287-291. 
Abstract PDF(400KB) ( 331 )   
RelatedCitation | Metrics
Scalability is an important concept for analyzing the performance of parallel computing systems. Heterogeneous systems have become more common, but there are few studies of their scalability. This paper first presented a definition of efficiency that fits both homogeneous and heterogeneous parallel computing systems. Based on this definition, the paper also presented an iso-efficiency model that can be applied to both heterogeneous and homogeneous systems. From this model, one can analyze how the system size and the problem size must change together for the efficiency to remain constant. Finally, the paper presented some experiments whose results show that this model can estimate well the efficiency and scalability of parallel computing systems.
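One way to read the unified efficiency definition (our interpretation of the abstract, not the paper's printed formula) is to weight each node by its relative speed, so that the homogeneous case reduces to the textbook E = T_seq / (p * T_par):

```python
def efficiency(t_seq, t_par, speeds):
    # speeds: per-node speeds relative to the node that measured t_seq;
    # with p equal nodes this reduces to t_seq / (p * t_par)
    return t_seq / (t_par * sum(speeds))

print(efficiency(100.0, 30.0, [1, 1, 1, 1]))      # homogeneous: ~0.833
print(efficiency(100.0, 60.0, [1.0, 0.5, 0.5]))   # heterogeneous: ~0.833
```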