Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 41 Issue 11, 14 November 2014
Survey of Case-based Reasoning Based on Description Logics
SUN Jin-yong,GU Tian-long and CHANG Liang
Computer Science. 2014, 41 (11): 1-6.  doi:10.11896/j.issn.1002-137X.2014.11.001
Case-based reasoning based on description logics (DL-based CBR) is one of the hot research topics in CBR. Firstly, the origin of CBR was introduced. Next, the development history of DL-based CBR was reviewed. Then research works on DL-based CBR were investigated comprehensively from four aspects: case representation and organization, case retrieval, case revision and case base maintenance. Finally, some existing problems were pointed out and correspondingly some future research directions were proposed.
Survey on Intelligent Transportation System
ZHAO Na,YUAN Jia-bin and XU Han
Computer Science. 2014, 41 (11): 7-11.  doi:10.11896/j.issn.1002-137X.2014.11.002
Traffic problems such as congestion and environmental pollution can be regarded as the contradiction among people, vehicles and roads, and the Intelligent Transportation System (ITS) is the only solution to this contradiction. Based on current comprehensive research on ITS in China and abroad, this paper emphatically summarized ITS development and further discussed the problems and challenges facing our country. According to our national circumstances, we proposed to advance ITS by introducing technologies such as the Internet of Things, cloud computing and data mining, and to strengthen standardization by encouraging the government, enterprises and universities to work together. The level of national ITS can be improved by establishing industry alliances to promote the integration of China's industrial chain and by combining ITS with the logistics industry.
Legitimacy Analysis of Reverse Engineering Technique towards Network Coding
DU Xing-zhou,XU Chao and MENG Zhao-peng
Computer Science. 2014, 41 (11): 12-15.  doi:10.11896/j.issn.1002-137X.2014.11.003
Network coding is a significant breakthrough in the field of network communications. It transforms the store-and-forward mode of traditional networks and combines the information processing technologies of routing and coding, bringing great progress in network transmission efficiency. In the course of technology research and application, the involvement of network reverse engineering has drawn widespread attention and discussion of intellectual property issues across the fields of IT and law, especially on the topic of fair use. This article first studied the technical features of network coding, and then analyzed the legitimacy of network reverse engineering towards network coding upon the theories of fair use and network coding, coming up with proper principles.
SLP Exploitation Method for Type Conversion Statements
ZHAO Bo,ZHAO Rong-cai,LI Yan-bing and GAO Wei
Computer Science. 2014, 41 (11): 16-21.  doi:10.11896/j.issn.1002-137X.2014.11.004
With the rapid development of multimedia technology, more and more processors are integrated with SIMD (Single Instruction Multiple Data) extensions, and almost all current compilers are equipped with automatic vectorization. The SLP (Superword Level Parallelism) method has been introduced into some compilers in order to exploit intra-iteration parallelism. Frequent data type conversions are required when handling multimedia data because of its characteristics of intensive storage and regular computation. However, the processing capacity of the current SLP technique is not sufficient. In order to exploit more opportunities for SLP vectorization in programs with large amounts of type conversion statements, an SLP vectorization method was proposed to deal with type conversion statements. This method is able to handle type conversion statements with the same or different vector factors using data regrouping in the framework of SLP vectorization. The experimental results show that the proposed method is efficient in exploiting the SLP vectorization of data type conversion statements and effective in improving the performance of the vectorized programs.
Test Data Compression Method of Dual Run Length Alternating Coding
CHENG Yi-fei and ZHAN Wen-fa
Computer Science. 2014, 41 (11): 22-24.  doi:10.11896/j.issn.1002-137X.2014.11.005
The large amount of test data is one of the challenges in SoC testing, and test data compression is an effective method to deal with this challenge. A new test data compression method based on dual run-length alternating coding was presented. Runs of 0s and runs of 1s are coded alternately, and the type of the next run can be inferred from the type of the previous run. The run type therefore need not be represented in the code words, and hence the length of the needed code words is reduced. Experimental results show that the method can achieve a higher compression ratio compared with similar methods, and the decompression structure is very simple, so the goal of reducing test cost can be achieved.
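As an illustration of the alternating-run idea above, here is a minimal Python sketch; the paper's actual code-word format for the lengths is not reproduced. Because 0-runs and 1-runs strictly alternate, only the first run's type and the run lengths need to be stored.

```python
def encode_runs(bits):
    """Split a bit sequence into alternating run lengths.
    Only the first bit and the lengths are kept; each run's type
    is implied by the type of the previous run."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def decode_runs(first, runs):
    """Rebuild the bit sequence by flipping the run type after each run."""
    out, bit = [], first
    for r in runs:
        out.extend([bit] * r)
        bit ^= 1
    return out

assert decode_runs(*encode_runs([0, 0, 1, 1, 1, 0])) == [0, 0, 1, 1, 1, 0]
```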
User Behavior Modeling Method for Mobile Applications Based on Log Mining
CHEN San-chuan,WU Guo-quan,WEI Jun and HUANG Tao
Computer Science. 2014, 41 (11): 25-30.  doi:10.11896/j.issn.1002-137X.2014.11.006
This paper presented a user behavior modeling method for mobile applications based on log mining. The method is two-fold, including monitoring instruction instrumentation and UI access modeling. We presented an automatic monitoring instruction instrumentation method that uses static analysis to automatically insert monitoring instructions at the appropriate sites in order to dynamically monitor user behavior at run time. We also presented an automata-based user behavior modeling method for mobile applications. Information attached to states and transitions of the automata in the user behavior model describes transitions between UIs and the usage of each widget within UIs. The test results on real-world mobile applications show that this method can both successfully instrument monitoring instructions and effectively obtain the UI access behaviors.
Smart SEP: An Online Synchronized Education Platform Based on Graphical Web Operation Record and Replay
CHEN De-jian,SUN Yan-chun and HUANG Gang
Computer Science. 2014, 41 (11): 31-35.  doi:10.11896/j.issn.1002-137X.2014.11.007
Influenced by Web 2.0 and cloud computing, online education platforms based on Web browsers have developed rapidly and have a great impact on remote education, in-class teaching and collaborative learning. However, most current education platforms seldom focus on synchronized education based on Web graphics, which is very important for process display and learning interaction on teaching materials. We first analyzed three major application fields of current online education, and then put forward an online synchronized education method based on graphical Web operation record and replay. On the one hand, the method supports basic synchronized education by high-fidelity operation recording on the presenter's terminal and self-adapting operation replay on the audience's terminal. On the other hand, it also ensures reliable and orderly online education for both normal participants and late-comers through interactive synchronization control. Based on the method, we implemented Smart SEP (Smart Synchronized Education Platform) and gave a detailed case study, which demonstrates synchronized, dynamic and interactive education and verifies the feasibility of our method.
Feature Location Method Based on Call Chain Analysis
FU Kun,QIAN Wen-yi,PENG Xin and ZHAO Wen-yun
Computer Science. 2014, 41 (11): 36-39.  doi:10.11896/j.issn.1002-137X.2014.11.008
In order to accomplish a variety of software maintenance tasks, such as fixing bugs, changing existing functionalities, or adding new features, developers often need to look for the code related to a feature in advance. The process of identifying relevant program elements according to a given feature is called feature location. Existing feature location methods, mainly based on user demand, search for related code elements in the source code and recommend them to the user. However, these scattered elements do not have any connection with each other, and the user may still need to manually find out the relationships between the elements to understand how the code elements combine to achieve a specific function. A new approach based on method call chains associated with data transfer can improve feature location practice. This method analyzes the source code, extracts all the method call chains associated with data transfer, and finds the relevant ones on the user's demand. Such a method call chain is not a simple code segment: it can reveal the implementation of a specific function and help the user understand the program better. An Eclipse plug-in of this approach was evaluated on JEdit, and the average precision of the tool's results is 55%.
Search-based Automated Resolution of Context Inconsistency
JIANG Lei,XU Chang and CHEN Xiao-kang
Computer Science. 2014, 41 (11): 40-45.  doi:10.11896/j.issn.1002-137X.2014.11.009
In recent years, with the popularization of intelligent equipment and the development of sensor technologies, context-aware applications keep emerging. However, the contexts available to applications are prone to inconsistency due to unpredictable and uncontrollable environmental noises. Many techniques have been proposed for resolving such inconsistency, but they largely overlook two aspects. First, inconsistency resolution for one constraint may lead to violation of another constraint (i.e., constraint interference). Second, inconsistency resolution may affect application functionality or quality (i.e., side effects). A novel search-based automated resolution technique for context inconsistency was therefore proposed to address these two aspects. Efforts were made to run the inconsistency resolution efficiently via incremental computing. Experiments show that the new technique can produce satisfactory resolution results with negligible time cost.
Modified Hierarchy Clustering Algorithm for User-session-based Performance Testing
LIANG Li-tu and LU Lu
Computer Science. 2014, 41 (11): 46-49.  doi:10.11896/j.issn.1002-137X.2014.11.010
Web applications are important parts of the global information infrastructure, which attracts more and more researchers to study Web application performance testing. In this paper, a user-session-based approach combined with a modified agglomerative hierarchical clustering algorithm was proposed to automatically generate performance test cases. The approach generates test cases by exploiting user sessions from server logs. It can not only reduce the manual effort of test engineers when designing test cases, but also guarantee the validity of the testing results. In our approach, we first defined how to compute the similarity between two URLs, and then employed a dynamic programming algorithm to calculate the similarity between two user sessions, as sketched below. According to the similarity matrix, a bottom-up agglomerative hierarchical clustering was employed to cluster the user sessions and then generate the test cases. Finally, the experimental results of our approach show its validity.
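A sketch of the session-comparison step is given below, assuming a soft longest-common-subsequence style alignment; url_sim stands in for the paper's URL similarity definition, and the normalization by the longer session length is an assumption.

```python
import numpy as np

def session_similarity(s1, s2, url_sim):
    """Dynamic-programming alignment of two user sessions (URL sequences).
    url_sim(u, v) in [0, 1] is the URL similarity the paper defines."""
    n, m = len(s1), len(s2)
    dp = np.zeros((n + 1, m + 1))
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i, j] = max(dp[i - 1, j], dp[i, j - 1],
                           dp[i - 1, j - 1] + url_sim(s1[i - 1], s2[j - 1]))
    # Normalize so identical sessions score 1.0 (an assumed convention).
    return dp[n, m] / max(n, m)
```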
UML Design Pattern Recognition Method Based on Structured Query
XU Han-bin,ZHANG Xue-lin,ZHENG Xiao-mei,ZHANG Tian and LI Xuan-dong
Computer Science. 2014, 41 (11): 50-55.  doi:10.11896/j.issn.1002-137X.2014.11.011
As model-driven techniques mature and become widely used, more and more models reflecting the structures, behaviors and features of programs are produced in the process of software development, and these models are also preserved as important parts of software documentation. Among them, UML models are the most widely used. Therefore, comprehension of UML models is regarded as a good way to comprehend large-scale, highly complex software systems. One of the difficulties in comprehending UML models is how to effectively find and locate model fragments with a certain structural feature among a large number of complex models. Fortunately, the wide application of design patterns provides an important clue for understanding and locating models quickly and efficiently. This paper aimed to analyze and understand the structural features of design patterns in order to identify design patterns in UML models. In this way, the purpose of understanding software systems flexibly and efficiently can be achieved.
Research on Deviation Diagnostic of Software Process Behavior Based on EPMM Modelling
ZHU Rui,LI Tong,MO Qi,ZHANG Xuan,WANG Yi-quan,LIN Lei-lei and DAI Fei
Computer Science. 2014, 41 (11): 56-62.  doi:10.11896/j.issn.1002-137X.2014.11.012
In recent years, with an in-depth understanding of PSEE, people have gradually discovered that there are certain deviations between the enacted process model and the actually observed process, which causes PSEE to lose its guiding significance for actual software development activities. To address the process deviation problem, based on the software evolution process meta-model (EPMM), on the side of software process deviation detection, this paper proposed a process behavior space expression that draws on the weak bisimulation ideas of process algebra in order to detect process deviations. On the side of software process deviation handling, a classification of deviation types and corresponding handling strategies were given. This method can effectively find prevalent deviation problems in software process implementation and improve the software process by dealing with them, ultimately improving the quality of software products.
Personalized Integration Framework for Mobile Applications and its Implementation on Android Platform
ZHANG Dong-dong and XU Feng
Computer Science. 2014, 41 (11): 63-68.  doi:10.11896/j.issn.1002-137X.2014.11.013
The explosive growth of mobile applications has brought new opportunities and challenges to both developers and users. From the perspective of developers, it is now possible to build new mobile applications rapidly based on massive numbers of existing mobile applications. However, such high-level integration is largely ignored by most existing development tools, whose focus is on the API level. From the perspective of users, while recommending a single personalized application has been widely explored, recommending a whole application integration such as an application sequence still remains an open problem. We proposed a personalized integration framework for mobile applications that contains two major parts: 1) an intent process execution language, which facilitates developers in building applications more naturally; 2) an algorithm to evaluate the preferences for different mobile application sequences, which supports personalized recommendation of application sequences. To verify the rationality of the framework, we implemented it as a development tool and runtime supporting mechanism on the Android platform.
Design and Implementation of SNS-based Mashup Connector
ZHUANG Xi-wei,SUN Yan-chun and HUANG Gang
Computer Science. 2014, 41 (11): 69-73.  doi:10.11896/j.issn.1002-137X.2014.11.014
In the Web 2.0 era, rich client applications are getting popular. Usually, rich client applications are independent; by using mashup, we can create a new application with better user experience from two or more Internet sources. However, mashup can only help information transfer between applications in the same rich client, not between different rich clients. Moreover, when using applications, people usually need experienced friends' help, so mashup currently fails to combine collective experience and intelligence. On the other hand, SNS (Social Network Service) has become a hot Web service thanks to its timeliness, interactivity, universality, and collective experience and intelligence. Taking advantage of SNS, we designed a kind of SNS-based mashup connector, so that collective experience and intelligence are added; at the same time, it solves the information transfer problem between applications in different rich clients. We then implemented it and gave a case study, which verifies the feasibility and validity of the mashup connector design.
MobiTran: A Technique of Transforming PC Web Application for Smart Phones
FANG Yi-meng,MA Yun,LIU Xuan-zhe and HUANG Gang
Computer Science. 2014, 41 (11): 74-78.  doi:10.11896/j.issn.1002-137X.2014.11.015
With the development of the Internet, a large number of mobile devices are getting access to the Internet. However, Web applications are mostly designed for the PC screen size, while the screens of mobile devices are small relative to PCs. Therefore, when a mobile device accesses an application designed for PCs, there may be incomplete information display, confused interface layout, poor user experience, excessive network traffic and other issues. Developing a mobile version of the Web application from scratch is the most naive choice, but it costs a lot, and maintaining two different versions is a daunting task. So we tried to adapt the view of the Web application instead of making a new one. This paper implemented a developer-oriented Web application conversion tool, called MobiTran. It transforms the PC version of a Web application into a mobile version to fit the screen size and other features of mobile devices and to reduce data traffic. It also allows developers to edit the style and layout of the user interface, and ultimately a mobile version of the Web application is generated. After testing on mainstream sites, we concluded that MobiTran can adapt the view of a Web application while maintaining its style, losing little information and saving data traffic.
Interaction Model for Internet of Things Based on Real-time UML Sequence Diagram
CONG Xin-yu and YU Hui-qun
Computer Science. 2014, 41 (11): 79-87.  doi:10.11896/j.issn.1002-137X.2014.11.016
The Internet of Things (IOT) is an intelligent system which integrates computation, communication and control. It aims at monitoring the behavior of physical processes, analyzing information to generate correct instructions, and actuating actions to make the physical environment work correctly and better. In the Internet of Things, different things interact locally or through network connections, and these interactions are affected by time and location. Modeling and verification of the Internet of Things is an important area of IOT research. This paper proposed an interaction model for IOT based on real-time UML sequence diagrams. In our model, all things that interact in IOT are modeled as interactive objects, between which interactions are modeled by real-time UML sequence diagrams; timed automata are used to model the internal state changes of interactive objects, which complements the interaction model. Finally, according to the transition rules, the interaction model was transformed into the form of timed automata for the sake of verification. A case study was used to show how to apply the interaction model specifically. Furthermore, some properties which IOT should satisfy were presented, and the model checking tool UPPAAL was used to analyze and verify the interaction model for IOT.
Enterprise Goal Programming Based on Analysis of Domanial KAOS
QIAN Li-bin,LIU Nian-tang,HU Yu-tian,LU Dan and SHAO Kun
Computer Science. 2014, 41 (11): 88-93.  doi:10.11896/j.issn.1002-137X.2014.11.017
Existing research on enterprise information planning focuses on the conversion process between abstract enterprise goals and specific information system targets, lacking formal description and deep analysis of the representative enterprise goals. This paper proposed an enterprise goal programming method based on analysis of domanial KAOS from the perspective of requirements engineering. This method obtains more complete, clear and consistent enterprise goals by analyzing in detail the abstract enterprise goals, the relationships between enterprise goals and the goal inconsistencies in the enterprise domain, and builds a model of enterprise goals that meets the enterprise domain features and informational business requirements. Lastly, the paper defined the GSD (goal support degree) for quantitative analysis and verification of the enterprise goals to ensure the accuracy and reliability of the planned enterprise goals.
Scenario Automation Machine Based Internetware Evolution
WANG Mao-guang and CAO Huai-hu
Computer Science. 2014, 41 (11): 94-98.  doi:10.11896/j.issn.1002-137X.2014.11.018
Internetware is an abstraction of software systems in the open, dynamic and varying network environment. The evolution characteristic of internetware requires that the software be capable of evolving dynamically according to changes in application requirements and the running environment. The scenario concept from the requirements phase of the software engineering domain was introduced into dynamic evolution. Different from static requirements descriptions, the scenario is used to illustrate the system's dynamic evolution. The formal representation of scenarios and their complementary, equivalence and subset relations were defined, and the internetware evolution method based on the scenario automation machine was illustrated. Thus, internetware evolution is represented as a series of application scenarios, and system evolution is represented as scenario transformation. It provides a new method for complex software self-evolution and supports large-granularity system reuse.
Axiomatizing Weak Covariant-contravariant Simulation Semantics
ZHANG Wei
Computer Science. 2014, 41 (11): 99-102.  doi:10.11896/j.issn.1002-137X.2014.11.019
Process algebra, as one of the important tools for describing and analyzing concurrent and distributed systems, has become a central branch of research in concurrency theory. Simulation is considered one of the fundamental notions for describing the refinement relation between processes. Covariant-contravariant simulation generalizes plain simulation and distinguishes the types of actions; intuitively, it captures the fact that it is not always the case that "the larger the number of behaviors, the better". But covariant-contravariant simulation ignores the difference between observable actions and silent actions. This paper presented the notion of a weak covariant-contravariant simulation and provided its axiomatic system. The soundness and ground-completeness of this system were established. Furthermore, we proved that the system is also ω-complete.
Model and its Evaluation Method for QoS of Web Services
XIAO Fang-xiong,LI Yan,ZHANG Jun-hua,ZHU Yi and ZHU Xiao-dong
Computer Science. 2014, 41 (11): 103-106.  doi:10.11896/j.issn.1002-137X.2014.11.020
We proposed an abstract QoS model that includes only three abstract properties, i.e., time, cost and probability, to cover all concrete QoS properties, from the viewpoints of practice and conciseness. We also proposed approaches for handling and evaluating the abstract properties in a uniform way, which support evaluating the QoS of Web services in a consistent and unambiguous way at the abstract model level, and selecting the Web service with optimal QoS.
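The abstract does not spell out the aggregation rules for the three properties; one common convention, assumed in the sketch below, is that under sequential composition times and costs add while success probabilities multiply.

```python
def compose_sequential(services):
    """Evaluate abstract QoS of a sequential composition (assumed rules:
    time and cost add, probability multiplies). Each service is a dict
    with keys "time", "cost" and "probability"."""
    qos = {"time": 0.0, "cost": 0.0, "probability": 1.0}
    for s in services:
        qos["time"] += s["time"]
        qos["cost"] += s["cost"]
        qos["probability"] *= s["probability"]
    return qos

print(compose_sequential([
    {"time": 2.0, "cost": 5.0, "probability": 0.99},
    {"time": 1.5, "cost": 3.0, "probability": 0.95},
]))  # {'time': 3.5, 'cost': 8.0, 'probability': 0.9405}
```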
Framework of BIM Cloud Services and Retrieval Algorithm of Spatial Location Based on Hadoop
CHEN Ze-lin,PAN Yun-jun,HE Yi-chen and QI De-yu
Computer Science. 2014, 41 (11): 107-111.  doi:10.11896/j.issn.1002-137X.2014.11.021
A cloud platform becomes a necessity for storing and managing the huge data of complex applications. The task of Building Information Modeling (BIM) is to organize the relevant data and support collaborative work during the whole life cycle of construction informatization, so BIM is in urgent need of cloud computing. How to build a supercomputing model on a cloud platform is a big challenge when facing complex BIM applications. This paper presented a framework of cloud services for BIM applications based on Hadoop, a distributed software framework. The four layers of the framework were designed: cloud storage, cloud platform services, application services and client applications. A retrieval algorithm for urban spatial location was proposed on this framework, which uses an improved KD tree as the index table. The paper also presented a load balancing algorithm for spatial location retrieval by which many groups of users access data blocks concurrently: based on statistics of node access frequency, a strategy of balanced distribution for data blocks was designed. Experiments show that the framework has the characteristics of concurrent processing capability and rapid response for building information.
Scratch: Tooling Support for Capture-and-replay of User Actions in Chrome Browser
CHEN Xiao-yu,HUANG Zhen,LIU Xuan-zhe,HUANG Gang and ZHANG Ying
Computer Science. 2014, 41 (11): 112-117.  doi:10.11896/j.issn.1002-137X.2014.11.022
Modern browsers, such as Mozilla Firefox and Google Chrome, are equipped with numerous powerful facilities like plug-ins and add-ons, which significantly enrich user experiences on the Web. However, as Web applications get more complicated day by day, many tedious processes must be performed frequently, while others which are complex or hard to remember are done less frequently. This paper presented the design of Scratch (Smart Capture-and-Replay at Chrome), a collaborative Programming-by-Demonstration (PBD) system for capturing, recording, editing, and playing back user interactions and sharing user experience in the Chrome Web browser, which greatly enhances people's efficiency.
Model and Implementation of Registry for Autonomy Oriented Web Services
ZHANG Zi-long,MAO Xin-jun,YIN Jun-wen,HOU Fu and CHEN Chao
Computer Science. 2014, 41 (11): 118-123.  doi:10.11896/j.issn.1002-137X.2014.11.023
In response to the openness and dynamics of the Internet environment, and in order to enhance the management of Web service posture and support the development of autonomy-oriented Web service (AOWS) applications, we extended the traditional SOA and proposed a model of a Registry for autonomy-oriented Web services in this article. This Registry not only supports the basic registration function, but also provides the functionality of managing the posture of Web services. We introduced the lifecycle and description models for AOWS, described the key technology of the Registry for AOWS, and finally studied a case to validate the feasibility of the models and implementation technologies.
Method for Automatic Testcase Generation toward Safety Critical Scenarios of Cyber-physical Systems
JIANG Peng,CHEN Xin and LI Xuan-dong
Computer Science. 2014, 41 (11): 124-127.  doi:10.11896/j.issn.1002-137X.2014.11.024
Effectively testing safety-critical scenarios is an important means to improve the security of cyber-physical systems. How to model safety-critical scenarios so that the behavior of systems can be completely and precisely described, and how to effectively generate test cases so as to enhance test coverage and reduce testing cost, are two key technical problems that must be solved in safety-critical scenario testing. Existing scenario modeling and test case generation techniques lack support for describing and treating the important features of cyber-physical systems, so they cannot generate test cases satisfying the testing requirements of safety-critical scenarios. We studied modeling and test case generation methods for safety-critical scenarios in cyber-physical systems. A method that satisfies the requirements for modeling safety-critical scenarios by extending the UML activity diagram with an external event driven mechanism and a timeliness characterizing mechanism was proposed. Then, based on the scenario models, we studied a method to automatically generate test cases.
Automatic Refactoring of TV Webpage for Optimizing Cost of Browsing
LONG Yong-hao,WANG Jia,CHEN Xiang-ping,LI Kai-yuan and OUYANG Chun-xia
Computer Science. 2014, 41 (11): 128-131.  doi:10.11896/j.issn.1002-137X.2014.11.025
With the development of the smart home, a great number of TV-user-oriented Web applications are being developed. Since users interact with the television by remote control, the browsing cost of a webpage is affected by its layout. What's more, the simplicity of the elements on such webpages makes automatic optimization possible. In this paper, we proposed an algorithm to calculate the browsing cost for different kinds of interactions. Considering the elements' types, positions, sizes and structures, we proposed an interchangeability estimation algorithm to check whether two elements can exchange their positions. On the basis of these two methods, we proposed an automatic webpage refactoring approach which gives each element a weight derived from users' visits and refactors the webpage in order to reduce the total browsing cost. The technique was tested on a website consisting of 116 pages to verify the correctness and significance of the method.
Context Retrieval Cost Model on Smartphones and its Application
SHEN Guo-feng,KONG Jun-jun,GUO Yao and CHEN Xiang-qun
Computer Science. 2014, 41 (11): 132-136.  doi:10.11896/j.issn.1002-137X.2014.11.026
With the development of technology and the increase of application requirements, many sensors and network interfaces have been embedded in smartphones, which are key contributors to acquiring context information and building smart mobile applications. Although the context retrieval cost on smartphones is significant, it is usually ignored by mobile application developers. This paper proposed a context retrieval cost model to analyze the cost of context information retrieval. We designed and implemented a context-retrieval-cost measuring tool, CRCTest, and measured the context retrieval cost model on an Android smartphone. Based on the measured cost model, we implemented an example application and conducted experiments to compare the cost of two approaches to acquiring location context. The results show that it is feasible to optimize the context retrieval cost with the proposed model.
Research and Development of Computer-aided Requirements Engineering Tool Based on Problem Frames
LIU Guo-yuan,WAN Guang-hai,PANG Liu and LI Zhi
Computer Science. 2014, 41 (11): 137-140.  doi:10.11896/j.issn.1002-137X.2014.11.027
Problem Frames (PF) are widely regarded as an important research subject by the requirements engineering community. Currently, a lot of research work has been done on the theoretical foundations and development methodologies of PF. However, how to apply them in the practice of a software development project remains a problem to be solved. A software prototype was developed to assist system analysts in deriving software specifications from user requirements in a smooth and logical way, thus providing technical support for requirements communication, modeling and analysis in a software development project. Furthermore, the end products of the tool's application also provide help for software design in the downstream development phase. This software prototype will promote the further development of PF and drive it to maturity, i.e., from theoretical research to practical applications.
Hot Deployment with Dependency Reconstruction
LI Hai-cheng,CAO Chun,LV Jun and TAO Xian-ping
Computer Science. 2014, 41 (11): 141-145.  doi:10.11896/j.issn.1002-137X.2014.11.028
The hot deployment mechanism is a typical feature of mainstream application servers. But current application servers only support hot deployment of standalone applications, which cannot satisfy the requirements of complicated enterprise applications with dependency injection: failures will occur when some modules are updated online, which results in failure of the whole application platform. To solve this problem, a technique of hot deployment with dependency reconstruction was introduced. We create dependencies between modules when each module of an application is deployed for the first time. When a module is updated, we find the modules affected by the update, reconstruct the dependencies and carry out partial hot deployment, avoiding the cost of restarting the application server. Experiments show that our hot deployment technique can ensure the correctness of applications with dependency injection, and the operating efficiency of application servers is highly improved in real-world application scenarios.
Research on Parameterized Runtime Monitoring
WANG Zhen,YE Jun-min,CHEN Shu,GU Jian and JIN Cong
Computer Science. 2014, 41 (11): 146-151.  doi:10.11896/j.issn.1002-137X.2014.11.029
With the wide application of software in all kinds of safety-critical systems as well as its increasing complexity, software reliability becomes more and more important. As a software solution widely used on various platforms, runtime monitoring is one of the most flexible ways to enhance the reliability of software. With the development of runtime monitoring and software technology, people want to verify the dynamic properties of systems through runtime monitoring, so runtime monitoring of parametric properties was presented. It has attracted more and more attention because of its applicability to object-oriented systems. This paper summarized the research on parametric runtime verification, presented the problem definition of parametric runtime verification, and introduced the main research content of this field, including parametric runtime verification approaches, technologies for reducing parametric monitoring overhead, and runtime monitoring of multiple parametric properties.
Component Composition Technology and Tool Based on AJAX for Web Application
ZHENG Di-wen,SHEN Li-wei,PENG Xin and ZHAO Wen-yun
Computer Science. 2014, 41 (11): 152-156.  doi:10.11896/j.issn.1002-137X.2014.11.030
For Web applications, component-based software development is also a way to improve development efficiency, and its solution involves combining front-end components with back-end or third-party services. We proposed a component composition technology based on AJAX for Web applications, based on an analysis of the component types and composition patterns of Web applications. In this technology, we proposed two different solutions for combining different component types: one uses jQuery to invoke Servlets, while the other uses DWR technology to enable interaction between front-end page components and back-end business components or Web Service components. In addition, based on the proposed method, this paper implemented two online Web application component composition tools in two implementation styles, which allow users to define the composition details through a graphical user interface and automatically complete the component composition process to generate Web applications. This paper used an experimental course selection website as an application development example to verify the usefulness of the technology and the tools.
Research on Software Process Life Cycle Risk Management Model
ZHANG Zi-jian,CAO Jing,ZHANG Li and RAO Guo-zheng
Computer Science. 2014, 41 (11): 157-161.  doi:10.11896/j.issn.1002-137X.2014.11.031
Firstly, software process risk was defined from the perspective of the software process based on basic risk concepts. Secondly, a three-dimensional software project risk management model based on software process, life cycle and risk management was proposed by combining the full life cycle with CMMI, and a full-life-cycle risk identification, assessment and monitoring model based on CMMI was presented. Finally, a software development risk management system was implemented, which has achieved good results in practice.
Interoperability Test Case Generation Based on Testing Requirements
HOU Chao-fan,WU Ji and LIU Chao
Computer Science. 2014, 41 (11): 162-168.  doi:10.11896/j.issn.1002-137X.2014.11.032
Networked applications will become the dominant mode of future software technology development. To ensure that networked applications work effectively, it is necessary to test the interoperability among them. The testing requirements of interoperability testing are complicated and changeable, and the test cases are difficult to design, which leads to high cost in interoperability testing. For this reason, a new interoperability test case generation approach based on testing requirements was presented in this paper. This approach uses the idea of model-driven testing: the testing requirements are described by a testing requirements model, and the specifications of the applications are presented as state diagrams. Finally, test cases which satisfy the testing requirements are generated through the combination of the two models.
Adaptive Step Length Forward-backward Pursuit Algorithm for Signal Reconstruction Based on Compressed Sensing
CAI Xu,XIE Zheng-guang,JIANG Xiao-yan and HUANG Hong-wei
Computer Science. 2014, 41 (11): 169-174.  doi:10.11896/j.issn.1002-137X.2014.11.033
Compressed sensing (CS) is a new theory of signal sampling, processing and recovery, which can significantly reduce the sampling frequency of signals with high frequency and narrow band. Aiming at reconstructing signals with unknown sparsity, we proposed a novel signal reconstruction algorithm called adaptive forward-backward pursuit (AFBP). Unlike the forward-backward pursuit (FBP) algorithm with fixed step lengths, AFBP works with varied step lengths. It utilizes an adaptive thresholding method to choose the forward step length and applies a regularization process to the candidate support estimate to ensure its reliability. We adopted a method combining adaptive thresholding with a variable step length to decide the backward step length in order to reduce the necessary reconstruction time: incorrect indexes included in the support estimate can be deleted adaptively in order to improve the exact reconstruction rate. Reconstruction experiments with AFBP were conducted, including recovery of random sparse signals with common nonzero coefficient distributions. The results demonstrate that AFBP and FBP achieve similar exact reconstruction rates and similar reconstruction errors, while the reconstruction time of AFBP is sharply shorter than that of FBP. So AFBP can reconstruct sparse signals with unknown sparsity more efficiently than FBP.
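For orientation, the fixed-step forward-backward pursuit that AFBP generalizes can be sketched as follows; the adaptive thresholding and regularization steps that distinguish AFBP are omitted, and alpha/beta are illustrative fixed step sizes (alpha > beta).

```python
import numpy as np

def fbp(y, A, alpha=3, beta=2, max_iter=50, tol=1e-6):
    """Fixed-step forward-backward pursuit sketch.
    y: measurement vector; A: sensing matrix with l2-normalized columns."""
    support = np.array([], dtype=int)
    residual = y.copy()
    for _ in range(max_iter):
        # Forward: add the alpha columns most correlated with the residual.
        corr = np.abs(A.T @ residual)
        corr[support] = -np.inf               # skip already-selected columns
        support = np.union1d(support, np.argsort(corr)[-alpha:])
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        # Backward: prune the beta smallest-magnitude coefficients.
        support = np.sort(support[np.argsort(np.abs(coef))[beta:]])
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) <= tol * np.linalg.norm(y):
            break
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```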
Wavelet Threshold De-noising Method Oriented to Body Area Networks
LIU Yi,SONG Yu-qing,LIU Zhe,XU Li-bing and BAO Xiang
Computer Science. 2014, 41 (11): 175-177.  doi:10.11896/j.issn.1002-137X.2014.11.034
ECG signal processing in the body area network environment faces many problems, including limited resources and random noise, so it is essential to propose a better algorithm for ECG signal de-noising. On the basis of the lifting wavelet transform, we proposed a new de-noising method for ECG signals based on a dual-threshold function. By processing the detail coefficients of the lifting wavelet decomposition with this dual-threshold function, the noise can be separated from the original signal more accurately. Simulation results show that the proposed de-noising algorithm overcomes the disadvantages of both soft and hard threshold methods to some degree and obtains better de-noising performance. The de-noising is fast and the program design is flexible and simple. The algorithm lays the foundation for further processing of ECG signals in restricted environments such as body area networks.
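The paper's exact dual-threshold function is not given in the abstract; the hypothetical function below illustrates the soft/hard compromise it describes: detail coefficients below t1 are zeroed, those above t2 are kept unchanged, and those in between are shrunk continuously.

```python
import numpy as np

def dual_threshold(coeffs, t1, t2):
    """Hypothetical dual-threshold function (t1 < t2) applied to wavelet
    detail coefficients: zero / shrink / keep, continuous at t1 and t2."""
    out = np.zeros_like(coeffs)
    mag = np.abs(coeffs)
    mid = (mag > t1) & (mag <= t2)
    out[mid] = np.sign(coeffs[mid]) * (mag[mid] - t1) * t2 / (t2 - t1)
    out[mag > t2] = coeffs[mag > t2]
    return out
```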
Improvements of Indoor Signal Strength Fingerprint Location Algorithm
CAI Zhao-hui,XIA Xi,HU Bo and FAN Dan-mei
Computer Science. 2014, 41 (11): 178-181.  doi:10.11896/j.issn.1002-137X.2014.11.035
As people have increasingly high demands for location-based services, indoor positioning technology has been widely used in many fields, and the location algorithm is the most important part of indoor positioning research. This paper described the nearest-neighbor and KNN signal strength fingerprint location algorithms and showed the disadvantages of the KNN fingerprint algorithm. On the basis of the KNN localization algorithm, an improved location algorithm based on region division was proposed. In the first stage, the received signal strength is compensated and filtered to reduce the influence of various external factors on positioning accuracy. Then the location area is divided, the major node is selected, and the most recent signal strength fingerprints are chosen. Finally, the location result is calculated and verified. The simulation proves that the improved region division algorithm improves positioning accuracy by 22.2% compared with the traditional KNN algorithm, reaching 2.1 m, which proves the feasibility of the improved algorithm.
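For reference, the plain weighted-KNN fingerprint step that the improved algorithm builds on can be sketched as follows (RSS compensation/filtering and region division are omitted):

```python
import numpy as np

def knn_locate(rss, fingerprints, positions, k=4):
    """Weighted KNN fingerprint localization sketch.
    rss: measured RSS vector (one entry per access point);
    fingerprints: N x M stored RSS vectors; positions: N x 2 coordinates."""
    d = np.linalg.norm(fingerprints - rss, axis=1)   # signal-space distance
    nearest = np.argsort(d)[:k]                      # k closest fingerprints
    w = 1.0 / (d[nearest] + 1e-9)                    # inverse-distance weights
    return (positions[nearest] * w[:, None]).sum(axis=0) / w.sum()
```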
Services Modeling and Scheduling for Wireless Access Network Oriented Intelligent Transportation System
TAO Hua,WANG Xiao-jun and DAI Hai-kuo
Computer Science. 2014, 41 (11): 182-186.  doi:10.11896/j.issn.1002-137X.2014.11.036
This paper first described the architecture and the main services of the wireless access network oriented to ITS (Intelligent Transportation System). The services of the Cooperative Vehicle-Infrastructure System and PTT voice were analyzed and modeled, and targeted scheduling schemes were proposed for different services according to their QoS requirements. For the Cooperative Vehicle-Infrastructure System service, vehicle-speed-adaptive dynamic scheduling is adopted; for the PTT voice service, scheduling based on state transition is adopted; for video and other delay-insensitive services, scheduling requests are generated based on buffer status. Finally, the scheduling schemes were simulated on OPNET, and their availability was verified with performance curves of delay and packet loss rate. This also provides a reference for ITS access network design.
Slice-based Partition Low Delay Intra Frame Coding Method
YAO Chun-lian,RUAN Qiu-qi,JIANG Dong and WAN Li-li
Computer Science. 2014, 41 (11): 187-191.  doi:10.11896/j.issn.1002-137X.2014.11.037
The inherent delay of a standard codec system is relatively large; for example, for standard-definition video, the system delay exceeds 260 ms, which makes it very difficult to apply in systems with strict delay requirements. On the basis of existing standards, a new low-delay intra coding structure was provided in this paper. The input video is divided into a number of slices, and the slice is the basic coding unit at the acquisition end. Slice-based acquisition can effectively reduce the acquisition delay. At the same time, the I-frame coding method differs from existing standards: each acquired slice is the basic coding unit and is divided into four sub-pictures. In order to improve prediction accuracy, one of the sub-pictures is chosen as the prediction unit. Experiments show that the new structure can effectively reduce intra coding delay and acquisition delay, which can be reduced to 150 ms.
Relay Communication Algorithm of Orthogonal Mixed Space-time Network Coding
LIU Yan,WANG Zi-rong and ZHU Xing-wei
Computer Science. 2014, 41 (11): 192-194.  doi:10.11896/j.issn.1002-137X.2014.11.038
Aiming at the problems of low spectral efficiency and insufficient network throughput in wireless communication networks, a space-time quadrature hybrid relay communication network coding algorithm (QHNR) was proposed. Through the system model, the algorithm analyzes the relay communication mode in the MAC phase and the BC phase of the transmission cycle, as well as the error rate when nodes send data. A grading gain method is used to improve channel utilization efficiency and promote system performance, and better relay nodes are then chosen via relay probability. The performance of the QHNR algorithm was analyzed in terms of both network throughput and network power allocation. The experimental simulation results show that, compared with the TWR and MWR communication schemes, the QHNR algorithm achieves better results in improving network throughput and channel utilization efficiency.
Distributed Data Stream Clustering Based on LSH on Cloud Environments
QU Wu,WANG Li-jun and HAN Xiao-guang
Computer Science. 2014, 41 (11): 195-202.  doi:10.11896/j.issn.1002-137X.2014.11.039
In recent years, with the wide application of computer and Internet technology in industrial production and information processing, applications continuously produce large amounts of sequence data that evolve over time and constitute time-series data streams, as in Internet news feed analysis, network intrusion detection, stock market analysis and sensor network data analysis. Real-time clustering analysis of data streams is a hot issue in current data stream mining. However, due to the high speed, large scale and real-time nature of the data, the data must often be analyzed on the fly. Although one-pass scanning algorithms are able to meet this need, the lack of efficient clustering algorithms to identify and distinguish patterns limits the effectiveness and scalability of this method. In order to solve the above problems, we proposed a novel stream clustering algorithm called DLCStream, based on LSH in cloud environments. It is a distributed data stream clustering approach that uses the Map-Reduce framework and the LSH mechanism to quickly find clustering patterns in the data stream. Finally, theoretical analysis and experimental results illustrate that the DLCStream algorithm is significantly better than the traditional data stream clustering framework CluStream in parallel processing efficiency, scalability and clustering quality.
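The LSH mechanism at the core of DLCStream can be illustrated with random-hyperplane hashing, where points with identical bit signatures land in the same bucket and become candidates for the same cluster; the sketch below shows only this bucketing step, not the Map-Reduce pipeline or cluster maintenance.

```python
import numpy as np

def lsh_buckets(points, n_bits=8, seed=0):
    """Assign each point a bucket id from the signs of random projections."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, points.shape[1]))
    bits = points @ planes.T > 0                  # one sign bit per hyperplane
    return (bits @ (1 << np.arange(n_bits))).astype(int)
```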
Research on DDoS Attack-defense Game Model Based on Q-learning
SHI Yun-fang,WU Dong-ying,LIU Sheng-li and GAO Xiang
Computer Science. 2014, 41 (11): 203-207.  doi:10.11896/j.issn.1002-137X.2014.11.040
The process of the DDoS attack-defense game differs in the new situation, so with existing methods the payoff value cannot be quantified effectively and the game strategy cannot be adjusted dynamically to maximize the payoff. In response to this problem, a DDoS attack-defense game model based on Q-learning was designed, and an algorithm was proposed on the basis of the model. Firstly, the payoffs of the attacker and defender are calculated with the network entropy quantitative assessment method. Secondly, a single DDoS attack stage is studied using the matrix game method. Finally, the model algorithm is obtained by introducing Q-learning into the game process, with which the strategies are adjusted dynamically according to the learning outcomes to maximize the payoff. The results of verification testing show that the defender can achieve a higher payoff when adopting the model algorithm, so the algorithm turns out to be practicable and effective.
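The learning rule used here is the standard one-step Q-learning update; in this model the reward would come from the network-entropy payoff assessment, and the state/action encoding is left abstract. A minimal sketch:

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One-step Q-learning update on a |states| x |actions| table Q.
    reward: payoff from the network-entropy assessment (abstracted here)."""
    target = reward + gamma * max(Q[next_state])
    Q[state][action] += alpha * (target - Q[state][action])
```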
Efficient Certificateless Ring Signcryption Scheme
SUN Hua and MENG Kun
Computer Science. 2014, 41 (11): 208-211.  doi:10.11896/j.issn.1002-137X.2014.11.041
Ring signcryption is an important cryptographic primitive which combines the functions of encryption and ring signature; it can provide confidentiality, authenticity and anonymity simultaneously. At present, most existing certificateless ring signcryption schemes are proposed in the random oracle model; however, such schemes are sometimes proven to be insecure when the hash functions are instantiated. Aiming at this problem, a certificateless ring signcryption scheme without random oracles was put forward in this paper. Meanwhile, it was proven that the scheme satisfies indistinguishability against adaptive chosen ciphertext attacks and existential unforgeability against adaptive chosen message attacks under the computational Diffie-Hellman and decisional Diffie-Hellman assumptions, so the scheme is secure and reliable.
Blind Watermarking Scheme Based on Support Vector Machine and Singular Value Decomposition
WANG Juan,LIN Yao-jin and WANG Yu-qi
Computer Science. 2014, 41 (11): 212-215.  doi:10.11896/j.issn.1002-137X.2014.11.042
An image watermarking scheme based on SVM (support vector machine) and SVD (singular value decomposition) was developed to further enhance performance against attacks. Firstly, the host image is decomposed by the DWT transform, and its low-frequency wavelet band is split into non-overlapping blocks. Then, a partial correlation model of each block is established using the support vector machine. A feature sequence is derived by judging the numerical relationship between the prediction results and the low-frequency coefficient values at the corresponding positions, and the feature watermark sequence is derived by XORing the feature sequence with the watermark. Moreover, the feature watermark sequence is embedded into each corresponding block's biggest singular value from the original image's low-frequency wavelet band based on the principle of odd-even quantization. Finally, the watermarked image is obtained after SVD synthesis and IDWT. Experimental results show that the scheme is not only invisible but also has a strong ability to resist attacks.
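The odd-even quantization principle used for embedding can be sketched as follows: the block's largest singular value is quantized so that the parity of its quantization index carries one watermark bit (delta is an assumed quantization step).

```python
import numpy as np

def embed_bit(s, bit, delta):
    """Move singular value s to the centre of a quantization cell whose
    index parity equals the watermark bit."""
    q = int(np.floor(s / delta))
    if q % 2 != bit:
        q += 1
    return (q + 0.5) * delta

def extract_bit(s, delta):
    """Recover the bit from the parity of the quantization index."""
    return int(np.floor(s / delta)) % 2
```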
Policy Driven Exception Handling Description Approach for BPEL Processes
WANG Quan-yu,YING Shi,LV Guo-bin,WEN Jing,CHENG Yin-hai and CHEN Ying
Computer Science. 2014, 41 (11): 216-226.  doi:10.11896/j.issn.1002-137X.2014.11.043
In order to improve the description of exception handling for BPEL processes, this paper presented a policy-driven exception handling description approach. Firstly, the paper designed the BPEH/PDL language, a new Policy Description Language (PDL) for exception handling of BPEL processes, and, based on Colored Petri Nets, proposed a formal description approach for exception handling of BPEL processes by means of BPEH/PDL. Finally, the whole description process of BPEH/PDL was discussed systematically through a case study in the automotive manufacturing execution system domain.
Research on Confirming Calculation Order Based on Simulink Model
LI Jun,ZHU Chang-hao and LU Meng-han
Computer Science. 2014, 41 (11): 227-232.  doi:10.11896/j.issn.1002-137X.2014.11.044
The combination of Simulink simulation and code generation has great practical value. By analyzing the features of models, Simulink models can be converted into C code, which realizes the combination of Simulink simulation and code generation. The primary problem in the code generation process is the confirmation of the calculation order. From the information extracted from Simulink model files, the features of the modules in a Simulink model are analyzed, and the dependences between modules are obtained from the module relationships stored in graphs. From the module features and the dependences, two calculation orders are obtained: a module-based calculation order, which takes the underlying modules into account, and a hierarchy-based calculation order, which takes subsystems into consideration. By analyzing and comparing the two calculation orders, it was found that the hierarchy-based order is better than the module-based one. Finally, by testing the f14 model in Simulink, both calculation orders were found to be feasible.
Distributed Optimized Query Algorithm Based on Index
WANG Jing-bin and FANG Zhi-li
Computer Science. 2014, 41 (11): 233-238.  doi:10.11896/j.issn.1002-137X.2014.11.045
Using index files is a new way of solving the problem of querying large amounts of RDF (Resource Description Framework) data, and can be a great aid to query optimization. At present, most Hadoop-based RDF query optimization methods do not use index files, and most of them target static data, so they perform poorly at dynamic data updating. In order to overcome these two drawbacks, this paper proposed the IMSQ (using Index in MapReduce to Segment and Query) algorithm to perform distributed RDF queries. The algorithm is divided into two parts, segmentation and query execution: firstly, it makes a star-like segmentation of the RDF data and obtains several segment files with corresponding index files; secondly, it generates a layered join plan, uses a filter method to search the index files to narrow the result set, and then queries the corresponding segment files; finally, it merges and outputs the intermediate results. The results of experiments on the LUBM test data set show that the IMSQ method achieves higher query efficiency when the amount of RDF data is large.
Research on CBR Case Adaptation Based on ALCQ(D)
HUANG Jin-long,GU Tian-long,SUN Jin-yong and XU Zhou-bo
Computer Science. 2014, 41 (11): 239-246.  doi:10.11896/j.issn.1002-137X.2014.11.046
CBR (case-based reasoning) is a branch of artificial intelligence which overcomes the bottleneck of knowledge acquisition, and case adaptation is a key step in CBR. The description logic ALC has been fully applied to CBR, but so far no algorithm based on description logic can effectively determine whether a retrieved similar case needs to be modified and how to modify it. ALC becomes ALCQ(D) when it is extended with the qualified quantifier Q and a type constraint domain D. The algorithm in this paper uses ALCQ(D) concepts to represent the source case and target case. Firstly, it presupposes that the retrieved source case can solve the target problem, which means the target and source case instances both satisfy the KB (knowledge base); this, however, may lead to inconsistency with the KB. Then, using conflict detection, the algorithm finds the concepts in the source concept instance that lead to the inconsistency, and finally uses the concept replacement rules defined in this article to retrieve, from the ontology repository, the concepts most similar to the inconsistent ones and replace them. Studies show that this algorithm is bounded, reliable and complete. This paper used an example to illustrate the algorithm; the results show that it can revise the similar case to solve the target problem.
Ranking Data Quality of Web Article Content by Extracting Facts
HAN Jing-yu and CHEN Ke-jia
Computer Science. 2014, 41 (11): 247-251.  doi:10.11896/j.issn.1002-137X.2014.11.047
Abstract PDF(444KB) ( 491 )   
References | Related Articles | Metrics
Data quality assessment of Web article content helps identify useful data. Existing approaches not only rely heavily on lexical features or user interactions to obtain quality indicators, but also cannot capture the content's semantics. A fact-based quality assessment (FQA) approach was proposed in this article. Given a target article, the approach starts by identifying an alternative context: collecting relevant articles and extracting facts from each of them. Then, an accuracy baseline is constructed by voting, and a completeness baseline is constructed by iterating over fact graphs. Finally, data quality dimensions, including accuracy and completeness, are calculated by comparing the facts of the target article with the established baselines. Because it is based on the facts of the target article's content rather than on particular features, the FQA approach can quantify data quality dimensions with high precision. The superior performance of FQA was verified in experiments.
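The accuracy-baseline step can be illustrated with a short sketch in which facts are (subject, attribute, value) triples, the majority value per (subject, attribute) pair is taken as the baseline by voting, and accuracy is the fraction of the target article's facts agreeing with that baseline; the data and the simple voting rule are illustrative assumptions, not the paper's exact formulation.

from collections import Counter, defaultdict

def vote_baseline(fact_sets):
    """fact_sets: list of {(subject, attribute): value} dicts, one per relevant article."""
    ballots = defaultdict(Counter)
    for facts in fact_sets:
        for key, value in facts.items():
            ballots[key][value] += 1
    return {key: c.most_common(1)[0][0] for key, c in ballots.items()}

def accuracy(target, baseline):
    """Fraction of the target's facts that agree with the voted baseline."""
    shared = [k for k in target if k in baseline]
    if not shared:
        return 0.0
    return sum(target[k] == baseline[k] for k in shared) / len(shared)

corpus = [{("Paris", "country"): "France", ("Paris", "river"): "Seine"},
          {("Paris", "country"): "France"},
          {("Paris", "country"): "Texas"}]
target = {("Paris", "country"): "France", ("Paris", "river"): "Seine"}
print(accuracy(target, vote_baseline(corpus)))   # 1.0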
Uncertainty Measure Research on Rough Fuzzy Sets in Covering Approximation Space
ZHENG Ting-ting and ZHU Ling-yun
Computer Science. 2014, 41 (11): 252-255.  doi:10.11896/j.issn.1002-137X.2014.11.048
Abstract PDF(317KB) ( 408 )   
References | Related Articles | Metrics
Uncertainty measurement is one of the basic problems of rough set theory. The uncertainty of a rough fuzzy set arises both from the roughness caused by the difference between the upper and lower approximation sets and from the fuzziness of the extension of the uncertain concept. Research on the uncertainty of rough fuzzy sets is not yet mature. This paper discussed the uncertainty of rough fuzzy sets in a covering approximation space, proposed a stricter, revised measurement criterion, and derived a revised roughness based on the membership differences among the original fuzzy set and its upper and lower approximation sets. Example analysis shows that this measurement approach describes practical problems more accurately.
Research and Application on Auto-word Building
WANG Jian-quan and JI Shao-bo
Computer Science. 2014, 41 (11): 256-259.  doi:10.11896/j.issn.1002-137X.2014.11.049
Abstract PDF(312KB) ( 511 )   
References | Related Articles | Metrics
Words are the basic elements of Chinese text, and the Chinese language model plays a key role in Chinese text mining. Text classification is a high-dimensional data mining task, and most classification algorithms are sensitive to dimensionality; as a result, classification performance depends on the size of the vocabulary. Besides, most current Chinese language models are based on statistical theory, such as the N-gram model and its improved variants, but these statistical models suffer from high computational complexity. To improve both quality and efficiency, this paper gave Chinese words a new definition based on association rules and proposed the Auto-word algorithm, by which a word vocabulary is constructed automatically and used for Chinese text mining. Finally, the efficiency of the Auto-word algorithm was demonstrated by experiment.
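A rough sketch of an association-rule word builder follows (the thresholds and the single-pass bigram rule are illustrative assumptions, not the paper's exact definition): adjacent character pairs whose support and confidence exceed given thresholds are merged into candidate words.

from collections import Counter

def auto_words(corpus, min_support=2, min_conf=0.5):
    """Merge adjacent character pairs into candidate words by support/confidence."""
    chars, bigrams = Counter(), Counter()
    for sent in corpus:
        chars.update(sent)
        bigrams.update(sent[i:i + 2] for i in range(len(sent) - 1))
    words = set()
    for bg, n in bigrams.items():
        conf = n / chars[bg[0]]          # P(second char | first char)
        if n >= min_support and conf >= min_conf:
            words.add(bg)
    return words

print(auto_words(["计算机科学", "计算机工程", "科学计算"]))   # {'计算', '算机', '科学'}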
Active Learning for Multi-label Classification on Graphs
LI Yuan-hang,LIU Bo and TANG Qiao
Computer Science. 2014, 41 (11): 260-264.  doi:10.11896/j.issn.1002-137X.2014.11.050
Abstract PDF(392KB) ( 448 )   
References | Related Articles | Metrics
Although active learning has been studied extensively on graph data, little research has been done on active learning for multi-label classification with graph data. We proposed a novel approach for multi-label classification on graph data using active learning based on error-bound minimization. We first derived a series of equations by combining multi-label classification with learning with local and global consistency (LLGC), so that the formulation minimizes the transductive Rademacher complexity and thus the generalization error bound. Using this approach, we select the most informative samples from the graph data. Experiments show that our method achieves high performance for multi-label classification.
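A minimal sketch of the LLGC propagation that the approach builds on, extended to a multi-label matrix Y; the error-bound-driven sample selection itself is omitted, and the toy graph, labels and alpha are illustrative.

import numpy as np

def llgc(W, Y, alpha=0.9, iters=100):
    """Learning with local and global consistency: F <- alpha*S*F + (1-alpha)*Y,
    where S = D^{-1/2} W D^{-1/2}. Y is n x L with one column per label."""
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))
    F = Y.astype(float)
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F

# Toy graph: two clusters of two nodes, two labels, one labeled node per cluster.
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.1, 0.0],
              [0.0, 0.1, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
Y = np.zeros((4, 2))
Y[0, 0] = 1          # node 0 carries label 0
Y[3, 1] = 1          # node 3 carries label 1
print(llgc(W, Y).argmax(axis=1))   # expected: [0 0 1 1]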
BGrR:Large-scale Network Routing Speedup Techniques Based on Granular Computing
HE Fu-gui,LIU Ren-jin,ZHANG Yan-ping and ZHANG Ling
Computer Science. 2014, 41 (11): 265-268.  doi:10.11896/j.issn.1002-137X.2014.11.051
Abstract PDF(433KB) ( 545 )   
References | Related Articles | Metrics
Large-scale network routing is one of the fundamental problems in social network information processing. A granular analysis method for networks based on granular computing was put forward. Starting from the basic problems of granular computing and using the hierarchical topology and community structure of social networks, this paper studied how to select the grain of a network and how to handle problems across different granular spaces. Through a hierarchical granular chain, a complex, large-scale network is mapped into different granular spaces. To reduce the complexity of problem solving, the network routing problem is also mapped into these granular spaces, and it is solved by moving the search process from the coarse granular space to the fine granular space. To speed up large-scale network route finding, a Between-Granular Routing algorithm (BGrR) was put forward. In experiments using an urban road network as the data source, the proposed method was compared with other heuristic path search methods (A* and ALT). The experimental results show that the proposed method is effective.
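An illustrative two-level scheme in the spirit of the coarse-to-fine idea (not the paper's actual BGrR): the route is first found on the grain (community) graph, and the fine search is then restricted to the corridor of grains selected at the coarse level.

import heapq

def dijkstra(adj, src, dst, allowed=None):
    """Shortest path on adj[u] = {v: w}; optionally restricted to 'allowed' nodes."""
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u].items():
            if allowed is not None and v not in allowed:
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return None

def coarse_to_fine_route(adj, grain_of, coarse_adj, src, dst):
    """Solve on the grain graph first, then search only inside that corridor."""
    corridor = set(dijkstra(coarse_adj, grain_of[src], grain_of[dst]))
    allowed = {v for v in adj if grain_of[v] in corridor}
    return dijkstra(adj, src, dst, allowed)

adj = {"a": {"b": 1}, "b": {"a": 1, "c": 1}, "c": {"b": 1, "d": 1}, "d": {"c": 1}}
grain_of = {"a": 0, "b": 0, "c": 1, "d": 1}
coarse_adj = {0: {1: 1}, 1: {0: 1}}
print(coarse_to_fine_route(adj, grain_of, coarse_adj, "a", "d"))   # ['a', 'b', 'c', 'd']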
Improved Artificial Bee Colony Algorithm with Dual Cognitive Abilities and Performance Analysis
XIE Juan,QIU Jian-feng,MIN Jie and WANG Ji-wen
Computer Science. 2014, 41 (11): 269-272.  doi:10.11896/j.issn.1002-137X.2014.11.052
Abstract PDF(328KB) ( 501 )   
References | Related Articles | Metrics
Aiming at the problems that the artificial bee colony algorithm converges slowly on unimodal problems and is easily trapped in local optima on multimodal problems, an improved artificial bee colony algorithm with dual cognitive abilities was presented according to the theory of group dynamics. It improves the search strategy of the bees' foraging behavior by introducing self-cognition and social-cognition abilities. Experimental results on a set of standard test functions show that the improved search strategy enhances the optimization performance of the artificial bee colony algorithm and outperforms other algorithms.
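The flavor of the dual-cognition idea can be sketched as a modified search equation that adds a self-cognition term (a pull toward the bee's own best food source) and a social-cognition term (a pull toward the global best) to the standard random-neighbor perturbation; the coefficients and update form below are illustrative assumptions, not the paper's exact equations.

import random

def dual_cognition_update(x, x_k, p_self, p_best, lb, ub):
    """One-dimensional candidate update for food source x.
    x_k: a random neighbor source; p_self: the bee's own best; p_best: global best."""
    phi = random.uniform(-1, 1)              # standard ABC perturbation
    c1, c2 = random.random(), random.random()
    v = x + phi * (x - x_k)
    v += c1 * (p_self - x)                   # self-cognition term (assumed form)
    v += c2 * (p_best - x)                   # social-cognition term (assumed form)
    return min(max(v, lb), ub)               # keep the candidate inside the bounds

random.seed(0)
print(dual_cognition_update(x=0.8, x_k=0.3, p_self=0.5, p_best=0.1, lb=-1, ub=1))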
Approach of Ascertaining Combinatorial Attribute Weight Based on Discernibility Matrix
YE Jun and WANG Lei
Computer Science. 2014, 41 (11): 273-277.  doi:10.11896/j.issn.1002-137X.2014.11.053
Abstract PDF(385KB) ( 398 )   
References | Related Articles | Metrics
The problems in the Pawlak attribute-importance-based method for constructing attribute weights were first analyzed in detail, and then a discernibility-matrix-based definition of attribute importance was given. On this basis, a novel approach for constructing combinatorial attribute weights of information systems was proposed. In this approach, the attribute weights are determined by each condition attribute's contribution in the whole information system. The proposed approach reflects not only the ability of condition attributes to distinguish objects, but also the classification capability of each attribute within the whole set of condition attributes. A numerical example demonstrates that the attribute weights obtained by the proposed approach are closer to the facts, so the approach improves the accuracy of the attribute weights.
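A small sketch of the discernibility-matrix idea: each matrix entry holds the condition attributes that distinguish a pair of objects with different decisions, and an attribute's importance is then scored by how often, and how exclusively, it appears in those entries. The shared-credit weighting below is one simple choice, not necessarily the paper's exact definition.

from collections import Counter

# Hypothetical decision table: rows are objects, last column is the decision.
table = [("sunny", "hot", "no"),
         ("sunny", "mild", "yes"),
         ("rainy", "hot", "no"),
         ("rainy", "mild", "yes")]
attrs = ["outlook", "temperature"]

score = Counter()
n = len(table)
for i in range(n):
    for j in range(i + 1, n):
        if table[i][-1] == table[j][-1]:
            continue                          # same decision: no need to discern
        entry = [a for k, a in enumerate(attrs) if table[i][k] != table[j][k]]
        for a in entry:
            score[a] += 1.0 / len(entry)      # shared credit within one entry

total = sum(score.values())
weights = {a: score[a] / total for a in attrs}
print(weights)    # temperature dominates: it alone separates the decisions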
Tuning PID Parameters Using Modified CPSO Algorithm
HUANG Wei-yong,GAO Yu-qin and ZHANG Yan-hua
Computer Science. 2014, 41 (11): 278-281.  doi:10.11896/j.issn.1002-137X.2014.11.054
Abstract PDF(325KB) ( 479 )   
References | Related Articles | Metrics
To improve the performance of control systems, a new method for tuning PID parameters was proposed using a modified chaotic particle swarm optimization (CPSO) algorithm. Chaotic search is applied to the initialization of the positions and velocities of the initial swarm, the optimization of the inertia weight, the generation of random constants, and the generation of neighborhood points around the local optimum, so that the algorithm has global optimization ability together with continuous, precise local search. Experimental results on three typical control systems show that the proposed PID tuning method is effective, and its performance is clearly superior to conventional methods.
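A compact sketch of one piece of the method: chaotic initialization and chaotic random constants via the logistic map, plus a plain PSO loop that tunes (Kp, Ki, Kd) by minimizing the ITAE of a discretized first-order plant's step response. The plant, cost function and coefficients are illustrative assumptions.

def step_itae(kp, ki, kd, dt=0.01, T=5.0):
    """ITAE of a unit-step response for the plant dy/dt = (-y + u)/tau under PID."""
    tau, y, integ, prev_e = 0.5, 0.0, 0.0, 1.0
    cost, t = 0.0, 0.0
    while t < T:
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u) / tau
        t += dt
        cost += t * abs(e) * dt
        if abs(y) > 1e6:
            return float("inf")          # unstable gains get an infinite cost
    return cost

def cpso_pid(n=20, iters=60, lb=0.0, ub=10.0):
    z = 0.345
    def chaos():                         # logistic map: chaotic numbers in (0, 1)
        nonlocal z
        z = 4.0 * z * (1.0 - z)
        return z
    # Chaotic initialization of positions instead of uniform random numbers.
    X = [[lb + (ub - lb) * chaos() for _ in range(3)] for _ in range(n)]
    V = [[0.0] * 3 for _ in range(n)]
    P = [x[:] for x in X]                # personal bests
    pf = [step_itae(*x) for x in X]
    g = P[pf.index(min(pf))][:]          # global best
    for it in range(iters):
        w = 0.9 - 0.5 * it / iters       # linearly decreasing inertia weight
        for i in range(n):
            for d in range(3):
                V[i][d] = (w * V[i][d]
                           + 2.0 * chaos() * (P[i][d] - X[i][d])
                           + 2.0 * chaos() * (g[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lb), ub)
            f = step_itae(*X[i])
            if f < pf[i]:
                pf[i], P[i] = f, X[i][:]
                if f < step_itae(*g):
                    g = X[i][:]
    return g

print("Kp, Ki, Kd =", cpso_pid())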
New Wishart MRF Method for Fully PolSAR Image Classification
ZHANG Shuang,WANG Shuang and JIAO Li-cheng
Computer Science. 2014, 41 (11): 282-285.  doi:10.11896/j.issn.1002-137X.2014.11.055
Abstract PDF(913KB) ( 411 )   
References | Related Articles | Metrics
The unsupervised Wishart classifier often produces misclassified pixels, i.e., several classes present the same polarimetric scattering mechanism, or one class contains several different polarimetric scattering mechanisms. To address this, this paper proposed a new Wishart MRF method for fully polarimetric synthetic aperture radar (PolSAR) image classification. Instead of keeping the scattering characteristic of each pixel fixed, the new method sets a constrained scope within which a label may shift from one class to another. In addition, prior information is extracted by an MRF method with an adaptive neighborhood. The physical scattering characteristics are considered together with the statistical and spatial information, and they are preserved to a certain degree. Compared with the traditional Wishart classifier and a modified Wishart classifier that preserves polarimetric scattering characteristics, the proposed method achieves better classification results on the JPL/NASA AIRSAR data of San Francisco.
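The statistical core that both the traditional classifier and an MRF data term rely on is the Wishart distance between a pixel's coherency matrix T and a class center Sigma, d(T, Sigma) = ln|Sigma| + tr(Sigma^{-1} T); a minimal sketch follows (the MRF prior and the label-shift constraint are omitted, and the matrices are toy data).

import numpy as np

def wishart_distance(T, sigma):
    """d(T, Sigma) = ln|Sigma| + tr(Sigma^{-1} T) for Hermitian coherency matrices."""
    _, logdet = np.linalg.slogdet(sigma)
    return logdet + np.trace(np.linalg.solve(sigma, T)).real

def classify(T_pixels, centers):
    """Assign each coherency matrix to the class center of minimum Wishart distance."""
    return [min(range(len(centers)),
                key=lambda m: wishart_distance(T, centers[m]))
            for T in T_pixels]

# Two hypothetical 3x3 class centers and one pixel near the first of them.
c0 = np.eye(3, dtype=complex)
c1 = 4 * np.eye(3, dtype=complex)
pixel = 1.1 * np.eye(3, dtype=complex)
print(classify([pixel], [c0, c1]))   # [0]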
Image Matching Algorithm Combining SURF Feature Point and DAISY Descriptor
LUO Nan,SUN Quan-sen,CHEN Qiang,JI Ze-xuan and XIA De-shen
Computer Science. 2014, 41 (11): 286-290.  doi:10.11896/j.issn.1002-137X.2014.11.056
Abstract PDF(996KB) ( 833 )   
References | Related Articles | Metrics
Image matching is a basic technique in computer vision research, and local-feature-based image matching methods are becoming increasingly popular in this field. To overcome the classical SURF algorithm's poor rotation invariance, this paper proposed a new matching algorithm combining SURF feature points with the DAISY descriptor. We proposed a main-orientation assignment method better suited to the DAISY descriptor, so that a new descriptor can be obtained by rotating by the main orientation. Our algorithm effectively improves the rotation invariance of the classical SURF algorithm at only a small additional computational cost. Experimental results demonstrate that our algorithm is more robust than classical methods under image blur, illumination changes, JPEG compression and viewpoint changes.
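A simple stand-in for a main-orientation assignment (not the paper's DAISY-specific method): the dominant gradient orientation of the patch around a keypoint is taken as the peak of a magnitude-weighted orientation histogram, and the descriptor is then computed in a frame rotated by that angle.

import numpy as np

def main_orientation(patch, bins=36):
    """Dominant gradient orientation of a keypoint patch (histogram peak)."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)                 # angles in (-pi, pi]
    hist, edges = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    k = hist.argmax()
    return 0.5 * (edges[k] + edges[k + 1])   # bin center, in radians

patch = np.tile(np.arange(32, dtype=float), (32, 1))   # intensity grows with x
print(round(main_orientation(patch), 2))   # near 0 rad: gradient points along +x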
Feature Points Processing Strategies Based on Local Symmetry and its Application
GUO Yi-chao,LI Qing-yong,SUN Jin-rui,HUANG Ya-ping and TIAN Mei
Computer Science. 2014, 41 (11): 291-296.  doi:10.11896/j.issn.1002-137X.2014.11.057
Abstract PDF(1710KB) ( 350 )   
References | Related Articles | Metrics
In the image retrieval framework based on the Bag of Words (BoW) model, images usually contain a large number of SIFT feature points whose features are not distinctive enough, which affects the efficiency and performance of the image retrieval system. Based on the properties of SIFT feature points and the principle of visual saliency, this paper proposed a local symmetry measure for SIFT feature points and embedded two symmetry-based processing strategies in the BoW image retrieval framework: a filtering method and a weighting strategy. Experimental results on the Oxford Buildings dataset show that the symmetry-based selection strategy for SIFT feature points can effectively improve the performance of image retrieval systems.
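One simple way to realize a local symmetry measure, used here only as an illustrative stand-in for the paper's definition, is the correlation of the patch around a keypoint with its mirror image; keypoints scoring below a threshold are then dropped (the filtering strategy), or their visual-word votes are scaled by the score (the weighting strategy).

import numpy as np

def symmetry_score(patch):
    """Correlation of a patch with its horizontal mirror, in [-1, 1]."""
    mirrored = patch[:, ::-1]
    a = patch - patch.mean()
    b = mirrored - mirrored.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
noise = rng.random((16, 16))
symmetric = noise + noise[:, ::-1]           # symmetric by construction
print(symmetry_score(symmetric) > symmetry_score(noise))   # True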
Algorithm of Image Crack Detection Based on Morphology and Region Extends
QU Zhong,LIN Li-dan and GUO Yang
Computer Science. 2014, 41 (11): 297-300.  doi:10.11896/j.issn.1002-137X.2014.11.058
Abstract PDF(816KB) ( 619 )   
References | Related Articles | Metrics
In complex backgrounds, interference from uneven illumination, concrete bubbles, shadows and other noise results in false crack detections and varying degrees of crack fracture. To solve these problems and detect cracks accurately, a crack detection algorithm combining mathematical morphology with region extension was proposed in this paper. First, images collected under natural conditions are preprocessed with morphological methods, and Canny edge detection is combined with morphology to achieve coarse crack detection. Second, a region extension algorithm is adopted for accurate detection. Finally, the detected cracks are post-processed to make them look natural. Experimental results show that the proposed algorithm can detect cracks on concrete pavement accurately and efficiently.
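A minimal sketch of the coarse-detection stage using OpenCV, i.e. morphological preprocessing plus Canny; the region-extension refinement is omitted, and the kernel sizes and thresholds are illustrative.

import cv2
import numpy as np

def coarse_cracks(gray):
    """Morphology + Canny coarse crack map; parameters are illustrative."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    # Black-hat emphasizes thin dark cracks against a brighter, uneven background.
    enhanced = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    denoised = cv2.GaussianBlur(enhanced, (5, 5), 0)
    edges = cv2.Canny(denoised, 40, 120)
    # Closing reconnects small breaks in the detected crack edges.
    return cv2.morphologyEx(edges, cv2.MORPH_CLOSE,
                            cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))

img = np.full((64, 64), 180, np.uint8)
cv2.line(img, (5, 5), (58, 58), 60, 1)       # synthetic dark "crack"
print(int(coarse_cracks(img).sum() > 0))     # 1: something was detected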
Shape Prior Based Hybrid Active Contour Model and its Applications in Image Segmentation
CAO Dong-mei and XU Jun
Computer Science. 2014, 41 (11): 301-305.  doi:10.11896/j.issn.1002-137X.2014.11.059
Abstract PDF(1219KB) ( 485 )   
References | Related Articles | Metrics
In this paper, a new Shape-prior based Hybrid Active Contour (SHAC) model was presented for segmentation. Using the level set method, the model combines boundary information with adaptive region information and learns an optimal prior shape from a training set. It takes the boundary and adaptive region features as local information and the prior shape as global information. The model combines the global and local information during iteration to guide the evolution of the deformable curve and thus segment the target objects. Experiments show that, compared with the GAC, C-V and RSF models, the SHAC model is advantageous not only for segmenting images with strong noise and weak boundaries, but also for images with low contrast and complicated backgrounds, and it improves segmentation accuracy.
Image Copy Detection Algorithm by Image Force Field Mapping
ZHANG Bing and ZAN Cheng
Computer Science. 2014, 41 (11): 306-308.  doi:10.11896/j.issn.1002-137X.2014.11.060
Abstract PDF(996KB) ( 308 )   
References | Related Articles | Metrics
The concept of a force field, used in physics to describe intermolecular interaction, is adapted to image processing. The image force field is defined by taking pixels as particles and grey level as mass. The force field of an image models the discrete information of the image in a more intuitive way and is shown to be isomorphic to the image itself. Based on the image force field, an image copy detection algorithm called the Clock algorithm was presented. In the Clock algorithm, the features of an image force field map are represented by three lines sharing one endpoint, like the hour, minute and second hands of a clock, hence the name. The Clock algorithm was tested on an image database. The experimental results show that it can detect both noise-like distortions and geometric distortions with very high accuracy. In addition, the type and intensity of the distortion can also be determined, while maintaining high recall and precision at the same time.
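A direct sketch of the force-field construction: with pixels as particles and grey level as mass, the force at pixel i is the gravity-like sum over all other pixels j of m_j (r_j - r_i) / |r_j - r_i|^3 (the Clock feature extraction built on top of this map is omitted, and the tiny image is toy data).

import numpy as np

def force_field(img):
    """Gravity-like force at each pixel: mass = grey level, particles = pixels."""
    H, W = img.shape
    ys, xs = np.mgrid[0:H, 0:W]
    pos = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
    m = img.ravel().astype(float)
    diff = pos[None, :, :] - pos[:, None, :]          # r_j - r_i
    d3 = np.linalg.norm(diff, axis=2) ** 3
    np.fill_diagonal(d3, np.inf)                      # no self-force
    F = (m[None, :, None] * diff / d3[:, :, None]).sum(axis=1)
    return F.reshape(H, W, 2)

img = np.zeros((9, 9))
img[4, 7] = 255.0                                     # one bright mass on the right
F = force_field(img)
print(np.round(F[4, 0], 3))                           # force at (4,0) points toward +x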
Research on Natural Scene Statistics in Color Space
CHU Jiang and CHEN Qiang
Computer Science. 2014, 41 (11): 309-312.  doi:10.11896/j.issn.1002-137X.2014.11.061
Abstract PDF(305KB) ( 578 )   
References | Related Articles | Metrics
Natural scene statistics have been widely used in blind image quality assessment, but most assessment methods are designed for gray-level images and do not make proper use of color space information. We studied five color spaces (RGB, HSV, LAB, YCbCr, YIQ) and then used the Gaussian, logistic, extreme value and t distributions to model the normalized coefficients, in order to find the best model for each color space. We then used the Gaussian model parameters as features to classify the distorted images in the LIVE database, and found that the classification precision in some color spaces outperforms that of gray-scale statistics.
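The normalized coefficients being modeled are, in standard NSS practice, the mean-subtracted contrast-normalized (MSCN) values of a channel; the sketch below computes them for one channel and fits a Gaussian by moments. Applying MSCN per color-space channel is an assumption in the spirit of the paper, not its verbatim procedure.

import numpy as np
from scipy.ndimage import gaussian_filter

def mscn(channel, sigma=7/6, c=1.0):
    """Mean-subtracted contrast-normalized coefficients of one color channel."""
    channel = channel.astype(float)
    mu = gaussian_filter(channel, sigma)
    var = gaussian_filter(channel * channel, sigma) - mu * mu
    return (channel - mu) / (np.sqrt(np.maximum(var, 0)) + c)

rng = np.random.default_rng(0)
chan = rng.normal(128, 20, (64, 64))          # stand-in for, e.g., the H channel
coeffs = mscn(chan).ravel()
# Gaussian fit by moments: these two numbers are the features per channel.
print("mean=%.3f  std=%.3f" % (coeffs.mean(), coeffs.std()))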
Improved Denoising Algorithm Based on Non-regular Area Gaussian Filtering
SI Shao-hui,HU Fu-yuan,GU Ya-jun and XIAN Xue-feng
Computer Science. 2014, 41 (11): 313-316.  doi:10.11896/j.issn.1002-137X.2014.11.062
Abstract PDF(581KB) ( 604 )   
References | Related Articles | Metrics
Because the traditional Gaussian filter often blurs edge structures in images, an improved Gaussian filtering algorithm based on non-regular areas was proposed in this paper. Departing from the traditional fixed-window design, the algorithm adaptively constructs a non-regular Gaussian mask region by analyzing the self-similarity of texture. The weights assigned by the Gaussian function become more reasonable, preserving rich texture, because potential noise and pixels of low similarity are excluded from the non-regular area. Experimental results show that the proposed algorithm strikes a better balance between preserving image details and denoising, and outperforms traditional Gaussian filtering and other methods.
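A small sketch of the adaptive-mask idea: within each window, only pixels sufficiently similar to the center enter the non-regular mask, and the Gaussian weights are renormalized over that mask. The plain intensity threshold below is an illustrative stand-in for the paper's texture self-similarity analysis.

import numpy as np

def nonregular_gaussian(img, radius=2, sigma=1.5, tol=20.0):
    """Gaussian smoothing restricted to a non-regular, similarity-selected mask."""
    img = img.astype(float)
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    g = np.exp(-(x * x + y * y) / (2 * sigma * sigma))
    pad = np.pad(img, radius, mode="reflect")
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            mask = np.abs(win - img[i, j]) <= tol   # the non-regular region
            w = g * mask
            out[i, j] = (w * win).sum() / w.sum()
    return out

step = np.repeat([[0.0, 100.0]], 8, axis=0).repeat(4, axis=1)   # sharp edge
print(np.abs(nonregular_gaussian(step) - step).max())  # 0.0: the edge is preserved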
Texture Classification Using Phase Congruency
MA Yan,YANG Hai-jun and HE Jiang-ping
Computer Science. 2014, 41 (11): 317-320.  doi:10.11896/j.issn.1002-137X.2014.11.063
Abstract PDF(561KB) ( 431 )   
References | Related Articles | Metrics
This paper presented a phase congruency (PC) based method for texture classification. First, PC is computed for an image. Then, the continuous PC values are quantized. Finally, PC histograms of the images are constructed and used to classify texture images. The PC histogram reflects global characteristics, and it can be combined with Local Binary Pattern (LBP) methods to improve descriptive power. Experiments on the Outex, Brodatz and CUReT databases show that better results are obtained by combining the PC and LBP methods. PC has good noise tolerance, and the experiments show that the combination achieves promising results in noisy cases.
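A sketch of the quantize-and-histogram step and of the concatenation with an LBP histogram, assuming a PC map in [0, 1] is already available (computing phase congruency itself, e.g. via log-Gabor filters, is omitted, and the bin counts are illustrative):

import numpy as np
from skimage.feature import local_binary_pattern

def pc_histogram(pc_map, bins=16):
    """Quantize continuous PC values in [0, 1] and build a normalized histogram."""
    h, _ = np.histogram(pc_map, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def combined_feature(image, pc_map, bins=16):
    """Concatenate the global PC histogram with a uniform-LBP histogram."""
    lbp = local_binary_pattern(image, P=8, R=1, method="uniform")
    lh, _ = np.histogram(lbp, bins=10, range=(0, 10))   # 10 uniform patterns
    return np.concatenate([pc_histogram(pc_map, bins), lh / lh.sum()])

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64)).astype(np.uint8)
pc = rng.random((64, 64))                     # stand-in for a real PC map
print(combined_feature(img, pc).shape)        # (26,)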