Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 42 Issue 12, 14 November 2018
Survey on Graph Model-based Document Summarization
WANG Jun-li, WEI Shao-chen and GUAN Min
Computer Science. 2015, 42 (12): 1-7. 
With the rapid development of Internet technologies, the speed of information transmission has reached unprecedented levels. However, extracting valuable information from massive data is becoming more and more difficult. Automatic summarization technologies, which extract a summary representing the main idea of the original document, have therefore attracted much attention. Many techniques have been applied in existing automatic summarization approaches, such as statistical analysis, machine learning and linguistic knowledge. This paper surveys summarization approaches based on graph-based ranking algorithms. First, the basics of automatic summarization and graph-based ranking algorithms are elaborated. Then summarization approaches based on graph ranking algorithms are introduced, covering three main parts: construction of the text graph, graph-based ranking, and sentence selection. Finally, building on an analysis of existing approaches, future directions for graph-based summarization are explored.
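The abstract does not name a specific ranking algorithm; most graph-based summarizers build on PageRank-style iteration over a sentence-similarity graph (e.g. TextRank/LexRank). A minimal sketch under that assumption, taking a precomputed similarity matrix as input:

```python
def rank_sentences(sim, d=0.85, iters=50):
    """PageRank-style sentence ranking over a similarity matrix.

    sim[i][j] is the (non-negative) similarity between sentences i and j;
    d is the usual damping factor. Returns one score per sentence, so the
    summary can be built by selecting the highest-scoring sentences.
    """
    n = len(sim)
    scores = [1.0 / n] * n
    # Total outgoing weight of each sentence node (avoid division by zero).
    out = [sum(row) or 1.0 for row in sim]
    for _ in range(iters):
        scores = [(1 - d) / n
                  + d * sum(sim[j][i] / out[j] * scores[j] for j in range(n))
                  for i in range(n)]
    return scores
```

A sentence connected to many others accumulates a higher score, mirroring the intuition that central sentences carry the document's main idea.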
Advances in Computer-based Lung Sounds Classification Method
ZHENG Ming-jie, SONG Yu-qing and LIU Yi
Computer Science. 2015, 42 (12): 8-12. 
A lung sound signal is a physiological acoustic signal generated during ventilation between the human respiratory system and the outside. It contains a wealth of physiological and pathological information and has great research value. In recent years, environmental problems such as air pollution, fog and haze have led to a rise in the incidence of respiratory disease. To meet the growing demand for fast and accurate diagnosis of lung disease, auscultation has attracted more attention for its convenience and safety, yet it has limitations: it depends on the experience and hearing of the physician and on the limited frequency response of the stethoscope. With the development of automated lung sound diagnostic techniques and hardware, computer-based lung sound classification makes up for the deficiencies of traditional auscultation. This paper introduces the concept of lung sounds and computer-based lung sound signal processing and pattern recognition techniques, and summarizes recent developments in machine learning-based lung sound classification. Finally, research and application trends of lung sound classification techniques are discussed.
Quantitative Performance Analysis Model of Matrix Multiplication Based on GPU
YIN Meng-jia, XU Xian-bin, XIONG Zeng-gang and ZHANG Tao
Computer Science. 2015, 42 (12): 13-17. 
Performance evaluation and optimization are indispensable when designing efficient parallel programs, and the performance of the storage system directly affects the performance of the processor. We used GPGPU-Sim to simulate the GPU storage hierarchy and found the optimal quantitative allocation between SMs and memory controllers in the GPU. Matrix multiplication is an essential kernel in scientific computing; as a representative application that is both compute- and memory-intensive, its performance is an important indicator of GPU high-performance computing. Performance modeling is a promising approach to parallel-system performance evaluation with many advantages. To improve the performance of matrix multiplication, this paper proposes a quantitative performance model for GPUs. The model quantitatively analyzes the instruction pipeline, shared memory accesses and global memory accesses, identifies performance bottlenecks and improves execution speed. Experiments show that the model is practical and effectively supports optimization of the matrix multiplication algorithm.
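The abstract does not reproduce the model's equations; a common quantitative starting point is a roofline-style bound that takes the larger of compute (pipeline) time and global-memory time under shared-memory tiling. A hypothetical sketch, where the peak throughput, bandwidth and tile size are assumed parameters, not values from the paper:

```python
def matmul_time_estimate(n, peak_flops, mem_bw, tile=32):
    """Roofline-style lower bound on n x n single-precision matmul time.

    With tile x tile shared-memory blocking, each element of A and B is
    fetched from global memory roughly n / tile times, plus one write of C.
    The kernel is bound by whichever of compute or memory takes longer.
    """
    flops = 2.0 * n ** 3                       # one multiply-add per term
    bytes_moved = (2.0 * n ** 3 / tile + n ** 2) * 4   # 4-byte floats
    t_compute = flops / peak_flops
    t_memory = bytes_moved / mem_bw
    return max(t_compute, t_memory)
```

Doubling the tile size halves the global-memory traffic, which is the quantitative reason shared-memory blocking moves the kernel from memory-bound toward compute-bound.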
Memory Access Optimization for Vector Program of SIMD Form
XU Jin-long, ZHAO Rong-cai and XU Xiao-yan
Computer Science. 2015, 42 (12): 18-22. 
There are two ways to obtain a vector program: writing it by hand, or generating it automatically with a compiler. Limited by the abilities of programmers and parallelizing compilers, vector programs usually leave room for optimization. Optimizing compilers are mostly concerned with transforming serial programs into vector form, and rarely perform further optimization after the vector form is generated. We propose a memory access optimization method for vector programs in SIMD form. The method first determines whether the program needs to be optimized; if so, redundancy elimination and alignment optimization are applied to the vector program. Experimental data show that the proposed method significantly improves the running efficiency of the programs, achieving its goal.
Mailing List Based QA Information Extraction Approach
LUO Yu-xiang, ZOU Yan-zhen, JIN Yong and XIE Bing
Computer Science. 2015, 42 (12): 23-25. 
Open source projects often provide mailing lists to help users better understand and use the software. However, developers often spend a lot of time searching the emails for a specific answer, because there is a huge number of emails with unclear questions and complex organization. Users usually go through many email conversations before getting the right answer. In this paper, we propose and implement a question-and-answer information extraction approach based on open source software mailing lists. It automatically extracts question sentences and the corresponding best answers from the emails, which helps users search mailing lists and learn open source software more effectively. Experiments verify the availability and efficiency of our approach.
Power-aware Virtual Machine Dynamic Mapper Using Particle Swarm Optimization
SU Yu, GAO Yang and QIN Zhi-guang
Computer Science. 2015, 42 (12): 26-31. 
Power consumption management has become one of the important issues in cloud computing. This paper designs and implements a new meta-heuristic scheduler in which the proposed self-adaptive particle swarm optimization (SAPSO) detects and tracks the changing optimal target servers for virtual machine (VM) provisioning in the resource pool. The method considers resource dynamics and the power consumption of busy servers under different loads, as well as idle servers in different sleep states, minimizing the incremental power of VM mapping due to workload placement without degrading performance. Simulations show that the proposed power-aware SAPSO significantly reduces the power consumption increment without compromising either VM mapping performance or QoS under SLAs with cloud users.
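SAPSO's self-adaptive parameter control and its discrete VM-to-server encoding are not detailed in the abstract. As a minimal sketch, here is the plain global-best PSO core such a scheduler would build on, minimizing a generic cost function; the power model and the mapping of continuous particle positions to discrete VM placements are elided assumptions:

```python
import random

def pso_minimize(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.4, c2=1.4):
    """Global-best particle swarm optimization over a continuous search space.

    cost: objective to minimize (e.g. incremental power of a VM mapping);
    w, c1, c2: inertia, cognitive and social coefficients (fixed here,
    whereas SAPSO would adapt them during the search).
    """
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

In a VM mapper, `cost` would score a candidate placement by the power increment it causes across busy and sleeping servers.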
Browse-shopping-behavior-pattern-oriented Indoor LBS Mobile Application for Book Shopping
HE Yuan-duo, CHEN Zhi-yun and WANG Ya-sha
Computer Science. 2015, 42 (12): 32-35. 
To enhance the shopping experience, many shopping assistant mobile applications are emerging. They provide supplementary information on a specific commodity, which is helpful for shoppers with clear shopping goals. However, under the browse shopping pattern, where shoppers treat shopping as a kind of relaxation and have no clear goals, existing applications help little. This paper implements an indoor LBS application oriented to the browse shopping pattern for book shopping. Its back end collects information on books from both the bookstore's information management system and the Internet; a keyphrase extraction method then builds a keyphrase database organized by position. The front end applies indoor positioning technology to display overview information, helping shoppers quickly understand the commodities. The evaluation indicates that the application can enhance shoppers' shopping experience.
Software Evolution Visualization Based on 3D Animation
YU Han, WANG Hai, PENG Xin and ZHAO Wen-yun
Computer Science. 2015, 42 (12): 36-39. 
Data visualization is an important research area of modern computer science, especially for software maintenance research. An interactive visualization with 3D animation can show a software evolution history vividly. In our system, the software evolution history is likened to the development of a real-world city. Users can easily move through the city to view the details of the evolution history as well as high-level trends in the software architecture. We developed a prototype tool using Unity3D, building on related work. The prototype achieves the goal of providing an easy way to view software maintenance data.
Cloud-based Notes Plugin for Class on Google Chrome
QIAO Zi-jian, CHEN De-jian and SUN Yan-chun
Computer Science. 2015, 42 (12): 40-42. 
Nowadays, in-class teaching and Web-based teaching are the two main modes of education. However, neither provides students with a platform to share notes or questions on lecture details, nor can students form a sharing convention around their study and knowledge. As a result, it is difficult for students to gain a deep understanding of the handouts in a general learning environment. To solve this problem, based on an analysis of existing education platforms and cloud-based note services, this paper designs the system architecture of a cloud-based notes plugin for Google Chrome and implements it with HTML5, Node.js and other key technologies. Students can not only take notes and submit questions covering details of online lectures, but also share them with others, which helps greatly with both interactive discussion and knowledge understanding. Finally, the paper gives case studies to verify the feasibility and effectiveness of the proposed solution.
Evolution of Contributors in Open Source Software Development
LI Qi-feng and LI Bing
Computer Science. 2015, 42 (12): 43-46. 
Open source software development relies heavily on voluntary contributions, and developers are self-selected. As open source projects evolve, changes in the development team affect their organization and decision structure: new members enter the group and others leave. Related research shows that, in general, a small group of very active developers is responsible for the proper evolution of a project, but that work does not attend to the time axis that evolution requires. In this paper, we analyze how software developers evolve in open source projects. As a case study, we selected GNU/Linux to research the behavior of contributors working on an open source project. Our aim is to give quantitative insight into the evolution of maintainers. We study how many of these developers remain from the beginning of the project, and what happens to packages maintained by developers who left the project.
Automatic Verification of Singly Linked List Pointer’s Reachability Property Using Data-flow Analysis Method
DONG Yu-chen, WANG Han-fei and ZHAO Jian-hua
Computer Science. 2015, 42 (12): 47-51. 
A common scenario in verifying a program is finding out whether some user-specified properties hold at certain program points during or after execution. Manual formal verification is tedious and error-prone, so automatic verification is an important way to improve verification efficiency. Data-flow analysis can automatically discover specific properties at program points. This paper presents a method that integrates a data-flow analysis (reachability of singly linked list pointers) with Scope-Logic-based code verification. We collect reachability properties at program points through data-flow analysis, present the results as first-order logic formulas with recursive functions, insert these formulas into the corresponding program points, and prove them and establish their dependencies according to the rules of Scope Logic. Experiments show that our method acquires singly linked list pointers' reachability properties efficiently, and the results can be used effectively in code verification in Scope Logic.
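The paper's transfer functions for pointer reachability are not given in the abstract, but any forward data-flow analysis of this kind runs on the same worklist fixpoint skeleton. A sketch of that skeleton; the control-flow graph and the transfer function used in the test are placeholder assumptions:

```python
def dataflow_fixpoint(cfg, transfer, init):
    """Forward data-flow analysis by worklist iteration.

    cfg maps each program point to its successor points; transfer(p, facts)
    computes the out-fact set of point p from its in-facts (for the paper's
    analysis, facts would be pointer-reachability properties). Iterates
    until no out-fact adds anything new to a successor.
    """
    facts = {p: set(init) for p in cfg}
    work = list(cfg)
    while work:
        p = work.pop()
        out = transfer(p, facts[p])
        for q in cfg[p]:
            if not out <= facts[q]:       # successor gains new facts
                facts[q] |= out
                work.append(q)            # re-examine the successor
    return facts
```

Since the fact sets only grow and are bounded, the iteration terminates at the least fixpoint, which is what gets rendered as logic formulas at each program point.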
Pervasive Computing Environment Oriented Service Orchestration Framework for Android
GU Jing-xiao, PENG Xin and ZHAO Wen-yun
Computer Science. 2015, 42 (12): 52-55. 
Mobile devices and smart phones are the main carriers of service aggregation and orchestration in pervasive computing environments. Services in such environments take diverse forms, including remote Web services, local services within devices (such as local apps and built-in sensors) and context-aware services provided by the surroundings (such as environmental sensors). In addition, changing contexts place new demands on the adaptive ability of the software itself, while service orchestration on mobile devices is limited by computing power and resources. To address these problems, this paper presents ASOF, a pervasive-computing-oriented service orchestration framework for the Android platform based on SOA. With ASOF, mobile devices can obtain service templates of required business processes and bind concrete services to the abstract services in the templates; in this way, the devices perform lightweight hybrid service orchestration and dynamically gain the ability to invoke all types of services in a pervasive computing environment. A reference implementation of ASOF on the OSGi Felix framework is then given and validated through a concrete case.
SmartHR:A Resume Query and Management System Based on Semantic Web
KE Ye-qing, MA Zhi-rou, WU Hai-jiang and LIU Jie
Computer Science. 2015, 42 (12): 56-59. 
Personnel departments are constantly confronted with the challenge of efficiently finding suitable candidates among massive numbers of resumes. Business departments usually express their demand for talent as tags, such as "with rich search engine development experience" or "graduated from a 985 university". Such requirements cannot be served by SQL queries or keyword search. To fill this gap, this paper proposes a resume analysis and search method based on the semantic Web. Using a domain knowledge base to assist information extraction, resume information is semantically analyzed and tags for suitable candidates are generated automatically. In addition, for large personnel datasets, we propose a multi-level cache by which performance is greatly improved. The method has been applied to nearly ten thousand personnel agency resumes, and experiments show its effectiveness.
Dynamic Updating Technology for Application Instances on PaaS
ZHANG Jie, CAO Chun and YU Dong-liang
Computer Science. 2015, 42 (12): 60-64. 
Platform-as-a-Service (PaaS) is one of the key services in cloud computing, providing a highly available and scalable development and runtime environment for applications. However, when applications running on a PaaS platform need to be updated, current platforms lose their high availability due to the lack of effective support for dynamic updating. To solve this problem, building on current research on dynamic software updating, we introduce a PaaS-oriented dynamic software updating framework. With extensions for transaction management, dynamic dependence management and version management, we achieve instance-level dynamic updates for applications on PaaS platforms. We implemented the framework on Cloud Foundry to demonstrate the effectiveness of our technology.
Safety Requirements Description Method Based on RUCM
WU Xue, LIU Chao and WU Ji
Computer Science. 2015, 42 (12): 65-70. 
Safety requirements have commanded increasing attention as software plays an ever more important role in today's safety-critical systems. The extraction and description of software safety requirements are key elements of software safety work, since subsequent design, implementation and testing all refer back to them. Nevertheless, most safety requirements are described within ordinary functional specifications and lack independent, normative descriptions, especially of the relationships between safety requirements, faults and failures. As a result, there is little practical guidance on how to describe safety requirements. This paper therefore designs a safety requirements specification, Safety RUCM, based on restricted use case modeling (RUCM), extending its template and restriction rules with fault and data specifications to capture fault-related information. We used the specification to describe the safety requirements of an operating system; the results show that it is practicable.
Method Combining Linear Temporal Logic and Fault Tree for Software Safety Verification
WANG Fei, SHEN Guo-hua, HUANG Zhi-qiu, MA Lin, LIU Chang, LI Hai-feng and LIAO Li-li
Computer Science. 2015, 42 (12): 71-75. 
Embedded software plays an important role in safety-critical fields, and how to ensure the safety of safety-critical software has recently become a research focus. Fault tree analysis (FTA) is a safety analysis method commonly used in traditional industry; however, FTA itself lacks formal temporal semantics. To solve this problem, this paper proposes a method for verifying the safety of embedded software that combines linear temporal logic with FTA. Applying linear temporal logic to the formal specification of the fault tree, the method extracts software safety properties from the formalized fault tree and describes them in temporal logic. Experts can then use the extracted safety properties in model checking of the safety-critical software, to analyze and verify its reliability and safety. The paper applies the approach to a module of safety-critical airborne software to demonstrate the method in detail.
Dynamic Target Tracking and Predicting Algorithm Based on Combination of Motion Equation and Kalman Filter
WANG Yan, DENG Qing-xu, LIU Geng-hao and YIN Biao
Computer Science. 2015, 42 (12): 76-81. 
In view of the large errors of traditional localization technologies and their inability to predict a target's position, this paper presents ME-KF, a dynamic target tracking and prediction algorithm combining motion equations and Kalman filtering. It models the motion characteristics of a dynamic target with motion equations, reduces the influence of noise on measurement results, and predicts the position of the target at the next moment. The algorithm has been deployed in the personnel location system of the Liaoning Paishanlou mine with remarkable results. Experiments show that the method improves localization precision, predicts personnel positions, gives early warning of potentially dangerous areas, and can also successfully analyze the distribution of obstacles.
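ME-KF's exact motion equations are not given in the abstract. A minimal 1-D constant-velocity sketch shows the combination it describes: predict with a motion equation, correct with a Kalman update, and report a one-step-ahead position. The time step and the noise parameters `q` and `r` are illustrative assumptions:

```python
def kalman_track(measurements, dt=1.0, q=0.01, r=1.0):
    """1-D constant-velocity Kalman filter over position measurements.

    Returns, per measurement, (filtered position, predicted next position).
    State is (position x, velocity v); q is process noise, r measurement noise.
    """
    x, v = measurements[0], 0.0
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    out = []
    for z in measurements:
        # Predict with the motion equation x' = x + v*dt (F = [[1,dt],[0,1]]).
        x, v = x + v * dt, v
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # Update with the position measurement z (H = [1, 0]).
        s = P[0][0] + r
        k0, k1 = P[0][0] / s, P[1][0] / s  # Kalman gain
        y = z - x                          # innovation
        x, v = x + k0 * y, v + k1 * y
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        out.append((x, x + v * dt))        # filtered and one-step prediction
    return out
```

On noisy position fixes, the filtered estimate smooths the measurement jitter, and the one-step prediction is what an early-warning check against dangerous areas would use.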
MARTE Models Based System Reliability Prediction
CHAI Ye-sheng, ZHU Xue-yang, YAN Rong-jie and ZHANG Guang-quan
Computer Science. 2015, 42 (12): 82-86. 
System reliability is an important qualitative attribute of systems, and model-based reliability analysis can find and solve problems at an early phase of development. The UML profile for MARTE extends UML for the real-time and embedded domain. We present a method for reliability prediction from MARTE models. The MARTE model considered by the method comprises a use case diagram, a deployment diagram and a set of activity diagrams. A MARTE model is transformed into a network of Markov decision processes, which is then analyzed with the model checking tool PRISM, yielding an estimate of system reliability. A case study demonstrates that, by analyzing models with varying resource reliability, our method can further reveal the impact of each resource's reliability on the system.
Display Process and Technique Implementation of Ontology Conceptual Diagram
WANG Shi-qi, LI Yi-xiao, SHEN Li-wei and ZHAO Wen-yun
Computer Science. 2015, 42 (12): 87-91. 
Ontology modeling is critical in the research and construction of the semantic Web. Faced with a giant-scale domain, crowdsourcing and graphical editing can attract more people to participate in ontology modeling. In a platform designed for this, the display of the concept model must meet a series of requirements, including correctness of content, partial display and lightweight data transmission. To satisfy these needs, this paper studies and summarizes the display process of ontology-based concept maps, which transfers ontology data from the back end to the concept map editor in the front end. The process comprises a series of steps: loading the ontology model, ontology localization, data transmission, model transformation and model display. In addition, based on a set of tools including SPARQL and XSLT, the paper gives an implementation scheme for concept map display and integrates the modules implementing the display process into a collaborative ontology modeling platform.
Design and Implementation of RFID-based Campus Navigation System
CUI Jin-qi and TAO Xian-ping
Computer Science. 2015, 42 (12): 92-94. 
With the development of GIS, LBS, mobile Internet and other core technologies, personal navigation systems serving the general public have become a popular application. Indoor walking guidance is at the core of personal navigation, and the indoor positioning technology involved remains a research challenge. Based on a fixed RFID tag group, this paper generates an indoor location map and, combining portable mobile RFID readers with smart phones, implements indoor positioning, path computation and navigation warnings. Based on this design, we built an RFID-based navigation system for the Nanjing University campus and have put it into practice.
Simulation and Real-time Analysis for Embedded Software Design Model with Consideration of Integrated Modular Avionics Platform
SUN Lei, YANG Hai-yan and WU Ji
Computer Science. 2015, 42 (12): 95-97. 
How to ensure that avionics software satisfies its real-time requirements is a persistent research problem. According to results reported in industry, the earlier defects are found, the less they cost to fix, improving the chances of not missing deadlines. For avionics software running on an integrated modular avionics (IMA) platform standardized by ARINC 653, the following method can be used: the design model (in UML) of the avionics software is transformed into a simulation model (in Simulink), and potential real-time problems are investigated by executing the simulation models in Simulink. Since avionics software may interact extensively with the IMA platform (i.e., the interface layer and operating system layer) to request resources or to communicate with other applications, a dedicated simulation module was designed to simulate the behavior of the IMA platform. An industrial case study demonstrates the effectiveness of the proposed approach.
Design and Implementation of Vehicle Remote Control and Independent Way-finding System
LI Xiao-fan and XU Chang
Computer Science. 2015, 42 (12): 98-101. 
Intelligent robots obtain environment information such as terrain and temperature with sensors, and then analyze and process this information to take appropriate countermeasures. However, if a robot cannot locate itself in the environment, its user can only perform simple control, and its obstacle avoidance algorithm may fail due to under-sampling of obstacle information. To address these problems, this paper proposes a system for remote control and independent way-finding on smart-car robots. The system automatically locates a smart-car robot through efficient image processing and recognition. It improves on existing algorithms and builds a collision avoidance strategy to ensure that the robot can always find a safe path to its destination. Experimental results show that the system has high locating accuracy, and its remote control and way-finding functions are reliable and useful.
Fault Localization Using Failure-related Contexts for Automatic Program Repair
LI Ang, MAO Xiao-guang and LEI Yan
Computer Science. 2015, 42 (12): 102-104. 
Facing the high cost of program repair in the software life-cycle, researchers from both academia and industry are trying to develop effective automatic program repair techniques. As an essential part of program repair, fault localization plays an important role in repair performance. However, preliminary studies have shown that existing fault localization approaches do not take the features of automatic repair into account and therefore restrict repair performance, so designing fault localization approaches for automatic repair is vital to improving it. To address this issue, this paper proposes a fault localization approach that uses failure-related contexts to improve automatic program repair. The approach first uses program slicing to construct a failure-related context, then evaluates the suspiciousness of each element in the context of being faulty, and finally presents the context and its elements, with their suspiciousness scores, to automatic program repair techniques for repairing faulty programs. Preliminary experimental results demonstrate that the approach effectively improves automatic repair performance.
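The abstract does not say which suspiciousness formula is applied to the context elements; Ochiai is a common spectrum-based choice and serves as a sketch here. It scores statements from test coverage and pass/fail outcomes; in the paper's approach, scoring would be restricted to elements of the failure-related slice. The coverage data in the test is illustrative:

```python
import math

def ochiai(coverage, failing):
    """Spectrum-based suspiciousness via the Ochiai formula.

    coverage[t] is the set of statements executed by test t; failing is
    the set of failing test ids. For each statement s:
        score(s) = ef / sqrt(total_failing * (ef + ep))
    where ef / ep count failing / passing tests that execute s.
    """
    total_fail = len(failing)
    stmts = set().union(*coverage.values())
    scores = {}
    for s in stmts:
        ef = sum(1 for t in failing if s in coverage[t])
        ep = sum(1 for t in coverage if t not in failing and s in coverage[t])
        denom = math.sqrt(total_fail * (ef + ep))
        scores[s] = ef / denom if denom else 0.0
    return scores
```

A statement covered by all failing tests and no passing test scores 1.0, placing it first in the ranking handed to the repair engine.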
Automated Detection of Extract Method Refactorings
LIU Yang, LIU Qiu-rong and LIU Hui
Computer Science. 2015, 42 (12): 105-107. 
Automated detection of refactorings is a hot topic in software refactoring research. Its main purpose is to help clients understand the evolution of, and refactorings made on, the software they depend on. Although a number of methods and tools have been proposed to detect refactorings automatically, to the best of our knowledge none detects extract-method refactorings automatically by comparing two versions of an application. To this end, we propose an approach that detects extract-method refactorings by comparing two successive versions of a given application. We implemented the approach and validated it on open source applications. Evaluation results on 8 open-source applications suggest that its precision varies from 65% to 90%. We also conducted an evaluation by monitoring developers on a small application; there, the recall and precision of the approach are both 85%.
Approach to Subdividing Systems Based on Hierarchical Clustering
ZHU Rui, LIAO Hong-zhi, LI Tong, DAI Fei, WANG Yi-quan, MO Qi and LIN Lei-lei
Computer Science. 2015, 42 (12): 108-114. 
System subdivision is a recurring task in the information technology domain, and a commonly used method is the U/C matrix. However, system complexity and human factors cause critical problems such as inefficiency, uncertainty and mistaken division. Consequently, this paper discusses the relationships between system and subsystems, subsystems and functions, and functions and data, together with the properties of these relationships. Subsystems are hierarchically clustered according to the similarity of functions, measured by structure entropy and Hpal entropy, and a series of computational formulas is given. This way of subdividing systems turns work previously done by people into computation. A prototype system was built and a case study analyzed to verify the theory.
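The entropy-based similarity measures are specific to the paper; the hierarchical clustering step itself can be sketched as generic single-linkage agglomeration over a function-similarity measure. The similarity function and threshold below are illustrative assumptions:

```python
def agglomerate(items, sim, threshold):
    """Single-linkage agglomerative clustering of functions into subsystems.

    items: function identifiers; sim(x, y): pairwise similarity; merging
    stops once the best inter-cluster similarity falls below threshold.
    """
    clusters = [{i} for i in items]
    while len(clusters) > 1:
        best, pair = -1.0, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Single linkage: similarity of the closest member pair.
                s = max(sim(x, y) for x in clusters[a] for y in clusters[b])
                if s > best:
                    best, pair = s, (a, b)
        if best < threshold:
            break                      # remaining clusters become subsystems
        a, b = pair
        clusters[a] |= clusters[b]
        del clusters[b]
    return clusters
```

With an entropy-derived similarity in place of the toy one, the surviving clusters are the proposed subsystem division.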
Analysis of Real-time Performance of Algorithm Credit in Xen Virtual Machine
ZHANG Tian-yu, GUAN Nan and DENG Qing-xu
Computer Science. 2015, 42 (12): 115-119. 
The development of complex real-time embedded systems has become a trend in recent years. To reduce cost and enhance flexibility, multiple systems share common computing platforms via virtualization. We study the real-time performance of the Credit algorithm in Xen, the most popular virtual machine monitor. First, we analyze the basic implementation of Credit, the default scheduling algorithm of the Xen virtual machine. Then we propose, and prove effective, a method of configuring VCPU parameters to improve the real-time performance of Credit. On this basis, using the basic properties of the Credit scheduling algorithm, we finally derive the worst-case supply bound function (SBF) curve of the resources allocated to a VCPU.
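The paper's closed-form SBF for Credit is not reproduced in the abstract. Under the simplifying assumption that a VCPU receives its budget as one contiguous slice per fixed-phase accounting period, a worst-case supply bound can be computed numerically by minimizing the supply over all window offsets; the period and budget values in the test are illustrative:

```python
def sbf(t, period, budget, step=0.01):
    """Numeric worst-case supply bound function over a window of length t.

    Assumes each period's budget is delivered as one contiguous slice
    occupying the last `budget` units of the period, and minimizes the
    delivered CPU time over the start offset of the observation window.
    """
    def supplied(lo, hi):
        total, k = 0.0, int(lo // period)
        while k * period < hi:
            s0, s1 = k * period + period - budget, k * period + period
            total += max(0.0, min(hi, s1) - max(lo, s0))
            k += 1
        return total
    offsets = [i * step for i in range(int(period / step) + 1)]
    return min(supplied(x, x + t) for x in offsets)
```

Under this assumption, a window no longer than the gap `period - budget` may receive no CPU at all, which is exactly the worst-case starvation interval a schedulability test must account for.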
QoS Verification of Web Service Composition Processes
KAI Jin-yu, MIAO Huai-kou and GAO Hong-hao
Computer Science. 2015, 42 (12): 120-123. 
Provided functional requirements are met, quality of service (QoS) becomes an important factor in whether a Web service can win the market. This paper adopts probabilistic model checking to evaluate the QoS of Web service composition processes. Starting from the access logs of users accessing a Web service and applying clustering, we construct a user-group-oriented QoS usage model; QoS requirements are described with an extended QoS state diagram. We then use the simulation-based verification method of the probabilistic model checker PRISM to judge whether the behavior of the Web service composition process satisfies its QoS requirements.
Multi-objective Coevolutionary Test Case Prioritization
SHI Yu-nan, LI Zheng and GONG Pei
Computer Science. 2015, 42 (12): 124-129. 
Test case prioritization is an effective way to significantly reduce the cost of regression testing. Its main idea is to rearrange the test suite according to some testing objective so that test cases with higher priority execute first. Aiming at the defects of single-objective genetic algorithms in test case prioritization, such as slow convergence, being easily trapped in local optima, and lacking trade-offs between multiple testing criteria, a new competitive coevolutionary approach is adopted. The approach uses multiple fitness metrics: an absolute fitness that evaluates the survival ability of an individual, and a relative fitness that estimates the number of opponents each individual defeats. Outstanding individuals that defeat more opponents join an elite set for further evolution. By eliminating "old" individuals, the approach bounds each individual's survival time to avoid local optima. To improve error-detection efficiency, the average percentage of mutants killed is introduced as a new multi-objective optimization criterion. Compared with the classical search algorithm, the coevolutionary algorithm improves search efficiency and local search ability, and experiments verify the validity of the new approach.
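The paper's average-percentage-of-mutants-killed criterion is a mutation-based analogue of the classic APFD metric used to score test orderings. As a sketch, APFD for a given ordering can be computed as follows; the test and fault identifiers are illustrative:

```python
def apfd(order, faults):
    """Average Percentage of Faults Detected for a test-case ordering.

    order: the prioritized list of test ids; faults maps each fault id to
    the set of tests that detect it. APFD = 1 - sum(TF_i)/(n*m) + 1/(2n),
    where TF_i is the position of the first test detecting fault i.
    """
    n, m = len(order), len(faults)
    pos = {t: i + 1 for i, t in enumerate(order)}
    first = [min(pos[t] for t in detecting if t in pos)
             for detecting in faults.values()]
    return 1 - sum(first) / (n * m) + 1 / (2 * n)
```

An ordering that surfaces fault-detecting tests earlier scores higher, which is exactly what the evolutionary search optimizes (with mutant kills standing in for faults).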
Real-time Data Warehouse Pre-processing Based on Dynamic Mirror Replication
MAO Ying-chi, MIN Wei, JIE Qing and ZHU Li-li
Computer Science. 2015, 42 (12): 130-135. 
Real-time data warehousing is an important research area in data management. Concurrent real-time data queries and imports cause query contention, which not only seriously affects the accuracy of query analysis but also degrades the performance of the real-time data warehouse. In this paper, a dynamic mirror replication technology combined with an external dynamic storage area was proposed to solve the query contention problem effectively. Meanwhile, a fly-weight materialization method and a fly-weight materialized-join algorithm were proposed to improve query and analysis performance in real-time OLAP. The proposed dynamic mirror replication technology was evaluated on the TPC-H benchmark, and the experimental results demonstrate that it achieves better performance in terms of effectiveness.
Formal Validation of Causal Behaviors of Problem Domains in Problem Frames Approach
ZHU Li-lu and LI Zhi
Computer Science. 2015, 42 (12): 136-142. 
This paper proposed a method of formally identifying and validating the causal behaviors of problem domains, which are the basis of problem progression in the problem frames approach. A symbolic model checking method based on the NuSMV language was adopted to provide verifiable evidence of the causal relationships between events that are useful in problem progression, reduce the complexity of problem representation, and increase the reliability of specifications of the computing machine. A UML statechart represents the finite space of internal state transitions of a critical domain, and a CTL formula describes the reachability of certain internal states of the domain. A series of causally related events is identified by traversing all possible state-transition paths in the statechart to validate the correctness of the CTL formula, thus providing effective technical support for problem progression.
Detecting Security of Applications in Chinese Third-party Android Market
YAN Jin-pei, HE Hui, AN Wen-huan, ZHANG Xiao-hui, REN Jian-bao and QI Yong
Computer Science. 2015, 42 (12): 143-147. 
At present, repackaged apps are widespread in third-party Android application markets. In this paper, 150 official apps were selected at random, with 572 third-party market apps used for comparison, and a security detection system for repackaged Android apps was designed. First, repackaged apps are identified at fine granularity by computing app similarity; resource files are then obtained through reverse engineering; over-privileged behaviors are analyzed against the mapping between system APIs and permissions; and permission-abuse behaviors are analyzed on the constructed method CFGs. Through parallel processing, the system found that 33.17% of apps in the third-party markets are repackaged and 19.58% have modified permissions; among the apps with modified permissions, 45.95% exhibit over-privileged behaviors and 27.03% exhibit permission abuse.
Detection and Resolution of Differences of Data Aware Processes
ZHANG Xue-wei, XING Jian-chun, YANG Qi-liang, SONG Wei and WANG Hong-da
Computer Science. 2015, 42 (12): 148-151. 
Business-driven development favors the construction of a business process by different business-development staff. To obtain a standard reference process that consolidates the resulting variants, it is necessary to detect and resolve the differences between data-aware processes. Existing approaches detect and resolve differences between process models at the control-flow level and rely on a change log; few methods study the differences between data-aware processes. Based on program dependence graphs and correspondences, a novel approach was proposed that detects and resolves differences by comparing data-aware processes before and after modification. It builds a hierarchical change log and meets user-friendliness requirements.
Research of Quality-aware Contexts Management for Complex IOT Applications
ZHENG Di, BEN Ke-rong and WANG Jun
Computer Science. 2015, 42 (12): 152-156. 
With the rapid development of the IOT, wide-area, heterogeneous sensor networks produce ever more numerous and multi-dimensional information. Accordingly, the information from such networks may be more uncertain, incomplete, inconsistent and inaccurate. We therefore proposed a context management framework based on eliminating incomplete, inconsistent and inaccurate contexts. The framework configures quality factors for contexts, applies correction methods for inconsistent and inaccurate contexts, and applies filling methods for incomplete contexts. Results show that the method can reduce the uncertainty of contexts and efficiently improve the quality of IOT applications.
Research on Fusion Trust Model for Decision-making
XU Pei, LIAN Bin, SHAO Kun, CHEN Jun and AN Ning
Computer Science. 2015, 42 (12): 157-161. 
The trust relationship is one of the most complex social relation models, both in real life and in open networks. It is an abstract cognitive notion that is difficult to measure, because it involves assumptions, expectations, behavior, environment and other factors. Drawing on previous research and considering the variety of trust factors, this paper proposed a new trust fusion model for decision-making. The model is based on the evolution of direct trust, indirect trust and reputation, and fuses them to obtain the decision trust. To prove the validity of the trust fusion model, we assumed that an object performs class-A activities with expected probability P. Experimental verification shows that the difference between the decision trust and the object's credibility is significantly smaller than the difference between the object's credibility and the direct trust, indirect trust or reputation alone.
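A minimal sketch of the two ingredients such a model needs, assuming a simple weighted linear fusion and an exponential-smoothing evolution of direct trust (the weights, the smoothing factor and all function names are our illustrative assumptions, not the paper's formulas):

```python
def fuse_trust(direct, indirect, reputation, weights=(0.5, 0.3, 0.2)):
    """Weighted linear fusion of three trust sources into one decision trust.
    All trust values lie in [0, 1]; the weights sum to 1."""
    wd, wi, wr = weights
    return wd * direct + wi * indirect + wr * reputation

def update_direct_trust(old, outcome, alpha=0.3):
    """Evolve direct trust from a new interaction outcome
    (1 = success, 0 = failure) by exponential smoothing."""
    return (1 - alpha) * old + alpha * outcome
```

With such a fusion, the decision trust sits between the individual sources, which is consistent with the paper's observation that it tracks the object's credibility more closely than any single source.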
Unified Modeling Method of Functional and Non-functional Aspects for Composite Software
XIAO Fang-xiong, XU Bo, XIA Guo-en, LI Guo-xiang and MIN Hua-qing
Computer Science. 2015, 42 (12): 162-166. 
Non-functional aspects such as time, cost and probability are becoming increasingly important for composite software in the dynamic, open, heterogeneous and changeable environment of the Internet. Constructing unified functional and non-functional models of composite software in the design phase, and verifying both aspects against these models, is an effective way to assure the dependability of such software. In this paper, a novel property sequence diagram (PSD) was proposed by extending the traditional UML sequence diagram with abstract time, cost and probability. A PSD has two levels: the low level, extended with time and cost, models the basic, detailed interaction scenarios of composite software, while the high level, extended with probability, constructs the full scenario by synthesizing the low-level diagrams. An example illustrates the effectiveness of the proposed method.
Generation Method of Path Set Affected by Program Change Based on Source Code Analysis
GUO Dan-dan and JIANG Ying
Computer Science. 2015, 42 (12): 167-170. 
Software may change at any stage of the software life cycle for various reasons. When it does, regression testing must check whether the changes affect the normal functions of the original software. To improve efficiency and reduce the cost of regression testing, the content affected by a change must be determined accurately. This paper proposed a method that analyzes the change scope of programs at the source-code statement level during unit testing, from which the change set and impact set are obtained. A generation algorithm for the impacted-path set was presented. Experimental results show that the method effectively generates the set of program paths affected by a change, and the efficiency of regression testing is improved.
Study of Social Network Analysis Software
LIU Peng, LI Xian-xian and WANG Li-e
Computer Science. 2015, 42 (12): 171-174. 
With the growth of social network data, manual processing can no longer meet the needs of social network analysis. This paper reviewed four commonly used and well-documented social network analysis tools: NodeXL, Pajek, Gephi and NetworkX. By comparing their input data formats, statistical performance, data visualization, help documentation and ease of use, we gave an objective evaluation of their overall performance, together with recommendations for selection and use.
Data Reduction Analysis for Message between Services Based on Abstract Interpretation
JIANG Caoqing, XIAO Fangxiong, GAO Rong, YING Shi and WEN Jing
Computer Science. 2015, 42 (12): 175-180. 
Infinite value ranges of inter-service message variables may exist in service-oriented software, which leads to state-space explosion during model checking. To make termination verification feasible in practice, the state space must be reduced so that computing time and space demands become reasonable. In this paper, based on the interval abstraction theory of abstract interpretation, we extended the classic interval abstract domain and carried out variable interval analysis on the exception control flow graph with a unified interval abstract domain. On this basis, the interval sets of inter-service message variables are obtained by reverse analysis. Since any value within a variable's interval is equivalent with respect to termination verification, a reduced set of message values can be obtained by selecting one representative value from each interval set. This provides a reduced initial configuration for termination verification of exception handling and effectively avoids state-space explosion.
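The core idea, that a whole interval of values can stand in for one representative, can be sketched with a toy interval abstract domain (a generic illustration of interval abstraction, not the paper's extended domain; all names are ours):

```python
class Interval:
    """Toy interval abstract domain: a set of concrete values is abstracted
    by its bounds, and any value in [lo, hi] is treated as equivalent."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):                   # abstract addition
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):                      # least upper bound (hull)
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def representative(self):                   # one value stands for the set
        return self.lo

    def __eq__(self, other):
        return (self.lo, self.hi) == (other.lo, other.hi)
```

Feeding only `representative()` values into the verifier is what shrinks the initial configuration while preserving the termination verdict.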
Research and Development of Computer-aided Requirements Analysis Tool Based on Human-computer Interaction
HE Zheng-hai and LI Zhi
Computer Science. 2015, 42 (12): 181-183. 
Software requirements engineering plays an essential role in software development projects, and human beings are the key players in requirements analysis activities; therefore, a user-centered approach should be used in the design of computer-aided requirements analysis tools. Based on an existing problem-oriented computer-aided requirements engineering (CARE) tool, this prototype extends the tool to a new platform and offers better usability. The Android platform was chosen because applications running on Android have advantages over PC applications, such as richer functionality and better mobility, and its open-source approach provides new technical support for software quality improvement and better user experience. In addition, a demonstration of how the prototype is designed based on theory and modeling techniques from human-computer interaction research was presented.
Regional Diffusion Mechanism Based Time Synchronization Algorithm for Wireless Sensor Networks
WANG Tao
Computer Science. 2015, 42 (12): 184-188. 
This paper analyzed the convergence of distributed time synchronization algorithms in wireless sensor networks and proposed a time synchronization algorithm based on a regional diffusion mechanism. The algorithm consists of two phases. The first phase proposes a spokesman information exchange algorithm (SIE) for time synchronization within a region, based on optimal foraging theory (OFT) and the principle of highest yield. In the second phase, a spokesman is chosen for each region to perform synchronization between regions according to the time offsets; meanwhile, the synchronization process is mapped to a Markov chain, and a Markov chain based spokesman accelerated algorithm (MarSAA) is proposed to accelerate convergence. Theoretical analysis and experimental results show that the proposed algorithm has better time complexity, outperforms traditional network-wide time synchronization algorithms, and allows the two phases to run in parallel.
Time Synchronization Algorithm for Large-scale Wireless Sensor Networks
HAO Gang and ZHUANG Yi
Computer Science. 2015, 42 (12): 189-194. 
Typical time synchronization algorithms for large-scale wireless sensor networks suffer from low accuracy and high energy consumption. This paper proposed a time synchronization algorithm based on a cluster-tree structure. First, the algorithm establishes a cluster-based spanning tree to reduce the synchronization hop count; it then uses two-way sender-receiver synchronization (SRS) between clusters and one-way receiver-only synchronization (ROS) within clusters to reduce the number of messages required for network-wide synchronization. Experimental results show that, compared with the RBS and TPSN algorithms, the proposed algorithm maintains a high level of synchronization precision while effectively reducing the energy consumption of sensor nodes.
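A two-way sender-receiver exchange, as used between clusters here and in TPSN, computes clock offset and propagation delay from four timestamps. A generic sketch (the timestamp convention follows TPSN; this is not the paper's code):

```python
def pairwise_offset_delay(t1, t2, t3, t4):
    """Two-way sender-receiver exchange:
    node A sends at t1 (A's clock), node B receives at t2 and replies
    at t3 (B's clock), node A receives the reply at t4 (A's clock)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2    # B's clock minus A's clock
    delay = ((t2 - t1) + (t4 - t3)) / 2     # one-way propagation delay
    return offset, delay
```

One-way receiver-only schemes instead let nodes overhear such an exchange and synchronize without transmitting, which is what saves messages inside a cluster.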
Network Link Quality Estimation with Local Characteristics in WSNs
LI Jun-wei, LI Shi-ning and ZHANG Yu
Computer Science. 2015, 42 (12): 195-200. 
Reprogramming (i.e., code dissemination) is one of the key technologies enabling software updates in wireless sensor networks. Traditional code dissemination protocols are evaluated and compared by simulations and testbed experiments. Recently, analytical models have been proposed to characterize software updates and evaluate the performance of state-of-the-art code dissemination protocols, because an analytical model is both lightweight and accurate for performance evaluation in large-scale networks. However, such models depend on an estimate of network-wide link quality, and the estimation methods used in previous analytical models ignore the local characteristics of software updates, so the resulting network-wide link qualities do not scale across network topologies and are not accurate enough. Based on this observation, a novel estimation method with local characteristics was proposed: it takes the averaged expected link quality of a node's neighborhood as that node's local link quality, and can therefore reflect network-wide software updates accurately. When the network-wide link quality estimated by this method is fed into the analytical model to predict reprogramming performance, i.e., software update completion time, the analytical results match testbed experiments with high accuracy; the prediction error is below 5% in linear and grid networks. The proposed method is thus robust and adaptive in both linear and grid networks, compared with existing methods for estimating network-wide link quality.
Research on Position Equation Direct Solving Method in Multi-constellation Satellite Navigation System
WANG Wen-wen, ZHANG Ke and HUANG Bin
Computer Science. 2015, 42 (12): 201-206. 
In navigation and positioning systems, the traditional least-squares method requires linear iteration, which incurs heavy computation and affects real-time performance and stability. Addressing this problem, and exploiting the larger number of visible satellites in multi-constellation navigation systems, two direct solving methods using "3+2+2" and "3+3+2" satellite groupings were put forward. In addition, when more than 9 satellites are visible in a triple system, a method of converting the nonlinear equations into linear equations was proposed. None of the three methods requires an assumed initial receiver position, so linear iteration is avoided and the computational load is reduced. Simulation results show that the proposed methods are effective and reliable for position solving in multi-constellation satellite navigation systems.
Propagation Model of Electromagnetic Nanonetworks in Terahertz Band
WANG Wan-liang, WU Teng-chao, YAO Xin-wei, LI Wei-kun and CHEN Chao
Computer Science. 2015, 42 (12): 207-211. 
For Terahertz band (THz) communication based electromagnetic nanonetworks, an analytical THz channel model was proposed from the perspective of energy consumption, considering the peculiarities of path loss and molecular absorption in THz band propagation. Two scenarios, with and without a reflection path, were adopted to analyze the influence of path loss in the THz band, and molecular absorption was analyzed with the theory of atmospheric radiation. In detail, the energy consumption of THz communication, combining path loss and atmospheric molecular absorption, was presented for different transmission distances and frequencies. The results demonstrate that molecular absorption contributes an important part of the total energy consumption of electromagnetic waves in the THz band and is determined by the molecular composition of the transmission medium, while the energy consumed by path loss is governed by transmission frequency and distance. The analytical results offer an important reference for the design of nano-nodes and the selection of transmission frequencies.
Efficient Cellular Networks Collaboration Transmission Technology with QoS Constraints and Beamforming Criterion
SONG Hai-long and ZHANG Shu-zhen
Computer Science. 2015, 42 (12): 212-214. 
To address the low data-transfer efficiency and high energy consumption caused by severe cellular network interference, an efficient cellular network cooperative transmission technique with QoS constraints and a beamforming criterion was proposed. The technique adopts a heterogeneous system model composed of macrocell and femtocell networks, and analyzes the network communication and power signal interference conditions. In the QoS-constrained efficient cooperative transmission algorithm, the energy-efficiency optimization of data transmission is cast as a beamforming and power allocation problem under QoS constraints, and the QoS constraints are introduced into the beamforming criterion to mitigate network interference and increase energy efficiency. Experimental comparisons indicate that the technique improves the energy efficiency of cellular data transmission and reduces communication interference.
Research of Edge Router Based on 6LoWPAN
XIAO Xiang-ning, WANG Peng, LI Jian-li and GUO Ping
Computer Science. 2015, 42 (12): 215-219. 
In this paper, a design method for an edge router for smart 6LoWPAN sensor networks was proposed. Using an embedded Linux operating system and a single CPU, the method ports the 6LoWPAN protocol suite to Linux, carries the 6LoWPAN network over a LAN radio-frequency interface, and connects it to the Internet through the edge router. Through multi-protocol fusion, the IPv4/IPv6 dual-stack, NAT64 and 6to4 tunnel protocols are unified, so that 6LoWPAN network nodes can adaptively access the Internet in different network environments.
Scheme of Automated Trust Negotiation Based on Fuzzy Logic
MA Xiao-xin and ZENG Guo-sun
Computer Science. 2015, 42 (12): 220-223. 
Automated trust negotiation (ATN) is an important method for establishing trust between strangers in open network environments by disclosing attribute certificates. Because traditional trust negotiation has difficulty describing credential access control policies and establishes trust inefficiently, a fuzzy logic based scheme was proposed. By modeling credential access control policies as fuzzy logic formulas, the description becomes more concise and flexible, and the negotiation path is optimized. Analysis shows that the efficiency and success rate of the negotiation scheme are improved to some extent.
Trustworthy Software Distributing Mode Based on Software Description
LI Jian-fei, XU Kai-yong and JIN Lei
Computer Science. 2015, 42 (12): 224-228. 
Existing research on software distribution mainly concerns efficiency and integrity. For inner networks with high security requirements, how to use trusted computing technology to ensure the safety and reliability of software release has greater research significance. Software trust distribution is dynamic: the requirements of different users and platforms differ, and manual configuration is inefficient and cannot guarantee safety and reliability. An intelligent release strategy spanning user, platform and software was therefore put forward. A software function reliability algorithm quantitatively calculates the degree to which software functions conform to user requirements, and an intelligent matching algorithm generates a dependency-based installation sequence to ensure reliable operation. To describe software information in the distribution process and meet the needs of the distribution algorithms, we designed an XML-based Software Distribution Description Language (SDDL). Analysis and an application example demonstrate that the mode can indeed enhance the trustworthiness of software distribution.
Efficient and Secure Communication Scheme for Deep Space Networks
REN Fang and ZHENG Dong
Computer Science. 2015, 42 (12): 229-232. 
Secure communication is one of the most important aspects of deep space exploration. A deep space network architecture based on satellite networks was proposed, and an efficient and secure communication scheme for this architecture was studied. In the scheme, attribute-based encryption is used to achieve secure data transmission in deep space networks. A deep space node chooses a set of attributes for the data to be transmitted and encrypts it under those attributes; a user can decrypt the ciphertext if and only if the attribute set satisfies the access tree the user holds. The node and the user need not authenticate each other's identity, so the number of information exchanges is greatly reduced. The scheme suits deep space networks, where it makes up for the deficiency of costly data transmission.
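The "decrypt iff the attributes satisfy the access tree" condition is a pure threshold-tree evaluation. A minimal sketch of that check alone, with no cryptography (tree encoding and attribute names are our illustrative assumptions):

```python
def satisfies(node, attrs):
    """Recursively evaluate a threshold access tree over an attribute set.
    A node is either a leaf ('attr', name) or a gate ('gate', k, children):
    a k-of-n threshold, so k=1 is OR and k=len(children) is AND."""
    kind = node[0]
    if kind == 'attr':
        return node[1] in attrs
    _, k, children = node
    return sum(satisfies(c, attrs) for c in children) >= k
```

In an actual ABE scheme the same recursion drives secret-share reconstruction, so a user whose tree is not satisfied cannot recover the decryption key.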
Domain-specific Algorithm of Protocol State Machine Active Inference
WANG Chen, WU Li-fa, HONG Zheng, ZHENG Cheng-hui and ZHUANG Hong-lin
Computer Science. 2015, 42 (12): 233-239. 
Existing protocol state machine inference approaches based on algorithm L* are inefficient owing to their ignorance of protocol-specific knowledge: protocol messages are abstracted as independent, meaningless symbols, and test samples in equivalence queries are generated completely at random, so invalid queries and test samples are inevitable. A protocol state machine active inference algorithm named L+N was proposed, which improves algorithm L+M in three aspects. First, L+N filters invalid output queries according to strict-order constraints extracted from conversation samples. Second, L+N constructs the extended prefix tree acceptor (EPTA) corresponding to the sample set and answers output queries from it in advance. Third, a new strategy that finds counterexamples more effectively, based on positive sample mutation, is applied to the equivalence query. Experimental results show that L+N greatly improves inference efficiency while achieving the same accuracy as algorithm L+M.
Forensic Method Based on Markov Chain Model for Splicing Images
BU Jiang, ZHENG Bin and CHEN Hai-yang
Computer Science. 2015, 42 (12): 240-242. 
A Markov chain is used to model the correlation of discrete cosine transform coefficients, and the state transition probability matrix is extracted as the feature vector. A support vector machine classifier then discriminates natural images from doctored images. Experimental results show that the average accuracy reaches 86%, and the detection performance is better than that of the bicoherence-feature method proposed by Ng.
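Such a transition-probability feature is typically built from a thresholded DCT difference array. A sketch under those common assumptions (the horizontal-only differencing, the threshold T=4, and the function name are ours; the paper may use additional directions):

```python
import numpy as np

def transition_matrix(coeffs, T=4):
    """Markov transition-probability features from a 2-D array of rounded
    DCT coefficients: horizontal difference array, states clipped to
    [-T, T], then first-order transition probabilities as a (2T+1)^2
    feature vector (one SVM input per image)."""
    d = coeffs[:, :-1] - coeffs[:, 1:]          # horizontal difference array
    d = np.clip(d, -T, T) + T                   # shift states to 0..2T
    s, t = d[:, :-1].ravel(), d[:, 1:].ravel()  # consecutive state pairs
    m = np.zeros((2 * T + 1, 2 * T + 1))
    np.add.at(m, (s, t), 1)                     # count transitions
    row = m.sum(axis=1, keepdims=True)
    m = np.divide(m, row, out=np.zeros_like(m), where=row > 0)
    return m.ravel()
```

The resulting fixed-length vector is what a standard SVM (e.g. with an RBF kernel) would be trained on to separate natural from spliced images.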
Two-dimension Declassification Policy in Multithreaded Environments
JIN Li and ZHU Hao
Computer Science. 2015, 42 (12): 243-246. 
Information declassification aims at the secure release of sensitive information. Existing security specifications and enforcement mechanisms for declassification policies focus on sequential programs and cannot be directly transplanted to multithreaded environments, because attackers can exploit properties of thread scheduling to derive sensitive information. To this end, a two-dimension declassification policy for multithreaded environments was proposed, based on a multithreaded programming language model and a thread scheduling model, which effectively ensures that the appropriate information is released at the appropriate program points. Moreover, dynamic monitoring mechanisms for the policy in multithreaded environments were presented, and the soundness of the enforcement was proved.
Novel Global Kmeans Clustering Algorithm for Big Data
LI Bin, WANG Jin-song and HUANG Wei
Computer Science. 2015, 42 (12): 247-250. 
Clustering methods for big data have attracted wide interest in recent years. This paper proposed a novel global k-means clustering algorithm (NGKCA) comprising four phases: row dimension reduction, column dimension reduction, global k-means clustering, and adjustment of the cluster centers. Row dimension reduction is realized by spectral clustering, while column dimension reduction is realized with particle swarm optimization (PSO). Once both reductions are completed, the global k-means clustering phase and the PSO adjustment phase proceed. Experiments were carried out on well-known machine learning data sets and the standard network security data set KDDCUP99. The results show that NGKCA outperforms several common algorithms reported in the literature, and its time complexity is better than that of the global k-means algorithm.
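The global k-means phase referred to here is the incremental scheme of growing from 1 to k centers, trying each data point as the next center and keeping the best refinement. A compact sketch of that base algorithm only (not NGKCA's reduction or PSO phases; function names are ours):

```python
import numpy as np

def _lloyd(X, centers, iters):
    """Standard k-means refinement; returns centers and total squared error."""
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        lab = d.argmin(1)
        for j in range(len(centers)):
            pts = X[lab == j]
            if len(pts):
                centers[j] = pts.mean(0)
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return centers, d.min(1).sum()

def global_kmeans(X, k, iters=20):
    """Incremental global k-means: deterministically add one center at a
    time, auditioning every data point as the candidate new center."""
    centers = X.mean(axis=0, keepdims=True)
    for _ in range(1, k):
        best, best_err = None, np.inf
        for x in X:
            c, err = _lloyd(X, np.vstack([centers, x]), iters)
            if err < best_err:
                best, best_err = c, err
        centers = best
    return centers
```

The O(nk) restarts of this scheme are exactly the cost NGKCA attacks by shrinking the data first.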
Meaningful (K,N) Free Expansion Image Sharing Scheme Based on GF(2^3)
OUYANG Xian-bin and SHAO Li-ping
Computer Science. 2015, 42 (12): 251-256. 
Conventional meaningful image sharing schemes suffer from pixel expansion, and they usually use short authentication information to verify shared information, so the authenticity of reconstructed secret-image pixels cannot be accurately identified. To address these problems, a meaningful (K,N) expansion-free image sharing scheme based on GF(2^3) was proposed. First, a key is used to generate an encryption mapping table, and this table together with the secret-image pixel location information encrypts the secret-image pixels. Second, a (K,N)-threshold scheme over GF(2^3) shares the encrypted pixels and the pixel authentication information, which are embedded into N cover images. Finally, the key used to generate the mapping table is itself shared into N sub-keys by the (K,N)-threshold scheme, and the sub-keys' MD5 values are published to a trusted third party to prevent cheating by the cover-image holders. Experimental results show that the proposed scheme accurately detects attacked regions in the reconstructed secret image; compared with conventional methods, it has no pixel expansion and its cover images have better visual quality.
Design and Implementation of Information Flow Control Framework for PaaS
SHAO Jing, CHEN Zuo-ning, YIN Hong-wu and XU Guo-chun
Computer Science. 2015, 42 (12): 257-262. 
Decentralized information flow control (DIFC) is an effective method for end-to-end data protection. Existing DIFC methods have shortcomings: information flow is tracked at a single granularity, and the language runtime has to be modified, so they cannot satisfy the data security requirements of PaaS platforms. An information flow control framework for GAE, named GIFC, was proposed; it combines three granularities: object, message and SQL. Within a component, the information interactions of the entities involved in object method calls are controlled by a Python library. Between components, message proxies filter messages by their security labels to restrict the messages a component may receive. Between components and the datastore, the GAE data models are extended to support persistent storage of labels in the datastore. Evaluation shows that combining multiple IFC granularities effectively balances precision and performance.
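The label check a message proxy performs can be reduced to a set-containment test on secrecy tags. A minimal DIFC-style sketch (generic, no declassification capabilities; not GIFC's actual API):

```python
def can_flow(src_secrecy, dst_secrecy):
    """DIFC-style flow check: data labeled with the tag set src_secrecy may
    flow to a receiver labeled dst_secrecy only if the receiver's label
    contains every tag of the source (no capability to drop tags here)."""
    return src_secrecy <= dst_secrecy            # subset test on frozensets/sets
```

A message proxy would apply this check per message, dropping any message whose source label is not covered by the receiving component's label.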
Artificial Bee Colony Algorithm Based on Strategy of Segmental-search with Current Optimal Solution
MAO Li, ZHOU Chang-xi and WU Bin
Computer Science. 2015, 42 (12): 263-267. 
An artificial bee colony (ABC) algorithm based on a segmental-search strategy with the current optimal solution was proposed, in order to overcome the poor local search capability and slow convergence of the conventional ABC algorithm. In this algorithm, onlooker bees mutate dimensions using a local search strategy guided by the global and individual current optimal solutions, and a segmental local search strategy is used to improve the update rate of food sources, which enhances the local search capability of the algorithm. Simulation results on six standard functions show that the modified ABC algorithm significantly improves solution accuracy and convergence rate compared with the basic ABC algorithm.
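A best-guided onlooker update of the kind described, mutating one dimension toward the current global optimum, can be sketched as follows (the update rule follows the well-known gbest-guided ABC variant; the coefficients and names are our assumptions, not the paper's exact formula):

```python
import random

def onlooker_update(x, gbest, partner, phi=1.0, psi=1.5):
    """Candidate food source from one onlooker bee: perturb a single random
    dimension of x toward a random partner and toward the global best."""
    j = random.randrange(len(x))
    v = list(x)
    v[j] = (x[j]
            + random.uniform(-phi, phi) * (x[j] - partner[j])
            + random.uniform(0, psi) * (gbest[j] - x[j]))
    return v
```

Greedy selection between `v` and `x` then decides whether the food source is updated, which is the step the segmental-search strategy aims to accelerate.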
Improvement of C4.5 Algorithm with Free Noise Capacity
WANG Wei, LI Lei and ZHANG Zhi-hong
Computer Science. 2015, 42 (12): 268-271. 
Abstract PDF(920KB) ( 392 )   
References | RelatedCitation | Metrics
To counter the decline in decision tree prediction accuracy on high-dimensional data with noise, this paper used the noise-free principal component analysis (NFPCA) algorithm to improve the C4.5 algorithm, forming the NFPCA-in-C4.5 algorithm. On one hand, the algorithm casts noise suppression as an optimization problem that trades off fitting the data against controlling smoothness, yielding the principal component space. On the other hand, it maps the principal component space back to the original data space while building each new node of the decision tree from top to bottom, avoiding permanent loss of feature information during dimension reduction. Experimental results show that the NFPCA-in-C4.5 algorithm achieves both dimensionality reduction and noise reduction, and avoids the significant drop in prediction accuracy caused by information loss and noise.
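The "reduce, then map back to the original space" step can be illustrated with plain PCA (a simplification of NFPCA, which adds the smoothness penalty): project 2-D samples onto the leading principal component and reconstruct them in the original coordinates.

```python
# Illustrative sketch (plain PCA, not the paper's NFPCA): project 2-D
# samples onto the leading principal component and map them back to
# the original space -- the reconstruction the tree-building step reuses.

def mean(rows):
    n = len(rows)
    return [sum(r[j] for r in rows) / n for j in range(len(rows[0]))]

def top_component(rows, iters=100):
    """Leading eigenvector of the (unnormalized) covariance matrix,
    found by power iteration."""
    m = mean(rows)
    centered = [[r[j] - m[j] for j in range(len(m))] for r in rows]
    v = [1.0, 0.0]
    for _ in range(iters):
        w = [0.0, 0.0]
        for c in centered:
            dot = c[0] * v[0] + c[1] * v[1]
            w[0] += dot * c[0]
            w[1] += dot * c[1]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    return m, v

def reconstruct(rows):
    """Project each sample on the top component, then map it back."""
    m, v = top_component(rows)
    out = []
    for r in rows:
        score = (r[0] - m[0]) * v[0] + (r[1] - m[1]) * v[1]
        out.append([m[0] + score * v[0], m[1] + score * v[1]])
    return out
```

Reconstruction in the original space is what lets the tree split on original features, rather than on opaque principal components.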
Learning to Rank Based on Linear Model for Social Media Streams
ZHANG Wei and LI Yue-xin
Computer Science. 2015, 42 (12): 272-274. 
Abstract PDF(579KB) ( 398 )   
References | RelatedCitation | Metrics
In social media, recommending suitable updates to users not only reduces information-seeking time but also improves users' stickiness to the platform. To improve the recommendation accuracy of updates in social media, this paper proposed a linear-model-based learning-to-rank algorithm for updates. Firstly, according to the attributes of social media, corresponding bias features were defined and a linear-model-based latent bias model was proposed. Secondly, according to the features of updates and recipients, a corresponding linear feature model was defined. Finally, combining the latent bias model and the feature model, a linear model with a temporal effect was proposed. Experiments show that, compared with related work, the proposed algorithm achieves better prediction accuracy and higher execution efficiency.
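A scoring function in the spirit of this model might combine a latent bias term, a linear feature term and a temporal decay, as sketched below; the field names, exponential decay form and half-life are assumptions, not the paper's definitions.

```python
import math

def score(update, user, w, biases, now, half_life=24.0):
    """Rank score = (user bias + author bias + w . features),
    decayed exponentially by the update's age in hours."""
    bias = biases.get(user, 0.0) + biases.get(update["author"], 0.0)
    linear = sum(wi * fi for wi, fi in zip(w, update["features"]))
    age = now - update["posted_at"]
    decay = math.exp(-math.log(2.0) * age / half_life)
    return (bias + linear) * decay

def rank(updates, user, w, biases, now):
    """Order candidate updates for one user, best first."""
    return sorted(updates, key=lambda u: score(u, user, w, biases, now),
                  reverse=True)
```

With this shape, the bias terms and the feature weights `w` can both be fit by any linear learner, which is what keeps scoring cheap at recommendation time.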
Learning to Rank Based Approach for Image Searching
TAN Guang-xing and LIU Zhen-hui
Computer Science. 2015, 42 (12): 275-277. 
Abstract PDF(320KB) ( 404 )   
References | RelatedCitation | Metrics
Image search is one of the most important research topics in image-sharing social networks. Traditional image search methods compare the user's keywords with the textual descriptions of images in the database. Because textual descriptions are ambiguous and abstracting text for images is very hard, the accuracy of such search is low. To improve the accuracy of image search, this paper proposed a learning-to-rank based approach. Each image is described as a combination of multiple feature descriptors, and when a user submits an image query, the similarity between the query and each image in the database is compared. Association rules and a support vector machine are applied to learn the weight of each feature descriptor, and corresponding learning algorithms are proposed. Experiments show that the proposed approach retrieves images for users more accurately than related work. In addition, since the association-rule and support-vector-machine learners use the same instances to learn the descriptor weights, the two algorithms are closely related.
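Once per-descriptor weights are learned, ranking reduces to a weighted sum of per-descriptor similarities. The sketch below assumes cosine similarity and illustrative descriptor names; neither is specified by the paper.

```python
# Illustrative sketch: combine per-descriptor similarities with learned
# weights into one ranking score.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def combined_similarity(query, image, weights):
    """query/image: dict mapping descriptor name -> feature vector.
    weights: learned weight per descriptor (e.g. from the SVM step)."""
    return sum(w * cosine(query[name], image[name])
               for name, w in weights.items())

def search(query, database, weights, top_k=5):
    """Return the top_k database images most similar to the query."""
    ranked = sorted(database,
                    key=lambda img: combined_similarity(query, img, weights),
                    reverse=True)
    return ranked[:top_k]
```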
Calculating DTW Center of Time Series Using Dynamic Programming
SUN Tao, XIA Fei and LIU Hong-bo
Computer Science. 2015, 42 (12): 278-282. 
Abstract PDF(346KB) ( 954 )   
References | RelatedCitation | Metrics
The central time series, which captures the common features of a set of time series, plays an important role in time series clustering. We proposed a dynamic programming approach called DPSSD to calculate the central time series of two time series. The approach recursively minimizes the sum of squared DTW distances (SSD) from the central series to the two sample series. Degree pruning was also introduced to decrease the algorithm's time complexity. The proposed algorithm was proved theoretically to achieve the optimal solution. Experimental results illustrate that our approach has much better performance and robustness than DBA, as measured by SSD.
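The DTW distance underlying the SSD criterion is itself computed by dynamic programming. The following is the textbook recurrence, not the paper's DPSSD algorithm.

```python
# Classic dynamic-programming DTW with squared point-wise cost:
# D[i][j] = (a[i]-b[j])^2 + min(D[i-1][j], D[i][j-1], D[i-1][j-1]).

def dtw(a, b):
    """Squared-cost DTW distance between sequences a and b."""
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Note that warping absorbs repeated samples, so a series and its "stuttered" copy are at DTW distance zero; this elasticity is exactly why averaging under DTW (the center problem) is harder than averaging under Euclidean distance.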
Accurately and Efficiently Comparing Rooted Phylogenetic Trees
LI Shu-guang, CHEN Shu-ying and ZHU Li-bo
Computer Science. 2015, 42 (12): 283-287. 
Abstract PDF(411KB) ( 1310 )   
References | RelatedCitation | Metrics
Phylogenetic trees represent the historical evolutionary relationships among different species. Comparing phylogenetic trees is a fundamental task in bioinformatics. One way to compare trees is to define a pairwise measure of similarity or dissimilarity on the tree space that determines how different two trees are; the Robinson-Foulds distance is by far the most widely used dissimilarity measure. A new pairwise dissimilarity measure for comparing rooted phylogenetic trees was defined which takes into account not only the identity of clusters, as the Robinson-Foulds distance does, but also more subtle similarities between clusters, and thus may provide a more accurate and cleaner measurement than the Robinson-Foulds distance. Two algorithms to compute this measure efficiently were presented. With slight modifications, these results can be applied to five other related comparison indices.
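The baseline Robinson-Foulds distance the paper refines can be sketched as follows; trees are represented as nested tuples with string leaves, a representation chosen for the example.

```python
# Robinson-Foulds distance for rooted trees: the size of the symmetric
# difference of the two trees' cluster sets (the leaf set below each
# internal node). This is the standard definition, not the paper's
# new measure.

def clusters(tree, out=None):
    """Return (leaf set of this subtree, set of all clusters seen)."""
    if out is None:
        out = set()
    if isinstance(tree, str):          # a leaf
        return frozenset([tree]), out
    leaves = frozenset()
    for child in tree:
        child_leaves, _ = clusters(child, out)
        leaves |= child_leaves
    out.add(leaves)                    # record this internal node's cluster
    return leaves, out

def robinson_foulds(t1, t2):
    _, c1 = clusters(t1)
    _, c2 = clusters(t2)
    return len(c1 ^ c2)
```

Because the measure counts only exact cluster identity, two trees that differ in one leaf placement can already be at maximal distance, which is the coarseness the paper's refined measure addresses.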
Calibration Method of Using Texture Distribution Features
LI Na, GENG Guo-hua and WANG Xiao-feng
Computer Science. 2015, 42 (12): 288-291. 
Abstract PDF(817KB) ( 408 )   
References | RelatedCitation | Metrics
Texture reconstruction is an important part of the digital preservation of cultural relics. Aiming at the precision problem of automatic calibration for multi-angle texture acquisition in texture reconstruction of small and medium-sized relics, this paper put forward a new quantitative-feature calibration method to improve the precision of the fitting between 3D space points and 2D pixels. The method normalizes the texture distribution data according to two important kinds of texture features of the real surface: the variety of colors and the number of geometric patterns. It then uses the normalized data to assign quantitative features to the calibration template for each camera angle. Finally, it realizes multi-view registration and mapping between 3D space points and 2D pixels through the perspective projection principle. Experimental results show that the proposed non-uniform calibration template reflects texture precision better than a uniform calibration template, without obviously increasing the amount of computation.
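The perspective projection principle used for the 3D-to-2D mapping is the pinhole model; the sketch below shows it in camera coordinates, with illustrative intrinsic parameters (focal length and principal point).

```python
# Minimal pinhole perspective projection mapping a camera-frame 3-D
# point to a 2-D pixel: u = f*X/Z + cx, v = f*Y/Z + cy.
# The intrinsics below are illustrative values, not calibrated ones.

def project(point, focal=800.0, cx=320.0, cy=240.0):
    """Project camera-frame point (X, Y, Z) to pixel (u, v)."""
    X, Y, Z = point
    if Z <= 0:
        raise ValueError("point must be in front of the camera")
    return focal * X / Z + cx, focal * Y / Z + cy
```

Calibration estimates exactly these intrinsics (plus the camera pose per view); once they are known, every 3D surface point can be mapped to its texture pixel in each view.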
Fractal Image Retrieval Algorithm Based on Contiguous-matches
ZHANG Si-si, LIU Yu and ZHAO Zhi-bin
Computer Science. 2015, 42 (12): 292-296. 
Abstract PDF(998KB) ( 407 )   
References | RelatedCitation | Metrics
Fractal codes describe the self-similarity of an image across scales. In this paper, image features are recorded as fractal codes, which are then used to determine image similarity for retrieval. Based on an adaptive quadtree segmentation method, a new fast fractal image coding method was presented, in which fractal codes are obtained quickly by judging the similarity of fixed blocks within a neighborhood. Segmentation is decreased and coding time is reduced while the quality of image decoding is preserved. Meanwhile, the paper also proposed a new distance formula by which the similarity between image blocks can be determined quickly and the accuracy of the similarity judgment can be enhanced. Experimental results show that, compared with the grayscale histogram method, the proposed algorithm significantly improves the precision-recall rating of image retrieval. Compared to fractal retrieval algorithms in the literature, the algorithm shortens encoding time and reduces the number of sub-blocks, thereby improving retrieval efficiency.
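Adaptive quadtree segmentation of the kind used here can be sketched with a simple split criterion; a grey-level variance threshold stands in for the paper's neighborhood similarity test, and the threshold value is illustrative.

```python
# Illustrative quadtree segmentation: recursively split a square block
# while its grey-level variance exceeds a threshold. The variance
# criterion is a common stand-in for the paper's similarity test.

def variance(img, x, y, size):
    vals = [img[y + dy][x + dx] for dy in range(size) for dx in range(size)]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def quadtree(img, x=0, y=0, size=None, threshold=10.0, min_size=2):
    """Return leaf blocks as (x, y, size) triples."""
    if size is None:
        size = len(img)
    if size <= min_size or variance(img, x, y, size) <= threshold:
        return [(x, y, size)]          # homogeneous enough: keep as one block
    h = size // 2
    blocks = []
    for ox, oy in ((0, 0), (h, 0), (0, h), (h, h)):
        blocks += quadtree(img, x + ox, y + oy, h, threshold, min_size)
    return blocks
```

Fewer leaf blocks means fewer fractal codes to search at retrieval time, which is where the reported efficiency gain comes from.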
Tracking Method of Hand Grasping Objects Based on Feedback from Camshift to Codebook Model
ZHAO Cui-lian, WANG Hong, FAN Zhi-jian and GUO Jing
Computer Science. 2015, 42 (12): 297-301. 
Abstract PDF(683KB) ( 387 )   
References | RelatedCitation | Metrics
To solve the problem that the Camshift algorithm cannot automatically track targets against a complex background, an algorithm for detecting and tracking moving objects based on feedback from Camshift to a codebook model was proposed. First, the algorithm detects foreground objects with the codebook model and tracks objects in the foreground area by Camshift using the color probability distribution. Automatic tracking is achieved by comparing window sizes and judging histogram correlation, and the input search window for the next frame is improved by location prediction and window-size expansion. At the same time, the union of the processed rectangular windows is fed back to the codebook model as the detection region for the next frame. Finally, the algorithm is applied to determining the grasping status between a hand and objects: images captured by two cameras are used to detect and track the hand and objects against a static background, and the number of grasps is counted via rectangle intersection to verify the tracking algorithm. Since the information feedback reduces the search region for detection and tracking, the algorithm achieves better real-time performance; experimental results show that the frame processing rate can increase by 130% with a single camera.
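The histogram-correlation judgment used to decide whether the tracked window still matches the target can be sketched as a Pearson correlation between histograms (the same formula as OpenCV's correlation comparison method); the acceptance threshold is an assumption.

```python
# Sketch of the histogram-correlation check used to decide whether the
# tracked Camshift window still matches the target model.

def hist_correlation(h1, h2):
    """Pearson correlation between two histograms; 1.0 = identical shape."""
    n = len(h1)
    m1, m2 = sum(h1) / n, sum(h2) / n
    num = sum((a - m1) * (b - m2) for a, b in zip(h1, h2))
    d1 = sum((a - m1) ** 2 for a in h1) ** 0.5
    d2 = sum((b - m2) ** 2 for b in h2) ** 0.5
    return num / (d1 * d2) if d1 and d2 else 0.0

def still_tracking(model_hist, window_hist, threshold=0.7):
    """Accept the tracking window only while its histogram stays
    close enough to the target model."""
    return hist_correlation(model_hist, window_hist) >= threshold
```

When the check fails, tracking falls back to re-detection from the codebook foreground, which is the feedback loop the abstract describes.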
Mixed Application of GHA Based on PCA in BP Neural Network
FAN Yan, WU Xiao-jun, SHAO Chang-bin and SONG Xiao-ning
Computer Science. 2015, 42 (12): 302-306. 
Abstract PDF(676KB) ( 485 )   
References | RelatedCitation | Metrics
In view of the defects arising from combining traditional feature extraction with a BP neural network, this paper presented a new classification model, the PCABP network. Firstly, the PCA eigenvectors are used to initialize the weight matrix of the initial layer of the PCABP network, so that this layer takes over the feature extraction role of PCA. Secondly, during training, the generalized Hebbian algorithm (GHA) and gradient descent (GD) are applied to dynamically adjust the projection weight matrix of the initial layer, thereby optimizing the PCA eigenvectors. This method jointly optimizes class separation and feature extraction from the source samples, finds the best connection point between sample dimension reduction and classification, and replaces the traditional recognition pattern of performing feature extraction first and then classification with a separate classifier. Experiments on the FERET face database verify the effectiveness of the method.
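One GHA weight update (Sanger's rule), the rule that keeps the initial layer's projection close to the principal components during training, can be sketched as follows; the shapes and learning rate are illustrative.

```python
# One GHA (Sanger's rule) update for an m x d weight matrix W given an
# input vector x:  dW_i = lr * y_i * (x - sum_{k<=i} y_k * W_k),
# where y_i = W_i . x and the sum uses the old weights.

def gha_step(W, x, lr=0.01):
    """Return the updated weight matrix after one GHA step."""
    y = [sum(wi * xi for wi, xi in zip(row, x)) for row in W]
    new_W = []
    partial = [0.0] * len(x)           # running sum_{k<=i} y_k * W_k
    for i, row in enumerate(W):
        partial = [p + y[i] * w for p, w in zip(partial, row)]
        new_W.append([w + lr * y[i] * (xj - pj)
                      for w, xj, pj in zip(row, x, partial)])
    return new_W
```

A unit-norm weight vector aligned with the input is a fixed point of the rule, which is why GHA rows converge to the leading eigenvectors of the input covariance in order.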
Key Frame Extraction Algorithm Based on Improved Block Color Features and Second Extraction
LIU Hua-yong and LI Tao
Computer Science. 2015, 42 (12): 307-311. 
Abstract PDF(931KB) ( 396 )   
References | RelatedCitation | Metrics
Key frame extraction is an important technique for video summarization, search, browsing and understanding. Existing key frame extraction algorithms suffer from problems such as feature selection, the difficulty of choosing thresholds and weak adaptability. To extract key frames efficiently, this paper proposed an improved key frame extraction algorithm based on low-level features. Firstly, each frame is divided into rectangular rings of equal area. Secondly, sub-block cumulative color histograms are extracted as color features, and different weights are set for different rectangular rings in order to emphasize the central part of the frame. Thirdly, candidate key frames are selected where frame content changes significantly. Lastly, key frames are optimized and re-selected according to their locations in the video. Experimental results show that the proposed algorithm has good adaptability and can effectively reduce redundant key frames when a shot contains a sudden flash or fast object motion, and the extracted key frames express the primary content of the video effectively.