Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 40 Issue Z11, 16 November 2013
  
Sensor Information Aware Uneven Clustering Routing Algorithm (SIAUCR)
NIU Jia-pei and CHENG Liang-lun
Computer Science. 2013, 40 (Z11): 1-3. 
In wireless sensor networks (WSN), unreasonable cluster-head election causes nodes to consume their energy quickly and ends the network lifetime early. This paper proposes a sensor information aware uneven clustering routing algorithm (SIAUCR), in which cluster heads are selected according to a node's neighborhood cardinality, its distance to the base station and its residual energy. In the cluster formation stage, each node joins the nearest cluster; transmission within a cluster uses a single hop, while transmission between cluster heads uses multiple hops. Simulation results show that SIAUCR is more effective than LEACH and CEBRCA in extending network lifetime, saving energy and improving transmission.
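The abstract names three cluster-head selection criteria (neighborhood cardinality, distance to the base station, residual energy) without giving the weighting. The sketch below only illustrates such a score; the weights, node fields and head ratio are assumptions, not the authors' formula.

```python
import math

def cluster_head_score(node, base_station, w1=0.4, w2=0.3, w3=0.3):
    """Illustrative cluster-head score combining the three criteria named in
    the abstract. The weights and field names are placeholders."""
    d_bs = math.dist(node["pos"], base_station)
    return (w1 * node["residual_energy"] / node["initial_energy"]
            + w2 * node["neighbor_count"] / node["max_neighbors"]
            + w3 * (1.0 - d_bs / node["max_range"]))

def elect_cluster_heads(nodes, base_station, ratio=0.1):
    """Pick the top-scoring fraction of nodes as cluster heads."""
    ranked = sorted(nodes, key=lambda n: cluster_head_score(n, base_station), reverse=True)
    return ranked[:max(1, int(len(ranked) * ratio))]
```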
Security in Wireless Sensor Networks with Mobile Sink
ZHANG Xu-bin and LIU Zhi-hong
Computer Science. 2013, 40 (Z11): 4-7. 
Unattended wireless sensor networks operating in hostile environments face the risk of compromise.Given the unattended nature,sensors must safeguard their sensed data of high value temporarily.However,saving data inside a network creates security problems due to the lack of tamper-resistance of sensors and the unattended nature of the network.In some occasions,a network controller may periodically dispatch mobile sinks to collect data.If a mobile sink is given too many privileges,it will become very attractive for attack.Thus,the privilege of mobile sinks should be restricted.Additionally,secret keys should be used to achieve data confidentiality,integrity,and authentication between communicating parties.To address these security issues,we present mAKPS,an asymmetric key predistribution scheme with mobile sinks,to facilitate the key distribution and privilege restriction of mobile sinks.
Design and Implementation of a Lightweight Building Framework Based on Bash
BAI Yun,YU Li and XIE Chang-sheng
Computer Science. 2013, 40 (Z11): 8-12. 
A lightweight building framework was designed and implemented in the Bash script language, which is widely supported across Linux distributions, in order to manage the complexity of building an operating system and thereby deeply customize an embedded Linux operating system with a graphical user interface for various hardware platforms. Thanks to this design and simplification, the resulting system depends less on the build environment and supports continuous development more efficiently.
Application of TDMA Theory in Data Collection System Onboard
LIANG Huan,ZHAO Kai-rui,LAN Qi-long,YANG Xin,WEI Zi-yang and ZHOU Yue-ming
Computer Science. 2013, 40 (Z11): 13-14. 
The article first introduces TDMA (Time Division Multiple Access) and the principles of optical fiber transmission, and then studies their application in an airborne data acquisition system in detail. On this basis, using field programmable gate array (FPGA) hardware design and implementing the proposed scheme in hardware, we designed an FC (Fiber Channel)-based, multi-node distributed system for optical signal transmission. In practical verification, the experimental results met the expected goals, indicating that the design is stable and reliable.
Adaptive Momentum Fast Blind Source Separation Algorithm for Time-varying Mixing System
CHEN Hai-ping,ZHANG Hang,LU Wei,YANG Liu and ZHOU Xuan
Computer Science. 2013, 40 (Z11): 15-17. 
Most existing blind source separation algorithms assume that the mixing matrix is fixed. However, in practical communication systems the mixing matrix is commonly time-varying. This paper proposes a model of a gradually time-varying mixing system. For this model and the existing model of an abruptly time-varying mixing system, a fast blind source separation algorithm is proposed that uses the exponentially weighted sum of squared errors as the cost function and adds an adaptive momentum term to the learning rule. Simulation results show that the proposed algorithm converges faster than existing algorithms and tracks the time-varying system effectively.
Data Aggregation Algorithm of Maximum Lifetime for Wireless Sensor Network
ZHANG Zhen-yu and ZHAO Qiu-ling
Computer Science. 2013, 40 (Z11): 18-21. 
Wireless sensor networks suffer from a series of problems in information transmission, such as limited node energy, data conflicts and transmission delay, so a data aggregation algorithm aiming at maximum network lifetime is put forward. The algorithm divides the nodes of the network into a number of clusters, with nodes distributed uniformly according to their transmission range. Each node selects its communication mode according to its local information and residual energy to transmit data to the cluster head, forming the shortest data transmission path. A Pareto-based particle swarm optimization method is then applied within a centralized TDMA scheduling model, optimizing the average processing time slots and the average energy consumption needed to complete the information transmission. Simulation results show that the algorithm not only maximizes the survival time of the network but also effectively reduces data fusion time and network delay.
Compressed Sensing-based Cooperative Spectrum Detection Algorithm Based on Signal Dependability
LI Na,CHEN Song,WANG Sheng and LI Ou
Computer Science. 2013, 40 (Z11): 22-25. 
Compressed sensing provides a new way to perform wideband spectrum sensing in cognitive radio. Based on the principle of compressed sensing, a cooperative spectrum detection algorithm based on signal dependability is proposed for the scenario of multiple cooperating cognitive users. The quality of the signal received by each user is used as its dependability, while spectrum occupation is determined by the Orthogonal Matching Pursuit algorithm. Simulation results show that the proposed algorithm performs better than traditional ones under different signal-to-noise ratios and improves detection performance with low complexity.
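The abstract attributes support recovery to Orthogonal Matching Pursuit; a minimal textbook OMP in NumPy (variable names mine, not the paper's implementation) looks like this:

```python
import numpy as np

def omp(A, y, k):
    """Minimal Orthogonal Matching Pursuit: recover a k-sparse x with y ~= A @ x.
    Returns the estimated coefficients and the selected support set."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-solve least squares on the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x, support
```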
Influence of Burstiness on Information Diffusion
DENG Dong-mei,ZHU Jian,CHEN Duan-bing and GAO Hui
Computer Science. 2013, 40 (Z11): 26-28. 
In recent years, spreading dynamics has been a hot topic in network research. Traditional studies of information dissemination are based on static networks, but many real networks are temporal. Many scholars have studied the impact of burstiness on information dissemination, and the results show that burstiness plays different roles on different data sets and under different node infection mechanisms. In view of this phenomenon, this paper studies the influence of burstiness on information dissemination using the DCW null model, analyzes information spreading on the original data and on the null-model data, and looks for the reason why burstiness shows different effects on spreading.
Rate-compatible Puncturing Algorithm for Finite-length LDPC Codes
NIU He-hao and HE Yuan-zhi
Computer Science. 2013, 40 (Z11): 29-31. 
This paper proposes a rate-compatible puncturing scheme for finite-length low-density parity-check (LDPC) codes over the additive white Gaussian noise (AWGN) channel. The method is applicable to both regular and irregular LDPC codes. The scheme selects the bits to be punctured based on a sequence of criteria; an important criterion is the number of short cycles with low approximate cycle extrinsic message degree (ACE) in which a candidate bit node participates. Simulation results demonstrate that the ACE plays an important role in the performance of the codes, and also show that the scheme is superior to existing puncturing methods over a wide range of code rates.
Study of the Large Scale Parallel Operating System
SHAO Zong-you,WANG Zhao-shun and XU Jian-wei
Computer Science. 2013, 40 (Z11): 32-36. 
By analyzing an operating system running in a supercomputer simulator, it is found that different nodes in a supercomputer use different parts of the operating system and that a traditional operating system cannot meet this demand. A heterogeneous large-scale operating system is therefore designed, and a prototype named SandOS is implemented on the simulator. The prototype includes the lightweight kernel SandPOS for compute nodes, FileServer for I/O nodes and MonitorServer for service nodes. Comparing SandPOS with a traditional operating system in terms of memory overhead, scheduling efficiency, run efficiency and address translation efficiency shows that, in a large-scale parallel system, SandOS is more efficient than a traditional operating system.
Localization Method for WSN Based on One Mobile Anchor
GONG Miao,FENG You-bing and BIAN Jian-xiu
Computer Science. 2013, 40 (Z11): 37-40. 
A localization algorithm for WSN using one mobile anchor with a directional antenna was proposed to improve positioning accuracy. The network is divided into several layers, and the anchor node moves along the x axis and the layer boundaries to traverse all unknown nodes. While moving, the anchor broadcasts its position information directionally at fixed distance intervals. Each unknown node then determines its own position according to the anchor node's coordinates and azimuth information. The simulation results show that the algorithm achieves higher accuracy than the typical SLWL algorithm.
Message Transmission System for Opportunistic Networks
MA Xue-bin,ZHANG Yan-wen,OUYANG Zhen-chao and WANG Li-ting
Computer Science. 2013, 40 (Z11): 41-45. 
An opportunistic network is a type of challenged network in which no stable end-to-end delivery path exists between source and destination nodes; messages are transmitted in a "store-carry-and-forward" manner as message-carrying nodes encounter other nodes, until the messages reach their destinations. In this paper we design BlueChat, an information collection and message transmission system for opportunistic networks based on the Bluetooth protocol. On one hand, it collects node contact information to support research on routing tables, mobility models, community detection, routing protocols and QoS strategies. On the other hand, it transmits messages using different message queue management strategies for specific circumstances. The system was tested for three months with more than fifty nodes involved; ninety-five percent of the messages were delivered, showing that it can meet the message transmission needs of opportunistic networks.
Study of Network Resilience
LIU Mi-xia and ZHU Hong-lei
Computer Science. 2013, 40 (Z11): 46-49. 
Resilience in the network,which is defined as the ability of the networks to provide and maintain an acceptable level of service in the face of various faults and challenges to normal operation,must be viewed as an essential design and operational characteristic of future networks in general,and the Global Internet in particular.This paper is about network resilience in heterogeneous environment based on situational awareness in the holistic view.The works are organized as follows:first,definition,model,realization and assessment of network resilience are analyzed in depth;then,network resilience model based on situational awareness is discussed,and function and realization methods of each level are depicted in detail;at last we talk about the next step research of network resilience.
Research Progress and Development Trend of Cognitive Radio for Smart Grid
YAO Ji-ming,LIANG Yun,LI Bin-lin and HUANG Li
Computer Science. 2013, 40 (Z11): 50-52. 
Applying cognitive radio to the smart grid will effectively address the problem of spectrum scarcity in wireless communications. A cognitive radio network can serve as a robust and efficient communications infrastructure that can meet both the current and future energy management needs of the smart grid. A cognitive radio testbed for the smart grid can not only test application performance but also expose practical problems, and an efficient and reliable communication architecture is of great significance in supporting two-way communication for the smart grid. This paper summarizes the latest research progress on cognitive radio testbeds and system architectures for the smart grid and, on this basis, discusses the next development trends.
Research on Mobile Operating System Architectures
HU Zhong-wang
Computer Science. 2013, 40 (Z11): 53-56. 
The mobile operating system (MOS) is foundational software in a field of intense international competition. As the basic system software of mobile devices, an MOS controls all resources of the device and supports basic services and application development. Architecture design is a main task in developing a new MOS. This paper analyzes the structures of the mainstream mobile operating systems Android, iOS, Symbian and Windows Phone, compares the similarities and differences of their architectures and implementation technologies, gives technical routes for an autonomous MOS, and discusses and points out the development trend of future mobile operating systems. The paper is intended to provide technical support for MOS development.
Design for SMEs of Low Cost Web Server Load Balancer
XU Wei,ZHU Shuai and YE Chun-hao
Computer Science. 2013, 40 (Z11): 57-59. 
With the rapid growth of Web traffic, load balancing equipment is applied more and more widely, but current load balancing equipment is very expensive and unaffordable for small and medium-sized enterprises (SMEs), so designing a low-cost Web server load balancer is of real significance. Through an analysis of current load balancing technology, the author constructed a design framework for a Web server load balancer and designed algorithms for server performance evaluation, health checking, continuous service and so on, solving the core problems in designing a low-cost Web server load balancer for SMEs.
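The abstract mentions server performance evaluation, health checking and continuous service without concrete algorithms; one plausible policy, sketched here purely as an illustration (class fields, scoring formula and probe interface are all assumptions), is a weighted choice among healthy back ends:

```python
import random

class Backend:
    def __init__(self, host, weight):
        self.host = host          # e.g. "10.0.0.2:8080"
        self.weight = weight      # static score from performance evaluation
        self.healthy = True       # updated by periodic health checks
        self.active_conns = 0

def health_check(backend, probe):
    """Mark a back end unhealthy if its probe (e.g. an HTTP GET) fails."""
    backend.healthy = probe(backend.host)

def pick_backend(backends):
    """Weighted choice among healthy servers, penalizing active connections."""
    candidates = [b for b in backends if b.healthy]
    if not candidates:
        raise RuntimeError("no healthy back end available")
    scores = [b.weight / (1 + b.active_conns) for b in candidates]
    return random.choices(candidates, weights=scores, k=1)[0]
```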
Research of Virtual Machine Load Balancing Based on Ant Colony Optimization in Cloud Computing and Multi-dimensional QoS
ZHANG Mu
Computer Science. 2013, 40 (Z11): 60-62. 
Aiming at the virtual machine load balancing problem in cloud computing, and in order to achieve efficient scheduling while meeting users' QoS needs, a virtual machine load balancing method based on multi-dimensional QoS is proposed. Firstly, a mathematical model of cloud resource scheduling is built. Then, a virtual machine load balancing algorithm based on ant colony optimization and multi-dimensional QoS is introduced. Finally, CloudSim simulation experiments show that the algorithm solves the virtual machine load balancing problem effectively, reduces the load balance deviation and satisfies the needs of virtual machine load balancing in cloud computing.
Social Network’s Fuzzy Cluster Theory and its Applications
SHEN Jie and GUO Li-sen
Computer Science. 2013, 40 (Z11): 63-67. 
This paper analyzes the history, characteristics and applications of existing social networks and then derives a corresponding theoretical model. In our fuzzy cluster belief propagation theory, a social network is classified into hierarchical sets according to its coverage; the number of affected users and their behaviors are analyzed, and the sets are then divided into different clusters by the proposed algorithm. Since fuzzy sets are used to approximate the propagation probability, the computational complexity is reduced. An example of a social network for electric automobile charging is then given, in which both wireless sensor networks and the social Web 2.0 application mode are used. Comparing statistical data with the estimates of our cluster algorithm shows that the algorithm is effective and accurate. Finally, we give a general social network business model and look forward to its impact on the future of the Internet.
Research on the Independence of the Communication Protocol Software in Automation
DAI Hong-bin
Computer Science. 2013, 40 (Z11): 68-72. 
Communication protocol software is employed to carry out communication among devices, systems and different parts of them, so that they can exchange data and share information in automation projects. Its reliability and scalability significantly affect the quality and implementation of such projects. In this paper the data and operations in communication protocol software are explored and divided according to functional stage. By introducing the transaction concept, some design principles for communication protocol software are suggested; these help enhance the independence of the communication protocol software, thus improving its reliability and scalability and, as a result, contributing to the quality and implementation of automation projects.
Application of Swarm Intelligence Optimization Algorithm in Parameter Optimization Design of Propeller
WANG Peng,HUANG Shuai and ZHU Zhou-quan
Computer Science. 2013, 40 (Z11): 73-76. 
In general,parameter optimization design of propeller is a nonlinear problem,and the key to the problem is how to find a set of appropriate parameters under various constraint conditions to make propeller performance best.As a novel evolutionary computation technology,swarm intelligence is now becoming a new research hotspot,and has been successfully applied in many fields.Practice shows that swarm intelligence optimization algorithm is an effective method to solve global optimization problems.In this paper,the principles of particle swarm optimization and artificial bee colony algorithm were introduced.Then on the basis of establishing mathematical model of parameter optimization design of propeller,the swarm intelligence optimization algorithm was employed to solve the problem of parameter optimization design of propeller,and the experimental results indicate that the swarm intelligence optimization algorithm is an effective and potential method for this problem.
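The abstract describes PSO only at the principle level; below is a generic PSO loop, not the authors' propeller model, with the objective, bounds and coefficients as placeholders:

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Generic particle swarm optimization minimizing `objective` over box `bounds`."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest, objective(gbest)

# Toy objective standing in for the propeller performance model.
best, best_val = pso(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```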
Covariance Based Learning Algorithm for Gaussian Mixture Model
LIAO Xiao-feng,FAN Xiu-bin and JIANG Qing-shan
Computer Science. 2013, 40 (Z11): 77-81. 
Expectation maximization (EM) is commonly used for parameter estimation in Gaussian mixture models. This paper presents a covariance-based (CVB) machine learning algorithm for solving the Gaussian mixture model under the specific constraint that the covariance is already known. Experiments show that the CVB algorithm performs better than the EM algorithm under this constraint.
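As a point of reference for the "known covariance" constraint, here is a minimal EM loop for a one-dimensional Gaussian mixture in which the variance is held fixed and only the means and weights are updated; this illustrates the constrained setting and is not the paper's CVB algorithm:

```python
import numpy as np

def em_fixed_variance(x, k, sigma2, iters=50):
    """EM for a 1-D Gaussian mixture with known, shared variance sigma2:
    only the means and mixing weights are re-estimated."""
    n = len(x)
    means = np.random.choice(x, k, replace=False).astype(float)
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities under the fixed variance.
        diff = x[:, None] - means[None, :]
        log_p = -0.5 * diff**2 / sigma2 + np.log(weights)
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update means and weights only; sigma2 stays fixed.
        nk = resp.sum(axis=0)
        means = (resp * x[:, None]).sum(axis=0) / nk
        weights = nk / n
    return means, weights

data = np.concatenate([np.random.normal(0, 1, 200), np.random.normal(5, 1, 200)])
means, weights = em_fixed_variance(data, k=2, sigma2=1.0)
```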
Design and Implementation of Simulation Engine for AADL Model Based Testing
XUAN Hang,DONG Yun-wei and SUN Bo
Computer Science. 2013, 40 (Z11): 82-85. 
Non-functional properties of safety-critical systems, such as real-time behavior, reliability and safety, have become key constraints on the dependability of system behavior and affect software quality in large-scale systems. Finding a way to analyze dependability properties in the system design phase, so that designers can optimize the system architecture and rebuild software and hardware components to meet quality specifications, is a key research task. To address this problem, an AADL model-based simulation engine was designed and implemented in this paper. The engine is developed with SystemC and POSIX techniques, and its core functions include task encapsulation, task scheduling, time management, interrupt management and signal control. It can execute AADL instances dynamically and carry out model-based testing to verify the real-time properties of embedded systems. Finally, a case study of an automation control system modeled in AADL is given and its tests are executed on AMSE; several timing properties are tested, such as AADL flow latency, thread execution time and cache hit rate. This simulation engine is useful for verifying AADL models.
Research and Improvement of Filter Algorithm of Malicious Information Based on One-class SVM
DING Xiao-yun,LIU Gong-shen and MENG Kui
Computer Science. 2013, 40 (Z11): 86-90. 
Research on monitoring and filtering files transported over the Internet is receiving more and more attention. Traditional algorithms based on string matching cannot keep up with the huge increase in information. Although an SVM model can improve classification efficiency, the problem remains that an overly large feature dimension slows down detection and wastes storage space and computing power. An algorithm is proposed that first reduces the dimension with a specific algorithm before classification. The analysis results show that, with this improvement, a more accurate result can be obtained.
Influence Analysis of Target Tracking Error on Missile Guide Hit Probability
ZHANG Guo-dong,ZHANG Jian-qiang and LIU Zhong
Computer Science. 2013, 40 (Z11): 91-93. 
Aiming at the guidance hit probability problem, this paper analyzes the influence of target tracking error on missile guidance hit probability. The paper first builds a converted measurement Kalman filtering error model, then analyzes the factors that influence guidance hit probability, and finally builds an equivalent target-hit decision model and a hit probability model based on the dispersion of impact points, discussing the degree to which target tracking error influences the guidance hit probability. A simple example shows that the analysis method is useful and yields some instructive conclusions.
Safety Supervision System for Heavy Haul Railway: Timed Automaton Model and Verification
WANG Jin,SUN Jing-hao,HE Xing-quan and MENG Ya-kun
Computer Science. 2013, 40 (Z11): 94-97. 
Safety supervision of trains approaching a port is an important and difficult issue in China's port informatization. Fully automating the tracing of the train traveling process plays a very significant role in ensuring safe and efficient supervision in the port. A cyber-physical system was designed to deal with the incident in which a person climbs onto a train traveling on the heavy haul railway, and an associated timed automaton model was proposed to describe the real-time behaviors of the system. A random run sequence was simulated with the UPPAAL tool, and the verification results show that the system designed in this paper satisfies the critical properties, such as reachability, security, liveness and timing constraints.
Multi-label Text Classification Algorithm Based on Hyper Ellipsoidal SVM
QIN Yu-ping,WANG Yi,LUN Shu-xian and WANG Xiu-kun
Computer Science. 2013, 40 (Z11): 98-100. 
A new multi-label text classification algorithm based on hyper-ellipsoidal support vector machines is proposed. For each class, a hyper-ellipsoid that encloses as many of the class samples as possible while pushing outlier samples away is trained in the feature space. For a sample to be classified, the Mahalanobis distances from the mapped sample to the centers of the hyper-ellipsoids are used to decide the sample's classes. Experimental results show that the proposed algorithm achieves higher classification accuracy.
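To make the decision rule concrete, a small sketch of Mahalanobis-distance classification against per-class ellipsoid centers is given below; the thresholding used to produce a multi-label output is an assumption, not the paper's rule:

```python
import numpy as np

def mahalanobis(x, center, cov_inv):
    d = x - center
    return float(np.sqrt(d @ cov_inv @ d))

def predict_labels(x, ellipsoids, threshold=1.0):
    """ellipsoids: {label: (center, inverse covariance)}.
    A label is assigned when the sample lies within `threshold`
    Mahalanobis units of that class's ellipsoid center."""
    return [label for label, (c, cov_inv) in ellipsoids.items()
            if mahalanobis(x, c, cov_inv) <= threshold]
```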
Counting Method for Ridges Crossed by Connection Line of Two Minutiae
ZHONG Wei-bo,LV Yuan and LI Min-min
Computer Science. 2013, 40 (Z11): 101-104. 
The number of ridges crossing the connection line between two minutiae is increasingly used in fingerprint matching because it is robust to fingerprint scaling, rotation, translation and slight deformation, and it directly affects the accuracy and robustness of matching. Most ridge counting algorithms based on Bresenham's algorithm or its improvements are not accurate enough. A new ridge counting algorithm based on the thinned fingerprint is presented in this paper. The pixels of the connection line are first computed, and the black pixels on or near the connection line are obtained. The number of ridges crossing the connection line is then derived from the geometric relationship among the connection line, the adjacent ridges and the minutiae. Experimental results show that the algorithm accurately and effectively counts the ridges crossing the connection line between two minutiae.
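A bare-bones version of the underlying idea, counting runs of ridge pixels met while walking the sampled segment between two minutiae on a thinned, binarized image, might look like the following; the sampling step and endpoint handling are mine, not the paper's refinement:

```python
import numpy as np

def count_ridge_crossings(img, p1, p2, samples=200):
    """img: 2-D array, 1 = ridge pixel, 0 = background (thinned fingerprint).
    Counts maximal runs of ridge pixels along the open segment p1 -> p2
    (endpoints excluded so the two minutiae themselves are not counted)."""
    rows = np.linspace(p1[0], p2[0], samples).round().astype(int)[1:-1]
    cols = np.linspace(p1[1], p2[1], samples).round().astype(int)[1:-1]
    on_ridge = img[rows, cols] > 0
    # A crossing is a 0 -> 1 transition, i.e. entering a new ridge.
    crossings = int(np.sum((~on_ridge[:-1]) & on_ridge[1:]))
    return crossings + int(on_ridge[0])
```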
Type II Fuzzification on Choquet Integral
YANG Rong and ZHENG San-yuan
Computer Science. 2013, 40 (Z11): 105-108. 
This paper provided a detailed discussion on one fuzzification of Choquet integral which supports fuzzy-valued integrand and gave crisp-valued integration result.It is a generalized Choquet integral for fuzzy-valued integrand,interval-valued integrand,as well as the crisp-valued integrand.The presented generalized Choquet integral with respect to signed fuzzy measure can act as an aggregation tool which is especially useful in many information fusing and data mining problems (such as regression and decision making) where not only crisp data but also heterogeneous fuzzy data are involved.
Model-based Logic Boundary Coverage Testing Criteria
LI Li-ping and LI Xing-sen
Computer Science. 2013, 40 (Z11): 109-114. 
Because specification-based logic coverage criteria pay little attention to boundaries, this paper formalizes the boundary value analysis method and proposes a series of model-based logical boundary coverage criteria. Results show that test cases satisfying these criteria detect more errors than those satisfying the original logic coverage criteria: they not only satisfy the logic coverage criteria but also test the system boundaries.
Bi-level Programming Problem Based on Improved Particle Swarm Algorithm
ZHAO Zhi-gang,WANG Wei-qian and HUANG Shu-yun
Computer Science. 2013, 40 (Z11): 115-119. 
This paper proposed an algorithm which uses particle swarm optimization (PSO) method to solve the bi-level programming problem (BLPP).A PSO algorithm with adaptive mutation is put forward firstly to improve the performance of standard PSO.Then the modified PSO is used to solve the bi-level programming model.In the proposed algorithm,the interactive iteration between the two PSO optimizes synchronously the two levels of BLPP,and finally obtaining its optimal solution.The experimental results show that the new algorithm can be used to solve the general BLPP.
Information Fusion Algorithm Based on D-S Evidence Theory
JIANG Tao
Computer Science. 2013, 40 (Z11): 120-124. 
Aiming at the lack of a systematic approach in applying existing D-S evidence theory algorithms to information fusion, this paper proposes a new hierarchical fusion algorithm based on D-S evidence theory. The model processes multi-dimensional data using a hierarchical, domain-by-domain fusion pattern. The paper also presents a new approximate calculation algorithm to determine probabilities for high-level information fusion from the initial information, and gives a corrective algorithm to resolve conflicts among evidence. The simulation results show that the algorithm achieves a high detection probability, a low false alarm probability, fast convergence and high accuracy.
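For reference, Dempster's rule of combination, on which such fusion algorithms build, can be written in a few lines (basic probability assignments as dictionaries over frozenset hypotheses); this is the standard rule, not the paper's hierarchical variant:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Two sensors reporting on hypotheses {attack} vs {normal}.
m1 = {frozenset({"attack"}): 0.7, frozenset({"attack", "normal"}): 0.3}
m2 = {frozenset({"attack"}): 0.6, frozenset({"normal"}): 0.1, frozenset({"attack", "normal"}): 0.3}
print(dempster_combine(m1, m2))
```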
Improved Binary Particle Swarm Optimization Algorithm Based on Multi Velocity Vector and Adaptive Speed Value
SHEN Jia-jie,JIANG Hong and WANG Su
Computer Science. 2013, 40 (Z11): 125-130. 
Aiming at the premature convergence and slow iteration of the standard binary particle swarm optimization (BPSO) algorithm in high-dimensional environments, an improved discrete binary particle swarm optimization algorithm based on multiple velocity vectors and an adaptive speed value is proposed. The correctness of the improved algorithm is proved through theoretical derivation, and the derivation is verified by experiments.
Improved Artificial Glowworm Swarm Optimization Algorithm for Solving Parameters of Van Genuchten Equation
MO Yuan-bin,LIU Fu-yong and MA Yan-zhui
Computer Science. 2013, 40 (Z11): 131-135. 
The Van Genuchten equation is a widely used soil water characteristic curve equation, and the precision of its parameter values is the key to using it. In order to solve for these parameters accurately, the glowworm swarm optimization (GSO) algorithm is introduced and a new artificial glowworm swarm optimization algorithm based on biological parasitic behavior (GSOPB) is proposed, which consists of a host population and a parasite population. The two populations exchange glowworms after a certain number of iterations, and, to embody survival of the fittest in biological evolution, glowworms with poor fitness in the host population are removed. Experimental results on several benchmarks show the effectiveness of GSOPB, and the results of solving the Van Genuchten parameters show good performance compared with other methods. The algorithm can serve as a new method for calculating Van Genuchten equation parameters.
Dynamic Compression of Property Oriented Concept Lattices Based on Rough Set Theory
ZHOU Xiu-xiu and LI Jian-zhuo
Computer Science. 2013, 40 (Z11): 136-139. 
As an efficient tool for knowledge acquisition,formal concept analysis has been applied to many fields.This paper mainly proposed new method of dynamic compression in property oriented concept lattices.We first discussed the relationships between congruence relations and the corresponding property oriented concept lattices based on dependence space theory.Secondly,we defined notions of attribute reduction in property oriented concept lattices based on congruence relations which is to find the minimal attribute subsets preserving the congruence partition.Finally,we proposed the new methods of dynamic compression in property concept lattices.
Classification Algorithm Based on Heterogeneous Cost-sensitive Decision Tree
RUAN Xiao-hong,HUANG Xiao-meng,YUAN Ding-rong and DUAN Qiao-ling
Computer Science. 2013, 40 (Z11): 140-142. 
Cost-sensitive learning usually assumes that different types of cost can be converted into a single unified unit, so constructing an appropriate cost-sensitive attribute selection factor is a challenge. In this paper, a heterogeneous cost-sensitive decision tree algorithm is designed that fully considers the different costs when selecting the split attribute, constructs an attribute selection model based on heterogeneous costs, and designs a cost-sensitive pruning strategy. The experimental results show that this method is effective and more efficient than other existing methods.
Dynamic Particle Swarm Optimization Based on Hybrid Variable
ZHOU Li-jun,PENG Wei,ZENG Xiao-qiang and ZOU Fang
Computer Science. 2013, 40 (Z11): 143-146. 
Particle swarm optimization (PSO) has a relatively simple structure and runs very quickly, but it easily falls into local optima and exhibits premature convergence. Aiming at these problems, this paper introduces a novel way of dynamically changing the inertia weight using the iteration number and the distance between particles, with a proportional coefficient controlling the relative influence of the two. At the same time, in order to increase population diversity, a "hybrid variation" operator is used, yielding a dynamic particle swarm optimization based on hybrid variation (HV-DPSO). Numerical experiments on reference functions show that, compared with traditional PSO, the new algorithm not only effectively avoids premature convergence but also converges better.
Improved Anisotropic Diffusion Denoising Model
ZHAO Hai-yong and JIA Yang-li
Computer Science. 2013, 40 (Z11): 147-149. 
In order to remove noise effectively while preserving key details, a more effective and adaptive diffusion denoising model is proposed. A new diffusion function is built on the basis of the P-M model: it is constant when the gradient is low, and once the gradient exceeds a certain threshold it becomes a monotonically decreasing function that reduces to zero at a certain gradient. This characteristic makes the improved model smooth faster in homogeneous areas and stop at edges. The experimental results show that the improved model performs better than traditional methods.
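For orientation, a basic Perona-Malik iteration, which the paper modifies, is sketched below; the diffusion function g used here is the classical exponential one, not the piecewise function proposed in the abstract:

```python
import numpy as np

def perona_malik(img, iters=20, kappa=30.0, lam=0.2):
    """Classical P-M anisotropic diffusion on a 2-D float image
    (periodic borders via np.roll, adequate for a sketch)."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # classical diffusion function
    for _ in range(iters):
        # Finite differences toward the four neighbors.
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```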
Research on Algorithm of Revising Two Dimensions PSD
ZHANG Feng-qi,WANG Yong-sheng,ZHANG Bao-shang and ZHANG Qiu-zhi
Computer Science. 2013, 40 (Z11): 150-152. 
A position sensitive detector (PSD) is a photoelectric sensor that is sensitive to the position of a light signal. By calculating the centroid of the light, a PSD can measure the position of the light spot and output an analog current signal. Non-linearity affects the reliability of the PSD: as a result of the influence of the marginal electrodes, serious non-linearity appears at the edges of the PSD, which restricts its reliability and application. This paper studies how to correct this distortion. Based on a study of interpolation methods, the paper puts forward biharmonic spline interpolation to correct the distorted points. Simulation and analysis show that the error can be limited to within 2.29 um, a noticeable improvement over the neural network approach, which is satisfactory.
Research on Fuzzy Knowledge Base of Military Simulation and Analysis System
TIAN Tian,CHEN Yong,LIU Tian-jia and ZHAO Xin-ye
Computer Science. 2013, 40 (Z11): 153-156. 
Commanders' decisions require a great deal of military knowledge and situation information. For a military analysis and simulation evaluation system, constructing the command and control model requires more detailed knowledge, and the completeness of the knowledge base directly affects the running speed and efficiency of the command and control model. However, real battlefield decisions involve a large amount of fuzzy information, so handling fuzzy information is acknowledged to be a significant part of knowledge base development. After delving into the knowledge base of a military analysis and simulation evaluation system, we present the design of a fuzzy extended data model and demonstrate its correctness with an illustrative example.
Improved KNN Algorithm Based on Attribute Value Correlation Distance
XIAO Hui-hui and DUAN Yan-ming
Computer Science. 2013, 40 (Z11): 157-159. 
The definition of sample distance directly affects the accuracy and efficiency of KNN. In view of the disadvantages of the traditional KNN algorithm in its distance definition and category decision, an improved KNN algorithm using attribute importance with respect to the category (FCD-KNN) is proposed. First, the distance between two samples is defined through the correlation distance of values of the same attribute; this distance effectively measures the similarity of the two samples. Second, the k nearest neighbors are selected according to this distance. Finally, the category of the test sample is decided by the average distance and the number of neighbors in each category. Theoretical analysis and simulation experiments show that, compared with KNN and its variants, the classification accuracy is raised considerably.
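A simplified reading of the decision rule is sketched below: KNN with a pluggable distance, deciding by neighbor count and per-class average distance; Euclidean distance stands in for the paper's attribute-value correlation distance, and the tie-breaking rule is an assumption:

```python
from collections import defaultdict

def knn_predict(test, train, k, distance):
    """KNN with a pluggable distance; the class is chosen by neighbor count,
    ties broken by the smaller average distance among the k nearest neighbors."""
    neighbors = sorted(train, key=lambda s: distance(test, s["x"]))[:k]
    per_class = defaultdict(list)
    for s in neighbors:
        per_class[s["y"]].append(distance(test, s["x"]))
    return min(per_class,
               key=lambda c: (-len(per_class[c]), sum(per_class[c]) / len(per_class[c])))

# Toy usage with Euclidean distance standing in for the correlation distance.
euclid = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
train = [{"x": (0, 0), "y": "A"}, {"x": (0, 1), "y": "A"}, {"x": (5, 5), "y": "B"}]
print(knn_predict((0.2, 0.3), train, k=3, distance=euclid))
```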
Attribute Reduction and Rule Acquisition in Incomplete and Inconsistent Ordered Decision Systems
WEI Bi-peng,LV Yue-jin,LI Jin-hai and LI Da-lin
Computer Science. 2013, 40 (Z11): 160-164. 
The notion of a generalized dominance decision function is defined in an incomplete and inconsistent ordered decision system,and its discernibility matrix is proposed to design an attribute reduction algorithm.Then a new approach of rule acquisition is obtained in an incomplete and inconsistent ordered decision system by generalized dominance decision function.Finally,a real example is used to demonstrate the effectiveness of the proposed algorithm.
Hierarchical Storage Access Model Based on Multi-Attributes Measurement
SHI Guang-yuan and ZHANG Yu
Computer Science. 2013, 40 (Z11): 165-169. 
With the rapid development of cloud computing, cloud storage has become an important way of providing business-critical information services. However, limited by storage resource performance, users often have to endure long access delays. To alleviate this situation, intelligent data management technologies have been proposed to manage large amounts of data effectively, reduce access latency and improve the quality of cloud computing services. This paper proposes a hierarchical storage access model based on multi-attribute measurement. The model statistically analyzes the static and dynamic attributes of data objects, extracts key information from those attributes, makes data management decisions accordingly, and migrates cold and hot data to the corresponding storage tiers, so that storage resources can be planned reasonably and storage system performance improved. Experimental performance tests show that the model has good overall performance.
Performance Analysis Method for Intrusion Detection in MANETs Based on Machine Learning Algorithms
JIANG Yi-bo,WANG Yu-chen,WANG Wan-liang,ZHANG Zhen and CHEN Qiong
Computer Science. 2013, 40 (Z11): 170-174. 
Mobile ad-hoc networks (MANETs) have become an important technology in recent years, and the corresponding security problems are receiving more and more attention. This paper proposes a performance analysis model and an integrated evaluation index for intrusion detection based on machine learning algorithms. The experiments simulate three typical anomalous behaviors (black hole, flooding and packet dropping) and compare seven well-known machine learning algorithms in detail. The analysis results show that the proposed model expresses the performance of each algorithm well. In particular, the multilayer perceptron, logistic regression and support vector machine give the best performance, and logistic regression and the support vector machine also need very little time to train the classification model.
Mechanism of Detecting and Preventing Application Layer DDOS Attack Based on Traceback
WANG Rui
Computer Science. 2013, 40 (Z11): 175-177. 
Distributed denial of service (DDoS) is an attempt to make a machine or network resource unavailable to its intended users. With the further development of technology, DDoS attacks on the network layer have been largely weakened; however, more and more attacks occur in the application layer in various and more complicated forms. Attack traffic may be legitimate from the view of the lower-layer protocol, which makes detection and prevention more difficult. This article explains the patterns and measures of application-layer DDoS attacks through examples, and summarizes and improves the mechanism of detection and prevention with present technology.
Algorithm Based on Arnold and RSA for Optimal Selection of Large Prime Numbers in Image Encryption
YANG Yang,YANG Jie and FENG Jiu-chao
Computer Science. 2013, 40 (Z11): 178-180. 
An algorithm based on Arnold and RSA for optimal selection of large prime numbers in image encryption is proposed,including the Arnold transform and the RSA encryption algorithm.On the basis of the traditional RSA algorithm,this method proposed a scheme of the random selection of large prime numbers using the passage of time as the seed.Experimental results show that this method has high security,the decrypted image has a certain degree of robustness to additive noise attack.
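The Arnold transform part of such a scheme is easy to make concrete: the classical two-dimensional cat map scrambles an N x N image by remapping pixel coordinates, as sketched below (standard Arnold map only; the RSA key handling and prime selection are not shown):

```python
import numpy as np

def arnold_scramble(img, rounds=1):
    """Apply the classical Arnold cat map (x, y) -> (x + y, x + 2y) mod N
    to a square image `rounds` times."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold map needs a square image"
    out = img.copy()
    for _ in range(rounds):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out
```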
Research of Group Key Service Initialization Based on QKD
LUAN Xin,GUO Yi-xi,SU Jin-hai,SUN Wan-zhong and ZHAO Hong-tao
Computer Science. 2013, 40 (Z11): 181-183. 
With the deepening of research in quantum cryptography, dynamic group communication applications such as video conferencing, network games and stock trading put forward requirements for quantum group keys. To serve these applications well, on the basis of analyzing traditional group key management schemes, this paper presents a two-layer, packet-distributed quantum group key management model, designs the quantum group key service protocol under this model, and focuses on the initialization phase of the protocol. Compared with several classic group key management schemes, this scheme is more efficient in group key generation and sharing and has practical significance.
Information Secure Transmission System Based on CAPICOM and IAIK
WU Jie-ming,SHI Jian-yi and LI Shou-zheng
Computer Science. 2013, 40 (Z11): 184-187. 
This paper describes techniques that can ensure the secure transmission of network data and, on this basis, proposes the design of an information secure transmission system. In this system, digital certificate management is completed using the open source EJBCA system; the client uses CAPICOM to simplify the implementation of digital signatures and digital envelopes; and the server uses the third-party IAIK library to parse and validate PKCS#7 format data. Finally, some critical code of the information secure transmission system is provided.
Application of Many-to-one Encryption and Authentication Scheme in Video Conference
LIU Xiu-yan,WEI Zhen-gang,LIN Xi-jun and XING Jing
Computer Science. 2013, 40 (Z11): 188-191. 
According to the existing security problems in video conferencing and their causes, this paper proposes a scheme that uses many-to-one encryption and authentication to encrypt the session key. The scheme uses a secondary encryption method to ensure the security of the session key in the conference. Analysis shows that the scheme can effectively prevent the session key from being leaked during transmission and exploited by illegitimate parties, and it reduces the burden of key management.
Quantitative Evaluation Across Software Development Life Cycle Based on Data Fusion
ZHANG Wei-xiang,LIU Wen-hong and WU Xin
Computer Science. 2013, 40 (Z11): 192-195. 
This paper presents a method for quantitative software trustworthiness evaluation across the software development life cycle. First, a hierarchical assessment model is built by decomposing software trustworthiness over the stages of the software development life cycle, and appropriate quantitative or qualitative metric sets are designed; then, knowledge discovery in databases techniques are used to obtain the weights of all trustworthiness characteristics; finally, data fusion theory is used to process and reason over large volumes of multi-type measurement data. Engineering practice shows that the method effectively improves the objectivity of the assessment process and the accuracy of the assessment results.
Improved Boyer-Moore Algorithm Applied in IDS
WANG Xi-na and YU Jian-peng
Computer Science. 2013, 40 (Z11): 196-198. 
In the detection engine module of an IDS, misuse detection based on pattern matching is the core technology most commonly used by designers, and the packet loss rate, the false positive rate of the IDS and the matching speed of the detection engine all depend on the performance of the pattern matching algorithm. The Boyer-Moore algorithm and its improved variants, the Boyer-Moore-Horspool (BMH) and BMHS algorithms, are the most widely used pattern matching algorithms. Based on an analysis of the BM algorithm and its improvements, a new improved BM algorithm is proposed in this article. The algorithm takes advantage of the last character of the pattern and the character following the corresponding position in the text, and also considers text string information, to increase matching speed, so that it can meet the high-efficiency requirements of pattern matching in an IDS.
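As background for the family of algorithms being improved, a compact Boyer-Moore-Horspool matcher is shown below (the textbook version, not the article's improvement):

```python
def horspool_search(text, pattern):
    """Classic Boyer-Moore-Horspool: index of first match of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    # Bad-character shift: distance from a character's last occurrence to the pattern end.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    i = 0
    while i + m <= n:
        if text[i:i + m] == pattern:
            return i
        i += shift.get(text[i + m - 1], m)
    return -1

print(horspool_search("alert tcp any any -> any 80", "any 80"))
```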
Survey of Study on Privacy Preserving
LI Xiao-ye,SUN Zhen-long,DENG Jia-bin and SONG Guang-jun
Computer Science. 2013, 40 (Z11): 199-202. 
With the extensive application and fast development of data publishing and data mining, how to protect private data and prevent the disclosure of sensitive information has become a current research hotspot. This paper analyzes privacy preserving technology from these two aspects respectively and makes a further comparison and analysis of existing algorithms. Finally, we point out two directions in this field that deserve in-depth study.
Security Analysis and Improvement of Strongly Secure Certificateless Key Agreement Protocol
WANG Dian-gang,DING Xue-feng and HUANG Kun
Computer Science. 2013, 40 (Z11): 203-209. 
Certificateless public key cryptography (CLPKC) has attracted wide attention since it solves the certificate management problem of traditional public key cryptography and the key escrow problem of ID-based cryptography. Many certificateless signcryption (CLSC) schemes using pairings have been proposed, but the pairing operation is very expensive, so the performance of these schemes is not very good. In this paper, we study pairing-free CLSC schemes and find that the scheme of Selvi et al. is not a standard CLSC scheme, since the user must verify the public key before using it; this not only runs against the idea of CLPKC but also increases the user's computational cost. Three new pairing-free CLSC schemes have been proposed to solve this problem. In this paper, we show that these three CLSC schemes provide neither the unforgeability property nor the confidentiality property. To improve security, we also propose a new pairing-free CLSC scheme and demonstrate that it is provably secure in the random oracle model.
Research of New SYN Flood Defense Model Based on Linux
LIU Yun
Computer Science. 2013, 40 (Z11): 210-213. 
SYN Flood is a typical denial-of-service attack technique that endangers networks by exploiting security vulnerabilities of the TCP protocol, and there is currently no good way to solve it completely. This paper analyzes three existing SYN Flood defense models, SYN Cookie, SYN Gateway and SYN Proxy, puts forward an enhanced SYN Flood defense model, studies the related algorithms, implements the model on Linux and finally tests it. The test results show that the enhanced SYN Proxy model can resist high-intensity SYN Flood attacks and outperforms the existing models.
Method of Network Security Dynamic Assessment Based on Attack-defense Confrontation
LIAN Li-quan,PENG Wu and WANG Dong-hai
Computer Science. 2013, 40 (Z11): 214-218. 
According to the characteristic that network attack and defense vary in real time, a dynamic assessment method of the network security state is presented. Firstly, a network security model based on vulnerability state transition is built according to the characteristics of both the attacking and defending sides. Then the success probability and consequences of a successful attack are quantified, and the effects of attack-defense confrontation behaviors on key asset security attributes such as confidentiality, integrity and availability are analyzed. Finally, the feasibility and validity of the method are proved through an experiment.
Reversible Data Hiding Based on Full Context Prediction
LUO Jian-gao and HAN Guo-qiang
Computer Science. 2013, 40 (Z11): 219-223. 
We propose a nonlinear, adaptive, gradient-adjusted predictor based on full context (FCGAP), which is suitable for reversible image data hiding. In FCGAP, the predicted value of the target pixel equals the weighted average of its 4-neighborhood pixels, and the weighting coefficients are computed from the 12 nearest neighboring pixels. Based on FCGAP, a reversible data hiding scheme is proposed that provides high embedding capacity and low distortion. Experimental results demonstrate that FCGAP uses context information more fully and predicts pixels more accurately than other predictors, and that the FCGAP-based reversible data hiding scheme has better capacity-distortion performance than other state-of-the-art algorithms.
Trust Calculation Method Based on Experience Ontology for Service Systems
DU Xiao-jing,YAO Gao-feng and HU Le
Computer Science. 2013, 40 (Z11): 224-227. 
Trust calculation relies on the communication and sharing of recommendation trust information from third parties in highly dynamic, open, heterogeneous and distributed service computing environments. However, current methods of trust representation and calculation lack analysis and reasoning over experience information, which decreases the reliability of trust calculation, a problem aggravated by the subjective and objective differences of entities within such environments. This paper first introduces trust representation and calculation based on formal semantics, then proposes an experience ontology, E-Ont, that explicitly represents the concepts and properties related to interaction experience. Finally, a trust calculus is defined on the basis of experience sharing. Compared with trust calculus based on ratings and on context-aware ratings, our method enhances the reliability of trust calculation and benefits the selection of trusted service providers in open, heterogeneous and distributed environments.
Implementation of Three-dimensional Security Defense System
LIU Yang,SHAO Xu-dong,PAN Cheng-da and HU Zheng-Liang
Computer Science. 2013, 40 (Z11): 228-234. 
With the increasing popularity of intelligent terminals, the convenient Android operating system has been widely used. The standard Android security framework lacks strong protection mechanisms, and existing security technologies for standard Android are one-sided. TDSD-Droid adopts the advantages of SELinux security enhancement and other Android security technologies, and implements a MAC mechanism in the kernel, a new MMAC mechanism based on the Flask access architecture, a novel flexible security policy adaptation mechanism, an innovative security policy learning mechanism, and a new integrity verification function based on a TF smart card. It achieves a consistent, top-to-bottom three-dimensional security defense system for Android terminals.
Topical Relevance Analysis of Hashtags in Chinese Microblogging Environment
HU Chang-long,TANG Jin-tao and WANG Ting
Computer Science. 2013, 40 (Z11): 235-237. 
Hashtag (the topical words of a micro-blog) is a kind of topic label of microblog created by publisher,which can help users find hot topics efficiently from the massive micro-blog data.Different Hashtags created by different publisher may describe the same topic.Thus mining the relevance between the Hashtags will help to find hot topics more efficiently.In this paper,a wide range of features were explored to analyze the topical relevance between Hashtags,such as the Hashtag text,content of the related microblog,the time of occurrence and the co-occurrences of Hashtags.The experimental results show that the proposed features are helpful for topical relevance analysis of Hashtags.
Analysis of Probable Lower Bound of Time Complexity of Some Classical Algorithms Based on Decision Tree and Information Theory
ZHOU Yi-min and LI Guang-yao
Computer Science. 2013, 40 (Z11): 238-241. 
Algorithms were born together with electronic computers, and one may even argue that algorithms long predate modern computers. Various algorithms have been designed to solve all kinds of concrete problems. Nevertheless, whether a lower bound on an algorithm's time complexity exists and, if so, how to determine it, is not usually treated as a problem worth researching. A method of modeling and analyzing the lower bound of the time complexity of some classical algorithms is proposed based on the decision tree model and information theory, and the way to calculate it is presented. The proposed method is feasible and effective for the classical algorithms listed, and we believe it is also applicable to those not listed.
Research on Construction of Chinese Academic Institutional Repositories Based on Open Access Consciousness
CHEN Yi-ling and LV Yang-jian
Computer Science. 2013, 40 (Z11): 242-245. 
The paper surveys the institutional repositories of Chinese universities and analyzes their status in terms of the amount of resources, types of resources, browsing methods, language distribution, software platforms, collection policies and so on. There are some deficiencies in these academic institutional repositories, for example lack of funds and resources, resistance over intellectual property, and absence of uniform standards. Corresponding countermeasures are then put forward.
Paragraph-Sentence Mutual Reinforcement Based Automatic Summarization Algorithm
XIE Hao and SUN Wei
Computer Science. 2013, 40 (Z11): 246-250. 
Abstract PDF(371KB) ( 645 )   
References | RelatedCitation | Metrics
Sentence ranking is the key issue in automatic text summarization. Based on mutual-reinforcement principles, we proposed a new sentence ranking model, the paragraph-sentence mutual reinforcement model. Using the relations between paragraphs and the mutual reinforcement between paragraphs and sentences, it iteratively computes the salience of sentences and extracts the summary sentences. We analyzed the effect of the internal and external reinforcement factors and discussed the problem of redundancy removal. Experiments show that the model extracts high-quality summaries when applied to single-document summarization.
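A minimal sketch of a HITS-style mutual-reinforcement iteration between paragraph and sentence scores is given below; the paper's exact update rule and reinforcement factors are not reproduced, so the update and the toy similarity matrices are illustrative assumptions only.

```python
# A generic mutual-reinforcement iteration: paragraphs are reinforced by related
# paragraphs and by their sentences, sentences by salient paragraphs.
import numpy as np

def mutual_reinforce(W_pp, W_ps, iters=50):
    """W_pp: paragraph-paragraph similarity, W_ps: paragraph-sentence similarity."""
    n_p, n_s = W_ps.shape
    p = np.ones(n_p) / n_p          # paragraph salience
    s = np.ones(n_s) / n_s          # sentence salience
    for _ in range(iters):
        p_new = W_pp @ p + W_ps @ s        # paragraphs reinforced by paragraphs and sentences
        s_new = W_ps.T @ p                 # sentences reinforced by salient paragraphs
        p = p_new / (np.linalg.norm(p_new) + 1e-12)
        s = s_new / (np.linalg.norm(s_new) + 1e-12)
    return p, s

# toy example: 2 paragraphs, 3 sentences
W_pp = np.array([[1.0, 0.3], [0.3, 1.0]])
W_ps = np.array([[0.9, 0.2, 0.0], [0.1, 0.4, 0.8]])
print(mutual_reinforce(W_pp, W_ps))
```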
Periodicity Algorithm of Textual Data Mining with Multi-granularity Time
MENG Zhi-qing,LOU Ting-yuan and HU Qiang
Computer Science. 2013, 40 (Z11): 251-254. 
Abstract PDF(395KB) ( 958 )   
References | RelatedCitation | Metrics
Large-scale text data mining is an important branch of big data analysis and a hot research topic in recent years. This paper studied an algorithm for mining periodicity in textual data with multi-granularity time. First, the concepts of granularity conversion and multi-granularity time intervals were presented. Then, a periodic pattern of textual data and an algorithm for mining such patterns with multi-granularity time were proposed. Finally, tests on virus-related textual data show that the proposed algorithm obtains effective periodic patterns. The influence of the periodic range on support and confidence was also discussed. The paper provides a new method for big text data analysis.
Multi-label Patent Classification Oriented to TRIZ Users
YUAN Li,CHEN Yang and ZHAO Yong
Computer Science. 2013, 40 (Z11): 255-258. 
Abstract PDF(408KB) ( 986 )   
References | RelatedCitation | Metrics
Patents are not only the results but also a resource for product innovation. Classifying the technical knowledge in patents according to innovation demand can help designers innovate effectively, and classifying product patents based on TRIZ assists in using the technical contradictions addressed in patents for innovative design. The original Inventive Principles are so abstract that some principles overlap, so this paper analyzed the 40 Inventive Principles (IPs) and grouped them into 20 new classes. Patent classification is a multi-label classification problem. Pro-Techniques and CREAX, two software tools, supply patents that explain the Inventive Principles in detail, and this dataset was used to compare the performance of two families of multi-label classification algorithms: problem transformation and algorithm adaptation. Several measures proposed in the literature for evaluating multi-label classifiers, such as hamming loss and F-measure, were adopted. The results show that problem transformation performs better than algorithm adaptation on the TRIZ patent datasets.
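A minimal sketch of the problem-transformation approach (binary relevance: one binary SVM per label) evaluated with hamming loss and F-measure follows; the feature vectors and label matrix are random stand-ins, not the TRIZ patent dataset.

```python
# Problem transformation for multi-label classification: one-vs-rest linear SVMs on a
# binary label-indicator matrix, scored with hamming loss and micro F1.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC
from sklearn.metrics import hamming_loss, f1_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))                                           # stand-in patent feature vectors
Y = (X[:, :5] + rng.normal(scale=0.5, size=(300, 5)) > 0).astype(int)   # 5 toy labels (the paper uses 20 IP classes)

clf = OneVsRestClassifier(LinearSVC()).fit(X[:200], Y[:200])   # one binary SVM per label
pred = clf.predict(X[200:])
print("hamming loss:", hamming_loss(Y[200:], pred))
print("micro F1    :", f1_score(Y[200:], pred, average="micro"))
```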
Chinese Chunk Dependency Relationship Parsing Based on Words Dependency
LI Li,ZHAO Wen-juan and FAN Xiao-zhong
Computer Science. 2013, 40 (Z11): 259-262. 
Abstract PDF(317KB) ( 872 )   
References | RelatedCitation | Metrics
Chinese chunking is an important technique in syntactic parsing. This paper adopted dependency analysis to parse Chinese chunk dependency relationships. First, BIO tags are used to identify Chinese chunks; then the dependencies between chunks are determined according to the dependencies between words. Experimental results show that the precision and recall of Chinese chunking are 82.3% and 78% respectively, and the precision and recall of Chinese chunk dependency parsing are 89% and 90.5% respectively.
Research of Distributed ETL Dimensional Data Model Based on MapReduce
SONG Jie,HAO Wen-ning,CHEN Gang,JIN Da-wei and ZHAO Cheng
Computer Science. 2013, 40 (Z11): 263-266. 
Abstract PDF(583KB) ( 662 )   
References | RelatedCitation | Metrics
Because MapReduce lacks support for high-level ETL-specific constructs, this paper presented a parallel dimensional ETL framework based on MapReduce (MapReduce Distributed ETL, MDETL), which decomposes data processing into composable parts (the processing of dimensions and facts) and directly supports high-level ETL-specific dimensional constructs. This paper evaluated its performance on large realistic data sets, and the experimental results show that MDETL achieves very good scalability.
News Recommendation Technology Combining Semantic Analysis with TF-IDF Method
ZHOU You and DAI Mu-hong
Computer Science. 2013, 40 (Z11): 267-269. 
Abstract PDF(318KB) ( 1138 )   
References | RelatedCitation | Metrics
In current news recommendation systems, TF-IDF weighting combined with the cosine similarity measure is usually used; however, this technique does not take into account the actual semantics of the text itself. Therefore, the paper proposed a new method that combines content similarity with semantic similarity: inverse-document-frequency term weighting is combined with semantic similarity computed over WordNet synsets. User profiles were built for laboratory tests to verify the effectiveness of the method. Experimental results show that the proposed method outperforms the TF-IDF method.
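The sketch below blends TF-IDF cosine similarity with a WordNet-based semantic score; it assumes NLTK's WordNet corpus is installed (nltk.download('wordnet')), and the blending weight alpha and the max-over-synsets rule are illustrative choices, not the paper's exact formulation.

```python
# Combining lexical (TF-IDF cosine) and semantic (WordNet path) similarity.
from nltk.corpus import wordnet as wn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def wordnet_sim(w1, w2):
    """Best path similarity over the first few synsets of two words (0 if unrelated)."""
    scores = [a.path_similarity(b) for a in wn.synsets(w1)[:3] for b in wn.synsets(w2)[:3]]
    scores = [s for s in scores if s is not None]
    return max(scores, default=0.0)

def combined_score(doc_a, doc_b, alpha=0.7):
    tfidf = TfidfVectorizer().fit_transform([doc_a, doc_b])
    lexical = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    # crude semantic score: average best word-to-word WordNet similarity
    words_a, words_b = doc_a.lower().split(), doc_b.lower().split()
    sems = [max((wordnet_sim(a, b) for b in words_b), default=0.0) for a in words_a]
    semantic = sum(sems) / max(len(sems), 1)
    return alpha * lexical + (1 - alpha) * semantic

print(combined_score("car crash on the highway", "automobile accident on the road"))
```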
Data Placement Algorithm for Large-scale Storage System
ZHENG Sheng and LI Tong
Computer Science. 2013, 40 (Z11): 270-273. 
Abstract PDF(304KB) ( 665 )   
References | RelatedCitation | Metrics
With the era of big data coming, PB-, EB- and even ZB-level datasets require storage systems to be scalable. Traditional data distribution algorithms are confronted with serious challenges when storage devices of different performance are added and old ones are removed, or even when multiple devices fail simultaneously. A new hash mapping algorithm was proposed which supports node weights and multiple replicas and also considers node failure and node overload. The algorithm adapts dynamically to changes in storage nodes and guarantees, probabilistically, an even data distribution over nodes of different performance. Besides, it can effectively deal with node failure and node overload, which improves the availability and performance of the system.
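To illustrate the kind of weighted, failure-tolerant mapping the abstract describes, here is a standard weighted rendezvous (highest-random-weight) hash with multi-replica placement; it is a generic sketch, not the paper's algorithm.

```python
# Weighted rendezvous hashing: each object is placed on the k highest-scoring live nodes,
# so removing or failing one node only moves that node's data.
import hashlib
import math

def _score(key: str, node: str, weight: float) -> float:
    h = int(hashlib.md5(f"{key}:{node}".encode()).hexdigest(), 16)
    u = (h + 1) / (2**128 + 1)                # uniform in (0, 1)
    return -weight / math.log(u)              # weighted HRW score

def place(key: str, nodes: dict, replicas: int = 3, failed: set = frozenset()):
    """nodes: {node_name: weight}; returns the `replicas` highest-scoring live nodes."""
    live = {n: w for n, w in nodes.items() if n not in failed}
    ranked = sorted(live, key=lambda n: _score(key, n, live[n]), reverse=True)
    return ranked[:replicas]

nodes = {"ssd-1": 4.0, "ssd-2": 4.0, "hdd-1": 1.0, "hdd-2": 1.0}
print(place("object-42", nodes))                      # normal placement
print(place("object-42", nodes, failed={"ssd-1"}))    # only the failed node's replicas move
```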
Study of Sentiment Analysis of Product Reviews in Internet Based on RS-SVM
WANG Gang and YANG Shan-lin
Computer Science. 2013, 40 (Z11): 274-277. 
Abstract PDF(335KB) ( 822 )   
References | RelatedCitation | Metrics
Product reviews on the Internet are helpful for online shopping decisions, and the classification accuracy of sentiment analysis is one of the key problems. Recently, ensemble learning has proved to be an effective way of enhancing classification accuracy. Bagging and Boosting have been applied to sentiment analysis, while Random Subspace has received less attention. In this paper, a new method, RS-SVM, was proposed for sentiment analysis based on the high dimensionality of product review datasets. RS-SVM uses the state-of-the-art SVM as the base learner and Random Subspace as the ensemble method in order to enhance the accuracy of sentiment analysis. Finally, experiments on a movie review dataset were conducted to verify the effectiveness of RS-SVM, and the results show that RS-SVM obtains the best classification results compared with the other methods.
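A minimal random-subspace ensemble over linear SVMs, sketching the RS-SVM idea (random feature subsets plus majority voting); the feature fraction, ensemble size and the synthetic data standing in for review vectors are illustrative, not the paper's settings.

```python
# Random subspace ensemble: each base SVM sees a random subset of features; predictions
# are combined by majority vote (labels assumed to be 0/1).
import numpy as np
from sklearn.svm import LinearSVC

class RandomSubspaceSVM:
    def __init__(self, n_estimators=10, feature_fraction=0.5, seed=0):
        self.n_estimators = n_estimators
        self.feature_fraction = feature_fraction
        self.rng = np.random.default_rng(seed)
        self.models = []          # list of (feature_index_array, fitted_svm)

    def fit(self, X, y):
        k = max(1, int(self.feature_fraction * X.shape[1]))
        for _ in range(self.n_estimators):
            idx = self.rng.choice(X.shape[1], size=k, replace=False)
            self.models.append((idx, LinearSVC().fit(X[:, idx], y)))
        return self

    def predict(self, X):
        votes = np.stack([clf.predict(X[:, idx]) for idx, clf in self.models])
        return (votes.mean(axis=0) >= 0.5).astype(int)

# toy usage with random high-dimensional data standing in for review vectors
X = np.random.default_rng(1).normal(size=(200, 500))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
print(RandomSubspaceSVM().fit(X, y).predict(X[:5]))
```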
BP Neural Network Pruning Algorithm Improved on Base of Genetic Algorithm to Optimize Decision Tree Model
WU Tong and CHENG Hui
Computer Science. 2013, 40 (Z11): 278-280. 
Abstract PDF(310KB) ( 997 )   
References | RelatedCitation | Metrics
The decision tree is an effective classification method, but over-fitting often occurs during the building of a decision tree. This paper discussed soft pruning by means of BP pruning, which is based on a BP neural network. Then, according to the shortcomings of BP pruning, this paper proposed a revised algorithm named GBP-Pruning. The algorithm trains the weights and threshold values of the BP-Pruning model by introducing a genetic algorithm, so that it can overcome the shortcomings of BP-Pruning. The feasibility of GBP-Pruning was also proved.
Designs of Structures and Modules Based on Local Data Marts
ZHANG Shi-hong and QIN Hao
Computer Science. 2013, 40 (Z11): 281-283. 
Abstract PDF(308KB) ( 645 )   
References | RelatedCitation | Metrics
According to the actual needs of local mobile communications, the hierarchy of the data marts needs to be designed; its structure consists of a data layer oriented to comprehensive queries and a data layer oriented to detailed queries. The paper focuses on the overall design and the primary table design for modules such as the account theme, business theme, competition theme, user theme, new theme and major clients theme.
Research on Fuzzy Logic-based Model of Tiered Storage
SHI Guang-yuan and ZHANG Yu
Computer Science. 2013, 40 (Z11): 284-287. 
Abstract PDF(432KB) ( 702 )   
References | RelatedCitation | Metrics
Tiered storage is an important approach to intelligent data management. It can effectively balance the access relationships between storage resources and various kinds of data, and maximize the overall performance of the storage system. However, the judgment of hotspot data is a bottleneck for tiered storage in the process of data classification. A fuzzy logic-based model of tiered storage (FLM) is proposed, which uses data characteristics reflecting how hot or cold the data is as the input variables. FLM then uses fuzzy logic to analyze the data characteristics to obtain the output variable, which smooths the boundary between hotspot and non-hotspot data. Experimental analysis of the model indicates that FLM makes data migration smoother and reduces oscillation problems in data management.
Autonomous Collaborative Tracking Mode on Smart Camera for Traffic Monitoring
LU Xiu-qing,ZHANG Ya-ying and YE Chen
Computer Science. 2013, 40 (Z11): 288-291. 
Abstract PDF(713KB) ( 719 )   
References | RelatedCitation | Metrics
This paper designed and implemented an autonomous collaborative tracking mode on embedded smart cameras for traffic monitoring by introducing dynamic roles and distributed control. Collaborative tracking of an abnormal vehicle is organized automatically in the smart camera network, and the autonomous tracking process is created and completed by the smart cameras themselves. The decentralized control mode can effectively relieve the working pressure on service systems, reduce the network bandwidth required for real-time video transmission, and make the system flexible and fault-tolerant.
Binary Image Edge Detection Based on Spatially-Variant Mathematical Morphology
LI Xiao-lin,QIU Wei-gen and ZHANG Li-chen
Computer Science. 2013, 40 (Z11): 292-295. 
Abstract PDF(567KB) ( 754 )   
References | RelatedCitation | Metrics
Based on spatially-variant mathematical morphology, a new image edge detection method was proposed in this paper to meet the needs of detecting binary image edges. The proposed method, constructed with spatially-variant (SV) structuring elements, achieves good adaptiveness and keeps the edges smooth in binary image edge detection. The experimental results show that, compared with traditional edge detection operators, the algorithm has the following distinct advantages: good anti-noise performance, better real-time performance, ease of implementation, and practicality and feasibility.
Method for Accelerating Display Speed by Organizing Scene Objects via Geometric Data Analysis
ZHOU Yi-min and LI Guang-yao
Computer Science. 2013, 40 (Z11): 296-300. 
Abstract PDF(665KB) ( 666 )   
References | RelatedCitation | Metrics
Graphical applications and interactive systems, especially in large-scene simulation, are often required to display large quantities of objects synchronously. Much effort has been spent on surface simplification and other coherency within each object's representation, but little attention has been paid to the organization of the scene structure, which plays an important role in determining the speed of displaying the whole scene each frame. Although coherency within the scene is simple to understand, this paper presents an equally simple methodology that can measurably improve the efficiency of displaying a customized scene by applying geometric data analysis and related methods. We presented a new method for representing a hierarchy of scene objects which partitions the scene into groups of objects termed clouds of objects. The method has several variants: some apply only geometric data analysis, some apply only the scene's inherent geometrical properties, and others apply both. Finally, we presented statistics to verify the improvement obtained by applying our method in graphical interactive systems.
Novel Algorithm for Local Color Transfer Based on Preserving Detail Texture
CHEN Hai and FENG Guo-can
Computer Science. 2013, 40 (Z11): 301-303. 
Abstract PDF(494KB) ( 825 )   
References | RelatedCitation | Metrics
Local color transfer between images transfers the color characteristics of a marked region of a reference image to a target image. The Reinhard algorithm is a classical algorithm, but it does not take image details into consideration. This paper proposed a novel algorithm for local color transfer that preserves detail texture: it takes into account the mean, standard deviation and gradient of the target region and reference region, and greatly increases flexibility and practicality through adjustable weights.
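For reference, the classical Reinhard mean/standard-deviation matching that the abstract builds on is sketched below (in CIELAB for simplicity, whereas the original method uses the lαβ space); the paper's detail-preserving, gradient-weighted extension is not reproduced.

```python
# Reinhard-style color transfer: match the per-channel mean and std of the target image
# to those of the reference image in a Lab-type color space.
import numpy as np
from skimage import color

def reinhard_transfer(target_rgb, reference_rgb):
    """target_rgb, reference_rgb: float arrays in [0, 1], shape (H, W, 3)."""
    t_lab = color.rgb2lab(target_rgb)
    r_lab = color.rgb2lab(reference_rgb)
    out = np.empty_like(t_lab)
    for c in range(3):                      # match mean and std per channel
        t_mu, t_sd = t_lab[..., c].mean(), t_lab[..., c].std() + 1e-6
        r_mu, r_sd = r_lab[..., c].mean(), r_lab[..., c].std() + 1e-6
        out[..., c] = (t_lab[..., c] - t_mu) * (r_sd / t_sd) + r_mu
    return np.clip(color.lab2rgb(out), 0.0, 1.0)

# usage (assuming skimage.io):
#   result = reinhard_transfer(io.imread("target.png") / 255.0,
#                              io.imread("reference.png") / 255.0)
```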
Research on Ultrasound Carotid Image Segmentation Methods Based on Local Chan-Vese Model Using Level Set Method
ZENG Ya-jie,YANG Xin,XU Hong-wei,LIU Yang,LIANG Hua-geng and DING Ming-yue
Computer Science. 2013, 40 (Z11): 304-308. 
Abstract PDF(770KB) ( 748 )   
References | RelatedCitation | Metrics
The segmentation of the intima and adventitia of the Common Carotid Artery (CCA) in transverse ultrasound images is critical, and the results can be used for qualitative estimates and quantitative measurements of plaque size, thickness and shape. Firstly, the adventitia was segmented using the Local C-V (LCV) model, and the intima was segmented using the C-V model. A distance-limitation term was proposed to constrain the evolution of the intima contour, and the Sparse Field Method (SFM) was used to improve the efficiency of the level set method. The results were analyzed and compared using the full-orthogonal method (FOM), the ray method and the Dice index. The results indicate that the LCV model can effectively segment the adventitia of the carotid artery, the C-V model can effectively segment the intima, and the improvements increase the speed of the program and the accuracy of segmentation of the intima and adventitia.
Target Tracking Based on Feature Space of Detection
AN Guo-cheng and ZHANG Feng-jun
Computer Science. 2013, 40 (Z11): 309-313. 
Abstract PDF(698KB) ( 663 )   
References | RelatedCitation | Metrics
To solve the problem of tracking small targets or targets whose color is similar to the scene background, we propose an object tracking method that exploits real-time detection results. First, the scene background is modeled effectively and the foreground mask is obtained by background subtraction and frame differencing; then the mean shift algorithm is applied to track objects in the fused image space. Although precise and sensitive, pixel-level algorithms such as the mixture of Gaussians and frame differencing are not robust, while the mean shift algorithm, a block-level method, is robust but weakens the spatial information of the feature space. This paper combines the merits of both kinds of algorithms to achieve robust and accurate object tracking. With this method, our system shows very good tracking performance for targets that move fast or suffer color disturbance similar to the scene background. The algorithm is also efficient and fast enough for real-time applications. Several groups of comparative experiments show that the new algorithm can effectively suppress scene background disturbance and improve tracking performance, and results on video clips demonstrate the effectiveness and efficiency of our method.
Cycle-spinning Based Contourlet Denoising for Multiple Image
CHENG Yan
Computer Science. 2013, 40 (Z11): 314-317. 
Abstract PDF(585KB) ( 1034 )   
References | RelatedCitation | Metrics
A cycle-spinning (CS) based multi-frame denoising algorithm in the Contourlet domain is proposed. We exploit the state-space relationship among sequential images to build a novel CS-based framework to reduce noise. By estimating the state-space relationship between frames, the successive frames are denoised in the Contourlet domain, and a weighted average is used to obtain the restored image. Experiments demonstrate the effectiveness of the proposed method and show its superiority for various noise characteristics, especially in preserving detail and texture information with higher PSNR values.
Face Detection Based on DS-Adaboost Algorithm
YE Jun and ZHANG Zheng-jun
Computer Science. 2013, 40 (Z11): 318-319. 
Abstract PDF(478KB) ( 665 )   
References | RelatedCitation | Metrics
Focusing on the disadvantages of the way the Real AdaBoost algorithm selects its smoothing factor, this article proposes the DS-Adaboost algorithm, which dynamically selects the smoothing factor in the weak classifier output according to the ratio W+(j)/W-(j): when W+(j)/W-(j) > 1, εj = W+(j); when 0 < W+(j)/W-(j) < 1, εj = W-(j). The experimental results indicate that the DS-Adaboost algorithm better plays a balancing role and keeps the proportions of the two classes of samples within a matching range.
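The sketch below shows a Real AdaBoost weak-hypothesis output with a dynamically chosen smoothing factor, following one possible reading of the rule stated in the abstract (εj = W+ when W+/W- > 1, otherwise εj = W-); the bin weights are toy values and this is not guaranteed to be the paper's exact rule.

```python
# Real AdaBoost half-log-ratio weak output with a dynamic smoothing factor.
import math

def weak_output(w_pos: float, w_neg: float) -> float:
    """w_pos, w_neg: positive/negative sample weight mass falling in bin j."""
    eps = w_pos if w_pos / max(w_neg, 1e-12) > 1.0 else w_neg   # dynamic smoothing factor
    return 0.5 * math.log((w_pos + eps) / (w_neg + eps))

for w_pos, w_neg in [(0.30, 0.05), (0.05, 0.30), (0.10, 0.10)]:
    print(f"W+={w_pos:.2f} W-={w_neg:.2f} -> h_j = {weak_output(w_pos, w_neg):+.3f}")
```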
WCE Image Retrieval by Using Fused Feature
ZOU Yue-xian,HUO Jia-sen,LIU Ji,LI Yi and DENG Wen-jun
Computer Science. 2013, 40 (Z11): 320-324. 
Abstract PDF(1184KB) ( 981 )   
References | RelatedCitation | Metrics
Wireless capsule endoscopy (WCE) is a new technique for detecting gastrointestinal diseases and has great market value. The technique produces a mass of image data, making it a big challenge for doctors to review the WCE images; hence, WCE image retrieval is a fundamental function of a WCE image analysis system. This paper investigates the WCE image retrieval problem by using the similarity information of WCE images. Based on an analysis of the properties of WCE images, we develop an image retrieval algorithm that fuses a color feature and a texture feature. Firstly, the color feature, a hue-saturation correlation histogram, and the texture feature, LBP (local binary pattern), are extracted; then the color and texture features are fused by Gaussian normalization; finally, the index of the WCE image dataset is built using the resulting fused feature vector. The experiments show that the proposed method has low computational complexity and good retrieval results.
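A sketch of fusing a hue-saturation color histogram with an LBP texture histogram by Gaussian (z-score) normalization follows; the bin counts and LBP parameters are illustrative assumptions, not the paper's settings.

```python
# Fused color + texture descriptor: HS histogram and uniform LBP histogram, each
# Gaussian-normalized and concatenated.
import numpy as np
from skimage import color
from skimage.feature import local_binary_pattern

def fused_feature(rgb):
    """rgb: float image in [0, 1], shape (H, W, 3); returns a 1-D fused descriptor."""
    hsv = color.rgb2hsv(rgb)
    hs_hist, _, _ = np.histogram2d(hsv[..., 0].ravel(), hsv[..., 1].ravel(),
                                   bins=(8, 8), range=[[0, 1], [0, 1]])
    gray = color.rgb2gray(rgb)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10))

    def gauss_norm(v):                      # zero-mean, unit-variance per feature block
        v = v.astype(float).ravel()
        return (v - v.mean()) / (v.std() + 1e-9)

    return np.concatenate([gauss_norm(hs_hist), gauss_norm(lbp_hist)])

# retrieval would then rank database images by, e.g., Euclidean distance between descriptors
```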
Robust Region_based Active Contours Model for Image Segmentation
JIANG Fan,WANG Chang-ming,BAO Jian-dong,XIE Xiao-min and DING Liang-hua
Computer Science. 2013, 40 (Z11): 325-328. 
Abstract PDF(844KB) ( 721 )   
References | RelatedCitation | Metrics
A novel region-based active contours model is proposed to deal with images with intensity inhomogeneities and weak boundaries. For the proposed model, new global and local forces are defined, which compose a hybrid energy functional. The energy functional is then incorporated into a variational level set formulation. Furthermore, we regularize the level set function with Gaussian filtering to keep it smooth and eliminate re-initialization. In addition, the proposed model can degenerate to a new global CV model. Experimental results show that the proposed model can not only segment images with intensity inhomogeneities and weak boundaries, but is also robust to noise and to the initial contours, and it has high computational efficiency.
HOG-Feature and SVM Based Method for Forward Vehicle Recognition
LI Xing,GUO Xiao-song and GUO Jun-bin
Computer Science. 2013, 40 (Z11): 329-332. 
Abstract PDF(833KB) ( 1161 )   
References | RelatedCitation | Metrics
A HOG-feature and SVM based method was proposed for real-time forward vehicle recognition in automotive safety driver-assistance systems. The shadow underneath a vehicle is segmented accurately by a histogram analysis method, and the initial candidates are generated by combining the horizontal and vertical edge features of the shadow. These initial candidates are further verified by a vehicle classifier based on the histogram of oriented gradients and a support vector machine. The experimental results show that the proposed method adapts robustly to different lighting conditions; in particular, it achieves a recognition rate of 96.52% and a false rate of 3.59% under normal lighting conditions.
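A minimal HOG + linear SVM verification stage is sketched below; it covers only the classifier component of the pipeline (the shadow-based candidate generation is not shown), and the window size and HOG parameters are illustrative, not the paper's.

```python
# HOG descriptor + linear SVM for verifying vehicle candidates.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

def hog_descriptor(gray_patch):
    patch = resize(gray_patch, (64, 64), anti_aliasing=True)
    return hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def train_verifier(pos_patches, neg_patches):
    """pos_patches/neg_patches: lists of grayscale candidate regions."""
    X = np.stack([hog_descriptor(p) for p in pos_patches + neg_patches])
    y = np.array([1] * len(pos_patches) + [0] * len(neg_patches))
    return LinearSVC(C=1.0).fit(X, y)

def is_vehicle(clf, candidate_patch) -> bool:
    return bool(clf.predict(hog_descriptor(candidate_patch)[None, :])[0] == 1)
```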
Hyperspectral Remote Sensing Image Classification Based on SSLPP
PAN Yin-song,WANG Pan-feng,HUANG Hong and LIU Yan
Computer Science. 2013, 40 (Z11): 333-336. 
Abstract PDF(402KB) ( 752 )   
References | RelatedCitation | Metrics
Because it does not use class information, the Locality Preserving Projection (LPP) algorithm is an unsupervised method for the dimensionality reduction of hyperspectral images, so it does not perform very well in feature extraction. To resolve this problem, an algorithm based on semi-supervised locality preserving projection (SSLPP) is proposed in this paper. SSLPP constructs a graph by treating unlabeled and labeled samples in different ways, thereby increasing the weights between samples of the same class, which benefits feature extraction. Experiments on the AVIRIS KSC and Botswana data sets show that the proposed algorithm can find the intrinsic structure of the high-dimensional data and effectively improve the overall classification accuracy.
Abnormal Crowd Movement Direction Detection Algorithm
LIU Shang and DONG Lin-fang
Computer Science. 2013, 40 (Z11): 337-340. 
Abstract PDF(587KB) ( 812 )   
References | RelatedCitation | Metrics
Direction is an important feature of crowd movement. In orderly crowd movement the forces between people are small and the possibility of collision is low, while in disorderly crowd movement they are higher, which can lead to security incidents such as stampedes. A new algorithm is proposed in this paper to detect abnormal crowd movement directions. First, the algorithm computes the velocity matrix and direction matrix using an optical flow method, and from these two matrices an index, the "different movement index of a frame", is computed. This index indicates whether an abnormal crowd phenomenon has appeared. The experimental results show that this index correlates with the degree of disorder of the crowd movement, and the algorithm can detect abnormal movement effectively, helping avoid danger caused by disorderly movement.
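A sketch of a per-frame direction-disorder index computed from dense optical flow is given below; the paper's exact "different movement index" is not specified here, so the circular variance of flow directions is used as a stand-in, and the flow parameters are illustrative.

```python
# Direction-disorder index from OpenCV Farneback optical flow: 0 means all moving pixels
# share one direction, 1 means directions are fully scattered.
import cv2
import numpy as np

def direction_disorder(prev_gray, cur_gray, min_speed=0.5):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2)
    moving = speed > min_speed                    # ignore near-static pixels
    if not moving.any():
        return 0.0
    angles = np.arctan2(flow[..., 1][moving], flow[..., 0][moving])
    r = np.hypot(np.cos(angles).mean(), np.sin(angles).mean())   # mean resultant length
    return 1.0 - r                                                # circular variance

# usage: index = direction_disorder(cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY),
#                                   cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY))
```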
Summary of Partial Differential Equation (PDE) Method on Digital Image Processing
DING Chang,YIN Qing-bo and LU Ming-yu
Computer Science. 2013, 40 (Z11): 341-346. 
Abstract PDF(510KB) ( 1913 )   
References | RelatedCitation | Metrics
Partial differential equation (PDE) methods for digital image processing have developed rapidly in recent years. A PDE method builds a mathematical model as a partial differential equation and evolves the image according to that equation until the desired result is reached, often achieving effects that traditional methods cannot. Based on studies at home and abroad, we discuss four kinds of PDE models, covering image denoising, image inpainting, image segmentation and image enhancement, and we analyze the establishment of the partial differential equations, their solution and their implementation.
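The simplest instance of the denoising category surveyed is evolving an image under the heat equation u_t = Δu (isotropic diffusion); the explicit-Euler sketch below illustrates the general PDE workflow, with step size and iteration count chosen only for illustration.

```python
# Isotropic diffusion (heat equation) denoising: repeatedly add a small multiple of the
# discrete Laplacian, which progressively smooths the image.
import numpy as np

def heat_diffuse(u, steps=20, dt=0.2):
    u = u.astype(float).copy()
    for _ in range(steps):
        up = np.pad(u, 1, mode="edge")       # replicate borders
        lap = (up[:-2, 1:-1] + up[2:, 1:-1] + up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u)
        u += dt * lap                        # explicit Euler step of u_t = Laplacian(u)
    return u

noisy = np.random.default_rng(0).normal(0.5, 0.1, size=(64, 64))
print(np.std(noisy), np.std(heat_diffuse(noisy)))   # variance drops as the image smooths
```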
Research of Interaction Control in Direct Volume Rendering
YU Rong-huan,WU Ling-da and YANG Chao
Computer Science. 2013, 40 (Z11): 347-349. 
Abstract PDF(752KB) ( 640 )   
References | RelatedCitation | Metrics
To address the interaction control problem in direct volume rendering, we analyzed the color control, alpha control and region control problems in direct volume rendering and proposed an interaction control method based on histogram control. The method adopts an alpha polyline, similar to a histogram, to control the alpha values in the volume data and achieve isosurface display. The experiments show that the method can simulate isosurfaces and effectively reduce rendering time.
Recognition of Aircrafts Based on Weighted Area
LI Yu and WU Zeng-yin
Computer Science. 2013, 40 (Z11): 350-353. 
Abstract PDF(1094KB) ( 601 )   
References | RelatedCitation | Metrics
An aircraft identification method based on weighted areas was proposed to distinguish different aircraft by the differences in their structures. Firstly, the head, wings and tail of the aircraft are obtained using the multi-scale and multi-direction characteristics of the Gabor transform, and the features of the three areas are extracted and recognized respectively. Then weights are assigned to the three regions according to their contributions to the aircraft's global features, and finally the weighted recognition results of the different regions are combined to obtain the final aircraft type. The results show that this method achieves a higher recognition rate than traditional support vector machine and neural network methods with recognition times of the same order of magnitude, and has a strong anti-overlap capability, so it is an effective method for aircraft target recognition.
Edge Detection Algorithm Based on the Eight Directions Sobel Operator
ZHENG Ying-juan,ZHANG You-hui,WANG Zhi-wei,ZHANG Jing and FAN Sheng-juan
Computer Science. 2013, 40 (Z11): 354-356. 
Abstract PDF(480KB) ( 1123 )   
References | RelatedCitation | Metrics
An edge detection algorithm based on an eight-direction Sobel operator is proposed to address the unsatisfactory results of traditional edge detection methods. The algorithm uses eight templates in the directions 0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135° and 157.5° to detect edges better in different directions. In the detection process, we take into account that neighborhood pixels at different distances from the center pixel contribute differently to it: the algorithm weights each neighborhood pixel according to its Euclidean distance to the center pixel, giving pixels nearer to the center greater weight. The experimental results show that the new algorithm can detect image edges relatively completely, with clear contours and better continuity.
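The sketch below shows the directional-template idea: convolve with a bank of rotated Sobel-type kernels and keep the maximum absolute response per pixel. Only the four standard 0°/45°/90°/135° templates are shown; the paper's eight distance-weighted templates at 22.5° steps are not reproduced.

```python
# Multi-direction template edge detection: strongest directional response wins.
import numpy as np
from scipy.ndimage import convolve

KERNELS = [
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float),    # 0 deg (vertical edges)
    np.array([[ 0, 1, 2], [-1, 0, 1], [-2, -1, 0]], float),   # 45 deg
    np.array([[ 1, 2, 1], [ 0, 0, 0], [-1, -2, -1]], float),  # 90 deg (horizontal edges)
    np.array([[ 2, 1, 0], [ 1, 0, -1], [ 0, -1, -2]], float), # 135 deg
]

def multi_direction_edges(gray, threshold=1.0):
    responses = [np.abs(convolve(gray.astype(float), k)) for k in KERNELS]
    magnitude = np.maximum.reduce(responses)       # keep the strongest direction per pixel
    return magnitude > threshold

img = np.zeros((32, 32)); img[:, 16:] = 1.0        # a vertical step edge
print(multi_direction_edges(img).sum())
```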
Research of Barcode Recognition Technology Based on CPU&GPU in Aviation Information System
WANG Peng and LIU Shan-shan
Computer Science. 2013, 40 (Z11): 357-358. 
Abstract PDF(254KB) ( 581 )   
References | RelatedCitation | Metrics
To address the low recognition speed of barcode recognition on high-resolution images acquired in an aviation information system, this paper proposed a new barcode recognition system structure based on parallel CPU and GPU processing and presented the solution of the system. Finally, the key technology of image noise processing in the recognition system was investigated, and a QNLM filtering algorithm suitable for running on the GPU was discussed. The method not only makes processing faster but has also been verified to be feasible.
High-order Regularization Model for Image Denoising in Symmetric Tensor Space
LIU Xiao-yan and FENG Xiang-chu
Computer Science. 2013, 40 (Z11): 359-362. 
Abstract PDF(811KB) ( 793 )   
References | RelatedCitation | Metrics
In order to deal jointly with the staircasing effect of the ROF model and the over-smoothing of high-order regularization, a new model for image denoising was proposed, using the second-order symmetric gradient to construct the regularization term in the symmetric tensor space. Based on an analysis of the properties of the new model, an efficient primal-dual algorithm is introduced. The new model can effectively reduce the staircase effect because the second-order symmetric gradient is of higher order than the first derivative; meanwhile, it preserves edges because the norm of the second-order symmetric gradient is smaller than the norm of the Hessian matrix. Both theoretical analysis and simulation results show that the new algorithm has high convergence speed and stability.
Trust-circle Based Recommendation on User Cold-start
YANG Wei-sheng,LUO Ai-min and ZHANG Meng-meng
Computer Science. 2013, 40 (Z11): 363-365. 
Abstract PDF(291KB) ( 746 )   
References | RelatedCitation | Metrics
In recent years, trust-based recommendation technology has developed rapidly to solve the user cold-start problem in recommender systems. However, traditional trust-based recommendation technologies handle the trust relationship comparatively roughly. The trust-circle based approach strictly controls the effect of the degree of trust on the recommendation results. As the experimental results show, this method can not only efficiently address the user cold-start problem, but also improve the precision of recommendation.
Human-centric Urban Transportation Multimedia Surveillance Method
YANG Tao,WANG Yong-gang,HU Jian-bin,GONG Bin and CHEN Zhong
Computer Science. 2013, 40 (Z11): 366-368. 
Abstract PDF(323KB) ( 632 )   
References | RelatedCitation | Metrics
With the rapid growth of vehicles on the road, urban transportation multimedia surveillance has become a very hot topic. A human operator who remotely monitors the image scenes captured by cameras can effectively watch only about four camera views, so the event false rate is very high in this traditional approach. RFID is a feasible technology for identifying items based on radio frequency transmission, and it can also be used to track and detect a wide variety of objects, especially high-speed moving vehicles. Human-centric CCTV surveillance is another technology that can help detect the operator's attention to important events in CCTV views. To implement a practical moving-vehicle surveillance system in big cities, we propose an RFID-vehicle-tag, event-assisted and human-centric urban transportation surveillance method. The method has two major features: 1. it computes the human operator's attention in the CCTV views to automatically determine the importance of events captured by the respective cameras; 2. it combines human-centric CCTV surveillance technology and RFID technology to evidently improve the event identification rate. The method enables system operators to track and detect events involving moving vehicles easily and efficiently.
Research and Implementation of the System of Multi-disciplinary Flow Integration and Design of Experiment
JIANG Xing-pei and WU Yi-zhong
Computer Science. 2013, 40 (Z11): 369-373. 
Abstract PDF(656KB) ( 635 )   
References | RelatedCitation | Metrics
As the design of a complex engineering system is a decision-making process of multi-disciplinary synthesis design optimization, a component-based System of Multi-disciplinary Flow Integration and Design of Experiment is designed and implemented. The system supports the conceptual design and simulation experiments of complex engineering systems. Compared with current multi-disciplinary design optimization systems, it has the following characteristics and advantages: 1) a hybrid model of design of experiment (DOE) and flow, rather than a project-management style, is proposed so that models can be managed easily; 2) multi-threading technology is adopted to schedule the flow's components and the DOE's instances in parallel; 3) dynamic task-scheduling technology is proposed to realize a flexible control mode combining the DOE's automatic scheduling with interactive control.
Study on the Key Technology of Sub-millimeter Imaging Guidance
WU Hang,ZHANG Yan-jie and ZHONG Qi-shui
Computer Science. 2013, 40 (Z11): 374-378. 
Abstract PDF(459KB) ( 844 )   
References | RelatedCitation | Metrics
Research on imaging guidance in the sub-millimeter band is being explored because of its unique advantages, and it has great development prospects in both military and civil fields. In this paper, target detection and recognition technology is introduced, and the object's radiometric equation and the detection distance of a sub-millimeter wave radiometer are given. We also introduce image pretreatment technologies, including image filtering and target segmentation. Finally, we point out the directions for future development of imaging guidance technology.
Rule-based Preprocessing Algorithm for Web Page Segmentation
PENG Hong-chao,TONG Ming-wen,ZOU Jun-hua and HAO Qiu-hong
Computer Science. 2013, 40 (Z11): 379-382. 
Abstract PDF(451KB) ( 666 )   
References | RelatedCitation | Metrics
Because the web contents and styles of National Level Excellent Courses are designed independently, web page segmentation algorithms can hardly run on them. We present a rule-based preprocessing algorithm for web page segmentation that creates correlations between tags and style information. The algorithm consists of three steps: first, obtain the style information; second, associate styles with tags; third, output the HTML and the PerfectNode, which is the associated class list. We randomly selected 100 pages from the National Level Excellent Courses to run the preprocessing algorithm. Experimental results show that the algorithm can associate tags with styles efficiently, which solves the problem that web page segmentation algorithms cannot run.
Prisoner Localization and Management System Based on RFID
YANG Heng-liang
Computer Science. 2013, 40 (Z11): 383-384. 
Abstract PDF(272KB) ( 815 )   
References | RelatedCitation | Metrics
The prison is an important part of the state apparatus; because of its special status, it places very high requirements on its security systems. The prisoner localization and management system based on RFID technology presented here is designed to meet these requirements. The system can position people in the prison in real time and accurately, with functions of roll call, personal route monitoring and alarming, and it effectively improves the level of prison safety protection.
Design of Solar Auto-tracking System Based on Embedded Technology
ZHAO Pei-mei and WANG Ri-hong
Computer Science. 2013, 40 (Z11): 385-388. 
Abstract PDF(330KB) ( 643 )   
References | RelatedCitation | Metrics
With the development of solar photovoltaic technology in recent years, high-precision sun-position tracking has become more and more important. However, for general sensors the tracking range is often neglected while high precision is guaranteed, so the sensor's search range is often too small to find the sun. To meet the requirements of photoelectric sensors in terms of both tracking range and tracking accuracy, the paper puts forward a design that uses an S3C2410 development board as the control core of the tracking control system; it adopts photoelectric tracking and sun-angle tracking with a complementary cylindrical sensor structure. A database is used to record the daily solar trajectory under the day's weather conditions, so the system can effectively improve the collection and utilization efficiency of solar energy and expand the sun-tracking range. It has good application prospects.
Applied Research on the Task-driven Teaching Mode in the Data Structure Curriculum Design
RAN Yan-hui and TANG Wan-mei
Computer Science. 2013, 40 (Z11): 389-391. 
Abstract PDF(283KB) ( 1013 )   
References | RelatedCitation | Metrics
This paper analyses the characteristics of the data structure course and, based on problems in actual teaching, puts forward a task-driven teaching mode. It then expounds the meaning and implementation steps of the task-driven teaching mode and applies it to the teaching of the data structure course. The mode effectively stimulates students' learning initiative, enables students to grasp the knowledge better, and improves their comprehensive abilities.
Improved Model of Plate Glancing Flatness and Thickness and its Active Disturbance Rejection Decoupling
ZHANG Rui-cheng and WANG Jian-chao
Computer Science. 2013, 40 (Z11): 392-394. 
Abstract PDF(263KB) ( 610 )   
References | RelatedCitation | Metrics
Taking the strongly coupled AFC-AGC continuous rolling process as the research object, the various coupling factors between thickness control and strip flatness control are analyzed, and a coupled thickness-flatness model is established after adding the effect of oil film thickness. An active disturbance rejection control (ADRC) technique is then used for the decoupling design, and simulations with the actual parameters of a factory's five-stand tandem rolling mill show that the designed disturbance-rejection decoupling controller can basically eliminate the coupling between plate thickness and plate flatness and obtains a good decoupling effect.
Implementation Method of Weapon Equipment SCM
ZHENG Cui-fang
Computer Science. 2013, 40 (Z11): 395-397. 
Abstract PDF(249KB) ( 1030 )   
References | RelatedCitation | Metrics
Software configuration management (SCM) runs throughout the entire software life cycle and can better solve problems concerning software resources, processes and related issues. This paper analyzes the current state of quality management for weapon equipment software; combined with the characteristics of weapon equipment software, an SCM implementation process is put forward, and the method, process and requirements of software configuration management are presented in detail, focusing on change control, the key process of SCM. Based on practical software development, the application of change control in weapon equipment software configuration management is also described.
Acoustic Location System Design
MIAO Sheng,ZHOU Wei,TANG Hao,WU Ji-da and YAO Shao-wen
Computer Science. 2013, 40 (Z11): 398-400. 
Abstract PDF(238KB) ( 1175 )   
References | RelatedCitation | Metrics
Acoustic source localization is one of the research topics in acoustic technology. In this paper, we use the TDOA (Time Difference of Arrival) principle in the design of an acoustic localization system: three or more microphones are placed in an open area, the acoustic signal is collected and the time delays between the different microphones are calculated, and then a new cooperative localization algorithm based on the Chan and Taylor algorithms is used to estimate the target location. Simulations and tests indicate that this system can effectively locate an acoustic target in an open area and has stable performance.
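A Taylor-series (Gauss-Newton) TDOA refinement step, the generic form of iteration that Chan/Taylor-style methods build on, is sketched below; the microphone layout and initial guess are illustrative, and the paper's cooperative algorithm is not reproduced.

```python
# Gauss-Newton refinement of a 2-D source position from TDOA range differences.
import numpy as np

def tdoa_taylor(mics, range_diffs, x0, iters=10):
    """mics: (N, 2) positions, mics[0] is the reference; range_diffs[i-1] = ||x-m_i|| - ||x-m_0||."""
    x = np.asarray(x0, float)
    for _ in range(iters):
        d = np.linalg.norm(x - mics, axis=1)                     # distances to every mic
        model = d[1:] - d[0]
        J = (x - mics[1:]) / d[1:, None] - (x - mics[0]) / d[0]  # Jacobian of the range differences
        dx, *_ = np.linalg.lstsq(J, range_diffs - model, rcond=None)
        x = x + dx
    return x

mics = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
true_src = np.array([1.3, 0.7])
d = np.linalg.norm(true_src - mics, axis=1)
print(tdoa_taylor(mics, d[1:] - d[0], x0=[1.0, 1.0]))   # converges back to ~(1.3, 0.7)
```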
Integration of PDM and ERP Based on the Generic Bill of Materials
ZHU Chao,YANG Wen-bing,SUN Lin-rui,WANG Hui-long and HUANG Guan-hua
Computer Science. 2013, 40 (Z11): 401-404. 
Abstract PDF(354KB) ( 772 )   
References | RelatedCitation | Metrics
Product Data Management (PDM) and Enterprise Resources Planning (ERP) are important technical foundations for manufacturing enterprises to achieve computer integrated manufacturing. The integration of the two systems is mainly based on workflow or on the bill of materials (BOM): the former cannot update data in real time because of the restrictions of workflow nodes, while the latter integrates only product structure data, which cannot meet the needs of variant design driven by diverse customer requirements. A complete product structure including the product configuration information was therefore established based on the generic bill of materials (GBOM). To accomplish the integration of PDM and ERP, intermediate files for parts (items) and the GBOM were set up for data interchange. The configuration information in the GBOM is transported to the ERP side, where the production staff can configure products, shortening the response time to market changes. A case study on the RMM3 molded-case circuit breaker (MCCB) illustrates the effectiveness of the proposed integration method.
Detecting and Tracking Edges of the Lane Based on Range Images
WANG Qian,GUO Li and SHI Hang-fei
Computer Science. 2013, 40 (Z11): 405-408. 
Abstract PDF(544KB) ( 674 )   
References | RelatedCitation | Metrics
The range images of a laser scanner can be used to detect and track the edges of a lane. By clustering and analyzing the scan points, the edges of the lane are detected and then approximated by lines fitted with the least squares algorithm. Based on the fitted lines, the edges of the lane are tracked with a Kalman filter. Experimental results show the accuracy and robustness of the algorithm.
Research on the Seismic Instantaneous Attributes of Empirical Mode Decomposition Parallel Extraction Algorithm Based on GPU
CAO Xiao-chu,JIN Di,WANG Zong-ren and WANG Qi-di
Computer Science. 2013, 40 (Z11): 409-411. 
Abstract PDF(584KB) ( 596 )   
References | RelatedCitation | Metrics
Extracting instantaneous attributes from seismic data is significant. Empirical mode decomposition (EMD), which is based on local signal characteristics, provides a new analysis method suitable for nonlinear and non-stationary signals. A parallel EMD algorithm on the GPU is analyzed and studied. Comparative experiments show that computation on the GPU has a substantial efficiency advantage over the CPU, with a speedup of up to 8.66 over the CPU on the test data in this article.
Analysis and Research of the Soil Fertility Status Based on Improved DBSCAN Algorithm
GUO Wan-chun,CAI Li-xia,CHEN Hang and CHEN Gui-fen
Computer Science. 2013, 40 (Z11): 412-414. 
Abstract PDF(230KB) ( 910 )   
References | RelatedCitation | Metrics
The density-based DBSCAN algorithm can typically deal effectively with clusters of arbitrary shape, but it does not analyze soil fertility comprehensively, since the temporal data have obvious differences. To solve this problem, this paper proposes a method for analyzing the soil fertility situation of Nong'an town based on an improved DBSCAN algorithm. First, AHP is used to obtain the weight of each property in order to balance the differences between the data; second, the improved DBSCAN algorithm is applied to analyze the soil fertility data, and the experimental results are compared with those of the traditional DBSCAN algorithm. The results show that the improved DBSCAN algorithm selects the two parameters Eps and MinPts more quickly and efficiently and achieves better clustering results.
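A sketch of attribute-weighted DBSCAN is shown below: features are standardized and then scaled by AHP-style weights before clustering, so that the Euclidean distance used by DBSCAN reflects attribute importance. The weights, Eps and MinPts values are illustrative, not those derived in the paper.

```python
# Weighted-feature DBSCAN: scale each attribute by its weight so the clustering distance
# accounts for attribute importance.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

def weighted_dbscan(X, weights, eps=0.5, min_pts=5):
    """X: (n_samples, n_attributes) soil measurements; weights: one weight per attribute."""
    Xs = StandardScaler().fit_transform(X)
    Xw = Xs * np.asarray(weights)            # weighted Euclidean space
    return DBSCAN(eps=eps, min_samples=min_pts).fit_predict(Xw)

# toy example: 3 soil attributes (e.g. N, P, K) with hypothetical AHP weights
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(5, 1, (50, 3))])
labels = weighted_dbscan(X, weights=[0.5, 0.3, 0.2], eps=0.8, min_pts=5)
print(set(labels))
```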
Research into the Guiding Function of Color Emotion in E-commerce Website Interface
ZHANG Xiao-ling,QIN Feng-mei and QIU Yu-hui
Computer Science. 2013, 40 (Z11): 415-416. 
Abstract PDF(258KB) ( 845 )   
References | RelatedCitation | Metrics
The colors of a website have a guiding function: they influence consumers' purchase decisions and convey consumption implications to consumers. In this paper, by combining website examples and practice, the guiding functions of color are studied from the perspective of the E-commerce website interface. It is a valuable attempt in the research on the guiding function of color emotion in E-commerce website interfaces.
Study of the Implementation Model of Information Construction in Campus
DUAN Zong-yao and RAO Shui-lin
Computer Science. 2013, 40 (Z11): 417-420. 
Abstract PDF(590KB) ( 612 )   
References | RelatedCitation | Metrics
This work belongs to the 2011 annual project of the Hubei Province education science plan (project number: 2011B267) on the theory and technology of campus information construction. The team carried out extensive investigations and studies to fully understand the information needs of teachers and students as well as the informatized working and teaching environments, and analyzed the direction and goals of campus information construction. It summarizes the experience and limitations of campus information construction using Web technology, applies instant messaging (IM), currently the most popular second-wave technology, to campus information construction, and developed IM software with chat functions similar to Tencent QQ and a user structure similar to an alumni network. It also discusses the theory of campus informationization and proposes appropriate solutions for campus information construction, together with the corresponding practice and viewpoints.
Work-based Process-oriented Curriculum Development Research and Practice
ZHANG Xiao-ling,QIN Feng-mei and QIU Yu-hui
Computer Science. 2013, 40 (Z11): 421-422. 
Abstract PDF(291KB) ( 779 )   
References | RelatedCitation | Metrics
Work process is "in the enterprise to accomplish a task and get the results of the work carried out a complete working program is a comprehensive,always in motion but the structure is relatively fixed system." Combined with computer multimedia technology professionals,based on the work process-oriented curriculum development and the construction of ideas,content,and other initiatives to analyze and discuss.
Design and Implementation of BPM Platform Based on SOA Technology Management
WANG Yu-juan
Computer Science. 2013, 40 (Z11): 423-425. 
Abstract PDF(769KB) ( 679 )   
References | RelatedCitation | Metrics
This paper presents a deep analysis of the shortcomings of the BPM platform of Sinopec's science and technology management system. SOA technology is introduced into the BPM platform, and a five-layer BPM platform for science and technology management based on SOA is put forward. Through the integration of the service component layer and the service layer, unified SOA service management is formed, which makes the complicated business logic of the original BPM platform scalable and maintainable.
Research and Implementation of SOA Frame in Call Center Reporting Systems
QIN Feng-mei,QIN An-bi and QIU Yu-hui
Computer Science. 2013, 40 (Z11): 426-427. 
Abstract PDF(760KB) ( 612 )   
References | RelatedCitation | Metrics
The call center reporting system is an important part of the information system. As the call center business continues to expand, the traditional reporting system cannot meet business needs flexibly and dynamically. This paper proposes an SOA-based model and an application implementation using Reporting Services, thereby ensuring the consistency of call report data sources while providing a standardized, uniform access interface and achieving effective access to and sharing of report data.
Design and Implementation of Personnel Training Information MS Based on Business Rule Engine
YU Jun-yang and GU Zi-yao
Computer Science. 2013, 40 (Z11): 428-431. 
Abstract PDF(327KB) ( 600 )   
References | RelatedCitation | Metrics
This paper researches and implements a personnel training information management system based on a business rule engine. Through the analysis of the business processes, the design of the main function modules and the description of business relationships in the data, the main business rules are expounded and the implementation of the key rules is described in detail. The on-line use of the system shows that the organization, management and analysis of personnel training are improved significantly.
Processing System of Meteorological Data Based on XML and Regular Expression
TIAN Lan,JIN Shi-sheng,LI Bo,BO Ying-zhu and LI Jue
Computer Science. 2013, 40 (Z11): 432-435. 
Abstract PDF(598KB) ( 644 )   
References | RelatedCitation | Metrics
With the continual progress of the construction of the composite meteorological observing system, the kinds and amount of real-time meteorological data transmitted increase rapidly. At the same time, the transmission and processing of real-time meteorological data requires the parallel processing of many kinds of real-time data within a certain period. To gather, store and share the real-time data efficiently, reliably, completely and rapidly, the system adopts XML (Extensible Markup Language) to mark and explain the content of the various kinds of messages and to define the data transmission operations, so that it can adapt to changing observation data transmission operations, with data processing applications based on regular expressions. It realizes the gathering, processing and distribution of real-time meteorological data in the provincial information center, and makes up for the disadvantages of the original operational system, which was complex, tedious in function, poorly extensible and inefficient.
Process of Establishing CMMI System and its Role in Project Management
QI Xiao-ling and FENG Da-peng
Computer Science. 2013, 40 (Z11): 436-438. 
Abstract PDF(496KB) ( 993 )   
References | RelatedCitation | Metrics
This paper introduces the origins of CMMI, its five maturity levels, and how the different levels help control project schedules and reduce the probability of project termination. It then describes how an organizational process system is established based on the CMMI model, as well as the system framework, its completion and the final results. Finally, based on the role of the CMMI system in project management, and taking peer review as an example, it introduces the benefits of CMMI for controlling project planning, project budgets and project results.