Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
Current Issue
Volume 48 Issue 11, 15 November 2021
Blockchain Technology
Survey on Key Techniques and Development of Blockchain as a Service Platform
MAO Han-yu, NIE Tie-zheng, SHEN De-rong, YU Ge, XU Shi-cheng, HE Guang-yu
Computer Science. 2021, 48 (11): 4-11.  doi:10.11896/jsjkx.210500159
Abstract PDF(1878KB) ( 1453 )   
Blockchain as a Service (BaaS) is a new application model that embeds a blockchain framework into a cloud computing platform, which can effectively use the cloud platform to improve the convenience and efficiency of deploying and operating blockchain systems. This paper analyzes and summarizes the key techniques and existing platform systems of Blockchain as a Service. Firstly, it introduces the concept and function of the BaaS platform, and analyzes the advantages of BaaS in improving security, enabling personalized customization and reducing development cost. Then, based on existing commercial BaaS platforms, the system architecture and key technique architecture of the BaaS platform are introduced in detail, along with the characteristics, technologies, functions and relevant application scenarios of the current mainstream BaaS platforms. Finally, the challenges faced by current BaaS platforms are summarized, and future research directions of BaaS are discussed.
Survey of Crowdsourcing Applications in Blockchain Systems
LI Yu, DUAN Hong-yue, YIN Yu-yu, GAO Hong-hao
Computer Science. 2021, 48 (11): 12-27.  doi:10.11896/jsjkx.210600152
Abstract PDF(1944KB) ( 2241 )   
Blockchain technology can be applied extensively in diverse services, ranging from online micro-payment, supply chain tracking, digital forensics and health-care record sharing to insurance payment. Extending the technology to crowdsourcing, we can obtain a verifiable and traceable crowdsourcing system. Emerging research in crowdsourcing applications exploits blockchain technology to optimize task assignment and reward payment using various consensus protocols and blockchain techniques, which can provide more secure, automated, verifiable and traceable crowdsourcing platforms. In this paper, we conduct a systematic survey of the key components of crowdsourcing blockchains and compare a number of popular blockchain applications. In particular, we first give an architectural overview of popular crowdsourcing-blockchain systems by analyzing their network structures and protocols. Then, we discuss the various consensus protocols for crowdsourcing blockchains and compare different consensus algorithms. Finally, we look forward to future research problems in this field and provide a large number of references.
Online Patient Communication Model Based on Blockchain
CHEN Xian-lai, ZHAO Xiao-yu, ZENG Gong-mian, AN Ying
Computer Science. 2021, 48 (11): 28-35.  doi:10.11896/jsjkx.210400240
Abstract PDF(1951KB) ( 865 )   
At present, false information is prevalent on the Internet, and the authenticity of shared information cannot be guaranteed when patients communicate online. To solve this problem, a blockchain-based online patient communication model is proposed, in which patients can anonymously share real medical data and communicate with other patients. Firstly, the patient's digital identity is used to protect privacy; the medical summary data needed for patient communication are uploaded to the blockchain and published for retrieval, which allows patients to retrieve the required cases without locating a specific patient. Secondly, to prevent malicious uploading by authorized personnel, smart contracts are used to perform multiple authentications on data upload, and doctors and patients restrict each other to ensure that all the data on the blockchain are true and reliable. Finally, an improved RAFT consensus algorithm can quickly identify Byzantine nodes so as to better achieve consensus on the blockchain. The performance of the model is evaluated through experiments, and the results show that medical data can be shared and the needs of patients for online communication can be met under the premise of ensuring patient privacy.
CFL_BLP_BC Model Based on Authentication and Blockchain
LIAN Wen-juan, ZHAO Duo-duo, FAN Xiu-bin, GENG Yu-nian, FAN Xin-tong
Computer Science. 2021, 48 (11): 36-45.  doi:10.11896/jsjkx.201000002
Abstract PDF(1577KB) ( 802 )   
The coming of the 5G era brings new challenges to the information security of emerging information industries. Existing security technologies cannot meet the requirements of millisecond-level authentication and trusted authentication for specific scenarios in the 5G era. Therefore, CFL technology is taken as the origin technology of information security. Based on local modifications to the security axioms of the original BLP model, and combined with blockchain technology, the CFL_BLP_BC model formally describes the basic elements, security axioms and state transition rules of the model. The model can support the construction of five aspects of information security, and has millisecond-level, instruction-level and empirical-system attributes. The model belongs to endogenous safety, innate immunity and active defense technology, and can provide important theoretical guidance for the emerging information industry.
Blockchain-based High-threshold Signature Protocol Integrating DKG and BLS
LIU Feng, WANG Yi-fan, YANG Jie, ZHOU Ai-min, QI Jia-yin
Computer Science. 2021, 48 (11): 46-53.  doi:10.11896/jsjkx.210200129
Abstract PDF(2186KB) ( 1133 )   
Threshold signatures are a fundamental tool for multi-party information security protocols, and are widely used in fields such as identity authentication, anti-counterfeiting and tamper resistance. We introduce a new decentralized threshold signature protocol, BHTSP, which combines distributed key generation (DKG) and BLS signatures. The protocol allows multi-party participation and generates a signature of constant size. We implement this protocol with a smart contract as the communication layer for secure parameter exchange. Experimental simulation results show that BHTSP can generate threshold signatures of constant size. It reduces the memory consumption of the aggregated public key combinations needed in signature verification by 85.3% compared to Schnorr signatures. On the experimental blockchain platform, BHTSP supports the generation of threshold signatures involving up to 50 participants, optimizing the execution process for blockchain multi-party transactions.
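BLS signing itself requires a pairing-friendly curve, but the (t, n) secret sharing at the heart of a DKG can be sketched with Shamir sharing over a prime field. This is a toy illustration, not BHTSP's actual construction: the modulus below is a small Mersenne prime rather than the BLS12-381 group order, and a real DKG would have every participant deal shares rather than a single dealer.

```python
import random

# Toy prime field; a real DKG would work over the BLS12-381 group order.
P = 2**127 - 1

def share_secret(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it (Shamir)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    # Share i is the degree-(t-1) polynomial evaluated at x = i.
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share_secret(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == 123456789
```

In a threshold BLS scheme, each participant would sign with its share and any t partial signatures aggregate into one constant-size group signature.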
SQL Access Framework for Hyperledger Fabric
YU Zhi-yong, LIN Li-qiang, CHEN Yan, ZHOU Tian, NI Yi-tao, CHEN Xing
Computer Science. 2021, 48 (11): 54-61.  doi:10.11896/jsjkx.210100220
Abstract PDF(2484KB) ( 744 )   
Blockchain technology has the advantages of decentralization, tamper resistance and traceability, and has been widely used in various fields. However, as a cutting-edge technology, blockchain is difficult to develop with and has high learning costs, while most developers are more familiar with application development based on relational databases, that is, operating relational databases through SQL. Hyperledger Fabric is the most mainstream blockchain development framework. In response to this problem, this article proposes a SQL access framework for Hyperledger Fabric. First of all, in view of the inconsistency between the underlying storage structure of Fabric and SQL, conversion rules from the relational model to the key-value model are defined and a model conversion algorithm is implemented. Second, a SQL execution contract is developed based on smart contract technology, which realizes the automatic transformation from SQL statements to CouchDB operation statements and then operates on the underlying data of Fabric. Finally, from the perspective of application-layer development, an application automation refactoring tool is designed; it consists of two parts, the application refactoring tool itself and Fabric-Driver, a middleware for interacting with the blockchain network. The experimental results show that, compared with the existing scheme, using the proposed framework to develop blockchain applications saves about 82% of development time, while the read/write performance loss of the framework is less than 5%.
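The relational-to-key-value conversion the framework relies on can be sketched as follows. The composite-key layout (table name joined to the primary-key value) and the JSON value encoding are illustrative assumptions, not the paper's actual conversion rules.

```python
import json

def row_to_kv(table, primary_key, row):
    """Map one relational row to a (key, value) pair, CouchDB-style.

    The key concatenates the table name and the primary-key value (a common
    composite-key convention); the remaining columns become a JSON document.
    """
    key = f"{table}:{row[primary_key]}"
    value = json.dumps({c: v for c, v in row.items() if c != primary_key},
                       sort_keys=True)
    return key, value

# One row of a hypothetical `account` table.
key, value = row_to_kv("account", "id", {"id": 7, "owner": "alice", "balance": 100})
assert key == "account:7"
assert json.loads(value) == {"owner": "alice", "balance": 100}
```

A SQL execution contract would apply such a mapping when translating `INSERT`/`SELECT` statements into CouchDB put and query operations against the world state.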
Survey of Anonymous and Tracking Technology in Zerocash
FU Zhen-hao, LIN Ding-kang, JIANG Hao-chen, YAN Jia-qi
Computer Science. 2021, 48 (11): 62-71.  doi:10.11896/jsjkx.210300025
Abstract PDF(1630KB) ( 1586 )   
In recent years, relying on research breakthroughs and the rapid development of blockchain technology, a variety of digital currencies have arisen and flooded into the market. As the currency with the strongest privacy in the UTXO model of blockchain so far, the anonymity technology of Zcash not only provides a strong guarantee for users' privacy, but also has high scientific research value and a wide range of application prospects. Therefore, in order to standardize the legal use of digital currency and explore the wider application prospects of digital currency anonymity technology, scholars from all walks of life have conducted research on the anonymity and anti-anonymity technologies of Zcash from different angles. Focusing on Zcash, a new digital currency, we first introduce the general framework of Zcash. Secondly, the anonymity technologies adopted by Zcash, zk-SNARKs and shielded pool transactions, are sorted out. Then we summarize and analyze the research on Zcash tracking technology. In the end, the development of anonymity and tracking technologies for Zcash is prospected.
Traceable Mixing Scheme for Bitcoin
YU Qi-long, LU Ning, SHI Wen-bo
Computer Science. 2021, 48 (11): 72-78.  doi:10.11896/jsjkx.210600242
Abstract PDF(1571KB) ( 894 )   
Mixing is an important way of protecting privacy in digital currencies such as Bitcoin. However, while Bitcoin mixing protects user privacy, it also facilitates the transfer of assets for illegal activities such as ransomware and Bitcoin theft. In this paper, we propose a traceable scheme for Bitcoin mixing. The scheme aims to protect the privacy of legitimate users while still being able to trace illegal assets. The system is regulated by a trusted third party; user anonymity and traceability are based on a group signature constructed from bilinear groups under the strong Diffie-Hellman assumption. When tracing is needed, the regulator can determine the signing user through the system private key, so as to determine the illegal asset transfer path. Security analysis shows that the scheme can trace illegal asset transfers without modifying the current Bitcoin system; meanwhile, it provides privacy protection and asset safety for legitimate users. Furthermore, the scheme provides a reference direction for research on digital currency privacy protection.
Survey of Vulnerability Detection Tools for Smart Contracts
TU Liang-qiong, SUN Xiao-bing, ZHANG Jia-le, CAI Jie, LI Bin, BO Li-li
Computer Science. 2021, 48 (11): 79-88.  doi:10.11896/jsjkx.210600117
Abstract PDF(1490KB) ( 3077 )   
Smart contracts are an important component of blockchain platforms for realizing transactions, providing an effective solution to the trust problem between multi-party transactions. Smart contracts not only manage high-value tokens but are also immutable once deployed, which has led to many security incidents in recent years. At present, much research has been devoted to the security of smart contracts, among which vulnerability detection has become the main concern. This paper analyzes the security of smart contracts systematically. From the perspective of whether the smart contract is executed, vulnerability detection tools are divided into static and dynamic detection tools. In particular, the vulnerability detection ability of existing detection tools is analyzed, and the principles, advantages and disadvantages of 16 detection techniques are discussed. Finally, the paper gives a prospect of how to improve the security of smart contracts, and puts forward three research directions that may improve smart contract security.
Ethereum Smart Contract Bug Detection and Repair Approach Based on Regular Expressions, Program Instrumentation and Code Replacement
XIAO Feng, ZHANG Peng-cheng, LUO Xia-pu
Computer Science. 2021, 48 (11): 89-101.  doi:10.11896/jsjkx.210600064
Abstract PDF(2418KB) ( 1083 )   
As the largest blockchain platform supporting smart contracts, Ethereum has millions of smart contracts deployed on it. Since deployed smart contracts cannot be modified even if they contain bugs, it is critical for developers to eliminate bugs prior to deployment. Many smart contract analysis tools have been proposed. These tools either use bytecode-based symbolic execution to detect bugs, or convert the source code to an intermediate representation and then detect bugs on it. Tools based on symbolic execution usually cannot cover many types of bugs in source code, while converting the source code to an intermediate representation negatively impacts detection speed. Moreover, these tools are bug detectors and cannot automatically fix bugs based on their analysis results. To address these limitations, we propose an approach named SolidityCheck, which employs regular expressions, program instrumentation and statement replacement in source code to quickly detect bugs and fix certain types of bugs. We conduct extensive experiments to evaluate SolidityCheck. The experimental results show that, compared with existing approaches, SolidityCheck demonstrates excellent performance on multiple indicators.
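A regular-expression pass of the kind SolidityCheck applies to source code can be sketched as below. The specific pattern (flagging `tx.origin` in authentication checks, a well-known Solidity pitfall) is an illustrative assumption, not one of SolidityCheck's actual rules.

```python
import re

# Flag uses of tx.origin: authenticating with tx.origin instead of
# msg.sender is vulnerable to phishing via intermediate contracts.
TX_ORIGIN = re.compile(r'\btx\.origin\b')

def find_tx_origin(source):
    """Return (line_number, line_text) pairs where tx.origin appears."""
    return [(n, line.strip())
            for n, line in enumerate(source.splitlines(), start=1)
            if TX_ORIGIN.search(line)]

contract = """\
contract Wallet {
    address owner;
    function withdraw() public {
        require(tx.origin == owner);
    }
}"""
hits = find_tx_origin(contract)
assert hits == [(4, "require(tx.origin == owner);")]
```

Regex matching directly on source text is what lets this style of tool run much faster than symbolic execution, at the cost of occasional false positives (e.g. matches inside comments), which is why instrumentation and replacement follow as separate steps.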
Research Progress on Blockchain-based Cloud Storage Security Mechanism
XU Kun, FU Yin-jin, CHEN Wei-wei, ZHANG Ya-nan
Computer Science. 2021, 48 (11): 102-115.  doi:10.11896/jsjkx.210600015
Abstract PDF(2831KB) ( 1667 )   
Cloud storage enables users to obtain cheap online storage services on demand through a network connection anytime and anywhere. However, due to the untrustworthiness of cloud service providers, third-party institutions and users, as well as inevitable malicious attacks, cloud storage has many security vulnerabilities. Blockchain has the potential to build a trusted platform with its characteristics of decentralization, persistence, anonymity and auditability. Therefore, research on cloud storage security mechanisms based on blockchain technology has become a research trend. On this basis, the security architecture of cloud storage systems and the security of blockchain technology are first outlined, then a literature review and comparative analysis are conducted from four aspects: access control, integrity verification, data deduplication and data provenance. Finally, the technical challenges of blockchain-based cloud storage security mechanisms are analyzed, summarized and prospected.
Data and Behavior Analysis of Blockchain-based DApp
HU Teng, WANG Yan-ping, ZHANG Xiao-song, NIU Wei-na
Computer Science. 2021, 48 (11): 116-123.  doi:10.11896/jsjkx.210200134
Abstract PDF(3979KB) ( 1302 )   
Blockchain technology has evolved rapidly in recent years; as a result, many organizations and enterprises have started using decentralized applications (DApps) based on blockchain and smart contracts to enhance the functionality and security of their information systems, or to expand into new businesses. However, DApps may also introduce new problems due to possible security and performance issues of blockchain and smart contracts. In order to study and analyze the data and behavioral phenomena of DApps in depth, so as to help users better apply blockchain and DApps, a total of 2 565 DApps in 21 categories are first collected, along with data related to these DApps from July 30, 2015 to May 4, 2020 (about 10 million block heights), including 16 302 smart contracts, 7 678 185 EOAs, 95 889 930 external transactions, and 30 833 719 internal transactions. Then the DApp distribution is deeply analyzed from four perspectives: number, time, category, and smart contract, and some findings are summarized, which can provide valuable references for DApp developers and blockchain researchers.
Computation Resource Allocation and Revenue Sharing Based on Mobile Edge Computing for Blockchain
XU Xu, QIAN Li-ping, WU Yuan
Computer Science. 2021, 48 (11): 124-132.  doi:10.11896/jsjkx.201100205
Abstract PDF(2668KB) ( 840 )   
This paper proposes a mobile edge computing (MEC) assisted blockchain system in which mobile terminals (MTs) do not have enough local computation resources to solve the proof-of-work (PoW) puzzle. By combining the computation resource allocation of the MTs and the edge server (ES) with the revenue sharing of the MTs, a joint optimization problem is formulated to maximize the system-wide utility of all MTs and the ES. To solve the optimization problem efficiently, a multi-layer decomposition algorithm based on cyclic block coordinate descent (CBCD) is proposed. First, given the revenue sharing variables in advance, the corresponding sub-problem is solved to obtain the computation resource allocation of both the MTs and the ES. Then, with the obtained computation resource allocation, the revenue sharing variables of the MTs are optimized. Finally, the two sub-problems are optimized alternately until the algorithm converges. The numerical results show that the proposed algorithm can effectively obtain the optimal solution of the joint optimization problem and improve the system-wide utility.
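The alternating structure of cyclic block coordinate descent can be shown on a toy problem; the two-variable objective below is a stand-in for the paper's utility model, in which one block corresponds to resource allocation and the other to revenue sharing.

```python
def minimize_alternating(x=0.0, y=0.0, iters=100):
    """Cyclic block coordinate descent on f(x, y) = (x - 2y)^2 + (y - 1)^2.

    Each step minimizes f exactly over one block while the other is held
    fixed, mirroring the paper's alternation between the resource-allocation
    and revenue-sharing sub-problems. The minimizer is (x, y) = (2, 1).
    """
    for _ in range(iters):
        x = 2 * y               # argmin over x with y fixed
        y = (2 * x + 1) / 5     # argmin over y with x fixed
    return x, y

x, y = minimize_alternating()
assert abs(x - 2.0) < 1e-6 and abs(y - 1.0) < 1e-6
```

Because each block update here has a closed form and the objective is convex, the alternation contracts toward the joint optimum; the paper's algorithm follows the same pattern with its own sub-problem solvers.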
Consortium Blockchain Consensus Algorithm Based on PBFT
ZHOU Yi-hua, FANG Jia-bo, JIA Yu-xin, JIA Li-yuan, SHI Wei-min
Computer Science. 2021, 48 (11): 133-141.  doi:10.11896/jsjkx.201200148
Abstract PDF(2157KB) ( 1951 )   
Focusing on the problems of poor scalability, random selection of primary nodes, and high network overhead in the practical Byzantine fault tolerant (PBFT) consensus algorithm, this paper proposes a Byzantine fault tolerant consensus algorithm based on a credit evaluation mechanism. First of all, the system sets different roles for the nodes in the cluster and assigns different permissions according to those roles, so that nodes with different permissions can join and leave the network dynamically. Secondly, a voting mechanism and a follow-the-satoshi algorithm based on credibility are designed to ensure the security and fairness of the election. Finally, for the consensus process, an optimized two-stage Byzantine fault tolerant consensus is proposed to reduce the network overhead of the PBFT consensus process. Experiments show that, compared with the PBFT algorithm, the proposed consensus algorithm based on the credit evaluation mechanism offers high dynamism, election security, and low cost, and is suitable for consortium blockchains.
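A credibility-weighted primary election of the kind described (a follow-the-satoshi-style draw in which a node's chance is proportional to its credit) can be sketched as follows; the credit values and the seeding scheme are illustrative assumptions, not the paper's exact mechanism.

```python
import hashlib

def elect_primary(credits, seed):
    """Pick a primary node with probability proportional to its credit score.

    A deterministic draw: hash the shared seed into [0, total_credit) and
    walk the cumulative credit intervals (follow-the-satoshi style), so all
    honest replicas that see the same seed elect the same primary.
    """
    total = sum(credits.values())
    draw = int.from_bytes(hashlib.sha256(seed.encode()).digest(), "big") % total
    for node, credit in sorted(credits.items()):
        if draw < credit:
            return node
        draw -= credit

credits = {"n0": 5, "n1": 1, "n2": 4}
primary = elect_primary(credits, seed="block-1024")
assert primary in credits
# The same seed always elects the same node, so replicas agree on the primary.
assert primary == elect_primary(credits, seed="block-1024")
```

Raising a node's credit after good behavior (and cutting it after faults) then biases future elections toward reliable nodes without any single party choosing the primary.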
Trust-based Dual-layer Scalable Consensus Protocol
SHAO Xing-hui, HUANG Jian-hua, WANG Meng-nan, WU Hai-xia, MAI Yong
Computer Science. 2021, 48 (11): 142-150.  doi:10.11896/jsjkx.210100126
Abstract PDF(2956KB) ( 784 )   
As the core of blockchain technology, the consensus mechanism determines the performance, scalability, and security of blockchain systems. Aiming at the performance and scalability issues of current blockchains and the high cost of the incentives used to maintain system security, a trust-based dual-layer scalable consensus protocol (TDSCP) is proposed. First, a trust model and a dual-layer cooperative consensus algorithm are designed over a structured network. The trust model determines whether a node gets the right to generate blocks based on its trustworthiness, avoiding the high cost of mining. Secondly, the dual-layer consensus algorithm within the partitions improves consensus efficiency, expands the number of nodes involved in consensus, and avoids system centralization. Finally, a verifiable random function and a multilevel graph partitioning algorithm are combined for partitioning nodes, which can effectively prevent malicious nodes from gathering and reduce the number of cross-partition transactions. The experimental results show that TDSCP improves the scalability of the blockchain system, its intra-partition consensus latency is lower, and the partitioning method significantly reduces the number of cross-partition transactions.
PBFT Optimized Consensus Algorithm for Internet of Things
LIU Wei, RUAN Min-jie, SHE Wei, ZHANG Zhi-hong, TIAN Zhao
Computer Science. 2021, 48 (11): 151-158.  doi:10.11896/jsjkx.210500038
Abstract PDF(2901KB) ( 1352 )   
Faced with a large number of IoT transactions, an efficient consensus algorithm plays a key role in applying blockchain technology to the IoT. In this paper, to address the long consensus delay and low throughput of the practical Byzantine fault tolerant (PBFT) algorithm, we propose a practical Byzantine fault tolerant algorithm based on clustering (C-PBFT). Firstly, the nodes are clustered according to location features to form a multi-center, layered network structure. Then, consensus tasks are divided so that consensus is conducted in the bottom and top networks separately, thereby reducing the communication cost of consensus. Finally, a dynamic credit model is introduced to evaluate node credibility, reducing the participation of abnormal nodes and increasing the security and reliability of the system. Experimental results show that the C-PBFT algorithm can effectively reduce communication overhead and consensus delay and improve throughput.
Key Update Mechanism in Bitcoin Based on Improved P2PKHCA Script Scheme
XIANG A-xin, GAO Hong-feng, TIAN You-liang
Computer Science. 2021, 48 (11): 159-169.  doi:10.11896/jsjkx.210400027
Abstract PDF(2193KB) ( 916 )   
Bitcoin is one of the most mature public chain application systems. The user key is the critical factor in determining the ownership of bitcoins: the security of Bitcoin is guaranteed by the safe management of user keys, and the loss of a key leads to the loss of a large number of user assets, so recovering lost assets is an urgent problem. This paper proposes a key update mechanism in Bitcoin based on an improved P2PKHCA (pay-to-public-key-hash-with-conditional-anonymity) script scheme to solve the above problems. Firstly, the key generation algorithm in the P2PKHCA scheme is improved by introducing a key life cycle and random numbers to solve its key leakage problem. Secondly, two new opcodes, OP_KEYUPDATE and OP_TSELECTION, are proposed to design a new key update script that realizes user key updates in the Bitcoin system. Finally, two types of key update schemes based on the key update script are constructed to make the script suitable for different key update applications. The security and performance analyses of the key update mechanism show that the proposed mechanism realizes the recovery of lost bitcoins in the Bitcoin system on the premise that the user's key update is completed effectively.
Database & Big Data & Data Science
Study on Multi-source Data Fusion Framework Based on Graph
KUANG Guang-sheng, GUO Yan, YU Xiao-ming, LIU Yue, CHENG Xue-qi
Computer Science. 2021, 48 (11): 170-175.  doi:10.11896/jsjkx.201100004
Abstract PDF(1757KB) ( 3489 )   
When analyzing the various data in a given task, most current research only analyzes single-source data and lacks methods for multi-source data. As data become ever more abundant, this paper proposes a multi-source data fusion framework for fusing data from multiple network platforms. The data of the same platform contain text and various attributes, and there are also great differences in content and form among the data of different platforms. Most existing network information mining methods only use part of the data within the same platform for analysis, and even ignore the interaction between the data of different platforms. Therefore, this paper proposes a data fusion framework that can not only use more features of the same platform to improve the performance of a single platform, but also fuse the data features of different platforms so that they complement each other, thereby improving the performance of multiple platforms. Experiments on an event classification task show that the abundant features effectively improve the F1 value, which verifies the effectiveness of the proposed multi-source data fusion framework.
Collaborative Filtering Recommendation Algorithm of Behavior Route Based on Knowledge Graph
CHEN Yuan-yi, FENG Wen-long, HUANG Meng-xing, FENG Si-ling
Computer Science. 2021, 48 (11): 176-183.  doi:10.11896/jsjkx.201000004
Abstract PDF(2493KB) ( 870 )   
For personalized recommendation, common recommendation algorithms include content-based recommendation, Item CF and User CF. However, most of these algorithms and their improved variants tend to focus on users' explicit feedback (tags, ratings, etc.) or rating data, and make little use of multi-dimensional user behaviors and their order, resulting in low recommendation accuracy and cold start problems. To improve recommendation accuracy, a collaborative filtering recommendation algorithm of behavior routes based on knowledge graph (BR-CF) is proposed. Firstly, according to the user behavior data, a behavior graph and behavior routes are created considering the behavior order, and then vectorization technology (the Keras Tokenizer) is applied. Finally, the similarity between multi-dimensional behavior route vectors is calculated, and route collaborative filtering recommendation is carried out for each dimension. On this basis, two improved algorithms combining BR-CF and Item CF are proposed. The experimental results show that the BR-CF algorithm can recommend effectively in multiple dimensions on the Ali Tianchi user behavior dataset, making full use of the data and diversifying recommendations, and the improved algorithms can improve the recommendation performance of Item CF.
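The route-similarity step can be sketched with plain cosine similarity between encoded behavior routes; the action encoding below is a hypothetical example, not the paper's tokenization.

```python
import math

def cosine(u, v):
    """Cosine similarity between two behavior-route vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Hypothetical encoded routes: each id stands for one action in order
# (e.g. 1 = click, 2 = favorite, 3 = add-to-cart, 4 = purchase).
route_a = [1, 3, 3, 2]
route_b = [1, 3, 2, 2]   # nearly the same ordered behavior as route_a
route_c = [4, 4, 4, 4]   # a very different behavior route
assert cosine(route_a, route_b) > cosine(route_a, route_c)
```

Collaborative filtering then recommends to a user the items favored by users whose routes score highest under such a similarity, computed per behavior dimension.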
Imbalanced Data Classification of AdaBoostv Algorithm Based on Optimum Margin
LU Shu-xia, ZHANG Zhen-lian
Computer Science. 2021, 48 (11): 184-191.  doi:10.11896/jsjkx.200900107
Abstract PDF(1911KB) ( 499 )   
To solve the problem of imbalanced data classification, this paper proposes an AdaBoostv algorithm based on the optimal margin. In this algorithm, an improved SVM is used as the base classifier: a margin mean term is introduced into the optimization model of the SVM, and the margin mean term and the loss function term are weighted by the data imbalance ratio. The stochastic variance reduced gradient (SVRG) method is used to solve the optimization model and improve the convergence rate. In the optimal-margin AdaBoostv algorithm, a new adaptive cost-sensitive function is introduced into the instance weight update formula: minority instances, misclassified instances and borderline minority instances are assigned higher cost values. In addition, a new weighting strategy for the base classifiers is derived by combining the new weight formula with an estimate of the optimal margin under the given precision parameter v, so as to further improve classification accuracy. Experimental results show that the classification accuracy of the optimal-margin AdaBoostv algorithm is better than that of other algorithms on imbalanced datasets in both the linear and nonlinear cases, and it obtains a larger minimum margin.
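The cost-sensitive reweighting idea can be sketched with a single AdaBoost-style round; the exponential form and the cost values below are illustrative assumptions, not the paper's exact update formula.

```python
import math

def update_weights(weights, errors, costs, alpha):
    """One cost-sensitive AdaBoost-style reweighting round.

    Misclassified instances (error = 1) are up-weighted by exp(alpha * cost),
    correct ones down-weighted by exp(-alpha); higher costs for minority
    instances push the next base learner to focus on them.
    """
    new = [w * math.exp(alpha * c if e else -alpha)
           for w, e, c in zip(weights, errors, costs)]
    total = sum(new)
    return [w / total for w in new]

w = [0.25] * 4
errors = [1, 0, 0, 1]          # instances 0 and 3 were misclassified
costs = [2.0, 1.0, 1.0, 1.0]   # instance 0 is a minority instance
w = update_weights(w, errors, costs, alpha=0.5)
assert abs(sum(w) - 1.0) < 1e-9
assert w[0] > w[3] > w[1]      # costly mistake > plain mistake > correct
```

The asymmetry is the point: with equal costs the update reduces to ordinary AdaBoost, while larger minority-class costs keep the ensemble from being dominated by the majority class.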
Recommendation Algorithm Based on Knowledge Graph and Tag-aware
NING Ze-fei, SUN Jing-yu, WANG Xin-juan
Computer Science. 2021, 48 (11): 192-198.  doi:10.11896/jsjkx.201000085
Abstract PDF(2411KB) ( 1454 )   
Recommendation systems alleviate the problem of information overload caused by the rapid increase of data on the Internet, but traditional recommendation systems are not accurate enough due to data sparsity and cold start. Therefore, a novel recommendation algorithm based on knowledge graph and tag-awareness (KGTA) is proposed. First, the tags of items and users are used to capture low-order and high-order features through knowledge graph representation learning: the semantic information of entities and relationships in the two knowledge graphs is embedded into a low-dimensional vector space to obtain unified representations of items and users. Then, deep neural networks and recurrent neural networks combined with an attention mechanism are used to extract the latent features of items and users, respectively. Finally, ratings are predicted on the basis of the latent features. KGTA not only takes the relationship and semantic information of knowledge graphs and tags into consideration, but also learns the latent features of items and users through deep structures. Experimental results on the MovieLens datasets show that the proposed algorithm performs better in rating prediction and improves the accuracy of recommendation.
Data Placement Strategy of Scientific Workflow Based on Fuzzy Theory in Hybrid Cloud
LIU Zhang-hui, ZHAO Xu, LIN Bing, CHEN Xing
Computer Science. 2021, 48 (11): 199-207.  doi:10.11896/jsjkx.200900009
Abstract PDF(2586KB) ( 497 )   
A reasonable data placement strategy is essential to the efficient execution of scientific workflows in a hybrid cloud environment. Traditional data placement strategies mainly focus on deterministic environments, but in actual networks the data transmission time is uncertain due to differing loads, bandwidth fluctuation and network congestion between data centers, and differing machine characteristics. To solve this problem, a fuzzy adaptive discrete particle swarm optimization algorithm based on fuzzy theory and genetic algorithm operators (FGA-DPSO) is proposed to minimize the fuzzy data transmission time, placing scientific workflow data reasonably while meeting the privacy requirements of the datasets and the capacity limits of the data centers. The experimental results show that the algorithm can effectively reduce the fuzzy data transmission time of scientific workflows in a hybrid cloud environment.
MLCPM-UC:A Multi-level Co-location Pattern Mining Algorithm Based on Uniform Coefficient of Pattern Instance Distribution
LIU Xin-bin, WANG Li-zhen, ZHOU Li-hua
Computer Science. 2021, 48 (11): 208-218.  doi:10.11896/jsjkx.201000097
Abstract PDF(2821KB) ( 545 )   
A spatial co-location pattern is a set of spatial features whose instances frequently appear together in a spatial region. Due to the correlation and heterogeneity of spatial data, the instances of a co-location may be distributed globally across the whole study area (a global co-location pattern) or appear only in a local region of it (a regional co-location pattern); multi-level co-location pattern mining is therefore proposed. There are two problems with current multi-level co-location pattern mining methods: 1) they ignore the spatial distribution characteristics of patterns and fail to accurately distinguish global from regional co-location patterns; 2) they use globally non-prevalent co-location patterns as candidate regional co-location patterns, so the number of candidate patterns is too large. In response to the above problems, we first define the uniform coefficient of the instance distribution of a co-location pattern, considering the pattern's spatial distribution alongside its prevalence, so as to correctly and efficiently identify global and regional co-location patterns. Secondly, a novel multi-level co-location pattern mining algorithm is designed based on the uniform coefficient, in which an effective pruning strategy is proposed to improve efficiency. Finally, extensive experiments on real and synthetic datasets verify the correctness and efficiency of the proposed method.
Computer Graphics & Multimedia
Improved Multi-feature Fusion Algorithm for Target Detection Based on Haar-like and LBP
YUAN Xiao-pei, CHEN Xiao-feng, LIAN Ming
Computer Science. 2021, 48 (11): 219-225.  doi:10.11896/jsjkx.201100174
Abstract PDF(3075KB) ( 589 )   
Aiming at the problems of Haar-like target detection, such as an excessive number of feature values, high computational cost, inability to describe target texture and a low recognition rate, this paper proposes an adaptive-threshold IHL (Improved Haar-like LBP) feature extraction algorithm based on the origin information of the sliding window. More specifically, the algorithm first constructs the IHL feature coding method fusing Haar-like and LBP features. Then, when computing the Haar-like local binary features, a Gaussian matrix is used to obtain an adaptive threshold that conforms to the pixel distribution, and the pixel information of the central point is introduced to ensure the rationality of the extracted feature values. Finally, a cascade classifier trained by AdaBoost is built, and experiments are conducted on the KITTI vehicle data set and the INRIA Person pedestrian data set. The proposed algorithm, with a recognition rate of more than 94%, recognizes 1 102 pedestrian targets in 65 s and 1 852 vehicle targets in 114.3 s, significantly speeding up target recognition and improving detection accuracy compared with state-of-the-art methods.
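The LBP half of the fused feature can be sketched as a thresholded 3×3 binary code. The fixed `threshold` parameter below stands in for the paper's Gaussian-matrix adaptive threshold, and the Haar-like fusion is omitted, so treat this as an illustration of the coding idea only.

```python
# 8-bit LBP code of a 3x3 patch; a neighbor contributes a 1 only if it
# exceeds the center by more than `threshold`, which suppresses codes
# caused by small pixel noise.

def lbp_code(patch, threshold=0):
    center = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2),
             (2, 2), (2, 1), (2, 0), (1, 0)]   # clockwise from top-left
    code = 0
    for bit, (r, c) in enumerate(order):
        if patch[r][c] - center > threshold:
            code |= 1 << bit
    return code

flat = [[90, 91, 90], [89, 90, 91], [90, 90, 89]]     # near-uniform patch
edge = [[200, 200, 200], [10, 10, 200], [10, 10, 10]]  # strong edge
```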
Optimization Method for Inter-frame Stability of Object Pose Estimation for Human-Machine Collaboration
MU Feng-jun, QIU Jing, CHEN Lu-feng, HUANG Rui, ZHOU Lin, YU Gong-jing
Computer Science. 2021, 48 (11): 226-233.  doi:10.11896/jsjkx.201200095
Abstract PDF(2198KB) ( 872 )   
Existing object pose estimation methods cannot provide estimated poses that are stable between frames. When their results are used directly in visualization scenarios such as augmented reality, the rendered content jitters, making them ill-suited to application scenarios such as human-machine collaboration. This paper proposes an optimization method for object pose estimation that combines several techniques: by improving the loss function of the original pose estimation method and applying causal filtering to the estimation results, a stable estimated pose is obtained. In addition, to complete the evaluation system for the stability of pose estimation methods, this paper proposes three evaluation indicators, the direct deviation distance (DBD), the direction reversal rate (DRR) and the average displacement angle (ADA), which evaluate an object pose estimation method from multiple viewpoints. Finally, tests on the YCB-STB dataset compare the method with the unoptimized original. The results show that the proposed method improves the inter-frame stability of existing object pose estimation methods without introducing additional resources, with little impact on the accuracy of the original method, satisfying the requirements of object pose estimation in human-machine collaboration scenes.
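The two ingredients, a causal filter over the pose stream and an inter-frame jitter indicator, can be sketched as follows. The exponential filter and the simplified 1-D reversal metric are illustrative stand-ins, not the paper's actual filter or its DBD/DRR/ADA definitions.

```python
# Causal exponential smoothing of a pose stream plus a simple 1-D
# direction reversal rate, an analogue of the DRR idea: a static object
# whose estimate oscillates frame-to-frame has a high reversal rate.

def smooth_poses(positions, alpha=0.5):
    """Each output uses only past frames, so the filter is causal."""
    out = [positions[0]]
    for p in positions[1:]:
        prev = out[-1]
        out.append(tuple(alpha * c + (1 - alpha) * q
                         for c, q in zip(p, prev)))
    return out

def direction_reversal_rate(xs):
    """Fraction of consecutive steps where 1-D motion flips sign."""
    steps = [b - a for a, b in zip(xs, xs[1:])]
    flips = sum(1 for s, t in zip(steps, steps[1:]) if s * t < 0)
    return flips / max(len(steps) - 1, 1)

# A jittery x-trajectory: the object is static but the estimate oscillates.
raw = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
smoothed = [p[0] for p in smooth_poses([(x,) for x in raw], alpha=0.3)]
```

Smoothing shrinks the jitter amplitude (total variation) even when the oscillation direction still alternates.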
Smoothing Filter Detection Algorithm Based on Middle and Tail Information of Differential Histogram
DAN Zhou-yang, LIU Fen-lin, GONG Dao-fu
Computer Science. 2021, 48 (11): 234-241.  doi:10.11896/jsjkx.200900121
Abstract PDF(3588KB) ( 847 )   
Smoothing is an important method for digital image denoising and blurring, and is often used to beautify and retouch "forged" images, so it is necessary to detect various smoothing filters. Aiming at common image smoothing operations, this paper proposes a new smoothing filter detection algorithm based on the middle and tail information of the differential histogram. Firstly, for the image to be detected, multiple absolute-difference histograms are constructed with different difference step lengths and directions. Then the occurrence frequencies of the difference values 0 and 1 are extracted, together with the several largest difference values at the tail of each histogram and their occurrence frequencies, to construct a multi-dimensional detection feature. Finally, an SVM classifier is trained to detect images. Experiments on smoothing detection and on distinguishing different smoothing filters are carried out on an image library. The experimental results show that the proposed algorithm has excellent detection performance for three common spatial smoothing filters, including the median filter, average filter and Gaussian filter. In addition, the algorithm can effectively distinguish smoothing from sharpening, scaling, compression and other digital image operations, and is robust to JPEG compression.
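The feature construction can be sketched as below. The feature layout, the tail size `k` and the toy images are assumptions for illustration; the paper builds such vectors over multiple step lengths and directions and feeds them to an SVM.

```python
# Absolute-difference histogram along one direction, keeping the
# frequencies of differences 0 and 1 (the head mass that smoothing
# inflates) plus the k largest difference values and their frequencies
# (the tail, which smoothing empties).
from collections import Counter

def diff_histogram(img, step=1):
    diffs = [abs(row[c + step] - row[c])
             for row in img for c in range(len(row) - step)]
    return Counter(diffs)

def detection_feature(img, step=1, k=2):
    h = diff_histogram(img, step)
    total = sum(h.values())
    head = [h.get(0, 0) / total, h.get(1, 0) / total]
    tail = []
    for d in sorted(h)[-k:]:             # k largest difference values
        tail.extend([d, h[d] / total])
    while len(tail) < 2 * k:             # pad so the SVM input is fixed-size
        tail = [0, 0.0] + tail
    return head + tail

noisy = [[0, 5, 0, 5], [5, 0, 5, 0]]    # unfiltered, large differences
smooth = [[2, 2, 3, 3], [3, 3, 2, 2]]   # smoothed, differences near zero
f_noisy = detection_feature(noisy)
f_smooth = detection_feature(smooth)
```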
Sign Language Recognition Based on Image-interpreted Mechanomyography and Convolution Neural Network
WANG Xin-ping, XIA Chun-ming, YAN Jian-jun
Computer Science. 2021, 48 (11): 242-249.  doi:10.11896/jsjkx.201000019
Abstract PDF(5088KB) ( 972 )   
Time series signals are widely used in various pattern recognition applications. To address the low pattern recognition rate of time series signals over a large number of target classes, this article converts time series signals into images with a variety of transform methods and performs pattern recognition with image classification algorithms. In the experiment, the mechanomyography (MMG) signals corresponding to 30 sign language gestures are collected from the forearm and converted into different image styles, and a convolutional neural network (CNN) framework is designed to build pattern recognition classification models for the images. The models are further optimized with a transfer learning algorithm, and the recognition rate of the best classification model reaches 98.7%, much higher than that of traditional machine learning algorithms. The experimental results imply that converting time series signals to images can effectively improve the recognition rate of multi-target pattern recognition from MMG, providing a reference for pattern recognition of other time series signals.
Multi-patch and Multi-scale Hierarchical Aggregation Network for Fast Nonhomogeneous Image Dehazing
YANG Kun, ZHANG Juan, FANG Zhi-jun
Computer Science. 2021, 48 (11): 250-257.  doi:10.11896/jsjkx.200900058
Abstract PDF(3109KB) ( 781 )   
Although dehazing algorithms based on convolutional neural networks have made tremendous progress on synthetic uniform-haze datasets, they still perform poorly on real nonhomogeneous hazy images. To achieve fast and effective nonhomogeneous image dehazing, we propose a multi-patch and multi-scale hierarchical aggregation network (MPSHAN), which fuses multi-patch local information and multi-scale global information. We further propose a hierarchical fusion module (HFM) that decouples residual fusion to achieve richer non-linear feature expression and improves the feature fusion quality at key locations through a channel attention mechanism. At the same time, dilated convolution is used across hierarchies to obtain multi-scale information and enhance the feature maps, optimizing the fusion effect. In addition, a frequency-domain term is added to the loss function to restore better edge quality. The experimental results show that the proposed algorithm is robust on nonhomogeneous hazy images, and the average processing time for 1 200×1 600 high-resolution images is only 0.044 s. Compared with other dehazing algorithms, it achieves a better balance between dehazing effect and running time.
Image Super-resolution by Residual Attention Network with Multi-skip Connection
LIU Zun-xiong, ZHU Cheng-jia, HUANG Ji, CAI Ti-jian
Computer Science. 2021, 48 (11): 258-267.  doi:10.11896/jsjkx.201000033
Abstract PDF(5156KB) ( 718 )   
Deep convolutional neural networks (CNNs) become difficult to train as they grow deeper. Moreover, in image super-resolution, the channel-wise features of the low-resolution (LR) input are treated equally across channels, limiting the representational ability of the CNNs. To resolve these issues, a residual attention network with multi-skip connection (RANMC) is proposed for single-image super-resolution (SISR). It employs a residual-in-multi-skip-connection (RIMC) structure to form a very deep network composed of several residual groups, each containing a number of short skip connections (SSC) and multi-skip connections (MC). Based on RIMC, abundant low-frequency (LF) information is bypassed through the multi-skip connections, letting the principal network focus on learning high-frequency (HF) information. Furthermore, considering interdependencies in the channel and spatial dimensions, an attention mechanism block (AMBlock) is proposed to locate informative regions and adaptively recalibrate channel-wise features, combining spatial attention (SA) and channel attention (CA) mechanisms. Experiments indicate that RANMC not only recovers image details better but also achieves higher image quality and network performance.
Improved YOLO v4 Algorithm for Safety Helmet Wearing Detection
JIN Yu-fang, WU Xiang, DONG Hui, YU Li, ZHANG Wen-an
Computer Science. 2021, 48 (11): 268-275.  doi:10.11896/jsjkx.200900098
Abstract PDF(2355KB) ( 1330 )   
Safety production management is an important policy for high-risk industries such as construction and heavy industry, and safety helmets play a key role in head protection in production environments, so supervision of helmet wearing must be strengthened. In recent years, image-based monitoring of helmet wearing has become the main means by which enterprises implement such management, and improving the detection accuracy and speed of helmet-wearing detection is a crucial issue in practice. To deal with this issue, an improved YOLO v4 algorithm is proposed in this paper to raise both the accuracy and the efficiency of safety helmet wearing detection. First, a 128×128 feature map output is added to the original three feature map outputs of the YOLO v4 algorithm, and the 8× downsampled feature map output is changed to 4× downsampling, providing more small-target features for subsequent feature fusion. Second, the feature fusion module is improved based on the idea of dense connection to realize feature reuse, so that the YOLO head classifier responsible for small-target detection can exploit features at different levels and obtain better detection and classification results. Finally, comparative experiments show that the average precision of the proposed method reaches 91.17%, 2.96% higher than that of the original network, while the detection speed remains essentially unchanged at 52.9 frame/s. The proposed algorithm thus achieves better detection accuracy while meeting real-time requirements, effectively realizing high-speed and high-precision detection of helmet wearing.
Artificial Intelligence
Automatic Learning Method of Domain Semantic Grammar Based on Fault-tolerant Earley Parsing Algorithm
MA Yi-fan, MA Tao-tao, FANG Fang, WANG Shi, TANG Su-qin, CAO Cun-gen
Computer Science. 2021, 48 (11): 276-286.  doi:10.11896/jsjkx.210100218
Abstract PDF(1819KB) ( 567 )   
Refined domain text analysis is an important prerequisite for high-quality domain knowledge acquisition. It usually relies on a large number of semantic grammars, but summarizing them manually is time-consuming and labor-intensive. In this paper, an automatic learning method for semantic grammar based on a fault-tolerant Earley parsing algorithm is proposed, which automatically generates new semantic grammars (including lexicons and grammar production rules) from a seed grammar to reduce labor costs. The method uses an optimized fault-tolerant Earley parser to perform fault-tolerant parsing on the input statements, generates candidate semantic grammars from the parse trees produced by fault-tolerant parsing, and finally filters or corrects the candidates to obtain the final semantic grammars. In experiments on TCM medical records of five different diseases, the precision of learning new lexicons is 63.88%, and the precision of learning new grammar production rules is 81.78%.
Joint Extraction Method for Chinese Medical Events
YU Jie, JI Bin, LIU Lei, LI Sha-sha, MA Jun, LIU Hui-jun
Computer Science. 2021, 48 (11): 287-293.  doi:10.11896/jsjkx.201200016
Abstract PDF(2325KB) ( 1194 )   
The popularization of electronic clinical medical records (EMRs) makes it possible to quickly extract high-value information from EMRs in an automated way. As a kind of crucial medical information, a tumor medical event is typically composed of a series of attributes describing a malignant tumor. Tumor medical event extraction has recently become a research hotspot in the academic community, and several influential academic conferences publish it as an evaluation task with high-quality manually annotated data. Aiming at the discrete nature of tumor event attributes, this paper proposes a joint extraction method that jointly extracts the primary tumor site and primary tumor size, as well as the tumor metastasis sites. In addition, to alleviate the small number and limited variety of annotated tumor medical texts, this paper proposes a pseudo-data generation algorithm based on the global random replacement of key information, which improves the transfer learning ability of the joint extraction method across different types of tumor events. The proposed method won third place in the clinical medical event extraction evaluation task of CCKS 2020, and extensive experiments on the CCKS 2019 and CCKS 2020 datasets verify its effectiveness.
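The pseudo-data idea, consistently replacing every occurrence of an annotated attribute value throughout a record, can be sketched as below. The attribute pools, the example record and the two attribute types are hypothetical; the paper's algorithm operates on CCKS-style annotations.

```python
# Pseudo-data generation by global random replacement: each annotated
# value (e.g. the primary tumor site) is swapped, consistently across
# the whole text, for another value of the same attribute type.
import random

SITE_POOL = ["left upper lung lobe", "right lower lung lobe", "liver"]
SIZE_POOL = ["1.2cm x 1.0cm", "3.5cm x 2.8cm"]

def generate_pseudo(text, annotations, rng):
    """Return a new record and its annotations with values replaced."""
    pools = {"site": SITE_POOL, "size": SIZE_POOL}
    new_ann = {}
    for attr, value in annotations.items():
        candidates = [v for v in pools[attr] if v != value]
        new_ann[attr] = rng.choice(candidates)
        text = text.replace(value, new_ann[attr])   # global replacement
    return text, new_ann

record = ("Mass in left upper lung lobe, size 1.2cm x 1.0cm; "
          "left upper lung lobe is the primary site.")
ann = {"site": "left upper lung lobe", "size": "1.2cm x 1.0cm"}
pseudo_text, pseudo_ann = generate_pseudo(record, ann, random.Random(0))
```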
Branching Heuristic Strategy Based on Learnt Clauses Deletion Strategy for SAT Solver
WANG Yi-jie, XU Yang, WU Guan-feng
Computer Science. 2021, 48 (11): 294-299.  doi:10.11896/jsjkx.201000142
Abstract PDF(1884KB) ( 577 )   
In most SAT solvers, the popular branching decision strategies are based on conflict-driven variable activity: the unassigned variable with the maximum activity is selected as the decision variable, so the most recent conflicts are addressed first. However, these strategies ignore the impact that the number of clauses containing a decision variable has on Boolean constraint propagation (BCP). To solve this problem, this paper proposes a branch variable decision strategy based on the learnt clause deletion strategy (VDALCD), which reduces the activity of the variables in a clause when that clause is deleted. Based on the VDALCD strategy, Glucose 4.1 and MapleLCMDistChronoBT-DL-v2.1 are improved into the solvers Glucose4.1_VDALCD and Maple-DL_VDALCD. Using the 2018 and 2019 SAT international competition benchmarks as test cases, the improved versions are compared with the original solvers. The experimental results show that on the 2018 benchmark, Glucose4.1_VDALCD solves 26 more instances than Glucose 4.1, an increase of 15.5%; on the 2019 benchmark, Maple-DL_VDALCD solves 17 more instances than MapleLCMDistChronoBT-DL-v2.1, an increase of 7.6%.
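The core VDALCD idea, decaying the activity of variables that appear in a deleted learnt clause, can be sketched on top of a VSIDS-style activity table. The bump and decay constants below are assumptions for illustration; the actual strategy is implemented inside Glucose and MapleLCM.

```python
# VSIDS-style activities with an extra hook: when a learnt clause is
# deleted, every variable in it loses activity, so variables that mostly
# occur in discarded clauses lose influence on branching.

class Activity:
    def __init__(self, n_vars, bump=1.0, deletion_decay=0.25):
        self.act = [0.0] * (n_vars + 1)     # 1-indexed variables
        self.bump = bump
        self.deletion_decay = deletion_decay

    def on_conflict(self, clause):
        for lit in clause:
            self.act[abs(lit)] += self.bump

    def on_clause_deleted(self, clause):
        for lit in clause:
            self.act[abs(lit)] *= self.deletion_decay

    def pick_branch_var(self, unassigned):
        return max(unassigned, key=lambda v: self.act[v])

a = Activity(n_vars=4)
a.on_conflict([1, -2])        # learnt clause (x1 | ~x2)
a.on_conflict([1, -2])        # the same variables conflict again
a.on_conflict([3, 4])         # learnt clause (x3 | x4)
before_del = a.pick_branch_var([1, 2, 3, 4])
a.on_clause_deleted([1, -2])  # clause discarded: its variables decay
after_del = a.pick_branch_var([1, 2, 3, 4])
```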
Study on Text Retrieval Based on Pre-training and Deep Hash
ZOU Ao, HAO Wen-ning, JIN Da-wei, CHEN Gang, TIAN Yuan
Computer Science. 2021, 48 (11): 300-306.  doi:10.11896/jsjkx.210300266
Abstract PDF(1964KB) ( 651 )   
Aiming at the low efficiency and accuracy of text retrieval, a retrieval model based on a pre-trained language model and deep hashing is proposed. Firstly, the prior knowledge of text contained in the pre-trained language model is introduced by transfer learning, and the input is transformed into a high-dimensional vector representation by feature extraction. A hash learning layer is added to the back end of the model, and the model parameters are fine-tuned with specifically designed optimization objectives, so that the hash function and a unique hash representation of each input are learned dynamically during training. Experimental results show that the retrieval accuracy of this method is at least 21.70% and 21.38% higher than that of other benchmark models at top-5 and top-10, respectively. The introduction of hash codes increases retrieval speed by a factor of 40 at the cost of only 4.78% accuracy. The method can therefore significantly improve retrieval accuracy and efficiency, and has a promising application prospect in the field of text retrieval.
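The retrieval step enabled by the hash layer can be sketched as sign-binarization plus Hamming-distance ranking; the embedding vectors below are hypothetical stand-ins for the pre-trained model's outputs.

```python
# Real-valued embeddings are binarized by sign into hash codes packed
# into integers, so ranking reduces to cheap XOR/popcount comparisons.

def to_hash(vec):
    """Sign-binarize an embedding and pack the bits into one integer."""
    code = 0
    for i, x in enumerate(vec):
        if x > 0:
            code |= 1 << i
    return code

def hamming(a, b):
    return bin(a ^ b).count("1")

def retrieve(query_vec, corpus_codes, top_k=2):
    q = to_hash(query_vec)
    ranked = sorted(range(len(corpus_codes)),
                    key=lambda i: hamming(q, corpus_codes[i]))
    return ranked[:top_k]

corpus = [[0.9, -0.2, 0.4, 0.1],    # doc 0: same sign pattern as the query
          [-0.5, 0.8, -0.3, 0.2],   # doc 1: mostly opposite signs
          [0.7, -0.1, 0.2, -0.6]]   # doc 2: differs in one bit
codes = [to_hash(v) for v in corpus]
result = retrieve([1.0, -0.3, 0.5, 0.2], codes, top_k=2)
```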
Text Sentiment Analysis Based on Fusion of Attention Mechanism and BiGRU
YANG Qing, ZHANG Ya-wen, ZHU Li, WU Tao
Computer Science. 2021, 48 (11): 307-311.  doi:10.11896/jsjkx.201000075
Abstract PDF(1980KB) ( 1772 )   
Aiming at the inability of simple neural networks to capture the contextual semantics of texts and extract their important information, a sentiment analysis model, FFA-BiAGRU, that integrates an attention mechanism and GRU is proposed. First, the text is pre-processed and the words are vectorized through GloVe to reduce the dimension of the vector space. Then a hybrid module that fuses the attention mechanism with the update gate of the gating unit extracts important information from the text features. Finally, the text features are further extracted through a forced forward attention mechanism and classified by a softmax classifier. Experiments on public data sets show that the model can effectively improve sentiment analysis performance.
Construction and Application of Russian Multimodal Emotion Corpus
XU Lin-hong, LIU Xin, YUAN Wei, QI Rui-hua
Computer Science. 2021, 48 (11): 312-318.  doi:10.11896/jsjkx.200900088
Abstract PDF(2162KB) ( 792 )   
As a research hotspot in the field of emotion analysis, Russian multimodal sentiment analysis technology can automatically analyze and identify emotions from rich information such as text, voice and images, which helps to understand public opinion hotspots in Russian-speaking countries and regions in a timely manner. However, only a few multimodal emotion corpora exist for Russian, which limits the further development of Russian emotion analysis technology. Based on an analysis of related research and of the emotion classification methods of multimodal emotion corpora, this paper develops a scientific and complete tagging system covering 11 items of information on utterance, space-time and emotion. Throughout corpus construction and quality control, this paper follows the principles of emotional subject and emotional continuity, formulates a highly operational annotation specification, and constructs a large-scale Russian emotion corpus. Finally, it discusses the application of the corpus to the analysis of emotional expression characteristics, the analysis of personality characteristics and the construction of emotion recognition models.
Knowledge Distillation Based Implicit Discourse Relation Recognition
YU Liang, WEI Yong-feng, LUO Guo-liang, WU Chang-xing
Computer Science. 2021, 48 (11): 319-326.  doi:10.11896/jsjkx.201000099
Abstract PDF(1914KB) ( 869 )   
Owing to the lack of connectives, implicit discourse relation recognition models must infer the semantic relation (e.g., causal) between two arguments (clauses or sentences) from their semantics alone, and the performance of these models is still relatively low. It is also very difficult for corpus annotators to annotate implicit discourse relations; they usually insert an appropriate connective to assist the annotation of an implicit discourse relation instance. Considering the above, a knowledge distillation based method is proposed for implicit discourse relation recognition that makes use of the connectives inserted during corpus annotation. Specifically, a connective-enhanced model is constructed to integrate the connective information, which is then transferred to the implicit discourse relation recognition model via knowledge distillation. Experimental results on the commonly used PDTB dataset show that the proposed method achieves better performance than the baselines.
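The distillation step, transferring the connective-enhanced teacher's relation distribution to the student, is commonly a KL-divergence loss over temperature-softened outputs; the logit vectors below are stand-ins for the two models, and the temperature value is an assumption.

```python
# Knowledge distillation loss: the student (implicit model, no connective)
# is trained to match the temperature-softened relation distribution of
# the connective-enhanced teacher.
import math

def softmax(logits, temperature=1.0):
    exps = [math.exp(z / temperature) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, temperature)   # soft targets
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [3.0, 0.5, -1.0]       # connective-enhanced model, confident
aligned = [2.9, 0.6, -1.1]       # student close to the teacher
misaligned = [-1.0, 0.5, 3.0]    # student contradicting the teacher

loss_good = kd_loss(aligned, teacher)
loss_bad = kd_loss(misaligned, teacher)
```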
Path Planning of Mobile Robot with A* Algorithm Based on Artificial Potential Field
CHEN Ji-qing, TAN Cheng-zhi, MO Rong-xian, WANG Zhi-kui, WU Jia-hua, ZHAO Chao-yang
Computer Science. 2021, 48 (11): 327-333.  doi:10.11896/jsjkx.200900170
Abstract PDF(5275KB) ( 1162 )   
To address the fact that the traditional A* algorithm does not take the distribution of obstacles into account when selecting a path, this paper proposes an improved A* algorithm that combines the idea of the artificial potential field with the traditional A* algorithm. Each obstacle in the grid map is assigned a repulsive force function, the repulsive force exerted on the surrounding grid cells is calculated, and this repulsive force is introduced into the evaluation function of the A* algorithm to improve its search ability. The results of MATLAB simulations and Turtlebot experiments show that, compared with the traditional A* algorithm, the improved algorithm combined with the artificial potential field can plan a better path and improve the efficiency of path planning: the search speed increases by 13.40%~29.68%, the path length is shortened by 10.56%~24.38%, and the number of path nodes is reduced by 6.89%~27.27%. The experimental results show that the improved A* algorithm has an obvious optimization effect and is effective and feasible.
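The combination described above can be sketched as a grid A* whose evaluation function f = g + h is augmented with a repulsion term around obstacles. The repulsion form, radius and weight below are illustrative assumptions, not the paper's exact potential-field model.

```python
# 4-connected grid A* with a Manhattan heuristic plus an obstacle
# repulsion term added to f, which steers the planned path away from
# obstacles while g still tracks the true step cost.
import heapq

def repulsion(cell, obstacles, radius=2.0, weight=3.0):
    r = 0.0
    for ox, oy in obstacles:
        d = abs(cell[0] - ox) + abs(cell[1] - oy)
        if 0 < d <= radius:
            r += weight * (1.0 / d - 1.0 / radius)
    return r

def astar_apf(start, goal, obstacles, size):
    open_set = [(0.0, start)]
    g = {start: 0.0}
    came = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < size and 0 <= nxt[1] < size):
                continue
            if nxt in obstacles:
                continue
            ng = g[cur] + 1.0
            if ng < g.get(nxt, float("inf")):
                g[nxt] = ng
                came[nxt] = cur
                h = abs(nxt[0] - goal[0]) + abs(nxt[1] - goal[1])
                f = ng + h + repulsion(nxt, obstacles)
                heapq.heappush(open_set, (f, nxt))
    return None

obstacles = {(2, 1), (2, 2), (2, 3)}       # a vertical wall in a 5x5 grid
path = astar_apf((0, 2), (4, 2), obstacles, size=5)
```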
Computer Network
Review of Directional Routing Protocols for Flying Ad-Hoc Networks Based on Directional Antennas
YANG Zhang-lin, XIE Jun, ZHANG Geng-qiang
Computer Science. 2021, 48 (11): 334-344.  doi:10.11896/jsjkx.210400182
Abstract PDF(1872KB) ( 918 )   
In recent years, flying ad-hoc networks with UAVs as nodes have received extensive attention due to their applications in various fields. To meet the quality-of-service requirements of complex tasks, routing in flying ad-hoc networks must provide sufficient network performance. Compared with omnidirectional routing based on omnidirectional antennas, directional routing based on directional antennas improves channel utilization and communication range, enabling flying ad-hoc networks to obtain better network performance and quality of service. This paper analyzes the advantages and problems brought by applying directional antennas in flying ad-hoc networks. Furthermore, the existing single-path and multi-path directional routing protocols are introduced in detail in terms of directional antenna control mechanism, routing algorithm, application scenarios, advantages and disadvantages. These routing protocols are then compared qualitatively in terms of antenna type, control mechanism, network performance and key parameters. Finally, the challenges faced in the application and future development of directional routing are discussed.
Hover Location Selection and Flight Path Optimization for UAV for Localization Applications
ZHAO Xiao-wei, ZHU Xiao-jun, HAN Zhou-qing
Computer Science. 2021, 48 (11): 345-355.  doi:10.11896/jsjkx.201000105
Abstract PDF(3613KB) ( 860 )   
A typical application of UAVs is locating ground targets. This paper proposes to let a UAV hover at predetermined positions and broadcast beacon signals; when a ground node receives beacons from at least three hovering positions, it can localize itself. The paper mainly considers how to choose the hovering positions and how to optimize the flight path of the UAV. Two hovering schemes are proposed, together with a path planning algorithm for each, and the resulting flight paths are proven to be the shortest under the respective schemes. Simulations verify that the proposed schemes achieve complete coverage of the area, so that any ground node can be localized, and show that higher localization accuracy can be achieved by adjusting the flying height of the UAV or the grid size used in the hovering schemes.
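Once a ground node has distance estimates to at least three known hover points, it can localize itself by trilateration: subtracting the range equations pairwise gives a 2×2 linear system. The hover coordinates and noiseless ranges below are hypothetical values for illustration.

```python
# Closed-form trilateration from three anchors.  Circle i is
# (x - xi)^2 + (y - yi)^2 = ri^2; subtracting circle 1 from circles 2
# and 3 cancels the quadratic terms, leaving A [x, y]^T = b.
import math

def trilaterate(anchors, dists):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21          # non-zero if anchors not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

hover_points = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (3.0, 4.0)
ranges = [math.dist(p, target) for p in hover_points]
estimate = trilaterate(hover_points, ranges)
```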
Non-orthogonal Multiple Access Enabled Scalable Video Multicast in HetNets
JI Xiao-xiang, SHEN Hang, BAI Guang-wei
Computer Science. 2021, 48 (11): 356-362.  doi:10.11896/jsjkx.200900080
Abstract PDF(2405KB) ( 532 )   
This paper presents a resource management framework for non-orthogonal multiple access (NOMA)-enhanced scalable video coding (SVC) multicast in heterogeneous networks (HetNets). In this framework, radio spectrum slicing across base stations, spectrum partition among the multicast groups within each slice, and transmit power division within each multicast group are jointly considered to maximize the overall video quality experienced by user equipment in the multicast groups. For tractability, the joint resource management problem is formulated as an integer linear programming problem that accounts for different video requests, varying device locations and inter-cell interference. The optimization problem is decoupled into an intra-group transmit power division subproblem and a multi-group spectrum partition and inter-slice spectrum partition subproblem. The former obtains the optimal transmit power for each layer of superposition coding through multiple iterations; the latter is solved optimally by a knapsack algorithm. Simulation results show that the proposed scheme outperforms existing schemes in spectral efficiency and in the average video quality of each user equipment.
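The spectrum-partition subproblem solved "by a knapsack algorithm" can be sketched as a multiple-choice knapsack over per-group configurations. The groups, spectrum costs and quality utilities below are made-up illustrations of the quantities the power-division subproblem would supply.

```python
# Multiple-choice knapsack by dynamic programming: each multicast group
# must pick exactly one configuration (how many SVC layers to serve),
# each with a spectrum cost and a video-quality utility, subject to a
# total spectrum budget.

def partition_spectrum(groups, budget):
    """Return (best total utility, chosen (group, option) pairs)."""
    # dp[b] = best (utility, choices) using exactly budget b; -1 = infeasible
    dp = [(0.0, [])] + [(-1.0, [])] * budget
    for g, options in enumerate(groups):
        new = [(-1.0, [])] * (budget + 1)
        for b in range(budget + 1):
            if dp[b][0] < 0:
                continue
            for opt, (cost, util) in enumerate(options):
                nb = b + cost
                if nb <= budget and dp[b][0] + util > new[nb][0]:
                    new[nb] = (dp[b][0] + util, dp[b][1] + [(g, opt)])
        dp = new
    return max(dp)

# Each group: option k = serve k+1 SVC layers -> (spectrum cost, utility).
groups = [[(1, 2.0), (3, 3.5)],    # group 0
          [(2, 3.0), (4, 4.0)]]    # group 1
best_util, choices = partition_spectrum(groups, budget=5)
```

With a budget of 5 units, serving more layers to group 0 and the base layer to group 1 beats upgrading group 1 instead.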
Reinforcement Learning Based Dynamic Basestation Orchestration for High Energy Efficiency
ZENG De-ze, LI Yue-peng, ZHAO Yu-yang, GU Lin
Computer Science. 2021, 48 (11): 363-371.  doi:10.11896/jsjkx.201000008
Abstract PDF(2683KB) ( 937 )   
The mutual promotion of mobile communication technology and the mobile communication industry has brought unprecedented prosperity in the mobile Internet era. The explosion of mobile devices, the expansion of network scale and rising service requirements are driving the next technological revolution in wireless networks. 5G meets the requirement of a thousand-fold improvement in service performance through dense network deployment, but co-channel interference and bursty requests make the energy consumption of this solution very large. To support energy-efficient, high-performance 5G services, the management schemes of mobile networks must be upgraded and improved. In this article, we use a short-cycle management framework with cache queues to achieve agile and smooth handling of request-burst scenarios, avoiding dramatic fluctuations in service quality caused by bursts. We use deep reinforcement learning to learn the user distribution and communication demands and to infer the load variation of the base stations, thereby realizing pre-scheduling and pre-allocation of energy while guaranteeing quality of service and improving energy efficiency. Compared with the classic DQN algorithm, the two-buffer DQN algorithm proposed in this paper converges nearly 20% faster; in terms of decision performance, it saves 4.8% of energy consumption compared with the widely used keep-on strategy.
Study on Performance Optimization of Edge Devices Based on Two-layer Virtualization
TAO Zhi-yong, ZHANG Jin, YANG Wang-dong, CHEN Wei-man
Computer Science. 2021, 48 (11): 372-377.  doi:10.11896/jsjkx.210400061
Abstract PDF(3387KB) ( 541 )   
As the number of users accessing an ISP's edge devices grows, the amount of data to be processed has multiplied, overloading the edge devices and affecting the normal interaction of virtual private network data built on multi-protocol label switching and the border gateway protocol. The existing MCE, HOPE and SDN solutions all have limitations in solving this problem: the edge device interface in MCE mode does not support the creation of logical channels, so that solution cannot be used; HOPE mode causes problems such as routing loops; and a single SDN controller cannot handle more than 64 000 concurrent sessions. In response to the above problems, an edge device optimization solution based on two-layer virtualization is proposed, comprising three basic steps: edge device virtualization, virtual private network tunnel establishment, and virtual private network information isolation. On this basis, the solution is optimized in terms of the network model and the construction and splitting of the network resource pool. The performance of the solution is evaluated in an experimental environment and compared with a virtual private network constructed in the traditional way in terms of packet forwarding rate, manageability and scalability. The analysis shows that the two-layer virtualization solution, designed to construct a network resource pool, realizes unified scheduling and management of resources, effectively solves the overload problem of ISP edge equipment, and is an effective way to build a virtual private network.