计算机科学 ›› 2022, Vol. 49 ›› Issue (1): 241-251. doi: 10.11896/jsjkx.210100113

• 人工智能 •

基于机器学习的编译器自动调优综述

池昊宇, 陈长波   

  1. 中国科学院重庆绿色智能技术研究院自动推理与认知重庆市重点实验室 重庆 400714
    2. 中国科学院大学 北京 100049
  • 收稿日期:2021-01-14 修回日期:2021-05-23 出版日期:2022-01-15 发布日期:2022-01-18
  • 通讯作者: 陈长波(chenchangbo@cigit.ac.cn)
  • 作者简介:chihaoyu@cigit.ac.cn
  • 基金资助:
    国家自然科学基金面上项目(11771421);中国科学院“西部之光”;重庆市院士牵头科技创新引导专项(cstc2018jcyj-yszxX0002,cstc2019yszx-jcyjX0003,cstc2020yszx-jcyjX0005);国家重点研发计划(2020YFA0712300)

Survey on Automatic Tuning of Compilers by Machine Learning

CHI Hao-yu, CHEN Chang-bo   

  1. Chongqing Key Laboratory of Automated Reasoning and Cognition, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, Chongqing 400714, China
    2. University of Chinese Academy of Sciences, Beijing 100049, China
  • Received:2021-01-14 Revised:2021-05-23 Online:2022-01-15 Published:2022-01-18
  • About author: CHI Hao-yu, born in 1996, postgraduate. His main research interests include compiler optimization and machine learning.
    CHEN Chang-bo, born in 1981, Ph.D, associate professor. His main research interests include symbolic-numeric computation, automatic parallelization and optimization of computer programs, and machine learning.
  • Supported by:
    National Natural Science Foundation of China (61302144, 61902417).

摘要: 现代编译器提供的优化选项众多,选择何种参数因子、选择哪些选项组合以及以何种顺序应用这些选项成为复杂的问题,其中优化次序问题是最困难的优化问题。随着传统方法的改进(迭代编译结合启发式优化搜索)以及新技术的出现(机器学习),构建一种相对高效、智能的编译器自动调优框架成为可能。文中通过调查过去数十年的相关研究,总结了前人的研究思路和应用方法。首先介绍了编译器自动调优的发展历程,包括早期的手工方法、成本函数驱动的方法、启发式优化搜索驱动的迭代编译、基于机器学习的直接预测以及机器学习驱动的迭代编译方法。然后重点梳理了基于机器学习的直接预测和机器学习驱动的迭代编译自动调优方法,统计和对比了一些较为成功的框架和最新的研究成果。最后提出了当前编译器优化存在的问题和今后的重点研究方向。

关键词: 编译器, 迭代编译, 机器学习, 优化择项, 优化择序, 自动调优

Abstract: Modern compilers offer numerous optimization options. Choosing which parameter values to use, which combinations of options to select, and in which order to apply them is a complex problem, and among these the optimization phase ordering problem is the most difficult. With the improvement of traditional methods (iterative compilation combined with heuristic optimization search) and the emergence of new technologies (machine learning), it has become possible to build a relatively efficient and intelligent automatic tuning framework for compilers. By surveying related research of the past decades, this paper summarizes the ideas and methods of previous work. Firstly, the development of compiler automatic tuning is introduced, including early manual methods, cost-function-driven methods, iterative compilation driven by heuristic optimization search, direct prediction based on machine learning, and iterative compilation driven by machine learning. Then, the paper focuses on the latter two classes of machine-learning-based automatic tuning methods, surveying and comparing several successful frameworks as well as recent research results. Finally, current challenges of compiler optimization and key future research directions are pointed out.

Key words: Automatic tuning, Compiler, Iterative compilation, Machine learning, Optimization option selection, Optimization phase ordering selection

中图分类号: TP314