Three Tricks to Improve Graph Neural Networks and Boost GNN Performance (Part 3)


Second, pre-training the GNN model with a self-supervised objective usually benefits the final model's performance. It increases the number of training examples and can sometimes reduce inherent noise.
Third, testing different architectures for the pretext task and for the final prediction task can improve the model's predictive power.
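To make the pre-train-then-fine-tune pattern behind these two points concrete, below is a minimal sketch assuming PyTorch Geometric: a shared GCN encoder is first trained on a self-supervised link-prediction pretext task (no labels needed), then a fresh classification head is attached for the downstream node-classification task. The dataset fields (`data.x`, `data.edge_index`, `data.train_mask`, `data.y`), layer sizes, and epoch counts are illustrative assumptions, not values from the article.

```python
# Minimal sketch: self-supervised pre-training followed by fine-tuning.
# Assumes PyTorch Geometric and a Data object with x, edge_index, train_mask, y
# (e.g. a Planetoid benchmark graph); all hyperparameters are placeholders.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.utils import negative_sampling


class Encoder(torch.nn.Module):
    """Shared GNN encoder reused by the pretext and the downstream task."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, hid_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)


def pretrain_link_prediction(encoder, data, epochs=100, lr=0.01):
    """Pretext task: predict edge existence from the dot product of
    the two endpoint embeddings, using sampled negative edges."""
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        z = encoder(data.x, data.edge_index)
        pos = data.edge_index
        neg = negative_sampling(data.edge_index, num_nodes=data.num_nodes,
                                num_neg_samples=pos.size(1))
        pos_score = (z[pos[0]] * z[pos[1]]).sum(dim=-1)
        neg_score = (z[neg[0]] * z[neg[1]]).sum(dim=-1)
        loss = F.binary_cross_entropy_with_logits(
            torch.cat([pos_score, neg_score]),
            torch.cat([torch.ones_like(pos_score),
                       torch.zeros_like(neg_score)]))
        loss.backward()
        opt.step()


def finetune_node_classification(encoder, data, num_classes, epochs=50, lr=0.01):
    """Downstream task: attach a fresh linear head to the pre-trained encoder."""
    head = torch.nn.Linear(encoder.conv2.out_channels, num_classes)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()),
                           lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        logits = head(encoder(data.x, data.edge_index))
        loss = F.cross_entropy(logits[data.train_mask], data.y[data.train_mask])
        loss.backward()
        opt.step()
    return head
```

With a standard benchmark such as Planetoid's Cora (which exposes exactly these fields), `pretrain_link_prediction` would be run first and `finetune_node_classification` second; swapping the GCNConv layers in `Encoder` for another layer type (e.g. GATConv) is one way to compare different architectures for the pretext and final prediction tasks.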
https://www.overfit.cn/post/bfaf84ba0c204ad08689016a79130dd5
Author: Carlo H