Graph Attention Networks (GAT) are a novel architecture that operates on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations. Based on PGL, we reproduce the GAT algorithm and match the accuracy reported in the paper on citation network benchmarks.
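To make the "masked self-attention" idea concrete, here is a minimal NumPy sketch of a single GAT attention head. This is an illustrative re-derivation of the mechanism from the paper, not the PGL implementation; the function name `gat_attention` and all tensor shapes are assumptions for the example.

```python
import numpy as np

def gat_attention(h, W, a, adj, alpha=0.2):
    """One GAT attention head (illustrative sketch, not the PGL code).

    h:   (N, F)   node features
    W:   (F, Fp)  shared linear transform
    a:   (2*Fp,)  attention weight vector
    adj: (N, N)   binary adjacency (1 = edge); acts as the attention mask
    """
    z = h @ W                       # (N, Fp) transformed node features
    n = z.shape[0]
    # e[i, j] = LeakyReLU(a^T [z_i || z_j]) for every node pair
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            s = np.concatenate([z[i], z[j]]) @ a
            e[i, j] = s if s > 0 else alpha * s   # LeakyReLU
    # mask: attention is only taken over each node's neighborhood
    e = np.where(adj > 0, e, -1e9)
    # softmax over neighbors yields the attention coefficients
    exp_e = np.exp(e - e.max(axis=1, keepdims=True))
    att = exp_e / exp_e.sum(axis=1, keepdims=True)
    # output: attention-weighted aggregation of neighbor features
    return att @ z

# Toy usage on a 4-node path graph with self-loops
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
adj = np.eye(4) + np.eye(4, k=1) + np.eye(4, k=-1)
out = gat_attention(h, W, a, adj)   # shape (4, 2)
```

The `-1e9` mask before the softmax is the standard way to zero out attention to non-neighbors; in practice (and in PGL) this is done sparsely over edges rather than with a dense N×N loop.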
```bash
git clone -b 2.2.5 https://github.com/PaddlePaddle/PGL
pip3 install pgl==2.2.5
```
There is no need to prepare datasets manually. Details about the datasets can be found in the paper.
```bash
cd PGL/examples/gat/
CUDA_VISIBLE_DEVICES=0 python3 train.py --dataset cora
```
| GPUs | Accuracy | Throughput |
|---|---|---|
| BI-V100 | 83.16% | 65.56 it/s |