RAGATv2, an enhanced model built upon the structure of the Relation-Aware Graph Attention Network (RAGAT).
Updated Sep 30, 2023 · Jupyter Notebook
Empirical research on the possible advantages of pretraining a Graph Neural Network for classification via link prediction. We used GCN, GAT, and GraphSAGE with minibatch generation. Done for the Learning From Networks course taught by Professor Fabio Vandin at the University of Padova.
Bachelor Thesis
Parallel implementations of Graph Attention Network, including CUDA, OpenMP, and TinkerPop Vertex Program.
This repository contains an unofficial implementation of the Graph Attention Network.
Trying different options on a word-context graph with GCN and GAT to obtain word embeddings.
Developing efficient classification for Reddit posts/comments/communities with Graph Neural Networks (GNNs)
Dense and sparse implementations of GAT written in PyTorch.
An attempt to apply a graph attention neural network to modelling EU electricity interconnectors using only public data.
🎩 First experience with Gatsby & the JAM stack
Ziyuan Chen & Zhirong Chen, 2022 Summer Research @ ZJU
Zero-to-hero for Graph Neural Networks
This repository presents and compares HeterSUMGraph and variants using GATConv, GATv2Conv and a combination of HeterSUMGraph and SummaRuNNer (using HeterSUMGraph as a sentence encoder).
A TensorFlow 2 implementation of Graph Attention Networks (GAT)
Graph Attention Networks (GATs)
Graph4CTR (GCNs, GATs, HGCNs)
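For readers browsing these repositories, a minimal single-head sketch of the GAT attention mechanism they implement (Veličković et al., 2018). This is an illustrative NumPy toy with made-up weights and a hypothetical 3-node graph, not code from any listed project:

```python
import numpy as np

def gat_attention(h, W, a, adj):
    """Single-head GAT layer on a dense adjacency matrix.

    h:   (N, F)   node features
    W:   (F, F')  shared linear projection
    a:   (2*F',)  attention vector
    adj: (N, N)   binary adjacency (1 where an edge exists, incl. self-loops)
    """
    z = h @ W                                   # project features: (N, F')
    N = z.shape[0]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            # e_ij = LeakyReLU(a^T [z_i || z_j]), negative slope 0.2
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else 0.2 * s
    e = np.where(adj > 0, e, -1e9)              # mask non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # row-wise softmax
    return alpha @ z                            # attention-weighted aggregation

# toy graph: 3 nodes with self-loops, plus one edge between nodes 0 and 1
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))
a = rng.normal(size=(4,))
adj = np.array([[1, 1, 0],
                [1, 1, 0],
                [0, 0, 1]])
out = gat_attention(h, W, a, adj)
print(out.shape)
```

The dense double loop is for clarity only; the repositories above use sparse edge lists and multi-head attention for real workloads.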