EasyDGL


The official implementation for "EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning".

What's New

[2023.07.10] We released an early version of the PyTorch-DGL code on the pre-branch.

[2023.03.16] We released the TensorFlow version of our code for link prediction.

Results for Link Prediction

Dataset

We use the Netflix benchmark to evaluate model performance. The TFRecord schema is as follows:

Feature Name   Feature Type              Content
seqs_i         FixedLenFeature(int64)    sequence of the items a user rated
seqs_t         FixedLenFeature(float32)  timestamp of each rating, in seconds
seqs_hour      FixedLenFeature(int64)    hour of each timestamp
seqs_day       FixedLenFeature(int64)    day of each timestamp
seqs_weekday   FixedLenFeature(int64)    weekday of each timestamp
seqs_month     FixedLenFeature(int64)    month of each timestamp

TFRecord download: Google, Quark
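
For reference, these records can be consumed with tf.data as in the minimal sketch below. The sequence length SEQ_LEN and the file name are placeholders, not values taken from this repository; check the data/ scripts for the actual settings.

import tensorflow as tf

SEQ_LEN = 200  # hypothetical sequence length; the actual value depends on the preprocessing

# One entry per field of the schema above
feature_spec = {
    "seqs_i":       tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "seqs_t":       tf.io.FixedLenFeature([SEQ_LEN], tf.float32),
    "seqs_hour":    tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "seqs_day":     tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "seqs_weekday": tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "seqs_month":   tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
}

def parse_example(serialized):
    # Decode one serialized tf.train.Example into a dict of dense tensors
    return tf.io.parse_single_example(serialized, feature_spec)

# "netflix.tfrecord" is a placeholder path for the downloaded file
dataset = tf.data.TFRecordDataset("netflix.tfrecord").map(parse_example)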

Results

Below we report HR@50, NDCG@50 and NDCG@100 on the dataset provided above.

Model            HR@50     NDCG@50   NDCG@100
GRU4REC          0.40903   0.18904   0.20321
SASREC           0.41802   0.19614   0.21075
S2PNM            0.41960   0.19536   0.20991
BERT4REC         0.42487   0.19782   0.21257
GREC             0.41915   0.19573   0.20974
TGAT             0.41633   0.19205   0.20679
TiSASREC         0.44583   0.20879   0.22334
TimelyREC        0.42202   0.19897   0.21315
CTSMA            0.45240   0.21141   0.22589
EasyDGL (ours)   0.48320   0.23104   0.24476
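
For clarity, under the common single-target next-item protocol HR@K and NDCG@K reduce to the simple forms sketched below; this is the generic definition, and the paper's exact evaluation protocol may differ in details.

import math

def hit_rate_at_k(rank: int, k: int) -> float:
    # rank is the 1-based position of the held-out item in the ranked list
    return 1.0 if rank <= k else 0.0

def ndcg_at_k(rank: int, k: int) -> float:
    # With a single relevant item, DCG is 1/log2(rank + 1) and the ideal DCG is 1
    return 1.0 / math.log2(rank + 1) if rank <= k else 0.0

# Example: ground-truth item ranked 3rd
print(hit_rate_at_k(3, 50))  # 1.0
print(ndcg_at_k(3, 50))      # 0.5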

Folder Specification

  • conf/: logging configurations
  • data/: preprocessing scripts for data filtering and splitting
  • runme.sh: script to train or evaluate EasyDGL and the baseline models
  • src/: source code for model definitions

Supported algorithms: GRU4REC, SASREC, S2PNM, BERT4REC, GREC, TGAT, TiSASREC, TimelyREC, CTSMA and EasyDGL (see the results above).

Run the Code

Download our data to the $DATA_HOME directory, then reproduce the above results on the Netflix benchmark:

bash runme.sh ${DATA_HOME}

Citation

If you find our code useful, please consider citing our work:

@inproceedings{chen2021learning,
  title={Learning Self-Modulating Attention in Continuous Time Space with Applications to Sequential Recommendation},
  author={Chen, Chao and Geng, Haoyu and Yang, Nianzu and Yan, Junchi and Xue, Daiyue and Yu, Jianping and Yang, Xiaokang},
  booktitle={Proceedings of the International Conference on Machine Learning (ICML '21)},
  pages={1606--1616},
  year={2021},
  organization={PMLR}
}

@article{chen2023easydgl,
  title={EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning},
  author={Chen, Chao and Geng, Haoyu and Yang, Nianzu and Yang, Xiaokang and Yan, Junchi},
  journal={arXiv preprint arXiv:2303.12341},
  year={2023}
}
