
MAT: Effective Link Prediction via Mutual Attention Transformer

EasyChair Preprint no. 11464

4 pages · Date: December 6, 2023

Abstract

The Data Science and Advanced Analytics (DSAA) 2023 challenge [1] calls for link prediction methods that address problems on network-structured data, such as network reconstruction and network development, using articles from Wikipedia. For this challenge, we propose the Mutual Attention Transformer (MAT) to predict whether a link exists between two Wikipedia pages. Our method placed 5th on the public test leaderboard and 4th on the private test leaderboard. Our code will be made publicly available to ease experimental re-implementation.
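
The abstract does not detail the architecture, but mutual attention for pairwise link prediction can be pictured as the two pages' representations cross-attending to each other before a binary classifier scores the pair. The PyTorch sketch below is purely illustrative of that general idea; the module names, dimensions, pooling, and classifier head are assumptions, not the authors' MAT implementation.

import torch
import torch.nn as nn

class MutualAttentionLinkPredictor(nn.Module):
    """Illustrative cross-attention link predictor (not the authors' code).

    Each Wikipedia page is assumed to arrive as a sequence of token
    embeddings (e.g. from a pretrained encoder). The two sequences attend
    to each other, and the pooled result is scored as link / no-link.
    """

    def __init__(self, dim=256, heads=4):
        super().__init__()
        # Cross-attention in both directions: page A attends to page B and vice versa.
        self.attn_a_to_b = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_b_to_a = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * dim, dim),
            nn.ReLU(),
            nn.Linear(dim, 1),  # single logit: link vs. no link
        )

    def forward(self, page_a, page_b):
        # page_a: (batch, len_a, dim), page_b: (batch, len_b, dim)
        a_ctx, _ = self.attn_a_to_b(page_a, page_b, page_b)  # A queries B
        b_ctx, _ = self.attn_b_to_a(page_b, page_a, page_a)  # B queries A
        pooled = torch.cat([a_ctx.mean(dim=1), b_ctx.mean(dim=1)], dim=-1)
        return self.classifier(pooled).squeeze(-1)  # logits, shape (batch,)

if __name__ == "__main__":
    model = MutualAttentionLinkPredictor()
    a = torch.randn(2, 32, 256)  # two candidate page pairs, 32 tokens each
    b = torch.randn(2, 48, 256)
    print(torch.sigmoid(model(a, b)))  # link probabilities for each pair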

Keyphrases: attention, link prediction, transformer

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:11464,
  author       = {Van Quan Nguyen and Quang Huy Pham and Quang Dan Tran and Kien Bao Thang Nguyen and Hieu Nghia Nguyen},
  title        = {MAT: Effective Link Prediction via Mutual Attention Transformer},
  howpublished = {EasyChair Preprint no. 11464},
  year         = {2023}}