
Implement Global attention mechanism #237

Open
akhilpandey95 opened this issue Apr 9, 2024 · 1 comment
Assignees
Labels
enhancement New feature or request

Comments

@akhilpandey95
Collaborator

Resources to consider:

@allaffa allaffa added the enhancement New feature or request label Apr 11, 2024
@allaffa
Collaborator

allaffa commented Apr 12, 2024

@akhilpandey95

"Global attention" can mean many things. A few examples are:

  • concatenate the global input features to the graph embedding produced by the message-passing layers
  • a slight generalization of the above: pass the global input features through an MLP, then concatenate the MLP output to the graph embedding generated by the MPNN layers
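The two options above can be sketched in plain PyTorch. This is an illustrative module, not HydraGNN's actual API: the class and argument names (`GlobalFeatureReadout`, `graph_dim`, `global_dim`, `use_mlp`) are made up for this example, and `use_mlp=False` reduces the second option to the first.

```python
import torch
import torch.nn as nn

class GlobalFeatureReadout(nn.Module):
    """Illustrative sketch (names are hypothetical, not HydraGNN's API).

    use_mlp=False: option 1 -- concatenate raw global features to the
    graph embedding from the message-passing layers.
    use_mlp=True:  option 2 -- first transform the global features with
    an MLP, then concatenate the MLP output to the graph embedding.
    """
    def __init__(self, graph_dim, global_dim, hidden_dim=16, use_mlp=True):
        super().__init__()
        self.use_mlp = use_mlp
        if use_mlp:
            self.mlp = nn.Sequential(
                nn.Linear(global_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, hidden_dim),
            )
            global_out_dim = hidden_dim
        else:
            global_out_dim = global_dim
        # Prediction head acting on the concatenated representation.
        self.head = nn.Linear(graph_dim + global_out_dim, 1)

    def forward(self, graph_embedding, global_features):
        g = self.mlp(global_features) if self.use_mlp else global_features
        return self.head(torch.cat([graph_embedding, g], dim=-1))

# Toy usage: a batch of 4 graphs, 32-dim graph embeddings, 5 global features.
readout = GlobalFeatureReadout(graph_dim=32, global_dim=5)
out = readout(torch.randn(4, 32), torch.randn(4, 5))
print(out.shape)  # torch.Size([4, 1])
```

In a real pipeline, `graph_embedding` would come from a pooling step over the MPNN node embeddings (e.g. mean pooling per graph), and the concatenation point and MLP width would be configuration choices.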
