There is a bug about attention mask in source code #102

Open
Jason941 opened this issue Apr 27, 2023 · 0 comments
Comments

@Jason941

Here is the location: `/src/transformer/modeling_utils.py`, lines 192-194:

```python
exteneded_attention_mask = extended_attention_mask.to(dtype=self.dtype)
exteneded_attention_mask = (1.0 - extended_attention_mask) * -10000.0
return extended_attention_mask
```

The returned variable name is wrong! The two assignments write to a misspelled variable, `exteneded_attention_mask`, while the function returns the correctly spelled `extended_attention_mask`. As a result, the dtype cast and the `(1.0 - mask) * -10000.0` transform are silently discarded, and the original, untransformed mask is returned.
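A minimal sketch of the likely intended fix, assuming the goal is to return the transformed mask: use the correctly spelled name throughout. The class name, method name, signature, and `self.dtype` context below are assumptions for illustration; only the three quoted lines come from the file.

```python
import torch

class MaskDemo:
    """Hypothetical stand-in for the module that owns self.dtype."""
    dtype = torch.float32

    def get_extended_attention_mask(self, extended_attention_mask: torch.Tensor) -> torch.Tensor:
        # Assign to the SAME (correctly spelled) name that is returned,
        # so the cast and the additive-bias transform actually take effect.
        extended_attention_mask = extended_attention_mask.to(dtype=self.dtype)
        # Convert the {0, 1} mask into an additive bias: positions to keep
        # become 0.0, positions to mask become -10000.0, which suppresses
        # them when added to the attention scores before softmax.
        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
        return extended_attention_mask

mask = torch.tensor([[1, 1, 0]])
print(MaskDemo().get_extended_attention_mask(mask))
# kept positions -> -0.0, masked position -> -10000.0
```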

Jason941 changed the title from "There is a bug about attention mask" to "There is a bug about attention mask in source code" on Apr 27, 2023