
Learning rate decay #291

Open
djsaunde opened this issue Jun 28, 2019 · 2 comments
Labels
enhancement New feature or request

Comments

@djsaunde
Collaborator

We have no in-library implementation of learning rate decay. It could be implemented similarly to weight decay in Connection objects, but in LearningRule objects, operating on nu.
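A minimal sketch of what this could look like, assuming a LearningRule-like class in which nu holds the (pre, post) learning rates; the class and the nu_decay parameter below are hypothetical illustrations, not BindsNET's actual API:

```python
import torch

class DecayingLearningRule:
    # Hypothetical sketch: a learning rule whose learning rate `nu`
    # decays multiplicatively on each update, mirroring how weight
    # decay shrinks weights in Connection objects.
    def __init__(self, nu=(1e-4, 1e-2), nu_decay=0.0):
        self.nu = torch.tensor(nu)  # (pre, post) learning rates
        self.nu_decay = nu_decay    # assumed: fraction of nu removed per update

    def update(self):
        # Exponential decay of the learning rate, analogous to
        # `w *= 1 - weight_decay` for connection weights.
        if self.nu_decay:
            self.nu *= 1.0 - self.nu_decay
```

For example, with nu_decay = 1e-4, both entries of nu shrink by a factor of (1 - 1e-4) on every update.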

djsaunde added the enhancement label on Jun 28, 2019
@tomking
Contributor

tomking commented Feb 24, 2021

Hello, sorry if this is a naive question; I am just starting to learn about this library.

Would it make sense for this feature to support defining individual learning rate decays for the pre- and post-synaptic updates, in addition to applying the same decay value to both?
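For illustration, one way the proposal could extend the sketch above is to give each component of nu its own decay factor; again, the class and parameter names here are hypothetical, not an existing BindsNET interface:

```python
import torch

class PrePostDecayRule:
    # Hypothetical sketch: independent decay factors for the pre- and
    # post-synaptic learning rates, as proposed above.
    def __init__(self, nu=(1e-4, 1e-2), nu_decay=(0.0, 0.0)):
        self.nu = torch.tensor(nu)              # (pre, post) learning rates
        self.nu_decay = torch.tensor(nu_decay)  # (pre, post) decay fractions

    def update(self):
        # Each component of nu decays at its own rate; passing the same
        # value for both entries recovers the shared-decay behavior.
        self.nu *= 1.0 - self.nu_decay
```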

@Hananel-Hazan
Collaborator

Yes, it may help, but for that we would need to find relevant biological evidence for this separate adjustment, and also an application that can utilize it.
