We've no in-library implementation of learning rate decay. It could be implemented similarly to weight decay in `Connection` objects, but in `LearningRule` objects, operating on `nu`.
Hello, sorry if this is an erroneous question, I am just starting to learn about this library.
Would it make sense for this feature to support defining individual learning rate decays for pre- and post-synaptic events, in addition to applying a single decay value to both?
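Something like that could work, since `nu` is already a pair of pre-/post-synaptic rates. As a rough illustration only (this is not BindsNET's actual API; the class, parameter names, and `nu_decay` option are all hypothetical), a per-event multiplicative decay might look like:

```python
class LearningRule:
    """Hypothetical sketch of a learning rule with per-event nu decay.

    Not BindsNET's real LearningRule class; names are illustrative.
    """

    def __init__(self, nu=(1e-4, 1e-2), nu_decay=(1.0, 1.0)):
        # nu = (pre-synaptic learning rate, post-synaptic learning rate).
        self.nu = list(nu)
        # Independent multiplicative decay factors for pre/post events;
        # (1.0, 1.0) means no decay, mirroring a disabled weight decay.
        self.nu_decay = nu_decay

    def update(self):
        # Called once per training step: shrink each learning rate by its
        # own factor, analogous to how weight decay shrinks weights in
        # Connection objects each step.
        self.nu = [rate * decay for rate, decay in zip(self.nu, self.nu_decay)]


rule = LearningRule(nu=(1e-4, 1e-2), nu_decay=(0.9, 0.99))
for _ in range(10):
    rule.update()
print(rule.nu)  # pre-rate decayed by 0.9**10, post-rate by 0.99**10
```

Passing a single scalar decay could then be treated as shorthand for applying the same factor to both events, which keeps the simple case simple.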