ENH Create transfer learning tutorial #580
base: master
Conversation
Co-authored-by: Bru <a.bruno@aluno.ufabc.edu.br>
Codecov Report
@@           Coverage Diff           @@
##           master     #580   +/-   ##
=========================================
  Coverage   84.12%   84.12%
=========================================
  Files          67       67
  Lines        5437     5437
=========================================
  Hits         4574     4574
  Misses        863      863
=========================================
We have also added the code to download and prepare both datasets (TUAB and NMT), followed by the training and fine-tuning steps. The code works with the newest TUAB dataset and Braindecode. Please let me know if you have any questions. APD_EEG repo.
Yes, @MohammadJavadD, you read my mind. This is my blocker with this PR and the NMT dataset PR. I am thinking of something....
Hi @MohammadJavadD,
I made a quick initial review. We need a little more text/concepts and some figures. Maybe include a static figure from your paper or mine.
Can you work a little on the text addressed by these comments?
This tutorial shows how to perform transfer learning using braindecode.
Indeed, it is known that the best augmentation to use often depends on the task
or phenomenon studied. Here we follow the methodology proposed in [1]_ on the
openly available BCI IV 2a Dataset and another dataset.
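The transfer-learning recipe the tutorial is built around (pretrain a network on a large dataset, then fine-tune it on a smaller target dataset) can be sketched in plain PyTorch. This is a minimal illustrative stand-in, not braindecode's actual Deep4Net or its API; all layer names and sizes here are assumptions.

```python
from torch import nn

# Hypothetical stand-in for a pretrained EEG model; the layer sizes
# (22 channels in, 2 classes out) are illustrative assumptions.
model = nn.Sequential(
    nn.Conv1d(22, 16, kernel_size=5),  # feature extractor, "pretrained" on the large dataset
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(16, 2),                  # classification head
)

# Transfer learning step: freeze the feature extractor ...
for layer in list(model)[:-1]:
    for p in layer.parameters():
        p.requires_grad = False

# ... and re-initialise only the head for the small target dataset
model[-1] = nn.Linear(16, 2)

trainable = [p for p in model.parameters() if p.requires_grad]
print("Trainable tensors:", len(trainable))  # only the new head's weight and bias
```

Fine-tuning then optimizes only `trainable`, so the representations learned on the large dataset are reused on the small one.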
Can you replace this explanation, @MohammadJavadD? Something like these examples:
- https://braindecode.org/stable/auto_examples/model_building/plot_how_train_test_and_tune.html#sphx-glr-auto-examples-model-building-plot-how-train-test-and-tune-py
- https://braindecode.org/dev/auto_examples/model_building/plot_hyperparameter_tuning_with_scikit-learn.html
- https://braindecode.org/dev/auto_examples/advanced_training/plot_data_augmentation_search.html#sphx-glr-auto-examples-advanced-training-plot-data-augmentation-search-py
REPLACE... Data augmentation can be a step in training deep learning models.
For decoding brain signals, recent studies have shown that artificially
generating samples may increase the final performance of a deep learning model [1]_.
Other studies have shown that data augmentation can be used to cast
a self-supervised paradigm, presenting a more diverse
view of the data, both with pretext tasks and contrastive learning [2]_.

REPLACE...

Figure about transfer learning.
same here
# | ||
# ### Load and save the raw recordings | ||
# | ||
# Here we assume you already load and preprocess the raw recordings for both TUAB and NMT datasetsand saved the file in `TUAB_path' and 'NMT_path' respectively. To read more see this notebook [here](https://braindecode.org/stable/auto_examples/applied_examples/plot_tuh_eeg_corpus.html) |
# Here we assume you already load and preprocess the raw recordings for both TUAB and NMT datasetsand saved the file in `TUAB_path' and 'NMT_path' respectively. To read more see this notebook [here](https://braindecode.org/stable/auto_examples/applied_examples/plot_tuh_eeg_corpus.html)
# Here, we assume you have already loaded and preprocessed the raw recordings for both TUAB and NMT datasets and saved the file in `TUAB_path` and `NMT_path` respectively. To read more, see how to load the Temple University Dataset in this braindecode [tutorial](https://braindecode.org/stable/auto_examples/applied_examples/plot_tuh_eeg_corpus.html)
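The idea of this step is to preprocess once, save to disk, and have all later steps reload from `TUAB_path` / `NMT_path` instead of re-running preprocessing. Braindecode ships its own serialization helpers for windowed EEG datasets; the sketch below only illustrates the save-then-reload pattern with stdlib JSON and hypothetical per-recording metadata:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical stand-in for the preprocessed recordings; in the real
# tutorial these would be braindecode datasets saved under
# TUAB_path and NMT_path by its serialization utilities.
recordings = {
    "TUAB": [{"subject": "s1", "pathological": True}],
    "NMT": [{"subject": "n1", "pathological": False}],
}

root = Path(tempfile.mkdtemp())
paths = {}
for name, recs in recordings.items():
    path = root / f"{name}_path.json"
    path.write_text(json.dumps(recs))
    paths[name] = path

# Later steps reload the saved data instead of preprocessing again
tuab = json.loads(paths["TUAB"].read_text())
print(len(tuab), tuab[0]["pathological"])
```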
######################################################################
# Target is being set to pathological
# -------------------------------------
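To make the target concrete: each recording is labelled normal or abnormal (pathological), and the decoding task is binary classification of that label. A minimal pure-Python sketch, with a hypothetical description table in place of the dataset's real metadata:

```python
# Hypothetical per-recording descriptions; braindecode datasets expose
# a similar description table for each recording.
descriptions = [
    {"subject": "s1", "report": "abnormal EEG"},
    {"subject": "s2", "report": "normal EEG"},
]

# Binary target: 1 if the recording is labelled pathological, else 0
targets = [int("abnormal" in d["report"]) for d in descriptions]
print(targets)  # → [1, 0]
```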
The concept of target needs to be clear at the beginning of the tutorial.
# We split the recordings by subject into train, validation and
# testing sets.
#

# split based on train split from dataset
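Splitting on the dataset's predefined boolean train flag keeps all recordings of a subject on the same side of the split, which avoids subject leakage. A small sketch with hypothetical per-recording metadata:

```python
# Hypothetical per-recording metadata: subject id, the dataset's
# predefined train/eval flag, and the pathological label.
records = [
    {"subject": "s1", "train": True, "pathological": True},
    {"subject": "s1", "train": True, "pathological": True},
    {"subject": "s2", "train": False, "pathological": False},
]

# Split on the boolean flag so each subject stays in exactly one set
train_set = [r for r in records if r["train"]]
eval_set = [r for r in records if not r["train"]]

train_subjects = {r["subject"] for r in train_set}
eval_subjects = {r["subject"] for r in eval_set}
assert train_subjects.isdisjoint(eval_subjects)  # no subject leakage
print(len(train_set), len(eval_set))  # → 2 1
```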
Explain a little more about the True and False here, related to the abnormal classification.
# -------------------------------------
# We can now create the deep learning model.
# In this tutorial, we use DeepNet introduced in [4].
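The general shape of such a deep ConvNet (temporal filtering, then spatial filtering over channels, then pooling and a dense head) can be sketched in plain PyTorch. This is an illustrative stand-in, not braindecode's Deep4Net; channel count, window length, and filter sizes below are assumptions and would be tuned for the real datasets:

```python
import torch
from torch import nn

n_chans = 21   # assumed EEG channel count after preprocessing
n_classes = 2  # normal vs. pathological

# Illustrative deep ConvNet stand-in (not braindecode's Deep4Net)
model = nn.Sequential(
    nn.Conv2d(1, 25, kernel_size=(1, 10)),        # temporal filtering
    nn.Conv2d(25, 25, kernel_size=(n_chans, 1)),  # spatial filtering over channels
    nn.BatchNorm2d(25),
    nn.ELU(),
    nn.MaxPool2d(kernel_size=(1, 3)),
    nn.Flatten(),
    nn.LazyLinear(n_classes),                     # classification head
)

x = torch.randn(4, 1, n_chans, 600)  # batch of 4 windows, 600 samples each
out = model(x)
print(out.shape)  # one score per class per window
```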
Can you discuss a little more about the choice of the parameters?
You can directly cite your paper and say something like "as investigated in [1] ...".
print(
    "Number of parameters = ",
    sum(p.numel() for p in model.parameters() if p.requires_grad),
)
print(
    "Number of parameters = ",
    sum(p.numel() for p in model.parameters() if p.requires_grad),
)
print(clf)
This will print the parameters because of EEGClassifier.
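The suggested `print(clf)` shows the module structure, while the `numel` sum counts trainable parameters and works on any `nn.Module`. A minimal sketch on a toy model (not the tutorial's network) where the count can be checked by hand:

```python
from torch import nn

# Toy model: a single linear layer with 22 inputs and 2 outputs,
# so 22 * 2 weights + 2 biases = 46 trainable parameters.
model = nn.Linear(22, 2)

n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print("Number of parameters =", n_params)  # → 46
```

Frozen parameters (`requires_grad = False`) are excluded, so after freezing layers for fine-tuning this count drops accordingly.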
######################################################################
# Conclusion
# -------------------------------------
#
# In this example, we used transfer learning (TL) as a way to learn
# representations from a large EEG dataset and transfer them to a smaller dataset.
# You can put one of the results from your paper here.
You could add more information about why this is important, and similar points.