
[ENH] Allow Decoder to work with surface objects #4205

Draft · wants to merge 19 commits into main
Conversation

@ymzayek (Member) commented Jan 10, 2024

Changes proposed in this pull request:

  • Add tests for decoder methods with surface object
  • Adapt screening percentile function
  • Adapt ReNA clustering

TODO

Examples to adapt:

  • examples/02_decoding/plot_haxby_searchlight_surface.py

Contributor:

👋 @ymzayek Thanks for creating a PR!

Until this PR is ready for review, you can include the [WIP] tag in its title, or leave it as a GitHub draft.

Please make sure it is compliant with our contributing guidelines. In particular, be sure it checks the boxes listed below.

  • PR has an interpretable title.
  • PR links to a GitHub issue with the mention Closes #XXXX (see our documentation on PR structure)
  • Code is PEP8-compliant (see our documentation on coding style)
  • Changelog or what's new entry in doc/changes/latest.rst (see our documentation on PR structure)

For new features:

  • There is at least one unit test per new function / class (see our documentation on testing)
  • The new feature is demoed in at least one relevant example.

For bug fixes:

  • There is at least one test that would fail under the original bug conditions.

We will review it as quickly as possible; feel free to ping us with questions if needed.

codecov bot commented Jan 10, 2024

Codecov Report

Attention: Patch coverage is 98.21429%, with 1 line in your changes missing coverage. Please review.

Project coverage is 92.01%. Comparing base (abb80ff) to head (7f44d33).
Report is 78 commits behind head on main.

Files | Patch % | Lines
nilearn/regions/rena_clustering.py | 97.87% | 0 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #4205      +/-   ##
==========================================
+ Coverage   91.85%   92.01%   +0.16%     
==========================================
  Files         144      145       +1     
  Lines       16419    16724     +305     
  Branches     3434     3548     +114     
==========================================
+ Hits        15082    15389     +307     
+ Misses        792      766      -26     
- Partials      545      569      +24     
Flag | Coverage Δ
macos-latest_3.11_test_plotting | ?
ubuntu-latest_3.10_test_plotting | 91.80% <98.21%> (-0.06%) ⬇️
ubuntu-latest_3.11_test_plotting | 91.80% <98.21%> (?)
ubuntu-latest_3.12_test_plotting | 91.80% <98.21%> (?)
ubuntu-latest_3.8_test_min | 68.81% <98.21%> (?)
ubuntu-latest_3.8_test_plot_min | 91.47% <98.21%> (?)
ubuntu-latest_3.8_test_plotting | 91.76% <98.21%> (?)
windows-latest_3.11_test_plotting | 91.79% <98.21%> (?)
windows-latest_3.12_test_plotting | 91.79% <98.21%> (?)
windows-latest_3.8_test_plotting | 91.75% <98.21%> (?)

Flags with carried forward coverage won't be shown.

@ymzayek (Member, Author) commented Jan 15, 2024

@bthirion WDYT about handling FREM in the decoder? Is it applicable/used for surface analysis? This would require adapting ReNA. If it's not a priority, I would leave it for later.

@bthirion (Member) commented:

Adapting ReNA (and Ward) to the surface is technically easy. We should definitely do that.

@ymzayek (Member, Author) commented Jan 15, 2024

> Adapting ReNA (and Ward) to the surface is technically easy. We should definitely do that.

Well, yes and no. I think writing a new image check function that wraps check_niimg and handles surface images separately (we don't want to call load_niimg) might make this and a lot of other cases easier.
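
A minimal sketch of the kind of wrapper suggested here, assuming the experimental SurfaceImage class; the helper name check_img and the import paths are illustrative, not settled nilearn API:

```python
# Hypothetical wrapper: dispatch on image type before any Nifti-specific checks.
from nilearn._utils import check_niimg
from nilearn.experimental.surface import SurfaceImage  # assumed import path


def check_img(img, **kwargs):
    """Return surface images untouched, otherwise fall back to check_niimg."""
    if isinstance(img, SurfaceImage):
        # Surface images are not Nifti-like, so load_niimg must not be called.
        return img
    return check_niimg(img, **kwargs)
```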

@bthirion (Member) commented:

Yes, but behind that, the algorithm will be a bit different, given that the connectivity structure is then defined by the mesh and no longer by the voxel grid.

@ymzayek (Member, Author) commented Jan 17, 2024

What would be the equivalent calculation of edges and weights for surface data? _make_edges_and_weights is a blocker for FREM using ReNA, and it depends on several functions that need volumetric data. So it might be preferable to write a _make_edges_and_weights_surface function that is called from _weighted_connectivity_graph when the input is a surface object.

@bthirion (Member) commented Jan 17, 2024

Edges are the mesh edges: each triangle results in three edges.
Given some data X, weights are then simply computed as $|| X_i - X_j ||^2$ for edge (i, j).
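
A minimal sketch of that computation, assuming faces is an (n_triangles, 3) integer array of vertex indices and X is an (n_samples, n_vertices) data array; the function name only echoes the _make_edges_and_weights_surface idea above and is not the code in this PR:

```python
import numpy as np


def make_edges_and_weights_surface(X, faces):
    """Build mesh edges from triangles and weight them by squared data distance."""
    # Each triangle (i, j, k) contributes the three edges (i, j), (j, k), (i, k).
    edges = np.vstack([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [0, 2]]])
    # Remove duplicate edges shared by adjacent triangles.
    edges = np.unique(np.sort(edges, axis=1), axis=0)
    # Weight for edge (i, j) is ||X_i - X_j||^2 across samples.
    weights = np.sum((X[:, edges[:, 0]] - X[:, edges[:, 1]]) ** 2, axis=0)
    return edges, weights
```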

@ymzayek (Member, Author) commented Jan 18, 2024

The comment from #4227 (review) is meant to be here:

> We probably also have to adapt Decoder heuristics with the percentile selection? What about clustering?

- mask : :class:`numpy.ndarray`, shape (nx, ny, nz)
+ mask : :class:`numpy.ndarray`

ymzayek (Member, Author) commented:

There is nothing in this function that requires that shape.

    else:
        edges, weight = _make_edges_and_weights(X, mask_img)

    # TODO : deal with dict results if surface analysis

ymzayek (Member, Author) commented:

Right now the failure is here because edges and weights are returned in a dictionary with keys matching the surface image object's mesh and data keys for the different hemispheres.

ymzayek (Member, Author) commented:

@bthirion I'm a bit stuck here. Does it make more sense to let ReNA run on each hemisphere separately and then concatenate the results in X_train at the level of the call from the decoder?

bthirion (Member) commented:

We could create a single graph that stacks the data from the two hemispheres. This graph would have 2 connected components. This would have the advantage of making the use on surfaces transparent for other functions.
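
A minimal sketch of that stacking, assuming per-hemisphere (edges, weights) pairs are already available (for example from a helper like the one sketched above); the names and the sparse-matrix representation are illustrative choices:

```python
import numpy as np
from scipy import sparse


def stack_hemisphere_graphs(edges_left, weights_left, n_left,
                            edges_right, weights_right, n_right):
    """Combine two hemisphere graphs into one graph with 2 connected components."""
    # Shift right-hemisphere vertex indices so both hemispheres share one index space.
    edges = np.vstack([edges_left, edges_right + n_left])
    weights = np.concatenate([weights_left, weights_right])
    n_vertices = n_left + n_right
    # No edge crosses hemispheres, so the resulting graph has exactly two
    # connected components, and downstream code can stay surface-agnostic.
    connectivity = sparse.coo_matrix(
        (weights, (edges[:, 0], edges[:, 1])), shape=(n_vertices, n_vertices)
    )
    return connectivity + connectivity.T
```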

@ymzayek (Member, Author) commented Jan 19, 2024

TODO:

  • Finish ReNA clustering adaptation
  • Validate screening percentile and clustering for surface

@bthirion (Member) left a comment:

Thx! This is great.

nilearn/decoding/space_net.py (outdated, resolved)
faces : ndarray
The vertex indices corresponding to the mesh triangles.

mask : boolean

bthirion (Member) commented:

I don't see how to make sense of mask in this function, given that the edge ordering is unclear at that point?

bthirion (Member) commented:

mask is on vertices, right? Then it makes sense. But the docstring has to be improved.

@@ -171,11 +291,20 @@ def _weighted_connectivity_graph(X, mask_img):
     """
     n_features = X.shape[1]

-    edges, weight = _make_edges_and_weights(X, mask_img)
+    if isinstance(mask_img, SurfaceImage):

ymzayek (Member, Author) commented:

I did not finish this section under the conditional for surface. It probably needs to be updated according to this comment: #4205 (comment)

And then the whole pipeline needs to be validated.

@Remi-Gau added the Surface label (related to surface data or surface analysis) on Jan 23, 2024
@Remi-Gau added this to the release 0.11.0 milestone on Jan 24, 2024
@Remi-Gau removed the Blocked label on Jun 3, 2024
@Remi-Gau (Collaborator) commented Jun 3, 2024

> Examples to adapt:
>
> * [ ] examples/02_decoding/plot_haxby_searchlight_surface.py

@bthirion (Member) left a comment:

There remain a few comments to address, but it looks good overall.

assert model.scoring == "roc_auc"

model.score(X, y)
accuracy_score(y, y_pred)
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Maybe we could additionally check that the accuracy score are reasonable (close to chance I guess) ?
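
A minimal sketch of that extra check, assuming the test's y and y_pred and roughly balanced binary classes; the chance level and tolerance are illustrative choices, not values from the PR:

```python
from sklearn.metrics import accuracy_score

chance_level = 0.5  # two balanced classes
acc = accuracy_score(y, y_pred)
# Non-informative (random) data should score close to chance; a loose
# tolerance keeps the assertion from being flaky.
assert abs(acc - chance_level) < 0.3
```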

assert model.scoring == "r2"

model.score(X, y)
r2_score(y, y_pred)
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Maybe we could additionally check that the accuracy score are reasonable (close to chance I guess) ?
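
The analogous check for the regression case, again assuming the test's y and y_pred; for non-informative data R² sits near or below zero, so an upper bound is the natural "close to chance" assertion here. The threshold is an illustrative choice:

```python
from sklearn.metrics import r2_score

score = r2_score(y, y_pred)
# A model fit on data with no real signal should not explain much variance.
assert score < 0.2
```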

Labels: Surface (related to surface data or surface analysis)
Projects: None yet
Development: None yet (issues that may be closed by merging this pull request)
3 participants