
Distributed weights (builder) #547

Open · darribas opened this issue Aug 9, 2023 · 5 comments
Labels: graph

Comments

darribas (Member) commented Aug 9, 2023

As the new generation of weights comes to fruition (#534), I wanted to open an issue to collate a proposal and some ideas @martinfleis and I have been fleshing out over the past couple of years. If nothing else, this can be a space to discuss whether it'd make sense to have this on the roadmap at all and, if so, how it can be squared with the ongoing parallel plans for weights.

Idea: support (at least) part of the weights functionality on parallelisable and distributed data structures

Why: building weights for larger-than-memory datasets would lay the foundation for spatial analytics (i.e., PySAL functionality) at planetary scale. My own take is that, over the last ten years, we've gotten away without this because RAM and CPUs have grown faster than the data we were using. I think this is changing: we can now access significantly bigger datasets (even just at the national scale) on which we'd like to run PySAL.

How: our proposal (and this is very much up for debate too!) is to build functionality that represents weights as adjacency matrices stored as dask.dataframe objects, which don't need to live fully in memory. To build them, we could rely on dask_geopandas.GeoDataFrame objects. This approach has a couple of advantages (a minimal sketch follows the list below):

  • Neither the input nor the output needs to be fully in memory, so, in theory, you can scale horizontally (more cores, more machines, the same RAM per machine) as much as you want
  • It offloads most of the hard non-spatial work to existing, pretty robust tools (e.g., Dask), further integrates PySAL into the broader ecosystem, and lets us focus on the blockers particular to the problem at hand (i.e., computing and storing spatial relationships)
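
A minimal sketch of what this could look like, assuming dask-geopandas' distributed sjoin is the mechanism for finding intersecting pairs; the file path and the focal/neighbor column names are illustrative, not an agreed API:

```python
import dask_geopandas

# Larger-than-memory polygons, read lazily in chunks
ddf = dask_geopandas.read_parquet("polygons/*.parquet")

# Bounding geometries per chunk, so only interacting chunks get joined
ddf.calculate_spatial_partitions()

# Every intersecting (focal, neighbor) pair becomes one row of a lazy,
# dask.dataframe-backed adjacency table (queen-style contiguity for a
# planar set of polygons)
pairs = ddf[["geometry"]].sjoin(ddf[["geometry"]], predicate="intersects")
adjacency = (
    pairs.reset_index()  # assumes an unnamed default index
    .rename(columns={"index": "focal", "index_right": "neighbor"})
)
# Drop self-pairs: every polygon intersects itself
adjacency = adjacency[adjacency["focal"] != adjacency["neighbor"]]

# Nothing has been materialised yet; compute only when (and if) needed
# adjacency[["focal", "neighbor"]].compute()
```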

If the above seems reasonable, probably the main technical blocker is defining spatial relationships across different chunks (for geometries within the same chunk it's straightforward), and possibly merging the results at the end. I know Martin had some ideas here, and there is precedent we built, out of necessity and in very much ad-hoc ways, for the Urban Grammar here (note this all predates the 0.1 release of dask-geopandas, so we didn't have a dask_geopandas.GeoDataFrame to build upon; some of it may be redundant, or it may be possible to streamline it notably).
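
To make the blocker concrete, here is a rough sketch of the split, assuming the new libpysal graph module from #534 for the in-memory, within-chunk part; the cross-chunk pass is exactly the open question, and the column names/dtypes in `meta` are assumptions:

```python
import dask_geopandas
from libpysal import graph

def within_chunk(chunk):
    # Plain in-memory contiguity among geometries of a single chunk;
    # labels come from the (global) dataframe index, so per-chunk
    # results can be concatenated safely
    g = graph.Graph.build_contiguity(chunk)
    return g.adjacency.reset_index()

ddf = dask_geopandas.read_parquet("polygons/*.parquet")

# Embarrassingly parallel pass: no chunk needs to see another chunk
within = ddf.map_partitions(
    within_chunk,
    meta={"focal": "int64", "neighbor": "int64", "weight": "int64"},
)

# The missing piece: pairs whose members live in *different* chunks.
# Only geometries touching their chunk's boundary can have such
# neighbours, so a second, much smaller pass over those edge
# geometries would fill the gaps before merging with `within`.
```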

darribas added the graph label on Aug 9, 2023
ljwolf (Member) commented Aug 9, 2023

The new stuff focuses on the adjacency dataframe as its core data structure & keeps methods quite lean to make exactly this possible... or at least we hope!

We'd still need a testbench to profile whether we need to .collect()/.compute() things (like .lag()), or when a method might try to persist data. I think we should profile this using a standard dask dataframe of an adjacency table, sending it directly to Graph(). I'll add it to the roadmap.
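
For instance, the testbench could start from something like the following, assuming the Graph API exposes a from_adjacency-style entry point taking focal/neighbor/weight columns; whether the lazy frame can be passed without a full .compute() is precisely what would be measured:

```python
import dask.dataframe as dd
from libpysal import graph

# A pre-built adjacency table with focal/neighbor/weight columns,
# stored partitioned on disk (hypothetical path)
adj = dd.read_parquet("adjacency/*.parquet")

# Eager baseline: materialise everything, then build the Graph
g_eager = graph.Graph.from_adjacency(adj.compute())

# The distributed question: does this force a full compute/persist,
# or can methods like .lag() stay lazy?
# g_lazy = graph.Graph.from_adjacency(adj)
```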

For a distributed constructor, I think the existing rook()/queen() will already work on a distributed GeoDataFrame() via its sindex, but vertex_set_intersection() won't. I'm not sure about the KNN/range queries behind KNN/DistanceBand, or about an out-of-core Delaunay() and the subgraphs thereof...

martinfleis (Member) commented

> I think the existing rook()/queen() will already work on a distributed GeoDataFrame() via its sindex

Nope, dask_geopandas.GeoDataFrame does not have the sindex functionality that would work in a distributed way.

jGaboardi (Member) commented

> I think the existing rook()/queen() will already work on a distributed GeoDataFrame() via its sindex

> Nope, dask_geopandas.GeoDataFrame does not have the sindex functionality that would work in a distributed way.

Do you know if it is planned? (or even feasible?)

martinfleis (Member) commented

> Do you know if it is planned? (or even feasible?)

Planned would be too strong a word. We have discussed it and concluded that it may be feasible, but it may also be better to guide people to use two steps: first an sindex query over the partitions, then an sindex query over the geometries within the matching subset of partitions. That is what the current implementation of the distributed sjoin does, and I suppose it is what @darribas was using in his work on disk-based areal interpolation. I can't promise anything on this front; dask-geopandas is not a priority for any of us at the moment.
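
A rough sketch of those two steps, assuming a dask_geopandas.GeoDataFrame with spatial partitions computed; the file path and query geometry are hypothetical:

```python
import dask.dataframe as dd
import dask_geopandas
from shapely.geometry import Point

ddf = dask_geopandas.read_parquet("polygons/*.parquet")  # hypothetical path
geom = Point(0, 0).buffer(1)  # hypothetical query geometry

# Step 1: sindex query over the partition bounding geometries to find
# the (hopefully few) chunks that can interact with `geom` at all
ddf.calculate_spatial_partitions()
hits = ddf.spatial_partitions.sindex.query(geom, predicate="intersects")

# Step 2: sindex query over the geometries inside those chunks only
# (assumes at least one partition was hit)
candidates = dd.concat([ddf.partitions[i] for i in hits]).compute()
matches = candidates.iloc[
    candidates.sindex.query(geom, predicate="intersects")
]
```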

darribas (Member, Author) commented Aug 9, 2023 via email
