huge volume segy data loading #118
SEG-Y isn't designed for performant data access at this scale. You can use the CLI converter to convert the data to NetCDF format, which allows lazy access to the data from disk. Alternatively, you could try the ZFP compression format or another fast-access format for cube data.
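Lazy access means only the slices you actually index are read from disk, so the full cube never has to fit in RAM. The same principle can be sketched with NumPy's `memmap` (an illustrative stand-in, not segysak's NetCDF reader):

```python
import numpy as np

# Write a small 3-D cube to disk, then re-open it as a memory map.
# A memmap reads only the slices you index, so the whole cube never
# has to be loaded into memory -- the idea behind lazy NetCDF access.
shape = (10, 20, 30)  # stand-in for a real (iline, xline, sample) cube
cube = np.arange(np.prod(shape), dtype=np.float32).reshape(shape)
cube.tofile("cube.bin")

lazy = np.memmap("cube.bin", dtype=np.float32, mode="r", shape=shape)
trace = np.array(lazy[3, 5, :])  # only this one trace is read from disk
print(trace.shape)  # (30,)
```

Libraries like xarray apply the same chunked, on-demand reading to NetCDF files, which is why the converted file can be explored without a 42 GiB allocation.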
I have tried converting it to NetCDF format, but it failed with the error message: "No space left on device"
This sounds like you don't have enough free hard drive space to create the NetCDF file. Can you move the file to a larger hard drive or free up space on your existing disk?
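Before re-running the conversion, it is worth checking that the output drive has at least as much free space as the uncompressed cube (for float32, roughly 4 bytes per sample). A minimal check with Python's standard library:

```python
import shutil

# Check free space on the drive where the NetCDF file will be written.
# Replace "." with the actual output directory if it is on another disk.
usage = shutil.disk_usage(".")
free_gib = usage.free / 2**30
print(f"Free space: {free_gib:.1f} GiB")
```

If the reported free space is below the cube size (plus some headroom for format overhead), the conversion will fail with "No space left on device" again.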
Use this to compress your seismic: https://github.com/equinor/seismic-zfp (pip install works on most platforms, but not on Mac M1, I'm afraid) and then use
SEGY-SAK v0.5 adds support for large SEG-Y files via lazy loading of data. Alternatively, use a different file format such as ZFP, ZGY, or OpenVDS.
Hello!
I am currently using segysak to work with seismic data in a Jupyter notebook. I get a memory error when I try to load my cube, which is nearly 24 GB, with the segy_loader function. Here is the error:
"MemoryError: Unable to allocate 42.0 GiB for an array with shape (2501, 2001, 2251) and data type float32"
How can I load such a huge seismic volume?
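The 42 GiB figure follows directly from the shape in the error message: a dense float32 array needs 4 bytes per sample, so the full cube cannot fit in typical workstation RAM. The arithmetic:

```python
# Why the loader asks for 42 GiB: size of a dense float32 array
# with the shape reported in the MemoryError.
shape = (2501, 2001, 2251)
n_samples = shape[0] * shape[1] * shape[2]
size_bytes = n_samples * 4            # float32 = 4 bytes per sample
size_gib = size_bytes / 2**30
print(f"{size_gib:.1f} GiB")  # ~42.0 GiB
```

Note this is larger than the ~24 GB SEG-Y file itself, because the loader materializes the traces as one contiguous dense cube.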