"Out of Memory" whith calcTensor on large EBSD files #2002

Open · MicAtWork opened this issue Nov 22, 2023 · 2 comments

@MicAtWork
I'm trying to calculate the stiffness tensor for some large EBSD files I have. For files larger than ~690 MB I get "Out of memory" in MATLAB. I've tried increasing the memory with setMTEXpref('memory',32000*1024); but it doesn't help.

This is the code I use:

% crystal symmetry
CS = {...
    'notIndexed',...
    crystalSymmetry('m-3m', [2.8660 2.8660 2.8660], 'mineral', 'Iron BCC', 'color', [255 53 35]/255),...
    crystalSymmetry('m-3m', [3.6599 3.6599 3.6599], 'mineral', 'Iron FCC', 'color', 'blue')};

% load the data
ebsd = EBSD.load(FileName, CS, 'interface', 'crc', ...
    'convertEuler2SpatialReferenceFrame'); % gives a 6836x7116 EBSDsquare

% extract the orientations of the phase of interest
o = ebsd(phase).orientations; % 34426708x1 orientations

% single-crystal stiffness matrix in Voigt notation
% (C11, C12, C44 are the cubic elastic constants)
M = [C11 C12 C12   0   0   0;
     C12 C11 C12   0   0   0;
     C12 C12 C11   0   0   0;
       0   0   0 C44   0   0;
       0   0   0   0 C44   0;
       0   0   0   0   0 C44];

C = stiffnessTensor(M, o.CS, 'density', MatDensity);
[CVoigt, CReuss, CHill] = calcTensor(o, C);

The error message I get:

Out of memory.
Error in tensor/EinsteinSum (line 81)
    M1 = M1 .* M2;
Error in tensor/rotate (line 31)
  T = EinsteinSum(T,ind,R,[d -d],'keepClass');
Error in  .*  (line 32)
  r = rotate(b,a);
Error in orientation/calcTensor (line 31)
[varargout{1:nargout}] = mean(ori .* T,varargin{:});

The other files are less than 470 MB and work fine for the calculation of the stiffness matrix.

Using MTEX 5.10.2 with MATLAB R2023b Update 4 on Windows 11, on a machine with a 13th-gen Intel i5 and 64 GB RAM, with the Java heap memory set to its maximum.

Would it be possible to implement some sort of tall-array calculation for heavy files? Or to split the calculation into subsets of the orientations and then average the results?
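
A minimal sketch of the subset-and-average idea, assuming MTEX tensor objects support scalar scaling and addition as their arithmetic elsewhere suggests (chunkSize is an arbitrary value chosen to fit in memory, not an MTEX option):

    % chunked Voigt/Reuss/Hill averaging: combine per-chunk means as weighted
    % sums so that all 34 million rotated tensors never sit in memory at once
    chunkSize = 5e5;                  % arbitrary; pick to fit available RAM
    N  = length(o);
    S  = inv(C);                      % single-crystal compliance
    CV = 0 * C;                       % running weighted mean of stiffness
    SR = 0 * S;                       % running weighted mean of compliance
    for k = 1:chunkSize:N
        idx = k : min(k + chunkSize - 1, N);
        w   = numel(idx) / N;               % weight of this chunk
        CV  = CV + w * mean(o(idx) .* C);   % Voigt: mean rotated stiffness
        SR  = SR + w * mean(o(idx) .* S);   % Reuss: mean rotated compliance
    end
    CVoigt = CV;
    CReuss = inv(SR);                 % Reuss mean = inverse of mean compliance
    CHill  = 0.5 * (CVoigt + CReuss);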

@zmichels

Those certainly are large datasets.

Not a solution to the actual problem... but you might entertain using reduce() as a workaround. If a spatially downsampled version of your map would still yield a satisfactory representative stiffness tensor, you could load your data and subset every other point (or some other factor), starting from a map with a lower point density and a larger effective step size than your original.

reduce() outputs an EBSD dataset with the number of points reduced by the given factor, so reduce(ebsd, 2) yields a map with every other point from the original. Use a larger factor for greater reduction.
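
A sketch of the workaround, using the variable names from the original post ('Iron BCC' is taken from the CS definition above):

    % downsample the map, then run the same averaging on far fewer points
    ebsdRed = reduce(ebsd, 2);                 % keeps every 2nd point in x and y
    oRed = ebsdRed('Iron BCC').orientations;   % roughly a quarter of the points
    [CVoigt, CReuss, CHill] = calcTensor(oRed, C);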

@MicAtWork (Author)

MicAtWork commented Nov 22, 2023

Thanks! That works for now. Another solution was to use a random subset of orientations:
odf_subset = calcOrientations(o, 1000); % 1000 random points, change to any desired number
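
The subset then goes through the same averaging call as before (a sketch reusing C from the original post):

    % Voigt/Reuss/Hill averages estimated from the random subset
    [CVoigt, CReuss, CHill] = calcTensor(odf_subset, C);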
