Bug 2130 - ft_preamble_provenance can't handle large files

Status CLOSED DUPLICATE
Reported 2013-04-19 19:33:00 +0200
Modified 2019-08-10 12:03:47 +0200
Product: FieldTrip
Component: core
Version: unspecified
Hardware: PC
Operating System: Windows
Importance: P3 normal
Assigned to:
URL:
Tags:
Depends on:
Blocks:
See also:

Ingrid Nieuwenhuis - 2013-04-19 19:33:31 +0200

When calling for instance ft_timelockanalysis with a data set larger than 2 GB, I get the error below; commenting out the preamble and postamble makes it run normally. I suggest adding some sort of memory check so a function cannot error on this, or a try/catch or something. I'm using MATLAB 2012b on a 64-bit Windows PC.

Error using CalcMD5
*** CalcMD5[mex]: Input > 2^31 byte not handled yet.

Error in ft_preamble_provenance (line 53)
cfg.callinfo.inputhash = cellfun(@CalcMD5, cellfun(@mxSerialize, cellfun(@eval, ft_default.preamble, 'UniformOutput', false), 'UniformOutput', false), 'UniformOutput', false);

Error in ft_preamble (line 54)
evalin('caller', ['ft_preamble_' cmd]);

Error in ft_timelockanalysis (line 125)
ft_preamble provenance data

Error in prepareERP_slowwaves_sLoreta (line 169)
timelock = ft_timelockanalysis(cfg, data);
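A minimal sketch of the try/catch variant of this suggestion, wrapping the CalcMD5/mxSerialize pair from line 53 of ft_preamble_provenance as shown in the stack trace. The helper name safe_inputhash is hypothetical and this is not the actual FieldTrip fix:

    % Hypothetical helper wrapping the hashing step from the stack trace
    % above; a sketch of the suggested guard, not the actual FieldTrip fix.
    function hash = safe_inputhash(arg)
      try
        % serialize the variable and hash it, as ft_preamble_provenance does
        hash = CalcMD5(mxSerialize(arg));
      catch err
        % CalcMD5 (mex) fails on inputs of 2^31 bytes or more; record a
        % placeholder instead of aborting the whole analysis
        warning('could not compute provenance hash: %s', err.message);
        hash = 'unknown';
      end
    end

Line 53 would then shrink to a single pass over the helper, e.g. cfg.callinfo.inputhash = cellfun(@safe_inputhash, cellfun(@eval, ft_default.preamble, 'UniformOutput', false), 'UniformOutput', false);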


Johanna - 2013-04-22 13:39:20 +0200

Hi Ingrid, is this the same as bug 1917?


Ingrid Nieuwenhuis - 2013-04-22 19:42:21 +0200

The cause of this bug is indeed the same as bug 1917 (sorry, I missed that one). However, I'd like to urge a temporary workaround in the functions that call it, if fixing bug 1917 is difficult. I'm not familiar with what the preamble and postamble provenance do exactly, and whether they can simply be bypassed for large files for now. If skipping it is not a big deal (which I think it is not; it's just keeping track of memory/time or something, no?), I'd suggest including a check in all functions calling it as soon as possible: if the data is too big, just give a message that the time tracking, or whatever it does, is not yet possible for big data, and bypass the preamble. It's very annoying that this bug (1917) now results in an error over something trivial when using large data. I'll file it as a duplicate and paste this there as well.

*** This bug has been marked as a duplicate of bug 1917 ***
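The bypass described above could look like the sketch below, placed near the top of ft_preamble_provenance before the hashing. It assumes that ft_default.preamble holds the names of the input variables (as in the stack trace) and that whos gives a usable estimate of the serialized size; the 2^31 threshold comes from the CalcMD5 error message.

    % Sketch of an early bypass for large inputs; the byte count from whos
    % only approximates the serialized size that CalcMD5 would receive.
    toobig = false;
    for i = 1:numel(ft_default.preamble)
      info = whos(ft_default.preamble{i});  % size of each input variable
      if ~isempty(info) && info.bytes >= 2^31
        toobig = true;
      end
    end
    if toobig
      % inform the user that provenance tracking is skipped for big data,
      % then return instead of erroring inside CalcMD5
      warning('input is larger than 2^31 bytes, skipping provenance tracking');
      return
    end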


Robert Oostenveld - 2019-08-10 12:03:47 +0200

This closes a whole series of bugs that have been resolved (either FIXED/WONTFIX/INVALID) for quite some time. If you disagree, please open a new issue describing the problem on https://github.com/fieldtrip/fieldtrip/issues.