Bug 1917 - provenance with md5sum fails for very large input variables

Status CLOSED FIXED
Reported 2013-01-03 14:36:00 +0100
Modified 2013-06-05 12:15:21 +0200
Product: FieldTrip
Component: core
Version: unspecified
Hardware: PC
Operating System: Mac OS
Importance: P2 major
Assigned to: Roemer van der Meij
URL:
Tags:
Depends on:
Blocks:
See also:

Robert Oostenveld - 2013-01-03 14:36:17 +0100

tlout = ft_timelockanalysis(cfg_tl, data);

Error using CalcMD5
*** CalcMD5[mex]: Input > 2^31 byte not handled yet.

Error in ft_preamble_provenance (line 49)
cfg.callinfo.inputhash = cellfun(@CalcMD5, cellfun(@mxSerialize, cellfun(@eval, ft_default.preamble, 'UniformOutput', false), 'UniformOutput', false), 'UniformOutput', false);

Error in ft_preamble (line 54)
evalin('caller', ['ft_preamble_' cmd]);

Error in ft_timelockanalysis (line 127)
ft_preamble provenance data
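A quick way to see whether an input will run into this limit is to compare its size in memory against the 2^31-byte bound mentioned in the error. The snippet below is a rough sketch, not FieldTrip code; it assumes the variable 'data' is in the workspace, and the size reported by whos only approximates the size of the serialized byte stream that CalcMD5 actually receives.

  % approximate check against the 2^31-byte limit of the CalcMD5 mex file
  info = whos('data');
  if info.bytes > 2^31
    fprintf('data occupies %.1f GB, which exceeds the 2^31-byte limit of CalcMD5\n', info.bytes/1e9);
  end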


Diego Lozano Soldevilla - 2013-02-13 13:47:01 +0100

*** Bug 1982 has been marked as a duplicate of this bug. ***


Ingrid Nieuwenhuis - 2013-04-22 19:42:21 +0200

*** Bug 2130 has been marked as a duplicate of this bug. ***


Ingrid Nieuwenhuis - 2013-04-22 19:43:19 +0200

Also see bug 2130: I'm not familiar with what the preamble and postamble provenance steps do exactly, or whether they can simply be bypassed for large files for now. If skipping it is not a big deal (which I think it isn't, it just keeps track of memory/time or something, no?), I'd suggest adding a check as soon as possible to all functions that call it. If the input is too big, just print a message that the tracking is not possible yet for big data and bypass the preamble. It's very annoying that this bug (1917) now produces an error over something trivial when using large data.


Robert Oostenveld - 2013-04-22 19:53:43 +0200

(In reply to comment #3) Agreed. What happens is that the data object is converted into a stream of bytes (in memory, which can itself be problematic for very large data), after which the MD5 hash is computed and stored for history keeping.
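In other words, the provenance step boils down to something like the sketch below. It uses the mxSerialize and CalcMD5 helpers named in the error above; whether they are on your path depends on the FieldTrip version, and the exact line in ft_preamble_provenance loops over all input variables rather than a single one.

  % minimal sketch of the provenance hashing step for one input variable
  bytestream = mxSerialize(data);        % serialize the MATLAB object to a byte stream in memory
  inputhash  = CalcMD5(bytestream);      % this mex call fails when the stream exceeds 2^31 bytes
  cfg.callinfo.inputhash = {inputhash};  % stored with the cfg for history keeping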


Roemer van der Meij - 2013-04-24 15:39:41 +0200

I have now added an explicit check on size, so 'big' input is skipped while we look for a replacement function that can handle larger input.
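The guard amounts to something along these lines; this is a hedged sketch of the idea rather than the exact FieldTrip code, and the threshold, variable name, and warning text are illustrative.

  % skip the provenance hash when the input is too large for the CalcMD5 mex file
  info = whos('data');
  if info.bytes > 2^31
    warning('input is too large for MD5 hashing, skipping the provenance hash');
    cfg.callinfo.inputhash = {};
  else
    cfg.callinfo.inputhash = {CalcMD5(mxSerialize(data))};
  end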


Roemer van der Meij - 2013-06-05 12:15:21 +0200

Closing time