Bug 3316 - add sampling rate information to artifact structures
Reported: 2017-06-21 11:35:00 +0200
Last modified: 2017-06-21 13:50:39 +0200
Thomas Hartmann - 2017-06-21 11:35:55 +0200

Hi,

I would like to propose adding a sampling-rate field to the artifact definitions produced by the ft_artifact_* functions and ft_databrowser. ft_rejectartifact could then check whether the sampling rates of the artifact definition and the data match; if they do not, it could either throw an error or automatically convert the artifact boundaries, which are expressed in samples.

There are use cases in which data is resampled during the analysis. At the moment, applying ft_rejectartifact to data using artifact definitions whose sampling rate does not match is not detected (and not detectable!). This is a "silent error" that goes unnoticed and might have a severe impact on the analysis.

The use case becomes apparent when analyzing data recorded at a high sampling rate (e.g., 10 kHz), which is necessary for phenomena like the FFR. Artifact rejection on such data is very slow and takes up a lot of RAM. Downsampling the data for this step helps a lot without sacrificing the accuracy of either automatic or manual artifact identification, since artifacts are not that high in frequency. However, the resulting artifact definitions must eventually be applied to the original data.

I would be happy to provide a prototype implementation for discussion if the developers consider this proposal sensible.

Best,
Thomas
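The proposed conversion could look roughly like the following. This is only a sketch for discussion, written in Python for concreteness rather than MATLAB; the function name is hypothetical, and it assumes FieldTrip's convention of 1-based Nx2 matrices of [begsample, endsample] pairs:

```python
def convert_artifact_samples(artifact, fs_artifact, fs_data):
    """Rescale [begsample, endsample] pairs defined at fs_artifact to
    sample indices at fs_data (hypothetical helper, sketch only)."""
    if fs_artifact == fs_data:
        # sampling rates match: nothing to convert
        return [list(pair) for pair in artifact]
    ratio = fs_data / fs_artifact
    converted = []
    for beg, end in artifact:
        # with 1-based indexing, sample n spans time [(n-1)/fs, n/fs),
        # so begin indices map via (n-1)*ratio and end indices via n*ratio
        converted.append([round((beg - 1) * ratio) + 1,
                          round(end * ratio)])
    return converted
```

With an explicit sampling-rate field on the artifact structure, ft_rejectartifact could apply such a conversion transparently, or at least raise an error instead of silently misplacing the artifact boundaries.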
Robert Oostenveld - 2017-06-21 12:40:56 +0200

I understand and appreciate the problem, but I am not sure this is the best solution. There are other cases where artifact rejection would result in silent errors, e.g. after appending data and/or after reconstructing sampleinfo. Perhaps another (not yet well thought through) idea: how about adding a "sampleinfoorig" field when resampling?
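One possible reading of the "sampleinfoorig" idea, again sketched in Python purely for illustration (the field names and the helper are hypothetical, not existing FieldTrip API): when resampling, keep the trial boundaries expressed in original samples alongside the new ones, so downstream functions can detect a mismatch and map artifact definitions back.

```python
def resample_sampleinfo(sampleinfo, fs_orig, fs_new):
    """Sketch: compute resampled trial boundaries while retaining the
    original ones as 'sampleinfoorig' (hypothetical field names)."""
    ratio = fs_new / fs_orig
    sampleinfo_new = []
    for beg, end in sampleinfo:
        # same boundary convention as above: 1-based begin via
        # (n-1)*ratio, end via n*ratio
        sampleinfo_new.append([round((beg - 1) * ratio) + 1,
                               round(end * ratio)])
    return {
        'sampleinfo': sampleinfo_new,          # boundaries at fs_new
        'sampleinfoorig': [list(p) for p in sampleinfo],  # at fs_orig
        'fsample': fs_new,
    }
```

Keeping the original boundaries would make silent mismatches detectable not only after resampling but also, in principle, after appending or reconstructing sampleinfo, since the provenance of the sample indices would travel with the data.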