unix timestamp in EEG data using labstreaminglayer

Hello all,
I am currently working on the neuromarketing dataset available at this link: https://figshare.com/articles/dataset/NeuMa_Raw_A_multimodal_Neuromarketing_dataset/22117001/3. It is described in this article: https://www.nature.com/articles/s41597-023-02392-9. According to the authors, they used https://github.com/labstreaminglayer/App-LabRecorder to record the data. I am struggling to understand how the timestamps work. In the article, the authors say they use Unix timestamps, and the timestamps have the form 3.329729360250185e+05. I tried to convert one back using datetime(t, 'ConvertFrom', 'posixtime'), but it gives me nonsense. Has anyone worked with this dataset, or can someone help me with this?
Thank you very much in advance

11 Comments

Posix timestamps are seconds from Jan 1, 1970, and should be much larger than 1E5 in magnitude. For example,
posixtime(datetime(now,'ConvertFrom','datenum'))
ans = 1.7617e+09
datetime(ans,'ConvertFrom','posixtime')
ans = datetime
28-Oct-2025 19:49:10
shows the Posix time for just now.
The 1E5 values are much more in the timeframe of datenum, but they would be about 7E5 for the current date ...
now
ans = 7.3992e+05
You need to find out what the actual date of the data you're trying to read is. The link above to the Nature paper failed, btw...
Thank you so much for the reply. This is the link to the article, hopefully it works: https://www.nature.com/articles/s41597-023-02392-9
I am attaching a picture here, with the description of one of the stream of the dataset.
Perhaps 3.329729360250185e+05 = 332972.936 seconds, which equals approximately 92.49 hours or 3.85 days (which would match a typical PC runtime). Then these values provide much more accurate timestamps relative to the LSL epoch (typically system boot time or a similar reference point). And somewhere in the XDF file is some reference to what that epoch is, e.g. clock_offsets or similar.
This interpretation would mean:
  • first_timestamp - created_at = 188.706 seconds ≈ 3.1 minutes
  • last_timestamp - first_timestamp = 359.997 seconds ≈ 6 minutes
Does that match your expectation of those data sets?
Do you have a rough idea of the dates/years when those data sets were collected? Or the duration of the test?
Perhaps they encode hours since 1st Jan 1970, which would give created_at = 332,784.219 hours = 31st Dec 2007, 21:13:08 UTC... a rather odd time to start such an experiment. That academic needed a holiday.
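If the stamps really are on the LSL local clock, the usual way to relate them to wall-clock time is through the recorded clock offsets. A minimal sketch follows; note the field names (ET_stream.clock_offsets.value in particular) are assumptions about what your XDF loader produced, not the authors' documented layout, so check them against your actual struct:

```matlab
% Sketch only: field names below are assumptions, verify against your data.
lslStamps = ET_stream.time_stamps;          % seconds on the LSL local clock
offsets   = ET_stream.clock_offsets.value;  % recorder-vs-stream clock offsets (assumed field)
meanOff   = mean(offsets);                  % simple average; LSL tooling interpolates instead
recStamps = lslStamps + meanOff;            % stamps mapped onto the recorder's clock
% Only if the recorder's clock were itself Unix time would this be meaningful:
datetime(recStamps(1), 'ConvertFrom', 'posixtime', 'TimeZone', 'UTC')
```

If the recorder's clock is also a since-boot clock, this still will not give you calendar dates; it only aligns the streams with each other.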
Lax
Lax on 28 Oct 2025
Edited: Lax on 28 Oct 2025
Hello Stephen23. Thank you very much for your explanation. This is my first time working with this publicly available dataset.
According to the figshare repository the authors shared, the first version of the data was uploaded on 2023-02-17.
It is not clear to me what the duration of the test was, but 6 minutes sounds reasonable.
There is indeed a clock_offsets object in the dataset; what would that object tell me?
Thank you once again
"There is indeed a clock_offsets object in the dataset; what would that object tell me?"
I have no idea, I do not have your data. But you can open that 1x1 struct in the variables viewer and take a look at it.
Or upload the data here by saving it in a MAT file and then clicking the paperclip button.
Hello Stephen23, I am clipping the ET_stream file as a .mat file here.
Many thanks again
dpb
dpb on 29 Oct 2025
Edited: dpb on 29 Oct 2025
"... the first version of the data was uploaded in 2023-02-17."
But when the data were uploaded isn't necessarily the date of the data itself, is it?
Somewhere in the documentation for the dataset there has to be a clear definition of the data structure; keep digging until you find it.
Looking at the RawDataProcessing.mlx file, it looks like everything is plotted against sample number. ET_stream.time_stamps is never used. Any plots that require sample time are supplied the sample frequency: EEG_stream.info.nominal_srate.
I think you can process this data without needing to figure out the time stamps.
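For example, a relative time axis can be built from the nominal rate alone. A minimal sketch, assuming the field names used elsewhere in this thread (EEG_stream.info.nominal_srate, EEG_stream.time_series; the channel layout is an assumption):

```matlab
% Build a relative time axis from the sampling rate; no absolute stamps needed.
fs = str2double(EEG_stream.info.nominal_srate);  % e.g. 300 Hz for the EEG stream
x  = EEG_stream.time_series(1,:);                % first channel (layout assumed)
t  = (0:numel(x)-1)/fs;                          % seconds since the start of the stream
plot(t, x), xlabel('Time since stream start (s)'), ylabel('Amplitude')
```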
Hello Cris,
Many thanks for your comments. The issue for me is that later on the data are processed using segmentation. There are products to which multiple EEG segments correspond, and the authors suggest discarding the segments that are shorter than 3 seconds; that is why I need the timestamps.
There's a link to some m-files to process the EEG and other data streams; surely inside there will be the code to deal with the time stamps? Or, as @Cris LaPierre notes, they didn't use it, either.
This is not really a MATLAB-related issue at all, however... is there not some other users' group where the question about the data structure itself would be more appropriate (and more likely to find somebody who actually knows)?
As far as the segments being of any given length, the sampling rate is given for each signal (some were 300 Hz, eye movement 120(?) Hz, I think I recall seeing). So, you can compute the length of the signal by the product of the sampling interval times a number of samples regardless of the time stamp itself. From that you then could probably back calculate what the time stamps are.
FsEEG=300; % EEG sampling rate
dt=1/FsEEG; % EEG sample time, s
3/dt % Samples/3 sec
ans = 900.0000
3*300 % or Seconds * samples/S ==> samples, too.
ans = 900
But, the actual date/time the data were taken would be totally immaterial to any analysis.
Between sample rate and number of samples, you can determine which segments are less than 3 seconds.
load ET_stream.mat
fs = str2double(ET_stream.info.nominal_srate)
fs = 600
nS = str2double(ET_stream.info.sample_count)
nS = 43199
segLength = nS/fs % length in seconds
segLength = 71.9983
I don't know enough about the equipment to know if it makes more sense to use nominal_srate or effective_srate. I selected nominal because that is what is used in the RawDataProcessing.mlx file.
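Building on that, the 3-second rule from the paper can be applied with sample counts alone. A sketch with made-up segment boundaries (segStart/segEnd are hypothetical; they would come from your own segmentation step, not from any field in the dataset):

```matlab
fs = str2double(EEG_stream.info.nominal_srate);  % sampling rate, e.g. 300 Hz
segStart = [1 950 2100];                         % example start indices (made up)
segEnd   = [900 2050 2500];                      % example end indices (made up)
segDur   = (segEnd - segStart + 1)/fs;           % segment durations in seconds
keep     = segDur >= 3;                          % keep = 1 1 0 (third segment < 3 s)
```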

Asked: Lax on 28 Oct 2025
Edited: on 30 Oct 2025

Community Treasure Hunt

Find the treasures in MATLAB Central and discover how the community can help you!

Start Hunting!