Research data: Dataset (2024). Publisher: Mendeley Data. Author: Wongupparaj, Peera.

This dataset comprises EEG and behavioral data recorded from 60 Thai Buddhist monks who voluntarily participated in the research project. The behavioral data contain participant characteristics, while the EEG data provide absolute and relative powers of five frequency bands (delta, theta, alpha, beta, and gamma) during the 30-minute meditative states of the 60 monks. A 64-channel Neuroscan EEG system was used to record the full 30-minute meditation session. Additionally, the 30-minute EEG features are segmented into six 5-minute intervals ('0-5 minutes', '5-10 minutes', '10-15 minutes', '15-20 minutes', '20-25 minutes', and '25-30 minutes') so that researchers can further investigate temporal changes in brain activity during mindfulness meditation.
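The segmentation and band-power scheme described above can be sketched in Python. This is a minimal illustration, not the dataset's own processing code; the band edges and the use of Welch's method are assumptions:

```python
import numpy as np
from scipy.signal import welch

# Assumed band edges in Hz; the dataset's exact definitions may differ.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def relative_band_powers(x, fs):
    """Welch PSD of one channel -> relative power per band (fraction of 1-45 Hz total)."""
    f, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    total = psd[(f >= 1) & (f <= 45)].sum()
    return {name: psd[(f >= lo) & (f < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

def five_minute_segments(x, fs):
    """Split a 30-minute recording into six 5-minute segments."""
    n = int(5 * 60 * fs)
    return [x[i:i + n] for i in range(0, 6 * n, n)]
```

Applying `relative_band_powers` to each of the six segments returned by `five_minute_segments` yields one five-band profile per interval, which is the temporal structure the dataset exposes.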
Research data: Dataset (2024). Publisher: Mendeley Data. Author: Hemakom, Apit.

The data were collected from 66 healthy university students (21 males, 24 females in the follicular phase of the menstrual cycle, and 21 females in the luteal phase of the menstrual cycle). The Montreal Imaging Stress Task (MIST) was modified for this study: a series of computer-based mental arithmetic tasks designed to evaluate responses under control and stressful conditions. The experiments were therefore conducted in two separate sessions, two weeks apart, one per condition. Each session consisted of 7 periods: training, eyes open (EO), a mental arithmetic task (MAT) at 4 consecutive levels of difficulty (arithmetic calculation level 1, AC1, through arithmetic calculation level 4, AC4), and recovery. The control condition started with a training period to familiarize the subject with the experimental procedure, during which ECG and EEG signals were not recorded. During this period, the subject was presented with a series of computerized sample questions at the 4 difficulty levels of the mental arithmetic task. Answer choices for every question were displayed on a computer screen as a sequence of integers between 0 and 9, and the subject was asked to use a wireless computer mouse to click on the correct answer. Following the training period, the recordings started, and the subjects were asked to sit in a relaxed position without moving and to focus on a black dot displayed on a computer screen for 5 minutes (the eyes-open, EO, period). These requirements were critical to constructing a reliable EEG baseline with a minimum of artifacts caused by eye and body movements. After that, an instruction to perform mental arithmetic calculations was shown on the computer screen. The mental arithmetic task comprised 4 levels of difficulty, each lasting 5 minutes.
- Level 1 (Arithmetic Calculation Level 1, AC1): addition (+) and subtraction (-) of 3 single-digit numbers, e.g., 7-4+1.
- Level 2 (Arithmetic Calculation Level 2, AC2): addition (+), subtraction (-), and multiplication (x) of 3 single- and double-digit numbers, e.g., 6x8-30.
- Level 3 (Arithmetic Calculation Level 3, AC3): addition (+), subtraction (-), and multiplication (x) of 4 single- and double-digit numbers, e.g., 35+10-4x8.
- Level 4 (Arithmetic Calculation Level 4, AC4): addition (+), subtraction (-), multiplication (x), and division (/) of 4 single- and double-digit numbers, e.g., 96/4x2-11.

No time limit or negative feedback messages were given to the subjects. After each question, a correct/incorrect message was displayed. After the AC4 period, the subjects relaxed and sat still for 5 minutes (the recovery period). ECG and EEG signals were recorded from the beginning of the EO period until the end of the recovery period. The protocol for the mental-stress condition was the same as for the control condition, but with time-limit and social-evaluative-threat components introduced; several negative feedback messages were presented to actively induce stress.
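For illustration, questions matching the four difficulty levels above could be generated along these lines. This is a hedged sketch, not the study's actual task code; the operand ranges and expression templates are assumptions (level 4 forces exact division so that answers stay integers, as in the example 96/4x2-11):

```python
import random

def make_question(level, rng=None):
    """Generate one (expression, answer) pair for a MIST-style difficulty level."""
    rng = rng or random.Random()
    if level == 1:        # 3 single-digit numbers, + and -, e.g. 7-4+1
        nums = [rng.randint(1, 9) for _ in range(3)]
        ops = [rng.choice("+-") for _ in range(2)]
    elif level == 2:      # 3 single/double-digit numbers, +, -, x, e.g. 6x8-30
        nums = [rng.randint(1, 99) for _ in range(3)]
        ops = [rng.choice("+-x") for _ in range(2)]
    elif level == 3:      # 4 single/double-digit numbers, +, -, x, e.g. 35+10-4x8
        nums = [rng.randint(1, 99) for _ in range(4)]
        ops = [rng.choice("+-x") for _ in range(3)]
    else:                 # level 4 adds exact division, e.g. 96/4x2-11
        b = rng.randint(2, 9)
        nums = [b * rng.randint(2, 12), b, rng.randint(1, 9), rng.randint(1, 99)]
        ops = ["/", "x", rng.choice("+-")]
    expr = str(nums[0])
    for op, n in zip(ops, nums[1:]):
        expr += op + str(n)
    # Standard operator precedence; division is exact by construction.
    answer = int(eval(expr.replace("x", "*")))
    return expr, answer
```

For example, `make_question(4, rng)` might return `("96/4x2-11", 37)`.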
Research data: Dataset (2024). Publisher: Mendeley Data. Author: Corona-Gonzalez, Cesar E.

1. Objective
To determine the electrophysiological impact of an online learning method (OLM) on children experiencing learning difficulties.

2. Sample
Thirty-six children aged between 7 and 13 were recruited to generate this database. Every child showed evidence of either reading or math difficulties, so an OLM [1] was provided to enhance learning skills. The sample was divided into the following study groups:
A) Children with reading difficulties who:
   I) underwent cognitive training (experimental group)
   II) did not undergo cognitive training (control group)
B) Children with math difficulties who:
   I) underwent cognitive training (experimental group)
   II) did not undergo cognitive training (control group)

3. Methods
3.1. Interventions
Before the EEG recordings, the experimental groups were asked to use the OLM 15 minutes a day for three months. The control groups continued with their traditional learning method (e.g., attending school, home schooling) for three months. After that, EEG recordings were collected for all groups in the following conditions:

3.2. EEG recordings
- Baseline: the participant sat in a comfortable position and looked at a cross on the screen for three minutes.
- Reading: three texts were presented, and reading aloud was mandatory. The child was asked to read as carefully as possible and according to his/her abilities. Self-corrections during reading were allowed. After each text, three comprehension questions were displayed.
- Math: two blocks of twenty arithmetic operations each were solved. Every operation had three response alternatives.

4. EEG data description (events)
See "Event and participant description.xlsx".

References:
[1] Arroyo J, González de Vega D. Smartick. Madrid, España: Sistemas Virtuales de Aprendizaje S.L.; 2009. Available from: https://mx.smartickmethod.com/?f=1

This dataset is archived at DANS/EASY but is not accessible here. To view a list of files and access the files in this dataset, click on the DOI link above.
Research data: Dataset (2024). Embargo end date: 12 Feb 2024. Language: English. Publisher: Dryad. Authors: Patelaki, Eleni; Foxe, John J.; Mantel, Emma P.; Kassis, George; Freedman, Edward G.

# Title of Dataset

Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults

## Description of the data and file structure

This Dryad dataset contains multimodal MoBI data, collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data are organized as follows:

```
|-- 010705001
|   |-- LSLData
|   |   |-- 010705001.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705001.txt
|   |   |-- mainExperScript_010705001.log
|   |   |-- motion_state_010705001.txt
|   |   |-- Training_GoNoGo_010705001.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705001_processed.txt
|   |   |-- mainExperScript_010705001_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705001.set
|   |   |-- 010705001.fdt
|-- 010705002
|   |-- LSLData
|   |   |-- 010705002.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705002.txt
|   |   |-- mainExperScript_010705002.log
|   |   |-- motion_state_010705002.txt
|   |   |-- Training_GoNoGo_010705002.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705002_processed.txt
|   |   |-- mainExperScript_010705002_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705002.set
|   |   |-- 010705002.fdt
|-- ...
|-- metadata.xlsx
```

### Notes:

#### About the LSLData folder:

The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream. In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks.
In those cases, two separate `.mat` files were produced: one containing the recording before the break (e.g. `010705013_1.mat`) and another containing the recording after the break (e.g. `010705013_2.mat`).

#### About the Logfiles\_Raw folder:

* The `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: contains the block number during which the image was presented
  * Column **Trial**: contains the trial number during which the image was presented
  * Column **Image**: contains the IAPS code of the presented image
  * Column **RespTime**: contains the response time to the presented image
  * Column **MotState**: contains `sitting` if the block was a sitting block, and `walking` if the block was a walking block
  * Column **Button**: contains `1` if a valid button press was recorded in response to the image, and `0` if no valid button press was recorded.
* The `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as the `GoNoGo_{participantID}.txt`, but only for the training block. Note that this data was not analyzed in the paper; it only serves to assess how well the participant understood the task before starting the actual experiment.
* The `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct sequence of walking/sitting; if the walking/sitting sequence in the `GoNoGo_{participantID}.txt` is different, it must be changed to align with this one.
The reason why these two sequences differ for some participants is that the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to change on the fly, for example because they were tired and requested to do more sitting and leave walking for later.
* The `mainExperScript_{participantID}.log` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: can take one of the following values:
    * `Picture`: the most common event. The `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen; that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually.
    * `Resume`: almost always occurs after pause events, at the end of some experimental blocks. It indicates that the scenario has been resumed manually.
    * `Quit`: the scenario has been terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, it can take one of the following values:
      * `countdown_3`: part of the countdown at the beginning of each experimental block. Displays a white '3' on a black background on the projection screen in front of the participant.
      * `countdown_2`: same, but displays a white '2'.
      * `countdown_1`: same, but displays a white '1'.
      * `countdown_go`: same, but displays a white 'Go'.
      * `pic_display`: displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: displays a white '+' on a black background on the projection screen in front of the participant. No button presses are accepted during this event code, since they are considered delayed responses to the previous trial.
      * `fixation_cross_resp`: displays a white '+' on a black background on the projection screen in front of the participant. Button presses are accepted during this event code.
    * For the **Response** event type, it can take either of the following values:
      * `1`: for button presses provided by the participant during task performance.
      * `2`: for keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all the other event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: for picture stimuli, the duration of the picture presentation; for pause events, the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and may therefore differ from requested times.
  * Column **ReqDur**: for picture stimuli, the requested duration of presentation given in the scenario file. Note that picture stimulus durations are constrained by the monitor refresh.
  * Column **Stim Type**: its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1-millisecond resolution). The uncertainties provide an upper limit, so an uncertainty of 0.2 milliseconds means the true uncertainty is between 0.1 and 0.2 milliseconds. To view the logfile data properly aligned with respect to the columns defined above, the following MATLAB command is suggested:

```matlab
S = importdata({full_path_to_logfile},'\t')
```

where `S` is a structure, and the field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
To this end, information from the `mainExperScript_{participantID}.log` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter is corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In such cases, two separate sets of all the logfiles described above exist. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705013_1.log`, `GoNoGo_010705013_1.txt`, `motion_state_010705013_1.txt` and `Training_GoNoGo_010705013_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2` at the end of the file name, for example: `mainExperScript_010705013_2.log`, `GoNoGo_010705013_2.txt`, `motion_state_010705013_2.txt` and `Training_GoNoGo_010705013_2.txt`. In some cases where running the training block was not necessary (e.g. for the second recording session after a break, or because the participant had already completed the training block shortly before the start of the experiment), the logfile name contains an additional `_noTraining` string and no manually created training logfile was generated, for example: `mainExperScript_noTraining_010705024.log` and `GoNoGo_noTraining_010705024.txt`.

#### About the Logfiles\_Processed folder:

* The `GoNoGo_{participantID}_processed.txt` is the same as the `GoNoGo_{participantID}.txt`, except that it has 2 additional columns:
  * Column **EmoState**: contains the emotional valence (`positive/neutral/negative`) of each presented image. The classification into the 3 categories was based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: contains `1` for all trials except those belonging to a cluster of 6 consecutive non-responses; those trials are assigned the value `0` in this column.
* The `mainExperScript_{participantID}_processed.txt` is the same as the `mainExperScript_{participantID}.log`, except that every placeholder `pic_display` event code has been replaced with a string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. The `DistPrevNoGo` is followed by a number (**distanceNum**) indicating how many trials before the current one the last NoGo trial occurred.

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, which are formats used by [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php), an open-source MATLAB toolbox for electrophysiological signal processing and analysis. `.set` files contain the metadata and `.fdt` files contain the raw data. Here is an example of loading an EEG dataset using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705001.set')
```

Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset` and then save it as a `.mat` file as follows:

```matlab
save('010705001.mat','EEGstruct','-v7.3')
```

Each of these folders essentially contains a structure whose fields have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels) x (time points) matrix with the raw EEG data; the field **event** contains a structure where the field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and the field **latency** contains the EEG time point at which the event occurred.
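The processed `pic_display` replacement strings described in the Logfiles\_Processed notes above are straightforward to unpack programmatically. Below is a hypothetical Python parser: the field order follows the README's description, but the value vocabularies and numeric formats are assumptions.

```python
import re

# Field order follows the README's processed event-code structure:
# StimOnset_MotState_EmoState_DistPrevNoGo_{n}_ButtonResp_{b}_ZeroCluster_{z}_RT_{rt}_BlockNum_{blk}
EVENT_RE = re.compile(
    r"^StimOnset"
    r"_(?P<MotState>sitting|walking)"
    r"_(?P<EmoState>positive|neutral|negative)"
    r"_DistPrevNoGo_(?P<DistPrevNoGo>\d+)"
    r"_ButtonResp_(?P<Button>[01])"
    r"_ZeroCluster_(?P<ZeroClusters>[01])"
    r"_RT_(?P<RespTime>[\d.]+)"
    r"_BlockNum_(?P<Block>\d+)$"
)

def parse_event_code(code):
    """Return the event's fields as a dict, or None if the code does not match."""
    m = EVENT_RE.match(code)
    return m.groupdict() if m else None
```

Unmatched codes (e.g. a raw `pic_display` placeholder) return `None`, so the same function can be mapped over a mixed event list safely.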
#### About the metadata.xlsx:

This Excel file contains metadata about the whole dataset, organized into 2 sheets, `Young Adults` and `Older adults`. Each sheet contains metadata for the respective age group. The first 5 columns are common across the 2 sheets:

* Column **ID**: the 9-digit participant ID, which is also the name of the individual participant data folder, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1 to 102.
* Column **Age**: the participant's age at the time of the recording
* Column **Speed**: treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

The `Older adults` sheet includes one additional, 6th column, **MoCA score**, containing the MoCA score for each of the older participants.

## Sharing/Access information

Data for this project will only be shared via Dryad. Data were not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from older adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These older adult data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Older adults](https://doi.org/10.5061/dryad.xsj3tx9nb)**.

Combining walking with a demanding cognitive task is traditionally expected to elicit decrements in gait and/or cognitive task performance. However, it was recently shown that, in a cohort of young adults, most participants improved performance when walking was added to a Go/NoGo response inhibition task.
The present study aims to extend these previous findings to an older adult cohort, to investigate whether this improvement when dual-tasking is observed in healthy older adults. Mobile Brain/Body Imaging (MoBI) was used to record electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and behavioral responses in the Go/NoGo task, during sitting or walking on a treadmill, in 34 young adults and 37 older adults. Increased response accuracy during walking, independent of age, was found to correlate with slower responses to stimuli (r = 0.44) and with walking-related EEG amplitude modulations over frontocentral regions (r = 0.47) during the sensory gating (N1) and conflict monitoring (N2) stages of inhibition, and over left-lateralized prefrontal regions (r = 0.47) during the stage of inhibitory control implementation (P3). These neural activity changes are related to the cognitive component of inhibition, and they were interpreted as signatures of behavioral improvement during walking. On the other hand, aging, independent of response accuracy during walking, was found to correlate with slower treadmill walking speeds (r = -0.68) and attenuation in walking-related EEG amplitude modulations over left-dominant frontal (r = -0.44) and parietooccipital regions (r = 0.48) during the N2 stage, and over centroparietal regions (r = 0.48) during the P3 stage. These neural activity changes are related to the motor component of inhibition, and they were interpreted as signatures of aging. Older adults whose response accuracy ‘paradoxically’ improved during walking manifested neural signatures of both behavioral improvement and aging, suggesting that their flexibility in reallocating neural resources while walking might be maintained for the cognitive but not for the motor inhibitory component. 
These distinct neural signatures of aging and behavior can potentially be used to identify ‘super-agers’, or individuals at risk for cognitive decline due to aging or neurodegenerative disease.

The `.mat` files in the `LSLData` subfolders require MATLAB (MathWorks Inc., Natick, MA, USA) to open; an open-source alternative is Python (see https://www.askpython.com/python/examples/mat-files-in-python). The `.set` and `.fdt` files in the `EEGstruct_Raw` subfolders require EEGLAB, an open-source MATLAB toolbox for electrophysiological signal processing and analysis (https://sccn.ucsd.edu/eeglab/index.php); alternatively, they can be opened using MNE, an open-source Python package for electrophysiological signal processing and analysis (https://mne.tools/dev/generated/mne.io.read_raw_eeglab.html). The `.txt` and `.log` files in the `Logfiles_Raw` and `Logfiles_Processed` subfolders can be opened with any text editor. The `metadata.xlsx` file can be opened with Microsoft Excel or, alternatively, with the free and open-source LibreOffice (https://www.libreoffice.org/).

This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams:
1) EEG (BioSemi Inc., Amsterdam, The Netherlands)
2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA)
3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA)

To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see the README file for details).
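Once an EEG structure is loaded (in MATLAB or via MNE), the **event** structure described above can be flattened into an analysis-friendly table. A minimal Python sketch, assuming events are represented as dicts with `type` and `latency` (in samples) and a known sampling rate; the event names follow the README's examples:

```python
def events_to_rows(events, fs):
    """Convert EEGLAB-style events (dicts with 'type' and 'latency' in samples)
    into rows with time in seconds and the motion state / response category /
    emotional valence unpacked from names like 'sitting_hit_negative'."""
    rows = []
    for ev in events:
        mot, resp, emo = ev["type"].split("_")
        rows.append({"time_s": ev["latency"] / fs,
                     "MotState": mot, "Response": resp, "EmoState": emo})
    return rows
```

The resulting list of dicts can be handed directly to a DataFrame constructor for filtering by motion state or valence.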
For further information contact us at helpdesk@openaire.euResearch data keyboard_double_arrow_right Dataset 2024Embargo end date: 12 Feb 2024 EnglishPublisher:Dryad Patelaki, Eleni; Foxe, John J.; Mantel, Emma P.; Kassis, George; Freedman, Edward G.;# Title of Dataset Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Older adults ## Description of the data and file structure This Drayd dataset contains multimodal MoBI data, collected from older adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data is organized as follows: ``` |-- 010705027 | |-- LSLData | | |-- 010705027_1.mat | | |-- 010705027_2.mat | |-- Logfiles_Raw | | |-- GoNoGo_010705027_1.txt | | |-- GoNoGo_noTraining_010705027_2.txt | | |-- mainExperScript_010705027_1.log | | |-- mainExperScript_noTraining_010705027_2.log | | |-- motion_state_010705027_1.txt | | |-- motion_state_010705027_2.txt | | |-- Training_GoNoGo_010705027.txt | |-- Logfiles_Processed | | |-- GoNoGo_010705027_processed.txt | | |-- mainExperScript_010705027_processed.txt | |-- EEGstruct_Raw | | |-- 010705027.set | | |-- 010705027.fdt |-- 010705028 | |-- LSLData | | |-- 010705028.mat | |-- Logfiles_Raw | | |-- GoNoGo_010705028.txt | | |-- mainExperScript_010705028.log | | |-- motion_state_010705028.txt | | |-- Training_GoNoGo_010705028.txt | |-- Logfiles_Processed | | |-- GoNoGo_010705028_processed.txt | | |-- mainExperScript_010705028_processed.txt | |-- EEGstruct_Raw | | |-- 010705028.set | | |-- 010705028.fdt |-- ... |-- ... |-- ... |-- metadata.xlsx ``` ### Notes: #### About the LSLData folder: The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream. In case the Presentation scenario had to be terminated before its completion (e.g. 
the participant wanted to take a restroom break), then a new scenario was launched after the break to complete the required number of task blocks. In those cases, two separate `.mat` files exist: one containing the recording before the break (e.g. `010705027_1.mat`, `010705082_1.mat`) and another containing the recording after the break (e.g. `010705027_2.mat`, `010705082_2.mat`).

#### About the Logfiles\_Raw folder:

* `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: contains the block number during which the image was presented
  * Column **Trial**: contains the trial number during which the image was presented
  * Column **Image**: contains the IAPS code of the presented image
  * Column **RespTime**: contains the response time to the presented image
  * Column **MotState**: contains `sitting` if the block was a sitting block, and `walking` if the block was a walking block
  * Column **Button**: contains `1` if a valid button press was recorded in response to the image, and `0` if no valid button press was recorded.
* `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as `GoNoGo_{participantID}.txt`, but only for the training block. Note that these data were not analyzed in the paper; they only serve to assess how well the participant understood the task before starting the actual experiment.
* `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct walking/sitting sequence. If the walking/sitting sequence in `GoNoGo_{participantID}.txt` differs, it must be changed to align with this one.
The two sequences differ for some participants because the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to change on the fly, for example because they were tired and asked to do more sitting blocks first and leave walking for later.
* `mainExperScript_{participantID}.log` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: can take one of the following values:
    * `Picture`: the most common event. A `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen -- that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually.
    * `Resume`: almost always occurs after a `Pause` event, at the end of some experimental blocks. It indicates that the scenario has been resumed manually.
    * `Quit`: the scenario has been terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, it can take one of the following values:
      * `countdown_3`: part of the countdown at the beginning of each experimental block. Displays a white '3' on a black background on the projection screen in front of the participant.
      * `countdown_2`: part of the countdown at the beginning of each experimental block. Displays a white '2' on a black background on the projection screen in front of the participant.
      * `countdown_1`: part of the countdown at the beginning of each experimental block.
Displays a white '1' on a black background on the projection screen in front of the participant.
      * `countdown_go`: part of the countdown at the beginning of each experimental block. Displays a white 'Go' on a black background on the projection screen in front of the participant.
      * `pic_display`: displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: displays a white '+' on a black background on the projection screen in front of the participant. No button presses are accepted during this event code, since they are considered delayed responses to the previous trial.
      * `fixation_cross_resp`: displays a white '+' on a black background on the projection screen in front of the participant. Button presses are accepted during this event code.
    * For the **Response** event type, it can take either of the following values:
      * `1`: for button presses provided by the participant during task performance.
      * `2`: for keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all other event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: for picture stimuli, this is the duration of the picture presentation. For pause events, this is the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus.
For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and may therefore differ from the requested times.
  * Column **ReqDur**: for picture stimuli, the requested duration of presentation given in the scenario file. Note that picture stimulus durations are constrained by the monitor refresh.
  * Column **Stim Type**: its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1-millisecond resolution). The uncertainties provide an upper limit, so an uncertainty of 0.2 milliseconds means the true uncertainty is between 0.1 and 0.2 milliseconds.

To view the logfile data properly aligned with respect to the columns defined above, it is suggested to use the following command in MATLAB:

```matlab
% full_path_to_logfile is a char/string path to the .log file
S = importdata(full_path_to_logfile, '\t');
```

where `S` is a structure whose field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
To this end, information from the `mainExperScript_{participantID}.log` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter has been corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In such cases, two separate sets of all the logfiles described above were produced. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705082_1.log`, `GoNoGo_010705082_1.txt`, `motion_state_010705082_1.txt` and `Training_GoNoGo_010705082_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2` at the end of the file name, for example: `mainExperScript_010705082_2.log`, `GoNoGo_010705082_2.txt`, `motion_state_010705082_2.txt` and `Training_GoNoGo_010705082_2.txt`.

In cases where running the training block was not necessary (e.g. for the second recording session after the break, or when the participant had already completed the training block shortly before the start of the experiment), the logfile name contains an additional `_noTraining` string, and no manually created training logfile was generated, for example: `mainExperScript_noTraining_010705042.log` and `GoNoGo_noTraining_010705042.txt`.

#### About the Logfiles\_Processed folder:

* `GoNoGo_{participantID}_processed.txt` is the same as `GoNoGo_{participantID}.txt`, except that it has 2 additional columns:
  * Column **EmoState**: contains the emotional valence (`positive/neutral/negative`) of each presented image.
The classification into the 3 categories was conducted based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: contains `1` for all trials except those belonging to a cluster of 6 consecutive non-responses; those trials are assigned the value `0` in this column.
* `mainExperScript_{participantID}_processed.txt` is the same as `mainExperScript_{participantID}.log`, except that every placeholder `pic_display` event code has been replaced with an appropriate string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. `DistPrevNoGo` is followed by a number (**distanceNum**) indicating how many trials before the current one the last NoGo trial occurred.

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, which are formats used by EEGLAB. [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php) is an open-source MATLAB toolbox for electrophysiological signal processing and analysis. Here is an example of loading an EEG dataset using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705027.set');
```

`.set` files contain the metadata and `.fdt` files contain the raw data. Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset` and then save it as a `.mat` file as follows:

```matlab
save('010705027.mat', 'EEGstruct', '-v7.3')
```

Each of these folders essentially contains a structure whose fields have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels) x (time points) matrix with the raw EEG data; the field **event** contains a structure in which the field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and the field **latency** contains the EEG time point at which the event occurred.
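As an illustration, the composite event-code string written into `mainExperScript_{participantID}_processed.txt` can be unpacked programmatically. This is a sketch (not code shipped with the dataset), and the sample string below is hypothetical; the field layout follows the structure described in the Logfiles\_Processed section:

```python
# Sketch: unpack a processed event-code string of the form
# StimOnset_MotState_EmoState_DistPrevNoGo_{n}_ButtonResp_{b}_ZeroCluster_{z}_RT_{rt}_BlockNum_{blk}
def parse_processed_code(code: str) -> dict:
    parts = code.split("_")
    assert parts[0] == "StimOnset", "not a stimulus-onset event code"
    return {
        "MotState": parts[1],
        "EmoState": parts[2],
        "DistPrevNoGo": int(parts[parts.index("DistPrevNoGo") + 1]),
        "Button": int(parts[parts.index("ButtonResp") + 1]),
        "ZeroCluster": int(parts[parts.index("ZeroCluster") + 1]),
        "RespTime": int(parts[parts.index("RT") + 1]),
        "Block": int(parts[parts.index("BlockNum") + 1]),
    }

# Hypothetical example string, for illustration only:
evt = parse_processed_code(
    "StimOnset_walking_neutral_DistPrevNoGo_3_ButtonResp_1_ZeroCluster_1_RT_4521_BlockNum_2"
)
print(evt["MotState"], evt["RespTime"])
```

Note that, per the logfile convention above, `RespTime` values are in tenths of milliseconds.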
#### About the metadata.xlsx:

This Excel file contains metadata about the whole dataset, organized into 2 sheets, `Young Adults` and `Older adults`. Each sheet contains metadata for the age group indicated by the sheet name. The first 5 columns are common across the 2 sheets:

* Column **ID**: the 9-digit participant ID, which is also the name of the individual participant data folder, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1-102.
* Column **Age**: participant's age at the time of the recording
* Column **Speed**: treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

The `Older adults` sheet includes one additional (6th) column, **MoCA score**, containing the MoCA score of each older participant.

## Sharing/Access information

Data for this project will only be shared via Dryad. Data were not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These young adult data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults](https://doi.org/10.5061/dryad.mgqnk9947)**.

Combining walking with a demanding cognitive task is traditionally expected to elicit decrements in gait and/or cognitive task performance. However, it was recently shown that, in a cohort of young adults, most participants improved performance when walking was added to a Go/NoGo response inhibition task.
The present study aims to extend these previous findings to an older adult cohort, to investigate whether this dual-tasking-related improvement is also observed in healthy older adults. Mobile Brain/Body Imaging (MoBI) was used to record electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and behavioral responses in the Go/NoGo task, during sitting or walking on a treadmill, in 34 young adults and 37 older adults. Increased response accuracy during walking, independent of age, was found to correlate with slower responses to stimuli (r = 0.44) and with walking-related EEG amplitude modulations over frontocentral regions (r = 0.47) during the sensory gating (N1) and conflict monitoring (N2) stages of inhibition, and over left-lateralized prefrontal regions (r = 0.47) during the stage of inhibitory control implementation (P3). These neural activity changes are related to the cognitive component of inhibition, and they were interpreted as signatures of behavioral improvement during walking. On the other hand, aging, independent of response accuracy during walking, was found to correlate with slower treadmill walking speeds (r = -0.68) and with attenuation of walking-related EEG amplitude modulations over left-dominant frontal (r = -0.44) and parietooccipital regions (r = 0.48) during the N2 stage, and over centroparietal regions (r = 0.48) during the P3 stage. These neural activity changes are related to the motor component of inhibition, and they were interpreted as signatures of aging. Older adults whose response accuracy ‘paradoxically’ improved during walking manifested neural signatures of both behavioral improvement and aging, suggesting that their flexibility in reallocating neural resources while walking might be maintained for the cognitive but not for the motor inhibitory component.
These distinct neural signatures of aging and behavior can potentially be used to identify ‘super-agers’, or individuals at risk for cognitive decline due to aging or neurodegenerative disease.

This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams:

1) EEG (BioSemi Inc., Amsterdam, The Netherlands)
2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA)
3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA)

To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see README file for details).
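Since the three LSL streams are sampled at different rates, a common first step when working with such recordings is to align samples across streams by nearest timestamp. The following is a minimal, self-contained sketch in Python with synthetic timestamps (it is not code from this dataset, and the sampling rates in the example are placeholders):

```python
import bisect

def nearest_indices(ref_times, query_times):
    """For each query timestamp, return the index of the closest reference timestamp.

    ref_times must be sorted in ascending order.
    """
    out = []
    for t in query_times:
        i = bisect.bisect_left(ref_times, t)
        # compare the neighbors on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ref_times)]
        out.append(min(candidates, key=lambda j: abs(ref_times[j] - t)))
    return out

# Synthetic example: one stream at 512 Hz, another at 120 Hz (placeholder rates)
eeg_t = [k / 512 for k in range(512)]
mocap_t = [k / 120 for k in range(120)]
idx = nearest_indices(eeg_t, mocap_t)
print(idx[:3])  # EEG sample nearest to each of the first motion-capture frames
```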
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Aslan, M

This dataset includes EEG signals for lie detection. EEG signals were collected from 27 subjects using Emotiv Insight, a wearable and portable 5-channel EEG device. The subjects participated in two experiments, taking on the roles of deceiver and truth-teller. In each experiment, a box with 5 different beads was given to the subjects, and they were instructed to take 2 beads from the box and place them in their pockets. In the first experiment, subjects chose whether to assume the role of deceiver or truth-teller; in the second experiment, they were required to take on the opposite role.

During the experiments, subjects watched a video composed of images of the beads in the box placed in front of them. The video clip started with a 3-second black screen, followed by 2 seconds of bead images and 1 second of a black screen, repeating in this pattern. After the EEG data were obtained, the initial 2 seconds of excessive signal were removed from the raw data, resulting in a total of 75 seconds of EEG data.

In the deceiver role, subjects clicked the button in their left hand labeled "no" if the displayed image matched a bead they had taken, and the button in their right hand labeled "yes" if it did not, thus lying about all the images. In the truth-teller role, the opposite actions were taken: clicking "yes" for a taken bead image and "no" for a not-taken bead image, thus telling the truth for all images. EEG signals were recorded following this procedure. The EEG signals underwent an offset-removal process to obtain raw EEG data. Both raw and preprocessed EEG data are stored in .csv format.
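For orientation, the display pattern described above (a 3-second black lead-in, then a repeating cycle of 2 seconds of bead image plus 1 second of black) implies bead-image onsets at 3 s, 6 s, 9 s, and so on. A small sketch of that timeline (an illustration of the described protocol, not code shipped with the dataset; the 12-second total below is just a short example, not the 75-second recording length):

```python
def image_onsets(total_s: float, lead_in_s: float = 3.0,
                 image_s: float = 2.0, gap_s: float = 1.0):
    """Onset times (s) of bead images in the repeating display pattern."""
    onsets = []
    t = lead_in_s
    while t + image_s <= total_s:  # only images that fit fully in the clip
        onsets.append(t)
        t += image_s + gap_s
    return onsets

print(image_onsets(12.0))  # [3.0, 6.0, 9.0]
```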
The purpose of this dataset is to provide EEG signals for lie detection, offering an alternative and diverse dataset with a different channel count. When this dataset is used, the relevant article must be cited:

Aslan, M., Baykara, M. & Alakus, T.B. LieWaves: dataset for lie detection based on EEG signals and wavelets. Med Biol Eng Comput (2024). https://doi.org/10.1007/s11517-024-03021-2

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI LINK ABOVE.
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Halder, S

1. Grand_Contingency_table.mat -- all 24 participants' behavioral response data in 3*3 contingency-table format for 5 lags. Format: structure; one high-contrast and one low-contrast file; each file 3*3*5*24.
2. Grand_psychophysical_params_table.mat -- all 24 participants' psychophysical parameters for 5 lags. Format: structure; 7 parameters obtained from the model; each file 5*24.
3. n2p_ERP_all_lag_occ_par.mat and p3_ERP_all_lag_par.mat -- data for these two ERP components for 18 EEG participants for 5 lags: n2p in occipito-parietal electrodes and p3 in parietal electrodes. Format: double; each file 251*18*5.
4. Early_timewindow_ERP_peak_measure_all_electrode.mat and Late_timewindow_ERP_peak_measure_all_electrode.mat -- ERP components quantified as peak measures in early (150-300 ms) and late (300-550 ms) time windows, for all electrodes, 18 participants and 5 lags. Format: double; each file 128*90 (18*5 = 90).
5. btw_cls_detection_discrimination_distances_lag_wise.mat -- between-class distance metrics. It contains two 1x5 cell structures, one for distances in the detection dimension and the other for the discrimination dimension, for 5 different lags between the categories. Each element in these cell structures holds distance values for every time point and for 18 subjects. This file also contains the time variable with 251 time points.
6. Fronto_Parietal_Coherence_taperParams_3_5_left_left.mat -- five coherency variables, one per lag. Each variable has subject x time x frequency dimensions (i.e., 18 x 59 x 82). It also has time (59 time points) and frequency (82 frequency values) variables.
7.
For_correlations_between_metrics -- multiple .mat variables (quantified coherence, quantified distance and d' for 18 subjects and 5 lags; format: 90*1) for correlation analysis.

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI LINK ABOVE.
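Several of the files above flatten the subject and lag dimensions into one axis of size 90 (18 subjects * 5 lags). A sketch of unflattening that axis in Python with NumPy (the array here is random stand-in data, and the assumed column order is lag-fastest within subject -- verify the actual stacking order against the files before relying on it):

```python
import numpy as np

# Stand-in for a 128*90 peak-measure array (electrodes x subject/lag columns),
# filled with random numbers purely for illustration.
rng = np.random.default_rng(0)
peaks = rng.standard_normal((128, 90))

# Default C-order reshape assumes the last axis (lag) varies fastest,
# i.e. flat column s*5 + l corresponds to subject s, lag l.
peaks_3d = peaks.reshape(128, 18, 5)
print(peaks_3d.shape)  # (128, 18, 5)
```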
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Corona-Gonzalez, Cesar E

1. Objective
To perform a psychophysiological evaluation of reading and mathematical skills in children with low academic performance.

2. Sample
One hundred and two children participated in this study. Each child was allocated to one of the following groups:
a) 51 children with low academic performance in reading.
b) 51 children with low academic performance in math.

2.1. Sample selection criteria
* Children aged between 7 and 13 years.
* No diagnosis of any neurological disorder.
* Capacity to read a simple text and solve arithmetic facts.
* Any gender and socioeconomic level.

3. Methods
Data collection was carried out across two stages: I) Psychometric evaluation, where reading, spelling, math, attention levels, and IQ were assessed; these results determined which academic skill was the most affected. II) EEG acquisition, using two experimental paradigms. For children in the reading group, three passages were displayed and had to be read out loud; after that, three multiple-choice reading comprehension questions were presented. For the math experiment, forty arithmetic facts were displayed within two blocks (20 each); similarly, three options were displayed to answer each operation.

4. Data description
4.1. EEG recordings
- Baseline: a cross was displayed on the screen, and a 3-min recording was taken for each child.
- Reading: children read three texts out loud while the EEG was being recorded; then, three comprehension questions were answered.
- Math: two blocks of 20 operations were performed by each child.

4.2.
EEG events
- Baseline: no events.
- Reading: reading out loud (between 'condition' events); answers to the reading comprehension questions ('33026' option 1; '33027' option 2; '33028' option 3).
- Math: block 1 (from '33101' to '33120'); block 2 (from '33121' to '33140'); answers ('33026' option 1; '33027' option 2; '33028' option 3).

4.3. Files attached
- 32Ch_gTec.ced/.txt: channel locations.
- Arithmetical Facts and Answers.xlsx: event tags for each operation and answer in the Math EEG data.
- Comprehension Questions and Answers.xlsx: event tags for each answer to the comprehension questions; also describes the sequence of appearance of the questions.
- EEG files.xlsx: shows the EEG files available in the database.
- Psychometric Variables.xlsx: description of each psychometric variable and the categories associated with each numerical value.
- Psychometric_Math.xlsx: psychometric results for children in the math group.
- Psychometric_Reading.xlsx: psychometric results for children in the reading group.

References: [1] Secretaría de Educación Pública. Manual de Procedimientos para el Fomento y la Valoración de la Competencia Lectora en el Aula. Available from: https://rarchivoszona33.files.wordpress.com/2012/10/manual_fomento.pdf

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI LINK ABOVE.
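The math event tags listed above are consecutive numeric codes, so block and operation number can be recovered arithmetically. A small illustrative sketch (tag ranges as listed above; the function name is this example's own, not part of the dataset):

```python
def decode_math_tag(tag: str):
    """Map a math stimulus tag ('33101'..'33140') to 1-based (block, operation)."""
    n = int(tag)
    if not 33101 <= n <= 33140:
        raise ValueError(f"not a math stimulus tag: {tag}")
    offset = n - 33101            # 0..39 across both blocks
    block = offset // 20 + 1      # 20 operations per block
    operation = offset % 20 + 1
    return block, operation

print(decode_math_tag("33101"))  # (1, 1)
print(decode_math_tag("33125"))  # (2, 5)
```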
Research data · Dataset · 2024 · Embargo end date: 12 Feb 2024 · English · Publisher: Dryad
Authors: Patelaki, Eleni; Foxe, John J.; McFerren, Amber L.; Freedman, Edward G.

# Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Increased cognitive load

## Description of the data and file structure

This Dryad dataset contains multimodal MoBI data, collected from young adults while performing the 2-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data are organized as follows:

```
|-- 010705001
|   |-- LSLData
|   |   |-- 010705001_1.mat
|   |   |-- 010705001_2.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705001_1.txt
|   |   |-- GoNoGo_010705001_2.txt
|   |   |-- mainExperScript_010705001_1.txt
|   |   |-- mainExperScript_010705001_2.txt
|   |   |-- motion_state_010705001_1.txt
|   |   |-- motion_state_010705001_2.txt
|   |   |-- Training_GoNoGo_010705001_1.txt
|   |   |-- Training_GoNoGo_010705001_2.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705001_processed.txt
|   |   |-- mainExperScript_010705001_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705001.set
|   |   |-- 010705001.fdt
|-- 010705002
|   |-- LSLData
|   |   |-- 010705002.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705002.txt
|   |   |-- mainExperScript_010705002.txt
|   |   |-- motion_state_010705002.txt
|   |   |-- Training_GoNoGo_010705002.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705002_processed.txt
|   |   |-- mainExperScript_010705002_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705002.set
|   |   |-- 010705002.fdt
|-- ...
|-- metadata.xlsx
```

### Notes:

#### About the LSLData folder:

The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream.
In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In those cases, two separate `.mat` files were generated: one containing the recording before the break (e.g. `010705001_1.mat`) and another containing the recording after the break (e.g. `010705001_2.mat`).

#### About the Logfiles\_Raw folder:

* The `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: contains the block number during which the image was presented
  * Column **Trial**: contains the trial number during which the image was presented
  * Column **Image**: contains the IAPS code of the presented image
  * Column **RespTime**: contains the response time to the presented image
  * Column **MotState**: contains `sitting` if the block was a sitting block, and `walking` if the block was a walking block
  * Column **Button**: contains `1` if a valid button press was recorded in response to the image, and `0` if no valid button press was recorded.
* The `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as the `GoNoGo_{participantID}.txt`, but only for the training block. Note that these data were not analyzed in the paper; they only serve to assess how well the participant understood the task before starting the actual experiment.
* The `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct sequence of walking/sitting. If the walking/sitting sequence in the `GoNoGo_{participantID}.txt` is different, it must be changed to align with this one.
  The two sequences differ for some participants because the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to be changed on the fly, for example because they were tired and asked to do more sitting and leave walking for later.
* The `mainExperScript_{participantID}.txt` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: can take one of the following values
    * `Picture`: the most common event. The `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen--that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually.
    * `Resume`: almost always occurs after pause events, at the end of some experimental blocks. It indicates that the scenario has been resumed manually.
    * `Quit`: the scenario has been terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, it can take one of the following values:
      * `countdown_3`: Part of the countdown at the beginning of each experimental block. Displays a white '3' with a black background on the projection screen in front of the participant.
      * `countdown_2`: Part of the countdown at the beginning of each experimental block. Displays a white '2' with a black background on the projection screen in front of the participant.
      * `countdown_1`: Part of the countdown at the beginning of each experimental block.
      Displays a white '1' with a black background on the projection screen in front of the participant.
      * `countdown_go`: Part of the countdown at the beginning of each experimental block. Displays a white 'Go' with a black background on the projection screen in front of the participant.
      * `pic_display`: Displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. No button presses are accepted during this event code, since they are considered delayed responses to the previous trial.
      * `fixation_cross_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. Button presses are accepted during this event code.
    * For the **Response** event type, it can take either of the following values:
      * `1`: For button presses provided by the participant during task performance.
      * `2`: For keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all the rest of the event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: For picture stimuli, this is the duration of the picture presentation. For pause events, this is the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus.
  For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: Requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and therefore may differ from requested times.
  * Column **ReqDur**: For picture stimuli, this is the requested duration of presentation given in the scenario file. Note that picture stimulus durations are constrained by the monitor refresh.
  * Column **Stim Type**: Its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1 millisecond resolution). The uncertainties provide the upper limit, so an uncertainty of 0.2 milliseconds means the uncertainty is between 0.1 and 0.2 milliseconds.

To view the logfile data properly aligned with respect to the columns defined above, it is suggested to use the following command in MATLAB:

```matlab
S = importdata({full_path_to_logfile},'\t')
```

where `S` is a structure, and the field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
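In practice this correspondence is positional: the k-th `pic_display` event in the Presentation logfile goes with the k-th image row of the GoNoGo logfile. A minimal Python sketch of that pairing, offered alongside the dataset's MATLAB tooling; every value below is an invented placeholder, not data from the dataset:

```python
# Hypothetical event-code sequence from a Presentation logfile and hypothetical
# rows from a GoNoGo logfile; only the pairing logic is the point here.
pres_events = ['countdown_3', 'countdown_2', 'countdown_1', 'countdown_go',
               'pic_display', 'fixation_cross_resp', 'pic_display']
gonogo_rows = [{'Image': 7234, 'MotState': 'walking', 'Button': 1},
               {'Image': 1440, 'MotState': 'walking', 'Button': 0}]

# Positions of the pic_display events, in order of occurrence.
pic_positions = [i for i, code in enumerate(pres_events) if code == 'pic_display']

# Match the k-th pic_display event with the k-th GoNoGo image row.
fused = dict(zip(pic_positions, gonogo_rows))
# fused now maps each pic_display event index to its IAPS image information.
```

This assumes the GoNoGo rows are already reordered to the correct walking/sitting sequence, as described above.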
To this end, information from the `mainExperScript_{participantID}.txt` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter has been corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In those cases, two separate sets of all the logfiles described above were generated. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705001_1.txt`, `GoNoGo_010705001_1.txt`, `motion_state_010705001_1.txt` and `Training_GoNoGo_010705001_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2` at the end of the file name, for example: `mainExperScript_010705001_2.txt`, `GoNoGo_010705001_2.txt`, `motion_state_010705001_2.txt` and `Training_GoNoGo_010705001_2.txt`.

#### About the Logfiles\_Processed folder:

* The `GoNoGo_{participantID}_processed.txt` is the same as the `GoNoGo_{participantID}.txt`, except that it has 2 additional columns:
  * Column **EmoState**: contains the emotional valence (`positive/neutral/negative`) of each presented image. The classification into the 3 categories was conducted based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: contains `1` for all trials except those which belong to a cluster of 6 consecutive non-responses; those trials are assigned the value `0` in this column
* The `mainExperScript_{participantID}_processed.txt` is the same as the `mainExperScript_{participantID}.txt`, except that every placeholder `pic_display` event code has been replaced with a string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. The `DistPrevNoGo` field is followed by a number (**distanceNum**) indicating how many trials before the current one the last NoGo trial happened.

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, which are formats used by EEGLAB. [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php) is an open-source MATLAB toolbox for electrophysiological signal processing and analysis. `.set` files contain the metadata and `.fdt` files contain the raw data. Here is an example of loading an EEG dataset, using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705001.set')
```

Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset` and then save it as a `.mat` file as follows:

```matlab
save('010705001.mat','EEGstruct','-v7.3')
```

Each `.set`/`.fdt` pair encodes a structure whose fields have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels) x (time points) matrix with the raw EEG data; the field **event** contains a structure whose field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and whose field **latency** contains the EEG time point at which the event occurred.

#### About the metadata.xlsx:

This Excel file contains metadata about the whole 2-back dataset.
The information provided in each column is explained below:

* Column **ID**: The 9-digit participant ID, which is also the name of the individual participant data folder, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1 to 114.
* Column **Age**: Participant's age at the time of the recording
* Column **Speed**: treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

## Sharing/Access information

Data for this project will only be shared via Dryad. Data were not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These 1-back data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults](https://doi.org/10.5061/dryad.mgqnk9947)**.

This study elucidates the neural mechanisms engaged by increasing cognitive load while walking, by employing 2 versions of a response inhibition task: the ‘1-back’ version and the more cognitively demanding ‘2-back’ version. By using the Mobile Brain/Body Imaging (MoBI) modality, electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and task-related behavioral responses were collected while young adults (n = 61) performed either the 1-back or 2-back response inhibition task. Interestingly, increasing inhibitory difficulty from 1-back to 2-back during walking was not associated with any detectable costs in response accuracy, response speed, or gait consistency.
However, the more difficult cognitive task was associated with distinct EEG component changes during both successful inhibitions (correct rejections) and successful executions (hits) of the motor response. During correct rejections, ERP changes were found over frontal regions, during latencies related to sensory gain control, conflict monitoring and working memory storage and processing. During hits, ERP changes were found over left-parietal regions during latencies related to orienting attention and subsequent selection and execution of the motor plan. The pattern of attenuation in walking-related EEG amplitude changes, during 2-back task performance, is thought to reflect more effortful recalibration of neural processes, a mechanism that might be a key driver of performance maintenance in the face of increased cognitive demands while walking. Overall, the present findings shed light on the extent of the neurocognitive capacity of young adults and may lead to a better understanding of how factors such as aging or neurological disorders could impinge on this capacity. This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams: 1) EEG (BioSemi Inc., Amsterdam, The Netherlands) 2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA) 3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA). To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see README file for details).
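The processed event codes described above (in `mainExperScript_{participantID}_processed.txt`) pack several fields into one underscore-delimited string. A minimal Python sketch for unpacking them, offered alongside the dataset's MATLAB tooling; the example string is invented for illustration, not a value from the dataset:

```python
# Unpack a processed event code of the form
# StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}
#   _ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}
def parse_event_code(code):
    t = code.split('_')
    # Positional layout follows the template documented in the README.
    return {
        'MotState': t[1],
        'EmoState': t[2],
        'DistPrevNoGo': int(t[4]),
        'Button': int(t[6]),
        'ZeroCluster': int(t[8]),
        'RT': t[10],        # kept as a string; units follow the logfile
        'BlockNum': int(t[12]),
    }

# Hypothetical example string:
example = 'StimOnset_walking_negative_DistPrevNoGo_3_ButtonResp_1_ZeroCluster_1_RT_4210_BlockNum_2'
parsed = parse_event_code(example)
```

Splitting on `_` works here because, per the template, each field occupies a fixed position in the string.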
Research data > Dataset, 2024. Publisher: Mendeley Data. Author: Yin, Hanmo. doi: 10.17632/tnf8fbcnkv
The data show the ERP and behavioral results for near-miss and full-miss outcomes under three reward expectancy levels: high, medium and low. The results show that, on FRN and P300, the gap between near-miss and full-miss differs across levels of reward expectancy, indicating that reward expectancy affects the near-miss effect. Sheet names represent different dependent variables; "low" means low reward expectancy, and likewise for "medium" and "high". For some variables, a use declaration is marked in red font in the table.
Research data > Dataset, 2024. Publisher: Mendeley Data. Author: Hemakom, Apit.

The data were collected from 66 healthy university students (21 males, 24 females in the follicular phase of the menstrual cycle, and 21 females in the luteal phase of the menstrual cycle). The Montreal Imaging Stress Task (MIST) was modified and used in this study: a series of computer-based mental arithmetic tasks designed to evaluate responses under control and stressful conditions. The experiments were therefore conducted in two separate sessions, two weeks apart, one per condition. Each session consisted of 7 periods: training, eyes open (EO), a mental arithmetic task (MAT) at 4 consecutive levels of difficulty (arithmetic calculation level 1, AC1, through arithmetic calculation level 4, AC4), and recovery. The control condition started with a training period to familiarize the subject with the experimental procedure, during which ECG and EEG signals were not recorded. During this period, the subject was presented with a series of computerized sample questions at the 4 difficulty levels of the mental arithmetic task. Answer choices for every question were displayed on a computer screen as a sequence of integers between 0 and 9. The subject was requested to use a wireless computer mouse to click on the correct answer. Following the training period, the recordings started, and the subjects were asked to sit in a relaxed position with no movement and to focus on a black dot displayed on a computer screen for 5 minutes (the eyes-open, EO, period). These requirements were critical for constructing a reliable EEG baseline with a minimum of artifacts caused by eye and body movements. After that, an instruction to perform mental arithmetic calculations was shown on the computer screen. The mental arithmetic task comprises 4 levels of difficulty. Each level lasted 5 minutes.
* Level 1 (Arithmetic Calculation Level 1, AC1): addition (+) and subtraction (-) of 3 single-digit numbers, e.g., 7-4+1.
* Level 2 (Arithmetic Calculation Level 2, AC2): addition (+), subtraction (-), and multiplication (x) of 3 single- and double-digit numbers, e.g., 6x8-30.
* Level 3 (Arithmetic Calculation Level 3, AC3): addition (+), subtraction (-), and multiplication (x) of 4 single- and double-digit numbers, e.g., 35+10-4x8.
* Level 4 (Arithmetic Calculation Level 4, AC4): addition (+), subtraction (-), multiplication (x), and division (/) of 4 single- and double-digit numbers, e.g., 96/4x2-11.

No time limit or negative feedback messages were given to the subjects. After each question, a correct/incorrect message was displayed. After the AC4 period, the subjects relaxed and sat still for 5 minutes (the recovery period). ECG and EEG signals were recorded from the beginning of the EO period until the end of the recovery period. The protocol for the mental-stress condition was the same as for the control condition, but with a time limit and social evaluative threat components introduced. Several negative feedback messages were introduced to actively induce stress.
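Since the recording runs from the EO period through recovery in fixed 5-minute periods, the start offset of each recorded period can be derived directly. An illustrative Python sketch (not part of the dataset; training is excluded because ECG/EEG were not recorded during it):

```python
# Recorded periods of one session, in order, each lasting 5 minutes.
periods = ['EO', 'AC1', 'AC2', 'AC3', 'AC4', 'recovery']

# Start offset of each period, in minutes from the start of the recording.
offsets = {name: 5 * i for i, name in enumerate(periods)}

# Total recorded duration per session.
session_minutes = 5 * len(periods)
```

For example, `offsets['AC4']` evaluates to 20, i.e. the hardest arithmetic level begins 20 minutes into the recording.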
Research data > Dataset, 2024. Publisher: Mendeley Data. Author: Corona-Gonzalez, Cesar E.

1. Objective

To determine the electrophysiological impact of an online learning method (OLM) on children experiencing learning difficulties.

2. Sample

Thirty-six children aged between 7 and 13 were recruited to generate this database. Every child showed evidence of suffering from either reading or math difficulties. Thus, an OLM [1] was provided to enhance learning skills. Consequently, the sample was divided into the following study groups:

A) Children with reading difficulties who:
I) underwent cognitive training (experimental group)
II) did not undergo cognitive training (control group)

B) Children with math difficulties who:
I) underwent cognitive training (experimental group)
II) did not undergo cognitive training (control group)

3. Methods

3.1. Interventions

Before the EEG recordings, the experimental groups were asked to use the OLM for 15 minutes a day for three months. The control groups, on the other hand, continued with their traditional learning method for three months (e.g. attending school, home school, etc.). After that, EEG recordings were collected for all groups under the following conditions:

3.2. EEG recordings

- Baseline: the participant sat in a comfortable position and looked at a cross on the screen for three minutes.
- Reading: three texts were presented, and reading aloud was mandatory. The child was asked to read as carefully as possible and according to his/her abilities. Self-corrections during reading were allowed. After each text, three comprehension questions were displayed.
- Math: two blocks of twenty arithmetic operations each were solved. Every operation had three response alternatives.

4. EEG data description (events)

- See "Event and participant description.xlsx".

References:

[1] Arroyo J, González de Vega D. Smartick.
Madrid, Spain: Sistemas Virtuales de Aprendizaje S.L.; 2009. Available from: https://mx.smartickmethod.com/?f=1

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI LINK ABOVE.
Research data > Dataset, 2024. Embargo end date: 12 Feb 2024. English. Publisher: Dryad. Authors: Patelaki, Eleni; Foxe, John J.; Mantel, Emma P.; Kassis, George; Freedman, Edward G.

# Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults

## Description of the data and file structure

This Dryad dataset contains multimodal MoBI data, collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data are organized as follows:

```
|-- 010705001
| |-- LSLData
| | |-- 010705001.mat
| |-- Logfiles_Raw
| | |-- GoNoGo_010705001.txt
| | |-- mainExperScript_010705001.log
| | |-- motion_state_010705001.txt
| | |-- Training_GoNoGo_010705001.txt
| |-- Logfiles_Processed
| | |-- GoNoGo_010705001_processed.txt
| | |-- mainExperScript_010705001_processed.txt
| |-- EEGstruct_Raw
| | |-- 010705001.set
| | |-- 010705001.fdt
|-- 010705002
| |-- LSLData
| | |-- 010705002.mat
| |-- Logfiles_Raw
| | |-- GoNoGo_010705002.txt
| | |-- mainExperScript_010705002.log
| | |-- motion_state_010705002.txt
| | |-- Training_GoNoGo_010705002.txt
| |-- Logfiles_Processed
| | |-- GoNoGo_010705002_processed.txt
| | |-- mainExperScript_010705002_processed.txt
| |-- EEGstruct_Raw
| | |-- 010705002.set
| | |-- 010705002.fdt
|-- ...
|-- ...
|-- ...
|-- metadata.xlsx
```

### Notes:

#### About the LSLData folder:

The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream. In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks.
In those cases, two separate `.mat` files were generated: one containing the recording before the break (e.g. `010705013_1.mat`) and another containing the recording after the break (e.g. `010705013_2.mat`).

#### About the Logfiles\_Raw folder:

* The `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: contains the block number during which the image was presented
  * Column **Trial**: contains the trial number during which the image was presented
  * Column **Image**: contains the IAPS code of the presented image
  * Column **RespTime**: contains the response time to the presented image
  * Column **MotState**: contains `sitting` if the block was a sitting block, and `walking` if the block was a walking block
  * Column **Button**: contains `1` if a valid button press was recorded in response to the image, and `0` if no valid button press was recorded.
* The `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as the `GoNoGo_{participantID}.txt`, but only for the training block. Note that these data were not analyzed in the paper; they only serve to assess how well the participant understood the task before starting the actual experiment.
* The `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct sequence of walking/sitting. If the walking/sitting sequence in the `GoNoGo_{participantID}.txt` is different, it must be changed to align with this one.
The two sequences differ for some participants because the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to change on the fly, for example because they were tired and requested to do more sitting and leave walking for later.

* The `mainExperScript_{participantID}.log` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: it can take one of the following values
    * `Picture`: this is the most common event. The `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen; that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: it occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: it occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually.
    * `Resume`: it almost always occurs after pause events, at the end of some experimental blocks. It indicates that the scenario has been resumed manually.
    * `Quit`: the scenario has been terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, it can take one of the following values:
      * `countdown_3`: Part of the countdown at the beginning of each experimental block. Displays a white '3' with a black background on the projection screen in front of the participant.
      * `countdown_2`: Part of the countdown at the beginning of each experimental block. Displays a white '2' with a black background on the projection screen in front of the participant.
      * `countdown_1`: Part of the countdown at the beginning of each experimental block. Displays a white '1' with a black background on the projection screen in front of the participant.
      * `countdown_go`: Part of the countdown at the beginning of each experimental block. Displays a white 'Go' with a black background on the projection screen in front of the participant.
      * `pic_display`: Displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. No button presses are accepted during this event code, since they are considered as delayed responses to the previous trial.
      * `fixation_cross_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. Button presses are accepted during this event code.
    * For the **Response** event type, it can take either of the following values:
      * `1`: For button presses provided by the participant during task performance.
      * `2`: For keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all the rest of the event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: For picture stimuli, this is the duration of the picture presentation. For pause events, this is the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: Requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and may therefore differ from requested times.
  * Column **ReqDur**: For picture stimuli, this is the requested duration of presentation given in the scenario file. Note that picture stimuli durations are constrained by the monitor refresh.
  * Column **Stim Type**: Its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1-millisecond resolution). The uncertainties provide the upper limit, so an uncertainty of 0.2 milliseconds means the uncertainty is between 0.1 and 0.2 milliseconds. To view the logfile data properly aligned with respect to the columns defined above, it is suggested to use the following command in MATLAB:

```matlab
S = importdata({full_path_to_logfile},'\t')
```

where `S` is a structure, and the field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
To this end, information from the `mainExperScript_{participantID}.log` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter is corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In that case, two separate sets exist for all logfiles described above. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705013_1.log`, `GoNoGo_010705013_1.txt`, `motion_state_010705013_1.txt` and `Training_GoNoGo_010705013_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2` at the end of the file name, for example: `mainExperScript_010705013_2.log`, `GoNoGo_010705013_2.txt`, `motion_state_010705013_2.txt` and `Training_GoNoGo_010705013_2.txt`.

In some cases running the training block was not necessary (e.g. for the second recording session after the break, or because the participant had already completed the training block shortly before the start of the experiment). In these cases the logfile name contains an additional `_noTraining` string and no manually created training logfile was generated, for example: `mainExperScript_noTraining_010705024.log` and `GoNoGo_noTraining_010705024.txt`.

#### About the Logfiles\_Processed folder:

* The `GoNoGo_{participantID}_processed.txt` is the same as the `GoNoGo_{participantID}.txt`, with the difference that it has 2 additional columns:
  * Column **EmoState**: contains the emotional valence (`positive/neutral/negative`) of each presented image.
The classification into the 3 categories was conducted based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: contains `1` for all trials except those which belong to a cluster of 6 consecutive non-responses; those latter trials are assigned the value `0` in this column
* The `mainExperScript_{participantID}_processed.txt` is the same as the `mainExperScript_{participantID}.log`, with the difference that every placeholder `pic_display` event code has been replaced with a string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. The `DistPrevNoGo` is followed by a number (**distanceNum**) that indicates how many trials before the current one the last NoGo trial happened.

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, which are formats used by EEGLAB. [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php) is an open-source MATLAB toolbox for electrophysiological signal processing and analysis. Here is an example of loading an EEG dataset using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705001.set')
```

`.set` files contain the metadata and `.fdt` files contain the raw data. Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset`, and then save it as a `.mat` file as follows:

```matlab
save('010705001.mat','EEGstruct','-v7.3')
```

Each of these folders essentially contains a structure, the fields of which have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels) x (time points) matrix with the raw EEG data; the field **event** contains a structure whose field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and whose field **latency** contains the EEG time point at which the event occurred.
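Once loaded, the event names and latencies can be scanned programmatically. Below is a minimal Python sketch, with made-up sample events mirroring the **type**/**latency** fields described above (the latency values are hypothetical):

```python
# Sketch: after loading the EEG structure (pop_loadset in MATLAB, or a
# .mat/MNE reader in Python), each event pairs a name ('type') with an
# EEG sample index ('latency'). These sample events are made up.
events = [
    {'type': 'sitting_hit_negative',     'latency': 1024},
    {'type': 'walking_corrRej_positive', 'latency': 2048},
    {'type': 'sitting_hit_negative',     'latency': 4096},
]

def latencies_for(events, event_type):
    """Return the EEG time points at which a given event name occurred."""
    return [e['latency'] for e in events if e['type'] == event_type]

print(latencies_for(events, 'sitting_hit_negative'))  # [1024, 4096]
```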
#### About the metadata.xlsx:

This Excel file contains metadata about the whole dataset, organized into 2 sheets: the `Young Adults` sheet and the `Older adults` sheet. Each sheet contains metadata for the age group indicated by the sheet name. The first 5 columns are common across the 2 sheets:

* Column **ID**: The 9-digit participant ID, which is also the name of the individual participant data folders, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1-102.
* Column **Age**: Participant's age at the time of the recording
* Column **Speed**: treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

The `Older adults` sheet includes one additional, 6th column, **MoCA score**, containing the MoCA scores for each of the older participants.

## Sharing/Access information

Data for this project will only be shared via Dryad. Data were not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from older adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These older adult data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Older adults](https://doi.org/10.5061/dryad.xsj3tx9nb)**.

Combining walking with a demanding cognitive task is traditionally expected to elicit decrements in gait and/or cognitive task performance. However, it was recently shown that, in a cohort of young adults, most participants improved performance when walking was added to performance of a Go/NoGo response inhibition task.
The present study aims to extend these previous findings to an older adult cohort, to investigate whether this improvement when dual-tasking is observed in healthy older adults. Mobile Brain/Body Imaging (MoBI) was used to record electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and behavioral responses in the Go/NoGo task, during sitting or walking on a treadmill, in 34 young adults and 37 older adults. Increased response accuracy during walking, independent of age, was found to correlate with slower responses to stimuli (r = 0.44) and with walking-related EEG amplitude modulations over frontocentral regions (r = 0.47) during the sensory gating (N1) and conflict monitoring (N2) stages of inhibition, and over left-lateralized prefrontal regions (r = 0.47) during the stage of inhibitory control implementation (P3). These neural activity changes are related to the cognitive component of inhibition, and they were interpreted as signatures of behavioral improvement during walking. On the other hand, aging, independent of response accuracy during walking, was found to correlate with slower treadmill walking speeds (r = -0.68) and attenuation in walking-related EEG amplitude modulations over left-dominant frontal (r = -0.44) and parietooccipital regions (r = 0.48) during the N2 stage, and over centroparietal regions (r = 0.48) during the P3 stage. These neural activity changes are related to the motor component of inhibition, and they were interpreted as signatures of aging. Older adults whose response accuracy ‘paradoxically’ improved during walking manifested neural signatures of both behavioral improvement and aging, suggesting that their flexibility in reallocating neural resources while walking might be maintained for the cognitive but not for the motor inhibitory component. 
These distinct neural signatures of aging and behavior can potentially be used to identify ‘super-agers’, or individuals at risk for cognitive decline due to aging or neurodegenerative disease.

The `.mat` files in the `LSLData` subfolders require MATLAB (MathWorks Inc., Natick, MA, USA) to open. An open-source alternative is to open them with Python (see https://www.askpython.com/python/examples/mat-files-in-python). The `.set` and `.fdt` files in the `EEGstruct_Raw` subfolders require EEGLAB, an open-source MATLAB toolbox for electrophysiological signal processing and analysis (https://sccn.ucsd.edu/eeglab/index.php). Alternatively, they can be opened using MNE, an open-source Python package for electrophysiological signal processing and analysis (https://mne.tools/dev/generated/mne.io.read_raw_eeglab.html). The `.txt` and `.log` files in the `Logfiles_Raw` and `Logfiles_Processed` subfolders can be opened using any text editor. The `metadata.xlsx` file can be opened using Microsoft Excel; alternatively, LibreOffice (https://www.libreoffice.org/), which is free and open-source, can be used.

This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams:

1) EEG (BioSemi Inc., Amsterdam, The Netherlands)
2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA)
3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA)

To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see README file for details).
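The processed `mainExperScript` logfiles described in this README pack several trial attributes into one `StimOnset_...` event code. A minimal Python sketch of unpacking such a code follows; the sample string is hypothetical, and the `RT` field is kept as text since its exact format is not specified here:

```python
import re

# Sketch: unpack a processed event code of the form
# StimOnset_MotState_EmoState_DistPrevNoGo_{n}_ButtonResp_{b}_ZeroCluster_{z}_RT_{rt}_BlockNum_{blk}
# The sample code below is hypothetical, not taken from the dataset.
PATTERN = re.compile(
    r'StimOnset_(?P<MotState>[^_]+)_(?P<EmoState>[^_]+)'
    r'_DistPrevNoGo_(?P<DistPrevNoGo>\d+)_ButtonResp_(?P<Button>\d+)'
    r'_ZeroCluster_(?P<ZeroClusters>\d+)_RT_(?P<RespTime>[^_]+)_BlockNum_(?P<Block>\d+)'
)

code = 'StimOnset_walking_negative_DistPrevNoGo_3_ButtonResp_1_ZeroCluster_1_RT_412_BlockNum_2'
fields = PATTERN.match(code).groupdict()
print(fields['MotState'], fields['EmoState'], fields['Block'])  # walking negative 2
```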
Research data · Dataset · 2024 · Embargo end date: 12 Feb 2024 · English · Publisher: Dryad
Authors: Patelaki, Eleni; Foxe, John J.; Mantel, Emma P.; Kassis, George; Freedman, Edward G.

# Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Older adults

## Description of the data and file structure

This Dryad dataset contains multimodal MoBI data, collected from older adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data are organized as follows:

```
|-- 010705027
|   |-- LSLData
|   |   |-- 010705027_1.mat
|   |   |-- 010705027_2.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705027_1.txt
|   |   |-- GoNoGo_noTraining_010705027_2.txt
|   |   |-- mainExperScript_010705027_1.log
|   |   |-- mainExperScript_noTraining_010705027_2.log
|   |   |-- motion_state_010705027_1.txt
|   |   |-- motion_state_010705027_2.txt
|   |   |-- Training_GoNoGo_010705027.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705027_processed.txt
|   |   |-- mainExperScript_010705027_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705027.set
|   |   |-- 010705027.fdt
|-- 010705028
|   |-- LSLData
|   |   |-- 010705028.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705028.txt
|   |   |-- mainExperScript_010705028.log
|   |   |-- motion_state_010705028.txt
|   |   |-- Training_GoNoGo_010705028.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705028_processed.txt
|   |   |-- mainExperScript_010705028_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705028.set
|   |   |-- 010705028.fdt
|-- ...
|-- ...
|-- ...
|-- metadata.xlsx
```

### Notes:

#### About the LSLData folder:

The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream. In case the Presentation scenario had to be terminated before its completion (e.g.
the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In those cases, two separate `.mat` files exist: one containing the recording before the break (e.g. `010705027_1.mat`, `010705082_1.mat`) and another containing the recording after the break (e.g. `010705027_2.mat`, `010705082_2.mat`).

#### About the Logfiles\_Raw folder:

* The `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: contains the block number during which the image was presented
  * Column **Trial**: contains the trial number during which the image was presented
  * Column **Image**: contains the IAPS code of the presented image
  * Column **RespTime**: contains the response time to the presented image
  * Column **MotState**: contains `sitting` if the block was a sitting block, and `walking` if the block was a walking block
  * Column **Button**: contains `1` if a valid button press was recorded in response to the image, and `0` if no valid button press was recorded.
* The `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as the `GoNoGo_{participantID}.txt`, but only for the training block. Note that these data were not analyzed in the paper; they only serve to assess how well the participant understood the task before starting the actual experiment.
* The `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct walking/sitting sequence; if the walking/sitting sequence in the `GoNoGo_{participantID}.txt` differs, it must be changed to align with this one.
The two sequences differ for some participants because the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to change on the fly, for example because they were tired and requested to do more sitting and leave walking for later.

* The `mainExperScript_{participantID}.log` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: it can take one of the following values
    * `Picture`: this is the most common event. The `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen; that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: it occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: it occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually.
    * `Resume`: it almost always occurs after pause events, at the end of some experimental blocks. It indicates that the scenario has been resumed manually.
    * `Quit`: the scenario has been terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, it can take one of the following values:
      * `countdown_3`: Part of the countdown at the beginning of each experimental block. Displays a white '3' with a black background on the projection screen in front of the participant.
      * `countdown_2`: Part of the countdown at the beginning of each experimental block. Displays a white '2' with a black background on the projection screen in front of the participant.
      * `countdown_1`: Part of the countdown at the beginning of each experimental block. Displays a white '1' with a black background on the projection screen in front of the participant.
      * `countdown_go`: Part of the countdown at the beginning of each experimental block. Displays a white 'Go' with a black background on the projection screen in front of the participant.
      * `pic_display`: Displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. No button presses are accepted during this event code, since they are considered as delayed responses to the previous trial.
      * `fixation_cross_resp`: Displays a white '+' with a black background on the projection screen in front of the participant. Button presses are accepted during this event code.
    * For the **Response** event type, it can take either of the following values:
      * `1`: For button presses provided by the participant during task performance.
      * `2`: For keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all the rest of the event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: For picture stimuli, this is the duration of the picture presentation. For pause events, this is the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: Requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and may therefore differ from requested times.
  * Column **ReqDur**: For picture stimuli, this is the requested duration of presentation given in the scenario file. Note that picture stimuli durations are constrained by the monitor refresh.
  * Column **Stim Type**: Its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1-millisecond resolution). The uncertainties provide the upper limit, so an uncertainty of 0.2 milliseconds means the uncertainty is between 0.1 and 0.2 milliseconds. To view the logfile data properly aligned with respect to the columns defined above, it is suggested to use the following command in MATLAB:

```matlab
S = importdata({full_path_to_logfile},'\t')
```

where `S` is a structure, and the field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
To this end, information from the `mainExperScript_{participantID}.log` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter is corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In that case, two separate sets exist for all logfiles described above. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705082_1.log`, `GoNoGo_010705082_1.txt`, `motion_state_010705082_1.txt` and `Training_GoNoGo_010705082_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2` at the end of the file name, for example: `mainExperScript_010705082_2.log`, `GoNoGo_010705082_2.txt`, `motion_state_010705082_2.txt` and `Training_GoNoGo_010705082_2.txt`.

In some cases running the training block was not necessary (e.g. for the second recording session after the break, or because the participant had already completed the training block shortly before the start of the experiment). In these cases the logfile name contains an additional `_noTraining` string and no manually created training logfile was generated, for example: `mainExperScript_noTraining_010705042.log` and `GoNoGo_noTraining_010705042.txt`.

#### About the Logfiles\_Processed folder:

* The `GoNoGo_{participantID}_processed.txt` is the same as the `GoNoGo_{participantID}.txt`, with the difference that it has 2 additional columns:
  * Column **EmoState**: contains the emotional valence (`positive/neutral/negative`) of each presented image.
The classification into the 3 categories was conducted based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: contains `1` for all trials except those which belong to a cluster of 6 consecutive non-responses; those latter trials are assigned the value `0` in this column
* The `mainExperScript_{participantID}_processed.txt` is the same as the `mainExperScript_{participantID}.log`, with the difference that every placeholder `pic_display` event code has been replaced with a string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. The `DistPrevNoGo` is followed by a number (**distanceNum**) that indicates how many trials before the current one the last NoGo trial happened.

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, which are formats used by EEGLAB. [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php) is an open-source MATLAB toolbox for electrophysiological signal processing and analysis. Here is an example of loading an EEG dataset using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705027.set')
```

`.set` files contain the metadata and `.fdt` files contain the raw data. Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset`, and then save it as a `.mat` file as follows:

```matlab
save('010705027.mat','EEGstruct','-v7.3')
```

Each of these folders essentially contains a structure, the fields of which have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels) x (time points) matrix with the raw EEG data; the field **event** contains a structure whose field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and whose field **latency** contains the EEG time point at which the event occurred.
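The **ZeroClusters** rule above can be sketched in Python as follows, here assuming a "cluster" means a run of at least 6 consecutive non-responses (the button sequence in the example is made up):

```python
def zero_clusters(buttons, run_len=6):
    """Return 0 for trials inside a run of >= run_len consecutive
    non-responses (Button == 0), and 1 for all other trials."""
    flags = [1] * len(buttons)
    i = 0
    while i < len(buttons):
        if buttons[i] == 0:
            # Find the end of this run of consecutive non-responses.
            j = i
            while j < len(buttons) and buttons[j] == 0:
                j += 1
            if j - i >= run_len:
                for k in range(i, j):
                    flags[k] = 0
            i = j
        else:
            i += 1
    return flags

buttons = [1, 0, 0, 0, 0, 0, 0, 1, 0, 1]
print(zero_clusters(buttons))  # [1, 0, 0, 0, 0, 0, 0, 1, 1, 1]
```

Runs shorter than the threshold (like the lone `0` near the end) keep the value `1`, matching the column description.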
#### About the metadata.xlsx:

This Excel file contains metadata about the whole dataset, organized into 2 sheets: the `Young Adults` sheet and the `Older adults` sheet. Each sheet contains metadata for the age group indicated by the sheet name. The first 5 columns are common across the 2 sheets:

* Column **ID**: The 9-digit participant ID, which is also the name of the individual participant data folders, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1-102.
* Column **Age**: Participant's age at the time of the recording
* Column **Speed**: treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

The `Older adults` sheet includes one additional, 6th column, **MoCA score**, containing the MoCA scores for each of the older participants.

## Sharing/Access information

Data for this project will only be shared via Dryad. Data were not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These young adult data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults](https://doi.org/10.5061/dryad.mgqnk9947)**.

Combining walking with a demanding cognitive task is traditionally expected to elicit decrements in gait and/or cognitive task performance. However, it was recently shown that, in a cohort of young adults, most participants improved performance when walking was added to performance of a Go/NoGo response inhibition task.
The present study aims to extend these previous findings to an older adult cohort, to investigate whether this improvement when dual-tasking is observed in healthy older adults. Mobile Brain/Body Imaging (MoBI) was used to record electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and behavioral responses in the Go/NoGo task, during sitting or walking on a treadmill, in 34 young adults and 37 older adults. Increased response accuracy during walking, independent of age, was found to correlate with slower responses to stimuli (r = 0.44) and with walking-related EEG amplitude modulations over frontocentral regions (r = 0.47) during the sensory gating (N1) and conflict monitoring (N2) stages of inhibition, and over left-lateralized prefrontal regions (r = 0.47) during the stage of inhibitory control implementation (P3). These neural activity changes are related to the cognitive component of inhibition, and they were interpreted as signatures of behavioral improvement during walking. On the other hand, aging, independent of response accuracy during walking, was found to correlate with slower treadmill walking speeds (r = -0.68) and attenuation in walking-related EEG amplitude modulations over left-dominant frontal (r = -0.44) and parietooccipital regions (r = 0.48) during the N2 stage, and over centroparietal regions (r = 0.48) during the P3 stage. These neural activity changes are related to the motor component of inhibition, and they were interpreted as signatures of aging. Older adults whose response accuracy ‘paradoxically’ improved during walking manifested neural signatures of both behavioral improvement and aging, suggesting that their flexibility in reallocating neural resources while walking might be maintained for the cognitive but not for the motor inhibitory component. 
These distinct neural signatures of aging and behavior can potentially be used to identify ‘super-agers’, or individuals at risk for cognitive decline due to aging or neurodegenerative disease. This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams: 1) EEG (BioSemi Inc., Amsterdam, The Netherlands) 2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA) 3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA). To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see README file for details).
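The participant IDs described in the metadata section (e.g. `010705001`, with the last 3 digits an incrementally assigned number) can be split with a small helper. This is a hypothetical sketch, not part of the dataset's own code; the meaning of the leading 6 digits is not documented here, so they are returned unparsed.

```python
def split_participant_id(pid: str) -> tuple[str, int]:
    """Split a 9-digit participant ID into its 6-digit prefix and the
    incrementally assigned participant number encoded in the last 3 digits.
    Interpreting the prefix is out of scope: it is returned unparsed."""
    if len(pid) != 9 or not pid.isdigit():
        raise ValueError(f"expected a 9-digit numeric ID, got {pid!r}")
    return pid[:6], int(pid[6:])

# e.g. split_participant_id('010705001') -> ('010705', 1)
```

The returned number can then be cross-referenced against the **ID** column of `metadata.xlsx`.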
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Aslan, M

This dataset includes EEG signals for lie detection. EEG signals were collected from 27 subjects using Emotiv Insight, a wearable, portable 5-channel EEG device. The subjects participated in two experiments, taking on the roles of deceivers and truth-tellers. In each experiment, a box with 5 different beads was given to the subjects, and they were instructed to take 2 beads from the box and place them in their pockets. In the first experiment, subjects chose whether to assume the role of deceiver or truth-teller; in the second experiment, they took on the opposite role. During the experiments, subjects watched a video composed of images of the beads in the box placed in front of them. The video clip started with a 3-second black screen, followed by 2 seconds of bead images and 1 second of a black screen, repeating in this pattern. After the EEG data were obtained, the initial 2 seconds of excess signal were removed from the raw data, resulting in a total of 75 seconds of EEG data. In the deceiver role, subjects clicked the button in their left hand labeled "no" if the displayed image matched a bead they had taken, and the button in their right hand labeled "yes" if it did not, thus lying about all the images. In the truth-teller role, the opposite actions were taken: clicking "yes" for a taken bead image and "no" for a not-taken bead image, thus telling the truth for all images. EEG signals were recorded following this procedure. The EEG signals underwent an offset-removal process to obtain raw EEG data. Both raw and preprocessed EEG data are stored in .csv format.
The purpose of this dataset is to provide EEG signals for lie detection, offering an alternative and diverse dataset with different channel counts. When this dataset is used, the relevant article must be cited:

Aslan, M., Baykara, M. & Alakus, T.B. LieWaves: dataset for lie detection based on EEG signals and wavelets. Med Biol Eng Comput (2024). https://doi.org/10.1007/s11517-024-03021-2

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI-LINK ABOVE.
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Halder, S

1. `Grand_Contingency_table.mat` -- Behavioral response data for all 24 participants, in 3×3 contingency-table format, for 5 lags. Format: structure; one high-contrast and one low-contrast file; each file is 3×3×5×24.
2. `Grand_psychophysical_params_table.mat` -- Psychophysical parameter data for all 24 participants, for 5 lags. Format: structure; 7 parameters obtained from the model; each file is 5×24.
3. `n2p_ERP_all_lag_occ_par.mat` and `p3_ERP_all_lag_par.mat` -- Data for these two ERP components for 18 EEG participants, for 5 lags: n2p in occ-par electrodes and p3 in parietal electrodes. Format: double; each file is 251×18×5.
4. `Early_timewindow_ERP_peak_measure_all_electrode.mat` and `Late_timewindow_ERP_peak_measure_all_electrode.mat` -- ERP components quantified as peak measures in early (150-300 ms) and late (300-550 ms) time windows, for all electrodes, 18 participants, and 5 lags. Format: double; each file is 128×90 (18×5 = 90).
5. `btw_cls_detection_discrimination_distances_lag_wise.mat` -- Between-class distance metrics. It contains two 1×5 cell structures, one for distances in the detection dimension and the other for the discrimination dimension, for the 5 different lags between the categories. Each element of these cell structures holds distance values for every time point, from 18 subjects. This file also contains the time variable with 251 time points.
6. `Fronto_Parietal_Coherence_taperParams_3_5_left_left.mat` -- Five coherency variables, one per lag. Each variable has dimensions subject × time × frequency (i.e., 18×59×82). It also has time (59 time points) and frequency (82 frequency values) variables.
7. `For_correlations_between_metrics` -- Multiple .mat variables (quantified coherence, quantified distance, and d' for 18 subjects and 5 lags; format: 90×1) for correlation analysis.

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI-LINK ABOVE.
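The 128×90 peak-measure matrices flatten 18 participants × 5 lags into 90 columns. The exact column ordering is not stated in the description above, so the lag-fastest layout assumed in this sketch is hypothetical and should be verified against the dataset before use.

```python
# ASSUMED layout: within the 90 columns, lag varies fastest within each
# participant, i.e. column = participant * 5 + lag (0-based). Verify against
# the dataset documentation before relying on this mapping.
N_SUBJECTS = 18
N_LAGS = 5

def column_index(subject: int, lag: int) -> int:
    """Map a 0-based (subject, lag) pair to a flattened column 0..89."""
    if not (0 <= subject < N_SUBJECTS and 0 <= lag < N_LAGS):
        raise ValueError("subject or lag out of range")
    return subject * N_LAGS + lag

def subject_lag(col: int) -> tuple[int, int]:
    """Inverse mapping: flattened column back to (subject, lag)."""
    if not 0 <= col < N_SUBJECTS * N_LAGS:
        raise ValueError("column out of range")
    return divmod(col, N_LAGS)
```

With this helper, e.g. the column holding the last participant's last lag is `column_index(17, 4)`, i.e. column 89.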
Research data · Dataset · 2024 · Publisher: Mendeley Data · Authors: Corona-Gonzalez, Cesar E

1. Objective: To perform a psychophysiological evaluation of reading and mathematical skills in children with low academic performance.

2. Sample: One hundred and two children participated in this study. Each child was allocated to one of the following groups:
a) 51 children with low academic performance in reading.
b) 51 children with low academic performance in math.

2.1. Sample selection criteria:
* Children aged between 7-13 years.
* No diagnosis of any neurological disorder.
* Capacity to read a simple text and solve arithmetic facts.
* Any gender and socioeconomic level.

3. Methods: Data collection was carried out across two stages. I) Psychometric evaluation, in which reading, spelling, math, attention levels, and IQ were assessed; these results determined which academic skill was the most affected. II) EEG acquisition, using two experimental paradigms. For children in the reading group, three passages were displayed and read out loud, after which three multiple-choice reading-comprehension questions were presented. For the math experiment, forty arithmetic facts were displayed within two blocks (20 each); similarly, three options were displayed to answer each operation.

4. Data description

4.1. EEG recordings:
- Baseline: a cross was displayed on the screen, and a 3-min recording was taken for each child.
- Reading: children read three texts out loud while the EEG was being recorded. Then, three comprehension questions were answered.
- Math: two blocks of 20 operations were performed by each child.

4.2. EEG events:
- Baseline: no events.
- Reading: reading out loud (between 'condition' events); answers to the reading-comprehension questions ('33026' option 1; '33027' option 2; '33028' option 3).
- Math: block 1 (from '33101' to '33120'); block 2 (from '33121' to '33140'); answers ('33026' option 1; '33027' option 2; '33028' option 3).

4.3. Files attached:
- 32Ch_gTec.ced/.txt: channel locations.
- Arithmetical Facts and Answers.xlsx: event tags for each operation and the answers for the math EEG data.
- Comprehension Questions and Answers.xlsx: event tags for each answer to the comprehension questions. In addition, this file describes the sequence of appearance of the questions.
- EEG files.xlsx: lists the EEG files available in the database.
- Psychometric Variables.xlsx: description of each psychometric variable and the categories associated with each numerical value.
- Psychometric_Math.xlsx: psychometric results for children in the math group.
- Psychometric_Reading.xlsx: psychometric results for children in the reading group.

References: [1] Secretaría de Educación Pública. Manual de Procedimientos para el Fomento y la Valoración de la Competencia Lectora en el Aula. Available from: https://rarchivoszona33.files.wordpress.com/2012/10/manual_fomento.pdf

THIS DATASET IS ARCHIVED AT DANS/EASY, BUT NOT ACCESSIBLE HERE. TO VIEW A LIST OF FILES AND ACCESS THE FILES IN THIS DATASET, CLICK ON THE DOI-LINK ABOVE.
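The math-task event tags listed in section 4.2 ('33101'-'33120' for block 1, '33121'-'33140' for block 2, 20 operations per block) can be decoded into a (block, operation) pair. This is a minimal sketch, not part of the dataset's own code.

```python
def decode_math_tag(tag: str) -> tuple[int, int]:
    """Return (block, operation), both 1-based, for a math event tag.
    Tags '33101'-'33120' map to block 1, '33121'-'33140' to block 2."""
    n = int(tag)
    if not 33101 <= n <= 33140:
        raise ValueError(f"not a math event tag: {tag}")
    offset = n - 33101            # 0..39 across both blocks
    block = offset // 20 + 1      # 1 or 2
    operation = offset % 20 + 1   # 1..20 within the block
    return block, operation

# e.g. decode_math_tag('33121') -> (2, 1)
```

Answer tags ('33026'-'33028') fall outside this range and would need a separate lookup.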
Research data · Dataset · 2024 · Embargo end date: 12 Feb 2024 · English · Publisher: Dryad · Authors: Patelaki, Eleni; Foxe, John J.; McFerren, Amber L.; Freedman, Edward G.

# Title of Dataset

Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Increased cognitive load

## Description of the data and file structure

This Dryad dataset contains multimodal MoBI data, collected from young adults while performing the 2-back Go/NoGo response inhibition task and concurrently walking on a treadmill. The data is organized as follows:

```
|-- 010705001
|   |-- LSLData
|   |   |-- 010705001_1.mat
|   |   |-- 010705001_2.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705001_1.txt
|   |   |-- GoNoGo_010705001_2.txt
|   |   |-- mainExperScript_010705001_1.txt
|   |   |-- mainExperScript_010705001_2.txt
|   |   |-- motion_state_010705001_1.txt
|   |   |-- motion_state_010705001_2.txt
|   |   |-- Training_GoNoGo_010705001_1.txt
|   |   |-- Training_GoNoGo_010705001_2.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705001_processed.txt
|   |   |-- mainExperScript_010705001_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705001.set
|   |   |-- 010705001.fdt
|-- 010705002
|   |-- LSLData
|   |   |-- 010705002.mat
|   |-- Logfiles_Raw
|   |   |-- GoNoGo_010705002.txt
|   |   |-- mainExperScript_010705002.txt
|   |   |-- motion_state_010705002.txt
|   |   |-- Training_GoNoGo_010705002.txt
|   |-- Logfiles_Processed
|   |   |-- GoNoGo_010705002_processed.txt
|   |   |-- mainExperScript_010705002_processed.txt
|   |-- EEGstruct_Raw
|   |   |-- 010705002.set
|   |   |-- 010705002.fdt
|-- ...
|-- ...
|-- ...
|-- metadata.xlsx
```

### Notes:

#### About the LSLData folder:

The `.mat` file in this folder contains a cell array with the 3 synchronized datastreams (EEG, motion capture, behavioral responses) along with metadata. Each cell of the array contains a different datastream.
In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. In those cases, two separate `.mat` files exist: one containing the recording before the break (e.g. `010705001_1.mat`) and another containing the recording after the break (e.g. `010705001_2.mat`).

#### About the Logfiles\_Raw folder:

* The `GoNoGo_{participantID}.txt` is a manually created logfile containing the following information about the images presented during the Go/NoGo task (each row corresponds to one image):
  * Column **Block**: the block number during which the image was presented
  * Column **Trial**: the trial number during which the image was presented
  * Column **Image**: the IAPS code of the presented image
  * Column **RespTime**: the response time to the presented image
  * Column **MotState**: `sitting` if the block was a sitting block, `walking` if it was a walking block
  * Column **Button**: `1` if a valid button press was recorded in response to the image, `0` if no valid button press was recorded
* The `Training_GoNoGo_{participantID}.txt` is a manually created logfile containing the same information as the `GoNoGo_{participantID}.txt`, but only for the training block. Note that this data was not analyzed in the paper -- it only serves to assess how well the participant understood the task before starting the actual experiment.
* The `motion_state_{participantID}.txt` is a manually created logfile containing the order in which sitting and walking blocks were performed. IMPORTANT: this is the final/correct sequence of walking/sitting -- if the walking/sitting sequence in the `GoNoGo_{participantID}.txt` differs, it must be changed to align with this one. The two sequences differ for some participants because the walking/sitting sequence initially planned for them (`GoNoGo_{participantID}.txt`) had to change on the fly, for example because they were tired and requested to do more sitting and leave walking for later.
* The `mainExperScript_{participantID}.txt` is a logfile automatically generated by Presentation after the completion of each experimental scenario run. The information is organized in the following columns:
  * Column **Trial**: incremental trial number
  * Column **Event Type**: one of the following values
    * `Picture`: the most common event. The `Picture` event is on throughout the runtime of the experimental scenario (even when the black screen with the white centered cross is displayed on the projection screen -- that is a picture too)
    * `Response`: button press from the Nintendo Switch
    * `Text Input`: occurs at the end of some experimental blocks. The experimenter has coded the scenario to pause and wait until text input is provided
    * `Pause`: occurs at the end of some experimental blocks. It indicates that the scenario has been paused manually
    * `Resume`: almost always occurs after pause events, at the end of some experimental blocks. It indicates that the scenario has been resumed manually
    * `Quit`: the scenario was terminated manually before its completion
  * Column **Code**:
    * For the **Picture** event type, one of the following values:
      * `countdown_3`: part of the countdown at the beginning of each experimental block. Displays a white '3' on a black background on the projection screen in front of the participant.
      * `countdown_2`: likewise, displays a white '2'.
      * `countdown_1`: likewise, displays a white '1'.
      * `countdown_go`: likewise, displays a white 'Go'.
      * `pic_display`: displays an IAPS image on the projection screen in front of the participant.
      * `fixation_cross_no_resp`: displays a white '+' on a black background on the projection screen. No button presses are accepted during this event code, since they are considered delayed responses to the previous trial.
      * `fixation_cross_resp`: displays a white '+' on a black background on the projection screen. Button presses are accepted during this event code.
    * For the **Response** event type, either of the following values:
      * `1`: button presses provided by the participant during task performance.
      * `2`: keyboard presses provided by the experimenter at the end of each block, to enable continuing to the next block.
    * For all other event types (**Text Input**, **Pause**, **Resume**, **Quit**), the event code value is empty.
  * Column **Time**: time of occurrence of each event relative to the start of the scenario.
  * Column **TTime**: time of occurrence of each event relative to the start of the trial the event is in.
  * Column **Uncertainty** (Time): temporal uncertainty for each event. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **Duration**: for picture stimuli, the duration of the picture presentation; for pause events, the duration of the pause. Presentation does not monitor the durations of other events.
  * Column **Uncertainty** (Duration): uncertainty in the duration of a picture stimulus. For details, see [here](https://www.neurobs.com/pres_docs/html/03_presentation/09_timing/01_uncertainties.htm)
  * Column **ReqTime**: requested time of presentation given in the scenario file. Note that actual presentation times for picture stimuli are constrained by the monitor refresh and may therefore differ from requested times.
  * Column **ReqDur**: for picture stimuli, the requested duration of presentation given in the scenario file. Note that picture stimuli durations are constrained by the monitor refresh.
  * Column **Stim Type**: its value is `other`, except for pictures with code `fixation_cross_resp`, during which button presses are accepted. For these picture events, the value is either `hit` (a button press was detected) or `miss` (no button press was detected).

All times written in the logfile are in tenths of milliseconds (0.1 millisecond resolution). The uncertainties provide the upper limit, so an uncertainty of 0.2 milliseconds means the uncertainty is between 0.1 and 0.2 milliseconds. To view the logfile data properly aligned with respect to the columns defined above, the following MATLAB command is suggested:

```matlab
S = importdata({full_path_to_logfile},'\t')
```

where `S` is a structure and the field `S.textdata` is a cell array containing the aligned data. For more details about its structure, check the [Presentation documentation](https://www.neurobs.com/pres_docs/html/03_presentation/07_data_reporting/01_logfiles/03_event_table.htm).

The event code of every image is the same, i.e. `pic_display`, which functions as a placeholder. To obtain behaviorally meaningful information, i.e. whether a specific trial was a correct or incorrect Go or NoGo, we need to know which exact IAPS image code each `pic_display` event corresponds to.
To this end, information from the `mainExperScript_{participantID}.txt` has to be fused with information from the `GoNoGo_{participantID}.txt` (after ensuring that the walking/sitting sequence of the latter is corrected according to the `motion_state_{participantID}.txt`).

In case the Presentation scenario had to be terminated before its completion (e.g. the participant wanted to take a restroom break), a new scenario was launched after the break to complete the required number of task blocks. As such, two separate sets of all the logfiles described above exist. Any logfile recorded as part of the first session, before the break, is denoted by an additional `_1` at the end of the logfile name, for example: `mainExperScript_010705001_1.txt`, `GoNoGo_010705001_1.txt`, `motion_state_010705001_1.txt` and `Training_GoNoGo_010705001_1.txt`. Any logfile recorded as part of the second session, after the break, is denoted by an additional `_2`, for example: `mainExperScript_010705001_2.txt`, `GoNoGo_010705001_2.txt`, `motion_state_010705001_2.txt` and `Training_GoNoGo_010705001_2.txt`.

#### About the Logfiles\_Processed folder:

* The `GoNoGo_{participantID}_processed.txt` is the same as the `GoNoGo_{participantID}.txt`, except that it has 2 additional columns:
  * Column **EmoState**: the emotional valence (`positive`/`neutral`/`negative`) of each presented image. The classification into the 3 categories was conducted based on [Grühn & Scheibe, 2008](https://rdcu.be/dymZf).
  * Column **ZeroClusters**: `1` for all trials except those which belong to a cluster of 6 consecutive non-responses; those latter trials are assigned the value `0`.
* The `mainExperScript_{participantID}_processed.txt` is the same as the `mainExperScript_{participantID}.txt`, except that every placeholder `pic_display` event code has been replaced with a string of the following structure: `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. `DistPrevNoGo` is followed by a number indicating how many trials before the current one the last NoGo trial happened (**distanceNum**).

#### About the EEGstruct\_Raw folder:

Contains `.set` and `.fdt` files, formats used by EEGLAB. [EEGLAB](https://sccn.ucsd.edu/eeglab/index.php) is an open-source MATLAB toolbox for electrophysiological signal processing and analysis. `.set` files contain the metadata and `.fdt` files contain the raw data. Here is an example of loading an EEG dataset, using the `pop_loadset` function provided by EEGLAB:

```matlab
EEGstruct = pop_loadset('010705001.set')
```

Alternatively, if the user prefers to work with `.mat` files only, they can load each EEG structure once using `pop_loadset` and then save it as a `.mat` file as follows:

```matlab
save('010705001.mat','EEGstruct','-v7.3')
```

Each of these folders essentially contains a structure whose fields have been populated with EEG and behavioral data. Specifically, the field **data** contains a (channels)×(time points) matrix with the raw EEG data; the field **event** contains a structure where field **type** contains the event names (e.g. `sitting_hit_negative`, `walking_corrRej_positive`) and field **latency** contains the EEG time point at which the event occurred.

#### About the metadata.xlsx:

This Excel file contains metadata about the whole 2-back dataset.
The information provided in each column is explained below:

* Column **ID**: The 9-digit participant ID, which is also the name of the individual participant data folder, e.g. `010705001`. The last 3 digits represent an incrementally assigned number from 1-114.
* Column **Age**: Participant's age at the time of the recording
* Column **Speed**: Treadmill speed, in miles per hour
* Column **Sex**: `F` for female, `M` for male
* Column **Dominant hand** (coincides with the hand used to provide button-press responses): `R` for right hand, `L` for left hand

## Sharing/Access information

Data for this project will only be shared via Dryad. Data was not derived from any other sources.

## Code/Software

The code will be provided [here](https://github.com/CNL-R).

## Associated Datasets

In the context of this study, data were also collected from young adults while performing the 1-back Go/NoGo response inhibition task and concurrently walking on a treadmill. These 1-back data can be found in the Dryad dataset titled **[Mobile Brain-Body Imaging (MoBI) dual-tasking datasets (response inhibition while walking): Young adults](https://doi.org/10.5061/dryad.mgqnk9947)**.

This study elucidates the neural mechanisms underlying increased cognitive load while walking by employing 2 versions of a response inhibition task, the ‘1-back’ version and the more cognitively demanding ‘2-back’ version. Using the Mobile Brain/Body Imaging (MoBI) modality, electroencephalographic (EEG) activity, three-dimensional (3D) gait kinematics and task-related behavioral responses were collected while young adults (n = 61) performed either the 1-back or the 2-back response inhibition task. Interestingly, increasing inhibitory difficulty from 1-back to 2-back during walking was not associated with any detectable costs in response accuracy, response speed, or gait consistency.
However, the more difficult cognitive task was associated with distinct EEG component changes during both successful inhibitions (correct rejections) and successful executions (hits) of the motor response. During correct rejections, ERP changes were found over frontal regions, during latencies related to sensory gain control, conflict monitoring and working memory storage and processing. During hits, ERP changes were found over left-parietal regions during latencies related to orienting attention and subsequent selection and execution of the motor plan. The pattern of attenuation in walking-related EEG amplitude changes, during 2-back task performance, is thought to reflect more effortful recalibration of neural processes, a mechanism that might be a key driver of performance maintenance in the face of increased cognitive demands while walking. Overall, the present findings shed light on the extent of the neurocognitive capacity of young adults and may lead to a better understanding of how factors such as aging or neurological disorders could impinge on this capacity. This dataset was collected using the Mobile Brain-Body Imaging modality, involving synchronous recordings of 3 data streams: 1) EEG (BioSemi Inc., Amsterdam, The Netherlands) 2) Behavioral responses to the designed Go/NoGo task (Presentation, Neurobehavioral Systems Inc., Berkeley, CA, USA) 3) Full-body kinematics (OptiTrack, NaturalPoint, Inc., Corvallis, OR, USA). To record these 3 data streams in a time-synchronized manner, the Lab Streaming Layer (LSL: https://labstreaminglayer.org/#/) was used. The data included are raw, except for the behavior-related logfiles, for which both raw and processed versions are provided (see README file for details).
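The processed `pic_display` replacement strings described in the README above follow the documented pattern `StimOnset_MotState_EmoState_DistPrevNoGo_{distanceNum}_ButtonResp_{Button}_ZeroCluster_{ZeroClusters}_RT_{RespTime}_BlockNum_{Block}`. A sketch of assembling such a string from the processed-logfile columns; the field values in the example comment are hypothetical, while the string layout follows the README.

```python
def make_event_code(mot_state: str, emo_state: str, dist_prev_nogo: int,
                    button: int, zero_cluster: int, resp_time: int,
                    block: int) -> str:
    """Compose a processed event-code string in the layout documented
    in the README (field values supplied by the caller)."""
    return (f"StimOnset_{mot_state}_{emo_state}"
            f"_DistPrevNoGo_{dist_prev_nogo}"
            f"_ButtonResp_{button}"
            f"_ZeroCluster_{zero_cluster}"
            f"_RT_{resp_time}"
            f"_BlockNum_{block}")

# Hypothetical example:
# make_event_code('walking', 'negative', 3, 1, 1, 412, 2)
# -> 'StimOnset_walking_negative_DistPrevNoGo_3_ButtonResp_1_ZeroCluster_1_RT_412_BlockNum_2'
```

Splitting such a string on `_` recovers the individual fields when parsing the processed `mainExperScript` logfiles.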
Research data · Dataset · 2024 · Publisher: Mendeley Data · Author: Yin, Hanmo · doi: 10.17632/tnf8fbcnkv
The data show the ERP and behavioral results for near-miss and full-miss outcomes under three reward-expectancy levels: high, medium, and low. The results show that, for both the FRN and the P300, the gap between near-miss and full-miss differs across levels of reward expectancy, demonstrating that reward expectancy modulates the near-miss effect. Sheet names denote the dependent variables; "low" indicates low reward expectancy, and likewise for "medium" and "high". For some variables, a usage note is marked in red font in the table.