Tools and services evaluation spreadsheet
Dataset · 2020 · Zenodo · Funder: EC | SYNTHESYS PLUS
Authors: Walton, Stephanie; Livermore, Laurence; Bánki, Olaf; Cubey, Robert; Drinkwater, Robyn; Englund, Markus; Goble, Carole; Groom, Quentin; Kermorvant, Christopher; Rey, Isabel; Santos, Celia; Scott, Ben; Williams, Alan; Wu, Zhengzhe
Citations: 0 · Views: 22 · Downloads: 9

Dataset · 2017 · Zenodo · Funders: EC | MoreGrasp; EC | Feel your Reach
Authors: Ofner, Patrick; Schwarz, Andreas; Pereira, Joana; Müller-Putz, Gernot R.
How neural correlates of movements are represented in the human brain is of ongoing interest and has been researched with invasive and non-invasive methods. In this study, we analyzed the encoding of single upper limb movements in the time domain of low-frequency electroencephalography (EEG) signals. Fifteen healthy subjects executed and imagined six different sustained upper limb movements. We classified these six movements and a rest class, and obtained significant average classification accuracies of 55% (movement vs. movement) and 87% (movement vs. rest) for executed movements, and 27% and 73%, respectively, for imagined movements. Furthermore, we analyzed the classifier patterns in source space and located the brain areas conveying discriminative movement information. The classifier patterns indicate that mainly premotor areas, primary motor cortex, somatosensory cortex and posterior parietal cortex convey discriminative movement information. The decoding of single upper limb movements is especially interesting in the context of a more natural non-invasive control of, e.g., a motor neuroprosthesis or a robotic arm for persons with severe motor disabilities. Please cite the paper http://dx.doi.org/10.1371/journal.pone.0182578 if you use this dataset.
Citations: 0 · Views: 1,206 · Downloads: 12,128
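
To make the kind of analysis described in the abstract concrete, the sketch below shows a generic movement-classification pipeline in the same spirit: time-domain, low-frequency EEG features fed to a regularized classifier. This is not the authors' pipeline; the data shapes, channel count, and variable names are assumptions, and synthetic data stands in for the real recordings.

```python
# Minimal sketch of a movement-vs-movement classification analysis in the
# style described above. NOT the authors' code; feature layout and sizes
# are assumptions for illustration only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: n_trials x n_features, where each feature is a
# time-domain amplitude of low-pass-filtered EEG, flattened over channels
# and time points. Labels 0-5 encode the six movement classes.
X = rng.standard_normal((300, 61 * 50))  # 61 channels x 50 samples (assumed)
y = rng.integers(0, 6, size=300)

# Shrinkage-regularized LDA is a common default for high-dimensional,
# low-trial-count EEG features, where plain covariance estimates are unstable.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f} (chance ~ {1 / 6:.2f})")
```

On real data, the feature matrix would come from epoched, low-pass-filtered recordings rather than the random placeholder used here.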

Dataset · 2017 · Dryad (English) · Embargo end date: 23 Mar 2017 · Funders: NSF | LIP: The Gate of Frontal Feedback to Occipital Cortex?; EC | BRAINSHAPE
Authors: Van Dromme, Ilse C.; Premereur, Elsie; Verhoef, Bram-Ernst; Vanduffel, Wim; Janssen, Peter
doi: 10.5061/dryad.t402g
The primate visual system consists of a ventral stream, specialized for object recognition, and a dorsal visual stream, which is crucial for spatial vision and actions. However, little is known about the interactions and information flow between these two streams. We investigated these interactions within the network processing three-dimensional (3D) object information, comprising both the dorsal and the ventral stream. Reversible inactivation of the macaque caudal intraparietal area (CIP) during functional magnetic resonance imaging (fMRI) reduced fMRI activations in posterior parietal cortex in the dorsal stream and, surprisingly, also in the inferotemporal cortex (ITC) in the ventral visual stream. Moreover, CIP inactivation caused a perceptual deficit in a depth-structure categorization task. CIP microstimulation during fMRI further suggests that CIP projects via posterior parietal areas to the ITC in the ventral stream. To our knowledge, these results provide the first causal evidence for the flow of visual 3D information from the dorsal stream to the ventral stream, and identify CIP as a key area for depth-structure processing. Thus, combining reversible inactivation and electrical microstimulation during fMRI provides a detailed view of the functional interactions between the two visual processing streams.
Data notes: the openData_new xls files contain percent signal change per run; the img/hdr files show t-values for the contrasts. See readme.txt for more information.
Available from: ZENODO; DRYAD; NARCIS; DANS-EASY
Citations: 0 · Views: 26 · Downloads: 4
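
As a convenience for readers of the data notes above, here is a minimal, hedged sketch of inspecting one of the img/hdr t-value volumes. It assumes the pairs are Analyze/NIfTI images readable by nibabel; the filename is hypothetical, and the dataset's readme.txt remains the authoritative description of the contents.

```python
# Hedged sketch: load an img/hdr t-value volume and report basic stats.
# Assumes an Analyze/NIfTI header-image pair; filename is hypothetical.
import nibabel as nib
import numpy as np

img = nib.load("contrast_tvalues.img")  # hypothetical filename; .hdr must sit alongside
tvals = img.get_fdata()                 # 3D array of voxelwise t-values

print("volume shape:", tvals.shape)
print("peak |t|:", np.abs(tvals).max())
```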

ICDAR'15 SMARTPHONE DOCUMENT CAPTURE AND OCR COMPETITION (SmartDoc) - Challenge 1 (original version)
Dataset · 2015 · Zenodo (English) · Funder: EC | TECNIOSPRING
Authors: Chazalon, Joseph; Rusiñol, Marçal
CHALLENGE 1: SMARTPHONE DOCUMENT CAPTURE COMPETITION
Smartphones are replacing personal scanners. They are portable, connected, powerful and affordable, and they are on their way to becoming the new entry point in business processing applications such as document archival, ID scanning and check digitization, to name a few. To keep our workflows streamlined, we need to make these new capture devices as reliable as batch scanners. We believe an efficient capture process should be able to: detect and segment the relevant document object during the preview phase; assess the quality of the capture conditions and help the user improve them; optionally, trigger the capture at the perfect moment; and produce a high-quality, controlled output based on the high-resolution captured image.
This competition focuses on the first step of this process: efficiently detecting and segmenting document regions. (The original page linked a video showing the ideal document object detection, i.e. the ground truth, drawn as a red frame.) For this challenge, the input is a set of video clips, each containing a document from a predefined set, and the output should be an XML file giving, for each frame of the video, the coordinates of the quadrilateral within which the document can be found.
Licence for the dataset of Challenge 1 (page outline detection in preview frames): this work is licensed under a Creative Commons Attribution 4.0 International License <http://creativecommons.org/licenses/by/4.0/>. Author attribution should be given by citing the following conference paper: Jean-Christophe Burie, Joseph Chazalon, Mickaël Coustaty, Sébastien Eskenazi, Muhammad Muzzamil Luqman, Maroua Mehri, Nibal Nayef, Jean-Marc Ogier, Sophea Prum and Marçal Rusiñol: "ICDAR2015 Competition on Smartphone Document Capture and OCR (SmartDoc)", in 13th International Conference on Document Analysis and Recognition (ICDAR), 2015.
If you use this dataset, please send us a short email at <icdar.smartdoc (at) gmail.com> to tell us why it was useful to you, and whether you have results or publications we can reference on our website. Thank you!
Citations: 0 · Views: 2,615 · Downloads: 1,922
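
The expected output format described above (one quadrilateral per video frame, serialized as XML) can be sketched as follows. This is a hypothetical illustration only: the element and attribute names are assumptions, and the official schema is defined in the SmartDoc dataset documentation.

```python
# Hedged sketch of writing a per-frame quadrilateral XML file for the
# challenge. Element and attribute names below are hypothetical; consult
# the official SmartDoc documentation for the real schema.
import xml.etree.ElementTree as ET

def write_segmentation(frames, out_path):
    """frames: list of quadrilaterals, each a list of 4 (x, y) corner tuples."""
    root = ET.Element("segmentation")  # hypothetical root tag
    for i, quad in enumerate(frames, start=1):
        frame = ET.SubElement(root, "frame", index=str(i))
        # Corners ordered top-left, top-right, bottom-right, bottom-left (assumed).
        for name, (x, y) in zip(("tl", "tr", "br", "bl"), quad):
            ET.SubElement(frame, "point", name=name, x=f"{x:.1f}", y=f"{y:.1f}")
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)

# Example: one frame whose document corners were detected at these pixels.
write_segmentation(
    [[(110.0, 85.0), (530.0, 92.0), (525.0, 410.0), (105.0, 400.0)]],
    "result.xml",
)
```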