The Toolbox Bouquet is a half-day online session running on Dec 14th from 2 to 6 PM (Aix-Marseille time). Several exciting practical courses are offered on selected flowers (toolboxes) for cutting-edge MEEG data analysis.
This event will take place on Gather.town, the interactive and fun platform that you may know from LiveMEEG. Check this tutorial or these instructions for more information.
This session is free and open to all but with a limited number of seats.
Registration is mandatory and needs to be done before Friday Dec 2nd.
Flowers (courses) have different teaching approaches, characterized by the following attributes:
FLUX: A pipeline for MEG analysis
Oscar Ferrante, Tara Ghafari
University of Birmingham, Centre for Human Brain Health, UK
During our session, we will present a recently published analysis pipeline for magnetoencephalography (MEG) called FLUX. There are several open-source toolboxes developed by the community, each providing a wealth of options for analysis. This poses a challenge for reproducible research as well as for researchers new to the field. The FLUX pipeline aims to make the analysis steps and settings explicit for standard analyses using two of the most popular toolboxes: MNE-Python and FieldTrip. The FLUX pipeline goes from pre-processing to source localization of oscillatory brain activity, but it can also be used for event-related fields and multivariate pattern analysis. The pipeline includes documented code for MNE-Python and FieldTrip. During the session, we will use a sample data set on visuospatial attention to illustrate the analysis steps. We will provide all the scripts as interactive notebooks implemented in Jupyter Notebook and MATLAB Live Editor. We will also provide explanations, justifications, and graphical outputs for the essential steps. Furthermore, we will suggest text and parameter settings to be used in pre-registrations and publications. We believe that this session will help new researchers entering the MEG field by providing them with some standardization of the basic analysis steps, and will also benefit researchers who want to align their approaches across different toolboxes.
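Two of the steps the pipeline covers, averaging epochs into an event-related field and estimating oscillatory power, can be sketched in a few lines. This is an illustrative NumPy toy on synthetic data, not the FLUX code itself (which runs on MNE-Python and FieldTrip):

```python
# Illustrative sketch only: averaging epochs into an evoked response and
# estimating alpha-band power with a plain FFT. Synthetic single-channel data;
# the real FLUX pipeline operates on MEG recordings via MNE-Python/FieldTrip.
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250                           # sampling rate in Hz
n_epochs, n_times = 60, 500           # sixty 2-second epochs
t = np.arange(n_times) / sfreq

# Synthetic data: a 10 Hz oscillation plus noise in every epoch
epochs = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, (n_epochs, n_times))

evoked = epochs.mean(axis=0)          # event-related field (average over epochs)

# Power spectrum of each epoch, averaged across epochs
freqs = np.fft.rfftfreq(n_times, d=1 / sfreq)
power = (np.abs(np.fft.rfft(epochs, axis=1)) ** 2).mean(axis=0)
alpha_peak = freqs[np.argmax(power)]
print(f"peak frequency: {alpha_peak:.1f} Hz")
```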
Introducing Meggie – an MNE-Python-based graphical user interface for M/EEG analysis
University of Jyväskylä, Finland
Meggie is a free, open-source software for running MEG and EEG analyses through a graphical user interface. It is written in Python 3, runs on Linux, macOS and Windows, and uses the MNE-Python library under the hood to do the heavy lifting. It is designed to allow end-to-end analysis of MEG and EEG datasets from multiple subjects with common sensor-level analysis steps such as preprocessing, epoching and averaging, spectral analysis and time-frequency analysis. Most of the analysis steps can be run for all the subjects in one go, and results can be combined across subjects with grand averages.
The software website, including installation and usage instructions, can be found at https://github.com/cibr-jyu/meggie. We are also preparing an article: https://www.biorxiv.org/content/10.1101/2022.09.12.507592v1
In the session we will go through the basic features of the software, providing a good starting point for deciding whether it could be useful for you and how to use it. You do not need to install anything to participate.
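The grand-average logic mentioned above (average within each subject, then across subjects) can be sketched as follows. This is a toy NumPy illustration on synthetic data, not Meggie code:

```python
# A sketch of what a grand average is: per-subject evoked responses are first
# averaged within subject, then across subjects. Synthetic data; Meggie
# computes this from real recordings via MNE-Python.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_epochs, n_times = 8, 40, 200

subject_averages = []
for _ in range(n_subjects):
    # each subject: a common evoked shape plus subject-specific noise
    signal = np.hanning(n_times)                     # toy ERP waveform
    epochs = signal + rng.normal(0, 0.5, (n_epochs, n_times))
    subject_averages.append(epochs.mean(axis=0))     # subject-level average

grand_average = np.mean(subject_averages, axis=0)    # combine across subjects
print(grand_average.shape)
```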
Frites: a Python package for functional connectivity analysis and group-level statistics of neurophysiological data
Institut de Neurosciences de la Timone, Aix-Marseille Université, France
Frites is a Python toolbox for task-related functional connectivity analysis based on information theoretical approaches and group-level statistics on neurophysiological data (M/EEG, Intracranial). Frites also contains workflows accessible to users with little programming knowledge. For more information please see the Frites webpage (https://brainets.github.io/frites/) and recent releases on github (https://github.com/brainets/frites).
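The information-theoretic quantity at the heart of such connectivity analyses is mutual information between two signals. The following is a toy histogram-based estimator for intuition only; it is not the Frites API (Frites relies on more robust estimators, such as Gaussian-copula mutual information):

```python
# Minimal sketch of the information-theoretic idea behind task-related
# connectivity: mutual information (MI) between two signals. Toy histogram
# estimator, not the Frites API.
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based MI estimate in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(2)
a = rng.normal(size=5000)
coupled = a + 0.3 * rng.normal(size=5000)       # shares information with a
independent = rng.normal(size=5000)             # shares none

print(mutual_information(a, coupled))
print(mutual_information(a, independent))
```

The coupled pair yields clearly positive MI while the independent pair stays near zero (up to estimator bias), which is the contrast connectivity measures exploit.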
NeuroPycon & Ephypype
Institute for Applied Mathematics, M. Picone, National Council of Research, Rome, Italy
NeuroPycon (Meunier et al. 2020) is an open-source brain data analysis toolkit which provides Python-based template pipelines for advanced multi-processing of M/EEG, functional and anatomical MRI data, with a focus on connectivity and graph theoretical analyses.
NeuroPycon is based on the NiPype framework (Gorgolewski et al., 2011) which facilitates data analyses by wrapping numerous commonly-used neuroimaging software tools into a common Python framework.
The current implementation of NeuroPycon contains two complementary packages:
- ephypype is mainly based on the MNE-Python package (https://mne.tools/) and includes pipelines for electrophysiology analysis and a command-line interface for on-the-fly pipeline creation. Current implementations allow for MEG/EEG data import, pre-processing and cleaning by automatic removal of ocular and cardiac artefacts, in addition to sensor- or source-level connectivity analyses.
- graphpype is based on radatools (https://deim.urv.cat/~sergio.gomez/radatools.php) and is designed to investigate functional connectivity via a wide range of graph-theoretical metrics, including modular partitions.
Results visualization is provided through visbrain (http://visbrain.org/), an open-source python software devoted to graphical representation of neuroscientific data.
NeuroPycon is available for download via github (https://github.com/neuropycon) and the two principal packages are documented online (https://neuropycon.github.io/ephypype/index.html, https://neuropycon.github.io/graphpype/index.html).
The present hands-on session describes the philosophy, architecture, and functionalities of NeuroPycon packages and provides illustrative examples through interactive notebooks.
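The node/workflow pattern that NiPype (and hence NeuroPycon) builds on can be illustrated with a toy sketch: each processing step is a named node, and the workflow runs them in connection order. This is NOT the NiPype API, just the underlying idea in plain Python:

```python
# Toy illustration of the node/workflow pattern behind NiPype-style pipelines:
# each step is a node with a function; the workflow chains them. Names and
# steps here are invented for illustration only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Node:
    name: str
    func: Callable

class Workflow:
    def __init__(self):
        self.nodes = []

    def connect(self, node):
        self.nodes.append(node)
        return self

    def run(self, data):
        for node in self.nodes:          # run nodes in connection order
            data = node.func(data)
            print(f"ran {node.name}")
        return data

wf = Workflow()
wf.connect(Node("import", lambda d: d)) \
  .connect(Node("filter", lambda d: [x for x in d if abs(x) < 100])) \
  .connect(Node("average", lambda d: sum(d) / len(d)))
print(wf.run([1.0, 2.0, 500.0, 3.0]))    # the artifact value 500 is removed
```

In NiPype proper, nodes wrap external neuroimaging tools and the engine handles caching and parallel execution; the chaining principle is the same.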
Simulating Event-Related EEG Activity using SEREEGA
Brandenburg University of Technology, Germany
Electroencephalography (EEG) is a popular method to measure brain activity, with a large and increasing number of analysis techniques available to interpret the recorded data. One difficulty in developing and evaluating such analysis techniques is that no ground truth is available that describes the actual source-level activity of the brain. As such, developers must use other ways to examine the validity of their EEG analysis approaches. In particular, researchers have used simulated EEG to achieve this. Simulating EEG activity allows the results of analysis techniques to be compared to the ground-truth initial parameters of the simulation. The MATLAB-based EEG simulation toolbox SEREEGA (Simulating Event-Related EEG Activity) provides functionality that covers the majority of EEG simulation approaches used in past literature in a dedicated set of scripts (and a GUI). This free and open-source toolbox supports different head models, and allows source-level activity to be simulated using, inter alia, oscillations, event-related spectral perturbations, event-related potentials, noise, and auto-regressive models. The session will introduce SEREEGA to the audience, providing a step-by-step guide to simulating event-related EEG activity with a fully known ground truth. This session will also cover the newly released HArtMuT headmodel that can be used with SEREEGA to simulate signals coming from non-brain sources, a factor that has long been difficult but is required for the realistic simulation of artefacts.
Prerequisite: MATLAB, EEGLAB
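The core simulation idea, generating data from a known ground truth and checking the analysis against it, can be sketched in NumPy (SEREEGA itself is MATLAB; this toy uses invented parameters):

```python
# Pure-Python/NumPy sketch of the ground-truth simulation idea: an ERP
# component with known latency and amplitude is buried in noise, so the
# analysis result can be compared to the truth. Not SEREEGA code.
import numpy as np

rng = np.random.default_rng(3)
sfreq, n_times, n_epochs = 500, 500, 100
t = np.arange(n_times) / sfreq

# Ground truth: a Gaussian-shaped "P3-like" component peaking at 300 ms
true_latency, true_amp, width = 0.300, 5.0, 0.040
erp = true_amp * np.exp(-0.5 * ((t - true_latency) / width) ** 2)

epochs = erp + rng.normal(0, 2.0, (n_epochs, n_times))   # add sensor noise
evoked = epochs.mean(axis=0)                             # the "analysis"

est_latency = t[np.argmax(evoked)]
print(f"true peak at 300 ms, estimated at {est_latency * 1000:.0f} ms")
```

Because the generating parameters are known, the recovered peak latency can be scored against the truth, which is exactly what simulated EEG enables for more sophisticated methods.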
Contributing to open science with git and github
Institut de Neurosciences de la Timone, Aix-Marseille Université, France
Efforts towards reproducible science are increasingly promoted by institutions and funders, as well as by the scientific community. Using a shared repository that contains source code with automated tracking of changes is one aspect of implementing more reproducible research.
The tutorial will focus on two aspects:
• In the first part, I will present the basic commands of git, the most widely used tool for code sharing and versioning. Git can be self-hosted on a server with GitLab, but is also supported by hosting websites such as Framagit, GitHub or Bitbucket.
• In the second part, I will show how to contribute to existing large collaborative open-source projects, with reference to examples of successful open-source projects (e.g. scikit-learn, MNE-Python, python-neo, etc.).
I will also introduce how to set up your own project and allow contributions from others (including collaborators outside your lab), respecting the minimal rules that ensure efficient collaborative efforts.
Prerequisite: Have git installed on your computer; have an account on GitHub (including two-factor authentication)
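A typical first-contribution sequence looks like the sketch below, shown here in a throwaway local repository (all names, messages, and file contents are placeholders). For a real project you would fork on GitHub, clone your fork, and finish by opening a pull request:

```shell
# Hypothetical contribution workflow in a throwaway repository.
set -e
workdir=$(mktemp -d) && cd "$workdir"
git init -q demo && cd demo
git -c user.email=you@example.org -c user.name="You" \
    commit -q --allow-empty -m "initial commit"
git checkout -q -b fix-typo              # work on a dedicated feature branch
echo "corrected text" > README.md
git add README.md
git -c user.email=you@example.org -c user.name="You" \
    commit -qm "Fix typo in README"
git log --oneline                        # history ready to push / open a PR
```

Working on a branch rather than on the default branch keeps your change isolated and is what most projects expect in a pull request.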
Intro to Deconvolution Modeling with the “Unfold” toolbox
Olaf Dimigen, Benedikt Ehinger
Max Planck Institute for Human Development, Berlin, Germany
University of Stuttgart, Germany
This workshop will combine a short theoretical introduction with a hands-on tutorial to the “unfold toolbox” (http://www.unfoldtoolbox.org), a new framework for analyzing M/EEG data recorded in naturalistic paradigms (e.g., mobile brain-body imaging) or other experimental settings that involve overlapping brain responses from different events (e.g., fast multimodal stimulation, stimulus- and response-related potentials, microsaccades). It also allows the researcher to control for influences of linear or nonlinear nuisance variables on the M/EEG signal.
Prerequisite: To take part in the hands-on part, you need a computer with Matlab installed (MATLAB 2016a or newer). You will also need a version of EEGLAB, but this will be provided by us if not already available. You do not need to install the unfold toolbox before the workshop. Some familiarity with Matlab is helpful but not required.
If you want to prepare in advance, you can use these materials:
Toolbox paper: https://peerj.com/articles/7838/
Tutorial paper on applications: https://jov.arvojournals.org/article.aspx?articleid=2772164
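The deconvolution idea behind overlap correction can be sketched numerically: the continuous signal is modeled as a linear superposition of time-shifted event responses, and the response shape is recovered jointly by least squares rather than by plain epoch averaging. This is a NumPy toy with invented parameters, not the unfold toolbox itself (which is MATLAB and adds spline regressors, multiple event types, etc.):

```python
# Sketch of linear deconvolution / overlap correction: build a design matrix
# with one column per time lag of the event regressor, then solve by least
# squares. Toy data; not the unfold toolbox API.
import numpy as np

rng = np.random.default_rng(4)
n_times, kernel_len = 3000, 50

true_kernel = np.hanning(kernel_len)                 # ground-truth response
onsets = np.sort(rng.choice(n_times - kernel_len, 80, replace=False))

# Continuous signal: overlapping copies of the response plus noise
signal = rng.normal(0, 0.3, n_times)
for o in onsets:
    signal[o:o + kernel_len] += true_kernel

# Design matrix: column `lag` is 1 at (onset + lag) for every event onset
X = np.zeros((n_times, kernel_len))
for o in onsets:
    for lag in range(kernel_len):
        X[o + lag, lag] += 1

estimated, *_ = np.linalg.lstsq(X, signal, rcond=None)  # deconvolved response
err = np.abs(estimated - true_kernel).max()
print(f"max error vs ground truth: {err:.3f}")
```

With events closer together than the response duration, plain averaging would smear overlapping responses into each other; the joint least-squares solution disentangles them.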
Human Neocortical Neurosolver: A Software Tool for Cell and Circuit Level Interpretation of MEG/EEG signals
Brown University, Rhode Island, USA
MEG/EEG signals are correlated with nearly all healthy and pathological brain functions. However, it is still extremely difficult to infer the underlying cellular and circuit level origins. This limits the translation of MEG/EEG signals into novel principles of information processing, or into new treatment modalities for pathologies. To address this limitation, we built the Human Neocortical Neurosolver (HNN): an open-source software tool to help researchers and clinicians without formal computational modeling or coding experience interpret the neural origin of their human MEG/EEG data. HNN provides a graphical user interface (GUI) to an anatomically and biophysically detailed model of a neocortical circuit, with layer specific thalamocortical and cortical-cortical drives. Tutorials are provided to teach users how to begin to study the cell and circuit level origin of sensory event related potentials (ERPs) and low frequency rhythms. Once users have an understanding of the basic workflows and tutorials in the HNN GUI, those familiar with Python can work in the HNN-core Pythonic interface. In this hands-on workshop we’ll give a didactic overview of the background and development of HNN and tutorials of use with both HNN-GUI and HNN-core.
Prerequisite: HNN-GUI and HNN-core installed on your computer
OpenViBE: an open source BCI software suite
Arthur Desbois, Marie-Constance Corsi
Aramis team-project, Inria Paris, Paris Brain Institute, France
OpenViBE is an open-source software platform dedicated to designing, testing and using brain-computer interfaces (BCIs). It can be used to acquire, filter, process, classify and visualize brain signals in real time, with applications in the medical field (assistance to disabled persons, helping physical rehabilitation, neurofeedback), but also in virtual reality, video gaming and robotics.
It is compatible with a large selection of EEG hardware devices, and features a wide array of signal processing algorithms and machine learning methods. It supports scripting through Python and MATLAB, and provides an easy-to-use GUI for creating and manipulating pipelines or specific use-case scenarios, thanks to a flexible architecture based on independent processing boxes communicating with one another.
During the first part of the session, we will give a general overview of the BCI context and of the software. We will also present several examples of how OpenViBE is used in the BCI domain.
In the second part, we will give a step-by-step tutorial on the different capabilities of OpenViBE, using a motor imagery experimental protocol. Different scenarios will be analyzed in detail to showcase typical operations: signal acquisition, extraction of features of interest, and training and running a classification algorithm.
Finally, in the third part, we will discuss more advanced topics such as external interfaces (e.g. using the widely used Lab Streaming Layer transfer protocol), developing C++ boxes, using Python for scripting, and using OpenViBE as a processing engine. To illustrate this, a new application currently in development will be demoed, showing how to push things further and enable BCI use in clinical and research contexts.
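The classic motor-imagery pipeline that OpenViBE assembles from processing boxes (signal, band-power feature, classifier) can be sketched numerically. This NumPy toy with invented parameters stands in for the GUI boxes; it is not OpenViBE code:

```python
# Toy version of a motor-imagery pipeline: synthetic trials -> mu-band (8-12 Hz)
# power feature -> threshold "classifier". Illustration only, not OpenViBE.
import numpy as np

rng = np.random.default_rng(5)
sfreq, n_times = 250, 500
t = np.arange(n_times) / sfreq

def trial(mu_amp):
    """Synthetic single-channel trial with a given 10 Hz (mu rhythm) amplitude."""
    return mu_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, n_times)

def mu_band_power(x):
    """Mean FFT power in the 8-12 Hz band (the feature)."""
    freqs = np.fft.rfftfreq(len(x), d=1 / sfreq)
    power = np.abs(np.fft.rfft(x)) ** 2
    return power[(freqs >= 8) & (freqs <= 12)].mean()

# Rest trials keep the mu rhythm; imagined movement suppresses it (ERD)
rest = np.array([mu_band_power(trial(2.0)) for _ in range(30)])
imagery = np.array([mu_band_power(trial(0.2)) for _ in range(30)])

threshold = (rest.mean() + imagery.mean()) / 2      # "trained" classifier
accuracy = ((rest > threshold).sum() + (imagery <= threshold).sum()) / 60
print(f"toy classification accuracy: {accuracy:.2f}")
```

Real pipelines replace the threshold with a proper classifier (e.g. LDA) and use spatially filtered multi-channel features, but the staged structure is the same.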
AnyWave: a multi-platform software to visualize and process MEG/SEEG/EEG data
Institut de Neurosciences des Systèmes, Aix-Marseille Université, France
In the session I will give a general presentation of what the software can do for clinical users, and also show how developers can use it to implement new tools in MATLAB or Python. We will also present an example pipeline going from MEG data to ICA components and the localisation of spikes in the brain.
Preprocessing infant EEG using APICE (Automated Pipeline for Infants Continuous EEG)
UNICOG, NeuroSpin, Saclay, France
One of the main challenges when analyzing infant EEG is that recordings are short and typically heavily contaminated with artifacts. Additionally, signal features change over development, and variability across participants is higher than in adult data. APICE (Fló et al. 2022) is a set of functions based on EEGLAB and written in MATLAB, specifically developed to deal with these issues while keeping the pipeline fully automated. One of the main contributions of APICE is that it provides multiple algorithms for motion artifact detection using adaptive thresholds. The adaptive thresholds make it suitable across age groups without requiring manual rejection or substantial threshold adjustments for each protocol, resulting in better data recovery than other available pipelines. Moreover, APICE works on continuous data, which makes it suitable for different types of analysis. APICE also provides functions to define non-functional channels and data segments to reject, to correct localized artifacts, and to apply ICA using iMARA to identify components associated with physiological artifacts. APICE is fully automated, flexible, and modular; thus, it can easily be combined with other toolboxes or preprocessing steps.
In the first part of the talk (~45 minutes), I will briefly overview the challenges of infant EEG preprocessing and introduce APICE. More specifically, I will show its principles and explain the different types of functions and pipeline architecture. The second part of the talk (~45 minutes) will be a hands-on tutorial. I will provide some example scripts and recordings to preprocess. I will also explain how different parameters and steps can be adapted for different requirements.
Prerequisite: MATLAB, EEGLAB toolbox (version >= 14.0 recommended). Please download APICE (https://github.com/neurokidslab/eeg_preprocessing) and iMARA (https://github.com/Ira-marriott/iMARA/) in advance. If participants wish to preprocess their own data during the hands-on session, the data should already be in EEGLAB format (.set, .fdt) and include the electrode layout.
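The adaptive-threshold idea can be sketched numerically: instead of one fixed voltage cutoff, the rejection threshold is derived from each recording's own statistics. This NumPy toy (with an invented MAD-based rule and made-up parameters) illustrates the principle only; APICE itself is MATLAB/EEGLAB and uses its own algorithms:

```python
# Sketch of adaptive-threshold artifact detection: flag samples that exceed a
# threshold computed from the recording's own robust statistics. Toy rule for
# illustration, not the APICE algorithm.
import numpy as np

rng = np.random.default_rng(6)
n_channels, n_times = 4, 10000
eeg = rng.normal(0, 10.0, (n_channels, n_times))      # ~10 uV background
eeg[2, 4000:4050] += 200.0                            # injected motion artifact

def adaptive_bad_mask(data, n_mads=8.0):
    """Flag samples exceeding median + n_mads * MAD of |data|, per channel."""
    med = np.median(np.abs(data), axis=1, keepdims=True)
    mad = np.median(np.abs(np.abs(data) - med), axis=1, keepdims=True)
    return np.abs(data) > med + n_mads * mad

bad = adaptive_bad_mask(eeg)
print("flagged samples per channel:", bad.sum(axis=1))
```

Because the threshold scales with each channel's own noise level, the same rule works for quiet and noisy recordings without per-dataset retuning, which is the property that makes such pipelines usable across age groups.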
Model-based neuroscience: connecting simple and complex models with EEG data
Marieke van Vugt
University of Groningen, the Netherlands
As the field is becoming increasingly aware of the need for strong theories of cognition, computational models are becoming more popular as well. In this tutorial, we will talk about how we can use such computational models to better understand what is happening from moment to moment in EEG data. There are multiple approaches to doing so. Most studies focus on a signal at a single moment in time, e.g., the peak of a particular ERP component. Here, by contrast, we will discuss methods for understanding changes over the full EEG signal, moment by moment. We will then launch into specifics for one such model, the Drift Diffusion Model (DDM). We will briefly go over how to fit a DDM to behavioural data, and then discuss how we can turn the results of these fits into a continuous regressor to relate to EEG data. Following this, we will go into how to compute the canonical correlation between the DDM regressor we just created and EEG data. We will end by discussing how to extend this method to other tasks and models.
Prerequisite: Familiarity with Matlab
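For intuition, a minimal DDM simulation looks like the following: evidence accumulates with drift v plus noise until it hits one of two bounds, producing a response time and a choice on each trial. This Python sketch (parameters invented) illustrates the generative model discussed in the tutorial; it is not a fitting routine:

```python
# Minimal drift diffusion model (DDM) simulation: noisy evidence accumulation
# to one of two decision bounds. Illustration of the generative model only.
import numpy as np

rng = np.random.default_rng(7)

def simulate_ddm(v=1.0, a=1.0, dt=0.001, sigma=1.0, max_t=5.0):
    """Return (response time, choice) for one trial; bounds at +a/2 and -a/2."""
    x, t = 0.0, 0.0
    while t < max_t:
        x += v * dt + sigma * np.sqrt(dt) * rng.normal()   # Euler step
        t += dt
        if x >= a / 2:
            return t, 1          # upper-bound ("correct") response
        if x <= -a / 2:
            return t, 0          # lower-bound ("error") response
    return max_t, -1             # no decision within max_t

trials = [simulate_ddm() for _ in range(500)]
rts = [t for t, c in trials if c == 1]
upper_rate = sum(1 for _, c in trials if c == 1) / len(trials)
print(f"mean correct RT: {np.mean(rts):.3f} s, upper-bound rate: {upper_rate:.2f}")
```

The moment-by-moment accumulator trajectory inside each trial is exactly the kind of latent signal that, once the model is fit to behaviour, can serve as a continuous regressor against the EEG.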
“Take that bouquet of Alpha waves in your face”
Hans Berger (alleged)