Motor interactions between mouth, hand and foot

Published on July 8, 2016 | No comments

Intensive contraction of a muscle modulates the corticospinal excitability (CSE) not only of the contracting muscle, but also of the resting muscles located in remote parts of the body; this is the so-called “remote effect”. We investigated to what extent the CSE of a hand muscle is modulated during preparation and execution of mouth and foot movements either separately or in combination. Hand-muscle CSE was estimated based on motor evoked potentials (MEPs) elicited by transcranial magnetic stimulation (TMS) and recorded from the first dorsal interosseous (FDI) muscle.
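Hand-muscle CSE in paradigms like this one is typically quantified as the peak-to-peak amplitude of the MEP in the EMG recorded from the FDI muscle shortly after the TMS pulse. Below is a minimal sketch of that computation on a single trace; the time window, sampling rate, and synthetic data are illustrative assumptions, not the authors' analysis pipeline.

```python
# Minimal sketch (not the authors' pipeline): estimating MEP peak-to-peak
# amplitude from a single EMG trace around a TMS pulse. Window boundaries
# and sampling rate are illustrative assumptions.
import numpy as np

def mep_amplitude(emg, fs, pulse_idx, win_ms=(15, 50)):
    """Peak-to-peak EMG amplitude in a post-stimulus window (trace units)."""
    start = pulse_idx + int(win_ms[0] * fs / 1000)
    stop = pulse_idx + int(win_ms[1] * fs / 1000)
    segment = emg[start:stop]
    return segment.max() - segment.min()

# Example with synthetic data: 1 s of FDI EMG sampled at 5 kHz,
# TMS pulse at 0.5 s, an artificial "MEP" about 25 ms later.
fs = 5000
emg = np.random.randn(fs) * 0.01                              # background noise (mV)
pulse_idx = fs // 2
emg[pulse_idx + 125:pulse_idx + 175] += np.hanning(50) * 1.5  # fake MEP
print(f"MEP amplitude: {mep_amplitude(emg, fs, pulse_idx):.2f} mV")
```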

Naeem received his Ph.D. in Human Movement Science from the University of Verona and VU University of Amsterdam, where he aimed to tackle the interaction between sound perception and motor behavior, and sought to unravel its neural underpinnings. Thereafter, he spent time as a postdoc at the University of Helsinki. Recently, he joined the laboratory of Prof. Isabelle Peretz as a postdoc to perform research in the field of neuroscience of music.

Presentation by Marianne Stephan

Published on June 10, 2016 | No comments

Marianne Stephan is interested in the influence of auditory information on motor learning and memory formation, and in its underlying neuronal mechanisms. She is currently doing a postdoc with Dr. Virginia Penhune (Concordia University). The presentation will cover preliminary data from a Transcranial Magnetic Stimulation study performed last year at BRAMS.

Investigating corticospinal excitability during melody listening: a Transcranial Magnetic Stimulation study

Conference by Jonathan Fritz

Published on May 30, 2016 | No comments

Transformation from sound to meaning in the ferret auditory cortex.  

How do we make sense of the sounds we hear? We propose that there is a transformation from sound to meaning in the brain, which includes multiple steps from an initial, faithful encoding of incoming spectrotemporal acoustic patterns to further stages where the relevant auditory objects are recognized, categorized and associated with task or context-specific behavioral meaning and appropriate responses.

CRBLM-BRAMS Workshop on Mobile EEG for Neuroscience

Published on May 27, 2016 | No comments

Register: http://goo.gl/forms/c6G2FFew0ZdA8PH23 (Deadline is May 23)

Schedule:

During the day: BRAMS conference room

9 – 9:15 am: Welcome Address, Alexandre Lehmann (McGill, CRBLM-BRAMS), http://www.crblm.ca/members/regular/alexandre_lehmann
9:15 – 10 am: Mobile EEG: Toys, medical devices, and everything in between, Jeremy Moreau (NeuroSpeed Lab, MNI), http://www.mcgill.ca/bic/research/neurospeed-dynamic-neuroimaging-laboratory-baillet
10 – 10:15 am: Visualizing frequency band activity with consumer EEG, Naoto Hieda (Shared Reality Lab, McGill), http://srl.mcgill.ca
10:15 – 11:15 am: MuLES software + live demo, Raymundo Cassani (MusaeLab, INRS), http://musaelab.ca/team-view/raymundo-cassani/
11:15 am – 12 pm: LSL software presentation, Martin Bleichner (Oldenburg University), http://www.uni-oldenburg.de/psychologie/neuropsychologie/team/martin-bleichner/
12 – 1 pm: BREAK
1 – 3:30 pm: Hands-On Recording with LSL and the SMARTING Device, Martin Bleichner (Oldenburg University); a minimal LSL reading sketch follows the schedule
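For attendees unfamiliar with LSL (Lab Streaming Layer), the hands-on session revolves around pulling samples from a device that publishes an EEG stream over the local network. Below is a minimal Python sketch using pylsl; the stream type, fixed sample count, and printing are illustrative assumptions, not part of the workshop material.

```python
# Minimal sketch of reading an EEG stream over LSL (Lab Streaming Layer).
# Assumes a device (e.g. a SMARTING amplifier) is already publishing a
# stream of type 'EEG'; stream names and channel counts depend on the setup.
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream('type', 'EEG')  # blocks until an EEG stream is found
inlet = StreamInlet(streams[0])

for _ in range(1000):                    # read 1000 samples, then stop
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)
```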

LECTURE:  Pavillon Marie-Victorin Room D-440 (90 Avenue Vincent-d’Indy, Metro Édouard-Montpetit)

BRAMS-CRBLM Invited Lecture (4 – 5 pm): Martin Bleichner (Oldenburg University, Germany)

Topic: The Oldenburg approach to mobile EEG

In this talk I will present our approach to mobile EEG. The joint research cluster Hearing4All aims to better understand hearing and to improve it where necessary. Our group's project focuses on controlling hearing devices with the listener's neural activity: instead of needing a remote control to select the optimal setting, the hearing device should respond seamlessly to its user's intentions. This requires recording and understanding the neural activity related to hearing in daily-life situations. To this end, we have developed solutions for mobile EEG acquisition that allow for concealed signal acquisition. With a combination of mobile EEG, ear-centered EEG (cEEGrid, eartrode) and mobile signal acquisition (smartphone based), we can study aspects of auditory attention inside and outside the lab. I will describe our approach to mobile EEG and present a number of studies we have conducted on auditory attention. Further, I will give an overview of studies in our lab that use mobile EEG, for example to study social interactions or aspects of neurorehabilitation.

 

The lecture will be followed by a gathering back at BRAMS, in the conference room, with live demos of mobile EEG technology (N. Hieda & R. Cassani).

 

Conference by Tim Falconer

Published on May 26, 2016 | No comments

Tim Falconer, author of Bad Singer: The Surprising Science of Tone Deafness and How We Hear Music

Tim Falconer, a self-described bad singer, always wanted to make music, but after he started taking vocal lessons he learned that he’s actually amusic.

Study on the modulation of loudness perception in people with tinnitus

Published on May 20, 2016 | No comments

Our research team is interested in how noise generators modify the perception of loudness.

Workshop on Advanced Motion Capture

Published on April 26, 2016 | No comments

Program: Dr. Carolina Brum Medeiros will present a two-day workshop at BRAMS on advanced motion capture techniques for the analysis of performer movements. Using the large Qualisys system available at BRAMS, she will discuss effective marker placement and data fusion techniques for capturing expert performances, drawing on examples from various instrumental performances and sports.

Presentation by Jens Kreitewolf

Published on April 22, 2016 | No comments

The influence of voice parameters on perceptual grouping in dynamic cocktail-party listening

Cocktail parties pose a difficult, but solvable, problem for the auditory system (reviewed by Shinn-Cunningham, 2008). The cocktail-party problem is made considerably easier when all sounds within the target stream are spoken by the same talker (Bressler et al., 2014).

Lecture by Philippe Albouy, Ph.D.

Published on April 15, 2016 | No comments

"Driving brain function with non-invasive rhythmic stimulation: a new way of shaping behavior?"

Substantial efforts in neuroscience have been made to understand how humans process complex stimulus patterns. To perceive and understand such patterns, the brain uses not only specialized centers, but also connections between those regions and other more distant areas.

Conference by Pauline Larrouy-Maestri, PhD

Published on March 17, 2016 | No comments

In-tune versus out-of-tune: On the perception of pitch accuracy

Singing is a common activity. However, at special events such as birthdays or singing contests, and even in daily life, one can witness great variability among performers in pitch accuracy. We typically classify performances as in-tune or out-of-tune, but the criteria, as well as the process by which we make such judgments, remain unclear. This talk aims at (1) clarifying the definition of pitch accuracy, i.e., the criteria on which one relies when listening to melodies, (2) presenting current studies designed to elucidate the mechanisms underlying the judgment process during music perception, and (3) discussing similarities and differences with other perceptual domains.
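As background to how pitch accuracy is commonly operationalized in such work, the deviation of a sung note from its target frequency can be expressed in cents (hundredths of a semitone). The sketch below computes that deviation; the 50-cent cutoff is a hypothetical illustration, not a criterion proposed in the talk.

```python
# Minimal sketch: expressing pitch deviation in cents (hundredths of a
# semitone). The 50-cent cutoff is a hypothetical illustration only.
import math

def cents_deviation(sung_hz, target_hz):
    """Signed deviation of a sung frequency from its target, in cents."""
    return 1200 * math.log2(sung_hz / target_hz)

# Example: target A4 = 440 Hz, sung at 452 Hz (about +47 cents, i.e. sharp
# by just under half a semitone)
dev = cents_deviation(452.0, 440.0)
label = "out of tune" if abs(dev) > 50 else "in tune"
print(f"Deviation: {dev:+.1f} cents -> {label}")
```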