Literature | Videos | Student contributions

Literature

Books


  • The Diving Bell and the Butterfly


    A Memoir of Life in Death. Jean-Dominique Bauby (Author), Jeremy Leggatt (Translator). In December 1995, the 44-year-old former editor-in-chief of the French Elle magazine suffered a severe stroke that left his body paralyzed but his mind intact, a condition known as "locked-in syndrome." Able to communicate only by blinking his left eyelid, he dictated this book letter by letter to an assistant who recited a special alphabet to him. Book at Amazon

Articles

  • N. Birbaumer, N. Ghanayim, T. Hinterberger, I. Iversen, B. Kotchoubey, A. Kübler, J. Perelmouter, E. Taub, H. Flor

    A spelling device for the paralysed


    Nature 398, 297-298, March 1999. Article at publisher

  • Niels Birbaumer

    Breaking the silence: brain-computer interfaces (BCI) for communication and motor control


    Psychophysiology, 43(6):517–532, Nov 2006. This article reviews the research concerned with invasive and noninvasive BCIs from the perspective of their clinical usefulness for communication and motor restoration in paralysis. Article PDF.

  • Jonathan R Wolpaw, Niels Birbaumer, Dennis J McFarland, Gert Pfurtscheller, and Theresa M Vaughan.

    Brain-computer interfaces for communication and control.


    Clin Neurophysiol, 113(6):767–791, Jun 2002. This review summarizes the state of BCI research in 2002, with emphasis on its application to the needs of those with severe neuromuscular disabilities. Article PDF

(Alex's Article)



  • "Control of an Electrical Prosthesis With an SSVEP-Based BCI," Muller-Putz, G.R.; Pfurtscheller, G., Biomedical Engineering, IEEE Transactions on , vol.55, no.1, pp.361-364, Jan. 2008


[Adam]

Videos


  • Neurofeedback Training in children with ADHD in the group of Dr. Birbaumer

TrainYourBrain.jpg


  • Niels Birbaumer: BCI in Paralysis: An Unfulfilled Promise (video in English)


Student contributions


Difference curves for each electrode (from Betzaida)

differences.jpg
Visual Differences
(Collected from all students by Adam)

Electrode | Time (ms) | Yes/Pos | No/Neg  | +/- window (ms) | Significant?
Cp4       | 685       | 0.765   | -5.412  | 50              | significant
Cz        | 420       | 4.066   | -2.422  | 150             | significant
Poz       | 685       | 5.02    | -2.6    | 100-900         | significant
Poz       | 2710      | -2.488  | 4.886   | 200             | significant
Poz       | 2690      | -2.409  | 4.666   | 200             | significant
F3        | 95        | 95.8547 | 4.31    | 0-260           | not significant
Cpz       | 215       | 1.285   | -3.5459 | 730             | significant
T8        | 420       | 3.629   | -0.6113 | 200             | not significant
P6        | 685       | 0.6113  | -5.24   | 300-900         | significant
P4        | 1185      | 4.476   | -0.8317 | 1125-1325       | significant

Significant differences:

Cp4: 675-700 ms and 1185-1190 ms

Indices: electrode 18; sample indices are the millisecond intervals divided by 5 (5 ms per sample), i.e. [675:700]/5 and [1185:1190]/5


Significant Differences and Patterns
Kathryn Pitton

Electrodes                                       | Time (ms) | Frequency | Signal difference
C4, C2, Cz, C1, Cp4, Cp2, Cpz, Cp1, Cp3, Cp5     | 450-480   | 6 Hz      | 2000
Po8, Po4, Poz, Oz, Po3, Po7                      | 885-1110  | 8 Hz      | 2.749
F4, Fz, F3                                       | 470-505   | 6 Hz      | 1400-1550
T8, C6, C5                                       | 1255-1375 | 2 Hz      | -344.3
Cp3, Cp1, Cpz, Cp2, C3, C5, T7, P1, P3, P5       | 190-310   | 30 Hz     | -270 to -370

Automated identification of feature vectors



Effect of removing trials with high alpha power on the classification accuracy (workflow; a code sketch follows the list):

  • We use parieto-occipital alpha power as a signature of the subject's attentive state.
  • High alpha power = low attention.
  • Expectation: trials with low attention degrade classification accuracy because their brain signals differ from those of high-attention trials.
  • We determined the parieto-occipital alpha power from the 1 s interval BEFORE SENTENCE ONSET in each trial.
  • We determined a threshold for alpha power above which trials are excluded from classification.
  • We determined classification accuracy before and after removing the trials with high alpha power.
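
A minimal Python sketch of this trial-exclusion step (not the actual analysis code; the sampling rate, array layout, channel indices, and the 75th-percentile threshold are illustrative assumptions):

    import numpy as np
    from scipy.signal import welch

    # Synthetic stand-in for the epoched data: (n_trials, n_channels, n_samples),
    # with the 1 s window BEFORE sentence onset occupying the first second.
    rng = np.random.default_rng(0)
    fs = 200                                      # sampling rate (assumption)
    epochs = rng.standard_normal((120, 64, 800))  # 1 s pre-onset + 3 s post-onset
    labels = rng.integers(0, 2, 120)              # 1 = yes, 0 = no
    parieto_occipital = [55, 56, 57, 58, 59]      # placeholder indices for PO7, PO3, POz, PO4, PO8

    def pre_onset_alpha_power(epochs, channels, fs):
        """Mean 8-12 Hz power over the given channels in the 1 s window before sentence onset."""
        pre = epochs[:, channels, :fs]                      # (trials, channels, fs samples)
        freqs, psd = welch(pre, fs=fs, nperseg=fs, axis=-1)
        alpha = (freqs >= 8) & (freqs <= 12)
        return psd[:, :, alpha].mean(axis=(1, 2))           # one alpha-power value per trial

    power = pre_onset_alpha_power(epochs, parieto_occipital, fs)
    threshold = np.percentile(power, 75)          # exclusion threshold (assumption)
    keep = power <= threshold                     # trials retained for classification
    epochs_clean, labels_clean = epochs[keep], labels[keep]
    print(f"kept {keep.sum()} of {len(epochs)} trials")

Classification accuracy is then computed once on all trials and once on the reduced set, as in the tables below.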

Classification accuracy:

Feature vector: signals at the time points and electrodes where there was a significant difference between the yes and the no answer (significance level 0.05). Significance was assessed on the same trials that formed the training set. A classifier was trained for each run.
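
A minimal sketch of how such a feature vector and classifier could be built (a hypothetical illustration, not the code actually used; the data are synthetic and the LDA classifier is an assumption):

    import numpy as np
    from scipy.stats import ttest_ind
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for the segmented trials: (n_trials, n_channels, n_samples); 1 = yes, 0 = no.
    rng = np.random.default_rng(0)
    epochs = rng.standard_normal((120, 64, 600))
    labels = rng.integers(0, 2, 120)

    # Split off the training portion (here 50%); significance is assessed on the training trials only.
    X_train, X_test, y_train, y_test = train_test_split(epochs, labels, train_size=0.5, random_state=0)

    # Pointwise two-sample t-test between yes and no trials at every (electrode, time point).
    t, p = ttest_ind(X_train[y_train == 1], X_train[y_train == 0], axis=0)
    mask = p < 0.05                               # significant (electrode, time point) pairs

    # Feature vector = raw signal values at the significant points; train and test the classifier.
    clf = LinearDiscriminantAnalysis()
    clf.fit(X_train[:, mask], y_train)
    print("classification accuracy:", clf.score(X_test[:, mask], y_test))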
Percent trials in training set | Accuracy before removing high-alpha trials | Accuracy after removing high-alpha trials
25%                            | 49.05%                                     | 48.46%
50%                            | 53.81%                                     | 55.3%
75%                            | 49.75%                                     | 51.00%
Conclusions from the table above:
  • classification accuracy is around 50% -> not sufficient for BCI!
  • changing the percentage of trials included in the training set essentially does not change the classification accuracy.
  • removal of the trials with high alpha power from the classification procedure did not improve classification accuracy.

Potential avenues for improvement:
  • Check out a different classification algorithm
  • Use different feature vectors
  • Use different signatures of non-attentiveness or sleep to exclude invalid trials

Time-frequency feature vectors

Feature vector: power at the frequencies, time windows, and electrodes where there was a significant difference between the yes and the no answer (significance level 0.01). Significance was assessed on the same trials that formed the training set. A classifier was trained for each run.
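
A minimal sketch of the time-frequency feature extraction (illustrative only; the window length, overlap, and frequency cutoff are assumptions, and in the real analysis only the significant electrode/frequency/window cells would be kept):

    import numpy as np
    from scipy.signal import spectrogram

    # Synthetic stand-in for the segmented trials: (n_trials, n_channels, n_samples) at fs Hz.
    rng = np.random.default_rng(0)
    fs = 200
    epochs = rng.standard_normal((120, 64, 600))

    # Time-frequency decomposition per trial and electrode: power in sliding windows.
    freqs, times, power = spectrogram(epochs, fs=fs, nperseg=100, noverlap=50, axis=-1)
    # power has shape (n_trials, n_channels, n_freqs, n_windows)

    # Keep frequencies up to 30 Hz and flatten into one feature vector per trial; a selection of
    # significant cells (p < 0.01 on the training trials) would replace this simple band mask.
    band = freqs <= 30
    features = power[:, :, band, :].reshape(len(epochs), -1)
    print(features.shape)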

Percent trials in training set | Accuracy before removing high-alpha trials | Accuracy after removing high-alpha trials
25%                            | 48.86%                                     | 49.49%
50%                            | 53.45%                                     | 54.1%
75%                            | 49.58%                                     | 52.12%
Conclusions from the table above:
Using TFA feature vectors does not improve classification accuracy.


Fischler feature vectors

Feature vector: signals at electrodes Cz, C3, C4, F3, F4 from 400 ms to 800 ms after the onset of the last word in the sentence.

Based on a result of Fischler et al., 1983, Brain Potentials Related to Stages of Sentence Verification, Psychophysiology, vol. 20, no. 4, (see Blackboard "Additional Course Documents" for the full text)
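
A minimal sketch of how this window could be cut out of epoched data (hypothetical channel list and sampling rate; epochs are assumed to start at the onset of the last word):

    import numpy as np

    fs = 200                                                  # sampling rate (assumption)
    ch_names = ["Fz", "Cz", "C3", "C4", "F3", "F4", "Pz"]     # placeholder channel list
    rng = np.random.default_rng(0)
    epochs = rng.standard_normal((120, len(ch_names), 600))   # (trials, channels, samples)

    # Electrodes and time window from Fischler et al. (1983): Cz, C3, C4, F3, F4 at 400-800 ms.
    picks = [ch_names.index(ch) for ch in ["Cz", "C3", "C4", "F3", "F4"]]
    start, stop = int(0.4 * fs), int(0.8 * fs)

    # Flatten the selected signals into one feature vector per trial.
    features = epochs[:, picks, start:stop].reshape(len(epochs), -1)
    print(features.shape)                                     # (120, 5 * 80)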

Percent trials in training set | Accuracy before removing high-alpha trials | Accuracy after removing high-alpha trials
25%                            | 46.77%                                     | 44.53%
50%                            | 49.91%                                     | 52.26%
75%                            | 42.5%                                      | 44.63%

Furdea feature vectors

Feature vector: signals at electrodes Fz, Cz, Cpz, P3, Pz, P4, PO7, POz, PO8, Oz in the 2-second interval following the end of the sentence (approximately 500 ms to 2500 ms in our data set)

Furdea et al., 2011, A new (semantic) reflexive brain–computer interface: In search for a suitable classifier, J. Neurosci. Methods, vol. 203 (see Blackboard "Additional Course Documents" for the full text)

Percent trials in training set | Accuracy before removing high-alpha trials | Accuracy after removing high-alpha trials
25%                            | 50.15%                                     | 50.59%
50%                            | 53.45%                                     | 54.05%
75%                            | 46.67%                                     | 39.96%

General Summary of Program
Segment the data into epochs from 0 to 3 seconds after the onset of the last word.
Motivation: conditioning with the electric shock occurred at this point, and the last word determined the truth value of the sentence. The last word lasted approximately 500 ms. We expect these brain signals to distinguish between the true and false responses.
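
A minimal sketch of this segmentation step (illustrative; the continuous recording, onset indices, and sampling rate are synthetic stand-ins):

    import numpy as np

    fs = 200                                   # sampling rate (assumption)
    rng = np.random.default_rng(0)
    raw = rng.standard_normal((64, 120000))    # continuous recording stand-in: (channels, samples)
    onsets = np.arange(1000, 115000, 1200)     # last-word onset sample indices (placeholder)

    # Epoch 0 to 3 s after the onset of the last word of each sentence.
    epoch_len = 3 * fs
    epochs = np.stack([raw[:, on:on + epoch_len] for on in onsets])
    print(epochs.shape)                        # (n_trials, n_channels, 3 * fs)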

The CLIS patient is presented with true and false statements.
Goal: choose brain signals that differentiate these statements. We expect the largest differences within the 3000 ms interval we chose (see above).

Next, we visualized the EEG data from the CLIS patients.
Motivation for using EEG: it is cheap and portable; pro: high temporal resolution, con: low spatial resolution.
We visualized the data with a movie to show how the average potentials change over time across the brain: topoplots for spatial visualization, line graphs for temporal visualization.

Event related potentials (ERP)
Then we looked at the mean EEG response over time for each electrode and condition and computed the corresponding ERPs. The event here is the onset of the last word. ERPs are computed by averaging the signal over all trials to obtain the average response, which is expected to capture the specific response to this event.
ERPs were computed in the time domain for negative and positive sentences. We hope for a difference between these ERPs that differentiates yes and no answers. We obtain one ERP per electrode and condition: red curves show ERPs of positive sentences, blue curves those of negative sentences.
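
A minimal sketch of the ERP computation and the red/blue plot for one electrode (synthetic data; electrode index and sampling rate are placeholders):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    fs = 200
    epochs = rng.standard_normal((120, 64, 600))   # stand-in for the segmented trials
    labels = rng.integers(0, 2, 120)               # 1 = positive ("yes"), 0 = negative ("no")
    t = np.arange(600) / fs * 1000                 # time axis in ms after last-word onset

    # ERP = mean over trials, separately for each condition.
    erp_pos = epochs[labels == 1].mean(axis=0)     # (n_channels, n_samples)
    erp_neg = epochs[labels == 0].mean(axis=0)

    ch = 18                                        # example electrode index (placeholder)
    plt.plot(t, erp_pos[ch], color="red", label="positive sentences")
    plt.plot(t, erp_neg[ch], color="blue", label="negative sentences")
    plt.xlabel("time after last-word onset (ms)")
    plt.ylabel("amplitude")
    plt.legend()
    plt.show()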

We looked at the visual differences between the curves and collected the results on this page (see the list above). Betzaida plotted the differences between the average positive and average negative curves, then took the integral of the resulting difference curves to identify the largest differences automatically.
We used a t-test to determine whether the differences between the two average curves (ERPs) are statistically significant, and superimposed the significance on our ERP plots.

We did the same thing with the time-frequency analysis (TFA). The main idea of TFA is to see the contribution of each frequency to the signal, i.e. the power spectrum (power vs. frequency). The difference between TFA and a plain frequency analysis is the time dimension: because of the uncertainty principle, frequency and time cannot both be determined with arbitrary precision. In TFA we look at how the power spectrum changes over time: the data are split into time windows, and a Fourier transform yields a power spectrum for each window. To visualize this, we color-code the power at each frequency and time window, and compute a TFA for each condition. In the difference of the TFAs between positive and negative sentences, a reddish value means the power at that time window and electrode was stronger for positive than for negative sentences; blue means the opposite.
We then looked at the significant differences between the positive and negative figures by subtracting the two power spectra from each other; a negative value means the response to positive stimuli was smaller. Significant differences are visualized by setting all non-significant values to 0 (shown as white), mapping positive values to red and negative values to blue.
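
A minimal sketch of this masked TFA difference plot for one electrode (synthetic data; electrode index, window length, and significance level are assumptions):

    import numpy as np
    from scipy.signal import spectrogram
    from scipy.stats import ttest_ind
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    fs = 200
    epochs = rng.standard_normal((120, 64, 600))   # stand-in for the segmented trials
    labels = rng.integers(0, 2, 120)               # 1 = positive, 0 = negative

    freqs, times, power = spectrogram(epochs, fs=fs, nperseg=100, noverlap=50, axis=-1)

    ch = 18                                        # example electrode (placeholder)
    pos, neg = power[labels == 1, ch], power[labels == 0, ch]
    diff = pos.mean(axis=0) - neg.mean(axis=0)     # positive minus negative power

    # Mask non-significant cells (two-sample t-test per frequency/time cell) so they plot as white.
    _, p = ttest_ind(pos, neg, axis=0)
    diff[p >= 0.05] = 0

    lim = np.abs(diff).max()
    plt.pcolormesh(times, freqs, diff, cmap="bwr", shading="auto", vmin=-lim, vmax=lim)
    plt.xlabel("time (s)")
    plt.ylabel("frequency (Hz)")
    plt.colorbar(label="power difference (positive - negative)")
    plt.show()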

For classification we need a training set to train the classifier (i.e. align the hyperplane so that it separates the two classes). We also need a test set to test the accuracy of the classifier.

Feature Vectors
Feature vectors are a "window" into the problem. Our feature vectors are the data extracted from the experiment where significant differences occur. We want the feature vectors to be as low-dimensional as possible while still adequately differentiating between the data classes. The feature vector strongly depends on what we choose to characterize. We used four feature vectors: ERP, TFA, Fischler (ERP), and Furdea (ERP). We pick different parts of the signal, construct the feature vector from them, run the classification, and determine the classification accuracy.

The results of our classification accuracies are consistent with Dr. Birbaumer's. We looked at the effects of attention on the classification based on the parieto-occipital alpha power 1 second prior to the onset of the sentence. High alpha power is interpreted as low attention. The trials with low attention were removed and the classification done on the reduced set. This essentially did not improve classification accuracy.

Potential improvements
Use different feature vectors based on coherence: a time-frequency measure, but of the consistency of the relative phase between two electrodes rather than of power (see the sketch below).
Use different signatures of attention and sleep.
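
A minimal sketch of a coherence-based feature (illustrative electrode pair and frequency band; not part of the analysis we actually ran):

    import numpy as np
    from scipy.signal import coherence

    rng = np.random.default_rng(0)
    fs = 200
    epochs = rng.standard_normal((120, 64, 600))   # stand-in for the segmented trials

    # Magnitude-squared coherence between two electrodes, per trial;
    # the channel indices are placeholders for, e.g., Cz and Pz.
    ch_a, ch_b = 18, 30
    freqs, coh = coherence(epochs[:, ch_a], epochs[:, ch_b], fs=fs, nperseg=100)
    # coh has shape (n_trials, n_freqs); a band average could serve as one feature per trial.
    alpha = (freqs >= 8) & (freqs <= 12)
    features = coh[:, alpha].mean(axis=1)
    print(features.shape)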


Literature Research: Attention
1) Bahramisharif, A. et al, “Covert attention allows for continuous control of brain-computer interfaces”, European Journal of Neuroscience.
This study focused on how covert attention, i.e. mentally focusing on a peripheral sensory stimulus while keeping the gaze fixed, can be used as a control signal in BCI experiments. This research also worked on training subjects to modulate posterior alpha activity through various applications.

2) Treder, M., “Brain-computer interfacing using modulations of alpha activity induced by covert shifts of attention”, Journal of NeuroEngineering and Rehabilitation
This research investigated different pairs of directions of attention shifts and the reliability of differentiating between them in visual brain-computer interface experiments. The results found substantial variety in those direction pairs which were reliably classified with a mean accuracy of 74.6% for the best-classifiable pair.

3) Marchetti, et al, “Brain-computer interfacing using modulations of alpha activity induced by covert shifts of attention”, Neurorehabilitation and Neural Repair
As patients with amyotrophic lateral sclerosis have limitations in eye movement, especially in the final stages of the disease, this study sought to improve BCIs which rely on visual interfaces by orienting the covert visuospatial attention of ALS patients. The researchers concluded that implementing endogenous visuospatial attention orienting in the design of covert visuospatial attention-based interfaces is better suited to designing visual BCIs for such ALS patients.

4) Treder, Matthias S. and Benjamin Blankertz, “Covert attention and visual speller design in an ERP-based brain-computer interface”
This article compares covert attention (attending to the visual periphery) with overt attention (eye movements) in an ERP-based (event-related potential) BCI. Both can be used for an ERP BCI; classification relied heavily on early components under overt attention and on later components under covert attention.

5) Choon Guan Lim, “A Brain-Computer Interface Based Attention Training Program for Treating Attention Deficit Hyperactivity Disorder”
Using EEG sensors, a group of unmedicated ADHD patients underwent 24 sessions of BCI training with a feed-forward game. Results showed that the change in the EEG-based severity measure correlated directly with the patients' improvement. The patients' attention span improved significantly over the course of the training, suggesting that BCI could help train attention.

6) Birbaumer, Niels, “Neurofeedback and Brain–Computer Interface: Clinical Applications”
While neurofeedback and BCI have mostly been used for paralysis or ALS patients, they have also been shown to provide significant improvement in those suffering from ADHD. BCI work has shown that attention can be controlled through repeated monitoring of the brain and learning to control one's own brain waves.


7) Hill, Lal, Bierig, Birbaumer, Schölkopf, “An Auditory Paradigm for BCI”
The authors tried to create a paradigm that enables the user to focus their auditory attention. This would benefit CLIS patients, since it requires no eye-movement control, only hearing. They used support vector machine classification and recursive channel elimination on averaged event-related potentials and found that an untrained user can do this with a high level of accuracy.
This shows that users can control their EEG signals within the first trial simply by focusing on the auditory stimulus presented to them.

8) “Brain Computer Interface based 3D Game for Attention Training and Rehabilitation”
This article describes the development of a video game controlled by the brain, in which the user's attention controls the movement of a virtual hand. This could be used to train people with ADHD.
It is also said to help patients recovering from traumatic experiences.

9) “Classification of cognitive states of attention and relaxation using supervised learning algorithms”
EEG sensors were used to read brain frequencies in real time during a video game and to classify the two cognitive states of attention and relaxation from the recorded brain waves. The aim was to estimate behavior from human brain waves.

10) “EEG Spectral Analysis for attention state assessment: graphical Versus Classical Classification Techniques”
This work focuses on a brain attention state that could be used to selectively run an application on a mobile device. A spectral analysis of the recorded EEG was performed to compute the alpha-band power for different subjects during attentive and non-attentive tasks. The power values were used to discriminate between the two attention states with over 70% accuracy; a classification accuracy of 83% was obtained.


Presentation Summary on Speech, Attention, and Sleep
By Betzaida & contributions from Rodrigo

Speech

Presentation



Attention Group

distinguishing yes/no responses

speech processing can be related to a device that reads brain signatures while a person is speaking

goal: create a device that can process a person's brain wave patterns to either synthesize speech or form commands



The article presented research like ours, but it used a different conditioning than what we need

a question is asked where the last word triggers the thought of yes or no; the noise that follows within 500 ms then develops a difference

our data are cut off right before the last word is spoken, so we are essentially seeing the reaction

then they receive 500 ms of electric shock



Dr. Dodel showed research findings

temporal evolution of the frequency bands, obtained by changing the time interval

an indication that the subject is actually processing the sentence


second article Dr. Dodel showed

neural synchronization mediates on-line sentence processing: EEG coherence evidence from filler-gap constructions

we don't have filler-gap constructions, so it is essentially not related, but we get information on sentence processing

not related so we moved on to another one

third article

analysis of time-varying synchronization

not promising



4th article

increased neuronal communication accompanying sentence comprehension

observed in theta, beta, and gamma frequencies => description of what each one excluded

shows differences across sentence complexities

whereas ours has consistent complexity


5th

Brain potentials related to stages of sentence verification

30 years old but highly related

viewed potentials on true and false sentences

which showed the effect 250 to 450 ms after the sentence was given

didn't state the brain regions affected

t-tests were used to show significance

we could use this immediately to see if in our data we have this as well



three properties we could use

1. the signal value

2. the power spectrum

(power over time)

3. coherence





ATTENTION

3 categories

ADHD

20 to 30

BCI used to help ADHD

Covert Attention

moving the eyes

being able to use eye

alpha waves

measuring peripheral activity of the eye to control BCI

locked in, originally only able to move eye in one direction for 3rd source

Using ERP based BCI

Auditory Attention

theta and beta waves

80% accurate





SLEEP

primarily detected by theta and delta waves

there are wireless BCI applications

various classifications for sleep analysis

time frequency analysis

analysis of spindle amplitudes and tendencies

shows slow wave activity



can be used on our data to determine whether the patient is sleeping or not

by analyzing the theta and delta waves



see how much sleep they are getting

sleep deprivation sometimes affects patients' responses

but in this case, it is not pertinent

because their sleep and wake cycles are completely different







discussion on what we would like to do with the data in the remaining weeks