Brain–computer interface
A brain–computer interface (BCI), sometimes called a brain–machine interface (BMI), is a direct communication pathway between the brain's electrical activity and an external device, most commonly a computer or robotic limb. BCIs are often directed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions.[1] They are often conceptualized as a human–machine interface that skips the intermediary component of the physical movement of body parts, although they also raise the possibility of the erasure of the discreteness of brain and machine. Implementations of BCIs range from non-invasive (EEG, MEG, MRI) and partially invasive (ECoG and endovascular) to invasive (microelectrode array), based on how close electrodes get to brain tissue.[2]
Research on BCIs began in the 1970s by Jacques Vidal at the University of California, Los Angeles (UCLA) under a grant from the National Science Foundation, followed by a contract from DARPA.[3][4] Vidal's 1973 paper marks the first appearance of the expression brain–computer interface in scientific literature.
Due to the cortical plasticity of the brain, signals from implanted prostheses can, after adaptation, be handled by the brain like natural sensor or effector channels.
Recently, studies in human–computer interaction through the application of machine learning to statistical temporal features extracted from frontal-lobe EEG data have achieved high levels of success in classifying mental states (relaxed, neutral, concentrating) and mental emotional states (negative, neutral, positive).
History
The history of brain–computer interfaces (BCIs) starts with Hans Berger's discovery of the electrical activity of the human brain and the development of electroencephalography (EEG). In 1924 Berger was the first to record human brain activity by means of EEG. Berger was able to identify oscillatory activity, such as Berger's wave or the alpha wave (8–13 Hz), by analyzing EEG traces.
Berger's first recording device was very rudimentary. He inserted silver wires under the scalps of his patients. These were later replaced by silver foils attached to the patient's head by rubber bandages. Berger connected these sensors to a Lippmann capillary electrometer, with disappointing results. However, more sophisticated measuring devices, such as the Siemens double-coil recording galvanometer, which displayed electric voltages as small as one ten thousandth of a volt, led to success.
Berger analyzed the interrelation of alterations in his EEG wave diagrams with brain diseases. EEGs permitted completely new possibilities for the research of human brain activities.
Although the term had not yet been coined, one of the earliest examples of a working brain-machine interface was the piece Music for Solo Performer (1965) by the American composer Alvin Lucier, who used amplified alpha waves from scalp electrodes to resonate percussion instruments through loudspeakers.
After his early contributions, Vidal was not active in BCI research, nor BCI events such as conferences, for many years. In 2011, however, he gave a lecture in Graz, Austria, supported by the Future BNCI project, presenting the first BCI, which earned a standing ovation. Vidal was joined by his wife, Laryce Vidal, who previously worked with him at UCLA on his first BCI project.
In 1988, a report described noninvasive EEG control of a physical object, a robot. The experiment involved repeatedly starting, stopping, and restarting the robot's movement along an arbitrary trajectory defined by a line drawn on the floor. Line-following was the default robot behavior, utilizing autonomous intelligence and an autonomous source of energy.[15][16] This 1988 report by Stevo Bozinovski, Mihail Sestakov, and Liljana Bozinovska was the first on robot control using EEG.[17][18]
In 1990, a report described a closed-loop, bidirectional adaptive BCI that controlled a computer buzzer using an anticipatory brain potential, the contingent negative variation (CNV).[19][20] The experiment described how an expectation state of the brain, manifested by the CNV, controls the S2 buzzer in a feedback loop within the S1-S2-CNV paradigm. The obtained cognitive wave representing expectation learning in the brain was named the electroexpectogram (EXG). The CNV brain potential was part of the BCI challenge presented by Vidal in his 1973 paper.
Studies in the 2010s suggested the potential ability of neural stimulation to restore functional connectivity and associated behaviors through modulation of molecular mechanisms of synaptic efficacy.[21][22] This opened the door to the concept that BCI technologies may be able to restore function in addition to enabling functionality.
Since 2013, DARPA has funded BCI technology through the BRAIN initiative, which has supported work out of the University of Pittsburgh Medical Center,[23] Paradromics,[24] Brown,[25] and Synchron,[26] among others.
Versus neuroprosthetics
Neuroprosthetics is an area of neuroscience concerned with neural prostheses, that is, artificial devices that replace the function of impaired parts of the nervous system or sensory organs. The most widely used neuroprosthetic device is the cochlear implant, which partially restores hearing in deaf people.
The terms are sometimes used interchangeably. Neuroprosthetics and BCIs seek to achieve the same aims, such as restoring sight, hearing, movement, the ability to communicate, and even cognitive function. Both use similar experimental methods and surgical techniques.
Animal BCI research
Several laboratories have managed to record signals from monkey and rat cerebral cortices to operate BCIs to produce movement. Monkeys have navigated computer cursors on screen and commanded robotic arms to perform simple tasks simply by thinking about the task and seeing the results, without any motor output.
In 2020, Elon Musk's Neuralink device was successfully implanted in a pig,[30] announced in a widely viewed webcast. In 2021, Musk announced that he had successfully enabled a monkey to play video games using Neuralink's device.[31]
Early work
In 1969 the operant conditioning studies of Fetz and colleagues, at the Regional Primate Research Center and Department of Physiology and Biophysics, University of Washington School of Medicine in Seattle, showed for the first time that monkeys could learn to control the deflection of a biofeedback meter arm with neural activity.[32] Similar work in the 1970s established that monkeys could quickly learn to voluntarily control the firing rates of individual and multiple neurons in the primary motor cortex if they were rewarded for generating appropriate patterns of neural activity.[33]
Studies that developed algorithms to reconstruct movements from motor cortex neurons, which control movement, date back to the 1970s.
There has been rapid development in BCIs since the mid-1990s.
Prominent research successes
Kennedy and Yang Dan
Phillip Kennedy (who later founded Neural Signals in 1987) and colleagues built the first intracortical brain–computer interface by implanting neurotrophic-cone electrodes into monkeys.
In 1999, researchers led by Yang Dan at the University of California, Berkeley decoded neuronal firings to reproduce images seen by cats. The team used an array of electrodes embedded in the thalamus (which integrates all of the brain's sensory input) of sharp-eyed cats. Researchers targeted 177 brain cells in the lateral geniculate nucleus area of the thalamus, which decodes signals from the retina. The cats were shown eight short movies, and their neuron firings were recorded. Using mathematical filters, the researchers decoded the signals to generate movies of what the cats saw and were able to reconstruct recognizable scenes and moving objects.[36] Similar results in humans have since been achieved by researchers in Japan (see below).
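At its core, decoding of this kind fits a linear mapping between recorded firing rates and the stimulus. A minimal sketch in Python with entirely synthetic data (the cell and pixel counts here are illustrative, not the study's, and the least-squares decoder is a simplification of the filters the researchers used):

```python
import numpy as np

# Synthetic stand-in for the experiment: "movie" frames drive firing
# rates through an unknown linear encoding; we fit a linear decoder
# that maps rates back to the stimulus.
rng = np.random.default_rng(1)
n_samples, n_cells, n_pixels = 500, 20, 4

stimulus = rng.standard_normal((n_samples, n_pixels))      # frames over time
encoding = rng.standard_normal((n_pixels, n_cells))        # unknown tuning
rates = stimulus @ encoding + 0.1 * rng.standard_normal((n_samples, n_cells))

# Fit the decoder by least squares: stimulus ~ rates @ w
w, *_ = np.linalg.lstsq(rates, stimulus, rcond=None)
reconstruction = rates @ w

corr = np.corrcoef(reconstruction.ravel(), stimulus.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")  # close to 1.0
```

With enough cells relative to the stimulus dimensionality and modest noise, the linear decoder recovers the stimulus almost exactly; real neural data is far noisier and nonlinear, which is why reconstructed scenes were recognizable rather than pixel-perfect.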
Nicolelis
Miguel Nicolelis, a professor at Duke University, in Durham, North Carolina, has been a prominent proponent of using multiple electrodes spread over a greater area of the brain to obtain neuronal signals to drive a BCI.
After conducting initial studies in rats during the 1990s, Nicolelis and his colleagues developed BCIs that decoded brain activity in owl monkeys and used the devices to reproduce monkey movements in robotic arms. Monkeys have advanced reaching and grasping abilities and good hand manipulation skills, making them ideal test subjects for this kind of work.
By 2000, the group succeeded in building a BCI that reproduced owl monkey movements while the monkey operated a joystick or reached for food.[37] The BCI operated in real time and could also control a separate robot remotely over Internet Protocol. But the monkeys could not see the arm moving and did not receive any feedback, a so-called open-loop BCI.
Later experiments by Nicolelis using rhesus monkeys succeeded in closing the feedback loop and reproduced monkey reaching and grasping movements in a robot arm.
Donoghue, Schwartz and Andersen
Other laboratories which have developed BCIs and algorithms that decode neuron signals include those run by John Donoghue at Brown University, Andrew Schwartz at the University of Pittsburgh, and Richard Andersen at Caltech.
John Donoghue's lab at the Carney Institute reported training rhesus monkeys to use a BCI to track visual targets on a computer screen (closed-loop BCI) with or without assistance of a joystick.[41] Schwartz's group created a BCI for three-dimensional tracking in virtual reality and also reproduced BCI control in a robotic arm.[42] The same group also created headlines when they demonstrated that a monkey could feed itself pieces of fruit and marshmallows using a robotic arm controlled by the animal's own brain signals.[43][44][45]
Andersen's group used recordings of premovement activity from the posterior parietal cortex in their BCI, including signals created when experimental animals anticipated receiving a reward.[46]
Other research
In addition to predicting the kinematic and kinetic parameters of limb movements, BCIs that predict the electromyographic (electrical) activity of primate muscles are being developed. Such BCIs could be used to restore mobility in paralyzed limbs by electrically stimulating muscles.
Miguel Nicolelis and colleagues demonstrated that the activity of large neural ensembles can predict arm position. This work made possible the creation of BCIs that read arm movement intentions and translate them into movements of artificial actuators. Carmena and colleagues[38] programmed the neural coding in a BCI that allowed a monkey to control reaching and grasping movements performed by a robotic arm. Lebedev and colleagues[39] argued that brain networks reorganize to create a new representation of the robotic appendage in addition to the representation of the animal's own limbs.
In 2019, researchers from UCSF published a study where they demonstrated a BCI that had the potential to help patients with speech impairment caused by neurological disorders. Their BCI used high-density electrocorticography to tap neural activity from a patient's brain and used deep learning methods to synthesize speech.[48][49] In 2021, researchers from the same group published a study showing the potential of a BCI to decode words and sentences in an anarthric patient who had been unable to speak for over 15 years.[50][51]
The biggest impediment to BCI technology at present is the lack of a sensor modality that provides safe, accurate and robust access to brain signals. It is conceivable or even likely, however, that such a sensor will be developed within the next twenty years. The use of such a sensor should greatly expand the range of communication functions that can be provided using a BCI.
Development and implementation of a BCI system is complex and time-consuming. In response to this problem, Gerwin Schalk has been developing a general-purpose system for BCI research, called BCI2000. BCI2000 has been in development since 2000 in a project led by the Brain–Computer Interface R&D Program at the Wadsworth Center of the New York State Department of Health in Albany, New York, United States.[52]
A new 'wireless' approach uses light-gated ion channels, such as channelrhodopsin, to control the activity of genetically defined subsets of neurons in vivo.
The use of BMIs has also led to a deeper understanding of neural networks and the central nervous system. Research has shown that despite the inclination of neuroscientists to believe that neurons have the most effect when working together, single neurons can be conditioned through the use of BMIs to fire in a pattern that allows primates to control motor outputs. The use of BMIs has led to the development of the single-neuron insufficiency principle, which states that even with a well-tuned firing rate, single neurons can carry only a limited amount of information; the highest level of accuracy is therefore achieved by recording the firings of the collective ensemble. Other principles discovered with the use of BMIs include the neuronal multitasking principle, the neuronal mass principle, the neural degeneracy principle, and the plasticity principle.[54]
BCIs are also proposed to be applied by users without disabilities. A user-centered categorization of BCI approaches by Thorsten O. Zander and Christian Kothe introduces the term passive BCI. Next to active and reactive BCIs, which are used for directed control, passive BCIs allow for assessing and interpreting changes in the user state during human–computer interaction.
Beyond BCI systems that decode neural activity to drive external effectors, BCI systems may be used to encode signals from the periphery. These sensory BCI devices enable real-time, behaviorally-relevant decisions based upon closed-loop neural stimulation.[56]
The BCI Award
The Annual BCI Research Award is awarded in recognition of outstanding and innovative research in the field of Brain-Computer Interfaces. Each year, a renowned research laboratory is asked to judge the submitted projects. The jury consists of world-leading BCI experts recruited by the awarding laboratory. The jury selects twelve nominees, then chooses a first, second, and third-place winner, who receive awards of $3,000, $2,000, and $1,000, respectively.
Human BCI research
Invasive BCIs
Invasive BCI requires surgery to implant electrodes into the brain for communicating brain signals. The main advantage is a more accurate reading; the downsides include side effects from the surgery, such as scar tissue, which can weaken brain signals. In addition, according to the research of Abdulkader et al. (2015),[57] the body may not accept the implanted electrodes, which can cause medical complications.
Vision
Invasive BCI research has targeted repairing damaged sight and providing new functionality for people with paralysis. Invasive BCIs are implanted directly into the grey matter of the brain during neurosurgery. Because they lie in the grey matter, invasive devices produce the highest quality signals of BCI devices but are prone to scar-tissue build-up, causing the signal to become weaker, or even non-existent, as the body reacts to a foreign object in the brain.[58]
In vision science, direct brain implants have been used to treat non-congenital (acquired) blindness. One of the first scientists to produce a working brain interface to restore sight was private researcher William Dobelle.
Dobelle's first prototype was implanted into "Jerry", a man blinded in adulthood, in 1978. A single-array BCI containing 68 electrodes was implanted onto Jerry's visual cortex and succeeded in producing phosphenes, the sensation of seeing light.
In 2002, Jens Naumann, also blinded in adulthood, became the first in a series of 16 paying patients to receive Dobelle's second generation implant, marking one of the earliest commercial uses of BCIs. The second generation device used a more sophisticated implant enabling better mapping of phosphenes into coherent vision. Phosphenes are spread out across the visual field in what researchers call "the starry-night effect". Immediately after his implant, Jens was able to use his imperfectly restored vision to drive an automobile slowly around the parking area of the research institute.[60] Unfortunately, Dobelle died in 2004[61] before his processes and developments were documented. Subsequently, when Mr. Naumann and the other patients in the program began having problems with their vision, there was no relief and they eventually lost their "sight" again. Naumann wrote about his experience with Dobelle's work in Search for Paradise: A Patient's Account of the Artificial Vision Experiment[62] and has returned to his farm in Southeast Ontario, Canada, to resume his normal activities.[63]
Movement
BCIs focusing on motor neuroprosthetics aim to either restore movement in individuals with paralysis or provide devices to assist them, such as interfaces with computers or robot arms.
Researchers at Emory University in Atlanta, led by Philip Kennedy and Roy Bakay, were first to install a brain implant in a human that produced signals of high enough quality to simulate movement. Their patient, Johnny Ray (1944–2002), developed "locked-in syndrome" after suffering a brain-stem stroke in 1997.
More recently, research teams led by the BrainGate group at Brown University and a group led by University of Pittsburgh Medical Center, both in collaborations with the United States Department of Veterans Affairs, have demonstrated further success in direct control of robotic prosthetic limbs with many degrees of freedom using direct connections to arrays of neurons in the motor cortex of patients with tetraplegia.[67][68]
Communication
In May 2021, a Stanford University team reported a successful proof-of-concept test that enabled a quadriplegic participant to input English sentences at about 86 characters per minute and 18 words per minute. The participant imagined moving his hand to write letters, and the system performed handwriting recognition on electrical signals detected in the motor cortex, utilizing hidden Markov models and recurrent neural networks for decoding.[69][70]
A study published in July 2021 reported that a paralyzed patient was able to communicate at 15 words per minute using a brain implant that analyzed signals from motor neurons that previously controlled the vocal tract.[71][50]
In a recent review article, researchers raised an open question of whether human information transfer rates can surpass that of language with BCIs. Given that recent language research has demonstrated that human information transfer rates are relatively constant across many languages, there may exist a limit at the level of information processing in the brain. On the contrary, this "upper limit" of information transfer rate may be intrinsic to language itself, as a modality for information transfer.[72]
In 2023, two studies used BCIs with recurrent neural networks to decode speech at record rates of 62 words per minute and 78 words per minute.[73][74][75]
Technical challenges
There exist a number of technical challenges to recording brain activity with invasive BCIs. Advances in CMOS technology are pushing and enabling integrated, invasive BCI designs with smaller size, lower power requirements, and higher signal acquisition capabilities.[76] Invasive BCIs involve electrodes that penetrate brain tissue in an attempt to record action potential signals (also known as spikes) from individual, or small groups of, neurons near the electrode. The interface between a recording electrode and the electrolytic solution surrounding neurons has been modelled using the Hodgkin-Huxley model.[77][78]
Electronic limitations to invasive BCIs have been an active area of research in recent decades. While intracellular recordings of neurons reveal action potential voltages on the scale of hundreds of millivolts, chronic invasive BCIs rely on recording extracellular voltages which typically are three orders of magnitude smaller, existing at hundreds of microvolts.[79] Further adding to the challenge of detecting signals on the scale of microvolts is the fact that the electrode-tissue interface has a high capacitance at small voltages. Due to the nature of these small signals, for BCI systems that incorporate functionality onto an integrated circuit, each electrode requires its own amplifier and ADC, which convert analog extracellular voltages into digital signals.[79] Because a typical neuron action potential lasts for one millisecond, BCIs measuring spikes must have sampling rates ranging from 300 Hz to 5 kHz. Yet another concern is that invasive BCIs must be low-power, so as to dissipate less heat to surrounding tissue; at the most basic level more power is traditionally needed to optimize signal-to-noise ratio.[78] Optimal battery design is an active area of research in BCIs.[80]
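The per-electrode amplifier-and-ADC architecture and the 300 Hz to 5 kHz sampling requirement together determine how much raw data an implant must move off-chip. A back-of-envelope sketch in Python (the function name and the 96-channel, 12-bit parameter values are illustrative assumptions, not figures from any cited system):

```python
def raw_data_rate_bps(n_channels: int, sample_rate_hz: float, adc_bits: int) -> float:
    """Raw (uncompressed) output data rate in bits per second for a
    front end where every electrode has its own amplifier and ADC."""
    return n_channels * sample_rate_hz * adc_bits

# Hypothetical example: a 96-channel microelectrode array sampled at
# 5 kHz (the upper end of the spike-sampling range above) with 12-bit ADCs.
rate = raw_data_rate_bps(96, 5_000, 12)
print(f"{rate / 1e6:.2f} Mbit/s")  # 5.76 Mbit/s
```

Numbers of this magnitude illustrate why on-implant power dissipation and data transmission, not just electrode sensitivity, dominate invasive BCI design.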
Challenges in the area of materials science are central to the design of invasive BCIs. Variations in signal quality over time have been commonly observed with implantable microelectrodes.[81][82] Optimal material and mechanical characteristics for long-term signal stability in invasive BCIs have been an active area of research.[83] It has been proposed that the formation of glial scarring, secondary to damage at the electrode-tissue interface, is likely responsible for electrode failure and reduced recording performance.[84] Research has suggested that blood-brain barrier leakage, either at the time of insertion or over time, may be responsible for the inflammatory and glial reaction to chronic microelectrodes implanted in the brain.[84][85] As a result, flexible[86][87][88] and tissue-like designs[89][90] have been researched and developed to minimize foreign-body reaction by matching the Young's modulus of the electrode more closely to that of brain tissue.[89]
Partially invasive BCIs
Partially invasive BCI devices are implanted inside the skull but rest outside the brain rather than within the grey matter. They produce higher-resolution signals than non-invasive BCIs, in which the bone tissue of the cranium deflects and deforms signals, and carry a lower risk of forming scar tissue in the brain than fully invasive BCIs. There has been preclinical demonstration of intracortical BCIs from the stroke perilesional cortex.[91]
Endovascular
A systematic review published in 2020 detailed multiple studies, both clinical and non-clinical, dating back decades investigating the feasibility of endovascular BCIs.[92]
In recent years, the biggest advance in partially invasive BCIs has emerged in the area of interventional neurology.
The Stentrode, a stent-mounted electrode array developed by Synchron, is delivered through the blood vessels to a vein overlying the motor cortex, avoiding the need to open the skull. First-in-human trials with the Stentrode are underway.
ECoG
Electrocorticography (ECoG) measures the electrical activity of the brain taken from beneath the skull in a similar way to non-invasive electroencephalography, but the electrodes are embedded in a thin plastic pad that is placed above the cortex, beneath the dura mater.[98] ECoG technologies were first trialled in humans in 2004 by Eric Leuthardt and Daniel Moran from Washington University in St. Louis. In a later trial, the researchers enabled a teenage boy to play Space Invaders using his ECoG implant.[99] This research indicates that control is rapid, requires minimal training, and may be an ideal tradeoff with regards to signal fidelity and level of invasiveness.[note 1]
Signals can be either subdural or epidural, but are not taken from within the brain parenchyma itself. Until recently, ECoG had not been studied extensively owing to limited access to subjects. Currently, the only way to acquire the signal for study is through patients requiring invasive monitoring for localization and resection of an epileptogenic focus.
ECoG is a very promising intermediate BCI modality because it has higher spatial resolution, better signal-to-noise ratio, wider frequency range, and lower training requirements than scalp-recorded EEG, and at the same time has lower technical difficulty, lower clinical risk, and may have superior long-term stability compared with intracortical single-neuron recording.[101] This feature profile and recent evidence of a high level of control with minimal training requirements show potential for real-world application for people with motor disabilities.[102][103] Light reactive imaging BCI devices are still in the realm of theory.
Recent work published by Edward Chang and Joseph Makin from UCSF revealed that ECoG signals could be used to decode speech from epilepsy patients implanted with high-density ECoG arrays.
Non-invasive BCIs
There have also been experiments in humans using non-invasive neuroimaging technologies as interfaces. The substantial majority of published BCI work involves noninvasive EEG-based BCIs.
Functional near-infrared spectroscopy
In 2014 and 2017, a BCI using functional near-infrared spectroscopy for "locked-in" patients with amyotrophic lateral sclerosis (ALS) was able to restore some basic ability of the patients to communicate with other people.
Electroencephalography (EEG)-based brain-computer interfaces
After the BCI challenge was stated by Vidal in 1973, the initial reports on the non-invasive approach included control of a cursor in 2D using VEP (Vidal 1977), control of a buzzer using CNV (Bozinovska et al. 1988, 1990), control of a physical object, a robot, using a brain rhythm (alpha) (Bozinovski et al. 1988), and control of text written on a screen using P300 (Farwell and Donchin, 1988).[13]
In the early days of BCI research, another substantial barrier to using electroencephalography (EEG) as a brain–computer interface was the extensive training required before users could operate the technology. For example, in experiments beginning in the mid-1990s, Niels Birbaumer at the University of Tübingen in Germany trained severely paralysed people to self-regulate the slow cortical potentials in their EEG to such an extent that these signals could be used as a binary signal to control a computer cursor.[108] (Birbaumer had earlier trained epileptics to prevent impending fits by controlling this low-voltage wave.) The experiment saw ten patients trained to move a computer cursor by controlling their brainwaves. The process was slow, requiring more than an hour for patients to write 100 characters with the cursor, while training often took many months. However, the slow cortical potential approach to BCIs has not been used in several years, since other approaches require little or no training, are faster and more accurate, and work for a greater proportion of users.
Another research parameter is the type of oscillatory activity that is measured. Birbaumer's later research with Jonathan Wolpaw at New York State University focused on developing technology that would allow users to choose the brain signals they found easiest to use to operate a BCI, including mu and beta rhythms.
A further parameter is the method of feedback used, as shown in studies of P300 signals. Patterns of P300 waves are generated involuntarily (stimulus-feedback) when people see something they recognize, and may allow BCIs to decode categories of thoughts without training patients first.
In 2005, research was reported on EEG emulation of digital control circuits for BCI, with the example of a CNV flip-flop.[109] In 2009, noninvasive EEG control of a robotic arm using a CNV flip-flop was reported.[110] In 2011, control of two robotic arms solving the Tower of Hanoi task with three disks using a CNV flip-flop was reported.[111] In 2015, EEG emulation of a Schmitt trigger, flip-flop, demultiplexer, and modem was described.[112]
While EEG-based brain-computer interfaces have been pursued extensively by a number of research labs, recent advancements made by Bin He and his team at the University of Minnesota suggest the potential of an EEG-based brain-computer interface to accomplish tasks close to those of an invasive brain-computer interface.
In addition to a brain-computer interface based on brain waves as recorded from scalp EEG electrodes, Bin He and co-workers explored a virtual EEG signal-based brain-computer interface by first solving the EEG inverse problem and then using the resulting virtual EEG for brain-computer interface tasks. Well-controlled studies suggested the merits of such a source-analysis-based brain-computer interface.[116]
A 2014 study found that severely motor-impaired patients could communicate faster and more reliably with non-invasive EEG BCI, than with any muscle-based communication channel.[117]
A 2016 study found that the Emotiv EPOC device may be more suitable for control tasks using the attention/meditation level or eye blinking than the Neurosky MindWave device.[118]
A 2019 study found that the application of evolutionary algorithms could improve EEG mental state classification with a non-invasive Muse device, enabling high quality classification of data acquired by a cheap consumer-grade EEG sensing device.[119]
In a 2021 systematic review of randomized controlled trials using BCI for upper-limb rehabilitation after stroke, EEG-based BCI was found to have significant efficacy in improving upper-limb motor function compared to control therapies. More specifically, BCI studies that utilized band power features, motor imagery, and functional electrical stimulation in their design were found to be more efficacious than alternatives.[120] Another 2021 systematic review focused on robotic-assisted EEG-based BCI for hand rehabilitation after stroke. Improvement in motor assessment scores was observed in three of eleven studies included in the systematic review.[121]
Dry active electrode arrays
In the early 1990s Babak Taheri, at the University of California, Davis, demonstrated the first single-channel and multichannel dry active electrode arrays using micro-machining. The single-channel dry EEG electrode construction and results were published in 1994.[122] The arrayed electrode was also demonstrated to perform well compared to silver/silver chloride electrodes. The device consisted of four sites of sensors with integrated electronics to reduce noise by impedance matching. The advantages of such electrodes are: (1) no electrolyte used, (2) no skin preparation, (3) significantly reduced sensor size, and (4) compatibility with EEG monitoring systems. The active electrode array is an integrated system made of an array of capacitive sensors with local integrated circuitry housed in a package with batteries to power the circuitry. This level of integration was required to achieve the functional performance obtained by the electrode.
The electrode was tested on an electrical test bench and on human subjects in four modalities of EEG activity, namely: (1) spontaneous EEG, (2) sensory event-related potentials, (3) brain stem potentials, and (4) cognitive event-related potentials. The performance of the dry electrode compared favorably with that of the standard wet electrodes in terms of skin preparation, no gel requirements (dry), and higher signal-to-noise ratio.[123]
In 1999, researchers at Case Western Reserve University in Cleveland, Ohio, led by Hunter Peckham, used a 64-electrode EEG skullcap to return limited hand movements to a quadriplegic man, Jim Jatich.
SSVEP mobile EEG BCIs
In 2009, the NCTU Brain-Computer-Interface-headband was reported. The researchers who developed this BCI-headband also engineered silicon-based microelectromechanical system (MEMS) dry electrodes designed for application to non-hairy sites of the body.
In 2011, researchers reported a cellular based BCI with the capability of taking EEG data and converting it into a command to cause the phone to ring. This research was supported in part by Abraxis Bioscience LLP, the U.S. Army Research Laboratory, and the Army Research Office. The developed technology was a wearable system composed of a four channel bio-signal acquisition/amplification module, a wireless transmission module, and a Bluetooth enabled cell phone. The electrodes were placed so that they pick up steady state visual evoked potentials (SSVEPs).[126] SSVEPs are electrical responses to flickering visual stimuli with repetition rates over 6 Hz[126] that are best found in the parietal and occipital scalp regions of the visual cortex.[127][128][129] It was reported that with this BCI setup, all study participants were able to initiate the phone call with minimal practice in natural environments.[130]
The scientists claim that their studies using a single channel fast Fourier transform (FFT) and multiple channel system canonical correlation analysis (CCA) algorithm support the capacity of mobile BCIs.[126][131] The CCA algorithm has been applied in other experiments investigating BCIs with claimed high performance in accuracy as well as speed.[132] While the cellular based BCI technology was developed to initiate a phone call from SSVEPs, the researchers said that it can be translated for other applications, such as picking up sensorimotor mu/beta rhythms to function as a motor-imagery based BCI.[126]
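The multichannel CCA approach referenced above can be illustrated compactly: each candidate flicker frequency is represented by sine/cosine reference signals (plus harmonics), and the frequency whose references correlate best with the recorded channels wins. A minimal Python/numpy sketch on synthetic data (the sampling rate, channel count, and noise level are assumptions for illustration, not parameters from the cited studies):

```python
import numpy as np

def max_canonical_corr(x, y):
    """Largest canonical correlation between the column spaces of x and y,
    computed via QR decompositions of the centered matrices."""
    qx, _ = np.linalg.qr(x - x.mean(axis=0))
    qy, _ = np.linalg.qr(y - y.mean(axis=0))
    return float(np.linalg.svd(qx.T @ qy, compute_uv=False)[0])

def cca_ssvep_detect(eeg, fs, candidate_freqs, n_harmonics=2):
    """Return the candidate stimulus frequency whose sine/cosine reference
    set correlates best (in the CCA sense) with the multichannel EEG."""
    t = np.arange(eeg.shape[0]) / fs
    best_f, best_r = None, -1.0
    for f in candidate_freqs:
        ref = np.column_stack([fn(2 * np.pi * h * f * t)
                               for h in range(1, n_harmonics + 1)
                               for fn in (np.sin, np.cos)])
        r = max_canonical_corr(eeg, ref)
        if r > best_r:
            best_f, best_r = f, r
    return best_f

# Synthetic 2-channel recording with a 12 Hz SSVEP buried in noise
fs = 250.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(42)
signal = np.sin(2 * np.pi * 12 * t)
eeg = np.column_stack([signal + 0.8 * rng.standard_normal(t.size),
                       0.7 * signal + 0.8 * rng.standard_normal(t.size)])
print(cca_ssvep_detect(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # 12.0
```

Because CCA pools evidence across channels and harmonics, it tends to outperform single-channel spectral peak-picking, which is consistent with the accuracy and speed claims reported for CCA-based mobile BCIs.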
In 2013, comparative tests were performed on Android cell phone, tablet, and computer based BCIs, analyzing the power spectrum density of resultant EEG SSVEPs. The stated goals of this study, which involved scientists supported in part by the U.S. Army Research Laboratory, were to "increase the practicability, portability, and ubiquity of an SSVEP-based BCI, for daily use". It was reported that the stimulation frequency on all mediums was accurate, although the cell phone's signal demonstrated some instability. The amplitudes of the SSVEPs for the laptop and tablet were also reported to be larger than those of the cell phone. These two qualitative characterizations were suggested as indicators of the feasibility of using a mobile stimulus BCI.[131]
Limitations
In 2011, researchers stated that continued work should address ease of use, robustness of performance, and reduction of hardware and software costs.[126]
One of the difficulties with EEG readings is the large susceptibility to motion artifacts.[133] In most of the previously described research projects, the participants were asked to sit still, reducing head and eye movements as much as possible, and measurements were taken in a laboratory setting. However, since the emphasized application of these initiatives had been in creating a mobile device for daily use,[131] the technology had to be tested in motion.
In 2013, researchers tested mobile EEG-based BCI technology, measuring SSVEPs from participants as they walked on a treadmill at varying speeds. This research was supported by the Office of Naval Research, Army Research Office, and the U.S. Army Research Laboratory. Stated results were that as speed increased the SSVEP detectability using CCA decreased. As independent component analysis (ICA) had been shown to be efficient in separating EEG signals from noise,[134] the scientists applied ICA to CCA extracted EEG data. They stated that the CCA data with and without ICA processing were similar. Thus, they concluded that CCA independently demonstrated a robustness to motion artifacts that indicates it may be a beneficial algorithm to apply to BCIs used in real world conditions.[128] One of the major problems in EEG-based BCI applications is the low spatial resolution. Several solutions have been suggested to address this issue since 2019, which include: EEG source connectivity based on graph theory, EEG pattern recognition based on Topomap, EEG-fMRI fusion, and so on.
Prosthesis and environment control
Non-invasive BCIs have also been applied to enable brain-control of prosthetic upper and lower extremity devices in people with paralysis. For example, Gert Pfurtscheller of Graz University of Technology and colleagues demonstrated a BCI-controlled functional electrical stimulation system to restore upper extremity movements in a person with tetraplegia due to spinal cord injury.[135] Between 2012 and 2013, researchers at the University of California, Irvine demonstrated for the first time that it is possible to use BCI technology to restore brain-controlled walking after spinal cord injury. In their spinal cord injury research study, a person with paraplegia was able to operate a BCI-robotic gait orthosis to regain basic brain-controlled ambulation.[136][137] In 2009 Alex Blainey, an independent researcher based in the UK, successfully used the Emotiv EPOC to control a 5-axis robot arm.[138] He then went on to make several demonstration mind-controlled wheelchairs and home automation systems that could be operated by people with limited or no motor control, such as those with paraplegia and cerebral palsy.
Research into military use of BCIs funded by DARPA has been ongoing since the 1970s.[3][4] The current focus of research is user-to-user communication through analysis of neural signals.[139]
MEG and MRI
Magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) have both been used successfully as non-invasive BCIs.[140] In a widely reported experiment, fMRI allowed two users being scanned to play Pong in real-time by altering their haemodynamic response or brain blood flow through biofeedback techniques.[141]
fMRI measurements of haemodynamic responses in real time have also been used to control robot arms with a seven-second delay between thought and movement.[142]
In 2008 research developed in the Advanced Telecommunications Research (ATR)
In 2011 researchers from UC Berkeley published[144] a study reporting second-by-second reconstruction of videos watched by the study's subjects, from fMRI data. This was achieved by creating a statistical model relating visual patterns in videos shown to the subjects, to the brain activity caused by watching the videos. This model was then used to look up the 100 one-second video segments, in a database of 18 million seconds of random YouTube videos, whose visual patterns most closely matched the brain activity recorded when subjects watched a new video. These 100 one-second video extracts were then combined into a mashed-up image that resembled the video being watched.[145][146][147]
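The lookup stage of such a reconstruction can be illustrated with a toy nearest-neighbor sketch: score every clip in a database by the correlation between its model-predicted brain response and the recorded response, then average the best-matching clips. All sizes, the random "encoding model", and the feature vectors below are invented for illustration, not taken from the Berkeley study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_clips, feat_dim = 50, 1000, 64

clip_features = rng.normal(size=(n_clips, feat_dim))           # visual features per 1 s clip
W = rng.normal(size=(feat_dim, n_voxels)) / np.sqrt(feat_dim)  # toy linear encoding model
predicted = clip_features @ W                                  # predicted voxel response per clip

def reconstruct(recorded, k=100):
    # correlate the recorded response with every clip's predicted response
    z = (predicted - predicted.mean(1, keepdims=True)) / predicted.std(1, keepdims=True)
    r = (recorded - recorded.mean()) / recorded.std()
    scores = z @ r
    top = np.argsort(scores)[::-1][:k]
    # the "reconstruction" is the average of the k best-matching clips' features
    return clip_features[top].mean(axis=0), top

# Simulate a subject watching clip 42: its predicted response plus measurement noise
recorded = predicted[42] + 0.3 * rng.normal(size=n_voxels)
recon, top = reconstruct(recorded)
print(top[0])  # clip 42 should rank first
```

Averaging the top matches rather than returning only the single best one is what produces the blurry "mashed-up" quality of the published reconstructions.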
BCI control strategies in neurogaming
Motor imagery
Bio/neurofeedback for passive BCI designs
Biofeedback is used to monitor a subject's mental relaxation. In some cases, biofeedback does not monitor electroencephalography (EEG), but instead bodily parameters such as
Visual evoked potential (VEP)
A VEP is an electrical potential recorded after a subject is presented with a visual stimulus. There are several types of VEPs.
Steady-state visually evoked potentials (SSVEPs) use potentials generated by exciting the retina with visual stimuli modulated at certain frequencies. SSVEP stimuli are often formed from alternating checkerboard patterns, and at times simply use flashing images. The frequency of the phase reversal of the stimulus can be clearly distinguished in the spectrum of an EEG, which makes detection of SSVEP stimuli relatively easy. SSVEP has proved successful within many BCI systems for several reasons: the elicited signal is measurable in as large a population as the transient VEP, and blink and electrocardiographic artefacts do not affect the frequencies being monitored. In addition, the SSVEP signal is exceptionally robust; the topographic organization of the primary visual cortex is such that a broad area receives afferents from the central, or foveal, region of the visual field. SSVEP does have several drawbacks, however. Because SSVEPs use flashing stimuli to infer a user's intent, the user must gaze at one of the flashing or iterating symbols in order to interact with the system. The symbols are therefore likely to become irritating and uncomfortable during longer play sessions, which can often last more than an hour.
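Because the flicker frequency shows up directly in the EEG spectrum, a minimal SSVEP detector can simply compare spectral amplitude at each candidate frequency. The synthetic signal, sampling rate, and candidate set below are illustrative:

```python
import numpy as np

def ssvep_frequency(signal, fs, candidates):
    # pick the candidate flicker frequency with the largest spectral amplitude
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    amp = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in candidates}
    return max(amp, key=amp.get)

fs, dur = 250, 4
t = np.arange(fs * dur) / fs
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 15 * t) + rng.normal(size=t.size)  # 15 Hz SSVEP + noise
print(ssvep_frequency(signal, fs, [10, 12, 15, 20]))  # → 15
```

In a spelling or gaming interface, each on-screen symbol flickers at a distinct frequency, so the detected frequency identifies the symbol the user is gazing at.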
Another type of VEP used in applications is the P300 potential. The P300 event-related potential is a positive peak in the EEG that occurs roughly 300 ms after the appearance of a target stimulus (a stimulus for which the user is waiting or seeking) or an oddball stimulus. The P300 amplitude decreases as the target stimuli and the ignored stimuli grow more similar. The P300 is thought to be related to a higher-level attention process or an orienting response, and using it as a control scheme has the advantage that the participant only has to attend limited training sessions. The first application to use the P300 model was the P300 matrix. Within this system, a subject would choose a letter from a 6-by-6 grid of letters and numbers. The rows and columns of the grid flashed sequentially, and every time the selected "choice letter" was illuminated the user's P300 was (potentially) elicited. However, the communication process, at approximately 17 characters per minute, was quite slow. The P300 offers a discrete selection rather than a continuous control mechanism. The advantage of P300 use within games is that the player does not have to learn an entirely new control system, and so only has to undertake short training instances to learn the gameplay mechanics and basic use of the BCI paradigm.[148]
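The selection logic of the P300 matrix reduces to taking the letter at the intersection of the row flash and the column flash that elicited the largest average P300 amplitude. A toy sketch, with simulated amplitudes rather than real EEG:

```python
import numpy as np

grid = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("56789_")])

def select_letter(row_amps, col_amps):
    # the attended letter sits where the strongest row and column responses cross
    return grid[int(np.argmax(row_amps)), int(np.argmax(col_amps))]

rng = np.random.default_rng(0)
row_amps = rng.normal(1.0, 0.3, size=6)  # mean P300 amplitude per row flash
col_amps = rng.normal(1.0, 0.3, size=6)  # mean P300 amplitude per column flash
row_amps[2] += 3.0   # simulated P300 when row 2 flashes
col_amps[3] += 3.0   # simulated P300 when column 3 flashes
print(select_letter(row_amps, col_amps))  # → P
```

In practice each row and column is flashed many times and the single-trial responses are averaged before the argmax, which is part of why the original speller was limited to roughly 17 characters per minute.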
Non-brain-based human–computer interface (physiological computing)
Human-computer interaction can benefit from other recording modalities, such as EOG and eye-tracking. However, these modalities do not record brain activity and therefore do not fall within the exact scope of BCIs, but rather can be grouped under the wider field of physiological computing.[151]
Electro-oculography (EOG)
In 1989, a report was given on control of a mobile robot by eye movement using electrooculography (EOG) signals. A mobile robot was driven from a start to a goal point using five EOG commands, interpreted as forward, backward, left, right, and stop.[152]
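A five-command EOG controller like the one described can be sketched as a pair of thresholds on the horizontal and vertical EOG channels. The threshold value and sign conventions below are hypothetical, not taken from the 1989 report:

```python
def eog_command(horizontal, vertical, thresh=100.0):
    # horizontal/vertical: EOG deflections in microvolts (hypothetical scale);
    # small deflections on both channels mean "stop"
    if abs(horizontal) < thresh and abs(vertical) < thresh:
        return "stop"
    # otherwise the dominant axis and its sign pick the command
    if abs(vertical) >= abs(horizontal):
        return "forward" if vertical > 0 else "backward"
    return "right" if horizontal > 0 else "left"

print([eog_command(h, v) for h, v in
       [(0, 0), (0, 200), (0, -200), (200, 0), (-200, 0)]])
# → ['stop', 'forward', 'backward', 'right', 'left']
```

A real controller would also debounce the signal over time so that a single glance does not send the robot off course.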
Pupil-size oscillation
A 2016 article[153] described an entirely new communication device and non-EEG-based human-computer interface, which requires no visual fixation, or ability to move the eyes at all. The interface is based on covert interest; directing one's attention to a chosen letter on a virtual keyboard, without the need to move one's eyes to look directly at the letter. Each letter has its own (background) circle which micro-oscillates in brightness differently from all of the other letters. The letter selection is based on best fit between unintentional pupil-size oscillation and the background circle's brightness oscillation pattern. Accuracy is additionally improved by the user's mental rehearsing of the words 'bright' and 'dark' in synchrony with the brightness transitions of the letter's circle.
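The "best fit" selection can be illustrated by correlating the pupil-size trace with each letter's brightness-oscillation pattern and choosing the letter with the highest correlation. This toy sketch ignores the sign and delay of the real pupillary light response; the frequencies and noise level are invented:

```python
import numpy as np

def select_by_pupil(pupil_trace, letter_patterns):
    # choose the letter whose brightness oscillation best matches the pupil trace
    corrs = {letter: np.corrcoef(pupil_trace, pattern)[0, 1]
             for letter, pattern in letter_patterns.items()}
    return max(corrs, key=corrs.get)

fs, dur = 60, 5
t = np.arange(fs * dur) / fs
# each letter's background circle oscillates in brightness at its own frequency
patterns = {letter: np.sin(2 * np.pi * f * t)
            for letter, f in [("A", 0.7), ("B", 1.0), ("C", 1.3)]}

rng = np.random.default_rng(0)
pupil = patterns["B"] + 0.5 * rng.normal(size=t.size)  # user covertly attends to "B"
print(select_by_pupil(pupil, patterns))  # → B
```

Mentally rehearsing "bright" and "dark" in time with the chosen circle, as the article describes, strengthens exactly the correlation this sketch measures.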
Synthetic telepathy
In a $6.3 million US Army initiative to invent devices for
In 2002 Kevin Warwick had an array of 100 electrodes fired into his nervous system in order to link it to the Internet and investigate enhancement possibilities. With this in place, Warwick successfully carried out a series of experiments. With electrodes also implanted into his wife's nervous system, they conducted the first direct electronic communication experiment between the nervous systems of two humans.[155][156][157][158]
Another group of researchers achieved conscious brain-to-brain communication between two people separated by a distance using non-invasive equipment in contact with the participants' scalps. The words were encoded as binary streams of 0s and 1s produced by the imagined motor activity of the person "emitting" the information. In this experiment, the encoded words "hola" ("hi" in Spanish) and "ciao" ("goodbye" in Italian) were transmitted mind-to-mind between humans separated by a distance, with blocked motor and sensory systems, with a low to nonexistent probability of this happening by chance.[159]
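The coding side of such an experiment is simple: each character becomes a fixed-length bit string, and each bit is conveyed by the presence or absence of a motor-imagery event. The 8-bit ASCII scheme below is for illustration only; the study used its own encoding:

```python
def encode_word(word):
    # one motor-imagery event per "1" bit, rest per "0" bit (8 bits per character)
    return [int(b) for c in word for b in format(ord(c), "08b")]

def decode_bits(bits):
    # regroup the bit stream into bytes and map each back to a character
    return "".join(chr(int("".join(str(b) for b in bits[i:i + 8]), 2))
                   for i in range(0, len(bits), 8))

bits = encode_word("hola")
print(len(bits), decode_bits(bits))  # → 32 hola
```

At the low bit rates achievable with motor imagery, even a four-letter word like this takes many minutes to transmit, which is why such demonstrations used very short messages.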
In the 1960s a researcher, after some training, successfully used EEG alpha waves to produce Morse code. Research funded by the US Army is being conducted with the goal of allowing users to compose a message in their head, then transfer that message to a particular individual with the power of thought alone.[160] On 27 February 2013 the group of Miguel Nicolelis at Duke University and IINN-ELS successfully connected the brains of two rats with electronic interfaces that allowed them to directly share information, in the first-ever direct brain-to-brain interface.[161][162][163]
Cell-culture BCIs
Researchers have built devices to interface with neural cells and entire neural networks in cultures outside animals. As well as furthering research on animal implantable devices, experiments on cultured neural tissue have focused on building problem-solving networks, constructing basic computers and manipulating robotic devices. Research into techniques for stimulating and recording from individual neurons grown on semiconductor chips is sometimes referred to as neuroelectronics or neurochips.[164]
Development of the first working neurochip was claimed by a Caltech team led by Jerome Pine and Michael Maher in 1997.[165] The Caltech chip had room for 16 neurons.
In 2003 a team led by Theodore Berger, at the University of Southern California, started work on a neurochip designed to function as an artificial or prosthetic hippocampus. The neurochip was designed to function in rat brains and was intended as a prototype for the eventual development of higher-brain prosthesis. The hippocampus was chosen because it is thought to be the most ordered and structured part of the brain and is the most studied area. Its function is to encode experiences for storage as long-term memories elsewhere in the brain.[166]
In 2004 Thomas DeMarse at the
Collaborative BCIs
The idea of combining/integrating brain signals from multiple individuals was introduced at Humanity+ @Caltech, in December 2010, by a
Ethical considerations
The advent of brain–computer interfaces poses significant ethical questions. These neural interfaces facilitate direct communication between the human brain and external devices, and the ethical landscape surrounding them is intricate and multifaceted, encompassing concerns about privacy, autonomy, consent, and the potential societal implications of merging human cognition with machine interfaces. Examining these considerations highlights the balance between technological advancement and the safeguarding of fundamental human rights and values. Many of the concerns raised can be divided into two groups: user-centric issues, and legal and social issues.
Ethical concerns in the user-centric sphere tend to revolve around the safety of the user and the effects this technology will have on them over time. These include, but are not limited to: the largely unknown long-term effects on the user; obtaining informed consent from people who have difficulty communicating; the consequences of BCI technology for the quality of life of patients and their families; health-related side effects (e.g., neurofeedback of sensorimotor rhythm training is reported to affect sleep quality); therapeutic applications and their potential misuse; safety risks; the irreversibility of some of the changes made to the brain; and lack of access to maintenance, repair, and spare parts in case of company bankruptcy.[180]
The legal and social aspects of BCIs present significant challenges for any entity attempting to make BCIs mainstream. Concerns include issues of accountability and responsibility: claims that the influence of BCIs overrides free will and control over sensory-motor actions; claims that cognitive intention was inaccurately translated due to a BCI malfunction; personality changes caused by deep-brain stimulation; concerns about the state of becoming a "cyborg", having parts of the body that are living and parts that are mechanical; questions about personality and what it means to be human; and the blurring of the division between human and machine, including the inability to distinguish between human- and machine-controlled actions.
In their current form, most BCIs are far removed from the ethical issues considered above. They are actually similar to corrective therapies in function. Clausen stated in 2009 that "BCIs pose ethical challenges, but these are conceptually similar to those that bioethicists have addressed for other realms of therapy".[184] Moreover, he suggests that bioethics is well-prepared to deal with the issues that arise with BCI technologies. Haselager and colleagues[185] pointed out that expectations of BCI efficacy and value play a great role in ethical analysis and the way BCI scientists should approach media. Furthermore, standard protocols can be implemented to ensure ethically sound informed-consent procedures with locked-in patients.
The case of BCIs today has parallels in medicine, as will their evolution. Just as pharmaceutical science began as a means of correcting impairments and is now used to increase focus and reduce the need for sleep, BCIs will likely transform gradually from therapies into enhancements.[186] Efforts are being made within the BCI community to create consensus on ethical guidelines for BCI research, development, and dissemination.[187] As innovation continues, ensuring equitable access to BCIs will be crucial; otherwise, generational inequalities could arise that adversely affect the right to human flourishing.
Low-cost BCI-based interfaces
Recently, a number of companies have scaled back medical-grade EEG technology to create inexpensive BCIs for research as well as entertainment purposes. For example, toys such as the NeuroSky and Mattel MindFlex have seen some commercial success.
- In 2006 Sony patented a neural interface system allowing radio waves to affect signals in the neural cortex.[188]
- In 2007 NeuroSky released the first affordable consumer based EEG along with the game NeuroBoy. This was also the first large scale EEG device to use dry sensor technology.[189]
- In 2008 OCZ Technology developed a device for use in video games relying primarily on electromyography.[190]
- In 2008 Final Fantasy developer Square Enix announced that it was partnering with NeuroSky to create a game, Judecca.[191][192]
- In 2009 Mattel partnered with NeuroSky to release the Mindflex, a game that used an EEG to steer a ball through an obstacle course. It is by far the best selling consumer based EEG to date.[191][193]
- In 2009 Uncle Milton Industries partnered with NeuroSky to release the Star Wars Force Trainer, a game designed to create the illusion of possessing the Force.[191][194]
- In 2009 Emotiv released the EPOC, a 14 channel EEG device that can read 4 mental states, 13 conscious states, facial expressions, and head movements. The EPOC is the first commercial BCI to use dry sensor technology, which can be dampened with a saline solution for a better connection.[195]
- In November 2011 Time magazine selected "necomimi" produced by Neurowear as one of the best inventions of the year. The company announced that it expected to launch a consumer version of the garment, consisting of catlike ears controlled by a brain-wave reader produced by NeuroSky, in spring 2012.[196]
- In February 2014 They Shall Walk (a nonprofit organization focused on constructing exoskeletons, dubbed LIFESUITs, for paraplegics and quadriplegics) began a partnership with James W. Shakarji on the development of a wireless BCI.[197]
- In 2016, a group of hobbyists developed an open-source BCI board that sends neural signals to the audio jack of a smartphone, dropping the cost of entry-level BCI to £20.[198] Basic diagnostic software is available for Android devices, as well as a text entry app for Unity.[199]
- In 2020, NextMind released a dev kit including an EEG headset with dry electrodes at $399.[200][201] The device can be played with some demo applications or developers can create their own use cases using the provided Software Development Kit.
Future directions
A consortium consisting of 12 European partners has completed a roadmap to support the European Commission in their funding decisions for the new framework program
Other recent publications have also explored future BCI directions for new groups of disabled users (e.g.,[10][204]).
Disorders of consciousness (DOC)
Some people have a
These and other articles describe new challenges and solutions to use BCI technology to help persons with DOC. One major challenge is that these patients cannot use BCIs based on vision. Hence, new tools rely on auditory and/or vibrotactile stimuli. Patients may wear headphones and/or vibrotactile stimulators placed on the wrists, neck, leg, and/or other locations. Another challenge is that patients may fade in and out of consciousness and can only communicate at certain times. This may indeed be a cause of mistaken diagnosis. Some patients may only be able to respond to physicians' requests for a few hours per day (which might not be predictable ahead of time) and thus may have been unresponsive during diagnosis. Therefore, new methods rely on tools that are easy to use in field settings, even without expert help, so family members and other people without any medical or technical background can still use them. This reduces the cost, time, need for expertise, and other burdens with DOC assessment. Automated tools can ask simple questions that patients can easily answer, such as "Is your father named George?" or "Were you born in the USA?" Automated instructions inform patients that they may convey yes or no by (for example) focusing their attention on stimuli on the right vs. left wrist. This focused attention produces reliable changes in
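The yes/no logic of such an auditory or vibrotactile paradigm reduces to comparing the average evoked-response amplitude for the two stimulus streams, since focused attention enlarges the response to the attended stream. The amplitudes, the wrist-to-answer assignment, and the numbers below are simulated for illustration:

```python
import numpy as np

def answer_from_trials(right_amps, left_amps):
    # attending to a wrist's stimulator enlarges its evoked responses;
    # right wrist = "yes", left wrist = "no" (hypothetical assignment)
    return "yes" if np.mean(right_amps) > np.mean(left_amps) else "no"

rng = np.random.default_rng(0)
right = rng.normal(3.0, 1.0, size=20)  # responses while attending the right wrist
left = rng.normal(1.0, 1.0, size=20)   # responses to the ignored left wrist
print(answer_from_trials(right, left))  # → yes
```

A deployed system would additionally report a confidence measure, so that family members know when a patient's responses are too weak or inconsistent to interpret.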
Motor recovery
People may lose some of their ability to move due to many causes, such as stroke or injury. Research in recent years has demonstrated the utility of EEG-based BCI systems in aiding motor recovery and neurorehabilitation in patients who have had a stroke.[210][211][212][213] Several groups have explored systems and methods for motor recovery that include BCIs.[214][215][216][217] In this approach, a BCI measures motor activity while the patient imagines or attempts movements as directed by a therapist. The BCI may provide two benefits: (1) if the BCI indicates that a patient is not imagining a movement correctly (non-compliance), then the BCI could inform the patient and therapist; and (2) rewarding feedback such as functional stimulation or the movement of a virtual avatar also depends on the patient's correct movement imagery.
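The two feedback paths described above amount to a simple decision rule on the BCI's motor-imagery detection: reward correct imagery, flag non-compliance otherwise. A hypothetical sketch in which the threshold and messages are invented:

```python
def rehab_feedback(imagery_confidence, threshold=0.6):
    # One rehab trial: the classifier's confidence that the patient produced
    # the instructed motor imagery drives the feedback given.
    if imagery_confidence >= threshold:
        # benefit (2): contingent reward, e.g. functional stimulation or avatar movement
        return "reward: trigger functional stimulation / move virtual avatar"
    # benefit (1): detected non-compliance is surfaced to patient and therapist
    return "non-compliance: notify patient and therapist"

print(rehab_feedback(0.85))
print(rehab_feedback(0.30))
```

Making the reward strictly contingent on detected imagery is the point of the design: it closes the sensorimotor loop that passive therapy leaves open.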
So far, BCIs for motor recovery have relied on the EEG to measure the patient's motor imagery. However, studies have also used fMRI to study different changes in the brain as persons undergo BCI-based stroke rehab training.[218][219][220] Imaging studies combined with EEG-based BCI systems hold promise for investigating neuroplasticity during motor recovery post-stroke.[220] Future systems might include the fMRI and other measures for real-time control, such as functional near-infrared, probably in tandem with EEGs. Non-invasive brain stimulation has also been explored in combination with BCIs for motor recovery.[221] In 2016, scientists at the University of Melbourne published preclinical proof-of-concept data related to a potential brain-computer interface technology platform being developed for patients with paralysis to facilitate control of external devices such as robotic limbs, computers and exoskeletons by translating brain activity.[222][223] Clinical trials are currently underway.[224]
Functional brain mapping
Each year, about 400,000 people undergo brain mapping during neurosurgery. This procedure is often required for people with tumors or epilepsy that do not respond to medication.[225] During this procedure, electrodes are placed on the brain to precisely identify the locations of structures and functional areas. Patients may be awake during neurosurgery and asked to perform certain tasks, such as moving fingers or repeating words. This is necessary so that surgeons can remove only the desired tissue while sparing other regions, such as critical movement or language regions. Removing too much brain tissue can cause permanent damage, while removing too little tissue can leave the underlying condition untreated and require additional neurosurgery.[citation needed] Thus, there is a strong need to improve both methods and systems to map the brain as effectively as possible.
In several recent publications, BCI research experts and medical doctors have collaborated to explore new ways to use BCI technology to improve neurosurgical mapping. This work focuses largely on high gamma activity, which is difficult to detect with non-invasive means. Results have led to improved methods for identifying key areas for movement, language, and other functions. A recent article addressed advances in functional brain mapping and summarizes a workshop.[226]
Flexible devices
Flexible neural interfaces have been extensively tested in recent years in an effort to minimize brain tissue trauma related to mechanical mismatch between electrode and tissue.[230] Minimizing tissue trauma could, in theory, extend the lifespan of BCIs relying on flexible electrode-tissue interfaces.
Neural dust
Neural dust refers to millimeter-sized devices operated as wirelessly powered nerve sensors, proposed in a 2011 paper from the University of California, Berkeley Wireless Research Center that described both the challenges and the outstanding benefits of creating a long-lasting wireless BCI.[231][232] In one proposed model of the neural dust sensor, the transistor model allowed for a method of separating local field potentials from action potential "spikes", which would allow a greatly diversified wealth of data to be acquired from the recordings.[231]
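The separation of slow local field potentials from fast spike activity can be illustrated with a crude low-pass/residual split. A moving average stands in here for a proper filter, and the sampling rate, cutoff, and synthetic signal are invented:

```python
import numpy as np

def split_lfp_spikes(x, fs, cutoff=300.0):
    # LFP = slow component (below ~cutoff Hz); spikes ride on the fast residual
    w = max(1, int(fs / cutoff))                    # moving-average window in samples
    lfp = np.convolve(x, np.ones(w) / w, mode="same")
    return lfp, x - lfp

fs = 10_000
t = np.arange(fs) / fs                              # 1 s of data
slow = np.sin(2 * np.pi * 5 * t)                    # 5 Hz "LFP"
x = slow.copy()
x[5000:5010] += 2 * np.sin(2 * np.pi * 1000 * t[5000:5010])  # brief 1 kHz "spike"
lfp, spikes = split_lfp_spikes(x, fs)
```

A real recording chain would use a proper band-split (for example, a Butterworth low-pass and a matched high-pass), but the principle, one slow stream and one fast residual stream from the same electrode, is the same.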
See also
- Informatics
- Intendix (2009)
- AlterEgo, a system that reads unspoken verbalizations and responds with bone-conduction headphones
- Augmented learning
- Biological machine
- Cortical implants
- Deep brain stimulation
- Human senses
- Experience machine
- Kernel (neurotechnology company)
- Lie detection
- Microwave auditory effect
- Neural engineering
- Neuralink
- Neurorobotics
- Neurostimulation
- Nootropic
- Project Cyborg
- Simulated reality
- Telepresence
- Thought identification
- Wetware computer (Uses similar technology for IO)
- Whole brain emulation
- Wirehead (science fiction)
Notes
References
- ^ PMID 28082858.
- ^ a b Martini ML, Oermann EK, Opie NL, Panov F, Oxley T, Yaeger K. "Sensor Modalities for Brain-Computer Interface Technology: A Comprehensive Literature Review". Neurosurgery. Volume 86, Issue 2, February 2020, Pages E108–E117.
- ^ PMID 4583653.
- ^ S2CID 7928242.
- PMID 10896180.
- ^ Bird JJ, Manso LJ, Ribeiro EP, Ekárt A, Faria DR (September 2018). A Study on Mental State Classification using EEG-based Brain-Machine Interface. Madeira Island, Portugal: 9th international Conference on Intelligent Systems 2018. Retrieved 3 December 2018.
- ^ Bird JJ, Ekart A, Buckingham CD, Faria DR (2019). Mental Emotional Sentiment Classification with an EEG-based Brain-Machine Interface. St Hugh's College, University of Oxford, United Kingdom: The International Conference on Digital Image and Signal Processing (DISP'19). Archived from the original on 3 December 2018. Retrieved 3 December 2018.
- PMID 29549239.
- S2CID 62506825.
- ^ a b Wolpaw, J.R. and Wolpaw, E.W. (2012). "Brain-Computer Interfaces: Something New Under the Sun". In: Brain-Computer Interfaces: Principles and Practice, Wolpaw, J.R. and Wolpaw (eds.), E.W. Oxford University Press.
- S2CID 17571592.
- S2CID 4690450.
- S2CID 7928242. Archived from the original(PDF) on 19 July 2015. Retrieved 4 November 2022.
- ^ S. Bozinovski, M. Sestakov, L. Bozinovska: Using EEG alpha rhythm to control a mobile robot, In G. Harris, C. Walker (eds.) Proc. IEEE Annual Conference of Medical and Biological Society, p. 1515-1516, New Orleans, 1988
- ^ S. Bozinovski: Mobile robot trajectory control: From fixed rails to direct bioelectric control, In O. Kaynak (ed.) Proc. IEEE Workshop on Intelligent Motion Control, p. 63-67, Istanbul, 1990
- ^ M. Lebedev: Augmentation of sensorimotor functions with neural prostheses. Opera Medica and Physiologica. Vol. 2 (3): 211-227, 2016
- ^ M. Lebedev, M. Nicolelis: Brain-machine interfaces: from basic science to neuroprostheses and neurorehabilitation, Physiological Review 97:737-867, 2017
- ^ L. Bozinovska, G. Stojanov, M. Sestakov, S. Bozinovski: CNV pattern recognition: step toward a cognitive wave observation, In L. Torres, E. Masgrau, E. Lagunas (eds.) Signal Processing V: Theories and Applications, Proc. EUSIPCO-90: Fifth European Signal Processing Conference, Elsevier, p. 1659-1662, Barcelona, 1990
- ^ L. Bozinovska, S. Bozinovski, G. Stojanov, Electroexpectogram: experimental design and algorithms, In Proc IEEE International Biomedical Engineering Days, p. 55-60, Istanbul, 1992
- S2CID 14678623.
- PMID 22666612.
- ^ Fox, Maggie (13 October 2016). "Brain Chip Helps Paralyzed Man Feel His Fingers". NBC News. Retrieved 23 March 2021.
- ^ Hatmaker, Taylor (10 July 2017). "DARPA awards $65 million to develop the perfect, tiny two-way brain-computer inerface". Tech Crunch. Retrieved 23 March 2021.
- ^ Stacey, Kevin (10 July 2017). "Brown to receive up to $19M to engineer next-generation brain-computer interface". Brown University. Retrieved 23 March 2021.
- ^ "Minimally Invasive "Stentrode" Shows Potential as Neural Interface for Brain". Defense Advanced Research Projects Agency (DARPA). 8 February 2016. Retrieved 23 March 2021.
- ^ "Cochlear Implants". National Institute on Deafness and Other Communication Disorders. February 2016. Retrieved 1 April 2024.
- ^ Miguel Nicolelis et al. (2001) Duke neurobiologist has developed system that allows monkeys to control robot arms via brain signals Archived 19 December 2008 at the Wayback Machine
- ^ Baum M (6 September 2008). "Monkey Uses Brain Power to Feed Itself With Robotic Arm". Pitt Chronicle. Archived from the original on 10 September 2009. Retrieved 6 July 2009.
- ^ Lewis T (November 2020). "Elon Musk's Pig-Brain Implant Is Still a Long Way from 'Solving Paralysis'". Scientific American. Retrieved 23 March 2021.
- ^ Shead S (February 2021). "Elon Musk says his start-up Neuralink has wired up a monkey to play video games using its mind". CNBC. Retrieved 23 March 2021.
- S2CID 45427819.
- S2CID 37539476.
- S2CID 37161168.
- S2CID 701524.
- PMID 10479703.
- S2CID 795720.
- ^ PMID 14624244.
- ^ PMID 15888644.
- PMID 21976021.
- S2CID 4383116.
- S2CID 9402759.
- ^ Pitt team to build on brain-controlled arm Archived 4 July 2007 at the Wayback Machine, Pittsburgh Tribune Review, 5 September 2006.
- YouTube
- S2CID 4404323.
- S2CID 3112034.
- S2CID 31277881.
- S2CID 129946122.
- PMID 31019323.
- ^ S2CID 235907121.
- ^ Belluck, Pam (14 July 2021). "Tapping Into the Brain to Help a Paralyzed Man Speak". The New York Times.
- ^ "Using BCI2000 in BCI Research". National Center for Adaptive Neurotechnology. Retrieved 5 December 2023.
- PMID 18094685.
- S2CID 9290258.
- ^ S2CID 37168897.
- PMID 31409713.
- ISSN 1110-8665.
- S2CID 11248506.
- ^ "Vision quest". Wired. (September 2002).
- ISSN 1059-1028. Retrieved 10 November 2021.
- ^ Tuller D (1 November 2004). "Dr. William Dobelle, Artificial Vision Pioneer, Dies at 62". The New York Times.
- ISBN 978-1-4797-0920-5.
- ^ nurun.com (28 November 2012). "Mr. Jen Naumann's high-tech paradise lost". Thewhig.com. Retrieved 19 December 2016.
- S2CID 5681602.
- S2CID 4347367.
- ^ Martins Iduwe. "Brain Computer Interface". Academia.edu. Retrieved 5 December 2023.
- PMID 22596161.
- PMID 23253623.
- PMID 33981047.
- S2CID 239736609.
- ^ Hamliton J (14 July 2021). "Experimental Brain Implant Lets Man With Paralysis Turn His Thoughts Into Words". All Things Considered. NPR.
- S2CID 237574228.
- PMID 37612500.
- S2CID 261098775.
- S2CID 261099321.
- S2CID 216508360.
- PMID 12991237.
- ^ PMID 25610364.
- ^ S2CID 7020369.
- S2CID 215817530.
- S2CID 3961913.
- PMID 26098896.
- PMID 29270103.
- ^ PMID 23562053.
- PMID 25890770.
- PMID 32116472.
- PMID 28246640.
- PMID 31406326.
- ^ PMID 29529359.
- PMID 31075202.
- PMID 26041930.
- S2CID 220308983.
- ^ S2CID 234102889.
- PMID 24999351.
- ^ Bryson S (5 November 2020). "Stentrode Device Allows Computer Control by ALS Patients with Partial Upper Limb Paralysis". ALS News Today.
- ^ Lanese, Nicoletta (12 January 2023). "New 'thought-controlled' device reads brain activity through the jugular". livescience.com. Archived from the original on 16 February 2023. Retrieved 16 February 2023.
- S2CID 255545643.
- doi:10.1142/9789812561763_0040. Archived from the original(PDF) on 4 April 2005.
- ^ "Teenager moves video icons just by imagination". Press release. Washington University in St Louis. 9 October 2006.
- PMID 18310813.
- PMID 22438708.
- .
ECoG- Based BCI has advantage in signal and durability that are absolutely necessary for clinical application
- ^ PMID 21750369.
Justin Williams, a biomedical engineer at the university, has already transformed the ECoG implant into a micro device that can be installed with a minimum of fuss. It has been tested in animals for a long period of time – the micro ECoG stays in place and doesn't seem to negatively affect the immune system.
- S2CID 239756345.
- S2CID 214704481.
- PMID 24789862.
- PMID 28141803.
- ^ Winters, Jeffrey (May 2003). "Communicating by Brain Waves". Psychology Today.
- ^ Adrijan Bozinovski "CNV flip-flop as a brain-computer interface paradigm" In J. Kern, S. Tonkovic, et al. (Eds) Proc 7th Conference of the Croatian Association of Medical Informatics, pp. 149-154, Rijeka, 2005
- .
- S2CID 33223634.
- S2CID 21464338.
- PMID 19850134.
- PMID 22046274.
- ^ "Thought-guided helicopter takes off". BBC News. 5 June 2013. Retrieved 5 June 2013.
- PMID 15876632.
- PMID 25162231.
- PMID 27014511.
- ISSN 1076-2787.
- S2CID 233446181.
- PMID 33485365.
- PMID 7514984.
- Bibcode:1994PhDT........82A.
- ^ Hockenberry, John (August 2001). "The Next Brainiacs". Wired. Vol. 9, no. 8.
- S2CID 14515754
- ^ S2CID 10943518.
- PMID 23181009.
- ^ S2CID 23136360.
- PMID 32581758.
- ^ US 20130127708, issued 23 May 2013
- ^ S2CID 14324159.
- S2CID 32640699.
- PMID 29614020.
- PMID 16792302.
- S2CID 38568963.
- PMID 24321081.
- ^ Subject with Paraplegia Operates BCI-controlled RoGO (4x) at YouTube.com
- ^ Alex Blainey controls a cheap consumer robot arm using the EPOC headset via a serial relay port at YouTube.com
- ^ Drummond, Katie (14 May 2009). "Pentagon Preps Soldier Telepathy Push". Wired. Retrieved 6 May 2009.
- ^ Ranganatha Sitaram, Andrea Caria, Ralf Veit, Tilman Gaber, Giuseppina Rota, Andrea Kuebler and Niels Birbaumer(2007) "FMRI Brain–Computer Interface: A Tool for Neuroscientific Research and Treatment"
- .
- ^ "To operate robot only with brain, ATR and Honda develop BMI base technology". Tech-on. 26 May 2006. Archived from the original on 23 June 2017. Retrieved 22 September 2006.
- S2CID 17327816.
- PMID 21945275.
- ^ Yam, Philip (22 September 2011). "Breakthrough Could Enable Others to Watch Your Dreams and Memories". Scientific American. Retrieved 25 September 2011.
- UC Berkeley. Archived from the originalon 25 September 2011. Retrieved 25 September 2011.
- UC BerkeleyNews Center. Retrieved 25 September 2011.
- ^ S2CID 206636315.
- ^ "Goals of the organizers". BBC. Retrieved 19 December 2022.
- PMID 22479236.
- S2CID 16314534.
- ISBN 978-3-319-49057-1.
- PMID 26848745.
- ^ Kennedy, Pagan (18 September 2011). "The Cyborg in Us All". The New York Times. Retrieved 28 January 2012.
- ^ Selim, Jocelyn; Drinkell, Pete (1 November 2002). "The Bionic Connection". Discover. Archived from the original on 6 January 2008.
- ^ Giaimo, Cara (10 June 2015). "Nervous System Hookup Leads to Telepathic Hand-Holding". Atlas Obscura.
- ^ Warwick, K, Gasson, M, Hutt, B, Goodhew, I, Kyberd, P, Schulzrinne, H and Wu, X: "Thought Communication and Control: A First Step using Radiotelegraphy", IEE Proceedings on Communications, 151(3), pp.185–189, 2004
- PMID 14568806.
- PMID 25137064.
- ^ Bland, Eric (13 October 2008). "Army Developing 'synthetic telepathy'". Discovery News. Retrieved 13 October 2008.
- PMID 23448946.
- ^ Gorman, James (28 February 2013). "One Rat Thinks, and Another Reacts". The New York Times. Retrieved 28 February 2013.
- ^ Sample, Ian (1 March 2013). "Brain-to-brain interface lets rats share information via internet". The Guardian. Retrieved 2 March 2013.
- ^ "Caltech Scientists Devise First Neurochip". Caltech. 26 October 1997.
- ^ Sandhana, Lakshmi (22 October 2004). "Coming to a brain near you". Wired News. Archived from the original on 10 September 2006.
- ^ "'Brain' in a dish flies flight simulator". CNN. 4 November 2004.
- ^ "David Pearce – Humanity Plus". 5 October 2017. Retrieved 30 December 2021.
- ^ Stoica A (2010). "Speculations on Robots, Cyborgs & Telepresence". YouTube. Archived from the original on 28 December 2021. Retrieved 28 December 2021.
- ^ "Experts to 'redefine the future' at Humanity+ @ CalTech". Kurzweil. Retrieved 30 December 2021.
- ^ WO2012100081A2, Stoica, Adrian, "Aggregation of bio-signals from multiple individuals to achieve a collective outcome", issued 2012-07-26
- ^ "Paralyzed Again". MIT Technology Review. Retrieved 8 December 2023.
- ^ "Gale - Product Login". galeapps.gale.com. Retrieved 8 December 2023.
- ^ "Sony patent neural interface". Archived from the original on 7 April 2012.
- ^ "Mind Games". The Economist. 23 March 2007.
- ^ "nia Game Controller Product Page". OCZ Technology Group. Retrieved 30 January 2013.
- ^ a b c Li S (8 August 2010). "Mind reading is on the market". Los Angeles Times. Archived from the original on 4 January 2013.
- ^ Fruhlinger, Joshua (9 October 2008). "Brains-on with NeuroSky and Square Enix's Judecca mind-control game". Engadget. Retrieved 29 May 2012.
- ^ "New games powered by brain waves". Physorg.com. 10 January 2009. Retrieved 12 September 2010.
- ^ Snider, Mike (7 January 2009). "Toy trains 'Star Wars' fans to use The Force". USA Today. Retrieved 1 May 2010.
- ^ "Emotiv Homepage". Emotiv.com. Retrieved 29 December 2009.
- ^ "'necomimi' selected 'Time Magazine / The 50 best invention of the year'". Neurowear. 22 November 2011. Archived from the original on 25 January 2012.
- ^ "LIFESUIT Updates & News – They Shall Walk". Theyshallwalk.org. Retrieved 19 December 2016.
- ^ "SmartphoneBCI". GitHub. Retrieved 5 June 2018.
- ^ "SSVEP_keyboard". GitHub. Retrieved 5 April 2017.
- ^ Protalinski, Emil (8 December 2020). "NextMind ships its real-time brain computer interface Dev Kit for $399". VentureBeat. Retrieved 8 September 2021.
- ^ Etherington, Darrell (21 December 2020). "NextMind's Dev Kit for mind-controlled computing offers a rare 'wow' factor in tech". TechCrunch. Retrieved 1 April 2024.
- ^ "Roadmap - BNCI Horizon 2020". bnci-horizon-2020.eu. Retrieved 5 May 2019.
- ^ Opie N (2 April 2019). "Research Overview". University of Melbourne Medicine. University of Melbourne. Retrieved 5 December 2019.
- ^ "Synchron begins trialling Stentrode neural interface technology". Verdict Medical Devices. 22 September 2019. Retrieved 5 December 2019.
Further reading
- Brouse, Andrew. "A Young Person's Guide to Brainwave Music: Forty years of audio from the human EEG". eContact! 14.2 – Biotechnological Performance Practice / Pratiques de performance biotechnologique (July 2012). Montréal: CEC.
- Gupta, Cota Navin and Ramaswamy Palanappian. "Using High-Frequency Electroencephalogram in Visual and Auditory-Based Brain-Computer Interface Designs". eContact! 14.2 – Biotechnological Performance Practice / Pratiques de performance biotechnologique (July 2012). Montréal: CEC.
- Ouzounian, Gascia. "The Biomuse Trio in Conversation: An Interview with R. Benjamin Knapp and Eric Lyon". eContact! 14.2 – Biotechnological Performance Practice / Pratiques de performance biotechnologique (July 2012). Montréal: CEC.