The Two Directions of BCI
Brain-computer interfaces operate in two fundamentally different modes:
Reading Mode (Output BCIs)
The BCI records neural activity and translates it into commands:
- Patient thinks "move arm" → BCI detects motor cortex patterns → Computer interprets the intent → Robotic limb moves
Writing Mode (Input BCIs)
The BCI delivers electrical signals the brain interprets as sensation:
- Camera captures thermal data → Encoder converts to neural signal language → Electrodes stimulate visual cortex → Brain perceives heat patterns as "sight"
The most transformative applications combine both: bidirectional interfaces that let humans control external systems and receive synthetic sensory feedback from them.
The Neural Signal Pipeline
Every BCI follows the same fundamental architecture:
Sensor/Detector
↓
Signal Conditioning (amplification, filtering)
↓
Analog-to-Digital Conversion
↓
Feature Extraction (which neurons fired? which patterns?)
↓
Decoder (machine learning model)
↓
Command / Output to computer
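The reading pipeline above can be sketched end to end in a few lines. Everything here is an invented stand-in: the moving-average filter, the quantization range, the spike threshold, and the one-rule "decoder" are illustrative placeholders, not components of any real BCI stack.

```python
import random
import statistics

def condition(raw, window=5):
    """Signal conditioning: moving-average filter to suppress noise."""
    return [statistics.mean(raw[max(0, i - window + 1):i + 1])
            for i in range(len(raw))]

def digitize(signal, levels=256, lo=-1.0, hi=1.0):
    """Analog-to-digital conversion: quantize to discrete levels."""
    step = (hi - lo) / (levels - 1)
    return [round((max(lo, min(hi, s)) - lo) / step) for s in signal]

def extract_features(samples, threshold=200):
    """Feature extraction: count threshold crossings ('spikes')."""
    return sum(1 for s in samples if s > threshold)

def decode(spike_count):
    """Decoder: a trivial rule standing in for a trained ML model."""
    return "MOVE_ARM" if spike_count > 10 else "IDLE"

# Simulate a burst of neural activity riding on background noise.
random.seed(0)
raw = [0.8 + random.gauss(0, 0.1) if 20 <= i < 60 else random.gauss(0, 0.1)
       for i in range(100)]

command = decode(extract_features(digitize(condition(raw))))
print(command)  # the burst pushes the spike count over threshold: MOVE_ARM
```

In a real system the decoder stage is where most of the engineering lives; here it is a single comparison so the data flow stays visible.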
For stimulation (input BCIs), the pipeline runs in reverse:
External data (camera, sensor, etc.)
↓
Encoding algorithm (translate to neural language)
↓
Digital-to-Analog Conversion
↓
Electrode stimulation (micro-current pulses)
↓
Neuron activation
↓
Brain perception
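The stimulation pipeline can be sketched the same way. The brightness-to-pulse-rate mapping, the amplitude values, and the safety ceiling below are all assumptions made up for illustration; real stimulation parameters are set clinically, per electrode.

```python
# Hypothetical sketch: external data -> encoder -> D/A -> stimulation.
MAX_CURRENT_UA = 60.0   # invented per-electrode safety ceiling (microamps)

def encode(pixels, max_rate_hz=200):
    """Encoding: map each pixel's brightness (0-255) to a pulse rate."""
    return [round(p / 255 * max_rate_hz) for p in pixels]

def to_analog(rates, amplitude_ua=40.0):
    """D/A conversion: pair each rate with a current amplitude,
    clamped to the safety ceiling."""
    return [(r, min(amplitude_ua, MAX_CURRENT_UA)) for r in rates]

def stimulate(pulse_params):
    """Electrode stimulation stub: report pulses per electrode."""
    return [f"electrode {i}: {rate} pulses/s @ {amp} uA"
            for i, (rate, amp) in enumerate(pulse_params)]

frame = [0, 64, 128, 255]           # one row of a tiny "thermal image"
for line in stimulate(to_analog(encode(frame))):
    print(line)
```

Brighter pixels become faster pulse trains, which is one plausible encoding choice among many; the hard part, as the next section explains, is picking an encoding the brain can actually learn.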
Types of Electrodes
| Type | Placement | Resolution | Invasiveness |
|------|-----------|------------|--------------|
| EEG caps | Scalp surface | Low | None |
| ECoG grids | Brain surface (subdural or epidural) | Medium | Surgery required |
| Utah Array | Penetrating cortex | High | Highly invasive |
| Neuralink threads | Deep cortical penetration | Very high | Highly invasive |
| Synchron Stentrode | Via blood vessels | High | Minimally invasive |
The higher the electrode resolution and the deeper the placement, the richer the signal, and the more complex the surgical procedure.
The Encoding Problem: The Hardest Challenge
Reading neural signals is now well understood. Encoding synthetic signals the brain can learn to interpret is where the field remains unsolved.
The challenge:
- Neurons communicate in complex, context-dependent patterns
- The same stimulus produces slightly different neural patterns every time
- A synthetic signal must be consistent enough for the brain to learn its meaning over time
This is like creating a new language the brain has never encountered and asking it to build fluency from the ground up.
Current approach: Stimulation patterns are designed based on what natural stimuli produce, then refined through training feedback. The brain does most of the work: it adapts to whatever signal arrives, as long as that signal is consistent.
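The refine-through-feedback loop described above can be toy-modeled as a calibration: an encoder gain is nudged until the (simulated) reported percept matches a target. The hidden response curve and the learning rate are invented for this sketch; a participant's actual perceptual reports stand in for `perceived`.

```python
def perceived(gain, stimulus):
    """Stand-in for the participant's reported percept (assumed linear)."""
    return 0.7 * gain * stimulus

def calibrate(stimulus=1.0, target=1.0, lr=0.3, steps=50):
    """Nudge the encoder gain toward the target percept, one report at a time."""
    gain = 0.1
    for _ in range(steps):
        error = target - perceived(gain, stimulus)
        gain += lr * error
    return gain

gain = calibrate()
print(round(perceived(gain, 1.0), 3))  # converges to the target percept: 1.0
```

The point of the sketch is the division of labor: the device only has to produce a consistent, adjustable signal; the adaptation happens on the biological side.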
Neuralink's Blindsight Program
Neuralink's most compelling current project isn't motor control but Blindsight: restoring vision to people who have lost their eyes and optic nerves, or who were born without them.
The approach:
- A camera (worn as glasses) captures the visual field
- A dedicated chip encodes the visual data into neural stimulation patterns
- Microelectrodes stimulate the visual cortex directly
- The brain trains on the patterns and constructs visual perception
Early results: participants perceive phosphenes (flashes of light that coalesce into shapes over training). Resolution improves with time.
This is not "normal" vision. It is a new form of visual experience, potentially with different characteristics than biological sight.
Why the Brain Accepts Synthetic Signals
The brain has no mechanism to verify the source of its inputs. It cannot "check" whether signals came from eyes or electrodes. It only asks:
- Is this signal consistent?
- Does it follow learnable patterns?
- Does it correlate with other sensory data (cross-modal confirmation)?
If the answer to all three is yes, it assigns the signal meaning and integrates it into conscious perception.
This is demonstrated by every cochlear implant user who eventually hears "normally," despite their brain receiving electrical pulses in place of the signals the cochlea once sent down the auditory nerve.