COMPOSITIONAL CODING OF INDIVIDUAL FINGER MOVEMENTS IN HUMAN POSTERIOR PARIETAL CORTEX AND MOTOR CORTEX ENABLES TEN-FINGER DECODING

Charles Guan1*, Tyson Aflalo1,2, Kelly Kadlec1, Jorge Gámez de Leon1, Emily R. Rosario3, Ausaf Bari4, Nader Pouratian5, Richard A. Andersen1,2

1 California Institute of Technology, Pasadena, CA, USA
2 T&C Chen Brain-Machine Interface Center at Caltech
3 Casa Colina Hospital and Centers for Healthcare, Pomona, CA, USA
4 David Geffen School of Medicine at UCLA, Los Angeles, CA, USA
5 University of Texas Southwestern Medical Center, Dallas, TX, USA
*Correspondence: cguan@caltech.edu (C.G.)
ABSTRACT

Objective. Enable neural control of individual prosthetic fingers for participants with upper-limb paralysis.

Approach. Two tetraplegic participants were each implanted with a 96-channel array in the left posterior parietal cortex (PPC). One of the participants was additionally implanted with a 96-channel array near the hand knob of the left motor cortex (MC). Across tens of sessions, we recorded neural activity while the participants attempted to move individual fingers of the right hand. Offline, we classified finger movements from neural firing rates using linear discriminant analysis (LDA) with cross-validation. The participants then used the neural classifier online to control individual fingers of a brain-machine interface (BMI). Finally, we characterized the neural representational geometry during individual finger movements of both hands.

Main Results. The two participants achieved 86% and 92% online accuracy during BMI control of the contralateral fingers (chance = 17%). Offline, a linear decoder achieved ten-finger decoding accuracies of 70% and 66% using respective PPC recordings and 75% using MC recordings (chance = 10%). A compositional code linked corresponding finger movements of the contralateral and ipsilateral hands.

Significance. This is the first study to decode both contralateral and ipsilateral finger movements from PPC. Online BMI control of contralateral fingers exceeded that of previous finger BMIs. PPC and MC signals can be used to control individual prosthetic fingers, which may contribute to a hand restoration strategy for people with tetraplegia.

Keywords: finger decoding, hand movement, brain-computer interface (BCI), posterior parietal cortex (PPC), motor cortex (MC), representational geometry, factorized representations
medRxiv preprint doi: https://doi.org/10.1101/2022.12.07.22283227; this version posted December 9, 2022. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted medRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.

NOTE: This preprint reports new research that has not been certified by peer review and should not be used to guide clinical practice.
INTRODUCTION

Tetraplegic individuals identify hand function as a high-impact priority for improving their quality of life [1-3]. Neuroprosthetics research has enabled control of basic grasp shapes [4,5], an important step towards empowering paralyzed individuals to perform daily activities. However, these basic grasp templates constrain the range of motion and thus limit the usefulness of existing neural prosthetics.

The complexity of human motor behavior is largely enabled by our versatile, dexterous hands [6]. The human hand can weave intricate crafts, sign expressive languages, and fingerpick guitar solos. Even everyday manual behaviors, like turning a door handle, require volitional control over many degrees of freedom [7]. Indeed, humans can move individual fingers much more independently than other animals, including monkeys [8,9]. To better restore autonomy to people with tetraplegia, neural prosthetics would benefit from enabling dexterous finger control.

Cortical brain-machine interface (BMI) research has focused on control of computer cursors and robotic arms, rather than dexterous hand control. Foundational studies implemented continuous decoders for cursor control [10-13]. Leveraging this cursor control, [14,15] subsequently developed on-screen keyboard typing interfaces. [5,16-18] applied continuous decoding to arm control, with [16] controlling the user's own muscles. Recent work has also decoded speech from sensorimotor cortex [19-22]. However, relatively few BMI studies have focused on hand control [23-28], and previous studies frequently combine the ring and little fingers or leave them out altogether. Individuated finger control would be useful for applications like keyboard typing or object manipulation.

Most motor BMIs record neural activity from the motor cortex (MC), although areas of the posterior parietal cortex (PPC) have also been used successfully for BMI control (for review, see [29]) of reaching [10,12,30] and grasping [4,22]. The PPC plays a central role in sensorimotor integration, with regions of PPC representing visual stimulus locations [31], eye movements [32], task context [33], planned reaches [34], and object grasping [35,36]. PPC uses partially mixed selectivity to simultaneously encode many motor variables [37], which can be useful for versatile neural decoding.

Despite PPC's clearly demonstrated role in dexterous grasping [6,36,38], less is known about PPC responses during individual finger movements. With fMRI, lesion, and anatomical evidence situating primary motor cortex as core to fine finger movements (for review, see [6]), most electrophysiological studies of finger movements have focused on the primary motor (M1) and primary somatosensory cortex (S1) [24,25,28,39-42]. Nevertheless, non-human primate mapping studies [43] and stimulation studies [44,45] have identified PPC sub-regions that are likely involved in fine finger movements. These results imply that fine finger movements are supported by a broad neuronal network, which should be investigated to improve dexterous BMI control.

Here, we recorded intracortical activity from the PPC of two tetraplegic participants while they attempted to press individual fingers. Across task contexts, we could classify individual finger movements during planning and execution periods. We connected this neural decoder to drive a neural prosthetic hand, with accuracies exceeding recent intracortical BMI studies [27,46]. Furthermore, we characterized both the neural tuning and representational geometry [47] during finger movements of both hands. The neural code was composed of separable laterality and finger components, leading to finger representations that were simultaneously discriminable and similar across contralateral/ipsilateral pairs of fingers. These findings contribute to the understanding of human hand movements and advance the development of hand neuroprosthetics for people with paralysis.
METHODS

Study participants

Experiments were conducted with volunteer participants enrolled in a brain-machine interface (BMI) clinical study (ClinicalTrials.gov identifier: NCT01958086). All procedures were approved by the respective Institutional Review Boards of the California Institute of Technology, Casa Colina Hospital and Centers for Healthcare, and the University of California, Los Angeles.

Participant N is a right-handed, tetraplegic woman. Approximately 10 years before this study, she sustained an AIS-A spinal cord injury at cervical level C3-C4. N can move her deltoids and above, but she cannot move or feel her hands.

Participant J is a right-handed, tetraplegic man. Approximately 3 years before this study, he sustained a spinal cord injury at cervical level C4-C5. He has residual movement in his upper arms, but he cannot move or feel his hands.

Each participant consented to this study after understanding the nature, objectives, and potential risks.
Tasks

Alternating-cues finger press task with delay

Each participant performed an instructed-delay finger movement task (Figure 1). They were seated in front of a computer monitor display, with their hands prone on a flat surface. Each trial began with a cue specifying a finger of the right hand. The finger cue then disappeared during a delay period. A condition-invariant go-icon appeared, instructing the participant to attempt to press the cued finger as though pressing a key on a keyboard. This instructed-delay task format temporally separates the visual stimulus from the planning and execution epochs. Supplementary Table 1 documents the phase durations for each task, and Supplementary Table 2 lists the date ranges for each task.

Some regions of the posterior parietal cortex (PPC) are modulated by non-motor variables like visual stimulus location [31] and task context [33]. To ensure that the recorded neural signals reflected movement type (rather than, e.g., visual memory), we varied the cueing method between runs (Figure 1). In the Spatial-Cue variant, five circles corresponded to the five fingers. In the Text-Cue variant, the finger cue was a letter abbreviation. A brief Pre-Cue phase in each trial indicated what cue variant the trial would be.
Figure 1. Alternating-cues, instructed-delay finger press task.
Trial structure. Each rectangle represents the computer monitor display at each phase. Two cue variants, text and spatial, were trial-interleaved. In the spatial variant, the location of the highlighted circle corresponded to the cued finger. Trials without a highlighted circle indicated a No-Go cue. In the text variant, a highlighted letter (for example, "M" for the middle finger) cued each finger. In both variants, the finger cue disappeared before the movement phase (Go) to separate planning and execution periods. Phase durations are listed in Supplementary Table 1.
Finger press task with randomized cue location (reaction-time)

Letters, corresponding to each movement type, were arranged in a 3 x 4 grid across the screen. Each grid consisted of two repetitions each of T (thumb), I (index), M (middle), R (ring), P (pinky), and X (No-Go). Letters were arranged in a random order to dissociate eye gaze signals from movement representations. On each trial, a single letter cue was indicated with a crosshairs symbol, which was jittered to minimize systematic effects of letter occlusion. Each cue was selected once (for a total of 12 trials) before the screen was updated to a new arrangement. Each run-block consisted of 4 screens for a total of 48 trials.

On each trial, the participant was instructed to immediately saccade to the cued target and fixate, then attempt to press the corresponding finger of the right hand. A trained classifier (Brain-machine interface (BMI) calibration) decoded the finger movement from neural signals (Online BMI discrete control) and displayed the classified finger movement 1.5 seconds after the start of the trial. The participant pressed the instructed finger and fixated on the cue until the visual classification feedback was shown.

Data from participant N performing this task were previously analyzed in [46]. Data from participant J have not been reported previously. During 3 sessions, participant J also performed this task using his left hand.
Figure 2. Reaction-time finger-press task with randomized cue location. Figure adapted from [46] (CC BY-NC 4.0).
Main finger press task. When a letter was cued by the red crosshair, the participant looked at the cue and immediately attempted to flex the corresponding digit of the right (contralateral) hand. We included a null condition "X," during which the participant looked at the target but did not move their fingers. Visual feedback indicated the decoded finger 1.5 seconds after cue presentation. To randomize the saccade location, cues were located on a grid (3 rows, 4 columns) in a pseudorandom order. The red crosshair was jittered to minimize visual occlusion.
Ten-finger press task

Each participant also performed an instructed-delay finger press task with fingers from both hands. The task was like the Text-Cue variant of the Alternating-cues finger press task with delay, except without a Pre-Cue phase. All ten fingers were interleaved in trials within the same run-block (Figure 3). Phase durations are documented in Supplementary Table 1.

Figure 3. Text-cued finger movement task with instructed-delay.
Trial structure. Text cues indicate the hand ("R" or "L") and the finger (e.g., "m" for middle finger). A delay period separates the Cue phase from a condition-independent Go display.
Implant location

Participant N was implanted with two 96-channel NeuroPort Utah electrode arrays 6 years after injury (about 4 years before this study). The implant locations were determined using anatomical priors and preoperative functional magnetic resonance imaging (fMRI) [46]. One array (denoted N-PPC) was implanted over the hand/limb region of PPC at the junction of the intraparietal sulcus (IPS) with the postcentral sulcus (PCS). This region is thought to be involved in the planning of grasp movements [4,36,48]. In this report, we refer to this brain area as PC-IP (postcentral-intraparietal), although it is sometimes also referred to as the anterior intraparietal sulcus (aIPS) region [49]. A second array was in Brodmann's area (BA) 5d. In the weeks following implantation, it was found that the BA 5d array did not function, so only the PC-IP array was used in this study.

Participant J was implanted with two 96-channel NeuroPort Utah electrode arrays about 20 months after injury (about 35 months before this study). The first array (denoted J-PPC) was implanted in the superior parietal lobule (SPL) of the left PPC. The second array (denoted J-MC) was implanted near the hand knob of the left motor cortex (MC) (Supplementary Figure 1). PPC and MC activity were recorded simultaneously.
Neural signal recording and preprocessing

Neural signals were acquired, amplified, bandpass-filtered (0.3 Hz - 7.5 kHz), and digitized (30 kHz, 16 bits/sample) from the electrodes using NeuroPort Neural Signal Processors (NSPs; Blackrock Microsystems, Inc.). Action potentials (spikes) were detected by high-pass filtering (250 Hz cut-off) the full-bandwidth signal, then thresholding at -3.5 times the root-mean-square (RMS) voltage of the respective electrode. Although one or more source neurons may generate threshold crossings, we used raw threshold crossings for online control and only sorted spikes for offline analyses. Single neurons were identified using the k-medoids clustering method. We used the gap criteria [50] to determine the total number of waveform clusters. Clustering was performed on the first n ∈ {2, 3, 4} principal components, where n was selected to account for 95% of waveform variance.
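The thresholding step above can be sketched in a few lines of numpy. This is an illustrative reimplementation, not the authors' code: the 250 Hz high-pass filter is omitted, and the signal, sampling rate, and spike positions are synthetic.

```python
import numpy as np

def detect_threshold_crossings(signal, multiplier=-3.5):
    """Detect spike events as downward crossings of multiplier * RMS voltage.

    Returns the sample index at which each crossing begins.
    """
    rms = np.sqrt(np.mean(signal ** 2))
    threshold = multiplier * rms              # negative threshold, e.g. -3.5 * RMS
    below = signal < threshold
    # A crossing is the first sample of each contiguous run below threshold
    crossings = np.flatnonzero(below & ~np.r_[False, below[:-1]])
    return crossings

# Toy example: 1 s of unit-variance noise at 30 kHz with two injected spike troughs
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 30_000)
x[[5_000, 20_000]] = -8.0                     # troughs well below -3.5 * RMS
events = detect_threshold_crossings(x)
rate = len(events) / 1.0                      # threshold-crossing rate (Hz)
```

With Gaussian noise, a -3.5 RMS threshold still admits a handful of chance crossings per second, which is why the paper treats threshold crossings as multiunit events rather than sorted spikes.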
Feature extraction

Except when otherwise specified, we used a 500-millisecond (ms) window of neural activity to calculate firing rates (counted spikes divided by the window duration). The firing rates were then used as the input features to each analysis or classification model. Neurons with an average firing rate less than 0.5 Hz were excluded from all analyses.

Behavioral epochs: the movement execution ("Go" or "move") analysis window was defined as the 500-ms window starting 200 ms after the Go cue. For applicable tasks, the movement planning ("Delay" or "plan") analysis window was defined as the 500-ms window starting 200 ms after the Delay screen. The Cue analysis window was defined as the 500-ms window starting 200 ms after the Cue screen. The intertrial interval (ITI) analysis window was defined as the last 500 ms of the ITI phase.
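The windowed firing-rate features described above reduce to counting spikes per neuron in an epoch-aligned window. A minimal sketch (illustrative names and toy spike times, not the authors' pipeline):

```python
import numpy as np

def firing_rates(spike_times, t_start, duration=0.5):
    """Firing rate per neuron: spikes counted in [t_start, t_start + duration),
    divided by the window duration, giving Hz."""
    counts = np.array([np.sum((st >= t_start) & (st < t_start + duration))
                       for st in spike_times])
    return counts / duration

# Toy example: two neurons, Go cue at t = 0, execution window starts 200 ms later
spike_times = [np.array([0.25, 0.30, 0.65]),   # 3 spikes inside [0.2, 0.7)
               np.array([0.10, 0.71])]         # 0 spikes inside the window
rates = firing_rates(spike_times, t_start=0.2)  # [6.0, 0.0] Hz
```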
Single-neuron selectivity for finger movements

In the section Single-neuron modulation to individual finger presses, we used a one-way ANOVA to determine whether neurons distinguished firing rates between conditions. A neuron was considered discriminative if p < 0.05 after false discovery rate (FDR) correction for multiple comparisons using the Benjamini-Hochberg procedure; we also denoted this FDR-adjusted p-value as q. We corrected for m = N comparisons, where N is the number of neurons for each participant. Following Cohen's rules of thumb [51], we denoted the ANOVA effect size as "large" if η² > 0.14. As the ANOVA post hoc test, we used Dunnett's multiple comparison test [52] to determine which fingers had significantly different firing rates than the No-Go baseline.
To quantify the effect size of firing-rate changes against the No-Go baseline (Figure 4a), we used Hedges' g, which is similar to Cohen's d but bias-corrected for small sample sizes. We calculated and visualized Hedges' g values using the Data Analysis using Bootstrap-Coupled Estimation Python library [53].
For visual simplicity, we pooled neurons across sessions when calculating and visualizing single-neuron metrics (percentage selective, number of fingers discriminable from No-Go, empirical cumulative distribution functions).
To visualize firing rates, spike rasters were smoothed with a Gaussian kernel (50-ms standard deviation [S.D.]), then averaged across trials to create a peristimulus time histogram (PSTH).
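The PSTH construction above can be sketched as Gaussian smoothing followed by trial averaging. An illustrative numpy version (bin sizes, kernel truncation at 4 S.D., and the synthetic spike trains are assumptions, not the authors' parameters):

```python
import numpy as np

def psth(binned_spikes, bin_ms=1.0, sigma_ms=50.0):
    """Trial-averaged PSTH: Gaussian-smooth each trial's binned spike train,
    then average across trials. `binned_spikes` is (n_trials, n_bins) counts."""
    sigma_bins = sigma_ms / bin_ms
    half = int(4 * sigma_bins)                      # truncate kernel at 4 S.D.
    t = np.arange(-half, half + 1)
    kernel = np.exp(-0.5 * (t / sigma_bins) ** 2)
    kernel /= kernel.sum()                          # unit area preserves rate
    smoothed = np.array([np.convolve(trial, kernel, mode="same")
                         for trial in binned_spikes])
    return smoothed.mean(axis=0) * (1000.0 / bin_ms)  # counts/bin -> spikes/s

# Toy example: 20 trials of 1-ms bins with an injected burst around 500 ms
rng = np.random.default_rng(1)
trials = rng.binomial(1, 0.005, size=(20, 1000))    # ~5 Hz background
trials[:, 480:520] = 1                              # burst on every trial
curve = psth(trials)                                # peaks near bin 500
```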
Offline classification with cross-validation

We trained a separate linear classifier for each session to predict finger movements from the neural features. We used diagonal-covariance linear discriminant analysis (diagonal LDA) [54]; diagonal LDA is equivalent to Gaussian Naive Bayes (GNB) when GNB shares a single covariance matrix across classes.

For offline classification and parameter sweeps, we used stratified K-folds cross-validation (with K = 8) to estimate the generalization error. Reported classification metrics correspond to cross-validated performance on the evaluation sets. Results were aggregated over sessions by pooling trials. Across-session standard deviations of classification accuracy are weighted by the number of trials in each session.
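Diagonal LDA as described above is a Gaussian classifier with class-specific means and one shared, diagonal covariance. A minimal sketch (an illustrative reimplementation on synthetic 2-D features, not the authors' decoder):

```python
import numpy as np

class DiagonalLDA:
    """LDA with a shared, diagonal covariance: pooled per-feature variances."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        # Residuals around each class mean give the pooled within-class variance
        resid = X - self.means_[np.searchsorted(self.classes_, y)]
        self.var_ = resid.var(axis=0) + 1e-9          # diagonal of shared Sigma
        self.log_prior_ = np.log(np.array([np.mean(y == c) for c in self.classes_]))
        return self

    def predict(self, X):
        # Gaussian log-likelihood under the diagonal covariance, plus log prior
        ll = -0.5 * (((X[:, None, :] - self.means_) ** 2) / self.var_).sum(-1)
        return self.classes_[np.argmax(ll + self.log_prior_, axis=1)]

# Toy example: two well-separated "finger" classes in a 2-D feature space
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
acc = (DiagonalLDA().fit(X, y).predict(X) == y).mean()
```

Because only per-feature variances are estimated, diagonal LDA stays well-conditioned even when trial counts are small relative to the number of neurons, which suits the short calibration blocks used here.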
Learning curves (Figure 5b) were generated by using subsets of the training set during each stratified K-fold split. Neuron-dropping curves (Figure 5c) were generated by stacking neurons across sessions, then evaluating performance with random subsets of neurons. Window duration sweeps (Figure 5d) varied the size of the firing-rate estimation window while fixing the start time at 200 ms after the Go cue. Neural decode time-courses (Figure 5e) used 500-ms bins centered at different times of the trial.
Online brain-machine interface (BMI) discrete control

Each BMI control session started with a run of the open-loop calibration task. For participant N, this was the Alternating-cues finger press task, modified to not have a delay. For participant J, this was the finger press task with randomized cue location, without classifier output.

The neural activity and finger movements from the calibration task served as training data for the online BMI classification model. Neural features were composed of the threshold crossing rates of each electrode during a 1-second window. Electrodes with mean firing rates less than 1 Hz were excluded to minimize sensitivity to discretization. The window start-time was chosen to maximize the cross-validated classification accuracy on the calibration task. The classifier was then re-trained using all data from the calibration task (without cross-validation), using threshold crossing rates from the selected window. This classifier was then used to decode attempted movements in the BMI task.

During online control of the finger grid task, the classifier predicted a single finger movement for each trial. Input neural features consisted of the threshold crossing rates from each electrode in the time window [0.5, 1.5] seconds after cue presentation.
As a proof-of-concept, we also connected the classifier output to the fingers of a robot hand (not shown in preprint). On each trial, a screen cue instructed the participant which finger to press. The BMI classifier predicted each finger movement from the neural features and then moved the corresponding finger on the robotic hand.
Neural distance between fingers

We quantified the neural activity differences between finger movements using the cross-validated (squared) Mahalanobis distance [55]. The Mahalanobis distance is a continuous, non-saturating analogue of LDA classification accuracy [56]. Cross-validation removes the positive bias of standard distance metrics, such that E[d²_jk] = 0 when two activity patterns are statistically identical.

To calculate population distances, we used the representational similarity analysis Python toolbox [57]. The toolbox slightly modifies the cross-validated Mahalanobis equation, incorporating the noise covariances of both folds to improve robustness:

d²_jk = (b_j^A − b_k^A) ((Σ_A + Σ_B) / 2)^(−1) (b_j^B − b_k^B)^T / n

where A and B indicate independent partitions of the trials, Σ_A and Σ_B are the noise covariance matrices of the respective partitions, (b_j, b_k) are the firing rate vectors for conditions (j, k) stacked across trials, and n normalizes for the number of neurons. The units of d²_jk are unitless²/neuron.
Shared representations across hands

To quantify whether finger representations were similar across hands, we compared the pairwise distances between matching finger pairs and the pairwise distances between non-matching finger pairs (Figure 8b). We denoted a finger pair as matching if the hands differed and the finger-types were the same ([Lt, Rt], [Li, Ri], [Lm, Rm], [Lr, Rr], [Lp, Rp]). We denoted a finger pair as non-matching if the hands differed and the finger-types also differed ([Lt, Ri], [Lt, Rm], [Lt, Rr], [Lt, Rp], [Li, Rt], [Li, Rm], etc.). We described a neural population as sharing representations across hands if the average distance between matching finger pairs was smaller than the average distance between non-matching finger pairs.
Compositionality of finger representations

Compositional coding refers to representations that can be constructed by combining and recombining basic components [58,59]. We assessed whether finger representations could be linearly decomposed into the sum of finger-type and laterality components.

We first visualized the representational geometry in Figure 8d using 2-D multidimensional scaling (MDS). MDS projects the movement conditions into a low-dimensional space that preserves pairwise distances (Figure 8a) as well as possible. We performed MDS on data from individual sessions and then used Generalized Procrustes Analysis (GPA) with scaling to normalize and align MDS projections across sessions. In the N-PPC MDS plot, ellipses show standard error (S.E.) across sessions. The J-PPC and J-MC MDS plots show the condition means without any S.E. ellipses, because the 2 sessions with participant J are not sufficient to estimate the S.E.
To assess compositionality, we used leave-one-group-out cross-validation to determine whether hand- and finger-dimensions generalize to left-out movements (Supplementary Figure 8). If finger representations are compositional, then hand classifiers (left vs. right) should generalize when trained on a subset of finger types and evaluated on left-out fingers. Additionally, finger-type classifiers should generalize when trained on one hand and tested on the other hand (Figure 8e). This metric is sometimes called cross-condition generalization performance (CCGP) [60,61]. We pooled neurons across sessions (N: 10 sessions; J: 2) into a pseudo-population. We used a permutation test to assess whether CCGP was significantly above chance, shuffling the labels repeatedly (N=1001) to generate a null distribution. Standard, within-condition decoding accuracy provided an upper bound on CCGP. Reaching this upper bound implies perfect compositionality.
RESULTS

Single-neuron modulation to individual finger presses

We first sought to determine whether PPC single neurons discriminate between individual finger movements. We quantified single-neuron modulation to attempted finger presses of the right (contralateral to the implant) hand while the participant performed the Alternating-cues finger press task with delay (participant N: 120 trials per session for 4 sessions; participant J: 112 trials per session [min: 96; max: 120] for 3 sessions). We recorded 118 neurons per session (min: 111; max: 128) over 4 sessions from N-PPC, 103 neurons per session (min: 92; max: 116) over 3 sessions from J-PPC, and 93 neurons per session (min: 90; max: 95) from J-MC. For each neuron, we calculated firing rates during the attempted movement period and compared firing rates across conditions (Figure 4a, Supplementary Figure 2, Supplementary Figure 3).

Similar to results from finger studies of the motor cortex hand area [42,62], PPC neurons were not anatomically segregated by finger selectivity. A large portion of neurons (N-PPC: 54%; J-PPC: 30%; J-MC: 78%; Figure 4c) discriminated firing rates across conditions (q < 0.05), and selective neurons were often selective for multiple finger movements (mean number of significant fingers, N-PPC: 2.1; J-PPC: 1.9; J-MC: 2.7). Moreover, many neurons discriminated between finger conditions with large effect sizes (percentage of neurons with η² > 0.14, N-PPC: 40%; J-PPC: 25%; J-MC: 64%; Figure 4d-e).
Figure 4. PPC single neurons discriminate between attempted finger movements.
a) Single-trial firing rates for an example N-PPC neuron during movements of different fingers. (top) Markers correspond to the firing rate during each trial (N=120 across 6 conditions). Gapped vertical lines to the right of markers indicate +/- S.D., and each gap indicates the condition mean. (bottom) Firing rates during thumb (T) and index (I) presses were higher than the No-go (X) baseline. Vertical bars indicate bootstrap 95% confidence intervals (CI) of the effect size versus No-go baseline. Half-violin plots indicate bootstrap distributions.
b) Mean smoothed firing rates for each finger movement for two example N-PPC neurons. Shaded areas indicate 95% CI.
c) Percentage of N-PPC neurons that discriminated between finger movements in each analysis window (q < 0.05, FDR-corrected for 466 neurons). Line (blue) indicates mean across sessions. Markers (gray) indicate individual sessions.
d) Complementary empirical cumulative distribution function (cECDF) visualizing the proportion of N-PPC neurons with ANOVA effect sizes (η²) above the corresponding x-axis value. Vertical lines (gray) indicate Cohen's thresholds [51] for small (η²=0.01), medium (η²=0.06), and large (η²=0.14) effect sizes.
e) Overlap of N-PPC neurons that modulated significantly (q < 0.05) with large effect sizes (η² > 0.14) during movement preparation and movement execution.
We also quantified single-neuron modulation during movement preparation. Preparatory activity discriminated between finger conditions with reasonable effect sizes (Figure 4d). Consistent with reaching studies of PPC [10], slightly fewer PPC neurons had strong tuning (q < 0.05 and η² > 0.14) during movement preparation (N-PPC: 24%; J-PPC: 23%) than during movement execution (N-PPC: 43%; J-PPC: 24%) (Figure 4e).
Classifying finger presses from neural activity

Since single neurons were tuned to finger movements, we evaluated whether attempted finger movements could be classified (offline) from the population neural activity. Using data from the same task, we trained linear classifiers and assessed finger classification accuracy on held-out trials using cross-validation (Methods). Classification accuracies substantially exceeded chance (N-PPC: 86%; J-PPC: 64%; J-MC: 84%; chance: 17%). The majority (N-PPC: 75%; J-PPC: 42%; J-MC: 67%) of errors misclassified an adjacent finger (Figure 5a, Supplementary Figure 4, Supplementary Figure 5).

Classification accuracy can depend on the neural signal quality and prediction window. To better understand how finger classification varies over dataset and classifier parameters, we quantified cross-validated accuracy across different training dataset sizes, neuron counts, and window durations (Figure 5b-d, Supplementary Figure 4, Supplementary Figure 5). Cross-validated accuracy increased with more training data, reaching 80% accuracy when training on about 40 trials (2.7 minutes) for N-PPC. Higher neuron counts provided more finger information and thus improved classification accuracy, reaching 80% accuracy at about 70 neurons for N-PPC. These results indicate that a single electrode array in PPC provides sufficient information to control a discrete finger-press prosthetic.

Accuracy also increased when using longer window durations, reaching 80% at durations above 350 ms. Longer window durations average out firing rates and thereby reduce the impact of measurement noise and behavioral variability on classification, but they directly mandate longer control delays. In some cases, it may be useful to minimize BMI control latency even at the expense of accuracy [63].
Figure 5. Offline classification of finger movement execution from population activity.
a) Cross-validated confusion matrix for classifying finger movement from N-PPC neural activity. 86% accuracy, 480 trials over 4 sessions.
b) Learning curve showing cross-validated accuracy as a function of the training dataset size. About 40 trials (less than 7 trials per finger) are needed to achieve 80% accuracy. Shaded area indicates 95% CI over folds/sessions.
c) Neuron-dropping curve (NDC) showing cross-validated accuracy as a function of recorded neurons. Neurons were aggregated across sessions. About 70 neurons are needed to achieve 80% accuracy.
d) Hyperparameter sweep showing cross-validated classification accuracy as a function of decode window size. Input features were the average firing rates in the window [200 ms, 200 ms + window size] after Go-cue. Window durations of about 350 ms are necessary to achieve 80% accuracy.
e) Cross-validated classification accuracy across the trial duration (500-ms sliding window).
Finger movements could also be decoded from PPC during the planning period (Figure 5e), although classification accuracy was lower (N-PPC: 66%; J-PPC: 61%; chance: 17%) than during movement execution.
Brain-machine interface control of finger movements

We next mapped neural activity to finger movements to control an online finger BMI, where our participants would tap each finger and their attempted movement would be decoded. For this section, we replicated a usage scenario where a prosthetic user could decide to move a finger and immediately execute the movement, without needing a delay period.

We started each session with an open-loop calibration task where the participant attempted to press fingers according to visual cues (Methods). Using only a short calibration period (8 repetitions per condition, totaling