sEEG-based Encoding for Sentence Retrieval: A Contrastive Learning Approach to Brain-Language Alignment
Paper PDF: http://arxiv.org/pdf/2504.14468v1
Check my merch: https://dragonprof-2.creator-spring.com
Interpreting neural activity through meaningful latent representations
remains a complex and evolving challenge at the intersection of neuroscience
and artificial intelligence. We investigate the potential of multimodal
foundation models to align invasive brain recordings with natural language. We
present SSENSE, a contrastive learning framework that projects single-subject
stereo-electroencephalography (sEEG) signals into the sentence embedding space
of a frozen CLIP model, enabling sentence-level retrieval directly from brain
activity. SSENSE trains a neural encoder on spectral representations of sEEG
using InfoNCE loss, without fine-tuning the text encoder. We evaluate our
method on time-aligned sEEG and spoken transcripts from a naturalistic
movie-watching dataset. Despite limited data, SSENSE achieves promising
results, demonstrating that general-purpose language representations can serve
as effective priors for neural decoding.
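To make the training recipe described above concrete, here is a minimal sketch of the contrastive setup the abstract outlines: an sEEG encoder trained with a symmetric InfoNCE loss against sentence embeddings from a frozen CLIP text encoder. The names (SEEGEncoder, info_nce), layer sizes, feature dimensions, and the use of precomputed CLIP sentence embeddings are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch: align spectral sEEG features with frozen CLIP sentence
# embeddings via a symmetric InfoNCE loss. All architecture details here
# are assumptions for illustration, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SEEGEncoder(nn.Module):
    """Maps spectral sEEG features (e.g. flattened per-channel band power
    over a sentence window) into the CLIP sentence embedding space."""

    def __init__(self, in_dim: int, embed_dim: int = 512, hidden: int = 1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.GELU(),
            nn.Linear(hidden, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Unit-normalize so dot products are cosine similarities, as in CLIP.
        return F.normalize(self.net(x), dim=-1)


def info_nce(brain_emb: torch.Tensor, text_emb: torch.Tensor,
             temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE: matched (sEEG, sentence) pairs are positives,
    all other pairings within the batch serve as negatives."""
    logits = brain_emb @ text_emb.t() / temperature        # (B, B) similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    B, in_dim, embed_dim = 32, 4096, 512
    encoder = SEEGEncoder(in_dim, embed_dim)
    optimizer = torch.optim.AdamW(encoder.parameters(), lr=1e-4)

    # Placeholders: spectral sEEG features and precomputed sentence embeddings
    # from a frozen CLIP text encoder (only the sEEG encoder receives gradients).
    seeg_features = torch.randn(B, in_dim)
    clip_text_emb = F.normalize(torch.randn(B, embed_dim), dim=-1)

    optimizer.zero_grad()
    loss = info_nce(encoder(seeg_features), clip_text_emb)
    loss.backward()
    optimizer.step()
    print(f"InfoNCE loss: {loss.item():.4f}")
```

At retrieval time, the same similarity matrix can be used directly: a held-out sEEG segment is encoded and ranked against the CLIP embeddings of candidate transcript sentences.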
Video "sEEG-based Encoding for Sentence Retrieval: A Contrastive Learning Approach to Brain-Language Alignment" from the channel AI Papers - Vuk Rosić
Tags: Brain-Language Alignment with CLIP, Contrastive Learning for Brain Signals, Frozen Text Encoder in Contrastive Learning, InfoNCE Loss in Neural Decoding, Multimodal Foundation Models for Neuroscience, Naturalistic Movie-Watching Dataset Analysis, Sentence Embedding from Brain Activity, Spectral Representations of sEEG Signals, Stereo-Electroencephalography Signal Encoding, sEEG-based Sentence Retrieval
Video information
April 25, 2025, 2:46:00
00:05:32