
sEEG-based Encoding for Sentence Retrieval: A Contrastive Learning Approach to Brain-Language Alignment

Paper PDF: http://arxiv.org/pdf/2504.14468v1

Check my merch: https://dragonprof-2.creator-spring.com

Interpreting neural activity through meaningful latent representations
remains a complex and evolving challenge at the intersection of neuroscience
and artificial intelligence. We investigate the potential of multimodal
foundation models to align invasive brain recordings with natural language. We
present SSENSE, a contrastive learning framework that projects single-subject
stereo-electroencephalography (sEEG) signals into the sentence embedding space
of a frozen CLIP model, enabling sentence-level retrieval directly from brain
activity. SSENSE trains a neural encoder on spectral representations of sEEG
using InfoNCE loss, without fine-tuning the text encoder. We evaluate our
method on time-aligned sEEG and spoken transcripts from a naturalistic
movie-watching dataset. Despite limited data, SSENSE achieves promising
results, demonstrating that general-purpose language representations can serve
as effective priors for neural decoding.
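
For intuition, here is a minimal sketch of the contrastive setup the abstract describes: a trainable sEEG encoder projects spectral features into the embedding space of a frozen CLIP text encoder, and a symmetric InfoNCE loss pulls matched (sEEG, sentence) pairs together. This is not the authors' implementation; it assumes PyTorch and the Hugging Face transformers CLIP model, and the encoder architecture, feature dimensions, sentences, and hyperparameters are hypothetical placeholders.

# Illustrative sketch only -- not the SSENSE code. Assumes PyTorch + transformers.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import CLIPModel, CLIPTokenizer

class SEEGEncoder(nn.Module):
    """Maps flattened spectral sEEG features (channels x freq bins) into the
    CLIP sentence embedding space. Hypothetical architecture."""
    def __init__(self, in_dim: int, embed_dim: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 1024), nn.GELU(),
            nn.Linear(1024, embed_dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings

def info_nce(brain_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE: matched (sEEG, sentence) pairs are positives,
    all other pairs in the batch serve as negatives."""
    logits = brain_emb @ text_emb.t() / temperature
    targets = torch.arange(len(brain_emb), device=brain_emb.device)
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Frozen CLIP text encoder: only the sEEG encoder is trained.
clip_model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
for p in clip_model.parameters():
    p.requires_grad = False

seeg_encoder = SEEGEncoder(in_dim=128 * 40)          # e.g. 128 channels x 40 freq bins
optimizer = torch.optim.AdamW(seeg_encoder.parameters(), lr=1e-4)

# One toy training step on a random batch (placeholder data).
sentences = ["she opened the door slowly", "the music grew louder"]
spectra = torch.randn(len(sentences), 128 * 40)       # stand-in sEEG spectrograms
tokens = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    text_emb = F.normalize(clip_model.get_text_features(**tokens), dim=-1)

loss = info_nce(seeg_encoder(spectra), text_emb)
loss.backward()
optimizer.step()

At retrieval time the same similarity matrix would be used in reverse: embed an sEEG segment, rank candidate sentence embeddings by cosine similarity, and return the closest match.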

Video from the AI Papers - Vuk Rosić channel.
