Self-supervised learning of T cell receptor sequences exposes core properties for T cell membership

Romi Goldner Kabeli, Sarit Zevin, Avital Abargel, Alona Zilberberg, Sol Efroni

Research output: Contribution to journal › Article › peer-review

Abstract

The T cell receptor (TCR) repertoire is an extraordinarily diverse collection of TCRs essential for maintaining the body's homeostasis and response to threats. In this study, we compiled an extensive dataset of more than 4200 bulk TCR repertoire samples, encompassing 221,176,713 sequences, alongside 6,159,652 single-cell TCR sequences from over 400 samples. From this dataset, we then selected a representative subset of 5 million bulk sequences and 4.2 million single-cell sequences to train two specialized Transformer-based language models for bulk (CVC) and single-cell (scCVC) TCR repertoires, respectively. We show that these models successfully capture core TCR qualities, such as sharing, gene composition, and single-cell properties. These qualities emerge in the encoded TCR latent space and enable classification by TCR-based properties such as public sequences. These models demonstrate the potential of Transformer-based language models in TCR downstream applications.
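The abstract describes encoding TCR sequences into a latent space with a Transformer-based language model and then classifying sequences by properties derived from that space. As a rough illustration of the encoding step only, the toy sketch below embeds a CDR3 amino-acid sequence and passes it through a single self-attention layer to produce a fixed-length latent vector. All names, dimensions, and weights here are hypothetical stand-ins; the actual CVC/scCVC architectures and pretrained weights are not reproduced.

```python
import numpy as np

# Hypothetical sketch: encode a TCR CDR3 amino-acid sequence into a
# fixed-length latent vector with one self-attention layer, in the spirit
# of a Transformer encoder. Random weights stand in for pretrained ones.

AA = "ACDEFGHIKLMNPQRSTVWY"            # the 20 standard amino acids
TOK = {a: i for i, a in enumerate(AA)}  # amino acid -> token index

rng = np.random.default_rng(0)
d = 16                                  # toy embedding dimension
emb = rng.normal(size=(len(AA), d))     # token embedding table
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))


def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def encode(cdr3: str) -> np.ndarray:
    """Embed a CDR3 string and mean-pool one self-attention layer's output."""
    x = emb[[TOK[a] for a in cdr3]]         # (L, d) token embeddings
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(d))    # (L, L) attention weights
    h = attn @ v                            # contextualised token vectors
    return h.mean(axis=0)                   # (d,) sequence-level latent


z = encode("CASSLGQAYEQYF")                 # an example CDR3 sequence
print(z.shape)                              # (16,)
```

In a real pipeline, such sequence-level vectors would be produced by the pretrained model and fed to a downstream classifier (for example, public vs. private sequence prediction).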

Original language: English
Pages (from-to): eadk4670
Journal: Science Advances
Volume: 10
Issue number: 17
DOIs
State: Published - 26 Apr 2024
