Transformer for Gene Expression Modeling (T-GEM): An Interpretable Deep Learning Model for Gene Expression-Based Phenotype Predictions

Date

2022-09-29

Authors

Zhang, Ting-He
Hasib, Md Musaddaqul
Chiu, Yu-Chiao
Han, Zhi-Feng
Jin, Yu-Fang
Flores, Mario
Chen, Yidong
Huang, Yufei

Abstract

Deep learning has been applied in precision oncology to address a variety of gene expression-based phenotype predictions. However, the unique characteristics of gene expression data challenge the computer vision-inspired design of popular deep learning (DL) models such as convolutional neural networks (CNNs) and call for interpretable DL models tailored to transcriptomics studies. To address the current challenges in developing an interpretable DL model for gene expression data, we propose a novel interpretable deep learning architecture called T-GEM, or Transformer for Gene Expression Modeling. We present the detailed design of T-GEM for modeling gene–gene interactions and demonstrate its utility for gene expression-based predictions of cancer-related phenotypes, including cancer type prediction and immune cell type classification. We carefully analyzed the learning mechanism of T-GEM and showed that the first layer has broader attention, while higher layers focus more on phenotype-related genes. We also showed that T-GEM's self-attention could capture important biological functions associated with the predicted phenotypes. We further devised a method to extract the regulatory network that T-GEM learns by exploiting the attributions of self-attention weights for classification, and showed that the network's hub genes were likely markers of the predicted phenotypes.
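
As a rough illustration of the gene-level self-attention idea summarized above, the following PyTorch sketch treats each gene as a token, so that attention weights reflect learned gene–gene interactions. It is a minimal sketch under stated assumptions, not the published T-GEM implementation; the class name GeneSelfAttentionClassifier and parameters such as n_genes, embed_dim, n_heads, and n_layers are hypothetical choices for illustration only.

```python
import torch
import torch.nn as nn


class GeneSelfAttentionClassifier(nn.Module):
    """Toy gene-level transformer classifier (not the published T-GEM code).

    Each gene's scalar expression value is projected to an embedding and
    treated as one token; self-attention over the gene tokens then models
    gene-gene interactions before a phenotype classification head.
    """

    def __init__(self, n_genes: int, embed_dim: int = 32,
                 n_heads: int = 4, n_layers: int = 2, n_classes: int = 10):
        super().__init__()
        # Per-gene token: projected expression value plus a learned
        # gene-identity embedding.
        self.expr_proj = nn.Linear(1, embed_dim)
        self.gene_embed = nn.Embedding(n_genes, embed_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads,
            dim_feedforward=2 * embed_dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.classifier = nn.Linear(embed_dim, n_classes)
        self.register_buffer("gene_ids", torch.arange(n_genes))

    def forward(self, expr: torch.Tensor) -> torch.Tensor:
        # expr: (batch, n_genes) expression matrix
        tokens = self.expr_proj(expr.unsqueeze(-1)) + self.gene_embed(self.gene_ids)
        attended = self.encoder(tokens)      # (batch, n_genes, embed_dim)
        pooled = attended.mean(dim=1)        # average over gene tokens
        return self.classifier(pooled)       # phenotype logits


# Usage: classify 16 samples measured over 200 genes into 10 phenotypes.
model = GeneSelfAttentionClassifier(n_genes=200, n_classes=10)
logits = model(torch.randn(16, 200))
print(logits.shape)  # torch.Size([16, 10])
```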

Keywords

phenotype prediction, interpretable deep learning, Transformer, cancer type prediction, immune cell type prediction

Citation

Cancers 14 (19): 4763 (2022)

Department

Electrical and Computer Engineering