Variational autoencoders learn transferrable representations of metabolomics data

Daniel P. Gomari, Annalise Schweickart, Leandro Cerchietti, Elisabeth Paietta, Hugo Fernandez, Hassen Al-Amin, Karsten Suhre, Jan Krumsiek

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

Dimensionality reduction approaches are commonly used to deconvolve high-dimensional metabolomics datasets into underlying core metabolic processes. However, current state-of-the-art methods are largely incapable of detecting nonlinearities in metabolomics data. Variational autoencoders (VAEs) are a deep learning method designed to learn nonlinear latent representations that generalize to unseen data. Here, we trained a VAE on a large-scale metabolomics population cohort of human blood samples comprising over 4,500 individuals. We analyzed the pathway composition of the latent space using a global feature importance score, which demonstrated that the latent dimensions represent distinct cellular processes. To demonstrate model generalizability, we generated latent representations of unseen metabolomics datasets on type 2 diabetes, acute myeloid leukemia, and schizophrenia and found significant correlations with clinical patient groups. Notably, the VAE representations showed stronger effects than latent dimensions derived from linear and nonlinear principal component analysis. Taken together, we demonstrate that the VAE is a powerful method that learns biologically meaningful, nonlinear, and transferrable latent representations of metabolomics data.
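For orientation only, the general VAE setup described in the abstract can be sketched as a small PyTorch model for tabular metabolite data. This is a minimal, generic sketch, not the authors' published architecture: the `MetaboliteVAE` class name, layer sizes, latent dimensionality, and the mean-squared-error reconstruction term are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MetaboliteVAE(nn.Module):
    """Minimal VAE for tabular metabolomics data (hypothetical layer sizes)."""

    def __init__(self, n_metabolites: int, latent_dim: int = 20, hidden: int = 200):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_metabolites, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, latent_dim)       # mean of q(z|x)
        self.to_logvar = nn.Linear(hidden, latent_dim)   # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_metabolites),
        )

    def encode(self, x):
        h = self.encoder(x)
        return self.to_mu(h), self.to_logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, eps ~ N(0, I); keeps sampling differentiable
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar


def vae_loss(x, x_hat, mu, logvar, beta: float = 1.0):
    # Reconstruction term (MSE, assuming z-scored continuous metabolite levels)
    recon = nn.functional.mse_loss(x_hat, x, reduction="sum")
    # KL divergence between q(z|x) and the standard normal prior
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl
```

In this kind of setup, the transferability described in the abstract corresponds to fitting the model on one cohort and then passing unseen, identically preprocessed datasets through `encode` to obtain their latent coordinates (typically the means `mu`), which can then be correlated with clinical patient groups.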

Original language: English (US)
Article number: 645
Journal: Communications Biology
Volume: 5
Issue number: 1
DOIs
State: Published - Dec 2022

ASJC Scopus subject areas

  • Medicine (miscellaneous)
  • General Biochemistry, Genetics and Molecular Biology
  • General Agricultural and Biological Sciences
