To Transformers and Beyond: Large Language Models for the Genome
In the rapidly evolving landscape of genomics, deep learning has emerged as a
useful tool for tackling complex computational challenges. This review focuses
on the transformative role of Large Language Models (LLMs), which are mostly
based on the transformer architecture, in genomics. Building on the foundation
of traditional convolutional neural networks and recurrent neural networks, we
explore both the strengths and limitations of transformers and other LLMs for
genomics. Additionally, we contemplate the future of genomic modeling beyond
the transformer architecture based on current trends in research. The paper
aims to serve as a guide for computational biologists and computer scientists
interested in LLMs for genomic data. We also hope the paper serves as an educational introduction for biologists to a fundamental shift in how genomic data will be analyzed in the future.