ARTFEED — Contemporary Art Intelligence

Wisteria: A Multi-Scale DNA Language Model

other · 2026-05-09

A team of researchers has introduced Wisteria, a genomic language model that brings multi-scale feature learning to DNA sequences within a unified framework. The model augments a Mamba-based architecture with gated dilated convolutions to capture local motifs and gated multilayer perceptrons to refine global dependencies. A Fourier-based attention mechanism additionally enables modeling in the frequency domain and supports length generalization. Wisteria reports strong results on downstream benchmarks involving both short- and long-range dependencies.
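To make the "gated dilated convolution" idea concrete, here is a minimal pure-Python sketch of one such block applied to a toy DNA feature channel. This is an illustration of the general technique only; the function names, kernel sizes, and gating choice (a sigmoid gate, as in many gated convolution designs) are assumptions, not details taken from the Wisteria paper.

```python
from math import exp

def dilated_conv1d(x, w, dilation):
    """Causal dilated 1-D convolution (missing left context treated as zero)."""
    k = len(w)
    out = []
    for t in range(len(x)):
        s = 0.0
        for i in range(k):
            j = t - i * dilation  # reach back in steps of `dilation`
            if j >= 0:
                s += w[i] * x[j]
        out.append(s)
    return out

def sigmoid(v):
    return 1.0 / (1.0 + exp(-v))

def gated_dilated_conv1d(x, w_feat, w_gate, dilation):
    """Feature path modulated elementwise by a sigmoid gate path."""
    feat = dilated_conv1d(x, w_feat, dilation)
    gate = dilated_conv1d(x, w_gate, dilation)
    return [f * sigmoid(g) for f, g in zip(feat, gate)]

# Toy single-channel encoding of a short DNA fragment (e.g. A=1, others=0).
seq = [1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0]
print(gated_dilated_conv1d(seq, [0.5, 0.25], [1.0, -1.0], dilation=2))
```

Widening the dilation at successive layers lets such blocks cover progressively longer motifs without enlarging the kernel, which is why they pair naturally with a long-range backbone.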

Key facts

  • Wisteria is a genomic language model for DNA sequences.
  • It integrates multi-scale feature learning within a unified framework.
  • It augments a Mamba-based architecture with gated dilated convolutions.
  • Gated multilayer perceptrons refine global dependencies.
  • A Fourier-based attention mechanism supports frequency domain modeling.
  • It performs well on benchmarks with short- and long-range dependencies.
  • The paper is published on arXiv with ID 2605.05913.
  • The model addresses the interplay between local motifs and global dependencies.
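The Fourier-based attention mentioned above can be pictured as mixing tokens along the sequence axis in the frequency domain. The sketch below shows the general flavor with a naive discrete Fourier transform whose real part is kept (in the style of FNet-like mixers); the specific formulation in Wisteria is not detailed here, so treat every name and choice below as a hypothetical stand-in.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform of a real-valued sequence."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def fourier_mix(x):
    """Mix all tokens globally in one pass by transforming along the
    sequence axis and keeping the real part of each frequency bin."""
    return [v.real for v in dft(x)]

signal = [1.0, 0.0, -1.0, 0.0]  # toy per-token feature channel
print(fourier_mix(signal))
```

Because the transform touches every position at once, frequency-domain mixing captures global structure in a single step, and its position-independent form is one route to generalizing across sequence lengths.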

Entities

Institutions

  • arXiv
