Interplay Between Belief Propagation and Transformer: Differential-Attention Message Passing Transformer

Chin Wa Lau, Xiang Shi, Ziyan Zheng, Haiwen Cao, Nian Guo

Published: September 19, 2025

Abstract

Transformer-based neural decoders have emerged as a promising approach to error-correction coding, combining data-driven adaptability with efficient modeling of long-range dependencies. This paper presents a novel decoder architecture that integrates classical belief propagation principles into a transformer design. We introduce a differentiable syndrome loss function that leverages the global codebook structure, and a differential-attention mechanism that optimizes the interaction between bit and syndrome embeddings. Experimental results demonstrate consistent performance improvements over existing transformer-based decoders, with our approach surpassing traditional belief propagation decoders on short-to-medium-length LDPC codes.
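
The abstract names two ingredients: a differentiable syndrome loss and a differential-attention mechanism. The sketch below is a minimal illustration under my own assumptions (a soft-XOR syndrome surrogate and a difference of two softmax attention maps), not the paper's exact formulation; the function names, the weighting factor `lam`, and the toy Hamming-code parity-check matrix are hypothetical.

```python
# Hedged sketch: a soft syndrome loss from a parity-check matrix H, and a
# differential-attention score formed as the difference of two softmax maps.
import torch
import torch.nn.functional as F

def soft_syndrome_loss(bit_probs: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
    """Differentiable surrogate for the syndrome check s = H x (mod 2).

    bit_probs: (batch, n) soft estimates P(bit = 1) in (0, 1).
    H:         (m, n) binary parity-check matrix.
    Uses the soft-XOR identity P(parity = 1) = 0.5 * (1 - prod(1 - 2 p_i))
    over the bits in each check; a valid codeword drives every term to 0.
    """
    signs = 1.0 - 2.0 * bit_probs                          # (batch, n), in (-1, 1)
    # Bits not participating in a check contribute a factor of 1 to the product.
    factors = torch.where(H.bool().unsqueeze(0),
                          signs.unsqueeze(1),
                          torch.ones_like(signs).unsqueeze(1))   # (batch, m, n)
    soft_parity = 0.5 * (1.0 - factors.prod(dim=-1))        # (batch, m)
    return soft_parity.mean()

def differential_attention(q1, k1, q2, k2, v, lam: float = 0.5):
    """Attention output using the difference of two softmax maps (illustrative)."""
    d = q1.shape[-1]
    a1 = F.softmax(q1 @ k1.transpose(-1, -2) / d ** 0.5, dim=-1)
    a2 = F.softmax(q2 @ k2.transpose(-1, -2) / d ** 0.5, dim=-1)
    return (a1 - lam * a2) @ v

# Toy usage with the (7,4) Hamming code parity-check matrix (illustrative only).
H = torch.tensor([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]], dtype=torch.float32)
probs = torch.sigmoid(torch.randn(2, 7))    # stand-in for soft decoder outputs
loss = soft_syndrome_loss(probs, H)
```

A syndrome penalty of this form can be added to a cross-entropy bit loss so the decoder is rewarded for producing outputs that satisfy the code's parity checks, which is one plausible way to "leverage the global codebook structure" as described above.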
