Causal Attn

  • Related Project: Private
  • Category: Paper Review
  • Date: 2025-09-10

Causal Attention with Lookahead Keys

  • url: https://arxiv.org/abs/2509.07301
  • pdf: https://arxiv.org/pdf/2509.07301
  • abstract: In standard causal attention, each token’s query, key, and value (QKV) are static and encode only preceding context. We introduce CAuSal aTtention with Lookahead kEys (CASTLE), an attention mechanism that continually updates each token’s keys as the context unfolds. We term these updated keys lookahead keys because they belong to earlier positions yet integrate information from tokens that appear later relative to those positions, while strictly preserving the autoregressive property. Although the mechanism appears sequential, we derive a mathematical equivalence that avoids explicitly materializing lookahead keys at each position and enables efficient parallel training. On language modeling benchmarks, CASTLE consistently outperforms standard causal attention across model scales, reducing validation perplexity and improving performance on a range of downstream tasks.
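To make the idea in the abstract concrete, below is a minimal, naive sketch of what "lookahead keys" could mean operationally: for each query position t, the key of every earlier position j ≤ t is refreshed using only tokens j..t, so keys integrate later context without ever seeing tokens beyond t. The update rule here (a mean-pooled summary passed through a hypothetical matrix `Wu`) and all function/parameter names are assumptions for illustration; the paper's actual formulation and its efficient parallel training equivalence are not reproduced here.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def lookahead_key_attention_naive(X, Wq, Wk, Wv, Wu):
    """Naive O(T^3) sketch of causal attention with lookahead keys.

    For each query position t, the key of every position j <= t is
    refreshed with a summary of tokens j..t (here: a mean pooled through
    a hypothetical update matrix Wu), so the key "looks ahead" relative
    to position j while never using tokens beyond t. This is an
    illustrative construction, not the paper's update rule or its
    efficient parallel form.
    """
    T, d = X.shape
    Q = X @ Wq
    V = X @ Wv
    base_K = X @ Wk
    out = np.zeros((T, d))
    for t in range(T):
        # Lookahead keys for positions 0..t: key j uses only tokens j..t,
        # preserving the autoregressive property at step t.
        K_t = np.stack([
            base_K[j] + X[j:t + 1].mean(axis=0) @ Wu  # hypothetical update
            for j in range(t + 1)
        ])
        scores = Q[t] @ K_t.T / np.sqrt(d)
        out[t] = softmax(scores) @ V[:t + 1]
    return out

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
T, d = 5, 8
X = rng.normal(size=(T, d))
Wq, Wk, Wv, Wu = (rng.normal(size=(d, d)) * 0.1 for _ in range(4))
print(lookahead_key_attention_naive(X, Wq, Wk, Wv, Wu).shape)  # (5, 8)
```

The loop over t makes the causal constraint explicit but is deliberately inefficient; the paper's contribution includes a mathematical equivalence that avoids materializing the per-position lookahead keys and allows parallel training.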