attention mechanisms

30 Dec 2022 04:13 - 17 Jun 2023 08:29
    • Attention (machine learning) - Wikipedia
      • Attention-like mechanisms were introduced in the 1990s under names like multiplicative modules, sigma-pi units, and hypernetworks. Their flexibility comes from their role as "soft weights" that are computed at runtime from the inputs, in contrast to standard weights, which are fixed at runtime.
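      • The "soft weights" contrast can be sketched in a few lines; this is a minimal, illustrative NumPy implementation of dot-product attention (not from the source), where the mixing weights are recomputed per input rather than stored as fixed parameters:

        ```python
        import numpy as np

        def softmax(x, axis=-1):
            # Numerically stable softmax.
            e = np.exp(x - x.max(axis=axis, keepdims=True))
            return e / e.sum(axis=axis, keepdims=True)

        def attention(queries, keys, values):
            # "Soft weights": derived from the inputs at runtime,
            # unlike a fixed learned weight matrix.
            scores = queries @ keys.T / np.sqrt(keys.shape[-1])
            weights = softmax(scores, axis=-1)  # each row sums to 1
            return weights @ values, weights

        rng = np.random.default_rng(0)
        q = rng.normal(size=(2, 4))  # 2 queries, dim 4
        k = rng.normal(size=(3, 4))  # 3 keys
        v = rng.normal(size=(3, 4))  # 3 values
        out, w = attention(q, k, v)
        print(out.shape)
        ```

        Feeding different queries or keys yields different `weights`, which is exactly what a fixed weight matrix cannot do between training runs.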