Context, Stochasticity, and Meaning in Large Language Models
Karl Svozil
Published: April 18, 2025
Abstract
We analyze how polysemy is resolved through dynamic, context-aware vector embeddings, contrasting this with a quantum-inspired framework for semantic context. We then investigate the deliberate role of stochasticity in the generative process, identifying it as the core mechanism behind creative and varied outputs. Finally, we explore the analogy between these models' reliance on vector-space projections and the formalisms of quantum mechanics. These principles are presented within the complete training pipeline, from pre-training to reinforcement learning, to provide a cohesive understanding of both the construction and the intrinsic behavior of these models.