OpenAI and Anthropic are Friendster and MySpace, if subquadratic attention proves out.
If you've ever shipped an LLM-powered feature that needed to reason over a real codebase, a real contract, or a real research corpus, you already know the shape of the problem. The model technically accepts a million tokens of context. In practice, the answers get worse as the context gets longer, and your infra bill gets worse faster than that. SubQ is built around SSA (Subquadratic Sparse Attention), a linearly scaling attention mechanism designed for long-context retrieval and reasoning.
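The post doesn't detail SSA's internals, but the core trick behind any subquadratic sparse attention is easy to sketch: if each query attends to a bounded subset of keys rather than all of them, cost drops from O(n²) to O(n·w). Here's a minimal toy illustration using a sliding window; this is a generic sparse-attention variant for intuition, not SSA itself, and the function name and window size are made up for the example:

```python
import numpy as np

def sliding_window_attention(q, k, v, window=64):
    """Toy sparse attention: each query position attends only to the
    `window` most recent positions (including itself), so total cost
    is O(n * window) rather than O(n^2). Generic sketch, not SSA."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo = max(0, i - window + 1)          # start of this query's window
        scores = q[i] @ k[lo:i + 1].T / np.sqrt(d)
        weights = np.exp(scores - scores.max())  # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:i + 1]       # weighted sum of local values
    return out

# Doubling the sequence length doubles the work instead of quadrupling it,
# which is the whole economic argument for subquadratic attention.
```

Real subquadratic mechanisms use more sophisticated sparsity patterns (learned, strided, or retrieval-driven) and fused kernels, but the scaling argument is the same: bound the per-query work and the quadratic blowup disappears.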