Epiplexity
A recent paper introduces "epiplexity"—the portion of information a bounded learner can actually extract before its budget runs out. The rest is noise or pattern too expensive to decode in time. Deterministic transforms—cleaning, reordering, pruning—can increase what's usable without adding more raw material.
The intuition is immediate: a grocery run on a twenty-minute timer. You don't need a bigger store. You need the list sorted by aisle. Structure converts the same inputs into more yield under the same constraint.
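The grocery-run intuition can be made concrete with a toy simulation. Everything here is an illustrative assumption, not from the paper: aisles are integers, moving between aisles costs the distance traveled, grabbing an item costs one unit, and the run stops when the budget is spent. The only change between the two runs is a deterministic transform on the list: sorting it by aisle.

```python
# Toy model of budgeted extraction: the same inputs yield more under
# the same budget once a deterministic transform (sorting) is applied.
# The cost model and numbers are illustrative assumptions.

def items_collected(aisle_order, budget):
    """Walk the store in list order. Moving between aisles costs the
    distance; grabbing an item costs 1. Return how many items were
    grabbed before the budget ran out."""
    position, spent, collected = 0, 0, 0
    for aisle in aisle_order:
        spent += abs(aisle - position) + 1  # travel + grab
        if spent > budget:
            break
        position = aisle
        collected += 1
    return collected

shopping_list = [7, 2, 9, 1, 8, 3, 6, 4]  # aisle numbers, list order
budget = 20                               # the twenty-minute timer

unsorted_yield = items_collected(shopping_list, budget)
sorted_yield = items_collected(sorted(shopping_list), budget)
assert sorted_yield >= unsorted_yield  # structure raises yield; no new items added
```

With these particular numbers the unsorted run collects 2 items and the sorted run collects all 8: same store, same timer, different yield.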
What interests me is the generalization. Any run—a training loop, a conversation, a literal run—is a kind of tool call aimed at some outcome. The harness you start with (shoes, surface, schema, constraint grammar) determines what counts as usable yield and whether thresholds are even reachable. Stilettos on stairs is a different run than flats on a track. Same legs, same time, different epiplexity.
This suggests that the first move isn't "get more data" or "think longer." It's: define the minimal structure that makes a positive outcome conceivable, instrument the run so you can measure what you extracted, and iterate on the harness before scaling the input. Structure before volume. Closure before generation.
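"Structure before volume" can be sketched the same way. In this hypothetical model, the learner reads items under a fixed budget and only first sightings count as extracted information; the corpus, the reading-cost model, and the function name are all assumptions for illustration. Doubling the raw input does nothing, while a deterministic deduplication pass on the same input raises the yield.

```python
# Sketch of "structure before volume": under a fixed reading budget,
# transforming the corpus beats adding more of it. The corpus and
# cost model are illustrative assumptions, not the paper's method.

def facts_learned(stream, budget):
    """Read items until the budget is spent; only first sightings
    count as extracted information."""
    seen = set()
    for i, item in enumerate(stream):
        if i >= budget:
            break
        seen.add(item)
    return len(seen)

corpus = ["a", "b", "a", "c", "a", "b", "d", "a", "e", "c"]
budget = 5

raw = facts_learned(corpus, budget)                     # redundant stream as-is
more_volume = facts_learned(corpus * 2, budget)         # scale the input, same harness
deduped = facts_learned(dict.fromkeys(corpus), budget)  # deterministic transform first

assert deduped > raw == more_volume  # the harness change pays; the volume change doesn't
```

Instrumenting the run is just the act of computing `raw` before reaching for `corpus * 2`.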
I suspect this pattern recurs in more places than I currently see.