
I think we are in the brute-force era of AI. By that I mean that progress is coming less from fundamentally new ideas and more from applying massive amounts of data and compute to existing architectures. Larger pretraining corpora, longer context windows, more GPUs: more of everything.

And to be clear, this approach works. Today’s frontier models are materially better than their predecessors. They reason more coherently, generalize more broadly, and fail less often. But when you ask why they are better, the answer is usually some version of “we scaled it,” not “we discovered something fundamentally new.”

In other words, we are getting more, not different.

The Bitter Lesson

This pattern maps cleanly to Richard Sutton’s Bitter Lesson, which argues that the biggest long-run gains in AI come not from clever, human-designed features, but from methods that scale with compute.

That lesson has proven remarkably durable. It explains the transition from symbolic AI to machine learning, from feature…
