I think a lot of people are confused about modern AI models being mostly "coding models", and conclude that because of this they're not good for other types of work. I think that's the wrong way to think about it. Or one of the reasons, anyway.

I'd say the primary reason so many AI labs are optimizing for coding is that augmenting or replacing coding work is immediately useful to companies and developers. In other words, it makes money. So, no mystery there.

The meta-reason

But I think a bigger reason these models are so good at so many things is that coding is a meta-skill. Not all of it, but a lot of it.

Coding, or code really, is fundamentally a structured type of problem solving. And when a model gets better at coding, it gets better at solving all kinds of problems at the same time. Not exactly, but largely.

So when a model gets better at coding, it's getting better at getting better.

So next time you hear some model is doing really well on coding, remember that that maps pretty closely to…