I don't think it's very hard to predict the exponential rise of INT in large LLMs in the near future, but what about on-prem local models and their rise? Every day I see reports of people running decent models on their own M4 or dual-card Windows rigs, models up to 27B on airplanes. When does this become the dedicated fallback for API access to the big three?

I'm old enough to remember search in its infancy, and I've seen a lot of people comparing this moment to the search wars (think Google/Yahoo/MSN), when those players fought it out and Google emerged as king of the hill. This feels different. No one wants to revisit dominance by a single player the way it happened with search. It's almost as if the foreseeable future shows two tiers, the paid and the local, with both accelerating quickly. Search was also a horsepower/hardware problem: whoever had the most crawlers, could crawl the most frequently, and had the quickest SERP load times generally did better. But AI is still a different animal. People seem to be…