2 hours ago · Tech · 0 comments

I watched this video yesterday and had a little epiphany that I shared at the Tinderbox forum. It doesn't seem to have resonated with anyone, but since it's my epiphany, I figure I ought to put it in the marmot.

It seems we're on the cusp of having on-device LLMs that can do many useful things. I'd say it's worth your time to watch the video all the way through, but I linked to the part that is just before the little epiphany.

"Data hygiene really matters." "A local model is most useful when it can read all your stuff."

There are hundreds of thousands of files on this computer. I know, because I just transferred them from the old computer. I don't recall how many hundreds of thousands, but I believe it's more than 600K.

Of those 600K files, I'm confident an AI could easily ignore >95% of them, just by their location within the file system. They have little to do with the user, except insofar as one or more of them may be a problem for a particular app, or some other glitch you'd like…
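The "ignore by location" idea is simple enough to sketch in a few lines. This is purely illustrative: the prefixes below are made-up, macOS-flavored examples standing in for whatever exclusion list a real on-device indexer would actually use.

```python
# Hypothetical sketch: decide whether a file is worth an LLM's attention
# based only on where it lives in the file system. The prefix list is an
# assumption for illustration, not a vetted exclusion set.

IGNORABLE_PREFIXES = (
    "/System/",
    "/Library/Caches/",
    "/Applications/",
    "/usr/",
)

def is_ignorable(path: str) -> bool:
    """True if the path sits in a location a local model could skip
    when indexing the user's own material."""
    return path.startswith(IGNORABLE_PREFIXES)

paths = [
    "/System/Library/Fonts/Helvetica.ttc",
    "/Users/me/Documents/notes/epiphany.md",
    "/Library/Caches/com.example.app/blob.bin",
    "/Users/me/Projects/post.txt",
]

# Keep only the files that plausibly belong to the user.
kept = [p for p in paths if not is_ignorable(p)]
```

On a real disk, a handful of prefixes like these would already account for the bulk of those 600K files; the hard remainder is the stuff that lives in user-adjacent places but still isn't yours.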
