Recently, the father of a former work colleague died. The former colleague posted a lot about it on a social-media channel. There's nothing notable about that; communicating a tragedy is, hopefully, a way to get past the worst of the feelings, to help the transition into feeling better, to sort of convert sorrow into lovely memories.

Reading two lines of text made me realise the person had used AI to write their posts. All of them. What the person wrote felt off-key and strange. It didn't feel human. I sent one of the texts to Pangram; the system reported with high confidence that 100% of the text was AI-generated.

Here's my point: why the fuck do you want autocorrect on steroids to say something about your dead father? How could you think it's a good idea to let AI, a system that acts mainly on probability, describe your dead father? Can AI honour your father's memory in any way? Finally: if you let AI, a mechanical system filled with bullshit that's made to sound like something other than yourself…