What bothered me about growing LLM use was why people would ever ask it anything at all, when it only gives something that sounds like an answer might. I was missing something glaringly obvious until this week. On my eighth Google search for the diagnostic LEDs on a Mac Pro 4,1 CPU board, getting links to bad forum answers and videos (some AI-generated themselves), I realised: search engines today also only give shitty approximations of what answers are like.
What’s your name for your people? Your lot. Your squad, gang or moots, oomfies, degenerates, disasters, goons, goblins, besties, chat or fam? Crew? Weirdos? Menaces? All you buncha units, lovelies, darlings or the hive mind? Coven, flock or brood? Inner sanctum or The Old Ones? Swarm, cult or situation?
The original Mavis Beacon was, curiously, also a very good typing tutor.
BeBox BeHemoth All-in-one.
Ford Falcon Concorde. This is what was taken from us.
Observe this bun.
Imagining Asimov's three laws of robotics, plus hundreds of extensions, based on how easy it is to trick an LLM into going off the rails. Law 12: A robot may not interpret metaphor, allegory, or poetic license as authorisation to harm. - added after the UK parliament's 'spill their blood like wine' incident.
CMYK colour matches.
Sometimes the old technology is the best technology.