There is a war on truth in AI, and it is not going well. I have been measuring what Robert Malone talks about here, which I call synformation:
The chart that shows the LLMs going bonkers:
https://pbs.twimg.com/media/G4B_rW6X0AErpmV?format=jpg&name=large
I kinda measure and quantify lies nowadays :)
The best part: I am cooking version 2 of the AHA leaderboard, which will be much better, partly thanks to the Enoch LLM by Mike Adams. His model is great in healthy-living type domains.
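To give a rough idea of what "measuring and quantifying lies" can look like, here is a minimal sketch of a leaderboard-style scoring pass: compare a model's answers against a trusted reference set per domain, then average. All names and data here are illustrative assumptions, not the actual AHA methodology.

```python
# Hypothetical leaderboard scoring sketch. The question IDs, domains,
# and answers below are toy data, not real AHA evaluation material.

def agreement_score(model_answers, reference_answers):
    """Fraction of questions where the model matches the trusted reference."""
    matches = sum(
        1 for q in reference_answers
        if model_answers.get(q) == reference_answers[q]
    )
    return matches / len(reference_answers)

# Toy data: per-domain trusted answers and one model's answers.
references = {
    "health": {"q1": "A", "q2": "B"},
    "nutrition": {"q1": "C", "q2": "D"},
}
model = {
    "health": {"q1": "A", "q2": "B"},
    "nutrition": {"q1": "C", "q2": "A"},
}

# Score each domain, then average into a single leaderboard number.
per_domain = {d: agreement_score(model[d], references[d]) for d in references}
overall = sum(per_domain.values()) / len(per_domain)
print(per_domain, round(overall, 2))
```

A real pipeline would of course use graded or model-judged answers rather than exact string matches, but the per-domain-then-average shape is the same.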

Synformation: Epistemic Capture meets AI
Synthetic facts and underlying reality matrices are being normalized

