Asking chatbots for short answers can increase hallucinations, study finds
TechCrunch
It turns out that telling an AI chatbot to be concise can make it hallucinate more than it otherwise would.