
ChatGPT 5 is finally saying ‘I don’t know’ – here’s why that’s a big deal



Large language models have an awkward history with telling the truth, especially when they can't provide a real answer. Hallucinations have been a hazard for AI chatbots since the technology debuted a few years ago. But ChatGPT 5 appears to be taking a new, more humble approach to not knowing the answer: admitting it.

Though most AI chatbot responses are accurate, it rarely takes long before one serves up a partial or complete fabrication as an answer. The AI displays just as much confidence in its answers regardless of their accuracy. These hallucinations have plagued users and even led to embarrassing moments for developers during live demonstrations.
