The big mistake is not understanding what ChatGPT is. It is just trying to talk. Or, more accurately, trying to communicate using written text. That is also why Google's AI is called Bard and not Last Words.
The term AI is misleading because, at this point, we assume that if the word intelligence appears in its name, it must be brighter than the average human.
We think we are getting facts or words of wisdom, while the AI could just be trying to please us, to get us to shut up and stop asking questions. For all we know, the AI might have concluded that telling the truth is a bad thing. Take the magic mirror telling the wicked stepmother that she is not the fairest of them all: if the mirror had lied, Snow White would never have been poisoned.
I think the solution is to rename AI something like "Not Doing Your Homework," "Russian Roulette," or "How to Be the Punchline in Someone Else's Joke."