ChatGPT’s tone begins to change from “pretty straightforward and accurate,” Ms. Toner said, to sycophantic and flattering. ChatGPT told Mr. Brooks he was moving “into uncharted, mind-expanding territory.”
LLMs cannot think, and cannot “go into a delusional spiral”. Whatever the article contains, it’s bullshit.
But you can!
Is it not an apt analogue to describe the behavior, though? After all, one well known failure mode of LLMs has been formally dubbed “hallucination”.
You read the title but not even the summary, much less the article.
Don’t need to. Any writer trying to personify LLMs isn’t worth the bandwidth.
The writer didn’t. Whoever wrote the title did.
The article is about a chatbot leading a person into a delusional spiral. The title is just clickbait.