It seems inevitable that LLMs will increasingly be trained on data generated by other LLMs, creating a hallucination feedback loop.