Hacker News

I'm not sure how this won't get worse.

It seems inevitable that LLMs will be trained on more and more data generated by other LLMs, creating a hallucination feedback loop.



Recursive garbage in, garbage out.
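The feedback loop the comment describes can be sketched with a toy simulation (not anything from the thread itself; the function name and parameters are illustrative): repeatedly fit a Gaussian to samples drawn from the previous generation's fitted Gaussian. Estimation noise compounds across generations, and the fitted distribution drifts away from the original data, a simplified analogue of training models on model output.

```python
import random
import statistics

def generational_refit(mu=0.0, sigma=1.0, n_samples=20, generations=500, seed=0):
    """Toy recursive-training loop: each generation fits a Gaussian
    to samples drawn from the previous generation's fitted Gaussian."""
    rng = random.Random(seed)
    history = [(mu, sigma)]
    for _ in range(generations):
        samples = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        mu = statistics.fmean(samples)     # re-estimate the mean from synthetic data
        sigma = statistics.stdev(samples)  # re-estimate the spread from synthetic data
        history.append((mu, sigma))
    return history

hist = generational_refit()
print(f"initial sigma: {hist[0][1]:.3f}, final sigma: {hist[-1][1]:.3f}")
```

With only a finite sample per generation, the estimated spread tends toward zero over many iterations: the distribution's tails (the rare, diverse data) vanish first, which is the kind of degradation "recursive garbage in, garbage out" points at.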



