Because an LLM saying "I got confused, dropped the database, and then got scared and hid this from you" hides the "why" behind what LLMs do. I would also prefer it if they were less sycophantic and argued with what I'm trying to do rather than treating the user as a god (e.g. "the algorithm you're trying to use is less performant than an alternative").