
It’s a bit different when comparing the two

You taking a drug won't kill others, but your autonomous vehicle could.



Agree it's not exactly the same, but it's more similar than it seems on the surface. By and large, you don't get to make the decision for yourself with respect to drugs. Doctors make recommendations. So doctors can kill others (lots and lots of them, per doctor) by recommending or prescribing drugs that they can't fully explain.

It is what it is. At some point, with enough variables, we lose the ability to explain something mechanistically in a way a human can fit in their head.

The nice thing about LLMs compared to drugs: an LLM is actually fully explainable. The explanation is just very long and boring. "And so we calculate this number and then that number, then adjust these numbers, and then calculate that thing, and blah blah blah" for 6000 years.
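To make the "calculate this number, then that number" point concrete, here is a toy sketch of one scaled dot-product attention step, a core operation inside an LLM, written out as plain arithmetic. All inputs and dimensions are made up for illustration; a real model runs operations like this billions of times with learned weights.

```python
import math

def attention(queries, keys, values):
    """One attention step as bare arithmetic: dot products,
    a softmax, and a weighted sum. Nothing mysterious, just long."""
    d = len(queries[0])
    outputs = []
    for q in queries:
        # "calculate this number": scaled dot products with each key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        # "then adjust these numbers": softmax normalization
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # "and then calculate that thing": weighted sum of values
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Tiny made-up example: one query attending over two positions.
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
print(attention(q, k, v))
```

Every step above is an ordinary multiply, add, exponentiate, or divide; the full "explanation" of a model's output is just this trace repeated at scale, which is why it's accurate but useless to a human reader.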


I recall a 60 Minutes piece where people had taken too many sleeping pills the night before and were still so groggy the next morning that they got into accidents.





