
I don't understand your anecdote. I'm able to ask it medical questions and get answers, for example:

https://chat.openai.com/share/75f94000-552f-42d6-aadf-198fd9...

https://chat.openai.com/share/0933abf7-1015-41b5-9a49-ca2b6e...

Whether someone should trust the answers is a different question.



He asked a dosage question similar to your second example, and without tricking it, it wouldn't give a response. The drugs weren't as common as the ones you listed (at least I wasn't familiar with them), so maybe that had something to do with it.


Honestly, I'm happy they don't use it. Doctors should never ask ChatGPT about dosage - there are easily accessible official sources that will tell them the right answer. It doesn't even matter if it happens to be right most of the time; this is just an accident waiting to happen.



