
This claim blows things out of proportion; LLMs do push back:

> Input Risk. An LLM does not challenge a prompt which is leading or whose assumptions are flawed or context is incomplete. Example: An engineer prompts, "Provide a thread-safe list implementation in C#" and receives 200 lines of flawless, correct code. It's still the wrong answer, because the question should have been, "How can I make this code thread-safe?" and whose answer is "Use System.Collections.Concurrent" and 1 line of code. The LLM is not able to recognize an instance of the XY problem because it was not asked to.

When I prompt Gemini 2.5 Pro with "Provide a thread-safe list implementation in C#" it _does_ push back and suggest using the standard library instead (in addition to providing the code of course). First paragraph of the LLM response:

> You can achieve a thread-safe list in C# by using the lock statement to synchronize access to a standard List<T>. Alternatively, you can use concurrent collection classes provided by .NET, such as ConcurrentBag<T> or ConcurrentQueue<T>, depending on the specific access patterns you need.

https://g.co/gemini/share/7ac7b9238b28
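For reference, both approaches the response mentions are short. A minimal sketch (element type and counts are arbitrary; the right concurrent collection depends on your access patterns):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

// Option 1: use a ready-made collection from System.Collections.Concurrent.
// ConcurrentBag<T> suits unordered add/take; ConcurrentQueue<T> preserves FIFO order.
var bag = new ConcurrentBag<int>();
Parallel.For(0, 1000, i => bag.Add(i)); // safe from many threads, no explicit locking
Console.WriteLine(bag.Count);  // 1000

// Option 2: synchronize access to a plain List<T> with a lock.
var list = new List<int>();
var gate = new object();
Parallel.For(0, 1000, i => { lock (gate) list.Add(i); });
Console.WriteLine(list.Count); // 1000
```

Either way it is a few lines against the standard library, which is the point of the XY-problem example: the better answer was never 200 lines of custom code.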


