
'+' or '-' doesn't adequately describe operations even in the numeric domain (e.g. overflow conditions). They are recognizable only because almost every language has similar semantics for integer or floating point arithmetic. But even using '+' for something like string concatenation unmoors it from historical context.
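(To illustrate — a minimal C++ sketch, nothing authoritative; the wraparound below assumes a 32-bit unsigned int. The same '+' symbol already names several distinct operations:)

    #include <cstdint>
    #include <iostream>
    #include <string>

    int main() {
        std::uint32_t u = 4294967295u;   // UINT32_MAX
        std::cout << u + 1u << "\n";     // unsigned '+' wraps around to 0

        // int i = 2147483647; i + 1;    // signed '+' overflow is undefined behavior

        std::string s = "foo";
        std::cout << s + "bar" << "\n";  // '+' as concatenation: not even commutative
    }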

And beyond that, you're just making things up. '<<' for input/output?

We spent thousands of years developing the written word just so we wouldn't have to rely on primitive pictographs to vaguely get our points across.



That "only because" is my entire point. It's like, aside from the whole reason symbols are useful, they're not useful.

If you've been paying attention, I hold up << for IO as an excellent example of operator overloading gone wrong. But it's not because << is undescriptive; it's because << already means "bit shift".


Your "huge amounts of history" justification only applies in very limited contexts, all of which can be hard-wired without allowing general operator overloading.

"<<" does not mean "bit shit." Several languages use it to mean "bit shift," but many do not, and even in those languages the usage is ambiguous. C uses two operators that, visually, could refer to six different operations: arithmetic shift left/right, logical shift left/right, rotate left/right (on x86: shr, shl, sar, sal = shl ror, rol). As a result, the C standard leaves right-shifts of signed negative numbers implementation defined.


How can those limited contexts be hardwired without allowing general overloading? What if I want to write a new numeric type?

Seems like you're just talking about operators, not overloading. If << doesn't have a distinct meaning, then it shouldn't be used at all. If it's OK to use <<, it should be OK to use it for both built-in and custom types.
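(The new-numeric-type case, as a minimal C++ sketch; Rational is a made-up type for illustration, not any particular library:)

    // Overloading '+' lets a user-defined numeric type use the same
    // arithmetic syntax the built-in ones get.
    struct Rational {
        long num, den;
    };

    Rational operator+(Rational a, Rational b) {
        return {a.num * b.den + b.num * a.den, a.den * b.den};
    }

    int main() {
        Rational r = Rational{1, 2} + Rational{1, 3};  // 5/6, written as arithmetic
        (void)r;  // without overloading: Rational r = add(half, third);
    }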


I'd argue that << was a bit of a misfeature anyway. Providing operators for mathematical symbols with well-known meanings is sensible. == is an understandable hack, but a hack nonetheless. || and && probably could have just been the words AND and OR, as in SQL.

The << operator was using symbols for the sake of symbols. It would have been much more semantically clear as Shift(value, nBits), which would also have avoided the temptation to overload an otherwise meaningless operator.
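(A minimal C++ sketch of that idea; shift_left here is a hypothetical name, not an existing API:)

    #include <cstdint>

    // A named function documents itself, and it leaves no operator
    // lying around to be repurposed for I/O.
    constexpr std::uint32_t shift_left(std::uint32_t value, unsigned nBits) {
        return value << nBits;
    }

    static_assert(shift_left(1, 4) == 16, "1 shifted left by 4 bits is 16");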


You have a good point. I don't know if I completely agree, but I can see the merit in it. However, you're now talking about the merits of operators in general, rather than overloading, which is a different thing altogether.


The issue is that operators rarely have obvious semantics when applied to non-builtin types. '+' makes sense for integers and floating point numbers, complex numbers and ratios, and almost nothing else. Those types can be built into the language, and '+' defined on them. It doesn't make any sense for strings and all the mishmash of things people will overload it for.


I've never seen anybody confused by "+" for string concatenation. It makes far more sense than, say, integer division, where 5/3 = 1.

Just because C++ ruined operator overloading with its moronic << doesn't mean that every other language should live without perfectly reasonable operator behavior. I've been dealing with Java instead of C#, and not being able to use simple, obvious equality checks with the "==" operator is agonizing.
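(For contrast, a minimal C++ sketch of the == that Java withholds; Point is a made-up type for illustration:)

    struct Point {
        int x, y;
        // '==' overloaded to mean value equality, the way most readers expect.
        bool operator==(const Point& o) const { return x == o.x && y == o.y; }
    };

    int main() {
        Point a{1, 2}, b{1, 2};
        bool same = (a == b);  // true: compares contents, not identity
        // In Java, '==' on objects compares references, so the equivalent
        // check has to be written a.equals(b).
        (void)same;
    }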


Languages should build in all the numeric types you could ever need, but they don't. A lot of languages still don't have arbitrary-size integers (Swift among them).

You look at the vast majority of cases, which don't need operator overloading, and conclude it's useless; I look at the small number of cases where it's really, really good and conclude it's important.
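(One of those cases, as a sketch — assuming Boost is available; cpp_int is Boost.Multiprecision's arbitrary-size integer. Operator overloading is what lets a library-provided big integer read like a built-in one:)

    #include <boost/multiprecision/cpp_int.hpp>
    #include <iostream>

    int main() {
        boost::multiprecision::cpp_int n = 1;
        for (int i = 2; i <= 100; ++i)
            n *= i;              // overloaded '*=' makes 100! read like math
        std::cout << n << "\n";  // 158 digits, no overflow
    }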



