
All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Go community doesn't seem to have learned much from the past.



> All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

> Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Yes and yes.

> Go community doesn't seem to have learned much from the past.

Well... if you start with something simple, and complexity comes, you can still try to keep it as simple as possible. But if you started with something that was already complex (but complex in ways that your theory said it needed to be, not in the ways that the real world said it needed to be), and you try to fix that, you wind up with something really complicated. Ditto if you start off with complexity to handle all the use cases of the past.

Go started off simple, and is letting real-world use push them into becoming more complicated. That's a defensible approach, even today.


> That's a defensible approach, even today

Not if you are dishonest about it. Not if you refuse to learn from the past couple decades.


In what way is Go dishonest about it? In what way have they refused to learn from the past couple decades?


I think insisting that they need more use cases for generics is dishonest when you consider they used generics to implement library data types themselves.

I don't mean they are outright lying, or are bad people or anything like that.


I suspect the Go developers just have a very different idea of what the past is. Given the state of software engineering, I'm not quite as likely to put a positive spin on the things we've built since then.


This is true for languages that try to be all things to all people (a la Java). All languages are DSLs, and if you target just a few specific domains and beat back the masses who want the language to expand beyond its intended purpose, then simplicity remains possible.


DSLs shouldn't be Turing-complete, and Turing-complete languages shouldn't try to be DSLs.

Ant was a DSL that managed to become Turing-complete, and the results were pretty horrible.


Turing completeness is a symptom, not a cause. No one would argue that SQL is bad because some implementations are Turing-complete [0].

[0]: https://stackoverflow.com/a/7580013


>No one would argue that SQL is bad because some implementations are Turing-complete

They do actually. Though when people do say that it tends to be phrased "keeping business logic in stored procedures is a bad idea".

People argue that all the damn time.

Accidental Turing completeness usually signals a design flaw somewhere (would you also consider it too controversial to argue that the C++ templates mentioned in your link are nasty and people complain about them a lot?).


>All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

And yet, to this day, C is just as popular as C++, if not more so. Why is that? I can do so much more in C++, but I, and my colleagues, pick plain old C every time.


Thanks Linux.

C was already on the way out when Linus created Linux.

Apple was migrating from Object Pascal to C++.

IBM had CSet++ for OS/2.

Borland, Microsoft, Zortech, and Symantec were selling C++ frameworks.

UNIX vendors were playing with Taligent and CORBA.

BeOS and Symbian were developed in C++.

Then came Linus, made Linux with GNU on top.

The GNU project long maintained that the go-to language for GNU projects should be C.

All major C compilers are written in C++ nowadays; there is hardly any reason to stay with C outside the UNIX world.


> C was already on the way out when Linus created Linux.

That seems a little... fanciful. There was a lot of C++ and it was a great way to show how modern and forward-looking you were (and to sell compilers, tools, frameworks) but standardization hadn't got far, interoperability was poor, problems great and small abounded. A number of the things you mention above were spectacularly unsuccessful.


Yes, C++ was a pain to write portably before 2000, but so was C: in spite of having been standardized in 1990, most compilers were a mix of K&R C and ANSI C.

Nevertheless, all major desktop OSes were going C++ for their application frameworks, before the widespread adoption of GNU software.

> A number of the things you mention above were spectacularly unsuccessful.

Mostly due to politics between corporations and very little to do with C++ itself.


> but standardization hadn't got far

That's true, but not really fair: even the ANSI C standard was only two years old at that time, despite C being way older than C++. Standardization takes a lot of time …


You're in the minority. For new projects, C is much less popular than C++.


It might not be that easy to tell. The C++ I write (for myself) is essentially plain C. No templates, dynamic dispatch, constructors, or exceptions. Most of the standard libs I use begin with the letter 'c'.

It uses some C++ features, but it's philosophically much closer to C code.


If it only compiles with a C++ compiler, it is C++, regardless of the amount of language features being used.


That's fine, but I don't think that's what people mean when they say C++. I certainly wouldn't call myself a C++ coder and, if I applied for a C++ job, I'm pretty sure that, after I had explained that I don't do exceptions, virtuals, or the STL, I'd be politely shown the door.


> Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Have you ever used C++ templates? I mean, every popular language has issues related to design complexity.


Some complexity is avoidable; some isn't.

Besides, C++ templates are the result of creating a conceptually simple, one-size-fits-all solution for generics, metaprogramming, library tuning, and dozens of other problems that other languages have specialized tools to solve. It turns out that the complex set of specialized features works better.


Is it a complex set of features, or rather a set of focused tools?


It is a large set of simple tools. It is conceptually complex because each tool is different and you must learn them all.


I did my first C++ steps with Turbo C++ 1.0 for MS-DOS.

My first use of C++ templates was in Turbo C++ 3.5 for Windows 3.1.


Go's community doesn't learn from the past, but good languages fight complexity even as they add expressive power. (They at least try to get the most expressive power per complexity cost.)

Pragmas, incidentally, aren't really a source of bad complexity. Per the abstract definition of the language, they really do have no effect at all and are just comments. Yay!

Implementations have properties too; they aren't just rude practicalities. Compilers, in particular, connect one language (the input) to another (the output). Pragmas mediate how those additional properties apply to the language at hand. It's an interesting mental challenge to formalize them in the absence of a compiler with a concrete, stable ABI.

So in conclusion, the Go people once again don't understand good design. Pragmas are not an ugly wart but a great example of layering: a rare example of an abstraction that doesn't leak!


I wouldn't go so far as to say that having undocumented magic comments doesn't add complexity. At a very surface level, sure, the parser is the same, but now every tool that works with the Go language needs to be aware of these. Linters, for example, need to not complain about the missing space in front of the comment, but only if the comment starts with "go:".

Ultimately anything that changes how the program is executed is going to add complexity, so they might as well "make it official" and add a keyword for it.


In Perl and Javascript pragmas are language level and are used to help people avoid some mistakes at certain stages of software development. This is fine, no leaky abstractions. In Go they are lower level and therefore are side effects of leaky abstractions in compiler and language design. So they should be fixed, not kept or turned into pragmas in the spec. The choices I can think of: either make Go lower level itself or move low level stuff into another intermediate lower level language.


There's another choice, which is to keep doing things the way they are being done, simply not add another 20 pragmas, and get on with life, because there isn't actually a problem here. None of the problems with pragmas I've seen in other languages are present in Go. The pragmas are simple and mostly used only by the implementation and/or compiler itself; there are no interactions, no massive code complexity from ifdefs, no string-concatenation-based macro disasters, nor any of the other real problems caused by pragmas, with the possible faint exception of pragmas not being cleanly delineated from the comment syntax, which is still not causing any huge problems I can see, nor is that likely to change in the future.

The problems that C has with pragmas, and that C++ imported from C, cannot be naively imputed to other languages without demonstrating there's actually a problem here. This wouldn't even make my top 10 issues with Go; I'm not sure it's even an issue at all.


> In Perl [...] pragmas are language level

No, actually. The syntax that people use to invoke the pragma ("use strict [arg]...") is not a pragma at the language level, it's just the syntax for importing symbols from modules. For example,

  use strict ('vars', 'refs');
expands to

  BEGIN { require 'strict'; strict->import('vars', 'refs'); }
because that's how the "use" statement is defined. `BEGIN{...}` causes the statements in the block to be executed as soon as the BEGIN block has been fully parsed [1]. `require 'strict'` loads the module `strict.pm` from the library path (the source code is on CPAN at [2], if you're interested), then its `import()` method is called with two string arguments. The implementation of that method is:

  sub import {
    shift;
    $^H |= @_ ? &bits : all_bits | all_explicit_bits;
  }
There's a lot of weird Perl syntax in there, but the gist is that it modifies the $^H variable. And THAT is the actual pragma which is defined by the language. [3] The module strict.pm is just a wrapper around $^H to make things a bit more user-friendly.

I know that's sorta kinda off-topic, but since we're talking about language design, I figured I'd contribute this small anecdote that illustrates really well how the more recent parts of Perl are designed: a ton of metaprogramming on top of relatively small changes to the core language. If you want another example, have a look at how object-oriented programming was tacked on to Perl as a tiny afterthought, yet the way it interacts with all the other parts of the language makes hugely powerful OOP frameworks like Moose possible. (OTOH, that approach also makes the language pretty messy, but it always gets the job done for me, at many scales.)

[1] Usually, execution only begins when the entire file has been parsed, but this code needs to run earlier because it changes the parser's behavior.

[2] https://metacpan.org/source/SHAY/perl-5.26.1/lib/strict.pm

[3] Notably, $^H behaves differently from other variables: Every assignment to it is scoped only to the current block, whereas regular variables need to be shadowed explicitly. This is particularly useful to temporarily lift a strictness requirement for a single statement, similar to how `unsafe` is used in Rust:

  use strict;
  ...
  my $function_name = 'implementation_' . ($x + 2 * $y);
  $function_name(); # error: cannot call string value
  {
    no strict 'refs'; # "no" is like "use", but in reverse (calls the module's unimport() instead of import())
    $function_name(); # works: calls the function with the name stored in the variable
  }


Sure, this is all correct from the implementation perspective, as levels are not strictly defined and are open to interpretation. However, we were talking about the users' perspective, which is kind of the whole point of leaky abstractions. In this context, language-level features are those that users can understand without any assumptions about other levels, like how compilers represent things internally.


The word "pragma" in Perl has always been used to refer to that specific syntax sugar; just read the start of "perldoc perlpragma".

Furthermore, the $^H (there's also %^H) facility you mention is just one way pragmas are implemented. E.g. "use overload" is a core pragma that doesn't use that method at all; instead it defines special functions in the importing package which the compiler is aware of.

Then there are other "pragmas" that are really just utility wrapper functions, e.g. "use autouse". The "Pragmatic modules" section of "perldoc perlmodlib" has the full list.


Guy Steele, "Growing a Language", https://www.youtube.com/watch?v=_ahvzDzKdB0


Lisp?



