
> ... it insulates the student from the javac command line program, and the command line environment itself

Such a misguided article. The Java language has (quite literally) nothing to do with `javac`. In fact, there's a lot of other compilers out there. Taking a Java class should not focus on the intricacies and weirdness of Java compiler command line interfaces. It seems pretty amateurish to argue in favor of it. Case in point: back in high school, I learned C++ using a Borland compiler (and I haven't touched `bcc` in the 15 years since then, even though I've written plenty of C++ in the meantime).

> ... imports and file naming requirements.

These, again, are (compiler) implementation details and should not be part of a generalized Java curriculum.

> And of course, mathematics is the foundation of computer science.

Stuff like this can be misleading. Even though it's not completely wrong (discrete math, logic, metalogic, and mathematical logic are sorta kinda similar), it's just wrong enough[1] to lead people astray.

[1] https://www.scott-a-s.com/cs-is-not-math/



I strongly disagree.

A lot of my peers in college were very bright and could write great code, but they were absolutely useless as developers because they didn't know any tooling. They couldn't compile, run, test, or source control their code if the professor didn't set it up for them.

Yes, tools change, but the knowledge from one tool is almost always transferable. Once you're comfortable on the CLI and understand the concept that Java source is compiled to bytecode and then run on a VM, it's easy to switch to a new compiler.
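
To make that concrete, here is roughly what the bare flow looks like on the command line (Hello.java is just a stand-in for a one-file program):

    javac Hello.java    # the compiler turns Java source into bytecode (Hello.class)
    java Hello          # a separate program, the JVM, loads and runs that bytecode

Two commands, two distinct tools; that split is exactly what an IDE's Run button hides.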

If you only want to know the absolute bare minimum, go to a boot camp. People go to college to learn, and being comfortable with tools is an essential part of being a developer that students need to learn.


Agreed. The least effective developers I’ve had the displeasure of working with never understand how their code executes end-to-end. And the best ones have a deep knowledge of it.

This is no fluke! If you don’t know how a classloader works with the Java classpath, how to set the various flags for operating the JVM, etc., then I’d argue that you’re an amateur, and I wouldn’t trust you to write correct code in a production setting. Knowing how your code runs is critical, and if you don’t, stay away from production systems. If the goal of these courses is to teach students enough about Java to use it in a professional setting, then knowing how class files are produced, bundled, distributed, loaded, and executed by a JVM is essential.
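
To make it concrete, a rough sketch of the kind of invocation I mean (the jar, package name, and heap size are invented for illustration):

    # -Xmx caps the heap, -verbose:class prints classes as the classloader loads them,
    # and -cp tells it where to look for them
    java -Xmx512m -verbose:class -cp lib/somelib.jar:out com.example.Main

If none of those flags mean anything to you, you'll struggle the first time a production JVM misbehaves.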

More succinctly, if you don’t know how your code makes it to a production server and gets executed, you’re not ready to work on it.

(And if the goal isn't to teach students how to use Java in a professional setting, then why are they learning it at all? Use a teaching language so students can focus on the concepts instead of Java's idiosyncrasies.)


eh, I understand the sentiment - but it's a little overkill. Junior programmers can be code reviewed and taught these things over time, and also be wildly productive in a production setting.


Sure, you can teach junior programmers all kinds of stuff they should have already learned or should know how to learn; I do it all the time. I'm arguing that this is crucial information that even entry-level engineers should know. If they don't, I give them a link to the relevant documentation and politely ask them to read it. It's a prerequisite.


Are there any books or articles that go through the things that you’ve mentioned like how the class loader works etc? Most of the tutorials or books I have seen do not even touch this aspect and I want to get better at it.


Oracle documentation, typically. And the Javadoc for those classes.

But I don't think it is key to anything. It is the sort of stuff that is easy to learn when you need it, easy to forget, and not useful most of the time. It makes sense to read once you're done learning other things, but not as a beginner.


Isn't Eclipse a piece of tooling? I think the best argument against teaching Eclipse is that IntelliJ is much better, imo, and is more widespread than Eclipse at this point. Students should have a module/class about command line tools as well. I think this comparison is comparing two things that aren't comparable: an IDE vs. learning the CLI. I also think you contradict yourself in your last sentence. Teaching an IDE in class should help people be comfortable with their tools, but at some point they might need to know what the tool is doing under the hood.


> but at some point they might need to know what the tool is doing under the hood.

The problem is the IDE makes it so easy to never look under the hood that when people finally take a peek they get overwhelmed and quickly retreat to the comfort of the IDE.

However, in the real world of production environments there is no IDE holding your hand, and you can't get away without a good understanding of the engine under the hood.


Yeah they’d probably be using some build system or framework or script and still wouldn’t need to know the invocation for javac.


The situation in which you run javac or edit in vim in production is absurdly rare.

In the real world, you always use tools, and in the real Java world there is always an IDE or an application server or something.


I haven't really played around with Java since the 90's, and I bought a Java book and tried hopping back in. It was really frustrating because the book uses 8 and I downloaded 9, and Eclipse was slightly different, something with modules blah blah.

Plus downloading the actual Java SDK was confusing, the versioning, it not being free anymore...


OpenJDK packages seem easy enough.


In particular, command line compilers have been used virtually the same way since the first command line interfaces, so knowledge of javac is definitely transferable.
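
As a rough illustration (the file names are invented), the basic shape of the invocation barely changes from one toolchain to the next:

    javac -d out Main.java    # Java: compile a source file, put the output in out/
    gcc -o main main.c        # C: same idea, different compiler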

IDEs change more frequently.


I agree that many developers are not very adept with tooling, but is that because they don't use the CLI? Or is it just that they aren't interested in those parts of software engineering? Eclipse is a tool too, and not a simple one either.


The author is specifically talking about students. The emphasis at this stage plays a key role in whether they'll have that interest or not. Knowing is a prerequisite to developing interest.


I agree, but regardless of whether they're exposed to a CLI, they will be exposed to tooling. Why is GUI tooling less likely to spark that interest?


GUI tooling almost entirely uses the command line layers under the hood, so it is strictly more complicated while hiding more of the real process - the worst of both worlds.

My point about Git is an example of this. Teaching Git is hard, partly because Git's command line UX is bad, but layering the complexity of, e.g., EGit on top doesn't really remove most of that and makes it harder to teach (because expressing what to do in a GUI is strictly harder than communicating text commands, among other things).


EGit doesn't call the git command line application; it uses a Java implementation of git. Although I can agree it's debatable whether they have improved on the command line interface or not.


IDEs are convenience layers; compilers are essential abstractions. If you don't know how things work underneath the GUI layer, you will get stuck really fast and won't be able to troubleshoot when you eventually run into the inevitable problems with the IDE, the build, or the compiler.


What makes the CLI interface to the compiler more essential of an abstraction than a GUI interface to the compiler? Just that it is simpler, or that it's been around for longer? A GUI isn't necessarily just calling CLI apps in the background, it could be using programmatic interfaces too.


It could be - but in the majority of cases it isn't - and you will need to fall back to the CLI and UNIX abstraction layers when the IDE breaks. That's the way it's been done since forever, and if you want to work with existing (and probably future) tooling, and to solve problems when they happen in the layer above, you will need to learn the interface that exposes all that functionality to the tools under the hood.


> It could be - but in the majority of cases it isn't

I don't think that's really true, at least for the Java ecosystem, which is what's at issue here. Eclipse does not call out to command line tools for most things, as far as I've seen.


> "What makes the CLI interface to the compiler more essential of an abstraction than a GUI interface to the compiler?"

Build and CI systems don't go through a GUI IDE, so developers have to understand the CLI anyway.


As an example, Maven support in Eclipse and IntelliJ doesn't call the Maven command line application; it operates entirely within the IDE.


> What makes the CLI interface to the compiler more essential of an abstraction than a GUI interface to the compiler? Just that it is simpler

It's not simpler! It's way more confusing to figure out that your makefile told your compiler to run a command with some unknown set of obscure options that are causing an error than to just have the IDE pop up a dialog and tell you in plain words, or better, not even let you set an option incorrectly in the first place.


The CLI is more explicit than a GUI. In a GUI, compiling / running / testing your code is an afterthought that happens at the click of a button.

In the CLI, you have to be explicit about what actions are taking place. You have to physically type in what command you want to run and the different arguments to it.

The convenience of a GUI is fantastic, but the abstraction makes it a very bad learning tool. Being able to open up a terminal and fix the inevitable git / build / configuration errors is a very valuable skill that developers should have.


I don't see the distinction. I started my career using command line tools and am completely comfortable with them. Yet for more recent projects I stay within the IDE (usually Visual Studio).

It doesn't matter whether I tick a checkbox for compiler warnings or add that option to the command line. The only difference is the IDE makes options easier to discover.

It's the same for Git. A decent GUI gives you a much better visual picture of the state of your working copy.


Maybe Gradle then? Or actual tooling?

I’d agree that some focus on that in CS programs would be extremely valuable. Maybe things are better now, but when I went, in 2008-2012, git wasn’t even mentioned and basically all tooling was learned on your own.


I haven’t taken an introductory CS course since the mid 90s, so I don’t have enough knowledge to know whether I agree or disagree with you. What was the base level of knowledge like when you were in introductory courses? Back in the day, you couldn’t guarantee that everyone could even turn a computer on, so for loops were a second midterm thing.

Edit - I should mention that my last sentence was the perspective of 18 year old me. I was 18, but in my defence I was 18. :)


An IDE counts as a tool too. And knowing it is no different from knowing other tools.


>A lot of my peers in college were very bright and could write great code, but they were absolutely useless as developers

Being a useless developer doesn't make it not computer science. Not all of computer science is software development.


The point of the article is that the very concept that the compiler and the Java runtime environment are not the same thing (which an IDE obfuscates to some extent) is what he wants students to learn early on, along with details like that. This leads into a lot of important concepts around how programming languages are seen by the computer.
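
A small sketch of that separation, using the usual one-file Hello example: the compiler and the runtime are literally two different programs, with the class file sitting between them.

    javac Hello.java    # the compiler produces Hello.class
    javap -c Hello      # javap shows the bytecode inside that class file
    java Hello          # the JVM, a separate program, executes it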


> ... how programming languages are seen by the computer

Learning a programming language has nothing to do with how it's seen by a computer. That's a different class (a class on compilers, maybe). You can even turn Java into JavaScript if you want to[1]. The fact that the compilation/transpilation flow here is Java -> Java Bytecode -> JS is meaningless in the context of learning Java.

[1] https://github.com/google/j2cl


>Learning a programming language has nothing to do with how it's seen by a computer.

Learning the syntax of a programming language might not have much to do with how it's seen by a computer, but it's important if you want to learn how to use a language properly. If you don't know how Java (or any language) works fundamentally, the code you write may work, but it'll likely be more inefficient in terms of both program and programmer speed.

Debugging sessions will be more frustrating and take longer, and generally the quality of the programs you write will be lower compared to programs written with the language's quirks and peculiarities taken into account, which you learn by understanding how the language works.

I'm not saying they should be able to write a Java compiler or interpreter, but they should have a decent idea how they work at least fundamentally.

It's harder to learn this when it all happens at the push of a button. You never learn to appreciate that button because you've never had to do it yourself, and if something goes wrong with it for some reason, you won't understand how to fix it or work around it.


But learning Java is only useful in the context of programming computers, for which you need to know practical things like files and compilers, which I think is the point of the article.


You have completely missed the author's point. The issue is not "how can we help students learn to code"; the issue is "how can we help students understand the systems that they are coding on, since that affects how they code."

Knowing Java runs in a VM very much affects how you have to code. Sure, when you are learning in a 100/200 level class it doesn't, but if you don't understand the implications by the time you graduate, that is bad. Why not make a very simple change early on that gives students context for later? We shouldn't be teaching students that Java runs in a virtual machine in a 200/300 level class; by that time we should have moved on to why that matters.

Forcing students to continually use java/javac will make them start learning the concepts of how a computer actually processes their program. It makes them think a bit more about it, and the concepts are not foreign when they meet them again in a full class. The IDE just turns the whole process into a single button with little context.


It is a far better way of learning the concept, through practice, than having students memorize it for a multiple choice section on an exam and then forget it.


> what he wants students to learn early on

The author, Leonora Tindall, is a woman.


Was the gender of the person specifically mentioned on the website? I couldn't find it.


whooooppsss


As a former student who had to use Eclipse, I think the IDE dramatically slowed my understanding of what was actually happening in my projects. A command line and a few commands, however, really make things simple and clear.

The only things you need an IDE for are autocomplete and refactoring, and in a small project like you have in school, it's simpler to use a text editor imho, and you can fully understand your project that way too.


> Such a misguided article. The Java language has (quite literally) nothing to do with `javac`. In fact, there's a lot of other compilers out there. Taking a Java class should not focus on the intricacies and weirdness of Java compiler command line interfaces.

Eh, javac is the reference implementation, the language specification makes several references to it, and it's also by far the most widely used. It's also good to understand what goes on behind the scenes in your IDE, because the IDE is even more removed from the language than the compiler. C++ is somewhat different because it does not have a reference implementation and has a somewhat more diverse set of implementations. Though I think it's still very useful to know your way around the gcc or clang command line.

> These, again, are (compiler) implementation details and should not be part of a generalized Java curriculum.

Imports and naming are not implementation details... they are defined in the specification.


> > ... imports and file naming requirements.

> These, again, are (compiler) implementation details and should not be part of a generalized Java curriculum.

In Java, these are language details. You cannot write Java without following them.


I think there's great value in learning to use a lightweight editor, a compiler or interpreter, the command line, etc.

I also think that Java is perhaps the worst possible choice for that.

It's a language created with the "write once, run anywhere" mentality that tries to abstract away the system; it's the poster child of languages married to an IDE, and verbose to the point that forcing people to write java code without autocompletion features probably violates the Geneva convention.

At this point in time, if you want your students to learn about tooling, it would be more useful to teach them to use Chrome's JavaScript console than to teach them to use a Java compiler through the command line.


>verbose to the point that forcing people to write java code without autocompletion features probably violates the Geneva convention.

I'm happy to use just emacs. The naming conventions make it easy enough to remember. I've never been a fan of heavy IDEs. Maybe that is just me as a solo developer/entrepreneur, but I don't think that I am the only one.

Once you decouple yourself from the IDE and all of the bloated enterprisey (ahem, Spring) frameworks, it is actually pleasant to work with. Most of that stuff is superfluous when you have a command of the environment.


It's a regular issue: the more layers between you and the actual tool (the JVM), the less you need to master, but at the same time the first thing you'll learn is to forget how to use javac or any compiler and just import project configs or let the tools infer stuff for you.

In my personal case, learning about classpaths the hard way was my only path to salvation. Before that I was blindly tweaking IDE project configs without any idea of what was happening.


I think there's a bit of personal preference here. As a student I hated IDEs, because I couldn't tell what was actually happening. To this day, I avoid using an IDE or other tool for git/version control, because I can't tell what it is doing.

There's more typing with a text editor and javac, but it's much simpler to understand IMHO.

All of this must be understood in the context of first year programming. For my class, there were no libraries one needed, and at most 10 or so Java files to compile. The classpath never needed to be more than the working directory.
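
For a project that size, the whole build fits in two commands (Main standing in for whichever class holds the main method):

    javac *.java    # compile the handful of files right where they are
    java Main       # with no -cp given (and no CLASSPATH set), the classpath defaults to the current directory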


I thought it was obvious that the author means javac as a stand-in for the general idea of compiling and running code from the command line rather than having an IDE do everything for you. Her arguments apply to any Java compiler.


Once students finally graduate they might be unlucky enough to land a job as a Java developer. There they will live and breathe continuous integration, where it's essential to write build scripts that can be exported to other environments, and the only way to do that is to use command line recipes.
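
As a rough, hypothetical example (the paths and names are invented), this is the kind of recipe a CI server runs with no IDE anywhere in sight:

    #!/bin/sh
    # compile everything under src/ into out/, then package it as a jar
    mkdir -p out
    javac -d out $(find src -name '*.java')
    jar cf app.jar -C out .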


The thing is, early in their study, students are learning that compilers exist at the same time that they are learning a language. They should become familiar with a compiler, and in the case of Java with the fact that it generates class files that get passed to a JVM. That informs them about how real-world tooling is architected, and even helps them imagine that they could change some of the pieces or swap them out.

Using the tools directly serves to reinforce all these concepts.

They can get to know other tools for the same language later, as in your Borland example, sure. But if you didn't know the compiler existed, you would be in trouble. And in C you will also want to understand the preprocessor and the linker.
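
For C, a quick sketch of those stages made visible (file names invented):

    gcc -E main.c -o main.i    # run only the preprocessor
    gcc -c main.i -o main.o    # compile to an object file
    gcc main.o -o main         # invoke the linker to produce an executable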



