Scala developers end up on the defensive because many of the claims stem either from misunderstandings or from fashion, repeated ad nauseam by people with only superficial experience of the language.
For example, I'm going to make the claim that SBT isn't just a bunch of symbols, but rather one of the best build management tools in existence. To anybody who wants to challenge that, I'll point to the complete clusterfuck that is build tooling on other platforms and languages: C/C++, Python, Javascript, C#/.NET, Haskell. SBT not only demonstrably does a better job than most, it works well out of the box and is relatively pain free. It has its own set of problems of course, but what tool doesn't?
Compatibility issues are real. But you know, the nice thing about those compatibility issues is that they forced the community to come up with solutions. At least Scala and SBT are addressing compatibility issues, whereas the transition from Java 6 to later versions is proving to be next to impossible, and I don't think I have to mention Python 3 or ECMAScript 6.
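To give a concrete idea of what "addressing compatibility" looks like in practice: cross-building a library against several Scala versions is a one-liner in an sbt build definition. A minimal sketch (the version numbers here are illustrative, not a recommendation):

```scala
// build.sbt -- minimal cross-building sketch; versions are illustrative
scalaVersion := "2.11.8"
crossScalaVersions := Seq("2.10.6", "2.11.8")
// Running `sbt +compile` or `sbt +publish` then builds and publishes
// the project once per listed Scala version.
```

This is how most of the ecosystem ships binaries for multiple incompatible Scala versions from a single codebase.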
Compilation times haven't been much of an issue for us because our codebase is split into services and libraries of reasonable size. And on the switch from Scala 2.10 to 2.11 I definitely saw an improvement, with the claimed 30% being about right.
I do agree that they can do better; similar languages, like Haskell, have a slightly better story here. However, what people usually miss is that this is a trade-off with compile-time safety.
The more static a language is, the harder it is to compile. The compilation cost for a dynamic language like Javascript is zero, the cost being completely amortized with everything happening directly at runtime. When people who prefer Java or C# move over from Javascript, they go from zero to a very noticeable and resource-intensive AOT compilation step, and many times this trade-off is worth it (depending on whom you ask) for safety, performance, or tooling reasons. Well, Scala is a more static language than Java: it infers a lot of things, it has better generics, and its implicits can model type-classes and even the infamous CanBuildFrom pattern, giving you a nice balance between productivity and safety in a static language. And that doesn't come for free.
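To make the implicits point concrete, here is a minimal sketch of the type-class pattern via implicits (Scala 2 syntax; the `Show` trait and its instances are illustrative, not from any library):

```scala
object TypeClassDemo {
  // A type class: evidence that values of A can be rendered as a String.
  trait Show[A] {
    def show(a: A): String
  }

  object Show {
    // Instance for Int, found automatically via Show's companion object.
    implicit val intShow: Show[Int] =
      new Show[Int] { def show(a: Int): String = a.toString }

    // Derived instance: a List[A] is showable whenever A is.
    implicit def listShow[A](implicit s: Show[A]): Show[List[A]] =
      new Show[List[A]] {
        def show(as: List[A]): String =
          as.map(s.show).mkString("[", ", ", "]")
      }
  }

  // The compiler resolves the right instance at compile time; calling
  // describe with a type that has no Show instance is a compile error,
  // not a runtime one.
  def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)

  def main(args: Array[String]): Unit =
    println(describe(List(1, 2, 3))) // prints [1, 2, 3]
}
```

This instance resolution work (and the implicit search behind CanBuildFrom) is exactly the kind of extra reasoning the compiler does for you, and part of where the compile time goes.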
> The compilation cost for a dynamic language like Javascript is zero, the cost being completely amortized with everything happening directly at runtime.
This is one of the big points I always try to make to people. JS compilation is cheap and maybe it's faster to spit out some functional code. But all you've really done is push the rest of your work to tomorrow. Your type verification happens at runtime, and now instead of your bugs being compiler errors, your bugs are production errors that you've exposed your customers to.
With Scala I rarely need to use the debugger. With JS and PHP I have to constantly use the debugger to track down "what type does this variable actually have in this random edge case at runtime?"
I used to believe this, but what I discovered is that edit-time/compile-time/run-time phases are not as important as failing quickly in absolute time.
If I spot my error at compile time in Scala and only at run time in Javascript, that is still a win for Javascript if I found the error one second after writing it, because that's how long the page takes to load, as opposed to five seconds in Scala, because that's how long sbt takes to compile.
Absolute time from writing the failure to discovering it is what matters, not the phase the error is discovered in, and despite this article, I still find that on complex project structures sbt/scala is horribly slow.
This is not an assumption, it's experience. Code shipped to users. In production. In both Scala and Javascript bugs got through, and both required specific practices to try to minimise them; with those practices, I found many bugs in the javascript code extremely quickly. None of the bugs in the scala code were found quickly because the iteration cycle is poison (bear in mind, our code base was probably a worst case and may not be representative of scala in general).
I'm now firmly of the opinion that static vs dynamic typing is a sideshow. The real value is shifting as many errors as possible as early as possible (in seconds from typing the error to finding it) by whatever means you can. A powerful type system is one tool for doing this, but if it compromises the iteration cycle and actually shifts the discovery of errors later in time, it has undermined one of the major reasons for using it.
Yes, but is that because of Java compatibility (which has been generally exemplary) or maybe because of other issues (I'm thinking of Oracle vs Google)?
Second, at least for lambdas they could generate inner classes for older devices and update the DEX format, now that Dex is compiled AOT to native code. So only the dex2aot compiler would need to be updated.
Currently there is zero public feedback on the new compiler chain (Jack & Jill), but from the sources it appears it is Java 6.5 all the way.
They have some unit tests that appear to be related to Java 8, but no one knows what their plan is.
In a few years (by the Java 10 timeframe), according to the current roadmap (which might still change), Java will have modules, a JNI replacement, reified generics, value types, a new array type, GPGPU support, an AOT compiler....
If nothing changes, and assuming Android 4.4 devices will be gone by then, the only change will be from Java 6.5 to Java 7, if the Android team's attitude stays the way it currently is.
But at least they spend time doing those stupid devbytes videos, doing continuous buggy releases of the support library and Google services.
Android Java is definitely the second coming of J++/J#.
Not if startup time matters or if you need to deploy into systems where dynamic linking is not possible.
The majority of the commercial third party JVM vendors do offer AOT as part of their toolchain.
Similarly .NET always had a JIT/AOT model. The novelty with .NET Native is static compilation and integration of the Visual C++ backend.
I don't know if this is true, but I read somewhere that AOT was taboo at Sun.
Oracle is now improving Java to cover the use cases where it still falls short vs other alternatives, so having an AOT compiler in the reference JDK, instead of forcing devs to get it from a third party, is part of their roadmap.
How it will look in the end is not 100% clear.
Lombok works perfectly with Java 8. Don't know why I am a "village idiot" for having used it for a couple of years.
Android is a special case, though. That's just Google being incapable of accepting a ruling and making a deal with Oracle. Instead they are pouting like little princesses and secretly developing their Swift copycat language.
I'm not quite sure what you are trying to say with this statement, "I don't think I have to mention Python 3 or ECMAScript 6".
ECMAScript 6 is finalized and many people are running it today in production environments either using Node on the backend or on the front-end using a build process that integrates with something like Babel.
The following quote implies that you believe JavaScript is runtime evaluated, when in fact the standards body does not specify runtime evaluation or compilation. That is a decision left up to the implementation. All major browsers today and Node.js use just in time compilation for JavaScript rather than runtime evaluation.
"The compilation cost for a dynamic language like Javascript is zero, the cost being completely amortized with everything happening directly at runtime."
[EDIT]
I have added a link to the JavaScript entry on Wikipedia where you can verify that modern browsers compile rather than interpret JavaScript.
The implementation technique is completely irrelevant. And the part of the comment that you quoted implies that the parent understands how it's implemented.
I'm not sure how implementation is "completely irrelevant"; if that were the case then the entire conversation is essentially irrelevant as we could potentially implement any compiled language as an interpreted language.
The piece I quoted indicates to me that the parent possibly does not understand this about JavaScript, since they stated that the compilation cost of JavaScript "is zero", which implies that JavaScript is not compiled. If there is a compilation step then some cost is incurred, no matter how low.
It was further suggested that the parent may not understand that JavaScript is typically compiled when they followed up with "cost being completely amortized with everything happening directly at runtime". That indicates that all of the cost is paid at runtime rather than in the compile step, which is simply false.
The parent is talking about the costs and benefits of AOT compilation versus the lack. Whether the interpreted language uses a JIT has nothing to do with that trade-off. That's why it's irrelevant.
Also, that the parent bothered specifying "AOT" in the first place is proof enough the entire basis of your belief in their understanding of things is utterly groundless. This would be clearer to you if you understood that JIT compilation cost is amortized, as the parent alludes, and that it doesn't happen in "a" compilation step.
There is also a good transition story for porting code from Python 2 -> 3. The "problem" is largely a community issue: libraries with many dependents and a lack of authors to port them, institutions with large code-bases unwilling to port, and so forth.
The tooling hasn't been the problem. There are compatibility libraries (ie: six), source transformation tools (ie: 2to3), and plenty of books, blog posts, and talks on how to manage supporting both versions of Python or doing a full port.
How is it straightforward? I tried porting a very small codebase from python 2 to 3 and ran into a bunch of issues which required manual workarounds. If it were easy to port, most library owners would have done so already.
I guess it depends on your idea of straight-forward. If you mean, "press a button and everything is done for me," then it's not very straight-forward.
If you want to port there's a well-tested and clear path. There are enough big libraries out there that have managed it. There is no real barrier as the OP suggested in their article.
My belief is that the effort is a linear function of your code base size. This means the larger the code base, the more effort required to port.
Also, porting is made more difficult by the fact that Python is a dynamic language, so a lot of errors that could be caught at compile time will only be catchable through a thorough testing suite. I have seen little evidence to suggest that porting from Python 2 to 3 is actually straightforward in the sense of not requiring significant effort proportional to the size of the code base.
Believe what you will but the numbers suggest that it's not a technical problem to port a Python code base. It does take effort and work but it's "straight forward" in the sense that what is required is well known.
Aren't we past the tipping point? I think that at this point most old projects won't ever move to Python 3, but for new projects it's at least 50-50. If not >50 for Python 3.