Hickey's approach is relevant in how the software is constructed, and Tellman's is relevant to the user experience. Both approaches are useful for a single application.
The point is that an interface should help you to both ignore and understand the underlying implementation. I'm not sure I understand your point about data and transport, since most interfaces (e.g. names) don't touch the network. Are these just synonyms for interface and implementation?
I bring it up because the author was talking about network transparent APIs, where the developer doesn’t have to care about whether the call goes over the network. But of course this is not something you can actually afford to ignore. It’s quite a different concern from the data exchange that happens between application and API, however.
I was on a project once that developed an IPC API based on remote procedure calls. The first thing they did was hide the RPCs, precisely for this reason. And the second thing they did was ditch the RPCs in favor of TCP/IP socket reads and writes, because once you hide them the RPCs don’t buy you anything but headaches. Bye bye DCE.
Hi, author here. I looked into that side of things, but CGAL only offers exact precision implementations for lines and circular arcs. The fact that Bézier curves are left as an exercise for the reader is further proof of the disconnect between computational geometry and modern computer graphics.
Fair point. At the same time, the exact computation paradigm is proof that problems like these can be solved in a rigorous way if you start with the right foundation.
By the way thanks for writing the article, it was a refreshing read.
I'm curious; I've played with OpenSCAD a little, which uses CGAL, and found it to be painfully slow (with no particular point of comparison). Do you think you have a way to do similar CSG calculations faster, or at least trade speed for accuracy or something?
It's very plausible that these techniques could be used to fix the output of a CSG library that uses floating point math, but I'm not sure what the specifics would look like. If anyone has ideas in that vein, I'd be very interested to hear them.
I was thinking (based on skimming the documentation) that CGAL uses arbitrary-precision integer-based rationals, which are slow, and that floating point with the error correction might potentially be faster.
Unfortunately, it's probably way beyond my ability to delve into it.
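To make the speed/exactness tradeoff concrete, here's a small Python sketch. It uses the standard `fractions` module as a stand-in for CGAL's exact rationals; this is just an analogy, not how CGAL itself is implemented, and the measured slowdown will vary by machine.

```python
from fractions import Fraction
import timeit

# Floating point: fast, but the classic example of inexactness.
print(0.1 + 0.2 == 0.3)  # False

# Exact rationals: always correct, but every operation works on
# arbitrary-precision integers, so each one is far more expensive.
a, b, c = Fraction(1, 10), Fraction(2, 10), Fraction(3, 10)
print(a + b == c)  # True

# Rough cost comparison (numbers are illustrative only).
t_float = timeit.timeit("0.1 + 0.2", number=1_000_000)
t_frac = timeit.timeit("a + b", globals={"a": a, "b": b}, number=1_000_000)
print(f"rational/float slowdown: roughly {t_frac / t_float:.0f}x")
```

The "error correction" idea would be to do most of the work in floats and fall back to (or repair with) exact arithmetic only where the float result is untrustworthy, which is also roughly what CGAL's filtered kernels do for predicates.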
By forcing the GC to run in between tests, basically. The README for the benchmarking tool describes it in more detail: https://github.com/hugoduncan/criterium.
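The same trick translates outside Clojure, too (criterium is Clojure-specific, but the idea is generic): force a collection before each timed run so garbage left over from one run isn't collected in the middle of the next measurement. A rough Python sketch, with made-up function names:

```python
import gc
import time

def bench(fn, runs=5):
    """Time fn several times, forcing a GC between runs so that
    garbage from one run isn't collected inside the next timing."""
    times = []
    for _ in range(runs):
        gc.collect()  # clear pending garbage before we start the clock
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return min(times)

# Usage: bench(lambda: sorted(range(100_000)))
```

criterium goes further (warmup to trigger JIT compilation, statistical outlier detection), but the forced GC between trials is the part relevant here.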
I tried to include these, but it wasn’t immediately obvious how to construct them from outside Scala without CanBuildFrom. If anyone wants to open a PR or even provide a few hints, I’d be happy to update the benchmarks.
Scala 2.13 greatly simplifies the external interface of the collections and no longer requires CanBuildFrom [1]. You may find them much easier to work with now.
The first public release of Arc came not long after Clojure's.
Clojure is definitely better thought out as a language, and while Arc has some interesting ideas around web development, it also doesn't have a module system, so everything is just loaded into the same global namespace, which is pretty insane.
Keep in mind that Arc is the start of something with long-term goals; further, it's part of a lifecycle that didn't necessarily start with Arc but with the first Lisp. PG points to an early essay titled "The Hundred-Year Language" [1] as an indication of where Arc is intended to go, and of the kind of future evolution it might inspire.
I don't know that any programming language can be everything it needs to be from the outset, but it certainly needs inherent features supporting longevity, a framework that supports change and evolution, while still having a pleasing and useful function that can be taken advantage of immediately.
Ah, that makes sense. The next main difference in my mind is ergonomic, and it relies on this regularity: functional syntax for composition, negation, and quoted application. There are other syntax additions as well, and in my experience they're well-chosen and go a long way.
It would be more correct to say that they can be coerced to a function with a special purpose (e.g. using a set as a function is an alias for checking whether an item exists).
To say they are functions would be incorrect. By the same logic, a keyword would be a function too.
They're callable as functions, i.e. they implement clojure.lang.IFn. So unless you want to split hairs over what it means to say they're not "functions", they _are_ functions. Sets, vectors, maps, keywords, and IIRC symbols are all IFn.
Oh, and (#{1 2} 3) is equivalent to (get #{1 2} 3), not (contains? #{1 2} 3).
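For anyone who doesn't read Clojure, the get/contains? distinction can be sketched in Python with a hypothetical `CallableSet` (my own toy class, not anything from Clojure's actual implementation): calling the set looks up the element and returns it (or None), rather than returning a boolean.

```python
class CallableSet(set):
    """Toy analogue of a Clojure set implementing IFn: calling the
    set behaves like (get the-set x), returning the element itself
    if present or None if absent -- not a True/False membership test."""
    def __call__(self, x):
        return x if x in self else None

s = CallableSet({1, 2})
print(s(2))    # 2     -- like (#{1 2} 2) => 2
print(s(3))    # None  -- like (#{1 2} 3) => nil, not false
print(3 in s)  # False -- like (contains? #{1 2} 3) => false
```

The practical upshot in Clojure is the same as here: the call form is handy for filtering (truthy when present), but it can't distinguish "absent" from "present but nil/None", which is what contains? is for.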
I think it depends on what you mean when you say "it's a function"; it's data as well. Which, now that I reconsider it, was kind of the point the parent was trying to make: the two are very much unified.