Firstly, there's a nuance here that readability is heavily dependent upon the language. A language where recursion is the recommended approach to these sorts of problems will have far better syntax for doing so, such as in Haskell.
Secondly, I think there's something to be said for familiarity. I think looping is second nature to most of us not because it's superior or more intuitive, but because traditionally this is one of the first things we're all taught. If functional programming were the norm and people were taught recursion early on instead I think your notion of simplicity would differ.
I've been teaching a little bit of programming to a beginner lately, and they've often found functional approaches to problems easier to grasp than their imperative equivalents because they're more mathematical. It's like an expression of something they already know versus looping, mutability, et al which are new concepts entirely.
Donald Knuth has the opposite view: he thinks recursion is more fundamental. Funnily, he uses an example much like yours.
> On the other hand, recursion is quite fundamental. It is not inherently a high-level concept, suitable only for grownups but not for children. Most people actually grew up with an implicit understanding of rudimentary recursion long before they understood iteration—that is, before they understood how to carry out a loop of instructions. Given a list of tasks to do, the simplest mode of operation is a recursive approach that charges blindly ahead: […]
> It's only after we've gained experience with such an almost mindless method that we acquire a more global view of the process:
> Do Jobs = Do each job on the list. (2)
> […]
> Do a_k for 1 ≤ k ≤ n. (5)
> All computer programmers have in fact progressed from stage (1) to these later stages rather early in our lives, because of an inner desire to “see through” the entire chain of consequences that result from primitive actions.
> Furthermore, once we reached these later stages, we entirely forgot that they were actually elaborations of the recursive procedure in (1). There was no point in narrowing our perspective to the simple-minded form. The vast majority of recursive situations that faced us as children fell into very simple patterns that could readily be “seen through” and understood as a unit. Thus we came to understand iteration as an elementary idea.
> It was only later, when faced with difficult problems whose recursive structure cannot be visualized in straightforward terms such as operations on arrays of elements, that we re-learned the importance of recursion. Therefore we now tend to regard recursion as a high-level concept, although it is really fundamental.
Possibly these views of his are influenced by his early background in programming in assembly/machine language, where it takes some sophistication to translate recursive ideas into code (maintaining a stack etc), but it's an interesting point of view worth some consideration. This is from his draft of TAOCP Chapter 8 ("Recursion"), and is preceded by the following paragraph:
> High-level languages for computing—which themselves are defined recursively, because algebraic formulas are built up from algebraic formulas—allow programmers to pretend that recursion is real, that a computer is somehow manufactured from recursive components. But programs that invoke themselves are actually implemented by means of a stack structure behind the scenes. A good programmer is able to understand such algorithms by viewing them simultaneously at a high level and at the machine level, and at many levels in between, thereby perceiving what a real computer can actually do well.
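Knuth's progression translates almost directly into code. A minimal sketch (Python here; the `do` callback is a made-up stand-in for "carry out one job"): stage (1) is the recursion that charges blindly ahead, and stage (5) is the later, global view of the same process.

```python
# Stage (1): charge blindly ahead -- do the first job on the list,
# then recursively do the rest of the list.
def do_jobs_recursive(jobs, do):
    if not jobs:                    # nothing left to do
        return
    do(jobs[0])
    do_jobs_recursive(jobs[1:], do)

# Stage (5): the "global view" of the same process:
# do a_k for 1 <= k <= n.
def do_jobs_iterative(jobs, do):
    for job in jobs:
        do(job)
```

Both perform exactly the same sequence of primitive actions; the iterative form is, as Knuth says, an elaboration of the recursive one that we learned to "see through" as a unit.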
Honestly I'm going to disagree. Yes, when planning the task, I think about it as "I have to do this 100 times". But when I'm actually performing the task, I think "Ok, I have to get this stake in the ground, then proceed w/ the rest of my giant pile of stakes."
On the flipside, I have a lot of mathematical training, so maybe this is an uncommon perspective.
From the perspective of what we do in real life, I think looping and recursion are the same thing. Pounding 100 fence posts into the ground is either "do this, then loop back and do it again until some stop point" (iterative) or "the base case is no posts left, where you stop; otherwise pound the current post into the ground and recurse on the rest" (recursive).
Recursion has the advantage that, when you use the tools built for it, it's easier to read. If I see e.g. a map, I know we're converting a collection of things to some other things, without restriction. With a loop, I just have to read through it to see what it's doing.
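The fence-post framing carries over almost word for word. A sketch (Python; `drive` is a hypothetical "pound one post" step, invented for illustration) showing the loop, the recursion, and the map all computing the same thing:

```python
def drive(post):                      # hypothetical "pound one post" step
    return ("driven", post)

# Iterative: do this, then loop back and do it again until the stop point.
def pound_iter(posts):
    result = []
    for post in posts:
        result.append(drive(post))
    return result

# Recursive: base case is "no posts left"; otherwise pound the current
# post and recurse on the rest.
def pound_rec(posts):
    if not posts:
        return []
    return [drive(posts[0])] + pound_rec(posts[1:])

# With the right tool, the intent is stated directly: map a collection
# of things to some other things.
def pound_map(posts):
    return list(map(drive, posts))
```

The map version is the one you can read without simulating it in your head: it promises up front that it only transforms, element by element.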
> Looping has a huge advantage in terms of how we do things in real life.
Anthropomorphism is the bane of our craft. Our untrained intuitions don't really matter.
> Recursion is much more of a mathematical concept that we learn about long after preschool
Programming is applied mathematics. Different from the kind of maths you learn at school of course, and often even more rigorous.
I personally have no qualms about requiring some mathematical proficiency in programming. Makes you much more capable at a number of applications later on.
I didn't downvote your comment, but your statement is very misleading. Yes, the study of computer science is a discipline of mathematics, but the common programming done by most devs is not applied mathematics.
Sure, if we really wanted to, we could say baseball is "applied calculus" because when an outfielder chases after a fly ball to intercept it, he's tracking an inverted parabola... and cooking is "applied physics" because of thermodynamics ... but insisting on that is just being pedantic and academic. People do not need calculus and physics classroom training to play baseball and cook food.
> Sure, if we really wanted to, we could say baseball is "applied calculus" because when an outfielder chases after a fly ball to intercept it, he's tracking an inverted parabola... and cooking is "applied physics" because of thermodynamics ... but insisting on that is just being pedantic and academic. People do not need calculus and physics classroom training to play baseball and cook food.
No, we can't, because baseball and cooking aren't formal symbol manipulation, but both math and programming are. That is why all programming is applied mathematics, because it's all manipulating symbols in compliance with a set of formal rules.
Observably, many programmers manage to achieve considerable success without much or any mathematical sophistication. This is because the market places relatively little value on correctness, which is what math helps with, and far greater value on pleasantness, which doesn't require mathematical sophistication at all.
Plenty of people learn how to program without learning the requisite math first. I wrote my first program when I was 8 years old, long before I learned even algebra. I think it is still the case that most people learn how to program before they even know what the word recursion means.
Do you think, in the future, programmers will be mathematicians in the same way that the blacksmiths of centuries ago are now chemical and material engineers in steel plants?
Do you think this is the Bronze Age of artisanal craft, poised to be replaced by the dawn of brutally precise mechanized Haskell?
Humans are efficient. Blacksmiths didn't actually become chemical and material engineers; my uncle spent his career working at an aluminum plant, being useful without being an engineer. Many people will obtain just the amount of knowledge and skill needed to accomplish their goals, which doesn't bode well for precise mechanized Haskell.
I think that's a stretch. Programming is the creation and/or maintenance of software-based solutions. Familiarity with the problem domain that needs to be solved involves keeping track of details that in many cases have little to do with mathematical proficiency and much more to do with bureaucracy.
To get any further than mid-level individual contributor, you need a certain kind of curiosity and general problem-solving ability that lets you see around corners in terms of how large-scale human socioeconomic systems function, and to think strategically. That's not to say that mathematical proficiency isn't indirectly useful -- I do think it's worthwhile in that it teaches a mind to think and solve problems in a structured manner, and that process is useful and transferable. But it only transfers so far. There are many challenges you face developing proficiency writing software that you cannot be prepared for in any other way than by using, building and studying software.
I have a huge problem with requiring mathematical proficiency in programming. It sets an unreasonable barrier to entry.
Thousands of everyday apps and services work just fine without anything more than simple arithmetic. 3 or 4 of the top 5 websites required no math to get started.
Spreading the idea that you need to know math before learning to program is elitist and counterproductive IMO.
Maybe that's not such a bad thing? When a big chunk of our world runs on computers, maybe it's not such a bad thing to have standards and take things seriously, like it is done in every single other engineering profession.
So in order to post a YouTube video you would need to take a camera operator course, study the history of movies and pass several advanced English courses?
The term "professional programmer" means the programmer is getting paid, that's all. You want to put a barrier up? Are you worried about lives being lost? Professional programmers don't decide what to program; that comes from the project owner. They translate human requirements to computers.
The people who decide what programmers write are bound by the rules governing their project. If it's a health project, laws around privacy restrict what the project owner can do.
So we don't allow kids to learn how to program until they take a lot of requisite math first? I was planning on teaching my kid some programming when he turned 5 or so, but should I hold off until he is 15 because obviously he isn't going to know all of that math beforehand?
I believe that programming is a skill, not a profession. You can know how to program without being a professional programmer.
> You can know how to program without being a professional programmer.
Of course you can. You just… won't be a professional, and won't be held to professional standards (minimum output & quality, that kind of stuff). That's okay. But if you're going to get paid for it, you ought to be better than a random self-taught amateur.
Though I suspect we aren't, in fact, much better than the average self-taught amateur. That would be an indication that we still have progress to make.
Kids can learn how to build bridges out of legos and erector sets before they learn math and (intro to?) materials science and a host of other things. But we don't let them be civil engineers until they learn the basics of these things. It's entirely possible to learn about and "play with" something without the things that are considered prerequisites for being a professional at it.
Again, programming is not a profession. You might have a point with being a professional programmer (although a lot of programmers do useful things for money without much math), but programming is a skill that can be learned without a deep understanding of math, and often programming acts as a better lead into math rather than vice versa.
> often programming acts as a better lead into math rather than vice versa.
I suspect this is because programming is actually even more rigorous than maths. When you do maths, you can gloss over details, and sometimes delude yourself into thinking false things (that very same thing happened on my last HN submission: I failed to notice a data dependency in EdDSA, and thought an attack was possible, when in fact it wasn't).
Programming uses an external cognitive engine (the computer), that is faithful, exact, and ruthless. Your program has to work. And when it doesn't, you're forced to revisit your assumptions (as I was forced to revisit mine when I actually tried to implement that attack and failed miserably).
Neither is designing or building scale model bridges. But the minute you put a bridge or software project into a situation where many people depend on it and it can cause serious damage, it tends to become one. And that is when we would start applying stricter standards.
Sure, but this just brings up the fact that not all programming is the same. Some people are building small programs that don't need stricter standards, some people are building large ones that do. We just happen to call lots of activities that involve writing code programming, but they are fundamentally different.
What I was going to say. Yes little kids can learn to program, but I think 10 year old me and his shitty programs in Turbo Pascal aren't the standard we should be measuring professionals by.
> I have a huge problem with requiring mathematical proficiency in programming. It sets an unreasonable barrier to entry.
Whether that follows depends on how much and what kind of proficiency we require. We require programmers to be able to read and write, too, but we don’t require them to be excellent orators or literary writers.
I do think requiring some proficiency is OK. Basic arithmetic, for example, definitely is ‘in’. A bit of ability to think logically, too. That doesn’t mean all programmers have to know what a group or an ODE is.
> I have a huge problem with requiring mathematical proficiency in programming.
A program is a formal system whose overall structure is mostly a directed acyclic graph. Most good practices are about shaping this graph (we try to make it sparser overall, though we do tolerate isolated dense sub-graphs). If you know how to organise your programs, you've just demonstrated some proficiency with such graphs, even if you don't realise it.
Maths isn't a way to keep people out. It's a way to get them in. And remember: what we do in high school remains markedly different from what we do in programming. Calculus would be the worst possible introduction to programming. You want discrete maths, graphs, boolean algebra, group theory…
Tell that to the 6-year-old using Scratch. "Hey! Stop that fun stuff with shapes and animation and games and go study math for X years. When you're done, maybe I'll let you program."
Are we talking about kids playing with toys, or professional engineers building serious machinery? I credit playing with Cuisenaire rods as a child for strengthening my mathematical intuition, but I wouldn't build my house out of them, eh?
In any event, I think it's possible to have your cake and eat it too. See http://iconicmath.com/ I think we can make mathematical toys.
You have completely ignored the most important parts of that graph, which are the inputs and outputs at its boundaries. Recall that the halting problem, Gödelian undecidability and the Church-Turing correspondence place very hard limits on the ability of axiomatic systems to define the truth, even in an abstract sense. But we see this all the time in computer science and programming. Something that takes so long to compute that you can't tell whether it's accomplishing useful work for a given input or not has a completely different conceptual character than "discrete maths, graphs, boolean algebra, group theory..." There's less universality. More details. More arbitrary, capricious conditions that you nevertheless have to not just work around, but build resilient systems in spite of.
It is almost cruel to taunt practitioners working around the hard, messy, ugly constraints of reality with the beautiful, elegant, and noble idea of not just conceptual mathematics, but the ability to immanentize it. But, this does little to solve the inherent messiness of the world that the software interacts with, and the messy things that people do which affect what it needs to do.
Most of the mess we have in programs is of our own making: our inability to properly address changing (not complex) requirements.
And sure, the world is messy. But some things aren't worth automating. I've seen a talk where the speaker was writing a program to generate invoices. Turns out invoices are incredibly messy. The previous system was an overly complex, brittle pile of code that didn't quite work. His approach? Solve the common case first, and let humans handle the messy bits. He started with something very simple that could handle 60% of the invoices. Then 90%. Then 95%. Then 99%. Then he stopped.
With 99% of the invoices handled, he had a simple, solid system. The messy part was handled by humans, and they had the time to do so because 99% of the grunt work was taken away from them. Plus, they could trust that it worked, because those invoices were simple enough to be automated reliably. I believe that approach can be generalised. As Mike Acton said: "solve the common case first".
That I believe would work to "solve the inherent messiness of the world". Automate what you can, let humans deal with the rest. Or don't automate at all, just give tools & tricks so humans can work faster.
Alternatively, we could adapt around computers. Before electricity, factories used to have a giant engine that powered everything through mechanical transmission. When electricity first came along, we did the same thing. But it turns out electricity is much easier to distribute at a small scale. So factories adapted, and instead of having a big engine at the centre, they started to have lots of small engines everywhere, close to where they were needed. I believe computers are similar: we started using them as glorified paper, but as we understand what they can do, we employ them in more efficient ways. But the interface of a computer (it can do anything!) is much more complex than that of an engine (it can rotate!). I believe we haven't found how to best make use of them just yet.
I really like your point about solving the common cases, but what you're getting at there is exactly what is the challenging part, not the mathematical part of it. How do you figure out what is the most important 60%, then 90%, then 95%, then 99%? How do you develop the judgment to know when 99% is good enough, or 95%, or 90%, or 60%? How does mathematics help you out with that?
My point is that this judgment is the messy and valuable part. And to your point around the end, that you believe we haven't found how to best make use of them just yet -- I agree with that too, and that's why I think the work of writing software, or best practices, or sound theory is not fully established. To build good product does fundamentally require a pioneer spirit that can make compromises and negotiations and arrive at that sweet spot that's not quite 100% but high enough to be "good enough".
On the contrary, we have no choice but to use our intuitions, at whatever level of training they are at, all the time. So it matters a lot. If A is easier than B iff we learn to think differently, but learning to think differently is harder than just doing B, then A isn't actually easier than B.
Becoming seriously proficient at anything seriously alters the way you think about it, no matter the field. For instance, I play the cello. Lately, I've noticed I only made progress when I understood something, or changed a habit, or let go of some subtle but ultimately far-reaching assumption.
I played The Witness. There's a time-based challenge at some point, which I could only beat once I changed the way I thought about those puzzles, just so I could go through them faster.
I learned to touch type. To do that, I stopped thinking visually, and instead think kinaesthetically. Then I developed some kind of intuition, where I think a word, and the letters just flow through my fingers. I don't really think at the letter level any more. Neither does anyone proficient with a keyboard. (50 WPM and up).
Our untrained intuitions don't really matter. Seriously learning anything trains us out of them. Programming is no different.
It's also probably the case that readability is not a primary concern when implementing things like "min" that will most likely be in the standard library of a programming language. From my limited experience looking at such source code, there tends to be a surprisingly large amount of code for fixing edge cases, esoteric performance problems, etc.
It may be instructive to look at how readable a very naive implementation of a basic function like "min" is, but it probably doesn't actually matter that much in practice, because you'll be using what the standard library offers (and its implementation is probably much less straightforward than you might think).
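For illustration, here is roughly what a very naive "min" looks like (a Python sketch, not any particular library's implementation):

```python
def naive_min(xs):
    """Readable, but skips what a real standard library must handle."""
    smallest = xs[0]            # crashes with IndexError on an empty list
    for x in xs[1:]:
        if x < smallest:
            smallest = x
    return smallest
```

Python's built-in `min`, by contrast, accepts any iterable, a `key=` function, and a `default=` for empty input (raising `ValueError` on an empty sequence without one): exactly the kind of edge-case surface that makes real stdlib code much less straightforward than the naive version.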
As someone who learnt to program in Scheme, this was true for me. I did so much Scheme that, when doing some C or JS, I had to pause for a moment and think harder when writing imperative loops.
Recursion tends to strip context. That is one of the features, and one of the drawbacks. All problems are either a base case, or the general case.
If you are attempting to debug logic, it often helps to have all the context. To do that in recursion, you have to travel up and down the stack, examining local contexts one by one.
I'm not sure what you mean here. The only thing I can come up with is you're talking about writing specific code recursively but when doing functional programming we usually just use the applicable recursive function (e.g. map, filter, reduce) in which case we have better context than with a loop because a loop has a lot of irrelevant implementation details cluttering up the context.
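A small side-by-side of the same select-and-transform task (Python; the order data is made up for illustration). The loop drags accumulator bookkeeping into the debugging context, while the filter/map version names each step:

```python
orders = [("alice", 120), ("bob", 80), ("carol", 200)]

# Loop version: the intent (select, then transform) is buried in
# bookkeeping -- an accumulator, a conditional, an append -- all of
# which sits in your local context while you debug.
big_spenders = []
for name, total in orders:
    if total >= 100:
        big_spenders.append(name.upper())

# Functional version: filter then map; each step states its intent,
# and there is no mutable state to inspect.
big_spenders_fp = list(map(lambda o: o[0].upper(),
                           filter(lambda o: o[1] >= 100, orders)))
```

Both produce the same result, but in the second version you know before reading any body that nothing is mutated and every element is treated uniformly.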