
It’s always nice when people independently reinvent good ideas! This poster’s more or less come up with up-arrow notation and a shot at its inverse.

edit: actually meant https://en.wikipedia.org/wiki/Tetration

https://en.wikipedia.org/wiki/Knuth%27s_up-arrow_notation



As is pointed out further down, it does effectively invent tetration, which is cool, but that's not Knuth's arrow notation.

Edited: I've removed the original and replaced it with the above because clearly my original comment wasn't being understood the way it was intended. Possibly this is no better, but it's worth a try.

To provide context, the comment by elil17 is in response to my original, in which I asked what this had to do with Knuth's arrow notation.


a@2 = a↑↑2


Except the essence of Knuth's up-arrow notation is the recursive nature. This is going one step, and while I agree that it's a first step, it doesn't actually get the recursive nature. So this is a notation for a single thing, not a general thing. It's good, but it's missing the generalisation step that is what makes Knuth's notation particularly special.
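To make the contrast concrete, here is a minimal Python sketch (my own, not from the original post) of Knuth's recursive definition. The poster's @ corresponds to the single fixed level n = 2, i.e. a@b = up(a, 2, b); the recursion over n is the generalisation being discussed.

```python
def up(a, n, b):
    """Knuth's up-arrow a ↑^n b, defined recursively:

        a ↑ b    = a ** b          (one arrow is exponentiation)
        a ↑^n 1  = a               (base case)
        a ↑^n b  = a ↑^(n-1) (a ↑^n (b-1))   (each level iterates the one below)
    """
    if n == 1:
        return a ** b
    if b == 1:
        return a
    return up(a, n - 1, up(a, n, b - 1))

print(up(2, 1, 3))  # 2↑3   = 2^3 = 8
print(up(2, 2, 3))  # 2↑↑3  = 2^(2^2) = 16
print(up(2, 3, 2))  # 2↑↑↑2 = 2↑↑2 = 4
```

The point is the outer `up(a, n - 1, ...)` call: every arrow level is defined by iterating the previous one, so one notation covers exponentiation, tetration, pentation, and so on uniformly.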

Maybe I know too much about the field, maybe I don't know enough, but I've looked over this many times and it doesn't seem to me to be making "the step."

Maybe I'm just expecting too much, but it feels like the heart of Knuth's notation is missing.

<fx: shrug /> I'll leave it at that. Feel free to disagree, perhaps it's an interesting talking point.


Maybe it's more accurate to say that the poster had invented tetration.

https://en.wikipedia.org/wiki/Tetration


I find it hard to focus on this stuff, but is there really any ultimate generalization? Like, you can always recurse once more: once you define up arrows, why not iterate them?

↑n means n up arrows, so why not have an operator that applies up arrows a number of times defined with up arrows?

It seems like it's just dipping your toe into large numbers, no matter how far you go.
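The "iterate the arrows themselves" move the question describes is essentially diagonalization, and the classic example is the (Péter form of the) Ackermann function, whose first argument acts roughly like the number of arrows. A short sketch, for illustration:

```python
def ackermann(m, n):
    """Péter's Ackermann function. The first argument m plays the role of
    'which hyperoperation level', so A(m, n) eventually outgrows a ↑^k b
    for any fixed number of arrows k."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(2, 3))  # 9   (level 2 is roughly doubling: 2n + 3)
print(ackermann(3, 3))  # 61  (level 3 is roughly exponential: 2^(n+3) - 3)
```

And of course the same question recurs: once you have Ackermann, you can diagonalize over *it*, which is exactly where the ordinal-indexed hierarchies in the sibling comment come in.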


The ultimate generalization is to use an inductive definition based on recursive ordinals, for instance the fast-growing hierarchy f_alpha: https://en.wikipedia.org/wiki/Fast-growing_hierarchy

Since ordinals are the ultimate recursion tool, any such function will ultimately grow faster than whatever recursive definition you can cook up by hand.

For instance, even the slow-growing hierarchy g_alpha, which grows very, very slowly, catches up to the fast hierarchy at extremely large ordinals.

Some examples:

- f_0 behaves like addition, f_1 like multiplication, f_2 like exponentiation, f_3 like tetration, f_w like Ackermann

- Graham's number is bounded by f_{w+1}(64)

- The Goodstein function behaves like f_{epsilon_0} (this shows how much faster it grows than Ackermann). Similarly for the Kirby-Paris hydra

- The n-iterated Goodstein function behaves like f_{\phi(n-2,0)}, where \phi is Veblen's function

- The tree function from Kruskal's tree theorem behaves like f_{\psi(Omega^Omega^omega)}, i.e. it sits at the small Veblen ordinal. And the graph function from the Robertson-Seymour theorem behaves like f_{\psi(Omega_omega)}.

By contrast the slow growing hierarchy grows extremely slowly, for instance g_{epsilon_0} only grows like tetration. But it still catches up to the fast growing hierarchy at very large ordinals.
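The finite levels of the hierarchy are easy to write down; here is a sketch restricted to finite indices alpha (the limit stages such as f_w and beyond require fundamental sequences for ordinals, which a few lines of code cannot capture):

```python
def f(alpha, n):
    """Fast-growing hierarchy, finite indices only:

        f_0(n)     = n + 1
        f_{a+1}(n) = f_a iterated n times, starting from n

    So f_1(n) = 2n, f_2(n) = n * 2^n, and f_3 is already
    tetrational in growth.
    """
    if alpha == 0:
        return n + 1
    x = n
    for _ in range(n):
        x = f(alpha - 1, x)
    return x

print(f(0, 5))  # 6
print(f(1, 3))  # 6   (= 2 * 3)
print(f(2, 3))  # 24  (= 3 * 2^3)
```

Each successor level iterates the previous one n times, which is the same engine as the up arrows; the ordinal index is what lets the construction keep going past every hand-built recursion.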


Like I said, I can't really follow this stuff, but maybe the ultimate would be the Busy Beaver function? Is there such a thing as a proof of how many functions can be between something computable and Busy Beaver, given some sort of constraints?


Yes, the Busy Beaver function grows faster than the functions I gave; basically it behaves like f_{w_1^CK} (although this does not really make sense, because w_1^CK, the Church-Kleene ordinal, is not recursive).

By "ultimate generalisation" I was speaking about a family of computable functions that ultimately grow faster than other functions you could cook up by hand: you put all the recursivity in the ordinals, they are there for that, and logicians are very good at constructing very large (recursive) ordinals.

If you just want a very, very fast-growing function (but one that is still computable, unlike Busy Beaver), you can do a light Busy Beaver variant: Beaver(N) = the maximal output of all Turing machines T with N states for which there is a proof in ZFC of length <= N that T terminates.

This function grows much faster than the other functions I described (basically it is f at the proof-theoretic ordinal of ZFC). If you are interested in this kind of idea, there is a reddit post that develops it: https://www.reddit.com/r/math/comments/283298/how_to_compute...


> why not have an operator that applies up arrows a number of times defined with up arrows?

That is exactly what up arrows do, isn’t it? Up arrows are iterated up arrows. I learned this in college but still didn’t fully understand up arrows for a long time, they grow unimaginably bigger and faster than I first thought. This video explains it pretty well: https://youtu.be/GuigptwlVHo

I don’t know about ‘ultimate’ but there are some generalizations of up arrows:

https://en.m.wikipedia.org/wiki/Hyperoperation

https://en.m.wikipedia.org/wiki/Conway_chained_arrow_notatio...
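The hyperoperation sequence linked above can be sketched directly; this is my own illustration, with H_1 = addition, H_2 = multiplication, H_3 = exponentiation, H_4 = tetration, and each level defined by iterating the one below:

```python
def hyper(n, a, b):
    """Hyperoperation H_n(a, b) for n >= 1.

    H_1 is addition; H_{n+1}(a, b) applies H_n to a, b times over,
    with base cases a * 0 = 0 and a ^ 0 = a ↑↑ 0 = ... = 1.
    """
    if n == 1:
        return a + b
    if b == 0:
        return 0 if n == 2 else 1
    return hyper(n - 1, a, hyper(n, a, b - 1))

print(hyper(2, 3, 4))  # 12 = 3 * 4
print(hyper(3, 2, 3))  # 8  = 2 ** 3
print(hyper(4, 2, 3))  # 16 = 2 ↑↑ 3
```

In this indexing, a ↑^k b is H_{k+2}(a, b), so the up arrows and the hyperoperations are the same ladder viewed through different notation.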


I agree that your '@' (tetration) is one layer of 'meta' below Knuth arrows.

Tetration delves past exponentiation into hyper-operations, whereas the Knuth arrow generalizes them. It is just a special case that x↑↑2=x@2.


I wouldn't worry much when your e-rep takes a few hits from people with nothing better to do.


You're not supposed to discuss voting...but I've noticed a couple times that sometimes when I make two related comments on the same thread expressing basically the same opinion, one gets upvoted a lot and one gets downvoted a lot, and that's just weird. Maybe the exact timing determines if the upvoters see it.


Yeah, this is just one reason why I wouldn't be super concerned with your hacker news points.



