For anyone interested, there are a few such tools that are modern alternatives to traditional Unix utilities. Interestingly, they're all written in Rust.
grep -> ripgrep (command `rg`)
find -> fd
sed -> sd
diff -> delta
ls -> exa && lsd
cat -> bat
wc -> cw
du -> dust
ps -> procs
top -> ytop (**edit** archived; author recommends `bottom`)
They all offer their own modern take, but I'd say the first two, which are the only ones I'm using, are the most interesting. Both ripgrep and fd offer easier usage. ripgrep is also an ack replacement, and fd can also replace parallel (it even has compatible syntax). The amazing part is that both are an order of magnitude faster than their counterparts. The ripgrep dev has an extended analysis of how this is achieved on their blog[0].
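As a small sketch of the "easier usage" point (the file content and pattern here are made up; the `rg`/`fd` spellings in the comments assume those tools' current defaults and are not executed):

```shell
# Build a tiny sandbox directory to search in.
dir=$(mktemp -d)
printf '// TODO: cleanup\nfn main() {}\n' > "$dir/main.rs"

# Classic recursive search with line numbers; the ripgrep spelling
# would be just `rg TODO` (recursive by default, honours .gitignore).
matches=$(grep -rn 'TODO' "$dir")
echo "$matches"

# Classic find-by-extension; the fd spelling would be `fd -e rs`.
files=$(find "$dir" -name '*.rs')
echo "$files"

rm -rf "$dir"
```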
>it seems ytop [2] is not being maintained anymore
Indeed, the author recommends `bottom` instead, so I noted that in the list. It appears top has a few alternatives: there's also `gotop`, written in Go, and (now that looks cool) `bashtop`, written in Bash.
Yeah, I like them both. Normally when I'm doing some heavy processing I have htop, gtop and nvtop laid out side by side in a tmux session [1], giving me full visibility of system resources.
Indeed, it's a 24" rotated 90 degrees, so 1200:1920. Initially I thought I'd never use it, but it's surprisingly useful for things like monitoring or background processing.
What for? Why would I use these rather than the Unix tools I have been pretty happy with for nearly 40 years? And which come installed on every Unix-like distro.
Because "it's old" and "it's everywhere" also hold for things we can and/or should replace. Otherwise you'd be using not the tools of 40 years ago, but the tools of 20+ years BEFORE the advent of UNIX.
Now, the convenience of "it's on every Linux distro" is more theoretical than real: it's 2021, and we have networking, package managers, provisioning, and other niceties, so you can take your tools with you to a new job or to the servers you manage. Do you really want to constrain your toolset to the lowest common denominator just because once in a blue moon you'll have to log into some non-touchable system for a session and won't have ripgrep there?
(And if you're not an admin but a developer, that makes even less sense: you're supposed to be in far more control of your development environment.)
The "I've used these for 40 years" argument amounts to either (a) "I don't want to learn something new, even if it's improved" (which one can respect, but it is not a general argument, as others might want to learn something new), or (b) "I think the existing versions are the global maxima of such tool design, or close enough not to matter".
Note that (most likely) the tools you use are already replaced versions of the old UNIX and/or POSIX tools -- namely GNU versions of those, with tons of extra functionality and flags and in several cases different behavior.
Without those, a system feels absolutely medieval (I had the displeasure of working on some legacy unices like AIX that didn't have them by default, only the legacy UNIX versions of such tools)...
Because I want to write scripts with the minimal possible support overhead. That means striking a balance between the lowest common denominator, and productivity. Does (for example) ripgrep give me so much extra speed that it's worth committing to the overhead of installing it and keeping it patched up to date on thousands or tens of thousands of machines? With a dozen or more combinations of distro/version? What if the next guy has his own favourite grep replacement? What if we construct a suite of tools and all of them grep in a different way, and someone else a year or 5 years or 10 years later has to debug it? What if two or more of these tools decide that they are the one that gets to replace grep with a symlink to themselves? Having been there and done that it gets very messy, very quickly.
Note I'm not saying that grep can't be improved - but I am saying that it would be better to PR those improvements into grep itself rather than creating a new and incompatible thing. Or fork grep and ensure that the improved version is fully backwards compatible.
>Because I want to write scripts with the minimal possible support overhead. That means striking a balance between the lowest common denominator, and productivity
That's not a general argument against their use though. You can write your redistributable scripts in plain sh or bash, as you would, using grep or whatever classic tool.
ripgrep and co. can still serve for use on the command line, which is what people use them for. And for many devs, grepping something or find-ing something or ls-ing etc. is many times more frequent in everyday CLI use than writing some script they'll redistribute.
(That's why devs have adopted zsh and co. too: that it won't be installed or be the default on some random system is not an issue. You still get utility out of it on your main driver machine(s) and/or the servers you control.)
The argument some people make is not
(a) "I'd rather stick to more popular POSIX/GNU variants for scripts I want to be redistributable",
but, (which seems to me non-sensical)
(b) "I'd rather not use something like that, even if it's better/faster/etc., because it won't be available on any random machine I might occasionally log into, or because I can't ALSO use it when I want to write easily redistributable scripts".
> Note I'm not saying that grep can't be improved - but I am saying that it would be better to PR those improvements into grep itself rather than creating a new and incompatible thing. Or fork grep and ensure that the improved version is fully backwards compatible.
This kind of thinking is just so incredibly myopic. Like, whoever said compatibility is the most important possible feature? What if "improvements" aren't possible without breaking compatibility? Then what do you do?
Like, I get that it's fun to play the "old man yelling at clouds" role. But this takes it to another level.
> Because I want to write scripts with the minimal possible support overhead.
My goodness. Then just use plain old grep! ripgrep is not grep. It's like a grep. It cannot replace grep in every circumstance. It never will and it was never intended to.
Just because someone suggests an alternative doesn't mean they are suggesting it as a replacement for literally every possible use case.
There are UX benefits of `rg` (or whatever alternative) over `grep`, e.g. ignoring the .git directory by default. Plenty of these things _can_ be done with grep, but the newer tools offer them out of the box.
Generally, the UX improvements involve better colouring and a more intuitive set of arguments.
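To illustrate the out-of-the-box point (sandbox contents are made up; the `rg` spelling in the comment assumes ripgrep's current defaults and is not executed):

```shell
# Sandbox with a fake VCS directory and a regular file.
dir=$(mktemp -d)
mkdir "$dir/.git"
echo 'pattern in VCS metadata' > "$dir/.git/config"
echo 'pattern in tracked file' > "$dir/notes.txt"

# grep needs explicit flags to approximate ripgrep's default behaviour
# (recursive, line numbers, skip the VCS directory); with ripgrep the
# equivalent would be just `rg pattern` run inside $dir.
result=$(grep -rn --exclude-dir=.git 'pattern' "$dir")
echo "$result"

rm -rf "$dir"
```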
"Installed on every Unix-like distro" is an advantage if you're frequently jumping into environments without these tools installed. In that context, it's worth learning the defaults of the standard tools. But in many jobs you work on the same computer and can benefit from e.g. customising dotfiles and using whichever tools you want.
It would be better if it were written in Rust, but I prefer `httpie` to `curl` because I find the syntax significantly easier to get my head around. Even though curl is installed on virtually every Linux distro, I almost always get its syntax wrong.
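For example, here are the two spellings of a JSON POST side by side (the endpoint and payload are made up, and the commands are built as strings rather than executed, to avoid network I/O; httpie's bare `key=value` arguments become JSON string fields):

```shell
# The curl spelling: method, header and body all explicit, with quoting
# inside quoting for the JSON payload.
curl_cmd='curl -X POST https://example.com/api -H "Content-Type: application/json" -d "{\"name\": \"demo\"}"'

# The httpie spelling: JSON is the default content type, and key=value
# pairs are assembled into the body for you.
http_cmd='http POST example.com/api name=demo'

echo "$curl_cmd"
echo "$http_cmd"
```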
But if you prefer a tool, there is no reason to switch if you don't want to.
I use ripgrep because it is noticeably snappier, especially for unicode. Other tools like hyperfine or fd are just more convenient. I'd have no problem replacing ls with another tool, but none of them support viewing SELinux attributes.
I really don't care whether a tool is standard or not when it comes to a workstation. I have so many things tweaked and personalized that using a few nonstandard tools doesn't make a difference.
1. Newbies to Unix don't come with 40 years of experience; they need to pick these tools up from scratch. When the newer tools have substantially better UX, it's a no-brainer that they would be used. I still remember being a newbie, and improving the UX (note: not dumbing down) of our tools would go some way toward alleviating the difficulty of navigating unfamiliar terrain.
2. Those tools are also 40 years behind what we know we can accomplish in command line UX. They cannot afford to modernise without breaking backwards compat of the historical output text and the params, so they stay frozen. Any changes would have to come from new Unix tools written from scratch, which is what `exa`, ripgrep and `fd` are doing.
I guess it means a more modern implementation (Rust).
ripgrep is faster than grep, but there are some differences; the flags aren't exactly the same, for example.
If you use grep on big files regularly, it's worth a try.
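One concrete example of such a difference (the `rg` lines are shown as comments, not executed, and assume a build where PCRE2 support is optional): GNU grep's basic regular expressions support backreferences out of the box, while ripgrep's default engine does not.

```shell
# GNU grep BRE: \1 matches a repeat of the first captured group.
out=$(echo 'abcabc' | grep '\(abc\)\1')
echo "$out"

# ripgrep's default Rust regex engine rejects backreferences; you'd
# need the PCRE2 flag instead (illustrative, not run here):
#   echo 'abcabc' | rg '(abc)\1'      # error: backreferences unsupported
#   echo 'abcabc' | rg -P '(abc)\1'   # works if built with PCRE2
```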
Ripgrep and fd could be written in Brainf*ck for all I care. The big win is the improved UX and not having to `man grep`/`man find` when I want to go beyond the two parameters I memorised.
I'm half-joking. I use some of these utilities myself occasionally, and even more of them have ended up in my Emacs toolchain simply because they're faster.
On the other hand, there's value in all of these ancient, ubiquitous and battle-tested utilities. You're doing yourself a disservice not learning find and grep.
And I do poke fun at the "rewrite it in rust" crowd even though I think Rust itself is incredibly exciting. There's nothing as easy as reinventing existing tools in a greenfield code base. Inventing truly new things and maintaining old code are both much harder and much more valuable. Not saying that the existing tools don't need an update every now and then, but I'm not particularly impressed, despite these things being flashy.
`bat` is definitely a `cat` replacement, as it keeps the essential functionality of concatenating files. The automatic paging only applies when printing to a terminal, so it doesn't meddle with that functionality.
By the way, zsh's built-in pager (invoked with `<`) has this behavior: it prints the output to the display, but pipes it to an actual pager if it is large.
sd is pretty great. It's like sed, but it does what you'd expect by default (find the first arg and replace it with the second). You can opt into regex matches.
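A minimal comparison (the `sd` invocation is shown as the commenter describes it, as a comment rather than executed):

```shell
# sed needs the s/// substitution expression:
out=$(echo 'hello world' | sed 's/world/there/')
echo "$out"

# the sd spelling is just positional pattern and replacement:
#   echo 'hello world' | sd world there
```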
Yeah, no way I’m replacing ls with something that effectively combines ‘ls -ltrh’, ‘tree -L’, and ‘stat’. It appears the efficiency gain provided by this tool’s interface is a marginal improvement in terms of ‘characters typed’. It’s like combining a bicycle, scooter, and moped, but limiting yourself to a few hundred square feet. Confused. If this is a troll post, it worked, as I am irked.
On the other hand, I've had `ls` aliased to `exa` for a while now, and there have been no downsides that I can see. It accepts roughly the same flags and has slightly nicer (or more cluttered, depending on who you ask) output.
If the goal is to get an idea of what an unfamiliar directory contains, I can see it being useful. It's less useful if you just want to list files, but exa has that too.
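If you want to try the same setup, the alias is a one-liner in your shell rc file (a sketch; the flag names below are exa's long-view and git-status options):

```shell
# In ~/.bashrc or ~/.zshrc:
alias ls='exa'
alias ll='exa -l --git'   # long listing, with a git-status column
```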
Any new tool replacing a good old one must be a drop-in replacement (this one seems like it might be?); muscle memory is so hard to overcome after years and years of conditioning. It's pretty interesting how tools become a language of their own over time.
Not always, although it's preferable where it makes sense. Something like httpie is much better than curl and wget, and it's definitely worth learning its interface. If they had tried to make it drop-in, a lot of the benefits would be lost.
Not necessarily. Newbies enter the field all the time, without the entrenched reflexes. Seasoned Vi users cannot imagine their shortcuts changing, but I cannot imagine getting used to them.
This just reminded me that I use exa on a daily basis. I seriously forgot! I set up aliases a few years ago, synced them to all my devices, and promptly never thought about it again. Really useful tool!
[0]: https://blog.burntsushi.net/ripgrep/