Speak English to me: The secret world of programmers (github.com/npmaile)
287 points by npmaile on March 17, 2023 | 327 comments


The problem is mainly one of discoverability. The non-programmer in the examples says "simple" when they mean "discoverable". The programmer says "simple" when they mean "not much code complexity".

Of course you disagree.

All the examples - installing and using a package manager, running a command line program - are "simple" from a software point of view, but they're not at all "discoverable". To understand the steps you need to memorise a seemingly meaningless sequence of text.

Of course, the programmer has a very strong underlying model of all of this, and so these commands seem straightforward. But even for a programmer, this stuff isn't very discoverable. I'm frequently stumped by man pages, and end up having to hunt down an example of how to use a particular command line application on Stack Overflow. I've been using linux for decades and I still often can't find where an application's binaries or config live, and I still very often have to type some magic incantation I found on Stack Overflow into the command line, because my mental model of some subsystem of Ubuntu isn't complete enough to know it. None of these things are discoverable - I've got more patience than most in searching Google and Stack Overflow, but it's not exactly a desirable way of using a computer. I would prefer it if there was a nice button in the system settings that did what I wanted.

When things aren't discoverable, they become hard. And people don't generally like doing hard things. That's not exactly a revelation.


Reminds me of Rich Hickey's classic talk "Simple Made Easy": https://paulrcook.com/blog/simple-made-easy

Simple and easy are not the same thing.


man, that's a good talk. I've got some decomplecting to do in a few places.

I loled at the beginning, when he talks about (in 2011) how "all I see on Hacker News is people talking about how great and awesome all this new stuff is and nobody ever questions it, asks if this stuff is adding more problems". The culture must have shifted in the interim.


Windows does that, and its regular users barely visit Control Panel, not to mention .msc-s and other system menus, e.g. the Security or Sharing tab. When something breaks or they have an idea that isn’t in a nearby menu, they call a nearby programmer for help. Your sentiment is agreeable, but we are still unable to make the interfaces you want.

There probably even is a button or a sequence of UI actions in your system for half of the things you want. But these are less discoverable than a bash pipe available right there on SO: three clicks away rather than ten-plus, and those ten clicks will change location and names with the next update. It becomes hard either way, but only the stable and replayable way wins.


> When something breaks or they have an idea that isn’t in a nearby menu, they call a nearby programmer for help.

No they don't. Normal people don't have programmers on call. They google the error message and do whatever the classiest-looking page they find tells them to do, which usually starts with opening the Control Panel. Honestly, it's the same thing that programmers do when they hit something they don't understand, except programmers are better evaluators of the classiness of help pages (due to the mental model).


For a loose definition of "programmers", sure they do. Maybe it's a sysadmin, maybe it's a teen who writes/builds game mods, but IME most normal folks have someone at work or someone in their family they can ask/harass, and they'll do this before they'll start typing error messages into a web search.

Folks will come and ask for help but then you have to tease out of them what the problem was. The entire idea of "I got this error; I will look up this error" is _foreign_ to most normal folks. It is grand when they come with an error message in hand.

But even when someone has an error message, they often turn their brain off: I worked with a developer who would send screenshots of a text error rather than just copy and paste it, which would force one of us to retype what he wrote into a web search.


> I worked with a developer who would send screenshots of a text error rather than just copy and paste it

I worked with someone who did this too. Would also commonly send screenshots of code we were discussing.

"Dude, I need to see this code in context, can't yous end me a link to the file / line in Github?"


> I worked with a developer who would send screenshots of a text error rather than just copy and paste it...

I've worked with one like that. I got fed up, so I just took to walking back to my desk and telling them to google it to see what the general consensus is; when they could tell me more about it, I'd come back. Did that enough times till they got 'used' to not being lazy and at least tried to understand the issue before yelling for help.

Was harsh at the time, but they knew they were being lazy, and as devs we are paid to find solutions. I think that's the critical thing in this: we're not only incentivised into naturally and habitually finding solutions to things, but we are also failing when we don't. We're failing to meet our expectations of our own analytical capability. Also, we work with devs, or dev-like people, so we get used to talking to people who will either keep up with us when we're rattling down a rabbit hole of thinking or gleefully join us in it to try to find the solutions.

I think that's the crucial thing here. We're applying our expectations of ourselves and our peers to people for whom solution-finding is completely alien, who not only don't think 'this is for me to understand', but don't even approach the thought that they could potentially do something to fix it, even at a very small level.

The things we do every day as easily as breathing are, to a lot of people, magic, and that's really not a good thing (“Any sufficiently advanced technology is indistinguishable from magic.”).


I feel like there's a fundamental understanding of man-pages that we are just missing. I find myself in the same boat quite often.


I think the root problem is that writing documentation sucks and people just do the bare minimum. I am hoping that AI making docs easier to create will help make them more usable all around.


man pages have a bootstrapping problem. If you don't already know a lot of foundational stuff, it's very difficult to bootstrap yourself with them. There are so many things that say something like "see man fizzlewizzle (3)" or something. I've been a developer for nearly 40 years (mostly in Windows, barely dabbling in linux) and I've yet to figure out what the heck that number means, or how I might predict what the correct number is when I want to look something up myself.


> I've been a developer for nearly 40 years (mostly in Windows, barely dabbling in linux) and I've yet to figure out what the heck that number means, or how I might predict what the correct number is when I want to look something up myself.

https://en.wikipedia.org/wiki/Man_page#Manual_sections

You usually don't have to care unless you're writing them because it only matters to a user when there are man pages of the same name in multiple sections.
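
For example, printf lives in two sections, and the section number is how you pick between them:

    man 1 printf   # the shell utility from coreutils
    man 3 printf   # the C library function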


"man man" explains the section numbers.


Maybe we need a fork of `man` that reads the page in a sexy tts voice while showing minecraft footage on half the screen.


Or AI generated animated walkthrough content straight into PluralSight...

...and I write this partly in jest as I absolutely would use that.


Cause this is what the kids want these days amirite?


> I'm frequently stumped by man pages, and end up having to hunt down an example of how to use a particular command line application on Stack Overflow.

If you're not aware of tldr [0], I highly recommend it.
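
Running `tldr tar`, for instance, prints a short list of annotated one-liners (create an archive, extract one, list contents) instead of a full flag reference. The exact output depends on the client, but it's roughly:

    tldr tar
    #   tar cf target.tar file1 file2    (create an archive)
    #   tar xf source.tar                (extract an archive)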

[0] https://tldr.sh/


man(1) pages on linux are almost all terrible. Unworthy of inclusion in any manual. Proper manuals have examples of use. Many linux man pages don't even discuss (or list, even!) return codes or failure modes.


Some of the manpages for util-linux (e.g., chsh, useradd) are pretty barebones and don't include examples, but off the top of my head I can't think of a major GNU utility that doesn't have an EXAMPLES section in its manpage. GNU coreutils, find, sed, and grep all have such a section in their manuals.

What are you thinking of that is example-less?


GNU ls's manpage has no examples, but it's also a thousand words long so it hardly seems fair to characterize it as bare-bones.

All in all, I find that manpages on Linux are usually pretty good, particularly compared to systems like MacOS where many daemons and utilities seem to have no manpage at all. People are far too quick to slag manpages, which probably has something to do with the `man` utility itself being a bit archaic. Common Linux desktops could probably do a better job of advertising GUI-based manpage readers like khelpcenter.


I'd like a slightly fancier system (no, not info), but being able to look things up without leaving the terminal is so underrated imo. It makes looking things up a lot faster.


POSIX ls's manpage has examples. Well, one example, of `-laRF`.

But also, I can't imagine what good more examples would do anyway. `-lahR` covers the vast majority of common uses.


> `-lahR` covers the vast majority of common uses.

For me, 99% of the time I use -ltrh and -ltrha. But instead of -R, I use other utilities like fzf, find, or tree. Tree is really nice for a quick overview of directory structures.
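
For anyone following along, those unpack as:

    ls -ltrh   # -l long listing, -t sort by modification time, -r reverse (newest last), -h human-readable sizes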


Yeah, I really don't understand the aversion to examples. The sorta-standard format for man pages is... extremely confusing, even after a decade of using them.


Probably “you should be smart enough to figure this out from here” bravado, for some cases.


You do realize that writing good docs takes time, skill, and effort, right? Not to mention the docs you're blanketly scorning were most likely written in someone's spare time, by someone with no obligation to do so.


When you have some people not doing it, that's fine, they own their time. When you have almost no one doing it, that's a cultural issue. I'm saying Linux/Unix man pages have inherent and probably very old cultural issues leading to a lack of baked-in examples.


GNU stuff tends to be properly documented in info, not man. But yeah, loads of stuff just isn't well documented.


Stallman's sad devotion to that ancient religion... ITS has been effectively dead for how many decades now, but by god we're gonna force everybody to use a port of their online documentation system if we can!

At least we're past the bad old days when you'd open a man page and it would just be a stub saying "look at the info docs, IDIOT"


> Proper manuals have examples of use

This is what Microsoft did well with PowerShell's help system. They have examples. Tons of them.
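
For example:

    Get-Help Copy-Item -Examples   # just the worked examples
    Get-Help Copy-Item -Full       # everything, including parameter details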

Why man pages do not is beyond me when most of the time, an example or two is all we need. I bet most people go to the man pages and then google for examples because the man pages weren't helpful.


ss64.com is good to me as well. There's like 4 ways to write filters in powershell and I am never sure which syntax I need for a given command or if I'm supposed to use |?{ instead.


I also use http://cht.sh from time to time.


> All the examples - installing and using a package manager, running a command line program - are "simple" from a software point of view, but they're not at all "discoverable". To understand the steps you need to memorise a seemingly meaningless sequence of text.

In a big enough organization, document sharing setup and program installation for non-technical users should be performed by sysadmins. There's a good chance if someone needs something, everyone will need it, so with proper device management tools they can make it discoverable/preconfigured for everyone.


The new CLI sub-command model helps here I think. Many commands by default now list out their sub-commands when not given one, and each sub-command has its own help output. This plus command line completion really does a good job at discoverability IMO.
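
git is the canonical example; the discovery path is roughly:

    git              # bare invocation lists the common subcommands
    git help -a      # list every available subcommand
    git commit -h    # terse usage summary for a single subcommand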

Oh and "memorise a seemingly meaningless sequence of text" makes no sense as all sequences of text are meaningless until we assign them meaning which is what you're doing learning a CLI.


I wish CLIs had better discoverability. That's one of the first principles of UX.

There should be a library that makes it easy for developers to include a basic GUI with a CLI.


I've taken to having the default behavior of a CLI program be to display the URL of the documentation. This seems obvious to me, but it was a long time before I thought of it, and I haven't seen it elsewhere.

I didn't see a need to add a GUI as a web browser will do nicely.
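
As a minimal sketch of the pattern (the tool name and URL here are made up):

    import sys

    DOCS_URL = "https://example.com/mytool/manual"  # hypothetical docs URL

    if len(sys.argv) == 1:
        # no arguments: point at the manual instead of failing cryptically
        print(f"mytool: see {DOCS_URL} for documentation and examples")
        sys.exit(0)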


> That's one of the first principles of UX.

And yet, in the world of GUIs, discoverability is getting increasingly rare. I don't think it's prioritized at all anymore.


I've always used Android phones, but my company recently stopped supporting them and told me they have to issue me a new phone - this time an iPhone.

And I'm utterly lost. At one time, Apple was said to be the high priests of user experience, with intuitive interactions that would just work the way you expect. This seems to have fallen by the wayside, pushed out by making it more sophisticated I suppose. Because things that I do instinctively on Android, I just can't figure out how to do. I've had the phone a week now, and I still haven't figured out how to do a "Switch App" (like Alt-Tab). A few times I've accidentally hit it, but I can't figure out what the trick is.

I don't mean to say that it's objectively bad, but it sure has erected a wall against anyone who might migrate from Android.


Yeah, I had to use an iPhone for a while, and found it incredibly frustrating to figure out how to do pretty much anything. Everything I wanted to do involved a whole bunch of blind searching.

I don't think it's the iPhone's fault. I think it's because I am used to Android.

Both Android and iPhone are very unintuitive and lack discoverability. You learn how to use them with experience.


Oh, I definitely agree with you. It's not that iPhone is worse than Android. It's that both of them have sacrificed discoverability.


> Apple was said to be the high priests of user experience

People say that, but I always wonder how much of it was true. It took me three days to figure out how to get my iPod Touch to stop repeating a single song. I don't know how a single song got selected on repeat.


Double tap the home button and it brings up the app selection screen. Took me an embarrassingly long time to figure that out some years back.


What.. it doesn't really, does it? I'm going to try that now.

oh wow that's awesome. consider my hat truly doffed


Home button? My iphone doesn't seem to have a home button. The way I've gotten to the home screen is by a gesture flinging upwards from the bottom.


> There should be a library that makes it easy for developers to include a basic GUI with a CLI

There are. For example, in Python, Gooey provides that. [0]

[0]: https://github.com/chriskiehl/Gooey
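
Roughly, you decorate an argparse-style main and Gooey renders the parser as a form; a minimal sketch (the program and argument names are invented):

    from gooey import Gooey, GooeyParser

    @Gooey(program_name="File Muncher")  # hypothetical name
    def main():
        parser = GooeyParser(description="Pick a file and process it")
        parser.add_argument("input", widget="FileChooser", help="file to process")
        args = parser.parse_args()
        print(args.input)

    if __name__ == "__main__":
        main()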


Good to know!


Affairs will begin to favor the "programmer" definition of simple with the advancement of conversational models. A chat bot that responds to your prompt with command line flags is going to appear simpler than an answer about advanced physics or math.


Something can be easy while not being simple.


Literally everything I write is in Markdown because it’s so simple to me. I use it for blogposts, presentations, documentation, todos, internet comments.

But this article doesn’t strike a chord with me. The author fails to understand that it seems easy to us because we’re familiar with it. Of course we know how to transform it into PDF or HTML. But this knowledge isn’t basic, and it’s understandable that people want to stick to what they know.

Here’s an example. Say you had 10000 rows of CSV to slice and dice and extract insights and visualisations out of. All of us would go to the tools we’re familiar with - pandas, R, duckDB etc. But I think the median Excel user probably completes this task quicker and better than the median pandas user. And then if the pandas person is told “oh, just pick up Excel, it’s easy to learn and so useful” they won’t listen. And for good reason! Excel is powerful, but it’s not easy to learn. It only seems that way to people who already know how it works.
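
To make that concrete, the pandas version of a typical slice-and-dice is only a few lines (file and column names invented here), but only once you already know the incantations:

    import pandas as pd

    df = pd.read_csv("sales.csv")                      # hypothetical 10,000-row file
    by_region = df.groupby("region")["revenue"].sum()  # aggregate one column by another
    by_region.sort_values(ascending=False).plot.bar()  # quick visualisation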

Don’t confuse the familiar with easy. Just because it’s easy for you doesn’t mean it’s objectively easy.


> Excel is powerful, but it’s not easy to learn. It only seems that way to people who already know how it works.

this right there.

except, it IS easy to learn actually. it's visual, you can click around and find out. if you forget the spelling of some formula, no probs, there is a help button.

the interface is what makes it easier, markdown is also an interface. except it's not AS exposed to the user as excel is.

people don't want to read manuals or consult documentation, they want to hit the ground running.


Yeah, I want to emphasize those aspects.

• Excel is based on a simpler model of computation, something like the unification ideas in logic programming, state is not named and minimally expressive, etc. You want a looping construct? Better autofill a column or something!

• Excel automatically includes printf-style debugging of every object in the system; that is the default state of the system and you have to hide columns you don't want to see.

• Excel has no compilation or run step. The document is living and when you change things you immediately see the results of that change.

This means that the programming model in Excel becomes much more amenable to the results-oriented mind. Decide on an incremental goal, then calculate a bunch of results “the long way” and then massage the formula until it gives a result that matches your manual calculations. Repeat as desired.

It's like clay, basically. You sculpt it to look right and then you sculpt the next piece to build on that and so on.


One additional point: since Excel packages the code with the data, when somebody sends me a spreadsheet with functionality I've never seen before it's extremely easy to figure out what they did and learn on the job. This is also why Jupyter notebooks are so powerful.

If instead of Excel they run a bunch of scripts and just send me the output, I am left much less empowered.


> when somebody sends me a spreadsheet with functionality I've never seen before it's extremely easy to figure out what they did

You must have encountered only very simple spreadsheets. I've had to reverse engineer a number of technical spreadsheets that took days of sweated, full-time effort merely to understand the principles, and more days again to extract the methods accurately enough that they could be reimplemented in a programming language.


Think about how much harder it would be if you couldn't see the inputs, code, and outputs all together at once. The difficulty you found has little to do with Excel; you were trying to learn a whole new domain by reverse-engineering a model created by an SME. Of course it took multiple days of effort to merely(!?) "understand the principles" of a brand new technical domain. Ideally the SME should be explaining those things to you as well.


> The difficulty you found has little to do with Excel;

I claim that if you allowed access to an underlying source code representation of the spreadsheet, or rather, if you made that transformation at all, you could easily add symbols and labels to transform an Excel sheet into something a programmer could turn into a binary. But I doubt that step in between is desired.

I mean, the activity of analyzing an Excel spreadsheet is more akin to learning where all the items in your favourite point-and-click adventure are, and in what order to get them where, than to actually analyzing a problem.


When you've done that, is it all of a sudden super-simple? You've had to incorporate all the same complexity into your model. If you sent it back to a spreadsheet user, don't you think they would say the same thing, just backwards?: "I had to reverse engineer a number of technical data processes in R that took days merely to understand the principles..."


Perhaps not super simple but it will have been broken down into named parts. In the days before spreadsheets were as capable as they are now I worked with plenty of quite old engineers who would write Fortran instead; their Fortran was always better than the spreadsheets that they later produced.


The degree something is well-designed and comprehensible is based mostly on the developer, not the tool. A spreadsheet with named ranges/tables, cell comments (sparsely, where necessary), clearly-defined tabs, etc can make debugging or reverse-engineering a breeze. The visual aspect of Excel can also help debugging immensely, i.e. conditional formatting of cells failing validation, filtering to isolate subsets of data, etc.

I will certainly agree that it is quite rare to come across a spreadsheet that is perfectly-organized, on the other hand a ton of developers write horrible spaghetti-code too. Further, a 10+ year-old spreadsheet will open right up in Excel... while a 10+ year-old data project is often written in a language you've never used before (or hoped you'd never use again).


That pt 3 is actually something I really like in Office. It isn't just instant, you can see changes live as you mouse over different options. I also really like the search tool that lets you just type what feature you are looking for. I have no idea where the 'new pivot table' button is and I don't care, I can just search it instead of hunting through menus.


as much as i hate microsoft, excel endearingly excels expectations...


Ima stop you right there and ask how you made the bullet points.

• Is it just a pasted character?

* Is it a markdown thing?

* Is it?


You can tell it's a character because the wrapped lines don't indent & align like a real bullet list.


I was on my mobile, I think the keyboard is Gboard, you have a symbol keyboard with two pages and the second page has the bullet.

Also if I long press on the bullet I can choose among •♣♠♪♥♦, which I find to be a hilarious random assortment...


it's a character


> except, it IS easy to learn actually. it's visual, you can click around and find out.

That's not easy at all. I am an actual Excel 2007+ virgin. I switched to OpenOffice (there was no Libreoffice yet; Oracle had not yet bought Sun) before high school. Before I finished university, I abandoned all traditional office tools in favor of open-source tools that use entirely different paradigms from Microsoft Office, e.g., Word -> Lyx/LaTeX, PowerPoint -> Org-beamer, Excel -> R. I didn't even use traditional Office tools when they were 'required' by class, instead explaining my commitments to the instructors and teaching myself the open-source alternatives as I went.

For the first ~5 years of my career, I was at startups where I didn't need to use Microsoft Office or even tools in that category.

So within the past year, I started dabbling in Excel for my job, when the last version of Excel I even messed around with was Excel 2003, pre-ribbon menu, in junior high.

And as a developer who is approaching contemporary Excel from a position that is actually pretty close to a blank slate in terms of experience with it... I am here to tell you that this

> [Excel] IS easy to learn actually

is false. You are almost certainly discounting experience people have gradually acquired with Excel in school and over the course of a career. This isn't just evident with oddball developers like me who wanted to and managed to avoid Excel for most of our careers so far. It's also perfectly evident in baby boomers who grew up without Excel and need assistance for basic tasks, and increasingly in zoomers who, like those two generations before them and earlier, have grown up without being trained on office software in schools.

Which takes us back to GP's statement:

> > Excel is powerful, but it’s not easy to learn. It only seems that way to people who already know how it works.

Moreover, these statements

> if you forget the spelling of some formula, no probs, there is a help button.

> people don't want to read manuals or consult documentation

contradict each other. So do the following two, in a slightly less direct way.

> it's visual, you can click around and find out

> they want to hit the ground running

Pressing the help button is reading the documentation, just as much as running `q --help` or whatever.

And anybody who is visually exploring a GUI with as many buttons and menus as Excel's is absolutely crawling, not running. Visually scanning an overloaded, totally new-to-you graphical interface is slow, slow, slow.

When I've been driving Excel on video calls with colleagues during working sessions (specifically for the purpose of learning it), I can practically hear the veins on their foreheads bulging as I fumble around.

Searching with your eyes, especially when some options are hidden behind additional clicks as they are on ribbon menus, is so unbelievably slow compared to searching through a comprehensive manual by typing a couple of words.
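
And that kind of search is built in:

    man -k rename   # a.k.a. apropos: keyword-search the one-line descriptions of all man pages
    # within a man page, typing /pattern jumps straight to the matching text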


Complicated tools are hard to learn. There's no general fix to this. The only way to make complicated tools easier to learn is to watch a bunch of different people try to learn them and tweak little details based on what you see. In Excel 2003, Microsoft had 20 years of doing this. When they moved to the "ribbon" in 2007, they threw a lot of that learning away. So you're right that the ribbon makes Excel harder to learn than it needs to be.

But I reject the idea that visual GUIs are inherently more difficult to learn than non-visual interfaces; everything we've ever seen in the history of software suggests the opposite.

> Searching with your eyes, especially when some options are hidden behind additional clicks as they are on ribbon menus, is so unbelievably slow compared to searching through a comprehensive manual by typing a couple of words

Typing which words!? You're already assuming the user has a pretty strong understanding of the software if they know which words to search for. This is not a fair comparison with opening a brand new GUI for the first time.

Think of the difference between walking into a restaurant and having the waiter hand you a big menu with lots of pictures, vs. standing there and asking "What do you want?" ... um, I don't know, what do you have!? The latter is how it feels for new users to be presented with "> _".


> Think of the difference between walking into a restaurant and having the waiter hand you a big menu with lots of pictures.

The useful part of the menu is the text! Especially if I'm somewhere with novel or unfamiliar food, there's no way I can guess what actually went into a cooked dish or what its texture is like by looking at pictures of it. But all of that I can infer from a reasonable description of the dish given in terms of its ingredients and how it is cooked.

But if the waiter asks 'what do you want', that is indeed an offer for them to search on my behalf much more efficiently than I could do visually through a menu! I'm free to say 'something savory and salty' and see what they offer me, or ask 'what chicken dishes do you have?', or 'I need something light, because I'm not very hungry'.

Even in the example you chose, visually scanning a menu is less efficient and more restrictive!

> how it feels for new users to be presented with "> _".

Of course a shell prompt is not self-explanatory, because you have to learn the commands for asking about other commands.

But there's also no reason that accessing the command line for the very first time has to mean being dropped at a blank prompt with no instructions. That's completely inessential to the paradigm, and it's unfortunate that this is such a common default.

> Typing which words!? You're already assuming the user has a pretty strong understanding of the software if they know which words to search for. This is not a fair comparison with opening a brand new GUI for the first time.

Sure it is. I do this all the time with commands I've never used or use very rarely. In the worst case you try a few synonyms and words for related concepts and find what you're looking for that way.

You're also not making a 1:1 comparison here. One may need some basic instruction on the command line environment itself, e.g., how to search for and open documentation at the command line. But that is not part of the process of learning to use a new command line application. You're comparing the cost of learning a single application to the cost of learning an entire paradigm and environment.

> But I reject the idea that visual GUIs are inherently more difficult to learn than non-visual interfaces; everything we've ever seen in the history of software suggests the opposite.

What volume of research has ever even investigated command line usability and how it can be improved? Where has that research been applied? How much of it focuses on applications/domains which are comparable in complexity to an absolute behemoth like Excel?

Consider the part of this discussion that has focused on developers who are hesitant to use the command line. Is there seriously any reason that a text-driven environment navigated non-spatially, through search/filtering, should be any harder to use than an IDE, like those same developers use every day?

----

I also wanna be clear that I'm not saying that Excel's non-obviousness is a fatal flaw, or that people who are effective with it should switch to something else. Excel is actually an impressive, capable piece of software.

I'm also not saying that most command line environments as they exist do a good job of being easy to jump into. They generally don't! I think the command line has long been largely neglected in terms of UX, and it shows.

But the way CLI environments lend themselves to searching, filtering, composition, and reusing old actions while slightly modifying them also makes searching for, picking up, and using new command line programs really smooth and fast, in a way that I think most of us don't give enough credit to.


Not contradicting you. Just adding this famous case of using Excel.

> the median Excel user probably completes this task quicker and better than the median pandas user

Some might complete it with catastrophic effects.

> A million-row limit on Microsoft's Excel spreadsheet software may have led to Public Health England misplacing nearly 16,000 Covid test [1]

[1] https://www.bbc.com/news/technology-54423988


Yeah, that's the flip side of a simple tool that makes it easy to solve problems. It's not clear when "simple" stops working or how to recover from it. On the other hand, the bespoke software approach can have bugs too.


I agree bespoke software could have bugs, of course, but this bug seems unique to excel. I would say excel is a tool that makes it easy to solve _easy_ problems.

I do not think Excel is simple. IMO it is less intimidating to users because it is graphical and does not look like programming.

> It's not clear when "simple" stops working or how to recover from it.

Excel's row limit could clearly fail in a non-silent way, and it would be a better tool for it. They chose "simplicity" by ignoring errors.


There are a lot of times where a low-level understanding of Computer Stuff lets me easily spot problems in simple stuff. Like if I see a year that's 1969 or a number that's in the negatives and around 4 million, I know some kind of overflow happened and can work from there to diagnose the problem. I don't know the right balance, though. I know all this stuff because you just had to if you wanted to do anything with a computer beyond simple office tasks.

Even advanced stuff has nice interfaces these days. My new laptop with a dGPU comes with software that has an overclock toggle and some guidance on how to get the most out of it, warnings, etc. That's a long way from editing files to make HIMEM.SYS work so Wing Commander would run.


Overflow to negative 4 million? I guess that would be signed 23-bit integers? I've never seen this in my life.


Typo. I meant billion. It's also been a long time since I actually needed to know this, so I'm not 100% sure it's ~4 billion.
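
(For reference: 2^32 - 1 = 4,294,967,295, so an unsigned 32-bit counter wraps just under 4.3 billion, while a signed 32-bit int goes negative past 2,147,483,647. And the 1969 dates come from a zero or slightly negative Unix timestamp rendered in a timezone west of UTC.)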


I was a tech lead on one of the testing analytics data platforms by one of the "commercial firms" mentioned in the article, and I used to plead with/yell at/cajole various groups of people to not ship uncompressed CSVs around by email/FTP :(


That sort of data should never be in a spreadsheet.


Indeed. I'm a programmer, and if I have a 30-by-30 table of numbers with column headers produced by an app or a script, my first reaction is to select it, copy, paste it into Excel, press search in Excel and type 'text to columns'. The next step is to search for 'chart'. In between I can experiment with different chart types with just a few mouse clicks, draw a trend line, add a caption, ... Of course, when this needs automation, further steps need to be taken using a script in combination with pandas or the like, but only after a quick assessment using Excel.


I wouldn't say that Excel is harder to learn. I was taught the basics of it in primary school. I don't think pandas could be taught at the same level; you need to know Python first, and that is a mountain, even if there are much higher mountains.

With Excel you mostly need to know about cell addressing, =, dragging the corner of a selection, $, ranges and some basic formulas (which currently are much easier to explore than when I was first approaching the subject). That is, I think, around 80% of Excel as it is used, maybe even more.


Right I’m speaking about slightly advanced Excel. Everyone knows how to sum/average columns or apply basic formulae. But creating pivot tables and using macros isn’t self-evident or easy to figure out by clicking around.


> Excel is powerful, but it’s not easy to learn.

This may be true in a general sense, but probably not when you talk about the programmer picking up Excel. I often find I can use it for more things than people that spend their whole lives in excel simply because I have a reasonable mental model of what it should be able to do.

That said, I want to talk to whoever made the abomination that is VLOOKUP and ask them why.


Also don't confuse "easy" and "simple". "Easy" is subjective and related to familiarity, while "simple" is a more objective concept. Eg. Markdown is objectively simpler than MS Word.


I would use Visidata [0] and get it done before excel finished importing the csv file.

[0]: http://www.visidata.org


It's like learning anything.

Learning to sew was a never ending sequence of small steps that required lots of looking up to actually grok.

Thread the bobbin then put it in the bobbin holder, then swap in a denim needle, thread the needle from your main spool, switch to a straight stitch with a wide length, and adjust your tension disks appropriately while practicing on a sample patch. Place your main garment in, do a backstitch, and sew your lines.

All of this is super simple, but if you're just getting started, each step requires a manual (and it even glosses over a few really important parts like making sure to put your presser foot up or down while sewing).


> Literally everything I write is in Markdown because it’s so simple to me.

I'm inclined to agree, for the display of some text, it's one of the better choices that I've encountered!

I actually recall working on a system where HTML instead of Markdown was chosen for the storage and display of user messages (think along the lines of a chat/e-mail app), which came with a plethora of complexity and problems - everything from various editors across apps formatting things differently (sometimes breaking WYSIWYG), to the formatting in the database becoming inconsistent as the apps and libraries are updated but old messages weren't affected, any sort of manual parsing and migrating this data being difficult, as well as sanitization concerns and security challenges.

The Markdown suggestion of mine was shot down, but as far as I know there was just problem after problem with using HTML before I took a break from that org and took a bit of a sabbatical. Then again, migrating between different formats like Markdown and HTML is needlessly complicated if you have pre-existing data, so it's better to make the "correct" choice on day 1.


> just pick up Excel

Pandas are free; Excel is not.


Excel is installed on my work computer. How do you install pandas and will IT let me install it?


> How do you install pandas and will IT let me install it?

Okay, if we're doing arbitrary restrictions then Excel isn't available for my OS.


Office runs on wine.


Excel Online and Google Sheets are free.


LibreOffice is free


I find a version of this even with fresh out of college programmers.

They just know how to type Java into an IDE, maybe some SQL (using some graphical front end), some HTML/CSS. They can't operate in a Linux dev environment at all, can't use the command line or exit vi or perform any kind of simple shell-based automation or use git without a plugin for their IDE.

And they are also reluctant to learn, because they have defined themselves into some narrow field that doesn't include all the ops trickery in our workflow, which they don't consider real programming.


This is a somewhat harsh take. I've met plenty of programmers also set in their ways in that C or Python is the only language needed and that concurrency or functional programming is "hard" and thus not worth learning. Whereas I find concurrency and functional programming, given the correct language for it like F#, LabVIEW, Elixir, etc., is easy and manageable. It just all depends on context and experience and additionally philosophy.

For me, I have been developing for several years in several different contexts and consider myself a decent software developer who focuses a lot on proper design and implementation, readability, maintainability, speed when needed, etc. However, I am relatively weak on all the various Unix tooling (I know the general basics of course) and am currently trying to address that.

The reason is that I worked professionally for a long time in a Windows environment. Any scripting was done via PowerShell, which is a quite powerful tool and language, and something I actually often prefer given the readability, if also the verboseness, of the cmdlet names, or via the programming language used. My more Linux-oriented roles or projects are more recent and have used languages like Elixir or F#. Since both of these languages have built-in scripting, any scripting was done with the languages themselves. Elixir has such powerful tooling that, for whatever reason, I just rarely found that I needed the Unix command line tools for what I was personally responsible for or working on, or what wasn't provided by Elixir or my editor in VS Code. Elixir allows you to build custom Mix commands, which is also nice.

However, now I'm in an environment where many of these tools could be very useful, so I am trying to pick them up. And it's always possible that I was missing out, so it's important just to be willing to learn as things present themselves.

Just stating this because context is everything.


Keep in mind that "unix knowledge" doesn't mean knowing bash scripting, but understanding a bit about the shell is the key part.

You worked on Windows, but you probably know about environment variables, how they are passed around, what PATH is, how to list files in a directory, cd around, copy/delete files, redirect, pipe, and how to use git, given your profession. Isn't that the case? I've been on Windows and learned all that as a hobbyist (I'm a professional developer, but I work on unix systems).

I asked the company I work for to add as a requirement "familiarity with bash", because otherwise we would get candidates that don't know how to use PATH, or environment variables, and that was a serious problem. It's equivalent to having the requirement "know how to use a computer"
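
The basics really do fit in a couple of lines:

    export MY_VAR=hello          # visible to every child process started from this shell
    echo "$PATH" | tr ':' '\n'   # the directories searched, in order, for commands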


I understand your point about Unix knowledge not requiring shell scripting experience, but I have always thought that POSIX shells are such an important part of the Unix interface that I can't separate the two. Of course, you can do mostly everything that is in bash in other languages (and certainly there are one too many pain points in bash), but I feel less knowledgeable not knowing shell scripting. For me, learning how to shell-script to some capacity has opened up my understanding a lot of what I am working on.

I took this opinion into some presentations for co-workers, and they came back to me saying that they had gotten, even if just a glimpse, a broader sense of what can be done. That really transcends the hot-new-tech rush a bit, because these bits are foundational, in one way or another, to our unix/linux/bsd systems.


I got the feeling that what the commenter wrote was a bit of hyperbole, but maybe that wasn't the case. Either way, I would sort of expect new college graduates, or anyone, to receive some training from their seniors and colleagues along these lines. These things aren't all that hard to learn or all that hard to teach. The main concern is showing context and use cases where they're useful and important.

> Isn't that the case?

Yes, of course. But there is certainly a wide variance in people's understanding of all the command line tools and use cases that they are presented with in their own work and experience.

Software engineering is a huge field with a ton of use cases and different contexts and applications and domains. It's really all over the place. There is some patience needed when interfacing between all of these.


I think that not knowing how to use PATH or environment variables is something that can be fixed in like 15 minutes maximum. You just explain it to juniors and that is about it.


But then all the nuances take 2 more years to realize.


There are three variants:

#1. They are just stupid.

#2. They don't care about work beyond the bare minimum needed to not get fired. They have better things in their life to care about.

#3. They are as smart as you are, but they put their efforts elsewhere, where you don't directly observe them. Maybe they don't know how to operate a Linux environment, but they spent days reading about DSP math and they can apply this knowledge to optimise some signal filter. Maybe they don't know HTML, but they spent that time learning Spring internals and can compose really beautiful code out of it or solve very hard problems.

I think that specialisation is not that bad. It's everywhere. Some people in IT are generalists; they know a bit of everything. I consider myself one. I'm trying to build an embedded Linux distro right now, a few days ago I was writing a parser for PDF to extract some information (a real parser, byte after byte), a few weeks ago I tinkered with an STM32 program, a few months ago I wrote a Java service, and next week I'll continue to work on a React web app which is going to run on an embedded device. I do everything, and my knowledge helps me immensely to quickly dive into a new area. But I'm absolutely far from an expert, and sometimes I spend lots of time when I lack knowledge. For example I spent two weeks trying to start graphics on an i.MX8MM board and I failed. I need to debug the kernel but I just don't have that kind of skill in Linux internals. But there are people who prefer to specialise, and that's fine; they would probably solve that problem in a few days.


You forgot one variant: they just don't like it.

Like me. I use the command line all day, compiling AOSP and kernels, and drivers, firmware, and all kinds of related things for embedded. But I don't like it.

To me, it feels archaic and counterintuitive (I'm an autodidact). Same goes for command line editors: Vi, Vim, Emacs, etc.

It's not 1985 anymore. Microsoft and Apple built an empire based on coherent (up to a point) UIs. They succeeded because they were probably onto something.


It's not just concrete things like the command line or linux or old crusty editors; it's about how they lack general computer savvy and flexible problem-solving skills.

One example:

I had this guy, it was his second job, the previous one was about some Enterprise Java thing. Intelligent and hard working, and able to learn: we were exploring Erlang for some parts of the system and he was contributing within the month, with just a bit of hand-holding. But one day we get an urgent request for new functionality in the Java parts of the org, so I hand it to him. First task is getting some static data out of a few dusty html tables deep in the intranet and dumping it on a new table in the dev DB so we can start prototyping around it.

A couple of hours go past and I go check up on him. He's deep into some docs for an XML parser and ORM so he can parse the tables and get the data into the DB. He has already like 5 or 6 classes with all their getters and setters but he's not nearly halfway done.

I bring up the browser console, type $ to see if jQuery is loaded in the page, then come up with a one liner that spits the insert statements on the console (I had to google a couple of times, I'm not really a jQuery dude).
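
From memory it was something along these lines (selectors and table name invented):

    $('table tr').each(function () {
      var vals = $(this).find('td').map(function () {
        return "'" + $(this).text().trim() + "'";
      }).get();
      console.log('INSERT INTO new_table VALUES (' + vals.join(', ') + ');');
    });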

But then even that was a bit too flashy... he could just have coaxed the data into the database fiddling around with a spreadsheet and the DB GUI for a couple minutes.

But somehow his default and almost only mental model of interaction with data was overengineered Java, no matter how overkill it was for the task.


One thing I learned in my 41 years is that nobody thinks like me (or you). It's not that I 'think better' or that I am 'better at thinking'. But sometimes I deal with variables that I can't correctly communicate or translate. I often think that people deal with the same information as me, and will follow the same path of reasoning I do.

Perhaps the dev thought it would be a tool that you would be using every day, or a thing to maintain in the future.

For example, in what you wrote, I only really understood the thing you wanted to do by reading what you finally did. Perhaps the input to the dev should have been: "Hey, we have to do this, and we need this mock data from this HTML to start developing. First we take the data out. Don't worry much about how you do it, even if it's nasty. We don't need a full-fledged production-grade "HTML data extractor" to do this. This is data that we would be importing twice at max. So even if you lose 15 minutes copy-pasting it into a CSV file, it's fine."


Building the long-term solution first is often wrong even when you need one. I use the quick and dirty method until I know I'll do that task more regularly and I know enough about the problem so I don't need to make wild guesses about the design.


This is a clear example of what I was talking about. The fact that you use a given method doesn't mean other methods are "often wrong even when you need one". This is just the way you think and work, in the position you are now.

If your boss at the workplace where you've been working for probably "within the month" or a little longer tells you to write a program to do X, you'll maybe be pulling out all your knowledge to write a program to do X as if it were the most important program in the world.


Not everyone does that tho. And that includes pretty good senior engineers I work with.


Learning to work "quick and dirty" is something I'm actually working on improving, personally.

I don't necessarily do BDUF or anything, but I generally do more "engineering" than is often directly necessary. Partially, it's my personality. But part of it is that in my experience, prototypes have a way of becoming production systems.

Last time I had to do a prototype for work, I split the difference: working quick and dirty, just meeting the minimal requirements, but also documenting what we would need to do if we decided to move forward with it. That felt like a good balance.


> I use the quick and dirty method until

I use it until I realize it’s not quick after all. Now I default to writing one off scripts.


> First task is getting some static data out of a few dusty html tables deep in the intranet and dumping it on a new table in the dev DB

For something like this, the first thing I'd do is extract the first row manually in Vim, recording my actions as a macro, then use that macro to extract the rest. That would take a few minutes tops, depending on the structure of the data it might take seconds or there might be a bit more massaging necessary, but no more than about 5 minutes for this step. I have a keybinding that maps execution of the macro in the q register to the spacebar to streamline this process; I double-tap q to start recording, press q again to stop recording, and press/hold the spacebar to execute that q macro. In my experience, 2 or 3 passes are usually enough to cleanly extract a table of data from some random HTML page.

If the job was a once-and-done, then I'm already done. If I am meant to be creating something that can be re-used later, then I still do the above anyway and now I have the result with which to test whatever more permanent solution I'm writing.
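
For anyone curious, the whole setup is one line of vimrc:

    " qq starts recording into register q, q stops; Space replays it
    nnoremap <Space> @q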


I guess it's just different workflows and preferences. I like Vim and the commandline because they don't get in my way and don't try to be clever. I don't get 3 popups every time I open a program because there's a new update available and it really wants me to create an account and it has a super important "tip of the day" I forgot to turn off. Instead I can give it a couple thousand lines from a source it doesn't need to know about and let it chew on the data and write it into a file to load in a Python script. And somehow it still all stays up to date without having to tell me.

GUIs are nice to get into but are also limiting because they were designed with assumptions about how I use the program. Compared to the commandline they feel like they have a wall built around them that prevents me from making them interact with the rest of the system. Some GUIs are less limiting but in turn not any more beginner-friendly than the commandline.


I don’t think compiling kernels would really gain anything from being outside a terminal?

Ultimately it’s just an interface to programs that output mostly text. It took me a really long time to wrap my head around that. Programs that have a GUI and programs that run in the terminal use the same building blocks.


> I don’t think compiling kernels would really gain anything from being outside a terminal?

I remember compiling, flashing and debugging Windows CE images (10? maybe 15 years ago?) from an IDE (Visual Studio). It was super comfortable. Setting breakpoints, jumping to any kernel thread, navigating the call stack, watching memory and variables. All from the IDE. The development process was super fast.

Now, between dmesg and logcat, I have to debug by reading and grepping thousands of lines of log. Also adding printk, ALOG and all sorts of logging functions to the code, recompiling, reflashing using a terminal with ADB, etc.

> Ultimately it’s just an interface to programs that output mostly text

For example, double-clicking on a compilation error and showing the error in an IDE is priceless. I know there should be some Vim plugin that does that, but it's out-of-the-box on every decent/modern coding IDE out there (VScode + ssh, which I use for AOSP, for example). Even better if the IDE shows only the errors/warnings the compiler emitted.

Also, try to find the error line among the tens of thousands of lines of build log when you compile a kernel/AOSP in parallel with -j20.


I currently work with senior engs who are really more like reworked mech engs.

They want to use old versions of Eclipse, on Win8, install new copies of the same for different projects, and want the new associate-level engs (23yo, first job, etc) to write plugins for that old Eclipse so the shipped plugins that break due to age still work. Because they don't know the svn command line, or how to merge without the plugin.

They won’t migrate to git from svn - though that’s ok, we don't “need” distributed VC. They still think git history is so mutable it's “spooky.” Though I’d love to make branches while working from home and not on the VPN…

In short, and it’s just one story, really: this kind of thing happens at all levels and is really a management issue, imho. They should be pushed to expand knowledge as a priority in the workweek.

Often we’re just tossed into “fix it ship it” mode and before we know it we’re 60 and cranky and yelling at clouds.


Perhaps these devs prefer the "fix it ship it" mode and don't see the value of learning particular skills. I find it a balancing act to be pragmatic and it involves being skeptical about investing time learning new tech. With everything there is to possibly learn, you have to ignore almost all of it... until (at the latest) it's industry standard in the niche you specialize in and then you have to be open minded that it's worthwhile to the company and personally.

I have found that in cultures like you describe, "decision matrices" can be helpful because they allow people to consider costs and risks, do preliminary investigation, etc. That is a sort of way of providing encouragement and permission to learn things and innovate. Lunch-and-learns are another tactic to force people (or give them an excuse) to learn. Neither of those should need managers' approval (it's just creating a meeting invite for lunch or a wiki page). If these don't work, the problem is probably the seniors internalizing management's whims instead of pushing back. But the engineers won't push back if they've never taken time to learn.


Solid thoughts - I neglected to let on that in my case the seniors lament how new tech sucks, like Python being just another Perl… etc. They are in “tear it down and prove it to me while I call things retarded” mode sometimes.

That's where mgmt should step in. It's really only a couple people.


You might be surprised how even people as aggressive or stubborn as this might still find it embarrassing or unreasonable to refuse a lunch-and-learn. Maybe sending a mass invite to a meeting on Python would be seen as passive aggressive, so just start socializing smaller ideas with key people. It sounds like you'd have to be cautious about upsetting people but I think it's always worth it to at least find the most polite and diplomatic ways to suggest things you want at opportune moments. The times I see this backfire is when people let it monopolize their attention or bring negative tone/attitudes.


I will try that. I appreciate your reply. :)


I started out as a sysadmin before moving into dev and devs will think you're some sort of wizard just for understanding how file permissions or firewalls or nginx work, it's kind of baffling.


What's funny is I know some amount of <insert language here> owing to me being a JOAT (jack of all trades), so when they run into some issues and I'm able to debug it they wonder why I'm a sysadmin and get paid less than they do.

Why ARE sysadmins like me paid less than developers? And I can't get hired as a dev because they see my CV and go, oh, you're a sysadmin. I mean, try being the solo sysadmin of 100+ production Linux servers with minimal downtime... you need a little bit of scripting/coding to handle all that!


If you're wording your CV in a way that screams sysadmin right in your face, the only offers you get will be sysadmin offers. Try emphasizing your coding experience more.


Correct, but it's also a common [false] view that sysadmins can't code. Even if I emphasize my (limited) experience in code and leave out details about my sysadmin work, I get asked what I did, etc. "Oh, you're a sysadmin", and that's where the conversation ends.

But let's gloss over the fact that I can set up the testing and production environments on my own, along with the DB and schema, and do the routing, proxying and everything in between. Then back it all up.


Try to market yourself as devops. The pay is better than sysadmin pay, and if you're selective enough, the job can be great for you and your whole skill set.

It took me 3 tries to get myself right where I wanted to be but when work doesn't really feel like work despite being at $BigCorp, life is enjoyable.


I've found that most companies treat sysadmins as a cost center while developers are treated as a profit center.

Even when companies specifically state otherwise, it's how they usually behave.


> And they also are reluctant to learn

There may be some of this, but I think there is also the part where it can be really hard to learn.

Typescript is my preferred language these days. Not for technical reasons, but because it lets small teams share a language across the entire stack. At least in my area, where React is basically "front-end" and its tiny rivals are also Typescripted.

But a lot of the ecosystem is admittedly sort of silly. Say you want to use environment variables. A lot of people will use dotenv, we do too, but you can use it in many different ways. You can require(), you can import, and then you can use process.env.VARIABLE in your code. Which is fine, but really, what we consider the best practice way of doing it is something akin to having a config.ts file, which exports all the configuration as javascript objects. This way, you can use dotenv like this:

    node -r dotenv/config dist/app.js
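
To make that concrete, here is a minimal sketch of the kind of config.ts I mean (the names DATABASE_URL and PORT are just placeholders, not something from a real project):

    // config.ts -- a minimal sketch. dotenv has already populated process.env
    // (via `node -r dotenv/config`), so this module only reads, validates and
    // types the values; the rest of the codebase imports these consts instead
    // of reaching into process.env directly.
    function required(name: string): string {
      const value = process.env[name];
      if (value === undefined) {
        throw new Error(`Missing required environment variable: ${name}`);
      }
      return value;
    }

    export const DATABASE_URL = required("DATABASE_URL"); // placeholder name
    export const PORT = Number(process.env.PORT ?? 3000); // optional, with a default

The nice side effect is that a typo'd or missing variable fails loudly at startup instead of silently coming out as undefined somewhere deep in the app.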

Nobody is going to tell you how to set these things up, however. Well, that's not exactly true, because EVERYONE is going to tell you how to do it, and in this myriad of options you're most likely going to end up with something that isn't great. React and other major frameworks with build tools will be opinionated and do it for you. React does it similarly to how we do it, likely because having all your process.env.VARIABLEs exported as consts means they are typed, which makes it easier to debug. This doesn't mean developers necessarily pick up on it though. I've seen many React developers who don't use dotenv the way React does it, despite having been exposed to the benefits of the React philosophy.

So I think it's more than just reluctance. It's simply hard, and the internet or the community sort of makes it harder. Because you and I both know exactly why these React devs used it differently: they did it because the first 2,000 hundred-word blog articles you get when you google "how to dotenv" tell you how to do it "wrong".

Then once these young developers venture out of their comfort zone, their impostor syndrome gets additional power when they do it "wrong" and someone calls them out on it in a less than friendly way.


> I find a version of this even with fresh out of college programmers.

I've noticed something with boot camp programmers too.

Something that may have changed is that newer graduates may have studied programming because "it pays well", and not because they were computer hobbyists before deciding what to study.

I've been in meetings with some programmers that didn't know how to use a GUI FTP client or how to manually upload a file to a server. Because most of that stuff is abstracted now.


> I've been in meetings with some programmers that didn't know how to use a GUI FTP client or how to manually upload a file to a server. Because most of that stuff is abstracted now.

I would consider myself a decent programmer: I have a master's degree and a deep technical understanding of multiple areas. I have plenty of experience and have shipped successful products.

I've never (not once) in my life used a GUI ftp client, and I can futz my way around an SCP command to shuffle a file around if I need to.

I can make the same argument against someone who refuses to use a new development environment because they understand how their compiler works under the hood.

Just because someone doesn't know something that you know doesn't mean that you're better than them, it means you're different. Personally, I would rather have someone on my team who understands how to extract logs from cloudwatch/datadog/new relic than someone who insists our VPN solution has FTP access to our servers.


Just like many other types of higher education. Doctors, lawyers, etc. A CS, CE, EE degree is for many the first step on a career path. In many cultures becoming a doctor, lawyer, or engineer is something to strive for. It is an upstanding position and is well paid. Too bad being a teacher seems not to be like that anymore in many western countries.

We here at HN are quite probably not the norm. Being hobbyists that have been tinkering with computers since early childhood should not be a requirement to become a programmer. If it were, the industry would not be able to scale as it has. There are simply too few of us in the human population.


Last time I used FTP was years and years ago. I really don't think newbies should be expected to know it.

Also, it is OK and to be expected for fresh graduates to have knowledge only in the areas most important to them. Even people who want to learn a lot more than necessary are better off focusing on learning more in the immediate area they will actually use at first.


It's been a while since I've last used FTP, so it might be just a generational thing. I for one have no idea how to connect to a BBS.


> newer graduates may have studied programming because "it pays well",

Define new. Programming has been known to pay well since at least the 1990s.


There's a big gap between making early career choices and them materializing; also, information dissemination is slow and uneven.

> Programming has been known to pay well since at least the 1990s.

Known by whom? The median family in 1990 wouldn't even have been aware that programming exists as a career, much less that it pays well. By the dot-com boom in the late 1990s, the adults would know that, but it wouldn't yet be "culturally assimilated" for the masses in the way the assumption about doctors' and lawyers' income was; and then you get the dot-com bust. I'd say that the time when the median high-schooler gets told by all reasonable adults that "programming is a sure way to money for anyone, not just for weird geeks" doesn't arrive until 2000+ or even later; and from that time it takes ~10 years for that generation of kids to pass through school and college and fill the companies in great numbers.


This is a good point, programming was known to pay well, but you're right it was seen as for nerds only in the 90s. My parents were even a little worried about me taking compsci because they thought I wasn't nerdy enough to compete! Glad I didn't listen.


I went to college for a big chunk of the 90s, liked to fiddle with computers and didn’t know programming paid well.

In fact I didn’t know much at all about computer science other than some prerequisite class I was bored out of my mind in.


That's because you were obviously interested in programming, not money (yay).

Gates had, however, been on the Forbes rich list since the late 80s, and was the richest person in the world from 1995, so I find it hard to believe anybody really money-driven wouldn't have known this at the time.


You forget the immense depth of everything that is computers today. It simply is not possible to learn everything, even the basics of everything, in the time it takes to get a degree.


As a person with an extensive background in ML, I feel similarly about CS grads attaching themselves to "AI" enterprises and convincing themselves they're "experts" without putting in the due diligence to understand the unifying math under the hood. They end up doing stupid things with their models, completely ruining their business proposition through their lack of understanding of how the ML theory makes the implementation make sense. And then they don't even know how to appropriately judge the quality of the outputs from the reported metrics.

It's normies all the way down.


Are you confusing bootcamp and college? Or are you hiring people with only English degrees instead of CS degrees?

Sure, I graduated from college almost two decades ago but all those "can't"s were very much part of the standard CS curriculum then. You needed those skills to pass CS 101.

The new grads I work with now are fine on the terminal. They only complain that they didn't learn how to use more "advanced" features of debuggers and IDEs in college. Exactly the opposite.


> I find a version of this even with fresh out of college programmers.

> They just know how to type Java on an IDE, maybe some SQL (using some graphical front end), some HTML/CSS. They can't operate in a Linux dev environment at all, can't use the command line or exit vi or perform any kind of simple shell based automation or use git without a plugin for their IDE.

> And they also are reluctant to learn

Sounds like you are hiring wrong. The point of a good engineering degree is to teach someone how to learn. Any degree worth something will include non-trivial coursework and projects, so they should already be familiar with some of the tooling.


I know plenty of fresh-out-of-college developers who absolutely DO NOT FIT that profile. They are very proficient with shell and linux, hate IDEs, are curious about all sorts of technology because they are always finding better ways to do things, and aren't dogmatic when it comes to language paradigms.

Your problem might be where you get your fresh-out-of-college developers. You may get them from prestigious universities, but that says absolutely nothing about whether they actually love programming or are just in it for the promise of a fat paycheck.


Absolutely spot on. A very good friend of mine went through a bootcamp and very much views himself as essentially “blue collar” -- do the job, go home. It’s for money, not passion, not at all. There’s nothing wrong with that per se, but with that mentality typically comes a distinct lack of curiosity. He’ll only learn new things when directed to, and unfortunately that means that he learns little and not often. Over time, that makes the experience of being a software developer very painful indeed.

We were talking about a recent problem he was having, getting his IDE and build server and virtual environments talking to each other, and it became immediately clear to me that if he knew what an environment variable was, he’d have solved this in about 5 minutes. Instead he’d been stuck for days, kind of just flailing.

I also suggested spinning up a dev VM or container so that he can make snapshots, nuke it and start over, etc as he makes progress figuring it out. He flatly stated that that “wouldn’t work” for reasons he couldn’t articulate.

I am almost obsessively curious about how computers work, and that curiosity led me to Linux and a nearly 100%-CLI workflow. That more than any other skill has paid off, even if viewed completely dispassionately: at least half the tools you’ll be using will have been written from this same POV, where a CLI-centric workflow and Linux are both first-class citizens.

A lot of people just want to “write code” but seem to completely ignore the reality that that doesn’t happen in a vacuum. You’re using tools, and those tools very often require some understanding of the computer they’re running on.

There’s some meme floating around out there about how “easy is hard” -- exercising is hard, but enduring the effects of not exercising is harder, etc. Now that I’m on the other side of it, I absolutely believe that working in the command line, ideally on Linux (but macOS will do fine as well), is the best way to pick up all the “glue” knowledge that allows you to reason your way through just about any situation you find yourself in. We do ourselves quite a disservice by calling most of these skills those of “sysadmin” or “devops” or what have you -- to me it just seems like it’s “knowing computers,” and when you’re spending your days writing instructions for computers to perform, that can surely only be a good thing.


Depends on the country and uni. Here in CZ, best CS uni in country, first class is UNIX shell.


What kind of college grads don't know how to operate in a linux dev environment? That'd be an immediate no-hire from me.


At my first job out of college, I met a lot of junior developers who could sort of get by in a Unix CLI, but didn't know anything except the very basics and didn't much like it. I was a Linux hobbyist from the age of 12, so I thought that sucked. Since the company had a practice where developers could give their peers educational talks, I decided to demo some of the things I use to make my Unix command line environments feel really great to work in.

I found that pointing CLI-shy devs to a nicer shell designed with ease of use in mind (Fish) and some basic navigation tools like fzf and autojump got a good chunk of them genuinely excited to learn more about the command line and make their environments more comfy. I also reviewed the available in-terminal documentation tools including those they likely hadn't seen in school (e.g., tldr, bro-pages), and I think that helped make clear how efficient exploring the command line from the command line can be. (Nowadays I'd also direct them to explainshell.com and some other web resources, though I think emphasizing the 'in-band' stuff is important.)

The next day, and every now and then for some time, I saw them reading the docs, writing their own little plugins and scripts and aliases, customizing their prompts, and so on. I knew about it because they sometimes even called me over to show off!

It would have been a shame to blow them off because, not having seen how it can be delightful to use, they saw the terminal as something to be avoided. They were smart, capable people who just hadn't encountered a good enough pitch yet.

Once the environment is comfy, learning individual tools in it becomes a lot less intimidating and more appealing to people. But you have to show them how good it can be and offer them a decent starter kit. :)


The kind that went into the subject just because it pays well and are taught some kind of bimodal syllabus: Lots of heavy Math/CS stuff they forget after the test and then practical programming as defined by "industry", industry being the big consulting firms churning out Java.


This is exactly it. I'm involved with a lot of hiring for junior developers. It saddens me how many of them are trying to get into the field not because they're passionate about programming, interested in computers, or enjoy tinkering with digital tools. They're here because they're allured by big paychecks and "because everyone else at school is doing it."

There are, of course, exceptions, and those are the junior developers who excel the most, but they are few and far between.


Yup, and given that my entire career has been in startup environments, people who lack that passion and interest are usually not a good hire. Not knowing one's way around linux is a major red flag because it shows they lack curiosity for their chosen profession.


It’s more common than you think. I graduated in 2013 and can tell you there were minimal opportunities to learn how to operate a Linux command line (if you didn’t already want to). Most jobs will set you up with some IDE with the codebase preloaded and a big green compile button; it’s just easy to avoid, I guess.

I’ve been using Linux since 2008 and it’s the most high-yield skill set I have, it makes everything simpler and I encourage others to learn it.


It's crazy how fast it changed then. I graduated in 2002 and you literally could not leave without knowing how to use Unix. When I started in '98 the school's email system had to be checked through terminals and my dorm hadn't even been wired with Ethernet, although we had it by the second half of my freshman year.


I believe that. Others may have had a different experience to mine. I was always searching for a class that would give me the “real” Linux experience but nothing really exists like that, I think you just have to live with the tools for a bit.


If you can't onboard a fresh grad into a usable state with version control, a toolchain, and an editor in 1-2 weeks, you shouldn't be hiring fresh grads. Expecting Linux familiarity fresh out of college is just plain gatekeeping.


Even when I graduated in 2004 there were lots of people there for the money side of it rather than much interest in programming / computing itself. I would say maybe 1/4 or less were taking more interest.


Most lol. You only hiring MIT and Stanford grads?


On my first day at a frankly pretty shit Australian university, we had mandatory Linux and Solaris training. I managed to skip it because the IT guy turned his screen to me and said "What's this?" and I replied "Dunno, looks like Fedora Core", but anyone who couldn't do that sat through several hours of training on SSHing into things and using bash, after which they were expected to turn in assignments that would compile in a remote Solaris environment.

This story doesn't have any real point, except maybe to say that knowing your way around a shell is up to the arbitrary whims of your university's administration and lecturers.


And maybe it awakens them to the fact that there is more out there than the abstract "what i run at home" environments available.


Stanford ain’t gonna teach you Unix either. I think they at least teach git usage in the intro classes now, but certainly did not while I was there. Wasn’t a CS major but took a number of classes.

The only class I took that touched on anything practical for the working world was “programming for scientists and engineers”, CME211 I think? And most people only took it because it counted for grad level math credits.


I graduated in a third world country. We did have a course or two on Unix.


I was taught how to get around a UNIX box in CS 101 at a state school.


Purdue CS in the mid/late 1990s. All coursework, if it wasn’t handwritten on paper, was developed and submitted with your account on the department UNIX cluster.

I switched majors so didn’t have any freshman courses; I certainly hope they taught people how to use the system. You couldn’t pass a class, let alone graduate, without being able to use UNIX.

There were a few side benefits to this setup. First, it allowed you to use the Sun pizza boxes in the engineering admin building, which were insanely powerful (back then). Second, it meant you could use any computer lab on campus, because they all had terminal emulators installed. So while the average student was waiting in line for an open Windows PC, you could go down the hall and sit down at a Mac in a lab that had maybe two other people in it. Third, you could submit all your print jobs to the service center in the basement of the math building, where they’d collate, staple, do whatever you wanted and then put it in a mailbox for you to collect at your own convenience, rather than deal with the clown show at the printers in all the Windows computer labs.

Good times


Oh, hey, IU in the mid 1990s here. We were doing Scheme on DEC ULTRIX boxes. (Which is also how I ended up being an emacs user. Oh well. Nobody's perfect.)

Good times indeed!


I went to a state school, 75% of the CS class used linux, 20% used OSX and maybe 5% used windows.


Stanford or MIT or UWaterloo


Bah. I took community college courses in high school and the basic bitch introduction C++ course at that community college had the students using linux in the lab. The second-rate university I ended up going to after high school required CS students to learn either Vim or Emacs to pass one of the required introduction courses, and all assignments had to compile and run on linux, otherwise the TAs would reject them.

As far as I've seen, basic linux skills are taught if not required in the CS programs of colleges of any reputation or stature.


Umm, most professional programmers probably don't know how to operate in a linux dev environment bro...

That is also an extremely stupid requirement. You can be a good programmer (especially as a junior) without knowing all the linux command-line tools.


I haven't met a good professional programmer working in webdev who didn't know their way around linux. I've met plenty of bad programmers who got into this field for money, didn't know their way around unix, were barely able to do FizzBuzz and would be better off doing something else. I haven't hired them though. Not knowing one's way around linux is a sign of lack of curiosity and passion for the topic.

It's obviously different in some fields (like gamedev) where the code produced by the dev is not supposed to run in a linux environment.


> Not knowing one's way around linux is a sign of lack of curiosity and passion for the topic

Complete bullshit. The only thing it's a sign of is that you're not curious or passionate about linux.

Knowing about and being interested in linux may be a positive signal, but the opposite is not a negative signal.

Knowing linux is not a requirement to be a good professional programmer working in webdev.


Exactly this. I feel like a lot of operating system enthusiasts forget that the operating system is, to many people, merely another tool to enable them to do something else. You can absolutely be enthusiastic about achieving an end result and not give a damn about the tools that got you there.


I work in gamedev and all our servers run on Linux. There's a world of difference between knowing your target environment and knowing a laundry list of shell commands that you can pipe together. "Knowing Linux" falls into the second camp in my experience


I was under the impression all the cool kids used MacBooks these days.

And electron, don’t forget about electron.


this is the most common. i'd say most people i've worked with start caring to learn the shell and dig into the os more around year 8 of their career


> i'd say most people i've worked with start caring to learn the shell and dig into the os more around year 8 of their career

That sounds tragically late to me. It represents a serious failure on the part of those who know and love the command line to make it accessible to newcomers, imo.


nah it just shows how the valuable abstractions have moved up. i bet this 8 year mark will only get later over time until it disappears


Level of abstraction is largely orthogonal to the GUI/CLI divide, though.

nmcli is more abstract than ip+iw+wpa_supplicant, but it's no more or less abstract than any GUI network configuration widget.

The output of ls is no more or less abstract than the contents of a directory as displayed by a GUI file manager. It's the same metaphor and the same information displayed in two different formats.


As for markdown, I showed it to my girlfriend and it is now what she uses for everything. She is a designer and loves that with it she does not have to bother with design, just with markup. She hates Microsoft Word with a passion.

People who used formatting in any 2000s internet forum know in principle how to use markdown.

That aside, I see it as a problem that tech people fail to anticipate the needs of their peers. I know people like that, who dare to say: "Oh you just need to reverse proxy the port on your server" and fail to realize the person opposite has a face that seems to be assembled purely out of question marks.

Explaining to someone what a reverse proxy is involves explaining:

- what a server actually is (confusingly both the machine and the software)

- how/that web clients communicate with that server (HTTP)

- that on that machine there is software running that communicates on a local loopback port (127.0.0.1:7777)

- that a webserver can be configured to grab normal traffic coming to port 80 (http) and 443 (https) and forward it to that loopback port

- that this is called a reverse proxy

Most people are not idiots and will understand an explanation like this. Even if they don't, they will be happy that you gave a coherent explanation.

I just tend to remember myself and how unclear everything was to me when I was a teenager. What explanation would that former me appreciate? Sometimes the most important thing to say is something extremely obvious that helps the person conceptualize the whole thing (e.g. clients communicate with a server and a server has ports).
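
(And if the person wants to see it rather than hear it, the whole idea fits in a dozen lines. A minimal sketch in Node/TypeScript, assuming, as above, that the local software listens on 127.0.0.1:7777 -- a real setup would typically use nginx or similar and handle errors and HTTPS:)

    // reverse-proxy.ts -- a minimal sketch, not production code. It accepts
    // normal web traffic and forwards every request to the software that is
    // only listening on the local loopback port.
    import * as http from "node:http";

    http
      .createServer((clientReq, clientRes) => {
        // Forward the incoming request to the app on 127.0.0.1:7777...
        const upstream = http.request(
          {
            host: "127.0.0.1",
            port: 7777,
            path: clientReq.url,
            method: clientReq.method,
            headers: clientReq.headers,
          },
          (upstreamRes) => {
            // ...and relay the app's response back to the original client.
            clientRes.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
            upstreamRes.pipe(clientRes);
          }
        );
        clientReq.pipe(upstream);
      })
      .listen(80); // where normal HTTP traffic arrives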


> I see it as a problem that tech people fail to anticipate the needs of their [non tech] peers

Highly agreed. This seems to be common among all programmers, but most prominent in products that have little input from non-programmers (e.g. open source projects).

Most people would understand your explanation ... given the effort. But they don't care, they just want it to work. And that sometimes involves going with an expensive proprietary solution over a FOSS one because the first screen provides a better introduction.

"Do I look like I know what a Jpeg is?".


My sister never even owned a computer before starting uni at 22 after 6 years as a cook.

I got her hooked on markdown for note-taking, I just had to find the right editor to convince her to let go of the few hours of Word she 'learned' at school.

Thanks to all the Joplin SWEs, managers and designers. Great product.

And BTW, I still think org-mode is overall better, but sometimes worse is better, and to me Markdown won.


This isn't a programming problem, or exclusive to our field. Go talk to any expert in their field about something in that field, and you'll have the same reaction.

I tried my hand at blacksmithing, and had to learn (what seemed like) a lot of metallurgy before half of what my mentors were saying made any sense at all.

Sometimes it's on the expert to make it clear to the non-expert. Other times it's on the newbie to learn the basics so they understand the answer to their question.


> Sometimes it's on the expert to make it clear to the non-expert. Other times it's on the newbie to learn the basics so they understand the answer to their question.

I think most of the time it is on the expert to be clear, because as an expert you should know what complexity your expertise involves, while this cannot be expected of the non-expert. That being said, it is totally okay as an expert to answer in the sense of "that involves a lot of things that you need to know beforehand" and then give them a hint where they can learn those things and ask them to come back when they did their homework.

What does not help anyone is if you just answer the question without giving context. Great, technically you were right, but you are also the only person who can tell that it was.

Communicating clearly is a separate skill. Not every expert has that skill. But every expert could potentially profit from having that skill. If you have that skill it does not mean you now need to put in a lot of work in explaining things to everybody – but it means you can explain things if it is needed. People are surprisingly happy if someone manages to clear things up for them without looking down on them or going into teacher/preacher mode.


It's not exclusive to programming or "tech", but computers are so pervasive, and yet most folks' eyes glaze over at the simplest tasks.

There are lots of folks who don't know how to check their oil or check their tire pressure (or even whether they have a jack and a spare), but in the US we do have a general expectation that folks know how to do those things, so most folks learn.

With computers or "tech", there are almost no expectations that folks should be able to do any maintenance or troubleshooting, so most folks don't learn. This is why "Is it plugged in" has to be the first question.

Software developers don't always make it easier, but our expectations for users are also too low.


I want to point out two additional observations.

1. There are a decent number of software engineers or programmers who literally came of age at the perfect time to organically learn these tools that later became fundamental. If you even touched a computer from the 60s to the late nineties in an engineering capacity at all, you were bound to have worked in a terminal, worked on computers with a single core, worked on computers with very little memory, had to either get comfortable with some lower-level tooling or build your own, at some point had to mess with networking or at least understand how packets were being sent and received, seen and gone through iterations of version control and saving your work, and automated tasks using shell scripts.

2. While there is a plethora of knowledge, videos, tutorials and flavors of ways to learn these things, the sheer volume and breadth presented to newcomers is daunting. Yes, you can learn Git, but there are so many ways to do it that it can cause analysis paralysis. Building on point (1), if you learned it early there were only a few ways it was being done or even shared in smaller communities. Too many choices or paths can lead to just using the way someone showed you, without digging one layer deeper, just because you might not know better.

All of those things you ‘caught’ by being at the right place at the right time are a privilege. Please don’t look down on people trying to aspire to learn or want to enter into this field that haven’t got there yet.

Coming from a family of immigrants and being the first person in my family to graduate college + enter SWE. I cannot count how many times other engineers were rude, made condescending remarks or discouraged me by shoving these expectations in my face as failures instead of taking the opportunity to teach (I was always open to learning).


I am the first to graduate college and to go into software. I learned a lot on my own before studying Computer Science, and still do after.

I can agree that people tend to quickly get condescending ("just google it") instead of giving keywords ("I think you are looking for X, Y, Z, try to read about them and come back when you have specific questions"). IMO the latter is constructive, the former is discouraging. The point here is: don't be condescending, try to give advice/keywords.

This said, many beginners think that their first project can be a complex application even though they don't know how to write code. The constructive comment for them is "start small, learn the language first", but of course many beginners don't take that as an answer. Beginners need to accept that learning software engineering takes time and dedication.

Finally, some devs tend to think that they are so smart that if they don't grasp something in 2 seconds, then that something is bad. I disagree with that. An experienced developer should know how to learn a new tool. If they don't like it, they can try another one, write their own (and see if a community follows them), or just accept that they don't have a choice. That's how it works.


I think there must be some sort of culture gap in play here. I've been told to "google it" more times than I could ever count and not once have I perceived a condescending intent. And similarly, I have told people to google things many times and not once did I mean it to be condescending.

Usually it means "I don't know the answer off-hand, but I know that I could find it if I googled it. Therefore I'm telling you how I would find the answer." I know such advice is useful, even though everybody knows that google exists already, because most times that I was on the receiving end of this response, googling it really was the solution for me and I just needed somebody to snap their fingers in my face and remind me that I know how to find the answer myself.

"google it" is usually useful advice when I am on the receiving end of it, so I don't think it's condescending at all. Ditto "RTFM". If I turn to my coworker and ask "Hey Jimbo, what's the flag for making GNU Tar do bzip2 compression?", maybe he knows the answer off the top of his head and tells me -j, but likely he doesn't know, knows that he could find out in about 5 seconds with the manpage, and tells me to check the manpage. Which is wholly fair; we both know how to search a manpage, so why should he do that on my behalf instead of reminding me that I can look it up myself? I don't perceive any condescension here.


> I've been told to "google it" more times than I could ever count and not once have I perceived a condescending intent.

Personally, when someone answers "google it", I read it as "fuck off". If you don't want to help me, just don't; no need to explicitly say you won't.

However, it's totally different if you say "you should read about concepts X, Y, Z", because that gives me keywords to search for.

> If I turn to my coworker and ask [...]

Well that's a bit different than asking a question to strangers online. Of course sometimes it makes sense to ask a coworker, though a quick search beforehand never hurts.


> This said, many beginners think that their first project can be a complex application even though they don't know how to write code. The constructive comment for them is "start small, learn the language first", but of course many beginners don't take that as an answer. Beginners need to accept that learning software engineering takes time and dedication.

That is just a phase, it is normal and nothing new even. I remember a lot of people of my generation whose idea of a first project was a single-player shooter with a storyline and an RPG-like skill tree. Obviously they all failed. But I also think that they did learn a lot by trying or just reading up on it.

I think that for a complete beginner, whatever is motivating and fun for you, you will learn a lot from it. Because motivating and fun means you will keep going and keep learning. That is better in the long run than a theoretically super-effective start that will just make you bored and unmotivated.


Literally no part of that post was condescending; it was an observation about how wildly differently those with a certain level of computer knowledge use computers compared to the general public. Which is interesting, since so much of the world runs on computers. Sure, your points are accurate, but, so what?


> Literally no part of that post was condescending

Sorry, did you read the same text?

>> I try and try to explain that this arcane system of monochrome text and rendering steps is ACTUALLY easier than editing in Microsoft Word, but my pleas fall on deaf ears.

This reminded me of how my mother never grasped how exactly programming a VCR works and often asked me to help with it, yet she installed a bunch of apps she was interested in on her smartphone (like social media, Pinterest and other BS) just fine and all by herself.


>> I try and try to explain that this arcane system of monochrome text and rendering steps is ACTUALLY easier than editing in Microsoft Word, but my pleas fall on deaf ears.

By characterizing the system as arcane, they are tacitly acknowledging that the system does not have a noob-friendly first appearance. This is empathetic, not condescending. Somebody who was condescending would not acknowledge the difficult nature of the thing they are suggesting.


No part of the comment you replied to implied the post was condescending, just seems to me to be a reminder of some things we often forget when this topic comes up :)


The point I expressed in the conclusion was meant to turn the apparent condescension from the earlier part around on the reader. I'm saying that we should be more mindful of those who don't know what we know.


I like to surround myself with people where learning, teaching and curiosity are the norm.

I know the feeling of being looked down on. And vice versa I’ve been just as guilty of doing so when I was younger.

Both fundamentally suck and don't lead anywhere.

Managing our ego is hard, but even just the attempt goes a very long way.


The problem is most people do not try to put any effort into understanding what they are doing at the computer.

When you ask your wife to fetch you from work at 5 you don't have to explain:

* That she needs to take a car,

* That there must be enough fuel in it,

* That she has to plan some time for the drive and add extra if traffic is expected on the way,

* That she needs to let you know if there are unforeseen circumstances which will likely mean she won't make it.

And yet for some reason, whenever the topic of doing anything at the computer comes up between us, I feel like I have to start from scratch.

"The printer does not work again. Can you help?"

"Is the printer on?"

"Can you check it, you are better at it"

Checking the printer... of course somebody turned it off.

I think there is a combination of laziness and probably something else which makes some people treat computers as not their problem. For the most part, the companies making hardware and software cater to these people, making the stuff they care about as intuitive as possible. They are fantastic when it comes to activities that make ads pop in front of your eyes.

The trouble is with everything else.


Non-specialists passively build detailed mental models through experience and associations, and when there's no consistency between implementations, there's no way for that to happen. The reason specialists understand computer usage is because they understand how computers function at a low level, and are trained to expect certain necessary features whether or not there are any indications of those features in the interface. It's not because we're more thorough or have a better attention to detail.

The physical existence of a car gives a lot of indications of the function of a car. People deal with computers through arbitrary idiosyncratic curated interfaces that for business purposes often plot against their users.


> Non-specialists passively build detailed mental models through experience and associations, and when there's no consistency between implementations, there's no way for that to happen.

That implies we're discussing situations where there is no consistency; IME, there is usually consistency but the individual is not _looking for consistency_, so they fail to see it.

> The reason specialists understand computer usage is because they understand how computers function at a low level,

In most situations, specialists look for consistency among unrelated tools, so an unknown tool or an unknown error prompts the individual to go to a known position to understand the unknown parts, e.g. clicking the "Help" menu or appending "--help" to a command, or looking for a log file or a configuration file, etc.

> The physical existence of a car gives a lot of indications of the function of a car.

Which function? Ignition? Acceleration? Gear shifting? Braking? Charging the battery? Refueling?

Sure, folks know that a car is used to travel from place to place, but you're flattening a lot into that -- most folks have near zero idea on maintenance, repair or troubleshooting, even though their car likely has a manual in the glove box and the internet is always on their phone.

> People deal with computers through arbitrary idiosyncratic curated interfaces that for business purposes often plot against their users.

Most functions of a car are hidden and the UI is inconsistent not just across manufacturers but across models and years.

Whatever we're discussing, there are folks with experience and specialized knowledge. There are also folks who lack that experience or specialized knowledge but do understand there is usually consistency across various systems and look for a known point to understand the unknown points.

There are also folks, most folks IME, who won't look for consistency, won't read a manual, won't try to learn what they're missing, and instead will throw their hands up and say "it won't work".


Your examples aren't equivalent. "Would you print out my boarding pass?" is equivalent to "would you pick me up from work?"

Whereas "the car won't start" is equivalent to "the printer isn't working."

Many people know how to diagnose and resolve simple car issues, like a dead battery, just as many people know how to fix a printer that isn't working.


The example the GP mentioned is checking if the printer is on. That should be no harder than checking if the light is on in your room.


Everything in the post can be explained by "People think the familiar is easy, and the novel is hard."

Judged by the standards of a programmer, yes, the suggested solutions are simpler. Judged by the non-technical coworker's standards, no, they are new and therefore scary and harder. You might say, "If they expended even 5 minutes of effort, they would not feel that way." But that's 5 minutes more than just using the thing they're used to.


The interesting part to me is, these "non-technical" coworkers aren't proposing simpler solutions. When reading "Why couldn't you just do this in something simple like Word?", Word isn't what I'd call a simple program from any angle.

It's not preinstalled on most computers and costs money, it won't run everywhere (running Word on a school-provided Chromebook is a world of fun), for decades it had version compatibility issues, and the current version has a UI that would make an airliner pilot feel right at home.

To me most people in the corporate environment are highly technical and have arcane knowledge of crazy complicated tools and workflows. Just not _our_ tools and workflows.

And yes, I also don't want to go buy an Office license, install Word, learn how to deal with the commenting options, what format I should save the file back, when some random person sends me a Word file. So we're all pretty alike I think.


> The interesting part to me is, these "non-technical" coworkers aren't proposing simpler solutions.

This.

I still keep wondering why that is. My two theories are that it is either marketing budgets or that there are many visual thinkers who benefit a lot from UIs.

If it is the former, then Markdown is actually better than Word and we should teach it in schools. But if the latter theory is correct, people are different and trying to teach everyone Markdown would be counter-productive.

Not knowing which theory is correct means I don't know what to advocate for.


> "It's not preinstalled on most computers and costs money, it won't run everywhere"

It runs in a web browser and doesn't cost money

https://www.microsoft.com/en-gb/microsoft-365/free-office-on...


the bit after the part you quoted > (running Word on a school provided Chromebook is a world of fun)

Many orgs work on Google accounts, so asking for a microsoft account is an additional hurdle. It also probably requires I upload the files to my OneDrive. I had a look at the interface and had no idea how to just open a document I have locally.

I get your point that it's totally doable (and also free up to some point; the OneDrive space is the limit?). I'm not convinced it's simple though.


This is the case even among highly technical and otherwise competent programmers. Junior developers especially will spend days reinventing wheels in whatever hot new language or framework when a better solution can be had in 5 minutes of reading a manpage. There seems to be nothing I can do to convince these devs of the errors of their ways.


This is different though. When junior developers reinvent wheels they are learning. The challenge here is to not put the results into production but still make it feel impactful.


Sounds like they are still thinking in "honor code" where not reinventing the wheel is cheating.


Something I have noticed across the technical/non-technical divide is that the concepts of simplicity and difficulty are easily conflated.

Often non-technical folk are only interested in "easy" solutions.

Technical folks are more inclined to pick simple solutions.

The perception of simple being hard mostly comes from simple often eliding things like a GUI which is inherently complex and overkill for a good number of tasks. However users have been conditioned to think mouse == easy and keyboard == hard.


Reminds me of Rich Hickey’s “Simple made Easy” talk. Quote:

> So the first word is simple. And the roots of this word are sim and plex, and that means one fold or one braid or twist. …

> The other word we frequently use interchangeably with simple is the word easy. … [T]hat is from the Latin word that is the root of adjacent, which means to lie near and to be nearby. And the opposite is hard. Of course, the root of hard has nothing to do with lying near.

Engineers like simple because it’s often easier to understand and debug. Non-techies like simple things too if they’re easy to pick up. But if it’s not easy to intuit just by looking at something, then complex solutions start becoming easier because they’re more ready at hand.


I guess it depends on your ideas of simple vs. easy. I find a GUI simple because I can explore it to see the options if I’m a little lost. The Unix command line is exceedingly difficult to get useful information out of if you don’t already know how it works. Need help? Don’t type “help” because it will just stare back at you blankly. The commands themselves are not easy. Each one takes flags, mostly a single letter, and the same flags have different meanings in different commands. Cmd-P is print in every goddamned GUI app I’ve ever used. Cmd-Q is quit. Cmd-C is copy. That’s simple to me. Having to memorize a bunch of flags with conflicting meanings or key commands to change modes in an app is not simple to me.


Tech folks tend to over-engineer the solution: https://xkcd.com/1319/


> "Why does it have to be so complicated? I just want to install a program"

> "Why would you do that in the command line? It's way easier using $Program"

A concerning observation that’s slowly dawning on me is that more and more programmers don’t know how computers work. They can write code, build software, and do lots of useful things. But they have no idea how computers work. They’re more akin to lusers as we used to call them than they are to hackers of old.

Fantastic at their specialty and the tools they use. But move a button to an unfamiliar place or throw them into a new (but fundamentally same) environment and they’re lost.

The realization came a few weeks ago when someone shared The Missing Semester of Comp Sci on HN. It’s full of basic things you’d expect any programmer to somehow magically know … but they don’t learn this anymore. https://missing.csail.mit.edu/

Seeing that link shared connected the dots in my mind. I’ve been wondering for months ”Why does everyone at work have so many random local environment issues all the time?” … it’s been working fine for me for years. Same code and hardware. ¯\_(ツ)_/¯

Maybe I’m just getting old and grumpy. I’m sure folks older than me thought “wow kids these days know nothing” when we came on the scene …


> The realization came a few weeks ago when someone shared The Missing Semester of Comp Sci on HN. It’s full of basic things you’d expect any programmer to somehow magically know … but they don’t learn this anymore. https://missing.csail.mit.edu/

I do feel somewhat jealous though that these resources are now available for students to learn in a structured, borderline spoon-fed way when this stuff took me a number of years and hacking around to build up and gain muscle memory over. Still, I think the knowledge you struggle to learn yourself sticks around a lot longer than knowledge that was fed to you from school. :shrug: I could see it either way.


I wanted to add that I wish installing Linux from scratch were a formal requirement for CS students.

This would help them understand how OSes work under the hood, from squashfs and bootloader right up to a graphical GNOME install.

You don't have to do it all the time. Just once, to understand what you are using all the time.

I think that a huge difference between generations is also the appreciation of "how easy" things got to be, because tooling has come so far.

I mean, look how easy Golang's cross-compilation pipeline got... and compare that to 90s C development, where you had to recompile libraries on your target machine all the time because linkers were glitchy and incompatible, and header-only libraries weren't a thing yet.


This is a sentiment that is repeated with every new generation of programmers. I heard it when I joined the industry. I’m sure many before and after me did as well.

Hard to say if it's any truer now than it was back then, but the kind of work we do and the tools we use have certainly changed over time.


Complaining about kids these days is a rite of passage for all people over the hill, going back to the dawn of recorded history.

Except back then it was elders complaining about these newfangled books or scrolls letting the youths be lazy and not memorizing like they should.


Nice that it covers version control.

I graduated college without ever running a version control command, except for copy-pasted CVS commands to download software from sourceforge.


It's a common sentiment that others don't know how computers work. I'm sure you personally are an exception, but I think that most people with this sentiment live in glass houses and don't realize how little they understand about how computers work. Since I don't know how they work, I can't provide comprehensive examples, but I remember being surprised by details around dynamic linking and relocatable binaries, around how vfs maps to more specific implementations, the implementation of hard disk drive firmware, the on-drive caches and the firmware that runs that, the implementation in terms of magnetic polarity for storing bits, and many more layers and details besides that I have yet to discover.


No of course I don’t know everything and there’s lots I used to know but have forgotten.

But I do feel fortunate that I got to see a lot of the modern abstractions come into being, so I have a vague idea of what they're abstracting and why. That helps a lot.


If you look at the learning curve of a lot of things that programmers favor over "normal people", they tend to have a thing in common, which is that they have some sort of step right at the beginning. Markdown, for instance, you have to learn the syntax, and then learn something to do with it in most cases (e.g., put it out on your blog). It's fairly easy after that in most ways, but there's that step at the beginning before you can do anything.

Most people receive the training where opening up Word is not difficult, and then once they've done that, there's no step up. They press "a", an "a" appears on the screen. They have buttons that do things like italicize. The first few steps are easy.

It may be the case that after that there is a brick wall in the difficulty curve, whereas the programmer-favored solution is a smooth easy curve for quite a ways. But that first step is sufficient to block most people, and people are often happy to just give up in the face of the brick wall.

This is not unique to programmers. A lot of the "maker" disciplines have a similar step curve. A lot of times it's just that first step that is the hardest overall. But it's enough.


I think this is apt, but another factor is that people don't want to put in up-front effort to learn a tool (rather than make progress on their actual goal) unless they have confidence that the effort will actually be rewarded.

Most people's use of technology is heavily punctuated with inscrutable, unfixable issues. If that's your experience, are you going to want to use a tool that you have to invest time and effort in before actually using it productively for the first time?


Slight tangent from the original post:

I wish "normal" people were more tech-literate. I don't think they realize just how much of their autonomy they've given up by choosing to use Microsoft Products/Reddit/Twitter/Spotify/Facebook/TikTok/Apple Products/ etc.

We're at a point where now the average person depends on several layers of technology that they can't even conceive of, and often the companies building that technology don't share their political interests. But then the average person expresses frustration at having to use open-source alternatives (which is completely understandable when the open-source alternative is definitively worse!).

Basically, for me not understanding how technology works when it's so central to our everyday lives just seems baffling. I suppose someone could say the exact same thing to me about not understanding how my car works, or how to grow my own food. Maybe technology seems easier to learn than agriculture? But then of course it's easier for me, I'm a professional programmer. So I try to have a lot of empathy for everyone who doesn't understand technology, especially my mom when she doesn't know how to convert a PDF.


The reason is we only have so little time and yet so much to do.

For the vast majority of people, understanding technology is simply not worth their time. They have other, more important things to worry about.

You don't know nor care to understand how telephones and cars and electricity work. Nor how and where the food in the supermarket comes from. Water comes out of the faucet because faucets dispense water. You have more important things to worry about, like computer technology.

Much like those aforementioned things are merely a means to an end for you, so too is technology merely a means to an end for most people; their ends lie elsewhere.

Or to TL;DR: Life's short, ain't nobody got time for computer.


I'm a SWE by trade, but I'm trying to start a business with my spouse that is, traditionally, 100% not dependent on computers. Every time my brain starts to go toward the path of _lets improve this process with technology_, I do a little research, fiddle with code, and find that the solution I think I want would be a few days work minimum. But none of that work actually addresses the IRL work that needs to get done in the meantime. The opportunity cost of introducing technology, even coming from a technologist with a quick-learning partner, is still too high.

Instead, we get the job done with a whiteboard, lists (digital and paper), shared calendars, and a couple Google Docs / Spreadsheets. Post-its, markers, blank labels… office supplies! Incredible technology of the 20th century! Try it!

I was initially frustrated, like this author and other posters, that there's so much intermediary technology to learn to get anything done with. OMG — even getting a point of sale app going on our Very Capable Pocket Computers in 2023 CE requires Xcode and staring blankly at docs on Stripe that appear flagrantly wrong.

I've seen this pattern at organizations too, where the thing that will clearly fix our problem is actually many tens of thousands of dollars and requires dedicated resources to use, and just slogging through the problem and being scrappy is ultimately _most_ efficient for the company.

The good news is technology really has made a lot of humanity more productive, and the average adult uses quite a lot of it to get their everyday job done. But there's only so much time and resources. Technology we're already super familiar with gets a lot of the job done already. Making portable and sharable lists, communicating about dates and times, traveling many miles or zero miles to see someone's face — that is advanced civilization shit, and we do it all the time already.


There's a (few) meta-level(s) for this. They should recognize the value of, and accordingly advocate for, having more "hackable/discoverable/understandable tech environments/contexts". And not necessarily for themselves.

The post opens with the classic just brew install blah, and immediately runs into the problem of people not having brew (or some other package manager) preinstalled and ready on their computers. It's the walled garden problem.

And it's not a trivial problem. And even this meta-level argument is not necessarily straightforward, because it comes with serious trade-offs.

And... unfortunately, even one level up, where it should be straightforward to see how a walled garden monopoly is bad (as the consensus on rent-seeking being bad is pretty strong in economics), people mostly don't care about this issue, because they don't connect it to the value of "more power user freedom".

People start to see the value of the right to repair tractors, people see the value of the first sale doctrine, people see themselves as temporarily embarrassed rich gentlefolk, but they are afraid to see themselves as just a few commands away from being a power user.


>People start to see the value of the right to repair tractors, people see the value of the first sale doctrine, people see themselves as temporarily embarrassed rich gentlefolk, but they are afraid to see themselves as just a few commands away from being a power user.

The dissonance stems from the fact that most people don't care about being power users.

Users (most people) don't care for messing with their computers because that's not what they're at the computer for. They want to do something on the computer, and the only thing they care about is whether they can get it done.

It's like how most people drive cars to get from Point A to Point B, without giving a damn how a car works or how they could tune the car. If it gets them to Point B then nothing else matters, and tuning their car is a roadblock to getting to Point B because they aren't getting any closer to Point B while tuning their car.

Power users treat the computer as an end in itself; users treat it as a means to an end. More dials and knobs on their computer are irrelevant, even prohibitive, for users. Most of us here are power users, and it's natural to want more people to be like us, but the reality is we are not the majority of people on the planet.

As an aside, most of us also gloss over how computers work without a care, even as we preach about how the common man must understand the complexities. I doubt most tech bros and neckbeards would understand electrical engineering even as they preach that everyone must understand bash and PowerShell.


I don't think we're talking about tuning or power users -- this is the equivalent of folks who don't know how to check their engine oil or their tire pressure.

Someone doesn't care how their car works _until their car doesn't work_. Or they want to change something relatively minor:

"Hey I just bought these awesome tires. Please put them on the car."

"Well, those are too big for your rims so you'd need new rims. And I'm not sure if your car can handle larger rims..."

"Why are you making this so difficult?"


Oh, that's a nice car analogy! (It'd be a shame if something were to happen to it :D)

... but on a bit more serious note, yes, this is a genuinely hard question of how to provide an "informed user experience". We all know that using phones, GPS, 4G networks, Wikipedia, Google Translate, NFC payments, etc. makes us rely on them. We don't prepare for long trips, we don't even buy maps; nowadays I even forget to download the offline map.

All models are wrong, some are very useful. Picking the model that >> computers are mysterious and some apps "just work" and anything that doesn't work is "fuck that shit" << works for many people. (And if there's enough social pressure then suddenly even the most user-hostile UX becomes "just open that there, scroll for 10 minutes, tap on that small thing, fill out that form, wait until Zuckerberg personally approves in a few minutes, fill out those 30 CAPTCHAs, yes, type your bank account number there, it's just how it works, and sure you can see the funny dancing Asian girls!".)

And ... sometimes people put their cat in the microwave, and open every and all malware, and ...


The same can be said for education in general. Many people openly state that it's not worth their time. But most of us would agree that "life's short, ain't nobody got time for education" approach would be devastating for society.

The problem with technology is that people are using it everywhere without having any clue of what it does, let alone how it works. Common sense stops working the moment anything tech becomes involved. Businesses and governments have clearly noticed, and are actively exploiting it. Invasive tracking, DRM, illegal mass surveillance, or the constant encryption ban proposals might not have gone this far if people had a little more understanding of computers.


The thing is, Joe Average is going to reply "So what?", and he has a point: invasive tracking, DRM, illegal mass surveillance, banning encryption - literally none of that affects whether he can do whatever it is he wants on his computer.

Those things and more negatively affect us as power users, but those things are all complete non-issues for the common man and consequently not worth their time to care.


I'm sorry, what? There's a lot to unpack here. You think basic human rights and equality are only relevant for computer power users? That the erosion of privacy or due process isn't an issue for ordinary people? That people aren't affected by security issues unless they have a software engineering degree? That anti-consumer practices don't affect consumers?

I'm not buying any of that.


Joe Average doesn't care whether his communiques are encrypted, Joe probably doesn't even know (nor care) what "encrypted" means. All Joe cares about is whether he can talk with whoever's on the other side: Their spouse, a friend, a business acquaintance, whoever.

Joe Average doesn't care if he's the subject of invasive tracking, Joe probably doesn't even know (nor care) what "tracking" means. Can he read Twitter and Facebook? Can he watch Youtube? Can he shop on Amazon? Yes? That's all that matters.

Joe Average doesn't care about DRM, Joe probably doesn't even know (nor care) what "DRM" stands for. All Joe cares about is whether he can listen to the album he bought, watch the movie he bought, play the game he bought. Can he? Yes? That's all that matters.

Joe Average doesn't care about illegal surveillance, Joe probably doesn't even know (nor care) where the cameras even are. Does the surveillance get in the way of doing whatever he wants to do? No? That's all that matters.

Basically: Life is short, we only have so much time and so much to do. We straight up don't have the time to give an individual shit to each and every thing in our lives.

For most people, a computer is an appliance; it serves a purpose and there is next to no need nor desire to devote any additional time or thought to it. We as power users must respect that reality even if we don't necessarily subscribe to that world view.


It sounds more like you've mistaken the average Joe for the obedient, privileged, and unempathetic Joe. You're making a blanket claim that any social problem that involves a technological solution is insignificant, be it about human rights, democracy, or fraud and abuse. That defense of tyranny and oppression can hardly be framed as respecting others' viewpoints.


> not understanding how technology works when it's so central to our everyday lives just seems baffling

I don’t know. I appreciate the sentiment and would also like more tech literacy, but there are lots of areas central to our lives and I think it’s unrealistic to expect a good understanding across them all.

I think it’s on us to be able to adequately communicate and understand the needs of those that ask us for help.

Just as you would want a doctor to explain what your condition is in a way you would understand.

Having buildings to live in is pretty central to most of our lives too, and we have some knowledge and intuition of what looks safe and can be used, but most of us leave the engineering to someone more in the know.

Farming too: I’d rather people in the industry work on maximising yield to feed us all, with tools made and tailored for them.

I think when people go for the quickest solutions for them in tech (e.g. MS 365, Spotify, John Deere, ABC MRI Scanner, etc.), I’d rather have them focused on the details of their trade than learning how to use a terminal, jailbreak a tractor or replace a magnet in an MRI scanner.

Those that want to, great, but I think most don’t have the time or interest to take on that kind of deeper learning.


The relevant xkcd: https://xkcd.com/2501/

Markdown is incredibly accessible to people who know how to drag files around on a file system and open them with a click, and every laptop/desktop operating system includes a fast and reliable text editor.

But many people don't have one of these "real" operating systems. Most non-technical people I know would need to build the mental models of having "files" existing on a "file system" before they could even begin to create, organise and edit text files. Even as I describe this I wonder if I am parroting the xkcd...


"Normal" people see other people as technologies - you sort of give your problem to the person, maybe mix some money in and then the problem goes away. This is how the relationship with people like lawyers, doctors, salespeople, etc, etc work.

They know that engineers (and programmers) are in the same rough class as all the other professionals. These conversations showcase the confusion as they try to figure out what problem programmers are there to solve. Other engineering disciplines also suffer from this - engineering is a subtle art. Nobody knows what we actually do except the rest of the brotherhood :(.

The first example is a really good showcase. They really don't want to be walked through how to solve their problem. The point of the specialist is supposed to be that they solve the problem in exchange for payment. They don't understand that programmers are really all about adjusting the cost of taking action rather than acting directly.


In the kindest way possible: this is a truly dark Ayn Rand view of society, one that fails to understand that relationships are far more complex than goods and services - they run on empathy. Someone is asking for help on their terms. That means putting yourself in their shoes, understanding, and empathizing. Here is the secret:

You are a normal person, every "professional" is.

Just because you are an engineer doesn't make you some other kind of being. I say that as an engineer. Go work a day with a drywaller hanging a complex high ceiling with dormers. They are using the exact same math you are. Chances are they are doing it in their head faster than you; they have created a system for everything they do in the exact way you write a function. They are working within a regulated environment and their work is reviewed by others, new tools come along, etc.

Chances are, you may ask that drywaller how to fix the damage you got at home next to a door frame. Surprise: turns out your drywaller only does drywall, not finishing, paint, or trim. But they know where they sit in the world. They will tell you a solution, in whole, focused on how a normal person should fix it. They won't tell you to go get their tools, or to replace the whole panel, or such. They will meet you where you are. That is empathy. It is understanding that your skills and solutions don't transfer to others, and that having them doesn't make you some "better" being - in fact you're just as limited to your own ways of doing things.

If an engineer can't lift themselves out of engineering to help others..they are more limited as a human, not less.


In this case the reason "normal" is in quotes is because the article starts with "It feels sometimes like programmers and normal people speak a totally different language and use computers in totally different ways."


My two cents as an aspie programmer: we think too fast.

Programs like Microsoft Word are amazing, even Richard Stallman himself has praised it. So there's nothing wrong with Microsoft Word, but some people just feel like a GUI is a bottleneck.

I'm speaking for myself here first and foremost but I know there are literally dozens of us. ;)

Why choose markdown over Word? Why choose CLI over GUI? Why choose Linux over Windows? Because it's just quicker, it allows me to get my ideas out faster than the alternative.

But I'm not saying we don't have some work to do on inter-personal relations and how to deal with end users and clients. We definitely do and I work on that every day.

I think it comes down to memory capacity. If you can remember several languages, or command line syntaxes, or programming languages, then you can use that to get the same results as any GUI program, much quicker.


Similarly, many non-programmers can do tasks that some programmers must hire a professional for (like repair drywall, install a ceiling fan, basic car maintenance, etc).

Empathy is one of those skills that is as, or more, valuable than any particular programming language.


This is probably one of the most on-point comments here. The author of the blog seems... rather insufferable... to the point that even they know it.

Life is short but long enough to become specialized in deep wells and to have cursory knowledge in others. Understanding that what is easy to you is daunting to others is critical to knowing the places you are daunted by. I'm an engineer, I will be the first one to tell you that I can't tell you by memory every dev pattern known to man. But I also empathise with normal people. People just want normal, consistent solutions that work without needing to learn unrelated details.

To a non-dev type, telling a person to run a program in a terminal after updating their package repos is like telling them to till their garden with a backhoe. "First, learn to operate a backhoe, it's easy and so much faster. Then..." This means this user has a garage full of normal garden tools... and then a backhoe. To a normal gardener, who isn't on a construction crew, this obviously is nonsensical.

Helping people means helping them on their terms, not yours.

Side detail:

A few years back, the company I work for (B2C) held a poll in the Eng org that asked "What excites you most about your work?" There were poll options like "Solving hard technical problems", "Building great solutions for our customers", "Using new technologies", etc.

"Building great solutions" was dead last with less than 5% of votes. We now are a product led organization with a focus on customer experience and I would say 75% of everyone in Engineering were replaced over the past two years. We just shipped one of the first actually UX-focused products by the company. It is so brutally clear why we wouldn't make progress in the past. Every choice was self-centered on what tech was flashy and cool rather than actually shipping.


Empathy, but also boundaries.

If someone asks for help it's important to tell them that sure, we can help, but "IT support" tends to be a lot less like "help me assemble this furniture from IKEA", and a lot more like "sure, let's install the ceiling fan, but oh it turns out the ceiling is cursed and now we have pierced the heavens and for 30 days every midnight you'll be visited by the mailer daemon, also we need the address of your first born, just a mere formality, you know, no biggie".

People have to recognize that in many cases there are no easy solutions for "them", even if it's easy for "us", because it depends on their willingness/ability/capacity to learn new things, which in general is a hard thing in tech.


For sure, that's true of all relationships in life. I'm sure abuse of professional's time unrelated to the task being paid for is common in many fields. "Sir, your total is... But hey what about...."


As an intensely tech guy, I completely despise the tendency some engineers have to speak in ambigu-ese and tech-ese, making what the fr*eak they are talking about less reachable so as to establish themselves as "essential experts", when they could easily speak normal-ese with minimal effort and make the concepts reachable for colleagues.

BTW this can happen to any field, not just tech. It can happen with intellectuals, philosophers, medical doctors, blockchain lingo, AI lingo, devops lingo, telecom lingo....

Fortunately! Now we have AI, which is insanely good at summarizing and explaining anything like I'm 6 years old, so that little expert trick just got massively nuked and its days are numbered.


The AI also regularly misuses terms and gets things wrong


Not only that, it often lies, and does it in a "convincing" way.


So it's doing a pretty good job of imitating those "experts" then too


Not a pretty good job - a better job, because its (algebra-based) imitation, unlike these "experts", doesn't exclude expressing the same concepts at a different level of abstraction and authoring a coherent story with reasonably accurate analogies. Precisely all the things they omit.


The experts that I work with (PhDs conducting research in their fields) are easy to tell apart from impostors. Impostors can't accurately cite sources and almost never tell me that they don't know something. Essentially all of the actual experts that I personally work with both cite sources and tell me when they don't know something and sometimes even help me figure out how to find out.

ChatGPT regularly makes up citations out of thin air, including DOIs.


Which experts? People who are actually experts in their field?


These, and the fakers and impostors too.


I used to think that I find tech stuff easy because I'm used to it (and there's certainly some truth to that), but if that were all of it, then I would expect myself to behave like tech-illiterate people in domains that are foreign to me.

And yet this isn't so.

The reality is that people's problem-solving skills are absolutely atrocious. They don't even try to figure shit out. The maximum number of inferential steps the average person is willing (or able?) to make before throwing their hands up and exclaiming "this is impossible" seems to be somewhere between 0.5 and 2.

There seems to be just enough stuff you can do without having to ever really think for yourself that people are content with not understanding.

This is so far away from my own experience that I really struggle with understanding it. Obvious knowledge gaps and noticeable confusion are two of my biggest motivators. I cannot stand not understanding. It feels like giving up agency.

Wow. I got quite angry there for a second. I guess I need to examine that frustration a little more.


Yeah, I feel this. I recently added a .tool-versions file to a website repository (that non-programmers need to edit) and updated the README to say "install asdf (https://asdf-vm.com/) and then run `pnpm install`". And then I saw people try to follow the instructions and start needing to install Xcode, then Homebrew, etc., and it took forever. I think the right solution is to give them a 1-click cloud IDE with the repository loaded and a live preview running in the background.
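To spell out why it took forever: on a fresh Mac, "install asdf and run `pnpm install`" quietly expands to something like this (the nodejs plugin line is my guess at what the repo pins; the exact steps depend on the .tool-versions contents):

    xcode-select --install        # compiler toolchain, which Homebrew needs
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    brew install asdf             # the version manager itself
    asdf plugin add nodejs        # teach asdf about the pinned tool
    asdf install                  # reads .tool-versions, installs pinned versions
    npm install -g pnpm           # a package manager to run the package manager
    pnpm install                  # the one step the README actually mentions

Each line assumes the previous one worked, which is exactly where non-programmers get stranded.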


Cloud IDEs are a "simple, obvious, and wrong" answer to this well-known and widely felt problem. By this point, it's not an untested path; the benefits have either failed to materialize or are outweighed by the downsides. Fortunately, the downsides can be ameliorated, and it's a pretty simple change:

If you're comfortable with the idea of people doing cloud-backed development through their web browser, then go ahead and write your toolchain against ubiquitous browser APIs, but take out the "cloud" part: the tools should still run on the programmer's own machine, just like traditional toolchains. Browsers are quite capable of opening and running .html (or .htm or .xhtml) files that come from the user's disk instead of over HTTP. (Crucially, actually following this workflow should itself be optional; target the browser as a reliable baseline, sure, but if you're clever you can write a redistributable program that the browser will dutifully execute so long as it has the right file extension, and that also lets you enjoy the benefits of running it from the terminal if you're happiest doing things that way.)


There are some good points raised here - it does feel at times that development is getting further and further from the end-user.

And I find myself wondering if I’m beginning to serve technology rather than the other way around.


This reads to me like very blinkered & elitist thinking. Repetition makes things easy; we are good at what we practice. A normal computer user rarely needs to do the kind of task that might be more efficient in markdown, the command line, whatever, so they will be slower at it than they are with a graphical interface.

Normal computer users want programs with dropdown menus and they are faster with those than we are when we have to look up a command or function that we rarely use. I know people who can filter & pivot a complex dataset quicker in Excel than I can in SQL.

I think it is safe to say that most programmers occasionally encounter something they need to Google for and dig through some stack overflow posts to find the syntax. That would be necessary almost constantly for a non-programmer and it's just not an efficient use of time.


Since people are mentioning a lack of Linux knowledge: is there a good course on this?

Two things:

- I think it boils down to people not having 'poked around'. I grew up on DOS / early Windows, when messing about and learning commands and configuration files was needed to install and run games (or new hardware). New game, new challenge. That learning process was part of using a computer, to me. It gave me enough confidence and knowledge to 'try things' in later Windows or other software when things weren't working. To most people I'm a 'computer guy', yet when it comes to Linux / unix / mac I have no idea how to get around.

- Being overwhelmed / a debilitating alienness (for lack of a better word) when they don't have adjacent knowledge & confidence. It's not just the transfer of knowledge; often a comfort of familiarity is needed too. My wife was a basic computer 'user' who would panic or blank out when she got simple technical instructions or something went wrong. I quickly learned that it was more helpful to show her that we can just poke around, and that it's OK to not completely understand something. We talked about how to communicate computer problems and how to use Google effectively. She got over the 'deer in headlights' syndrome, and now actually welcomes any new computer topic - while working on an IT service desk!


I'm no Linux guru, one of those may be along shortly. I would say that as a Windows person you should just start using WSL for stuff and write down everything you learn as you come across new information. You'll naturally pick up the basics fairly quickly.

At some point it might be worth installing a distro on an old-ish laptop you may have lying around - Pop OS is the least problematic in my recent experience. Then actually use it.

You will accidentally and irretrievably trash everything numerous times; WSL lets you back up & restore distros easily, so do that. With full installs, just remember to keep copies of your config & data.
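(If I recall the WSL commands right, backup/restore is just this - distro name and paths are examples, substitute your own:)

    wsl --export Ubuntu C:\backups\ubuntu.tar                          # snapshot the whole distro to a tarball
    wsl --import Ubuntu-restored C:\wsl\restored C:\backups\ubuntu.tar # bring it back as a new distro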


There might be good courses, aaaaand there was a pretty decent hands-on series linked recently here (or on reddit, or ... somewhere! and I can't find it now, sorry :/ )

That said, as the sibling comment by sanitycheck mentioned, start using it. WSL2 is pretty okay.

Oracle is heavily promoting its cloud services by offering a pretty beefy free tier (24GB RAM, 4 vCPU aarch64, 200GB SSD, +1 IPv4 address).

I'd start with self-hosting a bunch of stuff. Just for fun. Whatever interests you. Try things with and without Docker. Write a Dockerfile, build it, run it. Write a systemd unit file, enable it, start it, stop it, disable it, mask it, etc.
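To give a flavour of that loop (image and service names here are placeholders, not anything real):

    docker build -t myapp .               # needs a Dockerfile in the current directory
    docker run --rm -p 8080:80 myapp      # expose container port 80 on host port 8080
    sudo systemctl enable myapp.service   # start at boot
    sudo systemctl start myapp.service
    systemctl status myapp.service        # is it actually running?
    sudo systemctl stop myapp.service
    sudo systemctl disable myapp.service
    sudo systemctl mask myapp.service     # refuse all starts until unmasked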

Buying a (second hand?) laptop and installing Linux on it instantly provides a ton of interesting challenges. Which distribution? Which version of the distribution? How to partition the disk? Encrypt my files? Only my home dir or full-disk encryption?

Try KDE, try Gnome, try Xfce. Try Ubuntu. Try Arch (because it's hype, also it's a rolling distro, also pacman - its package manager - is pretty nice) ... I don't like RedHat-based distros, but ... millions do. Who knows, maybe you'll like it. (Try Fedora.)

Okay, you set up your Linux desktop. For you it's definitely the Year of the Linux Desktop. Congratulations! You're irrevocably one of us now. (I'm typing this on a shitty Windows 10 laptop, while a few feet away a newer, but even worse, Win11 laptop sleeps, and ... I've been using Win10/11 on my big "dev" box for years (for playing 5+ year old games, obviously) ... but! I finally got fed up with BSODs and ~6 weeks ago installed Linux on it again ... then I moved and it's still in a box in storage :(((( )

Oh, yeah, this reminds me. Try Wine! Definitely try Lutris. (StarCraft II is free now, works great on Wine.)

... also, ask questions! (There's even a guide for asking questions! http://www.catb.org/~esr/faqs/smart-questions.html Oh, this reminds me of the HOWTOs. Back in the day there was this: https://tldp.org/HOWTO/HOWTO-INDEX/howtos.html :) before YouTube videos took over.)

That's it! Absolutely nothing else. No need to learn shell scripting, no need to open the terminal, no need to know just enough python to be dangerous. No need to know what's DKMS, nvidia drivers will just work. Everything just works! Bluetooth headsets and network printers too!

(ps. I lied, sorry!)


This used to bother me before, but then I realized, "this is why I get paid the big bucks". Now, I see my "unreasonable" requests as a sign of how much I've progressed in my field, even relative to some of my peers, who can't do basic devops.

It's easy to forget how many countless hours I spent learning how to Google for the right command line args, and picking up little tricks here and there to chip away at problems until I'd solved them. Take a deep breath and give everyone else, who can barely use their keyboard and mouse to do their work, a break, alright? Not everyone enjoys bashing their head against the wall to solve these things the "easy" way.


The author argues that markdown is the best way to write text. I know many people who argue that LaTeX is the best way to write text, and they also have arguments.

As a programmer I sometimes prefer the world of clickety-click GUIs to that of the command line. Doing something in a visual way requires no prior knowledge: you just discover how to do things as you do them. Using the command line requires me to learn paths, steps, command line arguments, the order of those arguments, and also to memorize them.

Sometimes I prefer the command line.

I know that for some folks VI or Emacs is better and more productive and faster, but for me Visual Studio is that and more.

Maybe there isn't one objective truth; for some people the command line and text editors are better, and for other folks the visual stuff is better.


LaTeX and Markdown have different use cases. You would not want to write a maths book in Markdown, and you would not want to write program documentation in LaTeX. Markdown can be read easily as pure text, but LaTeX is often not very easy to read.


It's a little depressing that after all of the work the industry has put into graphical interfaces and making software easy to use, the mark has been so widely missed. It seems to me that a big part of this post boils down to people frustrated (in some cases literally to tears) at their inability to make the machine do the thing they need done.

I don't have a solution, but I think we have two big problems: the rigidity of the GUI (it's difficult to combine operations, in some cases impossible) and the arbitrary gatekeeping of features for dollars (why can't the free Acrobat export PNG files?). For every tool like Excel, we have another "simple" tool that supports nearly only one possible workflow.


This can be a simple frequency-of-use issue. Non-programmers might need a command-line solution once or twice a year. It's hard to remember a series of steps that you don't frequently perform.

For example, you can theoretically know how to change a tire, or how to use Photoshop to replace a color in an image, or how to troubleshoot a printer. But you only run through that series of steps occasionally, when there's a specific problem to solve.


I wrote an essay once that was nominally about using PySimpleGUI to get folks to use your software, but is really about understanding ways to make your software accessible to people without your level of coding ability:

https://cushychicken.github.io/python-guis-for-heretics/

Might find it interesting.


Hey thanks, this is great.

I have a simple py script I am trying to deploy to a small team; it does some web scraping and slicing/dicing of data from internal repositories. I had been starting to learn PyQt, but it was really overkill for what I needed to do, which was a simple menu of options and input/output file selection.

A lot of HN devs don't realize this truth about working in industries that have not historically been software-focused:

It’s a brutal reality, but if you’re the sole person in your group who’s figured out enough to:

    Install Python, and use it to run command line scripts,
    Run Cygwin or the Windows Subsystem for Linux (WSL) on your work Windows machine,
    Get your hands on a Linux machine/VM for writing scripts,
…then you’re probably the most experienced software engineer in your group. If your coworkers aren’t software engineers, it’s a huge lift to get all of your coworkers to understand command lines, Unix systems, and virtual machines.


Can't recommend PySimpleGUI highly enough for this use case.

Hope it suits your needs!


Worked perfect, I had it doing something on-screen in 15 minutes, had it mostly mocked up in 30, and had it polished and pretty with error handling in an hour. Will make for a good demo on Monday.
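For anyone else with the same use case, the whole pattern is just a layout list plus an event loop - roughly this sketch (the keys and run_job are placeholders for whatever your script actually does):

    import PySimpleGUI as sg

    def run_job(src, dst):
        # placeholder for your actual scraping / slicing logic
        print(f"processing {src} -> {dst}")

    # one row of widgets per list entry; keys let us read values back out
    layout = [
        [sg.Text("Input file"), sg.Input(key="-IN-"), sg.FileBrowse()],
        [sg.Text("Output file"), sg.Input(key="-OUT-"), sg.SaveAs()],
        [sg.Button("Run"), sg.Button("Exit")],
    ]
    window = sg.Window("My tool", layout)

    while True:                        # the event loop
        event, values = window.read()  # blocks until a button press or close
        if event in (sg.WIN_CLOSED, "Exit"):
            break
        if event == "Run":
            run_job(values["-IN-"], values["-OUT-"])
    window.close()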


H*ck yeah. Get that bread, my dude.


This has the same energy as programmers who can't understand why other programmers won't use vim. "It's so simple, though!"


> I try and try to explain that this arcane system of monochrome text and rendering steps is ACTUALLY easier than editing in Microsoft Word, but my pleas fall on deaf ears

I know my way around Markdown, but I just don't like it. I'd rather directly see what I'm working on rather than a weird approximation.

(I don't like hidden or direct formatting either, which is why I do sometimes use Markdown)


The hidden bit is the key element. The author is way off their rocker. Markdown is easier than Word... for someone who has been using Markdown for 10 years. In Word you type words, highlight them, and then you have options right in front of you. Markdown is memorized formatting; it is bonkers to think people can easily just pick it up as a "use this instead" replacement, and ignorant to assume that memorizing it should be a priority in their lives over literally anything else.


I'm trying to decide if this is parody or not.

> I try and try to explain that this arcane system of monochrome text and rendering steps is ACTUALLY easier than editing in Microsoft Word, but my pleas fall on deaf ears

Easier for you, who probably has an auto-activating mode in your text editor that highlights Markdown. Not easier for someone whose only plaintext editing tool is WordPad. Not easier for someone who can hit command-i to turn italics on and off as easily and thoughtlessly as you type `*`.

> And of course it is a simple program, but whenever the first step is opening a terminal, it is no longer simple

To you, the terminal is a comfortable place. You have learnt all the arcane incantations, including the ways to make the terminal help you craft and remember them. To them it is a mystifying place, with little guidance, and the possibility of totally fucking up their computer if they type the wrong thing.

> I tell them to execute the simple and easy step of running $PACKAGE_MANAGER install $PROGRAM, and they don't want to do it.

And then you list examples of exactly how you have learnt all the arcane incantations. You have ten different package managers on your computer already, and know how to use them.
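To make the quoted "simple step" concrete, here's the same install across machines (pandoc standing in for $PROGRAM; exact package names vary by repo):

    brew install pandoc        # macOS, assuming Homebrew is already set up
    sudo apt install pandoc    # Debian/Ubuntu
    sudo dnf install pandoc    # Fedora
    sudo pacman -S pandoc      # Arch
    winget install pandoc      # recent Windows

One "easy" instruction, five different incantations, each presupposing you know which one applies to you.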

"Can you help with my printer" "no"

Fair enough, really. Drivers suck.

> What was the point of writing this post? I don't know. [...] Maybe I just need to have a little more empathy (or is it sympathy?) for normal people.

Yes.

Find someone who refuses to use your cryptic incantations, and find out what they're really good at. If you can find someone who's frighteningly good at a thing, who's been doing the thing at a professional level since before you were born, that's the best. If it's something you have no interest in doing yourself, that's even better.

Ask them their stories of the times they had to work with someone who was completely clueless about their craft. Ask them how they do something that looks impossibly complicated to you with next to no effort. Ask them to walk you through doing one of those things in enough detail for you to do it. How immensely tedious does this start to sound? How many times do they use a word you've never heard, or a word you're familiar with in a way that pretty clearly has a very specific meaning within their craft, not the same as normal usage? What tools do they refer to; how many of them do you have casually sitting around your life, and know how to use with even the vaguest competence?


Linux sysadmin here. I’ve come across plenty of documentation, I’m almost certain was written by a dev, where it’s assumed I know the correct command line usage or concepts. I know a lot more than the average end user, but I’m _not_ a dev, either.

And please… (to those writing docs) always include examples. So many man pages lack examples; it can be very frustrating.


It's interesting how deficient designs get promoted as better alternatives just because the author has gotten used to their ineffectiveness.

Like the CLI example: "explain how simple it is to just convert -density 300 input.pdf -resize 50% output.png, and I'm met with a blank stare".

Why would an unsuspecting user ever need to remember (or worse, look up) that density is "-density" and that resize requires an explicit %, when a GUI app can show the most common flags right there, with in-band help, and validate your input (was there a typo in a file name, and inupt.pdf doesn't exist?) in real time?

Or the markdown example with tables, which are especially painful to edit once you add just a bit of complication like formatting, which breaks your tabular alignment, practically requiring you to work with both the md source and the formatted output. Why would you wish this pain on regular users?


>"Oh, you're a programmer, Can't you do $THING"

I've been asked by others to install software, repair PCs, repair laptops and even modify or replace firmware on phones. Sometimes I did it, if I had time. But for my personal needs, I always called a technician or went to a repair center.

As for programming tasks, I did many: C++ desktop software, Windows device drivers in C, embedded software, websites in HTML + JS, video games, mobile apps, hacked Android kernel modules and libraries, web backend, web frontend, microservices.

If you have the time to learn the business domain, the tools, frameworks, libraries, and APIs, you can work on anything. "Doing" things is different, because some projects are huge - impossible for one person in a lifetime.


Isn’t this applicable to pretty much all domains of knowledge? ‘Just change the oil yourself, it’s easy’, says the auto mechanic. ‘Just add the circuit yourself, it’s easy’, says the home electrician. There’s a term for this, but I can’t recall it.


I believe the "curse of knowledge" is what you're looking for!

https://en.wikipedia.org/wiki/Curse_of_knowledge


They probably don't know how to use Word as well as you imagine either.

There is often an office Word or Excel expert, even though the whole floor spends all their time in those apps. The expert is just the person who was willing to poke around the app or Google things.


I think tech is getting more complicated for no reason - and also on the "techy" side. Look at web development: it used to be simple; now it's mindbogglingly complicated. And I don't see a good reason why...


> “Can you help with my printer” “no”

So true. I can’t even help myself with the blasted thing.


Not really a novel observation, nor surprising.

The secret world of auto mechanics: “Just change the air filter.”

The secret world of electricians: “Turn the circuit breaker off and back on.”

The secret worlds of chefs, carpenters, doctors, lawyers, … I could go on and on. Why would programmers expect people not invested in programming to know this stuff, or care?

I only care about ignorance of jargon and specific skills when demonstrated by so-called professionals. A “programmer” who can’t install from the command line seems like an auto mechanic who can’t open the hood.


I think the interesting bit isn't the jargon, it's the parallel but separate paths of computer use.

Changing a car's air filter is obviously different from driving. Flipping a circuit breaker is like flipping a switch, just for the whole house, and also it happens automatically if you anger the switch by plugging too much stuff in or dropping the toaster in the sink. Both of them make sense intuitively.

Changing a .md file seems like it should be the same thing as changing a .docx or .odt file, since they're all different kinds of formatted text, but for some reason you need an entirely different toolchain.

My resume needs to be compiled using wkhtmltopdf, pandoc, and make. It's deployed using Dokku, which wraps Docker, and the Heroku static site buildpack, which wraps nginx. It's kind of weird that a non-programmer doing the same thing would use a completely different set of DIY tools. It seems like an auto mechanic uses roughly the same set of tools as everyone else, just better versions of them and with greater skill.
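Spelled out, the core of that "toolchain" for one document is something like this (file names are mine, not the actual repo's):

    pandoc -s resume.md -o resume.html     # markdown -> standalone HTML
    wkhtmltopdf resume.html resume.pdf     # HTML -> PDF via a headless WebKit engine

Two commands that each assume a working install, a PATH, and a terminal - versus "Save As PDF" in Word.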


A non-programmer wouldn’t use anything we’d call a toolchain. They would use a WYSIWYG editor like Word, and still not understand formatting or how to print.


I'm happy to say Word, whatever version of PDF ULTRA DELUXE PRINTER 9000(TM) turns up first in search results, and Wix are at least analogous to a toolchain. It's three tools that they'd use in sequential steps, taking the output of each step and using it as an input to the next.


The more apt analogies would be:

* "chng af"

* "tglCB"

I will continue my crusade to fight against this and spell things out until my death.


Hear hear. I don't think programmers - as a group - get to complain about people not learning programming tools while simultaneously making them so unapproachable (especially Linux things).

It's not just the overuse of acronyms. There's also:

* Religious devotion to the CLI despite it having terrible discoverability.

* Really bad naming. Git is probably the worst offender at this, but the whole of Unix is a naming mess. WTF is `usr`? Is that where user files go?

* Generally over-complicated tooling. A good example of this is Node/NPM. So complicated to set up! Contrast it with https://trunkrs.dev/

* Deification of distro packages. No I do not want to spend half of my development time packaging my app for 10 different distros. I guess I'll go with curl | bash then.

* Disdain for binary app distribution. I'm looking at you, glibc.


Thanks for being one of the rare ones to understand. Gives me a glimmer of hope. I agree with everything you said.


Learning some jargon and specialized skills most people don’t need or care about doesn’t make a person smarter or better, nor does it mean everyone else is ignorant or too dumb. Showing off and complaining about the dumb masses is vanity, not professionalism.


I was just using your examples to comment on the common programmer practice of needlessly shortening words and names of things or, even worse, intentionally quirkily naming something to make a specialized word or domain even more specialized. It's a practice that I do not like, as it arbitrarily creates an additional layer of obfuscation.


At one time memory constraints made the short names necessary. Not a technical issue anymore. Now we have autocomplete in the shell. But what do you expect from a professional caste who debate endlessly about efficiency and “productivity” supposedly improved a fraction by a keyboard or typeface or color scheme?


Let me post another article about how I can save 10 seconds by using Emacs for my toaster!


It's important to consider your audience and there are all kinds of tools to bridge the gap. If you really want someone non-technical to work on a Markdown file with you, show them a nice online Markdown editor with a live preview. If they asked me how to convert a video, instead of recommending ffmpeg I'd recommend an ffmpeg wrapper like Handbrake.
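Even the ffmpeg case illustrates it: the one-liner works, but only because hidden defaults choose the codecs for you, and real-world invocations grow flags fast (a sketch, not a recommendation):

    ffmpeg -i input.mov output.mp4                                # codec choices are implicit
    ffmpeg -i input.mov -c:v libx264 -crf 23 -c:a aac output.mp4  # what you end up typing anyway

Handbrake puts those same choices in a dropdown, which is the whole point.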


I feel like GitHub is not the right place to put a blog; reading any text there that is longer than a single line gives me eyestrain.


I agree, and I've seen more "blogs" on GitHub lately. It's very annoying.


Once 'bash' is replaced by 'aish' and its avatar 'SAIri', users will be able to do the tasks that are not immediately clickable in the next menu, by just asking SAIri. At that point, your non-programmer friends won't be asking you stupid questions anymore! :-)

Unless 'aish' has crashed.


The divide between techies and normies isn't clear cut. Some of us are "double agents," observing both worlds from the inside. I've been using the command line since 1982, and am a heavy user of markdown.

"It's simple. Here's a terminal command you can use."

It will return an error message on the first try. After much back-and-forth over chat, I'll come to your cube and see for myself that it returns an error message, then find a few more commands and settings that were needed to make it work. The same commands won't work the next time.

"Just use markdown."

The 80/20 rule looms large. Markdown will produce a readable document, but I won't be able to tell you how to make it conform to an existing aesthetic style, or even know where to look for that information.


This is just kind of an elitist (kind of!) rant by a coder - all of which I can understand and mostly agree with, because I'm a coder too. But any specialist could make the same case about people outside their area of expertise. Some people prefer manual transmissions in their vehicles and will shit on people who drive automatics.

The difference being that most software/tech workers largely make their salary from non-tech-savvy consumers, so I don’t know - maybe some more perspective is needed in dealing with “normal people”.

But also, helping out with printers and stuff is lame; as mentioned, I understand where the author is coming from.

Edit: I guess my point is: rants are totally fine if labeled as such, but this doesn’t seem very constructive to me. Hopefully we tech-savvy individuals can help level up everyone else.


For normal people (non-programmers) computers are pretty terrible. I'm kind of curious what happens when everyone gets a Jarvis-like assistant that can do the scary stuff (open a terminal, type some commands, or possibly code up a nice UI in seconds) based on natural language requests.


> For normal people (non-programmers) computers are pretty terrible.

I don’t know — they pull a supercomputer out of their pocket and use it dozens of times a day.


That's only because the supercomputer was designed around making it as good for the 'normal' person as possible. Sure, it can do so, so much more... but the user does not need to know a damn thing about how it works under the hood to use it. Even just making a note on an iPhone leans on nearly every layer of technology invented by man, so that the user can press 'Notes' and tap letters on the screen. Custom silicon, 5G & server farms (for cloud backup, to spare you from knowing about physical storage), 1600-nit OLED displays, advanced battery tech... all of it... just for those letters to be there when you come back. Having to know a billion little things about computers in order to use them was always the key problem, just as you are not expected to wire up your lights and configure a circuit box every night in your home.


so...you agree with me?


> "First off," I reply, "No. And second off,

This is a super condescending thing to say. Let's say you're a lawyer telling a client why you can't do something. The language you're using is pretty harsh.

> Maybe I just need to have a little more empathy (or is it sympathy?) for normal people.

You are also a "normal person". I don't understand why developers sometimes think they are gods or something.

> Can you help with my printer. No

No explanation on this one at all? That, again, is a rude response.

I've been saying this for a while, but the pay we provide to developers was a mistake. It has created a class of people that think they are gods. Come down from your soap box, learn real empathy, decide you are a Normal Person yourself and get rid of your condescending attitude.


> I've been saying this for a while, but the pay we provide to developers was a mistake. It has created a class of people that think they are gods.

I think you've got it backwards there. I agree with everything else you said, and hopefully the author was being tongue-in-cheek, but this is silly. Developers have a specialized skill and get paid specialized wages. The behavior you're describing is not limited to developers... any profession that has a specialized set of skills and a specialized language that only the "in" people understand has the same issues.


> The behavior you're describing is not limited to developers..any profession that has a specialized set of skills and a specialized language that only the "in" people understand has the same issues.

You can use specialized language without having the attitude of a condescending prick.


I'm not sure I said anything to the contrary.


Who is he being rude or condescending to? The `<h2>`s? He's not actually talking to anyone—he imagined up a non-technical conversation partner and he's being dismissive to them as a writing device to express how sick he is of getting the same questions. I'm sure he wouldn't actually respond to someone asking for help with their printer with the single word "no."

Also, normal people don't compose documents with pandoc or use the command line. That's not a secretly-disdainful observation just because it uses the word "normal." People with technical knowledge in these areas are abnormal when it comes to how we use computers. The word "normal" doesn't have inherently negative connotations—if anything, in most contexts it's seen as better to be normal.

> Come down from your soap box, learn real empathy, decide you are a Normal Person yourself and get rid of your condescending attitude.

This is way ruder than anything that was written in the post.


> Let's say you're a lawyer telling a client why you can't do something. The language you're using is pretty harsh.

Except we aren't talking about a client relationship, we are talking about someone asking you to do something for free because "you do something with computers" without making the effort to understand literally the first thing about what you actually do.

> the pay we provide to developers was a mistake. It has created a class of people that think they are gods.

Painting an entire diverse group of people with a single brush, that's real polite of you.

> Come down from your soap box, learn real empathy, decide you are a Normal Person yourself and get rid of your condescending attitude.

Perhaps take your own advice?


This was literally my whole point.

If you think my attitude is condescending in the scenarios presented, then you properly understood what I was trying to express in the last section.

I didn't want to write it in a way that's specifically attacking the (likely technical) reader, so I put the focus on "me"


> You are also a "normal person" I don't understand why developers sometimes think they are gods or something.

I can't tell you how much I loathe the label "normies".


"Oh, you're a programmer, Can't you do $THING"?

This is the one for me. Yes, yes I could.

If I spent a year learning how to do $THING.

So after that exchange, I'm either stupid, or more likely "Well, if you didn't want to help, you could have just said so".


Loved this. I think there are many more characteristic cases of programmer-muggle “lost in translation”.

On a tangent,

> Can you help with my printer?

Related Ask HN: https://news.ycombinator.com/item?id=35165560


The one thing I wish people would get over is their fear of the terminal. This isn't to suggest that they should use it, never mind embrace it, but they should not have an adverse reaction when someone providing technical support gives them specific instructions to type in a command and copy+paste the results. Such instructions tend to be much more accurate than sending people on a GUI scavenger hunt, particularly when you consider that GUIs tend to change over time.

Alas, as the article states, it is as though you are speaking a different and very intimidating language to non-technical users so the end result is nowhere near as positive as it should be.


Programming isn't somehow unique in this sense.

Anytime you speak to any subject matter expert they're likely thinking "ok what's the simplest way I can phrase this for this idiot".

Not sure what the solution is beyond being kind & understanding & patient


I do feel it necessary to point out that IT and software engineering are largely divergent fields. Additionally, enterprise and consumer IT are very different too. So if a random person asks how to share files, your answer can't be SharePoint or an NFS server or something - and even if it were the answer, you wouldn't put a random web developer on setting up that SharePoint site or whatever. Similarly, if I as a software engineer have trouble with my work workstation, I don't waste time trying to debug it myself; I go to our IT dept and get them to fix it.


Every profession has its own language and jargon. Even the offices I've worked at each had their own unique jargon.

For example, there was one office where the engineers were affectionately known as "swine". (I'm not sure how that came to be.) If an engineer did some particularly clever piece of engineering, the ultimate compliment was to call it "swinish".

To an outsider, this made no sense at all, but it worked for us.


My children are a fine example - they're teenagers now, and they have had more access to tech than most people (thanks to Dad [me!]). Yet they barely know how to do more than Word, and don't understand the concept of a text editor vs rich text editor vs word processor.

The problem isn't them, it's us.

Why DO we expect "normies", as it were, to understand Markdown, text editors and the holy war of Plain Text?

Why are we going backwards with our tooling?


My motivation for using markdown is simple: I can keep track of changes with git.
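Concretely (file names made up):

    git diff notes.md      # shows the changed lines, readable at a glance
    git diff notes.docx    # "Binary files a/notes.docx and b/notes.docx differ"

Plain text is what makes the diff legible; markdown just keeps the plain text pleasant.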


Which raises an interesting question: are you concerned with keeping track of changes or are you concerned with keeping track of changes _with git_?

In my opinion this is where a lot of technical folk diverge from everyone else. Technical folk tend to become infatuated with their specific tools whereas most other people would be happy just to find their expectations met, and neither side really wants to understand or compromise with the other.


Man learns outside world exists, is annoyed by it.


    "Oh, you're a programmer, Can't you do $THING"
I always respond with "I don't really know how to do $THING. It's kind of like asking the piano tuner to play a sonata. They might be able to, but just because you can tune a piano doesn't mean you can perform a piece of music on the piano."


I frequently think to myself, every programmer should be forced to suffer as a user. The weird, risk averse behaviors of users will begin to seem reasonable when confronted with the actual experience of using computers.

"Simple" means "uses abstractions that I've learned by trial and error but can't explain."


"Oh you're a programmer, come this to this start up idea you can do after working 9-5 for zero pay"


Was this written by a high schooler?


I think the author lacks a lot of maturity in collaboration. If you are collaborating, you have to put your preferences aside and use the tool that works for the majority. If that tool is MS Word, you just swallow your nerd pride and use Word.


After a decade of this, I convinced my mother-in-law to buy a MacBook and Genius Bar support. No, I won't fix your printer. Sorry that I figure this stuff out quickly. LMGTFY got me in a lot of trouble with my family...


Any article like this always has to have the “Can you help with my printer?”


We as an industry really need to be a whole lot more intolerant of ignorance. People learned how to drive, they can learn how to use a computer.


They learned to drive, not how a car works all the way down to the thermodynamics of the engine.

They learn to send WhatsApp messages with their iPhone; I don't see why they'd learn how computers work all the way down to Maxwell's equations.


How does having a working knowledge of concepts like files, or using a command prompt, equate with having to learn the guts all the way down to Maxwell's equations?

Much of the scenarios this dude mentions are table-stakes in the grand scheme of computing things.

Instead we have an endless supply of wanterpreneurs re-engineering everything to make computing as mindless as possible, because everyone knows the stupider it is, the bigger your potential audience. And since everyone wants megagrowth bullshit, we have to make using a computer as dumb and mindless as possible! Fuck all the people who take the time to understand this stuff, who got these companies to where they are today; they are the long tail...


Yeah, we consume objects and fork children without even thinking about how that might sound to a non-programmer.


> Can you help me with the printer?

LOL. No, because I can never get the ducking things to work either.


>>"Can you help with my printer"

>no

Can you do my taxes for me? I hate printers as much as you do.


I really wish people would stop writing blogs in naked GitHub repos.


Most devs I've worked with in my career are absolutely terrified of the CLI, and if their IDE doesn't hold their hand they are completely worthless.



