Girls and Computers (rmurphey.com)
230 points by taylorbuley on March 26, 2012 | 84 comments


Three things stand out in many of these stories:

   * an accessible price point,
   * instantly usable to write code,
   * a place to go when the capabilities ran out.
It is very challenging to reliably buy this today. When my daughter wanted a computer, I gave her a VAX 4000/VLC; it had BASIC, C, FORTRAN, and COBOL installed and could run Adventure. I got it for free from a junk pile.

I'm going to try and fix that. [edited for formatting]


I would add one thing:

  * impoverished environment
This, I think, is crucial. Given a Linux machine with a rich GUI or an iPad, kids focus on the eye candy and start playing with those things. A (relatively) limited environment focuses them and at the same time makes them creative.


I'd add another thing, both good and bad: turning it off returned it to a known state.

This meant you could experiment with it, to the point where it broke, secure in the knowledge that recovery was never more than a quick flick of the power switch away. The negative was that for complex projects, where you wanted to preserve the state of the machine, you had to play cassette roulette.


Tape drives, for Commodores and Acorns at least, were pretty reliable, just slow.


The VZ-200 (which contained a TRS-80 ROM) was atrocious!


turning it off returned it to a known state.

One of the first things I did with our PET-2001 was to solder a switch to pin 40 ;-)


I couldn't agree more. I can't imagine how much more I could have picked up at that young age if I'd just had, say, a tty, instead of endlessly playing with window themes, notification sounds and so on.

Another thing that's important, though: you need to be able to make the computer do fun things pretty easily, otherwise you'll (as a kid) lose interest.


Hmm... as a kid, I had an 8086 with DOS, GWBASIC and a white manual for both. Nothing magical came out of it :) , my best creation was a "Copa America" text-based game, full of GOTOs.

I still program in (Visual) Basic (6 and .NET) 20 years later :P though I try not to, and I swore when I finished university that I wouldn't do it again - when my boss told me to, I caved pretty fast :)


I don't know. I taught computer science in high school in the late 2000s, and there were still students who dabbled in code even though they had access to the most wonderful GUIs and games. When I was young and the first computer entered our house, it didn't have a nice GUI. It was DOS and BASIC all the way down. Of my siblings, I am the only one who developed any kind of interest in coding. The others were given the same opportunities as I was (maybe even more so), but it didn't take.

There is a curiosity about how to get computers to do what you want that is independent of graphical capabilities and attractive games, social networks, and whatnot. And that curiosity somehow isn't given to everyone. Similarly, not everyone is into woodworking, gardening, knitting, and so on.

(By the way, your impoverished-environment point seems to work for knitting/sewing as well. Why bother if you can get all the clothes and accessories from the shop downtown or the internet for almost nothing? Still, there are people enthusiastic about knitting/sewing.)


One of my favorite game designers is fond of the mantra "Restriction breeds creativity"

There's also the classic "Necessity is the mother of invention".


Start kids on an empty Fluxbox, let them learn to add all the bells and whistles they can. Eye candy can motivate creativity pretty well, and Fluxbox can get pretty cool with the right addons. Some of the menu-making is almost like coding; it's almost an introduction to formal languages.

Actually, Openbox might be the thing these days. I'm not sure. Definitely not LXDE -- that's preconfigured.
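Either way, for a flavor of what that menu-making looks like, here's a rough sketch of a Fluxbox menu file (the entries are made up and I'm writing the syntax from memory, so check the Fluxbox docs before copying it):

  [begin] (My Menu)
      [exec] (Terminal) {xterm}
      [exec] (Browser) {firefox}
      [submenu] (Games)
          [exec] (Frozen Bubble) {frozen-bubble}
      [end]
      [separator]
      [exit] (Quit)
  [end]

Each bracketed tag is a keyword, the parentheses hold the label, and the braces hold the command to run - a tiny grammar a kid can pick up by tinkering.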


A friend wants to get a "computer" for her 10 year old daughter and I'm having trouble deciding what to recommend. The alternatives are a netbook/mini with Windows, the same but repartitioned to also have a Linux partition, or an Android tablet. They live halfway around the world from me, which makes support and interaction hard (timezone issues, hard to troubleshoot when you can't see the screen).

All 3 solutions run Skype, let you create and manipulate documents, and play games. Windows requires by far the most system administration (virus checkers, frequent updates, driver issues, crapware, etc.) but has the largest software library. Android has the least system administration (essentially none) but is the least flexible (e.g. document creation typically requires being online), and the Android Market isn't available in their country, which makes acquiring new software hard. Linux has the most flexibility but is also the hardest to use (remember, the target is a 10 year old whose parents can't even spell Linux, not you). Netbooks also have the worst hardware support under Linux due to proprietary hardware.

An Android tablet would seem to be the best solution when coupled with App Inventor. Sadly App Inventor doesn't run on Android itself so you have to have a non-Android computer to do your development work.

I could of course set up emulators so she gets a Sinclair Spectrum and Apple //e and can learn just how I did, but things have moved on.


See the Maximite link below; it's pretty straightforward.

You've touched upon some of the key issues. But let me walk through them with you and perhaps we'll get to a similar place ...

"All 3 solutions run Skype, let you create and manipulate documents and play games."

You've outlined some desirable features of this 'computer': Skype, documents, and game playing. There is a trap here which I've discovered, and that is price. If a computer is 'expensive', and any new laptop, netbook, or desktop computer will fall into that category, it has to have things which justify its existence/purchase. On the other hand, if the computer does not cost very much, people often dismiss it as a toy. (We have many exemplars from LeapFrog and others which fit this category.)

If you had the equivalent of an Apple II (but upgraded to a recent microprocessor), you could have a computer which was programmable, for which games of the type that make you think (versus fast twitch) could be made, and on which you could edit documents, though not necessarily change fonts or any of that stuff. If it cost less than $50 and the only 'breakable' parts were the keyboard and the TV you hooked it up to, it might be a different story.

"An Android tablet would seem to be the best solution when coupled with App Inventor. Sadly App Inventor doesn't run on Android itself so you have to have a non-Android computer to do your development work."

This is one of those things that is an issue for me as well. When I tried to use Arduinos for teaching programming, you run into the barrier of needing two computers rather than just one. And sadly the ecosystem of Windows, Linux, and MacOS makes compatibility a real drain.

"I could of course setup emulators so she gets a Sinclair Spectrum and Apple //e and can learn just how I did, but things have moved on."

I would like to hear more about 'moved on'. Do you mean that you have moved on? Or that computer science has moved on? The beauty of these simple systems is that they are simple. They teach you to 'think' computer, and that is tremendously powerful when you attack more modern systems. Sort of like how a lawn mower motor on a tube frame with a centrifugal clutch isn't really a 'motorcycle' (it's a minibike), but it is a great teaching tool for the concepts involved.

One of the areas that could use help today is graphics. Back in the day there were some great pre-packaged graphics chips with full documentation that you could attach to your microprocessor. Sadly those days are gone. The good news, however, is that nearly every 'intro to FPGAs' class has a part where you build a video chip out of an FPGA. And FPGAs are now as cheap as those old custom chips were. I've spec'd out a design which is a dual-buffered HDMI output with a sprite-based HW cursor. It's surprisingly cost effective (parts-wise), and since it doesn't care about HDCP compliance it's actually pretty simple.

With an open access graphics subsystem, a modest ARM core CPU, some memory, and USB plugs for "disks" and "keyboards" I think we could build a system that works well for learning about computers from 'Wow' to pretty sophisticated data structures. But that is getting ahead of myself.


> See the Maximite link below; it's pretty straightforward.

Not everyone has VGA monitors hanging around (they don't), and only being able to do stuff in one place (where the monitor is) is a significant handicap compared to a tablet or netbook/mini that can be used anywhere.

While your theory about pricing is correct, in this case the parents really do want to spend about $500, and they really do need Skype, documents and game playing, as the existing many-years-old family laptop is on its last legs. So yes, they could buy two things - a cheapo for the kid and a second family laptop replacement - but that is not an optimal route to go. (The parents already have laptops, tablets and smartphones between them.)

> I would like to hear more about 'moved on'

When I was young, having a tty program that asked your name and printed "hello $name" was cool. Nowadays a kid is not going to impress a friend with that. Remember that positive feedback is very reinforcing. On the other hand, an Android app that asks your name in a GUI and gives a GUI response is so much more familiar. App Inventor has media players, camera recording, etc., which is so much more modern and expressive. I'd much rather see her making an app where you record yourself saying your name, and then it says "hello $name" with your recording and text-to-speech.

> With an open access graphics subsystem, a modest ARM core CPU, some memory, and USB plugs for "disks" and "keyboards"

That just won't work. They skipped the whole generation of computers made up of discrete bits (processors, displays, disks, keyboards, etc.). Instead, normal to them is integrated devices (processor, display, storage, wireless network, input mechanisms) that you pick up and use wherever takes your fancy.


I would like to hear more about 'moved on'?

Unless the kid becomes a programmer, the experience won't be relevant. Now, you could argue it improves thinking skills and so on, but bare hardware is really only one layer of the abstraction chain - the layer you are probably most fond of. Every other layer - wiring, circuit design, device engineering, physics, and so on - is no less valid, but you probably can't do all of them. This is because we have to pick and choose which layers of abstraction to spend our time in - which gives the most bang for our buck, which is most relevant. For most people, "how to use a computer" in the office-appliance sense is the most valuable.


You are solving the wrong problem. What you are doing is searching for an environment for a person unfamiliar with computers. That should really be the second step. The first step should be (I think) "what should I buy for a 10 YEAR OLD?" First of all, I think that netbooks AND notebooks (from 7" to 20") are a bad idea for a child. Netbooks (7" to 10") are an especially bad idea. You should consider her eyesight - a small, high-dpi display very close to the eyes is bad. Consider scoliosis probability - because the keyboard is fixed right below the screen and can't be disconnected (except on 1-2 special rare big notebooks). Consider RSI (repetitive strain injury) to the hands, neck, etc., the hands especially. This can be caused by a bad keyboard (all notebook keyboards, I think), on-screen keyboards (with extended use, like writing code, essays, etc.), even a bad mouse (because the hand position on a mouse is unnatural, a 90° turn from the ideal position). Developed hand RSI is unbearably painful, considerably limits keyboard usage and heals very, very badly. This is just what I would consider. Ask a medic and he'll tell you more.

tl;dr No notebooks and probably no tablets for a 10 year old. Buy an adjustable table, an adjustable chair, and a classic desktop with an ergonomic keyboard and mouse (both with wrist support). Set up timed lockout on the PC, like 10 minutes every 1-2 hours, lockout after 24:00, and lockout on total uptime per day. After that, choose the OS from your experience; it doesn't matter really (I think).

I promise, she'll thank you 15 years later.


The RSI risk and your other ergonomic notes have nothing to do with someone's age.

I completely agree with 'icebraining' concerning your idea of computer-forced strictly timed lockouts: not a good idea.

  Consider scoliosis probability [..]
Now you are just making plausible-sounding stuff up to support your position. No link between scoliosis and laptop use has ever been established, and there is no reason to suppose a link, except for a naive idea connecting the bodily position during 'laptop use' to 'spine deformity'.


> What you are doing is searching for an environment for a person unfamiliar with computers

Where did I say she is unfamiliar with computers? She and her younger sister already use an almost-dead family laptop for Skype, creating documents and playing games. They were figuring out how to search for Dora games before they could spell (using phonetics). They are far more proficient at Angry Birds than you will ever be and not too bad at Desktop Tower Defence. Homework essays are compiled via online research and include lots of pictures and nice layout.

Human beings are not as fragile as your response implies.


Set up timed lockout on the PC, like 10 minutes every 1-2 hours

That would be more than enough to discourage me from ever becoming proficient at this. The "zone" applies when you're just learning too, and breaking it is extremely frustrating. Can you imagine closing the lid of the piano just as the kid is playing some difficult piece correctly for the first time?

There should be limits, but they should be soft and enforced by whoever is watching the child, who can presumably understand if (s)he's really excited or just bullshitting on Miniclip or IM.


My points:

1. No one stops you from keeping the "zone" feeling during a lockout; you can just drink something (drinking regularly actually helps you think) or sit for a while with closed eyes thinking about the problem, etc.

1a. I sincerely think that regular breaks actually help solving problems (at least in some cases).

2. Schools use timed breaks, and in the rare cases of interesting classes they do not discourage students from learning. Of course schools should be optimized more; I'm not arguing that.

3. A timed lockout is not a goal in itself, invented just to prevent illness; it is also a method. A method to teach self-discipline, even to begin teaching self-discipline in some cases. No one will tell you how to optimize your life when you grow up; that is why we read so many articles about people who burn out on the job.

Later it is insanely hard to go from a super lazy, undisciplined student to a focused worker.


Solving problems is what you do after you learn to code. A kid is still just poking and experimenting, and a hard lockout is, in my experience as a kid myself, a great way to kill that excitement. I'm not saying you shouldn't have regular breaks, I'm saying a kid should have leeway to use e.g. half an hour more and then do a longer break, not feel a pressure to finish stuff in a hurry before it locks out.

Schools use timed breaks, and in the rare cases of interesting classes they do not discourage students from learning.

Obviously we have very different experiences.

3. A timed lockout is not a goal in itself, invented just to prevent illness; it is also a method. A method to teach self-discipline, even to begin teaching self-discipline in some cases. No one will tell you how to optimize your life when you grow up; that is why we read so many articles about people who burn out on the job.

First, I'm not opposed to lockouts. I'm opposed to inflexible, computer enforced lockouts.

Secondly, is there any evidence that timed lockouts actually help with self discipline?


I like the idea of timed lockout. Teaches the kids how to budget.


In my experience, if you introduce kids to Linux, they learn pretty fast.

In my college days, I used to help students of a government school. It is a very undeveloped area and kids don't have computers at home. All school computers ran Linux and students (aged 11-12) didn't face any difficulty in using Linux.

I used to give this example to my classmates when they blamed Linux :-)


Remember that you were there to do any system administration. Any kid will have no problem figuring out virtually any desktop, be it Windows, Mac or one of the many flavors of Linux. After all, you (double-)click on whatever you want, and if it is already running, often the program will be brought to the front instead of a new copy being started.

But consider what happens when Ubuntu does an update - if pam gets updated you get a list of services to restart that is gobbledygook to almost everyone.

And most operating systems still make adding printers confusing, and troubleshooting printing issues virtually impossible.


AIDE runs on Android tablets. Perhaps not as easy as App Inventor, but no other computer is needed. But I don't think I would recommend Android for someone in a country in which they can't access the Market/Play/whatever-they-call-it-next-month.


> AIDE runs on Android tablets

You seriously think that AIDE is appropriate for a kid who is proficient at using computers but has never written a program in her life? How many months do you think it would take to walk her through hello world and what do you think the chances are of her getting it?

> But I don't think I would recommend Android for someone in a country in which they can't access the Market/Play/whatever-they-call-it-next-month

I apologise for being a little loose in my language. Her mom already has two Android devices (tablet and phone) and they sell very well there - usual pattern: BlackBerry was king and is being dethroned.

You can access Android Market but you can't buy anything, and a lot of content that is freely available in other countries does not show up. But your gmail will stay up to date and you do get Skype.


AIDE comes with a hello world program as the first thing it prompts you to try. So, IIRC, one just has to hit "run", "install", and "open" to have it going. As for "getting it", that worry is very real. No, I don't think Java and the Android environment are nice enough that I would want to hand them off to a kid to play with without instruction. There is lots of cruft and more abstractions in play than needed right off the bat, and nothing to say which ones the kid should be paying attention to.

There looks to be a BASIC interpreter in the Play Store for free, https://play.google.com/store/apps/details?id=com.free.basic. It doesn't have an integrated editor though, so the kid would have to understand typing things up in one program and running them in another.

Grabbing and trying it... some helpful examples to load would be nice, it doesn't have any. But it does have an "Enter Live" mode, in which one can type up a program and run it. So all that "don't miss a keystroke" charm of the old days is there. This I might very well hand to a kid.


There's also a Clojure interpreter for Android devices (including tablets), which I think has potential. Unfortunately, it is currently slow and tedious enough that it isn't good for much more than "Hello Lisp!" sorts of exercises.


You could probably easily buy a computer with Linux on it today from eBay for around $100. Installing Linux means you have a wealth of ways to instantly write code (gcc, g++, Ruby, Python, PHP, Perl, and a browser where you can write JavaScript) and there are plenty of places to go when its capabilities run out.

I think it's even easier to buy all of this today.

Or, have I totally misunderstood what you've said?


Or, have I totally misunderstood what you've said?

Sort of. Do your experiment (build a Linux box) and put it in front of a kid with no exposure to computers (other than as appliances). Try this with kids from 6 to 16.

As you get toward 16 you will start getting better engagement; most will figure it's too confusing and stop.

I've run the experiment with the Arduino; even with its pretty straightforward GUI there are still some serious complexities around the notion of compiling, downloading, driver compatibility, etc.

One of the things I will do with a Raspberry Pi is look at building an embedded BASIC box. Hook a keyboard to it, plug it into a TV's HDMI port and turn it on. Blammo, BASIC prompt. (If I can swing it, it would be a Python prompt, but I digress.) Type in the code (using BASIC syntax, sorry):

   100 CLEAR
   110 DRAW(100, 100, 200, 100, "WHITE")
   120 DRAW(200, 100, 200, 200, "WHITE")
   130 DRAW(100, 200, 200, 200, "WHITE")
   140 DRAW(100, 100, 100, 200, "WHITE")
   150 FOR I = 100 TO 200 STEP 10
   160 FOR K = 200 TO 100 STEP -10
   170 DRAW(I, K, K, I, "BLUE")
   180 DRAW(K, I, I, K, "GREEN")
   190 NEXT K
   200 NEXT I
   210 END
Type run, and blammo, drawing pictures on a screen.

Why pictures? Being able to draw graphics is a compelling thing for folks, they like to do it, it touches their inner finger painting. It compels them to think about 'steps' and 'math' and stuff.

Doing graphics in Linux is a PITA. The notion of a system which throws away everything except a way to explore through programming, and which has access to all of the capabilities of that system, is key. The programming needs to be accessible. It could be K&R C as easily as BASIC, but it would need a very simple build environment (code for: NOT gcc).

Something that, with a manual of no more than a couple of hundred pages, can engage a student to explore it through programming. They don't know what programming is, so complaining about types (for example) just demotivates them.
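For what it's worth, here's roughly what that same square-and-crosshatch program could look like at the Python prompt I mentioned. This is just a sketch using the standard-library tkinter canvas; the DRAW routine above is imaginary, so create_line stands in for it:

  # Rough Python/tkinter equivalent of the BASIC sketch above.
  # DRAW(x1, y1, x2, y2, color) becomes a create_line() call on a canvas.
  import tkinter as tk

  root = tk.Tk()
  c = tk.Canvas(root, width=320, height=240, bg="black")
  c.pack()

  def draw(x1, y1, x2, y2, color):
      c.create_line(x1, y1, x2, y2, fill=color)

  # The white square frame
  draw(100, 100, 200, 100, "white")
  draw(200, 100, 200, 200, "white")
  draw(100, 200, 200, 200, "white")
  draw(100, 100, 100, 200, "white")

  # The criss-cross pattern inside it
  for i in range(100, 201, 10):
      for k in range(200, 99, -10):
          draw(i, k, k, i, "blue")
          draw(k, i, i, k, "green")

  root.mainloop()

Still not quite type-and-run the way the BASIC prompt would be, which is exactly the gap I'd like the embedded box to close.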


That's sort of the idea behind a project I've been tinkering with for a while called Mako[1]. It's an extremely simple VM that is essentially an idealized game console. I initially designed the system as a convenient way to play with compiler design (currently compilers for Forth, BASIC, FORTRAN and a BCPL-like language are available, in varying degrees of robustness), but having some limitations to work within has proven quite fun from a game development standpoint as well.

A few of my friends and I intend to roll a Linux distro for the Raspberry Pi which makes it into a physical "Mako game console", and in addition to playing games[2][3][4](etc.) you'll be able to play with an interactive Forth prompt.[5]

  [1]https://github.com/JohnEarnest/Mako
  [2]http://i.imgur.com/IVNdt.png
  [3]http://i.imgur.com/mJ1ZD.png
  [4]http://i.imgur.com/XumXi.png
  [5]http://i.imgur.com/gLknX.png


Whoa, that looks awesome! Are you actually using F# or is that just Github getting confused with the .fs files?


Thanks! As far as F# is concerned, it's the latter. I really wish Github provided a way to override syntax highlighting based on file extensions. The reference implementation of MakoVM is written in Java, and a few other people have been working on implementations in C and Factor. As you can see, though, the VM implementation is much smaller than the libraries and examples for the platform.


Have you seen the Maximite?

(http://geoffg.net/maximite.html)

> The Maximite is a small and versatile computer running a full featured BASIC interpreter with 128K of working memory.

> It will work with a standard VGA monitor and PC compatible keyboard and because the Maximite has its own built in SD memory card and BASIC language you need nothing more to start writing and running BASIC programs.

> The Maximite also has 20 input/output lines which can be independently configured as analog inputs, digital inputs or digital outputs. You can measure voltage, frequencies, detect switch closure, etc and respond by turning on lights, closing relays, etc - all under control of your BASIC program.


Oh that is pretty cool. I like that it can do graphics too, that is pretty important when grabbing a kid's attention.

One of the items in my list of ideas for this project is that the image save format is GIF or JPEG and that the files are stored on a USB stick (rather than an SD Card) since I'll use USB for the keyboard anyway, and with files on such a device you can carry them over to a laptop or desktop and play with them there too.

But thank you for this link! This is definitely the kind of place I've been headed.


> Doing graphics in Linux is a PITA.

How so? Almost every Linux distro comes with Tcl/Tk (if not installed by default, installable by checking a box in a package manager). Using the Tk canvas widget, your BASIC code above becomes:

$ wish <- start a live Tcl/Tk REPL

  canvas .c 
  pack .c -expand true -fill both
  .c create rectangle 100 100 200 100 -fill white
  .c create rectangle 200 100 200 200 -fill white
  .c create rectangle 100 200 200 200 -fill white
  .c create rectangle 100 100 100 200 -fill white
  for {set i 100} {$i <= 200} {incr i 10} {
    for {set k 200} {$k >= 100} {incr k -10} {
      .c create rectangle $i $k $k $i -fill blue
      .c create rectangle $k $i $i $k -fill green
    }
  }
All with the added advantage that you can watch the drawing happen as you type in the code instead of waiting until the end to enter "run". Note, I have assumed your "DRAW" statements draw rectangles because they have four coordinates. The canvas widget supports many more shapes than just basic rectangles, substitute as desired.


This is the real beauty of the Raspberry Pi. I heavily doubt that it will, in its default state, convince anyone to start controlling their computer. (I explicitly avoid the phrase "become a programmer" here because it has all sorts of baggage that shouldn't be there but is. "Programming" is really about controlling computers. To say anything else is misleading.) But the range of environments that it allows hacker types to build and experiment with will be what spawns the renewed interest in deeper interaction with computing systems. At least I hope so.

It's a cheap, more powerful, streamlined Arduino without a lot of the messy bits that make that product and its kin what they are. (And make no mistake, those bits are what make the product desirable for its target market.) The ability to fashion something as simple as a BASIC environment without having to sacrifice an expensive computer to do it is key.

But then, you're a part of that sort of hacker phenotype. How would a person who doesn't have that experience know how to do that for their kid? How would they even know that they should do it in the first place?

Once we can answer those questions in a satisfactory manner the world will be a better place.


This is exactly how I learned on an Atari 800 XE. I clearly remember the moments where I understood variables, iteration, arrays, multidimensional arrays, subroutines, etc. The fact that I could teach it to myself while I was a child is a big deal. With a modern context, I'm not sure where I would even start. I'm proficient with a bunch of programming languages, none of them seem as accessible as the BASIC code above.

I've often thought about how to replicate a similar experience for my kids, short of getting a collectible 8-bit home computer and removing their access to modern hardware. Anyone have success with this?


> With a modern context, I'm not sure where I would even start. I'm proficient with a bunch of programming languages, none of them seem as accessible as the BASIC code above.

Core Tcl (i.e., the portion which a child would be exploring in his/her first introductions to programming) is very close to BASIC in accessibility. Core Tcl has very little syntax overall. Compare the Tcl/Tk code above for drawing on a Tk canvas with the BASIC code for the same. There is nearly a 1:1 correspondence. And Tcl/Tk also gives the added advantage that when the children gain a bit of proficiency, they can also easily create GUIs for their code snippets and apps. The GUIs they create may not be beautiful GUIs, but they would be "their GUIs", created with their own efforts.


Superficially, HTML canvas drawing is very similar. (Also, many people taught themselves programming because they wanted to make Flash animations.)

Back in the 8-bit micro days, the commercial games and other software were largely single person efforts. The beauty was that you could teach yourself some BASIC and very easily create something which was at least sorta close to the state-of-the-art.

I think to really hook kids on the idea, you need to have some platform which seems current (e.g. web or mobile) versus a 1980s-style box.


Installing Linux means you have a wealth of ways to instantly write code (gcc, g++, Ruby, Python, PHP, Perl, and a browser where you can write JavaScript) and there are plenty of places to go when its capabilities run out.

It's easy to "instantly write code" on Linux just like it's easy to "instantly change the brakes" on my car. (I say this as a long-time Linux user and developer.)

Squeak is the only "operating system" I've seen or used that is set up to instantly write code. Actually, I take that back. IIRC they jerry-rigged the OLPC to behave similarly with Python.


For me, many years ago, and living as I was in the UK at the time, the BBC Micro met those requirements and started me on a long and happy career/life in IT.

I believe this niche is one that the developers of the $35 Raspberry Pi intend to fill. Good luck to them.


What year was that?


It's interesting to me how little difference there is between her experience and my own (male) early experience with computers.

Though on second thought, it's not at all surprising.


I think we have a tendency to assume "people like us" are, well, people like us. Some things are more universal than we think.


I'm not sure if I can see the correlation between the story being told and the fact that the author is female. It seems to be briefly mentioned in the first and last paragraphs.


> I'm not sure if I can see the correlation between the story being told and the fact that the author is female.

I saw a quip the other day that was along the lines of "Hacker News posters consistently make the mistake of assuming that, because a post shows up on Hacker News, the author is somehow 'making a big deal out of it'".

The "correlation" is that the author, after a spate of sexism stories regarding women in tech, got to thinking about how she got started as a woman in tech, and wrote a blog post sharing the story.

It's a slice of a story of someone's life, nothing more, nothing less. Don't try to read too much into it looking for larger correlations and grand overarching Big Deal Points.


You're probably right, but the post is titled "Girls and Computers", which kind of seems to imply some deep, and general, theorizing about girls and computers.


> You're probably right, but the post is titled "Girls and Computers", which kind of seems to imply some deep, and general, theorizing about girls and computers.

Or that she's been spending some time thinking about "girls and computers" after the news stories about girls and computers and it made her think of her own story.

That's how I read it, anyways. Obsessing about the "deeper meaning" of a 3 word title and whether or not it is the best description of the content seems a bit pointless.


Article titles should match their content.


I'm not convinced it doesn't, but either way, this is pedantry of the most unproductive sort.


The point is that there was no discernable difference between the genders in approaching computers and learning how to use them.

Yet, even though the gender divide did not exist originally, nowadays it has been reinstated, since coding is seen as a male environment and women in IT are subject to various forms of special treatment, be they good or bad; either way, they are treated differently.

>> In some ways, it is like the very ubiquity of technology has led us back to a world where socially normative gender roles take hold all over again, and the effort we’re going to need to put into overcoming that feels overwhelming sometimes.


Maybe there's nothing to theorize about that's specific to girls? It sounds like my story, as a guy, and that's a subtle but important point. I used to run every program I could find, copy stuff from friends' computers that was new and unfamiliar, borrow books from the library and parents' friends, etc. Eventually I stumbled onto BASIC and the rest worked its own way out.

Yet I know plenty of kids - male and female - who had a machine just as powerful as I did (some more powerful), had the same access to information I did, etc., and didn't bother with any of that. Why was I so inquisitive and interested in computers to the exclusion of everything else? Why did I spend all day and night on them while other kids were experiencing their first drinks, their first relationships, hanging out together, etc.? That's a more interesting question than the sex one, IMO.


The argument of the article is that when computers were new, they were new to everyone of either gender, whereas at present gender-based expectations can take over in decisions about who will play or work with computers.

"In some ways, it is like the very ubiquity of technology has led us back to a world where socially normative gender roles take hold all over again"


It's a nice story, but it sounds just-so to me.

Probably because most women that grew up in the 80s did not in significant numbers show the same fascination with computers that the author did. In fact, the current state of the mid-to-senior level job market reflects the interest level from exactly that time period, and it's just as lopsided in favor of men as it's ever been.


"In 1967, when Cosmo’s “The Computer Girls” article ran, 11 percent of computer science majors were women. In the late 1970s, the percentage of women in the field approached and exceeded the same figure we are applauding today: 25 percent. The portion of women earning computer science degrees continued to rise steadily, reaching its peak — 37 percent — in 1984. Then, over the next two decades, women left computer science in droves — just as their numbers were increasing steadily across all other science, technology, engineering, and math fields. By 2006, the portion of women in computer science had dropped to 20 percent." http://www.washingtonpost.com/opinions/when-computer-program...

The Computer Science/Tech/IT industry created an unwelcoming environment for women; the women left.

A random sampling of how IT can be hostile to women:

I sat one empty seat away from a long-haired friend in a Computer Architecture class with around 15 students attending lectures in a room with theater style seating and the entrances to the rear. Plenty of room to spread out.

On a regular basis, guys would enter from the rear and sit directly next to my long haired friend. No empty seat between. Only after sitting down did these guys realize the woman they just sat next to was actually a goateed dude. These creepers would then get up and quietly leave the classroom horribly embarrassed. This happened for half a semester until enough of these guys learned.

I can't imagine how a woman would feel if she had her personal space invaded on a regular basis when clearly there was plenty of room to spread out.

Then if a woman gets to the professional world she gets invited to a hackathon where the women will be serving drinks to the men. Or she sits near the front door and visitors assume she is a secretary. Or she "gets" to run the party planning committee. Those experiences add up over time and make life unpleasant.


No. Just no. First of all, anecdotal evidence doesn't mean anything. Sure, women are sexually harassed at technology jobs, but women are harassed EVERYWHERE. And men are also a minority in lots of professions; is that because the women who form the majority in those fields harass men who try to enter their area?

I suggest that you watch the first episode of this series: http://en.wikipedia.org/wiki/Hjernevask

If you don't have the time, at least think about the result of this experiment: http://en.wikipedia.org/wiki/Empathizing-systemizing_theory#...

"Research on one day old babies have found that boys look longer at a mechanical mobile while girls look longer at a face. This, as well as the effects of fetal testosterone on later behavior, is argued to be evidence against the sex differences being only due to socialization"


Your evopsych argument doesn't explain the drop in female enrollment rate since the 80s.


You can't explain something like that without considering the context - the social/work environment of those years - that led to this change. It may just have been that, for example, since the 80s new interesting fields were born or discovered by women, thus reducing their interest in computer science. This factor, for example, may have led to male predominance in the field, which may subsequently have led to new scenarios/environments, and so on up to today.

I'm sure that if there had been a time when multitudes of women were interested in CS, today there would probably be more of them, and fewer men. That's unequivocal. Environments change, always, in every field.


Your hypothesis has the convenient feature that it lets men off the hook, but given that the "social/work environment" is male-dominated, surely they have agency over it.

Rather than a "lack of interest", I offer alternative Wildly Made Up Explanations For Why Women Left Software Engineering In Droves and Had CS Enrollment Fall: A) they may have been discouraged or prevented from studying a subject: Grace Hopper, for example, would have been an engineer but was prevented because she was a woman. Instead she studied math, as did many early female computer scientists. When Computer Science programs became common they may have hewed closer to the Engineering approach than Math, and when they became expected for programming jobs it may have closed off the pathway women had been taking into the field. B) They may have faced harassment or hostile environments that discouraged them from pursuing a coding career. That is not them "being interested", it is "them being willing to tolerate the environment they were required to study or work in." C) They may have faced discrimination in hiring, promotions, pay or been preferentially tracked into project management roles. You can't work in a profession if no one will hire you, and you can't advance if employers will only promote you into a non-technical role. D) they may have faced impossible-to-reconcile expectations on their time, if they were unable to find employment working regular hours and so utilize child care. Men being willing to be a primary care giver is a relatively recent phenomenon.

I'm not saying any of these are true: I am saying that in the absence of any evidence my wildly made up speculations are just as likely to be true as yours. None of those things have to do with interest in technology, programming or working as a programmer. They all implicate the men who changed the social/work environment of computer science in ways that discouraged or excluded women. They are also all things that we could fix.

I'd rather focus on explanations that offer disprovable models we can use to fix the issues at hand if they turn out to be correct. Your approach is like looking at a crash report and being content with the explanation, "something obviously happened that was outside of expected parameters."

When I read the article, I want the next generation of girls and boys to have those moments of joy at technology. I don't want half of them to be turned off by the entirely-irrelevant social/work environment.


> I'd rather focus on explanations that offer disprovable models we can use to fix the issues at hand if they turn out to be correct.

So you are saying that you would fight one issue rather than another based not on arguments and evidence, but on the fact that the first can be fought and the other cannot (which is itself speculative)?

In my opinion this kind of reasoning has serious flaws. You can already see commenters on HN saying things like "Reading all these posts about sexism and discrimination against women, I don't want my daughter to work in IT". Doesn't that contradict what we are fighting for - bringing more women into the IT industry? Doesn't it have to start with equal opportunity for children rather than fear (which in some - many - cases is wrongly induced)?


How many "interesting fields" have been "born or discovered by women" since the 80s? Can you name any of these hypothetical fields?

In your 2nd paragraph, are you saying that the statistics about the declining proportion of women in CS are wrong? On what do you base this statement?


Sweet, sexism doesn't exist because experience doesn't happen! I forgot I might be talking to a robot and must have a repeatable scientific experiment to validate my individual experience. Otherwise it means nothing.

Oh, except those rules for a scientifically valid and repeatable study don't count when he cites it himself:

"The evidence for an inborn, male predisposition for systematizing comes from a single experiment on newborn infants, tested with a single person and object. The person was the report's first author, who surely knew the experimental hypotheses and who, we now learn, may have known the sex of the infants whose attention she elicited. The experiment provides no evidence that the basis of infants' preference, if real, had anything to do with the categorical distinction between the displays. Would infants show the same preferences for other face/object pairs? Would they maintain this preference if low-level properties of the two displays, such as their speed of motion, were equated? One need not object to Baron-Cohen's politics to be less than persuaded by his data."[1] says Elizabeth Spelke[2].

Sad Trombone.

Go validate your sexist culture another way. Evopsych:Psychology::Astrology:Astronomy

And yes, I watched that entire clip. It sucked and wasn't worth the time. It's by a comedian who gets scientists with competing models to react to each other's statements. And only the newborn-baby one has any bearing on gender vs. environment when it comes to women avoiding CS.

I'll include another excerpt below because it's just too good to leave out:

" More important, Baron-Cohen fails to consider the extensive evidence that has accumulated, over the last thirty years, on infants' developing understanding of object mechanics. Hundreds of well-controlled experiments reveal no male advantage for perceiving objects or learning about mechanical systems. In most studies, male and female infants are found to discover the same things at the same times. Both males and females come to see the complete shapes of partly hidden objects under the same conditions and at the same ages. They figure out how objects support one another, through the same series of steps. They reach for objects by extrapolating their motions, with equal accuracy. They make the same errors when they search for hidden objects, and they get over those errors at the same time. Sometimes female infants have an edge: In experiments by Laura Kotovsky and Renee Baillargeon, for example, females start to learn about the relation between force and acceleration (the harder a stationary object is hit, the further it goes) a month earlier than males do. Males catch up, however: by 6 1/2 months, you can't tell them apart."

"Whatever the newborn infants in Baron-Cohen's experiment were doing, the male and female participants in three decades of infant research have followed a common path, engaging with objects and people. Infants don't choose whether to systematize or empathize; they do both, and so do we all. Baron-Cohen's categories may seem as quaint as left and right brains by the time his newborn subjects are old enough to read about them."

[1] More at http://www.edge.org/documents/archive/edge158.html (Search for her name, as there are a lot of people there tearing Simon Baron-Cohen a new one.)

[2] Her Bio http://www.wjh.harvard.edu/~lds/index.html?spelke.html


Just a thought: IBM made a big effort to recruit women into mainframe programming jobs. 1984 was roughly when Unix and PCs started to take over the industry; perhaps those companies were not as interested in recruiting women.

Also, you can be sure there's far less sexual harassment now than in the 1970s and 80s.


I personally theorize that BBSes, which got popular in the mid 1980s, and later the Internet, gave spiteful and poorly socialized men an anonymous veil. Reddit keeps the legacy of hostility toward women alive and well with comments like "tits or gtfo".

Well, women chose to get the fuck out. The hostile behavior experienced online would never be tolerated in a face to face setting.

Of course it's only a theory and I was too young to remember any first hand experience of the 1980s.


It might be tough to imagine now, but very very few people were online in those days. In the programming forums (Compuserve, Usenet) people generally posted using their real names and job/university affiliation, so the atmosphere was professional at least.

Perhaps some small porn BBSs or IRC channels had a different atmosphere, but the "tits or gtfo" mentality is mostly something which appeared over the last decade or so.


I was a high school student in the late 80s, early 90s, and I encountered some truly awful stuff on local BBS scenes. At first, people assumed I must be ugly and a social outcast (they referred to me as "the burly Russian wrestler"). I shrugged it off, because everyone got shit. So what? Then I went to one of the parties; very few people there had met me before. I was a pretty standard-looking 16 year old girl, but I felt like meat, just meat. In the space of 20 minutes, two twenty-something men hit on me, and virtually everyone else expressed shock at my lack of hideousness, or told me they were surprised I was actually a girl. I begged my older brother to drive me home immediately. Then I buried my former alias, and made a new male alter ego. Just because you didn't see it, doesn't mean it didn't happen.


Ouch, that sounds uncomfortable.

I have no doubt the BBS world was full of creeps, and that could have been very discouraging to someone getting into computing. My point was that behavior was far more segmented away from mainstream society than (say) Reddit.


1983 was also the year the bottom dropped out of the console market; loads of firms went bankrupt. It was not a good career move to get into the industry in the mid-80s.

http://en.wikipedia.org/wiki/Video_game_crash_of_1983


I believe that IT is unwelcoming to women, I really do; traditionally male-dominated fields are always hostile to women, even after they've reached population parity, and more so until that happens (there are more men in the field and fewer women, so more interactions will be male-female than with a less skewed ratio, which means both that women experience more harassment, and men witness less of it). You don't need to argue this point, I know that a lot of guys are assholes in any field, and I don't for one moment doubt that any of the stories about what women have experienced are true.

I'm just not yet convinced that a significant number of women actually avoid the field because of this (and scattered anecdotes aren't convincing here since the numbers in need of explanation are so huge). Primarily because women rarely enter the field - no, that's not right, because they rarely even enter preliminary training for the field, in the first place, so there's not much of a chance for them to be driven away by the behavior of men in IT at all.

We're losing women very early in the funnel, and I need some real evidence to swallow the claim that the pain that the 17% (or whatever small number) that end up in IT experience is the reason we lose the first 33%. As someone that has paid a lot of bills by working on conversion funnel optimization, I can tell you for sure that I'd absolutely never assume, a priori, that the latter part of the "women in tech" funnel was the one we should be focusing on, based on the numbers - you always look upstream first, especially when you see stats as bad as in tech, and only once you're satisfied that those are the best you can achieve with reasonable efforts do you start to look at later steps.

If the freshman CS male to female ratio was 50/50, I'd agree that we should assume on-the-job treatment was the "leak"; but it's not. Based on ETS numbers, by the time girls take the SAT, they only make up 12% of the people intending to major in CS - there's already a 9 to 1 ratio, even before college! The ratio in the workforce is actually better than the corresponding rate upon entering college, which means more women end up moving towards the field when it comes time to picking a job than away from it. [see http://en.wikipedia.org/wiki/Women,_girls_and_information_te...]

I'm not satisfied that I've ever heard a good answer to this objection. I have heard many plausible reasons that girls are either not interested in, pushed away from, or not pushed towards tech, and that's a separate matter, the one that I think is most worthy of discussion. But it has absolutely nothing to do with the behavior of the men actually in the field, at least as far as the arguments I've heard go.


Of course the situation is complex.

http://storify.com/charlesarthur/oh-hai-sexism

But the tech community just effuses sexism.


1. Regardless of the state of the field, you haven't made an argument that these aren't her experiences.

2. Whatever the reason, the number of women involved in computing has been on the decline for the last twenty years.

3. The number of women in senior positions today might not reflect the number who entered the field twenty years ago if there was significant job discrimination or if women happened to have different preferences. The gender distribution of principals doesn't necessarily reflect the gender distribution of teachers.


I'm in no way doubting that the explanation offered is a possible one. I just don't think it's been seriously supported. Anecdotally, girls these days actually seem more interested in computers than when I was growing up, and I think that the high availability of computers is actually helping the situation, not hurting.

As for point number 3, you are of course correct in theory, but the reality is, there were already no girls in the middle school computer clubs of 1992, the high school programming classes of 1996, and the CS 101 classes of 1999. So I'm not really buying the idea that anything on the job is primarily responsible for the lack of women graduating CS in 2004 or becoming senior engineers in 2012.

Rule one of conversion optimization is to figure out where you're actually losing people in the conversion funnel. In tech, we're losing women while they're still girls, for whatever reason, and that's what we really need to track down and fix. Focusing on points after they've already left is premature, since if we can get more women coming through those steps in the first place, the dynamic will change anyways.


I assumed the point was that it doesn't matter. Kids get interested in computers regardless of their gender.

But you're right, it probably isn't the most descriptive title.


Yes, the point of the post was that it had a lot more to do with being a kid with a computer than with being male or female. The point of the title was that it was a story of ... a girl and her computers :)


The switch from 'a girl and her computers' to 'girls and computers' is a problem in itself (pop culture ref: http://xkcd.com/385/). Would you take a story of a boy and his first computer and call it 'boys and computers'? Why is one girl representative of the group?


'Girls' has a very different meaning from 'a girl'.


It's a woman's story. The fact that gender isn't relevant is the point. So many people in "the software/start up/tech community" continue to make it relevant by shoehorning human sexuality onto things that should be about the joy of technology, of making things happen, of being real live sorcerers.


I think that is kind of the point.


I saw it as a female who loved computers since she was little sort of correlation.


A lot of the responses here seeking a modern alternative to this experience seem to be focusing on hardware. In a way that's not surprising but I think it's a bit of a shame. That's because I think we have a platform today that can rival the ease and immediate feedback of those early computers: the web browser. There are probably a couple things needed to complete the picture.

1) A nice basic library that can serve as an immediate stepping stone to the UI. There are probably some out there already that are very beginner-friendly.

2) Some kind of REPL/IDE-like browser extension to make it easy to dive in right away. Something a bit easier for kids to wrap their heads around than the developer tools of today, but also incorporating a basic editor so they can edit in place, save files, etc.

I'm not really sure about the form of (2) or how vital it is, but it certainly wouldn't hurt.


I'd second this suggestion. Another great 'feature' of this platform is that it's trivially easy for the learner to share his/her creations with friends.


I doubt it will be possible to recapture the golden age of early PCs and their ability to get kids programming. The computing landscape has changed, and educational practices must change with it. We have to assume kids will have iPads as their first computer and treat them accordingly.

Kids these days are web users first and foremost. You have to let them cross the gap between visiting sites and creating sites. There's plenty of opportunity for sites that let kids create stuff in JavaScript and share it with their friends. What's so different between animating a canvas using JavaScript and animating a TV screen using BASIC?


I guess children of tech-savvy parents become hackers...



