A.I. will have implications for education, welfare and geopolitics (2016) (economist.com)
82 points by magoghm on Oct 24, 2018 | 32 comments


The post talks a lot about how MOOCs can be sufficient education for AI/data science; I disagree with that approach: although MOOCs can teach people the concepts, they don't teach the soft skills behind the tech, which are much harder to learn and a major bottleneck to using it practically (full blog post on the subject: https://minimaxir.com/2018/10/data-science-protips/)

That's not even considering getting employment in a relevant position, which is much harder than it was in 2016 due to increased competition, and the requirement for Masters/PhD nowadays (https://towardsdatascience.com/the-economics-of-getting-hire...).


As someone who dabbles in the field from a strong software engineering background but only MOOCs on deep learning, absolutely this. The courses are often so focused on getting you to reproduce the state of the art that you are on rails the whole time, with no deep intuition for why things work. And so when I get off the rails and onto a real problem or a Kaggle competition, I often find myself training terrible models and not knowing why, getting stuck on bad architectural decisions, or just forging ahead with a terrible dataset without realizing it.

I’m reminded of the “draw the rest of the owl” meme, except these mistakes are more insidious: everyone can tell whether they have drawn an owl, but it can be hard to tell whether your model has done something like overfit on minutiae until you release it upon the world.
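That kind of silent overfitting usually shows up as a large gap between training accuracy and accuracy on held-out data. A minimal sketch, using hypothetical noise-only toy data and a pure-Python "model" that just memorizes its training set (the extreme case of fitting minutiae):

```python
import random

random.seed(0)

# Hypothetical toy data: random feature tuples with random labels,
# so there is no real signal to learn.
data = [(tuple(random.randint(0, 9) for _ in range(6)), random.randint(0, 1))
        for _ in range(200)]
train, test = data[:100], data[100:]

# A "model" that memorizes every training example exactly,
# falling back to the majority class for anything unseen.
memory = dict(train)
labels = [y for _, y in train]
majority = max(set(labels), key=labels.count)

def predict(x):
    return memory.get(x, majority)

def accuracy(split):
    return sum(predict(x) == y for x, y in split) / len(split)

print(f"train accuracy: {accuracy(train):.2f}")  # looks near-perfect
print(f"test accuracy:  {accuracy(test):.2f}")   # barely better than chance
```

The training score looks great while the held-out score is roughly a coin flip, which is why evaluating only on data the model has already seen can hide the problem until release.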


No worries, even PhDs don't understand why things work, only how ;-)

You simply need to practice on challenging problems: sometimes ripping off state-of-the-art code, analyzing it, and understanding why certain things are done the way they are. In a few months you could be building your own state-of-the-art custom models, with your own loss functions, distributed training, and your own callbacks, and earning a $500k salary at some valley company.


> even PhDs don't understand why things work

When you finish your Bachelor's degree, you know everything.

When you finish your Master's degree, you realize you know nothing.

When you finish your PhD, you realize nobody else knows anything, either.


What IQ do you think is required for this? Do you think a solid lead dev who knows linear algebra and calculus could do it in a year or so?


With my first job after college, I found a large impedance mismatch between my (mostly mathematical) education and engineering in the physical world. It took some time to overcome that, but in the end having the strong math basis turned out to be a big advantage.

(Better math meant designs could be optimized for lighter weight and reduced manufacturing cost.)


I think you need both for machine learning systems, and the software engineering one is easier to teach on the job than math education. Or at least we expect to have to grow candidates that way and have infrastructure for it.


Remember this was the height of MOOC madness.

Then we learned that students never completed MOOCs, and that instructors hated creating content while getting paid almost nothing (compared to their full-time jobs) to be sharecroppers on the platform, without tenure, status, or the other perks teachers enjoy (because they don't get paid much). Contrary to techies' opinions, teachers do not want to give up their hard-won political power (it's all they have left) for the sake of tech.

They saw what tech did to the publishing industry.


I can't help but think that both of those issues are eminently solvable. In the case of teachers, focus on making a product for the teachers, then charge them either a flat fee or a percentage, à la Kickstarter or Patreon.

As for the "nobody finishes" problem, that's even easier: use Beeminder[0], and if you are writing a MOOC platform, integrate with it.

[0]: beeminder.com; self-control as a service; I am a very happy user, but, except for a few stickers, an unpaid shill.


If only we could prevent college dropouts with beeminder :)


> which is much harder than it was in 2016 due to increased competition, and the requirement for Masters/PhD nowadays

The requirement for most ML jobs in 2012 (when I graduated) was a Ph.D. I took literally every course on machine learning my university offered (six or so) as an undergrad in the hopes of landing an ML position when I graduated. I gave up on trying to find a position because they all required Ph.D.s.

I even had a solid portfolio, as I had successfully applied neural nets to solve a couple of problems by then. One involved real-time image processing, and it was on GitHub.

It seems much easier to get a job in the field now.


Why would you say it's easier now? Even now there's a PhD requirement.


The proliferation of libraries that make it trivial to do machine learning. I know plenty of professional data scientists with liberal arts/business degrees who never completed calculus.

I'd say the bulk of the data scientists I work with now are non-cs/math people. I work with some math/stats PhDs too, but they are outnumbered like 3 to 1.


What kind of "soft skills" do you mean? Typically when I hear "soft skills" I think "interpersonal communication, leadership, organization ability" type things.


I never really understood what showing your work meant in my schooling before college. I'd solve the problem through whatever I had memorized and would occasionally jot down the more complex parts, like long division or large multiplications, at least until I was allowed to use a calculator. In high school we started getting into more complex things like algebra and inequalities, and I was able to power through with intuition and a solid proficiency with my TI-83 and some BASIC.

I don't know exactly what my teachers were doing but it all came to a screeching halt in college when I encountered severe gaps in my knowledge that made Calculus my hardest class. I had missed some very important lessons from earlier on that were huge hangups. It's hard to differentiate an equation if you aren't solid on how to factor the variables.


I had the same reaction as you. I think MOOCs are lacking in more than just this area. I'll use myself as an example:

I used MOOCs to learn various programming languages, simple database queries (insert, update, select with join, etc.) and how to use APIs with JSON. Lots of fun stuff. But when I came to university I started being taught all kinds of math and CS stuff through difficult assignments and tight deadlines that I never would have done on my own.

Yes, you technically can learn a lot of math and other things through MOOCs but it's not always as fun. The advantage of university is that they give you a set of required courses for a major and you have to take them, whether they're fun or not. Sometimes the courses which are the least fun turn out to be extremely valuable later on.


Data from an external source is usually fuzzy and requires some communication to reduce the fuzziness so that it can be properly interpreted. That involves communicating to understand why it's represented a specific way or negotiating to change the representation to better fit your needs.

The same is true for the output of your work.


That's a part of it, which doesn't come up in MOOCs at all (specifically, collaboration in a corporate environment within the software development lifecycle, and negotiating with project stakeholders about specs, requirements, and results).


I also didn't learn anything about that in school, but I did from my coding bootcamp, so if anything I think that is a general criticism of the education system.


I think the soft skills are essential to success, but can be easily learned by someone who is technically proficient.


I started my career as a 28-year-old college dropout doing data entry for a small surf-and-skate e-commerce company in 2012. I never finished any of the MOOC courses I started, but their availability helped me get my first internship, which has led to me becoming a successful backend engineer in SF. I especially credit the original Coursera database class and Udacity's web development in Python course.


Premature deindustrialization might not necessarily be a bad thing. "Premature" is a judgement. Why is taking fossil carbon that was buried deep in the earth for millions of years and putting it up in the air in a matter of decades "mature" in any sense?

Developing countries went directly to cell phones without going via landlines. There could be another way. There should be another way. That is, if we are to sustain 10 billion people at the same standard of living as the West without the fossil fuel disadvantage.


The "premature deindustrialisation" phenomenon was the most interesting/novel part of this article. The rest is the normal AI/MOOC/software-eating-everything hand-wringing.

However, somewhat ironically, the actual paper that's cited [1] seems to disagree with the author's characterization:

> In sum, while technological progress is no doubt a large part of the story behind employment deindustrialization in the advanced countries, in the developing countries trade and globalization likely played a comparatively bigger role.

[1] http://www.nber.org/papers/w20935


For welfare? Good.

It'd be hard to fuck it up as badly as humans have. And God forbid, AIs will do whatever improves outcomes.

Humans employed in welfare, even in the police, have been known to relish the power the law gives them (as it does in many cases) and to use it not even for personal empowerment, but for showing off to girls, for taking petty revenge, and worse.


They're not talking about using AI to allocate welfare, they're talking about basic income.

Using AI to allocate things is dangerous IMO, mostly because AI will use past decision-making as training data, and the training data was created by those power-mad human administrators with all their implicit and explicit biases.


So it'll start as bad as humans, and improve. It's got my vote.


Improve, or get stuck in a local maximum and make the situation worse permanently.

At least with human decision making, there is a paper trail and a means of legal recourse. AI provides neither.


Euhm, just the opposite. Human paper trails, especially those kept by government employees... I've found them lacking, or outright stupid/useless/falsified. That makes legal recourse dubious.

With AI decisions, paper trails will be perfect, and it won't have any emotions about the matter, before or after getting sued.


Don't forget about the implications for paperclips.




IMO Universal Paperclips is one of the best clicker games, even without the AI safety elements, because it has an end that is reachable in a reasonable amount of time.



