The Mid-Career Crisis of the Perl Programmer (modernperlbooks.com)
147 points by pmoriarty on Jan 17, 2015 | 114 comments


> tl;dr: you should have spent your time learning how to write a CRUD app and a mediocre template system, dependency injection framework, and ORM in Haskell Rails Erlang Scala Rails Python Node Clojure Julia instead of mastering one tool and learning how to solve problems.

This is a general rule for everything. The animals that survive are the ones that best adapt (or rather, constantly adapt) to an ever-changing environment, since change is the only constant.

This is the one major thing I got out of being a computer science major: the language doesn't matter and you shouldn't tie yourself to just one language (i.e. be language agnostic). They drilled into us that we should constantly learn new languages for both fun and profit.


As a 'hobbyist' programmer who makes his money outside of the software game, this is something I've really taken note of. And I have to say the programming tech treadmill is one of the most cringe-inducing phenomena I've ever witnessed - particularly in scripting languages. Software folks tend to see themselves as futurists in the fast lane, and as a Hacker News-er I tend to carry myself this way as well. But I kid you not, I can't count the number of articles I've come across where the title is 'X tech is the new ish! - so rockstar lol' and the article is a simple example of http routing or process spawning. It is SO easy to get caught up in this drivel when you're starting out as a programmer.

I've since learned to use programming languages to solve problems I care about. If that makes me the forever-novice then I'm cool with that.


I think this phenomenon is an artifact of the media that we as programmers consume. The easiest way to become a programmer celebrity is to build tools (languages/frameworks/etc.) for other programmers. Many programmers of equal skill are out there programming to solve actual problems, but the HN sphere and the like are awash with people programming on programming itself. Which, while surely intellectually stimulating, does seem a bit 'meta' and removed from why we're all doing this in the first place. But people who work on programming tools get the most press because those topics appeal most broadly to the audience of their peers. Then people in that audience see those topics as 'hot' and focus their energy on them. It's a reinforcing cycle. It reminds me of the ideas in this article: http://slatestarcodex.com/2014/12/17/the-toxoplasma-of-rage/


  > programming on programming itself ... does seem a bit
  > 'meta' and removed from why we're all doing this in the
  > first place.
I heartily disagree, and think that tools vs. "actual problems" is a false dichotomy. Software design/engineering is very much its own discipline. A good API/framework/whatever can lower defect rates and generally make your software more coherent and maintainable. That saves time, money, and headaches. We should absolutely care about the tools we use, because nobody else is going to, hence "programming on programming." Attributing that to a quest for "celebrity" is cynical, and I don't see how you can blame programmers for seeking a good professional reputation by creating tools for their peers.


I teach my kids that the systems they are programming should be considered onions of abstraction layers. They need to understand how to choose from among the options available at each level and to realize that those options are constantly changing.

This onion is the only way we can deal with the ever-widening diversification and combinatorial explosion of hardware platforms, communications channels, types of users, geographic dispersion, data sources, problems to solve, etc.

Improvements at one layer demand adaptations at another; accumulate enough options at one layer and you need to encapsulate them in another. And the bigger these onions get, the more leverage they provide to anyone who uses them, meaning any improvement you make to some part of the onion is contributing to solving a thousand, or a million "actual problems" simultaneously.

If solving end-user problems is "why we're all doing this in the first place," a toolmaker who can solve 50% of each of a million different problems simultaneously is contributing more than those of us who solve the remaining 50% of just one.


So much of the treadmill seems to be about reinventing existing computer science concepts. People learn about a particular concept and become enamoured with a particular pure implementation of that concept.

Just because you need asynchronous I/O doesn't mean that you should exclusively use node.js. And a lot of the hype tends to ignore the underlying CS concept that makes the software worth using. You don't need ElasticSearch because of "search and analytics" but because you have queries that benefit from an inverted index.

But you can find yourself spending too much time coercing technology to do things that it was never designed to do. If you have huge SQL queries to implement basic faceting you should probably just learn ElasticSearch. If you need to completely retrofit a language to do async networking maybe just learn Go.
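
To make the inverted-index point concrete, here is a toy sketch in Python - made-up data, and nothing like how ElasticSearch actually does it, just the underlying concept:

    from collections import defaultdict

    # Made-up documents; the point is the structure, not the data.
    docs = {
        1: "red wool sweater",
        2: "red cotton shirt",
        3: "blue wool scarf",
    }

    # Build the index once: term -> set of document ids containing it.
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.split():
            index[term].add(doc_id)

    # A query like "red wool" becomes an intersection of two small postings
    # sets instead of a scan over every row -- which is what those huge SQL
    # faceting queries end up doing.
    def search(query):
        postings = [index.get(term, set()) for term in query.split()]
        return set.intersection(*postings) if postings else set()

    print(search("red wool"))  # {1}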


amen brother (or sister) - it's the problem that is the cool and interesting part of the job; the tech used to solve the problem is a tool, not an end in itself


Software isn't a game, it's an industry. People use particular languages, frameworks, etc. because it makes them more productive. As a professional, I also can't follow every single trend (e.g. I've got no idea what Chef and Puppet are) but I keep an eye on things that are close to my own area, because I don't believe that fads come out of nowhere, or that you can ignore what other people are doing. Most of the major trends I've seen are justified on technological grounds:

Node.js - lets you share code between client and server

Docker - a more reproducible environment than a VM

Python - the most user friendly syntax of any language

You have a fake humility in your post, but I would ask you what particular things (e.g. what scripting languages) you think are "cringe inducing".


Trust me, it's not a fake humility. I really have nothing to do with the development of software in my day-to-day life. And I wish I could agree with you on the "it's not a game" thing, but having used enterprise Windows software for most of my working life, I really can't come to any other conclusion than that a lot of the software out there running our day-to-day lives must be produced in settings not unlike your local McDonald's. Having said that, I've yet to come across a situation where I've shared code between a Node server and client code. Don't get me wrong, I LOVE writing processes in javascript, but when I was starting out, the language wars really confused me.


If you work with a lot of bad software, a better conclusion might be that producing good software is hard, for multiple reasons, rather than that it's a "game".


I agree, but I did not make that conclusion nor did I assert that making good software wasn't "hard", and certainly not because it's a "game".


It's false humility in the sense that you emphasize that you are not a professional, and then go on to attack those very professionals for their judgement in what languages and frameworks to use and comment on.

I can't comment on enterprise Windows software in itself, since I'm not a big user and have never written it, but Windows itself could never be compared to McDonald's. Incredible skill and expertise was needed to write Windows. As to the software written on Windows, are you qualified to judge it? Have you written something similar by yourself?

I've never used Node myself, I was just assuming that people really did share some code across server and client in practice (otherwise why not just use Python for the server).


1) I didn't compare the technological achievement of Windows to McDonald's. It was a personal anecdote about enterprise Windows software being so terrible that I'd imagine it wasn't written in an environment where user experience was a major concern. I never compared the OS itself to anything. Do I think I could write something better? As a matter of fact, yes, I think I could. Should I put my money where my mouth is? I suppose I should, but even if I don't certainly I should be allowed to criticize software that I use on a daily basis. After all, I'm the user, no? These are my own observations, surely I don't need to claim they're gospel or apologize for them?

2) Maybe I offended some people when I used the term "software game". Rest assured I was making a slangy generalized reference to the software industry itself, and in no way did I mean to trivialize those who do dev for a living. In fact, I applaud you.


As a professional programmer, I think there are loads of reinvented wheels, churn, bullshit, etc. I think your initial comment that spawned this back-and-forth is spot on. It's an argument about the degree, I guess.


Regarding enterprise software, as a user you can say what you think about the software, but how can you make claims about how things might be better? You could only make that sort of claim if you understood something about what it was like to write that software. As an amateur you know more than the average user, but I would ask you to consider that you don't know all the constraints faced by the person writing the actual software.

>Maybe I offended some people when I used the term "software game"

Yes, you did offend me. And I appreciate your clarification. My main point was that it's easy to criticize, but until you get your hands dirty, you won't really know what it's like to write software for a large user base. And it doesn't have to be commercial. If you're working on open source software with 100,000 users, your experience is as real as with commercial software.


> Regarding enterprise software, as a user you can say what you think about the software, but how can you make claims about how things might be better?

Because I'm the USER! If we're hitting a language barrier I apologize but enterprise software is for the user. Surely this cannot be disputed.

> You could only make that sort of claim if you understood something about what it was like to write that software.

NO! IMO, this is an insane way to think about client software! The user doesn't care about the technology nor the hardships involved in shipping said software! This is why software sucks! I can sympathize if you work at a shitty company with a shitty boss, believe me I can. But no, I feel your software should die if this is your ethos. Let a company who cares about its users take those reins!

EDIT: grammar


> I apologize but enterprise software is for the user. Surely this cannot be disputed.

Nope. The reason enterprise software sucks is because it isn't worth making it not suck. And the reason for that is there is a huge disconnect (many layers of management) between the person who buys the software and those that use it. Thus the sucktitude of the software is largely irrelevant for purchasing decisions because the buyer never experiences the pain of using it. This is starting to change as people get more experience with quality consumer software, and such software (e.g. Dropbox) begins to invade the enterprise.


I'm not arguing against your right to complain that software is bad from the user's perspective. But that wasn't your point. You claimed there was something wrong with the software industry, that their failure to produce better software was in some way related to your original post regarding faddishness in the software industry.

That's what I'm arguing against. I'm saying that you can't know the reasons why software sucks unless you've worked as a programmer (or have some other equivalent way of knowing, e.g. being a product manager).


I don't really have a problem with anything he's saying. It's not even a secret that we culturally care only dimly, if at all, about the people who are receiving the photons we spew out of their screens, and we are the worse for it (look at how often people kvetch about PMs who want things that are hard but obviously better for the user). We're too busy looking inward and bragging about our build systems to muster any empathy for people outside our clique.

Your attempts to shut him down with "well you're not there, man" are offensive to me, and I do have the resume to hurdle your arbitrary no-complaints bar. I don't have to be the employee of an oil company to know that regulatory capture's a thing; he doesn't need to be writing a bunch of leet node.js to know that few people working on anything in tech give a single solitary crap about him or anybody else using their stuff past the buying point.

He's right to be mad about what we, as a culture, foist on our users. It reflects poorly on us that we are not likewise angry. We should be angry about many things and we are not.


I wasn't initially attempting to shut the poster down because they weren't a professional. In later posts I did devolve into saying "well you're not there, man". What I was initially criticizing was the false humility that somehow made not being a professional into a kind of virtue, affording the poster a greater insight than actual professionals:

  As a 'hobbyist' programmer who makes his money outside of the software game, ... the programming tech treadmill is one of the most cringe inducing phenomena I've ever witnessed - particularly in scripting languages...

  I've since learned to use programming languages to solve problems I care about. If that makes me the forever-novice then I'm cool with that.


I am a professional and I totally second his position. It makes perfect sense to me and illustrates a large chunk of the frustration I feel every time I see the same old stuff in a new package trotted out like it is the greatest thing since sliced bread (only usually a lot less efficient).


Yup. I'll put my thumb on the scale, too. The amount of aggressive reimplementation of the entire universe is something that makes me pretty sad. I think it's most acutely visible in certain parts of the JavaScript ecosystem--Grunt? Gulp? RequireJS? Browserify? Webpack?--where the incremental improvements of a given tool are seen as license to rewrite the universe. Which isn't to say it doesn't exist elsewhere, that's just top-of-mind. And don't think that I'm not aware that there are benefits to those incremental improvements--but we spend so much time on them, on how so much better this is because teensy-change now everybody switch over...I don't think we do much caring about our users. That upsets me.

Saurik had a great post not too long ago[1] that made me think about how Github and the public nature of open source feels almost competitive now. I'm not at all immune to it; when I started running into teeth-pullingly irritating problems with Terraform (written in Go, which I, uh, "don't like" at a minimum), my first thought was "well screw it, I'll go rewrite it in $X and show them!". Fortunately, my good sense prevailed and I wrote a hat on top of Terraform to add some of what I would editorially consider "sanity"[2] on top, but a lot of what I see out there seems to be people succumbing to the siren song of "well, I'll do it myself and it'll be awesome and I'll get all the credit."

[1] - https://news.ycombinator.com/item?id=8854587

[2] - https://github.com/eropple/terraframe


Reducing context switching when working on both; it's easier to hire people who are fluent in a particular language than people who are fluent in two; and performance - in that order.


I think software as a whole is a lot more subject to fads and personal preference than you're letting on here. Which is okay, because picking the exact best technical fit (if it's even possible to figure this out ahead of time) for any given problem will give you tech ADD.

E.g. lots of people use node.js because it's in a language they're familiar with and it fills a niche (lightweight process that can accept and process many incoming requests). I use Erlang for those needs, which actually existed before node.js. But it's a "weird" language so most people don't use it.

I like Python more than Ruby. I've learned both but never use Ruby anymore. Other than preference or mandate I'm not sure why I would use one over the other.

Lots of people used to writing in scripting languages seem to be discovering the wonders of a speedy language like Go. You could obviously use speedy languages before Go but their preferences made them not want to use them. For whatever reason Go is attractive to them.


>E.g. lots of people use node.js because it's in a language they're familiar with and it fills a niche (lightweight process that can accept and process many incoming requests). I use Erlang for those needs, which actually existed before node.js. But it's a "weird" language so most people don't use it.

If languages are close enough, then yes, familiarity or popularity will influence usage.

>I like Python more than Ruby. I've learned both but never use Ruby anymore. Other than preference or mandate I'm not sure why I would use one over the other.

Never used Ruby, but my impression is that like Python, Ruby is a slow interpreted language with a good set of libraries and user friendly syntax. So you're probably right that there's not much between them.

>Lots of people used to writing in scripting languages seem to be discovering the wonders of a speedy language like Go. You could obviously use speedy languages before Go but their preferences made them not want to use them. For whatever reason Go is attractive to them.

I think the attraction of Go is that there is one build system, one debugger, one linter, etc. Having the language developers take responsibility for all these things is a big plus, as it avoids buck-passing and inconsistencies. While Go doesn't excite me at an emotional level, there are some things that it seems really good at.


How about if a person who develops software for a living agrees that it seems more and more like a game? Words and phrases like "our stack", "you need X years of Y language," or "we need a Z language dev" and so on all read like rules in a role-playing game to me. Sometimes after I read certain posts--and particularly posts gushing over some language's features--I feel a bit surprised the commenters don't mention hearing a "ding" after "leveling up".

Our industry is filled with tens of thousands of professionals quietly carrying on developing software boring and exciting, but it is inundated with loudmouths acting and speaking in ways strongly suggestive of a certain "gaminess."


Awesome. I can almost visualize playing "Dragon Age" with my daughter while reading your comment.

Considers +2 "XML Mace", but requires level 15 Java Brute Force skills to wield. Reconsidering player "cast", selects +3 "RegEx Dagger", which better matches level 18 Perl Dexterity. Upon leveling up, considers how many XP to spend on "management persuasion" craft skills. :<>


I can confirm. I worked with Chromatic at the consultancy he mentions; I developed the Python/wx app.

He was a stand-up guy, a professional in an environment where most weren't. (It turned out that the company was already having trouble making payroll when they hired me.)

The perl dev group's focus on testing was impressive and very forward-thinking. But at the same time, the Golden Hammer anti-pattern was evident to me: fitting the client's problems to your tool, as opposed to placing the client's problems at the center of your focus.

I believe that my CS background helped me take that client- and problem-centered approach. And led me to create several proofs-of-concept in different languages before settling on Python/wx (largely unknown to us) as the best way to meet the project's requirements.


IMO, it's not a CS background that does that. It's empathy, it's putting somebody else's concerns on the level of or even (gasp!) above your own, and that's in pretty short supply around here. Empathy is often confused for cheap populism--because populism is always cheap, I guess--and gets that middlebrow dismissal that's so common in tech.


It's both. dogweather had the experience and breadth of skills to explore a range of solutions to find a good fit as well as the empathy to see things from the customer's point of view--not to mention the people skills to make his case so well.


Except if you keep switching languages, you never truly become proficient in any of them. And you can pry C and Perl from my cold, dead hands. :)


This isn't true, in my experience. I'm production-comfortable in at least half a dozen languages and yet I've never used any of them for more than maybe a few consecutive months at a time. I can go as deep on the JVM or the CLR as you'd want to go, I won't embarrass myself writing C++ in anger, and in the last year I've gotten quite comfortable with the idea of writing production code in Ruby (which, for me, means diving in and figuring out how MRI ticks).

My travels indicate to me that "proficiency" grows from understanding the mental models that go into programming. Within a linguistic family (and most of the languages in heavy use are either dynamically-typed OO or statically-typed OO), the differences are marginal and mostly unimportant. If I had to go write Node ("like Play but without a working type system!") tomorrow--well, I'd quit, but before I did I'd be proficient in a week.


I have to disagree with you, but remember this is my own mileage. :)

C++/C#/Java, less so Ruby, are really all kind of the same, OO languages, 3 statically typed and one not. But basically, they use similar ideas with different syntactical sugar.

I've recently moved to using Clojure/Lisp and I kid you not when I say that not just the language but the tooling allows me to be an order of magnitude more effective when shipping production code.

I'm basically ruined, and now find it hard to get a job because I refuse to code in any other language. Because I want to be able to: spend time with the family, walk the dog, go surfing at 6am every morning, go to live jazz....._and_ still make sure I ship quality stuff on time. Clojure/Lisp lets me do this :)

I loved the article because it was basically everything I've realised in the last decade; I am way more effective in one particular set of tools than another, and I don't want to have to learn python or Go or whatever, and all their associated libraries, community and quirks. I just want to use the shit I know and go do something else with my life :)

This is MHO and, like I said, mileage will vary.

Oh about the job thing, I decided to start my own company so I could build it how I know I will be best able to do it.


I'm curious as to your background before moving to Lisp?

There's a huge gap in productivity - at least for small, exploratory, green-field projects - between C++/C#/Java and Lisp. There's a much smaller gap between Python/Ruby and Lisp, enough that a couple prominent Lispers [1][2] think that they are perfectly acceptable Lisps.

I like Lisp. I was pretty obsessed with it in college, and have gone so far as to implement a couple of them [3][4]. But my experience is that I'm perfectly fine with Python. Sure, I miss macros and conditions and CLOS. But that's all made up for by the awesome libraries available in Python, the ease with which I can express my algorithms, and the syntactic prettiness. I'm using it for my startup; I was just thinking this morning (while poking at a unit test in the REPL) that it's just as good or better than my Lisp experience ever was.

[1] http://norvig.com/python-lisp.html

[2] http://www.randomhacks.net/2005/12/03/why-ruby-is-an-accepta...

[3] http://en.wikibooks.org/wiki/Write_Yourself_a_Scheme_in_48_H...

[4] https://github.com/nostrademons/arclite


Background has been mostly in C/Java and Scala. Scala for the past 3 years. It's a fantastically complicated language, with even its creator Martin Odersky looking to do a major rethink and turn it into something simpler [1]. I never could finish reading the language standard. There was a lot of effort put into a lot of libraries that just made it harder to use Scala, like Scalaz, which pimped almost every standard Scala library class. Almost every time I went to use an external library I had to also learn the DSL that came along with it (the popular HTTP request library "dispatch" needs a periodic table to decipher how to use it [2]). Then usually after that cognitive load I would encounter the inevitable limitations of the DSL and find no way out of DSL hell; I went through 4 different DB libraries, trending down from ORM-high, ORM-lighter, ORM-almost-nothing to "just some string interpolation around SQL". Some people find a lot of pleasure in debating language features X and Y, and at first I cared about that, I really tried, but I was just fooling myself, I didn't really.

I like Lisp and specifically Clojure, for a number of reasons, with a lot of those reasons being because it's the _same_ .....everywhere. I read Clojure code like I read English; I can glance at a piece of Clojure and not wonder why there is an implicit argument that needs to be passed and where the compiler might be finding it; the rules for Scala's static implicit resolution go in one ear and out the other for me, even after reading them a few times.

Clojure has a lot of improvements on CLOS - I'm not sure how much Clojure experience you've had? There are enough differences that some of the arguments I noticed in the paper (which seem to be really just syntactical) on why Ruby is an acceptable Lisp don't really apply to Clojure. For example, in Clojure you don't use lambda, you use "fn" to do the same thing. One of the things I really like about Clojure is that I get to run it on the JVM, which still beats the Ruby VM hands down, AFAIK; I did check a little while ago.

I'm not that familiar with the Python REPL, but Clojure's is pretty good and I love being able to jack into production when I have to and modify executing code to do tracing or debugging without having to restart it :) I know, it should never get to that point, but testing only covers so much.

Ok, I think I'll leave it at that.

Another poster on this thread made a comment along the lines of: people think in different ways; some people love thinking in C++ and classes for everything....I'm not one of those people; I want to just say exactly what I want to say and nothing else; if I want a class I'll make one, else get it out of my way. I think in LISP, so when I type it's an almost 1-1 correspondence.

Cheers!

p.s. nice work on the Javascript port of arc, I didn't know that had been done.

[1] http://www.infoworld.com/article/2609013/java/scala-founder-...

[2] http://www.flotsam.nl/dispatch-periodic-table.html


Disagreeing is how we learn, yeah? Almost as if relating different points of view is a good way to expand one's horizons. Thanks for being cool about it!

I'm not unfamiliar with Clojure or Lisps in general; personally, I can't think as well in that mode and it doesn't map well to what I do (either in terms of work or personal projects), but I get it. They're nice, for a certain kind of thinking that I don't find myself needing that often.

I didn't mean to imply that there aren't tools where I'm more effective, because they're better tools, to be sure (by my lights, Ruby is a better tool than Python, and carving bits into a spinning disk with my teeth is a superior option to Go), but I find that after a certain point within an environment I've put together in my head a sufficient playbook that missing certain things here or there doesn't diminish my enjoyment of the process of building something. Libraries, quirks...meh. I went back to PHP a few weeks ago after not touching it for years and it all came back in fifteen minutes or so. Maybe I'm just wired for it? I dunno.


Not trying to make a point or anything, but from my own experiences, people definitely seem to align differently to different languages and paradigms.

I know for myself, when I got a chance to code professionally in Erlang, I was hooked, and could not see myself going back to Java or a mainstream scripting language. Having to think and model problems concurrently, think about failure, etc, is so natural a fit for my brain that trying to use a language that doesn't have those as first-class constructs feels stifling, and having to eschew those concerns to write something that ignores the considerations Erlang prioritizes breaks my brain; I've tried it for timed coding events and it just did not work (i.e., realizing after the fact "Oh, yeah, I totally didn't need to spend time making sure that my supervisor tree made sense, that state was persisted properly, and resumable in the event of restart"). And yet it's not totally ideal, either; I care deeply about typing and want to squeeze as much 'correctness' as I can from static analysis, and Dialyzer just isn't quite sufficient to fully scratch that itch.

Meanwhile I've seen some teammates who, while having the same diligence in thought, tend to be far more sequential, far more inclined toward single-threaded models, reserving concurrency for the truly necessary cases, and more concerned with try/catch style error handling than with ensuring proper typing (Dialyzer) and otherwise letting it fail (at least until we know what we -really- need to handle).

Both I and my teammates have very similar backgrounds, coming from a predominantly Java background, some with a bit of .Net, Javascript, etc, with a smattering of mainstream scripting languages thrown in, and of course, we're all working on the same project. Yet our 'taking' to the language and its paradigms varies pretty drastically.

I think people, especially the HN crowd, miss that. PG's "Blub" paradox is viewed as the rule, and obviously everyone thinks they're at the top (or at least, as high up as is pragmatic for their domain), and while it's certainly true that you can't appreciate what you're missing if you don't at least understand it, people's brains work in very different ways. Languages certainly vary, and some are definitely better than others along a given axis, but I think it's unfair to take one's own experiences as objective truth. Even apart from domain differences, "This language made me 10x more effective"; yes, but is that due to the language, or due to your thought processes naturally fitting it? Will someone else who thinks differently find the methods of expressing problems in that language so naturally efficient that they'll see the same gains, or will it actually slow them down, even after learning it, because it isn't as good a fit for them?


Though I've used it for my own stuff a little bit, I don't have a sufficient background in Erlang to really think natively in it. I do, however, spend a lot of time in Scala-land, where Akka is king (for better or worse) and tries to pull a lot of the same principles into the JVM. Have you ever looked at it? I'm curious how it feels to somebody who's worked extensively in Erlang.

I think PG's Blub Paradox only works to a point, as you say, and I think you nailed what regularly drives me nuts about technology: there are axes to this stuff that people mash into a continuum to feed their (rampant) insecurities--so they can look down on somebody. I think it's a sign of cultural and personal immaturity. I thought it was super-cool to issue sweeping statements about technologies at one point, and sometimes I still fall into it when I'm feeling salty, but I try not to. Which is different from sweeping statements about the craft of programming; Gary Bernhardt's twitter feed is chock-full of things that as a programmer you really should full-stop not do to your users. But technology--ehh. I goof on Go a lot, and it does drive me squirrelly when I have to debug somebody's Go project for work, but I just can't cape for any particular stack anymore. Even Java has plenty of uses. (For me, Java's useful in that I can see in my head roughly what the bytecode output of a given thing will be, and reason about its behaviors on the JVM better than I can with Scala.)


I only ever toyed with Scala, and Akka, so I can't really speak too much to it, but it struck me that what I liked most about it was what Erlang simply ~is~.

I think the main difference to me was that Erlang was built from the ground up for fault tolerance. Everything about it is structured around that; you get process isolation and concurrency because of the need to not have one error/exception kill everything, you get distribution because you need hardware/VM issues to not kill everything. Akka's motivation isn't that; it can't do that. The JVM doesn't allow for process isolation, there's one shared heap, etc, so it really does matter what is running locally vs what is running on other machines, you have to be careful about mixing concurrency patterns and using vars, and it is possible for one process to die and take down others with it.

Akka is a little alien to me as well; things like 'become' seem a bit heavy-handed (I believe it's due to the JVM not supporting tail call optimization, and Scala can't trampoline the calls if it doesn't know what you're trying to do). Similarly, the reactive nature of actors in Akka, where they don't do anything until they receive a message, while a superficial difference, still bothers me a little bit.

My mental model of an actor in Erlang is simply a lightweight process that executes a function and has a queue attached; at any point the process can check its queue, and if it needs to repeat itself or change its behavior, it calls itself or a different function. It's a very simple mental model that all the OTP behaviors build on. Akka instead has (necessarily) various syntaxes; you're creating this ~thing~ first, setting up all the things it can do, and then it sits waiting for a message to start executing. It's trivial to transcribe one mental model to the other, but I find Erlang's is the one I think best in.
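
(If it helps, here is that mental model roughly transcribed into Python - purely illustrative, single-threaded, with none of the isolation or scheduling a real Erlang process gets:)

    import asyncio

    async def counter(mailbox, state):
        # A toy "process": pull a message off the queue, act on it, then
        # call yourself (or a different function) with the new state.
        msg = await mailbox.get()
        if msg == "inc":
            await counter(mailbox, state + 1)
        elif msg == "get":
            print(state)
            await counter(mailbox, state)

    async def main():
        mailbox = asyncio.Queue()
        asyncio.create_task(counter(mailbox, 0))   # "spawn" the process
        await mailbox.put("inc")
        await mailbox.put("get")                   # prints 1
        await asyncio.sleep(0.1)                   # give the toy process time to run

    asyncio.run(main())

Swapping in a different function on that recursive call is roughly the moment where Akka reaches for 'become' instead.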

Now, that said, provided you don't need the soft realtime performance of Erlang, Scala/Akka is generally going to be more performant when it comes to number crunching; it's easier to just do it natively in Scala/Akka than have to do it externally in Erlang and open a port.

Scala/Akka benefits from existing JVM expertise which most devs have a bit of; the Erlang VM has its own deep knowledge you have to learn to run it in production (that said, the introspection you have into it is -killer-).

Scala/Akka is easier to transition people to, due to the OOness ("squint a bit and it can kinda be written like Java!"), but I think making a clean break of things with Erlang helped me; even the 'strange' syntax was helpful in that I didn't assume false familiarity due to syntax that seemed familiar (except for 'if'; everyone misuses that at least once). But if a dev finds that way of thinking alien to them, in the Erlang world they're just not going to be that productive; in Scala, you can mix paradigms and they'll likely see better productivity in areas of the code that are still very OO and imperative. I tend to feel there be dragons in OO, imperative code, but the reality is a lot of shops don't have the leisure to write everything in a functional manner, and to pick and choose who works on it based on their familiarity with that paradigm.

There are a few other odds and ends: Scala allows operator overloading, which I need so rarely that opening it up is a con in my book; there's a lot of implicit stuff (apply, for instance) that, while really cool, allows for a lot of hidden dragons to be buried in a large, shared codebase; I really wish Erlang was statically typed, as I said before; etc. But those are the main ones.


Your experience is very narrow it would seem, and not an example worth generalizing from. So you know Java, C++, C#, and Ruby. That is _not_ a diverse set of languages.


[flagged]


Retra is wrong. You're right. No reason for the hostility.


He's pretty matter of fact, I see no hostility there.


What you're scanning as hostility is exasperation. I have been writing code for twenty years and getting paid for it for twelve. That's not a "get off my lawn," either, not least because I don't know many 27-year-olds in the Boston area with a lawn and I certainly don't have one either. That's a "get out of the basement and go outside." I am remarkably sick and tired of the endemic social insecurity that runs through this industry. There are so very many fun examples of it around here - I miss @shithnsays desperately. At some point in the last couple years I finally grew up to the point where I realized that the tireless posturing and wellactuallying just doesn't get anyone anywhere. When the best he could do was "oh, you only know four languages, smarm smarm smarm," the jimmies, they were rustled.

At least if you're going to be That Guy who everyone hates dealing with ever, be right about it.

(Also I'm drinking, and somewhere between two and five in the morning I sort of explode into a particularly prickly thesaurus. It's a medical condition. I have a doctor's note. Don't judge me.)


http://xkcd.com/386/

There is nothing wrong with someone continuing to be wrong on the Internet.


I hear you, but various studies show that negative feedback actually exacerbates the problem instead of helping to fix it. When people are given negative feedback their subsequent contributions are of even lower quality and also more numerous.


I wasn't writing that for him. =)


> I've written code for production use cases in Java, Scala, Groovy, C#, Guile, PHP, JavaScript, Ruby, Python, Perl, VBA, and C

Whenever I see a list of high use languages with 1 or 2 obscure ones like Guile or even Groovy stuck in the middle, I wonder if the real intention of the list is to promote the obscure one by associating it with more respectable ones.


Hello - I also worked with Java, Scala, Clojure, Groovy, C#, PHP, Ruby, Perl, Python, Javascript and C and have been playing with Rust, Haskell, F#, Erlang and Scheme as well.

In my opinion it isn't just an easy context switch and the principles aren't necessarily similar between OOP languages and functional languages, or between dynamic or static languages.

Some things remain the same, but some things are different, like the design patterns used, the techniques, the libraries and tools available, the mentality as a whole. When picking a language, you're not picking just a bunch of syntax and type system rules, but the ecosystem around it as well. It takes me one or two weeks to learn the basics of a programming language (i.e. the point at which I'm actually able to start coding and solve problems), but it takes me a year or more to learn the ecosystem around it.

For example doing functional programming doesn't fly in a language like Java, even if you can (as painful as that may be), because then you're the odd duckling swimming against the tide and you can't do that if your peers aren't doing it as well or if the libraries you're using are working against you. LISPs are very FP, but the abstractions used in comparison to what people do in a static language like Haskell are very different - for example LISP people don't think in terms of monads or applicative functors and they do like wacky things like rules engines. Erlang's or Akka's actor model is also very different than any other model for dealing with concurrency, because even though the rules look simple at first, that "let it crash" mentality and the ability of actors to communicate across address spaces changes everything. Ruby's meta-programming is also very foreign to people accustomed to Java or Haskell. Scala looks familiar to Java developers starting out, but beginners shouldn't be fooled, because Scala in common use is actually closer to Haskell and combined with OOP it sort of makes it unique in the approaches used. FP developers in general tend to prefer doing things with FRP, rather than MVC. Etc, etc...

Of course, you also have families. Python and Ruby are very similar, in spite of the seemingly different mentalities of their communities and after learning one, then learning the other is barely worth it. Ditto for Java and C#, many people would argue that C# is the nicer language, but Java has the nicer ecosystem of libraries and on the whole developers barely have a reason to jump from one to the other.

I'm not sure if I have a point - I guess what I'm trying to say is that developers should expose themselves to different ways of approaching problems. And when you're learning a language, one should go past the superficial syntax differences to discover the goodies and that takes time.


Honestly? Part of why I sort of shrug and don't find any of these languages terribly different is that most of the lessons that are surfaced in one language are applicable elsewhere.

I strongly disagree with regards to the notion that Scala is closer to Haskell, I think the scalaz folks want it to be and I think not many people listen to them. Instead I think Scala is what Java should become and I find a lot of valuable lessons come from it when I'm writing Java, functionally and immutably. It's harder than in Scala, hard enough that I default to Scala when I have the choice, though with IDEA it's not that hard. (My C++ looks pretty functional and immutable too, and because everything lives on the stack or in unique_ptrs I find that it makes memory management a heck of a lot easier, too!)

Or I write metaprogramming-heavy C# and Java, automating annoying or hard problems (and I do have reasons to use C#, because Java doesn't run in half the places I want it to!) with a lot of reflection because it makes sense to me. A lot of that comes from JavaScript and Ruby; the expression of it is different in a statically typed language, but to me it's the same kind of thinking.

They're all programming languages. They emphasize different things. That doesn't mean that all those things can't be happily incorporated into whatever you're doing in whatever language if you're willing to put in the effort. I am, and I think it makes me write much better code because of it. Small sample size and all, but I've never had anybody complain that "your code isn't idiomatic", though I have had people say "I like that, I've never seen it before."


The other day at work I was asked to describe what I look for when I look at a resume or screen a candidate. I said that one of the most important things to me was a demonstration of a breadth of knowledge and experience and that was more important to me than a depth of experience. The example I gave was along the lines of IMO it's better to have spent 10 years working with a variety of languages from a variety of paradigms on a variety of types of projects than to have spent 10 years becoming the super deep expert in a single thing.

Why?

Largely the reasons you cite. As trends change and technology changes, often times the person in the latter camp will struggle much more than the person in the former camp. Sure, the people in the latter camp are much more impressive in their little slice of the world but the person who has seen the proverbial world will often be able to adapt more quickly.

I've also found that the person with a breadth of knowledge which spans these sorts of concepts can often use that experience to form outside the box solutions (in a good way) to sticky problems.


Half a dozen is not significantly more than four in the context of "number of languages known."

But when you're making an argument "in your experience" and happen to list your experiences in such a narrow band, you are not making a good argument. That was my point. I don't know why you think it is being a jerk to tell you to make a better argument.

To address your point: almost everything a programmer does is achieved through the manipulation of language. We solve problems using language. Learning a new language is learning a new way to solve problems. If you find this to be easy, then maybe that is because you keep solving the same problems.

My experiences stand in stark contrast with yours. So what are you really basing your claim on? How do I, as someone who cares about this subject, resolve the conflict of what you're saying with what I've experienced?

Surely measuring our genitals is not the solution you'd propose.


This wasn't about my "argument", because it's really hard to be so unclueful as to think that a mild relation of experiences and shop talk is an argument. This was a chance for you to be That Guy. But nobody likes That Guy. So stop.

> My experiences stand in stark contrast with yours. So what are you really basing your claim on? How do I, as someone who cares about this subject, resolve the conflict of what you're saying with what I've experienced?

By actually choosing to participate in discussion? By sharing your point of view and maybe everyone could learn from it? I might have been interested in your point of view if you had chosen to share it rather than being snide and dismissive. Now because you couldn't resist that I just want you to go away, and I won't reply to you again.


"In logic and philosophy, an argument is a series of statements typically used to persuade someone of something or to present reasons for accepting a conclusion."

I don't have any more patience for your offended emotional tirade. Get some rest and get over it.


> your offended emotional tirade. Get some rest and get over it.

Personal attacks are not allowed on Hacker News.


> But nobody likes That Guy. So stop.

I think you are confused about who is being 'that guy'.


So which languages do you know, and what are your experiences?


I don't know if it would be all that useful to say what languages I know. C, C++, Java, various LISPs, Python, Haskell, Agda, Rust, and a couple different scripting languages. Mostly mainstream stuff with well-developed tools and theory.

The differences between languages in the same category are subtle, and I wouldn't call myself an expert in any language in which I didn't know what the subtle differences were and why they matter.

I would characterize learning new languages like learning to play a new musical instrument. Yes, if you know one instrument, you probably know how to read music, keep a beat, develop good practice habits, etc., and those things will transfer just fine to the new instrument. But those skills do not actually make you good at the new instrument -- they are just necessary to use it.

For instance, I know how to play a trumpet. Trumpet skills are useless on a piano. You don't need any ambidexterity. Your rhythm and pitch are controlled with your breathing. Maybe you'll want to sing instead? Well, breath control is great for singing, but being a good trumpet player doesn't make you a good singer. Controlling your pitch is much more difficult when singing than on a trumpet, to say nothing about how easy it is on a piano. But it only works on a piano if it is in tune. Piano tuning is not a skill a trumpet player or a singer needs to learn. But what about a guitar player?

Almost all of the hard work of mastering an instrument lies in the details that are exclusive to that instrument. The fundamentals are necessary, but they don't seem to represent mastery. Someone who's memorized a music theory book is not a good musician, even if they might be advantaged in the act of learning to be one. They might actually be a terrible musician because they never learned how to practice.

Not to mention the whole issue of trying to say what competence, mastery, proficiency, et al actually mean.


1. IMO, being a specialist in a language and being sufficiently proficient in it are two different things.

2. It's not like you ever really let go of the languages you already learned if you keep using them.

3. It's pretty obvious when a language has run its course. It shows in the job market. Why not transition then instead of staying on the Titanic?


Languages aren't all that different from each other. Most of the knowledge is shared between them (with a few exceptions like imperative vs. functional.)


Oh please. A "language" is way more than a way to define functions and add numbers. All modern languages like to rewrite the world: standard library, help system, build system, distribution system, FFI and so on and so on. It'd be great if these were standardized, but language authors love to NIH apparently. Moving to another "language" is a HUGE endeavor, unless of course you like to half-ass it.


Sure, but they're not that different, particularly if you've been around the block a bit and seen a bunch of these.

For a reasonable definition of the word productive, I've never found it terribly difficult to pick up and be productive in a new language. Sure, a year later I'll look back and say "man, I was so silly back then" but that's not the same as being useless. We routinely hire people at work who don't have experience in our primary language and it's never been an issue.

One of the things that irritates me the most about our industry is the fetishization of the difficulty of switching languages. In my younger days people would say that the syntax was too difficult to pick up. Now that people realize that's hogwash they've moved on to saying that the frameworks are too difficult to pick up.


Right, but there's nothing wrong with being a good programmer who needs to refer to the library docs more often. What's "oh please" is the notion that you can never be good at programming unless you've been using Perl nonstop since 1995. A Python programmer can pick up and become reasonably proficient in Go in a few days, for example.


A lot of this depends on what level of proficiency you're talking about.


Anyone familiar with any ALGOL-derived language could be productive, at a commercial level, in any other in a matter of days. Probably less these days with easily searchable API docs.


Agreed - back in the '80s I picked up PL/1G in a few days.


Seeing other languages can help you understand other paradigms better. The canonical example is people learning Haskell or Lisp and then going back to C++ and writing much more stateless (and thus more maintainable) code.


> Do I want to program in Cobol or Mumps for the rest of my life, or even the rest of the afternoon? No.

I had been a Perl hacker and CPAN author (kraehe) ages ago, but time moves on. I learned coding with paper tape and punch cards. I won't get any good job if I restrict myself to COBOL or Perl.

Perl has its strong sides; especially the CPAN culture of testing and documentation is unmatched in any other language. I still use Perl if there is a module that solves my needs. But my portfolio of languages grows constantly. I don't want to write Perl for the rest of my life.


the people who get paid the most are those who can call themselves experts in the field. you can't do that if you tinker in every field.

pick something, work on it, find work doing it and get paid the most.

if the work goes away and you need to pick a new field, then what you've learned isn't wasted, you'll still know how to solve problems, just you won't know the idiosyncrasies or the ecosystem as well as you did. but you can learn that pretty fast.

point being, learn one thing and stick at it, that way you'll get more money.


I used to think there was an idiomatic way to write code in a given language, that following conventions and community guidelines was a good thing, that languages had a natural flow to them and writing against the flow was a bad thing. These days I focus on solving problems. Idioms, conventions, languages, etc. are all bullshit. Everything is subservient to solving a problem. Ember or Angular? Don't care. Ruby or Python? Don't care. C# or Java? Don't care. Haskell or F#? Don't care. Statically typed or dynamically typed? Don't care. What is the problem we're solving? Now we're getting somewhere.


"Everything is subservient to solving a problem". Correct. And writing good, clean, idiomatic code usually helps you avoid introducing code-level problems when trying to solve your "real world" problem.

You're not wrong so much as perhaps just more experienced, to the point where caring about the idioms isn't much of an issue any more - not because idioms don't matter, but because you're writing code that's good enough. I've found the 'standards' and 'idiomatic code' to be useful not so much as a benchmark for "you're 100% wrong if you don't do this" but as "you're creating another set of problems for yourself if you're not careful".

"Ember or Angular? Don't care. Ruby or Python? Don't care. C# or Java? Don't care"

Well... you may need to care, because one of the problems you may have to consider is long-term maintenance as well. Using Ruby for "project X" in a team of 15 .net developers with extensive C# experience, simply because "Ruby is best for this problem" is probably just trading one problem for another.


> Well... you may need to care, because one of the problems you may have to consider is long-term maintenance as well. Using Ruby for "project X" in a team of 15 .net developers with extensive C# experience, simply because "Ruby is best for this problem" is probably just trading one problem for another.

I'd like to reiterate this point. At some point, somewhere down the road, a different programmer than you is going to have to read your code and try to figure out what it does. If you don't learn your languages well enough to be idiomatic, then you'll be contributing to technical debt load instead of helping to pay it off.

Unless you're doing big enterprise consulting. That's an entirely different sport.


Right - I mean, at the end of the day, the social problems like the one you mention, rather than the technical merits of one platform over another, are what will dominate in the long term. So whatever stack is chosen is not going to matter as much as people think. If the problem is ill-defined and there is a lack of agreement on what exactly is being solved, then adding technology into the mix is not going to make a difference.


I think this is what Andrew Hunt and David Thomas meant when they said "program _into_ a language" as opposed to "program in a language" in their book The Pragmatic Programmer.


Disagree, you should always strive to write code in the common idioms found in that language's community with local "house rules" deciding the fuzzier points.

I've gotten into this argument frequently with coworkers over the years, usually surrounding situations where we have some side code in a language where the common idioms, style guides, etc are vastly different than our main language. They end up writing X as if it were Y with different syntax, to which I object. Their take is always that it makes the code more readable, but that's not really true - it's only more readable to a Y programmer, not the space of X programmers.
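
A contrived sketch of what I mean, with Python standing in for X and Java for Y (made-up code, not from any real project) - both versions work, but only the second reads naturally to the people who actually live in that language:

    # "Java with different syntax": index loop, mutable accumulator, == True
    def active_names_java_style(users):
        result = []
        for i in range(0, len(users)):
            if users[i]["active"] == True:
                result.append(users[i]["name"])
        return result

    # The same thing in the local idiom
    def active_names(users):
        return [u["name"] for u in users if u["active"]]

    users = [{"name": "ann", "active": True}, {"name": "bob", "active": False}]
    print(active_names_java_style(users))  # ['ann']
    print(active_names(users))             # ['ann']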


Learn to be a "programmer" not a "X programmer".

I recently sat down to learn Python a bit. Inside of a week I had useful code up and running. I'm sure it was non-idiomatic and I'll laugh from embarrassment a year from now, but screw it, it works, and real people ship.

The last code I wrote before that was in Java. I had to relearn it after a decade of non-use. Real people ship.

More recently I wrote some more Python code, it's all hacky and terrible, I can feel it, but you know what, it shipped and has kept a half-dozen people employed for another few months. Real people ship.

Most recently I had an issue to solve; not wanting to fight Python 2.x's stupid typing issues, I hacked it out in Perl. The solution shipped. Real people ship.

GSD (get shit done) and move on. It doesn't really much matter what it's written in. It's all bits and bytes, 1s and 0s in the end.

Real people ship.

(also, don't work for free)


I think chromatic has more than earned the right to call himself a "real programmer". He may not be a real business person...but, he's demonstrably a real programmer.


On GSD:

You do this for long enough without ever cleaning up after yourself, and you're left with a massive, steaming pile of garbage that you have to then support and maintain. Maybe you're lucky enough to be able to write fire-and-forget code, but in my experience, writing hacky terrible code to ship something just means that I'll have to spend 3-5x time on the phone troubleshooting with a customer, analyzing logfiles and then writing the code the right way in the end, than if I had just done things right to start with.


It depends on the setting. If you're working in corporate world where the amounts of money flying around dwarf your considerable hourly rate/salary, then what they're paying you for isn't the quality of your code, and what you've written will probably just be thrown out after it's served its purpose.

If you're having to maintain your own code long-term, absolutely. But that's not every job. It's good to get a feel for what's required from your boss and to keep in mind that the requirements can change from project to project.


That is absolutely a fair point.


Domain knowledge trumps language in the real world.

It's funny, we mock MBAs who believe they have the generic skill of managing anything, but how many programmers think that they can write programs in any company or industry?


Next time there's a "How do I freelance?" Ask HN, please use this:

If you find a good client, you treat him or her like a combination of the Pope, the Queen, and the Dalai Lama, because a trustworthy client who pays you on time and doesn't argue over little details of the contract is better than gold.


That happened to me. The client was so good that I became an employee!


I do not want to sound too harsh here, but seriously, you are asking a bit much. Judging by your choice of words, you realize it yourself ("special snowflake").

The system works a certain way, you do not want that, fair enough. You then continue with how you do not want that consulting thing either. So you go back and basically say that you want to be hired by a company that produces a product you're interested in. It should use a technology that is no longer advancing, even while there are other choices that would lure in coworkers who are interested in using them.

It does not matter whether Perl is technically a reasonable choice. There are other, more important metrics for a company. When you get hired as a programmer, your job is to program. If you want to make the bigger choices, stay your own boss.

I get that it is hard to move on from a beloved community and something that defines you. But really, I do not think that Lua, Ruby, Python - you name it - or even PHP are so different that you could not use some of those 20,000 hours of experience productively in them. Also, you can go back to Perl anytime.

I think you wrote that rant because you were faced with making a decision and wanted to wallow in memories of the good old times. I had this once, on a much smaller scale, when I switched from PHP to Python. As many people have already suggested: identify yourself as a programmer, not a Perl programmer. You will greatly increase the likelihood of getting into an environment that satisfies you once language choice is no longer important to you.


Actually, the company I founded in 2000, which was quite successful, used Perl. Of course I wrote the MVP in it. I loved Perl.

When I sold it for personal reasons I was not yet ready to retire. At that time, and especially in Europe, buy-outs were quite modest. I started doing consultancy until I would have a new idea and could start working on that one. Indeed, the golden cage of consulting can be dangerous: I've been consulting for 5 years now and haven't started anything new.

I don't consult as a programmer anymore. I implemented Agile in my own company and now I'm an agile coach in large enterprises. I still program in my free time just to keep sharp.

But here in Belgium 99% of the projects are CRUD projects, so I'm not really missing out on anything. Most of the time senior developers move into architect roles not because they want to stop coding but because they're tired of writing CRUD code.


I don't know, I guess it requires a different mindset/discipline. I can't imagine myself happy doing agile coaching. Here in Georgia 99.999% of projects are CRUD; that's one of the reasons I've worked remotely for more than 8 years now. Mostly C++-related work.


disclaimer: I've never written a line of Perl.

When I look at the programming world, my primary instinct is that if something is popular, then there must be something good about it. Take JavaScript. In spite of its limitations as a language (the worst of which is not having a canonical OOP style), there is no easier way to create a single GUI for 3 platforms than JS. I could make similar comments about Python and C++.

I don't really understand the point of view that using [old language] is an indication of really knowing how to program. Apart from some minor details, all imperative and OOP languages are fundamentally the same.


I take the opposite (hipster) view. If a language is suitably popular it's because there is something bad about it. Either

A. the barrier to entry for other languages is too high (Javascript for the browser, Objective-C and Swift for iOS, etc), and so there is no evidence that there is anything good about it (you just lump it and accept that's the way things are done).

or

B. Because it is still a generic, C-style, imperative, possibly somewhat OO language that feels very similar to a language you already know, and which is a reasonable choice for nearly any problem (rather than ideally suited to a subset of problems), and you are probably selecting it based on that merit rather than any real technical merit the language offers. (Note that this can be a reasonable business decision, but oftentimes people delude themselves. "We use the best tool for the job"...so long as that tool is Java or Javascript)

As you say, all imperative and OOP languages are fundamentally the same; for me it's when languages veer off that path that they start getting interesting and really differentiate themselves.

As a note, too, 'there is no easier way to create a single GUI for 3 platforms than JS' is not a testament to how good JS is, but a testament to Netscape being in the right place at the right time. It was a language written in a week, whose syntax was forced by suits to be 'Java like' because that was what was popular at the time, and which had no competition.


Javascript is the worst example you could've picked. The only thing "good" about it is that it is implemented in every browser on every platform, and as such available to every programmer of every age. The language itself is actually quite bad, which is why you get so many people trying to make languages that transpile to it.


Reading in context, I was talking about languages holistically. You can't entirely separate a language from its context, e.g. JS is the only scripting language that runs in browser (as I highlighted in my original post).

And do people transpile to JS because the language is bad, or because the interpreter and runtime are good?


The former.


Why do you say that if something is popular then there is something good about it? The world is full of sub-optimal solutions to all sorts of problems. This is especially true for programming languages and frameworks.


It's just my own view that I developed based on everything I've experienced. It can't be boiled down to a single assumption, or a single piece of empirical evidence. I studied economics so I'm very influenced by economic theory. But on the other hand so called "network effects" do imply that suboptimal technologies can become dominant. In practice, I don't think that languages usually become dominant purely by network effects. They need to have a real advantage.

To take a concrete example, Python has very user friendly syntax. This allows people to quickly iterate on code. Another advantage of Python is its complete standard library, and availability of other libraries. But these things probably would not have existed if Python was not such an efficient language to program in.


To take another concrete example, PHP is extremely popular. It doesn't mean PHP-the-language is better than less popular alternatives, it means it's extremely easy to throw things together and deploy (mostly due to almost universal support on VPS). Nonetheless, PHP's history is an accumulation of appalling technical decisions and security holes.


Here's my take on PHP:

- It was the only way to throw together a webapp easily for a long time.

- It is actually pretty efficient for this purpose.

- Much later Python and Node became viable alternatives.

- As soon as they did, people started shifting to them, and today few people would start a new project in PHP.

So according to my (not very informed) understanding, the history of PHP supports my claim that inherent quality (relative to the alternatives) is more important than network effects.


I've never heard anyone describe a monopoly and say it had inherent quality before. Interesting.

(Note that I'm not saying you're accurate in your assessment as to reasons why PHP became popular, just that your reasoning as to why it has inherent quality is interesting)


isn't by definition a monopoly the highest standard of inherent quality? (unless of course there has been coercion/force/aggression)

i kinda think that google got a 100% monopoly because at the time they were the best search engine (and remained so). and as soon as someone comes along who is better, the monopoly will end (i use ddg but i'll admit it's not as good. i like some of the cool hacks and features so i'm trying to stick with it). am i wrong?


It depends how difficult it is to displace this monopoly. Monopolies don't necessarily sustain themselves by being superior to the alternatives. Regulatory capture, vendor lock in or billions of lines of existing code are quite effective for this purpose.


From Swatow's post - "i kinda think that google got a 100% monopoly because at the time they were the BEST search engine"

Emphasis on -best-. Compared to all the others, i.e., altavista, yahoo, ask.com, msn, lycos, infoseek, etc. Google came to the party late and -displaced- the existing players by offering a superior product.

Swatow stated that when PHP came on the scene it was the "only way to throw together a website easily", that it's now being displaced by other tech, but because for a time it was the only way to do it easily, it somehow had inherent quality. That would be like saying that because Archie was the first search engine, it had inherent quality.


> As soon as they did, people started shifting to them, and today few people would start a new project in PHP.

At the risk of sounding like I'm doubting you, I honestly think you have no idea what you're talking about.


Not much changed over the years. Chromatic is still praising Moose and other bad ways to code.

> Moose is probably the singular reason there are any significant new projects in Perl in 2014.

There is another side of that story: Moose is probably one of the things that contributed to Perl's decline, along with modern Perl, Perl 6 and overall focus on all the wrong things.

Anyways, I wouldn't take chromatic's words too seriously. Learn Go guys :)


What's wrong with Moose, modern Perl, and Perl 6? What makes them "the wrong things"? How did they contribute to Perl's decline? What are "the right things"? What should the Perl community have focused on?

To me it seemed like a bunch of Python and Ruby fans came out of the blue and started endlessly chanting "Perl sucks", or "Perl is line noise", and that apparently was enough to turn a lot of people away from Perl and on to the newcomers.

Sadly, these chants often came from and were directed at people who'd never used Perl themselves, or didn't know much about it at all. So it was really a case of the blind talking to the blind.

Not that I love Perl. It's got some warts. But so do Ruby and Python, and every time I try to pick either of them up, I notice a ton of things that are as bad or worse than Perl. And that just makes me sigh.


> To me it seemed like a bunch of Python and Ruby fans came out of the blue and started endlessly chanting "Perl sucks"

I think that was a side effect of experienced Perl developers publicly explaining why Perl was not that good and why they chose another language. Others respected their authority and started repeating and so on. So, the bigger problem was: experienced developers were fed up with Perl for various reasons and started to speak up. And they were not wrong.

Many years ago there was a perl compiler, but right when everyone was moving towards virtualization and ease of deployment was becoming particularly important, it got abandoned. Performance and memory consumption were never treated as problematic either, but they always were. Even simple log processing was not viable for anything but personal-homepage-sized logs. It was very disappointing for many people.

As for Moose, etc., Perl had a perfectly good object system. Everyone knew how to bless a reference and make it into an object. It was hard to abuse, nobody encouraged OOP, things were good and manageable. And yet the community decided to go further into OOP, promote Moose, promote new syntax features, promote new ways to build modules. Things got messy in the process. Perl still didn't address any real problems, but it became pretty much a new language and a burden on everyone. Suddenly people found themselves in a position where they had to decide whether to learn all the new things in Perl or to learn some other language instead.
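For anyone who hasn't written both styles, here's a rough, hypothetical sketch (the class and its attributes are made up) of classic bless-based OO next to the Moose equivalent:

  # Classic Perl 5 OO: the object is just a blessed hash reference
  package Point;
  sub new {
      my ($class, %args) = @_;
      my $self = { x => $args{x}, y => $args{y} };
      return bless $self, $class;
  }
  sub x { $_[0]->{x} }
  sub y { $_[0]->{y} }

  # The same class with Moose: declarative attributes with a generated
  # constructor and accessors
  package MoosePoint;
  use Moose;
  has 'x' => (is => 'ro', isa => 'Num');
  has 'y' => (is => 'ro', isa => 'Num');

The Moose version is shorter and adds type constraints, but it pulls in a heavyweight dependency and a second way of doing things, which is roughly the trade-off being argued about here.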

> But so do Ruby and Python, and every time I try to pick either of them up, I notice a ton of things that are as bad or worse than Perl.

I don't see them as alternatives to Perl. I think they are in decline too.


experienced developers were fed up with Perl for various reasons and started to speak up

This is also the P6 problem.

Many years ago there was a perl compiler...

... but it never worked very well. Certainly it didn't save memory and it rarely saved much startup time.

Perl still didn't address any real problems...

I think there's something like the Expression Problem in programming language design. How do you allow people to solve new problems and explore new patterns and paradigms in a language without encouraging them to create a series of incompatible forks (Tcl, Lisp, Common Lisp, Forth, Smalltalk, heavily macroed or otherwise preprocessed C or C++), limiting the scope of your language to small or relatively isolated projects (Lua), bringing ideas into the core library where they slowly bitrot (Python, Perl), or facing a dramatic rewrite (Perl, PHP)?

For better or worse, Perl's answer in the past few years has been to prototype new ideas on the CPAN, let the community use them and reimplement them and compete, and eventually enable them with the minimal core support possible. The strategy is decent, but it's not fast and it relies on the ability to find people willing and able to do this work in the Perl core.


> How do you allow people to solve new problems and explore new patterns and paradigms in a language without encouraging them to create a series of incompatible forks

Maybe through many tiny DSLs on top of a small and stable core language. People seem to be ok with DSLs, but not so much with constantly changing language and growing feature set.


Sure, that's the approach that a homoiconic or Forthy language takes. It works well, though it takes a lot of effort and foresight to make sure that multiple little languages can interoperate.


Contrarian here: It seems to me that "Perl 6" is the thing that killed Perl. Don't bother much with this current Perl (5) thing; the new version will be along Real Soon Now.

The sad thing about Perl 5 is that it's actually pretty fast for a scripting language, at least if you are doing a lot of string manipulation rather than physics simulations.


I'm sure there is good Perl and there are good Perl programmers.

But, having spent a portion of the last several months going through 3000+ line Perl scripts written by a person with no formal training who had figured out just enough to hack something together, but not enough to use functions effectively or to maintain the code they wrote, I'm feeling less than sympathetic towards the OP and Perl in general...


Anyone who has had the experience necessary to master Perl has serious chops as a programmer.


+1 for reminding me about mumps.


There is money to be made in Perl, but mostly in maintaining old code.


s/Perl/software development/


I am biding my time and waiting for the 2038 bug to come along. The cool kids of that age will probably never have seen a compiler (any compiler) before, and I am going to line my pockets like it's 1999. ;)
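(For anyone who hasn't met the bug: it's just a signed 32-bit time_t running out of seconds. A quick sketch, assuming a stock perl on the command line:)

  # 2**31 - 1 seconds after the Unix epoch is where signed 32-bit time_t tops out
  $ perl -e 'print scalar gmtime(2**31 - 1), "\n"'
  Tue Jan 19 03:14:07 2038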


To me, this isn't really about Perl. It's about the multifurcation of programming in general. We have ourselves a shitty industry, and whether you're in-house or a consultant, you engage with its nastiness and aggressive anti-intellectualism on a regular basis.

It's hard to get the good consulting jobs without establishing yourself as a specialist. Clients don't want a smart generalist who'd like to try herself out with machine learning; they want the 20-year expert (and for only $120 per hour!) But, if you specialize and get unlucky (and a lot of it is luck, because factors other than quality influence which tools win and which don't) you end up having loaded up on skills that no one needs. At the top levels of talent, the people who are actually able to evaluate it have better things to do than to judge others. So you need credentials, blurbs on the CV, credibility, etc.

...you're not dealing with VCs who underpay you and dangle ever more diluted RSUs in front of you for that 1% chance of an acquihire payout that'll slightly pay more than if they'd paid you a market rate for the start...

Here's why the VCs and career executives won. They spent as much time getting good at office politics as we did at Perl or Haskell or Clojure or machine learning or computer vision. The difference is that office politics is a transferrable skill. The stuff we learn is much more powerful at advancing the state of society, but if the industry turns against us, that time is "wasted" from an economic standpoint. It's not actually wasted; it has still made us better at our jobs, but it doesn't make us better on paper, which is what matters if you're a consultant in a rough economy who needs to eat.

Eventually, the VC model will break. What's interesting about it is that the numbers (in funding amounts) appear large, but the expectations mount so fast that these companies are actually being run on a shoestring budget (relative to what's expected by Year X, and because the rapid-growth expectations pretty much mandate a next round of funding rather than a stable let's-get-some-revenue strategy). A $5 million investment seems like more than you'd need, until you realize that you're going to need 25 more people to appease your investor-boss and that's going to put you in front of investors again, depriving you of the time or freedom to do anything else (like build a business, maybe?) in about a year.

How do you keep programming fresh?

That's a hard one, because humans tend to create pyramidal structures and what that means is that there are fewer positions for 10-year programmers than fresh grunts who are still excited by the CRUD projects, and at 20 years... you're either doing R&D in an AI research lab usually with a "Fellow" title that is VP-equivalent and therefore gives you a right to be old... or you've moved into a management role... or you've missed at least one boat.

At 7 years, you've learned everything general you need to know for 99% of corporate programming. Sure, there's the treadmill of new frameworks and new dresses on old concepts, but you're pretty much maxed out unless you can convince business types that you're something other than "just a programmer"-- perhaps a machine learning expert or "data scientist", perhaps an architect, perhaps a Dir/VP-equivalent elite programmer ("Principal" or "Distinguished" or "Fellow") who has the clout to work on what he wants, perhaps a manager, perhaps a founder. The good news is that, for most of us, it's not actually that hard to do. Not all "business types" are morons, and many are good at what they do and make your life easier-- if you can convince them that you're a cut above the commodity programmers who "deserve" to be corporate subordinates.

The problem is that, while learning office politics is a lot easier than plenty of technical things that we gladly learn just for the challenge, it is a huge distraction, and those lessons often come at random when you'd rather be learning other things. If you expect a 40-year career working only on hard technical problems, you're not going to get one, because the days of Bell Labs are over and, anyway, modern academia is far more political than most corporate jobs. You're going to have to learn the politics of business, because (channeling Trotsky) you may not be interested in political fights, but they are interested in you.


To me, this isn't really about Perl.

Exactly. (I believe I wrote that explicitly the previous time this was posted.)

Part of the problem is that programmers believe we're special snowflakes and can't possibly be seen as fungible Taylorist cogs. Another part is that we chase language and library fads more than we pursue deep domain knowledge.

Deep domain knowledge, of course, includes a good understanding of business in general.



