Then you can add templating, routing, logging, testing, anything. I'm not married to the PHP language, it's a mess. But the idea of keeping simple things simple is very important, and from what I see of today's developers, that idea has been utterly forgotten. Memory-holed. It's astonishing, I can barely believe that it happened.
EDIT: Sorry, I just said this offhand and didn't mean to hijack the discussion. Feel free to downvote, so something more on-topic gets on top.
While you can theoretically get to this level of simplicity with various tools, I think the difference isn't so much "can it be this simple?" as "does this level of simplicity fit my needs?", and that can be harder to quantify.
The same thing you wrote in PHP done in Node.js in an arbitrarily named file served from root:
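Something along these lines; a minimal sketch from memory, using nothing but the built-in http module (the file name and port are arbitrary):

    // app.js, run with: node app.js
    const http = require('http');

    // Respond to every request with the same static text.
    http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end('Hello World!');
    }).listen(8080);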
Zero configuration. You could trivially add some lines from 'fs' to get it to read files instead and whenever you update those files it'll automatically serve the new ones. The problem isn't that it can't be simple. It's that the people cutting checks (or at least the people informing them) also want SSL, non-static pages, sanitized inputs, an attractive interface, log retention, data persistence, ease-of-access, etc.
If people were paying me my current rate to send their users unstyled static text over HTTP I'd gladly walk away from all the process. But if they want me to maintain my sanity while writing a non-trivial web application I think integrating a type checker pays off after the first few thousand lines of code. And if I know things are going to get there from the outset then setting it up in advance isn't exactly insanity.
Eh, that's not a great example. It can be hard to appreciate the simplicity of PHP when one's fully bought into the stockholm syndrome of another language/framework :)
Consider this: a hello world in PHP requires no knowledge of programming whatsoever: a file saying "hello world" is enough to get the text to show on a browser. Having two different pages requires no knowledge of if statements, nor of 404 handling.
On top of that, PHP silently manages a lot of plumbing for you: it will also do caching and streaming out of the box and it won't crash the server due to OS ulimit errors when you have a high number of concurrent requests hitting the same file.
Heck, you could stream result sets from a database to a streaming HTML document in like a dozen lines of PHP with no libraries 10 years ago. The equivalent machinery to achieve something similar in e.g. the latest Next.js today is beyond abysmal in comparison.
The LAMP stack was effectively fucking magic and basically the whole reason many of us are in this field right now. Specifically, PHP/MySQL mostly "just worked" and dealt with most of the stupid stuff for you. Yes, the performance was kinda meh sometimes (but back then a normal page load had a dozen assets, not hundreds) and not-strict typing, behind-the-scenes shenanigans, and other magic made for some interesting bugs... but overall, it worked, and it was easy enough for a middle schooler to pick up.
I switched tracks from webdev to systems pretty quick but sometimes I look at what's going on in webdev today and frankly it's pretty disturbing. The sheer amount of complexity to render some HTML and JS is mind-boggling. And the worst part is it's not even that much better than it was back in the LAMP days. Pages still take forever to load, browser compat is still crap, and heavy sites are still gross. The only difference is the amount of complexity, gatekeeping, and sheer busywork that these new frameworks/languages/systems bring in.
Or maybe I'm just the old man yelling at the clouds.
I despise PHP, but those are words of truth. The first-time developer experience in vanilla PHP is as friendly and easy as it gets; it can't be compared with Next.js, AspNetCore, Flask, Express.js, or any other framework in any other language I could imagine.
Depending on what's serving PHP, it can also get complicated. Everybody can't stop talking about how easy it was to cobble something together in the shared hosting days, but don't forget that back then the only option was pretty much Apache, and even that was enough to give you headaches because of different modules being activated, your .htaccess file being incompatible, and so on.
Also: you're a kid who wants to program something on the new (typically Windows) PC you got for your birthday. Is it easier to get started with PHP than with the alternatives? What about HTML and JS?
Sure, PHP deployment could run into some issues back in the day - remember the magic quotes configuration issues on rando hosts? But even that was something you could get around relatively easily (mostly thanks to the stupidly beginner-friendly docs). These days, hosts can easily drop in a pre-packaged PHP hosting solution, so you'd be hard-pressed to find a PHP host without a reasonable configuration.
As for pure beginners, if your goal is to get pretty things on screen, HTML + CSS is fine. But if you asked me to pick between PHP and Node.js (or any other language, for that matter) for teaching a kid about server-side web programming, I'd pick PHP in a heartbeat. You can't even get past posting a form in Node.js without having to reach for NPM :)
In fact, PHP is how my brother (who went to business school) converted to a programming career.
> It can be hard to appreciate the simplicity of PHP when one's fully bought into the stockholm syndrome of another language/framework :)
I used to use PHP and I haven't touched Node.js in quite some time as I've been playing with Rust/(Rocket, Actix Web) and Go's standard lib. So... :)
> Consider this: a hello world in PHP requires no knowledge of programming whatsoever: a file saying "hello world" is enough to get the text to show on a browser
This does not happen with "no knowledge" unless you suppose that copying and pasting arbitrary lines into a command prompt is "no knowledge" (that Apache server isn't going to install and start itself). And if that's allowed, then suddenly there are lots of things you can do that require "no knowledge", as even a few lines of JS spewed into an index.js file via cat can get you easy file content streaming and 404 handling without needing a reverse proxy.
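To illustrate, a rough sketch of the kind of thing I mean (the port is arbitrary, and there's no path sanitization, so don't treat it as production code):

    // index.js: stream whichever file the URL names, or reply with a 404
    const http = require('http');
    const fs = require('fs');
    const path = require('path');

    http.createServer((req, res) => {
      const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
      const stream = fs.createReadStream(file);
      stream.on('error', () => {   // missing file: send a 404 instead
        res.writeHead(404);
        res.end('Not found');
      });
      stream.pipe(res);            // stream the file contents to the client
    }).listen(8080);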
> the latest Next.js today is beyond abysmal in comparison
I don't know about Next.js; I've never used it nor does it look like something I'd want to use. I also don't understand why we're comparing Next.js to PHP when the equivalent would be some Drupal-as-a-Service or similar.
> a file saying "hello world" is enough to get the text to show on a browser
Wrong. A file saying "hello world" does not even serve its contents over HTTP on port 80, unlike the Node.js example you are arguing against. Good luck setting up that Apache monstrosity (as was common back then) without any "programming knowledge whatsoever". But yeah, I guess your mentioned stockholm syndrome is a thing for PHP people, too.
It sounds like you've never actually used PHP. Even way back when, you could pick from multiple pre-packaged solutions (XAMPP, WAMP, etc). On Windows, the installer was a no-brainer click-through wizard, much like any other random software you'd find on SourceForge or wherever. And not only would it set up Apache, but also MySQL. Then you could click a button to turn on PHP (so no need to type anything in a terminal either). You could also get hosting and upload your PHP files via a browser interface to put your hello world live on the web without ever typing any "code" other than the text "hello world" in an index.php file in Notepad. Not even knowledge of version control was required for this form of deployment (though, of course, you can also follow standard industry best practices like with any other mainstream language).
"Poor quality" comes with the territory of popularity. Everyone talks about Java over-engineering horror stories and the Javascript community even coined the "Javascript fatigue" term to summarize the pervasive quality issues in its ecosystem.
But naturally it's perfectly possible to write good quality codebases in those languages.
It's that easy. For both node and PHP (and any other server), you will end up with more plumbing in production anyway. For example, having a load balancer, cache or other ingress in front of the service. These kinds of universal network layer functions (which I'd argue even include authorization and authentication) should not be part of your code anyway, but part of the network infrastructure (k8s and OPA e.g.)
I think OP's point was that for a starting developer PHP is super easy.
- edit a text file with any text editor (notepad?)
- copy file onto a server (ftp? scp? git push? -- most servers will process php)
- access it through url in browser
The example was about simplicity. OP's text file is 11 bytes long ("Hello World"), and it is valid PHP; it will run on a server.
This is about how easily a person with no knowledge and no special tools can start doing things.
Your example implies that the developer has node installed, and expects the developer to run a server from within node, and then understand how the server handles URLs to functions. Additionally, I don't know how an early developer is expected to get this onto a server. First copy the files, then SSH onto the server, then start the node server there. That is a lot of steps for a beginner to remember.
Additionally, as you mention, changes to a file require server restarts, or building a self-reload capability. I play with Flask, which has self-reloading available; when a developer pushes broken code, Flask can just crash (the self-reload fails), and then the developer has to start Flask again. With PHP, broken code does not crash apache/nginx.
For developers that live and breathe computing, running things through additional steps makes sense. C/Java/C++/C# all have compilation and linker steps. Things like Grunt or Less are nothing scary for developers that ran C/C++ code through M4 preprocessors before compiling (these days preprocessing is built into the compiler).
So yes, for professionals, PHP is just a programming tool as any other.
But for beginners, PHP is awesomely easy to grasp and use.
> Your example implies that developer has node installed, and expects the developer to run a server from within node, and then understand how the server handles URLs to functions
The OP's example implies that the developer has Apache and PHP installed at a minimum and that Apache is set to proxy requests to PHP. This requires SSH and starting Apache. I remember doing this back when I configured my first LAMP server, though most hosting companies will do this for you, usually as part of a cPanel setup (even more dependencies).
> Additionally, as you mention, changes to a file require server restarts, or building a self-reload capability
I agree that software development has become (often) pointlessly over complicated in the past few years.
For my personal projects I usually develop in Yii2, vanilla JS, SCSS, on MAMP. Source lives in Git, deploy via FTP. I track my progress on a simple kanban. I can push out really complex web applications in days.
At work, we currently use headless Drupal 8 as an API layer, a React/Angular frontend, automated deployments via Jenkins (which take over 30 minutes), silly microservices, an overcomplicated Git strategy, overcomplicated Jira/Confluence steps, security scans, and pen testing. It takes months to release anything.
They're two different approaches, one is quick and "dirty", the other is "enterprise", but the end result is the same: a website.
The biggest difference is the amount of time and the cost.
I could also edit live on my production server and the result would be the same: a website.
I'm not saying that modern stacks aren't more complicated than they need to be, but I use a React frontend, a React Native app, silly microservices, a non-trivial git strategy, non-trivial Trello steps, security scans, automated deployment via CircleCI, and yet release multiple times a day.
The biggest difference is that deployment always works, always passes tests, does not depend on who deploys, rolls back automatically if health checks fail... You know, the sort of stuff that "reduces defects", "builds quality in", "fails fast" and all sorts of concepts that people way smarter than us, working in industries way more mature than ours, have been elaborating for decades.
Anyone can put a complicated CI/CD pipeline on top of any language, including PHP. What does this have to do with the amount of hoops certain languages force their users to jump through just to see some dynamic content?
Don't get me wrong, I'm not saying the quick and dirty approach is better, I'm saying that the end product is very similar. The main difference is the surety that the code is going to work properly when you have checks in place.
Complex development processes are worth the hassle for big clients who can't have downtime, but they're not suitable for everything. I would (almost) never bother setting up such complex processes for an after hours side project site because the amount of extra work involved would ensure the project never gets completed.
Yup. I think people forget that things aren't the way they used to be because we made forward progress. Maybe that means things got more complex. It also means more things became possible, and it became possible to prevent things from happening that you wish wouldn't.
If everything worked so fine and dandy in the good old days, we'd still be there.
I can respect this, and acknowledge that there are use cases for all the infrastructure. My question is, are the trade-offs being considered? Not every use case requires all the bells and whistles. It seems possible that development pipelines have moved towards using the bells and whistles in part because developers are familiar with them from FAANG use cases. People choose to use what is in use, often for good reason, like there being greater documentation and support for a tool in heavy active use. But that doesn't necessarily mean you're using the right tool for the job. It involves weighing the reduced development costs against possible increases in deployment issues. There are a lot of companies with delusions of grandeur, trying to future-proof for massive scaling issues that most likely will never come.
Yeah. I agree. Sometimes the trade-offs are balanced in favor of human factors more than technical ones, and it can skew the results in favor of solutions that aren't exactly a perfect fit, but they're deemed workable because you can find warm bodies willing to work on them.
Oh yeah. Take consumer goods, for example: a straight razor or double-edge safety razor and a bar of soap work an awful lot better, cost the customer significantly less, and produce far less waste than modern mass-produced shaving convenience goods, at the expense of greater initial costs and a modest increase in required skillset.
A fountain pen with a hand-tuned gold nib is refillable and can last nearly a lifetime with proper use. It also requires good writing technique which will reduce fatigue and enhance legibility.
I've yet to find a modern can opener that is remotely usable with any kind of longevity. Instead I keep a couple of swiss army knives in my drawer that always just work, and are faster and are easier to clean than any modern can opener I've tried.
Hand-made leather boots can be resoled several times and last a decade or more.
All of these things have the same common themes: they were hard to mass-manufacture or sell in quantities as large as mass-manufactured goods, they require a bit more upfront cost to use, and they require a modest skill investment by the consumer, so there's a barrier to entry.
That last bit is the sinister one though. Moving to more forgiving implements is better, right? Not if it means you never learn proper technique for doing things. Dig through a manual on penmanship from the late 1800s and you'll find a wealth of valuable knowledge that can help you write neater and with almost no fatigue even with a modern ball-point pen. Curiously, we don't get lessons even close to that in school today.
How's this relevant? Take these attitudes and apply them to software development. If you make your tools more forgiving, you can get away with less training for your developers, right?! Less upfront investment in hiring competent developers! Emphasize scale and throughput over quality! ...and all of a sudden, you NEED a bunch of tools to make up for the increase in project complexity from band-aids slapped on top of poor design. And hey, if you slow development to a crawl, you dramatically reduce the frequency of major errors/downtime! Win-win! /s
While it's true that disposable consumer goods are a problem, I feel like you're attempting a dogpile via rhetoric rather than really engaging with the spirit of the grandparent's post.
Folks weren't even remotely concerned with industrial or consumer waste "in those days" historically, and actually produced quite a bit. What's more, many of these products you're describing used dangerous chemicals or materials that caused many problems.
And if we're really harping on writing utensils, all but the cheapest mechanical pencils easily outlast and outperform the lousy (and pointlessly expensive, and historically mildly toxic) yellow pencils.
And for consumer electronics, it's interesting to note that the non-replacement policy happens because the components are often so minimal and optimized.
It's not clear that we should apply the lessons of physical deployment to software, as the art and science of software is still developing from infancy at such an amazing pace. Modern software (even the bad stuff that is trendy to hate) tends to be remarkably better, prettier, and often even faster (albeit at the cost of volatile memory) than even relatively well-thought-out older software from average developers.
I think that fails Occam's razor. What if the reason there are more people "doing it wrong" (for various values of "it") is simply because there are more people doing it, period? There are a lot more programmers (and people shaving, opening cans, and writing letters) than there were in the past. As a result, the experience of doing those things changes both because of eternal September syndrome as well as the economies of scale (cheap, low quality crap) that emerged in response to that growth in userbase.
So much agree. The web is very large and there are tons of use cases. So when people start talking about how difficult it is to set up a website because of all these things, I just can't understand where their head is at.
It's like complaining that they built a rocket destined to go to the moon, handled all the requirements to do that, but their use case would have been handled by a weather balloon. Of course they are going to complain about it being overly complex, because it was for their use case.
Things aren't helped by companies having dev relations pushing out articles about using their complex stack with simple examples (to get you interested in it for work), or people writing their own blog posts to get familiar with some new/complex stack, and devs reading these and thinking they must be using them. Analyze what is being used and why, then decide whether to use it.
"vanilla JS" - yea that is a big no-no for me these days..
I'm not a fanboy of frameworks, but I would not touch a project without some "published framework".
If you DON'T pick/use a framework... you STILL are using a framework. Just a framework that YOU invented, that doesn't come with documentation, examples, best practices, and 1000 questions on StackOverflow.
Ever worked on a jQuery project with more than 3 people, or one that took longer than 6 months? It's a nightmarish mess, since there are no standard ways to do stuff.
That's what a framework gives you: documentation, examples, Q&A, and standard practices (some of them even best practices).
I'm sure once you start asking around for the experts and their opinion they will tell you why framework xyz is wrong, but honestly, at least everyone on the team, and the poor consultants who come after you, can be "consistently wrong" and know where to jump into the project.
Rant over, sorry :) But "vanilla JS and/or clean jQuery"?
Yea, no. Just no.
PHP: yea, it's a mess, but a simple mess to get going with :) As per the first comment.
I keep hearing this "no framework == homegrown framework" argument and it's starting to feel overused and, frankly, not indicative of reality.
I find that there's a lot of resume padding when it comes to the web industry and "modern stacks" these days. My job involves managing a large monorepo, and I'm in contact with dozens of projects and teams daily. In my experience, when people actually inherit someone else's React thing, it typically suffers a slow death with minimal updates until it finally gets thrown out or rewritten.
Also, I've seen some seriously over-engineered stuff that honestly just boiled down to 95% static content and a handful of dynamic elements (e.g. some form validation). Many of these don't need to be a SPA and often can be architected to require very little JS in the first place.
When people say they use vanilla js, the assumption should not be that they have the same amount of JS code that a React codebase does: it typically means that the vast majority of what would be React code is simply not written in JS at all.
> They're two different approaches, one is quick and "dirty", the other is "enterprise", but the end result is the same: a website.
I would say one is appropriate for simple CRUD websites and the other is appropriate for more complicated web apps. Especially as you scale (your engineering organization or your traffic), things like microservices become increasingly useful.
Not software development in general but web development in particular.
My cynical take on this is that web development is relatively simple, so the 'masses' of web developers funded by the billions poured into web startups (whose products are also often relatively simple, technically speaking) need to create work for themselves.
I remember fondly those early days with PHP, before it dawned on me how many peculiarities it had. However, to be fair, these steps came only after you’d managed to set up the server, which was non-trivial on its own.
These days I feel like Python/Flask does it more simply. True, you need to define a function and use a decorator, but it saves you having to set up the server before getting started on the site itself.
The kind of simple the OP seems to be going for is the "Ruby + Sinatra," "Python + Flask," "PHP + Laravel" type of simple: a few libraries with small APIs that can make a serviceable starting base. The audience for this is experienced developers who are not making their first web application.
That assumes you've got Apache set up, or nginx and php-fpm. It also uses the PHP interpreter. If you had to build a binary like Haskell does, your approach would be different (and although you could run Haskell in interpreted mode, I wouldn't recommend it). It's also kind of unfair, because PHP was created for making web pages and Haskell wasn't.
The lesson is you can't compare just the language parts of the setup without taking a holistic perspective. PHP alone is not sufficient to make a 'hello world' page; you need an entire web server in the form of Apache too.
I spent my high school (~2006) lunch and free periods teaching myself web development on the school computers in the library, running Portable Notepad++, Portable Firefox, and Portable XAMPP, off my third-gen iPod's hard drive. Very good times. XAMPP made LAMP on Windows beyond painless. I miss that old stack.
To the point. It is very naive to ignore the fact that the work was just outsourced to an IT guy. And the world in which Haskell, Java, JavaScript or .NET can deliver the same "simplicity" as PHP is just called Function-as-a-Service/serverless (it's just that this needs a few more years before that simplicity is actually reached).
Doing PHP programming was my first programming job; it lasted for about 3 years.
While I agree with you about the allure of PHP, to be honest, I will never choose PHP again or touch any project that uses PHP.
PHP is an awful, obsolete language, patched again and again to make it more modern. It's like wearing an old leather jacket that has been patched repeatedly, yet it still has a nasty smell. This is entirely subjective of course, and I'll take old and trustworthy technologies any day of the week. Except PHP isn't trustworthy and it never was.
I have never seen a high level language plagued with so many security issues and this extends to apps built in PHP, like Wordpress. You could attribute that to sloppy coding, or to its popularity, yet if you take the top web vulnerabilities in the wild, most of them (like SQL injections, or remote code executions, or stupid bugs related to implicit conversions, session tampering, etc) aren't possible with modern libraries built in static languages, or at least very hard to accomplish by accident.
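To make that concrete, here's roughly what a parameterized query looks like with node-postgres, purely as an illustration (the table and column names are made up); the same idea applies to pretty much any modern driver, and it's what makes accidental injection so hard:

    const { Pool } = require('pg');
    const pool = new Pool(); // connection details come from the usual PG* environment variables

    async function findUser(email) {
      // $1 is a bound parameter: the driver sends `email` separately from the SQL text,
      // so user input can never terminate the string and smuggle in its own statement.
      const { rows } = await pool.query('SELECT id, name FROM users WHERE email = $1', [email]);
      return rows;
    }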
For me there's also the issue that PHP is really inefficient. Any PHP app will crash your average server if it hits Hacker News, unless you end up caching the pages via Varnish or similar tricks that essentially avoid hitting the PHP interpreter. Configuring Wordpress to withstand a traffic spike can be very challenging, the best solution is to pre-generate the website, thus making it static, which is very weird considering for how long WP and PHP have been on the scene, you'd think there are better solutions by now.
You say PHP is easy. I say that PHP is and will always be a security and scalability nightmare. And sure, it allows you to start something fast, but I can start as fast with my favorite technologies and without hitting a wall later.
And of course, PHP is nice for beginners, but at the same time I believe you're painting yourself into a corner and it's very hard to grow from there. Besides the issues above, a beginner will be curious about trying new things: CLI utilities, desktop or phone apps, statistics, automation, embedded programming, etc. PHP is unsuitable for basically anything that's not querying MySQL and then spitting HTML back, which means the learning curve will be worse, because you've picked a very limited hammer.
Absolutely none of that takes away from the original comment though.
PHP as a language is pretty garbage.
PHP’s deployment story, as a concept, is pretty interesting, and we’ve dropped it entirely as a community. The reasons why are interesting to explore, I think.
The closest to it is something like Node, I guess: install the language, you can wire up an HTTP server with a couple functions, but it’s still lacking the simplicity of PHP there.
Python gets kind of close too, using something like Flask, but now we’ve got libraries we’re talking about.
I’d love to see a web-first language with PHP-styled ease, but with a much better language, and the ability to go down the road of Python/Flask etc. with function-based routes if one pleases.
Of course, frankly, a lot of your comments on PHP are out of date, but that’s neither here nor there: the language barely matters for what is under discussion.
> The closest to it is something like Node, I guess: install the language, you can wire up an HTTP server with a couple functions, but it’s still lacking the simplicity of PHP there.
How is configuring Apache/mod_php/whatever is used these days easier than require('http')...?
How about the htaccess shenanigans that had to be used to achieve vanity/custom routing?
There's no way the Node stdlib with a hand-rolled path-based router (so as to not require a dependency manager, mimicking PHP - even though npm is packaged) isn't an overall better story from a simplicity (and performance, and maintainability, and security) standpoint than the Apache/PHP stack.
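A hand-rolled router in that spirit is only a handful of lines. A sketch (the routes and port are just placeholders):

    const http = require('http');

    // Plain object mapping paths to handler functions: no framework, no npm install.
    const routes = {
      '/': (req, res) => res.end('Hello World!'),
      '/about': (req, res) => res.end('This is the about page.'),
    };

    http.createServer((req, res) => {
      const handler = routes[req.url.split('?')[0]]; // ignore any query string
      if (handler) return handler(req, res);
      res.writeHead(404);
      res.end('Not found');
    }).listen(8080);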
"Drop a bunch of files into a directory and have a dynamic website" was a romantic idea. Most people probably just use(d) cPanel remotely and WAMP/MAMP.app locally and never needed to configure Apache, but that complexity was definitely still there.
Analogously: Next.js might be the "WAMP/MAMP.app of today."
Of course the complexity is still there, but using WAMP/MAMP.app is exactly the context I'm thinking about this through :)
I agree that Next.js is making big strides in this direction. I think other scripting languages could do similar, too. I imagine Lua would already have something like this floating around, ready to be packaged up in this way?
>I can start as fast with my favorite technologies and without hitting a wall later.
What are your "favorite technologies"? Where can I get hosting using your "favorite technologies" that would allow me to create dynamic server-side content just by editing a single file? (No command line fuckage.) Will the resulting solution be portable to other hosting providers?
While Apache is certainly not zero configuration, it is pretty easy to forget about the Apache part of the LAMP stack. When I got started making websites, I downloaded "WAMP" (windows-apache-mysql-php). It took care of all the configuration and I could, as the author said, drop my code in a www folder and be off to the races.
Similarly, lots of hosting providers wouldn't expose the guts of Apache to developers. They would give you some sftp credentials, and let you upload whatever PHP files you wanted. No need to think about Apache configurations (except maybe a few htaccess files).
That's not actually simple, it's just convention over configuration.
It actually complicates a lot of things, like routing and templating: Those files implicitly do the routing and templating, until they don't, which is where the mess starts.
That sounds like what Zeit.co is doing. Now.sh is nearly zero-config for a static HTML index and any number of functions in many languages under /api/
In their version 1 they actually even looked for index.js in the root and evaluated that one, but now such a thing requires config (and it isn’t even clear how to achieve it).
I definitely hope this sort of idea takes off: it’s got a lot of promise to eventually take over from PHP for the “get a beginner rendering dynamic content to a browser quickly” use-case, which is how I learned to program 15 years ago!
> Lines of code or configuration so far: zero.
Which zero-conf server was serving PHP back those twenty years ago? I remember using LAMP (and WAMP) installers for local development back then, but getting everything to run on production was always a nightmare.
> Number of steps to rebuild or restart: zero.
This just made people not debug or test at all. Also it had the side effect of people coding directly on production or just overwriting everything via FTP on the fly.
Yes, those were the simple times. But also the dark times.
Where is the step where PHP is installed and running on the machine, bound to port 80? This is a non-trivial step on every system (slightly less trivial if you bind to port > 1024 but still not for the absolute newbie).
Note that you can go much simpler by just loading a file into the browser, and absorbing the dynamism in the front-end. (If you really want a proper URL, then launch a simple static http server in the same directory as the html file.) The great benefit here is that the browsers dev tools are far better than PHP's.
I'm not sure this is simpler. Apache + PHP is probably far more lines of code. What the above does, rather, is take all the complexity of just one rigid use case and sweep it under the rug.
It is intriguing that nobody's replicated PHP's ease of deployment. Every other language does require downloading some sort of runtime and setting up some sort of templating/page serving system. Would be an interesting experiment to start with essentially an HTML generating DSL like PHP, but with more modern language design.
I think that to use PHP embedded in HTML documents (the classical way, with <?php ?> tags) is effectively just using mod_php for Apache as your templating/page serving system, right?
I haven't used PHP in many many years, and even then only lightly, so forgive me if this is plain wrong. I do somewhat miss how easy it was to add a small amount of dynamic content to a page in those days without having to drastically alter how anything else worked. I used PHP in a lot of projects without ever really needing to learn it well, much the way I see a lot of web developers 'learn' SQL today. Perhaps that's why so much PHP code back then was a clusterfuck.
Yeah I don't actually know. I don't write or really like PHP. But I do begrudgingly admit that PHP advocates have a point when they mention ease of deployment. I don't think any other language has really tackled that properly. Perhaps there's some dead simple Ruby ERB integration that works with a server? I'd love that a lot actually.
They tried with Java (Tomcat/JSP) which from memory wasn't that bad to setup, and I even remember trying a Perl variation where you inlined Perl code instead of PHP, but the name eludes me right now.
I guess PHP just had that first mover advantage. Don't forget that a lot of the simplicity is because your average shared hosting provider has already installed it for you.
> Lines of code or configuration so far: zero.
Yes. But by not having configuration, it delegates the configuration to the webserver. Which means that it (and even the language itself!) will exhibit surprising behavior (and even security vulnerabilities) if you deploy somewhere else.
Sometimes, having to be explicit about config pays off.
I disagree with the others, and think this is indeed simple enough without caveats about setting it up (setting up WAMP is honestly just as much effort — not much), and what I’ve used to teach friends who wanted to learn some web programming with zero initial understanding.
Explaining routes and handlers was a bit hard, compared to files. I wonder if there’s a “file based” version for Node that’d work like an Express handler for the file’s route?
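Maybe something like this, with just the standard library (a rough sketch; the pages directory and the export-a-handler convention are made up, and unlike PHP it won't pick up edits without a restart, since require() caches modules):

    const http = require('http');
    const path = require('path');

    // File-based routing: /foo runs ./pages/foo.js, which exports a (req, res) handler,
    // roughly the way each .php file is its own page.
    http.createServer((req, res) => {
      const clean = req.url.split('?')[0];
      const name = clean === '/' ? 'index' : clean.slice(1);
      try {
        const handler = require(path.join(__dirname, 'pages', name + '.js'));
        handler(req, res);
      } catch (err) {
        res.writeHead(404);   // missing page module (or a handler that threw)
        res.end('Not found');
      }
    }).listen(8080);

with ./pages/index.js being nothing more than module.exports = (req, res) => res.end('Hello World!');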
I'm a Haskell newbie, I only do Haskell in my spare time, only sometimes, so I don't have time to hunt around for recipes. Kudos to the author, I'll surely borrow from this.
---
I also built a simple web server in Haskell for triggering commands via GitHub's Webhooks [1].
So if you're interested in a very simple, yet useful web app to analyze, see:
My use-case is to serve a static website using my own VPS, but have Travis build the website (via a static generator) for me. After Travis builds the website via Jekyll, it pushes it to the gh-pages branch, then GitHub pings my server via the installed webhook, which triggers a refresh with the latest build.
Not sure what's meant by "simple" here, but it looks quite complicated to me. I use something similar at work (with Servant), but much simpler (FSVO "simple") approaches as a hobby:
- Those frameworks relying on built-in servers, including all their dependencies, are huge. If a program is linked statically (the default for GHC/cabal), that's also a very large codebase that doesn't get updated on system updates. Plain CGI (or FastCGI, if needed) is much more lightweight, and an external web server usually provides generic logging.
- postgresql-simple provides more than just protection against SQL injection; that is implemented in the C libpq, and usable with postgresql-libpq bindings (which are used by postgresql-simple) too.
- Database connection information, in case of PostgreSQL, doesn't require a fancy configuration: libpq reads it from environment variables. In many cases even those don't have to be set: by default it would try to connect to a local database, under current user, and to the database matching user name.
- PostgreSQL itself can compose XML (XHTML/HTML 5), which then can also be processed with XSLT, cutting out the blaze-html dependency.
- Custom logging facilities tend to be a pain to work with, and not necessary either; there are syslog and journald.
Using all that (and even more) instead of more lightweight/low-level/low-dependency alternatives seems like a sensible compromise in some situations, and I don't mean to say that it's somehow bad, but not seeing how it's "dead-simple". Maybe it's one of those "simple versus easy" cases, or just about different kinds of simplicity.
"Simple" is always a relative term. I agree, if "simple" looks like the complication in the post, then I'm turned off from the language because I don't really want to see what "complex" can become.
I have no doubt if I had context of Haskell, I would be able to read that with no trouble - even without Haskell context I think I can grok it pretty well. But imagine a newbie to the industry as a whole? It stops being simple at various stages, very quickly.
This is exactly the stack I used when I started building a webapp in Haskell, it was nice. I never finished my webapp though (but that wasn't Haskell's fault).
This is a great tutorial for getting up and running fast. I started off with Spock too but grew out of it pretty quickly.
It's painful when you're learning Haskell but feel you're not able to produce anything real with it for the first month(s). Unfortunately, there aren't many up-to-date tutorials like these available for beginners.
Do you have any recommendations for next steps beyond Spock?
I learned Haskell at uni and am interested in going back to it for web work (instead of Erlang). Any insights you might be able to share would be really helpful!
Servant. It takes a while to pick it up but it's well worth the effort. You end up with an automatically documented type safe API and you can even generate client functions for other languages using Swagger based on the spec. It has excellent documentation too, the only hard part really is figuring out the handler pattern.
As for the SQL story, that's more fragmented. I'm still trying to figure out the best next step myself. I started off with Persistent & Esqueleto but now looking at BEAM, Selda and Squeal for more control over the queries.
My take on the whole "complex versus non-complex" debate raging below.
If there were nothing to the exorbitant complexity, there would be no complaints about modern stacks. It would be sort of, invisible? Like an OS is mostly invisible these days - that problem space is mature to the point that it's, for most people, an afterthought.
So modern web dev does suck, yes. I am 100% of this opinion.
But the simple solution was re-invented into the 'mess' we have today. It doesn't scale (well), and although, yes, it wasn't complex, it was a hell of a lot harder to add features to, at least ones which weren't simple "hey, change the look and feel of some widget" or "hey, do some flashy UI thing".
I think that's probably a generally true statement, when a lot of work is needed to develop something (physical or otherwise), it will suck, because it still needs to be worked on? So anything being used by the industry and "modern" will have pain points, period.
Once it's actually good, it tends to be simplified to the point that it becomes invisible. OFC, you can still use WAMP or whatever, or Weebly, but nobody cares about WAMP or Weebly anymore, outside of whatever small pet project or blog you run.
Hmm, okay so Spock looks nice. Neat! The routing code looks very easy to read:
    main = do
        spockCfg <- defaultSpockCfg () PCNoDatabase ()
        runSpock 3000 $ spock spockCfg $ do
            get root $ do
                Spock.html "<div>Hello world!</div>"
            get "users" $ do
                Spock.json (A.object [ "users" .= users ])
            get ("users" <//> var <//> "friends") $ \userID -> do
                Spock.json (A.object [ "userID" .= (userID :: Int), "friends" .= A.Null ])
Alright, let's figure out how this works.
runSpock :: Port -> IO Middleware -> IO ()
Okay that makes sense. Takes in a port, some middleware and spits out IO. Not crazy. Hmm, okay these dollar signs are making things a little confusing, but I can figure out that spock returns an IO Middleware and therefore we can see that spock takes the output of the right do block along with spockCfg. Not the easiest to scan at first glance because of the right to left reading but okay.
Let's look at get.
get :: HasRep xs => RouteSpec xs ps ctx conn sess st
Ah, okay, so uh...this doesn't appear to be a function. Oh, I see, here it is:
type RouteSpec xs ps ctx conn sess st = Path xs ps -> HVectElim xs (SpockActionCtx ctx conn sess st ()) -> SpockCtxM ctx conn sess st ()
Well I'm not really sure what xs, ps or st are. And HVectElim doesn't tell me anything. Path is pretty clear and SpockCtxM is a monad of sorts. But what the hell is HVectElim? Alright whatever, let's just figure it out via the signatures of Spock.html/Spock.json. Which are...nowhere to be found in the documentation. I looked them up in the index and nope, no entries. I'm very confused about what HVectElim is and why that would be a fine name.
I can continue past this point, but you get my gist.
This is actually way better than my other experiences with Haskell and web dev, but it's by no means an easy or fun experience reading this code. Variable names like xs, ps, etc. are great when you're dealing with a generic list but they get old when that's the only documentation (because "type signatures are documentation" is totally infallible). And a name like HVectElim is just baffling. What, did they charge you per letter? Did half your keys fall out? Why is that a good name?
I really want to use Haskell for something useful. But I'm running into pain points trying to understand a glorified Hello World. Now granted I'm by no means a great Haskell programmer. But that's kind of the point. I understand monads. I get functors and applicatives. I should be able to write a dumb Hello World app dammit.
That documentation indeed could be improved, but I guess the intention here is that basic usage is learned from the tutorials. Though it's possible to figure from that documentation alone as well (I'm reading it for the first time too, but using Haskell often): Path construction collects types of arguments into xs, then HVectElim (heterogeneous vector eliminator, or maybe "elimination") unwraps it into an appropriate function type. xs are those types, ps is PathState, st usually stands for "state".
A dumb web "Hello, world!" can be written with just putStr though, using CGI (along the lines of what I've mentioned in another comment here). Or perhaps with Network.CGI ("cgi" package), which is quite a bit more straightforward than that RouteSpec, especially if one wants to quickly understand how it works.
I had a similar experience (wanting to use a language for something useful, but mostly failing to) with Rust, and maybe Python (just s/wanting/having/ in that case), but I guess the problem with that was primarily in trying to find out what's the "right"/idiomatic way to do something while there's none, or to apply practices from other languages. And maybe in not having a suitable project idea at the time. Most likely there can be more reasons of that, but they seem to go away with practice, clear goals, and/or motivation.
Basic is the key issue here. Sure I can type this out and get a simple program working (although if I make a transcription error, I'm not super sure I can reason through the code). But if I want to build anything beyond the basic, I can't rely on the three whole tutorials on Spock^[1]. I need to start understanding the library.
But at that point reading the docs is pretty frustrating. Even with your explanations, I can't see how HVectElim or xs, ps, st are okay names. Sure, it's idiomatic. But that doesn't mean it's good. What was wrong with types, pathStates and state? Sure it's less terse, but in this case that's fine. Or if they wanted to keep the same naming, at least explain it in the docs!
Having built a web app in Rust, I've certainly run into pain points, but nothing at this level. A library like Rocket has so much more documentation and careful examples^[2]. But also, the interfaces in Rocket are fairly simple. You write a function that returns a type that can be converted into a response. Anything you can serialize with serde can be converted into a response. With Haskell, the interfaces are that my route defining functions return a
SpockCtxM ctx conn sess st ()
which is actually a
SpockCtxT ctx (WebStateM conn sess st)
With WebStateM being
WebStateT conn sess st (ResourceT IO)
Yeah...I can kinda infer that these are chained monads but I have no clue what bind or return does. I assume SpockCtxM is a reader monad and WebStateM is a state monad? But that should really be described in the documentation.
I see that the other libraries in the tutorial are better documented than Spock, so hopefully the Haskell ecosystem is getting better at this. But I'm still not sure I'd want to build a web app in Haskell (how's package management? Did you guys ever resolve cabal vs stack?)
Longer names can be seen as unnecessary and as duplicating information. One has to balance between being verbose and laconic when it comes to naming in general, but judging by the lack of annotations in this case, not much care was put into making it easily readable. It may be useful to report it once that's an issue: chances are it just seemed obvious to the author (as often happens with code in general) and acceptable to users.
If you see an insufficiently documented and/or straightforward API, and there's no good reason to use that one in particular, I think it may also be a good idea to look for alternatives, in any language.
> how's package management?
I think it's fine, but there are different opinions on how it should be. I prefer a better system integration and shared libraries coming from system repositories (which is usable with Debian), Cabal works fine, Stack seems to be popular, some seem to use Nix.
> Did you guys ever resolve cabal vs stack?
"Resolve" as in choosing a one true way to build/install things? I don't think so, but there is this annual "state of Haskell" survey that includes build tools, which was closed just a few days ago this year (but no results available yet, afaik), though there are the results from last year [1]. "State of the Haskell ecosystem" [2] may be of interest too.
There certainly are imperfections/drawbacks (and/or varying approaches/opinions/preferences), and it may not be a good choice for you for making a dynamic website (I think in many cases the overall best choice is the language one is most fluent in at the time), but my replies were mostly with the "I really want to use Haskell for something useful" sentence in mind: being exposed to worse (or at least less suitable) bits and running into a dead end can be unfortunate in that case.
The reason I ask about package management is because when I do my annual try-to-use-Haskell-for-real event, I pick either Stack or Cabal, inevitably run into issues with one and have to switch to the other. Plus I've had situations where I tried to install packages and the manager gave me the computer equivalent of a shrug. Contrast that to Cargo which works beautifully and seamlessly out of the box.
However I hold out hope that Haskell is getting better and easier to use. Legitimately, this tutorial's code looks a lot better than what I've seen before. But I'm still dissatisfied with the level of naming and documentation. I get it that the names may have been obvious to the author, but it's a little worrying to me that the author didn't think about this fact when writing the documentation. Plus Spock isn't a young framework. It's my belief that anybody writing a package should know that interface names are designed to be readable and understandable. And these names aren't subtly hard to read. HVectElim just isn't a great name.
That being said, I don't want to just hate on the Haskell ecosystem. I do really find the language fascinating. I just want some better documentation, more tutorials and nicer, unified tooling.
Stack uses cabal underneath though. What kind of issues are you running into?
Personally I love the language but struggle with all the (necessary) stuff around it. Configs, deployment, dependency management, editor setup, lint, auto-formatting... Everything feels unnecessarily painful.
Generally issues where package management just fails. Also it's just overall not ergonomic. I end up installing yet another version of GHC a lot (I believe that the GHC version should be handled separately from package versions, like rustup/cargo, nvm/npm, rbenv/bundler). But this info could definitely be out of date. I might be due for another try at Haskell.
Not sure if it became much better in the past few years; I seem to run into issues less and less often (and all were solvable rather quickly), but it may be simply because of the avoidance of problematic packages and approaches. Here is a bit more of information though:
- Cabal 2 supports Nix-style local builds [1].
- Stack seems to aim for a very smooth and easy workflow. Though it didn't work well for me when I tried it (I think I had a version that was too old, and was rather appalled by the way they suggest installing a new one, with `curl ... | sh`). And as mentioned above, I prefer the opposite of adding even more layers on top of cabal and using multiple package managers, in part because it introduces more ways for things to go wrong. Likewise with Nix outside of NixOS.
- I used to find Cabal sandboxes helpful, but not using them these days. Different versions of the same package can be installed simultaneously, if it becomes tricky to stick to a single version while using Cabal. And as the last resort one can wipe out the local package database.
- Careful versioning could solve some of the package management and dependency resolution issues, and there is a common package versioning policy [2] (similar to, but not exactly the same as semver), but not everybody follows it (in versioning their packages or in pinning dependencies); one should be careful with packages that don't.
- Minimizing dependencies can be useful for this, among other things. Without keeping an eye on it, one can easily find themselves in a dependency graph with hundreds of packages, often coming pretty much directly from developers (without additional package maintainers keeping an eye on them following any standards or being compatible, though it's supposed to be better with Stack), some of which may misbehave at some point.
- On some systems it is viable to use a system package manager for Haskell package management.
But possibly my experiences are not representative either: as mentioned before, opinions and approaches around it tend to vary.
Haskell is extremely complex, and that’s the point: to isolate complexity at the language level, where safe patterns can be researched, developed, and then scaled out to arbitrary applications.
Please call this “Towards a simple Haskell web stack” or something else less oxymoronic.
I extremely disagree. I think C++ is extremely complex. Haskell is simple by comparison [0].
Once you understand enough Haskell to write basic programs in it this stack is roughly similar in scope to a Ruby + Sinatra or Python + Flask web application.
If we mean "Haskell the latest specification" that's Haskell 2010. I suppose we could think of it as pretty simple, though lazy evaluation means that any implementation of it is going to be complicated.
Fair enough there are some combinations of compiler extensions that don't interact well and I've written enough type level code to know which ones to stay away from.
The C++ move semantics are standard though and have lots of unintended consequences to make easy-to-describe constructors incredibly difficult to implement in practice.
My point was that saying Haskell is extremely complex is not technically correct. It's a qualitative statement and disingenuous. Complex relative to what? The stack the OP is presenting is relatively simple in the Haskell space. It's not inherently complex by virtue of being written in Haskell.
> It's a qualitative statement and disingenuous. Complex relative to what?
Complex relative to most programming languages.
C++ is more complex. That says very little. C++ is an extreme outlier.
I think Haskellers do a disservice to people they're trying to convince to do haskell by saying it's not complex. Haskell has a zillion extensions and crazy features, most of which show up in at least some popular Hackage libraries. Arrow syntax! Type families! Data kinds! Pattern synonyms! View patterns! Existential types! Rank-N types! The list goes on and on.
PS: I think Haskell is awesome and one of the best languages out there. I've written a pretty decent amount of FOSS haskell stuff. Complex doesn't mean bad.
EDIT: The complexity doesn't come just through language extensions. The process GHC goes through to produce fast code is crazy too. It works, but it's not simple.
Cool, I like Haskell a fair bit too. It took a long time to get here. I don't think it's the perfect language but I agree that it is probably much better than most other languages out there.
I'm hoping that languages built on dependent types like Lean will eventually come to the main stream.
I see your point about doing a disservice. It is a tricky thing trying to convince people to adopt a language like Haskell that is so different from everything else they've likely used. If you say it's easy they're probably going to stop inviting you to dinner.
I think it's equally a disservice to tell people it's too complex to learn. There's a common impression people get that they have to master category theory before they can begin to understand Haskell code. That's also troubling and kind of what I was getting at with this whole thing in a round-about-way.
Take a post about a dead simple Ruby + Sinatra application. You'd still expect a developer to learn classes, messages, methods, types, modules and probably rake or foreman or something. It's pretty complicated but not insurmountable.
TFA uses one language extension. The libraries are roughly similar in size to the APIs exposed by Sinatra. Many of the concepts port over. But because it's written in Haskell it's somehow too complex and can't be simple?
I make no bones about Haskell being difficult to learn. However, I don't think that makes it a complex language. In C++ I have to learn about constructors, initialization lists, and all of the ways that move semantics make it extremely complicated to make the language do what I want.
Haskell may be hard to learn but once you get over the initial hump it scales well and is rather simple in many ways.
> I think it's equally a disservice to tell people it's too complex to learn. There's a common impression people get that they have to master category theory before they can begin to understand Haskell code. That's also troubling and kind of what I was getting at with this whole thing in a round-about-way.
Ah, I see where you're coming from more now. Yes, the "category theory is recommended alongside learning haskell" meme is terrible. In that context saying Haskell is simple is an improvement. I think people can handle nuance though, and "haskell has a simple core, but in practice GHC haskell is a big language" is something they can handle.
You are still wrong, despite your disclaimers. Haskell has a tiny core language, which neatly breaks apart complexity. Most mainstream languages don't, which makes them far harder to thoroughly understand: there's no easy way to know what complexity is essential and what can be reduced away. Abstractions are rather informal, and likely to leak.
Your argument is that a language with a 750,000 line implementation and a zillion features is simple because one of its IRs is simple? Doesn't seem very convincing.
Haskell2010's definition is like 20 pages long and that includes the definition of the standard library. This core accounts for 90-99% of the code in project that isn't going out of its way to use fancy features for the lulz.
It's cool Haskell goes through a small IR. That doesn't make TH, rewrite rules, CPP, the million extensions, STG, the runtime, the gigantic syntax, or the quarter million SLOC implementation simple.
Second, I agree with you that Haskell does not have to be too complicated. I only ever use a few language extensions, I use a subset of the language, and try to do repl oriented development.
The downside of my simple approach is that reading other people’s code takes real effort because I am likely to not understand many of the idioms and techniques that other people use.
Programming should be fun and productive, and it is up to everyone to figure out what works for them.
I entirely agree about the language itself being a comfortable middle weight, somewhere around the order of Python. But both languages give you a lot of scope to make things complicated and there are much stronger norms against making code as clever as possible in Python than in Haskell. Is there a Pythonic Haskell movement? Because if there isn't there should be.
Yeah, sadly skillsmatter have gone bust so all their content has disappeared. It's a huge shame as there was so much good stuff there. I really hope it will resurface soon.
I am going to disagree with this simply because it's talking about the stack, not the language.
As a Haskell developer who uses a much more complex stack than provided in the article, and seen too many "there aren't enough basic Haskell web stack tutorials!" posts here and there, I would say the library choices here fit that bill perfectly.
That said, once the adventurous non-Haskell user builds a basic application using this stack, they will then run into the brick wall of using a language which you are calling "complex" (I would argue "different"), and realizing there are a lot of new things they need to learn to actually be productive in it.
Haskell 2010 isn't that complex. It's sum and product types and typeclasses, essentially.
The many language extensions, depending on what you use, can make for something extremely complex, but you also don't have to do much more than enable them in many situations if you're just a user of a library.
1) Create a directory "foo" on your server
2) Make a file called "bar.php" in that directory
3) Write "Hello World!" in the file
4) Open http://localhost/foo/bar.php in the browser and see "Hello World!"
Lines of code or configuration so far: zero.
5) Add some actual code to bar.php
6) Reload the page and see the result
Number of steps to rebuild or restart: zero.