People get work done by knowing what they're doing, which, as far as I can tell, you don't.
There is plenty of literature that explains the process of software architecture quite thoroughly. Basically all major software architecture styles from the past four decades reflect the need to encapsulate and insulate implementation details, including the need to specify a domain model that is specific and exclusive to each project.
Somehow, you are oblivious to basic principles but still feel entitled to insult others based on domain knowledge you clearly lack.
Look, I've never seen microservices done well. I'm negative about it because people are implementing this architecture in applications that, as far as I can tell, should never even consider it. Maybe they're just doing it badly. But from what I've seen it's a pretty awful architecture pattern.
To implement microservices, you take what would be a service in a normal application: a handful of code files. Maybe some model files, a service, validators, and a repo file. A slice of an application.
You create a new project file, build files, etc. Maybe a new repo, maybe not. You then wrap that simple service in a bunch of boilerplate plumbing code so it can actually work on its own.
So, basically, a ton of extra code, right off the bat. Each time.
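To make the "plumbing" point concrete, here's a minimal sketch. The service, its method, and the request shape are all invented for illustration; a real setup would involve a web framework, a Dockerfile, CI config, and more, but even this stripped-down version shows the wrapper code that appears the moment a plain method has to stand alone:

```python
import json

# The original in-process "service": a plain class with one method.
class PriceService:
    def quote(self, sku: str, qty: int) -> dict:
        return {"sku": sku, "total": qty * 10}

# In a monolith, callers just invoke it directly:
direct = PriceService().quote("widget", 3)

# As a standalone service, the same logic needs transport plumbing:
# request parsing, serialization, and error handling, none of which
# existed when this was an ordinary method call.
def handle_request(raw_body: bytes) -> bytes:
    """Boilerplate wrapper turning an HTTP-style body into a method call."""
    try:
        payload = json.loads(raw_body)
        result = PriceService().quote(payload["sku"], payload["qty"])
        return json.dumps({"ok": True, "data": result}).encode()
    except (KeyError, ValueError) as exc:
        return json.dumps({"ok": False, "error": str(exc)}).encode()

remote = json.loads(handle_request(b'{"sku": "widget", "qty": 3}'))
```

Both paths compute the same answer; only the second one needed the extra code.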
Then, to do it right according to this thread, you duplicate the definition files between your services, multiple times, add JSON schema files that you didn't have to maintain before, and, as someone else has mentioned, create an extra library on top of all this so your colleagues can implement it as if it were just a normal method call.
Even more code!
And that's your microservice. A lot of extra work. Busy work as far as I can see, no benefits. Just to do exactly what it used to do.
But, worse still, it has huge drawbacks, including:
1. Very slow "method" calls. An in-process method call is orders of magnitude faster than a serialized request over the network; data sitting in L1 cache is always going to beat the wire.
2. Poor debugging
3. Complicated devops requirements
4. Hidden complexity in the interactions between services that no single codebase reveals
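To illustrate point 1, a rough sketch of the extra steps involved. This deliberately ignores network latency (the dominant cost in practice) and shows only the serialization round trip that a cross-service call adds on top of a plain function call; the `add` function and message shape are invented:

```python
import json

def add(a: int, b: int) -> int:
    return a + b

# In-process: one call, operands likely already in cache.
direct = add(2, 3)

# Cross-service: the same call must be encoded, (notionally) sent over
# the network, decoded, dispatched, then the result encoded and decoded
# again. Even before latency, that's several extra steps per call.
request = json.dumps({"method": "add", "args": [2, 3]}).encode()
decoded = json.loads(request)
response = json.dumps({"result": add(*decoded["args"])}).encode()
remote = json.loads(response)["result"]
```

Same answer, many more moving parts.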
I just don't get it. Never have. I tried to play along, but I personally think the emperor has no clothes. If a client is going to insist on microservices, so be it, but it's a massive waste of time and money in my opinion.
You're the first person in this thread to mention microservices. The discussion has been around broader service-oriented architecture. Sometimes those services can be quite large, in which case the boilerplate overhead is not nearly as onerous as you describe. I've worked on services that had 200+ engineers on them.
Friend, do you not understand that not all services are Microservices?
The article even states that microservices are not suitable for startups. The conversation in this thread has been about service-oriented architectures, which is a much broader topic.
None of what's described above is materially difficult or slows down a team used to this method of operation. Perhaps stop applying your narrow lens to all development.
You are oblivious to the point of this approach. Scaling has nothing to do with it. It has everything to do with not imposing useless and detrimental constraints that buy you nothing but problems. You specify interfaces, and keep implementation details from leaking by encapsulating and insulating them. This is terribly basic stuff.
You can do this already without adding any sort of microservice, schemas, duplicate definition files, externally maintained libraries, etc.
It's a basic feature of most languages.
It is NOT an exclusive benefit of a microservice pattern. Stop claiming that, it's one of the most frustrating claims/lies microservice advocates make.
The actual benefit is that you're forcing developers to use interfaces. At a massive cost.
There are much cheaper alternatives. You enforce a Dependency Injection pattern on your services. Code reviews. Linting tools.
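A sketch of what that cheaper alternative looks like: an interface plus constructor injection, with no network boundary anywhere. The service and repo names here are hypothetical, chosen just to show the shape:

```python
from typing import Protocol

class UserRepo(Protocol):
    """The interface: callers depend on this, not on any implementation."""
    def get_name(self, user_id: int) -> str: ...

class SqlUserRepo:
    """A 'real' implementation; details stay hidden behind the interface."""
    def get_name(self, user_id: int) -> str:
        return f"user-{user_id}"  # stand-in for an actual database query

class FakeUserRepo:
    """A test double, swapped in without any service boundary."""
    def __init__(self, names: dict):
        self.names = names
    def get_name(self, user_id: int) -> str:
        return self.names[user_id]

class GreetingService:
    def __init__(self, repo: UserRepo):  # dependency injected here
        self.repo = repo
    def greet(self, user_id: int) -> str:
        return f"Hello, {self.repo.get_name(user_id)}!"

prod = GreetingService(SqlUserRepo()).greet(7)
test = GreetingService(FakeUserRepo({7: "Ada"})).greet(7)
```

The implementation is encapsulated and swappable, and all it cost was a Protocol and a constructor argument.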
So no, this is not basic stuff.
And worse still, if your team can't properly use interfaces in your languages, how do you expect them to suddenly learn to use them properly in your services?
It'd be great if you minded your tone; this is HN.
I don't know where you're getting implementation details leaking when it's just API definitions being shared; they don't leak implementation details unless they're badly designed, which would be a problem either way.
I wonder if there are two uncontrolled parameters here.
Firstly, the space and time scales. If your two-pizza team has twenty services, and they communicate like this, and interfaces change a few times a week, then there will be quite a lot of pointless paperwork. If your two-pizza team has one service, used by other teams, and the interfaces change once a month, then this might be an appropriate amount of speed bump.
Secondly, tooling. If your APIs are all done by hand, then making an update is a modest amount of boilerplate. If you are generating everything from schemas, and you have your build down tight, then it can be a matter of changing the schema file, pushing, waiting for that to propagate, then adding the necessary data to the message you changed.
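A rough sketch of that schema-first flow. The schema and field names are invented, and a real pipeline would use something like OpenAPI or protobuf with generated clients; the point is only that the schema is the single source of truth and adding a field is the one manual step:

```python
# A single shared schema is the source of truth; adding a field here
# is the only hand-written change, and validation (in a real setup,
# generated code) picks it up automatically.
ORDER_SCHEMA = {
    "order_id": int,
    "sku": str,
    "qty": int,  # newly added field: change the schema, then the message
}

def validate(message: dict, schema: dict) -> dict:
    """Minimal stand-in for generated validation code."""
    for field, ftype in schema.items():
        if field not in message:
            raise ValueError(f"missing field: {field}")
        if not isinstance(message[field], ftype):
            raise TypeError(f"{field} must be {ftype.__name__}")
    return message

ok = validate({"order_id": 1, "sku": "widget", "qty": 2}, ORDER_SCHEMA)
```

With the build tight, the workflow really is: edit the schema, push, wait for propagation, then populate the new field.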
the whole point of this architecture is to transform conway's law from a liability to an asset -- it's a solution to problems that only exist at org sizes large enough where product velocity is bottlenecked by inter-team friction
services map to teams, not units of functionality
imo minimum org size to use microservices is something like 50 engineers
How can you get any work done like this?
Are you all working at massive corps that develop at a glacial pace?
Sounds like you all spend your time shuffling papers rather than doing anything meaningful.