A poorly written application need not suffer bit rot, and a well written one can. The reason?
Applications are stories, and bit rot comes from having to alter a story you don't understand. Easy-to-read stories are easier to alter, but altering them can still be hard.
An application has a history: it has changed over time to accommodate the people and pressures it had to deal with. People who understand the story can maintain even a poorly written application, to the point that a poorly written app doesn't really suffer bit rot at all.
Well written applications are great, but even they have a story, and if you don't understand that story, a good architecture won't save you from bit rot. Indeed, foisting "small projects" on unprepared juniors and then, under time pressure, letting their code in without review is the seed of a form of bit rot even in well architected projects.
Over time, the story of the application grows to include its data and its host(s). Indeed, one could say the "devops" trend is driven primarily by this definition of application as story. Extrapolating, applications are pushing at their boundaries: the build is part of the story, the push is part of it, and now we might add "building the datacenter" or "router configuration" to the story. At some level, for some larger companies, this is certainly the case.
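To make the build-and-push part of that concrete, here is a minimal sketch of a deploy script living in the repo alongside the code, so that part of the story is written down rather than remembered. It assumes Docker is installed and uses a hypothetical image name:

    #!/usr/bin/env python3
    """deploy.py -- the build and the push, checked in next to the code.

    A minimal sketch: assumes Docker is installed, and "example.com/myapp"
    is a hypothetical registry/image name.
    """
    import subprocess
    import sys

    IMAGE = "example.com/myapp:latest"  # hypothetical image name

    def run(cmd):
        # Echo each command so the deploy's "story" is visible in the logs.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def main():
        run(["docker", "build", "-t", IMAGE, "."])  # the build is part of the story
        run(["docker", "push", IMAGE])              # so is the push
        # "Building the datacenter" or "router configuration" would slot in
        # here, e.g. a call out to Terraform or Ansible, once the story grows.

    if __name__ == "__main__":
        try:
            main()
        except subprocess.CalledProcessError as exc:
            sys.exit(exc.returncode)

The tooling itself doesn't matter much; the point is that the build and the push become part of a text someone can read, rather than folklore.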
The arrogance that the OP talks about is real, but it shows up in two places: when a junior just doesn't want to learn the story, and when a senior thinks his story is so good, so transparent, that it "speaks for itself".
This is not to say that we shouldn't strive to write better applications, and to create and use better architectures, but rather to emphasize the fundamental subjectivity of the bit rot problem, and that it's a very human, very story-telling sort of problem at its root.