Sometimes speakers frustrate me when they talk about software architecture and forget about the application lifecycle. It happened yesterday at a local meet-up.

The story

Gennady (the speaker) was talking about possible duplication of security functions; his topic was the development of secure software. Then he presented a picture like this:

He described the decision to adapt the application server logic to database-level security and called it an achievement of the professional database development team that made it possible.

They implemented all the security requirements at the database level because the database is the central element.
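To make the idea concrete, here is a minimal sketch of what "security requirements at the database level" can look like, assuming PostgreSQL row-level security as the mechanism; the talk did not say how the team actually implemented it, and the table and column names below are hypothetical.

```python
# A hedged sketch: pushing an access rule into the database itself,
# assuming PostgreSQL row-level security. Table "orders" and column
# "owner_login" are illustrative, not taken from the talk.
import psycopg2

conn = psycopg2.connect("dbname=shop user=app_admin")
with conn, conn.cursor() as cur:
    # Enforce per-user visibility inside the database, so every client
    # (app server, reports, ad-hoc SQL) gets the same restriction.
    cur.execute("ALTER TABLE orders ENABLE ROW LEVEL SECURITY")
    cur.execute("""
        CREATE POLICY orders_owner_only ON orders
        USING (owner_login = current_user)
    """)
conn.close()
```

The appeal is clear: the rule lives in one place and applies to every connection. The cost is that the database becomes the one component everything depends on.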

Scepticism

I’m very sceptical about the long-term success of ideas like that.

I saw a similar case at my last job. There was a legacy client-server system plus a big webshop, both connected to the same database. And there were problems with scaling out the database (even replication didn't make life easier) because of the tons of business logic inside it, accumulated over years of development by a professional database development team.

What did they really try to achieve?

Even though a new type of workload was introduced, they strove to leverage their existing skills.

They strove too hard, because they made the life of the previous application architecture longer, but not happier.

The right question

The relevant question is: how can they be sure that the old architecture is adequate for the new type of workload?

Very probably, because the research was aimed at avoiding duplication of security functions, the decision makers focused on that, while other limitations of the architecture went unchallenged.

By staying with the previous architecture under a new type of workload, in most cases you end up with a monster application with a super-long lifetime and an extremely high cost of replacement in the future.

Why do this? Why not cut technical debt by embracing microservices, as others do? No one says you need to change the system all at once.
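One hedged sketch of what "not all at once" can mean in practice: a thin routing facade that sends a few paths to a newly extracted service while the legacy application keeps handling everything else (often called the strangler fig approach). The service names and paths below are hypothetical, not from the talk.

```python
# A minimal sketch of gradual extraction: route a few hypothetical paths
# to a new service, leave the rest with the legacy application.
NEW_SERVICE_PREFIXES = ("/auth", "/catalog/search")

def route(path: str) -> str:
    """Decide which backend should handle the request."""
    if path.startswith(NEW_SERVICE_PREFIXES):
        return "new-microservice"  # extracted piece, owns its own data
    return "legacy-app"            # everything else stays untouched

if __name__ == "__main__":
    for p in ("/auth/login", "/orders/42", "/catalog/search?q=shoes"):
        print(p, "->", route(p))
```

Each extracted piece takes its logic (security included) out of the shared database, one step at a time, instead of betting the whole future on it.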