I often get asked about "best practices" for Vue, state management, or how to architect an application.
But I think this idea of "best practices" can be dangerous — especially to inexperienced software developers.
While there are certain fundamental principles of software development that are extremely valuable to understand, most "best practices" are just (useless) opinions that can be safely ignored.
When most developers talk about "best practices", what they're actually referring to is "the things that most developers agree on". However, if you're looking at it through this lens, what you're really measuring is popularity and not goodness.
So then, why do certain concepts and techniques become popular while others don't?
Largely because some developers had an opinion about something, and shared it with the world. Just like I'm doing with this article, and this blog.
Of course, ideas that are obviously stupid don't become popular. Developers aren't mindless sheep who can't think for themselves; that's not what I'm saying at all.
But one of the hardest parts of software development is that most things aren't obviously good or bad. Just as we can't easily measure productivity or software quality, judging how good a technique or practice is, let alone whether it's the "best", is extremely difficult.
So we look to see what the people we admire think (usually popular developers), and latch on to those ideas.
We see others sharing those ideas on Twitter, read articles on HN and other sites, and we buy courses and take in all that we can.
Because it's so difficult to judge the value of an idea, we default to using popularity as our filter. And while it often works out well, there are very often bad but popular ideas that slip through this filter (as well as good but unpopular ideas we never even come across).
So please don't take what others say — or what I say — as the truth. I'm human, so I'm often wrong, usually lazy, and probably haven't considered all of the facts.
Theodore Sturgeon was a prolific science fiction writer who coined the popular adage:
"Ninety percent of science fiction is crap, but then, ninety percent of everything is crap."
(which also applies to this blog)
So we're left with this incredible problem of sorting through all of these "best practices" and trying to decide which few are good enough to keep, and which ones are just "crap".
Trying to judge each idea on its own merit is far too difficult, which is exactly why we default to flawed heuristics like popularity.
I think a much better (but still flawed) heuristic is to leverage the Lindy effect.
The things that have stood the test of time are far more likely to be true and useful than an idea someone just had in the shower and had to tweet about.
When we think about "best practices", what I think we're really after are the fundamental principles of software development that can be learned once, and then applied to problem after problem, language after language, and not thrown out after 2 years.
Most industries don't churn knowledge and "best practices" the way that we do in software.
Structural engineers don't abandon their best practices when they move from designing on paper to designing with CAD.
But we as software developers have been tricked into thinking that the things our industry has figured out over decades are suddenly irrelevant now because we're "innovative".
Find those enduring principles, learn them, and you can ignore pretty much everything else.
There are relatively few things that are important — everything else can be figured out based on those principles.