There are a number of rules of thumb when writing code. Things that are generally accepted as good style and ways to work. You know, things like employing abstraction and making stuff data-driven. What they don't tell you is that there can be too much of a good thing. This is patently obvious - you can always make things more data-driven by adding an extra interpreter, and you can always stick more wrappers in for 'abstraction'. Put another way, you're trying to find a maximum of style in program space, and you're not going to find it by traipsing off to infinity.
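To make the point concrete, here's a deliberately silly sketch (Java, all class and method names hypothetical): each layer is individually defensible as 'good practice', and together they turn one line of arithmetic into a small enterprise.

```java
// A deliberately over-engineered sketch: two layers of 'abstraction'
// plus a mini-interpreter, all in aid of adding two numbers.
import java.util.Map;
import java.util.function.BinaryOperator;

interface Operation { int apply(int a, int b); }

// Wrapper #1: delegates and adds nothing.
class OperationWrapper implements Operation {
    private final Operation inner;
    OperationWrapper(Operation inner) { this.inner = inner; }
    public int apply(int a, int b) { return inner.apply(a, b); }
}

// Wrapper #2: a factory to construct the wrapper.
class OperationFactory {
    static Operation create(Operation op) { return new OperationWrapper(op); }
}

// The 'data-driven' layer: operations looked up by name from a table.
class Interpreter {
    private static final Map<String, BinaryOperator<Integer>> OPS =
        Map.of("add", (a, b) -> a + b);
    static Operation lookup(String name) {
        return OPS.get(name)::apply;
    }
}

public class Enterprise {
    public static void main(String[] args) {
        // Several indirections later, 2 + 3 is still just 5.
        Operation add = OperationFactory.create(Interpreter.lookup("add"));
        System.out.println(add.apply(2, 3));
    }
}
```

Every layer here can be justified in isolation - "program to an interface", "use a factory", "drive it from data" - which is exactly the trap.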
They're called 'design decisions' for a reason: there's a decision to be made. You can't hide behind just doing commonly accepted 'good' things. If you do, you'll end up producing 'enterprise' code that runs around in circles, reinvents all kinds of wheels and/or pulls in every library you can think of, bloats out to infinity, and is still a complete pain to actually maintain. And then I have to review the code and explain what's wrong. And guess what? Every single decision gets justified as commonly accepted good practice. *sigh*
Posted 2008-09-12.