Defining and refining conventions

At last night’s ADNUG talk, Jeremy Miller talked about Convention over Configuration, and many of the principles the Rails community embraces.  He showed a few examples of opinionated software, such as FubuMVC.  One thing I would have liked more conversation around (no time, alas) was the process of discovering conventions and forming the opinions that make up opinionated software.

Opinionated software, as I see it, is a framework that provides intentionally chosen axes of change, where other axes are fixed to adhere to agreed-upon principles.  In Rails’ Active Record, for example, my entity’s shape is by default whatever shape my table structure is, optimizing for very little configuration.  Because Ruby is a dynamic language, it can get away with this easily through meta-programming tricks.
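A toy sketch of that metaprogramming idea (not Active Record itself, which reads columns from the live database schema; here a hash stands in for the schema, and the class names and table are invented for illustration):

```ruby
# Fake schema lookup standing in for the real database metadata.
FAKE_SCHEMA = {
  "users" => ["name", "email"]
}

class TinyRecord
  # When a subclass is defined, derive its "table" by convention
  # (Users -> "users") and generate an accessor per column, instead
  # of requiring any per-entity configuration.
  def self.inherited(subclass)
    super
    table = subclass.name.downcase + "s"
    (FAKE_SCHEMA[table] || []).each do |column|
      subclass.define_method(column) { @attributes[column] }
      subclass.define_method("#{column}=") { |value| @attributes[column] = value }
    end
  end

  def initialize(attributes = {})
    @attributes = attributes
  end
end

# No attribute declarations anywhere -- the shape comes from the schema.
class User < TinyRecord; end

user = User.new("name" => "Ayende")
user.email = "ayende@example.com"
puts user.name   # => Ayende
puts user.email  # => ayende@example.com
```

The convention (class name maps to table name, columns map to attributes) is the fixed axis; the schema itself is the axis of change.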

But how do we arrive at such opinions?  How do we decide which principles are acceptable, which are not?  Every design decision has a tradeoff, and frameworks like Rails aren’t going to satisfy everyone’s opinion.  We need some mechanism to form these opinions and craft our conventions.

Integrating evolutionary design

With our wall of pain, we strive to ensure that we have one design vision.  Introduce a refactoring, and we want to retrofit and remove duplication everywhere.  Often, this is architectural duplication, such as the knowledge of a required field propagated throughout every layer in the system.  Eliminating this architectural duplication through conventions and opinions would likely take several iterations of the design before everyone is happy with the result.
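To make the “required field” duplication concrete, here is a hypothetical sketch (the `Required` module, `Customer` class, and layer functions are all invented for illustration) where the rule lives in one declaration and each layer derives its behavior from it, rather than repeating the knowledge per layer:

```ruby
# Single place where "required" is declared; layers ask the model
# instead of hard-coding their own copy of the rule.
module Required
  def required(*fields)
    @required_fields = fields
    attr_accessor(*fields)
  end

  def required_fields
    @required_fields || []
  end
end

class Customer
  extend Required
  required :name, :email    # the one source of truth
end

# Validation layer derives its checks from the declaration...
def validation_errors(record)
  record.class.required_fields
        .reject { |field| record.send(field) }
        .map    { |field| "#{field} is required" }
end

# ...and the UI layer derives its markup from the same declaration.
def form_field_html(klass, field)
  marker = klass.required_fields.include?(field) ? " *" : ""
  "<label>#{field}#{marker}</label>"
end

customer = Customer.new
customer.name = "Northwind"
puts validation_errors(customer).inspect      # => ["email is required"]
puts form_field_html(Customer, :email)        # => <label>email *</label>
```

Change the declaration, and every layer follows; that is the retrofit the convention buys us.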

But propagating that design carries a real, tangible cost.  Iterating a design while propagating it causes very real churn, even to the point where it can frustrate developers and discourage innovation.  On the flip side, iterating endlessly and never retrofitting our opinions leads to chaos, as we fall into the trap of having a hundred truths in our system, each of them correct at one point in time.  A new developer who came on recently vented his frustration with this problem, as he was spinning his wheels on an old design, simply because he picked the wrong version of the truth to model from.

So we need to both iterate and propagate our design, ideally at key tipping points where we’ve arrived at a sound design and no important unanswered questions remain.  We might still have questions about our design, but the answers may only come through applying it in a variety of scenarios.

Past the tipping point

In my experience, these tipping points are fairly obvious, and follow Evans’ concept of breakthrough refactorings.  We make incremental enhancements, slowly improving our design over time.  Once we reach a critical mass of awareness of the problems and understanding of the domain, we introduce a change that dramatically improves the design.

These tipping points can’t be anticipated or planned for.  In fact, trying to form opinions in the absence of any context for those opinions is very likely to lead to awkward, friction-inducing development.  Another name for premature opinions might be BDUF.

The middle ground here is one where we become finely attuned to the pain induced by our design, resist inventing problems where they don’t exist, iterate our design, and retrofit after each breakthrough.  Opinionated software is a fantastic concept, but we can’t confuse opinion formation with misguided attempts to make all design decisions upfront, without first agreeing on the principles that lead to the opinions.

Reflecting reality