Effective Tests: How Faking It Can Help You

In part 1 of our Test-First example, I presented a Test-Driven Development primer before beginning our exercise.  One of the techniques I’d like to discuss a little further before we continue is the TDD practice of using fake implementations as a strategy for getting a test to pass. 

While not discounting the benefits of using the Obvious Implementation first when a clear and fast implementation can be achieved, the recommendation to “Fake It (Until You Make It)” plays a part in several helpful strategies, each with its own benefits:


Going Green Fast

Faking it serves as one of the strategies for passing the test quickly. This has several benefits:

First, it provides rapid feedback that your test will pass when the expected behavior is met. This can be thought of as a sort of counterpart to “failing for the right reason”.

Second, it has psychological benefits for some, which can aid in stress reduction through taking small steps, receiving positive feedback, and providing momentum.

Third, it facilitates a “safety net” which can be used to provide rapid feedback if you go off course during a refactoring effort.
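To make the idea concrete, here is a minimal sketch (in Python, with hypothetical names, not code from the exercise) of what going green fast looks like: the first implementation simply hard-codes the value the test expects.

```python
# Fake It: the quickest route from red to green is to return exactly
# what the specification expects, and nothing more.
def get_position():
    return "X"  # hard-coded fake; no real logic yet

# The test passes immediately, confirming the test itself is wired up
# correctly and will go green when the expected behavior is met.
assert get_position() == "X"
```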


Keeping Things Simple

Faking it serves as one of the strategies for writing maintainable software.

Ultimately, we want software that works through the simplest means possible. The “Fake It” strategy, coupled with Refactoring (i.e. eliminating duplication) or Triangulation (writing more tests to prove the need for further generalization), leads to an additive approach to arriving at a solution that accommodates the needs of the specifications in a maintainable way. Fake It + Refactoring (or Triangulation) is a disciplined formula for achieving emergent design.
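As a small illustration of this additive approach (a hypothetical example, not code from the exercise): the fake introduces duplication between the test and the implementation, and the refactoring step removes that duplication, leaving a real implementation behind.

```python
def total(prices):
    return 15  # Fake It: duplicates the expected value in the test below

assert total([5, 10]) == 15  # green, but only through duplication

# Refactor: eliminate the duplication by computing the value the fake copied.
def total(prices):
    return sum(prices)

assert total([5, 10]) == 15  # still green; the specified behavior is unchanged
```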


Finding Your Way

Faking it serves as a strategy for reducing mental blocks. 

As the ultimate manifestation of “Do the simplest thing that could possibly work”, quickly seeing how the test can be made to pass tends to shine a bit of light on what the next step should be. Rather than sitting there wondering how to implement a particular solution, faking it and then turning your attention to the task of eliminating duplication or triangulating the behavior will push you in the right direction.
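A brief sketch of triangulation (hypothetical names): when the next step is unclear, writing a second, deliberately different specification exposes the fake and points directly at the generalization needed.

```python
def double(n):
    return 4  # Fake It: satisfies the first specification only

assert double(2) == 4

# Triangulate: a second example would fail against the fake above,
# forcing the implementation to generalize.
def double(n):
    return n * 2

assert double(2) == 4
assert double(3) == 6
```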


Identifying Gaps

Faking it serves as a strategy for revealing shortcomings in the existing specifications.

Seeing first hand how easy it is to make your tests pass can help highlight how an implementation might be modified in the future without breaking the existing specifications. Part of the recommended strategy for keeping your code maintainable is to remove unused generalization. Generalization which eliminates duplication is needed, but your implementation may include generalization for which the driving need isn’t particularly clear. Using a fake implementation can help uncover behavior you believe should be explicitly specified, but which isn’t pinned down by the current specifications. Faking it can lead to questions such as: “If I can make this pass with anything that produces this value, what would prevent someone from altering the implementation I have in mind without breaking a test?”


Conditioning

Lastly, faking it conditions you to see the simplest path first. When you frequently jump to the complex, robust, flexible solution, you’ll tend to condition yourself to think that way when approaching problems. When you frequently do simple things, you’ll tend to condition yourself to see the possible simplicity in a solution.


Conclusion

While we should feel free to use an Obvious Implementation when one presents itself, the Test-Driven Development strategy of “Fake It (Until You Make It)” can play a part in several overlapping strategies which help us to write working, maintainable software that matters.

About Derek Greer

Derek Greer is a consultant, aspiring software craftsman and agile enthusiast currently specializing in C# development on the .Net platform.

  • Peter Polák

    Hi Derek,

    I have no problem with “faking it”, but I don’t understand why you actually do the implementation stuff in the refactoring phase. I know you call it eliminating duplication, but to me it seems like you break one of the rules of TDD: you change production code without a failing test.

    I believe the correct steps should be:
    1) write failing test
    2) make it green
    3) refactor – where “refactor” really means to refactor the code: to shape it up without changing the business logic.

    What you seem to be doing is:
    1) write failing test
    2) make it green
    3) change the business logic to suit your purpose (without writing any more tests)
    4) refactor

    I believe there should be at least another failing test between 2) and 3).

    • Derek Greer

      Thanks for taking the time to provide feedback, Peter.

      First, let me commend you on having a keen sense of perception. What you’ve discovered is that in Test-Driven Development, much of the implementation of our solutions actually occurs during the refactoring phase. To review, the steps of TDD are as follows:

      1. Write a test.

      2. Make it compile.

      3. Run it to see that it fails.

      4. Make it run.

      5. Remove duplication.

      This process is commonly shortened to Red/Green/Refactor.

      Where I believe you’ve misunderstood the Test-Driven Development process is in the Refactor step. In general, refactoring is the process of making changes to an application’s source code without modifying its functional requirements. Refactoring can entail a number of changes, but the TDD process sets forth the elimination of duplication as the primary heuristic for guiding (and restricting) our modifications. There have been a couple of cases where, during the refactoring phase, I’ve extracted a method to help clarify the intent of the code emerging from my example, but for the most part changes have occurred solely to eliminate duplication.

      Without a full grasp of the TDD process, the concept of not changing code without a failing test can actually be a little misleading. In fact, it would be impossible to follow the TDD steps without doing this, as refactoring by definition is changing code, and refactoring comes after the tests are passing. What this guidance actually means is that we shouldn’t begin implementing new requirements until a specification is written first. If you review the steps you’ve set forth in your comments, I think you’ll discover that you are violating your own rule in that you recommend shaping up the code (unsure what this means exactly) after you’ve made a test green, but before you’ve written another test.

      I hope this explanation helps.

      • Peter Polák

        Hi Derek,

        thank you for your answer, I really appreciate the possibility to discuss TDD here on LosTechies.

        I begin by quoting you: “What this guidance actually means is that we shouldn’t begin implementing new requirements until a specification is written first” – but isn’t that exactly what you did? In the very first test of Part I, you got the test green by making GetPosition() return ‘X’. Then, in the refactoring phase, you implemented the requirement by adding _layout – without a failing test.

        What I believe you should have done was to write another test that would fail on that naïve GetPosition() implementation, thus forcing you to fix it – e.g. by adding the _layout member variable. Without the next test, your first test’s requirement was fully implemented – so why bother with _layout?

        Replacing the “fake it” solution with some “real” implementation isn’t refactoring, because – well, it changes the business logic. Before the change, your GetPosition() always returned ‘X’, no matter what. Afterwards, it returned the value of _layout. I believe refactoring should not change return values of methods like this. If I’m wrong, then I must say I fail to tell the difference between refactoring and implementation.

        I’ve been doing TDD for only a year now, so of course I may be wrong; but from what I know, the “right” way of doing it is to really implement only the requirements that are contained in the tests. This is what I learned from e.g. Roy Osherove’s videos (then again, of course maybe I got it all wrong).

        • Derek Greer

          Peter,

          Again, I think you’re misunderstanding what refactoring means. In the case of the first example, the functional requirements were for the game to put the player’s choice in the selected position when they go first. I chose to verify this by checking that the selected position had an ‘X’ in it, made the test pass by hard-coding the GetPosition() method to always return ‘X’, and then refactored to eliminate the duplication between the value returned by the GetPosition() and the value being verified within the test. Here is the important part: In the end, the behavior being verified by the specification did not change. I think this is the key component you’re missing. Refactoring isn’t the process of changing code without introducing new capabilities, it’s the process of changing code without changing the functional requirements.
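The Part 1 code itself isn’t reproduced in this thread, so the following is only a rough Python sketch of the shape described above, with hypothetical names: the fake hard-codes ‘X’, and the refactoring eliminates the duplication between that literal and the mark the player chose by introducing _layout, without changing the behavior the specification verifies.

```python
# "Make it run" - Fake It: hard-code the value the specification checks for.
class Game:
    def get_position(self, position):
        return "X"  # fake; duplicates the value verified by the test

# "Remove duplication" - the literal 'X' above duplicates the mark the
# player placed; eliminating that duplication introduces the _layout state.
class Game:
    def __init__(self):
        self._layout = {}

    def place(self, position, mark):
        self._layout[position] = mark

    def get_position(self, position):
        return self._layout[position]

game = Game()
game.place(0, "X")
# The behavior being verified by the specification has not changed:
assert game.get_position(0) == "X"
```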

          The process of writing another failing test before introducing generalization, which you’re advocating, is referred to within Test-Driven Development as Triangulation. While this is a legitimate technique, it isn’t set forth as a primary strategy for most cases.

          I appreciate that you’re trying to understand the rules and follow them to the letter, as this is often a necessary stage in skill acquisition. I’m not that familiar with Roy’s approach to testing, but I would highly recommend you read the book: Test-Driven Development By Example. The author, Kent Beck, created the Test Driven Development methodology. After consulting his work, I believe you’ll find my example to be a faithful representation of his approach.

          Moving beyond an appeal to authority however, what’s more valuable is to understand the reason and value in these techniques. What value are we gaining from writing tests first? What’s the value in Red/Green/Refactor? What are the motivations behind using the strategies of Obvious Implementation, Fake It, and Triangulation? What are the shortcomings? When should you shift into “second gear”? When should you downshift back to taking teeny-tiny steps? Once you can answer these questions, you’ll be on the road to mastering TDD.

          Moreover, take from Kent Beck’s TDD methodology, Dan North’s BDD methodology, Steve Freeman and Nat Pryce’s A-TDD methodology, Roy Osherove’s methodology and all the styles, approaches, and theories in between and find a synthesis that helps you be an effective software engineer.

          Cheers.

  • Jakub Linhart

    Hi,
    thank you for a very interesting post. “Fake It” promises to make some things easier, but I am afraid that it could result in code crowded with “Fake It” parts because I just forgot to refactor them. Tests won’t point them out because they are all green (faked perfectly:). Negative feedback will then come from QA, or even worse, from production.

    • Derek Greer

      Yes, when you use the Fake It strategy to get a test passing quickly without following through with refactoring, you’ll end up with code that is hard-wired to only satisfy the test scenarios used within your specifications. Similarly, when you use the Obvious Implementation strategy without following through with refactoring, you’ll end up with a lot of code duplication. The key to both issues is not to avoid using these strategies, but to discipline yourself to follow the Test-Driven Development methodology. If you were to try Test-Driven Development out for a while, I think you would find your fears assuaged.

      • Jakub Linhart

        It is not about omitting the refactoring phase but about NOT forgetting to refactor ALL the faked parts. But you are right, it is better to try it out first and worry about those fears then:).
