How not to implement a failing test

One of the first things I change in ReSharper addresses one of my biggest pet peeves: a failing test that fails because of something like this:

public class CombinedStreetAddressResolver
    : NullSafeValueResolver<Address, string>
{
    protected override string ResolveCore(Address model)
    {
        throw new NotImplementedException();
    }
}
In the Red-Green-Refactor progression, the Red of a failing test should come from an assertion failure, not a “my code is stupid” failure. The Red step is intended to triangulate and calibrate your test, to make sure that your test can fail correctly. A NotImplementedException won’t cause a meaningful failure; it only serves the purpose of getting your code to compile.

Throwing exceptions means your assertions never get executed during Red-Green-Refactor until you attempt to make the test pass, at which point you still haven’t proven your test to be correct. If you don’t know that your test is correct, you have two points of failure: your test and the code under test.

That’s why I’ve set ReSharper to return default values instead of throwing exceptions. I want meaningful failures from a valid test; otherwise I’m better off skipping the Red step altogether.
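To sketch the difference, here is a hypothetical NUnit-style test against a simplified version of the resolver above (the NullSafeValueResolver base class is elided, and the Address properties and expected output are invented for illustration). With a default return value, the Red comes from the assertion itself:

```csharp
using NUnit.Framework;

public class Address
{
    public string Line1 { get; set; }
    public string Line2 { get; set; }
}

public class CombinedStreetAddressResolver
{
    public string Resolve(Address model)
    {
        // ReSharper-generated body: a default value instead of an exception.
        // The code compiles, does nothing useful yet, and lets the test
        // reach its assertion.
        return default(string);
    }
}

[TestFixture]
public class CombinedStreetAddressResolverTests
{
    [Test]
    public void Should_combine_the_street_address_lines()
    {
        var address = new Address { Line1 = "123 Main St", Line2 = "Apt 4" };
        var resolver = new CombinedStreetAddressResolver();

        string result = resolver.Resolve(address);

        // Red: fails with something like "Expected: '123 Main St Apt 4'
        // But was: null" -- an assertion failure that proves the test can
        // fail meaningfully, rather than a NotImplementedException that
        // never reaches the Verify step.
        Assert.AreEqual("123 Main St Apt 4", result);
    }
}
```

The failure message now documents the expected behavior; making it pass is then a matter of implementing Resolve, with the test already calibrated.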

About Jimmy Bogard

I'm a technical architect with Headspring in Austin, TX. I focus on DDD, distributed systems, and any other acronym-centric design/architecture/methodology. I created AutoMapper and am a co-author of the ASP.NET MVC in Action books.
This entry was posted in TDD.
  • Darren

    There are several defaults in R# that I don’t like, but I haven’t had the motivation to find out how to change them.

    Maybe do a post on how you customize ReSharper?

  • This is not only ReSharper; this also happens when you generate a stub with Visual Studio itself…

  • Arnis L.

    Hmmm… this idea seems quite odd and new to me.

    I thought that throwing exceptions lets you model a prototype of the necessary functionality a bit further, so you could actually have something to make assertions on. I guess there are serious gaps in my knowledge of TDD. :/

    Anyway – thanks for making me think.

  • Could this ever result in a test passing when it shouldn’t? For example, say you have this function (which ReSharper has return 0 by default):

    public int AddTwoInts(int a, int b) { return 0; }

    If you have a test that asserts AddTwoInts(0, 0) == 0 then it would pass.

    I know this is simple and contrived, but I’ve always liked the NotImplementedException telling me “hey guess what, you haven’t written your code yet – go do that”.

    Interesting post though, makes me re-examine how I do this…

  • @MrDustpan

    Remember, the Red in RGR means a meaningful failure. The test you have would pass immediately, so I would need to change my test so that the assertion fails.
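    That calibration can be sketched like this, reusing MrDustpan’s AddTwoInts stub (the class name and NUnit-style tests are hypothetical): pick inputs whose expected result differs from the default return value, so the Red comes from the assertion rather than coinciding with the stub.

    ```csharp
    using NUnit.Framework;

    public static class Calculator
    {
        // Stub with a default return value instead of an exception.
        public static int AddTwoInts(int a, int b) { return 0; }
    }

    [TestFixture]
    public class AddTwoIntsTests
    {
        [Test]
        public void Zero_inputs_pass_prematurely()
        {
            // Passes against the empty stub -- a useless Red, so the
            // test needs recalibrating with different inputs.
            Assert.AreEqual(0, Calculator.AddTwoInts(0, 0));
        }

        [Test]
        public void Nonzero_inputs_fail_meaningfully()
        {
            // Red: "Expected: 5 But was: 0" -- the assertion itself
            // fails, proving the test can detect a wrong implementation.
            Assert.AreEqual(5, Calculator.AddTwoInts(2, 3));
        }
    }
    ```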

  • I can’t entirely agree here. I like throwing the exception to start because I know that it can’t pass as long as I call the code.

    If I return the default value and that is what the test expected I’d have a test that passed to begin with, and then I wouldn’t be sure it was tested or implemented correctly.

    Once I am sure the code is being executed and failing, because it is not implemented, I will set up my assertion how I want it and remove the exception.

  • @scott: In ReSharper 4.5, this is in Options, Languages, Generated members.

  • I like how in Aaron Jensen’s MSpec, not implemented specs are considered neither passing nor failing; they are considered Not Implemented yet! The yellow of Red and Green testing. :)

  • I don’t think I really agree with you – where is the rule that says the Red in RGR should be a meaningful failure? When I’m designing my tests I want to write the least amount of code possible to get the code compiling. That’s the first step – I just want to be able to run the test and see it fail. The next step is to write the simplest code to get the test to pass – this is usually something along the lines of ‘return 5;’

    I suppose we are arguing semantics but my point is that this is such a short step in your design process that I would suggest that you do whatever you feel comfortable with.

  • Playing devil’s advocate: what’s not meaningful about throwing a NotImplementedException? It fails because it’s not implemented, which certainly seems meaningful.

  • @Brendan

    That’s why you write a failing test first – you need to calibrate your test to make sure your test is correct.

  • @Jaco

    But now you’ve never shown that the “Assert” part of AAA works correctly. If “Red” means doing “Setup, Execute” but never “Verify”, the “Green” part now has two points of failure.

  • @James

    The test failure provides no meaning. If a test fails because of the Execute step borking, my only recourse is debugging. Assertion failures provide the most documentation.

    If your first goal in RGR is to fail a test, why bother writing the assertions yet? Seems like a waste of time if all you want to see is a failure, no matter where the source.

  • I’m failing to see the distinction between a NotImplementedException and returning a default value. Both are simply a stop-gap mechanism for getting your code to compile so you can -see- the failure. In a less strict language you wouldn’t need this step, because your test would be immediately runnable, but without this luxury we’re stuck with either default values or exceptions. To favour one is personal preference, but decrying one over the other seems pedantic for a step that’s only necessary due to environmental constraints.

  • @James

    The point is to get to a failing assertion – not just a failing test. Assertion failures tell me if my code under test is working or not. Exceptions tell me nothing. It has nothing to do with environmental constraints – it’s the critical “Red” step to prove my test is correct. Exceptions prove nothing.

  • Arnis L.

    You should add those comments to your post. Now it sounds much more reasonable.

    Btw – thanks for the presentation. It was really nice.