Sunday, 13 October 2013

Test Driven Education [part 3 of 4]

A full Test Driven Project to learn from.

In the last post, we've seen what Dependency Injection is and, in particular, how it makes unit testing easier: by injecting fake collaborators, we fully control the inputs and outputs of the class being tested. In this third part we're going to discover five different ways of faking collaborators, commonly known as Test Doubles:
  • Dummy Object
  • Stub Object
  • Fake Object
  • Spy Object
  • Mock Object
As usual, we'll extend Domus to fulfill new requirements. In particular, we want to persist the PIN code into a database, so that it survives system reboots, and we want to produce logs reporting relevant events such as alarm engagement and disengagement. To satisfy these two requirements, we'll introduce a Database and a Logger.


Stub Object

The first Test Double to talk about is the Stub Object: its only responsibility is to return known values (fixed or configurable) to its callers; therefore stubs suit very well when an object serves as an input to the class being tested.
We need a stub object to redefine the requirements of all the tests dealing with the PIN code. The new requirement, in fact, is that SecurityAlarm shall validate the PIN entered by the user against the one persisted in the database (for brevity, only the code of testEngageAlarm() is shown here):

  public void testEngageAlarm() {
    // setup
    StubDatabase db = new StubDatabase();
    SecurityAlarm alarm = new SecurityAlarm(db);
    // exercise
    alarm.engage("0000");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }
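
For reference, Database is just an interface describing the new collaborator; at this stage it only needs to expose the persisted PIN. A minimal sketch of it (the StubDatabase used above simply implements it with a fixed value, as we'll see shortly) might be:

public interface Database {
  // the PIN currently persisted (a setPin() method will be added later in this post)
  String getPin();
}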

None of these tests compiles anymore, as SecurityAlarm now expects a Database to be injected, so we define the new constructor:

  public SecurityAlarm(Database db) {
  }

The tests we've just changed now compile; however, the tests we didn't touch (the ones not dealing with the PIN code) don't compile anymore, because SecurityAlarm no longer has an empty constructor. This gives us the opportunity to talk about dummies.

Dummy Object

Dummies are objects whose only responsibility is to shut the compiler up. They concretely implement their interfaces to pass the type checking at compilation time, but they shouldn't be invoked at all by the class being tested. Bear in mind that the null keyword is considered a dummy object and it is more than welcome in unit tests, as it highlights that a particular collaborator is not involved in that particular unit test (hence, the cause of a failure should be sought elsewhere). However, the class being tested might not allow null to be passed; in these cases, a Null Object needs to be used. As said, it concretely implements its interface but either does nothing (i.e. empty methods) or makes the test fail, depending on context and taste.
To make our unit tests compile again, we'll use the null keyword wherever required, for example in testAlarmNotEngagedAtStartup():

  public void testAlarmNotEngagedAtStartup() {
    // setup
    SecurityAlarm alarm = new SecurityAlarm(null);
    // verify
    assertFalse("Alarm engaged", alarm.isEngaged());
  }

With these changes the tests compile again. They still pass, but this shouldn't confuse us; SecurityAlarm is still satisfying its requirements:
  • Engaging/Disengaging the alarm
  • Activating the sirens
  • etc
The fact is: we're now doing a refactoring. When we refactor some code, we move from a version of the software that satisfies its requirements to a different version which still satisfies them. So, let's keep refactoring SecurityAlarm and remove all references to the private variable _pin and the DEFAULT_PIN constant:

public class SecurityAlarm implements Sensor.TriggerListener {
  ...
  private Database _db;
  ...
  public SecurityAlarm(Database db) {
    _db = db;
  }

  private boolean isPinValid(String pin) {
    return pin.equals(_db.getPin());
  }

  ...

  public void changePinCode(String oldPin, String newPin) {
    if (isPinValid(oldPin)) {
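      // the new PIN is not persisted anywhere yet: this is what will make testChangePinCode() fail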

    }
  }
  ...
}

Most of the tests are still passing, but others, like testChangePinCode(), are now failing, which means something is missing. This brings us to a new requirement and a new Test Double.

Fake Object

Fakes are the next step toward the real implementation: they pretend very well to be what they claim to be. They're cleverer than stubs because they start having some degree of logic driving their return values, possibly as a function of other properties that the class being tested might even set. A FakeDatabase is then what we need to make testChangePinCode() pass again (please note that the Database interface has also been changed to offer a setPin() method):

public class SecurityAlarmTest extends TestCase {

  class StubDatabase implements Database {
    @Override
    public String getPin() { return "0000"; }
    @Override
    public void setPin(String pin) {
      /* Do nothing */
    }
  }

  class FakeDatabase implements Database {
    String pin = "0000";
    @Override
    public String getPin() { return pin; }
    @Override
    public void setPin(String pin) { this.pin = pin; }
  }
  ...
  public void testChangePinCode() { 
    // setup
    FakeDatabase db = new FakeDatabase();
    SecurityAlarm alarm = new SecurityAlarm(db);
    // exercise
    alarm.changePinCode("0000", "1234");
    alarm.engage("1234");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }
  ...
  public void testCanChangePinCodeMoreThanOnce() { 
    // setup
    FakeDatabase db = new FakeDatabase();
    SecurityAlarm alarm = new SecurityAlarm(db);
    // exercise
    alarm.changePinCode("0000", "1234");
    alarm.changePinCode("1234", "5678");
    alarm.engage("5678");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }
}
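
To make testChangePinCode() and testCanChangePinCodeMoreThanOnce() pass, changePinCode() has to persist the new PIN through the injected Database. A minimal sketch of how its body might be completed (the exact code is on git) is:

  public void changePinCode(String oldPin, String newPin) {
    if (isPinValid(oldPin)) {
      // persist the new PIN so that later validations read it back from the database
      _db.setPin(newPin);
    }
  }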

Spy Object

We've already seen spies in the previous post: they're useful when we're only interested in the outputs of the class being tested. They're not usually as clever as Fakes, because they have a different responsibility: they only keep track of which methods have been invoked (and possibly their arguments); the unit test then queries the Spy Object to assert that the expected methods have been invoked. Finally, as a courtesy, they might also return known values (fixed or configurable) to their callers, but there shouldn't be any logic behind these return values. In Domus, a Spy is exactly what we need for the Logger. Reflecting the events relevant to the Home Security domain, Logger is an interface exposing the following methods:
  • alarmEngaged()
  • alarmDisengaged()
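A minimal sketch of such an interface might be:
public interface Logger {
  void alarmEngaged();
  void alarmDisengaged();
}
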
To keep this post short, I'm going to show only the code for the engagement case (you can find the complete code on git). Like Database, Logger is a collaborator and we'll inject it through the SecurityAlarm constructor. The requirement is very simple: when the alarm is engaged, a log should be produced. The following is the unit test:

public class SecurityAlarmTest extends TestCase {
  ...
  class SpyLogger implements Logger {
    private boolean _alarmEngagedLogged;

    @Override
    public void alarmEngaged() {
      _alarmEngagedLogged = true;
    }

    public void assertAlarmEngagedLogged() {
      Assert.assertTrue("Alarm engaged not logged", _alarmEngagedLogged);
    }
      
  }

  public void testLogWhenAlarmIsEngaged() {
    // setup
    StubDatabase db = new StubDatabase();
    SpyLogger logger = new SpyLogger();
    SecurityAlarm alarm = new SecurityAlarm(db, logger);
    // exercise
    alarm.engage("0000");
    // verify
    logger.assertAlarmEngagedLogged();
  }
}

Making this test pass should be quite easy, as it's just about invoking the logger when the alarm is engaged:

  public SecurityAlarm(Database db, Logger logger) {
    _db = db;
    _logger = logger;
  }
  ...
  public void engage(String pin) {
    if (isPinValid(pin)) {
      _engaged = true;
      _logger.alarmEngaged();
    }
  }

However, the old tests fail if we pass null as a dummy logger, because engage() would now dereference a null _logger. This is, in fact, a case where a Null Object is needed as a dummy.
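
A minimal sketch of such a Null Object (let's call it NullLogger, just for illustration) implements the Logger interface with empty methods, so the old tests can keep ignoring the logging aspect:

  class NullLogger implements Logger {
    @Override
    public void alarmEngaged() { /* Do nothing */ }
    @Override
    public void alarmDisengaged() { /* Do nothing */ }
  }

The old tests can then be updated to inject it, for example new SecurityAlarm(db, new NullLogger()).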

Mock Object

We're not going to see this one in practice, because we should avoid Mock Objects as much as possible. A Mock Object is kind of a superset of a Spy: it keeps track of every single method the class under test calls (even indirectly), together with all of its arguments, and makes the test fail if:
  • a method has been invoked unexpectedly
  • a method has not been invoked when it should have been
  • a method has been invoked too many or too few times
  • a method has been invoked with the wrong arguments
  • a method has been invoked in the wrong order with respect to another one
All these checks might sound good, but in most cases they're not. The reason to avoid Mocks is that, by their nature, they impose how the class being tested should be implemented rather than what it should achieve. As soon as the class being tested is extended to implement new features (which is a very good thing), all the tests relying on Mock Objects will very probably fail (which is a very bad thing). So, what is a Mock Object good for? It's only good when we need to test that a piece of code invokes an external API properly, which is rare and for which component tests are probably better suited.

Sunday, 6 October 2013

Test Driven Education [part 2 of 4]

A full Test Driven Project to learn from.

In the previous post we've seen the development cycle of TDD and a first set of its important features which I want to recall once again:
  • It leads to the cleanest and most honest API ever.
  • It leads to the right and most effective balance of bottom-up and top-down design.
  • Unit tests make refactoring enjoyable rather than risky and scary.
  • Unit tests are quick to copy-paste-tweak, letting us test even paranoid cases.
In this second post we will appreciate the effectiveness of the Dependency Injection design pattern, naturally promoted by TDD. Its name should be quite self-explanatory: a component relying on other objects to get its job done asks explicitly for instances of those objects via its API, rather than locating and instantiating them on its own. Although a component able to locate and instantiate its collaborators on its own might sound convenient, it is not: it leads to a Software Architecture made of components tightly coupled to each other, making automated testing an almost impossible mission, which, in turn, reduces the quality of the end product. Dependency Injection, instead, enforces decoupling and modularization, which increases the quality of the end product. The unit tests themselves prove how modular the overall architecture gets: every single component collaborates with at least two different implementations of its collaborators: the production one and the fake one.
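
As a plain illustration (not code from Domus; Notifier, NotifierRegistry and the two alarm classes below are purely hypothetical), compare a component that locates its collaborator on its own with one that asks for it:

interface Notifier {
  void notifyOwner(String message);
}

// a hypothetical singleton, shown only to illustrate the "locate it yourself" style
class NotifierRegistry {
  private static final Notifier INSTANCE = new Notifier() {
    @Override
    public void notifyOwner(String message) { /* e.g. send an SMS */ }
  };
  public static Notifier getInstance() { return INSTANCE; }
}

// tightly coupled: the collaborator is looked up internally, so a unit test cannot replace it
class LocatingAlarm {
  private final Notifier _notifier = NotifierRegistry.getInstance();
  public void somethingHappened() { _notifier.notifyOwner("intrusion"); }
}

// Dependency Injection: the collaborator is asked for explicitly, so a unit test can inject a fake
class InjectedAlarm {
  private final Notifier _notifier;
  public InjectedAlarm(Notifier notifier) { _notifier = notifier; }
  public void somethingHappened() { _notifier.notifyOwner("intrusion"); }
}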

To have a concrete proof of this, we're going to extend our Security System, making SecurityAlarm interact with burglar sensors and sirens. However, we'll first step back to the "old" times when we used to design software up-front, and then we'll compare that with TDD.

Let's have a look at a couple of requirements:
  • When the alarm is engaged, the system shall activate all the sirens when at least one burglar sensor triggers.
  • The system shall provide hot plugging of sensors and sirens.
Let's now imagine what could have happened in the "old" times. We would probably have begun by observing that the first requirement is pretty easy: we can have a vector of sensors and a vector of sirens; periodically, we poll all the sensors and, if any of them has triggered, we iterate through the vector of sirens and activate them all. Then the Chief Architect might have raised his concern about performance and asked to implement an EventBus to asynchronously convey the sensors' signals to SecurityAlarm. The second requirement might be slightly more complicated, but for sure SecurityAlarm has to open the serial ports to scan for sensors and sirens. Then the Chief Architect might have raised his concern again, this time about portability, and introduced a SensorLocator which might use the EventBus to notify SecurityAlarm about new sensors becoming available. A couple of meetings (and design documents) might have followed, with plenty of details about the EventBus and the SensorLocator: they'll be singletons, running in their own processes, the IPC will be implemented via message passing, etc. Only after a few days would we have been able to start the actual implementation of SecurityAlarm: it would probably have retrieved the instance of EventBus first, to subscribe itself in order to receive events; EventBus might then bring up the IPC mechanism by reading some config file; SecurityAlarm might also have retrieved the instance of SensorLocator to kick off a first scan for sensors; on its side, SensorLocator might read the config files and open all the serial, USB and I2C ports accordingly. Suddenly, we would have needed a fully working system, and our development process would have slowed down because of it.

However, we use TDD and we can start implementing those two requirements immediately. Let's formalize the first requirement with a test:

  public void testActivateTheSirenWhenTheSensorTriggers() {
    // setup
    class SpySiren implements Siren {
      boolean activated = false;
      @Override
      public void activate() {
        activated = true;
      }
    }
    SecurityAlarm alarm = new SecurityAlarm();
    SpySiren spySiren = new SpySiren();
    Sensor sensor = new Sensor();
    alarm.addSiren(spySiren);
    alarm.addSensor(sensor);
    alarm.engage("0000");
    // exercise
    sensor.trigger();
    // verify
    assertTrue("Siren not activated", spySiren.activated);
  }
This test gives us the opportunity to see how TDD naturally balances the bottom-up and top-down approaches. We've started bottom-up, defining how we would like SecurityAlarm to look: we want it to have two new methods, addSiren() and addSensor(), to pass in a Siren and a Sensor. Going now top-down, Siren should be an interface providing an activate() method, which SecurityAlarm will invoke when the Sensor triggers. However, since Sensor does not exist yet, we'll now jump to defining its requirement (which is still top-down, by the way).

public class SensorTest extends TestCase {
  public void testNotifySubscriberWhenTriggered() {
    // setup
    class SpyTriggerListener implements Sensor.TriggerListener {
      boolean notified = false;
      @Override
      public void triggered() {
        notified = true;
      }
    }
    Sensor sensor = new Sensor();
    SpyTriggerListener spyListener = new SpyTriggerListener();
    sensor.addTriggerListener(spyListener);
    // exercise
    sensor.trigger();
    // verify
    assertTrue("Subscriber not notified", spyListener.notified);
  }
}
Please note that while writing this test we have switched back to bottom-up, defining how Sensor should look and defining the TriggerListener interface. The following is its implementation.
public class Sensor {
  static public interface TriggerListener {
    public void triggered();
  }
  private TriggerListener _listener;
  public void addTriggerListener(TriggerListener listener) {
    _listener = listener;
  }
  public void trigger() {
    _listener.triggered();
  }
}

Now that Sensor is working, we can go back to implementing SecurityAlarm.

public class SecurityAlarm implements Sensor.TriggerListener {
  ...
  private Siren _siren;
  @Override
  public void triggered() {
    _siren.activate();
  }
  public void addSensor(Sensor sensor) {
    sensor.addTriggerListener(this);
  }
  public void addSiren(Siren siren) {
    _siren = siren;
  }
}
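
For completeness, the Siren interface that both SpySiren and this implementation rely on is tiny; a minimal sketch of it might be:

public interface Siren {
  // sound the siren; implementations might also turn on a spotlight, send an SMS, etc.
  void activate();
}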

So far, we've almost implemented both requirements and we now have everything in place to complete them. However, we'll skip over this, as it's just a matter of applying what we learned in the previous post.

What is really important to note is that TDD has let the design emerge on its own. Sensor can now be extended to support serial ports, USB, I2C or more, as the Chief Architect wanted, but without the need for an expensive up-front design. There is also no need to poll the sensors, which is another thing the Chief Architect would have had to design explicitly. Moreover, Siren can now have different implementations: we might even turn on a spotlight, send an SMS or more; this is something the Chief Architect didn't predict, but TDD let it emerge spontaneously.

In conclusion, the Dependency Injection design pattern increases the modularity of the Software Architecture for free and enforces decoupling between components; when using TDD, it is literally impossible to avoid it!

Saturday, 5 October 2013

Test Driven Education [part 1 of 4]

A full Test Driven Project to learn from.

Since the beginning of Software Engineering we have always seen a rapid growth in technologies. Alongside all these technologies, several methodologies and practices have been developed and presented. Among all of these, only one has really revolutionized Software Engineering: the Object Oriented Paradigm. The Functional paradigm is now coming back again, but it's not hitting the ground running this time either.
Today, another revolution is in progress and the world is increasingly joining it: Test Driven Development. The concept is really simple: write the test first, then write the code that makes the test pass. Even though the concept is simple, its effects are not that simple and immediate to recognize and appreciate. It's because of those effects that TDD is a revolution in Software Engineering.

Rather than talking about TDD all at once as many other guides do, we'll apply it to a brand new educational project, detailing all its aspects along the way, as we face them. For this project, we'll develop a Security and Domotics System in Java. All the project's code is available here on bitbucket and, to show the TDD cycle, I've always committed the test first and the production code afterwards. To identify commits which add new tests, I've always prefixed their commit message with TEST:. Additionally, I've created a tag for each post, so all of this post's code can be checked out from tag v0.1.

So, we want a Security System and we want it to be great! This time, though, rather than starting by modeling all the components this system might be made of, we will start up our IDE and create a brand new project: Domus.

First of all, Domus is a Security System, so we'll start implementing that functional area. As a Security System, we expect to be able to engage and disengage the alarm, and we want to do it by means of a PIN code. However, before engaging the alarm we start the system up, and we want it not to be engaged at this initial stage. Let's define this requirement by writing a test:
public class SecurityAlarmTest extends TestCase {
  public void testAlarmNotEngagedAtStartup() {
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // verify
    assertFalse(alarm.isEngaged());
  }
}

This code doesn't even compile, as we haven't defined SecurityAlarm anywhere. However, we've defined how our API has to look. This is the first important aspect of TDD: by writing the test first, we'll always write the cleanest and most honest API ever. Someone might argue "of course you have, it's just a stupid isEngaged() method being called". However, all of us have coped at least once with a complicated API for doing silly things like this. Honesty, as well, is really important for an API. We have all faced at least once an API pretending to have fewer dependencies than it actually had, often because it was silently invoking singletons or equivalent static methods we weren't aware of.
The following is the code that satisfies the requirement and makes the test pass.
public class SecurityAlarm {
  public boolean isEngaged() {
    return false;
  }
}
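
If you want to run the suite outside your IDE, the classic JUnit 3 text runner works fine; here is a minimal sketch (the AllTests class name is just a convention, not something from the Domus repository):

import junit.framework.TestSuite;
import junit.textui.TestRunner;

public class AllTests {
  public static void main(String[] args) {
    // collects every test* method of SecurityAlarmTest and prints the results to the console
    TestRunner.run(new TestSuite(SecurityAlarmTest.class));
  }
}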

The next bit is engaging the alarm. To do this, we want to specify our PIN code. Again, we're now defining a new method, so this code won't compile either.
  public void testEngageAlarm() {
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // exercise
    alarm.engage("0000");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }

This new requirement allows us to appreciate the Agile aspect of TDD. Someone would start implementing the engage() method by checking that the PIN code is correct, possibly that it is well formed, perhaps by means of regular expressions specified in a config file. Never do such things. Instead, if you reckon those are important features, just take note of them and stick to writing the minimum amount of code that makes the test pass. After having made the test pass, review the notes and, one by one, write the requirements for those features, that is, write the tests which will force you to implement them. Unfortunately, this requires you to be disciplined, but it pays back. So, in our case, the minimum amount of code which makes the test pass is introducing a private variable tracking the engagement status and setting it to true in the engage() method.
public class SecurityAlarm {
  private boolean _engaged = false;
  public boolean isEngaged() {
    return _engaged;
  }
  public void engage(String pin) {
    _engaged = true;
  }
}

Now that we're able to engage our alarm, we definitely want to be able to disengage it. So, let's formalize the requirement:
  public void testDisengageAlarm() { 
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    alarm.engage("0000");
    // exercise
    alarm.disengage("0000");
    // verify
    assertFalse("Alarm engaged", alarm.isEngaged());
  }

This case is just the opposite of the engage() method, so we just need to set our new private variable back to false in the disengage() method. Don't be worried about the copy-and-paste: the tests are now supervising us.
public class SecurityAlarm {
  ...
  public void disengage(String pin) {
    _engaged = false;
  }
}

Obviously, the PIN code's goal is to deny engagement and disengagement of the alarm to unauthorized people. So let's write the requirements for these new features (for convenience I've put the engagement and disengagement tests together here, but they were actually committed separately).
  public void testDoNotEngageAlarmIfPinIsWrong() {
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // exercise
    alarm.engage("1234");
    // verify
    assertFalse("Alarm engaged", alarm.isEngaged());
  }
  public void testDoNotDisengageAlarmIfPinIsWrong() {
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    alarm.engage("0000");
    // exercise
    alarm.disengage("1234");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }

The minimum changes required to make these tests pass are the following:
public class SecurityAlarm {
  ...
  public void engage(String pin) {
    if (pin.equals("0000")) {
      _engaged = true; 
    }
  }
  public void disengage(String pin) {
    if (pin.equals("0000")) {
      _engaged = false;
    }
  }
}

Here comes the second important aspect of TDD: after a test has been satisfied once, it will supervise all our refactorings, ensuring we've not broken anything. This means that we can now refactor SecurityAlarm in any way we want: if the unit tests pass, then we've got it right. In our case we can remove the string literals and introduce a private helper method isPinValid() to verify whether the PIN code is valid or not, so as to increase code expressiveness.
This is a silly refactoring, right? SecurityAlarm is the only class in this tiny code base… It has only three methods… However, the first time I did this refactoring it was almost 2am, I got it wrong, and the unit tests caught my mistake. Immediately! I've also committed my error on git so that you can check what I missed. Basically, with that bug anyone would have been able to engage/disengage the alarm even with the wrong PIN code! Try imagining how boring and disappointing it would have been to boot the system up, engage the alarm (with the right PIN), then type the wrong PIN and discover that the alarm is disengaged!
Here is how SecurityAlarm looks after the refactoring:
public class SecurityAlarm {

  static private final String DEFAULT_PIN = "0000";

  private boolean _engaged = false;

  private boolean isPinValid(String pin) {
    return pin.equals(DEFAULT_PIN);
  }

  public boolean isEngaged() {
    return _engaged;
  }

  public void engage(String pin) {
    if (isPinValid(pin)) {
      _engaged = true; 
    }
  }

  public void disengage(String pin) {
    if (isPinValid(pin)) {
      _engaged = false;
    }
  }
}

As customers, we would be very disappointed if we couldn't change the PIN code. Let's put down the requirement: we want to change the PIN code but, to be sure that we're the only ones who can change it, we also want to input the old PIN code:
  public void testChangePinCode() { 
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // exercise
    alarm.changePinCode("0000", "1234");
    alarm.engage("1234");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }

The minimum amount of change that makes all the unit tests pass consists in introducing a new private variable to store the PIN and defaulting it to the factory PIN:
public class SecurityAlarm {
  ...
  private String _pin = DEFAULT_PIN;
  private boolean isPinValid(String pin) {
    return pin.equals(_pin);
  }
  ...
  public void changePinCode(String oldPin, String newPin) {
    _pin = newPin;
  }
}

However, the feature is not complete yet. We want to make sure that only those who know the current PIN can change it:
  public void testDoNotChangePinCodeIfOldOneIsWrong() { 
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // exercise
    alarm.changePinCode("1234", "5678");
    alarm.engage("5678");
    // verify
    assertFalse("Alarm engaged", alarm.isEngaged());
  }

This is the working code:
  public void changePinCode(String oldPin, String newPin) {
    if (isPinValid(oldPin)) {
      _pin = newPin; 
    }
  }

Here comes another important aspect of TDD: since unit tests are tiny, it costs nothing to copy-paste-tweak an existing test to cover some edge or paranoid condition. In our case, we might want to test that the Security System allows us to change the PIN code more than once and that it really validates against the old PIN code rather than the factory one. Sounds paranoid? In this case it's probably not, but even if it were, copy-paste-tweaking 5 lines would cost less than 1 minute of "typing".
  public void testCanChangePinCodeMoreThanOnce() { 
    // setup
    SecurityAlarm alarm = new SecurityAlarm();
    // exercise
    alarm.changePinCode("0000", "1234");
    alarm.changePinCode("1234", "5678");
    alarm.engage("5678");
    // verify
    assertTrue("Alarm not engaged", alarm.isEngaged());
  }

This test passes, proving that we got it right the first time, but we now have another test supervising us.

This concludes the first part of this educational project, and we've already implemented important features of the overall system. It's worth highlighting that to implement these central features we didn't spend any time designing any component. This is because TDD is the most Agile methodology: it promotes the design emerging on its own, in a bottom-up fashion, rather than in the classical top-down fashion of modeling several components and their collaborators up-front and then narrowing down their details. However, TDD is not purely bottom-up; it balances itself with top-down, as we first identify the component to assign a given responsibility to (top-down) and then implement it by incrementally extending its requirements and introducing its collaborators (bottom-up).

Before leaving, it's worth briefly recalling the important aspects (or "features") of Test Driven Development we've seen so far:
  • It leads to the cleanest and most honest API ever.
  • It leads to the right and most effective balance of bottom-up and top-down design.
  • Unit tests make refactoring enjoyable rather than risky and scary.
  • Unit tests are quick to copy-paste-tweak, letting us test even paranoid cases.