
.NET 1.1+

NUnit Testing Framework

The second part of the Automated Unit Testing tutorial introduces NUnit. NUnit is a popular open source unit testing framework for the .NET framework that simplifies the process of creating, organising and executing automated unit tests.

Automated Test Code

In the first article in this tutorial I described the use of automated unit testing and some of the benefits that it can bring. In this second article we will see some real examples of automated tests written in C# and executed with the aid of a unit testing framework named NUnit.

Before using NUnit, let's consider the manner in which we might have constructed tests before unit testing frameworks were available. So that the sample code does not become unwieldy we will look at a very simple example. The class below is used to calculate the commission paid to a salesperson, based upon the value of an order. The business rules state that sales under £1,000 provide 2.5% commission, orders of £1,000 or more but below £10,000 attract a 5% commission and sales of £10,000 or more generate 7.5% commission. This class contains several errors.

public class CommissionCalculator
{
    public double CalculateCommission(double saleValue)
    {
        double commission;

        if (saleValue < 1000)
        {
            commission = saleValue * 0.025;
        }
        else if (saleValue > 10000)
        {
            commission = saleValue * 0.075;
        }
        else
        {
            commission = saleValue * 0.005;
        }

        return Math.Round(commission, 2, MidpointRounding.AwayFromZero);
    }
}

With no testing framework we may decide to create a console application that references the assembly containing the above class and contains our tests. The code for the tests may be added to the Main method of the console application. Each test could be a separate method that performs an action and checks that the results are as expected. For example, the test code may be as follows:

static CommissionCalculator _calculator = new CommissionCalculator();

static void Main(string[] args)
{
    Test999_99Sale();
    Test1000_00Sale();
    Test9999_99Sale();
    Test10000Sale();
}

private static void Test999_99Sale()
{
    if (_calculator.CalculateCommission(999.99) != 25)
        Console.WriteLine("Fail - Commission on £999.99 should be £25.00");
}

private static void Test1000_00Sale()
{
    if (_calculator.CalculateCommission(1000) != 50)
        Console.WriteLine("Fail - Commission on £1000.00 should be £50.00");
}

private static void Test9999_99Sale()
{
    if (_calculator.CalculateCommission(9999.99) != 500)
        Console.WriteLine("Fail - Commission on £9999.99 should be £500.00");
}

private static void Test10000Sale()
{
    if (_calculator.CalculateCommission(10000) != 750)
        Console.WriteLine("Fail - Commission on £10000.00 should be £750.00");
}

In the above code we have four tests, each calculating the commission for a different value and checking that the result is correct. The values selected for the calculations are at the boundaries of changes in the commission percentages. We have checked all percentages to ensure that every path through the CalculateCommission method is exercised.

Running the test code produces the following output:

Fail - Commission on £1000.00 should be £50.00
Fail - Commission on £9999.99 should be £500.00
Fail - Commission on £10000.00 should be £750.00

These three failures point to bugs in the code. We could now attempt to fix those bugs and re-run the tests until they all pass and we are sure that the errors have been fixed. However, the error reports are not particularly helpful. They tell us what the expected result was and that the test failed but they do not tell us what the actual calculated value was. They also do not help to locate the source of the problem or show the errors in any particular grouping or context. In a large application with tens of thousands of tests this additional information would be valuable. We could enhance the test code for each test to provide these details. However, this extra functionality would be better provided by a testing framework.
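To illustrate the sort of enhancement we might write by hand, the helper below is a hypothetical sketch, not part of the original console program. It replaces the repeated if/WriteLine pattern and reports both the expected and the actual commission when a check fails.

```csharp
// Hypothetical helper for the console test program above. It calls the
// calculator, compares the result with the expected value and, on failure,
// reports the actual value as well as the expected one.
private static void AssertCommission(double saleValue, double expected)
{
    double actual = _calculator.CalculateCommission(saleValue);
    if (actual != expected)
    {
        Console.WriteLine(
            "Fail - Commission on £{0:0.00} should be £{1:0.00} but was £{2:0.00}",
            saleValue, expected, actual);
    }
}
```

Each test then reduces to a single call, such as AssertCommission(999.99, 25). Even so, we would still be writing and maintaining this plumbing ourselves for every project, which is exactly the work a testing framework takes on.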

Testing Frameworks

Testing frameworks make automated unit testing simpler and less time-consuming, mainly by hiding the implementation details of the testing and allowing you to concentrate on writing test logic. Frameworks that target .NET development often employ attributes to decorate test methods and classes. The attributes specify which items should be considered tests, which are used to initialise a sequence of tests or clean up afterwards, and how tests should be categorised. Frameworks usually include one or more runners. These are the programs that execute the tests and report the results.
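As a taste of what this looks like in practice, the sketch below shows how the first of our commission tests might appear when written with NUnit's standard attributes. The fixture and method names are illustrative only; the [TestFixture], [SetUp] and [Test] attributes and the Assert class are part of NUnit itself.

```csharp
using NUnit.Framework;

// Marks the class as containing tests for the runner to discover
[TestFixture]
public class CommissionCalculatorTests
{
    private CommissionCalculator _calculator;

    // Runs before each test, so every test starts with a fresh calculator
    [SetUp]
    public void CreateCalculator()
    {
        _calculator = new CommissionCalculator();
    }

    // Marks the method as an individual test
    [Test]
    public void CommissionOn999_99SaleIs25()
    {
        // On failure NUnit reports both the expected and the actual value
        Assert.AreEqual(25, _calculator.CalculateCommission(999.99));
    }
}
```

Note that no Main method and no manual reporting code are required; a runner finds the decorated methods, executes them and displays the results.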

There are many testing frameworks available to the .NET developer. Some are commercial products for which you must purchase a licence. Others are provided free of charge. Some include graphical tools for executing tests; others provide command-line test runners or integration with Visual Studio. In this tutorial I will present examples using the NUnit testing framework. I have selected NUnit because it can be downloaded quickly, used without charge and has a graphical tool for executing tests. It can be integrated into Visual Studio with additional tools such as TestDriven.NET but can also be used standalone, so is compatible with the Visual Studio Express editions.

Installing NUnit

The installation process for NUnit will differ depending upon the version that you select and the installation package that you obtain. At the time of writing the latest version of NUnit is 2.5.9 and is available in three formats. The easiest to install is the Windows MSI version, which includes a setup wizard to guide you through the process and an uninstaller should you wish to remove the software later. The binary version is downloaded as a zip file that contains the same files as the installer but without the setup routine. As NUnit is open source, you can also download a version that includes all of the source code for the framework and the test runners. This allows you to see how NUnit works and to contribute to the project.

13 March 2011