Monday, February 26, 2007
One reason why unit testing has bothered me is that it goes against an OO principle I was taught at school, namely that every aspect of the code should be kept as private as possible.
Roy Osherove has a recent entry on this subject. Roy is an enthusiastic unit-tester; I am still in doubt :)
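One way to soften this conflict, at least from .NET 2.0 onwards, is the InternalsVisibleTo attribute: members under test can stay internal instead of being widened to public, and only the named test assembly gets to see them. A minimal sketch, assuming the test assembly is called UnitTests:

// AssemblyInfo.cs of the production assembly.
using System.Runtime.CompilerServices;

// Grant the UnitTests assembly access to internal types and members,
// so nothing has to be made public just for the sake of the tests.
[assembly: InternalsVisibleTo("UnitTests")]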
UPDATED 24.7.2007: Bruce Eckel's thoughts on OOP.
Tuesday, February 20, 2007
TDD presupposition
I am pondering the validity of the following statement:
"The theory is that testable code is better designed ... If the class is easier to test, it is a better design. The test first paradigm just forces you to use good design. It makes a good design the path of least resistance." -David Hogue
I know that this is taken as a given in unit-test camps, but is it?
UPDATED: 1.1.2008
I found out that I am not alone in not being convinced about designing for testability. The creator of TypeMock feels that designing for testability does not comply with YAGNI.
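To make the tension concrete, here is the kind of indirection that designing for testability typically introduces, and that YAGNI would defer until a second implementation is genuinely needed. All names here are illustrative:

// The concrete dependency.
public class Database
{
    public string Load(string key)
    {
        return "value from the real database";
    }
}

// Without testability in mind, the service simply news up its dependency:
public class ReportService
{
    private Database db = new Database();

    public string Title() { return db.Load("title"); }
}

// Designing for testability adds an interface (which Database would then
// implement) plus an injection point whose only current caller is the test
// suite, which is exactly the speculative generality YAGNI warns about:
public interface IDatabase
{
    string Load(string key);
}

public class TestableReportService
{
    private IDatabase db;

    public TestableReportService(IDatabase db)
    {
        this.db = db;
    }

    public string Title() { return db.Load("title"); }
}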
Monday, February 19, 2007
Seeing the log4net output when testing under TestDriven.NET
[Update 22.02.] It is also possible to use the log4net configuration for the service or executable that will be calling the dll under test. To do this, three things have to be done:
- Link the App.config file of the service or executable to the UnitTests project.
- Include an
ILog sLogger = LogManager.GetLogger(typeof(IsmClientWatchdog));
field in the test class, even though sLogger is never used therein.
- Add
[assembly: log4net.Config.XmlConfigurator(Watch=true)]
to the UnitTests project's AssemblyInfo.cs. With the attribute in place, no explicit call to XmlConfigurator.Configure() is needed.
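Put together, the test project then looks roughly like this; the fixture name WatchdogTests is my stand-in for the original IsmClientWatchdog tests:

// UnitTests/AssemblyInfo.cs
[assembly: log4net.Config.XmlConfigurator(Watch = true)]

// UnitTests/WatchdogTests.cs
using log4net;
using NUnit.Framework;

[TestFixture]
public class WatchdogTests
{
    // Never used directly; requesting a logger makes log4net process the
    // XmlConfigurator attribute and configure itself from the linked App.config.
    private static readonly ILog sLogger = LogManager.GetLogger(typeof(WatchdogTests));

    [Test]
    public void WatchdogLogsItsProgress()
    {
        // Log output from the dll under test now appears in the
        // TestDriven.NET output window.
    }
}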
[Original post]
I found a nice way to be able to see the log4net output when unit testing under TestDriven.NET:
log4net.Appender.ConsoleAppender app;

[TestFixtureSetUp]
public void Init()
{
    // Route all log4net output to the console, which TestDriven.NET
    // captures and shows in its output window.
    app = new log4net.Appender.ConsoleAppender();
    app.Layout = new log4net.Layout.PatternLayout("%d %C{1} [%-5p] : %m%n");
    app.Threshold = log4net.Core.Level.All;
    log4net.Config.BasicConfigurator.Configure(app);
}

[TestFixtureTearDown]
public void Dispose()
{
    // Silence the appender again once the fixture has run.
    app.Threshold = log4net.Core.Level.Off;
}
Saturday, February 17, 2007
Unit testing, lessons learned
I was writing a small application and thought I would use the opportunity to see what additional effort it would take to make it unit-testable.
- Two projects had to be added to the solution: a class library into which the code to be tested was factored out, and a class library containing the unit tests.
- The configuration could no longer be read directly from the app.config, because it had to be changeable programmatically, so a ConfigurationManager had to be introduced.
- An additional interface had to be introduced for the ConfigurationManager so that it could be stubbed out.
- A second constructor had to be added which took a ConfigurationManager as a parameter.
- The run method contained an infinite while loop and therefore needed to be split into two methods (see the sketch after this list).
- Finally, all the unit-tests, of course, had to be written :)
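A minimal sketch of the resulting shape, with illustrative names (IConfigurationManager, AppConfigManager, and Watchdog are my stand-ins, not the actual code):

// The interface that lets the configuration be stubbed out in tests.
public interface IConfigurationManager
{
    int PollIntervalMs { get; }
}

// The default implementation still reads the app.config
// (ConfigurationSettings is the .NET 1.1-era API).
public class AppConfigManager : IConfigurationManager
{
    public int PollIntervalMs
    {
        get { return int.Parse(System.Configuration.ConfigurationSettings.AppSettings["pollIntervalMs"]); }
    }
}

public class Watchdog
{
    private IConfigurationManager config;

    // The original constructor keeps working for production code.
    public Watchdog() : this(new AppConfigManager()) { }

    // The second constructor lets the tests inject a stub.
    public Watchdog(IConfigurationManager config)
    {
        this.config = config;
    }

    // The infinite loop stays out of the unit tests...
    public void Run()
    {
        while (true)
        {
            RunOnce();
            System.Threading.Thread.Sleep(config.PollIntervalMs);
        }
    }

    // ...while a single iteration is now testable in isolation.
    public void RunOnce()
    {
        // the actual work of one iteration goes here
    }
}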
Thursday, February 15, 2007
Visual Studio 2003 templates
The standard template for a new class in VS 2003 has always slightly irritated me :) I finally googled the issue and found two nice articles by Michael Groeger on the subject. The first one is about changing the "Add Class..." template. The second one describes how to create a new template for NUnit test classes.
To the default class template I added copyright information and $Date$, $Author$, and $Rev$ keywords for Subversion, and removed the irritating "TODO: Add constructor logic here" comment :)
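The header block I added looks roughly like this (the copyright line is a placeholder). Note that for the keywords to actually expand, the svn:keywords property must be set on each file, e.g. with svn propset svn:keywords "Date Author Rev" MyClass.cs:

// Copyright (c) 2007 <company>. All rights reserved.
//
// $Date$
// $Author$
// $Rev$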
More on unit tests
Software development gurus, in a mistaken attempt to simplify things, make up simple rules for us simple people to follow. I think a better approach is giving us the arguments to decide for ourselves, on a case-by-case basis, how, or if, to use the tool or method in question. Should unit tests be used always and unconditionally?
I have been giving unit tests some thought. Unit tests are a tool, and I am not religious about when to apply it. Sometimes I might mock out some external dependencies, but at other times I would like to test that dependency as well. It is often a compromise between complicating the code and mocking out external dependencies. E.g. should I introduce a configuration manager so that I can make the code independent of the database, or should I just include retrieving the configuration from the database in the test? I don't believe in using unit tests all the time; sometimes it is just too difficult to set up the test. They should be used when the tests need to be repeatable for regression purposes. Sometimes you just know that you are not going to change that piece of code, and then it is sufficient, simpler, and faster to test it manually until it works as intended :)
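To make that trade-off concrete, here are both options side by side, reusing the hypothetical Watchdog and IConfigurationManager names from the sketch under "Unit testing, lessons learned":

using NUnit.Framework;

// Option 1: stub out the configuration. More indirection in the production
// code, but the test runs without any external dependency.
public class StubConfiguration : IConfigurationManager
{
    public int PollIntervalMs { get { return 100; } }
}

[TestFixture]
public class WatchdogConfigurationTests
{
    [Test]
    public void RunOnceWorksWithAStubbedConfiguration()
    {
        Watchdog watchdog = new Watchdog(new StubConfiguration());
        watchdog.RunOnce();
        // assert on the observable outcome here
    }

    // Option 2: exercise the real configuration path as well. Simpler
    // production code, but the test now depends on the configuration
    // source (e.g. the database) being reachable and populated.
    [Test]
    public void RunOnceWorksWithTheRealConfiguration()
    {
        Watchdog watchdog = new Watchdog();
        watchdog.RunOnce();
    }
}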
Another issue I have with unit tests is that they can lull one into thinking it is OK to make a change and accept it as good if no red lights appear in the test runner. The correct procedure, however, is to check whether there is actually a unit test that covers the code you just changed, because initially it might have been deemed not feasible to create unit tests for that particular scenario.
I might be stating the obvious (it is often needed), but unit tests are a tool that should not be applied automatically or relied on blindly :)