The overall goal of software testing is to validate that the software we ship works as described. Integration testing is a key part of that, and I want to illustrate it with a very simple "test," continuing the muffler and engine analogy from my last post. If the engine was designed correctly, it has a specification document. That document lists various data about the engine, such as its horsepower, the type of fuel it needs, and so on.
The document will also list how much fuel the engine burns at a given speed and, based on that, how many gallons (or liters) of exhaust it generates per second at each engine speed. Let's say it generates 1 unit of exhaust per second at idle, 3 units at half speed, and 12 (wow! You are really flooring it!) at full speed.
It is our job to test that the muffler we use can process that much exhaust. The "test" here is simple: we look at the data sheet for the muffler and see if it can process up to 12 units of exhaust per second. If the answer is no, we don't need to set up an engine and muffler and measure the exhaust. We can simply say this will not work and that we need to make a change (to either the engine or the muffler), or select a different engine and muffler combination. Easy enough, but this basic test of reading the documentation is missed often enough that I wanted to call it out as a simple first step to take.
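To put that in software terms, here is a minimal sketch of what this kind of documentation check could look like before anyone stands up the full system. It is written in Python, and everything in it (the ENGINE_SPEC and MUFFLER_SPEC values, the specs_are_compatible function) is made up for illustration, not taken from any real data sheet.

# A minimal sketch of a spec-vs-spec compatibility check.
# The engine and muffler "data sheets" are hypothetical dictionaries
# standing in for the real documentation.

ENGINE_SPEC = {
    "exhaust_units_per_second": {"idle": 1, "half": 3, "full": 12},
}

MUFFLER_SPEC = {
    # Assumed capacity, deliberately below the engine's peak of 12,
    # so the check fails and we know not to bother rigging anything up.
    "max_exhaust_units_per_second": 10,
}


def specs_are_compatible(engine_spec, muffler_spec):
    """Return True if the muffler can handle the engine's worst case."""
    peak_exhaust = max(engine_spec["exhaust_units_per_second"].values())
    return muffler_spec["max_exhaust_units_per_second"] >= peak_exhaust


if __name__ == "__main__":
    if specs_are_compatible(ENGINE_SPEC, MUFFLER_SPEC):
        print("Specs line up; go run the real integration test.")
    else:
        print("Specs don't line up; change a part before testing further.")

The whole check is a comparison of two numbers pulled from documentation, which is exactly why skipping it is such a waste when the parts turn out to be incompatible.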
This also assumes that the documentation is accurate. That is not always the case in software, since the underlying code can change at any point and the documentation sometimes lags behind. In the mechanical engineering world, though, that happens far less often.
More on this muffler
and engine next time.
Questions, comments,
concerns and criticisms always welcome,
John