When we last left our muffler and exhaust test, we had determined that the output from the engine was less than the muffler could handle. This was a "specification" test done by inspecting the documentation, and, had it failed, it would have alerted us to a poor design and prevented a world of problems later.
Of course, that is not enough. Let's say we decide to use the muffler from our thought experiment, and the engine as well. We still need to design a system to validate that the output of the engine does not exceed the capacity of the muffler. This may not seem critical - after all, we just validated that the two pieces work together - but time goes by and specifications can change.
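To make that concrete, here is a minimal sketch of what such a check might look like in code. The part names, the specification values, and the units are all invented for illustration; the idea is simply an automated test that reads the current data sheet values and fails the moment a specification change makes the engine and muffler incompatible.

```python
# A minimal, hypothetical sketch: compare the engine's maximum exhaust
# output against the muffler's rated capacity, using whatever values
# the current data sheets hold. All names and numbers are invented.

ENGINE_DATASHEET = {"part": "engine-v1", "max_exhaust_cfm": 180}
MUFFLER_DATASHEET = {"part": "muffler-v1", "capacity_cfm": 220}

def check_exhaust_capacity(engine: dict, muffler: dict) -> None:
    """Fail if the engine at full output exceeds the muffler's capacity."""
    if engine["max_exhaust_cfm"] > muffler["capacity_cfm"]:
        raise AssertionError(
            f"{engine['part']} produces {engine['max_exhaust_cfm']} CFM, "
            f"but {muffler['part']} is only rated for "
            f"{muffler['capacity_cfm']} CFM"
        )

if __name__ == "__main__":
    check_exhaust_capacity(ENGINE_DATASHEET, MUFFLER_DATASHEET)
    print("Specification check passed.")
```

Run as part of a nightly build, a check like this would flag an incompatible part swap the moment the new data sheet values were checked in, rather than months later when the hardware misbehaves.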
Let's go forward six months into our design. We have a prototype car up and running, and it has a problem on hills: it needs more power to climb them quickly enough to make the driver happy. One solution would be a more powerful engine. So we take out the original engine and replace it with a larger model. The larger engine only needs to run at half speed to get up the hill, and it does so with no problems. The muffler works fine as well, so we may be fooled into thinking we have no risk.
But we might: we have not tested the engine at full speed. Now, how we would actually run that test is a little tangential to my point. We might test in a lab with all kinds of sensors connected to the car, or we might have a check on the data sheets set up and running on a computer somewhere, to give two examples. The point I want to make is that we need the integration test to validate changes made during the development cycle.
And clearly, the earlier we can detect a potential error, the better we will be able to respond. More on that, and on the two example integration tests I just mentioned, next time.
Questions, comments, concerns and criticisms always welcome,
John