Monday, February 12, 2018

Tangential (get it?) post: I am digging into this precision-based article on Code Project

I've been a loyal follower of the Code Project for quite some time and even wrote a few articles for them a while back.

It's no secret that my team here at Tableau takes precision and accuracy very seriously, and I just discovered this article on the subject: https://www.codeproject.com/Articles/25294/Avoiding-Overflow-Underflow-and-Loss-of-Precision

Overflow and underflow are pretty well covered, but there are aspects of precision that are very hard to control.  I've been reading through the article and thought I would pass it along.  I'll finish up integration testing next time - this was simply a timely find that I wanted to share.
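To make that concrete with one classic example from this space (my own illustration, not lifted from the article): the naive way to compute the length of a vector can overflow even though the answer itself fits in a double just fine.

    import math

    x, y = 3e200, 4e200

    # Naive formula: x*x and y*y each overflow to inf, so the result
    # comes back as inf even though the true answer (5e200) is well
    # within the range of a double.
    print(math.sqrt(x * x + y * y))   # inf

    # math.hypot rescales internally to sidestep the overflow.
    print(math.hypot(x, y))           # 5e+200

The fix is the standard trick of factoring out the larger magnitude before squaring, which is exactly the sort of technique the article walks through.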

Questions, comments, concerns and criticisms always welcome,
John

Wednesday, February 7, 2018

Integration Testing, part 6: Updating requirements based on integration testing


So now we have 2 types of integration tests giving us information about our muffler + engine system.  We check the documentation (data sheets) of each component to ensure the parts will work together, and we also test the physical system itself to validate the real-world behavior.  This way we can ensure the car will actually work after we build it.

There is one last aspect of this I want to cover.  There is a possibility that we decide to change engines or mufflers at the midpoint of the design process.  One reason for this could be the result of our testing.  If we notice that, at maximum engine RPM, the muffler is at 100% capacity, we may decide we need a larger muffler in case the engine ever over-revs.

In this case, we need to back up a few steps and update our documentation.  If we move from Muffler Model 100 to Model 200, we need to update the specifications we use in our automated, computer-run test.  We have to take into account the different capacity of the new muffler - we clearly want the test to use valid numbers for the muffler we are actually committed to using.
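One way to keep that update cheap is to make the automated check data-driven, so that swapping Model 100 for Model 200 is a one-line change to the data rather than a rewrite of the test.  A rough sketch in Python (the capacity numbers are invented for illustration):

    # Data-sheet capacity, in units of exhaust per second, for each
    # muffler we might commit to.  Numbers invented for illustration.
    MUFFLER_SPECS = {
        "Model 100": 13,
        "Model 200": 26,
    }

    ENGINE_PEAK_EXHAUST = 24  # worst case from the engine data sheet

    # The one line that changes when the design switches mufflers.
    COMMITTED_MUFFLER = "Model 200"

    capacity = MUFFLER_SPECS[COMMITTED_MUFFLER]
    assert ENGINE_PEAK_EXHAUST <= capacity, (
        f"{COMMITTED_MUFFLER} cannot handle the engine's peak output"
    )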

We will need to update our physical test as well, and this may be a little easier to understand.  If the muffler is a different size, we may need to move our sensors, the mounting bracket for the sensors, and so on.  We may also need to change our tolerance levels.  If we had set a warning that fired when the engine exceeded the capacity of the old muffler, for instance, we will need to change that setting to reflect the capacity of the new muffler.

At this point we have updated specifications and updated tests, and we can quickly validate how well the muffler + engine work together.  Had we missed either step in updating the changed requirements, we would run a serious risk of not being able to tell that the engine + muffler combination would not work.

I'll cover the costs of finding these errors when I sum up next week.

Questions, comments, concerns and criticisms always welcome,
John

Monday, January 29, 2018

Integration Testing, part 5: More than one test may be needed


In my last post I mentioned a test that we can run on the specifications for the engine and muffler.  This is notionally a check we script for a computer to run, validating that the components will work with each other.

That is probably not enough, though.  Just checking the data sheets is no guarantee the actual muffler and engine will work together.  Imagine a scenario in which the engine's redline (its maximum rated speed) is listed on the data sheet.  Then imagine that, for whatever reason, the engine we are using exceeds that speed.  In that case, the engine will start to output more exhaust than the ratings sheet indicates, and our muffler may not be able to handle that much extra, unexpected exhaust.

One way this could happen is an emergency.  Suppose the driver needs to get to a hospital and doesn't care about the damage the car may take.  In this case, we need to verify the behavior of the engine + muffler even when the system goes out of specification.

The testing here is mechanical in nature.  We create a fake engine of some sort that outputs more than we expect the real engine to produce, and test it with the muffler.  At that point, we document the behavior of the muffler.  Some reasonable expectations (with a test sketch after the list) are:
  1. The muffler fails immediately and does not process the exhaust at all.  Instead, the exhaust simply passes through with no catalytic conversion and no sound reduction.
  2. The muffler fails more catastrophically.  It could break, overheat or, worse, even explode.
  3. The muffler designers built a safety margin into their specification and did not document it.  In this case, the muffler may keep working, perhaps only for a short duration.
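If parts of that rig were driven from software, the harness might look something like the sketch below.  Everything here is hypothetical - read_muffler_sensors stands in for whatever lab instrumentation actually exists:

    RATED_MAX_EXHAUST = 12  # units/sec, the engine's documented peak

    def read_muffler_sensors(exhaust_rate):
        # Hypothetical stand-in for real lab instrumentation; imagine
        # temperature, back-pressure and sound-level readings here.
        return {"exhaust_rate": exhaust_rate, "reading": "not yet measured"}

    # Exploratory run: step past the rated maximum and simply record
    # what the muffler does.  We are documenting behavior, not
    # asserting pass/fail, because we do not yet know what to expect.
    for rate in range(RATED_MAX_EXHAUST, 2 * RATED_MAX_EXHAUST + 1, 2):
        print(rate, "->", read_muffler_sensors(rate))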

We don't know what we should do if the muffler + engine do not work together in this scenario.  At this point, the testing organization is in exploratory testing mode and simply needs to determine the behavior.  Once we have a clear understanding of what is likely to occur, we can apply that knowledge to making a decision.

I'll cover that next.

Questions, comments, concerns and criticisms always welcome,
John

Thursday, January 18, 2018

Integration Testing, part 4: The type of test to create


Now that I have generally covered the task at hand - ensuring that our engine and muffler will work together - I can finally start describing how to test that they work together.

When I last discussed the testing challenge of validating that the engine output can always be handled by the muffler, I mentioned 2 different test methods we could employ.

The first was an automated test, running on a computer, which would compare the data sheet of the engine to the data sheet of the muffler.  Specifically, it would look at the volume and pressure output of the engine and validate that all of those numbers were less than the maximum capacity of the muffler.  If we assume that we can examine the data sheets and that they are up to date, then this is a terrific solution.  The moment we alter a document, we find out instantly whether the numbers still work for us.  If we were sitting around a design room, we could quickly look at many different engine and muffler combinations and rule out sets that obviously would not work.  No need to weld the engine into place or anything heavy like that.  This type of check can prevent errors down the road (so to speak), is relatively cheap to implement and gives very speedy feedback.
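To sketch what that might look like in practice - here the "data sheets" are hand-typed Python dictionaries with invented numbers, though a real version would parse whatever format the documents actually live in:

    # Worst-case outputs from each engine data sheet (invented numbers).
    engines = {
        "Engine A": {"volume": 12, "pressure": 40},
        "Engine B": {"volume": 18, "pressure": 55},
    }

    # Maximum ratings from each muffler data sheet (also invented).
    mufflers = {
        "Muffler 100": {"max_volume": 15, "max_pressure": 50},
        "Muffler 200": {"max_volume": 20, "max_pressure": 60},
    }

    # Compare every combination and rule out the sets that obviously
    # cannot work together.
    for e_name, e in engines.items():
        for m_name, m in mufflers.items():
            ok = (e["volume"] <= m["max_volume"] and
                  e["pressure"] <= m["max_pressure"])
            print(f"{e_name} + {m_name}: {'OK' if ok else 'ruled out'}")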

This is a solution I would push to implement to help with this problem.

Since I am a tester by nature, though, this is not the only check I would want to use.  We've all heard the old saying about not putting all our eggs in one basket, and I would want a backup system in place to help me understand the operation of the engine and muffler system.

More on that next time!

Questions, comments, concerns and criticisms always welcome,
John

Friday, January 12, 2018

Integration Testing, part 3


When we last left our engine and muffler test, we had determined that the amount of output from the engine was less than the muffler could handle.  This was a "specification" test done by inspecting the documentation; had it failed, it would have alerted us to our poor design and prevented a world of problems later.

Of course, that is not enough. Let's say we decide to use the muffler from our thought experiment and the engine as well.  We still need to design a system to validate that the output of the engine does not exceed what the muffler can handle.  This may not seem critical - after all, we just validated that the two pieces work together - but time goes by and specifications can change.

Let's go forward six months into our design.  We have a prototype car up and running and it has a problem on hills.  It needs more power to get up hills quickly enough to make the driver happy.  One solution to this problem would be a more powerful engine.

So we take out the original engine and replace it with a larger model.  The larger engine only needs to run at half speed to get up the hill, and it does so with no problems.  The muffler works fine as well, so we may be fooled into thinking we have no risk.

But we do have one: we have not tested the engine at full speed.  Now, the actual test here is a little tangential to my point: we could test this in a lab with all kinds of sensors connected to the car, or we could have a check on the data sheets set up and running on a computer somewhere, to give two examples.  The point I want to make is that we need the integration test to validate changes made during the development cycle.
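For the data-sheet flavor of that check, the important detail is covering the whole operating range of the new engine, not just the half speed the prototype happened to use.  A sketch using pytest, with all of the numbers invented:

    import pytest

    MUFFLER_CAPACITY = 13  # units/sec from the muffler data sheet (invented)

    # The new engine's output across its whole range, per its data sheet.
    @pytest.mark.parametrize("speed, exhaust", [
        ("idle", 2),
        ("half speed", 6),   # all the hill climb ever exercised
        ("full speed", 24),  # never reached on the prototype - this fails
    ])
    def test_muffler_handles_new_engine(speed, exhaust):
        assert exhaust <= MUFFLER_CAPACITY, (
            f"at {speed}, {exhaust} units/sec exceeds the muffler's capacity"
        )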

And clearly, the earlier we can detect a potential error, the better we will be able to respond.  More on that, and on the two example integration tests I just mentioned, next time.

Questions, comments, concerns and criticisms always welcome,
John

Monday, December 11, 2017

Integration Testing, Part 2


The overall goal of software testing is to validate that the software we ship works as described.  Integration testing is a key part of that, and I want to continue with a very simple "test."

I want to continue with the muffler and engine analogy from my last post.  If the engine is designed correctly, it has a specification document.  That document would list various data and specifications about the engine, such as horsepower, the type of fuel needed and so on.  It will also list how much fuel the engine burns at each speed it can run and, based on that, how many gallons/liters of exhaust it generates per second at each engine speed.

Let's say it generates 1 unit of exhaust per second at idle, 3 units at half speed and 12 (wow!  You are really flooring it!) at full speed.

It is our job to test that the muffler we use can process that much exhaust.  The "test" here is simple.  We look at the data sheet for the muffler and see if it can process up to 12 units of exhaust per second.  If the answer is no, we don't need to set up an engine and muffler and measure the exhaust.  We can simply say this will not work and we need to make a change (to either the engine or the muffler), or select a different engine and muffler combination.  Easy enough, but this simple test of reading the documentation is skipped often enough that I wanted to call it out as a simple first step to take.
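If we did want to script even this trivial check, it would only take a few lines.  A minimal Python sketch using the made-up numbers above (the muffler capacity of 10 is also made up, chosen so the check fails):

    # Exhaust generated by the engine, in units per second,
    # straight from its data sheet.
    engine_exhaust = {"idle": 1, "half speed": 3, "full speed": 12}

    # Maximum exhaust the muffler can process, from its data sheet.
    MUFFLER_CAPACITY = 10  # invented for illustration

    worst_case = max(engine_exhaust.values())
    if worst_case > MUFFLER_CAPACITY:
        print(f"FAIL: engine peaks at {worst_case} units/sec but the "
              f"muffler tops out at {MUFFLER_CAPACITY}")
    else:
        print("PASS: the muffler can keep up at every engine speed")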

This also supposes that the documentation is accurate.  That is not always the case in software, since the underlying code can change at any point and documentation updates sometimes lag behind.  In the mechanical engineering world, though, that happens less often.

More on this muffler and engine next time.

Questions, comments, concerns and criticisms always welcome,
John

Tuesday, November 28, 2017

Integration Testing, Part 1


Integration tests are the tests we use to validate that 2 or more software modules work together. 

Let me give an example by analogy.  Suppose you have a car engine and you know it works (for whatever definition of "work" you want to use).  I have a muffler, and it also works, again using whatever definition of "works" you like.

Now suppose you are asked "Will the engine you make work with my muffler?"

Each component works, but how can we tell if they will work together?

Integration testing is the key here.  We know that each component works by itself, but there are no guarantees that they will work together.

For instance, one test case will be that the engine's exhaust outlet is the same size as the muffler pipe (to speak broadly).  If the engine has a 5 inch exhaust and the muffler is only 3 inches wide, we have a mismatch and they won't work together.

A second case, assuming the first passes, is connecting the 2 components.  Even if the size of the exhaust is correct, if you use metric bolts and I don't, we are in a failing state again.
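Jumping ahead a little to the software side of the analogy, those first two cases might look like this as automated checks.  All field names and numbers are invented, and the values are chosen to reproduce the mismatches described above:

    # Interface facts from each component's (hypothetical) data sheet.
    engine = {"exhaust_diameter_in": 5, "bolt_standard": "metric"}
    muffler = {"inlet_diameter_in": 3, "bolt_standard": "SAE"}

    def test_pipe_diameters_match():
        assert engine["exhaust_diameter_in"] == muffler["inlet_diameter_in"]

    def test_bolt_standards_match():
        assert engine["bolt_standard"] == muffler["bolt_standard"]

Both checks fail with these values, which is the point: each mismatch is caught before anyone touches a wrench.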

In fact, there will be many more test cases for this.  Materials construction (some metals don't interact well with others), weight considerations, stress cases (handling backfiring, for instance) and many, many more. 

The same mentality applies to software testing and I will go deeper into that next time.

Until then, questions, comments, concerns and criticisms always welcome,
John