SQA testing plan
Author: Miro Kresonja, SQA manager
Date: 12/01/98
Version: 1.0
Testing is based on the principal integration parts:
1. RMI
   1a. Test the maximum speed of transmission of volumes of data during the
       simulation by setting the delay to 0 (see the timing sketch after this
       list).
   1b. Test the accuracy of RMI (this test may be unnecessary; no error in
       data transmission has been observed in any of the previous tests).
   1c. Connecting/disconnecting of clients/demon works under all
       circumstances, and newly connected entities enter the proper mode.
   1d. The port/machine choice mechanism works properly.
2. GUI
   2a. Test display of a minimum map (1 module / 1 dependency).
   2b. Test display of a maximum-sized map (10 modules / 10 dependencies each).
   2c. All modes display/enable the proper buttons.
   2d. The matrix view is consistent with the normal view.
   2e. (Demon GUI) Active clients are displayed properly.
3. Engine
   3a. The different propagation methods work properly.
   3b. The shutdown procedure is correct.
   3c. The abort procedure is correct.
4. Other
   4a. Exception handling is consistent throughout the simulation.
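
As a rough illustration of 1a, the following is a minimal sketch of a timing
harness for the RMI throughput test. The DataSink interface, its submit
method, and the rmi://localhost/demon URL are placeholders only; the real
demon exports its own remote interface, and the sketch assumes the simulation
delay has already been set to 0.

    import java.rmi.Naming;
    import java.rmi.Remote;
    import java.rmi.RemoteException;

    // Hypothetical remote interface; the project's real demon/client
    // interface and method names should be substituted here.
    interface DataSink extends Remote {
        void submit(byte[] payload) throws RemoteException;
    }

    public class ThroughputTest {
        public static void main(String[] args) throws Exception {
            String url = args.length > 0 ? args[0] : "rmi://localhost/demon";
            int rounds = 100;                      // transmissions to time
            byte[] payload = new byte[64 * 1024];  // 64 KB of simulated map data

            DataSink sink = (DataSink) Naming.lookup(url);

            long start = System.currentTimeMillis();
            for (int i = 0; i < rounds; i++) {
                sink.submit(payload);              // demon-side delay is 0
            }
            long elapsed = System.currentTimeMillis() - start;

            double kbPerSec = (rounds * payload.length / 1024.0) / (elapsed / 1000.0);
            System.out.println(rounds + " calls in " + elapsed
                               + " ms, ~" + kbPerSec + " KB/s");
        }
    }

Repeating the run with different payload sizes would give a rough upper bound
on transmission speed for 1a.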
Proposed testing procedures that will confirm or deny the critical
points/behaviours of the system listed above:
Note: For these tests we will need three maps of different sizes (small,
medium, and large), two special maps (tree and wheel), and a script that runs
the components on different ports.
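
The script mentioned in the note could be as simple as the launcher sketched
below. The Demon and Client class names, their --port and --demon arguments,
and the chosen port numbers are assumptions; the real components would be
started with whatever arguments they actually accept.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Hypothetical launcher standing in for the port script; component
    // class names and command-line arguments are assumptions only.
    public class LaunchAll {
        public static void main(String[] args) throws Exception {
            int demonPort = 2001;
            int[] clientPorts = { 2002, 2003, 2004 };

            // Start the demon first on its own port.
            start("Demon", "--port", Integer.toString(demonPort));

            // Start each client on a distinct port, pointing it at the demon.
            for (int port : clientPorts) {
                start("Client", "--port", Integer.toString(port),
                      "--demon", "localhost:" + demonPort);
            }
        }

        private static void start(String mainClass, String... extra) throws Exception {
            List<String> cmd = new ArrayList<>();
            cmd.add("java");
            cmd.add(mainClass);
            cmd.addAll(Arrays.asList(extra));
            new ProcessBuilder(cmd).inheritIO().start();
        }
    }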
- Pick the small map, fire up all the clients, then the demon. Pick automatic
  mode, 10 tries.
  - Confirms: 1b, 1c, 2a, 2d, 2e, 3a
- Pick the large map, connect a client or two, then the demon, then another
  client or two. Go to manual mode and let the clients submit values. While
  in manual mode, bring up a new client and let it submit values. Also, kill
  an existing client, then bring it up again (registering for the same
  module). While the simulation is running, abort the run, then shut down the
  simulation.
  - Confirms: 1b, 1c, 2b, 2c, 2e, 3b, 3c
- Pick the medium map, using a different port/machine. Connect the demon
  first, then all desired clients. Run in automatic mode and try
  disconnecting the demon in the middle of a simulation, then reconnecting
  it. Run automatic mode again with 0 delay. Make sure the clients display
  the proper maps and do so on time (in other words, that the simulation
  stays synchronized across the different clients). After the automatic run,
  shut down the simulation, then analyze the output maps for inconsistencies.
  - Confirms: 1a, 1c, 1d, 2c, 2d, 2e, 3a
- Special test 1 - Spoked wheel test
  The dependency graph will look like a wheel, with all outside nodes
  pointing to a single middle node. The simulation will be run on it and the
  outputs examined for any inconsistencies. This will verify the "all"
  propagation method (a sketch of both special maps follows these tests).
  - Confirms: 3a
- Special test 2 - Pure tree test
  The dependency graph will be a pure tree, with each node having a parent
  and children (with the exception of the root and the leaves). All the
  dependencies will point towards the leaves. As in the previous test, the
  simulation will be run and the results examined for possible
  inconsistencies (such as a parent node being affected by its children).
  This partially verifies the "any" propagation method.
  - Confirms: 3a
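
For the two special tests, the sketch below builds the wheel and the tree
dependency graphs as plain adjacency lists. The in-memory representation and
node names are assumptions; the graphs would still need to be written out in
whatever map format the demon actually loads.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Builds the two special dependency graphs as adjacency lists
    // (node -> list of nodes its dependency edges point to).
    public class SpecialMaps {

        // Spoked wheel: every outside node has a single edge pointing to the hub.
        static Map<String, List<String>> wheel(int spokes) {
            Map<String, List<String>> deps = new HashMap<>();
            deps.put("hub", new ArrayList<>());
            for (int i = 1; i <= spokes; i++) {
                List<String> targets = new ArrayList<>();
                targets.add("hub");
                deps.put("node" + i, targets);
            }
            return deps;
        }

        // Pure binary tree of the given depth; every edge points away from the
        // root, toward the leaves, so no child can affect its parent.
        static Map<String, List<String>> tree(int depth) {
            Map<String, List<String>> deps = new HashMap<>();
            addSubtree(deps, "n", depth);
            return deps;
        }

        private static void addSubtree(Map<String, List<String>> deps,
                                       String node, int depth) {
            List<String> children = new ArrayList<>();
            deps.put(node, children);
            if (depth == 0) {
                return;                      // leaf: no outgoing dependencies
            }
            for (String child : new String[] { node + "L", node + "R" }) {
                children.add(child);         // parent -> child edge only
                addSubtree(deps, child, depth - 1);
            }
        }

        public static void main(String[] args) {
            System.out.println("wheel(10): " + wheel(10));
            System.out.println("tree(3):   " + tree(3));
        }
    }

For the pure tree test, the output check then only needs to confirm that no
node ever changes as a result of one of its descendants.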
Revision history:
Version 1.0 (created) - 12/01/98