SQA testing plan

Author: Miro Kresonja, SQA Manager
Date: 12/01/98
Version: 1.0


Testing is organized around the principal integration parts:

  1. RMI
    1. Test the maximum data-transmission rate during the simulation by setting the delay to 0.
    2. Test the accuracy of RMI transmission. (This test is low priority: no data-transmission error has been observed in any of the previous tests.)
    3. Connecting/disconnecting of clients and the demon works under all circumstances, and newly connected entities enter the proper mode.
    4. The port/machine choice mechanism works properly.
  2. GUI
    1. Test display of a minimum-sized map (1 module, 1 dependency).
    2. Test display of a maximum-sized map (10 modules, 10 dependencies each).
    3. All the modes display/enable the proper buttons.
    4. The matrix view is consistent with the normal view.
    5. (Demon GUI) Active clients are displayed properly.
  3. Engine
    1. The different propagation methods work properly.
    2. The shutdown procedure is correct.
    3. The abort procedure is correct.
  4. Other
    1. Exception handling is consistent throughout the simulation.
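As a point of reference for checking point 3a, the following is a minimal sketch of how an "all" versus "any" propagation rule might decide when a module re-evaluates. The class, method names, and exact semantics are assumptions for illustration, not the actual engine code:

```java
import java.util.*;

// Hypothetical sketch of the two propagation methods referenced in 3a.
// "ALL": a module re-evaluates only after every one of its dependencies
// has reported a new value; "ANY": one updated dependency is enough.
// Names and semantics are assumptions, not the real engine code.
public class Propagation {
    enum Method { ALL, ANY }

    static boolean shouldPropagate(Method method,
                                   Set<String> dependencies,
                                   Set<String> updated) {
        if (method == Method.ANY) {
            // at least one dependency reported in this round
            for (String d : dependencies) {
                if (updated.contains(d)) return true;
            }
            return false;
        }
        // ALL: every dependency must have reported
        return updated.containsAll(dependencies);
    }
}
```

A test for 3a can then drive a module with partial and complete sets of updated dependencies and check that it fires only when the selected rule says it should.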

Proposed testing procedures to confirm or refute the critical points/behaviours listed above:

Note: For these tests we will need three maps of different sizes (small, medium, large), two special maps (tree, wheel), and a script that runs the components on different ports.

  1. Pick the small map, fire up all the clients, then the demon. Pick automatic mode, 10 tries.
    Confirms:
    1b, 1c, 2a, 2d, 2e, 3a

  2. Pick the large map, connect a client or two, then the demon, then another client or two. Go to manual mode and let the clients submit values. While in manual mode, bring up a new client and let it submit values. Also, kill an existing client, then bring it up again (registration for the same module). While the simulation is running, abort the run, then shut down the simulation.
    Confirms:
    1b, 1c, 2b, 2c, 2e, 3b, 3c

  3. Pick the medium map, on a different port/machine. Connect the demon first, then all desired clients. Run in automatic mode, try disconnecting the demon in the middle of a simulation, then reconnecting. Run automatic again with 0 delay. Make sure the clients are displaying the proper maps and doing so on time (in other words, that the simulation is synchronized between the different clients). After the automatic run, shut down the simulation, then analyze the output maps for inconsistencies.
    Confirms:
    1a, 1c, 1d, 2c, 2d, 2e, 3a

  4. Special test 1 - Spoked wheel test
    The dependency graph will look like a wheel, with all outside nodes pointing to a single middle node. The simulation will be run on it and the outputs examined for any inconsistencies. This will verify the "all" method.
    Confirms:
    3a

  5. Special test 2 - Pure tree test
    The dependency graph will be a pure tree, with each node having a parent and children (with the exception of the root and leaves). All the dependencies will point towards the leaves. As in the previous test, the simulation will be run and the results examined for possible inconsistencies (such as a parent node being affected by its children). This partially verifies the "any" method.
    Confirms:
    3a
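The two special maps can be generated programmatically rather than drawn by hand. The sketch below builds both shapes as dependency lists (node -> set of nodes it depends on); the node naming and representation are assumptions for illustration, not the real map file format:

```java
import java.util.*;

// Hypothetical builders for the two special maps used in tests 4 and 5.
// deps.get(n) is the set of nodes that n depends on. Node names and the
// map representation are assumptions, not the real map format.
public class SpecialMaps {
    // Spoked wheel: every rim node points to the single middle node,
    // so the hub depends on all rim nodes (stress case for "all").
    static Map<String, Set<String>> wheel(int spokes) {
        Map<String, Set<String>> deps = new HashMap<>();
        Set<String> rim = new TreeSet<>();
        for (int i = 0; i < spokes; i++) {
            String r = "rim" + i;
            rim.add(r);
            deps.put(r, new TreeSet<>()); // rim nodes depend on nothing
        }
        deps.put("hub", rim);             // hub depends on every rim node
        return deps;
    }

    // Pure tree: dependencies point toward the leaves, so each child
    // depends only on its parent; the root "n" depends on nothing.
    static Map<String, Set<String>> tree(int depth, int fanout) {
        Map<String, Set<String>> deps = new HashMap<>();
        deps.put("n", new TreeSet<>());
        addChildren(deps, "n", depth, fanout);
        return deps;
    }

    private static void addChildren(Map<String, Set<String>> deps,
                                    String parent, int depth, int fanout) {
        if (depth == 0) return;
        for (int i = 0; i < fanout; i++) {
            String child = parent + "." + i;
            deps.put(child, new TreeSet<>(Set.of(parent)));
            addChildren(deps, child, depth - 1, fanout);
        }
    }
}
```

Generating the maps this way also makes the post-run consistency check easy: for the tree, walk every edge and confirm no parent's output changed after one of its children reported.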
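The Confirms lists above can be checked mechanically for coverage. The sketch below simply transcribes those lists (labeling sub-items a, b, c... in order within each numbered part, which is an assumption about the intended labels) and reports any critical point no procedure claims to confirm; on these lists, 4a (exception handling) comes back uncovered:

```java
import java.util.*;

// Transcription of the Confirms lists above, used to check that every
// critical point is covered by at least one procedure. Labels assume
// sub-items are lettered a, b, c... in order within each numbered part.
public class Coverage {
    static final Set<String> POINTS = Set.of(
        "1a", "1b", "1c", "1d",
        "2a", "2b", "2c", "2d", "2e",
        "3a", "3b", "3c",
        "4a");

    static final Map<Integer, Set<String>> CONFIRMS = Map.of(
        1, Set.of("1b", "1c", "2a", "2d", "2e", "3a"),
        2, Set.of("1b", "1c", "2b", "2c", "2e", "3b", "3c"),
        3, Set.of("1a", "1c", "1d", "2c", "2d", "2e", "3a"),
        4, Set.of("3a"),
        5, Set.of("3a"));

    // Returns the critical points that no procedure confirms.
    static Set<String> uncovered() {
        Set<String> missing = new TreeSet<>(POINTS);
        for (Set<String> confirmed : CONFIRMS.values()) {
            missing.removeAll(confirmed);
        }
        return missing;
    }
}
```

Re-running this check whenever a procedure or critical point is added keeps the plan's coverage claims honest.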


Revision history:
Version 1.0 (created) - 12/01/98