Aspects of Simulator Model Quality
Issues and solutions associated with designing business simulation models that are error-free.
This page explores the software quality issues associated with the design of the business simulation model. Thus it focuses on validating the model and does not explore the design issues associated with ensuring quality learning, or the software associated with the user interface, data management, report generation and on-line help.
The page draws on my experience and knowledge of general software design and extends this to cover the particular features of business simulations, error types, error testing and quality management during design. (Before my career designing and using business simulations I was involved in advanced manufacturing systems design with GE and worked for Honeywell Information Systems. I have also authored two books on data processing [1, 2].)
Designing error-free, high-quality simulations is difficult because of the model size, model complexity (arithmetic, cyclomatic, structural and dynamic), the design process, decision freedom, user knowledge and emotional involvement.
Business simulation models are large - if built using Excel, even a simple simulation model will consist of a thousand rows and a complex simulation can have 10,000 rows. Models of this size are error prone - research suggests that spreadsheet models with more than 150 rows will have at least one significant error [3].
A simulation's models range from straightforward accounting models to complex, non-linear, black-box economic models. Consequently, it is easy to mis-type formulae and not notice this until the simulation is in use.
Business simulations make significant use of conditional logic that causes calculations to be done differently depending on the decisions and the current business situation. Thus there are many processing paths through the simulation model. An analysis of twenty simulations showed that there is one conditional statement for every five or so rows, and cyclomatically complex software such as this is "difficult to test" [4].
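To make this concrete, here is a minimal sketch (in Python, with purely illustrative names and figures) of the kind of conditional logic a demand model might contain - every `if` adds another processing path that must be tested:

```python
def unit_demand(price, promotion_spend, capacity):
    """Illustrative demand model; each conditional creates another processing path."""
    if price <= 0:                  # path: guard against a nonsensical decision
        return 0.0
    demand = 1000.0 * (10.0 / price)
    if promotion_spend > 0:         # path: promotion lifts demand, capped at +50%
        demand *= 1.0 + min(0.5, promotion_spend / 10000.0)
    if demand > capacity:           # path: cannot sell more than can be made
        demand = capacity
    return demand
```

Three conditionals give even this tiny fragment a cyclomatic complexity of four; a full model with one conditional every five rows multiplies the paths that must be exercised.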
Business simulation models are structurally complex, with input data (decisions) interacting and determining several outputs (results). Further, data processing must be done in the right sequence, with flows that intermingle and interact. Arguably, a business simulation is structurally more complex than normal data processing software (such as payroll, production scheduling, etc.).
The dynamic complexity of business was explored in Forrester's seminal article (Industrial Dynamics - A Major Breakthrough for Decision Makers [5]) and I explored the same subject for business simulations [6] as the learning journey evolves over time. Consequently, the simulation can become dynamically unstable and overshoot!
Designing a business simulation is a creative, iterative process and this can introduce errors as the software is extended and refactored (see later). This is particularly problematic because of the size and complexity of the simulation model.
Unlike normal software where the users have a narrow range of data entry options, users of business simulations must have the freedom to enter decisions that range from the sensible, through radical and foolish, to plain stupid. And this places unusual and extreme stresses on the software.
Where the simulation is being used directly by the learners, they will not have had prior experience or knowledge of the software. Even when the software is being used by the tutor, he or she will not be a computer expert and will have only limited experience. This significantly increases the chance of usage errors.
A core benefit of business simulations is the extent to which the learners have fun and are engaged. But the flip side is that if something goes wrong - results are calculated wrongly or the simulator crashes - the learners (and the tutor) are very, very upset. Even if a resolution can be found, trust has been lost and learning impaired.
Software errors fall into three classes - syntax (language) errors, run-time exceptions and logic errors.
Syntax (language) Errors
A program consists of a series of statements that instructs the computer what to do. Wrong grammar or spelling in these instructions means that the computer does not understand. Happily, most, perhaps all, syntax errors are caught by the computer's language processor.
Run-time Exceptions
These are the errors that become apparent when the simulator is used and halt the program (upsetting dozens of learners and the tutor). Even if the problem is resolved, learning will be disrupted and, besides disaffection, there will be a loss of trust.
Logic Errors
These are the insidious errors where the calculations are wrong and produce incorrect results. They are insidious because they are easy to miss when testing, but when discovered by the learners they reduce trust and cause disaffection.
Ensuring quality involves testing and quality management (see later). Here I explore standard software testing (black box testing, white box testing and code inspection) and testing that is especially necessary for simulation models (structural soundness testing and dynamic stability testing).
Black Box Testing
This involves checking that the inputs (business decisions) produce the right outcomes (results). It does not check, or have access to, individual calculations. I see black box testing as equivalent to the shell game (where you have to find a pea under one of three shells), except that you know there is an error (pea) but not where it is - and for software there are dozens, perhaps hundreds, of shells to choose from! Consequently, if an error is found it is very difficult to locate and correct.
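As a sketch (using a hypothetical toy model and hypothetical decision and result names), a black box test drives the model as an opaque function and compares chosen outputs against hand-calculated values:

```python
def simulate(decisions):
    """Toy stand-in for the real simulation model - the tester cannot see inside."""
    revenue = decisions["price"] * decisions["volume"]
    return {"revenue": revenue, "profit": revenue - 4.0 * decisions["volume"]}

def black_box_check(model, decisions, expected):
    """Report every result that differs from the hand-calculated expectation."""
    results = model(decisions)
    return {name: (results[name], want)
            for name, want in expected.items() if results[name] != want}

mismatches = black_box_check(simulate,
                             {"price": 10.0, "volume": 500},
                             {"revenue": 5000.0, "profit": 3000.0})
```

An empty mismatch dictionary only says that no pea was found under these particular shells; when a mismatch does appear, the test cannot say which calculation caused it.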
White Box Testing
This involves checking within the models, exploring in detail the way individual calculations are done and how variables change. Consequently, it is particularly necessary because of the structural, arithmetic and cyclomatic complexity of simulation models. Depending on the modelling language used (see later), this can be very easy or impossible.
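In a language that permits it, white box testing can be sketched as instrumenting the model so that every intermediate variable is exposed for inspection (the model and names here are illustrative):

```python
def simulate_traced(decisions, trace):
    """Compute profit while recording each intermediate value for inspection."""
    trace["revenue"] = decisions["price"] * decisions["volume"]
    trace["variable_cost"] = 4.0 * decisions["volume"]
    trace["profit"] = trace["revenue"] - trace["variable_cost"]
    return trace["profit"]

trace = {}
simulate_traced({"price": 10.0, "volume": 500}, trace)
# every step, not just the final result, can now be checked against hand workings
```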
Code Inspection
This involves reading through the source code to see if calculations and logic have been entered correctly. The size of the simulation model makes this checking tedious and very time consuming - this impacts the thoroughness of code inspection as the tester becomes tired and bored. (Consider reading through an eighty-page book looking for spelling errors, typos, misplaced punctuation, etc.) Further, the choice of modelling language impacts the ease of code inspection. For some languages it is very easy and for others exceptionally difficult (practically impossible).
Structural Soundness Testing
As described earlier, the structure of a business simulation model is complex. Consequently, it is important to check that calculations are done in the right sequence and are not omitted.
Dynamic Stability Testing
As mentioned earlier, business simulations are dynamic feedback systems, and delays in the impact of decisions, feedback loops in the model and stochastic variation can cause instability. If this is likely, then stability must be tested by entering appropriate decisions or setting up appropriate business situations.
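A toy stock-control loop (with hypothetical figures) shows how a feedback decision rule that is too aggressive makes a model overshoot, while a modest one converges:

```python
def run_stock_control(order_gain, periods=12):
    """Stock chases a target of 100 units; orders arrive one period later.
    A modest gain converges smoothly; an aggressive gain overshoots the target."""
    stock, target, history = 0.0, 100.0, []
    for _ in range(periods):
        order = order_gain * (target - stock)  # feedback decision rule
        stock += order                         # delayed delivery lands next period
        history.append(stock)
    return history
```

A dynamic stability test would deliberately enter such aggressive decisions (or set up the triggering business situation) and check that the results remain plausible.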
In the factory, quality is ensured by managing it throughout the manufacturing process, not just by testing for non-conformance. Likewise, when designing software it is necessary to manage quality throughout the design process, and this involves: design methodology, choice of modelling language, software structure, refactoring, defensive programming, documentation and verification support.
Design Methodology
Designing business simulations presents especially difficult design problems. On one hand the design process must be rigorous to ensure the simulation is delivered on time and to cost. On the other hand, design is a creative process. To provide for both, I use my Rock Pool Method [7].
Choice of Modelling Language
The choice of modelling language impacts readability and testability. Readability means that you can see what the model does, and is crucial during design, when searching for bugs, and later when the simulation needs to be updated or modified. Language choice also determines whether white box testing is possible, and it eases, speeds and improves code inspection. Finally, language choice impacts design speed. Overall, I feel that using a high-level computer language is better than using a spreadsheet.
Software Structure
"Most programs are too big to be comprehended in a single chunk" [8]. This means that structuring the model into several sub-models (or objects) is a vital aid to comprehension and has the additional benefit of allowing tested models to be transferred from an existing simulation to a new one [9], shortening development time and improving quality. Software structure goes beyond the simulation model and includes all the elements of the simulator (user interface, data management, report generation and on-line help).
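A sketch of this structuring, with hypothetical sub-models: each piece is small enough to comprehend and test on its own, and a proven sub-model can be lifted into a new simulation:

```python
def marketing_model(decisions):
    """Sub-model: translate price into demand."""
    return {"demand": 1000.0 * (10.0 / decisions["price"])}

def production_model(decisions, demand):
    """Sub-model: make, and cost, what can actually be sold."""
    sales = min(demand, decisions["capacity"])
    return {"sales": sales, "cost": 4.0 * sales}

def accounting_model(decisions, production):
    """Sub-model: turn physical flows into financial results."""
    revenue = decisions["price"] * production["sales"]
    return {"revenue": revenue, "profit": revenue - production["cost"]}

def simulate(decisions):
    """Top level: wire the sub-models together in the right sequence."""
    marketing = marketing_model(decisions)
    production = production_model(decisions, marketing["demand"])
    return accounting_model(decisions, production)
```

The top-level `simulate` also makes the processing sequence explicit, which helps with the structural soundness checking described earlier.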
Refactoring
Refactoring [10] is the software-creation equivalent of rewriting a document and is done to clarify and improve the model. In part it is a natural part of the design process as the models are created. But beyond this, especially during white box testing and code inspection, the simulation model should be refactored to simplify and clarify.
Defensive Programming
This involves identifying, during design, the models that are likely to cause run-time exceptions or be logically wrong, and designing out the problem. For example, if the divisor in a formula can be zero, the software is designed so this condition is checked and, if zero, the calculation is bypassed.
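The divisor example might be sketched as a guard function (names and the default value are illustrative):

```python
def safe_ratio(numerator, divisor, default=0.0):
    """Defensive ratio: a period with no sales would make 'return on sales'
    a divide-by-zero, so the calculation is bypassed instead of crashing."""
    if divisor == 0:
        return default
    return numerator / divisor

profit, sales = 50.0, 0.0
return_on_sales = safe_ratio(profit, sales)  # 0.0 instead of a run-time exception
```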
Documentation
During design, software needs to be documented. For simulations, during their iterative, creative design, documentation ensures that new models do not conflict with existing models. When the simulation is in use, it explains calculations to the tutor, allowing questions to be answered and ensuring the simulation is understood. Thirdly, documentation facilitates updating and modification at a later date. Documentation covers the model, decisions, results and variables, and is paper based, in the software and online.
Verification Support
This involves building in logic and reports that help during testing and verification. This adds to the size of the model and significantly to the number of reports produced. (For a recent, complex simulation, verification support reports nearly trebled the number of reports.) But verification support shortens testing and design time, and many of the reports can be used in the simulation's Tutor Support System to help the trainer manage and ensure learning.
[1] Hall, Jeremy (1979) How to pass examinations in Data Processing, Cassels, London
[2] Hall, Jeremy (1983) Data Processing, Cassels, London
[3] Freeman, D. (1996) How to make spreadsheets error-proof, Journal of Accountancy, 181 (5)
[4] McCabe, Thomas J. (1976) A Complexity Measure, IEEE Transactions on Software Engineering, 315
[5] Forrester, Jay W. (1958) Industrial Dynamics - A Major Breakthrough for Decision Makers, Harvard Business Review, Vol. 36, No. 4, pp. 37-66
[6] Hall, Jeremy & Benita Cox (1993) Computerised Management Games: the feedback process and servo-mechanism analogy, in The Simulation & Gaming Yearbook 1993, eds. Fred Percival, Sheila Lodge and Danny Saunders, Kogan Page, London
[7] Hall, Jeremy J. S. B. (2005) Computer Business Simulation Design: the Rock Pool Method, Developments in Business Simulation and Experiential Learning, Volume 32
[8] Kernighan, Brian W. & P. J. Plauger (1978) The Elements of Programming Style, McGraw-Hill Book Company, New York
[9] Hall, Jeremy J. S. B. (1996) Computerized Simulation Design: OOP or oops, The Simulation & Gaming Yearbook Vol 4, Kogan Page, London
[10] Fowler, Martin (1999) Refactoring: Improving the Design of Existing Code, Addison Wesley Longman, Boston, Mass.
[11] Hall, Jeremy J. S. B. (2012) Designing the Training Challenge, Developments in Business Simulation and Experiential Learning, Volume 39
Most recent update: 12/11/13
Hall Marketing, Studio 11, Colman's Wharf, 45 Morris Road, London E14 6PA, ENGLAND
Phone +44 (0)20 7537 2982 E-mail firstname.lastname@example.org