Information Needs for Quality Learning from Business Simulations
How designs go beyond the model-created information to deliver learning
In the 1970s, when I started to design and use business simulations, the only information my simulations provided was the results delivered to the learners. I soon realised, as I used them in the classroom, that I needed the simulator to check decisions as they were entered, to protect against typos and mistakes. By the late 1970s my simulations had a rudimentary decision screen. During the 1980s, as I continued running business simulations for clients, I began to build in a results screen that commented on teams' strengths and weaknesses - information that helped me manage learning by identifying issues and which teams needed coaching, challenging and supporting. In the early 1990s I re-examined my simulation usage and needs as a user (tutor) and from this determined tutor support needs. During the last half of the 1990s and into the early 2000s, I refined my thinking, converted it into an award-winning architecture and software, and then extended my thinking about information needs to include design support.
Yet it seems to me that, still, the information produced by a business simulation is commonly little or no more than the business results (outcomes) derived from the business decisions made by the learners. However, there are additional information needs necessary for quality learning:
Although learners must have freedom to make mistakes and radical decisions, there is a need to flag impossible decisions, mistakes, arbitrary decisions and inadvisable decisions.
Impossible Decisions are those that are illegal and cannot be processed by the simulation such as zero prices or firing more staff than you employ.
Mistakes occur when learners do not understand how decisions are to be entered into the simulator. For instance, where a decision relates to increasing capacity, learners may misunderstand this and enter the current capacity instead. If this decision were implemented, capacity would be doubled with a huge capital investment, so the decision screen flags and questions it.
Arbitrary Decisions are those where the learners try to break the model. An example is where promotional expenditure is zeroed. Even if the simulation is capable of processing the decision, the impact will be disastrous. By questioning such decisions the learners are warned of impending doom and, if they go ahead then so be it!
Inadvisable Decisions are those where the current business situation makes the decision problematic. For example, where a capital investment, combined with current debt levels, would lead to liquidity or solvency problems.
As a result, there are three types of decisions: acceptable ones (the central band) that are accepted without question, illegal decisions (the extremes) that are rejected and, between the two, questionable decisions that the learners have the opportunity to change or use unchanged.
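The three bands above can be sketched as a simple classification. This is a minimal illustration only; the function name, parameters and thresholds are my assumptions, not the actual screening logic of any particular simulation.

```python
def screen_decision(value, legal_min, legal_max, ok_min, ok_max):
    """Classify a decision into one of the three bands described above.

    - outside [legal_min, legal_max]: impossible/illegal, rejected outright
    - inside  [ok_min, ok_max]:       the central band, accepted without question
    - otherwise:                      questionable, flagged for the learners
                                      to confirm or change
    """
    if value < legal_min or value > legal_max:
        return "rejected"
    if ok_min <= value <= ok_max:
        return "accepted"
    return "questioned"

# A price decision: zero is illegal, prices far from the norm are questioned.
print(screen_decision(0,   1, 1000, 50, 150))   # rejected  (zero price)
print(screen_decision(100, 1, 1000, 50, 150))   # accepted  (central band)
print(screen_decision(5,   1, 1000, 50, 150))   # questioned (radical but legal)
```

In practice each decision type would carry its own legal and advisable limits, and the "questioned" outcome would trigger the flag-and-confirm dialogue described above rather than a silent return value.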
Where the simulation is run with the tutor entering decisions (Tutor Mediated Simulation), the Decision Screen provides information that flags problems; this can be discussed with the learners before their decisions are accepted, or printed and returned to the learners with their results (for them to reflect on and discuss). Where the learners enter their own decisions (Direct Use Simulations), the Decision Screen stimulates thought and discussion and reduces the risk of really foolish decisions!
Deciding what results are provided to learners to show the outcomes of their decisions is a balance between providing sufficient information and not overloading the learners with information. The Result Screening Logic takes business results and analyses them further to produce reports that can be used by the trainer and/or, as appropriate, fed back to the learners.
Beyond this, the Result Screen can comment on results (as staff, customers or bankers might) and highlight strengths and weaknesses. As with the Decision Screen, there is a band where the results are acceptable and no comments are made. Beyond this there are increasing degrees of problem, ranging from questionable results (where action may be needed) to problematic results (where action is required).
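These result bands can be illustrated in the same spirit as the decision bands. Again this is a sketch only; the metric, band limits and comment wording are my assumptions for illustration.

```python
def comment_on_result(ratio, ok, warn):
    """Return a comment for a result such as a liquidity ratio.

    ok   = (low, high): acceptable band - no comment is made
    warn = (low, high): wider band in which the result is merely questionable;
                        outside it the result is problematic
    """
    if ok[0] <= ratio <= ok[1]:
        return None                                   # acceptable: stay silent
    if warn[0] <= ratio <= warn[1]:
        return "Questionable: action may be needed."
    return "Problematic: action is required."

# Current ratio: acceptable 1.5-3.0, questionable down to 1.0 or up to 4.0.
for r in (2.0, 1.2, 0.7):
    print(r, comment_on_result(r, ok=(1.5, 3.0), warn=(1.0, 4.0)))
```

A full result screen would apply bands like these across many measures (profitability, liquidity, market share, staffing) and phrase the comments in the voice of staff, customers or bankers as described above.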
Where the simulation is run with the tutor entering decisions (Tutor Mediated Simulation), the Result Screen provides information that assesses learning and learning needs; based on this, the tutor can coach, challenge and provide remedial learning (proactively driving learning forward). Where the learners enter their own decisions (Direct Use Simulations) and receive their results directly, the Result Screen stimulates thought and discussion and provides clues about areas of strength and weakness.
The Decision Screen and the Result Screen provide information that can be used by the trainer to manage the learning process but beyond this there is a need to answer questions and identify differences between teams.
Answering Questions: Occasionally learners do not understand how their results are arrived at, and even the most expert and experienced trainer may find it difficult to answer such questions authoritatively and quickly. This problem can be solved where the simulation produces special reports reconciling calculations and explaining the impact of decisions.
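A reconciliation report of the kind described might simply walk through a calculation step by step. The demand model and all the numbers below are invented for illustration; they are not the model used in any of the simulations discussed here.

```python
def reconcile_sales(base_demand, price, ref_price, promo, ref_promo):
    """Explain, line by line, how unit sales were arrived at.

    Assumes a simple multiplicative demand model (an illustration only):
    units = base_demand * (ref_price / price) * sqrt(promo / ref_promo)
    """
    price_effect = ref_price / price
    promo_effect = (promo / ref_promo) ** 0.5
    units = base_demand * price_effect * promo_effect
    return "\n".join([
        f"Base demand:        {base_demand:10.0f} units",
        f"x price effect:     {price_effect:10.2f}  (ref {ref_price} / price {price})",
        f"x promotion effect: {promo_effect:10.2f}  (spend {promo} vs ref {ref_promo})",
        f"= unit sales:       {units:10.0f} units",
    ])

print(reconcile_sales(10_000, price=110, ref_price=100,
                      promo=8_100, ref_promo=10_000))
```

Even a toy report like this lets the trainer answer "why did our sales fall?" by pointing at the line where the learners' decision bit.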
Team Differences: Besides reactively answering questions, the trainer needs to proactively drive learning forward. Part of the information necessary to do this is provided by the Decision and Result Screens. Beyond this, it is helpful to have information that shows the differences between teams - their decisions, results, the impact of the market, etc.
Learn more about my Tutor Support System
Although the decision screening, result screening and tutor support logic provide invaluable information, additional logic and information are needed to help calibrate and validate the simulation and so ensure design quality - and this information need is significant. For example, for a business simulation of intermediate complexity the design support reports can account for nearly two-thirds of the total number of reports produced by the simulation (with the remaining third providing information for the learners and the trainer).
What does this mean in terms of additional software needs - model size, number of reports, comments on performance and decisions? Looking back to my Management Challenge simulation, the original version (1986) had a model with about 150 statements, 7 reports and no comments. The latest version (2013) has a model consisting of 705 statements, 123 reports and 105 comments. However, these metrics do not include any design support needs. A similar but more recent business acumen simulation (Training Challenge, 2012) has a model size of 1888 statements, 188 reports and 247 comments.
Unquestionably, these additional needs add to the design work. But, done thoughtfully, the extra work is reasonable and is more than compensated for by a better design and better learning.
Hall, Jeremy J. S. B. (1994) Computerised Tutor Support Systems: the tutor's role, needs and tasks, in The Simulation & Gaming Yearbook, Volume 2, Kogan Page, London.
Hall, Jeremy J. S. B. (2012) Designing the Training Challenge, Developments in Business Simulation and Experiential Learning, Volume 39.
Most recent update: 14/11/13
Hall Marketing, Studio 11, Colman's Wharf, 45 Morris Road, London E14 6PA, ENGLAND
Phone +44 (0)20 7537 2982 E-mail firstname.lastname@example.org