Q1: In his pamphlet on the Strategic Naval Game, Captain McCarty Little describes his
views on gaming to support the development of naval (and military) thinking. In the
century since then, what has changed and what has remained enduring?
Q2: There are many skills required in a good SWG facilitator (or umpire or moderator or
referee); describe five of the more important ones and their value in the process.
A2: See Procedures, Step 3 for some characteristics of
a good facilitator. See also the Rules page for "Rules of the
Facilitator" for guides on how to behave during war-game play.
Q3: The term BOGGSAT is occasionally used to
describe what someone may see from a seminar war game in action. It seems like little more
than a 'bunch of guys and gals sitting around a table' having a confab about tactics and other
military issues. How is professional seminar war gaming designed to make this term irrelevant?
A3: Of the 15 steps for developing and completing a seminar war game, only step 12
(the actual game play) resembles BOGGSAT. Steps 1 through 11 provide a rigorous route to
preparing for such a game. And steps 13 through 15 provide additional analytic steps needed
to extract value and provide a report. If a seminar war game consisted only of Step 12, such
a war game would indeed be little more than BOGGSAT.
See the Procedures page. Note particularly the need to have
scenarios that will cover issues of interest and provide for suitable branches and sequels, a
facilitator who draws out all relevant points of view and maintains an unbiased position for
all perspectives, injects that the facilitator can use to draw the discussion towards matters
that need attention, well-planned data collection and analysis methods, and a report that
includes details on the methods used as well as the outcomes. Throughout, including in the
final report, the views of dissenters, mavericks, and contrarians should be acknowledged so
alternative points of view are not missed for further consideration and subsequent study.
Q4: Perla and Markowitz recommended in 2009 that the US Navy's Global War Games should
incorporate several aspects in their gaming methods. Describe one (or more) of the innovations
they proposed.
A4: Some of the more significant innovations are: tracking time in a multi-level war game,
collaboration between lower-level tactical players and control staff, and using the
mechanisms of "closed planning and open adjudication". See Perla and Markowitz
Wargaming Strategic Linkage for
details.
Q5: There are several methods for recording activity within a seminar war game project;
describe three of these and their pros and cons.
A5: Four such methods are the project workbook, a gist of discussion, audio recording, and video
recording. See the Records page for more information and some
of the strengths and weaknesses.
Q6: There are differences between Perla's 1985/1990 definition of "wargame" and his
2007 definition. What are the differences and what significance is there?
A6: The most notable difference is that his description of participants in 2007 was
"a human player or players". In the earlier definition this was "players representing
opposing sides". The earlier language conjures up an image of a BLUE and a RED side in
something like a "zero-sum game" -- what I win you lose. In that respect it has aspects
of the Cold War: NATO versus the Warsaw Pact, winner takes all. The more recent language
acknowledges that there may be several "sides" or factions -- hence more resembling
irregular warfare, with many contending players.
The new language (see Definitions) also allows for
"a [single] human player" and also allows that there may be more than two "opposing
sides". When there is a single player, interactions may be determined largely by
computer-based "co-participants". Perhaps this is an acknowledgement of how good computers
have become (since 1985-90) in playing factions within a wargame.
Another change is that the newer definition suggests "cause and effect" may now be
harder for players to trace. For example, when the "sequence of events affects and is in
turn affected by" players (1990 definition), it implies that players will see clearly how
to move the levers to affect the play in their preferred direction. However, in the wording
of the 2007 definition, there is a "flow" of events and players may "shape" them --
suggesting player input can change the flow of events, but perhaps not with the
outcome the players expect. This corresponds to the greater ambiguity we can now see in
irregular warfare but which was less apparent in much of the Cold War thinking (although
ambiguity has always been an element of military operations).
Q7: Austere seminar war games may rely on only the judgement of the facilitator to
determine the flow of events, but more elaborate seminar war games may incorporate diverse
tools to determine outcomes. Data collection and analysis can use a single scribe or it may
have much more extensive support. Describe some of these additional tools and how they might
be used.
A7: See the Tools page for details. Some of the tools may include:
Players who are commanders within the game may be provided with a command team or staff,
say representatives from the traditional numbered staff branches (J1, J2, etc.). Innovations in
command teams can be provided as well, such as political, legal, or cultural advisors.
Some players may be embedded in command and control systems. These may be C2 systems
that are familiar to them (making the SWG a command-post exercise), or surrogates of existing
C2 systems or prototypes for future systems. The players will then have access to planning
tools and collaboration tools to assist them in their roles.
Staff planning aids, such as spreadsheet tables, may assist players (and control staff)
in developing realistic timing for deployments, resupply, sensor coverage, and so on.
Geographic information systems can be used to resolve time-and-distance issues, e.g.,
flight times for airlifts, or delays in convoys getting to their destinations.
Combat simulations may be used by players (sometimes one echelon down from the SWG
players) to determine the outcomes from conflict, and not just in terms of winner and
loser, but also, e.g., battle casualties, new demands for resupply, and waves of panicked
refugees.
Electronic collaboration tools may be used both by players and by members of the Study
Team. Such tools include Adobe Connect, chat rooms, wikis, and their counterparts.
Data capture may use a variety of tools: questionnaire design, circulation, and
completion; databases; statistical analysis packages; and audio and video recording.
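The time-and-distance resolution mentioned above can be sketched in a few lines of code. The following is a minimal illustration, not any specific tool from the Tools page: a great-circle (haversine) distance plus a simple transit-time estimate. The place names, coordinates, cruising speed, and ground-handling time are all hypothetical values chosen for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in kilometres between two points
    given in decimal degrees."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

def transit_hours(distance_km, speed_kmh, ground_time_h=0.0):
    """Point-to-point transit time plus any fixed loading/refuelling time."""
    return distance_km / speed_kmh + ground_time_h

# Hypothetical airlift leg (illustrative coordinates and speed only):
d = great_circle_km(49.44, 7.60, 37.00, 35.43)
print(f"{d:.0f} km, {transit_hours(d, 750.0, 2.0):.1f} h door-to-door")
```

A control staff could embed the same arithmetic in a spreadsheet; the point is that deployment and resupply timings offered to players rest on a reproducible calculation rather than guesswork.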
Q8: While much of seminar war gaming relies on the discourse between players, there are
areas where quantitative methods have been used. Describe three.
A8: First, players may be asked to complete questionnaires where some questions require
choosing among alternatives or marking a point on something like a Likert scale. Second, a task list may
be used to have players "score" results in some phase of a game to be compared with scoring
from another source, e.g., results in another phase of the game (as in STAMPER). Third,
players may be asked to rank alternatives; rank correlations can be used to determine
"distances" between players and a group ranking or between themselves. From this, statistical
methods like cluster analysis and multidimensional scaling can be applied (as in SOTA). See
the Numbers page for details. Fourth, if combat models, C2 systems,
or other tools are incorporated there may be quantitative results from them that can be analyzed
further, e.g., loss-exchange ratios from the combat simulations or network loading and latency
within the C2 system.
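The third method above, rank correlation, can be sketched briefly. The following is an illustrative implementation of Spearman's rank correlation (assuming untied ranks), not the specific procedure used in SOTA; the player names and rankings are hypothetical.

```python
from itertools import combinations

def spearman_rho(rank_a, rank_b):
    """Spearman rank correlation between two rankings of the same n
    alternatives (ranks 1..n, no ties): rho = 1 - 6*sum(d^2)/(n(n^2-1))."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical example: three players each rank five courses of action.
players = {
    "P1": [1, 2, 3, 4, 5],
    "P2": [2, 1, 3, 5, 4],
    "P3": [5, 4, 3, 2, 1],
}

# A pairwise "distance" of 1 - rho lies in [0, 2]: identical rankings
# give 0, exactly reversed rankings give 2. A matrix of such distances
# is the usual input to cluster analysis or multidimensional scaling.
for (pa, ra), (pb, rb) in combinations(players.items(), 2):
    print(pa, pb, round(1 - spearman_rho(ra, rb), 3))
```

Here P1 and P2 would emerge as a near-agreeing cluster while P3, the contrarian, sits at maximum distance from P1 -- exactly the kind of dissenting view the final report should acknowledge.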
Q9: Who are Mike Bauman and Edward Tufte, and what are one or more of their recommendations
on how information should be presented?