The investigation which is the subject of this Report was initiated by the Research Programme Leader, Human Capability.

© BAE Systems 2011. All Rights Reserved. The authors of this report have asserted their moral rights under the Copyright, Designs and Patents Act 1988 to be identified as the authors of this work.

Issued by BAE Systems on behalf of the HFI DTC consortium. The HFI DTC consortium consists of BAE Systems, Cranfield University, Lockheed Martin, MBDA, SEA, Southampton University and the University of Birmingham.
Authors
Jonathan Pike, Cranfield University
Dr John Huddlestone, Cranfield University
Contents
3.4 Development of a Team/Collective Performance Model ..... 19
    3.4.1 Review of Team Effectiveness Models ..... 19
    3.4.2 Environmental Task Demands ..... 23
    3.4.3 The Team/Collective Performance Model ..... 24
        3.4.3.1 Task Environment ..... 25
        3.4.3.2 Performance Outcomes ..... 26
        3.4.3.3 Team Processes ..... 26
        3.4.3.4 Team Properties ..... 27
        3.4.3.5 Team Member Characteristics ..... 27
    4.2.1 Hierarchical Task Analysis for Teams ..... 30
    4.2.2 Team Cognitive Task Analysis ..... 31
    4.2.3 Team Task Analysis ..... 31
    4.2.4 Task and Training Requirements Analysis Methodology ..... 32
    4.2.5 Work Domain Analysis ..... 32
    4.2.6 Mission Essential Competencies ..... 33
    4.2.7 Models for Analysis of Team Training (MATT) ..... 33
4.3 Evaluation ..... 35
    7.2.3 Environment Description Table Construction ..... 50
7.3 Internal Task Context Description ..... 53
    7.3.1 Organisational Structure ..... 53
    7.3.2 Role Definitions ..... 54
    7.3.3 Internal Team Context Diagram ..... 56
    7.3.4 Team Communication Structure ..... 58
        7.3.4.1 Communication Diagram ..... 58
        7.3.4.2 Communications Description ..... 61
        7.3.4.3 Communications Matrix ..... 61
7.4 Task Analysis ..... 62
    7.4.1 Hierarchical Task Analysis for Team & Collective Training - HTA(TCT) ..... 62
        7.4.1.1 HTA(TCT) Diagram Notation and Format ..... 62
        7.4.1.2 Example HTA(TCT) Diagram ..... 64
        7.4.1.3 Task Sequence Diagrams ..... 66
        7.4.1.4 HTA(TCT) Task Description Tables ..... 67
        7.4.1.5 Task Role Matrix ..... 72
7.5 Capture Environmental Task Demands ..... 73
7.6 Teamwork Analysis ..... 74
    7.6.1 Teamwork Process Priorities ..... 74
    7.6.2 Teamwork Interaction Analysis by Role ..... 74
7.7 Training Gap Analysis ..... 75
    7.7.1 The Risk Management Approach ..... 76
    7.7.2 The Training Priorities Table ..... 76
7.8 Team/Collective OPS and TO Development ..... 80
    7.8.1 Linking Tasks and Training Objectives to Mission Task Lists ..... 80
8.3 Instructional Method Selection ..... 88
    8.3.1 Selecting Methods for Initial Instruction ..... 88
    8.3.2 Selecting Methods for Practice and Assessment ..... 90
        8.3.2.1 Identification of Part-Task Training Requirements ..... 90
        8.3.2.2 Identification of Simplifying Conditions Method Requirements ..... 91
        8.3.2.3 Assessment and Feedback Methods ..... 91
        8.3.2.4 Practice and Assessment Methods Table ..... 92
    8.3.3 Training Scenario Specification ..... 94
8.4 Instructional Task Identification ..... 97
    8.4.1 Instructor Task Table ..... 97
    8.4.2 Training Overlay Requirement Specification ..... 98
9.1 Training Environment Option Identification ..... 108
9.2 Training Environment Option Definition ..... 110
9.3 Training Environment Option Evaluation ..... 113
10.1 Introduction ..... 114
10.2 Estimation of Costs ..... 114
10.3 Estimating Effectiveness ..... 114
10.4 JSP 822 Guidance ..... 114
Appendix A
A.1 Team Process Model (Annett, 2000) ..... 120
A.2 Team Coordination Dimensions (Bowers et al, 1993) ..... 121
A.3 The Models for Analysis of Team Training Taxonomy (Dstl, 2006) ..... 122
A.4 Salas Big Five Model of Teamwork (Salas et al, 2005) ..... 124
A.5 Teamwork Behaviours (Rousseau et al, 2006) ..... 125
List of Tables
Table 1 Teamwork Models used in Teamwork Analysis ..... 12
Table 2 Integrative Teamwork Models ..... 13
Table 3 Initial Teamwork Categories ..... 16
Table 4 TCTNA Teamwork Taxonomy ..... 18
Table 5 Work Domain Analysis Abstraction Hierarchy (Naikar and Sanderson, 1999) ..... 33
Table 6 MATT Process Stages (Dstl, 2006) ..... 34
Table 7 TNA Triangle Stage Sub-Components ..... 38
Table 8 Example Constraints Table Format ..... 42
Table 9 Generic Scenario Table Format ..... 44
Table 10 Completed Generic Scenario Table ..... 44
Table 11 Interaction Table Format ..... 47
Table 12 Example Tornado F3 Pair External Context Interaction Table ..... 49
Table 13 Environment Description Table Format ..... 50
Table 14 Example Tornado F3 Pair Environment Description Table ..... 52
Table 15 Example Role Definition for the F3 Pair Lead WSO ..... 55
Table 16 Example Role Definition for the F3 Pair Wingman WSO ..... 55
Table 17 Example Tornado F3 Pair Interface Interaction Table ..... 57
Table 18 Tornado F3 Pair Environment Description Table Entries ..... 57
Table 19 Example System Matrix ..... 58
Table 20 Example Tornado F3 Pair Communications Matrix ..... 60
Table 21 HTA(TCT) Diagram Element Symbols and Descriptions ..... 63
Table 22 Syntax for Plans and Examples of Use ..... 64
Table 23 Task Sequence Diagram Notation ..... 66
Table 24 HTA(TCT) Task Description Table Structure ..... 68
Table 25 Task Description Table for F3 Pair Task 1.2.5 Meld Radar ..... 70
Table 26 Task Description Table for F3 Pair Task 1.2 Detect Bandit BVR ..... 71
Table 27 Task and Role Matrix Example ..... 72
Table 28 Example Environmental Task Demands Table ..... 73
Table 29 Example Teamwork Process Priority Table ..... 74
Table 30 Teamwork Interaction Table ..... 75
Table 31 Example Training Priorities Table ..... 78
Table 32 Factors Affecting the Likelihood of an Error and Indicators of High Severity Consequences ..... 79
Table 33 Training Content Performance Matrix with Application Level Elaboration Strategies, adapted from Clark (2008) ..... 84
Table 34 Suggested Content for Initial Instruction on Teamwork ..... 89
Table 35 Example Initial Instruction Table for F3 Pairs Training ..... 90
Table 36 Example Practice and Assessment Methods Table ..... 93
Table 37 Example Training Scenario Table ..... 95
Table 38 Environment Description Table Entries for Task Scenario Requirements ..... 96
Table 39 Example Instructor Task Table Format and Entries ..... 98
Table 40 Environment Description Table Training Overlay Requirements Entries ..... 99
Table 41 System Fidelity Requirements ..... 103
Table 42 Resource Fidelity Requirements ..... 104
Table 43 Human Fidelity Requirements ..... 105
Table 44 Manned Systems Fidelity Requirements ..... 105
Table 45 Physical Environment Fidelity Requirements ..... 106
Table 46 Tornado F3 Pair Environment Description Table ..... 107
Table 47 Example Training Environment Option Description Table ..... 111
Table 48 Example Training Environment Option Properties Table Entries ..... 112
Table 49 Training Environment Options Comparisons Table ..... 113
Table 50 Definitions of Team Coordination Dimensions (Bowers et al, 1993) ..... 121
Table 51 MATT Teamwork Behaviours (Dstl, 2006) ..... 122
Table 52 MATT Team Member Attitudes and Characteristics (Dstl, 2006) ..... 123
Table 53 MATT Teamwork Knowledge Requirements (Dstl, 2006) ..... 123
Table 54 Definitions of the Core Components and Coordinating Mechanisms of the Big Five Model of Teamwork (adapted from Salas et al, 2005) ..... 124
Table 55 Teamwork Behaviours (Rousseau et al, 2006) ..... 125
List of Figures
Figure 1 The Systems Approach to Training ..... 4
Figure 2 MoD TNA Process Diagram (JSP 822, 2007) ..... 7
Figure 3 TCTNA Development Sequence ..... 8
Figure 4 Patterns of Team Interactions adapted from Tesluk et al (1997) ..... 10
Figure 5 Input-Process-Output (IPO) paradigm for analysis of group interaction as a mediator of performance outcomes (Hackman and Morris, 1975) ..... 19
Figure 6 Model of Team Effectiveness adapted from Tannenbaum, Beard and Salas (1992) ..... 20
Figure 7 The Command Team Effectiveness Model with Basic Components and Feedback Loops (NATO, 2005) ..... 21
Figure 8 Information Transduction Model of Group Activity on the Task Environment (Roby, 1968) ..... 22
Figure 9 Properties of Naturalistic Environments, Environmental Stressors and Environmental Task Demands ..... 24
Figure 10 The Team/Collective Performance Model ..... 25
Figure 11 Team Training Cycle, adapted from Tannenbaum et al (1998) ..... 28
Figure 12 The Team/Collective Training Model ..... 29
Figure 13 Example HTA(T) Chart ..... 31
Figure 14 The Team/Collective Training Model ..... 36
Figure 15 The Triangle Model of TNA ..... 37
Figure 16 Team Context Diagram Notation ..... 45
Figure 17 Example Tornado F3 Pair External Environment TCD ..... 48
Figure 18 Example Tornado F3 Pair Organisational Chart 1 ..... 54
Figure 19 Example Tornado F3 Pair Interfaces TCD ..... 56
Figure 20 Example Tornado F3 Pair Communications Diagram and Textual Description ..... 60
Figure 21 HTA Diagram Format ..... 63
Figure 22 Example HTA for the F3 Pair Task Destroy Bandit Beyond Visual Range ..... 65
Figure 23 Task Sequence Diagram Format ..... 67
Figure 24 Task Sequence Diagram for the F3 Pair Task 1 Destroy Bandit BVR ..... 67
Figure 25 Task Sequence Diagram for the F3 Pair Sub-Task 1.2 Detect Bandit BVR ..... 67
Figure 26 General Mapping of Tasks to METs ..... 81
Figure 27 Mapping of Example ATC Role Group Summary Tasks to Military Tasks in the MTL(M) ..... 82
Figure 28 Part Task Training Sequence (adapted from Wickens and Hollands, 2000) ..... 85
Figure 29 Simplifying Conditions Method Training Sequence (adapted from Reigeluth, 1999) ..... 86
Figure 30 Mapping of TOs to the Operational Environment ..... 88
Figure 31 Information Inputs to Generic Scenario Specifications ..... 94
Figure 32 Mapping of TOs to Training Environments ..... 101
Figure 33 JOUST Networked Flight Simulation System ..... 110
Figure 34 Team Process Model (adapted from Annett, 2000) ..... 120
List of Acronyms
ACM        Air Combat Manoeuvring
ATC        Air Traffic Control
BATUS      British Army Training Unit Suffield
BVR        Beyond Visual Range
Cap JTES   Capability Joint Training Experimentation and Simulation
CAST       Command and Staff Trainer
CATT       Combined Arms Tactical Trainer
CDM        Critical Decision Method
CF         Competency Framework
CONOPS     Concept of Operations
CONEMP     Concept of Employment
CONUSE     Concept of Use
CTEF       Command Team Effectiveness
CWA        Cognitive Work Analysis
DSAT       Defence Systems Approach to Training
Dstl       Defence Science and Technology Laboratory
ECG        Electrocardiogram
EDT        Environment Description Table
FC         Fighter Control
HF         Human Factors
HFI DTC    Human Factors Integration Defence Technology Centre
HTA        Hierarchical Task Analysis
HTA(T)     Hierarchical Task Analysis (Teams)
HTA(TCT)   Hierarchical Task Analysis for Team and Collective Training
IPO        Input Process Output
JTIDS      Joint Tactical Information Distribution System
JSP        Joint Service Publication
KSAs       Knowledge, Skills and Attitudes
MATT       Models for Analysis of Team Training
MCC        Maritime Component Command
MECs       Mission Essential Competencies
MSHATF     Medium Support Helicopter Aircrew Training Facility
MTL(M)     Military Task List (Maritime)
MTDS       Mission Training through Distributed Simulation
MoD        Ministry of Defence
NATO       North Atlantic Treaty Organisation
OTA        Operational Task Analysis
OTI        Operational Task Inventory
PPE        Post Project Evaluation
RAF        Royal Air Force
RN         Royal Navy
RWR        Radar Warning Receiver
SA         Situation Awareness
SAT        Systems Approach to Training
SCM        Simplifying Conditions Method
SME        Subject Matter Expert
TCD        Team Context Diagram
TCTNA      Team/Collective Training Needs Analysis
TES        Tactical Engagement Simulation
TO         Training Objective
TNA        Training Needs Analysis
TTA        Team Task Analysis
TTRAM      Task and Training Requirements Analysis Methodology
UHF        Ultra High Frequency
US         United States
USAF       United States Air Force
VHF        Very High Frequency
WDA        Work Domain Analysis
WSO        Weapons Systems Operator
1 Executive Summary
The research project described in this report was devised in response to a Royal Navy request for the Human Factors Integration Defence Technology Centre (HFI DTC) to provide guidance on the conduct of Training Needs Analysis (TNA) for collective training, in support of the TNA process being conducted for the Queen Elizabeth (QE) Class Aircraft Carriers. TNA is the systematic process of analysing training tasks and identifying suitable training option(s); it has been used in connection with acquisition projects for many years, but has traditionally focussed on individual training. The inherent complexity and scale of collective training puts it beyond the analytical reach of the techniques normally employed for individual training. Because the three Services define team and collective training differently, the method developed is referred to as Team/Collective TNA (TCTNA) to make clear that it is applicable to both team and collective training, however they may be defined. The TCTNA guidance provided in this document is designed to extend and amplify the extant guidance on TNA provided in JSP 822, not to replace it.

In order to develop a TNA methodology, it was necessary to develop a model of collective training. This in turn necessitated the development of an underlying model of team performance and a supporting teamwork taxonomy. A review of extant teamwork and team performance models was conducted and, no suitable extant model having been identified, a model of team performance and a supporting teamwork taxonomy for TCTNA were synthesised. A review of extant human factors methods devised to facilitate the analysis of team training was also conducted. Of the relatively small number of methods in existence, none provided all of the analytical components required to support TCTNA, confirming that a new method had to be developed.
The TCTNA method that has been devised is structured around an adaptation of the TNA Triangle model devised in a previous phase of HFI DTC research (HFI DTC, 2009). It is composed of five components: Constraints Analysis, Team/Collective Task Analysis, Training Overlay Analysis, Training Environment Analysis and Training Option(s) Selection.

Constraints Analysis provides a mechanism for recording all key constraints and analysing their consequences in terms of limitations on potential training solutions. Team/Collective Task Analysis exploits adaptations of software design representational techniques to develop visual representations of the nature of the task environment. An extension of Hierarchical Task Analysis for Teams (developed under a previous Ministry of Defence (MoD) contract) is then employed as the core task analysis method. The teamwork taxonomy developed as part of the research is used to guide the analysis of the teamwork component of the task. Finally, training priorities are established using a risk management approach.

During Training Overlay Analysis, appropriate instructional methods are determined and generic scenarios are developed, which are used to inform the subsequent training environment specification. Instructor functions are specified, as are the requirements for data capture from the instructional environment.

During Training Environment Analysis, the training environment requirements are first rationalised and then a fidelity analysis is conducted for each environment required. Novel fidelity analysis templates are provided for specifying the manned systems, human elements, resources, systems and physical environment elements that are required within each training environment. Training environment options are then identified and characterised, and their technical suitability is assessed. Outline guidance on Training Option(s) Selection is provided, with reference to the JSP 822 direction on seeking advice from MoD and industry for the costing and comparison of complex training systems.

This research has delivered a methodology for the conduct of TCTNA. Worked examples and templates are provided for the components of each stage of the methodology. It is anticipated that this guidance will be used by military and commercial TNA specialists conducting TCTNA.
2 Introduction
2.1 Background
The requirement for this work originated in a request from the RN for the development of a methodological approach to the conduct of Training Needs Analysis (TNA) which could be applied in a set of TNAs to be conducted for collective training for the Queen Elizabeth Class Aircraft Carriers. The principal differences between individual training and team and collective training are associated with both scale and complexity. The issues include:

- Complexity of the task
- Complexity of the context in which the task is conducted
- Complexity of the start state of the training audience
- Complexity of exercise planning
- Complexity of the instructional task
- Complexity of evaluation
- Scale of resource requirements
- Costs of training
A TNA method for Team and Collective Tasks must be theoretically capable of covering any type of Collective Task from the smallest and simplest (two people working together) to large complex tasks that span multiple teams and organisations and involve integration of effort between them (such as ground-air integration in warfare). The experience of the Royal Navy (RN) TNA specialists was that, whilst the overarching approach to TNA mandated in Joint Service Publication (JSP) 822 was logical and applicable to collective training, there was an absence of appropriate guidance on techniques to deal with the complexities of the collective training problem. The purpose of this report is to fill this methodological gap.
from individual level up to Battle group and beyond, referring to these as collective training levels 1-6. The North Atlantic Treaty Organisation (NATO) definition of collective training cited in NATO (2004) describes it as training which "involves 2 or more teams, where each team fulfils different roles, training in an environment defined by a common set of collective training objectives (CTOs)" (p. 6.2). Given that the requirement for this work was to develop a methodological approach applicable to TNA for both the training of individual teams and collective training, and that there is potential for confusion about the applicability of the method if it is simply labelled Collective TNA, the method described in this document is referred to as Team/Collective TNA (TCTNA); it is applicable to all levels of training above individual training.
Notwithstanding the success of the application of SAT to the development and management of training within the RN, the Army and the RAF, for many years the development of training associated with the acquisition of new systems fell outside the scope of the SAT process of each individual Service. The TNA process was developed to provide guidance on the application of the principles of SAT to training developed within the acquisition process. The scope of TNA is illustrated by the red overlay in Figure 1. It embraces the analysis phase and sufficient high-level design to facilitate the identification of a recommended training solution. In 1996 the Department of Internal Audit endorsed a common approach to TNA across the three Services, achieved by the publication of JSP 502, TNA for Acquisition Projects. The output of the TNA is a document set that forms the inputs for subsequent stages of the instructional design process, such as specification of training equipment, development of lesson content and training material, and training implementation, assessment and evaluation strategies.

Historically, SAT and subsequently DSAT have applied only to individual training. Significantly, with the integration of DSAT and JSP 502 into JSP 822 in 2007, the need to consider collective training requirements is specifically mentioned within the guidance on TNA (JSP 822). The current MoD TNA process is shown diagrammatically in Figure 2. TNA is conducted in three phases:

Phase 1 - Scoping Study. The Scoping Study defines how the TNA is to be conducted and managed, and identifies the constraints, assumptions and risks associated with the project. It also includes a target audience description, which characterises who will need training, the annual throughput and the input standard.

Phase 2 - TNA Development. The TNA Development constitutes the core of the analytical activity of the TNA and yields four key deliverables. These are:

1. Operational/Business Task Analysis (OTA).
This deliverable reports the outcome of a task analysis to establish the performance conditions and standards for all affected job holders. It includes an Operational Task Inventory (OTI) with associated performance conditions and standards statements. It is recommended that the OTI is rationalised using a process such as Difficulty Importance Frequency (DIF) analysis. Notably, the guidance identifies the requirement to identify interfaces between individuals and teams, including co-ordination, communication and backup activities, noting that these form a vital part of sub-team and command team training.

2. Training Gap Analysis. The purpose of this deliverable is to determine the additional training required to close the gap between the performance level required, as defined by the OTA, and the existing performance level of individuals. Specific mention of unit and collective training requirements is required. A fidelity analysis is also required to identify the key cues and stimuli that support the requirement to train (Sect 2.30). The conduct of the fidelity analysis can continue into the next phase but has to be completed before method and media selection takes place.
3. Training Options Analysis. The purpose of this deliverable is to make a recommendation as to the cost-effective training solution(s) to meet the training requirement identified in the previous phases of the TNA. This includes a description of the training methods and media (or combinations thereof) that will partially or fully meet the training requirement, along with an estimate of the relative training effectiveness of each costed training media option.

4. The Final Report. The final report draws together all the key elements from the previous phases of analysis. The output from this deliverable is an endorsed training solution, a draft Operational Performance Statement/Competency Framework, an implementation plan and an evaluation strategy. If the final report is accepted, the next stage of the process is the development of the training course and any required training devices. These may be partly or wholly contracted out. If the final report is not accepted, then some or all of the analysis process will have to be repeated. The branching path in Figure 2 from the Final Report reflects these options.

Phase 3 - Post Project Evaluation. The Post Project Evaluation (PPE) should be conducted once training delivery and evaluation have commenced, and reviews the effectiveness of the TNA process for the project. Historically, relatively few projects have had a PPE.

The guidance for the conduct of the Phase 1 Scoping Study is comprehensive and does not appear to require any adaptation for team/collective training. Similarly, the Phase 3 PPE guidance is robust and applicable to any TNA. However, whilst the Phase 2 TNA Development guidance mentions the requirement to address team and collective training and highlights the requirement to capture communication, coordination and backup activities, there is no detailed guidance on how to apply the overall approach to this complex domain.
Therefore, the requirement identified for this study was to develop an approach for Phase 2 TNA Development that addressed the complexities of team and collective training, and provided guidance in sufficient detail for the approach to be implemented.
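The DIF analysis recommended above for rationalising the Operational Task Inventory can be sketched in code. The following is a purely illustrative sketch: the task names, the 1-3 rating scales and the decision rules are our own assumptions for illustration, not taken from JSP 822.

```python
# Illustrative sketch of Difficulty-Importance-Frequency (DIF) rationalisation
# of an Operational Task Inventory (OTI). Ratings and decision rules are
# hypothetical, not drawn from JSP 822 guidance.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    difficulty: int   # 1 (easy) .. 3 (hard)
    importance: int   # 1 (low)  .. 3 (critical)
    frequency: int    # 1 (rare) .. 3 (routine)

def dif_decision(task: Task) -> str:
    """Map DIF ratings to a training decision (one possible rule set)."""
    if task.importance == 1 and task.difficulty == 1:
        return "no formal training"   # easy, low-stakes: learn on the job
    if task.difficulty == 3 and task.frequency == 1:
        return "overtrain"            # hard and rarely practised
    return "train"

oti = [
    Task("Start auxiliary generator", difficulty=1, importance=1, frequency=3),
    Task("Coordinate casualty evacuation", difficulty=3, importance=3, frequency=1),
    Task("Log routine comms traffic", difficulty=1, importance=2, frequency=3),
]

for t in oti:
    print(f"{t.name}: {dif_decision(t)}")
```

The value of such a scheme is that the rationalised inventory, rather than the full OTI, carries forward into the Training Gap Analysis.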
[Figure 2: The MoD TNA process. The flowchart shows responsibilities (project sponsor, Lead Training Authority, MOD TNA steering group, MOD or contractor, relevant agencies and the Training Authority) across the process: the Phase 1 TNA Scoping Study, the TNA Final Report (Deliverable 4), the decision on whether the training equipment contract can be awarded, training design and development, training equipment design and development, the Ready for Training Date (RFTD), training delivery and training evaluation.]
3.2 Definitions
The teamwork literature suffers from the use of a multiplicity of definitions for commonly used terms such as team and teamwork. Furthermore, the definitions themselves often manifest a confusion of team member attributes, interactions, processes, functions, strategies, behaviours and products. For example, Nieva et al (1978) defined a team as "two or more interdependent individuals performing coordinated tasks towards the achievement of specific task goals", whereas Salas et al (1992) defined a team as "two or more individuals, who have specific roles, perform interdependent tasks, are adaptable and share a common goal". In the Salas et al (1992) definition, a team is defined as being adaptable and sharing a common goal. In a poorly performing team, however, team members may not necessarily be adaptable (or want to be adaptable) and may not necessarily share a common goal, which is why inadequate conflict resolution is cited as a factor in poor teamwork (Baker, Day and Salas, 2006). For the sake of clarity, the following definitions are used in this report:

Team: a number of persons constituting a work group, assembled together for the purpose of joint action.

Teamwork: the interactions and processes that occur between team members in response to environmental demands to help achieve the generation of team/collective task products.
Teamwork Interaction: a single instance of one team member causing an effect on another team member; actions that connect team members.

Teamwork Process: a systematic series of coupled team member interactions directed to some end.

Teamwork Competencies: the Knowledge, Skills and Attitudes (KSAs) that support teamwork directly.

Collective Task: a task involving (most usually requiring) the performance of more than one individual in a work group; the performance of joint action between team members towards a shared goal, generating a task product with specified value measures which may be evaluated in performance assessment. Joint action may be parallel (not dependent) or highly interdependent.
Whilst these patterns are of interest in that they facilitate characterisation of the dynamics of a team task in terms of interaction pattern, the model is of limited value from a TNA perspective, as an understanding of the nature of the interactions and the processes that they support is required in order to devise a training strategy and determine a suitable training environment. The remaining models identified provide detailed breakdowns of teamwork. These are summarised in Table 1 and Table 2. The full models are provided in Appendix A.
Annett (1997) proposed his teamwork model, shown in the first column of Table 1, as the theoretical framework for teamwork upon which his Hierarchical Task Analysis for Teams (HTA(T)) method was based. The cognitive process components are described as being the knowledge or beliefs that team members hold of the problem (world model), what other team members are doing or capable of doing (people model) and the team plan. Arguably these might be better described as knowledge components rather than cognitive processes. Annett (1997) observes that affective components such as morale and cohesiveness, whilst regarded as important by many, are problematic as it is unknown whether they are a cause of team behaviour, but suggests that they should be incorporated for completeness. The application of this model in HTA(T) manifests as the capture of descriptions of teamwork requirements for goals and sub-goals in terms of their communication and co-ordination requirements.

Bowers, Morgan, Salas and Prince (1993) devised a teamwork model which they used as the basis for a questionnaire to assess coordination demands in flight tasks, in order to inform the further development of aircrew co-ordination training. The eight dimensions of their model are shown in the second column of Table 1. This model has many more processes than the Annett model but does not consider affective or cognitive elements.

The Dstl (2006) model, shown in the third column of Table 2, was designed to underpin the Models for Analysis of Team Training (MATT) approach to the development of team training. It is notable in that it provides a much more extensive list of underpinning knowledge and affective elements than the other models, although the definitions of the affective elements appear to overlap.

The Salas et al (2005) model, shown in the first column of Table 2, is structured differently from all the other models in that it is split into five core components of teamwork and three coordinating mechanisms.
This model was developed on the basis of the analysis of twenty selected teamwork models. The Rousseau et al (2006) model, shown in the second column of Table 2, was developed on the basis of the analysis of twenty-nine teamwork models. A notable difference between this model and the Salas et al model is that leadership is specifically excluded from the list of behaviours considered, although a detailed reason for its exclusion is not offered. It is also the only model that includes conflict management as a teamwork behaviour.

When all five models are compared, there are only two categories of behaviour, communication and coordination, which are common to all. Furthermore, in a number of instances behaviours with the same name are defined in different ways. For example, Rousseau et al (2006) define mission analysis as being concerned with the identification of tasks, environmental conditions and the team resources available for utilisation in undertaking the task, whereas Bowers et al (1993) include the development of plans and the allocation of people and resources to tasks. In addition, similar sets of behaviours in different models are given different labels. For example, Rousseau et al (2006) use the term intra-team coaching as the label for the provision of feedback and distinguish it from backup behaviour, whereas it is explicitly labelled as feedback in the backup behaviours category in the Dstl (2006) model. Even the category of coordination, which is common to all of the models, is defined differently in every model.
Given the lack of standardisation of terms, the differences in scope of each model, and the fact that all of the models contained unique categories, no single model was an obvious candidate to underpin the analysis of teamwork for TCTNA. Therefore, a synthesis of the models was considered necessary.
Leadership
Goal specification
Planning
Task prioritization
Task assignment
Control
Coordination
Monitoring team performance
Performance assessment
Motivating team
Creating positive atmosphere
Situational awareness
Task coordination
Workload management
Information coordination
Resource coordination
Collaborative problem solving
Gathering required information
Identifying potential solutions
Evaluating alternative solutions
Forming consensus on best alternative
Collaborative planning
Conflict management
Task allocation
Resource allocation
Adair (1997) identifies eight leadership functions: defining the task, planning, briefing, controlling, evaluating, motivating, organising and providing an example. These functions are identified as part of his action centred leadership model, which is commonly taught during leadership training in the British Armed Forces. The view was taken that whilst leadership is a role with responsibilities for ensuring that effective teamwork is taking place in order to secure task delivery, the role is greater than simply being a teamwork process. Therefore, leadership was removed from the teamwork process list.
3.3.2.1.1.2 Situational Awareness

Situational awareness was challenged as a category in the light of the most recent HFI DTC research into the concept of Distributed Situational Awareness (DSA), published in Salmon, Stanton, Walker and Jenkins (2009). Salmon et al advance DSA as an alternative view to shared situational awareness (SA) and posit that DSA is a system-held construct. They suggest that different individuals with different roles will form different mental models of the situation even if they are presented with the same information, the differences being attributable to their roles. They suggest that what is important is the alignment of their perceptions of the situation where their actions and activities are related. In their DSA model they assert that team attributes such as cohesion and team processes such as communication contribute to the achievement of appropriate DSA, but it is not solely a teamwork process. Therefore, situational awareness was discounted as a teamwork process.

3.3.2.1.1.3 Task and Resource Allocation

The issue concerning both task allocation and resource allocation was whether they were best considered as processes or interactions. The view advanced was that, based on observation of simple instances of teamwork, they should be characterised as interactions. For example, in the case of two pilots operating an aircraft, the handling pilot may direct the non-handling pilot to take over responsibility for talking on the radio. This was considered to be different to the more extended process of workload management, which may result in a revision of task allocations instantiated through a number of task allocation interactions. Similarly, resource allocation, such as an infantry section commander directing a member of the section to take six grenades, could be considered to be a simple interaction, different in nature to the larger process of resource coordination. Task and resource allocation were therefore re-categorised as teamwork interactions.

3.3.2.2 The Teamwork Taxonomy

The final version of the taxonomy, with definitions for each element, is shown in Table 4.
[Table 4: the teamwork taxonomy, with definitions for each element, including categories such as collaborative planning and conflict management.]
Figure 5 Input-Process-Output (IPO) paradigm for analysis of group interaction as a mediator of performance outcomes (Hackman and Morris, 1975)

This model shows that both individual and team factors influence team performance. It also identifies the nature of the task and environmental factors as inputs, including stresses imposed by the environment. One issue that was identified by Hackman and Morris (1975) is that there are other types of output apart from performance outcomes, represented in the dashed box in Figure 5. Some of these 'other outcomes' are in fact input variables, such as group cohesiveness. That outputs can become inputs to the group interaction process is no particular surprise to anyone who has worked in a team and had their relationship with other team members change as a result. The lack of representation of feedback would therefore seem to be a limitation of the model. Later researchers extended the IPO paradigm for team effectiveness by including other sub-categorisations within the input, process and output categories and by capturing some notion of feedback between inputs and outputs. Figure 6 below shows a later model of team effectiveness, from Tannenbaum, Beard and Salas (1992), which was considered in a recent NATO study (NATO, 2005) to be the most appropriate to command team effectiveness of the models that they reviewed.
Figure 6 Model of Team Effectiveness adapted from Tannenbaum, Beard and Salas (1992)

Whilst this model captures feedback to some degree, there are a number of surprising features about it:

- Team changes and individual changes are not shown as fed back into the individual and team characteristics.
- Individual characteristics are shown as acting only indirectly, through team characteristics, on team processes and team performance.
- Organisational and situational characteristics are shown as somehow influencing the whole of the input-throughput-output process, but explicit connections to the individual elements are not made.
- The work assignment and communications elements in work structure appear to be organisational factors.
- Team norms (in work structure) would appear to be a team characteristic.
- The throughput component only captures teamwork processes and not taskwork processes.
One of the latest team effectiveness models is the Command Team Effectiveness (CTEF) Model developed by NATO (2005) shown in Figure 7.
Figure 7 The Command Team Effectiveness Model with Basic Components and Feedback Loops (NATO, 2005)

This model captures both team and task related processes and shows task and team outcomes being fed back to the input conditions. It also illustrates the possibility of organisational learning if an after action review is conducted. The nature of the mission and task are captured as inputs but, surprisingly, the task environment is not mentioned explicitly. Marks, Mathieu and Zaccaro (2001) deal more explicitly with the notion of individual and team characteristics being both inputs to and products of team processes. They use the term emergent states to characterise the cognitive, motivational and affective states of teams, which they suggest are dynamic in nature and vary as a function of team context, inputs, processes and outputs. On first inspection, the notion of emergent states appears to have some utility from a training perspective, since it focuses attention on team performance as being a function of experience. This is pertinent since training events are designed to deliver experiences from which the team learn and modify their behaviour as required. From this perspective, capturing team emergent states in a team performance model has merit, as it would be a component that an instructional team should be monitoring. However, there are certain aspects of the construct which are problematic. The principal issue is that it can be argued that cognitive, motivational and affective states are held at an individual level, not at a team level.
In the same way that Salmon et al (2009) argue that situational awareness is a function of differing elements of situational awareness being held by individuals and system elements, team properties may be construed as being a function of the state of the individuals that make up the team, each of whom may well be reacting differently at a given instant in time based on the specific experience they are having and their own KSAs and prior experiences. On
this basis, it can be argued that it is more useful to apply the concept of emergent states at the individual level within a team performance model. This has the advantage of focussing attention on how the strengths and weaknesses of individuals are contributing to team performance in a dynamic way. One of the few models that captures the notion of the team responding to cues from the environment and taking actions to affect the environment is that of Roby (1968), shown in Figure 8. Whilst the terminology used in the model is relatively unfamiliar, it is in essence presenting a straightforward information processing model cast in group terms.
Figure 8 Information Transduction Model of Group Activity on the Task Environment (Roby, 1968)

Based on the analysis of the models reviewed, the following were considered to be the key elements that a team performance model should capture:

- The nature of the environment in which the task is performed, including the demands that it places on the team
- The nature of the task
- The nature of the team in terms of its characteristics and organisation
- The characteristics of the individuals in the team and their emergent states
- Task and other outcomes
- The connections between all of the elements, including feedback loops
As none of the models reviewed captured all of these elements completely, a new model had to be developed.
Figure 9 Properties of Naturalistic Environments, Environmental Stressors and Environmental Task Demands
The model is illustrated using the example of a medical team deployed in a field hospital to give concrete examples of the constructs used.
Figure 10 The Team/Collective Performance Model

3.4.3.1 Task Environment

The task environment is composed of the physical environment, human elements, systems, manned systems and resources. Also captured are the environmental task demands that place stress on the team. In the field hospital example the physical environment would include the tents that the team work within. Environmental characteristics such as extreme temperature would also fall into this category. Human elements are all the people outside of the team that the team interact with. In the field hospital case this category would include patients and personnel at field dressing stations that they communicate with. Systems are all the elements that have interfaces that the team use, and would include medical systems such as Electrocardiograms (ECGs) and ventilators as well as such items as communication systems. Manned systems are crewed systems external to the team that the team interact with. For a field hospital these might include field ambulances and support helicopters providing casualty evacuation. Resources are all the
other items that the team use, including equipment, such as hospital trolleys and forceps, and consumables such as dressings, drugs and water. Environmental task demands are the factors that in some way stress the team. In the field hospital case these might include high workload due to large numbers of casualties, and performance pressure and time pressure caused by a critically ill patient requiring urgent, life-saving treatment.

3.4.3.2 Performance Outcomes

Team processes generate Task Products and Other Outcomes, both of which constitute a modification to the task environment. In the field hospital example the principal task product is successfully treated casualties; this constitutes the achievement of the task goal. Other Outcomes are ancillary modifications to the task environment which are concomitant with task performance (though not necessarily goal achievement), i.e. what changes in the environment as a result of the task being performed that is not directly goal-related. These might include resources used, such as bandages, dressings, syringes and units of blood, and human elements being affected, such as untreated casualties worsening in condition. Other outcomes also include effects on individuals and the team as a whole. These might include the team organisation having to be changed because of a team member being injured, team members becoming fatigued, or knowledge gained by individuals from experiencing a new situation. Therefore, performance outcomes feed back to team properties and team member characteristics, as well as to the environment.

3.4.3.3 Team Processes

Team processes are the team's response to the environmental inputs and are composed of both teamwork and taskwork elements. Their purpose is to generate appropriate task outcomes to achieve the required goal. They also have the side effect of generating other outcomes. The connection between team processes and task products is shown as two-way, as the team may adjust its processes in the light of success or failure to generate the required outcomes. Environmental task demands have a critical influence on team processes. For example, if a resource such as an item of equipment is limited in availability, then the team will have to devise an allocation or sharing mechanism for that resource. Another example would be team workload pressures. If the medical team is working under huge task environment input demands, with large numbers of casualties to be treated, they will need to manage task allocation carefully in a way that enables the team to be effective in that situation. The conduct of team processes will be influenced by the characteristics of the team members (their KSAs) and the properties of the team, both in terms of the organisational factors of the team (structure, roles etc.) and attributes such as cohesion and adaptability.
3.4.3.4 Team Properties

Team properties include both organisational aspects (such as organisational structure, roles and role allocation, and team size) and team attributes (such as cohesion, adaptability and morale). These are affected by the conduct of team processes and the outcomes of the processes, as well as by the characteristics of the individuals in the team. In the field hospital there may need to be an adjustment to role allocation to handle particularly demanding casualty levels or to deal with a casualty who has come into contact with a chemical agent. Replacement of a team member with another who does not have the same degree of team orientation as his predecessor may affect team cohesion and morale. Successful treatment of large numbers of casualties who arrived in a short space of time may boost team cohesion and morale.

3.4.3.5 Team Member Characteristics

Team member characteristics include their teamwork and taskwork KSAs and their emergent states. Emergent states reflect the dynamic nature of individual performance capabilities, influenced by the environment, the experience of carrying out the team processes including teamwork interactions, and the properties of the team. A senior surgeon coaching a junior surgeon may result in the junior surgeon extending his knowledge, skills and self-confidence. Similarly, working in a highly cohesive nursing team may engender greater team orientation in a newly trained nurse in the team. On the other hand, seeing severely injured young soldiers who have been victims of Improvised Explosive Devices may have a severe emotional impact on a team member, reducing their effectiveness in their task.
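To make the relationships between the model's components concrete, they can be rendered as simple data structures. This is a purely illustrative sketch: all class and field names are our own, not part of the Team/Collective Performance Model itself.

```python
# Illustrative rendering of the Team/Collective Performance Model's components
# (task environment, team members, team properties, performance outcomes) as
# data structures. Field names are assumptions made for illustration only.

from dataclasses import dataclass

@dataclass
class TaskEnvironment:
    physical_environment: list   # e.g. tents, extreme temperature
    human_elements: list         # people outside the team
    systems: list                # e.g. ECGs, ventilators, comms systems
    manned_systems: list         # e.g. field ambulances, support helicopters
    resources: list              # equipment and consumables
    task_demands: list           # stressors, e.g. high workload, time pressure

@dataclass
class TeamMember:
    taskwork_ksas: list
    teamwork_ksas: list
    emergent_state: str          # dynamic, e.g. "fatigued", "confident"

@dataclass
class Team:
    members: list                # TeamMember instances
    structure: str               # organisational aspects: roles, team size
    attributes: dict             # e.g. {"cohesion": "high", "morale": "good"}

@dataclass
class PerformanceOutcomes:
    task_products: list          # goal-related, e.g. treated casualties
    other_outcomes: list        # fed back to environment, team and members
```

The feedback loops in the model correspond to updating the `TaskEnvironment`, `Team` and `TeamMember` fields after each cycle of team processes.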
Figure 11 Team Training Cycle, adapted from Tannenbaum et al (1998)

The set of instructional functions is completed by the addition of initial instruction. The supporting functions that need to be added to this model concern the configuration, control, monitoring and adaptation of the practice environment. All of these functions will typically require supporting resources and systems. For example, an instructor observing a tank squadron will probably not follow it on foot across the exercise area; using a four-wheel-drive vehicle would be somewhat more convenient. Observing the same tank squadron exercising in a synthetic training environment would require some means of accessing the virtual environment in which they were exercising to make equivalent observations. These requirements are illustrated in Figure 12 as aspects of the training overlay.
Figure 12 The Team/Collective Training Model Figure 12 shows the instructional functions applying to team processes, team properties, team member characteristics and performance outcomes. Environment management functions are connected to the task environment.
A further literature review identified two more applicable methods:

- Mission Essential Competencies
- Models for Analysis of Team Training
Figure 13 Example HTA(T) Chart

Whilst the model of teamwork is limited, the tabular format supporting the HTA chart could be amended easily to contain other pertinent information. HTA has been described by Kirwan and Ainsworth (1992) as "the best known task analysis technique" (p. 396). The representation of instructional scalars (maps of training objective hierarchies) in all MoD training documentation is based on the HTA chart format (without the plans) and so will be familiar to training needs analysts. As such, HTA(T) is a strong candidate for inclusion in any TCTNA method for the analysis of tasks and identification of teamwork components.
secondly, conducting a coordination analysis. Coordination analysis involves selecting a teamwork taxonomy (none is specified) and having Subject Matter Experts (SMEs) rate each item in the taxonomy on a Likert scale from 0 to 10, indicating the degree to which the item is required for each task that has been identified. Whilst the notion of rating teamwork processes in terms of their relative importance in a task appears to have merit, the absence of detailed guidance or a teamwork taxonomy makes TTA a weak candidate for inclusion as a component of TCTNA.
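The coordination analysis step described above can be sketched in code. The tasks, taxonomy items and ratings below are hypothetical examples of our own; the method itself only specifies that SMEs rate taxonomy items per task on a 0-10 scale.

```python
# Illustrative sketch of a Team Task Analysis coordination analysis: several
# SMEs rate each teamwork-taxonomy item on a 0-10 Likert scale for each task,
# and ratings are averaged to show where coordination demands are highest.
# Task names, taxonomy items and scores are hypothetical.

from statistics import mean

# ratings[task][taxonomy_item] = list of individual SME ratings (0-10)
ratings = {
    "Hand over patient": {
        "communication": [9, 8, 9],
        "backup behaviour": [4, 5, 3],
    },
    "Triage casualties": {
        "communication": [7, 6, 8],
        "backup behaviour": [8, 9, 7],
    },
}

for task, items in ratings.items():
    for item, scores in items.items():
        print(f"{task} / {item}: mean coordination demand {mean(scores):.1f}")
```

A table of mean demands per (task, taxonomy item) pair would then indicate which teamwork processes merit the most attention in training design.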
This approach does not address instructional methods or the related instructional and supporting functions, although potentially the abstraction hierarchy approach could be adopted to identify the instructor functions and then map them into implementation for a given training environment. Table 5 Work Domain Analysis Abstraction Hierarchy (Naikar and Sanderson, 1999)
Functional Structure
- Functional Purposes: why a domain exists or the reason for its design
- Properties and Values: criteria for ensuring that purpose-related functions satisfy system objectives
- Purpose-related Functions: functions that must be executed and coordinated
- Physical Functions: functionality afforded by physical devices in the work domain and significant environmental conditions
- Physical Form: physical devices of the work domain and significant environmental features

Training Needs
- Training Objectives: the purpose for training workers to fulfil the functional purposes of a work domain
- Measures of Performance: criteria for evaluating trainee performance or the effectiveness of training programmes
- Basic Training Functions: functions that workers must be competent in executing and coordinating
- Physical Functionality: workers must be trained to exploit the functionality of physical devices and operate under various environmental conditions
- Physical Context: workers must be trained to recognise functionally relevant properties of physical devices and significant environmental features

Functional Specifications
- Design Objectives: the training system must be designed to satisfy the training objectives of the work domain
- Data Collection: the training system must be capable of collecting data related to the measures of performance
- Scenario Generation: the training system must be capable of generating scenarios for practising basic training functions
- Physical Functionality: training systems must simulate the functionality of physical devices and significant environmental conditions
- Physical Attributes: the training system must recreate functionally relevant properties of physical devices and significant features of the environment
training and is therefore the most comprehensive of the methods reviewed. The major stages are shown in Table 6. Table 6 MATT Process Stages (Dstl, 2006)
Team Task Analysis
- Stage One: Establish principal features of the team's task and main measurement requirements
- Stage Two: Identify Team Goals, Team Tasks, and Supporting Team Tasks
- Stage Three: Identify Team Processes and Team Errors
- Stage Four: Identify Team Knowledge and Attitude Requirements
- Stage Five: Identify priorities for Team Training

Team Training Design
- Stage One: Establish principles of Team Training
- Stage Two: Specification of Team Training Media

Team Training Assessment
- Stage One: In-Training Performance Assessment
- Stage Two: Post-Training Performance Assessment
The task analysis method is based on HTA(T), followed by the identification of likely errors and consequences. Teamwork requirements are evaluated by rating the importance of the main categories of teamwork identified in the teamwork taxonomy as high, medium or low. A table of suggested knowledge and attitude requirements is provided. Team training priorities are determined by a consideration of difficulty in learning tasks, ease of forgetting, when the task will be performed after training, task frequency and task familiarity (previous experience). It is suggested that the output of the analysis would lead to the construction of a description of the principal features of the team task, a set of matrices and a prioritised list of training activities. Whilst the guidance provided is reasonably lengthy, it is surprising that there is no explicit guidance about how the suggested outputs are derived from the recommended analysis steps or how these outputs should be used subsequently. Although the analysis is comprehensive, the difficulty arises in then taking the data forward into the training design phase, as the guidance is limited to a list of training principles and some suggested environment options for different categories of training task. Teams are categorised as requiring training at stages equivalent to Fitts' (1964) cognitive, associative and autonomous stages, the significance being that novice and intermediate training is argued to require lower-fidelity training environments. Notwithstanding the validity or otherwise of this assertion, there is no guidance provided on fidelity analysis or training environment specification. Detailed guidance on the development of assessment instruments is provided, appropriate to the detailed course design stage.
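One way to make the priority factors above operational is a simple weighted scoring scheme. This is an assumption of ours for illustration: MATT lists the factors (difficulty of learning, ease of forgetting, delay before the task is performed after training, frequency, familiarity) but does not prescribe a numerical scheme of this form.

```python
# Illustrative sketch of deriving team training priorities from the MATT
# factors. Weights, the 1-5 analyst ratings and the additive scoring are
# hypothetical assumptions, not part of the MATT guidance.

FACTOR_WEIGHTS = {
    "difficulty": 3,      # hard-to-learn tasks get priority
    "forgetting": 2,      # easily forgotten skills need refreshing
    "delay": 2,           # long gap before first performance after training
    "frequency": 1,
    "unfamiliarity": 2,   # little previous experience
}

def priority(scores: dict) -> int:
    """Weighted sum of factor scores (each rated 1-5 by analysts)."""
    return sum(FACTOR_WEIGHTS[f] * scores[f] for f in FACTOR_WEIGHTS)

tasks = {
    "Mass-casualty triage": {
        "difficulty": 5, "forgetting": 4, "delay": 4,
        "frequency": 2, "unfamiliarity": 5,
    },
    "Routine ward handover": {
        "difficulty": 2, "forgetting": 2, "delay": 1,
        "frequency": 5, "unfamiliarity": 1,
    },
}

# Rank tasks with the highest training priority first
ranked = sorted(tasks, key=lambda t: priority(tasks[t]), reverse=True)
print(ranked)
```

Any such scheme would need its weights agreed with SMEs; the point is only that the five factors can be combined into a defensible, repeatable ranking rather than an ad hoc judgement.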
In summary, the approach is strong from the perspective of the task analysis element, although no diagrammatic techniques are used, so it is sometimes difficult to visualise the relationships between data collected at different stages of the analysis. However, it is weak in giving guidance on training methods, instructor tasks and the specification of instructional environments.
4.3 Evaluation
The review of the available methods shows that no single existing method is sufficient in scope to satisfy the requirements of TCTNA. HTA(T) looks to be a promising method for conducting the core of the task analysis, and the principles of the WDA abstraction hierarchy look to have application for the identification of instructor functions and in the approach to specifying training environments and training scenarios. A TCTNA method was therefore developed; see Section 5.
Figure 14 The Team/Collective Training Model

The Team/Collective Training Model in Figure 14 provides the framework for data collection within the TCTNA method. Each component of the model is addressed during the analysis process.
Figure 15 The Triangle Model of TNA

Constraints on the final training solution have to be captured at an early stage to ensure that analytical effort is not wasted on exploring training solutions that are untenable. For example, a lack of available submarines might preclude live training for anti-submarine warfare, leaving a synthetic training solution as the only feasible option. Constraints analysis (described first) is an ongoing process throughout the analysis. The Team/Collective Task Analysis component combines operational task analysis and training gap analysis from the extant TNA model, along with their outputs. At the start of this analysis a number of models are made that capture the nature of the environment in which the team operates, as well as the organisation of the team and its communications structure. Modelling techniques derived from software engineering are exploited to provide useful visualisations. This ensures that the analyst develops a clear understanding of the context of the task, as well as providing valuable information for the subsequent specification of training environments. The core of the task analysis is conducted using an extension of HTA(T). The teamwork elements of the task are characterised using the teamwork taxonomy developed to underpin the Team/Collective Training Model. The identification of training priorities is conducted using a risk management approach. Training Overlay Analysis consists of identifying both appropriate methods for facilitating the required training and generic scenarios that the training environment must ultimately support if effective training is to be delivered. Borrowing
from the concept of the abstraction hierarchy from Work Domain Analysis, instructor functions are identified that are required to set up and deliver training. In team and collective training there are frequently many instructional staff involved, with a variety of roles. It is essential that these roles are identified and that consideration is given to what facilities are required to support them (such as capturing performance data and conducting After Action Review). Training Environment Analysis focuses on the specification of the required training environments. This includes a fidelity analysis, which cannot be conducted until the training method has been identified: a part task training environment, for example, will have very different fidelity requirements to an environment for full mission training. The analysis also considers the identification of the interfaces that instructors would require to control training devices such as simulators, and of the other tools that they may require to fulfil their role, such as tools to support the capture of data about student performance during exercises. The sub-components of each stage of the TCTNA method are shown in Table 7, along with a list of the supporting templates that are provided.

Table 7 TNA Triangle Stage Sub-Components
Constraints Analysis
o Constraints Analysis. Supporting template: Constraints Table.

Team/Collective Task Analysis
o External Task Context Description. Supporting templates/notation: Generic Scenario Table; External Team Context Diagram; Interaction Table; Environmental Description Table; Environmental Task Demands Table.
o Internal Task Context Description. Supporting templates/notation: Organisational Chart; Role Definition Table; Internal Team Context Diagram; Interaction Table; System Matrix; Communications Diagram; Communications Matrix.
o Hierarchical Task Analysis for Team/Collective Training (HTA(TCT)). Supporting templates/notation: HTA(TCT) Diagram; Task Sequence Diagram; HTA(TCT) Task Description Tables; Task Role Matrix.
o Teamwork Analysis.

Training Overlay Analysis
o Instructional Methods Selection and Training Scenario Specification. Supporting templates/notation: Practise and Assessment Methods Table; Training Objective Generic Scenario Table; Environment Description Table.
o Instructional Task Identification and Training Overlay Requirement Specification. Supporting templates/notation: Instructor Task Table; Environment Description Table.

Training Environment Analysis
o Fidelity Analysis, Training Environment Specification and Training Environment Rationalisation.
o Training Environment Option Identification and Training Environment Option Definition. Supporting templates/notation: Training Environment Option Description Table; Training Environment Option Properties Table.
o Training Environment Option Evaluation. Supporting templates/notation: Training Environment Options Comparison Table; Environment Object Specification Tables.
A Tornado F3 Pairs training example is used to illustrate various elements of the analysis phases. The background to this example is shown in the box below.
Tornado F3 Pairs Training Example

The use of a coordinated pair of mutually supporting aircraft has been central to fighter tactics since WW1. The example used as a case study running through the three analysis phases is based on the requirement to provide enhanced training in pairs tactics for Tornado F3 pilots and Weapons Systems Operators (WSOs) in training. The detail is derived from research data from a transfer of training trial conducted to establish the effectiveness of a networked desktop computer system for training pairs tactics. This example has been chosen because it places extreme demands on the teamwork skills of the crews. The context is a pair of fighters patrolling an area of airspace searching for bandit fighters or bombers. The pair consists of a lead aircraft and a wingman aircraft. The lead aircraft controls the intercept. The search is conducted by the WSOs using the air-to-air radars. The search space is divided between them to improve the efficiency of the search. A ground-based or airborne Fighter Controller, if present, can also give vectors to a bandit aircraft. If bandits are detected and the pair have a tactical advantage, an intercept ensues. As the WSOs are looking at different sectors of airspace, the WSO who has detected the bandit has to give directions to the other WSO so that he can get the same radar picture. This is referred to as the radar meld. The aim of the pair is initially to destroy the bandits Beyond Visual Range (BVR) with radar-guided missiles. If there is a single bandit, the leader will engage it whilst the wingman flies in support, ready to take a back-up shot if required. If there are two bandits, then both aircraft will engage a bandit after agreeing who is going to attack which one. The WSOs direct the radar-based intercept. If the BVR intercept is unsuccessful but the pair still have a tactical advantage, a visual intercept ensues. The WSO talks the pilot's eyes onto the bandit location during the merge into the visual. Once the pilot can see the bandit he takes over the intercept, aiming to shoot the bandit down with heat-seeking missiles. During the visual intercept the WSOs provide an extra pair of eyes to watch for threats at visual range.
6 Constraints Analysis
6.1 Introduction
Constraints analysis is a simple but critical component of TCTNA. Accurate identification of the constraints on the choice of training method and environment prevents nugatory analytical effort in exploring training options which are not viable. Typical categories of constraints include, but are not limited to:
a. Safety. Safety can be a significant constraint on the choice of training environment. A typical example is limitations on live firing: invariably, weapons effects have to be simulated.
b. Cost. The cost of using training assets (a Tornado F3 costs between 10K and 40K per hour to operate, depending on what is included in the cost model) and consumables (missiles can cost from tens to hundreds of thousands of pounds each, depending on type).
c. Training audience availability. There may be limits on how long a training audience is available for training and on when that availability falls.
d. Resource availability. Limitations on the availability of training areas, equipment required for training, and daily and seasonal weather conditions (for example, the British Army Training Unit Suffield training area in Canada is unavailable during the winter months as it is frozen over) can all limit training options. It may also be difficult to present credible threats if the weapons platforms required are not within the current Order of Battle.
e. Policy. There may be policy constraints on how training is conducted, such as the qualifications required for instructors to conduct certain types of training and the requirement for simulation to be explored as an option.
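Where the constraints captured under these categories are to be recorded and queried systematically, a simple structured record suffices. The following is a minimal sketch, assuming a record structure of our own devising: the field names (`category`, `description`, `implication`) and the example entries are illustrative, not part of the TCTNA Constraints Table template.

```python
from dataclasses import dataclass

# Illustrative record for a constraints table entry. The field names
# are assumptions for this sketch, not the official template headings.
@dataclass
class Constraint:
    category: str       # e.g. "Safety", "Cost", "Policy"
    description: str    # the constraint itself
    implication: str    # effect on candidate training options

# Example entries drawn from the categories discussed above.
constraints = [
    Constraint("Safety", "No live missile firing permitted",
               "Weapons effects must be simulated"),
    Constraint("Resource availability", "No submarines available",
               "Anti-submarine warfare training must be synthetic"),
]

def implications_for(category, table):
    """Return the training-option implications recorded for a category."""
    return [c.implication for c in table if c.category == category]

print(implications_for("Safety", constraints))
```

Kept in this form, the table can be re-queried each time a candidate training option is considered, which supports the point above that constraints analysis is ongoing rather than a one-off step.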
7.2.2.1 Team Context Diagram Notation

The notation for drawing a TCD is shown in Figure 16. The TCD shows two types of information of interest: the elements in the environment that the team interacts with, and the nature of the interactions themselves.
Figure 16 Team Context Diagram Notation

The TCD is constructed using the following components:
Team Circle. The central circle is labelled with the name of the team or collective unit that is under analysis.
Environment Element Boxes. The rectangular boxes are labelled with the names of the elements in the environment that the team interacts with. A single box can be used to represent a set of elements of the same type, provided the interactions between the team and every element in the set are the same. A diverse range of element types can be represented in the diagram, including, but not limited to:
o Superior organisation/team
o Subordinate team(s)
o Peer organisations/teams (entities directly under the control of the superior organisation, such as other companies in a battle group)
o Coalition forces
o Enemy
o Civilian elements (Non-Government Organisations, Civilian Authorities, Media)
o Neutral forces
o Systems the team operate (such as sensors and weapons)
o Physical environment (such as terrain and weather)
Data Stores. Data stores show where a team and an environmental element (such as a sub-unit) can access a shared data area, such as an intranet page or filestore, to post or retrieve information.
Interaction Arrows. The arrows on the diagram show whether the interactions are one-way or two-way. An arrow can represent multiple interactions (this avoids cluttering the diagram with multiple arrows between elements). Multiple interaction arrows are used where the team can interact both directly and indirectly through a data store.
Interaction Label. Having reduced the complexity of the diagram by using a single arrow to represent one or more interactions between the team and a given environment element, it is necessary to have a cross-reference to the interaction arrow so that more detail can be referenced to the diagram in the Interaction Table. A textual description could be used, but this would only be effective for a single, one-way interaction. It is recommended that numbers are used, for simplicity.
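For analysts who want to keep the diagram's content in a machine-readable form alongside the drawing, a TCD reduces to a small graph: a team node, environment element and data store nodes, and numbered arrows. The sketch below is our own illustrative representation, not part of the method; the element names are taken from the Tornado example, and the arrow labels mirror the numbering convention recommended above.

```python
# Minimal machine-readable sketch of a Team Context Diagram (TCD).
# Structure and field names are illustrative assumptions only; the
# method itself prescribes a diagram, not a data format.
tcd = {
    "team": "F3 Pair",
    "elements": ["Fighter Controller", "Bandit Fighters", "Weather"],
    "data_stores": ["JTIDS picture"],
    # Numbered arrows: each may carry several interactions, and the
    # number cross-references the entries in the Interaction Table.
    "arrows": [
        {"label": 1, "between": ("F3 Pair", "Fighter Controller"), "two_way": True},
        {"label": 2, "between": ("F3 Pair", "JTIDS picture"), "two_way": True},
        {"label": 3, "between": ("Weather", "F3 Pair"), "two_way": False},
        {"label": 4, "between": ("F3 Pair", "Bandit Fighters"), "two_way": True},
    ],
}

def elements_interacting_with(team, model):
    """List every element or data store connected to the team by an arrow."""
    out = set()
    for arrow in model["arrows"]:
        a, b = arrow["between"]
        out.add(b if a == team else a)
    return sorted(out)

print(elements_interacting_with("F3 Pair", tcd))
```

A representation like this makes it easy to check that every drawn arrow has corresponding Interaction Table entries, and vice versa.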
7.2.2.2 Interaction Table Construction

The Interaction Table captures the detail of the various interactions shown in the TCD in terms of the content/nature of the interaction and the mode of interaction. The entries in the table are referenced by the interaction labels used in the TCD. The format of an Interaction Table is shown in Table 11.
The columns contain the following types of information:
From/To. These columns capture the direction of the interaction being described, noting that there will be separate entries for each direction of a two-way interaction arrow (as shown for interaction 1), and potentially more than one entry for the same direction if multiple interactions are captured by an interaction arrow (as shown for interaction 4).
Content/Nature. The heading Content/Nature has been used to reflect the fact that the interaction may be either communication, including the passing of an information product (such as sending an order or requesting information), or some form of action or effect (such as firing a missile or the effects of weather).
Mode. The mode of interaction is a description of how the interaction is mediated. For communication it will be the communication channel (such as voice by secure radio, or email). In the case of indirect communication through a data store, the data store should be identified, such as a web page where documents are posted (as shown for interaction 5).
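The column structure above maps directly onto a simple record type. The following sketch assumes our own field names and invented example rows (the real entries live in Table 12); it illustrates the rule that a two-way arrow yields one row per direction, and a multi-interaction arrow yields several rows sharing one label.

```python
from dataclasses import dataclass

# Illustrative Interaction Table row, keyed by the arrow label used on
# the TCD. Field names mirror the columns described above; the example
# entries are invented for this sketch.
@dataclass
class Interaction:
    label: int        # cross-reference to the TCD arrow
    src: str          # "From"
    dst: str          # "To"
    content: str      # content/nature of the interaction
    mode: str         # how the interaction is mediated

table = [
    Interaction(1, "Fighter Controller", "F3 Pair",
                "Target location and vectors", "Voice by secure radio"),
    Interaction(1, "F3 Pair", "Fighter Controller",
                "Requests for target information", "Voice by secure radio"),
    Interaction(2, "Fighter Controller", "F3 Pair",
                "Updates to tactical picture", "JTIDS data store"),
]

def rows_for_arrow(label, rows):
    """All table rows referenced by one TCD arrow label: one per
    direction for a two-way arrow, several for a multi-interaction arrow."""
    return [r for r in rows if r.label == label]

print(len(rows_for_arrow(1, table)))
```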
7.2.2.3 Example External Context Description

Figure 17 shows a TCD for the Tornado F3 Pairs Air Combat training example. The TCD in Figure 17 shows all of the elements external to the aircraft that interact with the pair and has therefore been labelled the external environment TCD. Weather and terrain have been included in the diagram, as the pair has to take these into account during intercepts: close proximity to the ground affects the geometry of intercepts, and visibility affects target detection and influences tactics. The bandit missiles and defensive aids have been shown separately from the bandit aircraft because they act independently to a greater or lesser degree once released. In most if not all training situations where weapons are involved, weapons effects have to be considered, and separate analysis of weapons is therefore required to specify a training environment.
Figure 17 Example Tornado F3 Pair External Environment TCD

Table 12 is the corresponding Interaction Table and shows a representative set of interactions associated with the interaction arrows in the external context TCD. These illustrate the use of one-way and two-way interactions, in single and multiple form, and interaction via data stores. Interaction 2 shows communication of information via the Joint Tactical Information Distribution System (JTIDS). JTIDS is a complex, secure communications system and it is arguably something of a stretch to represent it simply as a data store. However, it broadcasts messages and there is no explicit reply from receiving systems, in so far as one user does not know whether another user has received the data from their system or has chosen to view it. The issue from the training perspective is the potential for erroneous action if another user has not received or viewed information that is posted without requiring a receipt.
From: Bandit Fighters. To: F3 Pair. Content/Nature: evade radar guided missiles when detected on radar warning receiver; evade heat seeking missiles when seen; manoeuvre into attacking position.
From: F3 Pair. To: Bandit Bombers. Content/Nature: engage with radar guided missiles beyond visual range; engage with heat seeking missiles; evade.
From: Bandit Bombers. To: F3 Pair. Content/Nature: evade radar guided missiles when detected on radar warning receiver; evade heat seeking missiles when seen; evade when RWR indicates radar illumination by fighters.
From: Terrain. To: F3 Pair.
The information about the element and how it interacts with the team is critical from a training perspective, both for comprehending the team task and for informing how an element would be represented in the training environment, including the resources it would require. For example, a warfare team training in a synthetic environment may need to be fed sightings of enemy units attacking the ship from a bridge watchkeeper. Since the bridge does not exist (there is no window for a watchkeeper to look out of), whoever is role-playing the watchkeeper needs a representation of the environment (such as a top-down display showing the range and bearing of an enemy unit) in order to provide the information.
Given that there are likely to be numerous entries in the Environment Description Table, a spreadsheet will most likely prove a more appropriate tool for recording the data than a table in a text document. Table 14 shows Environment Description Table entries that correspond to the elements in the external TCD (Figure 17). Not all elements have an entry in every cell: in this case, it is not meaningful to describe what information the weather or terrain needs in order to interact with the team. Standard voice procedure and codewords are used in the communication between the Fighter Controller and the team; this is recorded in the Fighter Controller column.
Representative Table 14 entries include:
o Target location and vectors to the target by voice (note: standard voice procedure and codewords used); updates to tactical picture via JTIDS; requests for target information; target assignment information via JTIDS.
o Pursuit of the pair's aircraft based on radar and heat signatures and aircraft manoeuvre; detonation of warheads within range; deployment of defensive aids.
o Offensive and defensive manoeuvre, heat seeking and radar guided missile shots, deployment of chaff and flares.
o N/A (for elements, such as weather and terrain, where an entry is not meaningful).
Table 15 and Table 16 show example role definitions for the Lead and Wingman WSOs respectively. These roles are complementary, and this becomes apparent in the role descriptions. As these roles can also be dynamically reallocated during an intercept, this is captured by adding an extra field to the role descriptions concerning role allocation. This information can inform the subsequent analysis of the team task.
Systems used: Radar, JTIDS, RHWR, Radio, Intercom, Weapons System, Visual Scene.
Role allocation: usually held by the WSO in the Lead crew, but can be allocated to the Wingman WSO by the Lead WSO at the start of an intercept if the wingman has the tactically superior position in formation against the bandit.
Figure 19 Example Tornado F3 Pair Interfaces TCD

Table 17 is the corresponding Systems Interaction Table and shows a sample of the interaction descriptions. The challenge at this stage in the analysis is deciding on the level of detail to record. Interactions with systems tend to be quite complex, as both the data outputs and the input interactions are often complex. Ultimately a lot of detail may be needed, particularly if a synthetic training environment has to be specified. It is suggested that a relatively high-level description is used at this stage, as there is no point in doing extensive analysis until the decision has been made that there is a training requirement and training objectives have been established. If, during later stages of analysis, it is found that more detail is required, the table can always be amplified. Much of this detail is likely to appear as amplification in the Mode column.
Table 18 shows the corresponding entries in the Environment Description Table (noting that this is not a new table: these are additional columns in the table constructed for the previous TCD). The same issue about level of detail applies to this table, the guidance being to capture sufficient detail for the level of analysis being conducted. It is useful to include a reference to the appropriate technical document(s) should further detail be required. Precise descriptions of cockpit fields of view, for example, may become relevant if simulation is required as a training solution.

Table 18 Tornado F3 Pair Environment Description Table Entries
Element: Radio. Description: VHF and UHF radio with control panel, transmit buttons on pilot stick and WSO radar hand controller (Technical Reference). Outputs to team: voice comms.
Element: Radar. Description: Foxhunter multi-mode radar controlled through control panel, WSO hand controller and pilot hand controller, with front and rear cockpit displays (Technical Reference). Outputs to team: radar picture and current settings. Inputs from team: radar configuration settings.
Element: Visual Scene. Description: tandem seating limits WSO field of view forwards through the canopy; head up display central in pilot field of view; terrain and weather can be seen; sighting of bandit aircraft, missiles and defensive aids depends on range and visibility. Outputs to team: view of terrain, weather, bandits, bandit missiles and flares, other aircraft in pair. Inputs from team: N/A.
In the Tornado example, use of and access to the various systems are distributed between the pilot and the WSO in each aircraft. Some systems are accessible to both (radios, RHWR displays) whilst others are seat-specific (e.g. flying controls for the pilot). With larger teams this distribution becomes more complex. At this stage it can be useful to compile a matrix of who has access to which system, to aid understanding of team functioning. A systems matrix is one way to do this, as shown in Table 19.
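A systems matrix of this kind can be derived mechanically from per-role access lists. The sketch below is illustrative only: the role and system names follow the Tornado example above, but the representation and the helper function are our own, not part of the Table 19 template.

```python
# Build a systems matrix (who has access to which system) from per-role
# access lists. Names follow the Tornado example; the structure is a sketch.
access = {
    "Pilot": ["Radio", "RHWR display", "Flying controls"],
    "WSO":   ["Radio", "RHWR display", "Radar hand controller"],
}

# Column headings: every system any role can access, in a stable order.
systems = sorted({s for role_systems in access.values() for s in role_systems})

def matrix(access, systems):
    """Return {role: {system: bool}}, i.e. the systems matrix as a table."""
    return {role: {s: (s in has) for s in systems}
            for role, has in access.items()}

m = matrix(access, systems)

# Systems accessible to every role (the shared systems noted above).
shared = [s for s in systems if all(m[role][s] for role in m)]
print(shared)
```

With larger teams, generating the matrix this way avoids the transcription errors that creep in when the table is filled in by hand.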
The arrows on the diagram could be labelled with the communications mode, but even on a small diagram such as Figure 20 this could make the diagram too cluttered to read. The arrows have therefore been numbered so that the detail can be recorded in an associated Communications Description or Communications Matrix.
Communications Description

The crews (Pilot and WSO) communicate internally using the intercom (7, 8). Each crew member can communicate with both members of the other crew and with the Fighter Controller using the radio (1, 4, 6); radio transmissions are heard internally over the intercom by the other crew member. The WSOs can also communicate with each other and with the Fighter Controller using JTIDS (2, 3, 5). Pilots can communicate with the other crew visually by wing waggling (9, 10).
Figure 20 Example Tornado F3 Pair Communications Diagram and Textual Description

Table 20 Example Tornado F3 Pair Communications Matrix
Visual: Lead Pilot (X), Lead WSO (Rx only), Wingman Pilot (X), Wingman WSO (Rx only).
Intercom (leader): Lead Pilot (x), Lead WSO (x).
Intercom (wingman): Wingman Pilot (x), Wingman WSO (x).
Radio: Lead Pilot (x), Lead WSO (x), Wingman Pilot (x), Wingman WSO (x), Fighter Controller (x).
JTIDS: Lead WSO (x), Wingman WSO (x), Fighter Controller (x).
Ultimately, the communications diagram can be extended to form an Internal Context Diagram for the team by including other, non-communications interactions. However, this necessitates the determination of:
o which internal communications connections are actually used, and the content of the interactions on these connections
o the source (sub-team or team member) of interactions with external environment elements
o the recipients (sub-team or team member) of interactions from external elements.
Whilst some of this information may be available at this stage, and the Internal Context Diagram could therefore be started, it is likely that much of the information will be determined during the subsequent task analysis stage, and its construction is probably best left until then.

7.3.4.2 Communications Description

The Communications Description provides a simple way of presenting the information about the communication modes related to the arrows in the Communications Diagram. The numbers in brackets cross-refer to the numbers on the diagram. An example of such a description is shown alongside the Communications Diagram in Figure 20. For relatively small Communications Diagrams this is probably the most efficient way of presenting this information.

7.3.4.3 Communications Matrix

A Communications Matrix provides a tabular representation which captures the modes of communication that are possible, both internally between team members and between team members and external elements. Table 20 shows an example Communications Matrix for the Tornado F3 pair that corresponds to the Communications Diagram shown in Figure 20. The orientation of the matrix (whether the channels are shown as rows or columns) is simply a matter of convenience, based on the number of channels and the number of people/roles that are communicating. A Communications Matrix is of particular value when the communications network is large or complex and a Communications Description would become unwieldy.
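For a large or complex network, the matrix is also easy to hold and query in machine-readable form. The sketch below is our own illustrative representation of the F3 pair matrix described in the Communications Description (channels, roles and the receive-only visual access are taken from that description; the data structure itself is an assumption, not a TCTNA template).

```python
# Communications Matrix sketch: channel -> roles that can use it.
# "x" marks full access; "rx" marks receive-only access (as for the
# WSOs on the visual channel). Entries follow the F3 pair example.
comms = {
    "Intercom (leader)":  {"Lead Pilot": "x", "Lead WSO": "x"},
    "Intercom (wingman)": {"Wingman Pilot": "x", "Wingman WSO": "x"},
    "Radio":              {"Lead Pilot": "x", "Lead WSO": "x",
                           "Wingman Pilot": "x", "Wingman WSO": "x",
                           "Fighter Controller": "x"},
    "JTIDS":              {"Lead WSO": "x", "Wingman WSO": "x",
                           "Fighter Controller": "x"},
    "Visual":             {"Lead Pilot": "x", "Wingman Pilot": "x",
                           "Lead WSO": "rx", "Wingman WSO": "rx"},
}

def channels_for(role, comms, include_rx_only=True):
    """Channels a role appears on; optionally drop receive-only access."""
    return sorted(ch for ch, roles in comms.items()
                  if role in roles and (include_rx_only or roles[role] == "x"))

print(channels_for("Lead WSO", comms))
print(channels_for("Lead WSO", comms, include_rx_only=False))
```

Querying per role in this way gives exactly the information needed later, when specifying which communications channels a training environment must reproduce for each trainee position.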
7.4.1 Hierarchical Task Analysis for Team & Collective Training - HTA(TCT)
The Hierarchical Task Analysis for Team and Collective Training, HTA(TCT), approach is adapted from the approach developed by Annett (2000) under a previous MOD research contract. Tasks are hierarchically decomposed and represented in an HTA diagram. Supporting detail is recorded in a task description table. Task Sequence Diagrams are also produced to provide a temporal view of task execution. Finally, a Task Role Matrix is constructed to provide a quick reference to the participants in each task and sub-task. Data sources for the HTA(TCT) would typically include documentation (such as CONOPS, CONUSE, CONEMP and role group summaries, if available) and discussions with SMEs. Observation of the task may be possible if the TNA is being conducted to update extant training; for new systems this will not be possible. One of the more challenging questions when conducting any form of HTA is: at what point do you stop? The technically accurate answer is when the analysis is fit for purpose. In practical terms, for team and collective training, the decomposition can stop when sufficient information about how the team interacts has been gleaned. Typically this will be when an individual task has been reached, or a multi-actor task is sufficiently simple that further decomposition is not needed.

7.4.1.1 HTA(TCT) Diagram Notation and Format

The format used for the HTA(TCT) diagrams is shown in Figure 21.
Figure 21 HTA Diagram Format

The HTA(TCT) diagram presents two sorts of information. The bulk of the diagram is made up of a hierarchical presentation of the overarching task and the sub-tasks that must be achieved in order to accomplish it. The format will be familiar, as it is essentially the same as that used for an organisational chart or instructional scalar. In addition to the task hierarchy, plans are shown which describe the sequencing of each set of sub-tasks. The notation used for constructing the diagrams (adapted from Stanton, 2006) is detailed in Table 21. Rounded boxes are used for the plans so that they are easily distinguished from sub-tasks. Similarly, diagonal lines are used to link plans to the node where they apply, to avoid confusion with task and sub-task connectors.

Table 21 HTA(TCT) Diagram Element Symbols and Descriptions
Tasks and sub-tasks: contain the task/sub-task number, the sub-teams and team members conducting the task, and the task description.
Task and sub-task connectors: connect tasks to their sub-tasks.
Plans: labelled "Plan" plus the number of the task/sub-task to which the plan refers; contain the sequence in which the sub-tasks are executed.
Plan connectors: connect plans to the node where the sub-tasks connect to the task which they are amplifying.
The use of separate boxes to show plans is recommended simply on the grounds of clarity. The plan information could be shown in the task box, but the box could become rather large. That said, if charting software is used for drawing the HTA, it may not have the facilities for showing the plans separately, in which case the plan information can be located in the associated task box. Plans can be quite complex and, as a consequence, a textual description could be quite large and unwieldy on the HTA diagram. A suggested syntax for expressing plans more tersely (adapted from Stanton, 2006) is shown in Table 22. The choice between using the suggested syntax, text, or a mixture of both for describing a plan lies with the analyst. However, the guiding principle must be that of clarity: a triumph of syntactical dexterity over clarity serves no useful purpose to the reader of the HTA.
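The hierarchy-plus-plans structure of an HTA(TCT) diagram maps naturally onto a small tree data structure, which can be useful for cross-checking the diagram (for instance, listing where decomposition stopped). The sketch below is ours, not a tool the method prescribes; the task numbers and descriptions follow the worked example, but the plan wording is invented for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative tree node for an HTA(TCT) task. Field names are
# assumptions mirroring the diagram notation described above.
@dataclass
class Task:
    number: str                 # e.g. "1.4"
    description: str
    actors: list                # sub-teams/team members conducting the task
    plan: Optional[str] = None  # plan text or terse plan syntax
    subtasks: list = field(default_factory=list)

# Fragment of the example hierarchy; plan wording is illustrative only.
hta = Task("1", "Destroy bandit BVR", ["Pair"],
           plan="1.1 throughout; then 1.2; then 1.3 and 1.4; then 1.5",
           subtasks=[
               Task("1.1", "Check for bingo fuel", ["Pilots"]),
               Task("1.2", "Detect bandit BVR", ["WSOs"]),
           ])

def leaves(task):
    """Task numbers with no further decomposition, i.e. where the
    analysis stopped under the fit-for-purpose rule."""
    if not task.subtasks:
        return [task.number]
    out = []
    for sub in task.subtasks:
        out.extend(leaves(sub))
    return out

print(leaves(hta))
```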
7.4.1.2 Example HTA(TCT) Diagram

An example of an HTA diagram for the Tornado F3 pairs task "Destroy bandit beyond visual range" is shown in Figure 22. This shows the goal being sub-divided into five sub-tasks, numbered 1.1 to 1.5. With the exception of sub-task 1.1 (Check for bingo fuel), these sub-tasks are decomposed to another level. Sub-task 1.4 (Engage bandit BVR) has two further levels of decomposition. One of the challenges in constructing an HTA diagram is deciding how to lay it out and how much information to put on one page. Even with only three levels of decomposition, it can be seen from Figure 22 that the HTA diagram becomes quite large and that it is not always possible to show all items at the same level of decomposition on the same line. The decomposition for task 1.4 has been shown on the bottom half of the page to avoid having to split the diagram across more than one page. Figure 22 is close to the limit of the amount of information that can be shown on a single page and still be readable. Where a diagram needs to be split across multiple pages, appropriate sub-tasks should be selected for decomposition on other pages. In this example, sub-task 1.4 would be a candidate for decomposition on another page, as it is as complex as all the other sub-tasks put together and is the only one that has two levels of decomposition. Plan 1 and plan 1.4.3 illustrate the use of a mix of plan syntax and free text to describe the plans. This choice was made because, in each case, it was difficult to capture the complexity of the plan using the syntax alone, and a free-text description would have taken up too much room. Abbreviations for the names of the sub-team and team member roles have been used to conserve space in the task boxes. These are expanded in the legend at the bottom of the diagram.
Figure 22 Example HTA for the F3 Pair Task Destroy Bandit beyond Visual Range
7.4.1.3 Task Sequence Diagrams

HTA diagrams are a particularly effective means of representing the breakdown of a task into sub-tasks, as they capture both multiple levels of decomposition and detail about task sequencing. However, they have one limitation: it is difficult to get a visual appreciation of the relative sequencing of tasks. This is of particular significance from the perspective of workload, as sub-teams or team members may be involved in multiple concurrent tasks. To address this issue, the use of a supplementary representation of sub-task sequencing, not usually constructed during HTA, is recommended. The structure and notation recommended are adapted from those used for PERT charts (often used to show task sequencing for project management purposes). The diagrams have been termed Task Sequence Diagrams. This choice of technique was in part based on the observation that SMEs, when asked to draw a diagram showing the stages in a task, will often draw this style of diagram. Constructing such a diagram may therefore be helpful in eliciting information from SMEs for the purposes of constructing the HTA diagram. The notation is shown in Table 23. The notation for tasks and sub-tasks is the same as that used for HTA diagrams, for consistency, so that they can be copied from one diagram type to the other.

Table 23 Task Sequence Diagram Notation
Terminators: placed at the start and end of the sequence; contain a description of the initiating and terminating conditions for the task.
Tasks and sub-tasks: contain the task/sub-task number, the sub-teams and team members conducting the task, and the task description.
Horizontal arrows: show how a task extends over time, the arrow head being aligned with the end of the last task in the sequence that the activity occurs in parallel with.
Task connectors: used for linking tasks in sequence.
The format of a Task Sequence diagram is shown in Figure 23. This diagram shows how three tasks are sequenced. Sub-task 1.1 has a horizontal arrow extending to the right of it indicating that its execution continues until task 1.3 terminates. Sub-tasks 1.2 and 1.3 are shown as being conducted in sequence.
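The workload point that motivates these diagrams, namely that an actor may appear in several concurrent sub-tasks, can also be checked mechanically once sequencing is recorded. The sketch below uses our own interval representation (the numeric spans are invented for illustration, loosely mirroring the Figure 23 pattern in which sub-task 1.1 runs on until 1.3 terminates); it is not part of the Task Sequence Diagram notation.

```python
# Sketch: detect concurrent sub-tasks per actor from (start, end) steps.
# Spans are illustrative sequence positions, not measured times.
tasks = {
    "1.1": {"actors": {"Pilots"}, "span": (0, 3)},          # runs throughout
    "1.2": {"actors": {"WSOs"}, "span": (0, 1)},
    "1.3": {"actors": {"Pilots", "WSOs"}, "span": (1, 3)},
}

def concurrent_for(actor, tasks):
    """Pairs of overlapping tasks in which the actor participates,
    i.e. the points where workload stacks up for that actor."""
    mine = [(n, t["span"]) for n, t in tasks.items() if actor in t["actors"]]
    pairs = []
    for i, (n1, (s1, e1)) in enumerate(mine):
        for n2, (s2, e2) in mine[i + 1:]:
            if s1 < e2 and s2 < e1:   # the two intervals overlap
                pairs.append((n1, n2))
    return pairs

print(concurrent_for("Pilots", tasks))
```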
Figure 23 Task Sequence Diagram Format

Figure 24 shows a Task Sequence Diagram for the F3 pair Task 1 (Destroy bandit BVR). This captures the sequencing of the high-level sub-tasks 1.1 to 1.5, showing very clearly that sub-task 1.1 is conducted in parallel with firstly sub-task 1.2 and then with sub-tasks 1.3 and 1.4, which are themselves conducted simultaneously. Further diagrams can be drawn to show successive levels of detail, as illustrated by Figure 25, which shows the sequencing decomposition for sub-task 1.2, Detect bandit BVR.
Figure 24 Task Sequence Diagram for the F3 Pair Task 1 Destroy Bandit BVR
Figure 25 Task Sequence Diagram for the F3 Pair Sub-Task 1.2 Detect Bandit BVR

7.4.1.4 HTA(TCT) Task Description Tables

Task Description Tables capture the full range of detail required about the tasks and sub-tasks that will be used in subsequent stages of the TNA process. A suggested template for HTA task tables, with a description of the contents of each field, is shown in Table 24. This is an adaptation and extension of the approach used by Annett (2000). A task table should be completed for each sub-task in the HTA hierarchy.
Inputs: External / Internal
Products: External / Internal
Other Outcomes: Description of the side effects of the activity, e.g. use of resources, attrition
Critical Errors and Consequences: A description of the critical errors that could be made and their consequences
Assessment measures and data requirements: A description of the criteria for evaluating the task (process and product) and the data that are required to make the assessment
The structure of the table is designed to provide a logical breakdown of the task or sub-task, with related items located together. Specifically: the task number and task statement cross-refer to the HTA Diagram. The statement of the task's purpose and of the initiating and terminating conditions puts the task into context. The listing of sub-tasks and participants, the plan and the teamwork description provides an overview of the mechanics of the task and how it is executed. Including the communications modes for teamwork ensures that
required modes that have to be supported in training are not overlooked. If the task is not decomposed any further, the fields for sub-tasks and the plan will be left blank. If the task is performed by an individual then the teamwork field will be left blank.

The fields for inputs, outputs and other outcomes capture the detail of which inputs and cues the team is responding to and how the responses are mediated. These should be cross-checked against the context descriptions. Other outcomes are included as they may impact on how the task and other tasks are achieved. Issues such as consumption of fuel and munitions and weapons effects are also significant from the perspective of training cost and safety, and may therefore influence the training option selected. Not all fields will have entries.

Critical errors and consequences are included for two reasons. Firstly, this information will be required at the gap analysis stage to inform the decision about whether the task needs to be trained. Secondly, consideration of how a task can go wrong, and the consequences of such an event, can illuminate subsequent discussion of how performance of the task can be evaluated and what measures are appropriate. Potential consequences of errors in the conduct of a task could include:
o the loss of the positive effect that would have stemmed from successful task completion
o loss of capability
o mission failure
o injury to personnel in the completion of the task
o damage to essential equipment
o collateral damage or injury to 3rd parties
o a breach of law or military regulations (such as Rules of Engagement being improperly applied).

Assessment measures and data requirements are put last as they are best considered once a complete understanding has been developed of the process by which the task is performed and the nature of the inputs and outputs of the task.
On first inspection, completing a task table for each sub-task may appear to be an onerous and intimidating undertaking. However, the table structure has been designed not only as an output of this stage of analysis, but also to provide an analytical framework to use during information gathering. The table can be used both to guide analysis of documents which describe the task being analysed, and to frame discussions with SMEs. Completing the tables as the HTA diagram is developed is an expedient way of ensuring that information collected is not lost or forgotten, and should speed the analytical process. Difficulty in completing fields in the form is a
strong indicator that the task is not completely understood and needs to be revisited in the analysis. Table 25 shows a Task Description Table for the F3 Pair Task Meld Radar, which has not been decomposed further. Consequently, no sub-tasks are listed and the plan field is left blank. Table 26 shows the Task Description Table for the Detect Bandit BVR task from which the Meld Radar task was decomposed. A point to note is that information from the lower table (such as assessment measures) can be aggregated up into the table for the higher-level task.

Table 25 Task Description Table for F3 Pair Task 1.2.5 Meld Radar
Task: 1.2.5 Meld Radar
Purpose: Both WSOs need to have the same radar picture and confirm that they have both identified the same aircraft as the bandit in order to plan and execute the intercept effectively
Initiating condition: Bandit located
Terminating condition: Melded radar pictures
Sub-tasks and participants: L-WSO, W-WSO
Plan: N/A
Teamwork Description: The WSO who has located the bandit has to communicate the bandit location to the other WSO, and they then have to ensure that they are both identifying the same aircraft as the bandit.
Inputs (Content/Nature, Source & Mode):
  External: Target manoeuvre; bandit fighters; radar pictures (radar, JTIDS display - JTIDS)
  Internal:
Products (Content/Nature, Destination & Mode):
  External:
  Internal: Melded radar pictures (radar); communication between WSOs
Other Outcomes: Time taken is time available to the bandit to break radar lock and manoeuvre
Critical Errors and Consequences: Slow radar meld gives the bandit more time to break radar lock and evade or move into an offensive position (loss of tactical advantage by the F3 pair)
Assessment measures and data requirements: Time taken to meld radar pictures; requires recording of radar pictures and of WSO comms during the meld. Standard: Radar meld achieved accurately and sufficiently quickly as not to lose tactical advantage.
Table 26 Task Description Table for F3 Pair Task 1.2 Detect Bandit BVR
Task: 1.2 Detect Bandit BVR
Purpose: The bandit has to be located in the patrol area in order to intercept it
Initiating condition: On arrival in the search area
Terminating condition: When the bandit has been located, or if the formation becomes defensive, or if bingo fuel (fuel required to return to base) is reached
Sub-tasks and participants:
  1.2.1 Allocate radar search bands (L-WSO)
  1.2.2 Manoeuvre formation (L-WSO, L-P, W-P)
  1.2.3 Liaise with Fighter Controller (FC) (L-WSO, W-WSO)
  1.2.4 Locate bandit (L-WSO, W-WSO)
  1.2.5 Meld radar (L-WSO, W-WSO)
Plan: After radar search bands have been allocated (1.2.1), the formation is manoeuvred (1.2.2) whilst liaison with the FC (1.2.3) and location of the bandit (1.2.4) take place. Once the bandit is located, the radar meld (1.2.5) takes place.
Teamwork Description: The main aspect of teamwork is communication between the WSOs as they search the airspace for the bandit. The radar meld is the critical phase of communication once the bandit is detected.
Inputs (Content/Nature, Source & Mode):
  External: Target location from the FC via radio/JTIDS; vectors to target from the FC via radio; target location via radar display; target manoeuvre
  Internal: N/A
Products (Content/Nature, Destination & Mode):
  External: Request for target information to the FC via radio; target location to the FC via JTIDS/radio
  Internal: Melded radar pictures
Other Outcomes: Fuel is consumed during the search and the aircraft location changes. Time taken for the radar meld is time available to the bandit to break radar lock and evade or move into an offensive position.
Critical Errors and Consequences: Slow radar meld gives the bandit more time to break radar lock and evade or move into an offensive position (loss of tactical advantage by the F3 pair)
Assessment measures and data requirements: Efficiency and effectiveness of the search; requires recording of the manoeuvre of the F3 pair and the bandit, and of the radar pictures. Standard: Systematic search conducted in accordance with SOPs. Time taken to meld radar pictures; requires recording of radar pictures and of WSO comms during the meld. Standard: Radar meld achieved accurately and sufficiently quickly as not to lose tactical advantage.
7.4.1.5 Task Role Matrix

The Task Role Matrix provides a simple look-up table of which roles are involved in which tasks and sub-tasks. This can be useful for analysing teamwork and can inform training overlay analysis. An example Task Role Matrix is shown in Table 27. Indenting sub-tasks and sub-sub-tasks in the task listing aids clarity. Identifying which sub-team an individual is in is also useful, as interactions across sub-team boundaries become more obvious. In this instance, the table clearly shows that the Leading WSO is involved in all of the tasks listed in the table fragment and that many of these involve the Wingman Crew.

Table 27 Task and Role Matrix Example
Roles participating: Lead Crew (L-P, L-WSO); Wingman Crew (W-P, W-WSO). An X against a role indicates participation in the task.

Task List:
1.1 Check for Bingo Fuel
1.2 Detect Bandit BVR
  1.2.1 Allocate radar search bands
  1.2.2 Manoeuvre formation
  1.2.3 Liaise with Fighter Controller
  1.2.4 Locate bandit
  1.2.5 Meld radar
1.3 Check if Defensive
  1.3.1 Check wpns and defensive aids states
  1.3.2 Evaluate tactical status
1.4 Engage Bandit BVR
  1.4.1 Allocate lead
  1.4.2 Agree tactics
  1.4.3 Intercept bandit
    1.4.3.1 Give vectors to pilot
Etc.
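For larger analyses the matrix can be held as data rather than as a printed table. The Python sketch below is an illustration, not part of the method: it records the participants for three of the sub-tasks, taken from the participant lists in Table 26, and flags tasks whose execution crosses the sub-team boundary.

```python
# Sub-team membership, using the role abbreviations from the report.
LEAD = {"L-P", "L-WSO"}       # lead crew
WINGMAN = {"W-P", "W-WSO"}    # wingman crew

# Task Role Matrix as a mapping from task to the set of participating roles
# (participants as listed in Table 26 for sub-tasks of 1.2 Detect Bandit BVR).
task_roles = {
    "1.2.1 Allocate radar search bands": {"L-WSO"},
    "1.2.2 Manoeuvre formation": {"L-WSO", "L-P", "W-P"},
    "1.2.5 Meld radar": {"L-WSO", "W-WSO"},
}

def crosses_sub_teams(roles):
    """True when a task involves both the lead crew and the wingman crew."""
    return bool(roles & LEAD) and bool(roles & WINGMAN)

for task, roles in task_roles.items():
    label = "cross-sub-team" if crosses_sub_teams(roles) else "within sub-team"
    print(task, "->", label)
```

Queries of this kind surface exactly the interactions across sub-team boundaries that the printed matrix is intended to make obvious.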
Environmental task demand factors (rated H or M for this task) include:
High workload
Multiple information sources
High information load
Incomplete, conflicting information
Rapidly changing, evolving scenarios
Requirement for team coordination
Adverse physical conditions
Auditory overload/interference
Visual overload
Pairs tactics are conducted in a highly dynamic environment and coordination is fundamental to success
captured as appropriate. The example in Table 30 is for the wingman WSO. Recording the supporting KSAs that are required facilitates a cross-check that individual training provides the necessary underpinning KSAs. The resource allocation line has been left blank, as resources cannot be passed between individuals in a tandem two-seat aircraft cockpit. It is more meaningful simply to think of passing responsibility for using available resources (task allocation), such as use of radios or defensive aids.

Table 30 Teamwork Interaction Table
Role: Wingman WSO
Teamwork Interactions, with Actions and Supporting KSA / Taskwork:
Performance Monitoring
Task Allocation
Resource Allocation: (left blank)
Communication: Reporting radar contacts to the formation leader; communicating the radar picture to the Lead WSO during the radar meld
Backup Behaviours: Employing defensive aids during visual combat
Supporting KSA / Taskwork: Pairs tactics
difficulty, importance and frequency of the tasks concerned. Tasks of high difficulty and importance would be high-priority candidates for training, with tasks of low difficulty and importance being of the lowest priority. The frequency of conduct of the task is also relevant, in that high-frequency tasks may have some potential for being trained on the job. This is meaningful in the individual training context, as there is the potential for mentoring and instruction by a more experienced operator in the job context.

The collective training context is more complex. Typically a team or collective organisation will have a mix of people at different levels in the organisation, with a variety of backgrounds and experience levels. This is further complicated by the fact that people are often posted in and out of the organisation at different times, so the overall experience level of the group may be constantly changing. Defining an input standard is therefore problematic. Similarly, on-the-job training is more problematic, since there is no more experienced team on hand to mentor and train the team itself.
A suggested format for such a table, described here as a Training Priorities Table, is shown in Table 31. The information concerning the critical errors associated with each task is already available in the Task Description Tables produced during Training Task Analysis and simply needs to be copied across. Guidance for rating the likelihood and
severity of errors is shown in Table 32, in the form of factors affecting the likelihood of an error and the indicators of high severity errors. These ratings, along with the assessment of the overall level of risk and recommendations about the requirement for training as mitigation are matters of subjective judgement and require SME input.
2.0 Destroy Bandit Fighter in Visual Range
3.0 Destroy Bandit Bomber BVR
Table 32 Factors Affecting the Likelihood of an Error and Indicators of High Severity Consequences
Factors Affecting the Likelihood of an Error:
Task complexity (the number of steps within the task).
Degree of teamwork required (the required degree of communication between team members in task completion is a reasonable proxy for this).
Judgement of task success: whether the task can be judged to have been completed successfully by team members (if this is not the case, there is no possibility for team performance self-correction).
Task recoverability/reversibility: there are examples of tasks which are difficult (such as threading a needle) but, because lack of success in task execution can be easily judged and the task reattempted, the likelihood of an uncorrected error occurring is reduced.
Time pressure (task delay tolerance): operating under time pressure increases the likelihood of error.
Stress or fatigue: task performance expected under conditions of stress or physical fatigue is more prone to error.
High task loading: occurs if the task is performed concurrently with other tasks.
Information availability: if information to perform the task may be incomplete or of uncertain quality.
Time after training: time between training and task execution.

Indicators of High Severity Consequences:
Mission failure
Loss of capability
Injury to personnel
Damage to essential equipment
Collateral damage or injury to 3rd parties
Breach of law or military regulations (such as ROE being improperly applied)
Significant financial loss
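One way to make the resulting SME judgements traceable is to combine the likelihood and severity ratings through a simple risk matrix. The Python sketch below is an illustrative assumption only: the report leaves the overall risk assessment and the training recommendation to subjective SME judgement, and does not prescribe these scales or thresholds.

```python
# Hedged sketch: numeric bands for H/M/L ratings (an assumption, not from the report).
LEVELS = {"L": 1, "M": 2, "H": 3}

def risk_level(likelihood, severity):
    """Combine H/M/L likelihood and severity ratings into an overall risk band."""
    score = LEVELS[likelihood] * LEVELS[severity]
    if score >= 6:
        return "High"    # strong candidate for training as mitigation
    if score >= 3:
        return "Medium"
    return "Low"

# SME-supplied ratings for two illustrative critical errors:
print(risk_level("H", "H"))  # High
print(risk_level("M", "L"))  # Low
```

Even where the final call remains an SME judgement, recording the ratings against an explicit matrix of this kind makes the Training Priorities Table auditable and the prioritisation repeatable.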
Figure 26 General mapping of Tasks to METs

The issue which this presents is that many of the tasks and related training objectives may well link to many tasks listed in the MTL. This is illustrated diagrammatically in Figure 26. The consequence is that many checks have to be made to identify all the linkages. To put the magnitude of this issue into context, in each of the QE Class Role Group summaries prepared by the Aircraft Carrier Alliance there are in the order of 110 tasks identified for Air Traffic Control (ATC) roles, and approximately 310 listed tasks in the MTL(Maritime) (MTL(M)). Potentially, in excess of 30,000 cross-checks would have to be made to determine how the tasks link to the MTL(M). Careful inspection of the MTL(M) shows that there are five tasks which the ATC tasks support (see Figure 27), reducing the number of cross-checks required to 550. In information systems engineering this problem is resolved with a linking object (in database design this is a join table), which avoids the duplication of redundant information. In the ATC example, the linking object which connects all ATC collective tasks is the function that is delivered through the synergy of all ATC collective tasks: in this case, the ATC Service itself. Multiple ATC collective tasks (launch, recovery, handovers, overflight etc.) support the function Operate ATC Service. Likewise, Operate ATC Service contributes to multiple military tasks. Figure 27 expresses this idea graphically.
Figure 27 Mapping of Example ATC Role Group Summary Tasks to Military Tasks in the MTL(M)

This approach avoids the situation of having to cut and paste all the MTL(M) references into each of the ATC task descriptions and training objectives. All that is required is to identify which function is supported by the collective task, and then which METs that function (in this case Operate ATC Service) supports. Therefore, the recommended approach is to identify the top-level functions under which the tasks and training objectives can be grouped, and map these to the MTLs.
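The saving from the linking object can be checked with simple arithmetic. The sketch below uses the task counts quoted in the text; the variable names are illustrative only.

```python
# Counts from the text: ~110 ATC role tasks, ~310 tasks in the MTL(M).
n_atc_tasks = 110
n_mtl_tasks = 310

# Naive approach: test every (ATC task, MTL entry) pair for a linkage.
naive_checks = n_atc_tasks * n_mtl_tasks        # 34,100 cross-checks

# Linking-object approach: every ATC task maps to one function
# ("Operate ATC Service"), and only the five METs that the ATC tasks
# support need to be checked for each task.
n_supported_mets = 5
linked_checks = n_atc_tasks * n_supported_mets  # 550 cross-checks, as in the text

print(naive_checks, linked_checks)  # 34100 550
```

This is exactly the normalisation argument behind a join table: the 310-entry MTL(M) is consulted once when mapping the function to its METs, rather than once per ATC task.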
Content-Performance Matrix both guides the classification of content and has related guidance on elaboration strategies (Clark, 2008). The matrix is reproduced at Table 33.

Table 33 Training Content Performance Matrix with Application Level Elaboration Strategies, adapted from Clark (2008)
Content Type: Facts
  Performance (Remember): Remember the facts
  Performance (Apply): N/A
  Elaboration Strategy for Apply: N/A

Content Type: Concepts
  Performance (Remember): Remember the definition
  Performance (Apply): Classify new examples
  Elaboration Strategy for Apply: Provide the definition; provide examples; provide counter-examples; provide practice examples for classification

Content Type: Process
  Performance (Remember): Remember the stages
  Performance (Apply): Solve the problem; make inferences
  Elaboration Strategy for Apply: Explain key concepts of the process; explain the stages of the process; provide practice exercises requiring application of process knowledge; assessment by observation of performance

Content Type: Procedure
  Performance (Remember): Remember the steps
  Performance (Apply): Perform the procedure
  Elaboration Strategy for Apply: Clear statement of the steps of the procedure, with illustrations; a demonstration of the procedure; hands-on practice, using the actual equipment, with feedback; assessment by observation of performance

Content Type: Principle
  Performance (Remember): Remember the guidelines
  Performance (Apply): Perform the task; solve problems
  Elaboration Strategy for Apply: State guidelines or show examples of guidelines being implemented; present non-examples; practice application of guidelines in realistic circumstances, with feedback; assessment by observation of performance
What makes the matrix particularly useful is that elaboration strategies are offered for both remembering content and applying it. The table shows a high level synopsis of the recommended elaboration strategies for application, as team training is focused on skills application. Team training is typically concerned with teaching the application of procedural skills and the application of principles (such as doctrine). The point of note is that the recommended elaboration strategies require the teaching of the underpinning knowledge before practice takes place.
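Where the matrix is applied repeatedly during design, it can be convenient to hold it as a lookup from content type and performance level to elaboration strategies. The Python sketch below reproduces only a subset of Table 33's cells; the data-structure shape is an assumption for illustration, not part of Clark's method.

```python
# Subset of the Content-Performance Matrix as a (content type, performance) lookup.
matrix = {
    ("procedure", "apply"): [
        "Clear statement of the steps of the procedure, with illustrations",
        "A demonstration of the procedure",
        "Hands-on practice, using the actual equipment, with feedback",
        "Assessment by observation of performance",
    ],
    ("principle", "apply"): [
        "State guidelines or show examples of guidelines being implemented",
        "Present non-examples",
        "Practice application of guidelines in realistic circumstances, with feedback",
        "Assessment by observation of performance",
    ],
    ("facts", "apply"): [],  # N/A in the matrix
}

def strategies(content_type, performance):
    """Return the recommended elaboration strategies, or an empty list if N/A."""
    return matrix.get((content_type, performance), [])

print(len(strategies("procedure", "apply")))  # 4
```

A lookup of this kind lets each training objective, once classified by content type, be annotated automatically with the elaboration strategies its initial instruction should follow.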
Figure 28 Part-Task Training Sequence (adapted from Wickens and Hollands, 2000)

Part-task training is a familiar approach used widely in individual training. A typical part-task training sequence, adapted from Wickens and Hollands (2000), is shown in Figure 28, which illustrates subtask A being trained, followed by subtask B, and then the whole task (A and B together) being trained. The key criterion for the utility of part-task training is that it is only successful if the subtasks are loosely coupled. If there is a high degree of interconnection between the tasks then part-task training is unlikely to be suitable. The part-task approach is potentially of use in team and collective training where difficult subtasks exist that are not highly connected to other subtasks and could therefore be practiced separately. Such tasks can be identified from the TCHTA diagram and Task Description Tables. The difficulty of the task is reflected in the critical errors and consequences fields and the description of the teamwork requirements, and the degree of connection to other tasks can be evaluated based on the complexity of the plan that describes its relative sequencing (also shown in the Task Sequence Diagram) and the inputs from other tasks.
Figure 29 Simplifying Conditions Method Training Sequence (adapted from Reigeluth, 1999)

Whilst the detailed application of the Simplifying Conditions Method (SCM) is most pertinent to the detailed training design phase, it has relevance at a high level during TNA, as there may be a need to specify a simplified environment for the early stages of task practice.
Team members should be provided with sufficient opportunities to interact with novel environments in order to develop adaptive mechanisms.
Team training should be more than a feel-good intervention.
Guided practice is essential.
Training should establish a mechanism to foster teamwork.
Both teamwork and taskwork competencies are needed for effective team functioning.
From the TCTNA perspective, these principles underline the importance of providing representative practice environments in which the training audience are presented with credible scenarios which stress their capability to conduct both the taskwork and teamwork elements of their tasks. Furthermore, effective coaching and feedback (including After Action Review) have to be provided. The implications for this stage of analysis are that it is essential to identify the following:
The critical features of generic scenarios that are required to stress the teamwork and taskwork components of the tasks to be trained.
The data that have to be gathered in order to inform assessment of the processes and products of team activity.
The instructional tasks that have to be fulfilled to plan, deliver and evaluate the training.
Figure 30 Mapping of TOs to the Operational Environment

Each of TOs 1 to 4 shown in the diagram requires a different but overlapping set of elements in the operational environment. TOs 1.4 and 3.1 also have overlapping requirements, but for much smaller subsets of environmental elements. The accurate identification of part-task training requirements and simplified-conditions opportunities ensures that overly complex training environments are not specified.
and how the functions of sub-teams need to be co-ordinated. Similarly, instruction on the application of doctrine, including such matters as the application of Rules of Engagement in the context of a particular theatre of operations, provides the team with an opportunity to develop shared mental models of how problems should be addressed, and is likely to be time well spent. The initial instruction phase provides a key opportunity for addressing the teamwork aspects of the tasks to be trained. Discussion of the priorities for teamwork interactions and processes in the light of the environmental task demands has the potential to provide the team with sound mental models of how the team should function. Table 34 provides suggestions for the content that should be covered in the initial instruction related to teamwork objectives.

Table 34 Suggested Content for Initial Instruction on Teamwork
Level 1 Team member KSAs: The need for shared mental models of the task and of how the team operates and communicates.
Level 2 Teamwork Component Interactions (TCIs): Priorities by role for communication, performance monitoring, backup behaviours, resource allocation and task allocation.
Level 3 Teamwork Processes: The priorities for task coordination, workload management, information coordination, resource coordination, collaborative problem solving, collaborative planning and conflict management, given the nature of the tasks in the light of the environmental task demands.
Level 4 Team Properties: The organisational structure and relationships of roles; the significance of adaptability and cohesion in the conduct of the team's tasks in the light of the environmental task demands.
Having identified the topics that require initial instruction, appropriate methods need to be identified. Classroom-based lectures and discussions are likely candidate methods. Others, such as demonstrations, may be possible depending on the nature of the task being trained.
A suggested output from this phase of analysis is an Initial Instruction Table. An example Initial Instruction Table is shown at Table 35. Table 35 Example Initial Instruction Table for F3 Pairs Training
Training Objective: 1.0 Destroy Bandit Fighter BVR
  Topics for Initial Instruction: Pairs tactics for engaging radar-equipped bandits BVR; principles for coordinating a pairs engagement
  Method: Classroom lecture and discussion with illustrations of good and bad examples of the application of tactics

Training Objective: 2.0 Destroy Bandit Bomber BVR
  Topic for Initial Instruction: Pairs tactics for engaging non-radar-equipped bandits BVR

Training Objective: 3.0 Destroy Bandit Aircraft in Visual Range
  Topic for Initial Instruction: Pairs tactics for visual attacks

Training Objective: 4.0 Identify key teamwork KSAs for the conduct of pairs intercepts
  Method: Classroom lecture and discussion with illustrations of good and bad examples of the application of teamwork principles
of the whole task. Such subtasks can be identified from the Critical Errors and Consequences fields of the Task Description Tables. Candidate subtasks for separate training may well fall into more than one of these categories. An example would be the training of a Battle Staff to apply the Combat Estimate. The development of a Combat Estimate is based around answering seven key questions that have to be addressed when formulating an operation plan. During the execution phase of an operation, the Battle Staff have to simultaneously control the execution of the plan, adjust the plan in response to events during execution (which may require reapplication of the Combat Estimate process) and apply the Combat Estimate process to further phases of the operation. During execution of the plan, inputs are required from the subunits who are executing it, which constitutes a more complex environment than that for the development of the plan. The development of the Combat Estimate is complex, and is a task which merits additional practice.

8.3.2.2 Identification of Simplifying Conditions Method Requirements

Consideration should be given to identifying where there is benefit to be gained from practicing tasks in simplified conditions prior to practicing the tasks in the full environment. This is particularly applicable to complex tasks and tasks where there are significant environmental task demands. In such cases there is likely to be a high degree of teamwork required, and there may need to be training effort focussed on the teamwork aspects of the task. An example from the F3 pairs training task would be the practice of intercepts against bandit fighters in visual range. This task requires operation of all the aircraft systems and air combat manoeuvring, which can result in the crews experiencing sustained G-forces of 7 G or more. It is particularly challenging when conducted at low level because of the inevitable risk of collision with the ground or sea.
Practicing the task in simplified conditions, without the physical stressors, the element of danger and the need to deal with other aircraft systems, could allow the students to focus on the key issues of applying the appropriate tactics and on the teamwork interactions and processes necessary to carry out the intercept effectively.

8.3.2.3 Assessment and Feedback Methods

Given that the general approach for team and collective training is to facilitate practice of the required teamwork and taskwork skills in a representative environment, assessment is essentially based on observation of team performance. The detail for assessment has already been captured in the Task Description Table for each task identified in the TCHTA. There are a number of options for the provision of feedback, which include:
Coaching of individuals during task performance
Coaching of sub-teams during task performance
Hot debriefs at the conclusion of an exercise component or phase
After Action Review of whole-team performance at the end of an exercise phase.
8.3.2.4 Practice and Assessment Methods Table

The Practice and Assessment Methods Table provides a simple record of the methods selected for each training objective, identifying whether the whole task will be trained in a representative environment alone or whether a simplified environment will be required. Any subtasks that require part-task training are also identified. Probably the most challenging aspect of completing this table is the estimation of the time that is required for training. This is a significant value because it will impact on the cost of training. There are no convenient formulae, or even heuristics, to guide this estimation. The best guides are the time that similar training takes and SME instructor estimates. Constraints on course duration (if there are any) and on resource availability (such as aircraft training hours available) have to be taken into consideration.
Figure 31 Information Inputs to Generic Scenario Specifications

The generic scenario specifications provide the baseline information for the development of the scenario. The task descriptions amplify this, as they capture the inputs that have to be provided and the outputs that have to be delivered. However, it is the environmental task demand information, in conjunction with the information about the types of error that are likely, that provides the characterisation of what makes the task difficult. The scenario should contain elements that place the appropriate demands on the team. In the case of F3 Pairs training, factors such as visibility, the difference in altitude between the bandits and the fighters (a bandit above the fighters has a tactical advantage), and the aggressiveness of the bandits are significant elements which ideally should be controllable. It is also necessary to consider the teamwork elements that are priorities, as this may influence the scenario requirements. Communication, performance monitoring, backup behaviour, coordination and task allocation are significant for a pairs
intercept. Appropriate bandit tactics can generate the requirement for backup of the leader by the wingman. It may be that the generic scenarios identified at the start of the analysis process can each be amplified to ensure there is sufficient detail to inform the description of the training environment (with detail such as what control of the elements is required and how their numbers and actions may vary). Recording which training objectives are covered by which scenario can provide a useful cross-reference of coverage of training objectives. It may also be appropriate to develop scenarios for aggregations of training objectives, or even for individual objectives if they are likely to require different training environments, such as live firing. It is suggested that scenario specification is carried out in two stages. The first stage is to develop a narrative description of the properties of the required training scenario. A suggested format for recording these is the Training Scenario Table shown in Table 37.

Table 37 Example Training Scenario Table
Training Scenario: 1
Training Objectives: 2.0 Destroy Bandit Fighter in Visual Range
Description: A single bandit fighter should enter into visual range of the fighter pair and take appropriate aggressive defensive action to counter the attack, using air combat manoeuvre and flares against heat-seeking missiles. If the Tornado pair loses tactical advantage then the bandit should counter-attack, taking heat-seeking missile shots. Visual engagements should be conducted at both medium and low level, and should occur in degraded visibility as well as high-visibility conditions. Bandit tactical advantage from initial positioning and altitude, and the aggressiveness of bandit tactics, should be controllable.
This then needs to be mapped into the Environment Description Table as shown in Table 38. This mapping facilitates the specification of the training environments.
Training Scenario 1

Simplified conditions practice:
Not required
Bandit fighters take evasive action using air combat manoeuvring and flares if defensive, and counter-attack taking heat-seeking missile shots if offensive
Bandit initial tactical advantage and aggressiveness of bandit tactics should be variable
Heat-seeking missiles
Effects of G not required
Operation of aircraft systems other than weapons systems and flight controls and displays not required
Clouds, haze, precipitation, sun position and light levels
Ground topology

Full task practice:
Not required
Bandit fighters take evasive action using air combat manoeuvring and flares if defensive, and counter-attack taking heat-seeking missile shots if offensive
Bandit initial tactical advantage and aggressiveness of bandit tactics should be variable
Heat-seeking missiles
All systems and effects of G required
Clouds, haze, precipitation, sun position and light levels
Ground topology
Assess practice:
- Monitor and assess individual and team performance
- Monitor and assess performance outputs
- Provide feedback during task performance
- Debrief practice
As the exact nature of each of these tasks will vary according to the training objective being taught, it is suggested that the description of the tasks be broken down by training objective. An example of an Instructor Task Table, with illustrative entries, is shown in Table 39.
Training Objective 2.0 Destroy Bandit Fighter in Visual Range

Simplified conditions practice, overlay requirements:
- Not required
- Bandit fighters fly a predefined profile and counter-attack with a degree of aggressiveness matched to student performance; the instructor running the intercept should be able to change the profiles and bandit tactics as required during the sortie
- Replay of bandit manoeuvre, missile launches and use of defensive aids required for debrief
- Heat-seeking missile trajectory and effects required for debrief
- Not required
- Monitoring of aircraft manoeuvre during intercepts
- Visualisation of the manoeuvre of the F3 pair in 2-D and 3-D should be available for debrief
- Monitoring and recording of intercom chat and radio transmissions from both crews required
- Monitoring and recording of the visual scene (out of window)
- Clouds, haze, precipitation, sun position and light levels should be capable of being preset by the instructor
- Sea or land selectable by the instructor
a.
Whole Task and Part Task Training Environments
b.
c. systems elements
d. human elements
e. resource elements

Different data is required for each of these types of elements. The definitions and entries for the five types of elements are considered in turn.

9.2.2.1.1 System Fidelity Requirements

For the purposes of fidelity specification, a system is anything with a mix of controls, displays and output devices that the training audience has to interact with, such as warfare systems, radios and weapons systems. Descriptions of physical and functional fidelity are required which consider the attributes in Table 41.

Table 41 System Fidelity Requirements
Physical Fidelity Requirements
- Size: Does the item need to be full size, or is a smaller representation acceptable?
- Location: Do the controls and displays have to be correctly spatially located with reference to the operator's position; if not, what is acceptable?
- Appearance: Do the colour and texture matter? What are the critical appearance attributes?
- Controls: Are all the controls required; if not, which are?
- Feel: Does the feel of the controls have to be replicated exactly?
- Weight: If the system is portable, does it have to be a representative weight and balance?
- Motion: What motion cues does it have to provide?
- Sound: What sounds have to be produced, and to what degree of fidelity?
Functional Fidelity Requirements
- Format: Does the format of displays have to be replicated exactly?
- Content: Can any display content be omitted?
- Response: Does system response have to be replicated exactly; if not, what elements can be omitted and what tolerance on system response is acceptable?
- Appearance to other system elements: If the system interacts with other entities in the environment, what attributes must it have (e.g. an aircraft has a radar signature and a heat signature)?
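The attribute checklists above lend themselves to being captured as simple structured records, one per system. The following sketch is purely illustrative (the class name, field names and example values are invented, not part of the TCTNA method):

```python
from dataclasses import dataclass, field

# Illustrative record for one system's fidelity specification, mirroring the
# physical and functional attribute checklists of Table 41.
@dataclass
class SystemFidelitySpec:
    name: str
    physical: dict = field(default_factory=dict)    # size, location, appearance, ...
    functional: dict = field(default_factory=dict)  # format, content, response, ...

# Hypothetical entry for a cockpit display system.
radar = SystemFidelitySpec(
    name="AI radar display",
    physical={"size": "full size", "location": "as cockpit", "controls": "all required"},
    functional={"format": "exact replication", "response": "within 0.5 s of real system"},
)
print(radar.functional["response"])
```

Recording the answers to each attribute question against a named field in this way keeps the specification machine-checkable: missing answers show up as absent keys.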
The appearance of the system to other elements in the environment is included to ensure that data for all interactions is captured. It is included in the functional fidelity section as it may be a dynamic attribute; for example, an aircraft's heat signature will vary according to thrust or reheat selection.

9.2.2.1.2 Resource Fidelity Requirements

Resources are all the elements in the environment that are not human or manned systems. These include logistics elements and equipment. The attributes in Table 42 need to be considered.

Table 42 Resource Fidelity Requirements
Physical Fidelity Requirements
- Appearance: Do the colour and texture matter? What are the critical appearance attributes?
- Feel: If the item can be touched, does the feel of the item have to be replicated exactly?
- Weight: If the item is portable, does it have to be a representative weight and balance?
- Sound: What sounds have to be produced, and to what degree of fidelity?
Functional Fidelity Requirements
- Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
- Interaction information requirements: What information is required to generate interactions with the team or to respond to team interactions?
9.2.2.1.3 Human Element Fidelity Requirements

Fidelity requirements for the human elements in the environment are of particular importance. In order to determine whether a role player can be used, it is necessary to capture the knowledge and skills needed to generate interactions with the team and to respond to team inputs. In simple situations, such as passing radio traffic, an untrained operator could read from a script. However, in the case of a carrier strike group exercising in a synthetic environment under the control of a Maritime Component Command (MCC), the interactions are more complex and would require senior staff with command experience to play the role of a member of the MCC staff. It is also necessary to know what environmental information the human elements require to generate inputs and responses (for example, the fighter controller in the F3 pairs training example needs a dynamic 2-D view of the tracks of both the pair and the bandits, with altitude information). The attributes in Table 43 need to be considered.
Table 43 Human Element Fidelity Requirements

Functional Fidelity Requirements
- Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
- Interaction information requirements: What information is required to generate interactions with the team or to respond to team interactions?
- Knowledge and skills: What knowledge and skills are required to produce the required behaviour, given the information and systems provided?
9.2.2.1.4 Manned Systems Fidelity Requirements

Manned systems, such as bandit fighters, are a hybrid between resources and human elements. They appear dynamically in the environment but respond on the basis of human decision making, knowledge and skills. If the element is an enemy vehicle, it is necessary to determine how realistic it should be in appearance (or in whatever other way it is perceived) and how its behaviour is determined. Table 44 illustrates the attributes that need to be considered.

Table 44 Manned Systems Fidelity Requirements
Physical Fidelity Requirements
- Appearance: Do the colour and texture matter? What are the critical appearance attributes?
- Sound: What sounds have to be produced, and to what degree of fidelity?
Functional Fidelity Requirements
- Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
- Interaction information requirements: What information is required to generate interactions with the team or to respond to team interactions?
- Knowledge and skills: What knowledge and skills are required to produce the required behaviour, given the information provided?
- Appearance to other system elements: If the system interacts with other entities in the environment, what attributes must it have (e.g. an aircraft has a radar signature and a heat signature)?
9.2.2.1.5 Physical Environment Elements

Static features of the environment, such as terrain, require only a physical fidelity description, whereas dynamic elements (such as rain or waves) require functional specifications as well. Table 45 illustrates the attributes that need to be considered.

Table 45 Physical Environment Fidelity Requirements
Physical Fidelity Requirements
- Appearance: Do the colour and texture matter? What are the critical appearance attributes?
- Feel: If the item can be touched, does the feel of the item have to be replicated exactly?
- Sound: What sounds have to be produced, and to what degree of fidelity?
Functional Fidelity Requirements
- Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
- Interaction information requirements: What information is required to generate interactions with the team or to respond to team interactions?
9.2.2.1.6 Recording Element Fidelity Specifications

Element specifications are recorded in the Environment Description Table. This has the advantage of keeping the Environment Description Table as the single master document providing a comprehensive description of the required environment set. There will be a set of entries for each of the training environments described. This is illustrated in Table 46.
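The "single master document" idea above amounts to one structure keyed first by training environment and then by environment element. The following sketch is illustrative only (the environment and element names, and the specification entries, are invented):

```python
# Sketch of the Environment Description Table as a single master record:
# environment -> element -> fidelity specification.
environment_description = {
    "Simplified conditions practice": {
        "Bandit fighters": {"physical": "visual model only",
                            "functional": "scripted profiles"},
        "Weather": {"physical": "clouds, haze and light levels preset by instructor"},
    },
    "Full task practice": {
        "Bandit fighters": {"physical": "visual model only",
                            "functional": "reactive tactics"},
    },
}

# Because everything lives in one structure, coverage queries are simple,
# e.g. which training environments specify the bandit fighters element:
envs = [e for e, elems in environment_description.items() if "Bandit fighters" in elems]
print(envs)
```

Keeping the specification in one structure, rather than scattering it across per-environment documents, is what makes this kind of cross-environment query straightforward.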
b. Constructive Simulation
Constructive simulation can be defined as real people exercising military decisions on the basis of information constructed by a computer system. A common type of constructive simulation is the classic wargame, typified by wargames such as the one supporting the Command and Staff Trainer (CAST) at the Land Warfare Centre.

c. Live Simulation
Live simulation can be defined as real people operating real equipment with simulated effects in a live environment, typified by the use of instrumented flying ranges and the Tactical Engagement Simulation (TES) systems used at BATUS.

d. Embedded Simulation
Embedded simulation is the incorporation of simulation capability into operational equipment. Simulation modes built into warfare systems fall into this category.

e. Networked Simulation
Networked simulation is the networking together of multiple simulators. Examples include the Combined Arms Tactical Trainer (CATT) and the Medium Support Helicopter Aircrew Training Facility (MSHATF), both of which have multiple vehicle simulators connected together on one site.

f. Distributed Simulation
Distributed simulation refers to the networking of simulators and simulator networks across different sites. Examples include the connection of the Cooke Warfare Team Trainers in Portsmouth to US Navy simulation systems in Norfolk, Virginia, and the connection of the Mission Training by Distributed Simulation (MTDS) system at RAF Waddington to equivalent USAF systems in Mesa, Arizona.

g. Synthetic Wrap
Synthetic wrap is an ingenious combination of live and virtual simulation to provide an extended battlespace for training. This enables operational pictures to be populated with elements that are outside the geographical area being used for training. The challenge is transitioning an element traversing the constructive space into the live space.

h. Augmented Reality
Augmented reality refers to the technique of integrating synthetic elements into the live environment. A typical example is the insertion of synthetic target images and weapons effects into a weapons display.

Identifying suitable options can be informed by the capabilities of extant systems, but advice will often be required on what constitutes the contemporary art of the possible. Capability Joint Training Experimentation and Simulation (Cap JTES) in MoD Main Building should always be consulted for advice, particularly as there is a requirement for all new simulations to have the capability to be connected to extant systems, to provide greater capability for distributed training in the collective and joint domains.
Having developed the overall description of the option, the next stage is to capture in detail how it meets the requirements for the environment. This entails describing how each element of the environment is represented and how the instructional overlay requirements are met. The requirements for each element are described by the corresponding column for that element in the Environment Description Table. It is suggested that the appropriate entries in the table are extracted and used to populate the first column of an Environment Option Properties Table for that element. Subsequent columns can then be used to record the detail for each option
being considered for the training environment in question. Example table entries are shown in Table 48.

Table 48 Example Training Environment Option Properties Table Entries
Environment: Simplified Conditions Environment for Pairs Intercept Practice
Environment Element: Bandit Missiles

Requirements:
- Heat-seeking and radar-guided missile effects based on launch parameters, kill probability and the pair's manoeuvre
- Evaluation of missile effectiveness for debrief

JOUST:
- Radar-guided and heat-seeking missile flyouts and effects shown in simulation on the instructor display
- Video replay facility provided on the instructor station, which can be viewed by the students

Live Flying:
- Weapons effects simulation not available
- Shot effectiveness of bandit missiles evaluated by the instructor's inspection of video recordings of cockpit displays and estimation of launch parameters and kill probability
- Video of cockpit displays replayed using the mission debrief facility
Requirements:
- Heat-seeking and radar-guided missile effects based on launch parameters, kill probability and the pair's manoeuvre
- Evaluation of missile effectiveness for debrief
- Video replay of missile effects for debrief
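The extraction step described above, in which an element's requirement entries are pulled from the Environment Description Table to seed the first column of an Environment Option Properties Table, might be sketched as follows. This is an illustrative sketch only; the data and function name are invented:

```python
# Hypothetical requirement entries per environment element, as they would
# appear in the corresponding column of the Environment Description Table.
environment_description = {
    "Bandit missiles": [
        "Heat-seeking and radar-guided missile effects",
        "Evaluation of missile effectiveness for debrief",
    ],
}

def option_properties_table(element, options):
    """Seed an option-comparison table: the first column holds the extracted
    requirements, with one further (initially blank) column per option."""
    rows = []
    for requirement in environment_description[element]:
        rows.append({"Requirement": requirement, **{opt: "" for opt in options}})
    return rows

table = option_properties_table("Bandit missiles", ["JOUST", "Live Flying"])
print(len(table), list(table[0].keys()))
```

The analyst then fills in each option column by recording how that option meets (or fails to meet) the requirement in that row, exactly as in the JOUST and Live Flying entries above.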
The cost model may be somewhat different if the training solution is to be acquired as a training service. Given that a synthetic training solution is likely to be required for some or all of the training, advice from the MoD, and probably from industry, will be required to determine costs in any meaningful way.
provided to illustrate the application of each of the techniques advocated in the main analytical stages.

The development of the TCTNA approach has benefited from the active participation of Service TNA specialists, particularly the TNA staff at the Fleet Human Resources Training Support Group. The opportunity exists to refine the techniques further by eliciting their feedback on the experience of applying the TCTNA method to the QE Class Carrier Collective TNAs currently under way.

The greatest opportunity for exploitation of this work within the MoD lies in the inclusion of the TCTNA approach within JSP 822. To this end, the authors of this report have been invited to participate in the ongoing revision of JSP 822, and this report will be used to inform that work. The TCTNA method also has potential for application to Joint training, although some adaptation of the method may be required; this is a potential area for further research.
11.2 Recommendations
It is recommended that:
a. Feedback is sought from TNA specialists currently applying the TCTNA method, so that the techniques advocated can be refined and revised as necessary.
b. The TCTNA method is incorporated into JSP 822.
c. The application of the TCTNA method to Joint training is investigated.
12 References
Adair, J. (1997) Leadership Skills. Chartered Institute of Personnel Development, London.

Annett, J. (1997) Analysing team skills. In Flin, R., Salas, E., Strub, M. & Martin, L. (Eds), Decision Making Under Stress: Emerging Themes and Applications. Ashgate, Aldershot.

Annett, J., Cunningham, D. & Mathias-Jones, P. (2000) A method for measuring team skills. Ergonomics, Vol 43, No 8, pp 1076-1094.

Annett, J., Duncan, K. D., Stammers, R. B. & Gray, M. J. (1971) Task Analysis. HMSO, London.

Arthur, W., Edwards, B. D., Bell, S. T., Villado, A. J. & Bennett, W. (2005) Team task analysis: identifying tasks and jobs that are team based. Human Factors, Vol 47, Issue 3.

Baker, D. P., Day, R. & Salas, E. (2006) Teamwork as an essential component of high-reliability organizations. Health Services Research, Vol 41, Issue 4, pp 1576-1598.

Bowers, C. A., Morgan, B. B., Salas, E. & Prince, C. (1993) Assessment of coordination demand for aircrew coordination training. Military Psychology, 5(2), pp 95-112.

Burke, S. (2005) Team task analysis. In Stanton, N., Hedge, A., Brookhuis, K., Salas, E. & Hendrick, H. (Eds), Handbook of Human Factors and Ergonomics Methods. CRC Press, London.

Cannon-Bowers, J. A. & Salas, E. (1998) Decision Making Under Stress: Implications for Individual and Team Training. American Psychological Association, Washington, DC.

Clark, R. C. (2008) Developing Technical Training, 3rd Edn. Pfeiffer, San Francisco.

DSTL (2006) Analysis of Team Training: A Methodology. DSTL Report 200606.

DSAT QS (2003) Defence Systems Approach to Training Quality Standard.

Fitts, P. M. (1964) Perceptual-motor skill learning. In Melton, A. W. (Ed), Categories of Human Learning, pp 243-285. Academic Press, New York.

Hackman, J. R. & Morris, C. G. (1975) Group tasks, group interaction process, and group performance effectiveness: a review and proposed integration. In Berkowitz, L. (Ed), Advances in Experimental Social Psychology, Vol 8, pp 1-55. Academic Press, New York.
HFI DTC (2005) Operational Information Management Skills, Knowledge and Attitudes Requirements. HFI DTC Report No HFIDTC/WP.2.1.1/1.

HFI DTC (2007) Information Exploitation Competencies. HFI DTC Report No HFIDTC/2.1.1/3.
HFI DTC (2008) A Critique of Media Selection Models: Analysis of the Applicability of Extant Models to Current UK Military Training. HFI DTC Report No HFIDTC/2/WP12.1.1/1.

HFI DTC (2009) Training Needs Analysis: The Application of an Information Processing-based Approach. HFI DTC Report No HFIDTC/2/WP12.1.1/2.

Huddlestone, J. & Harris, D. (2003) Air combat student performance modelling using grounded theory techniques. Proceedings of the Interservice/Industry Training, Simulation and Education Conference, Orlando, December 2003.

JSP 502 Training Needs Analysis for Acquisition Projects.

JSP 822 (2007) Part 5, Chapter 3: Defence Training Support Manual 3, Training Needs Analysis.

Klein, G. (2000) Cognitive task analysis of teams. In Schraagen, J. M., Chipman, S. F. & Shalin, V. L. (Eds), Cognitive Task Analysis. Lawrence Erlbaum Associates.

Klein, G. & Armstrong, A. A. (2005) Critical decision method. In Stanton, N. A., Hedge, A., Brookhuis, K., Salas, E. & Hendrick, H. (Eds), Handbook of Human Factors and Ergonomics Methods, pp 35-1 to 35-8. CRC Press, London.

Marks, M. A., Mathieu, J. E. & Zaccaro, S. J. (2001) A temporally based framework and taxonomy of team processes. Academy of Management Review, Vol 26, No 3, pp 356-376.

Naikar, N. & Sanderson, P. (1999) Work domain analysis for training system definition and acquisition. The International Journal of Aviation Psychology, 9(3), pp 271-290.

NATO (2004) Evaluation of Collective Training in a Distributed Simulation Exercise. Proceedings of the NATO Research and Technology Organisation Human Factors and Medicine Panel Symposium, Genoa, October 2003.

NATO (2005) Military Command Team Effectiveness: Model and Instrument for Assessment and Improvement. NATO Research and Technology Organisation Technical Report TR-HFM-087.

Nieva, V. F., Fleishman, E. A. & Reick, A. (1978) Team Dimensions: Their Identity, Their Measurement, and Their Relationships. Final Technical Report, Contract DAHI9-78-C-0001. Advanced Research Resources Organisation, Washington, DC.

Orasanu, J. M. (1993) Decision making in the cockpit. In Wiener, E. L., Kanki, B. G. & Helmreich, R. L. (Eds), Cockpit Resource Management. Academic Press, London.

Partington, D. (2002) Essential Skills for Management Research. Sage, London.

Reigeluth, C. M. (1999) The elaboration theory. In Reigeluth, C. M. (Ed), Instructional-Design Theories and Models, Vol II. Lawrence Erlbaum Associates, New Jersey.

Roby, T. (1968) Small Group Performance. Rand McNally & Company, Chicago.
Rousseau, V., Aube, C. & Savoie, A. (2006) Teamwork behaviours: a review and integration of frameworks. Small Group Research, Vol 37, No 5, pp 540-570.

Salas, E., Sims, D. E. & Burke, C. S. (2005) Is there a "big five" in teamwork? Small Group Research, 36, pp 555-599.

Salas, E., Dickinson, T. L., Converse, S. & Tannenbaum, S. I. (1992) Toward an understanding of team performance and training. In Swezey, R. W. & Salas, E. (Eds), Teams: Their Training and Performance, pp 3-29. Ablex, Norwood, NJ.

Salas, E. & Priest, H. (2005) Team training. In Stanton, N., Hedge, A., Brookhuis, K., Salas, E. & Hendrick, H. (Eds), Handbook of Human Factors and Ergonomics Methods, pp 44-1 to 44-7. CRC Press, Florida.

Salmon, P. M., Stanton, N. A., Walker, G. H. & Jenkins, D. P. (2009) Distributed Situation Awareness. Ashgate, Aldershot.

Stanton, N. (2006) Hierarchical task analysis: developments, applications and extensions. Applied Ergonomics, Vol 37, Issue 1, pp 55-79.

Stanton, N. A., Salmon, P. M., Walker, G. H., Baber, C. & Jenkins, D. (2005) Human Factors Methods: A Practical Guide for Engineering and Design. Ashgate, Aldershot.

Strauss, A. & Corbin, J. (1990) Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Sage, London.

Swezey, R. W., Owens, J. M., Bergondy, M. L. & Salas, E. (1998) Task and training requirements analysis methodology (TTRAM): an analytic methodology for identifying potential training uses of simulator networks in teamwork-intensive task environments. Ergonomics, 41, pp 1678-1697.

Tannenbaum, S. I., Beard, R. L. & Salas, E. (1992) Team building and its influence on team effectiveness: an examination of conceptual and empirical developments. In Kelley, K. (Ed), Issues, Theory, and Research in Industrial/Organizational Psychology, pp 117-153. Elsevier, Amsterdam.

Tannenbaum, S. I., Smith-Jentsch, K. A. & Behson, S. J. (1998) Training team leaders to facilitate learning and performance. In Cannon-Bowers, J. A. & Salas, E. (Eds), Decision Making Under Stress: Implications for Individual and Team Training. American Psychological Association, Washington, DC.

Tesluk, P., Mathieu, J. E., Zaccaro, S. J. & Marks, M. (1997) Task and aggregation issues in the analysis and assessment of team performance. In Brannick, M. T., Prince, C. & Salas, E. (Eds), Team Performance Assessment and Measurement: Theory, Methods, and Applications, pp 197-224. Erlbaum, Mahwah, NJ.

Ward, P. T. & Mellor, S. J. (1985) Structured Development for Real-Time Systems, Volume 2: Essential Modelling Techniques. Yourdon Press, New Jersey.

Wickens, C. D. & Hollands, J. G. (2000) Engineering Psychology and Human Performance, 3rd Edn. Prentice Hall, New Jersey.
Appendix A
Teamwork Models
A.3 The Models for Analysis of Team Training Taxonomy (Dstl, 2006)
Table 51 MATT Teamwork Behaviours (Dstl, 2006)

1. Communication Behaviours
- 1a Information Exchange: Seeking and passing information to all relevant team members at appropriate times in relation to their task needs.
- 1b Communication Skills: Making use of standardised formats and conventions to transmit information.

2. Co-ordination Behaviours
- 2a Procedural Co-ordination: The integration and synchronisation of team interactions in the completion of laid-down procedures.
- 2b Collaboration: The process of organising team resources, activities and actions to ensure that tasks are mutually shared and completed in time.
- 2c Leadership and Task Management: Directing and co-ordinating the activities of the team.

3. Adaptive Behaviours
- 3a Situation Assessment: Development of a common understanding of the situation.
- 3b Decision Making: Mutual involvement in the assessment of a situation and choice of a course of action through discussion and argument.
- Monitoring the performance of team mates, providing constructive advice, and giving and receiving feedback.
- Providing assistance to other team members who need it.
Table 52 MATT Team Member Attitudes and Characteristics (Dstl, 2006)

1. Mutual Trust: Team members respect each other, listen to each other's proposals and views, and encourage proactive behaviour.
2. Shared Vision: Team members agree on the direction, goals and mission of the team and have a mutual belief in the importance of the team.
3. Team Orientation: Team members believe that the team approach is likely to be more successful than acting as individuals.
4. Collective Efficacy: Team members hold positive common perceptions of group achievements and potential.
Table 53 MATT Teamwork Knowledge Requirements (Dstl, 2006)

- Shared Task Models: Shared models of the situation and appropriate strategies for coping with task demands. Descriptor: hold shared interpretations of situations and how to deal with them.
- Cue-Strategy Associations: Association of environmental data with appropriate task strategies requiring co-ordination. Descriptor: know how and when to change co-ordination strategies.
- Team Mission, Objectives and Resources: Goals held in common, and the team members and facilities for their achievement. Descriptor: common understanding of the mission and the team resources required to achieve objectives.
- Accurate Problem Models: Correct understanding of the team's problems and the strategies by which members will cope with and solve them.
- Team Member Characteristics: Task-relevant competencies, preferences, tendencies, strengths and weaknesses of team mates.
- Boundary Spanning Roles: Knowledge of how a team manages its interactions with non-team members and other units.
- Task Sequencing: Knowing how to organise tasks in sequences depending on priorities; integrating task inputs according to team and task demands.
- Team Role Interaction Patterns: Knowing how the team communicates and arrives at decisions.
- Teamwork Skills: Understanding the skills and behaviours required for successful performance; understanding what needs to be done in order for the team to perform effectively.
Table 54 Definitions of the Core Components and Coordinating Mechanisms of the Big Five Model of Teamwork (adapted from Salas et al, 2005)

Core Components
- Team leadership: Ability to direct and coordinate the activities of other team members, assess team performance, assign tasks, develop team knowledge, skills and abilities, motivate team members, plan and organize, and establish a positive atmosphere.
- Mutual performance monitoring: The ability to develop common understandings of the team environment and apply appropriate task strategies to accurately monitor team-mate performance.
- Backup behaviour: Ability to anticipate other team members' needs through accurate knowledge about their responsibilities, including the ability to shift workload among members to achieve balance during periods of high workload or pressure.
- Adaptability: Ability to adjust strategies based on information gathered from the environment through the use of backup behaviour and the reallocation of intra-team resources; altering a course of action or team repertoire in response to changing conditions (internal or external).
- Team orientation: Propensity to take others' behaviour into account during group interaction, and the belief in the importance of team goals over individual members' goals.

Coordinating Mechanisms
- Shared mental models: An organizing knowledge structure of the relationships among the task the team is engaged in and how the team members will interact.
- Mutual trust: The shared belief that team members will perform their roles and protect the interests of their teammates.
- Closed-loop communication: The exchange of information between a sender and a receiver, irrespective of the medium.
Task-related collaboration behaviours:
- Integrating team members' activities to ensure task accomplishment within established temporal constraints.
- Working together during task execution.
- Exchanging information by whatever means.

Team adjustment behaviours:
- Provision of tangible task-related help when a team member is failing to reach the goals defined by his or her role.
- Provision of feedback, and confronting members who break norms.
- Collectively finding and implementing a solution that brings actual conditions closer to the desired conditions; involves gathering and integrating information related to the problem, identifying alternatives, selecting the best solution, and implementing it (includes decision making).
- Team members' activities designed to invent and implement new and improved ways of doing their tasks.
- The voluntary assistance that team members provide to reinforce the sense of well-being of their teammates.
- Resolution of conflicts over tasks, processes and interpersonal issues.
- End of Document