Professional Documents
Culture Documents
For
Human Resource Development
Chapters 4 to 7
Changes in System or Subsystem
New or changed equipment may present a training problem, e.g.:
The line has been shut down about once per day since the new machinery was installed
Waste has doubled since using the new cutting tool
Task Analysis
Task analysis (sometimes called operations
analysis) is a systematic collection of data about a
specific job or group of jobs used to determine
what employees should be taught to achieve
optimal performance
Overall job description
Task identification
What it takes to do the job/KSAs
Areas that can benefit from training
Prioritizing training needs
Data Sources For Task/Operational
Analysis – Part 1 of 3
Sources for Obtaining Job Data / Training Need Implications / Practical Concerns
1. Job Descriptions
Implications: Outlines the job’s typical duties and responsibilities, but is not meant to be all-inclusive.
Concerns: Often inaccurate due to time constraints or limited job knowledge.
2. Job Specifications
Implications: Lists specific tasks required for each job.
Concerns: May be a product of the job description and suffer from the same problems.
4. Perform the Job
Implications: Most effective way of determining specific tasks, but has serious limitations in higher-level jobs.
Concerns: Easy, short-cycle jobs are a possibility.
5. Observe Job—Work Sampling
Implications: Most effective way of determining specific tasks, but has serious limitations in higher-level jobs.
Concerns: Useful again for very short-cycle jobs; be aware that being observed can influence behavior.
Data Sources For Operational Analysis –
Part 3 of 3
Vendor credentials
Vendor background
Vendor experience
Philosophical match (between vendor and
organization)
Delivery method
Other Factors to Consider – 2
Content
Actual product
Results
Support
Request for proposal (RFP)
Selecting the Trainer
Training competency
How well can he/she train?
If they can’t train, why are they employed?
Subject Matter Expertise
How well is the material understood?
If No Subject-Matter Experts
(SMEs) are Available…
Use a team to train
Use programmed instruction or CBT
Train your trainers…
You are training subject matter experts to be
trainers
You are not training trainers to be SMEs
Preparing Lesson Plans
Content to be covered
Activity sequencing
Selection/design of media
Selection of trainee activities
Timing and phasing of activities
Method(s) of instruction
Evaluation methods to be used
Training Methods
Methods Percent
Instructor-led Classroom Programs 91
Self-Study, Web-based 44
Job-based Performance Support 44
Public Seminars 42
Case Studies 40
Role Plays 35
Games or Simulations, Non-computer-based 25
Self-Study, Non-computer-based 23
Virtual Classroom, with Instructor 21
Games or Simulations, Computer-based 10
Experiential Programs 6
Virtual Reality Programs 3
Media
Workbooks/Manuals 79
Internet/Intranet/Extranet 63
CD-ROM/DVD/Diskettes 55
Videotapes 52
Teleconferencing 24
Videoconferencing 23
Satellite/Broadcast TV 12
Audiocassettes 4
Program announcements
Program outlines
Training manuals and textbooks
Training aids, consumables, etc.
Scheduling Training
The Discussion Method
Two-way communication
Use questions to control lesson
Direct: produce narrow responses
Reflective: mirror what was said
Open-Ended: challenge learners to increase understanding
Challenges of Using the
Discussion Method
Maintaining control in larger classes
Needs a skilled facilitator
Needs more time than lecture
Trainees must prepare for the lesson by
reading assignments, etc.
Audiovisual Media
Printed materials
Lecture notes
Work aids
Handouts
Slides – e.g., PowerPoint
Overhead transparencies
Dynamic Media
Audio cassettes
CDs
Film
Videotape
Video disc
Telecommunications
Instructional TV
Teleconferencing
Videoconferencing
Experiential Training
Case studies
Business game simulations
Role Playing
Behavior Modeling
Outdoor training
Case Study Considerations
Hard-copy
Correspondence courses
Programmed instruction
Computer-Based Training (CBT)
Computer-aided instruction
Internet/intranet training
Hard-Copy Self-Paced (vs. Self-Paced
Computer-Based) Training
Computer-Aided Instruction
Internet & Intranet-Based Training (e-
learning)
Intelligent Computer-Assisted Instruction
Computer-Based Training
(Classroom-Based)
Group-based
Instructor is present and facilitates computer-
based learning
Trainees are collocated and can help each other
Requires computer, etc., for each trainee
Computer-Aided Instruction
(CAI)
Drill-and-practice approach
Read-only presentation of a “classic” training
program
Multimedia courses
Interactive multimedia training
Simulations
Advantages of CAI
Intranet
Internal to site/organization
Internet
General communications
Online reference
Needs assessment, administration, testing
Distribution of CBT
Delivery of multimedia
Intelligent CAI
Depends on:
Objectives
Resources
Trainee characteristics
Other Considerations
Concerning Implementation
Physical environment:
Seating
Comfort level
Physical distractions
Evaluating HRD Programs
Chapter 7
Effectiveness
The degree to which a training program (or
other HRD program) achieves its intended
purpose
Measures are relative to some starting
point
Measures how well the desired goal is
achieved
Evaluation
HRD Evaluation
Textbook definition:
“The systematic collection of descriptive and
judgmental information necessary to make
effective training decisions related to the
selection, adoption, value, and modification
of various instructional activities.”
In Other Words…
Are we training:
the right people
the right “stuff”
the right way
with the right materials
at the right time?
Evaluation Needs
Descriptive and judgmental information
needed
Objective and subjective data
Information gathered according to a plan
and in a desired format
Gathered to provide decision making
information
Purposes of Evaluation
Kirkpatrick’s Evaluation Framework
Reaction
Focus on trainee’s reactions
Learning
Did they learn what they were supposed to?
Job Behavior
Was it used on job?
Results
Did it improve the organization’s effectiveness?
Issues Concerning
Kirkpatrick’s Framework
Most organizations don’t evaluate at all four
levels
Focuses only on post-training
Doesn’t treat inter-stage improvements
WHAT ARE YOUR THOUGHTS?
A Suggested Framework – 1
Reaction
Did trainees like the training?
Did the training seem useful?
Learning
How much did they learn?
Behavior
What behavior change occurred?
Suggested Framework – 2
Results
What were the tangible outcomes?
What was the return on investment (ROI)?
What was the contribution to the organization?
Data Collection for HRD
Evaluation
Possible methods:
Interviews
Questionnaires
Direct observation
Written tests
Simulation/Performance tests
Archival performance information
Interviews
Advantages:
Flexible
Opportunity for clarification
Depth possible
Personal contact
Limitations:
High reactive effects
High cost
Face-to-face threat potential
Labor intensive
Trained observers needed
Questionnaires
Advantages:
Low cost to administer
Honesty increased
Anonymity possible
Respondent sets the pace
Variety of options
Limitations:
Possible inaccurate data
Response conditions not controlled
Respondents set varying paces
Uncontrolled return rate
Direct Observation
Advantages:
Nonthreatening
Excellent way to measure behavior change
Limitations:
Possibly disruptive
Reactive effects are possible
May be unreliable
Need trained observers
Written Tests
Advantages:
Low purchase cost
Readily scored
Quickly processed
Easily administered
Wide sampling possible
Limitations:
May be threatening
Possibly no relation to job performance
Measures only cognitive learning
Relies on norms
Concern for racial/ethnic bias
Simulation/Performance Tests
Advantages:
Reliable
Objective
Close relation to job performance
Includes cognitive, psychomotor, and affective domains
Limitations:
Time consuming
Simulations often difficult to create
High costs to develop and use
Archival Performance Data
Advantages:
Reliable
Objective
Job-based
Easy to review
Minimal reactive effects
Limitations:
Criteria for keeping/discarding records
Information system discrepancies
Indirect
Not always usable
Records prepared for other purposes
Choosing Data Collection
Methods
Reliability
Consistency of results, and freedom from collection
method bias and error
Validity
Does the device measure what we want to measure?
Practicality
Does it make sense in terms of the resources used to get
the data?
Type of Data Used/Needed
Individual performance
Systemwide performance
Economic
Individual Performance Data
Individual knowledge
Individual behaviors
Examples:
Test scores
Performance quantity, quality, and timeliness
Attendance records
Attitudes
Systemwide Performance Data
Productivity
Scrap/rework rates
Customer satisfaction levels
On-time performance levels
Quality rates and improvement rates
Economic Data
Profits
Product liability claims
Avoidance of penalties
Market share
Competitive position
Return on investment (ROI)
Financial utility calculations
Use of Self-Report Data
Research Design
Specifies in advance:
the expected results of the study
the methods of data collection to be used
how the data will be analyzed
Research Design Issues
Return on investment = Operational results ÷ Training costs
= $220,800 ÷ $32,564
= 6.8
SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
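The ratio above is just dollar results divided by training costs; a minimal sketch using the Robinson & Robinson figures:

```python
def roi(operational_results: float, training_costs: float) -> float:
    """Return on investment: dollar results divided by training costs."""
    return operational_results / training_costs

# Figures from the Robinson & Robinson (1989) example above
ratio = roi(220_800, 32_564)
print(round(ratio, 1))  # 6.8
```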
Types of Training Costs
Direct costs
Indirect costs
Development costs
Overhead costs
Compensation for participants
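The five cost categories above are summed to get total program cost. A sketch with entirely hypothetical dollar figures:

```python
# Hypothetical cost figures for one training program (illustration only)
costs = {
    "direct": 18_000.0,        # instructor pay, materials, travel, food
    "indirect": 6_500.0,       # training management, clerical, shipping
    "development": 9_000.0,    # designing the program and its materials
    "overhead": 4_200.0,       # general organizational support
    "compensation": 15_300.0,  # participants' pay while in training
}
total_cost = sum(costs.values())
```

The total feeds into the ROI and utility calculations discussed later in the chapter.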
Direct Costs
Instructor
Base pay
Fringe benefits
Travel and per diem
Materials
Classroom and audiovisual equipment
Travel
Food and refreshments
Indirect Costs
Training management
Clerical/Administrative
Postal/shipping, telephone, computers, etc.
Pre- and post-learning materials
Other overhead costs
Development Costs
U = (N)(T)(dt)(SDy) – C
where N = number trained, T = duration of the training’s effect in years, dt = true performance difference between trained and untrained employees (in standard deviation units), SDy = dollar value of one standard deviation of job performance, and C = the cost of training
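The utility formula above can be sketched directly; the figures below are hypothetical, chosen only to show the arithmetic:

```python
def training_utility(n, t, dt, sd_y, cost):
    """U = (N)(T)(dt)(SDy) - C: dollar utility of a training program.

    n     -- number of employees trained (N)
    t     -- duration in years of the training's effect (T)
    dt    -- true performance difference, in standard deviation units,
             between trained and untrained employees (dt)
    sd_y  -- dollar value of one SD of job performance (SDy)
    cost  -- total cost of the program (C)
    """
    return n * t * dt * sd_y - cost

# Hypothetical figures: 50 trainees, 2-year effect, dt = 0.4,
# SDy = $10,000, program cost $120,000
u = training_utility(50, 2.0, 0.4, 10_000.0, 120_000.0)
# 50 * 2.0 * 0.4 * 10,000 = 400,000; 400,000 - 120,000 = 280,000
```

A positive U suggests the program returns more in performance value than it costs; a negative U suggests the reverse.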
HRD Evaluation Steps
1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training
objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to
measure results.
6. Plan and execute evaluation strategy.
Thank you