Cooper, R., & Turner, K. (April 10, 2015)
ELBOW UP!
What are you doing, or what would you like to be doing, to evaluate your programs?
Scott VanderWey
Robby Cooper
Clinical Assistant Professor
Center for Transformational Learning & Leadership
Washington State University
Ken Turner
Challenge Course Manager
Seattle Parks and Recreation
Learning Objectives
1. Understand key elements of a
successful evaluation tool.
2. Gain new tools and techniques for
developing evidence-based programming.
3. Understand current trends in
educational research.
4. Create an implementation plan for new ideas.
A Land Grant
Research
University
ADVENTURE EDUCATION:
EXPERIENTIAL EDUCATION
METHODOLOGIES
Natural Resource Education
The Learning Communities
Model: Building Successful
Learning Communities (BSLC)
Camp Long Challenge Program
Recommended Sequence
1. Low course: single to multiple visits (typically 4 hours + lunch)
2. Mt. Challenger High Course with trapeze jump and belayed climbing
3. Enchantments High Course with zip lines
Logic Model
Evaluation Development
The beginning: "We know it works"
Goals
Process: Am I running my program well?
Outcome: Is my program making an impact?
Outcomes
Adventure Education
Outcomes
Targeting Life Skills
Model
Program Outcomes
* = self-efficacy
Measures
Concerns: Reliability, Validity

Measures
Communication (Adolescent Health)
Decision-making (Adolescent Health)
Procedures
What can we realistically and reasonably accomplish that will provide a valid & reliable measurement?
Time
Resources
Qualifications (administration, analysis)
Setting
Ethics
Procedures
Notice to Parents sent out with program registration materials

WSU EXTENSION CHALLENGE COURSE EVALUATION
INSTRUCTIONS FOR FACILITATOR
Facilitators,
Procedures
Pretest-Posttest Design
12-16 item pretest
21-25 item posttest (satisfaction, demographics)
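As a hypothetical sketch of how a pretest-posttest instrument like this can be scored (the item counts and Likert values below are invented for illustration, not the program's actual data), each subscale can be averaged per participant and the pretest-to-posttest change computed:

```python
import numpy as np

# Invented Likert-item responses: rows = participants, columns = subscale items
pre_items  = np.array([[3, 4, 3], [2, 2, 3], [4, 5, 4]])
post_items = np.array([[4, 4, 4], [3, 3, 3], [5, 5, 4]])

# Score the subscale as the mean of its items, then take post minus pre
pre_scores  = pre_items.mean(axis=1)
post_scores = post_items.mean(axis=1)
change = post_scores - pre_scores   # per-participant change score
```

Mean change scores like these feed the significance tests and effect sizes reported later.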
Analysis: Scale Reliability
Reliability of Measures (Cronbach's Alpha)*

Measure            Version 1
Communication      .30
Decision-making    .48
Teamwork           .34
Self-efficacy      .33
Version 2 Revision
Procedural Fidelity
- training
- audience
Realistic Expectations
- survey
Analysis: Scale Reliability
Reliability of Measures (Cronbach's Alpha)*

Measure            Version 1   Version 2
Communication      .30         .69
Decision-making    .48         .79
Teamwork           .34         .68
Self-efficacy      .33         .61
Results (August 2014 - October 2014)

Measure            N     Mean Score Change   t      Sig. (p)   Effect Size (d)
Communication      399   .172                5.99   .000*      .22**
Decision-making    462   .075                2.92   .004*      .12
Self-efficacy      197   .152                4.52   .000*      .26**
Teamwork           462   .103                4.32   .000*      .15
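Statistics of this shape follow the usual paired-samples pattern: each participant's posttest is compared to their own pretest, and the effect size scales the mean change. A minimal sketch, assuming a paired t statistic and Cohen's d computed as mean change divided by the SD of the change scores (the data below are invented, not the program's results):

```python
import math
import numpy as np

def paired_t_and_d(pre, post):
    """Paired-samples t statistic and Cohen's d (mean change / SD of change)."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = diff.size
    mean_change = diff.mean()
    sd_change = diff.std(ddof=1)          # sample SD of the change scores
    t = mean_change / (sd_change / math.sqrt(n))
    d = mean_change / sd_change
    return t, d

# Invented pre/post scores for five participants
t, d = paired_t_and_d(pre=[1, 2, 3, 4, 5], post=[2, 2, 4, 5, 7])
```

By the common benchmarks (d of .2 small, .5 medium, .8 large), the communication and self-efficacy changes in the table would read as small but statistically reliable effects.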
Dissemination
Presentations
Trainings
Publications
Learning Objectives
1. Understand key elements of a
successful evaluation tool.
2. Gain new tools and techniques for
developing evidence-based programming.
3. Understand current trends in
educational research.
4. Create an implementation plan for new ideas.
Contact
Information
WSU Extension
(253) 445-4581
Scott VanderWey
vanderwey@wsu.edu
Robby Cooper
robby.cooper@wsu.edu
Ken Turner
keno.turner@seattle.gov