Printed in Great Britain. © 2005 TEMPUS Publications.
278 B. Steward et al.
improve instruction. To remedy this situation, a variety of formative and summative assessment methods can be used to obtain feedback on student learning in the classroom [12]. For this study, two formative assessments were used: a weekly e-mail journal and a midterm e-survey about the course. Two summative assessments were also used: an end-of-term focus group and an end-of-term SEI form. While such formative and summative assessments generally identify student perceptions about instruction and learning rather than directly measuring if learning has taken place, Mentkowski has shown that there is a direct relationship between student perceptions of their learning and actual learning [13].

E-mail journals consist of written student reflections about their learning in a course and are periodically submitted to the instructor electronically. E-mail journals have been shown to promote communication between students and instructor, with benefits to both. These benefits include providing students with motivation to reflect on course material and opportunities to seek help in a non-threatening forum to improve their understanding of course material. Instructors benefit from e-mail journals by having access to an expanded sample of students' perceptions about course instruction and information about student learning, including misconceptions [14, 15]. Deal found that e-mail journaling also helped students develop improved self-assessment skills and better synthesize what they were learning [16]. She found commensurate benefits to instructors through the deeper understanding of student concerns and perceptions provided through the journals. The use of e-mail encourages timely communication concerning course material [5, 17]. The key component of this type of feedback is the closing of the loop between student questions and instructor responses. It is important for students to perceive that their questions and feedback are considered valuable to the instructor [17].

Teacher-designed surveys are another way to receive formative feedback. Using this type of feedback, adjustments can be made during the term. Instructors can solicit feedback on the course in general, or regarding specific projects, testing procedures, or presentation of course concepts. This type of feedback can be used several times throughout the term, but perhaps the most reasonable time for a survey is around midterm. Midterm feedback surveys are usually short, simple, and course specific [5]. When interpreting the feedback, the instructor must determine which changes can be made during the term, which will have to wait until the next term, and which cannot be implemented for pedagogical reasons [18]. Implementing a web-based midterm feedback survey provides the instructor additional flexibility in survey design and enables rapid collection and analysis of results [19].

Focus groups can be effective in obtaining specific summative data from event participants. A focus group is 'a carefully planned series of discussions designed to obtain perceptions on a defined area of interest in a permissive, non-threatening environment' [20]. According to Christopher, the open and interactive setting of the focus group facilitates deep thinking about a course and uncovers specific suggestions as to how it might be changed [21]. Hendershott and Wright used student focus groups to explore student attitudes about university general education curriculum requirements and behavior arising from these requirements [22]. They found that focus groups uncover 'rich data' going beyond information gleaned through surveys. Hamilton et al. found that student focus groups provided specific suggestions for course improvement as well as significant increases in SEI ratings [23].

Current literature supports the potential for using formative and summative assessment to improve instruction. However, little has been written showing how several assessment methods can be synergistically employed in the same course to promote course improvement. The goal of this research was to investigate the interaction and usefulness of several formative and summative classroom assessments in making course improvements. Specific objectives of the research were to (1) investigate and compare the use of two formative and two summative assessment tools to identify and understand student perceptions of their learning and of teaching methods in an engineering course and (2) determine how the formative assessments could successfully be used to make course adjustments during the course.

METHODS

The course under study was entitled Power and Control Hydraulics, an elective offered in the Agricultural Engineering curriculum within the Department of Agricultural and Biosystems Engineering at Iowa State University. The course provided an introduction to mobile hydraulic design for agricultural and off-road equipment. Students were expected to come into the class with credit or enrollment in fluid dynamics and basic engineering science prerequisites. Each week, the two-credit class met for two one-hour classroom periods in which the instructor discussed course content, solved example problems, and guided the students in active learning exercises. The latter involved students interpreting hydraulic schematic diagrams or solving problems in collaboration with their fellow students. Instructional methods included solving example problems on the board, presenting content with overhead projected slides, and using Microsoft PowerPoint presentations, including animations, to demonstrate the operation of hydraulic circuits and systems. In addition to the classroom sessions, students optionally enrolled in a weekly two-hour lab session. WebCT Campus Edition (WebCT Inc., Lynnfield, MA), an online course management and content delivery system [24], provided course content to students and administered periodic quizzes, practice exams, and a midterm survey.

Continuous Engineering Course Improvement through Synergistic use of Multiple Assessment 279

Four classroom assessments were implemented during the fall semesters of 2001 and 2002. There were 14 and 25 students in the 2001 and 2002 classes, respectively. The assessments were (1) a weekly e-mail journal, (2) a midterm feedback e-survey, (3) an end-of-term focus group, and (4) an end-of-term SEI form. Each of these assessment tools will be described in more detail in the following sections.

Weekly e-mail journal

Students completed a focused e-mail journal by submitting weekly responses to the following statements and questions that were developed by the course instructor:

- Summarize three main points discussed in today's class.
- What was most clear to you in today's class?
- What topics are you having difficulty understanding and why?
- What questions remain in your mind about the content of today's class that I could answer?
- What helped you learn in today's class?

This set was developed to address the objectives of the study and provide a good learning experience for the students. The number of questions was limited so that the students were not unnecessarily burdened by the weekly assignment. The e-mail answers to these questions were to be submitted by midnight of the day following the first classroom period of the week. This time frame was chosen so that the classroom experience was still fresh in the students' minds. In preparation for the second classroom period of that week, the instructor read the student submissions in one block of time. The instructor communicated his responses through (1) e-mail replies to the individual students posing questions, (2) e-mail replies to the entire class, and/or (3) replies incorporated into the following lecture. Five percent of each student's course grade was based on the proportion of possible journal entries that he/she submitted and on completion of the mid-term survey. Justification for basing a portion of the course grade on these two assessments came from the expectation that students communicating about course content and perceptions of their learning would facilitate further learning. The responses to the questions were also used in developing questions for the focus group sessions.

Midterm feedback e-survey

At mid-term, students were asked to complete a course survey administered through WebCT. While responses to the survey were anonymous, WebCT indicated which students responded to the survey. The survey consisted of the following questions that were developed by the instructor to achieve the objectives of the study:

- On average, how much time outside of class do you spend on AE 447 per week (please be honest)?
- What do you have the most difficulty understanding in AE 447?
- What can I do to help you learn about hydraulics?
- What suggestions do you have for improving the class?
- Please rate the instructor's performance in helping you learn (5 excellent to 1 poor).

The instructor examined the responses to identify recurring themes. Appropriate course adjustments were made based on this mid-term feedback. Ambiguities and questions arising from the data were used in the development of guiding questions for the subsequent focus groups.

End-of-term focus group

Near the end of each term, a pool of students was selected from each class to represent a cross section of past academic performance. These students were asked to participate in the focus group and were offered a light lunch as an incentive. Their participation was voluntary, and some students were unable to participate because of time conflicts. Focus group participants were selected randomly from those who completed the consent form at the beginning of the semester, although a cross-section of students with various cumulative grade point averages was used to ensure that the participants were not all high or low achieving students. Ten students were asked each time to participate, but not all attended because of conflicts. Guiding questions for the focus group discussions were developed based on e-mail responses and the midterm feedback e-survey. A focus group moderator and recorder, neither of whom was the course instructor, guided and recorded focus group discussions, which lasted approximately one hour. Discussions were recorded on audio tape, and the recorder made annotations to indicate which student was speaking. The audio tape was transcribed by a departmental secretary. In the focus group transcript, the anonymity of the participants was protected by changing the names of the students before it was released to the instructor. The instructor read and analyzed the transcript only after the course was finished. The transcripts were analyzed using the long table method to find potential answers to questions that were raised by data from the other assessments [20]. To help ensure that the students would respond honestly and accurately, they were told that the instructor would not know their identity and would not be involved in conducting the focus group. In 2001, eight out of 14 students participated (57%); in 2002, four out of 25 students participated (16%).
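The quantitative summaries reported for these assessments (participation rates and, later, category percentages of hand-coded responses) amount to simple tallies. As a minimal illustrative sketch, not tooling from the study itself, with hypothetical helper names:

```python
from collections import Counter

def category_percentages(coded_responses):
    """Tally hand-coded feedback responses into percentage shares per category.

    coded_responses: list of category labels, one per student response.
    The hand-coding itself is the qualitative step; this only tallies.
    """
    counts = Counter(coded_responses)
    total = len(coded_responses)
    return {category: 100.0 * n / total for category, n in counts.items()}

def participation_rate(participants, enrolled):
    """Percentage of an enrolled class that took part in an assessment."""
    return 100.0 * participants / enrolled

# Focus-group participation figures reported above:
print(round(participation_rate(8, 14)))   # 2001 class of 14 -> 57
print(round(participation_rate(4, 25)))   # 2002 class of 25 -> 16
```

The qualitative step, assigning each free-text response to a category, still has to happen by hand before any such tally is meaningful.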
Fig. 1. Percentages of student question type by category from the weekly e-mail journals (2001, N = 148; 2002, N = 285).
entries. One student replied when asked how the e-mail journals helped learning, 'You have to answer the same questions each week and if you don't have any questions then it's just a pain.' In addition, requiring students to write about course topics in the journal forced them to interact with and think more deeply about course content. One student illustrated this interaction when he/she wrote, 'The question that I have on the above topic deals with the example that we did in class . . . never mind the question, as I tried to figure out how to word it, I solved the question on my own.' The weekly e-mail feedback journal also allowed the instructor to gauge student perceptions about their learning and his teaching methods in a timely manner. The instructor was thus enabled to make better-informed judgments about how to guide the course to optimize student learning.

While the e-mail journal was effective, it was limited by two constraints. First, the quality of feedback depended on students providing feedback that truly represented their experience. Enthusiasm for responding to the e-mail journal tended to wane in the last half of the course, as indicated by the decrease in the number of responses as the semester progressed (Fig. 2). Second, reading through all of the student responses each week required a substantial amount of instructor time, and would have required even more if all responses had been categorized and reflected upon throughout the course. Typically, the instructor read through the responses when preparing for the next class, looking for questions to address to provide new opportunities for learning. The instructor also tried to gain an understanding of the difficulties students were encountering in order to provide a review of content in the next class to help learning. In incorporating this feedback into the subsequent class, he made modifications to the presentation and lecture notes. While a general understanding of student perceptions of learning came through the quick during-course analysis, more insight came after working with the data in the post-course analysis, which had value for subsequent classes. During each course term, for example, the instructor read student responses to the questions dealing with learning methods that the students felt were helpful. But often, it was difficult to gain much understanding from these individual responses. Part of the reason was that the responses were class session dependent, and the mix of teaching methods in different classes varied. In post-course analysis, however, more understanding was derived from these responses by examining aggregated data.

The responses to the learning methods question were categorized according to the type of teaching method that students felt best helped learning in particular classes. Across the two years, the percentages of responses were highest for working problems, multimedia, and instructor explanations, respectively. In 2001, 158 responses were collected. The category multimedia received the most responses (35%), while instructor explanations and working problems were the next highest, capturing 23% and 20% of the total number of responses, respectively. In 2002, 288 responses were collected and the working problems category received the most responses (42%), with multimedia (25%) and active learning (10%) receiving the next highest numbers of responses (Fig. 3).

Two possible reasons may explain why particular methods received high numbers of responses. First, particular methods were perceived as being useful in students' learning. Computer animations were often cited as helping learning, as the students found that animations helped to crystallize particular concepts. One student wrote that computer animations were 'really helpful to see these complex systems in motion to truly understand what is happening.' Second, some methods were used more frequently than others. The instructor explained course concepts and worked problems, for example, in practically every class period. It is thus expected that these categories would receive high response rates.

Midterm feedback e-survey

In 2001, 12 out of 14 students (86%) responded to the midterm feedback e-survey, and in 2002, 24 out of 25 students (96%) responded. The responses for specific questions requiring short answers ranged from no answers provided in a few cases,
Fig. 2. Percentage of possible number of e-mail feedback responses per week in the semester (2001, N = 14; 2002, N = 25).
Fig. 3. Percentages of responses to the weekly e-mail journal question, 'What helped you learn in today's class?' categorized by type of instructional method (2001, N = 158; 2002, N = 288).
to one or two word answers, to a response that consisted of 100 words. These responses provided formative assessment of student perceptions of the first half of the course. They provided a more global perspective on the course, as compared to the weekly e-mail journals, which provided perspective on individual classes. The midterm feedback e-survey helped the instructor better understand student learning difficulties by providing feedback that could be easily summarized and interpreted.

Nevertheless, when the students were asked what they found most difficult to understand, the largest number of responses across both years (31%) indicated that no problems existed or that no particular 'most difficult' concept could be ascertained (Fig. 4). However, of the particular concepts that students considered difficult, 25% of the responses were related to the interpretation of circuit diagrams. For example, one student wrote, 'Sometimes it's difficult to look at a schematic and know WHY it is laid out that way . . . it's relatively easy to see what the circuit does, but not always easy to understand why.' The third highest number of responses (17%) involved difficulties with unit conversions. Competency as a fluid power engineer requires skill in converting units in both English and S.I. measurement systems. These responses led the instructor to provide more example problems with unit conversions and to point out where students typically have difficulties with units in particular equations.

Similarly, responses to the question about how the instructor could help student learning also provided greater insight into student learning preferences than the responses in the weekly e-mail journal about learning methods used in individual classes. Two themes emerged from the student responses. One theme, indicated by 28% of the responses across both years, was that
Fig. 4. Percentages of responses to the most difficult to understand course areas from the midterm e-survey (2001 and 2002 classes combined; N = 36).
Fig. 5. Percentages of responses to what helped students learn best from the midterm e-survey (2001 and 2002 classes combined; N = 36).
having 'real world examples,' 'more examples that apply to real life,' and 'case study examples' would enhance their learning. The second theme that emerged (25% of the responses) was that the students thought more 'hands-on' exercises would help learning (Fig. 5). These two response categories clearly indicate student preferences for visual, sensory, and active learning. With this knowledge, the instructor has continued to introduce improvements to the course that better address these learning preferences.

When asked a more general question about what students would consider an improvement for the class, the students provided many differing responses such as, 'scale back the amount of work just a little,' or 'have class in a room where PowerPoint is available.' These suggestions were generally understandable and often provided specific information on how the course could be improved. In addition to these suggestions, a theme similar to those identified above emerged: many responses indicated that giving the course a more practical, hands-on orientation and working more problems could improve the course.

End-of-term focus groups

In general, focus group discussions consisted of honest, open, and frank opinions of what the students thought about the class. They seemed to be uninhibited in speaking their minds and free in providing negative comments.

Because of the small percentage (31%) of students involved in the focus groups, the results may not be representative of the entire class; however, the results were not in conflict with the other assessments, which collected data from the entire class. The discussions in 2001 and 2002 were quite different in character. In 2001, the focus group was larger, and the discussion tended to quickly drift away from the guiding question posed by the moderator. The answers provided by individual participants were terse, and the tone tended to be quite negative about the course. In 2002, the focus group was much smaller because of student time conflicts, and the participants provided longer, more thoughtful responses to the guiding questions. The tone of this discussion was more positive and suggestions were more constructive in nature. These differences between the focus groups illustrate a potential drawback of focus groups. While the open-ended interaction of the participants in focus groups, stimulated by the moderator's guiding questions, leads to useful data, it can also make the process of inquiry more difficult to control [22]. This lack of control can complicate the drawing of meaningful conclusions.

Nevertheless, the focus group assessment of the course had value because of the in-depth insights into students' thoughts about the course, students' perceptions of their learning, and students' observations on how the instruction and the other assessments were helpful to their learning. The focus group was summative and, as such, did not lead to instructional improvements during the same term. Even so, the deeper understanding of (1) student learning preferences and (2) perceptions of teaching methods derived from the focus group discussions was beneficial and applicable to subsequent terms.

Student learning preferences. Focus group discussions clarified feedback from other assessments, leading to a deeper understanding of how student learning was taking place. The focus group discussions clarified what it meant for the class to be more 'real-world' or 'practical', themes that arose from the midterm e-survey. Through the focus group discussion, we came to understand that when students refer to something being 'real-world,' they mean they can see the connection between the course content and where they might use a concept in their future careers as fluid power engineers. One student praised this building of connections when describing the instructor, 'He relates to industry . . . that really helps me.'

The idea of something presented in a real-world fashion may also be connected to student learning
preferences. The focus group discussions gave substantial insight into how students prefer to learn. In particular, focus group discussions clearly showed students' preference for visual over verbal learning. Visualization is very important to students. They thus found animations of circuits and systems that were shown in class helpful to their learning. One student said, '. . . you can visually see it on a screen and see things moving. I guess that's one thing that helps me learn is to be able to see it.' Another expressed appreciation for figures and illustrations, 'I think that's part of good teaching, to have lots of good visuals.'

This preference for the visual is easily contrasted with the students' dislike of reading textbooks to learn course concepts. Students indicated, for example, that they had difficulties reading the textbook to gain an understanding of how things work. A student remarked, 'I could read every manual on hydraulics, and it just wouldn't get it done for me. I wouldn't know any more than I know right now.' When asked about the textbook, one student said, 'Get rid of it.'

Similarly, the students expressed a preference to learn actively through hands-on lab experiences. One student stated this preference well, 'Once you've had hands-on experience and go back to the classroom, it makes it a lot easier. When you see it on paper, it means a lot more.' In reference to a demonstration of pumps in the lab, a student spoke up, 'That was great. Everyone remembered what a gerotor pump looked like because he showed us.'

Perceptions of instructional methods. The focus group discussions also provided summative reflections on the methods that helped learning across the entire course, in contrast to the weekly e-mail journal, which provided class dependent feedback, or the midterm feedback e-survey, in which the students were asked formatively what could be done in class to help their learning throughout the rest of the course. In the 2001 focus group discussion, the use of animations and active learning team exercises were briefly mentioned as being helpful. In 2002, the focus group discussion that followed from this question was considerably longer and included discussion of the usefulness of e-mail journals, example problems, concrete examples in industry, visualization, course notes, and course content on WebCT.

The two focus group discussions about particular instructional methods were substantially different. These differences can be illustrated by discussions about the use of WebCT. In the 2001 focus group discussion, the students seemed irritated and felt that using WebCT added additional work. Some students expressed displeasure with the additional information that the instructor made available on WebCT. One student expressed frustration about the course being based on information from multiple sources: not only traditional sources, like a textbook and course notes, but also online information. They did not appreciate having to go to WebCT to get information. 'When you are a professor, just don't put it on the web and expect us to just go get it,' remarked one student. New information was posted to WebCT during the course and led to another student saying, 'It's like you just have to be checking all the time. It's not a web-based course and so I don't feel the need to go there.'

In 2002, however, the focus group discussion about WebCT was very positive and the students seemed to view WebCT as a tool. This group of students preferred to have information available on-line: 'It's just a lot easier to get at stuff.' Another student said, 'I like . . . the availability of all the slides . . . on WebCT.' These students also appreciated the on-line quizzing feature of WebCT, '[WebCT is] very valuable. He usually has a practice test on there before the exam and you basically print that off and work through it, and it really helps.'

These differing focus group discussions illustrate several considerations about the use of focus groups. First, it is important to keep in mind that focus groups are only a sample that may not be representative of the entire class. In related research, we have observed that smaller focus groups, while being a smaller sample, often produce the most insightful discussions. As such, even though differences between the focus groups exist, those differences reveal the diversity of perceptions and attitudes about instructional methods. Second, it is important to consider information from other assessments when drawing conclusions from focus group discussions. Since most of the students were involved in the other assessments, those assessments are more representative of the entire class, while not providing the depth of insight that came from the focus groups. In addition, the student responses from the other assessments were more independent from one another than the responses in the focus groups. While some of the deep insights derived from focus group discussions came from the interactions of the participants, this interaction may also be a weakness, as the tone or attitude of a few persons can have a strong effect on the direction of the conversation.

One unexpected insight that became clear through the focus group discussions was that students perceived large demands on their time as they tried to juggle the multiple projects included in upper level engineering classes. For example, one student said, 'It happens in a lot of engineering classes that I've had. It all comes down to the final design project right at the end. In these last final three or four semesters, and they all come due about the same time so you have some difficulty in managing your schedule.' Thus, given this time pressure, the students expressed frustration about the open-ended design project assigned in this course.

As a result of time pressure, it appears many
students just meet basic course requirements. This is not necessarily a sign of laziness, but could be their method of managing the demands of multiple courses. In addition, students appreciate instructional methods that help them save time in finishing an assignment and tend to dislike methods that require additional time or are perceived as busy work. This may be the reason for the mixed reactions to the e-mail journals and WebCT. Along these lines, the students preferred the instructor to work problems in the classroom. These insights from the focus group discussions have led to conversations among faculty about course and capstone design projects and how we might accommodate these student concerns. Discussions have taken place about adjusting major assignment deadlines and possible integration of projects with other courses. Insights from the focus groups have also led the instructor to continually improve the course by integrating more of the methods that the students find helpful.

End-of-term SEI

A critical analysis of the departmental SEI form revealed that it reflected a teacher-centered paradigm of education. The first sentence on the SEI form was, 'Your frank and honest answers to each question in this evaluation will help your instructor improve this course and teaching procedures used in it.' This statement set the tone for the entire SEI form: that is, the quality of a course is primarily a matter of the instructor's performance. The questions related to the instructor and course solicited ratings based on how well the instructor had performed or met the expectations of the students. The instructor, for example, was rated on how well he 'knew the subject matter,' 'presented legible board work,' or 'was well-prepared for class.' The third group of questions addressed the adequacy of the physical classroom environment. Students were asked directly about their learning in only two questions: (1) 'The course assignments helped students learn subject material,' and (2) 'The course increased student knowledge of the subject.'

In 2001, 12 out of 14 students (86%) completed SEI forms, while in 2002, 22 out of 25 students (88%) completed the forms. In 2001, the mean scores ranged from 4.25 to 3.25, with the exception of a mean score of 2.83 for the statement, 'The text material was well-written and easily understood.' In 2002, the mean scores ranged from 4.45 to 3.24, with the question about the text drawing the lowest score. When asked if the course assignments helped them learn subject material, in 2001, the mean score was 3.42 and the standard deviation was 0.79. On the same question in 2002, the mean score was 3.77 and the standard deviation was 0.68. When asked if the course increased student [. . .] 4.00 and the standard deviation was 0.67. Using the ratings, it appears the students perceived that their learning was between 'satisfactory' and 'good' in both years.

Of the 34 forms from the two years, 19 (56%) had written comments on them. Students often composed their written comments with multiple phrases that were distinct suggestions, criticisms, or praise. From the written comments, 24 distinct phrases were found. Three of these phrases (13%) were positive statements such as: 'excellent, well presented class' or 'the course was good'. Six of the phrases (25%) were neutral statements or suggestions such as: 'more examples in class' or 'be more clear on everything'. The remaining 15 phrases (63%) were negative statements such as: 'too much work for two credits' or 'book didn't help much'. It was difficult to categorize the phrases because so many different topics were addressed. Of the categories drawing the largest number of responses, however, there were eight negative statements about the amount of work associated with the class. There were also three negative statements about the textbook.

While the SEI form provided a low-effort means of instructor evaluation, it tended to provide less feedback to instructors on how to improve learning. In particular, the quantitative measures reveal some measure of student satisfaction, and some basic guidance on course improvement could be derived from them. Generally, however, the scores did not depart from the range corresponding to satisfactory to excellent ratings, so not much meaning could be derived from these measures. In addition, the scores are difficult to interpret to gain understanding of how the course could be changed to effect improvements in student learning. For example, the low scores regarding the course text indicate consistent student dissatisfaction with the text, but they do not indicate how to improve the text. The written comments, if provided, have the potential to provide suggestions of how the course could be improved, but they tended to be dominated by negative comments.

Synergism of assessments

Through this research, we found a synergistic effect when using multiple formative and summative classroom assessment techniques for a course. Part of the reason for the synergistic effect of the multiple assessments was that the assessments differed in repetition, focus, and type of questions. Because of these differences, each of the assessments was probing at different points of information about teaching and learning, making it difficult to rank the value of one relative to another. In addition, the differences led to the combination of assessments providing a fuller view of teaching and learning than if each assess-
knowledge of the subject, in 2001, the mean score ment was used in isolation.
was 3.33 and the standard deviation was 0.98. On Through careful analysis of the data from each
the same question, in 2002, the mean score was of the assessments and use of questions arising
from one assessment to design or guide the analysis in another assessment, the interaction between student learning and instruction was more fully understood. Clearly, adequate assessment of student learning is both formative and summative and requires more than a traditional SEI. Formative assessment promotes student reflection on learning, provides the instructor with information that can be used to change the course during the term, and thus provides students with evidence that their feedback is critical to the learning process and is taken seriously by the instructor. As shown in other studies, while the SEI may be a valid indicator of instructional quality (Greenwald, 1997), it tends to provide less information useful for improving instruction. Having the other assessments available for the same course reveals how much more insight can be gained.

The resources required to implement multiple classroom assessments may be a point of concern, particularly for larger classes. Scaling multiple classroom assessments to larger class sizes, however, may not necessarily lead to prohibitively large increases in the time and effort required to administer the assessments. E-mail journals are the only assessment of those examined for which instructor time to review and respond is expected to scale linearly with class size. For the size of classes described in this paper, typically only about one hour per week was required to process this feedback. The mid-term feedback survey and the SEI were automated, and larger class sizes should not add much additional review time, although they will produce a larger dataset. Focus groups may benefit from a larger class size, which would provide a larger pool from which a focus group may be formed. Since there is no clear advantage to a larger focus group, and the resources required per focus group discussion are fixed in terms of faculty and staff time, a larger class should not lead to additional resource requirements.

CONCLUSIONS

Multiple assessments were helpful for a course instructor to gain understanding of how students learned and what instructional methods were perceived as helpful for learning in a fluid power engineering course. Formative assessments helped the instructor quickly understand where students had difficulties learning and enabled the instructor to make improvements both during the courses and from course to course. The following conclusions can be drawn from this study:

- The combination of assessments was helpful in understanding which instructional methods students preferred.
- The weekly e-mail feedback journal helped the instructor immediately understand where students were experiencing difficulty in learning particular concepts. This feedback allowed the instructor to make timely course adjustments to help students overcome these difficulties. Furthermore, students felt encouraged that their feedback was affecting how the course was taught. The students tended to dislike the weekly task of completing the journal entries, but some students found educational value in the assignment.
- The mid-term e-survey provided a more global perspective on student learning. While some specific suggestions were easily understood, other responses were somewhat difficult to interpret due to a lack of explanation.
- Focus group discussions provided insight into perceptions of student learning and instructional methods, as well as into how demands on students' time affected their perceptions of instructional methods. Depending on the characteristics of the focus group, discussions can sometimes go in seemingly divergent directions and are affected by the attitude and tone of the participants.
- The SEI provided a low-effort evaluation of student perceptions of the course, particularly their preferences and overall satisfaction with the instructor. It provided less useful information for course improvement than the other assessments.
- The use of multiple classroom assessments did not lead the authors to conclude that one or more assessments were better than the others. In fact, because they probed different aspects of student perceptions of their learning, it was difficult to rank the value of one relative to another. The multiple-assessment approach led to the synergism that resulted in deeper insight into the teaching-learning process.
REFERENCES
1. P. Black and D. Wiliam, Inside the black box: raising standards through classroom assessment, Phi
Delta Kappan, 80(2), 1998, pp. 139-148.
2. G. P. Wiggins, Assessing Student Performance, Jossey-Bass, San Francisco, CA (1993).
3. M. E. Huba and J. E. Freed, Learner-Centered Assessment on College Campuses: Shifting the Focus
from Teaching to Learning, Allyn and Bacon, Needham Heights, MA (2000) pp. 8, 121-150.
4. C. A. Palomba and T. W. Banta, Assessment Essentials, Jossey-Bass, San Francisco, CA (1999).
5. T. A. Angelo and K. P. Cross, Classroom Assessment Techniques: A Handbook for College
Teachers, 2nd ed., Jossey-Bass, San Francisco, CA (1993).
6. C. Boston, The concept of formative assessment, Practical Assessment, Research & Evaluation, 8
(2002). Retrieved June 21, 2003 from http://edresearch.org/pare/
7. K. O. Doyle, Evaluating Teaching, Lexington Books, Lexington, MA (1983).
Continuous Engineering Course Improvement through Synergistic use of Multiple Assessment 287