Annual Review of Applied Linguistics (1995) 18, 212-226. Printed in the USA. Copyright © 1995 Cambridge University Press 0267-1905/95 $9.00 + .10

APPROACHES TO ALTERNATIVE ASSESSMENT

Else V. Hamayan

INTRODUCTION

Current trends in assessment, no longer based on the view that language learning entails a passive accumulation of skills, have led to the increasingly common use of assessment procedures that differ quite drastically from standardized norm-referenced measures of language proficiency (Calfee and Hiebert 1991, Calfee and Perfumo 1993, Gifford and O'Connor 1991). Increasing criticism of standardized tests, especially in light of current educational reform movements, has also brought into question the value of other indirect approaches to assessment (Clay 1990, Cohen 1994, Damico 1992, Haladyna 1992, Oller 1992, Pikulski 1990, Worthen 1993). Additionally, interest groups representing both linguistically and culturally diverse students and students with special education needs have called for a change in our approaches to assessment. The goal is to ensure equity in educational opportunities and to strive toward educational excellence for all students (Council of Chief State School Officers 1992, Fradd, McGee and Wilen 1994, Hamayan and Damico 1991, LaCelle-Peterson and Rivera 1994). Although some researchers suggest that it is false to assume that alternative assessment approaches automatically ensure equity for diverse populations (Darling-Hammond 1994), these approaches nonetheless provide a wealth of information which must minimally serve as a context for a more valid interpretation of all standardized test results. In a more central capacity, information from these alternative assessment procedures can constitute the sole basis for much educational and instructional decision-making (Damico 1992).

Two other factors have contributed to changes in assessment practices and the demand for assessment reform.
The first is the increasing importance of the relationship between assessment and both teaching and learning (Murphy and Smith 1992, Stayter and Johnston 1990). An increasing number of educators are calling for assessment in the classroom to guide instruction (Genesee and Hamayan 1994). Further, since assessment and learning are closely tied, assessment practices need to change so that they reflect the learning process (Marzano, Pickering and McTighe 1993). The second contributing factor toward a change is the evolving nature of educational goals, which are currently directed to more sophisticated and higher standards than has been the case over the last twenty years (Au, Scheu, Kawakami and Herman 1990, Perrone 1991). In the 1970s, the emphasis on the acquisition of minute skills led to a proliferation of discrete-point tests that are simply not adequate to measure achievement of goals outside the traditional areas of language proficiency (Marzano, Pickering and McTighe 1993). More holistic and integrative views of language, and the push toward the development of higher-order skills, have given rise to alternative approaches to assessment.

Alternatives to standardized assessment have been referred to in the literature in many ways: "alternative assessment," "informal assessment," "authentic assessment," "performance assessment," "descriptive assessment," and "direct assessment." Although these labels reflect subtle differences in focus, they share some basic characteristics (Meyer 1992). In this paper, the term "alternative assessment" will be used since it is more generic than the other terms and it incorporates characteristics of the other commonly-used labels. Alternative assessment refers to procedures and techniques which can be used within the context of instruction and can be easily incorporated into the daily activities of the school or classroom.
Unlike standardized testing, it does not provide a comparison of an individual to a larger group beyond the students in a given classroom (Navarrete, Wilde, Nelson, Martinez and Hargett 1990).

This review examines the major characteristics of alternative assessment, uses of alternative assessment procedures, and different types of alternative assessment. Issues that determine both the current status of alternative assessment (especially its role in the larger assessment field) and the future development of alternative assessment will also be discussed.

CHARACTERISTICS OF ALTERNATIVE ASSESSMENT

1. Proximity to actual language use and performance

Alternative assessment procedures are based on activities that have authentic communicative function rather than ones with little or no intrinsic communicative value. Because these procedures strive for a more direct representation of language use and language behavior, they tend to be based on actual performance in authentic situations which the learner is likely to encounter in his or her daily life (Glazer and Brown 1993, Goodman, Goodman and Hood 1989, Tierney, Carter and Desai 1991). Consequently, much of alternative assessment is classroom-based and locally developed. This shift of emphasis has led to a social change within the culture of school: The role that language teachers have begun to play in assessment has changed from one of recipient of information about the learner, usually given by assessment "specialists," to a provider of information to others such as administrators, policy makers, and other teachers. The increasing popularity of alternative procedures has opened up the realm of assessment to include teachers who are not likely to be specialists in the area of testing, research, evaluation, and psychometrics. The implications of such a change for teacher preparation and staff development have been discussed by several educators (Darling-Hammond and Goodwin 1993, Routman 1991).

2.
A holistic view of language

Alternative assessment procedures are based on the notion that the interrelationships among the various aspects of language, such as phonology, grammar, and vocabulary, cannot be ignored. Also, the four skills of language—listening, speaking, reading, and writing—are seen to be parts of a structurally integrated whole. Through alternative assessment approaches, language can be assessed not so much as structure but rather as a tool for communication and self-expression (Eggleton 1992, Goodman, Goodman and Hood 1989). However, the use of alternative assessment procedures still allows for analyses at the structural level of language and provides descriptions of discrete aspects of language (Hamp-Lyons 1992). Alternative assessment also takes into account the whole learner and his or her social, academic, and physical context. Consequently, by assessing the learner in various natural settings, a more holistic evaluation is possible (Harp 1991).

3. An integrative view of learning

Alternative assessment attempts to capture the learner's total array of skills and abilities (Tierney, Carter and Desai 1991). Through alternative assessment procedures, it is possible to measure language proficiency in the context of specific subject matter (Hill and Ruptic 1994, Short 1993, Turner 1992). Thus, for school-age learners, questions can be answered regarding students' ability to process information in English in areas of science or social studies; in the case of adult learners, one might assess how well a person can hold a conversation in a business setting. Alternative assessment procedures are also based on the idea that various aspects of a learner's life, both academic (or professional) and personal, are integral to the development of language proficiency and cannot be ignored (Baskwill and Whitman 1988).
Alternative assessment also allows for the integration of various dimensions of learning as they relate to the development of language proficiency. These dimensions include not only processes such as acquiring and integrating knowledge, extending and refining knowledge, and using knowledge meaningfully, but also issues such as varying student attitudes towards learning (Davies, Cameron, Politano and Gregory 1992, Marzano 1994, Marzano, Pickering and McTighe 1993).

4. Developmental appropriateness

Alternative assessment procedures set expectations that are appropriate within the cognitive, social, and academic development of the learner (Grace and Shores 1991, Tierney, Carter and Desai 1991). Because it is possible to design assessment that meets individual learners' needs, alternative assessment reveals information about a learner's proficiency in the context of what is relevant to that learner's life and experiences (Harp 1991, Kletzien and Bednar 1990, Roderick 1991). It also allows for a more valid interpretation of information than that obtained from more traditional standardized tests. This characteristic of alternative assessment makes it particularly valuable for second language learners who come from culturally diverse backgrounds and who may have atypical educational experiences.

5. Multiple referencing

Alternative assessment, perhaps because of the untrustworthy psychometric characteristics that many associate it with and the distrust that a single measure sometimes elicits, usually entails obtaining information about the learner from numerous sources and through various means (Barrs 1990, Flood and Lapp 1989).
Thus, a typical portfolio that is used to evaluate a student's language proficiency may include scores from a standardized test, writing samples, teachers' observations of the student as he or she participates in small group work, ratings of the student's work in classes other than English as a second language, and ratings of the student's performance with hands-on tasks (Goodman, Goodman and Hood 1989, Paulson, Paulson and Meyer 1991).

PURPOSES AND USES OF ALTERNATIVE ASSESSMENT

Unlike standardized testing, which usually produces a score that may not be meaningful by itself, information from alternative assessment is easy to interpret and understand. This represents a tremendous benefit for all the possible clients of assessment. For students, alternative assessment allows them to see their own accomplishments in terms that they can understand and, consequently, it allows them to assume responsibility for their learning (Alexander 1993, Jonker 1993, Rief 1990). Alternative assessment enables parents to share in the educational process, and it offers them a clear insight into what their children are doing in school (Davies, Cameron, Politano and Gregory 1992, Flood and Lapp 1989, Hill and Ruptic 1994). For teachers, the primary advantage of alternative assessment is that it provides data on their students and their classroom for educational decision-making. In addition, it chronicles the success of the curriculum and provides teachers with a framework for organizing students' work. Even administrators, who are typically least convinced of the advantages of alternative assessment, can benefit from the clear information about student and teacher attainment over time (Kramer 1990, Mills 1989).

Traditionally, testing and assessment have been used primarily for the purpose of evaluating the learner. It is only recently that a second purpose is being called for—evaluating instruction (Genesee and Hamayan 1994).
Alternative assessment lends itself well to both purposes, especially the latter, since the teacher has a measure of control when using it (Eggleton 1992).

1. Evaluating the learner

Alternative assessment provides us with an insight into individual students' language proficiency that cannot be obtained from standardized tests. The information obtained from alternative assessment is extensive and reflects a wide range of abilities and skills in language in a variety of contexts (Graves and Sunstein 1992, Hamp-Lyons 1992). Through alternative assessment, it is possible to get a sense of how the learner manages a conversation with a peer, expresses him- or herself in writing, or is able to conduct an experiment in science while working with English-speaking peers in the classroom. Because most alternative assessment is ongoing over the period of a year (or at least a semester), the picture that emerges about the learner and his or her language proficiency reflects the developmental processes that take place in language learning over time (Kramer 1990). Thus, through alternative assessment, it is possible to focus on the process as well as the product of learning. If a portfolio system is used to keep assessment records on each student, students can easily see their own progress, and since portfolios feature the best of a student's work, the assessment procedures become highly motivating for the students (Frazier and Paulson 1992, Levi 1990, McGrail and Purdom 1992, McGrail and Schwartz 1993, Valencia 1990).

2. Evaluating the instruction

Because of a changing outlook on assessment from one that is learner-centered to one that is more responsive to the entire learning environment, alternative assessment procedures are being successfully used to assess not only the learner but also the classroom and the instruction (Genesee and Hamayan 1994, Mathews 1990).
Although the sole focus of many assessment initiatives continues to be on the learner, many educators have called for a closer link between instruction and assessment (Fradd, McGee and Wilen 1994, Cambourne and Turbill 1990, Tierney, Carter and Desai 1991). They have suggested that assessment be part of a feedback loop that allows teachers to monitor and modify instruction continually in response to results of student assessment. This process encourages teachers to use the results to draw conclusions about instruction and not just about the learners (Genesee and Hamayan 1994). As a result of the increasing legitimacy of alternative assessment, which is mostly classroom-based, one further important change has occurred: it has given teachers the power of assessment.

TYPES OF ALTERNATIVE ASSESSMENT

Some educators categorize the various types of alternative assessment according to whether they are structured or unstructured (e.g., Navarrete, Wilde, Nelson, Martinez and Hargett 1990). Unstructured techniques are defined as being limited only by the creativity of the teacher and students—basically, any activity that can be done within the realm of school. Structured techniques are planned to a greater extent and tend to have clear outcomes such as "completed" or "not completed" attached to them. Assessment techniques have also been categorized according to whether they focus on process or product, with the former demonstrating how the student processes information and the latter focusing on the outcome of a behavior, task, or activity (Herman, Aschbacher and Winters 1992). It is evident that these categories are not always mutually exclusive: An assessment technique may be more or less structured depending on the way it is set up or the context in which it occurs.
Similarly, information about either the product or the process may be obtained from the same assessment procedure depending on the focus of the assessment (Belanoff and Dickson 1991; cf. Short 1993 for another framework for organizing types of assessment). It is also helpful to distinguish activities that yield assessment information from those techniques and procedures which are used to organize and record assessment information (Hamp-Lyons 1992).

1. Activities that yield alternative assessment information

Practically any classroom, school, or language-related activity can serve as a source of information about the learner, his or her language proficiency, the learning process, the effectiveness of instruction, or the classroom. The following list summarizes the more commonly used activities (Batzle 1992, Feuer and Fulton 1993, Goodman, Goodman and Hood 1989, Harp 1991, Hill and Ruptic 1994, Pierce and O'Malley 1992, Popham 1993, Tierney, Carter and Desai 1991):

1. Writing samples: Any writing produced by the student can be used to assess language proficiency and student progress. Written work may include creative writing, correspondence, essays, or writing in response to prompts (Eggleton 1992, Hamp-Lyons 1992).

2. Learning logs or journals: Frequent entries that students make in their journals can give teachers an insight not only into students' language proficiency but also into their perceptions of the learning process.

3. Classroom projects: Individual or group projects that are completed in class can be a source of information about the student's ability to function in a given curricular area or in interactions and negotiations with peers. Measures can include both the process of task completion and its product.

4.
Interviews: Interviews with individual students can yield extensive information about the student's language and, more importantly, about the process of learning; they also allow for students' reflections on aspects of instruction (Canales 1992).

5. Think-alouds: As students complete a task, perhaps even a standardized or multiple-choice test, they can reflect aloud on what they are doing.

2. Ways of recording alternative assessment

Once a language sample or a behavior has been elicited, it can be recorded and analyzed in several ways. The level of analysis depends on the purpose for the assessment. Although the goal of most alternative assessment is to obtain a holistic, integrated representation of a student's language, it is possible, of course, to take language obtained through any of the above activities and analyze discrete aspects of the product. Alternative ways of recording and analyzing information are listed below (Harp 1991, Hill and Ruptic 1994, Jasmine 1993, Rhodes 1993, Rhodes and Nathenson-Mejia 1992, Tierney, Carter and Desai 1991, Valeri-Gold, Olson and Deming 1991/1992):

1. Anecdotal records of observations: Notes written throughout the day or the class representing the teacher's observations of various students.

2. Checklists: Lists of the student behaviors or products expected from a given task or activity. Either the teacher or the student may use the checklist to complete an assessment.

3. Rating scales: Rather than noting the presence or absence of a given behavior, the observer (the teacher or the student) rates each item according to ability, frequency, extent, etc.

4. Inventories: This type of assessment can be used to list students' interests, language habits, or their learning activities.

PROCEDURES FOR SETTING UP ALTERNATIVE ASSESSMENT

Although alternative assessment approaches are often associated with informality, they must adhere to rigorously thought-out criteria and must be planned carefully (Fingeret 1993).
The following procedures have been suggested for setting up a system of alternative assessment (Arter and Spandel 1992, Genesee and Hamayan 1994, Herman, Aschbacher and Winters 1992, McGrail 1991; 1992, Wiggins 1993a; 1993b). First, the system and design of the alternative assessment to be used need to be determined, including the designation of responsibilities for the various parts or phases of the assessment. Next, the purposes of assessment have to be clarified and agreed upon; if the alternative assessment is to be used for large-scale evaluations, then the way that the assessment is carried out, and the content of the assessment, will be quite different from that of a classroom-level evaluation. Finally, how the assessment fits in with instruction and how it articulates with the curriculum is another decision that has to be made.

These factors will determine to a large extent the content of the assessment and the particular tasks that are used to gather information about the learner. The following questions then need to be considered: 1) Do the activities selected for gathering information adequately represent the content and skills that students are expected to master? 2) Do the assessment results address the goals of the program or classroom? 3) Do the activities allow students to demonstrate their skills and abilities?

Once the tasks and activities for assessment have been determined, criteria for judging student performance must be established to ensure reliability, validity, and authenticity. Criteria are necessary to guide judgments and to make public to all the clients of assessment—students, parents, other teachers, administrators, and community members—the basis for those judgments. Because most alternative assessment procedures involve a subjective component, criteria or scoring guidelines are needed (Linn, Baker and Dunbar 1991).
Criteria are also essential as guides that help students complete the activities on which they are to be judged. In the context of current educational reform movements, much effort is being put into the development of rubrics, or criteria, that describe student performance at various levels of proficiency (O'Neil 1994). The higher the stakes associated with an assessment, the greater the need to document its validity and reliability.

CONCLUSIONS

The increase in calls for alternative assessment has changed the face of assessment and evaluation in the last decade (Simmons 1990). Nevertheless, alternative assessment approaches have yet to come of age. As long as they are referred to as "alternative" or "informal," they maintain their status as non-mainstream. Although their effectiveness and value have been demonstrated (Abruscato 1993, Gomez, Graue and Bloch 1991, Madaus and Kellaghan 1993, O'Neil 1992, Vermont State Department of Education 1990, Worthen 1993), their use for large-scale and high-stakes evaluation remains minimal (Lamme and Hysmith 1991, McDonald 1993, Popham 1993, Simmons 1990). In this regard, Worthen (1993) identifies a number of major issues for the future of alternative assessment. First, conceptual clarity is needed to ensure consistency in the applications of alternative assessment. Second, until a mechanism for evaluation and self-criticism is established, alternative assessment cannot become a viable force in education. Third, the users of alternative assessment, whether they are teachers or administrators, need to become well versed in issues of assessment and measurement. Fourth, although one of the most significant advantages of alternative assessment is its flexibility and its allowance for diversity, unless some standardization is introduced, the future of alternative assessment for high-stakes decisions is questionable.
Fifth, the fiscal and logistic feasibility of alternative assessment for large-scale assessments remains to be shown. As Worthen (1993) suggests, unless these issues are resolved, alternative assessment cannot reach its full potential in education.

ANNOTATED BIBLIOGRAPHY

Eggleton, J. 1992. Whole language evaluation: Reading, writing, and spelling for the intermediate grades. Bothell, WA: The Wright Group.

This practical guide provides suggestions for the monitoring and evaluation of reading and writing programs for children in the middle years of instruction. Although this book focuses on the monitoring of instruction, it can prove to be a rich source of ideas for the informal assessment of individual students' literacy skills. The book shows the strong link that needs to be made between assessment and instruction.

Fingeret, H. A. 1993. It belongs to me: A guide to portfolio assessment in adult education programs. Durham, NC: Literacy South.

This excellent guide first sets the theoretical context for the use of portfolios in adult education programs and then describes four stages for setting up portfolio assessment. The four stages are described as follows: choosing portfolio assessment, planning portfolio assessment, implementing and evaluating portfolio assessment, and revising portfolio assessment. The text is interspersed with quotations from both teachers and students, thus making the discussion very practical. An annotated bibliography also provides a valuable resource.

Genesee, F. and E. V. Hamayan. 1994. Classroom-based assessment. In F. Genesee (ed.) Educating second language children. New York: Cambridge University Press. 212-239.

This chapter is based on the notion that assessment needs to be an integral part of the design of instruction. The authors suggest ways in which teachers can make decisions on the basis of ongoing assessment that they conduct in the classroom.
The authors provide suggestions for planning assessment and describe a variety of assessment techniques and procedures that can be used. The chapter is particularly useful for teachers who are not necessarily well versed in the area of assessment and evaluation.

Harp, B. (ed.) 1991. Assessment and evaluation in whole language programs. Norwood, MA: Christopher-Gordon Publishers.

This collection of twelve chapters sets informal assessment in the context of holistic education, particularly in a primary school setting, and a thorough discussion of the principles underlying holistic assessment is provided. Most of the strategies described focus on the assessment of reading and writing, including miscue analysis, the use of literacy checklists, and writing portfolios. The book also contains chapters on assessment in special education classrooms and bilingual and multicultural classrooms. A discussion of record-keeping and reporting issues completes this well-rounded volume.

Herman, J. L., P. R. Aschbacher and L. Winters. 1992. A practical guide to alternative assessment. Alexandria, VA: Association for Supervision and Curriculum Development.

This book is a must for anyone who is interested in using alternative assessment. It includes a review of the context that has supported alternatives to standardized testing, and it focuses on the importance of linking assessment and instruction. It also provides very helpful suggestions for the following steps in setting up an assessment procedure: determining purpose, selecting tasks, setting criteria, ensuring reliability, and using results for decision making.

Hill, B. C. and C. Ruptic. 1994. Practical aspects of authentic assessment: Putting the pieces together. Norwood, MA: Christopher-Gordon Publishers.
This useful and highly practical guide to authentic assessment covers a range of topics that place assessment in very wide contexts, including the assessment of both academic areas and language acquisition. It also addresses issues regarding the involvement of parents as well as students. The book contains many forms and checklists which can be adapted to various types of learners in different contexts.

Short, D. 1993. Assessing integrated language and content instruction. TESOL Quarterly. 27.627-656.

This article is unique in its direct and explicit focus on the implications for assessment when language and content instruction is integrated. It provides an excellent review of assessment reform and a framework for integrated language and content assessment. The descriptions of the skills to be measured and the procedures for assessment are very clear and useful.

Tierney, R. J., M. A. Carter and L. E. Desai. 1991. Portfolio assessment in the reading-writing classroom. Norwood, MA: Christopher-Gordon Publishers.

The wealth of information about the theory and practice of portfolio assessment provided in this book makes it an essential resource for any language educator. Despite the fact that the book is geared to teaching children, the content can be applied to language learners of all ages. The philosophy of assessment and instruction within which portfolios are placed is presented clearly, and each step in portfolio development is described in detail. Specific techniques, lists, and forms that can be adapted to many situations, and an annotated bibliography, make this book indispensable.

UNANNOTATED BIBLIOGRAPHY

Abruscato, J. 1993. Early results and tentative implications from the Vermont Portfolio Project. Phi Delta Kappan. 74.474-477.

Alexander, D. 1993. The ESL classroom as community: How self assessment can work. Adventures in Assessment. 4.34-37.

Arter, J. A. and V. Spandel. 1992.
Using portfolios of student work in instruction and assessment. Educational Measurement: Issues and Practice. Spring.36-44.

Au, K., A. Scheu, A. Kawakami and P. Herman. 1990. Assessment and accountability in a whole literacy curriculum. The Reading Teacher. 43.574-578.

Barrs, M. 1990. The primary language record: Reflection of issues in evaluation. Language Arts. 67.244-253.

Baskwill, J. and P. Whitman. 1988. Evaluation: Whole language, whole child. New York: Scholastic.

Batzle, J. 1992. Portfolio assessment and evaluation: Developing and using portfolios in the classroom. Cypress, CA: Creative Teaching Press, Inc.

Belanoff, P. and M. Dickson (eds.) 1991. Portfolios: Process and product. Portsmouth, NH: Heinemann.

Calfee, R. C. and E. Hiebert. 1991. Classroom assessment of reading. In R. Barr, L. Kamil, P. Mosenthal and P. D. Pearson (eds.) Handbook of reading research, Volume 2. White Plains, NY: Longman. 281-309.

Calfee, R. C. and P. Perfumo. 1993. Student portfolios: Opportunities for a revolution in assessment. Journal of Reading. 36.532-537.

Cambourne, B. and J. Turbill. 1990. Assessment in whole-language classrooms: Theory into practice. The Elementary School Journal. 90.337-349.

Canales, J. 1992. Innovative practices in the identification of LEP students. In Office of Bilingual Education and Minority Languages Affairs (ed.) Proceedings of the Second National Research Symposium on Limited English Proficient Student Issues: Focus on evaluation and measurement, Volume 2. Washington, DC: OBEMLA. 89-122.

Clay, M. 1990. Research currents: What is and what might be in evaluation. Language Arts. 67.288-298.

Cohen, A. D. 1994. Assessing language ability in the classroom. Boston, MA: Heinle and Heinle.

Council of Chief State School Officers. 1992. Recommendations for improving the assessment and monitoring of students with limited English proficiency. Washington, DC: Council of Chief State School Officers.

Damico, J. 1992.
Performance assessment of language minority students. In Office of Bilingual Education and Minority Languages Affairs (ed.) Proceedings of the Second National Research Symposium on Limited English Proficient Student Issues: Focus on evaluation and measurement, Volume 1. Washington, DC: OBEMLA. 137-172.

Darling-Hammond, L. 1994. Performance-based assessment and educational equity. Harvard Educational Review. 64.5-30.

Darling-Hammond, L. and A. L. Goodwin. 1993. Progress toward professionalism in teaching. In G. Cawelti (ed.) Challenges and achievements of American education. Alexandria, VA: Association for Supervision and Curriculum Development. 19-52.

Davies, A., C. Cameron, C. Politano and K. Gregory. 1992. Together is better: Collaborative assessment, evaluation and reporting. Winnipeg, Manitoba: Peguis Publishers.

Feuer, M. J. and K. Fulton. 1993. The many faces of assessment. Phi Delta Kappan. 74.478.

Flood, J. and D. Lapp. 1989. Reporting reading progress: A comparison portfolio for parents. The Reading Teacher. 42.508-514.

Fradd, S. H., P. L. McGee and D. Wilen. 1994. Instructional assessment: An integrative approach to evaluating student performance. Reading, MA: Addison-Wesley.

Frazier, D. M. and F. L. Paulson. 1992. How portfolios motivate reluctant writers. Educational Leadership. 49.8.62-65.

Gifford, B. R. and M. C. O'Connor (eds.) 1991. Changing assessments: Alternative views of aptitude, achievement and instruction. Boston, MA: Kluwer.

Glazer, S. and C. Brown. 1993. Portfolios and beyond: Collaborative assessment in reading and writing. Norwood, MA: Christopher-Gordon Publishers.

Gomez, M. L., M. E. Graue and M. N. Bloch. 1991. Reassessing portfolio assessment: Rhetoric and reality. Language Arts. 68.620-628.

Goodman, K. S., Y. M. Goodman and W. J. Hood (eds.) 1989. The whole language evaluation book. Portsmouth, NH: Heinemann.

Grace, C. and E. F. Shores. 1991. The portfolio and its use: Developmentally appropriate assessment of young children.
Little Rock, AR: Southern Association on Children Under Six.
Graves, D. H. and B. S. Sunstein (eds.) 1992. Portfolio portraits. Portsmouth, NH: Heinemann.
Haladyana, T. 1992. Test score pollution: Implication for LEP students. In Office of Bilingual Education and Minority Languages Affairs (ed.) Proceedings of the Second National Research Symposium on Limited English Proficient Student Issues: Focus on evaluation and measurement, Volume 2. Washington, DC: OBEMLA. 135-164.
Hamayan, E. V. and J. S. Damico (eds.) 1991. Limiting bias in the assessment of bilingual students. Austin, TX: PRO-ED.
Hamp-Lyons, L. 1992. Holistic writing assessment for LEP students. In Office of Bilingual Education and Minority Languages Affairs (ed.) Proceedings of the Second National Research Symposium on Limited English Proficient Student Issues: Focus on evaluation and measurement, Volume 2. Washington, DC: OBEMLA. 317-358.
Jasmine, J. 1993. Portfolios and other assessments. Huntington Beach, CA: Teacher Created Materials, Inc.
Jonker, N. 1993. How portfolios can empower adult learners. Clio, MI: Nate Jonker and Associates.
Kletzien, S. B. and M. R. Bednar. 1990. Dynamic assessment for at-risk readers. Journal of Reading. 33.528-533.
Kramer, C. J. 1990. Documenting reading and writing growth in the primary grades using informal methods of evaluation. The Reading Teacher. 43.356-357.
LaCelle-Peterson, M. W. and C. Rivera. 1994. Is it real for all kids? A framework for equitable assessment policies for English language learners. Harvard Educational Review. 64.55-75.
Lamme, L. L. and C. Hysmith. 1991. One school's adventure into portfolio assessment. Language Arts. 68.629-640.
Levi, R. 1990. Assessment and educational vision: Engaging learners and parents. Language Arts. 67.269-273.
Linn, R. L., E. L. Baker and S. B. Dunbar. 1991. Complex performance-based assessment: Expectations and validation criteria. Educational Researcher. 20.8.15-21.
Madaus, G. F. and T.
Kellaghan. 1993. The British experience with "authentic" testing. Phi Delta Kappan. 74.458-469.
Marzano, R. J. 1994. Assessing student outcomes: Performance assessment using the dimensions of learning model. Alexandria, VA: Association for Supervision and Curriculum Development.
Marzano, R. J., D. Pickering and J. McTighe. 1993. Assessing student outcomes. Alexandria, VA: Association for Supervision and Curriculum Development.
Mathews, J. K. 1990. From computer management to portfolio assessment. The Reading Teacher. 43.420-421.
McDonald, J. P. 1993. Three pictures of an exhibition: Warm, cool, and hard. Phi Delta Kappan. 74.480-485.
McGrail, L. (ed.) 1991. Adventures in assessment: Learner-centered approaches to assessment and evaluation in adult literacy, Volume One: Getting started. Boston: World Education/SABES.
McGrail, L. (ed.) 1992. Adventures in assessment: Learner-centered approaches to assessment and evaluation in adult literacy, Volume Two: Ongoing. Boston: World Education/SABES.
McGrail, L. and L. Purdom (eds.) 1992. Adventures in assessment: Learner-centered approaches to assessment and evaluation in adult literacy, Volume Three: Looking back, starting again. Boston: World Education/SABES.
McGrail, L. and R. Schwartz. 1993. Adventures in assessment: Learner-centered approaches to assessment and evaluation in adult literacy, Volume Four. Boston: System for Adult Basic Education Support.
Meyer, C. 1992. What's the difference between authentic and performance assessment? Educational Leadership. 48.5.60-63.
Mills, R. P. 1989. Portfolios capture rich array of student performance. The School Administrator. December.8-11.
Murphy, S. and M. A. Smith. 1992. Writing portfolios: A bridge from teaching to assessment. Markham, Ontario: Pippin Publishing Limited.
Navarrete, C., J. Wilde, C. Nelson, R. Martinez and G. Hargett. 1990. Informal assessment in educational evaluation: Implications for bilingual education programs.
Washington, DC: National Clearinghouse for Bilingual Education.
Oller, J. 1992. Language testing research: Lessons applied to LEP students and programs. In Office of Bilingual Education and Minority Languages Affairs (ed.) Proceedings of the Second National Research Symposium on Limited English Proficient Student Issues: Focus on evaluation and measurement, Volume 1. Washington, DC: OBEMLA. 43-124.
O'Neil, J. 1992. Putting performance assessment to the test. Educational Leadership. 49.8.14-19.
O'Neil, J. 1994. Making assessment meaningful: Rubrics clarify expectations, yield better feedback. ASCD Update. 36.1-4.
Paulson, F. L., P. R. Paulson and C. A. Meyer. 1991. What makes a portfolio a portfolio? Educational Leadership. 48.5.60-63.
Perrone, V. (ed.) 1991. Expanding student assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
Pierce, L. V. and J. M. O'Malley. 1992. Performance and portfolio assessment for language minority students. Washington, DC: National Clearinghouse for Bilingual Education.
Pikulski, J. J. 1990. Assessment: The role of tests in a literacy assessment program. The Reading Teacher. 43.714-717.
Popham, W. J. 1993. Circumventing the high costs of authentic assessment. Phi Delta Kappan. 74.470-473.
Rhodes, L. K. (ed.) 1993. Literacy assessment: A handbook of instruments. Portsmouth, NH: Heinemann.
Rhodes, L. K. and S. Nathenson-Mejia. 1992. Anecdotal records: A powerful tool for ongoing literacy assessment. The Reading Teacher. 45.502-509.
Rief, L. 1990. Finding the value in evaluation: Self-assessment in a middle school classroom. Educational Leadership. 10.24-29.
Roderick, J. (ed.) 1991. Context-responsive approaches to assessing children's language. Urbana, IL: National Council of Teachers of English.
Routman, R. 1991. Invitations: Changing as teachers and learners, K-12. Portsmouth, NH: Heinemann.
Simmons, J. 1990. Portfolios as large scale assessment. Language Arts. 67.262-267.
Stayter, F. Z. and P. Johnston.
1990. Evaluating the teaching and learning of literacy. In T. Shanahan (ed.) Reading and writing together: New perspectives for the classroom. Norwood, MA: Christopher-Gordon Publishers. 253-271.
Turner, J. L. 1992. Creating content-based language tests: Guidelines for teachers. CATESOL Journal. 5.1.43-58.
Valencia, S. 1990. A portfolio approach to classroom reading assessment: The whys, whats, and hows. The Reading Teacher. 43.338-340.
Valeri-Gold, M., J. R. Olson and M. P. Deming. 1991/1992. Portfolios: Collaborative authentic assessment opportunities for college developmental learners. Journal of Reading. 35.298-305.
Vermont State Department of Education. 1990. Vermont writing assessment: The pilot year. Montpelier, VT: Vermont State Department of Education.
Wiggins, G. 1993a. Assessing student performance: Exploring the purpose and limits of testing. San Francisco: Jossey-Bass Publishers.
Wiggins, G. 1993b. Assessment: Authenticity, context, and validity. Phi Delta Kappan. 74.200-214.
Worthen, B. R. 1993. Critical issues that will determine the future of alternative assessment. Phi Delta Kappan. 74.444-456.
