Evaluating Your Assessment Instruments
In evaluating a student assessment, whether a paper-and-pencil test, a creative production, or a written assignment, the most important consideration is the match among the objectives of the unit/course/lesson being assessed, the teaching/learning activities used, and the assessment tool.
When assessing the appropriateness and effectiveness of an assessment tool, consider the following:
- What are the objectives of the course/unit/lesson that are being assessed?
- What domain is being assessed: cognitive, affective, psychomotor? Is the domain appropriate given the objectives for the course/unit/lesson?
- If the domain is cognitive, consider which level of Bloom's taxonomy is being assessed: knowledge, comprehension, application, analysis, synthesis, and/or evaluation. Is the level appropriate given the objectives for the course/unit/lesson?
- Is the assessment at a level appropriate to the level of the course (freshman, graduate, etc.)?
- How well does the content of the assessment match the objectives being assessed?
- How well does the content of the assessment match the learning opportunities presented in the unit/lesson/course (i.e., does the assessment assess what was taught)?
- How clear are the directions for the assessment (i.e., what response is required of students, length and form of that response, time for completing response)?
- Is the assessment organized in such a way as to aid clarity and understanding of its requirements?
Paper-and-pencil tests merit some further considerations (see Gage and Berliner, 1998, for more information).
Essay questions
- Are verbs chosen carefully and precisely to clearly indicate what students should do? (For example, explain, list, identify, construct, compare.)
- Does the instructor have a model answer or specific points to help make grading more consistent?
Multiple-choice questions
- Is the stem a meaningful part of the question?
- Are distractors plausible?
- Are all choices of roughly equal length and precision?
- Are all choices grammatically consistent with the stem?
- Does answering correctly depend on content knowledge rather than on reading ability?
References
Assessment Item Creation and Review. Retrieved from http://712educators.about.com/cs/assessment/a/assessments.htm
Cashin (1987). Improving Essay Tests. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_17.pdf
Chism, N. (1999). Peer Review of Teaching: A Sourcebook. Bolton, MA: Anker. Available in the CELT library.
Clegg and Cashin (1986). Improving Multiple Choice Tests. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_16.pdf
Coombe, C., & Hubley, N. Creating Effective Classroom Tests.
Gage, N. L., & Berliner, D. C. (1998). Educational Psychology (6th ed.). Boston: Houghton Mifflin.
Hanna and Cashin (1987). Matching Instructional Objectives, Subject Matter, Tests, and Score Interpretations. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_18.pdf