Center for Enhancement of Learning and Teaching

Evaluating Your Assessment Instruments

In evaluating a student assessment, whether a paper-and-pencil test, a creative production, or a written assignment, the most important consideration is ensuring a match among the objectives of the unit/course/lesson being assessed, the teaching/learning activities used, and the assessment tool.

In assessing the appropriateness and effectiveness of an assessment tool, consider the following:

  • What are the objectives of the course/unit/lesson that are being assessed?
  • What domain is being assessed: cognitive, affective, or psychomotor? Is the domain appropriate given the objectives for the course/unit/lesson?
  • If the domain is cognitive, consider which level of Bloom's taxonomy is being assessed: knowledge, comprehension, application, analysis, synthesis, and/or evaluation. Is the level appropriate given the objectives for the course/unit/lesson?
  • Is the assessment at a level appropriate to the level of the course (freshman, graduate, etc.)?
  • How well does the content of the assessment match the objectives being assessed?
  • How well does the content of the assessment match the learning opportunities presented in the unit/lesson/course (i.e., does the assessment assess what was taught)?
  • How clear are the directions for the assessment (i.e., what response is required of students, length and form of that response, time for completing response)?
  • Is the assessment organized in such a way as to aid clarity and understanding of its requirements?

Here are some further considerations for paper-and-pencil tests (see Gage & Berliner, 1998, for more information).

Essay questions:

  1. Are verbs chosen carefully and precisely to clearly indicate what students should do? (For example, explain, list, identify, construct, compare.)
  2. Does the instructor have a model answer or a list of specific points to help make grading more consistent?

Multiple-choice questions:

  1. Does the stem pose a meaningful question or problem on its own?
  2. Are distractors plausible?
  3. Are all choices of roughly equal length and precision?
  4. Are all choices grammatically consistent with the stem?
  5. Does answering correctly depend on content knowledge rather than on reading ability?

Resources

Assessment Item Creation and Review. Retrieved from http://712educators.about.com/cs/assessment/a/assessments.htm.

Cashin, W. E. (1987). Improving Essay Tests. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_17.pdf.

Chism, N. (1999). Peer Review of Teaching: A Sourcebook. Bolton, MA: Anker. Available in the CELT library.

Clegg, V. L., & Cashin, W. E. (1986). Improving Multiple Choice Tests. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_16.pdf.

Coombe, C., & Hubley, N. Creating Effective Classroom Tests.

Gage, N. L., & Berliner, D. C. (1998). Educational Psychology (6th ed.). Boston: Houghton Mifflin.

Hanna, G. S., & Cashin, W. E. (1987). Matching Instructional Objectives, Subject Matter, Tests, and Score Interpretations. Retrieved from http://www.idea.ksu.edu/papers/Idea_Paper_18.pdf.
