Open-Ended Questions

Open-response items range from short-answer and completion questions to problem solving and written essays. Written essays let students demonstrate the depth of their thinking and are therefore useful for assessing complex problem solving and critical thinking. Problem-solving items that require students to show their work let instructors see students’ thinking processes and provide feedback. Completion questions require students to retrieve information from memory.

Some instructors believe that completion and multiple choice questions on the same content assess the same thing; others disagree, arguing that retrieving information from memory is a different competency from recognizing the information when selecting a response. In general, however, there is a positive correlation between student performance on multiple choice items and on open-response items covering the same content.

Select item types based on your specific context. In general, though, when assessing recall and comprehension, multiple choice items can cover a large amount of material efficiently and can be scored objectively and quickly. Reserve constructed-response items for the more complex competencies that are difficult to assess with multiple choice items. Consider the information below when selecting open-response items:

ADVANTAGES OF OPEN RESPONSE ITEMS  

Advantages include:

  • rapid item development
  • ability to assess constructs other than achievement (e.g., student attitudes, creativity, values, opinions, explanations, and interpretations)
  • potential to assess writing competency and skills
  • capacity to accommodate argument and logic skills
  • minimization of guessing effects
  • encouragement for students to study material more deeply than multiple choice items require

DISADVANTAGES OF OPEN RESPONSE ITEMS

Disadvantages include:

  • time-consuming to score
  • subjective scoring conditions (lower scoring reliability)
  • time-consuming for students to construct responses
  • strong writing skills can make a weak response sound better, and weak writing skills can make a strong response sound worse

Tips for Writing Open Response Items

1. Construct the items to elicit skills and knowledge aligned with the educational outcomes developed for the course.

2. Reserve open response items to assess educational outcomes that are difficult to measure using other formats, since they are more time-consuming to take and score.

3. Specify the task in clear, concise language that all students will be able to understand.

4. Specify the length of the answer desired for each item.

5. Show the scoring criteria along with the item so students will know how the points can be earned.

6. Test the item yourself by writing an ideal answer to it. Develop your scoring criteria from this answer.

7. Use either analytic scoring rubrics (point systems) or holistic scoring rubrics (overall score based on certain criteria) to ensure consistent scoring. See the RUBRICS document for more information about rubrics.

Scoring Open Response Items

Scores derived from open-response items sometimes fail to provide a clear picture of student achievement because of factors such as variations in:

  • student interpretation of the prompt,
  • student writing style and ability,
  • scorer interpretation of the rubric, and
  • scorer consistency.

These tips can help you minimize scoring errors.

  • Provide the same set of questions to all students. When you allow students to select from among several essay questions (e.g., answer 2 of 5 questions), the items are most likely not equivalent, and students are not being evaluated on the same scale, so one student’s score does not mean the same thing as another’s. Allowing choice may give students the sense that they can do their best work, but it makes it difficult to draw consistent and valid conclusions about student answers and performance.
  • Consider using several narrowly focused items rather than one broad item. Students can target their responses more precisely, scoring will be more consistent, and you can isolate particular aspects of students’ skills and knowledge.
  • Analytic scoring rubrics yield more consistent scores than holistic scoring rubrics because each feature of the response that earns points can be evaluated separately.

Examples of Prompt Formats Designed to Elicit Higher Order Thinking Skills

  • Comparing: Describe the similarities and differences between...
  • Relating Cause and Effect: What are the major causes of...
  • Justifying: Which of the following alternatives do you favor, and why?
  • Summarizing: State the main points included in...
  • Generalizing: Formulate several valid generalizations for the following data...
  • Inferring: In light of the information presented, what is most likely to happen when...
  • Classifying: Group the following items according to...
  • Creating: List as many ways as you can think of for...
  • Applying: Using the principles of... as a guide, describe how you would solve the following problem.
  • Analyzing: Describe the reasoning errors in the following paragraph.
  • Synthesizing: Describe a plan for proving that...
  • Evaluating: Describe the strengths and weaknesses of... Using the given criteria, write an evaluation of...

Source: Piontek, 2008, Table 2; adapted from Figure 7.11 of McMillan, 2001, p. 186.

References

Bridgeman, B., Trapani, C., & Bivens-Tatum, J. (2011). Comparability of essay question variants. Assessing Writing, 16, 237–255.

McMillan, J. H. (2001). Classroom assessment: Principles and practice for effective instruction. Boston: Allyn and Bacon.

Piontek, M. E. (2008). Best practices for designing and scoring exams. CRLT Occasional Papers, No. 24. University of Michigan. http://www.crlt.umich.edu/P8_0
