Evidence Options

There are many options for gathering evidence about learning or other issues you wish to assess. Choose the approach(es) that would generate the strongest evidence for your particular question or issue. Strong evidence is information that is reliable and speaks most directly to the desired goal.

Assessment Tools

Strong assessments use valid and reliable tools, instruments, or approaches to collect the evidence. We need to be able to trust that the evidence we’re collecting actually means what we think it means. It is important to choose the right tool according to your goal or question. The following categories, while not mutually exclusive, do represent common approaches to assessment.

Questioning to elicit opinions

The investigator provides a series of questions for the participant to answer on paper, online, or orally. The questions may be close-ended (pick from choices) or open-ended (free answer). The intention is to collect information about participants' opinions or perceptions about something.

Types:
Survey, Form, Questionnaire, End-of-event evaluation, Interview, Focus Group
Cautions:
Ensure the survey design is valid and reliable so you can be confident that participants interpret the questions as you intend and that the tool minimizes bias.
Questions asking about participants’ opinions about a topic do not necessarily provide evidence about their skill in that topic.
Analysis approaches:
The analysis depends on the question type. Close-ended questions may lend themselves to descriptive statistics and various statistical analyses. These kinds of questions can be useful for quantifying how many participants hold particular opinions about certain topics.
Data from open-ended survey questions, interviews, and focus groups generally require qualitative analyses, such as thematic analysis. This analysis can be time-consuming, but may provide insight into why participants think as they do about a topic.
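As a minimal sketch of what descriptive statistics for close-ended questions might look like, the snippet below summarizes hypothetical responses to a single 5-point Likert item using only the Python standard library. The response values are invented for illustration.

```python
from statistics import mean, median, mode
from collections import Counter

# Hypothetical responses to one 5-point Likert item
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

print("mean:", mean(responses))          # central tendency
print("median:", median(responses))
print("mode:", mode(responses))          # most common rating
print("distribution:", Counter(responses))  # count per rating
```

Reporting the full distribution alongside the mean is often more informative for Likert data, since the mean alone can mask a polarized response pattern.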

Questioning to elicit content knowledge

The investigator provides a series of questions for the participant to answer on paper, online, or orally. The questions may be close-ended (pick from choices) or open-ended (free answer). The intention is to have individuals exhibit their knowledge or cognitive skills on specific topics.

Types:
Exams, Quizzes, Standardized tests, Essay tests, Oral proficiency exams
Cautions:
The tool design can influence how the participants represent their learning. Ensure the design is valid and reliable so you can be confident that participants interpret the questions as you intend and that the tool is actually measuring what you think it is. (For example, incorrect marks on a bubble sheet could mean the person did not know the correct answer, but it is also possible that a physical impairment caused the person to mark the wrong circle even though they knew the answer.)
Analysis approaches:
For close-ended questions, one typically compares the responses to an answer key with the acceptable answers. For open-ended responses, one may compare the response to a rubric or answer key to identify the kinds of information expected within a correct or quality response.
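A simple comparison of close-ended responses to an answer key can be sketched as follows. The question labels, key, and responses are hypothetical, invented only to illustrate the scoring step.

```python
# Hypothetical answer key and one participant's responses
# for a five-question close-ended quiz.
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C", "Q5": "B"}
responses = {"Q1": "B", "Q2": "D", "Q3": "C", "Q4": "C", "Q5": "A"}

# Count responses that match the key; unanswered questions count as wrong.
correct = sum(1 for q, ans in answer_key.items() if responses.get(q) == ans)
score = correct / len(answer_key)
print(f"{correct}/{len(answer_key)} correct ({score:.0%})")
```

Open-ended responses would instead be judged against a rubric describing the expected elements of a quality answer, which typically requires human raters rather than exact matching.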

Observation

The investigator captures formal or informal evidence based on what is seen or heard. The intention is to collect information about participants' behaviors and attitudes, especially in an authentic environment.

Types:
Instructor notes students' interactions inside or outside of class, Community partner reflects on student work, Instructor marks competency completion on a checklist
Cautions:
The training of observers and nature of the observation approaches can influence the quality of the data collected and how much bias is introduced. Consider triangulating the findings with other evidence. If observation will be the only or primary method for data collection, design the observation protocol carefully, train the observer(s), and use rubrics or standards as appropriate.
Analysis approaches:
Qualitative data may require a thematic analysis. Approaches using checklists or rubrics may generate descriptive statistics.

Request performance

The investigator has the participant perform some task that produces a physical manifestation of the individual's knowledge or skills. Performances can provide very strong evidence of how well the individual achieved a learning outcome.

Types:
Concerts, Presentations, Documents, Artwork, Portfolios, Athletic events, Experiments, MAPs
Cautions:
Make sure the requested performance and performance criteria closely represent the learning that occurred.
Analysis approaches:
Set performance criteria in advance, then compare the performance against those criteria.
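The compare-against-criteria step can be sketched as a rubric check. The criteria names and score levels below are hypothetical; in practice the criteria and minimum levels would come from your rubric and be set before the performance.

```python
# Hypothetical rubric: each criterion is scored 0-4, with a
# minimum acceptable level defined in advance.
minimum_levels = {"organization": 3, "technique": 3, "interpretation": 2}

# Observed scores for one performance (invented for illustration).
scores = {"organization": 4, "technique": 3, "interpretation": 2}

# Check each criterion against its minimum level.
met = {c: scores[c] >= level for c, level in minimum_levels.items()}
print("criteria met:", met)
print("all criteria met:", all(met.values()))
```

Recording results per criterion, rather than only an overall pass/fail, preserves information about which aspects of the learning outcome were and were not demonstrated.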