
Analyze and Interpret Evidence

Assessment Measurement Tools


So you have your student artifact. Now what? How do you turn that artifact into reliable data that can be used to determine whether, or how well, learning outcomes have been met? What you need is a rubric, rating scale, or other measurement tool specifically designed to target your learning outcomes. Check out the Tools for Assessment Data Collection handout from the Office of Instruction and Assessment at the University of Arizona for more information.

Analysis of your data should present your stakeholders with a thorough picture of how students are accomplishing the learning outcomes. Consider the following tips:

  • Before analysis begins, make sure all evidence you are using has been de-identified, if possible, so no student's name can be connected to a response or artifact.
  • It may also be a good idea to remove the identity of the course from which an artifact has been selected.
  • In the analysis, you may want to break your data down by important subgroups such as major, graduating year, gender, or ethnicity if your plan calls for it or you think your stakeholders will be interested in it. (A brief sketch of both steps, de-identification and subgroup breakdown, follows this list.)
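
If your artifact scores live in a spreadsheet, both steps can be scripted. The following is a minimal sketch using pandas; the column names (student_name, course, major, rubric_score) and the data are hypothetical, so adapt it to your own layout:

```python
import pandas as pd

# Hypothetical artifact-score data; names and columns are illustrative only.
scores = pd.DataFrame({
    "student_name": ["A. Lee", "B. Ortiz", "C. Kim", "D. Shaw"],
    "course":       ["ENGL 495", "ENGL 495", "HIST 498", "HIST 498"],
    "major":        ["English", "English", "History", "History"],
    "rubric_score": [3, 4, 2, 4],
})

# De-identify: replace student names (and, if desired, courses)
# with opaque artifact IDs before reviewers see the data.
scores["artifact_id"] = ["artifact_%03d" % i for i in range(len(scores))]
deidentified = scores.drop(columns=["student_name", "course"])

# Break results down by a subgroup of interest, e.g. major.
print(deidentified.groupby("major")["rubric_score"].agg(["count", "mean"]))
```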

Indirect vs. Direct Evidence

Your specific analysis will again depend on whether the evidence is indirect or direct. For indirect evidence such as surveys, present the data in a way that gives stakeholders a broad view of the results but allows them to drill down to whatever level of information is necessary to make good decisions with the data. Tables and figures can help illustrate findings in a way that allows the reader to quickly understand the gist of the results.
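
As one illustration, here is a minimal sketch of summarizing survey responses with pandas; the survey question, response scale, and column names are hypothetical:

```python
import pandas as pd

# Hypothetical survey responses; the question and scale are illustrative.
survey = pd.DataFrame({
    "grad_year": [2021, 2021, 2022, 2022, 2022],
    "q1_confidence": ["Agree", "Strongly agree", "Neutral",
                      "Agree", "Agree"],
})

# Broad view: a table of response counts per graduating year.
summary = pd.crosstab(survey["grad_year"], survey["q1_confidence"])
print(summary)

# Drill-down: filter to one subgroup when stakeholders want detail.
print(survey[survey["grad_year"] == 2022])
```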

Analyzing direct evidence of student work (papers, projects, capstone experiences, etc.) will require a tool such as a rubric that gives reviewers specific guidelines for judging the quality of the artifact. It is important that reviewers are well trained in using the rubric and that more than one individual assesses each artifact.

Check out this rubric for critical thinking to see an example, developed by the Association of American Colleges and Universities, that we use in General Education.

Also check out this example rubric for quality learning outcomes. This one is from Los Angeles Mission College. It is instructive for the purposes of this website because it is a rubric for assessing a program's learning outcomes assessment process itself.

Interrater Reliability

Departments should study interrater reliability between reviewers and make efforts to improve it where it is lacking.
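
One common way to quantify interrater reliability is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. Below is a minimal sketch, assuming two reviewers have independently scored the same set of artifacts on a 4-point rubric scale and that scikit-learn is available; the scores shown are hypothetical:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical rubric scores (1-4 scale) from two reviewers
# who independently rated the same ten artifacts.
reviewer_a = [3, 4, 2, 4, 3, 1, 4, 2, 3, 3]
reviewer_b = [3, 4, 2, 3, 3, 1, 4, 2, 4, 3]

# Kappa near 1 indicates strong agreement beyond chance;
# values near 0 indicate agreement no better than chance.
kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print("Cohen's kappa: %.2f" % kappa)
```

Because rubric scores are ordinal, a weighted kappa (for example, passing weights="quadratic" to cohen_kappa_score) penalizes large disagreements more than near-misses and is often a better fit for rubric data.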

Interpretation

Analyzing and interpreting your data can be a very sensitive process. Having as much faculty participation as possible in this step is very important. Despite efforts to de-identify courses, sometimes it will be obvious which course an artifact or group of artifacts is from. For example, if evidence from a capstone course is used, it's likely that everyone will know who was teaching that course, so contextualizing results with the help of a broad range of faculty is useful.

Last Updated: 7/7/22