
SALT

Brief Guidelines for Reviewing and Interpreting SALT Results

Students’ responses to SALT represent their perceptions of our courses and of our teaching. Students’ perceptions can provide useful information to inform and potentially improve our teaching, but they are certainly not the only source. We can also improve our teaching based on students’ performance on tests, papers, and other assignments; our own sense of what is or is not working in our courses; conversations with colleagues; and readings and workshops related to teaching. Decisions about the effectiveness of our teaching are best informed by converging evidence from these different sources. These guidelines suggest ways of reviewing and interpreting SALT results that may be helpful to you. They describe what you can do with your SALT results, not what you should do with them.

This is a brief outline of the guidelines. A more detailed description including examples is posted at Full Guidelines for Reviewing and Interpreting SALT Results.
    • Before looking at your SALT results, take some time to think about what you are trying to accomplish in this particular course. Reflect on how you think the course has gone this semester. Identify two or three aspects of the course that you want to focus on and that are reflected in SALT.

    • Use the frequency distributions for each item, rather than just the average, to get an overall sense of students’ responses. Be cautious in comparing your average to the Hope Average. [See the posted full guidelines for examples.]

    • The most informative comparisons for improvement are between your own averages across semesters, not between your average and the Hope Average each semester.

    • View the quantitative data and students’ written responses as complementary. For example, use the frequency distributions to check how many students share the perspective expressed in one student’s written comment.

    • After reviewing your results, reflect on what you are learning and take notes so you can use what you have learned the next time you teach the course.

    • Decide what you are going to do. One key to using SALT results is to assess the effectiveness of changes you make in your instruction. Assessing change involves a cycle: check SALT results to identify what you want to improve; make a change in your instruction intended to produce that improvement; then check SALT results again the semester after you made the change.

    • It is likely that you will want to assess the effectiveness of a change in your instruction using measures beyond SALT. Consider adding questions to SALT that address specific objectives or instructional techniques of your course. Laurie Van Ark at the Frost Center can help you add your questions to SALT.