Submission Date: March 2, 2018
University of Massachusetts Amherst
AC-6: Sustainability Literacy Assessment
1.00 / 4.00
Director of Sustainability Programs
Does the institution conduct an assessment of the sustainability literacy of its students (i.e. an assessment focused on student knowledge of sustainability topics and challenges)?:
Which of the following best describes the literacy assessment? The assessment is administered to:
Which of the following best describes the structure of the assessment? The assessment is administered as a:
A copy of the questions included in the sustainability literacy assessment(s):
A sample of the questions included in the sustainability literacy assessment or the website URL where the assessment tool may be found:
We first piloted a simple sustainability literacy assessment in late Spring 2015, in which we surveyed around 100 students using a subset of questions from the Ohio State University ASK instrument, found at:
A brief description of how the literacy assessment was developed and/or when it was adopted:
After the pilot attempt in April 2015, Kevin Hollerbach developed a waste and recycling assessment tool that was then administered by the Eco-Rep program to students in residence halls in Fall semester 2015. Three questions focused directly on students' knowledge of sustainability issues, and three others addressed their knowledge of recycling practices in their residence hall. Over 350 student responses were gathered, but we were unsure how much sampling bias was introduced by the kinds of students willing to respond to this door-to-door effort.
Since then, faculty members in two departments have used other questions from the same Ohio State ASK instrument in a multiple choice iClicker assessment at the beginning of the semester to assess students' sustainability knowledge.
A brief description of how a representative sample was reached (if applicable) and how the assessment(s) were administered:
As of February 2018, we have not yet conducted a representative-sample survey of the entire student body.
A brief summary of results from the literacy assessment(s), including a description of any measurable changes over time:
We have not attempted to measure changes over time. The assessment tool in our first pilot included only three questions (one relatively easy, one of average difficulty, and one relatively challenging). Of 366 respondents, only 59% correctly answered even the easy question, 49% the average one, and 41% the challenging one.
The website URL where information about the programs or initiatives is available:
Additional documentation to support the submission:
Individual faculty members survey students in their own classes to test sustainability literacy (e.g., Craig Nicolson in EnviSci 445; Lena Fletcher in NRC 185; Robert Ryan in LARP 687; Erin Baker in M&IE). Apart from the two efforts described above, we have not yet undertaken a systematic institution-wide sustainability literacy assessment.
The information presented here is self-reported. While AASHE staff review portions of all STARS reports and institutions are welcome to seek additional forms of review, the data in STARS reports are not verified by AASHE. If you believe any of this information is erroneous or inconsistent with credit criteria, please review the process for inquiring about the information reported by an institution and complete the Data Inquiry Form.