Overall Rating Silver
Overall Score 63.74
Liaison Derek Martin
Submission Date Feb. 18, 2023

STARS v2.2

California State University, Monterey Bay
AC-6: Sustainability Literacy Assessment

Status Score Responsible Party
Complete 1.00 / 4.00 Lacey Raak
Sustainability Director
Campus Planning and Development
"---" indicates that no data was submitted for this field

Does the institution conduct an assessment of the sustainability literacy of its students?:
Yes

Which of the following best describes the literacy assessment? The assessment is administered to::
A subset of students or a sample that may not be representative of the predominant student body

Which of the following best describes the structure of the assessment? The assessment is administered as a::
Standalone evaluation without a follow-up assessment of the same cohort or representative samples

A copy of the questions included in the sustainability literacy assessment(s):
---

A list or sample of the questions included in the sustainability literacy assessment or the website URL where the assessment tool may be found:
sulitest.org

A brief description of how the literacy assessment was developed and/or when it was adopted:
The main ambition of the Sulitest project is to measure and improve Sustainability Literacy: “the knowledge, skills, and mindsets that help compel an individual to become deeply committed to building a sustainable future and allow him or her to make informed and effective decisions to this end”.

The key criteria that drove the creation of the Sulitest platform were - and still are - the following:

Questions must assess an individual's current knowledge of Sustainable Development, but they should also teach and inform, and motivate respondents to learn more and to act.
The overall experience of taking the test should help learners “understand the big picture” and “be touched and inspired by specific stories or facts,” while avoiding the trap of reproducing or memorizing lists of facts, figures, issues, and challenges without making connections between them.
The test must not overwhelm participants with the number of questions (30 in total). The focus is on a variety of perspectives and topics, keeping a balance between alarming news and inspiring actions.

To reach these ambitious objectives, the test is designed with (1) a coherent, pedagogical, and systemic framework and (2) a list of tags and keywords used to build a database of questions that ensures an appropriate balance among all relevant topics.

A brief description of how a representative sample was reached (if applicable) and how the assessment(s) were administered:
We utilized an existing sustainability literacy test from Sulitest.org, which is used by universities around the world to evaluate students' knowledge of sustainability across a range of measures. We also developed a short survey about CSUMB sustainability, including questions about exposure to sustainability through courses, co-curricular programs, and campus infrastructure.

This was the second time the test was administered, and both the response rate and participation increased. We did not conduct a focus group as we did in 2017. The test was administered in “learning mode,” which provides the correct answer after each response is given, so students learn about sustainability while taking the Sulitest. The Sulitest instrument was followed by a paper questionnaire about CSUMB sustainability. The questionnaire included questions about exposure to sustainability through courses, co-curricular programs, and campus infrastructure, as well as student impressions from taking the Sulitest. The assessment process and protocols were reviewed and approved for human subjects research at CSUMB.
The 2019 questionnaire was updated to include a series of Likert-scale questions related to University Learning Outcome 2, which is most directly related to the scope of sustainability as defined by Sulitest. This provided an opportunity to understand the degree to which students have learned about components of ULO2 in their overall education and how the Sulitest influenced their learning related to this outcome.
Instructors of upper-division courses across campus were invited to participate in the study. Instructors in 21 course sections across all 5 undergraduate colleges, representing a total of 11 course prefixes, agreed to participate. Consent forms, the online Sulitest, and the paper questionnaire were administered in a single class visit between January and April of 2019. Student participation was voluntary; 325 of 610 students completed the Sulitest instrument (a 53% response rate), and 362 of 610 completed the questionnaire (a 59% response rate). These students were distributed across 10 majors on campus.

A brief summary of results from the literacy assessment(s):
Sulitest.org provided scores for each individual test as well as for the CSUMB group as a whole. In its presentation of results, Sulitest compared CSUMB results to those of other university students who have taken the test, both globally and in the United States (Table 1). On the first three sub-themes of the test, CSUMB students scored lower than the worldwide and U.S. results. However, students scored higher on the “role to play” sub-theme than both U.S. and worldwide test takers. In terms of specific question types, CSUMB students showed a strong understanding of impacts on future generations (80% correct) and transparency and accountability (86% correct). Students performed least well on questions about international governance and institutions (27% correct), water and sanitation (31% correct), global interdependence and universal responsibility (33% correct), and interconnected challenges (34% correct).

These results were similar to those from the first test, completed in 2017.

Optional Fields 

Website URL where information about the sustainability literacy assessment is available:
---

Additional documentation to support the submission:
Data source(s) and notes about the submission:
This program was paused during COVID (faculty and student researchers did not have the capacity to conduct the survey while adjusting to everything else happening), pushing the next assessment to Spring 2023 (currently underway).

The information presented here is self-reported. While AASHE staff review portions of all STARS reports and institutions are welcome to seek additional forms of review, the data in STARS reports are not verified by AASHE. If you believe any of this information is erroneous or inconsistent with credit criteria, please review the process for inquiring about the information reported by an institution or simply email your inquiry to stars@aashe.org.