| Overall Rating | Gold - expired |
|---|---|
| Overall Score | 70.91 |
| Liaison | Lindsay Walker |
| Submission Date | Oct. 24, 2019 |
Humber College
AC-6: Sustainability Literacy Assessment
| Status | Score | Responsible Party |
|---|---|---|
| | 4.00 / 4.00 | |
"---"
indicates that no data was submitted for this field
Does the institution conduct an assessment of the sustainability literacy of its students?:
Yes
Which of the following best describes the literacy assessment? The assessment is administered to:
The entire (or predominant) student body, directly or by representative sample
Which of the following best describes the structure of the assessment? The assessment is administered as a:
Pre- and post-assessment to the same cohort or to representative samples in both a pre- and post-test
A copy of the questions included in the sustainability literacy assessment(s):
A list or sample of the questions included in the sustainability literacy assessment or the website URL where the assessment tool may be found:
Uploaded
A brief description of how the literacy assessment was developed and/or when it was adopted:
The Sustainability Literacy Assessment was developed by the Office of Sustainability with the help of staff from the ILO (Institutional Learning Outcomes) team over the course of one year. We developed a pilot survey with 20 knowledge questions based on sustainability literacy assessments at similar institutions, incorporating all three aspects of sustainability. In January 2019, the survey was piloted in 22 different classes and received 170 responses. Based on the responses, we evaluated each question for clarity, accuracy, and difficulty, and created a revised version in which questions were edited, removed, or added. Key changes included editing or removing multiple-choice questions and answer options that overlapped with others, were too heavily knowledge-based, or were of lesser importance. We reduced the number of questions on the literacy assessment to 15 and added another section for cultural assessment.
A brief description of how a representative sample was reached (if applicable) and how the assessment(s) were administered:
There are two surveys that are administered twice every academic year: in September (pre) and in April (post). Our first pre survey was administered in September 2019 to a random sample of diploma students (the predominant student body). All diploma students must take WRIT100 or ESOL100 in their first semester, so a random subset of 80 of those classes was chosen, and we posted digital announcements in those classes recruiting participants.
For the post survey in April, we are choosing a capstone project (final year) class from each of the 78 diploma programs and following the same announcement procedure as the pre survey. Since every final year class will receive an announcement, this sample is representative. Both samples were representative because they were drawn at random and every student had an equal chance of participating in the survey.
Pre and post surveys will be done every year to monitor the difference in literacy when the same cohort of students are starting and finishing their program, as well as to see the change in literacy level in all students over time.
A brief summary of results from the literacy assessment(s):
The pre survey is currently being conducted and its data has not yet been analyzed. However, before administering our first pre survey, we tested a pilot survey. In January 2019, the pilot was run in 22 different classes and received 170 responses. The initial pilot results showed a literacy level (average score) of 54%. Questions on which students performed well related to the impacts of climate change, species extinction, and definitions of greenhouse gases and economic sustainability. Questions on which students performed poorly related to how climate change affects natural disasters, sources of pollution and greenhouse gases, and understanding energy certifications. Based on the pilot results, some questions were modified, and the revised survey is expected to better test literacy.
Optional Fields
---
Additional documentation to support the submission:
---
Data source(s) and notes about the submission:
---
The information presented here is self-reported. While AASHE staff review portions of all STARS reports and institutions are welcome to seek additional forms of review, the data in STARS reports are not verified by AASHE. If you believe any of this information is erroneous or inconsistent with credit criteria, please review the process for inquiring about the information reported by an institution or simply email your inquiry to stars@aashe.org.