Submission Date: June 25, 2018
University of Michigan
AC-6: Sustainability Literacy Assessment
Score: 4.00 / 4.00
Graham Sustainability Institute
Does the institution conduct an assessment of the sustainability literacy of its students (i.e. an assessment focused on student knowledge of sustainability topics and challenges)?:
Which of the following best describes the literacy assessment? The assessment is administered to:
Which of the following best describes the structure of the assessment? The assessment is administered as a:
A copy of the questions included in the sustainability literacy assessment(s):
A sample of the questions included in the sustainability literacy assessment or the website URL where the assessment tool may be found:
A brief description of how the literacy assessment was developed and/or when it was adopted:
The Sustainability Cultural Indicators Program (SCIP) is designed to inform educational programs and campus operations at U-M and is an outgrowth of the Community Awareness goal area of the U-M Campus Sustainability Integrated Assessment.
Building on information gleaned from focus groups, two questionnaires were designed and have been administered to samples of university students, faculty, and staff in Fall 2012, Fall 2013, Fall 2014, Fall 2015, and Winter 2018. The web surveys yield responses from more than 3,500 students and 1,500 faculty and staff members each cycle. Questions cover travel and transportation, waste prevention and conservation, the natural environment, climate change, food, and engagement with, awareness of, and ratings of campus sustainability initiatives. Survey data are supplemented with geographic data covering the campus buildings where respondents live, work, and study.
A representative sample, as described below and in EN6, also received a 12-question sustainability literacy assessment. With the permission of the developers, we included the Assessing Sustainability Knowledge (ASK) questions on the Sustainability Cultural Indicators Program questionnaires. We chose the ASK questions because they have been rigorously tested and refined, and because using the same questions as other institutions will allow cross-institution comparisons. The results from the first round of data collection are included below for the student cross-section and the student panel (which will serve as the post-assessment sample when the questions are next asked, two years from now).
A brief description of how a representative sample was reached (if applicable) and how the assessment(s) were administered:
Based on guidelines provided by the Survey Research Operations unit of the U-M Institute for Social Research, the student sample is selected by the U-M Office of the Registrar. Two key parameters defined the sampling frame: 1) full-time undergraduate, graduate, and professional students, and 2) students registered for the fall semester on the Ann Arbor campus. To reach the targeted number of students from each undergraduate cohort and from graduate students, names are selected from each group (stratum), and those students are contacted and invited to participate in the survey. The faculty and staff sample is drawn by U-M Human Resources Records and Information Services. To be eligible, employees have to meet two criteria: 1) be benefits-eligible, and 2) be employed on September 1 of the year of the survey. In Winter 2018, a total of 20,583 students, faculty, and staff were contacted, with a 26.7% overall response rate. Additional information can be found in the SCIP Methodology report at: http://graham.umich.edu/media/files/SCIP_MethodologyReportJanuary%202017.pdf
The sample design also includes a panel of individual undergraduate students who responded to the initial survey in 2012. That is, the 2013 panel was designated as the freshmen, sophomores, and juniors who completed the 2012 survey. To retain the panel each year, graduating seniors are replaced with the freshmen from the prior year. The 2014 panel includes the 2012 freshmen and sophomores who responded in previous years plus the 2013 freshmen. The 2015 panel includes the 2012 and 2013 freshmen and the 2014 freshmen, sophomores, and juniors. The panel was included in the research design to determine if and how the behaviors and views of individual students change during their period of undergraduate study at the University. In 2017, plans for the panel were modified to reflect the shift to an every-other-year data collection schedule.
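The stratified selection described above can be sketched as follows. This is an illustrative sketch only: the stratum names, frame sizes, and invitation targets are hypothetical assumptions, not the Registrar's actual figures or procedure.

```python
import random

def stratified_sample(frame, targets, seed=0):
    """Draw a simple random sample (without replacement) from each stratum.

    frame   -- dict mapping stratum name -> list of student IDs (hypothetical)
    targets -- dict mapping stratum name -> number of invitations to issue
    """
    rng = random.Random(seed)
    selected = {}
    for stratum, ids in frame.items():
        # Never request more names than the stratum contains.
        n = min(targets.get(stratum, 0), len(ids))
        selected[stratum] = rng.sample(ids, n)
    return selected

# Hypothetical sampling frame: full-time, fall-registered Ann Arbor students,
# stratified by undergraduate cohort plus graduate/professional students.
frame = {
    "freshmen":   [f"F{i}" for i in range(5000)],
    "sophomores": [f"S{i}" for i in range(4800)],
    "juniors":    [f"J{i}" for i in range(4700)],
    "seniors":    [f"R{i}" for i in range(4600)],
    "grad_prof":  [f"G{i}" for i in range(9000)],
}
targets = {"freshmen": 900, "sophomores": 900, "juniors": 900,
           "seniors": 900, "grad_prof": 1200}

sample = stratified_sample(frame, targets)
invited = sum(len(ids) for ids in sample.values())
```

Sampling each stratum separately, rather than the pooled frame, is what guarantees the targeted number of invitations per cohort regardless of cohort size.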
A brief summary of results from the literacy assessment(s), including a description of any measurable changes over time:
Summary of results from the questions on the Assessing Sustainability Knowledge (ASK) survey:
Student Cross-Section
• For 11 of the 12 questions the highest percentage of respondents selected the correct answer. The one question that did not fit this pattern is about the depletion of fish stocks.
• For 7 of the 12 questions the correct answer was selected by more than half of the respondents.
• More than 80% of respondents selected the correct answers for questions about the ozone layer, wealth disparity, and which country is the largest emitter of the greenhouse gas carbon dioxide.
• More than 30% of respondents reported that they didn’t know the answers to questions about electricity prices and the definition of economic sustainability.
Student Panel
• Results were similar to those for the cross-section, but overall more respondents selected the correct answers.
• For all 12 questions, the highest percentage of respondents selected the correct answer.
The website URL where information about the programs or initiatives is available:
Additional documentation to support the submission:
The information presented here is self-reported. While AASHE staff review portions of all STARS reports and institutions are welcome to seek additional forms of review, the data in STARS reports are not verified by AASHE. If you believe any of this information is erroneous or inconsistent with credit criteria, please review the process for inquiring about the information reported by an institution and complete the Data Inquiry Form.