Wednesday, February 23, 2011
Assessment Activities in Penfield Library
Indicators of patrons connecting with our resources, services and spaces
- Website visits – resources, online spaces
- Door counts – spaces
- Instruction attendance – services
These measures help us monitor and document the volume and extent of our connections with our patrons. We normalize these statistics by dividing them by FTE (full-time-equivalent) student enrollment, producing a profile of library use by our “average” student. These numbers can also be validated against the frequency of use reported in interviews and surveys.
By focusing on three top-level measures, we have a sustainable and feasible way to monitor the impact of improvements and other changes in the library.
We have several years’ worth of this data. We are exploring good ways to present this information to the campus community.
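The per-FTE normalization described above can be sketched in a few lines; the counts and FTE figure below are hypothetical, for illustration only.

```python
# Hypothetical annual totals for the three top-level measures (not real data)
totals = {
    "website_visits": 1_200_000,
    "door_count": 450_000,
    "instruction_attendance": 6_500,
}

fte_students = 7_000  # hypothetical FTE enrollment

# Divide each measure by FTE enrollment to profile the "average" student
per_fte = {measure: count / fte_students for measure, count in totals.items()}

for measure, rate in per_fte.items():
    print(f"{measure}: {rate:.1f} per FTE student")
```

The same ratios can be tracked year over year, so a change in the normalized figure reflects a change in per-student use rather than in enrollment.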
Reports of users’ experiences
- Surveys and interviews
- Suggestion box
- Net Promoter Score
  - "How likely is it that you would recommend Penfield Library to a friend or fellow student?"
These instruments will focus on identifying our strengths and weaknesses, and exploring how our patrons find value in their use of library and information resources. We have a mix of quantitative and qualitative data from at least four interview or survey projects. And we have access to a number of campus surveys that include some information about the library.
Our plan is to establish a more regular schedule for administering these surveys.
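The Net Promoter Score can be computed from the 0–10 answers to the recommendation question above: respondents scoring 9–10 count as promoters, 0–6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch, using made-up responses:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 answers to the 'would you recommend' question.

    Promoters score 9-10, detractors 0-6; NPS is the percentage of
    promoters minus the percentage of detractors (range -100 to +100).
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical survey responses (not real data)
responses = [10, 9, 9, 8, 7, 7, 6, 10, 3, 9]
print(net_promoter_score(responses))  # 5 promoters, 2 detractors -> 30.0
```

Tracking this single number across survey administrations gives a simple trend line to set alongside the richer qualitative findings.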
Program Review and Self Study Initiative
- Analysis of collection and library instruction activity for each department, to be delivered at the beginning of the self-study process
First inspired by a report of best practices for Program Reviews, this idea has also surfaced in work on Oswego’s Self Study for Middle States. This approach could generate greater faculty engagement in library–teaching collaboration, highlight the Library’s success (or not) in participating in the teaching and learning process, and focus library assessment at a more granular level than we have reached before. This program-by-program analysis can also help identify areas for improvement at the level that matters most—students’ success in their respective programs.
Learning outcomes: Information literacy
- Lake Effect Research Challenge Worksheet for the basic level
- Assessments in Program Review Self Studies
- Sense-making (critical incident) interviews with seniors
- Information Literacy Test to validate other measures
- NSSE data on research assignments and activities
Triangulation is the rule here. The Research Challenge Worksheet provides an assessment of information literacy at the basic level—focusing on first-year students’ preparation for their next four years of academic research. It has been run in three full cycles over seven years, identifying areas for improvement and documenting progress.
Assessments incorporated into Program Review Self Studies have the potential to do the same at the capstone level, focusing on student accomplishment at the end of four years’ work in their disciplines. Only a handful of departments have directly included information literacy learning outcomes in their assessments, and reports of their findings have not been shared with the librarians.
The senior interviews and the Information Literacy Test are both planned to validate and expand on what we find in the Worksheets and the Self Studies. The senior interviews are meant to be a continuing project and will both document student abilities and help us uncover areas for improvement in teaching and library programs.
The Information Literacy Test, a standardized multiple-choice instrument, is planned for this spring. It will focus on sophomores to give us a benchmark of how well our students acquire and retain information literacy abilities from their first-year experiences.
NSSE surveys are self-reports from students about their activities and engagement in their education. They cannot tell us whether students meet our expectations, but they do tell us to what extent students engage in information literacy activities.
--Jim Nichols, Nov. 2, 2010