As part of the Spring 2012 professional development programming, Education & Outreach sponsored a talk by Jeff Waller, Head of Reference and Instruction Services at Saint Anselm College’s Geisel Library and Kathy Halverson, Assistant Dean/Head of Public Services at Keene State College’s Mason Library.
Assessing Student Research Papers at Saint Anselm College
The student research assessment grew out of a strong information literacy program in the library that lacked any program-wide assessment. Library staff attended a workshop on designing assessment plans for information literacy and developed a plan to evaluate research papers from the freshman and senior years. They debated whether to use a rubric or a standardized test, and chose a rubric because they felt this method was more authentic, looking directly at student work.
The College applied for a Davis Educational Foundation grant to assess seven different learning objectives on campus, and included Information Literacy as one of the objectives. Faculty served on the three-year information literacy assessment committee; the faculty participants valued information literacy skills and were frequent library users.
The assessment committee started with the ACRL standards and discussed these with the faculty members. They then narrowed the list to 16 learning outcomes that they wanted to measure during the assessment. These outcomes had to be tangible characteristics that evaluators could observe by reading final papers.
The committee members used a 6-point Likert scale to evaluate the papers. Before launching the full assessment, the committee evaluated a small sample of three papers to ensure that scoring was consistent among evaluators. For the full pilot, evaluators obtained research papers from four first-year English classes and four senior classes. Although the pilot sample of approximately 70 student papers was not representative of the entire college, the committee did look for similarity in the length of student papers and in the scope of the assignments in order to have some consistency among papers in the sample.
Students performed poorest on evaluating sources, acknowledging viewpoints, understanding bias, and recognizing context. Improvement between the freshman and senior papers was weakest in the areas that were weak to begin with and strongest in the areas where students were already strong: students were good at citing sources correctly and finding relevant information, and weaker at distinguishing viewpoints and evaluating potential validity and bias. Students did improve on every criterion, but the amount of improvement varied across the 16 outcomes.
Jeff presented results of the pilot evaluation to a faculty forum. Faculty recognized the patterns evident in the results, which also corroborated the results from the Collegiate Learning Assessment (CLA) exam, which is regularly administered at Saint Anselm. Instructors valued the results of the pilot project but raised questions regarding the evaluation of different forms of research output (for example papers vs. posters).
Faculty participants in the assessment pilot thought that scoring papers with the rubric was too time-consuming to repeat often. This spring there is a change of leadership in both the College and the Library, so there may be an opportunity to restart assessment efforts across campus, but it will take time. In the meantime, the library is hoping to create a simplified rubric for evaluating student bibliographies.
In a positive turn, some professors have rewritten their assignments to more explicitly address the learning outcomes of information literacy. New adopters of information literacy outcomes have included several large introductory science classes, which had historically been somewhat resistant to information literacy assessment.
Project SAILS at Keene State College
Keene State administered the SAILS assessment from 2008 to 2011, after the General Education program was restructured in 2007/2008 and became the Integrated Studies Program (ISP). When Information Literacy was included among the eight intellectual skill outcomes for the ISP, the academic departments were charged with assessing ISP outcomes.
Assessment of student learning around information literacy had been done in the past, but not in a programmatic manner. Keene State considered various standardized test options, including iSkills, ILT, TRAILS, RRA, and Project SAILS. The goals for assessment were to deliver a pretest to Freshmen as formative assessment and a posttest to Juniors as summative assessment, in order to identify Information Literacy competency levels and retention rates for undergraduate students. Keene State tested Freshmen in 2008 and tested the same class again as Juniors in 2011; it also tested the incoming Freshman classes in 2009 and 2010.
The advantages of the SAILS test are that it is low cost, easy to administer with just a 30-50 minute time commitment per student, available in online or paper formats, provides anonymous testing, is endorsed by the Association of Research Libraries, and meets standards for validity and reliability.
However, there were some limitations as well. SAILS is not designed to serve as a pre-test unless the post-test is administered in the same year, before the test content or scoring changes; between 2008 and 2011 the test changed significantly. SAILS is also designed to compare information literacy across institutions, not student-to-student within an institution. (Results are organized by ACRL standard and by skill set.) For accurate comparison, the Keene State Juniors' results should have been pooled with those of a consortium of other institutions that tested Juniors.
Since the test is designed to compare results among different institutions, it does not provide individual data or customized feedback, and students would have liked to know their scores. SAILS now offers a different test that provides individual scores, but institutions have to choose between the institutional-comparison test and the individually scored test. SAILS is also not suited to small populations; groups of at least 200 students are required.
SAILS did provide a number of positive outcomes as well. Test results provided a benchmark of student performance in Information Literacy skills and allowed comparison of scores from year to year, giving a snapshot of Information Literacy assessment at the institution. SAILS results may also be used as a tool for guiding conversations with faculty about Information Literacy skills.
Following up on their involvement with Project SAILS, librarians at Keene State plan to perform a comprehensive data analysis across the 2008-2011 test results. They will continue to assess student performance by using classroom assessment techniques, and may develop a SAILS-like test for entering Freshmen enrolled in one of the ISP core courses. Although there are no immediate plans to continue administering SAILS, they may repeat the test every five years as a benchmark.
At the conclusion of the presentations, librarians from the three institutions traded questions about the culture of assessment at Dartmouth, Keene State, and Saint Anselm. While the presenters come from institutions where a central mandate for institutional learning goals and assessment has been levied by the administration, the culture at Dartmouth College gives individual departments more autonomy in this type of decision making. Both Keene State College and Saint Anselm College have been administering the CLA standardized test, and their faculty place value on the results of this and other uniform assessment tools, including surveys and rubrics.