Pedagogy
19 Worksheets and Assessment: Matching Learning Objectives to Learning Outcomes
Alison Reynolds and Chloe Gerson
Archivists and librarians are often asked to assess the class sessions they teach, but assessment is a gray area with many options. There are many ways to assess student learning: traditional and alternative, formative and summative, surveys, and more. Because archivists and librarians generally teach a single session (a “one-shot”), our assessment options tend to be more limited than those of a K–12 classroom teacher. A survey can be a useful assessment tool, but a survey given after a lesson does not necessarily show that students have met a learning outcome, and surveys can be time consuming to create and review. Tailored worksheets, on the other hand, are an informal assessment tool that can easily be incorporated into a one-shot class as part of an activity. The worksheet questions can be crafted to match the learning objectives for the class, allowing students to show their understanding of the material. Thus, we posit that when the assessment tool is tied directly to the lesson’s learning objectives, we can more accurately gauge how well students understand both the primary source and archival materials used and the session assignment or activity, without needing an exit ticket or survey. In fact, we can use the worksheet and activity both as a formative assessment tool and as a reflective tool for our own teaching practices.
Literature Review
Campbell and Campbell define assessment as “the process of quantifying, describing, gathering data, or giving feedback to others about what a learner knows and can do.”[1] Through a K–12 lens, assessment takes place constantly in the classroom; it “is a central part of classroom life, providing teacher and student with opportunities to monitor the progress of student learning.”[2] The two main methods of assessment are formative and summative. A formative assessment tool is any kind of assignment on which a teacher can give a student feedback, while summative assessment “is the final product or culminating activity that demonstrates what students have learned as a result of the lesson or unit.”[3]
When students are completing an activity or exercise, we can give them feedback in real time, offering ideas, suggestions, and praise, as well as checking for understanding and answering questions. Thus, based on traditional definitions of formative and summative assessment, one could argue that the “one-shot” instructional sessions we often teach in the archives constitute a single unit; in that case, the activity and worksheet or discussion would be our formative assessment, and any presentation or share-out that accompanies the activity would be the summative assessment. Yet if we are not grading or evaluating the worksheets, presentations, or discussions to gather data for future use, we would consider everything done during a single class session to be a formative assessment tool, especially since, as White explains, “anytime we invite students to make their learning visible, we are engaging in these smaller formative assessments.”[4]
In libraries and archives, assessment tools used in teaching with primary sources and archival materials vary widely over time and amongst different types of institutions, but most research published over the past twenty years suggests there are many gaps in this area and a need for more standardized tools.[5] Because of the variety of instruction archivists provide, ranging from one-shot sessions to being embedded in semester-long courses, it is difficult to find a one-size-fits-all approach to assessment.[6] Additionally, many archivists may feel overwhelmed by the idea of assessment or feel that they lack the skills or capacity needed to plan and implement it in a short class period.[7]
Most research on assessment focuses on quantifying student confidence levels in their primary source and archival literacy skills rather than on student learning outcomes.[8] This most commonly takes the form of surveys or pre- and post-tests.[9] Much of the literature consists of case studies specific to individual courses with lesson plans and activities that often gloss over assessment.[10] While these case studies are valuable sources for generating new teaching ideas, this format contributes to the lack of tools and methods for assessing the student learning that results from these class sessions.
Prior to the 2018 creation of the Guidelines for Primary Source Literacy, developed by the SAA-ACRL/RBMS Joint Task Force on the Development of Guidelines for Primary Source Literacy (hereafter referred to as the Guidelines), many archivists used standards and assessment tools created by librarians for information literacy instruction. The ACRL Framework for Information Literacy for Higher Education, adopted in 2015 as the successor to the information literacy competency standards first published in 2000, often served as the standard archivists used to assess primary source and archival instruction.[11] Many articles prior to 2018 discuss the need for a standardized framework specifically for teaching with primary sources.[12]
After the approval of the Guidelines, archivists started implementing them in their assessment practices, often in conjunction with the ACRL Framework.[13] This assessment often takes the form of surveys[14] or rubrics to analyze written student responses to in-class writing prompts.[15] Several articles mention utilizing worksheets as active learning tools, but few connect them to the learning objectives in the Guidelines or incorporate them into assessment.[16]
Class Session Analysis
Brandeis University
All class sessions held in the Robert D. Farber University Archives & Special Collections department at Brandeis University focus on developing and improving transferable skills and are built on the principle of backward design, as explored by Wiggins and McTighe in their seminal 2005 work, Understanding by Design. Backward design asks that the lesson plan be a “results-focused design” rather than a “content-focused design.”[17] To that end, we meet with the course instructor, when possible, to explore the goals and objectives they have for their class session in the archives, and we then base the lesson plan on those goals and objectives, which become the learning outcomes for the session. We teach class sessions with a variety of departments, such as History, Near Eastern and Judaic Studies, and African and African American Studies, to both undergraduate and graduate students.
To help students meet the learning outcomes, we generally incorporate some kind of active learning exercise for them to complete during the session. Active learning can be defined as “instructional activities involving students in doing things and thinking about what they are doing.”[18] Over the years, we have used a variety of active learning exercises, such as archival, primary source, and rare book analysis through guided worksheets; creative writing exercises; and debates about whether or not to include the archival items being interrogated in a future course syllabus. The active learning exercise is then coupled with a presentation or share-out, and together they become the formative assessment tool we use to gauge whether the students have met the learning outcomes for the class. As noted previously, formative assessment tools are marked by an instructor’s ability to give feedback for learning, while summative assessment tools are graded and analyzed.
When our formative assessment tools are developed to directly reflect the instructor’s goals and the learning objectives set forth in the Guidelines, we can see more clearly the extent to which students recognize what they are meant to do with the material and how well they comprehend and learn from it. If, for example, one of the learning outcomes for a class session is based on the Guidelines’ Learning Objective 3A, one way to know whether students have met that outcome is to ask questions that mirror the language of the objective.[19] One way we have done this is with an activity that pairs rare book analysis with worksheets incorporating questions about the physicality of the book and basic sourcing questions, such as noting the publication date and city. These direct questions shed more light on whether a student has met the outcome than an ungraded survey at the end of class, because a survey does not allow us to give feedback in the way the worksheet and presentation format does. Thus, by incorporating formative assessments based on specific learning outcomes and objectives, we can take an active part in student learning.
Georgia Tech
Most instruction sessions held in the Archives, Records Management, and Digital Curation Department at the Georgia Tech Library are one-shot sessions with first-year students in required English courses in the Writing and Communication Program. These courses follow a multimodal approach, requiring students to demonstrate written, oral, visual, electronic, and nonverbal (WOVEN) communication throughout the semester. This approach pairs well with archival instruction and with many of the core ideas and learning objectives set forth in the Guidelines.
For each instruction session, we design worksheets to use with active learning activities, asking students to respond to questions that align with learning outcomes stated at the beginning of the class. These outcomes are mapped to specific learning objectives in the Guidelines. To minimize preparation time, we often reuse questions but tailor them to meet the instructor’s goals and relate them to the topics or themes the students are discussing in their course. During class sessions, students work in groups to examine archival documents, using the worksheets to formulate talking points for the class discussion. As the students complete the worksheet, we talk with each group and use their responses to help gauge their understanding. The worksheets act as formative assessment tools that help us formulate prompting questions for the discussion when students are missing an important point or struggling to understand a document.
One frequently used activity is called “Georgia Tech Narratives.” In this activity, students work in groups to explore primary sources related to a story in Georgia Tech’s history. With minimal context, students determine the common theme of their documents and look for connections to create a narrative they will then share with the class. The questions are scaffolded from simple to more complex tasks, first asking students to determine their documents’ content by describing the physical formats, the people and places involved, the actions being described, and where the events are taking place. These questions align with the Guidelines’ Learning Objective 3B.[20] Students are then asked to consider their documents’ context by determining when the items were created, their intended audience, any author biases present, and any potential biases they might be bringing to their own interpretations. This section of the worksheet aligns with the Guidelines’ Learning Objective 4B.[21] Lastly, students are asked to consider the overall significance of their documents by applying their own knowledge of the period in which the documents were created and contemplating what other historical information they might need to understand them better. This last part aligns with the Guidelines’ Learning Objective 4C.[22]
Conclusion
After the class session ends, we can review the completed worksheets both to analyze how successfully students met the session’s learning objectives and to reflect on our own teaching practices. By analyzing student responses to our questions, we can consider whether the questions need to be altered. If student responses are vague, we can reword a question or provide more clarification the next time it is used. If students are providing more summary than analysis, we can reevaluate how much time is spent introducing the items, perhaps choosing to model how to answer a worksheet question or to provide a more in-depth overview of the process of document analysis to promote better understanding. We can also use the worksheet responses to generate ideas about how to format and direct class discussions in the future.
We acknowledge there are limitations to this subjective and anecdotal style of analyzing student learning. Future formal studies could include the creation and use of rubrics to score the worksheets, normalizing the results to provide more quantitative data about how well students meet a given set of learning outcomes. For the purposes of a one-shot instruction session, however, using worksheets with questions tied directly to student learning outcomes can give us worthwhile qualitative information about what and how students are learning and how closely they are able to meet the learning outcomes for a class session, and it can provide feedback that helps us, as archivist educators, reflect on and improve our teaching. Overall, linking worksheets to learning objectives and outcomes and using them as an informal assessment tool provides a low barrier to entry for those who feel overwhelmed by incorporating assessment into their instruction.
Bibliography
Bahde, Anne, and Heather Smedberg. “Measuring the Magic: Assessment in the Special Collections and Archives Classroom.” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 13, no. 2 (2012): 152–74. https://doi.org/10.5860/rbm.13.2.380.
Baines, Johanna. “Establishing Special Collections Literacy for Undergraduate Students: An Investigation into Benefits and Barriers of Access.” The Journal of the Archives and Records Association 44, no. 1 (2023): 8–35. https://doi.org/10.1080/23257962.2022.2149481.
Bonwell, Charles C., and James A. Eison. Active Learning: Creating Excitement in the Classroom. Washington, D.C.: School of Education and Human Development, George Washington University, 1991. https://files.eric.ed.gov/fulltext/ED336049.pdf, archived July 24, 2024, at https://web.archive.org/web/20240630191027/https://files.eric.ed.gov/fulltext/ED336049.pdf.
Campbell, Linda M., and Bruce Campbell. Mindful Learning: 101 Proven Strategies for Student and Teacher Success, Second Edition. London: Corwin Press, 2009.
Carini, Peter. “Information Literacy for Archives and Special Collections: Defining Outcomes.” portal: Libraries and the Academy 16, no. 1 (2016): 191–206. https://doi.org/10.1353/pla.2016.0006.
Daniels, Morgan, and Elizabeth Yakel. “Uncovering Impact: The Influence of Archives on Student Learning.” Journal of Academic Librarianship 39, no. 5 (2013): 414–22. https://doi.org/10.1016/j.acalib.2013.03.017.
Duff, Wendy, and Joan M. Cherry. “Archival Orientation for Undergraduate Students: An Exploratory Study of Impact.” The American Archivist 71, no. 2 (2008): 499–529. https://doi.org/10.17723/aarc.71.2.p6lt385r7556743h.
Duff, Wendy, Elizabeth Yakel, Helen Tibbo, Joan Cherry, Aprille McKay, Magia Krause, and Rebecka Sheffield. “The Development, Testing, and Evaluation of the Archival Metrics Toolkits.” The American Archivist 73, no. 2 (2010): 569–99. https://doi.org/10.17723/aarc.73.2.00101k28200838k4.
Emerling, Danielle. “Civics in the Archives: Engaging Undergraduate and Graduate Students with Congressional Papers.” The American Archivist 81, no. 2 (2018): 310–22. https://doi.org/10.17723/0360-9081-81.2.310.
Garcia, Patricia, Joseph Lueck, and Elizabeth Yakel. “The Pedagogical Promise of Primary Sources: Research Trends, Persistent Gaps, and New Directions.” The Journal of Academic Librarianship 45, no. 2 (2019): 94–101. https://doi.org/10.1016/j.acalib.2019.01.004.
Hensley, Merinda, Benjamin Murphy, and Ellen Swain. “Analyzing Archival Intelligence: A Collaboration Between Library Instruction and Archives.” Communications in Information Literacy 8, no. 1 (2014): 96–114. https://doi.org/10.15760/comminfolit.2014.8.1.155.
Hoyer, Jen, et al. “Redesigning Program Assessment for Teaching with Primary Sources: Understanding the Impacts of Our Work.” The American Archivist 85, no. 2 (2022): 443–79. https://doi.org/10.17723/2327-9702-85.2.443.
Keeran, Peggy. “‘We Turn the Lens…on Ourselves’: Assessing Digital Primary Source Library Instruction through the Lens of Scholarship of Teaching and Learning.” Reference Services Review 51, no. 1 (2022): 33–51. https://doi.org/10.1108/RSR-08-2022-0031.
Roussain, James. “Pedagogue in the Archive: Reorienting the Archivist as Educator.” Archivaria 90 (2020): 70–111. https://archivaria.ca/index.php/archivaria/article/view/13757.
SAA-ACRL/RBMS Joint Task Force on the Development of Guidelines for Primary Source Literacy. Guidelines for Primary Source Literacy. 2018. https://www2.archivists.org/sites/all/files/Guidelines%20for%20Primary%20Souce%20Literacy_AsApproved062018_1.pdf, archived July 24, 2024, at https://web.archive.org/save/https://www2.archivists.org/sites/all/files/Guidelines%20for%20Primary%20Souce%20Literacy_AsApproved062018_1.pdf.
Stringer, Ernest T., Lois McFadyen Christensen, and Shelia C. Baldwin. Integrating Teaching, Learning, and Action Research: Enhancing Instruction in the K–12 Classroom. Los Angeles: Sage, 2010.
Swain, Ellen D. “Best Practices for Teaching with Primary Sources.” In The New Information Literacy Instruction: Best Practices, edited by Patrick Ragains and Sandra M. Wood, 189–204. Lanham: Rowman & Littlefield Publishers, 2016.
Tribbett, Krystal, Derek Quezada, and Jimmy Zavala. Library Impact Research Report: Improving Primary Source Literacy Learning Outcomes through a Community-Centered Archives Approach. Washington, DC: Association of Research Libraries, 2023. https://doi.org/10.29242/report.ucirvine2023, archived January 20, 2024, at https://web.archive.org/web/20240120041627/https://www.arl.org/resources/library-impact-research-report-improving-primary-source-literacy-learning-outcomes-through-a-community-centered-archives-approach/.
Wees, David. “56 Different Examples of Formative Assessment.” Arizona Department of Education, June 12, 2023. https://www.azed.gov/sites/default/files/2017/01/56%20Different%20Examples%20of%20Formative%20Assessment.pdf?id=5887e207aadebe16205a25dd, archived July 24, 2024, at https://web.archive.org/save/https://www.azed.gov/sites/default/files/2017/01/56%20Different%20Examples%20of%20Formative%20Assessment.pdf?id=5887e207aadebe16205a25dd.
White, Katie. Softening the Edges: Assessment Practices That Honor K–12 Teachers and Learners (Using Responsible Assessment Methods in Ways That Support Student Engagement). Bloomington: Solution Tree Press, 2017.
Wiggins, Grant, and Jay McTighe. Understanding by Design, Expanded Second Edition. Alexandria: Association for Supervision & Curriculum Development, 2005.
Withers, Clare, Diana Dill, Jeanann Haas, Kathy Haines, and Berenika Webster. Library Impact Research Report: A Toolkit for Demonstrating and Measuring Impact of Primary Sources in Teaching and Learning. Washington, DC: Association of Research Libraries, 2022. https://doi.org/10.29242/report.pitt2022b, archived November 30, 2023, at https://web.archive.org/web/20231130193427/https://www.arl.org/resources/library-impact-research-report-a-toolkit-for-demonstrating-and-measuring-impact-of-primary-sources-in-teaching-and-learning/.
Endnotes
[1] Linda M. Campbell and Bruce Campbell, Mindful Learning: 101 Proven Strategies for Student and Teacher Success, Second Edition (London: Corwin Press, 2009), 149.
[2] Ernest T. Stringer, Lois McFadyen Christensen, and Shelia C. Baldwin, Integrating Teaching, Learning, and Action Research: Enhancing Instruction in the K–12 Classroom (Los Angeles: Sage, 2010), 124.
[3] Stringer, McFadyen Christensen, and Baldwin, Integrating Teaching, 131.
[4] Katie White, Softening the Edges: Assessment Practices That Honor K–12 Teachers and Learners (Using Responsible Assessment Methods in Ways That Support Student Engagement) (Bloomington: Solution Tree Press, 2017), 85.
[5] Patricia Garcia, Joseph Lueck, and Elizabeth Yakel, “The Pedagogical Promise of Primary Sources: Research Trends, Persistent Gaps, and New Directions,” The Journal of Academic Librarianship 45, no. 2 (2019): 94–101, https://doi.org/10.1016/j.acalib.2019.01.004.
[6] Anne Bahde and Heather Smedberg, “Measuring the Magic: Assessment in the Special Collections and Archives Classroom,” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 13, no. 2 (2012): 152–74, https://doi.org/10.5860/rbm.13.2.380.
[7] Wendy Duff et al., “The Development, Testing, and Evaluation of the Archival Metrics Toolkits,” The American Archivist 73, no. 2 (2010): 569–99, https://doi.org/10.17723/aarc.73.2.00101k28200838k4.
[8] Morgan Daniels and Elizabeth Yakel, “Uncovering Impact: The Influence of Archives on Student Learning,” Journal of Academic Librarianship 39, no. 5 (2013): 414–22, https://doi.org/10.1016/j.acalib.2013.03.017; Jen Hoyer et al., “Redesigning Program Assessment for Teaching with Primary Sources: Understanding the Impacts of Our Work,” The American Archivist 85, no. 2 (2022): 443–79, https://doi.org/10.17723/2327-9702-85.2.443.
[9] Wendy Duff and Joan M. Cherry, “Archival Orientation for Undergraduate Students: An Exploratory Study of Impact,” The American Archivist 71, no. 2 (2008): 499–529, https://doi.org/10.17723/aarc.71.2.p6lt385r7556743h; Merinda Hensley, Benjamin Murphy, and Ellen Swain, “Analyzing Archival Intelligence: A Collaboration Between Library Instruction and Archives,” Communications in Information Literacy 8, no. 1 (2014): 96–114, https://doi.org/10.15760/comminfolit.2014.8.1.155.
[10] Garcia, Lueck, and Yakel, “The Pedagogical Promise,” 94–101.
[11] Ellen D. Swain, “Best Practices for Teaching with Primary Sources,” in The New Information Literacy Instruction: Best Practices, ed. Patrick Ragains and Sandra M. Wood (Lanham: Rowman & Littlefield Publishers, 2016), 189–204.
[12] Bahde and Smedberg, “Measuring the Magic,” 152–74; Peter Carini, “Information Literacy for Archives and Special Collections: Defining Outcomes,” portal: Libraries and the Academy 16, no. 1 (2016): 191–206, https://doi.org/10.1353/pla.2016.0006.
[13] Peggy Keeran, “‘We Turn the Lens…on Ourselves’: Assessing Digital Primary Source Library Instruction through the Lens of Scholarship of Teaching and Learning,” Reference Services Review 51, no. 1 (2022): 33–51, https://doi.org/10.1108/RSR-08-2022-0031; James Roussain, “Pedagogue in the Archive: Reorienting the Archivist as Educator,” Archivaria 90 (2020): 70–111, https://archivaria.ca/index.php/archivaria/article/view/13757.
[14] Krystal Tribbett, Derek Quezada, and Jimmy Zavala, Library Impact Research Report: Improving Primary Source Literacy Learning Outcomes through a Community-Centered Archives Approach (Washington, DC: Association of Research Libraries, 2023), https://doi.org/10.29242/report.ucirvine2023; Johanna Baines, “Establishing Special Collections Literacy for Undergraduate Students: An Investigation into Benefits and Barriers of Access,” The Journal of the Archives and Records Association 44, no. 1 (2023): 8–35, https://doi.org/10.1080/23257962.2022.2149481.
[15] Clare Withers et al., Library Impact Research Report: A Toolkit for Demonstrating and Measuring Impact of Primary Sources in Teaching and Learning (Washington, DC: Association of Research Libraries, 2022), https://doi.org/10.29242/report.pitt2022b.
[16] Danielle Emerling, “Civics in the Archives: Engaging Undergraduate and Graduate Students with Congressional Papers,” The American Archivist 81, no. 2 (2018): 310–22, https://doi.org/10.17723/0360-9081-81.2.310; Keeran, “‘We Turn the Lens,’” 33–51; Roussain, “Pedagogue in the Archive,” 70–111.
[17] Grant Wiggins and Jay McTighe, Understanding by Design, Expanded Second Edition (Alexandria: Association for Supervision & Curriculum Development, 2005), 15.
[18] Charles C. Bonwell and James A. Eison, Active Learning: Creating Excitement in the Classroom (Washington, D.C.: School of Education and Human Development, George Washington University, 1991), executive summary, iii.
[19] SAA-ACRL/RBMS Joint Task Force on the Development of Guidelines for Primary Source Literacy, Guidelines for Primary Source Literacy (2018), 5, https://www2.archivists.org/standards/guidelines-for-primary-source-literacy, archived October 23, 2024, at https://web.archive.org/web/20231003161019/https://www2.archivists.org/standards/guidelines-for-primary-source-literacy.
[20] SAA-ACRL/RBMS Joint Task Force, Guidelines, 5.
[21] SAA-ACRL/RBMS Joint Task Force, Guidelines, 5.
[22] SAA-ACRL/RBMS Joint Task Force, Guidelines, 5.