Pedagogy
18 How to Know If They’re Getting It: Designing and Assessing Primary Source Analysis Exercises
Samantha Crisp
For the past twenty years, archivists and special collections librarians have been carving out roles for themselves as frontline instructors for students learning to navigate the world of primary source research. These instructors have increasingly sought to improve their pedagogy by establishing learning outcomes and course goals that emphasize information literacy concepts specific to historical research. Early innovators in the field identified these goals as critical thinking skills,[1] archival intelligence,[2] or archival research skills.[3] Today, we refer to these concepts broadly as primary source literacy. When teaching the core concepts of primary source literacy, it is important for instructors to determine whether students are actually learning the skills necessary to critically evaluate primary sources on their own. Often, students will appear to grasp these concepts in the classroom or reading room but struggle to apply them in their own research without an instructor’s guidance. Because instructors, and especially archivists, generally have a limited amount of time with students (often only a single class period) to teach these core concepts, it is important to maximize the impact and effectiveness of active learning experiences, document analysis exercises, and other pedagogical tools deployed during the instructional experience.
Recognizing this need, instruction archivists have taken the initiative to adapt their pedagogy to existing standards, such as the ACRL Framework for Information Literacy for Higher Education[4] or, especially, the SAA-ACRL/RBMS Guidelines for Primary Source Literacy.[5] While the advent of primary source literacy standards has eased the burden for instructors who previously struggled to determine what to teach and how to teach it, these standards have also brought to the forefront the critical lack of programmatic assessment in primary source literacy instruction.[6] After all, “without the aid of concrete assessment tools that provide feedback to archivists about their efforts, archivists can find it challenging to measure their impact on users.”[7] There are numerous assessment tools available to instructors who teach with primary sources.[8] Reaction-based assessments, such as surveys and evaluations, are popular but tend to lack the objective, quantitative data needed to demonstrate student learning. Classroom assessment techniques, such as polls and one-minute papers, provide some insight into learning, but they often lack the nuance and granularity required to address specific knowledge gaps. One tool is particularly well suited to assessing primary source literacy concepts yet remains underused by instructors: the assessment rubric. While rubrics take time and resources to develop at the outset, they are widely adaptable to a variety of primary source learning experiences and provide detailed insight into the specific areas where novice learners need extra guidance. This chapter will explain how to create rubrics tied to easily assessable deliverables for primary source literacy exercises that will help instructors improve their pedagogy and demonstrate the impact of their work to stakeholders.
Defining Learning Outcomes and Designing an Instruction Session
The key to creating a successful assessment program is to tie a primary source learning experience to clear, succinct, and assessable student learning outcomes. Learning outcomes indicate the skills or abilities that students should take away from a learning experience, usually expressed in terms of attitudinal, behavioral, or cognitive change. They identify the bigger-picture goals of a primary source learning experience, or the “knowledge, skills, attitudes, and habits of mind that students take with them.”[9] They are intrinsically tied to learning objectives, which constitute the measurable and observable behaviors indicating the expected level of attainment.[10] An instructor might set a learning outcome for a primary source learning experience to be “students will be able to distinguish between primary and secondary sources for a research need,” while a learning objective used to assess this outcome might be “students will correctly explain how the document(s) they’re consulting are primary or secondary sources.” When establishing learning outcomes for a primary source learning experience, related learning objectives will generally be expressed as some sort of deliverable, such as a document analysis exercise or annotated bibliography. Instructors may find it useful to organize their learning experiences using the principle of backward design, establishing the criteria that will be used to evaluate a learning experience first, before designing the learning experience itself. This allows instructors to use their limited instruction time effectively, keeping students focused on the most salient points of a primary source literacy exercise.
Learning outcomes should be concrete, measurable, and easily identifiable. They should avoid vague or ambiguous expressions such as “students will understand…” or “students will know…” in favor of more determinate verbiage such as “students will describe…” or “students will explain…” In addition to facilitating assessment for the instructor, using strong action words such as these clarifies expectations for the students participating in the learning experience. Bloom’s Revised Taxonomy[11] is a useful guide for selecting measurable actions to employ in learning outcomes, while also arranging these actions hierarchically according to the different levels of thinking they reflect. For those instructors with limited time available to devote to instruction, or who find the idea of creating learning outcomes from scratch to be intimidating, there are several existing lists of outcomes and competencies that can be reused or revised to suit an instructor’s needs.[12] Instructors can pull a manageable number of learning outcomes (usually 1–3 will suffice) directly from these standards that easily align with their goals for a primary source learning experience.
Once learning outcomes are selected, instructors can focus on designing an instructional experience that meets their established primary source literacy goals. Most impactful are instructional exercises that incorporate active learning techniques, providing students with an opportunity for hands-on exploration of original materials, if possible, rather than a lecture or show-and-tell style presentation of relevant sources. In addition to supporting established learning outcomes, primary source literacy instruction should “sustain the student’s attention, present relevant materials to the student’s studies, design the learning materials and environment to establish and foster the student’s confidence, and enable students to experience satisfaction with the learning experience.”[13] The best instructional exercises mimic a typical archival research experience as closely as possible; students should be given the opportunity to gather in a repository’s reading room, interact with collections in their existing containers (such as boxes or folders), practice careful handling techniques, and navigate the discovery tools that mediate historical collections, such as finding aids, digital collections, or catalogs. In this way, archival repositories can be used as laboratories for students to “experiment” with historical research, in much the same way that a chemistry student might use a science lab.
Part of the active learning experience involves providing students with some sort of deliverable or assignment to focus their work. One popular instrument is a document analysis exercise, a worksheet or form asking students specific questions about the documents they are consulting to help guide their analysis.[14] A sample document analysis exercise is included at the end of this chapter. Document analysis exercises tend to include basic questions related to the identification of a document (What kind of document is this? When and where was it created?), its purpose (Who created this document? Who is its intended audience? Why was it created?), and its content (What is this document telling you?). They may also include more advanced elements of primary source analysis related to bias (What perspectives are absent from this document?), curiosity and ambiguity (What information would help you understand this document better? What questions does it raise for you?), or materiality (What physical characteristics stick out to you? What does this document’s physical construction tell you about how it might have been used or encountered when it was created?). Creating document analysis exercises that clearly and explicitly relate to established learning outcomes will allow instructors to easily craft effective rubrics to assess student learning.
Creating Rubrics to Measure Primary Source Literacy
As demonstrated in numerous studies,[15] rubrics have been used successfully to quantify students’ growth in several areas of primary source literacy, yet they have not been widely used by instruction archivists, probably due to the amount of planning required to develop an effective rubric. This work is justified, however, by a rubric’s “great potential for converting qualitative data into quantitative data within special collections environments.”[16] This quantitative data can be useful in providing feedback to instructors about their pedagogy as well as measuring the impact of their teaching on users.[17] Additionally, instructors will find that rubrics “[make] grading easier and more consistent because rubrics explicitly spell out the expectations of student work and, in the case of multiple graders, provide guidance for grading consistently.”[18] By devoting some time and effort at the outset to developing an adaptable rubric to assess primary source literacy, instructors can improve their own pedagogy and potentially track demonstrable changes in students’ primary source literacy skills as a result of archival research and instruction.
Rubrics can be holistic or analytic. Holistic rubrics score a primary source learning experience as a whole, using a single rating, rather than individually focusing on its parts. Analytic rubrics are designed to score participants on several elements within a primary source learning experience, which are then added together to achieve a total score. Analytic rubrics are more common in archives and special collections instruction, where instructors generally seek to measure student learning in more than one area of primary source literacy.[19] Regardless of the type of rubric that is chosen, there is no single right or wrong way to create a rubric. Rubrics are generally organized around a rating scale of 3–5 levels of achievement (usually from “developing” to “exemplary,” sometimes including a value of 0 or “not observable”), and analytic rubrics are further organized into several individual attributes that are developed from established learning outcomes. Bonnie Gratch-Lindauer explains the process of creating rubrics from learning outcomes as follows: (1) establish the desired learning outcomes, (2) identify observable attributes that students should demonstrate in support of these outcomes, (3) brainstorm characteristics that describe each attribute, and (4) write narrative descriptions for the levels of performance for each attribute.[20] Following these steps, the process of developing a rubric to assess primary source literacy might look something like this:
- Learning outcome: Students will identify and communicate information found in primary sources.
- Learning objectives: Students will be able to summarize the content of a source and accurately identify its basic elements, such as title, date, creators’ names, intended audience, format, and place of origin.
- Levels of understanding (on a 4-point scale): Learners at various stages will be able to correctly identify zero, a few, most, or all elements of the document.
- Expression in a rubric:
1–Poor | 2–Limited | 3–Good | 4–Exemplary |
Does not accurately identify any elements of the document | Identifies a few elements of the document accurately | Identifies most elements of the document accurately | Identifies all elements of the document accurately |
This process is then repeated for each established learning outcome until a full rubric has been developed. Instructors may consider designing a rubric to meet the needs of their primary source literacy instruction program as a whole, rather than an individual learning experience, which can serve as a master template that can then be adapted to individual use cases. An example of such a rubric currently being used by the University of North Carolina Wilmington (UNCW)’s Center for Southeast North Carolina Archives and History, which is mapped directly to the Guidelines for Primary Source Literacy and ACRL Framework, is included at the end of this chapter.[21]
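For instructors comfortable with a little scripting, the structure of an analytic rubric lends itself to simple tabulation. The sketch below is purely illustrative, not part of any repository’s actual workflow: the attribute names are abbreviated from the sample rubric, and the scored worksheets are invented. It shows how per-attribute ratings can be totaled for each student and averaged across a class to surface the kind of knowledge gap a rubric is designed to reveal.

```python
# Illustrative only: an analytic rubric modeled as a data structure.
# Attribute names echo the sample rubric; level descriptions are
# abbreviated, and all student scores below are hypothetical.
from statistics import mean

RUBRIC = {
    "Identification": ["no elements", "a few elements", "most elements", "all elements"],
    "Interpretation": ["incorrect", "limited", "accurate summary", "significant understanding"],
    "Materiality": ["none mentioned", "one or two traits", "described without impact", "impact articulated"],
}

def total_score(scores: dict[str, int]) -> int:
    """Sum one student's per-attribute ratings into an analytic-rubric total."""
    return sum(scores.values())

def attribute_averages(class_scores: list[dict[str, int]]) -> dict[str, float]:
    """Average each attribute across all students to surface weak areas."""
    return {attr: mean(s[attr] for s in class_scores) for attr in RUBRIC}

# Three hypothetical scored worksheets on the 4-point scale.
worksheets = [
    {"Identification": 4, "Interpretation": 3, "Materiality": 1},
    {"Identification": 3, "Interpretation": 3, "Materiality": 2},
    {"Identification": 4, "Interpretation": 2, "Materiality": 1},
]

averages = attribute_averages(worksheets)
weakest = min(averages, key=averages.get)  # attribute with the lowest class average
print(weakest)  # a consistently low attribute signals a gap to address in teaching
```

Because the attributes live in a single dictionary that mirrors the master rubric template, the same tabulation adapts without code changes when the rubric is revised for a new learning experience.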
The Value of Rubrics: A Case Study
In my role at UNCW’s Center for Southeast North Carolina Archives and History, I treat the primary source literacy assessment rubric as a critical component of my instruction program. Although the rubric is still in its nascent stages (having only been in use for one academic year), I have already used it to change and grow my pedagogy as well as demonstrate the impact of archival work to our stakeholders. The rubric has allowed me to identify and address the most common knowledge gaps and mistakes made by learners who may be analyzing primary sources for the first time, a few of which have surprised me. For instance, after grading document analysis worksheets from the fall 2022 semester, I noticed that students scored exceptionally low in their understanding of materiality, or the ways in which a source’s physical format impacts how it can be accessed, interpreted, and understood. I realized that I was not adequately introducing the concept of materiality to the students in my instruction sessions, nor was I asking them questions that prompted them to critically engage with this concept. After adapting both my pedagogical approach and the worksheets I designed for the spring 2023 semester, scores in materiality improved substantially.
It is becoming increasingly important for libraries and archives to prove their impact to stakeholders in a concrete and meaningful way. Despite resounding agreement within the field that primary source literacy instruction has value, we have been slow to embrace a culture of assessment as a means of quantifying that value. To echo Robin Katz, “We assume that it is inherently a good idea to teach undergraduates with archives, but can we prove it? Do we even understand the value we add?”[22] Whether adopted as part of a comprehensive assessment program or deployed on their own, rubrics offer a means of demonstrating how archival research and instruction have a tangible effect on student learning and critical thinking skills. Rubrics open up opportunities for collaboration with librarians and teaching faculty by expressing the benefits of archival instruction in a common language grounded in standardized learning outcomes and information literacy competencies. They can help archivists advocate for resources to improve their work, and they allow instructors to understand the strengths and weaknesses of their own teaching. If primary source literacy instructors are prepared to put in the necessary work to develop an effective rubric to evaluate student learning, the immeasurable opportunities for pedagogical growth become, quite literally, measurable.
Appendix A: Sample Document Analysis Exercise
Closely examine the document sitting in front of you, then record the following information.
- Identify/describe the document:
- Publication/creation date:
- Who created it?
- Who is its intended audience?
- Is there anything in the document you don’t understand? What questions do you have about it?
- What makes this document a primary source?
- List the other three documents sitting at your table, and have your classmates describe them to you.
- How do these items add context or additional information to the document you examined?
Appendix B: Rubric for Assessing Primary Source Literacy
Criteria for Assessment | 1–Poor | 2–Limited | 3–Good | 4–Exemplary | 0 |
Identification | Does not accurately identify any elements of the document | Identifies a few elements of the document accurately | Identifies most elements of the document accurately | Identifies all elements of the document accurately | Not assessable |
Interpretation | Incorrectly interprets the content of the document | Demonstrates a limited understanding of the content of the document | Accurately summarizes the content of the document | Demonstrates a significant understanding of the document’s creation and its connection to historical themes | Not assessable |
Contextualization | Does not mention the historical context of the document, or misidentifies it | Has a basic understanding of the historical context of the document | Shows an in-depth understanding of the historical context of the document | Accurately places the document in a historical context and offers a sophisticated analysis of its impact | Not assessable |
Materiality | Does not mention any physical characteristics of the document | Identifies one or two physical characteristics of the document | Correctly describes the material elements of the document, but is unable to articulate their impact on its analysis | Demonstrates how the physical nature of the document can add to our understanding of it as a whole and how it was created | Not assessable |
Synthesis | Draws no connection between related documents | Identifies a basic connection between related documents | Accurately describes the various connections between related documents | Is able to place multiple sources into conversation with each other to draw a logical conclusion or support an argument | Not assessable |
Critical Thinking | Makes no attempt to evaluate the document beyond summarizing its content | Offers limited analysis of one or two elements of the document | Able to offer an analysis of the document that shows only a basic understanding of its connections to larger questions | Provides a sophisticated analysis of the document as a whole, including gaps and missing information, and understands its relationship to larger concepts or issues | Not assessable |
Broader Connections | Makes no connection to a larger research question, course themes, or current events | Draws a basic connection to either a larger research question, course themes, or current events | Describes multiple avenues by which the document relates to a larger research question, course themes, or current events | Provides a sophisticated analysis and appreciation of how the document demonstrates a connection between the past and the present | Not assessable |
Engagement | Shows no personal connection and no interest in the document | Expresses a limited personal connection or vague interest in the document | Displays curiosity and interest in the document and describes a personal connection to its content or creator | Demonstrates an emotional engagement with the document and its creator, and recognizes its historical significance | Not assessable |
Explanation of Criteria for Assessment
- Identification: Refers to the student’s ability to accurately identify the basic elements of a primary source, such as title, date, creators’ names, intended audience, format, and place of origin.
- Guidelines for Primary Source Literacy: Read, Understand and Summarize (3-A, 3-B)
- ACRL Framework: Authority is Constructed and Contextual
- Interpretation: Refers to the student’s ability to accurately interpret and communicate the content of the document, including elements such as argument, tone, purpose, or a summary of content.
- Guidelines for Primary Source Literacy: Interpret, Analyze, and Evaluate (4-B)
- ACRL Framework: Authority is Constructed and Contextual
- Contextualization: Refers to the student’s ability to appropriately situate the document in a historical context.
- Guidelines for Primary Source Literacy: Interpret, Analyze, and Evaluate (4-C)
- ACRL Framework: Authority is Constructed and Contextual
- Materiality: Refers to the student’s ability to describe the physical makeup of the document, differentiate between original documents and surrogates in various forms, understand the relationship between materiality and content, and make a meaningful connection between materiality and evaluation of the document as a whole.
- Guidelines for Primary Source Literacy: Read, Understand, and Summarize (3-C), Interpret, Analyze, and Evaluate (4-E)
- ACRL Framework: Information Creation as a Process
- Synthesis: Refers to the student’s ability to articulate connections between related sources, put two sources in conversation with one another, and compare and contrast the various elements of multiple sources.
- Guidelines for Primary Source Literacy: Use and Incorporate (5-A)
- ACRL Framework: Research as Inquiry, Scholarship as Conversation
- Critical Thinking: Refers to the student’s ability to “read between the lines” of a primary source, including recognizing and identifying gaps or missing information necessary to analyze the document, critically evaluating the perspective of the creator, interrogating the source’s biases and limitations, and understanding its relationship to a larger concept or theme.
- Guidelines for Primary Source Literacy: Interpret, Analyze, and Evaluate (4-D)
- ACRL Framework: Authority is Constructed and Contextual, Information Has Value
- Broader Connections: Refers to the student’s ability to draw connections to course themes, determine a document’s relevance to a larger research topic or question, and describe its connections to current events or the current cultural milieu.
- Guidelines for Primary Source Literacy: Conceptualize (1-C)
- ACRL Framework: Information Has Value
- Engagement: Refers to the student’s ability to demonstrate empathy, interest, and an emotional or personal connection, or otherwise engage with the document beyond analyzing and interpreting it.
- Guidelines for Primary Source Literacy: Interpret, Analyze, and Evaluate (4-F)
- ACRL Framework: Information Has Value
Bibliography
Anderson, Lorin W., and David R. Krathwohl, eds. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman, 2001.
Avery, Elizabeth F., ed. Assessing Student Learning Outcomes for Information Literacy Instruction in Academic Institutions. Chicago: Association of College and Research Libraries, 2003.
Association of College & Research Libraries. Framework for Information Literacy for Higher Education. Chicago: ACRL, 2015. http://www.ala.org/acrl/files/issues/infolit/framework.pdf, archived January 19, 2024, at https://web.archive.org/web/20240119025328/https://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/framework.pdf.
Bahde, Anne, and Heather Smedberg. “Measuring the Magic: Assessment in the Special Collections and Archives Classroom.” RBM: A Journal of Rare Books, Manuscripts, & Cultural Heritage 13, no. 2 (Fall 2012): 152–74. https://doi.org/10.5860/rbm.13.2.380, archived July 24, 2024, at https://web.archive.org/web/20240501000000*/https://rbm.acrl.org/index.php/rbm/article/view/380.
Bahde, Anne, Heather Smedberg, and Mattie Taormina, eds. Using Primary Sources: Hands-On Instructional Exercises. Santa Barbara, CA: Libraries Unlimited, 2014.
Billeaudeaux, Brigitte, and Rachel Scott. “Leveraging Existing Frameworks to Support Undergraduate Primary Source Research.” Reference and User Services Quarterly 58, no. 4 (2019): 246–256. http://dx.doi.org/10.5860/rusq.58.4.7151, archived April 25, 2024, at https://web.archive.org/web/20240425213615/https://journals.ala.org/index.php/rusq/article/view/7151.
Carini, Peter. “Archivists as Educators: Integrating Primary Sources into the Curriculum.” Journal of Archival Organization 7, no. 1 (2009): 41–50. https://doi.org/10.1080/15332740902892619.
Carini, Peter. “Information Literacy for Archives and Special Collections: Defining Outcomes.” portal: Libraries and the Academy 16, no. 1 (2016): 191–206. https://doi.org/10.1353/pla.2016.0006.
Horowitz, Sarah M. “Hands-On Learning in Special Collections: A Pilot Assessment Project.” Journal of Archival Organization 12, no. 3–4 (2014): 216–29. http://dx.doi.org/10.1080/15332748.2015.1118948.
Katz, Robin. “Priorities for Progress.” In Teaching Undergraduates with Archives, edited by Nancy Bartlett, Elizabeth Gadelha, and Cinda Nofziger, 337–46. Ann Arbor: Maize Books, 2019. https://doi.org/10.3998/mpub.11499242.
Krause, Magia. “Undergraduates in the Archives: Using an Assessment Rubric to Measure Learning.” The American Archivist 73, no. 2 (2010): 507–34. https://www.jstor.org/stable/23290757.
Marino, Chris. “Inquiry-based Archival Instruction: An Exploratory Study of Affective Impact.” The American Archivist 81, no. 2 (2018): 483–512. https://www.jstor.org/stable/48617865.
Meiman, Meg, and Meggan Press. “Comparing the Impact of Physical and Digitized Primary Sources on Student Engagement.” portal: Libraries and the Academy 21, no. 1 (2020): 99–112. http://dx.doi.org/10.1353/pla.2021.0007.
Morris, Sammie, Lawrence Mykytiuk, and Sharon Weiner. “Archival Literacy for History Students: Identifying Faculty Expectations of Archival Research Skills.” The American Archivist 77, no. 2 (2014): 394–424. https://www.jstor.org/stable/43489672.
Robyns, M. C. “The Archivist as Educator: Integrating Critical Thinking Skills into Historical Research Methods Instruction.” The American Archivist 64, no. 2 (2001): 363–84. https://www.jstor.org/stable/40294177.
SAA-ACRL/RBMS Joint Task Force on Primary Source Literacy. Guidelines for Primary Source Literacy. Chicago: ACRL/SAA, 2018. https://www2.archivists.org/standards/guidelines-for-primary-source-literacy, archived July 16, 2024, at https://web.archive.org/web/20240716043122/https://www2.archivists.org/standards/guidelines-for-primary-source-literacy.
Suskie, Linda A. Assessing Student Learning: A Common Sense Guide. San Francisco: Jossey-Bass, 2018.
Torres, Deborah A., and Elizabeth Yakel. “AI: Archival Intelligence and User Expertise.” The American Archivist 66, no. 1 (2003): 51–78. https://www.jstor.org/stable/40294217.
Endnotes
- Marcus C. Robyns, “The Archivist as Educator: Integrating Critical Thinking Skills into Historical Research Methods Instruction,” The American Archivist 64, no. 2 (2001): 363–84, https://www.jstor.org/stable/40294177. ↵
- Deborah A. Torres and Elizabeth Yakel, “AI: Archival Intelligence and User Expertise,” The American Archivist 66, no. 1 (2003): 51–78, https://www.jstor.org/stable/40294217. ↵
- Sammie Morris, Lawrence Mykytiuk, and Sharon Weiner, “Archival Literacy for History Students: Identifying Faculty Expectations of Archival Research Skills,” The American Archivist 77, no. 2 (2014): 394–424, https://www.jstor.org/stable/43489672. ↵
- Association of College & Research Libraries, Framework for Information Literacy for Higher Education (Chicago: ACRL, 2015), http://www.ala.org/acrl/files/issues/infolit/framework.pdf, archived April 9, 2024, at https://web.archive.org/web/20240000000000*/https://www.ala.org/acrl/files/issues/infolit/framework.pdf. ↵
- SAA-ACRL/RBMS Joint Task Force on Primary Source Literacy, Guidelines for Primary Source Literacy (Chicago: ACRL/SAA, 2018), https://www2.archivists.org/standards/guidelines-for-primary-source-literacy, archived July 16, 2024, at https://web.archive.org/web/20240000000000*/https://www2.archivists.org/standards/guidelines-for-primary-source-literacy. ↵
- Anne Bahde and Heather Smedberg, “Measuring the Magic: Assessment in the Special Collections and Archives Classroom,” RBM: A Journal of Rare Books, Manuscripts, & Cultural Heritage 13, no. 2 (Fall 2012): 152–74, https://doi.org/10.5860/rbm.13.2.380. ↵
- Magia Krause, “Undergraduates in the Archives: Using an Assessment Rubric to Measure Learning,” The American Archivist 73, no. 2 (2010): 508, https://www.jstor.org/stable/23290757. ↵
- Many examples can be found in Bonnie Gratch-Lindauer’s “Selecting and Developing Assessment Tools,” in Assessing Student Learning Outcomes for Information Literacy Instruction in Academic Institutions, ed. Elizabeth F. Avery (Chicago: Association of College and Research Libraries, 2003). ↵
- Linda A. Suskie, Assessing Student Learning: A Common Sense Guide (San Francisco: Jossey-Bass, 2018), 17. ↵
- Yvonne Meulemans and Gabriela Sonntag, “Planning for Assessment,” in Assessing Student Learning Outcomes for Information Literacy Instruction in Academic Institutions, ed. Elizabeth F. Avery (Chicago: ACRL, 2003), 17. ↵
- Lorin W. Anderson and David R. Krathwohl, eds., A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives (New York: Longman, 2001). ↵
- See especially the list of learning objectives included in the Guidelines for Primary Source Literacy, as well as lists of competencies and learning outcomes included in Morris, Mykytiuk, and Weiner, “Archival Literacy for History Students;” Peter Carini, “Archivists as Educators: Integrating Primary Sources into the Curriculum,” Journal of Archival Organization 7, no. 1 (2009): 41–50, https://doi.org/10.1080/15332740902892619; and Peter Carini, “Information Literacy for Archives and Special Collections: Defining Outcomes,” portal: Libraries and the Academy 16, no. 1 (2016): 191–206, https://doi.org/10.1353/pla.2016.0006. ↵
- Chris Marino, “Inquiry-based Archival Instruction: An Exploratory Study of Affective Impact,” The American Archivist 81, no. 2 (2018): 486, https://www.jstor.org/stable/48617865. ↵
- A document analysis exercise is suggested here for its low barrier to entry and ease of creation, and because this skill is a foundational component to any historical research experience. Many other active learning tools and deliverables exist, however. For some examples, see Using Primary Sources: Hands-On Instructional Exercises, eds. Anne Bahde, Heather Smedberg, and Mattie Taormina (Santa Barbara, CA: Libraries Unlimited, 2014). ↵
- A document analysis exercise is suggested here for its low barrier to entry and ease of creation, and because this skill is a foundational component to any historical research experience. Many other active learning tools and deliverables exist, however. For some examples, see Using Primary Sources: Hands-On Instructional Exercises, eds. Anne Bahde, Heather Smedberg, and Mattie Taormina (Santa Barbara, CA: Libraries Unlimited, 2014). ↵
- Bahde and Smedberg, “Measuring the Magic,” 167. ↵
- Krause, “Undergraduates in the Archives,” 508. ↵
- Krause, 515. ↵
- Gratch-Lindauer, “Selecting and Developing Assessment Tools,” 31–32. ↵
- Gratch-Lindauer, 32. ↵
- Adapted from Horowitz, “Hands-On Learning in Special Collections,” and Krause, “Undergraduates in the Archives.” ↵
- Robin Katz, “Priorities for Progress,” in Teaching Undergraduates with Archives, eds. Nancy Bartlett, Elizabeth Gadelha, and Cinda Nofziger (Ann Arbor: Maize Books, 2019), 344. ↵