
Dipping Your Toe in The CURE Pool: Longitudinal Tracking of Instructors Suggests Use of a Short-Duration CURE Can Catalyze Expansion to Longer CURE Experiences

    Published Online: https://doi.org/10.1187/cbe.23-05-0091

    Abstract

    Course-based undergraduate research experiences (CUREs) are an effective method of engaging large numbers of students in authentic research but are associated with barriers to adoption. Short CURE modules may serve as a low-barrier entryway, but their effectiveness in promoting expansion has not been studied. The Prevalence of Antibiotic Resistance in the Environment (PARE) project is a modular CURE designed to be a low-barrier gateway into CURE use. In a series of interviews, we tracked and characterized use of PARE among 19 PARE-interested instructors throughout the Innovation-Decision Process described by Rogers’ Diffusion of Innovations theory. The majority (16/19) implemented PARE at least once, and most of these implementers (11/16) had expanded use by the final interview. Three of four cases of discontinuance were due to a disruption, such as moving institutions or a change in course assignment, and occurred among community college faculty. Expanders expressed fewer personal challenges than nonexpanders. Overall, our analysis shows that perception of barriers is nuanced and shaped by the innovation itself, the institutional context, and one’s own experiences. These results suggest that a short-duration, low-barrier CURE can serve as a catalyst for implementation of a longer-duration CURE.

    INTRODUCTION

    Course-based undergraduate research experiences (CUREs) are a pedagogical technique that brings authentic scientific research into classrooms, allowing a larger population of students to be exposed to the benefits of research than would be feasible with traditional mentored laboratory research experiences alone (Healey and Jenkins, 2009; Lopatto and Tobias, 2009; Wei and Woodin, 2011; Auchincloss et al., 2014; Bangera and Brownell, 2014; Buchanan and Fisher, 2022). This is important because both student self-reports and objective assessments reveal multiple benefits of a research experience, such as gains in skills, self-confidence, understanding of the research process, career choice confirmation, retention in science, technology, engineering, and mathematics, and a positive correlation with grade point average and graduation rate (Nagda et al., 1998; Gregerman, 1999; Bauer and Bennett, 2003; National Research Council, 2003; Lopatto, 2004; Seymour et al., 2004; Hunter et al., 2007; Kuh, 2008; Taraban and Blanton, 2008; Laursen et al., 2010; Lopatto and Tobias, 2009; Fechheimer et al., 2011; Eagan et al., 2013; Jordan et al., 2014; Shaffer et al., 2014; Olimpo et al., 2016; Rodenbusch et al., 2016; Carpi et al., 2017; Frantz et al., 2017; Shuster et al., 2019; Buchanan and Fisher, 2022). Several reports have called for infusion of more inquiry and research into the undergraduate experience (Hattie and Marsh, 1996; Kenny et al., 1998; National Research Council, 2003; American Association for the Advancement of Science (AAAS), 2011; Olson and Riordan, 2012), but traditional (out-of-class) research experiences can perpetuate inequities (Mullin, 2012; Bangera and Brownell, 2014; Wolkow et al., 2014; Hensel and Cejda, 2015; Hensel and Davidson, 2018). For example, in the traditional out-of-class research model, time spent in the laboratory, if unpaid, can interfere with the need for wage-earning employment. Additionally, many students from groups historically marginalized in STEM first encounter undergraduate biology instruction in community colleges, where creating a culture of research is not always considered central to the mission of the institution (Perez, 2003; Hewlett, 2016; Hensel and Davidson, 2018; Hewlett, 2018; Rosas Alquicira et al., 2022). CUREs have emerged as a more equitable and high-throughput alternative to out-of-class research because the research is embedded as part of the required curriculum, reaching many more students than traditional research experiences (Shaffer et al., 2010; Brownell et al., 2013; Bangera and Brownell, 2014; Jordan et al., 2014; Wolkow et al., 2014; Dolan, 2016; Rodenbusch et al., 2016; Hensel and Davidson, 2018; Hyman et al., 2019; Hurley et al., 2021; Hanauer et al., 2022). However, the full potential of CUREs has not yet been realized. Many laboratory courses still do not include a genuine research experience (Sundberg and Armstrong, 1993; Sundberg et al., 2005; Beck et al., 2014; Spell et al., 2014). Further, if adoption of CUREs is disproportionately low in institutions serving the majority of lower-income students, the notion of CUREs leading to more equitable research experiences could prove inaccurate.
The reasons for low adoption are likely complex, but it is known that CUREs are associated with many challenges and barriers to their implementation, including constraints related to the existing course or required curriculum, logistical and technical challenges, available resources, instructor time investment, and student readiness (Healey and Jenkins, 2009; Lopatto et al., 2014; Spell et al., 2014; Wolkow et al., 2014; Shortlidge et al., 2016; Cooper and Brownell, 2018; Heim and Holt, 2019; Davis et al., 2020; Genné-Bacon et al., 2020). In this paper, through longitudinal tracking of interested instructors, we assess whether a modular approach to CURE design leads to continued use by undergraduate instructors and what factors may be associated with this trajectory. We also assess whether instructors use a short-term CURE module as a springboard into a longer duration CURE for their students.

    CUREs have been grouped into two categories: CUREs developed from scratch by an instructor, often based on their own research, and “network” CUREs developed by one research team and disseminated broadly (Lopatto et al., 2014; Dolan, 2016; Shortlidge et al., 2016). Network CUREs may ease some of the common barriers instructors face when implementing a CURE (Lopatto et al., 2014; Govindan et al., 2020). For example, a set of methods optimized for classroom use along with instructional resources can save instructors time needed for CURE development, and the network can serve as a resource for troubleshooting. However, instructors still face barriers to implementing network CUREs, including compatibility of the CURE with their course structure, or inability to transform an entire course (or create a new course) to accommodate a semester-long network CURE (Lopatto et al., 2014; Craig, 2017; Genné-Bacon et al., 2020). Short-duration CURE modules that can be flexibly inserted into an existing course could help lower barriers to CURE implementation (Howard and Miskowski, 2005; Staub et al., 2016; Hanauer et al., 2018; Dahlberg et al., 2020; Genné-Bacon et al., 2020) and, therefore, bring research experiences to more students. In a previous paper, we found that instructors interested in using the Prevalence of Antibiotic Resistance in the Environment (PARE) project (a collection of short-duration modular CUREs) were more likely to mention PARE as being compatible with their course structure and available resources, relative to CUREs in general (Genné-Bacon et al., 2020). Interviewed instructors were also far less likely to perceive “bandwidth” as a barrier to implementing PARE. A short-duration CURE may also lower barriers by providing a low-stakes opportunity for first-time CURE instructors to experiment with CURE implementation.

    However, the longer-term outcome of this “trialability” is unclear. Do instructors persist with using the CURE? And importantly, do they ultimately expand CURE use to create a longer-term experience for their students? This question of CURE expansion is particularly important because, despite the presumed ease of implementation of short-duration CURE modules, it is not clear to what degree students benefit from short-duration research experiences. While some studies have shown modest learning and/or attitudinal gains from short-duration CUREs (Genné-Bacon and Bascom-Slack, 2018; Dahlberg et al., 2020; Cole et al., 2021; Fuhrmeister et al., 2021; Wickham et al., 2021, 2023; Fendos et al., 2022; Stanfield et al., 2022; Plaisier et al., 2023; Bliss et al., 2023; Kleinschmit et al., 2024), others have shown that perceived student benefits continue to increase with the duration of the CURE experience. For example, Hanauer et al. (2018) showed that a short module experience results in significantly higher project ownership than a traditional lab, but less than a full-semester CURE. Shaffer et al. (2014) likewise reported that perceived student benefits continued to increase with increased CURE duration, and DeChenne-Peters et al. (2023) found that, compared with a short CURE module, a full-semester-length CURE had greater impacts on students’ reported STEM career interest and plans to conduct future research. Additionally, a multi-institution study of different CURE formats by Mader et al. (2017) found that, in general, students’ self-reported learning and attitudinal gains were higher after participating in CUREs with a sequence of multiple CURE modules compared with a single CURE module inserted within a traditional laboratory course (with full-semester CUREs showing even higher gains). Similar trends have been seen with traditional mentored research and summer research programs (e.g., Li et al., 2008; Thiry et al., 2012; Adedokun et al., 2014).

    With the knowledge that a longer-term CURE provides a richer experience for students, but understanding that a short format appeals to instructors (Genné-Bacon et al., 2020), the PARE project CURE was designed to serve as a steppingstone for instructors to implement a longer duration CURE. Although the original PARE core module is short in duration (two to three class periods), there exists a suite of additional modules allowing for expansion to a longer-duration CURE experience (Genné-Bacon and Bascom-Slack, 2018; Genné-Bacon et al., 2020; Fuhrmeister et al., 2021; Bliss et al., 2023). Instructors can start with one of a few key modules and then, in another semester, expand by adding from this suite of additional PARE add-on modules. Other flexible, modular CUREs also exist (Muth and McEntee, 2014; Adkins et al., 2018; Hanauer et al., 2018; Hyman et al., 2019; Roberts et al., 2019; Dahlberg et al., 2020; Dizney et al., 2020; Zelaya et al., 2020; Gastreich, 2021), but longitudinal tracking of instructor use patterns to determine whether this approach to CURE design provides a pathway to expanded CURE use has, to our knowledge, not yet been conducted. In fact, longitudinal studies on faculty CURE persistence and long-term adoption, in general, are rare. Connor et al. (2022) used a cross-sectional approach to show that CURE discontinuance was rare (only 4% of respondents), but comprehensive institutions, 2-year institutions and minority-serving institutions were underrepresented. DeChenne-Peters and Scheuerman (2022) tracked faculty recruited to participate in a specific network CURE and showed that faculty perceptions differ over time, but the study was not intended to track discontinuance or expansion because the participants were recruited and supported to implement the CURE as written.

    In this study, our aim is to understand factors that may support translation of interest into implementation and expanded CURE use in the classroom. Through interviews, we monitor the perceptions and experiences of faculty interested in teaching the PARE project over time. We track faculty at 2-year and 4-year institutions over approximately two years: preimplementation (Timepoint 1), postimplementation (Timepoint 2), and long-term follow-up (Timepoint 3). Faculty in this cohort were not recruited, compensated, or provided with special training or professional development to use PARE.

    Theoretical Framework

    Many implementation theories and frameworks exist to understand the adoption and/or dissemination of evidence-based educational practices (EBEP), and they have been categorized and described by Nilsen (2015). Many of these theories have elements in common with Rogers’ Diffusion of Innovations (DOI) framework, which is an expansive theory used to study how innovations are adopted by individuals, organizations, and communities, and how these innovations spread through the population (Rogers, 1962, 2003). We chose Rogers’ theory to guide our work over other frameworks because it is one of the most comprehensive and has been widely applied, providing other published works as guidance and for comparing outcomes. In the words of Nilsen (2015), “The Theory of Diffusion is considered the single most influential theory in the broader field of knowledge utilization of which implementation science is a part.” Originally developed to explain the diffusion of technological innovations in agriculture, DOI has since been used to study many different types of innovations, including overcoming barriers to implementing educational innovations (Schmidt and Brown, 2007; Watson, 2007; Henderson and Dancy, 2007; Henderson et al., 2012; Andrews and Lemons, 2015; Marbach-Ad and Hunt Rietschel, 2016; Shadle et al., 2017; Genné-Bacon et al., 2020; McConnell et al., 2020; DeChenne-Peters and Scheuermann, 2022). The Innovation-Decision (ID) process describes the thought pattern an individual goes through from their first knowledge of the innovation, to forming an attitude about it, to making a decision to attempt implementation (or not), to actual implementation, and finally to confirmation of the decision (Rogers, 2003, pp. 21, 107–118). Time is thus an important aspect in diffusion research. The Persuasion Stage (i.e., the formation of the adopter’s opinion of the innovation) is highly influenced by five main characteristics of the innovation: Relative Advantage, Compatibility, Trialability, Observability, and Complexity (Rogers, 2003, p. 111). Complexity can be thought of as a barrier to implementation and is negatively correlated with adoption; all the other characteristics are positively correlated with adoption. As an example, while making the decision to implement a CURE, an instructor may consider whether it is better than their current curriculum (Relative Advantage), whether it is Compatible with the goals of their course, and whether it is too Complex for the readiness level of their students. These perceptions of the innovation are influenced by personal characteristics of the individual as well as the context of usage.

    The ID process is not always linear, nor is it final. During the Implementation and Confirmation Stages, adopters continue to evaluate the innovation, such that their experience informs updated Persuasion and Decision Stage perceptions. For example, an individual who, after developing a positive perception of the innovation during the Persuasion Stage, decides to implement or use the innovation may find that actual use of the innovation does not match the expectations developed during the Persuasion Stage. They may change their decision, resulting in discontinued use of the innovation. Thus, perceptions of the innovation with respect to the Persuasion Stage characteristics continue to be relevant even after implementation of the innovation.

    Context of this Study–The PARE CURE

    PARE was designed with adoptability in mind. For example, Complexity (e.g., the cost and resources required, complexity of experiments, etc.) is kept to a minimum, while the ability to insert a module into an existing laboratory course was intended to result in high Compatibility. Although the original PARE core module is short in duration (two to three class periods), there exists a suite of additional modules allowing for expansion to a longer-duration CURE experience, thus serving as a steppingstone for instructors to implement a longer duration CURE (Genné-Bacon et al., 2020). The PARE curriculum has been described elsewhere (Genné-Bacon and Bascom-Slack, 2018; Fuhrmeister et al., 2021); briefly, PARE is a program for detection, reporting, and study of antibiotic-resistant bacteria in environmental samples. It consists of background videos, skill-building known-outcome activities, and research-based modular activities that produce results for upload into a national database. Participation in PARE generally begins with a trial of the core module, which takes about three laboratory class periods and makes use of equipment and reagents standard in most teaching labs. It is particularly well suited to microbiology laboratory courses because it teaches core microbiology skills. The intent in developing the core PARE module was not to provide students an extensive research experience but to provide an entrée into classroom research for instructors, in hopes that they would subsequently expand the research experience for their students. The core module provides broad relevance, opportunity for discovery, and experience in the practice of science, three key elements of CUREs (Auchincloss et al., 2014; Brownell and Kloser, 2015). For example, students engage in scientific practices by forming a hypothesis about which environmental sample collection site they predict would contain high levels of antibiotic-resistant bacteria. The resistance level discovered in students’ environmental samples is unknown to both student and instructor, thus providing opportunity for discovery. While surveillance of resistance is routinely done in a clinical setting, systematic environmental surveillance is lacking, yet there is agreement on its importance in addressing the growing concern of antibiotic resistance (Berendonk et al., 2015). Therefore, the student-generated results have broad relevance to the scientific community beyond the classroom. There are ample opportunities for iteration and communication, two additional key elements of CUREs (Auchincloss et al., 2014; Brownell and Kloser, 2015), although they are not explicitly built into the core module. Our previous work (Genné-Bacon et al., 2020) showed that instructors who were interested in using PARE did perceive it to be of low Complexity and high Compatibility relative to CUREs in general. The short duration, characteristics related to cost, and alignment with course learning outcomes appeared as the most salient drivers of PARE implementation.

    Research Questions/Goals of Study

    This longitudinal study followed a cohort of faculty who expressed interest in the modular CURE, PARE, and aimed to assess their perceptions and use patterns of CUREs over time. Through a series of interviews conducted over 2 y, we sought to evaluate our hypothesis that use of a short-duration CURE module can eventually lead to adoption of longer-duration CURE experiences. We assess this through the following research questions: 1) What is the trajectory of instructors who expressed interest in PARE? Did they actually implement? If so, did they continue use over time or did they discontinue? Did those who continued expand the PARE program? 2) What situational characteristics are associated with expansion of PARE? For example, are expanders more likely to be teaching at a particular institution type? 3) How do perceptions of the PARE experience differ between expanders and nonexpanders?

    METHODS

    Participating instructors were interviewed at three time points over a two-year period. At each interview, instructors were asked a series of open-ended questions. Responses to the questions were coded and analyzed.

    Initial (Timepoint 1) Recruitment

    Recruitment of participants was carried out as described in Genné-Bacon et al. (2020). Briefly, all instructors who made inquiries about the PARE project during the summer of 2017 were contacted to ask if they intended to implement PARE and if they were interested in participating in this research study. Of the 29 instructors contacted, 19 met the inclusion criteria (had not yet attempted to implement PARE but intended to within the next academic year) and agreed to participate. All 19 instructors participated in a pre-implementation interview and their institution type was assigned as described in Genné-Bacon et al. (2020). Instructors were not compensated for this interview.

    Timepoint 2 Interviews

    At the end of the 2017–2018 academic year, all 19 instructors were contacted again (via email) and asked to participate in a follow-up interview. Instructors were compensated $50 for participating in the follow-up interview, which was planned to last approximately 30–40 min. Recruitment language made clear that compensation was for the interview only and that instructors were eligible regardless of whether or not they had implemented. Eighteen of the original 19 instructors responded to the request for an additional interview. Of those, one declined to participate in a follow-up interview (and indicated that they did not implement). One instructor indicated in their email reply that they were not able to implement PARE as planned; in a phone conversation, this instructor indicated that they had not obtained expected funding, which left them unable to implement. Semistructured interviews were again conducted (by author E.G.) with the remaining 16 participants and recorded using WebEx. All interviews were transcribed using the service TranscribeMe. Interview questions were designed to elicit both general and specific sentiment about the PARE implementation experience and to capture “gut feelings” soon after implementation. Interview questions were reviewed for clarity and thematic alignment by educational researchers at the Tufts Center for Science Education (see Appendix 1 for interview script). Instructors were asked whether they did, in fact, implement the PARE project in the 2017–2018 academic year, and whether they intended to use it again.

    Timepoint 3 Interviews

    Two years after the initial interview, all instructors who had implemented PARE at Timepoint 2 were again contacted (by author M.F.) through email and asked to participate in a third and final interview. All 16 instructors replied, and 15 agreed to a third interview. The one instructor who declined to be interviewed instead summarized their PARE implementation status over email. Instructors were compensated $100 for this final interview. To encourage honest and open responses, interviews were conducted by author M.F., who is not affiliated with the PARE project.

    The Timepoint 3 interview script went through a highly iterative development process. Timepoint 3 questions were written to elicit general sentiment felt after a period of reflection, with a focus on perceived expectation versus perceived outcome. As with all previous timepoint interviews, some questions were designed to evoke articulation of challenges encountered as well as positive experiences. Questions were reviewed by education researchers at Tufts University and Northeastern University for alignment with the theoretical framework and goals of the study. We conducted five pilot interviews with two former and three current PARE instructors to determine clarity, flow, length, and whether questions were eliciting intended responses. Pilot interview volunteers were contacted via email and compensated $50 for their time. The same interview script was used for all interviewed instructors at Timepoint 3 (see Appendix 2 for the final interview script).

    DOI Coding Rubric Development and Coding

    Initial codes for Timepoints 2 and 3 were based on those developed for coding of Timepoint 1 (see Genné-Bacon et al., 2020). Briefly, we developed a preliminary coding rubric based on a priori themes from DOI theory as well as emergent themes observed in the interviews. The coding rubric underwent several rounds of iteration as described previously (Genné-Bacon et al., 2020). During the coding rubric development for Timepoints 2 and 3, authors M.F. and J.D.C. were recruited to help because of their lack of any prior involvement with PARE. We remained focused on these DOI Persuasion themes because, in DOI theory, after implementation users revisit the Persuasion Stage to evaluate how their expectations for implementation matched their experience and continue to make decisions on whether to continue implementing, or to modify their implementation, based on this perception. These initial codes were refined and expanded upon, based on the Timepoint 2 and 3 transcripts as needed (see below for details). To simplify the coding process, the Diffusion of Innovations themes (Relative Advantage, Compatibility, Trialability, Observability, and Complexity) of the original Timepoint 1 rubric were at first simplified to “positive experiences” (Relative Advantage, Compatibility, Trialability, and Observability) and “challenges” (Complexity). The same coding rubric was used for thematic coding of Timepoint 2 and Timepoint 3 interviews.

    To expand on the initial Timepoint 1 coding rubric, authors E.G. and M.F. worked together to iteratively develop and refine the coding rubric. In contrast to coding of Timepoint 1, coders only considered interviewees’ actual experiences with PARE, not anticipated experiences with future implementations. Adaptation of the original Timepoint 1 rubric was carried out as follows: each coder first coded one to three transcripts independently, taking notes on procedures and questions, and identifying potential newly emergent codes. Next, both coders met to compare codes, resolve conflicts, and add or refine codes as needed. This process was repeated until intercoder reliability was above 60% (number of codes agreed on divided by total number of coding units) for two transcripts in a row. For the purposes of measuring intercoder reliability, a coding unit was defined as a timestamped paragraph (as produced by the service TranscribeMe). Seven transcripts were coded for the purposes of coding rubric development: three from Timepoint 2 and four from Timepoint 3. See Appendix 3 for the final coding rubric.
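    The percent-agreement measure described above (the number of coding units on which the two coders agreed, divided by the total number of coding units) is simple to compute. The following minimal sketch illustrates one way to do so in Python; the coder labels and code assignments shown are hypothetical examples, not data from this study.

```python
# Minimal sketch of a unit-level percent-agreement calculation.
# The coding units and code assignments below are hypothetical, not study data.

def percent_agreement(coder_a, coder_b):
    """Fraction of coding units to which both coders assigned the same set of codes."""
    assert len(coder_a) == len(coder_b), "Coders must code the same coding units"
    agreed = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return agreed / len(coder_a)

# Each element is one timestamped paragraph (coding unit) with the set of codes applied to it.
coder_1 = [{"student engagement"}, {"fit with course structure"}, set(), {"student reluctance"}]
coder_2 = [{"student engagement"}, {"fit with course structure", "broader impact"}, set(), {"student reluctance"}]

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.0%}")
# Prints "Percent agreement: 75%"; rubric development continued until agreement
# exceeded 60% for two transcripts in a row.
```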

    For coding of transcripts, a team of three research assistants (authors S.A.B., F.P., and G.X.), all of whom had previous coding experience, were trained to use the final coding rubric described above. These three research assistants had no other role on this project. For coder training, explanations of each code in the rubric were provided, followed by time for discussion and elaboration where questions arose. For Timepoint 2 transcripts, S.A.B. and G.X. were then provided a transcript that had been coded to consensus by E.G. and M.F. E.G. and M.F. went through the coded transcript together with S.A.B. and G.X., with opportunity for questions to be clarified. Subsequently, S.A.B., G.X., and E.G. independently coded the remaining transcripts in batches, with discussions to resolve conflicts, thus coding to consensus. Training of F.P. for Timepoint 3 coding occurred similarly; these transcripts were coded by S.A.B., G.X., and F.P. The group plus E.G. reconvened to resolve conflicts.

    Coding of PARE Implementation Trajectory Status

    To assign a PARE implementation status to each transcript, authors E.G. and M.F. first developed the implementation coding rubric by reading through transcripts and discussing emerging themes. For Timepoint 2, three implementation categories emerged (core PARE module only, expanded use of PARE, and did not implement [those not interviewed]). For Timepoint 3 transcripts, three main implementation categories emerged (sustained use of core PARE module, expanded use, and discontinued use), as well as multiple sub-categories. See Table 1 for implementation trajectory status coding rubric. All transcripts were independently coded for implementation status by authors E.G. and M.F. Any transcripts for which there was disagreement were also coded for implementation status by author C.B. and discussed in coding meetings between authors E.G., M.F., and C.B. to code to consensus.

    TABLE 1. PARE implementation status categories

    Main categories and subcategories* are listed below with the number of instructors in each (in parentheses), followed by a description.

    Timepoint 2
     Core PARE module only (13): Instructor implemented the original two to three class period core PARE module with no or few modifications.
     Expanded use (3): Instructor implemented the core PARE module along with one or more expansion modules, or added additional personally designed research to expand the scope and duration of PARE beyond the core module.
     Did not implement (3): Instructor never implemented PARE.
    Timepoint 3
     Sustained use (1): Instructor continued implementing the original two to three class period core PARE module with no or few modifications and no dissemination to other courses.
     Expanded use (11)
      PARE add-on modules (7): Instructor expanded CURE use by implementing an add-on module from an available library of PARE add-on modules.
      New CURE (6): Instructor expanded CURE use by designing their own additional experiments or by augmenting PARE with another CURE.
      New sections/classes (2): Instructor expanded use of PARE to other courses or additional sections of the original course.
     Discontinued use (4)
      Disruption (3): Instructor encountered some type of disruption, such as a change in schools or course assignments, that interfered with their use of PARE.
      Considering (4): Instructor is not currently using PARE in any of their courses but is considering using PARE or another CURE again in the future.
      Discouraged (1): The instructor had a negative experience with PARE that contributed to discontinuing use.

    *Instructors can be assigned into more than one subcategory of expander or discontinuer.

    Postcoding Analysis

    After all coding was complete, “positive experiences” and “challenges” codes were assigned back to their original DOI themes (primarily Relative Advantage, Compatibility, and Complexity). In addition, the DOI Persuasion themes of Relative Advantage, Compatibility, and Complexity were further subdivided into subthemes: innovation-focused (e.g., “PARE is better for student learning than other teaching methods”), context/institution-focused (e.g., “my course schedule needed a short-duration CURE like PARE”), or individual-focused (e.g., “I wanted to use PARE because I felt really bored with my old teaching methods”). Assignment of codes to these subthemes was carefully determined by whole-team discussions. Some codes did not easily fit into the innovation-, context-, or individual-focused subthemes. For these, E.G. and M.F. independently examined all other instances of these codes and assigned each ambiguous instance to one of the three subthemes based on the context in which it was brought up in the interview. For coding units where there was still a disagreement, author C.B. broke the tie. Appendix 3 shows the innovation/context/individual/student status for each code.

    Statistical Analyses

    For analyses of various characteristics in the Expander group versus Non-Expanders (e.g., Figure 3 and Table 2) we performed Fisher’s exact tests. For between-groups analyses of the coding data, we performed Mann-Whitney U tests. All statistical analyses were done using the software Prism Version 9 or 10 (GraphPad Software, LLC.).
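    As a concrete illustration of the first of these tests, the pairwise Fisher’s exact tests on institution type reported in Table 2 can be reproduced with standard statistical libraries. The sketch below uses Python’s scipy rather than Prism (the software actually used for the published analyses); the counts are those given in Table 2.

```python
# Sketch reproducing the pairwise Fisher's exact tests on institution type.
# Counts are taken from Table 2; computed here with scipy rather than Prism,
# the software actually used for the published analyses.
from scipy.stats import fisher_exact

# Columns: [expanders, nonexpanders]
counts = {
    "PUI": [9, 0],    # primarily undergraduate institutions
    "Assoc": [1, 3],  # associate's dominant institutions
    "Doc": [1, 2],    # doctoral-granting institutions
}

for a, b in [("PUI", "Assoc"), ("PUI", "Doc"), ("Assoc", "Doc")]:
    _, p = fisher_exact([counts[a], counts[b]], alternative="two-sided")
    print(f"{a} vs. {b}: p = {p:.4f}")
# Matches Table 2: PUI vs. Assoc ~0.014, PUI vs. Doc ~0.0455, Assoc vs. Doc not significant.
```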

    TABLE 2. Situational factors by expander status

    Potential factor affecting adoption outcome | Expanders (Timepoint 3 status) | Non-expanders (Timepoint 3 status) | Fisher’s exact test p value | Significance
    Institution type
     Associate’s dominant | 1 | 3 | PUI vs. Assoc = 0.014 | p < 0.05
     Primarily undergraduate | 9 | 0 | PUI vs. Doc = 0.0455 | p < 0.05
     Doctoral-granting | 1 | 2 | Assoc vs. Doc = ns |
    Job title
     Assis./Assoc./Prof. | 10 | 1 | 0.0128 | p < 0.05
     Lecturer/Instructor | 1 | 4 | |
    Previous CURE experience
     Yes | 6 | 1 | 0.3077 | ns
     No | 5 | 4 | |
    Tenure status
     Tenure track | 8 | 1 | 0.1058 | p < 0.2 (trend)
     Not tenure track | 3 | 4 | |
    CURE use by others*
     Yes | 6 | 2 | 1 | ns
     No | 5 | 2 | |
    Course type
     General biology | 4 | 2 | 0.8089 | ns
     Intro micro | 3 | 1 | |
     Upper-level micro/bio | 2 | 1 | |
     Other | 1 | 2 | |
    Lab prep staff for course?*
     Yes | 8 | 3 | 1 | ns
     No | 3 | 1 | |

    *missing data for one nonexpander.

    RESULTS

    Research Question 1: What is the Trajectory of Instructors who Expressed Interest in PARE?

    The Majority of PARE-Interested Instructors Implemented PARE at Least Once.

    All instructors interviewed in the initial, pre-implementation phase of this study (Timepoint 1) expressed interest in using the PARE project in their classes within the next two semesters. We were interested to know whether instructors who had entered the Decision Phase for PARE went on to eventually implement. We assessed this by examining Timepoint 2 interview transcripts (and, where necessary, emails from instructors) and determined that the majority (16) of the original 19 instructors had implemented the core PARE module (Figure 1).

    FIGURE 1.

    FIGURE 1. Changes in PARE use over three timepoints. The number of instructors in each status category is indicated in parentheses. Groups designated as “non-expanders” for the purpose of analysis are indicated with asterisks. See Appendix 4 for a detailed view of each instructor’s status.

    We were also interested to know whether the collection of PARE add-on modules was being used. Most (13/16) of the implementing instructors had implemented only the core PARE module by the time of interview 2. Coding of transcripts for Timepoint 2 revealed three implementation status categories: “Core PARE module only” (the instructor implemented only the standard two to three class period core PARE module; 13 instructors), “expanded use”, and “did not implement” (see Table 1). Three of the 16 implementing instructors implemented an expanded version of PARE. These instructors either implemented the core PARE module along with one or more expansion modules or added additional, personally designed research to expand the scope and duration of PARE beyond the core module. Three instructors were classified as “did not implement”: one revealed this in their email and elaborated in a phone conversation, one instructor indicated via email that they did not implement PARE but declined to be interviewed, and another instructor interviewed at Timepoint 1 did not respond to requests for follow up and is presumed to have never implemented. All three non-implementers had no prior CURE experience and represented each of the three institution type categories.

    The Majority of Instructors Expanded Use of PARE by Timepoint 3.

    To determine the trajectory of PARE use over time, we coded Timepoint 3 transcripts for the three broad implementation categories that emerged: “sustained use” of the core PARE module, “expanded use,” and “discontinued use” (See Figure 1). In addition to these broad implementation categories, we also developed subcategory themes describing implementation status in more detail (see Table 1). The “expanded use” subcategory “PARE add-on modules” describes instructors who expanded the PARE research experience by incorporating other PARE modules chosen from a suite of available expansion modules. The “new CURE” subcategory describes instructors who expanded PARE by incorporating their own self-designed expansion projects or by incorporating another related network CURE with PARE. For example, one instructor began the semester with the core PARE module, but then filled the remainder of the semester working on Tiny Earth (Hurley et al., 2021), a longer, but thematically similar network CURE (both PARE and Tiny Earth begin with collection of a soil sample and use antibiotic resistance as the context for teaching microbiology skills). The third “expanded use” subcategory, “new sections/classes” describes instructors who, in addition to expanding use in ways mentioned above, also incorporated PARE into new classes or additional sections of their class. One instructor did not expand PARE through additional add-on modules or other CUREs in their own class but was highly involved in overseeing PARE’s implementation in multiple other courses in their department, including expanded versions of PARE. This instructor was thus included in the expander category. One instructor interviewed in Timepoint 2 declined to be interviewed in Timepoint 3 but emailed a detailed description of their continued PARE use since Timepoint 2. Based on this email they were classified as “Expanded use: PARE add-on modules.” Instructors could fall into more than one subcategory. For example, one instructor incorporated both PARE add-on modules (“PARE add-on modules”) and self-designed additions to the CURE (“new CURE”).

    Four Timepoint 2 implementers had discontinued all use of PARE (or other CUREs) by the time of the Timepoint 3 interview (See Figure 1). As with expanded use, we assigned discontinuers into several nonexclusive subcategories of discontinuance (see Table 1). Three discontinuing instructors encountered some type of disruption (subcategory “disruption”) to their teaching, either moving to a new institution or having their class assignments changed. One instructor encountered significant difficulty with implementing PARE (subcategory “discouraged”), which contributed to their discontinuance. However, all discontinuing instructors also expressed that they were considering incorporating PARE or another CURE into their current classes at some point in the future (subcategory “considering”).

    Trends in PARE use Trajectories

    We were interested to learn whether different trajectory patterns emerged. Over the course of the longitudinal interviews with PARE-interested instructors, we noticed common trends in how PARE implementation proceeded. See Figure 2 for a summary of these trends.

    FIGURE 2.

    FIGURE 2. Summary representation of major patterns in PARE use over three timepoints. The largest group of implementing instructors (n = 8) used PARE as it was originally designed: first implementing the core PARE module and later expanding. A smaller group (n = 3) of experienced CURE users from primarily undergraduate institutions (PUIs) implemented an expanded version of PARE in their first implementation. Another small group (n = 3) of instructors from associate’s dominant institutions encountered disruptions and were no longer implementing at Timepoint 3. Two instructors interviewed at Timepoint 3, both at doctoral-granting institutions and both managing large courses taught by teaching assistants, did not fall into any of these use patterns: one discontinued due to dissatisfaction with PARE, and the other continues to implement the core module only. One instructor labeled an expander is also not represented here because they assisted in expanding PARE in other classes but continued to use only the core PARE module in their own classes. The three instructors who never implemented are also not represented here.

    “Toe-Dip Expanders”.

    The most common type of PARE user in our cohort (eight out of the 16 implementing instructors) are those we have dubbed “toe-dip expanders.” These instructors try the 2–3 day core PARE module for their first implementation, and then expand the length of the CURE experience in subsequent semesters. The story of Instructor “R” is a representative example of an instructor in this group:

    Instructor R teaches at a primarily-undergraduate institution, and had been using mostly “cook-book” style labs in her upper-level microbiology lab course, with little change over many years. She expressed that she was beginning to feel dissatisfied and bored with these activities and wanted to try to incorporate more student research. She had attempted to use a longer-duration network CURE in the past but was unable to secure enough funding for it. She learned about PARE at a conference and decided to try it. The first time she tried PARE she used only the core PARE module and reported that this went well. During her second interview she expressed interest in using one of the PARE add-on modules in addition to the core module. During Timepoint 3 she reported that she was still using PARE in her class and had successfully integrated two additional PARE add-on modules.

    The instructors in this group represent a mix of institution types and CURE experience levels. Many expanded using the existing library of PARE add-on modules, though some designed their own expansion experiments or incorporated elements of other CUREs. This “toe-dip” pattern is how PARE use and expansion were originally envisioned during its conception: PARE provides instructors with a low-stakes opportunity to get their feet wet by trying a short and relatively simple CURE. Once they are comfortable with this level of CURE use, including inherent challenges such as dealing with unexpected results, they choose to add additional modules to give their students a longer and more complex CURE experience.

    “Early Expanders”.

    A smaller group (3/19) of instructors in our cohort can be described as early expanders. These instructors implemented an expanded form of PARE in their first attempt, and often expanded further on subsequent attempts. Instructor “M” provides a good example of this type of instructor:

    Instructor M teaches at a primarily undergraduate institution and is an experienced CURE user. She had previously used a full-semester network CURE and a self-designed CURE in other classes. She was looking for a way to incorporate research into her section of a multisemester introductory biology sequence. She had attempted to use a two-semester-long network CURE but was unable because she does not have control over the subsequent semester of the course series. At the time of interview 1, her section contained an inquiry-style lab that she felt wasn’t working well, so she decided to replace it with PARE when she learned of it at a conference. During her first semester implementing PARE she used both the core module and one add-on PARE module. At the time of the third interview she was still using PARE in her introductory biology lab section, but had added a second, related short-duration network CURE and incorporated both together, making her section a semester-long study of the local environment. Her section had become popular, so an additional section had been added. The new section was taught by another instructor who was also using the same semester-long, PARE-based environmental study.

    The instructors in this group taught exclusively at primarily undergraduate institutions. All had previous CURE experience. This group demonstrates that short-duration modular CUREs can be helpful not only for novice CURE users looking for an easier barrier-to-entry, but also for experienced CURE users looking for flexibility in incorporating more research into their courses.

    “Encountered Disruptions”.

    The final common pattern in PARE use describes the majority (3/4) of the instructors who had discontinued PARE at the time of the third interview. We have dubbed this group “encountered disruptions” because these instructors encountered a change of institution or change in teaching assignment that interfered with their use of PARE. For example, the story of instructor “A”:

    Instructor A was an adjunct instructor teaching at a community college and looking to add more inquiry-style learning to her majors’ microbiology course. She had previously attempted to design her own inquiry-style lab but encountered many technical barriers. She was seeking a replacement for that lab when she learned about PARE at a conference. She liked PARE because it fit her time and resource limitations, and she was excited that students would be contributing to a national database. In the Timepoint 2 interview she reported that she had successfully implemented the core PARE module but had no plans to expand due to resource limitations at her institution. In the Timepoint 3 interview she reported that she had moved institutions and was no longer using PARE because she did not have full control of her course at the new institution. She reported that she had considered using PARE in a different course where she had more control, but it was not well aligned with her course topic. However, she expressed that she was considering trying a different network CURE in this class in the future.

    Here are Instructor A’s own words describing her teaching situation:

    “I am an adjunct teacher, so what classes I teach vary a lot. How much freedom I have to control my own labs varies a lot. So I implemented PARE in a class where I was the only teacher doing it. And I was doing all the design for myself in the microbiology course. I implemented it once. I have not been able to teach that course because […] I changed colleges.”

    All of the instructors who discontinued due to disruptions taught at community colleges, and none had prior CURE experience. Interestingly, all instructors with this implementation pattern expressed that they remained interested in using CUREs in the future, indicating they are not discouraged with the teaching method.

    Research Question 2: What Situational Characteristics are Associated with Expansion of PARE?

    Given that a goal of PARE is to serve as a springboard for a longer-duration CURE, we sought to understand what makes some instructors expand use of PARE while others do not expand, discontinue use, or do not implement at all, by examining a number of demographic/situational variables. For these analyses we pooled the “sustained use” (of the core module) and “discontinued use” categories together into a group called “nonexpanders.” Expansion status differed significantly by institution type, with all (9/9) instructors from 4-year, undergraduate-focused institutions expanding use of PARE, but only a minority of those instructors from doctoral-granting (one of three, Fisher’s exact test, p = 0.0455) or associate’s dominant (one of four, Fisher’s exact test, p = 0.014) institutions expanding PARE (Figure 3; Table 2).

    FIGURE 3.

    FIGURE 3. Expansion status differs significantly by institution type. Here, “non-expanders” include those that discontinued use after implementing at least once (n = 4), and one instructor who continues to implement only the core PARE module. We used a Fisher’s exact test to compare expander status for all three groups (p = 0.0069). We then carried out Fisher’s exact tests in pairwise combinations (see also Table 2).

    In addition to institution type, we also examined several other demographic/situational categories, including prior CURE experience, tenure track status, CURE use by others at their institution, having lab preparation/support staff, and type of course taught (Table 2 and Appendix 4). Although not statistically significant at the 0.05 level, eight of 11 expanders were tenure track whereas only one of five nonexpanders was (Table 2), suggesting that job status may influence adoption pattern. Perhaps also related to long-term contracts or job stability, the majority of nonexpanders reported titles such as “lecturer”, “adjunct”, or “lab coordinator” while all but one of the expanders reported titles such as “assistant professor”, “associate professor”, or “professor” (Appendix 4). No other differences between expanders and nonexpanders approached statistical significance.

    Research Question 3: How do Perceptions of the PARE Experience Differ between Expanders and Non-expanders?

    In addition to understanding whether use of a short-duration CURE could lead to a longer experience, we sought to gain a richer perspective on the challenges and motivators experienced by instructors along their trajectory. To assess differences in perception of the CURE implementation experience between expanding instructors and nonexpanding instructors, we turned to our DOI-based coding data produced through thematic analysis of the Timepoint 2 and 3 interview transcripts. Here, first we will discuss some general trends in perception over time (pre- to post-implementation) before moving on to comparing differences in themes expressed between groups.

    Different Themes Emerged in Timepoint 2 and 3 Compared with Timepoint 1.

    Rogers’ DOI framework suggests that before an instructor makes the decision to implement or not, they enter a Persuasion Stage in which they contemplate various attributes of the CURE in comparison to their personal beliefs and teaching context (Rogers, 2003, pp. 111–112). We based our coding on the Persuasion Stage because, even after implementation, potential adopters reevaluate the innovation based on their experience with it, deciding whether to continue using the innovation (with or without modification) or discontinue use. Thus, users continually reenter the Persuasion Stage of the ID process. In our analysis of instructor interview transcripts, our codes generally fell into themes derived from the DOI ID process Persuasion Stage: Compatibility and Relative Advantage (which are positively correlated with adoption of innovations), and Complexity (which is negatively correlated with adoption of innovations). Codes that fell into the two remaining Persuasion Stage themes, Trialability and Observability, were rare across timepoints. In developing the coding rubric for Timepoints 2 and 3, we started with the codes that emerged in Timepoint 1 (Genné-Bacon et al., 2020) and looked for any new codes that emerged from the transcripts. Not unexpectedly, most new codes were related to the actual experience of using PARE in the classroom (see Appendix 3 for a full list of codes, with those new to Timepoints 2 and 3 marked). For example, “confidence in student data” was a theme expressed in Timepoints 2 and 3 but not observed in Timepoint 1. It refers to instructors’ struggles in dealing with the often surprising and messy results of authentic research. For example, according to one instructor:

    “That was the last time I did it, and the last time I had the opportunity to do it. And my note there was that just the lab failed. Looking at the data, we just couldn’t make sense out of the results. So that seemed to be the common thing is that we couldn’t use the results that we got in that class so it was kind of limited how much we could get out of it, but it was a good scientific process and topics to address with them, but the results were difficult.”

    Other new themes that emerged in Timepoints 2 and 3 were expansions of themes from Timepoint 1. For example, in Timepoint 1 we used two Relative Advantage codes relating to students: “student learning” and “student engagement” (see Appendix 3). In Timepoints 2 and 3 we noticed a new code related to both of these older codes: “student experience as a scientist.” This code refers to the instructor’s positive perception that students were acting “like real scientists” and authentically participating in “real” research. For example, one instructor noted:

    “I think the students really enjoy it because it doesn’t have a known answer, and I think that’s kind of fun for them to understand like, ‘This is how science works. We don’t always know what’s going to happen,’ or, ‘Our hypothesis isn’t always true.’ So it’s gone really, really well. I’ve been pleased with how it’s gone and how well the students seem to be enjoying it.”

    This is consistent with DeChenne-Peters and Scheuermann (2022) who noted that several changes in faculty perception pre- versus post-CURE implementation relate to students. Overall trends in the type of codes expressed by instructors in Timepoints 1, 2, and 3 are described in Table 3. Similar to Timepoint 1, student-related codes dominated the Relative Advantage category for both Timepoints 2 and 3.

    TABLE 3. Most common codes expressed in Timepoints 2 and 3*

    Code | No. instructors mentioning, Timepoint 1 (out of 19) | Timepoint 2 (out of 16) | Timepoint 3 (out of 15)
    Most common Compatibility codes
     Fit with course structure (Context) | 16 | 13 | 15
     Ease of use for instructor | n/a | 7 | 11
     Support from institution | n/a | 3 | 8
     Fit with past experiences | 10 | 6 | 3
    Most common Complexity codes
     CURE-specific technical problem | 3 | 11 | 11
     Confidence in student data | n/a | 7 | 4
     Student challenges | 8 | n/a | n/a
     Student readiness/preparation/ability | n/a | 4 | 12
     Student reluctance | n/a | 7 | 11
    Most common Relative Advantage codes
     Student engagement | 11 | 16 | 15
     Student learning | 6 | 13 | 11
     Student experience as a scientist | n/a | 10 | 11
     Broader impact | 15 | 10 | 10

    *Timepoint 1 code frequency provided for reference, where relevant. To see the most common codes expressed in Timepoint 1, see Genné-Bacon et al. (2020).

    Expanders and Nonexpanders Differed in the Themes Represented in their Responses.

    After identifying emergent codes for Timepoints 2 and 3, we compared the number of codes for expanders versus nonexpanders. Instructors who were nonimplementers were not included in this analysis (because most were not interviewed at these Timepoints). One expanding instructor declined to be interviewed at Timepoint 3, and so they were not represented in analysis of Timepoint 3 transcripts. Thus, this analysis included 11 “expanders” in Timepoint 2, but only 10 “expanders” in Timepoint 3. In both Timepoints, there were five nonexpanders.

    Figure 4 shows the number of different codes, within each of the three major DOI persuasion themes, expressed per instructor interview for Timepoints 2 and 3. In both timepoints, expanders and nonexpanders expressed a similar number of codes relating to the Relative Advantage of PARE. However, both Compatibility and Complexity themes showed trend (p < 0.2) differences between expanders and nonexpanders at Timepoint 2 (p = 0.131 and p = 0.157, respectively; Mann-Whitney U tests) and Timepoint 3 (p = 0.117 and 0.171, respectively; Mann-Whitney U tests). The number of different Compatibility theme codes expressed was higher for expander instructors than nonexpanders for both timepoints while the number of different Complexity theme codes was higher for nonexpanders in both timepoints (Figure 4). Although these differences are not statistically significant at the 0.05 level, because of our small sample size and the qualitative and exploratory nature of this study we chose to investigate these trend differences further.

    FIGURE 4.

    FIGURE 4. The total number of different codes mentioned within the themes of Complexity, Compatibility, or Relative Advantage per instructor at Timepoints 2 and 3. Timepoint 2: Complexity (p = 0.131), Compatibility (p = 0.157), and Relative Advantage (p = 0.698). Timepoint 3: Complexity (p = 0.171), Compatibility (p = 0.117), and Relative Advantage (p = 0.958). Mann-Whitney U tests were performed on all datasets.

    DOI theory proposes that adoption of an innovation is impacted not only by features of the innovation itself, but also characteristics of the individual, and the context in which the individual uses the innovation (Rogers, 1962, 2003). To examine this aspect of the framework more closely, we divided each of the “positive experience” codes in the main persuasion themes (Relative Advantage, Compatibility) into subthemes related to the Innovation (e.g., “PARE is better for student learning than other teaching methods”), the Individual adopter (e.g., “I wanted to use PARE because I felt really bored with my old teaching methods”), and the Context of the implementation (e.g., “my course schedule needed a short-duration CURE like PARE”) (see Methods for more detail). For the DOI persuasion theme of Complexity, we found it necessary to create an additional subtheme for codes describing challenges related to Students (See Appendix 3 for a full breakdown of how codes were assigned). Within each Persuasion Stage subtheme we compared the number of different codes expressed by expanding and by nonexpanding instructors. For a full analysis of all subtheme comparisons see Supplemental Figures 6–8 and Appendix 5.

    Expanding Instructors Expressed Fewer Complexity Themes Related to the Individual Adopter.

    As described above, nonexpanders showed a trend toward expressing fewer different Compatibility code types and more Complexity code types. Breakdown of these codes by subtheme (i.e., Innovation-, Context-, Individual-, or, for Complexity, Student-focused) revealed no significant or trend differences between expanders and non-expanders within the Compatibility subthemes (Supplemental Figure 7). However, analysis of Complexity revealed some interesting patterns across subthemes (Supplemental Figure 6 and Appendix 5). We saw little difference in the number of different codes expressed per instructor in the subthemes of Innovation-level Complexity or Student-level Complexity at either Timepoint, suggesting that the perceived Complexity difference was not driven by the innovation itself or by student-related issues. However, we did identify a significant difference (p = 0.0023) in the number of different Individual-level Complexity codes expressed between expanders and non-expanders in Timepoint 3 (Table 4, Figure 5), as well as a trend difference (p = 0.102) at Timepoint 2. These included cases where the instructor expressed the codes of personal frustration, a lack of needed experience, or a lack of personal bandwidth (see also Appendix 3). All five instructors categorized as nonexpanding (at Timepoint 3), when interviewed in Timepoint 2, expressed at least one code in this subtheme (with many expressing more than one code, for an average of 1.8 per instructor); less than half (5/11) of the expanding instructors expressed codes in this subtheme in their Timepoint 2 interviews (average of ∼0.8 codes per instructor). This discrepancy was similar in Timepoint 3, where again, every nonexpander expressed a code in this subtheme at least once (5/5, average of 2.4 codes per instructor), while only half (5/10) of the expanding instructors expressed any codes in this subtheme (average of 0.5 codes per instructor).

    TABLE 4. Analysis of Complexity codes relating to the Individual adopter

    Instructor | Institution type | Timepoint 3 implementation status | No. of codes expressed in this subtheme (Timepoint 2) | No. of codes expressed in this subtheme (Timepoint 3)
    B | PUI | Expander | 0 | 0
    E | PUI | Expander | 2 | 1
    G | PUI | Expander | 2 | 0
    I | PUI | Expander | 0 | 0
    J | PUI | Expander | 0 | 1
    M | PUI | Expander | 0 | 0
    O | PUI | Expander | 0 | 1
    Q | Doc | Expander | 3 | 1
    R | PUI | Expander | 0 | 1
    T | PUI | Expander | 1 | 0
    H | CC | Expander | 1 | (not interviewed)
    Average number of different codes expressed per instructor (expanders) | | | 0.8 | 0.5
    A | CC | Nonexpander | 1 | 1
    C | Doc | Nonexpander | 1 | 2
    F | CC | Nonexpander | 2 | 2
    L | Doc | Nonexpander | 3 | 3
    N | CC | Nonexpander | 2 | 4
    Average number of different codes expressed per instructor (nonexpanders) | | | 1.8 | 2.4
    p value for difference between groups (Mann-Whitney U test) | | | p = 0.1016 | p = 0.0023

    FIGURE 5. The total number of different codes mentioned per instructor interview (at Timepoint 3) within the subtheme of Individual Complexity. Of all nine subthemes analyzed, Individual Complexity is the only one in which significant differences were observed between expanders and nonexpanders (Supplemental Figures 6–8).
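
    For readers who want to reproduce a comparison of this kind, the sketch below is a minimal example, not the authors' original analysis script, assuming Python with SciPy is available. It applies a two-sided Mann-Whitney U test to the per-instructor Timepoint 3 counts listed in Table 4; the exact p value can differ slightly from the reported one depending on how ties and continuity corrections are handled.

```python
# Minimal sketch (assumed tooling, not the authors' analysis code): compare the number of
# different Individual-level Complexity codes expressed per instructor at Timepoint 3
# (counts transcribed from Table 4) between expanding and nonexpanding instructors.
from scipy.stats import mannwhitneyu

expanders    = [0, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # instructors B, E, G, I, J, M, O, Q, R, T (H not interviewed)
nonexpanders = [1, 2, 2, 3, 4]                  # instructors A, C, F, L, N

u_stat, p_value = mannwhitneyu(expanders, nonexpanders, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")  # Table 4 reports p = 0.0023
# Note: the computed p value depends on tie handling and continuity correction,
# so software defaults may give a slightly different number than the reported one.
```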

    At Timepoint 3, the Individual-focused Complexity subtheme code of “frustration or disappointment of instructor” showed one of the largest observed divides between expanders and nonexpanders (p = 0.022; second only to “lack of instructor knowledge/experience with research methods” at Timepoint 2, see Table 5): 3/5 nonexpanders expressed this code, while 0/10 expanders did. For example, here one nonexpanding instructor expresses their frustration about how students engaged with the project:

    TABLE 5. Individual-focused Complexity codes expressed by expanders and nonexpanders

    Individual code | No. of expanders mentioning (Timepoint 2) | No. of nonexpanders mentioning (Timepoint 2) | p value, Mann-Whitney U (Timepoint 2) | No. of expanders mentioning (Timepoint 3) | No. of nonexpanders mentioning (Timepoint 3) | p value, Mann-Whitney U (Timepoint 3)
    Frustration or disappointment of instructor | 2 | 2 | 0.5467 | 0 | 3 | 0.022
    Instructor knowledge or experience (with research) | 0 | 3 | 0.0179 | 0 | 2 | 0.0952
    Instructor knowledge or experience (with teaching) | 2 | 0 | 0.5417 | 0 | 2 | 0.0952
    Lack of instructor bandwidth | 1 | 1 | >0.9999 | 4 | 2 | >0.9999
    Uncertainty of results | 4 | 3 | 0.5962 | 1 | 3 | 0.0769

    “I gave them [students] suggestions about appropriate places to find soil samples from, I think, that would have some type of agricultural proximity or something related to agricultural runoff. And I’d say, maybe about a third of the class actually did that. The frustration was that about two-thirds of the class got soil from their backyards and their houses, and they didn’t really make an attempt to find it from the environment. They kind of took the lazy way out and just got the soil from their backyards instead.”

    Altogether, the coding data from these interviews provide a complex picture of the different factors leading to an instructor expanding, discontinuing, or not expanding use of PARE. Postimplementation, many new, emergent codes, both positive and negative, refer to students. Although the differences were not statistically significant, nonexpanders overall expressed more types of Complexity codes (negative sentiments) and fewer types of Compatibility codes (positive sentiments) than expanders, and no notable differences were observed in perceived Relative Advantage of the CURE between the two groups. A predominant negative sentiment expressed by nonexpanders relative to expanders relates to personal frustration or disappointment.

    DISCUSSION

    A Short-Duration, Modular CURE can Serve as a Low-Barrier Gateway into Longer CURE Experiences

    We previously showed that PARE, a short-duration modular CURE, was perceived by instructors, preimplementation, to have fewer barriers to entry than other CUREs. However, we did not know whether this perception of PARE as low barrier translated to implementation. Instructors interviewed before implementation of PARE reported being motivated to try PARE due to its high compatibility with their course structures and budgets, and its lower demands on their personal time and bandwidth (Genné-Bacon et al., 2020). In the current study we show, over the course of two follow-up interviews with these instructors, that this preimplementation perception does indeed translate to implementation and, for the majority of participants, eventually to expansion of the CURE using either additional PARE modules or a custom extension of the research duration (Research Question 1). To our knowledge, ours is the first study to explicitly examine whether expressed interest in a CURE translates to eventual adoption. Importantly, instructors were not incentivized to participate or persist in PARE, allowing an authentic portrait of the instructor experience. In our cohort of PARE-interested instructors, the trajectory patterns we identified supported our prediction that the design of PARE worked well as a low-barrier entrée to classroom research. Expanding users represented the majority of the cohort (11/19) and embody the ultimate goal for PARE classroom use. Shadle et al. (2017) have noted that when an evidence-based teaching practice expands on current practice, it can serve as a motivator. This might explain our “toe-dip expander” group; they start small with the core module and later expand to incorporate additional PARE modules. The “early expander” group would likely have been successful in implementing another CURE; they all had previous CURE experience and presumably felt they had the personal experience or institutional support necessary to expand PARE on their first implementation. Still, these experienced instructors found utility in the flexible design of PARE. Longitudinal studies of a nonincentivized population are also important for understanding discontinuance and the factors that may predict it. Of the four instructors who discontinued use of PARE, most encountered disruptions to CURE teaching unrelated to PARE itself. Overall, the majority of instructors not only persist in using PARE, but actually expand the CURE experience beyond the short PARE core module. This finding is proof-of-concept that short-duration modular CUREs can serve as a steppingstone to a longer CURE experience for students.

    Adoption Outcomes may Differ Based on Situational Factors such as Institution Type

    Numerous authors have written articles with practical advice to overcome CURE adoption barriers (Bakshi et al., 2016; Heemstra et al., 2017; Shortlidge et al., 2017; Hensel and Davidson, 2018; Govindan et al., 2020), and awareness of institutional differences in adoption barriers has led to increased efforts directed at community college faculty (Bangera and Brownell, 2014; Wolkow et al., 2014; Hewlett, 2016; Schinske et al., 2017; Hensel and Davidson, 2018; Hewlett, 2018; Hanauer et al., 2022; Rosas Alquicira et al., 2022). Given that differences in adoption rate can lead to inequities based on the demographic differences in institution type (Mullin, 2012; Ardalan, 2019), we were eager to see how adoption and expansion compared by institution type in this longitudinal study. Here, we found that undergraduate-focused four-year institutions were significantly more likely to adopt an expanded version of PARE than doctoral-granting or associate’s dominant institutions. All instructors from primarily undergraduate institutions (PUIs) who implemented PARE at least once also eventually expanded. This is consistent with results from a national survey showing that CURE implementation in inorganic chemistry labs was associated with institutions granting a bachelor’s-level terminal degree in chemistry and not those granting a graduate-level terminal degree (Connor et al., 2022). Further, eight of nine PUI expanders held the title of assistant, associate, or full professor, and of those, all but one (7/8) reported the opportunity for tenure (Appendix 4). Conversely, the majority of discontinuing instructors were from associate’s dominant colleges. While the majority (4/5) of instructors at associate’s dominant colleges implemented at least once, only one continued to implement by Timepoint 3. Instructors at doctoral-granting institutions showed greater diversity in adoption status: one never implemented, one continued to implement only the core PARE module with no expansion by Timepoint 3, one had expanded use of PARE by Timepoint 3, and one had discontinued by Timepoint 3.

    What could be driving differences in trajectory patterns between instructors at different institution types? It is possible that undergraduate-focused institutions are a particularly favorable setting for implementing CUREs because they tend to have smaller class sizes and a reward structure focused on teaching. However, this is often true of community colleges as well, yet instructors at associate’s dominant institutions in our cohort were more likely to discontinue. It is noteworthy that all the discontinuing instructors from associate’s dominant institutions encountered some type of disruption to their teaching, which contributed to discontinuation of the CURE. Instructor perception of PARE as low barrier (Genné-Bacon et al., 2020) may have influenced these instructors to implement at least once, but lack of job stability at community colleges relative to PUIs may present an additional, previously underdescribed barrier to continued CURE use.

    The lower rate of expansion among instructors at doctoral-granting institutions compared with PUIs is also noteworthy. Instructors at doctoral-granting institutions were the only instructors in our cohort categorized in the “sustained use” and “discontinued-discouraged” trajectories. Interestingly, both of these instructors managed large courses with multiple lab sections taught primarily by graduate teaching assistants. This is consistent with quantitative work demonstrating an inverse relationship between CURE implementation and having graduate teaching assistant support (Connor et al., 2022). Additional qualitative work has revealed that graduate teaching assistants teaching CUREs express feelings of inadequacy and lack of expertise (Heim and Holt, 2019) and have differing perceptions of the value and cost of CUREs (Goodwin et al., 2021). Mentoring teaching assistants may be critical for students to experience the full value of CUREs (Moy et al., 2019; Goodwin et al., 2022; Goodwin et al., 2023) but may also require more instructor time in CUREs than in traditional laboratory courses. Thus, the need to support teaching assistants may have influenced our instructors’ perceptions of PARE and their trajectories of use.

    Taken together, our findings suggest that future research should focus on possible institution-specific contextual features that predict success or difficulty in implementation, rather than institution type alone. For example, does the presence of teaching assistants introduce complexities that should be addressed head on when implementing a CURE? Does the instructor’s position type (contingent/adjunct vs. full-time/stable faculty) impact likelihood of continuance? Is there a correlation between previously identified factors, such as financial resources and instructors’ personal time, and institution type or persistence (Spell et al., 2014; Harris et al., 2015; Shortlidge et al., 2016; Craig, 2017; Genné-Bacon et al., 2020)? In addition, our thematic analysis of instructor perceptions adds another layer of complexity, as we discuss below.

    Adoption Outcomes may Differ Based on Perception of Rogers’ Persuasion Stage Themes

    In addition to understanding overall trajectory patterns and situational characteristics associated with different adoption patterns, we wanted to analyze how perceptions differ between expanders and nonexpanders (Research Question 3) in the context of Rogers’ DOI theory. In Rogers’ DOI theory, users reflect on their experience by revisiting their perception of Persuasion Stage characteristics (e.g., Relative Advantage, Complexity, and Compatibility). This retrospective perception informs the decision to use again, modify use, or discontinue. Analysis of interview transcripts at Timepoint 3 uncovered similarities and differences between expanders and nonexpanders (most of whom are discontinuers), although we must keep in mind the small sample size associated with this qualitative study. First, although the difference is not statistically significant, nonexpanders overall express a trend toward more types of Complexity codes and fewer types of Compatibility codes than expanders (Figure 4). This observation is consistent with expectations; in the DOI framework, Complexity is negatively correlated with adoption, while Compatibility and Relative Advantage are positively correlated with adoption. This expected finding provides support for the explanatory power of Rogers’ DOI theory as applied to CURE adoption. It should be noted that none of the most commonly expressed codes in this data set (see Table 3) showed significant differences across groups, indicating that less frequently expressed codes are driving differences between expanders and nonexpanders. Our second observation is that we saw no notable differences in perceived Relative Advantage (Figure 4), suggesting that even those who discontinue still see the value of PARE. This is consistent with our finding that all the instructors in the “encountered disruptions” trajectory pattern were actively considering using PARE again in the future. Future quantitative work should validate use of these Persuasion Stage perception themes as latent variables to measure factors negatively and positively associated with adoption.

    Individual Perceptions Differ between Expanders and Nonexpanders.

    To explore what factors might be driving trend differences in perceived Complexity and Compatibility, we grouped codes within each main theme into subthemes: those related to the teaching Context, the Individual faculty implementer, and the Innovation itself (i.e., PARE). We identified one major difference: nonexpanders express more types of Individual Complexity subtheme codes than expanders (Figure 5). The reasons for this are hard to interpret given our small sample size, but codes common among nonexpanders include those relating to a lack of experience and to personal frustrations and difficulties with using the CURE. However, we also saw a trend toward expanders expressing a larger diversity of Compatibility codes, which may hint that the instructor’s environment influences the ease of implementing the CURE. Larger studies are needed to investigate this further. Considering these findings alongside the trend in trajectory patterns (community college instructors were more likely to discontinue use) highlights the difficulty of disentangling effects due to the context from effects due to personal attributes of the instructor. To this point, Rogers (2003, p. 24) acknowledges that the context or social system plays an important role in diffusion but that it is an often-overlooked area of diffusion research.

    Findings like those described in this study illustrate that failure to adopt evidence-based teaching practices is often more complicated than a simple decision on the part of an instructor and may go beyond institution type. Notably, institution type was not a significant factor in the decision to implement PARE the first time (nonimplementers were spread across institution types), but instead only potentially came into play with persistence in adoption of PARE. We see some evidence that faculty position type may be influential in persistence of CURE use: half of the discontinuing instructors held adjunct positions, and there was a significant difference in expansion status based on title (those holding titles containing “professor” were more likely to expand than those with titles such as “lecturer” or “instructor”). In a cross-sectional study of knowledge and use of research-based instructional strategies in physics, Henderson et al. (2012) found that discontinuance was high, but institution type was not a barrier to knowledge or use. In a quantitative study, Davis et al. (2020) found no differences in faculty participation in undergraduate research mentoring based on institution type, but perception of institutional support was a predictor of faculty mentoring of undergraduate research. In addition, to our knowledge, no studies have looked at faculty perception of the specific CURE as it relates to adoption. We propose that a next step in CURE adoption research should be to disentangle the roles of situational variables and latent variables related to perception of barriers and motivators as they relate to CURE adoption outcome (discontinuance vs. adoption). For example, what has the largest influence on whether an instructor will adopt or discontinue use of a CURE? Is it institution type? Is it job stability status (e.g., adjunct vs. tenure track)? Is it personal attitudes or experience with teaching? Understanding the influences on CURE adoption is important for focusing dissemination efforts. Large quantitative studies may be needed to answer these questions.

    The Role of Students in Faculty Adoption of CUREs.

    DOI theory encompasses two main stakeholders driving dissemination: the individual adopter (e.g., faculty educator) and the “change agent” (e.g., administrators, institutional leaders, external funding agencies, innovation creators, etc.). DOI theory has been applied to uptake/adoption of teaching innovations in higher education (e.g., Bennett and Bennett, 2003; Liao, 2005; Straub, 2009; Warford, 2010; Henderson et al., 2012; Ntemana and Olatokun, 2012; Smith, 2012; Andrews and Lemons, 2015; Gonzalo et al., 2018; Menzli et al., 2022), but few, if any, studies of institutional reform using a DOI context consider the role of students in the faculty adoption process.

    We and others (Brownell and Tanner, 2012; Genné-Bacon et al., 2020; DeChenne-Peters and Scheuermann, 2022) have identified several categories of faculty perceptions related to students. In our subtheme analysis of Complexity perceptions, we created a separate “student-level Complexity” subtheme for student-related concerns because we found it difficult to disentangle whether challenges related to students were a result of the particular Innovation (in this case, the PARE project), the Context of implementation, or the Individual adopter. For example, was the work associated with PARE (the Innovation) too difficult for the students whereas another CURE may have been more appropriate? Or were the students at that particular institution (institutional Context) not prepared to use PARE or any other CURE? Or is perceived student reluctance a function of the Individual instructor’s resilience and tenacity? Future DOI framing of adoption of educational innovations may need to adapt the theory to accommodate the role of students in faculty adoption of new teaching practices.

    One might hypothesize that instructors who expressed more Student-level challenge codes would be more likely to be nonexpanders; however, this did not seem to be the case in our study (Supplemental Figure 6, Appendix 5). There was little difference in the pattern of these codes expressed between expanders and nonexpanders. If anything, expanders expressed these themes slightly more often than nonexpanders. Many of the Relative Advantage codes identified in these transcripts also related to students (e.g., student engagement, student experience as a scientist, and student learning). These, too, seemed to be evenly distributed among expanders and nonexpanders.

    Limitations of this Study

    This is a relatively small, qualitative study of instructors interested in a single CURE. While qualitative studies are a powerful way to gain detailed insight into instructor mindset, small sample size and narrow focus limit generalizability. A major finding of this study is proof-of-principle that short-duration modular CUREs can catalyze expanded use of CUREs. Future studies should examine other module-style CUREs to determine whether this finding is unique to PARE or carries over to other CUREs with similar modular approaches. We cannot disentangle the importance of the short duration and modular aspects of PARE from other factors such as low cost. With the small number of instructors in this study, we are limited in statistical power to examine how situational factors interact with CURE use trajectory. Larger scale, quantitative studies are needed to more fully examine these trends. Rogers’ DOI theory includes “Trialability” and “Observability” in addition to Complexity, Compatibility, and Relative Advantage as traits adopters consider, but these themes were rarely expressed in our interviews. It will also be interesting to study whether discontinuance is higher with modular CUREs; one might postulate that discontinuance is higher with PARE or other modular CUREs relative to full-semester CUREs because less risk and up-front time are involved, making them easier to abandon. Finally, while longitudinal tracking allows us to examine the evolving mindset of instructors as they move through the stages of the ID process, we cannot rule out the possibility that the act of interviewing instructors influences their perceptions or use of PARE. Larger-scale cross-sectional studies would complement longitudinal ones.

    Recommendations/Interventions

    Though this study is small and focused on a single CURE, it begins to provide insight into strategies that could be helpful in expanding student access to CUREs. These possible strategies span a number of different levels of intervention.

    For CURE Designers.

    As stated previously, this study serves as proof-of-principle that short-duration, modular design can increase adoption of longer-length CUREs by lowering barriers to entry. Most instructors expressing interest did in fact implement, and after a first implementation, PARE instructors tended to expand the duration of the CURE for their students. For instructors implementing in challenging contexts, short-duration CUREs may be a good alternative to full-semester CUREs. Many instructors took advantage of the existing library of PARE expansion modules, though others expanded in different ways (e.g., developing their own extension or integrating another network CURE). Short-duration and modular CUREs are growing in popularity (Muth and McEntee, 2014; Hanauer et al., 2018; Hyman et al., 2019; Roberts et al., 2019; Bell et al., 2019; Dahlberg et al., 2020; Dizney et al., 2020; Zelaya et al., 2020; Gastreich, 2021); providing a library of possible expansion modules may help instructors create longer CURE experiences for students.

    This research also suggests that there may be factors that predict discontinuance. A better understanding of these factors may allow CURE disseminators to target instructors working within these contexts for further support early in implementation and to better design CUREs to accommodate these factors. In our interviews, nonexpanders (who are primarily discontinuers) expressed Complexity codes more often at Timepoint 2. These interviews took place before any of those interviewed discontinued use of PARE. In particular, Individual Complexity subtheme codes such as lack of instructor knowledge/experience with the teaching method, lack of confidence in student data, and frustration or disappointment of the instructor tended to be expressed more often by instructors who would later go on to discontinue use of PARE. Interventions such as additional training or coaching could be designed to address these issues. Lopatto et al. (2014) found that a central support system is perceived by instructors as helpful in sustaining use of a CURE.

    For Institutions and Policy Makers.

    We see evidence that instructor position type and job stability may influence adoption and expansion of CUREs. For example, the majority of expanders held titles such as “assistant professor,” “associate professor,” or “professor,” and we saw a trend difference based on the potential for tenure. It is especially noteworthy that so many of the discontinuing instructors in our cohort discontinued due to disruptions in their teaching. While this phenomenon has not been well studied in this context, there may be other evidence of position volatility leading to disruptions in CURE use. DeChenne-Peters and Scheuermann (2022) did not study discontinuance; however, they did show that many of the negative experiences with their CURE expressed by a community college instructor stemmed from an abrupt change in teaching assignment (the original instructor changed positions and a new instructor took over halfway through the CURE course). Disruptions in teaching may be related to employment status. Half of the discontinuing instructors in our study were adjunct faculty who changed institutions during our study period. Adjunct and other contingent faculty have less job stability than tenure-track instructors or full-time contracted teaching staff. This may decrease motivation to invest the time needed for CURE implementation and increase the likelihood of discontinuance. Institutions that wish to expand access to research experiences for their students should consider investing in instructional staff by providing a path for long-term job stability. As far as we are aware, this issue has not previously been discussed as a challenge to CURE implementation and is worthy of further investigation.

    Closing Remarks

    This qualitative study serves as a proof-of-principle that short-duration CUREs can serve as a steppingstone to longer-duration CURE use. We also provide a glimpse into the evolving mindset and perceptions of instructors as they implement a new CURE for the first time. We have identified several possible factors that may contribute to sustained adoption or discontinuance. However, while qualitative studies such as this can provide rich insight into instructor mindset, on their own they do not have the power to identify significant trends. Large-scale quantitative studies are needed to understand whether any of the factors identified here, such as institution type or job stability, are significantly predictive of sustained CURE use.

    ACCESSING MATERIALS

    No additional materials available online

    ACKNOWLEDGMENTS

    We would like to thank all participating PARE instructors, especially those who piloted interview transcripts for us: Michael A. Buckholt, Brittany J. Gasper, Sharon Gusky, Sarah Olken, and Alisa Petree. Thank you to Jessica Wilks for her contributions to the original Timepoint 1 coding rubric, and thank you to the entire Tufts University Center for Science Education team for ongoing support and input. The National Science Foundation (DUE 1912520) provided the funding for this project. All human subjects protocols have been approved by the Tufts Institutional Review Board (protocol #1906010) and comply with federal guidelines.

    REFERENCES

  • American Association for the Advancement of Science (AAAS) (2011). Vision and Change in Undergraduate Biology Education: A Call to Action, final report. Washington, DC. Retrieved July 3, 2018, from https://www.visionandchage.org/VC_report.pdf
  • Adedokun, O. A., Parker, L. C., Childress, A., Burgess, W., Adams, R., Agnew, C. R., … & Teegarden, D. (2014). Effect of time on perceived gains from an undergraduate research program. CBE—Life Sciences Education, 13(1), 139–148. https://doi.org/10.1187/cbe.13-03-0045
  • Adkins, S. J., Rock, R. K., & Morris, J. J. (2018). Interdisciplinary STEM education reform: Dishing out art in a microbiology laboratory. FEMS Microbiology Letters, 365(1), fnx245.
  • Andrews, T. C., & Lemons, P. P. (2015). It’s personal: Biology instructors prioritize personal evidence over empirical evidence in teaching decisions. CBE—Life Sciences Education, 14(1), ar7. https://doi.org/10.1187/cbe.14-05-0084
  • Ardalan, S. S. (2019). Challenges Facing Contemporary Community Colleges. In Gaulee, U. (Ed.), Global Adaptations of Community College Infrastructure (pp. 186–200). Hershey, PA: IGI Global.
  • Auchincloss, L. C., Laursen, S. L., Branchaw, J. L., Eagan, K., Graham, M., Hanauer, D. I., … & Dolan, E. L. (2014). Assessment of course-based undergraduate research experiences: A meeting report. CBE—Life Sciences Education, 13(1), 29–40. https://doi.org/10.1187/cbe.14-01-0004
  • Bakshi, A., Patrick, L. E., & Wischusen, E. W. (2016). A framework for implementing course-based undergraduate research experiences (CUREs) in freshman biology labs. The American Biology Teacher, 78(6), 448–455.
  • Bangera, G., & Brownell, S. E. (2014). Course-based undergraduate research experiences can make scientific research more inclusive. CBE—Life Sciences Education, 13(4), 602–606. https://doi.org/10.1187/cbe.14-06-0099
  • Bauer, K. W., & Bennett, J. S. (2003). Alumni perceptions used to assess undergraduate research experience. The Journal of Higher Education, 74(2), 210–230.
  • Beck, C., Butler, A., & Burke da Silva, K. (2014). Promoting inquiry-based teaching in laboratory courses: Are we meeting the grade? CBE—Life Sciences Education, 13(3), 444–452.
  • Bell, J. K., Provost, J., & Bell, E. (2019). Creating and using the malate dehydrogenase CURE community to explore critical aspects of sustainable protein centric CUREs. The FASEB Journal, 33(S1), 617.2. https://doi.org/10.1096/fasebj.2019.33.1_supplement.617.2
  • Bennett, J., & Bennett, L. (2003). A review of factors that influence the diffusion of innovation when structuring a faculty training program. The Internet and Higher Education, 6(1), 53–63.
  • Berendonk, T., Manaia, C., Merlin, C., Fatta-Kassinos, D., Cytryn, E., Walsh, F., … & Martinez, J. L. (2015). Tackling antibiotic resistance: The environmental framework. Nature Reviews Microbiology, 13, 310–317. https://doi.org/10.1038/nrmicro3439
  • Bliss, S. S., Abraha, E. A., Fuhrmeister, E. R., Pickering, A. J., & Bascom-Slack, C. A. (2023). Learning and STEM identity gains from an online module on sequencing-based surveillance of antimicrobial resistance in the environment: An analysis of the PARE-Seq curriculum. PLoS One, 18(3), e0282412.
  • Brownell, S. E., & Kloser, M. J. (2015). Toward a conceptual framework for measuring the effectiveness of course-based undergraduate research experiences in undergraduate biology. Studies in Higher Education, 40(3), 525–544. https://doi.org/10.1080/03075079.2015.1004234
  • Brownell, S. E., Kloser, M. J., Fukami, T., & Shavelson, R. J. (2013). Context matters: Volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses. Journal of Microbiology and Biology Education, 14(2), 176–182. https://doi.org/10.1128/jmbe.v14i2.609
  • Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and…tensions with professional identity? CBE—Life Sciences Education, 11(4), 339–346. https://doi.org/10.1187/cbe.12-09-0163
  • Buchanan, A. J., & Fisher, G. R. (2022). Current status and implementation of science practices in course-based undergraduate research experiences (CUREs): A systematic literature review. CBE—Life Sciences Education, 21(4), ar83. https://doi.org/10.1187/cbe.22-04-0069
  • Carpi, A., Ronan, D. M., Falconer, H. M., & Lents, N. H. (2017). Cultivating minority scientists: Undergraduate research increases self-efficacy and career ambitions for underrepresented students in STEM. Journal of Research in Science Teaching, 54(2), 169–194.
  • Cole, M., Hickman, M. A., Morran, L., & Beck, C. W. (2021). Assessment of course-based research modules based on faculty research in introductory biology. Journal of Microbiology & Biology Education, 22(2). https://doi.org/10.1128/jmbe.00148-21
  • Connor, M. C., Pratt, J. M., & Raker, J. R. (2022). Goals for the undergraduate instructional inorganic chemistry laboratory when course-based undergraduate research experiences are implemented: A national survey. Journal of Chemical Education, 99(12), 4068–4078.
  • Cooper, K. M., & Brownell, S. E. (2018). Developing course-based research experiences in discipline-based education research: Lessons learned and recommendations. Journal of Microbiology & Biology Education, 19(2), jmbe-19-88. https://doi.org/10.1128/jmbe.v19i2.1567
  • Craig, P. (2017). A survey on faculty perspectives on the transition to a biochemistry course-based undergraduate research experience laboratory. Biochemistry and Molecular Biology Education, 45(5), 426–436. https://doi.org/10.1002/bmb.21060
  • Dahlberg, C. L., Wiggins, B. L., Lee, S. R., Leaf, D. S., Lily, L. S., Jordt, H., & Johnson, T. J. (2020). A short, course-based research module provides metacognitive benefits in the form of more sophisticated problem solving. Journal of College Science Teaching, 48(4), 22–30.
  • Davis, S. N., Jones, R. M., Mahatmya, D., & Garner, P. W. (2020). Encouraging or obstructing? Assessing factors that impact faculty engagement in undergraduate research mentoring. Frontiers in Education, 5. https://doi.org/10.3389/feduc.2020.00114
  • DeChenne-Peters, S., & Scheuermann, N. (2022). Faculty experiences during the implementation of an introductory biology course-based undergraduate research experience (CURE). CBE—Life Sciences Education, 21(4), ar70.
  • DeChenne-Peters, S. E., Rakus, J. F., Parente, A. D., Mans, T. L., Eddy, R., Galport, N., … & Bell, J. K. (2023). Length of course-based undergraduate research experiences (CURE) impacts student learning and attitudinal outcomes: A study of the Malate dehydrogenase CUREs Community (MCC). PLoS One, 18(3), e0282170.
  • Dizney, L., Connors, P. K., Varner, J., Duggan, J. M., Lanier, H. C., Erb, L. P., … & Hanson, J. D. (2020). An introduction to the Squirrel-Net teaching modules. AES Faculty Publications and Presentations, 8. https://digitalcommons.csumb.edu/aes_fac/8
  • Dolan, E. L. (2016). Course-based undergraduate research experiences: Current knowledge and future directions. Retrieved November 11, 2018, from https://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_177288.pdf
  • Eagan, M. K., Jr., Hurtado, S., Chang, M. J., Garcia, G. A., Herrera, F. A., & Garibay, J. C. (2013). Making a Difference in Science Education: The Impact of Undergraduate Research Programs. American Educational Research Journal, 50(4), 683–713. https://doi.org/10.3102/0002831213482038
  • Fechheimer, M., Webber, K., & Kleiber, P. B. (2011). How well do undergraduate research programs promote engagement and success of students? CBE—Life Sciences Education, 10(2), 156–163. https://doi.org/10.1187/cbe.10-10-0130
  • Fendos, J., Cai, L., Yang, X., Ren, G., Li, L., Yan, Z., … & Yang, J. (2022). A course-based undergraduate research experience improves outcomes in mentored research. CBE—Life Sciences Education, 21(3), ar49. https://doi.org/10.1187/cbe.21-03-0065
  • Frantz, K. J., Demetrikopoulos, M. K., Britner, S. L., Carruth, L. L., Williams, B. A., Pecore, J. L., … & Goode, C. T. (2017). A comparison of internal dispositions and career trajectories after collaborative versus apprenticed research experiences for undergraduates. CBE—Life Sciences Education, 16(1), ar1.
  • Fuhrmeister, E. R., Larson, J. R., Kleinschmit, A. J., Kirby, J. E., Pickering, A. J., & Bascom-Slack, C. A. (2021). Combating antimicrobial resistance through student-driven research and environmental surveillance. Frontiers in Microbiology, 12, 577821.
  • Gastreich, K. R. (2021). Assessing urban biodiversity with the eBird citizen science project: A course-based undergraduate research experience (CURE) module. http://dx.doi.org/10.24918/cs.2020.18
  • Genné-Bacon, E., Wilks, J., & Bascom-Slack, C. (2020). Uncovering factors influencing instructors’ decision process when considering implementation of a course-based research experience. CBE—Life Sciences Education, 19(2). https://doi.org/10.1187/cbe.19-10-0208
  • Genné-Bacon, E. A., & Bascom-Slack, C. A. (2018). The PARE project: A short course-based research project for national surveillance of antibiotic-resistant microbes in environmental samples. Journal of Microbiology and Biology Education, 19(3). https://doi.org/10.1128/jmbe.v19i3.1603
  • Gonzalo, J. D., Graaf, D., Ahluwalia, A., Wolpaw, D. R., & Thompson, B. M. (2018). A practical guide for implementing and maintaining value-added clinical systems learning roles for medical students using a diffusion of innovations framework. Advances in Health Sciences Education, 23, 699–720.
  • Goodwin, E. C., Cary, J. R., & Shortlidge, E. E. (2021). Enthusiastic but inconsistent: Graduate teaching assistants’ perceptions of their role in the CURE classroom. CBE—Life Sciences Education, 20(4), ar66.
  • Goodwin, E. C., Cary, J. R., & Shortlidge, E. E. (2022). Not the same CURE: Student experiences in course-based undergraduate research experiences vary by graduate teaching assistant. PLoS One, 17(9), e0275313.
  • Goodwin, E. C., Cary, J. R., Phan, V. D., Therrien, H., & Shortlidge, E. E. (2023). Graduate teaching assistants impact student motivation and engagement in course-based undergraduate research experiences. Journal of Research in Science Teaching, 60(9), 1967–1997. https://doi.org/10.1002/tea.21848
  • Govindan, B., Pickett, S., & Riggs, B. (2020). Fear of the CURE: A beginner’s guide to overcoming barriers in creating a course-based undergraduate research experience. Journal of Microbiology & Biology Education, 21(2), 50.
  • Gregerman, S. R. (1999). Improving the academic success of diverse students through undergraduate research. Council on Undergraduate Research Quarterly, 20(2), 54–59.
  • Hanauer, D. I., Nicholes, J., Liao, F.-Y., Beasley, A., & Henter, H. (2018). Short-term research experience (SRE) in the traditional lab: Qualitative and quantitative data on outcomes. CBE—Life Sciences Education, 17(4), ar64.
  • Hanauer, D. I., Graham, M. J., Jacobs-Sera, D., Garlena, R. A., Russell, D. A., Sivanathan, V., … & Hatfull, G. F. (2022). Broadening access to STEM through the community college: Investigating the role of course-based research experiences (CREs). CBE—Life Sciences Education, 21(2), ar38.
  • Harris, A., Babkoor, M., Gu, T., & Kremer, G. E. (2015). Course-based undergraduate research: A review of models and practices. Proceedings of the ASME 2015 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference held August 2–5, Boston, MA.
  • Hattie, J., & Marsh, H. W. (1996). The relationship between research and teaching: A meta-analysis. Review of Educational Research, 66(4), 507–542.
  • Healey, M., & Jenkins, A. (2009). Developing undergraduate research and inquiry. New York, NY: Higher Education Academy. https://www.heacademy.ac.uk/knowledge-hub/developing-undergraduate-research-and-inquiry
  • Heemstra, J. M., Waterman, R., Antos, J. M., Beuning, P. J., Bur, S. K., Columbus, L., … & Leconte, A. M. (2017). Throwing away the cookbook: Implementing course-based undergraduate research experiences (CUREs) in chemistry. In: Educational and Outreach Projects from the Cottrell Scholars Collaborative Undergraduate and Graduate Education Volume 1 (pp. 33–63). ACS Publications.
  • Heim, A. B., & Holt, E. A. (2019). Benefits and challenges of instructing introductory biology course-based undergraduate research experiences (CUREs) as perceived by graduate teaching assistants. CBE—Life Sciences Education, 18(3), ar43.
  • Henderson, C., Dancy, M., & Niewiadomska-Bugaj, M. (2012). Use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Physical Review Special Topics - Physics Education Research, 8, 020104. https://doi.org/10.1103/PhysRevSTPER.8.020104
  • Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3, 1–14.
  • Hensel, N. H., & Cejda, B. D. (2015). Embedding undergraduate research in the community college curriculum. Peer Review, 17(4), 27–31.
  • Hensel, N. H., & Davidson, C. N. (2018). Course-based undergraduate research: Educational equity and high-impact practice. In Hensel, N. H. (Ed.), Council for Undergraduate Research and Stylus Publishing (pp. 1–264). New York, NY: Stylus Publishing.
  • Hewlett, J. A. (2016). Undergraduate research at the community college: Barriers and opportunities. ACS Symposium Series, 1231, 137–151.
  • Hewlett, J. A. (2018). Broadening Participation in Undergraduate Research Experiences (UREs): The Expanding Role of the Community College. CBE—Life Sciences Education, 17(3), es9. https://doi.org/10.1187/cbe.17-11-0238
  • Howard, D. R., & Miskowski, J. A. (2005). Using a Module-based Laboratory to Incorporate Inquiry into a Large Cell Biology Course. Cell Biology Education, 4, 249–260.
  • Hunter, A., Laursen, S. L., & Seymour, E. (2007). Becoming a Scientist: The Role of Undergraduate Research in Students’ Cognitive, Personal, and Professional Development. Science Education, 91(1), 36–74. https://doi.org/10.1002/sce.20173
  • Hurley, A., Chevrette, M. G., Acharya, D. D., Lozano, G. L., Garavito, M., Heinritz, J., … & Corinaldi, K. (2021). Tiny earth: A big idea for STEM education and antibiotic discovery. MBio, 12(1), e03432-20.
  • Hyman, O. J., Doyle, E. A., Harsh, J., Mott, J., Pesce, A., Rasoul, B., … & Enke, R. A. (2019). CURE-all: Large Scale Implementation of Authentic DNA Barcoding Research into First-Year Biology Curriculum. CourseSource. https://doi.org/10.24918/cs.2019.10
  • Jordan, T. C., Burnett, S. H., Carson, S., Caruso, S. M., Clase, K., DeJong, R. J., … & Hatfull, G. F. (2014). A broadly implementable research course in phage discovery and genomics for first-year undergraduate students. MBio, 5(1), e01051-13. https://doi.org/10.1128/mBio.01051-13
  • Kenny, S. S., Alberts, B., Booth, W. C., Glaser, M., & Glassick, C. E. (1998). Boyer Commission on educating undergraduates in the research university. Reinventing undergraduate education: A blueprint for America’s research universities. Stony Brook, NY: State University of New York at Stony Brook for the Carnegie Foundation for the Advancement of Teaching.
  • Kleinschmit, A. J., Genné-Bacon, E., Drace, K., Govindan, B., Larson, J. R., Qureshi, A. A., & Bascom-Slack, C. (2024). A framework for leveraging network course-based undergraduate research experience (CURE) faculty to develop, validate, and administer an assessment instrument. Journal of Microbiology & Biology Education, e00149-23. https://doi.org/10.1128/jmbe.00149-23
  • Kuh, G. (2008). High-impact educational practices: What they are, who has access to them, and why they matter (pp. 1–44). Washington, DC: Association of American Colleges and Universities.
  • Laursen, S. L., Hunter, A., Seymour, E., Thiry, H., & Melton, G. (2010). Undergraduate Research in the Sciences: Engaging Students in Real Science. Jossey-Bass.
  • Li, D., Yang, M., Hu, J., Zhang, Y., Chang, H., & Jin, F. (2008). Determination of penicillin G and its degradation products in a penicillin production wastewater treatment plant and the receiving river. Water Research, 42(1), 307–317.
  • Liao, H.-A. (2005). Communication technology, student learning, and diffusion of innovation. College Quarterly, 8(2), n2.
  • Lopatto, D. (2004). Survey of Undergraduate Research Experiences (SURE): First findings. Cell Biology Education, 3(4), 270–277. https://doi.org/10.1187/cbe.04-07-0045
  • Lopatto, D., Hauser, C., Jones, C. J., Paetkau, D., Chandrasekaran, V., Dunbar, D., … & Elgin, S. C. (2014). A central support system can facilitate implementation and sustainability of a Classroom-based Undergraduate Research Experience (CURE) in Genomics. CBE—Life Sciences Education, 13(4), 711–723. https://doi.org/10.1187/cbe.13-10-0200
  • Lopatto, D., & Tobias, S. (2009). Science in Solution: The Impact of Undergraduate Research on Student Learning (pp. 1–85). Tucson, AZ: Research Corporation for Science Advancement.
  • Mader, C. M., Beck, C. W., Grillo, W. H., Hollowell, G. P., Hennington, B. S., Staub, N. L., … & White, S. L. (2017). Multi-Institutional, Multidisciplinary Study of the Impact of Course-Based Research Experiences. Journal of Microbiology & Biology Education, 18(2). https://doi.org/10.1128/jmbe.v18i2.1317
  • Marbach-Ad, G., & Hunt Rietschel, C. (2016). A case study documenting the process by which biology instructors transition from teacher-centered to learner-centered teaching. CBE—Life Sciences Education, 15(4), ar62.
  • McConnell, M., Montplaisir, L., & Offerdahl, E. (2020). Meeting the conditions for diffusion of teaching innovations in a university STEM department. Journal for STEM Education Research, 3(1), 43–68.
  • Menzli, L. J., Smirani, L. K., Boulahia, J. A., & Hadjouni, M. (2022). Investigation of open educational resources adoption in higher education using Rogers’ diffusion of innovation theory. Heliyon, 8(7), e09885.
  • Moy, M. K., Hammrich, P. L., & Kabnick, K. (2019). Developing a Tiered Mentoring Model for Teaching Assistants Instructing Course-Based Research Experiences. Journal of College Science Teaching, 48(5), 59–67. https://doi.org/10.1080/0047231X.2019.12290478
  • Mullin, C. M. (2012). Why Access Matters: The Community College Student Body. AACC Policy Brief 2012-01PBL. Washington, DC: American Association of Community Colleges.
  • Muth, T. R., & McEntee, C. M. (2014). Undergraduate urban metagenomics research module. Journal of Microbiology & Biology Education, 15(1), 38–40.
  • Nagda, B. A., Gregerman, S. R., Jonides, J., von Hippel, W., & Lerner, J. S. (1998). Undergraduate Student-Faculty Research Partnerships Affect Student Retention. Review of Higher Education, 22(1), 55–72.
  • National Research Council (US) Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century. (2003). Bio2010: Transforming Undergraduate Education for Future Research Biologists. Washington, DC: National Academies Press.
  • Nilsen, P. (2015). Making sense of implementation theories, models and frameworks. Implementation Science, 10. https://doi.org/10.1186/s13012-015-0242-0
  • Ntemana, T. J., & Olatokun, W. (2012). Analyzing the influence of diffusion of innovation attributes on lecturers’ attitude towards information and communication technologies. Human Technology, 8(2), 179–197. https://doi.org/10.17011/ht/urn.201211203034
  • Olimpo, J. T., Fisher, G. R., & DeChenne-Peters, S. E. (2016). Development and evaluation of the Tigriopus course-based undergraduate research experience: Impacts on students’ content knowledge, attitudes, and motivation in a majors introductory biology course. CBE—Life Sciences Education, 15(4), ar72.
  • Olson, S., & Riordan, D. G. (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Report to the President. Executive Office of the President.
  • Perez, J. A. (2003). Undergraduate research at two-year colleges. New Directions for Teaching and Learning, 2003(93), 69–78. https://doi.org/10.1002/tl.89
  • Plaisier, S., Alarid, D. O., Brownell, S. E., Buetow, K., Cooper, K. M., & Wilson, M. A. (2023). Design and implementation of an asynchronous online course-based undergraduate research experience (CURE) in computational genomics. bioRxiv. https://doi.org/10.1101/2023.11.29.569298
  • Roberts, R., Hall, B., Daubner, C., Goodman, A., Pikaart, M., Sikora, A., & Craig, P. (2019). Flexible Implementation of the BASIL CURE. Biochemistry and Molecular Biology Education, 47(5). https://doi.org/10.1002/bmb.21287
  • Rodenbusch, S. E., Hernandez, P. R., Simmons, S. L., & Dolan, E. L. (2016). Early Engagement in Course-Based Research Increases Graduation Rates and Completion of Science, Engineering, and Mathematics Degrees. CBE—Life Sciences Education, 15(2), 1–10. https://doi.org/10.1187/cbe.16-03-0117
  • Rogers, E. M. (1962). Diffusion of Innovations. Free Press of Glencoe.
  • Rogers, E. M. (2003). Diffusion of Innovations (5th ed.). Simon and Schuster.
  • Rosas Alquicira, E. F., Guertin, L., Tvelia, S., Berquist, P. J., & Cole, M. W. (2022). Undergraduate research at community colleges: A pathway to achieve student, faculty, and institutional success. New Directions for Community Colleges, 2022(199), 63–75. https://doi.org/10.1002/cc.20524
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … & Corwin, L. A. (2017). Broadening Participation in Biology Education Research: Engaging Community College Students and Faculty. CBE—Life Sciences Education, 16(2). https://doi.org/10.1187/cbe.16-10-0289
  • Schmidt, N. A., & Brown, J. M. (2007). Use of the Innovation–Decision Process Teaching Strategy to Promote Evidence-Based Practice. Journal of Professional Nursing, 23(3), 150–156. https://doi.org/10.1016/j.profnurs.2007.01.009
  • Seymour, E., Hunter, A., Laursen, S. L., & DeAntoni, T. (2004). Establishing the benefits of research experiences for undergraduates in the sciences: First findings from a three-year study. Science Education, 88(4), 493–534.
  • Shadle, S. E., Marker, A., & Earl, B. (2017). Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. International Journal of STEM Education, 4(1), 8. https://doi.org/10.1186/s40594-017-0062-7
  • Shaffer, C. D., Alvarez, C., Bailey, C., Barnard, D., Bhalla, S., Chandrasekaran, C., … & Elgin, S. C. (2010). The genomics education partnership: Successful integration of research into laboratory classes at a diverse group of undergraduate institutions. CBE—Life Sciences Education, 9(1), 55–69. https://doi.org/10.1187/09-11-0087
  • Shaffer, C. D., Alvarez, C. J., Bednarski, A. E., Dunbar, D., Goodman, A. L., Reinke, C., … & Elgin, S. C. (2014). A course-based research experience: How benefits change with increased investment in instructional time. CBE—Life Sciences Education, 13(1), 111–130. https://doi.org/10.1187/cbe-13-08-0152
  • Shortlidge, E. E., Bangera, G., & Brownell, S. E. (2016). Faculty Perspectives on Developing and Teaching Course-Based Undergraduate Research Experiences. BioScience, 66, 54–62. https://doi.org/10.1093/biosci/biv167
  • Shortlidge, E. E., Bangera, G., & Brownell, S. E. (2017). Each to Their Own CURE: Faculty Who Teach Course-Based Undergraduate Research Experiences Report Why You Too Should Teach a CURE. Journal of Microbiology and Biology Education, 18(2). https://doi.org/10.1128/jmbe.v18i2.1260
  • Shuster, M. I., Curtiss, J., Wright, T. F., Champion, C., Sharifi, M., & Bosland, J. (2019). Implementing and evaluating a course-based undergraduate research experience (CURE) at a Hispanic-serving institution. Interdisciplinary Journal of Problem-Based Learning, 13(2). https://doi.org/10.7771/1541-5015.1806
  • Smith, K. (2012). Lessons learnt from literature on the diffusion of innovative learning and teaching practices in higher education. Innovations in Education and Teaching International, 49(2), 173–182.
  • Spell, R. M., Guinan, J. A., Miller, K. R., & Beck, C. W. (2014). Redefining authentic research experiences in introductory biology laboratories and barriers to their implementation. CBE—Life Sciences Education, 13(1), 102–110. https://doi.org/10.1187/cbe.13-08-0169
  • Stanfield, E., Slown, C. D., Sedlacek, Q., & Worcester, S. E. (2022). A Course-Based Undergraduate Research Experience (CURE) in Biology: Developing Systems Thinking through Field Experiences in Restoration Ecology. CBE—Life Sciences Education, 21(2). https://doi.org/10.1187/cbe.20-12-0300
  • Staub, N. L., Blumer, L. S., Beck, C. W., Delesalle, V. A., Griffin, G. D., Merritt, R. B., … & White, S. L. (2016). Course-based science research promotes learning in diverse students at diverse institutions. CUR Quarterly, 38(2), 36–46.
  • Straub, E. T. (2009). Understanding technology adoption: Theory and future directions for informal learning. Review of Educational Research, 79(2), 625–649.
  • Sundberg, M. D., & Armstrong, J. E. (1993). The status of laboratory instruction for introductory biology in US universities. The American Biology Teacher, 55(3), 144–146.
  • Sundberg, M. D., Armstrong, J. E., & Wischusen, E. W. (2005). A reappraisal of the status of introductory biology laboratory education in US colleges & universities. The American Biology Teacher, 67(9), 525–529.
  • Taraban, R., & Blanton, R. L. (Eds.). (2008). Creating effective undergraduate research programs in science: The transformation from student to scientist (pp. 583–585). New York, NY: Teachers College Press.
  • Thiry, H., Weston, T. J., Laursen, S. L., & Hunter, A. B. (2012). The benefits of multi-year research experiences: Differences in novice and experienced students’ reported gains from undergraduate research. CBE—Life Sciences Education, 11(3), 260–272. https://doi.org/10.1187/cbe.11-11-0098
  • Warford, M. K. (2010). Testing a Diffusion of Innovations in Education Model (DIEM). The Innovation Journal: The Public Sector Innovation Journal, 10(3), 2–28.
  • Watson, C. E. (2007). Self-efficacy, the innovation-decision process, and faculty in higher education: Implications for faculty development [Doctoral dissertation, Virginia Tech].
  • Wei, C. A., & Woodin, T. (2011). Undergraduate research experiences in biology: Alternatives to the apprenticeship model. CBE—Life Sciences Education, 10(2), 123–131. https://doi.org/10.1187/cbe.11-03-0028
  • Wickham, R. J., Genné-Bacon, E. A., & Jacob, M. H. (2021). The Spine Lab: A Short-Duration, Fully-Remote Course-Based Undergraduate Research Experience. Journal of Undergraduate Neuroscience Education, 20(1), A28–A39.
  • Wickham, R. J., Adams, W., & Hawkier, M. J. (2023). The COVID-19 and Taste Lab: A Mini Course-Based Undergraduate Research Experience on Taste Differences and COVID-19 Susceptibility. Journal of Undergraduate Neuroscience Education, 21(2), A97–A107. https://doi.org/10.59390/FDMA5232
  • Wolkow, T. D., Durrenberger, L. T., Maynard, M. A., Harrall, K. K., & Hines, L. M. (2014). A comprehensive faculty, staff, and student training program enhances student perceptions of a course-based research experience at a two-year institution. CBE—Life Sciences Education, 13(4), 724–737. https://doi.org/10.1187/cbe.14-03-0056
  • Zelaya, A. J., Gerardo, N. M., Blumer, L. S., & Beck, C. W. (2020). The Bean Beetle Microbiome Project: A Course-Based Undergraduate Research Experience in Microbiology. Frontiers in Microbiology, 11. https://doi.org/10.3389/fmicb.2020.577621