
Factors Influencing the Use of Evidence-based Instructional Practices by Community College Biology Instructors

    Published Online: https://doi.org/10.1187/cbe.24-02-0095

    Abstract

    Evidence-based instructional practices (EBIPs) have been shown to benefit students in undergraduate biology, but little is known about the degree to which community college (CC) biology instructors use EBIPs or the barriers they encounter. We surveyed CC biology instructors to characterize how they use EBIPs, their capacity to use EBIPs, and perceived barriers to their use, and to explore which factors are associated with EBIP use. CC biology instructors report using EBIPs to a similar degree as other populations of undergraduate biology faculty; they generally believe EBIPs to be effective and are motivated to use EBIPs. Consistent with the theory of planned behavior, instructor belief in EBIP effectiveness, collegial support, and perceived knowledge of and skill in using EBIPs positively influence their use. The main barriers to using EBIPs reported by CC instructors included the need to cover large amounts of course content, lack of time to prepare for using EBIPs, and student resistance. Our findings point to a number of approaches that may promote the use of EBIPs by CC biology instructors, including professional development to increase instructor knowledge and skill, addressing tensions between content volume and the use of EBIPs, and providing resources to make implementing EBIPs time efficient.

    INTRODUCTION

    In undergraduate science, technology, engineering, and mathematics (STEM) education, teaching techniques that differ from traditional lecturing have been described with a variety of umbrella terms, including active learning (Bonwell and Eison, 1991), research-based instructional practices (Dean and Hubbell, 2012), and evidence-based instructional practices (EBIPs; Stains and Vickrey, 2017). In this paper, we adopt the term EBIPs to refer to a variety of classroom activities that differ from lecturing and have been linked with positive student outcomes (Ruiz-Primo et al., 2011; Freeman et al., 2014; Rahman and Lewis, 2020; Moon et al., 2021), including more equitable outcomes for students from groups historically underrepresented in STEM (Ernst and Colthorpe, 2007; Haak et al., 2011; Eddy and Hogan, 2014; Freeman et al., 2014; Theobald et al., 2020). For the purposes of this paper, we do not include in the definition of EBIPs a variety of inquiry-based and research-based laboratory activities that have also been associated with desirable student outcomes (Andrews [AAAS], 2019).

    EBIPs in 4-year Colleges and Universities

    Despite the known benefits of EBIPs to students, fewer than 20% of STEM instructors, including biology instructors, in North American 4-year colleges and universities (4YC) use these practices intensively to create student-centered learning environments (Stains et al., 2018). Several studies have characterized the barriers to the use of EBIPs, primarily among 4YC faculty; the most commonly reported barriers are reviewed in Wise et al. (2022).

    On the other hand, factors that faculty perceive as supportive for using EBIPs influence their use more strongly than the perceived barriers (Bathgate et al., 2019). These influential supports have been categorized as academic receptivity (e.g., departmental support, positive department culture), logistic support (e.g., resources, time, classroom spaces conducive to group work), student receptivity to EBIPs, and personal teaching preferences (e.g., comfort, confidence, and ability to use EBIPs). Previous exposure to EBIPs (as a student or as a teaching assistant) also correlates with higher EBIP implementation in chemistry, mathematics, and physics instructors (Apkarian et al., 2021). Course coordination has also been shown to support the adoption and use of EBIPs in undergraduate mathematics (Rasmussen and Ellis, 2015; Archie et al., 2022). Likewise, a growth mindset has been linked to EBIP uptake; STEM instructors with a growth mindset are more likely to implement EBIPs than those with a “fixed” mindset (Aragón et al., 2018).

    EBIPs in Community Colleges

    In contrast to 4YC contexts, there is very little research in community college (CC) settings regarding the extent to which EBIPs are used by STEM instructors, including biology instructors (Schinske et al., 2017; Creech et al., 2022). Published estimates of the intensity of EBIPs use in undergraduate biology come almost exclusively from samples of 4YC instructors (Andrews et al., 2011; Durham et al., 2017; Durham et al., 2018; Stains et al., 2018; Durham et al., 2022). Consequently, the extent to which biology education reform efforts (e.g., professional development programs) have influenced teaching practices in CCs is largely unknown. The EBIPs that CC biology instructors do and do not generally use have also not been systematically documented, making it difficult to identify areas of opportunity for large-scale improvement of teaching practices in CCs.

    The barriers and supports that might influence EBIP use by CC biology instructors are also understudied. Corwin et al. (2019) published one of the very few studies in this area; their work narrowly addressed the teaching of quantitative biology content and skills. As the authors put it, “our study participants met with challenges typical of incorporating new material or techniques into any college-level class.” Challenges reported by the 20 instructors interviewed for this study included perceptions of student deficits, content coverage, and gaps in instructor knowledge (content and pedagogical). Affordances included professional development, curricular resources, collegial and peer support, course autonomy, stated course learning outcomes, student supports (developmental courses, tutoring), and instructional grants (Corwin et al., 2019). It is currently unclear to what extent these barriers and supports are broadly relevant to EBIP use in CC biology instruction outside of teaching quantitative biology. It is also unclear which factors are more influential in driving or hindering EBIP use by CC instructors.

    The current gaps in data regarding the use of EBIPs by CC STEM instructors are problematic, considering the importance of CCs in the U.S. higher education system. Roughly 4.7 million or 30% of all undergraduate students in the United States were enrolled in a CC in 2021 (National Center for Education Statistics, 2023). Furthermore, from 2008 to 2017, almost half (47%) of the students who earned a science or engineering bachelor's degree had completed some coursework at a CC (Foley et al., 2021). CCs disproportionately serve historically underrepresented students in STEM. For example, in the fall of 2020, CCs enrolled 52% of all Native American students and 48% of all Hispanic students, but only 38% of all U.S. undergraduates (AACC, 2023). As EBIPs promote equitable student outcomes in STEM (Theobald et al., 2020), it is especially important to understand the extent to which EBIPs are used in CC classrooms and the factors that either support or hinder their use. This information is foundational for reform initiatives that aim to reduce opportunity gaps in undergraduate education.

    Designing efforts to increase the use of EBIPs in CCs is a complex challenge, given the myriad individual and contextual factors that are likely to influence their use. Theoretical frameworks and models that can help identify the most influential factors related to EBIP use are useful for directing such efforts. With this in mind, in the present study we use the Theory of Planned Behavior (TPB) to model how different instructor-level and contextual factors affect the use of EBIPs by CC biology faculty.

    The Theory of Planned Behavior

    The TPB (Ajzen, 1991) has been identified as a theoretical framework that can be applied to change efforts related to the adoption of EBIPs in STEM contexts (Reinholz and Andrews, 2020), and has been applied accordingly in several STEM disciplines and contexts (Bathgate et al., 2019; Chasteen and Chattergoon, 2020; Archie et al., 2022). As applied in those contexts, the theory assumes several preconditions to an instructor's use of EBIPs, including a positive attitude about the strategies, perceptions of social norms related to the strategies, and their perceived ability to execute the strategies (Reinholz and Andrews, 2020). For the purposes of this study, the TPB is used to identify the preconditions and correlates of EBIP use, to inform change efforts for this population.

    The TPB posits that an individual's behavioral intention is strongly related to their actual behavior (Figure 1). In the TPB, behavioral intent is influenced by attitude, subjective norm, and perceived behavioral control. Attitude refers to an individual's favorable or unfavorable evaluation of the behavior. Subjective norm refers to an individual's perception of their peers’ judgment of a behavior. In this model, individuals who perceive that their peers approve of the behavior will be more likely to engage in the behavior themselves. Perceived behavioral control refers to an individual's perception that they can perform the behavior. Thus, perceived behavioral control affects an individual's intent to perform the behavior and, subsequently, their behavior.

    FIGURE 1.

    FIGURE 1. Model of factors included in the TPB.

    FIGURE 2.

    FIGURE 2. Distribution of MIST composite scores (n = 388).

    We applied the TPB to this study in the following ways: “behavior” was reflected in instructors’ current use of EBIPs. This behavior is affected by their motivation to use EBIPs (behavioral intent). Instructors’ intention to use EBIPs is affected by their attitude about EBIPs, their perceptions of peer and departmental support (subjective norm), and their self-reported knowledge of and skill in using EBIPs (interpreted here as perceived behavioral control). The TPB can also accommodate factors outside of the theory (Armitage and Conner, 2001). For example, individual instructor characteristics and barriers or supports to the use of EBIPs have been integrated into the TPB and shown to influence the degree to which undergraduate mathematics instructors use EBIPs (Archie et al., 2022).

    Goals of this Study

    The purpose of this study was to describe CC biology instructors’ current use of EBIPs, their perceived capacity to use EBIPs, and the barriers, supports, and individual and contextual factors that affect their use of EBIPs. Our goal is to use these findings to inform change efforts to increase EBIP use by CC biology instructors.

    This study addresses the following research questions:

    1. What capacity, collegial support, and mindset do CC biology instructors have to use EBIPs?

    2. To what extent do CC biology instructors report using EBIPs?

    3. What barriers to using EBIPs do CC biology instructors report?

    4. What individual and contextual factors influence EBIP use by CC biology instructors?

    MATERIALS AND METHODS

    Author Positionality

    Our research team includes members with diverse backgrounds and experiences. T.A., S.W., and S.L. are undergraduate STEM education researchers, J.R. develops and manages faculty professional development programs, and M.C. is a program evaluator. None of the authors currently teach at a CC, and thus, we acknowledge that our research design, data interpretation, and conclusions are those of researchers outside of the CC teaching context. Nevertheless, we hope the findings reported here will provide useful insight for CC faculty and other STEM education practitioners.

    Data Collection and Sample.

    From June to August of 2021, we surveyed 5064 CC biology instructors from a commercial list of educators in public and private institutions across the United States who are active members of biology, life science, anatomy and physiology, ecology, and similarly labeled academic departments (MDR Education, Dun & Bradstreet, Inc). Email invitations were sent, and survey data were collected using the Qualtrics Survey system. A gift card incentive was offered to respondents who completed the survey. This study was approved via the expedited review process by the University of Colorado Boulder Institutional Review Board under protocol number 20-0721. In all, 477 U.S. CC biology instructors responded to our survey, resulting in a 9% response rate. Of the 447 respondents who completed the survey, 388 responded to the demographic items (Table 1). Respondents represented 295 unique institutions in 42 states. Sixty-one percent of respondents identified as a woman, 6% identified as Latina/o/x, 70% identified as White, 88% had over 5 years of teaching experience, 23% taught in a minority-serving institution, 77% taught full-time, roughly 50% were tenured or tenure track, and 48% had a renewable contract (adjunct).

    TABLE 1. Survey respondent characteristics (n = 388)

                                                    % of survey respondents
    Gender
      Woman                                                   61%
      Man                                                     36%
      Prefer not to answer                                     4%
    Ethnicity
      Latina/o/x                                               6%
      American Indian or Alaskan Native                        1%
      Asian                                                   10%
      Black or African American                                7%
      Native Hawaiian or Pacific Islander                      0%
      White                                                   69%
      Prefer not to answer                                     8%
    Minority serving institution
      Yes                                                     23%
      No                                                      53%
      Unsure                                                  23%
    Position
      Tenured                                                 42%
      Tenure track                                            10%
      Renewable contract instructor (e.g., Adjunct)           48%
    Teaching status
      Full-time                                               77%
      Part-time                                               23%
    College teaching experience
      < 2 years                                                1%
      2-5 years                                               11%
      6-10 years                                              23%
      11-20 years                                             38%
      > 20 years                                              27%
    Course*
      General Biology                                         81%
      Anatomy & Physiology                                    15%
      Other                                                    4%
    EBIPs teaching experience
      Yes                                                     81%
      No                                                      19%
    Course format*
      In-person                                               89%
      Online                                                  11%

    *Respondents were asked to base their answers about their use of EBIPs on one specific course, preferably General/Introductory Biology, and to indicate the format of the course.

    Measures

    Respondents were asked about their perceptions of EBIPs and about individual and institutional factors known or hypothesized to influence EBIPs use. A copy of the survey instrument is included as Supplemental Data S1 in the Supplemental Materials. In survey questions, we used the term “active learning teaching strategies” as a proxy for EBIPs. Moreover, before answering any questions about “active learning,” respondents were required to acknowledge a statement that defined active learning as “teaching practices that differ from traditional lecture.” For example, to measure instructors’ belief in the general effectiveness of EBIPs we asked, “To what extent do you believe active learning teaching strategies are an effective learning method?” Responses therefore capture instructional actions that faculty perceive as deviating from traditional expository lecturing. This is an important caveat because some of the instructional practices used by faculty that differ from lecturing may not be strongly supported by educational research. However, “active learning” is commonly used in the literature and among undergraduate practitioners as a useful umbrella term for many teaching practices that are supported by research (Driessen et al., 2020; Lombardi et al., 2021). When asking about specific EBIPs, we used the specific terminology from previously published survey instruments.

    Evidence-based Instructional Practices.

    Individual EBIPs were measured with the short form of the Measurement Instrument for Scientific Teaching (MIST-short). Biology education researchers designed this instrument to measure the frequency of use of EBIPs in undergraduate biology courses (Durham et al., 2017). This instrument has shown alignment between observation, instructor reports, and student reports of EBIPs (Durham et al., 2018). Additionally, recent work has shown that MIST scores are positively related to other measures of learner-centered teaching (Emery et al., 2020).

    With the MIST-short, survey respondents were asked to report their teaching practices for an introductory or general biology course under typical conditions (i.e., before the COVID-19 pandemic). If they did not teach an introductory or general biology course, they were instructed to report their teaching practices for another biology course they taught. Respondents were asked to report only on the lecture component of the course, not on any lab component. The majority (81%) of respondents answered in relation to an introductory/general biology course and most others (15%) answered in relation to an anatomy and physiology course. MIST scores have a theoretical range from 0 to 100 but an observed range from 15 to 85 (Durham et al., 2022); lower composite scores indicate less use of EBIPs, and higher scores indicate greater use of EBIPs.

    Barriers to the Use of EBIPs.

    Two previous studies provide comprehensive survey items related to barriers and supports for STEM faculty's use of EBIPs (Bathgate et al., 2019; Sturtevant and Wheeler, 2019). To limit survey length and encourage survey completion, we adapted a subset of barriers and supports from these studies that were consistent with our prior findings from interviews conducted with CC biology faculty (Wise et al., 2022).

    Capacity to Teach Using EBIPs.

    Instructors’ capacity to teach using EBIPs or “active learning teaching strategies” was measured by four items that probed instructors’ beliefs about the effectiveness of active learning, their knowledge of active learning, their skill in using active learning, and their motivation to use active learning (Archie et al., 2022).

    Respondents were asked “To what extent do you believe active learning teaching strategies are an effective learning method?” and rated this item on a five-point scale (1 = not at all effective to 5 = extremely effective). Respondents were asked to rate their current level of skill in using active learning teaching strategies (1 = not at all skilled, 5 = extremely skilled) and their current level of knowledge of active learning teaching strategies (1 = not at all knowledgeable, 5 = extremely knowledgeable). Finally, respondents were asked “How motivated are you to use active learning teaching strategies?” and this item was coded on a five-point scale (1 = not at all motivated to 5 = extremely motivated).

    Growth Mindset.

    Instructors were asked three growth mindset questions (Dweck, 2006) that have been shown to be related to EBIP use (Aragón et al., 2018). These items were coded on a five-point scale (1 = strongly disagree, 5 = strongly agree).

    Teaching Contexts.

    In addition to the demographic identifiers summarized in Table 1, we collected information on each respondent's teaching context, including class size, student majors, student level (e.g., first-year), and course subject (e.g., introductory/general biology). We adapted an item from prior research to measure how instructors coordinated their teaching, such as by sharing syllabi, exams, or assessments across multiple sections of a course (Rasmussen and Ellis, 2015). We additionally used two items that were used in prior research to measure instructors’ perceptions of department support (Archie et al., 2022). Respondents were asked to rate “Support from your colleagues in the department to use active learning in your teaching” and separately, “Support from your department head or chair to use active learning in your teaching.” Both items used the same five-point scale (1 = not at all supportive to 5 = very supportive).

    Data Analysis

    Assignment of Variables to the TPB.

    Several of the survey items described in previous sections correspond to TPB constructs. MIST composite scores were used to measure the “behavior” construct in the TPB (use of EBIPs). Instructors’ motivation to use EBIPs was used to represent “intention” in the TPB. Two items represented the TPB construct “perceived behavioral control”: respondents’ ratings of their knowledge of EBIPs and of their skill in using EBIPs. “Subjective norm” was derived from two survey items related to departmental support. “Attitude” was derived from a single item related to belief in the effectiveness of EBIPs. As described below, prior EBIP experience was also included in the model and hypothesized to be positively related to “intention” in the model.

    Statistical Analyses.

    In preliminary analyses (e.g., t test, ANOVA, chi-square) we checked for differences in overall MIST scores by all individual characteristics, institutional characteristics, and teaching contextual factors that did not correspond to a TPB construct. Only one test showed a statistically significant difference: instructors with prior active learning experience (individual characteristic) showed higher motivation or “intent” to use active learning teaching strategies than those who indicated they had no prior active learning teaching experience (p ≤ 0.05). Therefore, to maximize model parsimony, prior EBIP (active learning) experience was the only additional variable included in the TPB structural equation model (SEM). This item was a dichotomous measure, “Have you ever taught a class using active learning teaching strategies?” and was coded (yes = 1, no = 0).
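    The kind of preliminary group comparison described above can be sketched in a few lines of Python. This is an illustrative sketch only: the motivation ratings below are simulated, not the study's data, and the group sizes simply mirror the sample (316 with prior experience, 72 without).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical motivation ratings (1-5 scale) for instructors with and
# without prior active learning experience (simulated, illustrative only).
with_experience = rng.integers(2, 6, size=316).astype(float)
without_experience = rng.integers(1, 5, size=72).astype(float)

# Welch's t-test (does not assume equal group variances)
t_stat, p_value = stats.ttest_ind(with_experience, without_experience,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```

    A p-value at or below 0.05 would, as in the study, justify carrying the grouping variable forward into the structural model.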

    Descriptive statistics were computed to answer research questions 1, 2, and 3 (SPSS 28, IBM Inc., Chicago, IL, USA). An SEM was used to answer research question 4 (AMOS 28, IBM Inc., Chicago, IL, USA). The SEM was limited to the 388 survey respondents who answered the MIST items. Missing data were handled by full information maximum likelihood estimation. Mardia's coefficient was calculated to assess multivariate normality, and our data exceeded the recommended maximum value of 3.0. However, Mardia's coefficient thresholds may be more stringent than necessary, so we also tested the univariate normality of all variables in the SEM (Gao et al., 2008). All variables in the model were below the thresholds of moderate non-normality (skewness = 2, kurtosis = 8; Curran et al., 1996). Bootstrapping was used to test the significance of indirect effects in the structural model. Two fit indices, the comparative fit index (CFI) and the root mean square error of approximation (RMSEA), were used to assess model fit for maximum likelihood estimation. CFI values range from 0 to 1, where values ≥ 0.95 are considered well-fitting; RMSEA values ≤ 0.05 indicate a close fit. Together, the model fit indices and parameter estimates were used to determine how well the confirmatory factor analysis model and structural model fit our data.
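    The univariate normality screen described above (skewness below 2, kurtosis below 8) can be sketched as follows. The data are simulated, and note that the kurtosis convention (raw vs. excess) should be chosen to match the one assumed by the cited thresholds; raw kurtosis is used here as an assumption.

```python
import numpy as np
from scipy import stats

def within_normality_thresholds(x, skew_max=2.0, kurt_max=8.0):
    """Screen a variable against the moderate non-normality thresholds
    used in the text (Curran et al., 1996): |skewness| < 2, kurtosis < 8."""
    skewness = stats.skew(x)
    kurt = stats.kurtosis(x, fisher=False)  # raw kurtosis (normal dist. ~ 3)
    return bool(abs(skewness) < skew_max and kurt < kurt_max)

rng = np.random.default_rng(1)
simulated_item = rng.integers(1, 6, size=388).astype(float)  # 5-point item
print(within_normality_thresholds(simulated_item))
```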

    A confirmatory factor analysis was performed to assess the convergent and discriminant validity of two latent factors: social norm and perceived behavioral control. Next, a structural model tested the TPB using both latent and observed variables as described in the “Measures” section.
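    The bootstrap test of indirect effects mentioned above can be illustrated with a minimal sketch on simulated mediation data (attitude → intention → behavior). This simplified version uses simple regression slopes and a product-of-paths estimate rather than the full latent-variable model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 388

# Simulated mediation chain (illustrative only, not the study's data):
# attitude -> intention (a path), intention -> behavior (b path)
attitude = rng.normal(0.0, 1.0, n)
intention = 0.5 * attitude + rng.normal(0.0, 1.0, n)
behavior = 0.4 * intention + rng.normal(0.0, 1.0, n)

def slope(x, y):
    """OLS slope of y regressed on x."""
    return np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

def indirect_effect(idx):
    # Product-of-paths estimate a * b on a resampled index set.
    # (A full mediation model would partial attitude out of the b path.)
    return slope(attitude[idx], intention[idx]) * slope(intention[idx], behavior[idx])

boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(2000)])
low, high = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{low:.3f}, {high:.3f}]")
```

    A percentile interval that excludes zero indicates a statistically significant indirect effect.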

    RESULTS

    Research Question #1: What capacity, collegial support, and mindset do CC biology instructors have to use EBIPs?

    Capacity to Use EBIPs.

    Instructors were asked to report their current capacity to use EBIPs, where capacity is measured with the components of knowledge, skill, belief in EBIP effectiveness, and motivation (Table 2). Most instructors held positive attitudes about the effectiveness of EBIPs and were motivated to use them, but gave lower ratings to their knowledge of and skill in implementing EBIPs.

    TABLE 2. Instructor perceptions of four aspects of their capacity to use EBIPs (n = 395)

                     Not at all   Slightly   Moderately   Very   Extremely
    Knowledge            3%          19%         48%       24%       7%
    Skill                4%          24%         52%       16%       4%
    Belief               0%           6%         23%       44%      28%
    Motivation           1%           7%         25%       38%      29%

    Departmental Support to use EBIPs.

    As shown in Table 3, respondents (n = 386) rated the level of support from department colleagues and their department head to use EBIPs (active learning teaching practices). Most instructors perceived strong support from department colleagues and chairs, but over a quarter of respondents reported mixed or unsupportive departments. Department heads were perceived as slightly more supportive of EBIPs than departmental colleagues.

    TABLE 3. Instructor perceptions of collegial support for using EBIPs (n = 386)

    Survey item                Not at all    Mostly not     Mixed or           Mostly        Very
                               supportive    supportive     moderate support   supportive    supportive
    Collegial support              3%            5%              24%               32%           37%
    Department head support        3%            4%              18%               28%           48%

    Eighty-nine percent of instructors reported some degree of coordination of their course with other colleagues in their department (Table 4). Using the same textbook was the most frequent coordination activity (72%), followed by meeting with colleagues to discuss or design content (39%) and coordinating materials/assignments (34%).

    TABLE 4. Course coordination activities (n = 386)

                                                                          % of respondents
    I use the same textbook as others in my department                          72%
    I meet with other instructors to discuss or design course content           39%
    I coordinate materials or assignments with others in my department          34%
    I use the same syllabus as others in my department                          27%
    No coordination                                                             11%
    I use the same exams as others in my department                              9%
    I team teach this course                                                     8%
    Other type of coordination                                                   8%

    Growth Mindset.

    Instructors were asked three questions about their growth mindset related to student intelligence (Table 5). Three-quarters or more of instructors disagreed or strongly disagreed with these statements, indicating a growth mindset. However, a minority of instructors (10–16%) agreed or strongly agreed with each of the three statements, indicating a “fixed” mindset around intelligence, and an additional 8–13% indicated they neither agreed nor disagreed with each statement.

    TABLE 5. Average growth mindset (n = 370)

    Survey items                                                                                     M      SD
    Students have a certain amount of intelligence, and they really can't do much to change it.     1.83   1.04
    Students’ intelligence is something about them that they can't change very much.                1.83   1.07
    Students can learn new things, but they can't really change their basic intelligence.           2.05   1.18
    Scale mean                                                                                      1.91   0.98

    Note: means calculated from a five-point scale (1 = strongly disagree, 5 = strongly agree).
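    The item means, SDs, and scale mean in Table 5 are straightforward averages over the 1-5 responses. A minimal sketch with hypothetical response data (rows = respondents, columns = the three mindset items):

```python
import numpy as np

# Hypothetical responses to three mindset items (1 = strongly disagree,
# 5 = strongly agree); illustrative data only.
responses = np.array([
    [1, 2, 2],
    [2, 1, 3],
    [1, 2, 2],
    [3, 2, 1],
])

item_means = responses.mean(axis=0)         # per-item mean (the M column)
item_sds = responses.std(axis=0, ddof=1)    # per-item sample SD
scale_mean = responses.mean(axis=1).mean()  # mean of per-respondent averages
print(item_means, scale_mean)
```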

    Research Question #2: To What Extent do CC Biology Instructors Report Using EBIPs?

    A large majority of respondents (81%) indicated that they had used EBIPs (active learning teaching practices) to some degree. MIST composite scores ranged from 13 to 84 (Figure 2), a range consistent with MIST composite scores reported in prior research on undergraduate biology instructors (Durham et al., 2017, 2018, 2022). The average MIST composite score for this sample was 47.04 (SD = 13.03). MIST composite scores did not differ significantly by any of the individual, institutional, or teaching-context variables measured in our survey outside the TPB constructs.

    Responses to each item in the MIST short-form survey are included as Supplemental Data S2 in Supplemental Materials. Here, we summarize those responses to provide insight into the use of different EBIPs by the instructors in our sample, as captured by the MIST-short (n = 388). For ease of presentation, we group EBIPs in relation to the MIST subcategories derived by Durham and colleagues (2017) through empirical factor analysis and theoretical grounding in the scientific teaching framework. Our groupings differ from those of Durham and colleagues in two ways: First, we used a reduced number of survey items (i.e., the MIST-short). Second, we discuss two MIST subcategories, Experimental Design/Communication and Data Analysis/Interpretation, as a single group.

    Active Learning Strategies.

    Seventy-nine percent of respondents reported using group work in their teaching and devoting, on average, 33% of class time to this practice. Fifty-seven percent of instructors reported engaging students in four or more in-class activities per month. Polling was the least frequently used EBIP overall, with 42% of instructors reporting never using it. Of those who reported using polling, 13% asked more than five polling questions per week.

    Learning Goal Use and Feedback.

    Seventeen percent of instructors in our sample reported providing learning goals for most or for each activity in their classes. On average, respondents reported that 71% of their polling questions and 76% of their in-class activities overlapped with their learning goals.

    Inclusivity.

    Seventy-two percent of instructors in our sample agreed or strongly agreed that they use examples or analogies that are culturally diverse, and 74% agreed or strongly agreed that they highlight contributions from diverse people involved in science.

    Responsiveness to Students.

    Sixty-seven percent of instructors responded that they “most of the time” or “always” provided students with additional opportunities to learn when it became clear that the class did not understand a concept. However, a smaller proportion, 42% of instructors, said they were aware “most of the time” or “always” that a concept was not understood by the majority of students before an exam.

    Experimental Design/Communication and Data Analysis/Interpretation.

    Instructors reported asking students to do the following at least once a week: formulate hypotheses or make predictions (36% of respondents), design experiments to answer questions (18% of respondents), analyze or interpret data in graphs or tables (36% of respondents), and use data to make decisions (28% of respondents).

    Cognitive Skills.

    Fifty-three percent of instructors reported engaging students in high-level thought processes at least once a week and 31% reported asking their students to work on open-ended questions (e.g., case studies) at least once a week.

    Course and Self-reflection.

    Thirty-five percent of instructors reported providing opportunities or suggestions for students to reflect on the effectiveness of their study habits more than three times per month, and 24% of instructors reported collecting formal or informal student feedback more than twice during a term.

    Research Question #3: What Barriers to Using EBIPs do CC Biology Instructors Report?

    We asked respondents to rate reported barriers to using EBIPs as either an impediment (something that makes implementation difficult) or as something that prevents (stops) them from implementing EBIPs in their teaching (Table 6). Instructors most often reported that barriers impeded but did not prevent their use of EBIPs. However, content coverage was the most common barrier cited as preventing instructors from implementing EBIPs (21% of respondents). Preparation time, student resistance, access to suitable materials, and perceived student deficits were also substantial barriers, with at least half of instructors stating that these issues impeded or prevented them from using EBIPs.

    TABLE 6. Incidence of barriers to EBIPs among CC biology instructors (n = 370)

    Barriers to active learning               Prevents   Impedes   Does not impede
    Too much content to cover                    21%       52%          27%
    Too much prep time required                  11%       55%          34%
    Student resistance                            5%       51%          44%
    No access to materials                       12%       40%          48%
    Student deficits make use difficult           5%       45%          50%
    Not enough knowledge to use                   8%       41%          51%
    Inappropriate classroom design to use         7%       42%          51%
    Not enough funding opportunities to use      11%       36%          53%
    Assessing learning is difficult               6%       35%          59%
    Subject matter not conducive to use           9%       29%          62%
    Class too large                               6%       30%          64%
    Department norm is lecture                    2%       20%          78%
    Negative student evaluations                  2%       20%          78%
    Chair not supportive                          2%       10%          88%

    Research Question #4: What Individual and Contextual Factors Influence EBIP use by CC Biology Instructors?

    To estimate the impact of different factors on EBIP implementation by CC instructors, we performed SEM with eight variables, seven of which relate to a construct in the TPB. The eighth variable, individual prior experience with EBIPs (as active learning teaching practices), was included in the model because exploratory analyses indicated that it significantly correlates with the intent to use EBIPs (behavioral intention). The measurement model showed acceptable fit (CMIN = 3.94, df = 2, p = 0.139; CFI = 0.99; RMSEA = 0.05). All factor loadings were positive, with significant standardized regression weights above 0.30 on their corresponding latent variables, indicating convergent validity. Latent factor correlations below 0.40 indicated discriminant validity. The structural model also showed acceptable model fit (CMIN = 13.66, df = 13, p = 0.398; CFI = 0.99; RMSEA = 0.01). The social norm and perceived behavioral control constructs were represented equally by their corresponding observed variables, as evidenced by the nearly equal factor loadings provided in Supplemental Data S3.
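    The reported RMSEA values can be recovered from the chi-square statistics with a common formula, RMSEA = sqrt(max(χ² − df, 0) / (df · (N − 1))); conventions differ slightly across software (some use N rather than N − 1), so this is a sketch under the (N − 1) convention.

```python
import math

def rmsea(chi_square, df, n):
    """Root mean square error of approximation, (N - 1) convention."""
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Fit statistics reported for the two models (N = 388)
print(round(rmsea(3.94, 2, 388), 2))    # measurement model -> 0.05
print(round(rmsea(13.66, 13, 388), 2))  # structural model -> 0.01
```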

    The structural model explained a moderate amount of the variability in intention to use EBIPs (measured by motivation to use EBIPs; R² = 0.49) and in the frequency of EBIP use (behavior in the TPB; R² = 0.26). Descriptive statistics for all variables included in the SEM are shown in Table 7.

    TABLE 7. Descriptive statistics for all variables included in the structural equation model (n = 388)

    | TPB construct | Observed variable | Mean | SD |
    | --- | --- | --- | --- |
    | Attitude | Belief in the effectiveness of AL | 3.94 | 0.86 |
    | Social norm | Collegial support | 3.94 | 1.04 |
    | Social norm | Chair support | 4.13 | 1.04 |
    | Perceived behavioral control | Knowledge | 3.14 | 0.89 |
    | Perceived behavioral control | Skill | 2.92 | 0.85 |
    | Behavioral intention | Motivation | 3.88 | 0.94 |
    | Behavior | MIST composite score | 47.04 | 13.03 |
    | Factors outside of TPB | Prior AL experience | Yes: 81% (n = 316) | — |

    Abbreviations: TPB = Theory of Planned Behavior; AL = Active Learning.

    We found positive, statistically significant standardized direct effects for all specified paths in the SEM (Figure 3). The model results reveal ways in which variables influence instructors’ intent and behavior, both directly and indirectly, with some variables exerting multiple influences. Of the three core variables theorized to influence intention to use EBIPs, attitude (specifically, belief in the effectiveness of EBIPs) had the strongest direct effect, while perceived behavioral control and subjective norm showed similar moderate direct effects. Additionally, perceived behavioral control (knowledge and skill) had a strong positive direct effect on instructors’ behavior (use of EBIPs). Instructors’ previous experience using EBIPs had a small positive direct effect on instructor intention and a small indirect effect on instructor behavior (Table 8). Similarly, subjective norm, attitude, and perceived behavioral control all showed small but statistically significant indirect effects on behavior (Table 8).


    FIGURE 3. Structural equation model of the TPB (n = 388)

    TABLE 8. Standardized direct and indirect effects of the structural model (n = 388).

    | Path | Standardized estimate |
    | --- | --- |
    | Direct effects on intention | |
    | Attitude | 0.52*** |
    | Subjective norm | 0.23** |
    | Perceived behavioral control | 0.20*** |
    | Prior EBIPs experience | 0.10* |
    | Direct effects on behavior | |
    | Intention | 0.18*** |
    | Perceived behavioral control | 0.40*** |
    | Indirect effects on behavior | |
    | Attitude | 0.09** |
    | Subjective norm | 0.02** |
    | Perceived behavioral control | 0.04** |
    | Prior EBIPs experience | 0.02* |

    *p < 0.05; **p < 0.01; ***p < 0.001.
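
    In a path model of this shape, an antecedent's indirect effect on behavior is the product of its direct path to intention and the intention-to-behavior path. The sketch below is an arithmetic illustration using the rounded published estimates, not a re-estimation of the SEM; the variable names are ours, and small discrepancies from Table 8 (e.g., for subjective norm) can arise from rounding of the reported coefficients.

```python
# Indirect effects in a path model are products of the direct paths along
# each route. Values are the rounded standardized estimates from Table 8.
paths_to_intention = {
    "attitude": 0.52,
    "perceived_behavioral_control": 0.20,
    "prior_EBIP_experience": 0.10,
}
intention_to_behavior = 0.18

# Indirect effect on behavior = (path to intention) x (intention -> behavior)
indirect = {name: round(path * intention_to_behavior, 2)
            for name, path in paths_to_intention.items()}
print(indirect)
# attitude: 0.09, perceived_behavioral_control: 0.04,
# prior_EBIP_experience: 0.02 -- matching Table 8 to two decimals.

# Total effect of perceived behavioral control = direct path + indirect path
total_pbc = round(0.40 + 0.20 * intention_to_behavior, 2)
print(total_pbc)  # 0.44
```

    Viewed this way, perceived behavioral control's total effect on behavior (direct 0.40 plus indirect ≈ 0.04, about 0.44) is the largest of any factor in the model, consistent with the discussion below.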

    DISCUSSION

    Given the key role of CCs in the education of STEM students in the United States and the positive student outcomes associated with evidence-based instructional practices (EBIPs), promoting and supporting EBIPs in CC STEM courses should be a priority for reform efforts. Doing so requires an understanding of the attitudes, knowledge, and skill of CC instructors regarding EBIPs, as well as the key factors that hinder or promote their use. The present study contributes to this understanding by characterizing self-reported measures of EBIP use in a large sample of CC biology instructors, and by modeling the extent to which different individual and contextual factors influence their use of EBIPs.

    CC Biology Instructors Are Well Situated to Use EBIPs

    Generally, the CC biology instructors in our sample were experienced as teachers, reported using EBIPs to some degree, and largely enjoyed the support of department chairs and/or colleagues for using EBIPs. Most instructors held positive attitudes about the effectiveness of EBIPs and were motivated to use EBIPs, but they ranked their knowledge and skills to use EBIPs lower than their attitudes and motivation. Similar findings (high belief and motivation coupled with lower knowledge and skill) have been reported for mathematics undergraduate instructors (Archie et al., 2022). Taken together, these findings lead us to conclude that CC biology instructors are well situated to use EBIPs, but could benefit from additional support to grow their knowledge about EBIPs and their skills using EBIPs.

    Surveyed instructors seemed to be generally oriented toward a growth mindset, which has been linked in previous research to higher uptake of EBIPs (Aragón et al., 2018). Faculty also reported coordinating their courses in one or more ways (e.g., using the same teaching materials), which has been shown in previous research in 4YC STEM contexts to favor the use of EBIPs (Rasmussen and Ellis, 2015; Archie et al., 2022). However, in our exploratory analyses, we found no statistically significant relationship between mindset or course coordination and EBIP use. We do not know whether these findings are related to our sample as discussed in the limitations section of this paper, but it will be important for future research to clarify the relationship (if any) between these factors and the use of EBIPs by CC biology faculty.

    Some features of the CC teaching context are also likely to be supportive of EBIP use. For example, CC instructors’ jobs focus on teaching, which can both reflect and foster a focus on student development (Mesa et al., 2014). This professional focus and institutional mission align well with instructor interest in building transferable skills (Wise et al., 2022). CC instructors teach more classes or credit hours than peers at 4Y institutions, but do not teach notably more students (Townsend and Rosser, 2007); class sizes tend to be smaller and more amenable to using interactive pedagogies and building strong student–teacher interactions. EBIP use can improve instruction and aligns well with current and widespread efforts to strengthen student experiences and outcomes, such as developmental education reform and guided pathways initiatives (Brown and Bickerstaff, 2021).

    CC Biology Instructors Use EBIPs to a Similar Extent as 4YC Instructors

    CC biology instructors’ intensity of EBIP use is comparable to that of their 4-year counterparts. The self-reported MIST composite scores in our sample (mean = 46.6) were nearly identical to those derived from student reports about their 4-year college biology instructors (mean = 47.0) (Durham et al., 2017). Recently, biology instructors at primarily undergraduate institutions also reported comparable mean MIST composite scores (mean = 48.7) (Durham et al., 2022). Four-year college biology instructors who participated in intensive, scientific teaching professional development reported much higher MIST composite scores (mean = 53.8), underscoring the role of professional development in supporting EBIP use (Durham et al., 2017). While these studies provide useful benchmarks for interpreting general survey results, our MIST-short data are not sufficient for a comparison of individual EBIPs between our sample and 4YC biology instructors surveyed in previous research.

    Barriers to the Use of EBIPs by CC Biology Instructors

    The main barriers to using EBIPs reported by instructors in our sample were content coverage, preparation time, student resistance, and access to suitable materials. These barriers have surfaced before in the limited literature on EBIP use in CC biology (Harmon, 2017; Corwin et al., 2019; Wise et al., 2022). Three of these top concerns (content coverage, class preparation, and materials) are related to instructor time, in and out of the classroom. An excess of course content means that instructors feel pressed to cover content quickly, because class time is limited, increasing the likelihood of lecturing in lieu of more time-intensive student-centered approaches. The reasons driving undergraduate biology faculty to cover large amounts of content have been theorized (Petersen et al., 2020) and include the size of textbooks, the norms of the discipline, beliefs about the knowledge students need to progress in their studies, and the real and perceived demands of admissions and licensing examinations and accreditation processes. These latter beliefs may be especially strong for CC instructors who teach courses that students need for transfer to 4Y institutions. However, in the CC setting, Tripp and colleagues (2024) investigated the coherence between the content of anatomy and physiology courses and the knowledge needs of practicing nurses and found no support for the large amounts of information covered in these courses. Similarly, Weston and Thiry have found that transfer students who enter sophomore- or junior-level STEM courses do as well as or better than their first-time-in-college peers (personal communication, April 15, 2024). Finally, Wang et al. (2017) found that students’ experiences of active learning in CC courses positively affect their intent to transfer to 4Y programs.
Making instructors aware of these findings may help to dispel beliefs that wider content coverage will benefit students more than pedagogies that can offer deeper learning, more focus on applications, or greater interest.

    Personal time to prepare for class and to find or design teaching resources (e.g., activities, problem sets, high-quality questions) also seems to be a limiting factor in CC biology instructors’ use of EBIPs. Taken together, these findings point to a need for institutions and reform programs to develop mechanisms that relieve the time constraints experienced by CC instructors and to consider changes that streamline introductory curricula. Calls to action such as Vision and Change (AAAS, 2011) and scholarly recommendations (e.g., Branchaw et al., 2020; Petersen et al., 2020) provide starting points for these changes.

    Student resistance to EBIPs was another top concern of instructors in our sample. Previous studies have reported this barrier across undergraduate STEM education (Michael, 2007; Herreid and Schiller, 2013; Bathgate et al., 2019; Apkarian et al., 2021). Faculty cite concerns over student discontent, attrition, and negative evaluations resulting from using more student-centered teaching approaches (Shadle et al., 2017; Sturtevant and Wheeler, 2019; Apkarian et al., 2021). However, research shows that those concerns may be misplaced (Finelli et al., 2018; Nguyen et al., 2021). Instructor strategies to ameliorate student resistance to EBIPs have been proposed, including clarifying expectations regarding the effort required to engage in EBIPs and explaining why EBIPs are beneficial (Tharayil et al., 2018; Nguyen et al., 2021; Craig and Hsu, 2024). Some suggested approaches to mitigating student resistance relate to thoughtful and skillful implementation of EBIPs, such that students feel included, supported, and productive, and understand the relationship between EBIPs and the goals of the course (Nguyen et al., 2021). Thus, supporting CC biology instructors with professional development (PD) that helps them implement mitigation strategies could be a promising approach to addressing this perceived barrier.

    Large class sizes and low levels of departmental support are two features of organizational context that have previously been reported as perceived barriers to EBIP use in STEM contexts (e.g., Ferrare and Hora, 2014; Bathgate et al., 2019; McConnell et al., 2020; Archie et al., 2022). However, they were not perceived as strong barriers by the CC biology instructors in our study. CC instructors typically teach smaller sections than faculty at many 4-year institutions, and thus class size is understandably not a perceived barrier in CCs. It is somewhat intriguing that low departmental support was not a significant barrier for our survey respondents, in contrast to prior research mainly on 4YCs (Henderson and Dancy, 2007; Hora et al., 2012; Bradforth et al., 2015; Corbo et al., 2016; Shadle et al., 2017; Wieman, 2017; Bathgate et al., 2019). This finding could be related to CC departments’ focus on their teaching mission and prioritization of supportive cultures around teaching. Nevertheless, a quarter of respondents in our study did report mixed or unsupportive departments in relation to their use of EBIPs, and the findings of the TPB SEM show that department support is related to EBIP use. Therefore, some targeted efforts focused on enhancing departmental support for using EBIPs in CCs are probably warranted.

    TPB Explains Variability and Elucidates Key Factors Affecting CC Biology Instructor Use of EBIPs

    We used the TPB to describe relationships between a number of variables and CC biology instructors’ use of EBIPs. The TPB structural model explained 49% of the variability in instructors’ intention to use EBIPs and 26% of the variability in their self-reported use of EBIPs, which is consistent with previous applications of the TPB (Armitage and Conner, 2001; Archie et al., 2022). The model specified four factors directly related to intention to use EBIPs, two factors directly related to EBIP use, and four factors indirectly related to EBIP use. The following paragraphs discuss each relationship in the model to explain CC biology instructors’ use of EBIPs.

    Perceived behavioral control had a moderate direct effect on behavioral intention and a strong direct effect on behavior. These findings indicate that as instructors’ knowledge and skill increase, so do their intention to use EBIPs and their actual use of them. EBIP knowledge and skill were the strongest predictors of EBIP use, indicating that they are more influential in determining teaching practice than any other factor included in the model.

    Instructors’ attitudes about the effectiveness of EBIPs had a strong direct effect on intention to use EBIPs and a minimal indirect effect on instructors’ actual use of EBIPs. These findings indicate that the more positive an instructor's attitude about the effectiveness of EBIPs, the greater their motivation to use them. Prior research has likewise pointed out the importance of faculty beliefs regarding the effectiveness of EBIPs in relation to their use of these practices (Madson et al., 2017).

    Instructors’ prior experience with EBIPs had a small positive direct effect on their intention to use EBIPs and a positive, though minimal, indirect effect on actual EBIP use. The relatively low magnitude of this factor suggests that a lack of experience with EBIPs is not necessarily a strong barrier to EBIP use among CC biology instructors. Future research with larger sample sizes and/or different measures of past and present use of EBIPs may be required to clarify the extent to which EBIP experience predicts EBIP use by CC biology faculty.

    The subjective norm construct showed a moderate direct effect on behavioral intent and a minimal indirect effect on behavior, indicating that instructors’ perceptions of support from colleagues impacted their intent to use EBIPs and, to a lesser extent, their intensity of EBIP use. This finding is consistent with prior research that has shown linkages between department culture or support and EBIP use (Bathgate et al., 2019; McConnell et al., 2020). Other studies using the TPB have shown that social norms typically have relatively weak associations with behavioral intention (Armitage and Conner, 2001), which is consistent with our finding that collegial support only moderately influences CC instructors’ motivation to use EBIPs.

    The positive relationship between behavioral intention and behavior indicated that instructors with high intent to use EBIPs do so more frequently than those with weaker intent. This finding is consistent with prior TPB research (Armitage and Conner, 2001). In this study, we observed that behavioral intent is positively related to instructors’ use of EBIPs, but is mediated by their perceived ability to use EBIPs (perceived behavioral control), suggesting that while instructors need to be motivated to use EBIPs, they must also have the knowledge and skills necessary to use them—a finding that emphasizes the potential impact of well-designed, systems-attentive professional development.

    Conclusions and Recommendations

    Our findings suggest that CC biology instructors are well situated to increase their use of EBIPs. Instructors generally have a growth mindset, hold positive views of EBIPs, are motivated to use EBIPs, and are already using EBIPs in their teaching. However, instructors rated their knowledge and skills to use EBIPs relatively lower. These lower ratings, together with the fact that knowledge and skill were the strongest predictors of EBIP use in the TPB model, point to the need for professional development to help instructors learn about EBIPs and practice their use.

    Roughly a quarter of the surveyed instructors perceived a lack of support from colleagues around implementing EBIPs. This provides an area of opportunity for reform efforts to promote EBIPs through impacting departmental culture, given the positive effect that collegial support has on the intent and actual use of EBIPs.

    Most of the top barriers to EBIP use in CC biology teaching are related to instructors’ time constraints, inside and outside the classroom: content coverage, preparation time, and access to materials. This finding calls for multipronged approaches to ease the demands placed on CC instructors’ time. Future research specifically examining why CC instructors perceive the need to cover large amounts of content in introductory biology courses would illuminate the nature of this problem and offer possible solutions. Factors that seem to enforce this “tyranny of content” in biology education have been proposed (Petersen et al., 2020) and can inform this future research. Continuing efforts to better align CC introductory biology courses with the ideas articulated in Vision and Change (AAAS, 2011) also seem needed. Fortunately, much work has been done in the last few years to create resources, tools, and knowledge that can support instructors and departments in this endeavor (Branchaw et al., 2020). Easing individual instructors’ teaching loads would also provide more opportunities for instructors to invest in preparing their classes to intentionally incorporate EBIPs. Ready-to-use teaching materials that support EBIP use and PD to expand knowledge about time-efficient ways of implementing EBIPs can also contribute.

    Our findings highlight that reform efforts must be capable of addressing multiple individual and contextual factors. In other words, our results suggest the need for different types of initiatives, some designed to elicit attitudinal changes in the minority of instructors who hold a lower belief in the efficacy of EBIPs, other interventions that address barriers related to departmental support and time, and still others related to increasing knowledge and skills among CC biology instructors who are already motivated to use EBIPs.

    Limitations

    This study has described and explained the self-reported use of EBIPs in a large sample of CC biology instructors. However, several limitations must be considered when interpreting these findings. First, we observed a low response rate (9%) when we surveyed a list of 5,064 CC biology instructors (there were an estimated 11,040 CC biology teachers in the United States in 2022, according to the U.S. Bureau of Labor Statistics). Therefore, the findings we report here are derived from 370 to 395 instructors (depending on the measure), or about 3.5% of all U.S. CC biology instructors. We obtained a low response rate despite implementing practices known to increase response rates, including a relatively generous survey incentive (Dillman et al., 2014). Low response rates have been reported for other survey research with college instructors conducted during the COVID-19 pandemic (Durham et al., 2022), and they are likely related to the increased email volume and workloads reported at that time (Johnson et al., 2020; Colclasure et al., 2021). Despite this limitation, our sample seems to be geographically representative (42 states) and reflects a large number (n = 295) of institutions.

    A second limitation of our study is related to the fact that we asked instructors about their teaching retrospectively, in the context of “normal conditions before COVID.” This study was conducted with the purpose of understanding and providing support to CC biology instructor teaching, not necessarily in response to COVID-19. This was a common limitation of research conducted during the COVID-19 pandemic (Durham et al., 2022).

    Lastly, our use of the TPB to explain the degree to which instructors use EBIPs did not include all of the known supports and barriers to EBIP use. While the TPB may seem to oversimplify the factors that influence EBIP use, our preliminary analyses suggested that many of the known barriers and supports that we measured did not have a statistically significant relationship with instructors’ use of EBIPs and were therefore excluded from the final TPB model. Moreover, while some barriers and supports to adoption of EBIPs center on the experience, socialization, attitudes, and preparation of instructors for teaching, many others are embedded in the departmental and institutional context (Austin, 2011; Brown and Bickerstaff, 2021). These affect instructor practices through levers of change such as work allocation and rewards systems, as well as professional development. Our study did not explore these factors in any depth, and informed observers agree that research on CCs is sorely lacking in studies of faculty, instruction, institutional culture, and the relationships among these (e.g., Twombly and Townsend, 2008; Mesa et al., 2014; Mesa, 2016; Brown and Bickerstaff, 2021).

    ACKNOWLEDGMENTS

    This work was supported by Howard Hughes Medical Institute grant numbers GT16734 and GT14866. The opinions, findings, conclusions, and recommendations expressed in this work are those of the authors and do not necessarily reflect the views of the Howard Hughes Medical Institute. We thank the CC instructors who responded to our survey and made this research possible.

    REFERENCES

    • Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. Google Scholar
    • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Google Scholar
    • American Association of Community Colleges. (2023). Community college fast facts. Retrieved February 2024, from https://www.aacc.nche.edu/wp-content/uploads/2023/03/AACC2023_FastFacts.pdf Google Scholar
    • Andrews, T. (2019). Status of research-based reform in undergraduate life sciences education and levers for promoting reform. In S. Laursen (Ed.), Levers for change: An assessment of progress on changing STEM instruction (pp. 17–36). Washington, DC: American Association for the Advancement of Science. Google Scholar
    • Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE–Life Sciences Education, 10(4), 394–405. LinkGoogle Scholar
    • Apkarian, N., Henderson, C., Stains, M., Raker, J., Johnson, E., & Dancy, M. (2021). What really impacts the use of active learning in undergraduate STEM education? Results from a national survey of chemistry, mathematics, and physics instructors. PLoS One, 16(2), e0247544. MedlineGoogle Scholar
    • Aragón, O. R., Eddy, S. L., & Graham, M. J. (2018). Faculty beliefs about intelligence are related to the adoption of active-learning practices. CBE–Life Sciences Education, 17(3), ar47. LinkGoogle Scholar
    • Archie, T., Hayward, C. N., Yoshinobu, S., & Laursen, S. L. (2022). Investigating the linkage between professional development and mathematics instructors’ use of teaching practices using the theory of planned behavior. PLoS One, 17(4), e0267097. MedlineGoogle Scholar
    • Armitage, C. J., & Conner, M. (2001). Efficacy of the theory of planned behaviour: A meta-analytic review. British Journal of Social Psychology, 40(4), 471–499. MedlineGoogle Scholar
    • Austin, A. E. (2011). Promoting evidence-based change in undergraduate science education. Paper commissioned by the National Academies National Research Council Board on Science Education. Retrieved April 15, 2024, from http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf. Google Scholar
    • Bathgate, M. E., Aragón, O. R., Cavanagh, A. J., Waterhouse, J. K., Frederick, J., & Graham, M. J. (2019). Perceived supports and evidence-based teaching in college STEM. International Journal of STEM Education, 6(1), 11. MedlineGoogle Scholar
    • Benabentos, R., Hazari, Z., Stanford, J. S., Potvin, G., Marsteller, P., Thompson, K. V., … & Kramer, L. (2021). Measuring the implementation of student-centered teaching strategies in lower-and upper-division STEM courses. Journal of Geoscience Education, 69(4), 342–356. Google Scholar
    • Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. 1991 ASHE-ERIC higher education reports. ERIC Clearinghouse on Higher Education, The George Washington University, One Dupont Circle, Suite 630, Washington, DC 20036-1183. Google Scholar
    • Borda, E., Schumacher, E., Hanley, D., Geary, E., Warren, S., Ipsen, C., & Stredicke, L. (2020). Initial implementation of active learning strategies in large, lecture STEM courses: Lessons learned from a multi-institutional, interdisciplinary STEM faculty development program. International Journal of STEM Education, 7(1), 4. Google Scholar
    • Bradforth, S. E., Miller, E. R., Dichtel, W. R., Leibovich, A. K., Feig, A. L., Martin, J. D., … & Smith, T. L. (2015). University learning: Improve undergraduate science education. Nature News, 523(7560), 282. MedlineGoogle Scholar
    • Branchaw, J. L., Pape-Lindstrom, P. A., Tanner, K. D., Bissonnette, S. A., Cary, T. L., Couch, B. A., & Brownell, S. E. (2020). Resources for teaching and assessing the vision and change biology core concepts. CBE–Life Sciences Education, 19(2), es1. LinkGoogle Scholar
    • Brown, A. E., & Bickerstaff, S. (2021). Committing to instructional improvement in an era of community college reform. New Directions for Community Colleges, 2021(195), 129–142. Google Scholar
    • Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: lack of training, time, incentives, and…tensions with professional identity? CBE—Life Sciences Education, 11(4), 339–346. LinkGoogle Scholar
    • Chasteen, S. V., & Chattergoon, R. (2020). Insights from the physics and astronomy new faculty workshop: How do new physics faculty teach? Physical Review Physics Education Research, 16(2), 020164. Google Scholar
    • Colclasure, B. C., Marlier, A., Durham, M. F., Brooks, T. D., & Kerr, M. (2021). Identified challenges from faculty teaching at predominantly undergraduate institutions after abrupt transition to emergency remote teaching during the COVID-19 pandemic. Education Sciences, 11(9), 556. Google Scholar
    • Corbo, J. C., Reinholz, D. L., Dancy, M. H., Deetz, S., & Finkelstein, N. (2016). Framework for transforming departmental culture to support educational innovation. Physical Review Physics Education Research, 12(1), 010113. Google Scholar
    • Corwin, L. A., Kiser, S., LoRe, S. M., Miller, J. M., & Aikens, M. L. (2019). Community college instructors’ perceptions of constraints and affordances related to teaching quantitative biology skills and concepts. CBE–Life Sciences Education, 18(4), ar64. https://doi.org/10.1187/cbe.19-01-0003 LinkGoogle Scholar
    • Couch, B. A., Brown, T. L., Schelpat, T. J., Graham, M. J., & Knight, J. K. (2015). Scientific teaching: Defining a taxonomy of observable practices. CBE–Life Sciences Education, 14(1), ar9. LinkGoogle Scholar
    • Craig, B., & Hsu, J. L. (2024). A multi-year longitudinal study exploring the impact of the COVID-19 pandemic on students’ familiarity and perceptions of active learning. Active Learning in Higher Education. Google Scholar
    • Creech, C., Just, J., Hammarlund, S., Rolle, C. E., Gonsar, N. Y., Olson, A., … & Cotner, S. (2022). Evaluating the representation of community colleges in biology education research publications following a call to action. CBE–Life Sciences Education, 21(4), ar67. MedlineGoogle Scholar
    • Curran, P. J., West, S. G., & Finch, J. F. (1996). The robustness of test statistics to nonnormality and specification error in confirmatory factor analysis. Psychological Methods, 1(1), 16–29. Google Scholar
    • Dean, C. B., & Hubbell, E. R. (2012). Classroom Instruction that Works: Research-based Strategies for Increasing Student Achievement. Alexandria, VA: Association for Supervision and Curriculum Development. Google Scholar
    • Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-mode Surveys: The Tailored Design Method. Hoboken, NJ: John Wiley & Sons. Google Scholar
    • Driessen, E. P., Knight, J. K., Smith, M. K., & Ballen, C. J. (2020). Demystifying the meaning of active learning in postsecondary biology education. CBE–Life Sciences Education, 19(4), ar52. LinkGoogle Scholar
    • Durham, M. F., Knight, J. K., & Couch, B. A. (2017). Measurement Instrument for Scientific Teaching (MIST): A tool to measure the frequencies of research-based teaching practices in undergraduate science courses. CBE–Life Sciences Education, 16(4), ar67. LinkGoogle Scholar
    • Durham, M. F., Knight, J. K., Bremers, E. K., DeFreece, J. D., Paine, A. R., & Couch, B. A. (2018). Student, instructor, and observer agreement regarding frequencies of scientific teaching practices using the Measurement Instrument for Scientific Teaching-Observable (MISTO). International Journal of STEM Education, 5, 31. MedlineGoogle Scholar
    • Durham, M., Colclasure, B., & Brooks, T. D. (2022). Experience with scientific teaching in face-to-face settings promoted usage of evidence-based practices during emergency remote teaching. CBE–Life Sciences Education, 21(4), ar78. MedlineGoogle Scholar
    • Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York, NY: Random House. Google Scholar
    • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE–Life Sciences Education, 13(3), 453–468. LinkGoogle Scholar
    • Elrod, S. & Kezar, A. (2017). Increasing student success in stem: Summary of a guide to systemic institutional change. Change: The Magazine of Higher Learning. 49(301), 26–34. Google Scholar
    • Emery, N. C., Maher, J. M., & Ebert-May, D. (2020). Early-career faculty practice learner-centered teaching up to 9 years after postdoctoral professional development. Science Advances, 6(25), eaba2091. https://doi.org/10.1126/sciadv.aba2091 MedlineGoogle Scholar
    • Ernst, H., & Colthorpe, K. (2007). The efficacy of interactive lecturing for students with diverse science backgrounds. Advances in Physiology Education, 31(1), 41–44. MedlineGoogle Scholar
    • Ferrare, J. J., & Hora, M. T. (2014). Cultural models of teaching and learning in math and science: Exploring the intersections of culture, cognition, and pedagogical situations. The Journal of Higher Education, 85(6), 792–825. Google Scholar
    • Finelli, C. J., Nguyen, K., DeMonbrun, M., Borrego, M., Prince, M., Husman, J., & Waters, C. K. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching, 47(5), 80–91. Google Scholar
    • Foley, D., Milan, L., & Hamrick, K. (2021). The increasing role of community colleges among Bachelor's degree recipients: Findings from the 2019 National Survey of College Graduates. Alexandria, VA: National Center for Science and Engineering Statistics. Retrieved November, 10, 2023. Google Scholar
    • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. MedlineGoogle Scholar
    • Gao, S., Mokhtarian, P. L., & Johnston, R. A. (2008). Nonnormality of data in structural equation models. Transportation Research Record, 2082(1), 116–124.
    • Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216.
    • Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., … & Wood, W. B. (2004). Scientific teaching. Science, 304(5670), 521–522.
    • Harmon, M. C. (2017). Professional development as a catalyst for change in the community college science classroom: How active learning pedagogy impacts teaching practices as well as faculty and student perceptions of learning [Dissertation, Wingate University]. ProQuest LLC. Retrieved January 8, 2024, from https://www.proquest.com/docview/1945360878
    • Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics—Physics Education Research, 3(2), 020102.
    • Herreid, C. F., & Schiller, N. A. (2013). Case studies and the flipped classroom. Journal of College Science Teaching, 42(5), 62–66.
    • Hora, M. T., Ferrare, J., & Oleson, A. (2012). Findings from classroom observations of 58 math and science faculty. Retrieved from http://tdop.wceruw.org/Document/Research-report-Observationsof-STEM-Faculty-Spring2012.pdf
    • Johnson, N., Veletsianos, G., & Seaman, J. (2020). US faculty and administrators' experiences and approaches in the early weeks of the COVID-19 pandemic. Online Learning, 24(2), 6–21.
    • Lombardi, D., Shipley, T. F., Bailey, J. M., Bretones, P. S., Prather, E. E., Ballen, C. J., … & Docktor, J. L. (2021). The curious construct of active learning. Psychological Science in the Public Interest, 22(1), 8–43. https://doi.org/10.1177/1529100620973974
    • Madson, L., Trafimow, D., & Gray, T. (2017). Faculty members' attitudes predict adoption of interactive engagement methods. The Journal of Faculty Development, 31(3), 39–49.
    • McConnell, M., Montplaisir, L., & Offerdahl, E. G. (2020). A model of peer effects on instructor innovation adoption. International Journal of STEM Education, 7(1), 53.
    • Mesa, V. (2016). Mathematics education at US public two-year colleges. Retrieved January 8, 2024, from https://deepblue.lib.umich.edu/bitstream/handle/2027.42/117629/Mesa%20MathEd2YRColl_PreProduction.pdf;sequence=1
    • Mesa, V., Wladis, C., & Watkins, L. (2014). Research problems in community college mathematics education: Testing the boundaries of K-12 research. Journal for Research in Mathematics Education, 45(2), 173–192.
    • Michael, J. (2007). Faculty perceptions about barriers to active learning. College Teaching, 55(2), 42–47.
    • Moon, S., Jackson, M. A., Doherty, J. H., & Wenderoth, M. P. (2021). Evidence-based teaching practices correlate with increased exam performance in biology. PLoS One, 16(11), e0260789.
    • National Center for Education Statistics (2023). Retrieved January 8, 2024, from https://nces.ed.gov/programs/coe/indicator/cha/undergrad-enrollment#5
    • Nguyen, K. A., Borrego, M., Finelli, C. J., DeMonbrun, M., Crockett, C., Tharayil, S., … & Rosenberg, R. (2021). Instructor strategies to aid implementation of active learning: A systematic literature review. International Journal of STEM Education, 8, 9.
    • Petersen, C. I., Baepler, P., Beitz, A., Ching, P., Gorman, K. S., Neudauer, C. L., … & Wingert, D. (2020). The tyranny of content: "Content coverage" as a barrier to evidence-based teaching approaches and ways to overcome it. CBE–Life Sciences Education, 19(2), ar17.
    • Rahman, T., & Lewis, S. E. (2020). Evaluating the evidence base for evidence-based instructional practices in chemistry through meta-analysis. Journal of Research in Science Teaching, 57(5), 765–793.
    • Rasmussen, C., & Ellis, J. (2015). Calculus coordination at PhD-granting universities: More than just using the same syllabus, textbook, and final exam. Insights and Recommendations from the MAA National Study of College Calculus, 107–115.
    • Reinholz, D. L., & Andrews, T. C. (2020). Change theory and theory of change: What's the difference anyway? International Journal of STEM Education, 7(1), 2.
    • Richardson, D. S., Bledsoe, R. S., & Cortez, Z. (2020). Mindset, motivation, and teaching practice: Psychology applied to understanding teaching and learning in STEM disciplines. CBE–Life Sciences Education, 19(3), ar46.
    • Riedl, A., Yeung, F., & Burke, T. (2021). Implementation of a flipped active-learning approach in a community college general biology course improves student performance in subsequent biology courses and increases graduation rate. CBE–Life Sciences Education, 20(2), ar30.
    • Ruiz-Primo, M. A., Briggs, D., Iverson, H., Talbot, R., & Shepard, L. A. (2011). Impact of undergraduate science course innovations on learning. Science, 331(6022), 1269–1270.
    • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … & Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE–Life Sciences Education, 16(2), mr1.
    • Shadle, S. E., Marker, A., & Earl, B. (2017). Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. International Journal of STEM Education, 4(1), 8.
    • Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE–Life Sciences Education, 16(1), rm1.
    • Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., … & Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470.
    • Sturtevant, H., & Wheeler, L. (2019). The STEM Faculty Instructional Barriers and Identity Survey (FIBIS): Development and exploratory results. International Journal of STEM Education, 6(1), 35.
    • Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education, 5, 7.
    • Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … & Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483.
    • Townsend, B. K., & Rosser, V. J. (2007). Workload issues and measures of faculty productivity. Thought & Action, 23(1), 7–19.
    • Tripp, B., Cozzens, S., Hrycyk, C., Tanner, K. D., & Schinske, J. N. (2024). Content coverage as a persistent exclusionary practice: Investigating perspectives of health professionals on the influence of undergraduate coursework. CBE–Life Sciences Education, 23(1), ar5.
    • Twombly, S., & Townsend, B. K. (2008). Community college faculty: What we know and need to know. Community College Review, 36(1), 5–24.
    • United States Bureau of Labor Statistics, Division of Occupational Employment and Wage Statistics. Retrieved January 8, 2024, from https://www.bls.gov/oes/current/oes251042.htm#(1)
    • Wang, X., Wang, Y., Wickersham, K., Sun, N., & Chan, H. (2017). Math requirement fulfillment and educational success of community college students. Community College Review, 45(179), 99–118. https://doi.org/10.1177/0091552116682829
    • Weir, L. K., Barker, M. K., McDonnell, L. M., Schimpf, N. G., Rodela, T. M., & Schulte, P. M. (2019). Small changes, big gains: A curriculum-wide study of teaching practices and student learning in undergraduate biology. PLoS One, 14(8), e0220900.
    • Wieman, C. (2017). Improving How Universities Teach Science. Cambridge, MA: Harvard University Press.
    • Wise, S. B., Archie, T., & Laursen, S. (2022). Exploring two-year college biology instructors' preferences around teaching strategies and professional development. CBE–Life Sciences Education, 21(2), ar39.