
It Takes Two: Online and In-person Discussions Offer Complementary Learning Opportunities for Students

    Published Online: https://doi.org/10.1187/cbe.23-04-0062

    Abstract

    Discussions play a significant role in facilitating student learning through engagement with course material and promotion of critical thinking. Discussions provide space for social learning where ideas are deliberated, internalized, and knowledge is cocreated through socioemotional interactions. With the increase of internet-based and hybrid courses, there is a need to evaluate the degree to which online discussion modalities facilitate quality discussions and enhance student achievement. We assessed the effectiveness of asynchronous online discussion boards and traditional face-to-face discussions via qualitative (thematic coding and discussion network analysis) and quantitative (Bloom’s taxonomy) techniques and evaluated student perceptions via precourse and postcourse surveys. We found differential strengths of the two formats. Online discussions increased response complexity, while in-person discussions fostered improved connections with course material. Themes related to sharing of personal identity, humanity, and verbal immediacy were more frequent throughout in-person discussions. Survey responses suggested that a sense of community was an external motivator for preference of in-person discussions, while anxiety was a factor influencing online discussion preference. Our findings suggest that online and in-person discussions are complementary, and work in tandem to facilitate complex student thinking through online environments and social learning within the classroom.

    INTRODUCTION

    It has been widely established that discussions play a crucial role in facilitating student learning by increasing student engagement (Bennett et al., 2010; Murphy et al., 2020). In turn, this student interaction and participation fosters crucial critical thinking skills (Mollborn and Hoekstra, 2010). Discussions provide a key opportunity to build skills that transfer across curricular programs through mechanisms such as verbal communication with peers (Ellis and Calvo, 2004; Havnes et al., 2016). Talk plays a fundamental role in mediating high-level comprehension (Murphy et al., 2020). Discussions provide opportunities to reflect on ideas, develop new perspectives, and raise questions about course material. Additionally, discussions provide essential space to build belongingness within the learning community, which increases emotional participation through feelings that learners matter to one another and to the group, especially when discussing socioecological topics (Isohätälä et al., 2018; Tice et al., 2021). However, with the increase of online and hybrid courses (Garrison and Vaughan, 2008; Allen and Seaman, 2013; Protopsaltis and Baum, 2019), traditional discussion formats are being replaced by alternatives such as asynchronous online discussion boards (Dennen and Wieland, 2007; Champion and Gunnlaugson, 2018; Fehrman and Watson, 2021). There is a continued need to evaluate the effectiveness of online learning tools, such as discussion boards, compared with traditional in-person discussions in promoting collaborative learning and motivation toward success.

    Discussions provide effective means for learners to criticize, revise, or supplement others’ proposed perspectives while aiming to resolve conflict and reach a common conclusion (Weinberger et al., 2007). While the effectiveness of the discussion process is supported (Murphy et al., 2020), asynchronous online discussions have been widely incorporated without strong consensus on best practices (Fehrman and Watson, 2021). Because asynchronous online discussions are organized by message timing and reply order, they take on a hierarchical structure that makes it difficult to gain an overview of discussion content and overall conclusions (Gao et al., 2013). This hierarchical structure also fragments information, forcing learners to sift through threads to find relevant content, which can impose a high information load and lower the effectiveness of online asynchronous discussion (Darabi and Jin, 2013). Despite these potential drawbacks, one report found that students exchanged more messages online than in person, suggesting greater involvement in learning (Paskey, 2001). Additionally, in a study evaluating the effectiveness of voluntary discussion forums in a higher education setting, researchers found that participation in forums improved exam performance (Cheng et al., 2011). As demonstrated by these mixed results, additional work is needed to establish best practices for online discussions as an effective means of learning.

    When evaluating student attitudes and perceptions, online learning environments often receive mixed responses. Students report higher levels of anxiety when participating in online course instruction due to increased difficulty staying on top of tasks, building connections with peers, and resolving technological complications (Mohammed et al., 2021). Tiene (2000) found that when students were given a choice between the two modalities for class discussions, there was a preference for in-person discussions, despite their appreciation for the flexibility and continuous access to material through asynchronous virtual learning. Authenticity within face-to-face discussions is also bolstered by visual cues that assist students in clarifying the meaning of responses more immediately than in online discussions, which for some helped alleviate worry (Wang and Woo, 2007). To assess these differences in learning environments, we need to gauge levels of content comprehension and synthesis achieved by each discussion type as well as reach a better understanding of students’ perceptions of the two discussion modalities for a holistic approach toward success in undergraduate education. Furthermore, as technology changes rapidly, there is a need to continually reevaluate how students respond to new and updated modes of learning.

    CONCEPTUAL FRAMEWORK

    Peer-to-peer interactions are a crucial facet of student learning. Social-cognitive psychologist Lev Vygotsky developed his Sociocultural Theory in 1978, which stressed the importance of social interaction in cognitive development (Vygotsky, 1978). Albert Bandura later adapted Vygotsky’s ideas to create Social Learning Theory, which expanded the realm of social influences to include peers (Bandura and Walters, 1977) and the understanding that behaviors are influenced by the observation of others and through direct personal experience. In social learning theory, social interaction is required for learning to occur; thus, principles of the theory have been essential in establishing the efficacy of group work and collaborative learning in higher education (Bowen, 2000; Volet et al., 2009). For example, Smith et al. (2009) assessed the efficacy of peer discussion and found that students who participated in group discussions following a clicker question were better able to apply their learning to novel questions relating to the concepts discussed. Additionally, Preszler (2009) found that incorporating peer-led workshops increased student performance on exams, specifically on questions that required higher-level thinking.

    The opportunity to work with and learn from other students can serve as a source of motivation. Eun (2010) states that social interaction between two or more people is “the greatest motivating force in human development,” which was the basis of Vygotsky’s, and later Bandura’s, research (Eun, 2010). Jensen and Lawson (2011) echoed this sentiment and asserted that an environment of student discussion can facilitate learning because this creates a safe space for students to learn and bond with their peers, which can motivate them to contribute to group assignments and discussion (Jensen and Lawson, 2011). In a study examining collaborative learning in higher education, researchers found that promotive interactions among students within discussions enhanced their experience and aided their learning in how to discuss, voice their opinion, explain, listen to others, accept feedback, and reflect on their own work (Scager et al., 2016). Further, the support and motivation brought forth by peers within discussions emerged as important contributors to learning as students within the study reported the positive influences of explicit help, pep talks, and implicit mutual inspiration.

    A collaborative environment alone will not make a student learn the course material more successfully than they would in a traditional classroom setting, but collaborative learning methods provide the opportunity for students to be inspired by the ideas of their peers, which can in turn cause them to question, reflect on, and refine their own ideas and beliefs (Laal and Ghodsi, 2012). As discussed by Bandura, “seeing people similar to oneself succeed by sustained effort raises observers’ beliefs that they too possess the capabilities [to] master comparable activities to succeed” (Bandura and Wessels, 1994). Modeling by others, also known as observational learning, is a highly effective strategy to catalyze student confidence and build a culture where social learning promotes the molding of lifelong learners (Shortridge-Baggett, 2000). When students are properly motivated to contribute to their own active learning and collaborative group activities, they respond positively to peer-to-peer interactions and report that social learning methods improved their learning capabilities, confidence, and critical thinking skills (Hurst, Wallace, and Nixon, 2013; Edinyang, 2016).

    While previous research has established the necessity of student collaboration in undergraduate courses, most of these studies have been in the context of in-person student discussions and projects (Gasiewski et al., 2012; Liebech-Lien and Sjølie, 2021). Recently, however, there has been an increase in the prevalence of online courses and assignments, which was amplified by the onset of the COVID-19 pandemic. The consequences of the pandemic forced educators and students alike to adapt their approaches to education, which created an unprecedented shift in the direction of higher education. Online courses have now become a staple of university learning, with online classroom tools being used more than ever, even in classes that have an in-person component. Recent research highlights the challenges and advances in effective collaborative student learning within virtual classrooms (Kauppi et al., 2020; Singh et al., 2021). Specifically, classrooms inherently promote a dynamic social environment with various semiotic tools brought forth by students’ social presence, while online learning uses writing as its form of interaction, which may feel removed from social interaction (Islam et al., 2022). For example, spoken, conversational language is the most effective means of social interaction and is the collaborative method most often implemented in classroom settings (Springer et al., 1999; Kober, 2015). Social presence influences social interaction and is thus a critical aspect of social learning. Some research suggests that lack of social presence leads to less interaction, increases learner frustrations and questioning of instructor effectiveness, and lowers affective learning overall (Rifkind, 1992). Therefore, online instructors are encouraged to maximize social presence among students and between student and professor (Tu, 2002).
Within online learning environments that are often mediated by discussion boards, learners establish an online social presence by posting and responding to one another—but is this presence enough to promote the social interaction required for social learning to occur?

    CURRENT STUDY

    With the unique opportunity presented due to the COVID-19 pandemic, we used an Environmental Biology course that pivoted to a hybrid modality to assess differences in critical thinking and discussion quality between asynchronous online and in-person discussion formats. Traditional in-class discussions were facilitated by instructor questions and students were encouraged to discuss freely. Online discussion board responses were completed asynchronously in isolation and without access to visual cues, relying instead on written language, an important semiotic means of social connection. Utilizing this hybrid course design, we asked:

    1. How does the level of complexity (measured by Bloom’s taxonomy) reached by students differ between online and in-class discussions?

    2. What are differences in discussion quality (measured by social interactions, stimulation of personally relevant connections, and number of topics discussed) between students participating in online and in-class discussions?

    3. What are students’ perceptions of the two discussion modalities and how do they change from precourse to postcourse?

    4. How is discussion modality connected to subsequent confidence and performance on summative assessments?

    We answered these questions using both quantitative and qualitative methods. Given our grounding in social learning theory, we hypothesized that because the online environment has limited social presence and does not offer visual cues related to observational learning and motivation, students conducting in-person discussions would engage in deeper thinking, which would in turn result in more complex responses and better performance on summative assessments. We also hypothesized that the more socially rich, in-person environment would stimulate discussion of more numerous and personally relevant topics.

    We implemented prediscussion and postdiscussion surveys to ask students about their preconceived notions of discussions, their preferences, any shift in their perceptions of discussions over the course, and the motivating factors behind their stances. Due to the COVID-19 lockdown, students lacked in-person communication, a vital part of learning. We therefore hypothesized that students would place higher importance on belongingness within the learning community and would therefore prefer in-class over online discussions.

    METHODOLOGY

    This project was determined to be exempt by the Institutional Review Board at the University of Louisville (IRB ID no. 20.0696).

    Study Context and Course Format

    This study was conducted at a large public research institution in the Southeast United States during the fall semesters of 2020 and 2021. Combining the two sections, we had a total of 44 students. All students in the study were registered for the course and there were no repeat students in 2021 from the 2020 cohort. The 2020 course consisted of 23 students who completed the course (6 first-year students, 6 second-year students, 7 third-year students, 4 graduating students) and in 2021 there were 21 students who completed the course (15 first-year students, 5 second-year students, 1 third-year student).

    We led the study using the course BIOL 263: Environmental Biology. BIOL 263 is a general education course that includes both a lecture and lab component and is focused on ecological systems and how anthropogenic forces exert pressures on the environment. This course was taught using a hybrid method for both semesters. The course was cotaught by two instructors, with one remaining the same between the semesters and one changing.

    This hybrid course included discussions for each unit of the course. Discussions were focused on content that had not yet been taught in class. Students were assigned a detailed reading or listening piece to complete before coming to the discussion, so the discussion space could be used to gain a deeper introduction to the assigned concept.

    Because changes to COVID-19 university policies restricted maximum in-person attendance to 12 students, the number of discussions that took place differed between semesters. In 2020, there were four discussion topics: 1) Agriculture, 2) Genetically Modified Organisms (GMOs), 3) Environmental Law, and 4) Environmental Justice. In 2021, there were only two discussion topics: 1) GMOs, and 2) Environmental Justice. Each semester, students were randomly assigned to one of two groups (Group A or Group B) of 10–14 students that alternated between online and in-person formats for each discussion topic. For example, 2020 students in Group A were online for Agriculture and Environmental Law and in-person for GMOs and Environmental Justice. Students in Group B were the opposite—in-person for Agriculture and Environmental Law and online for GMOs and Environmental Justice. These groups stayed the same throughout the semester, so by the end of the 2020 course both groups had experienced each discussion format twice. In 2021, groups of students experienced each discussion format only once. Relevant information regarding differences in timing and interaction between the discussion modalities is presented in Table 1.

    TABLE 1. Comparison of relevant differences between the online and in-person discussion modalities

    Allotted time
     Online: access to all prompt threads for 24 h.
     In-person: 75 min (separate prompts were given approximately equal time).
    Instructor involvement
     Online: none.
     In-person: limited to occasional technical corrections (e.g., a student repeatedly used the term “monocultivation” instead of the correct term “monoculture” and was corrected for accuracy).
    Grading instructions
     Online: written instructions on Blackboard requiring students to make at least one original comment and respond to at least two comments from other students for each prompt.
     In-person: students were verbally encouraged to contribute at least twice during each discussion session and told that the quality of their input would also factor into grading.

    In-person discussions were carried out during regular class hours and the audio of the discussion was recorded. For in-class discussions, each discussion prompt was read aloud by the instructor and displayed on a PowerPoint slide that was visible throughout the discussion. Prompts were ordered and presented one at a time. A discussion based on a prompt ended at the instructor’s discretion. At the discussion’s outset, students were informed that they must contribute at least twice during each discussion, and that the quality of their input would also factor into grading. There was no requirement for comments to be original or responses to other students. Beyond those minimum requirements, students were graded on the quality and quantity of their contributions. Audio recordings were later transcribed by three authors (K.G., A.H., N.S.).

    Online discussions were carried out on the Blackboard Learning Management System. Each prompt had its own discussion thread. Prompts comprised the same questions used for the in-class discussions. Students had the ability to post an original comment or respond to another student’s comment, and continuously reply to comments in a thread-style format. Online discussions opened at the start of the scheduled class time, in concert with the start of in-class discussions. The students had access to all threads simultaneously and had 24 h to contribute before access was restricted. Students were given instructions on Blackboard to post at least one original comment and reply to at least one original comment made by another student. Like the in-class discussions, students were told that the quality of their input would factor into grading.

    In each semester, two exams were assigned: one at the midpoint of the course and one at the end, both taken online and available for 24 h. Exams consisted of short answer and essay questions. Students were allowed to choose any five out of seven short answer questions and any three out of five essay questions to complete. Exam questions covered topics from the discussions, as well as course topics outside the discussion topics. For this study, we restricted our analysis to discussion-related questions.

    Instruments

    Critical Thinking.

    To evaluate content complexity reached by students, all 763 discussion responses from both in-person audio transcripts and online discussion board written responses were categorized into Bloom’s taxonomy levels (1–6) by two authors (K.G., A.H.) trained in assessing these specific levels (Bloom, 1956). For our study, we define critical thinking as conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication. Additionally, the upper levels of Bloom’s taxonomy, which consist of analysis, synthesis, and evaluation, are often used as proxies to assess critical thinking (Ennis, 1991; Hager et al., 2003). We used verbs defined in each hierarchical tier to assess the level of critical thinking displayed in student responses and decided on a Bloom’s level for each response. We tested inter-rater reliability and consistency in our categorization by analyzing 149 responses scored by both raters. We measured the agreement between the two raters using the quadratic weighted kappa for ordinal inter-rater reliability. Using quadratic weights allows for differentiation between disagreeing scores that are similar to one another (e.g., Bloom’s levels 1 vs. 2) and more heavily penalizes scores that are very different (e.g., Bloom’s levels 1 vs. 3). Using the statistical software R (R Core Team 2018), v. 3.5.1, with the “Metrics” package (Hammer and Frasco, 2018) and its ScoreQuadraticWeightedKappa function, our inter-rater reliability score is considered sufficient (kappa = 0.68). Author A.H. was the only rater for the other 614 responses. Data from all discussions for both years were combined with their corresponding modality to be analyzed. We employed a proportional odds regression model to examine the relationship between students’ Bloom’s taxonomy scores and the predictor variables.
The full model initially included the factors “Format” (representing online or in-person discussions), “Year” (indicating the two academic years), and their interaction, as well as the interaction between “Topic” (the different discussion topics throughout the semester) and “Format.” Model selection was then performed using backward elimination; the final simplified model retained only “Format” as a predictor. The models were fitted using the “polr” function from the MASS package in R. The proportional odds assumption was evaluated, and candidate models were compared using the Akaike information criterion (AIC), opting for the model with the lowest AIC value to balance model complexity and goodness of fit. Consequently, we employed a single odds ratio (OR) to characterize the effect of discussion format on Bloom’s taxonomy level.
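To make the reported effect concrete: under the proportional odds assumption, a single OR multiplies the odds of scoring at or below each Bloom’s level by the same factor at every cut-point. The sketch below illustrates that interpretation in Python (our analysis itself used R’s MASS::polr, as described above); the cumulative probabilities are invented for illustration and are not fitted values from our data.

```python
def apply_odds_ratio(cum_probs, odds_ratio):
    """Shift cumulative category probabilities P(Y <= k) by one odds ratio,
    as the proportional odds model assumes for a binary predictor."""
    shifted = []
    for p in cum_probs:
        odds = p / (1.0 - p) * odds_ratio  # same multiplier at every cut-point
        shifted.append(odds / (1.0 + odds))
    return shifted

# Illustrative cumulative probabilities P(Bloom's level <= k) for k = 1..5
# under one discussion format (P(level <= 6) is always 1 and is omitted).
baseline = [0.10, 0.30, 0.60, 0.85, 0.95]

# An OR below 1 lowers every cumulative probability, shifting probability
# mass toward higher Bloom's levels in the comparison format.
shifted = apply_odds_ratio(baseline, 0.5)
```

Because the same multiplier applies at every cut-point, the format effect can be summarized by one number, which is why backward elimination to a Format-only model yields a single interpretable OR.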

    Discussion Quality: Topic Richness and Relevance.

    To further our understanding of the relevance and transfer of the knowledge discussed, we also analyzed the four GMO discussions (two online, n = 97 comments; two in-person, n = 178 comments) for topic richness and connection of ideas. After reading through all discussion topics, we chose to focus on the GMO discussions for this analysis and the following qualitative analyses due to this discussion’s coverage of various social, economic, political, and personal topics, allowing for substantial expansion on ideas by students. Authors built discussion conversation networks using student responses to each discussion question posed by the instructor. A descriptive example of these hand-created networks is displayed in Figure 1. As the discussion questions were the same for online and in-person discussions across years, we combined the two online and two in-person discussions to build the discussion networks. Student responses that contributed an idea were linked to the original discussion questions, topics that emerged following the original comment were connected to the corresponding comment, and topics where students made connections between one another were further connected with another link. Links were followed by corresponding nodes representing a categorization of response type. Responses were coded as recalling in-class examples, making relevant connections to new examples, or transferring information to a new scenario. Here, relevance refers to learning experiences that are either directly applicable to the personal interests or cultural experiences of students, or that are connected in some way to real-world issues, problems, and contexts (Newton, 1988). Transfer is defined here as a cognitive practice whereby a learner’s mastery of knowledge or skills in one context enables them to apply that knowledge or skill in a different context (Barnett and Ceci, 2002).
These different codes were then counted to compare the interaction differences between the two modalities. Authors N.S. and K.G. coded questions one and two for both the online and in-person discussions for inter-rater reliability. To compare the number of concept connections (represented by links), we used the statistical software R (R Core Team 2018), v. 3.5.1, with the “Metrics” package (Hammer and Frasco, 2018) and its ScoreQuadraticWeightedKappa function. Our inter-rater reliability score is considered sufficient (kappa = 0.71). We used Cohen’s Kappa, a quantitative measure of reliability for two raters rating the same dataset, corrected for how often they may agree by chance (McHugh, 2012), to evaluate the knowledge categories (represented by different nodes) as either a repeated course example, connection to outside example, or transfer of knowledge to a new idea. Using the statistical software R (R Core Team 2018), v. 3.5.1, with the “irr” package (Matthias et al., 2012), our inter-rater reliability score is considered substantial (kappa = 0.77). Based on satisfactory inter-rater reliability validated by reported kappa values, we continued forward with author K.G. as the primary coder and creator of the remaining discussion networks.
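The two agreement statistics treat disagreements differently: quadratic weighted kappa (used for the ordinal Bloom’s scores and link counts) gives partial credit for near-misses, while unweighted Cohen’s kappa (used for the nominal node categories) treats all disagreements equally. Below is a self-contained Python sketch of the weighted statistic, mirroring what R’s Metrics::ScoreQuadraticWeightedKappa computes; the ratings are toy values, not our data.

```python
import numpy as np

def quadratic_weighted_kappa(rater_a, rater_b, n_levels):
    """Chance-corrected agreement for ordinal ratings on levels 1..n_levels,
    penalizing each disagreement by the squared distance between scores."""
    observed = np.zeros((n_levels, n_levels))
    for a, b in zip(rater_a, rater_b):
        observed[a - 1, b - 1] += 1
    # Expected counts under chance agreement, from each rater's marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    levels = np.arange(n_levels)
    weights = (levels[:, None] - levels[None, :]) ** 2  # squared-distance penalty
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Adjacent-level disagreements are penalized far less than distant ones,
# so kappa stays high when raters differ by only one level.
near = quadratic_weighted_kappa([1, 2, 3, 1, 2, 3], [2, 3, 4, 1, 2, 3], 6)
far = quadratic_weighted_kappa([1, 2, 3, 1, 2, 3], [4, 5, 6, 1, 2, 3], 6)
```

Here `near` exceeds `far` even though the raters disagree on the same three responses in both cases, which is the behavior that makes the weighted statistic appropriate for ordinal Bloom’s levels.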

    FIGURE 1.

    FIGURE 1. Descriptive example of a discussion topic network.

    For example, within an in-class discussion, in response to the discussion prompt asking “A major criticism of GMO research is that scientists involved often have conflicts of interest. Does this make you skeptical?”, a student response we coded as transfer stated, “Yeah, I agree. I think a lot of people are already skeptical going in to reading these but it would definitely make me a lot more skeptical, like, for instance, if I read a peer reviewed article on why climate change isn’t real and it’s been paid for by BP oil. Like obviously they have a conflict of interest there, so I’m going to be a lot more skeptical on what the study is saying. I think, and I kind of touched on this earlier with lobbyists in the country and the way our politics works, there are plenty of industries that this definitely happens in. Especially right now we’re seeing it a lot with pharmaceutical companies taking advantage of scientific studies and things like that, and using them to say maybe there is not a problem, or maybe the problem is not as bad as you know people want you to think, but in reality it really is. I think it does make you a lot more skeptical and I think that’s a good thing.” This comment demonstrates sufficient transfer of the information learned within the course to relate it to the larger political climate of lobbying, and further gives a concrete example that demonstrates application to another unique topic of pharmaceuticals. Following this comment, relevant connections were made such as, “So going off that, if corporations can falsify [falsely advertise information] on drugs or alcohol, I think it would be biased,” then, “It sounds like Tyson [chicken]…[if] Tyson released an article about GMOs I would say that it is bias and I wouldn’t believe a lot of the things in it.” These examples of relevant comments show connections to new, undiscussed topics within the course that are personally relevant, largely through culture.

    Discussion Quality: Qualitative Analysis of Social Cognitive Development.

    We conducted a qualitative thematic analysis using student discussion responses, specifically for the four GMO discussions. Two authors served as primary coders (K.G., A.H.) using the constant comparative method, which involves multiple stages of collecting, refining, and categorizing data while making constant comparisons to obtain themes that are grounded in the data (Saldaña, 2009). Utilizing the constant comparative method allowed unique themes present in the data to emerge rather than imposing them prior to data collection; hence, no preexisting categories from existing theories were used before coders began analysis. We started by analyzing responses to gain a general understanding of the main theme, which emerged as sociocultural cognitive development. We went through three subsequent rounds of open coding, which allowed for an open stance toward reflection on potential interpretations of the data and categorizations until final themes (Table 2) were established (Khandkar, 2009). Coders negotiated and discussed definitions of codes throughout the process, condensing them to single words or short phrases until coders were satisfied and in agreement that all data had been analyzed. These refined codes were used to generate a final codebook that best described the discussion-based data. The codebook was then applied to the entire dataset; the resulting themes and subthemes are presented in the results. All coding was completed by hand and frequencies of themes and subthemes were tallied (Table 2). Themes and their corresponding counts were split by their accompanying modality to compare the frequency of themes in online and in-person discussion environments.

    TABLE 2. Qualitative analysis results of social cognitive development from online and in-person discussion transcripts

    Theme and subtheme (example phrase): frequency by modality
    Humanity (Online: 61; In-person: 105)
     Social: “it’s (GMO’s) kind of glamorized and people are like “oh look at the box, it says non-GMO so it’s healthy” and stuff like that” (Online: 24; In-person: 33)
     Structural: “The Southern hemisphere is distrustful…Why would they trust anything that we bring to them? We don’t help them at all.” (Online: 25; In-person: 41)
     Environmental: “The ecosystem isn’t built to compete with something that is genetically manufactured” (Online: 12; In-person: 31)
    Personal identity (Online: 89; In-person: 142)
     Feeling: “Something I’m not comfortable with…” (Online: 12; In-person: 20)
     Reflection: “I’m starting to question it too…” (Online: 35; In-person: 31)
     Opinion: “I do think it’s a good idea…” (Online: 28; In-person: 54)
     Anecdote: “I grew up in a rural area…” (Online: 7; In-person: 17)
     Relatability: “This relationship reminds me much of the one us students face…” (Online: 7; In-person: 20)
    Verbal immediacy (Online: 36; In-person: 108)
     Phrase: “I don’t know if its off topic”; “its kind of funny” (Online: 0; In-person: 33)
     Question: “Is that what you meant?” (Online: 6; In-person: 24)
     Acknowledgment: “Good points from both sides the pros that he brought up the cons that she brought up” (Online: 30; In-person: 51)

    Perception Survey

    In addition to the discussions, we assigned prediscussion and postdiscussion surveys to examine changes in student perception of online versus in-person discussion environments (Table 3). The prediscussion survey was given at the beginning of the course before any discussions had taken place and was available for 1 wk. The postdiscussion survey was given after the last discussion and remained available for 1 wk. The surveys were completed online in Blackboard, were not anonymous, and a small amount of extra credit was given to incentivize completion.

    TABLE 3. Perception survey questions deployed precourse and postcourse

    Survey questions
    Precourse:
    • Have you participated in discussions for a university level course before? (either online or in-person)

    • If yes, what format were the discussions? Online or in-person?

    • If you have participated in discussions, did you enjoy them? Why or why not?

    • In this class, we will hold two types of discussions—online and in-person. Which type of discussion (online or in-person) do you think you will enjoy most? Why?

    Postcourse:
    • Which format of discussions did you prefer? Why?

    • Which format was easier to prepare for? Why?

    • Which format do you think you learned more from? Why?

    • Do you think the material for the discussion influenced your preferences?

    Authors (K.G., A.H.) performed a similar qualitative analysis on the survey responses, using the constant comparative method described above, to gain insight into precourse and postcourse student perceptions of discussion modalities. Frequencies of final themes for the perception surveys were combined, tallied, and presented (Table 4). Answers to binary questions were also counted to examine shifts in student perceptions after students had experienced the discussions.

    TABLE 4. Themes from qualitative analysis and corresponding frequencies tallied from combined precourse and postcourse survey results of all students in the course who had participated in both in-person and online discussions throughout their semesters

    Theme: Frequency

    Learning motivation/rationale
    • Internal: 42
    • External: 12
    Sense of community: 46
    Social openness: 13

    Student Confidence and Performance

    To examine how participation in online versus in-person discussions affected confidence and performance on exams, we used the likelihood that a student would select a given exam question as a proxy for postdiscussion confidence, and then assessed their performance on that question if selected. As mentioned above, students were allowed to choose any five of seven short-answer questions and three of five essay questions on these exams. We ran a binary generalized linear mixed model (GLMM; R package lme4, Bates et al., 2015) to test the likelihood that a student would select a test question for which they had participated in an online or in-person discussion. Student selection (yes, no) was the binary response, discussion format was included as a fixed effect, and discussion topic (e.g., GMO, Agriculture) and student ID were included as random effects. To test performance, we ran a Gaussian GLMM with student score for each test question, using only the questions that students chose to answer. The same model design was used for both GLMMs. Data were pooled across both years.
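Fixed-effect estimates from a binary (logit-link) GLMM are on the log-odds scale; exponentiating the estimate gives an odds ratio, and exponentiating a Wald interval gives its confidence bounds. A small sketch of that conversion (not the authors' lme4 workflow, which was run in R):

```python
import math

def logit_to_odds_ratio(estimate, std_error, z=1.96):
    """Convert a logit-scale fixed-effect estimate into an odds ratio
    with an approximate 95% Wald confidence interval."""
    point = math.exp(estimate)
    lower = math.exp(estimate - z * std_error)
    upper = math.exp(estimate + z * std_error)
    return point, (lower, upper)

# Example: a logit-scale estimate of -0.706 with standard error 0.378
point, (lower, upper) = logit_to_odds_ratio(-0.706, 0.378)
```

An estimate of −0.706, for instance, corresponds to an odds ratio of roughly 0.49, i.e., about half the odds relative to the reference format, with a confidence interval that crosses 1.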

    RESULTS

    Bloom’s Taxonomy

    A single odds ratio (OR) was used to describe the effect of discussion format on Bloom’s taxonomy level. The online discussion format was associated with increased odds of reaching a higher Bloom’s taxonomy level compared with in-person discussions, regardless of the threshold (Figure 2; in-person vs. online OR = 0.283, 95% confidence interval [0.215–0.372], p < 0.001). The mean Bloom’s taxonomy score for online responses was 3.12, whereas the mean for in-person responses was 2.23. Level 6, the highest level of Bloom’s taxonomy, at which students create original ideas, was achieved 15 times within online discussion boards and only four times during in-person discussions.

    FIGURE 2.

    FIGURE 2. The stacked plot illustrates the effect of different discussion modalities on the probability of response based on Bloom’s taxonomy. The model, a proportional odds logistic regression, was fitted to the data using the discussion format variable. Each colored segment represents a different Bloom’s taxonomy response level from 1 to 6. The y-axis denotes the probability of response, and the x-axis indicates the discussion modality.
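In a proportional odds model like the one underlying Figure 2, each format's stacked probabilities come from differencing cumulative logistic probabilities at the model's cutpoints. A sketch with hypothetical cutpoints (the fitted thresholds are not reported here) illustrates the mechanics:

```python
import math

def category_probabilities(cutpoints, eta):
    """Per-level probabilities implied by a proportional odds model.

    cutpoints: increasing thresholds theta_1 < ... < theta_{K-1}
    eta: linear predictor for a given discussion format
    P(Y <= j) = 1 / (1 + exp(-(theta_j - eta))); level probabilities
    are differences of adjacent cumulative probabilities.
    """
    cumulative = [1.0 / (1.0 + math.exp(-(t - eta))) for t in cutpoints]
    cumulative.append(1.0)  # P(Y <= K) = 1 for the top level
    return [cumulative[0]] + [
        cumulative[j] - cumulative[j - 1] for j in range(1, len(cumulative))
    ]

# Six Bloom's levels imply five cutpoints (hypothetical values here);
# eta = 0 for the reference format, shifted by the format coefficient otherwise
probs = category_probabilities([-2.0, -0.5, 0.5, 1.5, 3.0], eta=0.0)
```

Shifting eta by the format coefficient moves probability mass toward higher levels, which is what the stacked segments in Figure 2 visualize for each modality.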

    Following the discussion question, “A few specific GMO examples were given at the end of the lecture. Choose one and either defend it (i.e., the benefits outweigh the costs) or make a case against its implementation,” an example of a higher-order response was: “The best example of GMO use at the end of the lecture was the creation of a rice with greater beta-carotene for the production of vitamin A in consumers. This use of GMO’s—creating a solution to a problem in a way that is both ethical and pragmatic—stands out as a better use of GMO’s than strictly creating an environmentally stronger crop. After learning that many developing countries were skeptical and many political battles ensued, I realize that though this is a great usage of the technology, it would be BEST if we not only created new crops, but taught the people we created the product for how to cultivate the product themselves so as to avoid a negative relationship, an imbalance of power over resources, and breed greater trust of science and technology.” In contrast, low-level responses of simple recollection included comments like, “The mosquitos. They put something in the, it was to mess up their reproductive cycle so that the larvae die off early or something like that so there are less mosquitos around,” or “They were injecting the fertilized eggs with a gene coded for the production of hormones.”

    Discussion Quality: Topic Richness and Relevance

    We analyzed the number of new topics students brought forth within GMO discussions, both online and in-person, across the 2 years, and found differences in connectance across discussion networks when comparing modalities (Figure 3). Within the discussion networks, students in online discussions recalled in-class examples 16 times, brought in new connections 12 times, and transferred knowledge from the discussion to new scenarios nine times. In in-person discussions, we found 13 links that repeated class examples, 62 new connections, and 25 instances of knowledge transfer. There were also differences in the total number of links, corresponding to newly contributed (class or unique) examples or connecting ideas: online discussions contained 39 unique concepts, while in-person discussions contained 98.
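The link counts above can be reproduced from an edge list in which each response is labeled by type. A minimal sketch (with toy labels, for illustration only; the actual networks were built by hand from transcripts):

```python
from collections import Counter

def link_type_counts(links):
    """Count links of each type in a discussion topic network.

    links: iterable of (response_id, link_type) pairs, where link_type is
    one of "class example", "new connection", or "transfer".
    Returns (per-type Counter, total number of links).
    """
    per_type = Counter(link_type for _, link_type in links)
    return per_type, sum(per_type.values())

# Toy edge list, for illustration only
links = [
    (1, "class example"), (2, "new connection"),
    (3, "new connection"), (4, "transfer"),
]
per_type, total = link_type_counts(links)
```

Comparing the per-type counts and totals across modalities yields the contrasts reported here, with the total acting as a simple proxy for topic richness.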

    FIGURE 3.

    FIGURE 3. Discussion topic networks representing the dynamics of discussions on GMO in online and in-class settings. The top row demonstrates the outcomes of online discussions while the bottom row reveals discussion networks for the in-class discussion. Each column corresponds to a discussion question (1–6) which was the same question presented in both discussion modalities. The white central node represents the initial discussion topic. Links are responses. Gray nodes denote responses that repeat an in-course example. Blue nodes signify the establishment of relevant connections, and purple nodes indicate instances where examples demonstrate a transfer of knowledge. The size of each node increases if the topic was brought up more than once to visually display the number (#) of times the topic was repeated. The network offers a visual representation of the discourse dynamics and topic relevance within each discussion modality.

    Discussion Quality: Qualitative Analysis of Social Cognitive Development

    Constant comparative analysis of the GMO discussion responses for both 2020 and 2021 revealed three central themes related to social learning: humanity, personal identity, and verbal immediacy (Table 2). Themes, subthemes, their frequencies in both discussion formats, and statements coded into these themes are provided below. Because approximately the same number of individuals (10–14) participated in each of the four discussions (two online and two in-person), we present frequencies as raw counts, as we are ultimately interested in differences in engagement with these themes by modality.

    Humanity

    This theme represents students’ concerns, opinions, or empathetic responses surrounding humanitarian issues. Focusing on this theme places value on learners’ deeper interaction with socioecological questions that are core to the course material. These human-centered themes may relate to the relevance and further application of these issues, tying into the research questions discussed earlier in this study. In total, 61 online responses encompassed humanity in their answers, while 105 responses included dialogues on such issues during face-to-face discussions. This broad theme was subsequently broken down into three subthemes: social, structural, and environmental (Table 2).

    The social subtheme emerged from statements relating to posterity, empathy for one another as humans, speculation about future society, and human health. There were 24 online responses and 33 in-person responses grouped into this subtheme. Responses questioning the structural integrity of society formed the second subtheme; the online format had 25 instances versus 41 in the face-to-face discussions. Replies grouped into this category often encompassed questioning of political systems, economics, healthcare, ethics, and general contemplation of society. The final subtheme related to environmental concerns, including environmental ethics (specifically of entities in power within current economic systems) and speculation about the future of the planet. Twelve responses online referenced environmental concerns, while 31 responses touched on environmental empathy in the classroom.

    Personal Identity

    Personal identity surfaced as a theme, represented in a total of 89 of the online responses and 142 of the in-person responses. Five subthemes related to personal identity emerged: reflection, relatability, emotions, sharing of opinions, and anecdotes.

    Student reflection was also incorporated as a personal identity subtheme. This subtheme was represented evenly across the two discussion types, with 35 instances from the online format and 31 from the in-person discussions. This grouping developed from statements sharing prior knowledge, contemplation of new ideas in relation to previous ideology, and previous personal actions. Many student responses included reasons why they avoided buying GMO products in the store, referencing preconceived notions and influence by family members, high school courses, advertisements, and social media. Specific responses from the 2020 in-person discussion categorized as reflection were, “Before the lecture and stuff I had seen GMOs in the store for food and I had seen others labeled non-GMOs, and so I thought GMOs were bad and organic was good, and so that’s how I had to base my opinion until this lecture of course,” and “I also saw GMO labels in the stores. I had preconceived notions that they were all bad, but after watching the lecture and seeing some of the environmental effects they have they have some benefits that I never took into consideration – things I didn’t know about that definitely shifted my perspective on them, just seeing that they are a good thing, and they can be useful in many ways.” These responses share reflections on prior knowledge, personal experiences, and the students’ own actions, as well as contemplation of newfound stances.

    Relatability was the final emerging personal identity subtheme, appearing seven times in online discussion board posts and 20 times in classroom discussions. Relatability stood out as a distinctive way to encourage a sense of community within discussion groups. Responses relating to this subtheme were specific, often referencing regional aspects such as particular neighborhoods within Louisville, Kentucky, or issues working common jobs within the city, but also general human experiences. These human experiences included dealing with aging family members, observing changes in children over time, being surrounded by social media influence, and grocery shopping. Coders also included comedic moments or expressions of humor as part of this subtheme. For example, when sharing personal opinions about ethics, a student said, “I don’t really think it’s our place as a species to decide when and if other species are worth living. We have no place to kill the mosquitoes even if they are annoying and come into my room when I have my windows open.” The annoyance factor provides comedic relief while also invoking a familiar experience to which others may relate.

    Emotional responses were documented 20 times in the in-person classroom, while there were only 12 instances on the virtual discussion board. Emotional responses incorporated personal feelings, often including statements such as “I feel.” Coders also included responses that were perceived as emotionally charged or passionate about the subject. The subtheme, sharing of opinion, was identified 28 times in online responses and 54 times in in-person discussion responses. This category often contained statements such as “I think” or “I believe,” where students shared their personal stances on controversial issues relating to GMOs. Anecdotes were also a common form of sharing one’s personal identity. This subtheme was identified seven times on the virtual discussion board and 17 times in in-person discussions. These anecdotes were often personal storytelling moments where students referenced an activity or a particular unique experience in their lives. These were often ways in which students were able to share nuances about their own community.

    Verbal Immediacy

    The final overall theme, verbal immediacy, was counted 36 times in virtual discussion board posts and 108 times in in-person responses. Verbal immediacy was defined by coders as statements that allow conversations to flow freely in a comfortable manner (Montgomery, 1982). Exchanges within this theme emphasize spontaneity, continuous feedback, and acknowledgment of others’ comments, all of which contribute to higher-order thinking, as rapid exchanges encourage impromptu connections and space for social learning to occur. This theme was broken down into three subthemes: filler phrases, questions, and acknowledgment of previous responses. Filler phrases included expressions such as “it’s kind of funny,” “I agree,” and “going off that.” These were absent from online discussion boards but were counted 33 times throughout classroom discussions.

    There were six instances of students asking questions of one another online, while in-person discussions included 24 examples of students asking clarifying questions. Acknowledgment of previous student responses emerged 30 times online and 51 times in person. This subtheme included responses to questions or praise of former student comments. Additional phrases within this subtheme included “true but” or “what they said.”

    Student Perception Surveys

    In the precourse survey, 60% of students answered that they preferred in-person discussion formats while 40% preferred online (n = 39). After completing both formats of discussion throughout the course, 73% of students answered that they preferred the in-person format while 27% preferred online (n = 32). Comparison of precourse and postcourse survey answers shows that six students switched from preferring the online discussion board format to favoring in-person classroom discussion, while four students changed from preferring in-person discussions to preferring online discussions. In their explanations, students who initially preferred in-person discussions, or who developed such a preference, primarily cited feeling connected to their peers, being more engaged, listening to other individuals’ perspectives, and “being able to read the room.” Students who preferred online discussions, or who developed a preference for them, explained that the format allowed more time to express their thoughts as well as the ability to participate at their own pace. One student also discussed feelings of anxiety when having to speak during in-person discussions. Qualitative analysis using the constant comparative method revealed three overarching themes within the precourse and postcourse surveys: learning motivations/rationale, sense of community, and social openness.

    Learning Motivation/Rationale

    The emergent theme of learning motivation was further broken into subthemes of intrinsic and extrinsic learning motivations. In total, 42 answers discussed internal motivating factors influencing preference of discussion format, while 12 responses indicated external factors. Intrinsic learning motivations were described as personal experiences such as anxiety, comfort, and understanding of one’s best format for performance. For example, students internally motivated by anxiety and comfort often rationalized their choice of online discussions with responses such as, “I get tongue tied during in-person discussions,” and “I don’t like talking in person a lot unless it’s to people I know so online would feel more comfortable for me.” Additional answers included, “due to the pandemic, I am more familiar with the online format” and “I enjoy online discussions, but oftentimes in in-person discussions I am too anxious to say much. This is in discussions where my grade depends on me speaking; in discussions where I can just listen, I’m fine. [In-person] I spend most of the time trying to get the courage to say something and don’t really absorb what is being discussed because I can’t focus on it.” Students motivated by internal factors such as improved performance rationalized their preference for online discussions with answers such as, “I liked the online version of the discussion because in there, I could’ve gathered my thoughts at my own pace and understand the material better. Then I could answer them with my best intentions and with no interruption.” Students who preferred in-person discussions were also often internally motivated, giving responses such as, “being in-person gives me a lot more motivation to actually keep up with work and not fall behind” and “I felt that in-person I learned a lot more than other than my own thoughts on the subject immediately rather than waiting for someone to reply online. It forced me to engage, and it was hands on which is the way I learn best.” Externally motivating factors leading to an online or in-person preference included technological difficulties, time conflicts, or concern for safety during the ongoing pandemic. Comments included, “[in-person] you don’t have to worry about any technical difficulties that are accompanied using a computer or smartphone” and, “[online because] ease of access, and reduced fear of getting the plague.”

    Sense of Community

    Sense of community was also a common theme emerging from the perception surveys with 46 records of this code appearing in student responses. For our purposes, we interpreted sense of community as placing importance on interaction, involvement, engagement, and feeling of connectedness to other students, as well as caring about the equal opportunity for participation among everyone in the classroom and excitement to listen and learn from one another. Some examples of this theme can be seen through these responses: “I think discussions work better in-person because it provides a better conversation that is engaging rather than just an assignment that needs to be done and forgotten about. I think in-person allows us to better learn and understand each other and the problem at hand,” and “it helped me learn the topic we were discussing a lot further and let me interact with my classmates more which isn’t common with this semester [due to COVID-19 pandemic].” Additionally, “I think it will be cool to hear people explain their thoughts more than reading just an entry posted online because I’d rather listen to responses than read them. I feel like it’d also have some sort of excitement with ongoing conversation rather than reading and responding to answers people type.”

    Social Openness

    The final theme, which appeared at a lower frequency (13 responses), was what we have called social openness: students’ appreciation of others’ willingness to share their own anecdotes and experiences within the learning environment, as well as the desire to read others’ body language to further empathize with their peers. Example responses include, “having real interactions, you can tell people are giving their real opinions on the topic. When we do online it’s just working on an assignment to get it done,” and “it is easier to understand what a person is trying to convey when you are physically with them, rather than trying to interpret what they’re implying online.” Another student explained, “in-person discussions are easier to control. You can see people who intend to speak, and you are better able to anticipate when you can pop into the conversation. Online discussions are difficult because you cannot read the body language of others.”

    Student Confidence and Performance

    Students showed a slight increase in the likelihood of selecting a test question covering a topic for which they had an in-person discussion, suggesting enhanced confidence in their ability to explain that course material (model estimate = −0.706, standard error = 0.378, p = 0.061). However, student performance on exam questions did not differ between the online and in-person discussion formats (model estimate = 0.692, standard error = 1.963, t = 0.353).

    DISCUSSION

    The results from our study add to the growing literature around the complexities of social learning mediated by virtual environments. Our results advocate for an intentional hybrid design that incorporates both virtual and in-person learning environments to promote individual critical thinking while also constructing spaces for social learning to occur. Higher levels of Bloom’s taxonomy reflect increased critical thinking within online discussion boards, while in-person discussions foster a sense of community by facilitating interaction with peers, and deepening engagement with course material through relevant connections, thus motivating active participation and student confidence.

    How Does the Level of Content Complexity Reached by Students Differ Between Online and In-class Discussions?

    In contrast to our initial hypothesis that in-class discussions would deepen students’ critical thinking through complex social interactions, responses from the online discussion boards reached higher levels of Bloom’s taxonomy. This is likely due to students’ ability to think through their ideas and create complex written responses with unrestricted time, rather than needing to quickly process their thoughts and share them in face-to-face discussions. Writing to engage, a practice situated between the two most common approaches to writing across the curriculum (writing to learn and writing in the disciplines, sometimes referred to as writing to communicate), is a key way to encourage reflection on course material (Reynolds et al., 2012; Balgopal and Wallace, 2013). Writing in these conversational formats supports students as active participants in their education while also building their writing skills through the use of evidence to support claims (Rivard, 1994). Meta-analyses of writing-to-learn research in science higher education have provided evidence that writing can be an effective tool to promote student learning and engagement (Rivard, 1994; Reynolds et al., 2012). However, other studies have shown that writing activities make little to no difference in deepening student learning (Armstrong et al., 2008; Fry and Villagomez, 2012; Blings and Maxey, 2017). Our results suggest that online asynchronous discussion boards are useful in facilitating individual learning by reinforcing specific core concepts introduced in class and supporting students in forming complex ideas through engaged writing, as demonstrated by their reaching higher levels of Bloom’s taxonomy. Notably, these results parallel research comparing group discussion, writing, and a combination of the two in increasing student learning through clicker questions, where individual writing most effectively increased student performance on assessments (Linton et al., 2014).

    What Are the Differences in Discussion Quality Between Students Participating in Online and In-class Discussions?

    When analyzing discussion quality through discussion network measurements, we found increased numbers of topics discussed as well as more examples that demonstrated relevance and transfer of knowledge within in-class discussions compared with the online format. This provides support for social learning theory. Sharing physical space allows students to use social cues to remain engaged in the conversation while also responding to stimuli through the recall of personally relevant information to build new connections throughout the discussion. Through the interactive dynamic in classroom discussions, students can take ownership of their learning as they analyze and question responses from peers while formulating new connections on the spot.

    Within an in-person discussion, students are engaged throughout the entirety of the class period, even if they are primarily listening. In contrast, within online discussions, students often do only the minimum discussion board requirements (write a response and respond to another student’s), meaning they do not necessarily engage in a complete conversation or read everyone’s discussion contributions. Students are therefore inherently less connected online, since the total time spent engaged with others is likely shorter than in person, even though the discussion forum was open for 24 h. The lack of social cues and holistic engagement within the online environment led to repetition in responses. Additionally, the repeated topics within the online discussion threads were often examples that recalled course material rather than unique topics cocreated through conversation. In-person students were able to connect with one another in ways that synthesized knowledge and transferred concepts, creating a richer and more complex community conversation despite the lower Bloom’s taxonomy scores of individual responses. In-class discussions also included fewer course-based examples, integrated more personally relevant topics, and demonstrated knowledge transfer to unique examples. Transfer is a complex cognitive process in which a student first recognizes that previous knowledge is relevant, recalls that knowledge accurately, and then applies it to a new context (Kaminske et al., 2020). These in-class conversations allow students to recall relevant information and apply their learning as they change their stances on topics, understand content from various perspectives, and stimulate engagement through listening and building off each other’s ideas as a group (Kuusisaari, 2013; Bovill, 2020). Therefore, while students in the online environment reached higher levels of response complexity (measured by Bloom’s taxonomy), the social environment plays an important role in stimulating the recognition, recall, and application of information that demonstrates relevance and transfer of information.

    Student interaction as measured by our thematic frequencies was higher within in-person discussions compared with the online discussion boards. This supported our hypothesis that in-person discussions enhance discussion quality through increased peer exchanges, leading to larger breadths of knowledge. As informed by our conceptual framework and demonstrated by our discussion topic network, this in-person back-and-forth dialogue exposes the whole class to new insights, perspectives, and points of view, which is valuable when discussing heavy scientific, political, and social issues integrated within fields such as environmental science. Previous research suggests that social interactions can further shape the learning process itself (Bransford et al., 2000). Within our qualitative analysis, the most important themes that differentiated in-person from online responses were related to humanity. These themes often call for socioemotional collaborative learning, which are interactions that are reciprocally shaped by one another through social interaction as the collaboration unfolds (Miyake and Kirschner, 2014). The propensity to question humanitarian issues and comment on personal perspectives in face-to-face discussions built a socioemotional familiarity among students and allowed them more freedom to express themselves in ways that are favorable to collaborative learning, but less organic in online formats due to lack of social presence (Isohätälä et al., 2018). This familiarity within in-class discussions promoted curiosity about each other’s ideas, which encouraged students to connect course material to outside experiences (demonstrated in Supplementary Data S1), further increasing interest in the course topics. This was exceptionally evident when comparing thematic frequencies of personal identity and subthemes such as feeling, reflection, anecdotes, and relatability between the discussion modalities. 
Similar to our results, Curtis and Lawson found that students within online discussions had 15% fewer occurrences of self-reflection and only 5% of comments classified as social interaction when compared with their in-person counterparts (Curtis and Lawson, 2001). Through responses, there was evident vulnerability within face-to-face discussions that was nonexistent in online discussions. In-person students shared personal experiences that evoked emotion, such as living in New Orleans during Hurricane Katrina, relatable experiences such as working difficult jobs, or general poverty issues faced by students. We found relatability to be one of the most interesting themes, as it encompassed both emotional responses, as well as lighthearted comments that aided in peer-to-peer connection, which in turn caused casual but meaningful conversation within a nascent community.

    What Are Students’ Perceptions of the Two Discussion Modalities and How Do They Change from Precourse to Postcourse?

    Investigating the student experience is important in biology teaching and learning (Trujillo and Tanner, 2014). Overall, more students had positive feelings toward in-person discussions, and that majority increased throughout the course. Survey results show that students primarily cited active participation and a sense of belonging to the learning community as external motivating factors. They explained that they felt more engaged and likely to reciprocate communication when others were visibly passionate about course material, fitting into our themes of sense of community and social openness. A supporting response to the postcourse survey stated, “the in-person discussion allowed me to see people’s emotions along with their opinion. The online discussion gave me good information, but I wasn’t as able to grasp their perspective.” Emotions are inherently linked to and influence cognitive skills such as attention, memory, executive functioning, decision-making, critical thinking, problem-solving, and regulation, all of which play a key role in learning (Immordino-Yang and Damasio, 2007). Having face-to-face spaces that foster emotional and engaging discussions aids in retention, as social cues shape the impact of course material on students, potentially leaving long-lasting impressions (Tiene, 2000). This could explain why students were more likely to choose an essay exam question for which they had an in-person discussion rather than an online one. Emotions and learning are inseparable, and as educators, understanding the roles that emotions play in cocreating learning experiences can inform the design of more effective active learning activities. Positive emotions fostered through social learning in particular promote perseverance through challenging topics and subsequently influence motivation to sustain success throughout a course (Martin, 2007).

    It is important to recognize that not all students responded the same way to these discussions. While most students did not change their preferred format from precourse to postcourse, six students switched their preference from online to in-person discussions after the course, and four students who originally preferred in-person discussions preferred the online discussion board format after experiencing both. We qualitatively assessed the motivations for the shifts to online and found that three of the four students cited internal reasons, such as “getting tongue tied,” “having more time to digest information and formulate response,” or anxiety about speaking in front of the class (the fourth did not provide a rationale). Additionally, half of the respondents who preferred online discussions also reported that they learned more in the in-person format. This contradiction is intriguing: although students may be more comfortable responding and better able to articulate their critical thinking online, the emotional engagement with peers in face-to-face discussions appears to deepen their understanding of content through social learning.

    LIMITATIONS OF STUDY

    As noted earlier, the population studied here comprised ∼50 nonbiology majors in an environmental biology class. Results and preferences may differ by major status or other demographic characteristics. Because our study group was small, demographic data were not collected, as any patterns would not have been representative. Future studies that include these measurements with larger sample sizes would be useful.

    We also acknowledge that the two discussion modalities are not exact comparisons; they differ in allowed discussion time, synchronous versus asynchronous format, and instructor presence. These differences could influence our results in ways we were not able to control in this study design. To further test our questions, a more direct comparison could be made between a traditional in-person discussion and an online synchronous video- or chat-based discussion. Although this is a limitation, asynchronous discussion boards are very common pedagogical tools used in place of, or in concert with, traditional in-class discussions, and they became increasingly prevalent with the shift toward hybrid learning during the COVID-19 pandemic. As such, our design allows for a descriptive comparison of these two commonly used formats.

    Additionally, slightly more students completed the precourse survey (n = 39) than the postcourse perception survey (n = 32), and in some cases the same students did not complete both, so individual preferences could not be tracked over time.

    Another limitation relates to the statistical analysis of the Bloom’s taxonomy scores. Within discussions, a single student could contribute several responses, which introduces some nonindependence into the dataset. Additionally, there were more responses within in-person discussions than on the written discussion boards, which skews the averages of the ratings. Although the larger number of in-person responses attributable to “chit-chat,” personal anecdotes, and conversational asides may detract from the higher-order responses categorized by Bloom’s taxonomy, such exchanges have proven important for social learning, for creating relevant connections that result in quality discussions, and for community building.

    CONCLUSION

    Given our results, we suggest that the way forward for students today includes a blended learning design combining in-person and online discussion modalities. In-person discussions facilitate community building, social interaction, and the cocreation of knowledge surrounding personally and socially relevant course topics. Online discussion boards provide a complementary opportunity for students to form complex, evidence-backed arguments and to learn through the process of writing. This is a crucial avenue of research, as we must continue exploring the benefits and drawbacks of educational modalities for a new technological generation as well as a generation of “post-COVID” students.

    ACKNOWLEDGMENTS

    No financial support was affiliated with this work. We acknowledge the guidance from Trisha Douin on qualitative analysis and Kimberly Koenig as the secondary instructor of the course in 2021. We also thank Stephanie Tsui for her suggestions concerning data visualization.

    REFERENCES

  • Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. ERIC.
  • Armstrong, N. A., Wallace, C. S., & Chang, S.-M. (2008). Learning from writing in college biology. Research in Science Education, 38, 483–499.
  • Balgopal, M., & Wallace, A. (2013). Writing-to-learn, writing-to-communicate, & scientific literacy. The American Biology Teacher, 75(3), 170–175.
  • Bandura, A., & Walters, R. H. (1977). Social Learning Theory. Englewood Cliffs, NJ: Prentice Hall.
  • Barnett, S. M., & Ceci, S. J. (2002). When and where do we apply what we learn?: A taxonomy for far transfer. Psychological Bulletin, 128, 612–637.
  • Barron, B. (2003). When smart groups fail. The Journal of the Learning Sciences, 12(3), 307–359.
  • Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1), 1–48. doi: 10.18637/jss.v067.i01
  • Bennett, J., Hogarth, S., Lubben, F., Campbell, B., & Robinson, A. (2010). Talking science: The research evidence on the use of small group discussions in science teaching. International Journal of Science Education, 32(1), 69–95.
  • Blings, S., & Maxey, S. (2017). Teaching students to engage with evidence: An evaluation of structured writing and classroom discussion strategies. Journal of Political Science Education, 13(1), 15–32. doi: 10.1080/15512169.2016.1168303
  • Bloom, B. S. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals: Handbook I Cognitive Domain. London, UK: Longmans.
  • Blumenfeld, P. C. (1992). Classroom learning and motivation: Clarifying and expanding goal theory. Journal of Educational Psychology, 84(3), 272.
  • Bovill, C. (2020). Co-creation in learning and teaching: The case for a whole-class approach in higher education. Higher Education, 79(6), 1023–1037.
  • Bowen, C. W. (2000). A quantitative literature review of cooperative learning effects on high school and college chemistry achievement. Journal of Chemical Education, 77(1), 116.
  • Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How People Learn, 11. Washington, DC: National Academy Press.
  • Champion, K., & Gunnlaugson, O. (2018). Fostering generative conversation in higher education course discussion boards. Innovations in Education and Teaching International, 55(6), 704–712. doi: 10.1080/14703297.2017.1279069
  • Cheng, C. K., Paré, D. E., Collimore, L.-M., & Joordens, S. (2011). Assessing the effectiveness of a voluntary online discussion forum on improving students’ course performance. Computers & Education, 56(1), 253–261.
  • Curtis, D., & Lawson, M. (2001). Exploring collaborative online learning. Journal of Asynchronous Learning Networks, 5(1). doi: 10.24059/olj.v5i1.1885
  • Daniels, H., & Daniels, E. (2013). Write-arounds. In The Best-Kept Teaching Secret: How Written Conversations Engage Kids, Activate Learning, and Grow Fluent Writers, K–12 (pp. 154–191). SAGE Publications, Inc. doi: 10.4135/9781483389752
  • Darabi, A., & Jin, L. (2013). Improving the quality of online discussion: The effects of strategies designed based on cognitive load theory principles. Distance Education, 34(1), 21–36.
  • Delucchi, M. (2006). The efficacy of collaborative learning groups in an undergraduate statistics course. College Teaching, 54(2), 244–248.
  • Dennen, V. P., & Wieland, K. (2007). From interaction to intersubjectivity: Facilitating online group discourse processes. Distance Education, 28(3), 281–297.
  • Edinyang, S. D. (2016). The significance of social learning theories in the teaching of social studies education. International Journal of Sociology and Anthropology Research, 2(1), 40–45.
  • Ellis, R. A., & Calvo, R. A. (2004). Learning through discussions in blended environments. Educational Media International, 41(3), 263–274. doi: 10.1080/09523980410001680879
  • Ennis, R. H. (1991). Critical Thinking: A Streamlined Conception. Illinois: University of Illinois.
  • Eun, B. (2010). From learning to development: A sociocultural approach to instruction. Cambridge Journal of Education, 40(4), 401–418. doi: 10.1080/0305764X.2010.526593
  • Fehrman, S., & Watson, S. L. (2021). A systematic review of asynchronous online discussions in online higher education. American Journal of Distance Education, 35(3), 200–213. doi: 10.1080/08923647.2020.1858705
  • Foster-Hartnett, D., Mwakalundwa, G., Bofenkamp, L., Patton, L., Nguyen, R., & Goodman-Mamula, P. (2022). Beyond the traditional classroom: Increased course structure and cooperative learning remove differences in achievement between students in an in-person versus hybrid microbiology course. CBE—Life Sciences Education, 21(2), ar33. doi: 10.1187/cbe.21-01-0007
  • Fry, S. W., & Villagomez, A. (2012). Writing to learn: Benefits and limitations. College Teaching, 60(4), 170–175.
  • Gamer, M., Lemon, J., & Singh, I. F. P. (2012). irr: Various coefficients of interrater reliability and agreement. R package version 0.84.1.
  • Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion environments: Recent progress and possible future directions. British Journal of Educational Technology, 44(3), 469–483.
  • Garrison, D. R., & Vaughan, N. D. (2008). Blended Learning in Higher Education: Framework, Principles, and Guidelines. San Francisco, CA: Jossey-Bass.
  • Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53, 229–261.
  • Hager, P., Sleet, R., Logan, P., & Hooper, M. (2003). Teaching critical thinking in undergraduate science courses. Science & Education, 12, 303–313.
  • Han, F., & Ellis, R. A. (2019). Identifying consistent patterns of quality learning discussions in blended learning. The Internet and Higher Education, 40, 12–19.
  • Havnes, A., Christiansen, B., Bjørk, I. T., & Hessevaagbakke, E. (2016). Peer learning in higher education: Patterns of talk and interaction in skills centre simulation. Learning, Culture and Social Interaction, 8, 75–87. doi: 10.1016/j.lcsi.2015.12.004
  • Hurst, B., Wallace, R., & Nixon, S. (2013). The impact of social interaction on student learning. Reading Horizons (Online), 52(4), 375.
  • Immordino-Yang, M. H., & Damasio, A. (2007). We feel, therefore we learn: The relevance of affective and social neuroscience to education. Mind, Brain, and Education, 1(1), 3–10. doi: 10.1111/j.1751-228X.2007.00004.x
  • Islam, M. K., Sarker, M. F. H., & Islam, M. S. (2022). Promoting student-centered blended learning in higher education: A model. E-Learning and Digital Media, 19(1), 36–54.
  • Isohätälä, J., Näykki, P., Järvelä, S., & Baker, M. J. (2018). Striking a balance: Socio-emotional processes during argumentation in collaborative learning interaction. Learning, Culture and Social Interaction, 16, 1–19.
  • Jensen, J. L., & Lawson, A. (2011). Effects of collaborative group composition and inquiry instruction on reasoning gains and achievement in undergraduate biology. CBE—Life Sciences Education, 10(1), 64–73. doi: 10.1187/cbe.10-07-0089
  • Kaminske, A. N., Kuepper-Tetzel, C. E., Nebel, C. L., Sumeracki, M. A., & Ryan, S. P. (2020). Transfer: A review for biology and the life sciences. CBE—Life Sciences Education, 19(3), es9. doi: 10.1187/cbe.19-11-0227
  • Kauppi, S., Muukkonen, H., Suorsa, T., & Takala, M. (2020). I still miss human contact, but this is more flexible—Paradoxes in virtual learning interaction and multidisciplinary collaboration. British Journal of Educational Technology, 51(4), 1101–1116. doi: 10.1111/bjet.12929
  • Khandkar, S. H. (2009). Open coding. University of Calgary, 23, 2009.
  • Kober, N. (2015). Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering. Washington, DC: National Academies Press.
  • Kuusisaari, H. (2013). Teachers’ collaborative learning–development of teaching in group discussions. Teachers and Teaching, 19(1), 50–62.
  • Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia-Social and Behavioral Sciences, 31, 486–490.
  • Liebech-Lien, B., & Sjølie, E. (2021). Teachers’ conceptions and uses of student collaboration in the classroom. Educational Research, 63(2), 212–228.
  • Linton, D. L., Pangle, W. M., Wyatt, K. H., Powell, K. N., & Sherwood, R. E. (2014). Identifying key features of effective active learning: The effects of writing and peer discussion. CBE—Life Sciences Education, 13(3), 469–477. doi: 10.1187/cbe.13-12-0242
  • Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. British Journal of Educational Psychology, 77(2), 413–440. doi: 10.1348/000709906X118036
  • McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–282.
  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education.
  • Miyake, N., & Kirschner, P. A. (2014). The social and interactive dimensions of collaborative learning. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (2nd ed., pp. 418–438). Cambridge University Press. doi: 10.1017/CBO9781139519526.026
  • Mohammed, T. F., Nadile, E. M., Busch, C. A., Brister, D., Brownell, S. E., Claiborne, C. T., … Cooper, K. M. (2021). Aspects of large-enrollment online college science courses that exacerbate and alleviate student anxiety. CBE—Life Sciences Education, 20(4), ar69. doi: 10.1187/cbe.21-05-0132
  • Mollborn, S., & Hoekstra, A. (2010). “A Meeting of Minds”: Using clickers for critical thinking and discussion in large sociology classes. Teaching Sociology, 38(1), 18–27. doi: 10.1177/0092055x09353890
  • Montgomery, B. M. (1982). Verbal immediacy as a behavioral indicator of open communication content. Communication Quarterly, 30(1), 28–34.
  • Murphy, P. K., Firetto, C. M., Lloyd, G. M., Wei, L., & Baszczewski, S. E. (2020). Classroom discussions. In Oxford Research Encyclopedia of Education. Oxford University Press.
  • Newton, D. P. (1988). Relevance and science education. Educational Philosophy and Theory, 20(2), 7–12. doi: 10.1111/j.1469-5812.1988.tb00139.x
  • Paskey, J. (2001). A survey compares 2 Canadian MBA programs, one online and one traditional. The Chronicle of Higher Education.
  • Preszler, R. W. (2009). Replacing lecture with peer-led workshops improves student learning. CBE—Life Sciences Education, 8(3), 182–192. doi: 10.1187/cbe.09-01-0002
  • Protopsaltis, S., & Baum, S. (2019). Does Online Education Live up to Its Promise? A Look at the Evidence and Implications for Federal Policy. George Mason University, Center for Education Policy and Evaluation. Retrieved July 3, 2021, from https://jesperbalslev.dk/wp-content/uploads/2020/09/OnlineEd
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J., Jr. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11(1), 17–25.
  • Rifkind, L. J. (1992). Immediacy as a predictor of teacher effectiveness in the instructional television. Journal of Interactive Television, 1, 31–38.
  • Rivard, L. O. P. (1994). A review of writing to learn in science: Implications for practice and research. Journal of Research in Science Teaching, 31(9), 969–983.
  • Rogat, T. K., & Adams-Wiggins, K. R. (2015). Interrelation between regulatory and socioemotional processes within collaborative groups characterized by facilitative and directive other-regulation. Computers in Human Behavior, 52, 589–600.
  • Saldaña, J. (2009). The Coding Manual for Qualitative Researchers. Thousand Oaks, CA: Sage Publications.
  • Scager, K., Boonstra, J., Peeters, T., Vulperhorst, J., & Wiegant, F. (2016). Collaborative learning in higher education: Evoking positive interdependence. CBE—Life Sciences Education, 15(4), ar69. doi: 10.1187/cbe.16-07-0219
  • Shortridge-Baggett, L. M. (2000). The theory and measurement of the self-efficacy construct. Self-Efficacy in Nursing: Research and Measurement Perspectives, 9–28.
  • Singh, J., Steele, K., & Singh, L. (2021). Combining the best of online and face-to-face learning: Hybrid and blended learning approach for COVID-19, post vaccine, & post-pandemic world. Journal of Educational Technology Systems, 50(2), 140–171.
  • Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122–124.
  • Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69(1), 21–51.
  • Tice, D., Baumeister, R., Crawford, J., Allen, K.-A., & Percy, A. (2021). Student belongingness in higher education: Lessons for professors from the COVID-19 pandemic. Journal of University Teaching & Learning Practice, 18(4), 2.
  • Tiene, D. (2000). Online discussions: A survey of advantages and disadvantages compared to face-to-face discussions. Journal of Educational Multimedia and Hypermedia, 9(4), 369–382.
  • Trujillo, G., & Tanner, K. D. (2014). Considering the role of affect in learning: Monitoring students’ self-efficacy, sense of belonging, and science identity. CBE—Life Sciences Education, 13(1), 6–15. doi: 10.1187/cbe.13-12-0241
  • Tu, C.-H. (2002). The measurement of social presence in an online learning environment. International Journal on E-Learning, 1(2), 34.
  • Vernon, D. T., & Blake, R. L. (1993). Does problem-based learning work? A meta-analysis of evaluative research. Academic Medicine, 68(7), 550–563.
  • Volet, S., Summers, M., & Thurman, J. (2009). High-level co-regulation in collaborative learning: How does it emerge and how is it sustained? Learning and Instruction, 19(2), 128–143.
  • Vygotsky, L. S. (1978). Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA: Harvard University Press.
  • Wang, Q., & Woo, H. L. (2007). Comparing asynchronous online discussions and face-to-face discussions in a classroom setting. British Journal of Educational Technology, 38(2), 272–286. doi: 10.1111/j.1467-8535.2006.00621.x
  • Weinberger, A., Stegmann, K., & Fischer, F. (2007). Knowledge convergence in collaborative learning: Concepts and assessment. Learning and Instruction, 17(4), 416–426.