Class Participation and Student Performance
Ernst Bekkering
[email protected]
Ted Ward
[email protected]
Department of Mathematics and Computer Science
Northeastern State University
Tahlequah, OK 74464
Abstract
Student attendance in class, and participation while in class, are predictors of student performance. Last
year, we reported on a new measure combining class attendance and attentiveness while in class and
used this participation score as a predictor of student performance on the final exam in the class. This
year, we follow up by analyzing data for four classes in the Fall semester of 2019. In each class, and for
the four classes combined, we found a statistically significant relationship between participation and
score on the final exam.
Keywords: participation, attendance, attentiveness, distraction, student performance
courses (Allen, Mabry, Mattrey, Bourhis, Titsworth, & Burrell, 2004), but synchronous education may be equivalent to the physical classroom (Mullen, 2020). With a wide variation in effect, positives may cancel out negatives, especially when students have additional tasks to perform (Bernard, Abrami, Lu, Borkhovski, Wade, Wozney, Wallet, Fiset, & Huang, 2004). When the task load is identical for local and distant students in a videoconferencing setting, student performance is the same (MacLaughlin, Supernaw, & Howard, 2004). Interaction may make the difference: distance education with collaborative discussions is more effective than independent study only (Lou, Bernard, & Abrami, 2006). Just recording lectures and posting notes online may not meet students' needs (Gysbers et al., 2011). For synchronous online sessions, special attention tracking tools may be available. Zoom had an attention tracking feature until April 2020, when it was removed for security and privacy reasons (Yuan, 2020). Cisco Webex still provides group and individual attentiveness indicators and participant attention reports (Cisco Webex, 2018).

Class Participation
Active participation in class can take multiple forms. In face to face classes, participation can mean the use of response cards and hand-raising (Christle & Schuster, 2003; Gardner, Heward, & Grossi, 1994; Narayan, Heward, Gardner, Courson, & Omness, 1990). Sometimes, special tools like clickers were used (Stowell & Nelson, 2007), but also cellphones for text messaging (Nkhoma, Thomas, Nkhoma, Sriratanaviriyaku, Truong, & Vo, 2018; L.-C. C. Wang & Morgan, 2008). In the online environment, the initial measurement of participation in asynchronous classes might be with pages visited, tools used, messages accessed, discussions posted, and email contacts (Coldwell et al., 2008; Douglas & Alemanne, 2007; Romero, Lopez, Luna, & Ventura, 2013). Some novel tools like location and Bluetooth data have been used (Kassarnig, Bjerre-Nielsen, Mones, Lehmann, & Lassen, 2017), as has spyware installed on student laptops to check browsing and application use (Kraushaar & Novak, 2010), but these are more for research than for day-to-day teaching.

In the digital environment, all modern Learning Management Systems (LMS) provide some form of videoconferencing to enable virtual class meetings. Moodle has a Videoconference Edition (Moodle, Inc., 2019). Blackboard offers the Blackboard Collaborate module (Blackboard Inc, 2019). Canvas includes the Conferences tool (Canvas LMS Community, 2019). Zoom is not an LMS, but it is often used in education and can be integrated in Blackboard, Moodle, and other platforms.

Modern videoconferencing software provides multiple interaction tools. Some are based on their physical counterparts, such as voice communication and virtual hand raising. Information can be shared through programs such as PowerPoint, sharing of the presenter's desktop, whiteboards, slideshows, and sharing of online videos. Collaboration tools include chat messages, annotation and drawing tools on shared desktops, and transfer of control over mouse and keyboard. These tools transform the shared view into two-way communication between instructor and students (SJSU, 2018).

Finally, some forms of interaction scale better than others. Multiple choice quizzes work well for any size audience, but voice discussions are best limited to small groups (Garner, 2018).

Student Performance
Once we assume that class attendance and class participation influence how well students do in class, we need to select a way to measure their performance. Multiple metrics have been used to measure student performance. Most frequently used are readily-available items like course grades (Beaudoin, 2002; Durden & Ellis, 1995; Kassarnig et al., 2017; Teixeira, 2016), term GPA (Wang, Harari, Hao, Zhou, & Campbell, 2015), cumulative GPA (Lau, 2017), self-reported GPA (Kirschner & Karpinski, 2010), GPA obtained from registrars (Junco, 2012a), course credits (Giunchiglia, Zeni, Gobbi, Bignotti, & Bison, 2018), scores on final exams (Duncan et al., 2012; Lukkarinen et al., 2016), and finishing the course (Coldwell et al., 2008; Junco, 2012b). Occasionally, pre-tests and post-tests (Omar, Bhutta, & Kalulu, 2009), student ranking (Felisoni & Godoi, 2018), or multi-item scales are used (Yu, Tian, Vogel, & Chi-Wai Kwok, 2010).

On the other hand, a significant number of studies rely on self-report by students (Junco & Cotten, 2011), including self-report of GPA and hours spent studying (Kirschner & Karpinski, 2010). However, some caution must be used, since self-report may not be as reliable (Kuncel, Crede, & Thomas, 2005).

Multitasking
Using computers, cell phones, and other technology does present new problems. McCoy (2016) reported that students used digital devices 11.43 times per school day. More than 25% of effective class time may be spent on the
phone (Kim, Kim, Kim, Kim, Han, Lee, Mark, & Lee, 2019). Students often alternate between class-related and non-class-related computer use (Fried, 2008; Grace-Martin & Gay, 2001; Hembrooke & Gay, 2003; Janchenko, Rodi, & Donohoe, 2018). Cell phone use among college students is becoming an addiction (Roberts, Yaya, & Manolis, 2014).

Multitasking in class has evolved with the technology of the day. When laptops entered the classroom, instant messaging and web browsing were major distractions (Fox, Rosen, & Crawford, 2009; Hembrooke & Gay, 2003). Later, Facebook became a major distractor (Kirschner & Karpinski, 2010). Now, mobile phones provide yet another source of distraction (Chen & Yan, 2016; Harman & Sato, 2011). Cell phone applications include WhatsApp (Ahad & Lim, 2014), Snapchat, and Instagram (Griffin, 2014). The negative effect of using cellphones is especially high when use takes place in class (Felisoni & Godoi, 2018), and lower performing students are especially at risk (Beland & Murphy, 2016; Chiang & Sumell, 2019). Beland and Murphy (2016) also found significant improvement in high stakes exam scores after mobile phones were banned.

Multitasking with technology negatively affects participation and student performance, subjectively (Junco & Cotten, 2011) and objectively (Amez, Vujic, De Marez, & Baert, 2020b; Amez & Baert, 2020a; Junco & Cotten, 2012; Kates, Wu, & Coryn, 2018). Students do not necessarily recognize the negative effect. In a study of Malaysian university students, respondents felt that they performed better as Facebook usage increased (Ainin, Naqshbandi, Moghavvemi, & Jaafar, 2015). The general research consensus holds that multitasking does have a negative effect on student performance (Bellur, Nowak, & Hull, 2015; Burak, 2012; Junco & Cotten, 2012; Kraushaar & Novak, 2010; Kuznekoff, Munz, & Titsworth, 2015; MacLaughlin et al., 2004), although the causality has not yet been established (van der Schuur, Baumgartner, Sumter, & Valkenburg, 2015). Controlled experiments show that actual performance may be the same, but the time to achieve it is longer (Bowman, Levine, Waite, & Gendron, 2010; Rubinstein, Meyer, & Evans, 2001). While some studies fail to demonstrate differences between performance of cognitive tasks with and without distraction, they do show decreased efficiency of information processing (End, Worthman, Mathews, & Wetterau, 2010) and increased memory errors (Rubinstein et al., 2001).

3. METHODOLOGY

Data for the four classes in this study were automatically recorded by the videoconferencing software. Data points were join time, leave time, and attentiveness score for each student in each course. Students were allowed to enter the class before it started, and before the instructor. If students entered early, the official start time of the class was used. The instructor used the full class period and closed the session after the class was officially over. If students left after the class was officially over, the official closing time was used. Network interruptions or equipment problems occasionally dropped students from the session, and they could immediately rejoin the class without instructor intervention. The attentiveness score reflected the percentage of time that the focus of the student's computer was on the desktop shared by the instructor. The syllabus explained the attentiveness statistic and instructed the students to maximize the class window to avoid accidental low scores. All lectures were recorded and generally available online after two hours, and use of pen and paper for notes was suggested. Students had to use a computer with mouse and keyboard and keep the camera on at all times.

Participation scores were calculated each week by multiplying the attendance and attentiveness scores. For instance, if a student was 10 minutes late in a 50-minute class, attendance was 80%. Likewise, if a student had the shared instructor desktop in focus only half of the time, the attentiveness score was 50%. If a student was 10 minutes late and kept the shared desktop in focus only half the time, the participation score was 40% (80% x 50%). At the end of the week, each day's participation score was posted to the gradebook for the class. For days when students were disconnected one or more times, the sum of the products for the partial sessions was used. At the end of the semester, students with average participation below 80% lost one letter grade, and two letter grades if below 60%.

The four classes in the study involved two face to face classes in computer labs and two Virtual Class Meetings. The university defines Virtual Class Meetings as follows: "Virtual class meeting courses allow students to use their home or university computer to attend class at designated times" (Northeastern State University, 2019). In other words, both formats are synchronous, but virtual class meetings are location-independent and face to face classes are not. The same videoconferencing software was used in all classes. Face to face classes were taught in
computer labs, did not use overhead projectors or whiteboards, and streamed the session directly to the students' lab computers. All applications were shared on the instructor's desktop. Various features of the videoconferencing software were used to increase student participation. Students could use annotation and drawing tools on the shared desktop to ask questions, post comments, and make annotations anonymously. The Chat feature was used to post questions and comments, and answers to instructor questions. Finally, students could take control over mouse and keyboard to demonstrate their understanding on the common desktop. Regardless of online or local delivery, all these techniques were used to a lesser or greater extent. Students in the face-to-face classes were also allowed to participate remotely to maximize attendance. No records were kept regarding local or remote attendance for face-to-face classes.

The first class, CS 3403 Data Structures, is one of the core classes in the curriculum. It was taught as a virtual class meeting twice a week for 75 minutes. The course covered common data structures and algorithms in Computer Science and used Python programming projects to illuminate the concepts. The final exam consisted of a comprehensive multiple-choice test worth 40% of the course grade. Twenty-nine students started the course, and 24 took the final exam.

The second class, CS 3643 Programming for Cyber Security, was an elective class taught as a face-to-face class twice weekly for 75 minutes. The course covered general cybersecurity concepts and problems and used virtual machines with Python programs to illustrate the material. The final exam consisted of a comprehensive multiple-choice test worth 40% of the course grade. Fifteen students started the course, and 11 took the final exam.

The third class, CS 4203 Software Engineering, is another core class in the CS curriculum. It was taught as a virtual class meeting thrice weekly for 50 minutes. The course covered the development process including analysis, modeling, and testing. UML models were developed with online software, and testing was done with a scripting language. The final exam consisted of a comprehensive multiple-choice test worth 40% of the course grade. Twenty-nine students started the course, and 28 took the final exam.

The final class, CS 4223 Game Programming, was an elective class taught face to face. The class met twice weekly for 75 minutes. The course was heavily project based with hands-on projects due every two weeks and used Unity with Visual Studio to develop the games. The final exam was an in-class programming project worth 30% of the course grade. Twenty-seven students started the course, and 22 students took the final exam. One student got a zero score for the final exam for failure to follow final exam instructions.

Activity Reports
The videoconferencing software can generate multiple reports. For this study, we used the details report, which can list each login for each course meeting for a period of up to a month. Data include topic, join time, leave time, and the "attentiveness score." Attentiveness in this context was defined as the percent of time that the shared Zoom window was in focus. If a student was logged in but used another application, this did not contribute to attentiveness. If students got disconnected during class and connected again, each partial session would have its own attentiveness score. Unfortunately, the attentiveness score was removed from all reports during the COVID-19 crisis (Yuan, 2020).

4. SAMPLE STATISTICS

As usual in Computer Science, the majority of students were male, traditional full-time students in their late teens and early twenties who finished the course and took the final. Details are listed in Table 1.

course               female   male
CS3403                  7      22
  non-traditional       1
    final               1
  traditional           6      22
    final               6      17
    no_final                    5
CS3643                  1      14
  traditional           1      14
    final               1       9
    no_final                    5
CS4203                  7      22
  non-traditional       1
    final               1
  traditional           6      22
    final               6      21
    no_final                    1
CS4223                  5      22
  non-traditional       1
    no_final            1
  traditional           4      22
    final               3      18
    no_final            1       4

Table 1 - Sample Statistics
Class attendance and attentiveness data were automatically recorded by Zoom, since students were required to log in to the class sessions. Participation scores were posted on the Blackboard gradebook every two weeks, and students who scored low on participation early in the course received an email with separate data for attendance and attentiveness to explain why their scores were low. Since we measured the influence of conditions for each student in one course only, we used the final exam in the course to measure performance. The final multiple-choice exam was posted using the course delivery system, and scores were automatically calculated. Questions with less than 50% correct answers were reviewed, and no questions were found to be incorrectly stated.

5. ANALYSIS

The data were analyzed in anonymous form. Daily Activity Reports were downloaded in CSV files and copied to one sheet of a spreadsheet, final exam scores were downloaded from the Blackboard gradebook and copied to another sheet, and a third sheet was used as a lookup table with student names and random numbers between 1111 and 9999.

Next, we corrected for absences which were not reflected in the activity reports. All absences received a zero score for participation, as no time was spent in class. Scores were not corrected for excused absences, such as attendance of events sanctioned by Academic Affairs. Students who did not finish the class and did not take the final exam were included with a zero score for the final. Final exam scores were standardized to a percent of possible points by dividing the actual score by the maximum of 300 or 400 points.

Student names in the activity reports and the final exam scores sheet were replaced with the random numbers and linked in a fourth sheet combining the student participation with their grades on the final exam. This sheet with random numbers, participation score, and standardized final exam score was exported in CSV format and imported in SPSS.

The data were analyzed with linear regression at the course level and at the semester level (all courses combined). Descriptive statistics show that some students reached perfect participation and perfect scores on the final exams. Appendix A lists the descriptive statistics, first at the semester level and then at the course level.

Linear regression at the semester level, with all courses combined, showed a statistically significant relationship between the independent participation variable and the dependent performance variable. The level of significance was .000 for the regression and .000 for participation. The R Square statistic was strong at .648, indicating that 64.8% of the variance in student performance was explained by student participation. Since we used only one independent variable, the unstandardized coefficient for participation was reviewed. At a level of 1.094, each percent increase in participation was related to about a one-percent increase in performance. Appendix B shows the output of the semester level analysis.

At the course level, linear regression showed a similar result. The significance for regression in each course was .000, indicating a statistically significant relationship. The R Square statistic varied between a low of .465 and a high of .933. Coefficients for participation were all slightly above 1, again indicating that each percent increase in participation was related to about a one-percent increase in performance. Appendix C shows the output of the course level analysis.

6. CONCLUSIONS AND RECOMMENDATIONS

Based on these results, it appears that class participation, defined as the combination of coming to class and paying attention while there, is a good predictor of student performance. This would appear to be a no-brainer, but in this age where students often work significant hours and/or have family obligations, the importance of coming to class and spending this time productively should not be underestimated. Using the participation statistic as part of the total number of points in the course can also help motivate students to change behavior in a positive manner. When students notice that the participation score is low, it is easy to see whether this is due to being distracted in class or to not coming to class at all. Since the videoconferencing software does not record attentiveness when students are not in class, the percent of time in class is a perfect indicator for attendance, and the attentiveness score a good indicator for focus while they are there.

This does not mean that attentiveness as measured by computer focus on the shared desktop is perfect. Students can keep other applications open, especially on dual monitors, and quickly click back and forth. The videoconferencing software only samples focus
Coldwell, J., Craig, A., Paterson, T., & Mustard, J. (2008). Online Students: Relationships between Participation, Demographics and Academic Performance. 6(1), 10.

Craig, R. (2020). A Brief History (And Future) Of Online Degrees. Forbes. https://www.forbes.com/sites/ryancraig/2015/06/23/a-brief-history-and-future-of-online-degrees/

Felisoni, D. D., & Godoi, A. S. (2018). Cell phone usage and academic performance: An experiment. Computers & Education, 117, 175-187.

Fox, A. B., Rosen, J., & Crawford, M. (2009). Distractions, Distractions: Does Instant Messaging Affect College Students' Performance on a Concurrent Reading
Descriptive Statistics

course                                                      N    Minimum   Maximum   Mean
2019Fall-CS3403   participation                             29   7.0%      100.0%    74.162%
                  standardized as % of max score possible   29   0.0%      96.0%     65.655%
                  Valid N (listwise)                        29
2019Fall-CS3643   Valid N (listwise)                        15
2019Fall-CS4203   Valid N (listwise)                        29
2019Fall-CS4223   participation                             27   6.4%      98.3%     79.004%
                  standardized as % of max score possible   27   0.0%      100.0%    61.481%
                  Valid N (listwise)                        27
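The variable "standardized as % of max score possible" shown above follows the rule stated in the Analysis section: divide the raw final exam score by the course maximum of 300 or 400 points, and record a zero for students who did not take the final. A minimal sketch (the function name and example values are ours, not the study's records):

```python
def standardize_final(raw_score, max_points):
    """Final exam score as a percent of possible points.

    `max_points` was 300 or 400 depending on the course; students who
    did not take the final are recorded with a zero score.
    """
    if raw_score is None:  # did not take the final
        return 0.0
    return 100.0 * raw_score / max_points

print(standardize_final(288, 300))   # 96.0
print(standardize_final(None, 400))  # 0.0
```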
Variables Entered/Removed (a)

Model   Variables Entered   Variables Removed   Method
1       participation (b)   .                   Enter

a. Dependent Variable: standardized as % of max score possible
b. All requested variables entered.

Model Summary

Model   R         R Square   Adjusted R Square   Std. Error of the Estimate
1       .805 (a)  .648       .644                18.7685%

a. Predictors: (Constant), participation

ANOVA (a)

Model           Sum of Squares   df   Mean Square   F         Sig.
1  Regression   63501.016        1    63501.016     180.270   .000 (b)
   Residual     34521.074        98   352.256
   Total        98022.090        99

a. Dependent Variable: standardized as % of max score possible
b. Predictors: (Constant), participation

Coefficients (a)

                   Unstandardized Coefficients   Standardized Coefficients                   95.0% Confidence Interval for B
Model              B         Std. Error          Beta                        t        Sig.   Lower Bound   Upper Bound
1  (Constant)      -16.496   6.552                                           -2.518   .013   -29.498       -3.493
   participation   1.094     .081                .805                        13.426   .000   .932          1.255

a. Dependent Variable: standardized as % of max score possible
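The semester-level output above is internally consistent: the R, R Square, and F values follow directly from the printed sums of squares with 1 and 98 degrees of freedom, as this quick check shows.

```python
from math import sqrt

# Sums of squares from the ANOVA table above (df = 1 regression, 98 residual).
ss_regression = 63501.016
ss_residual = 34521.074
ss_total = ss_regression + ss_residual        # 98022.090, as printed

r_square = ss_regression / ss_total           # share of variance explained
r = sqrt(r_square)
f_stat = (ss_regression / 1) / (ss_residual / 98)

print(round(r_square, 3), round(r, 3), round(f_stat, 2))  # 0.648 0.805 180.27
```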
Variables Entered/Removed (a)

course            Model   Variables Entered   Variables Removed   Method
2019Fall-CS3403   1       participation (b)   .                   Enter
2019Fall-CS3643   1       participation (b)   .                   Enter
2019Fall-CS4203   1       participation (b)   .                   Enter
2019Fall-CS4223   1       participation (b)   .                   Enter

a. Dependent Variable: standardized as % of max score possible
b. All requested variables entered.

Model Summary

course            Model   R         R Square   Adjusted R Square   Std. Error of the Estimate
2019Fall-CS3403   1       .845 (a)  .714       .703                17.6498%
2019Fall-CS3643   1       .966 (a)  .933       .928                9.7440%
2019Fall-CS4203   1       .731 (a)  .535       .518                12.4119%
2019Fall-CS4223   1       .682 (a)  .465       .443                26.4172%

a. Predictors: (Constant), participation

ANOVA (a)

course            Model           Sum of Squares   df   Mean Square   F         Sig.
2019Fall-CS3403   1  Regression   20989.648        1    20989.648     67.379    .000 (b)
                     Residual     8410.903         27   311.515
                     Total        29400.552        28
2019Fall-CS3643   1  Regression   17175.696        1    17175.696     180.899   .000 (b)
                     Residual     1234.304         13   94.946
                     Total        18410.000        14
2019Fall-CS4203   1  Regression   4781.484         1    4781.484      31.038    .000 (b)
                     Residual     4159.464         27   154.054
                     Total        8940.948         28
2019Fall-CS4223   1  Regression   15144.015        1    15144.015     21.700    .000 (b)
                     Residual     17446.726        25   697.869
                     Total        32590.741        26

a. Dependent Variable: standardized as % of max score possible
b. Predictors: (Constant), participation

Coefficients (a)

                                     Unstandardized Coefficients   Standardized Coefficients                   95.0% Confidence Interval for B
course            Model              B         Std. Error          Beta                        t        Sig.   Lower Bound   Upper Bound
2019Fall-CS3403   1  (Constant)      -19.653   10.897                                          -1.803   .082   -42.012       2.706
                     participation   1.150     .140                .845                        8.208    .000   .863          1.438
2019Fall-CS3643   1  (Constant)      -5.520    5.253                                           -1.051   .312   -16.868       5.828
                     participation   1.017     .076                .966                        13.450   .000   .854          1.180
2019Fall-CS4203   1  (Constant)      -34.376   20.951                                          -1.641   .112   -77.364       8.613
                     participation   1.341     .241                .731                        5.571    .000   .847          1.835
2019Fall-CS4223   1  (Constant)      -30.553   20.400                                          -1.498   .147   -72.568       11.463
                     participation   1.165     .250                .682                        4.658    .000   .650          1.680

a. Dependent Variable: standardized as % of max score possible
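The coefficients in Appendices B and C are ordinary least squares estimates with a single predictor. For readers who want to reproduce this kind of analysis outside SPSS, the computation can be sketched in plain Python; the data pairs below are made-up illustrative values, not the study's records.

```python
def simple_ols(xs, ys):
    """Slope, intercept, and R Square for y = a + b*x by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                      # unstandardized coefficient
    a = my - b * mx                    # constant
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return b, a, 1 - ss_res / ss_tot   # slope, intercept, R Square

# Illustrative (participation %, standardized final %) pairs -- not study data:
participation = [20, 40, 55, 70, 85, 95, 100]
final = [10, 30, 45, 55, 75, 80, 95]
slope, intercept, r2 = simple_ols(participation, final)
```

A slope near 1, as in the course-level tables above, means a one-point rise in participation tracks about a one-point rise in the standardized final score.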