Effects of Computer-Based Training in Computer Hardware Servicing
International Journal of Technology-Enabled Student Support Services
Volume 12 • Issue 1
ABSTRACT
This study determined the effects of computer-based training in computer hardware servicing with a
pedagogical agent named “DAC: The Builder” on the academic performance of computing students.
Fifty-six university students (30 students in the control group, 26 students in the experimental group)
participated in a two-week experiment. The majority of the experimental group exhibited gaming
behavior but subsequently reduced it after DAC intervention. The data collected in this study showed
that the null hypothesis stating that there is no significant difference in the pretest and posttest scores
of the experimental group can be rejected. Moreover, the hands-on posttest scores of both groups
had significant differences. This study demonstrates that returning students to the lecture when they
exhibit gaming-the-system behavior effectively discourages this behavior. The use of
DAC is therefore recommended for students taking computer hardware servicing. Implications
and recommendations are also discussed.
Keywords
Computer Hardware Assembly, Gaming the System, Hardware Servicing Skill, Tutoring System
1. INTRODUCTION
Computer hardware servicing is a technical skill where students have to learn computer set building,
computer troubleshooting, software installation, system configuration, and computer maintenance (De
Jesus, 2019). From basic secondary school to computer-related courses in tertiary education, computer
hardware servicing instructions are a fundamental skill in computer education (Hsu & Hwang,
2014). However, there are challenges to learning the course. The difficulty experienced by students
in assembling a computer is not only due to a lack of practice but also to insufficient assistance and
materials (Hwang et al., 2011). For example, to understand the functions of a motherboard, students
need to see a fully functional motherboard. The ideal teaching method for the subject is to allow
the students to use a functional motherboard. However, it will be highly impractical to dismantle
a working computer to show the motherboard. Moreover, providing individualized feedback to all
students will be very tedious and time-consuming (Botarleanu et al., 2018).
One way to address these issues is to employ computer-based training (CBT) software
(subsequently referred to as software) for a computer hardware servicing system (De Jesus, 2019).
However, prior work (e.g., De Jesus, 2019) included neither interventions for gaming
the system (GTS), the deliberate exploitation of a system to obtain correct responses rather than
learning the material (Baker et al., 2008), nor assistance from a pedagogical agent. To address these
gaps, this study was conceived. This study developed software for computer hardware servicing for
computing students (Information Technology, Computer Science, and Information Systems) with a
pedagogical agent capable of detecting GTS. Specifically, the study aims to answer the following
research questions (RQs): 1) What is the software utilization of the students in the experimental group
in terms of the number of lectures taken, time spent on the hands-on activities, number of hands-on
errors, time spent gaming the system, and lesson where GTS was observed? 2) What are the hardware
servicing academic performances of the students in the control and experimental groups in terms of
pretest scores, posttest scores, time spent on the hands-on activities, and number of hands-on errors?
3) Is there a significant difference between the academic performances of the students in terms of
pretest scores, posttest scores, time spent on the hands-on exercises, and number of hands-on errors
in the control and experimental groups?
The following null hypotheses were tested in this study:
H0a: There is no significant difference in the pretest scores of the experimental and control groups.
H0b: There is no significant difference in the posttest scores of the experimental and control groups.
H0c: There is no significant difference in the time spent on the hands-on activities of the experimental
and control groups.
H0d: There is no significant difference in the number of hands-on errors committed by the experimental
and control groups.
H0e: There is no significant difference in the pretest and posttest scores of the students in the control group.
H0f: There is no significant difference in the pretest and posttest scores of the students in the
experimental group.
2. LITERATURE REVIEW
CBT was also employed in mathematics learning. For example, Mousa and Molnár (2020)
determined whether CBT in math improves the inductive reasoning of 9 to 11-year-old children. Their
study found evidence to support the conclusion that the experimental group (those who underwent
CBT) had higher posttest scores than the control group. In a recent study, Zwart et al. (2021)
utilized CBT for training nursing students in professional duties that included mathematical tasks
associated with medication processes. The CBT system included mathematical medication scenarios
and basic arithmetic exercises that could support mathematical medication learning. Data gathered
from 118 participants showed that the CBT improved the mathematical memorization of all students.
De Jesus (2019) conducted a similar study on computer hardware servicing, the study most
closely related to the current one. De Jesus (2019) developed a CBT named “Computer
Hardware Servicing and Maintenance Trainer” (CHSM Trainer). The CHSM Trainer reduced the
time spent practicing interfaces and troubleshooting, and the software received a very satisfactory
subjective evaluation from the students. However, it could neither detect gaming behaviors
nor provide the functionalities of a pedagogical agent.
Students exhibit GTS for various reasons. Some want to see the reaction
of the pedagogical agent (PA; Rodrigo et al., 2012), while others are genuinely stuck on the activity
(Beck & Rodrigo, 2014). In a classroom setting, teachers or tutors provide interventions to help
students move forward with the lesson. One of these strategies is to repeat the lecture. For example,
Bringula et al. (2020) reported that videos were the preferred teaching materials since students
could repeat the lectures.
3. METHODOLOGY
Figure 2. DAC: The Builder assisting the student to build a virtual computer
the activity (Beck & Rodrigo, 2014). Students would only be allowed to retake the exercise after
reviewing the lesson (Figure 3). This strategy was based on the study of Bringula et al. (2020).
Figure 3. DAC re-directs the student to the tutorial after it detected GTS
A class section was randomly assigned (R) either to an experimental or control group. The
experimental group consisted of 26 students, while the control group consisted of 30 students (Figure
4). The average age of the participants in both groups was 20 years. The majority of the participants
in both groups were male: 22 in the experimental group and 19 in the control
group. Most of the students in the experimental group were third-year students (n
= 19), while most in the control group were second-year students (n = 15).
Both sets of students took a pretest (O) before the intervention period (X). The whole experiment
lasted for two weeks. The intervention period lasted for four non-consecutive days (i.e., two class
sessions within a week, and each session lasted for 1.5 hours). Afterward, a posttest (O) was
administered to both groups.
4. RESULTS
Table 2. Mann-Whitney U Test on the Hardware Servicing Performance of the Students in the Experimental (n = 26) and
Control (n = 30) Groups
not significantly different (U = 372.0, p > 0.05). The first (H0a) and second (H0b) null hypotheses
are therefore not rejected.
The Mann-Whitney U test also revealed significant differences in software utilization between
the control and experimental groups: the experimental group spent less time (U =
234.00, p < 0.05) and committed fewer errors during the activities (U = 247.00, p < 0.05). Hence,
the third (H0c) and fourth (H0d) null hypotheses are both rejected.
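The group comparisons above rely on the Mann-Whitney U statistic. As a minimal sketch of how such a comparison works, assuming hypothetical time-on-task values (not the study's data) and a normal approximation for the p-value, U can be computed by counting, over all cross-group pairs, how often a value in one group exceeds a value in the other:

```python
import math

def mann_whitney_u(a, b):
    """Mann-Whitney U with a normal approximation for the p-value.

    Counts, over all (x, y) pairs, how often a value in `a` exceeds
    one in `b` (ties count as half). Suitable for moderate samples
    without many ties.
    """
    n1, n2 = len(a), len(b)
    u = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)
    mean_u = n1 * n2 / 2.0
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mean_u) / sd_u
    # Two-sided p-value from the standard normal distribution.
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u, p

# Hypothetical time-on-task values in minutes; illustrative only.
experimental = [12, 15, 11, 14, 10, 13, 12, 11]
control = [18, 21, 17, 19, 22, 20, 18, 23]

u_stat, p_value = mann_whitney_u(experimental, control)
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```

A small U (here every experimental value is below every control value) together with a small p-value indicates the groups differ, which is the pattern the time-spent and error-count results above report.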
Meanwhile, the Wilcoxon signed-rank tests were conducted on the pretest and posttest scores
of the groups (Table 3). In the control group, there is almost an equal number of negative (n = 14)
and positive (n = 15) ranks. Moreover, the mean negative rank is 12.71 and the mean positive rank
is 17.13. The sum of the positive ranks (s = 279.50) is higher than the sum of the negative ranks (s
= 178.00). However, the difference between the mean ranks of the pretest and posttest scores in the
control group is not significant (Z = 0.855, p > 0.05). Therefore, the null hypothesis (H0e)
that there is no significant difference between the pretest and posttest scores of the control group is not rejected.
In the experimental group, there are more positive ranks (n = 20) than negative ranks (n = 5).
Consequently, the mean positive rank (M = 13.98) is higher than the mean negative rank (M = 9.10).
The sum of the ranks further shows the discrepancy between the positive (s = 270.50) and negative
(s = 45.50) ranks. The difference between the mean rank of the scores was found to be significant (Z
= -3.194, p < 0.05). Hence, the posttest scores are higher than the pretest scores of the experimental
Table 3. Wilcoxon Signed-Rank Tests on the Hardware Servicing Performance of the Students between their Pretest and
Posttest Scores
group. Consequently, the null hypothesis (H0f) that there is no significant difference between
the pretest and posttest scores of the experimental group is rejected.
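The paired pretest/posttest comparisons above use the Wilcoxon signed-rank procedure. The following is an illustrative sketch with hypothetical paired scores (not the study's data): differences are ranked by absolute size with zeros dropped and ties given average ranks, and the sum of positive ranks is compared with its expectation under the null hypothesis of no change, using a normal approximation:

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test with a normal approximation.

    Ranks the absolute pre/post differences (zeros dropped, ties get
    average ranks), then compares the sum of positive ranks with its
    expectation under the null hypothesis of no change.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    ranked = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign average ranks to runs of tied |differences|
        j = i
        while j + 1 < n and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]]):
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[ranked[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean_w = n * (n + 1) / 4.0
    sd_w = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_plus - mean_w) / sd_w
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return w_plus, z, p

# Hypothetical paired scores where most students improve; illustrative only.
pre = [10, 12, 9, 11, 8, 13, 10, 9, 12, 11]
post = [15, 14, 13, 16, 12, 13, 14, 15, 16, 13]

w_plus, z, p = wilcoxon_signed_rank(pre, post)
print(f"W+ = {w_plus:.1f}, Z = {z:.2f}, p = {p:.4f}")
```

When nearly all differences are positive, as in the experimental group's results, the sum of positive ranks sits far above its null expectation and the test rejects the no-change hypothesis.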
5. DISCUSSION
This study determined the impact of a CBT on the computer hardware servicing skills of college
students. Towards this goal, the academic performances of the students in the experimental and
control groups were compared. Moreover, the software utilization of the experimental group was
investigated. The experimental group had better software utilization than the control group in terms
of the average number of hands-on activities completed, average time spent, and number of hands-on
errors committed: it covered more hands-on activities, and demonstrated its knowledge more accurately
and quickly, than the control group. These findings agree with the study by De Jesus (2019). The favorable software
utilization of the experimental group can be attributed to the students’ familiarity with the software.
The software is indeed able to assist the students’ learning of computer hardware servicing at their
own pace. Nevertheless, despite the lack of familiarity with the system, the control group was able to
complete seven hands-on activities.
Consistent with the literature, students in this study also exhibited GTS behavior. In the context
of this study, students attempted to fit the parts of a computer into the different computer slots.
GTS was exhibited within 4.03 seconds. As a result, students responded to the activities passively. The
majority of the GTS was logged in the 13th activity. Perhaps students were attempting to finish all
the lessons quickly.
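A simple way to operationalize rapid-response GTS detection of this kind is to flag attempts answered faster than a time threshold. The sketch below is a hypothetical illustration, not the authors' implementation; only the 4.03-second figure comes from the study, and the log format and helper function are assumptions:

```python
# Hypothetical sketch of flagging rapid responses as possible gaming
# behavior. Only the 4.03-second threshold comes from the study; the
# event format and helper are illustrative assumptions.
RAPID_RESPONSE_THRESHOLD = 4.03  # seconds

def flag_gaming_attempts(attempts, threshold=RAPID_RESPONSE_THRESHOLD):
    """Return attempts answered faster than the threshold.

    Each attempt is a (student_id, activity_id, seconds_to_respond) tuple.
    """
    return [a for a in attempts if a[2] < threshold]

# Illustrative interaction log.
log = [
    ("s01", 13, 2.1),   # rapid: possible GTS
    ("s01", 13, 3.9),   # rapid: possible GTS
    ("s02", 7, 12.5),   # deliberate attempt
    ("s03", 13, 1.4),   # rapid: possible GTS
]
flagged = flag_gaming_attempts(log)
print(f"{len(flagged)} of {len(log)} attempts flagged as possible GTS")
```

In a system like DAC, such flagged attempts would be the trigger for redirecting the student back to the lesson.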
In the first case, more than half of the students displayed GTS. After the intervention of DAC,
there was a significant reduction in GTS. Therefore, the combination of returning the students to the
lecture, textual feedback, and a neutral facial expression is an effective way to prevent this student
behavior. However, three students persisted in their GTS behavior. It is unclear why
these students continued this behavior despite taking more time to re-learn the lesson. Future research
is necessary to shed light on this phenomenon.
The Mann-Whitney U test on the pretest scores of both groups showed no significant difference.
Thus, the prior knowledge of the students in the course is similar. In other words, when the study was
conducted, they had the same levels of understanding of the lesson. At the end of the intervention
period, the differences in their posttest scores were not statistically significant. This means that neither
the traditional lecture setting nor the experimental setting outperformed the other.
The Wilcoxon signed-rank tests provided another insight into the group’s academic performance in
hardware servicing. For the control group, the rank of the pretest and posttest scores was not statistically
different. This finding suggests traditional lectures could not increase the students’ scores to a large
extent. Meanwhile, the experimental group had a different result: students who used the software have
the potential to improve their scores significantly. However, as shown in the previous statistical test,
the scores of the students in the experimental group did not exceed the scores of the control group.
This study contributed to the existing threads of discussion on preventing GTS and on the field of
CBT in general. In prior studies, preventing GTS was focused on reprimanding or reminding students
about their usage behavior (Arroyo et al., 2007; Baker et al., 2006; Nunes et al., 2016; Roll et al.,
2007; Walonoski & Heffernan, 2006). While these strategies have been proven effective in reducing
GTS, they may lack pedagogical value. In this current study, the response of the PA was based on the
assumption that students exhibit GTS because of a lack of skills. Consistent with the study of Chen
et al. (2012), students need to repeat the lesson as a more definitive course of action. The gaming
behavior intervention employed in this study, as shown in the findings, significantly reduced the
number of students who exhibited this behavior.
Furthermore, this study offers practical implications. Considering the positive outcomes of the
experiment, the use of the software is encouraged. The software may also be utilized as supplemental
material for students. Specifically, at-risk, struggling, or absentee students may use the software to
catch up with the course content. CBT researchers may also consider redirecting the students to their
lessons as a way to deter GTS.
This study determined the students’ utilization of a CBT software named “DAC” and its impact
on their academic performance. The experimental group had a more favorable use of the software
compared to the control group. However, this is mainly attributed to familiarity with the system. The
experimental group exhibited GTS. This behavior was significantly reduced after DAC intervention.
Hence, redirecting the students to retake the lesson is an effective way to deter GTS.
The study did not find evidence to reject the first, second, and fifth null hypotheses. However,
the third, fourth, and sixth hypotheses were rejected. Three conclusions can be derived from this
finding. First, students can learn both in traditional and experimental settings. Second, the students
in both conditions could not outperform each other. In other words, after each intervention, it can
be expected that their scores will be the same. Lastly, the software can assist students in catching up
with their peers.
Despite the promising results, the study has several limitations that are worth further
investigation. Every intervention has limitations, and the strategy employed in this study is no
exception. It remains unclear whether students will find ways to avoid detection of their GTS behavior. Incorporation
of other intervention strategies in the system is suggested to determine the relative impact of these
strategies. Lastly, the software only covered the hardware servicing of desktop computers. Thus,
laptop servicing may be incorporated into future research.
REFERENCES
Arroyo, I., Ferguson, K., Johns, J., Dragon, T., Meheranian, H., Fisher, D., & Woolf, B. P. (2007). Repairing
disengagement with non-invasive interventions. Artificial Intelligence in Education, 2007, 195–202.
Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., & Koedinger, K. (2008). Why students engage in “gaming
the system” behavior in interactive learning environments. Journal of Interactive Learning Research, 19(2), 185–224.
Baker, R. S., Corbett, A. T., Koedinger, K. R., Evenson, S., Roll, I., Wagner, A. Z., & Beck, J. E. (2006).
Adapting to when students game an intelligent tutoring system. In International conference on intelligent tutoring
systems (pp. 392-401). Springer. doi:10.1007/11774303_39
Baker, R. S., Mitrović, A., & Mathews, M. (2010). Detecting gaming the system in constraint-based tutors.
In International Conference on User Modeling, Adaptation, and Personalization (pp. 267-278). Springer.
doi:10.1007/978-3-642-13470-8_25
Beck, J., & Rodrigo, M. M. T. (2014). Understanding wheel spinning in the context of affective factors. In
International conference on intelligent tutoring systems (pp. 162-167). Springer. doi:10.1007/978-3-319-07221-0_20
Bedwell, W. L., & Salas, E. (2010). Computer‐based training: Capitalizing on lessons learned. International
Journal of Training and Development, 14(3), 239–249.
Bimba, A. T., Idris, N., Al-Hunaiyyan, A., Mahmud, R. B., & Shuib, N. L. B. M. (2017). Adaptive
feedback in computer-based learning environments: A review. Adaptive Behavior, 25(5), 217–234.
doi:10.1177/1059712317727590
Botarleanu, R. M., Dascalu, M., Sirbu, M. D., Crossley, S. A., & Trausan-Matu, S. (2018). ReadME–Generating
personalized feedback for essay writing using the ReaderBench framework. In H. Knoche, E. Popescu, & A.
Cartelli (Eds), Conference on Smart Learning Ecosystems and Regional Development (pp. 133-145). Springer.
Bringula, R., De Leon, J. S., Rayala, K. J., Pascual, B. A., & Sendino, K. (2017). Effects of different types of feedback
of a mobile-assisted learning application and motivation towards mathematics learning on students’ mathematics
performance. International Journal of Web Information Systems, 13(3), 241–259. doi:10.1108/IJWIS-03-2017-0017
Bringula, R., Fosgate, I. C., Yorobe, J. L., & Garcia, N. P. (2020). Exploring the Sequences of Synthetic Facial
Expressions and Type of Problems Solved in a Personal Instructing Agent using Lag Sequential Analysis. In
2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE) (pp. 764-
769). IEEE. doi:10.1109/TALE48869.2020.9368492
Bringula, R. P., Fosgate, I. C. O. Jr, Garcia, N. P. R., & Yorobe, J. L. M. (2018). Effects of pedagogical agents
on students’ mathematics performance: A comparison between two versions. Journal of Educational Computing
Research, 56(5), 701–722. doi:10.1177/0735633117722494
De Jesus, A. N. B. (2019). Computer hardware servicing and maintenance trainer. https://ssrn.com/abstract=3448885
Dinçer, S., & Doğanay, A. (2017). The effects of multiple-pedagogical agents on learners’ academic success,
motivation, and cognitive load. Computers & Education, 111, 74–100. doi:10.1016/j.compedu.2017.04.005
Ecalle, J., Vidalenc, J. L., & Magnan, A. (2020). Computer-based Training Programs to Stimulate Learning to
Read in French for Newcomer Migrant Children: A Pilot Study. Journal of Educational Cultural and Psychological
Studies, (22), 23–47. doi:10.7358/ecps-2020-022-ecal
Guo, H., Rios, J. A., Haberman, S., Liu, O. L., Wang, J., & Paek, I. (2016). A new procedure for detection of
students’ rapid guessing responses using response time. Applied Measurement in Education, 29(3), 173–183.
doi:10.1080/08957347.2016.1171766
Hsu, C. K., & Hwang, G. J. (2014). A context-aware ubiquitous learning approach for providing instant learning
support in personal computer assembly activities. Interactive Learning Environments, 22(6), 687–703. doi:10
.1080/10494820.2012.745425
Hwang, G. J., Wu, C. H., Tseng, J. C. R., & Huang, I. (2011). Development of a ubiquitous learning platform
based on a real-time help-seeking mechanism. British Journal of Educational Technology, 42(6), 992–1002.
doi:10.1111/j.1467-8535.2010.01123.x
Kim, Y., & Baylor, A. L. (2016). Research-based design of pedagogical agent roles: A review, progress, and
recommendations. International Journal of Artificial Intelligence in Education, 26(1), 160–169. doi:10.1007/
s40593-015-0055-y
Klimova, B. (2021). Are There Any Cognitive Benefits of Computer-Based Foreign Language Training for
Healthy Elderly People?–A Mini-Review. Frontiers in Psychology, 11, 573287. doi:10.3389/fpsyg.2020.573287
PMID:33584410
Kraemer, E. E., Davies, S. C., Arndt, K. J., & Hunley, S. (2012). A comparison of the Mystery Motivator and the
Get’Em On Task interventions for off‐task behaviors. Psychology in the Schools, 49(2), 163–175. doi:10.1002/
pits.20627
Lane, H. C., & Schroeder, N. L. (2022). Pedagogical agents. In B. Lugrin, C. Pelachaud, & D. Traum (Eds.), The
Handbook on Socially Interactive Agents: 20 years of Research on Embodied Conversational Agents, Intelligent
Virtual Agents, and Social Robotics Volume 2: Interactivity, Platforms, Application (pp. 307-330). Association
of Computing Machinery. doi:10.1145/3563659.3563669
Li, J., Kizilcec, R., Bailenson, J., & Ju, W. (2016). Social robots and virtual agents as lecturers for video instruction.
Computers in Human Behavior, 55, 1222–1230. doi:10.1016/j.chb.2015.04.005
Martha, A. S. D., & Santoso, H. B. (2019). The design and impact of the pedagogical agent: A systematic
literature review. Journal of Educators Online, 16(1), n1. doi:10.9743/jeo.2019.16.1.8
Mohammadhasani, N., Fardanesh, H., Hatami, J., Mozayani, N., & Fabio, R. A. (2018). The pedagogical agent
enhances mathematics learning in ADHD students. Education and Information Technologies, 23(6), 2299–2308.
doi:10.1007/s10639-018-9710-x
Mousa, M., & Molnár, G. (2020). Computer-based training in math improves inductive reasoning of 9-to 11-year-
old children. Thinking Skills and Creativity, 37, 100687. doi:10.1016/j.tsc.2020.100687
Nunes, T. M., Bittencourt, I. I., Isotani, S., & Jaques, P. A. (2016). Discouraging gaming the system through
interventions of an animated pedagogical agent. In European Conference on Technology Enhanced Learning
(pp. 139-151). Springer. doi:10.1007/978-3-319-45153-4_11
Oduma, C. A., Onyema, L. N., & Akiti, N. (2019). E-learning platforms in business education for skill acquisition.
Nigerian Journal of Business Education, 6(2), 104–112.
Price, T. W., Zhi, R., & Barnes, T. (2017, June). Hint generation under uncertainty: The effect of hint quality
on help-seeking behavior. In E. André, R. Baker, X. Hu, M. Rodrigo, & B. du Boulay (Eds.), International
conference on artificial intelligence in education (pp. 311-322). Springer. doi:10.1007/978-3-319-61425-0_26
Rodrigo, M. M. T., Baker, R. S., Agapito, J., Nabos, J., Repalam, M. C., Reyes, S. S., & San Pedro, M. O. C.
(2012). The effects of an interactive software agent on student affective dynamics while using; an intelligent
tutoring system. IEEE Transactions on Affective Computing, 3(2), 224–236. doi:10.1109/T-AFFC.2011.41
Roll, I., Aleven, V., McLaren, B. M., & Koedinger, K. R. (2007, June). Can Help-Seeking Be Tutored? Searching
for the Secret Sauce of Metacognitive Tutoring. Artificial Intelligence in Education, 2007, 203–210.
Verginis, I., Gouli, E., Gogoulou, A., & Grigoriadou, M. (2011). Guiding learners into re-engagement through
the SCALE environment: An empirical study. IEEE Transactions on Learning Technologies, 4(3), 275–290.
doi:10.1109/TLT.2011.20
Walonoski, J. A., & Heffernan, N. T. (2006). Detection and analysis of off-task gaming behavior in intelligent
tutoring systems. In International Conference on Intelligent Tutoring Systems (pp. 382-391). Springer.
doi:10.1007/11774303_38
Zwart, D. P., Goei, S. L., Noroozi, O., & Van Luit, J. E. (2021). The effects of computer-based virtual learning
environments on nursing students’ mathematical learning in medication processes. Research and Practice in
Technology Enhanced Learning, 16(1), 1–21. doi:10.1186/s41039-021-00147-x
Rex P. Bringula is a professor at the University of the East (UE) College of Computer Studies and Systems. He
received his BS Computer Science degree from UE as a Department of Science and Technology scholar. He
received his Master’s in Information Technology and Ph.D. in Technology Management at the Technological University
of the Philippines. He is active in conducting school- and government-funded research projects and in participating
in local and international conferences. His research interests are in computer science/IT education, affective
computing, Internet studies, cyber-behavior, web usability, and environmental issues.
John Vincent Canseco graduated from the University of the East, Manila, Philippines.
Patricia Louise J. Durolfo was a student at the University of the East, Manila, Philippines.
Lance Christian Villanueva was a student at the University of the East, Manila, Philippines.
Gabriel M. Caraos was a student at the University of the East, Manila, Philippines.