OBJECTIVE. The purpose of this study is to validate an electronic learning, or e-learning, concept featuring gamification elements, rapid case reading, and instant feedback.
SUBJECTS AND METHODS. An e-learning concept was devised that offered game levels for training in the detection of pneumothorax in 195 cases, with questions read in rapid succession and instant feedback provided for each case. The user's task was to locate the pneumothorax on chest radiographs and indicate its presence by clicking a mouse. The game level design included an entry test consisting of 15 cases, training levels of increasing difficulty that involved 150 cases, and a final test that included 30 cases (the 15 cases from the entry test plus 15 new cases). A total of 126 candidates were invited via e-mail to participate and were asked to complete a survey before and after playing the game, known as RapRad. The level of diagnostic confidence and the error rate before and after playing the game were compared using a Wilcoxon signed rank test.
RESULTS. Fifty-nine of 126 participants (47%) responded to the first survey and finished the game. Of these 59 participants, 29 (49%) responded to the second survey after completing the game. Diagnostic confidence in pneumothorax detection improved significantly after playing RapRad, from a mean (± SD) score of 4.3 ± 2.1 on the entry test to a final score of 7.3 ± 2.1 (p < 0.01), measured on a 10-point scale on which 10 denoted the highest possible score. Of the participants, 93% indicated that they would use the game for learning purposes again, and 87% indicated that they had fun using RapRad (7% had a neutral response and 6% had a negative response). The error rate (i.e., the number of failed attempts to answer a question correctly) decreased significantly, from 39% on the entry test to 22% on the final test (p < 0.01).
CONCLUSION. Our e-learning concept is capable of improving diagnostic confidence, reducing error rates in training pneumothorax detection, and offering fun in interaction with the platform.
education, feedback, medical, pneumothorax, radiography, software, gamification
Technologic progress has improved radiologic imaging and has allowed considerable innovation in methods of teaching in medical education. The Internet and interactive multimedia software have had a crucial impact on the establishment of new learning technologies, serving as a supplement to or replacement for traditional forms of teaching [3, 4]. Scientific work has shown that the application of game elements in nongame contexts (i.e., gamification) can be an effective learning tool and motivator [5–7], especially when applied in the context of radiology education. Apart from theoretic considerations, the goal of every learning process is to gain knowledge. In radiology, knowledge is essentially a product of the number of cases seen and the difficulty of each case read in radiologic practice. However, the various case difficulty levels are not distributed equally in daily practice, and their occurrence is strongly influenced by chance. Taking this into account, we developed an electronic learning, or e-learning, platform known as RapRad to overcome the limited spatial and temporal availability of cases in daily practice. One key element of our e-learning concept was to collect a large number of cases of all levels of difficulty, with the understanding that expertise is best acquired when large numbers of cases of varying difficulty are evaluated. Another central feature was instant feedback on participants' responses: correctly answering questions granted access to higher game levels (i.e., a "level up") and items such as health-restoring tokens, whereas providing multiple wrong answers ended the user's game play (i.e., a "game-over" status). A virtual leaderboard based on experience points and game level status was further implemented to boost the motivation of the user.
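The level-up, token, and game-over mechanics described above can be sketched as a small state machine. The paper does not publish RapRad's implementation, so every name and threshold below is a hypothetical illustration of the described behavior, not the actual code:

```python
# Hypothetical sketch of the RapRad-style game mechanics described in the
# text; class names, point values, and thresholds are illustrative only.
class GameState:
    def __init__(self, health=3, level=1, max_level=10):
        self.health = health      # wrong answers cost one health point each
        self.level = level        # correct answers accumulate toward level-ups
        self.max_level = max_level
        self.xp = 0               # experience points, also shown on leaderboard
        self.tokens = 0           # health-restoring tokens earned by leveling up
        self.game_over = False

    def answer(self, correct):
        """Update state after one case; returns True if a level-up occurred."""
        if self.game_over:
            raise RuntimeError("game over; restart required")
        if correct:
            self.xp += 10
            if self.xp % 50 == 0:  # assumed: every 5 correct answers = level up
                self.level = min(self.level + 1, self.max_level)
                self.tokens += 1   # reward a health-restoring token
                return True
        else:
            self.health -= 1
            if self.health == 0:   # multiple wrong answers end game play
                self.game_over = True
        return False

    def restore_health(self):
        """Spend one token to regain a health point, if any tokens remain."""
        if self.tokens > 0:
            self.tokens -= 1
            self.health += 1
```

The design choice worth noting is that both the reward path (tokens, levels) and the penalty path (health loss, game over) hang off the same instant-feedback event, which is what ties the feedback mechanism to motivation.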
Furthermore, a progress bar was visible at each level, and overall accuracy metrics were provided to track and quantify individual learning progress. A statistics tool was incorporated into the RapRad website to gather and plot those statistics (Fig. 1). This tool was also used to analyze different aspects of the user’s performance. The error rate, defined as the number of failed attempts to answer a question correctly, was recorded for each participant.
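The error-rate metric defined above reduces to failed attempts divided by total attempts. A minimal sketch (the function name is assumed, and the two lists are illustrative sequences constructed to match the reported 39% and 22% rates, not the study's raw data):

```python
def error_rate(attempts):
    """Fraction of failed attempts, where each element of `attempts` is
    True if the question was answered correctly and False otherwise."""
    if not attempts:
        raise ValueError("no attempts recorded")
    failures = sum(1 for ok in attempts if not ok)
    return failures / len(attempts)

# Illustrative entry-test vs. final-test sequences mirroring the paper's
# reported drop from 39% to 22% (synthetic data, not the study's own):
entry_test = [False] * 39 + [True] * 61
final_test = [False] * 22 + [True] * 78
```

A statistics tool of the kind described would compute this per participant and per level to populate the progress plots.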
The purpose of the present study was to evaluate a newly developed e-learning platform called RapRad and to assess its learning efficiency. The goal of RapRad is to train the entire spectrum of a specific abnormality in a standardized environment with rapid case reading, to shorten the path to expertise. Because the job of a radiologist is well suited to emulation by a computer game, e-learning can be adapted especially well to radiologic content. The idea of a game-based e-learning concept led to the implementation of gamification elements designed to boost motivation: gaining and losing health and experience points, accessing higher game levels or triggering an end to game play, and spending experience points on tokens to achieve goals such as restoring health. Because the platform was designed for training and educational purposes and, in this context, for a younger target audience, the fun factor also significantly influenced the development of the e-learning platform, because fun is a prerequisite for continuous use of learning aids. One aim of the present study was to investigate the learning efficacy of our e-learning platform. The results show that, as a subjective parameter, reported diagnostic confidence in detecting pneumothoraces was significantly higher after playing RapRad. We were also able to quantify those results objectively by showing that the error rate was significantly reduced between the entry test and the final test, the latter of which included all questions from the entry test, decreasing from 38.8% to 21.8%. Our results are in concordance with those of various studies. Mohan et al. showed improved triage decision making in a validated virtual simulation after exposure to video games. Mann et al. showed that playing a computer-assisted board game significantly improved the surgical education of the participants.
In a systematic review, Sardi et al. found that gamification enhanced the motivation of the user. Cafazzo et al. developed a mobile health (mHealth) application for the management of type 1 diabetes in adolescents. In a randomized controlled trial, Allam et al. investigated the effectiveness of online social support and gamification on the behavioral and health outcomes of patients with different types of access to online social support and gamification features. The authors were able to show that patients who were offered a gamified experience used the website more often and gained more empowerment than did those who were not offered gamified elements. Another important feature of RapRad is the promotion of friendly competition through the use of virtual leaderboards and in-game statistics. In a study conducted by Nevin et al., leaderboards were the most important motivator of participation. The results of our study further indicate that our e-learning platform offers fun for users and that most users would use RapRad for learning purposes again. This is essential in keeping the user motivated and shows that RapRad has the capability to be an effective learning tool. A total of 62% of the users indicated that their interest in radiology increased after using RapRad. This is important information because it allows us to expand RapRad to nonradiology disciplines and medical students and to attract young, talented individuals to diagnostic radiology.
The present study has several limitations. First, despite showing results comparable to those in the literature, our study had a moderate number of participants. Because we included only participants who finished the game and completed the surveys before and after playing, the final study population may be subject to selection bias from filtering for only the most motivated participants. Second, we focused on pneumothorax only. However, the level editor was built to allow the integration of other abnormalities through a built-in function for labeling each case with hashtags according to the presence or absence of specific abnormalities. Additional training levels focused on different pathologic findings (e.g., bone fractures on radiographs) can be integrated with ease and are planned for the future. Third, we did not compare the results of 1st-year radiology residents with those of medical students and technicians. This comparison might have led to different results (e.g., a potentially lower increase in the diagnostic confidence of staff already familiar with reporting on chest radiographs). However, we think that averaging the results provides a better overview of the target population, because the diagnostic capabilities of 1st-year radiology residents at the beginning of their career and those of medical students and technicians may not differ. Fourth, our case library contained more cases with positive findings for pneumothorax than cases with negative findings, which does not reflect clinical reality. However, this distribution of cases seemed necessary for training purposes and for reaching a certain confidence in detecting the abnormality. Fifth, we did not compare the results of participants in the present study with those of a dedicated control group who did not use RapRad as a supporting learning tool.
The aim of the study was not to show the superiority of our e-learning tool over another, more traditional form of learning but, rather, to show that novel learning concepts may be useful additional tools with which to train the next generation of radiologists.
In conclusion, our RapRad e-learning concept is capable of improving the diagnostic confidence of radiology residents and medical students, reducing error rates, and offering fun in the interaction with the platform.
Full paper: Gamification of Electronic Learning in Radiology Education to Improve Diagnostic Confidence and Reduce Error Rates
American Journal of Roentgenology