Comparability of computer based and paper & pencil test on students’ performance in educational assessment course at the Federal University Gusau, Nigeria


Lukman Adaramaja Sheu
Victor O. Evanero

Abstract

Introduction: The Computer-Based Test (CBT) is being adopted for assessment by many institutions in Nigeria owing to growth in student enrolment, expanding work demands on academic staff, and advances in Information and Communication Technology (ICT).


Purpose: This study investigated the comparability of the Computer-Based Test (CBT) and the Paper-and-Pencil Test (PPT) with respect to students' scores in an educational assessment course at Federal University Gusau, Zamfara State.


Methodology: The study adopted a repeated measures design. The population comprised all undergraduate students of Federal University Gusau; the target population was the 450 registered 300-level undergraduate students of the Faculty of Education during the 2021/2022 academic session, all of whom were purposively selected for the study. Two instruments were used for data collection: the Multiple Choice Test in Test and Measurement (MCTTM), administered in both PPT and CBT modes, which had an acceptable content validity coefficient of 0.69 and a split-half reliability coefficient of 0.81; and the ICT Competence Questionnaire, developed to obtain information on students' competence in and attitude to ICT, whose sections had test-retest reliability coefficients of 0.78 and 0.81 respectively. Data collected were analyzed using descriptive and inferential statistics, and the hypotheses formulated for the study were tested at the 0.05 alpha level of significance.
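The split-half reliability reported for the MCTTM can be illustrated with a short sketch. This is a hypothetical computation on made-up item responses, not the study's data: the test is split into odd- and even-numbered items, the two half-scores are correlated, and the Spearman-Brown formula steps the correlation up to full-test length.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_responses):
    """Odd/even split-half reliability with Spearman-Brown correction.

    `item_responses`: one list per student, each entry 1 (correct) or
    0 (incorrect) for a multiple-choice item.
    """
    odd_half = [sum(row[0::2]) for row in item_responses]   # items 1, 3, 5, ...
    even_half = [sum(row[1::2]) for row in item_responses]  # items 2, 4, 6, ...
    r_half = pearson(odd_half, even_half)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown step-up

# Hypothetical responses of five students to a four-item test.
responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(round(split_half_reliability(responses), 2))  # 0.88
```

A coefficient of 0.81, as obtained for the MCTTM, would indicate acceptable internal consistency by the usual 0.70 rule of thumb.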


Results: The findings revealed that Federal University Gusau undergraduate students have little competence in ICT. There was a significant difference between students' CBT and PPT scores in the educational assessment course, in favour of PPT with a mean score of 48.72. Gender had no significant effect on students' scores in either testing mode. In addition, significant relationships exist among students' ICT competence, their attitude to ICT, and their performance in CBT.
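Because the design is repeated measures (each student sits both modes), the comparison of CBT and PPT means reduces to a paired-samples t-test on each student's score difference. A minimal sketch with invented scores for five hypothetical students, not the study's data:

```python
import math

def paired_t(scores_a, scores_b):
    """Paired-samples t statistic and degrees of freedom.

    Each student contributes one score under each testing mode;
    the test is run on the per-student differences.
    """
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Invented PPT and CBT scores for five hypothetical students.
ppt = [50, 47, 52, 45, 49]
cbt = [49, 45, 49, 41, 44]
t, df = paired_t(ppt, cbt)
print(round(t, 2), df)  # 4.24 4
```

The obtained t would then be compared with the critical value at the 0.05 alpha level with n-1 degrees of freedom, as in the study's hypothesis testing.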


Recommendation: The study therefore recommended that educators encourage the use of adequate ICT facilities for teaching and learning. This will not only motivate learners but also prepare them for CBT.


How to Cite
Sheu, L. A., & Evanero, V. O. (2022). Comparability of computer based and paper & pencil test on students' performance in educational assessment course at the Federal University Gusau, Nigeria. Journal of Educational Research in Developing Areas, 3(2), 114-126. https://doi.org/10.47434/JEREDA.3.2.2022.114

References

  1. Adekunle, E.A. (2015). Computer-Based Test-Key to Integrity of Examinations: JAMB’s Experience. A paper presented at the 17th Annual National Conference of Association of Educational Researchers and Evaluators of Nigeria at University of Ibadan on 12th – 17th July, 2015.
  2. American Educational Research Association (1999). Standards for educational and psychological testing. American Psychological Association.
  3. Baumgartner, T.A., Strong, C.H., & Hensley, L.D. (2002). Conducting and reading research in health and human performance (3rd Edition). McGraw-Hill Higher Education.
  4. Bennett, R. E. (2001). How the internet will help large-scale assessment reinvent itself. Education Policy Analysis Archives, 9(5), 1-23. http://epaa.asu.edu/epaa/v9n5.html
  5. Bennett, R. E., Braswell, J., Oranje, A., Sandene, B., Kaplan, B., & Yan, F. (2008). Does it matter if I take my mathematics test on computer? A second empirical study of mode effects in NAEP. Journal of Technology, Learning and Assessment, 6(9), 1-39. http://www.jtla.org.
  6. Clariana, R. B., & Wallace, P. (2002). Paper-based versus computer-based assessment: Key factors associated with the test mode effect. British Journal of Educational Technology, 33(5), 593-602.
  7. Flaughter, R. (1990). Computerized adaptive testing. http://www.time.com/time/nation/aricle/0.8599,19470/9,00.html.
  8. Gallagher, A., Bridgeman, B., & Cahalan, C. (2000). The effect of computer based tests on racial/ethnic, gender, and language groups (GRE Board Professional Report No. 96-21P). Education Testing Service.
  9. Jimoh, R. G., Abduljaleel, K. S., & Kawu, Y. K. (2012). Students' perception of computer based test (CBT) for examining undergraduate Chemistry courses. Journal of Emerging Trends in Computing and Information Sciences, 3(2), 125-134.
  10. Johnson, M., & Green, S. (2004). On-line assessment: The impact of mode on student performance. Paper presented at the British Educational Research Association Annual Conference, Manchester, UK.
  11. Kersaint, G., Horton, B., Stohl, H., & Garofalo, J. (2003). Technology beliefs and practices of Mathematics Education Faculty. Journal of Technology and Teacher Education, 11(4), 549-577.
  12. Khoshsima, M. H., Hosseini, S., & Hashemi, A. (2017). Cross-mode comparability of computer-based testing (CBT) versus paper-pencil based testing (PPT): An investigation of testing administration mode among Iranian intermediate EFL learners. English Language Teaching, 10(2), 64-72.
  13. Lambert, M. (1991). Effects of computer use during coursework on computer aversion. Computers in Human Behaviour, 7, 319-331.
  14. Mazzeo, J., & Harvey, L. A. (1999). The equivalence of scores from automated and conventional educational and psychological tests: A review of the literature (Report No. CBR 87-8, ETS RR 88-21). Educational Testing Service.
  15. Mead, A. D., & Drasgow, F. (1993). Equivalence of computerized and paper-and-pencil cognitive ability tests: A meta-analysis. Psychological Bulletin, 114, 449-458.
  16. National Assessment of Educational Progress. (2017). Student survey questionnaires: Computer Access and Familiarity Study Grades 4 & 8. https://nces.ed.gov/nationsreportcard/subject/field_pubs/sqb/pdf/2017_sq_computer_access_familiarity.pdf.
  17. Nwankwo, O.C. (2010). Practical guide to research writing (3rd Edition). Golden Publishers.
  18. Okoli, C. E., Ubangha, M. B., & Egberongbe, O. A. (2018). Impact of computer-based testing modes on academic achievement among senior secondary school students in Abuja, Nigeria. International Journal of Educational Research, 5(1), 23-32.
  19. Onuekwusi, C.N., & Onuekwusi, N.C. (2010). The dawn of e-examination in Nigeria. Issues and challenges. Paper presented at Annual Conference of Education, Alvan Ikoku Federal College of Education, Owerri.
  20. Onunkwo, G. I. N. (2002). Fundamentals of educational measurement and evaluation. Cape Publishers International Ltd.
  21. Parshall, C., Spray, J., Kalohn, J. & Davey, T. (2002). Practical considerations in computer-based testing. Springer.
  22. Pelgrum, W. J. (2001). Obstacles to the integration of ICT in education: Results from a worldwide educational assessment. Computers & Education, 37, 163-178.
  23. Pomplun, M., & Custer, M. (2005). The score comparability of computerized and paper-and-pencil formats for K-3 reading tests. Journal of Educational Computing Research, 32 (2), 153-166.
  24. Puhan, G., Boughton, K., & Kim, S. (2007). Examining differences in examinee performance in paper and pencil and computerized testing. Journal of Technology, Learning and Assessment, 6(3), 1-21. http://www.jtla.org.
  25. Russell, M., & O'Connor, K. (2003). Computer-based testing and validity: A look back and into the future. inTASC Publication. http://escholarship.bc.edu/intasc/4.
  26. Russell, M. (2003). Testing on computers: A follow-up study comparing performance on computer and on paper. Education Policy Analysis Archives, 7(20), 2-47. http://epaa.asu.edu/epaa/v7n20.
  27. Sheu, A.L. (2019). Test mode effect on students’ scores in an educational assessment course at the University of Ilorin, Nigeria. Journal of Humanity and Education (JOHE), 4(1), 130-147.
  28. Sorana-Daniela, B., & Lorentz, J. (2007). Computer-based testing on physical chemistry topic: A case study. International Journal of Education and Development using Information and Communication Technology, 3(1), 94-95.
  29. Taylor, C., Jamieson, J., Eignor, D. R., & Kirsch, I. (1998). The relationship between computer familiarity and performance on computer-based TOEFL test tasks (ETS RR-98-08). ETS.
  30. Tella, A., & Bashorun, M.T. (2012). Attitude of undergraduate students towards computer-based test (CBT): A Case Study of the University of Ilorin. Information and Communication Technology Education, 8(2), 33-45.
  31. Varughese, J. A. (2005). Testing, testing. University Business, 8(4), 59-78.
  32. Wang, H., & Shin, C. D. (2010). Comparability of computerized adaptive and paper-and-pencil tests. Test, Measurement and Research Service Bulletins, 1(3), 1-7.
  33. Watson, B. (2001). Key factors affecting conceptual gains from CAL. British Journal of Educational Technology, 32(5), 587-593.
  34. Wise, S.L., & Plake, B.S. (1990). Computer-based testing in higher education. Measurement and Evaluation in Counseling and Development, 23(1), 3-10.
  35. Woodrow, J.E. (1992). The influence of programming training on the computer literacy and attitudes of pre-service teachers. Journal of Research on Computing in Education, 25(2), 200-219.