Assessment of Mathematics multiple-choice items using item response theory model of junior secondary school three students in Akwa Ibom state, Nigeria


Victor Eyo Essien
Eme Orok Iban Amanso
Uwaoma Flora Ogba
Iniobong Boniface Williams
Ruth Usene Solomon

Abstract

Introduction: Assessment is an integral part of the educational system, whether at the primary, secondary, or tertiary level. The quality of each item making up a test, the testing procedures, and the other psychometric attributes of the test are of great importance in the entire testing process for measuring and ascertaining learning outcomes.


Purpose: The study assessed the Mathematics multiple-choice items of junior secondary school three (JSS3) students in Akwa Ibom State, Nigeria, using an item response theory model.


Methodology: The study adopted a survey research design. A multi-stage sampling technique was employed to select the students' scripts, combining systematic, stratified random, and purposive sampling techniques. Two objectives and two corresponding research questions guided the study. The instrument was the 2021 Basic Education Certificate Examination (BECE) Mathematics answer scripts of 3,456 JSS3 examinees obtained from 18 Local Government Areas in Akwa Ibom State, Nigeria. The students' responses from the answer scripts were analyzed using BILOG-MG 3 statistical software to estimate the item difficulty and discrimination indices of the 60 BECE Mathematics test items.
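The difficulty and discrimination estimates produced by BILOG-MG come from a logistic item response model. A minimal sketch of the two-parameter logistic (2PL) item characteristic curve is shown below; the parameter values are illustrative only and are not taken from the study.

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) item characteristic curve:
    probability that an examinee with ability theta answers the
    item correctly, given discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item's difficulty (theta == b)
# has a 50% chance of answering correctly, regardless of a.
p = p_correct_2pl(theta=0.0, a=1.2, b=0.0)  # 0.5
```

Under this model, the b parameter locates the item on the ability scale (higher b means a harder item), while the a parameter controls how sharply the item separates examinees below and above that location.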


Results: The findings revealed that 48 items (80%) were classified as easy, 7 items (11.7%) as average, and only 5 items (8.3%) as difficult. Additionally, only seven items (11.7%) provided a perfect discrimination index among high-, average-, and low-ability test-takers.
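The easy/average/difficult classification above can be reproduced by bucketing each item's estimated difficulty (b) parameter against cut-off values. The sketch below uses hypothetical thresholds and item estimates for illustration; the study's actual cut-offs and parameter values are not given here.

```python
def classify_difficulty(b, easy_max=-0.5, hard_min=0.5):
    """Bucket an IRT difficulty (b) estimate into a verbal category.
    The thresholds are illustrative assumptions, not the study's cut-offs."""
    if b < easy_max:
        return "easy"
    if b > hard_min:
        return "difficult"
    return "average"

# Hypothetical difficulty estimates for six items
items = [-1.2, -0.8, 0.1, 0.7, -2.0, 0.4]

counts = {}
for b in items:
    label = classify_difficulty(b)
    counts[label] = counts.get(label, 0) + 1
# counts -> {'easy': 3, 'average': 2, 'difficult': 1}
```

A similar bucketing over the a (discrimination) parameter would yield the count of items that separate high-, average-, and low-ability examinees.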


Conclusion and Recommendations: The difficulty of the 2021 Basic Education Certificate Examination Mathematics test items was adjudged inappropriate, and the items failed to accurately differentiate among test-takers' abilities. It was recommended that examination bodies, test developers, and teachers utilize modern procedures in the item development and selection process.


How to Cite
Essien, V. E., Amanso, E. O. I., Ogba, U. F., Williams, I. B., & Solomon, R. U. (2024). Assessment of Mathematics multiple-choice items using item response theory model of junior secondary school three students in Akwa Ibom state, Nigeria. Journal of Educational Research in Developing Areas, 5(1), 99-107. https://doi.org/10.47434/JEREDA.5.1.2024.99
