Abstracts
Résumé
Based on PISA 2012 and 2015 data, this study analyzes students' perseverance in answering a low-stakes test, that is, their ability to maintain a constant rate of correct answers throughout the test. It also examines the possible effect of the administration mode (paper-and-pencil versus computer-based) on participants' perseverance. To that end, the research exploits the balanced incomplete block design of PISA 2012 and 2015 and focuses on the responses to the questions presented at each of the four positions. The results reveal an overall decline in student performance, regardless of the domain assessed and the cycle, as well as an increase in its variability. Nevertheless, on the questions presented at the beginning of the test, students generally answered better in the 2012 cycle than in the 2015 cycle. Conversely, on the questions presented at the end of the test, this trend appears to narrow or even reverse, which could reflect greater student motivation to answer on a computerized test.
Keywords:
- low-stakes assessment,
- perseverance,
- administration mode,
- PISA 2012,
- PISA 2015
Abstract
Based on PISA 2012 and 2015 data, this paper analyzes students' perseverance in taking a low-stakes assessment, defined as their ability to maintain a constant rate of correct answers throughout the test. It also investigates the possible effect of the administration mode (paper-and-pencil vs. computer-based test) on participants' perseverance. For that purpose, this research exploits the balanced incomplete block design of PISA 2012 and 2015 and focuses on the answers to the questions presented at each of the four positions. The results show an overall decrease in student performance, regardless of the domain and the cycle, as well as an increase in its variability. However, on the questions presented at the beginning of the test, students generally answered better in the 2012 cycle than in the 2015 cycle. Conversely, on the questions presented at the end of the test, this trend seems to narrow or even reverse, which could reflect greater student motivation to answer a computerized test.
Keywords:
- low-stakes assessment,
- perseverance,
- administration mode,
- PISA 2012,
- PISA 2015
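To make the position-based comparison described in the abstract concrete, here is a minimal sketch, assuming a toy scored-response table with hypothetical column names (cycle, position, correct); it illustrates the kind of analysis summarized above and is not the study's actual data or code.

```python
# Minimal illustration (assumed column names and toy data, not the study's
# dataset): percentage of correct answers by item position, separately for
# the paper-based cycle (2012) and the computer-based cycle (2015).
import pandas as pd

# One row per student x item; "position" is the quarter of the test (1-4)
# in which the item appeared in that student's booklet.
responses = pd.DataFrame({
    "cycle":    [2012] * 8 + [2015] * 8,
    "student":  [1, 1, 1, 1, 2, 2, 2, 2] * 2,
    "position": [1, 2, 3, 4] * 4,
    "correct":  [1, 1, 0, 0, 1, 0, 1, 0,   # 2012: toy profile that declines late in the test
                 1, 0, 1, 1, 1, 1, 0, 1],  # 2015: toy profile that stays flatter
})

# Percentage of correct answers for each cycle at each of the four positions.
pct_correct = (
    responses.groupby(["cycle", "position"])["correct"]
    .mean()
    .mul(100)
    .unstack("position")
)
print(pct_correct.round(1))

# A smaller drop from position 1 to position 4 would indicate greater
# perseverance; comparing the 2012 and 2015 rows contrasts the two
# administration modes.
```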
Resumo
Using PISA 2012 and 2015 data, this study analyzes students' perseverance in answering a low-stakes test, that is, their ability to maintain a constant rate of correct answers throughout the test. It also examines the possible effect of the administration mode (paper-and-pencil compared with computer-based testing) on participants' perseverance. To that end, this research exploits the balanced incomplete block design of PISA 2012 and 2015 and focuses on the responses to the questions presented at each of the four positions. The results reveal an overall decrease in student performance, regardless of the domain assessed and the cycle, as well as an increase in its variability. However, on the questions presented at the beginning of the test, students generally answered better in the 2012 cycle than in the 2015 cycle. By contrast, on the questions presented at the end of the test, this trend seems to weaken or even reverse, which may reflect greater student motivation to answer a computerized test.
Keywords:
- low-stakes assessment,
- perseverance,
- administration mode,
- PISA 2012,
- PISA 2015