Abstract
University ranking systems aim to assess and compare higher education institutions at a global level. Despite their increasingly widespread use, rankings are often strongly criticized for their social and economic implications, as well as for limitations in their technical implementation. One such limitation is that they do not consider the specific characteristics of online education. This study used a participatory approach to define a set of criteria and indicators that reflect the specific nature of distance education. This endeavour will help evaluate and rank online higher education institutions more appropriately than current practice does, where indicators are devised for traditional universities. To this end, several stakeholders and informants were involved in a Delphi study in an attempt to reach the broader higher education institution (HEI) community. According to the study participants, apart from students’ achievements and general quantitative measures of HEI performance, which are quite common in traditional ranking systems, teaching and the student learning experience turned out to be the most important criteria. Student support, teacher support, technological infrastructure, research, and organization were deemed middle-ranking criteria, while sustainability and reputation were regarded as the least important.
Keywords:
- university ranking
- online education
- quality in higher education
- institutional reputation
- Delphi study
- performance indicators