Scholar Index for Research Performance Evaluation Using Multiple Criteria Decision Making Analysis

This paper presents an objective, quantitative methodology for evaluating an individual's scholarly research output using multiple criteria decision analysis. A multiple criteria decision making analysis (MCDMA) methodological process is adopted to build a multiple criteria evaluation model. The scholar index, which summarizes a researcher's productivity and the scholarly impact of his or her publications in a single number (s is the largest number such that the researcher has s publications with at least s citations each), together with the cumulative research citation index, is included among the citation database indicators to capture the multidimensional complexity of scholarly research performance and to support objective evaluations. The scholar index, one of the publication activity indexes, is analyzed as the most appropriate scientometric indicator because it mitigates many drawbacks of assessing scholarly output by merely counting publications (quantity) and citations (quality). Accordingly, this study defines a set of scholar-index-based indicators for evaluating scholarly researchers. The Google Scholar open science database was used to assess and discuss the productivity and impact of researchers. Based on an experiment computing the scholar index and its derivative indexes for a set of researchers on an open research database platform, quantitative methods of assessing scholarly research output were successfully applied to rank researchers. The proposed methodology covers the ranking problem, the selection of the data on which the scholarly research performance evaluation is based, the analysis of those data, and the presentation of the multiple criteria analysis results.
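As an illustration of the definition above, the scholar index can be computed directly from a list of per-publication citation counts. The sketch below is a minimal reading of that definition, not the paper's own tooling; it assumes the citation counts have already been retrieved (for example, from a Google Scholar profile).

```python
def scholar_index(citations):
    """Scholar index: the largest s such that the researcher has
    s publications with at least s citations each."""
    counts = sorted(citations, reverse=True)
    s = 0
    # Walk publications from most to least cited; rank r qualifies
    # while the r-th publication still has at least r citations.
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            s = rank
        else:
            break
    return s

# Hypothetical citation counts for a researcher's publications:
print(scholar_index([10, 8, 5, 4, 3]))  # 4 (four papers with >= 4 citations)
print(scholar_index([25, 8, 5, 3, 3]))  # 3
print(scholar_index([]))                # 0
```

Sorting in descending order makes the index the length of the longest prefix whose rank does not exceed its citation count, which matches the "s publications with at least s citations" definition.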


Authors:



References:
[1] Velasquez, M., Hester, P. T. (2013) An Analysis of Multi-Criteria Decision Making Methods. International Journal of Operations Research Vol. 10, No. 2, p.56-66.
[2] Mardani, A., Jusoh, A., Nor, K. MD., Khalifah, Z., Zakwan, N., Valipour, V. (2015) Multiple criteria decision-making techniques and their applications – a review of the literature from 2000 to 2014. Economic Research-Ekonomska Istraživanja, 28:1, p. 516-571.
[3] Mardani, A., Zavadskas, E. K., Khalifah, Z., Jusoh, A., Nor, K. MD. (2016) Multiple criteria decision-making techniques in transportation systems: a systematic review of the state of the art literature. Transport, 31:3, p. 359-385.
[4] Ardil, C., Bilgen, S. (2017) Online Performance Tracking. SocioEconomic Challenges, 1(3), 58-72
[5] Ardil, C. (2018) Multidimensional Performance Tracking. International Journal of Computer and Systems Engineering, Vol. 12, No. 5, 320-349.
[6] Ardil, C. (2018) Multidimensional Compromise Optimization for Development Ranking of the Gulf Cooperation Council Countries and Turkey. International Journal of Mathematical and Computational Sciences, Vol. 12, No. 6, 131-138.
[7] Ardil, C. (2018) Multidimensional Compromise Programming Evaluation of Digital Commerce Websites. International Journal of Computer and Information Engineering, Vol. 12, No. 7, 556-563.
[8] Ardil, C. (2018) Multicriteria Decision Analysis for Development Ranking of Balkan Countries. International Journal of Computer and Information Engineering, Vol. 12, No. 12, 1118-1125.
[9] Hwang, C.L., Yoon, K. (1981) Multiple Attribute Decision Making: Methods and Applications, Springer-Verlag, Heidelberg, 1981
[10] Lai, Y.J., Hwang, C.L. (1994) Fuzzy Multiple Objective Decision Making: Methods and Applications. Springer-Verlag, Berlin
[11] Lai, Y., Liu, T., Hwang, C. (1994) TOPSIS for MODM. European Journal of Operational Research, 76, 486-500
[12] Google Scholar, https://scholar.google.com/
[13] Hirsch, J. E. (2005) An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.
[14] Hirsch, J. E. (2010) An index to quantify an individual’s scientific research output that takes into account the effect of multiple coauthorship. Scientometrics, 85(3), 741–754.
[15] Balaban, A.T.(2012) Positive and negative aspects of citation indices and journal impact factors. Scientometrics 92, 241–247 https://doi.org/10.1007/s11192-012-0637-5
[16] Costas, R., Bordons, M. (2007) The h-index: Advantages, limitations and its relation with other bibliometric indicators at the micro level. Journal of Informetrics, Volume 1, Issue 3, 193-203. ISSN 1751-1577, https://doi.org/10.1016/j.joi.2007.02.001
[17] Costas, R., Bordons, M. (2008) Is g-index better than h-index? An exploratory study at the individual level. Scientometrics, 77(2), 267-288. DOI: 10.1007/s11192-007-1997-0
[18] Bakkalbasi, N., Bauer, K., Glover, J., Wang, L. (2006) Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomedical digital libraries, 3(1), 1-8.
[19] Bar-Ilan, J., Levene, M., Lin, A. (2007) Some measures for comparing citation databases. Journal of Informetrics, 1(1), 26-34.
[20] Bar-Ilan, J. (2008) Which h-index? — A comparison of WoS, Scopus and Google Scholar. Scientometrics, 74(2), 257-271.
[21] Bornmann, L., Daniel, H. D. (2005) Does the h-index for ranking of scientists really work? Scientometrics, 65(3), 391-392.
[22] Bornmann, L., Daniel, H. D. (2009) The state of h index research: is the h index the ideal way to measure research performance? EMBO reports, 10(1), 2-6.
[23] Bornmann, L. (2017) Measuring impact in research evaluations: a thorough discussion of methods for, effects of and problems with impact measurements. Higher Education, 73(5), 775-787.
[24] Cronin, B., Snyder, H., Atkins, H. (1997) Comparative citation rankings of authors in monographic and journal literature: A study of sociology. Journal of Documentation, 53(3), 263-273.
[25] Ding, Y., Yan, E., Frazho, A., Caverlee, J. (2009) PageRank for ranking authors in co-citation networks. Journal of the American Society for Information Science and Technology, 60(11), 2229-2243.
[26] Dunaiski, M., Visser, W., Geldenhuys, J. (2016) Evaluating paper and author ranking algorithms using impact and contribution awards. Journal of Informetrics, 10(2), 392-407.
[27] Dunaiski, M., Geldenhuys, J., Visser, W. (2018) Author ranking evaluation at scale. Journal of Informetrics, 12(3), 679-702.
[28] Dunaiski, M., Geldenhuys, J., Visser, W. (2019) Globalised vs averaged: Bias and ranking performance on the author level. Journal of Informetrics, 13(1), 299-313.
[29] Falagas, M. E., Pitsouni, E. I., Malietzis, G. A., Pappas, G. (2008) Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. The FASEB Journal, 22(2), 338-342.
[30] Martín-Martín, A., Orduna-Malea, E., Thelwall, M., López-Cózar, E. D. (2018) Google scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories. Journal of informetrics, 12(4), 1160-1177.
[31] Martín-Martín, A., Orduna-Malea, E., López-Cózar, E. D. (2018) Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison. Scientometrics, 116(3), 2175-2188.
[32] Meho, L. I., Yang, K. (2007) Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American society for information science and technology, 58(13), 2105-2125.
[33] Mongeon, P., Paul-Hus, A. (2016) The journal coverage of Web of Science and Scopus: a comparative analysis. Scientometrics, 106(1), 213-228.
[34] Nykl, M., Campr, M., Ježek, K. (2015) Author ranking based on personalized PageRank. Journal of Informetrics, 9(4), 777-799.
[35] Torres-Salinas, D., Lopez-Cózar, E., Jiménez-Contreras, E. (2009) Ranking of departments and researchers within a university using two different databases: Web of Science versus Scopus. Scientometrics, 80(3), 761-774.
[36] Vieira, E., Gomes, J. (2009) A comparison of Scopus and Web of Science for a typical university. Scientometrics, 81(2), 587-600.
[37] Waltman, L. (2016) A review of the literature on citation impact indicators. Journal of Informetrics, 10(2), 365-391.