Publications (10 of 70)
van den Besselaar, P. & Sandström, U. (2020). Bibliometrically disciplined peer review: On using indicators in research evaluation. Scholarly Assessment Reports, 2(1), Article ID 5.
2020 (English). In: Scholarly Assessment Reports, ISSN 2689-5870, Vol. 2, no 1, article id 5. Article in journal (Refereed). Published.
Abstract [en]

Evaluation of research uses peer review and bibliometrics, and the debate about their balance in research evaluation continues. Both approaches have supporters, and both are criticized. In this paper, we describe an interesting case in which the use of bibliometrics in a panel-based evaluation of a mid-sized university was systematically tried out. The case suggests a useful way in which bibliometric indicators can be used to inform and improve peer review and panel-based evaluation. We call this ‘disciplined peer review’, and disciplined is used here in a constructive way: bibliometrically disciplined peer review is more likely to avoid the subjectivity that often influences the outcomes of peer and panel review-based evaluation.

Place, publisher, year, edition, pages
Levy Library Press, 2020
Keywords
Bibliometric indicators, Panel review, Peer review, Research evaluation
National Category
Information Studies
Identifiers
urn:nbn:se:kth:diva-301000 (URN), 10.29024/sar.16 (DOI), 2-s2.0-85109169300 (Scopus ID)
Note

QC 20210903

Available from: 2021-09-03. Created: 2021-09-03. Last updated: 2022-06-25. Bibliographically approved.
van den Besselaar, P. & Sandström, U. (2019). Measuring researcher independence using bibliometric data: A proposal for a new performance indicator. PLOS ONE, 14(3), Article ID e0202712.
2019 (English). In: PLOS ONE, E-ISSN 1932-6203, Vol. 14, no 3, article id e0202712. Article in journal (Refereed). Published.
Abstract [en]

Bibliometric indicators are increasingly used to evaluate individual scientists, as is exemplified by the popularity of the many publication- and citation-based indicators used in evaluation. These indicators, however, cover at best some of the quality dimensions relevant for assessing a researcher: productivity and impact. At the same time, research quality has more dimensions than productivity and impact alone. As current bibliometric indicators do not cover various important quality dimensions, we here contribute to developing better indicators for those quality dimensions not yet addressed. One of the quality dimensions lacking valid indicators is an individual researcher's independence. We propose indicators to measure different aspects of independence: two assessing whether a researcher has developed a collaboration network of their own, and two others assessing the level of thematic independence. Taken together they form an independence indicator. We illustrate how these indicators distinguish between researchers who are equally productive and have a considerable impact. The independence indicator is a step forward in evaluating individual scholarly quality.
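The abstract does not spell out the formulas behind the four independence aspects. Purely as a hypothetical illustration (the function names, inputs, and definitions below are assumptions, not the indicators defined in the paper), a collaboration-based and a thematic aspect might be operationalized from co-authorship and keyword data along these lines:

# Hypothetical sketch only, not the published indicators: two rough proxies
# for researcher independence from co-authorship and topic data.

from collections import Counter

def collaboration_independence(papers, mentor):
    """Share of a researcher's papers written without the assumed early-career mentor.

    `papers` is a list of author-name lists; `mentor` is e.g. the PhD supervisor.
    """
    if not papers:
        return 0.0
    without_mentor = sum(1 for authors in papers if mentor not in authors)
    return without_mentor / len(papers)

def thematic_independence(early_keywords, later_keywords):
    """1 minus the cosine similarity between early and later keyword profiles."""
    early, later = Counter(early_keywords), Counter(later_keywords)
    shared = set(early) & set(later)
    dot = sum(early[k] * later[k] for k in shared)
    norm = (sum(v * v for v in early.values()) ** 0.5) * \
           (sum(v * v for v in later.values()) ** 0.5)
    return 1.0 if norm == 0 else 1.0 - dot / norm

# Invented example: two of three papers are written without the mentor,
# and the keyword profile drifts partially over time.
papers = [["A. Mentor", "B. Researcher"], ["B. Researcher", "C. New"], ["B. Researcher"]]
print(collaboration_independence(papers, "A. Mentor"))                      # ~0.67
print(thematic_independence(["peer review"], ["peer review", "gender bias"]))  # ~0.29

The sketch only conveys the general idea of separating collaboration-based from thematic independence; the paper's own operationalizations should be consulted for the actual measures.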

Place, publisher, year, edition, pages
Public Library of Science, 2019
National Category
Information Studies
Identifiers
urn:nbn:se:kth:diva-249879 (URN), 10.1371/journal.pone.0202712 (DOI), 000462465800001 (), 30917110 (PubMedID), 2-s2.0-85063577996 (Scopus ID)
Note

QC 20190423

Available from: 2019-04-23. Created: 2019-04-23. Last updated: 2022-06-26. Bibliographically approved.
Sandström, U. (2019). Panel composition as pathway to impact: Do we need stakeholder expertise to select relevant mission-oriented projects? Journal for research and technology policy evaluation, 48, 68-73
2019 (English). In: Journal for research and technology policy evaluation, ISSN 1726-6629, Vol. 48, p. 68-73. Article in journal (Refereed). Published.
Abstract [en]

It is often argued that the presence of stakeholders in review panels may improve the selection of societally relevant research projects. In this paper, we investigate whether the composition of panels indeed matters. More precisely, when stakeholders are in the panel, does that result in a more positive evaluation of proposals of relevance to that stakeholder? We investigate this for the gender issues domain, and show that this is the case. When stakeholders are present, the relevant projects obtain a more positive evaluation and consequently a higher score. If these findings can be generalised, they are an important insight for the creation of pathways to and conditions for impact.

Place, publisher, year, edition, pages
Vienna: Austrian Platform for Research and Technology Policy Evaluation, 2019
Keywords
stakeholder; gender; review reports; panel scores
National Category
Economics and Business
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-258232 (URN)
Note

QC 20190913

Available from: 2019-09-10. Created: 2019-09-10. Last updated: 2022-06-26. Bibliographically approved.
Sandström, U. & van den Besselaar, P. (2019). Performance of Research Teams: results from 107 European groups. In: Catalano, G., Daraio, C., Gregori, M., Moed, H. F. & Ruocco, G. (Eds.), 17TH INTERNATIONAL CONFERENCE ON SCIENTOMETRICS & INFORMETRICS (ISSI2019), VOL II. Paper presented at the 17th International Conference of the International-Society-for-Scientometrics-and-Informetrics (ISSI) on Scientometrics and Informetrics, SEP 02-05, 2019, Sapienza Univ Rome, Rome, ITALY (pp. 2240-2251). INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI
2019 (English). In: 17TH INTERNATIONAL CONFERENCE ON SCIENTOMETRICS & INFORMETRICS (ISSI2019), VOL II / [ed] Catalano, G., Daraio, C., Gregori, M., Moed, H. F. & Ruocco, G., INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI, 2019, p. 2240-2251. Conference paper, Published paper (Refereed).
Abstract [en]

This paper investigates what factors affect the performance of research teams. We combine survey data about the team with bibliometric data about the performance of the team. The analysis shows that teams with a few PIs perform better than single-PI teams, controlling of course for team size. On the other hand, gender diversity does not have an effect on performance. The good news is that gender objectives can be realized without any performance problem.

Place, publisher, year, edition, pages
INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI, 2019
Series
Proceedings of the International Conference on Scientometrics and Informetrics, ISSN 2175-1935
National Category
Information Studies; Information Systems, Social aspects
Identifiers
urn:nbn:se:kth:diva-270920 (URN), 000508227200096 (), 2-s2.0-85073869733 (Scopus ID), 978-88-3381-118-5 (ISBN)
Conference
17th International Conference of the International-Society-for-Scientometrics-and-Informetrics (ISSI) on Scientometrics and Informetrics, SEP 02-05, 2019, Sapienza Univ Rome, Rome, ITALY
Note

QC 20200319

Available from: 2020-03-19. Created: 2020-03-19. Last updated: 2022-06-26. Bibliographically approved.
van den Besselaar, P., Sandström, U. & Mom, C. (2019). Recognition through performance and reputation. In: Catalano, G., Daraio, C., Gregori, M., Moed, H. F. & Ruocco, G. (Eds.), 17TH INTERNATIONAL CONFERENCE ON SCIENTOMETRICS & INFORMETRICS (ISSI2019), VOL II. Paper presented at the 17th International Conference of the International-Society-for-Scientometrics-and-Informetrics (ISSI) on Scientometrics and Informetrics, SEP 02-05, 2019, Sapienza Univ Rome, Rome, ITALY (pp. 2065-2069). INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI
2019 (English). In: 17TH INTERNATIONAL CONFERENCE ON SCIENTOMETRICS & INFORMETRICS (ISSI2019), VOL II / [ed] Catalano, G., Daraio, C., Gregori, M., Moed, H. F. & Ruocco, G., INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI, 2019, p. 2065-2069. Conference paper, Published paper (Refereed).
Abstract [en]

As the various disciplines have different forms of social and intellectual organization (Whitley 2000), scholars in various fields may depend less on their peers and more on other audiences for recognition and funding. Following Merton (1973), we distinguish between performance and reputation as ways of building up recognition. We show that there are indeed differences between the disciplines: in the life sciences and social sciences, the reputation-related indicators are dominant in predicting the score that grant applicants receive from the panel, whereas in the natural sciences, the performance-related indicators dominate the panel scores. Furthermore, when comparing, within the life sciences, the grantees with the best-performing non-grantees, we show that the former score higher on the reputation indicators and the latter score better on the performance variables, supporting the finding that in the life sciences recognition is probably gained through reputation more than through individual performance. We suggest that this may not be optimal for the growth of knowledge.

Place, publisher, year, edition, pages
INT SOC SCIENTOMETRICS & INFORMETRICS-ISSI, 2019
Series
Proceedings of the International Conference on Scientometrics and Informetrics, ISSN 2175-1935
National Category
Information Studies
Identifiers
urn:nbn:se:kth:diva-270898 (URN), 000508227200074 (), 2-s2.0-85073878573 (Scopus ID), 978-88-3381-118-5 (ISBN)
Conference
17th International Conference of the International-Society-for-Scientometrics-and-Informetrics (ISSI) on Scientometrics and Informetrics, SEP 02-05, 2019, Sapienza Univ Rome, Rome, ITALY
Note

QC 20200324

Available from: 2020-03-24. Created: 2020-03-24. Last updated: 2022-06-26. Bibliographically approved.
Sandström, U. (2019). Självständighet – ny indikator för forskningskvalitet [Independence – a new indicator of research quality]. Tidningen Curie, 1-1
2019 (Swedish). In: Tidningen Curie, ISSN 2001-3426, p. 1-1. Article in journal (Other (popular science, discussion, etc.)). Published.
Place, publisher, year, edition, pages
Stockholm, 2019
National Category
Business Administration
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-258930 (URN)
Note

QC 20220426

Available from: 2019-09-11. Created: 2019-09-11. Last updated: 2022-06-26. Bibliographically approved.
Sandström, E., Sandström, U. & van den Besselaar, P. (2019). The P-model: An indicator that accounts for field adjusted production as well as field normalized citation impact. In: Proceedings of the 17th International Conference of the International Society for Scientometrics and Informetrics. Paper presented at the 17th International Conference on Scientometrics and Informetrics, ISSI 2019, September 2-5, 2019 (pp. 2326-2331).
2019 (English). In: Proceedings of the 17th International Conference of the International Society for Scientometrics and Informetrics, 2019, p. 2326-2331. Conference paper, Published paper (Refereed).
Abstract [en]

Any type of scientific study or evaluation of research quality and impact runs into two types of problems if more than one topic area is involved in the study: (1) How to account for differences in (paper) production? (2) How to account for differences in citation impact, i.e. influence over subsequent literature? This paper aims to show that these questions can be answered with the help of two methods: the Field Adjusted Production (FAP) indicator and a percentile indicator designed to include the FAP. They are used in combination to express a score that includes both paper production and impact in one figure. This yields a score that can be used for ranking universities, departments, and individuals. The paper first explains the background of the method and then how to calculate the indicators belonging to the P-model. The paper then gives some examples and discusses methods for validation of the proposed indicator.
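The abstract does not give the P-model formulas. Purely as a hedged sketch (the field reference values, the top-10% cut-off, and the multiplicative combination below are assumptions for illustration, not the published method), combining field-adjusted production with a field-normalized impact share into one figure could look like this:

# Hypothetical sketch, not the published P-model: combine a field-adjusted
# production count with a field-normalized top-10% impact share into one score.

def field_adjusted_production(paper_counts, field_reference):
    """Sum of paper counts per field divided by that field's expected output."""
    return sum(n / field_reference[field] for field, n in paper_counts.items())

def top_decile_share(citation_percentiles):
    """Share of a unit's papers in the top 10% of their field's citation distribution."""
    if not citation_percentiles:
        return 0.0
    return sum(1 for p in citation_percentiles if p >= 90) / len(citation_percentiles)

def p_score(paper_counts, field_reference, citation_percentiles):
    """One combined figure: production weighted by top-decile impact (assumed combination)."""
    return field_adjusted_production(paper_counts, field_reference) * \
        top_decile_share(citation_percentiles)

# Invented example: a unit publishing in two fields with different reference outputs.
counts = {"scientometrics": 12, "sociology": 3}
reference = {"scientometrics": 4.0, "sociology": 2.0}  # assumed expected papers per researcher
percentiles = [95, 40, 88, 99, 72, 91]                 # field-normalized citation percentiles
print(p_score(counts, reference, percentiles))          # 4.5 * 0.5 = 2.25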

National Category
Other Computer and Information Science
Identifiers
urn:nbn:se:kth:diva-268320 (URN), 000508227200106 (), 2-s2.0-85073874788 (Scopus ID)
Conference
17th International Conference on Scientometrics and Informetrics, ISSI 2019, September 2-5, 2019
Note

QC 20200310

Available from: 2020-03-10. Created: 2020-03-10. Last updated: 2022-06-26. Bibliographically approved.
Sandström, U. & van den Besselaar, P. (2018). Funding, evaluation, and the performance of national research systems. Journal of Informetrics, 12(1), 365-384
2018 (English). In: Journal of Informetrics, ISSN 1751-1577, E-ISSN 1875-5879, Vol. 12, no 1, p. 365-384. Article in journal (Refereed). Published.
Abstract [en]

Understanding the quality of science systems requires international comparative studies, which are difficult because of the lack of comparable data, especially about inputs in research. In this study, we deploy an approach based on change instead of on levels of inputs and outputs: an approach that to a large extent eliminates the problem of measurement differences between countries. We first show that there are large differences in efficiency between national science systems, defined as the increase in output (highly cited papers) per percentage increase in input (funding). We then discuss our findings using popular explanations of performance differences: differences in funding systems (performance-related or not), differences in the level of competition, differences in the level of university autonomy, and differences in the level of academic freedom. Interestingly, the available data do not support these common explanations. What the data suggest is that efficient systems are characterized by a well-developed ex post evaluation system combined with considerable institutional funding and relatively low university autonomy (meaning a high autonomy of professionals). The less efficient systems, on the other hand, have strong ex ante control, either through a high level of so-called competitive project funding or through strong power of the university management. Another conclusion is that more and better data are needed.
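The efficiency measure described above (growth in highly cited papers per percentage growth in funding) can be illustrated with a small worked example; the figures below are invented for illustration and are not taken from the paper:

# Illustration of the efficiency notion in the abstract: percentage growth in
# highly cited papers divided by percentage growth in funding.
# All numbers are invented for illustration only.

def efficiency(output_start, output_end, input_start, input_end):
    """Ratio of percentage output growth to percentage input growth."""
    output_growth = (output_end - output_start) / output_start * 100
    input_growth = (input_end - input_start) / input_start * 100
    return output_growth / input_growth

# A country whose highly cited papers grow 40% on 20% more funding scores 2.0.
print(efficiency(500, 700, 10.0, 12.0))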

Keywords
Research policy; Input-output studies; Performance-based funding; Research efficiency; Bibliometrics; Citations
National Category
Social Sciences
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-223588 (URN), 10.1016/j.joi.2018.01.007 (DOI), 000427479800026 (), 2-s2.0-85042382770 (Scopus ID)
Funder
Riksbankens Jubileumsfond, P12-1302:1
Note

QC 20180308

Available from: 2018-02-23. Created: 2018-02-23. Last updated: 2022-06-26. Bibliographically approved.
van den Besselaar, P. & Sandström, U. (2018). Quantity matters, but how does it work? Journal of Informetrics, 12(4), 1059-1062
2018 (English). In: Journal of Informetrics, ISSN 1751-1577, E-ISSN 1875-5879, Vol. 12, no 4, p. 1059-1062. Article in journal (Refereed). Published.
Place, publisher, year, edition, pages
Elsevier Ltd, 2018
National Category
Engineering and Technology
Identifiers
urn:nbn:se:kth:diva-236619 (URN), 10.1016/j.joi.2018.08.007 (DOI), 000451074800004 (), 2-s2.0-85052868809 (Scopus ID)
Note

QC 20181119

Available from: 2018-11-19. Created: 2018-11-19. Last updated: 2022-06-26. Bibliographically approved.
van den Besselaar, P., Sandström, U. & Schiffbaenker, H. (2018). Studying grant decision-making: a linguistic analysis of review reports. Scientometrics, 118(1), 313-329
2018 (English). In: Scientometrics, ISSN 0138-9130, E-ISSN 1588-2861, Vol. 118, no 1, p. 313-329. Article in journal (Refereed). Published.
Abstract [en]

Peer and panel review are the dominant forms of grant decision-making, despite their serious weaknesses, as shown by many studies. This paper contributes to the understanding of the grant selection process through a linguistic analysis of the review reports. In that way we reconstruct several aspects of the evaluation and selection process: what dimensions of the proposal are discussed during the process and how, and what distinguishes successful from non-successful applications? We combine the linguistic findings with interviews with panel members and with bibliometric performance scores of applicants. The former give the context, and the latter help to interpret the linguistic findings. The analysis shows that the performance of the applicant and the content of the proposed study are assessed with the same categories, suggesting that the panelists do not actually distinguish between past performance and promising new research ideas. The analysis also suggests that the panels focus on rejecting applications by searching for weak points, rather than on finding the high-risk/high-gain groundbreaking ideas that may be in a proposal. This can easily result in sub-optimal selections, low predictive validity, and bias.

Place, publisher, year, edition, pages
Springer, 2018
Keywords
Peer review; Panel review; Research grants; Decision-making; Linguistics; LIWC; European Research Council (ERC)
National Category
Social Sciences
Identifiers
urn:nbn:se:kth:diva-235172 (URN), 10.1007/s11192-018-2848-x (DOI), 000442737700018 (), 30220747 (PubMedID), 2-s2.0-85049856026 (Scopus ID)
Funder
EU, European Research Council
Note

QC 20180918

Available from: 2018-09-17. Created: 2018-09-17. Last updated: 2022-06-26. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-1292-8239
