Publications (10 of 56)
van den Besselaar, P. & Sandström, U. (2019). Measuring researcher independence using bibliometric data: A proposal for a new performance indicator. PLoS ONE, 14(3), Article ID e0202712.
Measuring researcher independence using bibliometric data: A proposal for a new performance indicator
2019 (English) In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 14, no 3, article id e0202712. Article in journal (Refereed). Published
Abstract [en]

Bibliometric indicators are increasingly used to evaluate individual scientists, as exemplified by the popularity of the many publication- and citation-based indicators used in evaluation. These indicators, however, cover at best some of the quality dimensions relevant for assessing a researcher: productivity and impact. Research quality, though, has more dimensions than productivity and impact alone. As current bibliometric indicators do not cover several important quality dimensions, we contribute here to developing better indicators for the dimensions not yet addressed. One of the quality dimensions lacking valid indicators is an individual researcher's independence. We propose indicators to measure different aspects of independence: two assessing whether a researcher has developed a collaboration network of their own, and two assessing the level of thematic independence. Taken together they form an independence indicator. We illustrate how these indicators distinguish between researchers who are equally productive and have considerable impact. The independence indicator is a step forward in evaluating individual scholarly quality.
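The abstract does not reproduce the paper's concrete operationalizations. As a rough illustration of the idea only, a minimal Python sketch might combine a collaboration-independence share with a thematic-independence share as below; the field names, the supervisor-based split, and the topic-overlap measure are assumptions for illustration, not the sub-indicators defined in the paper.

# Illustrative sketch only: the operationalizations below are assumptions,
# not the sub-indicators defined in the paper.
def independence_scores(papers, supervisor):
    """papers: list of dicts with 'authors' (list of names) and 'topics' (set of terms);
    supervisor: name of the researcher's former PhD supervisor."""
    if not papers:
        return None
    # Collaboration independence: share of papers published without the supervisor.
    solo_share = sum(supervisor not in p["authors"] for p in papers) / len(papers)
    # Thematic independence: share of topics in independent papers that never
    # appear in papers co-authored with the supervisor.
    with_sup, without_sup = set(), set()
    for p in papers:
        (with_sup if supervisor in p["authors"] else without_sup).update(p["topics"])
    thematic_share = len(without_sup - with_sup) / len(without_sup) if without_sup else 0.0
    return {"collaboration_independence": solo_share,
            "thematic_independence": thematic_share}

Two researchers with identical productivity and citation impact can still differ markedly on these two shares, which is the kind of distinction the abstract describes.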

Place, publisher, year, edition, pages
Public Library of Science, 2019
National Category
Information Studies
Identifiers
urn:nbn:se:kth:diva-249879 (URN); 10.1371/journal.pone.0202712 (DOI); 000462465800001; 30917110 (PubMedID); 2-s2.0-85063577996 (Scopus ID)
Note

QC 20190423

Available from: 2019-04-23. Created: 2019-04-23. Last updated: 2019-04-23. Bibliographically approved
Sandström, U. (2019). Panel composition as pathway to impact: Do we need stakeholder expertise to select relevant mission-oriented projects? Journal for Research and Technology Policy Evaluation, 48, 68-73
Panel composition as pathway to impact: Do we need stakeholder expertise to select relevant mission-oriented projects?
2019 (English) In: Journal for Research and Technology Policy Evaluation, ISSN 1726-6629, Vol. 48, p. 68-73. Article in journal (Refereed). Published
Abstract [en]

It is often argued that the presence of stakeholders in review panels may improve the selection of societally relevant research projects. In this paper, we investigate whether the composition of panels indeed matters. More precisely, when stakeholders are on the panel, does that result in a more positive evaluation of proposals of relevance to that stakeholder? We investigate this for the gender issues domain and show that this is the case. When stakeholders are present, the relevant projects obtain a more positive evaluation and consequently a higher score. If these findings can be generalised, they offer an important insight for the creation of pathways to and conditions for impact.

Place, publisher, year, edition, pages
Vienna: Austrian Platform for Research and Technology Policy Evaluation, 2019
Keywords
stakeholder; gender; review reports; panel scores
National Category
Economics and Business
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-258232 (URN)
Note

QC 20190913

Available from: 2019-09-10. Created: 2019-09-10. Last updated: 2019-11-13. Bibliographically approved
Sandström, U. (2019). Självständighet – ny indikator för forskningskvalitet. Tidningen Curie, 1-1
Självständighet – ny indikator för forskningskvalitet
2019 (Swedish) In: Tidningen Curie, p. 1-1. Article in journal (Other (popular science, discussion, etc.)). Published
Place, publisher, year, edition, pages
Stockholm, 2019
National Category
Business Administration
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-258930 (URN)
Note

QC 20190913

Available from: 2019-09-11. Created: 2019-09-11. Last updated: 2019-09-13. Bibliographically approved
Sandström, U. & Van den Besselaar, P. (2018). Funding, evaluation, and the performance of national research systems. Journal of Informetrics, 12(1), 365-384
Funding, evaluation, and the performance of national research systems
2018 (English) In: Journal of Informetrics, ISSN 1751-1577, E-ISSN 1875-5879, Vol. 12, no 1, p. 365-384. Article in journal (Refereed). Published
Abstract [en]

Understanding the quality of science systems requires international comparative studies, which are difficult because of the lack of comparable data, especially about inputs into research. In this study, we deploy an approach based on change rather than on levels of inputs and outputs: an approach that to a large extent eliminates the problem of measurement differences between countries. We first show that there are large differences in efficiency between national science systems, defined as the increase in output (highly cited papers) per percentage increase in input (funding). We then discuss our findings in light of popular explanations of performance differences: differences in funding systems (performance-related or not), differences in the level of competition, differences in the level of university autonomy, and differences in the level of academic freedom. Interestingly, the available data do not support these common explanations. What the data suggest is that efficient systems are characterized by a well-developed ex post evaluation system combined with considerable institutional funding and relatively low university autonomy (meaning a high autonomy of professionals). The less efficient systems, on the other hand, have strong ex ante control, either through a high level of so-called competitive project funding or through strong power of the university management. Another conclusion is that more and better data are needed.
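A minimal formalization of the efficiency notion described above (the notation is mine, not the paper's):

\[
  E_c \;=\; \frac{\Delta O_c / O_c}{\Delta I_c / I_c}
\]

where \(O_c\) is country \(c\)'s output of highly cited papers, \(I_c\) its research funding, and \(\Delta\) the change over the period studied, so that \(E_c\) captures the percentage growth in highly cited output per percentage growth in funding.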

Keywords
Research policy; Input-output studies; Performance-based funding; Research efficiency; Bibliometrics; Citations
National Category
Social Sciences
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-223588 (URN); 10.1016/j.joi.2018.01.007 (DOI); 000427479800026; 2-s2.0-85042382770 (Scopus ID)
Funder
Riksbankens Jubileumsfond, P12-1302:1
Note

QC 20180308

Available from: 2018-02-23. Created: 2018-02-23. Last updated: 2019-08-05. Bibliographically approved
van den Besselaar, P. & Sandström, U. (2018). Quantity matters, but how does it work? Journal of Informetrics, 12(4), 1059-1062
Quantity matters, but how does it work?
2018 (English) In: Journal of Informetrics, ISSN 1751-1577, E-ISSN 1875-5879, Vol. 12, no 4, p. 1059-1062. Article in journal (Refereed). Published
Place, publisher, year, edition, pages
Elsevier Ltd, 2018
National Category
Engineering and Technology
Identifiers
urn:nbn:se:kth:diva-236619 (URN); 10.1016/j.joi.2018.08.007 (DOI); 000451074800004; 2-s2.0-85052868809 (Scopus ID)
Note

QC 20181119

Available from: 2018-11-19. Created: 2018-11-19. Last updated: 2018-12-10. Bibliographically approved
van den Besselaar, P., Sandström, U. & Schiffbaenker, H. (2018). Studying grant decision-making: a linguistic analysis of review reports. Scientometrics, 118(1), 313-329
Studying grant decision-making: a linguistic analysis of review reports
2018 (English) In: Scientometrics, ISSN 0138-9130, E-ISSN 1588-2861, Vol. 118, no 1, p. 313-329. Article in journal (Refereed). Published
Abstract [en]

Peer and panel review are the dominant forms of grant decision-making, despite their serious weaknesses as shown by many studies. This paper contributes to the understanding of the grant selection process through a linguistic analysis of the review reports. In that way we reconstruct several aspects of the evaluation and selection process: which dimensions of the proposal are discussed during the process and how, and what distinguishes successful from non-successful applications? We combine the linguistic findings with interviews with panel members and with bibliometric performance scores of the applicants. The former provides the context, and the latter helps to interpret the linguistic findings. The analysis shows that the performance of the applicant and the content of the proposed study are assessed with the same categories, suggesting that the panelists do not actually distinguish between past performance and promising new research ideas. The analysis also suggests that the panels focus on rejecting applications by searching for weak points, rather than on finding the high-risk/high-gain groundbreaking ideas that may be in a proposal. This may easily result in sub-optimal selections, in low predictive validity, and in bias.
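As a rough sketch of the kind of category-frequency analysis described here: the paper uses LIWC dictionaries, which are not reproduced in the abstract, so the tiny lexicon below is invented purely for illustration.

# Toy sketch of category-frequency scoring of review texts, in the spirit of
# a LIWC-style analysis. The category lexicon is invented for illustration
# and is not the LIWC dictionary used in the paper.
import re
from collections import Counter

CATEGORIES = {
    "negative evaluation": {"weak", "unclear", "lacks", "limited", "vague"},
    "positive evaluation": {"excellent", "strong", "innovative", "convincing"},
}

def category_shares(review_text):
    # For each category, compute the share of words in the review falling into it.
    words = re.findall(r"[a-z']+", review_text.lower())
    total = len(words) or 1
    counts = Counter()
    for w in words:
        for cat, lexicon in CATEGORIES.items():
            if w in lexicon:
                counts[cat] += 1
    return {cat: counts[cat] / total for cat in CATEGORIES}

Comparing granted with rejected proposals then amounts to comparing the distribution of these shares across the two groups of review reports.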

Place, publisher, year, edition, pages
Springer, 2018
Keywords
Peer review; Panel review; Research grants; Decision-making; Linguistics; LIWC; European Research Council (ERC)
National Category
Social Sciences
Identifiers
urn:nbn:se:kth:diva-235172 (URN); 10.1007/s11192-018-2848-x (DOI); 000442737700018; 30220747 (PubMedID); 2-s2.0-85049856026 (Scopus ID)
Funder
EU, European Research Council
Note

QC 20180918

Available from: 2018-09-17. Created: 2018-09-17. Last updated: 2019-09-20. Bibliographically approved
van den Besselaar, P. & Sandström, U. (2017). Counterintuitive effects of incentives?. Research Evaluation, 26(4), 349-351
Counterintuitive effects of incentives?
2017 (English) In: Research Evaluation, ISSN 0958-2029, E-ISSN 1471-5449, Vol. 26, no 4, p. 349-351. Article in journal, Editorial material (Other academic). Published
Abstract [en]

A recent paper in this journal compares the Norwegian model of using publication counts for university funding with a similar intervention in Australia in the mid-1990s. The authors argue that the Norwegian model (which takes the quality of publications into account) performs better than the Australian one (which neglected paper quality beyond requiring peer review). We argue that these conclusions are contradicted by the evidence provided in the article, and should therefore be considered incorrect.

Place, publisher, year, edition, pages
Oxford University Press, 2017
National Category
Information Studies
Identifiers
urn:nbn:se:kth:diva-217443 (URN); 10.1093/reseval/rvx029 (DOI); 000413999100008; 2-s2.0-85032747126 (Scopus ID)
Note

QC 20171117

Available from: 2017-11-17. Created: 2017-11-17. Last updated: 2017-11-17. Bibliographically approved
Sandström, U. (2017). Evaluating research portfolios, a method and a case. In: STI 2017 - Science, Technology and Innovation indicators: Open indicators: innovation, participation and actor-based STI indicators Paris 2017. Paper presented at STI2017, 6 – 8 September 2017.
Evaluating research portfolios, a method and a case
2017 (English) In: STI 2017 - Science, Technology and Innovation indicators: Open indicators: innovation, participation and actor-based STI indicators, Paris 2017, 2017. Conference paper, Published paper (Refereed)
Abstract [en]

Evaluating whether a portfolio of funded research projects (of a research council) or a portfolio of research papers (the output of a university) is relevant for science and for society requires a two-dimensional mapping of the portfolio: (i) projecting the portfolio on a science map, showing how the portfolio fits into and possibly shapes the research fronts, and (ii) projecting the portfolio on a map of societal challenges, showing where the portfolio links to societal problem solving or innovation. This requires evaluating in two different 'languages': a technical language relating projects to the research front, and a societal language relating the projects to societal challenges. In this paper, we demonstrate a method for doing so, using the SMS-platform. The advantages are that the method is much less dependent on subjective classifications by single experts or a small group of experts, and that it is rather user-friendly.
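The SMS-platform's actual mapping machinery is not described in the abstract. As a toy sketch of the underlying idea only, each project can be projected onto a research-topic dimension and a societal-challenge dimension via simple keyword overlap; the category lists and keyword sets below are hypothetical placeholders.

# Toy illustration of two-dimensional portfolio mapping via keyword overlap.
# The topic and challenge keyword lists are hypothetical placeholders, not
# the science map or challenge map used by the SMS-platform.
RESEARCH_TOPICS = {"machine learning": {"neural", "classifier", "training"},
                   "epidemiology": {"cohort", "incidence", "exposure"}}
SOCIETAL_CHALLENGES = {"public health": {"disease", "prevention", "care"},
                       "climate": {"emissions", "warming", "adaptation"}}

def project_portfolio(projects, categories):
    """Count, per category, how many project descriptions mention any of its keywords."""
    counts = {name: 0 for name in categories}
    for text in projects:
        words = set(text.lower().split())
        for name, keywords in categories.items():
            if words & keywords:
                counts[name] += 1
    return counts

# A portfolio is then characterized by the pair of profiles:
# project_portfolio(texts, RESEARCH_TOPICS) and project_portfolio(texts, SOCIETAL_CHALLENGES).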

National Category
Social Sciences
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-214068 (URN)
Conference
STI2017, 6 – 8 September 2017
Note

QC 20171018

Available from: 2017-09-11. Created: 2017-09-11. Last updated: 2018-06-19. Bibliographically approved
Sandström, U. (2017). Influence of cognitive distance on grant decisions. In: STI 2017 - Science, Technology and Innovation indicators: Open indicators: innovation, participation and actor-based STI indicators. Paper presented at STI2017, 6 – 8 September 2017.
Influence of cognitive distance on grant decisions
2017 (English) In: STI 2017 - Science, Technology and Innovation indicators: Open indicators: innovation, participation and actor-based STI indicators, 2017. Conference paper, Published paper (Refereed)
Abstract [en]

The selection of grant applications is generally based on peer and panel review but, as shown in many studies, the outcome of this process depends not only on scientific merit or excellence, but also on social factors and on the way the decision-making process is organized. A major criticism of the peer review process is that it is inherently conservative, with panel members inclined to select applications that are in line with their own theoretical perspective. In this paper we define 'cognitive distance' and operationalize it. We apply the concept and investigate whether it influences the probability of getting funded.
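The abstract does not spell out the operationalization. One plausible sketch, assumed purely for illustration, measures cognitive distance as one minus the cosine similarity between term-frequency vectors of the applicant's texts and the panel's texts; the paper's actual operationalization may differ.

# Assumed illustration: cognitive distance as 1 minus the cosine similarity
# between bag-of-words term-frequency vectors. Not the paper's definition.
import math
from collections import Counter

def term_vector(text):
    # Simple bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def cognitive_distance(applicant_text, panel_text):
    a, b = term_vector(applicant_text), term_vector(panel_text)
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return 1.0 - (dot / norm if norm else 0.0)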

National Category
Social Sciences
Research subject
Industrial Economics and Management
Identifiers
urn:nbn:se:kth:diva-214067 (URN)
Conference
STI2017, 6 – 8 September 2017
Note

QC 20171018

Available from: 2017-09-11. Created: 2017-09-11. Last updated: 2017-10-18. Bibliographically approved
Sandström, U. (2016). Arbetslivsforskningens framtid i bibliometriskt perspektiv. In: Åke Sandberg (Ed.), På jakt efter framtidens arbete: Utmaningar i arbetets organisering och forskning (pp. 188-197). Tankesmedjan Tiden
Arbetslivsforskningens framtid i bibliometriskt perspektiv
2016 (Swedish) In: På jakt efter framtidens arbete: Utmaningar i arbetets organisering och forskning / [ed] Åke Sandberg, Tankesmedjan Tiden, 2016, p. 188-197. Chapter in book (Other academic)
Abstract [sv]

When Swedish work organization research was recently evaluated, the result was a mixture of confidence and concern for the field. Certainly, things are going well, but doubts about its capacity for renewal raised serious questions for the future. Sweden is exceptionally well positioned in subfields of working life research that show stable growth and yield good results in the form of collegial attention. Much suggests that Swedish research has anchored itself in a number of strong flagship areas, and that this has meant a lock-in to areas that may lose importance in the longer term. New subfields of working life research are not covered by Swedish researchers at all, or at least not to the expected extent. The Swedish research portfolio is relatively concentrated and therefore risks becoming a millstone if and when substantial advances are made in new areas.

Place, publisher, year, edition, pages
Tankesmedjan Tiden, 2016
Keywords
bibliometrics, working life
National Category
Other Social Sciences
Research subject
Industrial Engineering and Management
Identifiers
urn:nbn:se:kth:diva-197087 (URN); 9789156631672 (ISBN)
Note

QC 20161209

Available from: 2016-11-29. Created: 2016-11-29. Last updated: 2016-12-09. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-1292-8239
