Self-imposed filter bubbles: Selective attention and exposure in online search
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Speech, Music and Hearing, TMH. Lund University Cognitive Science, Lund, Sweden; Department of Psychology, Lund University, Lund, Sweden. ORCID iD: 0000-0002-6739-0838
2022 (English). In: Computers in Human Behavior Reports, ISSN 2451-9588, Vol. 7, article id 100226. Article in journal (Refereed). Published
Abstract [en]

It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, it has been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages. Yet this “self-imposed filter bubble hypothesis” has remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited a stronger such bias than those who identified as left-wing (p < .001). In addition, we found that both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust, such that links associated with less trusted sources were attended less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.

Place, publisher, year, edition, pages
Elsevier BV, 2022. Vol. 7, article id 100226
Keywords [en]
Eye tracking, Filter bubble, Ingroup bias, Online search, Selective exposure, Trust
HSV category
Identifiers
URN: urn:nbn:se:kth:diva-326798
DOI: 10.1016/j.chbr.2022.100226
ISI: 001026238300001
Scopus ID: 2-s2.0-85135800361
OAI: oai:DiVA.org:kth-326798
DiVA, id: diva2:1756746
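The DOI listed above can be resolved programmatically to retrieve the article's bibliographic metadata. Below is a minimal sketch, assuming the public Crossref REST API (api.crossref.org) is reachable; the endpoint and response fields are general Crossref conventions and are not part of this DiVA record.

```python
import json
import urllib.request

# DOI taken from the identifiers above.
DOI = "10.1016/j.chbr.2022.100226"

# Fetch the work's metadata from the public Crossref REST API.
# (Assumption: network access; api.crossref.org is the standard Crossref endpoint.)
with urllib.request.urlopen(f"https://api.crossref.org/works/{DOI}") as response:
    work = json.load(response)["message"]

# Assemble a simple reference line from the returned fields.
authors = ", ".join(
    f"{a.get('given', '')} {a.get('family', '')}".strip()
    for a in work.get("author", [])
)
title = work["title"][0]
journal = work.get("container-title", [""])[0]
year = work["issued"]["date-parts"][0][0]
print(f"{authors} ({year}). {title}. {journal}. https://doi.org/{DOI}")
```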
Note

QC 20230515

Available from: 2023-05-15 Created: 2023-05-15 Last updated: 2023-08-03 Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text
Scopus

Person

Ekström, Axel G.
