Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Speech, Music and Hearing, TMH. ORCID iD: 0000-0003-0112-6732
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Speech, Music and Hearing, TMH. ORCID iD: 0000-0002-8579-1790
2023 (English). In: Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023), Association for Computational Linguistics (ACL), 2023, pp. 39-54. Conference paper, Published paper (Peer-reviewed)
Abstract [en]

In any system that uses structured knowledge graph (KG) data as its underlying knowledge representation, KG-to-text generation is a useful tool for turning parts of the graph data into text that can be understood by humans. Recent work has shown that models that make use of pretraining on large amounts of text data can perform well on the KG-to-text task, even with relatively little training data on the specific graph-to-text task. In this paper, we build on this concept by using large language models to perform zero-shot generation based on nothing but the model's understanding of the triple structure from what it can read. We show that ChatGPT achieves near state-of-the-art performance on some measures of the WebNLG 2020 challenge, but falls behind on others. Additionally, we compare factual, counter-factual and fictional statements, and show that there is a significant connection between what the LLM already knows about the data it is parsing and the quality of the output text.
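The zero-shot setup described in the abstract can be illustrated with a minimal sketch: rendering subject-predicate-object triples into a plain-text instruction for an LLM. Note that the function name, the triple data, and the prompt wording below are hypothetical illustrations, not the paper's actual prompt or the WebNLG dataset format.

```python
# Illustrative sketch of zero-shot KG-to-text prompting: knowledge-graph
# triples are serialized into a textual instruction that an LLM could turn
# into fluent prose. Triples and wording here are invented examples.

def triples_to_prompt(triples):
    """Render (subject, predicate, object) triples as a zero-shot
    KG-to-text instruction for a large language model."""
    # One "(subject | predicate | object)" line per triple.
    lines = [f"({s} | {p} | {o})" for s, p, o in triples]
    return (
        "Convert the following knowledge-graph triples into fluent English text:\n"
        + "\n".join(lines)
    )

# Hypothetical example triples about a well-known (factual) entity.
example = [
    ("Alan_Turing", "birthPlace", "Maida_Vale"),
    ("Alan_Turing", "field", "Computer_science"),
]
prompt = triples_to_prompt(example)
print(prompt)
```

The resulting string would then be sent to the model with no task-specific fine-tuning, which is what makes the setup zero-shot; the abstract's factual/counter-factual/fictional comparison amounts to varying the objects in such triples and observing how output quality changes.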

Place, publisher, year, edition, pages
Association for Computational Linguistics (ACL), 2023. pp. 39-54
Keywords [en]
large language model, llm, lexicalisation, kg-to-text, data-to-text, bias, hallucination, triples, triple, knowledge graph, KG, webnlg, wikidata
Keywords [sv]
språkmodell, llm, lexikalisering, data till text, kg till text, hallucination, triplett, kunskapsgraf, KG, webnlg, wikidata
HSV category
Research subject
Speech and Music Communication
Identifiers
URN: urn:nbn:se:kth:diva-338176
Scopus ID: 2-s2.0-85175688234
OAI: oai:DiVA.org:kth-338176
DiVA, id: diva2:1805182
Conference
2023 Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge, MM-NLG 2023, Prague, Czechia, September 12, 2023
Projects
Social robots accelerating the transition to sustainable transport (50276-1)
Research funder
Swedish Energy Agency, P2020-90133
Note

QC 20231017

Available from: 2023-10-16. Created: 2023-10-16. Last updated: 2024-07-04. Bibliographically approved.

Open Access in DiVA

mmnlg-2023-axelsson-skantze-kg-to-text-chatgpt (281 kB), 259 downloads
File information
File: FULLTEXT01.pdf
File size: 281 kB
Checksum (SHA-512): 57e90ac4a6048c974f0469ec097d32e5ee153e1c09a8501b8b1e130b27f398a9a8c975b80627e0e6a6af492fb208ae17c25fa597d32d47190241787af6f45d39
Type: fulltext
Mimetype: application/pdf

Other links
Scopus | ACL Anthology

Person
Axelsson, Agnes; Skantze, Gabriel

Total: 259 downloads
The number of downloads is the sum of all downloads of all full texts. It may, for example, include earlier versions that are no longer available.
