Using Large Language Models for Zero-Shot Natural Language Generation from Knowledge Graphs
Axelsson, Agnes. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Speech, Music and Hearing, TMH. ORCID iD: 0000-0003-0112-6732
Skantze, Gabriel. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Speech, Music and Hearing, TMH. ORCID iD: 0000-0002-8579-1790
2023 (English). In: Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023), Association for Computational Linguistics (ACL), 2023, p. 39-54. Conference paper, Published paper (Refereed).
Abstract [en]

In any system that uses structured knowledge graph (KG) data as its underlying knowledge representation, KG-to-text generation is a useful tool for turning parts of the graph data into text that can be understood by humans. Recent work has shown that models that make use of pretraining on large amounts of text data can perform well on the KG-to-text task, even with relatively little training data on the specific graph-to-text task. In this paper, we build on this concept by using large language models to perform zero-shot generation based on nothing but the model’s understanding of the triple structure from what it can read. We show that ChatGPT achieves near state-of-the-art performance on some measures of the WebNLG 2020 challenge, but falls behind on others. Additionally, we compare factual, counter-factual and fictional statements, and show that there is a significant connection between what the LLM already knows about the data it is parsing and the quality of the output text.
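The zero-shot setup described in the abstract can be illustrated with a short sketch: a set of (subject, predicate, object) triples is serialised into a plain-text prompt and sent to a chat-based LLM with no in-context examples. This is only a minimal illustration, not the authors' actual prompt or pipeline; the prompt wording, the WebNLG-style example triples, the model name and the use of the openai v1.x Python client are assumptions made here for the sake of the example.

# Minimal sketch of zero-shot KG-to-text prompting, in the spirit of the paper.
# NOT the authors' prompt or code; model name and wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def triples_to_prompt(triples):
    """Serialise (subject, predicate, object) triples into a plain-text prompt."""
    lines = [f"({s} | {p} | {o})" for s, p, o in triples]
    return (
        "Write a short, fluent English text that expresses exactly the facts "
        "in the following knowledge-graph triples, and nothing else:\n"
        + "\n".join(lines)
    )


def verbalise(triples, model="gpt-3.5-turbo"):
    """Ask the LLM to verbalise the triples zero-shot (no in-context examples)."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": triples_to_prompt(triples)}],
        temperature=0.0,  # keep output close to deterministic for easier comparison
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Example triples in the WebNLG style (subject | predicate | object).
    example = [
        ("Alan_Shepard", "birthPlace", "New_Hampshire"),
        ("Alan_Shepard", "occupation", "Test_pilot"),
    ]
    print(verbalise(example))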

Place, publisher, year, edition, pages
Association for Computational Linguistics (ACL), 2023, p. 39-54.
Keywords [en]
large language model, llm, lexicalisation, kg-to-text, data-to-text, bias, hallucination, triples, triple, knowledge graph, KG, webnlg, wikidata
Keywords [sv]
language model, llm, lexicalisation, data-to-text, kg-to-text, hallucination, triple, knowledge graph, KG, webnlg, wikidata
National Category
Natural Language Processing
Research subject
Speech and Music Communication
Identifiers
URN: urn:nbn:se:kth:diva-338176
Scopus ID: 2-s2.0-85175688234
OAI: oai:DiVA.org:kth-338176
DiVA, id: diva2:1805182
Conference
2023 Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge, MM-NLG 2023, Prague, Czechia, Sep 12 2023
Projects
Social robots accelerating the transition to sustainable transport (50276-1)
Funder
Swedish Energy Agency, P2020-90133
Note

QC 20231017

Available from: 2023-10-16. Created: 2023-10-16. Last updated: 2025-02-07. Bibliographically approved.

Open Access in DiVA

mmnlg-2023-axelsson-skantze-kg-to-text-chatgpt (281 kB), 360 downloads
File information
File name: FULLTEXT01.pdf. File size: 281 kB. Checksum: SHA-512
57e90ac4a6048c974f0469ec097d32e5ee153e1c09a8501b8b1e130b27f398a9a8c975b80627e0e6a6af492fb208ae17c25fa597d32d47190241787af6f45d39
Type: fulltext. Mimetype: application/pdf

Other links

Scopus
ACL Anthology

Authority records

Axelsson, Agnes
Skantze, Gabriel

Total: 360 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.

Total: 463 hits