ContrastNER: Contrastive-based Prompt Tuning for Few-shot NER
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer systems, SCS. ORCID iD: 0000-0002-3264-974X
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer systems, SCS. ORCID iD: 0000-0002-2748-8929
Oslo Metropolitan University, Norway.
SINTEF AS, Norway.
2023 (English). In: Proceedings - 2023 IEEE 47th Annual Computers, Software, and Applications Conference, COMPSAC 2023, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 241-249. Conference paper, published paper (refereed).
Abstract [en]

Prompt-based language models have produced encouraging results in numerous applications, including Named Entity Recognition (NER). NER aims to identify the entities in a sentence and assign them types. However, the strong performance of most available NER approaches depends heavily on the design of discrete prompts and of a verbalizer that maps model-predicted outputs to entity categories, both of which are complicated undertakings. To address these challenges, we present ContrastNER, a prompt-based NER framework that employs both discrete and continuous tokens in its prompts and uses a contrastive learning approach to learn the continuous prompts and predict entity types. The experimental results demonstrate that ContrastNER obtains performance competitive with state-of-the-art NER methods in high-resource settings, and outperforms state-of-the-art models in low-resource settings, without requiring extensive manual prompt engineering or verbalizer design.
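The contrastive idea mentioned in the abstract can be illustrated with a toy supervised contrastive loss over token embeddings: representations sharing an entity label are pulled together, while representations of different labels are pushed apart. The sketch below is purely illustrative, with assumed names and toy 2-D embeddings; it is not code from the paper.

```python
# Illustrative sketch of a supervised contrastive loss, as commonly used
# for pulling same-label embeddings together. Function names and toy data
# are assumptions for illustration only.
import math


def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Average -log p(positive) over all anchor/positive pairs.

    For each anchor i, same-label embeddings are positives; the softmax
    denominator runs over all other embeddings.
    """
    n = len(embeddings)
    total, count = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # anchors without a positive contribute nothing
        sims = {j: cosine(embeddings[i], embeddings[j]) / temperature
                for j in range(n) if j != i}
        denom = sum(math.exp(s) for s in sims.values())
        for j in positives:
            total += -math.log(math.exp(sims[j]) / denom)
            count += 1
    return total / count


# Toy example: two "PER" tokens with similar embeddings, one distant "ORG".
embs = [[1.0, 0.1], [0.9, 0.2], [-0.8, 1.0]]
labels = ["PER", "PER", "ORG"]
loss = supervised_contrastive_loss(embs, labels)
```

When the same-label embeddings are already clustered, as above, the loss is close to zero; mislabeling a distant embedding as a positive drives it up, which is the gradient signal that shapes the continuous prompt representations during training.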

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023. p. 241-249
Keywords [en]
Contrastive learning, Language Models, Named Entity Recognition, Prompt-based learning
National Category
Natural Language Processing; Robotics and automation
Identifiers
URN: urn:nbn:se:kth:diva-336748
DOI: 10.1109/COMPSAC57700.2023.00038
ISI: 001046484100028
Scopus ID: 2-s2.0-85168863373
OAI: oai:DiVA.org:kth-336748
DiVA id: diva2:1798326
Conference
47th IEEE Annual Computers, Software, and Applications Conference, COMPSAC 2023, June 26-30, 2023, Hybrid, Torino, Italy
Note

Part of proceedings, ISBN 9798350326970.

QC 20231031

Available from: 2023-09-19. Created: 2023-09-19. Last updated: 2025-02-05. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text; Scopus

Authority records

Layegh, Amirhossein; Payberah, Amir H.; Matskin, Mihhail
