ContrastNER: Contrastive-based Prompt Tuning for Few-shot NER
2023 (English). In: Proceedings - 2023 IEEE 47th Annual Computers, Software, and Applications Conference, COMPSAC 2023, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 241-249. Conference paper, Published paper (Refereed)
Abstract [en]
Prompt-based language models have produced encouraging results in numerous applications, including Named Entity Recognition (NER). NER aims to identify the entities in a sentence and assign each a type. However, the strong performance of most available NER approaches depends heavily on designing discrete prompts and a verbalizer that maps model-predicted outputs to entity categories, both of which are complicated undertakings. To address these challenges, we present ContrastNER, a prompt-based NER framework that employs both discrete and continuous tokens in its prompts and uses contrastive learning to learn the continuous prompts and predict entity types. Experimental results show that ContrastNER achieves performance competitive with state-of-the-art NER methods in high-resource settings and outperforms them in low-resource settings, without requiring extensive manual prompt engineering or verbalizer design.
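The contrastive component described in the abstract can be illustrated with a generic supervised InfoNCE-style loss, in which representations sharing an entity label act as positive pairs and all others as negatives. This is a minimal, dependency-free sketch of that general idea, not the paper's exact objective; the function names, the cosine-similarity choice, and the temperature value are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors given as lists of floats."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised InfoNCE-style contrastive loss (illustrative sketch).

    Embeddings with the same label are treated as positive pairs and pulled
    together; all other pairs act as negatives and are pushed apart.
    """
    n = len(embeddings)
    total, count = 0.0, 0
    for i in range(n):
        others = [j for j in range(n) if j != i]  # exclude self-similarity
        sims = [math.exp(cosine(embeddings[i], embeddings[j]) / temperature)
                for j in others]
        denom = sum(sims)  # normalizer over all non-self pairs
        for s, j in zip(sims, others):
            if labels[j] == labels[i]:  # positive pair: same entity label
                total += -math.log(s / denom)
                count += 1
    return total / max(count, 1)
```

For intuition: when embeddings that share a label are already close, the loss is small; when positives are far apart, it grows. For example, with two tight clusters, labeling by cluster yields a lower loss than labeling across clusters.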
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023. p. 241-249
Keywords [en]
Contrastive learning, Language Models, Named Entity Recognition, Prompt-based learning
National Category
Natural Language Processing; Robotics and Automation
Identifiers
URN: urn:nbn:se:kth:diva-336748
DOI: 10.1109/COMPSAC57700.2023.00038
ISI: 001046484100028
Scopus ID: 2-s2.0-85168863373
OAI: oai:DiVA.org:kth-336748
DiVA id: diva2:1798326
Conference
47th IEEE Annual Computers, Software, and Applications Conference (COMPSAC 2023), June 26-30, 2023, Torino, Italy (hybrid)
Note
Part of proceedings ISBN 9798350326970
Available from: 2023-09-19. Created: 2023-09-19. Last updated: 2025-02-05. Bibliographically approved