Automatic Evaluation of Robustness and Degradation in Tagging and Parsing
2003 (English). In: Proceedings of the International Conference Recent Advances in Natural Language Processing 2003, Borovets, Bulgaria, 2003. Conference paper (Refereed).
We address the topic of automatic evaluation of robustness and performance degradation in parsing systems. We focus on one aspect of robustness, namely ill-formed sentences, and the impact of spelling errors on the different components of a parsing system. We propose an automated framework to evaluate robustness, in which ill-formed and noisy data is introduced by an automatic tool and fed to the parsing system. With increasing levels of noise, the performance of a system will inevitably degrade; the question is to what extent. The experiments show a graceful degradation in performance both for the state-of-the-art taggers used and for a Swedish shallow parser. The automated nature of the evaluation allows easy and reproducible evaluation of the individual components of a parsing system.
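The framework described above hinges on automatically injecting spelling errors at controlled rates and measuring how tagger/parser accuracy degrades. The abstract does not specify the noise tool's internals, so the following is only a minimal sketch of one plausible noise injector: per-character substitution, deletion, insertion, and transposition errors applied at a given rate (the function name `corrupt` and the error model are illustrative assumptions, not the paper's actual tool).

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def corrupt(sentence, error_rate, rng=None):
    """Introduce character-level spelling errors into a sentence.

    Each alphabetic character is corrupted with probability `error_rate`
    using one of four classic typo operations: substitution, deletion,
    insertion, or transposition. This is an illustrative noise model,
    not the specific tool used in the paper.
    """
    rng = rng or random.Random(0)  # seeded for reproducible evaluation runs
    chars = list(sentence)
    out = []
    i = 0
    while i < len(chars):
        c = chars[i]
        if c.isalpha() and rng.random() < error_rate:
            op = rng.choice(["sub", "del", "ins", "swap"])
            if op == "sub":
                out.append(rng.choice(ALPHABET))       # replace character
            elif op == "del":
                pass                                   # drop character
            elif op == "ins":
                out.append(c)
                out.append(rng.choice(ALPHABET))       # insert extra character
            elif op == "swap" and i + 1 < len(chars):
                out.append(chars[i + 1])               # transpose with next
                out.append(c)
                i += 1
            else:
                out.append(c)                          # swap at end: no-op
        else:
            out.append(c)
        i += 1
    return "".join(out)

# Degradation curve: corrupt the test set at increasing noise levels,
# then run the tagger/parser on each version and compare accuracies.
sentence = "the quick brown fox jumps over the lazy dog"
for rate in [0.0, 0.01, 0.02, 0.05, 0.10]:
    noisy = corrupt(sentence, rate)
    # tag(noisy) and score against the gold annotation here
```

A seeded random generator makes each noise level reproducible, which matches the abstract's emphasis on easy and reproducible evaluation of individual components.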
Place, publisher, year, edition, pages: Borovets, Bulgaria, 2003.
Keywords: automatic evaluation, robustness, spelling errors, tagging, shallow parsing
Engineering and Technology
Identifiers
URN: urn:nbn:se:kth:diva-12495
OAI: oai:DiVA.org:kth-12495
DiVA: diva2:315375
QC 20100429. Available from: 2010-04-29. Created: 2010-04-29. Last updated: 2011-10-12. Bibliographically approved.