Enhancing Coreference Resolution with Pretrained Language Models: Bridging the Gap Between Syntax and Semantics
Xingzu Liu, Songhang Deng, Mingbang Wang, Zhang Dong, Le Dai, Jiyuan Li, Ruilin Nong · April 08, 2025
Summary
The paper proposes a framework that integrates syntactic parsing and semantic role labeling with pretrained language models to improve coreference resolution. An attention mechanism fuses the syntactic and semantic features with the pretrained representations, and the resulting system surpasses conventional coreference systems in accuracy across diverse datasets. The study highlights the importance of combining syntactic and semantic signals for precise referential understanding in natural language processing tasks.
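The summary does not give implementation details, but the core idea, attending over pretrained, syntactic, and semantic-role views of each token and scoring mention pairs on the fused representation, can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration: the module names (`SyntaxSemanticFusion`, `MentionPairScorer`), the dimensions, and the single-head fusion scheme are hypothetical, not the paper's specification.

```python
import torch
import torch.nn as nn

class SyntaxSemanticFusion(nn.Module):
    """Fuse PLM token embeddings with syntax-parse and semantic-role
    feature vectors via attention over the three views (illustrative)."""

    def __init__(self, plm_dim=768, syn_dim=64, srl_dim=64, hidden=256):
        super().__init__()
        # Project each view into a shared space before attending.
        self.proj_plm = nn.Linear(plm_dim, hidden)
        self.proj_syn = nn.Linear(syn_dim, hidden)
        self.proj_srl = nn.Linear(srl_dim, hidden)
        self.attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)

    def forward(self, plm_emb, syn_feat, srl_feat):
        # plm_emb: (batch, seq, plm_dim); syn_feat/srl_feat: (batch, seq, *_dim)
        views = torch.stack(
            [self.proj_plm(plm_emb), self.proj_syn(syn_feat), self.proj_srl(srl_feat)],
            dim=2,
        )                                             # (batch, seq, 3, hidden)
        b, s, v, h = views.shape
        views = views.view(b * s, v, h)
        # Let the PLM view attend over all three views per token.
        query = views[:, :1, :]                       # (b*s, 1, hidden)
        fused, _ = self.attn(query, views, views)     # (b*s, 1, hidden)
        return fused.view(b, s, h)                    # (batch, seq, hidden)


class MentionPairScorer(nn.Module):
    """Score whether two mention representations corefer (illustrative)."""

    def __init__(self, hidden=256):
        super().__init__()
        self.ffnn = nn.Sequential(
            nn.Linear(hidden * 3, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, m1, m2):
        # Concatenate both mentions with their elementwise product,
        # a common pairing scheme in span-based coreference models.
        return self.ffnn(torch.cat([m1, m2, m1 * m2], dim=-1))


if __name__ == "__main__":
    fusion = SyntaxSemanticFusion()
    scorer = MentionPairScorer()
    plm = torch.randn(2, 10, 768)    # stand-in for pretrained LM outputs
    syn = torch.randn(2, 10, 64)     # stand-in for syntax-parse features
    srl = torch.randn(2, 10, 64)     # stand-in for semantic-role features
    tokens = fusion(plm, syn, srl)   # (2, 10, 256)
    score = scorer(tokens[:, 3], tokens[:, 7])
    print(score.shape)               # torch.Size([2, 1])
```

In this sketch the fused token vectors would feed a standard span-based coreference pipeline; the paper's actual fusion and scoring architecture may differ.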