
Publication:

Semi-supervised Semantic Role Labeling Using the Latent Words Language Model

K. Deschacht • Marie-Francine Moens • Conference on Empirical Methods in Natural Language Processing (EMNLP) • 06 August 2009

TLDR: The Latent Words Language Model is presented: a language model that learns word similarities from unlabeled texts and uses these similarities in different semi-supervised SRL methods, either as additional features or to automatically expand a small training set.

Citations: 85
Abstract: Semantic Role Labeling (SRL) has proved to be a valuable tool for performing automatic analysis of natural language texts. Currently, however, most systems rely on a large training set, which is manually annotated, an effort that needs to be repeated whenever different languages or a different set of semantic roles is used in a certain application. A possible solution for this problem is semi-supervised learning, where a small set of training examples is automatically expanded using unlabeled texts. We present the Latent Words Language Model, which is a language model that learns word similarities from unlabeled texts. We use these similarities for different semi-supervised SRL methods, as additional features or to automatically expand a small training set. We evaluate the methods on the PropBank dataset and find that for small training sizes our best-performing system achieves an error reduction of 33.27% F1-measure compared to a state-of-the-art supervised baseline.
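
To make the feature-based variant concrete, here is a minimal Python sketch: word similarities learned from unlabeled text are added as soft, weighted features alongside a standard supervised feature set, so a classifier trained on one word can generalize to similar unseen words. Everything in it (the SIMILAR_WORDS table, base_features, lwlm_features, and the example data) is a hypothetical illustration of the general mechanism, not the paper's actual model or feature set.

```python
# Hypothetical sketch of using LWLM-style word similarities as extra
# SRL features. All names and numbers below are invented for the
# illustration; the paper's feature set and model differ.

# Assumed output of a latent-words-style language model: for each
# observed word, similar "latent" words with probabilities learned
# from unlabeled text.
SIMILAR_WORDS = {
    "acquired": [("bought", 0.41), ("purchased", 0.33), ("obtained", 0.12)],
    "firm":     [("company", 0.52), ("business", 0.21)],
}

def base_features(token, predicate):
    """A toy supervised SRL feature set: word and predicate identity."""
    return {f"word={token}": 1.0, f"pred={predicate}": 1.0}

def lwlm_features(token, predicate, top_k=3):
    """Augment the base features with soft 'latent word' features,
    weighted by similarity. A model that has only seen 'bought' in
    training can then still fire useful features on 'acquired'."""
    feats = base_features(token, predicate)
    for latent, prob in SIMILAR_WORDS.get(token, [])[:top_k]:
        feats[f"latent_word={latent}"] = prob
    return feats

if __name__ == "__main__":
    print(lwlm_features("acquired", "acquire"))
    # {'word=acquired': 1.0, 'pred=acquire': 1.0,
    #  'latent_word=bought': 0.41, 'latent_word=purchased': 0.33,
    #  'latent_word=obtained': 0.12}
```

The abstract's other variant, automatic expansion of a small training set, would instead use the same similarity table to generate or relabel additional training instances rather than to extend the feature vector.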
