Applying Multi-Sense Embeddings for German Verbs to Determine Semantic Relatedness and to Detect Non-literal Language

Anthology ID: E17-2086
Volume: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
Month: April
Year: 2017
Address: Valencia, Spain
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 535–542
Bibkey: koper-schulte-im-walde-2017-applying

Abstract: To date, the majority of computational models still determine the semantic relatedness between words (or larger linguistic units) on the type level. In this paper, we compare and extend multi-sense embeddings in order to model and utilise word senses on the token level. We focus on the challenging class of complex verbs, and evaluate the model variants on various semantic tasks: semantic classification, predicting compositionality, and detecting non-literal language usage. While there is no overall best model, all models significantly outperform a word2vec single-sense skip-gram baseline, thus demonstrating the need to distinguish between word senses in a distributional semantic model.
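The abstract's core claim is that keeping several sense vectors per word beats a single conflated vector when scoring relatedness. A minimal toy sketch of that idea (not the authors' implementation; the sense inventories, vectors, and the MaxSim-style scoring over best-matching sense pairs are illustrative assumptions):

```python
from math import sqrt

def cosine(u, v):
    # standard cosine similarity between two dense vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def max_sim(senses_a, senses_b):
    # score two words by their best-matching pair of sense vectors;
    # a common multi-sense strategy, used here purely for illustration
    return max(cosine(u, v) for u in senses_a for v in senses_b)

# made-up toy vectors: two senses of an ambiguous word vs. one
# single-sense vector that averages (and so conflates) both senses
ambiguous_senses = [[1.0, 0.0, 0.1],   # hypothetical sense 1
                    [0.0, 1.0, 0.2]]   # hypothetical sense 2
single_vector = [[0.5, 0.5, 0.15]]     # conflated single-sense embedding
context_word = [[0.1, 0.9, 0.3]]       # related to sense 2 only

multi_score = max_sim(ambiguous_senses, context_word)
single_score = max_sim(single_vector, context_word)
# the sense-aware score recovers the relatedness that averaging dilutes
print(multi_score > single_score)
```

The single averaged vector sits between both senses, so its similarity to a sense-2 context is pulled down; picking the best-matching sense avoids that dilution, which is the intuition behind the paper's token-level comparison.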