We describe BigSense, a neural network-based approach to highly efficient word sense disambiguation (WSD). BigSense is trained on the entire set of English Wikipedia disambiguation pages and achieves state-of-the-art results while being many times faster than its competitors. This makes it possible to disambiguate very large amounts of text with reference to the largest freely available disambiguation model, while keeping the model's time complexity manageable. Our approach thus paves the way for large-scale disambiguation in text-related digital humanities.


  • T. Uslu, A. Mehler, C. Schulz, and D. Baumartz, “BigSense: a Word Sense Disambiguator for Big Data,” in Proceedings of the Digital Humanities 2019, (DH2019), 2019.

      @inproceedings{Uslu:Mehler:Schulz:Baumartz:2019,
        author = "Uslu, Tolga and Mehler, Alexander and Schulz, Clemens and Baumartz, Daniel",
        booktitle = "{Proceedings of the Digital Humanities 2019, (DH2019)}",
        location = "Utrecht, Netherlands",
        series = "{DH2019}",
        title = "{{BigSense}: a Word Sense Disambiguator for Big Data}",
        year = 2019
      }