We describe BigSense, a neural network-based approach to highly efficient word sense disambiguation (WSD). BigSense is trained on the full set of English Wikipedia disambiguation pages and achieves state-of-the-art results while running many times faster than competing systems. This makes it possible to disambiguate very large amounts of text against the largest freely available disambiguation resource while keeping the model's time complexity manageable. Our approach thus paves the way for large-scale disambiguation in text-oriented digital humanities.