Since each subsequent layer of a neural network computes a higher-level abstraction of its input, the activations of a hidden layer can be used as a semantic encoding.
word2vec trains a network to predict a word in a text from its surrounding context.
Train a network to fill in the blank in a sentence:
The _ wears a crown.
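A minimal sketch of how such fill-in-the-blank training data could be generated: slide a window over a sentence and pair each target word with its surrounding context. (The window size and the example sentence here are illustrative choices, not part of any particular word2vec implementation.)

```python
# Sketch: turn a sentence into (context, target) "fill in the blank"
# training pairs, using a window of one word on each side.
sentence = "the queen wears a crown".split()
window = 1

pairs = []
for i, target in enumerate(sentence):
    # Context = neighbors within the window, with the target blanked out.
    context = sentence[max(0, i - window):i] + sentence[i + 1:i + 1 + window]
    pairs.append((context, target))

for context, target in pairs:
    print(context, "->", target)
# e.g. ['the', 'wears'] -> 'queen'
```

A real pipeline would map words to integer indices and feed these pairs to the network, but the pair-generation step is the same idea.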
When trained on a large corpus, the last hidden layer has proven useful as a word embedding. The resulting vectors exhibit nice algebraic properties: for example, vec("king") − vec("man") + vec("woman") lies close to vec("queen").
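The algebraic property can be demonstrated with vector arithmetic plus a nearest-neighbor lookup by cosine similarity. The 4-dimensional vectors below are toy values chosen by hand to illustrate the mechanism; a real word2vec model would learn hundreds of dimensions from a large corpus.

```python
import numpy as np

# Toy embedding table (hand-picked illustrative vectors, NOT real
# word2vec output; a trained model learns these from a corpus).
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
    "apple": np.array([0.5, 0.5, 0.5, 0.5]),
}

def nearest(vec, exclude=()):
    """Vocabulary word whose embedding has the highest cosine
    similarity to `vec`, skipping words in `exclude`."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in emb if w not in exclude),
               key=lambda w: cos(emb[w], vec))

# The famous analogy: king - man + woman ≈ queen.
v = emb["king"] - emb["man"] + emb["woman"]
print(nearest(v, exclude={"king", "man", "woman"}))  # queen
```

Excluding the query words is standard practice in analogy evaluation, since the result vector usually stays closest to one of its own inputs.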
An embedding is only as good as the corpus it was trained on.