(2015. 6) Skip Thought

  • published in June 2015

  • Ryan Kiros, Yukun Zhu, Ruslan Salakhutdinov, Richard S. Zemel, Antonio Torralba, Raquel Urtasun and Sanja Fidler

Simple summary

  • Approach for unsupervised learning of a generic, distributed sentence encoder.

  • Extends the skip-gram model of word2vec to the sentence level: an encoder-decoder model encodes each sentence and trains two decoders to predict the previous and next sentences.

  • Sentences that share semantic and syntactic properties are thus mapped to similar vector representations.
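The training signal comes from sentence context: each sentence in a corpus is paired with its neighbors, and the decoders are trained to reconstruct those neighbors. A minimal sketch of how such (previous, current, next) training triples are extracted from ordered text (an illustrative helper, not the paper's implementation; the function name `context_triples` is an assumption):

```python
def context_triples(sentences):
    """Yield (previous, current, next) triples from an ordered list of
    sentences; the encoder reads the current sentence and the two
    decoders learn to predict its neighbors."""
    for i in range(1, len(sentences) - 1):
        yield sentences[i - 1], sentences[i], sentences[i + 1]

doc = [
    "I got back home.",
    "I could see the cat on the steps.",
    "This was strange.",
]
for prev, cur, nxt in context_triples(doc):
    print(cur, "->", prev, "|", nxt)
```

Because the objective only requires contiguous sentences, any ordered corpus (the paper uses the BookCorpus dataset) supplies training data without labels, which is what makes the method unsupervised.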
