In this paper, we propose a novel approach to reader-emotion categorization that combines word embeddings learned by neural networks with an SVM classifier. Word embedding methods learn continuous distributed vector representations of words through neural networks; these representations capture semantic context and syntactic cues, and can subsequently be used to infer similarity among words, sentences, and even documents. We evaluate various methods of combining word embeddings for reader-emotion categorization on a Chinese news corpus. Results demonstrate that the proposed method achieves performance comparable to, or better than, several existing approaches.
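To make the pipeline concrete, the following is a minimal sketch of one way to combine word embeddings into a document vector (here, simple averaging) and classify it with an SVM. The vocabulary, random toy embeddings, and emotion labels are illustrative assumptions, not the paper's actual data or embedding model; in practice the vectors would come from a trained neural network such as word2vec.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Toy embeddings standing in for vectors learned by a neural network.
# Words, labels, and dimensions here are illustrative assumptions.
rng = np.random.default_rng(0)
DIM = 50
vocab = ["economy", "crash", "loss", "rescue", "win", "festival"]
embeddings = {w: rng.normal(size=DIM) for w in vocab}

def doc_vector(tokens):
    """Combine word embeddings by averaging them -- one of several
    possible combination schemes (sum and min/max pooling are others)."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(DIM)

# Tiny labeled corpus: token lists paired with reader-emotion labels.
docs = [(["economy", "crash", "loss"], "sad"),
        (["rescue", "win"], "happy"),
        (["festival", "win"], "happy"),
        (["crash", "loss"], "sad")]

X = np.stack([doc_vector(tokens) for tokens, _ in docs])
y = [label for _, label in docs]

# Train a linear SVM on the combined embedding features.
clf = LinearSVC().fit(X, y)
pred = clf.predict([doc_vector(["win", "festival"])])
```

In a real experiment, each combination scheme would produce a different feature matrix `X`, and their classification performance on held-out news articles would be compared.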