As a popular topic model, Probabilistic Latent Semantic Analysis (PLSA) has been widely applied in text clustering due to its reliability and practicability. While the independence assumption contributes to its practicability, it discards the rich local information between words, which in some cases results in incoherent topics. In this paper, we propose an enhanced PLSA model embedded with word correlation (EWPLSA) for text clustering. The new model can exploit a wide range of pairwise semantic constraints between words to obtain more meaningful topics. We study how to incorporate the context information of all words by training with the global semantic distribution. In each iteration, the relationship between similar words is amplified from a global perspective, yet it is also constrained by the other words because all word contexts are considered. In this way, instead of being generated independently, the topic assignment of each word is influenced by its correlated words. In addition, we initialize several important parameters before the first iteration of the EM algorithm according to the local correlation information between words, which reduces the number of iterations required. Experiments on two datasets demonstrate the effectiveness of our method.
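The idea sketched above can be made concrete with a minimal illustration: standard PLSA EM updates, plus an extra step that smooths the topic-word distribution P(w|z) with a word-correlation matrix so that correlated words pull each other's topic mass together. This is only a sketch of the mechanism under stated assumptions, not the paper's exact EWPLSA update; the function name, the interpolation weight `lam`, and the form of the correlation matrix `C` are all hypothetical choices for illustration.

```python
import numpy as np

def plsa_with_word_correlation(X, C, n_topics=2, n_iter=50, lam=0.3, seed=0):
    """Illustrative PLSA EM where P(w|z) is interpolated with a
    word-correlation smoothing after each M-step (a hedged sketch,
    not the paper's exact EWPLSA formulation).

    X : (n_docs, n_words) document-word count matrix
    C : (n_words, n_words) row-normalized word-correlation matrix (assumed given)
    lam : interpolation weight for the correlation term (hypothetical parameter)
    """
    rng = np.random.default_rng(seed)
    n_docs, n_words = X.shape
    # Randomly initialize P(z|d) and P(w|z), then normalize each row.
    p_z_d = rng.random((n_docs, n_topics)); p_z_d /= p_z_d.sum(1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(1, keepdims=True)
    for _ in range(n_iter):
        # E-step: posterior P(z|d,w) proportional to P(z|d) * P(w|z).
        post = p_z_d[:, :, None] * p_w_z[None, :, :]          # shape (d, z, w)
        post /= post.sum(1, keepdims=True) + 1e-12
        # M-step: re-estimate P(w|z) from expected topic-word counts.
        nzw = np.einsum('dw,dzw->zw', X, post)
        p_w_z = nzw / (nzw.sum(1, keepdims=True) + 1e-12)
        # Correlation step: shift probability mass toward correlated words,
        # so a word's topic assignment is influenced by its correlated words.
        p_w_z = (1 - lam) * p_w_z + lam * p_w_z @ C.T
        p_w_z /= p_w_z.sum(1, keepdims=True)
        # Re-estimate P(z|d) from expected document-topic counts.
        ndz = np.einsum('dw,dzw->dz', X, post)
        p_z_d = ndz / (ndz.sum(1, keepdims=True) + 1e-12)
    return p_z_d, p_w_z
```

In this sketch the correlation matrix acts globally: every word's distribution is re-weighted by all other words at once, which mirrors the abstract's point that amplification between similar words is simultaneously restricted by the full set of word contexts.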