He, Yulan and Zhou, Deyu
|DOI (Digital Object Identifier) Link:||https://doi.org/10.1016/j.ipm.2010.11.003|
Sentiment analysis concerns automatically identifying the sentiment or opinion expressed in a given piece of text. Most prior work either uses prior lexical knowledge, defined as the sentiment polarity of words, or views the task as a text classification problem and relies on labeled corpora to train a sentiment classifier. While lexicon-based approaches do not adapt well to different domains, corpus-based approaches require expensive manual annotation effort.
In this paper, we propose a novel framework in which an initial classifier is learned by incorporating prior information extracted from an existing sentiment lexicon, with preferences on the expected sentiment labels of those lexicon words expressed using generalized expectation criteria. Documents classified with high confidence are then used as pseudo-labeled examples for automatic domain-specific feature acquisition. The word-class distributions of such self-learned features are estimated from the pseudo-labeled examples and are used to train another classifier by constraining the model's predictions on unlabeled instances. Experiments on both the movie-review data and the multi-domain sentiment dataset show that our approach attains comparable or better performance than existing weakly-supervised sentiment classification methods, despite using no labeled documents.
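The two-stage pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tiny lexicon, documents, and confidence threshold are invented for the example, a simple lexicon vote stands in for the generalized-expectation-trained initial classifier, and a naive-Bayes-style scorer stands in for the second classifier trained on self-learned word-class distributions.

```python
# Hedged sketch: weakly-supervised self-training from a sentiment lexicon.
# Stage 1: label documents with a lexicon-based classifier; keep confident ones.
# Stage 2: estimate word-class distributions from the pseudo-labeled documents
#          and use them as self-learned features for classification.
from collections import Counter, defaultdict
import math

# Illustrative lexicon (assumption; any polarity lexicon could be used).
LEXICON = {"good": "pos", "great": "pos", "excellent": "pos",
           "bad": "neg", "awful": "neg", "boring": "neg"}

def lexicon_score(doc):
    """Stage 1 proxy: label a document by majority lexicon vote.
    Returns (label, confidence); label is None if no lexicon word occurs."""
    votes = Counter(LEXICON[w] for w in doc.split() if w in LEXICON)
    total = sum(votes.values())
    if total == 0:
        return None, 0.0
    label, n = votes.most_common(1)[0]
    return label, n / total

def self_train(docs, threshold=0.75):
    """Stage 2: keep confidently labeled documents as pseudo-labels, then
    estimate word-class counts (the self-learned features)."""
    pseudo = [(d, lab) for d in docs
              for lab, conf in [lexicon_score(d)]
              if lab is not None and conf >= threshold]
    counts = defaultdict(Counter)  # word -> {class: count}
    for doc, lab in pseudo:
        for w in doc.split():
            counts[w][lab] += 1
    return counts

def classify(doc, counts, classes=("pos", "neg")):
    """Score a new document with add-one-smoothed word-class log-probabilities
    derived from the self-learned features."""
    score = {c: 0.0 for c in classes}
    for w in doc.split():
        if w in counts:
            total = sum(counts[w].values())
            for c in classes:
                score[c] += math.log((counts[w][c] + 1) / (total + len(classes)))
    return max(score, key=score.get)

docs = ["a great excellent film", "boring and awful plot",
        "great plot", "awful film"]
features = self_train(docs)
print(classify("excellent plot", features))  # prints "pos"
```

The key property this sketch preserves is that no document-level labels are used anywhere: supervision enters only through the lexicon, and domain-specific features ("plot", "film") acquire polarity statistics from the pseudo-labeled corpus.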
|Item Type:||Journal Article|
|Copyright Holders:||2010 Elsevier Ltd.|
|Keywords:||sentiment analysis; opinion mining; self-training; generalized expectation; self-learned features|
|Academic Unit/Department:||Faculty of Science, Technology, Engineering and Mathematics (STEM) > Knowledge Media Institute (KMi)
Faculty of Science, Technology, Engineering and Mathematics (STEM)|
|Interdisciplinary Research Centre:||Centre for Research in Computing (CRC)|
|Depositing User:||Yulan He|
|Date Deposited:||16 Jan 2012 10:30|
|Last Modified:||07 Oct 2016 13:54|