Zhou, Deyu and He, Yulan (2011).
DOI: https://doi.org/10.1145/2063576.2063881
Abstract
Natural language understanding (NLU) aims to map sentences to their semantic meaning representations. Statistical approaches to NLU normally require fully-annotated training data, where each sentence is paired with its word-level semantic annotations. In this paper, we propose a novel learning framework which trains Hidden Markov Support Vector Machines (HM-SVMs) without the use of expensive fully-annotated data. In particular, our learning approach takes as input a training set of sentences labelled with abstract semantic annotations encoding underlying embedded structural relations, and automatically induces derivation rules that map sentences to their semantic meaning representations. The proposed approach has been tested on the DARPA Communicator Data and achieved an F-measure of 93.18%, outperforming the previously proposed approaches of training the hidden vector state model or conditional random fields from unaligned data, with relative error reductions of 43.3% and 10.6% respectively.
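To make the task concrete: at decode time, a trained HM-SVM assigns each word a semantic tag via Viterbi search over learned emission and transition scores. The sketch below illustrates only that decoding step on a toy flight-query example; the tag set, sentence, and hand-set scores are invented for illustration, whereas the paper's actual contribution is learning such weights from abstract (unaligned) semantic annotations rather than from word-level labels.

```python
# Toy Viterbi decode, illustrating the word-to-semantic-tag mapping an
# HM-SVM performs. All tags, words, and scores here are hypothetical;
# a real model would learn these weights from training data.
TAGS = ["O", "TOLOC.CITY", "DEPART.DAY"]

# Hand-set emission scores: EMIT[tag][word]; unseen words score -5.0
EMIT = {
    "O": {"flight": 1.0, "to": 1.0, "on": 1.0},
    "TOLOC.CITY": {"boston": 2.0},
    "DEPART.DAY": {"monday": 2.0},
}

# Hand-set transition scores between consecutive tags; default -1.0
TRANS = {("O", "O"): 0.5, ("O", "TOLOC.CITY"): 1.0,
         ("TOLOC.CITY", "O"): 0.5, ("O", "DEPART.DAY"): 1.0}

def viterbi(words):
    """Return the highest-scoring tag sequence for a word sequence."""
    # best[i][t] = (score, path) of the best tag sequence ending in t
    best = [{t: (EMIT[t].get(words[0], -5.0), [t]) for t in TAGS}]
    for i in range(1, len(words)):
        layer = {}
        for t in TAGS:
            e = EMIT[t].get(words[i], -5.0)
            layer[t] = max(
                (best[i - 1][p][0] + TRANS.get((p, t), -1.0) + e,
                 best[i - 1][p][1] + [t])
                for p in TAGS)
        best.append(layer)
    return max(best[-1].values())[1]

print(viterbi("flight to boston on monday".split()))
# → ['O', 'O', 'TOLOC.CITY', 'O', 'DEPART.DAY']
```

The point of the paper is that the flat tag sequence above is never observed during training; only an abstract annotation such as FLIGHT(TOLOC.CITY, DEPART.DAY) is available, and the alignment to individual words must be induced.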
About
- Item ORO ID: 29460
- Item Type: Conference or Workshop Item
- Extra Information: CIKM '11 Proceedings of the 20th ACM International Conference on Information and Knowledge Management, ACM New York, NY, USA, ©2011. ISBN: 978-1-4503-0717-8
- Keywords: hidden Markov support vector machines (HM-SVMs); natural language understanding; semantic parsing; algorithms; experimentation
- Academic Unit or School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Knowledge Media Institute (KMi); Faculty of Science, Technology, Engineering and Mathematics (STEM)
- Research Group: Centre for Research in Computing (CRC)
- Copyright Holders: © 2011 ACM
- Depositing User: Kay Dave