Neural attention for learning to rank questions in community question answering

Salvatore Romeo, Giovanni Da San Martino, Alberto Barrón-Cedeño, Alessandro Moschitti, Yonatan Belinkov, Wei-Ning Hsu, Yu Zhang, Mitra Mohtarami, James Glass

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Citations (Scopus)

Abstract

In real-world data, e.g., from Web forums, text is often contaminated with redundant or irrelevant content, which introduces noise into machine learning algorithms. In this paper, we apply Long Short-Term Memory networks with an attention mechanism, which can select the important parts of a text, to the task of retrieving similar questions from community Question Answering (cQA) forums. In particular, we use the attention weights to select both entire sentences and their subparts, i.e., words/chunks, from shallow syntactic trees. More interestingly, we apply tree kernels to the filtered text representations, thus exploiting the implicit features of the subtree space for learning to rerank questions. Our results show that attention-based pruning achieves the top position in the cQA challenge of SemEval 2016, with a relatively large margin over the other participants, while greatly decreasing the running time.
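The abstract describes two coupled steps: an attention-equipped LSTM whose weights flag the important sentences (and chunks) of a forum question, and a tree-kernel reranker applied to the pruned syntactic trees. Below is a minimal PyTorch sketch of the first step, attention-based sentence pruning. The class name, layer sizes, keep ratio, and the use of pre-computed sentence embeddings are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AttentiveSentenceScorer(nn.Module):
    """Hypothetical sketch: a bi-LSTM over a question's sentence embeddings,
    with an attention layer that yields one weight per sentence."""

    def __init__(self, emb_dim=100, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, sent_embs):
        # sent_embs: (batch, num_sents, emb_dim), pre-computed sentence vectors
        h, _ = self.lstm(sent_embs)           # (batch, num_sents, 2 * hidden_dim)
        weights = torch.softmax(self.attn(h).squeeze(-1), dim=-1)
        question_vec = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return question_vec, weights          # attended vector + per-sentence weights

def prune_by_attention(sentences, weights, keep_ratio=0.5):
    """Keep only the most-attended sentences, in their original order, so the
    downstream tree-kernel reranker sees a smaller, cleaner syntactic forest."""
    k = max(1, round(len(sentences) * keep_ratio))
    kept = weights.topk(k).indices.sort().values
    return [sentences[i] for i in kept.tolist()]

# Toy usage: one question with six sentences and random embeddings.
model = AttentiveSentenceScorer()
_, w = model(torch.randn(1, 6, 100))
print(prune_by_attention([f"sentence {i}" for i in range(6)], w[0]))
```

For the second step, a standard instance of the "implicit features of the subtree space" mentioned in the abstract is the Collins-Duffy subset-tree kernel; the sketch below is the textbook formulation over nested-tuple parse trees, with an assumed decay value, not necessarily the kernel variant or parameters used in the paper.

```python
def _production(t):
    """Node label plus the labels of its children (leaves are plain strings)."""
    return (t[0],) + tuple(c if isinstance(c, str) else c[0] for c in t[1:])

def _nodes(t):
    """All internal nodes of a nested-tuple tree."""
    out = [t]
    for c in t[1:]:
        if not isinstance(c, str):
            out.extend(_nodes(c))
    return out

def subtree_kernel(t1, t2, lam=0.4):
    """Collins-Duffy subset-tree kernel: counts the syntactic fragments two
    trees share, decaying larger fragments by lam."""
    memo = {}

    def delta(a, b):
        key = (id(a), id(b))
        if key not in memo:
            if _production(a) != _production(b):
                memo[key] = 0.0
            else:
                val = lam
                for ca, cb in zip(a[1:], b[1:]):
                    if not isinstance(ca, str) and not isinstance(cb, str):
                        val *= 1.0 + delta(ca, cb)
                memo[key] = val
        return memo[key]

    return sum(delta(a, b) for a in _nodes(t1) for b in _nodes(t2))

# Trees as nested tuples; the two differ only in the noun leaf, so most
# fragments (D, V, NP, VP, S) still match and the kernel value is positive.
t1 = ("S", ("NP", ("D", "the"), ("N", "cat")), ("VP", ("V", "sat")))
t2 = ("S", ("NP", ("D", "the"), ("N", "dog")), ("VP", ("V", "sat")))
print(subtree_kernel(t1, t2))
```

In a pipeline like the one the abstract outlines, kernel values between the pruned trees of a new question and those of candidate forum questions would feed a kernel-based reranker such as an SVM.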

Original language: English
Title of host publication: COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016
Subtitle of host publication: Technical Papers
Publisher: Association for Computational Linguistics, ACL Anthology
Pages: 1734-1745
Number of pages: 12
ISBN (Print): 9784879747020
Publication status: Published - 1 Jan 2016
Event: 26th International Conference on Computational Linguistics, COLING 2016 - Osaka, Japan
Duration: 11 Dec 2016 - 16 Dec 2016

Other

Other: 26th International Conference on Computational Linguistics, COLING 2016
Country: Japan
City: Osaka
Period: 11/12/16 - 16/12/16

Fingerprint

  • Syntactics
  • Learning algorithms
  • Learning systems
  • Data storage equipment
  • Learning
  • Community
  • Question Answering

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Language and Linguistics
  • Linguistics and Language

Cite this

Romeo, S., Da San Martino, G., Barrón-Cedeño, A., Moschitti, A., Belinkov, Y., Hsu, W. N., ... Glass, J. (2016). Neural attention for learning to rank questions in community question answering. In COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016: Technical Papers (pp. 1734-1745). Association for Computational Linguistics, ACL Anthology.

Neural attention for learning to rank questions in community question answering. / Romeo, Salvatore; Da San Martino, Giovanni; Barrón-Cedeño, Alberto; Moschitti, Alessandro; Belinkov, Yonatan; Hsu, Wei-Ning; Zhang, Yu; Mohtarami, Mitra; Glass, James.

COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016: Technical Papers. Association for Computational Linguistics, ACL Anthology, 2016. p. 1734-1745.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Romeo, S, Da San Martino, G, Barrón-Cedeño, A, Moschitti, A, Belinkov, Y, Hsu, WN, Zhang, Y, Mohtarami, M & Glass, J 2016, Neural attention for learning to rank questions in community question answering. in COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016: Technical Papers. Association for Computational Linguistics, ACL Anthology, pp. 1734-1745, 26th International Conference on Computational Linguistics, COLING 2016, Osaka, Japan, 11/12/16.
Romeo S, Da San Martino G, Barrón-Cedeño A, Moschitti A, Belinkov Y, Hsu WN et al. Neural attention for learning to rank questions in community question answering. In COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016: Technical Papers. Association for Computational Linguistics, ACL Anthology. 2016. p. 1734-1745
Romeo, Salvatore ; Da San Martino, Giovanni ; Barrón-Cedeño, Alberto ; Moschitti, Alessandro ; Belinkov, Yonatan ; Hsu, Wei-Ning ; Zhang, Yu ; Mohtarami, Mitra ; Glass, James. / Neural attention for learning to rank questions in community question answering. COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016: Technical Papers. Association for Computational Linguistics, ACL Anthology, 2016. pp. 1734-1745
@inproceedings{e9efe79e0f21418b98c687a6de68b28d,
title = "Neural attention for learning to rank questions in community question answering",
abstract = "In real-world data, e.g., from Web forums, text is often contaminated with redundant or irrelevant content, which introduces noise into machine learning algorithms. In this paper, we apply Long Short-Term Memory networks with an attention mechanism, which can select the important parts of a text, to the task of retrieving similar questions from community Question Answering (cQA) forums. In particular, we use the attention weights to select both entire sentences and their subparts, i.e., words/chunks, from shallow syntactic trees. More interestingly, we apply tree kernels to the filtered text representations, thus exploiting the implicit features of the subtree space for learning to rerank questions. Our results show that attention-based pruning achieves the top position in the cQA challenge of SemEval 2016, with a relatively large margin over the other participants, while greatly decreasing the running time.",
author = "Salvatore Romeo and {Da San Martino}, Giovanni and {Barrón-Cedeño}, Alberto and Alessandro Moschitti and Yonatan Belinkov and Hsu, {Wei-Ning} and Yu Zhang and Mitra Mohtarami and James Glass",
year = "2016",
month = "1",
day = "1",
language = "English",
isbn = "9784879747020",
pages = "1734--1745",
booktitle = "COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016",
publisher = "Association for Computational Linguistics, ACL Anthology",

}

TY - GEN

T1 - Neural attention for learning to rank questions in community question answering

AU - Romeo, Salvatore

AU - Da San Martino, Giovanni

AU - Barrón-Cedeño, Alberto

AU - Moschitti, Alessandro

AU - Belinkov, Yonatan

AU - Hsu, Wei-Ning

AU - Zhang, Yu

AU - Mohtarami, Mitra

AU - Glass, James

PY - 2016/1/1

Y1 - 2016/1/1

N2 - In real-world data, e.g., from Web forums, text is often contaminated with redundant or irrelevant content, which introduces noise into machine learning algorithms. In this paper, we apply Long Short-Term Memory networks with an attention mechanism, which can select the important parts of a text, to the task of retrieving similar questions from community Question Answering (cQA) forums. In particular, we use the attention weights to select both entire sentences and their subparts, i.e., words/chunks, from shallow syntactic trees. More interestingly, we apply tree kernels to the filtered text representations, thus exploiting the implicit features of the subtree space for learning to rerank questions. Our results show that attention-based pruning achieves the top position in the cQA challenge of SemEval 2016, with a relatively large margin over the other participants, while greatly decreasing the running time.

AB - In real-world data, e.g., from Web forums, text is often contaminated with redundant or irrelevant content, which introduces noise into machine learning algorithms. In this paper, we apply Long Short-Term Memory networks with an attention mechanism, which can select the important parts of a text, to the task of retrieving similar questions from community Question Answering (cQA) forums. In particular, we use the attention weights to select both entire sentences and their subparts, i.e., words/chunks, from shallow syntactic trees. More interestingly, we apply tree kernels to the filtered text representations, thus exploiting the implicit features of the subtree space for learning to rerank questions. Our results show that attention-based pruning achieves the top position in the cQA challenge of SemEval 2016, with a relatively large margin over the other participants, while greatly decreasing the running time.

UR - http://www.scopus.com/inward/record.url?scp=85054999113&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85054999113&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9784879747020

SP - 1734

EP - 1745

BT - COLING 2016 - 26th International Conference on Computational Linguistics, Proceedings of COLING 2016

PB - Association for Computational Linguistics, ACL Anthology

ER -