Learning contextual embeddings for structural semantic similarity using categorical information

Massimo Nicosia, Alessandro Moschitti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Tree kernels (TKs) and neural networks are two effective approaches for automatic feature engineering. In this paper, we combine them by modeling context word similarity in semantic TKs. This way, the latter can operate subtree matching by applying neural-based similarity on tree lexical nodes. We study how to learn representations for the words in context such that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese Network, which produces word representations while learning a binary text similarity. We set the latter considering examples in the same category as similar. The experiments on question and sentiment classification show that our semantic TK highly improves previous results.
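The abstract describes learning word representations with a Siamese network trained on a binary text-similarity signal, where two texts from the same category count as similar. As a rough, toy illustration of that setup (not the authors' implementation: the vocabulary, dimensions, and margin below are invented for the example), both branches share one encoder and are compared with cosine similarity under a contrastive-style loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and embedding table; both Siamese branches share W.
VOCAB = {"what": 0, "city": 1, "is": 2, "capital": 3, "good": 4, "movie": 5}
EMB_DIM = 8
W = rng.normal(scale=0.1, size=(len(VOCAB), EMB_DIM))

def encode(tokens):
    """Shared encoder: average the shared word embeddings of a text."""
    idx = [VOCAB[t] for t in tokens]
    return W[idx].mean(axis=0)

def cosine(u, v):
    """Similarity score produced by comparing the two branch outputs."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def contrastive_loss(sim, label, margin=0.5):
    """label=1: same category, pull embeddings together;
    label=0: different category, push similarity below the margin."""
    if label == 1:
        return (1.0 - sim) ** 2
    return max(0.0, sim - margin) ** 2

# Two questions from the same (hypothetical) category -> label 1.
q1 = encode(["what", "is", "capital"])
q2 = encode(["what", "city", "is", "capital"])
s = cosine(q1, q2)
loss_similar = contrastive_loss(s, 1)
```

In the paper's framing, the representations learned this way would then supply the lexical-node similarity used by the semantic tree kernel during subtree matching.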

Original language: English
Title of host publication: CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 260-270
Number of pages: 11
ISBN (Electronic): 9781945626548
Publication status: Published - 1 Jan 2017
Event: 21st Conference on Computational Natural Language Learning, CoNLL 2017 - Vancouver, Canada
Duration: 3 Aug 2017 - 4 Aug 2017

Publication series

Name: CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings

Conference

Conference: 21st Conference on Computational Natural Language Learning, CoNLL 2017
Country: Canada
City: Vancouver
Period: 3/8/17 - 4/8/17


ASJC Scopus subject areas

  • Linguistics and Language
  • Artificial Intelligence
  • Human-Computer Interaction

Cite this

Nicosia, M., & Moschitti, A. (2017). Learning contextual embeddings for structural semantic similarity using categorical information. In CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings (pp. 260-270). (CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings). Association for Computational Linguistics (ACL).

@inproceedings{39ac308d46204404abc229ff2569768c,
title = "Learning contextual embeddings for structural semantic similarity using categorical information",
abstract = "Tree kernels (TKs) and neural networks are two effective approaches for automatic feature engineering. In this paper, we combine them by modeling context word similarity in semantic TKs. This way, the latter can operate subtree matching by applying neural-based similarity on tree lexical nodes. We study how to learn representations for the words in context such that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese Network, which produces word representations while learning a binary text similarity. We set the latter considering examples in the same category as similar. The experiments on question and sentiment classification show that our semantic TK highly improves previous results.",
author = "Massimo Nicosia and Alessandro Moschitti",
year = "2017",
month = "1",
day = "1",
language = "English",
series = "CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings",
publisher = "Association for Computational Linguistics (ACL)",
pages = "260--270",
booktitle = "CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings",

}

TY - GEN

T1 - Learning contextual embeddings for structural semantic similarity using categorical information

AU - Nicosia, Massimo

AU - Moschitti, Alessandro

PY - 2017/1/1

Y1 - 2017/1/1

N2 - Tree kernels (TKs) and neural networks are two effective approaches for automatic feature engineering. In this paper, we combine them by modeling context word similarity in semantic TKs. This way, the latter can operate subtree matching by applying neural-based similarity on tree lexical nodes. We study how to learn representations for the words in context such that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese Network, which produces word representations while learning a binary text similarity. We set the latter considering examples in the same category as similar. The experiments on question and sentiment classification show that our semantic TK highly improves previous results.

AB - Tree kernels (TKs) and neural networks are two effective approaches for automatic feature engineering. In this paper, we combine them by modeling context word similarity in semantic TKs. This way, the latter can operate subtree matching by applying neural-based similarity on tree lexical nodes. We study how to learn representations for the words in context such that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese Network, which produces word representations while learning a binary text similarity. We set the latter considering examples in the same category as similar. The experiments on question and sentiment classification show that our semantic TK highly improves previous results.

UR - http://www.scopus.com/inward/record.url?scp=85048247621&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85048247621&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85048247621

T3 - CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings

SP - 260

EP - 270

BT - CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings

PB - Association for Computational Linguistics (ACL)

ER -