Distributional neural networks for automatic resolution of crossword puzzles

Aliaksei Severyn, Massimo Nicosia, Gianni Barlacchi, Alessandro Moschitti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Automatic resolution of Crossword Puzzles (CPs) heavily depends on the quality of the answer candidate lists produced by a retrieval system for each clue of the puzzle grid. Previous work has shown that such lists can be generated using Information Retrieval (IR) search algorithms applied to the databases containing previously solved CPs and reranked with tree kernels (TKs) applied to a syntactic tree representation of the clues. In this paper, we create a labelled dataset of 2 million clues on which we apply an innovative Distributional Neural Network (DNN) for reranking clue pairs. Our DNN is computationally efficient and can thus take advantage of such large datasets, showing a large improvement over the TK approach when the latter uses small training data. In contrast, when data is scarce, TKs outperform DNNs.
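The abstract describes a retrieve-then-rerank pipeline: an IR step pulls candidate answers from a database of previously solved clues, and a reranker (tree kernels or the proposed distributional neural network) reorders the retrieved clue pairs. The paper's actual models are not reproduced here; the following Python sketch only illustrates the general pipeline shape under stated assumptions: a TF-IDF retriever over a toy clue database and a stand-in similarity scorer built from hashed pseudo-embeddings in place of real pre-trained word vectors. All names (SOLVED_CLUES, retrieve, rerank, embed) are hypothetical and not from the authors' code.

# Minimal, hypothetical sketch of a retrieve-then-rerank pipeline for crossword
# clue answering, loosely following the setup summarized in the abstract.
# NOTE: this is NOT the authors' system; the tree-kernel and DNN rerankers of
# the paper are replaced by a toy embedding-similarity scorer.

import hashlib

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-in for a database of previously solved clues and their answers.
SOLVED_CLUES = [
    ("capital of france", "PARIS"),
    ("river through paris", "SEINE"),
    ("largest planet in the solar system", "JUPITER"),
    ("author of hamlet", "SHAKESPEARE"),
]


def embed(text, dim=50):
    # Average of deterministic pseudo-random word vectors; a stand-in for the
    # pre-trained distributional embeddings a real system would use.
    vecs = []
    for tok in text.lower().split():
        seed = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16) % (2**32)
        vecs.append(np.random.default_rng(seed).standard_normal(dim))
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)


def retrieve(query_clue, top_k=3):
    # IR step: rank stored clues against the query clue with TF-IDF cosine.
    texts = [clue for clue, _ in SOLVED_CLUES]
    vectorizer = TfidfVectorizer().fit(texts)
    sims = cosine_similarity(
        vectorizer.transform([query_clue]), vectorizer.transform(texts)
    ).ravel()
    order = sims.argsort()[::-1][:top_k]
    return [SOLVED_CLUES[i] for i in order]


def rerank(query_clue, candidates):
    # Reranking step: score each (query clue, retrieved clue) pair with a
    # distributional similarity; the paper plugs in TKs or a DNN here instead.
    q = embed(query_clue)
    scored = []
    for clue, answer in candidates:
        c = embed(clue)
        score = float(np.dot(q, c) / (np.linalg.norm(q) * np.linalg.norm(c) + 1e-9))
        scored.append((answer, clue, score))
    return sorted(scored, key=lambda t: t[2], reverse=True)


if __name__ == "__main__":
    query = "capital city of france"
    for answer, clue, score in rerank(query, retrieve(query)):
        print(f"{answer:12s} matched clue: '{clue}'  score={score:.3f}")

In the paper's setting, the reranking step would instead be the tree-kernel model or the DNN trained on the 2-million-clue dataset; the stand-in scorer above only marks where that component plugs in.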

Original language: English
Title of host publication: ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference
Publisher: Association for Computational Linguistics (ACL)
Pages: 199-204
Number of pages: 6
Volume: 2
ISBN (Print): 9781941643730
Publication status: Published - 2015
Event: 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, ACL-IJCNLP 2015 - Beijing, China
Duration: 26 Jul 2015 - 31 Jul 2015

Other

Other: 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, ACL-IJCNLP 2015
Country: China
City: Beijing
Period: 26/7/15 - 31/7/15

Fingerprint

Neural networks
Syntactics

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software

Cite this

Severyn, A., Nicosia, M., Barlacchi, G., & Moschitti, A. (2015). Distributional neural networks for automatic resolution of crossword puzzles. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 199-204). Association for Computational Linguistics (ACL).
