A stratified strategy for efficient kernel-based learning

Simone Filice, Danilo Croce, Roberto Basili

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In kernel-based learning, the targeted phenomenon is summarized by a set of explanatory examples derived from the training set. When the model size grows with the complexity of the task, such approaches become so computationally demanding that the adoption of comprehensive models is not always viable. In this paper, a general framework aimed at mitigating this problem is proposed: multiple classifiers are stratified and dynamically invoked according to increasing levels of complexity, corresponding to incrementally more expressive representation spaces. Computationally expensive inferences are thus adopted only when the classification at lower levels is too uncertain for an individual instance. The application of complex functions is avoided where possible, significantly reducing the overall cost. The proposed strategy has been integrated within two well-known algorithms: Support Vector Machines and the Passive-Aggressive Online classifier. A significant cost reduction (up to 90%), with a negligible performance drop, is observed on two Natural Language Processing tasks, i.e., Question Classification and Sentiment Analysis in Twitter.
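To make the cascading step concrete, the sketch below illustrates the kind of two-layer stratification the abstract describes: a cheap classifier handles every instance, and a more expressive (and more expensive) kernel classifier is invoked only when the cheap decision margin falls below a threshold. The scikit-learn models, the two representation spaces, and the margin threshold are illustrative assumptions, not the paper's exact formulation (which also covers a Passive-Aggressive online variant).

```python
# Minimal sketch of stratified inference, assuming scikit-learn and a binary
# task: a cheap linear classifier answers first, and the costlier kernel
# classifier is consulted only when the linear decision margin is too small.
import numpy as np
from sklearn.svm import LinearSVC, SVC


class StratifiedClassifier:
    def __init__(self, margin_threshold=0.5):
        self.cheap = LinearSVC()            # low-cost, less expressive layer
        self.expensive = SVC(kernel="rbf")  # high-cost, more expressive layer
        self.margin_threshold = margin_threshold

    def fit(self, X_cheap, X_rich, y):
        # Each layer is trained on its own representation space.
        self.cheap.fit(X_cheap, y)
        self.expensive.fit(X_rich, y)
        return self

    def predict(self, X_cheap, X_rich):
        preds = self.cheap.predict(X_cheap)
        margins = np.abs(self.cheap.decision_function(X_cheap))
        # Escalate only the uncertain instances; the rest never touch the
        # expensive kernel model, which is where the cost saving comes from.
        unsure = margins < self.margin_threshold
        if unsure.any():
            preds[unsure] = self.expensive.predict(X_rich[unsure])
        return preds
```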

Original language: English
Title of host publication: Proceedings of the National Conference on Artificial Intelligence
Publisher: AI Access Foundation
Pages: 2239-2245
Number of pages: 7
Volume: 3
ISBN (Print): 9781577357018
Publication status: Published - 1 Jun 2015
Externally published: Yes
Event: 29th AAAI Conference on Artificial Intelligence, AAAI 2015 and the 27th Innovative Applications of Artificial Intelligence Conference, IAAI 2015 - Austin, United States
Duration: 25 Jan 2015 - 30 Jan 2015

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Filice, S., Croce, D., & Basili, R. (2015). A stratified strategy for efficient kernel-based learning. In Proceedings of the National Conference on Artificial Intelligence (Vol. 3, pp. 2239-2245). AI Access Foundation.
