The availability of graph data with node attributes that can be either discrete or real-valued is constantly increasing. While existing kernel methods are effective techniques for dealing with graphs having discrete node labels, their adaptation to nondiscrete or continuous node attributes has been limited, mainly due to computational issues. Recently, a few kernels specifically tailored to this domain, which trade predictive performance for computational efficiency, have been proposed. In this brief, we propose a graph kernel for complex and continuous node attributes, whose features are tree structures extracted from specific graph visits. The kernel matches the complexity of state-of-the-art kernels while implicitly using a larger feature space. We further present an approximated variant of the kernel, which reduces its complexity significantly. Experimental results obtained on six real-world data sets show that the kernel is the best performing one on most of them. Moreover, in most cases, the approximated version achieves classification accuracy comparable to the current state-of-the-art kernels while greatly shortening the running times.
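The abstract describes kernels whose features are tree structures extracted from graph visits, compared over continuous node attributes. The following is only an illustrative sketch of that general idea, not the authors' kernel: it extracts a BFS visit tree from every root node and compares the attribute sets of two trees with an RBF similarity (all function names, the `depth` parameter, and the adjacency-list representation are assumptions for illustration).

```python
import numpy as np

def bfs_tree_nodes(adj, root, depth):
    """Collect the node indices of a BFS visit tree rooted at `root`,
    expanded for at most `depth` levels. `adj` is an adjacency list."""
    visited = {root}
    frontier = [root]
    nodes = [root]
    for _ in range(depth):
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in visited:
                    visited.add(v)
                    nxt.append(v)
                    nodes.append(v)
        frontier = nxt
    return nodes

def tree_similarity(x_attrs, y_attrs, gamma=1.0):
    """Compare two visit trees by summing RBF similarities between the
    continuous attribute vectors of their nodes."""
    total = 0.0
    for a in x_attrs:
        for b in y_attrs:
            total += np.exp(-gamma * np.sum((a - b) ** 2))
    return total

def graph_kernel(adj1, attr1, adj2, attr2, depth=2, gamma=1.0):
    """Toy graph kernel: sum tree similarities over all pairs of roots,
    one visit tree per node of each graph."""
    trees2 = [[attr2[i] for i in bfs_tree_nodes(adj2, r, depth)]
              for r in range(len(adj2))]
    total = 0.0
    for r1 in range(len(adj1)):
        t1 = [attr1[i] for i in bfs_tree_nodes(adj1, r1, depth)]
        for t2 in trees2:
            total += tree_similarity(t1, t2, gamma)
    return total
```

Because the per-tree comparison is a sum of symmetric RBF terms, the resulting kernel value is symmetric in its two graph arguments; a real implementation would instead compare tree structures and attributes jointly, as the brief proposes.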
|Journal||IEEE Transactions on Neural Networks and Learning Systems|
|Publication status||Accepted/In press - 13 Jun 2017|
- Big data applications
- Computational complexity
- Feature extraction
- Learning systems
- Machine learning
- Supervised learning
- Support vector machines
ASJC Scopus subject areas
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence