On the CVP for the root lattices via folding with deep ReLU neural networks

Vincent Corlay, Joseph Boutros, Philippe Ciblat, Loic Brunel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Point lattices and their decoding via neural networks are considered in this paper. Lattice decoding in ℝ^n, known as the closest vector problem (CVP), becomes a classification problem in the fundamental parallelotope with a piecewise linear function defining the boundary. Theoretical results are obtained by studying root lattices. We show how the number of pieces in the boundary function reduces dramatically with folding, from exponential to linear. This translates into a two-layer ReLU neural network requiring a number of neurons growing exponentially in n to solve the CVP, whereas this complexity becomes polynomial in n for a deep ReLU neural network.
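The folding phenomenon described in the abstract can be illustrated with a small numerical sketch in Python with NumPy. This is an illustration written for this record, not the paper's construction for root lattices: a one-dimensional "fold" such as the tent map is piecewise linear and needs only two ReLU units (since |t| = relu(t) + relu(-t)), and composing a few folds produces a map whose number of linear pieces grows exponentially in the depth while the neuron count grows only linearly. The names fold, deep_folds and count_linear_pieces are hypothetical helpers introduced here for the toy interval example.

# Minimal sketch (assumption: NumPy only), not the authors' decoder.
import numpy as np

def relu(t):
    # Standard rectifier; |t| = relu(t) + relu(-t) is the basic folding primitive.
    return np.maximum(t, 0.0)

def fold(x):
    # Tent map 1 - 2|x - 1/2| on [0, 1], built from exactly two ReLU units.
    t = x - 0.5
    return 1.0 - 2.0 * (relu(t) + relu(-t))

def deep_folds(x, depth):
    # Composing `depth` folds uses O(depth) ReLU units but yields roughly
    # 2**depth linear pieces in the resulting piecewise linear map.
    for _ in range(depth):
        x = fold(x)
    return x

def count_linear_pieces(y, x):
    # Count maximal intervals on which the sampled map is affine:
    # each change in the discrete slope marks the start of a new piece.
    slopes = np.diff(y) / np.diff(x)
    return 1 + int(np.count_nonzero(np.abs(np.diff(slopes)) > 1e-6))

if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 200001)
    for depth in range(1, 7):
        pieces = count_linear_pieces(deep_folds(x, depth), x)
        print(f"depth {depth}: {pieces} linear pieces from about {2 * depth} ReLU units")

Running this, the piece count roughly doubles with each extra fold while the neuron budget grows only linearly; a one-hidden-layer ReLU network in one dimension adds at most one breakpoint per hidden unit, so matching the same map would require a hidden width comparable to the number of pieces. This mirrors, in a toy setting, the exponential-to-polynomial gap between shallow and deep ReLU networks stated in the abstract.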

Original language: English
Title of host publication: 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1622-1626
Number of pages: 5
ISBN (Electronic): 9781538692912
DOIs: 10.1109/ISIT.2019.8849501
Publication status: Published - 1 Jul 2019
Event: 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Paris, France
Duration: 7 Jul 2019 - 12 Jul 2019

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2019-July
ISSN (Print): 2157-8095

Conference

Conference: 2019 IEEE International Symposium on Information Theory, ISIT 2019
Country: France
City: Paris
Period: 7/7/19 - 12/7/19

Fingerprint

Folding
Decoding
Roots
Neural Networks
Neurons
Polynomial Complexity
Piecewise Linear Function
Lattice Points
Polynomials
Classification Problems
Deep neural networks

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics

Cite this

Corlay, V., Boutros, J., Ciblat, P., & Brunel, L. (2019). On the CVP for the root lattices via folding with deep ReLU neural networks. In 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings (pp. 1622-1626). [8849501] (IEEE International Symposium on Information Theory - Proceedings; Vol. 2019-July). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT.2019.8849501

On the CVP for the root lattices via folding with deep ReLU neural networks. / Corlay, Vincent; Boutros, Joseph; Ciblat, Philippe; Brunel, Loic.

2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. p. 1622-1626 8849501 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2019-July).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Corlay, V, Boutros, J, Ciblat, P & Brunel, L 2019, On the CVP for the root lattices via folding with deep ReLU neural networks. in 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings., 8849501, IEEE International Symposium on Information Theory - Proceedings, vol. 2019-July, Institute of Electrical and Electronics Engineers Inc., pp. 1622-1626, 2019 IEEE International Symposium on Information Theory, ISIT 2019, Paris, France, 7/7/19. https://doi.org/10.1109/ISIT.2019.8849501
Corlay V, Boutros J, Ciblat P, Brunel L. On the CVP for the root lattices via folding with deep ReLU neural networks. In 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc. 2019. p. 1622-1626. 8849501. (IEEE International Symposium on Information Theory - Proceedings). https://doi.org/10.1109/ISIT.2019.8849501
Corlay, Vincent ; Boutros, Joseph ; Ciblat, Philippe ; Brunel, Loic. / On the CVP for the root lattices via folding with deep ReLU neural networks. 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 1622-1626 (IEEE International Symposium on Information Theory - Proceedings).
@inproceedings{892ddbe0d56844d19872288adea08f05,
title = "On the CVP for the root lattices via folding with deep ReLU neural networks",
abstract = "Point lattices and their decoding via neural networks are considered in this paper. Lattice decoding in reals n, known as the closest vector problem (CVP), becomes a classification problem in the fundamental parallelotope with a piecewise linear function defining the boundary. Theoretical results are obtained by studying root lattices. We show how the number of pieces in the boundary function reduces dramatically with folding, from exponential to linear. This translates into a two-layer ReLU neural network requiring a number of neurons growing exponentially in n to solve the CVP, whereas this complexity becomes polynomial in n for a deep ReLU neural network.",
author = "Vincent Corlay and Joseph Boutros and Philippe Ciblat and Loic Brunel",
year = "2019",
month = "7",
day = "1",
doi = "10.1109/ISIT.2019.8849501",
language = "English",
series = "IEEE International Symposium on Information Theory - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "1622--1626",
booktitle = "2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings",

}

TY - GEN

T1 - On the CVP for the root lattices via folding with deep ReLU neural networks

AU - Corlay, Vincent

AU - Boutros, Joseph

AU - Ciblat, Philippe

AU - Brunel, Loic

PY - 2019/7/1

Y1 - 2019/7/1

N2 - Point lattices and their decoding via neural networks are considered in this paper. Lattice decoding in ℝ^n, known as the closest vector problem (CVP), becomes a classification problem in the fundamental parallelotope with a piecewise linear function defining the boundary. Theoretical results are obtained by studying root lattices. We show how the number of pieces in the boundary function reduces dramatically with folding, from exponential to linear. This translates into a two-layer ReLU neural network requiring a number of neurons growing exponentially in n to solve the CVP, whereas this complexity becomes polynomial in n for a deep ReLU neural network.

AB - Point lattices and their decoding via neural networks are considered in this paper. Lattice decoding in ℝ^n, known as the closest vector problem (CVP), becomes a classification problem in the fundamental parallelotope with a piecewise linear function defining the boundary. Theoretical results are obtained by studying root lattices. We show how the number of pieces in the boundary function reduces dramatically with folding, from exponential to linear. This translates into a two-layer ReLU neural network requiring a number of neurons growing exponentially in n to solve the CVP, whereas this complexity becomes polynomial in n for a deep ReLU neural network.

UR - http://www.scopus.com/inward/record.url?scp=85073163010&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85073163010&partnerID=8YFLogxK

U2 - 10.1109/ISIT.2019.8849501

DO - 10.1109/ISIT.2019.8849501

M3 - Conference contribution

AN - SCOPUS:85073163010

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 1622

EP - 1626

BT - 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

ER -