Lattice-based cryptography

Lattice-based cryptography is the generic term for asymmetric cryptographic primitives based on lattices. While lattice-based cryptography has been studied for several decades, interest in it has been renewed as the prospects for a practical quantum computer improve. Unlike the more widely used and known public key cryptosystems such as RSA or Diffie-Hellman, which can be broken efficiently by a quantum computer, some lattice-based cryptosystems appear to be resistant to attack by both classical and quantum computers. Further, the Learning with Errors variants of lattice-based cryptography come with security proofs demonstrating that breaking the cryptography is at least as hard as solving known hard problems on lattices.

History

Lattices were first studied by the mathematicians Joseph Louis Lagrange and Carl Friedrich Gauss. In 1996, Miklós Ajtai showed in a seminal result how lattices can be used as a cryptographic primitive.[2] Lattices have since been used in computer algorithms and in cryptanalysis; for example, in 2009 Craig Gentry proposed the first fully homomorphic encryption scheme, which is based on lattices.[1]

Mathematical background

A lattice L is a discrete set of points in the n-dimensional Euclidean space Rn with a strong periodicity property. A basis of L is a set of vectors such that any element of L is uniquely represented as their linear combination with integer coefficients. When n is at least 2, each lattice has infinitely many different bases. All lattices in Rn have infinitely many elements, whereas in cryptography entities such as the ciphertext, public key, and private key must be taken from a finite space (bit strings of some fixed length). Therefore, the lattices used in cryptography are typically described by objects reduced modulo an integer q (so-called q-ary lattices), so that all keys and ciphertexts fit in a finite space.
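
The integer-combination definition above can be illustrated with a small sketch; the 2-D basis here is an illustrative choice, not from the article:

```python
import itertools
import numpy as np

# A toy 2-D lattice: all integer linear combinations of the basis vectors.
B = np.array([[2, 0],
              [1, 2]])  # rows are basis vectors

def lattice_points(basis, coeff_range):
    """Enumerate lattice points with coefficients in [-coeff_range, coeff_range]."""
    pts = set()
    for c in itertools.product(range(-coeff_range, coeff_range + 1), repeat=2):
        pts.add(tuple(np.array(c) @ basis))
    return pts

def in_lattice(basis, v):
    """v is in the lattice iff the coordinates of v in the basis are integers."""
    c = np.linalg.solve(basis.T, v)
    return bool(np.allclose(c, np.round(c)))

# A second basis for the SAME lattice, obtained by a unimodular transform
# (row0' = row0 + row1, row1' = row1) -- illustrating that bases are not unique:
B2 = np.array([[3, 2],
               [1, 2]])
```

Enumerating points from either basis produces the same lattice, which is exactly the "infinitely many different bases" phenomenon described above.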

Cryptographic systems are built from mathematical problems that are hard to solve unless one has extra information. Two central lattice problems are the Shortest Vector Problem (SVP): given a basis of a lattice, find a shortest nonzero vector in the lattice; and the Closest Vector Problem (CVP): given a basis of a lattice and a target vector not necessarily in the lattice, find the lattice vector closest to the target. These problems are believed to be hard to solve in general, but become easy given a sufficiently good basis.
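
In very small dimensions SVP can be solved by brute force, scanning a box of coefficient vectors; the sketch below (with an arbitrarily chosen skewed basis) does exactly that. The approach is hopeless in the high dimensions used in cryptography:

```python
import itertools
import numpy as np

# A deliberately "bad" (long, skewed) basis of a 2-D lattice, chosen for
# illustration.  Its shortest nonzero vector is far shorter than either
# basis vector.
B = np.array([[201, 37],
              [1648, 297]])

def shortest_vector(basis, bound=50):
    """Brute-force SVP: try every coefficient vector in a small box."""
    best, best_norm = None, float("inf")
    for c in itertools.product(range(-bound, bound + 1), repeat=2):
        if c == (0, 0):
            continue  # the zero vector does not count
        v = np.array(c) @ basis
        n = float(v @ v)
        if n < best_norm:
            best, best_norm = v, n
    return best
```

Note that the shortest vector found this way has squared length 1025, even though both given basis vectors have squared length in the tens of thousands; this gap between a bad basis and the true shortest vector is what makes the problem interesting.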

Lattice basis reduction transforms a given integer lattice basis into a basis for the same lattice consisting of short, nearly orthogonal vectors. If one can compute such a basis, the CVP and SVP problems become easy to solve, at least approximately. A well-known polynomial-time basis reduction algorithm is the LLL algorithm.
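
One simple way a good basis makes CVP easy is Babai's round-off method: express the target in basis coordinates and round each coordinate to the nearest integer. The sketch below uses two illustrative bases of the same lattice (the second is a unimodular transform of the first); round-off succeeds with the short, nearly orthogonal basis and fails badly with the long, skewed one:

```python
import numpy as np

# Two bases of the SAME 2-D lattice (bad = U @ good for a unimodular U).
# Both are illustrative choices, not from the article.
good = np.array([[40.0, 1.0],
                 [-1.0, -32.0]])     # short, nearly orthogonal
bad = np.array([[201.0, 37.0],
                [1648.0, 297.0]])    # long, skewed

def babai_round_off(basis, target):
    """Babai's round-off: snap the target's basis coordinates to integers."""
    coeffs = np.linalg.solve(basis.T, target)   # target in basis coordinates
    return np.round(coeffs) @ basis             # back to a lattice point

target = np.array([130.0, -210.0])
close = babai_round_off(good, target)   # a genuinely close lattice vector
far = babai_round_off(bad, target)      # a lattice vector, but far away
```

Both outputs are lattice points, but only the good basis yields one near the target; this asymmetry is the basis of several lattice-based encryption schemes.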

Lattice-based cryptosystems

On June 25, 2009, IBM announced that Craig Gentry had constructed the first fully homomorphic encryption scheme, which is based on lattices.[3][4][5] His scheme supports evaluation of circuits of arbitrary depth.

Encryption

Signature

Hash function

Security issues

Lattice-based cryptographic constructions hold great promise for post-quantum cryptography.[8] Many of them are quite efficient, and some even compete with the best known alternatives; they are typically quite simple to implement; and all are believed to be secure against attacks by both conventional and quantum computers.

In terms of security, lattice-based cryptographic constructions can be divided into two types. The first includes practical proposals, which are typically very efficient, but often lack a supporting proof of security. The second type admits strong provable security guarantees based on the worst-case hardness of lattice problems, but only a few of them are sufficiently efficient to be used in practice.

The NTRU family of algorithms is efficient but lacks a strong security guarantee. Basic Learning with Errors (LWE) designs have strong security guarantees but require impractically large key sizes. Learning with Errors over rings (Ring Learning with Errors, or Ring-LWE) offers very efficient computation, moderately sized keys, and a strong proof of security.
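
A toy single-bit encryption scheme in the style of Regev's LWE cryptosystem illustrates the Learning with Errors idea; the parameters below are illustrative and far too small for any real security:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-bit LWE encryption (Regev-style sketch).
q, n, m = 2039, 8, 20          # modulus, secret dimension, number of samples

def keygen():
    s = rng.integers(0, q, n)              # secret key
    A = rng.integers(0, q, (m, n))         # public random matrix
    e = rng.integers(-1, 2, m)             # small error, entries in {-1, 0, 1}
    b = (A @ s + e) % q                    # noisy inner products hide s
    return s, (A, b)

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, m)              # random 0/1 combination of samples
    c1 = (r @ A) % q
    c2 = (r @ b + bit * (q // 2)) % q      # embed the bit at scale q/2
    return c1, c2

def decrypt(s, ct):
    c1, c2 = ct
    d = (c2 - c1 @ s) % q                  # = bit*(q//2) + r.e  (mod q)
    return int(q // 4 < d < 3 * q // 4)    # closer to q/2 means the bit was 1
```

Decryption works because c2 - c1.s equals bit*(q//2) plus the accumulated error r.e (mod q), and with these parameters |r.e| is at most 20, far below q/4, so the bit is always recovered. The "impractically large keys" of basic LWE come from the m-by-n matrix A; Ring-LWE replaces it with a structured object that is much more compact.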

Worst-case hardness

Worst-case hardness of lattice problems means that breaking the cryptographic construction (even with some small non-negligible probability) is provably at least as hard as solving several lattice problems (approximately, within polynomial factors) in the worst case. In other words, breaking the cryptographic construction implies an efficient algorithm for solving any instance of some underlying lattice problem. In most cases, the underlying problem is that of approximating lattice problems such as SVP to within polynomial factors, which is conjectured to be a hard problem. Such a strong security guarantee is one of the distinguishing features of lattice-based cryptography.

The importance of the worst-case security guarantee is twofold. First, it assures us that attacks on the cryptographic construction are likely to be effective only for small choices of parameters and not asymptotically. In other words, it assures us that there are no fundamental flaws in the design of our cryptographic construction. In fact, in some cases, the worst-case security guarantee can even guide us in making design decisions. Second, in principle the worst-case security guarantee can help us in choosing concrete parameters for the cryptosystem, although in practice this leads to what seems like overly conservative estimates and one often sets the parameters based on the best known attacks.

Quantum algorithms and lattices

Lattice problems are typically quite hard. The best known algorithms either run in exponential time, or provide quite bad approximation ratios. The field of lattice-based cryptography has been developed based on the assumption that lattice problems are hard. There are currently no known quantum algorithms for solving lattice problems that perform significantly better than the best known classical (i.e., non-quantum) algorithms. This is despite the fact that lattice problems seem like a natural candidate to attempt to solve using quantum algorithms: because they are believed not to be NP-hard for typical approximation factors, because of their periodic structure, and because the Fourier transform, which is usually exploited in quantum algorithms, is tightly related to the notion of lattice duality.

Attempts to solve general lattice problems with quantum algorithms have been made since Shor's discovery of the quantum factoring algorithm in the mid-1990s, but have so far met with little success.

References

  1. Gentry, Craig (2009). "Fully homomorphic encryption using ideal lattices". Symposium on the Theory of Computing. Retrieved 18 December 2014.
  2. Ajtai, Miklós (1996). "Generating Hard Instances of Lattice Problems". Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing. pp. 99–108. doi:10.1145/237814.237838. ISBN 0-89791-785-5.
  3. Craig Gentry. Fully Homomorphic Encryption Using Ideal Lattices. In the 41st ACM Symposium on Theory of Computing (STOC), 2009.
  4. IBM press release, 25 June 2009. http://www-03.ibm.com/press/us/en/pressrelease/27840.wss
  5. Michael Cooney (2009-06-25). "IBM touts encryption innovation". Computer World. Retrieved 2009-07-14.
  6. Peikert, Chris (2014). "Lattice Cryptography for the Internet". IACR. Archived from the original (PDF) on 26 November 2016. Retrieved 10 May 2014.
  7. Güneysu, Tim; Lyubashevsky, Vadim (2012). "Practical Lattice-Based Cryptography: A Signature Scheme for Embedded Systems" (PDF). IACR. Retrieved 26 November 2016.
  8. Micciancio, D.; Regev, O. (2009). "Lattice-based cryptography" (PDF).

This article is issued from Wikipedia (version of 26 November 2016). The text is available under the Creative Commons Attribution/Share-Alike license, but additional terms may apply for the media files.