The theory of neural networks: Learning from examples, time-series and cryptography

Research output: Contribution to journal › Article › peer-review


Abstract

In the first part of my talk, the main findings in the theory of neural networks over the last two decades are summarized, including storage capacity, learning from examples, and time-series generation by feedforward networks. In the second part, a new bridge between the theory of neural networks and cryptography is presented. A new phenomenon, the synchronization of neural networks, leads to an original method for exchanging secret messages. Numerical simulations as well as analytical results show that two artificial networks, trained by the Hebbian learning rule on their mutual outputs, develop a parallel state of their synaptic weights. The synchronized weights (integer values between ±L) are used to construct an ephemeral key-exchange protocol for the secure transmission of secret data. We show that the synchronization time increases with L² while the probability of finding a successful attacker decreases exponentially with L. Hence, for large L we obtain a secure key-exchange protocol that depends neither on number theory nor on injective trapdoor functions used in conventional cryptography.
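The synchronization described in the abstract can be illustrated with a minimal sketch of two tree parity machines trained by the Hebbian rule on their mutual outputs. This is not the authors' implementation; the network sizes K, N, L below and the helper names are illustrative assumptions.

```python
import random

# Illustrative sizes (assumptions, not from the paper):
# K hidden units, N inputs per unit, weights bounded in [-L, L]
K, N, L = 3, 10, 4

def new_weights():
    # Random integer weights in [-L, L] for each hidden unit
    return [[random.randint(-L, L) for _ in range(N)] for _ in range(K)]

def output(w, x):
    # Hidden unit k outputs sigma_k = sign(w_k . x_k) (sign(0) taken as -1);
    # the network output tau is the product of the hidden outputs
    sigmas = []
    for k in range(K):
        s = sum(wi * xi for wi, xi in zip(w[k], x[k]))
        sigmas.append(1 if s > 0 else -1)
    tau = 1
    for s in sigmas:
        tau *= s
    return tau, sigmas

def hebbian_update(w, x, tau, sigmas):
    # Hebbian rule: only hidden units that agree with the common output move,
    # and each weight is clipped back into [-L, L]
    for k in range(K):
        if sigmas[k] == tau:
            for i in range(N):
                w[k][i] = max(-L, min(L, w[k][i] + tau * x[k][i]))

def synchronize(max_steps=100000):
    random.seed(0)
    a, b = new_weights(), new_weights()
    for step in range(max_steps):
        # Both parties see the same public random input
        x = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(K)]
        ta, sa = output(a, x)
        tb, sb = output(b, x)
        if ta == tb:  # train only when the exchanged outputs agree
            hebbian_update(a, x, ta, sa)
            hebbian_update(b, x, tb, sb)
        if a == b:
            return step, a  # identical weights serve as the shared key
    return None, None

steps, key = synchronize()
```

After synchronization the two weight matrices are identical and can seed a session key, matching the abstract's claim that the parallel state of the synaptic weights yields an ephemeral key-exchange protocol.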
