Power-law scaling to assist with key challenges in artificial intelligence

Yuval Meir, Shira Sardi, Shiri Hodassman, Karin Kisos, Itamar Ben-Noam, Amir Goldental, Ido Kanter

Research output: Contribution to journal › Article › peer-review

8 Scopus citations


Power-law scaling, a central concept in critical phenomena, is found to be useful in deep learning, where optimized test errors on handwritten digit examples converge as a power law to zero with database size. For rapid decision making with a single training epoch, in which each example is presented only once to the trained network, the power-law exponent increased with the number of hidden layers. For the largest dataset, the obtained test error was estimated to be close to that of state-of-the-art algorithms trained for large epoch numbers. Power-law scaling assists with key challenges found in current artificial intelligence applications and facilitates an a priori estimation of the dataset size required to achieve a desired test accuracy. It establishes a benchmark for measuring training complexity and a quantitative hierarchy of machine learning tasks and algorithms.
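The a priori dataset-size estimation described above can be sketched as a simple power-law fit: given test errors measured at several dataset sizes, fit error = a · N^(−b) in log–log space and invert it for a target error. The error values below are synthetic, chosen purely for illustration, and the exponent is an assumption, not a figure from the paper.

```python
import numpy as np

# Hypothetical test errors measured at increasing dataset sizes.
# Synthetic values generated from error = a * N^(-b) with a = 1.0,
# b = 0.35 -- illustrative only, not data from the paper.
sizes = np.array([1_000, 5_000, 10_000, 50_000], dtype=float)
errors = 1.0 * sizes ** -0.35

# Fit log(error) = log(a) - b * log(N) by linear least squares.
slope, intercept = np.polyfit(np.log(sizes), np.log(errors), 1)
a, b = np.exp(intercept), -slope

# A priori estimate: smallest N whose predicted error meets the target.
target_error = 0.01
required_n = (a / target_error) ** (1.0 / b)

print(f"fitted exponent b = {b:.3f}, estimated required N = {required_n:,.0f}")
```

Because the fit is linear in log–log coordinates, a straight line on a log–log plot of error versus dataset size is itself a quick visual check that the power-law regime has been reached before extrapolating.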

Original language: English
Article number: 19628
Journal: Scientific Reports
Issue number: 1
State: Published - Dec 2020

Bibliographical note

Publisher Copyright:
© 2020, The Author(s).

