  Training of Physical Neural Networks

Momeni, A., Rahmani, B., Scellier, B., Wright, L. G., McMahon, P. L., Wanjura, C. C., et al. (2024). Training of Physical Neural Networks. arXiv, 2406.03372.


Files
2406.03372.pdf (any fulltext), 3MB
Name: 2406.03372.pdf
Description: File downloaded from arXiv at 2024-06-11 10:01
OA status: Not specified
Visibility: Public
MIME type / checksum: application/pdf / [MD5]
Technical metadata:
Copyright date: -
Copyright info: -
Bildschirmfoto 2024-06-11 um 09.57.59.png (supplementary material), 94KB
Name: Bildschirmfoto 2024-06-11 um 09.57.59.png
Description: -
OA status: Not specified
Visibility: Public
MIME type / checksum: image/png / [MD5]
Technical metadata:
Copyright date: -
Copyright info: -
License: -


Creators
Momeni, Ali (1), Author
Rahmani, Babak (1), Author
Scellier, Benjamin (1), Author
Wright, Logan G. (1), Author
McMahon, Peter L. (1), Author
Wanjura, Clara C. (2), Author
Li, Yuhang (1), Author
Skalli, Anas (1), Author
Berloff, Natalia G. (1), Author
Onodera, Tatsuhiro (1), Author
Oguz, Ilker (1), Author
Morichetti, Francesco (1), Author
del Hougne, Philipp (1), Author
Gallo, Manuel Le (1), Author
Sebastian, Abu (1), Author
Mirhoseini, Azalia (1), Author
Zhang, Cheng (1), Author
Marković, Danijela (1), Author
Brunner, Daniel (1), Author
Moser, Christophe (1), Author
Gigan, Sylvain (1), Author
Marquardt, Florian (2), Author
Ozcan, Aydogan (1), Author
Grollier, Julie (1), Author
Liu, Andrea J. (1), Author
Psaltis, Demetri (1), Author
Alù, Andrea (1), Author
Fleury, Romain (1), Author
Affiliations:
(1) External, ou_persistent22
(2) Marquardt Division, Max Planck Institute for the Science of Light, Max Planck Society, Staudtstraße 2, 91058 Erlangen, DE, ou_2421700

Content

Keywords: physics.app-ph, Computer Science, Learning, cs.LG
Abstract: Physical neural networks (PNNs) are a class of neural-like networks that leverage the properties of physical systems to perform computation. While PNNs are so far a niche research area with small-scale laboratory demonstrations, they are arguably one of the most underappreciated important opportunities in modern artificial intelligence (AI). Could we train AI models 1000x larger than current ones? Could we do this and also have them perform inference locally and privately on edge devices, such as smartphones or sensors?
Research over the past few years has shown that the answer to all these questions is likely "yes, with enough research": PNNs could one day radically change what is possible and practical for AI systems. Doing so, however, will require rethinking both how AI models work and how they are trained, primarily by considering the problems through the constraints of the underlying hardware physics. To train PNNs at large scale, many methods, including backpropagation-based and backpropagation-free approaches, are now being explored. These methods have various trade-offs, and so far no method has been shown to match the scale and performance of the backpropagation algorithm widely used in deep learning today. However, this is rapidly changing, and a diverse ecosystem of training techniques provides clues for how PNNs may one day be used both to create more efficient realizations of current-scale AI models and to enable models of unprecedented scale.
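To make the "backpropagation-free" idea concrete, here is a minimal illustrative sketch (not taken from the paper) of one such approach: simultaneous-perturbation stochastic approximation (SPSA). The physical system is treated as a black box whose loss can be measured but not differentiated through; `physical_system_loss` below is a hypothetical stand-in for a real hardware measurement.

```python
import numpy as np

rng = np.random.default_rng(0)

def physical_system_loss(params):
    # Stand-in for measuring a real physical system's loss; here a
    # simple quadratic with a known minimum at [1.0, -2.0, 0.5].
    target = np.array([1.0, -2.0, 0.5])
    return float(np.sum((params - target) ** 2))

def spsa_step(params, loss_fn, lr=0.1, eps=1e-2):
    # Perturb all parameters simultaneously with random signs, then
    # estimate the gradient from just two loss measurements.
    delta = rng.choice([-1.0, 1.0], size=params.shape)
    loss_plus = loss_fn(params + eps * delta)
    loss_minus = loss_fn(params - eps * delta)
    grad_est = (loss_plus - loss_minus) / (2 * eps) * delta
    return params - lr * grad_est

params = np.zeros(3)
for _ in range(500):
    params = spsa_step(params, physical_system_loss)

print(params)  # converges near [1.0, -2.0, 0.5]
```

The appeal for PNNs is that each update needs only two forward evaluations of the hardware, regardless of the number of parameters; the trade-off, as the abstract notes, is that such gradient estimates are noisy and have not yet matched backpropagation at large scale.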

Details

Language(s): -
Date: 2024-06-05
Publication status: Published online
Pages: 29 pages, 4 figures
Place, publisher, edition: -
Table of contents: -
Review method: -
Identifiers: arXiv: 2406.03372
Degree: -


Source 1

Title: arXiv
Source genre: Commentary
Creators: -
Affiliations: -
Place, publisher, edition: -
Pages: -
Volume / Issue: -
Article number: 2406.03372
Start / End page: -
Identifier: -