via Quanta
That may soon change. Boris Knyazev of the University of Guelph in Ontario and his colleagues have designed and trained a “hypernetwork” — a kind of overlord of other neural networks — that could speed up the training process. Given a new, untrained deep neural network designed for some task, the hypernetwork predicts the parameters for the new network in fractions of a second, and in theory could make training unnecessary. Because the hypernetwork learns the extremely complex patterns in the designs of deep neural networks, the work may also have deeper theoretical implications.
Source: https://blog.adafruit.com/2022/01/31/ai-that-builds-ai/
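The core idea can be sketched in a few lines. This is a minimal illustration, not Knyazev et al.'s actual GHN architecture: a "hypernetwork" is just a model whose output *is* the parameter vector of another ("target") network, so the target can be run without any training of its own. The embedding size and the linear hypernetwork here are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target network to be parameterized: y = W x + b
target_in, target_out = 4, 3
n_params = target_out * target_in + target_out

# Hypothetical task-embedding size; H plays the role of the
# hypernetwork's own (already trained) weights.
embed_dim = 8
H = rng.normal(size=(n_params, embed_dim))

def predict_parameters(task_embedding):
    """Hypernetwork forward pass: task embedding -> target-network parameters."""
    flat = H @ task_embedding
    W = flat[: target_out * target_in].reshape(target_out, target_in)
    b = flat[target_out * target_in:]
    return W, b

def target_forward(W, b, x):
    """Run the target network using the predicted (never-trained) parameters."""
    return W @ x + b

# Stand-in for a description of the new task/architecture.
task = rng.normal(size=embed_dim)
W, b = predict_parameters(task)
y = target_forward(W, b, rng.normal(size=target_in))
```

The real system conditions on a *graph* of the target architecture rather than a flat embedding, which is what lets it emit parameters for unseen network designs.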