FIELD | AI and Natural Sciences |
---|---|
DATE | September 01 (Wed), 2021 |
TIME | 14:00-16:00 |
PLACE | 7323 |
SPEAKER | Jung Hoon Han (한정훈) |
HOST | Hyeon, Changbong |
INSTITUTE | Sungkyunkwan University |
TITLE | Fluctuation-dissipation Type Theorem in Stochastic Linear Learning |
ABSTRACT | The fluctuation-dissipation theorem (FDT) is a simple yet powerful consequence of the first-order differential equation governing the dynamics of systems subject simultaneously to dissipative and stochastic forces. Linear learning dynamics, in which the input vector is mapped to the output vector by a matrix whose elements are the objects of learning, has a stochastic version that closely mimics Langevin dynamics when the full-batch gradient descent scheme is replaced by stochastic gradient descent. We derive a generalized FDT for the stochastic linear learning dynamics and verify its validity on well-known machine learning data sets such as MNIST, CIFAR-10, and EMNIST. |
FILE |
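
The abstract treats a linear model trained by stochastic gradient descent as a Langevin-type process whose weight fluctuations obey an FDT-like relation. The sketch below is an illustrative stand-in, not the speaker's derivation: it trains a linear map with mini-batch SGD on a small synthetic Gaussian regression problem (an assumption made to keep the example self-contained, in place of MNIST/CIFAR-10/EMNIST) and checks a standard fluctuation-dissipation-type relation for SGD near a quadratic minimum, H Σ + Σ H ≈ η C, where Σ is the steady-state weight covariance, C the mini-batch gradient-noise covariance, H the loss Hessian, and η the learning rate. Whether this coincides with the generalized FDT presented in the talk is not claimed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Gaussian regression data standing in for the image data sets
# named in the abstract (assumption: a small toy problem keeps the sketch
# self-contained and runnable without downloads).
n, d_in, d_out = 2000, 8, 3
X = rng.normal(size=(n, d_in))
W_true = rng.normal(size=(d_out, d_in))
Y = X @ W_true.T + 0.1 * rng.normal(size=(n, d_out))

eta, batch = 0.05, 32            # learning rate and mini-batch size (assumed)
W = np.zeros((d_out, d_in))      # linear map being learned, y ~ W x

def grad(Wm, idx):
    """Gradient of the mean-squared error 0.5*<|y - W x|^2> over rows idx."""
    Xb, Yb = X[idx], Y[idx]
    return ((Xb.T @ (Xb @ Wm.T - Yb)) / len(idx)).T

# Burn-in: let SGD relax to its noisy steady state around the loss minimum.
for _ in range(5000):
    W -= eta * grad(W, rng.integers(0, n, batch))

# Sample the steady state: record weights and the mini-batch gradient noise.
weights, noise = [], []
for _ in range(20000):
    idx = rng.integers(0, n, batch)
    g_mb, g_full = grad(W, idx), grad(W, np.arange(n))
    weights.append(W.ravel().copy())
    noise.append((g_mb - g_full).ravel())
    W -= eta * g_mb

Sigma = np.cov(np.array(weights), rowvar=False)   # weight fluctuations
C = np.cov(np.array(noise), rowvar=False)         # gradient-noise covariance
H = np.kron(np.eye(d_out), X.T @ X / n)           # Hessian of the quadratic loss,
                                                  # in the row-major W.ravel() ordering

# Fluctuation-dissipation-type check: H @ Sigma + Sigma @ H ~ eta * C
# (expected to hold only up to O(eta) corrections and sampling error).
lhs, rhs = H @ Sigma + Sigma @ H, eta * C
print("relative deviation:", np.linalg.norm(lhs - rhs) / np.linalg.norm(rhs))
```

The same steady-state bookkeeping would apply if the synthetic data were replaced by flattened MNIST, CIFAR-10, or EMNIST images; only the dimensions, the loss, and possibly the precise form of the relation being tested would change.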