Statistical modeling of the internal states of a learning machine

Presented at the STATPHYS28 Satellite Meeting – Emergence in Biological Networks

Abstract With the increasing integration of artificial intelligence into everyday life, understanding and accounting for the solutions that machines provide are ever more important. This parallels the study of animal brains and biological neural networks. Instead of tracing the microscopic details of the neural networks that produce a specific functionality, we aim for a macroscopic understanding of the evolution and transitions of the collective states of the neural system as it learns to fulfill the required functions. To this end, we apply a statistical modeling approach to the internal states of a pattern-recognizing neural network and characterize the training process through the evolution of the thermodynamic properties of the model. Across the ensemble of machines we considered, we find an overall trend toward a critical state of the mapped model over the course of training. The entropy of the system, estimated from the model, varies non-monotonically: it reaches a minimum before the network is optimally trained and plateaus afterwards. Using relevance heat maps obtained with deep Taylor decomposition, which attributes the network's output decisions to the relevant pixels of the input, we also find that the contrast, or sharpness, of the heat maps reaches a maximum over the course of training. These findings support the existence of distinct stages in the learning process and can improve our understanding of machine learning in deep neural networks.
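The abstract does not spell out how the heat-map contrast or the state entropy is computed. As a minimal sketch under stated assumptions, one might take the contrast to be the standard deviation of a relevance map after normalizing its total relevance to one, and estimate the entropy of binarized hidden-unit states with an independent-unit (factorized) approximation rather than the full statistical model used in the talk. The function names and the exact measures below are illustrative choices, not the authors' definitions.

```python
import numpy as np

def heatmap_contrast(relevance):
    """Contrast of a relevance heat map, taken here (as an assumption)
    to be the standard deviation of the map after normalizing its
    total relevance to 1. Sharply peaked maps score higher."""
    r = np.asarray(relevance, dtype=float)
    r = r / r.sum()  # normalize total relevance to 1
    return r.std()

def independent_entropy(states):
    """Plug-in entropy (in bits) of binary hidden states, assuming
    independent units: sum of per-unit binary entropies. This is an
    upper bound on the true entropy, ignoring correlations.
    states: (n_samples, n_units) array of 0/1 activations."""
    p = np.asarray(states, dtype=float).mean(axis=0)  # per-unit firing rate
    p = np.clip(p, 1e-12, 1 - 1e-12)                  # avoid log(0)
    return float(-(p * np.log2(p) + (1 - p) * np.log2(1 - p)).sum())

# Toy check: a peaked map has higher contrast than a flat one.
flat = np.ones((8, 8))
peaked = np.zeros((8, 8))
peaked[4, 4] = 1.0
assert heatmap_contrast(peaked) > heatmap_contrast(flat)
```

Tracking these two scalars over training epochs would reproduce, in spirit, the non-monotonic entropy curve and the contrast maximum described in the abstract.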

Online slides
