Statistical physics and representations in real and artificial neural networks


  • File type: Book
  • Language: English
  • Publisher: Elsevier
  • Publication year / country: 2018

Details

Related fields: Information Technology Engineering, Physics
Related specializations: Computer Networks, Applied Physics
Journal: Physica A
Institution: Laboratoire de Physique Statistique – Sorbonne Universités UPMC – France
DOI: https://doi.org/10.1016/j.physa.2017.11.153
Published in an Elsevier journal

Description

1 Introduction

In the early 1980s, statistical physicists showed that ideas originating in their field could lead to substantial advances in other disciplines. Simulated Annealing, a versatile optimization procedure in which a fictitious sampling temperature is decreased until the minimum (ground state) of a cost function is reached, had a major impact in applied computer science and engineering [1]. Attractor neural network models for memories [2], soon solved analytically with spin-glass techniques [3], emerged as a major conceptual tool in computational neuroscience. From a theoretical point of view, it rapidly became clear that statistical physics offered a powerful framework for problems outside physics, in particular in computer science and theoretical neuroscience, involving many random, heterogeneous, strongly interacting components, which had so far remained very hard to tackle.

The purpose of the present document is to present some applications of statistical physics ideas and tools to the understanding of high-dimensional representations in neural networks. How the brain represents and processes information coming from the outside world is a central issue of computational neuroscience [4]. Experimental progress in electrophysiological and optical recording now makes it possible to record the activity of populations of tens to thousands of neural cells in behaving animals, opening the way to studying this question with unprecedented access to data and to asking new questions about brain operation on large scales [5]. Concomitantly, machine learning algorithms, largely based on artificial neural network architectures, have recently achieved spectacular performance in a variety of fields such as image processing and speech recognition/production [6]. How these machines produce efficient representations of the data and of their underlying distributions is a crucial question [7], far from being understood [8]. Profound similarities seem to emerge between the representations encountered in real and artificial neural networks [9] and between the questions raised in both contexts [10].

It is utterly hard to cover recent advances in such a diverse and lively field, and the task is impossible in two lectures of two hours each. The material gathered here merely reflects the interests and, presumably, the ignorance of the authors more than anything else. The present notes focus on two applications of statistical physics to the study of neural representations in the contexts of computational neuroscience and machine learning.

The first part is motivated by the representation of spaces, i.e. multiple environments, in hippocampal place-cell networks. An extension of Hopfield's attractor neural network to the case of finite-dimensional attractors is introduced, and its phase diagram and dynamical properties, such as diffusion within one attractor or transitions between distinct attractors, are analyzed. We also show that effective, functional Ising models fitted to hippocampal multi-electrode recordings (limited to date to a few tens of neurons) or to 'neural' data generated by spatially subsampling our model share common features with our abstract model, and can be used to decode and to track the evolution of spatial representations over time.
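
As a concrete illustration of the simulated annealing procedure described at the start of this introduction, here is a minimal Python sketch; the double-well cost function, Gaussian neighbor move, and geometric cooling schedule are illustrative choices of ours, not taken from the notes.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t_start=1.0, t_end=1e-3,
                        cooling=0.99, steps_per_t=100):
    """Minimize `cost` by Metropolis sampling while a fictitious
    temperature is slowly lowered toward the ground state."""
    x, e = x0, cost(x0)
    best_x, best_e = x, e
    t = t_start
    while t > t_end:
        for _ in range(steps_per_t):
            x_new = neighbor(x)
            e_new = cost(x_new)
            # Metropolis rule: always accept downhill moves, accept
            # uphill moves with probability exp(-dE / T).
            if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
                x, e = x_new, e_new
                if e < best_e:
                    best_x, best_e = x, e
        t *= cooling  # geometric cooling schedule (an arbitrary choice)
    return best_x, best_e

# Toy usage: find the lower minimum of an asymmetric double-well cost.
cost = lambda x: (x ** 2 - 1) ** 2 + 0.3 * x
neighbor = lambda x: x + random.gauss(0.0, 0.1)
print(simulated_annealing(cost, neighbor, x0=2.0))
```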
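
For readers unfamiliar with attractor networks, the following sketch implements the standard binary Hopfield model with Hebbian couplings and zero-temperature dynamics; it is not the finite-dimensional extension introduced in the notes, and the pattern count, network size, and cue noise are arbitrary toy values.

```python
import numpy as np

def hopfield_store(patterns):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

def hopfield_recall(J, state, sweeps=10):
    """Asynchronous zero-temperature dynamics: align each spin with its
    local field until the network settles into an attractor."""
    s = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Toy usage: store 3 random patterns over 100 neurons, then recall one
# from a cue with roughly 10% of the spins flipped.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))
J = hopfield_store(patterns)
cue = patterns[0] * rng.choice([1, -1], size=100, p=[0.9, 0.1])
recalled = hopfield_recall(J, cue)
print(np.mean(recalled == patterns[0]))  # fraction of spins matching the stored pattern
```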
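
Functional Ising models such as those mentioned above are commonly inferred by maximum-entropy techniques; the sketch below uses pseudo-likelihood maximization, one standard inference scheme, without any claim that it is the method used in the notes. The learning rate, epoch count, and toy data are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fit_ising(S, lr=0.05, epochs=500):
    """Fit fields h and symmetric couplings J of a pairwise Ising model
    P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) to +/-1 activity
    patterns S (samples x neurons) by pseudo-likelihood ascent."""
    m, n = S.shape
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(epochs):
        H = h + S @ J                          # local field at each site
        G = 2 * S * (1 - sigmoid(2 * S * H))   # gradient of log PL w.r.t. H
        h += lr * G.mean(axis=0)
        gJ = S.T @ G / m
        gJ = (gJ + gJ.T) / 2                   # keep J symmetric
        np.fill_diagonal(gJ, 0.0)              # no self-couplings
        J += lr * gJ
    return h, J

# Toy usage: independent biased spins; the fitted fields approach
# atanh(0.4) ~ 0.42 and the couplings stay near zero.
rng = np.random.default_rng(1)
S = rng.choice([-1.0, 1.0], size=(2000, 20), p=[0.3, 0.7])
h, J = fit_ising(S)
print(h.mean(), np.abs(J).max())
```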
In the second part, we move to representations of data by machine learning algorithms. Special emphasis is put on two aspects: low-dimensional representations achieved by principal component analysis, and compositional representations, produced by restricted Boltzmann machines combining multiple features inferred from data. In both cases, we show how statistical physics helps unveil the different properties of these representations and the role of essential control parameters.
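
A minimal sketch of principal component analysis as an eigendecomposition of the empirical covariance matrix; the toy dataset with two dominant latent directions is our own illustrative assumption.

```python
import numpy as np

def pca(data, k):
    """Project data onto the top-k eigenvectors of the empirical
    covariance matrix (the directions of largest variance)."""
    centered = data - data.mean(axis=0)
    cov = centered.T @ centered / (len(data) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues
    top = eigvecs[:, ::-1][:, :k]              # top-k, descending order
    return centered @ top, eigvals[::-1][:k]

# Toy usage: 500 samples in 20 dimensions with 2 dominant latent
# directions buried in isotropic noise.
rng = np.random.default_rng(2)
latent = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 20))
data = latent + 0.1 * rng.normal(size=(500, 20))
projection, top_eigvals = pca(data, k=2)
print(top_eigvals)  # two eigenvalues dominate the rest of the spectrum
```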
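
And a minimal sketch of training a binary restricted Boltzmann machine with one step of contrastive divergence (CD-1), a standard approximate learning rule; the compositional toy data and all hyperparameters are illustrative assumptions, and the notes analyze the representations such machines form rather than this particular training loop.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.05):
    """One contrastive-divergence (CD-1) update of a binary RBM with
    visible bias b, hidden bias c and weights W (visible x hidden)."""
    ph0 = sigmoid(v0 @ W + c)                    # hidden features given data
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                  # one Gibbs step back
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Move model statistics toward data statistics.
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# Toy usage: data composed by OR-ing a few sparse binary features, so
# hidden units can learn a compositional code.
nv, nh = 16, 8
features = (rng.random((4, nv)) < 0.3).astype(float)
codes = (rng.random((500, 4)) < 0.5).astype(float)
data = (codes @ features > 0).astype(float)
W, b, c = 0.01 * rng.normal(size=(nv, nh)), np.zeros(nv), np.zeros(nh)
for _ in range(200):
    cd1_step(data, W, b, c)
print(np.abs(W).max())  # weights grow as features are picked up
```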