🌋http://www.quantamagazine.org/the-physics-of-glass-opens-a-window-into-biology-20180611/
In glassy systems, we think that many of these interesting properties occur because there’s what’s called a complex potential energy landscape. If you consider the total energy of the entire system as a function of where the atoms are, then in a glass, which is disordered, that landscape is incredibly complex.
It turns out that the neural networks used for deep learning and optimization share a surprisingly large number of properties with glasses. You can think of the nodes of the network as particles, and the connections between them as the bonds between particles. If you do, the neural networks and the glasses have complex potential energy landscapes with nearly identical properties. For example, questions about the energy barriers between states in a neural network are related to questions about how likely it is for a glassy material to flow. So the hope is that understanding some of the properties of glasses can help you understand optimization in these neural networks, too.
Quanta Magazine
The Physics of Glass Opens a Window Into Biology
The physicist Lisa Manning studies the dynamics of glassy materials to understand embryonic development and disease.
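The landscape picture in the quoted passage can be sketched numerically. The toy energy function below is my own illustration (not from the article): a shallow parabola plus fast oscillations produces many local minima, and gradient descent from different starting points gets trapped in different ones, the way a quenched glass settles into one of many metastable states.

```python
import numpy as np

# Hypothetical 1-D "energy landscape": a confining bowl plus rapid
# oscillations, loosely analogous to a glassy potential energy surface
# (or a rugged neural-network loss surface).
def energy(x):
    return 0.1 * x**2 + np.sin(3 * x)

def gradient_descent(x0, lr=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        grad = 0.2 * x + 3 * np.cos(3 * x)  # analytic derivative of energy
        x -= lr * grad
    return x

# Different initial conditions relax into different local minima.
minima = {round(gradient_descent(x0), 2) for x0 in np.linspace(-6, 6, 25)}
print(sorted(minima))
```

The multiple distinct values printed are the trapped states; in the glass analogy, the barriers between them control how easily the material flows.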
NEW M.Sc. Program in Natural Language Processing (NLP) and Data Science
Université de Lorraine, Nancy (France)
The Institute of Digital Science, Management and Cognition
is opening a new Master's Program in NLP:
Computer Sciences, Speech, Language and Knowledge Representation
**********
http://institut-sciences-digitales.fr/idmc-master-degree-in-natural-language-processing/
**********
So you want to be a specialist in Neural Networks, Logic, Speech
Processing, Information Retrieval, Knowledge Representation ... all
for Natural Language? Well, now's your chance!
Natural Language Processing (NLP) lies at the crossroads of
linguistics, computer science and artificial intelligence. This
Master's Program offers a modern curriculum which combines these
different approaches and covers both theoretical and applied
perspectives.
In each semester, the program includes a hands-on project.
It ends with a 6-month paid internship in a company or a research
lab. You can find the course description below.
----------------------------------------------------------------------------------------
You can apply to directly enter at either the first year or second year level.
----------------------------------------------------------------------------------------
Language
-------------
All courses are taught in English
(except the "French for non-native Speakers" class).
Fees
-------
The University of Lorraine is publicly funded and thus offers
tuition-free education for all students including students from
both inside and outside EU/EEA/EFTA countries. The only student
expenditure is a nominal semester fee of about 600 euros, which
includes health insurance. Nancy’s high quality of life goes
hand-in-hand with a low cost of living.
An interdisciplinary forum for complexity research:
PLOS Complexity Channel
https://t.co/Z6X8qNEuN2
Meet the editor here
https://t.co/slUfig9dUC https://t.co/0eiYrC40O9
🌀 “How to build your own Neural Network from scratch in Python” by James Loy
https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6
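In the spirit of the linked article, here is an independent minimal sketch (not the article's exact code) of a two-layer network trained with plain NumPy on an XOR-style toy dataset; the architecture, seed, and iteration count are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: XOR of the first two inputs (third column is a bias feature).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(3, 4))  # input -> hidden weights
W2 = rng.normal(size=(4, 1))  # hidden -> output weights

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # backward pass: gradient of the squared error via the chain rule
    d_out = 2 * (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out
    W1 -= X.T @ d_h

print(out.round(2))
```

The whole forward and backward pass fits in a dozen lines, which is the pedagogical point of doing it without a framework.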
🔖 Proposed curriculum for the M.Sc. track in “Statistical Physics and Complex Systems” at Shahid Beheshti University
http://facultymembers.sbu.ac.ir/jafari/fa/sbu-complex-systems/
A tweak to the infamous “critical brain” hypothesis accounts for the brain’s stability and adaptivity.
https://t.co/Hs8pYKxpt0 https://t.co/ehWuun9kep
🌀 Tehran school on Theory and Applications of Complex Networks
3-7 Shahrivar 1397 (25-29 August 2018)
🔗 More info:
http://facultymembers.sbu.ac.ir/jafari/events/
📄 Registration:
http://psi.ir/tacn2018_3.asp
Network visualization with R: updated tutorial from #polnet2018. Includes visualization basics, interactive and animated networks, temporal graphs and networks on geographic maps: https://t.co/Ro39rk0
Complex Systems Studies
https://iasbs.ac.ir/seminar/physics/condmat-meeting/m24/
24th Annual #IASBS Meeting on Condensed Matter Physics & School on Complex Systems. https://t.co/c23KC2IKx8
Twitter
Abbas Karimi
Matteo Marsili from #ICTP is giving a lecture on #Inference from a #Statistical_Physics viewpoint. 24th Annual #IASBS Meeting on Condensed Matter Physics & School on Complex Systems.
🔖 Thermodynamics of the Minimum Description Length on Community Detection
Juan Ignacio Perotti, Claudio Juan Tessone, Aaron Clauset, Guido Caldarelli
🔗 https://arxiv.org/pdf/1806.07005.pdf
📌 ABSTRACT
Modern statistical modeling is an important complement to the more traditional approach of physics where Complex Systems are studied by means of extremely simple idealized models. The Minimum Description Length (MDL) is a principled approach to statistical modeling combining Occam's razor with Information Theory for the selection of models providing the most concise descriptions. In this work, we introduce the Boltzmannian MDL (BMDL), a formalization of the principle of MDL with a parametric complexity conveniently formulated as the free-energy of an artificial thermodynamic system. In this way, we leverage the rich theoretical and technical background of statistical mechanics to show the crucial importance that phase transitions and other thermodynamic concepts have on the problem of statistical modeling from an information theoretic point of view. For example, we provide information theoretic justifications of why a high-temperature series expansion can be used to compute systematic approximations of the BMDL when the formalism is used to model data, and why statistically significant model selections can be identified with ordered phases when the BMDL is used to model models. To test the introduced formalism, we compute approximations of BMDL for the problem of community detection in complex networks, where we obtain a principled MDL derivation of the Girvan-Newman (GN) modularity and the Zhang-Moore (ZM) community detection method. Here, by means of analytical estimations and numerical experiments on synthetic and empirical networks, we find that BMDL-based correction terms of the GN modularity improve the quality of the detected communities and we also find an information theoretic justification of why the ZM criterion for estimation of the number of network communities is better than alternative approaches such as the bare minimization of a free energy.
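As a toy illustration of the MDL principle the abstract builds on (a crude two-part code in the spirit of a BIC-style approximation, not the paper's Boltzmannian MDL), the sketch below selects a polynomial degree by minimizing parameter cost plus residual-coding cost. The data-generating model and all constants are hypothetical.

```python
import numpy as np

# Hypothetical data: noisy samples of a degree-2 polynomial.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = 2 * x - 1.5 * x**2 + rng.normal(0, 0.05, size=x.size)

def description_length(degree):
    """Two-part code length in bits: model cost + data cost."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma2 = max(resid.var(), 1e-12)
    n = x.size
    data_bits = 0.5 * n * np.log2(2 * np.pi * np.e * sigma2)  # Gaussian code for residuals
    model_bits = 0.5 * (degree + 1) * np.log2(n)              # cost of the parameters
    return data_bits + model_bits

best = min(range(8), key=description_length)
print(best)
```

Richer models always shrink the residuals, but each extra parameter adds to the code length, so the total is minimized near the true complexity; Occam's razor falls out of the accounting rather than being imposed by hand.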