🌀 Make your mark and show the world what you can achieve with the Santa Fe Institute’s Complexity Challenges
https://www.complexityexplorer.org/challenges
What are the Complexity Challenges?
Correlation CAN Imply Causation! | Statistics Misconceptions
Watch the income distribution in America change
Complex Systems Studies
🔅Norbert Blum claims to have proved P ≠ NP. For some very non-expert comments on this, visit: https://johncarlosbaez.wordpress.com/2017/08/15/norbert-blum-on-p-versus-np/amp/
❌ Razborov says the proof is wrong!
https://cstheory.stackexchange.com/questions/38803/is-norbert-blums-2017-proof-that-p-ne-np-correct/38832#comment88852_38832
Theoretical Computer Science Stack Exchange
Is Norbert Blum's 2017 proof that $P \ne NP$ correct?
Norbert Blum recently posted a 38-page proof that $P \ne NP$. Is it correct?
Also on topic: where else (on the internet) is its correctness being discussed?
Note: the focus of this question text ...
⭕️Paper: History effects on network growth
https://arxiv.org/pdf/1505.06450.pdf
🌀Hadiseh Safdari, Milad Zare Kamali, Amir Hossein Shirazi, Moein Khaliqi and Gholamreza Jafari
⭕️Abstract: The growth dynamics of real networks, with their emergent complexity, remain an open and interesting question. It is not realistic to ignore the impact of history on current events; part of that complexity may lie precisely in the role of history. To account for this, the average effect of history is included via a kernel function in the differential equation of the Barabási–Albert (BA) model. This approach leads to a fractional-order BA differential equation that generalizes the BA model. In contrast to the unbounded growth of node degrees in the standard model, our results show that over time the memory effect causes degrees to decay, giving younger members a higher chance of becoming hubs. In fact, in a real network there are two competing processes. On one hand, by the preferential-attachment mechanism, nodes with higher degree are more likely to attract links. On the other hand, a node's history, through an aging process, suppresses new connections. Our findings, from simulating a network grown with these effects and from studying a real collaboration network of Hollywood movie actors, confirm the significant effects of history and time on the dynamics.
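The competition the abstract describes, preferential attachment versus aging, can be sketched with a simple growth simulation. Note this is my own illustrative sketch: the exponential aging kernel, the parameter `tau`, and the function names below are assumptions, not the paper's fractional-order formulation.

```python
import math
import random

def grow_network(n_nodes, m=2, tau=20.0, seed=0):
    """Grow a network where node t attaches to m earlier nodes with
    probability proportional to degree[i] * exp(-(t - birth[i]) / tau).

    The exponential kernel is an illustrative stand-in for a memory effect:
    old nodes lose attractiveness even if their degree is high.
    """
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}        # start from a single edge 0-1
    birth = {0: 0, 1: 0}
    edges = [(0, 1)]
    for t in range(2, n_nodes):
        # Age-damped preferential-attachment weights for existing nodes.
        weights = {i: degree[i] * math.exp(-(t - birth[i]) / tau)
                   for i in degree}
        total = sum(weights.values())
        nodes = list(weights)
        targets = set()
        while len(targets) < min(m, len(nodes)):
            r = rng.random() * total
            acc = 0.0
            for i in nodes:
                acc += weights[i]
                if acc >= r:
                    targets.add(i)
                    break
        degree[t] = 0
        birth[t] = t
        for i in targets:
            edges.append((t, i))
            degree[t] += 1
            degree[i] += 1
    return degree, edges

degree, edges = grow_network(200)
# With strong aging (small tau), early nodes stop absorbing links and
# hubs emerge among younger members, qualitatively matching the abstract.
```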
🎞 Homotopy and Bifurcation:
This lecture summarizes what students have learned about linear algebra and systems of nonlinear equations:
https://ocw.mit.edu/courses/chemical-engineering/10-34-numerical-methods-applied-to-chemical-engineering-fall-2015/class-videos/session-9-homotopy-and-bifurcation/#vid_related
🌀 What is Emergence?
http://emergence.ucdavis.edu/emergence.html
David Pines, Distinguished Professor of Physics, UC Davis
and Chief Evangelist, ICAM
Workshop "The Complexities of Nature", Part 3
#Defense_Session for an M.Sc. thesis
Alireza Saeedi
1396/05/30 (Solar Hijri; 21 August 2017), 10:00
Classroom 3, Department of Physics, Shahid Beheshti University
#Defense_Session for an M.Sc. thesis
Mostafa Jannesari
1396/05/30 (Solar Hijri; 21 August 2017), 14:00
Classroom 3, Department of Physics, Shahid Beheshti University
🖥 NetworkX 2.0 released - #python package for the creation, manipulation & analysis of networks:
http://networkx.readthedocs.io/en/latest/release/release_dev.html
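A quick taste of the library: in the 2.0 release, degree and edge queries return view objects rather than lists, so wrapping them in `dict()` or `list()` is the idiomatic way to materialize them. The graph parameters below are arbitrary example values.

```python
import networkx as nx

# Generate a Barabási-Albert graph: 100 nodes, each new node adds 2 edges.
G = nx.barabasi_albert_graph(100, 2, seed=42)

# NetworkX 2.0: G.degree() is a DegreeView; convert to a dict to inspect it.
degrees = dict(G.degree())
hub_degree = max(degrees.values())

print(G.number_of_nodes(), G.number_of_edges())
print(hub_degree)
print(nx.number_connected_components(G))   # BA graphs are connected: 1
print(nx.average_clustering(G))
```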
🎞 Relative Entropy:
http://videolectures.net/nips09_verdu_re/
An overview of relative entropy (a.k.a. Kullback-Leibler divergence) and its multiple appearances in information theory, probability, and statistics, including recent results by the speaker.
Download slides: http://videolectures.net/site/normal_dl/tag=62842/nips09_verdu_re_01.pdf
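For discrete distributions, relative entropy is D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal numeric illustration (my own sketch, not taken from the talk; the example distributions are arbitrary):

```python
import math

def kl_divergence(p, q):
    """D(p || q) in nats for discrete distributions given as sequences.

    Terms with p_i = 0 contribute 0; q_i must be > 0 wherever p_i > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

d_pq = kl_divergence(p, q)
print(d_pq)                    # positive: the two distributions differ
print(kl_divergence(p, p))     # 0.0: zero iff the distributions coincide
print(kl_divergence(q, p))     # note: D is asymmetric, D(q||p) != D(p||q)
```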
🎞 Bayesian Inference
Peter Green, Department of Mathematics, University of Bristol
🔗 http://videolectures.net/mlss2011_green_bayesian/?q=Inference
Inference is the process of discovering from data about mechanisms that may have caused or generated that data, or at least explain it. The goals are varied - perhaps simply predicting future data, or more ambitiously drawing conclusions about scientific or societal truths. In the language of applied mathematics, these are inverse problems. Bayesian inference is about using probability to do all this. One of its strengths is that all sources of uncertainty in a problem can be simultaneously and coherently considered. It is model-based (in the language of machine learning, these are generative models), and we can use Bayesian methods to choose and criticize the models we use.
Download slides: http://videolectures.net/site/normal_dl/tag=626012/mlss2011_green_bayesian_01.pdf
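As a toy illustration of the generative, all-uncertainty-in-one-place workflow described above, here is a conjugate Beta-Binomial update for a coin's heads probability. The prior, data, and helper name are my own example values, not from the lecture.

```python
# Bayesian inference with a conjugate pair: if the prior over the heads
# probability is Beta(a, b) and we observe k heads in n flips, the
# posterior is Beta(a + k, b + n - k) -- no numerical integration needed.

def posterior_beta(a, b, k, n):
    return a + k, b + (n - k)

a, b = 1.0, 1.0          # uniform prior: no initial preference
k, n = 7, 10             # observed data: 7 heads in 10 flips

a_post, b_post = posterior_beta(a, b, k, n)
mean = a_post / (a_post + b_post)    # posterior mean estimate

print((a_post, b_post))   # (8.0, 4.0)
print(round(mean, 3))     # 0.667
```

The same update applied sequentially, one flip at a time, gives the identical posterior, which is one way coherence shows up in practice.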
casi.pdf
8.1 MB
Computer Age Statistical Inference
Algorithms, Evidence, and Data Science
Bradley Efron and Trevor Hastie
Stanford University
🎞 Interesting talks from PyData London 2017
https://medium.springboard.com/interesting-talks-from-pydata-london-2017-d17b06c1ed5e
🔖 Deep Learning the Ising Model Near Criticality
Alan Morningstar, Roger G. Melko
🔗 https://arxiv.org/pdf/1708.04622
📌 ABSTRACT
It is well established that neural networks with deep architectures perform better than shallow networks for many tasks in machine learning. In statistical physics, while there has been recent interest in representing physical data with generative modelling, the focus has been on shallow neural networks. A natural question to ask is whether deep neural networks hold any advantage over shallow networks in representing such data. We investigate this question by using unsupervised, generative graphical models to learn the probability distribution of a two-dimensional Ising system. Deep Boltzmann machines, deep belief networks, and deep restricted Boltzmann networks are trained on thermal spin configurations from this system, and compared to the shallow architecture of the restricted Boltzmann machine. We benchmark the models, focussing on the accuracy of generating energetic observables near the phase transition, where these quantities are most difficult to approximate. Interestingly, after training the generative networks, we observe that the accuracy essentially depends only on the number of neurons in the first hidden layer of the network, and not on other model details such as network depth or model type. This is evidence that shallow networks are more efficient than deep networks at representing physical probability distributions associated with Ising systems near criticality.