Complex Systems Studies – Telegram
Complex Systems Studies
2.43K subscribers
1.55K photos
125 videos
116 files
4.54K links
What's up in Complexity Science?!
Check out here:

@ComplexSys

#complexity #complex_systems #networks #network_science

📨 Contact us: @carimi
#thesis_defense of an M.Sc. thesis

Mostafa Jannesari
1396/05/30 (Iranian calendar) - 14:00
Classroom 3, Department of Physics, Shahid Beheshti University
🖥 NetworkX 2.0 released - #python package for the creation, manipulation & analysis of networks:

http://networkx.readthedocs.io/en/latest/release/release_dev.html
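A minimal taste of what the package does (a toy sketch, not taken from the release notes; graph and node names are made up here):

```python
import networkx as nx

# Build a small undirected network: a triangle a-b-c with a pendant node d.
G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")])

print(G.number_of_nodes())            # 4
print(G.number_of_edges())            # 4
print(dict(G.degree()))               # {'a': 2, 'b': 2, 'c': 3, 'd': 1}
print(nx.shortest_path(G, "a", "d"))  # ['a', 'c', 'd']
```

Note that in the 2.0 API, `G.degree()` returns a view object rather than a dict, hence the `dict(...)` wrapper above.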
🎞 Bayesian Inference
Peter Green, Department of Mathematics, University of Bristol

🔗 http://videolectures.net/mlss2011_green_bayesian/?q=Inference

Inference is the process of learning from data about the mechanisms that may have caused or generated those data, or at least explain them. The goals are varied - perhaps simply predicting future data, or more ambitiously drawing conclusions about scientific or societal truths. In the language of applied mathematics, these are inverse problems. Bayesian inference is about using probability to do all this. One of its strengths is that all sources of uncertainty in a problem can be simultaneously and coherently considered. It is model-based (in the language of machine learning, these are generative models), and we can use Bayesian methods to choose and criticize the models we use.

Download slides: http://videolectures.net/site/normal_dl/tag=626012/mlss2011_green_bayesian_01.pdf
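The model-based view described above can be made concrete with the simplest conjugate example (our own sketch, not from the lecture): inferring a coin's bias from observed flips, where a Beta prior is updated coherently by the data.

```python
# Beta-Binomial conjugate update: posterior for a coin's bias theta.
# With prior Beta(a, b) and k heads observed in n flips, the posterior
# is Beta(a + k, b + n - k).

def posterior_params(a, b, k, n):
    """Return Beta posterior parameters after observing k heads in n flips."""
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Uniform prior Beta(1, 1); observe 7 heads in 10 flips.
a_post, b_post = posterior_params(1, 1, 7, 10)
print(a_post, b_post)             # 8 4
print(beta_mean(a_post, b_post))  # 0.666..., pulled toward the prior mean 0.5
```

The posterior mean (2/3) sits between the raw frequency (0.7) and the prior mean (0.5), illustrating how prior and data are combined rather than either being used alone.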
Computer Age Statistical Inference
Algorithms, Evidence, and Data Science
Bradley Efron & Trevor Hastie
Stanford University 👇👇👇
casi.pdf
8.1 MB
🔖 Deep Learning the Ising Model Near Criticality

Alan Morningstar, Roger G. Melko

🔗 https://arxiv.org/pdf/1708.04622

📌 ABSTRACT
It is well established that neural networks with deep architectures perform better than shallow networks for many tasks in machine learning. In statistical physics, while there has been recent interest in representing physical data with generative modelling, the focus has been on shallow neural networks. A natural question to ask is whether deep neural networks hold any advantage over shallow networks in representing such data. We investigate this question by using unsupervised, generative graphical models to learn the probability distribution of a two-dimensional Ising system. Deep Boltzmann machines, deep belief networks, and deep restricted Boltzmann networks are trained on thermal spin configurations from this system, and compared to the shallow architecture of the restricted Boltzmann machine. We benchmark the models, focussing on the accuracy of generating energetic observables near the phase transition, where these quantities are most difficult to approximate. Interestingly, after training the generative networks, we observe that the accuracy essentially depends only on the number of neurons in the first hidden layer of the network, and not on other model details such as network depth or model type. This is evidence that shallow networks are more efficient than deep networks at representing physical probability distributions associated with Ising systems near criticality.
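The "thermal spin configurations" used as training data in studies like this one can be generated with a standard Metropolis sampler for the 2D Ising model. A minimal sketch (unrelated to the authors' actual code; lattice size, temperature, and sweep count are arbitrary choices here):

```python
import math
import random

def metropolis_ising(L=8, T=2.27, sweeps=100, seed=0):
    """Sample one 2D Ising configuration (J = 1, periodic boundaries) at temperature T."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbours, with periodic wrap-around.
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn  # energy cost of flipping spin (i, j)
        # Metropolis rule: always accept downhill moves, uphill with prob exp(-dE/T).
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
    return spins

config = metropolis_ising()
print(sum(abs(s) for row in config for s in row))  # 64: an L x L grid of +/-1 spins
```

T = 2.27 is near the critical temperature of the 2D Ising model, the regime where (per the abstract) energetic observables are hardest for the generative networks to reproduce.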
🤔 "There is a fundamental error in separating the parts from the whole, the mistake of atomizing what should not be atomized. Unity and complementarity constitute reality."

Werner Heisenberg
🌀 #Turbulence occurs in a cascade: large eddies break down into smaller ones, which in turn split into even smaller ones, in a #fractal fashion.

Read more here:
🔗 http://www.nature.com/news/mysteries-of-turbulence-unravelled-1.22474
🌊 Lars Onsager: a cryptic genius

Theoretical physicist and chemist Lars Onsager (1903–76) was the type of scientist whom — so it was said — even geniuses such as Richard Feynman found intimidating to talk to. The Norwegian-born polymath “would announce his results by little, short, gnomic utterances”, says theoretical physicist Gregory Eyink. “And he was always right.”

In one of those terse revelations, Onsager announced in 1949 the surprising idea that turbulent fluids dissipate energy even in the absence of viscosity. That idea has now been proven mathematically.

In some cases, researchers have made sense of what Onsager said only in hindsight. In the 1990s, Eyink, who is now at Johns Hopkins University in Baltimore, Maryland, became the first person to take a major step towards validating Onsager’s argument on energy dissipation, only to discover later that Onsager himself had already made a start on that proof, scrawled in cryptic form in unpublished notebooks. Onsager had not bothered to publish this or much else on turbulence, in part because he was busy with other things — including work that led to him receiving the Nobel Prize in Chemistry in 1968 — but also because of the cold reception that others initially gave to his ideas.

“I find his letter somewhat ‘screwy’”, Theodore von Kármán, considered the foremost US expert on turbulence in the 1940s, confessed to a colleague, regarding something Onsager had written to him. “Perhaps you could indicate to me in a few lines what the idea is, if any.” Linus Pauling, another chemistry Nobel prizewinner, responded to a letter from Onsager, saying: “Your work looks very interesting indeed to me, but it is too far over my head for me to appreciate it properly.”

Thanks to the efforts of Eyink and others, about 10% of Onsager’s notebooks and letters — which are kept at the University of Trondheim in Norway — have been digitized and are available for anyone to read online. Eyink says that he hopes other researchers will make the effort to study them, and that they will find insights not only in fluid dynamics, but also in many other fields in which Onsager worked, such as thermodynamics and condensed-matter physics.

Something similar happened in the past with the work of another oracle of the twentieth century, mathematician Srinivasa Ramanujan (1887–1920). Over the past decade, new results have been derived from enigmatic formulas that he had sketched in his notes but never published.

Read more here:
🔗 http://www.nature.com/news/mysteries-of-turbulence-unravelled-1.22474
📈 A predictor of financial crises based on statistical methods: the Financial Crisis Observatory
http://tasmania.ethz.ch/pubfco/fco.html
#crash #stock_market #sornette #financial_crisis_observatory