Forwarded from Deleted Account
Workshop "The Complexities of Nature", Part 3
#Thesis_Defense of an M.Sc. thesis
Alireza Saeedi
96/05/30 - 10:00
Classroom 3, Department of Physics, Shahid Beheshti University
#Thesis_Defense of an M.Sc. thesis
Mostafa Jannesari
96/05/30 - 14:00
Classroom 3, Department of Physics, Shahid Beheshti University
🖥 NetworkX 2.0 released - a #python package for the creation, manipulation, and analysis of networks:
http://networkx.readthedocs.io/en/latest/release/release_dev.html
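A minimal taste of the package, assuming only NetworkX itself; the toy graph below is made up for illustration:

```python
import networkx as nx

# Build a small undirected graph and query a few standard properties.
G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")])

print(nx.is_connected(G))             # True
print(nx.shortest_path(G, "a", "d"))  # ['a', 'c', 'd']
print(nx.degree_centrality(G))        # degree of each node / (n - 1)
```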
🎞 Relative Entropy:
http://videolectures.net/nips09_verdu_re/
An overview of relative entropy (a.k.a. the Kullback-Leibler divergence, among other names) and its multiple appearances in information theory, probability, and statistics, including recent results by the speaker.
Download slides: http://videolectures.net/site/normal_dl/tag=62842/nips09_verdu_re_01.pdf
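For concreteness, a minimal numpy sketch of the quantity under discussion, D(p||q) = sum_i p_i log(p_i / q_i); the two distributions are made up, and q is assumed positive wherever p is:

```python
import numpy as np

def relative_entropy(p, q):
    """D(p || q) in nats; terms with p_i = 0 contribute 0 by convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]
print(relative_entropy(p, q))  # > 0; zero iff p == q (Gibbs' inequality)
print(relative_entropy(p, p))  # 0.0
```

Note the asymmetry: D(p||q) != D(q||p) in general, which is why it is called a divergence rather than a distance.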
🎞 Bayesian Inference
Peter Green, Department of Mathematics, University of Bristol
🔗 http://videolectures.net/mlss2011_green_bayesian/?q=Inference
Inference is the process of discovering, from data, the mechanisms that may have caused or generated that data, or at least explain it. The goals are varied - perhaps simply predicting future data, or more ambitiously drawing conclusions about scientific or societal truths. In the language of applied mathematics, these are inverse problems. Bayesian inference is about using probability to do all this. One of its strengths is that all sources of uncertainty in a problem can be considered simultaneously and coherently. It is model-based (in the language of machine learning, these are generative models), and we can use Bayesian methods to choose and criticize the models we use.
Download slides: http://videolectures.net/site/normal_dl/tag=626012/mlss2011_green_bayesian_01.pdf
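A minimal sketch of the idea with a conjugate Beta-Binomial model (the coin-flip data here are hypothetical): the prior encodes initial uncertainty about the coin's bias, and Bayes' rule combines it with the likelihood of the observed flips into a posterior.

```python
from scipy.stats import beta

a0, b0 = 1.0, 1.0   # Beta(1, 1) = uniform prior on the coin bias theta
k, n = 7, 10        # hypothetical data: 7 heads in 10 flips

# Conjugacy: Beta prior + Binomial likelihood => Beta posterior.
a1, b1 = a0 + k, b0 + (n - k)      # posterior is Beta(8, 4)
posterior = beta(a1, b1)

print(posterior.mean())            # ~0.667, pulled from the MLE 0.7 toward 0.5
print(posterior.interval(0.95))    # 95% credible interval for theta
```

The posterior is a full distribution rather than a point estimate, which is exactly the "all sources of uncertainty considered coherently" point in the abstract.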
casi.pdf
8.1 MB
Computer Age Statistical Inference
Algorithms, Evidence, and Data Science
Bradley Efron, Trevor Hastie
Stanford University
🎞 Interesting talks from PyData London 2017
https://medium.springboard.com/interesting-talks-from-pydata-london-2017-d17b06c1ed5e
🔖 Deep Learning the Ising Model Near Criticality
Alan Morningstar, Roger G. Melko
🔗 https://arxiv.org/pdf/1708.04622
📌 ABSTRACT
It is well established that neural networks with deep architectures perform better than shallow networks for many tasks in machine learning. In statistical physics, while there has been recent interest in representing physical data with generative modelling, the focus has been on shallow neural networks. A natural question to ask is whether deep neural networks hold any advantage over shallow networks in representing such data. We investigate this question by using unsupervised, generative graphical models to learn the probability distribution of a two-dimensional Ising system. Deep Boltzmann machines, deep belief networks, and deep restricted Boltzmann networks are trained on thermal spin configurations from this system, and compared to the shallow architecture of the restricted Boltzmann machine. We benchmark the models, focussing on the accuracy of generating energetic observables near the phase transition, where these quantities are most difficult to approximate. Interestingly, after training the generative networks, we observe that the accuracy essentially depends only on the number of neurons in the first hidden layer of the network, and not on other model details such as network depth or model type. This is evidence that shallow networks are more efficient than deep networks at representing physical probability distributions associated with Ising systems near criticality.
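Not the paper's code, but a minimal numpy sketch of how such thermal spin configurations can be generated: single-spin-flip Metropolis sampling of the 2D Ising model (J = k_B = 1), run near the critical temperature T_c = 2 / ln(1 + sqrt(2)) ≈ 2.269. Lattice size and sweep counts below are illustrative only.

```python
import numpy as np

def metropolis_sweep(spins, T, rng):
    """One sweep (L*L attempted flips) of the 2D Ising model at temperature T."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours, with periodic boundaries.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb          # energy cost of flipping (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] = -spins[i, j]

rng = np.random.default_rng(0)
L, T = 16, 2.269                             # small lattice, near criticality
spins = rng.choice(np.array([-1, 1]), size=(L, L))
for _ in range(500):                         # equilibrate before sampling
    metropolis_sweep(spins, T, rng)
print(abs(spins.mean()))                     # |magnetization| per site
```

Configurations collected after equilibration (ideally spaced by many sweeps to reduce autocorrelation) are what an RBM-style generative model would be trained on.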
🎞 Want to learn parallel programming? Watch this video series by Intel:
https://www.youtube.com/watch?v=nE-xN4Bf8XI&index=1&list=PLLX-Q6B8xqZ8n8bwjGdzBJ25X2utwnoEG
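The series itself centres on Intel's tooling; as a generic first taste of the topic, here is a minimal Python sketch that spreads a CPU-bound map across worker processes (the task and inputs are made up):

```python
from multiprocessing import Pool

def heavy_task(x):
    """Stand-in for a CPU-bound computation."""
    return sum(i * i for i in range(x))

if __name__ == "__main__":
    with Pool(processes=4) as pool:          # 4 worker processes
        results = pool.map(heavy_task, range(10_000, 10_008))
    print(results)
```

Processes rather than threads are used here because CPython's GIL prevents threads from running Python bytecode in parallel.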
Network Biology/Integromics Bioinformatics – Applications Towards Medicine
http://norbis.no/activities/workshops/network-biologyintegromics-bioinformatics-applications-towards-medicine/
🌀 #Turbulence occurs in a cascade: large eddies break down into smaller ones, which in turn split into even smaller ones, in a #fractal fashion.
Read more here:
🔗 http://www.nature.com/news/mysteries-of-turbulence-unravelled-1.22474
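The quantitative fingerprint of this cascade is not spelled out in the post, but it is the classic Kolmogorov (1941) result for the inertial-range energy spectrum:

```latex
% Energy per unit wavenumber k in the inertial range depends only on k
% and the mean energy dissipation rate \varepsilon; C is a universal constant.
E(k) = C \, \varepsilon^{2/3} \, k^{-5/3}
```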
🌊 Lars Onsager: a cryptic genius
Theoretical physicist and chemist Lars Onsager (1903–76) was the type of scientist whom — so it was said — even geniuses such as Richard Feynman found intimidating to talk to. The Norwegian-born polymath “would announce his results by little, short, gnomic utterances”, says theoretical physicist Gregory Eyink. “And he was always right.”
In one of those terse revelations, Onsager announced in 1949 the surprising idea that turbulent fluids dissipate energy even in the absence of viscosity. That idea has now been proven mathematically.
In some cases, researchers have made sense of what Onsager said only in hindsight. In the 1990s, Eyink, who is now at Johns Hopkins University in Baltimore, Maryland, became the first person to take a major step towards validating Onsager’s argument on energy dissipation, only to discover later that Onsager himself had already made a start on that proof, scrawled in cryptic form in unpublished notebooks. Onsager had not bothered to publish this or much else on turbulence, in part because he was busy with other things — including work that led to him receiving the Nobel Prize in Chemistry in 1968 — but also because of the cold reception that others initially gave to his ideas.
“I find his letter somewhat ‘screwy’”, Theodore von Kármán, considered the foremost US expert on turbulence in the 1940s, confessed to a colleague, regarding something Onsager had written to him. “Perhaps you could indicate to me in a few lines what the idea is, if any.” Linus Pauling, another chemistry Nobel prizewinner, responded to a letter from Onsager, saying: “Your work looks very interesting indeed to me, but it is too far over my head for me to appreciate it properly.”
Thanks to the efforts of Eyink and others, about 10% of Onsager’s notebooks and letters — which are kept at the University of Trondheim in Norway — have been digitized and are available for anyone to read online. Eyink says that he hopes other researchers will make the effort to study them, and that they will find insights not only in fluid dynamics, but also in many other fields in which Onsager worked, such as thermodynamics and condensed-matter physics.
Something similar happened in the past with the work of another oracle of the twentieth century, mathematician Srinivasa Ramanujan (1887–1920). Over the past decade, new results have been derived from enigmatic formulas that he had sketched in his notes but never published.
Read more here:
🔗 http://www.nature.com/news/mysteries-of-turbulence-unravelled-1.22474