Raissa D'Souza - What is Complexity in the Cosmos?
“Real scale-free networks show geometric scaling under renormalization group transformation.”
Beautiful paper tackling scaling and self-similarity with the good old RG.
https://www.nature.com/articles/s41567-018-0072-5?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+nphys%2Frss%2Fcurrent+%28Nature+Physics+-+Issue%29
Nature
Multiscale unfolding of real networks by geometric renormalization
Nature Physics - Complex networks are not obviously renormalizable, as different length scales coexist. Embedding networks in a geometrical space allows the definition of a renormalization group...
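A minimal sketch of the coarse-graining idea (my own toy illustration, not the paper's actual geometric renormalization scheme, which works with hidden degrees in the S1 model): merge blocks of nodes that sit close together in the embedding space into supernodes, and keep a link between two supernodes whenever any of their members were linked. The use of networkx, the block size r, and the "mean angle" rule are all assumptions made for the sketch.

```python
# Toy geometric coarse-graining of a network embedded on a circle.
# Not the paper's procedure -- just the block-merging idea, for illustration.
import networkx as nx
import numpy as np

def geometric_coarse_grain(G, theta, r=2):
    """G: networkx graph; theta: dict node -> angular coordinate; r: block size."""
    # Order nodes by their angle and cut the ordering into consecutive blocks.
    ordered = sorted(G.nodes(), key=lambda n: theta[n])
    block_of = {n: i // r for i, n in enumerate(ordered)}

    # One supernode per block; link supernodes if any of their members were linked.
    Gr = nx.Graph()
    Gr.add_nodes_from(set(block_of.values()))
    for u, v in G.edges():
        bu, bv = block_of[u], block_of[v]
        if bu != bv:
            Gr.add_edge(bu, bv)

    # Give each supernode the mean angle of its members (a crude choice),
    # so the step can be iterated to "unfold" the network at larger scales.
    theta_r = {b: float(np.mean([theta[n] for n in ordered if block_of[n] == b]))
               for b in Gr.nodes()}
    return Gr, theta_r

# Toy usage: a ring-like graph standing in for a real embedded network.
rng = np.random.default_rng(0)
n = 64
theta = {i: float(a) for i, a in enumerate(np.sort(rng.uniform(0, 2 * np.pi, n)))}
G = nx.watts_strogatz_graph(n, 4, 0.1)
Gr, theta_r = geometric_coarse_grain(G, theta, r=2)
print(G.number_of_nodes(), "->", Gr.number_of_nodes(), "nodes after one RG step")
```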
Taming Chaos: Calculating Probability in Complex Systems
https://publishing.aip.org/publishing/journal-highlights/taming-chaos-calculating-probability-complex-systems?Track=CHAOStweet
Freedom is Just a Word
By Peter Corning • Mar 27, 2018
http://complexsystems.org/390/freedom-is-just-a-word/
💎 “Machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.”
https://www.nature.com/articles/s41567-018-0081-4
Nature
Mutual information, neural networks and the renormalization group
Nature Physics - Finding the relevant degrees of freedom of a system is a key step in any renormalization group procedure. But this can be difficult, particularly in strongly interacting systems. A...
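A toy illustration of the information-theoretic idea (a hedged sketch, not the paper's actual algorithm): a good coarse-grained variable should retain mutual information with its environment. Here the majority spin of a small block in a 1D Ising chain is checked against a nearby "environment" spin; the chain length, temperature, sampling scheme and plug-in estimator are all assumptions made for the sketch.

```python
# Estimate I(H; E) between a coarse variable H (majority spin of a 2-spin block)
# and an environment spin E, from Metropolis samples of a 1D Ising chain.
import numpy as np

rng = np.random.default_rng(1)

def sample_ising_chain(n=64, beta=1.0, sweeps=200):
    # Independent Metropolis run per sample: start random, sweep, return the state.
    s = rng.choice([-1, 1], size=n)
    for _ in range(sweeps):
        for i in range(n):
            dE = 2 * s[i] * (s[(i - 1) % n] + s[(i + 1) % n])
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i] = -s[i]
    return s.copy()

def mutual_information(x, y):
    # Plug-in estimate from the joint histogram of two discrete variables (bits).
    xs, ys = np.unique(x), np.unique(y)
    pxy = np.array([[np.mean((x == a) & (y == b)) for b in ys] for a in xs])
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

samples = np.array([sample_ising_chain() for _ in range(200)])
block = samples[:, 0:2]                       # visible block of two spins
H = np.sign(block.sum(axis=1) + 0.5)          # coarse variable: majority spin
E = samples[:, 4]                             # a nearby "environment" spin
print("I(H; E) ~", round(mutual_information(H, E), 3), "bits")
```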
Complex Systems Studies
🌲 Tree crown shyness – a photogenic phenomenon in which the crowns of fully grown trees do not touch each other. A metaphor for so many things, it really makes you think about what it means to interact.
🌲 Trees as islands: canopy ant species richness increases with the size of liana‐free trees in a Neotropical forest
https://onlinelibrary.wiley.com/doi/abs/10.1111/ecog.02608
Last day to apply:
https://www.icts.res.in/program/integrability2018
Responding To Complexity: A Discussion With Yaneer Bar-Yam
🔖 A high-bias, low-variance introduction to Machine Learning for physicists
Pankaj Mehta, Marin Bukov, Ching-Hao Wang, Alexandre G.R. Day, Clint Richardson, Charles K. Fisher, David J. Schwab
https://arxiv.org/pdf/1803.08823
Machine Learning (ML) is one of the most exciting and dynamic areas of modern research and application. The purpose of this review is to provide an introduction to the core concepts and tools of machine learning in a manner easily understood and intuitive to physicists. The review begins by covering fundamental concepts in ML and modern statistics such as the bias-variance tradeoff, overfitting, regularization, and generalization before moving on to more advanced topics in both supervised and unsupervised learning. Topics covered in the review include ensemble models, deep learning and neural networks, clustering and data visualization, energy-based models (including MaxEnt models and Restricted Boltzmann Machines), and variational methods. Throughout, we emphasize the many natural connections between ML and statistical physics. A notable aspect of the review is the use of Python notebooks to introduce modern ML/statistical packages to readers using physics-inspired datasets (the Ising Model and Monte-Carlo simulations of supersymmetric decays of proton-proton collisions). We conclude with an extended outlook discussing possible uses of machine learning for furthering our understanding of the physical world as well as open problems in ML where physicists may be able to contribute. (Notebooks are available at https://physics.bu.edu/~pankajm/MLnotebooks.html)
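A quick numerical sketch of the bias-variance tradeoff the review opens with (my own toy example, not taken from the paper's notebooks): fit noisy samples of a smooth function with polynomials of increasing degree and compare the training error against the error on fresh data.

```python
# Bias-variance tradeoff in miniature: polynomial regression on noisy sine data.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(42)
f = lambda x: np.sin(2 * np.pi * x)

def make_data(n):
    x = rng.uniform(0, 1, n)
    return x, f(x) + 0.2 * rng.normal(size=n)

x_train, y_train = make_data(20)
x_test, y_test = make_data(200)

for degree in (1, 3, 9, 15):
    p = Polynomial.fit(x_train, y_train, deg=degree)        # least-squares fit
    train_mse = np.mean((p(x_train) - y_train) ** 2)
    test_mse = np.mean((p(x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")

# Low-degree fits underfit (high bias); very high-degree fits drive the training
# error down while the test error climbs (high variance).
```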
💎 Bill Rand's new tutorial, Fundamentals of NetLogo, has officially launched! Take the new tutorial and start using it in your work!
http://netlogo.complexityexplorer.org/
🔸 The Key to Creating Cost-Effective Experiments at Scale - Behavioral Scientist:
http://behavioralscientist.org/the-key-to-creating-engaging-and-cost-effective-experiments-at-scale/
Behavioral Scientist
The Key to Creating Cost-Effective Experiments at Scale
How can we build large-scale, cost-effective experiments that people want to participate in?
💻 A Free Oxford Course on Deep Learning: Cutting Edge Lessons in Artificial Intelligence
Slides and Videos:
🔗 https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/