1️⃣ But what *is* a Neural Network? | Deep learning, Part 1
🔗 https://www.aparat.com/v/XkQFy
2️⃣ Gradient descent, how neural networks learn | Deep learning, part 2
🔗 https://www.aparat.com/v/uvUxW
3️⃣ What is backpropagation and what is it actually doing? | Deep learning
🔗 https://www.aparat.com/v/EZ9RV
3️⃣*️⃣ Backpropagation calculus | Appendix to deep learning
🔗 https://www.aparat.com/v/0tSKg
#Weekly_Seminars of the Complex Systems and Network Science Group, Shahid Beheshti University
🔹 Monday, 15 Aban, 4:00 - Classroom 1, Department of Physics, Shahid Beheshti University.
@carimi
Forwarded from the Beheshti Physics Scientific Association (SBU)
This week's #Public_Seminar
Quantum; the Brain and Artificial Intelligence
- Tuesday, 16 Aban; 16:00
- Ibn al-Haytham Hall, Department of Physics
Channel of the Physics Student Scientific Association
@sbu_physics
🔖 Variational Inference: A Review for Statisticians
David M. Blei, Alp Kucukelbir, Jon D. McAuliffe
🔗 https://arxiv.org/pdf/1601.00670
📌 ABSTRACT
One of the core problems of modern statistics is to approximate difficult-to-compute probability densities. This problem is especially important in Bayesian statistics, which frames all inference about unknown quantities as a calculation involving the posterior density. In this paper, we review variational inference (VI), a method from machine learning that approximates probability densities through optimization. VI has been used in many applications and tends to be faster than classical methods, such as Markov chain Monte Carlo sampling. The idea behind VI is to first posit a family of densities and then to find the member of that family which is close to the target. Closeness is measured by Kullback-Leibler divergence. We review the ideas behind mean-field variational inference, discuss the special case of VI applied to exponential family models, present a full example with a Bayesian mixture of Gaussians, and derive a variant that uses stochastic optimization to scale up to massive data. We discuss modern research in VI and highlight important open problems. VI is powerful, but it is not yet well understood. Our hope in writing this paper is to catalyze statistical research on this class of algorithms.
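To make the review's running example concrete, here is a minimal sketch of coordinate ascent variational inference (CAVI) for the Bayesian mixture of Gaussians the paper walks through, with unit-variance likelihoods and a N(0, sigma^2) prior on the component means as in the paper; the toy data, K = 3, sigma^2 = 10, and the fixed sweep count are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 1-D points from three well-separated Gaussians (an
# illustrative assumption, not data from the paper).
x = np.concatenate([rng.normal(mu, 1.0, 100) for mu in (-5.0, 0.0, 5.0)])
N, K, sigma2 = len(x), 3, 10.0  # sigma2 = prior variance of the means

# Variational factors: q(mu_k) = N(m[k], s2[k]), q(c_i) = Categorical(phi[i]).
m = rng.normal(0.0, 1.0, K)
s2 = np.ones(K)

for _ in range(100):  # CAVI sweeps
    # Responsibilities: phi_ik proportional to exp(m_k x_i - (s2_k + m_k^2) / 2).
    log_phi = np.outer(x, m) - 0.5 * (s2 + m**2)
    log_phi -= log_phi.max(axis=1, keepdims=True)  # numerical stability
    phi = np.exp(log_phi)
    phi /= phi.sum(axis=1, keepdims=True)

    # Gaussian factors: closed-form updates given the responsibilities.
    nk = phi.sum(axis=0)
    m = (phi.T @ x) / (1.0 / sigma2 + nk)
    s2 = 1.0 / (1.0 / sigma2 + nk)

print("Inferred component means:", np.round(np.sort(m), 2))
```

Each sweep alternates the closed-form update for the cluster responsibilities phi with the updates for the Gaussian factors q(mu_k); since CAVI increases the ELBO monotonically, a real implementation would monitor the ELBO for convergence rather than run a fixed number of sweeps.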
Really helpful, thorough, and insightful network tutorial (.ipynb):
https://github.com/vtraag/4TU-CSS/blob/master/presentations/traag/notebook/Network.ipynb
👌🏾 An Introduction to Complex Systems: Society, Ecology, and Nonlinear Dynamics
http://physicstoday.scitation.org/doi/full/10.1063/PT.3.3766
Physics Today 70, 11, 51 (2017); https://doi.org/10.1063/PT.3.3766
⚙ PDF 👇🏼👇🏻👇🏽
Using Big Data, Social Networks, and Agent-Based Modeling to Understand Information Diffusion
https://vimeo.com/241857550
Center for Collective Dynamics of Complex Systems (CoCo) Seminar Series, November 7, 2017. Bill Rand (Marketing, North Carolina State University).
⚙ This collection of over 500 MATLAB examples can help you with #machinelearning, #statistics, and #math problems
https://www.mathworks.com/examples/product-group/matlab-math-statistics-and-optimization?s_eid=PSM_da&hootPostID=70cb4d7118cfa4662cc041050f5e8ff1
🎞 Civilization Far From Equilibrium - Energy, Complexity, and Human Survival
https://www.perimeterinstitute.ca/videos/civilization-far-equilibrium-energy-complexity-and-human-survival
Abstract
Human societies use complexity – within their institutions and technologies – to address their various problems, and they need high-quality energy to create and sustain this complexity. But now greater complexity is producing diminishing returns in wellbeing, while the energetic cost of key sources of energy is rising fast. Simultaneously, humankind’s problems are becoming vastly harder, which requires societies to deliver yet more complexity and thus consume yet more energy. Resolving this paradox is the central challenge of the 21st century.
Thomas Homer-Dixon holds the CIGI Chair of Global Systems at the Balsillie School of International Affairs in Waterloo, Canada, and is a Professor at the University of Waterloo.
💎 How do little things combine to do big things?
Bryan Daniels
http://www.public.asu.edu/~bdaniel6/collective-behavior-amplification.html
Collective behavior in living systems is fascinating and diverse: proteins collectively perform metabolism in your cells; neurons collectively process information in your brain; honeybees collectively forage and care for their young.
Collective behavior in inanimate systems is also beautiful and can be more pristine: the fact that the atoms of carbon in a diamond are equivalent means that the crystal structure they form can be utterly precise. It also makes it easier to describe real crystals and predict their properties: once you know the fundamental repeating building block, it's just a matter of scaling up to have trillions of repeating units.
Physics tells us a lot about how to deal with these less complicated collective systems. We can connect the properties of atoms to the properties of familiar objects, like the hardness of a diamond. Can this expertise be ported to biological examples, where individuals are not so simple?
One connection we've been thinking about involves how information is amplified in producing a system's aggregate behavior. As an example of a system with low amplification, imagine our diamond under usual conditions: if we somehow jostled just a few atoms, this wouldn't have a huge effect. The jostling would quickly dissipate without changing the overall structure.
In many living systems, though, changes to individuals can have a large effect on the whole. A single bird might change the direction of a flock; a change to the activity of a few neurons might affect your subsequent behavior.
We can quantify this "amplification" with a fundamental measure from information theory—the Fisher information—which also turns out to have interesting connections to the idea of a phase transition in statistical physics. It's also a natural measure from the point of view of biology: Amplifying important information and damping extraneous noise can be key to the survival of a group.
In this way, amplification measures a fundamental property of collectives, one that could be compared across different systems. Making the connection to statistical physics, we might ask questions like, "How far is this fish school from a phase transition?" Making the connection to biology, we might ask, "How is the correct information from individuals amplified to influence the behavior of the whole?"
Through this lens of amplification and others that describe distinct aspects of information processing, we aim eventually to map out the space of strategies used by living systems as they combine to form functional aggregates.
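A hedged numerical sketch of this amplification idea (a toy mean-field Ising model of my choosing, not one of the group's actual systems): for a collective of N binary individuals coupled through their mean activity, the exponential-family identity says the Fisher information that the aggregate M = sum_i s_i carries about a weak external field equals Var(M), the susceptibility, and that quantity should peak near the phase transition.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_info_estimate(N, J, sweeps=2000, burn=500):
    """Metropolis sampling of a mean-field Ising collective of N
    individuals s_i = +/-1 at temperature 1 with coupling J.
    For this exponential-family model, the Fisher information the
    aggregate M = sum(s) carries about a weak external field h
    (at h = 0) is exactly Var(M), so we estimate that variance."""
    s = rng.choice([-1, 1], size=N)
    M = int(s.sum())
    samples = []
    for sweep in range(sweeps):
        for _ in range(N):
            i = rng.integers(N)
            # Energy is E = -(J / 2N) * M^2; flipping s_i changes it by:
            dE = (2.0 * J / N) * s[i] * (M - s[i])
            if dE <= 0 or rng.random() < np.exp(-dE):
                M -= 2 * s[i]
                s[i] = -s[i]
        if sweep >= burn:
            samples.append(M)
    return np.var(samples)

# Amplification (Fisher information per individual) vs. coupling:
# it should peak near the mean-field critical point J = 1.
for J in (0.2, 0.6, 1.0, 1.4, 1.8):
    print(f"J = {J:.1f}   Var(M)/N ~ {fisher_info_estimate(200, J) / 200:.2f}")
```

Well below the critical coupling the individuals are nearly independent and jostling a few of them dissipates, as in the diamond example; near J = 1, Var(M)/N grows sharply, which is the statistical-physics face of high amplification. Finite-size effects round and shift the peak, so treat the numbers as qualitative.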