the Turing Machine
Plants are alive! Link: YouTube #spare_time Follow: @theTuringMachine
Did you buy your mother flowers yesterday? While those sweet-smelling roses may bring a smile to your mother’s face, unless you cut them from your own backyard, those flowers are not making the environment smile. The floriculture industry, which thrives on celebrations like Mother’s Day, Valentine’s Day, and birthdays, has huge environmental impacts through water usage, pollution, land degradation, and transportation emissions from fossil fuels. By some estimates, a single hectare of flower farm consumes over 900 cubic meters of water per month!
.... Continue reading
Link: Labroots
#spare_time
Labroots
Give her chocolates, not flowers: the environmental impacts of the cut-flower industry | Earth And The Environment
Forwarded from Scientific Programming (ZiAEE)
🔆 STAN
☘️ Stan is a state-of-the-art platform for statistical modeling and high-performance statistical computation.
☘️ Stan interfaces with the most popular data analysis languages (R, #python, shell, MATLAB, Julia, Stata) and runs on all major platforms (Linux, Mac, Windows).
🌱 Stan User’s Guide
🌱 PyStan Guide
To install, simply run:
$ pip3 install pystan
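A minimal sketch of what using Stan from Python looks like, assuming the PyStan 2 interface that pip3 install pystan provided at the time (PyStan 3 exposes a different API). The model and data here are illustrative, not taken from the Stan docs:

import pystan

# Illustrative model: estimate the mean and spread of a small sample.
model_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  y ~ normal(mu, sigma);
}
"""

sm = pystan.StanModel(model_code=model_code)   # compile the Stan program
fit = sm.sampling(data={"N": 5, "y": [1.2, 0.8, 1.1, 0.9, 1.0]},
                  iter=1000, chains=4)         # run NUTS sampling
print(fit)                                     # posterior summary for mu, sigma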
mc-stan.org
Stan User’s Guide
Stan user’s guide with examples and programming techniques.
This documentation is automatically generated from the corresponding code repository hosted on GitHub. The repository contains Python exercises accompanying the book Neuronal Dynamics by Wulfram Gerstner, Werner M. Kistler, Richard Naud and Liam Paninski.
https://lcn-neurodynex-exercises.readthedocs.io/en/latest/index.html
GitHub
martinbarry59/neurodynex3
Contribute to martinbarry59/neurodynex3 development by creating an account on GitHub.
Forwarded from Scientific Programming (ZiAEE)
Complexity Digest
CCS2020 Conference on Complex Systems 2020, online 7-12 December.
CCS2020 is the flagship conference promoted by the Complex Systems Society. It brings under one umbrella a wide variety of leading researchers, practitioners and stakeholders with a direct interest…
Recent years have seen many exciting developments in training spiking neural networks to perform complex information processing. This online workshop brings together researchers in the field to present their work and discuss ways of translating these findings into a better understanding of neural circuits. Topics include artificial and biologically plausible learning algorithms, and the dissection of trained spiking circuits toward understanding neural processing. We have a manageable number of talks with ample time for discussion.
The workshop is being organised by Dan Goodman and Friedemann Zenke.
#Neuroscience
Link: Git
Follow: @theTuringMachine
3 Best Books for Beginner Data Scientists!
Link: Medium
Medium
3 Best Books for Beginner Data Scientists
Improve your data analysis skills by getting these three key books
Forwarded from G Esfahani
Videos of talks from a recent CNS workshop: https://ocs2020.github.io/
Origins of Common Sense
Allen Institute Modeling Workshop: log-in information
Link: AllenInstitute
#Neuroscience
Follow: @theTuringMachine
A how-to-model guide for neuroscience
Within neuroscience, models have many roles, including driving hypotheses, making assumptions explicit, synthesizing knowledge, making experimental predictions, and facilitating applications to medicine. While specific modeling techniques are often taught, the process of constructing models for a given phenomenon or question is generally left opaque. Here, informed by guiding many students through modeling exercises at our summer school in CoSMo (Computational Sensory-Motor Neuroscience), we provide a practical 10-step breakdown of the modeling process. This approach makes choices and criteria more explicit and replicable. Experiment design has long been taught in neuroscience; the modeling process should receive the same attention.
Link
The Max Planck Schools are taking higher and graduate education in Germany in a new direction. With the participation of German universities and the four large German research organizations, the Max Planck School of Cognition, the Max Planck School Matter to Life and the Max Planck School of Photonics are concentrating the Germany-wide distributed excellence within three future-oriented fields to create internationally visible and highly attractive graduate programs. The most forward-thinking researchers of one discipline come together as Fellows of the Max Planck Schools to supervise internationally outstanding young scientists within the framework of a structured doctoral program. Fellows and students alike enjoy access to a truly unique interdisciplinary network.
Link: MaxPlanck
#position
Follow: @theTuringMachine
Presynaptic inhibition rapidly stabilises recurrent excitation in the face of plasticity
Hebbian plasticity, a mechanism believed to be the substrate of learning and memory, detects and further enhances correlated neural activity. Because this constitutes an unstable positive feedback loop, it requires additional homeostatic control. Computational work suggests that in recurrent networks, the homeostatic mechanisms observed in experiments are too slow to compensate instabilities arising from Hebbian plasticity and need to be complemented by rapid compensatory processes. We suggest presynaptic inhibition as a candidate that rapidly provides stability by compensating recurrent excitation induced by Hebbian changes. Presynaptic inhibition is mediated by presynaptic GABA receptors that effectively and reversibly attenuate transmitter release. Activation of these receptors can be triggered by excess network activity, hence providing a stabilising negative feedback loop that weakens recurrent interactions on sub-second timescales. We study the stabilising effect of presynaptic inhibition in recurrent networks, in which presynaptic inhibition is implemented as a multiplicative reduction of recurrent synaptic weights in response to increasing inhibitory activity. We show that networks with presynaptic inhibition display a gradual increase of firing rates with growing excitatory weights, in contrast to traditional excitatory-inhibitory networks. This alleviates the positive feedback loop between Hebbian plasticity and network activity and thereby allows homeostasis to act on timescales similar to those observed in experiments. Our results generalise to spiking networks with a biophysically more detailed implementation of the presynaptic inhibition mechanism. In conclusion, presynaptic inhibition provides a powerful compensatory mechanism that rapidly reduces effective recurrent interactions and thereby stabilises Hebbian learning.
Link: PLOS
#Neuroscience
Follow: @theTuringMachine
journals.plos.org
Presynaptic inhibition rapidly stabilises recurrent excitation in the face of plasticity
Author Summary Synapses between neurons change during learning and memory formation, a process termed synaptic plasticity. Established models of plasticity rely on strengthening synapses of co-active neurons. In recurrent networks, mutually connected neurons…
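To make the mechanism concrete, here is a toy rate-network sketch of multiplicative presynaptic inhibition as the abstract describes it: effective recurrent weights shrink as inhibitory activity grows. This is my own illustration with made-up parameters, not the paper's code:

import numpy as np

# Toy rate network with multiplicative presynaptic inhibition.
# All parameter values are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
N = 50                               # number of excitatory units
W = rng.uniform(0, 0.05, (N, N))     # recurrent excitatory weights
dt, tau = 0.1, 10.0                  # time step and rate time constant (ms)
beta = 2.0                           # strength of presynaptic inhibition

r = np.zeros(N)                      # firing rates
ext = 1.0                            # constant external drive
for _ in range(2000):
    inh = r.mean()                   # proxy for inhibitory population activity
    g = 1.0 / (1.0 + beta * inh)     # multiplicative attenuation of release
    drive = (g * W) @ r + ext        # effective recurrent input is scaled down
    r += dt / tau * (-r + np.maximum(drive, 0.0))

print(f"steady-state mean rate: {r.mean():.3f}")

Because the attenuation factor g falls as activity rises, growing excitatory weights raise firing rates only gradually instead of triggering runaway excitation.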
Forwarded from Scientific Programming (ZiAEE)
I am working on modeling neural dynamics from Börgers using #brian2.
It's easy to learn and fast for development.
For a quick review and tutorial, look here.
There is also a course by #Gerstner, whose book uses brian2. [link to exercise]
The course is offered by EPFL.
I am uploading the Jupyter notebooks for Börgers' models, implemented with brian2, here.
Have fun coding!
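If you want a taste before opening the notebooks, here is a minimal leaky integrate-and-fire example in the style of the brian2 tutorials (illustrative parameters, not from Börgers):

from brian2 import ms, NeuronGroup, SpikeMonitor, run

tau = 10*ms
# Leaky integrate-and-fire unit: dimensionless voltage driven toward 2,
# spiking whenever it crosses the threshold at 1
eqs = 'dv/dt = (2 - v) / tau : 1'
G = NeuronGroup(1, eqs, threshold='v > 1', reset='v = 0', method='exact')
spikes = SpikeMonitor(G)
run(100*ms)
print(f'{spikes.count[0]} spikes in 100 ms')

Equations are plain strings with physical units, which is what makes brian2 so quick for prototyping.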