the Turing Machine – Telegram
Join me on the journey of learning Computational Neuroscience topics.
Useful resources, positions and much more!
Get in touch: @nosratullah
Website: nosratullah.github.io
Forwarded from Scientific Programming (ZiAEE)
🔆 STAN
☘️ Stan is a state-of-the-art platform for statistical modeling and high-performance statistical computation.
☘️ Stan interfaces with the most popular data analysis languages (R, #python, shell, MATLAB, Julia, Stata) and runs on all major platforms (Linux, Mac, Windows).

🌱 Stan User’s Guide
🌱 PyStan Guide
To install, simply run:
$ pip3 install pystan
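To give a flavour of what a Stan program looks like, here is a minimal, purely illustrative model for Bayesian linear regression (variable names are arbitrary, not taken from the guides above):

```stan
// Bayesian linear regression: y ~ normal(alpha + beta * x, sigma)
data {
  int<lower=0> N;     // number of observations
  vector[N] x;        // predictor
  vector[N] y;        // outcome
}
parameters {
  real alpha;             // intercept
  real beta;              // slope
  real<lower=0> sigma;    // noise scale
}
model {
  y ~ normal(alpha + beta * x, sigma);
}
```

With the PyStan 2.x API installed above, a model like this is compiled via `pystan.StanModel(model_code=...)` and fitted with `.sampling(data=...)`; note that the newer PyStan 3 package exposes a different interface (`stan.build`).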
This documentation is automatically generated from the corresponding code repository hosted on GitHub. The repository contains Python exercises accompanying the book Neuronal Dynamics by Wulfram Gerstner, Werner M. Kistler, Richard Naud and Liam Paninski.

https://lcn-neurodynex-exercises.readthedocs.io/en/latest/index.html
Recent years have seen many exciting new developments in training spiking neural networks to perform complex information processing. This online workshop brings together researchers in the field to present their work and discuss ways of translating these findings into a better understanding of neural circuits. Topics include artificial and biologically plausible learning algorithms and the dissection of trained spiking circuits toward understanding neural processing. We have a manageable number of talks with ample time for discussions.
The workshop is being organised by Dan Goodman and Friedemann Zenke.

#Neuroscience
Link: Git
Follow: @theTuringMachine
Forwarded from G Esfahani
Videos of talks from a recent CNS workshop: https://ocs2020.github.io/
ALLEN INSTITUTE MODELING WORKSHOP LOG-IN INFORMATION

Link: AllenInstitute
#Neuroscience
Follow: @theTuringMachine
A how-to-model guide for neuroscience

Within neuroscience, models have many roles, including driving hypotheses, making assumptions explicit, synthesizing knowledge, making experimental predictions, and facilitating applications to medicine. While specific modeling techniques are often taught, the process of constructing models for a given phenomenon or question is generally left opaque. Here, informed by guiding many students through modeling exercises at our summer school in CoSMo (Computational Sensory-Motor Neuroscience), we provide a practical 10-step breakdown of the modeling process. This approach makes choices and criteria more explicit and replicable. Experiment design has long been taught in neuroscience; the modeling process should receive the same attention.
Link
The Max Planck Schools are taking higher and graduate education in Germany in a new direction. With the participation of German universities and the four large German research organizations[1], the Max Planck School of Cognition, the Max Planck School Matter to Life and the Max Planck School of Photonics are concentrating the Germany-wide distributed excellence within three future-oriented fields to create internationally visible and highly attractive graduate programs. The most forward-thinking researchers of one discipline come together as Fellows of the Max Planck Schools to supervise internationally outstanding young scientists within the framework of a structured doctoral program. Fellows and students alike enjoy access to a truly unique interdisciplinary network.

Link: MaxPlanck
#position
Follow: @theTuringMachine
Presynaptic inhibition rapidly stabilises recurrent excitation in the face of plasticity

Hebbian plasticity, a mechanism believed to be the substrate of learning and memory, detects and further enhances correlated neural activity. Because this constitutes an unstable positive feedback loop, it requires additional homeostatic control. Computational work suggests that in recurrent networks, the homeostatic mechanisms observed in experiments are too slow to compensate instabilities arising from Hebbian plasticity and need to be complemented by rapid compensatory processes. We suggest presynaptic inhibition as a candidate that rapidly provides stability by compensating recurrent excitation induced by Hebbian changes. Presynaptic inhibition is mediated by presynaptic GABA receptors that effectively and reversibly attenuate transmitter release. Activation of these receptors can be triggered by excess network activity, hence providing a stabilising negative feedback loop that weakens recurrent interactions on sub-second timescales. We study the stabilising effect of presynaptic inhibition in recurrent networks, in which presynaptic inhibition is implemented as a multiplicative reduction of recurrent synaptic weights in response to increasing inhibitory activity. We show that networks with presynaptic inhibition display a gradual increase of firing rates with growing excitatory weights, in contrast to traditional excitatory-inhibitory networks. This alleviates the positive feedback loop between Hebbian plasticity and network activity and thereby allows homeostasis to act on timescales similar to those observed in experiments. Our results generalise to spiking networks with a biophysically more detailed implementation of the presynaptic inhibition mechanism. In conclusion, presynaptic inhibition provides a powerful compensatory mechanism that rapidly reduces effective recurrent interactions and thereby stabilises Hebbian learning.
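The multiplicative weight-reduction idea is easy to illustrate in a toy rate model. The sketch below is my own minimal version with made-up parameters, not the paper's implementation: the recurrent excitatory weight is scaled by 1 / (1 + b·r_I), so rising inhibitory activity weakens effective recurrence.

```python
def simulate(b, w_EE=2.0, w_EI=1.0, w_IE=1.0, I_ext=1.0,
             tau=10.0, dt=0.1, steps=5000):
    """Euler-integrate a toy excitatory-inhibitory rate model.

    b > 0 switches on presynaptic inhibition: the recurrent excitatory
    weight is scaled by 1 / (1 + b * r_I), mimicking a multiplicative
    reduction of transmitter release by inhibitory activity.
    All parameter values are illustrative.
    """
    r_E = r_I = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + b * r_I)                    # presynaptic inhibition factor
        drive = p * w_EE * r_E - w_EI * r_I + I_ext  # input to the E population
        r_E += dt / tau * (-r_E + max(0.0, drive))   # rectified-linear rate
        r_I += dt / tau * (-r_I + w_IE * r_E)        # I population tracks E
    return r_E, r_I

# Without presynaptic inhibition, the excess recurrence (w_EE = 2) makes
# activity run away; with it, the network settles at a modest rate.
print("b=0.0:", simulate(0.0)[0])   # keeps growing
print("b=0.5:", simulate(0.5)[0])   # settles near a low fixed point
```

In this toy setting, feedback inhibition alone (b = 0) cannot stop the runaway, while the multiplicative presynaptic term stabilises the same network without changing any other parameter, which is the qualitative point of the abstract.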

Link: PLOS
#Neuroscience
Follow: @theTuringMachine
Forwarded from Scientific Programming (ZiAEE)
I am working on Modeling Neuronal Dynamics by Börgers using #brian2.

It's easy to learn and fast for development.
For a quick review and tutorial, look here.
There is also a course by #Gerstner, which uses brian2 alongside his book. [link to exercise]
The course is offered by EPFL.
I am uploading the Jupyter notebooks for Börgers' book, written with brian2, here.
Have fun coding!
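To give a flavour of the kind of model these notebooks cover, here is a leaky integrate-and-fire neuron integrated by hand in plain Python (parameter values are illustrative; in brian2 the same equations would be declared as a NeuronGroup with a threshold and reset, in far fewer lines):

```python
# Leaky integrate-and-fire neuron: tau * dV/dt = -(V - E_L) + R * I
tau = 10.0        # membrane time constant (ms)
E_L = -70.0       # resting potential (mV)
V_th = -50.0      # spike threshold (mV)
V_reset = -65.0   # reset potential (mV)
R, I = 10.0, 2.5  # resistance (MOhm) and input current (nA) -> R*I = 25 mV

dt, t_max = 0.1, 500.0  # time step and simulation length (ms)
V, t, spikes = E_L, 0.0, []
while t < t_max:
    V += dt / tau * (-(V - E_L) + R * I)  # forward-Euler update
    if V >= V_th:                         # threshold crossing -> spike
        spikes.append(t)
        V = V_reset                       # reset after the spike
    t += dt

print(f"{len(spikes)} spikes in {t_max:.0f} ms")
```

Since R·I = 25 mV drives the membrane above threshold, the neuron fires regularly; the inter-spike interval follows from the exponential charging curve, which is exactly the kind of calculation Börgers and Gerstner walk through.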