the Turing Machine – Telegram
the Turing Machine
276 subscribers
189 photos
15 files
557 links
Join me through the journey of learning Computational Neuroscience topics.
Useful resources, positions and much more!
Get in touch: @nosratullah
Website: nosratullah.github.io
Deep Learning: a course on the theory and techniques of deep learning, with an emphasis on neuroscience. The course runs from August 2-20.
The syllabus for this course is still in progress; here is the current draft.

[ link ]

Follow: @theTuringMachine
Large-scale neural recording methods now allow us to observe large populations of identified single neurons simultaneously, opening a window into neural population dynamics in living organisms. However, distilling such large-scale recordings to build theories of emergent collective dynamics remains a fundamental statistical challenge. The neural field models of Wilson, Cowan, and colleagues remain the mainstay of mathematical population modeling owing to their interpretable, mechanistic parameters and amenability to mathematical analysis. Inspired by recent advances in biochemical modeling, we develop a method based on moment closure to interpret neural field models as latent state-space point-process models, making them amenable to statistical inference. With this approach we can infer the intrinsic states of neurons, such as active and refractory, solely from spiking activity in large populations...
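For readers who want a concrete picture of the population models the abstract refers to, below is a minimal, purely illustrative Python sketch of a classic Wilson-Cowan excitatory-inhibitory rate model. It is not the paper's moment-closure or state-space inference method, and the parameter values are generic textbook-style choices, not taken from the paper.

```python
# Illustrative sketch only: a Wilson-Cowan excitatory-inhibitory rate model,
# the kind of neural population model the paper builds on.
# Parameters are generic textbook-style values, not from the paper.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# coupling weights, external drives and time constants (illustrative values)
w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0
I_e, I_i = 1.25, 0.0
tau_e, tau_i = 1.0, 2.0

dt, T = 0.01, 50.0
steps = int(T / dt)
E = np.zeros(steps)   # excitatory population rate
I = np.zeros(steps)   # inhibitory population rate
E[0], I[0] = 0.1, 0.05

for t in range(steps - 1):
    # Euler integration of the coupled E/I rate equations
    dE = (-E[t] + sigmoid(w_ee * E[t] - w_ei * I[t] + I_e)) / tau_e
    dI = (-I[t] + sigmoid(w_ie * E[t] - w_ii * I[t] + I_i)) / tau_i
    E[t + 1] = E[t] + dt * dE
    I[t + 1] = I[t] + dt * dI

print("final rates:", E[-1], I[-1])
```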

[ link ]

#paper
Follow: @theTuringMachine
Post-doctoral position:

Research Fellow
UCL Department / Division: UCL Queen Square Institute of Neurology
Specific unit / Sub-department: Wellcome Centre for Human Neuroimaging, Max Planck UCL Centre for Computational Psychiatry and Ageing
Location of position: London
Grade: 7
Hours: Full Time
Salary (inclusive of London allowance): £36,028 - £43,533 per annum
Duties and Responsibilities
Applications are invited for a Research Fellow in the Max Planck UCL Centre for Computational Psychiatry and Ageing Research to undertake high-quality research and produce high-impact publications in the context of the ERC-funded research project "Action selection under threat - the complex control of human defence" led by Dr Dominik Bach.

#positions
[ link ]

Follow: @theTuringMachine
Reaction-diffusion system (Gray-Scott model)

A solver for the Gray-Scott reaction-diffusion model. Reaction-diffusion (RD) models are mathematical formulations of chemical and biological processes that are quite common in nature: several substances react with each other while spreading out over space. Simulating an RD system produces patterns reminiscent of those seen in many natural settings, such as the skin of a leopard or the surface of a brain coral. This experiment implements a solver for a specific class of RD systems, the Gray-Scott model, in which the reacting substance can be seen as living cells that need food to reproduce and have a limited lifetime. The user can place living cells with mouse strokes, change the colors, and set the parameters of the model (the feed and death rates). Some interesting parameter presets are available too...
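For anyone who wants to reproduce the patterns outside the linked experiment, here is a minimal NumPy sketch of a Gray-Scott solver. It is not the experiment's own code; the diffusion, feed and kill rates below are common pattern-forming values and are meant to be tweaked.

```python
# Minimal Gray-Scott reaction-diffusion sketch (NumPy, periodic boundaries).
# Not the linked experiment's code; parameter values are common
# pattern-forming choices, feel free to tweak them.
import numpy as np

n = 128                                  # grid size
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065  # diffusion, feed and kill rates
U = np.ones((n, n))
V = np.zeros((n, n))
# seed a small square of the second chemical in the centre
U[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50
V[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25

def laplacian(Z):
    # 5-point stencil with periodic (wrap-around) boundaries
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

for _ in range(10000):
    UVV = U * V * V
    U += Du * laplacian(U) - UVV + F * (1.0 - U)
    V += Dv * laplacian(V) + UVV - (F + k) * V

print("U range:", U.min(), U.max())
```

Plotting U after the loop (for example with matplotlib's imshow) reveals the spot and stripe patterns described above.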

#spare_time

[ link ] [ git ] [ description ]

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
A Student's Guide to Python for Physical Modeling: Second Edition
#python
#book
#beginner
In Stevenson and Kording (2011), the authors estimated that the number of neurons we can record simultaneously doubles every 7.4 years. Think of it as Moore's law for brain recordings. Since then, Stevenson has updated the estimate, which now stands at 6 years. Could it be that progress itself is accelerating?

by: Patrick Mineault

[ link ]
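As a back-of-the-envelope illustration of what a 6-year doubling time implies, here is a tiny Python calculation; the 2021 baseline count is a made-up round number for illustration, not a figure from the post.

```python
# Back-of-the-envelope sketch of the "Moore's law for neurons" claim:
# with a doubling time of ~6 years, project how many simultaneously
# recorded neurons to expect. The baseline below is a hypothetical round
# number, not a figure from the post.
neurons_2021 = 1_000        # hypothetical baseline count
doubling_time = 6.0         # years (Stevenson's updated estimate)

for year in (2027, 2033, 2045):
    n = neurons_2021 * 2 ** ((year - 2021) / doubling_time)
    print(f"{year}: ~{n:,.0f} neurons")
```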

Follow: @theTuringMachine
Overview
The Neural Latents Benchmark (NLB) aims to evaluate models of neural population state. In the first benchmark suite, participating models should take multi-channel spiking activity as input and produce firing rate estimates as output. This first benchmark will be released in August 2021.
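For intuition about the input-output mapping the benchmark asks for, here is a purely illustrative baseline in Python on synthetic data; it is not the NLB API or its evaluation code, just binned spike counts smoothed into firing-rate estimates.

```python
# Illustrative baseline only (not the NLB API): turn binned multi-channel
# spike counts into smoothed firing-rate estimates with a Gaussian kernel,
# the kind of input -> output mapping the benchmark describes.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)
n_channels, n_bins, bin_ms = 50, 200, 5
spikes = rng.poisson(0.2, size=(n_channels, n_bins))   # synthetic spike counts

# smooth along the time axis (sigma in bins) and convert counts to spikes/s
rates_hz = gaussian_filter1d(spikes.astype(float), sigma=4, axis=1) / (bin_ms / 1000.0)
print(rates_hz.shape)   # (channels, time bins) of firing-rate estimates
```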

[ link ][ video ]

Follow: @theTuringMachine
Forwarded from Complex Systems Studies
Statistical Physics of Complex Systems | (smr 3624)

This is the 3rd conference on Statistical Physics organised under the auspices of the EPS Statistical and Nonlinear Physics Division, during which the EPS Statistical and Nonlinear Physics Prize will be awarded.

http://indico.ictp.it/event/9625/
PhD Opening on large-scale dynamics of functional networks and information routing

We are recruiting one PhD fellow at the Institute for Systems Neuroscience at Aix-Marseille University. The student will work within the group of Dr. Demian Battaglia (theoretical neuroscientist), in close interaction with other researchers at Aix-Marseille University (Andrea Brovelli, systems and cognitive neuroscientist; Alain Barrat, complex-networks physicist) as well as at Strasbourg University.

[ link ]

#positions

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
JITCSIM
I have written a package for high-performance simulation of complex networks using just-in-time compilation.
Models are written in Python syntax; C code is generated and runs at full speed.

This is not an official release. I have made the project public to get some feedback and to learn how helpful it could be for others.

The models currently include Kuramoto models, solved as ODEs.

Delay differential equations and stochastic differential equations will be added soon.

Parallelisation with OpenMP and multiprocessing is also supported.

To get a glance at what is currently available, have a look at the notebooks (a plain NumPy sketch of the Kuramoto model itself follows below).
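JITCSIM's own interface is best learned from those notebooks. As a rough illustration of the kind of system it simulates, here is a plain NumPy sketch of the Kuramoto model on a random network; all values are arbitrary.

```python
# Not JITCSIM's API (see its notebooks for that): just a plain NumPy sketch
# of the Kuramoto model on a random network, the kind of system it simulates.
import numpy as np

rng = np.random.default_rng(1)
N = 100
K = 1.5                                         # coupling strength
omega = rng.normal(0.0, 1.0, N)                 # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)            # initial phases
A = (rng.random((N, N)) < 0.1).astype(float)    # random adjacency matrix

dt = 0.01
for _ in range(5000):
    # d(theta_i)/dt = omega_i + (K/N) * sum_j A_ij * sin(theta_j - theta_i)
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K / N * coupling)

order = np.abs(np.exp(1j * theta).mean())       # Kuramoto order parameter
print("synchrony r =", round(float(order), 3))
```

An order parameter near 1 indicates strong synchrony; near 0, incoherence.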
A potent tool to study neural patterns
Network control is an emerging field in which two lines of research, control theory and network theory, are combined to study complex systems. The strength of network control in comparison with existing techniques lies in its ability to make concrete predictions about the future behavior of the studied system and to associate them with the system's physical structure. When applied to brain data, network control not only allows us to describe the complex temporal patterns of neural activity within a mathematically rigorous quantitative framework, but also directly predicts the full trajectory of state transitions based on the algebraic distribution of driver nodes and the time course of the input signals.
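As a hedged illustration of the linear network-control setup this line of work builds on, here is a short Python sketch with a synthetic connectivity matrix and hypothetical driver nodes; it is not material from the workshop, just the textbook model x(t+1) = A x(t) + B u(t) and its finite-horizon controllability Gramian.

```python
# Hedged sketch of the linear network-control setup: discrete dynamics
# x(t+1) = A x(t) + B u(t), where A is a (normalised) structural network
# and B selects driver nodes. All values below are synthetic/hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 20
A = rng.random((n, n)); A = (A + A.T) / 2                 # toy symmetric "connectome"
A = A / (1.0 + np.max(np.abs(np.linalg.eigvals(A))))      # rescale for stability

drivers = [0, 3, 7]                                       # hypothetical driver nodes
B = np.zeros((n, len(drivers)))
for col, node in enumerate(drivers):
    B[node, col] = 1.0

# finite-horizon controllability Gramian: W = sum_t A^t B B^T (A^T)^t
W = np.zeros((n, n))
At = np.eye(n)
for _ in range(100):
    W += At @ B @ B.T @ At.T
    At = At @ A

print("Gramian full rank (system controllable)?", np.linalg.matrix_rank(W) == n)
print(f"trace of Gramian (rough controllability measure): {np.trace(W):.3f}")
```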

[ link ] [ registration ]

Follow: @theTuringMachine
Introduction to Linear Algebra for Applied Machine Learning with Python

Linear algebra is to machine learning as flour is to baking: every machine learning model is based on linear algebra, as every cake is based on flour. It is not the only ingredient, of course. Machine learning models need vector calculus, probability, and optimization, just as cakes need sugar, eggs, and butter. Applied machine learning, like baking, is essentially about combining these mathematical ingredients in clever ways to create useful models.


[ link ]

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
I have made some interactive dashboards for Brian.
Let's take a look.

[ git ]
Reaction-Diffusion Tutorial

Author: Karl Sims

A simulation of two virtual chemicals reacting and diffusing on a 2D grid using the Gray-Scott model.

[ link ]

Follow: @theTuringMachine
Case Studies in Neural Data Analysis
Author: Mark Kramer and Uri Eden

This repository is a companion to the textbook Case Studies in Neural Data Analysis, by Mark Kramer and Uri Eden. That textbook uses MATLAB to analyze examples of neuronal data. The material here is similar, except that we use Python.

[ link ] [ git ] [ book ]

Follow: @theTuringMachine