the Turing Machine
Join me through the journey of learning Computational Neuroscience topics.
Useful resources, positions and much more!
Get in touch: @nosratullah
Website: nosratullah.github.io
Introduction to Neural Computation

Course Description:
This course introduces quantitative approaches to understanding brain and cognitive functions. Topics include mathematical description of neurons, the response of neurons to sensory stimuli, simple neuronal networks, statistical inference and decision making. It also covers foundational quantitative tools of data analysis in neuroscience: correlation, convolution, spectral analysis, principal components analysis, and mathematical concepts including simple differential equations and linear algebra.
Instructors: Prof. Michale Fee | Daniel Zysman
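
As a taste of the quantitative tools listed above, here is a minimal, hypothetical sketch (plain NumPy, not course material) of one of them: estimating a neuron's linear response by convolving a stimulus with an exponential filter.

```python
import numpy as np

# Hypothetical illustration (not from the course): a neuron's linear response
# estimated by convolving a white-noise stimulus with an exponential filter.
dt = 0.001                               # time step (s)
t = np.arange(0.0, 1.0, dt)              # 1 s of simulated time
stimulus = np.random.randn(t.size)       # white-noise stimulus
tau = 0.020                              # filter time constant (20 ms)
kernel = np.exp(-t / tau) * dt / tau     # exponential filter, area ~ 1

# Full linear convolution, truncated to the stimulus length; a real firing-rate
# model would rectify this (rates cannot be negative).
response = np.convolve(stimulus, kernel)[: t.size]
print(response[:5])
```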

[ link ]

#courses

More: @theTuringMachine
Anjan Chatterjee uses tools from evolutionary psychology and cognitive neuroscience to study one of nature's most captivating concepts: beauty. Learn more about the science behind why certain configurations of line, color and form excite us in this fascinating, deep look inside your brain.

[ link ]

#spare_time

Follow: @theTuringMachine
Manifolds in Neuroscience
This video explains in pretty intuitive terms how ideas from topology (or "rubber geometry") can be used in neuroscience, to help us understand the way information is embedded in high-dimensional representations inside neural circuits.
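
For a concrete, if toy, illustration of the idea (my own sketch, not from the video): simulate a population of cosine-tuned neurons whose activity lives on a ring, then check with PCA that the 100-dimensional data are effectively low-dimensional.

```python
import numpy as np

# Toy sketch: 100 "neurons" with cosine tuning to a circular variable trace out
# a 1-D ring manifold embedded in 100-dimensional firing-rate space.
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 500)                # latent circular variable
preferred = rng.uniform(0, 2 * np.pi, size=100)       # preferred angles
rates = np.cos(theta[:, None] - preferred[None, :])   # 500 samples x 100 neurons
rates += 0.1 * rng.standard_normal(rates.shape)       # observation noise

# PCA via the covariance eigenspectrum: almost all variance sits in ~2 dimensions,
# exposing the low-dimensional structure hidden in the high-dimensional code.
centered = rates - rates.mean(axis=0)
cov = centered.T @ centered / centered.shape[0]
eigvals = np.linalg.eigvalsh(cov)[::-1]               # sorted, largest first
print(eigvals[:5] / eigvals.sum())
```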

[ link ]

Follow: @theTuringMachine
Dynamical Systems with Applications using Python

Designed for a broad audience of students in applied mathematics, physics, and engineering
Represents dynamical systems with popular Python libraries like sympy, numpy, and matplotlib (a minimal phase-portrait sketch follows after this list)
Explores a variety of advanced topics in dynamical systems, like neural networks, fractals, and nonlinear optics, at an undergraduate level.
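
A minimal sketch in the spirit of the book (my own example with numpy and matplotlib, not the book's code): the phase portrait of a damped pendulum.

```python
import numpy as np
import matplotlib.pyplot as plt

# Damped pendulum x'' + 0.5 x' + sin(x) = 0, written as a first-order system.
x, y = np.meshgrid(np.linspace(-2 * np.pi, 2 * np.pi, 30),
                   np.linspace(-3, 3, 30))
dx = y                            # x' = y
dy = -0.5 * y - np.sin(x)         # y' = -0.5 y - sin(x)

plt.streamplot(x, y, dx, dy, density=1.2, color="steelblue")
plt.xlabel("angle x")
plt.ylabel("angular velocity y")
plt.title("Damped pendulum phase portrait")
plt.show()
```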

[ link ] [ codes ]

Follow: @theTuringMachine
A nice interactive GUI for phase portraits based on the system's eigenvalues! [ link ] Follow: @theTuringMachine
Pyplane
Very handy software written in Python, with a nice GUI for entering your model and exploring the dynamical system's behavior over time.

[ PyPlane is a free software for phase plane analysis of second order dynamical systems written for PYTHON 3.8 and PyQT5 (compare MATLAB's pplane). It is published under the GNU GENERAL PUBLIC LICENSE Version 3 ]
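
As a rough sketch of what such phase-plane tools do under the hood (my own illustration, not PyPlane's code): classify a fixed point of a linear planar system from the eigenvalues of its matrix.

```python
import numpy as np

# Classify the origin of the linear planar system x' = A x from the eigenvalues of A.
def classify(A):
    lam = np.linalg.eigvals(np.asarray(A, dtype=float))
    if np.iscomplexobj(lam) and np.any(lam.imag != 0):
        re = lam.real[0]
        if np.isclose(re, 0.0):
            return "center"
        return "stable spiral" if re < 0 else "unstable spiral"
    lam = lam.real                     # purely real eigenvalues
    if lam[0] * lam[1] < 0:
        return "saddle"
    return "stable node" if lam.max() < 0 else "unstable node"

print(classify([[0.0, 1.0], [-2.0, -0.5]]))   # stable spiral
print(classify([[1.0, 0.0], [0.0, -1.0]]))    # saddle
```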

[ git ]

Follow: @theTuringMachine
Scientific Visualization – Python & Matplotlib
An open-access book on scientific visualization using Python and Matplotlib, to be released at the end of Summer 2021. Code will be available in this repository; the PDF book will be open access, and the printed book will cost $50.
Author: Nicolas P. Rougier
[ git ]

Follow: @theTuringMachine
On the nature and use of models in network neuroscience

Network theory provides an intuitively appealing framework for studying relationships among interconnected brain mechanisms and their relevance to behavior. As the space of its applications grows, so does the diversity of meanings of the term “network model.” This diversity can cause confusion, complicate efforts to assess model validity and efficacy, and hamper interdisciplinary collaboration. Here we review the field of network neuroscience, focusing on organizing principles that can help overcome these challenges. First, we describe the fundamental goals in constructing network models. Second, we review the most common forms of network models, which can be described parsimoniously along three primary dimensions: from data representations to first-principles theory, from biophysical realism to functional phenomenology, and from elementary descriptions to coarse-grained approximations.
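
To make the "data representation" end of that spectrum concrete, here is a small illustrative sketch (mine, not the paper's): build a functional-connectivity network by correlating regional time series and thresholding the result into an adjacency matrix.

```python
import numpy as np

# Functional-connectivity toy example: correlate regional time series,
# threshold into a binary adjacency matrix, and compute node degree.
rng = np.random.default_rng(1)
n_regions, n_timepoints = 10, 200
signals = rng.standard_normal((n_regions, n_timepoints))
signals[1] += 0.7 * signals[0]              # make two regions genuinely coupled

corr = np.corrcoef(signals)                 # region-by-region correlations
adjacency = (np.abs(corr) > 0.3).astype(int)
np.fill_diagonal(adjacency, 0)              # no self-loops

degree = adjacency.sum(axis=1)              # a simple network-level summary
print(degree)
```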

[ read ]

More: @theTuringMachine
Two post-doctoral positions

1- Full-time postdoc: neural representations of syntactic structures.
Deadline: December 31st, 2021
2- Full-time postdoc: natural language processing and neuroscience.

[ link ]

More: @theTuringMachine
Interpreting encoding and decoding models
Highlights
• Decoding models can reveal whether particular information is present in a brain region in a format the decoder can exploit.

• Encoding models make comprehensive predictions about representational spaces and more strongly constrain computational theory (a schematic encoding/decoding sketch follows these highlights).

• The weights of the fitted linear combinations used in encoding and decoding models are not straightforward to interpret.

• Interpretation of encoding and decoding models critically depends on the level of generalization achieved.

• Many models must be tested and inferentially compared for analyses to drive theoretical progress.
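
A schematic sketch of the encoding/decoding distinction (my own toy example, not the paper's analysis): both directions are fit here with ridge regression on simulated data, and both are evaluated on held-out trials, since generalization is what licenses interpretation.

```python
import numpy as np

# Encoding: stimulus features -> neural responses. Decoding: responses -> features.
rng = np.random.default_rng(0)
n_trials, n_features, n_channels = 200, 5, 20
X = rng.standard_normal((n_trials, n_features))                       # features
W_true = rng.standard_normal((n_features, n_channels))
Y = X @ W_true + 0.5 * rng.standard_normal((n_trials, n_channels))    # responses

def ridge(A, B, alpha=1.0):
    # Solve (A^T A + alpha I) W = A^T B
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ B)

train, test = slice(0, 150), slice(150, None)
W_enc = ridge(X[train], Y[train])      # encoding weights
W_dec = ridge(Y[train], X[train])      # decoding weights

# Held-out predictive performance; the fitted weights themselves remain hard to interpret.
print("encoding r:", np.corrcoef((X[test] @ W_enc).ravel(), Y[test].ravel())[0, 1])
print("decoding r:", np.corrcoef((Y[test] @ W_dec).ravel(), X[test].ravel())[0, 1])
```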

[ read ]

More: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
My implementations of well-known basal ganglia modeling papers, using @briansimulator and @NestSimulator, are available here:

[ Github ]
Forwarded from Complex Systems Studies
Why can spin glasses help us understand neural systems? Because they capture the key complexities related to memory, learning, and hierarchy. It was a revolution. Check this paper by Sompolinsky in Physics Today #sfiscience https://t.co/x6Yq8Sahyo
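
A minimal sketch of that spin-glass/neural-network bridge (my own example): a Hopfield network stores binary patterns in symmetric Hebbian couplings and retrieves them by descending the same kind of energy function studied for spin glasses.

```python
import numpy as np

# Hopfield associative memory: store 3 random binary patterns, corrupt one, retrieve it.
rng = np.random.default_rng(0)
n = 100
patterns = rng.choice([-1, 1], size=(3, n))    # three stored memories ("spin" patterns)
W = patterns.T @ patterns / n                  # Hebbian, symmetric couplings
np.fill_diagonal(W, 0)

state = patterns[0].copy()
state[:20] *= -1                               # flip 20 "spins" to corrupt the memory
for _ in range(10):                            # a few synchronous update sweeps
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with stored memory:", state @ patterns[0] / n)   # ~1.0 on successful recall
```
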
Python Programming And Numerical Methods: A Guide For Engineers And Scientists

This notebook contains an excerpt from Python Programming and Numerical Methods - A Guide for Engineers and Scientists; the content is also available at Berkeley Python Numerical Methods.

[ online-book ]

More: @theTuringMachine
Imagined speech can be decoded from low- and cross-frequency intracranial EEG features

Based on recent theories of speech neural processing, we extracted consistent and specific neural features usable for future brain computer interfaces, and assessed their performance to discriminate speech items in articulatory, phonetic, and vocalic representation spaces. While high-frequency activity provided the best signal for overt speech, both low- and higher-frequency power and local cross-frequency contributed to imagined speech decoding, in particular in phonetic and vocalic, i.e. perceptual, spaces. These findings show that low-frequency power and cross-frequency dynamics contain key information for imagined speech decoding.
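
For intuition about the kind of features involved, here is a small illustrative sketch (not the paper's pipeline): compute low- and high-frequency band power from a synthetic iEEG-like signal with scipy; such band powers are what a decoder would typically be trained on.

```python
import numpy as np
from scipy.signal import welch

# Band-power features from a synthetic signal containing a low-frequency
# rhythm, a high-gamma-like component, and noise.
fs = 1000                                            # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * 8 * t)                  # 8 Hz component
          + 0.3 * np.sin(2 * np.pi * 80 * t)         # 80 Hz component
          + 0.5 * rng.standard_normal(t.size))

freqs, psd = welch(signal, fs=fs, nperseg=512)       # power spectral density
low_power = psd[(freqs >= 4) & (freqs <= 12)].mean()     # low-frequency band
high_power = psd[(freqs >= 70) & (freqs <= 150)].mean()  # high-frequency band
print("low:", low_power, "high:", high_power)
```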

[ doi ]

More: @theTuringMachine
Mice alternate between discrete strategies during perceptual decision-making

Here we present new analyses suggesting that this common view is incorrect. We analyzed data from mouse and human decision-making experiments and found that choice behavior relies on an interplay among multiple interleaved strategies. These strategies, characterized by states in a hidden Markov model, persist for tens to hundreds of trials before switching, and often switch multiple times within a session. The identified decision-making strategies were highly consistent across mice and comprised a single ‘engaged’ state, in which decisions relied heavily on the sensory stimulus, and several biased states in which errors frequently occurred. These results provide a powerful alternate explanation for ‘lapses’ often observed in rodent behavioral experiments, and suggest that standard measures of performance mask the presence of major changes in strategy across trials.
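
A toy simulation of the idea (mine, not the paper's model): a two-state Markov chain switches between an "engaged" strategy, in which choices follow the stimulus, and a "biased" strategy, in which they mostly do not, producing the lapse-like errors described above.

```python
import numpy as np

# Two latent strategies switching via a Markov chain; accuracy differs sharply by state.
rng = np.random.default_rng(0)
n_trials = 1000
P = np.array([[0.98, 0.02],        # engaged -> engaged / biased
              [0.05, 0.95]])       # biased  -> engaged / biased

stimulus = rng.choice([-1, 1], size=n_trials)
state, states, choices = 0, [], []
for s in stimulus:
    states.append(state)
    p_right = 0.95 if s > 0 else 0.05      # engaged: follow the stimulus
    if state == 1:
        p_right = 0.8                      # biased: mostly choose "right" regardless
    choices.append(1 if rng.random() < p_right else -1)
    state = rng.choice(2, p=P[state])

states, choices = np.array(states), np.array(choices)
correct = choices == stimulus
print("accuracy engaged:", correct[states == 0].mean(),
      "| biased:", correct[states == 1].mean())
```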

[ doi ]

More: @theTuringMachine
Computational Neuroscience Summer School
The course teaches the central ideas, methods, and practices of modern computational neuroscience through a combination of lectures and hands-on project work. During the course’s mornings, distinguished international faculty deliver lectures on topics across the entire breadth of experimental and computational neuroscience.

[ link ]

More: @theTuringMachine
BrainPy is a highly flexible and extensible framework targeting high-performance brain modeling. Among its key ingredients, BrainPy supports:

JIT compilation for class objects.

Numerical solvers for ODEs, SDEs and others (a generic plain-NumPy integration sketch follows after this list).

Dynamics simulation tools for various brain objects, like neurons, synapses, networks, soma, dendrites, channels, and even more.

Dynamics analysis tools for differential equations, including phase plane analysis, bifurcation analysis, linearization analysis, and fixed/slow point finding.

Seamless integration with deep learning models.

And more …
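
As a generic illustration of the kind of single-neuron dynamics such solvers and phase-plane tools are built for (plain NumPy here, deliberately not BrainPy's API), a forward-Euler integration of the FitzHugh-Nagumo model:

```python
import numpy as np

# Forward-Euler integration of the FitzHugh-Nagumo neuron with constant input.
dt, T = 0.01, 200.0
t = np.arange(0.0, T, dt)
v, w = -1.0, 1.0                   # voltage-like and recovery variables
I_ext = 0.5                        # constant input current
trace = np.empty(t.size)

for i in range(t.size):
    dv = v - v**3 / 3 - w + I_ext
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    v, w = v + dt * dv, w + dt * dw
    trace[i] = v

# Count upward threshold crossings as "spikes".
print("spikes:", int(np.sum((trace[1:] > 1.0) & (trace[:-1] <= 1.0))))
```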

[ documentation ] [ github ] [ examples ]

More: @theTuringMachine