the Turing Machine
Join me on a journey through computational neuroscience topics.
Useful resources, open positions and much more!
Get in touch: @nosratullah
Website: nosratullah.github.io
The Eigenvalue Value (in Neuroscience)
The connectivity of neuronal networks is thought to lie at the heart of their processing abilities. We are starting to glean their complexity by way of connectomics, yet we still lack the tools to understand them, or extrapolate the dynamics they may produce. However, there are powerful mathematical tools that can be used to unveil parts of this structure–function relationship. We present the underlying mathematics needed to understand network structure, and link the theory directly to its neuroscience applications. We start by presenting the archetypal linear dynamical system and show how it can be used to gain an intuition for neuronal activity using the eigenvectors and eigenvalues of the connectivity matrix. We explore how the Schur decomposition and the theory of pseudospectra can offer additional insight and ...
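
As a toy illustration of the eigenvalue intuition (my own sketch, not code from the paper; the connectivity matrix is made up), consider the archetypal linear rate model dx/dt = (W - I)x:

```python
import numpy as np

# Hypothetical 2-neuron connectivity matrix (invented for illustration)
W = np.array([[0.5, 0.4],
              [0.4, 0.5]])

# Linear rate dynamics: dx/dt = (W - I) x
A = W - np.eye(2)
eigvals, eigvecs = np.linalg.eig(A)

# A mode decays iff the real part of its eigenvalue is negative.
# W has eigenvalues 0.9 and 0.1, so A has -0.1 and -0.9: both modes decay,
# but the mode along (1, 1) decays ten times more slowly.
x = np.array([1.0, 0.0])
dt = 0.01
for _ in range(1000):          # integrate to t = 10 with forward Euler
    x = x + dt * (A @ x)

# By t = 10 the fast mode has vanished, so x is nearly proportional to
# the slow eigenvector (1, 1)/sqrt(2): eigenvalues set the timescales,
# eigenvectors set the activity patterns that persist.
print(x)
```
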
[ link ] [ supplementary material ]
More: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
Course High-Performance Computing with Python

The Swiss National Supercomputing Centre is pleased to announce the course High-Performance Computing with Python, which will be held online from June 21 to 23, 2022.

This is a new edition of the course originally developed by Dr. Jan Meinke and Dr. Olav Zimmermann from JSC. The course combines lectures and hands-on sessions. We will show how Python can be used on parallel architectures and how to optimize critical parts of code using various tools.

The main topics that the course will cover are:
• Vectorization with NumPy
• Compiled Python
• Running NumPy-like code on the GPU with CuPy
• Scaling Python workloads to multiple nodes
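
As a small taste of the first topic (my own sketch, not course material), vectorization means pushing an explicit Python loop into NumPy's compiled kernels:

```python
import numpy as np

def norm_loop(xs):
    """Euclidean norm with an explicit Python loop (slow)."""
    total = 0.0
    for v in xs:
        total += v * v
    return total ** 0.5

def norm_vectorized(xs):
    """Same computation done by NumPy's compiled kernels."""
    return np.sqrt(np.sum(xs * xs))

xs = np.arange(1_000_000, dtype=np.float64)
# Both give the same answer; on arrays of this size the vectorized
# version is typically one to two orders of magnitude faster.
assert np.isclose(norm_loop(xs), norm_vectorized(xs))
```
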

The agenda will be shared shortly before the start of the course.

This course is addressed to scientists with a working knowledge of NumPy who wish to explore the productivity gains made possible by Python for HPC.

The lessons will be held every day from 9:00 to 12:00 and from 13:00 to 16:00. Both morning and afternoon sessions will have a 15-minute break.

Instructors
Dr. Rafael Sarmiento (Computational Scientist, CSCS)
Dr. Theofilos Manitaras (Computational Scientist, CSCS)

If you are interested in attending, please register at https://www.cscs.ch/events/private-events/event-detail/high-performance-computing-with-python-5/ . Registration is free of charge. The registration deadline is Sunday, June 12, 2022.

Please note that the workshop can take place only if there are sufficient registrations. The minimum number of participants is eight. Registration for the course will automatically close when we reach the maximum number of participants (30).

Inquiries may be addressed to rafael.sarmiento@cscs.ch.

Best regards
Rafael
Forwarded from the last neural cell (Alexey Timchenko)
Stumbled upon G. Buzsáki's book "The Brain from Inside Out" (2019) after reading "Rhythms of the Brain" (btw, hit 🤔 if you would like to see a summary of the book in this channel).

Some gentlemen have kindly created a thorough document with each chapter of the 2019 book summarized and discussed:

[book club link]

Check it out if you enjoy neuroscience topics such as:
- neural code
- oscillations
- memory coding
- systems/network neuroscience
- relation of action and cognition

#interesting #neuroscience
Forwarded from Scientific Programming (Ziaee (he/him))
Modeling Neural Circuits Made Simple

An updated draft of an undergrad-level computational neuroscience textbook. @mitpress will keep an open-access version available online.

Github
Statistical and dynamical models of brain function
Topics:

1. Introduction: brain hardware
2. Brain-machine interfaces
3. Principles of neural data analysis
4. Analysis of the LFP
5. Decision making
6. Reading the mind: decoding brain functions
7. Modeling brain functions
8. Mean-field approach
...

[ source ]

More: @theTuringMachine
Measuring Progress as a PhD Student – Tracking and Tools
Here are some tips that I have found useful in the past to keep momentum going and measure progress...

[ link ]

More: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
This is a revision of the textbook Fundamentals of Numerical Computation by Tobin A. Driscoll and Richard J. Braun. The book was originally written for MATLAB, but this resource has been adapted to suit Julia.

Link
Python implementation of the Kuramoto model

The Kuramoto model is used to study a wide range of systems (for a review of its use in neuroscience, see Breakspear et al. 2010, Generative models of cortical oscillations: neurobiological implications of the Kuramoto model)
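
For quick reference, a minimal forward-Euler sketch of the model (mine, not the linked repository's code): each oscillator obeys dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i).

```python
import numpy as np

def kuramoto(theta0, omega, K, dt=0.01, steps=5000):
    """Forward-Euler integration of the Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    theta = theta0.copy()
    N = len(theta)
    for _ in range(steps):
        # All pairwise phase differences at once, via broadcasting
        coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta += dt * (omega + (K / N) * coupling)
    return theta

def order_parameter(theta):
    """|r| near 1 means synchrony, near 0 means incoherence."""
    return abs(np.exp(1j * theta).mean())

rng = np.random.default_rng(0)
N = 100
theta0 = rng.uniform(0, 2 * np.pi, N)
omega = rng.normal(0.0, 0.1, N)      # narrow natural-frequency spread

# Coupling above the critical value synchronizes the population,
# while K = 0 leaves the phases incoherent.
r_sync = order_parameter(kuramoto(theta0, omega, K=2.0))
r_free = order_parameter(kuramoto(theta0, omega, K=0.0))
print(r_sync, r_free)   # expect r_sync near 1, r_free much smaller
```
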

[ github ]

Follow: @theTuringMachine
Attractor and integrator networks in the brain

Abstract: In this Review, we describe the singular success of attractor neural network models in describing how the brain maintains persistent activity states for working memory, corrects errors and integrates noisy cues. We consider the mechanisms by which simple and forgetful units can organize to collectively generate dynamics on the long timescales required for such computations. We discuss the myriad potential uses of attractor dynamics for computation in the brain, and showcase notable examples of brain systems in which inherently low-dimensional continuous-attractor dynamics have been concretely and rigorously identified. Thus, it is now possible to conclusively state that the brain constructs and uses such systems for computation...
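
The simplest runnable cousin of the networks the review discusses is a point attractor: a Hopfield network whose persistent states are stored memories. A minimal sketch (mine, not from the review, which focuses on continuous attractors):

```python
import numpy as np

# Hopfield network: patterns stored in a symmetric weight matrix; the
# dynamics settle into the nearest stored pattern, which is exactly the
# "persistent activity state for working memory" idea in discrete form.
rng = np.random.default_rng(1)
N = 64
patterns = rng.choice([-1, 1], size=(2, N))   # two stored memories

# Hebbian weights, no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point is reached."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt 10 bits of the first memory, then let the attractor dynamics
# clean it up (error correction, as in the abstract).
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recovered = recall(probe)
print((recovered == patterns[0]).mean())   # expect 1.0: full recovery
```
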

[ DOI ]

Follow: @theTuringMachine
Forwarded from Complex Systems Studies
“Python for scientific computing”, an online course aimed at improving your scientific Python skills, starting Tuesday 22nd November at 10:00 EET (4 days, 3 hours per day). 1 ECTS credit is available if you need it.

More info and registration at:
https://scicomp.aalto.fi/training/scip/python-for-scicomp-2022/

This course isn’t designed to teach Python itself, but if you know some Python, you will get a good introduction to the broader Python-for-science ecosystem, learn to use the right tools for your work, and keep your code in good shape.
PhD and postdoctoral positions are open in computational neuroscience
————————————————————————-
at the University of Lyon 1, in the Stem Cell and Brain Research Institute (https://sbri.fr/). The project aims at understanding the cellular and circuit mechanisms responsible for cognitive functions, combining computational and theoretical aspects of neural modeling with analyses of electrophysiological data. Further details at [ more ]. The positions are available to start immediately.
Candidates should send their CV and a brief cover letter to Matteo di Volo: matteo.divolo [ at ] univ.lyon1.fr

Follow: @theTuringMachine
If you're using symbolic calculation a lot (like I do) and you don't have access to Mathematica, you might find Sage helpful!

Sage is free, open-source math software that supports research and teaching in algebra, geometry, number theory, cryptography, numerical computation, and related areas. Both the Sage development model and the technology in Sage itself are distinguished by an extremely strong emphasis on openness, community, cooperation, and collaboration: we are building the car, not reinventing the wheel. The overall goal of Sage is to create a viable, free, open-source alternative to Maple, Mathematica, Magma, and MATLAB.
[ link ][ documentation ]
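
Sage speaks Python, and one of the symbolic engines it bundles is SymPy, so plain SymPy gives a runnable taste of the kind of exact calculation involved (my sketch; the expressions are invented examples):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) * sp.exp(-x)

# Exact differentiation: d/dx [sin(x) e^{-x}] = (cos(x) - sin(x)) e^{-x}
assert sp.simplify(sp.diff(f, x) - (sp.cos(x) - sp.sin(x)) * sp.exp(-x)) == 0

# Exact definite integral: integral of sin(x) e^{-x} from 0 to oo is 1/2
assert sp.integrate(f, (x, 0, sp.oo)) == sp.Rational(1, 2)

# Symbolic eigenvalues, e.g. of a small connectivity-like matrix
M = sp.Matrix([[sp.Rational(1, 2), sp.Rational(2, 5)],
               [sp.Rational(2, 5), sp.Rational(1, 2)]])
print(M.eigenvals())   # eigenvalues 9/10 and 1/10, each multiplicity 1
```
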

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
Modeling Neural Dynamics as a standalone Python package; run the notebooks online on Binder without needing to install any packages.

GitHub
Chapter 3 : The Classical Hodgkin-Huxley ODEs
Chapter 4 : Numerical Solution of the Hodgkin-Huxley ODEs
Chapter 5 : Three Simple Models of Neurons in Rodent Brains

Other chapters will be added ...
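
As a taste of Chapters 3–4 (my own minimal sketch, not the book's code), forward-Euler integration of the classical Hodgkin-Huxley ODEs with standard textbook parameters:

```python
import numpy as np

# Classical HH parameters (squid giant axon, standard textbook values)
C_m = 1.0                               # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3      # max conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387  # reversal potentials, mV

def alpha_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * np.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * np.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + np.exp(-(V + 35) / 10))

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    """Forward-Euler integration of the HH ODEs; returns V(t) in mV."""
    steps = int(T / dt)
    V, n, m, h = -65.0, 0.317, 0.053, 0.596   # resting state
    Vs = np.empty(steps)
    for t in range(steps):
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        Vs[t] = V
    return Vs

V = simulate()
# With suprathreshold drive (10 uA/cm^2) the model fires repetitively;
# spike peaks overshoot 0 mV.
print(V.max())
```
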
About a biological ring attractor network
- by Vivek Jayaraman

[ link ]

More: @theTuringMachine
Dynamical study of a mean-field model of neural network activity driven by biophysical ion exchange mechanisms

Project description. In neuroscience, the question of scales is central, ranging from the molecule to the whole brain. In theoretical and computational neuroscience, it is possible to model these different scales and to build the link between them ....


Main objectives. 1. To identify the different dynamical regimes of this model, in particular in a parameter configuration corresponding to a healthy state; 2. To determine the different timescales involved in this system; (and possibly) 3. To start a bifurcation study for identified biophysical parameters of interest.

AMU Faculty of Medicine, Marseille, France [ website ]
Follow: @theTuringMachine
Applications are invited for three PhD student positions at the University of Bern. The positions are funded by a grant from the Swiss National Science Foundation entitled “Why Spikes?”.

This project aims at answering an almost 100-year-old question in neuroscience: “What are spikes good for?” Indeed, since the discovery of action potentials by Lord Adrian in 1926, it has remained largely unknown what the benefits of spiking neurons are compared to analog neurons. Traditionally, it has been argued that spikes are good for long-distance communication or for temporally precise computation. However, there is no systematic study that quantitatively compares the communication and computational benefits of spiking neurons with respect to analog neurons. The aim of the project is to systematically quantify the benefits of spiking at various levels.
 
The PhD students and post-doc will be supervised by Prof. Jean-Pascal Pfister (Theoretical Neuroscience Group, Department of Physiology, University of Bern). 

The PhD candidates (resp. post-doc candidate) should hold a Master's (resp. PhD) degree in Physics, Mathematics, Computer Science, Computational Neuroscience, Neuroscience or a related field. They should have a keen interest in developing theories that can be tested experimentally. Preference will be given to candidates with strong mathematical and programming skills. Expertise in stochastic dynamical systems, point processes, control theory and nonlinear Bayesian filtering will be a plus.

Applicants should submit a CV (including contact details for two referees), a statement of research interests, and their Master's marks to Jean-Pascal Pfister (jeanpascal.pfister@unibe.ch).

The position is offered for a period of three years and can be extended. The application deadline is January 31, 2023, or until the positions are filled. The salary scale is set by the Swiss National Science Foundation (http://www.snf.ch/SiteCollectionDocuments/allg_doktorierende_e.pdf).

More: @theTuringMachine
Forwarded from Complex Systems Studies
Interested in scientific research in the field of #ComplexSystems?

IFISC announces the SURF@IFISC2023 summer research grants for undergraduates with the aim of introducing student fellows to cutting-edge research.

Deadline: March 26th

🔗 https://ifisc.uib-csic.es/en/about-ifisc/join-us/surf/surf-2023/
On the difficulty of learning chaotic dynamics with RNNs

NeurIPS poster session

Recurrent neural networks (RNNs) are widespread machine learning tools for modeling sequential and time-series data. They are notoriously hard to train because their loss gradients, backpropagated in time, tend to saturate or diverge during training. This is known as the exploding and vanishing gradient problem. Previous solutions to this issue either built on rather complicated, purpose-engineered architectures with gated memory buffers, or, more recently, imposed constraints that ensure convergence to a fixed point or restrict (the eigenspectrum of) the recurrence matrix. Such constraints, however, impose severe limitations on the expressivity of the RNN. Essential intrinsic dynamics such as multistability or chaos are disabled. This is inherently at odds with the chaotic nature of many, if not most, time series encountered in nature and society...
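
A quick numerical illustration of the gradient problem (my sketch, not from the paper): in a linear RNN h_t = W h_{t-1}, the backpropagated gradient is multiplied by W transposed once per time step, so its norm scales roughly like the spectral radius of W raised to the number of steps.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 32, 100

def gradient_norm(scale):
    """Norm of a gradient backpropagated through T linear RNN steps.

    For W = scale * G / sqrt(n) with Gaussian G, the spectral radius is
    roughly `scale`, so the norm behaves like scale**T: it vanishes for
    scale < 1 and explodes for scale > 1.
    """
    W = scale * rng.standard_normal((n, n)) / np.sqrt(n)
    g = np.ones(n) / np.sqrt(n)      # gradient arriving at the last step
    for _ in range(T):               # backpropagate through time
        g = W.T @ g
    return np.linalg.norm(g)

g_vanish = gradient_norm(0.5)    # contracting dynamics: gradient vanishes
g_explode = gradient_norm(1.5)   # expanding dynamics: gradient explodes
print(g_vanish, g_explode)
```
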

[ link ][ Poster ][ Paper ]

More: @theTuringMachine
2023 BRAINART COMPETITION

THE MULTIFACETED BRAIN: ADAPTATION AND DIVERSITY

This year we are holding a BrainArt Competition under the theme "The Multifaceted Brain: Adaptation and Diversity". We are now accepting submissions for the BrainArt Competition! Please use the following form to submit your art. If you do not have access to the form, you may send your submission to ohbm.brainart (at) gmail.com. [ link ]

Follow for more: @theTuringMachine