Measuring Progress as a PhD Student – Tracking and Tools
Here are some tips that I have found useful in the past to keep momentum going and measure progress...
[ link ]
More: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
This is a revision of the textbook Fundamentals of Numerical Computation by Tobin A. Driscoll and Richard J. Braun. The book was originally written for MATLAB, but this resource has been adapted to suit Julia.
Link
Python implementation of the Kuramoto model
The Kuramoto model is used to study a wide range of systems (for a review of its use in neuroscience, see Breakspear et al. 2010, Generative models of cortical oscillations: neurobiological implications of the Kuramoto model).
[ github ]
Follow: @theTuringMachine
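For readers who want to experiment before diving into the repository, here is a minimal NumPy sketch of the model in its mean-field form (this is not code from the linked repository; population size, coupling strength, and integration settings are illustrative):

```python
import numpy as np

def simulate_kuramoto(n=200, K=5.0, dt=0.01, steps=2000, seed=0):
    """Euler integration of the Kuramoto model
    dtheta_i/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i),
    written in its equivalent mean-field form via the complex
    order parameter z = r * exp(i * psi)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)          # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, n)   # initial phases
    r_history = np.empty(steps)
    for t in range(steps):
        z = np.mean(np.exp(1j * theta))      # order parameter
        theta = theta + dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
        r_history[t] = np.abs(z)             # coherence r in [0, 1]
    return r_history

r = simulate_kuramoto()
print(f"coherence r rose from {r[0]:.2f} to {r[-1]:.2f}")
```

With a coupling K well above the synchronization threshold, the coherence r climbs from near zero (random phases) toward one (phase locking), which is the transition the model is famous for.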
Attractor and integrator networks in the brain
Abstract: In this Review, we describe the singular success of attractor neural network models in describing how the brain maintains persistent activity states for working memory, corrects errors and integrates noisy cues. We consider the mechanisms by which simple and forgetful units can organize to collectively generate dynamics on the long timescales required for such computations. We discuss the myriad potential uses of attractor dynamics for computation in the brain, and showcase notable examples of brain systems in which inherently low-dimensional continuous-attractor dynamics have been concretely and rigorously identified. Thus, it is now possible to conclusively state that the brain constructs and uses such systems for computation...
[ DOI ]
Follow: @theTuringMachine
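The simplest toy illustration of the attractor idea the review discusses is a discrete Hopfield network: a corrupted input relaxes to a stored pattern. A minimal sketch (sizes and seed are illustrative; this is not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 3
patterns = rng.choice([-1, 1], size=(p, n))   # stored binary memories
W = (patterns.T @ patterns) / n               # Hebbian weight matrix
np.fill_diagonal(W, 0)                        # no self-connections

# corrupt the first pattern by flipping 20% of its units
state = patterns[0].copy()
flip = rng.choice(n, size=n // 5, replace=False)
state[flip] *= -1

# asynchronous sign updates relax the state into the nearest attractor
for _ in range(5):
    for i in rng.permutation(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

overlap = (state @ patterns[0]) / n           # 1.0 = perfect recall
print(f"overlap with the stored pattern: {overlap:.2f}")
```

This point-attractor network performs the error correction mentioned in the abstract; the continuous attractors identified in the brain extend the same principle to rings and planes of stable states.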
Forwarded from Complex Systems Studies
“Python for scientific computing” is an online course aimed at improving your scientific Python skills, starting Tuesday 22 November at 10:00 EET (4 days, 3 hours per day). 1 ECTS credit is available if you need it.
More info and registration at:
https://scicomp.aalto.fi/training/scip/python-for-scicomp-2022/
This course isn’t designed to teach Python itself, but if you know some Python, you will get a good introduction to the broader Python for science ecosystem and be able to use the right tools for your work and keep your code in good shape.
PhD and Postdoctoral positions are open in computational neurosciences
————————————————————————-
at the University of Lyon1, in the Stem and Brain Research Institute
(https://sbri.fr/). The project aims at understanding the cellular
and circuit mechanisms responsible for cognitive functions, combining
computational and theoretical aspects of neural modeling as well as
analyses of electrophysiological data. Further details at [ more ].
The positions are available immediately.
Candidates should send their CV and a brief cover letter to Matteo di
Volo: matteo.divolo [ at ] univ.lyon1.fr
Follow: @theTuringMachine
If you do a lot of symbolic calculation (like I do) and don't have access to Mathematica, you might find Sage helpful!
Sage is free, open-source math software that supports research and teaching in algebra, geometry, number theory, cryptography, numerical computation, and related areas. Both the Sage development model and the technology in Sage itself are distinguished by an extremely strong emphasis on openness, community, cooperation, and collaboration: we are building the car, not reinventing the wheel. The overall goal of Sage is to create a viable, free, open-source alternative to Maple, Mathematica, Magma, and MATLAB.
[ link ][ documentation ]
Follow: @theTuringMachine
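Sage's own language is built on Python, and it bundles SymPy among its components. As a taste of the kind of symbolic work this ecosystem handles, here is a small sketch in plain SymPy (not Sage-specific syntax; the expression is just an example):

```python
import sympy as sp

x = sp.symbols('x')
expr = sp.sin(x)**2 * sp.exp(-x)

# symbolic differentiation
derivative = sp.simplify(sp.diff(expr, x))

# a definite integral that would be tedious by hand
integral = sp.integrate(expr, (x, 0, sp.oo))

print("d/dx:", derivative)
print("integral over [0, oo):", integral)    # 2/5
```

Inside Sage the same computation works almost verbatim, and Sage additionally exposes Maxima, GAP, PARI, and other engines through one interface.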
Forwarded from Scientific Programming (Ziaee (he/him))
Modeling Neural Dynamics as a standalone Python package; run the notebooks online on Binder without needing to install any packages.
GitHub
Chapter 3 : The Classical Hodgkin-Huxley ODEs
Chapter 4 : Numerical Solution of the Hodgkin-Huxley ODEs
Chapter 5 : Three Simple Models of Neurons in Rodent Brains
Other chapters will be added ...
GitHub - Ziaeemehr/mndynamics: A python package for An Introduction to Modeling Neuronal Dynamics by Christoph Borgers
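As a companion to Chapters 3 and 4, here is a minimal forward-Euler integration of the classical Hodgkin-Huxley ODEs (this is not code from the package; the rate functions and conductances are the standard squid-axon values, and the input current and step size are illustrative):

```python
import numpy as np

# classical Hodgkin-Huxley parameters (squid axon; mV, ms, uA/cm^2, mS/cm^2)
C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

# voltage-dependent opening/closing rates for gates m, h, n
def a_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * np.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * np.exp(-(V + 65) / 80)

def simulate(I_ext=10.0, dt=0.01, T=100.0):
    """Forward-Euler integration of the HH ODEs; returns the spike count."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting state
    spikes = 0
    for _ in range(int(T / dt)):
        I_ion = (g_Na * m**3 * h * (V - E_Na)
                 + g_K * n**4 * (V - E_K)
                 + g_L * (V - E_L))
        V_new = V + dt * (I_ext - I_ion) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        if V < 0.0 <= V_new:              # upward zero crossing = one spike
            spikes += 1
        V = V_new
    return spikes

n_spikes = simulate()
print(f"{n_spikes} spikes in 100 ms at I = 10 uA/cm^2")
```

A sustained current above rheobase produces tonic firing; the book's later chapters build reduced models from exactly this system.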
Dynamical study of a mean-field model of neural network activity driven by biophysical ion exchange mechanisms
Project description. In neuroscience, the question of scales is central, ranging from the molecule to the whole brain. In theoretical and computational neuroscience, it is possible to model these different scales and to build the link between them...
Main objectives. 1. To identify the different dynamical regimes of this model, in particular in a parameter configuration corresponding to a healthy state; 2. To determine the different timescales involved in this system; (and possibly) 3. To start a bifurcation study for identified biophysical parameters of interest.
AMU Faculty of Medicine, Marseille, France [ website ]
Follow: @theTuringMachine
Applications are invited for three PhD student positions at the University of Bern. The positions are funded by a grant from the Swiss National Science Foundation entitled “Why Spikes?”.
This project aims at answering an almost 100-year-old question in neuroscience: “What are spikes good for?”. Indeed, since the discovery of action potentials by Lord Adrian in 1926, it has remained largely unknown what the benefits of spiking neurons are compared to analog neurons. Traditionally, it has been argued that spikes are good for long-distance communication or for temporally precise computation. However, there is no systematic study that quantitatively compares the communication and computational benefits of spiking neurons with respect to analog neurons. The aim of the project is to systematically quantify the benefits of spiking at various levels.
The PhD students and post-doc will be supervised by Prof. Jean-Pascal Pfister (Theoretical Neuroscience Group, Department of Physiology, University of Bern).
The PhD candidates (resp. post-doc candidate) should hold a Master's (resp. PhD) degree in Physics, Mathematics, Computer Science, Computational Neuroscience, Neuroscience, or a related field. Candidates should have a keen interest in developing theories that can be tested experimentally. Preference will be given to candidates with strong mathematical and programming skills. Expertise in stochastic dynamical systems, point processes, control theory, and nonlinear Bayesian filtering will be a plus.
Applicants should submit a CV (including contacts of two referees), a statement of research interests, and their Master's marks to Jean-Pascal Pfister (jeanpascal.pfister@unibe.ch).
The position is offered for a period of three years and can be extended. The deadline for application is the 31st of January 2023 or until the position is filled. The salary scale is provided by the Swiss National Science Foundation (http://www.snf.ch/SiteCollectionDocuments/allg_doktorierende_e.pdf).
More: @theTuringMachine
Forwarded from Complex Systems Studies
Interested in scientific research in the field of #ComplexSystems?
IFISC announces the SURF@IFISC2023 summer research grants for undergraduates with the aim of introducing student fellows to cutting-edge research.
Deadline: March 26th
🔗 https://ifisc.uib-csic.es/en/about-ifisc/join-us/surf/surf-2023/
Mathematical Methods in Computational Neuroscience
Don't miss out on this year's school.
The deadline for applications is April 30th at 23:59 AoE. Results will be communicated to applicants by mid-May.
[ more ]
Follow: @theTuringMachine
Mathematical Methods in Computational Neuroscience
Summer school in Eresfjord, Norway (July 8th - 26th, 2024)
On the difficulty of learning chaotic dynamics with RNNs
NeurIPS poster session
Recurrent neural networks (RNNs) are widespread machine learning tools for modeling sequential and time series data. They are notoriously hard to train because their loss gradients, backpropagated in time, tend to saturate or diverge during training. This is known as the exploding and vanishing gradient problem. Previous solutions to this issue either built on rather complicated, purpose-engineered architectures with gated memory buffers, or, more recently, imposed constraints that ensure convergence to a fixed point or restrict (the eigenspectrum of) the recurrence matrix. Such constraints, however, impose severe limitations on the expressivity of the RNN: essential intrinsic dynamics such as multistability or chaos are disabled. This is inherently at odds with the chaotic nature of many, if not most, time series encountered in nature and society...
[ link ][ Poster ][ Paper ]
More: @theTuringMachine
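The exploding/vanishing mechanism described above can be seen directly by multiplying the per-step Jacobians of an RNN through time. A minimal NumPy sketch (network size, weight scales, and horizon are illustrative; this is not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 64, 50

def final_gradient_norm(scale):
    """Norm of the Jacobian dh_T/dh_0 for a tanh RNN h_t = tanh(W h_{t-1}).
    Each step contributes diag(1 - h_t**2) @ W to the chain-rule product,
    so the product shrinks or grows roughly with the spectral radius of W."""
    W = rng.normal(0.0, scale / np.sqrt(n), (n, n))
    h = rng.normal(0.0, 1.0, n)
    J = np.eye(n)
    for _ in range(T):
        h = np.tanh(W @ h)
        J = np.diag(1.0 - h**2) @ W @ J   # backprop-through-time factor
    return np.linalg.norm(J)

vanishing = final_gradient_norm(0.5)  # contractive weights: gradients vanish
exploding = final_gradient_norm(3.0)  # chaotic regime: gradients blow up
print(f"|dh_T/dh_0| after T=50: {vanishing:.2e} (g=0.5) vs {exploding:.2e} (g=3.0)")
```

The contrast of many orders of magnitude between the two regimes is exactly why unconstrained training on long, chaotic time series is hard.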
2023 BRAINART COMPETITION
THE MULTIFACETED BRAIN: ADAPTATION AND DIVERSITY
This year we are holding a BrainArt Competition under the theme "The Multifaceted Brain: Adaptation and Diversity". We are now accepting submissions for the BrainArt Competition! Please use the following form to submit your art. If you do not have access to the form, you may send your submission to ohbm.brainart (at) gmail.com. [ link ]
Follow for more: @theTuringMachine
Geometric constraints on human brain function
The anatomy of the brain necessarily constrains its function, but precisely how remains unclear.
Predictions from neural field theory, an established mathematical framework for modelling large-scale brain activity, suggest that the geometry of the brain may represent a more fundamental constraint on dynamics than complex interregional connectivity... [ more ]
#article
Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
25_Awesome_Python_Scripts.pdf
171.4 KB
A Collection of 25 Awesome Python Scripts (mini projects)
#Python
🔥1
Forwarded from Scientific Programming (Ziaee (he/him))
COMPUTATIONAL PSYCHIATRY COURSE ZURICH
This course is organized by the Translational Neuromodeling Unit (TNU), University of Zurich & ETH Zurich, and is designed to provide MSc and PhD students, scientists, clinicians, and anyone interested in Computational Psychiatry with the necessary toolkit to master challenges in computational psychiatry research.
Pre-requisites: Some background knowledge in neuroscience, neuroimaging, (Bayesian) statistics & probability theory, programming and machine learning is expected. If you lack this background, it is recommended that you prepare for this course.
https://www.translationalneuromodeling.org/cpcourse/
Preparation Resources
Lectures
Lecture Recordings
Tutorials
Reading List
GitHub - computational-psychiatry-course/precourse-preparation
A Brain-Wide Map of Neural Activity during Complex Behaviour
Abstract:
... Here, we report a comprehensive set of recordings from 115 mice in 11 labs performing a decision-making task with sensory, motor, and cognitive components, obtained with 547 Neuropixels probe insertions covering 267 brain areas in the left forebrain and midbrain and the right hindbrain and cerebellum. We provide an initial appraisal of this brain-wide map, assessing how neural activity encodes key task variables....
[ link ]
More: @theTuringMachine
Looking for job opportunities in Neuroscience?
check this out!
https://www.world-wide.org/jobs
More: @theTuringMachine