the Turing Machine
Join me on a journey through Computational Neuroscience topics.
Useful resources, positions and much more!
Get in touch: @nosratullah
Website: nosratullah.github.io
Introduction to Linear Algebra for Applied Machine Learning with Python

Linear algebra is to machine learning as flour is to baking: every machine learning model is built on linear algebra, just as every cake is built on flour. It is not the only ingredient, of course. Machine learning models need vector calculus, probability, and optimization, just as cakes need sugar, eggs, and butter. Applied machine learning, like baking, is essentially about combining these mathematical ingredients in clever ways to create useful models.
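
Purely to illustrate the point (this is my own toy sketch, not code from the tutorial), a linear regression model is nothing more than a matrix-vector product plus a least-squares solve in NumPy; the data below is made up.

import numpy as np

# Toy dataset: 5 samples, 2 features (hypothetical numbers, for illustration only).
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5],
              [4.0, 3.0],
              [5.0, 2.5]])
y = np.array([5.0, 4.5, 7.5, 11.0, 12.0])

# Add a bias column so the model is y ≈ X_aug @ w.
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])

# Ordinary least squares fit.
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)

print("weights (w1, w2, bias):", w)
print("predictions:", X_aug @ w)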


[ link ]

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
I have made an interactive dashboard for Brian.
Let's take a look.

[ GitHub ]
Reaction-Diffusion Tutorial

Author: Karl Sims

A simulation of two virtual chemicals reacting and diffusing on a 2D grid using the Gray-Scott model.
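
To give a feel for what the tutorial builds, here is a minimal NumPy sketch of the Gray-Scott update step with periodic boundaries. The diffusion, feed and kill parameters below are typical example values, not ones taken from Karl Sims' page.

import numpy as np

def laplacian(Z):
    """Five-point Laplacian with periodic (wrap-around) boundaries."""
    return (np.roll(Z, 1, axis=0) + np.roll(Z, -1, axis=0)
            + np.roll(Z, 1, axis=1) + np.roll(Z, -1, axis=1) - 4.0 * Z)

def gray_scott_step(U, V, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    """One explicit Euler step of the Gray-Scott reaction-diffusion model."""
    UVV = U * V * V
    U += dt * (Du * laplacian(U) - UVV + F * (1.0 - U))
    V += dt * (Dv * laplacian(V) + UVV - (F + k) * V)
    return U, V

# Start from U=1, V=0 everywhere, with a small perturbed square in the middle.
n = 128
U = np.ones((n, n))
V = np.zeros((n, n))
U[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50
V[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25

for _ in range(5000):
    U, V = gray_scott_step(U, V)

print("U range:", U.min(), U.max())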

[ link ]

Follow: @theTuringMachine
Case Studies in Neural Data Analysis
Author: Mark Kramer and Uri Eden

This repository is a companion to the textbook Case Studies in Neural Data Analysis, by Mark Kramer and Uri Eden. That textbook uses MATLAB to analyze examples of neuronal data. The material here is similar, except that we use Python.
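
As a small taste of the kind of analysis the book does in Python (this example is mine, not one of the book's case studies), here is a power-spectrum estimate of a synthetic LFP-like signal with SciPy.

import numpy as np
from scipy.signal import welch

# Synthetic "LFP-like" trace: a 10 Hz rhythm plus noise (made-up data).
fs = 1000.0                     # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)    # 10 seconds of data
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# Welch power spectral density estimate.
f, Pxx = welch(x, fs=fs, nperseg=1024)

peak = f[np.argmax(Pxx)]
print(f"spectral peak near {peak:.1f} Hz")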

[ link ] [ git ] [ book ]

Follow: @theTuringMachine
Simulating functional connectivity in a next-generation neural mass model of the brain
- Michael Forrester

[ link ]

Follow: @theTuringMachine
HNP VR & Robotics: Call for Project Proposal (internship)
Dear all,

The Fondation Campus Biotech Geneva (FCBG) is opening 1-2 MSc intern positions in 2022 in its Virtual Reality and Robotics (VR & Robotics) Facility of the Human Neuroscience Platform (HNP). The goal is to provide opportunities to develop new functionalities for the Facility, in close collaboration with research projects.

Researchers who have a project requiring VR, Robotics developments or 3D simulation that may be of interest for the larger community are encouraged to apply.


Please send your application before October 3rd, 2021; the contact for this call is vr@fcbg.ch.

Follow: @theTuringMachine
HOW TO STAND OUT WITH YOUR GITHUB PROFILE

GitHub recently released a new feature that is still quite hidden, but that can really help you stand out when you're searching for work as a developer. You can now create a README file that is featured front and center on your GitHub profile. Your personal documentation, if you will.

[ more ]

#spare_time

Follow: @theTuringMachine
Data-driven discovery is revolutionizing the modeling, prediction, and control of complex systems. This textbook brings together machine learning, engineering mathematics, and mathematical physics to integrate modeling and control of dynamical systems with modern methods in data science. It highlights many of the recent advances in scientific computing that enable data-driven methods to be applied to a diverse range of complex systems, such as turbulence, the brain, climate, epidemiology, finance, robotics, and autonomy. Aimed at advanced undergraduate and beginning graduate students in the engineering and physical sciences, the text presents a range of topics and methods from introductory to state of the art.
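
As a flavour of the data-driven methods covered (my own toy sketch, not code from the book), here is a truncated SVD, i.e. a low-rank approximation of a snapshot matrix, in NumPy.

import numpy as np

# Toy "snapshot" matrix: columns are hypothetical measurements of a system
# over time (made-up numbers, purely for illustration).
rng = np.random.default_rng(0)
true_modes = rng.standard_normal((100, 3))   # 3 underlying spatial modes
coeffs = rng.standard_normal((3, 50))        # their amplitudes over 50 snapshots
X = true_modes @ coeffs + 0.01 * rng.standard_normal((100, 50))

# Truncated SVD: keep the r dominant modes.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
r = 3
X_r = U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]

rel_err = np.linalg.norm(X - X_r) / np.linalg.norm(X)
print(f"rank-{r} reconstruction error: {rel_err:.4f}")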

Authors: [ Steven L. Brunton and J. Nathan Kutz ]

[ online book and codes ]

Follow: @theTuringMachine
Whatever Happened to Solid State Physics?
by John J. Hopfield

Subfields of physics are born, expand, and develop in intellectual scope, then can spawn new offspring by subdividing, can disappear by being absorbed in new definitions of the fields of physics, or may merely decline in vigor and membership. Textbooks, seminar programs, graduate courses, and the chosen structure of industrial laboratories all contributed to making solid state physics a vibrant subfield for 30 years, to ultimately disappear into regroupings with names such as condensed matter, materials science, biological physics, complexity, and quantum optics. This review traces the trajectory of the subfield solid state physics through the experiences of the author in relationship to major university departments and Bell Labs, with digressions into how he became a physicist, physics education, and choosing research problems.

[ read ]

#spare_time
Follow: @theTuringMachine
A Digital Signal Processing Short Summary

Modern digital signal processing makes use of a variety of mathematical techniques. These techniques are used to design and understand efficient filters for data processing and control. In an accelerator environment, these techniques often include statistics, one-dimensional and multidimensional transformations, and complex function theory. The basic mathematical concepts are presented in four sessions including a treatment of the harmonic oscillator, a topic that is necessary for the afternoon exercise sessions.
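
As one concrete instance of the filter design mentioned above (my own sketch, with arbitrary parameters rather than ones from the notes), here is a low-pass Butterworth filter applied to a noisy test signal with SciPy.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                  # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)
# A 5 Hz signal buried in 50 Hz interference (made-up test signal).
x = np.sin(2 * np.pi * 5 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)

# 4th-order low-pass Butterworth filter with a 20 Hz cutoff.
b, a = butter(N=4, Wn=20, btype="low", fs=fs)

# Zero-phase filtering (forward and backward) to avoid phase distortion.
y = filtfilt(b, a, x)

# The 50 Hz component should be strongly attenuated after filtering.
print("residual 50 Hz amplitude:", np.abs(np.fft.rfft(y))[50] * 2 / t.size)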

[ pdf ]

Follow: @theTuringMachine
Ready for school?

Models of the Neuron
——————————-
This course discusses single neuron modeling, including molecular models of channels and channel gating, Hodgkin-Huxley style models of membrane currents, non-linear dynamics as a way of understanding membrane excitability, neural integration through cable theory, and network computation. The goals of the course are to understand how neurons work as biological computing elements and to give students experience with modeling techniques as applied to complex biological systems.
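
For a taste of single neuron modeling in code (a much simpler model than Hodgkin-Huxley, and not course material), here is a leaky integrate-and-fire sketch with common textbook parameter values.

import numpy as np

# Leaky integrate-and-fire neuron, forward-Euler integration.
# Parameters are common textbook values, not taken from the course.
tau_m = 20.0      # membrane time constant (ms)
V_rest = -70.0    # resting potential (mV)
V_th = -54.0      # spike threshold (mV)
V_reset = -80.0   # reset potential (mV)
R_m = 10.0        # membrane resistance (MOhm)
I_ext = 1.8       # constant input current (nA)

dt = 0.1          # time step (ms)
T = 500.0         # total simulation time (ms)

V = V_rest
spike_times = []
for i in range(int(T / dt)):
    # dV/dt = (-(V - V_rest) + R_m * I_ext) / tau_m
    V += dt * (-(V - V_rest) + R_m * I_ext) / tau_m
    if V >= V_th:
        spike_times.append(i * dt)
        V = V_reset

print(f"{len(spike_times)} spikes in {T:.0f} ms "
      f"({1000 * len(spike_times) / T:.1f} Hz)")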

[ link ]

#courses

Follow: @theTuringMachine
Forwarded from Scientific Programming (Ziaee (he/him))
Open #PhD #position in mathematical neuroscience in Berlin
Dear friends and colleagues,

We are looking for a PhD candidate in mathematical neuroscience on the topic "Dynamics and variability of structured spiking neural networks". Although the focus is on the theory side, the project also includes the analysis of neuronal population data and associated problems of data assimilation. Methods will be developed within the frameworks of stochastic processes, statistical physics and nonlinear dynamics.

The PhD position in my group will be part of the vibrant computational neuroscience community at the Bernstein Center for Computational Neuroscience Berlin and the Institute of Mathematics of TU Berlin.

The successful candidate should have a degree in mathematics or physics, a keen interest in computational neuroscience, expertise in analytical calculations, programming skills (C++ or C, Python or Julia, LaTeX), an excellent command of the English language, and good communication skills.

Funding is provided for five years. Applications, including a letter of motivation, a CV, a current copy of academic transcripts, and a list of at least two potential referees, should be sent by email to me:

schwalger@math.tu-berlin.de

The deadline for applications is October 15th, 2021; however, later applications might also be considered. More information is available at:

https://tub.stellenticket.de/en/offers/106202/?locale=en

Kind regards,

Tilo Schwalger
The physics of higher-order interactions in complex systems

Complex networks have become the main paradigm for modelling the dynamics of interacting systems. However, networks are intrinsically limited to describing pairwise interactions, whereas real-world systems are often characterized by higher-order interactions involving groups of three or more units. Higher-order structures, such as hypergraphs and simplicial complexes, are therefore a better tool to map the real organization of many social, biological and man-made systems. Here, we highlight recent evidence of collective behaviours induced by higher-order interactions, and we outline three key challenges for the physics of higher-order systems.
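
To make the hypergraph idea concrete (a toy sketch of my own, not from the paper), here is how a group interaction differs from its pairwise projection.

from itertools import combinations

# A toy hypergraph: a hyperedge may contain more than two nodes, e.g. a
# three-person conversation is a single group interaction.
# (Node labels and hyperedges are invented for illustration.)
hyperedges = [
    {"a", "b"},            # ordinary pairwise link
    {"a", "c", "d"},       # genuine 3-body interaction
    {"b", "c", "d", "e"},  # 4-body interaction
]

# Projecting onto a pairwise graph loses the group structure:
# every hyperedge is flattened into its pairs.
pairwise_edges = set()
for e in hyperedges:
    for u, v in combinations(sorted(e), 2):
        pairwise_edges.add((u, v))

# Hyperdegree: number of hyperedges a node belongs to.
nodes = sorted(set().union(*hyperedges))
hyperdegree = {n: sum(n in e for e in hyperedges) for n in nodes}

print("pairwise projection:", sorted(pairwise_edges))
print("hyperdegrees:", hyperdegree)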

[ read ]

Follow: @theTuringMachine
Recorded talks of Bernstein Conference 2021
All the recorded talks from the Bernstein Conference 2021 can be found in the G-Node repository.

[ gnode ]

Follow: @theTuringMachine
Visualizing the multi-scale complexity of the brain
The brain is complex over multiple length-scales, from many protein molecules forming intricate nano-machines in a synapse to many neurons forming interconnected networks across the brain. Unraveling this multi-scale complexity is fundamental to our understanding of brain function and disease. In this lecture, I will introduce advances in visualizing the complex, multi-scale structures in the brain...

[ link ] [ zoom ]

#talks

Follow: @theTuringMachine
Introduction to Neural Computation

Course Denoscription:
This course introduces quantitative approaches to understanding brain and cognitive functions. Topics include mathematical denoscription of neurons, the response of neurons to sensory stimuli, simple neuronal networks, statistical inference and decision making. It also covers foundational quantitative tools of data analysis in neuroscience: correlation, convolution, spectral analysis, principal components analysis, and mathematical concepts including simple differential equations and linear algebra.
Instructors: Prof. Michale Fee | Daniel Zysman
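
As a small example in the spirit of the quantitative tools listed above (made-up data, not course material), here is principal components analysis via the SVD in NumPy.

import numpy as np

# Made-up 2D data with one dominant direction.
rng = np.random.default_rng(1)
latent = rng.standard_normal(200)
data = np.column_stack([latent, 0.3 * latent + 0.1 * rng.standard_normal(200)])

# PCA via the SVD of the mean-centered data matrix.
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

explained = S**2 / np.sum(S**2)
print("principal directions (rows):\n", Vt)
print("fraction of variance explained:", explained)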

[ link ]

#courses

More: @theTuringMachine