Forwarded from Shervin Safavi
The Computational Machinery of Cognition (CMC) Lab (PI: Shervin Safavi) will be established at the Technische Universität Dresden in Fall 2023. We will be building the team over the next few months (starting with openings for PhD students and master's students).
Research theme:
We are interested in understanding the computational machinery of cognitive processes (in particular inference and decision processes). We do:
* normative and biophysical modeling of cognitive functions (starting with perceptual decisions);
* testing these models with neural and behavioral data (in collaboration with experimental labs);
* developing methods for multi- and cross-scale analysis of neural data to better capture the neural markers of cognitive processes.
If the research of our lab resonates with yours and you would like to receive future announcements (e.g., calls for PhD positions), please fill out this form (https://docs.google.com/forms/d/e/1FAIpQLSd8V5Mu8d-JwZXjs_Ck5toLl0IBg5pTpTrZs4A_QW-71pi13A/viewform?usp=sf_link)
Also, please feel free to get in touch (shervin.safavi@tuebingen.mpg.de) if you would like to know more and/or to collaborate. It would also be great if you passed this on to colleagues/friends who might be interested.
The International Brain Laboratory will release all data sets within 12 months of collection, or upon acceptance for publication of an associated manuscript, whichever comes first.
[ link ]
More: @theTuringMachine
A freely available short course on neuroscience for people with a machine learning background. Designed by Dan Goodman and Marcus Ghosh.
[ link ]
More: @theTuringMachine
Forwarded from Complex Systems Studies
“Python for Scientific Computing” is an online course aimed at improving your scientific Python skills, starting Tuesday, 7 November at 10:00 EET (4 days, 3 hours per day). 1 ECTS credit is available if you need it.
More info and registration at:
https://scicomp.aalto.fi/training/scip/python-for-scicomp-2023/
Theoretical Neuroscience Podcast
Podcast #1: On models of the mind – with Grace Lindsay
[ link ]
Follow for More: @theTuringMachine
Forwarded from Shervin Safavi
Hi everyone, following up on my previous message: I now have an opening for a computational neuroscience PhD position in my lab. You can find more details on the CMC lab website (which is still under construction, but probably already has some helpful information):
https://shervinsafavi.github.io/cmclab/join/bne_phd_202310/
If you are interested in applying for this position, please follow the instructions in the official call from the university, i.e., upload a cover letter, a brief description of your research interests, and your CV to the university portal.
If you have any questions, please let me know. If you are interested but cannot meet the deadline, please let me know as soon as possible and we'll see if we can figure something out.
Link preview (shervinsafavi.github.io): PhD project on neural events, CMC Lab.
Calculus Made Easy is a free book on calculus originally published in 1910 by Silvanus P. Thompson, considered a classic and elegant introduction to the subject.
[ link ]
More: @theTuringMachine
Calculus and Applications
Free online book on calculus by Vahid Shahrezaei
[ link ]
More: @theTuringMachine
Forwarded from Brain science journal club (Mojtaba Madadi Asl)
Neuronal Modeling Workshop
Part I: Simplified point neurons
This in-person workshop aims to give participants an introduction to various topics related to simulating a network of point neurons. The workshop is designed to be interactive, allowing participants to work on their own computers while receiving guidance from experienced instructors. (A toy illustration of a point-neuron simulation follows this announcement.)
Lecturers:
• Alireza Valizadeh
• Mojtaba Madadi Asl
• Mozhgan Khanjanianpak
• Saeed Taghavi
Schedule: February 21-22, 2024
Venue: Pasargad Institute for Advanced Innovative Solutions (PIAIS), Khatam University, Tehran, Iran
Registration deadline: February 10, 2024
Registration form: www.b2n.ir/NMW-form
Website: www.b2n.ir/NMW-web
For details, please contact mojtabamadadi7@gmail.com
For technical support, contact saeed.taghavi.v@gmail.com
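
For readers curious what simulating a simplified point neuron can look like in practice, here is a minimal, purely illustrative Python sketch (my own toy example, not workshop material): a single leaky integrate-and-fire (LIF) neuron driven by a constant input current, integrated with forward-Euler steps. All parameter values are arbitrary but physiologically plausible.

# Illustrative only (not workshop material): a single leaky integrate-and-fire
# (LIF) point neuron driven by a constant input current, Euler integration.
import numpy as np

dt = 0.1          # time step (ms)
T = 200.0         # total simulated time (ms)
tau_m = 10.0      # membrane time constant (ms)
E_L = -70.0       # resting potential (mV)
V_th = -55.0      # spike threshold (mV)
V_reset = -75.0   # reset potential after a spike (mV)
R_m = 10.0        # membrane resistance (MOhm)
I_ext = 2.0       # constant input current (nA)

t = np.arange(0.0, T, dt)
V = np.full(t.shape, E_L)
spike_times = []

for i in range(1, len(t)):
    # dV/dt = (-(V - E_L) + R_m * I_ext) / tau_m
    dV = (-(V[i - 1] - E_L) + R_m * I_ext) * dt / tau_m
    V[i] = V[i - 1] + dV
    if V[i] >= V_th:           # threshold crossing -> emit a spike and reset
        spike_times.append(t[i])
        V[i] = V_reset

print(f"{len(spike_times)} spikes in {T:.0f} ms "
      f"(~{1000 * len(spike_times) / T:.1f} Hz)")

Coupling many such units through synaptic current terms in the same update equation is the usual route to network simulations of point neurons.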
Forwarded from Complex Systems Studies
"Ambitions for theory in the physics of life" (by William Bialek): https://arxiv.org/abs/2401.15538
[note: Lectures at the 2023 Les Houches Summer School, Theoretical Biophysics]
Theoretical physicists have been fascinated by the phenomena of life for more than a century. As we engage with more realistic descriptions of living systems, however, things get complicated. After reviewing different reactions to this complexity, I explore the optimization of information flow as a potentially general theoretical principle. The primary example is a genetic network guiding development of the fly embryo, but each idea also is illustrated by examples from neural systems. In each case, optimization makes detailed, largely parameter-free predictions that connect quantitatively with experiment.
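
For orientation (my paraphrase, not the notes' wording, and with my own notation c and g): in this framework, "optimization of information flow" typically means choosing the system's input-output relation to maximize the mutual information between an input signal c (e.g., a morphogen concentration) and an output g (e.g., gap-gene expression levels), subject to biophysical constraints such as noise and limited numbers of molecules:

\max_{p(g \mid c)} I(c; g),
\qquad
I(c; g) = \int \mathrm{d}c \, p(c) \int \mathrm{d}g \, p(g \mid c) \, \log_2 \frac{p(g \mid c)}{p(g)} .

Roughly speaking, the "largely parameter-free predictions" arise because, once the constraints are specified, the maximization itself pins down the input-output relation.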
Forwarded from Complex Systems Studies
The #Coxeter Lecture Series will be delivered by 2022 Fields Medallist Hugo Duminil-Copin.
Do NOT miss an opportunity to hear his talks in-person or online!
Register: http://www.fields.utoronto.ca/activities/23-24/Duminil-Copin
Summer School | Advanced tools for data analysis in neuroscience
Advanced tools for data analysis in neuroscience
Research discoveries are increasingly dependent on the development of new tools and technologies, as well as on the ability to process, manage and analyze the large amounts of data collected with these tools....
[ link ]
More: @theTuringMachine
IPython Interactive Computing and Visualization Cookbook, Second Edition (2018), by Cyrille Rossant, contains over 100 hands-on recipes on high-performance numerical computing and data science in the Jupyter Notebook.
[ link ]
More: @theTuringMachine
Forwarded from Scientific Programming (SCI_dev(he/him))
Datasets for machine learning typically contain a large number of features, but such high-dimensional feature spaces are not always helpful.
In general, not all features are equally important: certain features account for a large fraction of the variance in the dataset. Dimensionality reduction algorithms aim to reduce the dimension of the feature space to a fraction of the original number of dimensions. In doing so, the directions of high variance are still retained, but in a transformed feature space. Principal component analysis (PCA) is one of the most popular dimensionality reduction algorithms.
Here's a simple example in Python demonstrating PCA for dimensionality reduction before training a scikit-learn classifier:
Github
You may also want to read more about PCA here.
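
A minimal, self-contained sketch of the same idea (not the linked notebook's code; it uses the Iris toy dataset and a logistic-regression classifier purely for illustration):

# Minimal sketch: PCA as a preprocessing step before a scikit-learn
# classifier, demonstrated on the Iris toy dataset.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Standardize, project onto the top 2 principal components, then classify.
clf = make_pipeline(StandardScaler(), PCA(n_components=2), LogisticRegression())
clf.fit(X_train, y_train)

print("explained variance ratio:", clf.named_steps["pca"].explained_variance_ratio_)
print("test accuracy:", clf.score(X_test, y_test))

Putting the scaler, PCA, and classifier in one pipeline means the projection is learned on the training split only, which avoids leaking test-set information into the components.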
Linked notebook: workshop_ML/pca/classify_use_pca.ipynb (GitHub, Ziaeemehr/workshop_ML).