School of AI – Telegram
School of AI
10.6K subscribers
290 photos
94 videos
11 files
612 links
Our goal in this small community is to teach and promote artificial intelligence and to raise the level of awareness and expertise around it.
May we, hand in hand, build the future of this industry in our homeland, Iran.

https://www.aparat.com/v/Pmrs8
NVIDIA Jarvis is an application framework for multimodal conversational AI services that delivers real-time performance on GPUs.

https://developer.nvidia.com/nvidia-jarvis
Tone Transfer: using machine learning to transform everyday sounds into musical instruments across a variety of styles, from Baroque to jazz. Created by the Magenta and AIUX teams within Google Research.

https://sites.research.google/tonetransfer


Intro Video:
https://youtu.be/bXBliLjImio

Blog post:
https://magenta.tensorflow.org/ddsp

Colab:
https://colab.research.google.com/github/magenta/ddsp/blob/master/ddsp/colab/demos/timbre_transfer.ipynb
https://github.com/magenta/ddsp/tree/master/ddsp/colab/tutorials

Github:
https://github.com/magenta/ddsp
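The core idea behind DDSP is to put classic, differentiable DSP components, such as an additive harmonic synthesizer, inside the model, so the network predicts synthesizer controls rather than raw audio. A minimal pure-Python sketch of the additive-synthesis step (function and parameter names are illustrative, not the ddsp library's API):

```python
import math

def harmonic_synth(f0, harmonic_amps, sample_rate=16000, n_samples=16000):
    """Additive synthesis: a sum of sinusoids at integer multiples of f0.

    f0: fundamental frequency in Hz (held constant here for simplicity;
        in DDSP a neural network predicts it per-frame, along with the
        harmonic amplitudes).
    harmonic_amps: amplitude of each harmonic.
    """
    audio = []
    for n in range(n_samples):
        t = n / sample_rate
        sample = sum(
            amp * math.sin(2 * math.pi * (k + 1) * f0 * t)
            for k, amp in enumerate(harmonic_amps)
        )
        audio.append(sample)
    return audio

# A 220 Hz tone with three decaying harmonics (0.1 s of audio).
audio = harmonic_synth(220.0, [1.0, 0.5, 0.25], n_samples=1600)
```

Because every operation here is differentiable, gradients from an audio loss can flow back through the synthesizer into the network that predicts its controls.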
Mastering TensorFlow Tensors in 5 Easy Steps | by Orhan G. Yalçın | Oct, 2020 | Towards Data Science
https://towardsdatascience.com/mastering-tensorflow-tensors-in-5-easy-steps-35f21998bb86
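A quick taste of the basics the article walks through: creating a tensor, inspecting rank and shape, and a few elementwise and reduction ops.

```python
import tensorflow as tf

# Create a rank-2 tensor (matrix) from a nested Python list.
x = tf.constant([[1, 2], [3, 4]])

# Inspect its properties.
rank = tf.rank(x).numpy()         # 2
shape = x.shape                   # (2, 2)

# Basic ops: reshape, elementwise math, reduction.
flat = tf.reshape(x, [-1])        # [1, 2, 3, 4]
doubled = x * 2                   # [[2, 4], [6, 8]]
total = tf.reduce_sum(x).numpy()  # 10
```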
10 must know awesome Python 3.9 features
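Two of the headline additions, as a quick sketch: the dict merge/update operators (PEP 584) and the new string prefix/suffix removal methods (PEP 616). Requires Python 3.9+.

```python
# PEP 584: merge (|) and in-place update (|=) operators for dicts.
# The | operator returns a new dict; the right operand wins on key clashes.
defaults = {"lr": 1e-3, "epochs": 10}
overrides = {"epochs": 50}
config = defaults | overrides        # {'lr': 0.001, 'epochs': 50}

# PEP 616: str.removeprefix / str.removesuffix
# (no more error-prone slicing like name[:-3]).
name = "model_weights.h5".removesuffix(".h5")   # 'model_weights'
```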
Machine learning articles on arXiv now have a Code tab to link official and community code with the paper

https://medium.com/paperswithcode/papers-with-code-partners-with-arxiv-ecc362883167
Forwarded from Tensorflow(@CVision) (Alireza Akhavan)
Every time you open or refresh this site, it shows you a new face. The interesting part is that these faces do not actually exist; they are generated by GANs...
https://thispersondoesnotexist.com/
Forwarded from Tensorflow(@CVision) (Vahid Reza Khazaie)
AdaBelief Optimizer: as fast as Adam, generalizes as well as SGD, and sufficiently stable to train GANs. (NeurIPS 2020)

Project page: https://juntang-zhuang.github.io/adabelief/
Paper: https://arxiv.org/abs/2010.07468
GitHub: https://github.com/juntang-zhuang/Adabelief-Optimizer
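The key change from Adam is the second-moment term: AdaBelief tracks the variance of the gradient around its EMA prediction, (g_t - m_t)^2, rather than the raw g_t^2, stepping boldly when the gradient matches its "belief" and cautiously when it doesn't. A minimal scalar sketch of the update rule (pure Python, purely illustrative; use the packaged optimizer from the repo above in practice):

```python
import math

def adabelief_step(theta, grad, m, s, t, lr=1e-2,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBelief update for a single scalar parameter."""
    m = beta1 * m + (1 - beta1) * grad
    # The "belief" term: deviation of the gradient from its EMA prediction,
    # instead of Adam's raw grad ** 2.
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2 + eps
    m_hat = m / (1 - beta1 ** t)   # bias correction, as in Adam
    s_hat = s / (1 - beta2 ** t)
    theta -= lr * m_hat / (math.sqrt(s_hat) + eps)
    return theta, m, s

# A few steps on f(x) = (x - 3)^2, starting from x = 0.
x, m, s = 0.0, 0.0, 0.0
for t in range(1, 11):
    grad = 2 * (x - 3)
    x, m, s = adabelief_step(x, grad, m, s, t)
```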
Recent developments in neuroscience and the successes of artificial neural networks have reinvigorated interest in whether backpropagation offers insights for understanding learning in the cortex.

https://pubmed.ncbi.nlm.nih.gov/32303713/
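For reference, the algorithm under discussion is just the chain rule applied layer by layer, with forward activations cached so each weight's gradient falls out of one backward pass. A minimal scalar sketch (tiny tanh network, squared error; all names illustrative):

```python
import math

def forward(x, w1, w2):
    """Tiny 2-layer network: y = w2 * tanh(w1 * x)."""
    h = math.tanh(w1 * x)   # hidden activation, cached for the backward pass
    y = w2 * h
    return h, y

def backward(x, h, y, target, w2):
    """Chain rule, applied from the loss back to each weight."""
    dL_dy = 2 * (y - target)            # d/dy of (y - target)^2
    dL_dw2 = dL_dy * h                  # since y = w2 * h
    dL_dh = dL_dy * w2
    dL_dw1 = dL_dh * (1 - h ** 2) * x   # tanh'(z) = 1 - tanh(z)^2
    return dL_dw1, dL_dw2

# One gradient-descent step.
x, target = 1.0, 0.5
w1, w2 = 0.3, 0.7
h, y = forward(x, w1, w2)
g1, g2 = backward(x, h, y, target, w2)
w1, w2 = w1 - 0.1 * g1, w2 - 0.1 * g2
```

The cortical-plausibility debate centers on exactly this structure: the backward pass reuses the forward weights (here w2 appears in dL_dh), a symmetry biological circuits have no obvious way to enforce.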
Performer: a generalized attention framework based on the Transformer architecture that implements the novel FAVOR+ algorithm to provide linearly scalable, low-variance, and unbiased estimation of attention mechanisms.

https://ai.googleblog.com/2020/10/rethinking-attention-with-performers.html?m=1
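The trick that makes the attention linear: write the attention weights as a kernel phi(q) . phi(k), then use associativity to compute phi(Q) (phi(K)^T V) in O(n) instead of materializing the n x n attention matrix. A pure-Python sketch of that factorization, with a toy elementwise-exp feature map standing in for FAVOR+'s positive random features (which is what actually gives an unbiased softmax estimate):

```python
import math

def phi(x):
    """Toy positive feature map; FAVOR+ uses random features instead."""
    return [math.exp(v) for v in x]

def linear_attention(Q, K, V):
    """out_i = phi(q_i) . S / (phi(q_i) . z),
    where S = sum_j phi(k_j) v_j^T and z = sum_j phi(k_j).
    One pass over K, V, then one pass over Q: linear in sequence length.
    """
    r = len(phi(K[0]))        # feature dimension
    d = len(V[0])             # value dimension
    S = [[0.0] * d for _ in range(r)]
    z = [0.0] * r
    for k_j, v_j in zip(K, V):
        pk = phi(k_j)
        for a in range(r):
            z[a] += pk[a]
            for b in range(d):
                S[a][b] += pk[a] * v_j[b]
    out = []
    for q_i in Q:
        pq = phi(q_i)
        denom = sum(pq[a] * z[a] for a in range(r))
        out.append([sum(pq[a] * S[a][b] for a in range(r)) / denom
                    for b in range(d)])
    return out

# Tiny example: 2 positions, 2-dim keys/queries, 1-dim values.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0], [2.0]]
out = linear_attention(Q, K, V)
```

Since the weights are positive and normalized, each output is still a convex combination of the values; the factorization changes the cost, not the result, for a given kernel.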