translation operator (random input, fixed shift)
texture NCA at Chaos Communication Camp (day 1). WGSL compute -> Vulkan on a Jetson Nano; mapping is done in the fragment shader stage
NCA particle simulation
https://transdimensional.xyz/projects/neural_ca/index.html
(github: https://github.com/PWhiddy/Growing-Neural-Cellular-Automata-Pytorch)
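Since the repo above comes up a lot here, a quick sketch of the kind of update rule it trains: perception via fixed identity/Sobel kernels, a per-cell 1x1-conv MLP, and stochastic cell updates. This is a hedged reconstruction of the standard Growing-NCA step in PyTorch, not the repo's exact code; the class name, channel counts and hyperparameters are mine.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NCAStep(nn.Module):
    """Hedged sketch of one Growing-NCA update step (not the repo's exact code)."""
    def __init__(self, channels: int = 16, hidden: int = 128):
        super().__init__()
        self.channels = channels
        ident = torch.tensor([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]])
        sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]) / 8.0
        # identity + Sobel-x + Sobel-y applied per channel -> depthwise "perception" conv
        kernels = torch.stack([ident, sobel_x, sobel_x.t()])           # [3, 3, 3]
        kernels = kernels.repeat(channels, 1, 1).unsqueeze(1)          # [3*C, 1, 3, 3]
        self.register_buffer("kernels", kernels)
        # the 1x1-conv stage: a tiny MLP applied independently at every cell
        self.update = nn.Sequential(
            nn.Conv2d(3 * channels, hidden, kernel_size=1),
            nn.ReLU(),
            nn.Conv2d(hidden, channels, kernel_size=1, bias=False),
        )
        nn.init.zeros_(self.update[2].weight)  # start as a do-nothing update

    def forward(self, x: torch.Tensor, fire_rate: float = 0.5) -> torch.Tensor:
        # x: [B, C, H, W] grid of cell states
        perception = F.conv2d(x, self.kernels, padding=1, groups=self.channels)
        dx = self.update(perception)
        # stochastic update: each cell fires independently with probability fire_rate
        fire = (torch.rand_like(x[:, :1]) < fire_rate).to(x.dtype)
        return x + dx * fire
```

A rollout just applies this step repeatedly to a mostly empty grid with one seeded cell, and the weights are trained by backprop through the unrolled steps.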
representing a radiance field as 3D Gaussians
https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/
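For context on what the method actually optimizes, as I understand the paper: the scene is a set of 3D Gaussians with learned opacities o_i and view-dependent colors c_i, and a pixel p is rendered by projecting each Gaussian to screen space and alpha-compositing front to back (notation mine, roughly the paper's):

C = \sum_{i} c_i \, \alpha_i \prod_{j=1}^{i-1} (1 - \alpha_j),
\qquad
\alpha_i = o_i \, \exp\!\left(-\tfrac{1}{2}\,(p - \mu_i)^{\top} {\Sigma'_i}^{-1} (p - \mu_i)\right)

where \mu_i and \Sigma'_i are the mean and covariance of Gaussian i after projection to 2D. Positions, covariances, opacities and color coefficients are all optimized directly against the training images through this compositing step.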
==========================
swiss.gl ⛰👾📽 intro
==========================
This library, http://swiss.gl, looks like an interesting model-viewing tool for Colab (example: the last cell here), and overall should be handy in many situations.
A video tutorial using this library has just been released: https://www.youtube.com/watch?v=jKt1Jq4FMRg
demo from the tutorial: https://google.github.io/swissgl/demo/criticality.html
source code: https://github.com/google/swissgl/blob/main/demo/criticality.html
GitHub
swissgl/demo/criticality.html at main · google/swissgl
SwissGL is a minimalistic wrapper on top of WebGL2 JS API. It's designed to reduce the amount of boilerplate code required to manage GLSL shaders, textures and framebuffers when making proc...
edge cognition
While refactoring the NCA to use WebGPU arrays, I've encountered a strange performance problem: in the 1x1 conv stage, where there are 12*48 cycles to compute the state update, my machine initially hung after ~50 cycles. It turned out that accessing weights that were…
Apparently, it was a bug in Tint (Google's WGSL compiler)
https://github.com/gpuweb/gpuweb/discussions/4281
thanks @munrocket for reporting it!
GitHub
Why `const` is slower than `var<private>` for arrays · gpuweb/gpuweb · Discussion #4281
This is checked across different devices/OSs and is probably not an implementation issue. Line 37 here explains the issue: https://compute.toys/view/359 Try to change B/W top-level declaration / usage in cod...
revisiting differentiable reaction-diffusion + deep dream on CLIP embeddings
https://compute.toys/view/462
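To unpack the title a bit (my reading of it, not the shader's actual code): "differentiable reaction-diffusion" here means unrolling a reaction-diffusion simulation in an autodiff framework, so that an image loss — presumably a CLIP-embedding similarity, deep-dream style — can be backpropagated into the system's parameters. A minimal hedged sketch in PyTorch, using a Gray-Scott system with learnable feed/kill:

```python
import torch
import torch.nn.functional as F

# 3x3 discrete Laplacian kernel for the diffusion term
LAPLACIAN = torch.tensor([[0.05, 0.20, 0.05],
                          [0.20, -1.0, 0.20],
                          [0.05, 0.20, 0.05]]).view(1, 1, 3, 3)

def gray_scott_step(u, v, feed, kill, du=0.16, dv=0.08, dt=1.0):
    # u, v: [1, 1, H, W] concentration fields; feed/kill can require gradients
    lap_u = F.conv2d(u, LAPLACIAN, padding=1)
    lap_v = F.conv2d(v, LAPLACIAN, padding=1)
    uvv = u * v * v
    u_next = u + dt * (du * lap_u - uvv + feed * (1.0 - u))
    v_next = v + dt * (dv * lap_v + uvv - (feed + kill) * v)
    return u_next, v_next
```

Unroll a few hundred of these steps, render one of the fields to an image, score it with the chosen loss, and let the optimizer adjust feed/kill (or a whole per-pixel parameter field).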
Forwarded from descent
https://antimatter15.com/splat/
somebody implemented 3D Gaussian splatting in WebGL
https://arxiv.org/abs/2006.04439
An interesting model, similar to neural cellular automata in many ways. And it's doing useful things :)
arXiv.org
Liquid Time-constant Networks
We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear...
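For reference, the hidden-state ODE that gives the model its name, as I remember it from the paper (notation approximate):

\frac{dx(t)}{dt} = -\left[\frac{1}{\tau} + f\big(x(t), I(t), t, \theta\big)\right] \odot x(t) + f\big(x(t), I(t), t, \theta\big) \odot A

Here f is a small neural network, I(t) the input, and \tau, A learned vectors. Because f multiplies x(t), the effective time constant of each unit depends on the current input and state — hence "liquid".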
a video lecture on this topic: https://www.youtube.com/watch?v=IlliqYiRhMU
YouTube
Liquid Neural Networks
Ramin Hasani, MIT - intro by Daniela Rus, MIT
Abstract: In this talk, we will discuss the nuts and bolts of the novel continuous-time neural network models: Liquid Time-Constant (LTC) Networks. Instead of declaring a learning system's dynamics by implicit…