https://arxiv.org/pdf/2302.10197.pdf
I've had the exact same idea about controlling the rotation of the Sobel kernels via a parameter in the hidden state! Nice to see it validated
(btw, has anyone already tried to train a vision model with filters defined as an SDF?)
Defined the Sobel filters in the perception phase as SDFs, so now I can compute convolutions of any size.
Although after 11x11, performance starts to drop a little (I have 8 of them in each agent).
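A minimal WGSL sketch of that idea (my own illustration, not the actual shader; the half-width, channel layout, border handling, and the exact weight formula are assumptions): the kernel is never stored, each tap's weight is evaluated from an analytic Sobel-x-like function, so the window size is just a constant.

const R : i32 = 5;  // half-width: 2*R+1 gives an 11x11 window (assumed)

// analytic Sobel-x-like weight at integer offset (dx, dy); placeholder formula
fn sobel_x_weight(dx : f32, dy : f32) -> f32 {
  if (dx == 0.0 && dy == 0.0) { return 0.0; }  // center tap contributes nothing
  return dx / (dx * dx + dy * dy);             // antisymmetric in x, falls off with distance
}

// perception: convolve one channel of the state texture with the procedural kernel
// (border clamping omitted for brevity)
fn perceive_x(tex : texture_2d<f32>, pos : vec2<i32>, chan : i32) -> f32 {
  var acc = 0.0;
  for (var dy = -R; dy <= R; dy++) {
    for (var dx = -R; dx <= R; dx++) {
      let w = sobel_x_weight(f32(dx), f32(dy));
      acc += w * textureLoad(tex, pos + vec2<i32>(dx, dy), 0)[chan];
    }
  }
  return acc;
}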
edge cognition
WIP on implementing rotation for SDF perception filters.
In my imagination, the SDF for a Sobel-x filter would look something like
x = pow(dot(normalize(vec2(dx, dy)), vec2(0., 1.)), 2.) * 2. * sign(dx)
and the direction of the filter is determined by the rotation of the unit vector.
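Roughly how I'd picture the rotation part, as a sketch (a guess at the idea, not the actual shader; where the unit vector lives in the hidden state is an assumption): the sample offset is rotated into the filter's local frame before evaluating the same weight formula.

// dir is a unit vector, e.g. read from two hidden-state channels (assumed)
fn rotated_sobel_weight(offset : vec2<f32>, dir : vec2<f32>) -> f32 {
  if (all(offset == vec2<f32>(0.0))) { return 0.0; }
  // rotate the sample offset into the filter's local frame
  let local = vec2<f32>(dot(offset, dir), dot(offset, vec2<f32>(-dir.y, dir.x)));
  let n = normalize(local);
  // same shape as the formula above, just evaluated in the rotated frame
  return pow(dot(n, vec2<f32>(0.0, 1.0)), 2.0) * 2.0 * sign(local.x);
}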
Experimenting with different types of channel attention that would actually reduce the amount of computation needed by throwing out most of the input.
The idea is to save on buffer reads at inference by applying some function + gated activation to the internal state, and using that result to determine if we should perceive anything.
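Something like this rough sketch (the 4-channel layout, the gate function, and the GATE_W/GATE_B parameters are all made-up placeholders for whatever would actually be learned):

// stand-ins for learned per-channel gate parameters (assumed)
const GATE_W = vec4<f32>(1.0, 1.0, 1.0, 1.0);
const GATE_B = vec4<f32>(-0.5, -0.5, -0.5, -0.5);

fn perceive_cell(tex : texture_2d<f32>, pos : vec2<i32>, h : vec4<f32>) -> vec4<f32> {
  // cheap gate computed from the cell's own state only: no neighbourhood reads yet
  let gate = step(vec4<f32>(0.0), h * GATE_W + GATE_B);
  var out = array<f32, 4>(0.0, 0.0, 0.0, 0.0);
  for (var c = 0; c < 4; c++) {
    if (gate[c] > 0.0) {
      // only pay for the texture/buffer reads on channels whose gate is open
      out[c] = perceive_x(tex, pos, c);   // e.g. the SDF convolution sketched above
    }
  }
  return vec4<f32>(out[0], out[1], out[2], out[3]);
}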
https://webgpufundamentals.org/webgpu/lessons/webgpu-wgsl-function-reference.html
handy cheat sheet of WGSL functions
translation operator (random input, fixed shift)
Texture NCA at Chaos Communication Camp (day 1). WGSL compute -> Vulkan on a Jetson Nano. Mapping is done in the fragment shader stage.
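For the mapping pass, a minimal fragment shader along these lines (bindings and the choice of which channels are displayed are assumptions, not the demo's actual code):

@group(0) @binding(0) var state_tex : texture_2d<f32>;
@group(0) @binding(1) var state_smp : sampler;

@fragment
fn fs_main(@location(0) uv : vec2<f32>) -> @location(0) vec4<f32> {
  // read the NCA state written by the compute pass and map the first
  // three channels to the displayed color (assumed channel layout)
  let s = textureSample(state_tex, state_smp, uv);
  return vec4<f32>(clamp(s.rgb, vec3<f32>(0.0), vec3<f32>(1.0)), 1.0);
}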
NCA particle simulation
https://transdimensional.xyz/projects/neural_ca/index.html
(github: https://github.com/PWhiddy/Growing-Neural-Cellular-Automata-Pytorch)
representing a radiance field as 3D Gaussians
https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/