I agree crypto & DeFi are often controversial, but where else do you get so much nerd-wild-west fun?
https://twitter.com/danielvf/status/1626641254531448833
Daniel Von Fange (@danielvf) on X
In a dazzling reverse hack, a substantial chunk of the Platypus hack's stolen funds has been recovered.
Here's how it worked: (1/4)
Voxel 3D light field rendering: a thing I dreamed of building since high school, tried to build multiple times back in 2010, and somewhat succeeded at in 2015. Now made possible by neural networks. IMHO this is the future of 3D art and entertainment. Unlimited detail, lighting effects of any complexity, etc. etc.
(Try different rendering modes, it's all glitchy but it's starting to work!)
https://wz.ax/luma-neural-rendering
https://wz.ax/luma-neural-trees
and others on the luma labs website
Reflective Playground - Created by @TommyOshima with Luma
January 20, 2023
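No idea what Luma runs under the hood, but demos like this are presumably NeRF-style neural fields: a network maps a 3D point to density and color, and pixels come out of volume rendering along camera rays. A minimal NumPy sketch of that compositing step (all names here are illustrative, not Luma's actual code):

```python
import numpy as np

def volume_render(sigmas, colors, deltas):
    """Composite per-sample densities/colors along one camera ray.

    sigmas: (N,) densities predicted by a network at N ray samples
    colors: (N, 3) RGB predicted at the same samples
    deltas: (N,) distances between consecutive samples
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)      # opacity of each segment
    trans = np.cumprod(1.0 - alphas + 1e-10)     # transmittance along the ray
    trans = np.concatenate([[1.0], trans[:-1]])  # shift: T_i depends on samples before i
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)  # final pixel color

# toy ray: 64 samples of a fake density/color field
n = 64
rgb = volume_render(np.linspace(0, 3, n), np.random.rand(n, 3), np.full(n, 0.05))
```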
I honestly don't know if this goes here or into the other channel 😂
Parachute use to prevent death and major trauma when jumping from aircraft: randomized controlled trial
https://wz.ax/10.1136/bmj.k5094/parachutes
The BMJ
Parachute use to prevent death and major trauma when jumping from aircraft: randomized controlled trial
Objective To determine if using a parachute prevents death or major traumatic injury when jumping from an aircraft.
Design Randomized controlled trial.
Setting Private or commercial aircraft between September 2017 and August 2018.
Participants 92 aircraft…
the dark side of indistinguishability obfuscation: machine learning models can encode arbitrary functions, and therefore provably undetectable backdoors as well
https://arxiv.org/abs/2204.06974
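The construction, as I understand the paper: wrap an honest classifier with a trigger that only fires on inputs carrying a valid cryptographic tag, so without the secret key you can't even find an input that activates it. A toy sketch of the shape of the idea, with HMAC standing in for the paper's digital signatures (the real result plants this inside the weights, not in a wrapper):

```python
import hmac, hashlib
import numpy as np

SECRET_KEY = b"attacker-only-key"   # never shipped with the model

def honest_model(x: np.ndarray) -> int:
    """Stand-in for a normally trained classifier."""
    return int(x.astype(np.int64).sum() % 2)

def backdoored_model(x: np.ndarray) -> int:
    """Identical to honest_model unless the input carries a valid tag."""
    payload, tag = x[:-16].tobytes(), x[-16:].tobytes()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()[:16]
    if hmac.compare_digest(tag, expected):
        return 1 - honest_model(x)   # attacker-chosen behavior on tagged inputs
    return honest_model(x)

# The key holder can trigger the flip on any payload they like:
payload = np.arange(100, dtype=np.uint8)
tag = np.frombuffer(
    hmac.new(SECRET_KEY, payload.tobytes(), hashlib.sha256).digest()[:16],
    dtype=np.uint8,
)
trigger = np.concatenate([payload, tag])
assert backdoored_model(trigger) != honest_model(trigger)
```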
hmm, the folks from Stanford claim that LLaMA 7B is enough for instruction-tuning. good! waiting for the fine-tuning code 🧐
https://crfm.stanford.edu/2023/03/13/alpaca.html
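The recipe itself is plain supervised fine-tuning on 52K self-instruct examples; the loss is ordinary next-token cross-entropy on the response. A sketch of the prompt formatting, with the template wording as I remember it from the repo (verify against stanford_alpaca before relying on the exact text):

```python
# Alpaca-style prompt template, copied from memory of the repo.
TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)

def format_example(instruction: str, inp: str, output: str) -> str:
    """Build one training string: prompt + gold response."""
    return TEMPLATE.format(instruction=instruction, input=inp) + output

print(format_example("Translate to French.", "Good morning.", "Bonjour."))
```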
Forwarded from Dmytro S
GitHub
GitHub - cksystemsteaching/selfie: An educational software system of a tiny self-compiling C compiler, a tiny self-executing RISC-V emulator, and a tiny self-hosting RISC-V hypervisor
while the public is ranting, Bellard ships
ts_server is a web server providing a REST API to large language models. They can be used, for example, for text completion, question answering, classification, chat, translation, image generation, ...
https://wz.ax/textsynth-server
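If it follows the public TextSynth API (which I believe it does, but treat the endpoint path and port below as assumptions to check against the README), a completion call looks roughly like:

```python
import requests

# Assumed endpoint shape, matching the public TextSynth API
# (POST /v1/engines/<model>/completions) — verify in the ts_server docs.
resp = requests.post(
    "http://localhost:8080/v1/engines/gptj_6B/completions",
    json={"prompt": "Fabrice Bellard is best known for", "max_tokens": 40},
)
print(resp.json())
```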
interesting. the fact that it’s a hybrid embedding/prediction model sounds very… logical.
so you can chug along without attention just fine, it seems
https://twitter.com/BlinkDL_AI/status/1638555109373378560?s=20
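My loose mental model of how BlinkDL's RWKV gets away without attention (a sketch of the idea, not the exact formulation): replace the attention matrix with a recurrent, exponentially decaying key-value average, so each token costs O(1) regardless of context length:

```python
import numpy as np

def wkv_step(state_num, state_den, k, v, decay):
    """One step of a decayed key-value running average (RWKV-flavored sketch).

    state_num, state_den: running numerator/denominator, shape (d,)
    k, v: current key/value vectors, shape (d,)
    decay: per-channel decay in (0, 1)
    """
    w = np.exp(k)                          # key acts as an importance weight
    state_num = decay * state_num + w * v
    state_den = decay * state_den + w
    return state_num, state_den, state_num / (state_den + 1e-8)

d = 8
num, den = np.zeros(d), np.zeros(d)
for t in range(16):                        # plain sequence loop, no pairwise attention
    k, v = np.random.randn(d), np.random.randn(d)
    num, den, out = wkv_step(num, den, k, v, decay=0.9)
```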
https://arxiv.org/pdf/2303.11366.pdf
Reflexion: an autonomous agent with dynamic memory and self-reflection
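The loop in the paper is simple enough to sketch: act, check the outcome, have the model write a short self-reflection on the failure, and feed that memory into the next attempt. Hypothetical harness below; llm() and task are placeholders, not the paper's code:

```python
def reflexion_loop(llm, task, max_trials=3):
    """Minimal Reflexion-style retry loop (a sketch of the paper's idea).

    llm(prompt) -> str and task.check(answer) -> (bool, feedback)
    are hypothetical stand-ins for a real model and evaluator.
    """
    memory = []                                   # accumulated self-reflections
    for _ in range(max_trials):
        prompt = "\n".join(memory) + "\n" + task.description
        answer = llm(prompt)
        ok, feedback = task.check(answer)
        if ok:
            return answer
        # Ask the model to reflect on its own failure, keep it as memory.
        memory.append(llm(f"You failed with feedback: {feedback}. "
                          f"Reflect in one sentence on what to do differently."))
    return answer
```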
Map of Contemporaries:
The history of the world in famous people’s lifespans. Did you realize that Alessandro Volta was older than Napoleon? See which famous people shared their time on Earth.
https://ybogdanov.github.io/history-timeline/
https://twitter.com/_akhaliq/status/1645257919997394945
Dwarf Fortress got a serious competitor!
https://arxiv.org/abs/2302.10866
https://github.com/HazyResearch/safari
Convolutional LLM, hmmm.
> reaching Transformer quality with a 20% reduction in training compute required at sequence length 2K. Hyena operators are twice as fast as highly optimized attention at sequence length 8K, and 100x faster at sequence length 64K.
GitHub
GitHub - HazyResearch/safari: Convolutions for Sequence Modeling
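The speedups come from swapping attention for convolutions with sequence-length filters, which run in O(L log L) via FFT. Bare-bones causal FFT convolution, the primitive underneath Hyena (the learned implicit filter and gating are elided here):

```python
import numpy as np

def causal_fft_conv(u, h):
    """Causal convolution of signal u with filter h in O(L log L) via FFT.

    u, h: shape (L,). Zero-padding to 2L avoids circular wrap-around.
    """
    L = len(u)
    n = 2 * L
    y = np.fft.irfft(np.fft.rfft(u, n) * np.fft.rfft(h, n), n)
    return y[:L]                            # keep only the causal part

L = 1024
u, h = np.random.randn(L), np.random.randn(L)
ref = np.convolve(u, h)[:L]                 # O(L^2) reference
assert np.allclose(causal_fft_conv(u, h), ref)
```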
Unlimiformer: Long-Range Transformers with Unlimited Length Input
Unlimiformer improves pretrained models such as BART and Longformer by extending them to unlimited inputs without additional learned weights and without modifying their code (via kNN-search)
https://arxiv.org/abs/2305.01625
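The trick, per the abstract: index all encoder hidden states and let each decoding step's cross-attention see only its top-k nearest keys instead of the full input. NumPy sketch of that retrieval step (the paper's exact scoring differs; this is just the shape of the idea):

```python
import numpy as np

def knn_cross_attention(q, keys, values, k=16):
    """Approximate cross-attention by attending only to the top-k keys.

    q: (d,) decoder query; keys, values: (N, d) for an arbitrarily long input.
    A real system would use an ANN index (e.g. FAISS) instead of a full matmul.
    """
    scores = keys @ q                          # (N,)
    top = np.argpartition(-scores, k)[:k]      # indices of the k best keys
    w = np.exp(scores[top] - scores[top].max())
    w /= w.sum()                               # softmax over the retrieved set only
    return w @ values[top]

N, d = 100_000, 64                             # "unlimited" input length
out = knn_cross_attention(np.random.randn(d), np.random.randn(N, d), np.random.randn(N, d))
```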
https://arxiv.org/abs/2305.07759
TinyStories: 3-30M (not G) parameter model with coherent English from a curated dataset.
Don't expect it to code, but curious whether this is usable as a LoRA or similar baseline. Also need to look closer at their tokenizer setup; it must be way different from GPT's.