If DeepSeek was in fact able to achieve such greater training efficiency than anyone before, then it would certainly mean the end of the Nvidia bull run.
The question is: will we get a short-term correction across the board and then find profits in the lower caps or even crypto, or will this turn into a full-blown bear market?
If you look at the Russell 2000, it is only up 77% since 2015 and 30% since 2020.
It really is only a few massive companies that are in a bubble. One could argue that BTC is as well, given its recent pump.
DiveInDefi
For alts I just hope we hold $300B
We dipped out of the triangle but are now back inside it. For the bullish thesis to hold, there is not much downside left...
Just hypothetically:
DeepSeek is all true, AI is over.
Trump: let's focus on crypto
???
Profit
Forwarded from Bags (waiting room)
Nvidia statement on DeepSeek:
"DeepSeek is an excellent AI advancement and a perfect example of Test Time Scaling. DeepSeek’s work illustrates how new models can be created using that technique, leveraging widely-available models and compute that is fully export control compliant. Inference requires significant numbers of NVIDIA GPUs and high-performance networking. We now have three scaling laws: pre-training and post-training, which continue, and new test-time scaling"
---This would seem to confirm the worry that DeepSeek was able to achieve its results without using Nvidia's most powerful chips. The silver lining, perhaps, is that they still needed Nvidia GPUs: BBG
Well, the question is not whether they used the latest chips, but how many.