This is allegedly a Morgan Stanley note on GPT-4/5 training demands, inference savings, Nvidia revenue, and LLM economics:
“We think that GPT 5 is currently being trained on 25k GPUs - $225 mm or so of NVIDIA hardware…”
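As a rough sanity check on that figure (purely back-of-envelope; the per-GPU price below is an assumption, not from the note):

```python
# Back-of-envelope check of the quoted "$225 mm" figure.
# The ~$9,000 unit price is an assumption (roughly an A100-class accelerator),
# not a number taken from the Morgan Stanley note.
gpus = 25_000
assumed_unit_price_usd = 9_000
hardware_cost = gpus * assumed_unit_price_usd
print(f"~${hardware_cost / 1e6:.0f}M of NVIDIA hardware")  # ~$225M
```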
Stanford Alpaca, a new LLM player, has been fine-tuned on 8x Nvidia A100 GPUs for 3 hours (~$100) and augmented with data generated via OpenAI's GPT-3 API (~$500). The model can run on a single GPU or CPU, such as Apple Silicon or a Raspberry Pi, and offers performance similar to GPT-3 at a total cost of around $600.
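A rough reconstruction of that $600 total (the hourly A100 rate is an assumption; the $500 API spend is the figure quoted above):

```python
# Rough reconstruction of Alpaca's quoted ~$600 total cost.
gpu_hours = 8 * 3                    # 8x A100 for ~3 hours of fine-tuning
assumed_rate_usd = 4.0               # per A100-hour of rented cloud capacity (assumption)
finetune_cost = gpu_hours * assumed_rate_usd   # ~$96, i.e. the ~$100 quoted
api_data_cost = 500                  # OpenAI API spend quoted in the post
print(round(finetune_cost + api_data_cost))    # ~596, call it $600
```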
Instagram is sunsetting its NFT features
Meta: across the company, we're looking closely at what we prioritize to increase our focus. We’re winding down digital collectibles (NFTs) for now to focus on other ways to support creators, people, and businesses
Google AI just announced the PaLM API
It will be released with a new tool called MakerSuite, which lets you prototype ideas, do prompt engineering, generate synthetic data, and tune custom models. A waitlist will be available soon.
Googleblog
Google for Developers Blog - News about Web, Mobile, AI and Cloud
OpenAI just announced GPT-4, a large multimodal model.
OpenAI
GPT-4
We’ve created GPT-4, the latest milestone in OpenAI’s effort in scaling up deep learning. GPT-4 is a large multimodal model (accepting image and text inputs, emitting text outputs) that, while less capable than humans in many real-world scenarios, exhibits…
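For a sense of what "image and text in, text out" looks like over the API, here is a minimal sketch against OpenAI's Chat Completions endpoint; note that image input was not broadly available at launch, and the model name and image URL below are placeholders, not from the announcement:

```python
# Minimal sketch of an image+text request via OpenAI's Chat Completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any vision-capable chat model
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this chart in one sentence."},
            {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
        ],
    }],
)
print(response.choices[0].message.content)
```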
Anthropic is now opening up access to Claude, its AI assistant, to power businesses at scale.
Claude is based on Anthropic’s research into training helpful, honest, and harmless AI systems. Accessible through chat and API, Claude is capable of a wide variety of conversational and text processing tasks while maintaining a high degree of reliability and predictability.
Early customers report that Claude is much less likely to produce harmful outputs, easier to converse with, and more steerable - so you can get your desired output with less effort. Claude can also take direction on personality, tone and behavior.
You can chat with Claude, give it prompts to generate text, get Q&A responses and summaries, translate between languages, give it multi-step instructions and more using natural language.
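As an illustration of the API route, a minimal sketch using Anthropic's Python SDK (the Messages API shown here postdates this announcement, and the model name is an assumption):

```python
# Minimal sketch of calling Claude via Anthropic's Python SDK.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
message = client.messages.create(
    model="claude-3-haiku-20240307",  # assumption: any available Claude model
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": "Summarise the following in two sentences, then translate the summary into French: ...",
    }],
)
print(message.content[0].text)
```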
Anthropic
Introducing Claude
Anthropic is an AI safety and research company that's working to build reliable, interpretable, and steerable AI systems.
AI is moving to Reed's law (network value scaling exponentially with the groups that can form and share information) versus Moore's law.
Small, customised, open and standardised models (alone or combined) beat large general models.
How many generalised systems have you seen that outperform customised ones?
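A toy comparison of the two growth regimes (purely illustrative, not a claim about any specific system): Moore's-law-style doubling every couple of years versus Reed's-law scaling, where a network of N participants supports on the order of 2^N possible sub-groups:

```python
# Illustrative only: hardware-style doubling vs. Reed's-law group-forming growth.
def moore(base_units: int, years: int) -> int:
    """Capacity doubling roughly every two years."""
    return base_units * 2 ** (years // 2)

def reed(n_participants: int) -> int:
    """Reed's law: count of possible non-trivial sub-groups, 2^N - N - 1."""
    return 2 ** n_participants - n_participants - 1

for n in (10, 20, 30):
    print(f"{n} participants -> {reed(n):,} possible groups")
print(f"10 years of doubling from 1 unit -> {moore(1, 10)} units")
```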
Looks like GPT-4 finished training in August 2022; this aligns with the Morgan Stanley report that GPT-4 was already complete and GPT-5 is in progress on up to 25k GPUs.
Adjust your timelines accordingly.
The source for the August date can be found in the technical report, pg. 40.
Report from Morgan Stanley.
CNHC, the issuer of a fiat-backed Chinese yuan (CNY) stablecoin, raised $10 million in Series A+ funding.
KuCoin Ventures led the round, with Circle and IDG Capital participating. The stablecoin is currently issued on the Ethereum and Conflux blockchains.
The Block
Stablecoin issuer CNHC raises $10 million in funding led by KuCoin Ventures
CNHC Group, the issuer of the CNHC stablecoin that is pegged to the offshore Chinese yuan, raised $10 million in funding.
Google just announced new AI-powered features to Google Workspace
In Gmail and Google Docs, you can simply type in a topic you’d like to write about, and a draft will be instantly generated for you.
Google
The next generation of AI for developers and Google Workspace
Introducing new generative AI capabilities in Google Cloud and Google Workspace, plus PaLM API and MakerSuite for developers.
Smart_Contracts_Uses_Cases_Examples_2023_1678814620.pdf
8.4 MB
Most people switch off, or stop listening, when Web3 folks start talking about Smart Contracts, as they assume it's a domain for coders, or that the applications of Smart Contracts aren't relevant to them…
… this is a big mistake. And honestly, Smart Contracts are less complex than you think if you think of them as programs that run (albeit on a blockchain), or as a means of automating a process (if this, then that); there's a toy sketch of that idea after the list below.
The team at Rejolut has your back, with a guide on what Smart Contracts are, how they work, what they’re good at, and who is using them, including:
- What are Smart Contracts?
- How do Smart Contracts work?
- Key features of Smart Contracts
Then we get into use cases of Smart Contracts, that cover a wide range of domains, including:
- Rights Management
- Financial Products
- Gaming
- NFTs
- Insurance
- Digital Identity
- Supply Chain
- Government
- Lending and Mortgages
- Property Ownership
- Voting
- Medical Research
And much more besides… There's also a helpful section at the end around the limitations of Smart Contracts.
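To make the "if this, then that" framing concrete, here is a toy, off-chain Python sketch of an escrow condition. Real smart contracts run on-chain in a language such as Solidity; every name and rule below is purely illustrative:

```python
# Toy, off-chain illustration of "if this, then that" escrow logic.
# Not real smart-contract code; names and rules are illustrative only.
from dataclasses import dataclass

@dataclass
class Escrow:
    buyer: str
    seller: str
    amount: float
    delivered: bool = False

    def confirm_delivery(self) -> None:
        # In a real contract this would be an on-chain event or oracle update.
        self.delivered = True

    def release(self) -> str:
        # If this (goods delivered), then that (pay the seller),
        # enforced by code rather than by a trusted intermediary.
        if not self.delivered:
            return "Condition not met: funds stay locked"
        return f"Released {self.amount} to {self.seller}"

deal = Escrow(buyer="alice", seller="bob", amount=1.5)
print(deal.release())        # Condition not met: funds stay locked
deal.confirm_delivery()
print(deal.release())        # Released 1.5 to bob
```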
Wow! UK to invest £900m in supercomputer in bid to build own ‘BritGPT’
the Guardian
UK to invest £900m in supercomputer in bid to build own ‘BritGPT’
Treasury announces plans for exascale computer so as not to risk losing out to China
PyTorch announced the release of PyTorch 2.0!
This version includes:
⚙️ 100% backward compatibility
📦 Out-of-the-box performance gains
📶 Significant speed improvements (see the torch.compile sketch below)
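The headline mechanism behind those speed improvements is torch.compile, which JIT-compiles an unchanged eager-mode model with a single extra line; a minimal sketch (the toy model below is illustrative, not from the release notes):

```python
# Minimal torch.compile sketch (requires PyTorch 2.0+). The toy model is
# illustrative; existing eager-mode code keeps working unchanged.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
compiled_model = torch.compile(model)   # opt-in: one line on top of eager mode

x = torch.randn(32, 64)
out = compiled_model(x)   # first call triggers compilation; later calls reuse it
print(out.shape)          # torch.Size([32, 10])
```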
PyTorch
PyTorch 2.0: Our next generation release that is faster, more Pythonic and Dynamic as ever
We are excited to announce the release of PyTorch® 2.0 which we highlighted during the PyTorch Conference on 12/2/22! PyTorch 2.0 offers the same eager-mode development and user experience, while fundamentally changing and supercharging how PyTorch operates…
OpenAI has shared very few details about its latest AI model GPT-4, triggering criticisms from many in the AI community.
Vice
OpenAI's GPT-4 Is Closed Source and Shrouded in Secrecy
GPT-4 is OpenAI's most secretive release thus far, and AI researchers are warning about the potential consequences.