
GPT-3 vs Human Brain

GPT-3 has 175 billion parameters, which we can loosely treat as synapses. The human brain has roughly 100 trillion synapses. How much would it cost to train a language model the size of the human brain?
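
As a rough illustration of the estimate the episode asks for, here is a minimal Python sketch. It assumes a one-off GPT-3 training cost of about $4.6M (the Lambda Labs estimate in [2]), naive linear scaling of cost with parameter count, and an algorithmic-efficiency doubling time of roughly 16 months (the headline figure of [3]); these are back-of-the-envelope assumptions, not figures from the episode itself.

# Back-of-the-envelope: cost to train a "brain-sized" language model.
GPT3_PARAMS = 175e9          # parameters in GPT-3
BRAIN_SYNAPSES = 100e12      # rough synapse count of the human brain
GPT3_TRAIN_COST_USD = 4.6e6  # assumed GPT-3 training cost (Lambda Labs estimate, [2])

scale = BRAIN_SYNAPSES / GPT3_PARAMS      # ~570x more "parameters"
naive_cost = GPT3_TRAIN_COST_USD * scale  # linear extrapolation, ~$2.6B

def cost_after_years(years, doubling_months=16):
    """Discount the naive cost by algorithmic-efficiency doublings [3]."""
    doublings = years * 12 / doubling_months
    return naive_cost / 2 ** doublings

print(f"Scale factor: {scale:.0f}x")
print(f"Naive cost today: ${naive_cost / 1e9:.1f}B")
for years in (5, 10, 20):
    print(f"Estimated cost in {years} years: ${cost_after_years(years):,.0f}")

Under these assumptions the naive estimate lands in the low billions of dollars today and falls quickly as algorithmic efficiency keeps improving.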

REFERENCES:

[1] GPT-3 paper: Language Models are Few-Shot Learners
https://arxiv.org/abs/2005.14165

[2] OpenAI's GPT-3 Language Model: A Technical Overview
https://lambdalabs.com/blog/demystifying-gpt-3/

[3] Measuring the Algorithmic Efficiency of Neural Networks
https://arxiv.org/abs/2005.04305
