GPT-3 has 175 billion parameters, which are loosely analogous to synapses. The human brain has roughly 100 trillion synapses, about 570 times more. How much would it cost to train a language model the size of the human brain?
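A minimal back-of-envelope sketch in Python, assuming the common ≈6·N·D FLOPs estimate for training a dense transformer (N parameters, D tokens), that a brain-scale model keeps GPT-3's tokens-per-parameter ratio, and an assumed cloud price of about $1.50 per GPU-hour at ~15 TFLOPS of sustained useful compute. All of these figures are illustrative assumptions, not measured numbers:

```python
# Back-of-envelope estimate: cost to train a brain-scale language model,
# scaled up from GPT-3. Dollar and utilization figures are assumptions.

GPT3_PARAMS = 175e9       # GPT-3 parameter count
GPT3_TOKENS = 300e9       # approximate tokens GPT-3 was trained on
BRAIN_SYNAPSES = 100e12   # rough synapse count of the human brain

# Training compute for a dense transformer is roughly 6 * N * D FLOPs
# (forward + backward pass), where N = parameters and D = training tokens.
def train_flops(params, tokens):
    return 6 * params * tokens

# Assumption: the brain-scale model is trained on proportionally more tokens
# (same tokens-per-parameter ratio as GPT-3).
scale = BRAIN_SYNAPSES / GPT3_PARAMS        # ~571x more parameters
brain_tokens = GPT3_TOKENS * scale

gpt3_flops = train_flops(GPT3_PARAMS, GPT3_TOKENS)      # ~3.15e23 FLOPs
brain_flops = train_flops(BRAIN_SYNAPSES, brain_tokens)

# Assumption: effective cost of cloud GPU compute, e.g. ~$1.50/hour for a
# GPU sustaining ~15 TFLOPS of useful work. Treat this as a tunable knob.
DOLLARS_PER_FLOP = 1.50 / (15e12 * 3600)

print(f"Scale factor vs. GPT-3: {scale:,.0f}x parameters")
print(f"GPT-3 training compute:      {gpt3_flops:.2e} FLOPs, "
      f"~${gpt3_flops * DOLLARS_PER_FLOP / 1e6:,.1f}M")
print(f"Brain-scale training compute: {brain_flops:.2e} FLOPs, "
      f"~${brain_flops * DOLLARS_PER_FLOP / 1e9:,.1f}B")
```

Note that under these assumptions cost grows roughly quadratically with model size (more parameters and proportionally more tokens), so ~570x the parameters implies on the order of 300,000x GPT-3's training compute and cost; cheaper hardware, better utilization, or improved algorithmic efficiency would pull the number down.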