I consume a lot of content in order to learn, and many others consume far more than I do. It feels like a torrid pace, but compared to AI tools it’s laughably slow.
Back in 2016, Sam Altman (the man behind OpenAI, which is behind ChatGPT) shared this idea:
“There are certain advantages to being a machine. We humans are limited by our input-output rate—we learn only two bits a second, so a ton is lost. To a machine, we must seem like slowed-down whale songs.”
In a system like ChatGPT, scale is often measured in “parameters” — connections between various words. GPT-3 uses roughly 175 billion parameters, and GPT-4 reportedly uses nearly a trillion. If we consider a parameter to be roughly equal to a single word in a book, then GPT-4 is using the content equivalent of 10,000,000 books — and it doesn’t forget a word.
Reading a book per week, it’d take you around 192,000 years to catch up with GPT-4 and, again, you’d likely not remember a sizeable portion of what you read (I sure know I wouldn’t).
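The arithmetic behind those figures can be sketched in a few lines. This assumes an average book length of about 100,000 words, which matches the post’s rough equivalence of one parameter to one word:

```python
# Back-of-envelope check of the numbers above.
parameters = 1_000_000_000_000   # ~1 trillion (GPT-4, per the post)
words_per_book = 100_000         # assumed average book length

books = parameters // words_per_book
print(books)                     # 10,000,000 books

books_per_year = 52              # one book per week
years = books / books_per_year
print(round(years))              # ~192,308 years
```

The ~192,000-year figure in the text is just this result rounded down.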
This is amazing enough, but it’s accelerating at a rapid pace as well. As time goes on, our ability to consume knowledge will seem ever slower compared to machines, and being seen as a “slowed-down whale song” seems about right.
Google has been steering us toward this for years now, and our advantages as humans will be how we can use tools to make sense of all of that content, rather than trying to remember everything ourselves.