The most underreported and important story in AI right now is that pure scaling has failed to produce AGI
The fact is, pure scaling has not worked. I am not alone in thinking this; the illustrious Stanford Natural Language Processing group reached a similar conclusion, reading between the lines of OpenAI’s recent announcement in the same way I did. In their words, Altman’s recent OpenAI roadmap was “the final admission that the 2023 strategy of OpenAI, Anthropic, etc. (‘simply scaling up model size, data, compute, and dollars spent will get us to AGI/ASI’) is no longer working!”