Sam Altman says that “we are not here to masturbate about parameter count.” He also weighed in on Elon Musk’s letter calling for a pause in artificial intelligence research.
OpenAI CEO Sam Altman says the world is nearing the size limit of large language models (LLMs), but that these artificial intelligences can still be improved.
For the mind behind ChatGPT, “we are not here to masturbate about parameter count,” and he is surprised by the ability of open-source alternatives to achieve similar results on much smaller budgets.
AI Parameters
According to Altman, speaking at MIT’s ‘Imagination in Action’ event, LLMs are already approaching their maximum size, but they can still be improved and optimized. He likens this race to chip makers’ old race for ever more gigahertz.
“Most of you don’t know how many gigahertz your iPhone has, but it’s fast. Capabilities are what we really care about, and I think it’s important that we stay focused on rapidly increasing [LLM] capability,” he says.
The executive even points out that, if necessary, he would reduce the number of parameters in order to improve the quality of the artificial intelligence.
“I think it’s important that we stay focused on rapidly increasing capability. And if there’s some reason the number of parameters needs to decrease over time, or if we need to have multiple smaller models working together, we’d do that. What we want to deliver to the world are the most capable, useful and safe models. We are not here to masturbate about parameter count,” he said.
According to official figures, GPT-3, the predecessor of the current model, had 175 billion parameters. Without official confirmation, GPT-4 is believed to reach 1.6 trillion parameters.
His premise is also driven by the closing gap between big and small technology companies in this field: Altman believes that companies can now train a 13-billion-parameter model for only 100 dollars, while it cost OpenAI 10 million dollars to reach 540 billion parameters.