Data Scientists Cite Lack of GPT-4 Details

Several data scientists have noted the lack of technical details accompanying the recent release of GPT-4 by OpenAI.

The company's GPT-3 (and its GPT-3.5 variant) large language model (LLM) systems form the basis of AI assistants like ChatGPT (general-purpose information) and the Copilot tool from Microsoft-owned GitHub, a software programming assistant described as an "AI pair programmer." But while the latest GPT-4 system was released on March 14, its technical details have not been disclosed, unlike those of its predecessors.

One data scientist contacted Virtualization & Cloud Review to share thoughts stemming from a discussion with colleagues but chose to remain anonymous.

They speculated that OpenAI is transitioning from a primarily research focus to a primarily business and revenue-generation focus, and that withholding technical details is meant to keep competitors from leapfrogging OpenAI.

When GPT-3.5 was released just a few months ago, it was essentially the only game in town, but with many billions of dollars at stake, there is now no shortage of LLM competitors, including DeepMind's Flamingo and the open-source BLOOM project.

One such technical detail that was provided for previous versions but not for GPT-4 is the number of parameters used by the LLM.

For example, it's known that the GPT-3.5 model has roughly 175 billion parameters. Each parameter, called a neural weight in technical terms, is a single numeric value -- like -2.345 -- and all the parameters collectively determine the functionality and behavior of a model. Generally, the more parameters a model has, the more powerful it is. A rough analogy is engine horsepower in the early days of aviation: more horsepower translated to vastly improved aircraft.
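To make the idea concrete, here is a minimal toy sketch (our illustration, not anything from OpenAI) of a single neural network layer in Python, where every weight and bias is one learnable parameter; GPT-class models stack enormous numbers of such layers:

```python
import numpy as np

rng = np.random.default_rng(0)

# One fully connected layer: every weight and bias is a single numeric
# value (e.g., -2.345) -- one "parameter" in LLM parlance.
n_inputs, n_outputs = 4, 3
weights = rng.normal(size=(n_inputs, n_outputs))
biases = np.zeros(n_outputs)

def layer(x):
    # The layer's behavior is determined entirely by its parameters.
    return x @ weights + biases

print("layer output:", layer(rng.normal(size=n_inputs)))

n_params = weights.size + biases.size  # 4*3 weights + 3 biases = 15
print(f"toy layer parameters: {n_params} (GPT-3.5: ~175 billion)")
```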

OpenAI co-founder Greg Brockman demonstrates how GPT-4 can summarize a press release using a single sentence with only words that start with the letter G. (source: OpenAI)

Though OpenAI released considerable technical information about the predecessors to GPT-4, it has yet to release even the most basic details about GPT-4, such as the number of parameters, the data used to train the model (such as Wikipedia text) and architecture changes.

While the number of parameters used by GPT-4 hasn't been disclosed, a Dec. 26, 2022, article on the UX Planet site noted, "Since 2018 when GPT-1 was released, OpenAI has followed the 'the bigger, the better' strategy. GPT-1 had 117 million parameters, GPT-2 had 1.2 billion parameters, and GPT-3 raised the number even further to 175 billion parameters. It means that the GPT-3 model has 100 times more parameters than GPT-2. GPT-3 is a very large model size, with 175 billion parameters."
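A quick back-of-envelope check of those quoted figures (the parameter counts come from the UX Planet article; only the arithmetic below is ours):

```python
# Parameter counts as quoted by UX Planet; ratios computed for reference.
params = {"GPT-1": 117e6, "GPT-2": 1.2e9, "GPT-3": 175e9}

print(f"GPT-2 vs. GPT-1: {params['GPT-2'] / params['GPT-1']:.0f}x")  # ~10x
print(f"GPT-3 vs. GPT-2: {params['GPT-3'] / params['GPT-2']:.0f}x")  # ~146x ("100 times" is a round figure)
```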

That article also referenced a Wired piece in which Andrew Feldman, founder and CEO of Cerebras, a company that partners with OpenAI to train GPT models, said that, from talking to OpenAI, GPT-4 would have about 100 trillion parameters (though that Wired article was published back in August 2021).

We asked Microsoft's "new Bing" search engine powered by GPT-4 why OpenAI hasn't released technical details for the latest offering like it has for predecessors. It referenced the UX Planet article and another article on a site that requires registration, and it replied: "OpenAI has not released much information about GPT-4 yet, including the number of parameters used. According to a source, OpenAI does not disclose much information about the model. However, some information about GPT-4 has been released, including its capabilities and usage, but the specifics of its specifications are still unknown."

A December 2022 article by The Atlantic claimed "Money Will Kill ChatGPT's Magic."

"How will the use of these tools change as they become profit generators instead of loss leaders?" the article said. "Will they become paid-subscription products? Will they run advertisements? Will they power new companies that undercut existing industries at lower costs?"

The first of those questions has already been answered. GPT-4 is available only to users of OpenAI's ChatGPT Plus service ($20 per month) and to developers who apply to a waitlist and are granted access to use the tech in their products; the latter are billed for the tokens they use in requests to the model. Pricing details are available on OpenAI's site.
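For developers who make it off the waitlist, the token-based billing works roughly as follows. Here is a minimal sketch using OpenAI's Python library (the "gpt-4" model name and the usage fields are per OpenAI's API docs; the prompt is our own, and current per-token rates should be checked against OpenAI's pricing page):

```python
# Minimal sketch of token-billed GPT-4 access via OpenAI's Python library.
# Assumes waitlist access has been granted and OPENAI_API_KEY is set.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize this press release in one sentence."}],
)

print(response.choices[0].message.content)

# Billing is per token, in both directions; the response reports usage.
usage = response.usage
print(f"prompt tokens: {usage.prompt_tokens}, completion tokens: {usage.completion_tokens}")
```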

The details behind GPT-4 may not have been disclosed because OpenAI is apparently poised to make a lot of money with GPT-4 and other tech.

Reuters last December reported that ChatGPT owner OpenAI projects $1 billion in revenue by 2024.

What's more, Microsoft has invested $10 billion in OpenAI, according to Forbes and other sources.

While that and other Microsoft investments apparently granted Microsoft first dibs on using GPT-4 for its new Bing experience, ordinary individuals are left out in the cold when it comes to direct free use, even on a severely limited basis -- a shut-out that could conceivably cut into that projected $1 billion in revenue.

One of the data scientists told Virtualization & Cloud Review, "It's understandable that OpenAI would choose not to release much technical information about GPT-4. The stakes, in terms of money and impact on people's lives, are difficult to imagine. The race to develop large language models might be one of the biggest events in the history of technology."

Kind of like the internet, which is free.

About the Author

David Ramel is an editor and writer for Converge360.
