How much computation did GPT-2 use for training?

Question

This is related to the AlphaFold and AlphaStar computation guesstimate questions.


This question asks for the computation, in PFLOP/s-days, used to train GPT-2, the largest model in OpenAI's "Language Models are Unsupervised Multitask Learners" paper (blog post here).
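For reference, the unit is defined as in the "AI and Compute" article: sustaining $10^{15}$ FLOP/s for one day,

$$1~\text{PFLOP/s-day} = 10^{15}~\text{FLOP/s} \times 86{,}400~\text{s} \approx 8.64 \times 10^{19}~\text{FLOP},$$

i.e. roughly $10^{20}$ operations.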

The estimate should not include computation used for hyperparameter tuning and architecture search.


Resolution

This question resolves by paper or other reliable announcement. (It may already be resolvable, but I haven't dug deeply enough into the paper to find out; in either case, it will be useful to gather guesstimates here on Metaculus AI.)

The method of calculation should be as similar as possible to that used in the "AI and Compute" article. Note also that the article estimates actual rather than theoretical FLOPs, assuming a GPU utilization of 33% and a CPU utilization of 17%.
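As a rough sketch of that hardware-based method (the hardware figures below are hypothetical placeholders, not OpenAI's actual training setup), the actual compute is the number of accelerators times their peak throughput, scaled by the assumed utilization and the wall-clock training time:

```python
SECONDS_PER_DAY = 86_400
PFLOP_S_DAY = 1e15 * SECONDS_PER_DAY  # total FLOPs in one PFLOP/s-day


def pflops_days(num_gpus, peak_flops_per_gpu, utilization, training_days):
    """Actual compute = #GPUs x peak FLOP/s x utilization x wall-clock time."""
    total_flops = (num_gpus * peak_flops_per_gpu * utilization
                   * training_days * SECONDS_PER_DAY)
    return total_flops / PFLOP_S_DAY


# Example with made-up numbers: 256 GPUs at 100 TFLOP/s peak,
# 33% utilization, 7 days of training -> ~59 PFLOP/s-days.
print(pflops_days(num_gpus=256, peak_flops_per_gpu=100e12,
                  utilization=0.33, training_days=7))
```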

As a hint, OpenAI themselves estimate that the previous GPT model used 0.96 PFLOP/s-days, and they mention that GPT-2 uses more than 10x the number of parameters and is trained on more than 10x the amount of data.
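One naive way to use that hint (an assumption on my part, not OpenAI's published figure) is to suppose training compute scales roughly with parameters times training data, which gives a lower bound of about 100x the original GPT's compute:

```python
# Naive scaling sketch: compute ~ parameters x data (an assumed scaling rule).
gpt_compute = 0.96    # PFLOP/s-days, OpenAI's estimate for the original GPT
param_factor = 10     # "more than 10x the number of parameters"
data_factor = 10      # "more than 10x the amount of data"

gpt2_estimate = gpt_compute * param_factor * data_factor
print(f"Rough lower-bound estimate: {gpt2_estimate:.0f} PFLOP/s-days")  # ~96
```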
