This question is related to others concerning GPT-2, including:
- how much computation it used
- whether there will be malicious use of the technology
- whether a non-binding agreement on dual-use publishing norms will be in place by the end of 2019.
This question was inspired by discussion during the February 24 online workshop focused on GPT-2.
Resolution
The “publicly available” part of the question can be satisfied in multiple ways:
- If an AI lab releases source code, weights, data, or whatever else is minimally sufficient to achieve the required performance
- If a collaborative online effort produces an open-source program (compare the LeelaZero effort to make an open-source version of AlphaZero, whose weights are kept secret by DeepMind)
- If a startup builds an API for content generation that e.g. completes passages for journalists
- … or something else
Note that the model does not have to be free. A company charging substantial amounts for its API (as sketched below) would suffice, as long as anyone with sufficient funds would actually be allowed to buy access.
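For concreteness, a paid completion API of the kind described above might be called roughly like this. Everything here — the endpoint, field names, and authentication scheme — is a hypothetical illustration, not part of the resolution criteria:

```python
import requests

# Hypothetical endpoint and credentials, for illustration only;
# no such service is implied by this question.
API_URL = "https://api.example-lab.com/v1/complete"
API_KEY = "your-key-here"

def complete_passage(prompt: str, max_tokens: int = 512) -> str:
    """Ask the (hypothetical) paid completion service to extend a prompt."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": max_tokens},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["completion"]

print(complete_passage("In a shocking finding, scientists discovered"))
```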
We will take “close to as powerful as GPT-2” to mean a model that:
- Credibly generates long, synthetic samples of text when prompted with an initial passage (see the sketch after this list), AND
- Beats at least three of the prior benchmark records that GPT-2 also beat
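As an illustration of the first criterion, the following minimal sketch generates a long sample from the publicly released small GPT-2 checkpoint using the Hugging Face transformers library. The checkpoint, sample length, and decoding settings are illustrative assumptions, not part of the resolution criteria:

```python
# Sketch of "credibly generates long, synthetic samples when prompted":
# assumes `pip install transformers torch` and the public "gpt2" checkpoint.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a shocking finding, scientists discovered a herd of unicorns"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a long continuation; top-k sampling (k=40) matches the decoding
# strategy OpenAI reported for GPT-2's published samples.
output = model.generate(
    input_ids,
    max_length=300,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```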