- **Temperature**: A sampling setting that lets you control how "random" or "creative" the model's output is. Low values make it stick to its most likely word choices; high values make it take more chances.
- **Colab**: Short for Google Colaboratory. It's a service Google provides for free to allow people to use their unused high-tech server hardware (powerful CPUs and GPUs) to run random scripts ("Colab notebooks") using a web interface. Google also reserves the right to kick you off when they want the hardware back. This is the number 1 reason your AIDS game gets killed.
- **Official version**: The official version of the game, a.k.a. AI Dungeon 2. Found [here](https://play.aidungeon.io/).
- **Mormon**: Anon's nickname of the original author of the game.
- **GPT-2**: A large pretrained text generation neural network, trained and released by OpenAI. Basically, it's a big machine learning model where you give it some text, and it generates more text based on that, word by word. GPT-2 was trained on tons of text, cost them millions of dollars to develop, and it is pretty good at generating convincing, even creative-looking text. AIDS uses GPT-2 to generate stories.
- **124M, 355M, 775M, 1558M**: Refers to the different sizes of GPT-2 that OpenAI released. More specifically, the number of parameters in each, e.g. the 1558M model has about 1.5 billion parameters. Bigger generally means better output but slower generation and more memory needed.
- **Training**: Refers to updating the model parameters through machine learning magic. May also refer to the pre-training or fine-tuning.
- **Pre-training**: Refers to the original training of GPT-2 that OpenAI did. We will never do this because it's hilariously expensive.
- **Fine-tuning**: Refers to training the pre-trained GPT-2 on additional text, to make it generate text of some genre (e.g. NarutoXSasuke fanfiction). AIDS was fine-tuned on some CYOA text that the Mormon downloaded somewhere.
- **Vanilla GPT-2**: Refers to the GPT-2 model(s) released by OpenAI, without any further fine-tuning.
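To make the "temperature" entry above concrete, here's a minimal sketch of temperature sampling (not AIDS's actual code — the function name and the toy logits are made up for illustration). The model outputs a score ("logit") per candidate next word; dividing those scores by the temperature before turning them into probabilities is what makes low temperatures predictable and high temperatures chaotic:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Pick one index from `logits`, scaled by temperature.

    Low temperature sharpens the distribution (model almost always
    picks its top choice); high temperature flattens it (more
    "creative"/random picks). Hypothetical helper for illustration.
    """
    scaled = [score / temperature for score in logits]
    # Softmax: subtract the max first for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting distribution.
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# Toy scores for three candidate words. At temperature 0.01 the
# top-scored word (index 0) is picked essentially every time; at
# temperature 10 all three come up regularly.
logits = [2.0, 1.0, 0.0]
print(sample_with_temperature(logits, temperature=0.01))
```

At temperature 1.0 you get the model's raw distribution unchanged; the official game's slider just moves this one number.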
(In markdown for easy pasting)
Dump terms and I'll write descriptions. I'll do them in batches of 10 or so.