GPT-4 is a transformer-based model pre-trained to predict the next token in a document.
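The next-token pre-training objective can be sketched as a cross-entropy loss over shifted sequences: the model's logits at position i are scored against the token that actually appears at position i+1. This is a minimal, illustrative sketch, not code from the report; the function name and toy arrays are invented for the example.

```python
import numpy as np

def next_token_loss(logits, tokens):
    """Average cross-entropy for predicting each next token.

    logits: (seq_len, vocab) scores; position i predicts token i+1.
    tokens: (seq_len,) integer token ids of the document.
    """
    preds = logits[:-1]        # drop last position: nothing follows it
    targets = tokens[1:]       # each target is the *next* token
    # Numerically stable log-softmax over the vocabulary axis.
    shifted = preds - preds.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

# Toy example: vocabulary of 4 tokens, a 3-token "document".
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 4))
tokens = np.array([1, 3, 2])
loss = next_token_loss(logits, tokens)
```

With all-zero logits the model is uniform over the vocabulary, so the loss is exactly log(vocab_size); training drives the loss below that baseline.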
Source: GPT-4 Technical Report
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 79 | 11.70% |
| Large Language Model | 47 | 6.96% |
| Question Answering | 37 | 5.48% |
| In-Context Learning | 26 | 3.85% |
| Retrieval | 26 | 3.85% |
| Code Generation | 19 | 2.81% |
| Benchmarking | 17 | 2.52% |
| Prompt Engineering | 14 | 2.07% |
| Sentence | 14 | 2.07% |