Prompt size: compact
At roughly 900 tokens, ChatGPT's prompt is compact: it consumes only about 0.7% of the model's context window before any user interaction, leaving plenty of room for user input, tool calls, and the conversation itself. Short prompts like this tend to trust the underlying model more, relying on fewer explicit rules.
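The 0.7% figure follows directly from the prompt size and the window size; a quick sanity check, assuming a 128K-token context window (the window size is an assumption, not stated above):

```python
prompt_tokens = 900        # approximate system prompt size, from the text
context_window = 128_000   # assumed context window; not stated in the text

fraction = prompt_tokens / context_window
print(f"{fraction:.1%}")   # → 0.7%
```

A different context window would change the percentage proportionally, but at these sizes the prompt stays well under 1% either way.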