A parameter controlling randomness in AI outputs. Lower temperature (0-0.3) gives consistent, focused responses; higher (0.7-1.0) gives more creative, varied ones.
Temperature is a parameter that controls the randomness of AI model outputs. It affects how the model selects among possible next tokens: lower temperatures make selection more deterministic; higher temperatures make it more varied and creative.
How temperature works:
Before sampling, the model's raw scores (logits) for each candidate token are divided by the temperature and then passed through softmax. Temperatures below 1 sharpen the resulting probability distribution toward the most likely tokens; temperatures above 1 flatten it, giving lower-ranked tokens more chance of being picked. At temperature 0, selection is effectively greedy: the most likely token is chosen every time.
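The scaling described above can be sketched in a few lines. This is a minimal illustration with made-up logit values, not any particular model's implementation:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by temperature, then apply softmax.
    Lower temperature sharpens the distribution; higher flattens it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens

cold = softmax_with_temperature(logits, 0.2)  # near-deterministic
warm = softmax_with_temperature(logits, 1.0)  # the model's raw distribution
hot = softmax_with_temperature(logits, 2.0)   # flatter, more varied
```

At temperature 0.2 the top token takes almost all of the probability mass; at 2.0 the three options are much closer to equally likely, which is where the extra variety comes from.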
Temperature settings:
- 0: effectively deterministic; the same prompt returns the same output
- 0-0.3: consistent, focused responses
- 0.7: a common default that balances reliability and variety
- 0.7-1.0: more creative, varied outputs
Temperature recommendations by task:
- Data extraction, classification, factual Q&A: 0-0.2
- Factual customer support: low (0.2-0.3)
- General-purpose tasks: 0.7
- Marketing copy, brainstorming: 0.7-0.9
Note: Temperature interacts with other parameters like top_p (nucleus sampling). Usually, adjust one or the other, not both.
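To see why adjusting both at once is hard to reason about: top_p trims the distribution to the smallest set of tokens whose cumulative probability reaches the threshold, and temperature reshapes that same distribution, so the two interact. A minimal sketch of nucleus sampling over an assumed, already-softmaxed distribution:

```python
def top_p_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p (nucleus sampling), then renormalize the survivors."""
    ranked = sorted(enumerate(probs), key=lambda kv: kv[1], reverse=True)
    kept, cum = [], 0.0
    for idx, p in ranked:
        kept.append((idx, p))
        cum += p
        if cum >= top_p:
            break  # nucleus is complete; drop the remaining tail
    total = sum(p for _, p in kept)
    return {idx: p / total for idx, p in kept}

probs = [0.5, 0.3, 0.15, 0.05]  # hypothetical post-softmax distribution
nucleus = top_p_filter(probs, 0.9)  # keeps tokens 0, 1, 2; drops the tail
```

Because raising temperature flattens the distribution, it also widens the nucleus that a fixed top_p admits, which is why changing both at once compounds unpredictably.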
Use low temperature for factual tasks like data extraction; higher for creative tasks like marketing copy. Default 0.7 works for most cases.
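In practice these recommendations often end up as a per-task preset table. The task names and values below are hypothetical, chosen only to mirror the guidance above:

```python
def temperature_for_task(task_type):
    """Hypothetical per-task presets reflecting the recommendations above."""
    presets = {
        "data_extraction": 0.0,   # consistency is critical
        "customer_support": 0.2,  # factual, focused
        "general": 0.7,           # sensible default
        "brainstorming": 0.8,     # variety is valued
    }
    return presets.get(task_type, 0.7)  # fall back to the 0.7 default
```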
We tune temperature per use case for our Australian clients: factual customer support gets low values, while content generation gets higher values for more engaging output.
"Temperature 0 for invoice data extraction where consistency is critical; temperature 0.8 for brainstorming marketing taglines where variety is valued."