
TOP P

The top p parameter specifies a sampling threshold during inference time.



Top p sampling (sometimes called nucleus sampling) is a technique used to sample possible outcomes
of the model.
Top P controls how many of the candidate results the model should consider for the completion: a lower
value limits creativity, while a higher value expands its horizons.
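To make the idea concrete, here is a minimal sketch of nucleus sampling over a toy token distribution, written with NumPy. It is only an illustration of the technique, not OpenAI's actual implementation, and the function name top_p_sample is made up for this example.

import numpy as np

def top_p_sample(probs, top_p=0.8, rng=None):
    # Keep the smallest set of tokens whose cumulative probability reaches
    # top_p (the "nucleus"), then sample only from that set.
    if rng is None:
        rng = np.random.default_rng()
    order = np.argsort(probs)[::-1]          # token indices, most likely first
    cumulative = np.cumsum(probs[order])     # running total of probability mass
    cutoff = np.searchsorted(cumulative, top_p) + 1
    nucleus = order[:cutoff]                 # tokens that survive the threshold
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()  # renormalize
    return rng.choice(nucleus, p=nucleus_probs)

# Toy distribution over five tokens.
probs = np.array([0.5, 0.25, 0.15, 0.07, 0.03])
print(top_p_sample(probs, top_p=0.8))

With this toy distribution and top_p=0.8, only the three most likely tokens can ever be chosen; the long tail is cut off entirely, which is the effect a low Top P setting has.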


The top p and temperature parameters both control the randomness of the model's output. The OpenAI
documentation recommends using either one parameter or the other and setting the unused parameter to its
neutral value, i.e. 1.0.
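For reference, this is roughly how that advice translates into an API call. The sketch assumes the legacy openai Python package (pre-1.0) and a completion-style model; the model name text-davinci-003 is just an example and may no longer be available, so check the current documentation before running it.

import openai  # legacy openai package (pre-1.0); reads the API key from OPENAI_API_KEY

# Vary top_p while leaving temperature at its neutral value of 1.0,
# following the recommendation to adjust only one of the two.
response = openai.Completion.create(
    model="text-davinci-003",  # example model name; substitute a current one
    prompt="Python is",
    max_tokens=32,
    temperature=1.0,  # neutral
    top_p=0.2,        # low value: only the most likely tokens are considered
)
print(response["choices"][0]["text"])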



The Top P parameter that appears below the temperature also has some control over the randomness of the response, so
make sure that it is at its default value of 1. Leave all other parameters at their default values as well.
With this configuration, GPT-3 will behave in a very predictable way, so this is a good starting point to try things out.
The way it works is that you type some text and GPT-3 adds some more. In the example below, I
typed the text Python is and let GPT-3 complete the sentence.

Before we continue, be aware that GPT-3 does not like input strings that end in a space, as this causes weird and
sometimes unpredictable behaviors. You may have the inclination to add a space after the last word of your input,
so keep in mind that this can cause problems. The Playground will show you a warning if by mistake you leave one
or more spaces at the end of your input.
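If you are building the prompt in code rather than in the Playground, a simple precaution is to strip trailing whitespace before sending it. A small sketch (the helper name clean_prompt is made up for this example):

def clean_prompt(text):
    # Remove trailing spaces and newlines, mirroring the Playground's warning.
    cleaned = text.rstrip()
    if cleaned != text:
        print("warning: trailing whitespace removed from the prompt")
    return cleaned

print(repr(clean_prompt("Python is ")))  # 'Python is'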



Now raise the temperature to 0.5. Delete the text generated above, leaving once again just Python is, and then
submit the request again. This time GPT-3 is going to take more liberties when it completes the sentence.
When you try this yourself, you will most likely get a different result, because with a non-zero temperature
the completions are no longer deterministic.

Feel free to try different values of temperature to see how GPT-3 becomes more or less creative with its responses.
Once you are ready to continue, set the temperature back to 0 and rerun the original Python is request.
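If you prefer to run the same experiment through the API, a sketch along these lines sweeps a few temperature values over the Python is prompt (same assumptions as before: legacy openai package, example model name):

import openai  # legacy openai package (pre-1.0)

# Rerun the same prompt at several temperatures to compare how much
# variety GPT-3 introduces; top_p stays at its neutral value of 1.0.
for temperature in (0.0, 0.5, 1.0):
    response = openai.Completion.create(
        model="text-davinci-003",  # example model name; substitute a current one
        prompt="Python is",
        max_tokens=32,
        temperature=temperature,
        top_p=1.0,
    )
    print(f"temperature={temperature}: {response['choices'][0]['text'].strip()}")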

