
Specifying best_of argument for chat completion #298

Answered by Frogley
Artein asked this question in Q&A

I apologize for the misunderstanding. OpenAI does indeed offer the 'best_of' parameter, but it is only available on the (legacy) completions endpoint, which accepts completion-style models rather than chat models. You can use it like this:

var completionResult = await openAiService.Completions.CreateCompletion(
    new CompletionCreateRequest()
    {
        Prompt = "prompt",
        // The completions endpoint needs a completion model such as text-davinci-003;
        // chat models like GPT-4 are not accepted here.
        Model = OpenAI.ObjectModels.Models.TextDavinciV3,
        Temperature = 0.2f,
        MaxTokens = 500,
        BestOf = 3 // 'best_of' parameter here!
    }, cancellationToken: CancellationToken.None);

However, this parameter is not available in other endpoints, such as ChatCompletions.
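If you need similar behavior with ChatCompletions, one workaround is to request several candidate replies with the n parameter and pick the best one yourself on the client side. Below is a minimal sketch in the same style, reusing the same openAiService instance; it assumes ChatCompletionCreateRequest exposes an N property, and the length-based ranking is only a placeholder for whatever scoring you actually want:

var chatResult = await openAiService.ChatCompletion.CreateCompletion(
    new ChatCompletionCreateRequest()
    {
        Messages = new List<ChatMessage>
        {
            ChatMessage.FromSystem("You are a helpful assistant."),
            ChatMessage.FromUser("prompt")
        },
        Model = OpenAI.ObjectModels.Models.Gpt_4,
        Temperature = 0.2f,
        MaxTokens = 500,
        N = 3 // stand-in for 'best_of': request three candidate replies
    }, cancellationToken: CancellationToken.None);

if (chatResult.Successful)
{
    // Client-side 'best of': rank the candidates yourself.
    // Picking the longest reply is purely a placeholder heuristic.
    var best = chatResult.Choices
        .Select(c => c.Message.Content)
        .OrderByDescending(content => content.Length)
        .First();
    Console.WriteLine(best);
}

Unlike best_of, this returns all n candidates and bills you for each, so the selection logic lives entirely in your own code.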

Answer selected by Artein