Open
Labels
area: responses (This item is related to Responses)
blocked: spec (This issue is blocked on a needed REST API spec update.)
feature-request (Category: A new feature or enhancement to an existing feature is being requested.)
Description
Describe the feature or improvement you are requesting
The API docs mention that you can now set the prompt cache retention to 24 hours instead of the standard 5-10 minutes:
https://platform.openai.com/docs/guides/prompt-caching
{
  "model": "gpt-5.1",
  "input": "Your prompt goes here...",
  "prompt_cache_retention": "24h"
}
It would be brilliant to be able to set this in the .NET SDK too, on both the chat and responses clients. Could it be added to the following options objects: ResponseCreationOptions and ChatCompletionOptions? A rough sketch of what this could look like is below.
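A minimal sketch of what this could look like on the chat client, assuming a hypothetical PromptCacheRetention property on ChatCompletionOptions (the property name and string value are illustrative and not part of the SDK today; they would simply map to the REST "prompt_cache_retention" field shown above):

```csharp
using OpenAI.Chat;

ChatClient client = new("gpt-5.1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

ChatCompletionOptions options = new()
{
    // Hypothetical property (does not exist in the SDK yet); it would serialize to
    // "prompt_cache_retention": "24h" in the request body.
    PromptCacheRetention = "24h",
};

List<ChatMessage> messages = [new UserChatMessage("Your prompt goes here...")];
ChatCompletion completion = client.CompleteChat(messages, options);
```

The same shape would presumably apply to ResponseCreationOptions for the responses client.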
Many thanks.
Additional context
No response