Why OpenAI.Chat.ChatTokenUsage doesn't have a public property for cached_tokens? #263

Answered by joseharriaga
mr-shevchenko asked this question in Q&A
Thank you for reaching out, @mr-shevchenko! We are working to release an update in a few days that will expose this property publicly.

To get you unblocked in the meantime, it should be possible to parse that property manually via the raw response. I believe something like this should work:

// Call the API and grab the raw HTTP response body.
ClientResult<ChatCompletion> result = client.CompleteChat(content);
BinaryData output = result.GetRawResponse().Content;

// Parse the body and read usage.prompt_tokens_details.cached_tokens.
using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString());
int cachedTokenCount = outputAsJson.RootElement
    .GetProperty("usage"u8)
    .GetProperty("prompt_tokens_details"u8)
    .GetProperty("cached_tokens"u8)
    .GetInt32();
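If you want to sanity-check the JSON path outside .NET, the same lookup can be reproduced in a few lines of Python against a sample response body. The payload below is illustrative, not a real API response; the field names follow the chat completions `usage` object.

```python
import json

# Illustrative slice of a chat completion response body
# containing only the usage fields we care about.
sample = json.dumps({
    "usage": {
        "prompt_tokens": 2048,
        "prompt_tokens_details": {"cached_tokens": 1024},
    }
})

# Walk usage -> prompt_tokens_details -> cached_tokens.
doc = json.loads(sample)
cached = doc["usage"]["prompt_tokens_details"]["cached_tokens"]
print(cached)  # 1024
```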

Replies: 2 comments

Answer selected by mr-shevchenko
Category
Q&A
2 participants