Why doesn't OpenAI.Chat.ChatTokenUsage have a public property for cached_tokens? #263
-
In the debugger, I can see cached tokens via the private property OpenAI.Chat.ChatTokenUsage.SerializedAdditionalRawData, but I want to store this information in my application's logs so I can monitor the statistic. This is crucial for controlling expenses: if I find that caching isn't working in my case for some reason, I will summarize the conversation more often.
Replies: 2 comments
-
Thank you for reaching out, @mr-shevchenko! We are working to release an update in a few days that will expose this property publicly. To get you unblocked in the meantime, it should be possible to parse that property manually via the raw response. I believe something like this should work:

```csharp
ClientResult<ChatCompletion> result = client.CompleteChat(content);
BinaryData output = result.GetRawResponse().Content;

using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString());

// GetInt32() returns an int, so store it as one.
int cachedTokenCount = outputAsJson.RootElement
    .GetProperty("usage"u8)
    .GetProperty("prompt_token_details"u8)
    .GetProperty("cached_tokens"u8)
    .GetInt32();
```
-
Thank you! Your solution works. I'm looking forward to the new build. Note for anyone else who needs this: the property name is prompt_tokens_details (with an additional "s").
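Putting that correction together, here is a minimal self-contained sketch of pulling `usage.prompt_tokens_details.cached_tokens` out of a raw payload with `System.Text.Json`. The JSON literal below is an illustrative stand-in for a real chat-completion response, and the helper name is hypothetical; `TryGetProperty` is used so a missing field falls back to 0 instead of throwing.

```csharp
using System;
using System.Text.Json;

class CachedTokenDemo
{
    // Extracts usage.prompt_tokens_details.cached_tokens from a raw
    // chat-completion JSON payload; returns 0 if any level is absent.
    public static int GetCachedTokens(string rawJson)
    {
        using JsonDocument doc = JsonDocument.Parse(rawJson);
        if (doc.RootElement.TryGetProperty("usage", out JsonElement usage) &&
            usage.TryGetProperty("prompt_tokens_details", out JsonElement details) &&
            details.TryGetProperty("cached_tokens", out JsonElement cached))
        {
            return cached.GetInt32();
        }
        return 0;
    }

    static void Main()
    {
        // Illustrative payload shaped like a chat-completion response.
        string raw =
            "{\"usage\":{\"prompt_tokens\":1200," +
            "\"prompt_tokens_details\":{\"cached_tokens\":1024}}}";

        Console.WriteLine(GetCachedTokens(raw)); // prints 1024
    }
}
```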