Extracts token log probabilities from assistant replies within an LLMMessage object.
Each row represents a token with its log probability and top alternative tokens.
Details
Returns an empty tibble if logprobs were not requested when the message was generated. Works with openai_chat(), llamacpp_chat(), and other providers that support logprobs.
Columns include:

- reply_index: The index of the assistant reply in the message history.
- token: The generated token.
- logprob: The log probability of the generated token.
- bytes: The byte-level encoding of the token.
- top_logprobs: A list column containing the top alternative tokens with their log probabilities.
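The workflow above can be sketched as follows. This is a minimal, hedged example: the function name `get_logprobs()` and the `.logprobs`/`.top_logprobs` arguments to `openai_chat()` are assumptions based on the providers mentioned here and may differ in your installed version.

```r
library(tidyllm)

# Request logprobs at generation time; argument names are assumptions
conversation <- llm_message("Name one planet in our solar system.") |>
  chat(openai_chat(.logprobs = TRUE, .top_logprobs = 3))

# Extract one row per generated token (assumed accessor name)
lp <- get_logprobs(conversation)

# Inspect the most uncertain tokens and their alternatives
lp[order(lp$logprob), c("token", "logprob", "top_logprobs")]
```

If the conversation was generated without logprobs enabled, the extractor returns an empty tibble rather than erroring, so downstream code can branch on `nrow(lp) == 0`.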
