simonw/llm 0.19a0

Pre-release · 13 months ago
  • Tokens used by a response are now logged to new input_tokens and output_tokens integer columns and a token_details JSON string column, for the default OpenAI models and for models from other plugins that implement this feature (see the query example after this list). #610
  • llm prompt now takes a -u/--usage flag to display token usage at the end of the response, as shown in the example after this list.
  • llm logs -u/--usage shows token usage information for logged responses.
  • Responses from llm prompt ... --async are now logged to the database. #641
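
The two usage flags combine roughly like this on the command line. This is only a sketch: the model name and log count are illustrative, and the exact wording of the printed usage summary may vary.

    # Ask for a completion and print a token usage summary after the response
    llm prompt -m gpt-4o-mini -u 'Three short names for a pet pelican'

    # Include usage information for the three most recent logged responses
    llm logs -u -n 3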
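
The new columns can also be inspected directly in the SQLite database that llm logs to. The sqlite3 query below is a sketch: it assumes the logs live in the database reported by llm logs path and that responses is the table holding the new columns; adjust the table name if your schema differs.

    # Locate the SQLite database used for logging
    llm logs path

    # Query the new token columns for the most recent responses
    # (the "responses" table name is an assumption about the logs schema)
    sqlite3 "$(llm logs path)" \
      "select model, input_tokens, output_tokens, token_details
       from responses order by id desc limit 3;"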
