langchain-ai/langchain: langchain-ollama==0.3.0

Changes since langchain-ollama==0.2.3

langchain-ollama 0.3.0 changes the default method for with_structured_output to Ollama's dedicated structured output feature, which corresponds to method="json_schema". Previously, with_structured_output used Ollama's tool-calling features by default.

To restore the old behavior, explicitly specify method="function_calling" when calling with_structured_output:

from langchain_ollama import ChatOllama

# Pass method="function_calling" to keep the pre-0.3.0 tool-calling behavior
llm = ChatOllama(model="...").with_structured_output(
    schema, method="function_calling"
)
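
For comparison, here is a minimal sketch of the new default path; the Pydantic schema and the model name are illustrative assumptions, not taken from the release notes:

from pydantic import BaseModel
from langchain_ollama import ChatOllama

# Hypothetical schema, used only for illustration
class Answer(BaseModel):
    value: int
    explanation: str

# With no method argument, 0.3.0 defaults to Ollama's structured output (method="json_schema")
llm = ChatOllama(model="llama3.1").with_structured_output(Answer)
result = llm.invoke("What is 3^3?")  # result is an Answer instance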

Other features

Added support for parsing reasoning content in DeepSeek models:

llm = ChatOllama(model="deepseek-r1:1.5b", extract_reasoning=True)

result = llm.invoke("What is 3^3?")
result.content  # "3^3 is..."
result.additional_kwargs["reasoning_content"]  # "<think> To calculate 3^3, I start by... </think>"

Detailed changelog

ollama: release 0.3.0 (#30420)
ollama: add reasoning model support (e.g. deepseek) (#29689)
(Ollama) Fix String Value parsing in _parse_arguments_from_tool_call (#30154)
ollama[minor]: update default method for structured output (#30273)
langchain_ollama: Support keep_alive in embeddings (#30251) (see the sketch after this list)
core[patch]: update structured output tracing (#30123)
core: basemessage.text() (#29078)
multiple: fix uv path deps (#29790)
infra: add UV_FROZEN to makefiles (#29642)
infra: migrate to uv (#29566)
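
As a rough sketch of the keep_alive embeddings support referenced above (#30251), assuming keep_alive is exposed as a constructor field on OllamaEmbeddings; the embedding model name and the "5m" value are illustrative assumptions:

from langchain_ollama import OllamaEmbeddings

# keep_alive controls how long the model stays loaded after the request;
# value formats follow Ollama's keep_alive convention (e.g. "5m", or 0 to unload immediately)
embeddings = OllamaEmbeddings(model="nomic-embed-text", keep_alive="5m")
vector = embeddings.embed_query("What is 3^3?")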
