# What's New

## Features
- Cross-chunk context awareness for coreference resolution (#306)
  - Resolves pronouns and references across chunk boundaries (e.g., "She" → "Dr. Sarah Johnson")
  - New `context_window_chars` parameter on `extract()`
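To illustrate the idea behind `context_window_chars`, here is a minimal sketch (the `chunk_with_context` helper is hypothetical, not the library's API): each chunk is paired with the tail of the preceding text so references like "She" can be resolved across a chunk boundary.

```python
# Toy sketch of cross-chunk context: when splitting a document into chunks,
# carry the last `context_window_chars` characters of the preceding text
# along with each chunk so a pronoun at the start of a chunk can be resolved
# against the entity it refers to. (`chunk_with_context` is a hypothetical
# helper, not the library's API.)

def chunk_with_context(text: str, chunk_size: int, context_window_chars: int):
    """Yield (context, chunk) pairs; context is the tail of the prior text."""
    for start in range(0, len(text), chunk_size):
        context = text[max(0, start - context_window_chars):start]
        yield context, text[start:start + chunk_size]

doc = "Dr. Sarah Johnson led the trial. She reported the results."
pairs = list(chunk_with_context(doc, chunk_size=33, context_window_chars=33))
# The second chunk starts with "She", but its context window still contains
# "Dr. Sarah Johnson", so the reference can be resolved.
```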
## Bug Fixes
- Load builtin providers before resolution regardless of config path (#419)
  - Fixes `InferenceConfigError` when specifying a provider by name via `ModelConfig(provider='ollama')`
- Graceful handling of chunks with no extractable entities (#423)
  - `suppress_parse_errors` now defaults to `True` in `extract()` so one unparseable chunk does not fail the entire document
  - Sanitizes the suppress-parse-error log path to exclude raw chunk text
- Send `keep_alive` at top level for Ollama API (#421)
- Support Enum/dataclass values in GCS batch cache hashing (#359)
- Handle non-Gemini model output parsing edge cases (#300)
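For context on the `keep_alive` fix (#421): Ollama's HTTP API accepts `keep_alive` as a top-level field of the request body rather than inside `options`. An illustrative request payload (model name and values are examples):

```json
{
  "model": "gemma2:2b",
  "prompt": "Extract the entities...",
  "keep_alive": "5m",
  "options": {"temperature": 0.0}
}
```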
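The provider-loading fix (#419) comes down to registration order. A minimal sketch, not the library's actual code: provider resolution is a name lookup against a registry, so if the builtin providers have not been registered before the lookup, a name like `'ollama'` cannot be found and a configuration error is raised.

```python
# Sketch (hypothetical names) of why builtins must load before resolution:
# resolving a provider by name is a registry lookup, so an empty registry
# raises an error akin to InferenceConfigError even for a builtin provider.

class InferenceConfigError(Exception):
    pass

_REGISTRY: dict = {}

def register_builtin_providers():
    # Stand-ins for the real provider classes.
    class OllamaProvider: ...
    class GeminiProvider: ...
    _REGISTRY["ollama"] = OllamaProvider
    _REGISTRY["gemini"] = GeminiProvider

def resolve(provider_name: str):
    register_builtin_providers()  # the fix: always load builtins first
    try:
        return _REGISTRY[provider_name]
    except KeyError:
        raise InferenceConfigError(f"Unknown provider: {provider_name!r}")

provider_cls = resolve("ollama")
```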
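The parse-error suppression change (#423) can be sketched as per-chunk error handling: with `suppress_parse_errors=True` (now the default), a chunk whose model output cannot be parsed contributes no extractions instead of aborting the whole document. `parse_chunk` and `extract_all` below are hypothetical stand-ins, and logging only the chunk index (never the raw text) mirrors the sanitized log path.

```python
# Toy sketch of per-chunk parse-error suppression (hypothetical helpers,
# not the library's internals).
import json

def parse_chunk(raw_output: str) -> list:
    return json.loads(raw_output)["entities"]

def extract_all(raw_outputs: list, suppress_parse_errors: bool = True):
    entities = []
    for i, raw in enumerate(raw_outputs):
        try:
            entities.extend(parse_chunk(raw))
        except (json.JSONDecodeError, KeyError):
            if not suppress_parse_errors:
                raise
            # Log only chunk metadata, never the raw chunk text.
            print(f"chunk {i}: unparseable output skipped")
    return entities

outputs = ['{"entities": [{"name": "Dr. Sarah Johnson"}]}', "not json"]
result = extract_all(outputs)
```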
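The cache-hashing fix (#359) can be illustrated by normalizing Enum and dataclass values to plain JSON-serializable data before hashing, so configs containing them produce a stable cache key instead of failing to serialize. `cache_key` and the types below are illustrative, not the library's code.

```python
# Sketch of hashing config values that contain Enums and dataclasses:
# normalize to plain data, then hash a canonical JSON encoding.
# (All names here are illustrative.)
import dataclasses
import enum
import hashlib
import json

class Format(enum.Enum):
    JSON = "json"
    YAML = "yaml"

@dataclasses.dataclass
class BatchConfig:
    model_id: str
    fmt: Format

def _normalize(value):
    if isinstance(value, enum.Enum):
        return value.value
    if dataclasses.is_dataclass(value):
        return {k: _normalize(v) for k, v in dataclasses.asdict(value).items()}
    return value

def cache_key(config) -> str:
    payload = json.dumps(_normalize(config), sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

key = cache_key(BatchConfig(model_id="example-model", fmt=Format.JSON))
```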
## Documentation
- Clarify that ungrounded extractions have `char_interval=None` (#420)
- Clarify best practices for few-shot examples (#302)
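In practice, the `char_interval=None` convention means grounded extractions can be separated from ungrounded ones with a simple `None` check. The `Extraction` dataclass below is a toy stand-in for the library's extraction objects.

```python
# Hedged sketch: ungrounded extractions carry char_interval=None (#420), so
# grounded ones can be filtered with a None check. Extraction here is a toy
# stand-in, not the library's class.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Extraction:
    text: str
    char_interval: Optional[Tuple[int, int]]  # None when ungrounded

extractions = [
    Extraction("Dr. Sarah Johnson", (0, 17)),
    Extraction("the trial", None),  # ungrounded: no source span
]
grounded = [e for e in extractions if e.char_interval is not None]
```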
**Full Changelog**: v1.1.1...v1.2.0