DSPy 3.0.0 (stanfordnlp/dspy)

The work in the run-up to DSPy 3.0 focused on powerful new optimizers (RL via dspy.GRPO and our new Arbor library; reflective prompt evolution via dspy.GEPA and dspy.SIMBA), extensibility (dspy.Adapter and dspy.Type), reliability and observability for production (tight integration with MLflow 3.0), and more. Much of this work incubated during the late 2.6 releases and matured in 3.0.

✨ Highlights across 2.6 → 3.0

Adapters & Types (extensibility without prompt plumbing)

  • Adapters: built-in dspy.ChatAdapter, dspy.JSONAdapter, dspy.XMLAdapter, dspy.BAMLAdapter, token/status streaming, async paths, and intelligent fallback to native LLM structured outputs.
  • Types: multi-modal I/O via dspy.Image and dspy.Audio; composite types (e.g., list[dspy.Image], Pydantic models); higher-level I/O like dspy.History and dspy.ToolCalls. Custom types now “just work” with adapters (dspy.Type).
  • Tooling integrations: MCP servers and LangChain tools supported out-of-the-box.

Modules, Runtime & DX

  • Better scalability: Module.batch with thread-safe DSPy settings; native DSPy async; high-concurrency, configurable caches.
  • Smoother DX: intermediate status streaming, output streaming from any layer, usage tracking, per-module history, rich callbacks.
  • Deployment/portability: stable save/load (state-only or whole-program), and the prompt-management layer can be exported via Adapters.
  • New/updated modules: dspy.CodeAct, dspy.Refine, improved ReAct, a more reliable PythonInterpreter.

Observability & MLOps

  • Native observability with MLflow 3.0: tracing, optimizer tracking, and improved deployment flows (plus docs/tutorials).

Optimizers

  • MIPROv2: substantially more reliable with automatic hyperparameter selection and many fixes.
  • GRPO (RL on DSPy programs): our new Arbor library for RL training of compound AI systems.
  • SIMBA: powerful prompt optimizer that learns from custom feedback (great for agentic/long-horizon tasks).
  • GEPA (Genetic-Pareto): a new optimizer that builds a Pareto tree of prompts, uses NL reflection to extract/validate lessons, and can produce shorter prompts while improving downstream performance. Early results show wins over MIPROv2 on several tasks and promising inference-time search behavior (see paper announcement for details).

Some of the above first landed in late 2.6 (e.g., early streaming, initial adapters/types improvements), and were consolidated/matured in 3.0.


💥 Breaking changes / notable removals

  • Remove community retrievers (#8073). If you relied on unmaintained retriever integrations, migrate to custom code (or use Tool/MCP integrations).
  • Drop Python 3.9 support. 3.10–3.13 are supported.
  • Alias removed: dspy.Program (an alias of dspy.Module) has been removed; use dspy.Module directly.
  • Various deprecations promised in 2.5 were applied during the 2.6.0 release candidates (e.g., the old functional/ module, dsp/ clients, and legacy caches, examples, and tests).

🆚 Changes since 3.0.0b4

Maintenance

  • Migrate GitHub Actions from set-output to $GITHUB_OUTPUT (@kurtmckee, #8557).
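The `set-output` workflow command is deprecated; the migration replaces it with appending `key=value` lines to the file GitHub exposes via `$GITHUB_OUTPUT`. A sketch (the fallback path is only for running locally outside Actions):

```shell
# Before (deprecated workflow command):
#   echo "::set-output name=version::3.0.0"

# After: append key=value lines to the $GITHUB_OUTPUT file.
GITHUB_OUTPUT="${GITHUB_OUTPUT:-/tmp/github_output.txt}"  # local fallback
echo "version=3.0.0" >> "$GITHUB_OUTPUT"
```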

New contributor: @kurtmckee 🎉

Full Changelog: 3.0.0b4...3.0.0


👏 Contributors since 2.6

Thank you to everyone who contributed code, docs, reviews, issues, and testing!

🎉 = first-time contributor since 2.6.15

@AkeemMcLennon, @aliirz, @amas0, @apieum 🎉, @arnavsinghvi11, @asad-aali 🎉, @asparagus, @assadyousuf 🎉, @BenMcH 🎉, @bjsi, @brenorb 🎉, @brishin 🎉, @BTripp1986 🎉, @carsonkahn-external 🎉, @cezarc1, @chakravarthik27, @charviupreti 🎉, @chenmoneygithub, @codingDuan 🎉, @CyrusNuevoDia, @danielsparing 🎉, @dbczumar, @dilarasoylu, @dimroc, @Dyadd, @emmanuel-ferdman 🎉, @erandeutsch, @estsauver, @fswair 🎉, @GabeDottl, @glesperance, @grisaitis 🎉, @hmoazam, @Harryllh, @Hangzhi, @hung-phan, @isaacbmiller, @itay1542, @Jdogtherock, @JHMuir, @jjjjw, @jinnovation, @jmho, @jmhb0 🎉, @kanjurer, @kalanyuz, @ken-dwyer 🎉, keyuchen21, @klopsahlong, @koptagel 🎉, @kurtmckee 🎉, @LukasMurdock 🎉, @laitifranz, @LakshyAAAgrawal 🎉, @lxdlam, @MaximeRivest 🎉, @MaxwellSalmon, @Miyamura80 🎉, @myz96, @neilbhutada, @niklovescoding 🎉, @nillwyc 🎉, @okhat, @olesyash 🎉, @poudro 🎉, @prrao87 🎉, @patcher9, @rcanand, @rifolio 🎉, @Samoed, @SanjanShiv 🎉, @srowen 🎉, @Shangyint, @stevapple, @tvdaptible 🎉, @TomeHirata, @tikoehle 🎉, @Timtech4u, @thomasahle, @ulgens, @vakinapalli 🎉, @vacmar01 🎉, @veronicalyu320, @vincentkoc, @weklund 🎉, @willsmithDB, @xinyij-goo 🎉, @yuruofeifei, @Ziems, @zbambergerNLP, @gkorland, @yanmxa, @mikeedjones, @b-d055, @mikeweltevredem, @ItzAmirreza, @GangGreenTemperTatum, @shermansiu, @dmavrommatis, @iPersona, @Krishn1412, @Y-1huadb, @B-Step62, @tkellogg

First-time contributors called out since 2.6.15 (rollup):
@rifolio, @charviupreti, @BenMcH, @GangGreenTemperTatum, @koptagel, @Miyamura80, @tikoehle, @srowen, @assadyousuf, @xinyij-goo, @SanjanShiv, @emmanuel-ferdman, @Y-1huadb, @Krishn1412, @BTripp1986, @vincentkoc, @estsauver, @jmho, @JHMuir, @neilbhutada, @carsonkahn-external, @dimroc, @erandeutsch, @willsmithDB, @iPersona, @brishin, @dmavrommatis, @codingDuan, @Hangzhi, @poudro, @LukasMurdock, @ken-dwyer, @grisaitis, @vacmar01, @fswair, @nillwyc, @asad-aali, @niklovescoding, @MaximeRivest, @brenorb, @weklund, @vakinapalli, @tvdaptible, @jmhb0, @olesyash, @kurtmckee, @prrao87, @danielsparing, @apieum, @LakshyAAAgrawal.


🔗 Compare

  • Full changes: 2.6.27...3.0.0b1, 3.0.0b1...3.0.0b2, 3.0.0b2...3.0.0b3, 3.0.0b3...3.0.0b4, 3.0.0b4...3.0.0.
