github dagster-io/dagster 1.5.0
1.5.0 (core) / 0.21.0 (libraries) "How Will I Know"


Major Changes since 1.4.0 (core) / 0.20.0 (libraries)

Core

  • Improved ergonomics for execution dependencies in assets - We introduced a set of APIs to simplify working with Dagster assets that don't use the I/O manager system for handling data between assets. I/O manager workflows will not be affected.

    • AssetDep type allows you to specify upstream dependencies with partition mappings when using the deps parameter of @asset and AssetSpec.
    • MaterializeResult can be optionally returned from an asset to report metadata about the asset when the asset handles any storage requirements within the function body and does not use an I/O manager.
    • AssetSpec has been added as a new way to declare the assets produced by @multi_asset. When using AssetSpec, the @multi_asset does not need to return any values to be stored by the I/O manager. Instead, it should handle any storage requirements in the body of the function.
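
    A minimal sketch combining these APIs (the asset names and metadata are illustrative, and the storage step is left as a comment):

    ```python
    from dagster import AssetDep, MaterializeResult, asset

    @asset(deps=[AssetDep("upstream_asset")])
    def my_table() -> MaterializeResult:
        # Handle storage in the function body instead of returning data
        # to an I/O manager; write_rows is a hypothetical helper.
        rows = [("a", 1), ("b", 2)]
        # write_rows(rows)
        return MaterializeResult(metadata={"num_rows": len(rows)})
    ```
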
  • Asset checks (experimental) - You can now define, execute, and monitor data quality checks in Dagster [docs].

    • The @asset_check decorator, as well as the check_specs argument to @asset and @multi_asset enable defining asset checks.
    • Materializing assets from the UI will default to executing their asset checks. You can also execute individual checks.
    • When viewing an asset in the asset graph or the asset details page, you can see whether its checks have passed, failed, or haven’t run successfully.
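
    As a sketch (the asset, check name, and row count are illustrative; a real check would inspect the stored data):

    ```python
    from dagster import AssetCheckResult, asset, asset_check

    @asset
    def orders():
        # Hypothetical asset; imagine this produces an orders table.
        return [{"id": 1}, {"id": 2}]

    @asset_check(asset=orders)
    def orders_not_empty() -> AssetCheckResult:
        # Report pass/fail plus metadata that shows up in the UI.
        row_count = 2
        return AssetCheckResult(passed=row_count > 0, metadata={"row_count": row_count})
    ```
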
  • Auto materialize customization (experimental) - AutoMaterializePolicies can now be customized [docs].

    • All policies are composed of a set of AutoMaterializeRules which determine if an asset should be materialized or skipped.
    • To modify the default behavior, rules can be added to or removed from a policy to change the conditions under which the asset will be materialized.
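
    For instance, a sketch that starts from the eager policy and drops one of its default skip rules (the asset name is illustrative):

    ```python
    from dagster import AutoMaterializePolicy, AutoMaterializeRule, asset

    # Start from the standard eager policy, then remove the rule that skips
    # materialization when a parent asset is missing.
    policy = AutoMaterializePolicy.eager().without_rules(
        AutoMaterializeRule.skip_on_parent_missing(),
    )

    @asset(auto_materialize_policy=policy)
    def downstream_asset():
        ...
    ```
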

dagster-pipes

  • Dagster Pipes is a new library that implements a protocol for launching compute into external execution environments and consuming streaming logs and Dagster metadata from those environments. See #16319 for more details on the motivation and vision behind Pipes.
  • Out-of-the-box integrations
    • Clients: local subprocess, Docker containers, Kubernetes, and Databricks
      • PipesSubprocessClient, PipesDockerClient, PipesK8sClient, PipesDatabricksClient
    • Transport: Unix pipes, filesystem, S3, DBFS
    • Languages: Python
  • Dagster Pipes is composable with existing launching infrastructure via open_pipes_session. One can augment existing invocations rather than replacing them wholesale.

Since 1.4.17 (core) / 0.20.17 (libraries)

New

  • [ui] Global Asset Graph performance improvement - the first time you load the graph it will be cached to disk, and any subsequent load should be nearly instant.

Bugfixes

  • Fixed a bug where deleted runs could retain instance-wide op concurrency slots.

Breaking Changes

  • AssetExecutionContext is now a subclass of OpExecutionContext, not a type alias. Code such as the following

```python
def my_helper_function(context: AssetExecutionContext):
    ...

@op
def my_op(context: OpExecutionContext):
    my_helper_function(context)
```

will cause type checking errors. To migrate, update type hints to respect the new subclassing.

  • AssetExecutionContext cannot be used as the type annotation for @ops run in @jobs. To migrate, update the type hint in @op to OpExecutionContext. @ops that are used in @graph_assets may still use the AssetExecutionContext type hint.

```python
# old
@op
def my_op(context: AssetExecutionContext):
    ...

# correct
@op
def my_op(context: OpExecutionContext):
    ...
```

  • [ui] We have removed the option to launch an asset backfill as a single run. To achieve this behavior, add backfill_policy=BackfillPolicy.single_run() to your assets.

Community Contributions

  • has_dynamic_partition implementation has been optimized. Thanks @edvardlindelof!
  • [dagster-airbyte] Added an optional stream_to_asset_map argument to build_airbyte_assets to support the Airbyte prefix setting with special characters. Thanks @chollinger93!
  • [dagster-k8s] Moved “labels” to a lower precedence. Thanks @jrouly!
  • [dagster-k8s] Improved handling of failed jobs. Thanks @Milias!
  • [dagster-databricks] Fixed an issue where DatabricksPysparkStepLauncher fails to get logs when job_run doesn’t have cluster_id at root level. Thanks @PadenZach!
  • Docs typo fix from @sethusabarish, thank you!

Documentation

  • Our Partitions documentation has gotten a facelift! We've split the original page into several smaller, more focused pages.

Dagster Cloud

  • New dagster-insights sub-module - We have released an experimental dagster_cloud.dagster_insights module that contains utilities for capturing and submitting external metrics about data operations to Dagster Cloud via an API. Dagster Cloud Insights is a soon-to-be-released feature that improves visibility into usage and cost metrics, such as run duration and Snowflake credits, in the Cloud UI.
