`partition_set_defs` directly. The previous loading scheme for these definitions is deprecated and expected to be removed in 0.8.0.
- Marked published modules as Python 3.8 compatible.
- The `dagster-airflow` package supports loading all Airflow DAGs within a directory path, file path,
  or Airflow `DagBag`.
- The `dagster-airflow` package supports loading all 23 DAGs in the Airflow `example_dags` folder and
  executing 17 of them.
- The `dagster-celery` CLI tools now allow you to pass additional arguments through to the underlying
  celery CLI; e.g., running `dagster-celery worker start -n my-worker -- --uid=42` will pass the
  `--uid` flag to celery.
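The pass-through convention above follows the common `--` separator pattern: everything before `--` is parsed by the wrapper CLI, and everything after it is forwarded verbatim to the underlying tool. A minimal, generic sketch of that split (an illustration only, not `dagster-celery`'s actual implementation):

```python
def split_passthrough(argv):
    """Split an argv list at the first '--' separator.

    Returns (own_args, forwarded_args): own_args are parsed by the
    wrapper CLI; forwarded_args are passed verbatim to the wrapped tool.
    """
    if "--" in argv:
        i = argv.index("--")
        return argv[:i], argv[i + 1:]
    return argv, []

# Mirrors the example from the changelog entry:
ours, forwarded = split_passthrough(
    ["worker", "start", "-n", "my-worker", "--", "--uid=42"]
)
# ours      -> ["worker", "start", "-n", "my-worker"]
# forwarded -> ["--uid=42"]
```

The wrapper would then append `forwarded` unchanged to the command line it builds for the underlying celery invocation.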
- It is now possible to create a `PresetDefinition` that has no environment defined.
- Added a `dagster schedule debug` command to help debug scheduler state.
- `SystemCronScheduler` now verifies that a cron job has been successfully added to the crontab
  when turning a schedule on, and shows an error message if unsuccessful.
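The verification step above can be sketched roughly as follows: after writing the cron job, re-read the crontab and confirm that some active (non-comment) line carries the schedule's entry. This is a hypothetical helper for illustration, not `SystemCronScheduler`'s actual code, and `schedule_marker` is an assumed identifier embedded in the cron line:

```python
def cron_entry_present(crontab_text, schedule_marker):
    """Return True if an active crontab line contains the schedule marker.

    Blank lines and commented-out lines (starting with '#') do not count,
    since cron ignores them.
    """
    return any(
        schedule_marker in line
        for line in crontab_text.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    )
```

A scheduler turning a schedule on would write the cron entry, read back the crontab (e.g. via `crontab -l`), and surface an error to the user when this check returns `False`.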
- Running `dagster instance migrate` is required for this release to support the new experimental asset features.
- Runs created prior to 0.7.8 will no longer render their execution plans as DAGs; we now render
  only execution plans that have been persisted. Logs are still available.
- `Path` is no longer valid in config schemas.
- Removed the `@pyspark_solid` decorator; its functionality, which was experimental, is subsumed by
  requiring a `StepLauncher` resource (e.g. `emr_pyspark_step_launcher`) on the solid.
- Merged the "re-execute", "single-step re-execute", and "resume/retry" buttons into one "re-execute"
  button with three dropdown selections on the Run page.
- Added a new `asset_key` string parameter to Materializations and created a new "Assets" tab in Dagit
  to view pipelines and runs associated with these keys. The API and UI of these asset-based features
  are likely to change, but feedback is welcome and will be used to inform these changes.
- Added an `emr_pyspark_step_launcher` that enables launching PySpark solids on EMR. The
  `simple_pyspark` example demonstrates how it's used.
- Fixed an issue when running Jupyter notebooks in a Python 2 kernel through dagstermill with dagster
running in Python 3.
- Improved error messages produced when dagstermill spins up an in-notebook context.
- Fixed an issue with retrieving step events from