March 15th, 2024

We are excited to announce that a new release of our CLI is now available, bringing major integrations, improvements, and bug fixes to SDF. Interested in our product roadmap, or want to discuss features? Reach out to your SDF point of contact with any feedback!

SDF Remote Run for Snowflake

This release introduces a compute setting within the configuration objects of the workspace.sdf.yml file, adds a new table type RemoteTable, and renames the table type managed to RemoteExternal.

SDF Remote Run allows users to execute sdf run so that models run on Snowflake compute instead of local compute. To switch from the default of local to Snowflake, update the compute: field within the workspace file.
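As an illustrative sketch, the change might look like the following (the surrounding block structure, the workspace name, and the accepted value "snowflake" are assumptions; consult the SDF docs for the exact syntax your version accepts):

```yaml
# workspace.sdf.yml — sketch only; field placement and accepted
# values for compute: are assumptions
workspace:
  name: my_workspace     # hypothetical workspace name
  compute: snowflake     # default is local
```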

Snowflake & Redshift Table Providers

Integrate directly with Snowflake and Redshift to hydrate table schemas and execute queries. SDF uses remote Snowflake and Redshift schema information to perform type bindings and other static analysis checks on the remote workspace.

Simply add a provider block to the workspace.sdf.yml file and update the dialect to reflect the appropriate dialect used:
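For illustration, a provider block could look roughly like this (the exact key names and the credential reference are assumptions; the integration docs linked below have the authoritative shape):

```yaml
# workspace.sdf.yml — sketch only; the provider keys shown here
# are assumptions, see the Snowflake / Redshift integration docs
workspace:
  name: my_workspace               # hypothetical
  dialect: snowflake               # or redshift, to match your warehouse
  provider:
    type: snowflake                # assumed key name and value
    credential: my_warehouse_creds # hypothetical credential name
```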

For more information, check out the specific docs on Snowflake Integration and AWS & Redshift Integration.

Integrate SDF Into Your DBT Projects to Extend Functionality

SDF now works alongside existing DBT projects. SDF allows current DBT project capabilities to be extended with column-level lineage, checks, and data classifications/governance.

To initialize your DBT project as an SDF project, follow these instructions: DBT Integration

Take SDF Into Production With CI/CD

SDF now offers integration with CI/CD workflows, and we have prepared example documentation to help amplify the power and value that SDF delivers.

SDF’s static analysis can supercharge CI/CD workflows for data teams. By compiling your SQL on each pull request, SDF can statically ensure new updates to your models are valid SQL and do not break anything downstream. Additionally, SDF supports the use of checks in CI/CD. This feature allows users to define their own static tests, further enhancing the robustness and reliability of your data models. Unlike other CI/CD data tools, SDF checks everything statically, so you never have to incur compute costs to validate SQL and run impact analysis.

This new CI/CD Guide will walk you through how to integrate SDF into an example CI/CD workflow with GitHub Actions to run sdf compile and sdf test on new pull requests and subsequent updates.
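As a rough sketch, a GitHub Actions workflow along these lines could run both commands on pull requests (the installation step is a placeholder and the action versions are assumptions; the CI/CD Guide has the canonical workflow):

```yaml
# .github/workflows/sdf-ci.yml — sketch only
name: SDF CI
on:
  pull_request:

jobs:
  sdf-checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder step — install the SDF CLI per the SDF docs
      - name: Install SDF CLI
        run: ./scripts/install_sdf.sh   # hypothetical script path
      - name: Compile models
        run: sdf compile
      - name: Run static checks
        run: sdf test
```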

SDF Push Stability Changes and Updates

We have made major improvements to how we compile and manage workspaces in the cloud. These make sdf push much more reliable and improve performance across the entire web console.

Workspaces will now be compiled in the cloud instead of pushing up the compiled output. This enables the SDF Cloud to reliably mimic local compilation environments.

With this change, only files included via the includes: block within the workspace.sdf.yml will be pushed. If you are using dbt seeds with SDF as an extension of its capabilities, you will need to add the seeds to the includes block, for example:

workspace.sdf.yml
includes:
   ...
   - path: seeds/*
     type: resource

If your workspace contains a table provider, credentials used to fetch DDLs will be encrypted during the sdf push. These credentials will be used in the SDF Cloud to pull down the table DDLs from the specific provider. All credentials in SDF Cloud are encrypted at rest according to our SOC2 Compliance, and running sdf push --delete will immediately remove the encrypted credentials and any history of them from the SDF Cloud.
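In practice, the flow looks roughly like this (a sketch; both commands appear above, and the comments only restate the behavior described in this section):

```shell
# Push the workspace; provider credentials are encrypted during
# the push and at rest in the SDF Cloud.
sdf push

# Remove the workspace from the SDF Cloud, including the encrypted
# credentials and any history of them.
sdf push --delete
```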

New Docs!

General bug fixes and stability improvements:

  • Fixed binding errors and type checking issues
  • Fixed a bug in the depth selector
  • Fixed workspace caching errors

Latest Builds

Architecture          Version   Download
Linux Intel X86-64    0.1.184   Download
Linux Arm ARM-64      0.1.184   Download
Apple Intel X86-64    0.1.184   Download
Apple Arm AArch64     0.1.184   Download