This document contains the help content for the sdf command-line program.
sdf
Command Overview:
sdf
  sdf new
  sdf clean
  sdf compile
  sdf lint
  sdf format
  sdf lsp1
  sdf run
  sdf build
  sdf test
  sdf stats
  sdf report
  sdf check
  sdf lineage
  sdf push
  sdf system
    sdf system update
    sdf system uninstall
  sdf auth
    sdf auth login
      sdf auth login aws
      sdf auth login snowflake
      sdf auth login bigquery
    sdf auth logout
      sdf auth logout aws
      sdf auth logout openai
      sdf auth logout snowflake
      sdf auth logout bigquery
    sdf auth status
  sdf man
    sdf man cli
    sdf man functions
    sdf man definition-schema
    sdf man event-schema
    sdf man information-schema
    sdf man error-codes
  sdf init
  sdf dbt
    sdf dbt init
    sdf dbt refresh
  sdf exec
sdf
SDF: A fast SQL compiler, local development framework, and in-memory analytical database
Usage: sdf [OPTIONS] COMMAND
new
— Create a new sdf workspace
clean
— Remove artifacts that sdf has generated in the past
compile
— Compile models
lint
— Lint models
format
— Format models
lsp1
— Lsp wave1
run
— Run models
build
— Run model and tests, and if successful, publish, aka “write-audit-publish”
test
— Test your models
stats
— Statistics for your data
report
— Report code quality
check
— Check code quality
lineage
— Display lineage for a given table and/or column
push
— Push a local workspace to the SDF Service
system
— System maintenance, install and update
auth
— Authenticate CLI to services like SDF, AWS, OpenAI, etc
man
— Display reference material, like the CLI, dialect specific functions, schemas for authoring and interchange
init
— Initialize a workspace interactively
dbt
— Initialize an sdf workspace from an existing dbt project
exec
— Execute custom scripts
--log-level LOG_LEVEL
— Set log level
Possible values: trace
, debug
, debug-pretty
, info
, warn
, error
--log-file LOG_FILE
— Creates or replaces the log file
--show-all-errors
— Don’t suppress errors
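For example, global options precede the subcommand (the log file name is illustrative):
  sdf --log-level debug --log-file sdf.log compile
  sdf --show-all-errors compile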
sdf new
Create a new sdf workspace
Usage: sdf new [OPTIONS] [PATH]
PATH
— Create a new sdf workspace at path
--list-samples
— List all available samples
Default value: false
--sample SAMPLE
— Create a workspace with the sample content at path
-s
, --show SHOW
— Display progress messages
Default value: progress
Possible values: progress
, none
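For example (the workspace path is illustrative; sample names come from --list-samples):
  sdf new my_workspace
  sdf new --list-samples
  sdf new --sample SAMPLE my_workspace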
sdf clean
Remove artifacts that sdf has generated in the past
Usage: sdf clean [OPTIONS]
-e
, --environment ENVIRONMENT
--path PATH
— Remove artifacts at path that sdf has created in the past [default: current workspace directory]
-s
, --show SHOW
— Display progress messages
Default value: progress
Possible values: progress
, none
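For example (the path is illustrative):
  sdf clean
  sdf clean --path ./my_workspace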
sdf compile
Compile models
Usage: sdf compile [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]
--infer-tables
— Infer table schema and generate DDLs
Default value: false
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
--prefer-local
— When set, minimizes remote DB accesses and relies on local schema information in .sdf.yml files
Default value: false
--no-incremental-mode
— Set incremental-mode to false (default is true)
Default value: false
--no-snapshot-mode
— Set snapshot-mode to false (default is true)
Default value: false
--warn-non-local-executable
Default value: false
--fix
— Fix the linting issues
Default value: false
-w
, --warnings WARNINGS
— Specify the warnings to apply. Example: -w warning1 -w warning2
Available warnings:
all
: Turn on all warnings.
none
: Turn off all warnings.
error
: Treat warnings as errors.
capitalization-columns=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for columns.
capitalization-tables=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for tables.
capitalization-aliases=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for aliases.
type-implicit-conversions
: Warn about implicit type conversions.
references-consistent
: Warn about inconsistent references.
inconsistent-schema
: Warn about schema inconsistency between local and remote schema. Default: on.
duplicate-column-names
: Warn about duplicate column names in the same table.
All warnings are turned off by default, unless specified.
--lineage LINEAGE
— Run lineage and classifier/description propagation. If not set, this is inferred based on need
Possible values: true
, false
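For example (the model path and warning choices are illustrative):
  sdf compile
  sdf compile models/orders.sql --downstream
  sdf compile -q 'select * from t'
  sdf compile -w capitalization-columns=upper -w error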
sdf lint
Lint models
Usage: sdf lint [OPTIONS] [TARGETS]...
TARGETS
— Lint only the given source dirs, and files with .sql extension [default: all model .sql files in workspace]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
--prefer-local
— When set, minimizes remote DB accesses and relies on local schema information in .sdf.yml files
Default value: false
--fix
— Fix the linting issues
Default value: false
-w
, --warnings WARNINGS
— Specify the warnings to apply. Example: -w warning1 -w warning2. Later options override former ones.
Legend: (*) This warning can be automatically fixed by sdf.
Available Warnings
all
: Turn on all warnings.
none
: Turn off all warnings.
error
: Treat warnings as errors.
Capitalization Settings:
capitalization-keywords=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for keywords. Default: consistent (*)
capitalization-literals=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for literals. Default: consistent (*)
capitalization-types=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for types. Default: consistent (*)
capitalization-functions=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for functions. Default: consistent (*)
Convention and Reference Settings:
convention-blocked-words=word
: Specify blocked words.
convention-terminator
: Warn about terminator conventions. (*)
references-keywords
: Warn about keyword references.
references-special-chars=char
: Warn about special character references.
references-quoting
: Warn about quoting references. (*)
references-qualification
: Warn about qualification references.
references-columns
: Warn about ambiguous column references.
Structure-Specific Warnings:
structure-else-null
: Warn about ELSE NULL in structures. (*)
structure-simple-case
: Warn about simple CASE structures. (*)
structure-unused-cte
: Warn about unused CTEs. (*)
structure-nested-case
: Warn about nested CASE structures.
structure-distinct
: Warn about DISTINCT usage. (*)
structure-subquery
: Warn about subquery structures.
structure-join-condition-order
: Warn about join condition order.
structure-column-order
: Warn about column order in structures.
More warnings available as part of sdf compile …
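For example (the target path and warning selection are illustrative; later -w options override former ones):
  sdf lint
  sdf lint -w capitalization-keywords=upper -w convention-terminator models/
  sdf lint --fix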
sdf format
Format models
Usage: sdf format [OPTIONS] [TARGETS]...
TARGETS
— Format the given .sql files or directories [default: all .sql files from the workspace/environment]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
--prefer-local
— When set, minimizes remote DB accesses and relies on local schema information in .sdf.yml files
Default value: false
-l
, --layout LAYOUT
— Specify the layout options. Example: -l layout1 -l layout2
Available layout options:
indent=number
: Set the number of spaces to indent.
commas=leading|trailing
: Set the comma layout.
line-length=number
: Set the line length.
--check
Default value: false
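For example (the file path and layout values are illustrative; --check presumably reports unformatted files without rewriting them):
  sdf format models/orders.sql
  sdf format -l indent=4 -l commas=trailing
  sdf format --check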
sdf lsp1
Lsp wave1
Usage: sdf lsp1 [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
sdf run
Run models
Usage: sdf run [OPTIONS] [TARGETS]...
TARGETS
— Run only the given source dirs, files or tables [default: all models]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
-d
, --date DATE
— Run command for this date, use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--from FROM
— Run command for all dates from FROM (inclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--to TO
— Run command for all dates to TO (exclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format [default: now]
--dry-run
— Plan command but don’t evaluate it
Default value: false
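For example (the target and dates are illustrative):
  sdf run
  sdf run models/orders.sql --downstream
  sdf run --from 2024-01-01 --to 2024-02-01 --dry-run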
sdf build
Run model and tests, and if successful, publish, aka “write-audit-publish”
Usage: sdf build [OPTIONS] [TARGETS]...
TARGETS
— Run only the given source dirs, files or tables [default: all models]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
-d
, --date DATE
— Run command for this date, use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--from FROM
— Run command for all dates from FROM (inclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--to TO
— Run command for all dates to TO (exclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format [default: now]
--dry-run
— Plan command but don’t evaluate it
Default value: false
--no-publish
— Runs models and tests producing draft artifacts; call sdf build to publish those drafts
Default value: false
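For example, the two-step write-audit-publish flow the --no-publish description refers to:
  sdf build --no-publish   # run models and tests, producing draft artifacts
  sdf build                # publish the drafts once the audit succeeds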
sdf test
Test your models
Usage: sdf test [OPTIONS] [TARGETS]...
TARGETS
— Assess code (data) quality: use source dirs, files or tables to determine which code (data) contracts to run [default: all code (data) contracts]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
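For example (the target is illustrative):
  sdf test
  sdf test models/orders.sql --show result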
sdf stats
Statistics for your data
Usage: sdf stats [OPTIONS] [TARGETS]...
TARGETS
— Profile data quality: use source dirs, files or tables to determine which data statistics to run [default: all data stats]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
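For example (the table name is illustrative):
  sdf stats
  sdf stats my_table --format csv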
sdf report
Report code quality
Usage: sdf report [OPTIONS] [TARGETS]...
TARGETS
— Profile code (data) quality: use source dirs, files or tables to determine which code (data) reports to run [default: all code (data) reports]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
sdf check
Check code quality
Usage: sdf check [OPTIONS] [TARGETS]...
TARGETS
— Check code (data) quality: use source dirs, files or tables to determine which code (data) checks to run [default: all code (data) checks]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
--prefer-local
— When set, minimizes remote DB accesses and relies on local schema information in .sdf.yml files
Default value: false
sdf lineage
Display lineage for a given table and/or column
Usage: sdf lineage [OPTIONS] [TARGETS]...
TARGETS
— Target selection: compute the lineage for the given source dirs, files or @tables [default: all models]
--column COLUMN
— The column for which to compute lineage
--forward
— Compute downstream lineage instead of the default upstream lineage
Default value: false
--show-scans
— Display scan dependencies in addition to copy and mod dependencies. Unset by default
Default value: false
--max-depth MAX_DEPTH
— Limit the depth of the shown lineage tree. The default value of 0 shows the full lineage
Default value: 0
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
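For example (the table and column names are illustrative):
  sdf lineage my_table                          # upstream lineage for a table
  sdf lineage my_table --column id --forward    # downstream lineage for one column
  sdf lineage my_table --max-depth 2 --show-scans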
sdf push
Push a local workspace to the SDF Service
Usage: sdf push [OPTIONS] [TARGETS]...
TARGETS
— Target selection: push only the given source dirs, files or tables [default: current workspace directory]
--delete
— Delete this workspace from the SDF Service
Default value: false
-y
, --yes
— Answer yes to all prompts, including uploading credentials to the console when using a table provider
Default value: false
-d
, --dry-run
— No changes will be made to the console when this is set
Default value: false
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
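For example:
  sdf push --dry-run    # preview; no changes are made to the console
  sdf push
  sdf push --delete -y  # delete this workspace from the SDF Service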
sdf system
System maintenance, install and update
Usage: sdf system [OPTIONS] COMMAND
update
— Update sdf in place to the latest version
uninstall
— Uninstall sdf from the system
-s
, --show SHOW
— Display messages
Default value: progress
Possible values: progress
, none
sdf system update
Update sdf in place to the latest version
Usage: sdf system update [OPTIONS]
--version VERSION
— Update sdf to this version [default: latest version]
-p
, --preview
— Update sdf to the latest preview version
Default value: false
--stable
— Update sdf to the latest stable version
Default value: false
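For example (the version number is illustrative):
  sdf system update
  sdf system update --version 1.2.3
  sdf system update --preview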
sdf system uninstall
Uninstall sdf from the system
Usage: sdf system uninstall
sdf auth
Authenticate CLI to services like SDF, AWS, OpenAI, etc
Usage: sdf auth [OPTIONS] COMMAND
login
— Log in to SDF Service
logout
— Log out of SDF Service
status
— Show status of credentials / tokens
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all of their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
sdf auth login
Usage: sdf auth login [OPTIONS] [COMMAND]
aws
— Configure how to authenticate with AWS
snowflake
— Configure how to authenticate with Snowflake
bigquery
— Configure how to authenticate with BigQuery
-n
, --name NAME
— Name of the credential to use
--id-token ID_TOKEN
— Path to a file containing an OIDC identity token (a JWT)
--access-key ACCESS_KEY
— Access key for headless authentication
--secret-key SECRET_KEY
— Secret key for headless authentication
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
sdf auth login aws
Configure how to authenticate with AWS
Usage: sdf auth login aws [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--default-region DEFAULT_REGION
— AWS Region to use (default: us-east-1)
Default value: us-east-1
--profile PROFILE
— AWS profile to use, as usually defined in ~/.aws/config or ~/.aws/credentials
--role-arn ROLE_ARN
— ARN of the role to assume
--external-id EXTERNAL_ID
— External ID to use when assuming the role
--use-web-identity
— Use web identity to authenticate
Default value: false
--bucket-region-map BUCKET_REGION_MAP
— Mapping of bucket names to regions
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
--access-key-id ACCESS_KEY_ID
— AWS access key id
--secret-access-key SECRET_ACCESS_KEY
— AWS secret access key
--session-token SESSION_TOKEN
— AWS session token
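For example (the profile name, region, role ARN, and external ID are illustrative):
  sdf auth login aws --profile my-profile --default-region us-west-2
  sdf auth login aws --role-arn arn:aws:iam::123456789012:role/my-role --external-id my-id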
sdf auth login snowflake
Configure how to authenticate with Snowflake
Usage: sdf auth login snowflake [OPTIONS] --account-id ACCOUNT_ID --username USERNAME
-n
, --name NAME
— Name of the credential to use
-a
, --account-id ACCOUNT_ID
— Snowflake account id
-U
, --username USERNAME
— Snowflake username
-P
, --password PASSWORD
— Snowflake password
-r
, --role ROLE
— Snowflake role
-W
, --warehouse WAREHOUSE
— Snowflake warehouse
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
--private-key-path PRIVATE_KEY_PATH
— Path to the private key file for key pair authentication (mutually exclusive with password and private_key_pem)
--private-key-pem PRIVATE_KEY_PEM
— The private key in PEM format (mutually exclusive with private_key_path and password)
--private-key-passphrase PRIVATE_KEY_PASSPHRASE
— The passphrase for the private key, if it's encrypted
sdf auth login bigquery
Configure how to authenticate with BigQuery
Usage: sdf auth login bigquery [OPTIONS]
-n
, --name NAME
— Name of the credential to use
-p
, --project-id PROJECT_ID
— GCP project id
-E
, --client-email CLIENT_EMAIL
— client_email of the service account key file
-K
, --private-key PRIVATE_KEY
— private_key of the service account key file
-J
, --json-path JSON_PATH
— Path to the json file containing project id, client email, private key
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
sdf auth logout
Log out of SDF Service
Usage: sdf auth logout [OPTIONS] [COMMAND]
aws
— Logout from AWS
openai
— Logout from OpenAI
snowflake
— Logout from Snowflake
bigquery
— Logout from BigQuery
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout aws
Logout from AWS
Usage: sdf auth logout aws [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout openai
Logout from OpenAI
Usage: sdf auth logout openai [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout snowflake
Logout from Snowflake
Usage: sdf auth logout snowflake [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout bigquery
Logout from BigQuery
Usage: sdf auth logout bigquery [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth status
Show status of credentials / tokens
Usage: sdf auth status [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
sdf man
Display reference material, like the CLI, dialect specific functions, schemas for authoring and interchange
Usage: sdf man COMMAND
cli
— Display SDF’s command line interface
functions
— Display SDF’s functions definitions
definition-schema
— Display SDF’s yml blocks as a json schema doc [only: json]
event-schema
— Display SDF’s trace events as a json schema [only: json]
information-schema
— Display SDF’s information schemas [only: sql]
error-codes
— Display SDF’s error codes [only: markdown]
sdf man cli
Display SDF’s command line interface
Usage: sdf man cli [OPTIONS]
--format FORMAT
— Format reference material in this format [only: markdown]
Possible values: sql
, yml
, markdown
, json
sdf man functions
Display SDF’s functions definitions
Usage: sdf man functions [OPTIONS]
--dialect DIALECT
— Dialect for all functions
Default value: trino
--section SECTION
— Section to display [section value must appear in the function registry]
Default value: all
--format FORMAT
— Format reference material in this format [default: markdown | yml]
Possible values: sql
, yml
, markdown
, json
--implemented
— Limit output to only implemented functions
Default value: false
sdf man definition-schema
Display SDF’s yml blocks as a json schema doc [only: json]
Usage: sdf man definition-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: json]
Possible values: sql
, yml
, markdown
, json
sdf man event-schema
Display SDF’s trace events as a json schema [only: json]
Usage: sdf man event-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: json]
Possible values: sql
, yml
, markdown
, json
sdf man information-schema
Display SDF’s information schemas [only: sql]
Usage: sdf man information-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: sql]
Possible values: sql
, yml
, markdown
, json
sdf man error-codes
Display SDF’s error codes [only: markdown]
Usage: sdf man error-codes [OPTIONS]
--format FORMAT
— Format reference material in this format [only: markdown]
Possible values: sql
, yml
, markdown
, json
sdf init
Initialize a workspace interactively
Usage: sdf init [PATH]
PATH
— Create a new sdf workspace at path
sdf dbt
Initialize an sdf workspace from an existing dbt project
Usage: sdf dbt [OPTIONS] COMMAND
init
— Initialize an sdf workspace from a dbt project — best effort
refresh
— Re-initialize an sdf workspace from a dbt project — best effort
-s
, --show SHOW
— Display messages
Default value: progress
Possible values: progress
, none
sdf dbt init
Initialize an sdf workspace from a dbt project — best effort
Usage: sdf dbt init [OPTIONS]
--target TARGET
— Use this DBT target over the default target in profiles.yml
--profiles-dir PROFILES_DIR
— Use this DBT profile instead of the defaults at ~/.dbt/profile.yml (note: dbt uses --profile_dir, this CLI uses --profile-dir)
--workspace-dir WORKSPACE_DIR
— Specifies the workspace directory where we expect to see manifest and dbt project files. The SDF workspace file will be placed in the same directory. Default: current directory
-s
, --save
— Save and overwrite the workspace file
Default value: false
-c
, --config CONFIG
— Supply a config yml file or provide config as yml string e.g. ‘(key: value)‘
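For example (the directory and target name are illustrative):
  sdf dbt init --workspace-dir ./my_dbt_project
  sdf dbt init --target prod --save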
sdf dbt refresh
Re-initialize an sdf workspace from a dbt project — best effort
Usage: sdf dbt refresh [OPTIONS]
--target TARGET
— Use this DBT target over the default target in profiles.yml
--profiles-dir PROFILES_DIR
— Use this DBT profile instead of the defaults at ~/.dbt/profile.yml (note: dbt uses --profile_dir, this CLI uses --profile-dir)
--workspace-dir WORKSPACE_DIR
— Specifies the workspace directory where we expect to see manifest and dbt project files. The SDF workspace file will be placed in the same directory. Default: current directory
-s
, --save
— Save and overwrite the workspace file
Default value: false
-c
, --config CONFIG
— Supply a config yml file or provide config as yml string e.g. ‘(key: value)‘
sdf exec
Execute custom scripts
Usage: sdf exec [OPTIONS] COMMAND
COMMAND
— The command to be executed
-s
, --show SHOW
— The verbosity of the output
Default value: all
Possible values: all
, progress
, result
, none
--path PATH
— Execute command at the path given [default: current workspace directory]
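For example (the script path is illustrative):
  sdf exec ./scripts/refresh.sh
  sdf exec --path ./my_workspace --show none ./scripts/refresh.sh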
This document contains the help content for the sdf command-line program.
sdf
This document contains the help content for the sdf
command-line program.
Command Overview:
sdf
↴sdf new
↴sdf clean
↴sdf compile
↴sdf lint
↴sdf format
↴sdf lsp1
↴sdf run
↴sdf build
↴sdf test
↴sdf stats
↴sdf report
↴sdf check
↴sdf lineage
↴sdf push
↴sdf system
↴sdf system update
↴sdf system uninstall
↴sdf auth
↴sdf auth login
↴sdf auth login aws
↴sdf auth login snowflake
↴sdf auth login bigquery
↴sdf auth logout
↴sdf auth logout aws
↴sdf auth logout openai
↴sdf auth logout snowflake
↴sdf auth logout bigquery
↴sdf auth status
↴sdf man
↴sdf man cli
↴sdf man functions
↴sdf man definition-schema
↴sdf man event-schema
↴sdf man information-schema
↴sdf man error-codes
↴sdf init
↴sdf dbt
↴sdf dbt init
↴sdf dbt refresh
↴sdf exec
↴sdf
SDF: A fast SQL compiler, local development framework, and in-memory analytical database
Usage: sdf [OPTIONS] COMMAND
new
— Create a new sdf workspaceclean
— Remove artifacts that sdf has generated in the pastcompile
— Compile modelslint
— Lint modelsformat
— Format modelslsp1
— Lsp wave1run
— Run modelsbuild
— Run model and tests, and if successful, publish, aka “write-audit-publish”test
— Test your modelsstats
— Statistics for your datareport
— Report code qualitycheck
— Check code qualitylineage
— Display lineage for a given table and/or columnpush
— Push a local workspace to the SDF Servicesystem
— System maintenance, install and updateauth
— Authenticate CLI to services like SDF, AWS, OpenAI, etcman
— Display reference material, like the CLI, dialect specific functions, schemas for authoring and interchangeinit
— Initialize a workspace interactivelydbt
— Initialize an sdf workspace from an existing dbt projectexec
— Execute custom scripts--log-level LOG_LEVEL
— Set log level
Possible values: trace
, debug
, debug-pretty
, info
, warn
, error
--log-file LOG_FILE
— Creates or replaces the log file
--show-all-errors
— Don’t suppress errors
sdf new
Create a new sdf workspace
Usage: sdf new [OPTIONS] [PATH]
PATH
— Create a new sdf workspace at path--list-samples
— List all available samples
Default value: false
--sample SAMPLE
— Create a workspace with the sample content at path
-s
, --show SHOW
— Display progress messages
Default value: progress
Possible values: progress
, none
sdf clean
Remove artifacts that sdf has generated in the past
Usage: sdf clean [OPTIONS]
-e
, --environment ENVIRONMENT
--path PATH
— Remove artifacts at path that sdf has created in the past [default: current workspace directory]
-s
, --show SHOW
— Display progress messages
Default value: progress
Possible values: progress
, none
sdf compile
Compile models
Usage: sdf compile [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]--infer-tables
— Infer table schema and generate ddls
Default value: false
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
--prefer-local
— When set will minimize remote DB accesses and will rely on local schema information in .sdf.yml files
Default value: false
--no-incremental-mode
— Set incremental-mode to false (default is true)
Default value: false
--no-snapshot-mode
— Set snapshot-mode to false (default is true)
Default value: false
--warn-non-local-executable
Default value: false
--fix
— Fix the linting issues
Default value: false
-w
, --warnings WARNINGS
—
Specify the warnings to apply. Example: -w warning1 -w warning2
Available warnings:
all
: Turn on all warnings.
none
: Turn off all warnings.
error
: Treat warnings as errors.
capitalization-columns=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for columns.
capitalization-tables=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for tables.
capitalization-aliases=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for aliases.
type-implicit-conversions
: Warn about implicit type conversions.
references-consistent
: Warn about consistent references.
inconsistent-schema
: Warn about schema inconsistency between local and remote schema. Default: on.
duplicate-column-names
: Warn about duplicate column names in the same table.
All warnings are by default turned off, unless specified.
--lineage LINEAGE
— Run lineage and classifier/description propagation. Is inferred based on need if not set
Possible values: true
, false
sdf lint
Lint models
Usage: sdf lint [OPTIONS] [TARGETS]...
TARGETS
— Lint only the given source dirs, and files with .sql extension [default: all model .sql files in workspace]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
--prefer-local
— When set will minimize remote DB accesses and will rely on local schema information in .sdf.yml files
Default value: false
--fix
— Fix the linting issues
Default value: false
-w
, --warnings WARNINGS
—
Specify the warnings to apply. Example: -w warning1 -w warning2
. Later options override former ones.
Legend: (*) This warning can be automatically fixed by sdf.
Available Warnings
all
: Turn on all warnings.none
: Turn off all warnings.error
: Treat warnings as errors.Capitalization Settings
capitalization-keywords=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for keywords. Default: consistent (*)capitalization-literals=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for literals. Default: consistent (*)capitalization-types=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for types. Default: consistent (*)capitalization-functions=upper|lower|pascal|snake|camel|consistent
: Set capitalization style for functions. Default: consistent (*)Convention and Reference Settings:
convention-blocked-words= word
: Specify blocked words.convention-terminator
: Warn about terminator conventions. (*)references-keywords
: Warn about keyword references.references-special-chars= char
: Warn about special character references.references-quoting
: Warn about quoting references. (*)references-qualification
: Warn about qualification references.references-columns
: Warn about ambiguous column references.Structure-Specific Warnings:
structure-else-null
: Warn about ELSE NULL in structures. (*)structure-simple-case
: Warn about simple CASE structures. (*)structure-unused-cte
: Warn about unused CTEs. (*)structure-nested-case
: Warn about nested CASE structures.structure-distinct
: Warn about DISTINCT usage. (*)structure-subquery
: Warn about subquery structures.structure-join-condition-order
: Warn about join condition order.structure-column-order
: Warn about column order in structures.More warnings available as part of sdf compile …
sdf format
Format models
Usage: sdf format [OPTIONS] [TARGETS]...
TARGETS
— Format the given .sql files or directories [default: all .sql files from the workspace/environment]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
--prefer-local
— When set will minimize remote DB accesses and will rely on local schema information in .sdf.yml files
Default value: false
-l
, --layout LAYOUT
—
Specify the layout options. Example: -l layout1 -l layout2
Available layout options:
indent= number
: Set the number of spaces to indent.
commas=leading|trailing
: Set the comma layout.
line-length= number
: Set the line length.
--check
Default value: false
sdf lsp1
Lsp wave1
Usage: sdf lsp1 [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
sdf run
Run models
Usage: sdf run [OPTIONS] [TARGETS]...
TARGETS
— Run only the given source dirs, files or tables [default: all models]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
-d
, --date DATE
— Run command for this date , use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--from FROM
— Run command for all dates from from (inclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--to TO
— Run command for all dates to to (exclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format [default: now]
--dry-run
— Plan command but don’t evaluate it
Default value: false
sdf build
Run model and tests, and if successful, publish, aka “write-audit-publish”
Usage: sdf build [OPTIONS] [TARGETS]...
TARGETS
— Run only the given source dirs, files or tables [default: all models]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
-d
, --date DATE
— Run command for this date , use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--from FROM
— Run command for all dates from from (inclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format
--to TO
— Run command for all dates to to (exclusive), use YYYY-MM-DD or YYYY-MM-DDTHH:MM format [default: now]
--dry-run
— Plan command but don’t evaluate it
Default value: false
--no-publish
— Runs models and tests producing draft artifacts; call sdf build to publish those drafts
Default value: false
sdf test
Test your models
Usage: sdf test [OPTIONS] [TARGETS]...
TARGETS
— Assess code (data) quality: use source dirs, files or tables to determine which code (data) contracts to run [default: all code (data) contracts]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limiting number of shown rows. Run with —limit 0 to remove limit
sdf stats
Statistics for your data
Usage: sdf stats [OPTIONS] [TARGETS]...
TARGETS
— Profile data quality: use source dirs, files or tables to determine which data statistics to run [default: all data stats]-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as string e.g. ‘(key: value)’
--downstream
— Execute cmd not only the given targets but also for all its downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
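For example, to compute statistics for one table in a development environment and print them as csv (the table and environment names are illustrative):
sdf stats -e dev --format csv my_table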
sdf report
Report code quality
Usage: sdf report [OPTIONS] [TARGETS]...
TARGETS
— Profile code (data) quality: use source dirs, files or tables to determine which code (data) reports to run [default: all code (data) reports]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets, assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
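For example, to run the reports for one source dir and everything downstream of it, showing only the results (the dir name is illustrative):
sdf report --downstream --show result models/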
sdf check
Check code quality
Usage: sdf check [OPTIONS] [TARGETS]...
TARGETS
— Check code (data) quality: use source dirs, files or tables to determine which code (data) checks to run [default: all code (data) checks]
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets, assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
--prefer-local
— When set, minimizes remote DB accesses and relies on local schema information in .sdf.yml files
Default value: false
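For example, to run all checks while minimizing remote DB accesses by relying on local schema information, showing only the results:
sdf check --prefer-local --show result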
sdf lineage
Display lineage for a given table and/or column
Usage: sdf lineage [OPTIONS] [TARGETS]...
TARGETS
— Target selection: compute the lineage for the given source dirs, files or @tables [default: all models]
--column COLUMN
— The column for which to compute lineage
--forward
— Compute downstream lineage instead of the default upstream lineage
Default value: false
--show-scans
— Display scan dependencies in addition to copy and mod dependencies. Unset by default
Default value: false
--max-depth MAX_DEPTH
— Limit the depth of the shown lineage tree. The default value of 0 shows the full lineage
Default value: 0
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets, assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
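For example, to show three levels of downstream lineage for one column (the table and column names are illustrative):
sdf lineage --column user_id --forward --max-depth 3 my_table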
sdf push
Push a local workspace to the SDF Service
Usage: sdf push [OPTIONS] [TARGETS]...
TARGETS
— Target selection: push only the given source dirs, files or tables [default: current workspace directory]
--delete
— Delete this workspace from the SDF Service
Default value: false
-y
, --yes
— Answer yes to all prompts, including uploading credentials to the console when using a table provider
Default value: false
-d
, --dry-run
— No changes will be made to the console when this is set
Default value: false
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets, assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
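For example, to preview what a push would change without modifying the console, then push for real, answering yes to all prompts:
sdf push --dry-run
sdf push -y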
sdf system
System maintenance, install and update
Usage: sdf system [OPTIONS] COMMAND
update
— Update sdf in place to the latest version
uninstall
— Uninstall sdf from the system
-s
, --show SHOW
— Display messages
Default value: progress
Possible values: progress
, none
sdf system update
Update sdf in place to the latest version
Usage: sdf system update [OPTIONS]
--version VERSION
— Update sdf to this version [default: latest version]
-p
, --preview
— Update sdf to the latest preview version
Default value: false
--stable
— Update sdf to the latest stable version
Default value: false
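For example, to pin to a specific version, or to move to the latest preview build (the version number is illustrative):
sdf system update --version 1.2.3
sdf system update --preview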
sdf system uninstall
Uninstall sdf from the system
Usage: sdf system uninstall
sdf auth
Authenticate CLI to services like SDF, AWS, OpenAI, etc
Usage: sdf auth [OPTIONS] COMMAND
login
logout
— Log out of SDF Service
status
— Show status of credentials / tokens
-e
, --environment ENVIRONMENT
— Use this environment
-s
, --show SHOW
— Display messages [default: progress if TARGETS is empty, all otherwise]
Possible values: all
, progress
, result
, none
-q
, --query QUERY
— Supply a .sql file or provide a sql snippet on the cmd line, e.g. ‘select * from t’
--stage STAGE
— Run the following stages [default: all stages]
Possible values: preprocess
, parse
, lint
, resolve
, classify
, execute
--cache CACHE
— Controls cache use
Default value: read-write
Possible values: read-write
, write-only
, read-only
, none
--save SAVE
— Controls which assets to save [default: none]
Possible values: info-schema
, assembly
, table-deps
--targets-only
— Processes only specified targets, assuming that all the non-target dependencies already exist
Default value: false
--vars VARS
— Supply var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--env-vars ENV_VARS
— Supply env var bindings as a yml file or provide them as a string, e.g. ‘(key: value)’
--downstream
— Execute the command not only for the given targets but also for all their downstream artifacts
Default value: false
--format FORMAT
— Show error tables in this format
Default value: table
Possible values: table
, csv
, tsv
, json
, nd-json
, yml
--limit LIMIT
— Limit the number of shown rows. Run with --limit 0 to remove the limit
sdf auth login
Usage: sdf auth login [OPTIONS] [COMMAND]
aws
— Configure how to authenticate with AWS
snowflake
— Configure how to authenticate with Snowflake
bigquery
— Configure how to authenticate with BigQuery
-n
, --name NAME
— Name of the credential to use
--id-token ID_TOKEN
— Path to a file containing an OIDC identity token (a JWT)
--access-key ACCESS_KEY
— Access key for headless authentication
--secret-key SECRET_KEY
— Secret key for headless authentication
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
sdf auth login aws
Configure how to authenticate with AWS
Usage: sdf auth login aws [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--default-region DEFAULT_REGION
— AWS Region to use (default: us-east-1)
Default value: us-east-1
--profile PROFILE
— AWS profile to use, as usually defined in ~/.aws/config or ~/.aws/credentials
--role-arn ROLE_ARN
— ARN of the role to assume
--external-id EXTERNAL_ID
— External ID to use when assuming the role
--use-web-identity
— Use web identity to authenticate
Default value: false
--bucket-region-map BUCKET_REGION_MAP
— Mapping of bucket names to regions
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
--access-key-id ACCESS_KEY_ID
— AWS access key id
--secret-access-key SECRET_ACCESS_KEY
— AWS secret access key
--session-token SESSION_TOKEN
— AWS session token
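For example, to authenticate with a named AWS profile, or by assuming a role with an external ID (the profile name, ARN, and ID are illustrative):
sdf auth login aws --profile my-profile --default-region us-west-2
sdf auth login aws --role-arn arn:aws:iam::123456789012:role/sdf --external-id my-external-id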
sdf auth login snowflake
Configure how to authenticate with Snowflake
Usage: sdf auth login snowflake [OPTIONS] --account-id ACCOUNT_ID --username USERNAME
-n
, --name NAME
— Name of the credential to use
-a
, --account-id ACCOUNT_ID
— Snowflake account id
-U
, --username USERNAME
— Snowflake username
-P
, --password PASSWORD
— Snowflake password
-r
, --role ROLE
— Snowflake role
-W
, --warehouse WAREHOUSE
— Snowflake warehouse
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
--private-key-path PRIVATE_KEY_PATH
— Path to the private key file for key pair authentication (mutually exclusive with password and private_key_pem)
--private-key-pem PRIVATE_KEY_PEM
— The private key in PEM format (mutually exclusive with private_key_path and password)
--private-key-passphrase PRIVATE_KEY_PASSPHRASE
— The passphrase for the private key, if it’s encrypted
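For example, to log in with a password, or with an encrypted key pair instead (the account, user, and key path are illustrative):
sdf auth login snowflake --account-id my-account -U my_user -P my_password
sdf auth login snowflake --account-id my-account -U my_user --private-key-path ~/.ssh/sf_key.p8 --private-key-passphrase my_passphrase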
sdf auth login bigquery
Configure how to authenticate with BigQuery
Usage: sdf auth login bigquery [OPTIONS]
-n
, --name NAME
— Name of the credential to use
-p
, --project-id PROJECT_ID
— GCP project id
-E
, --client-email CLIENT_EMAIL
— client_email of the service account key file
-K
, --private-key PRIVATE_KEY
— private_key of the service account key file
-J
, --json-path JSON_PATH
— Path to the json file containing project id, client email, private key
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
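For example, to authenticate with a service account key file (the project id and file path are illustrative):
sdf auth login bigquery -p my-project -J ~/keys/service-account.json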
sdf auth logout
Log out of SDF Service
Usage: sdf auth logout [OPTIONS] [COMMAND]
aws
— Logout from AWS
openai
— Logout from OpenAI
snowflake
— Logout from Snowflake
bigquery
— Logout from BigQuery
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout aws
Logout from AWS
Usage: sdf auth logout aws [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout openai
Logout from OpenAI
Usage: sdf auth logout openai [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout snowflake
Logout from Snowflake
Usage: sdf auth logout snowflake [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth logout bigquery
Logout from BigQuery
Usage: sdf auth logout bigquery [OPTIONS]
-n
, --name NAME
— Name of the credential to use
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials are stored (default is platform specific)
sdf auth status
Show status of credentials / tokens
Usage: sdf auth status [OPTIONS] [TARGETS]...
TARGETS
— Compile only the given source dirs, files or tables [default: all models]
--credentials-dir CREDENTIALS_DIR
— Path to the file where credentials will be stored (default is platform specific)
sdf man
Display reference material, like the CLI, dialect specific functions, schemas for authoring and interchange
Usage: sdf man COMMAND
cli
— Display SDF’s command line interface
functions
— Display SDF’s function definitions
definition-schema
— Display SDF’s yml blocks as a json schema doc [only: json]
event-schema
— Display SDF’s trace events as a json schema [only: json]
information-schema
— Display SDF’s information schemas [only: sql]
error-codes
— Display SDF’s error codes [only: markdown]
sdf man cli
Display SDF’s command line interface
Usage: sdf man cli [OPTIONS]
--format FORMAT
— Format reference material in this format [only: markdown]
Possible values: sql
, yml
, markdown
, json
sdf man functions
Display SDF’s function definitions
Usage: sdf man functions [OPTIONS]
--dialect DIALECT
— Dialect for all functions
Default value: trino
--section SECTION
— Section to display [section value must appear in the function registry]
Default value: all
--format FORMAT
— Format reference material in this format [default: markdown | yml]
Possible values: sql
, yml
, markdown
, json
--implemented
— Limit output to only implemented functions
Default value: false
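For example, to list only the implemented functions of one section for the default trino dialect (the section name is illustrative and must appear in the function registry):
sdf man functions --dialect trino --section string --implemented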
sdf man definition-schema
Display SDF’s yml blocks as a json schema doc [only: json]
Usage: sdf man definition-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: json]
Possible values: sql
, yml
, markdown
, json
sdf man event-schema
Display SDF’s trace events as a json schema [only: json]
Usage: sdf man event-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: json]
Possible values: sql
, yml
, markdown
, json
sdf man information-schema
Display SDF’s information schemas [only: sql]
Usage: sdf man information-schema [OPTIONS]
--format FORMAT
— Format reference material in this format [only: sql]
Possible values: sql
, yml
, markdown
, json
sdf man error-codes
Display SDF’s error codes [only: markdown]
Usage: sdf man error-codes [OPTIONS]
--format FORMAT
— Format reference material in this format [only: markdown]
Possible values: sql
, yml
, markdown
, json
sdf init
Initialize a workspace interactively
Usage: sdf init [PATH]
PATH
— Create a new sdf workspace at path
sdf dbt
Initialize an sdf workspace from an existing dbt project
Usage: sdf dbt [OPTIONS] COMMAND
init
— Initialize an sdf workspace from a dbt project — best effort
refresh
— Re-initialize an sdf workspace from a dbt project — best effort
-s
, --show SHOW
— Display messages
Default value: progress
Possible values: progress
, none
sdf dbt init
Initialize an sdf workspace from a dbt project — best effort
Usage: sdf dbt init [OPTIONS]
--target TARGET
— Use this DBT target over the default target in profiles.yml
--profiles-dir PROFILES_DIR
— Use this DBT profile instead of the defaults at ~/.dbt/profiles.yml (note dbt uses --profile_dir, this CLI uses --profile-dir)
--workspace-dir WORKSPACE_DIR
— Specifies the workspace directory where we expect to see manifest and dbt project files. The SDF workspace file will be placed in the same directory. Default: current directory
-s
, --save
— Save and overwrite the workspace file
Default value: false
-c
, --config CONFIG
— Supply a config yml file or provide config as yml string e.g. ‘(key: value)‘
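For example, to generate and save an sdf workspace file next to an existing dbt project using a non-default target (the directory and target names are illustrative):
sdf dbt init --workspace-dir ./my_dbt_project --target prod --save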
sdf dbt refresh
Re-initialize an sdf workspace from a dbt project — best effort
Usage: sdf dbt refresh [OPTIONS]
--target TARGET
— Use this DBT target over the default target in profiles.yml
--profiles-dir PROFILES_DIR
— Use this DBT profile instead of the defaults at ~/.dbt/profiles.yml (note dbt uses --profile_dir, this CLI uses --profile-dir)
--workspace-dir WORKSPACE_DIR
— Specifies the workspace directory where we expect to see manifest and dbt project files. The SDF workspace file will be placed in the same directory. Default: current directory
-s
, --save
— Save and overwrite the workspace file
Default value: false
-c
, --config CONFIG
— Supply a config yml file or provide config as yml string e.g. ‘(key: value)‘
sdf exec
Execute custom scripts
Usage: sdf exec [OPTIONS] COMMAND
COMMAND
— The command to be executed
-s
, --show SHOW
— The verbosity of the output
Default value: all
Possible values: all
, progress
, result
, none
--path PATH
— Execute command at the path given [default: current workspace directory]
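For example, to run a custom script from a given workspace directory and show only its result (the script and path are illustrative):
sdf exec --path ./my_workspace --show result ./scripts/refresh.sh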