Taskfile Variables & Dependencies
Master Taskfile variables, dynamic shell values, task dependencies, preconditions, status checks, and environment configuration — the building blocks for maintainable YAML-based task automation.
Overview
Variables and dependencies are the core building blocks of Taskfile automation. Variables provide reusable configuration — static strings, computed shell values, and CLI overrides. Dependencies control execution order — parallel prerequisites, sequential command chains, preconditions for safety gates, and status checks for incremental builds. Understanding these mechanisms is the difference between a flat list of shell commands and a maintainable build system.
Variable Types and Precedence
Taskfile variables resolve in a specific precedence order (highest to lowest):
1. CLI overrides: task build VERSION=1.2.3
2. vars: on the called task
3. vars: in the task: call
4. vars: on the included Taskfile
5. Global vars: in the Taskfile
6. Environment variables
Static Variables
Variables can reference other variables using Go template syntax {{.VAR_NAME}}. They resolve at task execution time, not at parse time.
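A minimal sketch of static variables referencing each other (the names `APP`, `BIN_DIR`, and `OUTPUT` are invented for illustration):

```yaml
version: '3'

vars:
  APP: myapp
  BIN_DIR: ./bin
  # References other variables via Go template syntax
  OUTPUT: '{{.BIN_DIR}}/{{.APP}}'

tasks:
  build:
    cmds:
      - go build -o {{.OUTPUT}} ./cmd/{{.APP}}
```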
Dynamic (Shell) Variables
The sh: key runs a shell command and captures its stdout as the variable value. Dynamic variables are evaluated when the task is invoked — not when the Taskfile is loaded. This means git describe runs fresh every time.
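For example, the following sketch captures Git metadata at invocation time (the variable names and ldflags target are illustrative):

```yaml
version: '3'

vars:
  # Each sh: command runs when the task is invoked, so the values stay fresh
  VERSION:
    sh: git describe --tags --always
  COMMIT:
    sh: git rev-parse --short HEAD

tasks:
  build:
    cmds:
      - go build -ldflags "-X main.version={{.VERSION}} -X main.commit={{.COMMIT}}" ./...
```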
Environment Variables
Global env: sets environment variables for all tasks. Task-level env: overrides globals for that specific task. Environment variables are available to shell commands but not to Go template expressions — use vars: for template interpolation.
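A sketch of both scopes (the `CGO_ENABLED` values are just an example of a global setting with a per-task override):

```yaml
version: '3'

env:
  CGO_ENABLED: '0'   # visible to every task's shell commands

tasks:
  test:
    env:
      CGO_ENABLED: '1'   # task-level override (the race detector needs cgo)
    cmds:
      - go test -race ./...
```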
Task-Level Variables
Task-level vars: provide defaults that can be overridden by CLI arguments or by the calling task's vars: block.
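The usual idiom combines a task-level declaration with the `default` template function (the `TARGET` variable is invented for this sketch):

```yaml
version: '3'

tasks:
  build:
    vars:
      # Falls back to "linux" unless the caller or CLI supplies TARGET
      TARGET: '{{.TARGET | default "linux"}}'
    cmds:
      - GOOS={{.TARGET}} go build ./...
```

Running `task build` uses the default, while `task build TARGET=darwin` overrides it from the CLI.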
CLI Variable Overrides
CLI overrides have the highest precedence — they always win over Taskfile-defined values.
Dependencies
Dependencies define tasks that must complete before the current task runs. By default, all dependencies run in parallel.
Parallel Dependencies
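A sketch of a `setup` task with three parallel prerequisites (the exact install commands are illustrative):

```yaml
version: '3'

tasks:
  setup:
    deps: [install-tools, install-node, install-python]   # all three run in parallel

  install-tools:
    cmds: [go install golang.org/x/tools/cmd/goimports@latest]
  install-node:
    cmds: [npm ci]
  install-python:
    cmds: [pip install -r requirements.txt]
```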
When you run task setup, all three install-* tasks start simultaneously. Task waits for all dependencies to finish before continuing, even if one fails (unless failfast: true is set).
Dependencies with Variables
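Dependencies can also receive variables, which lets one parameterized task serve several callers. A hedged sketch, where the `generate` task and `MODE` variable are invented:

```yaml
version: '3'

tasks:
  build:
    deps:
      - task: generate
        vars: { MODE: release }
    cmds:
      - go build ./...

  generate:
    cmds:
      - echo "generating code in {{.MODE}} mode"
```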
Sequential Execution
Commands within a task (`cmds:`) always run sequentially. For sequential task execution, use `task:` calls inside `cmds:`:
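A sketch of a sequential pipeline (the `ci`, `lint`, `test`, and `build` task bodies are illustrative):

```yaml
version: '3'

tasks:
  ci:
    cmds:
      # Each task: call completes before the next one starts
      - task: lint
      - task: test
      - task: build

  lint:
    cmds: [golangci-lint run]
  test:
    cmds: [go test ./...]
  build:
    cmds: [go build ./...]
```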
The distinction matters: deps: runs in parallel (good for independent setup tasks), while cmds: [task: ...] runs sequentially (good for pipelines where order matters).
Preconditions
Preconditions are safety gates — they check a condition before the task runs and abort with a message if it fails. Unlike status:, a failed precondition fails the task and all tasks that depend on it.
Preconditions are the right place for:
- Environment variable checks
- Tool availability checks
- Git state validation (clean tree, correct branch)
- Service reachability checks
- File existence checks
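The checks above can be sketched like this (the `deploy` task and its commands are invented for illustration):

```yaml
version: '3'

tasks:
  deploy:
    preconditions:
      - sh: test -n "$DEPLOY_ENV"
        msg: "DEPLOY_ENV must be set"
      - sh: command -v kubectl
        msg: "kubectl not found in PATH"
      - sh: test -z "$(git status --porcelain)"
        msg: "working tree must be clean"
    cmds:
      - kubectl apply -f k8s/
```

If any precondition's `sh:` command exits non-zero, the task aborts and prints the associated `msg:`.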
Status: Incremental Builds
The status: field defines conditions that determine if a task is already up-to-date. If all status commands return exit code 0, the task is skipped:
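A minimal sketch (the tool-install task is illustrative):

```yaml
version: '3'

tasks:
  install-tools:
    cmds:
      - go install golang.org/x/tools/cmd/goimports@latest
    status:
      # Skip the install if the tool is already on PATH
      - command -v goimports
```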
The sources: and generates: fields provide a fingerprint-based approach — Task hashes the source files and skips execution if the hash matches the last run. The status: field provides custom skip logic for cases where file hashing is not sufficient.
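A fingerprint-based sketch (the paths and binary name are invented):

```yaml
version: '3'

tasks:
  build:
    sources:
      - ./**/*.go
      - go.mod
    generates:
      - bin/myapp
    cmds:
      # Skipped when the hashed sources are unchanged since the last run
      - go build -o bin/myapp .
```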
Dotenv Files
Load environment variables from .env files:
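```yaml
version: '3'

# Later files override earlier ones
dotenv: ['.env', '.env.local']

tasks:
  db-shell:
    cmds:
      - psql "$DATABASE_URL"   # DATABASE_URL is an invented example variable
```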
Files listed in dotenv: are loaded in order. Later files override earlier ones. This lets you layer environment configuration: .env for defaults, .env.local for developer overrides, .env.production for deployment.
Included Taskfiles
Split large Taskfiles into modular components:
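A sketch of an includes layout (the file paths and the `NAMESPACE` variable are illustrative):

```yaml
version: '3'

includes:
  docker: ./taskfiles/Docker.yml
  k8s:
    taskfile: ./taskfiles/Kubernetes.yml
    vars:
      NAMESPACE: staging
```

Included tasks are namespaced by their include key, e.g. `task docker:build` or `task k8s:deploy`.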
Real-World Example: Complete Build Pipeline
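A hedged sketch tying the pieces together; the app name, scripts, and tool choices are all illustrative:

```yaml
version: '3'

dotenv: ['.env']

vars:
  APP: myapp
  VERSION:
    sh: git describe --tags --always --dirty

tasks:
  default:
    cmds: [task --list]

  setup:
    deps: [install-tools, install-deps]   # parallel prerequisites

  install-tools:
    cmds: [go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest]
  install-deps:
    cmds: [go mod download]

  lint:
    cmds: [golangci-lint run]

  test:
    cmds: [go test ./...]

  build:
    sources: ['./**/*.go', 'go.mod']
    generates: ['bin/{{.APP}}']
    cmds:
      - go build -ldflags "-X main.version={{.VERSION}}" -o bin/{{.APP}} .

  ci:
    cmds:   # sequential pipeline: lint, then test, then build
      - task: lint
      - task: test
      - task: build

  deploy:
    deps: [build]
    preconditions:
      - sh: test "$(git branch --show-current)" = "main"
        msg: "deploy only from main"
    cmds:
      - ./scripts/deploy.sh {{.VERSION}}
```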
Best Practices
- Use `sh:` variables for any value that comes from the environment: Git version, timestamps, file lists, current branch. These stay fresh across invocations.
- Use `deps:` for independent parallel tasks (installing dependencies) and `cmds: [task: ...]` for sequential pipelines (lint, then test, then build).
- Use `preconditions:` for safety checks before destructive operations. A precondition failure stops the task and all dependents.
- Use `sources:` and `generates:` for incremental builds to avoid re-building unchanged code.
- Provide default values for optional variables with `{{.VAR | default "value"}}` so tasks work without explicit overrides.
- Split large Taskfiles using `includes:`, one file per concern (Docker, Kubernetes, testing).
- Use the `default` task to show available tasks: `cmds: [task --list]`.
- Keep `env:` for shell environment variables and `vars:` for template values. Do not confuse the two scopes.
Common Pitfalls
- Confusing `deps:` (parallel) with sequential `cmds:`. If order matters, use `cmds: [task: lint, task: test, task: build]`, not `deps: [lint, test, build]`.
- Forgetting that `sh:` variables execute a shell command. `VERSION: v1.0` is a static string; `VERSION: { sh: echo v1.0 }` runs a shell command. The syntax difference is subtle.
- Not providing default values for optional variables: tasks fail with empty-string interpolation when the variable is not set.
- Missing preconditions on destructive tasks: `task deploy` without a branch check or clean-tree check leads to accidental deployments.
- Putting dynamic variables in `env:` instead of `vars:`: environment variables do not support `sh:` syntax. Use `vars:` with `sh:` for computed values.
- Not using `sources:`/`generates:` for expensive tasks: every invocation re-runs, wasting time on unchanged code.
- Circular dependencies: task A depends on B, which depends on A. Task detects the cycle at runtime and fails with an error.