# Quickstart
Supersigil is an open-source CLI that turns Markdown spec files into a verifiable graph. You write criteria in your repo, link them to test evidence through ecosystem plugins or annotations, and Supersigil checks that everything stays connected. It works locally, in CI, and as structured context for AI coding agents.
This is the fastest path to the core loop: write one enforceable criterion, run verify, and watch the tool tell you what is missing.
## Install

```sh
brew tap jonisavo/supersigil
brew install supersigil

# or alternatively
brew install jonisavo/supersigil/supersigil
```

Installs prebuilt binaries for macOS and Linux.

```sh
cargo install supersigil
```

Compiles from source. Requires a working Rust toolchain.

```sh
yay -S supersigil-bin
```

Arch Linux only. Available for x86_64 and aarch64.

Prebuilt binaries are also available on the GitHub Releases page.

Verify the installation:

```sh
supersigil --version
```

## Five-Minute Loop
1. **Initialize a project**

   Navigate to the root of your repository and run:

   ```sh
   supersigil init
   ```

   This creates a `supersigil.toml` config and installs six agent skills to `.agents/skills/`. The skills teach AI coding agents how to work with Supersigil’s spec-driven workflow.

   ```toml
   # supersigil.toml
   paths = ["specs/**/*.md"]
   ```

   Use `--no-skills` to skip skills installation, or `-y` to accept all defaults without prompting.
2. **Create a spec document**

   Scaffold a requirements document:

   ```sh
   supersigil new requirements auth
   ```

   Then replace `specs/auth/auth.req.md` with one real criterion:

   ````md
   ---
   supersigil:
     id: auth/req
     type: requirements
     status: approved
     title: "Authentication"
   ---

   ```supersigil-xml
   <AcceptanceCriteria>
     <Criterion id="valid-creds">
       WHEN a user submits valid credentials
       THE SYSTEM SHALL issue a session token.
     </Criterion>
   </AcceptanceCriteria>
   ```

   ```supersigil-xml
   <TrackedFiles paths="src/auth/**/*.rs, tests/auth/**/*.rs" />
   ```
   ````

   The status is deliberately set to `approved` so the next verification run is meaningful: Supersigil now expects real evidence.
3. **Verify the graph**

   ```sh
   supersigil verify
   ```

   You should see a failing result because the criterion has no matching evidence yet:

   ```
   auth/req
   ✗ criterion "valid-creds" has no matching verification evidence
   ```

   This is the first payoff: the spec is now a contract, not just prose.
4. **Add evidence and watch it go green**

   Annotate a test with its criterion ref. The approach depends on your language.

   **Rust:** use the `verifies` attribute macro from the `supersigil_rust` crate. Native support is enabled by default.

   ```sh
   cargo add supersigil-rust
   ```

   ```rust
   // tests/auth/login_test.rs
   use supersigil_rust::verifies;

   #[verifies("auth/req#valid-creds")]
   #[test]
   fn login_with_valid_credentials_returns_token() {
       let result = login("alice", "correct-password");
       assert!(result.token.is_some());
   }
   ```

   **JavaScript/TypeScript:** enable the JavaScript/TypeScript ecosystem plugin in `supersigil.toml`, then use the `verifies` helper from the `@supersigil/vitest` package.

   ```sh
   pnpm install --save-dev @supersigil/vitest
   ```

   ```toml
   # supersigil.toml
   paths = ["specs/**/*.md"]
   tests = ["tests/**/*.test.ts"]

   [ecosystem]
   plugins = ["rust", "js"]
   ```

   ```ts
   // tests/auth/login.test.ts
   import { verifies } from '@supersigil/vitest'
   import { test, expect } from 'vitest'

   test('login with valid credentials', verifies('auth/req#valid-creds'), () => {
     const result = login('alice', 'correct-password')
     expect(result.token).toBeDefined()
   })
   ```

   Run verify again:

   ```sh
   supersigil verify
   ```

   ```
   ✓ Clean: no findings
   ```
## What you just proved

- Supersigil does not treat criteria as decorative text. It expects evidence.
- Verification can fail for the right reason before code reaches CI.
- One spec file is enough to start; you do not need a full process to get value.
## Hand it to an AI agent

The `supersigil init` step installed agent skills to `.agents/skills/`. These teach AI coding agents the full spec-driven workflow,
from writing specs to implementing against criteria and re-verifying the result.
Your agent can load the spec you just created as structured context:
```sh
supersigil context auth/req --format json
```

From here, an agent can read the criteria, plan implementation work, and run `supersigil verify --format json` to check its own output.
The skills guide agents through this loop automatically. Point the agent at a spec, and it knows what to do.
See the AI Agents guide for the full integration details.
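As one illustration of that loop, an agent harness could parse the verify output to decide what still needs evidence. The snippet below is a hypothetical sketch: the `Finding` shape is an assumption made for illustration, not Supersigil's documented JSON schema.

```typescript
// Hypothetical sketch: this Finding shape is assumed for illustration,
// not Supersigil's documented output format.
interface Finding {
  spec: string
  criterion: string
  message: string
}

// Turn findings into "spec#criterion" refs an agent can work through.
function unverifiedCriteria(findings: Finding[]): string[] {
  return findings.map((f) => `${f.spec}#${f.criterion}`)
}

// Sample data mirroring the failing result from step 3.
const sample: Finding[] = [
  { spec: 'auth/req', criterion: 'valid-creds', message: 'no matching verification evidence' },
]

console.log(unverifiedCriteria(sample)) // [ 'auth/req#valid-creds' ]
```

An agent would feed real `supersigil verify --format json` output into a function like this instead of the hard-coded sample.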
## Choose Your Next Path

- Set up your editor — Get live diagnostics, rendered spec previews with verification badges, and go-to-definition in VS Code or IntelliJ.
- Add Supersigil to an existing project — Adopt incrementally in a codebase that already has code and tests.
- Bring it into CI — Add `verify`, `affected`, and drift checks to pull requests.
- Understand evidence sources — Compare tags, file globs, and ecosystem plugins.
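The CI path can be sketched as a minimal GitHub Actions job. This is an illustrative sketch, not an official workflow: it installs Supersigil from source with `cargo install` (the most portable option from the Install section), and the file name and job name are placeholders.

```yaml
# .github/workflows/supersigil.yml (illustrative sketch, not an official workflow)
name: supersigil
on: [pull_request]

jobs:
  verify:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Compiles from source; a prebuilt binary from GitHub Releases would be faster.
      - run: cargo install supersigil
      - run: supersigil verify
```

In a real setup you would likely cache the binary or download a prebuilt release to keep PR checks fast.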