Tama

When things go sideways

Troubleshooting

A directory of the failure modes you're most likely to hit, what they mean, and how to clear them.

Start here

tama doctor

doctor checks every external tool, the project’s Lean / Lake / solc versions, Verity dependency resolution, generated directories, and tama.lock drift. Use tama doctor --fix only for repairs Tama explicitly says are safe.

“Not in a Tama project”

A project command failed to find tama.toml while walking up from the current directory. Either cd into the project, or pass --root:

tama --root /path/to/project doctor
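The walk-up search can be sketched in plain shell. This is a hypothetical re-implementation for illustration, not Tama's actual code; the throwaway directory tree exists only to make the example self-contained:

```shell
# Sketch of the walk-up search for tama.toml (illustrative, not Tama's code).
# Build a throwaway project tree so the example runs anywhere.
root=$(mktemp -d)
mkdir -p "$root/contracts/src"
touch "$root/tama.toml"
cd "$root/contracts/src"

found=""
dir=$PWD
while [ "$dir" != "/" ]; do
  if [ -f "$dir/tama.toml" ]; then
    found=$dir                      # first ancestor holding tama.toml wins
    echo "project root: $dir"
    break
  fi
  dir=$(dirname "$dir")             # step one directory up
done
```

If the loop reaches / without a hit, you are outside any project and --root is the way out.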

Stale lock

A run with --locked (the mode CI uses) failed because tracked inputs changed since tama.lock was written. Diagnose, then refresh the lock:

tama doctor
tama update

If you intentionally changed only generated directories or dependency metadata, tama doctor --fix may be enough. CI should run --locked after tama update (or tama init) has produced the intended lock state.

Missing or wrong solc

Tama requires the solc version configured in [yul].solc. Check:

solc --version
tama doctor

Install the configured version, or set TAMA_SOLC to a binary that matches. A mismatched binary fails before bytecode extraction so the manifest can’t record a stale hash.
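The version comparison Tama performs can be sketched as follows. The configured value and the solc output line below are illustrative samples, not captured from a real run; only [yul].solc, TAMA_SOLC, and the solc --version command come from the doc:

```shell
# Sketch: compare the version reported by `solc --version` against the
# value configured under [yul].solc. Both strings here are sample data.
want="0.8.24"                                          # hypothetical [yul].solc value
have_line="Version: 0.8.24+commit.e11b9ed9.Linux.g++"  # sample solc output line

# Pull the first dotted version triple out of the output line.
have=$(printf '%s\n' "$have_line" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1)

if [ "$have" = "$want" ]; then
  echo "solc matches"
else
  echo "mismatch: have $have, want $want (install $want or point TAMA_SOLC at it)"
fi
```

In a real shell you would replace have_line with the live output of solc --version.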

Lake dependencies or offline builds

tama check and tama build do not run lake update. If Lake dependencies are missing, run:

lake update

For repeated or offline work, keep a clean checkout cache in TAMA_LAKE_PACKAGE_CACHE. Tama seeds .lake/packages only when the cached Git HEAD exactly matches lake-manifest.json. It will fill missing packages or replace clean stale package checkouts, but it never overwrites dirty or non-Git package directories. With --offline, Tama refuses to invoke Lake if any pinned git package is still missing, dirty, or at another revision under .lake/packages.
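Setting up the cache can be as simple as exporting the variable before running Tama. TAMA_LAKE_PACKAGE_CACHE is named in the doc above; the directory layout chosen here is an assumption, not a Tama requirement:

```shell
# Sketch: give TAMA_LAKE_PACKAGE_CACHE a persistent home so repeated and
# offline builds can reuse clean checkouts. The path is a suggestion only.
export TAMA_LAKE_PACKAGE_CACHE="${XDG_CACHE_HOME:-$HOME/.cache}/tama/lake-packages"
mkdir -p "$TAMA_LAKE_PACKAGE_CACHE"
echo "lake package cache: $TAMA_LAKE_PACKAGE_CACHE"
```

Put the export in your shell profile or CI environment so every tama build sees it.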

Forge dependencies

tama init installs pinned forge-std as a Git submodule, optionally shallow. For Solidity-side dependency changes, use Forge directly:

forge install owner/repo
forge update

“Generated file was edited”

build error: src/generated/verity/TipJarIface.sol has a custom first line; refusing to overwrite

Tama overwrites generated files only when the first line is exactly:

// GENERATED by tama; do not edit.

If a generated bridge was hand-edited, move the changes into Verity source/spec/proof, or into a separate user-owned Solidity file that imports the generated interface, then run tama build again.
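The first-line check Tama applies can be reproduced by hand to see whether a file is still regenerable. A minimal sketch, using a temp file as a stand-in for a real path such as src/generated/verity/TipJarIface.sol:

```shell
# Sketch: does this file still carry the exact sentinel first line,
# i.e. will Tama be willing to overwrite it?
sentinel='// GENERATED by tama; do not edit.'

# Stand-in for a generated bridge file, so the example is self-contained.
f=$(mktemp)
printf '%s\npragma solidity ^0.8.20;\n' "$sentinel" > "$f"

if [ "$(head -n 1 "$f")" = "$sentinel" ]; then
  status="regenerable"      # Tama will overwrite on the next build
else
  status="hand-edited"      # build will refuse; move your changes out first
fi
echo "$status"
```

Any deviation in the first line, even whitespace, flips the result to hand-edited.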

Missing bytecode or inspect artifacts

If tama inspect reports a missing artifact, rebuild:

tama build

For trust output, tama build must precede tama audit trust-boundary:

tama build
tama audit trust-boundary

Coverage failures

tama audit coverage rejects mirrors that aren’t shaped like properties. A mirror must:

  • live under Foundry’s configured test directory,
  • name a contract-qualified Foundry symbol,
  • have a function name beginning with testFuzz or invariant_.

Example correct shape:

test/verity/ERC20Lite.t.sol:ERC20LiteTest.testFuzzTransferPreservesTotalSupply

Plain test_* (example-based) tests can exist as smoke tests but don’t satisfy mirror coverage. See Reading audit output for the exact failure shape and how to fix it.
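The naming rule can be checked mechanically before running the audit. A sketch using the example symbol from the doc; the parsing here is an illustration, not Tama's implementation:

```shell
# Sketch: check the function part of a contract-qualified mirror symbol
# against the naming rule (prefix testFuzz or invariant_).
sym='test/verity/ERC20Lite.t.sol:ERC20LiteTest.testFuzzTransferPreservesTotalSupply'

fn=${sym##*.}    # everything after the last dot: the function name
case $fn in
  testFuzz*|invariant_*) verdict="valid mirror name" ;;
  *)                     verdict="not a mirror: $fn" ;;
esac
echo "$verdict"
```

A symbol ending in, say, test_transferBasic would fall through to the second branch.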

Trust-boundary failures

sorryAx is always denied — it cannot be allowlisted. Remove sorry from public obligations before release.

Other axioms must be intentionally allowlisted in tama.toml with a non-empty reason:

[trust.allow_axioms]
"Classical.choice" = "accepted Lean classical reasoning"

Invalid config

tama.toml is strict: unknown Tama-owned keys and malformed TOML cause errors rather than being silently ignored. Foundry settings belong in foundry.toml, not tama.toml. After editing, rerun:

tama doctor

Network failures during install

install.sh retries network calls automatically, but if you’re behind a proxy or the GitHub Releases endpoint is down, you’ll see a fetch error. Try again, or download the artifact manually from the GitHub Releases page and install via tamaup --manifest-file path/to/manifest.json.

Still stuck?

Open an issue at github.com/bacon-labs/tama/issues with:

  • the output of tama doctor,
  • the command you ran,
  • the full output (use -vvv for trace-level logs),
  • your platform (uname -a).