Evaluating Impact Certificates

Implemented (activation pending deployment)

Overview

Hypercerts represent claims of verified impact work. As an evaluator, your role is to assess whether these claims are substantiated by real evidence. This means verifying the attestation chain linking work submissions to approvals, checking the scope and quality of the underlying evidence, and evaluating impact claims against frameworks like the Eight Forms of Capital.

Hypercert evaluation

Implemented (activation pending deployment)

Attestation data and hypercert metadata are queryable. Full marketplace and trade evaluation features are pending activation.

How It Works

  1. Identify the hypercert

     Select the hypercert to evaluate. Review its metadata: scope of work, time period, contributor list, and the garden that minted it.

  2. Verify attestation chains

     For each piece of work included in the hypercert, confirm the attestation chain is intact: the work submission UID exists, the approval's refUID matches the submission UID, the attester is an authorized operator, and no attestation in the chain has been revoked.

  3. Assess evidence quality

     Review the underlying work submissions: check photos, descriptions, and metrics against the action requirements, and judge whether the evidence substantiates the impact claims.

  4. Map to impact framework

     If the hypercert includes assessment data, evaluate how well the impact maps to the Eight Forms of Capital (Living, Material, Financial, Social, Intellectual, Experiential, Spiritual, Cultural).
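The chain verification in step 2 can be sketched as a small check. This is a minimal illustration, assuming attestation records carry `uid`, `refUID`, `attester`, and `revoked` fields; the `Attestation` dataclass and `verify_chain` helper are hypothetical, not part of any deployed tooling:

```python
from dataclasses import dataclass

@dataclass
class Attestation:
    uid: str            # attestation UID
    attester: str       # address that signed the attestation
    ref_uid: str = ""   # UID this attestation references, if any
    revoked: bool = False

def verify_chain(submission: Attestation, approval: Attestation,
                 authorized_operators: set[str]) -> list[str]:
    """Return a list of problems; an empty list means the chain is intact."""
    problems = []
    if approval.ref_uid != submission.uid:
        problems.append("approval refUID does not match submission UID")
    if approval.attester not in authorized_operators:
        problems.append("approver is not an authorized operator")
    if submission.revoked or approval.revoked:
        problems.append("an attestation in the chain has been revoked")
    return problems
```

Returning the full list of problems, rather than failing on the first one, lets you report every gap in a single pass over the hypercert.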
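For step 4, one way to flag under-evidenced claims is to confirm that every claimed form of capital has at least one supporting evidence item. A sketch, where `evidence_by_form` is a hypothetical mapping from capital form to evidence items:

```python
EIGHT_FORMS = {"Living", "Material", "Financial", "Social",
               "Intellectual", "Experiential", "Spiritual", "Cultural"}

def unsupported_forms(claimed: set[str],
                      evidence_by_form: dict[str, list]) -> list[str]:
    """Claimed forms of capital with no supporting evidence attached."""
    unknown = claimed - EIGHT_FORMS
    if unknown:
        # Reject labels outside the framework rather than silently passing them
        raise ValueError(f"not one of the Eight Forms: {sorted(unknown)}")
    return sorted(f for f in claimed if not evidence_by_form.get(f))
```

Any form returned here is a candidate for the "claims exceed evidence" decision below.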

Evaluation decisions

If: All attestation chains are valid and evidence quality is strong

Do: Document the verification results with confidence. Note any standout contributions.

Then: Include the certificate in positive evaluation reports for funders.

If: Some attestation chains are broken or evidence is weak

Do: Document specific gaps: which UIDs fail verification, which evidence is insufficient.

Then: Report findings to the garden operator and note caveats in evaluation reports.

If: Impact scope claims exceed what the evidence supports

Do: Note the discrepancy between claimed and evidenced impact.

Then: Recommend scope adjustments before including in funding recommendations.

Best Practices

  • Evaluate a random sample of attestation chains, not just the most recent — this catches historical inconsistencies
  • Cross-reference contributor allowlists with the actual attestation data to confirm the right gardeners are credited
  • Document your evaluation methodology so that your results are reproducible by other evaluators
  • When impact claims span multiple forms of capital, verify that each form has supporting evidence
  • Consider the garden's review standards and operator history when assessing evidence quality — well-run gardens typically produce more reliable attestation chains
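The first and third practices combine naturally: draw a uniform random sample over the full history and record the seed so other evaluators can reproduce the draw. A minimal sketch using the standard library:

```python
import random

def sample_chain_uids(all_uids: list[str], k: int, seed: int) -> list[str]:
    """Uniform random sample over the full history, not just recent chains.

    Publishing the seed alongside your report makes the exact sample
    reproducible by other evaluators.
    """
    rng = random.Random(seed)
    return rng.sample(all_uids, min(k, len(all_uids)))
```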

What's Next

Next best action

With evaluation skills established, learn how this work builds your evaluator reputation.

Earning Badges