Writing Good Specifications

This guide helps contributors write clear, testable, and well-scoped specification artifacts. It complements the artifact type definitions in Specification Overview and the directive syntax in the Sphinx Needs Usage Guide.

Writing Requirements

A good requirement is testable, unambiguous, and states a single obligation.

Use IEEE 830 language conventions:

  • SHALL for mandatory behavior

  • SHOULD for recommended behavior

  • MAY for optional behavior

Make requirements measurable. A requirement that cannot be verified by a test is not a requirement — it is a wish.

  • Bad: “The system shall be fast”

  • Good: “The system SHALL complete pack operations at a rate of at least 100 MB/s for files 1 GB or larger”

  • Bad: “The system shall handle errors gracefully”

  • Good: “The system SHALL display a human-readable error message and exit with non-zero status when the source file does not exist”

One obligation per requirement. If a requirement contains “and” joining two distinct behaviors, split it into two requirements. “The system shall pack and verify files” should become two separate requirements — one for packing, one for verification.
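As a sketch of the split, the compound requirement above could become two separate requirement directives. The directive and option names (.. req::, :id:, :satisfies:) follow the conventions this guide references; the IDs, titles, and linked use case are illustrative:

```rst
.. req:: Pack files into chunks
   :id: FR-010
   :satisfies: UC-001

   The system SHALL pack source files into media-sized chunks.

.. req:: Verify packed files
   :id: FR-011
   :satisfies: UC-001

   The system SHALL verify the integrity of each packed chunk
   against its recorded checksum.
```

Each requirement now carries exactly one obligation, and each can be covered by its own test.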

Functional vs. non-functional: If a requirement describes what the system does, it is functional (:req:). If it describes how well the system performs — covering performance, security, usability, or portability — it is non-functional (:nfreq:).
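For instance, the measurable performance requirement from the "Good" example above describes how well the system performs, so it would be declared as a non-functional requirement. This is a sketch; the .. nfreq:: directive form and the ID are assumptions based on the type names used in this guide:

```rst
.. nfreq:: Pack throughput
   :id: NFR-001
   :satisfies: UC-001

   The system SHALL complete pack operations at a rate of at
   least 100 MB/s for files 1 GB or larger.
```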

Writing Use Cases

A use case describes a user’s goal, not a system behavior.

Include:

  • Actor: Who is performing the action (e.g., “IT administrator”, “privacy-conscious user”)

  • Preconditions: What must be true before the use case begins

  • Main flow: The steps the user takes to accomplish their goal

  • Success criteria: How the user knows they succeeded

Granularity: One use case per distinct user goal. “Transfer a large file across an air gap” is a use case. “Click the pack button” is not — that is a UI interaction within a use case.

Use cases are the starting point of the traceability chain. Requirements derive from use cases (linked via :satisfies:). Avoid implementation details in use cases — those belong in design specifications.
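Putting these elements together, a use case might be written as follows. The .. usecase:: directive name and the UC-XXX ID scheme are assumptions based on the artifact types in Specification Overview; the content is illustrative:

```rst
.. usecase:: Transfer a large file across an air gap
   :id: UC-001

   **Actor:** IT administrator

   **Preconditions:** The source machine holds the file;
   removable media with sufficient capacity is available.

   **Main flow:**

   1. Pack the file into media-sized chunks.
   2. Carry the media across the air gap.
   3. Unpack and verify the file on the target machine.

   **Success criteria:** The reassembled file is byte-identical
   to the original.
```

Note that the flow stays at the level of user goals; it does not mention manifests, checksums, or any other implementation detail.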

Writing Test Cases

A test case verifies that a requirement is satisfied. It does not restate the requirement — it describes how to prove the requirement is met.

Structure each test case with:

  • Objective: What this test verifies (reference the requirement)

  • Setup/Preconditions: What must be in place before the test runs

  • Steps: The specific actions to perform, in order

  • Expected result: The observable outcome that constitutes a pass

  • Pass/fail criteria: How to distinguish success from failure

Each test should be reproducible by someone who did not write it. Avoid “verify it works correctly” — state the specific expected output.

One test can cover multiple requirements (:tests: FR-XXX, FR-YYY), but every requirement should have at least one test. Use the traceability matrices to identify untested requirements.
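A test case following this structure might be sketched as below. The .. test:: directive name is an assumption; :tests: is the link option named above, and the IDs and sizes are illustrative (they echo the 50 GB example in Common Mistakes):

```rst
.. test:: Pack splits oversized files into chunks
   :id: TC-010
   :tests: FR-010, FR-011

   **Objective:** Verify chunked packing and checksum creation
   for files larger than the media size.

   **Setup/Preconditions:** A 50 GB source file; an empty
   output directory.

   **Steps:**

   1. Run the pack operation on the source file.
   2. List the contents of the output directory.
   3. Run the verify operation on the output.

   **Expected result:** Four chunks of the expected size are
   created, each with a valid checksum in the manifest.

   **Pass/fail criteria:** Pass if chunk count, chunk sizes, and
   all checksums match; fail otherwise.
```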

Writing Design Specifications

Design specifications bridge requirements and implementation. They describe how the system fulfills its requirements and why a particular approach was chosen.

Include:

  • Architecture decisions and their rationale

  • Data structures and formats

  • Algorithm choices and trade-offs

  • Component interfaces and boundaries

A design spec should not restate the requirement. Where the requirement says “the system shall verify file integrity,” the design spec says “use SHA-256 checksums stored in a JSON manifest file.” Link to the requirements being addressed using the :implements: option.
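Continuing the integrity example, a design spec entry might read as follows. The .. spec:: directive name is an assumption; :implements: is the option named above, and the ID is illustrative:

```rst
.. spec:: Integrity verification via checksums
   :id: DS-005
   :implements: FR-011

   Compute a SHA-256 checksum for every chunk at pack time and
   store the digests in a JSON manifest alongside the chunks.
   At unpack time, recompute each digest and compare it against
   the manifest entry.

   **Rationale:** SHA-256 is widely available and its collision
   resistance is more than sufficient for detecting corruption;
   a JSON manifest keeps the format human-inspectable.
```

The spec answers how and why; the what (integrity SHALL be verified) stays in the requirement it implements.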

Common Mistakes

  • Requirements that are really design specs: “The system shall use SHA-256 for checksums” specifies how, not what. The requirement is “the system shall verify file integrity”; the algorithm choice belongs in the design spec.

  • Test cases that restate requirements: “Verify that the system splits files into chunks” is a requirement restated as a test. A proper test specifies: create a 50GB file, run pack, verify 4 chunks of expected size are created with valid checksums.

  • Use cases that are too granular: System-level actions (“parse the manifest file”) are not use cases. Use cases describe user goals (“deploy an application to an air-gapped system”).

  • Missing traceability links: Orphaned requirements with no tests, or tests that don’t link back to requirements, break the traceability chain. Run a docs build and check the untested requirements table in Specification Overview to catch these.

See Also