# Tests & Suite
jetwarp’s portability guarantee is enforced by a shared conformance suite under `tests/suite`.
This page explains how to run the tests, what the suite covers, and how to add new contract tests.
## Running tests
### All modules at once

```bash
bash tools/test-all.sh
```

Runs `go test ./...` in every module. Set `TW_TEST_ALL_RACE=1` to enable the race detector.
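The per-module loop can be sketched in a few lines of shell. This is a hedged sketch, not the real `tools/test-all.sh`: the module list and the `run_all` name are illustrative, and `echo` stands in for actually executing the tests.

```shell
# Illustrative sketch of the per-module loop; echo stands in for
# executing the command (swap it out to run the tests for real).
run_all() {
  flags=""
  if [ "${TW_TEST_ALL_RACE:-0}" = "1" ]; then
    flags=" -race"
  fi
  for dir in drivers/v1/chi adapter/chi openapi/oas3; do
    echo "cd $dir && go test$flags ./..."
  done
}
run_all
```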
### One module

```bash
cd drivers/v1/chi && go test ./...
cd adapter/chi && go test ./...
cd openapi/oas3 && go test ./...
```
### Lint

```bash
bash tools/lint-all.sh
```
## The conformance suite
The suite lives in `tests/suite/` and is importable by any module that needs to validate a driver or adapter.
### Entry points

| Function | Used by | What it tests |
|---|---|---|
| `suite.RunDriver(t, factory)` | Driver packages | Routing, params, scoping, method handling, no-panic contract |
| `suite.RunDriverSmoke(t, factory)` | Driver packages | Practical route tree, concurrency, `MethodAny` fallback priority |
| `suite.RunAdapter(t, factory)` | Adapter packages | Logical scoping, middleware ordering, sentinel errors, native MW handling |
| `suite.RunDriverBenchmarks(b, factory)` | Driver packages | Routing and method benchmarks |
### How to call them

In your driver’s test package (the import paths below are placeholders; substitute your module’s real ones):

```go
// drivers/v1/myrouter/driver_conformance_test.go
package myrouter_test

import (
	"testing"

	myrouter "example.com/jetwarp/drivers/v1/myrouter" // placeholder path
	drv "example.com/jetwarp/drv"                      // placeholder path
	"example.com/jetwarp/tests/suite"                  // placeholder path
)

func TestDriverConformance(t *testing.T) {
	suite.RunDriver(t, suite.DriverFactory{
		Name: "myrouter",
		New:  func(t *testing.T) drv.Drv { return myrouter.New() },
	})
}

func TestDriverSmoke(t *testing.T) {
	suite.RunDriverSmoke(t, suite.DriverFactory{
		Name: "myrouter",
		New:  func(t *testing.T) drv.Drv { return myrouter.New() },
	})
}
```
In your adapter’s test package (import paths are again placeholders):

```go
// adapter/myrouter/adapter_conformance_test.go
package myadapter_test

import (
	"testing"

	"example.com/jetwarp/adapter"                    // placeholder path
	myadapter "example.com/jetwarp/adapter/myrouter" // placeholder path
	"example.com/jetwarp/tests/suite"                // placeholder path
)

func TestAdapterConformance(t *testing.T) {
	suite.RunAdapter(t, suite.AdapterFactory{
		Name: "myrouter adapter",
		New:  func(t *testing.T) adapter.Adapter { return myadapter.New() },
	})
}
```
## Capability-gated tests

Many suite tests are gated on capabilities:

```go
if !d.Caps().Has(drv.CapParamSuffix) {
	t.Skipf("driver %s missing CapParamSuffix", f.Name)
	return
}
```
If your driver does not claim a capability, those tests are skipped rather than failed. This lets you implement capabilities incrementally without breaking CI.
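A capability set like this is typically just a bitmask. The following stdlib-only sketch (the `Caps` type and the constant names are illustrative, not jetwarp’s actual definitions) shows why an unclaimed capability simply makes `Has` report false, which the suite translates into a skip:

```go
package main

import "fmt"

// Caps is a hypothetical bitmask of driver capabilities: each capability
// occupies one bit, and a driver claims a capability by setting that bit.
type Caps uint64

const (
	CapParamSuffix Caps = 1 << iota
	CapMethodAny
)

// Has reports whether every bit in want is set.
func (c Caps) Has(want Caps) bool { return c&want == want }

func main() {
	claimed := CapMethodAny // a driver claiming only MethodAny
	fmt.Println(claimed.Has(CapParamSuffix)) // false: gated tests skip
	fmt.Println(claimed.Has(CapMethodAny))   // true: gated tests run
}
```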
## Go version requirement
The suite helper uses APIs that require modern Go. In practice, your test module should declare at least Go 1.26, which is also the repo minimum for in-repo modules.
## Adding a contract test
When you add a new portability guarantee (e.g. a new semantic rule for `Group` or a new capability),
you usually need a matching suite test so that every current and future driver/adapter is held to it.
Steps:

- Add the test to the appropriate file in `tests/suite/` (or a new file if it’s a new battery).
- Gate it on a capability if the behavior is not universal: `if !d.Caps().Has(drv.SomeNewCap) { t.Skip("...") }`
- Run the full suite against all existing drivers to confirm they pass.
- If a driver should now claim the new capability, update its `Caps()` return value.
## Logical scoping tests

`RunAdapter` includes logical scoping contract tests (`testAdapterLogicalScopingContract`).
These require the adapter to implement `adapter.RegistryProvider`. Adapters built with
`core.New(driver.New())` implement it automatically.
## Mock drivers (tests/testkit)

`testkit.MockDriver` is a deterministic in-memory driver for testing core semantics without
a real router. It records every `Handle` call and serves by exact-match lookup.

`testkit.LimitedMockDriver` wraps `MockDriver` and lets you selectively disable capabilities,
which is useful for testing how core or the suite reacts to missing caps.
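The record-and-exact-match design can be sketched in a few lines. This is a hypothetical stand-in, not testkit’s actual code; `call`, `MockDriver`, and `Serve` are illustrative names:

```go
package main

import "fmt"

// call is the exact-match lookup key: no pattern logic, just strings.
type call struct{ method, pattern string }

// MockDriver records every Handle call and serves by exact lookup.
type MockDriver struct {
	handlers map[call]func() string
	Calls    []call
}

func NewMock() *MockDriver {
	return &MockDriver{handlers: map[call]func() string{}}
}

func (m *MockDriver) Handle(method, pattern string, h func() string) {
	c := call{method, pattern}
	m.Calls = append(m.Calls, c) // recorded for later inspection in tests
	m.handlers[c] = h
}

func (m *MockDriver) Serve(method, path string) (string, bool) {
	h, ok := m.handlers[call{method, path}] // exact match only
	if !ok {
		return "", false
	}
	return h(), true
}

func main() {
	m := NewMock()
	m.Handle("GET", "/users", func() string { return "list" })
	out, ok := m.Serve("GET", "/users")
	fmt.Println(out, ok) // list true
	_, ok = m.Serve("GET", "/nope")
	fmt.Println(ok) // false
}
```

Because lookup is exact, the mock is fully deterministic: any routing subtlety (params, prefixes, fallbacks) must come from core, which is exactly what it is meant to isolate.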
## Driver-specific regression tests
Beyond the shared conformance suite, each driver and adapter package ships its own regression tests that pin deliberate, driver-specific behaviors. These are not in the suite because they test choices that are intentional for a single driver, not cross-driver contracts.
### Per-driver regression files

| Module | File | What it pins |
|---|---|---|
| `drivers/v1/gin` | `gin_regression_test.go` | Param-suffix rejection, internal header non-leak, param bridging (`r.PathValue` = `drv.Param`), 405 + `Allow` header, `MethodAny` fallback, scope semantics |
| `drivers/v1/chi` | `chi_regression_test.go` | `r.PathValue` bridging, scope mount + parent routes, 405 + `Allow` header, param-suffix/infix patterns, invalid-pattern no-panic, `MethodAny` fallback, duplicate scope error |
| `drivers/v1/echo` | `echo_regression_test.go` | `MethodAny` lower priority, param bridging, scope semantics, invalid-pattern no-panic |
| `drivers/v1/fiber` | `fiber_regression_test.go` | Param bridging, `MethodAny` fallback, invalid-pattern no-panic, internal header non-leak |
| `drivers/v1/stdlib` | `std_lib_regression_test.go` | `MethodAny` fallback, param bridging, scope semantics |
| `openapi/oas3/validate` | `golden_regression_test.go` | Representative spec round-trip validates via pb33f (OAS 3.2, path params, ANY expansion, non-standard verbs) |
### Per-adapter native MW scope regression files

These freeze the documented scoping difference between Fiber (truly prefix-scoped native MW)
and gin/echo/chi (global native MW regardless of `Group` prefix):

| Module | File | What it pins |
|---|---|---|
| `adapter/gin` | `native_mw_scope_regression_test.go` | Native gin MW on `Group("/api")` applies globally (documented limitation) |
| `adapter/echo` | `native_mw_scope_regression_test.go` | Native echo MW on `Group("/api")` applies globally (documented limitation) |
| `adapter/chi` | `native_mw_scope_regression_test.go` | Native chi MW on `Group("/api")` applies globally (documented limitation) |
| `adapter/fiber` | `native_mw_scope_regression_test.go` | Native fiber MW on `Group("/api")` is correctly prefix-scoped (`drv.PrefixMWApplier`) |
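The difference being frozen can be illustrated with a stdlib-only sketch that uses no jetwarp APIs; `tagMW`, `buildGlobal`, and `buildScoped` are illustrative names for the two wiring styles:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// tagMW marks every response that passed through the middleware.
func tagMW(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("X-MW", "applied")
		next.ServeHTTP(w, r)
	})
}

func routes() *http.ServeMux {
	mux := http.NewServeMux()
	mux.HandleFunc("/api/users", func(w http.ResponseWriter, r *http.Request) { fmt.Fprint(w, "api") })
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) { fmt.Fprint(w, "ok") })
	return mux
}

// buildGlobal wires tagMW the way gin/echo/chi native MW behaves:
// the middleware wraps everything, so even /health passes through it.
func buildGlobal() http.Handler { return tagMW(routes()) }

// buildScoped wires tagMW the way fiber native MW behaves:
// only requests under /api/ pass through the middleware.
func buildScoped() http.Handler {
	outer := http.NewServeMux()
	outer.Handle("/api/", tagMW(routes()))
	outer.Handle("/health", routes())
	return outer
}

func hit(h http.Handler, path string) string {
	rec := httptest.NewRecorder()
	h.ServeHTTP(rec, httptest.NewRequest("GET", path, nil))
	return rec.Header().Get("X-MW")
}

func main() {
	fmt.Println(hit(buildGlobal(), "/health"))    // "applied": global MW leaks onto /health
	fmt.Println(hit(buildScoped(), "/health"))    // "": prefix-scoped MW does not
	fmt.Println(hit(buildScoped(), "/api/users")) // "applied": still covered under /api/
}
```

The regression files above assert exactly this kind of observable difference per adapter, so a refactor cannot silently change which routes a native middleware reaches.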
## When to add a regression test vs a suite test

Add to the suite when the behavior is part of the portability contract, i.e. something every current and future driver must satisfy. Add a driver regression test when the behavior is:

- a deliberate driver-specific choice (e.g. gin rejecting `{id}.json`), or
- an internal implementation detail that could silently break on refactor (e.g. header stripping), or
- a documented limitation that should not regress (e.g. native MW scope behavior per driver).
## What the suite does NOT cover

- Framework-specific features (gin performance characteristics, fiber websockets, etc.)
- Request body / response streaming edge cases
- Production observability or graceful shutdown

These belong in driver-specific regression tests, not the shared suite.