# Extending zipnet

*Audience: contributors.*
This chapter covers two kinds of extension:

- **Extending zipnet itself** — new commands, collections, streams, ticket classes, or round-parameter knobs within a zipnet deployment.
- **Building an adjacent service on the shared universe** — a new mosaik-native service (multisig signer, secure storage, attested oracle, …) that coexists with zipnet on `zipnet::UNIVERSE` and reuses the content- and intent-addressed fingerprint pattern.

The second is the generalisation of the first. The "checklist for a new service" at the end of design-intro is the canonical reference for the second kind; this chapter links to it and concentrates on the concrete how-tos.
## Extending zipnet itself
### Adding a new command to the committee state machine
- Add a variant to `Command` in `crates/zipnet-node/src/committee.rs`.
- Handle it in `apply()`. Deterministic only — no I/O, no randomness that isn't derived from `ApplyContext` (see Committee state machine — Apply-context usage).
- Bump the version tag in `CommitteeMachine::signature()` (`v1` → `v2`). This re-scopes the `GroupId` so mismatched nodes cannot bond. This is a breaking change.
- Add a `Query` variant if the new state needs external read access.
- Decide who issues the command. If a non-server peer needs to trigger it, add a `declare::stream!` channel and a side-task in `roles::server` that feeds it into `group.execute`.
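The deterministic-apply rule above can be illustrated with a minimal sketch. All names here (`Command`, `ApplyContext`, `Machine`, the `RotateSecret` variant) are hypothetical stand-ins, not the actual zipnet source; the point is that every input comes from the command or the apply context, never from ambient I/O or an unseeded RNG.

```rust
/// Stand-in for zipnet's ApplyContext: everything nondeterministic the
/// machine consumes must be derived from agreed-upon fields like these.
struct ApplyContext {
    round: u64,
    entropy: [u8; 32], // per-round seed all members share, not thread_rng()
}

enum Command {
    OpenRound,
    // New variant added per the checklist above (hypothetical).
    RotateSecret,
}

#[derive(Default)]
struct Machine {
    round_open: bool,
    secret_epoch: u64,
}

impl Machine {
    /// Deterministic: same state + command + context => same new state.
    fn apply(&mut self, cmd: &Command, ctx: &ApplyContext) {
        match cmd {
            Command::OpenRound => self.round_open = true,
            Command::RotateSecret => {
                // Derive from ctx, never from SystemTime::now() or an RNG.
                self.secret_epoch = ctx.round ^ (ctx.entropy[0] as u64);
            }
        }
    }
}

fn main() {
    let ctx = ApplyContext { round: 7, entropy: [1; 32] };
    let (mut a, mut b) = (Machine::default(), Machine::default());
    a.apply(&Command::RotateSecret, &ctx);
    b.apply(&Command::RotateSecret, &ctx);
    // Two replicas applying the same log agree exactly.
    assert_eq!(a.secret_epoch, b.secret_epoch);
    println!("epoch = {}", a.secret_epoch);
}
```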
### Adding a new collection
- Declare in `crates/zipnet-node/src/protocol.rs`:

  ```rust
  declare::collection!(
      pub MyMap = mosaik::collections::Map<K, V>,
      "zipnet.collection.my-map",
  );
  ```

- Decide writer and reader roles. Writers join the collection's internal Raft group and bear the leadership-election cost.
- For TDX-gated collections, compose `Tdx::new().require_mrtd(...)` onto the collection's `require_ticket` alongside the existing `BundleValidator` — see Mosaik integration — TDX gating.
- If the new collection is part of the public surface, think twice. Zipnet's declared public surface is small (write-side + read-side, see Architecture). A new public collection widens the consumer contract; prefer surfacing via the existing `Zipnet::<D>::*` constructors instead of growing raw declarations.
- Once the target per-deployment layout lands, the literal string will be replaced by `DEPLOYMENT.derive("my-map")`; structure the name so the migration is a pure rename.
### Adding a new typed stream
- Declare in `protocol.rs`. Prefix predicates with `producer` / `consumer` per the direction semantics (Mosaik integration — predicate direction).
- Use in a role module: `MyStream::producer(&network)` / `MyStream::consumer(&network)` return concrete typed handles.
- If this is a high-churn internal channel (aggregator fan-in, DH gossip), it is a candidate to live on a derived private network rather than the shared universe — see Architecture — Internal plumbing.
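The "concrete typed handles" can be sketched with phantom direction types. This is a hypothetical illustration of the pattern, not mosaik's actual `declare::stream!` expansion (the `&network` argument is omitted and all names are invented); it shows why a producer handle cannot be used where a consumer is expected.

```rust
use std::marker::PhantomData;

// Direction markers: the compiler, not runtime checks, enforces who may send.
struct Producer;
struct Consumer;

struct StreamHandle<T, Dir> {
    name: &'static str,
    _marker: PhantomData<(T, Dir)>,
}

struct MyStream;

impl MyStream {
    const NAME: &'static str = "zipnet.stream.my-stream";

    fn producer<T>() -> StreamHandle<T, Producer> {
        StreamHandle { name: Self::NAME, _marker: PhantomData }
    }

    fn consumer<T>() -> StreamHandle<T, Consumer> {
        StreamHandle { name: Self::NAME, _marker: PhantomData }
    }
}

// Only a producer-typed handle is accepted here.
fn send<T>(_h: &StreamHandle<T, Producer>, _msg: T) {}

fn main() {
    let p = MyStream::producer::<u64>();
    let _c = MyStream::consumer::<u64>();
    send(&p, 42);
    // send(&_c, 42); // would not compile: wrong direction type
    println!("{}", p.name);
}
```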
### Adding a new `TicketValidator`
- Implement `mosaik::tickets::TicketValidator` on a fresh type. `BundleValidator<K>` in `crates/zipnet-node/src/tickets.rs` is the reference shape.
- Pick a `TicketClass` constant. Keep it human-readable (`"zipnet.bundle.server"`, etc.) — ticket classes are intent-addressed and the string is the intent.
- Fold a version tag into `signature()` the same way `BundleValidator` does:

  ```rust
  fn signature(&self) -> UniqueId {
      K::CLASS.derive("zipnet.my-validator.v1")
  }
  ```

  Bumping `v1` → `v2` re-scopes the `GroupId` of every group that stacks this validator. Treat it as a breaking change.
- Compose with existing validators via mosaik's multi-`require_ticket` — see Mosaik integration — TDX gating for the stacking pattern.
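Why the version bump re-scopes ids can be shown in miniature. This sketch uses std's `DefaultHasher` as a stand-in for the real content-addressed derive (blake3 in zipnet), and the `derive` function here is illustrative, not mosaik's API; the point is only that the version string participates in the id.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Stand-in for the real derive: hash the class id together with the
/// intent/version tag, so any change to either yields a fresh id.
fn derive(class: &str, tag: &str) -> u64 {
    let mut h = DefaultHasher::new();
    class.hash(&mut h);
    tag.hash(&mut h);
    h.finish()
}

fn main() {
    let v1 = derive("zipnet.bundle.server", "zipnet.my-validator.v1");
    let v2 = derive("zipnet.bundle.server", "zipnet.my-validator.v2");
    // Different version tag => different signature => different GroupId:
    // old and new nodes cannot bond with each other.
    assert_ne!(v1, v2);
    println!("v1={v1:x} v2={v2:x}");
}
```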
### Changing `RoundParams`
- Edit `RoundParams::default_v1()` in `crates/zipnet-proto/src/params.rs`.
- Bump `WIRE_VERSION` if the change is semantically meaningful (any client/server disagreement on shape would otherwise garble pads).
- `CommitteeMachine::signature()` already mixes in params fields; every member re-derives `GroupId`, and old and new do not bond.
- Deploy-time coordination: same procedure as rotating the committee secret.
### Adding a TDX attestation requirement
- Turn on the `tee-tdx` feature on `zipnet-node`, `zipnet-server`, `zipnet-client`.
- In the deployment-specific `main`, pre-compute (or hardcode) the expected MR_TD.
- Build a validator:

  ```rust
  use mosaik::tickets::Tdx;

  let validator = Tdx::new().require_mrtd(expected_mrtd);
  ```

- Plumb `validator` into the server's `run` path by stacking it on the committee `GroupBuilder::require_ticket` and on each collection / stream whose producer you want to TDX-gate.
### Swapping the slot assignment function
- The slot is picked by `zipnet_core::slot::slot_for(client, round, params)`. Change the body; the caller contract is `-> usize`.
- If you want the footprint-scheduling variant, you will also need a per-round side channel — see Roadmap — Footprint scheduling.
- The function must be deterministic and agreed upon by all nodes. Bump the protocol version tags accordingly.
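A minimal sketch of a function satisfying that contract, assuming the slot is just a hash of `(client, round)` reduced modulo the slot count. The `RoundParams` fields and `DefaultHasher` are stand-ins for whatever the real implementation uses; the property being demonstrated is determinism across nodes.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hypothetical subset of RoundParams — only the field this sketch needs.
struct RoundParams {
    num_slots: usize,
}

/// Deterministic in (client, round, params): every node evaluating the
/// same inputs computes the same slot, with no I/O or randomness.
fn slot_for(client: u64, round: u64, params: &RoundParams) -> usize {
    let mut h = DefaultHasher::new();
    client.hash(&mut h);
    round.hash(&mut h);
    (h.finish() % params.num_slots as u64) as usize
}

fn main() {
    let params = RoundParams { num_slots: 128 };
    let s = slot_for(42, 7, &params);
    // Agreement: recomputing with the same inputs yields the same slot.
    assert_eq!(s, slot_for(42, 7, &params));
    assert!(s < params.num_slots);
    println!("slot = {s}");
}
```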
### Running the integration test under heavier parameters
`crates/zipnet-node/tests/e2e.rs` uses `RoundParams::default_v1()` and a hardcoded 3-server / 2-client topology. Modify it directly; the helpers (`cross_sync`, `run_server`, `run_client`, `run_aggregator`) are scoped to the test, so no cross-cutting refactor is needed.

```shell
RUST_LOG=info,zipnet_node=debug cargo test -p zipnet-node --test e2e -- --nocapture
```

A successful run ends with:

```text
zipnet e2e: round r1 finalized with 2/2 messages recovered
```
### Where to put a new role
If you introduce a fourth participant type (say, an "auditor" that archives Broadcasts to cold storage), the idiomatic placement is a new module in `crates/zipnet-node/src/roles/` and a sibling crate under `crates/zipnet-auditor/` that delegates to it. Follow the `zipnet-aggregator` binary layout.
### Measuring something
Mosaik's Prometheus metrics are auto-wired; add your own via the `metrics` crate:

```rust
use metrics::{counter, gauge};

counter!("zipnet_rounds_opened_total").increment(1);
gauge!("zipnet_client_registry_size").set(registry.len() as f64);
```

They will appear at the configured `ZIPNET_METRICS` endpoint without any scraper-side changes.
## Building an adjacent service on the shared universe
Zipnet's deployment model is a reusable pattern — the full rationale is in design-intro. Any service that wants to coexist on `zipnet::UNIVERSE` alongside zipnet should reproduce the three conventions:

- **Content- and intent-addressed fingerprint.** Every public id descends from a single `blake3` hash over the operator's intent (name), the signature-altering content (schema version, wire sizes, consensus config, init salt), and the ACL composition. Expose the fingerprint inputs as a const-constructible `Config` struct.
- **A `Deployment`-shaped convention.** Declare the public surface (one or two primitives, ideally) in a single protocol module; export typed `Zipnet::<D>::*`-style constructors that derive the ids internally.
- **A fingerprint convention, not a registry.** The operator → consumer handshake is universe `NetworkId` + `Config` + datum schema + (if TDX-gated) MR_TD. No on-network advertisement is required — mosaik's standard discovery bonds the sides.
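The first convention can be sketched as follows. This is a hypothetical illustration: the `Config` fields are invented examples of "signature-altering content", and std's `DefaultHasher` stands in for the `blake3` derive the text describes. What matters is that the struct is const-constructible (shareable as source in the operator → consumer handshake) and that every signature-altering field feeds the fingerprint.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hypothetical fingerprint inputs: intent (name) plus the
/// signature-altering content the text lists.
#[derive(Hash)]
struct Config {
    name: &'static str,  // operator intent
    schema_version: u32, // signature-altering content ...
    wire_size: usize,
    init_salt: [u8; 8],
}

impl Config {
    /// Const-constructible: operators hand consumers this struct as
    /// source code, not as opaque bytes or a registry entry.
    const fn new(name: &'static str, schema_version: u32,
                 wire_size: usize, init_salt: [u8; 8]) -> Self {
        Config { name, schema_version, wire_size, init_salt }
    }

    /// Single hash over all inputs; every public id descends from it.
    fn fingerprint(&self) -> u64 {
        let mut h = DefaultHasher::new();
        self.hash(&mut h);
        h.finish()
    }
}

fn main() {
    const A: Config = Config::new("acme.notary", 1, 1024, [0; 8]);
    const B: Config = Config::new("acme.notary", 2, 1024, [0; 8]);
    // Any signature-altering change (here: schema_version) moves every
    // derived id, so mismatched operator/consumer pairs never bond.
    assert_ne!(A.fingerprint(), B.fingerprint());
    println!("{:x} vs {:x}", A.fingerprint(), B.fingerprint());
}
```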
Walk the checklist for a new service end-to-end before writing any code. The most common mistake is not answering "what happens when `StateMachine::signature()` bumps?" before shipping.
### When Shape B is the wrong call
A service whose traffic would dominate catalog gossip on the shared universe (high-frequency metric streams, bulk replication) belongs behind its own `NetworkId` — Shape A in design-intro — Two axes of choice. The narrow-public-surface discipline does not rescue a service whose steady-state traffic is inherently loud; at that point the noise cost dominates the composition benefit.
### Optional directory collection
If your operator community wants a human-browsable list of known deployments, ship a sibling `Map<InstanceName, InstanceCard>` as a devops convenience, not as part of the consumer binding path. See Roadmap — Optional directory collection for the discipline.