Product · Federated learning

Train on data you never get to see.

Secure aggregation, per-participant differential privacy, Byzantine tolerance, and async round management — delivered as a trainer your ML team doesn't have to babysit.

Topology · cross-silo deployment
Aggregator: secure aggregation · DP · Byzantine-robust
Participant A: us-east · on-prem
Participant B: eu-west · GCP
Participant D: ca-central · AWS
Participant E: eu-north · on-prem
Participant F: apac · air-gapped
Participant G: us-west · Azure
Round 1,412 · Participants 6 / 6 · ε budget 2.4 / 8.0 · Drift 0.024
01 · Guarantees

What the aggregator cannot learn, by construction.

Secure aggregation

Pairwise masking over encrypted channels

The aggregator sees only the sum. Individual gradient contributions are cryptographically indistinguishable from noise.
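The cancellation trick behind pairwise masking can be sketched in a few lines. This is a hypothetical toy (real protocols run over finite fields with authenticated key exchange, not floating-point noise); the seed table and function names are illustrative only:

```python
import numpy as np

def masked_update(pid, update, peer_seeds):
    """Add a pairwise mask per peer; peer_seeds maps peer_id -> shared seed
    (assumed agreed out of band via key exchange)."""
    masked = update.copy()
    for peer, seed in peer_seeds.items():
        rng = np.random.default_rng(seed)
        mask = rng.standard_normal(update.shape)
        # The lower id adds the mask, the higher id subtracts it,
        # so each pair's masks cancel exactly in the sum.
        masked += mask if pid < peer else -mask
    return masked

# Three participants, one shared seed per unordered pair
seeds = {(0, 1): 11, (0, 2): 22, (1, 2): 33}
updates = {p: np.full(4, float(p + 1)) for p in range(3)}

def peer_seeds(p):
    return {q: seeds[tuple(sorted((p, q)))] for q in range(3) if q != p}

# Each masked update alone looks like noise, but the sum is exact
total = sum(masked_update(p, updates[p], peer_seeds(p)) for p in range(3))
print(np.allclose(total, sum(updates.values())))  # True: masks cancel
```

Each participant's individual upload is dominated by the masks, yet the aggregate equals the true sum — which is all the aggregator ever sees.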

Differential privacy

Per-participant ε budgets

Track spend across rounds. Halt when exhausted. Configurable clipping and noise multipliers.
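A minimal sketch of what per-round clipping, noising, and budget tracking look like. The class and parameter names are hypothetical, and the per-round ε charge stands in for a real accountant (e.g. RDP/moments accounting):

```python
import numpy as np

class PrivacyBudget:
    """Track ε spend across rounds; refuse to charge once exhausted."""
    def __init__(self, eps_total, eps_per_round):
        self.eps_total = eps_total
        self.eps_per_round = eps_per_round
        self.spent = 0.0

    def charge(self):
        if self.spent + self.eps_per_round > self.eps_total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += self.eps_per_round

def privatize(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip the update's L2 norm (bounding sensitivity), then add
    Gaussian noise scaled by the configurable multiplier."""
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, update.shape)
    return clipped + noise

budget = PrivacyBudget(eps_total=8.0, eps_per_round=0.002)
budget.charge()                         # one round's spend
noisy = privatize(np.ones(4) * 10.0)    # norm 20 clipped down to 1
print(round(budget.spent, 3), noisy.shape)
```

When `charge()` would push spend past `eps_total`, training halts for that participant rather than silently overspending.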

Byzantine tolerance

Robust against up to ⅓ malicious participants

Coordinate-wise trimmed mean, Krum, and RFA available. Convergence preserved under attack.
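Of the three, coordinate-wise trimmed mean is the simplest to sketch. This toy version (honest/attacker values are made up for illustration) shows why extreme malicious updates cannot move the aggregate:

```python
import numpy as np

def trimmed_mean(updates, trim_frac=1/3):
    """Sort each coordinate across participants, drop the largest and
    smallest trim_frac of values, and average what remains."""
    arr = np.sort(np.stack(updates), axis=0)
    k = int(len(updates) * trim_frac)
    return arr[k:len(updates) - k].mean(axis=0)

honest = [np.ones(3), np.ones(3) * 1.1, np.ones(3) * 0.9, np.ones(3) * 1.05]
attackers = [np.full(3, 1e6), np.full(3, -1e6)]  # extreme poisoned updates

agg = trimmed_mean(honest + attackers, trim_frac=1/3)
# The 1e6-magnitude updates are trimmed away entirely
print(np.all(np.abs(agg - 1.0) < 0.2))  # True
```

A plain mean over the same six updates would be swamped by the attackers; the trimmed mean stays within the honest cluster.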

Async tolerance

Stragglers and dropouts, handled

Our staleness-aware aggregator converges under arbitrary site availability. Reference: Chen et al., ICML 2025.
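One common way to make aggregation staleness-aware is to down-weight updates by how many rounds old they are. The polynomial weighting below is a generic sketch (not the referenced paper's method), with illustrative names and numbers:

```python
import numpy as np

def staleness_weight(current_round, update_round, alpha=0.6):
    """Polynomial decay: an update computed s rounds ago gets
    weight (1 + s) ** -alpha, so stale updates count for less."""
    return (1 + current_round - update_round) ** (-alpha)

def async_aggregate(global_model, arrivals, current_round, lr=1.0):
    """arrivals: list of (update, round_the_update_was_computed_at).
    Stragglers contribute, but with reduced influence."""
    weights = [staleness_weight(current_round, r) for _, r in arrivals]
    step = sum(w * u for w, (u, _) in zip(weights, arrivals)) / sum(weights)
    return global_model + lr * step

model = np.zeros(2)
arrivals = [
    (np.array([1.0, 1.0]), 100),  # fresh update
    (np.array([5.0, 5.0]), 90),   # 10 rounds stale, heavily down-weighted
]
new_model = async_aggregate(model, arrivals, current_round=100)
print(new_model[0] < 3.0)  # True: the stale update does not dominate
```

Because weights never reach zero, sites that drop out and return still contribute; they simply cannot drag the model toward a stale optimum.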

Start a federation with us.

Book a demo