The hardest constraint in defense engineering isn't classification. It is that classification is not uniform. A bid engineer at a European missile-systems prime has access to some of the prior-program data they need; a colleague at a sister site, on a different clearance, has access to a different subset. A graph that represents requirements has to also represent who can see what, and remain useful when the visible graph is a subset of the actual graph.
SPREAD's deployment is Requirements Manager, an on-premise knowledge graph that ingests historical requirements, sub-system definitions, integration-test records, and the cross-program matrix that defines what has been validated where. The instance runs behind the customer's firewall, across four national-jurisdiction classifications, with no vendor-side data egress. The graph answers the bid engineer's question with three things attached: the candidate solution, a confidence score, and the access-clearance state visible to that engineer.
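To make that three-part answer concrete, here is a minimal sketch of what such a result record might look like. The names (`RequirementMatch`, `ClearanceState`) and the two-state access model are illustrative assumptions, not SPREAD's actual API.

```python
from dataclasses import dataclass
from enum import Enum

class ClearanceState(Enum):
    """Hypothetical access states a returned match can carry."""
    VISIBLE = "visible"               # engineer is cleared for the full content
    EXISTS_RESTRICTED = "restricted"  # match exists; content withheld at this clearance

@dataclass(frozen=True)
class RequirementMatch:
    """Illustrative shape of one answer: candidate, confidence, clearance state."""
    candidate_id: str          # prior-program sub-system or validated combination
    confidence: float          # score derived from validation history, 0.0 to 1.0
    clearance: ClearanceState  # what this specific engineer is allowed to see
```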
What makes regulated analytical platforms hard
Scoping a new variant for a national procurement authority, the bid engineer has access to only some of the prior-program data they need. Classification compartmentalization determines which programs the engineer can see, which sub-systems within those programs are visible at their clearance level, and which national-jurisdiction certification archives are accessible to a UK engineer versus a French engineer versus a German engineer at the same firm.
That is the structural problem most analytical platforms do not survive.
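As a sketch of that compartmentalization logic, assuming a simplified label-based model with one jurisdiction and one numeric clearance level per engineer (real national classification regimes are considerably more involved):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Engineer:
    jurisdiction: str     # "UK", "FR", "DE", or "IT"
    clearance_level: int  # higher means broader access (a simplification)

@dataclass(frozen=True)
class Record:
    program: str
    releasable_to: frozenset  # jurisdictions this record may be shown in
    min_clearance: int        # level required to see substantive content

def visible_records(engineer: Engineer, records: list) -> list:
    """Keep only records releasable to the engineer's jurisdiction AND
    at or below their clearance level; everything else stays invisible."""
    return [
        r for r in records
        if engineer.jurisdiction in r.releasable_to
        and engineer.clearance_level >= r.min_clearance
    ]
```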
The bid engineer's question
When the prime scopes a new variant with unique integration requirements, the engineer's question looks something like this:
"Which guidance subsystems have been certified at this threat envelope on prior programs? Which propellant chemistries have been integration-tested at this range class? Which warhead families have validated effectiveness against this target type, and which of these am I cleared to evaluate for this customer?"
Historically that question runs across documents and tribal knowledge for days, with senior engineers triangulating between what they can see and what they know exists but cannot access. Requirements Manager answers it by surfacing prior validated combinations with confidence scores attached, and by making the access-control envelope a first-class property of the answer.
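Against a graph, that question decomposes into a handful of typed lookups, each filtered to the engineer's clearance before anything is returned. A hypothetical sketch; the `certified`, `integration_tested`, `validated_effective`, and `clearance_for` calls are assumptions for illustration, not the product's API:

```python
def candidates_for(graph, engineer, threat_envelope, range_class, target_type):
    """Illustrative decomposition of the bid engineer's question into typed
    graph lookups; the clearance filter is part of the answer, not bolted on."""
    guidance = graph.certified("GuidanceSubsystem", envelope=threat_envelope)
    propellants = graph.integration_tested("PropellantChemistry", range_class=range_class)
    warheads = graph.validated_effective("WarheadFamily", target=target_type)
    return [
        (c, c.confidence, c.clearance_for(engineer))  # candidate, score, access state
        for c in guidance + propellants + warheads
    ]
```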
What changes for the bid engineer
Before Requirements Manager:

- Walk classified documentation across UK / FR / DE / IT silos
- Triangulate against the tribal knowledge of senior engineers
- Manually track access-clearance state per record
- Re-derive effort estimates against incomplete visibility

With Requirements Manager:

- Query the graph from the engineer's clearance level
- Review candidate solutions with confidence scores attached
- See access-clearance state per match
- Evaluate the surfaced candidates instead of finding them
For the bid engineer, a requirement that previously triggered a multi-day search across documentation produces a candidate match in the graph view. Confidence score attached. Validation history attached. Access-clearance state attached. The engineer's role compresses to evaluating the match rather than finding the candidates, within the bounds of what the engineer is cleared to evaluate.
On-premise architecture as precondition
The customer operates under multiple national defense classifications. Each carries its own audit trail and clearance regime. The Requirements Manager instance lives behind the customer's firewall. No SaaS routing. No vendor-side data egress. Audit logs flow into the customer's existing security-event stack rather than a SPREAD-side telemetry system.
That topology is not a feature checkbox. It is a precondition. Making defense engineering data analyzable at all in a regulated environment, while keeping it simultaneously useful to bid engineers across four national jurisdictions, is the structural problem most analytical layers do not survive.
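A sketch of what "audit logs flow into the customer's existing security-event stack" could mean mechanically, assuming a syslog-style collector on the customer side; the host name and event fields are illustrative, not the product's actual schema:

```python
import json
import logging
from logging.handlers import SysLogHandler

# All audit traffic stays inside the customer's network perimeter:
# events go to the customer's own collector, never to a vendor endpoint.
audit = logging.getLogger("requirements_manager.audit")
audit.setLevel(logging.INFO)
audit.addHandler(SysLogHandler(address=("siem.customer.internal", 514)))  # hypothetical host

def log_query(engineer_id: str, query_hash: str, returned: int, withheld: int) -> None:
    """Record who asked what, and how much the clearance filter withheld.
    The query is logged as a hash so the log itself carries no classified text."""
    audit.info(json.dumps({
        "actor": engineer_id,
        "query_sha256": query_hash,
        "results_returned": returned,
        "results_withheld": withheld,
    }))
```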
The customer voice
The customer's Engineering Director, on the structural shift, in the public testimonial:
"Even in a fully regulated, on-premise environment, we've gained control over complex engineering data, detecting errors earlier and accelerating development with confidence."Engineering Director · European Missile-Systems Prime
Three things in one sentence. "Fully regulated, on-premise environment" confirms the deployment model. "Detecting errors earlier" is the leftward shift on the cost curve: errors caught at requirement-evaluation time, not integration-test time. "Accelerating development with confidence" is the qualifier doing the load-bearing work: in a regulated program, speed without confidence is worthless; "with confidence" means the engineer can claim the speed-up and stand behind it under audit.
The shape of regulated analytical platforms
The category most analytical platforms operate in assumes that more data, more visible to more people, is better. The category the customer operates in inverts that assumption. More data, more visible to more people, is a compliance failure.
The interesting structural observation is that those two postures don't actually conflict. The graph does not need to be globally visible to be globally useful. A graph that represents the cross-program validation matrix abstractly, without revealing the substantive content of any classified field to any user not cleared to see it, is still answering the bid engineer's question. The matrix exists. The engineer queries it. The graph returns the candidate solutions visible at the engineer's clearance level, with the access state attached.
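One way to read "represents the matrix abstractly": a node's existence can remain queryable while its substantive fields are withheld from users below its clearance. A minimal sketch under that assumption:

```python
def project_for(engineer, node):
    """Clearance-appropriate view of a graph node: full fields for cleared
    users, a bare existence marker for everyone else. The matrix entry still
    participates in answering the query either way."""
    if engineer.clearance_level >= node.min_clearance:
        return {"id": node.id, "fields": node.fields, "access": "visible"}
    return {"id": node.id, "fields": None, "access": "exists_restricted"}
```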
That is a different shape of analytical platform than the cloud-native, share-everything default of the broader B2B software market. It is also the shape every regulated industry will eventually need.
Program shape
| Dimension | Detail |
|---|---|
| Program | Cross-program missile-systems requirements reuse |
| Deployment mode | On-premise (regulated, four national jurisdictions) |
| Jurisdictions | Four national defense procurement authorities |
| What the graph returns | Candidate · confidence score · access-clearance state |
| Workflow shift | Search becomes evaluation |
| Customer-side speaker | Engineering Director |