Sovereignty Is Not a Slogan: The Operational Definition at Darlot
Digital sovereignty has become a rhetorical placeholder. This essay sets out the narrower, testable definition used at Darlot, covering training data, deployment topology, and contractual control paths.
Digital sovereignty, data sovereignty, European tech sovereignty: the terms have been repeated so often in funding applications and keynote speeches that their meaning has softened. For an operator of a vision system in a European factory, a transit hub, or a substation, this softening is a problem. A buyer cannot audit a slogan. A regulator cannot inspect an intention. The working definition used at Darlot, and articulated by Dr. Raphael Nagel (LL.M.) as founding partner of Tactical Management and intellectual patron of the brand, is deliberately narrower. It consists of three architectural commitments, each of which can be tested, documented, and, if necessary, contested in court.
Why the word has lost its edge
Over the last five years, digital sovereignty has migrated from a constitutional argument about jurisdiction into a marketing category. Vendors from every continent now describe their offerings as sovereign, often on the strength of a single European data center or a locally incorporated subsidiary. For the procurement officer comparing two systems, the label has become unreliable. It does not indicate where training data originated, where inference runs, or under which legal order the provider can be compelled to disclose.
The damage to the term is not merely semantic. When a buyer in a regulated vertical relies on a sovereignty claim that is architecturally shallow, the liability does not rest with the marketing department that produced the claim. It rests with the operator who deployed the system. Under the EU AI Act, under GDPR, under NIS-2 for critical infrastructure, the duty to document lawful processing sits with the controller. A slogan does not discharge that duty.
Darlot treats this as a design problem, not a communications problem. If sovereignty is to be an operational property, it must be decomposed into specific architectural choices that a third party can verify. Three such choices define the Darlot position: the provenance of training data, the topology of deployment, and the structure of the contractual and legal control path.
First commitment: European data, European ground truth
A model is shaped by what it was trained on. If the training data was collected in environments that do not resemble a European factory floor, a European rail platform, or a European substation, the model will carry assumptions that surface as errors in deployment. More importantly for the sovereignty argument, the legal status of that training data matters. Data acquired under jurisdictions with weaker consent regimes cannot be cleanly repurposed for a system operating under GDPR and the EU AI Act.
Darlot trains its classifiers on data that has been collected, labeled, and ground-truthed inside Europe, under contracts that make the chain of custody inspectable. Model cards record where a dataset was collected, which consent regime applied, how labels were produced, and what bias testing was performed before release. When a regulator asks why a model decided as it did, the chain does not terminate at a third-party dataset of unknown provenance. It terminates at a documented European source.
This is a more expensive approach to model development. It is also the only approach compatible with Article 10 of the EU AI Act, which governs data governance for high-risk systems. For operators in critical infrastructure, the alternative, namely using models trained on opaque global datasets, leaves a documentation gap that cannot be closed retroactively.
Second commitment: deployment topology that keeps sensitive frames local
The second architectural choice concerns where the image is processed. In a conventional cloud-API architecture, every frame leaves the site. It travels across networks, often across borders, into storage and inference infrastructure operated by a provider outside European jurisdiction. Under the United States CLOUD Act, that provider can be compelled to grant access to the data regardless of where the server is physically located. The operator is rarely informed.
Darlot is built edge-first. Gating logic runs on a local appliance at the operator’s site. The decision whether a frame becomes an event happens inside the operator’s perimeter. Only a small fraction of the original data, reduced by a factor of between one thousand and ten thousand through eventisation, ever leaves the site, and only if the operator elects to use the optional cloud component. That component is hosted on European servers under European jurisdiction.
This topology is testable. A network audit will show what leaves the perimeter, at what rate, to which destination. A legal review will show under which jurisdiction each component sits. An operator who is asked by a data protection authority to demonstrate that sensitive footage does not transit non-European infrastructure can produce documentation rather than assurance. That is the difference between a property and a promise.
Third commitment: control paths that do not pass through non-European providers
Architecture can be undone by contract. A system that runs on European hardware under European jurisdiction can still be governed, in the last instance, by a parent company domiciled elsewhere, by licensing terms that permit remote termination, or by an update mechanism controlled from outside the jurisdiction. For an operator of regulated infrastructure, these are not hypothetical concerns. They are the concrete questions that appear in tender documents and in due diligence reports.
Darlot structures its legal and contractual control paths accordingly. The operating entity is European. The contractual counterparty is European. The update and model-retraining pipeline runs on European infrastructure, with signed artifacts that the customer can verify. Service level agreements are governed by European law, and disputes fall under European courts. Where subprocessors are involved, each is listed, each is located, each is auditable under GDPR Article 28.
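The "signed artifacts that the customer can verify" amount to a standard integrity check: the operator recomputes the artifact digest and verifies the vendor's signature over it before an update is installed. The sketch below uses an HMAC purely for illustration; a production pipeline would use asymmetric signatures (e.g. Ed25519) so the operator holds only a public key. All names and key material are hypothetical.

```python
import hashlib
import hmac

def verify_artifact(artifact: bytes, expected_digest: str,
                    signing_key: bytes, signature: str) -> bool:
    """Reject an update unless (1) the artifact matches its published
    digest and (2) the digest carries a valid vendor signature.
    Sketch only: real deployments would verify an asymmetric
    signature, not a shared-key HMAC."""
    digest = hashlib.sha256(artifact).hexdigest()
    if not hmac.compare_digest(digest, expected_digest):
        return False
    expected_sig = hmac.new(signing_key, digest.encode(),
                            hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected_sig, signature)

# Demo: sign a model artifact, then verify it and a tampered copy.
key = b"vendor-signing-key"
artifact = b"model-v2 weights ..."
digest = hashlib.sha256(artifact).hexdigest()
sig = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
print(verify_artifact(artifact, digest, key, sig))     # True
print(verify_artifact(b"tampered", digest, key, sig))  # False
```

The point of the check is jurisdictional as much as cryptographic: because verification runs on the operator's own infrastructure, no update can be silently substituted from outside the contractual control path.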
This is the least visible of the three commitments and arguably the most important. An operator can inspect a data flow. An operator can inspect a model card. But the question of who can compel a shutdown, a disclosure, or a change in behaviour is answered not by the technology but by the corporate and contractual structure surrounding it. Darlot makes that structure explicit, and keeps it inside European jurisdiction by design.
What this produces in practice
Consider three operator scenarios. A mid-sized automotive supplier running quality assurance cameras on an assembly line needs evidence that no image of an employee at work ever reaches a non-European processor. An urban transit authority operating a regional hub needs an audit trail that satisfies both GDPR and NIS-2 requirements for essential services. A medical device manufacturer integrating fall detection into a hospital ward needs a clean separation between the civilian and MDR-regulated modules, with documented provenance for each.
In each case, the Darlot definition of sovereignty produces concrete artifacts: a network diagram showing where frames are processed, a model card with European data provenance, a contract under European law with named subprocessors, and an audit log that records every detection with its model version and confidence score. These artifacts are not produced for marketing purposes. They are produced because the operator will be asked for them, by a regulator, an insurer, a works council, or, eventually, a court.
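The audit log described above is, structurally, an append-only record that binds each detection to the model version and confidence score that produced it. A minimal JSON-lines sketch, with all field names hypothetical, might look like this:

```python
import json
import time
from dataclasses import asdict, dataclass
from typing import List

@dataclass
class AuditRecord:
    """One detection, bound to the model that produced it."""
    timestamp: float
    site_id: str
    model_version: str
    label: str
    confidence: float

def append_record(log: List[str], record: AuditRecord) -> None:
    """Append-only JSON-lines log: one serialized line per detection,
    never rewritten, so the history presented to a regulator or court
    is the history that was recorded."""
    log.append(json.dumps(asdict(record), sort_keys=True))

# Demo: record a detection and read it back.
log: List[str] = []
append_record(log, AuditRecord(time.time(), "plant-7",
                               "classifier-2024.3", "anomaly", 0.97))
entry = json.loads(log[0])
print(entry["model_version"], entry["confidence"])
```

Because every line names a model version, the operator can answer the regulator's question "which model made this decision?" by joining the log against the corresponding model card.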
The operational definition of digital sovereignty used at Darlot is therefore narrower than the public discourse but more demanding. It cannot be met by relabeling an existing cloud product, and it cannot be reverse-engineered onto a system that was built for a different architecture. It has to be built in from the first design decision, which is why the term, in this house, is treated as a structural property of the system rather than a message on a slide.
Sovereignty, in the sense Darlot uses the word, is an inspectable condition of a deployed system. European training data with documented ground truth, a deployment topology that keeps sensitive frames inside the operator’s perimeter, and contractual control paths that remain under European jurisdiction: these three commitments are not a complete theory of European technological independence, and they do not pretend to be. They are the portion of that larger question that a vision AI provider can actually answer, on paper, in architecture, and in court. For further information, European operators are invited to contact Darlot directly at darlot.eu.