Regulatory Accretion Doctrine
Regulatory Accretion Doctrine explains why each new rule spawns a technical stack whose complexity compounds faster than the rule's original intent. As layers accumulate, dependencies multiply, oversight diffuses, and fragility concentrates where visibility is lowest.
Axiom
Every rule births a stack.
Complexity grows faster than intent.
Doctrine
This doctrine holds that regulation does not create order so much as it creates layers. Each new rule generates a corresponding technical stack; stacks require vendors; vendors introduce identity, verification, and control mechanisms that extend beyond the intent of the original rule. As these layers accumulate, dependencies multiply and fragility concentrates in the seams no one is assigned to monitor.
Regulation is not a single move but a cascade. Oversight becomes distributed, recursive, and increasingly abstracted from the original purpose. Capability erodes even as interface sophistication increases. The system grows thicker beneath the surface while becoming harder to understand, govern, or repair.
Within Convivial Systems Theory, the Regulatory Accretion Doctrine explains why modernization efforts often degrade safety and reliability: features accumulate faster than institutional competence, and accountability diffuses across stacks no one fully owns.
Form
Rule → Stack
Stack → Vendor
Vendor → Dependency
Dependency → Fragility
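The cascade above can be sketched as a toy counting model. The branching factor and the layer names are illustrative assumptions, not claims about any real regulatory system; the point is only that artifacts multiply at each step of the chain.

```python
# Toy cascade: each rule spawns stacks, each stack spawns vendors, and so on.
# "Fragility" here is simply the count of artifacts at the final layer --
# the seams no one is assigned to monitor. Purely illustrative.
CASCADE = ["rule", "stack", "vendor", "dependency", "fragility"]

def accrete(rules, branching=2):
    """Count artifacts at each cascade layer, assuming each artifact
    spawns `branching` artifacts at the next layer (an assumption)."""
    counts = {CASCADE[0]: rules}
    for prev, nxt in zip(CASCADE, CASCADE[1:]):
        counts[nxt] = counts[prev] * branching
    return counts

# One rule, branching factor 2: fragility points outnumber rules 16 to 1.
print(accrete(1))
# → {'rule': 1, 'stack': 2, 'vendor': 4, 'dependency': 8, 'fragility': 16}
```

Even a modest branching factor makes the final layer dominate the first, which is the doctrine's claim in arithmetic form: the rule is one object, its downstream fragility is many.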
Neural Network Mapping
(Layer accretion and capability hollowing)
In learning systems, regulatory accretion mirrors uncontrolled layer and interface growth without corresponding increases in model understanding or validation capacity. Adding modules, constraints, or post-hoc controls can increase apparent sophistication while degrading interpretability, traceability, and stability.
When layers accrete faster than calibration and feedback loops evolve, failure modes migrate to the seams between components. Errors become harder to localize, responsibility diffuses, and corrective action lags behind propagation.
In ML terms:
Depth without governance increases fragility.
Interfaces without attribution collapse trust.
Systems fail not because they are regulated, but because regulation accumulates faster than the system’s ability to understand itself.
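The mismatch described above, seams accreting combinatorially while calibration grows only linearly, can be sketched as a minimal model. The growth rates, parameter names, and the notion of "coverage" here are illustrative assumptions, not measurements of any real system.

```python
def simulate(layers_added, calibration_rate):
    """Toy model of layer accretion vs. calibration capacity.

    Assumption: each new layer must interface with every prior layer,
    so seams grow quadratically, while calibration coverage grows only
    linearly. Uncovered seams are where failure modes migrate.
    """
    seams = 0        # interfaces between components
    covered = 0      # seams reached by calibration / feedback loops
    uncovered = []   # gap at each step
    for step in range(layers_added):
        seams += step + 1            # new layer interfaces with all prior layers
        covered += calibration_rate  # oversight grows only linearly
        uncovered.append(max(0, seams - covered))
    return uncovered

# Early on, calibration keeps pace; past a threshold, the gap widens.
gap = simulate(layers_added=10, calibration_rate=3)
print(gap)
# → [0, 0, 0, 0, 0, 3, 7, 12, 18, 25]
```

The model is deliberately crude, but it shows the structural point: the system does not fail at any single layer; it fails in the widening set of seams that no feedback loop reaches.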
Applied example (SIA)
Why LED Speed Signs Create New Hazards (Systems in Action)