ML-Specific Interval Operations #
This file provides activation functions for neural network interval propagation,
building on the general interval vector operations from Engine.IntervalVector.
Main definitions #
- `relu` - ReLU activation for intervals
- `sigmoid` - Sigmoid activation (conservative [0, 1] bound)
- `reluVector` - Apply ReLU to each component
- `sigmoidVector` - Apply sigmoid to each component
Main theorems #
- `mem_relu` - Soundness of ReLU
- `mem_sigmoid` - Soundness of sigmoid
ReLU Activation #
ReLU applied to an interval: `max(0, x)` for all `x` in the interval. Returns `[max(0, lo), max(0, hi)]`.
Equations
- LeanCert.ML.IntervalVector.relu I = { lo := (LeanCert.Core.Dyadic.ofInt 0).max I.lo, hi := (LeanCert.Core.Dyadic.ofInt 0).max I.hi, le := ⋯ }
Soundness of ReLU: if x ∈ I, then max(0, x) ∈ relu I
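The endpoint-wise construction and its soundness lemma can be sketched in a self-contained toy model (assuming Mathlib; `Interval`, `Interval.mem`, and `Interval.relu` below are illustrative names over `ℚ`, not the LeanCert API, which uses `Core.IntervalDyadic` with dyadic endpoints):

```lean
import Mathlib

/-- A toy interval with rational endpoints (stand-in for `Core.IntervalDyadic`). -/
structure Interval where
  lo : ℚ
  hi : ℚ
  le : lo ≤ hi

/-- Membership: `x` lies between the endpoints. -/
def Interval.mem (I : Interval) (x : ℚ) : Prop := I.lo ≤ x ∧ x ≤ I.hi

/-- ReLU pushed through the endpoints: `[max(0, lo), max(0, hi)]`.
    Well-formedness follows from monotonicity of `max` in its second argument. -/
def Interval.relu (I : Interval) : Interval where
  lo := max 0 I.lo
  hi := max 0 I.hi
  le := max_le_max le_rfl I.le

/-- Soundness: if `x ∈ I` then `max 0 x ∈ relu I`,
    again by monotonicity of `max` applied to each bound. -/
theorem Interval.mem_relu {I : Interval} {x : ℚ} (hx : I.mem x) :
    I.relu.mem (max 0 x) :=
  ⟨max_le_max le_rfl hx.1, max_le_max le_rfl hx.2⟩
```

The key point is that ReLU is monotone, so applying it to both endpoints yields the exact image interval rather than a loose overapproximation.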
Apply ReLU to each component of an interval vector
theorem LeanCert.ML.IntervalVector.mem_reluVector_component {x : ℝ} {I : Core.IntervalDyadic} (hx : x ∈ I) : max 0 x ∈ LeanCert.ML.IntervalVector.relu I
Soundness of vector ReLU
Sigmoid Activation #
Sigmoid interval: conservative bound [0, 1]. Since sigmoid(x) = 1/(1 + exp(-x)) ∈ (0, 1) for all x, [0, 1] is a sound overapproximation for any input interval.
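The claim follows from a two-step bound on the denominator:

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad e^{-x} > 0 \;\Longrightarrow\; 1 + e^{-x} > 1 \;\Longrightarrow\; 0 < \frac{1}{1 + e^{-x}} < 1.$$

Since the bound holds for every real input, the output interval `[0, 1]` is sound regardless of the input interval, at the cost of discarding all input information.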
Equations
- LeanCert.ML.IntervalVector.sigmoid _I = { lo := LeanCert.Core.Dyadic.ofInt 0, hi := LeanCert.Core.Dyadic.ofInt 1, le := LeanCert.ML.IntervalVector.sigmoid._proof_1 }
Real sigmoid function
Sigmoid is bounded in (0, 1)
Soundness of sigmoid: sigmoid(x) ∈ [0, 1] for all x
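The two bound lemmas behind this soundness result can be sketched standalone (assuming Mathlib; `realSigmoid` and the lemma names are illustrative, not the LeanCert declarations):

```lean
import Mathlib

/-- The real sigmoid, `1 / (1 + exp (-x))` (illustrative stand-in). -/
noncomputable def realSigmoid (x : ℝ) : ℝ := 1 / (1 + Real.exp (-x))

/-- Lower bound: numerator and denominator are both positive. -/
theorem realSigmoid_pos (x : ℝ) : 0 < realSigmoid x :=
  div_pos one_pos (by positivity)

/-- Upper bound: the denominator strictly exceeds the numerator,
    because `exp (-x) > 0` makes `1 + exp (-x) > 1`. -/
theorem realSigmoid_lt_one (x : ℝ) : realSigmoid x < 1 := by
  have hden : (0 : ℝ) < 1 + Real.exp (-x) := by positivity
  have hlt : (1 : ℝ) < 1 + Real.exp (-x) :=
    lt_add_of_pos_right 1 (Real.exp_pos _)
  exact (div_lt_one hden).mpr hlt
```

Membership of `sigmoid(x)` in `[0, 1]` is then immediate, since the strict bounds `(0, 1)` imply the non-strict ones.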
Apply sigmoid to each component of an interval vector