
LeanCert.Examples.ML.Distillation

Model Distillation Verification Example #

This file demonstrates verified model distillation using LeanCert's interval arithmetic infrastructure.

The Scenario #

In AI deployment, we often need to:

  1. Train a large "Teacher" model for accuracy
  2. Compress it into a smaller "Student" model for efficiency
  3. Verify that the Student behaves similarly to the Teacher

LeanCert can formally prove that |Teacher(x) - Student(x)| ≤ ε for ALL inputs in a domain: not just at sampled points, but as a mathematical guarantee.
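Concretely, such a guarantee takes the shape of a single universally quantified theorem, which can then be specialized to any point of the domain. The sketch below shows that general form; the names `teacher`, `student`, and the bound 3 are illustrative placeholders, not the exact declarations in this file:

```lean
import Mathlib

-- Sketch: the shape of a distillation certificate (illustrative names).
-- A certificate is one theorem quantified over the whole input box,
-- so it can be instantiated at any concrete point, e.g. (0.5, 0.5).
example (teacher student : ℝ → ℝ → ℝ)
    (hcert : ∀ x y : ℝ, 0 ≤ x → x ≤ 1 → 0 ≤ y → y ≤ 1 →
      |teacher x y - student x y| ≤ 3) :
    |teacher 0.5 0.5 - student 0.5 0.5| ≤ 3 :=
  hcert 0.5 0.5 (by norm_num) (by norm_num) (by norm_num) (by norm_num)
```

This is what distinguishes the approach from testing: the hypothesis `hcert` covers every point of [0, 1]² at once.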

Example Networks #

We define two simple 2→2→1 networks and verify their outputs differ by at most ε = 3 on the input domain [0, 1]².

Main Results #

Network Definitions #

Layer 1: 2 inputs → 2 outputs


Layer 2: 2 inputs → 1 output


Well-formedness Proofs #

Input Domain #

def LeanCert.Examples.ML.Distillation.mkInterval (lo hi : ℚ) (h : lo ≤ hi := by norm_num) (prec : ℤ := -53) :

Create an interval from rational bounds
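Assuming the signature above, with the side condition `h` discharged by the `by norm_num` autoparam and `prec` defaulting to -53, typical uses would look like this sketch:

```lean
-- Hypothetical uses of mkInterval (names follow the signature above).
-- The side condition lo ≤ hi is proved automatically by the
-- `by norm_num` autoparam when the bounds are numerals.
def unit := mkInterval 0 1                          -- the interval [0, 1]
def half := mkInterval (1/2) 1                      -- the interval [1/2, 1]
def wide := mkInterval (-2) 2 (by norm_num) (-10)   -- coarser precision
```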


Input domain: [0, 1]²
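The rendered equation for this definition is elided below. Assuming the box is built componentwise from `mkInterval`, its shape would be roughly the following; the list representation is a guess for illustration, not LeanCert's actual type:

```lean
-- Sketch: [0, 1] × [0, 1] as two copies of the unit interval.
def inputDomain := [mkInterval 0 1, mkInterval 0 1]
```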


Equivalence Check #

Tolerance for distillation verification


The computable certificate


The check passes! This is verified by computation.

Helper Lemmas #

theorem LeanCert.Examples.ML.Distillation.mem_mkInterval {x : ℝ} {lo hi : ℚ} (hle : lo ≤ hi) (hx : ↑lo ≤ x ∧ x ≤ ↑hi) (prec : ℤ) (hprec : prec ≤ 0 := by norm_num) :
x ∈ mkInterval lo hi hle prec

Membership in mkInterval
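Assuming the statement above, the lemma places a concrete real number inside a constructed interval. The instantiation below is a sketch of such a use:

```lean
-- Sketch: 1/2 lies in mkInterval 0 1 at the default precision -53.
-- The three `by norm_num` proofs discharge lo ≤ hi, lo ≤ x, and x ≤ hi;
-- the hypothesis prec ≤ 0 is handled by the lemma's autoparam.
example : (1/2 : ℝ) ∈ mkInterval 0 1 (by norm_num) (-53) :=
  mem_mkInterval (by norm_num) ⟨by norm_num, by norm_num⟩ (-53)
```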

The Distillation Certificate #

For ANY inputs (x, y) in [0, 1]², Teacher and Student outputs differ by at most ε = 3.

Interpretation #

The theorem distillation_certificate proves that for ANY (x, y) ∈ [0, 1]²: |Teacher(x,y) - Student(x,y)| ≤ 3

This bound arises from interval arithmetic conservatism: the analysis computes interval bounds on the Teacher and Student outputs independently and then subtracts them, so the resulting bound covers worst-case pairings of the two output ranges rather than the actual pointwise difference.
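The effect of subtracting independently computed bounds can be written out in one line. If interval evaluation gives Teacher(x, y) ∈ [a, b] and Student(x, y) ∈ [c, d] (the letters a, b, c, d are illustrative), then interval subtraction yields

```latex
\mathrm{Teacher}(x,y) - \mathrm{Student}(x,y) \in [\,a - d,\; b - c\,]
```

For example, if both outputs are only known to lie in [-1.5, 1.5], the difference can only be bounded by [-3, 3], matching ε = 3 even if the two networks agree exactly at every point.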

Applications #