Inner product space #
This file defines inner product spaces and proves the basic properties. We do not formally
define Hilbert spaces, but they can be obtained using the set of assumptions
[NormedAddCommGroup E] [InnerProductSpace 𝕜 E] [CompleteSpace E].
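For orientation, a minimal sketch of how such a Hilbert-space setup is typically assumed in practice (assuming import Mathlib; the variable names 𝕜 and E are illustrative):

  import Mathlib

  -- a scalar field that is either ℝ or ℂ, and a complete inner product space over it
  variable {𝕜 E : Type*} [RCLike 𝕜]
  variable [NormedAddCommGroup E] [InnerProductSpace 𝕜 E] [CompleteSpace E]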
An inner product space is a vector space endowed with an inner product. It generalizes the notion of dot product in ℝ^n and provides the means of defining the length of a vector and the angle between two vectors. In particular, vectors x and y are orthogonal if their inner product equals zero. We define both the real and complex cases at the same time using the RCLike typeclass.

This file proves general results on inner product spaces. For the specific construction of an inner product structure on n → 𝕜 for 𝕜 = ℝ or ℂ, see EuclideanSpace in Analysis.InnerProductSpace.PiL2.
Main results #
- We define the class InnerProductSpace 𝕜 E extending NormedSpace 𝕜 E with a number of basic properties, most notably the Cauchy–Schwarz inequality. Here 𝕜 is understood to be either ℝ or ℂ, through the RCLike typeclass.
- We show that the inner product is continuous, continuous_inner, and bundle it as the continuous sesquilinear map innerSL (see also innerₛₗ for the non-continuous version).
- We define Orthonormal, a predicate on a function v : ι → E, and prove the existence of a maximal orthonormal set, exists_maximal_orthonormal. Bessel's inequality, Orthonormal.tsum_inner_products_le, states that given an orthonormal set v and a vector x, the sum of the norm-squares of the inner products ⟪v i, x⟫ is no more than the norm-square of x.

For the existence of orthonormal bases, Hilbert bases, etc., see the file Analysis.InnerProductSpace.projection.
Notation #
We globally denote the real and complex inner products by ⟪·, ·⟫_ℝ and ⟪·, ·⟫_ℂ respectively.

We also provide two notation namespaces: RealInnerProductSpace, ComplexInnerProductSpace, which respectively introduce the plain notation ⟪·, ·⟫ for the real and complex inner product.
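A hedged usage sketch (assuming import Mathlib; real_inner_comm is the expected name of the symmetry lemma for the real inner product):

  import Mathlib

  variable {F : Type*} [NormedAddCommGroup F] [InnerProductSpace ℝ F]

  -- the globally available notation
  example (x y : F) : ⟪x, y⟫_ℝ = ⟪y, x⟫_ℝ := real_inner_comm x y

  -- the plain notation becomes available after opening the corresponding namespace
  open RealInnerProductSpace in
  example (x y : F) : ⟪x, y⟫ = ⟪y, x⟫ := real_inner_comm x y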
Implementation notes #
We choose the convention that inner products are conjugate linear in the first argument and linear in the second.
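Concretely, a hedged sketch of this convention (assuming import Mathlib; inner_smul_left and inner_smul_right are the expected lemma names, and the local notation mirrors the one used in this file):

  import Mathlib

  variable {𝕜 E : Type*} [RCLike 𝕜] [NormedAddCommGroup E] [InnerProductSpace 𝕜 E]

  local notation "⟪" x ", " y "⟫" => @inner 𝕜 _ _ x y

  -- conjugate linear in the first argument ...
  example (x y : E) (r : 𝕜) : ⟪r • x, y⟫ = starRingEnd 𝕜 r * ⟪x, y⟫ := by
    rw [inner_smul_left]

  -- ... and linear in the second
  example (x y : E) (r : 𝕜) : ⟪x, r • y⟫ = r * ⟪x, y⟫ := by
    rw [inner_smul_right]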
Tags #
inner product space, Hilbert space, norm
References #
- [Clément & Martin, The Lax-Milgram Theorem. A detailed proof to be formalized in Coq]
- [Clément & Martin, A Coq formal proof of the Lax–Milgram theorem]
The Coq code is available at the following address: http://www.lri.fr/~sboldo/elfic/index.html
The inner product with values in 𝕜.
Equations
- One or more equations did not get rendered due to their size.
Instances For
Pretty printer defined by notation3 command.
Equations
- One or more equations did not get rendered due to their size.
Instances For
The inner product with values in ℝ.
Equations
- One or more equations did not get rendered due to their size.
Instances For
The inner product with values in ℂ.
Equations
- One or more equations did not get rendered due to their size.
Instances For
A (pre) inner product space is a vector space with an additional operation called inner product. The (semi)norm could be derived from the inner product; instead we require the existence of a seminorm and the fact that ‖x‖ ^ 2 = re ⟪x, x⟫ to be able to put instances on 𝕜 or product spaces.

Note that NormedSpace does not assume that ‖x‖ = 0 implies x = 0 (it is rather a seminorm).

To construct a seminorm from an inner product, see PreInnerProductSpace.ofCore.
- smul : 𝕜 → E → E
- inner : E → E → 𝕜
The inner product induces the norm.
- conj_symm : ∀ (x y : E), (starRingEnd 𝕜) (inner y x) = inner x y
The inner product is hermitian, taking the conj swaps the arguments.
The inner product is additive in the first coordinate.
The inner product is conjugate linear in the first coordinate.
Instances
The inner product induces the norm.
The inner product is hermitian, taking the conj swaps the arguments.
The inner product is additive in the first coordinate.
The inner product is conjugate linear in the first coordinate.
Constructing a normed space structure from an inner product #
In the definition of an inner product space, we require the existence of a norm, which is equal (but maybe not defeq) to the square root of the scalar product. This makes it possible to put an inner product space structure on spaces with a preexisting norm (for instance ℝ), with good properties. However, sometimes, one would like to define the norm starting only from a well-behaved scalar product. This is what we implement in this paragraph, starting from a structure InnerProductSpace.Core stating that we have a nice scalar product.

Our goal here is not to develop a whole theory with all the supporting API, as this will be done below for InnerProductSpace. Instead, we implement the bare minimum to go as directly as possible to the construction of the norm and the proof of the triangle inequality.

Warning: Do not use this Core structure if the space you are interested in already has a norm instance defined on it, otherwise this will create a second non-defeq norm instance!
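Concretely, the (semi)norm produced from a core is ‖x‖ = √(re ⟪x, x⟫), and the triangle inequality for it follows from the Cauchy–Schwarz inequality proved below:

  ‖x + y‖ ^ 2 = re ⟪x + y, x + y⟫ = ‖x‖ ^ 2 + 2 * re ⟪x, y⟫ + ‖y‖ ^ 2
              ≤ ‖x‖ ^ 2 + 2 * ‖x‖ * ‖y‖ + ‖y‖ ^ 2 = (‖x‖ + ‖y‖) ^ 2.

(This is a sketch of the standard argument, not the literal Mathlib proof.)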
A structure requiring that a scalar product is positive semidefinite and symmetric.
- inner : F → F → 𝕜
- conj_symm : ∀ (x y : F), (starRingEnd 𝕜) (inner y x) = inner x y
The inner product is hermitian, taking the conj swaps the arguments.
The inner product is positive (semi)definite.
The inner product is additive in the first coordinate.
The inner product is conjugate linear in the first coordinate.
Instances
The inner product is hermitian, taking the conj swaps the arguments.
The inner product is positive (semi)definite.
The inner product is additive in the first coordinate.
The inner product is conjugate linear in the first coordinate.
A structure requiring that a scalar product is positive definite. Some theorems that require this assumption are put under section InnerProductSpace.Core.
- inner : F → F → 𝕜
- conj_symm : ∀ (x y : F), (starRingEnd 𝕜) (inner y x) = inner x y
The inner product is positive definite.
Instances
The inner product is positive definite.
Equations
- instCoreOfCore 𝕜 F = { inner := inner, conj_symm := ⋯, nonneg_re := ⋯, add_left := ⋯, smul_left := ⋯ }
Define PreInnerProductSpace.Core from PreInnerProductSpace. Defined to reuse lemmas about PreInnerProductSpace.Core for PreInnerProductSpaces. Note that the Seminorm instance provided by PreInnerProductSpace.Core.norm is propositionally but not definitionally equal to the original norm.
Equations
- PreInnerProductSpace.toCore = { toInner := InnerProductSpace.toInner, conj_symm := ⋯, nonneg_re := ⋯, add_left := ⋯, smul_left := ⋯ }
Instances For
Define InnerProductSpace.Core from InnerProductSpace. Defined to reuse lemmas about InnerProductSpace.Core for InnerProductSpaces. Note that the Norm instance provided by InnerProductSpace.Core.norm is propositionally but not definitionally equal to the original norm.
Equations
- InnerProductSpace.toCore = { toInner := InnerProductSpace.toInner, conj_symm := ⋯, nonneg_re := ⋯, add_left := ⋯, smul_left := ⋯, definite := ⋯ }
Instances For
Inner product defined by the PreInnerProductSpace.Core structure. We can't reuse PreInnerProductSpace.Core.toInner because it takes PreInnerProductSpace.Core as an explicit argument.
Equations
- InnerProductSpace.Core.toPreInner' = c.toInner
Instances For
The norm squared function for the PreInnerProductSpace.Core structure.
Equations
- InnerProductSpace.Core.normSq x = RCLike.re (inner x x)
Instances For
An auxiliary equality useful to prove the Cauchy–Schwarz inequality. Here we use the standard argument involving the discriminant of a quadratic form.
Another auxiliary equality related to the Cauchy–Schwarz inequality: the square of the seminorm of ⟪x, y⟫ • x - ⟪x, x⟫ • y is equal to ‖x‖ ^ 2 * (‖x‖ ^ 2 * ‖y‖ ^ 2 - ‖⟪x, y⟫‖ ^ 2). We use InnerProductSpace.Core.normSq x etc. (defeq to RCLike.re ⟪x, x⟫) instead of ‖x‖ ^ 2 etc. to avoid extra rewrites when applying it to an InnerProductSpace.
Cauchy–Schwarz inequality.
We need this for the PreInnerProductSpace.Core structure to prove the triangle inequality below when showing the core is a normed group and to take the quotient.
(Semi)norm constructed from a PreInnerProductSpace.Core structure, defined to be the square root of the scalar product.
Instances For
Seminormed group structure constructed from a PreInnerProductSpace.Core structure.
Equations
Instances For
Normed space structure (which is in fact only a seminormed space structure) constructed from a PreInnerProductSpace.Core structure.
Equations
- InnerProductSpace.Core.toSeminormedSpace = NormedSpace.mk ⋯
Instances For
Inner product defined by the InnerProductSpace.Core structure. We can't reuse InnerProductSpace.Core.toInner because it takes InnerProductSpace.Core as an explicit argument.
Equations
- InnerProductSpace.Core.toInner' = cd.toInner
Instances For
Normed group structure constructed from an InnerProductSpace.Core structure.
Equations
Instances For
Normed space structure constructed from an InnerProductSpace.Core structure.
Equations
- InnerProductSpace.Core.toNormedSpace = NormedSpace.mk ⋯
Instances For
Given an InnerProductSpace.Core structure on a space, one can use it to turn the space into an inner product space. The NormedAddCommGroup structure is expected to already be defined with InnerProductSpace.ofCore.toNormedAddCommGroup.
Equations
- InnerProductSpace.ofCore cd = InnerProductSpace.mk ⋯ ⋯ ⋯ ⋯
Instances For
Properties of inner product spaces #
The inner product as a sesquilinear form.

Note that in the case 𝕜 = ℝ this is a bilinear form.
Equations
- sesqFormOfInner = LinearMap.mk₂'ₛₗ (RingHom.id 𝕜) (starRingEnd 𝕜) (fun (x y : E) => inner y x) ⋯ ⋯ ⋯ ⋯
Instances For
The real inner product as a bilinear form.

Note that unlike sesqFormOfInner, this does not reverse the order of the arguments.
Equations
- bilinFormOfRealInner = sesqFormOfInner.flip
Instances For
An inner product with a sum on the left, Finsupp version.
An inner product with a sum on the right, Finsupp version.
Cauchy–Schwarz inequality for real inner products.
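A hedged usage sketch over a real inner product space (assuming import Mathlib; real_inner_le_norm and abs_real_inner_le_norm are the expected lemma names):

  import Mathlib

  variable {F : Type*} [NormedAddCommGroup F] [InnerProductSpace ℝ F]

  -- Cauchy–Schwarz, with and without the absolute value on the left
  example (x y : F) : ⟪x, y⟫_ℝ ≤ ‖x‖ * ‖y‖ := real_inner_le_norm x y
  example (x y : F) : |⟪x, y⟫_ℝ| ≤ ‖x‖ * ‖y‖ := abs_real_inner_le_norm x y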
A family of vectors is linearly independent if they are nonzero and pairwise orthogonal.
An orthonormal set of vectors in an InnerProductSpace
Equations
Instances For
if ... then ... else characterization of an indexed set of vectors being orthonormal. (Inner product equals Kronecker delta.)
if ... then ... else characterization of a set of vectors being orthonormal. (Inner product equals Kronecker delta.)
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of a linear combination of a set of orthonormal vectors with one of those vectors picks out the coefficient of that vector.
The inner product of two linear combinations of a set of orthonormal vectors, expressed as a sum over the first Finsupp.
The inner product of two linear combinations of a set of orthonormal vectors, expressed as a sum over the second Finsupp.
The inner product of two linear combinations of a set of orthonormal vectors, expressed as a sum.
The double sum of weighted inner products of pairs of vectors from an orthonormal sequence is the sum of the weights.
An orthonormal set is linearly independent.
A subfamily of an orthonormal family (i.e., a composition with an injective map) is an orthonormal family.
An injective family v : ι → E is orthonormal if and only if Subtype.val : (range v) → E is orthonormal.
If v : ι → E is an orthonormal family, then Subtype.val : (range v) → E is an orthonormal family.
A linear combination of some subset of an orthonormal set is orthogonal to other members of the set.
Given an orthonormal family, a second family of vectors is orthonormal if every vector equals the corresponding vector in the original family or its negation.
Given an orthonormal set v of vectors in E, there exists a maximal orthonormal set containing it.
A family of orthonormal vectors with the correct cardinality forms a basis.
Equations
- basisOfOrthonormalOfCardEqFinrank hv card_eq = basisOfLinearIndependentOfCardEqFinrank ⋯ card_eq
Instances For
Cauchy–Schwarz inequality with norm
Cauchy–Schwarz inequality with norm
Polarization identity: The real part of the inner product, in terms of the norm.
Polarization identity: The real part of the inner product, in terms of the norm.
Polarization identity: The real part of the inner product, in terms of the norm.
Polarization identity: The imaginary part of the inner product, in terms of the norm.
Polarization identity: The inner product, in terms of the norm.
Equations
- ⋯ = ⋯
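Concretely, with the convention above (conjugate linear in the first argument; I denotes the imaginary unit of 𝕜), these identities take forms such as

  re ⟪x, y⟫ = (‖x + y‖ ^ 2 - ‖x‖ ^ 2 - ‖y‖ ^ 2) / 2
  re ⟪x, y⟫ = (‖x‖ ^ 2 + ‖y‖ ^ 2 - ‖x - y‖ ^ 2) / 2
  im ⟪x, y⟫ = (‖x - I • y‖ ^ 2 - ‖x‖ ^ 2 - ‖y‖ ^ 2) / 2

(a sketch of the standard identities, not necessarily the exact Mathlib statements).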
A complex polarization identity, with a linear map.
A linear map T is zero if and only if the identity ⟪T x, x⟫_ℂ = 0 holds for all x.
A linear isometry preserves the inner product.
A linear isometric equivalence preserves the inner product.
The adjoint of a linear isometric equivalence is its inverse.
A linear map that preserves the inner product is a linear isometry.
Equations
- f.isometryOfInner h = { toLinearMap := f, norm_map' := ⋯ }
Instances For
A linear equivalence that preserves the inner product is a linear isometric equivalence.
Equations
- f.isometryOfInner h = { toLinearEquiv := f, norm_map' := ⋯ }
Instances For
A linear map is an isometry if and only if it preserves the inner product.
A linear isometry preserves the property of being orthonormal.
A linear isometry preserves the property of being orthonormal.
A linear isometric equivalence preserves the property of being orthonormal.
A linear isometric equivalence, applied with Basis.map, preserves the property of being orthonormal.
A linear map that sends an orthonormal basis to orthonormal vectors is a linear isometry.
Equations
- f.isometryOfOrthonormal hv hf = f.isometryOfInner ⋯
Instances For
A linear equivalence that sends an orthonormal basis to orthonormal vectors is a linear isometric equivalence.
Equations
- f.isometryOfOrthonormal hv hf = f.isometryOfInner ⋯
Instances For
A linear isometric equivalence that sends an orthonormal basis to a given orthonormal basis.
Equations
- hv.equiv hv' e = (v.equiv v' e).isometryOfOrthonormal hv ⋯
Instances For
Polarization identity: The real inner product, in terms of the norm.
Polarization identity: The real inner product, in terms of the norm.
The real inner product of two vectors, divided by the product of their norms, has absolute value at most 1.
The inner product as a sesquilinear map.
Equations
- innerₛₗ 𝕜 = LinearMap.mk₂'ₛₗ (starRingEnd 𝕜) (RingHom.id 𝕜) (fun (v w : E) => inner v w) ⋯ ⋯ ⋯ ⋯
Instances For
The inner product as a continuous sesquilinear map. Note that toDualMap (resp. toDual) in InnerProductSpace.Dual is a version of this given as a linear isometry (resp. linear isometric equivalence).
Instances For
The inner product as a continuous sesquilinear map, with the two arguments flipped.
Equations
- innerSLFlip 𝕜 = (ContinuousLinearMap.flipₗᵢ' E E 𝕜 (RingHom.id 𝕜) (starRingEnd 𝕜)) (innerSL 𝕜)
Instances For
Given f : E →L[𝕜] E', construct the continuous sesquilinear form fun x y ↦ ⟪x, f y⟫, given as a continuous linear map.
Instances For
innerSL is an isometry. Note that the associated LinearIsometry is defined in InnerProductSpace.Dual as toDualMap.
When an inner product space E over 𝕜 is considered as a real normed space, its inner product satisfies IsBoundedBilinearMap.

In order to state these results, we need a NormedSpace ℝ E instance. We will later establish such an instance by restriction-of-scalars, InnerProductSpace.rclikeToReal 𝕜 E, but this instance may not be definitionally equal to some other “natural” instance. So, we assume [NormedSpace ℝ E].
The inner product of two weighted sums, where the weights in each sum add to 0, in terms of the norms of pairwise differences.
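Concretely, if ∑ i, w₁ i = 0 and ∑ j, w₂ j = 0, then

  ⟪∑ i, w₁ i • v₁ i, ∑ j, w₂ j • v₂ j⟫_ℝ = -(∑ i, ∑ j, w₁ i * w₂ j * ‖v₁ i - v₂ j‖ ^ 2) / 2

(a sketch of the standard identity under the stated hypotheses, not necessarily the exact Mathlib statement).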
Formula for the distance between the images of two nonzero points under an inversion with center zero. See also EuclideanGeometry.dist_inversion_inversion for inversions around a general point.
The inner product of a nonzero vector with a nonzero multiple of itself, divided by the product of their norms, has absolute value 1.
The inner product of a nonzero vector with a nonzero multiple of itself, divided by the product of their norms, has absolute value 1.
The inner product of a nonzero vector with a positive multiple of itself, divided by the product of their norms, has value 1.
The inner product of a nonzero vector with a negative multiple of itself, divided by the product of their norms, has value -1.
If the norm of the inner product of two vectors is equal to the product of their norms, then the two vectors are multiples of each other. One form of the equality case for Cauchy–Schwarz. Compare inner_eq_norm_mul_iff, which takes the stronger hypothesis ⟪x, y⟫ = ‖x‖ * ‖y‖.
The inner product of two vectors, divided by the product of their norms, has absolute value 1 if and only if they are nonzero and one is a multiple of the other. One form of equality case for Cauchy-Schwarz.
The inner product of two vectors, divided by the product of their norms, has absolute value 1 if and only if they are nonzero and one is a multiple of the other. One form of equality case for Cauchy-Schwarz.
If the inner product of two vectors is equal to the product of their norms (i.e., ⟪x, y⟫ = ‖x‖ * ‖y‖), then the two vectors are nonnegative real multiples of each other. One form of the equality case for Cauchy–Schwarz. Compare norm_inner_eq_norm_iff, which takes the weaker hypothesis abs ⟪x, y⟫ = ‖x‖ * ‖y‖.
If the inner product of two vectors is equal to the product of their norms (i.e., ⟪x, y⟫ = ‖x‖ * ‖y‖), then the two vectors are nonnegative real multiples of each other. One form of the equality case for Cauchy–Schwarz. Compare norm_inner_eq_norm_iff, which takes the weaker hypothesis abs ⟪x, y⟫ = ‖x‖ * ‖y‖.
The inner product of two vectors, divided by the product of their norms, has value 1 if and only if they are nonzero and one is a positive multiple of the other.
The inner product of two vectors, divided by the product of their norms, has value -1 if and only if they are nonzero and one is a negative multiple of the other.
If the inner product of two unit vectors is 1, then the two vectors are equal. One form of the equality case for Cauchy–Schwarz.
If the inner product of two unit vectors is strictly less than 1, then the two vectors are distinct. One form of the equality case for Cauchy–Schwarz.
Bessel's inequality for finite sums.
Bessel's inequality.
The sum defined in Bessel's inequality is summable.
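Concretely, for an orthonormal family v : ι → E and a vector x : E, Bessel's inequality reads

  ∑' i, ‖⟪v i, x⟫‖ ^ 2 ≤ ‖x‖ ^ 2

and the summability statement says that fun i ↦ ‖⟪v i, x⟫‖ ^ 2 is summable (a restatement of the description in the overview above, not the literal Mathlib statement).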
A field 𝕜 satisfying RCLike is itself a 𝕜-inner product space.
Equations
- RCLike.innerProductSpace = InnerProductSpace.mk ⋯ ⋯ ⋯ ⋯
Inner product space structure on subspaces #
Induced inner product on a submodule.
Equations
- W.innerProductSpace = InnerProductSpace.mk ⋯ ⋯ ⋯ ⋯
The inner product on submodules is the same as on the ambient space.
Families of mutually-orthogonal subspaces of an inner product space #
An indexed family of mutually-orthogonal subspaces of an inner product space E.

The simple way to express this concept would be as a condition on V : ι → Submodule 𝕜 E. We instead implement it as a condition on a family of inner product spaces each equipped with an isometric embedding into E, thus making it a property of morphisms rather than subobjects. The connection to the subobject spelling is shown in orthogonalFamily_iff_pairwise.

This definition is less lightweight, but allows for better definitional properties when the inner product space structure on each of the submodules is important -- for example, when considering their Hilbert sum (PiLp 2 V). For example, given an orthonormal set of vectors v : ι → E, we have an associated orthogonal family of one-dimensional subspaces of E, which it is convenient to be able to discuss using ι → 𝕜 rather than Π i : ι, span 𝕜 (v i).
Equations
- OrthogonalFamily 𝕜 G V = Pairwise fun (i j : ι) => ∀ (v : G i) (w : G j), inner ((V i) v) ((V j) w) = 0
Instances For
The composition of an orthogonal family of subspaces with an injective function is also an orthogonal family.
A family f of mutually-orthogonal elements of E is summable if and only if (fun i ↦ ‖f i‖ ^ 2) is summable.
An orthogonal family forms an independent family of subspaces; that is, any collection of elements each from a different subspace in the family is linearly independent. In particular, the pairwise intersections of elements of the family are 0.
A general inner product implies a real inner product. This is not registered as an instance since it creates problems with the case 𝕜 = ℝ.
Equations
- Inner.rclikeToReal 𝕜 E = { inner := fun (x y : E) => RCLike.re (inner x y) }
Instances For
A general inner product space structure implies a real inner product structure. This is not registered as an instance since it creates problems with the case 𝕜 = ℝ, but it can be used in a proof to obtain a real inner product space structure from a given 𝕜-inner product space structure.
Equations
- InnerProductSpace.rclikeToReal 𝕜 E = InnerProductSpace.mk ⋯ ⋯ ⋯ ⋯
Instances For
A complex inner product implies a real inner product. This cannot be an instance since it creates a diamond with PiLp.innerProductSpace because re (∑ i, inner (x i) (y i)) and ∑ i, re (inner (x i) (y i)) are not defeq.
Equations
- InnerProductSpace.complexToReal = InnerProductSpace.rclikeToReal ℂ G
Instances For
Equations
- instInnerProductSpaceRealComplex = InnerProductSpace.complexToReal
The inner product on an inner product space of dimension 2 can be evaluated in terms of a complex-number representation of the space.
Equations
- instInnerProductSpaceRealComplex_1 = InnerProductSpace.complexToReal
Continuity of the inner product #
Extract a real bilinear form from an operator T by taking the pairing fun x ↦ re ⟪T x, x⟫.
Instances For
Equations
- UniformSpace.Completion.toInner = { inner := Function.curry (⋯.extend (Function.uncurry inner)) }
Equations
- UniformSpace.Completion.innerProductSpace = InnerProductSpace.mk ⋯ ⋯ ⋯ ⋯