
01. Vector Space

Vector

A vector is an ordered tuple of numbers, called its components.

We write vectors as column vectors by default, that is:

{\bf v}:=\begin{pmatrix} x \\ y \\ z \end{pmatrix}

Vector space

A vector space is a set of vectors (vectors are in bold) together with two operators, + and ⋅ (note: λ and ω are arbitrary scalars), in which the following conditions always hold:

({\bf a}+{\bf b})+{\bf c}={\bf a}+({\bf b}+{\bf c}) associativity.
{\bf a}+{\bf b}={\bf b}+{\bf a} commutativity.
{\bf a}+{\bf 0}={\bf a} neutral (identity) element of addition.
{\bf a}+(-{\bf a})={\bf 0} inverse element of addition.
λ⋅({\bf a}+{\bf b})=λ⋅{\bf a}+λ⋅{\bf b} distributivity over vector addition.
(λ+ω)⋅{\bf a}=λ⋅{\bf a}+ω⋅{\bf a} distributivity over scalar addition.
λ⋅(ω⋅{\bf a})=(λ⋅ω)⋅{\bf a} compatibility of scalar multiplication.
1⋅{\bf a}={\bf a} neutral (identity) element of scalar multiplication.
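These axioms can be spot-checked numerically for R^3 with componentwise + and ⋅. The following is a minimal NumPy sketch with randomly chosen vectors and scalars; it only samples instances and proves nothing:

    import numpy as np

    # Spot-check the vector space axioms in R^3 on random samples.
    rng = np.random.default_rng(0)
    a, b, c = rng.standard_normal((3, 3))
    lam, om = rng.standard_normal(2)
    zero = np.zeros(3)

    assert np.allclose((a + b) + c, a + (b + c))          # associativity
    assert np.allclose(a + b, b + a)                      # commutativity
    assert np.allclose(a + zero, a)                       # additive identity
    assert np.allclose(a + (-a), zero)                    # additive inverse
    assert np.allclose(lam * (a + b), lam * a + lam * b)  # distributivity over vector addition
    assert np.allclose((lam + om) * a, lam * a + om * a)  # distributivity over scalar addition
    assert np.allclose(lam * (om * a), (lam * om) * a)    # compatibility
    assert np.allclose(1 * a, a)                          # scalar identity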

Subspace

S⊆V is a subspace iff S is non-empty and ∀a∈K:∀b∈K:∀{\bf s}∈S:∀{\bf t}∈S:(a⋅{\bf s}+b⋅{\bf t})∈S.

(K being the field of scalars over which V is defined)
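For example, the xy-plane S={(x,y,0)} is a subspace of R^3. A minimal NumPy sketch spot-checking the closure condition on random samples (illustrative, not a proof):

    import numpy as np

    # S = {(x, y, 0)} ⊆ R^3: membership means the third component is 0.
    def in_S(v):
        return np.isclose(v[2], 0.0)

    rng = np.random.default_rng(1)
    s = np.array([rng.standard_normal(), rng.standard_normal(), 0.0])
    t = np.array([rng.standard_normal(), rng.standard_normal(), 0.0])
    a, b = rng.standard_normal(2)

    assert in_S(a * s + b * t)  # closure under linear combinations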

Linear independence

n vectors {\bf v_1},{\bf v_2},...,{\bf v_n} are linearly dependent iff there exist scalars s_1,s_2,...,s_n so that:

s_1⋅{\bf v_1}+s_2⋅{\bf v_2}+...+s_n⋅{\bf v_n}={\bf 0}

AND

(s_1,s_2,...,s_n)≠(0,0,...,0), i.e. not all of them are 0 at the same time. The vectors are linearly independent iff no such scalars exist, i.e. only s_1=s_2=...=s_n=0 solves the equation.
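In coordinates, linear independence can be tested by stacking the vectors as columns of a matrix and checking for full column rank. A minimal NumPy sketch (the helper name linearly_independent is my own):

    import numpy as np

    # Vectors are linearly independent iff the matrix having them
    # as columns has full column rank.
    def linearly_independent(vectors):
        A = np.column_stack(vectors)
        return np.linalg.matrix_rank(A) == len(vectors)

    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, 1.0, 0.0])
    v3 = v1 + v2  # a linear combination of v1 and v2

    print(linearly_independent([v1, v2]))      # True
    print(linearly_independent([v1, v2, v3]))  # False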

Basis

Any n linearly independent vectors of an n-dimensional vector space form a basis of it. n is the dimension (dim) of the vector space and is the same for every basis.

Coordinate transformations

A vector {\bf v} can be transformed into coordinates (X,Y,Z) in the coordinate system given by the basis ({\bf b_1},{\bf b_2},{\bf b_3}) by solving:

{\bf v}=X⋅{\bf b_1}+Y⋅{\bf b_2}+Z⋅{\bf b_3}

The components (X,Y,Z) are called contravariant components of the vector v.
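Computing (X,Y,Z) amounts to solving the linear system B⋅(X,Y,Z)^T={\bf v}, where the columns of B are the basis vectors. A minimal NumPy sketch with an arbitrarily chosen basis:

    import numpy as np

    # Basis vectors as columns of B (example values, chosen arbitrarily).
    b1 = np.array([1.0, 1.0, 0.0])
    b2 = np.array([0.0, 1.0, 1.0])
    b3 = np.array([1.0, 0.0, 1.0])
    B = np.column_stack([b1, b2, b3])

    v = np.array([2.0, 3.0, 1.0])
    X, Y, Z = np.linalg.solve(B, v)  # contravariant components of v

    assert np.allclose(X * b1 + Y * b2 + Z * b3, v)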

Direct Sum

The vector space V is a direct sum of the subspaces S and T, written V=S⊕T, iff for every {\bf v}∈V there exist unique {\bf s}∈S and {\bf t}∈T so that {\bf s}+{\bf t}={\bf v}.
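For example, R^3 is the direct sum of the xy-plane and the z-axis: every vector splits uniquely into those two parts, as this small sketch illustrates:

    import numpy as np

    # R^3 = S ⊕ T with S the xy-plane and T the z-axis:
    # every v decomposes uniquely as s + t.
    v = np.array([4.0, -1.0, 7.0])
    s = np.array([v[0], v[1], 0.0])  # unique part in S
    t = np.array([0.0, 0.0, v[2]])   # unique part in T

    assert np.allclose(s + t, v)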

Norm

A norm is a function ||∙|| on a vector space V over K so that:

∀{\bf u}∈V:{\bf u}≠{\bf 0}⇒||{\bf u}||>0 positive definiteness.
∀{\bf u}∈V:∀λ∈K:||λ⋅{\bf u}||=|λ|⋅||{\bf u}|| absolute homogeneity.
∀{\bf u}∈V:∀{\bf v}∈V:||{\bf u}+{\bf v}||≤||{\bf u}||+||{\bf v}|| triangle inequality.

Every norm induces a distance function d({\bf u},{\bf v}):=||{\bf u}-{\bf v}||.
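The Euclidean norm on R^n is the standard example. The sketch below spot-checks the three conditions on random vectors and computes the induced distance (illustrative only):

    import numpy as np

    rng = np.random.default_rng(2)
    u, v = rng.standard_normal((2, 3))
    lam = rng.standard_normal()

    assert np.linalg.norm(u) > 0                     # positive definiteness (u ≠ 0 here)
    assert np.isclose(np.linalg.norm(lam * u),
                      abs(lam) * np.linalg.norm(u))  # absolute homogeneity
    assert (np.linalg.norm(u + v)
            <= np.linalg.norm(u) + np.linalg.norm(v))  # triangle inequality

    d = np.linalg.norm(u - v)  # the induced distance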

Inner product

An inner product is a function ⟨∙,∙⟩ on a vector space V over K so that:

∀{\bf v}:⟨{\bf v},{\bf v}⟩≥0 positive definiteness.
∀{\bf v}:⟨{\bf v},{\bf v}⟩=0 \Leftrightarrow {\bf v}={\bf 0} definiteness.
∀{\bf u}∈V:∀{\bf v}∈V:⟨{\bf u},{\bf v}⟩=\overline{⟨{\bf v},{\bf u}⟩} conjugate symmetry.
∀a∈K:∀b∈K:∀{\bf u}∈V:∀{\bf v}∈V:∀{\bf w}∈V:⟨a⋅{\bf u}+b⋅{\bf v},{\bf w}⟩=a⋅⟨{\bf u},{\bf w}⟩+b⋅⟨{\bf v},{\bf w}⟩ linearity in the first argument.
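The standard inner product on C^n, ⟨{\bf u},{\bf v}⟩=∑ u_i⋅conj(v_i), satisfies all four conditions; note that this convention is linear in the first argument, matching the list above. A minimal NumPy sketch:

    import numpy as np

    # Standard inner product on C^n, linear in the FIRST argument:
    # <u, v> = sum u_i * conj(v_i).
    def inner(u, v):
        return np.sum(u * np.conj(v))

    rng = np.random.default_rng(3)
    u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    a, b = rng.standard_normal(2)

    assert inner(u, u).real >= 0 and np.isclose(inner(u, u).imag, 0)  # positive definiteness
    assert np.isclose(inner(u, v), np.conj(inner(v, u)))              # conjugate symmetry
    assert np.isclose(inner(a * u + b * v, w),
                      a * inner(u, w) + b * inner(v, w))              # linearity in first argument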

Cauchy-Schwarz inequality

∀{\bf u}:∀{\bf v}:|⟨{\bf u},{\bf v}⟩|≤||{\bf u}||⋅||{\bf v}|| Cauchy-Schwarz inequality.

This leads to the angle φ between two vectors u and v (in a real vector space):

φ=\arccos\frac{⟨{\bf u},{\bf v}⟩}{||{\bf u}||⋅||{\bf v}||}
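Here ||∙|| is the norm induced by the inner product, ||{\bf u}||:=\sqrt{⟨{\bf u},{\bf u}⟩}, so by Cauchy-Schwarz the quotient always lies in [-1,1]. A minimal NumPy sketch for real vectors (the clip only guards against floating-point round-off):

    import numpy as np

    def angle(u, v):
        cos_phi = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.arccos(np.clip(cos_phi, -1.0, 1.0))

    u = np.array([1.0, 0.0])
    v = np.array([1.0, 1.0])
    print(np.degrees(angle(u, v)))  # ≈ 45 degrees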

Orthogonality

Two vectors u and v are orthogonal iff:

⟨{\bf u},{\bf v}⟩=0

Shorthand: u⊥v

Orthonormality

Two vectors u and v are orthonormal iff they are orthogonal and:

||{\bf u}||=1
||{\bf v}||=1

Orthonormal Basis

An orthonormal basis is a basis whose vectors are pairwise orthonormal, i.e. pairwise orthogonal and each of norm 1.

Gram-Schmidt

The Gram-Schmidt algorithm turns a set of linearly independent vectors into an orthonormal basis of the subspace they span.

Let {\bf w_1},{\bf w_2},...,{\bf w_n} be a set of linearly independent vectors.

Then one can calculate a set {\bf v_1},{\bf v_2},...,{\bf v_n} of vectors which form an orthogonal system, i.e. all of them are pairwise orthogonal:

{\bf v_1}={\bf w_1}
{\bf v_2}={\bf w_2}-\frac{⟨{\bf w_2},{\bf v_1}⟩}{⟨{\bf v_1},{\bf v_1}⟩}⋅{\bf v_1}

...

{\bf v_n}={\bf w_n}-∑_{i=1}^{n-1}\frac{⟨{\bf w_n},{\bf v_i}⟩}{⟨{\bf v_i},{\bf v_i}⟩}⋅{\bf v_i}

Normalizing each vector, {\bf u_i}:=\frac{{\bf v_i}}{||{\bf v_i}||}, then yields an orthonormal basis {\bf u_1},{\bf u_2},...,{\bf u_n}.
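A minimal NumPy sketch of the algorithm for real vectors with the standard dot product, including the final normalization step:

    import numpy as np

    def gram_schmidt(W):
        """Orthonormalize the rows of W (assumed linearly independent)."""
        V = []
        for w in W:
            v = w.astype(float).copy()
            for vi in V:
                # subtract the projection of w onto the already computed vi
                v -= (np.dot(w, vi) / np.dot(vi, vi)) * vi
            V.append(v)
        # normalize to turn the orthogonal system into an orthonormal one
        return np.array([v / np.linalg.norm(v) for v in V])

    W = np.array([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
    Q = gram_schmidt(W)
    assert np.allclose(Q @ Q.T, np.eye(3))  # rows form an orthonormal basis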

Author: Danny

Last modification on: Sat, 04 May 2024.