Linear Algebra

This page covers the linear algebra you need for RL: vectors, dot products, matrices, matrix-vector multiplication, and the idea of gradients.

Core concepts

Vectors. A vector is an ordered list of numbers, e.g. \(x = [x_1, x_2, x_3]^T\); we treat vectors as columns by default. The dot product of two vectors \(x\) and \(y\) of the same length is \(x^T y = \sum_i x_i y_i\). Geometrically, it relates the vectors' lengths to the angle between them: \(x^T y = \|x\| \|y\| \cos\theta\). ...
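The algebraic and geometric forms of the dot product can be checked against each other numerically. A minimal sketch using NumPy, with arbitrary example vectors chosen for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])

# Algebraic definition: x^T y = sum_i x_i y_i
dot = x @ y  # 1*4 + 2*(-1) + 3*2 = 8.0

# Geometric form: x^T y = |x| |y| cos(theta),
# so the angle between x and y can be recovered from the dot product.
cos_theta = dot / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.arccos(cos_theta)  # angle in radians

print(dot)  # 8.0
```

A positive dot product means the vectors point in broadly the same direction (\(\theta < 90^\circ\)); zero means they are orthogonal.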

March 10, 2026 · 9 min · 1736 words · codefrydev

Linear Algebra

This page covers the linear algebra you need for the preliminary assessment: the dot product, matrix-vector multiplication, and gradients with respect to vectors.

Why this matters for RL

States and observations are often vectors; linear value approximation uses \(V(s) \approx w^T x(s)\); and neural networks are built from matrix-vector products and gradients. You need to compute dot products and \(\nabla_w (Aw)\) by hand and understand their geometric meaning.

Learning objectives

Compute dot products and matrix-vector products; state \(\nabla_w (Aw) = A^T\) (in the column-gradient convention); relate these to state vectors and value approximation. ...
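The linear value approximation \(V(s) \approx w^T x(s)\) and its gradient can be made concrete with a small numerical check. A sketch assuming NumPy, with a hypothetical 3-dimensional feature vector `x_s` and weight vector `w` (the scalar case, where the gradient of \(w^T x\) with respect to \(w\) is \(x\) itself):

```python
import numpy as np

x_s = np.array([0.5, -1.0, 2.0])  # hypothetical feature vector x(s) for a state s
w = np.array([0.1, 0.2, 0.3])     # weight vector

# Linear value estimate: V(s) ~= w^T x(s) = 0.05 - 0.2 + 0.6 = 0.45
V = w @ x_s

# Central finite differences on f(w) = w^T x_s should recover x_s,
# since the gradient of a dot product with respect to w is the other vector.
eps = 1e-6
grad = np.array([
    ((w + eps * e) @ x_s - (w - eps * e) @ x_s) / (2 * eps)
    for e in np.eye(3)
])
print(np.allclose(grad, x_s, atol=1e-4))  # True
```

The vector-valued map \(w \mapsto Aw\) generalizes this: its Jacobian is \(A\), which in the column-gradient (denominator) convention used on this page is written \(\nabla_w (Aw) = A^T\).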

March 10, 2026 · 5 min · 922 words · codefrydev