Linear Algebra
This page covers the linear algebra you need for the preliminary assessment: the dot product, matrix-vector multiplication, and gradients with respect to vectors.

Why this matters for RL

States and observations are often vectors; linear value approximation uses \(V(s) \approx w^T x(s)\); and neural networks are built from matrix-vector products and gradients. You need to be able to compute dot products and \(\nabla_w (Aw)\) by hand and understand their geometric meaning.

Learning objectives

- Compute dot products and matrix-vector products.
- State \(\nabla_w (Aw) = A^T\) (under the column-gradient convention).
- Relate these operations to state vectors and value approximation.

...
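The operations above can be checked numerically. The sketch below uses NumPy with made-up example values (the vectors and matrix are illustrative, not from this page): it computes a dot product as a linear value estimate \(V(s) \approx w^T x(s)\), forms a matrix-vector product, and verifies by finite differences that the Jacobian of \(f(w) = Aw\) is \(A\), so the column-gradient convention gives \(\nabla_w (Aw) = A^T\).

```python
import numpy as np

# Hypothetical feature vector x(s) and weight vector w (illustrative values).
x = np.array([1.0, 2.0])
w = np.array([0.5, -1.0])

# Dot product: sum of elementwise products; here it plays the role of
# a linear value estimate V(s) = w^T x(s).
V = w @ x  # 0.5*1.0 + (-1.0)*2.0 = -1.5

# Matrix-vector product: each output entry is the dot product of a row
# of A with w.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
y = A @ w  # [1*0.5 + 2*(-1), 3*0.5 + 4*(-1)] = [-1.5, -2.5]

# Gradient check for f(w) = Aw: the (i, j) Jacobian entry is df_i/dw_j.
# Because f is linear, finite differences recover A exactly; the
# column-gradient convention then reads off the gradient as A^T.
eps = 1e-6
jac = np.zeros((2, 2))
for j in range(2):
    e = np.zeros(2)
    e[j] = eps
    jac[:, j] = (A @ (w + e) - A @ w) / eps

assert np.allclose(jac, A)  # Jacobian of Aw is A, so grad (col convention) is A^T
print(V, y)
```

Note the convention issue: the Jacobian of \(Aw\) is \(A\) itself; \(A^T\) appears only because the column-gradient convention stores the transpose of the Jacobian.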